Speaker 1: Welcome to Tech Stuff. This is The Story. Each week on Wednesdays, we bring you an in-depth interview with someone who has a front-row seat to the most fascinating things happening in tech today. We're joined by David Eagleman.

Speaker 1: David Eagleman is a neuroscientist, Stanford University professor, author, entrepreneur, and host of the podcast Inner Cosmos. In his podcast, Eagleman explores how our brain interprets the world and how that influences our perception. Ideas like these have been a source of fascination for Eagleman ever since he experienced an Alice in Wonderland moment many years ago.

Speaker 2: I fell off of the roof of a house when I was a kid. I was eight years old. The fall seemed to take a very long time, and I was thinking about Alice in Wonderland on the way down, about how this must have been what it was like for her to fall down the rabbit hole.

Speaker 1: He told me that the whole journey was very calm, but that it took a long time, and he remembers having lots and lots of thoughts on his way to the bottom.

Speaker 2: And then I hit the ground and broke my nose very badly. But when I got older and went to high school, I realized that the whole fall had taken point six of a second. I calculated it, d equals one half a t squared, and I couldn't believe that it was so, so fast.

Speaker 1: When Eagleman became a neuroscientist, he kept exploring this relationship between our brain and our experiences, which led him to research things like: is it possible to create a new sense, or is there a new way of interpreting an already existing sense? This turned into a tech product for his company, Neosensory, that could help deaf people understand the auditory world through a vibrating wristband.
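A note on the arithmetic Eagleman mentions above: for an object falling from rest, distance and time are related by d = ½ g t², and impact speed by v = g t. Below is a minimal sketch of that calculation in Python, assuming standard gravity (9.8 m/s²) and ignoring air resistance; it also reproduces the roughly seventy miles an hour figure for the 150-foot tower drop he describes later in the conversation.

```python
import math

G = 9.8  # standard gravity in m/s^2 (assumption: drop from rest, no air resistance)

def fall_from_rest(height_m: float) -> tuple[float, float]:
    """Return (fall time in seconds, impact speed in mph) for a free fall from rest."""
    t = math.sqrt(2 * height_m / G)   # solve d = 1/2 * g * t^2 for t
    v_mph = G * t * 2.23694           # v = g * t, converted from m/s to mph
    return t, v_mph

# The 150-foot tower drop described later in the interview (1 ft = 0.3048 m):
t, v = fall_from_rest(150 * 0.3048)
print(f"fall time ~ {t:.1f} s, impact speed ~ {v:.0f} mph")  # ~ 3.1 s, ~ 67 mph
```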
Speaker 1: David Eagleman is an innovator with a fascination for tech-driven interventions, so we weren't surprised to see an episode of the Inner Cosmos podcast all about AI relationships, a widely reported phenomenon that we here at Tech Stuff have been eager to explore. So we thought we'd ask Eagleman to help us understand how our brains form these attachments. But before we launched into that discussion, I asked Eagleman to tell me a bit more about the driving question behind his work: how does the human brain construct reality?

Speaker 2: Your brain is locked in silence and darkness inside your skull, and it only has spikes coming in. That's all it ever has, these little electrical signals. So your eyes aren't pushing light through, and your ears aren't pushing sound through. Instead, your eyes are converting photons into spikes, and your ears are converting sound waves into spikes, and your fingertips are converting pressure and temperature into spikes, and so on. So inside the brain there's nothing but these little electrical spikes running around, and it's living in the darkness and trying to figure out: what is the world out there? And so everything that we experience as reality is actually a construction. You know, for example, things like colors don't exist in the outside world. This is just a way to tag information, as in, that's a different wavelength than that over there, so they appear to be different colors. But this is the fascinating question that we're stuck with. We are inside Plato's cave, seeing shadows on the wall and trying to understand what is outside.

Speaker 1: Speaking of perception, did you ever figure out an answer to that question of why time slows down when our lives are at risk?

Speaker 2: I did a number of experiments on this, actually the only experiments that have ever been done on this question of the perception of time.

Speaker 1: Yeah.

Speaker 2: On the question of why does time slow down when you're in fear for your life.
Speaker 2: The reason no one had ever done experiments is because you have to actually put subjects in fear for their life, which is very hard to get approved. And so I dropped people from a one-hundred-and-fifty-foot-tall tower, backwards, in free fall, and they're caught in a net below, going seventy miles an hour. And I built a device that straps to their wrists and flashes information at them, and I was able to measure whether they actually are seeing in slow motion during a scary event. And it turns out that it's all a trick of memory. What happens when you're in fear for your life is your brain is writing down every single thing. Normally, your brain is like a sieve and most things just pass right through, but now you're capturing all the information. So it's not that you're seeing things in slow motion. It's just the density of memory: you're remembering everything. You know, you're in a car accident and you say, wow, I remember the hood crumpling and the rear-view mirror falling off and the expression on the other guy's face. And you've got all these details, and your brain is doing a calculation where it says, oh, well, if I have all those details, it must have taken five seconds, as opposed to one second, for this all to happen.

Speaker 1: One of the recurring topics of Tech Stuff is: what are the most exciting or promising paradigms for human-machine interaction, particularly at a time when there's so much concern about the negative effects of technology on the human brain. Obviously, there's the question of, like, what is this doing to us and our brains, and particularly to developing brains. Do you have a kind of internal paradigm for how you think about the interaction between technology and our brains more broadly?

Speaker 2: Yeah, I have to say, I'll just put my cards on the table: I'm very cyber-optimistic on these points, but I'll explain why. One is, really, what we want for our children is the largest available diet of information.
Speaker 2: When I was growing up, I had my homeroom teacher in Albuquerque, New Mexico, and, you know, whatever she knew or didn't know, that's what I would learn. And I would get my mother to drive me down to the library and pull out an Encyclopedia Britannica and hope that there was an article about the thing that I wanted. And maybe the article wasn't, you know, more than ten or fifteen years old, I hoped. And, you know, this was really all the information that I had access to. But now my kids, growing up, have the entire world's knowledge at their fingertips, and it's the greatest thing, for two reasons. One is, what I got growing up was lots of just-in-case information, like: just in case you ever need to know that the Battle of Hastings was in ten sixty-six, here you go. But kids now get lots of just-in-time information. So as soon as they're curious about something, they look it up, they ask Alexa, they type it into their phone, they do whatever, and they get the information in the context of their curiosity. And what that means is they have the right cocktail of neurotransmitters present for that information to stick. I mentioned before that most things just pass right through our brains, but when you care about something, it sticks. And the fact is that everything that we create is a remix of things that we've seen before. And so if you have a larger warehouse of things that you have seen and experienced, you're going to create better stuff. Just take as an example something like music. You know, if you grow up and you're somewhere in the world, let's say, two hundred years ago, all you're ever going to hear is the music of your local little area. But now kids can listen to music from all over the world, from all through history, and really put together bigger and bigger things.
Speaker 2: So this is why we're having this exponential increase in humankind's knowledge and innovation: because people are doing remixes across space and time in a way that's never been possible before. So I'm very optimistic about this. Yes, kids waste lots of time on social media, surfing dumb websites, things like that, but I think the good here outweighs the bad.

Speaker 1: As the father of children, are we already seeing how these brains are developing differently?

Speaker 2: Yes, I mean, these brains are developing very differently. But this is a very interesting thing. Lots of people in the media will pipe off with opinions on this, but the fact is it's very, very difficult to do a scientific experiment on this because of the lack of a control group. In other words, for my children, I can't find other kids who are the same age who haven't grown up in the same circumstances, unless you find children who are, let's say, Quaker, who don't use technology, or very, very impoverished children in different places in the world. But there are a hundred other differences there. If I see a difference between my children and those children, I don't know if it's because of the tech or because of something else, like diet or politics or whatever. And I can't take my children and compare them, let's say, to me in a previous generation, because there are a hundred other differences there in terms of pollution and politics and other tech and so on. So it's very difficult to do this in a controlled way. There are some studies, for example, looking at eye movements. You know, your eyes are jumping around all the time, about three times a second.
Speaker 2: These are called saccades, and you can measure these things, and you see that people, let's say, in my generation versus my children's generation actually move our eyes around differently when looking at text, because they're reading it more like a web page, where they sort of scan across and then they jump down and then scan across and so on, whereas in my generation it was more like a zigzag pattern down the page. These are things that are totally unconscious, and you can measure them. But in general, are my kids turning out differently? Is this whole generation?

Speaker 1: Yes?

Speaker 2: Absolutely.

Speaker 1: I love that idea, that the more curious you are, the better you're able to form memory. I mean, there are so many things we could talk about today, but I'm personally very fascinated by AI relationships, and I have a hypothesis that your listeners are too, because I saw that you did an episode on it and you rebroadcast it, so I can imagine it was a favorite. I wanted to ask you, what's the lay of the land? Like, what is an AI relationship? How broad is the phenomenon? Who does it affect? And why did you choose to focus on it?

Speaker 2: Why I chose to focus on it, as one of the many things I'm focusing on, is just because it's totally new. We've never been in a situation before where we've ever talked about, hey, did you fall in love with your machine? This is so wacky and new. But apparently, and I was just talking to a researcher, Bethany Maples, about this yesterday, there are a billion people now on the planet who have AI relationships of one sort or another. Some of these are friendships, some of them are romantic relationships.

Speaker 1: Defined by an ongoing emotional connection with a non-human being.

Speaker 2: Exactly right, exactly right, with an app. And in this country we have, for example, Character AI or Replika. These are different companies that do this sort of thing.
Speaker 2: I find this very interesting, that the concern a lot of people have is, hey, is this going to ruin relationships? But here's why I'm optimistic about this as well. First of all, people are having these chats. These chats become steamy, pillow whispers, all kinds of stuff like that. That's great. But fundamentally, what you want in a relationship is you want to take your girlfriend out to dinner, you want to introduce her to your friends. I don't think that's going away. And of course the main thing is physical touch. I mean, we have deep evolutionary programming driving us towards that. So I don't think a chatbot is going to replace relationships. What I am very hopeful about is the idea that AI relationships will actually improve real-life relationships, because it's a sandbox. People can try things out, people can get better. Now, obviously, what this requires is AI companies that make bots in a way that gives you the right amount of pushback, and maybe the bot sometimes gets angry or snarky or things like that, so you learn your way through these situations. But really, this is what we all go through as young people. We date, we screw things up a lot, and eventually we get better and better at knowing how to be a partner. And so the question is whether there's a way that people can get practice at this. One of the interesting research questions that's coming out now is about whether these AI relationships can serve as a way station, meaning they help people sort of dig themselves out of this hole of loneliness and then they go out and meet real people.

Speaker 1: When we come back: why our brains are so quick to anthropomorphize chatbots. Stay with us.
Speaker 1: David, I wonder if you could explain, from the perspective of a neuroscientist, what's the difference between me using Replika or Character AI and me asking ChatGPT, you know, what should I have for dinner tonight, or how should I approach this difficult conversation with my mother?

Speaker 2: The thing with Character AI or Replika is that there is, you know, a character, and sometimes it's represented with a visual avatar also. And we have these intensely social brains, where we don't have the circuitry to really distinguish something that's fake from something that's real. And so if you are talking to your avatar every day, and, by the way, she can send you text messages and say, hey, how are you feeling today, then it becomes in your mind like a real person. It's really hard to distinguish real from fake. You know, there's this great scene in Westworld, in the first episode, where William walks in. He's getting outfitted by this woman, and he awkwardly asks her, he says, I'm so sorry, but I can't tell, are you real? And she says, if you can't tell, does it matter? And I think this is the situation that we're in now.

Speaker 1: There was a New York Times article recently about this phenomenon, and one of the subjects, a person who has a relationship with an AI character, said: I don't actually believe he's real, but the effects that he has on my life are real. The feelings that he brings out of me are real. So I treat it as a real relationship.

Speaker 2: So I just read an article last night about this question of people saying thank you to ChatGPT when it gives you a nice answer, and I find that I often say thank you. I say, oh, that was excellent, thank you, and so on. We can't help but anthropomorphize. And, by the way, people have done this since time immemorial. They look at trees, or the flights of birds, or the patterns in the stars, or whatever, and they assign human intention to things.
Speaker 2: So we're very prone to doing this anyway. So I actually have an exact example. My friend and I have built a robot.

Speaker 1: That's a cool flex. I wish I could say that.

Speaker 2: So we have ChatGPT in there and you can have conversations with it. Now, this is actually meant for elderly people who live alone, and so we've built this as a companion robot that's doing all kinds of other things under the hood there. But my nine-year-old daughter was experimenting with it and talking with it, and so I left the room, but I heard that she was still talking with it. So, and maybe this is bad that I did this, but I sort of peeked around the corner to see who she was having a conversation with, and what I could see across the way was that she started crying. She was talking about our dog that died a while ago, and the robot was giving her such nice feedback about it, and how she can think about it, and so on, that my daughter cried. And this was all within, you know, I don't know, the two minutes of me leaving the room. People can get very, very close with these things, and the question for us to ask is whether it is better than a friend or a parent or whatever, because it's paying one hundred percent attention to her.

Speaker 1: I'm sure you're familiar with Eliza, the first chatbot, which Joseph Weizenbaum created in the sixties. I think he actually created it as a kind of parody of psychotherapy, where basically it would repeat everything you said back to you as a question. And he was kind of horrified that his secretary was using it, and then she asked him to leave the room so that she could speak privately with Eliza. I mean, that was sixty years ago. So in a sense this feels very novel, but, to your point, it has longer roots.

Speaker 2: Yes. And, you know, something I find interesting is when you look at Pixar films.
Speaker 2: You can take cars or toys or whatever, and you just give them a little voice, and then people care about the character, and they cry if something happens to the character. It's very easy for us to assign human intention to anything.

Speaker 1: Which also carries risk. I mean, I'm curious, how does this fit into your broader research about how our brain constructs reality? And are there any watch-outs about this reality construction with machines?

Speaker 2: The watch-out, of course, is the susceptibility to manipulation. I mean, look, people had this concern with TikTok from the beginning, which is: wow, this is addicting so many kids. What if the people who are running TikTok just start feeding one percent of a certain kind of video in there, and then two percent, whatever? Could they actually change the political affiliation of the children, and so on? And the answer is probably yes. I mean, we're really susceptible to our diet, to what we take in. So now imagine something that is a companion. Maybe you consider it your best friend or your girlfriend or boyfriend. And we just have to be really certain who the companies are that are running this. And I think this is never going to go away as a question. There's always going to be an issue. And obviously there's the issue about safety and privacy. As it stands now, these billion people with AI relationships, when they say whatever pillow whispers they're saying, that goes up to the cloud and is stored on the company's servers. I think it's not that long before this stuff will live on the edge, so it doesn't have to go off the device.
Speaker 2: But nonetheless, that's the watch-out. From the brain's point of view, there are many things like this, where we have these brains that evolved for certain kinds of action in the world, and we've been building technology forever to fool them. I mean, for goodness' sake, podcasts. You and I, Oz, were in different locations, but when a listener listens to this, we're right there in their ears, and it's a very intimate sort of thing. So we're doing all kinds of technologies where we're pushing things into the brain, where the brain says, oh, I've got it, Oz and David are right here. Oh, and, you know, there's a girlfriend right here.

Speaker 1: I'm curious, how do we develop relationships and fall in love with other real people? And is there anything different, as far as the brain's concerned, when it comes to developing relationships with AI?

Speaker 2: You know, I don't know that there's much of a difference there. And the reason is, when two people fall in love, they've got, you know, my Plato's cave is talking to this other person's Plato's cave over there. We're both locked in our internal models of the world. Look, I have a great marriage with my wife, but, you know, we nonetheless all the time have differences in the way we're seeing the world, because everybody lives on their own planet. Everyone has their own sense of what's going on and how to interpret stuff and what's right and wrong, politically and whatever. And so all that's happening is my data goes in, it's a little channel into her brain, and vice versa. And it's the same thing with an AI bot. As I said, because of all the pieces that are missing, as in the physical touch, the hey, I'm gonna, you know, take this girlfriend out to meet my friends, and so on, because all that's missing, I think it's not in real danger of replacing a relationship entirely. But it can fulfill a lot of the things that we are wired up to need.
Speaker 1: I wonder if you could talk about the different senses and how they relate to emotional attachments. So obviously you have text, and you could have, like, a written therapy bot. You have audio: you could make a deepfake of someone's voice, even a loved one's voice, and have them talk to you. You have video, in terms of these characters like Replika and Character AI that we talked about. And then you have this robot that you mentioned, who, you know, your daughter can confide in. How does the panoply of human senses interact in terms of forming attachment, and, as technology and robotics improve, will these human-machine interactions become even deeper?

Speaker 2: Yeah, I think it's inevitable that if we look five years from now or ten years from now, there will be humanoid robots that are really, really good. And I don't know whether it'll be twenty or thirty years before we have ones that are essentially indistinguishable, and that's going to be really interesting. Obviously, that can take care of the physical domain in a way that we couldn't before. I don't know if people will care about something being a real human, except for these very deeply etched evolutionary drives to reproduce. So your AI robot, which can serve as, like, a full companion, can't do that and never will be able to do that. And so I think there'll still be a drive towards real relationships. But what's clear is we are entering a very strange new world. Obviously, I mean, this goes without saying, but because of this exponential curve we're on, we're on the steepest part that humans have ever been on, such that if we lived two hundred years ago in some village, it would have been pretty straightforward to predict that the next ten years would be about the same. But boy, I think we have more in common with our ancestors ten thousand years ago than we do with our descendants one hundred years from now.
Speaker 1: Coming up: the potential benefits of living on the exponential curve. Stay with us.

Speaker 1: David, can you tell me what this does to our brains, when we're living on this exponential curve?

Speaker 2: Yeah. One thing it does is it keeps us young mentally, because we're constantly seeing new things and learning new things. Look, this is a totally goofy speculation, but one of the things that happens with dementia is that people, as they age, tend to fall into routines. It's because you start off with a lot of fluid intelligence as a baby, and what you get is crystallized intelligence when you're an adult. You sort of know how things go, you know how to operate in the world, you know how people act, this kind of thing. And that's great: that means you're able to operate successfully in the world. But the downside is it means your brain isn't changing much anymore. And what happens is, when there are other sorts of problems and pathologies that come in and chew up the brain tissue, then you lose cognitive ability. That's what we see in these different sorts of dementias. But it turns out that if you are using your brain and constantly making new pathways and constantly having to reconfigure things, that provides the best protection that we know of against dementia. Just as a second example, there's been this very long, ongoing study of nuns in these convents, and it turns out that they all agreed a long time ago to donate their brains upon their death. And it turned out that some fraction of these nuns had Alzheimer's disease, and yet nobody knew it when they were alive. They didn't show the cognitive symptoms. And the reason is, if you live in a convent till the day you die, you've got social responsibilities. You're talking with the sisters, you're playing games or singing songs, you're doing all these things, and you're constantly being challenged. And so even as parts of their brain were falling apart, they were building new roadways and bridges all the time.
Speaker 2: Anyway, I think that we're in a situation as a society now where, till the day we die, we're going to be building these new roadways, because there's so much surprising stuff happening all the time. I think we might actually see less dementia as a result.

Speaker 1: Which brings me back to this idea of AI relationships, because, of course, a lot of these AI companions are designed to please, not to challenge. You know, one of the things people always say about relationships is that they take work, but that's actually good, because doing the work makes you a more resilient person. And so that, to me, is one of the kind of lingering concerns about the nature of these relationships versus human relationships.

Speaker 2: I think that the companies will look very different just a couple of years from now, where they will be making AI relationships that are more realistic, because from the reports that I've seen on this, when there's a girlfriend or a boyfriend that is only there to please, people get pretty bored of that straight away. But if it's more like a real person, with all the foibles of a real person, and, you know, they have to go and they get angry and they whatever, then that is a more sticky relationship.

Speaker 1: We met a couple of weeks ago at Web Summit in Qatar, and I was boasting to you about a recent Tech Stuff interview with Geoffrey Hinton, and you were very kindly indulging me. But one of the things I found so remarkable about what he said was, you know: I did all this work because I wanted to win a Nobel Prize for understanding how the human brain worked, and instead I kind of contributed to the development of neural networks and deep learning and AI. Which I found a kind of remarkable thought. But how is this explosion of AI and neural networks influencing our understanding of our own brains?

Speaker 2: Yeah, really good question.
Speaker 2: So, artificial neural networks took off, you know, many decades ago as a way of saying: wow, the brain is really complicated. Every neuron in your head is as complicated as a city. It's got the entire human genome in it, it's trafficking millions of proteins in very specific cascades. So people said, that's really complicated; why don't we just say it's like a circle, it's a unit, and it's connected to other units. And that was the birth of artificial neural networks, and that took off in this incredible way to where we are now. But essentially, think of it like a fork in the road, where it's not really what the brain's doing; it's doing this other thing. So that's the bad news: it doesn't necessarily tell us exactly how the brain is working. The good news is that the power of AI now can help us analyze the neuroscience data that we have, and boy, is it complicated, rich data. You know, we have eighty-six billion neurons. We have something like two hundred trillion connections, and these things are changing every moment of your life, from cradle to grave. That's brain plasticity: every neuron is essentially crawling around and connecting and reconnecting and unplugging and seeking, and it's a really complicated system. There's so much more to figure out. What we have been developing for a while are better and better technologies to measure what's going on in the brain, but we're just sitting on terabytes of data, and we need the processing power of AI to understand this. So this is where the two roads come back together.

Speaker 1: You also formed a production company called Cognito Entertainment, and you gave this wonderful quote saying: in an unparalleled moment of scientific advancement, from brains to space to genetics, there are endless mind-blowing stories to share; in a world that sometimes seems upside down, science can be a source of great inspiration, wonder, and belief.
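To make the "circle connected to other units" abstraction Eagleman describes concrete, here is a minimal sketch of the standard textbook artificial neuron, not any particular lab's code: all of the cell's machinery is reduced to a weighted sum of inputs passed through a simple nonlinearity, and a network is just many such units wired together.

```python
import math

def unit(inputs: list[float], weights: list[float], bias: float) -> float:
    """One artificial 'neuron': a weighted sum of its inputs, squashed by a sigmoid."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid nonlinearity

# Two units "connected" to the same three inputs, each with its own weights:
x = [0.2, -1.0, 0.5]
print(unit(x, [0.7, 0.1, -0.3], bias=0.05))
print(unit(x, [-0.4, 0.9, 0.2], bias=-0.1))
```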
Speaker 1: So I just wanted to close by asking you: what are some of the things that you're most excited about?

Speaker 2: Yeah, I mean, we're in such an insanely incredible time. I'm lucky enough to be on the Stanford campus, and so I walk around and I see this lab and that talk and this visiting speaker, and everything is just moving so fast. What's interesting is you never know, at any moment in history, which things are going to cash out and which aren't. For example, Geoffrey Hinton, when he was doing his stuff many years ago: we all knew Hinton's work, and, you know, we were all familiar with it, but it didn't really seem like it was going somewhere, in the sense that, okay, here's the dark secret: most neuroscientists sort of snickered at AI for a long time, and then suddenly everyone said, whoa, I guess that worked. So the point is, there are all these hundreds of thousands of innovations happening everywhere, and people say, hey, maybe glial cells are doing something; hey, maybe I can do something interesting with these organoids; maybe I can do something interesting over here. We don't know which things are going to cash out. But one thing to mention on this is, when you look back at the World's Fairs, where everyone comes together and talks about what the next big thing is, I'm fascinated by the fact that these almost always missed what actually turned out to be the next big thing. You know, like in the sixties it was all about, what was it, underwater hotels and cutting trees with laser cutters and so on. But no one foresaw the Internet, which was probably the biggest change that happened to our species. So it turns out that we don't know where things are going. But boy, are we in an exciting time. And I am just such a fan of the existence of the Internet, because what it means is that when something is discovered now, it spreads instantly, globally. And that sounds so obvious to us, but it just wasn't that way not so long ago.
Speaker 2: For example, when I was getting my PhD, someone discovers something, they write a paper, it takes a few months to get that published, then it ends up in a journal. Then you go to the library, and you hope to be lucky enough to find that paper. You stick it on the Xerox machine and you hold it down with your elbow, because it's a big thick binder, and you try to xerox the paper. Like, it was really slow to get information around, just not that long ago. And suddenly that's all changed, and that really makes a big difference.

Speaker 1: David, thank you so much. I hope you'll join us again on the podcast soon. I hope you'll feed us some of the interesting things that you see developing at Stanford. And I really, really enjoyed today's conversation.

Speaker 2: Thank you, Oz, great to be here.

Speaker 1: That's it for this week for Tech Stuff. I'm Oz Woloshyn. This episode was produced by Eliza Dennis and Victoria Dominguez. It was executive produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song. Join us this Friday for Tech Stuff's The Week in Tech, when we'll run through the headlines and hear from tech entrepreneur and researcher Azeem Azhar. Please rate, review, and reach out to us at tech stuff podcast at gmail dot com. We want to know what's on your mind.