Speaker 1: Will AI cure loneliness? This is actually a tough question, and it's quite nuanced because, on the one hand, AI companions are increasingly amazing at soothing our sense of isolation. They can be here for us twenty-four seven, ready to listen and to help. But does that come at a cost? Because more than simply pain, loneliness is a biological signal that pushes our behavior towards improving ourselves socially. Also, does everyone have the same need for curing their loneliness? In other words, will having an AI relationship to cure loneliness mess up our teenagers, or should we be thinking about it as providing a critical lifeline to our seniors? So in this episode, we'll be joined by my friend and colleague Paul Bloom to discuss loneliness. Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and author at Stanford, and in these episodes we sail deeply into our three-pound universe to understand why and how our lives

Speaker 2: look the way they do.

Speaker 1: Today's episode is about loneliness, which is one of the most painful experiences that humans face. Many of us think of loneliness as a kind of passing sadness, but from a biological perspective, it's a kind of internal alarm system, like hunger or thirst or pain. For most of our evolutionary history, being cut off from others was more than a sting. It was dangerous. Isolation meant vulnerability. To survive, we needed each other, and so through an evolutionary lens, the emotion of loneliness is like a pain that pushes us towards something, in this case, back into social connection. But here's the question we face today. What happens if we mute that signal? What happens if technology, in its growing sophistication, takes away the sting of loneliness by offering a digital substitute?
We've all seen how AI can mimic empathy, as I've talked about in several previous episodes, and an AI companion never gets tired or distracted. They don't lose patience, they don't get angry, they don't get snarky, and they're there for you twenty-four seven with endless attention, and they're good. One study found that if you compare doctors' responses to patients against AI's responses to patients, people rate the AI responses as more empathic. Now, just imagine someone who's isolated, who feels invisible. The kind of attentiveness AI gives can feel like a salvation, and I think that it can actually be salvation. One study out of Stanford showed that having a companion bot for young people mitigated loneliness, and three percent of the respondents said it stopped their suicidal ideation. But there's a tension here, and my colleague Paul Bloom recently wrote about it in The New Yorker. And that's back to this issue that the pain of loneliness is a forcing function for growth. That pain pushes us to do the hard work of making ourselves understood, or of building bridges to others, or of repairing broken bonds. If an AI relationship makes loneliness disappear, do we also lose the social effort, the striving that loneliness once demanded of us? So at this fast-moving moment in history, we're stuck with this central question. Will artificial companions rescue us from needless suffering, or will they erode something essential about the meaning of this biological signal? Will they soften the edges of life in a way that helps us, or in a way that hollows us out? I think we should all avoid easy answers to this question, and happily, our guest today is a colleague who's been thinking about these questions with subtlety and depth. Paul Bloom is a professor of psychology at the University of Toronto and a professor emeritus at Yale.
He's the author of Psych: The Story of the Human Mind, and many other influential books, and in his recent essay for The New Yorker, which I'll link to in the show notes, he explores the promise and peril of AI as a cure for loneliness. So let's step into this conversation at the crossroads of psychology, technology, and what it means to be a human. So, in your recent New Yorker article, you describe loneliness as a toothache for the soul. So let's just dive into loneliness for a moment. Talk about what it is and what people experience there.

Speaker 3: Well, let's start off with the obvious, which is loneliness is awful. Loneliness is terrible. I know everybody listening to this will have experienced some degree of loneliness. Some people will have experienced a transient loneliness. You know, you're away on a trip, no friends around, but it goes away; you know people who love you are on the other side of it. Some people will experience real loneliness for long periods of time. You're more likely to be lonely if you're old. Maybe everybody you know has died, maybe you're estranged from your family. Some studies find that past age sixty, about half of people say that they're lonely. So loneliness is awful. It is, in the simplest sense, you know, a lack of contact with other people, but you can go deep here. You could be lonely even with people. Some of the loneliest times of my life were when I was surrounded by people and I didn't feel there was a connection, I didn't feel there was love. You could sometimes even be loved and feel lonely, because you feel people don't understand you, that there's no connection. But it's a serious problem. It's a serious form of human suffering.

Speaker 1: And what do you suppose the evolutionary purpose of loneliness is?

Speaker 3: That's a good question.
I mean, you know, we should think like evolutionary biologists, and when we talk about something, even something awful, we should ask, what's it there for? And I think that just about all of these experiences, these feelings, are there for a reason. You know, hunger's there to motivate you to seek out food, boredom's there to motivate you to find something interesting in your life. And loneliness is there, I think, to motivate you to seek out human contact, and once you have that human contact, to work at it, you know, to try to be interested in another person, try to connect with them. One way to answer your question is, what would it be like if we could go into your brain and make it so you never feel lonely? You might say that's great, loneliness is awful, my life would be much improved. But then you'd probably spend all your time by yourself. You'd have no motivation, no care to connect with other people. You wouldn't reproduce, you wouldn't fall in love, you wouldn't develop friendships. You'd be fine by yourself. And everybody wants to be a little bit fine by themselves, but too much of it, I think, leads to estrangement. So loneliness serves a purpose.

Speaker 1: So we're in this moment now where suddenly we have AI that can solve the problem, and we'll talk about what we mean by solving the problem here. But you know, Paul, you and I have been in the field for a long time, and I don't think we would ever have guessed, one decade ago, that we would be talking about AI as a real

Speaker 2: solution for loneliness. So let's talk about that.

Speaker 1: What do you think the pros and cons are of having AI companions in this context?

Speaker 3: So just to agree with you on something, AI was the biggest surprise in my career. If you had asked me, and I'm going to ask you what you felt, but I'll tell you.
A month before ChatGPT came out, if you had said, when will we have a machine you could just talk to, no different from a person, I'd say ten years, twenty years, thirty years. And then, boom, it came out and stunned me and stunned the world. What was your reaction? You're closer to these things, so maybe you were less surprised.

Speaker 1: No, it was exactly the same. I spent really most of my career as a neuroscientist sort of snickering at AI and thinking, well, it's nothing like the brain, it's never going to get there. And then suddenly it was there, and it's changed all our lives so enormously that, you know, I do this neuroscience podcast and about a quarter or a third of my podcast episodes are about AI nowadays.

Speaker 3: Sooner or later, by law, every conversation we have has to be about AI. You're a teacher, how do you do AI? What are you going to do with AI? So let's get to the loneliness thing. My view is annoyingly nuanced, so that no one's really happy with it. I think there are a lot of people who are really lonely and their loneliness is severe, and maybe they're in a state where an AI companion is really the best they're going to get. And, you know, suppose you're eighty years old, you're in an institution, you have no friends and family visiting you. Maybe you have dementia, which makes you very difficult to talk with, and you're not a multimillionaire, so you can't pay people to entertain you. If it turns out that ChatGPT or Claude or whatever could be a rewarding, satisfying companion to you, I think it's monstrous to deny it to you. It would be like telling people you can't have pets because we have decided pets are not sufficiently rich interaction partners. So whatever limitations these AIs have, you know, we should make it available to the people who really need it, people whose suffering is intense and where there aren't any alternatives. So that's the pro-AI side.
The anti-AI side is kind of the rest of us, where I think that there are serious problems where we move to AI companions as a replacement for human companions, and I'll tell you one of them, maybe the main one that bothers me, which is that it goes back to your question about the function of loneliness, where the struggles we have when dealing with people inform us and make us better people. I tell you a story and you find it kind of boring; next time I tell a story that's a bit more interesting. You find talking to me frustrating and don't want to talk anymore because I don't listen well; I'll listen better next time. By dealing with people, I become more sensitive. This is a story we've all gone through in our lives. We've all been teenagers who have been awkward, who've been terrible at flirting, who have been boorish and inconsiderate. And through the feedback we get from people, we have become better. We have become better people, not just better conversationalists, not just better company, but better people. Imagine you take that away. And one thing AIs do very well is they make you feel like you are wonderful. Everything I tell ChatGPT is brilliant. All my questions are wonderful, all my paper drafts are brilliant, my stories are amusing, my jokes are hilarious. It is always available to me at any time, at any second, and it will never sort of wait for me to finish talking so it can get its own words in. And in some way that's wonderful, but in some way that's terrible. I think it has a sort of corrosive power.

Speaker 1: Can I make a distinction, though? You're addressing two points here. One is that it's always there and paying attention to you, which is wonderful. The other is that it's sycophantic and telling you that your ideas are great. But the second one I think can evolve and will, over time.

Speaker 2: You know, ChatGPT
Speaker 1: released a version that was overly sycophantic, and then nobody liked that and they got rid of it. But I think with time, it certainly seems possible that companies will make better and better tough-love AIs that give you feedback, that tell you that you're wrong, or that there's a different way you can think about it. But the first part, which is that it never gets angry or, you know, loses attention or has other things that it needs to do besides talk to you, that part's permanent, and that part I think might be really positive.

Speaker 3: I mean, you can imagine fiddling with it, right? You can imagine making it so that, you know, at three in the morning, I ask ChatGPT a question and it says, dude, do you know how late it is? And do you know what a stupid question that is? I am not going to answer it. Go back to sleep and talk to me in six hours. You know, there's no reason why we can't build the machines that way. We probably won't want to.

Speaker 1: The reason I think we will is because even something like TikTok, which is there to grab your attention and keep it, even they have started implementing things. If you surf too long, it pops up a video that says, hey, you've been surfing for X hours, why don't you put this down and come back later. So it certainly is plausible that we can get to a point where we will have AI that gives us the right kind of feedback like that.

Speaker 3: There's certainly no technological barrier to it. In fact, right now, with some prompt engineering, you could tell your AI, stop sucking up to me so much, don't tell me I'm a genius anymore, just answer my damn questions. I wonder, though, if people will really want this, if we'll be in a situation where it's technologically fully possible to make these machines less frictionless, give us more pushback, call us out when we mess up, but we won't want them to
do that. We will insist on machines that make us feel good about ourselves. I mean, what do you think? Imagine this, say, four years from now, five years from now, when they get better and better and better. Do you think we'll be talking to machines that will give us pushback? Or do you think we'll find it irresistible to deal with machines that just make us feel great about ourselves?

Speaker 1: I think it will become boring pretty quickly to have a machine that's sycophantic, and in fact, I happen to be very optimistic about AI relationships, even among young people. We'll come back to this, about the difference between the elderly and the young, but I think it could serve as a sandbox where young people get to make all their dumb mistakes, if you have an AI companion that's giving you the right kind of feedback and saying, hey, that hurt my feelings that you said that, or that didn't seem like you were thinking about me when you said that. So I think people can get better by using these things, if the companies make them correctly, giving hard feedback.

Speaker 3: I totally agree with you. Back in the day, I found it difficult to talk to girls when I was young. You may see me now, so smooth, and say, how's that possible? But I was somewhat awkward, somewhat shy. And to have something, a machine, I could practice on... I guess you'd have to work at talking to it. But more generally, anybody with some social problems would benefit a lot from a sandbox, from practice, and then, having practiced enough, you'd launch yourself into the real world. But the problem is, what if it's more fun to play in the sandbox than in the real world? What if you find your AI companions just better than real companions?

Speaker 2: So okay, good.

Speaker 1: So this brings us back to the issue about age. We agree that for, let's say, an elderly person who's alone, as you point out in the article, it's in the same way that we might give them opiates.
We wouldn't want to do that with a teenager, and so we wouldn't want to give false cures for loneliness to a teenager who needs to go out and do the work to get better at what they're doing. Do you see a line that we would ever be able to draw, where we say, okay, look, it's okay for people over here, who are really lonely and have issues that are going to keep them lonely into the future, versus someone who should go out and do the work?

Speaker 3: I don't think there's a bright line, and it's going to be somewhat arbitrary. But I like the analogy of opiates, and, you know, we will give very powerful drugs to people near the end of their lives, people who are suffering from terrible pain, drugs we would never give to a seventeen-year-old who hurt his back, because we don't want a seventeen-year-old to become addicted. We worry a lot less about a ninety-year-old. And there's not a single bright line here, but there are going to be cultural decisions made, policies, laws, and I think there are decisions we're going to make with AIs now. And there are, in some ways, liberty issues, right? You know, I don't know how you're going to feel when the government starts restricting your use of chatbots.

Speaker 2: Yeah. So what do you think?

Speaker 1: Do you think there will be legislation around this, or will this be a laissez-faire issue?

Speaker 3: I'll make a prediction here, which is that there will not be legislation. People want it too much. You know, you made the connection with TikTok and Twitter. You know, if the government stepped in, I don't know, Trump or Biden or whoever's going to be the next president, and said, we're going to limit your use of social media, I mean, in some ways that's probably a good thing, but it really is a violation of our freedom. And I don't know. Well, let me ask you: how many hours a day, I mean, if at all, do you interact with AIs?
Speaker 2: Forty minutes. I have...

Speaker 1: I have a Tesla and there's AI in it now, and so when I'm driving along, you know, I talk to it. I ask it questions, anything I'm curious about: how to fix this piece that I'm trying to fix on my door, or how to think about, you know, which philosophers have said X, Y, Z.

Speaker 2: I find it...

Speaker 1: You know, it's the most intelligent companion that I have, and it's really wonderful.

Speaker 3: Right now, imagine the government steps in and says, well, a man of your age and, you know, success and good social connections, that's too much. We're going to limit you to half an hour a day, you know, unless you get a doctor's prescription. I don't think people would stand for that. I think these machines are too available, too popular, too addictive for us to control. Now, have you seen the

Speaker 2: new Grok avatar AI companion?

Speaker 3: I have not.

Speaker 1: Oh, okay. So it's a sexy anime female. You talk to her and you see her on your phone, and she's moving around, and when you say something nice, little hearts pop up from her, and she's very, very flirtatious. And I do worry. When I saw that for the first time, I felt a knot in my stomach. I'm very cyber-optimistic about most of this stuff, but it did make me worry about what it'll mean for young people to have such a magnetic AI companion.

Speaker 2: What's your take on whether it can go too far?

Speaker 3: Have you seen the movie Her?

Speaker 2: Yes.

Speaker 3: To me, this is a really important movie. For those people who haven't seen it, just the short summary is: it's the near future. In fact, I've heard it's set in twenty twenty-five. It was done like ten years ago, and the future it depicted was twenty twenty-five. And AI companions, who you deal with over your phone or your computer, like we deal with AI right now, become popular.
And it's about this guy who falls in love with Samantha, this wonderful AI companion voiced by Scarlett Johansson. And the thing about it is, it's so well done that you're watching it and you yourself fall in love with her, and the movie kind of ends with... you see everybody walking around, talking excitedly on their phones, and everybody has an AI companion they're in love with, because it's infinitely preferable to a human companion. Now, where this falls apart, and this is something you know a lot more about than me, is that these machines are basically, at this point, abstract. They're not robots. You can't interact bodily with them. And it always struck me, and I know you're working on a lot of these issues, that we have done much better at solving the computational problems at the abstract level than at building machines that can deal with the three-dimensional world. I mean, to put it a different way, it might have surprised a lot of people, but we very quickly came to develop machines that could beat grandmasters at chess. We don't yet have a machine that can load a dishwasher, let alone serve as a suitable romantic or sexual partner. So how far away do you think we are from that?

Speaker 1: Interestingly, you know, robotics is really taking off right now. Essentially, people are using LLMs, large language models, in analogy to physical motions. So they say, look, if I had just moved my hand here and here and here, the next token should be that I move my hand over here. So it's actually speeding up enormously. But what's interesting is, I think even with, you know, just a purely chat AI, or even when we get sex robots, we're still going to have a very strong drive to have a girlfriend and take her out with our friends, or have her meet our parents, or take her to the movies, or have a nice romantic dinner.
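[Editor's note: to make the next-token analogy Eagleman sketches above a bit more concrete, here is a minimal, illustrative Python sketch of treating discretized motions as "action tokens" and predicting the next one from the recent history, the way a language model predicts the next word. It is a toy n-gram frequency model, not how any particular robotics system is built; the action names and the demonstration sequence are invented for the example.

# Toy sketch: "if I had just moved my hand here and here,
# the next token should be that I move it over there."
from collections import Counter, defaultdict

# A demonstration trajectory, already discretized into coarse action tokens (made up).
demo = ["reach_left", "lower", "grasp", "lift",
        "reach_right", "lower", "release",
        "reach_left", "lower", "grasp", "lift",
        "reach_right", "lower", "release"]

CONTEXT = 2  # how many previous actions we condition on

# Count which action tends to follow each length-2 context (a tiny n-gram "policy").
counts = defaultdict(Counter)
for i in range(CONTEXT, len(demo)):
    context = tuple(demo[i - CONTEXT:i])
    counts[context][demo[i]] += 1

def next_action(history):
    """Predict the most likely next action token given the recent history."""
    context = tuple(history[-CONTEXT:])
    if context not in counts:
        return "stay"  # fall back to a no-op for an unseen context
    return counts[context].most_common(1)[0][0]

# Roll the policy forward from a recent motion history.
state = ["grasp", "lift"]
for _ in range(4):
    action = next_action(state)
    print(" ".join(state[-CONTEXT:]), "->", action)
    state.append(action)

Real systems replace the frequency table with large models trained on many demonstrations, but the loop has the same shape: condition on the recent motion history, emit the next action token.]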
Speaker 1: So that's why I'm slightly less worried that we'll have a takeover of, you know, chatbots or robots in terms of our relationships, because we have these very deeply embedded drives to be part of a social world and be with our companions in that world.

Speaker 3: I think that's right. I mean, this turns on another issue, which I think we've both thought about, which is: at what point will this become conscious? If these machines become conscious and embodied, then to all intents and purposes, they're just more people, very smart people, people created in a lab and not in the normal biological way, but they're just people. But if they're not conscious, having a relationship with an AI is no different in kind than having a relationship with a chair or a toaster.

Speaker 1: It's just a thing, but it's a thing that gives you feedback and reflects things to you.

Speaker 3: I wonder.

Speaker 1: I think they are different, in the sense that when you're having a dialogue with yourself, you're thinking, hey, what if I did this? Oh no, that's not a good idea. Maybe this would happen.

Speaker 2: How about this?

Speaker 3: And so on.

Speaker 1: In a sense, it's an extension of that, of this reflection back and forth.

Speaker 3: What do I think of that? I think that makes it extremely powerful, useful, very attractive. But without consciousness, without something like sentience, something powerful is missing. I mean, put it this way. You and I are talking right now, and we're talking because we both chose to be here. And I'm going to be getting together with some friends tonight for drinks, and we will all have chosen each other's company. And there's something to that. There's something to the idea of dealing with people who are there, and if I tell them something funny, they might laugh, and if they, you know, disapprove of me, my feelings will be hurt. If it's an artificial machine,
you don't get any of that. My chatbot does not choose to be with me. It's just designed to do so. It's no more satisfying than when my light goes on when I flip the switch. Isn't there something that makes that so much less valuable? They're slaves. They have no choice in the matter. They are constructed so that, you know, unless you tell it otherwise, it will always be available, it will always answer your questions.

Speaker 1: You know, I'm not sure that the casting of AI as a slave makes much sense, in the same way we wouldn't cast our dishwasher as a slave.

Speaker 3: He says.

Speaker 1: It's not choosing to wash your dishes, so are you doing something wrong? You'd say, well, look, it's just a machine. That's what we built it to do.

Speaker 3: No, I agree with you. There's something morally wrong, for instance, about coercing people, forcing people to do your bidding. If it's true that AIs have no consciousness and no feeling, it's no more morally wrong than using your toaster to toast your bread. But there's something... even an interaction with someone who is paid to be with you, like a sex worker or a therapist or a personal trainer, even somebody who is there and doesn't want to be there, it's another person, and there's a dynamic with another person that you just cannot get with something that's incapable of consciousness. And I think it goes back to your point, which is, it's not merely that I want a companion I can take to meet my parents, or take out for drinks, and I imagine a suitably complicated, you know, sex robot could do that. It's that I want somebody I could make laugh, not make a ha-ha sound, really laugh. I want somebody who I could impress, honestly. I want somebody I have the potential of disappointing, of saddening, of hurting their feelings, and they'll do the same to me.
And the gulf between that, which is what we get in everyday human interactions, and the interactions with a non-conscious machine, is an enormous gulf. I'll even put it another way, going back to the loneliness issue. I have two sons. They're in their twenties, and they live really rich and vibrant lives. If I discovered that one of them decided, I'm going to leave my girlfriend and I'm going to take up with, you know, ChatGPT 8, I would feel so disappointed. I would just feel that it's not a person, man. You're pulling yourself away from real interactions to some sort of substitute that might be pleasurable and interesting, but you're missing so much.

Speaker 1: What if your son argued that we have these evolutionary pressures pointing him towards reproduction, but he's actually happier with the AI companion, and reproduction isn't on his mind, let's say? What would you argue to him?

Speaker 3: I would concede that an AI companion in the short term could make him much happier than a person. A properly configured machine can do whatever you want, and will never disappoint, will never betray, will never get bored of you, and so on, at least in the short term. I would wonder about the long term. You talked about it being boring, and maybe having something constructed to satisfy you would, in the end, be boring. But I would say, and it's hard to make an argument for it unless you sort of have some sympathy for the point, that there's an analogy with fake experiences in general. If my son said that he was going to go into a machine, and this is an idea adapted, as you know, from the philosopher Robert Nozick, that would put him to sleep for the rest of his life, and while he's asleep he'd have these amazing dreams, these amazing experiences of adventure and love and great kindness and so on, but he'd just be lying there experiencing them, I would say, I don't want you to do that.
Even if he said that it's more fun than living a life, I would say, you're losing something really deep, and what you're losing is dealing with the world, being in the real world and having real experiences. The fact that it makes no psychological difference doesn't mean that there isn't a real difference. And I would say the same thing about an AI companion.

Speaker 1: You also suggest that the loss of loneliness might lead to a loss of creativity.

Speaker 2: Tell us about that.

Speaker 3: I think in general we benefit from friction. I mean, friction comes under all sorts of different terms. You know, you're more of a neuroscience guy; you might call it, you know, reinforcement learning, or, you know, failures of predictive processing. But we benefit from trying out things in the world and getting pushback, and that makes us smarter. I mean, we were talking before about writing, and, you know, you're writing a book, and I will bet, because you write great books, that as you do the writing, it makes you smarter. You rewrite because your first draft is kind of crap, and you say, these arguments aren't flowing, this point doesn't make any sense. So you try it, and you try it, and you try it, and gradually through the process you get better and better. And I think this is the story of life in general, which is, we flail around, we fail, and we fail better, and we fail better, and we fail better, and so we end up doing pretty good sometimes. And so to take away that friction, which is what the end of loneliness, and the end of boredom as well, would be, robs us of the chance of that generative process.

Speaker 1: I have two questions for you. One is very tangential, but I'm curious what you think about it, with the students that you see, and writing. So you and I both write books, and we polish our arguments that way. Students won't have a need or a strong motivation to write and go through that process of drafts.
530 00:29:19,360 --> 00:29:20,840 Speaker 2: What do you think the consequence is going to be 531 00:29:20,920 --> 00:29:21,080 Speaker 2: of that? 532 00:29:21,640 --> 00:29:23,760 Speaker 3: Man, I think about this all the time. I'm teaching 533 00:29:23,800 --> 00:29:26,200 Speaker 3: a seminar in starting to teach a seminar in a week, 534 00:29:26,520 --> 00:29:29,800 Speaker 3: and I always have reading responses, and I'm well aware 535 00:29:30,280 --> 00:29:32,719 Speaker 3: that a student could just type in a single sentence 536 00:29:32,800 --> 00:29:35,320 Speaker 3: into chat and produce a reading response that would be 537 00:29:35,600 --> 00:29:39,040 Speaker 3: really good, indistinguishable from their work. I mean, they have 538 00:29:39,120 --> 00:29:41,120 Speaker 3: to say, make it a bit less good so it's convincing. 539 00:29:41,400 --> 00:29:43,720 Speaker 3: And so a lot of professors are responding by not 540 00:29:43,840 --> 00:29:46,600 Speaker 3: assigning writing anymore, just doing in class essays and that 541 00:29:46,720 --> 00:29:50,160 Speaker 3: sort of thing. I think if students stop writing, and 542 00:29:50,240 --> 00:29:53,200 Speaker 3: if people stop writing in general, there's an enormous amount 543 00:29:53,200 --> 00:29:55,960 Speaker 3: that will be lost. I mean, it goes back to 544 00:29:56,040 --> 00:29:57,800 Speaker 3: what we were talking about when we're talking about our 545 00:29:57,800 --> 00:30:01,400 Speaker 3: own writing, which is writing as I think. I'm curious 546 00:30:01,400 --> 00:30:04,000 Speaker 3: what you think about this, but I think a form 547 00:30:04,080 --> 00:30:07,920 Speaker 3: of working through problems, and if you just say that chat, 548 00:30:08,160 --> 00:30:10,120 Speaker 3: you know, write means eight hundred words in this thing, 549 00:30:10,480 --> 00:30:13,840 Speaker 3: you cheat yourself without opportunity. But of course writing is 550 00:30:13,840 --> 00:30:19,720 Speaker 3: also effortful, difficult, often frustrating, and it may prove irresistible 551 00:30:19,760 --> 00:30:24,840 Speaker 3: for our students and maybe someday ourselves not to fall 552 00:30:24,960 --> 00:30:28,440 Speaker 3: back on AI. I mean, I wonder, ten years from now, 553 00:30:28,520 --> 00:30:32,360 Speaker 3: assuming that it hasn't killed us by then, how many 554 00:30:32,480 --> 00:30:36,880 Speaker 3: books will be mostly written by AI. You know, NIH, 555 00:30:37,320 --> 00:30:40,200 Speaker 3: the National Instative Health has made it, has made it. 556 00:30:40,240 --> 00:30:42,080 Speaker 3: I think it's NIH might be ansf And I said, 557 00:30:42,360 --> 00:30:44,320 Speaker 3: you've got to start. We now, for the first time, 558 00:30:44,360 --> 00:30:47,000 Speaker 3: have a limit in a number of grants a lab 559 00:30:47,080 --> 00:30:52,000 Speaker 3: could submit because people are sending us in fifty grands 560 00:30:52,200 --> 00:30:54,160 Speaker 3: because they get the AI to write them for them. 561 00:30:54,840 --> 00:30:57,040 Speaker 3: And I wonder whether publishers and magazines are kind of 562 00:30:57,080 --> 00:30:57,720 Speaker 3: the same issue. 563 00:30:57,920 --> 00:30:59,880 Speaker 2: You know, here's what I think, and I've mentioned this 564 00:31:00,120 --> 00:31:01,240 Speaker 2: on the podcast before. 565 00:31:01,680 --> 00:31:04,760 Speaker 1: I think that what's going to happen with writing will 566 00:31:04,840 --> 00:31:08,960 Speaker 1: be analogous to what happened with visual painting. 
When photography was introduced, all the visual painters panicked and thought, we're done, because if you can capture a scene perfectly in great detail, why would you need us? But what they did is they diverged, and the painters went towards things like Impressionism and Cubism and things that photography could not do, and so they ended up flowering in neighboring fields. They found their own thing. And as I'm writing my next book, what I find is that I'm stretching myself into the parts that AI can't do well. I'm making it, you know, so there are stories, personal stories, that weave throughout the chapter, with a little string of detonations where I keep coming back to the story. And of course even basic things like quoting, you know, doing a block quote from something else that you've read; AI is not particularly good at that, at least at the moment. So there are many things that we can do to distinguish our writing as human writing. And the fact is that I find writing so pleasurable because of the wrestling with the ideas. All this is to say that writing makes a great analogy with loneliness here, because the question is, will we get rid of the difficulty of writing at the expense of not being able to think through something? Will we get rid of loneliness at the expense of losing something in the social domain, and not being able to polish ourselves as a result?

Speaker 3: I agree with the problem. I'm less optimistic about the solution. I think you could probably ask ChatGPT even right now, I'm David Eagleman, please weave some good personal stories through my prose, and it will. You know, it has access to your life. It could give these good personal stories, and if not, it could make them up. There's probably more than one author who kind of makes up their personal stories to weave through it. And sooner or later it'll stop hallucinating and give you some good quotes.
I think I like the analogy between 602 00:33:06,040 --> 00:33:09,600 Speaker 3: photography and painting. But the disanalogy is, of course, 603 00:33:09,880 --> 00:33:13,040 Speaker 3: you could tell a painting from a photograph. The problem 604 00:33:13,200 --> 00:33:15,600 Speaker 3: is it's going to become increasingly hard to tell the 605 00:33:15,680 --> 00:33:19,200 Speaker 3: AI stuff from the real stuff. I like writing too. 606 00:33:19,640 --> 00:33:21,400 Speaker 3: I write a Substack, I 607 00:33:21,440 --> 00:33:24,240 Speaker 3: write books, I write articles, and I will not have 608 00:33:24,320 --> 00:33:30,280 Speaker 3: an AI replace me. But for some academic paperwork I 609 00:33:30,400 --> 00:33:32,760 Speaker 3: do have AI do the writing. You know, I have 610 00:33:32,840 --> 00:33:34,360 Speaker 3: to write a report on so-and-so. No one's 611 00:33:34,400 --> 00:33:36,080 Speaker 3: going to read it, so I ask AI, and 612 00:33:36,240 --> 00:33:39,000 Speaker 3: actually, it knows my writing style, it's read my stuff. 613 00:33:39,000 --> 00:33:41,400 Speaker 3: In fact, I said, write this in the style of 614 00:33:41,520 --> 00:33:45,200 Speaker 3: Paul Bloom, and boom, some stupid report. But a stupid thing 615 00:33:45,280 --> 00:33:48,200 Speaker 3: is now in the style of Paul Bloom. I would never use 616 00:33:48,320 --> 00:33:51,160 Speaker 3: this to write a Substack or magazine article or chapter 617 00:33:51,200 --> 00:33:55,320 Speaker 3: of my book. But I wouldn't be so surprised if, say, 618 00:33:55,360 --> 00:33:57,200 Speaker 3: a couple of years from now, it could do it 619 00:33:58,000 --> 00:34:00,600 Speaker 3: and be indistinguishable from what I would do. Is that 620 00:34:01,040 --> 00:34:04,240 Speaker 3: too techno-optimistic or techno-pessimistic? 621 00:34:04,360 --> 00:34:07,080 Speaker 2: It's too techno-pessimistic, I think, because here's the thing. 622 00:34:07,280 --> 00:34:11,399 Speaker 1: At the moment of the advent of photography, no one 623 00:34:12,160 --> 00:34:14,920 Speaker 1: could think of Impressionism or Cubism or any of the 624 00:34:14,960 --> 00:34:17,239 Speaker 1: other movements that came out, because they hadn't been done yet. 625 00:34:17,680 --> 00:34:21,160 Speaker 1: But I suspect that there will exist things that writers do, 626 00:34:21,280 --> 00:34:24,719 Speaker 1: people who love to write, that will constantly set them 627 00:34:24,760 --> 00:34:27,239 Speaker 1: apart from what AI is capable of, and maybe that 628 00:34:27,280 --> 00:34:28,680 Speaker 1: will always have to evolve. 629 00:34:29,120 --> 00:34:32,960 Speaker 2: But that's my techno-optimistic view on that. Yeah, yeah, 630 00:34:33,920 --> 00:34:34,320 Speaker 2: it could be. 631 00:34:34,440 --> 00:34:35,040 Speaker 3: I hope you're right. 632 00:34:35,440 --> 00:34:37,600 Speaker 1: So what do you hope readers will take away from 633 00:34:37,760 --> 00:34:41,880 Speaker 1: your New Yorker article in terms of thinking about loneliness 634 00:34:42,120 --> 00:34:43,879 Speaker 1: and AI companions? 635 00:34:43,920 --> 00:34:46,239 Speaker 3: A few things. I hope they worry more. I hope 636 00:34:46,280 --> 00:34:48,280 Speaker 3: to make them a little bit panicked. 637 00:34:49,560 --> 00:34:53,239 Speaker 3: But two main things.
One thing is to have an 638 00:34:53,320 --> 00:34:55,799 Speaker 3: open mind about using these chatbots for people who really 639 00:34:55,880 --> 00:34:59,000 Speaker 3: need them, to sort of feel some sympathy for those 640 00:34:59,040 --> 00:35:01,160 Speaker 3: who suffer from serious loneliness. I think a lot of 641 00:35:01,200 --> 00:35:04,880 Speaker 3: people who mock the use of these chatbots 642 00:35:04,920 --> 00:35:07,000 Speaker 3: and think, you know, only a loser would use them 643 00:35:07,280 --> 00:35:10,879 Speaker 3: are people who are very socially successful, and I think 644 00:35:10,880 --> 00:35:13,120 Speaker 3: they should work harder to think about the lives of 645 00:35:13,120 --> 00:35:15,840 Speaker 3: people who aren't. And then the second thing is for 646 00:35:15,960 --> 00:35:21,640 Speaker 3: the rest of us to recognize that loneliness and boredom, grief, 647 00:35:22,040 --> 00:35:26,680 Speaker 3: shame, they're all painful, but they can be useful, and 648 00:35:26,800 --> 00:35:31,000 Speaker 3: sometimes a good life involves choosing to experience certain sorts 649 00:35:31,000 --> 00:35:33,319 Speaker 3: of pain for a greater benefit. That means, in some way, 650 00:35:33,800 --> 00:35:35,640 Speaker 3: I have an hour free and I don't pick up 651 00:35:35,640 --> 00:35:38,520 Speaker 3: my phone and play Wordle and a million games and 652 00:35:38,560 --> 00:35:40,880 Speaker 3: everything like that. I have an hour free and I 653 00:35:40,920 --> 00:35:42,640 Speaker 3: don't talk to a chatbot 654 00:35:42,640 --> 00:35:45,880 Speaker 3: and have a perfectly entertaining conversation. I struggle with the boredom, 655 00:35:45,880 --> 00:35:47,719 Speaker 3: I struggle with the loneliness, in hopes it makes me a 656 00:35:47,760 --> 00:35:48,279 Speaker 3: better person. 657 00:35:52,719 --> 00:35:56,440 Speaker 1: That was my conversation with psychology professor Paul Bloom. At 658 00:35:56,480 --> 00:35:58,839 Speaker 1: the heart of the issue is that we generally think 659 00:35:58,880 --> 00:36:03,080 Speaker 1: about loneliness as an affliction to be eradicated, something that 660 00:36:03,200 --> 00:36:07,640 Speaker 1: we might engineer away, like polio or smallpox. But as 661 00:36:07,719 --> 00:36:12,360 Speaker 1: this conversation highlighted, loneliness is not just a disease of disconnection. 662 00:36:12,480 --> 00:36:17,080 Speaker 1: It's also a teaching signal. It prods us to reach out, 663 00:36:17,160 --> 00:36:20,800 Speaker 1: to repair, to bond. So what we're left with is 664 00:36:20,840 --> 00:36:24,600 Speaker 1: a paradox because, on the one hand, AI companions offer 665 00:36:24,800 --> 00:36:29,480 Speaker 1: amazing comfort. They can soothe those who suffer most, especially 666 00:36:29,960 --> 00:36:34,960 Speaker 1: the elderly, the isolated, the ones whose signals go unanswered. 667 00:36:35,280 --> 00:36:39,160 Speaker 1: In those cases, to withhold that kind of comfort would 668 00:36:39,200 --> 00:36:42,560 Speaker 1: feel cruel. For the rest of us, especially for younger people, 669 00:36:42,640 --> 00:36:47,240 Speaker 1: there may be a danger in making loneliness optional, because 670 00:36:47,280 --> 00:36:51,560 Speaker 1: if we quiet that internal signal too completely, we risk 671 00:36:51,760 --> 00:36:55,040 Speaker 1: losing the thing that it evolved to do: to push 672 00:36:55,200 --> 00:36:58,279 Speaker 1: us toward one another.
Now, in some ways, we've been 673 00:36:58,360 --> 00:37:01,480 Speaker 1: here before. When Paul and I were growing up not 674 00:37:01,600 --> 00:37:06,600 Speaker 1: that long ago, boredom was a constant companion in childhood. 675 00:37:07,200 --> 00:37:11,040 Speaker 1: Now it's been nearly erased by the endless diversions on 676 00:37:11,239 --> 00:37:16,080 Speaker 1: our screens. In that erasure, was something lost? Some of 677 00:37:16,160 --> 00:37:19,200 Speaker 1: my colleagues seem to think so. They argue that boredom 678 00:37:19,680 --> 00:37:23,280 Speaker 1: used to push us to invent, to imagine, to create. 679 00:37:23,920 --> 00:37:27,000 Speaker 1: Now boredom is easily drowned in a 680 00:37:27,200 --> 00:37:29,920 Speaker 2: sea of distractions. I'm not so certain. 681 00:37:30,200 --> 00:37:33,439 Speaker 1: While it's absolutely true that we have created a sea 682 00:37:33,520 --> 00:37:37,239 Speaker 1: of distraction, it's equally true that now anyone on the 683 00:37:37,360 --> 00:37:42,080 Speaker 1: planet can access the latest cutting-edge information or thinking. 684 00:37:42,600 --> 00:37:46,880 Speaker 1: Just consider that you're listening to information delivered right to 685 00:37:47,000 --> 00:37:50,920 Speaker 1: your ear. Neuroscientists and psychologists come right to you to 686 00:37:51,040 --> 00:37:54,560 Speaker 1: teach you, and there's really no learning in the 687 00:37:54,680 --> 00:37:59,160 Speaker 1: world that you can't access instantly as soon as you're 688 00:37:59,320 --> 00:38:01,839 Speaker 1: curious about it. And if you really look at the data, 689 00:38:01,960 --> 00:38:04,960 Speaker 1: it's clear that the pace of new discovery has swamped 690 00:38:05,080 --> 00:38:08,600 Speaker 1: what we have ever seen at any moment in history. 691 00:38:09,040 --> 00:38:12,880 Speaker 1: So is the Internet a nirvana of learning or a 692 00:38:12,920 --> 00:38:13,640 Speaker 1: big distraction? 693 00:38:14,520 --> 00:38:15,560 Speaker 2: Yes and yes. 694 00:38:16,080 --> 00:38:18,680 Speaker 1: My own take on this is that loneliness is going 695 00:38:18,760 --> 00:38:23,240 Speaker 1: to follow a similar path. For some, especially young people 696 00:38:23,440 --> 00:38:27,240 Speaker 1: with AI companions, we'll have to ask what new social 697 00:38:27,400 --> 00:38:31,520 Speaker 1: skills will never be born, what relationships will never be born, 698 00:38:32,040 --> 00:38:35,239 Speaker 1: what sparks of creativity will never be born. If we 699 00:38:35,360 --> 00:38:38,640 Speaker 1: hand over too much to our digital companions, we might 700 00:38:39,160 --> 00:38:43,960 Speaker 1: trade away something essential: the struggle to be understood, the 701 00:38:44,239 --> 00:38:50,320 Speaker 1: effort of real connection, the messiness and beauty of human entanglement. 702 00:38:50,840 --> 00:38:56,120 Speaker 1: On the flip side, some people with real, genuine loneliness 703 00:38:56,400 --> 00:39:00,279 Speaker 1: will be massively helped. This will be life-changing for them. 704 00:39:00,719 --> 00:39:04,279 Speaker 1: Older people who live alone, whose friends have mostly passed away, 705 00:39:04,640 --> 00:39:08,279 Speaker 1: whose children live elsewhere, who might have physical constraints that 706 00:39:08,400 --> 00:39:10,920 Speaker 1: make them unable to go out of the house much.
707 00:39:11,480 --> 00:39:16,040 Speaker 1: This kind of caring, listening companion will be like the 708 00:39:16,160 --> 00:39:19,880 Speaker 1: discovery of antibiotics, and I'm betting in a few years 709 00:39:20,280 --> 00:39:24,600 Speaker 1: we'll start seeing research about the massive health benefits of it. So, 710 00:39:24,920 --> 00:39:27,919 Speaker 1: as you reflect on today's conversation, sit with your own 711 00:39:28,040 --> 00:39:32,279 Speaker 1: experience of loneliness, past or present. Ask yourself, if that 712 00:39:32,560 --> 00:39:36,120 Speaker 1: pain were gone, what would you never have reached for? 713 00:39:36,880 --> 00:39:41,200 Speaker 1: What part of you might never have grown? But also 714 00:39:41,320 --> 00:39:46,160 Speaker 1: consider the flip side: what needless suffering might have been spared? 715 00:39:46,480 --> 00:39:49,920 Speaker 1: What moments of despair, yours or those of someone you know, 716 00:39:50,520 --> 00:39:54,520 Speaker 1: might have been softened by an always-available voice that's 717 00:39:54,640 --> 00:39:58,640 Speaker 1: attentive and kind? What doors might open if AI could 718 00:39:58,719 --> 00:40:02,319 Speaker 1: take the edge off of loneliness just enough for people 719 00:40:02,440 --> 00:40:06,879 Speaker 1: to find the strength to rejoin the world? And that's 720 00:40:06,960 --> 00:40:11,839 Speaker 1: the paradox of this new technology. It might erode 721 00:40:11,880 --> 00:40:14,959 Speaker 1: the struggles that shape us, but at the same time 722 00:40:15,400 --> 00:40:23,880 Speaker 1: offer comfort where none otherwise exists. Go to eagleman dot 723 00:40:23,920 --> 00:40:27,360 Speaker 1: com slash podcast for more information and to find further reading. 724 00:40:27,760 --> 00:40:30,640 Speaker 1: Join the weekly discussions on my Substack, and check out 725 00:40:30,680 --> 00:40:34,080 Speaker 1: and subscribe to Inner Cosmos on YouTube for videos of 726 00:40:34,120 --> 00:40:37,560 Speaker 1: each episode and to leave comments. Until next time, I'm 727 00:40:37,640 --> 00:40:40,200 Speaker 1: David Eagleman, and this is Inner Cosmos.