1 00:00:00,960 --> 00:00:04,280 Speaker 1: Here, then, I repeat and sum up. During the endless 2 00:00:04,320 --> 00:00:08,080 Speaker 1: train journey which took me from Eisenach to Berlin, across 3 00:00:08,119 --> 00:00:11,760 Speaker 1: Thuringia and Saxony in ruins, I noticed for the 4 00:00:11,840 --> 00:00:14,960 Speaker 1: first time, and I don't know for how long, that man 5 00:00:15,000 --> 00:00:19,120 Speaker 1: whom I call my double, to simplify matters, or else 6 00:00:19,560 --> 00:00:33,280 Speaker 1: my twin, or again, and less theatrically, the traveler. Welcome 7 00:00:33,280 --> 00:00:34,960 Speaker 1: to Stuff to Blow Your Mind, a production of 8 00:00:35,040 --> 00:00:44,000 Speaker 1: iHeartRadio's How Stuff Works. Hey, welcome to Stuff to 9 00:00:44,040 --> 00:00:46,519 Speaker 1: Blow Your Mind. My name is Robert Lamb, and I'm 10 00:00:46,600 --> 00:00:49,120 Speaker 1: Joe McCormick. And today I thought we might have a 11 00:00:49,159 --> 00:00:55,080 Speaker 1: discussion bringing together the seemingly disparate topics of familiarity, doppelgangers 12 00:00:55,800 --> 00:01:00,160 Speaker 1: or doubles, Capgras syndrome, and social media. And I 13 00:01:00,200 --> 00:01:02,320 Speaker 1: got the idea to talk about this today because a 14 00:01:02,320 --> 00:01:05,320 Speaker 1: few weeks ago I read this interesting article that had 15 00:01:05,360 --> 00:01:09,360 Speaker 1: a very intriguing central comparison or image. It 16 00:01:09,440 --> 00:01:11,840 Speaker 1: was a thought-provoking essay by the Stanford neuro 17 00:01:11,880 --> 00:01:15,839 Speaker 1: endocrinologist Robert Sapolsky.
It was originally published a few years 18 00:01:15,880 --> 00:01:19,400 Speaker 1: ago in Nautilus, and it was an article comparing the 19 00:01:19,440 --> 00:01:23,520 Speaker 1: effects of social media and sort of the digital 20 00:01:23,520 --> 00:01:27,000 Speaker 1: world, like Facebook and, you know, all that, to a 21 00:01:27,040 --> 00:01:31,760 Speaker 1: psychological condition known as Capgras syndrome. And so today I thought 22 00:01:31,800 --> 00:01:35,880 Speaker 1: maybe we should start by explaining and discussing Sapolsky's comparison 23 00:01:35,920 --> 00:01:38,760 Speaker 1: and argument in that article and just see where we 24 00:01:38,800 --> 00:01:41,960 Speaker 1: go from there. Now, Capgras syndrome has definitely come up 25 00:01:41,959 --> 00:01:44,240 Speaker 1: on the show before. I don't know that we've done 26 00:01:44,480 --> 00:01:48,639 Speaker 1: like a designated show on the topic, but it's certainly 27 00:01:48,680 --> 00:01:50,720 Speaker 1: come up. But either way, we do need to, 28 00:01:50,880 --> 00:01:53,880 Speaker 1: you know, provide a brief refresher on its history for 29 00:01:53,880 --> 00:01:57,480 Speaker 1: our listeners. Right. One of the important cases, which 30 00:01:57,680 --> 00:02:02,279 Speaker 1: Sapolsky discusses in his article, is the case of Madame 31 00:02:02,400 --> 00:02:05,200 Speaker 1: M. This was a woman who lived in France 32 00:02:05,360 --> 00:02:10,560 Speaker 1: in the early twentieth century who had this persistent idea. 33 00:02:10,680 --> 00:02:15,000 Speaker 1: She was fixated on the idea that her loved ones, 34 00:02:15,080 --> 00:02:18,880 Speaker 1: including her husband, family members, people she knew, had been 35 00:02:18,919 --> 00:02:24,240 Speaker 1: replaced by doubles or doppelgangers who looked exactly like them. 36 00:02:24,280 --> 00:02:27,359 Speaker 1: So she would say, my husband is not really my husband.
37 00:02:27,639 --> 00:02:30,720 Speaker 1: He's a man who looks exactly like my husband used 38 00:02:30,760 --> 00:02:33,120 Speaker 1: to, and I don't know what happened to my real husband. 39 00:02:33,360 --> 00:02:35,440 Speaker 1: And this wasn't her only symptom. She had a number 40 00:02:35,440 --> 00:02:37,359 Speaker 1: of symptoms. She believed that all kinds of things were 41 00:02:37,400 --> 00:02:39,760 Speaker 1: happening to her children. I mean, it's a 42 00:02:39,760 --> 00:02:44,240 Speaker 1: tragic story, but the underlying cause of 43 00:02:44,280 --> 00:02:46,919 Speaker 1: what would lead someone to believe that people around them 44 00:02:47,080 --> 00:02:51,440 Speaker 1: were being replaced by doppelgangers or doubles is 45 00:02:51,480 --> 00:02:54,720 Speaker 1: interesting to consider. And so the way Sapolsky in this 46 00:02:54,880 --> 00:02:58,840 Speaker 1: article characterizes the ultimate disconnect underlying Capgras syndrome 47 00:02:59,360 --> 00:03:02,400 Speaker 1: is that the module of the brain used in 48 00:03:02,560 --> 00:03:07,240 Speaker 1: recognition of faces, specifically involving the fusiform gyrus in the brain, 49 00:03:07,840 --> 00:03:12,040 Speaker 1: does cognitively recognize someone, but at the same time, the 50 00:03:12,080 --> 00:03:15,320 Speaker 1: different module of the brain that normally responds to this 51 00:03:15,440 --> 00:03:19,799 Speaker 1: recognition with the emotion that we call familiarity does not 52 00:03:20,040 --> 00:03:23,920 Speaker 1: kick in. And this brain function responsible for generating the 53 00:03:23,919 --> 00:03:28,560 Speaker 1: emotion of familiarity is what Sapolsky calls the extended face 54 00:03:28,760 --> 00:03:33,200 Speaker 1: processing system. It's, quote, a diffuse network including a variety 55 00:03:33,240 --> 00:03:38,080 Speaker 1: of cortical and limbic regions.
And apparently, when we recognize 56 00:03:38,160 --> 00:03:42,760 Speaker 1: someone but we don't feel the necessary familiarity emotion that 57 00:03:42,880 --> 00:03:46,560 Speaker 1: follows when we normally recognize somebody, what the brain often 58 00:03:46,720 --> 00:03:50,600 Speaker 1: does when faced with this contradiction is to conclude that 59 00:03:50,720 --> 00:03:54,320 Speaker 1: someone has been replaced by a double. It looks like them, 60 00:03:54,400 --> 00:03:58,240 Speaker 1: but this person doesn't feel familiar to me, thus they 61 00:03:58,320 --> 00:04:02,480 Speaker 1: must be a physically identical impostor. In the past, I 62 00:04:02,520 --> 00:04:05,720 Speaker 1: looked at a two thousand four paper from the Canadian 63 00:04:05,800 --> 00:04:09,560 Speaker 1: Journal of Psychiatry titled Capgras Syndrome: A Review of 64 00:04:09,600 --> 00:04:14,680 Speaker 1: the Neurophysiological Correlates and Presenting Clinical Features in Cases 65 00:04:14,720 --> 00:04:18,960 Speaker 1: Involving Physical Violence, and it points out that 66 00:04:19,000 --> 00:04:23,839 Speaker 1: the delusional identification syndrome generally involves right-brain anomalies linked 67 00:04:23,839 --> 00:04:26,760 Speaker 1: to a number of illnesses and neurological disorders, ranging from 68 00:04:27,120 --> 00:04:30,640 Speaker 1: schizoaffective disorder and Alzheimer's disease to severe head injuries, 69 00:04:30,800 --> 00:04:35,000 Speaker 1: pituitary tumors and migraines. Even alcoholism can play a role. 70 00:04:35,560 --> 00:04:38,040 Speaker 1: You know, basically, each of us has a visual 71 00:04:38,080 --> 00:04:40,240 Speaker 1: system and a limbic system, and the latter helps us to 72 00:04:40,279 --> 00:04:43,920 Speaker 1: generate and process emotions.
Damage or disrupt the communication between these 73 00:04:43,920 --> 00:04:46,479 Speaker 1: two systems, and suddenly a familiar face 74 00:04:46,560 --> 00:04:50,800 Speaker 1: can inspire suspicion instead of comfort. Now, fortunately, Capgras 75 00:04:50,800 --> 00:04:54,760 Speaker 1: syndrome usually subsides with the successful treatment of the underlying 76 00:04:54,800 --> 00:04:57,800 Speaker 1: medical condition. You know, the tumor goes away, and thankfully 77 00:04:57,839 --> 00:05:00,279 Speaker 1: so does this, you know, suspicion that 78 00:05:00,400 --> 00:05:02,960 Speaker 1: people are not what they seem to be. And 79 00:05:03,000 --> 00:05:06,760 Speaker 1: in some cases doctors can prescribe antipsychotic drugs to also 80 00:05:06,839 --> 00:05:10,440 Speaker 1: achieve the same effect. But you can easily see why 81 00:05:10,480 --> 00:05:14,080 Speaker 1: the idea of someone being replaced by a double or 82 00:05:14,120 --> 00:05:17,600 Speaker 1: a doppelganger would be such a captivating one. I mean, 83 00:05:17,640 --> 00:05:21,279 Speaker 1: it's something that feels very perverse. 84 00:05:21,360 --> 00:05:23,760 Speaker 1: You know, it plays on our great vulnerabilities. And I 85 00:05:23,800 --> 00:05:26,960 Speaker 1: think it is not a coincidence that this kind of 86 00:05:27,000 --> 00:05:30,400 Speaker 1: thing has featured in some of the horror folklore of 87 00:05:30,440 --> 00:05:32,000 Speaker 1: the world.
I mean, you think about the idea of 88 00:05:32,000 --> 00:05:35,520 Speaker 1: the changeling in fairy folklore, where there was 89 00:05:35,560 --> 00:05:37,600 Speaker 1: this idea that the fairy folk would come in 90 00:05:37,680 --> 00:05:40,480 Speaker 1: and replace someone you knew, often a child, but sometimes 91 00:05:40,520 --> 00:05:42,400 Speaker 1: like a husband or wife, or, you know, someone you 92 00:05:42,480 --> 00:05:45,719 Speaker 1: knew, with a fairy double who looked like them but 93 00:05:45,960 --> 00:05:48,240 Speaker 1: wasn't familiar to you, didn't act like them. And now 94 00:05:48,320 --> 00:05:52,440 Speaker 1: this is often described as something that people would use 95 00:05:52,520 --> 00:05:55,400 Speaker 1: to explain, you know, maybe when somebody's behavior changed and 96 00:05:55,400 --> 00:05:58,080 Speaker 1: they didn't seem themselves. They think, oh, maybe they've been 97 00:05:58,120 --> 00:06:01,360 Speaker 1: replaced with a changeling. Or used to explain why 98 00:06:01,360 --> 00:06:03,720 Speaker 1: people might feel that their children weren't their own, or 99 00:06:04,040 --> 00:06:06,520 Speaker 1: something like that. But then also you have to wonder 100 00:06:06,560 --> 00:06:10,160 Speaker 1: if some kinds of neurological issues may be at work here 101 00:06:10,200 --> 00:06:12,719 Speaker 1: in the minds of the people making the accusation that 102 00:06:12,839 --> 00:06:17,120 Speaker 1: someone is a fairy. Yeah. And this idea, 103 00:06:17,120 --> 00:06:19,560 Speaker 1: in and of itself, has played into 104 00:06:19,880 --> 00:06:21,800 Speaker 1: so many myths throughout history and also continues to just 105 00:06:22,160 --> 00:06:25,760 Speaker 1: resound in our popular media.
This is slightly 106 00:06:25,800 --> 00:06:28,880 Speaker 1: older work, of course, but Invasion of the Body Snatchers, 107 00:06:29,440 --> 00:06:31,760 Speaker 1: I mean, that plays heavily on this trope, right, that 108 00:06:31,839 --> 00:06:35,320 Speaker 1: people are being replaced by something else. People that we 109 00:06:35,400 --> 00:06:39,080 Speaker 1: think we know are not actually those individuals anymore. 110 00:06:39,480 --> 00:06:42,559 Speaker 1: It's been a huge one. Yet you often find it also 111 00:06:43,040 --> 00:06:46,760 Speaker 1: not only in speculative fiction, but in literary fiction as well. 112 00:06:47,200 --> 00:06:48,800 Speaker 1: The quote that I read at the top of this 113 00:06:48,880 --> 00:06:52,200 Speaker 1: episode is from a two thousand and four novel 114 00:06:52,560 --> 00:06:56,400 Speaker 1: titled Repetition by one of my favorite French authors, Alain 115 00:06:56,440 --> 00:06:59,000 Speaker 1: Robbe-Grillet. This is 116 00:06:59,000 --> 00:07:00,919 Speaker 1: a trope that he often worked into his books, like 117 00:07:00,960 --> 00:07:03,880 Speaker 1: the idea of a double or some sort of an 118 00:07:03,880 --> 00:07:07,080 Speaker 1: alter ego. And this book in particular 119 00:07:07,160 --> 00:07:10,240 Speaker 1: starts off with a character on a train having 120 00:07:10,280 --> 00:07:13,920 Speaker 1: glimpsed his double once more. Yeah, it's a very unsettling image. Yeah, 121 00:07:14,040 --> 00:07:18,000 Speaker 1: the plurality of self. Right. Well, and 122 00:07:18,040 --> 00:07:20,160 Speaker 1: there are two ways that it can be unsettling. There's 123 00:07:20,160 --> 00:07:24,000 Speaker 1: the idea that someone you know is replaced by a double.
Obviously, 124 00:07:24,040 --> 00:07:26,200 Speaker 1: if you were to come to believe that, you know, 125 00:07:26,280 --> 00:07:28,920 Speaker 1: whether you had like a brain injury or a neurological 126 00:07:28,920 --> 00:07:31,240 Speaker 1: condition that caused you to believe it, or, I don't know, 127 00:07:31,280 --> 00:07:33,520 Speaker 1: if you just believed in fairies and thought maybe that 128 00:07:33,560 --> 00:07:36,280 Speaker 1: this was happening because of your cultural conditioning. Either way, that 129 00:07:36,320 --> 00:07:38,960 Speaker 1: would be a terrifying thing. It's another thing entirely to 130 00:07:39,080 --> 00:07:42,200 Speaker 1: see it, to believe that you see another version of yourself, 131 00:07:42,240 --> 00:07:44,400 Speaker 1: you know, to think that you had your own double, 132 00:07:44,520 --> 00:07:47,040 Speaker 1: or there was a doppelganger of you. So I think 133 00:07:47,080 --> 00:07:49,440 Speaker 1: most of us are probably familiar with this, 134 00:07:49,440 --> 00:07:53,360 Speaker 1: the idea of a doppelganger. You know, 135 00:07:53,400 --> 00:07:56,680 Speaker 1: I would love to say that I learned about doppelgangers 136 00:07:56,720 --> 00:07:59,400 Speaker 1: for the first time by consulting a nice, you 137 00:07:59,440 --> 00:08:02,680 Speaker 1: know, book on Germanic mythology, and certainly I read a 138 00:08:02,720 --> 00:08:04,800 Speaker 1: lot of different mythology books when I was a kid. Also, 139 00:08:04,840 --> 00:08:06,840 Speaker 1: I would love to say that my first encounter with 140 00:08:06,920 --> 00:08:11,119 Speaker 1: doppelgangers was a Dungeons and Dragons Monster Manual, because 141 00:08:11,120 --> 00:08:14,360 Speaker 1: that's another huge place where they're highly visible, as they've 142 00:08:14,400 --> 00:08:17,280 Speaker 1: long been a staple of Dungeons and Dragons. So they're 143 00:08:17,280 --> 00:08:18,880 Speaker 1: in there.
Oh yeah, I mean, it's a great way 144 00:08:18,920 --> 00:08:22,920 Speaker 1: to introduce a little suspense and chaos into a campaign, 145 00:08:23,040 --> 00:08:26,520 Speaker 1: right? Somebody, you know, an NPC that the characters trust, 146 00:08:26,520 --> 00:08:29,640 Speaker 1: has been replaced, or a doppelganger tries to, or even 147 00:08:29,680 --> 00:08:32,960 Speaker 1: successfully replaces, a member of the party. So, you know, 148 00:08:32,960 --> 00:08:36,080 Speaker 1: there's a lot of fun to be had with a doppelganger. 149 00:08:36,120 --> 00:08:39,000 Speaker 1: But I have to admit that neither of these cases 150 00:08:39,080 --> 00:08:42,160 Speaker 1: is true. I heard about them initially in the nineties 151 00:08:42,520 --> 00:08:46,240 Speaker 1: via the Drew Barrymore movie that aired on the Sci 152 00:08:46,280 --> 00:08:48,560 Speaker 1: -Fi Channel. This is the old days, back 153 00:08:48,600 --> 00:08:51,480 Speaker 1: before there were Ys in Sci-Fi. Ys in Sci-Fi? 154 00:08:51,520 --> 00:08:55,319 Speaker 1: Oh, Syfy. You mean Syfy? Yeah, the Syfy Channel. Yes. 155 00:08:55,400 --> 00:08:57,680 Speaker 1: I remember next to nothing about this film, but it 156 00:08:57,760 --> 00:09:01,319 Speaker 1: was heavily promoted on the channel, and it introduced the 157 00:09:01,440 --> 00:09:04,320 Speaker 1: idea to me initially, and then I, you know, followed 158 00:09:04,400 --> 00:09:07,400 Speaker 1: up by, you know, asking around, hey, Dad, what's a doppelganger? 159 00:09:07,440 --> 00:09:10,040 Speaker 1: And then I looked it up, etcetera. Well, wait a minute, 160 00:09:10,120 --> 00:09:12,840 Speaker 1: so it was called Doppelganger, the name of the movie? 161 00:09:12,880 --> 00:09:15,319 Speaker 1: At least that was the title of the film 162 00:09:15,320 --> 00:09:17,400 Speaker 1: as it was promoted on the Sci-Fi Channel at the time.
163 00:09:17,840 --> 00:09:20,280 Speaker 1: Of course, as is often the case with films of this caliber, 164 00:09:20,320 --> 00:09:22,120 Speaker 1: they may have had multiple titles, and, who knows, they 165 00:09:22,160 --> 00:09:25,040 Speaker 1: may have been promoted elsewhere under a different title. I 166 00:09:25,080 --> 00:09:27,680 Speaker 1: just looked it up. It's also known as Doppelganger colon 167 00:09:27,800 --> 00:09:30,160 Speaker 1: The Evil Within. Just to be clear, that was for 168 00:09:30,200 --> 00:09:31,880 Speaker 1: the people who didn't know what a doppelganger was. 169 00:09:33,320 --> 00:09:36,679 Speaker 1: Always got to have a colon in a real title. But here, 170 00:09:36,679 --> 00:09:39,040 Speaker 1: here's an interesting thing that I didn't realize until I 171 00:09:39,080 --> 00:09:43,199 Speaker 1: was researching this episode. I just kind of assumed, you know, obviously, 172 00:09:43,200 --> 00:09:47,679 Speaker 1: the term doppelganger itself is of Germanic origin, and I 173 00:09:47,679 --> 00:09:51,080 Speaker 1: figured this is a creature that emerges from German folk traditions, 174 00:09:51,120 --> 00:09:53,000 Speaker 1: you know, in the same way 175 00:09:53,040 --> 00:09:56,079 Speaker 1: that Krampus came down from the mountains 176 00:09:56,320 --> 00:09:58,880 Speaker 1: in Alpine traditions. I just figured the doppelganger was just 177 00:09:58,920 --> 00:10:03,200 Speaker 1: a standard, because, again, the idea of a mysterious double, 178 00:10:03,480 --> 00:10:06,880 Speaker 1: either of self or other, is long established. But this 179 00:10:06,960 --> 00:10:09,920 Speaker 1: does not seem to be the case.
Apparently, the word 180 00:10:09,960 --> 00:10:13,640 Speaker 1: doppelganger wasn't coined until the eighteenth century, and it 181 00:10:13,679 --> 00:10:17,040 Speaker 1: was coined by the German novelist Jean Paul in his seventeen 182 00:10:17,120 --> 00:10:23,000 Speaker 1: ninety-six novel Siebenkäs, in which the main character encounters 183 00:10:23,080 --> 00:10:27,000 Speaker 1: his own doppelganger, or "double goer." In 184 00:10:27,000 --> 00:10:29,160 Speaker 1: this case, the doppelganger convinces him to 185 00:10:29,160 --> 00:10:32,600 Speaker 1: fake his own death and start a new life. 186 00:10:32,640 --> 00:10:34,640 Speaker 1: And I had to look closer 187 00:10:34,679 --> 00:10:37,880 Speaker 1: at this. It's not as straightforward as I would 188 00:10:37,880 --> 00:10:40,160 Speaker 1: like it to be, where he's just like, hey, this 189 00:10:40,200 --> 00:10:44,400 Speaker 1: is the doppelganger. Apparently he invented two similar words in 190 00:10:44,440 --> 00:10:48,720 Speaker 1: this book. He invented the word Doppeltgänger, so 191 00:10:48,800 --> 00:10:51,719 Speaker 1: this would be the name for people who see themselves. 192 00:10:52,640 --> 00:10:55,920 Speaker 1: But then he also talks about Doppelgänger 193 00:10:55,920 --> 00:10:58,120 Speaker 1: as a word for the second course, when the second 194 00:10:58,160 --> 00:11:01,160 Speaker 1: course of a meal arrives alongside the first course, 195 00:11:01,679 --> 00:11:05,920 Speaker 1: because Gänger means both, you know, goer or walker, 196 00:11:05,960 --> 00:11:11,000 Speaker 1: as well as course in a meal. So technically Doppelt 197 00:11:11,040 --> 00:11:15,520 Speaker 1: gänger would be the mysterious double idea that he introduces, 198 00:11:15,559 --> 00:11:19,240 Speaker 1: and Doppelgänger itself is just a weird mishap of ordering 199 00:11:19,240 --> 00:11:22,880 Speaker 1: a multi-course meal at a restaurant.
But nobody's gonna 200 00:11:22,880 --> 00:11:27,679 Speaker 1: say Doppeltgänger, not anymore. No Doppeltgänger. But this 201 00:11:27,720 --> 00:11:29,839 Speaker 1: is a good idea: next time someone introduces 202 00:11:29,840 --> 00:11:32,520 Speaker 1: a doppelganger in your D&D campaign, remind the 203 00:11:32,640 --> 00:11:36,760 Speaker 1: DM that that's a culinary term, sort of. 204 00:11:36,840 --> 00:11:40,640 Speaker 1: But anyway, the term ends up resonating in German literature, and 205 00:11:40,679 --> 00:11:44,320 Speaker 1: it became popular in Romantic horror literature in general by 206 00:11:44,360 --> 00:11:48,160 Speaker 1: the mid-eighteen hundreds. So I think originally 207 00:11:48,480 --> 00:11:53,920 Speaker 1: this idea was always something scary or dangerous, right? Well, 208 00:11:53,920 --> 00:11:56,360 Speaker 1: not as much, seemingly, in the original, and 209 00:11:56,400 --> 00:11:59,760 Speaker 1: I didn't read the original German novel, I should stress, 210 00:11:59,760 --> 00:12:01,439 Speaker 1: so, you know, feel free to correct me if anyone 211 00:12:01,440 --> 00:12:03,640 Speaker 1: out there is more familiar with the literature 212 00:12:04,000 --> 00:12:07,520 Speaker 1: we're talking about here. But it certainly took on sinister 213 00:12:07,640 --> 00:12:12,040 Speaker 1: connotations within the literary tradition. But then I was reading 214 00:12:12,080 --> 00:12:15,000 Speaker 1: about the term on Webster's, and the sinister connotations have 215 00:12:15,040 --> 00:12:19,480 Speaker 1: apparently dropped off somewhat in its English-language usage, which 216 00:12:19,520 --> 00:12:21,640 Speaker 1: is surprising to me.
But then again, I'm coming from 217 00:12:21,679 --> 00:12:24,640 Speaker 1: the standpoint of knowing them mostly through Dungeons and Dragons 218 00:12:24,640 --> 00:12:27,800 Speaker 1: and horrible Drew Barrymore movies, so I'm probably not 219 00:12:27,840 --> 00:12:33,080 Speaker 1: the key candidate here. I guess the other thing, too, 220 00:12:33,120 --> 00:12:35,640 Speaker 1: is I really don't use the term outside of a 221 00:12:35,640 --> 00:12:40,000 Speaker 1: fantasy context. Like, if I encounter someone who looks a 222 00:12:40,080 --> 00:12:44,559 Speaker 1: lot like someone I know, I don't say, oh, hey, 223 00:12:44,559 --> 00:12:47,240 Speaker 1: I saw your doppelganger today. I'm more like, hey, I 224 00:12:47,240 --> 00:12:49,680 Speaker 1: saw your... maybe I'll say evil twin, which is, you know, 225 00:12:50,160 --> 00:12:53,199 Speaker 1: another variation on this trope. Or I'll say, oh, I 226 00:12:53,440 --> 00:12:55,200 Speaker 1: just saw someone who looked just about like you. 227 00:12:55,320 --> 00:12:57,240 Speaker 1: Or if I'm in another city, I might say, oh, 228 00:12:57,280 --> 00:13:00,880 Speaker 1: I saw your Chicago you, or whatever, you know. So 229 00:13:01,400 --> 00:13:03,920 Speaker 1: that's the other thing: I just don't use doppelganger outside 230 00:13:03,920 --> 00:13:06,920 Speaker 1: of fantastic settings myself. I think most people just use 231 00:13:06,960 --> 00:13:09,720 Speaker 1: it to mean a look-alike now. Yeah, but 232 00:13:10,040 --> 00:13:11,920 Speaker 1: I guess I don't even use it that way. Like, 233 00:13:12,120 --> 00:13:14,600 Speaker 1: for me, if I think of a doppelganger, I 234 00:13:14,640 --> 00:13:17,960 Speaker 1: think of something like that creature in Krull which pretends 235 00:13:17,960 --> 00:13:19,679 Speaker 1: to be the Wizard, you know.
I think of something that, 236 00:13:19,760 --> 00:13:23,400 Speaker 1: when you reveal it, is a horrible, pallid creature with 237 00:13:23,520 --> 00:13:26,720 Speaker 1: jet-black eyes. So if I'm not specifically talking about, 238 00:13:26,760 --> 00:13:30,240 Speaker 1: like, a monstrous scenario, I'm not gonna use doppelganger. Okay, 239 00:13:30,280 --> 00:13:33,040 Speaker 1: that's just me. I also think 240 00:13:33,280 --> 00:13:35,319 Speaker 1: part of this whole idea of the sinister 241 00:13:35,360 --> 00:13:37,840 Speaker 1: connotations fading away might have to do with the 242 00:13:37,840 --> 00:13:40,360 Speaker 1: fact that it is used, by and large, for 243 00:13:40,520 --> 00:13:42,840 Speaker 1: just somebody's double. Like, if someone is to say, hey, 244 00:13:42,880 --> 00:13:46,800 Speaker 1: I saw your doppelganger today at Shoney's, 245 00:13:47,360 --> 00:13:49,440 Speaker 1: you know, there's not gonna be a creepy connotation 246 00:13:49,480 --> 00:13:52,720 Speaker 1: to that sighting. We're not gonna say, oh my god, 247 00:13:52,760 --> 00:13:54,920 Speaker 1: I saw your doppelganger at Shoney's and I was super 248 00:13:54,960 --> 00:13:58,160 Speaker 1: creeped out; I think we need to call somebody. No, 249 00:13:58,360 --> 00:13:59,920 Speaker 1: it's just gonna be a point 250 00:14:00,080 --> 00:14:04,199 Speaker 1: of whimsy. And the other thing is that, more than likely, 251 00:14:05,240 --> 00:14:07,959 Speaker 1: it was a first-glance situation. Like, at first glance, 252 00:14:08,080 --> 00:14:10,240 Speaker 1: I thought it was you. At second glance, I saw 253 00:14:10,280 --> 00:14:13,000 Speaker 1: that it was clearly another person and nothing to freak 254 00:14:13,000 --> 00:14:16,160 Speaker 1: out about.
Well, I would be shocked, though, if people 255 00:14:16,200 --> 00:14:18,800 Speaker 1: didn't still interpret this kind of thing as some kind 256 00:14:18,800 --> 00:14:22,080 Speaker 1: of weird omen or demon or whatever. Oh yeah. And 257 00:14:22,240 --> 00:14:23,920 Speaker 1: I was glancing around on the internet, and there's still 258 00:14:23,960 --> 00:14:26,760 Speaker 1: plenty of that. And I think a large part 259 00:14:26,760 --> 00:14:30,760 Speaker 1: of that is, you know, as with all paranormal 260 00:14:31,200 --> 00:14:39,080 Speaker 1: experiences or supernatural explanations for mundane encounters, the supernatural 261 00:14:39,080 --> 00:14:41,880 Speaker 1: explanation is going to be more appealing. You know, 262 00:14:41,920 --> 00:14:44,760 Speaker 1: it makes us feel more important. Like, you want to 263 00:14:44,800 --> 00:14:47,360 Speaker 1: feel like you're in an Alain Robbe-Grillet novel and 264 00:14:47,360 --> 00:14:50,040 Speaker 1: you saw your mysterious double, and it, you know, reveals 265 00:14:50,080 --> 00:14:53,240 Speaker 1: something about, you know, your inner subconscious nature 266 00:14:53,320 --> 00:14:55,840 Speaker 1: or something, or that you saw a ghost that 267 00:14:55,920 --> 00:14:57,560 Speaker 1: looked like you. I mean, all of these are far more 268 00:14:57,600 --> 00:14:59,800 Speaker 1: interesting than, yeah, you know, there are a whole 269 00:14:59,840 --> 00:15:01,320 Speaker 1: bunch of people in the world, and it was bound 270 00:15:01,320 --> 00:15:03,440 Speaker 1: to happen sooner or later, but I saw somebody that 271 00:15:03,520 --> 00:15:05,680 Speaker 1: kind of looked like me and had some more facial hair. 272 00:15:05,800 --> 00:15:08,400 Speaker 1: The way that you look isn't all that unique. That's 273 00:15:08,440 --> 00:15:10,360 Speaker 1: like the worst news of all.
Yeah, there's just 274 00:15:10,520 --> 00:15:14,240 Speaker 1: nothing exciting about that story. You don't rush 275 00:15:14,240 --> 00:15:17,000 Speaker 1: home to tell that to your significant other. But it 276 00:15:17,000 --> 00:15:18,960 Speaker 1: does bring up the question: what are your chances of 277 00:15:19,080 --> 00:15:22,320 Speaker 1: running into your own unrelated double, or, for that matter, 278 00:15:22,440 --> 00:15:25,560 Speaker 1: running into an unrelated double of someone you know well? 279 00:15:25,600 --> 00:15:30,280 Speaker 1: According to the anatomist Dr. Teghan Lucas, quoted in the BBC 280 00:15:30,400 --> 00:15:33,520 Speaker 1: Future article "You are surprisingly likely to have a doppelganger," 281 00:15:34,000 --> 00:15:37,320 Speaker 1: which I think is a slightly confusing title given the contents 282 00:15:37,320 --> 00:15:40,200 Speaker 1: of the article, but still, she said that the chances 283 00:15:40,280 --> 00:15:43,640 Speaker 1: of sharing just eight dimensions with someone else are less 284 00:15:43,680 --> 00:15:46,440 Speaker 1: than one in a trillion. And with seven point 285 00:15:46,440 --> 00:15:48,640 Speaker 1: four billion people on the planet, there 286 00:15:48,680 --> 00:15:51,440 Speaker 1: was only a one in one thirty-five chance that there's 287 00:15:51,480 --> 00:15:53,960 Speaker 1: a single pair of true doppelgangers. No, wait, what 288 00:15:54,000 --> 00:15:57,080 Speaker 1: are these dimensions you're talking about? Like, eight facial dimensions? 289 00:15:57,080 --> 00:15:59,040 Speaker 1: Like, if you take facial features and 290 00:15:59,040 --> 00:16:00,640 Speaker 1: you divide them up into eight dimensions and go to 291 00:16:00,720 --> 00:16:05,320 Speaker 1: match those up. So yeah, not like spatial dimensions. 292 00:16:05,400 --> 00:16:08,160 Speaker 1: I'm not sure how that would work.
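[Editor's aside: the arithmetic behind those quoted odds can be checked with a quick back-of-the-envelope sketch. This model is not spelled out in the episode or in the BBC Future article; it simply assumes the less-than-one-in-a-trillion figure acts as a per-person match probability, and asks how likely it is that at least one true pair turns up among seven point four billion people.]

```python
# Hypothetical back-of-the-envelope check of the quoted doppelganger odds.
# Assumption (not stated in the article): the one-in-a-trillion chance of
# matching someone on all eight facial dimensions is applied once per
# person on the planet.

p_match = 1e-12      # chance of matching on all eight dimensions
population = 7.4e9   # people on the planet at the time of the article

# Probability that at least one true doppelganger pair exists:
# the complement of every single comparison failing.
p_any_pair = 1 - (1 - p_match) ** population

print(f"{p_any_pair:.4f}")           # ≈ 0.0074
print(f"1 in {1 / p_any_pair:.1f}")  # roughly the article's 1 in 135
```

[Under that reading the numbers line up: seven point four billion chances at one-in-a-trillion odds works out to about a 0.7 percent, or roughly one-in-135, chance of a single true pair existing.]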
Basically, the eight 293 00:16:08,280 --> 00:16:13,400 Speaker 1: sliders on your character creator. Right. Now, most of the time, 294 00:16:13,400 --> 00:16:16,840 Speaker 1: though, again, we're not talking about exact doubles. You know, generally, 295 00:16:16,880 --> 00:16:19,160 Speaker 1: these are just faces that are similar to our own 296 00:16:19,240 --> 00:16:21,640 Speaker 1: or similar to someone we know. We focus on 297 00:16:21,680 --> 00:16:24,080 Speaker 1: the familiarity in a way that may be tied to 298 00:16:24,160 --> 00:16:27,400 Speaker 1: a means of identifying close kin, you know, in 299 00:16:27,680 --> 00:16:31,120 Speaker 1: early human history; like, that's what this recognition system is, 300 00:16:31,160 --> 00:16:34,800 Speaker 1: perhaps, for. And, you know, think again about how, 301 00:16:34,880 --> 00:16:37,480 Speaker 1: you know, generally, doubles are kind of a 302 00:16:37,480 --> 00:16:40,840 Speaker 1: first-glance thing. The similarities may be jarring, but the 303 00:16:40,880 --> 00:16:45,280 Speaker 1: differences will be pronounced as well. Now, the thing is, 304 00:16:45,320 --> 00:16:47,960 Speaker 1: there are so many humans on the planet now, and 305 00:16:48,080 --> 00:16:51,240 Speaker 1: we live in, you know, closer confines.
In many situations, 306 00:16:52,000 --> 00:16:54,720 Speaker 1: seeing familiar features doesn't necessarily mean that there's any 307 00:16:54,800 --> 00:16:59,360 Speaker 1: shared genetic heritage between two given individuals, you know, except 308 00:16:59,440 --> 00:17:01,400 Speaker 1: in the sense that all humans share most in the 309 00:17:01,400 --> 00:17:04,280 Speaker 1: grander... Yeah, in the grander scheme, yes. But yeah, if 310 00:17:04,280 --> 00:17:06,000 Speaker 1: you're in another city and you see someone 311 00:17:06,040 --> 00:17:07,760 Speaker 1: who looks kind of like you, or looks kind of 312 00:17:07,800 --> 00:17:10,560 Speaker 1: like a friend, it doesn't mean they're your long-lost 313 00:17:10,600 --> 00:17:13,399 Speaker 1: cousin, or a long-lost cousin of your friend. 314 00:17:14,880 --> 00:17:17,199 Speaker 1: But it's a situation where we kind of broke the 315 00:17:17,200 --> 00:17:20,760 Speaker 1: system through population growth and the birth of cities. 316 00:17:20,760 --> 00:17:24,840 Speaker 1: And facial recognition abilities are also 317 00:17:24,880 --> 00:17:29,480 Speaker 1: going to vary from person to person, so your doppelganger 318 00:17:29,640 --> 00:17:32,480 Speaker 1: alarms just may not be as easy to set off 319 00:17:32,520 --> 00:17:36,240 Speaker 1: as someone else's. So anyway, that's doppelgangers in 320 00:17:36,280 --> 00:17:38,960 Speaker 1: a nutshell: both the origin of the term, but then 321 00:17:39,359 --> 00:17:42,240 Speaker 1: a little bit about the science and 322 00:17:42,320 --> 00:17:46,560 Speaker 1: the potential of seeing a double or near-double 323 00:17:46,640 --> 00:17:49,679 Speaker 1: somewhere in the world.
But thinking about what is at work with the erroneous detection of doubles in Capgras syndrome, uh, is, I guess, maybe what we should get back to when we come back after a break. Alright, we're back, and it's really us. We weren't replaced by strange creatures from the Monster Manual over the course of the advertisement. No, we're here, it's really us, and we're going to continue our exploration. You know, I wanted to answer that with the body snatchers noise, but I don't know if I can make it, exactly, from the Donald Sutherland version. Yeah, which is a great version, by the way. I still haven't seen that. I've only seen the old black-and-white original. Oh, the Donald Sutherland one is great. He's got Lambert from Alien, it's got Jeff Goldblum, he's feisty. And it's got, from another sci-fi classic, it's got, what's his name, who played Spock? Leonard Nimoy. Yeah, Leonard Nimoy is fantastic in it. I think it's his great performance. Well, that's a great cast. But the nineteen fifty-six original had Kevin McCarthy in the lead role. It was terrific.
You also had Carolyn Jones, who had played Morticia on The Addams Family. Cool. Yeah, but also it was just black and white, and, at least the version I saw of it, like, the darkness felt just so murky and, uh, dirty somehow. Like, it was just a very nightmare-inducing film when I saw it as a kid. Yeah, the paranoid visual vibe. It's got kind of a communist-infiltration thing. Oh, definitely, definitely. That's a very strong element of it, which just goes to show, like, why this concept of doubles resonates so, because you can apply it to all these other scenarios, social and political. Well, yeah, I mean, it's a common thing for people to say when they don't literally think that someone they know has been physically, bodily replaced by a supernatural double. They might often think, I don't know this person anymore. I mean, it's a similar, like, you know, they've been replaced with somebody, somebody replaced you with a different person. Yeah, like it just...
Really, you're just getting to know them better. You found out something about them you didn't know before, and now you think that they're, like, a different being entirely. And it's just because, yeah, it turns out that they were maybe communists, or, like, a different football team. Well, to be fair, also, it could be a case of, um, you know, people over-emphasizing dispositional traits, thinking that they should expect their loved ones to be incredibly consistent and predictable, when in fact people are inconsistent, and it depends on the circumstances how they behave. Maybe sometimes you are used to seeing someone only in one type of context, maybe you're used to only seeing them at work, and then when you see them in a different context, when you see them, you know, out with their friends or with their family, they seem like a totally different person to you. It can be jarring when you see those differences, and yet they're there for almost all of us. Almost none of us, like, really behave the same way in all contexts.
Well, let's talk about those contexts, especially those social contexts. Yeah, so I want to come back... So we've talked about doppelgangers a bit, and the idea of doubles and familiarity and recognition. But I want to come back to, uh, that article I mentioned at the beginning, where Robert Sapolsky makes this comparison between what Capgras syndrome makes clear about the brain basis of familiarity and the ways that technology is changing our social relationships. So, in Sapolsky's words, Capgras syndrome makes clear the brain basis for, quote, the differences between the thoughts that give rise to recognition... Remember, recognition is cognitive: you see somebody and you cognitively know who they are... and the feelings that give rise to familiarity. That's the emotion that says, yes, I know this person. They're different things. And Sapolsky's main point is, quote, these functional fault lines in the social brain, when coupled with advances in the online world, have given rise to the contemporary Facebook generation.
They have made Capgras syndrome a window on our culture and minds today, where nothing is quite recognizable but everything seems familiar. And I would actually go further than that and say, I think that's an interesting point, but the inverse is true as well: that the online world creates these situations where you have familiarity without recognition and recognition without familiarity. So, to further explore the point he makes a little bit: he points out that, you know, essentially, for all of our evolutionary history, our only social relationships have been face-to-face ones. And I'm struggling to think of a counter-example. I can't really think of a counter-example for relationships with real people. But for tens of thousands of years, of course, we have had language, and we could have felt as if we had relationships with people we only heard about in stories, for example. Now, obviously, we do eventually reach the point where we have the ability to engage in activities like having a pen pal, and maybe, you know, that's a case where you can certainly have a non-face-to-face example.
But prior to, uh, you know, the advent of the necessary, um, you know, systems and technology... Yeah, I struggle to think of an example as well. I mean, even sort of semi-imagined situations, such as speaking to the spirit of a dead ancestor or dead relative, like, you're still depending upon a previous face-to-face relationship. Yes. And even with pen pals, I mean, even the oldest versions of this, the non-digital communications, just writing to people with letters, even if you've never met them before, that is evolutionarily recent. I mean, the vast majority of the time our species has been around, we didn't have writing. We couldn't do that. The only relationships we had were face-to-face relationships. And so it's entirely clear that our bodies and our brains have been shaped by an evolutionary niche in which all relationships were face-to-face ones. Right, even our history as a symbolic, uh, species is mostly based on, almost exclusively based on, face-to-face communication.
Yeah, and so when our only social relationships were face-to-face relationships, it was natural for facial recognition and familiarity, at an in-person, body-sensing level, to be one of our main mediators of how we conceptualized, evaluated, and formed beliefs about our relationships. I mean, if you live in this non-technological world where your only relationships are face-to-face, it totally makes sense for you to use moment-to-moment, face-to-face, say, visual and touch data and things like that to get the best idea of what your relationships are and how you should feel about them. Right. I mean, some of that goes back to what we were discussing earlier about, uh, you know, kin identification, being able to tell, like, this is a relative, I can see it in their face. Exactly. But of course, there have been these technological changes that now allow relationships to exist and persist under circumstances other than face-to-face interaction. Of course, we already mentioned writing and literacy.
Now, this allows you to maybe send letters, though I'd say, even for most of the time that's been around, that has been something that is limited to a small percent of humans, you know, because for most of human history, most people have not been literate. That's true. And then, of course, again, I feel like the pen pal situation, in which there is never a face-to-face meeting, like, that's a slim slice of the overall pie. Most of the other, um, written communications are going to be carried out with individuals, um, in which there was at least a previous face-to-face communication. Yeah, but then think about how hard this kind of thing can make relationships. I bet every single person listening has had the experience of relationship strife caused by a misunderstanding or some kind of feeling of emotional estrangement brought on by the media through which you communicate. A lot of us don't feel very comfortable talking to people on the phone.
A lot of us, you know, we have the experience of sending emails and being misunderstood, having people not read your tone correctly, or getting worried about the way somebody punctuated a sentence in an email. I mean, I bet you've had this experience. Oh yeah, absolutely. And I think we all have, both in personal contexts and work contexts. You know, um, I guess, you know, hopefully, if you're dealing a lot via email with someone, you'll kind of get a feel for their tone and how they tend to speak. But even then, there's so much room for miscommunication, like, even when you feel like you're really, uh, you know, up to speed on how they present themselves in a textual manner. Robert, if you don't mind me saying, you're kind of a terse emailer. Am I? I can see people getting worried when they get an email from you, that maybe you're mad at them or something. I don't think that's necessarily always the case.
Maybe sometimes you're mad at me. But I mean, I think you just tend to not spend a whole lot of time, you know, worrying about how to phrase stuff on email. You just kind of bang it out, which I admire, because, you know, the amount of time that people waste trying to phrase stuff on email is, it's a horror. The thing is, I remember when I was younger, I would have these long email correspondences going on with friends, where we would respond, like, sometimes sentence by sentence, or at least paragraph by paragraph, where we'd respond to specific points and write at length in response. And at some point this just faded away. I haven't really thought about it too much, to try and figure out exactly, like, at what point, like, which technological or communications change altered that, and/or what life changes led to that occurring. But at the same time, you know, I used to have, you know, long phone conversations with people, and now it's really, it's extremely rare for me to have a long phone conversation.
There's basically, like, two people in the world that I have phone conversations with on a regular basis, and one's my wife and one's my mother, and that's pretty much it. Do you think maybe these changes have been brought on by other technological changes, like the rise of social media? I suspect they have, yeah. Like, instead of having this longer, more thoughtful stream of communication with somebody that, you know, now lives in another city, you just have a continual trickle, you know. So, again, we just have that familiarity, like a trickle of familiarity going on, instead of, like, an actual stream of communication. Well, and it also, I think that the way that technology has changed our communication sometimes forces us to become a version of ourselves that we don't recognize. I mean, I was talking about how we write work emails. I actually don't love the way that I write work emails. I feel like often I have to... I overuse, like, exclamation points and smiley faces and all that.
And it's mainly just because I don't ever want to accidentally make somebody feel bad over email, or make them get the wrong idea that I'm mad at them or something like that. Clarifying the emotional intention, yeah, of the statement. I hate it, because I can feel myself feeling insipid and feeling not like myself as I type it. But I would rather feel like that than worry that I'm giving people the wrong idea, or letting them think I'm mad at them or something like that, you know. Yeah, I mean, I totally understand it. Yeah, sometimes you feel like you have to really make it clear. And I do find myself doing more and more of that with texts, when I'm sending a text, you know, via my tiny pocket computer. Oh, your pocket god? Yes, pocket god. Yes. Yeah. And so we've got, obviously, you know, all the stuff we're talking about: email, phone, uh, text messages, and internet communications. The photograph, in a way, is kind of a modern communication method. Sort of.
Yeah, as it's become, you know, increasingly easy to take digital photographs and send them to other people, it becomes a form of communication. As does... you know, you mentioned emoticons as being, like, a way of tweaking textual content, but in many cases they're the prime, uh, language that is used in communicating. To say nothing of memes. Oh, I shudder at this thought. Memes... there's gonna be a day in which the English language is replaced by memes. It's just, like, instead of an alphabet you have a meme-abet, and you just, like, paste the memes together to form ideas.
Yeah, I mean, I already feel, um, you know, and maybe this is just me feeling old, but, um, I feel we've already reached the point where there'll be a thread about something on, say, Reddit, and there'll be a meme, and I have to research what the meme means. Like, it's a new meme, and I have to figure out, like, where it came from, how it's used, and how it's potentially being misused, and how it's, like, evolving out of that misuse, to understand, like, what the prevailing idea is that is being expressed. Memes as a whole are exactly like words, in the sense that you can try to write down a definition for a word, but word usage changes over time. I mean, words don't actually have fixed definitions. You can't control how people use them. Yeah, kind of like the whole "literally" thing, right? Yeah, um, it's like, yeah, sorry, you lost that battle. That word's changed. You can cling to the past, but sorry, it was just misused into a new usage. So I try not to correct people on that one, but it still gets me.
My blood was literally boiling, and it literally took his head off. Yes. But yeah, so we're talking about the, you know, the technological media on which our relationships happen. And I think many of our relationships, especially in the last, you know, ten years, now happen primarily on these media. And on one hand, that can be a good thing, because it allows us to maintain relationships with people who we want to have relationships with but can't, you know, people we can't practically arrange to see in person as often as we'd like to. Several of my best friends live in different cities, and we've been friends for years, and I'm only able to maintain friendships with them because of this technology. So I would hate to lose those friendships. But also, I wonder about the fact... what is it doing to our culture when there's a substantial number of people who, like, I don't know, maybe seventy percent of their friendly social interactions happen over a machine?
Yeah, I mean, even people, like, flesh-and-blood friends that I have in the city with me, like, we still have to do, like, a, you know, a thirty-email chain to plan to meet each other in real life. Like, even if it's, like, a semi-regular thing, like, we know where we're gonna go, we know when we're going to do it, but we still have to coordinate all of these things. So how much of the relationship is truly face-to-face versus digital? Yeah. And so Sapolsky says in this article that this technological reality has conditioned us in a way to dissociate our traditional pathways of recognition and familiarity. Uh, so he writes, quote: Thus, not only has modern life increasingly dissociated recognition and familiarity, but it has impoverished the latter in the process. So, impoverished familiarity. This is worsened by our frantic skill at multitasking, especially social multitasking. A recent Pew study reported that eighty-nine percent of cell phone owners used their phones during their most recent social gathering. That sounds low to me.
Um, we reduce our social connections to mere threads so that we can maintain as many of them as possible. This leaves us with signposts of familiarity that are frail remnants of the real thing. And I think he's really onto something there, about the idea of, um, maintaining... it's almost like putting up scarecrows of things, like these technological stand-ins for relationships that are not really functioning biologically and psychologically for us the way relationships should. But we'd rather maintain as many of those as we can, rather than have fewer relationships but more face-to-face interaction, you know, quality time and all that. Yeah. So we end up maintaining these trickles of actual social connection, as opposed to streams of social connection. So he's saying there that we essentially degrade our sensitivity to the familiarity aspect of what knowing somebody is. A social interaction, it's recognition and familiarity, and when we degrade the familiarity thing, he says, quote, we become increasingly vulnerable to impostors.
Our social media lives are rife with simulations, and simulations of simulations, of reality. And so, of course, you know, one example there is people who claim to know you, but they're not... they're, uh, you know, a friend's email account gets hacked, some hacker contacts you and tries to get you to open some malware. That's one example. But there's a million versions of this thing, where our sort of, like, low-resolution familiarity detectors in this digital world are being exploited by people who are not actually our real friends. So, basically, our online version of ourselves is essentially a low-resolution simulation, and so if someone comes along to hijack that simulation, it's all the easier to do. You don't have to be a high-level magic user to take on the likeness of another individual when the threshold for duplication is so low. Yeah. But then here's the turn. So Sapolsky says, by any logic, quote, this should induce us all to have Capgras delusions, to find it plausible that everyone we encounter is an impostor.
After all, how can one's faith in the veracity of people not be shaken when you sent all that money to the guy who claimed he was from the IRS? And I think there is something going on here. It didn't start with this, but this impostor kind of thing — the doppelganger effect of the online world, and the fact that it's easy to be tricked by an online doppelganger — does help contribute, I think, to this concept. I'm sure you've encountered this, Robert: the idea that the Internet is not real life. People always say this, right? It's like, I talk about somebody being a friend "in real life," in real life versus on the Internet. But if most of your social interactions are happening on the Internet, in what sense is that not real life? I mean, of course the Internet is real life. It's like a technology; the stuff you're doing on it is actually happening. It's not like something that didn't happen.
But you are making a distinction there. People in some way are seeing these interactions as derealized, or as not material in the way that other interactions are. And yet that's where we're doing most — increasingly all — of our stuff. Yeah, and I wonder how this plays out generation to generation, because I feel like, for me, maybe I had a sense of the Internet as not being real life more so early on, because the Internet was in some respects really kind of an escape. I mean, at the same time, yeah, I remember having, you know, like a college email address and all that kind of stuff, so you're still doing real-life stuff via the Internet. But then a lot of other stuff is about escaping, either just in general, like escaping into fantasy, or escaping geographical boundaries, you know, being able to connect with people in other cities.
Well, I think there are multiple ways in which people came to see the Internet as not real life, and one of them is anonymity. You know, if you could go around invisible all day — what's that Harry Potter cloak that makes you invisible? Oh, what was it called? I mean, it's a cloak of invisibility, but I don't remember if it had any particular name. Well, whatever that is, you could be invisible in a way that would feel not real, right? Because if nobody can see you, and nobody knows who you are wherever you are, then there are no consequences, and consequences are kind of what give us the feeling of reality. So that's part of it.
But I think Sapolsky is also onto something here, in that this estrangement of the sense of recognition and familiarity makes the Internet start to feel like this world of social delusion, this sort of always-Capgras-vulnerable landscape where nothing is really real and you can't trust anything. And yet at the same time we're constantly forced to put our trust in it as a matter of fact, because that's where we're doing everything. But then, of course, back to the idea of all these threads that people maintain and sort of mistake for meaningful relationships online — he comes back around on that and says, actually, it seems more like the opposite has happened. Rather than inducing us all to have Capgras delusions, where we see people we know and think of them as doppelgangers, as not familiar, instead we go the other way: we see people we don't really know very well, but we just attach this feeling of familiarity to them. It allows all of this false familiarity.
And this really comes up in — I don't know, have you read about the idea of parasocial interactions on social media? You know, I don't think prior to this episode I knew it by that term, but of course you see it all the time. Yeah, it's just ubiquitous on the Internet. It's an asymmetrical relationship — the way, like, you follow a public figure who doesn't know who you are. But there are all of these indications that many people think of these parasocial, asymmetrical relationships as relationships. It's like they almost view this Instagram influencer that they follow as an acquaintance, somebody they know, but of course that person doesn't know them. Yeah. I really started thinking about this classification, though — parasocial relationships — and wondering to what extent it could have existed in previous times. Like, what is the earliest possible example of a parasocial relationship?
Like, maybe it could be a situation where you have a leader in a given community, and then you have a very low-level person in that community, where, you know, the tribal leader just has no real idea of who they are, but of course they know who the leader is. I mean, I guess that's sort of the in-real-life version of this. But it seems like we see far more of it in modern civilization — certainly in an Internet age, but even pre-Internet. Like, the idea of celebrity just enables this sort of relationship to be possible. Celebrities and leaders. And I would say that social media, of course, did not invent the idea of celebrities, and so it didn't invent these relationships like you're talking about. You know, you've always had leaders; you've always had public figures in some way or another.
So social media, I think, has increased the day-to-day relevance of these types of relationships, you know, where you can check in on the accounts of the people that you follow every day, and they don't know you, but you know them. I feel like Instagram especially, of all the platforms I can think of, is really rife with this — these influencers and people who lead kind of glamorous lives and allow you to see into their lives by showing you their house and their pets and their lunch. You know, you get all these interior views, and it's very visceral because it's visual, and often, you know, visual even in a way that's edited to make it more colorful and exciting, with the post-processing filters and all that. Right, and of course, at the same time, like all social media representations, they're incomplete. They're crafted; they're maintained in a very strategic way, usually, so you don't even have a full vision of what, you know, a random celebrity's life actually is. You just have this idealized version of it.
So I just want to read one last quote from Sapolsky's article before we move on. He ends by saying, quote: "Throughout history, Capgras syndrome has been a cultural mirror of a dissociative mind, where thoughts of recognition and feelings of intimacy have been sundered. It's still that mirror today. We think that what is false and artificial in the world around us is substantive and meaningful. It's not that loved ones and friends are mistaken for simulations, but that simulations are mistaken for them." I think I kind of disagree with him a little bit, because I think it's actually both of those things — the dissociation goes two ways. In either case, though, we often find ourselves in situations where we are distracted from real-life relationships and real-life socialization, and instead we have to check in on these little streams on our phone, tend to these simulated relationships that we have on social media.
Do you ever have the sort of direct doppelganger experience — like with the fairy changelings, or the doppelganger — for a friend on the Internet? Like, you have a family member, you have a friend who you love in real life, but when you see the way they are on the Internet — I don't know, the kind of stuff they post on social media or whatever — you don't feel like you recognize them, and you don't really like them. I've definitely known people who are like that — I'm not going to name any names — who are like that on, say, Twitter. Like, I think, I love this person, but if all I knew about them was the way they act on Twitter, I wouldn't be able to stand them. Well, social media, especially as it pertains to, you know, some topics — take politics, for instance — I think it does tend to bring out the worst in us. And I don't think that is a risky comment to make. I think we can all think of specific examples of that in all of our own lives.
And yeah, that can lead you to a situation where you're like, well, I thought I knew that person, but I guess I don't, because look at this meme they just shared, you know. But I think also, when that happens, we're just not appropriately appreciating the way that circumstances and situations change people's behavior — that we are the same way. Yeah, absolutely. All right, on that note, we're going to take a quick break, but we'll be right back. All right, we're back. You know, I was thinking about what you said about my emails, and I feel like sixty percent, maybe, of my emails are me just saying, um, "Cool, sounds good." I think that's my standard, which I feel is sufficient. It's just me saying yes to whatever you just said, and I'm cool with it. Do you have an Android phone? No, I don't. Okay, but you use Gmail. Yeah, you use Gmail — but I don't use the... you know they have the sort of auto-language feature that starts telling you what to say? It's like, here's the email you could write. Man, when I saw that thing, I was like, get out — what the heck?
No, no, no. Well, it's an easy jump to go from there to, like, authorized simulations of yourself, which — I mean, that's not far off. It's basically already here, where you just give your account the authority to make responses like this: "Cool, sounds good, I'll get back to you," that sort of thing. Well, I don't know. I mean, even if I would type exactly the words it's suggesting, I still don't want to let it do that. The fact that Gmail is going to compose an email for me to my parents or my wife — that, no, no, no: unacceptable. There's so much room for misunderstanding, even if we're applying, you know, most or all of our attention to crafting an email. It feels like a machine, even a very, like, talented AI, would have difficulty with that. There's just so much nuance in human communication and in knowing who you're communicating with.
Like, sometimes it's a matter of knowing there are certain words you shouldn't use with another individual. Like, maybe you're aware of, you know, some sort of a trigger for them, or it may pertain to some sort of incident from your personal past with that person. There are so many potential holes to fall into when composing written communication — why trust that to the machine? Or, I don't know, maybe the reverse is true: always trust the machine, as long as it has all those caveats in mind, you know. I think one thing that's interesting to me is the psychological effects of heavy social media use. I feel like we're still in the early days of getting a picture of what that's like, and there appears to be a lot of conflicting evidence — I think because we haven't refined all our categories and ways of testing things yet. I do often say — and this is just a prediction; I could turn out to be totally wrong —
But my guess is that in the coming years there's going to be an emerging consensus that heavy social media use, especially, say, among young people like teenagers, is correlated with a lot of negative psychological outcomes — you know, depression and things like that — and that there will be a new cottage industry of, like, lobbyists who deny the emerging science on social media. But I guess that's still to be seen. I mean, we've only got a few years of data to work with so far. Yeah, trying to imagine the future is difficult, and also, you know, kind of anxiety-inducing, to try and think where our social media usage is going. On one hand, I guess I'm hopeful that more and more people will choose, if not to opt out of social media, then, you know, at least to rethink how they're using it, to step back from it even. I kind of think of it like a hot tub. You know, when you first get into a hot tub, you just ease all in.
You know, it's like, let me just go all the way up to my ears in this, and I'll zone out, and, you know, that's good for a while. But then eventually you realize, if I stay in here, I am going to die. So maybe I need to, like, only put half of my body in here. Maybe I should just sit on the side and get my feet in. Or maybe, better yet, I should go get in the pool for a while. Or maybe I should leave altogether and go home and see my family — that sort of thing. You know, it is nice at first. I remember thinking, when I very first got on Twitter, it seemed like it was nice for a while: I was mostly just seeing things like learning, and things that people were enthusiastic about. People were sharing their enthusiasms — here's a great thing. And over time, I'm not sure exactly what happened, but it seemed like it transformed into more of this swamp of misery, where the primary emotion coming off of it was just that everybody hates everything.
Of course, all of this depends on how exactly one uses a social media platform — who you follow. Like, for instance, on Instagram, obviously there's a lot of celebrity worship going on, a lot of parasocial relationships taking place there, as we already mentioned. I don't see as much of that, and part of that is just because I only follow family and friends, and I only use it myself for family photos — it's, you know, a closed account. Well, I do think that there is some evidence I've seen so far — and I'm not sure how solid this is yet — that there's a pretty big difference in the psychological effects of social media depending on whether you primarily use it as a way of keeping up with family and friends versus as a way of interacting with public accounts. Right.
But then again, one of the dangers in all of this is that even if there is a preferred, healthier way to use a given platform, you are still fighting against the intended usage of that platform, as engineered by the makers of that platform. The intended usage of the platform is to open it up and never get off, and so it's difficult to compete with that. I mean, we've talked about this on the show before, in terms of gambling technology and then social media technology. You're really up against a fearsome adversary in telling yourself, I'm only going to use this in a way that is mentally beneficial for me, and not just purely economically beneficial for the masters of the medium. You know, Jaron Lanier, who we've talked about on the show before, has written a book about it, basically saying everybody should delete their social media accounts — just get off these platforms — and that it will make a much better world. And he's got a whole argument for it in this book, which I haven't read yet but I plan to.
In fact, we asked for some review copies, but I think we should see if we can try to get Jaron Lanier on the podcast. Yes, I think we should. I also wonder what he would think about this comparison. I still feel like there's a lot of stuff to work out, but I sense that Sapolsky's comparison here — about the rift between the emotion of familiarity and the cognitive recognition function of the brain that's at work in Capgras syndrome — is a really rich kind of comparison for social media and media technology generally, and I expect to keep having more thoughts about it. Yeah, I mean, I don't want to just sound like I'm saying that people are awful and that technology makes us more awful. You know, I don't want that to be my ultimate argument. Ultimately, I would say that technology enables humans to do amazing things if we direct this power in the right way. You know, there's plenty that we can do — there's plenty we have done — to connect people and build a better world out of those connections.
But obviously there's more that we could do. And I guess the worrisome thing with these platforms, the various platforms that we're talking about, is: what is the ultimate advantage, what is the ultimate intention, of the masters of those given platforms? What do they want? And even in cases where they may say, no, we want to build something that brings people together, we want to build something that empowers, you know, a better world — is that impulse going to win out in the overall structure of this given social media platform, or is it going to be profitability or engagement or some other metric that is ultimately more important to the corporate entity? It's always profitability, and of course that's always what's going to win out.
But I feel like I have to 971 00:52:35,640 --> 00:52:39,160 Speaker 1: hold onto the possibility that humans can do better, 972 00:52:39,239 --> 00:52:41,480 Speaker 1: though. Well, I mean, that does make me wonder if 973 00:52:41,760 --> 00:52:44,319 Speaker 1: perhaps what you could do instead is have some kind 974 00:52:44,320 --> 00:52:49,319 Speaker 1: of nonprofit, open source social media platform that would 975 00:52:49,360 --> 00:52:52,840 Speaker 1: compete with and try to replace these for-profit platforms that 976 00:52:52,880 --> 00:52:58,359 Speaker 1: are deranging our relationships and causing this familiarity-recognition 977 00:52:58,480 --> 00:53:02,319 Speaker 1: rift and potentially having all these psychologically negative consequences on 978 00:53:02,320 --> 00:53:04,960 Speaker 1: our lives and on our culture broadly. I'm not sure 979 00:53:04,960 --> 00:53:07,120 Speaker 1: exactly what that would look like. I mean, it would 980 00:53:07,160 --> 00:53:09,400 Speaker 1: probably be a start if there was just something that 981 00:53:09,480 --> 00:53:13,400 Speaker 1: was like Facebook but that did not manipulate what people 982 00:53:13,480 --> 00:53:17,840 Speaker 1: saw and prioritize, you know, conflict and paid content. But 983 00:53:17,880 --> 00:53:20,399 Speaker 1: then again, even just with, you know, the bare 984 00:53:20,440 --> 00:53:22,919 Speaker 1: bones basics of Facebook, I wonder about, you know, having 985 00:53:22,920 --> 00:53:27,800 Speaker 1: these friend networks. Does even the most 986 00:53:27,800 --> 00:53:32,200 Speaker 1: basic mechanic of something like Facebook encourage people to go 987 00:53:32,280 --> 00:53:35,839 Speaker 1: through these mental processes where they sort of degrade their 988 00:53:35,880 --> 00:53:39,279 Speaker 1: standards of what counts as a healthy relationship? 
I mean, 989 00:53:39,320 --> 00:53:41,960 Speaker 1: maybe ultimately that's where AI can come in, you know, 990 00:53:42,160 --> 00:53:46,280 Speaker 1: and we just need artificial intelligence to dictate 991 00:53:47,239 --> 00:53:50,719 Speaker 1: where and how to maintain healthy relationships online. And that's 992 00:53:50,800 --> 00:53:52,840 Speaker 1: the ultimate answer. I don't know, just hand it 993 00:53:52,840 --> 00:54:00,680 Speaker 1: off to an AI deity. Well, I'm not hopeful about that either. Again, 994 00:54:00,960 --> 00:54:03,000 Speaker 1: this is back to, like, I'm worried about these little 995 00:54:03,080 --> 00:54:06,879 Speaker 1: AI squirrels. I'm not worried about the great basilisk; 996 00:54:07,160 --> 00:54:10,839 Speaker 1: I'm worried about the minor dumb AIs that are 997 00:54:10,920 --> 00:54:14,560 Speaker 1: running through our lives like a pest infestation. Now, 998 00:54:14,680 --> 00:54:16,400 Speaker 1: you know, not to end things in a negative 999 00:54:16,400 --> 00:54:18,279 Speaker 1: place, I do want to refer listeners back to 1000 00:54:18,360 --> 00:54:22,719 Speaker 1: our episodes The Great Eyeball Wars, where 1001 00:54:22,719 --> 00:54:25,239 Speaker 1: we went into a lot of this, particularly, you know, 1002 00:54:25,400 --> 00:54:28,160 Speaker 1: about how social media and these platforms and our phones 1003 00:54:28,640 --> 00:54:31,920 Speaker 1: are gamed to capture our attention and hold our attention. 
1004 00:54:32,200 --> 00:54:35,640 Speaker 1: In those episodes, we also shared some advice that 1005 00:54:35,719 --> 00:54:38,520 Speaker 1: experts have given about how to fight back, how to 1006 00:54:38,719 --> 00:54:42,000 Speaker 1: limit your use of social media and/or your phone, 1007 00:54:42,200 --> 00:54:43,600 Speaker 1: and so, I mean, there are 1008 00:54:43,640 --> 00:54:46,200 Speaker 1: increasingly more tools out there. I believe, you know, some 1009 00:54:46,280 --> 00:54:49,000 Speaker 1: of these phones have ways now to 1010 00:54:49,000 --> 00:54:52,040 Speaker 1: track how much you're using them, or even to 1011 00:54:52,200 --> 00:54:55,719 Speaker 1: remind you not to use them in certain situations. Yeah. Yeah, 1012 00:54:55,920 --> 00:54:59,160 Speaker 1: I mean, I can't honestly and non-hypocritically tell 1013 00:54:59,160 --> 00:55:01,719 Speaker 1: people to get entirely off of social media, because one 1014 00:55:01,760 --> 00:55:04,000 Speaker 1: of the things is I have to maintain social media 1015 00:55:04,040 --> 00:55:06,759 Speaker 1: accounts because of my job at this podcast, where we've 1016 00:55:06,800 --> 00:55:09,520 Speaker 1: got to, like, promote stuff on social media, and, you know, 1017 00:55:09,560 --> 00:55:12,400 Speaker 1: we've got a Facebook discussion module that 1018 00:55:12,480 --> 00:55:15,040 Speaker 1: I really enjoy using. I probably would have deleted my 1019 00:55:15,040 --> 00:55:18,200 Speaker 1: Facebook account, but I enjoy our discussion module with 1020 00:55:18,200 --> 00:55:21,239 Speaker 1: our fans there. Yeah. Yeah, that's probably one of 1021 00:55:21,360 --> 00:55:23,359 Speaker 1: the main reasons I go on Facebook these days. So, 1022 00:55:23,640 --> 00:55:26,799 Speaker 1: discussion module, don't screw this up. Let's keep this 1023 00:55:27,000 --> 00:55:30,080 Speaker 1: positive relationship. 
But I mean, I will say that the 1024 00:55:30,120 --> 00:55:32,920 Speaker 1: reason it's on there is not any inherent strength of Facebook. 1025 00:55:32,920 --> 00:55:35,320 Speaker 1: It's on there because of audience inertia. I mean, 1026 00:55:35,320 --> 00:55:37,319 Speaker 1: that's where the people are. Like, if you want to 1027 00:55:37,360 --> 00:55:39,759 Speaker 1: have a place where people already have accounts and they 1028 00:55:39,760 --> 00:55:42,839 Speaker 1: can join, Facebook is, they tell us, the place where 1029 00:55:42,880 --> 00:55:45,040 Speaker 1: you can do that. You know, I'd love a world 1030 00:55:45,080 --> 00:55:48,680 Speaker 1: where somebody created some kind of non-destructive, open source, 1031 00:55:49,239 --> 00:55:52,120 Speaker 1: you know, nonprofit platform where you could do a 1032 00:55:52,160 --> 00:55:54,719 Speaker 1: similar thing, if enough people could get on there. All right, 1033 00:55:54,840 --> 00:55:56,759 Speaker 1: so there you have it. Obviously, there are a 1034 00:55:56,800 --> 00:55:58,319 Speaker 1: lot of areas here we can 1035 00:55:58,320 --> 00:56:01,239 Speaker 1: call out to listeners on. I mean, first of all, 1036 00:56:01,600 --> 00:56:06,279 Speaker 1: have you ever encountered a really impressive double in 1037 00:56:06,320 --> 00:56:08,879 Speaker 1: your life, like someone that required, you know, not 1038 00:56:08,920 --> 00:56:11,080 Speaker 1: even just a first and second glance, but maybe a third 1039 00:56:11,080 --> 00:56:13,120 Speaker 1: glance to realize that they were not your friend or 1040 00:56:13,120 --> 00:56:16,040 Speaker 1: perhaps not yourself? We'd love to hear about that. 1041 00:56:16,160 --> 00:56:18,239 Speaker 1: And in addition to that, you know, if you've 1042 00:56:18,280 --> 00:56:22,200 Speaker 1: actually had any experience with Capgras syndrome. 
You know, 1043 00:56:22,880 --> 00:56:26,680 Speaker 1: we would really appreciate any firsthand knowledge of 1044 00:56:26,960 --> 00:56:29,920 Speaker 1: experiences like that. And then beyond that, when we 1045 00:56:29,920 --> 00:56:32,600 Speaker 1: get into, you know, the literary 1046 00:56:32,880 --> 00:56:35,520 Speaker 1: and the fictional and the mythological connotations, 1047 00:56:35,520 --> 00:56:37,960 Speaker 1: if you have a particular favorite double you 1048 00:56:37,960 --> 00:56:40,120 Speaker 1: want to share. But certainly we spent most of the 1049 00:56:40,120 --> 00:56:43,760 Speaker 1: time here talking about this social media doppelganger idea, 1050 00:56:44,120 --> 00:56:47,160 Speaker 1: and, I mean, you're pretty much all on social 1051 00:56:47,200 --> 00:56:49,640 Speaker 1: media at this point. Really, you're either on social 1052 00:56:49,640 --> 00:56:53,280 Speaker 1: media or you've made a very, you know, firm 1053 00:56:53,400 --> 00:56:57,840 Speaker 1: choice not to be. So whichever category you fall into, 1054 00:56:57,960 --> 00:57:00,360 Speaker 1: I feel like you probably have thoughts related to 1055 00:57:00,440 --> 00:57:02,880 Speaker 1: this episode, and we'd love to hear from you. Absolutely, 1056 00:57:02,960 --> 00:57:04,800 Speaker 1: get in touch. In the meantime, if you want to 1057 00:57:04,800 --> 00:57:06,319 Speaker 1: listen to more episodes of Stuff to Blow Your Mind, 1058 00:57:06,320 --> 00:57:08,000 Speaker 1: head on over to Stuff to Blow Your Mind dot com. 1059 00:57:08,000 --> 00:57:10,160 Speaker 1: That is the mothership. That's where you'll find all the episodes. 1060 00:57:10,520 --> 00:57:13,919 Speaker 1: You'll find a link there to our little merchandise store. 1061 00:57:13,960 --> 00:57:16,200 Speaker 1: You can get some squirrel shirts. 
It's a fun way 1062 00:57:16,200 --> 00:57:18,480 Speaker 1: to support the show, but the best way to support 1063 00:57:18,520 --> 00:57:21,120 Speaker 1: the show is to simply rate and review us wherever 1064 00:57:21,160 --> 00:57:22,680 Speaker 1: you have the power to do so. If you can 1065 00:57:22,760 --> 00:57:25,560 Speaker 1: leave some stars, leave a nice review, do that. And 1066 00:57:25,720 --> 00:57:28,919 Speaker 1: best of all, tell somebody about the show. Yeah, tell 1067 00:57:29,000 --> 00:57:32,000 Speaker 1: them online, but even better, if you see somebody in 1068 00:57:32,120 --> 00:57:35,440 Speaker 1: real life, tell them about the show in real life, 1069 00:57:36,080 --> 00:57:39,040 Speaker 1: because I feel like, you know, it's 1070 00:57:39,040 --> 00:57:42,280 Speaker 1: going to impress people all the more. That's right. Huge thanks 1071 00:57:42,320 --> 00:57:45,840 Speaker 1: as always to our excellent audio producer, Tari Harrison. If 1072 00:57:45,880 --> 00:57:48,320 Speaker 1: you would like to get in touch with us directly... Sorry, 1073 00:57:48,360 --> 00:57:51,640 Speaker 1: I'm laughing because Robert has got a little stress ball 1074 00:57:51,680 --> 00:57:53,600 Speaker 1: over here and he's squishing the guts out of it 1075 00:57:53,640 --> 00:57:55,920 Speaker 1: as we speak. Yeah, there's like some sort of white 1076 00:57:55,920 --> 00:57:57,720 Speaker 1: pus coming out of it. I had it for like 1077 00:57:57,760 --> 00:58:02,200 Speaker 1: two weeks and it's already squeezed out. So I'm not 1078 00:58:02,200 --> 00:58:06,360 Speaker 1: going to name the brand, because maybe I just mishandled it. Anyway, sorry. 1079 00:58:06,640 --> 00:58:08,320 Speaker 1: If you want to get in touch with us directly, 1080 00:58:08,800 --> 00:58:12,439 Speaker 1: you can email us at contact at Stuff to Blow 1081 00:58:12,480 --> 00:58:23,960 Speaker 1: Your Mind dot com. 
Stuff to Blow Your Mind is 1082 00:58:24,000 --> 00:58:26,320 Speaker 1: a production of iHeartRadio's How Stuff Works. For more 1083 00:58:26,360 --> 00:58:28,760 Speaker 1: podcasts from iHeartRadio, visit the iHeartRadio app, 1084 00:58:28,920 --> 00:58:40,560 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.