Speaker 1: Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it's Saturday, time to dive into the vault. This time we're bringing you an episode that originally aired on May nineteen. This one was called The Doppelganger Network. Yeah, this one's pretty fun because, you know, obviously we're going to get into the idea of a doppelganger and what it is, and fairy impostors and so forth. But then we're getting into something a lot deeper, something that really is going to play into a lot of our everyday online interactions. So let's jump right in here.

"Then I repeat and sum up. During the endless train journey, which took me from Eisenach to Berlin, across the Thuringia and Saxony in ruins, I noticed for the first time, and I don't know for how long, that man whom I call my double, to simplify matters, or else my twin, or again, less theatrically, the traveler."

Welcome to Stuff to Blow Your Mind, a production of iHeartRadio's How Stuff Works.

Hey, welcome to Stuff to Blow Your Mind.
My name is Robert Lamb, and I'm Joe McCormick. And today I thought we might have a discussion bringing together the seemingly disparate topics of familiarity, doppelgangers or doubles, Capgras syndrome, and social media. And I got the idea to talk about this today because a few weeks ago I read this interesting article that had a very intriguing central comparison or image. It was a thought-provoking essay by the Stanford neuroendocrinologist Robert Sapolsky. It was originally published a few years ago in Nautilus, and it was an article pairing the effects of social media and sort of the digital world, like Facebook and, you know, all that, to a psychological condition known as Capgras syndrome. And so today I thought maybe we should start by explaining and discussing Sapolsky's comparison and argument in that article and just see where we go from there. Now, Capgras syndrome has definitely come up on the show before. I don't know that we've done, like, a designated show on the topic, but it's certainly come up.
But either way, we do need to, you know, provide a brief refresher on its history for our listeners, right. One of the important cases, and which Sapolsky discusses in his article, is the case of Madame M. This was a woman who lived in France in the early twentieth century who had this persistent idea. She was fixated on the idea that her loved ones, including her husband and family members, people she knew, had been replaced by doubles or doppelgangers who looked exactly like them. So she would say, my husband is not really my husband. He's a man who looks exactly like my husband used to, and I don't know what happened to my real husband. And this wasn't her only symptom. She had a number of symptoms. She believed that all kinds of things were happening to her children. I mean, it's a tragic story, but the underlying cause of what would lead someone to believe that people around them were being replaced by doppelgangers or doubles is interesting to consider.
And so the way Sapolsky in this article characterizes the ultimate disconnect underlying Capgras syndrome is that the module of the brain used in recognition of faces, specifically involving the fusiform gyrus in the brain, does cognitively recognize someone, but at the same time, the different module of the brain that normally responds to this recognition with the emotion that we call familiarity does not kick in. And this brain function responsible for generating the emotion of familiarity is what Sapolsky calls the extended face processing system. It's, quote, a diffuse network including a variety of cortical and limbic regions. And apparently, when we recognize someone but we don't feel the necessary familiarity emotion that follows when we normally recognize somebody, what the brain often does when faced with this contradiction is to conclude that someone has been replaced by a double. It looks like them, but this person doesn't feel familiar to me; thus they must be a physically identical impostor. In the past, I looked at a two thousand four paper from the Canadian Journal of Psychiatry titled Capgras Syndrome:
A Review of the Neurophysiological Correlates and Presenting Clinical Features in Cases Involving Physical Violence. And in this, the delusional misidentification syndrome, which generally involves right-brain anomalies, is linked to a number of illnesses and neurological disorders, ranging from schizoaffective disorder and Alzheimer's disease to severe head injuries, pituitary tumors, and migraines. Even alcoholism can play a role. You know, basically, each of us has a visual system and a limbic system, and the latter helps us to generate and process emotions. Damage or disrupt communication between these two systems, and suddenly a familiar face can inspire suspicion instead of comfort. Now, fortunately, Capgras syndrome usually subsides with the successful treatment of the underlying medical condition. You know, the tumor goes away, and thankfully so does this, you know, suspicion that people are not what they seem to be. And in some cases doctors can prescribe antipsychotic drugs to also achieve the same effect. But you can easily see why the idea of someone being replaced by a double or doppelganger would be such a captivating one.
I mean, it's something that feels very perverted, you know; it plays on our great vulnerabilities. And I think it is not a coincidence that this kind of thing has featured in some of the horror folklore of the world. I mean, you think about the idea of the changeling in fairy folklore, where there was this idea that the fairy folk would come in and replace someone you knew, often a child, but sometimes, like, a husband or wife, or, you know, someone you knew, with a fairy double who looked like them but wasn't familiar to you, didn't act like them. And now this is often described as something that people would use to explain, you know, maybe when somebody's behavior changed and they didn't seem themselves, they'd think, oh, maybe they've been replaced with a changeling; or used to explain why people might feel that their children weren't their own, or something like that. But then also you have to wonder if some kinds of neurological issues may be at work here in the minds of the people making the accusation that someone is a fairy.
Yeah, and this idea, obviously, in and of itself, has played into so many myths throughout history and also continues to just resound in our popular media. This is slightly older work, of course, but Invasion of the Body Snatchers, I mean, that plays heavily on this trope, right? That people are being replaced by something else; people that we think we know are not actually those individuals anymore. It's been a huge one. You often find it also, not only in speculative fiction, but in literary fiction as well. The quote that I read at the top of this episode is from a two thousand and four novel titled Repetition by one of my favorite French authors, Alain Robbe-Grillet, who often... this is a trope that he often threw into his books, like the idea of a double or some sort of an alter ego. And this book in particular starts off with a character on a train having glimpsed his double once more. Yeah, it's a very unsettling image. Yeah, the plurality of self, right.
Um, well, and there are a couple of ways that it can be unsettling. There's the idea that someone you know is replaced by a double. Obviously, if you were to come to believe that, whether you had, like, a brain injury or a neurological condition that caused you to believe it, or, I don't know, if you just believed in fairies and thought maybe this was happening because of your cultural conditioning, either way, that would be a terrifying thing. It's another thing entirely to believe that you see another version of yourself, you know, to think that you had your own double, or that there was a doppelganger of you. So I think most of us are probably familiar with this, the idea of a doppelganger. I, you know, would love to say that I learned about doppelgangers for the first time by consulting a nice, you know, book on Germanic mythology, and certainly I read a lot of different mythology books as a kid. Also, I would love to say that my first encounter with doppelgangers was a Dungeons and Dragons Monster Manual, because that's another huge place that they're highly visible, as they've long been a staple of Dungeons and Dragons, so they're in there. Oh yeah, I mean, it's a great way to introduce a little, um, suspense and chaos into a campaign, right? Somebody you know, an NPC that the characters trust, has been replaced; or a doppelganger is trying to, or even successfully replaces, a member of the party. So, you know, there's a lot of fun to be had with a doppelganger. Um, but I have to admit that neither of these cases is true. I heard about them initially in the nineties via the Drew Barrymore movie that aired on the Sci-Fi Channel, back in the old days, before there were Ys in Sci-Fi. Ys inside? Oh, you mean Syfy? Yeah, Syfy Channel. Yes.
Um, I remember next to nothing about this film, but it was heavily promoted on the channel, and it introduced the idea to me initially, and then I, you know, followed up by, you know, asking around: hey, Dad, what's a doppelganger? And then I looked it up, et cetera. Well, wait a minute, so it was called Doppelgangers? The name of the movie? That was at least the title of the film as it was promoted on the Sci-Fi Channel at the time. Of course, as is often the case with films of this caliber, they may have had multiple titles, and who knows, it may have been promoted elsewhere under a different title. I just looked it up. It's also known as Doppelganger, colon, The Evil Within. Just to be clear, that was for the people who didn't know what a doppelganger was. Always got to have a colon in the title. So, but here's an interesting thing that I didn't realize until I was researching this episode.
I just kind of assumed, you know, obviously the doppelganger itself, the term, is of Germanic origin, and I figured this is a creature that emerges from German folk traditions, you know, in the same way that Krampus came down from the mountains in Alpine traditions. I just figured the doppelganger was just a standard, because, again, the idea of a mysterious double, either of self or other, is long established. But this does not seem to be the case. Apparently, the word doppelganger wasn't coined until the eighteenth century, and it was coined by the German novelist Jean Paul in his seventeen ninety-six novel Siebenkäs, in which the main character encounters his own doppelganger, or "double goer." In this case, the doppelganger convinces him to fake his own death and start a new life. And I had to look in closer on this. It's not as straightforward as I would like it to be, where he's just like, hey, this is the doppelganger. Apparently he invents two similar words in this book. He invented the word Doppeltgänger.
Um, so this would be the name for people who see themselves. But then he also talks about Doppelgänger as a word for when the second course of a meal arrives alongside the first course, because Gänger means both, you know, "goer" or "walker," as well as "course" in a meal. So technically Doppeltgänger would be the mysterious double idea that he introduces, and Doppelgänger itself is just a weird mishap of ordering a multi-course meal at a restaurant. But nobody's gonna say Doppeltgänger, not anymore. No Doppeltgänger. But this is a good idea: next time someone introduces the doppelganger in your D&D campaign, remind the DM that that's a culinary term, sort of. But anyway, the term ends up resonating in German literature, and it became popular in Romantic horror literature in general by the mid-eighteen hundreds. So I think originally this idea was always something scary or dangerous, right? Well, yeah, not as much. It's seemingly not in the original, and I didn't read the original German novel, I must stress, so, you know, feel free to correct me if anyone out there is more familiar with the literature we're talking about here. But it certainly took on sinister connotations within the literary tradition. But then I was reading about the term on Webster's, and the sinister connotations have apparently dropped off somewhat in its English-language usage, which is surprising to me. But then again, I'm coming from the standpoint of knowing them mostly through Dungeons and Dragons and horrible Drew Barrymore movies, so I'm probably not, like, the key candidate here. Um, I guess the other thing, too, is I really don't use the term outside of a fantasy context. Like, if I encounter someone who looks a lot like someone I know, I don't say, oh hey, I saw your doppelganger today. I'm more like, hey, I saw your... maybe I'll say evil twin, which is, you know, another variation on this trope. Or I'll say, oh, I just saw someone who looked just about like you. Or if I'm in another city, I might say, oh, I saw your Chicago you, or whatever, you know. So that's the other thing. I just don't use doppelganger outside of fantastic settings myself. I think most people just use it to mean a look-alike now. Yeah, but I guess I don't even use it that way. Like, for me, if I think of doppelganger, I think of something like that creature in Krull which pretends to be the wizard, you know. I think of something that, when you reveal it, is a horrible, pallid creature with jet-black eyes. So if I'm not specifically talking about, like, a monstrous scenario, I'm not gonna use doppelganger. Okay, that's just me.
I also think that part of this whole idea of the sinister connotations fading away might have to do with the fact that it is used, by and large, for just somebody's double. Like, if someone is to say, hey, I saw your doppelganger today at Shoney's, there's not gonna be a creepy connotation to that sighting. We're not gonna say, oh my god, I saw your doppelganger at Shoney's and I was super creeped out; I think we need to call somebody. No, it's just gonna be a point of whimsy. And the other thing is that more than likely it was a first-glance situation. Like, at first glance, I thought it was you; at second glance, I saw that it was clearly another person, and nothing to freak out about. Well, I would be shocked, though, if people didn't still interpret this kind of thing as some kind of weird omen or demon or whatever. Oh yeah. And I was glancing around on the internet, and there's still plenty of that.
Um, and I think a large part of that is, you know, as with all paranormal experiences or supernatural explanations for mundane encounters, the supernatural explanation is going to be more appealing. You know, it makes us feel more important. Like, you want to feel like you're in an Alain Robbe-Grillet novel and you saw your mysterious double, and it, you know, reveals something about your, you know, inner subconscious nature or something; or that you saw a ghost that looked like you. I mean, all of these are far more interesting than, yeah, you know, there are a whole bunch of people in the world, and it was bound to happen sooner or later, but I saw somebody that kind of looked like me and had some more facial hair. The way that you look isn't all that unique. That's like the worst news of all. Yeah, there's just nothing exciting about that story. You don't rush home to tell that to your significant other.
But it 286 00:15:58,200 --> 00:16:00,080 Speaker 1: does bring up the question, what are your chance is 287 00:16:00,120 --> 00:16:03,520 Speaker 1: of running into your own unrelated double, or for that matter, 288 00:16:03,640 --> 00:16:06,760 Speaker 1: running into an unrelated double with someone you know well. 289 00:16:06,800 --> 00:16:11,440 Speaker 1: According to ananimous Dr Tiggan Lucas quoted in the BBC 290 00:16:11,600 --> 00:16:14,720 Speaker 1: future article, you're surprisingly likely to have a doppel game, 291 00:16:15,200 --> 00:16:18,520 Speaker 1: which I think is slightly confusing title given the contents 292 00:16:18,520 --> 00:16:21,400 Speaker 1: of the article, but still uh said that the chances 293 00:16:21,440 --> 00:16:24,840 Speaker 1: of sharing just eight dimensions with someone else are less 294 00:16:24,880 --> 00:16:27,600 Speaker 1: than one in a trillion, and with a seven point 295 00:16:27,640 --> 00:16:29,840 Speaker 1: four billion people on the planet, it was only there 296 00:16:29,880 --> 00:16:32,640 Speaker 1: was only a one in on five chants that there's 297 00:16:32,680 --> 00:16:35,160 Speaker 1: a single pair of true doppel gangers. The wait, what 298 00:16:35,200 --> 00:16:38,280 Speaker 1: are these dimensions you're talking about? Like eight facial dimensions? 299 00:16:38,280 --> 00:16:40,320 Speaker 1: Like if you take you take facial features and you 300 00:16:40,360 --> 00:16:42,120 Speaker 1: divide them up into eight dimensions and go to mac 301 00:16:42,200 --> 00:16:46,520 Speaker 1: match those up. So yeah, not like eight spatial dimensions. 302 00:16:46,600 --> 00:16:49,040 Speaker 1: I'm not sure how that would work. Okay, basically the 303 00:16:49,080 --> 00:16:54,240 Speaker 1: eight sliders on your character creator right now most of 304 00:16:54,240 --> 00:16:57,160 Speaker 1: the time, though again we're not talking about exact doubles. 
305 00:16:57,440 --> 00:16:59,720 Speaker 1: You know, generally, these are just faces that are similar 306 00:16:59,800 --> 00:17:02,160 Speaker 1: to our own or similar to someone we know when 307 00:17:02,160 --> 00:17:04,400 Speaker 1: we focus on the familiarity in a way that may 308 00:17:04,440 --> 00:17:08,000 Speaker 1: be tied to a means of identifying close skin. Uh, 309 00:17:08,000 --> 00:17:10,760 Speaker 1: you know, in early human history, like that's what this 310 00:17:11,080 --> 00:17:15,120 Speaker 1: recognition system is perhaps four um and you know, think 311 00:17:15,160 --> 00:17:18,199 Speaker 1: again about how generally, how you know, generally doubles are 312 00:17:18,320 --> 00:17:21,280 Speaker 1: kind of a first glance thing. The similarities may be jarring, 313 00:17:21,800 --> 00:17:26,159 Speaker 1: but the differences will be pronounced as well. Now, the 314 00:17:26,200 --> 00:17:28,280 Speaker 1: thing is, there are so many humans on the planet 315 00:17:28,280 --> 00:17:31,560 Speaker 1: now and we live in you know, closer confines in 316 00:17:31,640 --> 00:17:35,480 Speaker 1: many situations, seeing familiar features, it doesn't necessarily mean that 317 00:17:35,480 --> 00:17:40,080 Speaker 1: there's any shared genetic heritage between two given individuals, you know, 318 00:17:40,280 --> 00:17:42,480 Speaker 1: except in the sense that all humans share mostly in 319 00:17:42,520 --> 00:17:45,360 Speaker 1: the grander yeah, and the grander scheme. Yes, but yeah, 320 00:17:45,400 --> 00:17:47,000 Speaker 1: if you just if you're in another city you see 321 00:17:47,000 --> 00:17:48,919 Speaker 1: someone who looks kind of like you or looks kind 322 00:17:48,920 --> 00:17:51,480 Speaker 1: of like a friend, it doesn't mean their your long 323 00:17:51,560 --> 00:17:54,600 Speaker 1: last cousin or their long last cousin of an your friend. 
324 00:17:56,080 --> 00:17:58,399 Speaker 1: But it's a situation where we kind of broke the 325 00:17:58,400 --> 00:18:01,480 Speaker 1: system through population growth and the birth of cities. And 326 00:18:01,920 --> 00:18:06,040 Speaker 1: self-recognition and facial recognition abilities are also 327 00:18:06,080 --> 00:18:10,360 Speaker 1: going to vary from person to person, so your doppel 328 00:18:10,359 --> 00:18:13,480 Speaker 1: ganger alarms just may not be as easy to set 329 00:18:13,480 --> 00:18:16,800 Speaker 1: off as someone else's. So anyway, that's doppelgangers 330 00:18:17,359 --> 00:18:19,800 Speaker 1: in a nutshell: both the origin of the term, but 331 00:18:19,960 --> 00:18:22,959 Speaker 1: then a little bit about the science and the 332 00:18:22,960 --> 00:18:26,840 Speaker 1: potentiality of seeing a double or near 333 00:18:26,880 --> 00:18:30,400 Speaker 1: double, uh, somewhere in the world. But thinking about what 334 00:18:30,600 --> 00:18:34,480 Speaker 1: is at work with the erroneous detection of doubles 335 00:18:34,480 --> 00:18:37,560 Speaker 1: in Capgras syndrome, uh, is I guess maybe what 336 00:18:37,600 --> 00:18:39,920 Speaker 1: we should get back to when we come back after 337 00:18:39,960 --> 00:18:44,600 Speaker 1: a break. All right, we're back, and it's really us. 338 00:18:44,640 --> 00:18:48,040 Speaker 1: We weren't replaced by strange creatures from the Monster Manual 339 00:18:48,200 --> 00:18:50,920 Speaker 1: over the course of the advertisement. Now we're here, it's 340 00:18:50,960 --> 00:18:54,000 Speaker 1: really us, and we're going to continue our exploration. You know.
341 00:18:54,080 --> 00:18:56,359 Speaker 1: I wanted to answer that with the body snatchers noise, 342 00:18:56,400 --> 00:18:58,240 Speaker 1: but I don't know if I can make it exactly 343 00:18:58,240 --> 00:19:01,080 Speaker 1: from the Donald Sutherland version, which is a great version, 344 00:19:01,080 --> 00:19:03,080 Speaker 1: by the way. I haven't seen that. I've only seen the old 345 00:19:03,080 --> 00:19:06,040 Speaker 1: black and white original. Oh, the Donald Sutherland one is great. 346 00:19:06,160 --> 00:19:11,320 Speaker 1: It's got Lambert from Alien, it's got Jeff Goldblum, and 347 00:19:11,400 --> 00:19:14,920 Speaker 1: he's feisty. And it's got, oh, from another 348 00:19:14,960 --> 00:19:18,040 Speaker 1: sci fi classic, it's got what's his name, who played 349 00:19:18,080 --> 00:19:22,720 Speaker 1: Spock? Leonard? Yeah, Leonard Nimoy is fantastic in it. I 350 00:19:22,760 --> 00:19:25,320 Speaker 1: think it's his great performance. Well, that's a 351 00:19:25,320 --> 00:19:27,600 Speaker 1: great cast, but the nineteen fifty six original had 352 00:19:28,119 --> 00:19:31,000 Speaker 1: Kevin McCarthy in the lead role. He was terrific. 353 00:19:31,440 --> 00:19:34,760 Speaker 1: You also had Carolyn Jones, who had played Morticia 354 00:19:34,800 --> 00:19:37,560 Speaker 1: on The Addams Family. Oh cool. Yeah. But also 355 00:19:37,560 --> 00:19:40,160 Speaker 1: it was just black and white, and it just, really, 356 00:19:40,400 --> 00:19:42,040 Speaker 1: at least the version I saw of it, like, the 357 00:19:42,160 --> 00:19:47,160 Speaker 1: darkness felt just so murky and, uh, dirty somehow. 358 00:19:47,200 --> 00:19:50,040 Speaker 1: Like it was just a very nightmare inducing film when 359 00:19:50,080 --> 00:19:51,920 Speaker 1: I saw it as a kid. You know, the paranoid 360 00:19:52,000 --> 00:19:54,359 Speaker 1: visual vibe.
It's got a kind of 361 00:19:54,400 --> 00:19:58,360 Speaker 1: a communist infiltration thing. Oh, definitely, definitely, that's 362 00:19:58,400 --> 00:20:00,879 Speaker 1: a very strong element of it, which just goes to 363 00:20:00,880 --> 00:20:04,400 Speaker 1: show, like, why this concept of 364 00:20:04,400 --> 00:20:07,439 Speaker 1: doubles resonates so, because you can apply it to 365 00:20:07,480 --> 00:20:10,960 Speaker 1: all these other scenarios, social and political. Well yeah, I 366 00:20:10,960 --> 00:20:13,240 Speaker 1: mean it's a common thing for people to say when 367 00:20:13,359 --> 00:20:16,040 Speaker 1: they don't literally think that someone they know has been 368 00:20:16,040 --> 00:20:20,600 Speaker 1: physically, bodily replaced by a supernatural double, they 369 00:20:20,680 --> 00:20:23,080 Speaker 1: might often think, I don't know this person anymore. I 370 00:20:23,080 --> 00:20:25,520 Speaker 1: mean it's similar, you know, like they've been replaced, 371 00:20:25,600 --> 00:20:28,680 Speaker 1: like somebody replaced you with a different person. Yeah, 372 00:20:28,720 --> 00:20:32,239 Speaker 1: like, when really you just 373 00:20:32,240 --> 00:20:35,160 Speaker 1: got to know them better. You found out something about 374 00:20:35,200 --> 00:20:37,719 Speaker 1: them you didn't know before, and now you think that 375 00:20:37,760 --> 00:20:40,760 Speaker 1: they're like a different being entirely, and it's just 376 00:20:40,800 --> 00:20:44,040 Speaker 1: because it turns out that they were maybe communists or 377 00:20:44,359 --> 00:20:46,840 Speaker 1: liked a different football team.
Well, to be fair, also, 378 00:20:46,880 --> 00:20:49,600 Speaker 1: it could be a case of, um, you know, people 379 00:20:49,760 --> 00:20:54,159 Speaker 1: overemphasizing dispositional traits, thinking that 380 00:20:54,200 --> 00:20:57,000 Speaker 1: they should expect their loved ones to be incredibly consistent 381 00:20:57,119 --> 00:21:01,000 Speaker 1: and predictable, when in fact people are inconsistent; it 382 00:21:01,160 --> 00:21:04,680 Speaker 1: depends on the circumstances how they behave. Maybe sometimes you 383 00:21:05,040 --> 00:21:08,000 Speaker 1: are used to seeing someone only in one type of context, 384 00:21:08,080 --> 00:21:10,719 Speaker 1: maybe you're used to only seeing them at work, and then 385 00:21:10,720 --> 00:21:12,679 Speaker 1: when you see them in a different context, when you 386 00:21:12,680 --> 00:21:14,480 Speaker 1: see them, you know, out with their friends or with 387 00:21:14,560 --> 00:21:16,960 Speaker 1: their family, they seem like a totally different person to you. 388 00:21:17,200 --> 00:21:19,359 Speaker 1: It can be jarring when you see those differences, and 389 00:21:19,560 --> 00:21:21,920 Speaker 1: yet they're there for almost all of us. Almost none 390 00:21:21,920 --> 00:21:25,479 Speaker 1: of us really behave the same way in all contexts. Well, 391 00:21:25,520 --> 00:21:30,399 Speaker 1: let's talk about those contexts, especially the social contexts. Yeah, 392 00:21:30,560 --> 00:21:32,320 Speaker 1: so, we've talked 393 00:21:32,320 --> 00:21:34,880 Speaker 1: about doppelgangers a bit and the idea of doubles and 394 00:21:34,880 --> 00:21:37,680 Speaker 1: familiarity and recognition.
But I want to come back 395 00:21:37,720 --> 00:21:40,280 Speaker 1: to, uh, that article I mentioned at the beginning, where 396 00:21:40,520 --> 00:21:45,639 Speaker 1: Robert Sapolsky makes this comparison between what is made clear 397 00:21:45,800 --> 00:21:49,880 Speaker 1: about the brain basis of familiarity with Capgras syndrome 398 00:21:50,400 --> 00:21:55,159 Speaker 1: and the ways that technology is changing our social relationships. 399 00:21:55,200 --> 00:21:58,399 Speaker 1: So in Sapolsky's words, Capgras syndrome makes clear the 400 00:21:58,440 --> 00:22:02,399 Speaker 1: brain basis for, quote, the differences between the thoughts that 401 00:22:02,520 --> 00:22:05,920 Speaker 1: give rise to recognition. Remember, recognition is cognitive: you see 402 00:22:05,960 --> 00:22:09,399 Speaker 1: somebody and you cognitively know who they are. And the 403 00:22:09,480 --> 00:22:13,960 Speaker 1: feelings that give rise to familiarity. That's the emotion that says, yes, 404 00:22:14,000 --> 00:22:18,080 Speaker 1: I know this person. They're different things. And Sapolsky's main 405 00:22:18,200 --> 00:22:22,399 Speaker 1: point is, quote, these functional fault lines in the social brain, 406 00:22:22,560 --> 00:22:26,080 Speaker 1: when coupled with advances in the online world, have given 407 00:22:26,200 --> 00:22:30,480 Speaker 1: rise to the contemporary Facebook generation. They have made Cap 408 00:22:30,480 --> 00:22:33,920 Speaker 1: gras syndrome a window on our culture and minds today, 409 00:22:34,160 --> 00:22:38,639 Speaker 1: where nothing is quite recognizable but everything seems familiar. And 410 00:22:38,680 --> 00:22:40,560 Speaker 1: I would actually go further than that and say I 411 00:22:40,560 --> 00:22:43,119 Speaker 1: think that's an interesting point.
But the inverse is 412 00:22:43,160 --> 00:22:46,919 Speaker 1: true as well: that the online world creates these situations 413 00:22:47,200 --> 00:22:51,840 Speaker 1: where you have familiarity without recognition and recognition without familiarity. 414 00:22:52,440 --> 00:22:55,720 Speaker 1: So, to further explore the point he makes a little 415 00:22:55,720 --> 00:22:58,520 Speaker 1: bit, he points out that, you know, essentially, for 416 00:22:58,640 --> 00:23:02,480 Speaker 1: all of our evolutionary story, our only social relationships have 417 00:23:02,560 --> 00:23:05,120 Speaker 1: been face to face ones. And I'm struggling to think 418 00:23:05,119 --> 00:23:08,320 Speaker 1: of a counter example. I can't really think of a 419 00:23:08,359 --> 00:23:12,479 Speaker 1: counter example for relationships with real people. But for tens 420 00:23:12,480 --> 00:23:14,879 Speaker 1: of thousands of years, of course, we have had language, 421 00:23:15,160 --> 00:23:18,000 Speaker 1: and we could have felt as if we had relationships 422 00:23:18,040 --> 00:23:21,600 Speaker 1: with people we only heard about in stories, for example. Now, 423 00:23:21,640 --> 00:23:23,840 Speaker 1: obviously we do eventually reach the point where we have 424 00:23:23,840 --> 00:23:26,680 Speaker 1: the ability to engage in activities like having a pen pal, 425 00:23:27,240 --> 00:23:29,040 Speaker 1: and, you know, that's a case where you 426 00:23:29,080 --> 00:23:33,120 Speaker 1: can certainly have a non face to face example. But 427 00:23:33,280 --> 00:23:37,320 Speaker 1: prior to, uh, you know, the advent of the necessary, um, 428 00:23:37,359 --> 00:23:40,560 Speaker 1: you know, systems and technology, yeah, I struggle to think 429 00:23:40,560 --> 00:23:42,640 Speaker 1: of an example as well.
I mean, even sort of 430 00:23:43,080 --> 00:23:47,639 Speaker 1: semi imagined situations, such as speaking to the spirit of 431 00:23:47,680 --> 00:23:52,080 Speaker 1: a dead ancestor or dead relative, like, you're still depending 432 00:23:52,119 --> 00:23:55,760 Speaker 1: upon a previous face to face relationship. Yes, and 433 00:23:55,800 --> 00:23:59,120 Speaker 1: even with pen pals, I mean, even the oldest versions 434 00:23:59,119 --> 00:24:03,119 Speaker 1: of this, the non digital communications, just writing to people 435 00:24:03,160 --> 00:24:06,119 Speaker 1: with letters, even if you've never met them before, that 436 00:24:06,119 --> 00:24:09,399 Speaker 1: is anatomically recent. I mean, the vast majority of 437 00:24:09,400 --> 00:24:11,960 Speaker 1: the time our species has been around, we didn't have writing. 438 00:24:12,000 --> 00:24:14,920 Speaker 1: We couldn't do that. The only relationships we had were 439 00:24:14,960 --> 00:24:18,199 Speaker 1: face to face relationships. And so it's entirely clear that 440 00:24:18,280 --> 00:24:21,639 Speaker 1: our bodies and our brains have been shaped by an 441 00:24:21,680 --> 00:24:24,800 Speaker 1: evolutionary niche in which all relationships were face 442 00:24:24,880 --> 00:24:28,040 Speaker 1: to face ones. Right. Even our history as a symbolic 443 00:24:28,080 --> 00:24:32,920 Speaker 1: species is mostly based on, almost exclusively based on, face 444 00:24:32,920 --> 00:24:36,400 Speaker 1: to face communication. Yeah. And so when our only social 445 00:24:36,440 --> 00:24:40,560 Speaker 1: relationships were face to face relationships, it was natural for 446 00:24:41,000 --> 00:24:46,040 Speaker 1: facial recognition and familiarity, at an in person, body sensing 447 00:24:46,160 --> 00:24:49,040 Speaker 1: level, to be one of our main mediators of how 448 00:24:49,119 --> 00:24:55,000 Speaker 1: we conceptualized, evaluated, and formed beliefs about our relationships.
I mean, 449 00:24:55,040 --> 00:24:58,040 Speaker 1: if you live in this non technological world where your 450 00:24:58,080 --> 00:25:01,520 Speaker 1: only relationships are face to face, it totally makes sense 451 00:25:01,560 --> 00:25:05,160 Speaker 1: for you to use moment to moment, face to face 452 00:25:05,240 --> 00:25:08,439 Speaker 1: visual and touch data and things like that to 453 00:25:08,600 --> 00:25:11,199 Speaker 1: get the best idea of what your relationships are and 454 00:25:11,240 --> 00:25:13,240 Speaker 1: how you should feel about them. Right. I mean, some 455 00:25:13,280 --> 00:25:15,240 Speaker 1: of that goes back to what we were discussing 456 00:25:15,280 --> 00:25:19,080 Speaker 1: earlier about, uh, you know, kin identification, being able to 457 00:25:19,080 --> 00:25:21,760 Speaker 1: tell, like, this is a relative, I can see 458 00:25:21,760 --> 00:25:24,879 Speaker 1: it in their face. Exactly. But of course there have 459 00:25:24,920 --> 00:25:28,560 Speaker 1: been these technological changes that now allow relationships to exist 460 00:25:28,680 --> 00:25:32,879 Speaker 1: and persist under circumstances other than face to face interaction. 461 00:25:33,119 --> 00:25:35,600 Speaker 1: Of course, we already mentioned writing and literacy. Now this 462 00:25:35,640 --> 00:25:38,920 Speaker 1: allows you to maybe send letters, though I'd say even 463 00:25:39,400 --> 00:25:41,320 Speaker 1: for most of the time that's been around, that has 464 00:25:41,359 --> 00:25:45,560 Speaker 1: been something that is limited to a small percentage of humans, 465 00:25:46,040 --> 00:25:49,040 Speaker 1: you know, because for most of human history most people 466 00:25:49,080 --> 00:25:51,560 Speaker 1: have not been literate. That's true.
And then, 467 00:25:51,560 --> 00:25:54,240 Speaker 1: of course, again, I feel like the pen pal 468 00:25:54,320 --> 00:25:57,680 Speaker 1: situation, in which there is never 469 00:25:57,760 --> 00:26:00,680 Speaker 1: a face to face meeting, like, that's a slim 470 00:26:00,960 --> 00:26:03,840 Speaker 1: slice of the overall pie. Most of the other, um, 471 00:26:03,840 --> 00:26:06,680 Speaker 1: written communications are going to be carried out with individuals, 472 00:26:07,119 --> 00:26:09,720 Speaker 1: um, with whom there was at least a previous face 473 00:26:09,760 --> 00:26:13,479 Speaker 1: to face relationship. Yeah. But then think about how hard 474 00:26:13,720 --> 00:26:17,879 Speaker 1: this kind of thing can make relationships. I bet every 475 00:26:17,960 --> 00:26:24,480 Speaker 1: single person listening has had the experience of relationship strife 476 00:26:24,720 --> 00:26:29,200 Speaker 1: caused by a misunderstanding, or some kind 477 00:26:29,200 --> 00:26:32,680 Speaker 1: of feeling of emotional estrangement, brought on by the media 478 00:26:32,720 --> 00:26:35,320 Speaker 1: through which you communicate. A lot of us don't feel 479 00:26:35,400 --> 00:26:38,080 Speaker 1: very comfortable talking to people on the phone. A lot 480 00:26:38,119 --> 00:26:40,439 Speaker 1: of us, you know, have had the experience of 481 00:26:40,480 --> 00:26:43,680 Speaker 1: sending emails and being misunderstood, having people not read your 482 00:26:43,720 --> 00:26:47,840 Speaker 1: tone correctly, or getting worried about the way somebody punctuated 483 00:26:47,880 --> 00:26:50,000 Speaker 1: a sentence in an email. I mean, I bet you've 484 00:26:50,000 --> 00:26:52,439 Speaker 1: had this experience. Oh yeah, absolutely, and I think we 485 00:26:52,480 --> 00:26:57,119 Speaker 1: all have, both in personal contexts and work contexts.
You know, um, 486 00:26:57,200 --> 00:27:00,359 Speaker 1: I guess, you know, hopefully, if you're 487 00:27:00,480 --> 00:27:03,560 Speaker 1: dealing a lot, uh, via email with someone, 488 00:27:03,600 --> 00:27:06,160 Speaker 1: you'll kind of get a feel for their tone and 489 00:27:06,200 --> 00:27:08,960 Speaker 1: how they tend to speak. But even then there's so 490 00:27:09,040 --> 00:27:11,680 Speaker 1: much room for miscommunication, like, even when you feel 491 00:27:11,680 --> 00:27:14,280 Speaker 1: like you're really, uh, you know, up 492 00:27:14,320 --> 00:27:18,840 Speaker 1: to speed on how they present themselves in a textual manner. Robert, 493 00:27:18,880 --> 00:27:20,439 Speaker 1: if you don't mind me saying, you're kind of a 494 00:27:20,520 --> 00:27:23,719 Speaker 1: terse emailer. Am I? I can see people getting worried 495 00:27:23,760 --> 00:27:25,920 Speaker 1: when they get an email from you that maybe you're 496 00:27:25,960 --> 00:27:28,600 Speaker 1: mad at them or something. I don't think that's necessarily 497 00:27:28,600 --> 00:27:30,480 Speaker 1: always the case. Maybe sometimes you're mad at me. But 498 00:27:30,560 --> 00:27:33,240 Speaker 1: I mean, I think you just tend to not spend 499 00:27:33,280 --> 00:27:36,119 Speaker 1: a whole lot of time, you know, worrying about how 500 00:27:36,119 --> 00:27:37,880 Speaker 1: to phrase stuff on email. You just kind of bang 501 00:27:37,960 --> 00:27:40,679 Speaker 1: it out, which I admire, because, you know, 502 00:27:41,040 --> 00:27:44,080 Speaker 1: the amount of time that people 503 00:27:44,119 --> 00:27:47,720 Speaker 1: waste trying to phrase stuff on email is 504 00:27:47,760 --> 00:27:50,880 Speaker 1: a horror.
The thing is, I remember when 505 00:27:50,920 --> 00:27:53,720 Speaker 1: I was younger, I would have these long email 506 00:27:53,760 --> 00:27:56,840 Speaker 1: correspondences going on with friends where we would 507 00:27:56,880 --> 00:28:01,400 Speaker 1: respond, like, sometimes sentence by sentence, or at least paragraph 508 00:28:01,400 --> 00:28:05,320 Speaker 1: by paragraph, where we'd respond to specific points and 509 00:28:05,440 --> 00:28:09,000 Speaker 1: write at length in response. And at 510 00:28:09,040 --> 00:28:12,840 Speaker 1: some point this just faded away. I haven't 511 00:28:12,840 --> 00:28:14,479 Speaker 1: really thought about it too much, to try 512 00:28:14,520 --> 00:28:16,840 Speaker 1: and figure out exactly, like, at what point, like, which 513 00:28:17,280 --> 00:28:22,880 Speaker 1: technological or communications change altered that, or 514 00:28:22,920 --> 00:28:27,000 Speaker 1: what life changes led to that occurring. But at the 515 00:28:27,000 --> 00:28:29,239 Speaker 1: same time, you know, I used to have, you know, 516 00:28:29,520 --> 00:28:32,520 Speaker 1: long phone conversations with people, and now it's 517 00:28:32,680 --> 00:28:35,359 Speaker 1: extremely rare for me to have a long phone conversation. 518 00:28:35,359 --> 00:28:38,560 Speaker 1: It's basically, like, two people in the world that I 519 00:28:38,600 --> 00:28:41,560 Speaker 1: have phone conversations with on a regular basis, and one's my 520 00:28:41,640 --> 00:28:43,920 Speaker 1: wife and one's my mother, and then that's pretty 521 00:28:43,960 --> 00:28:47,120 Speaker 1: much it. Do you think maybe these changes have been 522 00:28:47,200 --> 00:28:49,960 Speaker 1: brought on by other technological changes, like the rise of 523 00:28:50,000 --> 00:28:53,600 Speaker 1: social media? I suspect they have.
Yeah. Like, instead of 524 00:28:54,200 --> 00:28:58,480 Speaker 1: having this longer, more thoughtful stream of 525 00:28:58,520 --> 00:29:01,320 Speaker 1: communication with somebody that, you know, now lives in 526 00:29:01,360 --> 00:29:04,760 Speaker 1: another city, you just have a continual trickle, you know. 527 00:29:05,360 --> 00:29:08,200 Speaker 1: So again, we just have that familiarity, like a trickle of 528 00:29:08,280 --> 00:29:13,000 Speaker 1: familiarity going on, instead of, like, an actual stream of communication. Well, 529 00:29:13,040 --> 00:29:17,280 Speaker 1: and also, I think that the way that technology 530 00:29:17,360 --> 00:29:21,320 Speaker 1: has changed our communication sometimes forces us to become a 531 00:29:21,400 --> 00:29:24,080 Speaker 1: version of ourselves that we don't recognize. I mean, I 532 00:29:24,120 --> 00:29:28,760 Speaker 1: was talking about how we write work emails. I actually 533 00:29:28,880 --> 00:29:32,000 Speaker 1: don't love the way that I write work emails. I 534 00:29:32,040 --> 00:29:37,040 Speaker 1: feel like often I overuse, like, exclamation 535 00:29:37,120 --> 00:29:39,920 Speaker 1: points and smiley faces and all that. And it's mainly 536 00:29:39,960 --> 00:29:43,840 Speaker 1: just because I don't ever want to accidentally make somebody 537 00:29:43,840 --> 00:29:46,120 Speaker 1: feel bad over email or make them get the wrong 538 00:29:46,200 --> 00:29:48,280 Speaker 1: idea that I'm mad at them or something like that. Oh, 539 00:29:48,440 --> 00:29:51,120 Speaker 1: the emotional intention behind the statement. Yeah, and I hate it 540 00:29:51,160 --> 00:29:55,240 Speaker 1: because I can feel myself feeling insipid and feeling not 541 00:29:55,400 --> 00:29:58,880 Speaker 1: like myself as I type it.
But I would rather 542 00:29:59,160 --> 00:30:02,560 Speaker 1: feel like that than worry that I'm giving people the 543 00:30:02,560 --> 00:30:04,840 Speaker 1: wrong idea or letting them think I'm mad at them 544 00:30:04,920 --> 00:30:07,080 Speaker 1: or something like that, you know. Yeah, I mean, 545 00:30:08,080 --> 00:30:10,040 Speaker 1: I totally understand it. Yeah, sometimes you feel like you 546 00:30:10,080 --> 00:30:12,720 Speaker 1: have to really make it clear. And I do find 547 00:30:12,720 --> 00:30:15,000 Speaker 1: myself doing more and more of that with texts, when 548 00:30:15,000 --> 00:30:18,080 Speaker 1: I'm sending a text, you know, via my tiny pocket computer. 549 00:30:18,240 --> 00:30:22,200 Speaker 1: Oh, your pocket god. Yes, tiny pocket god. Yes. Yeah, 550 00:30:22,240 --> 00:30:24,520 Speaker 1: and so we've got, obviously, you know, all the 551 00:30:24,520 --> 00:30:28,120 Speaker 1: stuff we're talking about: email, phone, uh, text messages and 552 00:30:28,200 --> 00:30:32,000 Speaker 1: internet communications. The photograph is, in a way, kind 553 00:30:32,040 --> 00:30:35,720 Speaker 1: of a modern communication method. Sort of, yeah. It's become, 554 00:30:35,920 --> 00:30:39,920 Speaker 1: you know, increasingly easy to take digital photographs and 555 00:30:39,960 --> 00:30:42,200 Speaker 1: send them to other people. It becomes a form of 556 00:30:42,200 --> 00:30:45,720 Speaker 1: communication, as does, you know, you mentioned emoticons as being, 557 00:30:45,720 --> 00:30:50,240 Speaker 1: like, a way of tweaking textual content, but 558 00:30:50,320 --> 00:30:54,840 Speaker 1: in many cases they're the prime, uh, language that 559 00:30:54,960 --> 00:30:58,120 Speaker 1: is used in communicating. To say nothing of memes. Oh, 560 00:30:58,160 --> 00:31:01,200 Speaker 1: I shudder at this thought.
Memes. There's gonna be 561 00:31:01,280 --> 00:31:03,960 Speaker 1: a day in which the English language is replaced by memes. 562 00:31:04,400 --> 00:31:06,520 Speaker 1: It's just like, instead of an alphabet, you have a 563 00:31:06,560 --> 00:31:08,640 Speaker 1: meme-abet, and you just, like, paste the 564 00:31:08,680 --> 00:31:11,920 Speaker 1: memes together to form ideas. Yeah, I mean, I already 565 00:31:11,960 --> 00:31:13,959 Speaker 1: feel, um, you know, and maybe this is just me 566 00:31:14,080 --> 00:31:17,720 Speaker 1: feeling old, but, um, I feel we've already reached 567 00:31:17,720 --> 00:31:20,040 Speaker 1: the point where there'll be a thread about something on 568 00:31:20,120 --> 00:31:23,880 Speaker 1: Reddit and there'll be a meme, and I have to 569 00:31:24,040 --> 00:31:26,200 Speaker 1: research what the meme means, like, it's 570 00:31:26,200 --> 00:31:28,200 Speaker 1: a new meme, and I have to figure out, like, 571 00:31:28,200 --> 00:31:30,520 Speaker 1: where it came from, how it's used, and how it's 572 00:31:30,520 --> 00:31:33,360 Speaker 1: potentially being misused, and how it's, like, evolving out of 573 00:31:33,360 --> 00:31:37,480 Speaker 1: that misuse, to understand, like, what the prevailing idea is 574 00:31:38,360 --> 00:31:41,480 Speaker 1: that is being, um, expressed. Memes as a whole are 575 00:31:41,520 --> 00:31:44,240 Speaker 1: exactly like words in the sense that you can try 576 00:31:44,240 --> 00:31:46,360 Speaker 1: to write down a definition for a word, but its 577 00:31:46,520 --> 00:31:50,239 Speaker 1: usage changes over time. I mean, words don't actually have 578 00:31:50,480 --> 00:31:54,440 Speaker 1: fixed definitions. You can't control how people use them. Yeah, 579 00:31:54,520 --> 00:31:57,320 Speaker 1: it's kind of like the whole "literally" thing, right? Um, 580 00:31:57,480 --> 00:31:59,760 Speaker 1: it's like, yeah, sorry, you lost that battle. That word's changed.
581 00:32:00,120 --> 00:32:01,920 Speaker 1: You can cling to the past, but sorry, it was 582 00:32:02,000 --> 00:32:04,440 Speaker 1: just misused into a new usage. So I try not 583 00:32:04,480 --> 00:32:06,800 Speaker 1: to correct people on that one, but it 584 00:32:06,880 --> 00:32:12,000 Speaker 1: still gets me. My blood was literally boiling, and it 585 00:32:12,200 --> 00:32:15,360 Speaker 1: literally took his head off. Yes. But yeah, so we're 586 00:32:15,440 --> 00:32:17,840 Speaker 1: talking about, you know, the technological media on which 587 00:32:17,840 --> 00:32:22,280 Speaker 1: our relationships happen. And I think many of our relationships, 588 00:32:22,400 --> 00:32:26,200 Speaker 1: especially in the last, you know, ten years, now happen 589 00:32:26,240 --> 00:32:30,600 Speaker 1: primarily on these media. And on one hand, that can 590 00:32:30,640 --> 00:32:33,520 Speaker 1: be a good thing, because it allows us to maintain 591 00:32:33,600 --> 00:32:36,640 Speaker 1: relationships with people who we want to have relationships with, 592 00:32:37,080 --> 00:32:40,400 Speaker 1: but can't, you know, people we can't practically arrange to 593 00:32:40,400 --> 00:32:42,760 Speaker 1: see in person as often as we'd like to. Several 594 00:32:42,800 --> 00:32:45,680 Speaker 1: of my best friends live in different cities, and we've 595 00:32:45,680 --> 00:32:48,520 Speaker 1: been friends for years, and I'm only able to maintain 596 00:32:48,560 --> 00:32:51,560 Speaker 1: friendships with them because of this technology. So I would 597 00:32:51,600 --> 00:32:54,680 Speaker 1: hate to lose those friendships.
But also I wonder about 598 00:32:54,680 --> 00:32:56,640 Speaker 1: the fact that, what is it doing to our culture 599 00:32:56,720 --> 00:32:59,640 Speaker 1: when there's a substantial number of people who, like, I 600 00:32:59,640 --> 00:33:04,120 Speaker 1: don't know, maybe seventy percent of their friendly social interactions 601 00:33:04,160 --> 00:33:08,120 Speaker 1: happen over a machine? Yeah. I mean, even 602 00:33:08,200 --> 00:33:11,080 Speaker 1: flesh and blood friends that I have in the city 603 00:33:11,120 --> 00:33:13,000 Speaker 1: with me, like, we still have to do, like, 604 00:33:13,120 --> 00:33:16,040 Speaker 1: you know, a thirty email chain to plan 605 00:33:16,160 --> 00:33:18,760 Speaker 1: to meet each other in real life. Like, even if 606 00:33:18,760 --> 00:33:21,200 Speaker 1: it's, like, a semi regular thing, like, we know where 607 00:33:21,240 --> 00:33:23,440 Speaker 1: we're gonna go, we know when we're going to do it, 608 00:33:23,640 --> 00:33:25,720 Speaker 1: but we still have to coordinate all of these things. 609 00:33:25,720 --> 00:33:28,760 Speaker 1: So how much of the relationship is truly face to 610 00:33:28,840 --> 00:33:32,280 Speaker 1: face versus digital? Yeah. And so Sapolsky says in this 611 00:33:32,440 --> 00:33:36,200 Speaker 1: article that this technological reality has conditioned us in a 612 00:33:36,240 --> 00:33:41,920 Speaker 1: way to dissociate our traditional pathways of recognition and familiarity. 613 00:33:42,040 --> 00:33:44,880 Speaker 1: So he writes, quote, thus, not only has modern 614 00:33:44,920 --> 00:33:49,840 Speaker 1: life increasingly dissociated recognition and familiarity, but it has impoverished 615 00:33:49,880 --> 00:33:54,160 Speaker 1: the latter in the process. So impoverished familiarity, this is 616 00:33:54,200 --> 00:33:58,520 Speaker 1: worsened by our frantic skill at multitasking, especially social multitasking.
617 00:33:58,760 --> 00:34:01,480 Speaker 1: A recent Pew study reported that eighty nine percent of 618 00:34:01,520 --> 00:34:04,400 Speaker 1: cell phone owners used their phones during the most recent 619 00:34:04,440 --> 00:34:08,480 Speaker 1: social gathering. That sounds low to me. Um, we reduce 620 00:34:08,560 --> 00:34:11,600 Speaker 1: our social connections to mere threads so that we can 621 00:34:11,680 --> 00:34:15,239 Speaker 1: maintain as many of them as possible. This leaves us 622 00:34:15,280 --> 00:34:19,360 Speaker 1: with signposts of familiarity that are frail remnants of the 623 00:34:19,440 --> 00:34:22,640 Speaker 1: real thing. And I think he's really onto something there 624 00:34:22,680 --> 00:34:26,520 Speaker 1: about the idea of, um, maintaining, it's almost like putting 625 00:34:26,600 --> 00:34:31,040 Speaker 1: up scarecrows, these technological stand ins 626 00:34:31,280 --> 00:34:36,560 Speaker 1: for relationships that are not really functioning biologically and psychologically 627 00:34:36,600 --> 00:34:40,800 Speaker 1: for us the way relationships should. But we'd rather maintain 628 00:34:40,920 --> 00:34:43,759 Speaker 1: as many of those as we can rather than have 629 00:34:43,920 --> 00:34:47,200 Speaker 1: fewer relationships but more face to face interaction, you know, 630 00:34:47,320 --> 00:34:49,839 Speaker 1: quality time and all that. Yeah, so we end 631 00:34:49,920 --> 00:34:53,920 Speaker 1: up maintaining these trickles of actual social connection 632 00:34:53,920 --> 00:34:56,799 Speaker 1: as opposed to streams of social connection. So he's saying 633 00:34:56,800 --> 00:35:01,480 Speaker 1: there that we essentially degrade our sensitivity to the familiarity 634 00:35:01,600 --> 00:35:04,880 Speaker 1: aspect of what knowing somebody in a social interaction is. 635 00:35:04,920 --> 00:35:09,279 Speaker 1: It's recognition and familiarity.
And when we degrade the familiarity thing, 636 00:35:09,600 --> 00:35:15,000 Speaker 1: he says, quote, uh, that we become increasingly vulnerable to imposters. 637 00:35:15,000 --> 00:35:19,640 Speaker 1: Our social media lives are rife with simulations and simulations 638 00:35:19,680 --> 00:35:22,640 Speaker 1: of simulations of reality. And so, of course, you know, 639 00:35:22,760 --> 00:35:25,520 Speaker 1: one example there is people who 640 00:35:25,520 --> 00:35:27,920 Speaker 1: claim to know you, but they're not. You know, 641 00:35:28,040 --> 00:35:31,400 Speaker 1: a friend's email account gets hacked, some hacker contacts you 642 00:35:31,400 --> 00:35:33,480 Speaker 1: and tries to get you to open some malware. That's 643 00:35:33,480 --> 00:35:36,399 Speaker 1: one example. But there's a million versions of this thing, 644 00:35:36,480 --> 00:35:42,160 Speaker 1: where our sort of low resolution familiarity detectors 645 00:35:42,239 --> 00:35:45,239 Speaker 1: in this digital world are being exploited by people who 646 00:35:45,239 --> 00:35:48,920 Speaker 1: are not actually our real friends. And basically the 647 00:35:49,000 --> 00:35:53,280 Speaker 1: online version of ourselves is essentially a lazy, 648 00:35:53,400 --> 00:35:57,120 Speaker 1: low resolution simulation, and so if someone comes along to 649 00:35:57,280 --> 00:35:59,839 Speaker 1: hijack that simulation, it's all the easier to do so. 650 00:36:00,000 --> 00:36:02,440 Speaker 1: You don't have to be a high level magic user 651 00:36:02,800 --> 00:36:06,360 Speaker 1: to take on the likeness of another individual 652 00:36:06,640 --> 00:36:10,600 Speaker 1: when the threshold for duplication is so low. Yeah, but 653 00:36:10,719 --> 00:36:13,799 Speaker 1: then here's the turn.
So Sapolsky says, by any 654 00:36:13,840 --> 00:36:17,120 Speaker 1: logic, quote, this should induce us all to have Capgras 655 00:36:17,160 --> 00:36:20,920 Speaker 1: delusions, to find it plausible that everyone we encounter 656 00:36:21,040 --> 00:36:24,120 Speaker 1: is an impostor. After all, how can one's faith in 657 00:36:24,120 --> 00:36:27,000 Speaker 1: the veracity of people not be shaken when you sent 658 00:36:27,080 --> 00:36:28,759 Speaker 1: all that money to the guy who claimed he was 659 00:36:28,800 --> 00:36:31,400 Speaker 1: from the I.R.S.? And I think there is 660 00:36:31,440 --> 00:36:33,640 Speaker 1: something going on here. It didn't start with this, but 661 00:36:33,760 --> 00:36:37,000 Speaker 1: this impostor kind of thing, the doppelganger effect 662 00:36:37,040 --> 00:36:39,320 Speaker 1: of the online world and the fact that it's easy 663 00:36:39,360 --> 00:36:42,880 Speaker 1: to be tricked by an online doppelganger, does help contribute, 664 00:36:42,920 --> 00:36:45,800 Speaker 1: I think, to this concept. I'm sure you've encountered this, Robert, 665 00:36:45,880 --> 00:36:50,719 Speaker 1: the idea that the Internet is not real life. People always say this, right? 666 00:36:51,080 --> 00:36:53,120 Speaker 1: It's like when I talk about somebody being a friend in 667 00:36:53,239 --> 00:36:56,160 Speaker 1: real life, in real life versus on the Internet. But 668 00:36:56,200 --> 00:36:59,720 Speaker 1: if most of your social interactions are happening on the Internet, 669 00:37:00,040 --> 00:37:01,879 Speaker 1: in what sense is that not real life? I mean, 670 00:37:01,880 --> 00:37:04,239 Speaker 1: of course the Internet is real life. It 671 00:37:04,360 --> 00:37:07,800 Speaker 1: is, it's like a technology. The stuff you're doing 672 00:37:07,840 --> 00:37:11,000 Speaker 1: on it is actually happening. It's not like something that 673 00:37:11,120 --> 00:37:14,400 Speaker 1: didn't happen.
But you are making a distinction there. People 674 00:37:14,400 --> 00:37:18,000 Speaker 1: in some way are seeing these interactions as derealized, 675 00:37:18,160 --> 00:37:21,520 Speaker 1: or as, you know, not material in 676 00:37:21,560 --> 00:37:24,120 Speaker 1: the way that other interactions are. And yet that's where 677 00:37:24,160 --> 00:37:28,719 Speaker 1: we're doing most, increasingly all, of our stuff. Yeah, and 678 00:37:28,920 --> 00:37:31,319 Speaker 1: I wonder if part of that, you know, I 679 00:37:31,440 --> 00:37:34,760 Speaker 1: wonder how this plays out generation to generation, because 680 00:37:35,239 --> 00:37:38,759 Speaker 1: I feel like, for me, maybe I 681 00:37:39,040 --> 00:37:41,880 Speaker 1: had a sense of the Internet as being not real life 682 00:37:41,920 --> 00:37:44,799 Speaker 1: more so early on, because the Internet was in some 683 00:37:44,920 --> 00:37:47,920 Speaker 1: respects really kind of an escape. I mean, at the same time, yeah, 684 00:37:47,960 --> 00:37:49,640 Speaker 1: I remember having to use, 685 00:37:49,960 --> 00:37:52,080 Speaker 1: like, a college email address and all that kind of stuff, 686 00:37:52,080 --> 00:37:53,799 Speaker 1: you know. So you're still doing 687 00:37:54,400 --> 00:37:56,759 Speaker 1: real-life stuff via the Internet. But then a lot 688 00:37:56,800 --> 00:38:00,640 Speaker 1: of other stuff is about escaping, either just in general, 689 00:38:00,719 --> 00:38:06,239 Speaker 1: like escaping into fantasy, or, like, escaping geographical boundaries, 690 00:38:06,280 --> 00:38:08,480 Speaker 1: you know, and, uh, being able to 691 00:38:08,520 --> 00:38:10,759 Speaker 1: connect with people in other cities.
Well, I think there's 692 00:38:10,800 --> 00:38:12,960 Speaker 1: another way, there are multiple ways in which 693 00:38:13,000 --> 00:38:15,479 Speaker 1: people came to see the Internet as not real life, 694 00:38:15,480 --> 00:38:18,520 Speaker 1: and one of them is anonymity. You know, 695 00:38:18,800 --> 00:38:23,040 Speaker 1: if you could go around invisible all day. What's that 696 00:38:23,080 --> 00:38:26,919 Speaker 1: Harry Potter cloak that makes you invisible? Oh, what 697 00:38:26,960 --> 00:38:28,719 Speaker 1: was it? I mean, it's a cloak of invisibility, 698 00:38:28,719 --> 00:38:32,080 Speaker 1: but I don't remember if it had any particular name. Well, 699 00:38:32,120 --> 00:38:34,359 Speaker 1: whatever that is, you could be invisible in a way 700 00:38:34,400 --> 00:38:36,839 Speaker 1: that would feel not real, right? Because if nobody can 701 00:38:36,880 --> 00:38:39,799 Speaker 1: see you and nobody knows who you are wherever you are, 702 00:38:39,880 --> 00:38:42,880 Speaker 1: then there are no consequences, and consequences are kind of 703 00:38:42,920 --> 00:38:46,239 Speaker 1: what gives us the feeling of reality. So that's part 704 00:38:46,320 --> 00:38:48,600 Speaker 1: of it.
But I think also Sapolsky is onto something 705 00:38:48,600 --> 00:38:51,440 Speaker 1: here, in that this estrangement of the sense 706 00:38:51,440 --> 00:38:55,120 Speaker 1: of recognition and familiarity makes the Internet start 707 00:38:55,160 --> 00:38:57,919 Speaker 1: to feel like this world of social delusion, this sort 708 00:38:57,960 --> 00:39:01,839 Speaker 1: of, like, always-Capgras, Baudrillard-type landscape 709 00:39:02,080 --> 00:39:04,960 Speaker 1: where nothing is really real and you can't trust anything, 710 00:39:05,000 --> 00:39:07,560 Speaker 1: and yet at the same time we're constantly forced 711 00:39:07,600 --> 00:39:10,280 Speaker 1: to put our trust in it as a matter of fact, 712 00:39:10,440 --> 00:39:13,640 Speaker 1: because that's where we're doing everything. But then, of course, 713 00:39:14,160 --> 00:39:16,000 Speaker 1: back to the idea of, like, all these, you know, 714 00:39:16,239 --> 00:39:19,800 Speaker 1: threads that people maintain and sort of mistake for meaningful 715 00:39:19,880 --> 00:39:24,400 Speaker 1: relationships online. Uh, he comes back on that and says, actually, 716 00:39:24,520 --> 00:39:27,520 Speaker 1: you know, it seems more like the opposite has happened. Rather than 717 00:39:27,680 --> 00:39:30,399 Speaker 1: inducing us all to have Capgras delusions, where 718 00:39:30,400 --> 00:39:32,799 Speaker 1: we see people we know and we 719 00:39:33,280 --> 00:39:35,279 Speaker 1: think of them as, 720 00:39:35,600 --> 00:39:39,320 Speaker 1: you know, being a doppelganger or not familiar, instead 721 00:39:39,360 --> 00:39:41,560 Speaker 1: we go the other way, and we see people we 722 00:39:41,600 --> 00:39:43,960 Speaker 1: don't really know very well, but we just have to 723 00:39:44,000 --> 00:39:47,279 Speaker 1: attach this feeling of familiarity to them. It allows all 724 00:39:47,320 --> 00:39:50,040 Speaker 1: of this false familiarity.
And this really comes up in, 725 00:39:50,880 --> 00:39:52,840 Speaker 1: I don't know, have you read about the 726 00:39:52,880 --> 00:39:56,680 Speaker 1: idea of, you know, parasocial interactions on social media? 727 00:39:57,040 --> 00:39:59,520 Speaker 1: You know, I don't think prior to this episode I 728 00:40:00,320 --> 00:40:02,319 Speaker 1: knew it by that term, but of course you 729 00:40:02,320 --> 00:40:04,000 Speaker 1: do see it all the time. Yeah, it's just 730 00:40:04,239 --> 00:40:07,000 Speaker 1: ubiquitous on the Internet. It's the idea, you know, 731 00:40:07,040 --> 00:40:10,239 Speaker 1: it's an asymmetrical relationship, the way, like, you follow a 732 00:40:10,280 --> 00:40:13,400 Speaker 1: public figure who doesn't know who you are. But there 733 00:40:13,440 --> 00:40:17,280 Speaker 1: are all of these indications that many people think of 734 00:40:17,320 --> 00:40:21,880 Speaker 1: these parasocial asymmetrical relationships as relationships. It's like they 735 00:40:21,920 --> 00:40:25,920 Speaker 1: almost view this Instagram influencer that they follow as, like, 736 00:40:26,000 --> 00:40:28,800 Speaker 1: an acquaintance, somebody they know, but of course that person 737 00:40:28,840 --> 00:40:31,800 Speaker 1: doesn't know them. Yeah.
I really started thinking about this 738 00:40:31,840 --> 00:40:35,960 Speaker 1: classification, though, of parasocial relationships, uh, and wondering, 739 00:40:36,000 --> 00:40:39,640 Speaker 1: like, to what extent it can or could have existed 740 00:40:39,680 --> 00:40:42,960 Speaker 1: in previous times. Like, what is the earliest possible example 741 00:40:43,000 --> 00:40:45,880 Speaker 1: of a parasocial relationship? Like, maybe it could be 742 00:40:45,880 --> 00:40:48,719 Speaker 1: a situation where you have, like, 743 00:40:48,800 --> 00:40:52,200 Speaker 1: a leader, um, in a given community, and then you 744 00:40:52,239 --> 00:40:54,520 Speaker 1: have, like, a very low-level person in that community, 745 00:40:55,280 --> 00:40:58,480 Speaker 1: and the, you know, the tribal leader just has 746 00:40:58,520 --> 00:41:01,000 Speaker 1: no, uh, you know, real idea of who they are, 747 00:41:01,600 --> 00:41:04,120 Speaker 1: but of course you know who the leader is. I mean, 748 00:41:04,200 --> 00:41:07,160 Speaker 1: I guess that's, you know, sort of the 749 00:41:07,280 --> 00:41:10,760 Speaker 1: in-real-life version of this. But it seems 750 00:41:10,800 --> 00:41:14,000 Speaker 1: like we see far more of it 751 00:41:14,000 --> 00:41:17,200 Speaker 1: in modern civilization, um, certainly in an Internet age, but 752 00:41:17,280 --> 00:41:21,280 Speaker 1: even pre-Internet, like, the idea of celebrity just enables 753 00:41:21,719 --> 00:41:25,200 Speaker 1: this sort of relationship to be possible, celebrities and leaders. 754 00:41:25,920 --> 00:41:29,319 Speaker 1: And I would say that social media, of 755 00:41:29,360 --> 00:41:33,480 Speaker 1: course, did not invent the idea of celebrities, and so 756 00:41:33,480 --> 00:41:36,200 Speaker 1: it didn't invent these relationships like you're talking about.
757 00:41:36,200 --> 00:41:38,680 Speaker 1: You know, you've always had leaders, you've always had public 758 00:41:38,760 --> 00:41:42,160 Speaker 1: figures in some way or another. Social media, I think, 759 00:41:42,200 --> 00:41:46,720 Speaker 1: has increased the day-to-day relevance of these types 760 00:41:46,760 --> 00:41:49,279 Speaker 1: of relationships, you know, where you can, like, check in 761 00:41:49,400 --> 00:41:52,760 Speaker 1: on the accounts of the people that you follow 762 00:41:52,840 --> 00:41:55,560 Speaker 1: every day, and they don't know you, but you know them. 763 00:41:55,800 --> 00:41:59,120 Speaker 1: I feel like Instagram, especially, of all the platforms 764 00:41:59,120 --> 00:42:02,000 Speaker 1: I can think of, is really rife with this, 765 00:42:02,680 --> 00:42:06,120 Speaker 1: um, like these influencers and people who lead kind 766 00:42:06,120 --> 00:42:08,920 Speaker 1: of glamorous lives and allow you to see into their 767 00:42:09,000 --> 00:42:12,239 Speaker 1: lives by showing you their house and their pets and 768 00:42:12,280 --> 00:42:14,719 Speaker 1: their lunch, and, you know, you get all these interior 769 00:42:14,880 --> 00:42:18,360 Speaker 1: views, and it's very visceral because it's visual, and often, 770 00:42:18,400 --> 00:42:20,920 Speaker 1: you know, visual even in a way that's edited to 771 00:42:20,960 --> 00:42:24,480 Speaker 1: make it more colorful and exciting with the post-processing 772 00:42:24,480 --> 00:42:26,319 Speaker 1: filters and all that. Right. And of course, at the 773 00:42:26,360 --> 00:42:29,440 Speaker 1: same time, like all social media representations like this, 774 00:42:29,760 --> 00:42:34,520 Speaker 1: they're incomplete.
They're crafted, and they're, uh, 775 00:42:34,760 --> 00:42:37,880 Speaker 1: maintained in a very strategic way, usually, so 776 00:42:37,920 --> 00:42:39,759 Speaker 1: you don't even have, like, a full vision of what, 777 00:42:40,080 --> 00:42:43,319 Speaker 1: you know, a random celebrity's life actually is. You just have 778 00:42:43,400 --> 00:42:46,560 Speaker 1: this idealized version of it. So I just want to 779 00:42:46,600 --> 00:42:49,799 Speaker 1: read one last quote from Sapolsky's article before we move on. 780 00:42:49,840 --> 00:42:52,879 Speaker 1: So he ends by saying, quote, 781 00:42:52,880 --> 00:42:56,120 Speaker 1: Throughout history, Capgras syndrome has been a cultural mirror 782 00:42:56,480 --> 00:43:00,760 Speaker 1: of a dissociative mind, where thoughts of recognition and feelings 783 00:43:00,800 --> 00:43:04,880 Speaker 1: of intimacy have been sundered. It's still that mirror today. 784 00:43:04,920 --> 00:43:07,160 Speaker 1: We think that what is false and artificial in the 785 00:43:07,160 --> 00:43:10,560 Speaker 1: world around us is substantive and meaningful. It's not that 786 00:43:10,680 --> 00:43:13,520 Speaker 1: loved ones and friends are mistaken for simulations, but that 787 00:43:13,640 --> 00:43:17,120 Speaker 1: simulations are mistaken for them. I think I kind of 788 00:43:17,160 --> 00:43:19,040 Speaker 1: disagree with him a little bit, because I think it's 789 00:43:19,040 --> 00:43:22,080 Speaker 1: actually both of those things. It's like the 790 00:43:22,160 --> 00:43:25,799 Speaker 1: dissociation goes two ways in either case.
Though we 791 00:43:25,880 --> 00:43:29,279 Speaker 1: do often find ourselves in situations where 792 00:43:29,280 --> 00:43:33,960 Speaker 1: we are distracted from real-life, um, 793 00:43:34,239 --> 00:43:38,600 Speaker 1: relationships and real-life socialization, and instead we have to 794 00:43:38,680 --> 00:43:41,000 Speaker 1: check in on these little streams on our phone, 795 00:43:41,160 --> 00:43:45,880 Speaker 1: just these, uh, simulated relationships that we have on 796 00:43:45,960 --> 00:43:48,960 Speaker 1: social media. Do you ever have the sort of direct 797 00:43:49,120 --> 00:43:52,239 Speaker 1: doppelganger experience, like with the fairy changelings or the 798 00:43:52,280 --> 00:43:55,560 Speaker 1: doppelganger, for a friend on the Internet? Like, you have 799 00:43:55,640 --> 00:43:58,000 Speaker 1: a family member, you have a friend who you love 800 00:43:58,080 --> 00:44:01,879 Speaker 1: in real life, but when you see the way they 801 00:44:01,920 --> 00:44:04,000 Speaker 1: are on the Internet, you know, 802 00:44:04,040 --> 00:44:07,160 Speaker 1: the kind of stuff they post on social media or whatever, 803 00:44:07,880 --> 00:44:10,319 Speaker 1: you don't feel like you recognize them and you don't 804 00:44:10,360 --> 00:44:13,560 Speaker 1: really like them. I've definitely known people who are like that. 805 00:44:13,640 --> 00:44:16,080 Speaker 1: I'm not gonna name any names, who are like that 806 00:44:16,160 --> 00:44:19,520 Speaker 1: on, say, Twitter. Like, I think, like, I love this person.
807 00:44:19,680 --> 00:44:21,839 Speaker 1: But if all I knew about them was the way 808 00:44:21,880 --> 00:44:24,399 Speaker 1: they act on Twitter, I wouldn't be able 809 00:44:24,440 --> 00:44:28,320 Speaker 1: to stand them. Well, social media, especially as it pertains 810 00:44:28,360 --> 00:44:32,920 Speaker 1: to, you know, some topics, take politics for instance, I 811 00:44:32,960 --> 00:44:36,440 Speaker 1: think it does tend to bring out the worst in us. Uh, 812 00:44:36,480 --> 00:44:40,319 Speaker 1: and I don't think that is a risky comment to make. 813 00:44:40,360 --> 00:44:43,560 Speaker 1: I think we can all think of specific examples of 814 00:44:43,600 --> 00:44:46,279 Speaker 1: that in all of our own lives. And yeah, that 815 00:44:46,320 --> 00:44:48,319 Speaker 1: can lead you to a situation where you're like, well, 816 00:44:48,360 --> 00:44:49,920 Speaker 1: I thought I knew that person, but I guess I 817 00:44:50,000 --> 00:44:53,000 Speaker 1: don't, because look at this meme they just shared, you know. 818 00:44:53,600 --> 00:44:58,520 Speaker 1: Um, but I think also, though, when that happens, 819 00:44:58,560 --> 00:45:02,960 Speaker 1: we're just not appropriately appreciating the way that circumstances and 820 00:45:03,040 --> 00:45:09,279 Speaker 1: situations change people's behavior, that we are the same way. Yeah, absolutely. 821 00:45:09,520 --> 00:45:11,719 Speaker 1: All right, on that note, we're gonna take a quick break, 822 00:45:11,719 --> 00:45:16,759 Speaker 1: but we'll be right back. All right, we're back. You know, 823 00:45:16,760 --> 00:45:19,760 Speaker 1: I was thinking about what you said about my emails, 824 00:45:20,560 --> 00:45:24,960 Speaker 1: but I feel like, like, sixty percent maybe of my emails are 825 00:45:24,960 --> 00:45:27,520 Speaker 1: me just saying, um, cool, sounds good. But I think 826 00:45:27,520 --> 00:45:30,239 Speaker 1: that's my standard, which I feel is sufficient.
It's 827 00:45:30,280 --> 00:45:33,919 Speaker 1: just me saying yes to whatever you just said, uh, 828 00:45:33,920 --> 00:45:35,839 Speaker 1: and I'm cool with it. Do you have an 829 00:45:35,880 --> 00:45:39,400 Speaker 1: Android phone? No, I don't. Okay. You use Gmail? Yeah, you 830 00:45:39,520 --> 00:45:41,520 Speaker 1: use Gmail. Yeah, but I don't use the, I know 831 00:45:41,560 --> 00:45:44,279 Speaker 1: that you have the, the sort of, you know, the 832 00:45:44,320 --> 00:45:47,319 Speaker 1: auto-language feature that starts telling you what to say. It's like, 833 00:45:47,360 --> 00:45:49,919 Speaker 1: here's the email you could write. Man, when I saw 834 00:45:50,000 --> 00:45:54,399 Speaker 1: that thing, I was like, get out, what the heck? No, no, no. 835 00:45:55,080 --> 00:45:57,399 Speaker 1: Well, it's an easy jump to go from 836 00:45:57,400 --> 00:46:01,560 Speaker 1: there to, like, authorized simulations of yourself, you know, 837 00:46:01,640 --> 00:46:03,799 Speaker 1: which, I really, I mean, that's not far off. 838 00:46:03,880 --> 00:46:07,000 Speaker 1: It's basically already here, where you just give your account 839 00:46:07,040 --> 00:46:09,799 Speaker 1: the authority to make responses like, 840 00:46:09,800 --> 00:46:11,600 Speaker 1: cool, sounds good, I'll get back to you, you know, 841 00:46:11,680 --> 00:46:13,799 Speaker 1: that sort of thing. Well, it's not gonna, I don't know. 842 00:46:13,840 --> 00:46:16,320 Speaker 1: I mean, even if I would type exactly the words 843 00:46:16,360 --> 00:46:18,919 Speaker 1: it's suggesting, I still don't want to let it do that. 844 00:46:19,320 --> 00:46:21,759 Speaker 1: The fact that, like, Gmail is going to 845 00:46:21,840 --> 00:46:25,000 Speaker 1: compose an email for me to my parents or my 846 00:46:25,080 --> 00:46:28,520 Speaker 1: wife? No, no, no, no. Unacceptable.
There's so much room 847 00:46:28,960 --> 00:46:34,200 Speaker 1: for misunderstanding, even if we're applying, um, you know, most 848 00:46:34,320 --> 00:46:38,200 Speaker 1: or all of our attention to crafting an email. Uh, 849 00:46:38,239 --> 00:46:41,360 Speaker 1: it feels like a machine, even a very, like, talented 850 00:46:42,120 --> 00:46:45,000 Speaker 1: AI, would have difficulty with that. There's just so much 851 00:46:45,080 --> 00:46:48,040 Speaker 1: nuance in human communication and knowing who you're communicating to. 852 00:46:48,120 --> 00:46:50,759 Speaker 1: Like, sometimes it's a matter of knowing there are certain 853 00:46:50,760 --> 00:46:53,880 Speaker 1: words you shouldn't use with another individual. Like, maybe you're 854 00:46:53,920 --> 00:46:56,279 Speaker 1: aware of, you know, some 855 00:46:56,360 --> 00:46:58,960 Speaker 1: sort of a trigger for them, or, you know, 856 00:46:59,000 --> 00:47:01,399 Speaker 1: it may pertain to some sort of, you know, um, 857 00:47:01,400 --> 00:47:04,200 Speaker 1: you know, incident from your personal past with 858 00:47:04,280 --> 00:47:07,279 Speaker 1: that person. Like, there are so many potential holes to 859 00:47:07,320 --> 00:47:11,400 Speaker 1: fall into when composing written communication. Why trust that 860 00:47:11,440 --> 00:47:13,080 Speaker 1: to the machine? Or, I don't know, maybe the reverse 861 00:47:13,120 --> 00:47:15,839 Speaker 1: is true. Always trust the machine, as long as it 862 00:47:15,840 --> 00:47:18,160 Speaker 1: has all those caveats in mind, you know. I think 863 00:47:18,200 --> 00:47:22,040 Speaker 1: one thing that's interesting to me is the psychological 864 00:47:22,080 --> 00:47:25,040 Speaker 1: effects of heavy social media use. Um.
I feel like 865 00:47:25,040 --> 00:47:28,080 Speaker 1: we're still in the early days of getting a picture 866 00:47:28,120 --> 00:47:30,160 Speaker 1: of what that's like, and that there appears to be 867 00:47:30,200 --> 00:47:33,400 Speaker 1: a lot of conflicting evidence, I think because 868 00:47:33,440 --> 00:47:36,279 Speaker 1: we haven't refined all our categories and ways of 869 00:47:36,320 --> 00:47:39,600 Speaker 1: testing things yet. I do often say, and 870 00:47:39,960 --> 00:47:42,040 Speaker 1: this is just a prediction, I could turn 871 00:47:42,040 --> 00:47:44,520 Speaker 1: out to be totally wrong, but my guess is that 872 00:47:44,600 --> 00:47:48,560 Speaker 1: in the coming years there's going to be an emerging consensus 873 00:47:48,600 --> 00:47:51,600 Speaker 1: that heavy social media use, especially, say, among young people 874 00:47:51,640 --> 00:47:54,480 Speaker 1: like teenagers and stuff, is correlated with a lot of 875 00:47:54,520 --> 00:47:58,680 Speaker 1: negative psychological outcomes, uh, you know, depression and 876 00:47:58,719 --> 00:48:01,200 Speaker 1: things like that, and that there will be, like, a 877 00:48:01,200 --> 00:48:04,480 Speaker 1: new cottage industry of, like, lobbyists who deny the 878 00:48:04,520 --> 00:48:08,600 Speaker 1: emerging science on social media. But, uh, I mean, 879 00:48:08,920 --> 00:48:10,760 Speaker 1: I guess that's still to be seen. I mean, we've 880 00:48:10,800 --> 00:48:12,680 Speaker 1: only got a few years of data to work with 881 00:48:12,760 --> 00:48:15,839 Speaker 1: so far. Yeah, when trying to imagine the future, it's 882 00:48:15,840 --> 00:48:18,560 Speaker 1: difficult and also, you know, kind of 883 00:48:18,560 --> 00:48:21,560 Speaker 1: anxiety-inducing to try and think where our 884 00:48:21,560 --> 00:48:24,799 Speaker 1: social media usage is going.
On one hand, I guess 885 00:48:24,840 --> 00:48:29,560 Speaker 1: I'm hopeful that more and more people will, you know, 886 00:48:29,800 --> 00:48:32,560 Speaker 1: choose to, if not opt out of social media, 887 00:48:33,719 --> 00:48:36,959 Speaker 1: then, you know, at least rethink how they're using 888 00:48:37,000 --> 00:48:39,440 Speaker 1: it, step back from it even. I kind of think 889 00:48:39,440 --> 00:48:41,480 Speaker 1: of it, it's kind of like a hot tub. You know, 890 00:48:41,520 --> 00:48:43,080 Speaker 1: when you first get into a hot tub, you just 891 00:48:43,280 --> 00:48:44,600 Speaker 1: go all in, you know. It's like, let me 892 00:48:44,640 --> 00:48:46,000 Speaker 1: just go all the way up to my ears in 893 00:48:46,040 --> 00:48:49,400 Speaker 1: this, and, uh, I'll zone out, and, 894 00:48:49,640 --> 00:48:51,880 Speaker 1: you know, that's good for a while. But then eventually you realize, 895 00:48:51,880 --> 00:48:54,439 Speaker 1: if I stay in here, um, I am going to die. 896 00:48:55,239 --> 00:48:58,920 Speaker 1: So maybe I need to, like, only put half of 897 00:48:58,960 --> 00:49:01,200 Speaker 1: my body in here. Maybe I should just sit on 898 00:49:01,239 --> 00:49:03,160 Speaker 1: the side and get my feet in here. Or maybe, 899 00:49:03,239 --> 00:49:05,040 Speaker 1: better yet, maybe I should go get in the pool 900 00:49:05,080 --> 00:49:07,960 Speaker 1: for a while and do that. Or even, maybe I 901 00:49:07,960 --> 00:49:09,840 Speaker 1: should leave altogether and go home and see my 902 00:49:09,920 --> 00:49:12,399 Speaker 1: family, that sort of thing. You know, it is nice 903 00:49:12,440 --> 00:49:14,600 Speaker 1: at first.
I remember thinking, when I very first 904 00:49:14,600 --> 00:49:16,360 Speaker 1: got on Twitter, it seemed like it was nice for 905 00:49:16,360 --> 00:49:19,120 Speaker 1: a while, that I was mostly just seeing things 906 00:49:19,760 --> 00:49:23,680 Speaker 1: people were learning and things that people were enthusiastic about. People 907 00:49:23,680 --> 00:49:27,279 Speaker 1: were sharing their enthusiasms: here's a great thing. And, uh, 908 00:49:27,280 --> 00:49:29,880 Speaker 1: over time, I'm not sure exactly what happened, but 909 00:49:30,520 --> 00:49:33,759 Speaker 1: it seemed like it transformed into more like this, uh, 910 00:49:33,880 --> 00:49:37,640 Speaker 1: this swamp of misery, where the primary emotion coming off 911 00:49:37,680 --> 00:49:41,200 Speaker 1: of it was just that everybody hates everything. Of course, 912 00:49:41,480 --> 00:49:43,439 Speaker 1: all of this is depending on how 913 00:49:43,480 --> 00:49:46,200 Speaker 1: exactly one uses a given social media platform, who you 914 00:49:46,280 --> 00:49:50,319 Speaker 1: follow. Um, like, for instance, on Instagram, obviously there's 915 00:49:50,320 --> 00:49:53,120 Speaker 1: a lot of celebrity worship going on, a lot of 916 00:49:53,160 --> 00:49:56,640 Speaker 1: parasocial relationships taking place there, as we already mentioned. 917 00:49:57,000 --> 00:49:58,560 Speaker 1: I don't see as much of that, and part of 918 00:49:58,560 --> 00:50:01,600 Speaker 1: that is just because I, like, only follow family and friends, 919 00:50:01,640 --> 00:50:03,880 Speaker 1: and I only use it myself for family photos, and 920 00:50:03,880 --> 00:50:05,759 Speaker 1: it's, you know, it's a closed account.
Well, I 921 00:50:05,800 --> 00:50:08,799 Speaker 1: do think that there is some evidence I've seen so 922 00:50:08,840 --> 00:50:11,640 Speaker 1: far, and I'm not sure how solid this is yet, 923 00:50:11,680 --> 00:50:14,200 Speaker 1: but there's some evidence that there's a pretty big difference 924 00:50:14,200 --> 00:50:17,040 Speaker 1: in the psychological effects of social media depending on whether 925 00:50:17,080 --> 00:50:20,400 Speaker 1: you primarily use it as a way of keeping 926 00:50:20,480 --> 00:50:23,000 Speaker 1: up with family and friends versus as a way of 927 00:50:23,080 --> 00:50:26,879 Speaker 1: interacting with public accounts. But then again, 928 00:50:26,880 --> 00:50:28,319 Speaker 1: one of the dangers in all of this is, even 929 00:50:28,360 --> 00:50:30,720 Speaker 1: if there is a preferred, if there is a healthier 930 00:50:30,760 --> 00:50:34,560 Speaker 1: way to use a given platform, you are still fighting 931 00:50:34,600 --> 00:50:40,120 Speaker 1: against the intended usage of that platform, as engineered by 932 00:50:40,200 --> 00:50:43,160 Speaker 1: the makers of that platform. The intended usage of the 933 00:50:43,160 --> 00:50:45,680 Speaker 1: platform is to open it up and never get off. 934 00:50:46,800 --> 00:50:49,680 Speaker 1: And so, like, it's difficult to compete with that. 935 00:50:49,719 --> 00:50:52,160 Speaker 1: I mean, we've talked about this on the show before, uh, 936 00:50:52,480 --> 00:50:56,440 Speaker 1: in terms of gambling technology and then social media technology.
937 00:50:56,640 --> 00:50:59,919 Speaker 1: I mean, you're really up against a fearsome adversary 938 00:51:00,360 --> 00:51:04,120 Speaker 1: in telling yourself, I'm only going to use this in 939 00:51:04,160 --> 00:51:07,239 Speaker 1: a way that is mentally beneficial for me and not 940 00:51:07,400 --> 00:51:12,480 Speaker 1: just purely economically beneficial for the masters of the medium. 941 00:51:12,520 --> 00:51:14,279 Speaker 1: You know, Jaron Lanier, who we've talked about on the 942 00:51:14,280 --> 00:51:16,560 Speaker 1: show before, has written a book about it, basically 943 00:51:16,600 --> 00:51:19,600 Speaker 1: saying everybody should delete their social media accounts, just get 944 00:51:19,640 --> 00:51:22,640 Speaker 1: off these platforms, and that it will make 945 00:51:22,680 --> 00:51:25,080 Speaker 1: a much better world. And he's got a whole argument 946 00:51:25,120 --> 00:51:27,040 Speaker 1: for it in this book, which I haven't read yet 947 00:51:27,080 --> 00:51:29,239 Speaker 1: but I plan to. In fact, we asked for some 948 00:51:29,280 --> 00:51:31,200 Speaker 1: review copies. But I think we should see if we 949 00:51:31,200 --> 00:51:33,839 Speaker 1: can try to get, uh, Jaron Lanier on the podcast. Yes, 950 00:51:34,160 --> 00:51:36,000 Speaker 1: I think we should. I also wonder what he would 951 00:51:36,040 --> 00:51:38,759 Speaker 1: think about this, uh, comparison. I still feel like there's 952 00:51:38,760 --> 00:51:42,000 Speaker 1: a lot of stuff to work out, but I sense 953 00:51:42,160 --> 00:51:45,800 Speaker 1: that Sapolsky's comparison here, about the rift between 954 00:51:45,840 --> 00:51:49,239 Speaker 1: the emotion of familiarity and the cognitive recognition 955 00:51:49,280 --> 00:51:52,080 Speaker 1: function of the brain, uh, that's at work in 956 00:51:52,120 --> 00:51:54,600 Speaker 1: Capgras syndrome.
This is a really rich kind of 957 00:51:54,840 --> 00:51:58,960 Speaker 1: comparison for social media and media technology generally, and 958 00:51:59,400 --> 00:52:02,719 Speaker 1: I keep having more thoughts about it. Yeah, 959 00:52:02,960 --> 00:52:06,440 Speaker 1: I mean, I don't want to just sound like 960 00:52:06,440 --> 00:52:08,759 Speaker 1: I'm just saying that people are awful and that 961 00:52:08,960 --> 00:52:12,240 Speaker 1: technology makes us more awful. Uh, you know, I don't 962 00:52:12,320 --> 00:52:14,840 Speaker 1: want that to be my ultimate argument. And ultimately I 963 00:52:14,840 --> 00:52:19,200 Speaker 1: would say that technology enables humans to do amazing things, 964 00:52:19,960 --> 00:52:25,399 Speaker 1: and if we direct this power in the right way, uh, 965 00:52:25,440 --> 00:52:27,120 Speaker 1: you know, there's plenty that we can do. There's plenty 966 00:52:27,160 --> 00:52:31,120 Speaker 1: we have done to connect people and build 967 00:52:31,120 --> 00:52:35,239 Speaker 1: a better world out of those connections. But obviously there's 968 00:52:35,280 --> 00:52:37,600 Speaker 1: more that we could do. And I guess the worrisome 969 00:52:37,600 --> 00:52:40,160 Speaker 1: thing with the various platforms that we're talking about 970 00:52:40,239 --> 00:52:43,200 Speaker 1: is, like, what is the ultimate advantage? What is 971 00:52:43,200 --> 00:52:46,440 Speaker 1: the ultimate intention of the masters of those given platforms? 972 00:52:46,800 --> 00:52:49,080 Speaker 1: What do they want? And even in cases where they 973 00:52:49,080 --> 00:52:51,319 Speaker 1: may say, no, we want to build something that brings 974 00:52:51,360 --> 00:52:55,480 Speaker 1: people together, we want to build something that empowers, you know, 975 00:52:55,840 --> 00:52:58,120 Speaker 1: a better world.
Like, is that impulse going to win 976 00:52:58,200 --> 00:53:01,680 Speaker 1: out in the overall structure of this given social media 977 00:53:01,719 --> 00:53:06,160 Speaker 1: platform, or is it going to be profitability or engagement 978 00:53:06,280 --> 00:53:09,920 Speaker 1: or some other metric that is ultimately more important to 979 00:53:10,000 --> 00:53:13,000 Speaker 1: the corporate entity? It's always profitability, and of course that's 980 00:53:13,000 --> 00:53:16,239 Speaker 1: always what wins out. But I feel like, I 981 00:53:16,560 --> 00:53:19,920 Speaker 1: have to hold onto the possibility that humans can 982 00:53:19,960 --> 00:53:22,080 Speaker 1: do better, though. Well, I mean, that does make me 983 00:53:22,120 --> 00:53:24,960 Speaker 1: wonder if perhaps what you could do instead is have 984 00:53:25,160 --> 00:53:29,920 Speaker 1: some kind of nonprofit, open-source social media platform that 985 00:53:29,960 --> 00:53:32,840 Speaker 1: would compete with and try to replace these 986 00:53:33,040 --> 00:53:37,600 Speaker 1: for-profit platforms that are deranging our relationships and causing 987 00:53:37,640 --> 00:53:42,439 Speaker 1: this familiarity-recognition rift and potentially having all these psychologically 988 00:53:42,440 --> 00:53:45,360 Speaker 1: negative consequences on our lives and on our culture broadly. 989 00:53:45,640 --> 00:53:47,759 Speaker 1: I'm not sure exactly what that would look like. I mean, 990 00:53:48,040 --> 00:53:49,640 Speaker 1: it would probably be a start if there was just 991 00:53:50,160 --> 00:53:54,080 Speaker 1: something that was like Facebook but that did not manipulate 992 00:53:54,120 --> 00:53:58,520 Speaker 1: what people saw and prioritize, you know, conflict and paid content. 993 00:53:58,960 --> 00:54:01,160 Speaker 1: But then again,
Even just with, you know, 994 00:54:01,200 --> 00:54:03,799 Speaker 1: the bare-bones basics of Facebook, I wonder about, you know, 995 00:54:03,880 --> 00:54:08,719 Speaker 1: having these friend networks. Does even the 996 00:54:08,719 --> 00:54:12,680 Speaker 1: most basic mechanic of something like Facebook encourage people to 997 00:54:13,239 --> 00:54:16,759 Speaker 1: go through these mental processes where they sort of degrade 998 00:54:16,840 --> 00:54:20,319 Speaker 1: their standards of what counts as a healthy relationship? I 999 00:54:20,320 --> 00:54:22,839 Speaker 1: mean, maybe ultimately that's where AI can come in, 1000 00:54:22,920 --> 00:54:25,879 Speaker 1: you know, and we just need artificial intelligence 1001 00:54:26,280 --> 00:54:31,120 Speaker 1: to dictate where and how to maintain healthy relationships online, 1002 00:54:31,480 --> 00:54:33,680 Speaker 1: and that's the ultimate answer. I don't know, just 1003 00:54:33,719 --> 00:54:40,400 Speaker 1: hand it off to an AI deity. Well, I'm not hopeful 1004 00:54:40,440 --> 00:54:43,399 Speaker 1: about that either. Again, this is back to, like, I'm 1005 00:54:43,400 --> 00:54:46,560 Speaker 1: worried about these little AI squirrels. I'm not 1006 00:54:46,600 --> 00:54:50,240 Speaker 1: worried about the great basilisk. I'm worried about the minor, 1007 00:54:50,360 --> 00:54:53,360 Speaker 1: dumb AIs that are running through our lives like 1008 00:54:53,400 --> 00:54:56,560 Speaker 1: a pest infestation.
Now, you know, not to 1009 00:54:56,680 --> 00:54:58,200 Speaker 1: end things in a negative place, I do 1010 00:54:58,239 --> 00:55:01,279 Speaker 1: want to refer listeners back to our great episodes 1011 00:55:01,840 --> 00:55:04,680 Speaker 1: The Great Eyeball Wars, where we went into a lot 1012 00:55:04,719 --> 00:55:07,800 Speaker 1: of this, particularly about how social media 1013 00:55:07,880 --> 00:55:11,720 Speaker 1: and these platforms and our phones are gamed to capture 1014 00:55:11,760 --> 00:55:14,239 Speaker 1: our attention and hold our attention. In those episodes, we 1015 00:55:14,320 --> 00:55:18,240 Speaker 1: also shared some advice that experts have given about 1016 00:55:18,239 --> 00:55:21,200 Speaker 1: how to fight back, how to limit your use of 1017 00:55:21,239 --> 00:55:24,000 Speaker 1: social media and/or your phone. And 1018 00:55:24,080 --> 00:55:26,480 Speaker 1: there are increasingly more tools out there, 1019 00:55:26,480 --> 00:55:28,840 Speaker 1: I believe. You know, some of these phones have ways 1020 00:55:29,120 --> 00:55:32,680 Speaker 1: now to track how much you're using them, 1021 00:55:32,800 --> 00:55:34,799 Speaker 1: or even to remind you not to use them in 1022 00:55:34,840 --> 00:55:38,920 Speaker 1: certain situations.
Yeah, I mean, I can't honestly 1023 00:55:39,000 --> 00:55:41,840 Speaker 1: and non-hypocritically tell people to get entirely off of 1024 00:55:41,920 --> 00:55:43,759 Speaker 1: social media, because one of the things is I have 1025 00:55:43,920 --> 00:55:46,879 Speaker 1: to maintain social media accounts because of my job at 1026 00:55:46,920 --> 00:55:49,560 Speaker 1: this podcast, where we've got to promote stuff on 1027 00:55:49,640 --> 00:55:51,879 Speaker 1: social media, and, you know, we've got 1028 00:55:51,880 --> 00:55:55,000 Speaker 1: a Facebook discussion module that I really enjoy using. I 1029 00:55:55,040 --> 00:55:58,000 Speaker 1: probably would have deleted my Facebook account, but I enjoy 1030 00:55:58,080 --> 00:56:01,440 Speaker 1: our discussion module with our fans there. Yeah, 1031 00:56:01,520 --> 00:56:03,400 Speaker 1: that's probably one of the main reasons I go 1032 00:56:03,480 --> 00:56:06,719 Speaker 1: on Facebook these days. So, discussion module, don't screw this up. 1033 00:56:06,840 --> 00:56:10,680 Speaker 1: Let's keep this positive relationship. But I mean, I 1034 00:56:10,719 --> 00:56:12,719 Speaker 1: will say that the reason it's on there is not any 1035 00:56:12,760 --> 00:56:15,920 Speaker 1: inherent strength of Facebook. It's on there because of audience inertia. 1036 00:56:15,960 --> 00:56:18,160 Speaker 1: I mean, that's where the people are. Like, if 1037 00:56:18,160 --> 00:56:20,120 Speaker 1: you want to have a place where people already have 1038 00:56:20,200 --> 00:56:23,440 Speaker 1: accounts and they can join, Facebook is, they tell us, 1039 00:56:23,480 --> 00:56:25,719 Speaker 1: the place where you can do that.
You know, I'd 1040 00:56:25,719 --> 00:56:29,120 Speaker 1: love a world where somebody created some kind of non-destructive, 1041 00:56:29,120 --> 00:56:33,120 Speaker 1: open-source, you know, nonprofit platform where you could 1042 00:56:33,120 --> 00:56:35,359 Speaker 1: do a similar thing, if enough people could get on there. 1043 00:56:35,520 --> 00:56:37,840 Speaker 1: All right, so there you have it. Obviously, there 1044 00:56:37,840 --> 00:56:39,239 Speaker 1: are a lot of areas here 1045 00:56:39,280 --> 00:56:41,719 Speaker 1: we can call out to listeners on. I mean, 1046 00:56:41,880 --> 00:56:46,320 Speaker 1: first of all, have you ever encountered a really 1047 00:56:46,480 --> 00:56:49,600 Speaker 1: impressive double in your life, like someone that required, 1048 00:56:49,640 --> 00:56:51,640 Speaker 1: you know, not even just a first and second glance, 1049 00:56:51,719 --> 00:56:53,600 Speaker 1: maybe a third glance to realize that they were not 1050 00:56:53,680 --> 00:56:56,880 Speaker 1: your friend, or perhaps not yourself? We'd love to hear 1051 00:56:56,880 --> 00:56:58,880 Speaker 1: about that. And related to that, you know, 1052 00:56:58,920 --> 00:57:03,120 Speaker 1: especially if you've actually had any experience with Capgras syndrome, 1053 00:57:03,200 --> 00:57:06,239 Speaker 1: we would really appreciate any 1054 00:57:06,280 --> 00:57:10,799 Speaker 1: firsthand knowledge of experiences like that. And then beyond that, 1055 00:57:10,840 --> 00:57:12,600 Speaker 1: when we get into, of course, 1056 00:57:12,640 --> 00:57:15,520 Speaker 1: the literary, the fictional, and the 1057 00:57:15,560 --> 00:57:18,720 Speaker 1: mythological connotations, if you have a particular favorite 1058 00:57:18,760 --> 00:57:21,120 Speaker 1: double you want to share.
But certainly we spend most 1059 00:57:21,160 --> 00:57:24,000 Speaker 1: of the time here talking about this social media 1060 00:57:24,040 --> 00:57:27,640 Speaker 1: doppelganger idea. And, I mean, we're pretty much all 1061 00:57:27,720 --> 00:57:30,240 Speaker 1: on social media at this point. Really, you're either 1062 00:57:30,320 --> 00:57:34,040 Speaker 1: on social media or you've made a very, you know, 1063 00:57:34,200 --> 00:57:39,040 Speaker 1: firm choice not to be. So whichever category you fall into, 1064 00:57:39,160 --> 00:57:41,760 Speaker 1: I feel like you probably have thoughts related to this 1065 00:57:41,800 --> 00:57:44,320 Speaker 1: episode, and we'd love to hear from you. Absolutely, get 1066 00:57:44,360 --> 00:57:46,200 Speaker 1: in touch. In the meantime, if you want to listen 1067 00:57:46,200 --> 00:57:47,680 Speaker 1: to more episodes of Stuff to Blow Your Mind, head 1068 00:57:47,680 --> 00:57:49,160 Speaker 1: on over to Stuff to Blow Your Mind dot com. 1069 00:57:49,200 --> 00:57:51,360 Speaker 1: That is the mothership. That's where you'll find all the episodes. 1070 00:57:51,720 --> 00:57:55,120 Speaker 1: You'll find a link there to our little merchandise store, 1071 00:57:55,160 --> 00:57:57,400 Speaker 1: where you can get some squirrel shirts. It's a fun way 1072 00:57:57,400 --> 00:57:59,680 Speaker 1: to support the show. But the best way to support 1073 00:57:59,680 --> 00:58:02,280 Speaker 1: the show is to simply rate and review us wherever 1074 00:58:02,320 --> 00:58:03,840 Speaker 1: you have the power to do so. If you can 1075 00:58:03,960 --> 00:58:06,760 Speaker 1: leave some stars, leave a nice review, do that, and, 1076 00:58:06,920 --> 00:58:10,120 Speaker 1: best of all, tell somebody about the show.
Yeah, tell 1077 00:58:10,200 --> 00:58:13,200 Speaker 1: them online, but even better, if you see somebody in 1078 00:58:13,320 --> 00:58:16,600 Speaker 1: real life, tell them about the show in real life, 1079 00:58:17,280 --> 00:58:20,240 Speaker 1: because I feel like, you know, it's 1080 00:58:20,240 --> 00:58:23,480 Speaker 1: gonna impress people all the more. That's right. Huge thanks 1081 00:58:23,520 --> 00:58:27,040 Speaker 1: as always to our excellent audio producer, Tari Harrison. If 1082 00:58:27,080 --> 00:58:29,520 Speaker 1: you would like to get in touch with us directly... Sorry, 1083 00:58:29,560 --> 00:58:32,840 Speaker 1: I'm laughing because Robert has got a little stress ball 1084 00:58:32,880 --> 00:58:34,800 Speaker 1: over here and he's squishing the guts out of it 1085 00:58:34,840 --> 00:58:37,120 Speaker 1: as we speak. Yeah, there's like some sort of white 1086 00:58:37,120 --> 00:58:38,920 Speaker 1: pus coming out of it. I've had it for like 1087 00:58:38,960 --> 00:58:43,320 Speaker 1: two weeks and it's already squeezed out. So I'm 1088 00:58:43,320 --> 00:58:45,400 Speaker 1: not gonna name the brand, because maybe I just 1089 00:58:45,400 --> 00:58:48,520 Speaker 1: handled it destructively. Anyway, sorry. If you want to get in 1090 00:58:48,560 --> 00:58:52,520 Speaker 1: touch with us directly, you can email us at contact 1091 00:58:52,640 --> 00:59:04,520 Speaker 1: at stuff to blow your mind dot com. Stuff to 1092 00:59:04,520 --> 00:59:06,480 Speaker 1: Blow Your Mind is a production of iHeart Radio's How 1093 00:59:06,520 --> 00:59:09,040 Speaker 1: Stuff Works. For more podcasts from iHeart Radio, visit 1094 00:59:09,040 --> 00:59:11,840 Speaker 1: the iHeart Radio app, Apple Podcasts, or wherever you listen 1095 00:59:11,880 --> 00:59:21,720 Speaker 1: to your favorite shows.