Speaker 1: Hello everyone. This is Seth, the audio producer for Stuff to Blow Your Mind, and there's just a quick note before we start. Myself, Robert, and Joe, we are still recording in isolation because it is the summer of... and well, this episode we had a bit of a problem with Robert's microphone. It was a one-time thing. It shouldn't happen again in the future, but for this episode it's going to sound a little bit like Joe is kind of talking to a TV/VCR combo in, like, a bunker somewhere. But, well, I guess that's technically kind of what we're actually doing. But anyway, anyway, point is, this is a one-time incident. Uh, it's all still very understandable, very easy to listen to. In this episode I've cleaned it up as best I can, and next time it'll sound just like normal, we promise. Thanks, enjoy the show.

Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio. Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb and I'm Joe McCormick.
Speaker 1: Today we're gonna be talking about, mostly focusing on, a classic psychology paper about an effect, uh, the kind of social cognitive bias known as the spotlight effect. But to get into this subject, I wanted to start off by thinking about something that a lot of people have found themselves doing in the past few months. For people who, maybe this wasn't part of your job very much until recently, but now you spend a significant portion of your day in web video meetings, staring at little boxes of your coworkers' faces on a computer screen, or maybe just staring at your own face a lot. Yes, uh, I mean, in fact, right now, as Joe and I are recording this, we are using a Zoom call. We are using a Zoom conference to communicate with each other, and then we're using some other programs and whatnot to actually record it. But yeah, a lot of people are. Or you may be using something different. You might be using, what, there's like Google Meet, Microsoft Meeting, like Facebook Flop. I mean, there are like a million of them, right? I mean, this is an Optagram.
Speaker 1: Yeah, it's a growth industry, figuring this out, and it makes sense, right? We still need to be able to connect with each other. We still need to have meaningful meetings, and all of those less meaningful meetings, in order to keep the gears of business grinding away. Right. Yeah. And one of the strange things I've noticed, and I've read other people noticing the exact same thing, is that you might expect that being able to do a meeting from your home over the internet would be maybe less exhausting than a meeting in person. But somehow I have not found that to be the case. I've found that, like, video chats can be just intensely draining, like after they're over, you feel like you've been lifting weights or something. Uh. And part of what's going on here I found very much embodied in the spirit of an article that I saw linked to. It's just a Medium post, and I want to be very clear that I'm not passing judgment on the author here, there's nothing wrong with this person, but the title of it just gave me chills.
Speaker 1: And the title of this Medium post was "How to fake eye contact during video chats and why it's important." Yes, uh, this was interesting. This was a Medium article by Alexa Curtis, and they made three key suggestions here. The first one is to use a webcam even if no one else is. Okay. The second is to trick yourself into looking at the camera instead of at the screen. Impossible. And then to tape your prompts, your notes, whatever, to your monitor as much as possible, instead of having to refer to, like, a notepad or something. Yeah, the suggestion is like, make a little fake face to go around the webcam box so that you're looking into the camera instead of at the screen. But I just don't think... like, if your face is played back on the screen, unless you are able to turn your own face off for yourself, you're just not going to be able to help it, are you? Yeah. Actually, before I read this, I kind of thought, because I'd catch myself doing this. Um.
Speaker 1: Now, now I should say that I am lucky in that I do not have to sit through just hours and hours of meetings a day on Zoom. I have friends who are definitely stuck in that boat, and they seem exhausted by it. We use Zoom in these recordings, but for the most part we're not actually engaging in the video part of it. We have other stuff, we have notes, stuff that we're looking at when we're recording. But I have found this to be the case with my Dungeons and Dragons group, which used to meet in person but now is forced to meet via Zoom, and we have this, you know, it's like a two or three hour Zoom call, you know, like a three hour Zoom call that we do once a week to play Dungeons and Dragons. And for some reason, I've been noticing that I've just, I felt kind of worn out by it towards the end of the night, um, in ways that I wasn't worn out previously meeting in person. Like, I just kind of felt zapped by it, like, if we were battling something at the end, I'm just kind of going through the motions.
Speaker 1: I'm just not feeling it anymore at that point. Yeah, exact same experience. I've done social stuff via video calls. I've also been doing a D&D campaign, my first ever, by the way, thank you, over Zoom, and it's been a lot of fun. But yes, it is kind of exhausting to just participate in the eyeball tennis of the different video faces on the screen. Something about it hooks its claws into your brain and just pulls and stretches and kind of kneads your brain like a ball of dough. Yeah. Now, fortunately, I don't know how your campaign is going, but with ours, we're also using a couple of other resources. We're using a Discord forum, and we're using Roll20 to pull up maps and such. So maybe we should upgrade our technology; we're looking into it. But basically we have some other things to captivate our eyes during this process.
Speaker 1: I guess one of the things about a straight-up, like, business Zoom meeting is, a lot of times you're just stuck in those Brady Bunch, um, cubes, right? You're just stuck with all these little screen pictures of people, and sometimes you have it set to where one will take dominance over the others, but you may just be looking at a wall of people's faces. And then you're thinking about, again, this point of, should I be making eye contact with everybody? Should I be focusing on trying to be the most presentable, you know, the most professional looking? Like, when they look at my little box, it's like watching a TV broadcast and I'm making direct eye contact with them. Maybe, but only because I have a little sticky smiley face that I've put up at the top of my computer by the camera. There are multiple ways in which this type of interaction is not normal. I mean, of course, it's not normal to be interfacing through technology at all.
Speaker 1: Of course, it's kind of strange that you're not looking at the person but you're looking at the screen, so the eye contact is off. I understand that point in the article. But at the same time, it is not normal to be able to see yourself while you're talking to people. I can imagine if, you know, you were always talking to people with a mirror in your hand that was reflecting your face. Yeah, people would rightfully think you were insane, that's how, like, self-obsessed you were. Or, I don't know, or afraid of Gorgons, that you always had to have a mirror in your hand. And yet that's the reality we find ourselves in. And I know this is not just, like, our particular reaction to this. This is something that I've read about in multiple popular articles, and not just in obscure scientific articles. Like, there was an article I came across in, I think it was Business Insider, that is called, like, why you can't stop staring at yourself in Zoom calls. Yes, this was by Shira Feder.
Speaker 1: Titled "A cyberpsychologist explains why you can't stop staring at yourself on Zoom calls, and everyone else is probably doing the same," um, which I have caught myself doing sometimes, sometimes during Dungeons and Dragons, sometimes during work calls, where, you know, you want to check in, you want to see how you're presenting to the rest of the world. But then it's often easy to sort of, you know... you're looking at this wall of faces, and then you decide to maximize your own, and you're like, all right, let's see, how's the light hitting me? What's my hair doing right now? Am I smiling weird? Do I look engaged, or do I look as bored as I feel? You know? Um. And so that's what this article gets into, a little of that here. So, um, the article discusses some key points made by cyberpsychologist Andrew Franklin.
Speaker 1: So the first one is that, in general, adolescents tend to suffer from the imaginary audience delusion, the idea that people in their surroundings are really paying attention to every move they make, and this often follows us into adulthood as well. Yeah, and this I think is pretty close to, or perhaps even just another name for, the main issue we're going to be focusing on today, otherwise known as the spotlight effect. Now, one thing about this point about adolescents: I mean, this is of course terrifying to think about, given the nature of social media, which is pretty much predicated on this sort of celebrity-aspiring notion of a constant audience, and one tends to drift to extremes in reaction to that, right? This feeling that every word I put on the internet, or every video, every whatever, is vitally important and will be viewed by potentially everyone in the world. Yes, you're not just constantly on view, you are constantly being reviewed, is the perception.
Speaker 1: Yeah, I know, obviously that's gonna vary from person to person, and some people are gonna use social media in a way that has a far more limited scope, only close friends, or maybe even family, maybe just one person can see, you know. But yeah, it does make me wonder, and this would have to be a, you know, a discussion for another time, like, just what is happening when this, um, spotlight effect, this imaginary audience delusion, is playing into our use of social media. Yeah. But also, so this psychology professor, Andrew Franklin, also makes the point that, like, it's not just an illusion, that, like, video chats are actually exhausting. Yeah, yeah, he makes the point that video chats are more stressful than in-person meetings, and a big part of that is just, everything is more distracted, more fragmented, and we have muted or severely lessened nonverbal communicative skills.
Speaker 1: So, you know, a lot of this is kind of a statement of the obvious, but you can do far less with your body language, not only your overt body language, like talking with your hands and waving at people down the other side of the table, but in terms of just having a bodily awareness of what everyone is doing and how they are sort of reacting to what's going on, and if someone else is about to speak or needs to speak. Yes. And it's actually hard to tell who's looking at whom. Like, you can assume, if there are a bunch of Brady Bunch boxes, that people are probably looking, if they're not looking at themselves, they're probably looking at the person who's currently talking. But maybe not. You can't tell. Yeah. And it's also weird to think, like, I guess maybe some of you out there had more of, like, a, you know, Rules of Order kind of upbringing, or, you know, you had some more training in, okay, this is how a business meeting goes, this is how a work meeting operates.
Speaker 1: But I feel, for my part, a lot of it's just kind of, you just learn it. You just sort of figure out what the culture of this group and this meeting is, and how am I supposed to fit in? And then, to a certain extent, it feels like we've had to relearn all of that, or augment our understanding of that, based on the limitations of the technology. Yeah, I think that's totally right. And I would emphasize yet again that not all digital socialization skills are interchangeable or transferable to one another. So you might have been well acclimatized to the social skills one needs in order to interact through a different type of mediated social media, like Facebook or Twitter or something like that, and still not really have any skills for how to interact via a video chat. It's just, like, a different set of skills, a different set of things to get used to. Yeah, it's a different talking stick entirely.
Speaker 1: Now, Franklin also drives home the point that, given the strain of keeping up with everyone's tiny boxes, and the concern over how you yourself look in your box, you might easily find yourself just looking at yourself, staring into the digital mirror and fixating on how you appear to friends, coworkers, and bosses. And Franklin maintains that this means you're likely overwhelmed by perceived gazes, which I totally get. Again, even within the low-stakes confines of Dungeons and Dragons, you know, nothing huge is on the line here, but by the end of the session, again, I often find myself kind of zapped in ways that I never felt before with in-person gaming. And even though we're staring into that digital reflection of our own face, Franklin stresses that people are ultimately not fixating on you like you think they are. They are not sitting there watching you and, you know, dissecting everything about your appearance and your background and what your face is doing in any given second. Yeah. No, they're probably much more likely fixating on themselves, the same way you are fixating on yourself.
Speaker 1: And so this brings us back to the cognitive bias that we're gonna be focusing on in today's episode, also known as the spotlight effect. And this effect is very interesting because, on one hand, I think it's one of the simplest psychological phenomena we've ever talked about on the show. It's actually very simple to observe. It's very straightforward, in a way. But it's one of those things where, if you really internalize it, its implications could be kind of life-changing. Yeah, it's one of these things that, I wouldn't say that it really, like, changes the nature of your reality, but it brings certain aspects of it into maybe sharper focus. You might realize, oh, well, okay, that explains some of the things I feel when I am in a meeting, or, you know, just walking around, uh, you know, in a public space, or whatever the case may be. Um, I do feel like it does feel like a revelation in its own way. Yeah.
Speaker 1: So the main paper that I wanted to focus on today was published in the year two thousand in the Journal of Personality and Social Psychology by Thomas Gilovich, Victoria Husted Medvec, and Kenneth Savitsky, and it's called "The Spotlight Effect in Social Judgment: An Egocentric Bias in Estimates of the Salience of One's Own Actions and Appearance." You can pretty easily find a full PDF of this online if you want to read it. And this is a highly cited paper. It has been referred to many, many times in the years since as a kind of seminal work on this social cognitive bias. So the authors begin with some anecdotal observations, and these observations are that, for both good and ill, it often seems like stuff you expect other people to notice and recall about you really goes unnoticed. Uh, and on the good side, that might be, like, something smart that you said in a discussion group.
Speaker 1: You're, like, really pleased with yourself, that, like, oh, I had that really good insight, or I made that really funny joke, and then it turns out later that nobody else seems to recall that you said anything. Or, perhaps, this often actually happens in athletic contexts, where people will make a really good shot in a basketball game or something, and they will expect people to remember that they did that. But then maybe it turns out that nobody really noticed. It just kind of was one of the goals in a game in which many goals were scored. And as frustrating as this can be, it can also kind of be a relief that it works the other way too. People often don't seem to have noticed when you make what feels like a really obvious mistake, a faux pas on a first meeting, or when you misspeak in what feels like an embarrassing way, or that time you had spinach in your teeth. Like, you obsess over that and you're afraid it's going to completely ruin your reputation, that everybody's gonna remember you for that thing forever. But a lot of times it seems like maybe nobody even noticed.
Speaker 1: Yeah, the dual nature of this particular revelation, I think ultimately it is positive. Because, yeah, maybe it means you're not as important as you thought you were, maybe you're not as explosive a personality as you thought you were. But on the other hand, uh, you know, maybe the stakes are a little bit lower every time you open your mouth. Yeah. Yeah, that's the hypothesis at the heart of this paper, that these anecdotal observations are indicative of a real trend that can be measured, that, in general, humans have an egocentric bias that causes us to believe that our actions and our appearance are much more salient and notable to other people than they really are. Quote: "People tend to believe that more people take note of their actions and appearance than is actually the case. We dub this putative phenomenon the spotlight effect: people tend to believe that the social spotlight shines more brightly on them than it really does."
Yeah, this is 325 00:17:42,880 --> 00:17:45,080 Speaker 1: insightful and I think we can all match this up 326 00:17:45,119 --> 00:17:47,480 Speaker 1: pretty easily, first of all with our own experiences, but 327 00:17:47,520 --> 00:17:50,240 Speaker 1: also with some of the ideas that we've discussed on 328 00:17:50,240 --> 00:17:53,560 Speaker 1: the show before. Specifically, first of all, there's the 329 00:17:53,600 --> 00:17:56,760 Speaker 1: self narrative aspect of our inner thoughts. You know, through 330 00:17:56,800 --> 00:17:59,639 Speaker 1: the inner workings of consciousness, we're constantly weaving together a 331 00:17:59,680 --> 00:18:02,440 Speaker 1: story about who we are and how we fit into 332 00:18:02,440 --> 00:18:05,080 Speaker 1: the world. It's a little movie, and we're the main character, 333 00:18:05,359 --> 00:18:08,640 Speaker 1: so of course we're the most important person in that story. Right, 334 00:18:08,680 --> 00:18:11,360 Speaker 1: you're trying to link together a series of what are 335 00:18:11,480 --> 00:18:15,120 Speaker 1: in fact sort of random events into a cohesive narrative 336 00:18:15,119 --> 00:18:17,919 Speaker 1: with a logic to it. Right. And then through theory 337 00:18:17,960 --> 00:18:21,760 Speaker 1: of mind, we're constantly running simulations about the mental states 338 00:18:21,760 --> 00:18:26,200 Speaker 1: of other people, specific people, people in general, known people, 339 00:18:26,280 --> 00:18:30,280 Speaker 1: unknown people, sort of hypothetical people. And 340 00:18:30,280 --> 00:18:32,080 Speaker 1: of course one of the key aspects of any of 341 00:18:32,080 --> 00:18:35,200 Speaker 1: these simulations is, you know, how do they relate to me, 342 00:18:35,359 --> 00:18:38,480 Speaker 1: how do they think about me? What are their intentions 343 00:18:38,520 --> 00:18:41,639 Speaker 1: towards me?
And that makes sense, right, There's an inherently 344 00:18:41,680 --> 00:18:44,359 Speaker 1: self centered quality to this sort of thinking, because it 345 00:18:44,400 --> 00:18:46,960 Speaker 1: all comes down to individual survival. We tend to err 346 00:18:47,040 --> 00:18:49,640 Speaker 1: on the side of seeing tigers in the grass when 347 00:18:49,680 --> 00:18:52,200 Speaker 1: there are none, which is the better of the two possible 348 00:18:52,240 --> 00:18:55,320 Speaker 1: gambles here. But it also means going through life perpetually 349 00:18:55,359 --> 00:18:58,680 Speaker 1: imagining how the tiger sees you. Yeah, and so this 350 00:18:58,760 --> 00:19:02,840 Speaker 1: is in some ways the exact social equivalent of the 351 00:19:02,920 --> 00:19:06,439 Speaker 1: agency detection overdrive, where you, you know, over interpret the 352 00:19:06,480 --> 00:19:09,280 Speaker 1: crack of a twig as a tiger in the grass. Here, 353 00:19:09,400 --> 00:19:13,000 Speaker 1: you over interpret any little thing that you think 354 00:19:13,040 --> 00:19:16,639 Speaker 1: may be going wrong in a social interaction as something that 355 00:19:16,680 --> 00:19:20,520 Speaker 1: people will notice and remember and judge you for. So 356 00:19:20,560 --> 00:19:22,280 Speaker 1: maybe we should take a quick break and then when 357 00:19:22,320 --> 00:19:24,600 Speaker 1: we come back we can get a little bit further 358 00:19:24,680 --> 00:19:30,800 Speaker 1: into this study. Thanks. Alright, we're back. Okay.
So we're 359 00:19:30,800 --> 00:19:33,080 Speaker 1: talking about the study from the year two thousand by 360 00:19:33,119 --> 00:19:36,480 Speaker 1: Gilovich and co authors, who are putting forth this 361 00:19:36,600 --> 00:19:39,879 Speaker 1: putative phenomenon that at the time they called the spotlight effect, 362 00:19:39,960 --> 00:19:44,480 Speaker 1: the idea that we overestimate the salience of our appearance 363 00:19:44,520 --> 00:19:47,960 Speaker 1: and our behavior to other people. And the authors here 364 00:19:48,000 --> 00:19:51,639 Speaker 1: note several lines of previous research that helped point to 365 00:19:51,680 --> 00:19:54,640 Speaker 1: this conclusion. One of them is, first of all, this 366 00:19:54,720 --> 00:19:57,160 Speaker 1: may not be surprising at all, but people do tend 367 00:19:57,200 --> 00:20:01,160 Speaker 1: to have egocentric biases that you can measure quite easily 368 00:20:01,240 --> 00:20:05,040 Speaker 1: in tests. These are biases that overstate the 369 00:20:05,080 --> 00:20:08,600 Speaker 1: importance of the self. Just one example they cite is 370 00:20:08,600 --> 00:20:12,480 Speaker 1: a paper by Ross and Sicoly published in nineteen seventy 371 00:20:12,600 --> 00:20:16,800 Speaker 1: nine called Egocentric Biases in Availability and Attribution, and 372 00:20:16,920 --> 00:20:19,080 Speaker 1: it showed this in the realm of what's 373 00:20:19,080 --> 00:20:24,040 Speaker 1: called responsibility allocation: who did how much, and how important 374 00:20:24,320 --> 00:20:27,040 Speaker 1: was what they did? So there are several different ways 375 00:20:27,040 --> 00:20:30,480 Speaker 1: you can test for this, maybe in discussion groups, 376 00:20:30,600 --> 00:20:36,160 Speaker 1: maybe in household chores, maybe in basketball teams. Quote: 377 00:20:36,480 --> 00:20:41,120 Speaker 1: one's own contributions to a joint product are more readily available.
378 00:20:41,200 --> 00:20:45,720 Speaker 1: That is, more frequently and easily recalled. Individuals accepted more 379 00:20:45,840 --> 00:20:50,200 Speaker 1: responsibility for a group product than other participants attributed to them. 380 00:20:50,600 --> 00:20:52,840 Speaker 1: So the easy way of thinking about this is, oh, 381 00:20:52,880 --> 00:20:57,159 Speaker 1: our team won, because I scored that goal. Yeah. I 382 00:20:57,200 --> 00:21:01,360 Speaker 1: found this particularly telling. The authors note that research 383 00:21:01,440 --> 00:21:06,240 Speaker 1: indicates that when individuals undertake complex social interactions, they alternate 384 00:21:06,280 --> 00:21:10,400 Speaker 1: between the roles of speaker or actor and listener or observer, 385 00:21:10,920 --> 00:21:13,840 Speaker 1: but much of their attention is ultimately going to be 386 00:21:13,840 --> 00:21:18,280 Speaker 1: directed in many cases at planning and executing their own responses. 387 00:21:18,720 --> 00:21:20,600 Speaker 1: And I think we can relate to this. You 388 00:21:20,640 --> 00:21:23,520 Speaker 1: know those times when you haven't quite zoned 389 00:21:23,520 --> 00:21:25,680 Speaker 1: out in a meeting or 390 00:21:25,720 --> 00:21:27,760 Speaker 1: a conversation, and you're not just, you know, off 391 00:21:28,040 --> 00:21:30,600 Speaker 1: thinking about Star Wars in the back of your head.
No, 392 00:21:30,960 --> 00:21:34,840 Speaker 1: you're focusing instead on the thing that you're getting ready 393 00:21:34,880 --> 00:21:39,119 Speaker 1: to say, your interjection into the conversation, the joke that 394 00:21:39,200 --> 00:21:41,479 Speaker 1: you are intending to make when you get the talking 395 00:21:41,520 --> 00:21:46,000 Speaker 1: stick. And, you know, because ultimately that's often 396 00:21:46,040 --> 00:21:47,920 Speaker 1: a part of any kind of like three way 397 00:21:48,440 --> 00:21:52,439 Speaker 1: or larger conversation: when is it going 398 00:21:52,520 --> 00:21:54,639 Speaker 1: to be my turn? And how am I going to 399 00:21:55,200 --> 00:21:57,879 Speaker 1: make the most out of my time speaking? I 400 00:21:57,920 --> 00:22:00,639 Speaker 1: believe there's actually a name for this exact effect. It's 401 00:22:00,680 --> 00:22:02,720 Speaker 1: a different thing that's been studied. I mean, obviously it's 402 00:22:02,800 --> 00:22:05,080 Speaker 1: very related to the stuff we're talking about, but I 403 00:22:05,080 --> 00:22:08,359 Speaker 1: think it's called the next in line effect, where you 404 00:22:08,400 --> 00:22:12,080 Speaker 1: can measure that, 405 00:22:12,160 --> 00:22:13,800 Speaker 1: if you like, sit people in a circle and go 406 00:22:13,840 --> 00:22:16,639 Speaker 1: around the circle asking them to speak, people have less 407 00:22:16,680 --> 00:22:19,720 Speaker 1: recall of the person who spoke right before them than 408 00:22:19,760 --> 00:22:22,040 Speaker 1: they do of everybody else, because, you know, when the 409 00:22:22,080 --> 00:22:24,200 Speaker 1: person right before you was talking, you're planning what you're 410 00:22:24,200 --> 00:22:27,920 Speaker 1: gonna say.
And it means that when one thinks 411 00:22:27,960 --> 00:22:30,560 Speaker 1: back on a meeting, so you're in this meeting, you 412 00:22:30,560 --> 00:22:32,600 Speaker 1: have this period of time where you're applying most 413 00:22:32,640 --> 00:22:35,919 Speaker 1: of your cognitive efforts towards preparing for your own words, 414 00:22:36,320 --> 00:22:38,280 Speaker 1: and then when you think back on it, you're more 415 00:22:38,359 --> 00:22:41,280 Speaker 1: likely to remember the thing that you were focused on 416 00:22:41,359 --> 00:22:45,920 Speaker 1: at the time, you know, your own words, your own contribution, 417 00:22:45,960 --> 00:22:48,800 Speaker 1: because that's where you were spending the mental resources. 418 00:22:49,320 --> 00:22:52,440 Speaker 1: The exception to this, however, would be if one's 419 00:22:52,560 --> 00:22:56,479 Speaker 1: contribution required little effort, like instead of plotting to 420 00:22:56,560 --> 00:22:59,840 Speaker 1: interject something that will make everyone laugh or pursuing some 421 00:23:00,080 --> 00:23:03,000 Speaker 1: specific strategic aim in the meeting, what if it was 422 00:23:03,040 --> 00:23:05,600 Speaker 1: just the part of the meeting where every week your 423 00:23:05,640 --> 00:23:08,480 Speaker 1: boss says, hey, Roy, what are the numbers? Just read 424 00:23:08,520 --> 00:23:10,000 Speaker 1: us the numbers real quick, and then you read the 425 00:23:10,080 --> 00:23:13,040 Speaker 1: numbers, something that's, you know, quick and normal like that.
Now, 426 00:23:13,040 --> 00:23:16,080 Speaker 1: the exception to this they mention would be passive observers, 427 00:23:16,359 --> 00:23:19,800 Speaker 1: people who are in the meeting but are not planning 428 00:23:20,000 --> 00:23:22,520 Speaker 1: to have the talking stick at any point, don't have 429 00:23:22,520 --> 00:23:24,840 Speaker 1: any kind of active role in the meeting, or if 430 00:23:24,880 --> 00:23:27,080 Speaker 1: they do, maybe it is just reading off the stats 431 00:23:27,359 --> 00:23:29,680 Speaker 1: and they don't have a larger role to play, so 432 00:23:29,840 --> 00:23:33,200 Speaker 1: they might well focus more on other people in the meeting. 433 00:23:33,680 --> 00:23:35,600 Speaker 1: They are going to be the ones that 434 00:23:35,640 --> 00:23:38,360 Speaker 1: are going to be more likely to notice what you 435 00:23:38,400 --> 00:23:41,720 Speaker 1: say or do. That totally makes sense to me. Um, 436 00:23:42,160 --> 00:23:45,359 Speaker 1: I think I have much better recall of meetings where 437 00:23:45,440 --> 00:23:48,239 Speaker 1: I am not expected to speak. That being said, and 438 00:23:48,240 --> 00:23:50,560 Speaker 1: this is me, not the authors here, but 439 00:23:50,640 --> 00:23:54,840 Speaker 1: I suspect that the passive observers are also far more 440 00:23:54,840 --> 00:23:59,320 Speaker 1: likely to be thinking about Star Wars, or what they 441 00:23:59,359 --> 00:24:03,880 Speaker 1: need to buy at the grocery store later, or supernatural biker movies. Yeah, 442 00:24:04,040 --> 00:24:06,359 Speaker 1: or here's a big one.
We didn't even get into this, 443 00:24:06,480 --> 00:24:10,399 Speaker 1: but via the Zoom call, one has a 444 00:24:10,440 --> 00:24:14,320 Speaker 1: tremendous ability to just simply go to other websites during 445 00:24:14,320 --> 00:24:18,560 Speaker 1: the call and still look basically attentive, right, because you'd 446 00:24:18,600 --> 00:24:21,400 Speaker 1: just be looking at the screen either way. Yeah, there's 447 00:24:21,400 --> 00:24:25,560 Speaker 1: your excuse, folks: digital hooky. Has anybody tried just putting 448 00:24:25,640 --> 00:24:27,880 Speaker 1: up like a face? Like, I know you can insert certain 449 00:24:27,920 --> 00:24:30,840 Speaker 1: backgrounds on these video calls, putting up a background that 450 00:24:30,840 --> 00:24:33,000 Speaker 1: has a photo of them in it so it looks 451 00:24:33,040 --> 00:24:36,040 Speaker 1: like they're sitting there. I bet somebody has. Somebody out 452 00:24:36,080 --> 00:24:38,639 Speaker 1: there has gone a little bit further and figured out 453 00:24:38,680 --> 00:24:40,320 Speaker 1: a way to make it happen. It 454 00:24:40,320 --> 00:24:42,240 Speaker 1: would be kind of the equivalent of, didn't Homer 455 00:24:42,320 --> 00:24:44,520 Speaker 1: Simpson have some glasses at one point that made him 456 00:24:44,520 --> 00:24:46,560 Speaker 1: look like he was awake? Yes, it's when 457 00:24:46,720 --> 00:24:50,240 Speaker 1: he's on a jury and he's expected to be paying attention, 458 00:24:50,280 --> 00:24:53,919 Speaker 1: but he is sleeping. Right, that's right, I remember that, 459 00:24:54,480 --> 00:24:57,639 Speaker 1: the wide awake glasses, and one of the other jurors narcs 460 00:24:57,640 --> 00:25:01,159 Speaker 1: on him. But yeah, so anyway, the effect here, 461 00:25:01,200 --> 00:25:03,800 Speaker 1: I think, is pretty straightforward.
If an action stands out 462 00:25:03,800 --> 00:25:06,480 Speaker 1: in your own mind for whatever reason, you're going to 463 00:25:06,600 --> 00:25:09,840 Speaker 1: end up thinking it was more important in some objective 464 00:25:09,920 --> 00:25:13,800 Speaker 1: sense than it actually was. And so, in other words, 465 00:25:13,840 --> 00:25:16,919 Speaker 1: if people overestimate the relevance of their own actions in 466 00:25:16,960 --> 00:25:21,200 Speaker 1: an objective sense, wouldn't they also overestimate how relevant their 467 00:25:21,240 --> 00:25:26,840 Speaker 1: actions are subjectively to other people? Yeah? Yeah. The authors 468 00:25:27,000 --> 00:25:29,919 Speaker 1: also point out that in many cases it might not matter; 469 00:25:30,040 --> 00:25:33,360 Speaker 1: it may be, quote, overlooked when joint endeavors do not require 470 00:25:33,359 --> 00:25:37,439 Speaker 1: explicit allocations of responsibility. But obviously sometimes this is not 471 00:25:37,560 --> 00:25:41,320 Speaker 1: the case. Yeah. It particularly makes me think of a 472 00:25:41,359 --> 00:25:44,840 Speaker 1: frequent trope you see in films, the villainous meetings, when 473 00:25:44,840 --> 00:25:48,639 Speaker 1: you have villains around a table generally having a meeting, 474 00:25:49,119 --> 00:25:52,760 Speaker 1: having this sort of, you know, dark, more antagonistic version 475 00:25:53,359 --> 00:25:59,080 Speaker 1: of our regular real life business meetings. The meetings 476 00:25:59,119 --> 00:26:02,760 Speaker 1: of Spectre in the early James Bond films. Yes, where Blofeld would 477 00:26:02,760 --> 00:26:07,240 Speaker 1: have the command console, like, electrocute somebody's chair. Yeah, 478 00:26:07,440 --> 00:26:09,960 Speaker 1: or another favorite of mine, the meetings you see, 479 00:26:10,000 --> 00:26:13,320 Speaker 1: the Imperial meetings, in, like, Star Wars: A New Hope, 480 00:26:13,080 --> 00:26:15,560 Speaker 1: or also in Rogue One.
We have the 481 00:26:15,640 --> 00:26:19,320 Speaker 1: likes of Darth Vader and Grand Moff Tarkin, or 482 00:26:19,440 --> 00:26:23,160 Speaker 1: Orson Krennic. You know, objectively, they're 483 00:26:23,200 --> 00:26:24,960 Speaker 1: all talking about, okay, we need to get the 484 00:26:24,960 --> 00:26:27,199 Speaker 1: Death Star up and running. But these are all highly 485 00:26:27,280 --> 00:26:30,879 Speaker 1: egotistical and self focused individuals, and they all seem 486 00:26:31,119 --> 00:26:34,199 Speaker 1: pretty focused on their own key role in everything, and 487 00:26:34,200 --> 00:26:37,879 Speaker 1: they're certainly not about, like, elevating the project itself above 488 00:26:37,960 --> 00:26:41,680 Speaker 1: personal ambition. Yeah. Yeah, they're clearly, like, trying to stick 489 00:26:41,760 --> 00:26:43,880 Speaker 1: up for their own branch. It's like, you know, dangerous 490 00:26:43,920 --> 00:26:48,600 Speaker 1: to your starfleet, Commander, not to this battle station. Yeah, okay. 491 00:26:48,600 --> 00:26:51,840 Speaker 1: A few more previously observed psychological phenomena that the 492 00:26:51,880 --> 00:26:54,679 Speaker 1: authors call attention to as potentially backing up the 493 00:26:54,720 --> 00:26:57,360 Speaker 1: idea of a spotlight effect. Another one is what's known 494 00:26:57,400 --> 00:27:01,760 Speaker 1: as naive realism. They write, quote: naive realism refers to 495 00:27:01,800 --> 00:27:04,880 Speaker 1: the common tendency to assume that one's perception of an 496 00:27:04,880 --> 00:27:09,480 Speaker 1: object or event is an accurate reflection of its objective properties, 497 00:27:09,760 --> 00:27:14,359 Speaker 1: not a subjective interpretation or construal. In other words, look, 498 00:27:14,480 --> 00:27:17,280 Speaker 1: it happened just like I saw it.
It's the tendency 499 00:27:17,359 --> 00:27:21,160 Speaker 1: to believe that your perception is unbiased and accurate, even 500 00:27:21,160 --> 00:27:24,920 Speaker 1: though you might readily attribute, you know, mistakes and biases 501 00:27:24,960 --> 00:27:28,360 Speaker 1: to other people's perceptions. Yeah, and this is all tied 502 00:27:28,440 --> 00:27:32,280 Speaker 1: up in the philosophy of perception. So when we're 503 00:27:32,280 --> 00:27:36,800 Speaker 1: talking about naive realism, also known as direct realism, that 504 00:27:36,840 --> 00:27:41,840 Speaker 1: stands in opposition to indirect or representational realism. So direct 505 00:27:41,920 --> 00:27:44,280 Speaker 1: or naive realism holds that we perceive things in the 506 00:27:44,320 --> 00:27:50,440 Speaker 1: world directly and without the mediation of any impression, idea, 507 00:27:50,920 --> 00:27:54,400 Speaker 1: or representation. And I think we can generally agree, especially 508 00:27:54,400 --> 00:27:56,600 Speaker 1: on this show, that this is not the true nature 509 00:27:56,640 --> 00:27:59,480 Speaker 1: of how a human processes reality. You know, the things 510 00:27:59,520 --> 00:28:01,960 Speaker 1: you see are based on the external world, but 511 00:28:02,040 --> 00:28:06,360 Speaker 1: it's not an unbiased direct representation of the external world. Right, 512 00:28:06,440 --> 00:28:08,840 Speaker 1: like there's a weight to things, you know. It's 513 00:28:08,880 --> 00:28:12,040 Speaker 1: like, if yesterday somebody slapped me with 514 00:28:12,119 --> 00:28:16,119 Speaker 1: a fish, today when I see a fish, 515 00:28:16,160 --> 00:28:18,560 Speaker 1: the nature of my perception is going to be 516 00:28:18,600 --> 00:28:23,119 Speaker 1: augmented by that previous experience.
Now, indirect realism adheres to 517 00:28:23,160 --> 00:28:27,439 Speaker 1: the idea that material objects do have mind independent existence, 518 00:28:27,920 --> 00:28:31,399 Speaker 1: but not that our visual perception is unmediated or 519 00:28:31,480 --> 00:28:34,879 Speaker 1: that these objects necessarily possess all of the features that 520 00:28:34,920 --> 00:28:38,160 Speaker 1: we perceive them to have. Like, a quick example of 521 00:28:38,160 --> 00:28:40,479 Speaker 1: that would be, obviously, we look at our beloved pets, 522 00:28:40,920 --> 00:28:43,240 Speaker 1: and we may, you know, perceive them to 523 00:28:43,360 --> 00:28:47,520 Speaker 1: have various nuances that they simply do not have, and, 524 00:28:47,760 --> 00:28:50,960 Speaker 1: you know, as pet owners, we're generally okay with that. Yeah, 525 00:28:51,000 --> 00:28:53,720 Speaker 1: I think that's exactly right. And I think indirect realism, 526 00:28:53,760 --> 00:28:55,560 Speaker 1: I don't know, to me, that is the model of 527 00:28:55,600 --> 00:28:57,200 Speaker 1: the world that makes the most sense. Like, I would 528 00:28:57,240 --> 00:28:59,920 Speaker 1: not say I'm an idealist, I believe the external world 529 00:29:00,000 --> 00:29:02,800 Speaker 1: doesn't exist. I don't go for that. But obviously, 530 00:29:02,920 --> 00:29:06,760 Speaker 1: our ideas about what is motivating our dog or something 531 00:29:06,880 --> 00:29:11,560 Speaker 1: might be more us than actually coming from the dog. Yeah. Now, 532 00:29:11,600 --> 00:29:15,080 Speaker 1: of course, there's also phenomenalism, which generally rejects the mind 533 00:29:15,120 --> 00:29:19,240 Speaker 1: independent existence of material objects, but accepts unmediated visual 534 00:29:19,240 --> 00:29:23,920 Speaker 1: perception and the possession of perceived features.
So other 535 00:29:24,040 --> 00:29:27,120 Speaker 1: things are not things as much as they are bundles 536 00:29:27,120 --> 00:29:30,160 Speaker 1: of sense data, which is a weird way to 537 00:29:30,320 --> 00:29:34,640 Speaker 1: behold your pet. Yeah, that's getting almost into kind 538 00:29:34,640 --> 00:29:37,400 Speaker 1: of George Berkeley territory that I 539 00:29:37,400 --> 00:29:39,600 Speaker 1: don't think I can fully go for, but it 540 00:29:39,680 --> 00:29:42,160 Speaker 1: ultimately doesn't really play into what we're talking about here. Again, 541 00:29:42,200 --> 00:29:46,520 Speaker 1: we're talking about direct realism, or naive realism, versus 542 00:29:46,600 --> 00:29:50,840 Speaker 1: indirect or representational realism. Right. And I think we do 543 00:29:50,920 --> 00:29:54,880 Speaker 1: have a tendency to really underappreciate how much our perceptions 544 00:29:54,880 --> 00:29:57,600 Speaker 1: are affected by the kinds of mistakes and distortions that 545 00:29:57,680 --> 00:30:01,120 Speaker 1: we readily attribute to other people. And so the authors 546 00:30:01,120 --> 00:30:03,640 Speaker 1: of the two thousand paper write that, quote: applied to 547 00:30:03,640 --> 00:30:06,640 Speaker 1: the spotlight effect, this implies that it might be easy 548 00:30:06,720 --> 00:30:10,400 Speaker 1: to confuse how salient something is to oneself with how 549 00:30:10,480 --> 00:30:14,400 Speaker 1: salient it is to others. Precisely because our own behavior 550 00:30:14,560 --> 00:30:17,080 Speaker 1: stands out in our own minds, it can be hard 551 00:30:17,120 --> 00:30:20,080 Speaker 1: to discern how well, or even whether, it is picked 552 00:30:20,120 --> 00:30:23,760 Speaker 1: up by others.
Absolutely, we may be attending the same 553 00:30:23,840 --> 00:30:26,760 Speaker 1: meeting or Zoom conference call, but we are not all 554 00:30:26,840 --> 00:30:29,240 Speaker 1: attending the same meeting or Zoom conference call, you know 555 00:30:29,240 --> 00:30:31,800 Speaker 1: what I mean. We all have different perceptions of 556 00:30:31,840 --> 00:30:35,400 Speaker 1: it based on our own biases, our own histories, our 557 00:30:35,440 --> 00:30:38,520 Speaker 1: own pervasive thoughts, or, you know, cognitive model of the 558 00:30:38,600 --> 00:30:40,800 Speaker 1: task at hand and our role in it. I mean, 559 00:30:40,840 --> 00:30:43,200 Speaker 1: our subjective understanding of the meeting is going to 560 00:30:43,280 --> 00:30:47,200 Speaker 1: differ person to person. Exactly. So, leading into the next 561 00:30:47,240 --> 00:30:49,480 Speaker 1: thing here that the authors point out as possibly 562 00:30:49,480 --> 00:30:52,160 Speaker 1: pointing to a spotlight effect: this is something that has 563 00:30:52,160 --> 00:30:56,680 Speaker 1: been documented, known as the self as target bias. Quick example: 564 00:30:56,960 --> 00:30:59,280 Speaker 1: so you're in a classroom, the teacher gives a pop 565 00:30:59,360 --> 00:31:03,240 Speaker 1: quiz about last night's reading, and Johnny interprets this quiz 566 00:31:03,320 --> 00:31:07,120 Speaker 1: as an attack on him personally because he believes that 567 00:31:07,160 --> 00:31:10,400 Speaker 1: the teacher must believe that he didn't do the reading. 568 00:31:10,960 --> 00:31:13,360 Speaker 1: And, you know, I think even the best of us 569 00:31:13,400 --> 00:31:16,320 Speaker 1: sometimes fall prey to thinking like this: something that 570 00:31:16,480 --> 00:31:19,800 Speaker 1: is a general sort of action applied to everyone, 571 00:31:19,960 --> 00:31:24,080 Speaker 1: we think, why are they doing this to me? Yeah?
Yeah, 572 00:31:24,520 --> 00:31:26,680 Speaker 1: especially if you have, like, a build up, 573 00:31:26,880 --> 00:31:30,240 Speaker 1: a build up of anticipation about a given, you know, meeting 574 00:31:30,480 --> 00:31:34,040 Speaker 1: or social scenario. Yeah. And so the authors write, 575 00:31:34,120 --> 00:31:37,320 Speaker 1: quote: like naive realism, the self as target bias 576 00:31:37,400 --> 00:31:41,239 Speaker 1: reflects a confusion between what is available to oneself and 577 00:31:41,320 --> 00:31:43,959 Speaker 1: what is likely to be available to, and hence guide 578 00:31:44,000 --> 00:31:47,280 Speaker 1: the actions of, others. So again, Johnny might think, well, 579 00:31:47,760 --> 00:31:50,160 Speaker 1: the teacher knows I didn't do the reading, and that's 580 00:31:50,200 --> 00:31:52,680 Speaker 1: why she's giving the test today, or she's giving the 581 00:31:52,720 --> 00:31:55,840 Speaker 1: pop quiz today. But the teacher doesn't know that. It's 582 00:31:55,880 --> 00:31:59,080 Speaker 1: just, you know, that's what he knows. And then finally, 583 00:31:59,120 --> 00:32:02,280 Speaker 1: the authors point out that these previously documented egocentric 584 00:32:02,320 --> 00:32:05,200 Speaker 1: biases are very similar to the kind of egocentrism that 585 00:32:05,320 --> 00:32:10,520 Speaker 1: Jean Piaget observed pervading the cognition of young children early 586 00:32:10,560 --> 00:32:13,280 Speaker 1: in their development. One of the more important parts of 587 00:32:13,280 --> 00:32:16,560 Speaker 1: growing up, in fact, is shedding some of that egocentrism. 588 00:32:16,720 --> 00:32:19,160 Speaker 1: But it turns out we don't shed it all. It 589 00:32:19,280 --> 00:32:23,240 Speaker 1: still appears in adults, simply in diminished form. It's certainly 590 00:32:23,240 --> 00:32:28,280 Speaker 1: more diminished in some people than it is in us.
Yeah, 591 00:32:28,320 --> 00:32:31,360 Speaker 1: but yeah. So young children, and people with 592 00:32:31,360 --> 00:32:33,720 Speaker 1: kids will probably recognize this, often seem not to 593 00:32:33,840 --> 00:32:37,680 Speaker 1: grasp that other people have a different perspective than they do. 594 00:32:37,880 --> 00:32:40,120 Speaker 1: This happens when you're very young, and gradually, as you 595 00:32:40,160 --> 00:32:44,480 Speaker 1: get older, you get more consistent about 596 00:32:44,560 --> 00:32:47,280 Speaker 1: being able to accurately sort of model the minds of 597 00:32:47,280 --> 00:32:51,800 Speaker 1: others, understand that they have different desires, different perspectives than 598 00:32:51,840 --> 00:32:54,760 Speaker 1: you do. And adults, of course, by the time of adulthood, 599 00:32:54,840 --> 00:32:58,240 Speaker 1: usually recognize this gap rationally, but still might have a 600 00:32:58,240 --> 00:33:02,560 Speaker 1: hard time sort of calibrating to predict it accurately. Now, 601 00:33:02,600 --> 00:33:05,080 Speaker 1: the authors of this paper here ask, what is 602 00:33:05,120 --> 00:33:08,720 Speaker 1: the method that we use to calibrate this prediction? 603 00:33:09,640 --> 00:33:13,400 Speaker 1: They say that it's probably based on anchoring and adjustment. Now, 604 00:33:13,440 --> 00:33:16,240 Speaker 1: I was reading some follow up work by Gilovich about 605 00:33:16,320 --> 00:33:21,480 Speaker 1: the anchoring and adjustment controversy. Very brief refresher on anchoring: 606 00:33:21,520 --> 00:33:23,880 Speaker 1: we've done an episode about this in the past. So, 607 00:33:23,960 --> 00:33:26,080 Speaker 1: like, when you're trying to come up with an answer 608 00:33:26,080 --> 00:33:29,040 Speaker 1: to a question like how much is this car worth? 609 00:33:29,280 --> 00:33:32,320 Speaker 1: Or what do people think of the speech I just made?
610 00:33:33,080 --> 00:33:36,800 Speaker 1: You don't necessarily reason toward an answer from a neutral 611 00:33:36,880 --> 00:33:41,000 Speaker 1: starting point. We often tend to be influenced by, sort 612 00:33:41,040 --> 00:33:44,640 Speaker 1: of, like, data points or hypothetical answers that we can 613 00:33:44,720 --> 00:33:47,280 Speaker 1: kind of hang a hat on to begin with. Which 614 00:33:47,360 --> 00:33:49,160 Speaker 1: might be one reason that you've got a, you know, 615 00:33:49,320 --> 00:33:51,960 Speaker 1: price written on the windshield of a car, 616 00:33:52,040 --> 00:33:54,040 Speaker 1: even if that's not the price you would actually end 617 00:33:54,160 --> 00:33:57,080 Speaker 1: up paying. Now, the anchoring and 618 00:33:57,160 --> 00:33:59,800 Speaker 1: adjustment model was what the authors were working with. 619 00:34:00,040 --> 00:34:02,719 Speaker 1: That's the idea that we often think by starting 620 00:34:02,720 --> 00:34:05,440 Speaker 1: with an anchor and then we just adjust our estimate 621 00:34:05,560 --> 00:34:08,399 Speaker 1: up or down from the anchor. I was reading some 622 00:34:08,440 --> 00:34:12,560 Speaker 1: follow up work by Gilovich about the adjustment controversy, like, 623 00:34:12,719 --> 00:34:15,120 Speaker 1: is this really the way we think? Is this really 624 00:34:15,160 --> 00:34:18,080 Speaker 1: how we get to our anchor-biased answers? Is it 625 00:34:18,120 --> 00:34:21,680 Speaker 1: based on this adjustment process? Apparently that idea has come 626 00:34:21,760 --> 00:34:24,440 Speaker 1: under some criticism in the past few decades, and there 627 00:34:24,480 --> 00:34:27,560 Speaker 1: are arguments about how to best understand what's happening in 628 00:34:27,600 --> 00:34:30,200 Speaker 1: people's heads when they fall for the anchoring bias. We're 629 00:34:30,200 --> 00:34:32,279 Speaker 1: not going to get into the weeds of that argument here.
630 00:34:32,360 --> 00:34:34,440 Speaker 1: You can check out our full episode on the anchoring 631 00:34:34,480 --> 00:34:37,400 Speaker 1: bias for more depth. But whatever the role of 632 00:34:37,440 --> 00:34:41,200 Speaker 1: an adjustment mechanism in the brain, the anchoring effect does 633 00:34:41,239 --> 00:34:44,719 Speaker 1: actually appear in many scenarios, and the authors in this 634 00:34:44,760 --> 00:34:48,799 Speaker 1: paper are saying that the anchoring effect manifests in 635 00:34:49,120 --> 00:34:52,399 Speaker 1: how we imagine the opinions of other people about us. 636 00:34:52,760 --> 00:34:56,000 Speaker 1: Because our anchor, our starting point, is how we 637 00:34:56,080 --> 00:34:59,520 Speaker 1: feel about ourselves, the stuff we notice about ourselves, and 638 00:34:59,600 --> 00:35:02,640 Speaker 1: then we kind of reason from there to what other 639 00:35:02,680 --> 00:35:05,960 Speaker 1: people's opinions would be. Well, that makes sense again, coming 640 00:35:06,000 --> 00:35:08,080 Speaker 1: back to the idea that we're using theory of mind 641 00:35:08,480 --> 00:35:12,600 Speaker 1: to ultimately create simulations about the mind states of everyone 642 00:35:12,680 --> 00:35:15,680 Speaker 1: in our lives, from, you know, the person we're 643 00:35:15,719 --> 00:35:18,600 Speaker 1: closest with, to people that are just, you know, like 644 00:35:18,640 --> 00:35:22,440 Speaker 1: supervisors or complete strangers. And all of 645 00:35:22,480 --> 00:35:25,399 Speaker 1: that is constructed with ourselves at the middle. Like, our 646 00:35:25,480 --> 00:35:29,520 Speaker 1: model of ourselves is ultimately, 647 00:35:30,200 --> 00:35:32,200 Speaker 1: I guess you would say, the support structure on 648 00:35:32,200 --> 00:35:35,600 Speaker 1: which this entire network is built. Right, it's kind of 649 00:35:35,600 --> 00:35:38,719 Speaker 1: like you can't build any bridges to
ideas of other 650 00:35:38,800 --> 00:35:42,520 Speaker 1: minds without starting from the foundation of your own. And 651 00:35:42,760 --> 00:35:44,680 Speaker 1: that foundation of your own is going to come with 652 00:35:44,719 --> 00:35:49,080 Speaker 1: a lot of baggage of, like, knowledge about yourself that 653 00:35:49,120 --> 00:35:52,759 Speaker 1: other people don't have, and high levels of concern about 654 00:35:52,840 --> 00:35:56,280 Speaker 1: your personal attributes that other people might not share, probably 655 00:35:56,360 --> 00:35:59,000 Speaker 1: don't share. And that's everybody. We should drive 656 00:35:59,040 --> 00:36:00,520 Speaker 1: that home. Like, we're not just talking about, 657 00:36:00,560 --> 00:36:05,479 Speaker 1: say, a stereotypically egocentric person, you know, or someone 658 00:36:05,480 --> 00:36:09,480 Speaker 1: who has, like, very obvious, uh, pronounced personality 659 00:36:09,480 --> 00:36:12,320 Speaker 1: flaws or anything like that, or dealing with various 660 00:36:12,800 --> 00:36:15,760 Speaker 1: mental health concerns or anything of that nature. But ultimately 661 00:36:15,800 --> 00:36:20,240 Speaker 1: this naive perception is also self perception as well. Yeah, yeah, 662 00:36:20,640 --> 00:36:23,160 Speaker 1: So I guess here we get to the actual empirical 663 00:36:23,200 --> 00:36:25,360 Speaker 1: part of this research, like how would you study this? 664 00:36:25,440 --> 00:36:29,360 Speaker 1: How would you look for empirical evidence of a spotlight effect? 665 00:36:29,600 --> 00:36:31,640 Speaker 1: And there are a number of studies that are covered 666 00:36:31,640 --> 00:36:33,840 Speaker 1: in this paper. I'm going to discuss them in sequence 667 00:36:33,880 --> 00:36:37,000 Speaker 1: and very broad strokes.
Uh so, let me guess, 668 00:36:37,080 --> 00:36:39,080 Speaker 1: you know, what's going to be our key, 669 00:36:39,320 --> 00:36:41,719 Speaker 1: um, implement in this study. Is it going to be 670 00:36:41,760 --> 00:36:44,799 Speaker 1: like a god helmet that scans my brain? Is 671 00:36:44,800 --> 00:36:46,799 Speaker 1: it going to be, uh, you know, some other kind 672 00:36:46,800 --> 00:36:49,280 Speaker 1: of like high tech device that I'm hooking my nervous 673 00:36:49,280 --> 00:36:52,000 Speaker 1: system up to? You're extremely close. Now we get into 674 00:36:52,040 --> 00:36:56,320 Speaker 1: the cybernetics of a Barry Manilow T shirt. So the 675 00:36:56,640 --> 00:36:59,720 Speaker 1: question is something maybe a lot of you have wondered before. 676 00:37:00,040 --> 00:37:03,480 Speaker 1: Do people usually notice what's on your T shirt? Or do 677 00:37:03,560 --> 00:37:06,520 Speaker 1: they just not even care? It's a great question. I know, 678 00:37:06,800 --> 00:37:09,480 Speaker 1: when I wear a T shirt, I have 679 00:37:09,520 --> 00:37:13,160 Speaker 1: certainly caught myself, especially at least in retrospect, thinking way 680 00:37:13,200 --> 00:37:16,400 Speaker 1: too much about how others will perceive this shirt design.
Yeah, 681 00:37:16,640 --> 00:37:18,680 Speaker 1: and what it's saying about me and my interests, what 682 00:37:18,760 --> 00:37:21,800 Speaker 1: is it broadcasting to the world. 683 00:37:21,840 --> 00:37:23,160 Speaker 1: But at the same time, I feel like I'm 684 00:37:23,160 --> 00:37:26,480 Speaker 1: always very interested in what other people's shirts say, to 685 00:37:26,560 --> 00:37:29,840 Speaker 1: the point that I sometimes feel self conscious about trying 686 00:37:29,880 --> 00:37:31,799 Speaker 1: to understand what someone's shirt is, because I'm like, I 687 00:37:31,800 --> 00:37:34,800 Speaker 1: don't want to be, like, caught staring at somebody's 688 00:37:34,840 --> 00:37:38,719 Speaker 1: T shirt. I think you probably notice certain kinds of 689 00:37:38,760 --> 00:37:41,000 Speaker 1: shirts more than others. Like you might pick up on 690 00:37:41,120 --> 00:37:43,640 Speaker 1: cues that like, oh, this is a band T shirt, 691 00:37:43,680 --> 00:37:46,200 Speaker 1: and I'm usually kind of interested in band T shirts, 692 00:37:46,200 --> 00:37:48,280 Speaker 1: and I want to see what this is. But like other things, 693 00:37:48,320 --> 00:37:51,719 Speaker 1: if it's, I'd imagine, like a football team or something, 694 00:37:51,760 --> 00:37:54,319 Speaker 1: you might not even take notice. That's true. I guess 695 00:37:54,520 --> 00:37:57,160 Speaker 1: the shirts I'm usually interested in are, I guess, you know, 696 00:37:57,239 --> 00:37:58,839 Speaker 1: to a certain extent, band shirts, but if it has 697 00:37:58,840 --> 00:38:01,120 Speaker 1: any kind of like monster type thing, oh yeah, 698 00:38:01,400 --> 00:38:03,600 Speaker 1: I definitely want to know what's going on. That's right, 699 00:38:03,880 --> 00:38:07,080 Speaker 1: that's our brains, that's just our brains being our brains.
700 00:38:07,200 --> 00:38:09,880 Speaker 1: Uh, so, extremely simple setup for the study. 701 00:38:09,920 --> 00:38:12,160 Speaker 1: It's dead simple. You get a group of participants to 702 00:38:12,160 --> 00:38:14,160 Speaker 1: gather in a room, and you have them basically in 703 00:38:14,160 --> 00:38:17,400 Speaker 1: there like filling out questionnaires for an experiment that is 704 00:38:17,440 --> 00:38:22,759 Speaker 1: supposedly about memory. And then another participant joins that group late, 705 00:38:22,920 --> 00:38:26,200 Speaker 1: but before they go into the room, you require them 706 00:38:26,239 --> 00:38:29,839 Speaker 1: to put on a Barry Manilow T shirt. Then, after 707 00:38:29,880 --> 00:38:31,799 Speaker 1: they've been in the room for a very brief time, 708 00:38:31,840 --> 00:38:34,560 Speaker 1: you say, actually, uh, this group has already gotten started, 709 00:38:34,560 --> 00:38:37,120 Speaker 1: so we're gonna hold you back for another session. 710 00:38:37,200 --> 00:38:40,720 Speaker 1: And then you have the Barry Manilow interloper leave the room, 711 00:38:40,760 --> 00:38:44,080 Speaker 1: and then you ask everybody. Okay, you ask the interloper, 712 00:38:44,160 --> 00:38:46,880 Speaker 1: the Barry Manilow T shirt wearer, how many people in 713 00:38:46,920 --> 00:38:49,600 Speaker 1: the room do you think noticed that you were wearing 714 00:38:49,640 --> 00:38:52,799 Speaker 1: a Barry Manilow T shirt? And then you ask the 715 00:38:52,840 --> 00:38:55,719 Speaker 1: people who were in the room if they noticed who 716 00:38:55,800 --> 00:38:58,400 Speaker 1: was on the T shirt. Right? You're just, very simply, 717 00:38:58,440 --> 00:39:02,080 Speaker 1: comparing the person's expectation of how many people noticed to 718 00:39:02,200 --> 00:39:05,560 Speaker 1: how many people actually noticed.
And true to prediction, the 719 00:39:05,640 --> 00:39:09,480 Speaker 1: students who wore the T shirt tended to wildly overestimate 720 00:39:09,520 --> 00:39:11,799 Speaker 1: how many people in the room would notice and be 721 00:39:11,880 --> 00:39:15,840 Speaker 1: able to identify their Manilow T shirt. Generally, the Manilow 722 00:39:15,960 --> 00:39:19,719 Speaker 1: interloper guessed that about half of the other students on 723 00:39:19,800 --> 00:39:22,719 Speaker 1: average would be able to identify their T shirt, and 724 00:39:22,719 --> 00:39:26,560 Speaker 1: in reality only about twenty five percent of the observers 725 00:39:26,560 --> 00:39:29,840 Speaker 1: could do it. So in their minds, the people wearing 726 00:39:29,960 --> 00:39:35,200 Speaker 1: this potentially conspicuous piece of clothing mentally doubled the percentage 727 00:39:35,200 --> 00:39:38,080 Speaker 1: of people who they thought would notice it. Real quick, Joe, 728 00:39:38,920 --> 00:39:40,719 Speaker 1: in the study you were looking at here, did 729 00:39:40,800 --> 00:39:43,720 Speaker 1: you get to see this Manilow T shirt? No? I didn't. 730 00:39:44,560 --> 00:39:47,800 Speaker 1: That's my big question, because I'm currently looking at various 731 00:39:47,880 --> 00:39:52,200 Speaker 1: Barry Manilow T shirts in, um, an image search here, and 732 00:39:52,440 --> 00:39:55,279 Speaker 1: they do run the gamut here. We have 733 00:39:55,400 --> 00:39:59,200 Speaker 1: some very forgettable Manilow shirts, but we 734 00:39:59,239 --> 00:40:06,600 Speaker 1: have some real singers here. Yeah. Well I think so. 735 00:40:06,640 --> 00:40:08,520 Speaker 1: I could be wrong, but I think what it was 736 00:40:08,520 --> 00:40:11,880 Speaker 1: was it was just like a picture of his face. Okay, 737 00:40:11,880 --> 00:40:14,480 Speaker 1: well even then, like, it's a noticeable face.
738 00:40:14,520 --> 00:40:17,520 Speaker 1: I mean, that's part of the whole 739 00:40:18,120 --> 00:40:22,440 Speaker 1: business proposition here. But there were also control groups for 740 00:40:22,440 --> 00:40:25,680 Speaker 1: this study, and this was an 741 00:40:25,719 --> 00:40:29,760 Speaker 1: interesting calibration. The control groups were not in the room, 742 00:40:30,080 --> 00:40:33,640 Speaker 1: but instead they watched the entire scene play out on 743 00:40:33,719 --> 00:40:37,760 Speaker 1: a video recording, and then they were asked to estimate 744 00:40:37,800 --> 00:40:40,480 Speaker 1: the number of observers who would notice the T shirt, 745 00:40:40,960 --> 00:40:44,160 Speaker 1: and the control groups guessed much closer to the real 746 00:40:44,440 --> 00:40:47,440 Speaker 1: number of people who would actually notice it, and they 747 00:40:47,480 --> 00:40:50,560 Speaker 1: did not overestimate to the extent that the person wearing 748 00:40:50,640 --> 00:40:53,560 Speaker 1: the shirt did. And this was taken as evidence that, 749 00:40:53,680 --> 00:40:57,360 Speaker 1: quote, the targets' inflated estimates are not simply the result 750 00:40:57,640 --> 00:41:02,400 Speaker 1: of misguided general theories about observers' powers of observation. In 751 00:41:02,400 --> 00:41:06,000 Speaker 1: other words, the relevant variable is, I am the person 752 00:41:06,160 --> 00:41:08,560 Speaker 1: wearing it. Well, that makes sense. Again, 753 00:41:09,120 --> 00:41:12,360 Speaker 1: we are the central character in our own narrative. Okay, 754 00:41:12,520 --> 00:41:15,840 Speaker 1: second study in this paper, so, uh.
It's worth noting 755 00:43:15,920 --> 00:43:18,839 Speaker 1: that the majority of the students that they interviewed in 756 00:43:18,880 --> 00:43:21,640 Speaker 1: the first study reported, in fact, that wearing a Barry 757 00:43:21,719 --> 00:43:25,640 Speaker 1: Manilow T shirt was considered embarrassing, that Barry Manilow was 758 00:43:25,680 --> 00:43:28,880 Speaker 1: considered kind of corny and uncool. And it does make 759 00:43:28,880 --> 00:43:31,399 Speaker 1: me wonder, has Barry Manilow come full circle yet? Has 760 00:43:31,400 --> 00:43:33,560 Speaker 1: he become cool again? I don't know. Some of these 761 00:43:33,560 --> 00:43:36,080 Speaker 1: shirts I was just looking at look pretty cool. Yeah, 762 00:43:36,120 --> 00:43:37,600 Speaker 1: I think that was part of the design of the 763 00:43:37,600 --> 00:43:39,799 Speaker 1: first study, was that this is a figure that not 764 00:43:39,920 --> 00:43:42,600 Speaker 1: everybody, but a lot of people wearing the shirt would, 765 00:43:42,640 --> 00:43:44,880 Speaker 1: you know... it's not just like any face. It's somebody 766 00:43:44,920 --> 00:43:47,040 Speaker 1: who a lot of students would probably feel kind of 767 00:43:47,080 --> 00:43:50,839 Speaker 1: embarrassed to be wearing a shirt of. But the question is, like, 768 00:43:50,920 --> 00:43:54,480 Speaker 1: does this phenomenon hold for T shirts that would not 769 00:43:54,520 --> 00:43:57,200 Speaker 1: be embarrassing? That would just be a picture of anybody, 770 00:43:57,239 --> 00:44:00,400 Speaker 1: maybe anybody that the student liked. So the second study 771 00:44:00,480 --> 00:44:03,920 Speaker 1: tested for the spotlight effect with reference to non embarrassing 772 00:44:04,000 --> 00:44:07,440 Speaker 1: personal details.
It replicated the design of the first study, 773 00:42:07,800 --> 00:42:10,720 Speaker 1: but it allowed students to choose a T shirt featuring 774 00:42:10,719 --> 00:42:14,000 Speaker 1: a person that they liked and viewed as not embarrassing. 775 00:42:14,080 --> 00:42:16,280 Speaker 1: So it might be a T shirt of like Bob 776 00:42:16,320 --> 00:42:21,160 Speaker 1: Marley or Jerry Seinfeld or something. Wait, Jerry Seinfeld? Really? 777 00:42:21,360 --> 00:42:23,920 Speaker 1: Was he specifically mentioned? Yeah, yeah, that 778 00:42:24,000 --> 00:42:26,479 Speaker 1: was one of them. How is that Jerry Seinfeld shirt 779 00:42:26,560 --> 00:42:29,600 Speaker 1: not embarrassing? I don't know. Some people didn't think it was. 780 00:42:31,000 --> 00:42:34,480 Speaker 1: You know, times change. This was, what, the year two thousand, 781 00:42:34,480 --> 00:42:37,200 Speaker 1: something like that. Yeah, I get the Bob Marley shirt. 782 00:42:37,239 --> 00:42:39,640 Speaker 1: I think that remains cool. But I just have 783 00:42:39,760 --> 00:42:42,239 Speaker 1: questions about the Jerry Seinfeld shirt. Maybe this 784 00:42:42,280 --> 00:42:45,320 Speaker 1: again just says some more about my interests 785 00:42:45,680 --> 00:42:49,440 Speaker 1: versus other people's interests. But maybe I am uncool for 786 00:42:49,520 --> 00:42:52,720 Speaker 1: not wearing them. No, no, no, you're very cool, Robert. 787 00:42:54,120 --> 00:42:57,000 Speaker 1: But again there was a huge mismatch, right? Even 788 00:42:57,040 --> 00:43:00,200 Speaker 1: when you're wearing a shirt that's not conspicuously embarrassing 789 00:43:00,280 --> 00:43:03,600 Speaker 1: to a number of students, people just predicted that observers 790 00:43:03,600 --> 00:43:06,000 Speaker 1: would notice who was on their shirt a lot more 791 00:43:06,040 --> 00:43:08,919 Speaker 1: than the observers actually did.
It just makes me think 792 00:43:08,920 --> 00:43:11,600 Speaker 1: of myself or anyone, you know. You've got 793 00:43:11,600 --> 00:43:14,800 Speaker 1: that new shirt and you're like, is today the day? 794 00:43:15,040 --> 00:43:17,120 Speaker 1: Is today the day that I wear this, uh, this 795 00:43:17,200 --> 00:43:20,920 Speaker 1: new shirt, gonna unleash this bad boy on an unsuspecting world? 796 00:43:21,080 --> 00:43:25,239 Speaker 1: Is the world ready? And yeah, the thing is, yeah, 797 00:43:25,239 --> 00:43:28,520 Speaker 1: they're ready, and yeah, they'll be fine. Calm down, like, no 798 00:43:28,520 --> 00:43:32,759 Speaker 1: need to worry about it. Um. Yeah, so, okay. But 799 00:43:32,800 --> 00:43:36,120 Speaker 1: that's appearance, that's just clothing items. What about for behavior? 800 00:43:36,200 --> 00:43:38,960 Speaker 1: Can we look for examples of this in behavior? So 801 00:43:39,080 --> 00:43:42,719 Speaker 1: the third study tested for whether the spotlight effect exists 802 00:43:42,760 --> 00:43:46,760 Speaker 1: not just for clothing, but specifically for stuff people say 803 00:43:46,840 --> 00:43:50,000 Speaker 1: in a group setting. Quote. In particular, we sought to 804 00:43:50,040 --> 00:43:52,719 Speaker 1: investigate whether people tend to believe that their positive and 805 00:43:52,800 --> 00:43:56,919 Speaker 1: negative actions stand out to others more than they actually do. 806 00:43:57,920 --> 00:44:01,440 Speaker 1: And this was tested with staged discussion groups. So 807 00:44:01,480 --> 00:44:03,640 Speaker 1: they would have a discussion group meeting and then they 808 00:44:03,640 --> 00:44:07,719 Speaker 1: would ask people afterwards to rate other participants on both 809 00:44:07,760 --> 00:44:11,000 Speaker 1: positive and negative dimensions of their contributions.
So you'd rate 810 00:44:11,040 --> 00:44:13,440 Speaker 1: all the people you just had a group with, and 811 00:44:13,560 --> 00:44:17,360 Speaker 1: you'd say, how much did participant X do to advance 812 00:44:17,440 --> 00:44:20,560 Speaker 1: the discussion? That'd be a positive mark, and then negative 813 00:44:20,560 --> 00:44:24,040 Speaker 1: things would be like, how many speech errors did participant 814 00:44:24,200 --> 00:44:27,760 Speaker 1: X make? Or how likely was participant X to offend someone? 815 00:44:28,800 --> 00:44:31,640 Speaker 1: And participants also rated, of course, what they thought others 816 00:44:31,680 --> 00:44:33,719 Speaker 1: would think of them. You know, then they turned it 817 00:44:33,760 --> 00:44:36,440 Speaker 1: on themselves. And they found the same thing. The study 818 00:44:36,440 --> 00:44:39,440 Speaker 1: indicated that we tend to overestimate the salience of our 819 00:44:39,480 --> 00:44:44,600 Speaker 1: behavior to others in both positive and negative ways. So 820 00:44:44,680 --> 00:44:47,239 Speaker 1: it's not just like a self serving bias or self 821 00:44:47,280 --> 00:44:50,799 Speaker 1: critical bias. It's just we tend to assume people are 822 00:44:50,840 --> 00:44:55,440 Speaker 1: paying way more attention and noticing way more about the 823 00:44:55,480 --> 00:44:58,719 Speaker 1: stuff we do, both good and bad, which makes me 824 00:44:58,760 --> 00:45:00,560 Speaker 1: think of, like, the Hellraiser tagline, it's like, 825 00:45:00,600 --> 00:45:03,359 Speaker 1: angels to some, devils to others. But really, maybe you 826 00:45:03,400 --> 00:45:05,319 Speaker 1: think, am I an angel or a devil?
And in 827 00:45:05,360 --> 00:45:07,960 Speaker 1: fact you're just kind of a gray blur that someone 828 00:45:08,040 --> 00:45:13,719 Speaker 1: does not recall in any way. Now, 829 00:45:13,760 --> 00:45:16,640 Speaker 1: an important thing that's worth pointing out here is that 830 00:45:17,200 --> 00:45:21,200 Speaker 1: people's self ratings on this discussion group thing were not 831 00:45:21,560 --> 00:45:25,120 Speaker 1: entirely divorced from reality. To the contrary, the study found 832 00:45:25,120 --> 00:45:28,520 Speaker 1: that these ratings were in some ways based in reality. 833 00:45:28,560 --> 00:45:32,520 Speaker 1: People who rated themselves as doing more to advance the 834 00:45:32,560 --> 00:45:36,400 Speaker 1: discussion were also on average rated by others as doing 835 00:45:36,440 --> 00:45:39,960 Speaker 1: more to advance the discussion. And people who rated themselves 836 00:45:40,000 --> 00:45:43,000 Speaker 1: as more likely to have said something that offended someone 837 00:45:43,320 --> 00:45:45,759 Speaker 1: were in fact more likely to have said something that 838 00:45:45,800 --> 00:45:49,680 Speaker 1: offended someone. But it's the size of these effects, both 839 00:45:49,719 --> 00:45:53,000 Speaker 1: positive and negative, that was exaggerated when people were thinking 840 00:45:53,040 --> 00:45:58,239 Speaker 1: about themselves. So in self evaluation, an insightful comment that 841 00:45:58,480 --> 00:46:01,440 Speaker 1: might have actually been an insightful comment, to you it 842 00:46:01,480 --> 00:46:04,160 Speaker 1: feels like, wow, that was earth shaking, I really changed 843 00:46:04,200 --> 00:46:07,920 Speaker 1: the game. Or a faux pas that other people might notice, 844 00:46:08,000 --> 00:46:10,520 Speaker 1: but, you know, doesn't really stand out to them all 845 00:46:10,600 --> 00:46:15,520 Speaker 1: that much, it might become reputation ruining, this terror, this obsession.
Yeah, 846 00:46:15,560 --> 00:46:19,640 Speaker 1: that's interesting. Um, and again, I can't help 847 00:46:19,960 --> 00:46:22,239 Speaker 1: but think about social media, when you have systems that 848 00:46:22,280 --> 00:46:26,839 Speaker 1: are set up so that comments that are insightful may 849 00:46:26,880 --> 00:46:29,200 Speaker 1: seem more earth shaking because they are being, you know, 850 00:46:29,719 --> 00:46:34,200 Speaker 1: re shared or retweeted or liked or hearted, etcetera. And then 851 00:46:34,280 --> 00:46:36,520 Speaker 1: likewise there is the negative reaction to these 852 00:46:36,800 --> 00:46:39,239 Speaker 1: things as well. And of course those tend to be 853 00:46:39,280 --> 00:46:42,320 Speaker 1: the extremes that we hear about in 854 00:46:42,520 --> 00:46:45,560 Speaker 1: a digital setting. Yeah, but even then, I think 855 00:46:45,560 --> 00:46:47,799 Speaker 1: the reality is that, like, most people are not paying 856 00:46:47,840 --> 00:46:51,200 Speaker 1: attention to you and won't remember anything you did. Right, right, 857 00:46:52,600 --> 00:46:56,120 Speaker 1: it's just humbling in a kind of nice way. Yeah. Okay, 858 00:46:56,120 --> 00:46:58,040 Speaker 1: a couple more studies real quick. 859 00:46:58,160 --> 00:47:01,200 Speaker 1: The fourth one recreated the earlier T shirt scenario, 860 00:47:01,520 --> 00:47:04,600 Speaker 1: but then asked participants questions to probe how they came 861 00:47:04,680 --> 00:47:07,040 Speaker 1: up with their estimates.
This is how the authors were 862 00:47:07,080 --> 00:47:09,560 Speaker 1: trying to test whether or not it was an anchoring 863 00:47:09,600 --> 00:47:13,560 Speaker 1: and adjustment mental process that people were using to 864 00:47:13,760 --> 00:47:18,200 Speaker 1: get to their mistaken assumptions about other people's views of them. 865 00:47:18,360 --> 00:47:21,680 Speaker 1: And, uh, so their model, again, it 866 00:47:21,680 --> 00:47:23,600 Speaker 1: could turn out that this is not the best way 867 00:47:23,600 --> 00:47:25,440 Speaker 1: to model the thinking going on here. But what they 868 00:47:25,440 --> 00:47:28,359 Speaker 1: thought at the time was that people start with their 869 00:47:28,400 --> 00:47:31,600 Speaker 1: own rich and powerful sense of how they appear to others. 870 00:47:32,000 --> 00:47:35,160 Speaker 1: They realize correctly that other people are not paying as 871 00:47:35,200 --> 00:47:38,080 Speaker 1: much attention to them as they pay to themselves, so 872 00:47:38,080 --> 00:47:41,560 Speaker 1: they maybe adjust down from their own experience to 873 00:47:41,640 --> 00:47:45,960 Speaker 1: a hypothetical other observer, but they don't adjust enough. So, 874 00:47:46,040 --> 00:47:49,080 Speaker 1: you know, how important was what I just did? Uh, well, 875 00:47:49,120 --> 00:47:51,160 Speaker 1: to me, it was a ten, but I know other 876 00:47:51,200 --> 00:47:53,279 Speaker 1: people probably aren't going to rate it at ten, so 877 00:47:53,320 --> 00:47:56,920 Speaker 1: I'll say it's a nine to them. But in reality 878 00:47:57,000 --> 00:47:59,839 Speaker 1: it was, yeah, maybe like a four. And then 879 00:48:00,080 --> 00:48:02,520 Speaker 1: the last of the studies, the fifth one. 880 00:48:02,600 --> 00:48:05,279 Speaker 1: This one I think established something that's very important that 881 00:48:05,320 --> 00:48:09,320 Speaker 1: we can come back to in a minute. This tested habituation.
882 00:48:09,920 --> 00:48:13,080 Speaker 1: If people are allowed a period to get used to 883 00:48:13,200 --> 00:48:17,080 Speaker 1: wearing the unfamiliar Barry Manilow T shirt, will they feel 884 00:48:17,440 --> 00:48:20,880 Speaker 1: less self conscious about others noticing it? And the answer 885 00:48:20,960 --> 00:48:22,920 Speaker 1: is yes. If you wear the T shirt for a 886 00:48:22,920 --> 00:48:25,719 Speaker 1: while before going in front of other people, you will 887 00:48:25,760 --> 00:48:28,400 Speaker 1: tend to imagine that fewer of them took notice of 888 00:48:28,440 --> 00:48:30,279 Speaker 1: it than if you just put it on and then 889 00:48:30,360 --> 00:48:33,480 Speaker 1: go in. But of course, in these scenarios, absolutely nothing 890 00:48:33,520 --> 00:48:35,640 Speaker 1: has changed for the observers. The only thing that has 891 00:48:35,719 --> 00:48:38,799 Speaker 1: changed is you. The more used to 892 00:48:38,800 --> 00:48:41,920 Speaker 1: the shirt you are yourself, the less conscious you are of it, so you 893 00:48:42,000 --> 00:48:45,840 Speaker 1: imagine less consciousness among others. So it's like, 894 00:48:45,880 --> 00:48:47,359 Speaker 1: I don't know if anyone has ever had 895 00:48:47,400 --> 00:48:50,879 Speaker 1: this experience, being, you know, somebody who wears a well 896 00:48:50,960 --> 00:48:55,600 Speaker 1: worn but offensive T shirt in an inappropriate setting. Um, like, 897 00:48:55,719 --> 00:48:58,680 Speaker 1: you know, they're used to it, they're used to the 898 00:48:58,800 --> 00:49:02,480 Speaker 1: potentially profane statement that is on it.
But everyone else, man, 899 00:49:02,520 --> 00:49:05,120 Speaker 1: might not be ready for it. Yes, I think this 900 00:49:05,200 --> 00:49:07,440 Speaker 1: is actually a very astute observation, and it will come 901 00:49:07,480 --> 00:49:09,839 Speaker 1: back to something I want to get at right at 902 00:49:09,840 --> 00:49:12,320 Speaker 1: the end here. Should we take another break, and then 903 00:49:12,400 --> 00:49:14,040 Speaker 1: when we come back we can discuss some of the 904 00:49:14,080 --> 00:49:17,239 Speaker 1: implications of this research? Let's do it. We'll be right back. 905 00:49:18,719 --> 00:49:22,719 Speaker 1: Thank you, thank you. All right, we're back. We're continuing 906 00:49:22,719 --> 00:49:27,200 Speaker 1: our discussion here of the spotlight effect, and, uh, 907 00:49:27,200 --> 00:49:31,280 Speaker 1: of course, the various T shirt experiments that 908 00:49:31,360 --> 00:49:35,359 Speaker 1: relate to that explanation. Yeah. So I wanted to talk 909 00:49:35,400 --> 00:49:39,920 Speaker 1: about some commentary on, and implications from, the spotlight effect. 910 00:49:39,960 --> 00:49:42,239 Speaker 1: To whatever extent this is a real effect, it does 911 00:49:42,400 --> 00:49:44,319 Speaker 1: appear like it still stands. I mean, I would be 912 00:49:44,400 --> 00:49:48,440 Speaker 1: interested to see some more recent research replicating this, 913 00:49:48,719 --> 00:49:51,120 Speaker 1: but it looks pretty solid to me. One of the 914 00:49:51,200 --> 00:49:53,799 Speaker 1: things that the lead author of the study we've been 915 00:49:53,840 --> 00:49:58,200 Speaker 1: talking about, Gilovich, uh, he's noted in another source: he 916 00:49:58,239 --> 00:50:01,560 Speaker 1: actually had an article about the spotlight effect for 917 00:50:01,680 --> 00:50:05,640 Speaker 1: the Encyclopedia of Social Psychology, edited by Baumeister and Vohs.
918 00:50:06,440 --> 00:50:10,320 Speaker 1: And in that article, Gilovich notes that research indicates that, 919 00:50:10,400 --> 00:50:13,720 Speaker 1: quote, people of all ages are prone to the spotlight effect, 920 00:50:13,760 --> 00:50:17,200 Speaker 1: but it appears to be particularly pronounced among adolescents and 921 00:50:17,360 --> 00:50:22,160 Speaker 1: young adults. So as you get older, the spotlight effect 922 00:50:22,280 --> 00:50:25,960 Speaker 1: seems to work less powerfully on your brain. What would 923 00:50:25,960 --> 00:50:28,680 Speaker 1: explain this? Well, one answer might be experience, right? 924 00:50:28,800 --> 00:50:32,160 Speaker 1: Over time, you just learn through experience that people pay 925 00:50:32,239 --> 00:50:35,399 Speaker 1: less attention to you and notice less about you than 926 00:50:35,480 --> 00:50:38,520 Speaker 1: you expect them to. And it's possible this does play 927 00:50:38,520 --> 00:50:40,560 Speaker 1: some role. Maybe you get conditioned, you kind of learn 928 00:50:40,640 --> 00:50:43,840 Speaker 1: how things work in life, and you experience less of 929 00:50:43,920 --> 00:50:47,880 Speaker 1: this cognitive bias. But Gilovich identifies a different reason, and 930 00:50:47,960 --> 00:50:52,800 Speaker 1: that reason is that social motivation is stronger when you're younger. 931 00:50:53,440 --> 00:50:57,880 Speaker 1: Younger people show a heightened consciousness of, and concern for, 932 00:50:58,280 --> 00:51:02,520 Speaker 1: their standing within social groups. Quote:
But having a heightened 933 00:51:02,520 --> 00:51:05,880 Speaker 1: concern with one's social standing means, by its very nature, 934 00:51:06,280 --> 00:51:09,520 Speaker 1: that one is vulnerable to having an excessive concern with 935 00:51:09,600 --> 00:51:13,320 Speaker 1: one's standing, and hence is likely to overestimate the extent 936 00:51:13,400 --> 00:51:16,759 Speaker 1: to which one is the target of others' thoughts and attention. 937 00:51:17,280 --> 00:51:19,719 Speaker 1: So I'd say the takeaway from this, maybe a 938 00:51:19,760 --> 00:51:23,319 Speaker 1: special message to, like, younger and teenage listeners: 939 00:51:23,920 --> 00:51:27,160 Speaker 1: other people really probably are noticing less about you and 940 00:51:27,200 --> 00:51:29,719 Speaker 1: thinking less about you than you think they are, as 941 00:51:29,760 --> 00:51:33,799 Speaker 1: shocking as that may be to hear. Another thing 942 00:51:33,800 --> 00:51:37,000 Speaker 1: that's related to this idea, which the authors mentioned in 943 00:51:37,000 --> 00:51:40,200 Speaker 1: the discussion section of their paper, is the 944 00:51:40,280 --> 00:51:43,600 Speaker 1: way that the spotlight effect relates to something that's known 945 00:51:43,640 --> 00:51:48,000 Speaker 1: as the illusion of transparency. So the illusion of transparency 946 00:51:48,040 --> 00:51:52,160 Speaker 1: is the belief that your internal states are more observable 947 00:51:52,160 --> 00:51:55,840 Speaker 1: to others than they actually are. We often assume that 948 00:51:55,960 --> 00:52:00,480 Speaker 1: our unspoken thoughts and our feelings can be sort of 949 00:52:00,520 --> 00:52:03,759 Speaker 1: sniffed out and discerned by people around us. But that's 950 00:52:03,840 --> 00:52:06,440 Speaker 1: usually not true, not to the extent that we 951 00:52:06,480 --> 00:52:08,920 Speaker 1: think it is.
And there are examples of this from 952 00:52:08,920 --> 00:52:14,000 Speaker 1: empirical research. For example, if you stage a mock negotiation 953 00:52:14,200 --> 00:52:16,399 Speaker 1: where people are trying to, you know, negotiate to get 954 00:52:16,440 --> 00:52:19,920 Speaker 1: to a certain price on something, people tend to imagine 955 00:52:20,000 --> 00:52:23,560 Speaker 1: that they have given away more information about what they're 956 00:52:23,560 --> 00:52:27,239 Speaker 1: trying to get than they actually have. Another variation is 957 00:52:27,280 --> 00:52:30,759 Speaker 1: that studies show that a lot of times people imagine 958 00:52:30,800 --> 00:52:35,120 Speaker 1: that other people can tell when they are lying, but 959 00:52:35,200 --> 00:52:38,960 Speaker 1: in reality, people can't actually tell when people are lying. 960 00:52:39,080 --> 00:52:41,399 Speaker 1: Or at least, I mean, some people maybe can tell 961 00:52:41,440 --> 00:52:43,959 Speaker 1: some of the time, but most of the time, other 962 00:52:44,040 --> 00:52:47,400 Speaker 1: people cannot tell if you're lying, cannot spot your lies 963 00:52:47,480 --> 00:52:50,799 Speaker 1: with nearly as much accuracy as you think they can. 964 00:52:51,440 --> 00:52:54,919 Speaker 1: Do with that information what you will. I really don't. 965 00:52:56,239 --> 00:52:57,960 Speaker 1: I think it's, like, a great point, because 966 00:52:57,960 --> 00:53:01,640 Speaker 1: first of all, we all lie. Like, lying 967 00:53:01,640 --> 00:53:05,799 Speaker 1: is part of our communication suite. You know, 968 00:53:05,960 --> 00:53:09,040 Speaker 1: individuals are gonna engage in it to varying degrees, but 969 00:53:09,360 --> 00:53:11,040 Speaker 1: you know, it is important to have that tool in 970 00:53:11,080 --> 00:53:13,720 Speaker 1: your toolbox.
You know, if someone shows you a picture 971 00:53:13,760 --> 00:53:17,080 Speaker 1: of a baby and you're 972 00:53:17,080 --> 00:53:20,080 Speaker 1: expected to comment upon it, it is generally in your 973 00:53:20,080 --> 00:53:23,600 Speaker 1: best interest to lie if you think that baby is ugly, right, 974 00:53:24,040 --> 00:53:26,360 Speaker 1: or at least find some way to respond that 975 00:53:26,560 --> 00:53:31,000 Speaker 1: is not just comedic adherence to truth. Right. You can 976 00:53:31,080 --> 00:53:35,160 Speaker 1: find something nice to say that isn't necessarily untrue. Right. 977 00:53:35,480 --> 00:53:38,840 Speaker 1: And yet at the same time, lying can be, or 978 00:53:38,880 --> 00:53:42,600 Speaker 1: at least certainly feel like, a high-risk act, right. 979 00:53:42,640 --> 00:53:44,680 Speaker 1: I mean, no one wants to be caught in a 980 00:53:44,760 --> 00:53:49,000 Speaker 1: lie, even if the stakes are ultimately kind of low. 981 00:53:49,040 --> 00:53:51,120 Speaker 1: I mean, I guess maybe even more so at times 982 00:53:51,120 --> 00:53:53,160 Speaker 1: if the stakes are low, because why are you lying 983 00:53:53,160 --> 00:53:55,720 Speaker 1: about that? Well, like, why didn't you just say you didn't 984 00:53:55,800 --> 00:53:58,960 Speaker 1: like this picture of my baby?
Or, I don't know, let 985 00:54:00,000 --> 00:54:02,279 Speaker 1: me see, I can't think of a specific example, but okay, 986 00:54:02,320 --> 00:54:05,480 Speaker 1: here's a potential example, where someone says, hey, have 987 00:54:05,480 --> 00:54:07,319 Speaker 1: you seen Die Hard 2, and you're like, oh, yeah, 988 00:54:07,400 --> 00:54:09,440 Speaker 1: it's pretty good. And maybe the thing is, you 989 00:54:09,680 --> 00:54:11,799 Speaker 1: haven't seen it, you have no interest in seeing it, maybe 990 00:54:11,840 --> 00:54:13,880 Speaker 1: you think that the whole concept sounds kind of stupid, 991 00:54:14,280 --> 00:54:16,040 Speaker 1: but you want to be polite about it, and you 992 00:54:16,080 --> 00:54:18,000 Speaker 1: also don't want the 993 00:54:18,000 --> 00:54:20,239 Speaker 1: plot to have to be explained to you now, you 994 00:54:20,280 --> 00:54:22,120 Speaker 1: know, you didn't see it in the theater. You also 995 00:54:22,160 --> 00:54:25,520 Speaker 1: don't want to hear your friend Ron summarize it for you. 996 00:54:25,960 --> 00:54:27,400 Speaker 1: But then if they're like, oh, yeah, what was your 997 00:54:27,400 --> 00:54:30,759 Speaker 1: favorite part? Well, crap. Now this has become a much 998 00:54:30,760 --> 00:54:35,680 Speaker 1: stickier situation, because I'm lying about having seen Die Hard 2. Yeah. 999 00:54:35,719 --> 00:54:39,040 Speaker 1: But people tend to assume that, like, the fact that 1000 00:54:39,080 --> 00:54:42,080 Speaker 1: they're lying about having seen Die Hard 2 is somehow 1001 00:54:42,239 --> 00:54:45,120 Speaker 1: leaking out of them in an observable way. And in 1002 00:54:45,600 --> 00:54:47,719 Speaker 1: some cases it might be, like, some people do have 1003 00:54:47,800 --> 00:54:51,160 Speaker 1: big tells when they're lying, but generally that information is 1004 00:54:51,200 --> 00:54:53,799 Speaker 1: not leaking out as much as people imagine it is.
1005 00:54:54,360 --> 00:54:56,480 Speaker 1: And I wonder if this is compounded to a certain 1006 00:54:56,520 --> 00:54:59,759 Speaker 1: extent by the lying we observe in media. Lying, that is, 1007 00:54:59,800 --> 00:55:03,239 Speaker 1: that gets exposed via conflicting relevant media, like here's 1008 00:55:03,440 --> 00:55:06,960 Speaker 1: one scene of a politician saying one thing, and here's another, 1009 00:55:07,120 --> 00:55:10,160 Speaker 1: here's another bit of footage that shows that they're lying. 1010 00:55:10,760 --> 00:55:14,480 Speaker 1: Or, more often, overt lying by a fictional character, which, 1011 00:55:14,520 --> 00:55:16,840 Speaker 1: of course, is played up for dramatic effect and 1012 00:55:16,960 --> 00:55:19,719 Speaker 1: is also an artificial situation, you know, in that we 1013 00:55:19,800 --> 00:55:22,680 Speaker 1: know they are lying to another character. Oh yeah. But 1014 00:55:22,760 --> 00:55:27,560 Speaker 1: it's also like, there's a stock type of 1015 00:55:27,680 --> 00:55:31,120 Speaker 1: hero in, like, detective fiction and all that: the person 1016 00:55:31,320 --> 00:55:34,320 Speaker 1: who can just magically tell when other people are lying 1017 00:55:34,440 --> 00:55:38,960 Speaker 1: and has that skill. No, but then there's a wonderful 1018 00:55:39,040 --> 00:55:45,240 Speaker 1: character in the recent Watchmen series on HBO, Looking Glass, yes, 1019 00:55:45,239 --> 00:55:48,320 Speaker 1: played by the great Tim Blake Nelson. I mean, the 1020 00:55:48,400 --> 00:55:52,040 Speaker 1: character has the power, not Nelson himself, right, yeah. 1021 00:55:52,080 --> 00:55:54,240 Speaker 1: And we love characters like that, right? I mean, that's 1022 00:55:54,280 --> 00:55:57,240 Speaker 1: a really fun power to try to see realized in fiction.
1023 00:55:57,360 --> 00:56:00,560 Speaker 1: But lies are not diseases you can sniff 1024 00:56:00,560 --> 00:56:04,520 Speaker 1: out. I mean, to really detect a lie in reality, 1025 00:56:04,960 --> 00:56:06,680 Speaker 1: what you have to try to do is, like, trap 1026 00:56:06,760 --> 00:56:09,279 Speaker 1: people in contradictions and stuff, like ask a bunch of 1027 00:56:09,280 --> 00:56:12,319 Speaker 1: follow-up questions. It doesn't just leak out of your 1028 00:56:12,360 --> 00:56:14,600 Speaker 1: face that, yes, I'm telling a lie and you can 1029 00:56:14,640 --> 00:56:17,600 Speaker 1: smell it. Absolutely. And you know, I also think about 1030 00:56:17,640 --> 00:56:20,640 Speaker 1: this in terms of religious upbringing. Um, I don't know 1031 00:56:20,640 --> 00:56:22,560 Speaker 1: about you, but growing up in the sort of 1032 00:56:23,320 --> 00:56:27,080 Speaker 1: panoptical teachings of a Protestant church, there was 1033 00:56:27,120 --> 00:56:30,920 Speaker 1: always this idea that God, and also the devil, 1034 00:56:31,200 --> 00:56:34,000 Speaker 1: and perhaps other entities, like lesser angels and demons, what 1035 00:56:34,080 --> 00:56:36,880 Speaker 1: have you, were privy to your inner thoughts. You know, 1036 00:56:36,920 --> 00:56:39,680 Speaker 1: the whole idea that it wasn't just what you said 1037 00:56:39,719 --> 00:56:41,640 Speaker 1: and did that made you sinful, it was also what 1038 00:56:41,680 --> 00:56:44,200 Speaker 1: you were thinking about doing, or considering doing, or just 1039 00:56:44,640 --> 00:56:47,200 Speaker 1: entertaining the mental images of doing. So there was this 1040 00:56:47,400 --> 00:56:50,799 Speaker 1: ingrained notion that your private thoughts are not private at all, 1041 00:56:51,280 --> 00:56:54,799 Speaker 1: at least not so far as supernatural entities are concerned.
Yeah, 1042 00:56:54,800 --> 00:56:57,040 Speaker 1: that's right. And I guess it is possible that this 1043 00:56:57,080 --> 00:57:00,640 Speaker 1: could have a conditioning effect, to make you assume that, in general, 1044 00:57:00,760 --> 00:57:03,640 Speaker 1: your private thoughts are not private. Maybe they're observable not 1045 00:57:03,800 --> 00:57:08,040 Speaker 1: just to supernatural entities, but to other regular entities that 1046 00:57:08,080 --> 00:57:11,240 Speaker 1: you interact with every day. Yeah, because I definitely remember 1047 00:57:11,280 --> 00:57:14,440 Speaker 1: at times, certainly when I was younger, sort of freaking 1048 00:57:14,440 --> 00:57:17,720 Speaker 1: out about just the idea of other humans 1049 00:57:17,760 --> 00:57:19,920 Speaker 1: being privy to my thoughts, you know. An idea that 1050 00:57:20,000 --> 00:57:22,760 Speaker 1: was probably also compounded by science fiction that is just 1051 00:57:22,880 --> 00:57:27,680 Speaker 1: lousy with psychics, right, um, and also these not-quite 1052 00:57:27,680 --> 00:57:33,040 Speaker 1: psychics but just really insightful TV characters. So, like, the Hannibal Lecters, 1053 00:57:33,120 --> 00:57:35,360 Speaker 1: basically, they look at you and tell your whole 1054 00:57:35,400 --> 00:57:39,000 Speaker 1: life story. Yeah. But then again, as we've discussed 1055 00:57:39,000 --> 00:57:40,840 Speaker 1: on the show before, this sort of fear 1056 00:57:40,880 --> 00:57:43,800 Speaker 1: is not entirely unfounded, given the potential trajectory of some 1057 00:57:43,880 --> 00:57:47,520 Speaker 1: of our technology. That's true, but that's technology. I mean, 1058 00:57:47,800 --> 00:57:51,400 Speaker 1: normally people are not doing, like, AI, you know, learning 1059 00:57:51,400 --> 00:57:54,000 Speaker 1: on data sets about your social media use or whatever.
1060 00:57:54,840 --> 00:57:57,000 Speaker 1: There was one more example given about the illusion of 1061 00:57:57,000 --> 00:58:00,320 Speaker 1: transparency that I really liked, which was that, uh, people 1062 00:58:00,520 --> 00:58:06,200 Speaker 1: overestimated the extent to which observers could tell whether the 1063 00:58:06,320 --> 00:58:11,240 Speaker 1: drink they were drinking was pleasant or nasty tasting, even 1064 00:58:11,240 --> 00:58:13,440 Speaker 1: though they were supposed to use a neutral facial expression. 1065 00:58:13,480 --> 00:58:16,240 Speaker 1: So you give people drinks: this one tastes good, 1066 00:58:16,280 --> 00:58:18,640 Speaker 1: this one tastes disgusting, and you tell them they have 1067 00:58:18,720 --> 00:58:22,240 Speaker 1: to maintain a neutral facial expression while they drink them. 1068 00:58:22,240 --> 00:58:24,480 Speaker 1: People assumed, oh, yeah, people can just read it on 1069 00:58:24,480 --> 00:58:27,080 Speaker 1: my face that, you know, that was a nasty one. 1070 00:58:27,280 --> 00:58:29,040 Speaker 1: But it turns out people can't read it all that well. 1071 00:58:30,200 --> 00:58:32,240 Speaker 1: Good to know when you have your next dinner party, 1072 00:58:32,720 --> 00:58:37,280 Speaker 1: um, in the year, hopefully. It's gonna be the next 1073 00:58:37,320 --> 00:58:40,360 Speaker 1: thing after competitive eating. So right now 1074 00:58:40,360 --> 00:58:43,440 Speaker 1: it's the people who wolf down like thirty White Castles 1075 00:58:43,520 --> 00:58:46,120 Speaker 1: or whatever. The next thing is how many nasty drinks 1076 00:58:46,160 --> 00:58:50,000 Speaker 1: can you drink? I can see it becoming a big hit. Okay, 1077 00:58:50,000 --> 00:58:52,600 Speaker 1: one last thing.
So the authors of this two thousand 1078 00:58:52,680 --> 00:58:57,080 Speaker 1: paper ask a question: when is the spotlight effect most 1079 00:58:57,160 --> 00:59:00,600 Speaker 1: pronounced and when is it least pronounced? Could there be such 1080 00:59:00,600 --> 00:59:03,880 Speaker 1: a thing as, like, a reverse spotlight effect, a sort 1081 00:59:03,920 --> 00:59:07,640 Speaker 1: of mental cloak of invisibility, where other people are noticing 1082 00:59:07,680 --> 00:59:11,520 Speaker 1: you more than you think they are? And the authors think, yeah, 1083 00:59:11,560 --> 00:59:15,520 Speaker 1: this is probably possible. They claim that this would probably 1084 00:59:15,560 --> 00:59:20,440 Speaker 1: correlate with the subject's own consciousness of their appearance or behavior. 1085 00:59:21,040 --> 00:59:24,160 Speaker 1: So obviously, the more conscious you are of your own 1086 00:59:24,200 --> 00:59:27,520 Speaker 1: appearance and behavior, the more conscious of it you imagine 1087 00:59:27,560 --> 00:59:30,280 Speaker 1: other people are, and probably vice versa. If you're less 1088 00:59:30,280 --> 00:59:34,200 Speaker 1: conscious of yourself, you imagine other people are less conscious 1089 00:59:34,200 --> 00:59:37,080 Speaker 1: of you. And so for this reason, it might be 1090 00:59:37,120 --> 00:59:41,200 Speaker 1: correlated somewhat to the novelty of what you're doing or wearing, 1091 00:59:41,280 --> 00:59:43,760 Speaker 1: or what you look like or how you sound. So 1092 00:59:43,880 --> 00:59:46,880 Speaker 1: remember, in the fifth study in that paper, uh, 1093 00:59:46,880 --> 00:59:49,880 Speaker 1: the spotlight effect was less pronounced for people who had 1094 00:59:49,960 --> 00:59:53,480 Speaker 1: some time to get used to wearing a potentially embarrassing, 1095 00:59:53,560 --> 00:59:57,440 Speaker 1: conspicuous T-shirt.
So it's highly possible that we are 1096 00:59:57,560 --> 01:00:00,680 Speaker 1: most likely to manifest the spotlight effect when we're doing 1097 01:00:00,720 --> 01:00:04,080 Speaker 1: something new or unusual. Well, that's interesting. It kind of 1098 01:00:04,080 --> 01:00:06,960 Speaker 1: ties back to what we were talking about earlier, about when 1099 01:00:07,000 --> 01:00:09,000 Speaker 1: you're about to say something in a meeting and you're 1100 01:00:09,000 --> 01:00:11,919 Speaker 1: putting a lot of cognitive effort into preparing for that, 1101 01:00:12,280 --> 01:00:15,920 Speaker 1: preparing to do something that you don't normally do. Yeah, exactly. It 1102 01:00:16,240 --> 01:00:19,600 Speaker 1: takes more effort, it takes up more space in your brain. 1103 01:00:19,880 --> 01:00:22,400 Speaker 1: It's more salient to you, and you assume it's more 1104 01:00:22,440 --> 01:00:25,800 Speaker 1: salient to other people. So it's possible. This isn't proven yet, 1105 01:00:25,800 --> 01:00:29,360 Speaker 1: but it's possible that the inverse effect, where we would 1106 01:00:29,600 --> 01:00:33,360 Speaker 1: underestimate how much other people are noticing our appearance and behavior, 1107 01:00:33,720 --> 01:00:37,720 Speaker 1: happens when we are least self-conscious, meaning 1108 01:00:37,800 --> 01:00:44,320 Speaker 1: during highly familiar, routine, or automatic behaviors. There's actually an 1109 01:00:44,320 --> 01:00:47,760 Speaker 1: example that has been studied here, 1110 01:00:48,000 --> 01:00:51,440 Speaker 1: and I thought this was interesting.
So people underestimate the 1111 01:00:51,520 --> 01:00:56,600 Speaker 1: extent to which other people notice their cologne or perfume. 1112 01:00:57,960 --> 01:01:01,400 Speaker 1: So you cover yourself in a fragrance, you become accustomed 1113 01:01:01,400 --> 01:01:04,520 Speaker 1: to that fragrance and you stop noticing it, right? Olfactory 1114 01:01:04,560 --> 01:01:08,760 Speaker 1: desensitization sets in. You no longer smell it yourself, so 1115 01:01:08,800 --> 01:01:12,520 Speaker 1: it basically disappears for you. But other people smell it, 1116 01:01:12,600 --> 01:01:15,960 Speaker 1: even if you don't expect them to. Yeah. Yeah, I 1117 01:01:15,960 --> 01:01:19,720 Speaker 1: think we all have had that experience with someone 1118 01:01:19,800 --> 01:01:23,720 Speaker 1: who has just outrageously powerful perfume, you know, like, 1119 01:01:23,800 --> 01:01:27,360 Speaker 1: sometimes to the extent that it announces their presence. Yes, yeah, 1120 01:01:27,520 --> 01:01:30,680 Speaker 1: sometimes people just lather up. And this makes me wonder 1121 01:01:30,720 --> 01:01:35,720 Speaker 1: about whether the spotlight effect is especially salient for appearance, because, 1122 01:01:36,160 --> 01:01:40,240 Speaker 1: of course, we normally can't really see ourselves when we're 1123 01:01:40,240 --> 01:01:42,760 Speaker 1: going about our lives. If we're in a regular business 1124 01:01:42,760 --> 01:01:45,480 Speaker 1: meeting talking, we can't see our face. We might be 1125 01:01:45,520 --> 01:01:47,440 Speaker 1: able to see our bodies if we look down at them, but 1126 01:01:47,560 --> 01:01:50,720 Speaker 1: we're probably not looking down, probably looking up at the room. 1127 01:01:50,760 --> 01:01:55,960 Speaker 1: But we're also frequently suddenly reminded of our appearance when 1128 01:01:55,960 --> 01:01:58,280 Speaker 1: we walk in front of a mirror or log into 1129 01:01:58,280 --> 01:02:00,800 Speaker 1: a web meeting or something.
So it might be the 1130 01:02:00,920 --> 01:02:05,400 Speaker 1: sort of perfect mix of obliviousness in your regular behaviors 1131 01:02:05,440 --> 01:02:09,080 Speaker 1: and then the sudden shocking reminders of, oh yeah, I 1132 01:02:09,160 --> 01:02:11,960 Speaker 1: look like this to external people, and that kind of 1133 01:02:12,040 --> 01:02:15,280 Speaker 1: keeps you on your toes. Like, what if after putting 1134 01:02:15,280 --> 01:02:19,640 Speaker 1: on some cologne you could suddenly smell it intensely again 1135 01:02:19,640 --> 01:02:22,760 Speaker 1: every hour or so? Yeah, I mean, when you 1136 01:02:22,760 --> 01:02:24,320 Speaker 1: put it like that, it almost sounds like it would 1137 01:02:24,320 --> 01:02:29,120 Speaker 1: be helpful. But I don't feel like our experience with 1138 01:02:30,400 --> 01:02:32,800 Speaker 1: our own footage in a Zoom call, or what have you, 1139 01:02:33,160 --> 01:02:37,880 Speaker 1: is necessarily helpful. It really feels like built-in egocentric feedback. Yeah, 1140 01:02:37,880 --> 01:02:40,400 Speaker 1: because there's too much of it. It's just constantly there. 1141 01:02:41,400 --> 01:02:44,520 Speaker 1: So anyway, if we assume that the spotlight effect is real, 1142 01:02:44,640 --> 01:02:48,200 Speaker 1: that it is something that's generally true about people (it might 1143 01:02:48,240 --> 01:02:50,200 Speaker 1: not be true to the same extent for everyone), then 1144 01:02:50,280 --> 01:02:54,280 Speaker 1: if this effect is correctly observed, what would the implications 1145 01:02:54,320 --> 01:02:57,640 Speaker 1: for our lives be?
Well, Gilovich has actually gotten kind 1146 01:02:57,640 --> 01:03:00,960 Speaker 1: of sweet about this. So he notes that, 1147 01:03:01,280 --> 01:03:03,720 Speaker 1: you know, there are studies that show that later in life, 1148 01:03:04,000 --> 01:03:07,400 Speaker 1: most people report that their major regrets about their lives 1149 01:03:07,480 --> 01:03:11,280 Speaker 1: concern things that they failed to do rather than things 1150 01:03:11,320 --> 01:03:13,320 Speaker 1: that they did. It's not the same for everybody, but 1151 01:03:13,360 --> 01:03:15,920 Speaker 1: that is a much more common framing, and you've probably 1152 01:03:15,920 --> 01:03:19,720 Speaker 1: read about this before. This is widely observed. So many 1153 01:03:19,800 --> 01:03:23,040 Speaker 1: of the things that people want to do, but never do, 1154 01:03:23,960 --> 01:03:27,000 Speaker 1: they hold back from out of a sense of 1155 01:03:27,000 --> 01:03:30,040 Speaker 1: self-consciousness, or anxiety about how people are going to 1156 01:03:30,160 --> 01:03:33,880 Speaker 1: perceive us, you know, for doing these things. So one 1157 01:03:33,920 --> 01:03:36,439 Speaker 1: easy example might be that you fail to ever take 1158 01:03:36,520 --> 01:03:40,640 Speaker 1: up playing a musical instrument because you fear that other 1159 01:03:40,680 --> 01:03:44,520 Speaker 1: people will judge you as unskilled at playing it, especially 1160 01:03:44,520 --> 01:03:47,400 Speaker 1: at first. And so the research on the spotlight effect 1161 01:03:47,440 --> 01:03:51,280 Speaker 1: suggests that we are very likely to be overestimating, 1162 01:03:51,560 --> 01:03:56,520 Speaker 1: perhaps even grossly overestimating, how much people would even notice 1163 01:03:56,640 --> 01:03:59,840 Speaker 1: whatever it is that we're afraid of doing. And the 1164 01:04:00,000 --> 01:04:02,440 Speaker 1: authors of the study write, quote:
The lesson of this 1165 01:04:02,560 --> 01:04:05,440 Speaker 1: research, then, is that we might all have fewer regrets 1166 01:04:05,480 --> 01:04:09,920 Speaker 1: if we properly understood how much attention or inattention our 1167 01:04:09,960 --> 01:04:13,760 Speaker 1: actions actually draw from others. Yeah, that is 1168 01:04:13,840 --> 01:04:15,840 Speaker 1: kind of a sweet twist on it, you know. It's 1169 01:04:15,880 --> 01:04:18,680 Speaker 1: like saying, look, go for it, 1170 01:04:18,800 --> 01:04:21,360 Speaker 1: live your dream, because nobody's really going to pay that 1171 01:04:21,440 --> 01:04:25,320 Speaker 1: much attention even when it falls flat. Dance like nobody's watching, 1172 01:04:25,360 --> 01:04:28,560 Speaker 1: because probably nobody is watching, or if they are watching, 1173 01:04:28,560 --> 01:04:31,280 Speaker 1: they might not even remember. I mean, it's just like, 1174 01:04:31,520 --> 01:04:36,800 Speaker 1: you're probably way over-concerned about possible minor faux 1175 01:04:36,880 --> 01:04:41,000 Speaker 1: pas, or looking weird or awkward. Yeah, like, probably 1176 01:04:41,040 --> 01:04:43,400 Speaker 1: even if they're watching you and they're thinking about it, 1177 01:04:43,440 --> 01:04:45,520 Speaker 1: they're probably thinking, oh, man, do I look like that 1178 01:04:45,640 --> 01:04:48,560 Speaker 1: when I dance by myself? What do I look like 1179 01:04:48,560 --> 01:04:52,560 Speaker 1: when I dance by myself? This reminds me. 1180 01:04:52,640 --> 01:04:56,280 Speaker 1: We're talking about situations where you realize that it may 1181 01:04:56,320 --> 01:04:59,640 Speaker 1: be perceived as weird by other people, or embarrassing 1182 01:04:59,680 --> 01:05:02,880 Speaker 1: to people. So obviously I've mentioned Star Wars like three 1183 01:05:02,880 --> 01:05:06,120 Speaker 1: times so far.
I'm mostly trapped in the house here, 1184 01:05:06,160 --> 01:05:07,920 Speaker 1: and me and my son are super into Star Wars. 1185 01:05:08,160 --> 01:05:11,400 Speaker 1: He has a couple of lightsabers, and he'll often 1186 01:05:11,400 --> 01:05:14,120 Speaker 1: ask me to go out to have a lightsaber battle 1187 01:05:14,160 --> 01:05:16,640 Speaker 1: with him, which is something we have to do outside 1188 01:05:17,120 --> 01:05:20,080 Speaker 1: because otherwise we would destroy things in the house, and 1189 01:05:20,120 --> 01:05:21,600 Speaker 1: we have to do it in the front yard because 1190 01:05:21,600 --> 01:05:24,080 Speaker 1: the mosquitoes are too bad in the backyard. Um, 1191 01:05:24,200 --> 01:05:26,520 Speaker 1: so we'll have this fight in the front yard. People 1192 01:05:26,600 --> 01:05:28,920 Speaker 1: driving by will be able to see it, which generally, 1193 01:05:28,960 --> 01:05:31,080 Speaker 1: I imagine, they'll say, oh, well, there's a dad having 1194 01:05:31,120 --> 01:05:34,440 Speaker 1: a lightsaber battle with his son. That's great, the 1195 01:05:34,440 --> 01:05:37,880 Speaker 1: sweetest thing you'll see all day. But occasionally my son, 1196 01:05:37,960 --> 01:05:40,520 Speaker 1: who gets so into this, will have 1197 01:05:40,560 --> 01:05:42,400 Speaker 1: to run over to the side of the house to, 1198 01:05:42,640 --> 01:05:46,000 Speaker 1: like, fight a pretend droid or something, which leaves me 1199 01:05:46,120 --> 01:05:50,520 Speaker 1: in the front yard, apparently by myself, fighting pretend droids. 1200 01:05:50,960 --> 01:05:54,560 Speaker 1: And I realize, when that happens, people may drive by 1201 01:05:54,640 --> 01:05:58,400 Speaker 1: and think that I have lost my mind, um, which, 1202 01:05:58,640 --> 01:06:01,120 Speaker 1: I don't know, I'm ultimately okay with. 1203 01:06:01,240 --> 01:06:03,120 Speaker 1: I don't think you got anything to worry about, man.
1204 01:06:03,240 --> 01:06:06,280 Speaker 1: That's gonna be the ray of sunshine 1205 01:06:06,400 --> 01:06:10,240 Speaker 1: in the day of so many people driving by. Seriously, 1206 01:06:10,280 --> 01:06:12,960 Speaker 1: if I was driving by and 1207 01:06:13,040 --> 01:06:15,160 Speaker 1: I saw some people having a lightsaber duel in their 1208 01:06:15,200 --> 01:06:17,960 Speaker 1: front yard, I would be like, that's, you know, there's hope, 1209 01:06:18,400 --> 01:06:20,919 Speaker 1: a new hope. Yeah. Maybe that's what I'm doing. 1210 01:06:20,960 --> 01:06:23,360 Speaker 1: I'm giving people hope. They're like, I didn't 1211 01:06:23,360 --> 01:06:24,960 Speaker 1: realize I could do that as a grown-up, that 1212 01:06:25,000 --> 01:06:28,200 Speaker 1: I could just get a lightsaber and 1213 01:06:28,240 --> 01:06:30,600 Speaker 1: start having pretend battles in my front yard. I'm gonna 1214 01:06:30,640 --> 01:06:34,080 Speaker 1: do it. That's gonna make this quarantine situation a lot easier. 1215 01:06:34,360 --> 01:06:36,880 Speaker 1: Along the same lines, I am extremely in favor of 1216 01:06:36,920 --> 01:06:41,160 Speaker 1: adults climbing trees. There's this bizarre idea that adults shouldn't 1217 01:06:41,160 --> 01:06:44,360 Speaker 1: climb trees, that climbing trees is for children. Why? Adults should 1218 01:06:44,360 --> 01:06:46,959 Speaker 1: climb trees all the time. I love a good climbing tree. 1219 01:06:47,080 --> 01:06:48,840 Speaker 1: It's a good skill to have. I see people in 1220 01:06:48,960 --> 01:06:50,920 Speaker 1: movies having to do it all the time to escape, 1221 01:06:50,960 --> 01:06:54,080 Speaker 1: like, you know, robots and monsters and whatnot. Yeah, so 1222 01:06:54,240 --> 01:06:57,520 Speaker 1: go for it. Yeah, those, like, those Boston Dynamics dog 1223 01:06:57,680 --> 01:06:59,560 Speaker 1: robots are coming for you. Where are you gonna go?
1224 01:06:59,680 --> 01:07:01,720 Speaker 1: You've got to get up a tree. Like, suddenly you've got 1225 01:07:01,720 --> 01:07:04,200 Speaker 1: to climb a tree and you haven't been practicing for 1226 01:07:04,520 --> 01:07:08,320 Speaker 1: twenty or thirty years. Good luck. And then what if 1227 01:07:08,360 --> 01:07:11,320 Speaker 1: you have to fight it with a lightsaber? Also, you've 1228 01:07:11,320 --> 01:07:14,320 Speaker 1: got to keep those skills warm, you know. These are 1229 01:07:14,360 --> 01:07:17,440 Speaker 1: the skills one needs to survive in the wasteland. All right, well, 1230 01:07:17,440 --> 01:07:19,960 Speaker 1: we're gonna go ahead and close it out there. I 1231 01:07:20,000 --> 01:07:21,680 Speaker 1: think there's a lot of material in here 1232 01:07:21,720 --> 01:07:25,840 Speaker 1: for everyone to think about, and we, of course, await 1233 01:07:26,320 --> 01:07:29,200 Speaker 1: listener responses to this. How do you perceive the 1234 01:07:29,240 --> 01:07:31,800 Speaker 1: spotlight effect in your own life or in the lives 1235 01:07:31,800 --> 01:07:35,000 Speaker 1: of others? Has this forced you to rethink anything 1236 01:07:35,480 --> 01:07:37,280 Speaker 1: going on in the world around you, or how you, 1237 01:07:37,320 --> 01:07:42,680 Speaker 1: indeed, engage in your daily or weekly digital 1238 01:07:42,680 --> 01:07:45,400 Speaker 1: conference calls for work? In the meantime, if you 1239 01:07:45,400 --> 01:07:47,960 Speaker 1: would like to check out other episodes of Stuff to 1240 01:07:47,960 --> 01:07:50,480 Speaker 1: Blow Your Mind, including the one we mentioned 1241 01:07:50,520 --> 01:07:54,400 Speaker 1: dealing with anchoring, you can find those wherever 1242 01:07:54,520 --> 01:07:57,200 Speaker 1: you get your podcasts.
I will say this, you can 1243 01:07:57,240 --> 01:08:00,480 Speaker 1: certainly find our iHeartRadio listing by going 1244 01:08:00,520 --> 01:08:02,920 Speaker 1: to stufftoblowyourmind dot com. And if you 1245 01:08:03,000 --> 01:08:05,840 Speaker 1: go there, you'll see a little part of the page 1246 01:08:05,840 --> 01:08:08,520 Speaker 1: that says show links. There is a store link there, 1247 01:08:08,560 --> 01:08:10,920 Speaker 1: and if you go there you will find T-shirts 1248 01:08:11,640 --> 01:08:17,320 Speaker 1: that, bringing it around, well, some of them are cool, uh, 1249 01:08:17,600 --> 01:08:19,679 Speaker 1: some of them I would personally be embarrassed to wear. 1250 01:08:20,240 --> 01:08:22,080 Speaker 1: You'll have to look at them and try to decide 1251 01:08:22,160 --> 01:08:25,360 Speaker 1: which design is which. But we charge every listener 1252 01:08:25,400 --> 01:08:28,320 Speaker 1: to buy one cool T-shirt and one extremely embarrassing 1253 01:08:28,360 --> 01:08:33,400 Speaker 1: T-shirt. Yes, well, we have both there, so go 1254 01:08:33,600 --> 01:08:36,400 Speaker 1: check them out if you so desire. Um, I don't know. 1255 01:08:36,520 --> 01:08:38,559 Speaker 1: None of them have Barry Manilow on the front, though, 1256 01:08:38,600 --> 01:08:40,519 Speaker 1: so you need to do a separate image search to 1257 01:08:40,560 --> 01:08:43,599 Speaker 1: see what I'm talking about there. Huge thanks, as always, 1258 01:08:43,640 --> 01:08:47,240 Speaker 1: to our excellent audio producer Seth Nicholas Johnson.
If you 1259 01:08:47,240 --> 01:08:49,080 Speaker 1: would like to get in touch with us with feedback 1260 01:08:49,120 --> 01:08:51,320 Speaker 1: on this episode or any other, to suggest a topic 1261 01:08:51,360 --> 01:08:53,439 Speaker 1: for the future, or just to say hi, you can 1262 01:08:53,479 --> 01:08:56,200 Speaker 1: email us at contact at stufftoblowyourmind 1263 01:08:56,360 --> 01:09:06,160 Speaker 1: dot com. Stuff to Blow Your Mind is a production of 1264 01:09:06,160 --> 01:09:08,800 Speaker 1: iHeartRadio. For more podcasts from iHeartRadio, 1265 01:09:09,000 --> 01:09:11,679 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever 1266 01:09:11,720 --> 01:09:21,520 Speaker 1: you're listening to your favorite shows.