Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick, and it's Saturday, time to go into the vault for an older episode of the show. This one originally published on August 4, and it is about a psychological phenomenon called the spotlight effect. I think this one may have come out of us thinking about being on Zoom calls all the time last summer, which, at least as of the time we're recording this intro, is still an ongoing journey. But yeah, that was definitely on our minds back then, and this episode came out of it. I ended up thinking this was a really interesting subject. Yeah, it definitely goes beyond the Zoom realm. So if, by the time you listen to this, you are outside of the Zoom realm, if this seems like an artifact of the past, don't worry: the spotlight effect is very much in play in the world that you're living in.

Hello, everyone. This is Seth, the audio producer for Stuff to Blow Your Mind, and there's just a quick note before we start.
Myself, Robert, and Joe are still recording in isolation because it is still the summer, and, well, for this episode we had a bit of a problem with Robert's microphone. It was a one-time thing; it shouldn't happen again in the future. But for this episode, it's gonna sound a little bit like Joe is kind of talking to a TV/VCR combo in a bunker somewhere. Though, well, I guess that's technically kind of what we're actually doing. Anyway, the point is, this is a one-time incident. It's all still very understandable, very easy to listen to. I've cleaned it up as best I can, and next time it'll sound just like normal, we promise. Thanks, and enjoy the show.

Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Hey, welcome to Stuff to Blow Your Mind. My name is Robert Lamb, and I'm Joe McCormick. And today we're gonna be mostly focusing on a classic psychology paper about a kind of social cognitive bias known as the spotlight effect.
But to get into this subject, I wanted to start off by thinking about something that a lot of people have found themselves doing in the past few months. Maybe this wasn't part of your job until recently, but now you spend a significant portion of your day in web video meetings, staring at little boxes of your co-workers' faces on a computer screen, or maybe just staring at your own face a lot. Yes, and in fact, right now, as Joe and I are recording this, we are using a Zoom call, a Zoom conference room, to communicate with each other, and then we're using some other programs and whatnot to actually record it. But yeah, you may be using something different: there's Google Meet, Microsoft's meeting thing, the Facebook one. I mean, there are, like, a million of them, right? Yeah, it's a growth industry, and it makes sense, right?
We still need to be able to connect with each other. We still need to have meaningful meetings, and all of those less meaningful meetings, in order to keep the gears of business grinding away. Right. Yeah. And one of the strange things I've noticed, and I've read other people noticing the exact same thing, is that you might expect that being able to do a meeting from your home over the internet would be less exhausting than a meeting in person, but somehow I have not found that to be the case. I've found that video chats can be just intensely draining, like after they're over, you feel like you've been lifting weights or something. And part of what's going on here I found very much embodied in the spirit of an article that I saw linked to. It's just a Medium post, and I want to be very clear that I'm not passing judgment on the author here, there's nothing wrong with this person, but the title of it just gave me chills. And the title of this Medium post was "How to Fake Eye Contact During Video Chats and Why It's Important."
Yes, this is interesting. This was a Medium article by Alexa Curtis, and they made three key suggestions. The first one is to use a webcam even if no one else is. Okay. The second is to trick yourself into looking at the camera instead of at the screen. Impossible. And then, tape your prompts, your notes, whatever, to your monitor as much as possible instead of having to refer to, like, a notepad or something. Yeah, the suggestion is, like, make a little fake face to go around the webcam box so that we're looking into the camera instead of at the screen. But I just don't think, if your face is played back on the screen, unless you are able to turn your face off for yourself, you're just not going to be able to help it, are you? Yeah. Actually, before I read this, I kind of suspected as much, because I'd catch myself doing this. Now, I should say that I am lucky in that I do not have to sit through hours and hours of meetings a day on Zoom. I have friends who are definitely stuck in that boat, and they seem exhausted by it.
We use Zoom in these recordings, but for the most part, we're not actually engaging in the video part of it. We have other stuff, we have notes that we're looking at when we're recording. But I have found this to be the case with my Dungeons and Dragons group, which used to meet in person but now is forced to meet via Zoom. It's like a two- or three-hour Zoom call once a week, you know, to do Dungeons and Dragons. And for some reason, I've been noticing that I feel kind of worn out by it towards the end of the night, in ways that I wasn't worn out previously, meeting in person. I just kind of feel zapped by it, like if we're battling something at the end, I'm just kind of going through the motions. I'm just not feeling it anymore at that point. Yeah, the exact same experience. I've done social stuff via video calls.
I've also been doing a D&D campaign, my first ever, by the way (thank you), over Zoom, and it's been a lot of fun. But yes, it is kind of exhausting to just participate in the eyeball tennis of the different video faces on the screen. Something about it hooks its claws into your brain and just pulls and stretches and kind of kneads your brain like a ball of dough. Now, I don't know how your campaign is going, but with ours, we're also using a couple of other resources: we're using a Discord forum, and we're using Roll20 to pull up maps and such. So maybe we should upgrade our technology. It's worth looking into. But basically, we have some other things to captivate our eyes during this process.
I guess one of the things about a straight-up business Zoom meeting is that a lot of times you're just stuck in those Brady Bunch cubes, right? You're just stuck with all these little screen pictures of people. Sometimes you have it set so that one will take dominance over the others, but you may just be looking at a wall of people's faces. And then you're thinking about, again, this point of: should I be making eye contact with everybody? Should I be focusing on trying to be the most presentable, you know, the most professional-looking? Like, when they look at my little box, it's like watching a TV broadcast, and I'm making direct eye contact with them, maybe because I have a little sticky smiley face that I've put up at the top of my computer, by the camera. There are multiple ways in which this type of interaction is not normal. I mean, of course, it's not normal to be interfacing through technology at all.
Of course, it's kind of strange that you're not looking at the person, you're looking at the screen, so the eye contact is off. I understand that point in the article. But at the same time, it is not normal to be able to see yourself while you're talking to people. I can imagine if, you know, you were always talking to people with a mirror in your hand that was reflecting your face. Yeah, people would rightfully think you were insane, think, that's how self-obsessed you are. Or, I don't know, so afraid of Gorgons that you always had to have a mirror in your hand. And yet that's the reality we find ourselves in. And I know this is not just our particular reaction to this. This is something that I've read about in multiple popular articles, and not just in obscure scientific articles. Like, there was an article I came across, I think it was in Business Insider, that was called something like "Why You Can't Stop Staring at Yourself in Zoom Calls."
Yes, this was by Shira Feder, titled "A Cyberpsychologist Explains Why You Can't Stop Staring at Yourself on Zoom Calls, and Everyone Else Is Probably Doing the Same." I have caught myself doing this, sometimes during Dungeons and Dragons, sometimes during work calls, where, you know, you want to check in, you want to see how you're presenting to the rest of the world. You're looking at this wall of faces, and then you decide to maximize your own, and you're like, all right, let's see: how's the light hitting me? What's my hair doing right now? Am I smiling weird? Do I look engaged, or do I look as bored as I feel? You know? And so that's what this article gets into, a little of that. The article discusses some key points made by cyberpsychologist Andrew Franklin.
So the first one is that, in general, adolescents tend to suffer from the imaginary audience delusion, the idea that people in their surroundings are really paying attention to every move they make, and this often follows us into adulthood as well. Yeah, and this, I think, is pretty close to, or perhaps even just another name for, the main issue we're going to be focusing on today, otherwise known as the spotlight effect. Yes. Now, one thing about this point about adolescents: this is, of course, terrifying to think about, given the nature of social media, which is pretty much predicated on this sort of celebrity-aspiring notion of a constant audience. And one tends to drift to extremes in reaction to that, right? This feeling that every word I put on the internet, or every video, every whatever, is vitally important, that it will be viewed by potentially everyone in the world. Yes. You're not just constantly on view, you are constantly being reviewed, is the perception.
Yeah, obviously that's gonna vary from person to person. Some people are gonna use social media in a way that has a far more limited scope: only close friends, or maybe even family, maybe just one person can see, you know. But yeah, it does make me wonder, and this would have to be a discussion for another time, just what is happening when this spotlight effect, this imaginary audience delusion, is playing into our use of social media. Yeah. But also, this psychology professor Andrew Franklin makes the point that it's not just an illusion, that video chats are actually exhausting. Yeah, he makes the point that video chats are more stressful than in-person meetings, and a big part of that is just that everything is more distracted, more fragmented, and our nonverbal communicative skills are muted or severely lessened. You might think a lot of this is kind of an overstatement of the obvious.
But you can do far less with your body language, not only your overt body language, like talking with your hands and waving to people at the other side of the table, but in terms of just having a bodily awareness of what everyone is doing and how they are reacting to what's going on, and whether someone else is about to speak or needs to speak. Yes. And it's actually hard to tell who's looking at whom. You can assume, if there are a bunch of Brady Bunch boxes, that people, if they're not looking at themselves, are probably looking at the person who's currently talking, but maybe not. You can't tell. Yeah. And it's also weird to think, I guess maybe some of you out there had more of, like, a rules-of-order kind of upbringing, or you have some more training and think, okay, this is how a business meeting goes, this is how a work meeting operates.
But I feel, for my part, a lot of it's just kind of learned as you go; you just sort of figure out what the culture of this group and this meeting is, and how am I supposed to fit in? And then, to a certain extent, it feels like we've had to relearn all of that, or augment our understanding of that, based on the limitations of the technology. Yeah, I think that's totally right, and I would emphasize yet again that not all digital socialization skills are interchangeable or transferable to one another. So you might have been well acclimatized to the social skills one needs in order to interact through a different type of mediated social media, like Facebook or Twitter or something like that, and still not really have any skills for how to interact via a video chat. It's just a different set of skills, a different set of things to get used to. Yeah, it's a different talking stick entirely.
Now, Franklin also notes that, given the strain of keeping up with everyone's tiny boxes and the concern over how you yourself look in your box, you might easily find yourself just looking at yourself, staring into the digital mirror and fixating on how you appear to friends, co-workers, and bosses. And Franklin maintains that this means you're likely overwhelmed by the proceedings, which I totally get. Again, even within the low-stakes confines of a Dungeons and Dragons game, nothing huge is on the line here, but by the end of the session, I often find myself kind of zapped in ways that I never felt with in-person gaming. And even though we're staring into that digital reflection of our own face, Franklin stresses that people are ultimately not fixating on you like you think they are. They are not sitting there watching you and dissecting everything about your appearance and your background and what your face is doing at any given moment. Yeah. They're probably much more likely fixating on themselves, the same way you are fixating on yourself.
And so this brings us back to the cognitive bias that we're gonna be focusing on in today's episode, known as the spotlight effect. And this effect is very interesting because, on one hand, I think it's one of the simplest psychological phenomena we've ever talked about on the show. It's very straightforward in a way. But it's one of those things where, if you really internalize it, its implications could be kind of life-changing. Yeah, it's one of these things that, I wouldn't say it really changes the nature of your reality, but it brings certain aspects of it into maybe sharper focus. You might realize, oh, well, okay, that explains some of the things I feel when I am in a meeting, or, you know, just walking around in a public space, or whatever the case may be. It does feel like a revelation in its own way.
290 00:15:44,120 --> 00:15:46,520 Speaker 1: So the main paper that I wanted to focus on 291 00:15:46,560 --> 00:15:49,160 Speaker 1: today was published in the year two thousand in the 292 00:15:49,240 --> 00:15:54,360 Speaker 1: Journal of Personality and Social Psychology by Thomas Gilovich, Victoria 293 00:15:54,440 --> 00:15:58,000 Speaker 1: who Staid Medvec and Kenneth Savitsky, and it's called the 294 00:15:58,040 --> 00:16:02,200 Speaker 1: spotlight effect in social judgment, an ecocentric bias, and estimates 295 00:16:02,200 --> 00:16:06,080 Speaker 1: of the salience of one zone actions and appearance. You 296 00:16:06,120 --> 00:16:08,920 Speaker 1: can pretty easily find a full PDF of this online 297 00:16:08,960 --> 00:16:10,840 Speaker 1: if you want to read it. And this is a 298 00:16:11,160 --> 00:16:13,840 Speaker 1: highly cited paper. It has been referred to many many 299 00:16:13,880 --> 00:16:16,600 Speaker 1: times in the years since as a kind of seminal 300 00:16:16,640 --> 00:16:20,040 Speaker 1: work on this on this social cognitive bias. So the 301 00:16:20,120 --> 00:16:25,920 Speaker 1: authors begin with some anecdotal observations, and these observations are 302 00:16:25,960 --> 00:16:29,160 Speaker 1: that for both good and ill, it often seems like 303 00:16:29,360 --> 00:16:33,160 Speaker 1: stuff you expect other people to notice and recall about 304 00:16:33,200 --> 00:16:38,240 Speaker 1: you really goes unnoticed. Uh. And also on the good side, 305 00:16:38,640 --> 00:16:42,160 Speaker 1: that might be like something smart that you said in 306 00:16:42,200 --> 00:16:45,560 Speaker 1: a discussion group. 
You're really pleased with yourself, like, oh, I had that really good insight, or I made that really funny joke, and then it turns out later that nobody else seems to recall that you said anything. Or, and this often actually happens in athletic contexts, people will make a really good play in a basketball game or something, and they will expect people to remember that they did that. But then maybe it turns out that nobody really noticed; it was just one of the goals in a game in which many goals were scored. And as frustrating as this can be, it can also kind of be a relief that it works the other way too. People often don't seem to have noticed when you make what feels like a really obvious mistake, a faux pas on a first meeting, or when you misspeak in what feels like an embarrassing way, or that time you had spinach in your teeth. Like, you obsess over that, and you're afraid it's going to completely ruin your reputation, that everybody's gonna remember you for that thing forever. But a lot of times, it seems like maybe nobody even noticed.
Yeah, 326 00:17:44,080 --> 00:17:48,119 Speaker 1: the dual nature of this particular revelation, I think ultimately 327 00:17:48,119 --> 00:17:50,439 Speaker 1: it is positive because, yeah, maybe it means you're not 328 00:17:50,480 --> 00:17:52,600 Speaker 1: as important as you thought you were. Maybe you're not 329 00:17:52,720 --> 00:17:55,879 Speaker 1: as explosive a personality as you 330 00:17:55,960 --> 00:17:58,879 Speaker 1: thought you were. But on the other hand, you know, 331 00:17:58,960 --> 00:18:01,280 Speaker 1: maybe the stakes are a little bit lower every time you 332 00:18:01,359 --> 00:18:04,720 Speaker 1: open your mouth. Yeah, yeah, that's the hypothesis at the 333 00:18:04,720 --> 00:18:09,080 Speaker 1: heart of this paper, that these anecdotal observations are indicative 334 00:18:09,119 --> 00:18:12,560 Speaker 1: of a real trend that can be measured: that, in general, 335 00:18:13,080 --> 00:18:17,360 Speaker 1: humans have an egocentric bias that causes us to believe 336 00:18:17,480 --> 00:18:21,560 Speaker 1: that our actions and our appearance are much more salient 337 00:18:21,720 --> 00:18:25,520 Speaker 1: and notable to other people than they really are. Quote: 338 00:18:25,760 --> 00:18:28,480 Speaker 1: People tend to believe that more people take note 339 00:18:28,480 --> 00:18:31,159 Speaker 1: of their actions and appearance than is actually the case. 340 00:18:31,600 --> 00:18:35,800 Speaker 1: We dubbed this putative phenomenon the spotlight effect. People tend 341 00:18:35,800 --> 00:18:38,959 Speaker 1: to believe that the social spotlight shines more brightly on 342 00:18:39,040 --> 00:18:42,000 Speaker 1: them than it really does.
Yeah, this is insightful, and 343 00:18:42,040 --> 00:18:44,280 Speaker 1: I think we can all match this up pretty easily, 344 00:18:44,320 --> 00:18:46,760 Speaker 1: first of all with our own experiences, but also with 345 00:18:46,840 --> 00:18:49,439 Speaker 1: some of the ideas that we've discussed on the show before. 346 00:18:49,960 --> 00:18:53,199 Speaker 1: Specifically, first of all, there's the self narrative aspect 347 00:18:53,280 --> 00:18:56,040 Speaker 1: of our inner thoughts. You know, through the inner workings 348 00:18:56,040 --> 00:18:59,320 Speaker 1: of consciousness, we're constantly weaving together a story about who 349 00:18:59,359 --> 00:19:01,480 Speaker 1: we are and how we fit into the world. It's 350 00:19:01,480 --> 00:19:04,119 Speaker 1: a little movie, and we're the main character, so of 351 00:19:04,160 --> 00:19:07,120 Speaker 1: course we're the most important person in that story. Right. 352 00:19:07,160 --> 00:19:09,960 Speaker 1: You're trying to link together a series of what are 353 00:19:10,000 --> 00:19:13,600 Speaker 1: in fact sort of random events into a cohesive narrative 354 00:19:13,640 --> 00:19:16,399 Speaker 1: with a logic to it, right. And then through theory 355 00:19:16,440 --> 00:19:20,280 Speaker 1: of mind, we're constantly running simulations about the mental states 356 00:19:20,280 --> 00:19:24,680 Speaker 1: of other people, specific people, people in general, known people, 357 00:19:24,760 --> 00:19:28,840 Speaker 1: unknown people, sort of hypothetical people. And of 358 00:19:28,880 --> 00:19:30,719 Speaker 1: course one of the key aspects of any of these 359 00:19:30,720 --> 00:19:33,680 Speaker 1: simulations is, you know, how do they relate to me, 360 00:19:33,840 --> 00:19:37,000 Speaker 1: how do they think about me? What are their intentions 361 00:19:37,040 --> 00:19:40,159 Speaker 1: towards me?
And that makes sense, right. There's an inherently 362 00:19:40,160 --> 00:19:42,840 Speaker 1: self centered quality to this sort of thinking because it 363 00:19:42,880 --> 00:19:45,440 Speaker 1: all comes down to individual survival. We tend to err 364 00:19:45,520 --> 00:19:48,160 Speaker 1: on the side of seeing tigers in the grass when 365 00:19:48,160 --> 00:19:50,720 Speaker 1: there are none, which is the better of the two possible 366 00:19:50,720 --> 00:19:53,840 Speaker 1: gambles here. But it also means going through life perpetually 367 00:19:53,840 --> 00:19:57,200 Speaker 1: imagining how the tiger sees you. Yeah, and so this 368 00:19:57,280 --> 00:20:01,360 Speaker 1: is in some ways the exact social equivalent of the 369 00:20:01,400 --> 00:20:04,960 Speaker 1: agency detection overdrive, where you, you know, over interpret a 370 00:20:05,000 --> 00:20:07,800 Speaker 1: crack of a twig as a tiger in the grass. Here, 371 00:20:07,880 --> 00:20:11,480 Speaker 1: you over interpret any little thing that you think 372 00:20:11,560 --> 00:20:15,159 Speaker 1: may be going wrong in a social interaction as something that 373 00:20:15,200 --> 00:20:19,040 Speaker 1: people will notice and remember and judge you for. So 374 00:20:19,080 --> 00:20:20,800 Speaker 1: maybe we should take a quick break and then when 375 00:20:20,800 --> 00:20:23,080 Speaker 1: we come back, we can get a little bit further 376 00:20:23,160 --> 00:20:29,000 Speaker 1: into this study. Alright, we're back. Okay.
377 00:20:29,000 --> 00:20:30,719 Speaker 1: So we're talking about the study from the year two 378 00:20:30,800 --> 00:20:34,480 Speaker 1: thousand by Gilovich and co authors, who are putting forth 379 00:20:34,600 --> 00:20:37,320 Speaker 1: this putative phenomenon that at the time they called 380 00:20:37,359 --> 00:20:42,000 Speaker 1: the spotlight effect, the idea that we overestimate the salience 381 00:20:42,080 --> 00:20:45,639 Speaker 1: of our appearance and our behavior to other people. And 382 00:20:45,680 --> 00:20:49,240 Speaker 1: the authors here note several lines of previous research that 383 00:20:49,359 --> 00:20:52,560 Speaker 1: helped point to this conclusion. One of them is, first 384 00:20:52,560 --> 00:20:54,600 Speaker 1: of all, this may not be surprising at all, but 385 00:20:54,680 --> 00:20:58,360 Speaker 1: people do tend to have egocentric biases that you can 386 00:20:58,520 --> 00:21:02,600 Speaker 1: measure quite easily in tests. These are biases 387 00:21:02,640 --> 00:21:06,440 Speaker 1: that overstate the importance of the self. Just one example 388 00:21:06,520 --> 00:21:10,000 Speaker 1: they cite is a paper by Ross and Sicoly published 389 00:21:10,040 --> 00:21:13,600 Speaker 1: in nineteen seventy nine called Egocentric Biases in Availability 390 00:21:13,640 --> 00:21:16,800 Speaker 1: and Attribution, and it showed this in the 391 00:21:16,800 --> 00:21:21,560 Speaker 1: realm of what's called responsibility allocation: who did how much, 392 00:21:21,640 --> 00:21:24,679 Speaker 1: and how important was what they did? So there are 393 00:21:24,680 --> 00:21:27,600 Speaker 1: several different ways you can test for this. Maybe 394 00:21:27,600 --> 00:21:33,960 Speaker 1: in discussion groups, maybe in household chores, maybe in basketball teams.
Uh, 395 00:21:34,280 --> 00:21:38,440 Speaker 1: quote, one's own contributions to a joint product are more 396 00:21:38,520 --> 00:21:43,400 Speaker 1: readily available, that is, more frequently and easily recalled. Individuals 397 00:21:43,440 --> 00:21:47,560 Speaker 1: accepted more responsibility for a group product than other participants 398 00:21:47,560 --> 00:21:50,760 Speaker 1: attributed to them. So the easy way of thinking about 399 00:21:50,800 --> 00:21:55,160 Speaker 1: this is, oh, our team won because I scored that goal. Yeah, 400 00:21:55,560 --> 00:21:58,720 Speaker 1: I found this particularly telling. The authors point out 401 00:21:58,800 --> 00:22:03,680 Speaker 1: that the research indicates that when individuals undertake complex social interactions, 402 00:22:04,000 --> 00:22:07,600 Speaker 1: they alternate between the roles of speaker or actor and 403 00:22:07,840 --> 00:22:11,639 Speaker 1: listener or observer, but much of their attention is ultimately 404 00:22:11,960 --> 00:22:14,919 Speaker 1: going to be directed, in many cases, at planning and 405 00:22:14,920 --> 00:22:18,200 Speaker 1: executing their own responses. And I think we can relate 406 00:22:18,240 --> 00:22:20,880 Speaker 1: to this, you know, those times when 407 00:22:20,920 --> 00:22:23,160 Speaker 1: you haven't quite zoned out in a meeting 408 00:22:23,200 --> 00:22:25,760 Speaker 1: or a conversation. You're not just, you know, 409 00:22:25,800 --> 00:22:28,359 Speaker 1: out there thinking about Star Wars in the back of 410 00:22:28,359 --> 00:22:32,160 Speaker 1: your head. No, you're focusing instead on the thing that 411 00:22:32,200 --> 00:22:36,160 Speaker 1: you're getting ready to say, your interjection into the conversation, 412 00:22:36,920 --> 00:22:39,240 Speaker 1: the joke that you are intending to make when you 413 00:22:39,280 --> 00:22:43,160 Speaker 1: get the talking stick.
And, you know, because 414 00:22:43,240 --> 00:22:45,560 Speaker 1: ultimately that's often a part of any kind of like 415 00:22:45,720 --> 00:22:50,600 Speaker 1: three way or larger conversation: when 416 00:22:50,680 --> 00:22:52,560 Speaker 1: is it going to be my turn? And how am 417 00:22:52,600 --> 00:22:55,280 Speaker 1: I going to make the most out of my 418 00:22:55,359 --> 00:22:58,080 Speaker 1: time speaking? I believe there's actually a name for this 419 00:22:58,160 --> 00:23:00,639 Speaker 1: exact effect. It's a different thing than this stuff. I mean, 420 00:23:00,680 --> 00:23:03,080 Speaker 1: obviously it's very related to the stuff we're talking about, 421 00:23:03,160 --> 00:23:05,840 Speaker 1: but I think it's called the next in line effect, 422 00:23:06,440 --> 00:23:10,200 Speaker 1: where you can measure that, 423 00:23:10,320 --> 00:23:12,040 Speaker 1: if you, like, sit people in a circle 424 00:23:12,119 --> 00:23:14,600 Speaker 1: and go around the circle asking them to speak, people 425 00:23:14,640 --> 00:23:17,760 Speaker 1: have less recall of the person who spoke right before 426 00:23:17,840 --> 00:23:20,480 Speaker 1: them than they do of everybody else, because, you know, when 427 00:23:20,480 --> 00:23:22,480 Speaker 1: the person right before you was talking, you were planning what 428 00:23:22,520 --> 00:23:25,920 Speaker 1: you're gonna say.
Yeah, and it means that when one 429 00:23:26,119 --> 00:23:28,879 Speaker 1: thinks back on a meeting, so you're in this meeting, 430 00:23:28,960 --> 00:23:30,959 Speaker 1: you have this period of time where you're applying 431 00:23:30,960 --> 00:23:34,439 Speaker 1: most of your cognitive efforts towards preparing for your own words, 432 00:23:34,800 --> 00:23:36,840 Speaker 1: and then when you think back on it, you're more 433 00:23:36,880 --> 00:23:39,760 Speaker 1: likely to remember the thing that you were focused on 434 00:23:39,840 --> 00:23:42,480 Speaker 1: at the time, you know, your own words, your own 435 00:23:42,520 --> 00:23:46,119 Speaker 1: contribution, because that's where you're spending the 436 00:23:46,359 --> 00:23:50,400 Speaker 1: mental resources. The exception to this, however, would be if 437 00:23:50,440 --> 00:23:54,000 Speaker 1: one's contribution required little effort, like instead of 438 00:23:54,359 --> 00:23:57,720 Speaker 1: plotting to interject something that will make everyone laugh or 439 00:23:57,840 --> 00:24:01,239 Speaker 1: pursuing some specific strategic aim in the meeting, what if 440 00:24:01,240 --> 00:24:03,560 Speaker 1: it was just the part of the meeting where every 441 00:24:03,600 --> 00:24:06,479 Speaker 1: week your boss says, hey, Roy, what are the numbers? 442 00:24:06,680 --> 00:24:08,199 Speaker 1: Just read us the numbers real quick, and then you 443 00:24:08,240 --> 00:24:10,320 Speaker 1: read the numbers, something that's, you know, quick and normal 444 00:24:10,400 --> 00:24:13,400 Speaker 1: like that.
Now, the exception to this they mentioned would 445 00:24:13,440 --> 00:24:16,440 Speaker 1: be passive observers, people who are in the meeting but 446 00:24:17,320 --> 00:24:20,560 Speaker 1: are not planning to have the talking stick at any point, 447 00:24:20,640 --> 00:24:22,639 Speaker 1: don't have any kind of active role in the meeting, 448 00:24:22,800 --> 00:24:25,040 Speaker 1: or if they do, maybe it is just reading off 449 00:24:25,040 --> 00:24:27,720 Speaker 1: the stats and they don't have a larger role to play, 450 00:24:28,000 --> 00:24:31,119 Speaker 1: so they might well focus more on other people in 451 00:24:31,200 --> 00:24:33,679 Speaker 1: the meeting. They are going to be the 452 00:24:33,680 --> 00:24:36,400 Speaker 1: ones that are going to be more likely to notice 453 00:24:36,440 --> 00:24:39,359 Speaker 1: what you say or do. That totally makes sense 454 00:24:39,359 --> 00:24:42,840 Speaker 1: to me. I think I have much better recall 455 00:24:42,920 --> 00:24:45,879 Speaker 1: of meetings where I am not expected to speak. That 456 00:24:46,000 --> 00:24:47,879 Speaker 1: being said, and this is me, not the 457 00:24:48,200 --> 00:24:52,440 Speaker 1: authors here, but I suspect that the passive observers are 458 00:24:52,480 --> 00:24:56,800 Speaker 1: also far more likely to be thinking about Star Wars, 459 00:24:56,920 --> 00:24:58,879 Speaker 1: or what they need to buy at the grocery 460 00:24:58,880 --> 00:25:02,920 Speaker 1: store later, that sort of thing. Or supernatural biker movies.
Yeah, or here's 461 00:25:02,920 --> 00:25:04,840 Speaker 1: a big one, and we didn't even get into this, 462 00:25:04,960 --> 00:25:09,400 Speaker 1: but via the zoom call, one has a tremendous 463 00:25:09,440 --> 00:25:12,919 Speaker 1: ability to just simply go to other websites during the 464 00:25:13,000 --> 00:25:17,159 Speaker 1: call and still look basically attentive, right, because you'd just 465 00:25:17,200 --> 00:25:20,920 Speaker 1: be looking at the screen either way. Yeah, there's your excuse, folks, 466 00:25:21,320 --> 00:25:24,560 Speaker 1: digital hooky. Has anybody tried just putting up, like, 467 00:25:24,680 --> 00:25:27,600 Speaker 1: I know you can insert backgrounds on these 468 00:25:27,680 --> 00:25:29,959 Speaker 1: video calls, putting up a background that has a photo 469 00:25:30,040 --> 00:25:32,440 Speaker 1: of them in it so it looks like they're sitting there? 470 00:25:32,840 --> 00:25:35,480 Speaker 1: I bet somebody out there has gotten a 471 00:25:35,520 --> 00:25:37,600 Speaker 1: little bit clever and figured out a way to make 472 00:25:37,600 --> 00:25:39,359 Speaker 1: it happen. It would be kind of 473 00:25:39,400 --> 00:25:41,960 Speaker 1: the equivalent of... But didn't Homer Simpson have some glasses 474 00:25:42,000 --> 00:25:44,440 Speaker 1: at one point that made him look like he was awake? Yes, 475 00:25:44,480 --> 00:25:47,280 Speaker 1: it's when he's on a jury and he's 476 00:25:47,320 --> 00:25:50,760 Speaker 1: expected to be paying attention, but he is sleeping. That's right, 477 00:25:51,119 --> 00:25:54,880 Speaker 1: that's right. I remember that, wide awake glasses, and one 478 00:25:54,880 --> 00:25:57,960 Speaker 1: of the other jurors narcs on him. But yeah, 479 00:25:57,960 --> 00:26:00,919 Speaker 1: so anyway, the effect here, I think, is pretty straightforward.
480 00:26:00,960 --> 00:26:03,520 Speaker 1: If an action stands out in your own mind for 481 00:26:03,560 --> 00:26:06,119 Speaker 1: whatever reason, you're going to end up thinking it was 482 00:26:06,200 --> 00:26:10,280 Speaker 1: more important in some objective sense than it actually was. 483 00:26:10,920 --> 00:26:14,240 Speaker 1: And so, in other words, if people overestimate the relevance 484 00:26:14,280 --> 00:26:17,119 Speaker 1: of their own actions in an objective sense, wouldn't they 485 00:26:17,160 --> 00:26:24,080 Speaker 1: also overestimate how relevant their actions are subjectively to other people? Yeah? Yeah. 486 00:26:24,359 --> 00:26:27,840 Speaker 1: The authors also point out that in many cases it might 487 00:26:27,880 --> 00:26:31,199 Speaker 1: not matter; it may be, quote, overlooked when joint endeavors do 488 00:26:31,240 --> 00:26:35,639 Speaker 1: not require explicit allocations of responsibility. But obviously sometimes this 489 00:26:35,760 --> 00:26:38,840 Speaker 1: is not the case. Yeah. It particularly makes me think 490 00:26:38,880 --> 00:26:43,120 Speaker 1: of a frequent trope you see in films, the villainous meetings, 491 00:26:43,160 --> 00:26:47,119 Speaker 1: when you have villains around a table generally having a meeting, 492 00:26:47,600 --> 00:26:51,320 Speaker 1: having this sort of, you know, dark, more antagonistic version 493 00:26:51,880 --> 00:26:57,600 Speaker 1: of our regular real life business meetings.
Um, the meetings 494 00:26:57,600 --> 00:27:00,360 Speaker 1: of SPECTRE in the early James Bond films. Yeah, where 495 00:27:00,359 --> 00:27:03,960 Speaker 1: Blofeld would have the command console, like, electrocute 496 00:27:03,960 --> 00:27:07,240 Speaker 1: somebody's chair. Yeah, or another favorite of mine, the 497 00:27:07,680 --> 00:27:10,919 Speaker 1: Imperial meetings you see in, like, Star Wars: 498 00:27:10,920 --> 00:27:13,320 Speaker 1: A New Hope, or also in Rogue One, where 499 00:27:13,359 --> 00:27:16,359 Speaker 1: we have the likes of Darth Vader and Grand Moff 500 00:27:16,480 --> 00:27:19,960 Speaker 1: Tarkin or Orson Krennic. You know, they're 501 00:27:20,000 --> 00:27:23,199 Speaker 1: all objectively talking about, okay, we need 502 00:27:23,200 --> 00:27:24,920 Speaker 1: to get the Death Star up and running. But these 503 00:27:24,920 --> 00:27:28,359 Speaker 1: are all highly egotistical and self focused individuals, and they 504 00:27:28,440 --> 00:27:31,680 Speaker 1: each seem pretty focused on their own key role 505 00:27:31,800 --> 00:27:35,080 Speaker 1: in everything, and they're certainly not about, like, elevating the 506 00:27:35,080 --> 00:27:39,480 Speaker 1: project itself above personal ambition. Yeah. Yeah, they're clearly, like, 507 00:27:39,560 --> 00:27:41,560 Speaker 1: trying to stick up for their own branch. It's like, 508 00:27:41,600 --> 00:27:44,560 Speaker 1: you know, dangerous to your star fleet, Commander, not to 509 00:27:44,720 --> 00:27:48,720 Speaker 1: my battle station. Yeah, okay. A few more previously observed 510 00:27:48,880 --> 00:27:51,800 Speaker 1: psychological phenomena that the authors call attention to as 511 00:27:51,960 --> 00:27:54,439 Speaker 1: potentially backing up the idea of a spotlight effect. 512 00:27:54,440 --> 00:27:58,199 Speaker 1: Another one is what's known as naive realism.
They write, 513 00:27:58,320 --> 00:28:01,320 Speaker 1: quote: naive realism refers to the common tendency 514 00:28:01,359 --> 00:28:04,320 Speaker 1: to assume that one's perception of an object or event 515 00:28:04,720 --> 00:28:08,560 Speaker 1: is an accurate reflection of its objective properties, not a 516 00:28:08,600 --> 00:28:13,560 Speaker 1: subjective interpretation or construal. In other words: look, it happened 517 00:28:13,600 --> 00:28:16,240 Speaker 1: just like I saw it. It's the tendency to believe 518 00:28:16,320 --> 00:28:19,960 Speaker 1: that your perception is unbiased and accurate, even though you 519 00:28:20,040 --> 00:28:23,760 Speaker 1: might readily attribute, you know, mistakes and biases to other 520 00:28:23,760 --> 00:28:27,239 Speaker 1: people's perceptions. Yeah, and this is all tied up in 521 00:28:27,359 --> 00:28:31,320 Speaker 1: the philosophy of perception. So when we're talking about 522 00:28:31,440 --> 00:28:35,760 Speaker 1: naive realism, also known as direct realism, that stands in 523 00:28:35,800 --> 00:28:40,800 Speaker 1: opposition to indirect or representational realism. So direct or naive 524 00:28:40,880 --> 00:28:44,000 Speaker 1: realism holds that we perceive things in the world directly, 525 00:28:45,000 --> 00:28:48,960 Speaker 1: without the mediation of any impression, idea, 526 00:28:49,400 --> 00:28:52,840 Speaker 1: or representation. And I think we can generally agree, especially 527 00:28:52,880 --> 00:28:55,080 Speaker 1: on this show, that this is not the true nature 528 00:28:55,120 --> 00:28:58,000 Speaker 1: of how a human processes reality. No, the things 529 00:28:58,000 --> 00:29:00,680 Speaker 1: you see are based on the external world, but it's 530 00:29:00,760 --> 00:29:05,000 Speaker 1: not an unbiased direct representation of the external world. Right. Like, 531 00:29:05,120 --> 00:29:07,480 Speaker 1: there's a weight to things, you know.
It's like 532 00:29:07,560 --> 00:29:11,400 Speaker 1: if, you know, if yesterday somebody slapped me with a fish, 533 00:29:11,600 --> 00:29:15,040 Speaker 1: today I see a fish and, like, the 534 00:29:15,360 --> 00:29:17,719 Speaker 1: nature of my perception is going to be augmented by 535 00:29:17,760 --> 00:29:21,760 Speaker 1: that previous experience. Right. Now, indirect realism adheres to the 536 00:29:21,800 --> 00:29:26,800 Speaker 1: idea that material objects do have mind independent existence, but 537 00:29:26,800 --> 00:29:30,000 Speaker 1: not that our visual perception is unmediated, or that 538 00:29:30,120 --> 00:29:33,480 Speaker 1: these objects necessarily possess all of the features that we 539 00:29:33,600 --> 00:29:36,720 Speaker 1: perceive them to have. A quick example of that 540 00:29:36,760 --> 00:29:39,520 Speaker 1: would be, obviously, we look at our beloved pets, and 541 00:29:39,560 --> 00:29:41,959 Speaker 1: we may, you know, perceive them to have 542 00:29:42,680 --> 00:29:46,480 Speaker 1: various nuances that they simply do not have. And, you know, 543 00:29:46,920 --> 00:29:49,520 Speaker 1: as pet owners, we're generally okay with that. Yeah, I 544 00:29:49,560 --> 00:29:52,320 Speaker 1: think that's exactly right. And indirect realism, I 545 00:29:52,360 --> 00:29:54,160 Speaker 1: don't know, to me that is the model of the 546 00:29:54,200 --> 00:29:55,800 Speaker 1: world that makes the most sense. Like, I would not 547 00:29:55,840 --> 00:29:59,120 Speaker 1: say I'm an idealist who believes the external world doesn't exist. 548 00:29:59,360 --> 00:30:02,360 Speaker 1: I don't go for that. But obviously our ideas 549 00:30:02,400 --> 00:30:05,680 Speaker 1: about what is motivating our dog or something might be 550 00:30:06,160 --> 00:30:10,080 Speaker 1: more us than actually coming from the dog. Yeah.
Now, 551 00:30:10,080 --> 00:30:13,600 Speaker 1: of course, there's also phenomenalism, which generally rejects the mind 552 00:30:13,600 --> 00:30:17,720 Speaker 1: independent existence of material objects but accepts unmediated visual 553 00:30:17,760 --> 00:30:22,400 Speaker 1: perception and the possession of perceived features. So 554 00:30:22,520 --> 00:30:25,600 Speaker 1: things are not things as much as they are bundles 555 00:30:25,640 --> 00:30:28,640 Speaker 1: of sense data, which is a weird way to 556 00:30:28,840 --> 00:30:33,120 Speaker 1: behold your pet. Yeah, that's getting almost into kind 557 00:30:33,160 --> 00:30:35,880 Speaker 1: of George Berkeley territory that I 558 00:30:35,880 --> 00:30:38,080 Speaker 1: don't think I can fully go for, but it 559 00:30:38,160 --> 00:30:40,640 Speaker 1: ultimately doesn't really play into what we're talking about here. Again, 560 00:30:40,680 --> 00:30:45,080 Speaker 1: we're talking about direct realism, or naive realism, versus 561 00:30:45,080 --> 00:30:49,320 Speaker 1: indirect or representational realism. Right. And I think we do 562 00:30:49,400 --> 00:30:53,360 Speaker 1: have a tendency to really underappreciate how much our perceptions 563 00:30:53,360 --> 00:30:56,120 Speaker 1: are affected by the kinds of mistakes and distortions that 564 00:30:56,160 --> 00:30:59,600 Speaker 1: we readily attribute to other people. And so the authors 565 00:30:59,600 --> 00:31:02,120 Speaker 1: of the two thousand paper write that, quote, applied to 566 00:31:02,160 --> 00:31:05,120 Speaker 1: the spotlight effect,
this implies that it might be easy 567 00:31:05,240 --> 00:31:08,920 Speaker 1: to confuse how salient something is to oneself with how 568 00:31:08,960 --> 00:31:12,880 Speaker 1: salient it is to others. Precisely because our own behavior 569 00:31:13,040 --> 00:31:15,560 Speaker 1: stands out in our own minds, it can be hard 570 00:31:15,600 --> 00:31:18,560 Speaker 1: to discern how well, or even whether, it is picked 571 00:31:18,640 --> 00:31:22,280 Speaker 1: up by others. Absolutely, we may be attending the same 572 00:31:22,320 --> 00:31:25,280 Speaker 1: meeting or Zoom conference call, but we are not all 573 00:31:25,320 --> 00:31:27,680 Speaker 1: attending the same meeting or Zoom conference call, you know 574 00:31:27,720 --> 00:31:30,280 Speaker 1: what I mean. We all have different perceptions of 575 00:31:30,360 --> 00:31:33,880 Speaker 1: it based on our own biases, our own histories, our 576 00:31:33,920 --> 00:31:36,880 Speaker 1: own pervasive thoughts, our, you know, cognitive model of 577 00:31:36,960 --> 00:31:39,320 Speaker 1: the task at hand and our role in it. I mean, 578 00:31:39,320 --> 00:31:41,680 Speaker 1: our subjective understanding of the meeting is going to 579 00:31:41,760 --> 00:31:45,680 Speaker 1: differ person to person. Exactly. So leading into the next 580 00:31:45,720 --> 00:31:48,280 Speaker 1: thing here that the authors point out as possibly pointing 581 00:31:48,280 --> 00:31:50,840 Speaker 1: to a spotlight effect. This is something that has been 582 00:31:50,840 --> 00:31:55,160 Speaker 1: documented, known as the self-as-target bias. Quick example: 583 00:31:55,440 --> 00:31:57,800 Speaker 1: you're in a classroom, the teacher gives a pop 584 00:31:57,880 --> 00:32:01,960 Speaker 1: quiz about last night's reading.
Johnny interprets this quiz as 585 00:32:02,000 --> 00:32:05,760 Speaker 1: an attack on him personally, because he believes that the 586 00:32:05,840 --> 00:32:09,720 Speaker 1: teacher must believe that he didn't do the reading. And, 587 00:32:10,200 --> 00:32:12,320 Speaker 1: you know, I think even the best of us sometimes 588 00:32:12,400 --> 00:32:15,080 Speaker 1: fall prey to thinking like this. Something that is 589 00:32:15,120 --> 00:32:18,800 Speaker 1: a general sort of action applied to everyone, and we think, 590 00:32:18,880 --> 00:32:23,600 Speaker 1: why are they doing this to me? Yeah, yeah, especially 591 00:32:23,640 --> 00:32:25,240 Speaker 1: if you have, like, a build up 592 00:32:25,360 --> 00:32:28,760 Speaker 1: of anticipation about a given, you know, meeting 593 00:32:28,960 --> 00:32:32,520 Speaker 1: or social scenario. Yeah. And so the authors write, 594 00:32:32,640 --> 00:32:35,840 Speaker 1: quote: like naive realism, the self-as-target bias 595 00:32:35,880 --> 00:32:39,760 Speaker 1: reflects a confusion between what is available to oneself and 596 00:32:39,800 --> 00:32:42,479 Speaker 1: what is likely to be available to, and hence guide 597 00:32:42,520 --> 00:32:45,800 Speaker 1: the actions of, others. So again, Johnny might think, well, 598 00:32:46,240 --> 00:32:48,640 Speaker 1: the teacher knows I didn't do the reading, and that's 599 00:32:48,680 --> 00:32:51,200 Speaker 1: why she's giving the test today, or she's giving the 600 00:32:51,200 --> 00:32:54,320 Speaker 1: pop quiz today. But the teacher doesn't know that. It's 601 00:32:54,400 --> 00:32:57,600 Speaker 1: just, you know, that's what he knows.
And then finally, 602 00:32:57,600 --> 00:33:00,760 Speaker 1: the authors point out that these previously documented egocentric 603 00:33:00,800 --> 00:33:03,720 Speaker 1: biases are very similar to the kind of egocentrism that 604 00:33:03,800 --> 00:33:09,000 Speaker 1: Jean Piaget observed pervading the cognition of young children early 605 00:33:09,080 --> 00:33:11,760 Speaker 1: in their development. One of the more important parts of 606 00:33:11,800 --> 00:33:15,040 Speaker 1: growing up, in fact, is shedding some of that egocentrism. 607 00:33:15,240 --> 00:33:17,640 Speaker 1: But it turns out we don't shed it all. It 608 00:33:17,760 --> 00:33:21,720 Speaker 1: still appears in adults, simply in diminished form. It's certainly 609 00:33:21,760 --> 00:33:26,360 Speaker 1: more diminished in some people than it is in us. Yeah, 610 00:33:26,800 --> 00:33:29,880 Speaker 1: but yeah. So young children, I mean, people with 611 00:33:29,880 --> 00:33:32,240 Speaker 1: kids will probably recognize this, often seem not to 612 00:33:32,360 --> 00:33:36,200 Speaker 1: grasp that other people have a different perspective than they do. 613 00:33:36,400 --> 00:33:38,600 Speaker 1: This happens when you're very young, and gradually, as you 614 00:33:38,640 --> 00:33:43,000 Speaker 1: get older, you get more consistent about 615 00:33:43,040 --> 00:33:46,160 Speaker 1: being able to accurately sort of model the minds of others, 616 00:33:46,240 --> 00:33:50,640 Speaker 1: understand that they have different desires, different perspectives than you do. 617 00:33:51,080 --> 00:33:53,640 Speaker 1: And adults, of course, by the time of adulthood usually 618 00:33:53,640 --> 00:33:56,960 Speaker 1: recognize this gap rationally, but still might have a hard 619 00:33:57,000 --> 00:34:01,040 Speaker 1: time sort of calibrating to predict it accurately.
Now, 620 00:34:01,080 --> 00:34:03,560 Speaker 1: the authors of this paper here ask, what is 621 00:34:03,600 --> 00:34:07,240 Speaker 1: the method that we use to calibrate this prediction? 622 00:34:08,120 --> 00:34:11,879 Speaker 1: They say that it's probably based on anchoring and adjustment. Now, 623 00:34:11,920 --> 00:34:14,720 Speaker 1: I was reading some follow up work by Gilovich about 624 00:34:14,840 --> 00:34:19,960 Speaker 1: the anchoring and adjustment controversy. Very brief refresher on anchoring: 625 00:34:20,000 --> 00:34:22,359 Speaker 1: we've done an episode about this in the past. So, 626 00:34:22,440 --> 00:34:24,560 Speaker 1: like, when you're trying to come up with an answer 627 00:34:24,600 --> 00:34:27,520 Speaker 1: to a question like how much is this car worth? 628 00:34:27,760 --> 00:34:30,840 Speaker 1: Or what do people think of the speech I just made? 629 00:34:31,560 --> 00:34:35,280 Speaker 1: You don't necessarily reason toward an answer from a neutral 630 00:34:35,400 --> 00:34:39,520 Speaker 1: starting point. We often tend to be influenced by sort 631 00:34:39,520 --> 00:34:43,160 Speaker 1: of like data points or hypothetical answers that we can 632 00:34:43,239 --> 00:34:45,759 Speaker 1: kind of hang a hat on to begin with. Which 633 00:34:45,840 --> 00:34:47,640 Speaker 1: might be one reason that you've got a, you know, 634 00:34:47,840 --> 00:34:50,439 Speaker 1: price written on the windshield of a car, 635 00:34:50,560 --> 00:34:52,560 Speaker 1: even if that's not the price you would actually end 636 00:34:52,640 --> 00:34:55,560 Speaker 1: up paying. Now, the anchoring and 637 00:34:55,640 --> 00:34:58,279 Speaker 1: adjustment model was what the authors were working with at 638 00:34:58,280 --> 00:35:01,200 Speaker 1: the time.
That's the idea: we often think by starting 639 00:35:01,200 --> 00:35:03,920 Speaker 1: with an anchor, and then we just adjust our estimate 640 00:35:04,040 --> 00:35:06,880 Speaker 1: up or down from the anchor. I was reading some 641 00:35:06,960 --> 00:35:11,040 Speaker 1: follow up work by Gilovich about the adjustment controversy, like, 642 00:35:11,200 --> 00:35:13,640 Speaker 1: is this really the way we think? Is this really 643 00:35:13,640 --> 00:35:16,560 Speaker 1: how we get to our anchor-biased answers? Is it 644 00:35:16,600 --> 00:35:20,160 Speaker 1: based on this adjustment process? Apparently that idea has come 645 00:35:20,239 --> 00:35:22,960 Speaker 1: under some criticism in the past few decades, and there 646 00:35:22,960 --> 00:35:26,080 Speaker 1: are arguments about how to best understand what's happening in 647 00:35:26,080 --> 00:35:28,680 Speaker 1: people's heads when they fall for the anchoring bias. We're 648 00:35:28,719 --> 00:35:30,759 Speaker 1: not going to get into the weeds of that argument here. 649 00:35:30,840 --> 00:35:32,920 Speaker 1: You can check out our full episode on the anchoring 650 00:35:32,960 --> 00:35:35,880 Speaker 1: bias for more depth.
But whatever the role of 651 00:35:35,920 --> 00:35:39,680 Speaker 1: an adjustment mechanism in the brain, the anchoring effect does 652 00:35:39,760 --> 00:35:43,200 Speaker 1: actually appear in many scenarios, and the authors in this 653 00:35:43,280 --> 00:35:47,839 Speaker 1: paper are saying that the anchoring effect manifests in how 654 00:35:47,920 --> 00:35:51,640 Speaker 1: we imagine the opinions of other people about us, because 655 00:35:51,680 --> 00:35:54,880 Speaker 1: our anchor, our starting point, is how we feel 656 00:35:54,920 --> 00:35:58,279 Speaker 1: about ourselves, the stuff we notice about ourselves, and then 657 00:35:58,320 --> 00:36:01,600 Speaker 1: we kind of reason from there to what other people's 658 00:36:01,600 --> 00:36:04,600 Speaker 1: opinions would be. Well, that makes sense, again coming back 659 00:36:04,640 --> 00:36:07,000 Speaker 1: to the idea that we're using theory of mind to 660 00:36:07,280 --> 00:36:11,239 Speaker 1: ultimately create simulations about the mind states of everyone in 661 00:36:11,280 --> 00:36:14,879 Speaker 1: our lives, from, you know, the person we're closest with 662 00:36:15,320 --> 00:36:18,000 Speaker 1: to people that are just, you know, like supervisors or 663 00:36:18,040 --> 00:36:21,480 Speaker 1: complete strangers. And all of that is 664 00:36:21,520 --> 00:36:24,480 Speaker 1: constructed with ourselves at the middle. Like, our model of 665 00:36:24,520 --> 00:36:29,080 Speaker 1: ourselves is ultimately, I guess you 666 00:36:29,080 --> 00:36:31,600 Speaker 1: would say, the support structure on which this entire 667 00:36:31,680 --> 00:36:34,680 Speaker 1: network is built, right? It's kind of like you can't 668 00:36:34,800 --> 00:36:38,560 Speaker 1: build any bridges to ideas of other minds without starting 669 00:36:38,600 --> 00:36:42,120 Speaker 1: from the foundation of your own.
And that foundation of 670 00:36:42,120 --> 00:36:43,719 Speaker 1: your own is going to come with a lot of 671 00:36:43,800 --> 00:36:48,520 Speaker 1: baggage, like knowledge about yourself that other people don't have, 672 00:36:48,800 --> 00:36:52,600 Speaker 1: and high levels of concern about your personal attributes that 673 00:36:52,960 --> 00:36:56,160 Speaker 1: other people might not share, probably don't share. And that's 674 00:36:56,200 --> 00:36:58,080 Speaker 1: everybody, we should say, to drive that home. Like, we're 675 00:36:58,080 --> 00:37:01,320 Speaker 1: not just talking about, say, a stereotypically 676 00:37:01,600 --> 00:37:04,840 Speaker 1: egocentric person, you know, or someone who has 677 00:37:05,040 --> 00:37:09,120 Speaker 1: very obvious and pronounced personality flaws or anything like that, 678 00:37:09,480 --> 00:37:12,760 Speaker 1: or dealing with various mental health concerns or anything 679 00:37:12,800 --> 00:37:16,200 Speaker 1: of that nature. But ultimately this naive perception is 680 00:37:16,320 --> 00:37:19,960 Speaker 1: self-perception as well. Yeah, yeah. So I guess here 681 00:37:19,960 --> 00:37:22,560 Speaker 1: we get to the actual empirical part of this research, 682 00:37:22,760 --> 00:37:24,680 Speaker 1: like how would you study this? How would you look 683 00:37:24,719 --> 00:37:28,600 Speaker 1: for empirical evidence of a spotlight effect? And there are 684 00:37:28,600 --> 00:37:30,680 Speaker 1: a number of studies that are covered in this paper. 685 00:37:30,719 --> 00:37:33,560 Speaker 1: I'm going to discuss them in sequence and in very broad strokes. 686 00:37:33,719 --> 00:37:36,000 Speaker 1: So, let me guess, you know: what's going 687 00:37:36,080 --> 00:37:39,680 Speaker 1: to be our key implement in this study?
688 00:37:39,719 --> 00:37:41,719 Speaker 1: Is it going to be like a god helmet that 689 00:37:41,920 --> 00:37:44,479 Speaker 1: scans my brain? Is it going to be, 690 00:37:44,560 --> 00:37:46,440 Speaker 1: you know, some other kind of like high-tech device 691 00:37:46,480 --> 00:37:48,799 Speaker 1: that I'm hooking my nervous system up to? You're 692 00:37:48,800 --> 00:37:51,680 Speaker 1: extremely close. No, we get into the cybernetics of a 693 00:37:51,800 --> 00:37:56,879 Speaker 1: Barry Manilow T-shirt. So the question is something maybe 694 00:37:56,880 --> 00:37:59,320 Speaker 1: a lot of you have wondered before. Do people usually 695 00:37:59,400 --> 00:38:02,400 Speaker 1: notice what's on your T-shirt? Or do they just 696 00:38:02,480 --> 00:38:05,480 Speaker 1: not even care? It's a great question. I know when 697 00:38:05,480 --> 00:38:08,360 Speaker 1: I wear a T-shirt, I have certainly 698 00:38:08,400 --> 00:38:11,839 Speaker 1: caught myself, especially at least in retrospect, thinking way too 699 00:38:11,880 --> 00:38:14,879 Speaker 1: much about how others will perceive this shirt design, yeah, 700 00:38:15,120 --> 00:38:17,160 Speaker 1: and what it's saying about me and my interests, what 701 00:38:17,239 --> 00:38:20,320 Speaker 1: it's broadcasting to the world. 702 00:38:20,440 --> 00:38:22,040 Speaker 1: But at the same time, I feel like I'm always 703 00:38:22,160 --> 00:38:25,120 Speaker 1: very interested in what other people's shirts say, to the 704 00:38:25,160 --> 00:38:28,480 Speaker 1: point that I sometimes feel self-conscious about trying to 705 00:38:28,560 --> 00:38:30,279 Speaker 1: understand what someone's shirt is, because I'm like, I 706 00:38:30,280 --> 00:38:33,280 Speaker 1: don't want to be, like, caught staring at somebody's 707 00:38:33,320 --> 00:38:37,200 Speaker 1: T-shirt.
I think you probably notice certain kinds of 708 00:38:37,280 --> 00:38:39,520 Speaker 1: shirts more than others. Like, you might pick up on 709 00:38:39,600 --> 00:38:42,120 Speaker 1: cues that, like, oh, this is a band T-shirt, 710 00:38:42,160 --> 00:38:44,680 Speaker 1: and I'm usually kind of interested in band T-shirts, 711 00:38:44,680 --> 00:38:46,760 Speaker 1: so you want to see what this is. But like other things, 712 00:38:46,800 --> 00:38:50,240 Speaker 1: if it's, I'd imagine, like a football team or something, 713 00:38:50,280 --> 00:38:52,839 Speaker 1: you might not even take notice. That's true. I guess 714 00:38:53,040 --> 00:38:55,560 Speaker 1: the shirts I'm usually interested in are, I guess you 715 00:38:55,560 --> 00:38:57,200 Speaker 1: know, to a certain extent band shirts. But if it 716 00:38:57,239 --> 00:38:59,160 Speaker 1: has any kind of like monster-type thing on it, 717 00:38:59,719 --> 00:39:02,080 Speaker 1: then I definitely want to know what's going on. That's right, 718 00:39:02,360 --> 00:39:05,560 Speaker 1: that's our brains. That's just our brains being our brains. 719 00:39:05,680 --> 00:39:08,360 Speaker 1: So, extremely simple setup for the study, 720 00:39:08,680 --> 00:39:10,919 Speaker 1: dead simple. You get a group of participants to gather 721 00:39:10,960 --> 00:39:12,879 Speaker 1: in a room and you have them basically in there, 722 00:39:12,880 --> 00:39:16,520 Speaker 1: like, filling out questionnaires for an experiment that is supposedly 723 00:39:16,520 --> 00:39:21,279 Speaker 1: about memory. And then another participant joins that group late, 724 00:39:21,440 --> 00:39:24,680 Speaker 1: but before they go into the room, you require them 725 00:39:24,719 --> 00:39:28,000 Speaker 1: to put on a Barry Manilow T-shirt.
Then, 726 00:39:28,080 --> 00:39:30,279 Speaker 1: after they've been in the room for a very brief time, 727 00:39:30,360 --> 00:39:33,160 Speaker 1: you say, actually, this group has already gotten started, so 728 00:39:33,160 --> 00:39:35,759 Speaker 1: we're gonna hold you back for another session. And 729 00:39:35,800 --> 00:39:39,200 Speaker 1: then you have the Barry Manilow interloper leave the room, 730 00:39:39,239 --> 00:39:42,560 Speaker 1: and then you ask everybody. Okay, you ask the interloper, 731 00:39:42,680 --> 00:39:45,279 Speaker 1: the Barry Manilow T-shirt wearer, how many people 732 00:39:45,320 --> 00:39:47,759 Speaker 1: in the room do you think noticed that you were 733 00:39:47,800 --> 00:39:50,680 Speaker 1: wearing a Barry Manilow T-shirt? And then you 734 00:39:50,760 --> 00:39:53,600 Speaker 1: ask the people who were in the room if they 735 00:39:53,640 --> 00:39:56,000 Speaker 1: noticed who was on the T-shirt. Right, you're just, 736 00:39:56,400 --> 00:39:59,839 Speaker 1: very simply, comparing the person's expectation of how many people 737 00:40:00,000 --> 00:40:03,840 Speaker 1: noticed to how many people actually noticed. And true to prediction, 738 00:40:03,960 --> 00:40:07,000 Speaker 1: the students who wore the T-shirt tended to wildly 739 00:40:07,160 --> 00:40:10,160 Speaker 1: overestimate how many people in the room would notice and 740 00:40:10,200 --> 00:40:13,880 Speaker 1: be able to identify their Manilow T-shirt. Generally, the 741 00:40:13,920 --> 00:40:18,080 Speaker 1: Manilow interloper guessed that about half of the other students 742 00:40:18,120 --> 00:40:21,000 Speaker 1: on average would be able to identify their T-shirt, 743 00:40:21,080 --> 00:40:24,400 Speaker 1: and in reality only about twenty-five percent of the 744 00:40:24,440 --> 00:40:27,680 Speaker 1: observers could do it.
So in their minds, the people 745 00:40:27,880 --> 00:40:33,080 Speaker 1: wearing this potentially conspicuous piece of clothing mentally doubled the 746 00:40:33,120 --> 00:40:36,560 Speaker 1: percentage of people who they thought would notice it. Real quick, Joe, 747 00:40:37,440 --> 00:40:38,880 Speaker 1: in the study you were looking at here, 748 00:40:39,080 --> 00:40:41,640 Speaker 1: did you get to see this Manilow T-shirt? No, 749 00:40:41,760 --> 00:40:45,480 Speaker 1: I didn't. That's my big question, because I'm currently looking 750 00:40:45,560 --> 00:40:50,080 Speaker 1: at various Barry Manilow T-shirts in an image search here, 751 00:40:50,560 --> 00:40:53,640 Speaker 1: and they do run the gamut. We 752 00:40:53,680 --> 00:40:57,560 Speaker 1: have some very forgettable Manilow shirts, but 753 00:40:57,600 --> 00:41:03,799 Speaker 1: we have some real zingers here. Yeah, well, 754 00:41:03,880 --> 00:41:06,359 Speaker 1: I think so. I could be wrong, but I think 755 00:41:06,400 --> 00:41:08,600 Speaker 1: what it was was it was just like a picture 756 00:41:08,719 --> 00:41:11,880 Speaker 1: of his face. Okay, well, even then, it's a 757 00:41:11,960 --> 00:41:14,400 Speaker 1: noticeable face. I mean, you know, that 758 00:41:14,560 --> 00:41:19,400 Speaker 1: was part of the whole business proposal here. But 759 00:41:19,440 --> 00:41:22,040 Speaker 1: there were also control groups for this study, and 760 00:41:22,160 --> 00:41:26,280 Speaker 1: this was an interesting calibration.
The control 761 00:41:26,360 --> 00:41:30,239 Speaker 1: groups were not in the room, but instead they watched 762 00:41:30,320 --> 00:41:33,960 Speaker 1: the entire scene play out on a video recording, and 763 00:41:34,000 --> 00:41:37,440 Speaker 1: then they were asked to estimate the number of observers 764 00:41:37,520 --> 00:41:40,640 Speaker 1: who would notice the T-shirt. And the control groups 765 00:41:40,760 --> 00:41:44,120 Speaker 1: guessed much closer to the real number of people who 766 00:41:44,120 --> 00:41:47,440 Speaker 1: would actually notice it, and they did not overestimate to 767 00:41:47,480 --> 00:41:50,520 Speaker 1: the extent that the person wearing the shirt did. And 768 00:41:50,600 --> 00:41:53,880 Speaker 1: this was taken as evidence that, quote, the targets' inflated 769 00:41:54,040 --> 00:41:58,080 Speaker 1: estimates are not simply the result of misguided general theories 770 00:41:58,120 --> 00:42:02,120 Speaker 1: about observers' powers of observation. In other words, the relevant 771 00:42:02,200 --> 00:42:05,880 Speaker 1: variable is: I am the person wearing it. But that 772 00:42:05,880 --> 00:42:08,600 Speaker 1: makes sense. Again, we are the central 773 00:42:08,680 --> 00:42:12,320 Speaker 1: character in our own narrative. Okay, second study in this paper. 774 00:42:12,680 --> 00:42:15,680 Speaker 1: It's worth noting that the majority of the 775 00:42:15,719 --> 00:42:18,759 Speaker 1: students that they interviewed in the first study reported, in 776 00:42:18,880 --> 00:42:23,000 Speaker 1: fact, that wearing a Barry Manilow T-shirt was considered embarrassing, 777 00:42:23,080 --> 00:42:26,400 Speaker 1: that Barry Manilow was considered kind of corny and uncool. 778 00:42:26,760 --> 00:42:28,879 Speaker 1: And it does make me wonder, has Barry Manilow come 779 00:42:28,920 --> 00:42:31,600 Speaker 1: full circle yet? Has he become cool again?
I don't know, 780 00:42:31,719 --> 00:42:33,200 Speaker 1: some of those shirts I was just looking at look 781 00:42:33,239 --> 00:42:35,560 Speaker 1: pretty cool. Yeah. I think that was part of the 782 00:42:35,600 --> 00:42:37,400 Speaker 1: design of the first study, that this is a 783 00:42:37,400 --> 00:42:39,960 Speaker 1: figure that not everybody, but a lot of people wearing 784 00:42:40,000 --> 00:42:42,759 Speaker 1: the shirt, would find embarrassing. You know, it's not just like any face, 785 00:42:42,840 --> 00:42:45,239 Speaker 1: it's somebody who a lot of students would probably feel 786 00:42:45,280 --> 00:42:48,279 Speaker 1: kind of embarrassed to be wearing a shirt of. But 787 00:42:48,480 --> 00:42:52,120 Speaker 1: the question is, like, does this phenomenon hold for 788 00:42:52,239 --> 00:42:54,239 Speaker 1: T-shirts that would not be embarrassing, that would just be 789 00:42:54,280 --> 00:42:57,480 Speaker 1: a picture of anybody, maybe anybody that the student liked? 790 00:42:57,840 --> 00:43:00,480 Speaker 1: So the second study tested for the spotlight effect with 791 00:43:00,520 --> 00:43:05,120 Speaker 1: reference to non-embarrassing personal details. It replicated the design 792 00:43:05,120 --> 00:43:08,040 Speaker 1: of the first study, but it allowed students to choose 793 00:43:08,080 --> 00:43:10,920 Speaker 1: a T-shirt featuring a person that they liked and 794 00:43:11,040 --> 00:43:13,719 Speaker 1: viewed as not embarrassing. So it might be a 795 00:43:13,840 --> 00:43:16,840 Speaker 1: T-shirt of, like, Bob Marley or Jerry Seinfeld or something. 796 00:43:17,280 --> 00:43:20,520 Speaker 1: Wait, it was Jerry Seinfeld? Really? Was 797 00:43:20,560 --> 00:43:23,240 Speaker 1: he specifically mentioned? Yeah, yeah, that was one of them. 798 00:43:23,320 --> 00:43:26,520 Speaker 1: How is that Jerry Seinfeld shirt not embarrassing? I don't know.
799 00:43:26,800 --> 00:43:30,560 Speaker 1: Some people didn't think it was. You know, times change; 800 00:43:30,600 --> 00:43:34,880 Speaker 1: this was, what, the year two thousand? Something like that. Yeah, the 801 00:43:34,880 --> 00:43:37,480 Speaker 1: Bob Marley shirt, I think, remains cool. But 802 00:43:37,760 --> 00:43:39,919 Speaker 1: I just have questions about the Jerry Seinfeld shirt. 803 00:43:39,960 --> 00:43:43,040 Speaker 1: Maybe this again just says some more 804 00:43:43,040 --> 00:43:47,120 Speaker 1: about my interests versus other people's interests. But maybe I 805 00:43:47,160 --> 00:43:50,200 Speaker 1: am uncool for not wearing one. No, no, no, you're 806 00:43:50,320 --> 00:43:54,440 Speaker 1: very cool, Robert. But again, there was a huge mismatch, 807 00:43:54,520 --> 00:43:56,839 Speaker 1: right? Even when you're wearing a shirt that's not 808 00:43:56,960 --> 00:44:01,280 Speaker 1: conspicuously embarrassing to a number of students, people just predicted 809 00:44:01,320 --> 00:44:03,960 Speaker 1: that observers would notice who was on their shirt a 810 00:44:04,000 --> 00:44:07,040 Speaker 1: lot more than the observers actually did. It just makes 811 00:44:07,040 --> 00:44:09,919 Speaker 1: me think of myself or anyone, you know. You've 812 00:44:09,960 --> 00:44:12,600 Speaker 1: got that new shirt and you're like, is today the day? 813 00:44:13,520 --> 00:44:15,640 Speaker 1: Is today the day that I wear this 814 00:44:15,680 --> 00:44:19,360 Speaker 1: new shirt and unleash this bad boy on an unsuspecting world? 815 00:44:19,560 --> 00:44:23,759 Speaker 1: Is the world ready? And yes, the thing is, yeah, 816 00:44:23,760 --> 00:44:26,759 Speaker 1: they're ready, and yeah, they'll be fine. Also, like, 817 00:44:26,800 --> 00:44:31,200 Speaker 1: nothing really to worry about. Yeah, so okay. 818 00:44:31,200 --> 00:44:34,640 Speaker 1: But that's appearance.
That's just clothing items. What about behavior? 819 00:44:34,719 --> 00:44:37,440 Speaker 1: Can we look for examples of this in behavior? So 820 00:44:37,560 --> 00:44:41,200 Speaker 1: the third study tested for whether the spotlight effect exists 821 00:44:41,239 --> 00:44:45,240 Speaker 1: not just for clothing, but specifically for stuff people say 822 00:44:45,320 --> 00:44:48,520 Speaker 1: in a group setting. Quote: in particular, we sought to 823 00:44:48,520 --> 00:44:51,239 Speaker 1: investigate whether people tend to believe that their positive and 824 00:44:51,280 --> 00:44:55,440 Speaker 1: negative actions stand out to others more than they actually do. 825 00:44:56,400 --> 00:45:00,120 Speaker 1: And this was tested with staged discussion groups. So they 826 00:45:00,120 --> 00:45:02,359 Speaker 1: would have a discussion group meeting, and then they would 827 00:45:02,400 --> 00:45:06,760 Speaker 1: ask people afterwards to rate other participants on both positive 828 00:45:06,760 --> 00:45:09,680 Speaker 1: and negative dimensions of their contributions. So you'd rate all 829 00:45:09,680 --> 00:45:12,400 Speaker 1: the people you just had a group with, and you'd 830 00:45:12,400 --> 00:45:16,560 Speaker 1: say how much did participant X do to advance the discussion? 831 00:45:16,640 --> 00:45:19,560 Speaker 1: That'd be a positive mark, and then negative things would 832 00:45:19,600 --> 00:45:23,040 Speaker 1: be like how many speech errors did participant X make, 833 00:45:23,239 --> 00:45:27,440 Speaker 1: or how likely was participant X to offend someone? And 834 00:45:27,600 --> 00:45:30,399 Speaker 1: participants also rated, of course, what they thought others would 835 00:45:30,440 --> 00:45:32,319 Speaker 1: think of them. You know, then they turned it on 836 00:45:32,360 --> 00:45:35,440 Speaker 1: themselves. And they found the same thing.
The study indicated 837 00:45:35,480 --> 00:45:38,480 Speaker 1: that we tend to overestimate the salience of our behavior 838 00:45:38,520 --> 00:45:43,359 Speaker 1: to others in both positive and negative ways. So it's 839 00:45:43,400 --> 00:45:46,560 Speaker 1: not just, like, a self-serving bias or self-critical bias. 840 00:45:46,600 --> 00:45:50,160 Speaker 1: It's just we tend to assume people are paying way 841 00:45:50,160 --> 00:45:54,680 Speaker 1: more attention and noticing way more about the stuff we do, 842 00:45:54,800 --> 00:45:57,640 Speaker 1: both good and bad. Which makes me think of, like, 843 00:45:57,680 --> 00:46:00,440 Speaker 1: the Hellraiser tagline: it's like, angels to some, devils 844 00:46:00,480 --> 00:46:02,560 Speaker 1: to others. But really, maybe you think, am I an 845 00:46:02,600 --> 00:46:04,719 Speaker 1: angel or a devil? And in fact you're just kind 846 00:46:04,719 --> 00:46:07,480 Speaker 1: of a gray blur that someone does not recall in 847 00:46:07,520 --> 00:46:13,720 Speaker 1: any way. Now, an important thing that's worth pointing 848 00:46:13,719 --> 00:46:19,000 Speaker 1: out here is that people's self-ratings on this discussion 849 00:46:19,000 --> 00:46:22,840 Speaker 1: group thing were not entirely divorced from reality. To the contrary, 850 00:46:22,840 --> 00:46:25,800 Speaker 1: the study found that these ratings were in some ways 851 00:46:25,960 --> 00:46:30,200 Speaker 1: based in reality. People who rated themselves as doing more 852 00:46:30,280 --> 00:46:33,799 Speaker 1: to advance the discussion were also on average rated by 853 00:46:33,840 --> 00:46:37,200 Speaker 1: others as doing more to advance the discussion.
And people 854 00:46:37,239 --> 00:46:40,400 Speaker 1: who rated themselves as more likely to have said something 855 00:46:40,440 --> 00:46:43,520 Speaker 1: that offended someone were in fact more likely to have 856 00:46:43,560 --> 00:46:47,120 Speaker 1: said something that offended someone. But it's the size of 857 00:46:47,160 --> 00:46:50,719 Speaker 1: these effects, both positive and negative, that was exaggerated when 858 00:46:50,719 --> 00:46:54,880 Speaker 1: people were thinking about themselves. So in self-evaluation, an 859 00:46:55,040 --> 00:46:59,320 Speaker 1: insightful comment that might have actually been an insightful comment, 860 00:46:59,400 --> 00:47:01,800 Speaker 1: to you it feels like, wow, that was earth-shaking, 861 00:47:01,840 --> 00:47:05,120 Speaker 1: I really changed the game. Or a faux pas that 862 00:47:05,200 --> 00:47:08,400 Speaker 1: other people might notice but, you know, doesn't really stand 863 00:47:08,400 --> 00:47:11,040 Speaker 1: out to them all that much, to you it might become 864 00:47:11,200 --> 00:47:16,040 Speaker 1: reputation-ruining, this terror, this obsession. Yeah, that's interesting. And 865 00:47:16,400 --> 00:47:19,520 Speaker 1: again, I can't help but think about social media, 866 00:47:19,600 --> 00:47:22,680 Speaker 1: when you have systems that are set up so that 867 00:47:22,920 --> 00:47:26,800 Speaker 1: comments that are insightful may seem more earth-shaking because 868 00:47:26,880 --> 00:47:30,120 Speaker 1: they are being, you know, re-shared or retweeted or 869 00:47:30,520 --> 00:47:33,800 Speaker 1: liked, etcetera. And then likewise, there is 870 00:47:33,840 --> 00:47:36,440 Speaker 1: the negative reaction to these things as well. And of 871 00:47:36,480 --> 00:47:39,600 Speaker 1: course those tend to be the extremes that we 872 00:47:39,600 --> 00:47:42,840 Speaker 1: hear about in a digital setting.
Yeah, 873 00:47:42,920 --> 00:47:45,120 Speaker 1: but even then, I think the reality is that, like, 874 00:47:45,160 --> 00:47:47,480 Speaker 1: most people are not paying attention to you and won't 875 00:47:47,480 --> 00:47:52,120 Speaker 1: remember anything you did. Right, right. It's just humbling in 876 00:47:52,160 --> 00:47:55,520 Speaker 1: a kind of nice way. Yeah. Okay, a couple more studies, 877 00:47:55,520 --> 00:47:57,520 Speaker 1: real quick. The fourth one 878 00:47:57,680 --> 00:48:01,360 Speaker 1: recreated the early T-shirt scenario, but then asked participants 879 00:48:01,440 --> 00:48:04,120 Speaker 1: questions to probe how they came up with their estimates. 880 00:48:04,360 --> 00:48:06,680 Speaker 1: This is how the authors were trying to test whether 881 00:48:06,760 --> 00:48:10,000 Speaker 1: or not it was an anchoring and adjustment mental process 882 00:48:10,080 --> 00:48:13,920 Speaker 1: that people were using to get to their mistaken 883 00:48:13,960 --> 00:48:18,120 Speaker 1: assumptions about other people's views of them. So 884 00:48:18,239 --> 00:48:20,839 Speaker 1: their model; again, it could turn out 885 00:48:20,880 --> 00:48:22,640 Speaker 1: that this is not the best way to model the 886 00:48:22,680 --> 00:48:24,400 Speaker 1: thinking going on here. But what they thought at the 887 00:48:24,440 --> 00:48:27,759 Speaker 1: time was that people start with their own rich and 888 00:48:27,880 --> 00:48:31,360 Speaker 1: powerful sense of how they appear to others. They realize, 889 00:48:31,440 --> 00:48:34,360 Speaker 1: correctly, that other people are not paying as much attention 890 00:48:34,440 --> 00:48:36,880 Speaker 1: to them as they pay to themselves. So they maybe 891 00:48:36,880 --> 00:48:41,000 Speaker 1: adjust down from their own experience to a hypothetical 892 00:48:41,040 --> 00:48:44,719 Speaker 1: other observer, but they don't adjust enough.
So, you know, 893 00:48:44,800 --> 00:48:47,920 Speaker 1: how important was what I just did? Well, to me, 894 00:48:48,000 --> 00:48:50,320 Speaker 1: it was a ten. But I know other people probably 895 00:48:50,320 --> 00:48:52,160 Speaker 1: aren't going to rate it a ten, so I'll say 896 00:48:52,200 --> 00:48:55,840 Speaker 1: it's a nine to them, but in reality it was maybe 897 00:48:55,840 --> 00:48:58,600 Speaker 1: like a six. Yeah, or a four. And then the 898 00:48:58,719 --> 00:49:01,000 Speaker 1: last of the studies, the fifth one. 899 00:49:01,120 --> 00:49:03,759 Speaker 1: This one, I think, established something that's very important that 900 00:49:03,800 --> 00:49:07,800 Speaker 1: we can come back to in a minute. This tested habituation. 901 00:49:08,400 --> 00:49:11,600 Speaker 1: If people are allowed a period to get used to 902 00:49:11,719 --> 00:49:15,560 Speaker 1: wearing the unfamiliar Barry Manilow T-shirt, will they feel 903 00:49:15,920 --> 00:49:19,400 Speaker 1: less self-conscious about others noticing it? And the answer 904 00:49:19,480 --> 00:49:21,400 Speaker 1: is yes. If you wear the T-shirt for a 905 00:49:21,440 --> 00:49:24,239 Speaker 1: while before going in front of other people, you will 906 00:49:24,239 --> 00:49:27,040 Speaker 1: tend to imagine that fewer of them took notice of it 907 00:49:27,120 --> 00:49:29,239 Speaker 1: than if you just put it on and then go in. 908 00:49:29,560 --> 00:49:32,560 Speaker 1: But of course, in these scenarios, absolutely nothing has changed 909 00:49:32,560 --> 00:49:35,160 Speaker 1: for the observers. The only thing that has changed is you. 910 00:49:35,560 --> 00:49:38,480 Speaker 1: The more you're used to the shirt yourself, 911 00:49:38,520 --> 00:49:42,040 Speaker 1: the less conscious you are of it, so you imagine less consciousness 912 00:49:42,120 --> 00:49:44,920 Speaker 1: among others.
So, like, I don't 913 00:49:44,960 --> 00:49:46,960 Speaker 1: know if anyone has ever had this experience: being, you know, 914 00:49:47,080 --> 00:49:50,960 Speaker 1: somebody who wears a well-worn but offensive T-shirt 915 00:49:51,680 --> 00:49:55,360 Speaker 1: in an inappropriate setting. Like, you know, they're used to it. 916 00:49:55,680 --> 00:49:59,720 Speaker 1: They're used to the potentially profane statement that is on it. 917 00:50:00,280 --> 00:50:03,120 Speaker 1: Everyone else, man, might not be ready for it. Yes, 918 00:50:03,200 --> 00:50:05,560 Speaker 1: I think this is actually a very astute observation, and 919 00:50:05,560 --> 00:50:07,560 Speaker 1: it will come back to something I want to get 920 00:50:07,560 --> 00:50:10,040 Speaker 1: at right at the end here. Should we take another 921 00:50:10,080 --> 00:50:12,240 Speaker 1: break, and then when we come back we can discuss 922 00:50:12,320 --> 00:50:14,960 Speaker 1: some of the implications of this research? Let's do it. 923 00:50:15,000 --> 00:50:20,279 Speaker 1: We'll be right back. Thank you. All right, we're back. 924 00:50:20,440 --> 00:50:25,160 Speaker 1: We're continuing our discussion here of the spotlight effect 925 00:50:25,239 --> 00:50:27,960 Speaker 1: and, of course, the various T-shirt experiments, 926 00:50:28,600 --> 00:50:33,520 Speaker 1: things that relate to that explanation. Yeah, so I wanted 927 00:50:33,560 --> 00:50:37,439 Speaker 1: to talk about some commentary on, and implications from, 928 00:50:37,480 --> 00:50:40,480 Speaker 1: the spotlight effect, to whatever extent this is a real effect. 929 00:50:40,480 --> 00:50:42,480 Speaker 1: It does appear like it still stands. I mean, I 930 00:50:42,480 --> 00:50:46,759 Speaker 1: would be interested to see some more recent research replicating this, 931 00:50:46,800 --> 00:50:49,480 Speaker 1: but it looks pretty solid to me.
One of 932 00:50:49,520 --> 00:50:52,160 Speaker 1: the things that the lead author of the study we've 933 00:50:52,160 --> 00:50:56,600 Speaker 1: been talking about, Gilovich, has noted in another source: 934 00:50:56,640 --> 00:50:59,840 Speaker 1: he actually had an article about the spotlight effect for 935 00:51:00,239 --> 00:51:04,120 Speaker 1: the Encyclopedia of Social Psychology, edited by Baumeister and Vohs. 936 00:51:04,920 --> 00:51:08,800 Speaker 1: And in that article, Gilovich notes that research indicates that, 937 00:51:08,880 --> 00:51:12,240 Speaker 1: quote, people of all ages are prone to the spotlight effect, 938 00:51:12,280 --> 00:51:15,759 Speaker 1: but it appears to be particularly pronounced among adolescents and 939 00:51:15,840 --> 00:51:20,680 Speaker 1: young adults. So as you get older, the spotlight effect 940 00:51:20,760 --> 00:51:24,399 Speaker 1: seems to work less powerfully on your brain. What would 941 00:51:24,440 --> 00:51:27,160 Speaker 1: explain this? Well, one answer might be experience, right? 942 00:51:27,280 --> 00:51:30,640 Speaker 1: Over time, you just learn through experience that people pay 943 00:51:30,719 --> 00:51:33,880 Speaker 1: less attention to you and notice less about you than 944 00:51:33,960 --> 00:51:37,000 Speaker 1: you expect them to. And it's possible this does play 945 00:51:37,040 --> 00:51:39,040 Speaker 1: some role. Maybe you get conditioned, you kind of learn 946 00:51:39,120 --> 00:51:42,360 Speaker 1: how things work in life, and you experience less of 947 00:51:42,400 --> 00:51:46,399 Speaker 1: this cognitive bias. But Gilovich identifies a different reason, and 948 00:51:46,440 --> 00:51:51,280 Speaker 1: that reason is that social motivation is stronger when you're younger. 949 00:51:51,920 --> 00:51:56,400 Speaker 1: Younger people show a heightened consciousness of, and concern for, 950 00:51:56,760 --> 00:52:01,040 Speaker 1: their standing within social groups. Quote:
But having a heightened 951 00:52:01,040 --> 00:52:04,400 Speaker 1: concern with one's social standing means, by its very nature, 952 00:52:04,800 --> 00:52:08,000 Speaker 1: that one is vulnerable to having an excessive concern with 953 00:52:08,080 --> 00:52:11,800 Speaker 1: one's standing and hence is likely to overestimate the extent 954 00:52:11,880 --> 00:52:15,280 Speaker 1: to which one is the target of others' thoughts and attention. 955 00:52:15,760 --> 00:52:18,200 Speaker 1: So I'd say the takeaway from this is maybe a 956 00:52:18,239 --> 00:52:21,799 Speaker 1: special message to, like, younger and teenage listeners: 957 00:52:22,400 --> 00:52:25,680 Speaker 1: other people really probably are noticing less about you and 958 00:52:25,719 --> 00:52:28,640 Speaker 1: thinking less about you than you think they are. As shocking 959 00:52:28,760 --> 00:52:33,160 Speaker 1: as that may be to hear. Another thing related 960 00:52:33,239 --> 00:52:35,840 Speaker 1: to this idea that the authors mention in 961 00:52:35,880 --> 00:52:39,239 Speaker 1: the discussion section of their paper is the way that 962 00:52:39,280 --> 00:52:42,360 Speaker 1: the spotlight effect relates to something that's known as the 963 00:52:42,400 --> 00:52:46,759 Speaker 1: illusion of transparency. So the illusion of transparency is the 964 00:52:46,800 --> 00:52:51,400 Speaker 1: belief that your internal states are more observable to others 965 00:52:51,440 --> 00:52:55,440 Speaker 1: than they actually are. We often assume that our unspoken 966 00:52:55,560 --> 00:52:59,680 Speaker 1: thoughts and our feelings can be sort of sniffed out 967 00:52:59,760 --> 00:53:03,320 Speaker 1: and discerned by people around us. But that's usually not true, 968 00:53:03,440 --> 00:53:05,560 Speaker 1: not to the extent that we think it is. 969 00:53:06,000 --> 00:53:09,400 Speaker 1: And there are examples of this from empirical research.
For example, 970 00:53:09,840 --> 00:53:13,719 Speaker 1: if you stage a mock negotiation where people are trying to, 971 00:53:13,840 --> 00:53:16,280 Speaker 1: you know, negotiate to get to a certain price on something, 972 00:53:16,840 --> 00:53:20,239 Speaker 1: people tend to imagine that they have given away more 973 00:53:20,360 --> 00:53:23,960 Speaker 1: information about what they're trying to get than they actually have. 974 00:53:24,560 --> 00:53:27,920 Speaker 1: Another variation is that studies show that a lot of 975 00:53:27,960 --> 00:53:31,560 Speaker 1: times people imagine that other people can tell when they 976 00:53:31,560 --> 00:53:36,640 Speaker 1: are lying, but in reality, people can't actually tell when 977 00:53:36,640 --> 00:53:39,240 Speaker 1: people are lying. Or at least, I mean, some people 978 00:53:39,280 --> 00:53:41,480 Speaker 1: maybe can tell some of the time, but most of 979 00:53:41,520 --> 00:53:44,880 Speaker 1: the time, other people cannot tell if you're lying, cannot 980 00:53:44,880 --> 00:53:48,520 Speaker 1: spot your lies with nearly as much accuracy as you 981 00:53:48,600 --> 00:53:51,640 Speaker 1: think they can. Do with that information what you will. 982 00:53:52,600 --> 00:53:55,719 Speaker 1: I really don't know. I think it's, it's like, it's a 983 00:53:55,760 --> 00:53:58,600 Speaker 1: great point, because first of all, we all lie. Like, 984 00:53:58,719 --> 00:54:03,239 Speaker 1: lying is part of our communication suite. You know, 985 00:54:03,880 --> 00:54:07,280 Speaker 1: individuals are gonna engage in it to varying degrees, 986 00:54:07,360 --> 00:54:09,439 Speaker 1: but you know, it is important to have that tool 987 00:54:09,480 --> 00:54:11,920 Speaker 1: in your toolbox.
You know, if someone shows you a 988 00:54:11,920 --> 00:54:15,560 Speaker 1: picture of a baby, and you're 989 00:54:15,560 --> 00:54:18,560 Speaker 1: expected to comment upon it, it is generally in your 990 00:54:18,600 --> 00:54:22,080 Speaker 1: best interest to lie if you think that baby is ugly, right, 991 00:54:22,520 --> 00:54:24,840 Speaker 1: or at least find some way to respond that 992 00:54:25,040 --> 00:54:29,520 Speaker 1: is not just a comedic adherence to truth, right? You can 993 00:54:29,600 --> 00:54:33,680 Speaker 1: find something nice to say that isn't necessarily untrue. Right. 994 00:54:34,000 --> 00:54:37,320 Speaker 1: And yet at the same time, lying can be, or 995 00:54:37,360 --> 00:54:41,080 Speaker 1: at least can feel like, a high risk act, right? 996 00:54:41,120 --> 00:54:43,200 Speaker 1: I mean, no one wants to be caught in a 997 00:54:43,280 --> 00:54:47,480 Speaker 1: lie, um, even if the stakes are ultimately kind of low. 998 00:54:47,520 --> 00:54:49,560 Speaker 1: I mean, I guess maybe even more so at times 999 00:54:49,600 --> 00:54:51,640 Speaker 1: if the stakes are low, because why are you lying 1000 00:54:51,680 --> 00:54:54,280 Speaker 1: about that? Well, like, why didn't you say you didn't 1001 00:54:54,320 --> 00:54:57,480 Speaker 1: like this picture of my baby? Or, I don't know, 1002 00:54:58,320 --> 00:55:00,799 Speaker 1: let me see, I can't think of a specific example, but okay, 1003 00:55:00,840 --> 00:55:03,960 Speaker 1: here's a potential example: if someone says, hey, have 1004 00:55:04,000 --> 00:55:05,879 Speaker 1: you seen Die Hard 2, and you're like, oh, yeah, 1005 00:55:05,880 --> 00:55:07,920 Speaker 1: it's pretty good, and maybe the thing is you 1006 00:55:08,160 --> 00:55:10,319 Speaker 1: haven't seen it.
You have no interest in seeing it. Maybe 1007 00:55:10,320 --> 00:55:12,360 Speaker 1: you think that the whole concept sounds kind of stupid, 1008 00:55:12,760 --> 00:55:14,520 Speaker 1: but you want to be polite about it. And you 1009 00:55:14,560 --> 00:55:16,800 Speaker 1: also don't want the plot 1010 00:55:16,840 --> 00:55:18,840 Speaker 1: to have to be explained to you now, you know, 1011 00:55:19,200 --> 00:55:20,839 Speaker 1: in case you can see it in the theater. You also don't 1012 00:55:20,840 --> 00:55:24,040 Speaker 1: want to have your friend Ron spoil it for you. 1013 00:55:24,440 --> 00:55:25,880 Speaker 1: But then if they're like, oh, yeah, what was your 1014 00:55:25,920 --> 00:55:29,239 Speaker 1: favorite part? Well, crap. Now this has become a much 1015 00:55:29,280 --> 00:55:34,160 Speaker 1: stickier situation, because I'm lying about having seen Die Hard 2. Yeah. 1016 00:55:34,200 --> 00:55:37,560 Speaker 1: But people tend to assume that, like, that fact that 1017 00:55:37,600 --> 00:55:40,560 Speaker 1: they're lying about having seen Die Hard 2 is somehow 1018 00:55:40,760 --> 00:55:44,080 Speaker 1: leaking out of them in an observable way. And in 1019 00:55:44,120 --> 00:55:46,200 Speaker 1: some cases it might be. Like, some people do have 1020 00:55:46,280 --> 00:55:49,640 Speaker 1: big tells when they're lying, but generally that information is 1021 00:55:49,680 --> 00:55:52,319 Speaker 1: not leaking out as much as people imagine it is.
1022 00:55:52,880 --> 00:55:55,000 Speaker 1: And I wonder if this is compounded to a certain 1023 00:55:55,000 --> 00:55:58,080 Speaker 1: extent by the lying we observe in media, lying that 1024 00:55:58,239 --> 00:56:01,759 Speaker 1: is either exposed via clips of relevant media, like, 1025 00:56:01,920 --> 00:56:05,120 Speaker 1: here's one scene of politicians saying one thing, and 1026 00:56:05,160 --> 00:56:08,080 Speaker 1: here's another bit of footage that shows that 1027 00:56:08,120 --> 00:56:12,480 Speaker 1: they're lying, or, more often, overt lying by a fictional character, 1028 00:56:12,760 --> 00:56:15,240 Speaker 1: which of course is played up for dramatic effect 1029 00:56:15,320 --> 00:56:18,000 Speaker 1: and is also an artificial situation, you know, in that 1030 00:56:18,120 --> 00:56:20,879 Speaker 1: we know they are lying to another character. Oh yeah. 1031 00:56:21,080 --> 00:56:25,960 Speaker 1: But it's also like, there's a stock type 1032 00:56:25,960 --> 00:56:29,239 Speaker 1: of hero in, like, detective fiction and all that: the 1033 00:56:29,280 --> 00:56:32,800 Speaker 1: person who can just magically tell when other people are lying, 1034 00:56:32,920 --> 00:56:38,120 Speaker 1: and that, that skill. Oh, there's a wonderful character in 1035 00:56:38,760 --> 00:56:44,040 Speaker 1: the recent Watchmen series on HBO, Looking Glass. Yes, played 1036 00:56:44,040 --> 00:56:46,800 Speaker 1: by the great Tim Blake Nelson. Has that power. The 1037 00:56:46,880 --> 00:56:50,560 Speaker 1: character has the power, not Nelson himself. Right, yeah. 1038 00:56:50,560 --> 00:56:52,760 Speaker 1: And we love characters like that, right? I mean, that's 1039 00:56:52,760 --> 00:56:55,760 Speaker 1: a really fun power to try to see realized in fiction. 1040 00:56:55,840 --> 00:56:59,040 Speaker 1: But, uh, lies are not as easy to sniff 1041 00:56:59,040 --> 00:57:02,400 Speaker 1: out as you might think. I mean,
To really detect a lie 1042 00:57:02,400 --> 00:57:04,680 Speaker 1: in reality, what you have to try to do is, 1043 00:57:04,719 --> 00:57:07,400 Speaker 1: like, trap people in contradictions and stuff, like ask a 1044 00:57:07,480 --> 00:57:10,480 Speaker 1: bunch of follow up questions. It doesn't just leak out 1045 00:57:10,480 --> 00:57:12,759 Speaker 1: of your face that, yes, I'm telling a lie and 1046 00:57:12,800 --> 00:57:15,600 Speaker 1: you can smell it. Absolutely. And you know, I also 1047 00:57:15,640 --> 00:57:18,880 Speaker 1: think about this in terms of religious upbringing. Um, I 1048 00:57:18,880 --> 00:57:20,680 Speaker 1: don't know about you, but growing up in the 1049 00:57:20,720 --> 00:57:25,240 Speaker 1: sort of Pentecostal teachings of a Protestant church, there 1050 00:57:25,520 --> 00:57:29,080 Speaker 1: was always this idea that God, and also the 1051 00:57:29,120 --> 00:57:31,920 Speaker 1: devil, and perhaps other entities like lesser angels and 1052 00:57:31,960 --> 00:57:35,360 Speaker 1: demons, whatever, were privy to your inner thoughts, you know, 1053 00:57:35,400 --> 00:57:38,200 Speaker 1: the whole idea that it wasn't just what you said 1054 00:57:38,200 --> 00:57:40,120 Speaker 1: and did that made you sinful. It was also what 1055 00:57:40,160 --> 00:57:42,720 Speaker 1: you were thinking about doing, or considering doing, or just 1056 00:57:43,160 --> 00:57:45,720 Speaker 1: entertaining the mental images of doing. So there was this 1057 00:57:45,880 --> 00:57:49,320 Speaker 1: ingrained notion that your private thoughts are not private at all, 1058 00:57:49,800 --> 00:57:53,240 Speaker 1: at least not so far as supernatural entities are concerned.
Yeah, 1059 00:57:53,280 --> 00:57:55,560 Speaker 1: that's right. And I guess it is possible that this 1060 00:57:55,560 --> 00:57:58,320 Speaker 1: could have a conditioning effect to make you assume that, 1061 00:57:58,400 --> 00:58:01,360 Speaker 1: in general, your private thoughts are not private. Maybe they're 1062 00:58:01,400 --> 00:58:05,840 Speaker 1: observable not just to supernatural entities, but to other regular 1063 00:58:05,960 --> 00:58:08,920 Speaker 1: entities that you interact with every day. Yeah, because I 1064 00:58:08,960 --> 00:58:12,479 Speaker 1: definitely remember at times, certainly when I was younger, sort 1065 00:58:12,520 --> 00:58:14,920 Speaker 1: of freaking out at times about just the idea of 1066 00:58:14,960 --> 00:58:18,080 Speaker 1: other humans being privy to my thoughts, you know, an 1067 00:58:18,120 --> 00:58:21,000 Speaker 1: idea that was probably also compounded by science fiction that's 1068 00:58:21,040 --> 00:58:25,800 Speaker 1: just lousy with psychics, right? Um. And also these not 1069 00:58:25,880 --> 00:58:30,960 Speaker 1: quite psychics, but just really insightful TV sleuths, the Hannibal 1070 00:58:31,040 --> 00:58:33,480 Speaker 1: Lecters, basically. They, like, look at you and tell 1071 00:58:33,520 --> 00:58:36,840 Speaker 1: your whole life story. Yeah. But then again, as 1072 00:58:36,880 --> 00:58:39,000 Speaker 1: we've discussed on the show before, this sort 1073 00:58:39,040 --> 00:58:42,080 Speaker 1: of fear is not entirely unfounded, given the potential trajectory 1074 00:58:42,120 --> 00:58:45,720 Speaker 1: of some of our technology. That's true, but that's technology. 1075 00:58:45,760 --> 00:58:49,480 Speaker 1: I mean, normally people are not doing, like, AI, you know, 1076 00:58:49,600 --> 00:58:52,480 Speaker 1: learning on data sets about your social media use or whatever.
1077 00:58:53,320 --> 00:58:55,480 Speaker 1: There was one more example given about the illusion of 1078 00:58:55,520 --> 00:58:59,920 Speaker 1: transparency that I really liked, which was that people overestimated 1079 00:59:00,040 --> 00:59:04,680 Speaker 1: the extent to which observers could tell whether the 1080 00:59:04,800 --> 00:59:09,720 Speaker 1: drink they were drinking was pleasant or nasty tasting, even 1081 00:59:09,760 --> 00:59:11,960 Speaker 1: though they were supposed to use a neutral facial expression. 1082 00:59:12,000 --> 00:59:14,720 Speaker 1: So you give people drinks, this one tastes good, 1083 00:59:14,800 --> 00:59:17,120 Speaker 1: this one tastes disgusting, and you tell them they have 1084 00:59:17,200 --> 00:59:20,400 Speaker 1: to maintain a neutral facial expression while they drink them. 1085 00:59:20,760 --> 00:59:22,960 Speaker 1: People assumed, oh yeah, people can just read it on 1086 00:59:23,000 --> 00:59:25,600 Speaker 1: my face that, you know, that that was a nasty one. 1087 00:59:25,760 --> 00:59:27,520 Speaker 1: But it turns out people can't read it all that well. 1088 00:59:28,680 --> 00:59:30,760 Speaker 1: Good to know when you have your next dinner party. 1089 00:59:31,200 --> 00:59:36,280 Speaker 1: Um, yeah, maybe it's gonna be the next thing after 1090 00:59:36,360 --> 00:59:39,120 Speaker 1: competitive eating. Right now it's the 1091 00:59:39,160 --> 00:59:42,440 Speaker 1: people who wolf down like thirty White Castles or whatever. 1092 00:59:42,680 --> 00:59:45,240 Speaker 1: The next thing is, how many nasty drinks can you drink? 1093 00:59:45,560 --> 00:59:48,640 Speaker 1: I can see it becoming a big hit. Okay, one 1094 00:59:48,760 --> 00:59:51,440 Speaker 1: last thing.
So the authors of this two thousand paper 1095 00:59:51,520 --> 00:59:56,320 Speaker 1: ask a question: when is the spotlight effect most pronounced 1096 00:59:56,560 --> 00:59:59,080 Speaker 1: and when is it least pronounced? Could there be such 1097 00:59:59,080 --> 01:00:02,360 Speaker 1: a thing as, like, a reverse spotlight effect, a sort 1098 01:00:02,400 --> 01:00:06,120 Speaker 1: of mental cloak of invisibility where other people are noticing 1099 01:00:06,160 --> 01:00:09,960 Speaker 1: you more than you think they are? And the authors think, yeah, 1100 01:00:10,040 --> 01:00:14,000 Speaker 1: this is probably possible. They claim that this would probably 1101 01:00:14,040 --> 01:00:18,960 Speaker 1: correlate with the subject's own consciousness of their appearance or behavior. 1102 01:00:19,520 --> 01:00:22,640 Speaker 1: So obviously, the more conscious you are of your own 1103 01:00:22,680 --> 01:00:26,000 Speaker 1: appearance and behavior, the more conscious of it you imagine 1104 01:00:26,040 --> 01:00:28,760 Speaker 1: other people are, and probably vice versa. If you're less 1105 01:00:28,800 --> 01:00:32,680 Speaker 1: conscious of yourself, you imagine other people are less conscious 1106 01:00:32,680 --> 01:00:35,560 Speaker 1: of you. And so for this reason, it might be 1107 01:00:35,640 --> 01:00:39,680 Speaker 1: correlated somewhat to the novelty of what you're doing or wearing, 1108 01:00:39,800 --> 01:00:42,240 Speaker 1: or what you look like or how you sound. So 1109 01:00:42,400 --> 01:00:45,360 Speaker 1: remember, in the fifth study in that paper, 1110 01:00:45,400 --> 01:00:48,400 Speaker 1: the spotlight effect was less pronounced for people who had 1111 01:00:48,440 --> 01:00:52,000 Speaker 1: some time to get used to wearing a potentially embarrassing, 1112 01:00:52,040 --> 01:00:55,960 Speaker 1: conspicuous T shirt.
So it's highly possible that we are 1113 01:00:56,080 --> 01:00:59,160 Speaker 1: most likely to manifest the spotlight effect when we're doing 1114 01:00:59,240 --> 01:01:02,560 Speaker 1: something new or unusual. Well, that's interesting. It kind of 1115 01:01:02,560 --> 01:01:05,480 Speaker 1: ties back to what we were talking about earlier, about when 1116 01:01:05,480 --> 01:01:07,480 Speaker 1: you're about to say something in a meeting and you're 1117 01:01:07,520 --> 01:01:10,400 Speaker 1: putting a lot of cognitive effort into preparing for that, 1118 01:01:10,760 --> 01:01:14,400 Speaker 1: and preparing to do something that you don't normally do. Yeah, exactly. 1119 01:01:14,720 --> 01:01:18,080 Speaker 1: It takes more effort, it takes up more space in your brain. 1120 01:01:18,360 --> 01:01:20,880 Speaker 1: It's more salient to you, and you assume it's more 1121 01:01:20,920 --> 01:01:24,280 Speaker 1: salient to other people. So it's possible. This isn't proven yet, 1122 01:01:24,320 --> 01:01:28,000 Speaker 1: but it's possible that the inverse effect, where we would 1123 01:01:28,080 --> 01:01:31,840 Speaker 1: underestimate how much other people are noticing our appearance and behavior, 1124 01:01:32,240 --> 01:01:36,200 Speaker 1: it's possible this happens when we are least conscious, meaning 1125 01:01:36,320 --> 01:01:42,840 Speaker 1: during highly familiar, routine or automatic behaviors. There's actually an 1126 01:01:42,840 --> 01:01:46,240 Speaker 1: example that has been studied here, and the example, 1127 01:01:46,480 --> 01:01:49,920 Speaker 1: I thought this was interesting. So people underestimate the 1128 01:01:50,000 --> 01:01:55,120 Speaker 1: extent to which other people notice their cologne or perfume. 1129 01:01:56,440 --> 01:01:59,840 Speaker 1: So you cover yourself in a fragrance, you become accustomed 1130 01:01:59,840 --> 01:02:03,000 Speaker 1: to that fragrance and you stop noticing it.
Right, olfactory 1131 01:02:03,040 --> 01:02:07,240 Speaker 1: desensitization sets in. You no longer smell it yourself, so 1132 01:02:07,280 --> 01:02:11,000 Speaker 1: it basically disappears for you, but other people smell it 1133 01:02:11,120 --> 01:02:14,440 Speaker 1: even if you don't expect them to. Yeah, yeah, I 1134 01:02:14,440 --> 01:02:18,280 Speaker 1: think we all have had that experience with someone 1135 01:02:18,280 --> 01:02:22,800 Speaker 1: who has just outrageously powerful perfume, you know, sometimes 1136 01:02:22,800 --> 01:02:26,840 Speaker 1: to the extent that it announces their presence. Yes, yes, sometimes people 1137 01:02:26,880 --> 01:02:29,880 Speaker 1: just lather up. And this makes me wonder about whether 1138 01:02:30,080 --> 01:02:35,040 Speaker 1: the spotlight effect is especially salient for appearance, because, of course, 1139 01:02:35,080 --> 01:02:39,200 Speaker 1: we normally can't really see ourselves when we're going about 1140 01:02:39,200 --> 01:02:41,960 Speaker 1: our lives. If we're in a regular business meeting talking, 1141 01:02:42,320 --> 01:02:44,240 Speaker 1: we can't see our face. We might be able to 1142 01:02:44,280 --> 01:02:46,000 Speaker 1: see our bodies if we look down, but 1143 01:02:46,040 --> 01:02:48,160 Speaker 1: we're probably not looking down. We're probably looking up at 1144 01:02:48,200 --> 01:02:53,400 Speaker 1: the room. But we're also frequently suddenly reminded of our 1145 01:02:53,440 --> 01:02:56,080 Speaker 1: appearance when we walk in front of a mirror or 1146 01:02:56,240 --> 01:02:59,000 Speaker 1: log into a web meeting or something.
So it might 1147 01:02:59,000 --> 01:03:02,360 Speaker 1: be the sort of perfect mix of obliviousness in your 1148 01:03:02,360 --> 01:03:07,280 Speaker 1: regular behaviors and then the sudden shocking reminders of, oh yeah, 1149 01:03:07,440 --> 01:03:10,440 Speaker 1: I look like this to external people, and that kind of 1150 01:03:10,520 --> 01:03:13,760 Speaker 1: keeps you on your toes. Like, what if after putting 1151 01:03:13,800 --> 01:03:18,120 Speaker 1: on some cologne, you could suddenly smell it intensely again 1152 01:03:18,160 --> 01:03:21,120 Speaker 1: every hour or so? Yeah, I mean, when 1153 01:03:21,200 --> 01:03:22,560 Speaker 1: you put it like that, it almost sounds like it 1154 01:03:22,600 --> 01:03:27,480 Speaker 1: would be helpful. But I don't feel like our experiences 1155 01:03:27,520 --> 01:03:30,960 Speaker 1: with our own footage in a Zoom call or what 1156 01:03:31,040 --> 01:03:33,800 Speaker 1: have you are necessarily helpful. It really feels like sort 1157 01:03:33,840 --> 01:03:37,400 Speaker 1: of built in egocentric feedback. Yeah, because there's too much 1158 01:03:37,400 --> 01:03:40,840 Speaker 1: of it. It's just constantly there. So anyway, if we 1159 01:03:40,920 --> 01:03:43,640 Speaker 1: assume that the spotlight effect is real, it is 1160 01:03:44,320 --> 01:03:47,200 Speaker 1: something that's generally true about people; it might not be true 1161 01:03:47,200 --> 01:03:49,480 Speaker 1: to the same extent for everyone. But if this effect 1162 01:03:49,560 --> 01:03:54,320 Speaker 1: is correctly observed, what would the implications for our lives be? Well, 1163 01:03:54,680 --> 01:03:57,680 Speaker 1: Gilovich has actually gotten kind of sweet about this, 1164 01:03:57,760 --> 01:04:00,640 Speaker 1: so, uh.
He notes that, you know, there are studies 1165 01:04:00,680 --> 01:04:03,720 Speaker 1: that show that later in life, most people report that 1166 01:04:03,840 --> 01:04:07,880 Speaker 1: their major regrets about their lives concern things that they 1167 01:04:07,920 --> 01:04:10,640 Speaker 1: failed to do, rather than things that they did. It's 1168 01:04:10,680 --> 01:04:12,520 Speaker 1: not the same for everybody, but that is a much 1169 01:04:12,520 --> 01:04:15,280 Speaker 1: more common framing, and you've probably read about this before. 1170 01:04:15,320 --> 01:04:18,960 Speaker 1: This is widely observed. So many of the things that 1171 01:04:19,000 --> 01:04:23,760 Speaker 1: people want to do but never do, they hold back 1172 01:04:23,840 --> 01:04:26,720 Speaker 1: from out of a sense of self consciousness or 1173 01:04:26,760 --> 01:04:30,200 Speaker 1: anxiety about how people are going to perceive us, you know, 1174 01:04:30,320 --> 01:04:33,520 Speaker 1: for doing these things. So one easy example might be 1175 01:04:33,600 --> 01:04:36,080 Speaker 1: that you failed to ever take up playing a musical 1176 01:04:36,120 --> 01:04:40,520 Speaker 1: instrument because you fear that other people will judge you 1177 01:04:40,640 --> 01:04:44,400 Speaker 1: as unskilled at playing it, especially at first. And so 1178 01:04:44,440 --> 01:04:47,040 Speaker 1: the research on the spotlight effect suggests that we are 1179 01:04:47,240 --> 01:04:52,440 Speaker 1: very likely to be overestimating, perhaps even grossly overestimating, 1180 01:04:52,840 --> 01:04:56,120 Speaker 1: how much people would even notice whatever it is that 1181 01:04:56,200 --> 01:04:59,360 Speaker 1: we're afraid of doing. And the authors of the study 1182 01:04:59,440 --> 01:05:02,000 Speaker 1: write, quote:
The lesson of this research, then, is that 1183 01:05:02,040 --> 01:05:05,400 Speaker 1: we might all have fewer regrets if we properly understood 1184 01:05:05,440 --> 01:05:11,439 Speaker 1: how much attention or inattention our actions actually draw from others. Yeah, 1185 01:05:11,480 --> 01:05:13,800 Speaker 1: it is kind of a sweet twist on it. 1186 01:05:13,920 --> 01:05:16,520 Speaker 1: You know, it's like saying, look, go for it, 1187 01:05:16,720 --> 01:05:19,440 Speaker 1: go live your dream, because nobody's really going 1188 01:05:19,440 --> 01:05:22,160 Speaker 1: to pay that much attention even when it falls flat. 1189 01:05:22,440 --> 01:05:26,240 Speaker 1: Dance like nobody's watching, because probably nobody is watching, or 1190 01:05:26,280 --> 01:05:29,160 Speaker 1: if they are watching, they might not even remember. I mean, 1191 01:05:29,200 --> 01:05:33,760 Speaker 1: it's just like, you're probably way over concerned about 1192 01:05:33,800 --> 01:05:38,000 Speaker 1: possible minor faux pas or looking weird or awkward. Yeah, 1193 01:05:38,320 --> 01:05:41,000 Speaker 1: like, even if they're watching you and 1194 01:05:41,040 --> 01:05:43,240 Speaker 1: they're thinking about it, they're probably thinking, oh, man, do 1195 01:05:43,440 --> 01:05:46,320 Speaker 1: I look like that when I dance by myself? What 1196 01:05:46,360 --> 01:05:49,960 Speaker 1: do I look like when I dance by myself? This, uh, 1197 01:05:50,280 --> 01:05:54,400 Speaker 1: this reminds me, us talking about situations where you realize 1198 01:05:54,440 --> 01:05:57,479 Speaker 1: that it may be perceived as weird by other people, 1199 01:05:57,520 --> 01:06:00,800 Speaker 1: or embarrassing by other people. So I've mentioned Star 1200 01:06:00,800 --> 01:06:04,200 Speaker 1: Wars like three times so far.
I'm mostly tracking the 1201 01:06:04,200 --> 01:06:05,800 Speaker 1: house here, and me and my son are super into 1202 01:06:05,880 --> 01:06:09,080 Speaker 1: Star Wars. He has a couple of lightsabers and he'll 1203 01:06:09,320 --> 01:06:11,280 Speaker 1: often ask me to go out to have 1204 01:06:11,560 --> 01:06:14,200 Speaker 1: a lightsaber battle with him, which is something we have 1205 01:06:14,280 --> 01:06:17,720 Speaker 1: to do outside, because otherwise we would destroy things in 1206 01:06:17,720 --> 01:06:19,320 Speaker 1: the house. And we have to do it in the 1207 01:06:19,320 --> 01:06:22,560 Speaker 1: front yard because the mosquitoes are too bad in the backyard. Um, 1208 01:06:22,720 --> 01:06:25,000 Speaker 1: so we'll have this fight in the front yard. People 1209 01:06:25,080 --> 01:06:27,440 Speaker 1: driving by will be able to see it, which generally, 1210 01:06:27,480 --> 01:06:29,600 Speaker 1: I imagine, they'll say, oh, well, there's a dad having 1211 01:06:29,640 --> 01:06:32,920 Speaker 1: a lightsaber battle with his son. That's great, the 1212 01:06:32,960 --> 01:06:36,360 Speaker 1: sweetest thing they'll see all day. But occasionally my son, 1213 01:06:36,440 --> 01:06:39,000 Speaker 1: who gets so into this, occasionally he'll have 1214 01:06:39,080 --> 01:06:40,880 Speaker 1: to run over to the side of the house to, 1215 01:06:41,160 --> 01:06:44,520 Speaker 1: like, fight a pretend droid or something, which leaves me 1216 01:06:44,640 --> 01:06:49,000 Speaker 1: in the front yard, apparently by myself, fighting pretend droids. 1217 01:06:49,440 --> 01:06:53,080 Speaker 1: And I realized when that happens, people may drive by 1218 01:06:53,120 --> 01:06:56,920 Speaker 1: and think that I have lost my mind, um, which, 1219 01:06:57,120 --> 01:06:59,760 Speaker 1: I don't know, I'm okay, I'm ultimately okay. I 1220 01:07:00,040 --> 01:07:02,360 Speaker 1: don't think you've got anything to worry about, man.
That's, that 1221 01:07:02,360 --> 01:07:05,000 Speaker 1: that's going to be the ray of sunshine in 1222 01:07:05,040 --> 01:07:08,760 Speaker 1: the day of so many people driving by. Seriously, 1223 01:07:08,760 --> 01:07:11,480 Speaker 1: if I was driving by and 1224 01:07:11,520 --> 01:07:13,680 Speaker 1: I saw some people having a lightsaber duel in their 1225 01:07:13,680 --> 01:07:16,440 Speaker 1: front yard, I would be like, you know, there's hope, 1226 01:07:16,880 --> 01:07:19,600 Speaker 1: a new hope. Maybe that's what I'm doing. I'm 1227 01:07:19,640 --> 01:07:22,240 Speaker 1: giving people hope, that they're like, I didn't realize I 1228 01:07:22,280 --> 01:07:23,800 Speaker 1: could do that as a grown up, that I could 1229 01:07:23,840 --> 01:07:27,200 Speaker 1: just get a lightsaber and start having 1230 01:07:27,240 --> 01:07:29,400 Speaker 1: pretend battles in my front yard. I'm gonna do it. 1231 01:07:29,480 --> 01:07:33,200 Speaker 1: That's gonna make this quarantine situation a lot easier. Along 1232 01:07:33,200 --> 01:07:35,920 Speaker 1: the same lines, I am extremely in favor of adults 1233 01:07:35,960 --> 01:07:40,240 Speaker 1: climbing trees. There's this bizarre idea that adults shouldn't climb trees, 1234 01:07:40,240 --> 01:07:43,360 Speaker 1: that climbing trees is for children. Why? Adults should climb trees 1235 01:07:43,400 --> 01:07:45,720 Speaker 1: all the time. I love a good climbing tree. It's 1236 01:07:45,720 --> 01:07:47,800 Speaker 1: a good skill to have. I see people in movies 1237 01:07:47,840 --> 01:07:49,560 Speaker 1: having to do it all the time to escape, like, 1238 01:07:49,640 --> 01:07:52,840 Speaker 1: you know, robots and monsters and whatnot. Yeah, so go 1239 01:07:52,960 --> 01:07:56,560 Speaker 1: for it. Yeah, those, like those Boston Dynamics dog robots, 1240 01:07:56,600 --> 01:07:58,240 Speaker 1: are coming for you. Where are you gonna go?
You 1241 01:07:58,280 --> 01:08:00,280 Speaker 1: gotta get up the oak. Like, suddenly you've got to 1242 01:08:00,280 --> 01:08:03,480 Speaker 1: climb a tree and you haven't been practicing for twenty or 1243 01:08:03,520 --> 01:08:07,040 Speaker 1: thirty years. Good luck. And then what if you have 1244 01:08:07,120 --> 01:08:10,040 Speaker 1: to fight it with a lightsaber? Also, you've got to 1245 01:08:10,160 --> 01:08:13,280 Speaker 1: keep these skills sharp, you know. These are the skills 1246 01:08:13,320 --> 01:08:15,920 Speaker 1: one needs to survive in the wasteland. Alright, well, 1247 01:08:15,920 --> 01:08:18,400 Speaker 1: we're gonna go ahead and close it out there. I 1248 01:08:18,479 --> 01:08:20,200 Speaker 1: think there's a lot of material in here 1249 01:08:20,200 --> 01:08:24,320 Speaker 1: for everyone to think about, and we of course await 1250 01:08:24,840 --> 01:08:27,680 Speaker 1: listener responses to this. How do you perceive the 1251 01:08:27,720 --> 01:08:30,280 Speaker 1: spotlight effect in your own life or in the lives 1252 01:08:30,280 --> 01:08:33,479 Speaker 1: of others? Has this forced you to rethink anything 1253 01:08:33,960 --> 01:08:35,760 Speaker 1: going on in the world around you, or how you 1254 01:08:35,800 --> 01:08:41,559 Speaker 1: indeed engage in your daily or weekly digital conference 1255 01:08:41,600 --> 01:08:44,240 Speaker 1: calls? In the meantime, if you'd like to 1256 01:08:44,320 --> 01:08:47,000 Speaker 1: check out other episodes of Stuff to Blow Your Mind, 1257 01:08:47,040 --> 01:08:50,280 Speaker 1: including that one we mentioned dealing 1258 01:08:50,280 --> 01:08:54,280 Speaker 1: with anchoring, you can find those wherever you get your podcasts.
1259 01:08:54,320 --> 01:08:57,160 Speaker 1: I will say this: you can certainly find our 1260 01:08:57,280 --> 01:08:59,680 Speaker 1: iHeartRadio listing by going to Stuff to Blow 1261 01:08:59,720 --> 01:09:02,639 Speaker 1: Your Mind dot com. And if you go there, you'll 1262 01:09:02,640 --> 01:09:05,280 Speaker 1: see a little part of the page that says show links. 1263 01:09:05,600 --> 01:09:07,559 Speaker 1: There is a store link there, and if you go 1264 01:09:07,640 --> 01:09:13,559 Speaker 1: there you will find T shirts that are, bringing it around. 1265 01:09:14,120 --> 01:09:16,559 Speaker 1: And some of them are cool. Uh, some of them 1266 01:09:16,560 --> 01:09:19,160 Speaker 1: I would personally be embarrassed to wear. You'll have to 1267 01:09:19,200 --> 01:09:21,960 Speaker 1: look at them and try to decide which design 1268 01:09:22,040 --> 01:09:24,360 Speaker 1: is which. But we charge every listener to buy one 1269 01:09:24,439 --> 01:09:30,160 Speaker 1: cool T shirt and one extremely embarrassing T shirt. Yes, 1270 01:09:30,320 --> 01:09:32,720 Speaker 1: well, we have both there, so go check them out 1271 01:09:32,960 --> 01:09:35,240 Speaker 1: if you so desire. Um, I don't know. None of 1272 01:09:35,280 --> 01:09:37,360 Speaker 1: them have Barry Manilow on the front, though, so you 1273 01:09:37,400 --> 01:09:39,360 Speaker 1: need to do a separate image search to see what 1274 01:09:39,400 --> 01:09:42,360 Speaker 1: I'm talking about there. Huge thanks as always to our 1275 01:09:42,439 --> 01:09:46,080 Speaker 1: excellent audio producer Seth Nicholas Johnson. If you would like 1276 01:09:46,120 --> 01:09:47,840 Speaker 1: to get in touch with us with feedback on this 1277 01:09:47,880 --> 01:09:50,400 Speaker 1: episode or any other, to suggest a topic for the future, 1278 01:09:50,720 --> 01:09:52,760 Speaker 1: or just to say hi.
You can email us at 1279 01:09:52,920 --> 01:09:55,360 Speaker 1: contact at Stuff to Blow Your Mind dot com. 1280 01:10:02,920 --> 01:10:05,400 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. 1281 01:10:05,760 --> 01:10:07,760 Speaker 1: For more podcasts from iHeartRadio, visit the 1282 01:10:07,800 --> 01:10:10,599 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen 1283 01:10:10,600 --> 01:10:23,400 Speaker 1: to your favorite shows.