1 00:00:00,240 --> 00:00:02,640 Speaker 1: Coming up on You Need Therapy. 2 00:00:02,759 --> 00:00:06,320 Speaker 2: We tend to internalize things as true even when they 3 00:00:06,320 --> 00:00:09,399 Speaker 2: are totally implausible, and we really, really know deep down 4 00:00:09,480 --> 00:00:12,720 Speaker 2: that they couldn't be true when we've heard them multiple times, 5 00:00:12,840 --> 00:00:16,759 Speaker 2: when they rhyme, when they're written in like legible, aesthetically 6 00:00:16,760 --> 00:00:19,560 Speaker 2: pleasing fonts. These are things that make us feel like 7 00:00:19,600 --> 00:00:22,400 Speaker 2: something is true because they activate processing fluency. 8 00:00:23,600 --> 00:00:26,800 Speaker 3: I started to realize that not being an expert isn't 9 00:00:26,840 --> 00:00:28,920 Speaker 3: a liability, it's a real gift. 10 00:00:29,520 --> 00:00:32,200 Speaker 2: If we don't know something about ourselves at this point 11 00:00:32,200 --> 00:00:34,839 Speaker 2: in our life, it's probably because it's uncomfortable to know. 12 00:00:35,600 --> 00:00:38,480 Speaker 4: If you can die before you die, then you can 13 00:00:38,520 --> 00:00:42,960 Speaker 4: really live. There's a wisdom at death's door. I thought 14 00:00:43,000 --> 00:00:45,839 Speaker 4: I was insane. Yeah, and I didn't know what to 15 00:00:45,840 --> 00:00:48,640 Speaker 4: do because there was no internet. I don't know, man, 16 00:00:48,840 --> 00:00:52,400 Speaker 4: I'm like, I feel like everything is hard. 17 00:00:54,040 --> 00:00:55,440 Speaker 1: Hey, y'all, my name is Kat. 18 00:00:55,840 --> 00:00:59,000 Speaker 3: I'm a human first and a licensed therapist second, and 19 00:00:59,120 --> 00:01:02,920 Speaker 3: right now I'm inviting you into conversations that I hope encourage 20 00:01:02,960 --> 00:01:07,880 Speaker 3: you to become more curious and less judgmental about yourself, others, 21 00:01:08,000 --> 00:01:13,280 Speaker 3: and the world around you.
Welcome to You Need Therapy. Hi, 22 00:01:13,400 --> 00:01:15,200 Speaker 3: guys, and welcome to a new episode of the You Need 23 00:01:15,240 --> 00:01:17,920 Speaker 3: Therapy podcast. My name is Kat, I am the host, 24 00:01:18,120 --> 00:01:21,760 Speaker 3: and quick reminder before we get into today's episode that 25 00:01:21,920 --> 00:01:25,039 Speaker 3: although I'm a therapist and this podcast is called You 26 00:01:25,080 --> 00:01:27,080 Speaker 3: Need Therapy, it does not serve as a replacement 27 00:01:27,120 --> 00:01:29,600 Speaker 3: for any actual mental health services. 28 00:01:29,640 --> 00:01:31,319 Speaker 1: However, we always hope that 29 00:01:31,480 --> 00:01:34,720 Speaker 3: it can be helpful wherever you are, and today is 30 00:01:34,720 --> 00:01:36,480 Speaker 3: one of those episodes that I think is going to 31 00:01:36,520 --> 00:01:40,399 Speaker 3: be helpful wherever you are. I am so excited for 32 00:01:40,440 --> 00:01:42,319 Speaker 3: you guys to listen to this conversation. I was so 33 00:01:42,400 --> 00:01:45,920 Speaker 3: excited to have this conversation because it is with somebody 34 00:01:45,920 --> 00:01:48,480 Speaker 3: that you probably have heard me mention a time or 35 00:01:48,480 --> 00:01:52,080 Speaker 3: ten on here. Her name is Amanda Montell and she 36 00:01:52,440 --> 00:01:55,560 Speaker 3: is the author of the book Cultish, which I have 37 00:01:56,080 --> 00:01:59,320 Speaker 3: talked about multiple times on this podcast. She is the 38 00:01:59,360 --> 00:02:02,920 Speaker 3: host of the podcast Sounds Like a Cult. She's also 39 00:02:02,960 --> 00:02:05,480 Speaker 3: coming out with a new podcast very soon. And she 40 00:02:05,600 --> 00:02:08,040 Speaker 3: is the author of the new book The Age of 41 00:02:08,160 --> 00:02:12,560 Speaker 3: Magical Overthinking. And oh my gosh, this book is so 42 00:02:12,800 --> 00:02:14,240 Speaker 3: full of so much.
43 00:02:14,360 --> 00:02:16,760 Speaker 1: I mean, it is equal 44 00:02:16,480 --> 00:02:20,720 Speaker 3: parts informational, educational, thought-provoking, and entertaining. She writes in 45 00:02:20,760 --> 00:02:25,399 Speaker 3: a way that really allows you to investigate what's going 46 00:02:25,400 --> 00:02:28,040 Speaker 3: on with yourself as you're investigating what's going on 47 00:02:28,120 --> 00:02:30,160 Speaker 3: with the world at the same time. And it was 48 00:02:30,320 --> 00:02:33,440 Speaker 3: such an interesting and fun book for me to go through, 49 00:02:33,600 --> 00:02:35,640 Speaker 3: and I'm going to continue to go through it because 50 00:02:35,680 --> 00:02:37,959 Speaker 3: I feel like I can reread this one about fifteen 51 00:02:38,000 --> 00:02:40,519 Speaker 3: times and gain something new every time. But she's just 52 00:02:40,560 --> 00:02:43,919 Speaker 3: a wealth of knowledge and information. And we also talk 53 00:02:44,200 --> 00:02:44,840 Speaker 3: together, at 54 00:02:44,720 --> 00:02:45,239 Speaker 1: the same time, 55 00:02:45,360 --> 00:02:47,760 Speaker 3: about how even when we are a wealth of knowledge and information, 56 00:02:47,840 --> 00:02:50,480 Speaker 3: we have our limits and it's important to note that. 57 00:02:50,560 --> 00:02:52,600 Speaker 3: But she's wonderful. I'm so excited for you to hear 58 00:02:52,639 --> 00:02:56,160 Speaker 3: this conversation and for you to learn some of the 59 00:02:56,160 --> 00:02:58,959 Speaker 3: stuff that I have been learning from her while reading 60 00:02:59,000 --> 00:03:03,880 Speaker 3: this book. Without any more of me fangirling over this 61 00:03:04,080 --> 00:03:05,560 Speaker 3: author and our guest today, 62 00:03:05,639 --> 00:03:08,440 Speaker 1: here is my conversation with Amanda Montell.
63 00:03:10,120 --> 00:03:14,480 Speaker 3: So to start, I thought it would be good for 64 00:03:14,600 --> 00:03:17,400 Speaker 3: some of our listeners just to hear who you are, 65 00:03:17,440 --> 00:03:20,240 Speaker 3: which most of them know who you are because I 66 00:03:20,360 --> 00:03:21,760 Speaker 3: talk about Cultish all the time. 67 00:03:22,600 --> 00:03:23,200 Speaker 4: Of course. 68 00:03:23,320 --> 00:03:27,040 Speaker 2: Yeah, so I'm Amanda Montell and I wrote a book 69 00:03:27,080 --> 00:03:30,120 Speaker 2: called Wordslut, which is about language and gender, and 70 00:03:30,160 --> 00:03:33,519 Speaker 2: then I wrote a book about the language of cults, from 71 00:03:33,400 --> 00:03:36,160 Speaker 4: Scientology to SoulCycle, called Cultish. 72 00:03:36,520 --> 00:03:39,800 Speaker 2: And then I just published my third book, my favorite 73 00:03:39,800 --> 00:03:42,040 Speaker 2: of the bunch, but I think I say that every time. 74 00:03:42,160 --> 00:03:43,240 Speaker 4: It's called The Age 75 00:03:42,960 --> 00:03:47,440 Speaker 2: of Magical Overthinking: Notes on Modern Irrationality, and it is 76 00:03:47,480 --> 00:03:52,360 Speaker 2: about cognitive biases in the information age, digital-age delulu 77 00:03:52,520 --> 00:03:55,800 Speaker 2: in various forms. So every chapter of the book is 78 00:03:55,840 --> 00:03:59,000 Speaker 2: dedicated to a different cognitive bias. Some of them are 79 00:03:59,360 --> 00:04:02,400 Speaker 2: pretty well known, like confirmation bias or sunk cost fallacy, 80 00:04:02,680 --> 00:04:04,960 Speaker 2: and then others have these cool names like the halo 81 00:04:05,040 --> 00:04:07,720 Speaker 2: effect or the IKEA effect, and I use each one 82 00:04:07,840 --> 00:04:11,600 Speaker 2: as a lens to explore some mysterious irrationality from the 83 00:04:11,600 --> 00:04:14,720 Speaker 2: broader zeitgeist and my own life.
So I talk about 84 00:04:14,800 --> 00:04:18,960 Speaker 2: cycles of celebrity worship and dethronement. I talk about Instagram 85 00:04:19,160 --> 00:04:23,760 Speaker 2: manifestation gurus. I talk about my own experience in a 86 00:04:23,800 --> 00:04:28,240 Speaker 2: sort of cult-like, cult-of-one romantic relationship, and 87 00:04:28,360 --> 00:04:31,159 Speaker 2: some other irrational choices that I've made in my life. 88 00:04:31,160 --> 00:04:33,480 Speaker 2: And I also host a podcast called Sounds Like a Cult, 89 00:04:33,720 --> 00:04:37,159 Speaker 2: and I'm launching a new podcast called Magical Overthinkers in May. 90 00:04:37,520 --> 00:04:37,719 Speaker 3: Oh. 91 00:04:37,800 --> 00:04:38,560 Speaker 1: I'm excited for that. 92 00:04:38,640 --> 00:04:40,760 Speaker 3: So I started listening to Sounds Like a Cult after 93 00:04:40,800 --> 00:04:44,159 Speaker 3: I read Cultish, and I listened to that book on audio, 94 00:04:44,200 --> 00:04:45,760 Speaker 3: and then I bought the book, so I could, you know, 95 00:04:45,880 --> 00:04:48,760 Speaker 3: highlight and write things and all that. But when I 96 00:04:48,880 --> 00:04:51,720 Speaker 3: listened to the episode, and I want to say it 97 00:04:51,800 --> 00:04:54,080 Speaker 3: was like an early-on episode, it was about Instagram 98 00:04:54,120 --> 00:04:58,720 Speaker 3: therapists or social media therapists, which, I am a therapist, 99 00:04:58,880 --> 00:05:05,040 Speaker 3: and it has been like three years of excruciating back 100 00:05:05,080 --> 00:05:08,000 Speaker 3: and forth. What am I doing? Can I be on 101 00:05:08,080 --> 00:05:10,960 Speaker 3: social media? Can I have a podcast? What are the limits? 102 00:05:11,600 --> 00:05:14,280 Speaker 3: Am I doing what those other people are doing? Am 103 00:05:14,279 --> 00:05:15,760 Speaker 3: I part of the problem? Am I part of the solution? 104 00:05:15,760 --> 00:05:18,839 Speaker 3: I mean, it's been insanity.
And so when I listened 105 00:05:18,839 --> 00:05:21,599 Speaker 3: to that episode, part of it was listening to 106 00:05:21,640 --> 00:05:22,479 Speaker 3: Cultish and reading that. 107 00:05:22,480 --> 00:05:23,919 Speaker 1: But when I listened to that episode, I was like, 108 00:05:23,920 --> 00:05:24,520 Speaker 1: oh my gosh. 109 00:05:24,920 --> 00:05:28,160 Speaker 3: First of all, I'm not crazy, and there are other 110 00:05:28,279 --> 00:05:31,479 Speaker 3: people that are seeing what I'm seeing. Because from my 111 00:05:31,600 --> 00:05:35,839 Speaker 3: perspective as a therapist who is on social media but 112 00:05:36,000 --> 00:05:39,720 Speaker 3: doesn't have like this viral following, I've had one thing 113 00:05:39,800 --> 00:05:42,080 Speaker 3: go viral one time and I didn't even mean it 114 00:05:42,080 --> 00:05:43,960 Speaker 3: to be. I forgot that I put it up and 115 00:05:44,000 --> 00:05:47,479 Speaker 3: I said never again, it's not worth it. 116 00:05:48,040 --> 00:05:51,640 Speaker 2: I did an interview recently with the newsletter Embedded, which 117 00:05:51,720 --> 00:05:54,800 Speaker 2: is run and written by a fantastic 118 00:05:54,279 --> 00:05:56,560 Speaker 4: journalist who writes about the Internet named Kate Lindsay. 119 00:05:56,920 --> 00:05:59,720 Speaker 2: And one of the questions in this interview Q and 120 00:05:59,760 --> 00:06:04,720 Speaker 2: A was, have you ever gone viral? And personally no, there 121 00:06:04,800 --> 00:06:07,440 Speaker 2: was one Sounds Like a Cult reel, and to your point, like, 122 00:06:07,680 --> 00:06:11,040 Speaker 2: you never think the things that go viral would be 123 00:06:11,080 --> 00:06:12,279 Speaker 2: the things that go viral. 124 00:06:12,320 --> 00:06:14,520 Speaker 4: They're always just like the most 125 00:06:14,560 --> 00:06:17,960 Speaker 2: innocuous, like silliest, most last-minute things.
But it was 126 00:06:18,000 --> 00:06:21,160 Speaker 2: some video where it was just pulled from a Sounds 127 00:06:21,200 --> 00:06:23,359 Speaker 2: Like a Cult recording where we were roasting the cult 128 00:06:23,360 --> 00:06:26,120 Speaker 2: of the Kardashians and it was like this silly little moment. 129 00:06:26,560 --> 00:06:27,760 Speaker 4: I didn't even notice it had 130 00:06:27,680 --> 00:06:30,040 Speaker 2: gone viral until months later, thank god, because I think 131 00:06:30,080 --> 00:06:32,599 Speaker 2: it would have stressed me out. But when a video 132 00:06:32,680 --> 00:06:36,039 Speaker 2: starts to get more views than I expected, I start 133 00:06:36,040 --> 00:06:39,000 Speaker 2: to feel paranoid. I don't know about you, but I'm 134 00:06:39,040 --> 00:06:42,360 Speaker 2: just like, who is this reaching and why and who 135 00:06:42,440 --> 00:06:44,000 Speaker 2: is gonna take this out of context? 136 00:06:44,040 --> 00:06:47,120 Speaker 4: Like it's not fun to like be 137 00:06:47,240 --> 00:06:48,200 Speaker 4: perceived that way. 138 00:06:48,520 --> 00:06:50,800 Speaker 3: So this is so interesting because it plays into like 139 00:06:50,839 --> 00:06:52,480 Speaker 3: a lot of the things that I didn't think I 140 00:06:52,560 --> 00:06:55,520 Speaker 3: was initially gonna ask you about, this certain bias I 141 00:06:55,560 --> 00:06:57,920 Speaker 3: think this is going to go into. But I am, 142 00:06:58,400 --> 00:07:01,159 Speaker 3: because the video that went viral was just a fun 143 00:07:01,480 --> 00:07:03,960 Speaker 3: video that I made, like sitting on my couch with 144 00:07:04,320 --> 00:07:06,080 Speaker 3: my roommate. At the time, I didn't really know how 145 00:07:06,080 --> 00:07:08,320 Speaker 3: to work TikTok. I don't think I had any followers.
146 00:07:08,640 --> 00:07:10,680 Speaker 3: We were joking about like all of the things that 147 00:07:10,720 --> 00:07:14,120 Speaker 3: people are willing to do except go to therapy. And 148 00:07:14,480 --> 00:07:16,880 Speaker 3: it was partly like a joke, and part of it 149 00:07:16,920 --> 00:07:19,280 Speaker 3: was probably like my passive-aggressive nature of being like, 150 00:07:19,360 --> 00:07:22,280 Speaker 3: oh my gosh, like crystals aren't going to heal your trauma. 151 00:07:22,520 --> 00:07:24,680 Speaker 3: They can be helpful, I guess, I don't really understand 152 00:07:24,840 --> 00:07:27,520 Speaker 3: what the science is there. But I put things like 153 00:07:27,800 --> 00:07:30,720 Speaker 3: all over the range, like it was like tarot cards 154 00:07:30,880 --> 00:07:35,600 Speaker 3: and buying self-help books, exercise, certain foods, dating. I 155 00:07:35,640 --> 00:07:39,800 Speaker 3: think I had some like incense burning or crystals, just 156 00:07:39,880 --> 00:07:42,440 Speaker 3: all that kind of stuff, which leads into the manifestation 157 00:07:42,480 --> 00:07:44,800 Speaker 3: stuff that you talk about. Yeah, but when I tell 158 00:07:44,840 --> 00:07:48,480 Speaker 3: you there were people coming after me telling me they're 159 00:07:48,480 --> 00:07:50,880 Speaker 3: gonna put spells on me. First of all, how 160 00:07:50,960 --> 00:07:53,680 Speaker 3: dare I? I'm privileged and I get where some of 161 00:07:53,680 --> 00:07:56,720 Speaker 3: that was coming from.
But the backlash from like the 162 00:07:56,840 --> 00:08:01,320 Speaker 3: manifestation community, oh, I know. It's like, I want to 163 00:08:01,320 --> 00:08:03,360 Speaker 3: be on your team, I don't want to be against 164 00:08:03,440 --> 00:08:08,840 Speaker 3: you. At the same time, it brings up 165 00:08:08,840 --> 00:08:12,080 Speaker 3: this bigger conversation of I think there can be room 166 00:08:12,120 --> 00:08:14,360 Speaker 3: for all kinds of things, and different things are going 167 00:08:14,440 --> 00:08:17,720 Speaker 3: to work for different people. But one of the reasons 168 00:08:17,760 --> 00:08:22,600 Speaker 3: that I have had this frustration, or you could call 169 00:08:22,640 --> 00:08:26,760 Speaker 3: it cognitive dissonance, around being on social media and creating 170 00:08:26,920 --> 00:08:30,880 Speaker 3: a platform where people can get information where they otherwise 171 00:08:31,040 --> 00:08:33,839 Speaker 3: wouldn't be able to afford it or wouldn't feel safe 172 00:08:33,840 --> 00:08:35,640 Speaker 3: to find it, or even just, I want to dip 173 00:08:35,640 --> 00:08:38,560 Speaker 3: a toe in this before I jump into therapy, then 174 00:08:38,720 --> 00:08:42,640 Speaker 3: turns into, okay, well, am I spreading the things that 175 00:08:42,720 --> 00:08:45,920 Speaker 3: get taken out of context that bother me and make my 176 00:08:45,920 --> 00:08:46,720 Speaker 1: job really hard? 177 00:08:47,679 --> 00:08:52,839 Speaker 3: Or am I, you called them like explainer-grams, am 178 00:08:52,840 --> 00:08:56,560 Speaker 3: I part of that problem? Or am I getting frustrated 179 00:08:56,559 --> 00:08:58,400 Speaker 3: with these people because they 180 00:08:58,280 --> 00:09:01,200 Speaker 1: are more successful than me? Or what have you?
181 00:09:01,520 --> 00:09:03,280 Speaker 3: I mean, there's been a lot of digging, but I 182 00:09:03,280 --> 00:09:06,480 Speaker 3: think that's somewhere where we can start, because when I 183 00:09:06,520 --> 00:09:09,280 Speaker 3: was reading your chapter about the halo effect, which is 184 00:09:09,480 --> 00:09:12,800 Speaker 3: so timely with the Taylor Swift stuff, I don't know 185 00:09:12,840 --> 00:09:14,520 Speaker 3: if you could have planned that any better. It 186 00:09:14,559 --> 00:09:15,720 Speaker 3: probably wasn't even planned. 187 00:09:15,880 --> 00:09:17,040 Speaker 4: It was not even planned. 188 00:09:17,080 --> 00:09:20,800 Speaker 2: But she comes out with an album frequently enough like 189 00:09:20,840 --> 00:09:24,559 Speaker 2: it'll work out. Yeah, the odds of writing about Taylor 190 00:09:24,559 --> 00:09:27,040 Speaker 2: Swift at the very time that she's putting out an 191 00:09:27,080 --> 00:09:29,079 Speaker 2: album are pretty high. 192 00:09:29,120 --> 00:09:30,960 Speaker 3: So I opened the book and I was reading that 193 00:09:31,000 --> 00:09:34,040 Speaker 3: chapter. At first, when you started talking about certain things, 194 00:09:34,120 --> 00:09:37,800 Speaker 3: I felt like I was going to be understood, from like, 195 00:09:37,840 --> 00:09:41,679 Speaker 3: it's not me, it's them. And 196 00:09:41,720 --> 00:09:44,319 Speaker 3: then I started reading that chapter and I was like, okay, 197 00:09:44,880 --> 00:09:50,040 Speaker 3: I'm putting my expectations for what a therapist should be, 198 00:09:50,480 --> 00:09:53,360 Speaker 3: what a helper should be, what a good person should 199 00:09:53,440 --> 00:09:58,160 Speaker 3: be on these people, and I have to take accountability 200 00:09:58,320 --> 00:10:01,320 Speaker 3: for my part in that.
So can you explain what 201 00:10:01,360 --> 00:10:03,560 Speaker 3: the halo effect is for any people that are like, 202 00:10:03,559 --> 00:10:04,360 Speaker 3: what are you talking about? 203 00:10:04,559 --> 00:10:08,280 Speaker 2: It's so funny, like my way into talking about this 204 00:10:08,360 --> 00:10:11,920 Speaker 2: book ends up being totally different every interview that I do, 205 00:10:11,960 --> 00:10:15,120 Speaker 2: because the contents of this book really are just like 206 00:10:15,679 --> 00:10:19,280 Speaker 2: my reckoning with my own irrational choices and my own 207 00:10:19,360 --> 00:10:22,760 Speaker 2: judgments and shame of myself and other people. And it 208 00:10:22,840 --> 00:10:25,240 Speaker 2: was really fun to open with this chapter on the 209 00:10:25,240 --> 00:10:29,880 Speaker 2: halo effect, because observing cycles of celebrity worship and dethronement 210 00:10:30,080 --> 00:10:34,280 Speaker 2: is something that like anybody with a digital device has seen, 211 00:10:34,400 --> 00:10:37,600 Speaker 2: you know. And being able to sort of like parse 212 00:10:37,720 --> 00:10:44,920 Speaker 2: through why, like, there's so much hallucinatory brutality surrounding celebrities 213 00:10:44,960 --> 00:10:48,760 Speaker 2: that feels very current, like it feels unique 214 00:10:48,800 --> 00:10:51,719 Speaker 2: to right now. Being able to parse through 215 00:10:51,720 --> 00:10:55,000 Speaker 2: those ideas from a behavioral economic standpoint was really cool. 216 00:10:55,040 --> 00:10:59,240 Speaker 2: So the halo effect describes this tendency to admire one 217 00:10:59,320 --> 00:11:01,680 Speaker 2: quality in a person and jump to the conclusion that 218 00:11:01,720 --> 00:11:04,760 Speaker 2: they must be perfect overall.
So we enjoy a pop 219 00:11:04,760 --> 00:11:07,480 Speaker 2: star's music, we jump to the conclusion that they must 220 00:11:07,520 --> 00:11:13,720 Speaker 2: be nurturing, worldly, that they align politically with us. Say 221 00:11:13,840 --> 00:11:17,120 Speaker 2: we enjoy an influencer's sense of style, we might assume 222 00:11:17,160 --> 00:11:20,880 Speaker 2: that they are educated, updated on current events as much 223 00:11:20,920 --> 00:11:25,200 Speaker 2: as they're updated on their style trends. And this is 224 00:11:25,600 --> 00:11:29,520 Speaker 2: not necessarily a rational thing, like there is little to 225 00:11:29,600 --> 00:11:32,640 Speaker 2: no evidence to suggest that your favorite pop star would 226 00:11:32,640 --> 00:11:37,400 Speaker 2: align politically with you. But this bias stems 227 00:11:37,440 --> 00:11:42,599 Speaker 2: from generations of habit-forming behavior where we used to 228 00:11:42,679 --> 00:11:44,680 Speaker 2: use the halo effect to find role models. You know, 229 00:11:44,720 --> 00:11:46,880 Speaker 2: you can imagine a time in human history where you 230 00:11:46,880 --> 00:11:50,079 Speaker 2: would clock someone in your community with big muscles and 231 00:11:50,520 --> 00:11:53,680 Speaker 2: intact teeth, and you would think, like, oh, that person 232 00:11:53,760 --> 00:11:56,640 Speaker 2: must be a good hunter or a skilled fighter who's 233 00:11:56,920 --> 00:11:59,680 Speaker 2: avoided disfigurement from battle. You know, that would be a 234 00:11:59,679 --> 00:12:03,320 Speaker 2: good person to align myself with for survival purposes, even 235 00:12:03,320 --> 00:12:05,400 Speaker 2: though there might not have been like the perfect amount 236 00:12:05,440 --> 00:12:07,200 Speaker 2: of data to suggest that it was a good enough 237 00:12:07,200 --> 00:12:10,520 Speaker 2: conclusion to jump to.
We are now mapping this halo 238 00:12:10,559 --> 00:12:15,839 Speaker 2: effect onto modern parasocial relationships with celebrities that have nothing 239 00:12:15,880 --> 00:12:18,480 Speaker 2: to do with survival and have very much to do 240 00:12:18,559 --> 00:12:23,920 Speaker 2: with these more abstract and cerebral concepts of identity and values. 241 00:12:24,120 --> 00:12:26,480 Speaker 2: And it relates back to the cult stuff too, because 242 00:12:26,720 --> 00:12:29,839 Speaker 2: I think living in, you know, contemporary society, a lot 243 00:12:29,840 --> 00:12:32,560 Speaker 2: of us confront a great deal of chooser's paradox. We, 244 00:12:32,920 --> 00:12:36,319 Speaker 2: you know, feel overwhelmed by the amount of information that 245 00:12:36,360 --> 00:12:38,720 Speaker 2: we are tasked to contend with every single day, the 246 00:12:38,760 --> 00:12:42,080 Speaker 2: amount of identities that we're tasked to compare ourselves to. 247 00:12:42,679 --> 00:12:45,440 Speaker 2: There's at least the illusion of like limitless directions for 248 00:12:45,480 --> 00:12:47,400 Speaker 2: our lives to go in, like who should I be, 249 00:12:47,520 --> 00:12:48,320 Speaker 2: what should I wear? 250 00:12:48,520 --> 00:12:49,800 Speaker 4: Like what should my hair color be? 251 00:12:50,440 --> 00:12:54,960 Speaker 2: And being able to worship a celebrity like Beyonce or 252 00:12:55,000 --> 00:12:58,520 Speaker 2: Taylor Swift as a kind of mother figure, in a 253 00:12:58,520 --> 00:13:00,880 Speaker 2: way that's sort of parental, a figure that you might 254 00:13:00,880 --> 00:13:04,120 Speaker 2: have admired before the Internet, and now we've sort of 255 00:13:04,160 --> 00:13:06,480 Speaker 2: turned to celebrities as a new kind of role model 256 00:13:06,520 --> 00:13:09,920 Speaker 2: or paragon.
Being able to look to that celebrity not 257 00:13:10,040 --> 00:13:13,040 Speaker 2: just for entertainment but as a sort of savior can 258 00:13:13,080 --> 00:13:19,040 Speaker 2: feel really healing, until those grandiose expectations for who they 259 00:13:19,040 --> 00:13:21,920 Speaker 2: are end up not coming to fruition and they do 260 00:13:22,080 --> 00:13:25,600 Speaker 2: something to disappoint us, and that can feel like a 261 00:13:25,640 --> 00:13:29,160 Speaker 2: real violation. So I think that can apply not just 262 00:13:29,280 --> 00:13:31,920 Speaker 2: to, you know, mainstream celebrities that you might hear on 263 00:13:31,920 --> 00:13:34,600 Speaker 2: the radio or see on TV, but these microcelebrities that 264 00:13:34,600 --> 00:13:36,840 Speaker 2: you would only interact with on social media. 265 00:13:36,920 --> 00:13:39,600 Speaker 3: I think that one is huge too, because even with 266 00:13:39,640 --> 00:13:42,520 Speaker 3: these like smaller micro-influencers, if they don't show up. 267 00:13:42,520 --> 00:13:44,160 Speaker 1: I've heard people talking about like, oh, I met 268 00:13:44,000 --> 00:13:45,600 Speaker 3: her in real life and she was this way, or 269 00:13:45,640 --> 00:13:47,560 Speaker 3: she was that way, or she had no social skills, 270 00:13:47,640 --> 00:13:49,439 Speaker 3: or I saw her doing that. And I'm like, oh 271 00:13:49,480 --> 00:13:51,880 Speaker 3: my gosh, well she's probably not used to having people 272 00:13:51,960 --> 00:13:53,760 Speaker 3: stare at her or follow her around and ask her 273 00:13:53,800 --> 00:13:56,679 Speaker 3: to take her picture. And I can speak to that. Like, 274 00:13:57,240 --> 00:14:00,000 Speaker 3: I am somebody who is very, I would never call 275 00:14:00,040 --> 00:14:03,360 Speaker 3: myself a micro-influencer, but I'm somebody who is very 276 00:14:03,720 --> 00:14:08,560 Speaker 3: introverted when it comes to certain social situations.
But I 277 00:14:08,600 --> 00:14:12,080 Speaker 3: appear very extroverted, I think, when I'm talking about the 278 00:14:12,080 --> 00:14:14,880 Speaker 3: things that I know, and when I'm a therapist and 279 00:14:14,960 --> 00:14:17,360 Speaker 3: I can get in front of somebody, in maybe a 280 00:14:17,400 --> 00:14:18,000 Speaker 3: classroom, and 281 00:14:18,000 --> 00:14:18,920 Speaker 1: talk about X, Y, Z. 282 00:14:19,400 --> 00:14:21,920 Speaker 3: But if you see me at a party and you 283 00:14:21,960 --> 00:14:24,440 Speaker 3: want to have small talk and, like, where do we 284 00:14:24,480 --> 00:14:26,440 Speaker 3: go to next? And so it's like, well, I thought 285 00:14:26,480 --> 00:14:29,840 Speaker 3: you were this way, and so that is dangerous. But 286 00:14:30,160 --> 00:14:34,480 Speaker 3: I guess I'm curious about your perception on where we 287 00:14:34,560 --> 00:14:38,840 Speaker 3: draw the line between our responsibility for how we put 288 00:14:38,840 --> 00:14:42,880 Speaker 3: those expectations on people and then also the kind of 289 00:14:42,920 --> 00:14:46,760 Speaker 3: like manipulative way that some people 290 00:14:47,040 --> 00:14:48,200 Speaker 1: do use social media. 291 00:14:48,720 --> 00:14:52,000 Speaker 3: And yeah, I got in a heated argument, discussion, not 292 00:14:52,160 --> 00:14:57,160 Speaker 3: argument, about Jay Shetty recently, and you don't have to 293 00:14:57,160 --> 00:14:59,600 Speaker 3: give your personal opinion on him. This was before that 294 00:14:59,640 --> 00:15:02,800 Speaker 3: whole article came out about him more recently. Yeah. And 295 00:15:02,960 --> 00:15:05,000 Speaker 3: I was like, I saw him in this Gap ad 296 00:15:05,360 --> 00:15:06,920 Speaker 3: that was like in New York City or something, on 297 00:15:06,960 --> 00:15:09,200 Speaker 3: a billboard in like Times Square or something, and I 298 00:15:09,240 --> 00:15:09,640 Speaker 3: was like.
299 00:15:09,760 --> 00:15:11,760 Speaker 1: Okay, this feels strange. 300 00:15:12,120 --> 00:15:15,840 Speaker 3: I have seen all of these clips, viral clips about 301 00:15:15,920 --> 00:15:19,480 Speaker 3: him talking about materialism and how to live a certain way, 302 00:15:19,760 --> 00:15:24,360 Speaker 3: and then he's in this multimillion-dollar, probably 303 00:15:24,400 --> 00:15:28,480 Speaker 3: worldwide, ad campaign for a clothing company. I was like 304 00:15:28,680 --> 00:15:32,120 Speaker 3: playing ping pong, like, who is this? And I was 305 00:15:32,160 --> 00:15:34,160 Speaker 3: never really a big fan of him. I didn't really 306 00:15:34,160 --> 00:15:36,000 Speaker 3: know much about him. So I started like, I tried 307 00:15:36,040 --> 00:15:38,240 Speaker 3: to listen to one of his podcasts and my friend 308 00:15:38,240 --> 00:15:39,680 Speaker 3: was telling me about one of his books. 309 00:15:39,400 --> 00:15:42,960 Speaker 1: And I was like, this is not new information. First 310 00:15:43,000 --> 00:15:45,400 Speaker 1: of all. Second of all, he's not really saying anything. 311 00:15:45,760 --> 00:15:47,680 Speaker 2: The Jay Shettys of the world, it's like a tale as 312 00:15:47,720 --> 00:15:49,320 Speaker 2: old as time, you know what I mean. It's like 313 00:15:49,360 --> 00:15:52,840 Speaker 2: the Tony Robbinses.
Yes, we look to guru figures like 314 00:15:52,920 --> 00:15:56,240 Speaker 2: that who have the loudest voice and the most shameless voice, 315 00:15:56,280 --> 00:16:00,320 Speaker 2: the most populist voice during times of tumult, when we 316 00:16:00,400 --> 00:16:03,120 Speaker 2: just want someone to tell us how to be, like 317 00:16:03,160 --> 00:16:06,120 Speaker 2: that monologue in Fleabag in season two, when she's 318 00:16:06,160 --> 00:16:08,560 Speaker 2: in the confessional and she's like, I just want someone 319 00:16:08,560 --> 00:16:10,200 Speaker 2: to tell me, like what music to listen to, what 320 00:16:10,240 --> 00:16:12,840 Speaker 2: tickets to buy, like who to love and how to love them. 321 00:16:13,120 --> 00:16:16,880 Speaker 2: It's just like really confounding to be an adult or 322 00:16:17,000 --> 00:16:20,560 Speaker 2: child living in the world right now. And so, you know, 323 00:16:20,640 --> 00:16:24,080 Speaker 2: it's unfortunate that those voices that rise to the top 324 00:16:24,200 --> 00:16:28,720 Speaker 2: are often the most self-focused, driven by profit. And 325 00:16:28,760 --> 00:16:30,480 Speaker 2: then I don't know if you feel this way, but 326 00:16:30,560 --> 00:16:34,240 Speaker 2: like I sometimes feel self-questioning. Like, I talk in 327 00:16:34,280 --> 00:16:37,440 Speaker 2: public about stuff. Yeah, does that 328 00:16:37,360 --> 00:16:39,640 Speaker 4: make me a narcissist? Like, I don't know. 329 00:16:39,720 --> 00:16:42,280 Speaker 2: I am just following my curiosities and they've taken me 330 00:16:42,320 --> 00:16:43,880 Speaker 2: to this place where I get to talk about them 331 00:16:43,880 --> 00:16:46,880 Speaker 2: on podcasts and things like that. But I, of course, 332 00:16:47,280 --> 00:16:52,800 Speaker 2: do not endorse worshiping anyone or treating anyone's ideas as dogma. 333 00:16:53,040 --> 00:16:55,440 Speaker 2: And yet it's easy to do because
334 00:16:55,120 --> 00:16:56,080 Speaker 4: of the halo effect. 335 00:16:56,080 --> 00:16:59,000 Speaker 2: When we hear someone speak with authority on a platform 336 00:16:59,120 --> 00:17:02,640 Speaker 2: like a podcast or in a book, it is easy 337 00:17:02,680 --> 00:17:05,760 Speaker 2: to project that crown of light on their head and 338 00:17:06,480 --> 00:17:08,439 Speaker 2: forget to perceive their humanity. 339 00:17:08,640 --> 00:17:09,800 Speaker 4: But where do I draw the line? 340 00:17:09,880 --> 00:17:13,439 Speaker 2: I mean, there is a difference between celebrities who have 341 00:17:13,560 --> 00:17:16,879 Speaker 2: made no suggestion that you should worship them as a 342 00:17:16,920 --> 00:17:20,560 Speaker 2: political authority, and yet we do anyway, and we're expecting 343 00:17:20,600 --> 00:17:25,640 Speaker 2: them to voice their opinions and their stances on various matters. 344 00:17:25,640 --> 00:17:27,960 Speaker 2: And I don't have like a definitive answer about whether 345 00:17:28,040 --> 00:17:31,160 Speaker 2: or not celebrities should do that, or what their role 346 00:17:31,200 --> 00:17:34,159 Speaker 2: in public life and political life should be, and if their voices 347 00:17:34,200 --> 00:17:37,199 Speaker 2: about the mental health crisis or any other phenomenon that 348 00:17:37,240 --> 00:17:39,200 Speaker 2: we're confronting in modern society. I don't know if those 349 00:17:39,240 --> 00:17:42,080 Speaker 2: voices like should be heard, if they're qualified to voice them, 350 00:17:42,359 --> 00:17:42,960 Speaker 2: I don't know. 351 00:17:43,400 --> 00:17:45,800 Speaker 4: But it is one thing when we 352 00:17:45,840 --> 00:17:52,200 Speaker 2: are demanding that Taylor Swift speak out endorsing progressive politics 353 00:17:52,240 --> 00:17:55,720 Speaker 2: when there's literally nothing to suggest that she holds those values.
354 00:17:55,760 --> 00:17:58,640 Speaker 2: Like she flies on private jets and her security team 355 00:17:58,680 --> 00:18:01,600 Speaker 2: is all former police. There's nothing, there's nothing to suggest 356 00:18:01,640 --> 00:18:05,000 Speaker 2: that she holds like extremely left wing values. But then 357 00:18:05,119 --> 00:18:10,119 Speaker 2: there are celebrities who have absolutely no shame in positioning 358 00:18:10,160 --> 00:18:13,520 Speaker 2: themselves as some kind of spiritual guru or political figure. 359 00:18:13,560 --> 00:18:16,800 Speaker 2: I mean, look at the celebrity presidents and other you know, 360 00:18:16,920 --> 00:18:21,440 Speaker 2: celebrities who have taken a very formal role in political life. 361 00:18:21,680 --> 00:18:24,320 Speaker 2: So I think, yeah, we're we're living in this very 362 00:18:24,320 --> 00:18:28,760 Speaker 2: confusing time when there is mass mistrust in authority figures 363 00:18:28,760 --> 00:18:31,359 Speaker 2: like the government and the healthcare system for good reason, 364 00:18:31,840 --> 00:18:36,439 Speaker 2: and yet looking to celebrities as a replacement, sort of 365 00:18:36,520 --> 00:18:43,000 Speaker 2: insurgent outsider figure is also very problematic. 
Looking to influencers 366 00:18:43,000 --> 00:18:46,520 Speaker 2: and micro celebrities online to be perfect, you know, holding 367 00:18:46,560 --> 00:18:49,800 Speaker 2: those expectations that they should be perfect is an issue, 368 00:18:49,840 --> 00:18:53,480 Speaker 2: and the worship and the lambasting are like 369 00:18:53,560 --> 00:18:57,080 Speaker 2: two sides of the same coin, because yeah, there are 370 00:18:57,240 --> 00:19:00,760 Speaker 2: people on social media who are attempting to position themselves 371 00:19:00,760 --> 00:19:04,119 Speaker 2: as having access to transcendent wisdom and who are taking 372 00:19:04,160 --> 00:19:07,240 Speaker 2: advantage of other people's anxieties and saying like, oh, you 373 00:19:07,280 --> 00:19:09,359 Speaker 2: feel out of control of your life, sign up for 374 00:19:09,400 --> 00:19:11,600 Speaker 2: my thirty five dollars a month course where you will 375 00:19:11,680 --> 00:19:15,560 Speaker 2: learn my proprietary manifestation technique. And if it doesn't work 376 00:19:15,600 --> 00:19:18,560 Speaker 2: for you, well that's your problem. Because I'm a genius. 377 00:19:18,960 --> 00:19:20,920 Speaker 2: All of these things are happening at the same time, 378 00:19:21,000 --> 00:19:25,080 Speaker 2: and so I don't have definitive answers for the solutions 379 00:19:25,119 --> 00:19:30,000 Speaker 2: for these dynamics. But I think being aware of biases 380 00:19:30,160 --> 00:19:34,440 Speaker 2: like the halo effect and sort of recalibrating our expectations 381 00:19:34,520 --> 00:19:36,800 Speaker 2: for various celebrities, sort of looking at the evidence and 382 00:19:36,880 --> 00:19:37,960 Speaker 2: being like, what are. 383 00:19:38,040 --> 00:19:39,560 Speaker 4: Jay Shetty's qualifications? 384 00:19:39,720 --> 00:19:43,720 Speaker 2: You know, like how am I perceiving this celebrity in 385 00:19:43,760 --> 00:19:46,840 Speaker 2: the context of their full humanity.
I think all of 386 00:19:46,840 --> 00:19:48,760 Speaker 2: that is a really worthwhile exercise. 387 00:19:55,640 --> 00:19:58,720 Speaker 3: I want to read something from your book because this 388 00:19:58,760 --> 00:20:01,480 Speaker 3: is kind of tying in like the halo effect. I 389 00:20:01,520 --> 00:20:04,080 Speaker 3: think it is probably some of well, I think it 390 00:20:04,119 --> 00:20:07,520 Speaker 3: definitely is some of the proportionality bias, and then there's 391 00:20:07,560 --> 00:20:13,360 Speaker 3: a little bit of the overconfidence bias in this tangled web. 392 00:20:13,400 --> 00:20:17,000 Speaker 1: In my head, this was so important for me to read. 393 00:20:17,520 --> 00:20:22,159 Speaker 3: You wrote, small, mundane explanations for important events (Princess Diana 394 00:20:22,280 --> 00:20:25,159 Speaker 3: died because her limo driver was drunk and speeding to 395 00:20:25,200 --> 00:20:28,560 Speaker 3: avoid paparazzi) are generally not as satiating as more dramatic 396 00:20:28,600 --> 00:20:32,400 Speaker 3: explanations (she was murdered by the British government). And then 397 00:20:32,840 --> 00:20:36,119 Speaker 3: later in that same chapter you wrote, in twenty eighteen, 398 00:20:36,280 --> 00:20:40,000 Speaker 3: MIT found that true stories take six times longer to 399 00:20:40,119 --> 00:20:43,640 Speaker 3: reach fifteen hundred people on Twitter than false ones. That's 400 00:20:43,680 --> 00:20:46,600 Speaker 3: because false news is more novel and people are more 401 00:20:46,680 --> 00:20:50,000 Speaker 3: likely to share novel information. People who share novel information 402 00:20:50,119 --> 00:20:53,159 Speaker 3: are seen as being in the know. One sided sentiments 403 00:20:53,200 --> 00:20:56,359 Speaker 3: like "over explaining yourself is a trauma response that stems 404 00:20:56,400 --> 00:20:59,159 Speaker 3: from an unresolved childhood fear of conflict" are far.
405 00:20:59,040 --> 00:20:59,960 Speaker 1: Better for engagement. 406 00:21:00,320 --> 00:21:03,640 Speaker 3: Than "people justify their actions different ways for different reasons," 407 00:21:03,760 --> 00:21:06,520 Speaker 3: or "all traumatic events are stressful, but not all stressful 408 00:21:06,520 --> 00:21:10,560 Speaker 3: events are traumatic." And uh, like you put into those 409 00:21:10,920 --> 00:21:15,280 Speaker 3: sentences so much of what I've been feeling, because those 410 00:21:15,280 --> 00:21:17,359 Speaker 3: things are true that you just said at the end, like, 411 00:21:17,800 --> 00:21:22,440 Speaker 3: people justify their actions for different reasons. And I 412 00:21:22,480 --> 00:21:26,399 Speaker 3: know enough in this, I think is part of self 413 00:21:26,400 --> 00:21:28,960 Speaker 3: awareness in a way. I know enough to know that 414 00:21:29,320 --> 00:21:31,960 Speaker 3: I don't know a lot, and I know enough to 415 00:21:32,080 --> 00:21:35,440 Speaker 3: know that I don't know what that behavior that. 416 00:21:35,440 --> 00:21:36,440 Speaker 1: You're doing is from. 417 00:21:36,520 --> 00:21:39,600 Speaker 3: And I can't pinpoint that exactly on this one event 418 00:21:39,880 --> 00:21:43,040 Speaker 3: without especially without knowing you. We can try to figure 419 00:21:43,080 --> 00:21:45,920 Speaker 3: it out, and hopefully we can, but I can't over 420 00:21:46,000 --> 00:21:50,080 Speaker 3: promise you. And my fear, I think from a mental 421 00:21:50,080 --> 00:21:53,359 Speaker 3: health perspective, is our world is kind of going in 422 00:21:53,920 --> 00:21:58,520 Speaker 3: a way where true information isn't good enough because it's 423 00:21:58,600 --> 00:22:02,240 Speaker 3: not so big and great, and people don't like the 424 00:22:02,280 --> 00:22:06,560 Speaker 3: answer of "it depends," correct? And that's most of what 425 00:22:07,080 --> 00:22:07,800 Speaker 3: I have. I have to admit this.
426 00:22:07,880 --> 00:22:11,320 Speaker 1: I used to post little explainer grams like years. 427 00:22:11,119 --> 00:22:14,720 Speaker 3: Ago, and I remember finding myself being like, this isn't 428 00:22:15,000 --> 00:22:18,359 Speaker 3: exciting enough, Like nothing I want to write is exciting enough. 429 00:22:18,400 --> 00:22:21,280 Speaker 1: And eventually I was like, well, this is why, Like you. 430 00:22:21,320 --> 00:22:23,600 Speaker 3: Have to stop doing this, because the truth is not 431 00:22:23,680 --> 00:22:24,520 Speaker 3: exciting enough. 432 00:22:24,600 --> 00:22:26,520 Speaker 4: And yeah, I don't know. 433 00:22:26,440 --> 00:22:27,160 Speaker 1: What to do with that. 434 00:22:27,520 --> 00:22:28,520 Speaker 4: Yeah, it's tricky. 435 00:22:28,600 --> 00:22:31,920 Speaker 2: Okay, So I'll break down the bias behind the quote 436 00:22:31,920 --> 00:22:35,400 Speaker 2: that you just read for the listeners. So proportionality bias 437 00:22:35,520 --> 00:22:37,920 Speaker 2: is the subject of the chapter in the book that's 438 00:22:37,920 --> 00:22:40,760 Speaker 2: called I Swear I Manifested This, in which I argue 439 00:22:40,760 --> 00:22:45,840 Speaker 2: that the bias powering nefarious conspiracy theories and the bias 440 00:22:46,080 --> 00:22:49,680 Speaker 2: powering ideas of manifestation that are seemingly more innocent and positive, 441 00:22:49,960 --> 00:22:52,960 Speaker 2: it's actually the same bias. So proportionality bias is the 442 00:22:52,960 --> 00:22:56,800 Speaker 2: tendency to think that big events or even just big feelings. 443 00:22:56,359 --> 00:22:57,760 Speaker 4: Must have had a big cause. 444 00:22:58,240 --> 00:23:02,720 Speaker 2: So the global pandemic strikes. This is such like an overwhelming, 445 00:23:03,359 --> 00:23:07,000 Speaker 2: mind blowing, enormous calamity.
We think to ourselves that could 446 00:23:07,000 --> 00:23:08,920 Speaker 2: not have been the result of just a bunch of small, 447 00:23:09,000 --> 00:23:12,240 Speaker 2: random misfortunes. A government must have engineered it on purpose. 448 00:23:12,280 --> 00:23:14,720 Speaker 2: That's the only way this makes proportional sense. You know, 449 00:23:14,800 --> 00:23:17,639 Speaker 2: we as human beings, like to infuse a sort of 450 00:23:18,040 --> 00:23:22,160 Speaker 2: cosmic logic into tragedies or events that make us feel 451 00:23:22,200 --> 00:23:26,200 Speaker 2: really out of control. We're the only species that mythologizes 452 00:23:26,240 --> 00:23:27,840 Speaker 2: the world in order to make sense of it, like 453 00:23:27,880 --> 00:23:30,840 Speaker 2: we make up stories about the world in order for 454 00:23:30,920 --> 00:23:34,719 Speaker 2: it to feel manageable to us. So while proportionality bias 455 00:23:35,119 --> 00:23:39,000 Speaker 2: can clearly explain why people think, oh my god, xyz 456 00:23:39,680 --> 00:23:44,119 Speaker 2: result could have only stemmed from this, you know, nefarious 457 00:23:44,200 --> 00:23:47,919 Speaker 2: evil elite secretly controlling the socio political order, you can 458 00:23:48,040 --> 00:23:51,400 Speaker 2: also see how proportionality bias relates to manifestation because it's 459 00:23:51,440 --> 00:23:53,560 Speaker 2: the same misattribution of cause and effect. 460 00:23:53,600 --> 00:23:55,440 Speaker 4: It's this idea that if I. 461 00:23:55,680 --> 00:23:59,000 Speaker 2: Just vision board hard enough, then I will be able 462 00:23:59,000 --> 00:24:02,360 Speaker 2: to yield health and success and beauty and love. And 463 00:24:02,680 --> 00:24:05,720 Speaker 2: if those things don't come to fruition, there must be 464 00:24:05,720 --> 00:24:10,960 Speaker 2: a problem with my manifestation technique. 
And what makes this 465 00:24:11,280 --> 00:24:14,480 Speaker 2: sinister, or what can be sinister, is that there is 466 00:24:14,520 --> 00:24:18,399 Speaker 2: this whole class of what I'm calling conspiracy therapists that 467 00:24:18,600 --> 00:24:22,280 Speaker 2: has emerged on social media who will harness these ideas 468 00:24:22,320 --> 00:24:26,080 Speaker 2: of manifestation and make it sound like science, Like there is, 469 00:24:26,480 --> 00:24:31,280 Speaker 2: you know, an empirically backed, like bespoke manifestation technique that 470 00:24:31,400 --> 00:24:36,520 Speaker 2: involves neuroscience, and you know DSM terminology combined with spiritual terminology. 471 00:24:36,560 --> 00:24:39,760 Speaker 2: You know, invoking the words borderline personality disorder and the 472 00:24:39,800 --> 00:24:43,240 Speaker 2: Akashic records in the same sentence makes you seem supremely wise, 473 00:24:43,320 --> 00:24:46,360 Speaker 2: spiritually wise, and academically wise at the same time. And 474 00:24:46,600 --> 00:24:51,880 Speaker 2: they are harnessing that incredible lack of control and sort 475 00:24:51,880 --> 00:24:55,479 Speaker 2: of like existential suffering that so many of us are 476 00:24:55,480 --> 00:24:57,560 Speaker 2: feeling right now and have been feeling over the last 477 00:24:58,040 --> 00:25:01,800 Speaker 2: however long, a long long time, throughout the digital age, 478 00:25:01,960 --> 00:25:07,120 Speaker 2: and they are promising a solution while also implying that 479 00:25:07,320 --> 00:25:11,359 Speaker 2: if their solution doesn't help you, then it's your fault.
480 00:25:11,440 --> 00:25:14,400 Speaker 2: And if you get cancer, then you must have done 481 00:25:14,400 --> 00:25:18,600 Speaker 2: something wrong, that it was your unresolved trauma, and that has 482 00:25:18,760 --> 00:25:23,320 Speaker 2: like incredibly sinister victim blaming implications. And then, not to 483 00:25:23,359 --> 00:25:27,480 Speaker 2: mention, because these ideas of manifestation, some of them 484 00:25:28,040 --> 00:25:32,479 Speaker 2: have ideology in common with the more pernicious conspiracy theories, 485 00:25:32,760 --> 00:25:35,879 Speaker 2: there is a sort of pipeline. You see it all 486 00:25:35,920 --> 00:25:38,879 Speaker 2: the time, the sort of like new age holistic self 487 00:25:38,920 --> 00:25:43,359 Speaker 2: healing community slowly but surely falling into the sort of 488 00:25:43,440 --> 00:25:46,560 Speaker 2: QAnon of it all. And that's why you see conspirituality, 489 00:25:46,640 --> 00:25:49,719 Speaker 2: the portmanteau of conspiracy theory and spirituality, the sort of 490 00:25:49,760 --> 00:25:52,960 Speaker 2: like new age yoga moms like marching arm in arm 491 00:25:53,000 --> 00:25:56,399 Speaker 2: with Holocaust deniers. You know, like this seemingly doesn't make sense, 492 00:25:56,520 --> 00:25:59,159 Speaker 2: but it's all powered by this idea that we're on 493 00:25:59,160 --> 00:26:01,000 Speaker 2: the brink of a paradigm shift and there is some 494 00:26:01,119 --> 00:26:04,760 Speaker 2: evil elite controlling the socio political order.
And while the 495 00:26:04,840 --> 00:26:10,280 Speaker 2: idea that the government has like underground labs or chemtrails whatever, 496 00:26:10,320 --> 00:26:14,520 Speaker 2: that stuff might be a little spicy for the average person, 497 00:26:15,000 --> 00:26:19,200 Speaker 2: the idea that you have brought suffering onto yourself because 498 00:26:19,280 --> 00:26:22,720 Speaker 2: you didn't practice the law of attraction effectively enough, and 499 00:26:22,880 --> 00:26:25,960 Speaker 2: you don't need big pharma or mental health medication to 500 00:26:26,000 --> 00:26:28,640 Speaker 2: solve your problems, you just need to self heal, that 501 00:26:28,920 --> 00:26:31,440 Speaker 2: can be a sort of gateway. So that's a sort 502 00:26:31,480 --> 00:26:33,720 Speaker 2: of like overarching summary. 503 00:26:33,400 --> 00:26:36,400 Speaker 3: Well, and you talking about the gateway is so 504 00:26:36,440 --> 00:26:39,600 Speaker 3: important, I think, to that conversation, because people are quick to 505 00:26:39,640 --> 00:26:43,120 Speaker 3: be like, well, I'm not crazy or whatever, I wouldn't. 506 00:26:43,119 --> 00:26:46,160 Speaker 3: And it's progressive: you put a toe in, and then 507 00:26:46,200 --> 00:26:49,160 Speaker 3: you like what you feel, and so you put another 508 00:26:49,200 --> 00:26:51,439 Speaker 3: toe in, and then you are doing those things. And 509 00:26:51,520 --> 00:26:54,240 Speaker 3: so you wrote this early in the book, but you 510 00:26:54,280 --> 00:26:56,560 Speaker 3: were talking about how we have so much information now 511 00:26:57,200 --> 00:26:57,800 Speaker 3: we don't know 512 00:26:57,680 --> 00:26:59,280 Speaker 1: what to do with it. 513 00:26:59,400 --> 00:27:02,000 Speaker 3: And for you, you said, learning to stomach a sense 514 00:27:02,040 --> 00:27:03,200 Speaker 3: of irresolution might. 515 00:27:03,119 --> 00:27:04,840 Speaker 1: Be the only way to survive the crisis.
516 00:27:05,400 --> 00:27:10,880 Speaker 3: And that feels profound in its simplicity to me, because well, 517 00:27:10,880 --> 00:27:14,159 Speaker 3: because I think we're always trying to make sense of 518 00:27:14,280 --> 00:27:15,920 Speaker 3: every... We want to make sense of everything, We want 519 00:27:15,920 --> 00:27:16,879 Speaker 3: to understand everything. 520 00:27:17,160 --> 00:27:19,400 Speaker 1: And I get it because it's so uncomfortable. 521 00:27:20,000 --> 00:27:22,879 Speaker 3: And I also get it because if there's something I 522 00:27:22,920 --> 00:27:26,840 Speaker 3: can do to make my life better or to not 523 00:27:26,920 --> 00:27:29,399 Speaker 3: get cancer or to heal my cancer, like it's natural 524 00:27:29,440 --> 00:27:32,240 Speaker 3: to want to do that, of course, but then we 525 00:27:32,600 --> 00:27:36,480 Speaker 3: are mind fucking ourselves in the sense that like, there's 526 00:27:36,480 --> 00:27:39,160 Speaker 3: not an answer to everything, and we know so much, 527 00:27:39,200 --> 00:27:41,760 Speaker 3: but at the same time we don't know so much. 528 00:27:42,040 --> 00:27:44,440 Speaker 3: I think of this as a therapist because I will 529 00:27:44,480 --> 00:27:47,240 Speaker 3: look at and read books about like what people used 530 00:27:47,280 --> 00:27:49,600 Speaker 3: to do in this field, and I'm like, oh. 531 00:27:49,400 --> 00:27:51,720 Speaker 4: My gosh, I. 532 00:27:51,320 --> 00:27:54,840 Speaker 3: Cannot believe how we treated people. I mean, the simplest 533 00:27:54,880 --> 00:27:57,160 Speaker 3: idea is like lobotomies, like we thought that was. 534 00:27:57,119 --> 00:28:00,160 Speaker 1: Great and it wasn't even that long ago. Right? Now, oh, 535 00:28:00,480 --> 00:28:01,400 Speaker 1: we don't know everything.
536 00:28:01,480 --> 00:28:05,439 Speaker 3: We know more, we don't know everything. And so I'm thinking, like, 537 00:28:05,480 --> 00:28:07,080 Speaker 3: in twenty years, what am I going to look back 538 00:28:07,119 --> 00:28:10,239 Speaker 3: on and be like, I am so sad that I 539 00:28:10,280 --> 00:28:11,240 Speaker 3: did that or thought that. 540 00:28:11,760 --> 00:28:12,200 Speaker 4: Yeah. 541 00:28:12,320 --> 00:28:16,399 Speaker 2: There's a quote from an opinion writer named Jessica Grose 542 00:28:16,400 --> 00:28:18,359 Speaker 2: that I included in the intro of this book, and 543 00:28:18,400 --> 00:28:20,119 Speaker 2: I'm not gonna be able to remember it verbatim, but 544 00:28:20,160 --> 00:28:23,359 Speaker 2: she said something like, I think, because we've progressed so 545 00:28:23,480 --> 00:28:25,880 Speaker 2: much technologically in the past one hundred years, we think 546 00:28:25,920 --> 00:28:29,240 Speaker 2: that everything is knowable now. And that's both so arrogant 547 00:28:29,240 --> 00:28:31,080 Speaker 2: and so fucking boring, is what she said. 548 00:28:31,680 --> 00:28:34,000 Speaker 4: And yeah, no, I think. 549 00:28:33,880 --> 00:28:38,000 Speaker 2: There are so many biases at play to make us 550 00:28:38,560 --> 00:28:42,520 Speaker 2: like want to mollify the sense of like unease or 551 00:28:42,560 --> 00:28:45,920 Speaker 2: irresolution that we feel, and there are so many biases 552 00:28:46,240 --> 00:28:49,240 Speaker 2: misleading us into thinking that we do know more than 553 00:28:49,240 --> 00:28:51,440 Speaker 2: we do. There is a chapter of the book about 554 00:28:51,480 --> 00:28:54,800 Speaker 2: overconfidence bias.
It was a humbling chapter to write, because God, 555 00:28:54,880 --> 00:28:57,440 Speaker 2: I was thinking, like, surely this is the one bias 556 00:28:57,520 --> 00:29:00,960 Speaker 2: that can't apply to me, Like I'm a reasonable person, 557 00:29:01,160 --> 00:29:02,240 Speaker 2: I hate myself. 558 00:29:03,000 --> 00:29:06,200 Speaker 4: But indeed, this is like this near 559 00:29:06,160 --> 00:29:11,240 Speaker 2: universal tendency to over credit ourselves with positive outcomes or 560 00:29:11,280 --> 00:29:15,440 Speaker 2: to you know, overestimate our expertise in small everyday things 561 00:29:15,520 --> 00:29:16,000 Speaker 2: like you. 562 00:29:15,960 --> 00:29:19,000 Speaker 4: Know, more than... well, over half of. 563 00:29:19,400 --> 00:29:23,840 Speaker 2: Study participants reported that they were above average in driving, sex, 564 00:29:23,880 --> 00:29:24,400 Speaker 2: and cooking. 565 00:29:24,800 --> 00:29:26,760 Speaker 4: You know, like everybody thinks like, oh yeah, no, I'm 566 00:29:26,920 --> 00:29:29,320 Speaker 4: I'm better than average at sex, and it's like, well, 567 00:29:29,800 --> 00:29:33,440 Speaker 4: only fifty percent can be. So that was interesting. 568 00:29:33,480 --> 00:29:37,560 Speaker 2: But what I discovered researching that chapter is that we 569 00:29:37,640 --> 00:29:41,920 Speaker 2: have this really hazy cognitive barrier, so like, we 570 00:29:42,000 --> 00:29:45,640 Speaker 2: have a lot of trouble distinguishing where our knowledge ends 571 00:29:45,720 --> 00:29:49,360 Speaker 2: and another person's begins. There was a study that this 572 00:29:49,400 --> 00:29:54,240 Speaker 2: great book called The Knowledge Illusion quoted where participants were asked, 573 00:29:54,600 --> 00:29:57,760 Speaker 2: do you know how a zipper or toilet works?
And 574 00:29:57,760 --> 00:29:59,520 Speaker 2: most people were like, yeah, of course, I feel like 575 00:29:59,520 --> 00:30:01,040 Speaker 2: I use a zipper and a toilet every day. Of course, 576 00:30:01,040 --> 00:30:02,960 Speaker 2: I know how it works. And then they were asked 577 00:30:02,960 --> 00:30:06,880 Speaker 2: to write step by step breakdowns of exactly how these 578 00:30:07,120 --> 00:30:12,280 Speaker 2: gadgets operated, and that exercise showed them their ignorance, because 579 00:30:12,600 --> 00:30:14,360 Speaker 2: when you get right down to it, most of us 580 00:30:14,600 --> 00:30:17,720 Speaker 2: do not know how toilets and zippers truly truly work. 581 00:30:18,040 --> 00:30:22,120 Speaker 2: But it's someone else's expertise, and it's someone else's capabilities 582 00:30:22,280 --> 00:30:24,800 Speaker 2: that allowed us to think that we do know that, 583 00:30:25,280 --> 00:30:29,120 Speaker 2: and that is obviously disturbing that, like we think we 584 00:30:29,200 --> 00:30:31,000 Speaker 2: know more than we do, and that we're not able 585 00:30:31,040 --> 00:30:33,760 Speaker 2: to tell when you know, like where our limits are. 586 00:30:34,120 --> 00:30:37,000 Speaker 2: But there's a beautiful upside because that's also why human 587 00:30:37,080 --> 00:30:39,760 Speaker 2: beings get so much done. That's like how progress is 588 00:30:39,800 --> 00:30:43,080 Speaker 2: able to move forward because we collaborate, We like sort 589 00:30:43,120 --> 00:30:45,959 Speaker 2: of mind meld with people who have different skill sets 590 00:30:46,040 --> 00:30:48,800 Speaker 2: than we do, and that allows us to have the 591 00:30:48,840 --> 00:30:53,160 Speaker 2: audacity to move forward, whether we're talking about technological progress 592 00:30:53,280 --> 00:30:56,120 Speaker 2: or grassroots activism.
Like if everyone had to have a 593 00:30:56,160 --> 00:30:59,680 Speaker 2: PhD in you know, gender studies before they participated in 594 00:30:59,720 --> 00:31:03,360 Speaker 2: feminist movements, we wouldn't have feminist movements. So you know, 595 00:31:03,360 --> 00:31:06,520 Speaker 2: there's an upside to it, and yet as the world 596 00:31:06,600 --> 00:31:12,000 Speaker 2: becomes ever more complicated and abstract, the downsides become a 597 00:31:12,040 --> 00:31:14,760 Speaker 2: little more risky. There's also a phenomenon called the Google 598 00:31:14,760 --> 00:31:19,040 Speaker 2: effect where we tend to like pretty immediately forget the 599 00:31:19,160 --> 00:31:21,920 Speaker 2: answers that we learn via web search, but we also 600 00:31:21,960 --> 00:31:25,160 Speaker 2: forget that we forgot it, like we mistake Google's knowledge 601 00:31:25,200 --> 00:31:28,760 Speaker 2: for our own. And so that's upsetting too, because when 602 00:31:28,800 --> 00:31:32,720 Speaker 2: you get into you know, debates or when you're going 603 00:31:32,760 --> 00:31:36,680 Speaker 2: to vote, like you might think you understand a policy 604 00:31:37,000 --> 00:31:39,240 Speaker 2: or an argument better than you do, and that can 605 00:31:39,600 --> 00:31:42,840 Speaker 2: incite so much conflict, potentially with people that you love, 606 00:31:42,920 --> 00:31:45,000 Speaker 2: and that can result in like you cutting them off 607 00:31:45,080 --> 00:31:47,120 Speaker 2: even though like nobody really knew what they were talking about. 608 00:31:47,160 --> 00:31:49,360 Speaker 2: So being aware of those things, I think is just 609 00:31:49,480 --> 00:31:53,040 Speaker 2: so important and really humbling, because while we're not very 610 00:31:53,080 --> 00:31:56,000 Speaker 2: good at changing other people's minds, we're pretty good at 611 00:31:56,040 --> 00:31:56,680 Speaker 2: changing our own.
612 00:31:57,040 --> 00:32:00,160 Speaker 3: It reminds me of another chapter, where like information, 613 00:32:00,480 --> 00:32:02,840 Speaker 3: it leaves as easily as it comes. And 614 00:32:03,320 --> 00:32:05,320 Speaker 3: before we get to that, I did want to say, 615 00:32:05,320 --> 00:32:09,280 Speaker 3: because of overconfidence bias, I am curious, like what parts 616 00:32:09,320 --> 00:32:11,640 Speaker 3: of writing this book were like the most shocking 617 00:32:11,960 --> 00:32:14,400 Speaker 3: or surprising to you? But I also felt the same 618 00:32:14,400 --> 00:32:16,040 Speaker 3: way of like, ooh, this is one that I'm going 619 00:32:16,080 --> 00:32:18,200 Speaker 3: to get out of jail free with this chapter. 620 00:32:18,960 --> 00:32:22,800 Speaker 1: And as you were starting it out, I thought to myself, well, 621 00:32:23,080 --> 00:32:25,360 Speaker 1: I struggle with imposter syndrome. This is not my thing. 622 00:32:25,760 --> 00:32:28,920 Speaker 3: And then very quickly you tied that all in, and 623 00:32:29,320 --> 00:32:31,160 Speaker 3: I want to read something else that you wrote because 624 00:32:31,200 --> 00:32:34,080 Speaker 3: it was important, and I wrote something. I journaled 625 00:32:34,120 --> 00:32:37,280 Speaker 3: about it because it felt so important to me. You said, 626 00:32:37,400 --> 00:32:40,160 Speaker 3: we cannot perceive self doubt as weakness, and we shouldn't 627 00:32:40,200 --> 00:32:45,040 Speaker 3: demand underlying certainty, even from experts, or they will surely 628 00:32:45,080 --> 00:32:49,120 Speaker 3: bullshit us in order to meet that expectation. That's so 629 00:32:49,240 --> 00:32:53,160 Speaker 3: scary to me, because I felt it, and so I 630 00:32:53,200 --> 00:32:54,040 Speaker 3: know other people. 631 00:32:53,880 --> 00:32:56,800 Speaker 1: Are feeling it. I felt it too, and I wrote, I.
632 00:32:56,800 --> 00:32:58,560 Speaker 3: don't know if you'll relate to this, but I wrote 633 00:32:58,640 --> 00:33:01,800 Speaker 3: kind of in the margins, these feelings of inferiority 634 00:33:02,200 --> 00:33:06,320 Speaker 3: might be my accurate feelings of knowing my limits. However, 635 00:33:06,800 --> 00:33:10,840 Speaker 3: the expectations to know it all create a forced idea 636 00:33:10,880 --> 00:33:13,479 Speaker 3: that I know nothing or I need to fake it, 637 00:33:14,080 --> 00:33:17,400 Speaker 3: and I'm overconfident when I don't think people can call 638 00:33:17,480 --> 00:33:20,520 Speaker 3: out the bullshit. And I've become the dumbest person alive 639 00:33:20,920 --> 00:33:24,680 Speaker 3: when there's someone around me that could point out any discrepancy. 640 00:33:24,720 --> 00:33:28,240 Speaker 3: That's how I feel, as I guess more so a 641 00:33:28,280 --> 00:33:31,560 Speaker 3: person in the field that I'm in. But also there 642 00:33:31,640 --> 00:33:33,760 Speaker 3: is this drive. It feels like I need to know 643 00:33:33,880 --> 00:33:37,440 Speaker 3: every social construct of it. I need to understand everything 644 00:33:37,480 --> 00:33:39,480 Speaker 3: that's coming out. I need to know every political stance. 645 00:33:39,520 --> 00:33:41,480 Speaker 3: I need to be behind all these things because I 646 00:33:41,520 --> 00:33:42,520 Speaker 3: work to help people. 647 00:33:43,160 --> 00:33:44,960 Speaker 1: But it's scary because. 648 00:33:45,160 --> 00:33:48,720 Speaker 3: I know that I felt that, like, well, I could 649 00:33:48,800 --> 00:33:51,000 Speaker 3: fake it right here and just like act like I 650 00:33:51,040 --> 00:33:53,680 Speaker 3: know what I'm talking about, but I don't, and so 651 00:33:53,960 --> 00:33:56,480 Speaker 3: I could be perpetuating things that aren't true. And I'm 652 00:33:56,520 --> 00:33:59,360 Speaker 3: also perpetuating the idea that I need to know everything.
658 00:34:21,400 --> 00:34:25,480 Speaker 2: Yeah, I really struggle with this too, because when you 659 00:34:25,600 --> 00:34:28,560 Speaker 2: have a platform, or even if you're just talking one 660 00:34:28,560 --> 00:34:30,719 Speaker 2: on one and you're wanting to create a good impression 661 00:34:30,800 --> 00:34:34,319 Speaker 2: and people are looking at you so expectantly, there is 662 00:34:34,440 --> 00:34:37,040 Speaker 2: pressure to wanna fake it till you make it. I mean, 663 00:34:37,080 --> 00:34:41,319 Speaker 2: that's considered like conventional wisdom in America, you know, fake 664 00:34:41,320 --> 00:34:43,520 Speaker 2: it till you make it. And another thing about that phrase, 665 00:34:43,560 --> 00:34:45,759 Speaker 2: fake it till you make it, I have internalized that 666 00:34:45,840 --> 00:34:48,839 Speaker 2: as truth because it rhymes. And this is another phenomenon 667 00:34:48,840 --> 00:34:50,719 Speaker 2: that I actually talk about in a different part of 668 00:34:50,719 --> 00:34:52,800 Speaker 2: the book where I address a bias called the illusory 669 00:34:52,840 --> 00:34:55,799 Speaker 2: truth effect, which describes our tendency to think that something 670 00:34:55,840 --> 00:34:58,480 Speaker 2: is true just because we've heard it multiple times.
We 671 00:34:58,560 --> 00:35:02,040 Speaker 2: tend to internalize things as true even when they are 672 00:35:02,080 --> 00:35:05,120 Speaker 2: totally implausible and we really really know deep down that 673 00:35:05,440 --> 00:35:08,320 Speaker 2: they couldn't be true when we've heard them multiple times, 674 00:35:08,400 --> 00:35:12,320 Speaker 2: when they rhyme, when they're written in like legible, aesthetically 675 00:35:12,320 --> 00:35:15,120 Speaker 2: pleasing fonts. These are things that make us feel like 676 00:35:15,160 --> 00:35:18,560 Speaker 2: something is true because they activate processing fluency, Like if 677 00:35:18,560 --> 00:35:22,279 Speaker 2: something goes down easy, we tend to think like, oh, well, 678 00:35:22,280 --> 00:35:25,680 Speaker 2: there must be validity there. So that just is a 679 00:35:25,719 --> 00:35:29,480 Speaker 2: sidebar to explain how that phrase fake it till you 680 00:35:29,520 --> 00:35:32,600 Speaker 2: make it is so internalized in me. Also because I 681 00:35:32,680 --> 00:35:35,279 Speaker 2: grew up a theater kid, so like when you do 682 00:35:35,440 --> 00:35:37,439 Speaker 2: community theater as a child, or when you do theater 683 00:35:37,480 --> 00:35:40,719 Speaker 2: as an adult too, like your directors and your theater 684 00:35:40,800 --> 00:35:42,839 Speaker 2: teachers are being like if someone asks if you can 685 00:35:42,840 --> 00:35:45,840 Speaker 2: tap dance, you better say yes, I fucking can, you know, 686 00:35:46,120 --> 00:35:47,880 Speaker 2: like yes I can, I can do that, you know. 687 00:35:48,480 --> 00:35:52,680 Speaker 2: And so I have been noticing that in just like 688 00:35:52,719 --> 00:35:57,080 Speaker 2: American culture and in everyday life, like a little bit 689 00:35:57,080 --> 00:36:02,840 Speaker 2: of overconfidence is actually proven to boost morale. It's proven 690 00:36:02,880 --> 00:36:07,080 Speaker 2: to like galvanize people.
There are several experiments that have 691 00:36:07,080 --> 00:36:10,520 Speaker 2: been conducted, little like wargame experiments, where the person who 692 00:36:10,560 --> 00:36:13,959 Speaker 2: came out on top was always someone who had behaved with 693 00:36:14,120 --> 00:36:18,279 Speaker 4: a well-calibrated but still a level of overconfidence that 694 00:36:18,400 --> 00:36:19,719 Speaker 4: was like delulu. 695 00:36:20,080 --> 00:36:22,160 Speaker 2: So there are like there are benefits to overconfidence, like 696 00:36:22,200 --> 00:36:26,480 Speaker 2: it will get you somewhere. However, this like malignant, over 697 00:36:26,560 --> 00:36:30,080 Speaker 2: the top, you know, narcissistic level of overconfidence that we 698 00:36:30,560 --> 00:36:35,640 Speaker 2: demand in our public figures, whether they're CEOs or politicians 699 00:36:35,719 --> 00:36:36,600 Speaker 2: or celebrities. 700 00:36:37,080 --> 00:36:39,839 Speaker 4: I have tried so hard to 701 00:36:39,800 --> 00:36:44,520 Speaker 2: stop expecting that when I am perceiving someone speak in public, 702 00:36:44,680 --> 00:36:48,359 Speaker 2: like it is not healthy for anyone to demand that 703 00:36:48,400 --> 00:36:51,799 Speaker 2: they speak on subjects that they don't know about, and 704 00:36:51,880 --> 00:36:54,359 Speaker 2: yet we perceive it as a weakness when someone says 705 00:36:54,440 --> 00:36:56,680 Speaker 2: I'm not sure or I need to check, or I'm 706 00:36:56,719 --> 00:36:59,839 Speaker 2: going to default to this person. So again, we're bad 707 00:36:59,880 --> 00:37:02,560 Speaker 2: at changing other people's minds. We're pretty good at changing 708 00:37:02,640 --> 00:37:04,960 Speaker 2: our own. And so if we can mitigate our own 709 00:37:05,080 --> 00:37:08,920 Speaker 2: expectations of public figures' overconfidence, I think that that's 710 00:37:08,920 --> 00:37:09,800 Speaker 2: a good place to start.
711 00:37:10,200 --> 00:37:11,759 Speaker 3: Yeah, I think there's going to be there could be 712 00:37:11,760 --> 00:37:15,759 Speaker 3: a return on that investment, so to speak. So one 713 00:37:15,800 --> 00:37:18,440 Speaker 3: of the biases that I had no idea what it 714 00:37:18,480 --> 00:37:21,799 Speaker 3: was was the recency illusion. I had never heard of that. 715 00:37:22,440 --> 00:37:23,960 Speaker 3: Can you explain that? 716 00:37:24,360 --> 00:37:24,720 Speaker 4: Yeah? 717 00:37:24,760 --> 00:37:28,640 Speaker 2: The recency illusion is our proclivity to think that something 718 00:37:28,719 --> 00:37:32,880 Speaker 2: is new objectively and thus urgent and worthy of panic 719 00:37:33,280 --> 00:37:36,960 Speaker 2: just because it's new to us. So I might come 720 00:37:36,960 --> 00:37:42,480 Speaker 2: across a headline that feels really urgent and scary, and 721 00:37:42,520 --> 00:37:44,200 Speaker 2: it will like send me into fight or flight, and 722 00:37:44,239 --> 00:37:46,919 Speaker 2: I'll go down a rabbit hole googling, like I don't 723 00:37:46,920 --> 00:37:50,080 Speaker 2: know why such and such a thing is like polluting 724 00:37:50,080 --> 00:37:52,800 Speaker 2: the soil and poisoning us all. And then I'll realize 725 00:37:52,840 --> 00:37:56,040 Speaker 2: that that headline was just like an article refresh from 726 00:37:56,080 --> 00:37:58,319 Speaker 2: six years ago and it's not new and it's not 727 00:37:58,400 --> 00:38:00,759 Speaker 2: a threat. And like I was on fire, so I didn't 728 00:38:00,760 --> 00:38:01,479 Speaker 2: even like look at 729 00:38:01,360 --> 00:38:02,560 Speaker 4: the date that it was published. 730 00:38:02,640 --> 00:38:06,360 Speaker 2: Yeah, but the recency illusion connects to again one of 731 00:38:06,360 --> 00:38:12,759 Speaker 2: these survival mechanisms.
Long before the digital age, recency meant relevance, 732 00:38:12,960 --> 00:38:16,360 Speaker 2: and if a stimulus was claiming your attention, that was 733 00:38:16,520 --> 00:38:18,960 Speaker 2: a good enough sign that you might want to pay 734 00:38:18,960 --> 00:38:23,000 Speaker 2: attention to it for survival purposes. So say you're in 735 00:38:23,160 --> 00:38:25,680 Speaker 2: your community, you're a hunter-gatherer, and you hear a 736 00:38:25,760 --> 00:38:28,600 Speaker 2: rustling in the bushes. The cause of that rustling in 737 00:38:28,640 --> 00:38:31,520 Speaker 2: the bushes, maybe it's a predator, could have been sitting 738 00:38:31,560 --> 00:38:34,399 Speaker 2: there for six hours. But the fact that you are 739 00:38:34,560 --> 00:38:38,040 Speaker 2: just noticing it means that it might be time to run. 740 00:38:38,560 --> 00:38:41,520 Speaker 2: There is like no advantage to being like, oh, what 741 00:38:42,280 --> 00:38:45,280 Speaker 2: if that thing is not even a predator, and actually 742 00:38:45,280 --> 00:38:47,760 Speaker 2: what if it's been there all day? There's no upside 743 00:38:47,760 --> 00:38:51,000 Speaker 2: to thinking like that. Just run, like your survival will 744 00:38:51,000 --> 00:38:53,319 Speaker 2: thank you even if it wasn't actually a threat. 745 00:38:53,440 --> 00:38:56,560 Speaker 2: It's better just to panic just in case. But again, 746 00:38:56,600 --> 00:39:01,440 Speaker 2: we are now projecting those impulses onto these digital threats, 747 00:39:01,800 --> 00:39:05,319 Speaker 2: and I think this can explain why, you know, a 748 00:39:05,440 --> 00:39:09,000 Speaker 2: curt email from your boss that you didn't realize had 749 00:39:09,360 --> 00:39:12,439 Speaker 2: shown up eight hours ago can fully send you into 750 00:39:12,440 --> 00:39:15,440 Speaker 2: fight or flight, even though it's not actually a threat, 751 00:39:15,680 --> 00:39:16,759 Speaker 2: and it's not even new.
752 00:39:17,440 --> 00:39:19,680 Speaker 3: In this chapter, I think you were talking about when 753 00:39:19,719 --> 00:39:22,200 Speaker 3: you worked as an editor and you were kind of 754 00:39:22,200 --> 00:39:23,719 Speaker 3: recycling old headlines. 755 00:39:24,360 --> 00:39:26,200 Speaker 2: Yeah, I used to work as a beauty editor. That 756 00:39:26,280 --> 00:39:28,640 Speaker 2: was my day job. It's a long story how I 757 00:39:28,760 --> 00:39:31,120 Speaker 2: ended up there. But it was not my dream to 758 00:39:31,120 --> 00:39:32,759 Speaker 2: write about eye cream for a living. But you know, 759 00:39:33,280 --> 00:39:36,080 Speaker 2: you take the job you can get. Truly, no complaints. 760 00:39:36,080 --> 00:39:38,120 Speaker 2: There are many worse jobs in this world. And now 761 00:39:38,120 --> 00:39:39,680 Speaker 2: I know how to do my makeup, so it was 762 00:39:39,719 --> 00:39:41,720 Speaker 2: a good little training camp, I guess. 763 00:39:42,040 --> 00:39:42,400 Speaker 4: Anyway. 764 00:39:43,239 --> 00:39:46,080 Speaker 2: Yeah, like a huge part of our job as editors 765 00:39:46,360 --> 00:39:50,360 Speaker 2: was to refresh old stories because it would, you know, 766 00:39:50,440 --> 00:39:52,560 Speaker 2: allow you to get clicks without having to write something 767 00:39:52,600 --> 00:39:55,320 Speaker 2: brand new. And so sometimes you would take a story 768 00:39:55,360 --> 00:39:59,120 Speaker 2: from yeah, like three years ago, and we would refresh 769 00:39:59,120 --> 00:40:03,320 Speaker 2: the headline to sound like really dramatic and click baity, 770 00:40:03,440 --> 00:40:06,400 Speaker 2: like a problem that had just been discovered and that 771 00:40:06,440 --> 00:40:10,920 Speaker 2: you just need to address right now, like eleven, like 772 00:40:11,160 --> 00:40:15,800 Speaker 2: bloat-causing diet mistakes.
Nutritionists want you to stop making, 773 00:40:16,040 --> 00:40:19,399 Speaker 2: like, yesterday, you know, something like that, or it could 774 00:40:19,440 --> 00:40:22,440 Speaker 2: be a wellness thing, like therapists have just discovered 775 00:40:22,719 --> 00:40:25,120 Speaker 2: the one thing that's messing with your sleep. 776 00:40:25,640 --> 00:40:27,560 Speaker 4: But really that article was 777 00:40:27,640 --> 00:40:30,560 Speaker 2: you know, old, and therapists did not just discover anything, 778 00:40:30,560 --> 00:40:32,160 Speaker 2: and it's not really ruining 779 00:40:31,840 --> 00:40:32,400 Speaker 4: your sleep, you know. 780 00:40:32,520 --> 00:40:39,120 Speaker 2: But we were unconsciously tasked with exploiting this bias that 781 00:40:39,200 --> 00:40:41,480 Speaker 2: I now know about called the recency illusion. 782 00:40:41,360 --> 00:40:43,719 Speaker 3: Which is part of, like, you saying, like, knowing that 783 00:40:43,760 --> 00:40:46,360 Speaker 3: these are things is helpful in and of itself. 784 00:40:46,680 --> 00:40:49,600 Speaker 3: I've noticed myself like double-checking before I click that. 785 00:40:49,680 --> 00:40:52,960 Speaker 1: I'm like, wait a second, have I read this before? 786 00:40:53,560 --> 00:40:56,759 Speaker 3: But like the title was different, or didn't they put 787 00:40:56,760 --> 00:40:59,640 Speaker 3: this article out yesterday with a different title, or isn't 788 00:40:59,680 --> 00:41:01,680 Speaker 3: this the same thing they were just talking about? They 789 00:41:01,719 --> 00:41:04,040 Speaker 3: just like pulled a different sentence to be the headline. 790 00:41:04,640 --> 00:41:08,200 Speaker 3: So that's that is very helpful because I was unaware 791 00:41:08,200 --> 00:41:08,919 Speaker 3: that this was a thing. 792 00:41:09,320 --> 00:41:11,439 Speaker 1: But it also, on a different
793 00:41:11,200 --> 00:41:16,239 Speaker 3: level, I have also grappled with this idea, whether it's 794 00:41:16,320 --> 00:41:18,600 Speaker 3: mine or other people's. When I don't want to focus 795 00:41:18,680 --> 00:41:22,040 Speaker 3: on my own stuff, I will make it an other-people problem. 796 00:41:22,160 --> 00:41:24,759 Speaker 3: Even though I struggle with this myself, but I will 797 00:41:24,760 --> 00:41:28,640 Speaker 3: sometimes get kind of down on myself and wonder again 798 00:41:28,640 --> 00:41:31,560 Speaker 3: whether it's me or somebody else, if some of the 799 00:41:31,600 --> 00:41:34,960 Speaker 3: causes that I care about, if it is performative, or 800 00:41:35,000 --> 00:41:37,560 Speaker 3: like if I really care about certain things and you 801 00:41:37,640 --> 00:41:39,560 Speaker 3: kind of touched on that, or you definitely touched on 802 00:41:39,600 --> 00:41:42,160 Speaker 3: that we were talking about, like the Me Too movement 803 00:41:42,360 --> 00:41:45,680 Speaker 3: and like Black Lives Matter, and during I think, well, 804 00:41:45,680 --> 00:41:48,520 Speaker 3: from twenty twenty until now, I just feel like I've 805 00:41:48,560 --> 00:41:51,440 Speaker 3: noticed so many. And maybe it's because social media is 806 00:41:51,880 --> 00:41:56,560 Speaker 3: growing and becoming more of a news source that I 807 00:41:56,600 --> 00:42:00,640 Speaker 3: have noticed that I'll feel like the pressure to act now. 808 00:42:01,200 --> 00:42:03,920 Speaker 3: A friend of mine recently was talking about how when 809 00:42:03,920 --> 00:42:07,600 Speaker 3: she's handed information, she has a larger social media following 810 00:42:07,600 --> 00:42:10,520 Speaker 3: and has shifted the way she uses social media because 811 00:42:10,560 --> 00:42:12,880 Speaker 3: she felt like there was pressure for her to respond 812 00:42:12,920 --> 00:42:16,600 Speaker 3: to every event immediately.
And she was like, I need 813 00:42:16,600 --> 00:42:19,839 Speaker 3: to process what I think about these things, and then 814 00:42:19,920 --> 00:42:22,080 Speaker 3: I can maybe make a statement if it's something that 815 00:42:22,239 --> 00:42:24,759 Speaker 3: I can spend my energy on, or maybe I decide 816 00:42:25,480 --> 00:42:27,840 Speaker 3: that I deal with it on my own. But I 817 00:42:27,840 --> 00:42:29,560 Speaker 3: want to read another thing that you wrote about that, 818 00:42:29,600 --> 00:42:31,399 Speaker 3: and then I want to kind of hear your take, 819 00:42:31,840 --> 00:42:34,719 Speaker 3: but you wrote: brain scientists agree that these shifts are 820 00:42:34,760 --> 00:42:37,719 Speaker 3: not always due to a lack of care. The issues 821 00:42:37,800 --> 00:42:40,440 Speaker 3: brought to light during social reckonings like the Me Too 822 00:42:40,640 --> 00:42:44,719 Speaker 3: movement and Black Lives Matter all already existed, and the 823 00:42:44,800 --> 00:42:47,520 Speaker 3: reactions to them are long overdue. But the thing is, 824 00:42:47,680 --> 00:42:50,479 Speaker 3: the same power that gave those issues salience gave other 825 00:42:50,520 --> 00:42:55,040 Speaker 3: things salience very quickly thereafter. Our nervous systems struggle to 826 00:42:55,080 --> 00:43:00,480 Speaker 3: sustain agitation for the many new crises new platforms serve us. Actually, 827 00:43:00,560 --> 00:43:05,239 Speaker 3: when material changes don't result right away, the brain is 828 00:43:05,280 --> 00:43:08,320 Speaker 3: not prepared to be exposed to trauma, so very often 829 00:43:08,840 --> 00:43:11,480 Speaker 3: it also needs positive feedback to help us step out 830 00:43:11,480 --> 00:43:12,200 Speaker 3: of survival mode. 831 00:43:12,320 --> 00:43:15,440 Speaker 1: And I felt like that it helps kind of my 832 00:43:15,440 --> 00:43:16,640 Speaker 1: shoulders drop a little bit.
833 00:43:16,719 --> 00:43:18,840 Speaker 3: And when I read that, yeah, maybe it's not 834 00:43:18,880 --> 00:43:21,240 Speaker 3: that I don't care about these issues, but I don't. 835 00:43:21,480 --> 00:43:24,160 Speaker 3: And we aren't supposed to be able to be in 836 00:43:24,400 --> 00:43:27,040 Speaker 3: the depths of these twenty-four seven, all of the time, 837 00:43:27,360 --> 00:43:29,239 Speaker 3: and it feels like there's always another one I need 838 00:43:29,280 --> 00:43:29,680 Speaker 3: to add. 839 00:43:30,239 --> 00:43:32,439 Speaker 2: Yeah. So some of the quotes that you read there, 840 00:43:32,480 --> 00:43:34,719 Speaker 2: I just want to give credit to the people who 841 00:43:34,880 --> 00:43:38,160 Speaker 2: spoke them. A couple of those quotes were from Sekoul Krastev, 842 00:43:38,160 --> 00:43:41,320 Speaker 2: who is a decision scientist, and another was from Minaa B., 843 00:43:41,320 --> 00:43:45,480 Speaker 2: who is a therapist who has a pretty big platform online. 844 00:43:45,680 --> 00:43:48,799 Speaker 2: Because I approach that subject matter with the question of, like, 845 00:43:48,840 --> 00:43:52,560 Speaker 2: how does the recency illusion connect to so many of 846 00:43:52,560 --> 00:43:55,880 Speaker 2: the ways that we process the news, news that directly 847 00:43:55,960 --> 00:43:58,320 Speaker 2: affects us, as well as news that is maybe 848 00:43:58,160 --> 00:44:01,200 Speaker 4: more abstract because it's affecting people across the world.
849 00:44:01,320 --> 00:44:04,800 Speaker 2: And I was really reckoning with behavior in myself, and 850 00:44:04,840 --> 00:44:07,680 Speaker 2: behavior that I was seeing in some of my loved ones, 851 00:44:07,680 --> 00:44:12,080 Speaker 2: where yes, we were extremely quick to engage in issues 852 00:44:12,120 --> 00:44:16,560 Speaker 2: that were important, and we would clamor to social media 853 00:44:16,600 --> 00:44:20,600 Speaker 2: platforms to express our solidarity and to share information and 854 00:44:20,640 --> 00:44:23,640 Speaker 2: all these things, and then time would go by and 855 00:44:23,680 --> 00:44:27,480 Speaker 2: that urgency, at least the appearance of that urgency on 856 00:44:27,520 --> 00:44:30,279 Speaker 2: social media, would kind of dissipate. 857 00:44:30,440 --> 00:44:32,680 Speaker 4: And I felt really disheartened 858 00:44:32,120 --> 00:44:34,319 Speaker 2: by that and by my own self, like, what? I 859 00:44:34,360 --> 00:44:37,960 Speaker 2: do, I do care about this. This is materially affecting 860 00:44:38,320 --> 00:44:41,520 Speaker 2: many people in our culture who I care about in 861 00:44:42,480 --> 00:44:45,319 Speaker 2: society at large. And why am I not, you know, 862 00:44:45,400 --> 00:44:48,640 Speaker 2: posting with the same urgency that I was when this 863 00:44:48,719 --> 00:44:51,680 Speaker 2: news broke or when these matters were like truly in 864 00:44:51,680 --> 00:44:54,959 Speaker 2: every headline? And I was, you know, questioning, like, yeah, 865 00:44:55,000 --> 00:44:57,960 Speaker 2: like, was it this nefarious behavior? Like, you know, was 866 00:44:57,960 --> 00:44:59,160 Speaker 4: I just jumping on a bandwagon? 867 00:44:59,160 --> 00:45:02,000 Speaker 2: But no, I don't feel that that's true.
And as 868 00:45:02,000 --> 00:45:04,560 Speaker 2: it turns out, like our attention spans have so much 869 00:45:04,600 --> 00:45:06,760 Speaker 2: to do with it, the global attention span is shrinking 870 00:45:06,760 --> 00:45:09,600 Speaker 2: in the digital age. And of course not everyone has 871 00:45:09,680 --> 00:45:13,360 Speaker 2: the privilege to disengage from certain subjects, from certain matters. 872 00:45:13,400 --> 00:45:16,320 Speaker 2: You know, if a matter is directly affecting you, whether 873 00:45:16,360 --> 00:45:19,240 Speaker 2: it is, you know, a climate issue or a social 874 00:45:19,320 --> 00:45:23,279 Speaker 2: justice issue, you might not be able to just be like, oh, yeah, no, 875 00:45:23,400 --> 00:45:26,160 Speaker 2: I'm moving on to the next, you know, most optimal issue. 876 00:45:26,200 --> 00:45:28,759 Speaker 2: And we don't do these things consciously, we do them unconsciously. 877 00:45:29,280 --> 00:45:32,640 Speaker 2: But it just like invited me to entertain this notion 878 00:45:32,760 --> 00:45:37,480 Speaker 2: that when assessing the salience of contemporary concerns, our attention 879 00:45:37,960 --> 00:45:41,319 Speaker 2: is no longer the best barometer, you know, because like 880 00:45:41,400 --> 00:45:43,760 Speaker 2: as soon as that rustling in the bushes was proven 881 00:45:43,920 --> 00:45:45,600 Speaker 2: not to be a threat, we're like, okay, we can 882 00:45:45,680 --> 00:45:49,200 Speaker 2: move on, and that is no longer good enough when 883 00:45:49,320 --> 00:45:51,080 Speaker 2: making decisions about our culture. 884 00:45:51,560 --> 00:45:54,640 Speaker 3: That feels like a big deal to hear you say that. 885 00:45:54,760 --> 00:45:56,799 Speaker 3: Attention is, will you say it again? Attention is not 886 00:45:56,840 --> 00:45:58,040 Speaker 3: the best barometer. 887 00:45:58,200 --> 00:46:01,239 Speaker 2: Yeah, for assessing the salience of contemporary concerns.
888 00:46:01,520 --> 00:46:04,960 Speaker 3: Oh, oh my gosh. So I feel like, and we 889 00:46:05,000 --> 00:46:07,160 Speaker 3: haven't covered them all, like this book is. 890 00:46:07,520 --> 00:46:08,319 Speaker 1: And also I don't know. 891 00:46:08,440 --> 00:46:11,120 Speaker 3: I'm curious for you because so many of them I feel, 892 00:46:11,160 --> 00:46:12,480 Speaker 3: and I think in one of the chapters you kind 893 00:46:12,480 --> 00:46:15,960 Speaker 3: of talked about this, so many of them are connected 894 00:46:15,960 --> 00:46:19,520 Speaker 3: in so many ways. And if you're not using one 895 00:46:19,760 --> 00:46:21,920 Speaker 3: of these biases and another one is popping up right 896 00:46:21,960 --> 00:46:24,800 Speaker 3: underneath of it, or maybe you're combining them, I'm curious 897 00:46:25,080 --> 00:46:28,040 Speaker 3: kind of when we close out, like what was the 898 00:46:28,080 --> 00:46:31,040 Speaker 3: most interesting or shocking for you to look at? 899 00:46:31,120 --> 00:46:34,480 Speaker 1: And then also how did your brain organize this? 900 00:46:35,600 --> 00:46:38,319 Speaker 3: Just because I feel like they are so connected, but 901 00:46:38,400 --> 00:46:41,040 Speaker 3: the book still makes so much sense at the 902 00:46:41,040 --> 00:46:41,600 Speaker 3: same time. 903 00:46:42,080 --> 00:46:44,719 Speaker 4: Thanks. Oh gosh, the one that blew my mind the most.
904 00:46:44,800 --> 00:46:47,080 Speaker 2: I mean, there's a chapter in there that's very, very 905 00:46:47,080 --> 00:46:50,359 Speaker 2: personal that addresses the sunk cost fallacy, which is our 906 00:46:50,400 --> 00:46:54,640 Speaker 2: tendency to think that resources already spent on an endeavor, time, money, 907 00:46:54,680 --> 00:46:59,399 Speaker 2: but also emotional resources like hope, secrets, justify spending even more, 908 00:47:00,160 --> 00:47:04,080 Speaker 2: and looking into that bias really helped me understand what 909 00:47:04,120 --> 00:47:07,799 Speaker 2: I once perceived as like this unforgivably irrational decision to 910 00:47:07,840 --> 00:47:11,200 Speaker 2: spend seven of my formative years in a relationship that 911 00:47:11,480 --> 00:47:14,400 Speaker 2: really really did not serve me. And every time something 912 00:47:14,400 --> 00:47:16,840 Speaker 2: bad happened in that relationship, I would dig my heels 913 00:47:16,840 --> 00:47:18,600 Speaker 2: in and I would justify it to myself, and I 914 00:47:18,600 --> 00:47:21,000 Speaker 2: would tell myself like, surely a win is just around 915 00:47:21,040 --> 00:47:22,880 Speaker 2: the corner, even though there was nothing to suggest that 916 00:47:22,920 --> 00:47:26,520 Speaker 2: would be the case. And yeah, I felt really ashamed 917 00:47:26,520 --> 00:47:29,080 Speaker 2: for a long time until I started looking into the 918 00:47:29,160 --> 00:47:32,120 Speaker 2: role that the sunk cost fallacy plays in our personal 919 00:47:32,160 --> 00:47:36,280 Speaker 2: life decision making. So that was a really healing, scary 920 00:47:36,360 --> 00:47:39,040 Speaker 2: but healing chapter to write. And then in terms of 921 00:47:39,080 --> 00:47:42,160 Speaker 2: the organization. So I was inspired to write this book.
922 00:47:42,200 --> 00:47:43,520 Speaker 2: I guess we're at the end of the interview, but 923 00:47:43,560 --> 00:47:46,120 Speaker 2: I will just say that I was inspired to write 924 00:47:46,120 --> 00:47:49,319 Speaker 2: this book that uses cognitive biases as a lens to 925 00:47:49,440 --> 00:47:52,960 Speaker 2: understand these modern forms of irrationality by the research that 926 00:47:53,000 --> 00:47:55,200 Speaker 2: I was doing for my last book, Cultish. So as 927 00:47:55,239 --> 00:47:58,600 Speaker 2: I was looking into the mechanics of cult influence, naturally, 928 00:47:58,640 --> 00:48:01,920 Speaker 2: I came across terms like confirmation bias and sunk cost fallacy, 929 00:48:02,440 --> 00:48:05,920 Speaker 2: and while I couldn't really dive deep into those for 930 00:48:06,200 --> 00:48:09,040 Speaker 2: the cult book, I knew I wanted to pursue them 931 00:48:09,040 --> 00:48:11,840 Speaker 2: in a future project because not only were they explaining, 932 00:48:11,880 --> 00:48:14,959 Speaker 2: you know, people's decision to remain in Scientology for twenty 933 00:48:15,000 --> 00:48:17,279 Speaker 2: years or whatever, but they also really connected to my 934 00:48:17,360 --> 00:48:19,360 Speaker 2: own life and so many of the behaviors I was 935 00:48:19,400 --> 00:48:21,880 Speaker 2: noticing in the broader culture. And so the way that 936 00:48:21,920 --> 00:48:24,839 Speaker 2: I decided to organize the book is I found this 937 00:48:24,960 --> 00:48:28,920 Speaker 2: amazingly helpful infographic that laid out all like two hundred 938 00:48:28,960 --> 00:48:31,760 Speaker 2: plus cognitive biases that have been described over the years. 939 00:48:31,960 --> 00:48:34,120 Speaker 4: There are so many that have been described, and by 940 00:48:34,120 --> 00:48:34,680 Speaker 4: the way, the
941 00:48:34,680 --> 00:48:38,040 Speaker 2: term cognitive bias was coined by the late behavioral economists 942 00:48:38,080 --> 00:48:39,839 Speaker 2: Amos Tversky and Daniel Kahneman. 943 00:48:39,640 --> 00:48:41,720 Speaker 4: I most certainly did not go on enough. Anywho. 944 00:48:42,000 --> 00:48:45,640 Speaker 2: So I looked at this infographic and I started reading 945 00:48:45,640 --> 00:48:47,520 Speaker 2: into some of the biases that caught my eye, and 946 00:48:47,560 --> 00:48:50,560 Speaker 2: there were some where, intuitively, I was like, oh my god, 947 00:48:50,600 --> 00:48:52,640 Speaker 2: I have to talk about the halo effect because that 948 00:48:52,719 --> 00:48:55,359 Speaker 2: applies to this celebrity stuff, or I have to talk 949 00:48:55,400 --> 00:48:58,279 Speaker 2: about proportionality bias, because that is what I'm seeing in 950 00:48:58,320 --> 00:49:02,680 Speaker 2: this sort of conspiracy theorist world. So that decision was 951 00:49:02,800 --> 00:49:04,920 Speaker 2: pretty intuitive. But I thought it would just be kind 952 00:49:04,960 --> 00:49:09,120 Speaker 2: of like a nice or even poetic constraint to use 953 00:49:09,560 --> 00:49:13,040 Speaker 2: eleven different cognitive biases as like themes for every chapter. 954 00:49:13,520 --> 00:49:16,520 Speaker 3: Yeah, I just want to say I loved Cultish, and 955 00:49:16,560 --> 00:49:19,399 Speaker 3: I think that that book surprised me in the sense 956 00:49:19,440 --> 00:49:21,640 Speaker 3: I was like, oh, I'm gonna listen to this book 957 00:49:21,680 --> 00:49:24,919 Speaker 3: about something that has nothing to do with me. Yeah, 958 00:49:25,120 --> 00:49:28,600 Speaker 3: I was like, okay, maybe not. But then this book, 959 00:49:28,719 --> 00:49:33,120 Speaker 3: I guess.
I love the way that you offer information 960 00:49:33,320 --> 00:49:37,319 Speaker 3: and tell stories that allows us to take from it 961 00:49:37,600 --> 00:49:39,920 Speaker 3: what we want. You're not promising any, like, it's not 962 00:49:40,000 --> 00:49:41,360 Speaker 3: like read this book and you'll 963 00:49:41,280 --> 00:49:43,600 Speaker 1: do da da da da or whatever. You're like, this 964 00:49:43,640 --> 00:49:45,239 Speaker 1: is information, this is how 965 00:49:45,160 --> 00:49:47,000 Speaker 3: it's sometimes applied to my life, this is how it 966 00:49:47,080 --> 00:49:49,600 Speaker 3: sometimes applies to other people. And you can look at 967 00:49:49,600 --> 00:49:52,319 Speaker 3: how it applies to you and to me. That 968 00:49:52,360 --> 00:49:56,120 Speaker 3: offers a lot of agency, offers a lot of empowerment, 969 00:49:56,320 --> 00:49:59,480 Speaker 3: which is really what therapists 970 00:49:58,880 --> 00:49:59,799 Speaker 1: are trained to do. 971 00:50:00,560 --> 00:50:02,439 Speaker 3: Not that you're trying to be a therapist, but it's 972 00:50:02,520 --> 00:50:05,040 Speaker 3: just, it's so refreshing to see, and maybe that's 973 00:50:05,040 --> 00:50:06,360 Speaker 3: why I love it so much. 974 00:50:06,480 --> 00:50:06,920 Speaker 4: Thank you. 975 00:50:07,239 --> 00:50:09,120 Speaker 1: But I've been so kind of burnt out by the, 976 00:50:09,280 --> 00:50:12,759 Speaker 1: like, is this helping me, to, like, what is? What 977 00:50:12,880 --> 00:50:13,160 Speaker 1: is this? 978 00:50:13,320 --> 00:50:16,120 Speaker 3: Or have I read this before? And this feels like 979 00:50:16,200 --> 00:50:17,440 Speaker 3: it has been so helpful. 980 00:50:18,080 --> 00:50:19,000 Speaker 1: But I got to do that. 981 00:50:19,040 --> 00:50:21,680 Speaker 3: I got to decide how and what that help was, 982 00:50:21,800 --> 00:50:24,160 Speaker 3: and that, I think, is really special.
983 00:50:24,200 --> 00:50:25,200 Speaker 1: So thank you for doing that. 984 00:50:25,600 --> 00:50:28,320 Speaker 4: Oh, thank you. I'm so glad that's what you took from 985 00:50:28,120 --> 00:50:30,759 Speaker 3: it. And your brain, I just want to like live 986 00:50:30,760 --> 00:50:32,040 Speaker 3: inside of it for like a day. 987 00:50:32,680 --> 00:50:35,520 Speaker 2: Oh no, I really wouldn't recommend that. 988 00:50:35,880 --> 00:50:37,760 Speaker 4: I would not endorse that. 989 00:50:38,680 --> 00:50:40,239 Speaker 1: It's very creative, I will say. 990 00:50:40,239 --> 00:50:42,160 Speaker 3: And the way you read, I really like. So I 991 00:50:42,239 --> 00:50:45,120 Speaker 3: really, anyone who's listening, like, we just like scratched the 992 00:50:45,160 --> 00:50:47,680 Speaker 3: surface of what is inside of this book and I 993 00:50:47,719 --> 00:50:50,399 Speaker 3: think what you guys can make of it on your own. 994 00:50:50,600 --> 00:50:53,279 Speaker 3: And so it's out now. You can buy it anywhere 995 00:50:53,320 --> 00:50:54,400 Speaker 3: and everywhere? 996 00:50:54,040 --> 00:50:54,920 Speaker 4: Anywhere and everywhere. 997 00:50:55,000 --> 00:50:58,879 Speaker 3: It's an ebook, audiobook, hard copy. And the plus is 998 00:50:59,280 --> 00:51:00,799 Speaker 3: the audiobook is you reading it? 999 00:51:01,080 --> 00:51:02,799 Speaker 2: Yeah? I got to read it myself, which was so 1000 00:51:02,920 --> 00:51:05,279 Speaker 2: much fun. And a little fun fact is that the 1001 00:51:05,320 --> 00:51:08,040 Speaker 2: flourish of intro music that you hear at the beginning 1002 00:51:08,440 --> 00:51:12,080 Speaker 2: was composed by my partner, Casey, who I dedicated the 1003 00:51:12,080 --> 00:51:13,880 Speaker 2: whole book to, so it's very special. 1004 00:51:13,880 --> 00:51:14,560 Speaker 1: It's so fun. 1005 00:51:14,840 --> 00:51:17,319 Speaker 3: And then your podcast, Sounds Like a Cult?
Are you 1006 00:51:17,320 --> 00:51:19,960 Speaker 3: transitioning that to the new podcast? Just so people know 1007 00:51:20,040 --> 00:51:20,640 Speaker 3: what to look for. 1008 00:51:20,920 --> 00:51:23,920 Speaker 1: I'm doing both. You're like, I'm just gonna take something else on. 1009 00:51:24,160 --> 00:51:26,480 Speaker 3: I will say, Sounds Like a Cult is very, it's 1010 00:51:26,520 --> 00:51:29,640 Speaker 3: a fun podcast where sometimes you tackle these serious topics 1011 00:51:29,640 --> 00:51:32,400 Speaker 3: that are like kind of gut-wrenching and like heartbreaking, 1012 00:51:32,719 --> 00:51:34,560 Speaker 3: and then you'll do something that's a little bit more 1013 00:51:34,560 --> 00:51:36,960 Speaker 3: fun and light, and so it's a nice balance I 1014 00:51:37,040 --> 00:51:39,240 Speaker 3: think that we need today. 1015 00:51:39,480 --> 00:51:42,120 Speaker 2: That show is not for everyone because the tone is 1016 00:51:42,160 --> 00:51:45,640 Speaker 2: like so wacky, but yeah, I have so much fun 1017 00:51:45,680 --> 00:51:46,080 Speaker 2: doing it. 1018 00:51:46,200 --> 00:51:47,560 Speaker 4: So that's continuing on. 1019 00:51:47,719 --> 00:51:50,040 Speaker 2: We're on a midseason break right now, 1020 00:51:50,080 --> 00:51:51,480 Speaker 2: but it's generally every Tuesday. 1021 00:51:51,600 --> 00:51:53,959 Speaker 3: And then where can people find you on social media 1022 00:51:53,960 --> 00:51:54,880 Speaker 3: if they want to follow you? 1023 00:51:55,160 --> 00:51:59,280 Speaker 2: Oh, I am reluctantly on Instagram at Amanda Underscore Montell. 1024 00:51:59,560 --> 00:52:01,920 Speaker 2: And then I also have a Substack, amandamontell dot 1025 00:52:02,000 --> 00:52:02,759 Speaker 2: substack dot com. 1026 00:52:02,760 --> 00:52:05,000 Speaker 3: Okay, amazing. Well, thank you so much for this. This 1027 00:52:05,160 --> 00:52:07,400 Speaker 3: was so fun for me and I think people are gonna
1028 00:52:07,200 --> 00:52:09,880 Speaker 4: love it. Awesome. Thank you so much for having me