Speaker 1: I did have friends in the truther community that were all musicians, and we had supported each other and supported each other's art in this fringe element of society. I built relationships with some of these people for a good ten years, and to have some of them really turn their back on me, it did hurt. They don't want to talk to me, and I loved them like they were my brothers. But it's gone.
Speaker 2: I'm John Sipher, and I'm Jerry O'Shea.
Speaker 3: I served in the CIA's Clandestine Service for twenty-eight years, living undercover all around the world.
Speaker 4: And in my thirty-three years with the CIA, I served in Africa, Asia, Europe, and the Middle East.
Speaker 3: Although we don't usually look at it this way, we created conspiracies.
Speaker 4: In our operations, we got people to believe things that weren't true.
Speaker 3: Now we're investigating the conspiracy theories we see in the news almost every day.
Speaker 4: We'll break them down for you to determine whether they could be real or whether we're being manipulated.
Speaker 3: Welcome to Mission Implausible.
Speaker 5: We talk about conspiracy theorists as if it's just a fixed thing, but there's something that brings them in, and then every once in a while there's something that brings them out.
Speaker 4: How do you eventually climb out, and what's the psychological process, the neurological process, that goes along with that?
Speaker 3: Falling down a rabbit hole is easy. Getting out is hard. And so I give credit to people like our guest today who battle with this issue. Yeah, so today's guest is Brent Lee. Brent is a former conspiracy theory addict, I think you would say, who now works to help others who've gotten lost in conspiracy rabbit holes. He writes and speaks on the issues of conspiracy and hosts a regular podcast called Some Dare Call It Conspiracy. So welcome, Brent.
Speaker 1: Thank you, thanks for having me.
Speaker 3: So, Brent, first I would like to praise you for what you're doing. It cannot have been easy to examine your beliefs and change your life, and more importantly, your willingness to share and help others is really laudable. So I guess my first question is the simplest one.
Can you tell us a little bit about your journey and how you ended up here?
Speaker 1: Sure. In two thousand and three, we were on the verge of war with Iraq. The press and the government in the UK were spinning; the weapons of mass destruction eventually turned out to be a lie, and the UN voted against going to Iraq, and all this kind of stuff was going on. And I somehow came across 9/11 Truther documentaries, which basically were pointing out that 9/11 was an inside job by the government, and it was all to go to war in the Middle East and take over resources. And I'm telling you, because of the lies that I was seeing in the media, spread by our government and your government at the time, the Bush administration, I fell hook, line, and sinker for it, because the arguments were so put together. And with that I started going down a rabbit hole, trying to figure out, well, who are these people who did this? Oh, George Bush is a member of Skull and Bones. Oh, so was Senator Kerry at the time.
Okay, that's interesting, maybe I should look into these secret society links. And then I start coming across Freemasonry and all that sort of grand conspiracy myth, and it just made sense. It really just made sense. And from there I just descended further and further into the rabbit hole.
Speaker 4: Fascinating, by the way. And I had a sort of similar journey with the Iraq War as well, in that even though I was inside CIA and I knew some people who were involved in the analysis, I believed that it was true, and I'll have to say that a lot of them did too. Now, how much is a lie and a conspiracy to tell that lie, and how much of it is, quite frankly, just human error? Give us your thoughts on the difference between lying with malice aforethought to achieve something, to create a conspiracy, and then just, like, simple fuck-ups and randomness in the world.
Speaker 1: So disinformation is deliberately misinforming people, and anyone who picks that disinformation up and spreads it without knowing that it's a lie is spreading misinformation.
And I think that's definitely what happened with the weapons of mass destruction lie.
Speaker 3: Can you explain a little bit why you think the Truther view of the world is so powerful?
Speaker 1: I think because it does give us an answer. It's really good to bring it back to 9/11, because that was such a chaotic moment, such a traumatic moment for everyone. Pretty much everyone around the world watched three thousand people die within an hour of the two towers falling. We all needed to work out how this happened, and that's what these conspiracy pushers were pushing. Everyone knows Alex Jones, and at the time William Cooper was one of the big ones; he was on air at the time saying this was a conspiracy. But yeah, it gives us that order out of chaos, to use another conspiracy term. Like, how could America, with all of its power, with all of its intelligence, with everything that they have, how did they let this slip? How did they let nineteen hijackers take four planes in one day? That just shouldn't happen.
Speaker 3: What is it about you, do you think, that made you susceptible to going down this path? Why you? What is it about your background, or your personality, or your feelings, or what have you, that got you there?
Speaker 4: Yeah, you seem like a really thoughtful, reasonable guy. You generally do.
Speaker 1: I think creativity does play a role. Creativity and intelligence both play a role. They help you imagine a situation and then logic yourself into believing it. I think that plays a big part in, say, my personality. However, I think the thing that brought me into conspiracism in the way that it did, for 9/11, was the anti-war sentiment I had about going to Iraq. I think that definitely ties into my background, and it ties into being brought up in the military. My first stepfather was in Vietnam. He was a Marine, he was drafted, and he went to Vietnam, and I didn't really know very much about what happened until later on. But I can just say my childhood was not a happy childhood with him.
While he was in our life, he had PTSD from a terrible experience in Vietnam. He drank, he was abusive. He half beat my mother to death one night, and then they discharged him from the military. And that left quite a big scar on me. And none of that really bubbled to the surface until the First Gulf War happened in the nineties. At that time, my mother was remarried to a man in the US Air Force, and my friends' dads were starting to get called to go to Iraq, and I remember being terrified of my stepfather going there because of what had happened before. Because this guy was (sorry, not was, is) the loveliest, kindest person. He was such an amazing role model growing up that I was terrified that if he went to war in Iraq, he was either going to die or he was going to come back and be like my first stepfather. And I feel like that idea of war and what it does to people really shaped my sentiment, especially when it came to, again, the Iraq War, the second one. I was so dead against that.
I did not want our citizens going to war over there and coming back basically like my first stepfather, with all these troubles: PTSD, alcoholism.
Speaker 4: What were the benefits of being inside a conspiracy theorist mindset?
Speaker 1: I think at the beginning it was more to do with having an answer. Just, oh, I know what happened, I know what's going on. I didn't really feel empowered by it. I know that's a common thing that gets said, that people feel empowered by having this secret knowledge, but I never genuinely had that. It's just that I had a weird sort of comfort in knowing what was going on, even though it was absolutely terrible.
Speaker 4: Did you feel a missionary zeal to spread that word as well? Like, I'm going to bring this to everyone else?
Speaker 1: I considered myself an activist. I was a truth activist. I was a musician, I was a spoken word artist, I was a rapper, and I'd always been putting politics and my activism in music. So having that, I had this whole other topic of stuff to talk about.
Speaker 3: But can you help me understand or explain why the issue of, for example, pedophilia crops up so often in these conspiracies? It returns, it seems, again and again nowadays.
Speaker 1: Well, it's the most evil thing on this planet, basically, being a pedophile or a murderer, or sacrificing children and stuff like that. Yeah, it really paints people as the worst, so it drives your hate towards them.
Speaker 4: I'm a bit of a historian of the Roman Empire and the two Punic Wars. What was their charge against the Phoenicians, the Carthaginians? It was that they sacrificed their children. And this is like a two-thousand-five-hundred-year-old conspiracy theory, and modern archaeology has shown that when babies died early, which a lot of babies did in those days, twenty to thirty percent never made it to their first year, the babies were then cremated rather than buried. And the Romans then turned that into "they're baby killers, and this allows us to wage war and not just defeat them but exterminate Carthage." So yeah, exactly that thing, even two thousand years ago.
Speaker 1: Go even further back: in the Old Testament, the Hebrews are talking about the Canaanites serving their babies to Moloch. That's right, and that's probably propaganda as well, for the time.
Speaker 4: The Canaanites were Phoenicians. So yeah, it's the same civilization.
Speaker 1: It's incredibly emotive, and that's what you want to do. If you want to manipulate people, that's what you do.
Speaker 4: Brent, I'd like to talk to you more about the destructive impact of conspiracy theories on both society and on people and on families.
Speaker 1: I've got to be honest: when I was in it, I didn't understand how dangerous it was, because I didn't really see it as a tool being used to manipulate me, by people that wanted to manipulate me or have me on their side. I just thought it was people trying to expose the truth, expose what was really going on.
But I have now been out of it, especially since, say, twenty fifteen, watching political figures embrace conspiracism and push it, and watching people who I'd never thought would. They didn't listen to a word I had to say when I was a conspiracy theorist, and now they're listening to those figures and pushing the conspiracies that I used to believe in. That was a real wake-up call.
Speaker 3: How did you come out of it? Was it a slow process of coming out, or was it one or two key things that pushed you along there?
Speaker 1: Yeah, it was a very slow process which kind of gained momentum. It took six years. Sandy Hook happened in December twenty twelve, and throughout all of twenty thirteen the crisis actor and hoax narrative became very prevalent in the truth movement, and that went against everything that I was believing at the time. At the time, I was believing that all of these mass shootings, terrorist attacks, etc. were essentially ritual sacrifices, and so real people die in these things.
And when Alex Jones and many other people in the truth community were pushing that these events were hoaxes and that the people involved were crisis actors, so nobody dies, I had a real problem with that, because no, I really believed these people were being murdered by them: the Illuminati, the New World Order, the cult, whatever you want to call them. And just to clarify, for me it wasn't the Jews. I was never like a Jew World Order guy. I was always like, it's them, all of them. The Queen was involved, and Barack Obama was involved. But through twenty thirteen, this hoax and crisis actor narrative just became applied, cut and paste, every time something happened. We had the Pulse nightclub shooting, the Boston Marathon bombing, all leading up through twenty fourteen, fifteen, up into the Charlie Hebdo shooting and the Paris Bataclan and Nice attacks. Every single time something happened, my Facebook would just light up with the whole community talking about, oh, this is a hoax, it's not real.
Speaker 4: Look at this.
Speaker 1: This crisis actor, that crisis actor, and they'd show these pictures. And all throughout that time it was pulling me away from the community and making me isolate myself further into my belief, not vibing with what they were saying. And then between twenty fifteen and twenty eighteen, so much stuff happened on the world stage that just made me question, like, is the New World Order actually in charge here? Because there were three massive things, to me. First, it was the election of Trump. It wasn't just because it was Trump, but the idea was that Hillary Clinton had been primed for this job for years, so she was supposed to be the person who took over, because she was New World Order. But Trump won, and I could see him on Alex Jones, on InfoWars and all these other things, and I was like, what, how is this guy getting into the highest office in America? That doesn't make sense, especially when he's popular amongst the conspiracy theorists. So this is really strange.
So we had that, and then here in England we had the Brexit vote, and the Brexit vote pulled us away from the European Union, and that just did not make sense in the grand scheme of things, where the New World Order's goal is a one-world government, a police state around the entire world. Then we had a guy called Jeremy Corbyn take over the Labour Party, and Corbyn was like a left-wing populist, so him being able to take over the Labour Party, which is basically as if Bernie Sanders took over the Democrats, that wasn't going to be allowed, was it? And I had always thought every single vote, every single election, was a selection; it was nothing to do with us. Leaders were selected, not elected. However, those three votes, and watching it all happen on social media, watching people move, it just made almost everything fall apart for me at that time. So I had to log off, and that's what I did. I just logged off of Facebook.
And yeah, just over that twenty eighteen to twenty twenty-one period, even more things started to happen, and I was just like, I think I've been wrong the whole time, and I need to go figure out why.
Speaker 4: QAnon famously said "trust the plan." So you didn't trust the plan. There's your mistake.
Speaker 2: Let's take a break.
Speaker 6: We'll be right back.
Speaker 4: And Brent, it's funny, you cite two things where I think it's plausible, and I'm open to there actually being a conspiracy: one being that Russian disinformation attempted to, or did, have some impact on the Brexit vote, or at least they tried to. And it's clear that the US intelligence community did say that Russia did conspire and did attempt to impact the US election to get Trump elected. So it's funny that the two things that broke your sense of conspiracy theories may have had elements of a genuine conspiracy attached.
Speaker 1: Yeah, absolutely. This is what I've learned after: oh, so they were conspiracies, just not the conspiracy I thought. Because that's my problem; it was a big, grand conspiracy.
Speaker 4: What was the impact on your family and your relationships when you went down the rabbit hole? Were you ostracized, or did nobody want you around anymore? Did they still put up with you? Did they try to reason with you? And what was the impact? Did you lose friends, family?
Speaker 1: So for the first few years, while I was descending down that hole, my family were quite understanding, because I was always into very fringe topics, and my mum just said, it's a phase, just let him go through this, whatever he's going through. But I came to realize they didn't want to talk about it. So no one really pushed me away, but I did isolate myself, because they didn't want to talk about it, especially when it came to my friends. I was in a band, and I would stop going to the after-parties at the gigs. So I'd turn up to the gig, do our set, and then go home. I pulled myself away from my social circle.
Speaker 3: Nowadays, obviously to your credit, you're trying to help people.
And as part of what your podcast is about and what you speak about, what is the hardest thing about trying to do that? What kind of success do you have, and what are the biggest challenges? You try to talk about your experience and to help others so that they don't go down the same rabbit hole.
Speaker 1: The thing I face now is that I'm part of the conspiracy. This is now my favorite conspiracy theory, that I'm a part of it.
Speaker 2: Welcome to the club, man. Good audience.
Speaker 4: For people who are listening who are conspiracy-curious, who may be dabbling: what are ways that they could extricate themselves, or explore things in a way that doesn't bring them deeper in, like what happened to you initially?
Speaker 1: All I would say is, if you are inquisitive, if you are really questioning everything, do please question everything. Question all sides of the argument, because there are not two sides. That's a bit of a false dichotomy. There are many sides to these. Just because something disagrees with you doesn't mean it's government propaganda, or it's "them" or something. Just look at everything.
Honestly, I encourage people to actually go down and look at this stuff. But you have to look at everything, because sometimes a conspiracy can be true. But try not to get wrapped up in too many of them. We were taught that, to be fair, by the big conspiracists like Alex Jones, like David Icke, and they would always say, in those first hours of chaos the truth comes out, and then the official story comes out later. So we were looking for that, and this is what they do. They pick out all these little inconsistencies where people have just got things wrong, essentially, and jumped the gun. That's what we'd dig into, and we'd use that to paint the rest of it as lies.
Speaker 3: I find that fascinating, because in our world, and I know where you're going, when we were overseas and there'd be a crisis, our instinct and what we're taught is that the first things that come out are probably going to be wrong.
Speaker 4: Oh, absolutely, especially in a crisis. I was in Iraq when ISIS was taking over part of the country, and whenever something would happen, there'd be a battle.
It 347 00:18:23,080 --> 00:18:25,000 Speaker 4: would take us a day or two to figure out 348 00:18:25,000 --> 00:18:28,199 Speaker 4: statistically what was going on, because, especially when you're in a 349 00:18:28,240 --> 00:18:32,280 Speaker 4: crisis situation, people report things very quickly, and very often 350 00:18:32,320 --> 00:18:34,840 Speaker 4: they get them wrong. So, Brent, was there a Damascene 351 00:18:34,880 --> 00:18:38,480 Speaker 4: moment? Was there a specific moment in your background 352 00:18:38,480 --> 00:18:41,120 Speaker 4: where you really started to question, where it clicked? 353 00:18:41,480 --> 00:18:45,520 Speaker 1: So in twenty eighteen, I was scrolling through Facebook and 354 00:18:45,560 --> 00:18:50,240 Speaker 1: I kept seeing people posting about crisis actors or QAnon, 355 00:18:50,320 --> 00:18:54,720 Speaker 1: Israel, flat Earth, clones, all these different things. I 356 00:18:54,800 --> 00:18:56,800 Speaker 1: just didn't agree with it, and it was just 357 00:18:56,800 --> 00:19:00,679 Speaker 1: my community pushing it, and I remember thinking, I 358 00:19:00,680 --> 00:19:03,359 Speaker 1: don't want to hear this anymore. I don't want to 359 00:19:03,520 --> 00:19:06,159 Speaker 1: see this on my feed anymore. And that was the 360 00:19:06,280 --> 00:19:09,000 Speaker 1: day I logged out of it. It wasn't until later 361 00:19:09,040 --> 00:19:12,200 Speaker 1: that year, when I was writing lyrics. I was trying to 362 00:19:12,240 --> 00:19:16,280 Speaker 1: make a new album, and I was going through the 363 00:19:16,320 --> 00:19:20,280 Speaker 1: writing process, and it just clicked all of a sudden. 364 00:19:20,840 --> 00:19:23,560 Speaker 1: I don't think I believe this. I feel like I'm 365 00:19:23,600 --> 00:19:28,160 Speaker 1: just repeating tropes, falling back on ideas that I've had. 366 00:19:28,200 --> 00:19:31,360 Speaker 1: But I'm writing this and it doesn't actually feel genuine anymore.
367 00:19:31,600 --> 00:19:33,800 Speaker 1: And those were like the moments where I just thought, 368 00:19:34,240 --> 00:19:38,240 Speaker 1: I need to really reevaluate everything here, because what's 369 00:19:38,240 --> 00:19:39,840 Speaker 1: been going on for the last three years in the 370 00:19:39,880 --> 00:19:44,640 Speaker 1: political scene doesn't make sense. So I need to take 371 00:19:44,680 --> 00:19:46,840 Speaker 1: some time off and try and figure this out. And 372 00:19:46,880 --> 00:19:49,199 Speaker 1: the best thing about it is me and my partner. 373 00:19:49,240 --> 00:19:52,199 Speaker 1: We got together in twenty ten and we're still together today, 374 00:19:52,400 --> 00:19:55,760 Speaker 1: and we went through this process together, because we were 375 00:19:55,760 --> 00:19:58,239 Speaker 1: watching things, we were seeing it all happen, and we 376 00:19:58,240 --> 00:20:00,959 Speaker 1: were able to bounce these ideas off each other, like 377 00:20:01,000 --> 00:20:03,160 Speaker 1: when one of us had a doubt, we were able 378 00:20:03,200 --> 00:20:06,719 Speaker 1: to question it with the other person. And to be fair, 379 00:20:06,840 --> 00:20:11,800 Speaker 1: it wasn't super shattering until I realized that I was out, 380 00:20:12,160 --> 00:20:14,440 Speaker 1: and then it just fell apart, like leaving a religion 381 00:20:14,600 --> 00:20:17,080 Speaker 1: or leaving a cult or something like that. It's the 382 00:20:17,119 --> 00:20:18,720 Speaker 1: only thing I can compare it to. 383 00:20:19,400 --> 00:20:22,080 Speaker 3: As you came out of this and started to change, 384 00:20:22,480 --> 00:20:25,600 Speaker 3: did you maintain the same friend network? How did your 385 00:20:25,640 --> 00:20:28,840 Speaker 3: relationships change from before to after? 386 00:20:29,359 --> 00:20:32,920 Speaker 1: When I went down, I isolated myself.
When I came out, 387 00:20:32,920 --> 00:20:36,000 Speaker 1: I isolated myself from the new group of friends. And 388 00:20:36,040 --> 00:20:41,040 Speaker 1: when I came back to tell people that I was out, yeah, 389 00:20:41,040 --> 00:20:43,680 Speaker 1: I got a bit of backlash for that. Like, at 390 00:20:43,680 --> 00:20:47,200 Speaker 1: that point, I wasn't isolating myself. I was being kicked 391 00:20:47,200 --> 00:20:51,359 Speaker 1: out of the community. They just blocked me, unfriended me, 392 00:20:51,600 --> 00:20:57,000 Speaker 1: everyone just completely left, and again I just had 393 00:20:57,000 --> 00:21:00,199 Speaker 1: to rebuild. It does suck, though. To be fair, I 394 00:21:00,359 --> 00:21:03,840 Speaker 1: did have friends in the truther community that were 395 00:21:03,840 --> 00:21:08,000 Speaker 1: all musicians, and we had supported each other and supported 396 00:21:08,160 --> 00:21:12,280 Speaker 1: each other's art in this, like, fringe element of society, 397 00:21:12,480 --> 00:21:15,200 Speaker 1: and I built relationships with some of these people for 398 00:21:15,800 --> 00:21:18,480 Speaker 1: a good ten years. And to have some of them 399 00:21:18,600 --> 00:21:22,000 Speaker 1: really turn their backs on me, it did hurt. 400 00:21:22,040 --> 00:21:25,439 Speaker 1: There's about four people, and I'm really gutted that they 401 00:21:25,480 --> 00:21:26,960 Speaker 1: don't want to be my friend, they don't want to 402 00:21:27,000 --> 00:21:29,560 Speaker 1: talk to me, because I loved them like they were 403 00:21:29,640 --> 00:21:31,920 Speaker 1: my brothers. But it's gone. 404 00:21:32,520 --> 00:21:33,240 Speaker 4: There's a few. 405 00:21:33,040 --> 00:21:35,800 Speaker 1: Others, though, that kept me around, and they're like, hey, 406 00:21:36,119 --> 00:21:39,080 Speaker 1: whatever you believe, I don't understand how you've done it.
407 00:21:39,200 --> 00:21:42,040 Speaker 1: I can't understand how you don't see it still, but hey, 408 00:21:42,840 --> 00:21:44,520 Speaker 1: I know you're not a shill. I know you're not 409 00:21:44,760 --> 00:21:46,919 Speaker 1: bought off or anything, and we're all on our 410 00:21:46,920 --> 00:21:50,600 Speaker 1: own paths, and they still respect me for walking my path. 411 00:21:51,240 --> 00:21:54,480 Speaker 1: So yeah, sometimes it's hard, other times it's whatever. 412 00:21:55,240 --> 00:21:58,040 Speaker 4: And when you came out, were people ready to accept 413 00:21:58,040 --> 00:22:01,080 Speaker 4: you back again? If you were defriended, you essentially left; 414 00:22:01,080 --> 00:22:02,400 Speaker 4: what happened when you came back six years later? 415 00:22:02,960 --> 00:22:07,240 Speaker 1: Now, when I came back as not a conspiracist anymore, 416 00:22:07,520 --> 00:22:10,560 Speaker 1: in the previous few years those same people have sent 417 00:22:10,640 --> 00:22:14,600 Speaker 1: me messages and been like, wow, I'm so surprised that 418 00:22:14,800 --> 00:22:18,560 Speaker 1: you turned yourself around after all of this. They were 419 00:22:18,600 --> 00:22:22,000 Speaker 1: my really good friends, and they completely welcomed me with 420 00:22:22,040 --> 00:22:25,040 Speaker 1: open arms. They're glad that I'm back, and glad that 421 00:22:25,080 --> 00:22:28,679 Speaker 1: I'm thinking critically now, not so wrapped up in 422 00:22:28,720 --> 00:22:31,120 Speaker 1: all this stuff that I was wrapped up in. And 423 00:22:31,240 --> 00:22:33,800 Speaker 1: they're proud. They're really proud of what I'm doing today.
424 00:22:33,960 --> 00:22:37,560 Speaker 1: They're proud that I'm able to talk about this thing 425 00:22:37,640 --> 00:22:40,760 Speaker 1: that basically happened, and they're proud that I'm trying to 426 00:22:40,800 --> 00:22:44,480 Speaker 1: help bring more people out, and also letting everyone understand 427 00:22:44,520 --> 00:22:47,480 Speaker 1: what it's like to be a conspiracist, what it's like 428 00:22:47,720 --> 00:22:50,840 Speaker 1: being isolated in all of this. That's what they're very 429 00:22:50,880 --> 00:22:54,680 Speaker 1: proud of, that I've been able to explain that. All right, well, Brent, 430 00:22:54,840 --> 00:22:57,040 Speaker 2: it's been fantastic to get to know you. 431 00:22:56,600 --> 00:22:59,000 Speaker 3: We're very excited about your podcast and what 432 00:22:59,040 --> 00:23:00,199 Speaker 3: you're trying to do, and we give you a hell 433 00:23:00,240 --> 00:23:02,200 Speaker 3: of a lot of credit for doing that, and thank 434 00:23:02,240 --> 00:23:03,800 Speaker 3: you so much for spending some time with us. 435 00:23:04,320 --> 00:23:07,399 Speaker 1: Thank you guys both for inviting me, thanks for speaking 436 00:23:07,400 --> 00:23:09,280 Speaker 1: to me. It's been an absolute pleasure. 437 00:23:10,520 --> 00:23:13,320 Speaker 4: That was great. We have a second guest. We're really 438 00:23:13,359 --> 00:23:16,760 Speaker 4: pleased today to have Sander van der Linden. He's a professor of 439 00:23:17,000 --> 00:23:21,240 Speaker 4: social psychology at Cambridge University and a research affiliate at Yale. 440 00:23:21,680 --> 00:23:26,240 Speaker 4: He's a leading academic expert in the field of social influence, risk, 441 00:23:26,600 --> 00:23:29,639 Speaker 4: human judgment, and decision making. He's also the author of 442 00:23:29,680 --> 00:23:34,720 Speaker 4: a really great book, Foolproof: Why Misinformation Infects Our Minds 443 00:23:34,760 --> 00:23:38,280 Speaker 4: and How to Build Immunity.
The Financial Times voted this 444 00:23:38,400 --> 00:23:42,560 Speaker 4: the best nonfiction book of twenty twenty three. So, Sander, 445 00:23:42,680 --> 00:23:45,040 Speaker 4: welcome today. It's really good having you on. Thanks so 446 00:23:45,119 --> 00:23:48,080 Speaker 4: much for having me on. So, Sander, let's begin with 447 00:23:48,440 --> 00:23:52,120 Speaker 4: Kool-Aid. Okay. So the expression drinking the Kool-Aid 448 00:23:52,280 --> 00:23:55,600 Speaker 4: comes from when a cult, Jim Jones's Peoples Temple, 449 00:23:55,680 --> 00:23:57,800 Speaker 4: acting on a conspiracy theory that they were about to 450 00:23:57,840 --> 00:24:02,680 Speaker 4: be attacked by the US military, drank cyanide-laced grape Kool-Aid. 451 00:24:02,880 --> 00:24:05,400 Speaker 4: What can you tell us about the overlap between cults 452 00:24:05,520 --> 00:24:08,800 Speaker 4: and conspiracy theories, and why is it that people choose 453 00:24:08,880 --> 00:24:10,040 Speaker 4: to drink the Kool-Aid? 454 00:24:10,280 --> 00:24:12,719 Speaker 6: Yeah, that's a great question. I think to some extent 455 00:24:12,800 --> 00:24:17,440 Speaker 6: we like to differentiate belief in everyday conspiracy theories from 456 00:24:17,720 --> 00:24:20,679 Speaker 6: cult psychology. I think cults take it to the 457 00:24:20,760 --> 00:24:23,760 Speaker 6: next level, in the sense that they're often what we refer 458 00:24:23,840 --> 00:24:26,280 Speaker 6: to as high control groups. Let's say you believe in 459 00:24:26,320 --> 00:24:28,280 Speaker 6: one conspiracy theory and that's the end of it, and 460 00:24:28,320 --> 00:24:30,560 Speaker 6: you live your life and you're free to do whatever.
461 00:24:30,760 --> 00:24:33,000 Speaker 6: But when you're in a cult, you're really under a 462 00:24:33,040 --> 00:24:35,600 Speaker 6: high amount of social pressure to conform to the norms 463 00:24:35,600 --> 00:24:37,600 Speaker 6: of the cult, and not just buy into their theory 464 00:24:37,640 --> 00:24:40,879 Speaker 6: but also perform in a certain way, follow the actions of 465 00:24:40,920 --> 00:24:44,439 Speaker 6: the leaders, and you get punished, mentally and sometimes physically, for 466 00:24:44,560 --> 00:24:46,680 Speaker 6: not following those rules. And so one of the things 467 00:24:46,680 --> 00:24:50,840 Speaker 6: we monitor often is the transition from when somebody believes 468 00:24:50,880 --> 00:24:54,359 Speaker 6: in one conspiracy, to believing in multiple conspiracies, to 469 00:24:54,400 --> 00:24:58,040 Speaker 6: potentially joining a cult. QAnon has, as you guys know, 470 00:24:58,160 --> 00:25:01,919 Speaker 6: evolved from a single conspiracy to almost cult status. That 471 00:25:02,000 --> 00:25:04,320 Speaker 6: is, I think, a good example where a conspiracy actually 472 00:25:04,440 --> 00:25:08,000 Speaker 6: transforms into a cult. It has a following, and members 473 00:25:08,160 --> 00:25:12,919 Speaker 6: become more extreme within that following, really endorse the rules 474 00:25:12,920 --> 00:25:16,160 Speaker 6: as if it's a godlike written script, and it's very 475 00:25:16,160 --> 00:25:19,000 Speaker 6: difficult to pull people away from that. You can argue 476 00:25:19,000 --> 00:25:21,639 Speaker 6: with people about a conspiracy theory, but getting people out of 477 00:25:21,680 --> 00:25:24,680 Speaker 6: a cult is a whole different story, because cults 478 00:25:24,760 --> 00:25:27,240 Speaker 6: have a playbook, and we can get into that, but 479 00:25:27,480 --> 00:25:31,119 Speaker 6: conspiracy theories often use parts of that playbook but are 480 00:25:31,160 --> 00:25:31,840 Speaker 6: not as extreme.
481 00:25:32,200 --> 00:25:34,240 Speaker 3: So Sander, explain why. Why is it so hard to 482 00:25:34,320 --> 00:25:37,439 Speaker 3: debunk conspiracies and pull people out of the rabbit hole? 483 00:25:37,480 --> 00:25:40,280 Speaker 3: It seems like it's easier to sow doubt than to resolve it. 484 00:25:40,680 --> 00:25:40,880 Speaker 4: Yeah. 485 00:25:40,880 --> 00:25:43,600 Speaker 6: Absolutely. So I have this bad dad joke that kind 486 00:25:43,600 --> 00:25:46,120 Speaker 6: of explains the psychology of it. Are you ready? Is this 487 00:25:46,160 --> 00:25:49,000 Speaker 6: a perfect place for it? Yeah, okay, it's the perfect place. 488 00:25:49,040 --> 00:25:49,440 Speaker 4: Okay. 489 00:25:49,880 --> 00:25:53,359 Speaker 6: So there's this guy, right, a conspiracy theorist, and unfortunately he 490 00:25:53,440 --> 00:25:56,600 Speaker 6: dies and goes to heaven, and then God, him or 491 00:25:56,640 --> 00:26:00,879 Speaker 6: herself, is waiting at the pearly gates, and this conspiracy 492 00:26:00,920 --> 00:26:03,520 Speaker 6: theorist says, look, God, I gotta tell you, this whole 493 00:26:03,560 --> 00:26:07,240 Speaker 6: earth thing. You have to admit, right, the Earth is flat. 494 00:26:07,320 --> 00:26:09,760 Speaker 6: You know, you can't see the horizon. I have all 495 00:26:09,760 --> 00:26:12,880 Speaker 6: these facts, right? The Earth is flat. I mean, 496 00:26:12,920 --> 00:26:15,639 Speaker 6: you supposedly created it, so you tell me. And so 497 00:26:16,040 --> 00:26:19,119 Speaker 6: God kind of frowns and looks the conspiracist in the 498 00:26:19,160 --> 00:26:21,120 Speaker 6: eye and says, I hate to break it to you, 499 00:26:21,280 --> 00:26:25,479 Speaker 6: but I can tell you for sure the Earth is round. 500 00:26:25,840 --> 00:26:28,280 Speaker 6: It is not flat like a pancake. It is round.
501 00:26:28,440 --> 00:26:31,120 Speaker 6: And so the conspiracy theorist goes, damn, this thing goes 502 00:26:31,160 --> 00:26:34,040 Speaker 6: higher up than I thought. And that's the psychology of 503 00:26:34,080 --> 00:26:37,600 Speaker 6: trying to debunk conspiracy theorists. On a more serious note, 504 00:26:37,600 --> 00:26:41,680 Speaker 6: there's this thing we call the conspiratorial worldview, which is 505 00:26:41,760 --> 00:26:45,080 Speaker 6: what we call a monological belief system, which is a fancy 506 00:26:45,080 --> 00:26:48,480 Speaker 6: way of saying that belief in one conspiracy serves as 507 00:26:48,520 --> 00:26:51,560 Speaker 6: evidence for the existence of other conspiracy theories. And so 508 00:26:51,600 --> 00:26:55,320 Speaker 6: when you try to debunk conspiracy theories, debunking one is 509 00:26:55,359 --> 00:26:59,160 Speaker 6: often unsuccessful, because they'll just jump to a higher order 510 00:26:59,359 --> 00:27:02,200 Speaker 6: conspiracy that sort of explains it all. And so typically 511 00:27:02,200 --> 00:27:03,879 Speaker 6: what you get is that if you try to create 512 00:27:03,880 --> 00:27:06,080 Speaker 6: this, what we call local incoherence, so you point out, 513 00:27:06,119 --> 00:27:09,840 Speaker 6: oh, maybe Obama's birth certificate wasn't fake after all, then 514 00:27:09,880 --> 00:27:12,680 Speaker 6: they come up with a higher order conspiracy, if you 515 00:27:12,720 --> 00:27:14,960 Speaker 6: try to debunk, that sort of explains the inconsistency. And 516 00:27:15,000 --> 00:27:17,560 Speaker 6: there's been interesting studies on that. For example, Osama bin Laden. 517 00:27:17,640 --> 00:27:19,760 Speaker 6: So one of the studies asked people, oh, do you 518 00:27:19,760 --> 00:27:22,359 Speaker 6: think he was already dead when the troops arrived?
And 519 00:27:22,359 --> 00:27:24,960 Speaker 6: then people say yes. And then, how about this other 520 00:27:25,000 --> 00:27:27,919 Speaker 6: theory that suggests that he's still alive somewhere? And people 521 00:27:27,920 --> 00:27:30,800 Speaker 6: say yes. And so the idea is that, you know, 522 00:27:30,960 --> 00:27:34,159 Speaker 6: somehow, when you point out this contradiction, it's not 523 00:27:34,240 --> 00:27:37,800 Speaker 6: strange for people, because local contradictions are allowed, because there's 524 00:27:37,840 --> 00:27:41,560 Speaker 6: this higher order conspiracy that connects both accounts, that 525 00:27:41,680 --> 00:27:45,280 Speaker 6: the government's behind things; that's 526 00:27:45,320 --> 00:27:48,720 Speaker 6: the overarching conspiracy for this particular example, and that can 527 00:27:48,760 --> 00:27:51,919 Speaker 6: explain away any inconsistencies. Yeah. So coming back to the 528 00:27:52,000 --> 00:27:56,120 Speaker 6: cult ideology, it's a system of beliefs that's shaped very 529 00:27:56,200 --> 00:27:59,280 Speaker 6: much like what you experience in cults, in that people 530 00:27:59,320 --> 00:28:01,200 Speaker 6: just jump to the next level and the next level, 531 00:28:01,480 --> 00:28:04,080 Speaker 6: and that's why I often don't try to debunk conspiracies. 532 00:28:04,080 --> 00:28:06,399 Speaker 6: A good example is Neil deGrasse Tyson trying 533 00:28:06,440 --> 00:28:10,880 Speaker 6: to debunk the rapper B.o.B's belief in a flat Earth conspiracy. 534 00:28:11,000 --> 00:28:13,879 Speaker 6: So basically he repeatedly tries to point out to 535 00:28:13,960 --> 00:28:17,679 Speaker 6: the celebrity, using physics, why the Earth 536 00:28:17,800 --> 00:28:20,960 Speaker 6: really isn't flat, and it just falls on deaf ears, 537 00:28:20,960 --> 00:28:23,679 Speaker 6: and B.o.B still joined the Flat Earth Society.
Neil deGrasse 538 00:28:23,680 --> 00:28:26,040 Speaker 6: Tyson even made a track called Flat to Fact to 539 00:28:26,040 --> 00:28:28,280 Speaker 6: try to convince him, but unsuccessfully. 540 00:28:28,440 --> 00:28:30,359 Speaker 7: You say the Earth is flat, and you try to 541 00:28:30,359 --> 00:28:33,399 Speaker 7: disrespect him. I'm bringing facts to combat a silly theory, 542 00:28:33,440 --> 00:28:36,119 Speaker 7: because Bob has got to know the planet is a sphere. 543 00:28:36,240 --> 00:28:39,280 Speaker 2: Let's take a quick break. We'll be right back. 544 00:28:39,560 --> 00:28:41,720 Speaker 7: You say that Neil's vest is what he needs to loosen 545 00:28:41,800 --> 00:28:48,080 Speaker 7: up. The brothers getting played when the ignorance you're spinning 546 00:28:48,120 --> 00:28:49,920 Speaker 7: helps to keep people enslaved. 547 00:28:49,520 --> 00:28:52,320 Speaker 4: And we're back. Once people 548 00:28:52,400 --> 00:28:55,160 Speaker 4: are in and they're down the rabbit hole, it's really 549 00:28:55,240 --> 00:28:58,160 Speaker 4: hard to deprogram them and pull them out. But you're doing 550 00:28:58,200 --> 00:29:00,400 Speaker 4: a lot of research, I understand, on what to do 551 00:29:00,440 --> 00:29:02,840 Speaker 4: to prevent people from falling in in the first place. Right, 552 00:29:02,880 --> 00:29:05,080 Speaker 4: some really groundbreaking stuff. Could you maybe take us through, 553 00:29:05,200 --> 00:29:07,200 Speaker 4: real briefly, what inoculation theory is? 554 00:29:07,520 --> 00:29:09,400 Speaker 6: We started using the word prebunking because it's 555 00:29:09,400 --> 00:29:12,080 Speaker 6: easier for people to understand, and as you said, it's 556 00:29:12,160 --> 00:29:14,840 Speaker 6: very difficult to try to deradicalize people. It 557 00:29:14,880 --> 00:29:16,920 Speaker 6: takes a long time, it's very hard to do. Even 558 00:29:16,960 --> 00:29:19,800 Speaker 6: debunking can be difficult to do.
I've switched my attention 559 00:29:19,840 --> 00:29:22,600 Speaker 6: to the idea of prevention, and it follows the medical analogy. 560 00:29:22,640 --> 00:29:25,480 Speaker 6: So the idea is that your body produces antibodies to 561 00:29:25,480 --> 00:29:27,880 Speaker 6: help confer resistance against future infection. But it turns out 562 00:29:27,880 --> 00:29:29,640 Speaker 6: you can do the same with the mind. And in fact, 563 00:29:29,720 --> 00:29:32,360 Speaker 6: this whole idea emerged because the US government at that 564 00:29:32,480 --> 00:29:35,000 Speaker 6: time was thinking, hey, we need to double down on 565 00:29:35,480 --> 00:29:39,200 Speaker 6: instilling American values in American soldiers, because they don't seem 566 00:29:39,200 --> 00:29:43,000 Speaker 6: prepared to answer questions like why is capitalism good? Or why 567 00:29:43,000 --> 00:29:45,360 Speaker 6: are you fighting this war? And so on. And they 568 00:29:45,360 --> 00:29:47,680 Speaker 6: didn't have what we call mental defenses against the types 569 00:29:47,720 --> 00:29:50,240 Speaker 6: of counterarguments that were being raised by foreign troops. 570 00:29:50,400 --> 00:29:52,480 Speaker 6: But the government's response was, we need to give people 571 00:29:52,480 --> 00:29:55,280 Speaker 6: more facts about why the US is great, whereas the 572 00:29:55,320 --> 00:29:57,840 Speaker 6: psychologists at the time said, maybe we shouldn't give people 573 00:29:57,920 --> 00:30:01,600 Speaker 6: more facts. We should expose them, in simulation form, to 574 00:30:01,680 --> 00:30:04,560 Speaker 6: what it's like to be attacked on your belief system, 575 00:30:04,920 --> 00:30:07,840 Speaker 6: and then help them resist and refute these attempts so 576 00:30:07,880 --> 00:30:10,520 Speaker 6: that they become more immune when it actually happens. And 577 00:30:10,520 --> 00:30:13,520 Speaker 6: that's really the core idea of inoculation.
Of course, you 578 00:30:13,520 --> 00:30:15,960 Speaker 6: always get into the discussion: if it's your country, we 579 00:30:16,040 --> 00:30:19,080 Speaker 6: call it education; if it's another country, it's called propaganda. 580 00:30:19,120 --> 00:30:21,600 Speaker 6: Irrespective of the content of what we're talking about, the 581 00:30:21,640 --> 00:30:24,400 Speaker 6: technique is just a neutral device, and we're using it 582 00:30:24,440 --> 00:30:26,120 Speaker 6: to help people identify misinformation. 583 00:30:26,360 --> 00:30:26,960 Speaker 2: I think it's true. 584 00:30:27,000 --> 00:30:29,360 Speaker 3: You will see the fact checking after the fact, and 585 00:30:29,400 --> 00:30:30,840 Speaker 3: debunking doesn't seem to be enough. 586 00:30:31,040 --> 00:30:34,720 Speaker 6: Part of the spread of misinformation online is leading people 587 00:30:35,000 --> 00:30:39,680 Speaker 6: to quote unquote alternative facts, alternative knowledge, and alternative claims, 588 00:30:39,800 --> 00:30:43,480 Speaker 6: which creates this sense of distrust around official narratives and 589 00:30:43,560 --> 00:30:46,280 Speaker 6: around experts. And I think that has to do with 590 00:30:46,320 --> 00:30:49,440 Speaker 6: a more fundamental misunderstanding of how science works. And we 591 00:30:49,480 --> 00:30:52,120 Speaker 6: know from experiments that when you tell people something 592 00:30:52,560 --> 00:30:55,960 Speaker 6: and then the science changes, and then you tell people 593 00:30:56,000 --> 00:30:59,160 Speaker 6: that it's now different, the conclusion is often that 594 00:30:59,160 --> 00:31:02,080 Speaker 6: scientists are incompetent, and of course the consensus amongst 595 00:31:02,120 --> 00:31:04,800 Speaker 6: government agencies is different from what scientists are saying, and 596 00:31:04,840 --> 00:31:05,800 Speaker 6: that's confusing to people.
597 00:31:06,160 --> 00:31:07,920 Speaker 4: Well, the best way to fix science is more and 598 00:31:07,960 --> 00:31:10,120 Speaker 4: better science, right? Although I could say the best way 599 00:31:10,160 --> 00:31:12,000 Speaker 4: to fix a conspiracy theory is just more and better 600 00:31:12,000 --> 00:31:12,880 Speaker 4: conspiracy theories. 601 00:31:13,000 --> 00:31:14,520 Speaker 6: I love that idea. 602 00:31:14,680 --> 00:31:17,120 Speaker 4: There is actually an interesting historical parallel to this. So 603 00:31:17,360 --> 00:31:21,120 Speaker 4: the Thirty Years' War in Europe, arguably one of the 604 00:31:21,160 --> 00:31:24,880 Speaker 4: reasons that it happened was because of the printing press. Suddenly, 605 00:31:24,960 --> 00:31:29,040 Speaker 4: for the first time, there wasn't one entity that had 606 00:31:29,200 --> 00:31:33,080 Speaker 4: a monopoly on truth. People were printing whatever they wanted 607 00:31:33,200 --> 00:31:36,360 Speaker 4: about religion, and all of a sudden you had different 608 00:31:36,440 --> 00:31:40,360 Speaker 4: theories about all sorts of aspects of religion and Christianity, 609 00:31:40,640 --> 00:31:42,640 Speaker 4: and people went to war for thirty years and tore 610 00:31:42,760 --> 00:31:45,840 Speaker 4: Europe apart trying to figure out, what is truth? Who 611 00:31:45,880 --> 00:31:49,840 Speaker 4: decides what are facts? And people lived in those days 612 00:31:49,880 --> 00:31:53,760 Speaker 4: in their own micro information bubbles, shaped by what they read, 613 00:31:54,240 --> 00:31:56,760 Speaker 4: and so I think this is a problem that we're 614 00:31:56,760 --> 00:31:59,200 Speaker 4: going to have for a long time. I did want 615 00:31:59,240 --> 00:32:01,959 Speaker 4: to ask you about the Pineapple Pizza game, which you're 616 00:32:02,000 --> 00:32:02,800 Speaker 4: famous for, right?
617 00:32:03,400 --> 00:32:07,800 Speaker 6: We did this game with Homeland Security, specifically CISA, the Cybersecurity 618 00:32:07,920 --> 00:32:12,160 Speaker 6: and Infrastructure Security Agency, and Chris Krebs, who at the time 619 00:32:12,240 --> 00:32:15,680 Speaker 6: was heading CISA, came up with this pineapple pizza idea, 620 00:32:16,080 --> 00:32:21,000 Speaker 6: and the goal was, can we inoculate Americans against foreign 621 00:32:21,120 --> 00:32:24,400 Speaker 6: influence techniques that are used by bad actors during elections? 622 00:32:24,760 --> 00:32:26,720 Speaker 6: And of course, the first part of this process is 623 00:32:26,720 --> 00:32:29,400 Speaker 6: always breaking down, if we follow the public health analogy, 624 00:32:29,400 --> 00:32:31,400 Speaker 6: the structure of the virus, in a way. So 625 00:32:31,440 --> 00:32:33,520 Speaker 6: what are the key building blocks? What's the playbook 626 00:32:33,560 --> 00:32:35,720 Speaker 6: that they use? Can we then break that down for 627 00:32:35,800 --> 00:32:39,000 Speaker 6: people in a way that's fun and digestible, so that 628 00:32:39,160 --> 00:32:41,760 Speaker 6: we can then inoculate and create this resistance in a 629 00:32:41,760 --> 00:32:45,160 Speaker 6: simulated environment? And that's really what Harmony Square was. And 630 00:32:45,200 --> 00:32:48,120 Speaker 6: so Harmony Square is a ridiculous game, but it's a 631 00:32:48,120 --> 00:32:50,520 Speaker 6: fake news game. And so you start out in your 632 00:32:50,600 --> 00:32:54,160 Speaker 6: role as chief disinformation officer, and your goal is to 633 00:32:54,160 --> 00:32:57,840 Speaker 6: try to create chaos in this fictional town called Harmony Square.
634 00:32:58,080 --> 00:33:02,120 Speaker 6: And they have a town swan and also an annual Pineapple 635 00:33:02,160 --> 00:33:04,920 Speaker 6: Pizza festival, and it kind of starts in this way, 636 00:33:04,960 --> 00:33:07,160 Speaker 6: that one of your options is to try to steal 637 00:33:07,440 --> 00:33:10,800 Speaker 6: the swan, and then the game narrator says, are you crazy, 638 00:33:10,920 --> 00:33:12,640 Speaker 6: that's illegal. You know, you have to come up with 639 00:33:12,760 --> 00:33:16,280 Speaker 6: something that isn't going to get you arrested. Disinformation actors, 640 00:33:16,280 --> 00:33:19,240 Speaker 6: they don't go and steal statues. And so you learn 641 00:33:19,280 --> 00:33:21,880 Speaker 6: that there's an election going on for the local bear patroller, 642 00:33:22,000 --> 00:33:25,040 Speaker 6: even though there's no bears in Harmony Square. But they 643 00:33:25,080 --> 00:33:27,800 Speaker 6: just love elections. They love the idea of democracy, and 644 00:33:27,840 --> 00:33:30,560 Speaker 6: they just love having elections. And your goal is to disrupt it. 645 00:33:30,800 --> 00:33:32,880 Speaker 6: And where the pineapple pizza metaphor comes in is that the 646 00:33:32,920 --> 00:33:35,320 Speaker 6: techniques that are used, and that we break down for people, 647 00:33:35,320 --> 00:33:38,200 Speaker 6: are the following. So you take an innocuous issue: should 648 00:33:38,200 --> 00:33:46,680 Speaker 6: you put pineapple on pizza? That's what malign actors do, that's the first step, right? 649 00:33:46,720 --> 00:33:50,240 Speaker 6: And then you stir up controversy and say, the Italians, 650 00:33:50,240 --> 00:33:52,720 Speaker 6: they're going to be offended by pineapple on pizza. Right, 651 00:33:52,720 --> 00:33:55,680 Speaker 6: that's just culturally inappropriate.
And then other 652 00:33:55,720 --> 00:33:58,000 Speaker 6: people say, nobody's going to tell me what topping to 653 00:33:58,040 --> 00:34:01,200 Speaker 6: put on my pizza. Right, you can't restrain pizza topping freedom, 654 00:34:01,320 --> 00:34:04,800 Speaker 6: pizza amendment rights. And then the next step, after you 655 00:34:04,880 --> 00:34:08,200 Speaker 6: stir up debates on both sides, is false amplification. 656 00:34:08,400 --> 00:34:10,319 Speaker 6: So now you're going to have to buy bots, there's 657 00:34:10,360 --> 00:34:14,040 Speaker 6: a whole market for this, bots through fake SIMs, phones, 658 00:34:14,080 --> 00:34:16,319 Speaker 6: and lots of accounts, and then you blow it up 659 00:34:16,560 --> 00:34:19,160 Speaker 6: on both sides. So it's meant to be a bit sarcastic, right? 660 00:34:19,160 --> 00:34:20,759 Speaker 6: But you go through these techniques, and then we test 661 00:34:20,760 --> 00:34:23,440 Speaker 6: people with real social media content that uses these 662 00:34:23,480 --> 00:34:26,680 Speaker 6: techniques, from past events as well as stuff people might 663 00:34:26,680 --> 00:34:28,640 Speaker 6: come across, and yeah, we find people are now better 664 00:34:28,680 --> 00:34:31,920 Speaker 6: able to recognize and resist these techniques. You know, one 665 00:34:31,960 --> 00:34:34,440 Speaker 6: of the more general theories is that conspiracy theories are 666 00:34:34,440 --> 00:34:36,719 Speaker 6: for losers. And what we mean by that is, 667 00:34:37,120 --> 00:34:39,600 Speaker 6: when you're out of power, the party or the group 668 00:34:39,640 --> 00:34:41,680 Speaker 6: that's out of power is more likely to think 669 00:34:41,719 --> 00:34:44,440 Speaker 6: that the group who is currently in power is conspiring 670 00:34:44,480 --> 00:34:47,400 Speaker 6: against them, and that switches depending on who's in power.
671 00:34:47,400 --> 00:34:49,960 Speaker 6: And that's why we like to say conspiracies are for losers, 672 00:34:49,960 --> 00:34:52,319 Speaker 6: because the people in power are never coming up with 673 00:34:52,400 --> 00:34:55,439 Speaker 6: the conspiracy that those out of power are plotting against them. 674 00:34:55,560 --> 00:34:57,399 Speaker 6: And the funny thing is that we created this new 675 00:34:57,400 --> 00:35:00,680 Speaker 6: game called Cat Park, and it's about educating people about conspiracy 676 00:35:00,680 --> 00:35:02,480 Speaker 6: theories, and the idea is that you arrive in your 677 00:35:02,560 --> 00:35:06,040 Speaker 6: new apartment, eating your noodles, and it's messy, and so 678 00:35:06,080 --> 00:35:08,080 Speaker 6: the good news is that you have a new apartment in a 679 00:35:08,160 --> 00:35:09,600 Speaker 6: new city, and the bad news is you don't have 680 00:35:09,640 --> 00:35:12,760 Speaker 6: any friends. And so you get this ping from a 681 00:35:12,760 --> 00:35:15,840 Speaker 6: shady actor down at the local bar who has a 682 00:35:15,880 --> 00:35:18,520 Speaker 6: story to tell you. You want friends, and you feel marginalized. 683 00:35:18,560 --> 00:35:20,480 Speaker 6: So you go to the bar and they say, look, 684 00:35:20,560 --> 00:35:24,800 Speaker 6: all these elitists are trying to erect this new cat park. However, 685 00:35:25,080 --> 00:35:27,520 Speaker 6: there are no cats in the city, and so it 686 00:35:27,560 --> 00:35:29,799 Speaker 6: turns into a big conspiracy, and it's called Cat Park. 687 00:35:29,920 --> 00:35:32,520 Speaker 6: It's supposed to be fun. But then this whole 688 00:35:32,560 --> 00:35:34,960 Speaker 6: conspiracy website now says that it's going to try to 689 00:35:35,000 --> 00:35:37,840 Speaker 6: convince you that this is a game about cats.
However, 690 00:35:38,320 --> 00:35:41,160 Speaker 6: really it's an attempt to brainwash you, and so on, 691 00:35:41,280 --> 00:35:43,640 Speaker 6: and the cats are symbols, and 692 00:35:43,760 --> 00:35:46,560 Speaker 6: it's a classic symbol that was used in such and 693 00:35:46,560 --> 00:35:48,799 Speaker 6: such history. And there are all these hidden messages in the 694 00:35:48,840 --> 00:35:51,520 Speaker 6: game. They haven't linked it yet to satanic pedophiles, but 695 00:35:51,520 --> 00:35:53,319 Speaker 6: I'm sure, give it time. 696 00:35:55,280 --> 00:35:57,240 Speaker 4: And I think it's important to say that really smart 697 00:35:57,280 --> 00:35:59,319 Speaker 4: people can fall into this. So I would just like 698 00:35:59,680 --> 00:36:03,279 Speaker 4: to throw in that Sir Isaac Newton, arguably one of 699 00:36:03,320 --> 00:36:07,600 Speaker 4: the most brilliant people ever on this planet, invented physics, 700 00:36:07,640 --> 00:36:11,880 Speaker 4: invented optics. The other thing is, he spent years reading 701 00:36:11,920 --> 00:36:15,719 Speaker 4: the Bible to find hidden messages in it, and at 702 00:36:15,719 --> 00:36:18,359 Speaker 4: the end he determined that the world was going to 703 00:36:18,480 --> 00:36:21,480 Speaker 4: end in the year two thousand and sixty. Right, he 704 00:36:21,560 --> 00:36:23,560 Speaker 4: came up with this figure. So even one of the 705 00:36:23,560 --> 00:36:26,040 Speaker 4: most brilliant people in the world fell for it. And it might be true. We 706 00:36:26,080 --> 00:36:28,880 Speaker 4: still got a few years. But yeah, it's not people 707 00:36:28,880 --> 00:36:29,640 Speaker 4: who are dumb. 708 00:36:29,840 --> 00:36:33,080 Speaker 3: So, Sander, is there anything you're learning about the best 709 00:36:33,360 --> 00:36:36,880 Speaker 3: practices of trying to get people to slowly pull themselves 710 00:36:36,880 --> 00:36:37,759 Speaker 3: out of the rabbit hole?
711 00:36:37,960 --> 00:36:38,160 Speaker 2: Yeah? 712 00:36:38,160 --> 00:36:40,080 Speaker 6: I think there are a few lessons. And in the book, 713 00:36:40,320 --> 00:36:44,040 Speaker 6: I interviewed people who were former conspiracy theorists and extremists 714 00:36:44,040 --> 00:36:46,000 Speaker 6: and people who were part of cults and so on, and 715 00:36:46,160 --> 00:36:49,200 Speaker 6: often the story that I would get is that there 716 00:36:49,200 --> 00:36:52,680 Speaker 6: was some inflection point, you know, there was some point at 717 00:36:52,719 --> 00:36:56,080 Speaker 6: which there was a counter narrative that was convincing enough 718 00:36:56,120 --> 00:36:59,080 Speaker 6: for them to take a small step outside of their 719 00:36:59,160 --> 00:37:02,320 Speaker 6: regular circle and start digging. And I think Brent Lee's 720 00:37:02,320 --> 00:37:05,439 Speaker 6: story is not altogether dissimilar, but it was that 721 00:37:05,560 --> 00:37:08,759 Speaker 6: the theories started getting more and more implausible, to the 722 00:37:08,760 --> 00:37:11,960 Speaker 6: point where they started questioning, is this real? You 723 00:37:11,960 --> 00:37:14,000 Speaker 6: know, how plausible is this? This is starting to sound a 724 00:37:14,040 --> 00:37:17,719 Speaker 6: little too kooky, which led them to spend less time online. 725 00:37:18,040 --> 00:37:20,160 Speaker 6: And I think these things are key. So we're doing 726 00:37:20,680 --> 00:37:23,480 Speaker 6: big experiments at the moment trying to get people to 727 00:37:23,640 --> 00:37:28,239 Speaker 6: unfollow polarizing accounts, deactivate social media for a while, and 728 00:37:28,280 --> 00:37:30,799 Speaker 6: we find that has a huge impact on people in 729 00:37:30,920 --> 00:37:35,080 Speaker 6: terms of their information consumption, the information environment that surrounds them. 730 00:37:35,200 --> 00:37:38,879 Speaker 6: People spread less misinformation, they feel less polarized.
And so 731 00:37:39,040 --> 00:37:41,799 Speaker 6: the challenge with getting people out is that you have 732 00:37:41,880 --> 00:37:45,520 Speaker 6: to provide a new network for people. Right, they're used 733 00:37:45,640 --> 00:37:48,520 Speaker 6: to having the crowd that they affiliate with, but you've 734 00:37:48,600 --> 00:37:50,600 Speaker 6: got to have other people that you can turn to. Right, 735 00:37:50,640 --> 00:37:52,600 Speaker 6: you've got to have other narratives, and you need to 736 00:37:52,600 --> 00:37:55,480 Speaker 6: have a new support system. And so the first step 737 00:37:55,520 --> 00:37:58,279 Speaker 6: is to get people engaged with counter content and then 738 00:37:58,360 --> 00:38:01,520 Speaker 6: give them a new support system. Brent was courageous because 739 00:38:01,640 --> 00:38:03,960 Speaker 6: he just logged off and lost all of his friends. 740 00:38:04,160 --> 00:38:07,600 Speaker 6: And that's why most conspiracy theorists don't turn out like Brent, 741 00:38:07,760 --> 00:38:10,440 Speaker 6: because nobody wants to lose their friends. And it's difficult, 742 00:38:10,640 --> 00:38:12,319 Speaker 6: and you have to find your way out on your own. 743 00:38:12,360 --> 00:38:13,840 Speaker 6: So I think it was really brave of him to 744 00:38:13,920 --> 00:38:17,160 Speaker 6: do that and persist. So you can't persuade people by coming 745 00:38:17,200 --> 00:38:19,560 Speaker 6: in saying, you're wrong, these are the facts. This is 746 00:38:19,560 --> 00:38:22,000 Speaker 6: what most people do, and it just doesn't work. You 747 00:38:22,000 --> 00:38:24,200 Speaker 6: have to have a more indirect approach.
I always start 748 00:38:24,200 --> 00:38:27,360 Speaker 6: with what we call worldview validation, which is a technique 749 00:38:27,400 --> 00:38:30,080 Speaker 6: that says, look, some conspiracies have really happened in the past, 750 00:38:30,400 --> 00:38:32,400 Speaker 6: and now they feel like you're validating part of what 751 00:38:32,440 --> 00:38:34,880 Speaker 6: they're saying. And then you pivot and say, look, but 752 00:38:34,960 --> 00:38:38,160 Speaker 6: on the specific one that we're talking about, let's agree to disagree. 753 00:38:38,280 --> 00:38:40,360 Speaker 6: But you have to inch people away a 754 00:38:40,400 --> 00:38:43,279 Speaker 6: little bit from the story, right. Oh, okay, okay, I 755 00:38:43,320 --> 00:38:46,279 Speaker 6: still think this on this particular issue. Okay, but would you 756 00:38:46,320 --> 00:38:49,040 Speaker 6: read this other article? Not saying you have to believe it, 757 00:38:49,120 --> 00:38:51,680 Speaker 6: just read it, right. And then we'll go, okay, let's 758 00:38:52,160 --> 00:38:54,120 Speaker 6: read it, and then we meet somewhere in the middle, 759 00:38:54,120 --> 00:38:55,680 Speaker 6: and then you leave it at that. And I think 760 00:38:55,719 --> 00:38:58,000 Speaker 6: for most people that's tough. They want to have convinced someone 761 00:38:58,560 --> 00:39:01,120 Speaker 6: right away, their cranky uncle or family member. But I leave 762 00:39:01,160 --> 00:39:03,319 Speaker 6: it at that. The next interaction, we move a little 763 00:39:03,360 --> 00:39:06,600 Speaker 6: bit, and you invite people: come over to my party, 764 00:39:06,640 --> 00:39:09,400 Speaker 6: where there's a different narrative happening, meet different people, and 765 00:39:09,440 --> 00:39:10,840 Speaker 6: you have to do that over and over.
It's a 766 00:39:10,880 --> 00:39:13,360 Speaker 6: huge investment, it takes a lot of time, but I 767 00:39:13,400 --> 00:39:15,640 Speaker 6: think that's where the evidence is in terms of the 768 00:39:15,719 --> 00:39:19,520 Speaker 6: deradicalization strategy. And then Cass Sunstein has a technique he 769 00:39:19,640 --> 00:39:22,879 Speaker 6: calls cognitive infiltration, which I don't think I could get 770 00:39:22,920 --> 00:39:26,040 Speaker 6: through our ethics committee at the university, but it 771 00:39:26,719 --> 00:39:29,160 Speaker 6: may be a method you guys are familiar with. Cognitive 772 00:39:29,200 --> 00:39:33,360 Speaker 6: infiltration is basically you pose as a fellow conspiracy theorist, 773 00:39:33,440 --> 00:39:36,520 Speaker 6: you join their group, and you do subversion from within. 774 00:39:36,560 --> 00:39:39,400 Speaker 6: You start to try to sort of unravel the narrative 775 00:39:39,480 --> 00:39:43,040 Speaker 6: and start asking questions and getting people skeptical. And you 776 00:39:43,040 --> 00:39:46,040 Speaker 6: can do that in online chat rooms, for example, quite easily. 777 00:39:46,360 --> 00:39:49,239 Speaker 6: Conspiracy theorists hate being manipulated. They don't want to be 778 00:39:49,280 --> 00:39:51,360 Speaker 6: part of the sheeple. So giving them the idea that 779 00:39:51,440 --> 00:39:53,000 Speaker 6: maybe they're part of a pyramid scheme, all of a 780 00:39:53,040 --> 00:39:54,560 Speaker 6: sudden it's, I don't want to be part of that. 781 00:39:54,840 --> 00:39:56,359 Speaker 6: And so that's what I try to do. I try 782 00:39:56,360 --> 00:40:00,000 Speaker 6: to uncover the manipulation techniques and not discuss specific facts, 783 00:40:00,040 --> 00:40:02,239 Speaker 6: and then leave it at that, and then next time 784 00:40:02,280 --> 00:40:05,160 Speaker 6: maybe we can discuss more, and so on.
And then there's the 785 00:40:05,200 --> 00:40:07,560 Speaker 6: suggestion that someone's making money off of it. Alex Jones sells 786 00:40:07,560 --> 00:40:10,320 Speaker 6: stuff on his show, right? You're contributing money to somebody, 787 00:40:10,400 --> 00:40:12,239 Speaker 6: and I really don't like that, and so that's one 788 00:40:12,239 --> 00:40:14,880 Speaker 6: way to introduce what we call some friction. 789 00:40:14,920 --> 00:40:17,520 Speaker 2: Sander, that was actually wonderful. Really, thank you very much. 790 00:40:17,719 --> 00:40:19,440 Speaker 3: We're really happy to get a chance to talk to 791 00:40:19,480 --> 00:40:20,920 Speaker 3: you today, and I hope we are able to do 792 00:40:20,960 --> 00:40:21,359 Speaker 3: so again. 793 00:40:21,480 --> 00:40:24,040 Speaker 6: Yeah, my pleasure. And I'm sure people know this was 794 00:40:24,080 --> 00:40:25,520 Speaker 6: all planned and predetermined. 795 00:40:27,880 --> 00:40:29,120 Speaker 4: We'll all stick to the plan. 796 00:40:29,280 --> 00:40:30,720 Speaker 2: I'm too lazy to be planned. 797 00:40:32,120 --> 00:40:34,839 Speaker 3: Now, let's speak with our producer Adam Davidson and try 798 00:40:34,840 --> 00:40:36,120 Speaker 3: to sort this all out together. 799 00:40:37,400 --> 00:40:40,080 Speaker 5: I actually read something the other day that the single 800 00:40:40,120 --> 00:40:46,000 Speaker 5: best intervention found so far is AI slowly, methodically convincing 801 00:40:46,040 --> 00:40:48,839 Speaker 5: people to stop believing a conspiracy theory. We should get 802 00:40:48,840 --> 00:40:51,960 Speaker 5: the researchers of that on. But this is a reminder 803 00:40:52,000 --> 00:40:55,600 Speaker 5: that joining a conspiracy isn't just accepting some facts. 804 00:40:55,719 --> 00:40:58,200 Speaker 5: It's an entire way of life. It's a community.
I 805 00:40:58,200 --> 00:41:01,400 Speaker 5: don't know, do you emerge hopeful that podcasts like ours or 806 00:41:01,440 --> 00:41:04,520 Speaker 5: other things could break the hold, or is it just 807 00:41:04,560 --> 00:41:06,399 Speaker 5: too little, too few? 808 00:41:06,760 --> 00:41:09,200 Speaker 4: For people teetering on the edge, I think shows like this 809 00:41:09,200 --> 00:41:11,040 Speaker 4: can actually help get them to think about it. But 810 00:41:11,680 --> 00:41:14,160 Speaker 4: once you're down there, no. We've all sat around on 811 00:41:14,239 --> 00:41:17,320 Speaker 4: Thanksgiving with the crazy uncle or cousin who's deep in a 812 00:41:17,360 --> 00:41:20,279 Speaker 4: conspiracy theory. And even when he sits with people who 813 00:41:20,360 --> 00:41:23,239 Speaker 4: love him at an intervention, talking about how this just 814 00:41:23,360 --> 00:41:27,960 Speaker 4: doesn't make sense, it still doesn't work. It's about faith, not facts. 815 00:41:28,000 --> 00:41:30,120 Speaker 5: Faith and friends. That's what someone was pointing out 816 00:41:30,120 --> 00:41:31,960 Speaker 5: to me, that you go to a MAGA rally if 817 00:41:32,000 --> 00:41:35,120 Speaker 5: you're lonely and you suddenly have one hundred thousand best friends. 818 00:41:34,719 --> 00:41:36,920 Speaker 4: And I think that's a huge deal. 819 00:41:37,040 --> 00:41:40,080 Speaker 3: Yeah, yeah, thank you for today's discussion, and we will 820 00:41:40,080 --> 00:41:42,439 Speaker 3: see you next time on Mission Implausible.
824 00:41:57,040 --> 00:42:00,839 Speaker 8: Mission Implausible is a production of Honorable Mention and Abominable 825 00:42:00,880 --> 00:42:02,560 Speaker 8: Pictures for iHeartMedia.