Speaker 1: G'day, groovers. Welcome to another installment of the show. Tiffany Cook. David Brian Kevin Patrick James Gillespie, the less talented of the two Gillespies I've had on the show today, joins us.
Speaker 2: We'll say g'day to Tiff. Hi, Tiff.
Speaker 3: Hi, Harps.
Speaker 4: I'm standing up at my standing desk for the first time in twenty twenty-five.
Speaker 3: How is it? It's good? Huh.
Speaker 4: I'll be sick of it soon.
Speaker 2: Yeah, I need to.
Speaker 3: What exactly is the point of a standing desk?
Speaker 1: Because we weren't built to sit, Gillespo. Because of the hip flexors.
Speaker 2: It gives us time.
Speaker 1: The psoas and all those muscles. And you old people, I mean, feel free.
Speaker 2: That's your age. It's a wonder you can even get in and out of a chair.
Speaker 1: But that's true. Us youngsters, we need to maintain integrity through the hip muscles and lower back.
Speaker 3: So you can get into your Porsche and out again. Yeah, because it's so...
Speaker 1: Low. Saving up for a fucking salad.
Speaker 2: So don't start.
Speaker 1: There's only one of us on this call who's rich, and we're going to get to this. There's only one of us who's sold more than three books. And Mum bought two, so that was fucking tragic.
Speaker 2: How are you, Gillespo?
Speaker 3: Yeah, good. I'm sitting at a desk, by the way, and I never intend to be standing.
Speaker 1: I am also sitting at a desk. Tiff, what led to the new standing format?
Speaker 4: I've been doing a lot of work lately, and a lot of sitting, and so I just decided to stand up. And it's good. If I leave it up, then I'll come in and I won't just sit down out of habit.
Speaker 1: I used to think that too when I bought my million-dollar two-motor electric desk.
Speaker 2: And all I do is sit at the fucking thing.
Speaker 1: I used to tell everyone how good I was, because I'd sit four hours, stand four hours, like hour on, hour off. All I do is sit now. So I'm with Gillespo. I'm full of shit, so don't listen to me.
Speaker 3: At least you know you could stand if you wanted to.
Speaker 2: Yes, yes, yes. Tiff, here's my question.
Speaker 1: Before we talk to the great man and we tap into that big, massive prefrontal cortex of his, and all the other bits: can you focus and concentrate as much when you stand? Is it the same, better or worse?
Speaker 2: Because it took me a minute to get used to it.
Speaker 4: No, I think it's the same for me. It's good. No different. Just a better position. I find when I sit I end up slouching and wriggling and fidgeting, so it just gives me a break from that.
Speaker 2: Well, beautiful. Mister Gillespie.
Speaker 1: So we spoke a while ago, and we've spoken periodically in pieces, about the idea of building a cult and the different kinds of cults. And you know, I think a lot of pyramid marketing is cultish, and a cult doesn't have to be religious.
Speaker 2: We know that.
Speaker 1: But you actually created, or wrote, a five-step playbook. Yes, a kind of how-to guide, which I think is so generous of you. Literally a five-step plan on building a cult, for those would-be cult leaders who listen.
Speaker 2: And I know who you are. What was the prompt for this?
Speaker 3: Well, talking to you last time was the prompt, because I said, you know, you really just need to create a cult. And I said it's a pretty simple formula. And I didn't really run through what the formula was, so it was a bit of a teaser for you. But I thought, you know, I've actually never written down what the formula for creating a cult is, so I thought I should do that. So I wrote that piece.
Speaker 2: Is it pretty similar? I mean, you'll tell us.
Speaker 1: But are the five steps and the principles that work in, say, a religious or spiritual context similar in other contexts or areas?
Speaker 3: It's in everything. Because what we're calling a cult is actually just a hack on the way humans are wired.
Speaker 3: So, and we've talked about this before, the key differentiator for humans as a species, and why we're able to dominate the planet even though we're a terrible predator (no teeth, no venom, no scales, nothing; we're basically meat on feet for an alligator or a crocodile or a great white shark, or even a koala bear, really), is that we're the only species that can cooperate with strangers, and regularly does. We can work in groups of a hundred, or ten thousand, or a million, and be just fine. No other species on the planet can do that. Now, whenever I say that, people always say, oh yeah, but what about ants? Well, ants aren't individuals. Ants are an animal with severable moving parts. An ants' nest or a beehive is all one organism; it just has bits that move independently. They're not individuals. You don't see bees making individual decisions about what they're going to do today: oh, I feel like taking the day off today, so I'll let you guys go out and get the pollen. That's not how bees work. They're part of one organism.
Speaker 3: And humans, however, unlike every other species, can cooperate with strangers, because we have some built-in mechanisms that allow us to do it, which we've evolved relatively recently. And a key mechanism is that we do things for the approval of others. Dogs don't care what other dogs think of them, and neither do sharks. Humans care a lot what other humans think of them, and it motivates us. We are motivated to do things that we think other people will like. Now, you can imagine that would be a pretty powerful glue for getting us to cooperate in large groups. If we get a large group of people together and give them a motivation, and some of us start doing it, then the rest of us will start doing it too, because we want to please those who are doing the thing that obviously large numbers of humans want us to do.
Speaker 3: So that automatic grouping or coalescing, where we always automatically seek the approval of others, works to make us very powerful, because we can cooperate in large groups of strangers. And it means we won't intentionally, most of the time, hurt each other, because that's generally not going to make us popular with other humans. That's not something you can say about any other species. They will behave that way with their own family members, because of DNA, and we could go into that in great detail, but we won't. Beyond that, though, they won't. And so the difficulty for other species is that when you get two groups of elephants coming together, if they're not related to each other, they're the enemy, whereas with humans strangers are potentially a uniting force, which makes them all much more powerful. So that's the advantage. The downside is that it can be manipulated by bad actors, and that's what a cult is.
Speaker 3: A cult is a bad actor who creates a system of belief for a large group of humans that is all about empowering the bad actor to achieve their aims by getting people to behave that way. It's just a hack on that social approval system, and it's a pretty easy thing to do. All you've got to do is, well, (a) be charismatic. So I guess that rules you and I out, Craig, but, you know, Tiff's got a chance. And the definition of charismatic here is someone who people want to follow. It doesn't necessarily mean good looking; it means someone who people feel compelled to listen to.
Speaker 1: Yes. So, I guess, attractive, not in the physical sense, but in that people are attracted to their energy, like you said, their charisma, their ideas.
Speaker 3: Yeah, well, it can be in the physical sense too, and often is. But there are also plenty of examples where the leaders of cults, or the things I'm describing here... and I keep using the word cult, but it describes anything, right? It describes a political party, it describes a religion.
Speaker 3: It does describe what people traditionally call a cult, but it's any kind of movement where large groups of people are mobilised to a cause by a leader, and that leader...
Speaker 2: Which, by definition, would make religions...
Speaker 3: Cults. Of course, yes. And it makes the MAGA crowd in the United States a cult. It makes a cult of anyone who gets that absolutely devoted following. And you can tell when it's a cult, because it's impervious to logic. We'll talk in a bit more detail about how that's achieved, but it's impervious to logic. You can throw facts at it till you're blue in the face, and the people who are believers in the cult will not change their point of view at all.
Speaker 1: Ever. And I guess you could also call sporting fans cult members in that sense, in that it's very emotional, it's very irrational, it's very us versus them. It's like, we're the best, everyone else is the worst. We're right, they're wrong. They're shit, we're great. That's culty.
Speaker 3: It's a weak form of it.
Speaker 3: Yes. So there are probably a core group of fans who very definitely behave like a cult. What they often lack is a leader, though, and a leader is a core component of this. There has to be someone who is projecting the core message of the cult, and a sports team is not a leader. A sports team is a common interest, and yes, there are people who are radically committed to it and will brook no criticism of it at all, but it's still what I would call a weak version of it. It is using the same mechanism in the human brain, though, which is that they're doing things they think the other people who believe the same things as them will approve of, and it will therefore increase their status with those people. It's the same, by the way, at an even looser level, when someone goes to, say, a stand-up comedy performance. If the audience is laughing, you're laughing, even if it's not funny, because you're part of a group of humans behaving in a way that pleases the other humans. So those are all examples of that in action.
But none of those things are a cult unless you've got that core ingredient of a leader. So you need a leader. The leader has to be charismatic in some way. There has to be something about that leader that causes people to listen to what they say, because if you don't listen to what the leader says, you can't create a cult. The core of a cult is what comes out of a person's mouth: what are they saying that you find an intrinsically powerful belief to hold? So, at the core of every cult... If we look at, say, for example, the Nazis, which is your classic cult, you have a leader there, not particularly good looking, but very charismatic. And, you know, Tiff, I see her smiling; she obviously found Hitler quite attractive. But, you know, I think he's generally not regarded as particularly good looking. But he said some really powerful things, and he said things to people who wanted to hear exactly those things.
Speaker 3: So when he rose to power, in between the wars, Germany was being destroyed by extreme punishments, driven largely by France: economic punishments that were destroying the German economy. Life was getting harder and harder in that economy, and they wanted to hear from someone who had a solution for them. And Hitler didn't just offer a solution, which would be dry enough and could come from any political party. It came with a really nice message. It came with a message that said: you are part of the Aryan race. You, the German people, the German-speaking people, are part of the Aryan race. You are superior to all of these other people who are trying to take you down, and all of these other people are your enemies. So he gave them an ideal, which is: you are special. The people who follow me, the people who believe in the things I say, are special people, gifted people, set apart from the others. And he created an us-and-them narrative, which he then just fed into with the other elements of cult building. But that's a key part of creating a cult.
Speaker 3: So when you create your cult, Craig, and let's assume you can be charismatic enough to have one...
Speaker 1: Can you not talk about my cult straight after you talk about... God?
Speaker 2: Say...
Speaker 3: But when you create it, you're going to have to have that key message. You're going to have to have the message that the Harperites are better people. Okay? Just better people in general. They're fitter, they're healthier, they're... you know, you're going to have to find some characteristics that unite the Harperites in their belief that they are better than everybody else. And so you need to create that us-and-them narrative, because that's a key part of this cognitive tool that we have in our brain. There has to be an "us", who are bound together by a common belief, and there has to be a "them", who are the outsiders who don't believe these things.
Speaker 2: Yeah. Wow. I've never been less inclined to start a cult.
Speaker 1: But it is... I remember you were saying, and we'll jump straight back into where you're at.
Speaker 1: But one of the things that really resonated with me last time we spoke about this was you said, you know those religions where there are these kind of world-ending or world-changing events that are coming on a certain day, and the people in these cults believe this, and then the day comes and goes and the event didn't happen. So rather than people going, oh well, old mate's full of shit, so clearly he's not connected to God, clearly he or she doesn't really have any mystical, magical, mythical insight...
Speaker 2: And they don't leave the cult.
Speaker 1: Some of them do, but a lot of them, you know, stay, and reason and logic and critical thinking just go out the window, because they're so intertwined with this thing that the idea of leaving is scarier than the idea that old mate might be wrong.
Speaker 3: And that's another key element. That is demanding escalating investment. You have to demand that people commit more and more and more to the cult.
Speaker 3: So, the example you're talking about there... I mean, there were quite a few in the eighteen hundreds, but the biggest and most famous one was William Miller. William Miller was a preacher in the eighteen forties, and he predicted that on October twenty-second, eighteen forty-four, Christ would return to Earth and the gates of Heaven would be opened, and only true believers would be admitted. And so his cult consisted of the true believers, and everybody else. And the true believers, yeah, there ended up being about one hundred thousand of them by the time you got to October twenty-second. But he demanded escalating commitment. In order to demonstrate their belief, they had to rid themselves of all worldly possessions, because they wouldn't need them: on October twenty-second they would be admitted to heaven.
Speaker 2: "Admitted." Very clinical.
Speaker 1: It's not unlike... it's not unlike... I could be fucking this up.
Speaker 2: I think I've got it right.
Speaker 1: The Jehovah's Witnesses, who believe one hundred and forty-four thousand will be transcended to heaven on the last day. But there are a lot more than one hundred and forty-four thousand Jehovah's Witnesses.
Speaker 3: Yeah, so I guess someone's going to miss out.
Speaker 2: But that's a problem.
Speaker 3: But part of his thing was you had to demonstrate total commitment. So his believers gave away their property, abandoned their farms, gave away their worldly possessions, and were regarded as totally insane by almost everybody else. But that actually helped the narrative, because it proved the us versus them: we are the true believers, and those people who are making fun of us are the evil people who don't understand. So now, the fact that we are past October twenty-second, eighteen forty-four and Jesus still hasn't turned up might seem like a bit of a problem.
Speaker 2: And you're right, Jesus hasn't turned up.
Speaker 1: That's the name of the show, Tiff. That's what we're going to call it: Jesus Hasn't Turned Up.
Speaker 1: With the little hands-in-the-air emoji, you know? The Jesus-hasn't-turned-up emoji. The old shrug emoji.
Speaker 3: So anyway, October twenty-third, eighteen forty-four rolls around. Everything's still the same as it was the day before. And that subsequently became called the Great Disappointment, which to me is a bit of an understatement. So the disappointment occurred, and there were some people it caused to lose faith, believe it or not, who left the church and didn't believe any longer. But there was a fairly large group of people who said, ah well, we must have misunderstood the calculations, or misunderstood the scriptures, et cetera, et cetera. And in fact it split the thing into about three or four main groupings of people, who had different explanations for why, but they all believed it was still right; it was just that something had been misinterpreted.
Speaker 1: The one where Brian in accounting...
Speaker 3: That's right. Well, one of the groups actually survived and now has millions of followers today.
Speaker 3: So the group that survived we now call the Seventh-day Adventist Church. They were one of the groups of the Millerites that came through the Disappointment, and their rationalisation for why it didn't work was: oh, listen, old mate Miller, he stuffed up the message, because what was actually going to happen was Jesus was going to come to his place in heaven. All of this was supposed to occur in heaven, and it did occur in heaven, for people who had already passed. And so you stuffed it up thinking it was going to happen on earth. The convenient thing about the in-heaven story is that there was no way to verify whether it had happened or not.
Speaker 2: I know it did happen.
Speaker 3: Yeah, you just...
Speaker 1: It's really just not anywhere that we can verify. But it definitely did.
Speaker 3: Yes, yes. You know, "you don't know my boyfriend, he's from Canada."
Speaker 1: I knew you had a boyfriend.
349 00:20:58,560 --> 00:21:03,320 Speaker 3: So anyway, the reason that that happens from a cognitive perspective, 350 00:21:03,359 --> 00:21:06,640 Speaker 3: from a biochemical perspective, is, you know, what accountants call 351 00:21:06,640 --> 00:21:09,680 Speaker 3: the sunk cost fallacy, which is the more we invest 352 00:21:09,880 --> 00:21:13,800 Speaker 3: in something, the less easy we find it to pull 353 00:21:13,840 --> 00:21:16,600 Speaker 3: away from it. So the more invested we are in 354 00:21:16,640 --> 00:21:19,920 Speaker 3: the message, if we've given away our home, our car, 355 00:21:20,119 --> 00:21:23,520 Speaker 3: our farm, et cetera, et cetera, because 356 00:21:23,560 --> 00:21:27,880 Speaker 3: of the escalating investment required of this cult, then 357 00:21:28,560 --> 00:21:33,200 Speaker 3: even though it didn't happen, it still must be true, 358 00:21:33,280 --> 00:21:35,840 Speaker 3: because I've given up everything. So the thought process 359 00:21:36,000 --> 00:21:39,360 Speaker 3: is, I've given up everything for this, therefore it must 360 00:21:39,440 --> 00:21:43,560 Speaker 3: be the truth. And you can see that same process 361 00:21:43,560 --> 00:21:47,520 Speaker 3: occurring in cults all the time. It makes them impervious 362 00:21:47,520 --> 00:21:48,040 Speaker 3: to logic. 363 00:21:48,760 --> 00:21:51,600 Speaker 1: Well, also when you think about the fact that people 364 00:21:51,680 --> 00:21:54,760 Speaker 1: have like got this, some of them, ten, twenty, thirty, 365 00:21:54,880 --> 00:21:59,879 Speaker 1: forty year relationship with this ideology, philosophy, theology, fill in 366 00:22:00,200 --> 00:22:04,040 Speaker 1: your own ology, and their identity is intertwined 367 00:22:04,119 --> 00:22:07,480 Speaker 1: with that.
And if you tell them your identity, like 368 00:22:07,560 --> 00:22:10,199 Speaker 1: that thing that you identify with, that thing that is 369 00:22:10,320 --> 00:22:13,840 Speaker 1: essentially you, isn't true, well, that's got to be 370 00:22:13,920 --> 00:22:17,040 Speaker 1: terrifying for them. So they don't know who they are now. 371 00:22:17,440 --> 00:22:20,360 Speaker 1: So you defy reason and logic and evidence and go. 372 00:22:21,119 --> 00:22:23,719 Speaker 1: You know, I was thinking as you were talking a 373 00:22:23,720 --> 00:22:28,960 Speaker 1: little bit, this is not extremely dissimilar to your theory, 374 00:22:29,040 --> 00:22:31,040 Speaker 1: or not your theory, a lot of people's theory and 375 00:22:31,080 --> 00:22:37,640 Speaker 1: evidence and science, around seed oils versus mainstream Yes, dietetics. 376 00:22:38,000 --> 00:22:45,600 Speaker 3: Yeah, say again, it's the sunk cost fallacy. It's we've 377 00:22:45,640 --> 00:22:48,959 Speaker 3: invested so much in this, it must be right, yes, 378 00:22:50,119 --> 00:22:54,639 Speaker 3: and so it becomes impervious to science that proves it isn't, 379 00:22:56,520 --> 00:22:59,879 Speaker 3: because that would mean that you would have to 380 00:23:00,160 --> 00:23:02,800 Speaker 3: unbelieve everything you've spent your entire life believing. 381 00:23:03,800 --> 00:23:05,479 Speaker 2: Yeah, what do you think of that? 382 00:23:06,440 --> 00:23:11,080 Speaker 1: Like, why are we... We're getting philosophical now and we 383 00:23:11,119 --> 00:23:13,680 Speaker 1: haven't even done step one, which is isolate the subject 384 00:23:13,720 --> 00:23:16,320 Speaker 1: and become the sole source of validation. This could be the 385 00:23:16,320 --> 00:23:22,000 Speaker 1: longest Gillespie show ever. By the way, everyone, it is 386 00:23:22,040 --> 00:23:24,679 Speaker 1: a really good article.
It is on, if you go 387 00:23:24,800 --> 00:23:30,679 Speaker 1: to de Gillespie dot substack dot com, you'll find it there. 388 00:23:31,440 --> 00:23:34,760 Speaker 2: It's one of the best articles I think you've ever written. 389 00:23:34,800 --> 00:23:38,159 Speaker 2: I loved it. But I'm fascinated with this shit. I've 390 00:23:38,200 --> 00:23:39,919 Speaker 2: got no idea what I was saying. Carry on. 391 00:23:41,080 --> 00:23:43,520 Speaker 3: So yeah. So there's different steps to this. The first 392 00:23:43,520 --> 00:23:46,880 Speaker 3: one, if you're going to go full cult, right, I mean, 393 00:23:47,480 --> 00:23:51,320 Speaker 3: there's different ways to do this. Many of the sort 394 00:23:51,359 --> 00:23:55,000 Speaker 3: of more modern day cults, they've got a 395 00:23:55,000 --> 00:23:57,879 Speaker 3: bit of a leg up because they've got social media 396 00:23:57,920 --> 00:24:02,159 Speaker 3: to help them. So social media sort of reinforces the 397 00:24:02,200 --> 00:24:04,600 Speaker 3: mechanisms I've been talking about there. You no longer have 398 00:24:04,760 --> 00:24:08,040 Speaker 3: to get in a group of people and convince them 399 00:24:08,119 --> 00:24:14,560 Speaker 3: personally of something. You can access millions of people simultaneously 400 00:24:14,600 --> 00:24:17,159 Speaker 3: with a message, and some of them will catch on 401 00:24:17,240 --> 00:24:22,160 Speaker 3: to that message and receive validation from it, and the algorithm 402 00:24:22,160 --> 00:24:26,359 Speaker 3: will then keep feeding them even more of that message 403 00:24:26,400 --> 00:24:30,120 Speaker 3: and more validation. So it forces them into an echo 404 00:24:30,320 --> 00:24:35,240 Speaker 3: chamber where their own mind becomes a prison guard, defending 405 00:24:35,280 --> 00:24:38,720 Speaker 3: their commitment to the cult and causing them to attack 406 00:24:38,760 --> 00:24:42,440 Speaker 3: anything that harms the cult.
So social media is a 407 00:24:42,480 --> 00:24:46,960 Speaker 3: massive accelerator for this. Massive accelerator. But back in the 408 00:24:47,000 --> 00:24:50,200 Speaker 3: olden days before social media, if you wanted to start 409 00:24:50,200 --> 00:24:53,480 Speaker 3: a cult, you actually had to isolate the subject. So 410 00:24:54,280 --> 00:24:57,080 Speaker 3: you had to find someone who was susceptible to your 411 00:24:57,080 --> 00:25:02,320 Speaker 3: messaging and cut them off from any external validation, because they're 412 00:25:02,920 --> 00:25:07,560 Speaker 3: open to persuasion by someone else. In the early stages 413 00:25:07,880 --> 00:25:11,640 Speaker 3: of recruitment, if you're just talking to them one on one, 414 00:25:12,200 --> 00:25:14,959 Speaker 3: then they might find you very charismatic and believe everything 415 00:25:15,000 --> 00:25:17,399 Speaker 3: you say, and then, you know, they walk out in 416 00:25:17,440 --> 00:25:19,119 Speaker 3: the street and someone else says something else to them 417 00:25:19,160 --> 00:25:21,800 Speaker 3: as well, and they think, oh, hang on, maybe what 418 00:25:21,920 --> 00:25:25,280 Speaker 3: Harps said is a load of BS, I better reconsider. 419 00:25:25,520 --> 00:25:28,160 Speaker 3: So to avoid that kind of thing happening, you've got 420 00:25:28,240 --> 00:25:30,640 Speaker 3: to cut them off from that sort of communication. You've 421 00:25:30,640 --> 00:25:33,280 Speaker 3: got to make sure that the only messaging they're hearing 422 00:25:33,359 --> 00:25:37,479 Speaker 3: is your messaging, so that usually involves, you know, 423 00:25:37,600 --> 00:25:40,840 Speaker 3: isolating them from their family, isolating them from their friends, 424 00:25:41,000 --> 00:25:43,960 Speaker 3: other networks.
That can't be done instantly, but to start, 425 00:25:44,240 --> 00:25:46,920 Speaker 3: in order to do that, you create very, very frequent 426 00:25:46,960 --> 00:25:50,040 Speaker 3: meetings with them where you're controlling the messaging. As I said, 427 00:25:50,400 --> 00:25:52,439 Speaker 3: this is a bit old school, because you don't need 428 00:25:52,480 --> 00:25:54,880 Speaker 3: to do that so much now, because the algorithm does 429 00:25:54,880 --> 00:25:57,800 Speaker 3: it for you. So as soon as someone starts to 430 00:25:57,840 --> 00:26:01,679 Speaker 3: indicate a preference for a message, the algorithm starts isolating 431 00:26:01,720 --> 00:26:04,840 Speaker 3: them anyway, so that all they see is their echo 432 00:26:04,920 --> 00:26:06,840 Speaker 3: chamber of the things they think they believe. 433 00:26:07,520 --> 00:26:12,640 Speaker 2: Yeah, yeah, yeah, it's so true. It's, yeah. 434 00:26:12,720 --> 00:26:16,200 Speaker 1: Obviously, all that comes up on mine is health stuff, 435 00:26:16,240 --> 00:26:20,400 Speaker 1: fitness stuff, motorbike stuff, and stuff that for the most 436 00:26:20,440 --> 00:26:24,280 Speaker 1: part I align with anyway. So yeah, it's just feeding 437 00:26:24,320 --> 00:26:26,400 Speaker 1: you the stuff that you want to keep looking at. 438 00:26:26,840 --> 00:26:30,880 Speaker 3: That's right. But once you've got them captured, then 439 00:26:30,920 --> 00:26:34,400 Speaker 3: the last key ingredient, you know, 440 00:26:34,600 --> 00:26:37,560 Speaker 3: the icing on the cake, is you've got to control 441 00:26:38,119 --> 00:26:40,879 Speaker 3: the narrative. You've got to make sure 442 00:26:41,720 --> 00:26:48,200 Speaker 3: that the messaging becomes your messaging. So people will still 443 00:26:48,240 --> 00:26:51,720 Speaker 3: be exposed to media which says something about your leader 444 00:26:51,760 --> 00:26:53,879 Speaker 3: that isn't pleasant.
You know, believe it or not, there 445 00:26:53,880 --> 00:26:56,320 Speaker 3: are people out there saying really bad things about Trump, 446 00:26:56,359 --> 00:27:02,760 Speaker 3: for example, and yeah, apparently. I had to pick myself back up off 447 00:27:02,760 --> 00:27:07,000 Speaker 3: the ground. And if you don't want that to start 448 00:27:07,000 --> 00:27:11,120 Speaker 3: affecting the movement and start chipping people off the edge, 449 00:27:11,520 --> 00:27:14,000 Speaker 3: you've got to give a counter narrative that is just 450 00:27:14,040 --> 00:27:16,920 Speaker 3: as strong, and you've got to put it on maximum blast. 451 00:27:17,440 --> 00:27:20,199 Speaker 3: So, within the cult, you've got to 452 00:27:20,240 --> 00:27:24,760 Speaker 3: be distributing your own version of everything, your own version 453 00:27:24,800 --> 00:27:29,920 Speaker 3: of events, your own counter narrative, your own plausible deniability 454 00:27:29,960 --> 00:27:36,280 Speaker 3: for everything, because people outside the cult will start saying 455 00:27:36,320 --> 00:27:39,520 Speaker 3: things. The bigger the cult becomes, the nastier people 456 00:27:39,560 --> 00:27:43,639 Speaker 3: will get about you, its leader, and its followers. You know, 457 00:27:43,680 --> 00:27:46,720 Speaker 3: it's like the poor Millerites.
You know, people calling 458 00:27:46,760 --> 00:27:49,880 Speaker 3: them loons because they gave away everything because, you know, 459 00:27:50,080 --> 00:27:53,040 Speaker 3: the world was going to end. You have to have 460 00:27:53,040 --> 00:27:56,200 Speaker 3: a counter message going within the cult where you are 461 00:27:56,240 --> 00:27:59,840 Speaker 3: controlling the messaging and the language, so that they've always 462 00:28:00,119 --> 00:28:02,679 Speaker 3: got something to lean back on. When someone from outside 463 00:28:02,720 --> 00:28:06,200 Speaker 3: says something nasty about their leader or them, they've got 464 00:28:06,240 --> 00:28:07,959 Speaker 3: an answer to whatever the event is. 465 00:28:10,520 --> 00:28:14,080 Speaker 1: Wow, I'm sure I was in a cult when I 466 00:28:14,119 --> 00:28:19,000 Speaker 1: was younger. But you know what is interesting, just 467 00:28:19,080 --> 00:28:22,280 Speaker 1: in terms of not so much specifically about religion, but 468 00:28:22,480 --> 00:28:25,800 Speaker 1: it happens to be religious groups, like just individual groups. Like, 469 00:28:25,840 --> 00:28:29,920 Speaker 1: there's, whatever, twelve major religions thereabouts and four 470 00:28:30,040 --> 00:28:34,239 Speaker 1: thousand minor religions thereabouts, three to four thousand, and 471 00:28:34,400 --> 00:28:39,160 Speaker 1: overwhelmingly everyone who is in any of those religions thinks 472 00:28:39,240 --> 00:28:43,600 Speaker 1: pretty much that every other religion is wrong. It's, you know, 473 00:28:43,800 --> 00:28:47,200 Speaker 1: like it's the only kind of environment where all these 474 00:28:47,320 --> 00:28:48,120 Speaker 1: cult kind of... 475 00:28:47,960 --> 00:28:51,440 Speaker 3: But that's a critical component. You have to believe 476 00:28:51,520 --> 00:28:53,080 Speaker 3: that or you're not in a cult.
477 00:28:53,400 --> 00:28:57,240 Speaker 1: Well, but isn't it funny how you can suspend intelligence 478 00:28:57,440 --> 00:29:02,240 Speaker 1: or reason or logic when you go, Okay, I just 479 00:29:02,400 --> 00:29:04,400 Speaker 1: ended up in this group. Now I'm in this group, 480 00:29:04,400 --> 00:29:07,040 Speaker 1: and there's another four thousand or so groups that don't 481 00:29:07,040 --> 00:29:10,360 Speaker 1: believe what we believe, don't have our ideology, philosophy, theology. 482 00:29:10,080 --> 00:29:10,600 Speaker 3: They're wrong. 483 00:29:11,160 --> 00:29:12,640 Speaker 2: Yeah, of course that's the thing. 484 00:29:12,720 --> 00:29:16,960 Speaker 1: But what's the odds that, statistically, I just landed in 485 00:29:17,040 --> 00:29:19,920 Speaker 1: the one group that's got the direct hotline to God, 486 00:29:19,960 --> 00:29:24,640 Speaker 1: that's, well, the one true whatever? Even statistically, it's... 487 00:29:24,680 --> 00:29:27,000 Speaker 3: The one thing we all believe is that we're always right. 488 00:29:28,760 --> 00:29:35,200 Speaker 3: So the thing is, every minute that you spend in 489 00:29:35,240 --> 00:29:38,880 Speaker 3: that cult reinforces your belief that you have made the 490 00:29:38,920 --> 00:29:42,120 Speaker 3: correct decision and that everybody else has made the wrong decision.
491 00:29:42,360 --> 00:29:45,000 Speaker 3: And every minute that you're in there, the sunk cost 492 00:29:45,040 --> 00:29:48,720 Speaker 3: fallacy gets stronger and stronger, because you've devoted more time 493 00:29:48,840 --> 00:29:52,080 Speaker 3: now to this than anything else, so you've made the 494 00:29:52,080 --> 00:29:54,400 Speaker 3: correct decision, until you get to the point, like the 495 00:29:54,400 --> 00:29:59,800 Speaker 3: Millerites, where facts collide with your beliefs and 496 00:30:00,120 --> 00:30:04,600 Speaker 3: it still can't change things, because you have then got 497 00:30:04,600 --> 00:30:11,480 Speaker 3: the mindset: this can't be true, because I have invested 498 00:30:11,520 --> 00:30:12,200 Speaker 3: so much in it. 499 00:30:12,880 --> 00:30:13,120 Speaker 2: Yeah. 500 00:30:13,240 --> 00:30:17,440 Speaker 1: Yeah, now, I know we're going way beyond the Gillespie quota, 501 00:30:17,600 --> 00:30:23,040 Speaker 1: so charge me a tariff on this. I want to quickly 502 00:30:23,120 --> 00:30:27,200 Speaker 1: go through. Give me like one, two minutes on each 503 00:30:27,240 --> 00:30:31,680 Speaker 1: of these. So the article is called, well, How to 504 00:30:31,680 --> 00:30:34,160 Speaker 1: Build a Cult, firstly, but then the sub 505 00:30:34,560 --> 00:30:37,760 Speaker 1: goes straight down to, I love this, the recipe: 506 00:30:37,800 --> 00:30:42,560 Speaker 1: a five step protocol for hijacking the human mind. Step one, 507 00:30:43,200 --> 00:30:44,320 Speaker 1: so I want you to riff on this for a 508 00:30:44,360 --> 00:30:47,160 Speaker 1: couple of minutes. Step one, isolate the subject and become 509 00:30:47,240 --> 00:30:50,400 Speaker 1: the sole source. We've kind of covered all this a bit, 510 00:30:50,440 --> 00:30:53,360 Speaker 1: but as a snapshot, become the sole source of validation.
511 00:30:53,880 --> 00:30:56,560 Speaker 3: Okay, so this isn't as necessary these days, but if 512 00:30:56,600 --> 00:31:00,920 Speaker 3: you're starting small and you need your core group of followers, 513 00:31:01,200 --> 00:31:02,760 Speaker 3: this is what you need to do. You need to 514 00:31:03,160 --> 00:31:06,680 Speaker 3: isolate them from people who will say bad things about 515 00:31:06,720 --> 00:31:11,800 Speaker 3: your beliefs. And you need to be a source 516 00:31:11,840 --> 00:31:15,240 Speaker 3: of validation for them, so that you are continuously reinforcing 517 00:31:15,280 --> 00:31:19,000 Speaker 3: for them that they have made the correct decision. And 518 00:31:19,120 --> 00:31:22,840 Speaker 3: so it's that simple. You just need to be that 519 00:31:22,960 --> 00:31:25,080 Speaker 3: all encompassing source, and you need to cut them off 520 00:31:25,120 --> 00:31:29,200 Speaker 3: from their sources of support. Now, as you grow, your 521 00:31:29,240 --> 00:31:33,080 Speaker 3: cult starts to take on an automatic recruitment capability, because 522 00:31:33,120 --> 00:31:38,080 Speaker 3: now you can use social media to achieve this same 523 00:31:38,080 --> 00:31:41,400 Speaker 3: thing, because the algorithms will automatically do this for you. So 524 00:31:41,440 --> 00:31:44,760 Speaker 3: it'll give them validation, it'll show them only the things 525 00:31:44,840 --> 00:31:47,240 Speaker 3: they want to see, and it will make anything else 526 00:31:47,280 --> 00:31:48,240 Speaker 3: foreign to them. 527 00:31:49,080 --> 00:31:52,640 Speaker 1: Step two, create a system of variable rewards. 528 00:31:52,880 --> 00:31:55,160 Speaker 3: Ah.
Now we haven't talked about this one, but a 529 00:31:55,280 --> 00:32:02,280 Speaker 3: critical component of this is that the oxytocin reward we 530 00:32:02,360 --> 00:32:07,720 Speaker 3: get from believing that other people like what we're doing is, say, 531 00:32:07,800 --> 00:32:11,840 Speaker 3: at level one hundred. It's a standard dopamine reward, just 532 00:32:11,920 --> 00:32:15,840 Speaker 3: like the one we get from having sex or from gambling, 533 00:32:15,880 --> 00:32:18,240 Speaker 3: et cetera, et cetera. And we've talked about this a lot, 534 00:32:19,160 --> 00:32:21,200 Speaker 3: but the way our brain is wired is that we 535 00:32:21,320 --> 00:32:26,400 Speaker 3: prefer uncertainty, and the way that works is it magnifies, 536 00:32:26,480 --> 00:32:28,960 Speaker 3: in fact doubles, the hit of dopamine we get from 537 00:32:29,000 --> 00:32:33,240 Speaker 3: a rewarding experience if it's not definitely going to be delivered. 538 00:32:34,400 --> 00:32:38,680 Speaker 3: So if we only get the rewarding thing every now 539 00:32:38,720 --> 00:32:43,520 Speaker 3: and then, then the hit we get from exposure to 540 00:32:43,560 --> 00:32:46,040 Speaker 3: it is double what it would be if we get 541 00:32:46,040 --> 00:32:51,400 Speaker 3: it all the time. And cults use this quite powerfully, 542 00:32:51,520 --> 00:32:55,400 Speaker 3: in that they're not always your friend. They're 543 00:32:55,440 --> 00:32:57,800 Speaker 3: not always telling you you're the greatest person on earth. 544 00:32:57,920 --> 00:33:02,640 Speaker 3: Sometimes they're telling you you're an idiot, or they're randomly 545 00:33:02,680 --> 00:33:05,080 Speaker 3: punishing you.
So one of the most powerful and effective 546 00:33:05,160 --> 00:33:08,479 Speaker 3: methods that a cult leader can use is to make 547 00:33:08,520 --> 00:33:13,280 Speaker 3: an example of someone who has betrayed the cult, so 548 00:33:13,560 --> 00:33:16,760 Speaker 3: to take someone down in front of the other cult members. 549 00:33:17,120 --> 00:33:21,480 Speaker 3: This is a non believer who has made a transgression 550 00:33:21,560 --> 00:33:24,840 Speaker 3: against the beliefs of the cult. And if you do 551 00:33:24,920 --> 00:33:27,440 Speaker 3: that every now and then and on a random schedule, 552 00:33:28,080 --> 00:33:31,280 Speaker 3: where no one knows for sure whether they might 553 00:33:31,360 --> 00:33:35,880 Speaker 3: be next, the pleasure they receive from being part of 554 00:33:35,920 --> 00:33:39,240 Speaker 3: this cult is twice as high as it would otherwise be. 555 00:33:39,640 --> 00:33:43,600 Speaker 3: So while it seems cruel, it's a powerful motivator. 556 00:33:44,960 --> 00:33:49,400 Speaker 1: Yeah, I guess. Like, think about in Christianity, well, in 557 00:33:49,440 --> 00:33:54,720 Speaker 1: fundamentalist Christianity anyway, and most versions of Christianity, most denominations, 558 00:33:55,760 --> 00:33:59,080 Speaker 1: you're taught that you were born a sinner, like 559 00:34:00,120 --> 00:34:03,280 Speaker 1: babies are sinners, and they 560 00:34:04,320 --> 00:34:07,080 Speaker 1: can't behave their way out of it. Like, you're 561 00:34:07,120 --> 00:34:09,520 Speaker 1: just a sinner and you need redemption. 562 00:34:09,880 --> 00:34:12,719 Speaker 3: Well, it's treat them mean, keep them keen. You know, it's 563 00:34:12,800 --> 00:34:15,319 Speaker 3: where you get it. Yeah, treat them mean, keep them keen.
564 00:34:15,600 --> 00:34:17,719 Speaker 3: Is, if you're a member of a cult and then 565 00:34:17,760 --> 00:34:21,600 Speaker 3: suddenly you're on the outside, suddenly you're getting the cold 566 00:34:21,600 --> 00:34:24,520 Speaker 3: shoulder, you're being punished, then the thing you want most 567 00:34:24,520 --> 00:34:26,719 Speaker 3: in the world is to once again be part of 568 00:34:26,760 --> 00:34:32,759 Speaker 3: the cult. So it triggers an intense craving for the 569 00:34:32,920 --> 00:34:35,600 Speaker 3: uniformity and the belonging that the cult gave you. 570 00:34:37,080 --> 00:34:41,440 Speaker 1: Step three, establish a fortress of us versus them. 571 00:34:41,640 --> 00:34:43,680 Speaker 3: Yeah, so this is the thing we were talking about before, 572 00:34:43,719 --> 00:34:46,239 Speaker 3: the one that Hitler used with great effect, which is, 573 00:34:46,760 --> 00:34:50,800 Speaker 3: you know, you are better than them, you the Aryan people. Yeah, 574 00:34:50,840 --> 00:34:53,279 Speaker 3: you might feel like the world's against you, but you 575 00:34:53,320 --> 00:34:58,680 Speaker 3: are better. You're fundamentally better people, and that's what unites us. 576 00:34:58,719 --> 00:35:00,680 Speaker 3: You have to have something that unites you, and you can 577 00:35:00,680 --> 00:35:03,319 Speaker 3: see this playing out in the United States on a 578 00:35:03,320 --> 00:35:06,279 Speaker 3: broader scale, you know, the immigrants are the them and the 579 00:35:06,320 --> 00:35:10,440 Speaker 3: us is the people born here. So this is a 580 00:35:10,480 --> 00:35:14,239 Speaker 3: really common narrative in cults, which is you latch onto 581 00:35:14,239 --> 00:35:16,319 Speaker 3: a message that's already there. By the way, 582 00:35:16,360 --> 00:35:18,080 Speaker 3: this is not something that you can just make up. 583 00:35:18,120 --> 00:35:20,839 Speaker 3: It's much much harder if you make it up.
You 584 00:35:20,920 --> 00:35:25,000 Speaker 3: have to find something that your potential cult members do 585 00:35:25,200 --> 00:35:28,920 Speaker 3: regard as a unifying feature. I mean, this is, at 586 00:35:28,920 --> 00:35:32,120 Speaker 3: the end of the day, the fundamental basis for racism, 587 00:35:33,280 --> 00:35:35,960 Speaker 3: which is, you know, white people are all better than 588 00:35:35,960 --> 00:35:38,560 Speaker 3: other people, et cetera, et cetera. So that's the 589 00:35:38,960 --> 00:35:42,640 Speaker 3: underlying mechanism that is being exploited by this. But you 590 00:35:42,719 --> 00:35:46,200 Speaker 3: need to find a uniting thing that most of your 591 00:35:46,200 --> 00:35:50,520 Speaker 3: followers will agree is a common superior factor 592 00:35:50,600 --> 00:35:53,680 Speaker 3: about them. Now, with the Millerites, it was their 593 00:35:53,719 --> 00:35:58,439 Speaker 3: belief in that event, you know. But you will 594 00:35:58,440 --> 00:36:02,399 Speaker 3: find in every single cult there'll be something that unites them, 595 00:36:02,440 --> 00:36:06,120 Speaker 3: and whether it's spoken or unspoken, it'll be a characteristic 596 00:36:06,200 --> 00:36:11,960 Speaker 3: that they all regard as shared and the outsiders don't. 597 00:36:12,760 --> 00:36:16,240 Speaker 1: Yeah, well, you kind of covered this one. Speaking 598 00:36:16,239 --> 00:36:19,960 Speaker 1: of William Miller: demand escalating investment. 599 00:36:20,480 --> 00:36:24,520 Speaker 3: Yeah, so you've just got to keep ramping up the ask. Right. Initially, 600 00:36:24,560 --> 00:36:27,520 Speaker 3: it's very very simple. It's, you know, give me some 601 00:36:27,560 --> 00:36:31,160 Speaker 3: of your time, you know, commit to helping at 602 00:36:31,160 --> 00:36:35,280 Speaker 3: this event.
And then it turns into asking for money, 603 00:36:35,280 --> 00:36:37,400 Speaker 3: and then it turns into asking for possessions, and then 604 00:36:37,440 --> 00:36:40,799 Speaker 3: it turns into surrendering all worldly possessions, and so on. 605 00:36:41,120 --> 00:36:44,640 Speaker 3: So it's an escalating ask, because you want people to 606 00:36:45,120 --> 00:36:48,160 Speaker 3: get to the point where they have committed so much 607 00:36:48,200 --> 00:36:50,440 Speaker 3: to it that they cannot walk away from it no 608 00:36:50,480 --> 00:36:51,640 Speaker 3: matter what happens. 609 00:36:51,960 --> 00:36:54,759 Speaker 2: Yeah, yeah, I love this. I'm just going to read 610 00:36:54,760 --> 00:36:55,480 Speaker 2: a little bit. 611 00:36:57,080 --> 00:37:00,000 Speaker 1: On the prophesied day, which was October twenty two, 612 00:37:00,120 --> 00:37:05,120 Speaker 1: eighteen forty four, the followers gathered on hillsides and in churches, 613 00:37:05,200 --> 00:37:08,719 Speaker 1: dressed in white robes, to await their salvation. But the 614 00:37:08,760 --> 00:37:12,640 Speaker 1: sun rose on October twenty third to a silent, unchanged world. 615 00:37:12,840 --> 00:37:16,759 Speaker 1: This was the ultimate reality landmine. For an outsider, the 616 00:37:16,880 --> 00:37:20,720 Speaker 1: conclusion was obvious: Miller was a fraud. For the follower 617 00:37:20,719 --> 00:37:25,719 Speaker 1: who had given up everything, however, admitting this was psychologically impossible. 618 00:37:25,800 --> 00:37:30,839 Speaker 1: The pain of acknowledging that their entire investment, their home, reputation, 619 00:37:31,080 --> 00:37:34,040 Speaker 1: life's meaning, was for nothing was too great. 620 00:37:34,239 --> 00:37:35,880 Speaker 2: That's so interesting, isn't it. 621 00:37:36,880 --> 00:37:39,200 Speaker 1: Yeah, that's so, so bloody. 622 00:37:39,920 --> 00:37:40,279 Speaker 2: All right?
623 00:37:40,920 --> 00:37:44,520 Speaker 1: Last one, step five. Step five, if you're paying attention 624 00:37:44,600 --> 00:37:48,120 Speaker 1: and taking notes: seize control of the narrative. 625 00:37:48,800 --> 00:37:50,640 Speaker 3: Yeah. So this is the one I was talking about before, 626 00:37:50,680 --> 00:37:53,440 Speaker 3: where you have to keep the messaging. You can't just 627 00:37:53,560 --> 00:37:56,080 Speaker 3: start a cult and then walk away. This is a 628 00:37:56,080 --> 00:37:58,760 Speaker 3: full time job. You've got to keep the messaging going. 629 00:37:59,440 --> 00:38:02,360 Speaker 3: I mean, I give the example in the article 630 00:38:02,400 --> 00:38:05,840 Speaker 3: of Nineteen Eighty-Four, where they took it to the 631 00:38:05,880 --> 00:38:13,800 Speaker 3: extreme of controlling the language. Followers can't conceive of something 632 00:38:13,880 --> 00:38:17,680 Speaker 3: like freedom or rebellion or individuality if the words don't exist. 633 00:38:19,120 --> 00:38:22,360 Speaker 3: So you see this sort of thing happening in North Korea, 634 00:38:22,400 --> 00:38:26,720 Speaker 3: for example, where there's just some concepts that no longer 635 00:38:26,800 --> 00:38:30,640 Speaker 3: exist in their language. And that's because they've had such 636 00:38:30,760 --> 00:38:36,120 Speaker 3: absolute control over what people say and believe that they 637 00:38:36,520 --> 00:38:39,720 Speaker 3: really don't know of anything else and can't even conceive 638 00:38:39,760 --> 00:38:41,960 Speaker 3: of it. And that's where you want to get to 639 00:38:42,040 --> 00:38:44,840 Speaker 3: with the cult that you're making: you want to 640 00:38:44,880 --> 00:38:48,279 Speaker 3: get to a point where the only narrative comes from you. 641 00:38:49,200 --> 00:38:52,840 Speaker 1: Yeah, yeah, I love this sentence.
This five step protocol 642 00:38:52,960 --> 00:38:56,600 Speaker 1: is a recipe for building a cage inside a human mind. 643 00:38:57,000 --> 00:38:58,520 Speaker 2: Well, that's fucking terrifying. 644 00:39:00,120 --> 00:39:05,360 Speaker 3: It works, it works. It's taking advantage of a bug 645 00:39:05,760 --> 00:39:08,400 Speaker 3: in the way we work. I mean, it's both a 646 00:39:08,440 --> 00:39:10,400 Speaker 3: feature and a bug. It's the feature that makes us 647 00:39:10,440 --> 00:39:12,879 Speaker 3: the most powerful animal on the planet, but it's also 648 00:39:12,920 --> 00:39:14,320 Speaker 3: a bug that can be exploited. 649 00:39:15,880 --> 00:39:17,719 Speaker 1: Well, I might have to put my cult plans on 650 00:39:17,760 --> 00:39:20,200 Speaker 1: hold, because I feel a little bit out of alignment 651 00:39:20,200 --> 00:39:20,720 Speaker 1: at the moment. 652 00:39:20,760 --> 00:39:23,640 Speaker 3: You feel... The one thing I didn't put in there, 653 00:39:23,680 --> 00:39:26,880 Speaker 3: and this is probably a really important requirement, is you 654 00:39:26,960 --> 00:39:33,240 Speaker 3: have to be a psychopath. So normal humans would find 655 00:39:33,400 --> 00:39:36,000 Speaker 3: this protocol really really hard to do, because it involves 656 00:39:36,040 --> 00:39:40,480 Speaker 3: really really hurting people, and psychopaths don't care about doing that, 657 00:39:40,760 --> 00:39:43,600 Speaker 3: and psychopaths also don't care what people think of them, 658 00:39:43,960 --> 00:39:47,839 Speaker 3: so they're all good. If you're not a psychopath, you'll 659 00:39:47,840 --> 00:39:49,239 Speaker 3: find this really difficult to do. 660 00:39:50,920 --> 00:39:54,800 Speaker 1: Wow, thank goodness for that. Well, mate, I think March 661 00:39:54,800 --> 00:39:56,399 Speaker 1: of Tea is going to have to wait. We're talking 662 00:39:56,440 --> 00:39:59,080 Speaker 1: about it next time.
You'll have to wait a fortnight, 663 00:39:59,120 --> 00:40:01,160 Speaker 1: Tiff, to get you into... Are you still standing? 664 00:40:01,160 --> 00:40:01,920 Speaker 2: Are you sitting? 665 00:40:02,480 --> 00:40:04,879 Speaker 4: No way, I had to sit down. 666 00:40:05,640 --> 00:40:11,959 Speaker 2: I knew you were sitting. That's a terrible idea. That's 667 00:40:12,000 --> 00:40:16,960 Speaker 2: so hilarious. That's fucking... that lasted about twelve minutes. 668 00:40:17,000 --> 00:40:17,360 Speaker 3: Everyone. 669 00:40:17,440 --> 00:40:20,160 Speaker 2: Oh no, I'm really into standing up. I've got a 670 00:40:20,239 --> 00:40:23,239 Speaker 2: stand up desk and it's fucking... this is the new me. 671 00:40:24,040 --> 00:40:28,160 Speaker 2: Seven minutes later, sitting on her ass. Your ideas though, 672 00:40:28,200 --> 00:40:29,399 Speaker 2: when then you. 673 00:40:30,680 --> 00:40:33,279 Speaker 1: Especially when you're telling the world, you know, how good 674 00:40:33,280 --> 00:40:37,319 Speaker 1: you are at it. And then I saw your elbows 675 00:40:37,360 --> 00:40:40,399 Speaker 1: resting on the desk. I'm like, that motherfucker is sitting down! 676 00:40:40,680 --> 00:40:41,000 Speaker 3: What you. 677 00:40:43,920 --> 00:40:48,560 Speaker 2: Gillespo, we'll say goodbye after. But yeah, thanks for's name. 678 00:40:48,680 --> 00:40:53,160 Speaker 3: Yeah yeah, yeah, I noticed Tiff sit down about halfway 679 00:40:53,160 --> 00:40:55,320 Speaker 3: through that. Oh dear, she's having a sleep. 680 00:40:55,760 --> 00:40:59,880 Speaker 2: Yeah, so much for that new activity. Thanks mate,