Welcome to It Could Happen Here. I'm Garrison Davis. This is part two of our terrorism roundtable discussion. If you haven't listened to part one already, I would recommend you scroll back, listen to the previous episode, and then continue on from here so you have some context for what exactly we're talking about. Anyway, this is part two of our discussion in the woods. I hope you enjoy.

Something that we talked about earlier this year, after January 6th, was: should the government ban Telegram? That was the thing, and there were a lot of arguments like, no, absolutely not. Does anyone want to speak on that? Because if we want to talk about the government's response to these things, that's a very government thing to do: oh, people are organizing on this platform, get rid of the platform, problem gone. And that's not how that works. Do you want to talk about that a little bit?

Sure. Um, yeah, so getting rid of the platform
doesn't necessarily help, especially when it's something that is important, such as, you know, encrypted communication, which is something that more people than just Nazis need, and that resource should not be cut off. And there's also kind of a bad precedent to be set if the government is deciding which forms of speech it needs to have complete access to. I don't love that. The other thing is that if we nuke Telegram, right, they don't disappear; they re-form their networks somewhere that's just harder to watch. People are absolutely correct when they say deplatforming works, because it works for the platform, and a lot of people just want that. A lot of people just don't want to see Nazi shit, and they're fine with deplatforming, and they say this works, and they have data to back up that it does work. But it works for the platform.
But the people still exist. Yeah, they're still boosting their own shit, and when they bring up building their own alt-tech platforms, you know, it works; they're already there. Yeah, and there are elements where deplatforming is a wider thing, especially for, like, in-person stuff, but yeah, for the sort of things you're mentioning, yes, it is definitely not that cut and dry. Telegram is really interesting because it's kind of this middle space between social media and just a messaging app. Yeah. The thing about it, too, is that anybody can look at these public channels without saying anything in the chats, so people can be kind of completely invisible. Nobody knows that they're there. They're watching this stuff, and they're still getting the same messaging, they're still getting the same dates for protests, they're still, like, organizing. But they can just be subscribed to a channel, and you don't even need to be subscribed.
You can, yeah, just look into it and get that flow of information without ever having, like, formal organizing, so to speak. And so it's really hard to say, like, these people planned this, because there's a lot of plausible deniability involved. There's so much easy hyperlinking between groups and channels and everything, so it's so easy for someone to move between ideologies and to go from kind of the base-level shit into much deeper stuff extremely quick. Very quick, yeah, actually quick. Well, that's what's good, for them, about Telegram: you have all of the people that are vulnerable to, let's say, new ideas in one place. Exactly. If you're trying to plan a collapse, you're gonna need a lot more people than the numbers that the people who want the collapse actually have. So the easiest way to kind of move things along is inserting their ideas and their discourses and kind of altering the vibe of certain digital environments manually, until they have what we can kindly call cannon fodder.
Yeah, or even starting their own and saying, like, this is a MAGA platform, when it's actually just, you know, a bunch of accelerationists who made it, and they made it to recruit them. Because we saw attempts at this with QAnon: people who are way more accelerationist trying to use people's extremism. It was successful, and they did it, and people died. Well, I mean that, and then also you've got, like, the idea of the boogaloo being co-opted to try to appeal to leftists. And there's a really good article by Left Coast Right Watch that goes into one of those chats, and they're basically like, yeah, really try to push these talking points, like Black Lives Matter and all of that; we want to get these protesters on our side.
And then you also have some blatant white supremacist groups who are all using the boogaloo. And how much of that, too, is sort of real and genuine, like, "I am not racist, I believe in Black Lives Matter, I want to be part of this even though..." Or how much of it is kind of reminiscent of what we were talking about with, you know, the idea from Manson of Helter Skelter and causing that race war, where what they would do is try and frame Black people for it and say, like, this was them. Yeah, yeah. I mean, how much of it is saying, this is Black Lives Matter, and they want people to see that after they do it.
The boogaloo group that showed up in Portland in January, and in July when the feds were here, they showed up and were all like, yeah, we're here to support Black Lives Matter and stand against the federal government and stuff. Um, and they had some very suspicious patches that took me about a year to figure out what they were. And it's this accelerationist thing; it ties into a whole bunch of, like, eco-fascist propaganda stuff. And yeah, they're saying these things while they have these very obscure patches. And yeah, this is an important reason why people who are not very smart, like, I will say, Jimmy Dore, who gives these people platforms, are some of the worst and are going to cause a lot of problems, because they have no idea what they're doing, or they know what they're doing and they're just bad.
And, like, that boogaloo thing kind of serves a twofold purpose, in that you can bring people who self-identify as leftists into the movement, but you also have a really good scapegoat. Actually, that was a big thing that we saw in Minneapolis when things first popped off and the precinct was getting burned down, and suddenly people on the internet start losing their minds about the umbrella guy. And there was a guy who was indicted; he was a boogaloo boy who was indicted for, like the headlines said, burning down the precinct. He fired a weapon. He fired a gun, like, near the wall, exactly. And so that at the same time takes away agency from the left-wing movement, and the state is able to be like, look, see, it's okay that we cracked down on them, because they're all, you know, wild white supremacists. Exactly, or from any autonomous movement that forms from the people in a community, who we wouldn't necessarily refer to as left. It's just pissed-off people.
I mean, that's what we saw in every single one, you know: the young kids who are fucking pissed off and are going to go smash shit. And it's like saying all of this is people from outside of the town, where it's like, I know, yeah, it's a tale as old as time. "Outside agitator" has been used since before this; it's a very old state talking point. Yeah. What were you gonna say? Um, yeah, I was gonna say, it's somewhat related to what we were talking about, using QAnon as cannon fodder, and it also ties into the conversation we were having. So in my research, I focus on, I won't say specialize in, Christian Identity, this white supremacist ideology, and specifically how it's grown since the nineties until now through, like, the internet and all that fun stuff. This whole point they've been pushing lately, with Christian Identity, the whole thing is they are preparing for the apocalypse, which they call the Tribulation. And modern CI folks see the boogaloo as, like, the tribulation that's coming.
So what they're trying to do is go off-grid and really try to, like, establish this new land, to protect their kids and everything from, like, pollution and all that shit, but also to be away from the collapse and be able to ride it out. And while they're doing all that, like, prepping homesteads and compounds and stuff, they're also pushing election fraud conspiracies and all that on, like, the QAnon and MAGA crowd. Because they believe it? No, right, they don't believe it. They know it's bullshit, but they can use it to accelerate collapse, just like January 6th. Yes, exactly. I mean, when Joe Biden won the presidency, or won the election or whatever, there were some groups being like, yeah, really try to push this conspiracy theory about election fraud. Even if you don't believe in it, just push it, because that helps our cause. And that's something to be really mindful of, too. I forgot where else I was going with that. Well, yeah, a lot of them don't mean what they say.
They'll say things that will push other people to do something that they don't necessarily want to do. And that's a lot of, like, during January 6th there was so much excitement, because they could see that the QAnon crowd were actually mobilizing, and so they said to themselves, like, you know, get them mobilizing for the white race, get them mobilizing for, you know, our cause. And they've really successfully been able to infiltrate that and been able to get some people on board with some of it, just based on using their rhetoric. Yeah, I know I talked about this on our podcast, but you could see it. Like, I reported on January 6th in person, and you could watch it happen. Someone with a skull mask on, or a Proud Boy, or an Oath Keeper would literally come back from the police line, grab a group of people, yell something at them about QAnon or "the storm is upon us," and throw them up to that riot line. There was a really good visual investigation of how those extremist groups used MAGA people and QAnon people as their foot soldiers.
The QAnon Anonymous folks did a really good breakdown on their podcast. Yeah, yeah. But it's also, I mean, not to link everything to Christian Identity, which I have a tendency to do, but it's very ideologically similar to QAnon, like, from a Christianity point of view. QAnon is so close to the edge of Christian Identity, it's very scary, actually. I talked about it on Jake Hanrahan's Q Clearance podcast. But there's also not only trying to accelerate things through them, but also trying to recruit them through these very, very similar talking points, about, like, the Synagogue of Satan and all that. Christian Identity is an entry point, or some of them can bring it up as an entry point, into further, like, accelerationist Nazi shit, but they will start with Christian Identity because they think that it's more palatable to people who already believe in QAnon. Yeah, exactly.
I mean, like Will was saying, a lot of this comes from these kind of boomer conspiracies and anti-vax groups, and you're not going to be able to get, you know, Meemaw and Pap-pap into, like, Wotanism or something like that. Hard enough, you can't. But Christianity is something that's palatable. It's something that's normal to them, and if you can kind of slowly tweak it, you can get them to this much worse strain. Okay, let's talk about Christian Identity. I think we should; like, maybe, Matt, you could define Christian Identity. It's this radical offshoot of Christianity that sees white people as the true Israelites from the Bible.
And they also think Jewish people are all literally the spawn of Satan. And there's this really dumb theory they came up with, and kind of rewrote the whole Bible off of, called, can I name it? Okay, okay: dual seedline theory, where they say, if you know the story about, like, Adam and Eve and all that, they had Cain and Abel, right? So they see Cain as the offspring of Eve and the devil; he was literally the spawn of Satan, and then he intermingled with all these races that were there before Adam and Eve and created this demonic race. And it's really, really fucking dumb, but it's still here. It's been around for a hot minute. I'm probably going to keep going. It's gonna get worse, calling it now, it's gonna get worse. Yeah. But the whole thing is they essentially worship, like, a Nazi Jesus. They see Jesus as really only talking to the white race, and that Christianity, and, like, God, is only able to be perceived by the white race.
And before you start laughing at people, because yes, it does sound very silly, keep in mind that these are extremely dangerous people. Right, this is one problem with QAnon: liberals just start laughing about how crazy it is, and then they're so surprised at January 6th, where it's like, no, no, they're actually dangerous. Because it's been mentioned a lot, yeah, Christian Identity has been mentioned in various manifestos linked to, you know, actual attacks, and it has spawned very, like, organized violence. I mean, historically, with Christian Identity, and with a lot of these kinds of groups, a lot of them base their whole historical context of, like, Aryanism on this rewriting of history, based on a fake study that was done in Nazi Germany about where some Proto-Indo-European languages came from. And so they believe that white people came from an area that, you know, you could generally say is somewhere near the Black Sea.
Um, and it's based on this, like, strange idea that Sanskrit is not the oldest language, but... are you pointing the gun at me because I'm stepping on you? I think it actually is useful, and yeah, there is actual... because they really tried to push this. They made a lot of fake studies, and you could spend a lot of time researching this and believe that it's true, because there's just so much written about it. And I think this is a tactic that they really tend to use with historical revisionism, and all that is: just crank out essay after essay, even if it's wrong, even if it's totally based on false data. They don't care. They just write about it, and then they think that having more written about it makes it more legitimate. And that's what we were talking about; I've been talking about this this whole time we've not been recording. There's just an overflow of content that is so easy to access, you know, not necessarily from these specific groups we're talking about, just from the further right.
In general, there's just an overflow of content. It's, like, always the top shit on Facebook. To give an idea of how pervasive even that idea of where Indo-European languages came from is: when I still went to college, I took a Religions of South Asia course, and we had to spend multiple days where the professor went through these myths about, like, the "Aryan invasion." Were there Aryan people? That is a thing, historically, yes, but they're not white people. But going through their definition of white people: sure, it's based on language; they think of Aryanism as referring to, like, a linguistic pattern. Yeah. But in a university course, we still had to go through and, like, debunk these myths, because they've gotten so pervasive within the culture. Yeah. And another thing I want to say is that with these kind of more entry-level conspiracy ideas, it is hard to overemphasize how small the space is between the entry-level stuff and the much harder stuff. It can happen extremely quickly, you know. I'll give an example.
I was reporting on an anti-vax protest, and they went straight into talking about the New World Order and Project Lockstep and the Rothschilds and the Bilderbergs and, like, the Sabbateans and David Icke shit. And this was the middle of the day, in, like, a metropolitan area, with a bunch of boomers and Trump hats who are getting this, like, hardcore shit pumped at them. Oh, you saw that a lot with the Nashville bombing, too. Like, immediately it was like, oh, it was actually an attack on Dominion. And also it was orchestrated by the Rothschilds to destroy evidence of voter fraud; I forgot that one. Yeah. And then also there was a bunch of stuff that came up. There was a big conspiracy that it was actually a missile strike. I had to talk my grandpa down from that.
There was a video that circulated for a while about that, and I had to get into a conversation with my grandpa, who at the time was super isolated because of COVID, and that's a whole... yeah. And I had to, like, talk him down and show him, like, no, here's a video from somebody I knew who was somewhat in the area and saw the explosion, and there was not a missile anywhere that day. One of the data studies that I've done and worked on is using big-pool and small-pool Discord servers of far-right extremists, far-right militia groups, and very, very, like, accelerationist skull-mask-type networks, and looking at the big pools and small pools and seeing the @-mentions between them. And there was not one person who was more than three nodes away from anybody else. So it can't be overstated how close people are, from entry to very, very, very extreme types of goals and ideologies, explicit ideologies that explicitly push violence.
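For listeners curious about the mechanics: that "three nodes away" measure is just the longest shortest-path distance in a graph where users are nodes and an @-mention links two of them. This is a rough sketch of that kind of analysis, not the actual study; the usernames and mention pairs below are entirely made up for illustration.

```python
from collections import defaultdict, deque

def bfs_distances(graph, start):
    """Breadth-first search: shortest hop count from start to every reachable user."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph[node]:
            if neighbor not in dist:
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

def max_separation(mentions):
    """Largest shortest-path distance between any two users in an undirected mention graph."""
    graph = defaultdict(set)
    for a, b in mentions:
        graph[a].add(b)
        graph[b].add(a)
    # Run BFS from every user and take the largest distance found anywhere.
    return max(
        d
        for user in graph
        for d in bfs_distances(graph, user).values()
    )

# Hypothetical @-mention pairs pooled from two servers:
mentions = [("ana", "bob"), ("bob", "cat"), ("cat", "dan"), ("bob", "eve")]
print(max_separation(mentions))  # → 3 (e.g. ana → bob → cat → dan)
```

A finding like the one described, that nobody in the pooled servers was more than three hops from anybody else, would correspond to `max_separation` returning at most 3 over the real mention data.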
And you know, another point I want 332 00:18:50,200 --> 00:18:53,560 Speaker 1: to bring up is, um, like, yeah, there's been much 333 00:18:53,600 --> 00:18:55,720 Speaker 1: said about how QAnon isn't going away, it's just not 334 00:18:55,760 --> 00:18:59,800 Speaker 1: called QAnon anymore. Um, with these anti-vax mobilizations, 335 00:19:01,560 --> 00:19:04,280 Speaker 1: those mobilizations and groups aren't going away. They're just going 336 00:19:04,320 --> 00:19:07,600 Speaker 1: to continue to shift and evolve their focus and the network, 337 00:19:09,359 --> 00:19:12,760 Speaker 1: and they're planning for it, like they've 338 00:19:12,800 --> 00:19:16,920 Speaker 1: designed it that way. So sometimes I find the 339 00:19:17,000 --> 00:19:19,800 Speaker 1: normal stuff first, sometimes I find the crazy stuff first. 340 00:19:20,320 --> 00:19:23,440 Speaker 1: But I mean, not even that long ago, I came 341 00:19:23,440 --> 00:19:26,639 Speaker 1: across a particular social media profile that was explicitly calling 342 00:19:26,680 --> 00:19:28,520 Speaker 1: for acts of terror and attempting to organize acts of 343 00:19:28,600 --> 00:19:30,399 Speaker 1: terror and displaying acts of terror, which is like an 344 00:19:30,400 --> 00:19:33,560 Speaker 1: immediate problem that needs to be dealt with. 
However, they 345 00:19:33,600 --> 00:19:37,439 Speaker 1: had multiple alternate accounts, and if you follow that path, 346 00:19:37,480 --> 00:19:40,760 Speaker 1: on their other accounts they're sharing like Tucker Carlson stuff, 347 00:19:41,160 --> 00:19:44,000 Speaker 1: like things that your grandparents are going to watch, right? Like, 348 00:19:44,200 --> 00:19:46,840 Speaker 1: and that is done on purpose to try to like 349 00:19:47,040 --> 00:19:52,400 Speaker 1: siphon people out of, um, more quote-unquote mainstream versions 350 00:19:52,480 --> 00:19:56,720 Speaker 1: of like conspiratorial thinking directly into, like, you should start 351 00:19:56,760 --> 00:19:59,919 Speaker 1: exploding things. And even more, let's say, 352 00:20:00,080 --> 00:20:04,760 Speaker 1: left-of-center conspiracy thinking ties into this, and it's not, 353 00:20:05,000 --> 00:20:08,119 Speaker 1: you know, conspiracy theories are not solely a thing of 354 00:20:08,160 --> 00:20:12,320 Speaker 1: the right, which pisses me off to no end. No, 355 00:20:12,480 --> 00:20:14,480 Speaker 1: I just want to back you up on that. Like, 356 00:20:14,520 --> 00:20:17,080 Speaker 1: I think there's maybe this like implicit idea that 357 00:20:17,119 --> 00:20:20,360 Speaker 1: the left is immune to conspiracy theories, when it very 358 00:20:20,480 --> 00:20:28,840 Speaker 1: much isn't. Um, yeah, I just wanted to emphasize that point. Yeah, 359 00:20:28,960 --> 00:20:31,719 Speaker 1: that idea, though, of like never being that far from 360 00:20:31,760 --> 00:20:36,760 Speaker 1: the serious stuff is something that's really really observable, even 361 00:20:36,800 --> 00:20:39,879 Speaker 1: beyond like a data level. 
I used to, like, 362 00:20:40,000 --> 00:20:43,320 Speaker 1: consult with local newsrooms on how to report on things, 363 00:20:43,760 --> 00:20:45,159 Speaker 1: and one of the big points I always tried to 364 00:20:45,200 --> 00:20:48,679 Speaker 1: drill in was like, if you fuck this up and 365 00:20:48,760 --> 00:20:52,119 Speaker 1: you frame this the wrong way, it will have consequences. 366 00:20:52,560 --> 00:20:54,399 Speaker 1: And if this is stepping in it too much, we 367 00:20:54,400 --> 00:21:01,639 Speaker 1: can cut it. But, like, um, Dylann Roof. Dylann 368 00:21:01,720 --> 00:21:06,360 Speaker 1: Roof started his journey to radicalization by reading about Trayvon 369 00:21:06,440 --> 00:21:13,000 Speaker 1: Martin on local news websites and in local newspapers and then 370 00:21:13,040 --> 00:21:17,760 Speaker 1: googling black-on-white crime, and his first result backed up 371 00:21:19,800 --> 00:21:22,560 Speaker 1: the same exact thing, exactly. And, like, it did 372 00:21:22,640 --> 00:21:26,240 Speaker 1: not take long for him to go 373 00:21:26,400 --> 00:21:29,679 Speaker 1: from, I am reading local news articles that are framed 374 00:21:29,760 --> 00:21:35,719 Speaker 1: this specific way, to, I am killing people. That's not normal, 375 00:21:35,920 --> 00:21:37,600 Speaker 1: of course. Like, a lot of people are not going 376 00:21:37,640 --> 00:21:40,240 Speaker 1: to be reading local news and then suddenly start to 377 00:21:40,240 --> 00:21:43,160 Speaker 1: think this way, but, like, there is a concerted effort 378 00:21:43,560 --> 00:21:49,920 Speaker 1: by some very specific people who would like to make 379 00:21:50,000 --> 00:21:56,560 Speaker 1: that pathway easier. Well, it's interesting, because we don't... 380 00:21:57,000 --> 00:21:59,359 Speaker 1: we can't really define it as terrorism. 
What are 381 00:21:59,359 --> 00:22:02,560 Speaker 1: they doing? Like, they're just saying things, they're 382 00:22:02,560 --> 00:22:05,600 Speaker 1: just encouraging people to do things, and like they're not... 383 00:22:06,760 --> 00:22:09,440 Speaker 1: like they're not doing anything wrong. We can't really call 384 00:22:09,480 --> 00:22:12,640 Speaker 1: it terrorism. The most dangerous people in this game are 385 00:22:12,720 --> 00:22:16,480 Speaker 1: usually not the ones doing the shooting. It's the people behind the 386 00:22:16,520 --> 00:22:18,480 Speaker 1: scenes trying to get people to go down these paths in 387 00:22:18,520 --> 00:22:20,720 Speaker 1: the first place, looking for people who are willing. And 388 00:22:20,760 --> 00:22:24,080 Speaker 1: so they see somebody reading local news, maybe, and they 389 00:22:24,400 --> 00:22:26,560 Speaker 1: want to make that pathway easier, to go from 390 00:22:26,640 --> 00:22:29,560 Speaker 1: local news to Dylann Roof, because that's not a 391 00:22:29,600 --> 00:22:34,199 Speaker 1: normal jump. But they really want to find people who 392 00:22:34,280 --> 00:22:36,640 Speaker 1: are looking at local news like that and then say 393 00:22:36,640 --> 00:22:38,680 Speaker 1: to them, like, well, okay, you look at this, now 394 00:22:38,760 --> 00:22:40,880 Speaker 1: look at that. Trying to tie us back to climate change: 395 00:22:40,880 --> 00:22:43,400 Speaker 1: do you see a similar pathway, 396 00:22:43,440 --> 00:22:46,120 Speaker 1: instead of someone googling black-on-white crime, 397 00:22:46,200 --> 00:22:50,280 Speaker 1: like googling stuff about collapse and, like, modernization? 398 00:22:51,200 --> 00:22:54,119 Speaker 1: Eric Striker, I don't know. 
Eric Striker has been on 399 00:22:54,280 --> 00:22:56,360 Speaker 1: about this, and I think that he's a, I mean, 400 00:22:58,560 --> 00:23:01,920 Speaker 1: relatively like middle point that people get to, like fairly 401 00:23:02,000 --> 00:23:07,320 Speaker 1: like average people do listen to things like Eric Striker, 402 00:23:08,560 --> 00:23:13,720 Speaker 1: an entry-level explicit Nazi. And another thing, and cut me 403 00:23:13,760 --> 00:23:15,280 Speaker 1: off if we don't want to go in this direction, 404 00:23:15,960 --> 00:23:19,359 Speaker 1: but you know, one of the biggest places where we 405 00:23:19,480 --> 00:23:26,479 Speaker 1: see young people getting into conspiracy theories is TikTok. Alright, 406 00:23:27,200 --> 00:23:33,400 Speaker 1: TikTok. Cut that, cut that, cut that. We're not... we're 407 00:23:33,440 --> 00:23:36,120 Speaker 1: not cutting that, that is... that is on brand 408 00:23:36,160 --> 00:23:38,680 Speaker 1: for the pod. Yeah. I mean, the biggest entry point 409 00:23:39,040 --> 00:23:42,640 Speaker 1: I've seen for a lot of things remains crisis. Yeah. 410 00:23:43,680 --> 00:23:47,399 Speaker 1: And the thing is, our upcoming climate scenario is 411 00:23:47,400 --> 00:23:50,800 Speaker 1: going to give people an easier jumping-on point. Well, yeah, 412 00:23:50,840 --> 00:23:52,680 Speaker 1: that's... so, I mean, we were talking about how, like, with 413 00:23:53,440 --> 00:23:56,760 Speaker 1: the mythology of like black-on-white crime and all 414 00:23:56,800 --> 00:23:59,879 Speaker 1: this stuff, they're trying to create a situation, you know, 415 00:24:00,000 --> 00:24:03,720 Speaker 1: with an urgency that justifies fascism, which on its own 416 00:24:04,119 --> 00:24:08,639 Speaker 1: is unjustifiable and ridiculous. 
But when there's a crisis. Climate 417 00:24:08,720 --> 00:24:11,399 Speaker 1: change is the existential threat that they've been trying to 418 00:24:11,480 --> 00:24:14,040 Speaker 1: artificially create, and they no longer have to. They now 419 00:24:14,080 --> 00:24:15,879 Speaker 1: get to skip a lot of steps and save a 420 00:24:15,880 --> 00:24:17,639 Speaker 1: lot of energy by just pointing at the fact that 421 00:24:17,720 --> 00:24:22,760 Speaker 1: everything is literally on fire, and like, that 422 00:24:22,800 --> 00:24:26,280 Speaker 1: makes it so much quicker to say, we have to do something. 423 00:24:27,040 --> 00:24:29,800 Speaker 1: We have all the guns. Now would be a great 424 00:24:29,880 --> 00:24:33,119 Speaker 1: time to join us. This... this, 425 00:24:33,119 --> 00:24:36,440 Speaker 1: this is our Weimar-era hyperinflation type shit. I mean, 426 00:24:36,480 --> 00:24:41,560 Speaker 1: this is like, when you can't get food 427 00:24:41,560 --> 00:24:44,280 Speaker 1: from the grocery store anymore because of supply chain problems, 428 00:24:44,359 --> 00:24:46,560 Speaker 1: or when everything around you is on fire, you don't 429 00:24:46,640 --> 00:24:48,920 Speaker 1: need a Great Replacement theory, 430 00:24:49,400 --> 00:24:51,920 Speaker 1: you don't need anything. You don't need to say that 431 00:24:51,960 --> 00:24:55,200 Speaker 1: the Rothschilds are behind it. You just need 432 00:24:55,240 --> 00:24:57,159 Speaker 1: to wait. You have enough things that you 433 00:24:57,240 --> 00:25:01,119 Speaker 1: experience yourself, and it's much scarier when you can't... 434 00:25:01,359 --> 00:25:04,160 Speaker 1: because, like, how do we stop it? Yeah. 435 00:25:04,160 --> 00:25:07,960 Speaker 1: It's harder. The world is literally on fire. 
436 00:25:08,280 --> 00:25:10,360 Speaker 1: It's a problem and something needs to be done 437 00:25:10,400 --> 00:25:13,800 Speaker 1: about it. I don't like your solution, but something needs 438 00:25:13,880 --> 00:25:16,320 Speaker 1: to happen. So, what do you think? On 439 00:25:16,359 --> 00:25:17,680 Speaker 1: this path, and this is going to get a whole 440 00:25:17,680 --> 00:25:19,879 Speaker 1: lot more speculative, but like, what can we do to 441 00:25:19,920 --> 00:25:26,280 Speaker 1: make people fall down those pathways less often? Like, yes, 442 00:25:26,560 --> 00:25:28,199 Speaker 1: that is... that's one of the things that 443 00:25:28,200 --> 00:25:30,040 Speaker 1: we're trying to do on the pod, is making sure 444 00:25:30,000 --> 00:25:33,600 Speaker 1: that people do not fall down the doomer 445 00:25:33,680 --> 00:25:36,480 Speaker 1: pathway, because, yeah, that does lead people along. 446 00:25:36,600 --> 00:25:43,240 Speaker 1: Like, compared against most types of extremism, 447 00:25:43,240 --> 00:25:46,320 Speaker 1: eco-extremism is the most logical. Like, you look at it 448 00:25:46,359 --> 00:25:49,040 Speaker 1: and you say, we need a radical change right now, 449 00:25:49,520 --> 00:25:53,440 Speaker 1: and that's correct. Um, it's just the way that they 450 00:25:53,440 --> 00:25:55,560 Speaker 1: go about it is very very different. And that's why, 451 00:25:55,600 --> 00:25:58,159 Speaker 1: like, you know, eco-fascism is very different, it's its 452 00:25:58,240 --> 00:26:02,040 Speaker 1: own type of eco-extremism, and green anarchy, that's 453 00:26:02,040 --> 00:26:04,919 Speaker 1: a very different type of eco-extremism. Like, these are 454 00:26:04,960 --> 00:26:10,120 Speaker 1: all different parts of something that almost has the same 455 00:26:10,160 --> 00:26:13,560 Speaker 1: goals but wants to go about them very very very differently. 
Again, 456 00:26:13,560 --> 00:26:15,640 Speaker 1: it's so easy to just look around and see 457 00:26:15,640 --> 00:26:18,240 Speaker 1: how everything's on fire and think, like, the government's 458 00:26:18,280 --> 00:26:20,879 Speaker 1: doing nothing about it. The government starts doing something about it, 459 00:26:20,880 --> 00:26:24,040 Speaker 1: and then suddenly it's, the state's too big, we're in communism, 460 00:26:24,240 --> 00:26:27,159 Speaker 1: you know. So they all have, like, different goals, and 461 00:26:27,200 --> 00:26:29,200 Speaker 1: it's very conflicting on how to deal with it. 462 00:26:29,480 --> 00:26:33,560 Speaker 1: And like, beyond even the very different tactics between green anarchy 463 00:26:33,680 --> 00:26:37,480 Speaker 1: and, like, fascistic eco-extremism, they also want two 464 00:26:37,480 --> 00:26:41,560 Speaker 1: very different end goals, right? Like, your basic anprim 465 00:26:41,560 --> 00:26:44,680 Speaker 1: wants a very different life than your, you know, very, 466 00:26:44,800 --> 00:26:48,439 Speaker 1: you know, very Siege-pilled fascist, right? A 467 00:26:48,480 --> 00:26:52,400 Speaker 1: collapse can only benefit the right. A collapse 468 00:26:52,400 --> 00:26:56,480 Speaker 1: can only benefit the people who already have power, who are 469 00:26:56,480 --> 00:27:02,960 Speaker 1: already able-bodied, who are already stocked up on guns. Like, yeah, 470 00:27:03,000 --> 00:27:05,600 Speaker 1: that does frustrate me, with there being anarchists who are 471 00:27:05,600 --> 00:27:08,640 Speaker 1: like rooting for the collapse, because you're not gonna win, 472 00:27:08,880 --> 00:27:11,960 Speaker 1: like, you're just going to end up behind a fence somewhere, 473 00:27:13,200 --> 00:27:15,520 Speaker 1: or up against the wall. 
Yeah, well, they've got very strict ideas 474 00:27:15,520 --> 00:27:17,480 Speaker 1: of which people count as human, and the goal of 475 00:27:17,640 --> 00:27:20,480 Speaker 1: the majority of fascist movements is to, you know, purge 476 00:27:20,520 --> 00:27:22,679 Speaker 1: the ranks of the people they see as lesser, and 477 00:27:22,720 --> 00:27:26,160 Speaker 1: they have very precise ideas about 478 00:27:26,280 --> 00:27:30,919 Speaker 1: who they plan on letting survive the collapse. So 479 00:27:31,000 --> 00:27:33,200 Speaker 1: let's... I think it's time to start talking about, 480 00:27:33,240 --> 00:27:35,080 Speaker 1: and tell me if I'm taking this in the wrong direction: 481 00:27:35,200 --> 00:27:39,480 Speaker 1: you know, what the fuck can someone who's listening to this do? Yeah, recycle. 482 00:27:41,320 --> 00:27:44,840 Speaker 1: Stop recycling, it's all getting buried in the 483 00:27:44,960 --> 00:27:51,640 Speaker 1: Oregon forest. Just vote. Vote. I mean, like... What do 484 00:27:51,720 --> 00:27:54,879 Speaker 1: you do? Start local. Find a local group, find a local 485 00:27:55,760 --> 00:28:00,399 Speaker 1: direct action group, investigate that group and see who is 486 00:28:00,440 --> 00:28:02,919 Speaker 1: behind it, but start locally. It has to start 487 00:28:03,520 --> 00:28:07,000 Speaker 1: at the local level, because what I'm trying to say 488 00:28:07,040 --> 00:28:10,200 Speaker 1: is, if the collapse comes, or, really, no, 489 00:28:10,320 --> 00:28:15,240 Speaker 1: not the collapse, like local, local collapses. These disasters, continuing 490 00:28:15,240 --> 00:28:17,880 Speaker 1: disasters, are gonna hit at the local level. 
No, talk 491 00:28:17,920 --> 00:28:20,879 Speaker 1: to your neighbors, talk to your family. Like, 492 00:28:21,080 --> 00:28:24,399 Speaker 1: try to get your family on these paths that 493 00:28:24,480 --> 00:28:27,800 Speaker 1: lead to helping your neighbors instead of, you know, making 494 00:28:27,840 --> 00:28:30,200 Speaker 1: friends with the church militia. Before you buy a gun, 495 00:28:30,280 --> 00:28:33,760 Speaker 1: learn how to fucking garden. Yes. But buying a gun 496 00:28:34,119 --> 00:28:37,480 Speaker 1: and that sort of thing is good. It's good 497 00:28:37,480 --> 00:28:40,600 Speaker 1: to know how to use firearms. Basic emergency preparedness, yes, 498 00:28:40,640 --> 00:28:42,360 Speaker 1: but learn how to put on a tourniquet, 499 00:28:42,560 --> 00:28:45,160 Speaker 1: learn how to feed yourself, learn how to grow some 500 00:28:45,240 --> 00:28:48,040 Speaker 1: fucking food, learn how to cook that fucking food. Get 501 00:28:48,040 --> 00:28:50,720 Speaker 1: an IFAK. All that comes before, like, you get to 502 00:28:50,760 --> 00:28:57,440 Speaker 1: be a Fallout character. Oh yeah, an individual first 503 00:28:57,440 --> 00:29:00,120 Speaker 1: aid kit. You can buy them online, you can 504 00:29:00,160 --> 00:29:02,600 Speaker 1: buy them in stores, you can buy them in 505 00:29:02,640 --> 00:29:05,120 Speaker 1: like some pawn shops. Yeah, I like North American Rescue 506 00:29:05,400 --> 00:29:07,280 Speaker 1: or North River Rescue. I'm sure we'll talk about 507 00:29:07,400 --> 00:29:10,160 Speaker 1: that more on the pod. Well, look, there are 508 00:29:10,160 --> 00:29:13,000 Speaker 1: two big things. One, we all have a moral obligation 509 00:29:13,240 --> 00:29:17,200 Speaker 1: to consistently counter the black-pill doomer shit. Everything is 510 00:29:17,240 --> 00:29:20,680 Speaker 1: coming to an end... like, it doesn't have to. That's optional. 
511 00:29:21,200 --> 00:29:23,240 Speaker 1: Like, things are going to get bad, but 512 00:29:23,280 --> 00:29:26,720 Speaker 1: there's degrees of that. We can stop it from being worse. 513 00:29:26,800 --> 00:29:29,600 Speaker 1: And we don't need civilization to 514 00:29:29,760 --> 00:29:32,480 Speaker 1: end, like, that can be done. So, two, we also 515 00:29:32,520 --> 00:29:35,720 Speaker 1: have an obligation to counter the individualist stuff 516 00:29:35,720 --> 00:29:40,400 Speaker 1: and focus our efforts more towards community and relationships. 517 00:29:40,640 --> 00:29:42,680 Speaker 1: And that is so, so important, because every idiot that's 518 00:29:42,680 --> 00:29:44,560 Speaker 1: going to buy a gun and have a bunker not 519 00:29:44,600 --> 00:29:45,840 Speaker 1: only is not going to make it, but is gonna 520 00:29:45,840 --> 00:29:47,320 Speaker 1: screw the rest of us. Like, this has to be 521 00:29:47,360 --> 00:29:49,920 Speaker 1: a communal effort and a civilizational thing. Like, we do 522 00:29:50,080 --> 00:29:53,480 Speaker 1: need civilization to change, like we need human society. 523 00:29:53,480 --> 00:29:55,280 Speaker 1: As we've laid out, it has a lot of problems. 524 00:29:55,320 --> 00:29:58,560 Speaker 1: I understand people's critiques of human civilization. We still need 525 00:29:58,560 --> 00:30:01,400 Speaker 1: a society. But yeah, we need places where, 526 00:30:01,680 --> 00:30:04,360 Speaker 1: you know, people are going to gather and, you know, 527 00:30:04,720 --> 00:30:06,640 Speaker 1: provide the things that we have. Um, I know that 528 00:30:06,640 --> 00:30:09,080 Speaker 1: that can be a loaded word in certain political circles, 529 00:30:09,440 --> 00:30:11,640 Speaker 1: so, I'm not, you know, we're not getting into like 530 00:30:11,720 --> 00:30:14,400 Speaker 1: civilization theory or anything. 
I was going 531 00:30:14,440 --> 00:30:18,040 Speaker 1: to say, I would argue any ideology or idea, just like 532 00:30:18,080 --> 00:30:21,640 Speaker 1: the boogaloo, that, uh, kind of hypes up a collapse 533 00:30:21,920 --> 00:30:24,240 Speaker 1: is generally one you should stay away from. Anything that 534 00:30:24,280 --> 00:30:27,560 Speaker 1: makes the collapse sound sexy 535 00:30:27,720 --> 00:30:30,240 Speaker 1: and personal. Also, I think it's important to remember, 536 00:30:30,280 --> 00:30:33,240 Speaker 1: like, if there was some massive civil conflict that happened, 537 00:30:33,240 --> 00:30:34,840 Speaker 1: I think the people who would suffer the most are 538 00:30:34,840 --> 00:30:39,160 Speaker 1: the noncombatants. Do you want to talk about anything to do with that? Yeah, yeah, yeah, 539 00:30:39,480 --> 00:30:42,479 Speaker 1: we'll talk about that on an upcoming episode of Terrorism Bad. Um, well, 540 00:30:42,480 --> 00:30:44,920 Speaker 1: we'll do plugs at the end. Hold on, put the gun 541 00:30:44,960 --> 00:30:48,000 Speaker 1: back in your pants. I was talking about historical 542 00:30:48,000 --> 00:30:51,120 Speaker 1: precedent earlier, about things I've seen in the past with 543 00:30:51,200 --> 00:30:54,480 Speaker 1: collapses, and how people with guns and people with 544 00:30:54,600 --> 00:30:57,640 Speaker 1: training end up being the ones to gain power. Um, 545 00:30:57,840 --> 00:31:01,280 Speaker 1: something that, like, I was specifically reading about was, um, 546 00:31:01,440 --> 00:31:04,600 Speaker 1: the Rwandan genocide. You know, it was just 547 00:31:05,320 --> 00:31:10,400 Speaker 1: three months where most of the Tutsi people were wiped out. Um, 548 00:31:10,680 --> 00:31:14,320 Speaker 1: there are conflicting numbers, so I'm not gonna specifically say 549 00:31:14,360 --> 00:31:18,560 Speaker 1: any, but, um, you know, more recently, like this year... 
550 00:31:18,600 --> 00:31:22,480 Speaker 1: Earlier this year, um, was only when Rwanda admitted what 551 00:31:22,560 --> 00:31:26,240 Speaker 1: it was, that it was a genocide. And, um, the 552 00:31:26,240 --> 00:31:30,959 Speaker 1: people... the armed forces were the ones who became like 553 00:31:31,560 --> 00:31:38,000 Speaker 1: the leaders, and then they were backed by the government. Yeah. 554 00:31:38,040 --> 00:31:44,480 Speaker 1: And it's like, it can't happen here, though. We 555 00:31:44,600 --> 00:31:47,520 Speaker 1: are immune to this in our part of the world. 556 00:31:47,880 --> 00:31:51,600 Speaker 1: It will not happen here. And the other thing is, 557 00:31:51,960 --> 00:31:54,440 Speaker 1: look at where you get your information from. Seriously, no 558 00:31:54,480 --> 00:31:57,120 Speaker 1: matter who you are, take a long, hard look at 559 00:31:57,160 --> 00:31:59,160 Speaker 1: where you get your information, even if you're on the left, 560 00:31:59,240 --> 00:32:02,040 Speaker 1: especially if you're actually on the left. You know, if 561 00:32:02,040 --> 00:32:04,680 Speaker 1: you want to hear about something that's happening in an area, 562 00:32:05,440 --> 00:32:07,600 Speaker 1: look at the people who are actually on the ground reporting. 563 00:32:07,800 --> 00:32:11,440 Speaker 1: Don't just rely on, like, news aggregators, especially on Twitter. 564 00:32:13,840 --> 00:32:16,880 Speaker 1: There have been a lot of very bad-faith 565 00:32:16,920 --> 00:32:19,040 Speaker 1: news aggregators on Twitter who are posing as leftists. This 566 00:32:19,080 --> 00:32:22,640 Speaker 1: has been a huge problem. Even leftists who just don't 567 00:32:22,640 --> 00:32:25,280 Speaker 1: do their research, or just do a very bad job. 
People 568 00:32:25,280 --> 00:32:29,760 Speaker 1: who call themselves, like, extremism or counterterrorism researchers, and 569 00:32:30,160 --> 00:32:34,680 Speaker 1: they're really just talking about antifa. They say that they 570 00:32:34,680 --> 00:32:38,360 Speaker 1: are counter-extremism researchers, and they pose that way, and 571 00:32:38,400 --> 00:32:43,280 Speaker 1: they look sometimes like they could be, sometimes like they're not, 572 00:32:43,400 --> 00:32:47,080 Speaker 1: but, like, you know, varying degrees of, like, legitimacy. 573 00:32:47,160 --> 00:32:54,320 Speaker 1: But, like, they focus only on, like, the left-wing stuff. 574 00:32:54,360 --> 00:32:59,200 Speaker 1: They don't... they don't see... there has to be 575 00:32:59,240 --> 00:33:01,360 Speaker 1: this idea of, like, keeping it balanced, right, like not 576 00:33:01,480 --> 00:33:04,120 Speaker 1: making it just, like, a far-right issue, which I 577 00:33:04,160 --> 00:33:06,240 Speaker 1: would argue, and I think a lot of other people would, 578 00:33:06,320 --> 00:33:13,000 Speaker 1: that this kind of stuff is the more concerning issue. And 579 00:33:13,040 --> 00:33:17,720 Speaker 1: there is, like, merit, definitely, to looking at left accelerationism, 580 00:33:18,440 --> 00:33:23,800 Speaker 1: which is not... for the record, left accelerationism is 581 00:33:23,840 --> 00:33:29,520 Speaker 1: not talking about anti-fascists. But, um, it's really not 582 00:33:29,640 --> 00:33:34,520 Speaker 1: the time to get into it; left accelerationism will probably be 583 00:33:34,560 --> 00:33:37,920 Speaker 1: its own episode. But what some people do, posing 584 00:33:38,040 --> 00:33:43,360 Speaker 1: as, um, you know, people who have credibility and are 585 00:33:43,440 --> 00:33:49,760 Speaker 1: able to, um, kind of sway opinion... they are not 586 00:33:50,480 --> 00:33:53,000 Speaker 1: really doing what they say that they're doing. 
They're really 587 00:33:53,040 --> 00:33:58,280 Speaker 1: just trying to shift the narrative from racially motivated 588 00:33:58,360 --> 00:34:03,560 Speaker 1: violent extremism, which obviously... to being, like, 589 00:34:04,040 --> 00:34:07,640 Speaker 1: BLM is racially motivated violent extremism, and they want to 590 00:34:07,640 --> 00:34:20,480 Speaker 1: push that narrative further and further. I think 591 00:34:20,520 --> 00:34:22,120 Speaker 1: let's kind of start to, like, 592 00:34:23,040 --> 00:34:25,080 Speaker 1: wrap up and say our final thoughts on, you know, 593 00:34:25,200 --> 00:34:28,200 Speaker 1: this whole topic. Um, I know we 594 00:34:28,200 --> 00:34:30,319 Speaker 1: did not 595 00:34:30,400 --> 00:34:33,600 Speaker 1: get to talk about, like, eco-defense very much. Anyone 596 00:34:33,640 --> 00:34:35,319 Speaker 1: have any final thoughts on that and how they see 597 00:34:35,360 --> 00:34:37,080 Speaker 1: it kind of growing, and how they see the state's 598 00:34:37,560 --> 00:34:41,239 Speaker 1: response to it? Um, that might be worth briefly mentioning. Yeah, 599 00:34:41,280 --> 00:34:42,719 Speaker 1: let's kind of go around in 600 00:34:42,719 --> 00:34:44,800 Speaker 1: a circle and give kind of everyone's, you know, final 601 00:34:44,880 --> 00:34:49,880 Speaker 1: thoughts on the subject. Um, I think collapse 602 00:34:50,120 --> 00:34:56,200 Speaker 1: is bad, and I think that... well, I mean, that's 603 00:34:56,200 --> 00:34:59,560 Speaker 1: my main thing. But anything that's, uh, appealing 604 00:34:59,600 --> 00:35:04,120 Speaker 1: to you on, like, an ecological level, that's collapse 605 00:35:04,120 --> 00:35:06,279 Speaker 1: related, is something you should be very wary of. 
And 606 00:35:06,280 --> 00:35:08,160 Speaker 1: I think you should be very wary of, like, 607 00:35:08,480 --> 00:35:11,120 Speaker 1: generally everything. I feel like that's kind of a blanket rule: 608 00:35:12,120 --> 00:35:17,400 Speaker 1: be careful about everything. Um, yeah. I guess, in my opinion, 609 00:35:17,400 --> 00:35:21,560 Speaker 1: the idea of total collapse is very misleading, because it's 610 00:35:21,600 --> 00:35:25,600 Speaker 1: easy, and disasters don't work like that. You're not going 611 00:35:25,640 --> 00:35:29,840 Speaker 1: to suddenly reset one day. Um, everything is going to suck, 612 00:35:30,120 --> 00:35:33,360 Speaker 1: and you're going to need to fight for whatever semblance 613 00:35:33,400 --> 00:35:35,440 Speaker 1: of a society that you want to see in the world. 614 00:35:36,120 --> 00:35:38,520 Speaker 1: Talk to your neighbors, talk to the people in your city, 615 00:35:38,600 --> 00:35:41,759 Speaker 1: in your neighborhood. There are people doing good shit in 616 00:35:41,840 --> 00:35:44,319 Speaker 1: whatever city or town you live in, most likely. If not, 617 00:35:44,880 --> 00:35:47,600 Speaker 1: you can start it. Look at your local mutual aid network, 618 00:35:48,000 --> 00:35:50,239 Speaker 1: look at the people who are taking action around you, and 619 00:35:50,320 --> 00:35:53,279 Speaker 1: get involved, seriously. You know, it could be going out 620 00:35:53,320 --> 00:35:56,840 Speaker 1: into a park Saturday mornings and just, like, giving out food, 621 00:35:57,280 --> 00:35:59,640 Speaker 1: talking to the people who are most affected. Talk to 622 00:35:59,719 --> 00:36:02,319 Speaker 1: people, seriously. Everyone's a person. 
You need to 623 00:36:02,320 --> 00:36:05,600 Speaker 1: touch grass, talk to people. And if you need, 624 00:36:05,640 --> 00:36:08,120 Speaker 1: like, the most basic thing to start on any sort 625 00:36:08,160 --> 00:36:09,759 Speaker 1: of mutual aid work, try to find a Food Not 626 00:36:09,840 --> 00:36:13,440 Speaker 1: Bombs chapter in your area. They're well organized, they're easy to 627 00:36:13,520 --> 00:36:15,960 Speaker 1: join. You don't have to put on bloc and 628 00:36:15,960 --> 00:36:19,400 Speaker 1: fight a cop. Yeah, it's a good entry point, 629 00:36:19,520 --> 00:36:22,280 Speaker 1: and it's great training for disaster relief. 630 00:36:22,760 --> 00:36:25,400 Speaker 1: If you have money and you want to help, seriously, 631 00:36:25,560 --> 00:36:27,880 Speaker 1: just give cash to unhoused people on the street. 632 00:36:28,120 --> 00:36:30,520 Speaker 1: Give money to people, give money directly 633 00:36:30,560 --> 00:36:34,600 Speaker 1: to people. Yep. Uh, my last thoughts are just that 634 00:36:35,160 --> 00:36:38,480 Speaker 1: I think the idea of collapse, or the actual collapses themselves, 635 00:36:38,680 --> 00:36:42,040 Speaker 1: environmental or otherwise, will always be something to rally behind. 636 00:36:42,440 --> 00:36:44,719 Speaker 1: Like, it is always an entry point as well as 637 00:36:44,760 --> 00:36:48,600 Speaker 1: a motivator, for all sides. Um. 
638 00:36:48,680 --> 00:36:50,680 Speaker 1: But it's like, when these things become very salient, like 639 00:36:50,800 --> 00:36:53,600 Speaker 1: was mentioned before, when they're outside of your door, that's 640 00:36:53,680 --> 00:36:57,840 Speaker 1: when, you know, that's when, like, the ideology kind of 641 00:36:57,880 --> 00:37:00,799 Speaker 1: hits the pavement: what is actually going to play out, 642 00:37:00,840 --> 00:37:02,680 Speaker 1: what is actually going to happen, and how that's gonna 643 00:37:02,719 --> 00:37:05,920 Speaker 1: affect people. It's very real. So building community, you know, 644 00:37:06,000 --> 00:37:10,000 Speaker 1: building connections, and just understanding, you know, who is in 645 00:37:10,000 --> 00:37:14,640 Speaker 1: your community is probably one of the most important things. Yeah, 646 00:37:14,719 --> 00:37:19,080 Speaker 1: the idea of collapse is a romantic and ridiculous notion. 647 00:37:19,600 --> 00:37:21,759 Speaker 1: It comes up with people who are, like, really into, 648 00:37:21,840 --> 00:37:25,200 Speaker 1: like, apocalyptic thinking and the version of themselves where they 649 00:37:25,200 --> 00:37:28,480 Speaker 1: get to be the main character. So first and foremost, 650 00:37:28,800 --> 00:37:30,400 Speaker 1: take care of each other. There are a lot of 651 00:37:30,400 --> 00:37:32,600 Speaker 1: people out there who want to manipulate you and want 652 00:37:32,640 --> 00:37:34,359 Speaker 1: to change the way you think about things, and they 653 00:37:34,480 --> 00:37:37,040 Speaker 1: really, really want you to buy into the end 654 00:37:37,080 --> 00:37:42,120 Speaker 1: times, and you don't have to, because you're smarter than that. Yeah, 655 00:37:42,200 --> 00:37:44,600 Speaker 1: it's not hopeless. We really have to move away 656 00:37:44,640 --> 00:37:49,760 Speaker 1: from hierarchical thinking. 
Our society really incentivizes this hierarchical thinking, 657 00:37:49,840 --> 00:37:53,799 Speaker 1: and like I was saying, it's like, we 658 00:37:53,880 --> 00:37:56,200 Speaker 1: really need to just be focusing on people, like, 659 00:37:56,239 --> 00:37:59,880 Speaker 1: people, because, you know, somebody doesn't have to, 660 00:38:00,000 --> 00:38:06,600 Speaker 1: you know, earn respect and earn humanity. For 661 00:38:06,640 --> 00:38:09,759 Speaker 1: some reason, we try and make it seem like that. 662 00:38:09,880 --> 00:38:13,919 Speaker 1: But people are people. People are in different circumstances, 663 00:38:13,960 --> 00:38:17,120 Speaker 1: usually just because of the way that the 664 00:38:17,160 --> 00:38:20,440 Speaker 1: world is. And, um, yeah, you 665 00:38:20,440 --> 00:38:22,319 Speaker 1: need to organize locally. You need to help your own 666 00:38:22,320 --> 00:38:29,560 Speaker 1: people and stay away from the internet shit. Stop posting. 667 00:38:30,040 --> 00:38:34,000 Speaker 1: Stop posting. As I'm saying, stop posting, even though, like, 668 00:38:34,040 --> 00:38:37,000 Speaker 1: we'll keep doing it, because we're the good posters. Um, 669 00:38:37,320 --> 00:38:41,320 Speaker 1: who wants to plug the pod? 670 00:38:42,280 --> 00:38:49,240 Speaker 1: Follow Terrorism Bad. We're on... What is the pod? Like, what 671 00:38:49,239 --> 00:38:54,080 Speaker 1: do you do? Yeah?
We go through, um, portrayals of 672 00:38:54,239 --> 00:38:58,360 Speaker 1: terrorism and extremism and conspiracies in popular media, 673 00:38:58,640 --> 00:39:01,359 Speaker 1: and we get it from the perspective of people who 674 00:39:01,400 --> 00:39:05,040 Speaker 1: have studied this, and ask: did this succeed in portraying these 675 00:39:05,080 --> 00:39:08,600 Speaker 1: things, or did it, as more often happens, completely 676 00:39:08,640 --> 00:39:12,080 Speaker 1: fail, cause us all personal problems, and be propaganda? And 677 00:39:12,719 --> 00:39:14,520 Speaker 1: did you make propaganda, or did you make 678 00:39:14,600 --> 00:39:17,640 Speaker 1: good media? That is a thin line, I mean, 679 00:39:18,000 --> 00:39:20,160 Speaker 1: such a thin line. I've made a career out of it. That 680 00:39:20,239 --> 00:39:23,400 Speaker 1: is the thin terror line. Yeah, do you 681 00:39:23,400 --> 00:39:27,160 Speaker 1: want to plug your fantastic group? Yeah, absolutely. You 682 00:39:27,160 --> 00:39:29,000 Speaker 1: can read anything I write at antihate.ca, 683 00:39:29,160 --> 00:39:33,600 Speaker 1: and we do just general reporting on far 684 00:39,719 --> 00:39:39,600 Speaker 1: right extremism in Canada, as well as infiltration podcasts. Oh, 685 00:39:39,640 --> 00:39:43,400 Speaker 1: and I also host a podcast called The Unusual Show. Yeah, 686 00:39:43,480 --> 00:39:45,480 Speaker 1: if you want to keep up to date on extremism 687 00:39:45,520 --> 00:39:47,680 Speaker 1: in Canada, their group is probably the 688 00:39:47,760 --> 00:39:51,359 Speaker 1: best one around right now, in my opinion. And yeah, 689 00:39:52,000 --> 00:39:53,640 Speaker 1: you do, you do very good work. You 690 00:39:53,760 --> 00:39:56,759 Speaker 1: keep your eye on my home country, where my family lives, 691 00:39:56,840 --> 00:39:59,360 Speaker 1: so thank you for that.
Um, and I'm very happy 692 00:39:59,400 --> 00:40:02,320 Speaker 1: to be talking with you guys in the beautiful 693 00:40:02,360 --> 00:40:05,240 Speaker 1: woods, where we have no cell service. We can't post, 694 00:40:05,880 --> 00:40:08,440 Speaker 1: um, and that's good, and we're gonna continue doing that 695 00:40:08,520 --> 00:40:12,839 Speaker 1: and stop using this microphone. So goodbye. Um, yeah, and 696 00:40:13,400 --> 00:40:19,359 Speaker 1: Terrorism Bad, the podcast. With that, that wraps up 697 00:40:19,480 --> 00:40:25,360 Speaker 1: the Terrorism Roundtable Forest Discussion episodes. Thanks for listening 698 00:40:25,400 --> 00:40:30,000 Speaker 1: to all of us rant about our specific, weird, niche 699 00:40:30,200 --> 00:40:34,160 Speaker 1: focuses, and hopefully trying to have it within the useful 700 00:40:34,200 --> 00:40:38,040 Speaker 1: context of climate change. You can follow me at Hungry Bowtie. 701 00:40:38,080 --> 00:40:41,600 Speaker 1: You can follow the podcast at Happen Here Pod and Cool 702 00:40:41,680 --> 00:40:45,680 Speaker 1: Zone Media on Twitter and, I believe, Instagram. You can 703 00:40:45,680 --> 00:40:48,239 Speaker 1: follow some of the researchers I interviewed on their 704 00:40:48,280 --> 00:40:52,440 Speaker 1: podcast at Terrorism Bad. So that wraps up this discussion. 705 00:40:52,520 --> 00:40:57,799 Speaker 1: Thanks for listening. See you later in the podcasting verse, 706 00:40:57,960 --> 00:41:05,120 Speaker 1: the pod verse. Okay, goodbye. Okay. It Could Happen Here 707 00:41:05,160 --> 00:41:07,800 Speaker 1: is a production of Cool Zone Media. For more podcasts 708 00:41:07,800 --> 00:41:10,400 Speaker 1: from Cool Zone Media, visit our website, coolzonemedia.com, 709 00:41:10,480 --> 00:41:12,279 Speaker 1: or check us out on the iHeartRadio 710 00:41:12,320 --> 00:41:15,400 Speaker 1: app, Apple Podcasts, or wherever you listen to podcasts.
711 00:41:15,920 --> 00:41:18,080 Speaker 1: You can find sources for It Could Happen Here, updated 712 00:41:18,120 --> 00:41:21,560 Speaker 1: monthly, at coolzonemedia.com/sources. Thanks 713 00:41:21,600 --> 00:41:22,160 Speaker 1: for listening.