Robert: What's terrible? Me. This is It Could Happen Here, a podcast about collapse, and that's appropriate, because everyone's faith in me as a colleague has collapsed today as the result of a series of horrific clusterfucks on my part. I'm late to the meeting, I accidentally left the meeting when they started recording. Just a complete fucking shitshow. Speaking of shitshows, my co-host Garrison Davis. How are you, Garrison?
Garrison: I'm the one that saved this. I had to send the guest the Zoom call.
Robert: I know.
Garrison: I'm not even supposed to be on this call.
Robert: No, you're not. You're not even supposed to be working today.
Garrison: That's not true.
Robert: Well, but you're not on this call.
Garrison: Not on this call. But here I am, saving it.
Robert: This is enough. This is enough witty banter. This is a daily podcast, and now let's bring on our guest for today, Monsignor Alex Newhouse. Alex, how are you doing?
Alex: I'm doing well. Thanks for having me. I feel like I was pulled in off the street, just, like, bundled into a van, and then, yeah...
Robert: Yeah, we, uh — you know how people used to get, like, shanghaied? Like, captured — allegedly, allegedly — and forced to work on boats in, like, San Francisco and whatnot? We do that with podcasts. I mean, that is actually most of what I've done to the people who work on your podcast. I think I've had everyone from your show on our show now, and it has been very much like I'm just pulling them in on a string. Speaking of which, Alex, you are one of the hosts of the Terrorism Is Bad podcast, a very, uh, controversially named podcast. And you work at the Middlebury Institute of International Studies at Monterey's Center on Terrorism, Extremism, and Counterterrorism. Center on, not of — that would be a different center.
Alex: Very important, very important.
Robert: Yeah, and we're not bringing you on to talk about how to make explosively formed penetrators.
Alex: Not this time. That is someone else.
Robert: Yeah, but you are also — you were also an actual games journalist.
Alex: Yes, yeah, I got my start in this weird space.
Robert: How do you feel about Gamergate? How do you feel about ethics in the games journalism industry, Alex?
Alex: Uh... it's always been fine.
Robert: Like, yeah. All right, anyway, that's the end of that. I do want to actually start there, Alex, because you and I both have something in common, which is that we got our start writing in a field that's wildly different from consulting with, like, governments on terrorism. Like, for me, it was — I wanted to write, like, dick jokes on the internet, and I just, like, stumbled into a bunch of ISIS propaganda that most people weren't aware of, and that started me, like, lecturing at universities and shit. And for you it was Gamergate. I'm interested in kind of you telling your story a little bit to start us off.
Alex: Yeah, so during undergrad I interned every summer at GameSpot — video game website you may have heard of, it's one of the two big ones along with IGN. And when I was doing that — so this was, like, right at the beginning stages of Gamergate really popping off. And what ended up happening is a lot of the people I worked with, a lot of my colleagues and friends, were just in the blast zone. They were just targeted by the absolute onslaught of harassment. And I just had a curiosity, started looking into some of those people who were targeting my friends and colleagues, and it ended up being a lot of the people that we're still talking about today. You know, it all rolls back up to the Breitbart metropolitan area, if you will.
Robert: And, um, I don't know — I mean, obviously I've been aware of your work for a while, but the thing that made me want to specifically bring you on is you started on a new project to create, like, a video game that will hopefully have an ability to help, like, de-radicalize people. And I'm not entirely certain of the details of the project, but I think it's a fascinating project, because, as you know all too well, a lot of this stuff started in gaming — not as a result of anything specifically about gaming, but the kind of socialization that occurs in those spaces and the kind of different communities. And we have evidence going back to the nineties of different Nazi groups on the early internet talking about how these are specific groups and subcultures that, you know, will have an easier time radicalizing and whatnot. But yeah, I'm interested in kind of what actually is going on with this project, and how you think it's going to look. At this stage, I understand it's pretty early in development right now, so I'm not expecting, like, you know, an E3 walkthrough.
Alex: Yeah, an E3-size reveal — I wish we had that. Um, yeah. We won a grant from DHS and FEMA — their terrorism prevention grant program — this year. We just got awarded it, like, literally two weeks ago, so I have not even started work on it at all. But the project will be a collaboration between my center and a nonprofit games development company called the iThrive Foundation. And basically what we are going to do is, like, build digital scenarios, digital narratives that can be engaged with within classroom settings. So we're targeting high schools for rolling this out, and the idea is that we're going to give students the ability to take on roles that empower them to better understand how extremism and radicalization work as mechanisms, which will hopefully — the idea is that it will improve resilience and, you know, civic integrity and all those fun buzzwords within high school communities. So we're not necessarily trying to de-radicalize already radicalized people, but we're really trying to build community awareness, community resilience to radicalization pathways.
Robert: I mean, this is something I think about constantly, because I get asked this a lot. You know, I'll get emailed questions from people, sometimes in as much detail as, like, "Hey, I'm a teacher, and here's some things this kid in my class said, or something he put in an essay, and I'm growing really concerned about him — what do I do?" And my usual answer is, you know, there's a couple of people who I respect that I'll try to direct them to. But I'm pretty good at how people get radicalized — it's something I've spent a lot of time studying — and I have trouble figuring out how to break down these pathways. Because, like, right, the default for a lot of people and for a long time has been, well, you deplatform, right? You get them off of whatever. And I do certainly think there's utility in that, but there's also, you know, the toothpaste tube effect — the fact that when you squash these popular areas where they're able to spread, they filter off into increasingly isolated communities that develop new terms, they find ways to hide it. And that actually — you know, it may reduce the number of people who get radicalized, but the people who remain just get more and more extreme, because they're even more isolated from, you know, everyone else. And I don't know — how do you break that radicalization cycle? Like, how do you stop that shit before it gets, you know, to a tipping point?
Alex: Yeah, I mean, in general I'm with you — I'm pretty skeptical of a lot of de-radicalization strategies. It's an incredibly difficult task to pull someone out who's already going down these pathways. And then, like you said, it's also an incredibly difficult task to make sure that when you are disrupting the radicalization networks, they aren't just disappearing off to some other corner of the internet — which we know they're doing. Like, one of the reasons why we're working with a video game company is that over the last few years we've noticed a big migration onto video game platforms, especially big social-based video game platforms like Roblox and Minecraft, which are, like, not even remotely prepared to deal with, you know, very well-developed, sophisticated radicalization networks. They have moved over there both for organization and radicalization reasons, since mainstream companies have started taking more of an interest in deplatforming them. And so we're ending up, like, pretty wildly unprepared for this sudden onslaught of extremists being right in front of kids as they're playing games — or, you know, teenagers, or even young adults. So our idea, essentially, is to use that language — the same language that extremists are trying to adopt, the structures of video games, the sort of interactivity there — to better communicate the impacts of extremism, what it looks like, how to identify it, and hopefully how to avoid falling into the traps that are laid for unsuspecting people.
Robert: One of the issues — and I'm curious your thoughts on this — because we talk a lot about, like — I think people have become increasingly aware of how bad Facebook in particular is as a problem with this. It's really where a lot of the boogaloo movement grew, too, and now this stuff is coming out about, like, the data Facebook has had on — and this is adjacent to radicalization — the mental impact that it's been having on teenagers, right? Like, just how bad it is for people. And I'm wondering, like, how do you scale this stuff, I guess, is the question. Like, how do you actually make the social internet less dangerous?
Alex: Yeah, I mean, that's going to be extremely tough. And we are even starting very, very small — like, we're building on a narrative platform to target three high schools right now. But the hope is that ultimately what we can do is build a tool set and a platform — like, literally a game platform — that can be used by high school teachers in high school classes throughout the country, or throughout the world. The idea will be to hopefully make a new sort of package of different methods and interactive experiences that can be reused into the future. But it is one of the big open questions that we will hopefully come to some sort of answer for throughout the project: how do we actually scale this up. But, you know, in general it is, again, one of the biggest open questions right now. One of the reasons why I'm so skeptical of a lot of de-rad and CVE techniques is they try to go for scale over effectiveness, when in reality one of the best and only de-radicalization pathways that we know of involves people that you know and I know going out and meeting with these people one-on-one and having intensive, frequent communications with them. So, as far as we know, there's not a good answer right now. This is a huge area of research right now, because we just straight up do not understand how to scale up radicalization prevention and de-radicalization.
Robert: I mean, and, you know, what you're trying to do — like, reaching kids in high school with something they're meant to be consuming while they're in school — is even such an additional challenge, because I think you and I are both young enough to at least remember that, like, almost nothing that you put before kids in that context in a school gets through. I can think about, like, anti-drug programs and stuff when I was a kid, and how ineffective they were. I had one effective anti-drug, like, speech by a teacher, and it was just a teacher whose son — there was this one night in Plano where, like, six kids OD'd on heroin, there was a big Rolling Stone article about it, it was a very famous moment — and her son was one of the kids who nearly died. And she just explained, like, physically what happened to him and begged us not to do heroin. And that actually did stick with me — I've never shot up anything. But, you know, a lot of it doesn't work. And I think part of why is — it's this thing I talked about when I tried to explain why ISIS propaganda was so effective: it feels more authentic than the counter-narrative, right? The counter-narrative, because it's usually focus-grouped — it's coming as the result of, like, some sort of government initiative, a bunch of people worked on it together — it feels focus-grouped. As opposed to — there's something inherently more compelling about something that just, like, feels like somebody who really gave a shit, cares a lot, put this thing together, even if it's terrible. And that strikes me as a real issue, because if you're going to be scaling something and trying to reach a lot of people, it's going to have to be something that is put together at scale by an organization. And how do you — I mean, I know this must be on your mind as you're trying to figure out how to craft this thing. I'm just interested in your thoughts on that, really.
Alex: Yeah, I mean, that exact challenge is what led us to proposing the project that we did. So the idea behind it, or the impetus behind what we proposed, is exactly that problem — students just don't listen to people, whether that's anti-drug programs or anything like that. Often, my feeling about it is they are resistant to it because it's very negative. It's very "don't do this, don't do this" — setting up boundaries for kids and adolescents to act within. It's all very declaratory, very, you know, commanding. There's no sense of treating kids like people who have control, who have interests, who have motivations. It's all attempting to restrict them. And so the idea is that we're going to attempt to build a game platform that actually empowers students to operate within roles that have control, that have something to say — to give them voices, to give them that sort of feeling of being an established person within a certain scenario. The way that I've been thinking about it is that we're basically merging video games with, like, the structure of a Model UN conference or something like that — hopefully it will be a little less nerdy than the Model UN conferences. But that's the idea: giving people power to make decisions, and treating them like actual, you know, operating humans.
Robert: Yeah. I'm wondering, do you have any kind of models that you're looking at when you think of this — something that you see as kind of worth — "emulating" may be the wrong word, but like, "oh, these people I think got it right, and this was effective"? Or is this really a situation where you feel like we're kind of in the fucking wilderness here, and there's not a lot of great models for what's effective?
Alex: We are very much in the wilderness — I was expecting you to say that. Like, so much of CVE and de-rad work of the last ten years has been directed towards trying to essentially recreate, like, the DARE model or the anti-drug model, just in a different field. And so we're going to be pulling from scenario builders and, like, Model UN and debate, and all of these different models that seem to at least work to get kids engaged with, like, operating in that sort of situation. But it is going to be — I mean, at least from what I understand, it's going to be pretty new. We're going to be out there really flying blind for a lot of it. But we have a pilot phase built in to try to beta test this with some of the students — we're incorporating students and instructors in the actual creation and development stage. So that'll be another, hopefully, good part of this: we'll give some students experience with the game development process, which I think will help engage them as well.
Robert: That strikes me as a particularly good idea — and also just giving them some agency, so it's not like this is a thing that you are forced to consume; it's a thing that you can, like, learn something from. I think that's very important. I'm interested in how you see this, because, again, we kind of both got in around the same time — Gamergate is when I started paying attention to radicalization, too. How do you think it's changed since then? How do you think, like, the nature of how, particularly, younger people are being radicalized has changed? And I guess I'm also interested because I get the feeling that back then it was mostly younger people getting radicalized, and that's no longer the case. Just as we're talking, I came across a video on Twitter of a group of anti-vax protesters chasing parents and children away from an elementary school and screaming at them that they're raping their kids with the vaccine. So clearly the problem has expanded.
Alex: But yeah — and honestly, one of the things that keeps me up at night is, when we start — if, you know, knock on wood, we're able to roll this out to more schools — we're going to run into some probably very resistant parents who have radicalized. Um, yeah, I mean, the big one is, like you said: the radicalization demographics have vastly expanded to incorporate so many more different types of people, so many more ages, and even ethnicities and genders. But what we do know is that the hard core of the violent extremists are still targeting adolescents. We know accelerationists, for instance, hang out and try to essentially blackpill a bunch of teens — especially autistic teens, especially teens with mental health issues — and bring them into a more violent, more accelerationist posture. So I think that has sort of stayed constant throughout all of this. One of the big changes has been platforms. You know, ten years ago it was much easier for a neo-Nazi to operate openly on YouTube or Facebook, but that has thankfully changed. But they have spread out into — like I mentioned earlier, they've spread out into video games, they've spread out into other sorts of platforms where the social aspect isn't necessarily the first part of the platform but rather a secondary aspect of it. And they try to engage adolescents on their own turf — you know, in a Roblox game or in a video game forum.
Robert: Out there, it's not even enough to say — it feels like the task of reducing radicalization, or, not even mentioning pulling it back, just stopping the process, feels not just like whack-a-mole, but like whack-a-mole when you're surrounded by moles. And I guess that is the thing that keeps me up at night the most, too: the problem has gotten, because of how social media scales, I think, in large part, so much worse than it ever was. And I see these crowds of adults, you know, assembling in places like Los Angeles and showing up outside of schools to harass people, and I don't know what to do about that. Like, part of me thinks that the only effective long-term answer is to mobilize a larger number of people to show up — to, you know, not necessarily confront those people, but make them feel outnumbered, and maybe they'll stop, and that will start a process where they alter their thinking. Like, I'm thinking kind of back to some aspects of the civil rights movement here, right, where you would have these people show up at schools to try to stop integration and whatnot, and they would be opposed, often, by larger groups — they would see the size of the marches in the street. And I don't even know if it works that way anymore — like, whether knowing that, ten to one, people think your stance on vaccines is stupid and they're willing to show up to, like, yell at you would do anything. But I don't know what's going to do it. Like, I guess I'm asking you — have you figured this out? Because I don't know what the fuck to do. But we can't close our eyes to it — obviously you're someone who's trying to confront it directly — and we certainly can't just pretend it's not going to get worse, right?
Alex: No, totally. And, um, you know, I often feel like it's almost too far gone, and frequently I worry that we've already passed some sort of, you know, point of no return on the radicalization exploitation of social media. But one of the other things I've also recognized is that when you're in a space that is dedicated to one type of — one method of confronting extremism, very often people will forget about, or deprioritize, or even ignore the other types, the other methods. And one of the tasks before us, I think, before we throw up our hands and give up, is trying to tie together all the different facets of resisting extremism: from the hardcore confrontational doxxing and showing up in the streets counter-protesting, which I think is an essential part of it, to working as hard as we can to try to get tech companies to realize what's going on, and then also the educational side, like what we're doing with this project. Some of the things that make me at least a little bit optimistic: there is obviously inertia, both intentional and unintentional, at tech companies, but frankly they are still extremely far behind in understanding how to even do deplatforming on their platforms, how to even identify who to deplatform. Like, the majority of tech companies are still making content moderation decisions on a piece-by-piece basis, specifically looking at content. Very few of them are doing actor analysis, very few of them are doing social network analysis, very few of them are looking at even the links between, like, off-platform violence and on-platform content. They are still very much in the Stone Age when it comes to content moderation.
Robert: And that's so, so key when I think about what actually would reduce the harm that these platforms are doing at scale — it's focusing on the actors. And not just, like, the individual actors, but the patterns that let you tell whether or not someone is, like, that same actor who's kind of putting on a different hat, so to speak. Are you aware of — is there any — because I have not seen that happen yet. I haven't seen Facebook take that seriously, and I have spent some time there. I certainly haven't seen Twitter take that seriously. I haven't really seen — I don't believe TikTok is, like — they're just, like you said, taking it on a piece-by-piece basis, which is never — there's too many pieces. That's never going to handle the problem.
Alex: Yeah, I mean, TikTok is crawling right now — they are in their infancy. They don't have any sort of data-sharing systems set up for researchers or anything like that yet. I have seen optimistic signals, though. I think Facebook's approach to QAnon and the boogaloo movement over the past year has been probably the best, the most positive development we've seen on the content moderation front, because they took an actual network-based approach to it. It was hamstrung by a variety of different policy decisions, but it was still, from, like, a mechanics standpoint, the most sophisticated one any of the companies has actually talked about openly. And YouTube has followed in their path — they've started taking more network approaches, they've taken moderation action against QAnon on a similar basis. But the thing that I want tech companies to start looking at is applying a lot of the techniques they're using for disinformation and info ops work to extremism and radicalization. It's very similar, but right now it seems to be just easier politically — or they're just further along with doing the large-scale network analysis approaches on disinfo. Like, Twitter is doing a lot of that, but it's all on information operations and fake info—
Robert: Yeah, as opposed to—
Alex: Yeah, people. Yeah.
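To make the distinction Alex is drawing a bit more concrete, here is a minimal, hypothetical sketch contrasting piece-by-piece moderation with a simple actor- and network-level pass. The posts, interaction graph, weights, and thresholds are invented for illustration only; this is not how any named platform's pipeline is confirmed to work.

```python
# Hypothetical sketch: piece-by-piece moderation vs. a simple actor/network-level pass.
# The posts, interaction graph, weights, and thresholds below are invented for illustration.

posts = [
    {"id": 1, "author": "a", "violates_policy": True},
    {"id": 2, "author": "a", "violates_policy": False},
    {"id": 3, "author": "b", "violates_policy": False},
    {"id": 4, "author": "c", "violates_policy": True},
    {"id": 5, "author": "d", "violates_policy": False},
]

# Who interacts with whom (shares, replies, shared group membership), as an adjacency list.
interactions = {
    "a": {"b", "c"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": set(),
}

# Piece-by-piece: every post is judged in isolation.
removed_posts = [p["id"] for p in posts if p["violates_policy"]]

def violation_rate(author):
    """Share of an author's own posts that were flagged."""
    own = [p for p in posts if p["author"] == author]
    return sum(p["violates_policy"] for p in own) / len(own) if own else 0.0

def network_score(author):
    """Blend an author's own violation rate with that of the accounts they interact with."""
    neighbors = interactions.get(author, set())
    neighbor_avg = sum(violation_rate(n) for n in neighbors) / len(neighbors) if neighbors else 0.0
    return 0.5 * violation_rate(author) + 0.5 * neighbor_avg

# Actor/network pass: flag accounts whose cluster looks bad, even if their own posts squeak by.
flagged_actors = sorted(a for a in interactions if network_score(a) >= 0.35)

print("removed posts (piece-by-piece):", removed_posts)   # [1, 4]
print("flagged actors (network pass): ", flagged_actors)  # ['a', 'b', 'c']
```

The point of the second pass is that account "b", which never posted violating content itself, still surfaces because of the cluster it sits in — the kind of pattern a post-by-post review never sees.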
Robert: And I worry, too, because I'm paying attention to, you know — you have this whistleblower from Facebook, and how that's being politicized, right? How the right is kind of coming at this from a — they're trying to say, like Ben Shapiro said, that they're trying to censor alternative media voices and the like. And I worry tremendously about the politicization, because, number one, it means that at best we've got, like, three years to get something together before — you know, who knows who winds up in the White House next. But also, if it's just this thing of veering between who gets paid attention to based on what is politically viable for Facebook, we're never going to solve the problem. And I think I agree with you, for the most part, on Facebook's response to the boogaloo movement. I mean, I guess I think the problem was that by the time they developed a functional set of responses to it, it had metastasized — it had grown strong enough to exist on its own, and a lot of people had gotten exposed. What do you think is actually reasonable to expect in terms of response time from these people? Because with the boogaloo stuff, it was about — I want to say about three months, maybe — well, no, it was more like five. It was about five months: December of 2019 was when I started really noticing it, and then, you know, May, when stuff really kicked off with the George Floyd protests, was when you started to see action taken, late in May. So I guess I'm wondering, like, what is the half-life of this shit? Like, how quickly do you need to crack down on this stuff before it gets to be impossible to contain?
Alex: Yeah, I mean, that's the biggest limiting factor on the effectiveness of content moderation in general, but also in particular of these new approaches that the tech companies seem to be experimenting with. My understanding — and I'm not defending Facebook by any stretch, I'm not here to be the Facebook rallying group — but my understanding is that they literally did develop an entirely separate approach to taking down the boogaloo movement, so that explains at least a little bit of the delay. But hopefully — you know, my optimistic side hopes that they will be able to apply it more quickly in the future. The problem is, a lot of the network approaches that have been developed have these very high thresholds for attribution. So it has to be, like, a dedicated network that has crossed the line into criminal activity and is actively calling for, you know, political violence on, like, a network level. And, like, we all know that that is the end goal, the end point—
Robert: Exactly, right.
Alex: Like, that is the terminal point of the development of these extremist networks. So, you know, one of the things that we're working on is trying to figure out a way to convince tech companies that you can and should take action earlier, before it reaches that point. And it's going to be a mosaic of things. It's going to be combining violent extremism with hate speech, with even, like, CSAM — child exploitation stuff — with, you know, criminal conspiracy network policies. All of those things need to be thought of as pieces in a single, big, overarching umbrella that we can use to take down networks earlier on. But, you know, that's one of the biggest tasks — just convincing them to think about it much, much earlier.
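As a rough illustration of the "mosaic" idea described above — several policy signals, none of which crosses its own threshold, combined into one network-level score that can justify earlier review — here is a minimal, hypothetical sketch. The signal names, weights, and cutoff are assumptions made up for illustration, not any platform's actual policy.

```python
# Hypothetical sketch of the "mosaic" approach: combine several policy signals about one
# network into a single score, instead of waiting for any one signal (e.g., explicit calls
# for violence) to cross its own threshold. Signal names, weights, and the cutoff are
# illustrative assumptions, not any platform's real policy.

SIGNAL_WEIGHTS = {
    "violent_extremism": 0.40,
    "hate_speech": 0.25,
    "child_exploitation": 0.20,
    "criminal_conspiracy": 0.15,
}

REVIEW_THRESHOLD = 0.45  # combined score at which a network gets escalated for review

def network_risk(signals):
    """Weighted sum of per-signal scores (each assumed to be in [0, 1]) for one network."""
    return sum(SIGNAL_WEIGHTS[name] * signals.get(name, 0.0) for name in SIGNAL_WEIGHTS)

# A network that stays under the line on every individual policy...
example_network = {
    "violent_extremism": 0.5,   # organizing and glorification, but no explicit calls yet
    "hate_speech": 0.7,
    "child_exploitation": 0.1,
    "criminal_conspiracy": 0.4,
}

score = network_risk(example_network)
print(f"combined network risk: {score:.2f}")   # roughly 0.46 with these made-up numbers
if score >= REVIEW_THRESHOLD:
    print("escalate for network-level review")  # ...but still clears the combined threshold
```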
Robert: Yeah. Um, all right — that's most of what I wanted to get into today. Is there anything else you really wanted to kind of talk about while you're here?
Alex: Those are the big ones, for sure. We will hopefully have more to talk about very soon on how we're approaching this project. It's going to be a pretty big project — it will take two years to implement — but we're pretty excited to see what comes out of it.
Robert: Yeah. Well, people can find you on Twitter — it's just at Alex Newhouse, right?
Alex: AlexBNewhouse.
Robert: AlexBNewhouse — yeah, at AlexBNewhouse. They can check out where you work, at CTEC at MIIS. And yeah, I'm excited to see — well, maybe we'll have you back on when you actually put out the game, but I'm really interested in looking at that. Oh yeah — what was the last thing you brewed?
Alex: Oh, I brewed a red IPA, and I'm currently brewing three gallons of apple cider.
Robert: Oh, nice. We just juiced ten gallons of apples and pears that I just kegged after almost four weeks of fermentation.
Alex: I know — I've been looking at apple mills, like apple presses, and yeah, I should just buy one.
Robert: We found one to rent for, I don't know, thirty bucks for the day, and we just gathered up all the apples on the property. But it was rad — definitely very soothing. Yeah, we were juicing all of the apples the day that, um, Tiny got shot at that protest in Olympia. So it's just, like, looking at Twitter, seeing there's been a shooting at a protest, and being like, yeah, I'm glad I'm not working today.
Alex: Yeah, "I'm glad I'm not working today." Idyllic afternoon pressing apples.
Robert: This is a more enjoyable use of my time right now. All right, well, Alex, thank you so much for being on, thank you for what you're doing, and thank you all for listening. Go with, you know, whoever — whatever deity. Up to you.
It Could Happen Here is a production of Cool Zone Media.
For more podcasts from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at coolzonemedia.com/sources. Thanks for listening.