Speaker 1: Good morning, peoples. Welcome to Woke AF Daily with me, your girl Danielle Moodie, pre-recording from the Home Bunker. Folks, as I said yesterday, I am away for a trip to London to join Mehdi Hasan on one of his infamous debate panels that he does, and I am super, super excited to be joining him. So today, on today's episode, we are doing part two of my in-depth conversation with the fabulous producer of Woke AF Daily, Andrew Marcello. If you remember, in part one we went through kind of this incredible web of Gen Z white male gamers and how they have become fertile ground for MAGA, and how this did not just happen overnight; it actually has been years, if not decades, in the making, and what this could potentially mean as we move into not only the ferocious sprint of this election, but in general the sanctity and security of our democracy going forward. So Andrew walks us through this world again, giving us some insights that are just frankly mind-blowing.
And my feeling is that, like, we all need to be aware of what is happening and how Steve Bannon and others are tapping into this cult of personality and essentially weaponizing this group against democracy, against progress, against a multiracial democracy, and more. So that conversation with Andrew is coming up next, folks.

I am so happy to be rejoined on Woke AF Daily by my producer extraordinaire Andrew Marcello, who, if you remember, we are doing a two-part episode on the rise of, oh God, right-wing Gen Z, but basically through the world of gaming. And Andrew gave us in our first episode of this two-part series a real deep dive and timeline that took us back to the nineties, in terms of, like, when this world was created, who's responsible for it, and kind of moving us through to today, to the present day, where there's a particular set of white young male gamers that are, I guess, what would you say, Andrew, like a foundational part of, like, the MAGA movement, an arm of the MAGA movement.

Speaker 2: Yeah, they're definitely a wing, and I want to get into how their presence got minimized over time as well.

Speaker 1: Actually, okay, so, like, bring us in. So where we left off was this extraordinary timeline of how we got to this present place, but we still don't really understand this group of young white males that are seemingly misogynist, racist incel types who have created, like, this network, this web that is able to coordinate and take people down in various ways. So, like, give us the insight into these people.

Speaker 2: Absolutely. So I appreciate you having me back on to wrap up this discussion. If you're listening right now and you want more detailed context for what's going on, the September sixteenth episode of Woke AF, "From Gamergate to Trump," was really my full primer on what I'm about to discuss. But the CliffsNotes version of that conversation is: in twenty thirteen, many online social factors, such as the rise of populist feminism, the rise of non-traditional video games that are more focused on narrative and more focused on social issues, and a history of gamers, people who play video games and identify as gamers, feeling like they are, not an oppressed people, but socially ostracized, you know. So all of those things conflated. And then the spark that blew this up was a jilted ex of a woman-presenting person who worked in the video game industry, and so that created an online harassment campaign, which at first was called the Quinnspiracy and then became known as Gamergate. As much as they denied that it was coordinated, it was a coordinated harassment campaign, not just against this one person, but against multiple women and fem-presenting people who worked in the video game industry, the tech industry, and the gaming and tech online journalism space as well.
So that started in twenty thirteen, really blew up through twenty fourteen, and then continued through twenty fifteen, aided very largely by articles being published on Breitbart, which was being operated by Steve Bannon at the time.

Speaker 1: So what a tangled web we weave.

Speaker 2: Everything is connected. We have to have context, like the coconut tree, like.

Speaker 1: It's so true, because, like, it's like you pull one thread, which I think is, like, oh, this misogyny that's also present in the gaming community, and here are these examples of how this happened. But then when you pull the thread, you have all of these characters that we've been talking about on Woke AF, but in a different context. So tell us about Steve Bannon's involvement here, and then why it's necessary for his involvement to then create this wing of the MAGA movement.

Speaker 2: Absolutely. So Steve Bannon, if you don't remember or aren't familiar with the name, was an operator, and I believe he had become at this point the owner or co-owner of Breitbart News, which was a blog, not a newspaper or legitimate news source, but it was called Breitbart News. It was a right-wing hate blog that was infamous, at least among people on the left, for how extreme they were, as well as among people on the right, at least people on the online right, what became known as the alt-right. And I should say, actually, that Breitbart was essentially a funneling mechanism for the alt-right, because it took people online who already had antisocial notions, misogynistic notions, racist notions, all these kinds of things, notions of hate, and brought them into this space where those notions of hate could be stoked by having headlines about things they're interested in, like video gaming and tech. So Steve Bannon hired a writer, hired someone who writes words, named Milo Yiannopoulos to write these articles. Other people wrote them too, but Milo was a really big figure in writing these articles on Breitbart, which Steve Bannon literally once called the platform for the alt-right. And at the same time this was happening, I was reminded that there was a man named Mike Cernovich. Do you remember him?

Speaker 1: No, no, I don't.

Speaker 2: So Mike Cernovich was early on one of those people online who was pro hashtag Gamergate, along with some other people, like an actor from Firefly and all this other stuff. But while Milo Yiannopoulos was at Breitbart, at the same time Mike Cernovich was pursuing a crusade against the mainstream media from a right-wing perspective. He was a person who would call CNN the Clinton News Network. He was an online right-wing extremist.
So he was someone who early on said he was pro Gamergate and would post things about it, and again, bringing those people in, bringing people in who are interested in this. And again, if you want to find out why people were so interested in Gamergate, you've got to listen to the last episode. So he takes this population that's already primed for hate and then brings them into the Trump world and the anti-Clinton world and the anti-Democratic world and the anti-mainstream-media world, in a way that is very different than the way that you talk against the mainstream media.

Speaker 1: It's just wild to me. I guess I'm still stuck on the how did we get here? But I don't want to, like, belabor that. Like, I just say that as, like, a how the fuck did we get here?

Speaker 2: I hate to give the man credit, but it really was a masterstroke of strategy by Steve Bannon. I really hate to say it, but, like, some people saw it happening in real time, and it's like watching a car crash, when you see something happening that you have no control over, and it's like, oh, this guy is coordinating all these different antisocial movements in this space that not everybody knows about. If you try to talk to normal people about it, they might look at you some kind of way, because you're telling them about all this stuff that's happening on the internet, and they don't think it's quote-unquote real. They don't think it's important. Let me just mention real quick: a lot of people don't think online harassment is a real issue, is a serious issue.

Speaker 1: Except for those that are actually caught up in it, right, and have their lives upended.

Speaker 2: It leads to real-world harassment. People who exist online are people who exist in the real world, and people who make threats online can absolutely translate those threats into real-world action. So let's make that perfectly clear.

Speaker 1: Yeah, one hundred percent.

Speaker 2: But that's part of it too, for me, is that you then do this, all this organizing, in an online social space that not everybody takes seriously, and by the time you can take it seriously, like we're discussing, it's already, in some ways, it's already happened. I don't want to say it's already too late, but it's already happened. Twenty sixteen already happened. All the convergence of these movements into the alt-right, and then into what is now called MAGA, has already happened. We're here. We're in twenty twenty-four.

Speaker 1: And I feel like by the time that the mainstream, and I consider myself in the mainstream population, because this is not my niche and not my world, by the time I'm hipped to something like this, we've been a decade in it.
Steve Bannon, to your point, had eyes on what can be named as a disaffected group of young men, the same ones that have been targeted by YouTube personalities, that listen to Joe Rogan, like the same kind of angry young white men who don't know where their place is in a changing world where they are no longer considered, like, white men still run every fucking thing, but, like, there are just more people of color, more women, more equity, but we're still not at the equitable fifty-to-fifty split of how C-suites, of how business and industry run. But yet Steve Bannon and the MAGA misogynist, sexist movement is able to capitalize on this group. And I guess my question, Andrew, is, like, how and why, and is there a way out?

Speaker 2: So I think the why of it, for me as a white person, is understandable. It's not relatable to me, but it is understandable. White people, and white men even more so, white cis men, like, the more privilege you add to it, the more this multiplies, I would say, or at least, you know, adds, multiplies, exponential, whatever math you want to do on it. White, male, cis, straight. You start stacking all these on, there's a lot of reactionary feelings there. There's a lot of sensitivity there, to be honest, and, you know, traditional masculinity is not about sensitivity, but a lot of people who are traditionally masculine are the most sensitive people, because they haven't had to exist in a world where their existence is threatened in any way, even, like, the remotest possible way. So, if you're growing up in a world where you see yourself in everything, and the protagonist of everything you've ever, every piece of media you've ever engaged with, the main character is a white man, and he's, presumably, it doesn't even have to be explicitly stated, you can just assume that he's also cis and straight. And this has only started to change in, take it for granted, this has really only started to change in the last decade or fifteen years in a major way. Yes, there was representation before that, but, like, in a major way, we've started to see, like, Star Wars only got a Black protagonist in twenty seventeen.

Speaker 1: And we saw how that went.

Speaker 2: I have a lot of thoughts on that as well. That might need to be a different podcast. Anyway, so the why of it is: white people are reactionary. White men are reactionary. White men on the internet who are sensitive about not every single video game character being a white man anymore, and them, you know, making Lara Croft skinny. They're a very reactionary population. And Steve Bannon recognized this. I gave this quote last time, but I'll read it again. This is a quote from Steve Bannon: "I realized Milo could connect with these kids right away. You can activate that army. They come in through Gamergate or whatever, and then get turned onto politics and Trump." So there's a lot going on there in his mind, in Steve Bannon's mind. He calls them kids; these are seeds. He's not going to the boomers, he's going to the young people. What's the article we were discussing? "Many Gen Z men feel left behind. Some see Trump as an answer." They get them young. That's any marketing, really. Why did they use to market McDonald's so much to children? You get them while they're young. You activate that army. They come in through Gamergate or whatever, and it is worth noting this wasn't just Gamergate. We talked about this a bit last time, but there was the rising incel culture online that got funneled through this. There are other antisocial movements, like I mentioned, online that got funneled through this. And then get turned onto politics and Trump, because it's not just activating them politically that matters, of course, but activating them towards, not even just the Republican Party, Steve Bannon's goal. He was not a Republican strategist. He wanted people to be part of this extremist movement that, honestly, in a lot of ways, Donald Trump was kind of just a tool for, right? Like, Donald Trump is a hateful man and all that other stuff. I'm not trying to absolve Trump, but behind the scenes in twenty sixteen, I think it needs to really be remembered that he had a lot of people operating around him, using him as a pawn for their own means. And Steve Bannon was one of the biggest. Breitbart, Gamergate, MAGA.
These were 245 00:15:15,160 --> 00:15:18,560 Speaker 2: all tools to the same These were all means to 246 00:15:18,640 --> 00:15:22,800 Speaker 2: the same end. And we're now living in Steve Bannon's endgame. 247 00:15:23,480 --> 00:15:26,920 Speaker 1: Yo. I mean I needed you to roll that back, Andrew, 248 00:15:27,080 --> 00:15:32,960 Speaker 1: because I literally, like my heart just stopped when you 249 00:15:33,120 --> 00:15:39,000 Speaker 1: said we're playing in Steve Bannon's end game. Oh my god, 250 00:15:39,760 --> 00:15:43,920 Speaker 1: keep going, because I'm like, you're right, because we're still 251 00:15:44,400 --> 00:15:46,560 Speaker 1: and by we, not you, but I mean like the 252 00:15:46,640 --> 00:15:52,720 Speaker 1: mainstream and the Democratic Party is still ten years behind 253 00:15:53,320 --> 00:15:56,840 Speaker 1: trying to figure out what the rules of this new 254 00:15:57,280 --> 00:16:02,840 Speaker 1: world political order is. Meanwhile, they are activating all of 255 00:16:02,880 --> 00:16:04,920 Speaker 1: these different cells. Yep. 256 00:16:05,920 --> 00:16:09,200 Speaker 2: And it's not just gamer Gate. I don't want to 257 00:16:09,280 --> 00:16:13,600 Speaker 2: spend all this time going Democrats and American liberals have 258 00:16:13,680 --> 00:16:16,920 Speaker 2: missed the boat, but missing the boat on this missing 259 00:16:16,920 --> 00:16:18,720 Speaker 2: the boat on the whole rise of the alt right 260 00:16:18,960 --> 00:16:24,160 Speaker 2: into what it's become echoes, missing the boat on the 261 00:16:24,200 --> 00:16:28,600 Speaker 2: anti abortion movement and how deep that went, and missing 262 00:16:28,600 --> 00:16:31,480 Speaker 2: the boat on the fight for the courts, and how 263 00:16:31,520 --> 00:16:38,880 Speaker 2: deep that went. And I think that I can't speak 264 00:16:39,000 --> 00:16:41,080 Speaker 2: so much to the political world because I'm not in it. 
265 00:16:41,640 --> 00:16:44,920 Speaker 2: But my experience with this in the mainstream media world, 266 00:16:45,240 --> 00:16:48,760 Speaker 2: where we both were at one point was trying to 267 00:16:48,840 --> 00:16:53,240 Speaker 2: sound the alarms to this was like the people. It 268 00:16:53,320 --> 00:16:56,200 Speaker 2: felt like being a person on the Titanic saying that 269 00:16:56,240 --> 00:17:01,560 Speaker 2: there's icebergs in the water. There was a indifference towards it. 270 00:17:02,440 --> 00:17:05,560 Speaker 2: There was that attitude of this is online. It was 271 00:17:05,600 --> 00:17:07,240 Speaker 2: in some ways too late because Trump had already been 272 00:17:07,280 --> 00:17:10,080 Speaker 2: elected when this was happening from my perspective, but even 273 00:17:10,119 --> 00:17:11,880 Speaker 2: then I had wanted to connect the dots like I'm 274 00:17:11,880 --> 00:17:13,720 Speaker 2: doing right now, but I wanted to do it in 275 00:17:13,760 --> 00:17:17,440 Speaker 2: twenty sixteen or twenty seventeen, and the feedback I received 276 00:17:17,680 --> 00:17:20,479 Speaker 2: was this is you know, this is too niche, this 277 00:17:20,560 --> 00:17:23,680 Speaker 2: is the tech world, this is online, and there hadn't 278 00:17:23,680 --> 00:17:25,920 Speaker 2: been a lot of mainstream reporting on it, and there 279 00:17:25,960 --> 00:17:29,280 Speaker 2: still isn't. But there's what sparked this conversation was you 280 00:17:29,320 --> 00:17:31,720 Speaker 2: sent me that New York Times article and I said, Wow, 281 00:17:31,720 --> 00:17:33,919 Speaker 2: the media is finally reporting on this, huh. 
282 00:17:33,960 --> 00:17:37,119 Speaker 1: And I was just like, wait, what you know what 283 00:17:37,200 --> 00:17:41,160 Speaker 1: I'm saying like, there is just I think the depths 284 00:17:41,480 --> 00:17:44,919 Speaker 1: of which this goes and how far back, like we 285 00:17:45,960 --> 00:17:50,440 Speaker 1: in the liberal kind of democratic sphere want to believe 286 00:17:50,520 --> 00:17:56,280 Speaker 1: that Maga happened overnight and want to believe that these 287 00:17:56,320 --> 00:17:59,680 Speaker 1: weren't plans that much in the way to your point 288 00:17:59,720 --> 00:18:02,720 Speaker 1: of the overturning of Ruby Wade were fifty years in 289 00:18:02,760 --> 00:18:06,840 Speaker 1: the making. You took us back on the first episode 290 00:18:07,400 --> 00:18:12,800 Speaker 1: thirty years of when the seeds of this were being planted. 291 00:18:13,640 --> 00:18:16,639 Speaker 1: And so what I realize is that then nefarious nature 292 00:18:17,080 --> 00:18:19,040 Speaker 1: can we continue to say like, oh, it doesn't know 293 00:18:19,119 --> 00:18:22,240 Speaker 1: any depths and it's because it was it's a project, 294 00:18:23,080 --> 00:18:27,879 Speaker 1: a thirty forty to fifty year long project to dismantle 295 00:18:29,200 --> 00:18:32,959 Speaker 1: democracy from the inside out. That you weren't going to 296 00:18:33,080 --> 00:18:36,480 Speaker 1: need foreign agents. You were just going to need to 297 00:18:36,560 --> 00:18:39,400 Speaker 1: activate disgruntled young white men. 298 00:18:39,920 --> 00:18:43,280 Speaker 2: Yep. 
And I you know, even giving a charitable reading, 299 00:18:43,320 --> 00:18:45,320 Speaker 2: which is the stuff that happened in the nineties with 300 00:18:45,359 --> 00:18:47,200 Speaker 2: the Senate committees and the stuff that happened in the 301 00:18:47,200 --> 00:18:51,040 Speaker 2: two thousands with Jack Thompson, those don't even have to 302 00:18:51,080 --> 00:18:54,280 Speaker 2: be part of any sort of grand plan because the 303 00:18:54,280 --> 00:18:59,480 Speaker 2: fact remains, as you said, that these things even coincidentally, 304 00:19:00,119 --> 00:19:05,159 Speaker 2: I'm a population, a sub group of people in society 305 00:19:05,800 --> 00:19:08,480 Speaker 2: to be a certain way, and behave a certain way, 306 00:19:08,520 --> 00:19:11,840 Speaker 2: and be reactionary in a very strong way. That then 307 00:19:12,119 --> 00:19:15,720 Speaker 2: someone like Steve Bannon, a strategist who isn't I don't 308 00:19:15,720 --> 00:19:17,919 Speaker 2: remember the name, but there was it, Leonard Leo, like 309 00:19:18,200 --> 00:19:20,000 Speaker 2: Steve Vann, is not the first of his kind either, 310 00:19:20,640 --> 00:19:23,679 Speaker 2: but he is someone who understands the human condition and 311 00:19:23,800 --> 00:19:29,240 Speaker 2: uses their understanding of the human condition for very nefarious means. 312 00:19:29,280 --> 00:19:31,520 Speaker 1: And I think that that's the thing that we have 313 00:19:31,720 --> 00:19:34,920 Speaker 1: not really fully been able to tap into, is the 314 00:19:35,000 --> 00:19:38,639 Speaker 1: psychology of the thing. 
And that's where I feel to 315 00:19:38,680 --> 00:19:43,919 Speaker 1: your point, Steve Bannon and the rest of them have 316 00:19:44,080 --> 00:19:47,720 Speaker 1: been able to do right, Like they've been able to 317 00:19:47,880 --> 00:19:52,800 Speaker 1: really get into the psychology where kind of I don't know, 318 00:19:52,880 --> 00:19:55,280 Speaker 1: maybe to our detriment, and I don't think that it's 319 00:19:55,320 --> 00:19:57,719 Speaker 1: a big maybe we were like the world is changing, 320 00:19:57,800 --> 00:19:59,800 Speaker 1: move on, catch up it. 321 00:20:00,000 --> 00:20:03,280 Speaker 2: It is flawed thinking on any political side to have 322 00:20:03,320 --> 00:20:05,720 Speaker 2: the feeling of most people think the way I do. 323 00:20:06,000 --> 00:20:08,119 Speaker 2: You see it manifest in Trump people with like he 324 00:20:08,200 --> 00:20:11,560 Speaker 2: says what everyone's thinking, and it's like, I'm I'm not 325 00:20:11,640 --> 00:20:15,080 Speaker 2: thinking that stuff, but they think it is because they 326 00:20:15,080 --> 00:20:17,720 Speaker 2: think that way. I think how that manifests on the 327 00:20:17,760 --> 00:20:20,880 Speaker 2: liberal and left wing side, unfortunately, is I can't believe 328 00:20:20,880 --> 00:20:23,640 Speaker 2: people would think that way. How could people think this way? 329 00:20:23,680 --> 00:20:25,680 Speaker 2: How could people have that much hate in their hearts 330 00:20:25,760 --> 00:20:28,919 Speaker 2: or whatever? And it's not even necessarily all the time. 331 00:20:29,119 --> 00:20:31,640 Speaker 2: It definitely is for some people. But it's not even 332 00:20:31,680 --> 00:20:34,720 Speaker 2: like all these people like wake up and think about 333 00:20:34,840 --> 00:20:38,080 Speaker 2: how freaking racist they are. 
It's that they have a 334 00:20:38,119 --> 00:20:40,800 Speaker 2: reactionary... Again, they're so used to living in a world, 335 00:20:40,840 --> 00:20:43,800 Speaker 2: and the world is changing, and rather, the only 336 00:20:43,840 --> 00:20:47,679 Speaker 2: way that they have to cope with that is projecting 337 00:20:47,880 --> 00:20:53,560 Speaker 2: this negativity onto whatever they perceive as threatening their existence 338 00:20:53,560 --> 00:20:57,680 Speaker 2: as a dominant force in society, an unquestioningly dominant force 339 00:20:57,720 --> 00:21:00,959 Speaker 2: in society. And if there is nowhere else for them 340 00:21:01,040 --> 00:21:03,200 Speaker 2: to turn to, they're going to turn to the people 341 00:21:03,200 --> 00:21:08,920 Speaker 2: who say, welcome in, we hate those [slur, slur, slur] too, 342 00:21:09,119 --> 00:21:11,800 Speaker 2: and just make those people worse, because they're primed to 343 00:21:11,880 --> 00:21:13,119 Speaker 2: be made worse. 344 00:21:14,160 --> 00:21:20,359 Speaker 1: Yeah, it's something that is quite extraordinary. And while I 345 00:21:20,400 --> 00:21:23,280 Speaker 1: would say, just like kind of pushing us into the 346 00:21:23,320 --> 00:21:27,400 Speaker 1: present, where the Harris-Walz team is trying to tap 347 00:21:27,480 --> 00:21:32,640 Speaker 1: into this place of hopefulness and possibility, you can't at 348 00:21:32,640 --> 00:21:36,679 Speaker 1: the same time then ignore, and I hate to, fucking 349 00:21:36,800 --> 00:21:41,480 Speaker 1: trust me, like, the quote unquote economic anxiety, the racial anxiety, 350 00:21:41,560 --> 00:21:45,679 Speaker 1: all of these things that are coded for racism. I 351 00:21:45,840 --> 00:21:49,880 Speaker 1: hate to even give any attention, air, and oxygen to it.
352 00:21:50,560 --> 00:21:55,240 Speaker 1: But there is something to what would have been or 353 00:21:55,280 --> 00:21:58,880 Speaker 1: what should be the right way, or at least an 354 00:21:58,880 --> 00:22:05,840 Speaker 1: initial step, in bringing these kinds of young men into 355 00:22:05,880 --> 00:22:08,639 Speaker 1: the fold. Or is it now, I guess, 356 00:22:08,760 --> 00:22:09,440 Speaker 1: just too late? 357 00:22:10,200 --> 00:22:12,320 Speaker 2: I think for young people it's not too late. I 358 00:22:12,320 --> 00:22:13,960 Speaker 2: don't know about the people who were affected by this 359 00:22:14,040 --> 00:22:16,320 Speaker 2: ten years ago, because now a decade has gone on, 360 00:22:16,440 --> 00:22:19,159 Speaker 2: and if someone's in MAGA world and someone's that entrenched 361 00:22:19,160 --> 00:22:21,240 Speaker 2: in it, I genuinely don't know if you can get 362 00:22:21,640 --> 00:22:24,720 Speaker 2: that person out of that type of thinking without some, 363 00:22:25,000 --> 00:22:29,520 Speaker 2: like, really serious one-on-one intervention, which is not 364 00:22:29,560 --> 00:22:32,640 Speaker 2: what we're talking about. But with people who are younger, 365 00:22:32,760 --> 00:22:36,200 Speaker 2: like the gen Z men who feel quote unquote left behind, 366 00:22:36,520 --> 00:22:38,760 Speaker 2: like, left behind you could take as another sort of 367 00:22:39,680 --> 00:22:43,040 Speaker 2: code, like economic anxiety and all that other stuff, but 368 00:22:43,200 --> 00:22:47,080 Speaker 2: I think it does speak to that perceived real feeling 369 00:22:47,080 --> 00:22:49,400 Speaker 2: that they have. Like, if you feel something, then it's 370 00:22:49,440 --> 00:22:52,520 Speaker 2: real to you.
And there's that feeling, even if they 371 00:22:52,520 --> 00:22:56,919 Speaker 2: can't place why it is, of like, my dominance is 372 00:22:56,960 --> 00:22:59,200 Speaker 2: coming to an end or already has come to an end. 373 00:22:59,800 --> 00:23:02,560 Speaker 2: And it's not a good thing, in my opinion, to 374 00:23:02,640 --> 00:23:05,720 Speaker 2: want to be dominant in society in that way. But 375 00:23:06,640 --> 00:23:09,080 Speaker 2: there is that very real feeling of, like, the world 376 00:23:09,119 --> 00:23:10,840 Speaker 2: is changing and I feel left behind. So let's put 377 00:23:10,880 --> 00:23:13,480 Speaker 2: it that way. How do you help someone when they 378 00:23:13,480 --> 00:23:15,600 Speaker 2: feel like the world is changing and they're being left behind? 379 00:23:15,880 --> 00:23:18,879 Speaker 2: My feeling is you show them that they're not going 380 00:23:18,920 --> 00:23:21,040 Speaker 2: to be left behind by the world that's coming or 381 00:23:21,080 --> 00:23:23,159 Speaker 2: the world that's already here. You show them there's a 382 00:23:23,200 --> 00:23:25,159 Speaker 2: place for you in this world. The place for you 383 00:23:25,240 --> 00:23:26,960 Speaker 2: in this world is not lesser. The place for you 384 00:23:27,000 --> 00:23:28,679 Speaker 2: in this world is equal. The place for you in 385 00:23:28,680 --> 00:23:31,880 Speaker 2: this world is equitable. And by the way, the right 386 00:23:31,920 --> 00:23:35,200 Speaker 2: wing is going against the word equity, and has been 387 00:23:35,240 --> 00:23:38,760 Speaker 2: for the last few years, because that's really what's trying 388 00:23:38,760 --> 00:23:41,240 Speaker 2: to be achieved.
There's been an understanding of the difference 389 00:23:41,240 --> 00:23:45,560 Speaker 2: between equality and equity, and so now they don't even 390 00:23:45,600 --> 00:23:48,439 Speaker 2: want young people to find it desirable to exist in 391 00:23:48,480 --> 00:23:54,720 Speaker 2: an equitable world. But I think unfortunately, I say unfortunately 392 00:23:54,840 --> 00:23:57,280 Speaker 2: because when you're in a place and you understand something, 393 00:23:57,320 --> 00:23:58,480 Speaker 2: you don't want to have to do the work of 394 00:23:58,520 --> 00:24:00,960 Speaker 2: explaining it to someone else and explaining to someone why, 395 00:24:01,080 --> 00:24:03,240 Speaker 2: like, you want to live in an equitable world. An equitable 396 00:24:03,240 --> 00:24:05,760 Speaker 2: world is good for all of us. But that is 397 00:24:05,800 --> 00:24:07,880 Speaker 2: the work, in my opinion, that has to be done. 398 00:24:08,480 --> 00:24:10,280 Speaker 2: You don't need to coddle people, you don't need to 399 00:24:10,280 --> 00:24:12,359 Speaker 2: make people still feel like... you don't need to protect 400 00:24:12,400 --> 00:24:15,800 Speaker 2: people's privilege. But I keep saying people, I mean white people, 401 00:24:15,920 --> 00:24:19,560 Speaker 2: I mean mostly white men. You don't need to coddle 402 00:24:19,600 --> 00:24:23,080 Speaker 2: them or anything else. But I do think that you 403 00:24:23,200 --> 00:24:26,960 Speaker 2: need to welcome people in and show them that this 404 00:24:27,040 --> 00:24:30,320 Speaker 2: world has a space for them. There's a conversation now 405 00:24:30,960 --> 00:24:37,560 Speaker 2: about feminism and inclusivity. Jonathan had mentioned the, like, feminist 406 00:24:37,960 --> 00:24:40,800 Speaker 2: phrasing of, like, I hate men, and men are like 407 00:24:40,880 --> 00:24:43,840 Speaker 2: this, and painting men with a broad brush.
And I 408 00:24:43,880 --> 00:24:48,120 Speaker 2: don't necessarily, like, find that so problematic, but I do 409 00:24:48,160 --> 00:24:52,320 Speaker 2: feel like movements like that, like social movements for positive change, 410 00:24:52,359 --> 00:24:58,560 Speaker 2: like feminism and like pro-Black movements, can bring men 411 00:24:58,920 --> 00:25:02,280 Speaker 2: and white people and white men in, in a way 412 00:25:02,320 --> 00:25:05,800 Speaker 2: that's like, you're coming along with us. The world that 413 00:25:05,840 --> 00:25:08,640 Speaker 2: we're advocating for is a better world for all of us. 414 00:25:08,840 --> 00:25:11,119 Speaker 2: Come along with us, help us make that world better. 415 00:25:11,800 --> 00:25:13,719 Speaker 2: And it's a tough needle to thread to say, like, 416 00:25:13,800 --> 00:25:16,000 Speaker 2: men have created all these problems, white people have created 417 00:25:16,040 --> 00:25:21,080 Speaker 2: all these problems, help us solve them, because those problems 418 00:25:21,119 --> 00:25:25,680 Speaker 2: also have made your life worse. But I genuinely think 419 00:25:25,720 --> 00:25:27,600 Speaker 2: that that's part of the work that has to be done. 420 00:25:27,640 --> 00:25:29,440 Speaker 2: And I realize I'm saying that as a white man, 421 00:25:29,840 --> 00:25:32,280 Speaker 2: but as someone who lives in the world as a 422 00:25:32,320 --> 00:25:35,000 Speaker 2: white man and interacts with a lot of white men, 423 00:25:35,400 --> 00:25:39,240 Speaker 2: I think leaving white men out is only going to 424 00:25:39,280 --> 00:25:42,120 Speaker 2: make white men feel left out of the world that's 425 00:25:42,119 --> 00:25:44,560 Speaker 2: being created, and the world that we want to form, 426 00:25:44,800 --> 00:25:47,280 Speaker 2: because otherwise the world that gets formed is the world 427 00:25:47,320 --> 00:25:50,479 Speaker 2: that they get called into.
I mean, it's just like, 428 00:25:51,240 --> 00:25:55,200 Speaker 2: children are the future. Teach them well and let 429 00:25:55,240 --> 00:25:56,000 Speaker 2: them lead the way. 430 00:25:56,920 --> 00:26:00,280 Speaker 1: That's the problem, is that, like, evidently they're not 431 00:26:00,400 --> 00:26:01,080 Speaker 1: teaching well. 432 00:26:01,720 --> 00:26:03,439 Speaker 2: Oh, and there's so much that can be said about that, 433 00:26:03,520 --> 00:26:05,199 Speaker 2: and the schools and the right wing and the right 434 00:26:05,200 --> 00:26:07,360 Speaker 2: wing interest in schools and George Bush and No Child 435 00:26:07,440 --> 00:26:10,960 Speaker 2: Left Behind and how George Bush got into office. Danielle, 436 00:26:11,000 --> 00:26:12,919 Speaker 2: we got back to Bush v. Gore again. I do 437 00:26:13,000 --> 00:26:16,080 Speaker 2: that anytime I appear on anything. It's kind of like, 438 00:26:16,280 --> 00:26:18,439 Speaker 2: and who was at the Brooks Brothers riot? Now a 439 00:26:18,480 --> 00:26:19,960 Speaker 2: Supreme Court justice. Anyway. 440 00:26:20,640 --> 00:26:24,080 Speaker 1: A seven degrees of separation. It's like the game that 441 00:26:24,240 --> 00:26:27,320 Speaker 1: was played in the eighties, but it's with, like, the election. 442 00:26:27,680 --> 00:26:29,800 Speaker 2: So what do Democrats do? Democrats need to start 443 00:26:29,800 --> 00:26:31,919 Speaker 2: paying more attention to things when they start happening, and 444 00:26:32,000 --> 00:26:34,320 Speaker 2: not ten or twenty years later when they've already happened 445 00:26:34,320 --> 00:26:36,480 Speaker 2: and it's arguably too late. Hopefully not.
446 00:26:39,640 --> 00:26:41,680 Speaker 1: I don't know if it's not too late. I guess 447 00:26:41,720 --> 00:26:46,040 Speaker 1: that's the problem, is that while I feel like we're 448 00:26:46,119 --> 00:26:51,119 Speaker 1: unearthing these things right now, like the mainstream is 449 00:26:51,280 --> 00:26:54,639 Speaker 1: unearthing these things right now, I don't want to believe 450 00:26:54,680 --> 00:26:57,760 Speaker 1: that it's too late to clean up this mess, because 451 00:26:57,800 --> 00:27:00,800 Speaker 1: I feel like if we do, if we're just like, 452 00:27:00,880 --> 00:27:03,920 Speaker 1: well, fuck it, Steve Bannon had like a twenty- or thirty- 453 00:27:04,000 --> 00:27:07,280 Speaker 1: year head start, so these people are gone. 454 00:27:07,600 --> 00:27:10,560 Speaker 2: Right, we've lost the courts, so we can't keep fighting 455 00:27:10,560 --> 00:27:10,960 Speaker 2: for it. 456 00:27:10,920 --> 00:27:14,520 Speaker 1: Right, so then we just acquiesce, which is part of 457 00:27:15,359 --> 00:27:19,240 Speaker 1: their plan. So I'm just, I'm kind of in the 458 00:27:19,280 --> 00:27:23,359 Speaker 1: place of, we need to understand this. We need to 459 00:27:23,400 --> 00:27:30,000 Speaker 1: continue having these really important conversations so that we don't 460 00:27:30,119 --> 00:27:34,760 Speaker 1: leave groups that are susceptible to MAGA, because it's not 461 00:27:34,920 --> 00:27:37,320 Speaker 1: going to go away. I guess that's my point, of 462 00:27:37,359 --> 00:27:41,040 Speaker 1: how, like, MAGA's not going away just because we have 463 00:27:41,080 --> 00:27:45,280 Speaker 1: an election in November. If, by the grace of God, 464 00:27:45,359 --> 00:27:51,200 Speaker 1: Kamala Harris and Tim Walz are able to win, they'll just retreat, 465 00:27:51,520 --> 00:27:57,679 Speaker 1: clean up, and repackage with a new candidate.
But the 466 00:27:57,840 --> 00:28:03,200 Speaker 1: same principles, the same network, the same web that they 467 00:28:03,240 --> 00:28:07,640 Speaker 1: have built will remain. So it's like the conversation needs 468 00:28:07,640 --> 00:28:11,920 Speaker 1: to continue. Like, these are activated, dare I say, white 469 00:28:11,960 --> 00:28:17,880 Speaker 1: supremacist terrorist cells that are activated one after the other. 470 00:28:18,280 --> 00:28:20,679 Speaker 1: And just because you play whack-a-mole and you 471 00:28:20,760 --> 00:28:24,160 Speaker 1: hit one down, it doesn't, like, you need to pull 472 00:28:24,200 --> 00:28:25,040 Speaker 1: it out from... 473 00:28:24,840 --> 00:28:27,600 Speaker 2: The bottom, activated in a coordinated way. 474 00:28:28,240 --> 00:28:35,040 Speaker 1: Yeah. I appreciate you so much bringing this to my attention, 475 00:28:35,280 --> 00:28:39,080 Speaker 1: to our attention, to the audience's attention, because it's like 476 00:28:39,120 --> 00:28:42,080 Speaker 1: a conversation that I want to keep having, because there's 477 00:28:42,280 --> 00:28:45,760 Speaker 1: just so much there. So, last thoughts to you. 478 00:28:46,760 --> 00:28:49,680 Speaker 2: I've been thinking about action points since we've been talking, 479 00:28:50,000 --> 00:28:54,240 Speaker 2: and Democrats need to stop shutting out young people from 480 00:28:54,320 --> 00:28:59,520 Speaker 2: everything: from movements, from, like, official movements, from the party, 481 00:29:00,120 --> 00:29:05,840 Speaker 2: from narratives, from Democratic media. It's not the Clinton years anymore.
482 00:29:06,560 --> 00:29:09,200 Speaker 2: Democrats need to live in the modern world and welcome 483 00:29:09,240 --> 00:29:11,800 Speaker 2: younger people and welcome younger voices in a broader way 484 00:29:11,920 --> 00:29:15,720 Speaker 2: than they're currently doing right now, because I feel that 485 00:29:15,920 --> 00:29:19,760 Speaker 2: otherwise the same cycle of not recognizing patterns until it's 486 00:29:19,800 --> 00:29:21,200 Speaker 2: too late will continue to happen. 487 00:29:22,000 --> 00:29:27,040 Speaker 1: Yeah, it's truly, it's extraordinary, and I so appreciate you 488 00:29:27,560 --> 00:29:30,920 Speaker 1: for bringing us this conversation. We need to have more 489 00:29:30,960 --> 00:29:34,480 Speaker 1: of these so that people get a sense of what's 490 00:29:34,520 --> 00:29:37,120 Speaker 1: going on. And this, you know, frankly, to the listener, 491 00:29:37,200 --> 00:29:40,800 Speaker 1: is like, this could be your kids, and like your grandkids, 492 00:29:40,840 --> 00:29:43,959 Speaker 1: and like your roommates, folks that you don't really know, 493 00:29:44,160 --> 00:29:48,680 Speaker 1: but you're like, something seems off. We're not having the 494 00:29:48,720 --> 00:29:53,680 Speaker 1: conversations with each other. And I think that that's extraordinarily important. 495 00:29:54,400 --> 00:29:56,280 Speaker 2: I'll button real quick to say, actually, because I 496 00:29:56,280 --> 00:29:58,240 Speaker 2: talked about one-on-one intervention before, and to go 497 00:29:58,280 --> 00:30:00,840 Speaker 2: from the systematic to the listener: talk to your loved ones, talk 498 00:30:00,880 --> 00:30:03,040 Speaker 2: to your family, talk to your neighbors, talk to people 499 00:30:03,080 --> 00:30:06,600 Speaker 2: in your communities, and don't start arguments if you can 500 00:30:06,600 --> 00:30:09,560 Speaker 2: help it.
But also don't just passively accept that they 501 00:30:09,560 --> 00:30:13,080 Speaker 2: have hate in their heart. Bring them in and try, 502 00:30:13,800 --> 00:30:16,600 Speaker 2: if you haven't already, try to help them see the 503 00:30:16,640 --> 00:30:19,560 Speaker 2: light before you see them as a lost cause. 504 00:30:20,280 --> 00:30:24,960 Speaker 1: One hundred percent. Andrew Marcello, thank you so much for 505 00:30:25,080 --> 00:30:29,120 Speaker 1: coming from behind the scenes to the microphone to have 506 00:30:29,200 --> 00:30:35,959 Speaker 1: this conversation on Woke AF with me. Greatly appreciate it. That 507 00:30:36,120 --> 00:30:40,200 Speaker 1: is it for me today, dear friends, on Woke AF. As always, 508 00:30:40,400 --> 00:30:43,400 Speaker 1: power to the people, and to all the people, power. 509 00:30:43,680 --> 00:30:45,960 Speaker 1: Get woke and stay woke as fuck.