Welcome to It Could Happen Here, a podcast about things falling apart and sometimes putting things back together. And, you know, today we're doing an episode that's kind of more on the intellectual and emotional end of a very specific set of things falling apart. And rather than clumsily try to introduce it myself, I'm going to bring on the person who I think has put into words the thoughts that have kind of been going through my head; I know they've been going through the heads of a lot of the folks that we have here at Cool Zone for quite some time now. Thoughtslime, you are a YouTuber, and a good YouTuber, who does a number of videos. Some of your recent ones are thoughts on AI art, a timeline of Elon's Twitter mistakes. You did a really fun video on the QAnon Queen of Canada, who is a pretty problematic character. Welcome to the show.

Thanks for having me, happy to be here.

Do you want to just kind of start by reading us that thread? Because you posted this on Twitter the other day and I started chatting with you about it, and then we moved over to DMs and decided we should kind of do a little more formal thing.

Yeah. So basically I said that I'm constantly considering making a Why I Left the Left video about how my views have not changed one iota, but I've become completely disillusioned about my role in communicating them. Part of the reason I shifted my focus to trying to be just entertaining is because deep down I don't really see a lot of value in getting people on my side anymore. I don't think it does anything or means anything. But the best I can do is give you information and hopefully a laugh. I used to feel like I was participating in something bigger than I now think I really am, that I was helping in some small way towards a sort of shift towards a more revolutionary mass consciousness. I think that was a bit of a childish fantasy, in retrospect.
Sometimes people will say, you made me an anarchist, and, like, buddy, I don't even think it matters that I myself am an anarchist. And I regret that that sort of we're-fighting-the-good-fight mentality has allowed some of the worst grifters on the platform to flourish by manipulating people's passions for their own weird, petty reasons. I think what I do has a lot of value, but I'm just saying that what I perceive that value to be is a lot different than what I thought it was a few years ago. That's basically what I had to say.

Yeah. That, I think, does such a good job of nailing the problem that I've been kind of dealing with emotionally as well, which is, it'd be easy to sum it up as, like, I no longer believe in, you know, trying to transmit leftist ideas or political analysis, or that I don't believe in the value of, like, trying to inform people about the world, because that's not how I feel. But there has been this shift, and I think the high point for the version of me that was optimistic about the ability to use mass media to build power and the ability to take effective action on the left, I think that kind of crescendoed, I'm gonna say, June of twenty twenty, and it had a pretty sharp drop after that point. And I think it's valuable to still acknowledge kind of how remarkable what happened in twenty twenty was. For all of its flaws and all of the really messy fallout from it, we saw an uprising of unprecedented scale, and part of why the crackdown in response has been so gnarly is that it scared the hell out of a lot of really unpleasant people.
And the media had a significant role to play in that, both in the fact that there were a lot of people who were kind of already organizing and radicalized when the shit started to hit the fan, and that as things happened, you know, what was happening in the streets, what the police were doing, the different kinds of marches and different campaigns that were being started, got spread to people. And I do think that, you know, folks like you and me were a part of that. Although it never is far from my mind that the most influential piece of media that was recorded and disseminated during twenty twenty was the video of George Floyd being murdered, which was filmed by, you know, someone who just happened to be nearby and had the courage to film it, not a professional journalist, not an influencer, not somebody who was a professional political thinker. And everything else combined didn't have the influence of that video.

Yeah. I think that kind of gets to the heart of it, right? We express support for ideas, and thus people tend to treat us as though we are the progenitors of those ideas, or the guardians of those ideas, or the leaders of a kind of decentralized proxy party of some kind.

Yeah, it's both, because I think, thankfully, there's that. I mean, there's always going to be, everyone who makes popular media forms a little cult, and so there's always going to be a number of people who, you know, take any given person in the media more seriously than they deserve. And that includes the both of us. And that's not attempting to be, like, humble or anything. That is simply a fact of how mass media works.
I do think we've seen, I think there's been a mix of a healthy pushback against looking at people who are creating popular media as more than what they are and more than what that media is capable of being. I think there has been a pushback against that in the last couple of years that's been healthy, and I think there's been a pushback that's been unhealthy. I think people have forgotten some of the lessons. For one, I think a good example would be, there was a very justified backlash against streamers, and when I say streamers here, I'm referring to people who are actually in the streets streaming during riots and protests and whatnot, right? And the justified part of that backlash was due to the fact that, past a certain point, those streams were primarily being used by law enforcement, both to get charges on people and just to know where folks were, as an intelligence gathering method. And I think the backlash was understandable. There was a lot of ugly behavior, including people who kind of got in after the early portions of that in order to make shitloads of money by, you know, streaming people getting the shit beat out of them by the cops, and that was, I think, a very justified, pretty aggressive social response. But I think it's also caused a lot of people to forget that a huge part of why things kicked off in twenty twenty, and why so many people got involved, was Niko from Unicorn Riot on the ground every night in Minneapolis, doing one of the most impressive pieces of citizen journalism that I think we've seen in this country. And so I do think that some of what's frustrating here is that it's difficult for people.
It's difficult for us as a community to take some of the proper lessons from these things that are happening, from the push and pull of the conflict that we all find ourselves in, in part because the way people express their understanding of these lessons via social media is very geared towards flattening them and making it a very simple matter of this is bad or this is good, and not, well, in this period of time this worked, and then it didn't. You know, there's no real sense of proportionality in these discussions. It isn't just a matter of, like, hey, you fucked up, you should probably take this down, or this could be dangerous if you leave this up or if you continue to do this. It's more so, like, what, are you a cop? Are you some kind of cop, doing this? You know, let's spread that rumor.

And, I mean, yeah, the cop jacketing thing is kind of one part of the problem. But I want to focus a little bit on what you were talking about, in terms of, what do you think, as you're kind of looking at, you know, and we're all kind of staring at twenty twenty four as it approaches, what do you think is useful from media that attempts to analyze and share perspectives that are left wing, that are anarchist inclined? What do you think is actually the value that can be added to attempts to achieve greater justice in our society?

Well, I think the answer is twofold. I think, firstly, anything that drives people to, like, real life organizing and taking action outside of online spaces is obviously useful. Beyond that, though, like, I think there is some value to just exposing people to ideas that they might not have found otherwise. But I think that a lot of that has been accomplished now.
I feel like a lot of people are more familiar with kind of the leftist ideology 101 type of content that people might expect, in that way. So, yeah, I would say those are the two value propositions.

I wonder if you think a lot about, because one thing that concerns me, obviously: any community develops a language that is to some extent its own, and that's part of politics, you know, political analysis. If you're looking at things with a Marxist analysis, or if you're analyzing things, you know, based on your understanding of generations of anarchist political philosophy, there's terms that you're going to use that other thinkers have created, that are the terms that people use to discuss those ideas. But it is sometimes kind of a thin line between that and the thing that cults do, where they come up with a bunch of specific terms that no one else uses in order to separate a community from everyone else. And obviously I don't think there's any intentionality there. I don't think people who are talking about, you know, the dialectic or whatever are attempting to separate their listeners from the mass of humanity. But I do think that happens sometimes. And when I listen sometimes to conversations on the left about justice, in particular about social justice, I wonder, like, well, how is somebody who isn't, like, reading all this shit going to interpret this? Is it just going to sound like nonsense to them? And I think maybe, like, part of the purpose, the positive purpose, of mass media that looks at things from the left is trying to communicate with folks who are not going to sit down, or at least who have not yet sat down and done a whole bunch of reading on the history and the politics, but whose hearts are in the right place, and who I would like to be able to engage in conversations with, and with folks who maybe kind of get their heads a little bit too full of specific terminology sometimes.
I think it's a specific balancing act, because, on the other hand, like, you also have to give your audience a little credit. Absolutely. But I think that, like, you have to be able to meet people where they're at. But at the same time, like, if someone has expressed this idea in a way that's already sufficient, why do the work of, like, trying to re-explain that, you know? But that being said, I think there is a tendency to just assume people already are on our side, or understand ideas to the level of complexity that we might like, and that people are on board with, like, even something as simple as what capitalism means. You know, all the time you see people online who will say that, like, a musician will post their Bandcamp page and people will be like, oh, I thought you were anti-capitalist, you know. Yeah, it's like, you know, but, like, you also can't get caught up in the kind of weaponized ignorance that people have, you know. Like, you can't make someone understand something if they have a particular reason not to want to. So I absolutely agree that, like, there's the danger of that in-group speak, but it's a difficult problem to solve. I think the kind of approach I take to it most of the time is that I tend to write my scripts as though I am just, like, a child. Like, I try to write as though I'm speaking to a five-year-old, you know.

Yeah, I mean, I also think a lot about, and this is something, you know, here at Cool Zone: a couple of years ago we brought on people who, you know, are now making podcasts for the team, who, when we brought them on, had a lot less experience writing scripts and making media for mass consumption.
And one of the things that I found was kind of, like, my job to do repeatedly was to point out, like, okay, stop, actually go back to that term, because you just, you know, said a term that I think means something specific, or you just referenced a thing from history that I think people are interested in and should know about. But you do have to, like, go in and explain it and walk people through it. And that's really one of the challenges I find, particularly with Behind the Bastards, right, where we're talking sometimes about these complicated social movements and moments in history, and it's this kind of tug of war between, you want to respect the intelligence of the audience, and you want to give them enough detail that they have context and can maybe understand multiple sides of it, but also you can't get bogged down in every detail, otherwise you're never going to finish the damn thing. And we can't all be Dan Carlin making ten-hour-long podcasts, unfortunately. I do, like, there's a degree to which I'm quite jealous of his work, the way he set up his workload. But I would just never be able to think of that many boxing analogies.

Yeah, I don't know very much about boxing. I would probably just, like, throw in a whole lot of Balls Mahoney analogies.

Yeah, for me, it would be a lot of Super Punch-Out references.

Like, hell yes. Stan Lee would always say to comic book writers that every comic is somebody's first comic, and so you kind of have to consider that, like, every piece of messaging you do might be, like, the first time someone is stepping out of a completely different ideological bubble than you might expect, and so, you know, the messaging kind of has to stand on its own. But I think that's also, like, a unique problem to mass media, because it also means that, in a sense, it's much harder to, like, build on previous work.
It's harder to, like, go from your 101 content and then get to the more advanced subjects, because someone could just start at the more advanced part and get lost.

I think that's a really apt way of describing what I also find as one of the central problems, because a ton of the episodes of Bastards, especially the stuff when we focus on fascists, builds on itself. Right? Your understanding of fascism in Romania will be influenced, and is to some degree; you can't really understand fascism in Romania without understanding fascism in Weimar Germany, fascism in Italy, fascism in the United States during the same period, and vice versa. And so my hope is that the people who catch all of the episodes are building a really complex and durable understanding of the problem through it. But it's also the struggle of, like, well, a lot of people are just going to be like, oh shit, I know Hitler, but maybe I'm not interested in hearing about Romania, you know, and I'm not going to click on those episodes. And there's nothing against people; like, when I listen to podcasts, I find myself doing the same thing, where it's like, there's a million episodes of this show, I'm not going to listen, I don't have the time to listen to all of them.

Sure. Yeah. And that touches on another problem, which is that, you know, the subjects that people like us tend to cover are biased towards what we think people will find interesting.

Yeah, you know, and beyond that, like, what we ourselves find interesting to research.

Yeah, and what you can. And this is a thing that I try to point out on my subreddit sometimes, when people are like, I can't believe you haven't done this guy or that guy, and it's like, well, doing that research is going to fuck me up, and so I'm not going to do it yet. I'm gonna do this thing that's funny. I'm gonna read about the Liver King this week. I need a break.
So the Liver King is who we're talking about. Yeah, everybody needs a Liver King in their life at some point.

Yeah, it's like, I read The Turner Diaries for one video, and people have been constantly like, oh, you should read Camp of the Saints, you should read Siege. And I'm like, oh, I don't know if I want to. First of all, I don't even know if I want those things on my hard drive.

Yeah. Camp of the Saints is a little easier, but, yeah, maybe one of those a year and no more. That's, like, the most I would recommend from a mental health standpoint. It's also, like, you don't need to read the full text of all of those. I mean, that's part of the thing, is that, like, you can get a lot by checking out some excerpts and reading scholarly papers analyzing this stuff, and there always will be that. And I think, to a significant extent, like, it's more important to understand, you know, and this isn't true for everybody, because there's some people who, you know, are scholars of this stuff, and you do need to do the deep reading. But if you want to understand the degree to which Siege and The Turner Diaries influence the mass shootings that we see in the United States that are carried out by the far right, you don't need to read those books to do that, right? There's plenty of really good scholarly analysis, and that's part of what you and I try to do for people, and what other, you know, folks who are creating this kind of media, other journalists, do for folks.

Yeah, I will say that I strongly balk at that; I don't consider myself a journalist.

Yeah, I mean, and that's something people talk about as well. On the subreddit, I get a lot of, like, comments of people appreciating the journalism in the series, and we do, in some of our shows; like, you know, we went to the border of Myanmar last year.
Garrison just got back from Cop City. But, like, Bastards isn't journalism, you know. Sometimes it's, like, celebrating journalism, but it's entertainment that I hope has, like, an educational quality to it.

Yeah. I don't say this to belittle myself, I just don't see that as the function of my job. I think, like, I have, in the course of my work, occasionally done journalism by accident. I did a long interview about the CHAZ and kind of the misconceptions that people had, and I had some, you know, talks with people within it, and, like, that is technically, on its face, a piece of journalism. Sure, you know, absolutely. It's not what I consider my strength or role to be.

Well, and honestly, this goes back to what we were talking about with the young woman who filmed the video of George Floyd. Journalism is a profession, but it's also just, like, a set of tools, and, you know, sometimes you will use those tools in order to do other things.

You know, that's certainly true.

I'm curious. You and I both kind of, like, make our work work differently. Mine's ad supported, obviously, so my conversation with fans, you know, outside of, like, when I'm doing a live show, is primarily through, we have a subreddit and we have Twitter. And, you know, there's some difficulty there. For one thing, like, with every single guest we have, there are people who will be like, this is the best guest you've ever had, and, this person is the worst guest you've ever had, and there's absolutely no way to make decisions based on that, right? It's just... You've got a different relationship, or at least a different method, I think, of communicating. I imagine it's different because you're Patreon supported. I'm interested in, if at all, how you've seen kind of the conversations about what people want from you,
and, you know, the way in which you've been talking with your fans, change since twenty twenty?

Well, I think one of the major ways is, since I've kind of taken a step back from the explicitly political content, a lot of people have kind of encouraged me to go more in that direction, and I have seen, like, a big drop in my support as a result. I think that it's a tricky balance to strike. Again, many of these things are, like, such a balancing act, because I always am careful to remind people that, like, hey, you can support me on Patreon if you like what I'm doing and want there to be more of it, but please don't operate under the assumption that doing so is activism or contributes to activism, because it is not. You are not, like, making the revolution. Exactly. You know, you are getting a little drawing that I'm going to put at the end of my video; like, that's the value proposition here. And I think that, you know, the reason I don't accept ad reads on Thoughtslime, I do on Scaredy Cats, is because I don't want the perception that my views are going to be limited or held back by, you know, the desire to seek out advertisers, which, whether or not I would have the integrity to withstand it, it would create the illusion. But that creates the problem of, well, now I kind of have to do what I think that my audience will want, and that's its own kettle of fish. Like, am I pushing people to donate more than they might be comfortable with? And so, you know, I don't really know, like, the ethics of it, to be perfectly frank. There have been times when people have made big donations and I've had to message them and say, like, hey, I think you should probably take this money back. You probably weren't thinking straight when you sent me this money.
I think you should probably have it back.

Yeah. That's such an interesting thing for me, because, you know, I've thought about that myself quite a lot. You know, I had a decision to make when we first started doing these shows about how it was going to be done, and I took the ad-supported corporate route, and I've been very happy with that so far. There's a lot of things that's let us do. There's certainly downsides to it, you know, including occasionally advertising for the Washington State Highway Patrol. But, you know, it's one of those things. I made a comment, and this is one of the frustrating things about making media for a large audience, there's always going to be people who will, like, read into what you've said something you never meant. I made a comment once, because we get people asking, why don't you do a Patreon or whatever, why do you do it this way? And I just made a comment expressing what you had just expressed, like, well, you know, I feel weird sometimes asking for money, and if I can just, like, get money from a big company and, you know, hire my friends and do my work, I feel okay doing that. It's how most of my career has worked, so that's what I'm most comfortable doing. And yes, there were people who took from that, like, well, Robert doesn't think it's ethical to have a Patreon. It's like, half of my friends make their living on Patreon. I do not have an ethical problem with supporting yourself that way.

I will say that when I heard you mention that in an episode, it did send a chill down my spine briefly.

No. I mean, I think, like, Cody Johnston, who I've worked with for, what, fifteen years now, has a massive Patreon. Tom and Dave both are, like, some of my best friends, you know.
Yeah. Like, I think it's certainly no less ethical, and you can make a case, people do, that it's more ethical than being ad supported. It's just, like, I mean, some of it just comes down to, like, what kind of stuff are you making, and what kind of a person are you, and what's going to work best for you as, like, a creative method and a way of interacting with fans. And they have downsides and they have positives, you know.

It's also, like, a matter of what you're able to do, to a certain extent, because, like, I don't know how to get advertisers. Any advertisers that I've ever gotten on my horror channel have just reached out to me, and, like, I don't know if I'm getting as much money out of them as I should be. I have no idea. I just kind of wing it, you know. Like, if you have that background in radio or broadcasting or what have you, it can, you know, be a much more viable option for some people than it is for others.

Yeah. Yeah, I mean, a lot of why it works for me the way that it does is because I've had a fifteen-year career, not in broadcast, but, you know, in comedy writing and whatnot. And so, I mean, that's how I got my podcast hosted on iHeart in the first place, and that's, like, a thing. And this is actually one of the things that concerns me most about the shit that's happening with AI right now, because, you know, there's this... The folks that I kind of came into making media with, all of us started in fairly apolitical comedy. I think that's Cody Johnston, right, Some More News. Cody was making videos about, like, Chatroulette and penises when we all started working together. Very funny videos, but, like, we were making silly things, and everyone has kind of moved into making, like, you know, pretty serious fact-based media.
You know, Cody does a very popular, very political kind of current events show. And we were able to get good at making the kind of media that we made, and build the connections that we built, and build the audiences that we built, because we had years of time where you could make a decent living writing stuff for the Internet. And I see the kind of shit that I'm afraid AI is going to do to these jobs where people would get their start as writers and whatnot. Maybe it wasn't the best, you know; you're not doing the best writing you're ever going to do in the jobs that get replaced by AI, but it's a foot in the door. And I feel like I keep seeing the room for people to put their foot in the door get smaller and smaller every year, and that worries me a lot.

I definitely know what you mean. I also feel that, like, there's a fear among some people that, like, you get crowded out of these spaces the more people there are doing this sort of thing, and I kind of feel like that's not the case. Like, the AI stuff, I definitely share your concerns, but, yeah, the institutional barriers in people's way, like, I think that, to be frank, like, I started doing this on a shitty two-hundred-dollar computer and a completely legal video editing software. But, I love video editing software; I found it in a dumpster and I used that, so, you know. And then, like, through that, I was able to afford a fancy camera and some lights and, you know... But, like, I didn't know what I was doing, like, it was all self-taught. And I think there has to be that kind of DIY attitude, yeah, for people. And it is something I try to encourage in people, is that, like, just do it. Like, I did it, you can do it. Yeah, you know.
I think that's a great point, because I am coming at this from the old-man doomerist perspective of somebody who, like, the world has changed from the way it was when I was young, and people don't get their careers started that way anymore. And your point is very valid that, while changes in the industry have closed specific doors, they've also created some. And I think, probably, in the long run, it is better for people to get their foot in the door doing what you did than rewriting a bunch of press releases about tech gadgets for a shady website that takes advantage of the Google algorithm, which is how I started my career.

I think that's actually a really valid... I don't think, yeah... It depends on your end goal too, right? But I think, like, the thing that becomes incumbent on people like me is to, like, help people, you know. Like, I've experienced a certain amount of success, and I attribute that largely to the fact that, like, when I was just starting out, I had no idea how to make people see my shit. Like, I did not know what I was doing. And a bigger creator just reached out and was like, hey, can I share your video? I think it's really good. And it kind of snowballed from there. So my philosophy has always been, like, you make space to lift people up with you. And it's not an entirely selfless gesture either, because in doing so, if there's an extremely talented person who succeeds partially because you helped them, now you have a connection to an extremely talented person, you know. Yeah, like, that's a sense of, for lack of a better term, mutual aid, in a very loose sense, I suppose.
That reminds me of something a good friend of mine, a colleague at Cracked who now helps run the Small Beans podcast network, said to me years and years ago when he was directing a video, which is: I want to spend the rest of my career getting hired and fired by my friends. Which is, I think, a nice way of looking at it. And there's a degree to which it's a very old Hollywood way of looking at it, but it also works very well, it can work very well, in this new kind of ecosystem that is still being put together. And I do think that... because I see a lot, and I'm not someone who spends a lot of time, like, I like to watch the stuff that you put together, the stuff Hbomberguy puts together, where it's actual, like, videos on topics and I'm learning something, the stuff that Dan Olson puts together. You know, I'm not so much into, and this is not, I'm not attacking anybody, I'm not, like, trying to shit on the field, but personally I don't watch, like, the just kind of, like, stream stuff a lot. And it does seem like there's a lot of conflict between people in that, and I'm wondering, you know, my hope is that there's more people building connections to create resiliency between the people who are trying to make good shit, and trying to make stuff that people enjoy, and that has an impact on people, and that even changes people in positive ways. And it sounds like, from what you're talking about, and, you know, honestly, from what I experience too, I do think that's more the case than, like, the drama that goes viral on Twitter from time to time.

Yeah, I think, you know, I hope so too. I think that it's very easy to piss people off, and it's much harder to get people's attention by being kind.
But, you know, look, 585 00:32:16,920 --> 00:32:19,560 Speaker 1: how many nice comments do I get in 586 00:32:19,640 --> 00:32:22,560 Speaker 1: a day? I can't count. But the one shitty comment 587 00:32:22,640 --> 00:32:24,640 Speaker 1: will always stick out. It's the same way: if 588 00:32:24,880 --> 00:32:29,320 Speaker 1: I have a thousand pleasant interactions with someone else, 589 00:32:29,920 --> 00:32:32,280 Speaker 1: nobody notices. But if 590 00:32:32,320 --> 00:32:35,040 Speaker 1: I pick a fight with somebody, 591 00:32:35,080 --> 00:32:37,920 Speaker 1: people are going to remember it forever. I think that's the thing 592 00:32:38,000 --> 00:32:40,520 Speaker 1: that unsettles me most. And this isn't even just 593 00:32:40,600 --> 00:32:43,920 Speaker 1: about streaming media or leftist media 594 00:32:44,000 --> 00:32:46,800 Speaker 1: or whatever. This is a problem of social media. 595 00:32:46,880 --> 00:32:50,560 Speaker 1: You're right, it's the fights that always get 596 00:32:50,640 --> 00:32:54,160 Speaker 1: most of the attention. I mean, 597 00:32:54,440 --> 00:32:57,240 Speaker 1: not entirely, because some of the big moments, 598 00:32:57,360 --> 00:32:59,800 Speaker 1: particularly in recent left-wing media, are things like, 599 00:33:00,560 --> 00:33:03,080 Speaker 1: you know, people doing these giant streams that raise huge 600 00:33:03,120 --> 00:33:05,240 Speaker 1: amounts of money for a cause. So that certainly 601 00:33:05,360 --> 00:33:08,680 Speaker 1: is a thing that happens and does get a lot 602 00:33:08,720 --> 00:33:11,760 Speaker 1: of attention when it does happen. But you are fighting 603 00:33:11,800 --> 00:33:14,040 Speaker 1: against, and I think we have to be consciously fighting 604 00:33:14,080 --> 00:33:17,680 Speaker 1: against, this system that does want to engender conflict. Yes, 605 00:33:18,880 --> 00:33:22,120 Speaker 1: and it's also kind of difficult, and, you know, 606 00:33:22,200 --> 00:33:24,160 Speaker 1: keep in mind this is perhaps coming from a 607 00:33:24,200 --> 00:33:28,120 Speaker 1: biased perspective, when there are individuals, and I'm not going 608 00:33:28,160 --> 00:33:31,920 Speaker 1: to name names, who do see that as an easy 609 00:33:32,000 --> 00:33:38,360 Speaker 1: source of generating attention. It's very easy to do. The same 610 00:33:38,360 --> 00:33:40,080 Speaker 1: way that, like, if I'm going to make a video 611 00:33:40,400 --> 00:33:43,320 Speaker 1: on a subject, I will frame it as I'm 612 00:33:43,320 --> 00:33:47,480 Speaker 1: disagreeing with Ben Shapiro or I'm disagreeing with Jordan Peterson. 613 00:33:47,880 --> 00:33:50,600 Speaker 1: It's very easy to go, look at Thoughtslime, he's 614 00:33:50,600 --> 00:33:53,120 Speaker 1: a big piece of shit because he thought this, when 615 00:33:53,280 --> 00:33:56,200 Speaker 1: actually this is the truth. That's more attention-grabbing 616 00:33:56,200 --> 00:34:00,920 Speaker 1: than just, you know, a kind of neutrally positioned argument. Yeah, 617 00:34:00,920 --> 00:34:06,000 Speaker 1: so it's a tricky problem. Yeah, 618 00:34:06,480 --> 00:34:09,319 Speaker 1: it's one of the ones that I think on 619 00:34:09,560 --> 00:34:12,040 Speaker 1: quite a lot. Well, I think that's most of 620 00:34:12,040 --> 00:34:14,920 Speaker 1: what I wanted to talk about today.
Did you want 621 00:34:14,920 --> 00:34:18,719 Speaker 1: to, like, throw in anything else? Or if not, we can 622 00:34:18,760 --> 00:34:21,760 Speaker 1: go to plugs. Yeah, I mean, I'm good. That's pretty 623 00:34:21,800 --> 00:34:24,040 Speaker 1: much it. I will say that one of the things 624 00:34:24,040 --> 00:34:26,960 Speaker 1: that tends to bother me the most is people will 625 00:34:26,960 --> 00:34:30,560 Speaker 1: occasionally say to me, they'll send a message 626 00:34:30,560 --> 00:34:33,319 Speaker 1: saying, you seem like a really good person, and I 627 00:34:33,360 --> 00:34:37,319 Speaker 1: will say thank you. But please don't feel that way 628 00:34:37,400 --> 00:34:42,040 Speaker 1: about content creators, because why would I make a work 629 00:34:42,200 --> 00:34:45,359 Speaker 1: that portrayed myself as a bad person? And while I, 630 00:34:45,640 --> 00:34:47,960 Speaker 1: in my mind, think I am a good person, I 631 00:34:48,000 --> 00:34:51,400 Speaker 1: think it sets the dangerous precedent that you could allow 632 00:34:51,400 --> 00:34:54,359 Speaker 1: yourself to be emotionally manipulated by someone else who might 633 00:34:54,360 --> 00:34:57,480 Speaker 1: not be. Well, the name of the game, when you 634 00:34:57,480 --> 00:35:01,120 Speaker 1: are creating media, particularly media that's meant 635 00:35:01,120 --> 00:35:05,600 Speaker 1: to make people feel things, part of that is manipulation, right? 636 00:35:06,000 --> 00:35:09,400 Speaker 1: Manipulation is not an inherently negative term. You know, Stanley 637 00:35:09,480 --> 00:35:12,360 Speaker 1: Kubrick is trying to manipulate you when he makes a movie. 638 00:35:13,239 --> 00:35:16,080 Speaker 1: I'm trying to persuade you. Yeah. It 639 00:35:16,160 --> 00:35:20,040 Speaker 1: is incumbent upon the audience, for their own protection, 640 00:35:20,080 --> 00:35:22,240 Speaker 1: to keep that in mind. And it's incumbent upon ethical 641 00:35:22,280 --> 00:35:26,399 Speaker 1: people who make stuff to not create cults, at least 642 00:35:26,480 --> 00:35:30,920 Speaker 1: not create too many cults. Yeah, as much as you 643 00:35:30,960 --> 00:35:33,080 Speaker 1: can avoid it, for sure. Yeah, all right, you want 644 00:35:33,080 --> 00:35:36,640 Speaker 1: to plug your pluggables? Sure. You can find my work 645 00:35:36,680 --> 00:35:39,880 Speaker 1: at YouTube dot com slash thoughtslime or thoughtslime dot com. 646 00:35:39,920 --> 00:35:44,480 Speaker 1: You can also find my horror content at YouTube dot com 647 00:35:44,480 --> 00:35:48,560 Speaker 1: slash Scaredycats TV. Scaredycats was taken. That's me. That's what 648 00:35:48,640 --> 00:35:54,360 Speaker 1: I do. I make videos about farts and/or butts. Well, 649 00:35:54,440 --> 00:35:56,719 Speaker 1: thank you so much for coming on the show. That 650 00:35:56,880 --> 00:35:59,480 Speaker 1: is going to be it for us today. We will 651 00:35:59,520 --> 00:36:06,719 Speaker 1: be back probably tomorrow. It Could Happen Here is a 652 00:36:06,760 --> 00:36:09,600 Speaker 1: production of Cool Zone Media.
For more podcasts from Cool 653 00:36:09,680 --> 00:36:12,520 Speaker 1: Zone Media, visit our website coolzonemedia dot com, or 654 00:36:12,560 --> 00:36:15,239 Speaker 1: check us out on the iHeartRadio app, Apple Podcasts, or 655 00:36:15,280 --> 00:36:18,239 Speaker 1: wherever you listen to podcasts. You can find sources for 656 00:36:18,320 --> 00:36:21,359 Speaker 1: It Could Happen Here, updated monthly, at coolzonemedia dot 657 00:36:21,360 --> 00:36:23,560 Speaker 1: com slash sources. Thanks for listening.