Speaker 1: It's Oregon. You don't get to consent, and that's how you open up the podcast.

Speaker 1: That's right, baby, look up one-party consent laws for recording. Um, this is It Could Happen Here, a podcast about when you can legally record people without their consent. Hint: always, in the state of Oregon. Um, I'm Robert Evans. We're talking about bad things, good things, things that are good and bad, all that stuff. You know what we should talk about? You know what no one has talked about ever on the internet lately? Josephine Robinette Rogan.

Speaker 1: Oh, I've never heard of him. What does he do?

Speaker 1: Well, he has a podcast. Have you heard a podcast, Garrison?

Speaker 1: I'm unfamiliar, but I'll just go with it.

Speaker 1: Yeah. Well, it's like the radio, but easier to spread disinformation. Um, and also sexier, for reasons that are hard to explain. Uh, and Joe Rogan gets on his podcast and he says a lot of stuff that people think is bad, and then everybody gets angry at him, and then he makes more money. And today we're going to talk about how maybe we could handle this problem differently. Maybe we could not do the same thing over and over expecting different results. Yeah, and up front, obviously we're talking about him, but we're trying to talk less about specifically what he's said and more about the problem he represents and the ways in which the responses people have aren't having the results they desire. We're gonna avoid using his name in the title of the episode or the description, so it doesn't feed into the algorithm in the same way. But yeah, Garrison, you want to kick us off here?

Speaker 1: Yeah, I've been watching the Rogan thing online and getting kind of frustrated, because of the way the discourse is going. It's just repeating the same loops we see every few months, and nothing really changes and Rogan just gets more popular. So earlier this month, or I guess it was closer to the middle of January,
there was a group of about two hundred and seventy doctors, healthcare workers, um, and scientists who were campaigning for Spotify to adopt a misinformation policy. Um, this was prompted by a few episodes of the Joe Rogan podcast that we've actually already talked about, um, featuring Dr. Robert Malone and someone else who said some stupid things about the pandemic. So, right, when we talked about these episodes last time, I tried to actually talk about what these doctors were doing, and not focus on Rogan himself but specifically on what these doctors were doing and their ideology, because I didn't want to add to the whole Rogan side of the discourse. Um, and you know, in this letter that these doctors sent to Spotify, they were not really advocating for Rogan to be removed from the platform, um, or even for episodes to be removed, just for Spotify to clarify their guidelines regarding medical misinformation. Because, you know, and it's important to note, Joe Rogan has an exclusivity contract with Spotify. He does not work for them, but Rogan gets paid a lot of money to get his podcast published only on Spotify's feed. So it's a weird kind of setup, and it can leave a lot of gray area for, like, does Spotify count as his publisher or not? And you're like, well, not really, because he could also just end that contract and post his podcast everywhere.

Speaker 1: Um, I mean, there's probably some sort of time limit on the exclusivity agreement, et cetera. Um, but it is mixed, because when it recently came out that he'd said the N word a whole bunch of times, Spotify removed those episodes. So there's a degree to which they have acted as a publisher.
Speaker 1: Yeah, there's a whole lot of stuff to talk about with this. So the letter went kind of viral, and it prompted this whole thing in the middle of January of, like, deleting your Spotify subscription, and then we had musicians, most notably Neil Young, decide to remove all their music from the Spotify platform, um, as like a performative thing, being like, okay, if Spotify is gonna host all this medical misinformation, we're going to remove our music as protest. Now, of course, Neil Young then just signed an exclusivity deal with Amazon, so, oh great, cool.

Speaker 1: Yes, Amazon, the bastion of moral purity.

Speaker 1: Yeah, and I mean, I think they probably do pay a better rate, because Spotify is pretty much at the bottom for musicians, but I don't think it's good. Um, I think Napster actually has the best rate, or close to the best rate, which, I mean, if you want to be actually moral, just use Bandcamp. But I mean, I use Spotify because it's really easy, and that's why Spotify works. It is a well-made product. That does not mean it's an ethical product, but it does the thing that it's supposed to do quite well. So yeah, basically we've had endless discourse since then about Joe Rogan, about Spotify as the platform, talking about how bad Spotify is, which yes, it is bad, and talking about how bad Joe Rogan is. And you know, the thing is, Joe Rogan already had the most popular podcast in the world before his exclusivity deal with Spotify, and he's currently estimated to bring in eleven million listeners per episode of his podcast.

Speaker 1: Yeah. For some reference, um, Behind the Bastards is one of the largest podcasts out there, um, and he does on average something like ten times our traffic. Uh, and it's not just that he's the most popular podcaster. He helped invent what podcasting is.
He was one of the first, and he had a foundational role in how the entire industry works.

Speaker 1: Since this letter and since these episodes, there's been a whole lot of discourse around whether Spotify should remove Joe Rogan from the platform, um, whether they should cancel his deal. You know, a lot of people calling on Spotify to do that, a lot of people calling on Spotify to remove certain episodes, and Spotify has not been keen to do so. Although, I know Joe Rogan himself did actually authorize the removal of a certain number of episodes, for reasons we'll talk about later. Um, but what's all this discourse and outrage and articles and tweets actually doing to Spotify and to Rogan? Okay, in my opinion, the end result is actually very similar to all of the free advertising that companies get whenever they make a woke statement that infuriates the reactionary right, you know, resulting in throwing your Keurig out your window, flushing your Gillette blade down the toilet, and burning your Nikes. Um, and it's even widely speculated, and kind of a known fact, that companies will use progressive statements and policies to drum up this outrage, um, to give their company and product tons of free advertising and just to get the brand name itself inside consumers' heads. And this is definitely happening with Rogan and Spotify, in terms of outrage being used as advertising. It may not be intentional, but that is what the result is.

Speaker 1: Yeah, and I mean, both sides like to make fun of the other for doing this. Like, folks on the left like to make fun of the right when they're breaking their Keurigs or whatever. Um, but you know, it happens. It's equally profitable for both sides. You just do the opposite. You know, you have someone come on and talk about how they're a truth teller being canceled, and they get a bunch of attention and money, and it works equally well both ways.
Speaker 1: Pretty much. Yeah. So with Spotify and Rogan in the news every day for the past three weeks or so, the end result is just that people are hearing these names more often, and they're probably subconsciously going to use Spotify more often, because, you know, despite a few people who might cancel their subscriptions, the net effect will be more listeners coming to Spotify, because the name is in their subconscious. It's in there. And all the effect that's going to have on Rogan is giving him way more publicity to attract new listeners, and it's listeners who themselves are attracted to unconventional ideas outside the mainstream. And his more passive listeners are gonna double down on him, because there's gonna be the backfire effect, so they will feel defensive and then become more of a fan of his, because he's seen as a cultural outsider, even though he's not an outsider. He is the mainstream, he's the biggest podcaster in the world, but he's seen as a cultural outsider. So, you know, he brings on guests who say things that they're not supposed to say. So who's actually going to be convinced by all this outrage to not listen to Rogan via pointing out all the wrong things he's said and all the slurs he's used? Like, is that really going to stop fans from listening to Joe Rogan? It's like pointing out that Donald Trump illegally took classified documents. It's like, yeah, I mean, that's fucked up and shit, but he's never gonna get charged with crimes and none of his supporters care. You're the only people who are angry about this, and it doesn't matter, because the people you vote for aren't going to punish him. So, like, yeah, you know, chill out a little bit. It's like all of the outrage is simply inflating the importance of Joe Rogan on, like, an entire cultural level.
It's that he's becoming more important to his fans, more important to his haters, and more important to himself and to Spotify as an asset, because he generates a lot of exclusive listeners and news coverage and buzz around the Spotify brand.

Speaker 1: And it's important to talk about this: broadly speaking, within the field of entertainment, digital entertainment in particular, you have two ways that you can grow your audience. One of them is organic growth, which is, you know, I listen to Garrison's podcast, I like it, I tell a friend about Garrison's podcast, they like it, they tell a friend. That's organic, you know, it's very natural. That's purely the quality of the content, um, reaching people. And then there is inorganic growth, which can be the result of ad campaigns, or the result of, um, an algorithm. Often today we're talking about, like, oh, Twitter or Facebook prioritizes this kind of content. So, like, some article on Breitbart about black-on-white crime that would have been read ten thousand times ten years ago gets read a million times, because it spreads well on this platform for reasons that aren't organic. Um, and with Joe Rogan, one of the reasons why, because we can talk about deplatforming. If you want to talk about, like, Alex Jones for example, or Milo Yiannopoulos, there's a good case that deplatforming really reduced both of their reaches. Milo was pretty much wiped out as a person who mattered in terms of the discourse, thank Christ. Alex Jones less so. Um, it definitely hurt his business and it reduced his reach, but by the time Facebook and Twitter and whatnot started throttling him, he had already inorganically increased his reach enough that he had a large enough audience to stay somewhat relevant and keep going. The thing about Joe Rogan is he did not get famous and popular inorganically.
I'm sure there was some degree of that on, like, social media, but most of his growth came before that. Like, people like him. Whatever you think of him, he's a good broadcaster. That's the thing.

Speaker 1: Even though, you know, after all this research, I don't like him, he says horrible things, but, you know, I watched all of Rogan's, um, Instagram videos, he made a few ten-minute things talking about the outrage, and it sucks, because when you listen to him, he's a really good talker. He's very good at what he does. He's very good at generating sympathy and generating goodwill. It sucks, because, yeah, I wanted to hate this person, but I'm listening to him talking about this issue like, oh wow, yeah, you actually have a decent grasp on what's going on here. Um, and that's horrible.

Speaker 1: He gets some of his money from playing, like, a dumb, chill stoner dude, but he's not dumb. He's definitely a stoner. He's not a dumb man. He's very intelligent. He's very good at what he does. Um, one of the things we don't talk about enough when we talk about media, that I think is important to note, is that, um, being likable in a professional sense is a skill. And it's a skill like any technical skill. It's like knowing how to farm or weld. Um, it is a thing that you build on over time. It is a thing that, um, takes a lot of trial and error and a lot of education to get right. Uh, it is a thing that Joe Rogan has been doing for longer than a significant chunk of the people on this show, including Garrison, have been alive. Uh, like, this has more or less been my job for thirteen or fourteen years, and it is a skill that you build. And the thing that he is really good at is making people want to listen to him.
And so if you were to, say, kick him off of Spotify tomorrow, it's entirely possible that that would increase the number of people who listen to the podcast. There's a case to be made that Spotify has limited his maximum audience by limiting him to Spotify, as opposed to if he was on just any app he wanted to be on. Maybe it'd be twenty million, you know, listening to every episode.

Speaker 1: Exactly. Yes. Like, even if Spotify did drop him because of all this outrage, you know, and all the tweets and all of the petitions, even if they did drop him, he would probably not only gain more listeners due to the outrage porn and free speech advocates, but also, with his exclusivity ending, his podcast would just be available on more platforms and more people would be able to listen to him really easily. So yeah, he's only going to grow if people get what they want. And that makes you think, like, this outrage isn't actually meant to get the Joe Rogan problem taken care of. This actually isn't about stopping misinformation. This isn't actually about there being fewer fans of Joe Rogan. All of this outrage is about making you feel better, because you feel like you're doing something. Right? Like, a bad thing is happening in the world, and it's easier to pretend your actions are hurting it than it is to accept that, like, maybe there's nothing I can do about this right now.

Speaker 1: Yeah, it would be a really nice world if the Rogan problem could be solved so easily, by Spotify dropping his exclusivity deal, right? That would be great. But that's not the world we live in, and tricking yourself into thinking that is just kind of delusional.
Speaker 1: Um, and like, yeah, it makes you feel better, but that's not actually helping. Because, yeah, we can obviously compare this to other deplatforming campaigns against people like Alex Jones, but, you know, Jones was way more niche and way more extreme around the time of his deplatforming campaign, and that campaign wasn't about ending exclusivity deals. It was about getting him off of popular platforms altogether. And that's not happening with Joe Rogan, because Joe Rogan isn't saying the things that are going to get him booted from platforms. He's smart, he knows what he can and cannot say. He's not dumb enough to get banned from these platforms, right? And also, you know, he's a giant financial asset, so they wouldn't ban him anyway. But what he's doing is bringing on people who say horrible things to continue a cultural conversation, which keeps him in the headlines and gives a platform like Spotify a whole bunch of room to cry free speech and get away with it. Removing Alex Jones could be seen as a monetary decision in and of itself, because it actually removes liability. But that isn't really the case for Rogan.

Speaker 1: Yeah. One of the problems is that this kind of does fly in the face of a lot of what people want to think. And I don't want to make the case that it's totally black and white, because, for example, I'm not saying that it was bad for someone to go through the effort of finding and pointing out, hey, there's like seventy episodes where Joe Rogan drops the N word, and getting those pulled. I think that was, broadly speaking, a productive thing. But keeping Joe Rogan at the forefront of the outrage cycle is doing nothing but printing money for the guy.
And that's not an easy thing to deal with, because it's like, do you want me to just stay quiet in the face of injustice? And it's like, no, that's not what I want you to do. But I do want you to recognize that there are times and ways of speaking up that are just, yes, putting gasoline on an injustice fire.

Speaker 1: It's important to remember that deplatforming is just a tactic, and single tactics aren't always effective in every situation. That's what makes them a tactic, right? Like, in order for a tactic to work, you need to understand the scenario you're applying the tactic to and see if that tactic achieves the goal. If it does, great, but if it doesn't, you need to choose another tactic and stop doing the same thing over and over again expecting a new result. And this is one of those situations where, when you bring up, hey, maybe doing nothing might be the best thing, at least for most people specifically, the thing that gets brought up is like, well, do you not want me to do anything? Or like, well, what do you suggest I do? You know, you're saying I shouldn't do this, but you're not telling me what to do. It's like, well, it's like if somebody gets shot in the leg, and one person has a tourniquet, and the other person has a bunch of razor blades that they want to throw in their eyes, and it's like, well, what do you want me to do? All I have is these razor blades. This is the only other thing I can do. Should I just stand by and do nothing? And it's like, well, in this case, doing nothing is the best thing to do, because that's not going to help the problem.
Speaker 1: Boycotts at this scale kind of only tend to benefit brands and businesses, right? Because if the brand, business, or person is smaller and more niche, like, say, Alex Jones or Richard Spencer, then yes, these boycott tactics can really work to push things out of the cultural market, and in some cases, in terms of businesses, the literal market. But when you're dealing with things like Target, Nike, and Joe Rogan, that's not the case, because those brands are way too big. Any conservative boycott against Target isn't going to have that effect. It will probably just make weird liberals be like, oh, I'm gonna go to Target now because the conservatives don't want me to.

Speaker 1: So the problem is, Joe Rogan himself isn't really the problem either. You know, a lot of the problem can be seen more as content algorithms that boost and reward misinformation and disinformation and conspiracism, and that's more of the actual issue at hand here. Joe Rogan is just a business. That's how he hears about a lot of these people.

Speaker 1: Yeah, they go viral somewhere else, and in a lot of cases it's an inorganic thing that brings them in front of him. It's some fucking algorithm. And that is a case where you can target and work on deplatforming, and it can be more productive.

Speaker 1: That was what I was getting at. Joe Rogan himself is just a visible outgrowth of the core problem, and the core problem is these things getting onto his show in the first place. So yeah, we can't stop his show, but maybe we should do more work to figure out ways to use these tactics to prevent algorithms from boosting these things to the point where Joe Rogan sees them and then invites them on.
Speaker 1: And yeah, that's a lot more work than just being angry at Spotify, and yet maybe it will actually do something. And one thing that could do something, and it won't work if it's just Spotify, but I am one of those people who thinks that maybe it's not the worst thing if things like Spotify are seen as publishers, and thus, when they spread misinformation that leads to disastrous health consequences, um, they can be held liable, right? Uh, that's not the worst possible change, although it is a problematic one. I don't want to boil that down to a simple question, but I think that's an avenue that should be explored, because I don't see that there's a lot of difference between Spotify choosing to let something go to air and the New York Times printing misinformation. Um, and in fact, Spotify is going to reach more people, because nobody cares what the New York Times says anymore.

Speaker 1: Yeah. So the Spotify CEO did kind of address the ongoing controversy around, you know, the internal publishing stuff and how they view medical misinformation. Um, they did adopt a clarified policy that prohibits content that promotes dangerous false or dangerously deceptive medical information which may cause offline harm or pose a direct threat to public health. Um, and the post also announced that Spotify would add content advisories to any content related to COVID on the platform, which Twitter and Instagram have as well.

Speaker 1: But again, if there is an actual dollar consequence to companies that air massive disinformation, um, then, without sort of making Joe Rogan the focus, you can make it so that the people that he actually is accountable to, which is the people who give him the money that he gets, uh, have a vested interest in tamping down on the worst excesses that he's responsible for. Like, that might have an impact.
Speaker 1: I don't know. Like, part of the problem is, and one thing we should acknowledge here when we're talking about what would work better than what's being done: this is a pretty new problem. Versions of it have existed before, but without the internet and without podcasts being what they are, this is a pretty new thing to be dealing with. And I'm not saying, like, here's the obvious solution to this, but I think we are trying to point out that what folks are doing doesn't work. The tactics being applied are not effective, and we should be exploring other opportunities to mitigate this harm that are not, well, I guess it's time to delete another app and post about it on Twitter.

Speaker 1: Yeah. I think this is especially a thing with, like, one of the other things that's been popping up, which is Rogan's, like, weapons-grade transphobia.

Speaker 1: Yes, oh yeah, jeez. Yeah, and that stuff is horrifying. Like, the racism is also really bad. He's extremely sexist.

Speaker 1: Misogynist, racist, all of it. There's a lot of money in being that dude.

Speaker 1: Yeah, well, and I think this is, you know, an inherent problem for the left, because fighting this kind of, sort of, shock jock information stuff, that rage economy works better for the right than it does for the left. And I think in some ways that means you have to fight them in other spaces. You know, you can't just keep throwing yourself at it. It's the same thing with, like, why you don't just have one line where you run into a bunch of cops over and over again in one spot, right?

Speaker 1: But we try. Yeah. Yeah, it's like some folks gave that one the old college try.
Speaker 1: Yeah, and in some sense it's hard to be too hard on the people who are doing this. I think they're trying to do the right thing, but you have to pick your battles. And, you know, if you're taking a fight like that, that's a bad fight, that is a bad fight for you. You need to be fighting them in different spheres. You need to be, you know, I mean, working, for example, on stuff like tech regulation, working on, you know, unionizing these places, right? Fighting them purely in information space, we will lose every time. The advantage that we have is that we also do other things, and, you know, we're going to keep losing here if we keep fighting them in exactly the same way. We're going to keep losing. So we have to, you know, fight in other places. And that's hard, and it sucks, because, you know, such a massive part of just what reality is now is yelling at people online. But you have to stop doing that, because the problem is that we've all gotten fucking caught for quite some time in this escalating culture war. And it's not a battleground that can be entirely ignored, because when you cede ground to them, they create conspiracy theories about trans people attacking kids that lead to them murdering people in the streets, or they spread conspiracy theories about masks that lead to them occupying Ottawa. Um, so the culture war battleground cannot be ignored. But at the end of the day, rather than just seeking new ways to engage with it, because the more you engage with it, and it is sometimes necessary to engage with it, but the more you engage with it, the stronger you make this whole thing and the heavier it lies on all of us, like a cloud.
And the only real way to actually win in the long run is to find a way to get off of that, to get out of this fucking treadmill of bullshit that has become all-consuming. And it's in a lot of people's best interest for it to stay all-consuming. Um, and there's a lot going on here, because I think sometimes when you criticize people for the actions they take in situations like this, they kind of interpret it as you saying, well, you're stupid and you fucked up and you never should have done this. And the way I think of it is more like, we have found ourselves trapped in a really messy situation and no one has figured out how to get out. So it's not a situation of, like, people are dumb for having done something that's not effective. It's a situation of, we are all trying to figure out what works in this new world we have, somewhat accidentally and somewhat purposefully, built for ourselves. And it is important to have humility and be willing to accept, like, you know what, that's not working and we have to stop doing the thing that's not working, rather than, you know, treat it as if it's sort of a moral failing that something we tried was not effective.

Speaker 1: The last thing is, really, it's not just the non-effectiveness, but also the fact that this outrage is just a constant free banner ad for Spotify everywhere online, which is also not great. So it's not even just not effective, you're also giving a corporation tons of free press. And maybe we can reframe the way we approach these things so that we don't do that, because in the end, that's just adding to promoting the misinformation. That's all it's really doing.
And it's not nearly as, you know, impactful as Rogan doing it himself, but it still is a contributing factor, and it does contribute to the backfire effect for people who listen to his podcast, maybe even people who don't, but they're still gonna get defensive over him because they're seeing this attack on him. And even though he is a huge figure, he's seen as an outsider. So that really does contribute to that backfire effect thing of getting people more and more invested in him as a content creator.

Speaker 1: But yeah, it's really dumb, but the thing we need to deal with is, it's a version of the lesson people still didn't learn with Trump, which is that you can't beat these people by dunking on them. It doesn't matter that Joe Rogan said something dumb. It doesn't matter that Joe Rogan's inconsistent. It doesn't matter that Joe Rogan tells lies or whatever. That's not going to change anybody's mind about the dude, because it's not about Joe Rogan. They don't support him because they love him, at least the people who are engaging primarily online about it. Most of his fans just don't think about any of this, because they're not as online as the rest of us. But the people who are engaging with this and helping to fuel the culture war side of this thing, they don't care. This isn't about his inherent characteristics. This is about a chance to dunk on the enemy. Um, so you're not going to convince them of anything, ever.

Speaker 1: That's all I had to say on this. Yeah, I don't know.

Speaker 1: Yeah, because, again, I really resisted writing this episode for a long time, because I didn't want to add to the Rogan discourse. But after a while, the Rogan discourse itself became worth talking about,
how we talk about it, because it is more of a meta angle. Like, okay, that is actually worth talking about. But yeah, I am so tired of hearing, watching, and seeing the words Joe Rogan.

Speaker 1: Yeah, I'm exhausted by it. I hate that it's a bigger story in the United States than the gigantic war that might break out in Eastern Europe. Um, it's just a very frustrating time, and the only way to win this particular game is not to play. Um, so that's why we're doing this. We're not going to stick his name in the description or the episode title. Um, yeah. You know, the title we were working with this under, that we're not going to use so as to not feed into the algorithm, was Joe Rogan the Egregore. And that is really how I think about it. If you haven't listened to our episode on Behind the Bastards about the Flat Earth book that talks about egregores, an egregore is basically a god that is made up by the kind of directed thoughts of a population of people.

Speaker 1: Yeah, it's like a gestalt deity that exists as this thought form. If people put energy into it, it almost gains its own, um, independence from the people that birthed it, even though it was just a thought form, just an idea or presence, and now it basically is its own god that's self-sustaining and can impact the world.

Speaker 1: Yeah, global capital is an egregore. And the more that people feed into the discourse around Joe Rogan, the more he turns into one. Kind of outside of his own actions, this idea of him has an influence on everything around us. And boy, we don't need that, do we?

Speaker 1: We certainly do not.

Speaker 1: Let's just put that one down, right? Try something else. Yeah, let's throw a brick at your sheriff instead. This will go better for you.

Speaker 1: I mean, sure.

Speaker 1: Sure, Chris. Yeah, Chris said it, not me.
Speaker 1: Yeah, I mean, obviously in Minecraft. He said bricks, it's fine.

Speaker 1: Robert, Robert. It's not Minecraft anymore. It's Roblox. The feds cracked Minecraft.

Speaker 1: Yeah, you're right. It's going to take weeks for them and their computers to realize what Roblox is.

Speaker 1: There was a story I think I saw on Twitter about, yeah, some kids who actually, legitimately, got arrested in Russia for, like, blowing up a building in Minecraft. So, like, literally.

Speaker 1: So, yeah, instead, again, it's Roblox, where nothing bad happens.

Speaker 1: Nothing bad happens. All right, everybody get on Roblox, um, and don't talk about Joe Rogan.

Speaker 1: And don't talk about Joe Rogan.

Speaker 1: It Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at coolzonemedia.com/sources. Thanks for listening.