Speaker 1: It Could Happen Here is the podcast you're listening to with your ears, or perhaps other parts of your body if you have, I don't know, some bizarre form of synesthesia that causes you to taste sound. Maybe you're tasting us right now, in which case I'm gonna open up the flavor bouquet by introducing my co-host Garrison and our guest for today. Why don't you take over now, Garrison? I've done my job.

Great, that sounds lovely. Yeah, hey, Garrison here. It Could Happen Here is the podcast. We have a special guest today, journalist and researcher W. F. Thomas.

Hello, hello, it's so good to be here on Behind the Woman's Revolution: The Police Insurrection Daily.

Thank you, thank you, lovely to have you. A lot of people say Garrison's voice tastes like sorbet, by the way. It's a comment we get a lot. A lot, a lot of those DMs.

You surprised? Not that.
So we're gonna be talking about something I've wanted to bring up myself for a while now, but I just have not put the work in, and luckily someone else did the actual work, so now we can just talk about it. We're talking about something called Disclose.tv, which is a broad range of things. It's not just one thing. And I guess I'll hand it over to the person who did the actual work. Before we get into the journey of the platform and everything, how would you describe what Disclose.tv is? What is it?

Yeah, let me start this off by saying that even before the publication of the article, I'd already been publicly, vaguely threatened with legal action by Disclose.tv, so that will be largely informing what I say today. But we do have a lot of receipts.

We have very scary lawyers here, so I'm excited whatever happens. So feel free to say whatever you want to say.
But Disclose.tv markets itself and presents itself as a news aggregator operating on Twitter, Telegram, Gab, Gettr. They have a Facebook as well, as well as the main site where they host what one could describe as articles.

Yeah. And for our purposes, even though they have a very large Facebook presence, the way that we usually interact with them specifically, like me and Robert and other journalists and anti-fascist researchers, is on Telegram or through Twitter. Twitter is how they break a lot of current events; it's where a lot of political figures talk about them. And then Telegram is where they really disseminate these out into more obscure groups, and maybe they change their wording because they know the audience is a little bit different. And they've been a vector of information for a while.
Really with the protests they kind of picked up a lot. They were everywhere, saying specific things, not doing sourcing, and basically they are a place that tries to create what the news is, because of how isolated they are from the sources that they actually pull info from. They're very interested in crafting their own version of events, which appeals to people across the spectrum. They don't just market towards the far right wing; sometimes they frame things to attract a variety of people under the, let's say, extremist banner. So you don't just see them in far-right circles. You see Disclose pop up in a lot of places because of the way they frame news and breaking events. But they didn't always start out like this. This isn't what they always were. They weren't always this kind of content aggregator that creates their own version of the news.
And Thomas did more research into what they were before, which I had not actually done that research on yet. So yeah, let's talk about that a little bit.

Yeah. So I'm gonna start off by talking about how I first heard about Disclose. I was living in Germany when the pandemic hit, and got COVID during the first wave in Germany that March. Luckily I was totally asymptomatic, but I was kind of stranded in Germany for a couple of weeks and had to isolate in a vacation rental, and the Bavarian man who owned it just kept coming in and talking to me. And I would tell him, hey, it's probably not the best idea for you to be coming by and chatting with me all the time. And we got into talking, in German, about the pandemic and what he thought about it, and he started talking about how he thought, oh, the government's making this seem way worse than it is, you know, the deep state, if you know anything about that. And he said "deep state" in English.
And I was familiar with German far-right currents at that time, but I had never encountered a pilled German dude. And that's when I realized this is gonna be a fucking problem.

Yeah, yeah. And lo and behold, it has continued to be a problem.

So I got back to the US. And the other thing from when I was in Germany: that's the first time I encountered Telegram, when a German I knew said, hey, I just don't trust WhatsApp because it's owned by Facebook, why don't you download Telegram. In 2019, I think. And it was pretty innocuous to me at the time. I didn't realize this would become a problem.

Yeah. Fast forward.
I was working on my master's project, which we can talk more about later on, because it's kind of beside the point if you don't want to hear about it. But I was looking at Telegram as the cultic milieu, using Colin Campbell's framework of the cultic milieu to understand specifically how QAnon spread in Germany, and how QAnon interacted with the native underlying conspiracy narratives within Germany. Because Telegram was already massively popular in Germany before; after J6, I think, the ban wave came down and there was much more migration to the platform. So I did this social network analysis looking at the German conspiracy scene on Telegram, and one of the biggest nodes that came up, looking at the number of times it was shared into other groups or channels, was Disclose.tv. And that's the first time I came across it.
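As an aside for readers curious what "biggest node by forward count" means in practice: it is essentially an in-degree ranking on a graph of channels. Here is a minimal sketch of that idea in Python, assuming a simplified (destination, source) record format; the record layout, the `rank_sources` helper, and every channel name are hypothetical illustrations, not data or code from the actual study.

```python
# A minimal sketch of the forward-counting idea: rank source channels by
# how many distinct groups/channels their content gets forwarded into.
# The record format and every channel name here are hypothetical.

forwards = [
    # (channel the message appeared in, channel it was forwarded from)
    ("querdenken_chat", "disclosetv"),
    ("impfkritik", "disclosetv"),
    ("impfkritik", "qanon_de"),
    ("anon_news", "disclosetv"),
]

def rank_sources(records):
    """Return (source, reach) pairs sorted by reach, where reach is the
    number of distinct destinations a source was forwarded into."""
    reach = {}
    for dest, source in records:
        reach.setdefault(source, set()).add(dest)
    return sorted(((s, len(d)) for s, d in reach.items()),
                  key=lambda pair: -pair[1])

print(rank_sources(forwards))
# [('disclosetv', 3), ('qanon_de', 1)]
```

A real analysis would pull these records from Telegram channel exports and typically use a proper graph library, but the ranking principle is the same: channels that many other communities forward from sit at the center of the network.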
I looked through it and realized, okay, there is an editorial stance within this, and that editorial stance largely attracts conspiracists and far-right extremists to this coverage, and, you know, it is widely shared among conspiracists and far-right extremists. Fast forward: I'm on Twitter, as many of us are, unfortunately, and I saw Disclose.tv just popping up everywhere, even from people who I would think should know better.

Yeah, absolutely.

You know, big extremism researchers and journalists shared it. I remember one specific one that really came across my feed. Disclose had taken a video from, like, The Blaze, Glenn Beck's whatever empire, about the firefighters who were quitting over a vaccine mandate or something and had all of their boots out or whatever. And I saw lots of people sharing that as well. And eventually I got tired of saying, hey, this is suspect, don't share it, and decided to write an article about it so I could just send my article to people. And it's really interesting what I found.
Disclose.tv started off in the mid-2000s as just this forum for UFOs, paranormal stuff, cryptids, bigfoot sightings, and existed in largely the same format until 2021. There were some shifts in the way the site presented itself. It started off with a member login where members could write articles that were largely long-form forum posts, and then have people comment on them and reply. And at one point Disclose made the jump to functioning as a news aggregator, while putting an editorial spin on that and including some of their own articles. Do you want me to get more into that now?

Yeah, because the shift wasn't immediate either, right? They were starting to present themselves in more of a news-gathering way around the time this became a big thing in terms of their social media presence. They were trying to present themselves as a news aggregator, right?
But they still operated the forum on their site throughout most of that time, and it's only recently that they shut that forum down. Which was, you know, full of all kinds of conspiratorial nonsense that's very easy to see past for most people. Secret, you know, secret Arctic ship, which is...

Yeah, it's always that. That's usually... yeah.

Even getting into stuff like "watch this SJW get wrecked," which is not a quotation, but just that kind of vibe, that style of content.

Yeah. So, going from the forum operating, then their social media accounts, to the shift to presenting more as a news website: talk about that, and the potential effects that we see this having on both the social media sites and just the overall trend of news aggregation in general, I guess.
Yeah. So the first big shift that I found was the creation of their Telegram channel, which was in January of 2021, actually, so this is relatively recent.

Yeah.

And they operated their Telegram in this more traditional news aggregator sense, and that's how they really blew up on Telegram. At some point they deleted all of their old tweets and started operating their Twitter in a similar manner. It was after they created this Telegram channel, in September, actually overnight, that they completely rebranded the site. They took out all the user forums, they included articles backdated to a year prior, and looking through archives of it, there was a note saying something along the lines of: we have found so much growth on our social media, our growing Telegram channel, our growing Twitter account, and something to the effect of, we are changing our strategy and going about this a different way.
And, you know, if you were into the forum, you can join our Discord, which is now defunct; I'll get into that later. And yeah, it was really interesting, too, because these backdated articles included very obviously plagiarized content. They had, I believe it's all in the article, but they had four journalists' names attached to the articles, using AI-generated images for their pictures. And especially the articles that they themselves published were very focused on UFOs and paranormal phenomena, as well as content that could cause skepticism within an audience about vaccines and lockdowns. And I do not know the intent of their editorial board, and so I cannot speak on that, but of course it generated this effect.

Yes. That is, they found a way of creating content which develops a very specific audience, which grew their numbers, which, one could assume, would make them want to make more of that content, because that makes more numbers, and they can use that to grow their platform.
Yeah. Specifically after January, ramping up as vaccines were becoming more and more common in the States and across the world, they've seen pretty significant growth and have changed their platform accordingly.

Exactly. So we began looking into who the fuck owns this, what's going on with this. Like all German companies, and it is based in Germany, there's a requirement by law to include an imprint, or Impressum, that includes an address and contact information for the site. The company that owns it, a company called Future Bites, operates Disclose.tv and describes itself as a private equity firm and media group. And looking into the ownership behind Future Bites, there's a man by the name of Uva Brown, who has a pretty interesting backstory. He's made numerous web hosting sites. I believe he created some dating sites as well, but my research was not conclusive, so that's a maybe. But eventually, he sold.
He had his most success when he sold one of his web hosting sites to GoDaddy for a lot of money. And along the way, as he himself described, he booked a flight on Virgin to go into space and see for himself if the Earth was flat.

Oh my god. Awesome. Cool. This is great. Thank you.

Yeah. So this is who we're dealing with.

Sweet.

And the thing about Disclose being based in Germany, where that becomes an issue, is that Germany has a very different take on free speech than the US. For example, even online, displaying swastikas and denying the Holocaust is illegal and is a prosecutable crime; it can get you jail time. So as we explored, mostly I, with additional reporting from Ernie Piper, and I'll talk a bit more about that later, the Discord and their Telegram...
We realized, hm, there seem to be a lot of Nazis here. And when I say "seems to be a lot of Nazis," I mean people with swastikas in their profiles, with, you know, names referencing the Holocaust with the triple parentheses, denying that the Holocaust happened, and also sharing the infamous neo-Nazi propaganda film Europa: The Last Battle, which was shared by prominent QAnon influencer GhostEzra.

Yeah. I know.

Oh man, this came up a few days ago. One of the channels that me and someone else have been watching forwarded me that film being shared. It was the Free Oregon Telegram channel sharing links to it, and I would like to track back where that link came from. Yeah. Not great seeing that film circulate more and more, especially among, you know, the Free Oregon Telegram channels, like the anti-mask, anti-vax, anti-lockdown channels. Yeah, and seeing the percolation of that type of content. Yeah.
So, in preparing for this article, with the help of the Logically editorial team (I'm a freelancer), their current head of content, Ernie Piper, sent an email basically asking, hey, what's going on? You all seem to have a Nazi problem that's kind of borderline illegal in Germany. To which, for about twenty-four hours, Disclose just went totally quiet and didn't post. And then they came out with a post specifically naming Ernie, with a picture of him, and linking to some of his old reporting work as well, saying: yes, Uva Brown owns this, he got his money from GoDaddy, you know, we value free speech and we condemn hatred, whatnot. And saying, oh, our Telegram, there's a Telegram group, but we have these rules in it. And, okay, yeah, we had to shut down the Discord, that got a little bit out of hand, we admit that. You know, they had people denying the Holocaust with swastika icons in their Discord that they didn't seem to care too much about until someone pointed it out.
And, you know, there was additionally very explicit neo-Nazi content in their Telegram channel as well, with the excuse, oh well, we're a growing platform, we can't moderate everything. Well, they just crossed four hundred thousand in their Telegram channel, and I think about thirty thousand in their Telegram group, which is frankly bullshit. Yeah. My personal opinion is that if you don't have the resources to moderate a space, you probably shouldn't host the space. And additionally confirming, oh, okay, when we made our new version of the site, yes, we backdated some articles from previous user-generated content that we, you know, didn't vet properly, and we're trying to fix that now. They removed some of those articles. And that, yeah, none of the people who are the authors of our articles are real people; they're all pen names. You know.
They also have, or at least had, a tab on their website that said "write for us," looking for people to send them things and saying, you know, we will disclose your bio and link to all your social media if you write a story for us. And there was zero of that happening as well.

So, do you think... I know on the rules for their Telegram they have the "no Nazi stuff" rule. Do you think they're actually trying to discourage that because they're scared of legal stuff, or is that just performative? And, I guess, you know, this is just going into speculation, so I think this might be more a question for even Robert, in terms of, yeah, is the anti-Nazi stuff performative? Because it does seem a lot of their user base is fostering that type of thing, or is, you know, being moved over from other similar channels.
Because, yeah, the amount that we see Disclose, like, you know, intersect with channels like the Rise Above Movement channel, and a whole bunch of eco-fascist channels, and a whole bunch of channels on a broad range of actual fascist topics, like people who are into fascist theory, is quite high; the amount that Disclose shows up. And, I don't know, you could look at all their stuff saying, I mean, yeah, on their rules page, "no Nazi bullshit." But then if you spend any amount of time looking at where their posts are forwarded, it's almost primarily people who describe themselves as fascists.

Or Donald J. Trump Jr.

Yes, yes. It expands out into a lot of, you know, just, like, American journalists studying extremism could also share Disclose's stuff on Twitter, right? That is part of their thing, and you know, that does strengthen them, because it gives them that legitimacy.
So 336 00:21:33,359 --> 00:21:35,280 Speaker 1: then when people point out that they have a Nazi problem, it's like, no, 337 00:21:35,400 --> 00:21:38,120 Speaker 1: that's not us, that's just some of our users who 338 00:21:38,119 --> 00:21:41,240 Speaker 1: are trolling, or, you know, whatever bullshit they want 339 00:21:41,280 --> 00:21:44,520 Speaker 1: to say. Um. So I guess 340 00:21:44,520 --> 00:21:47,560 Speaker 1: the real way to frame this is, like, how often 341 00:21:48,160 --> 00:21:51,639 Speaker 1: have you seen Nazi stuff associated with the Disclose, with 342 00:21:51,680 --> 00:21:53,919 Speaker 1: the Disclosed TV brand? Because that's the one thing we 343 00:21:53,920 --> 00:21:56,159 Speaker 1: actually can measure, right? We can't measure their intentions, but 344 00:21:56,240 --> 00:22:00,200 Speaker 1: we can measure how often this stuff happens. Yeah, I mean, 345 00:22:00,200 --> 00:22:03,280 Speaker 1: that's always, like, the best way to measure that kind 346 00:22:03,320 --> 00:22:06,280 Speaker 1: of thing, rather than just sort of making the 347 00:22:06,320 --> 00:22:09,919 Speaker 1: allegation: listing, like, we find it in this many channels, 348 00:22:09,960 --> 00:22:12,840 Speaker 1: we see it shared in these areas, it's being discussed 349 00:22:12,840 --> 00:22:16,120 Speaker 1: by these people. And, like, that's, I think, 350 00:22:16,160 --> 00:22:19,480 Speaker 1: always kind of how you actually map these sorts 351 00:22:19,480 --> 00:22:23,280 Speaker 1: of networks, by looking at what is actually spreading 352 00:22:23,320 --> 00:22:26,119 Speaker 1: where. Like, that is, thankfully, something that you 353 00:22:26,160 --> 00:22:29,399 Speaker 1: can measure pretty objectively. And, like, they are fostering it 354 00:22:29,480 --> 00:22:32,640 Speaker 1: with the amount of stuff they talk about, like George Soros, 355 00:22:32,680 --> 00:22:34,240 Speaker 1: and, you know, the amount of
stuff that they like. 356 00:22:34,520 --> 00:22:39,280 Speaker 1: The way they frame breaking news has that editorial 357 00:22:39,320 --> 00:22:42,000 Speaker 1: bent where it's very clear that it's getting pushed in 358 00:22:42,040 --> 00:22:46,360 Speaker 1: a specific direction. Like, that is a 359 00:22:46,400 --> 00:22:49,320 Speaker 1: thing that you can observe by reading the type of 360 00:22:49,400 --> 00:22:54,600 Speaker 1: narratives they're weaving via how they report information. Um, 361 00:22:54,600 --> 00:22:58,919 Speaker 1: the topics that they choose to cover are topics that 362 00:22:59,560 --> 00:23:06,439 Speaker 1: resonate very deeply with conspiracist and with far right extremist communities. Um. 363 00:23:06,480 --> 00:23:09,800 Speaker 1: If I had to speculate, um, I will say, at 364 00:23:09,840 --> 00:23:14,040 Speaker 1: least since the article has come out, um, they've done 365 00:23:14,080 --> 00:23:18,120 Speaker 1: a better job of moderating their Telegram channel, at least 366 00:23:18,160 --> 00:23:22,439 Speaker 1: for now. So good job, Disclosed TV. You 367 00:23:22,520 --> 00:23:25,320 Speaker 1: can't find links to Europa: The Last Battle there anymore. 368 00:23:25,320 --> 00:23:30,520 Speaker 1: You can still find very rampant 369 00:23:30,600 --> 00:23:37,879 Speaker 1: homophobic slurs, um, because, you know, they clearly 370 00:23:37,920 --> 00:23:41,000 Speaker 1: auto-blocked some words, but people can shorten them or 371 00:23:41,119 --> 00:23:44,080 Speaker 1: use different spellings for those words to still be used 372 00:23:44,080 --> 00:23:52,240 Speaker 1: in the channel.
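The objective, countable measurement described here, how often one channel's posts turn up, and in how many different places, could be sketched roughly like this. Every channel name and forward record below is invented purely for illustration; a real dataset would come from scraped Telegram forward metadata:

```python
from collections import Counter, defaultdict

# Hypothetical forward records observed as (source_channel, destination_chat).
# All names here are made up for the sketch.
forwards = [
    ("aggregator_news", "wellness_chat"),
    ("aggregator_news", "local_town_group"),
    ("aggregator_news", "extremist_channel_a"),
    ("aggregator_news", "extremist_channel_a"),
    ("other_outlet", "local_town_group"),
]

def forward_stats(records):
    """Per source channel: (total forwards seen, number of distinct destination chats)."""
    totals = Counter(src for src, _ in records)
    destinations = defaultdict(set)
    for src, dst in records:
        destinations[src].add(dst)
    return {src: (totals[src], len(destinations[src])) for src in totals}

stats = forward_stats(forwards)
print(stats["aggregator_news"])  # (4, 3): forwarded 4 times into 3 distinct chats
```

Counting observed forwards like this is what makes the claim measurable: the numbers describe where posts actually spread, not anyone's intent.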
Um, there's still, um, coded 373 00:23:52,280 --> 00:23:56,520 Speaker 1: anti-Semitic references as well, responding to something saying 'oy 374 00:23:56,560 --> 00:24:01,199 Speaker 1: vey,' for example, which is something that tends to be 375 00:24:01,280 --> 00:24:06,399 Speaker 1: used by a lot of neo-Nazis and anti-Semites. Yeah, 376 00:24:06,480 --> 00:24:08,520 Speaker 1: I mean, even if you do any amount of 377 00:24:08,600 --> 00:24:12,480 Speaker 1: research on Telegram, you will find forwarded 378 00:24:12,680 --> 00:24:17,000 Speaker 1: links to this channel everywhere. Like, 379 00:24:17,119 --> 00:24:21,840 Speaker 1: it is so massive, the footprint that they 380 00:24:21,880 --> 00:24:27,280 Speaker 1: have currently in, like, the cycle 381 00:24:27,920 --> 00:24:32,520 Speaker 1: of forwarding posts specifically on Telegram. Um, and yeah, 382 00:24:32,600 --> 00:24:34,439 Speaker 1: they're getting a lot of traction on it, because 383 00:24:34,640 --> 00:24:37,240 Speaker 1: they have stuff framed in a way that's really easy 384 00:24:37,320 --> 00:24:41,399 Speaker 1: to have, like, line up with 385 00:24:41,520 --> 00:24:46,520 Speaker 1: the communities that promote those types of worldviews, um, and 386 00:24:46,600 --> 00:24:49,520 Speaker 1: promote, you know, the narratives that they want to foster. 387 00:24:50,400 --> 00:24:54,680 Speaker 1: So let's see, let's have another quick break, and then 388 00:24:54,880 --> 00:24:59,080 Speaker 1: let's maybe talk about your big master's project, which is 389 00:24:59,080 --> 00:25:03,040 Speaker 1: really interesting. Yeah, can I, can I do it? Yeah? 390 00:25:03,119 --> 00:25:07,760 Speaker 1: I'll do this. You know what isn't Telegram? Uh?
391 00:25:08,080 --> 00:25:10,760 Speaker 1: Literally, these ads. Unless we get an ad by Telegram, 392 00:25:10,760 --> 00:25:14,760 Speaker 1: in which case we are primarily sponsored by the Durov brothers, um. 393 00:25:14,800 --> 00:25:18,679 Speaker 1: But that's for a separate project. My favorite ad is 394 00:25:18,720 --> 00:25:20,720 Speaker 1: the one where it's the kid playing and 395 00:25:20,760 --> 00:25:34,960 Speaker 1: they find a gun. Oh yeah, that's my favorite. And 396 00:25:35,720 --> 00:25:39,280 Speaker 1: oh, we're back. Thanks. Hope everybody enjoyed that 397 00:25:39,400 --> 00:25:44,280 Speaker 1: kid finding a gun, a kid firing it. Yeah, so 398 00:25:44,400 --> 00:25:48,120 Speaker 1: one of my favorite tweets recently was, somebody 399 00:25:48,119 --> 00:25:51,160 Speaker 1: clipped a screen grab of a news 400 00:25:51,240 --> 00:25:54,240 Speaker 1: article that was like, a toddler has shot someone every 401 00:25:54,320 --> 00:25:56,480 Speaker 1: day in the United States for the last three years, 402 00:25:56,520 --> 00:26:01,520 Speaker 1: and somebody quote tweeted it and said, somebody fucking stop him. 403 00:26:01,560 --> 00:26:07,119 Speaker 1: It's very good. Um. The last thing I want to 404 00:26:07,119 --> 00:26:10,520 Speaker 1: talk about is just kind of why news aggregators are 405 00:26:10,560 --> 00:26:13,400 Speaker 1: bad in the first place, and examples of which we've 406 00:26:13,400 --> 00:26:15,679 Speaker 1: seen in the past few years, and how they contribute to 407 00:26:15,680 --> 00:26:19,399 Speaker 1: disinformation specifically, and how they don't do sourcing for 408 00:26:19,440 --> 00:26:21,719 Speaker 1: any claims, and they try to make themselves a primary 409 00:26:21,720 --> 00:26:24,400 Speaker 1: source even though they're not. Um. And then I would also 410 00:26:24,440 --> 00:26:30,720 Speaker 1: love to talk about your very fancy project.
So yeah, 411 00:26:30,960 --> 00:26:35,400 Speaker 1: we saw a lot of news aggregators, like, during 412 00:26:35,400 --> 00:26:39,919 Speaker 1: the protests specifically, which spawned and killed many a 413 00:26:39,960 --> 00:26:44,880 Speaker 1: news aggregator account, um, which did not help things very much. Um. 414 00:26:45,240 --> 00:26:47,280 Speaker 1: And this is an issue that strikes across 415 00:26:47,320 --> 00:26:50,520 Speaker 1: the political spectrum. Yes, I mean, one of the biggest 416 00:26:50,520 --> 00:26:53,520 Speaker 1: instances of that would be an account called Anon 417 00:26:53,600 --> 00:26:58,639 Speaker 1: Cat, right. That was the, yeah, um, who, you know, 418 00:26:59,160 --> 00:27:02,160 Speaker 1: marketed themselves towards the left wing, um. And again, 419 00:27:02,280 --> 00:27:04,199 Speaker 1: I don't know what their intentionality was. They may have 420 00:27:04,200 --> 00:27:05,879 Speaker 1: had their heart in the right place, I have no idea, 421 00:27:06,280 --> 00:27:08,080 Speaker 1: um, and I'm not going to speculate on that right now. 422 00:27:08,320 --> 00:27:13,120 Speaker 1: But the effect that they caused was damaging to how 423 00:27:13,119 --> 00:27:18,000 Speaker 1: information is disseminated, specifically in high stress events. Um. You know, 424 00:27:18,280 --> 00:27:20,760 Speaker 1: like, for instance, the Rittenhouse shooting, you know, stuff 425 00:27:20,800 --> 00:27:25,040 Speaker 1: like that. Uh, the big accounts, the demos around that, 426 00:27:25,119 --> 00:27:28,960 Speaker 1: before that, yeah, before that, yes, yeah, like, the 427 00:27:29,240 --> 00:27:34,960 Speaker 1: fostering of that very fast paced, unverified information circulation, um, 428 00:27:35,040 --> 00:27:36,920 Speaker 1: that gets, you know, a lot of retweets, it gets 429 00:27:36,960 --> 00:27:39,000 Speaker 1: a lot of eyeballs on it. But 430 00:27:39,440 --> 00:27:42,640 Speaker 1: it's hard.
It makes it very hard to backtrack claims, 431 00:27:43,160 --> 00:27:46,840 Speaker 1: because they do not want to link to other accounts, 432 00:27:46,960 --> 00:27:50,320 Speaker 1: because they're mostly interested in growing their own account. Um. 433 00:27:50,359 --> 00:27:54,159 Speaker 1: And I will say Disclose has gotten better about 434 00:27:54,520 --> 00:27:57,840 Speaker 1: linking to the sources, even if the title and the 435 00:27:57,840 --> 00:28:03,720 Speaker 1: tweet don't necessarily match what's in the story they link to. Yeah, 436 00:28:03,720 --> 00:28:07,280 Speaker 1: at least someone could take a different interpretation from the two. Yes. So, 437 00:28:07,480 --> 00:28:09,440 Speaker 1: just, you know, news aggregation and the way it 438 00:28:09,480 --> 00:28:12,919 Speaker 1: intersects with disinformation and misinformation is not just a problem for 439 00:28:12,920 --> 00:28:14,680 Speaker 1: the far right, not just a problem for the right wing, 440 00:28:15,280 --> 00:28:17,479 Speaker 1: not just a problem for liberals, not just a problem for leftists. 441 00:28:17,560 --> 00:28:19,879 Speaker 1: This is a thing that anyone, anyone can really grasp 442 00:28:19,920 --> 00:28:22,520 Speaker 1: on to, um. And some of it's accidental, some of 443 00:28:22,520 --> 00:28:25,840 Speaker 1: it's intentional. Right, some people might just do 444 00:28:25,880 --> 00:28:28,760 Speaker 1: this kind of mindlessly, and some people may, you know, 445 00:28:28,880 --> 00:28:31,240 Speaker 1: do this aggregation with a very specific intent in mind. 446 00:28:31,520 --> 00:28:34,160 Speaker 1: So just be very careful whenever you have an account 447 00:28:34,160 --> 00:28:38,640 Speaker 1: that always leads with all caps, like, BREAKING NEWS.
448 00:28:38,720 --> 00:28:41,440 Speaker 1: Like, if you have an account that always does that, 449 00:28:41,760 --> 00:28:44,880 Speaker 1: maybe don't take that account super seriously all 450 00:28:44,880 --> 00:28:47,320 Speaker 1: the time. Maybe you should, uh, find other sources of 451 00:28:47,320 --> 00:28:50,440 Speaker 1: info that don't always start their tweets with breaking news 452 00:28:50,480 --> 00:28:52,880 Speaker 1: in all caps. Or, advice to people: if they do 453 00:28:52,960 --> 00:28:57,280 Speaker 1: want something like that, find an actual news source. Yeah, 454 00:28:57,360 --> 00:29:01,680 Speaker 1: there's plenty of valid criticism to be made. Again, you 455 00:29:01,720 --> 00:29:07,200 Speaker 1: know, of these mainstream media, MSM, centrist outlets, 456 00:29:07,280 --> 00:29:10,840 Speaker 1: you know, even from the left, there's criticism. Um, 457 00:29:10,880 --> 00:29:13,719 Speaker 1: but you have to find some way of finding your 458 00:29:13,720 --> 00:29:16,000 Speaker 1: own meaning and understanding of what is going on in 459 00:29:16,000 --> 00:29:22,080 Speaker 1: the world around you. Like, a CNN or Reuters.
Yeah, 460 00:29:22,120 --> 00:29:24,760 Speaker 1: and on that point, I think that is part of 461 00:29:24,840 --> 00:29:28,480 Speaker 1: why Disclosed can succeed, or, like, what 462 00:29:28,560 --> 00:29:30,920 Speaker 1: they did can succeed, even, like, when I see stuff 463 00:29:30,920 --> 00:29:33,880 Speaker 1: shared on the left, even by, like, anarchists. Because it 464 00:29:34,040 --> 00:29:37,440 Speaker 1: is a non mainstream media news source, the way they 465 00:29:37,440 --> 00:29:40,840 Speaker 1: frame things sometimes, rarely, will match up with 466 00:29:40,920 --> 00:29:43,280 Speaker 1: actual anarchist views, and it's like, yeah, I'm 467 00:29:43,280 --> 00:29:45,000 Speaker 1: going to share it from this thing, because it does 468 00:29:45,000 --> 00:29:48,520 Speaker 1: feel like an underground, you know, source. It doesn't. 469 00:29:48,520 --> 00:29:50,719 Speaker 1: You're not sharing a CNN article, so 470 00:29:50,800 --> 00:29:53,920 Speaker 1: you feel better, because instead you're sharing something that is 471 00:29:54,000 --> 00:29:57,200 Speaker 1: not in the mainstream. So, like, I get that, I 472 00:29:57,560 --> 00:30:00,960 Speaker 1: get that pull to share something that's not, you know, 473 00:30:00,960 --> 00:30:04,080 Speaker 1: a CNN article instead. Yeah, but, you know, 474 00:30:04,080 --> 00:30:07,440 Speaker 1: it's not actually better, it's just marketing. They're just tricking 475 00:30:07,440 --> 00:30:10,640 Speaker 1: you via aesthetics and branding, and that's all that it is, right? 476 00:30:10,920 --> 00:30:15,280 Speaker 1: So maybe you should learn to see past the 477 00:30:15,320 --> 00:30:16,959 Speaker 1: marketing and branding of those types of things, and look 478 00:30:17,000 --> 00:30:20,080 Speaker 1: at the actual content of what's being shared.
What is 479 00:30:20,360 --> 00:30:24,840 Speaker 1: the university project thing that has been taking up a 480 00:30:24,880 --> 00:30:28,800 Speaker 1: lot of your time? Yeah, um, so, back 481 00:30:28,800 --> 00:30:32,680 Speaker 1: in the US, I got interested, um, especially 482 00:30:32,720 --> 00:30:34,440 Speaker 1: in looking at the spread of QAnon in Germany, 483 00:30:34,440 --> 00:30:39,280 Speaker 1: and that led me down this research path, um, 484 00:30:39,560 --> 00:30:42,400 Speaker 1: and brought me especially to Telegram, um, again, before it 485 00:30:42,480 --> 00:30:46,720 Speaker 1: was largely used in right wing circles in the 486 00:30:46,800 --> 00:30:50,320 Speaker 1: US. Well, the Nazis have been, pretty regularly, 487 00:30:50,560 --> 00:30:53,840 Speaker 1: in the US, on Telegram as well. Um. But 488 00:30:54,720 --> 00:30:56,920 Speaker 1: this led me to look at this, um, 489 00:30:57,320 --> 00:31:00,959 Speaker 1: especially to look at Telegram in the context of, as 490 00:31:01,040 --> 00:31:06,080 Speaker 1: I mentioned, Colin Campbell's concept of the cultic milieu, um, 491 00:31:06,200 --> 00:31:08,200 Speaker 1: which, I don't know if you all have talked about that 492 00:31:08,480 --> 00:31:12,760 Speaker 1: on this show. We have, at one point. Okay, 493 00:31:12,760 --> 00:31:16,560 Speaker 1: but yeah, to give a quick summary, it is the 494 00:31:16,600 --> 00:31:21,320 Speaker 1: concept that there is this space, and when Colin Campbell 495 00:31:21,360 --> 00:31:24,480 Speaker 1: wrote that, it was in, I believe, the seventies.
So 496 00:31:24,520 --> 00:31:28,560 Speaker 1: it was a physical space where people go to find 497 00:31:29,320 --> 00:31:33,120 Speaker 1: these rejected narratives, you know, the idea of rejected knowledge, 498 00:31:33,400 --> 00:31:36,040 Speaker 1: and they go to seek this kind of knowledge and 499 00:31:36,040 --> 00:31:38,760 Speaker 1: these things. Um. So he's talking about things 500 00:31:38,760 --> 00:31:44,440 Speaker 1: like UFO conferences, or meet-ups, or alternative bookstores, 501 00:31:45,120 --> 00:31:48,440 Speaker 1: or perhaps maybe signing up at an institute to get 502 00:31:48,480 --> 00:31:54,800 Speaker 1: a degree in metaphysics. Yeah, as a weirdly specific example. 503 00:31:54,880 --> 00:31:58,920 Speaker 1: Yeah, sorry. Yeah. Anyway, 504 00:31:58,960 --> 00:32:04,480 Speaker 1: how's that going, by the way, Garrison? It's going good, good. Yeah. 505 00:32:04,560 --> 00:32:08,640 Speaker 1: And what you find is, people can very 506 00:32:08,680 --> 00:32:14,040 Speaker 1: easily move between ideologies, um. And as they move between 507 00:32:14,640 --> 00:32:19,320 Speaker 1: ideologies, concepts, specific schools, they cross-pollinate these schools, um. 508 00:32:19,320 --> 00:32:20,920 Speaker 1: And this is how you get these kinds of highly 509 00:32:21,240 --> 00:32:26,360 Speaker 1: syncretic movements like QAnon, um, like the modern 510 00:32:26,440 --> 00:32:31,520 Speaker 1: conspiracy movement, um, which is incredibly syncretic, and, um, some 511 00:32:31,560 --> 00:32:33,920 Speaker 1: of the other really bad ones that are out there 512 00:32:33,920 --> 00:32:39,120 Speaker 1: as well that combine these different views. Um. Specifically, when 513 00:32:39,160 --> 00:32:40,680 Speaker 1: you start combining this type of, like, 514 00:32:40,800 --> 00:32:47,520 Speaker 1: cultural mysticism with politics, often you can have very volatile results. Yes, exactly.
515 00:32:48,240 --> 00:32:53,920 Speaker 1: Can you think of any examples? I mean, in some ways, 516 00:32:53,960 --> 00:32:55,880 Speaker 1: the modern eco-fascist movement is built on a lot 517 00:32:55,880 --> 00:32:57,440 Speaker 1: of this type of stuff, so that would be 518 00:32:57,440 --> 00:33:00,200 Speaker 1: the easiest one. The, I 519 00:33:00,200 --> 00:33:02,560 Speaker 1: think, the syncretism of, because I think a lot 520 00:33:02,600 --> 00:33:05,240 Speaker 1: of people have been surprised to see, like, you know, 521 00:33:05,440 --> 00:33:09,480 Speaker 1: kind of, like, natural medicine and whatnot subcultures, 522 00:33:09,520 --> 00:33:13,000 Speaker 1: and, like, alien subcultures, kind of colliding with 523 00:33:13,240 --> 00:33:16,440 Speaker 1: QAnon and these more, like, far right 524 00:33:17,000 --> 00:33:19,160 Speaker 1: neo-Nazi type groups. And the fact that there are 525 00:33:19,160 --> 00:33:21,760 Speaker 1: all of these things that were associated for years kind 526 00:33:21,760 --> 00:33:25,440 Speaker 1: of more with the left have been increasingly, um, pulled 527 00:33:25,560 --> 00:33:30,600 Speaker 1: into this sort of weather system of conspiratorial thinking 528 00:33:30,600 --> 00:33:32,240 Speaker 1: has been surprising to a lot of people who don't 529 00:33:32,280 --> 00:33:34,440 Speaker 1: understand this stuff. But it makes total sense if you 530 00:33:34,920 --> 00:33:38,520 Speaker 1: have been paying attention to the scholarship on 531 00:33:38,520 --> 00:33:41,680 Speaker 1: what is actually, like, how cults sort of form. 532 00:33:41,760 --> 00:33:45,200 Speaker 1: Like, it's, um, it's like a weather pattern 533 00:33:45,240 --> 00:33:47,680 Speaker 1: that's been building for quite a while.
There's a gravity 534 00:33:47,720 --> 00:33:51,080 Speaker 1: to it that sucks, um, everything in together, and it 535 00:33:51,160 --> 00:33:55,040 Speaker 1: all kind of, it's, as you said, syncretic, um. 536 00:33:55,160 --> 00:33:57,880 Speaker 1: Not to get into horseshoe theory, but this is even 537 00:33:57,880 --> 00:34:00,240 Speaker 1: how you get some of that crossover, right? Yes, yeah, 538 00:34:00,280 --> 00:34:01,560 Speaker 1: that was what I was just 539 00:34:01,600 --> 00:34:03,360 Speaker 1: going to mention, is that even a lot of, like, 540 00:34:03,400 --> 00:34:05,640 Speaker 1: the left wing authors or, you know, post-left 541 00:34:05,680 --> 00:34:09,560 Speaker 1: authors who got into this, like, cultural mysticism, um, you 542 00:34:09,600 --> 00:34:13,000 Speaker 1: see their texts now getting shared by, like, open fascists. 543 00:34:13,080 --> 00:34:16,920 Speaker 1: Even though these authors were anti-fascist, um, they are 544 00:34:16,960 --> 00:34:19,160 Speaker 1: able to still pick and choose what parts of their writing 545 00:34:19,200 --> 00:34:22,400 Speaker 1: to appropriate, because some of it can kind of synchronize. 546 00:34:24,040 --> 00:34:27,560 Speaker 1: And from opposite ends, for a very long time. Like, if 547 00:34:27,560 --> 00:34:29,720 Speaker 1: you, we talked about, in our Gabriele D'Annunzio 548 00:34:29,800 --> 00:34:32,880 Speaker 1: episodes, Fiume, which was this kind of, like, where a 549 00:34:33,000 --> 00:34:37,040 Speaker 1: large chunk of, like, the fascist intellectual movement got started, 550 00:34:37,480 --> 00:34:39,680 Speaker 1: um, in the post-World War One period, but also 551 00:34:39,719 --> 00:34:41,520 Speaker 1: there were, like, a ton of anarchists and a lot 552 00:34:41,560 --> 00:34:45,320 Speaker 1: of, like, left wing, um, like, thought leaders and whatnot.
553 00:34:45,360 --> 00:34:47,359 Speaker 1: They were kind of all, it was, again, kind of, there 554 00:34:47,360 --> 00:34:50,240 Speaker 1: was this kind of, like, gravity center that pulled 555 00:34:50,320 --> 00:34:54,720 Speaker 1: everything in, and it all started churning together, and, um, 556 00:34:54,840 --> 00:34:58,720 Speaker 1: yeah, we're seeing that happen now, um, 557 00:34:59,440 --> 00:35:03,799 Speaker 1: and yeah, it sucks. That's, that's great. To jump 558 00:35:03,840 --> 00:35:05,319 Speaker 1: back to Campbell, that's one of those examples of those 559 00:35:05,320 --> 00:35:07,600 Speaker 1: physical spaces of the cultic milieu that Colin Campbell was 560 00:35:07,640 --> 00:35:11,360 Speaker 1: talking about, right? Um, where it's any place for 561 00:35:11,440 --> 00:35:14,839 Speaker 1: ideas that are rejected by, you know, the orthodoxy, kind 562 00:35:14,840 --> 00:35:18,960 Speaker 1: of the establishment. There is overlap, there is not necessarily 563 00:35:19,000 --> 00:35:21,960 Speaker 1: ideological overlap, there is an interplay between them as people 564 00:35:22,000 --> 00:35:24,920 Speaker 1: move between them and as these ideas come into collision 565 00:35:25,000 --> 00:35:29,520 Speaker 1: with one another. Um, and with the Internet, right, whole 566 00:35:29,719 --> 00:35:34,640 Speaker 1: different fucking ball game. Um, yeah, because that space is 567 00:35:34,680 --> 00:35:41,960 Speaker 1: now everywhere. Exactly, and Telegram specifically has the 568 00:35:42,040 --> 00:35:45,200 Speaker 1: affordances that make it ideal for having this soup of 569 00:35:45,280 --> 00:35:50,160 Speaker 1: bullshit on it as well. Um. It's additionally one 570 00:35:50,200 --> 00:35:52,560 Speaker 1: of, and this may be changing.
There's a lot of 571 00:35:52,560 --> 00:35:55,600 Speaker 1: discussion going on about this, especially within the German government, 572 00:35:55,640 --> 00:35:57,880 Speaker 1: who actually already have a lot that they 573 00:35:57,920 --> 00:36:00,719 Speaker 1: could use to say, hey, you can't have Nazis, 574 00:36:00,920 --> 00:36:05,520 Speaker 1: you can't have this Nazi shit on Telegram. Um. But 575 00:36:05,600 --> 00:36:08,680 Speaker 1: Telegram's one of these last places where things are 576 00:36:08,719 --> 00:36:14,320 Speaker 1: largely allowed to spread without any kind of interruption, right? Um, 577 00:36:14,360 --> 00:36:16,160 Speaker 1: which, I do think, you know, you look at how Telegram 578 00:36:16,160 --> 00:36:19,880 Speaker 1: was used in, um, the Hong Kong uprising as well, 579 00:36:20,040 --> 00:36:23,879 Speaker 1: it was used in the, yeah, 580 00:36:23,880 --> 00:36:27,920 Speaker 1: the George Floyd uprising as well. Um. And it's the 581 00:36:27,960 --> 00:36:31,080 Speaker 1: same things that people use, too. Time is fake, um, 582 00:36:32,000 --> 00:36:36,080 Speaker 1: abolish the lane of your time, um, but shoot your clock, 583 00:36:37,160 --> 00:36:41,319 Speaker 1: shoot the fucking clock. But okay, let's get back 584 00:36:41,360 --> 00:36:44,359 Speaker 1: to the topic. Yeah, but, jumping back 585 00:36:44,360 --> 00:36:47,200 Speaker 1: into this, Telegram markets itself as this very secure platform. 586 00:36:47,760 --> 00:36:50,799 Speaker 1: It's probably not, right? It's certainly not, 587 00:36:51,280 --> 00:36:54,560 Speaker 1: absolutely not. It does have encrypted chats, 588 00:36:54,560 --> 00:36:58,239 Speaker 1: but that's only for one-to-one messaging between people. 589 00:36:58,280 --> 00:36:59,879 Speaker 1: And even then you need to go and set 590 00:37:00,239 --> 00:37:03,359 Speaker 1: the security settings right.
And again, I don't fully 591 00:37:03,360 --> 00:37:07,480 Speaker 1: trust that. Yeah, I don't fully trust it. I mean, Signal, 592 00:37:07,560 --> 00:37:15,480 Speaker 1: I barely trust Signal. Yeah, yeah, I trust conversations 593 00:37:15,520 --> 00:37:18,439 Speaker 1: when everyone has put their phone inside a Faraday bag 594 00:37:18,480 --> 00:37:20,759 Speaker 1: in a house, and then we walked two miles into 595 00:37:21,239 --> 00:37:23,600 Speaker 1: the woods. Then you can 596 00:37:23,680 --> 00:37:28,000 Speaker 1: have a conversation. Yeah. Um, but Telegram markets itself as 597 00:37:28,000 --> 00:37:31,120 Speaker 1: a very secure app, right, um, which is 598 00:37:31,160 --> 00:37:33,480 Speaker 1: largely marketing. You know, its appeal is that it's 599 00:37:33,520 --> 00:37:36,240 Speaker 1: not WhatsApp, it's not owned by Facebook. It's probably 600 00:37:36,239 --> 00:37:39,000 Speaker 1: worth acknowledging that, because it's also very popular with 601 00:37:39,000 --> 00:37:41,239 Speaker 1: a lot of people in, um, you know, parts of 602 00:37:41,280 --> 00:37:44,279 Speaker 1: the global South and countries with authoritarian governments, and it 603 00:37:44,360 --> 00:37:46,320 Speaker 1: has been used for a lot of organizing, and it 604 00:37:46,440 --> 00:37:50,279 Speaker 1: can be more secure, 605 00:37:50,360 --> 00:37:55,040 Speaker 1: but also more available, than any other tool people have 606 00:37:55,120 --> 00:37:58,160 Speaker 1: access to.
I mean, in Syria, it's, like, again, 607 00:37:58,280 --> 00:38:01,839 Speaker 1: extremely common for, like, neighborhoods and towns to 608 00:38:01,880 --> 00:38:04,360 Speaker 1: have, like, Telegram groups for this little village, where 609 00:38:04,520 --> 00:38:06,640 Speaker 1: a lot of stuff gets done over Telegram in places 610 00:38:06,680 --> 00:38:10,000 Speaker 1: like that. And Telegram sits in this interesting space within 611 00:38:10,120 --> 00:38:13,480 Speaker 1: social media. Um. It's not a full-on social media site, 612 00:38:13,480 --> 00:38:17,040 Speaker 1: but it's also not just a messaging app. Like, Telegram, 613 00:38:17,840 --> 00:38:20,480 Speaker 1: to categorize it, is an interesting sort of in 614 00:38:20,520 --> 00:38:24,919 Speaker 1: between type thing. Yeah, because you can have essentially unlimited, 615 00:38:25,200 --> 00:38:27,520 Speaker 1: I think the number is in the hundreds of thousands, 616 00:38:27,560 --> 00:38:30,840 Speaker 1: for how many people can join a group message, um, 617 00:38:30,880 --> 00:38:34,960 Speaker 1: on Telegram. And you also have these one-way messaging 618 00:38:34,960 --> 00:38:37,319 Speaker 1: things called channels, where one person or a group of 619 00:38:37,320 --> 00:38:41,280 Speaker 1: people can send out messages that appear alongside everyone else's 620 00:38:41,320 --> 00:38:45,239 Speaker 1: message feed as well. Um, and you can 621 00:38:45,239 --> 00:38:48,120 Speaker 1: also enable comments on that, um, which we'll get into 622 00:38:48,160 --> 00:38:50,560 Speaker 1: in a second, um. But it's a great way 623 00:38:50,640 --> 00:38:53,600 Speaker 1: to share information as well.
And what I was specifically 624 00:38:53,600 --> 00:38:56,880 Speaker 1: looking at is the forwarding of messages, because you 625 00:38:56,920 --> 00:39:00,000 Speaker 1: can forward a message from this one channel into whatever 626 00:39:00,000 --> 00:39:02,000 Speaker 1: group chat you're in, and it links back to 627 00:39:02,040 --> 00:39:05,319 Speaker 1: that channel. And I was interested in seeing, how far, 628 00:39:05,800 --> 00:39:07,440 Speaker 1: you know, what connections can we make from this? What 629 00:39:07,560 --> 00:39:11,759 Speaker 1: kind of zigzagging can we find? Um, and the answer is, fucking 630 00:39:11,800 --> 00:39:18,880 Speaker 1: a lot. Uh, where someone may use Telegram 631 00:39:18,920 --> 00:39:21,160 Speaker 1: for, for example, a neighborhood group message, right, and then 632 00:39:21,640 --> 00:39:25,040 Speaker 1: someone forwards a message from this channel, um, 633 00:39:25,280 --> 00:39:28,600 Speaker 1: or from this other group message, um, where they talk about, oh, 634 00:39:28,680 --> 00:39:31,200 Speaker 1: here's kind of health practices to use. And then you 635 00:39:31,239 --> 00:39:35,560 Speaker 1: get into the pseudoscience of things, crossing into further 636 00:39:35,600 --> 00:39:38,440 Speaker 1: groups and channels from what's 637 00:39:38,440 --> 00:39:40,960 Speaker 1: forwarded into that group and channel, and so on and so on, 638 00:39:41,040 --> 00:39:44,680 Speaker 1: until you get to the neo-Nazis eventually, um. And 639 00:39:44,719 --> 00:39:47,400 Speaker 1: it's also, it is a concerted effort on 640 00:39:47,440 --> 00:39:51,480 Speaker 1: the part of people pushing their ideology, who will go 641 00:39:51,520 --> 00:39:55,240 Speaker 1: in the comments of these giant channels and say, hey, 642 00:39:55,480 --> 00:40:00,680 Speaker 1: check out my channel. Um, what's not a real one?
643 00:40:00,760 --> 00:40:04,319 Speaker 1: You know, Aryan Cooking, which is probably a channel. But 644 00:40:04,360 --> 00:40:09,080 Speaker 1: it probably is. Yeah, great job checking. Sorry. But check 645 00:40:09,120 --> 00:40:11,560 Speaker 1: out, check out this, or whatever. 646 00:40:11,719 --> 00:40:15,640 Speaker 1: And especially when QAnon moved on, a 647 00:40:15,680 --> 00:40:22,120 Speaker 1: lot of promoters, um, there were organized groups 648 00:40:22,120 --> 00:40:24,879 Speaker 1: of internet neo-Nazis going on and trying to pill 649 00:40:24,880 --> 00:40:29,600 Speaker 1: boomers into neo-Nazism. And because of 650 00:40:29,640 --> 00:40:32,080 Speaker 1: the mesh-like network of Telegram, they try to make 651 00:40:32,160 --> 00:40:35,759 Speaker 1: those meshes connect via dissemination. Right, you know, 652 00:40:35,800 --> 00:40:38,480 Speaker 1: people who are dedicated to these more esoteric groups can 653 00:40:38,560 --> 00:40:42,600 Speaker 1: join more regular, like, MAGA groups or QAnon groups 654 00:40:42,719 --> 00:40:46,240 Speaker 1: and start slowly dropping links 655 00:40:46,239 --> 00:40:50,799 Speaker 1: and forwards to the more extreme channels. Um. And eventually, yeah, 656 00:40:50,840 --> 00:40:52,680 Speaker 1: that does work. It can be a slow, 657 00:40:52,719 --> 00:40:55,840 Speaker 1: careful process, um, or it can be very fast and, 658 00:40:56,640 --> 00:41:00,640 Speaker 1: like, bombastic. And depending, one of 659 00:41:00,680 --> 00:41:03,680 Speaker 1: them will latch on one way or the other. Yeah.
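The zigzag of forwards described here, a post hopping from a neighborhood chat through health and conspiracy spaces until it reaches extremist channels, is essentially a path in a directed graph, and the chain can be sketched with a simple breadth-first search. Every channel name below is invented for illustration; a real graph would be built from scraped forward metadata:

```python
from collections import deque

# Hypothetical forwarding edges: "posts from X get forwarded into Y".
# All names are made up for the sketch.
forward_edges = {
    "neighborhood_group": ["alt_health_channel"],
    "alt_health_channel": ["pseudoscience_group"],
    "pseudoscience_group": ["conspiracy_channel"],
    "conspiracy_channel": ["extremist_channel"],
    "extremist_channel": [],
}

def forwarding_path(graph, start, target):
    """Breadth-first search for the shortest chain of forwards from start to target."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no forwarding chain connects the two

print(forwarding_path(forward_edges, "neighborhood_group", "extremist_channel"))
```

Each hop in the returned path is one of the "and so on" steps being described; with real data, the notable finding is usually how short these chains turn out to be.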
660 00:41:03,719 --> 00:41:07,640 Speaker 1: And, and before the article came out, um, what I 661 00:41:07,680 --> 00:41:11,640 Speaker 1: did see was the specific thing of accounts that I 662 00:41:11,640 --> 00:41:15,919 Speaker 1: would associate, or, or believe to be neo-Nazi, um, 663 00:41:16,000 --> 00:41:19,399 Speaker 1: encouraging people to join their groups and channels, um, in 664 00:41:19,440 --> 00:41:23,680 Speaker 1: the, in the Telegram group messages as well. And I 665 00:41:23,680 --> 00:41:25,680 Speaker 1: cannot speak to what that looks like right now, after 666 00:41:25,680 --> 00:41:29,920 Speaker 1: the article has come out. Yeah, and I've been trying 667 00:41:29,960 --> 00:41:32,120 Speaker 1: to take a break from Telegram for my day to 668 00:41:32,160 --> 00:41:35,640 Speaker 1: day life, um, and focus on reading actual books. So, 669 00:41:36,320 --> 00:41:38,279 Speaker 1: but yeah, that is how I can always tell when 670 00:41:38,320 --> 00:41:40,960 Speaker 1: one of us has been spending time on Telegram, because 671 00:41:42,040 --> 00:41:48,040 Speaker 1: the things we consider jokes get much worse. Yeah. You 672 00:41:48,080 --> 00:41:52,760 Speaker 1: remember when I found that playlist of Blink covers? Covers. 673 00:41:53,200 --> 00:41:55,120 Speaker 1: It was, what, there was like a hundred of them, 674 00:41:55,239 --> 00:42:02,919 Speaker 1: Blink or some foul shit. Yeah, you always find, you could, 675 00:42:02,920 --> 00:42:05,759 Speaker 1: you could find the most fucked up stuff. Don't, don't 676 00:42:05,760 --> 00:42:09,080 Speaker 1: do it. Don't, don't scroll through it, do not. You're not 677 00:42:09,120 --> 00:42:11,440 Speaker 1: going to get, this isn't, like, forbidden knowledge. You 678 00:42:11,480 --> 00:42:15,080 Speaker 1: don't need to be, like, it's, it's not even, it's 679 00:42:15,080 --> 00:42:18,600 Speaker 1: not worth it. Like, there's no sacred, hidden, like, knowledge 680 00:42:18,600 --> 00:42:22,080 Speaker 1: that we're hiding.
It just, it's, it just kind of sucks, 681 00:42:22,080 --> 00:42:24,640 Speaker 1: like, it just, like, it just makes you feel bad. Yeah, 682 00:42:24,719 --> 00:42:28,520 Speaker 1: it just makes you feel worse about life and yourself 683 00:42:28,680 --> 00:42:31,560 Speaker 1: and the people around you. So the scope for your 684 00:42:31,600 --> 00:42:34,120 Speaker 1: master's project, what, it's kind of the, what's the, what's 685 00:42:34,120 --> 00:42:39,520 Speaker 1: the deal with, like, tying these things together? I guess, yeah. Yeah, 686 00:42:39,600 --> 00:42:42,080 Speaker 1: so, so, using this social network analysis to argue that 687 00:42:42,360 --> 00:42:46,640 Speaker 1: Telegram does function as this cultic milieu, um, 688 00:42:46,840 --> 00:42:51,200 Speaker 1: which, yeah, it seems, seems to be. It seems to 689 00:42:51,239 --> 00:42:55,160 Speaker 1: be the case. Yeah. Um. You know, and the question 690 00:42:55,160 --> 00:42:58,839 Speaker 1: gets into, what is the responsibility of the platform? Right, um, 691 00:42:58,880 --> 00:43:03,440 Speaker 1: because I fully believe there should be something at least 692 00:43:03,880 --> 00:43:07,080 Speaker 1: similar to this. It has been used for, you know, 693 00:43:07,200 --> 00:43:09,759 Speaker 1: purposes aligned with my politics, which I would call 694 00:43:10,000 --> 00:43:16,280 Speaker 1: good and needed. Um. However, they've also allowed this fucking 695 00:43:16,320 --> 00:43:20,359 Speaker 1: awful ecosystem to spread. It's, it's interesting to see when 696 00:43:20,360 --> 00:43:22,440 Speaker 1: Telegram has had to step in, and they, you know, 697 00:43:22,520 --> 00:43:26,200 Speaker 1: they have pulled down some ISIS accounts and channels. Um. 698 00:43:27,080 --> 00:43:31,600 Speaker 1: And, and they have. When I was in, um, 699 00:43:31,760 --> 00:43:33,520 Speaker 1: al-Hol, which is the camp where all the 700 00:43:33,560 --> 00:43:36,239 Speaker 1: ISIS prisoners were in Syria.
Like, while Jake and I 701 00:43:36,280 --> 00:43:39,440 Speaker 1: were in the camp, we could see on Telegram, like, 702 00:43:39,560 --> 00:43:42,919 Speaker 1: ISIS supporters in al-Hol talking about stabbing guards, like, 703 00:43:44,239 --> 00:43:48,920 Speaker 1: in real time. It was not particularly, uh. They've done, 704 00:43:48,960 --> 00:43:51,040 Speaker 1: like, a lot. There's less of it than there used 705 00:43:51,040 --> 00:43:52,719 Speaker 1: to be, but it is still not hard to find 706 00:43:52,760 --> 00:43:57,560 Speaker 1: ISIS on Telegram. Yeah. And they, they've taken down a 707 00:43:57,600 --> 00:44:00,399 Speaker 1: fair amount of neo-Nazi channels. Um. It's, it's funny 708 00:44:00,440 --> 00:44:04,279 Speaker 1: because, oh god, maybe cut this, but they've taken down 709 00:44:04,360 --> 00:44:05,880 Speaker 1: some of the, you know, antifascist channels when they've shared 710 00:44:05,960 --> 00:44:10,160 Speaker 1: ICE shit, for example. Yeah, yeah, I think we're, we've, 711 00:44:10,360 --> 00:44:12,760 Speaker 1: we're familiar with that line of thinking. That's something we've 712 00:44:12,760 --> 00:44:17,040 Speaker 1: mentioned before. Okay. There has been pressure from, from the 713 00:44:17,040 --> 00:44:19,719 Speaker 1: Play Store and Google as well, or the Play Store 714 00:44:19,719 --> 00:44:23,240 Speaker 1: from Google, and, and the App Store, Apple's App Store 715 00:44:23,680 --> 00:44:26,840 Speaker 1: from Apple, to say, we, we aren't going to carry 716 00:44:26,840 --> 00:44:30,719 Speaker 1: the app if you don't do just a tiny bit 717 00:44:30,760 --> 00:44:35,480 Speaker 1: better, essentially. Um, which, which, which, also, it exists as 718 00:44:35,520 --> 00:44:38,440 Speaker 1: a web client, um, both as, as a web client 719 00:44:38,520 --> 00:44:41,719 Speaker 1: and as a desktop app as well. Um. But that 720 00:44:41,760 --> 00:44:45,640 Speaker 1: would, you know, limit some of it. Um.
So, so, 721 00:44:45,719 --> 00:44:49,640 Speaker 1: this has become largely discussed in the German, in the 722 00:44:49,640 --> 00:44:53,200 Speaker 1: German parliament, because there's a new, um, there's a new 723 00:44:53,239 --> 00:44:56,799 Speaker 1: government in Germany, and, and there is this history of 724 00:44:57,880 --> 00:45:03,800 Speaker 1: Germany kind of being in the lead for doing things about 725 00:45:03,800 --> 00:45:08,040 Speaker 1: this digital content, especially within the EU. And, and as 726 00:45:08,080 --> 00:45:10,360 Speaker 1: I mentioned, there's already a law called the Network Enforcement Act that 727 00:45:10,440 --> 00:45:14,920 Speaker 1: requires platforms to take down content in Germany. Um. It 728 00:45:14,920 --> 00:45:20,280 Speaker 1: could be implemented on Telegram as well. There's already a law. Yeah, 729 00:45:20,320 --> 00:45:23,839 Speaker 1: I mean, this is, like, the thing, why, you know, 730 00:45:23,960 --> 00:45:26,640 Speaker 1: what I watch happen lots is, you know, these channels 731 00:45:26,640 --> 00:45:28,399 Speaker 1: will get shut down and they'll make a new one, 732 00:45:28,440 --> 00:45:29,880 Speaker 1: and they'll shut down that one and make a 733 00:45:29,880 --> 00:45:31,560 Speaker 1: new one, right. You see this with, like, 734 00:45:31,560 --> 00:45:36,040 Speaker 1: Discord servers, Telegram channels. It is kind of this endless cycle, um, 735 00:45:36,120 --> 00:45:39,120 Speaker 1: and seeking an end to the cycle is never 736 00:45:39,280 --> 00:45:43,400 Speaker 1: as easy as one would hope, um, because of 737 00:45:43,480 --> 00:45:46,840 Speaker 1: the cyclical nature of building these platforms and connections and 738 00:45:46,920 --> 00:45:49,160 Speaker 1: how the people who run these, you know, intersect. And 739 00:45:49,160 --> 00:45:52,279 Speaker 1: specifically with, with Telegram,
it's really easy, because a channel gets shut down, 740 00:45:52,480 --> 00:45:54,520 Speaker 1: you're still part of twelve other channels, and odds 741 00:45:54,560 --> 00:45:56,360 Speaker 1: are one of those channels is going to forward you 742 00:45:56,400 --> 00:45:59,359 Speaker 1: the link to the new channel that was, that was lost. Yeah, 743 00:45:59,400 --> 00:46:00,799 Speaker 1: and this is the thing you see where they send 744 00:46:00,840 --> 00:46:04,759 Speaker 1: lists of channels. Within, within extremist groups and channels, they 745 00:46:04,760 --> 00:46:06,400 Speaker 1: will send out a list of, here's other groups and 746 00:46:06,480 --> 00:46:09,719 Speaker 1: channels to check out as well. Yeah. But, I mean, 747 00:46:09,800 --> 00:46:12,120 Speaker 1: I would, so that's something that's, you know, hard for 748 00:46:12,120 --> 00:46:14,880 Speaker 1: regular people to actually do. But something I think that 749 00:46:14,880 --> 00:46:18,480 Speaker 1: people who don't own these platforms and aren't lawmakers can 750 00:46:18,480 --> 00:46:23,560 Speaker 1: think about is particularly the, the cultic milieu that 751 00:46:23,640 --> 00:46:27,120 Speaker 1: does, you know, go past regular left-right divisions in 752 00:46:27,200 --> 00:46:31,520 Speaker 1: terms of politics, and how, you know, symbols, like symbology, 753 00:46:31,640 --> 00:46:35,360 Speaker 1: um, and stuff that was, you know, initially, you know, 754 00:46:35,719 --> 00:46:39,440 Speaker 1: perhaps more anarchist or, or left-wing is being used 755 00:46:39,480 --> 00:46:42,319 Speaker 1: by people on the right, and some people are really 756 00:46:42,360 --> 00:46:44,640 Speaker 1: confused by that, and there is, there are 757 00:46:44,680 --> 00:46:46,440 Speaker 1: ways to understand it, like, it is, it does.
I 758 00:46:46,920 --> 00:46:49,200 Speaker 1: am very frustrated when I look at, you know, people 759 00:46:49,200 --> 00:46:53,840 Speaker 1: online who don't understand why Nazis can use Ted K, 760 00:46:54,560 --> 00:46:57,160 Speaker 1: and, right, it's like, yeah, like, it's, it's not, it's, 761 00:46:57,200 --> 00:46:58,920 Speaker 1: it's not, it's not, it's not really about 762 00:46:58,920 --> 00:47:03,640 Speaker 1: what Ted K actually wrote. It's more the symbolic meme 763 00:47:03,880 --> 00:47:06,560 Speaker 1: of Ted K. And trying to, you know, get that, 764 00:47:06,640 --> 00:47:09,240 Speaker 1: get that line of thinking across is not the easiest 765 00:47:09,239 --> 00:47:11,000 Speaker 1: thing, because sometimes it will go in the other direction 766 00:47:11,000 --> 00:47:12,919 Speaker 1: and be like, oh, Ted K is a Nazi, which 767 00:47:13,040 --> 00:47:15,239 Speaker 1: isn't accurate either. Like, that's, that's not also the most 768 00:47:15,280 --> 00:47:19,400 Speaker 1: accurate thing to say. So it's, it's the cultic milieu framework 769 00:47:19,400 --> 00:47:22,120 Speaker 1: of being, yeah, sometimes these symbols can cross over from 770 00:47:22,160 --> 00:47:24,640 Speaker 1: one thing to another, and sometimes the action can be 771 00:47:24,680 --> 00:47:28,240 Speaker 1: the same, you know. Both anarchists and, like, insurrectionary fascists 772 00:47:28,280 --> 00:47:31,720 Speaker 1: both want to, like, attack, like, industrialization and attack points 773 00:47:31,719 --> 00:47:34,840 Speaker 1: of industry, right, but maybe their ideologies are slightly different, 774 00:47:34,960 --> 00:47:38,560 Speaker 1: sometimes in specific ways, so it's always a tricky thing 775 00:47:38,600 --> 00:47:42,319 Speaker 1: to kind of navigate. Um.
So I think, in terms of, 776 00:47:42,360 --> 00:47:45,719 Speaker 1: you know, people thinking about what symbols they promote, uh, 777 00:47:45,800 --> 00:47:48,319 Speaker 1: like, publicly and stuff, is a good thing, and think 778 00:47:48,360 --> 00:47:53,399 Speaker 1: about news aggregation and how to maybe not, not just 779 00:47:53,760 --> 00:47:56,840 Speaker 1: share something because it's countercultural, trying to figure out what 780 00:47:56,960 --> 00:48:01,280 Speaker 1: other, what other types of narratives the sources spread. Um, 781 00:48:02,200 --> 00:48:04,640 Speaker 1: and support the work of real journalists, because there's a bunch 782 00:48:04,640 --> 00:48:07,359 Speaker 1: of kick-ass people out there, um, who are doing 783 00:48:07,360 --> 00:48:11,880 Speaker 1: awesome research and work. So I think that kind of 784 00:48:11,920 --> 00:48:13,520 Speaker 1: wraps up the scope of what I want to talk 785 00:48:13,560 --> 00:48:16,839 Speaker 1: about around Disclose specifically, because, I mean, Disclose is a thing, 786 00:48:17,080 --> 00:48:19,160 Speaker 1: but it's also, like, it's good as just, like, an 787 00:48:19,160 --> 00:48:23,880 Speaker 1: example of this broader, like, phenomenon, I think, um, 788 00:48:23,920 --> 00:48:27,200 Speaker 1: because, like, Disclose won't be here forever, hopefully. Like, you know, 789 00:48:27,239 --> 00:48:29,080 Speaker 1: hopefully in a few years it's something that we can 790 00:48:29,160 --> 00:48:32,279 Speaker 1: just, like, look back on and laugh about. Um. But 791 00:48:32,440 --> 00:48:34,799 Speaker 1: it's, you know, it's still a good signifier for a 792 00:48:34,800 --> 00:48:38,040 Speaker 1: phenomenon that happens, and even, even if Disclose 793 00:48:38,080 --> 00:48:40,719 Speaker 1: goes away, the phenomenon is still going to stay.
Um, 794 00:48:40,719 --> 00:48:42,920 Speaker 1: and it's important to point it out when you see 795 00:48:42,960 --> 00:48:45,960 Speaker 1: whatever the next version of this is. So, and I'll 796 00:48:46,000 --> 00:48:49,839 Speaker 1: also say that the cultic milieu isn't necessarily a bad thing, right? 797 00:48:50,200 --> 00:48:52,480 Speaker 1: This is, you know, where, where the stuff that is 798 00:48:52,520 --> 00:48:57,319 Speaker 1: rejected by the orthodox goes, and clearly, eliminating, right, any 799 00:48:57,400 --> 00:48:59,520 Speaker 1: kind of cultic milieu just means everything is exactly the 800 00:48:59,560 --> 00:49:02,360 Speaker 1: fucking same, everything falls in line with orthodox belief, 801 00:49:02,400 --> 00:49:05,640 Speaker 1: which I strongly disagree with as well. No, there's, there 802 00:49:05,719 --> 00:49:10,880 Speaker 1: is a way to be countercultural without being a conspiratorial fascist. 803 00:49:11,840 --> 00:49:15,040 Speaker 1: I would say there's just more responsibility in what you're 804 00:49:15,040 --> 00:49:17,560 Speaker 1: consuming and sharing. Yes. And I would say, like, most 805 00:49:17,600 --> 00:49:20,800 Speaker 1: people who are actually counterculture are. Yeah, like, actual punk 806 00:49:21,280 --> 00:49:25,080 Speaker 1: is that, you know, once you're enforcing traditional hierarchical 807 00:49:25,120 --> 00:49:29,400 Speaker 1: viewpoints, that, that ain't punk. That is, uh, that's playing 808 00:49:29,400 --> 00:49:33,200 Speaker 1: into what the status quo is. That isn't, that isn't revolutionary. 809 00:49:35,160 --> 00:49:39,840 Speaker 1: The Sex Pistols would disagree with you. Aren't they all? Yeah. But 810 00:49:40,160 --> 00:49:42,360 Speaker 1: I think we can all agree that having living members 811 00:49:42,360 --> 00:49:45,320 Speaker 1: of the Sex Pistols was a mistake, and I would, I 812 00:49:45,360 --> 00:49:48,600 Speaker 1: prefer Lana Wachowski's version of punk to theirs anyway.
813 00:49:48,719 --> 00:49:54,040 Speaker 1: So, hey, who cares? Um. So thank you for your work, Thomas. Um. 814 00:49:54,120 --> 00:49:57,319 Speaker 1: I would recommend people read your article, um, which you 815 00:49:57,360 --> 00:50:01,040 Speaker 1: can do by googling Disclose TV, and, uh, it will 816 00:50:01,160 --> 00:50:03,160 Speaker 1: be, for me it's the second result that pops up, 817 00:50:03,200 --> 00:50:06,160 Speaker 1: so that's cool. Um. Send it to all your friends 818 00:50:06,160 --> 00:50:08,719 Speaker 1: and mutuals who are sharing Disclose TV. Um. You can 819 00:50:08,760 --> 00:50:12,080 Speaker 1: find it on logically dot ai, that's the website, and 820 00:50:12,080 --> 00:50:15,279 Speaker 1: the full title of the article is Disclose TV, Conspiracy 821 00:50:15,280 --> 00:50:19,880 Speaker 1: Forum Turned Disinformation Factory. Thank you, thank you for that. Um. 822 00:50:20,120 --> 00:50:22,200 Speaker 1: Do you want to direct people to your Twitter account, 823 00:50:22,280 --> 00:50:23,720 Speaker 1: or do you want to be a ghost that fades 824 00:50:23,719 --> 00:50:26,960 Speaker 1: away in their memory? Uh, just don't be fucking weird. 825 00:50:27,000 --> 00:50:31,720 Speaker 1: You can find me on Twitter at W underscore underscore Thomas. 826 00:50:31,719 --> 00:50:36,120 Speaker 1: God damnit, Christ. Be weird. All right, I guess I'm 827 00:50:36,160 --> 00:50:41,520 Speaker 1: keeping my account locked forever. Um. Yeah, I also want to 828 00:50:41,520 --> 00:50:44,960 Speaker 1: shout out, um, some of the local mutual aid, or 829 00:50:45,000 --> 00:50:47,120 Speaker 1: one of the local mutual aid groups in the town 830 00:50:47,120 --> 00:50:49,040 Speaker 1: where I live, or in the area where I live: 831 00:50:49,320 --> 00:50:53,600 Speaker 1: the Atlanta Justice Alliance.
Their Cash App is, cash symbol, 832 00:50:53,680 --> 00:50:55,719 Speaker 1: A T L mutual fund, and their Venmo is A T 833 00:50:55,880 --> 00:51:00,960 Speaker 1: L mutual fund. They're helping out. Um, they, they've done weekly, 834 00:51:01,440 --> 00:51:05,960 Speaker 1: um, weekly provided food and resources for unhoused 835 00:51:05,960 --> 00:51:09,880 Speaker 1: people living in downtown Atlanta, um, and are a great group. 836 00:51:10,280 --> 00:51:13,200 Speaker 1: And then also, if people want to give more money to things, 837 00:51:13,239 --> 00:51:16,279 Speaker 1: shout out Atlanta Solidarity Fund, who have helped many of 838 00:51:16,280 --> 00:51:18,400 Speaker 1: my friends get out of jail after they were arrested 839 00:51:18,400 --> 00:51:24,120 Speaker 1: at protests. And also, you can hire me. Yes, if 840 00:51:24,160 --> 00:51:27,520 Speaker 1: you need researchers, yes you can. You can, you can hire 841 00:51:27,560 --> 00:51:30,319 Speaker 1: Thomas if you want. I mean, I've, I've, I've, I've 842 00:51:30,360 --> 00:51:36,160 Speaker 1: known Thomas for a bit. Um, they do really good work. Um. Yeah, 843 00:51:36,239 --> 00:51:38,920 Speaker 1: they're very, they're very, they're, in my experience, they're a very 844 00:51:38,920 --> 00:51:42,359 Speaker 1: careful researcher. They will not say things without thinking about 845 00:51:42,400 --> 00:51:45,160 Speaker 1: them a lot first, which is always great in a researcher, 846 00:51:46,440 --> 00:51:50,000 Speaker 1: or at least not publicly. Send them money and 847 00:51:50,040 --> 00:51:52,960 Speaker 1: off-putting comments. An even mix of money and really 848 00:51:53,000 --> 00:51:55,920 Speaker 1: off-putting Twitter comments.
One more shout out to my, 849 00:51:56,000 --> 00:51:58,120 Speaker 1: one more shout out to my friends at Terrorism Bad Pod, 850 00:51:58,200 --> 00:52:00,840 Speaker 1: which you should listen to, and is on Twitter at terrorism 851 00:52:00,880 --> 00:52:05,400 Speaker 1: bad pod. Well, that does it for us today. If you, 852 00:52:05,560 --> 00:52:08,000 Speaker 1: for some reason, are on social media and you want 853 00:52:08,000 --> 00:52:09,640 Speaker 1: to follow us, you can follow us at cool zone 854 00:52:09,680 --> 00:52:13,239 Speaker 1: media, or, absolutely don't do that, or at happen here pod, um. 855 00:52:13,280 --> 00:52:16,399 Speaker 1: You can follow Robert Evans at I Write Okay. Send 856 00:52:16,480 --> 00:52:19,080 Speaker 1: him weird messages. Do not do that. And you can 857 00:52:19,120 --> 00:52:25,280 Speaker 1: send me weird messages at creep time. All right, Garrison, 858 00:52:25,360 --> 00:52:29,000 Speaker 1: pictures of salads that you make, and keep, keep doing 859 00:52:29,000 --> 00:52:31,040 Speaker 1: that for like five or six years, to the point 860 00:52:31,120 --> 00:52:33,560 Speaker 1: that it actually becomes funny, because it's going to take 861 00:52:33,560 --> 00:52:35,600 Speaker 1: a while. I'm just happy that people stopped sending me 862 00:52:35,640 --> 00:52:40,520 Speaker 1: meal porn. So, honestly, that's a win. That one's 863 00:52:40,560 --> 00:52:50,280 Speaker 1: on you, though. Goodbye, everybody. It Could Happen 864 00:52:50,280 --> 00:52:52,640 Speaker 1: Here is a production of Cool Zone Media. For more 865 00:52:52,640 --> 00:52:55,439 Speaker 1: podcasts from Cool Zone Media, visit our website cool zone 866 00:52:55,440 --> 00:52:57,319 Speaker 1: media dot com, or check us out on the I 867 00:52:57,400 --> 00:53:00,719 Speaker 1: Heart Radio app, Apple Podcasts, or wherever you listen to podcasts.
868 00:53:01,239 --> 00:53:03,400 Speaker 1: You can find sources for It Could Happen Here, updated 869 00:53:03,440 --> 00:53:06,920 Speaker 1: monthly, at cool zone media dot com slash sources. Thanks 870 00:53:06,920 --> 00:53:07,480 Speaker 1: for listening.