Robert: Hey, everybody, Robert Evans here, and I wanted to let you know this is a compilation episode. Every episode of the week that just happened is here in one convenient, and somewhat less ad-filled, package for you to listen to in a long stretch if you want. If you've been listening to the episodes every day this week, there's going to be nothing new here for you, but you can make your own decisions.

Robert: Welcome to It Could Happen Here, a podcast about things falling apart, how to deal with that, and hopefully how to take care of yourself and your people. Today we have a returning guest, Karl Kasarda from InRange TV. Now, Karl, every time you and I have chatted on a show together, it has been about firearms, which is obviously your passion and specialty. Well, one of your specialties. But today we're not talking at all about guns, I mean, maybe here and there, but today we're talking about the thing that has been your career for most of your working life.
Robert: Fair to say? You want to kind of walk through your background? Because we're going to be talking about information security and sort of the future of threats that are going to be coming throughout the next few years of our lives. Obviously this year in particular there have been a bunch of stories about Russian attacks on digital infrastructure and vice versa. And that's pretty much been on everybody's back burner since we got the internet, usually by way of questionable films with Sandra Bullock. I think that was The Net, right?

Karl: The Net. The Net, yes, exactly.

Robert: Yes, where they somehow hacked a car in 1995 or something.

Karl: You've got to do that when you're flying through cyberspace, right? Yeah.

Robert: But yeah, you want to walk everyone through kind of what your actual background is in this industry first?

Karl: Yeah, totally. So if anyone watches InRange, or has watched it for a long time, you'll see this reflected in some of my content, because I do deal with some of this intermittently on the channel.
Karl: It's definitely influenced how I approach my work there, with the social media and all that. Way back when, I was one of those kids in the hacker space. I grew up trying to make computers and technology do what they weren't designed to do, and learning to make them do things they shouldn't have done, for my own interests or for others around me. Not in any really negative way, just a deep curiosity about how this stuff works, and being part of the early online community. We're talking pre-internet, where you had an acoustic coupler modem and you would dial in.

Robert: Like WarGames.

Karl: Yeah, literally plug your handset into it. I was on boards like that way back when. We never should have gone past those days.
Robert: Doing things wirelessly was such a mistake. I'm so pissed off that when I sit down to research, I'm not jacking into a gigantic box. Shadowrun promised me that I was going to be using one hand to shoot at the approaching corporate security guards and have the other hand on the keyboard I wear around my neck, the one I plug into the wall to hack buildings.

Karl: Well, hey, maybe someday we'll have neurological implants or wet-wire implants, brought to us by Monsanto, that'll actually be DRMed, and we'll just get shut off in our own rooms.

Robert: From your mouth to God's ears, Karl.

Karl: Absolutely. Who doesn't want that? Who doesn't want my neural tissue tied directly to a corporation? Fuck yes. But anyway, so I grew up in that space, and back then it naturally turned into a career. It wasn't like now. Nowadays you pretty much have to go get a bunch of certificates and a college degree to even start looking at an infosec career.
Karl: But back then, if you kind of had skills with a "z" at the end, you could get a job. I landed up doing help desk at this one company. They noticed that's where my interests were, and I ended up becoming their information security architect over a couple of years. That turned into a multiple-decade career, pretty much culminating in working at a Tier 1 internet backbone provider, doing subsea fiber optic routing and networking, DDoS mitigation, and botnet command-and-control search and destroy. So it turned into a really wide career: not only backbone internet, but encryption, firewalls, and application-layer controls, across the board, for multiple corporations. It was a weird and interesting space. I don't really do that much anymore, except on the side, but I've had a pretty exciting career with it.

Robert: So I think probably a good place to start is just in general, because folks are always interested in this.
Robert: What is your recommendation when people ask, "What should I be doing to protect myself as I force my head under the constant stream of sewer water that is social media these days?"

Karl: Well, yeah, you know, everything in infosec is controversial, just like anything in life. Any recommendation you make, someone's going to say "but otherwise," or "there's a better solution," and there always is a better solution. But realistically, when you talk to the average person, the average person isn't going to sit there and hack on a Linux box to have a better social media experience. That's just not realistic. So the best thing anyone can do, the simplest best thing, is to get one of the trusted password managers. There are a number of them out there. I'm not going to recommend an individual one right now, because whichever one I recommend, someone's going to say "but there's another one." But there are a few of them out there.
Karl: Having a password manager, and having a unique, difficult, complex password for every account you log into on the internet, is the number one thing you can do as an individual to protect your interests. Because if you're logging in with the same password, "monkey," to Facebook, Twitter, and your bank account, that is a disaster waiting to happen. So the first thing you can do: a password manager, with passwords you yourself can't remember. I let the password manager generate twenty-four-character-long alphanumeric crypto nonsense. Put a gun to my mouth and say, "What's your password to your bank?" I don't know. I can't give it to you. I have no idea. So that right there is the first thing any individual can do to protect themselves on the internet.

Robert: That's totally sensible. I'm not great with password managers, but I never know what my passwords are, and they're all different, and so my life is this constant stream of needing to figure out what my password was, failing, and resetting it.
Robert: But it does mean that I change passwords regularly.

Karl: Right. But here's what's so great about password managers: you can have passwords that no human could ever remember, and you can have unique ones per website. Every website you log into can be unique. And by having them in this database that's properly encrypted with a key phrase, or even dual-factor, you can literally just cut and paste your passwords into things without yourself knowing what they are. Depending on your privacy needs, you can do that locally, with local solutions, with files on your own machine. But frankly, a couple of the cloud-based solutions, as much as the cloud freaks people out, are the better option, because they'll work on your phone, they'll work on your laptop, they'll work on everything, everywhere.

Robert: That makes total sense. I think another good thing to get into while we're on this subject.
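Karl's core advice, one long machine-generated password per site, stored rather than memorized, can be sketched in a few lines of Python. This is an illustrative sketch, not anything from the episode; the site names and the 24-character default are just stand-ins:

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Generate the kind of random 'crypto nonsense' a password manager
    stores for you, drawn from letters, digits, and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One independently generated password per site, so one breach can't cascade.
vault = {site: generate_password() for site in ("facebook", "twitter", "bank")}
```

Because every value is generated independently, a breach of one site's password reveals nothing about the others, which is exactly the "same password everywhere" failure mode Karl warns about.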
Robert: We just started talking about passwords, and obviously it is important to keep those secure. But one thing folks don't often think about enough, especially activists who may foresee, or have engaged in, things that are legally questionable, is social media networking. By which I mean having social media that can be tracked across accounts, where it's possible to find your other social media, say because you use the same name on Twitter and on Instagram. Most people would be surprised at how easy that is to do. A huge amount of the tracking of Nazis, and even a ton of the work, work I did not do, but my colleagues did, to dox Russian secret service agents and the like, relied on exactly that.
Robert: It was like, oh, we found them at, you know, their boss's wedding. They're tagged in this thing on VK, and from that we were able to find their account on this other site, and from that, now we have this map of everywhere they've been for the last three weeks, and we can build this social map of their entire life.

Karl: Yeah. Just by literally existing in modern space, you're constantly leaking some form of metadata. You are always leaking metadata, and the more of you you allow to exist in the world, the more that's the case. So you've also got to think about what the threat is and what the risk is. The risk to an individual who has a parasocial relationship with the internet, like I do as a content creator, is one thing. There's always someone who wants to delve into your private life. But that's a very different risk than a nation-state actor. Those are two different things.
Karl: And when it comes to a nation-state actor, quite honestly, unless you're real good, and I've been doing this for a long time, the individual, bluntly, is kind of fucked. As a general rule, your best security as an individual in that situation is the anonymity of the crowd.

Robert: But most people who are threatened by the state in that situation are not being threatened by the federal government. They may be attending protests and not want the Louisville police to put together that they're in an affinity group with certain people. And something you can do for that: if you have a personal account that's under your name, with your friends, that account shouldn't be liking and sharing things from a political account that you have, or from the account of a group that you're a part of, or something like that. Just try to look at your digital footprint from the outside and think, is it possible to connect me, through this, to people I don't want to be publicly connected to?

Karl: And the minute you've breached that connection once, it's gone forever.

Robert: Right, it's there forever.

Karl: Yes. This is the same thing as with phones. Someone will have their regular phone, which, by the way, all these smartphones are just surveillance devices in our pockets. Let's say you go get a burner because you don't want to be connected to the device that you normally use, which is operating one step above the regular individual level. If you ever have those two devices emanating at the same time, they're now connected, in a way that, let's say, the authorities can associate, because triangulation sees the burner phone and your phone coming from the same house. You've breached all the privacy you would have had from your burner phone, for example.

Robert: Now, Karl, do you have much to say on the subject of, because I know one thing I have seen people do, people who are having conversations that they're concerned about, is put phones in Faraday bags.
Robert: And I've heard mixed things about how reliable Faraday bags and the like are at actually stopping signals. Do you have much to say on that?

Karl: My experience with that is that not all bags you can just buy off the internet are made equally. So what you want to do is test it, and you can only test it to a certain degree. But the really simple test is: you put the phone in the bag and you try to dial the darn thing, or use any WiFi connection to it. That's a simple test. Now, is it as good as not having the thing on you at all? Of course not. Leaving it somewhere else is always the best answer. But in my opinion, a properly built Faraday box or cage or bag that you've put some testing into is a pretty reliable solution.

Robert: And, you know, there's a problem that you might encounter, or that I have.
Robert: So one thing I have heard people talk about is, well, in order to have kind of a private conversation, we drove to a specific location, left our phones off in the car, and then went on a walk. And the problem with that is that now you have both just driven to a location with those phones, and those phones are associated with each other.

Karl: Right. Well, first of all, you've got to think of a world where all of this metadata is being collected at all times. So these phones, and their associations in physical proximity to one another, are stored somewhere at all times. Whether or not it's going to be accessed by the powers that be when they want it, it's all there. My phone next to your phone, next to that guy's phone: those associations all exist.
Karl: They're all talking to the same cell phone towers in the same area, giving them not only GPS coordinates but triangulation data. Which, by the way, if you go way back to the hacker Kevin Mitnick, that stuff was going on even back then; they used triangulation data to get him. So that stuff is all still happening, and those associations do occur. And in regard to saying "I turned my phone off": how do you know it's off? With most of the modern phones, what does "off" even mean? Okay, pull the battery, maybe. But even then, I would not trust any of these devices as far as them quote-unquote being off, especially phones with non-removable batteries. "Off" is more like sleep.

Robert: It is, right. Yeah. I mean, I think one of the worst things that's happened for personal security is the end of the phone with a removable battery. Being unable to actually cut power to it without, you know, disassembling it is a real issue.
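The co-location association Karl and Robert describe, two devices repeatedly seen by the same tower at the same time, can be illustrated with a toy sketch. The data, device names, and (tower, hour) bucketing scheme here are all hypothetical; real carrier metadata is far richer and finer-grained:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical sightings: (device_id, tower_id, hour_bucket)
sightings = [
    ("regular_phone", "tower_A", 9),
    ("burner_phone",  "tower_A", 9),   # emanating at the same time and place
    ("regular_phone", "tower_B", 14),
    ("burner_phone",  "tower_B", 14),
    ("other_phone",   "tower_C", 9),
]

def co_location_counts(sightings):
    """Count how often each pair of devices shares a (tower, time) bucket."""
    by_bucket = defaultdict(set)
    for device, tower, hour in sightings:
        by_bucket[(tower, hour)].add(device)
    pairs = defaultdict(int)
    for devices in by_bucket.values():
        for a, b in combinations(sorted(devices), 2):
            pairs[(a, b)] += 1
    return dict(pairs)

print(co_location_counts(sightings))
# {('burner_phone', 'regular_phone'): 2}
```

Even this crude hour-level bucketing links the burner to the regular phone after two shared sightings; higher-resolution timing and location data make the association far stronger.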
Karl: One could argue that there was a much more insidious reason they did that, or one could argue that it was just design and comfort. It's hard to say, and it doesn't really matter whether it was insidious or not; the reality is [unintelligible], right?

Robert: Totally.

Karl: So now that we're talking about phones, here's another thing that's been near and dear, and I think you've seen some posts from me about this. Everybody really likes the convenience of things like biometrics: thumb authentication, fingerprint ID, facial identification. And here's the reality of that. We know this already; it exists in legal space already. The reality is that you can be coerced to provide biometric data against your will. So consider a phone that's authenticated to you with a fingerprint ID or your facial ID.
Karl: They can pretty much say, "You must give us your thumb to unlock this phone." Or, for that matter, frankly, they could hold the phone up in front of your face in certain circumstances, even against your will, and it will unlock the device, and that is considered not a violation of your rights. Whereas if you had a long, strong password on the phone, they cannot coerce you to give that up, because that would be a violation of your Fifth Amendment rights, which is interesting. But at the same time, one could also argue that in certain circumstances, where there are a lot of cameras that may be watching everything you do, passphrases could be dangerous too, say in an airport, because all those cameras could see you plugging in your passcode. So it's a matter of if, when, and where, and what's the right solution at a given time. But I would say that if you are going to be in a place that is contentious, it is almost always better to make sure you do not allow for any biometric authentication on the device.
Robert: Yes. Never turn that on. Ideally you have never even turned on facial recognition on your phone, even if you later deactivate it. I used to be in tech journalism, right? Obviously I'm not an expert on any of this, but the worst thing, in terms of my personal comfort with devices, was when they were like, everything's going to read faces and fingerprints now. I don't love that. But you know, it's inevitable.

Karl: Right, because it is. And earlier in my life I did use a fingerprint unlock, and I do not have any devices that unlock that way anymore. But you do like that it's more convenient, right? You miss it when you need to get to your phone quickly and you can't.

Robert: But I don't even let my phone have just a four-character password anymore. It's eight characters for me.
Robert: It's a little bit of a pain in the ass, but it comes with fewer risks.

Karl: And one of the things that's challenging is that every individual has to look at what their threat profile is. So, for example, a soccer mom driving her kids to school might be really well off with biometric authentication on her phone, frankly, because if she didn't use that, maybe she wouldn't even use a proper four-character passphrase. And if she's not concerned about being at a protest, for example, and having some authoritarian take her phone away from her and authenticate to it, maybe she doesn't need to worry about that. But for a lot of us, in the worlds we live in, that's a different risk profile. We've got to think about what our risks are as individuals and what makes sense. So if your passphrase is going to be "one two three four" versus using a thumbprint ID, most people would be better off with the thumbprint ID. But for someone like myself? No, it's not a good idea.

Robert: Yeah.
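The "one two three four" versus eight-characters trade-off comes down to the size of the guessing space. A quick back-of-the-envelope sketch (my arithmetic, not figures from the episode):

```python
import math

def search_space_bits(alphabet_size: int, length: int) -> float:
    """Entropy in bits of a uniformly random code of the given length."""
    return length * math.log2(alphabet_size)

pin_bits = search_space_bits(10, 4)       # 4-digit PIN: 10^4 guesses, ~13.3 bits
password_bits = search_space_bits(62, 8)  # 8 chars of letters+digits: ~47.6 bits
manager_bits = search_space_bits(62, 24)  # manager-generated 24 chars: ~142.9 bits

print(f"4-digit PIN:       {pin_bits:.1f} bits")
print(f"8-char password:   {password_bits:.1f} bits")
print(f"24-char generated: {manager_bits:.1f} bits")
```

Each added character multiplies an attacker's work by the alphabet size, which is why the jump from a four-digit PIN to eight mixed characters is so large, and why a manager-generated string is effectively unguessable.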
335 00:16:01,120 --> 00:16:04,560 Speaker 1: I think that kind of brings us to uh, probably 336 00:16:04,600 --> 00:16:07,200 Speaker 1: the last part of this, which is UM, do you 337 00:16:07,200 --> 00:16:10,880 Speaker 1: have specific advice on like VPNs UM? Obviously I recommend 338 00:16:10,880 --> 00:16:15,320 Speaker 1: everybody use Signal, I mean for messages in general, 339 00:16:15,360 --> 00:16:18,600 Speaker 1: but like especially stuff that is secure. Don't if 340 00:16:18,640 --> 00:16:21,840 Speaker 1: you if you like number one first rule of any 341 00:16:21,920 --> 00:16:26,880 Speaker 1: kind of this sort of security, don't ever put anything 342 00:16:26,960 --> 00:16:30,040 Speaker 1: on your phone ever that's legally questionable if you can 343 00:16:30,080 --> 00:16:33,920 Speaker 1: avoid it, like conversationally, like right, do not don't send 344 00:16:33,920 --> 00:16:36,840 Speaker 1: it over a phone if it's something you would not 345 00:16:36,960 --> 00:16:40,960 Speaker 1: be able to survive having read to you in a courtroom. Yeah, 346 00:16:41,480 --> 00:16:42,920 Speaker 1: for the audience, a lot of the audience may not 347 00:16:42,960 --> 00:16:45,200 Speaker 1: know what Signal even is. Right, So Signal is 348 00:16:45,200 --> 00:16:48,240 Speaker 1: a text messaging alternative. So like for example, 349 00:16:48,240 --> 00:16:49,760 Speaker 1: on your phone, you've got regular text or if you've 350 00:16:49,760 --> 00:16:52,960 Speaker 1: got an iPhone, you've got iMessage. Signal is an 351 00:16:53,120 --> 00:16:56,280 Speaker 1: end-to-end encrypted solution that you install as an app. 352 00:16:56,920 --> 00:16:59,360 Speaker 1: And because it's end-to-end encryption, it means that it 353 00:16:59,400 --> 00:17:03,160 Speaker 1: passes the wire, in theory, not decryptable by the parties 354 00:17:03,160 --> 00:17:05,800 Speaker 1: that are passing the data packets in the middle.
So 355 00:17:05,880 --> 00:17:07,639 Speaker 1: that prevents man-in-the-middle decryption. Right. So 356 00:17:07,680 --> 00:17:11,239 Speaker 1: for example, iMessage is encrypted theoretically end to end, 357 00:17:11,240 --> 00:17:14,600 Speaker 1: but Apple ultimately has the cryptographic keys, so 358 00:17:14,680 --> 00:17:17,480 Speaker 1: while they might say one thing, there is nothing really 359 00:17:17,520 --> 00:17:19,399 Speaker 1: preventing them from being man in the middle and being 360 00:17:19,440 --> 00:17:22,440 Speaker 1: able to read the message in transit from A to 361 00:17:22,640 --> 00:17:25,480 Speaker 1: B. But if the keys are stored on your device, 362 00:17:25,480 --> 00:17:28,159 Speaker 1: which are then protected with your passphrase or whatever your 363 00:17:28,200 --> 00:17:31,520 Speaker 1: authentication mechanism is, and those keys are not archived or 364 00:17:31,600 --> 00:17:35,440 Speaker 1: kept by some hierarchical man in the middle authority, if 365 00:17:35,440 --> 00:17:38,679 Speaker 1: it's done right, which Signal does pretty well, it 366 00:17:38,720 --> 00:17:42,080 Speaker 1: means that your data in transit is probably not decryptable. 367 00:17:42,760 --> 00:17:44,679 Speaker 1: And that's why Signal's a good solution, and it's a 368 00:17:44,680 --> 00:17:47,600 Speaker 1: good one for the average person. Install the app. It 369 00:17:47,680 --> 00:17:50,359 Speaker 1: works just like text messaging, but you can have 370 00:17:50,520 --> 00:17:53,600 Speaker 1: a pretty good level of knowledge that the data 371 00:17:53,680 --> 00:17:57,640 Speaker 1: you're passing is not being decrypted or caught in transmission 372 00:17:57,720 --> 00:18:02,880 Speaker 1: or in the path. So I would say get, get Signal.
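The core idea above, that the keys live only on the endpoints so a server in the middle never holds the secret, can be sketched with toy Diffie-Hellman key agreement. To be clear, this is a teaching sketch with illustrative constants, not Signal's actual protocol (Signal uses X3DH and the Double Ratchet over Curve25519):

```python
import secrets

# Public parameters everyone, including the relay, is allowed to know.
P = 2**127 - 1  # a Mersenne prime; big enough for illustration only
G = 3

# Each private key is generated on, and never leaves, its owner's phone.
alice_private = secrets.randbelow(P - 2) + 1
bob_private = secrets.randbelow(P - 2) + 1

# These public values are all that a server in the middle ever observes.
alice_public = pow(G, alice_private, P)
bob_public = pow(G, bob_private, P)

# Each side combines its own private key with the other's public value
# and arrives at the same shared secret, which the middle never holds.
shared_alice = pow(bob_public, alice_private, P)
shared_bob = pow(alice_public, bob_private, P)
assert shared_alice == shared_bob
```

Contrast this with the iMessage concern described above: if the provider archives or escrows the keys, it can sit in the middle and derive the same secret, which is exactly what on-device-only keys prevent.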
Um, 373 00:18:02,920 --> 00:18:05,920 Speaker 1: it's it's your best bet, right Like and again we said, 374 00:18:06,119 --> 00:18:08,680 Speaker 1: I said, you know, you don't want to ever say 375 00:18:08,720 --> 00:18:11,840 Speaker 1: anything over a phone that is something that can get 376 00:18:11,840 --> 00:18:14,680 Speaker 1: you in trouble. But also like life is life, and 377 00:18:14,800 --> 00:18:18,919 Speaker 1: that's not always realistic for people in certain situations. So again, 378 00:18:19,080 --> 00:18:22,920 Speaker 1: Signal is your best bet. Nothing is perfect. And again, 379 00:18:22,960 --> 00:18:25,119 Speaker 1: if you're putting it on your phone, there's a number 380 00:18:25,160 --> 00:18:27,080 Speaker 1: of things that could go wrong every single time you 381 00:18:27,119 --> 00:18:30,359 Speaker 1: do that. But um, that that's one of your better 382 00:18:30,640 --> 00:18:32,080 Speaker 1: things that you could do. And then of course we 383 00:18:32,119 --> 00:18:36,560 Speaker 1: talk about VPNs. Yeah, so, VPNs. To those, like, 384 00:18:36,600 --> 00:18:38,159 Speaker 1: I'm just gonna go with the basic levels because I 385 00:18:38,160 --> 00:18:41,320 Speaker 1: don't necessarily know the level of knowledge of the people listening. 386 00:18:41,600 --> 00:18:44,199 Speaker 1: VPN is a virtual private network. So what that is: 387 00:18:44,520 --> 00:18:47,720 Speaker 1: you connect to this virtual private network and it passes 388 00:18:47,800 --> 00:18:50,920 Speaker 1: your data through an encrypted tunnel to an exit point 389 00:18:50,960 --> 00:18:55,280 Speaker 1: somewhere else on the Internet in theory masking the source 390 00:18:55,320 --> 00:18:59,200 Speaker 1: and origin of your requests.
So like, for example, let's 391 00:18:59,200 --> 00:19:00,879 Speaker 1: say you were looking up something on the Internet that 392 00:19:00,920 --> 00:19:03,800 Speaker 1: you didn't necessarily want people to know you're looking up. Yeah, like, 393 00:19:03,920 --> 00:19:07,879 Speaker 1: let's say you're researching the truth about the assassination of 394 00:19:07,880 --> 00:19:11,800 Speaker 1: President John F. Kennedy by Bernard Montgomery Sanders. Um, and 395 00:19:11,840 --> 00:19:13,919 Speaker 1: you know that the NSA is looking for 396 00:19:14,000 --> 00:19:17,200 Speaker 1: truth seekers who are who are finding out the reality 397 00:19:17,240 --> 00:19:19,640 Speaker 1: of that situation. You know, you don't necessarily want them 398 00:19:19,640 --> 00:19:23,600 Speaker 1: to know that you have have become pilled. Right, So 399 00:19:23,640 --> 00:19:25,920 Speaker 1: if you were to do this from your computer at home, 400 00:19:26,320 --> 00:19:28,040 Speaker 1: what happens is, for people that don't know how 401 00:19:28,119 --> 00:19:29,760 Speaker 1: this all works, you would be coming from an IP 402 00:19:29,840 --> 00:19:33,239 Speaker 1: address that's associated with your account that you're connecting to, 403 00:19:33,320 --> 00:19:37,040 Speaker 1: whether it's Verizon or Comcast or whatever. And you go 404 00:19:37,119 --> 00:19:39,240 Speaker 1: and search up that truth and the NSA 405 00:19:39,400 --> 00:19:42,600 Speaker 1: finds you with a keyword search for JFK and the truth. 406 00:19:42,920 --> 00:19:45,600 Speaker 1: And therefore, because of that keyword search, they go to 407 00:19:45,720 --> 00:19:49,800 Speaker 1: Comcast or to Verizon and say, hey, we are requesting 408 00:19:49,840 --> 00:19:52,440 Speaker 1: you tell us who did this search.
They will give 409 00:19:52,480 --> 00:19:55,680 Speaker 1: them essentially a request, a legal request for information, 410 00:19:55,680 --> 00:19:58,399 Speaker 1: and then Comcast or Verizon will provide the NS 411 00:19:58,440 --> 00:20:00,920 Speaker 1: A: this is the IP address, the account of the person 412 00:20:00,960 --> 00:20:04,520 Speaker 1: that did that. What a VPN does is you connect to 413 00:20:04,520 --> 00:20:07,760 Speaker 1: the VPN service first, the connection from your machine 414 00:20:08,080 --> 00:20:10,840 Speaker 1: to the VPN service is then encrypted. Now does the VPN 415 00:20:10,880 --> 00:20:15,000 Speaker 1: service know your IP address? Yes, but when you actually 416 00:20:15,040 --> 00:20:16,840 Speaker 1: type in that information or go to the Internet to 417 00:20:16,920 --> 00:20:20,000 Speaker 1: request that data, it actually goes through the VPN's private 418 00:20:20,000 --> 00:20:23,520 Speaker 1: tunneling network and egresses from somewhere else on the Internet, 419 00:20:23,920 --> 00:20:28,720 Speaker 1: thus masking your actual IP address and in theory 420 00:20:29,440 --> 00:20:33,639 Speaker 1: your origin of source. Now, that's not truly anonymous, but what 421 00:20:33,760 --> 00:20:35,600 Speaker 1: that does mean is that if someone, say the 422 00:20:35,720 --> 00:20:37,600 Speaker 1: NSA, wanted to know who's doing this 423 00:20:37,680 --> 00:20:40,720 Speaker 1: truth search, they would then find an IP address 424 00:20:40,880 --> 00:20:45,680 Speaker 1: that actually came out of let's say Joe's VPN service, 425 00:20:46,000 --> 00:20:47,919 Speaker 1: and they would have to go to Joe's VPN service 426 00:20:47,920 --> 00:20:51,720 Speaker 1: and go, we noticed this emanated from your network, who 427 00:20:51,760 --> 00:20:55,479 Speaker 1: did this? At that point, you have to trust Joe's 428 00:20:55,520 --> 00:20:59,040 Speaker 1: VPN service to not disclose their account information about you.
429 00:21:00,080 --> 00:21:02,159 Speaker 1: So what you've done is you've changed it. We know 430 00:21:02,280 --> 00:21:05,879 Speaker 1: the telecoms will communicate with the government or whoever if 431 00:21:05,880 --> 00:21:09,119 Speaker 1: they need to, they always will. You don't necessarily 432 00:21:09,160 --> 00:21:12,520 Speaker 1: know if Joe's VPN service will. You've changed your trust 433 00:21:12,560 --> 00:21:16,439 Speaker 1: model from your telecom to your VPN service. So if 434 00:21:16,440 --> 00:21:18,359 Speaker 1: you're gonna pick a VPN, you have to do a 435 00:21:18,359 --> 00:21:20,120 Speaker 1: little bit of research to know that it's a trustworthy 436 00:21:20,160 --> 00:21:23,480 Speaker 1: resource that won't just give you up at the slightest 437 00:21:23,600 --> 00:21:27,320 Speaker 1: form of interrogation. Yeah, and none of them again, there's 438 00:21:27,320 --> 00:21:30,679 Speaker 1: nothing perfect and often like we did find out what 439 00:21:30,760 --> 00:21:33,240 Speaker 1: was it last year that one popular VPN was 440 00:21:33,280 --> 00:21:36,560 Speaker 1: like run by the feds, Like it's yeah, that's not 441 00:21:36,600 --> 00:21:39,480 Speaker 1: an impossible thing. Um. I know a lot of folks, 442 00:21:39,680 --> 00:21:43,320 Speaker 1: particularly journalists, use Proton, um, which is I think based 443 00:21:43,320 --> 00:21:46,520 Speaker 1: in Switzerland, and you will get given up if you 444 00:21:46,840 --> 00:21:50,240 Speaker 1: if the Swiss government is angry at you right, you 445 00:21:50,400 --> 00:21:54,440 Speaker 1: brought up a very good point. Uh, services that exist 446 00:21:54,520 --> 00:21:57,840 Speaker 1: outside of CONUS, the continental US, mean that they 447 00:21:57,840 --> 00:22:01,040 Speaker 1: are under a different legal jurisdiction than the ones that exist wholly 448 00:22:01,080 --> 00:22:04,600 Speaker 1: within CONUS.
So as a result, if something from 449 00:22:04,640 --> 00:22:07,320 Speaker 1: the United States government comes as a request to the 450 00:22:07,359 --> 00:22:11,000 Speaker 1: Swiss company, there's a much, like, higher chance that 451 00:22:11,040 --> 00:22:13,400 Speaker 1: a Swiss company would be like, we don't really care 452 00:22:13,440 --> 00:22:17,080 Speaker 1: about your requests. That's worth considering. Also, think about this, 453 00:22:17,080 --> 00:22:18,720 Speaker 1: This actually works in reverse. And I don't want to 454 00:22:18,720 --> 00:22:20,639 Speaker 1: get too deep into this, but when you're working at 455 00:22:20,680 --> 00:22:22,800 Speaker 1: a tier one internet backbone provider, you should know 456 00:22:23,520 --> 00:22:27,880 Speaker 1: that sometimes traffic strangely gets pushed offshore and then back 457 00:22:27,920 --> 00:22:30,199 Speaker 1: to the United States for analysis that would normally be 458 00:22:30,840 --> 00:22:34,800 Speaker 1: let's say, not necessarily constitutionally legal in the United States. 459 00:22:35,240 --> 00:22:40,360 Speaker 1: So there's a lot of shenanigans going on. Yeah, and again, like, 460 00:22:40,880 --> 00:22:43,400 Speaker 1: I think Proton's generally a pretty good service. I've 461 00:22:43,400 --> 00:22:46,880 Speaker 1: had no problems with it um, But we should 462 00:22:46,880 --> 00:22:49,120 Speaker 1: be clear here none of these are perfect solutions. There 463 00:22:49,160 --> 00:22:52,440 Speaker 1: is no perfect solution.
The only perfect method of digital 464 00:22:52,520 --> 00:22:55,359 Speaker 1: security is not putting things on the Internet or like 465 00:22:55,480 --> 00:22:58,280 Speaker 1: through you know, the mobile networks and stuff like that. 466 00:22:58,520 --> 00:23:02,159 Speaker 1: If it stays between you and someone else, um, that is 467 00:23:02,160 --> 00:23:05,720 Speaker 1: your best bet of it not being you know, intercepted 468 00:23:05,800 --> 00:23:08,520 Speaker 1: or something. A conversation that you have in the woods 469 00:23:08,560 --> 00:23:11,560 Speaker 1: without phones anywhere near you is the most secure kind 470 00:23:11,560 --> 00:23:14,600 Speaker 1: of conversation. Let me second Proton. I agree, it's 471 00:23:14,600 --> 00:23:16,040 Speaker 1: a good service. There are others out there. We're not 472 00:23:16,080 --> 00:23:18,119 Speaker 1: trying to pick on one in particular or pick against 473 00:23:18,119 --> 00:23:20,359 Speaker 1: anyone in particular. There's a bunch that work. Yeah. 474 00:23:20,760 --> 00:23:22,560 Speaker 1: Another thing that you need to consider in this sort 475 00:23:22,560 --> 00:23:25,119 Speaker 1: of thing is also what you're dealing with. Like so 476 00:23:25,160 --> 00:23:27,520 Speaker 1: for example, I put up a post a while 477 00:23:27,560 --> 00:23:29,399 Speaker 1: back because there was a bunch of stuff going on 478 00:23:29,440 --> 00:23:32,440 Speaker 1: in Ukraine with with people posting photos that gave away their 479 00:23:32,480 --> 00:23:35,919 Speaker 1: locations and things happened, that I mean, that's and that 480 00:23:35,960 --> 00:23:38,240 Speaker 1: has been happening for a decade in that war, like 481 00:23:38,280 --> 00:23:40,159 Speaker 1: a well almost a decade as long as it's been 482 00:23:40,160 --> 00:23:42,520 Speaker 1: going on, And I posted something about it.
And one 483 00:23:42,520 --> 00:23:44,800 Speaker 1: of the recommendations I made on there was a contentious one, 484 00:23:44,800 --> 00:23:46,119 Speaker 1: but I'm gonna back it up in a minute. 485 00:23:46,119 --> 00:23:49,159 Speaker 1: I mentioned Tor, the onion relay. So 486 00:23:49,200 --> 00:23:54,000 Speaker 1: Tor essentially, it was originally created as 487 00:23:54,040 --> 00:23:58,160 Speaker 1: a way to deal with the dark web quote unquote, 488 00:23:58,160 --> 00:24:01,240 Speaker 1: and to also relay traffic in a way to mask 489 00:24:01,320 --> 00:24:04,320 Speaker 1: the origins, very much like a VPN service. Now there 490 00:24:04,359 --> 00:24:05,680 Speaker 1: are a bunch of these. So what it was is 491 00:24:05,720 --> 00:24:08,639 Speaker 1: there's these onion relay nodes all over the Internet, and 492 00:24:08,680 --> 00:24:11,160 Speaker 1: when you connect to the onion network, your 493 00:24:11,160 --> 00:24:14,159 Speaker 1: traffic bounces through three, four, or five, six, seven of 494 00:24:14,200 --> 00:24:16,520 Speaker 1: these nodes. You can sort of dictate what you want 495 00:24:16,560 --> 00:24:18,600 Speaker 1: depending on the client you have. And so let's say 496 00:24:18,640 --> 00:24:21,840 Speaker 1: you connect to an onion router network node in Arizona 497 00:24:21,920 --> 00:24:24,960 Speaker 1: and then you egress somewhere in France and you've jumped 498 00:24:24,960 --> 00:24:27,399 Speaker 1: through six nodes in the process. Well, one of the 499 00:24:27,440 --> 00:24:29,520 Speaker 1: things that's a well known fact is that a number 500 00:24:29,560 --> 00:24:32,880 Speaker 1: of these onion relay routing nodes are owned by nation 501 00:24:32,960 --> 00:24:36,240 Speaker 1: state actors, whether it's the United States or others.
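The "onion" layering being described, where the sender wraps the message once per hop and each relay can peel exactly one layer, can be sketched in a few lines. Real Tor negotiates per-hop symmetric keys with each relay and runs over TLS; this XOR-with-a-hashed-keystream sketch, with made-up node names and keys, is only for intuition about the layering itself:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic pseudo-random keystream of the given length."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call both wraps and peels a layer.
    return bytes(d ^ k for d, k in zip(data, keystream(key, len(data))))

hop_keys = [b"entry-node", b"middle-node", b"exit-node"]  # illustrative keys
message = b"hello from nowhere"

# The sender wraps innermost-first, so the entry node's layer ends up outermost.
onion = message
for key in reversed(hop_keys):
    onion = xor_layer(onion, key)

# Each relay peels only its own layer; the last hop recovers the message.
peeled = onion
for key in hop_keys:
    peeled = xor_layer(peeled, key)

assert peeled == message
```

The structural point survives the simplification: a single compromised relay sees only one layer, so deanonymizing the sender means correlating across every hop, which is the friction being described.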
So 502 00:24:36,240 --> 00:24:38,480 Speaker 1: so one of the things I got taken to task 503 00:24:38,600 --> 00:24:40,199 Speaker 1: for and I want to explain this is people were like, well, 504 00:24:40,200 --> 00:24:43,159 Speaker 1: that's now a compromised network, that means it isn't useful. 505 00:24:43,440 --> 00:24:46,600 Speaker 1: Actually it is, because depending on what you're trying to do, 506 00:24:47,119 --> 00:24:50,240 Speaker 1: it may matter. If you're trying to mask the origin of 507 00:24:50,320 --> 00:24:54,879 Speaker 1: your data source or your upload or your search for 508 00:24:55,000 --> 00:24:57,879 Speaker 1: a short duration of time, this will still help you 509 00:24:57,960 --> 00:25:00,800 Speaker 1: jump through six nodes. They've got to relay back six 510 00:25:00,880 --> 00:25:04,120 Speaker 1: nodes to figure out the origin of the person connecting 511 00:25:04,119 --> 00:25:07,280 Speaker 1: to the relay network. And that's assuming that there was 512 00:25:07,320 --> 00:25:10,800 Speaker 1: a compromised node in the process. So that means if 513 00:25:10,840 --> 00:25:13,280 Speaker 1: you're passing data through a compromised node, does that mean 514 00:25:13,320 --> 00:25:17,080 Speaker 1: the data in transit is safe? No. But is the 515 00:25:17,280 --> 00:25:21,160 Speaker 1: anonymity of the origin of the poster safer 516 00:25:21,240 --> 00:25:24,720 Speaker 1: for a longer duration of time?
Yes, So these things 517 00:25:24,720 --> 00:25:28,120 Speaker 1: get really complex, real fast, and this is again one 518 00:25:28,160 --> 00:25:30,000 Speaker 1: of the best things you can do, because there's no 519 00:25:30,080 --> 00:25:33,520 Speaker 1: single perfect solution, but stacking, so not just going through 520 00:25:33,560 --> 00:25:35,880 Speaker 1: Tor but also Tor into a VPN at the same time, 521 00:25:35,960 --> 00:25:38,800 Speaker 1: and you're... I think one of the better ways to 522 00:25:38,800 --> 00:25:42,680 Speaker 1: think about security is kind of the way Sebastian Junger 523 00:25:42,760 --> 00:25:46,200 Speaker 1: describes how insurgent war works, which is it's all about 524 00:25:46,240 --> 00:25:49,640 Speaker 1: creating friction for anybody trying to spy on your shit. 525 00:25:50,080 --> 00:25:52,640 Speaker 1: There's no perfect answer, but the more things you can 526 00:25:52,680 --> 00:25:54,960 Speaker 1: make be a pain in the ass, the better your 527 00:25:54,960 --> 00:25:57,439 Speaker 1: odds that you will not have an issue. Right Like, 528 00:25:57,520 --> 00:26:00,280 Speaker 1: that's all you can do is make it potentially 529 00:26:00,720 --> 00:26:04,679 Speaker 1: more annoying and more difficult for for whoever might be 530 00:26:04,720 --> 00:26:07,959 Speaker 1: looking right like it. The more friction you can create, 531 00:26:08,280 --> 00:26:11,520 Speaker 1: broadly speaking, the more secure you're going to be. Absolutely 532 00:26:11,600 --> 00:26:13,479 Speaker 1: now another thing to think about, and we're getting kind 533 00:26:13,480 --> 00:26:14,840 Speaker 1: of deep in the weeds too.
This is above and 534 00:26:14,880 --> 00:26:17,280 Speaker 1: beyond the average person, right, the average person, get a 535 00:26:17,320 --> 00:26:20,680 Speaker 1: password manager, don't use your same password everywhere, and don't 536 00:26:20,760 --> 00:26:23,040 Speaker 1: use biometrics unless you're forced, like pretty much have to 537 00:26:23,440 --> 00:26:25,720 Speaker 1: and move on with your life. But once you're beyond 538 00:26:25,720 --> 00:26:27,920 Speaker 1: the average person, this is what we're talking about now. 539 00:26:28,200 --> 00:26:29,840 Speaker 1: So like if you're if you have a computer and 540 00:26:29,840 --> 00:26:32,840 Speaker 1: you use it as your normal day to day operating system, 541 00:26:32,840 --> 00:26:35,200 Speaker 1: talking to your friends, doing dot dot dot dot dot, 542 00:26:35,440 --> 00:26:37,359 Speaker 1: but then also need to do something else a little 543 00:26:37,359 --> 00:26:41,160 Speaker 1: more privacy inclined, you should not trust that device. So 544 00:26:41,200 --> 00:26:44,200 Speaker 1: at that point, your web browser may have all sorts 545 00:26:44,240 --> 00:26:47,200 Speaker 1: of cookies and metadata and storage in it that, even 546 00:26:47,240 --> 00:26:50,080 Speaker 1: if you're going through a VPN, still may be able 547 00:26:50,119 --> 00:26:53,239 Speaker 1: to reveal your identity, as well as MAC addresses and 548 00:26:53,240 --> 00:26:55,399 Speaker 1: other stuff. So if you really want to get pretty 549 00:26:55,400 --> 00:26:57,119 Speaker 1: into the weeds with this, you have to do something 550 00:26:57,160 --> 00:27:01,359 Speaker 1: like use an ephemeral operating system install that has 551 00:27:01,359 --> 00:27:05,000 Speaker 1: no legacy data on it. One example of that, it's 552 00:27:05,000 --> 00:27:07,639 Speaker 1: a Linux-based one, is called Tails. You essentially use 553 00:27:07,680 --> 00:27:10,399 Speaker 1: it like a live USB drive.
You boot off of 554 00:27:10,440 --> 00:27:12,680 Speaker 1: that only, or you use a machine dedicated for this, 555 00:27:13,000 --> 00:27:15,680 Speaker 1: and you burn the OS down every time you're done, 556 00:27:15,720 --> 00:27:19,159 Speaker 1: because there's no legacy information or data that can be 557 00:27:19,800 --> 00:27:22,639 Speaker 1: pulled out of your web browser or your cookies or 558 00:27:22,680 --> 00:27:25,720 Speaker 1: your MAC address information that can associate it with you, 559 00:27:26,000 --> 00:27:29,000 Speaker 1: regardless of if you've done everything right to mask your 560 00:27:29,000 --> 00:27:32,880 Speaker 1: IP address of origin. God, that's the hot girl shit. Um, 561 00:27:32,960 --> 00:27:35,000 Speaker 1: when you're when you're when you're doing when you when 562 00:27:35,000 --> 00:27:37,639 Speaker 1: you're doing that kind of stuff. Um. And again I 563 00:27:37,640 --> 00:27:39,639 Speaker 1: think at this point, I think, up through most of this, 564 00:27:39,760 --> 00:27:42,440 Speaker 1: it's been kind of like people being like that's too much, 565 00:27:42,480 --> 00:27:44,560 Speaker 1: and people being like, Okay, yep, this is exactly what 566 00:27:44,600 --> 00:27:47,400 Speaker 1: I already am or need to be doing. Um, this 567 00:27:47,480 --> 00:27:50,600 Speaker 1: is probably very few people need to be concerned about 568 00:27:50,640 --> 00:27:53,720 Speaker 1: that sort of thing, but um, you know it it is. 569 00:27:53,920 --> 00:27:55,840 Speaker 1: I, I know, like again, I worked at 570 00:27:55,880 --> 00:27:58,360 Speaker 1: Bellingcat, um. I had a number of colleagues who 571 00:27:58,359 --> 00:28:01,200 Speaker 1: were like personal enemies of the Russian state who 572 00:28:02,320 --> 00:28:06,800 Speaker 1: had to do stuff like this, um, and it's you know, paranoia. 573 00:28:06,840 --> 00:28:09,440 Speaker 1: I mean, and here's the thing going above.
So again, 574 00:28:09,480 --> 00:28:11,640 Speaker 1: like if you're a normal person, you probably don't need 575 00:28:11,680 --> 00:28:15,040 Speaker 1: to be, you know, stacking a VPN, you know, 576 00:28:15,440 --> 00:28:19,399 Speaker 1: getting Signal and all this stuff. But also why not, right, Like, 577 00:28:19,440 --> 00:28:22,119 Speaker 1: there's no harm in the additional security. It is 578 00:28:22,119 --> 00:28:24,919 Speaker 1: a little bit frustrating. But here's one of the things 579 00:28:24,960 --> 00:28:27,600 Speaker 1: I think people don't often think about enough. You're not 580 00:28:28,280 --> 00:28:32,320 Speaker 1: engaging in that kind of security stuff purely because there's 581 00:28:32,359 --> 00:28:35,080 Speaker 1: a threat now, but in part because you don't know 582 00:28:35,119 --> 00:28:37,080 Speaker 1: what the future is going to bring. And one of 583 00:28:37,080 --> 00:28:39,480 Speaker 1: the things that I would point out for that is 584 00:28:39,880 --> 00:28:42,440 Speaker 1: a lot of people right now have been having for 585 00:28:42,560 --> 00:28:46,400 Speaker 1: years conversations about a thing that may soon legally be 586 00:28:46,520 --> 00:28:50,680 Speaker 1: murder on a federal level, you know, um, abortion, right, 587 00:28:51,160 --> 00:28:54,880 Speaker 1: And so it is possible that overnight an awful lot 588 00:28:54,880 --> 00:28:58,720 Speaker 1: of conversations a bunch of people have had legally will 589 00:28:58,760 --> 00:29:01,600 Speaker 1: suddenly be very illegal conversations. And then you may be 590 00:29:01,680 --> 00:29:04,920 Speaker 1: glad that you took greater care with your personal 591 00:29:04,920 --> 00:29:07,280 Speaker 1: security prior to that point.
Yeah, I mean, like, so 592 00:29:07,400 --> 00:29:09,960 Speaker 1: think of the I mean, I'm not a person that menstruates, 593 00:29:09,960 --> 00:29:12,120 Speaker 1: but a menstruation tracking app is very useful to a 594 00:29:12,200 --> 00:29:14,760 Speaker 1: lot of people who do and those tracking apps, now 595 00:29:15,280 --> 00:29:18,720 Speaker 1: that metadata in there at some point could be extremely 596 00:29:18,840 --> 00:29:23,880 Speaker 1: dangerous or criminalizing, incriminating excuse me, to 597 00:29:24,120 --> 00:29:27,320 Speaker 1: someone who otherwise was doing nothing more than trying to 598 00:29:27,360 --> 00:29:30,160 Speaker 1: maintain their natural health. And so that is a really 599 00:29:30,240 --> 00:29:32,720 Speaker 1: dangerous concept. So at this point, I mean, within the 600 00:29:32,760 --> 00:29:35,320 Speaker 1: United States, I hate to say this, those apps are 601 00:29:35,320 --> 00:29:38,800 Speaker 1: probably dangerous to the individual because that data could be 602 00:29:38,880 --> 00:29:42,040 Speaker 1: easily used by a government resource to uh, to do 603 00:29:42,120 --> 00:29:45,240 Speaker 1: something bad to someone who's done nothing wrong. So I 604 00:29:45,320 --> 00:29:46,920 Speaker 1: think we should move. I mean, at this point, I 605 00:29:46,960 --> 00:29:49,040 Speaker 1: think we've covered the bases, that you could kind of 606 00:29:49,080 --> 00:29:52,280 Speaker 1: responsibly, the advice you can responsibly give someone in a podcast. 607 00:29:52,360 --> 00:29:54,880 Speaker 1: And, and folks should be able to add— Let 608 00:29:55,080 --> 00:29:57,640 Speaker 1: me throw one thing out real quickly. So you mentioned like, 609 00:29:57,680 --> 00:29:59,760 Speaker 1: for example, you don't necessarily have the risk 610 00:30:00,000 --> 00:30:02,800 Speaker 1: factor that requires using a VPN or Signal.
Let me say 611 00:30:02,840 --> 00:30:05,200 Speaker 1: this: way back when, gosh, when I was doing crypto 612 00:30:05,240 --> 00:30:09,280 Speaker 1: work decades ago, I was— You mean cryptography and 613 00:30:09,360 --> 00:30:12,520 Speaker 1: not, we should specify these days? Oh yeah, excuse me, 614 00:30:12,520 --> 00:30:16,760 Speaker 1: cryptography, encryption work. Yeah, yeah, I I had the opportunity with 615 00:30:16,920 --> 00:30:19,200 Speaker 1: Phil Zimmermann of PGP, and actually PGP, 616 00:30:19,320 --> 00:30:22,760 Speaker 1: Pretty Good Privacy, which was one of the fundamental UH 617 00:30:22,880 --> 00:30:25,840 Speaker 1: security project or projects way back when, was actually written 618 00:30:25,880 --> 00:30:28,400 Speaker 1: in response to human rights violations. He wrote it because people 619 00:30:28,440 --> 00:30:32,640 Speaker 1: doing research on, like, warlords were getting their laptops taken 620 00:30:32,640 --> 00:30:34,120 Speaker 1: away, and then they'd find out who spoke to them and 621 00:30:34,120 --> 00:30:36,560 Speaker 1: get people killed. So PGP was like this human 622 00:30:36,640 --> 00:30:39,760 Speaker 1: rights thing right from the beginning. And cryptography back when 623 00:30:39,760 --> 00:30:42,360 Speaker 1: I was young and naive, I always thought to myself, 624 00:30:42,600 --> 00:30:44,240 Speaker 1: this is what we need. This is the future: when 625 00:30:44,280 --> 00:30:47,880 Speaker 1: everyone gets proper crypto, we'll blind the government, we'll blind 626 00:30:47,920 --> 00:30:51,520 Speaker 1: the corporations. We're gonna have this crypto anarchist future where 627 00:30:51,520 --> 00:30:54,520 Speaker 1: the government and corporations can't get us. And the reality 628 00:30:54,640 --> 00:30:57,440 Speaker 1: is most of that got subverted.
And the truth 629 00:30:57,480 --> 00:30:59,760 Speaker 1: is cryptography is too hard for most people to use, 630 00:31:00,040 --> 00:31:02,000 Speaker 1: and as a result we don't. But here's what I 631 00:31:02,040 --> 00:31:04,520 Speaker 1: will say. The more people that do something simple like 632 00:31:04,800 --> 00:31:08,120 Speaker 1: use Signal or use a VPN just to browse the Internet, 633 00:31:08,240 --> 00:31:11,360 Speaker 1: not because they're doing anything nefarious, just because they're, 634 00:31:11,440 --> 00:31:15,960 Speaker 1: like, privacy conscious, because it makes it normalized and that means 635 00:31:16,000 --> 00:31:17,920 Speaker 1: that the person that's using it because they need to 636 00:31:18,000 --> 00:31:21,960 Speaker 1: for like, let's say, to protect human rights doesn't stick 637 00:31:22,000 --> 00:31:25,160 Speaker 1: out like a needle in the haystack, because everybody's already 638 00:31:25,160 --> 00:31:28,760 Speaker 1: doing something sane in the first place. Normalizing proper privacy 639 00:31:29,040 --> 00:31:33,280 Speaker 1: and cryptography is better for everyone. Yes, yes, absolutely agreed. 640 00:31:33,840 --> 00:31:35,920 Speaker 1: This is a nice segue because you were just talking 641 00:31:35,960 --> 00:31:40,080 Speaker 1: about the past and how beautiful and bright it seemed. Um, 642 00:31:40,160 --> 00:31:42,280 Speaker 1: let's talk about what you see as kind of the 643 00:31:42,280 --> 00:31:47,760 Speaker 1: future of info security threats. Well, I mean, so there's 644 00:31:47,760 --> 00:31:49,640 Speaker 1: so many levels to that. First of all, if we're 645 00:31:49,640 --> 00:31:52,680 Speaker 1: talking nation state level, I personally strongly believe that all 646 00:31:52,720 --> 00:31:56,640 Speaker 1: of the big players have already compromised everyone's network. Everybody's 647 00:31:56,720 --> 00:32:01,160 Speaker 1: got everybody. China's got us, we got China.
648 00:32:01,440 --> 00:32:03,880 Speaker 1: Anybody right now could go in and pretty much fuck 649 00:32:03,920 --> 00:32:08,200 Speaker 1: up the grid on someone else like that, And yeah, 650 00:32:08,280 --> 00:32:10,920 Speaker 1: and that's not actually, at least that's safer than 651 00:32:11,040 --> 00:32:13,880 Speaker 1: other possibilities, like because there is a level of 652 00:32:13,960 --> 00:32:16,440 Speaker 1: mutually assured destruction there where it's like, yeah, man, 653 00:32:16,520 --> 00:32:18,880 Speaker 1: Russia could take down the grid, but like that wouldn't 654 00:32:18,880 --> 00:32:21,960 Speaker 1: be good for them, and vice versa. You know, yeah, no, true. 655 00:32:22,040 --> 00:32:24,600 Speaker 1: So the reality is though everybody's in everybody's network, those 656 00:32:24,640 --> 00:32:27,800 Speaker 1: days are over. Um. When it comes to the individual, 657 00:32:27,840 --> 00:32:31,160 Speaker 1: and I'm gonna have, the audience, there might be 658 00:32:31,160 --> 00:32:33,000 Speaker 1: people in the audience who feel differently, and it still 659 00:32:33,000 --> 00:32:34,640 Speaker 1: doesn't mean that we don't try. So one of the 660 00:32:34,680 --> 00:32:36,640 Speaker 1: things I want to say is you're gonna hear some 661 00:32:36,760 --> 00:32:38,760 Speaker 1: skepticism here because I've been doing this career for a 662 00:32:38,760 --> 00:32:41,840 Speaker 1: long time and I've seen things go wrong more than right, 663 00:32:42,160 --> 00:32:44,760 Speaker 1: and so in that regard, this is gonna sound kind 664 00:32:44,800 --> 00:32:46,720 Speaker 1: of cynical.
But when it comes to the idea of 665 00:32:47,600 --> 00:32:51,280 Speaker 1: individual privacy, in my opinion, with the exception of when 666 00:32:51,280 --> 00:32:54,840 Speaker 1: you're making a very active effort on something very specific 667 00:32:54,880 --> 00:32:57,240 Speaker 1: that you want to keep private because that's something you're 668 00:32:57,240 --> 00:33:00,680 Speaker 1: working on personally, the reality is individual total privacy 669 00:33:00,760 --> 00:33:03,720 Speaker 1: is dead and gone, and we're just starting to smell 670 00:33:03,800 --> 00:33:09,200 Speaker 1: that corpse. Um. Whether it is credit card data, transactions, 671 00:33:09,520 --> 00:33:12,280 Speaker 1: your cell phone history, your phone numbers, what you've done 672 00:33:12,280 --> 00:33:14,840 Speaker 1: on the internet, what you've done on social media or 673 00:33:14,880 --> 00:33:17,160 Speaker 1: not done on social media, whether you have an account 674 00:33:17,200 --> 00:33:20,120 Speaker 1: on Facebook or not, doesn't even matter. The metadata and 675 00:33:20,120 --> 00:33:23,320 Speaker 1: the trail you're leaving behind you is all aggregated, all 676 00:33:23,360 --> 00:33:26,960 Speaker 1: of it by big data corporations, all of it compromised, 677 00:33:27,120 --> 00:33:30,240 Speaker 1: all of it searchable. Even stuff the government has on 678 00:33:30,320 --> 00:33:32,920 Speaker 1: you has been sold to large corporations, because I can 679 00:33:32,960 --> 00:33:35,000 Speaker 1: tell you that some of the data that they kept 680 00:33:35,080 --> 00:33:37,360 Speaker 1: for, let's say, the DMV or MVD, 681 00:33:37,960 --> 00:33:40,320 Speaker 1: they decided to sell it off to a corporation, and 682 00:33:40,360 --> 00:33:43,680 Speaker 1: they themselves access it through a third party when doing 683 00:33:43,720 --> 00:33:46,440 Speaker 1: research on you.
So all of that big data, there's 684 00:33:46,440 --> 00:33:48,959 Speaker 1: a law of physics. The more you aggregate, the more 685 00:33:48,960 --> 00:33:55,959 Speaker 1: it will get compromised. Um, geez, I'm sorry, that's the truth. No, no, no, 686 00:33:56,040 --> 00:34:02,840 Speaker 1: I mean, yeah, you're, you're, it's this, uh, 687 00:34:02,920 --> 00:34:05,800 Speaker 1: there's this frustration, because I can remember the days when 688 00:34:06,600 --> 00:34:10,239 Speaker 1: the privacy hounds, and I don't say that in a 689 00:34:10,320 --> 00:34:13,799 Speaker 1: negative way, were like warning everybody: Hey, you don't 690 00:34:13,800 --> 00:34:16,239 Speaker 1: want to be aggregating all of these different social media 691 00:34:16,280 --> 00:34:17,799 Speaker 1: things together. Hey, you don't want to be using all 692 00:34:17,840 --> 00:34:20,239 Speaker 1: of these services. Hey, there's actually some like real downsides 693 00:34:20,400 --> 00:34:23,120 Speaker 1: to all of what's happening. Like part of why things 694 00:34:23,120 --> 00:34:25,120 Speaker 1: are so cheap on Amazon is, you know, that 695 00:34:25,239 --> 00:34:27,480 Speaker 1: your data there is one of the assets that 696 00:34:27,520 --> 00:34:31,839 Speaker 1: they have. And, um, those people were absolutely right, and 697 00:34:31,880 --> 00:34:37,080 Speaker 1: they lost harder than anyone has ever lost at anything.
So, 698 00:34:37,200 --> 00:34:39,080 Speaker 1: like when I was back there at that company doing 699 00:34:39,080 --> 00:34:41,840 Speaker 1: all that cryptography work, when we were trying to give crypto 700 00:34:42,000 --> 00:34:45,160 Speaker 1: to the average general population of the Internet, I 701 00:34:45,200 --> 00:34:46,920 Speaker 1: had this, like I said, this naive view of like 702 00:34:47,000 --> 00:34:49,200 Speaker 1: the future, that it was gonna be this place where we're 703 00:34:49,200 --> 00:34:51,080 Speaker 1: gonna have the Internet where everyone was connected, and 704 00:34:51,160 --> 00:34:53,719 Speaker 1: not only would we have personal privacy 705 00:34:53,719 --> 00:34:56,640 Speaker 1: through cryptography, but we would be able to transfer information 706 00:34:56,680 --> 00:34:58,640 Speaker 1: to one another in a way that would make the 707 00:34:58,640 --> 00:35:01,560 Speaker 1: shenanigans impossible. Well, to some degree that's been true; 708 00:35:01,520 --> 00:35:04,040 Speaker 1: we've seen some of that. But to another degree we 709 00:35:04,080 --> 00:35:08,400 Speaker 1: also have Snowden dropping the bomb of revelations about what 710 00:35:08,440 --> 00:35:11,040 Speaker 1: the government has done to the individual and how they've 711 00:35:11,080 --> 00:35:13,800 Speaker 1: broken the law with all of our privacy and data. 712 00:35:14,000 --> 00:35:16,880 Speaker 1: And what came of that? A man in exile in 713 00:35:16,960 --> 00:35:22,439 Speaker 1: Russia and pretty much fucking nothing. Yeah, right, nothing. And, um, 714 00:35:22,560 --> 00:35:26,279 Speaker 1: I was sitting at a DEF CON presentation where General Alexander 715 00:35:26,400 --> 00:35:29,040 Speaker 1: was on the screen talking about what they weren't doing, 716 00:35:29,320 --> 00:35:33,040 Speaker 1: while Snowden was dropping revelations proving him to be lying, 717 00:35:34,000 --> 00:35:36,680 Speaker 1: and nothing comes of it,
right, nothing really comes of it. 718 00:35:37,080 --> 00:35:40,000 Speaker 1: And that's one of the things that's so real. And so 719 00:35:40,239 --> 00:35:44,040 Speaker 1: whether it's the tribal level, your neighbors across the street, 720 00:35:44,560 --> 00:35:47,600 Speaker 1: or the internet tribe, we as a people in the 721 00:35:47,680 --> 00:35:51,600 Speaker 1: aggregate are always willing to give up our rights to 722 00:35:51,800 --> 00:35:54,680 Speaker 1: something bigger for convenience. And we've done that, and it's 723 00:35:54,680 --> 00:35:59,239 Speaker 1: called Facebook and Twitter and social media. And in the process, 724 00:35:59,520 --> 00:36:02,760 Speaker 1: what was going to be an amazing resource has become 725 00:36:02,800 --> 00:36:08,000 Speaker 1: the trap. Uh. It's such a, it's because, you know, 726 00:36:08,160 --> 00:36:11,160 Speaker 1: you know, Garrison, my friend who is 727 00:36:11,239 --> 00:36:14,400 Speaker 1: much younger than me, um, has grown up with the 728 00:36:14,440 --> 00:36:19,200 Speaker 1: Internet being what it is now, right, like this 729 00:36:19,200 --> 00:36:21,759 Speaker 1: kind of like nightmare trap, you know, that's 730 00:36:21,760 --> 00:36:24,279 Speaker 1: sucking us all in, this like giant squid that has 731 00:36:24,360 --> 00:36:27,520 Speaker 1: us in its tentacles. Um. And I 732 00:36:27,600 --> 00:36:30,799 Speaker 1: sometimes like dissociate talking with them about certain Internet things, 733 00:36:30,840 --> 00:36:35,120 Speaker 1: because in my heart it's still the Promised Land. Yeah, 734 00:36:35,280 --> 00:36:37,799 Speaker 1: I wish, I, I guess, I wish I felt 735 00:36:37,840 --> 00:36:39,600 Speaker 1: that way.
It doesn't feel that way to me anymore, 736 00:36:39,600 --> 00:36:41,600 Speaker 1: to be honest. I mean, it's not, right? Like, 737 00:36:41,640 --> 00:36:44,719 Speaker 1: and I mean that in, like, sort of, I 738 00:36:44,800 --> 00:36:47,319 Speaker 1: have this, I don't know. I've never entirely been able 739 00:36:47,360 --> 00:36:49,160 Speaker 1: to like let go of the vision of like, oh, 740 00:36:49,239 --> 00:36:50,960 Speaker 1: it could have been. There's so many things that could 741 00:36:51,000 --> 00:36:53,680 Speaker 1: have been. Well, it's like, you know, it's like all technology, 742 00:36:53,880 --> 00:36:56,719 Speaker 1: anything can be weaponized, right? Right. Like an AR-15 743 00:36:56,800 --> 00:36:59,480 Speaker 1: can be used for good or for evil, a 744 00:36:59,560 --> 00:37:01,440 Speaker 1: knife can be used to make a beautiful meal or 745 00:37:01,480 --> 00:37:04,400 Speaker 1: to commit a murder. And the Internet is technology, and 746 00:37:04,400 --> 00:37:07,160 Speaker 1: it has been weaponized. It's been weaponized against us. But 747 00:37:07,200 --> 00:37:09,120 Speaker 1: at the same time, if we just turn a blind 748 00:37:09,120 --> 00:37:11,279 Speaker 1: eye to it and do not learn how to use 749 00:37:11,320 --> 00:37:14,080 Speaker 1: this technology to our advantage, we're allowing them to do 750 00:37:14,120 --> 00:37:16,400 Speaker 1: that unabated. And that's where like the kind of hacker 751 00:37:16,440 --> 00:37:18,399 Speaker 1: mindset comes from, which is like, how do I make 752 00:37:18,440 --> 00:37:20,520 Speaker 1: this thing do what I want it to do for me, 753 00:37:20,920 --> 00:37:23,640 Speaker 1: while not letting someone else do it for them.
And 754 00:37:23,760 --> 00:37:26,920 Speaker 1: unless we take control of the technology for ourselves, like 755 00:37:26,960 --> 00:37:30,920 Speaker 1: I said earlier, normalizing using Signal and even basic VPNs 756 00:37:30,960 --> 00:37:33,920 Speaker 1: and cryptography, then we're just giving it up. We're not 757 00:37:33,960 --> 00:37:35,960 Speaker 1: even making it a challenge. We're just like, here, you 758 00:37:36,239 --> 00:37:38,839 Speaker 1: go, have it. And, uh, that's something that I think 759 00:37:38,840 --> 00:37:41,839 Speaker 1: is more important as a community. Maybe as people grow 760 00:37:41,920 --> 00:37:44,279 Speaker 1: up on the Internet, versus seeing it become something that 761 00:37:44,600 --> 00:37:49,040 Speaker 1: I saw it become, maybe either, A, they'll just accept, 762 00:37:49,200 --> 00:37:51,160 Speaker 1: which I hope isn't the case, that the reality is 763 00:37:51,200 --> 00:37:54,560 Speaker 1: privacy is dead, or maybe they'll approach the Internet differently 764 00:37:54,640 --> 00:37:57,440 Speaker 1: than, say, someone my age did. Where, frankly, we 765 00:37:57,520 --> 00:38:00,600 Speaker 1: kind of messed up, and we didn't realize the primrose 766 00:38:00,640 --> 00:38:04,360 Speaker 1: path was actually a trap, and that was 767 00:38:04,400 --> 00:38:07,440 Speaker 1: a mistake, and maybe we can kind of like evolve 768 00:38:07,560 --> 00:38:10,200 Speaker 1: beyond that. But like you're asking, where is infosec 769 00:38:10,239 --> 00:38:12,920 Speaker 1: going now? I don't have good news for that. 770 00:38:12,960 --> 00:38:15,080 Speaker 1: Like when I first started working in the career, it 771 00:38:15,200 --> 00:38:17,640 Speaker 1: really felt like a great thing. We were doing important stuff. 772 00:38:17,680 --> 00:38:20,000 Speaker 1: We were doing the DDoS mitigation.
We were going into 773 00:38:20,040 --> 00:38:24,200 Speaker 1: hospitals and making sure that insulin pumps weren't compromised as 774 00:38:24,200 --> 00:38:28,280 Speaker 1: a DDoS host; believe it or not, hospitals are infosec nightmares. 775 00:38:28,480 --> 00:38:31,319 Speaker 1: And we were doing stuff that felt good. And then 776 00:38:31,400 --> 00:38:33,839 Speaker 1: later in the career I realized, wait a minute, I'm 777 00:38:33,840 --> 00:38:37,080 Speaker 1: not doing anything to secure anybody's personal information or make 778 00:38:37,120 --> 00:38:40,400 Speaker 1: the Internet safer. I was just protecting some corporate coffer. 779 00:38:40,880 --> 00:38:43,640 Speaker 1: And the reality was that, for the private information that we 780 00:38:43,640 --> 00:38:46,919 Speaker 1: were supposedly protecting, the debate would turn into calls about 781 00:38:47,040 --> 00:38:51,200 Speaker 1: what's more expensive: losing the data or the lawsuit 782 00:38:51,280 --> 00:38:54,920 Speaker 1: for losing the data. Literally, those were the conversations in corporations, 783 00:38:55,120 --> 00:38:58,160 Speaker 1: and those are the conversations that corporations have now about 784 00:38:58,160 --> 00:39:02,799 Speaker 1: each and every one of us and our personal information. Now, when 785 00:39:02,840 --> 00:39:07,160 Speaker 1: you, when you think about, because, so, obviously I 786 00:39:07,200 --> 00:39:09,759 Speaker 1: am, was, in a different field. But when 787 00:39:09,800 --> 00:39:12,000 Speaker 1: I was doing a lot of the research on terrorism 788 00:39:12,040 --> 00:39:14,399 Speaker 1: that I was doing, I had these things that were 789 00:39:14,440 --> 00:39:17,200 Speaker 1: like, sort of, this kind of attack is going 790 00:39:17,239 --> 00:39:18,959 Speaker 1: to happen. It's something I feel that very much about, 791 00:39:19,000 --> 00:39:22,000 Speaker 1: like drones.
There's going to be like a mass killing 792 00:39:22,040 --> 00:39:25,680 Speaker 1: of civilians, not in a war zone, by a civilian 793 00:39:25,719 --> 00:39:28,160 Speaker 1: weaponized drone at some point in the not too distant 794 00:39:28,160 --> 00:39:29,799 Speaker 1: future. It's going to happen. It's going to be done. It's 795 00:39:29,840 --> 00:39:34,719 Speaker 1: absolutely an inevitability. Um, that kind of stuff. What 796 00:39:34,800 --> 00:39:36,680 Speaker 1: are you, when you think about kind of the 797 00:39:36,800 --> 00:39:42,359 Speaker 1: digital equivalents of that, like, what are you looking towards? Well, 798 00:39:42,400 --> 00:39:43,799 Speaker 1: I agree with you about the drone. Like, you can 799 00:39:43,800 --> 00:39:46,320 Speaker 1: see stuff. God, yes, you plot the, you plot the 800 00:39:46,360 --> 00:39:48,120 Speaker 1: dots and you know what's going to occur, right? It's, 801 00:39:48,160 --> 00:39:51,080 Speaker 1: it's not possible to avoid. We unleashed that 802 00:39:51,440 --> 00:39:55,040 Speaker 1: out of the cage, and it's going to happen. Quite honestly, 803 00:39:55,080 --> 00:39:58,000 Speaker 1: I think we're seeing it already. We're seeing, we're seeing 804 00:39:58,040 --> 00:40:01,600 Speaker 1: the level of privacy invasion that I don't think 805 00:40:01,640 --> 00:40:04,719 Speaker 1: people even know has happened. Like, I know some of 806 00:40:04,800 --> 00:40:07,200 Speaker 1: us realize that, we talk about that, we rant about it, 807 00:40:07,239 --> 00:40:10,440 Speaker 1: but like, I don't think people realize the level of 808 00:40:10,480 --> 00:40:14,640 Speaker 1: the incursion that has occurred, to the point where all 809 00:40:14,680 --> 00:40:16,840 Speaker 1: of this data is aggregated to the point they know what 810 00:40:17,000 --> 00:40:20,279 Speaker 1: toilet paper you prefer to buy.
Like, I'm talking, like, 811 00:40:20,320 --> 00:40:24,239 Speaker 1: people like Facebook knowing that. Um. Or the size of 812 00:40:24,280 --> 00:40:27,279 Speaker 1: the corporate oligarchy that controls the Internet, whether it's the 813 00:40:27,360 --> 00:40:32,000 Speaker 1: small core like Alphabet, Facebook, Apple; Microsoft's becoming a smaller 814 00:40:32,040 --> 00:40:34,680 Speaker 1: player, weirdly. But when you think about those big names, 815 00:40:35,600 --> 00:40:39,200 Speaker 1: they kind of like control everything, and every piece of 816 00:40:39,320 --> 00:40:43,759 Speaker 1: data about you and everything you move and say. And, 817 00:40:43,840 --> 00:40:45,480 Speaker 1: I think, I think, what's the end of that? I 818 00:40:45,480 --> 00:40:47,160 Speaker 1: don't think we got to the endgame of that. 819 00:40:47,760 --> 00:40:50,080 Speaker 1: But I don't know how we roll it back. And 820 00:40:50,120 --> 00:40:53,480 Speaker 1: that's the thing. So what's the prediction? My prediction is 821 00:40:54,040 --> 00:40:56,200 Speaker 1: it's gonna get worse, and we're gonna get to the 822 00:40:56,200 --> 00:41:00,759 Speaker 1: point where there isn't room to move without that surveillance 823 00:41:00,760 --> 00:41:03,080 Speaker 1: tracking you. And like, so, for example, you think of 824 00:41:03,120 --> 00:41:06,120 Speaker 1: things like the sci-fi Minority Report: you walk through the 825 00:41:06,160 --> 00:41:09,120 Speaker 1: mall and there's facial ID happening everywhere you go, with 826 00:41:09,560 --> 00:41:13,600 Speaker 1: targeted advertising at the mall. Oh, that's coming. I guarantee 827 00:41:13,840 --> 00:41:17,600 Speaker 1: that's coming. And all of that's happening already, and that 828 00:41:17,680 --> 00:41:21,400 Speaker 1: facial recognition stuff that's going on is happening currently, now; 829 00:41:21,760 --> 00:41:24,919 Speaker 1: we're just not that aware of it happening.
The cop 830 00:41:25,000 --> 00:41:28,000 Speaker 1: cars driving down the road, and every license plate is 831 00:41:28,040 --> 00:41:32,759 Speaker 1: being read by the cameras with OCR, optical character recognition, 832 00:41:33,040 --> 00:41:35,719 Speaker 1: and that's coming back, and they're tracking every car they 833 00:41:35,800 --> 00:41:37,920 Speaker 1: drive by on the highway, even though there's not a 834 00:41:37,960 --> 00:41:43,880 Speaker 1: GPS unit on your car. The ability to not be 835 00:41:44,000 --> 00:41:50,399 Speaker 1: tracked will soon be impossible. How's that? Yeah, I mean, allegedly, 836 00:41:50,960 --> 00:41:55,160 Speaker 1: when I was younger, there were like certain stupid petty 837 00:41:55,239 --> 00:41:58,000 Speaker 1: crimes I would commit just because, like, people will not 838 00:41:58,080 --> 00:42:00,160 Speaker 1: be able to do this in the future, and I 839 00:42:00,160 --> 00:42:02,839 Speaker 1: have a moral responsibility to steal the light bulbs from 840 00:42:02,840 --> 00:42:04,600 Speaker 1: in front of this bar and throw them in the trash. 841 00:42:04,600 --> 00:42:06,680 Speaker 1: It's like, one day that will be a thing 842 00:42:06,680 --> 00:42:09,080 Speaker 1: that people can't do without getting caught. And so, like, 843 00:42:09,120 --> 00:42:12,040 Speaker 1: I just, I had to, you know. There are like 844 00:42:12,120 --> 00:42:14,640 Speaker 1: some bright spots, because I think you're absolutely right, 845 00:42:14,680 --> 00:42:17,800 Speaker 1: there's no, on like a broader scale, there's no turning 846 00:42:17,840 --> 00:42:20,520 Speaker 1: back the clock for stuff like facial recognition and how 847 00:42:20,520 --> 00:42:22,279 Speaker 1: fucked up it's going to get.
There are states like 848 00:42:22,320 --> 00:42:24,400 Speaker 1: where I live, in Oregon, where like they have passed 849 00:42:24,440 --> 00:42:27,040 Speaker 1: laws that are just like, public facial recognition is 850 00:42:27,080 --> 00:42:30,320 Speaker 1: not a thing that is legal in this state. Um. 851 00:42:30,400 --> 00:42:34,200 Speaker 1: And I definitely support more attempts like that, because, again, 852 00:42:34,719 --> 00:42:37,160 Speaker 1: anything you can do to stymie them, to reduce 853 00:42:37,239 --> 00:42:40,080 Speaker 1: the spread of the grid, to reduce the profitability of 854 00:42:40,080 --> 00:42:45,520 Speaker 1: these things, even though it's, again, overall a doomed cause, right? Um, yeah, 855 00:42:45,560 --> 00:42:47,880 Speaker 1: I don't know. I mean, obviously I think that 856 00:42:47,880 --> 00:42:49,560 Speaker 1: that's a good law. But I don't know that laws 857 00:42:49,719 --> 00:42:53,440 Speaker 1: stop corporations when the corporations have more power than law. Yes, 858 00:42:53,480 --> 00:42:57,480 Speaker 1: of course. Um. And it's like, I mean, obviously you 859 00:42:57,520 --> 00:43:00,200 Speaker 1: can, you can ban it for police to use and 860 00:43:00,239 --> 00:43:03,120 Speaker 1: stuff, which does something to the extent that, you know, 861 00:43:03,280 --> 00:43:06,080 Speaker 1: they follow the law. But none of this is, I 862 00:43:06,120 --> 00:43:09,000 Speaker 1: don't know, like, I... That's one of the things that 863 00:43:09,080 --> 00:43:13,040 Speaker 1: makes me most depressed about the future, is the thought that, like, 864 00:43:15,360 --> 00:43:18,399 Speaker 1: the space for, this is not like a major issue, 865 00:43:18,440 --> 00:43:20,000 Speaker 1: I guess, but like the space for kids to just like 866 00:43:20,080 --> 00:43:23,840 Speaker 1: fuck around and do dumb shit when they're nineteen is 867 00:43:23,880 --> 00:43:25,839 Speaker 1: going to get so much smaller.
I mean, I would say, 868 00:43:25,840 --> 00:43:27,520 Speaker 1: I mean, I think the thing is, like, as a 869 00:43:27,640 --> 00:43:30,920 Speaker 1: natural human being, whether you're doing anything wrong, even if 870 00:43:30,960 --> 00:43:34,080 Speaker 1: you're not doing anything wrong, the need to feel like 871 00:43:34,120 --> 00:43:37,000 Speaker 1: you have a private space that's yours, your private 872 00:43:37,000 --> 00:43:39,600 Speaker 1: community space, and I'm not even talking about wrong or right here, 873 00:43:39,600 --> 00:43:42,600 Speaker 1: we're just talking about just that feeling that, at this moment, 874 00:43:43,000 --> 00:43:46,400 Speaker 1: this is my space where I'm not being watched, is 875 00:43:46,440 --> 00:43:51,160 Speaker 1: a natural, healthy need of the human orgasm, er, organism. 876 00:43:51,440 --> 00:43:56,879 Speaker 1: Um, and... interesting. Uh, but no, it's, it's 877 00:43:56,880 --> 00:43:59,239 Speaker 1: a human need. And I think we're gonna find those 878 00:43:59,239 --> 00:44:03,239 Speaker 1: spaces becoming smaller and smaller. And I think when 879 00:44:03,239 --> 00:44:04,840 Speaker 1: you said, what's your prediction, I hate to say it, 880 00:44:04,840 --> 00:44:07,520 Speaker 1: but I think the prediction is it will become impossible 881 00:44:07,560 --> 00:44:10,480 Speaker 1: to not be tracked. Now, the bright side of that, 882 00:44:10,560 --> 00:44:13,000 Speaker 1: the bright side of that, maybe, maybe there's a bright side. 883 00:44:13,520 --> 00:44:17,080 Speaker 1: Maybe at some point, when that's the reality, it could 884 00:44:17,120 --> 00:44:21,480 Speaker 1: somehow also affect the people that are powerful and the 885 00:44:21,520 --> 00:44:24,360 Speaker 1: people that are small.
And we all realize that humans 886 00:44:24,360 --> 00:44:28,239 Speaker 1: are humans, and therefore the failings that sometimes we have 887 00:44:28,400 --> 00:44:30,800 Speaker 1: as all human beings, we just kind of acknowledge it 888 00:44:30,800 --> 00:44:33,520 Speaker 1: and be like, oh, yeah, of course, that's just what 889 00:44:33,560 --> 00:44:36,280 Speaker 1: people do. Like, maybe we just realize people are people. 890 00:44:36,600 --> 00:44:38,480 Speaker 1: But the idea that there's never going to be a 891 00:44:38,520 --> 00:44:40,960 Speaker 1: space to not get tracked, I don't know, to me, 892 00:44:41,360 --> 00:44:44,920 Speaker 1: I find darkly disturbing. It is disturbing. I do think, 893 00:44:44,960 --> 00:44:47,000 Speaker 1: kind of to pivot off of what you were saying, 894 00:44:47,000 --> 00:44:49,239 Speaker 1: the other aspect of that that is more positive is 895 00:44:49,280 --> 00:44:53,680 Speaker 1: that all of this stuff, all of this surveillance shit, 896 00:44:54,440 --> 00:44:56,640 Speaker 1: um, or at least not all, but quite a bit 897 00:44:56,680 --> 00:45:00,160 Speaker 1: of it, is, you know, in a way, it's a 898 00:45:00,200 --> 00:45:03,320 Speaker 1: knife fight. There's no way that both parties don't get cut, 899 00:45:03,440 --> 00:45:05,759 Speaker 1: and, you know, the ones wielding the knife might get 900 00:45:05,800 --> 00:45:07,799 Speaker 1: cut less, but they're still going to get cut.
And 901 00:45:08,200 --> 00:45:10,799 Speaker 1: part of what that means in this situation is that 902 00:45:11,560 --> 00:45:14,200 Speaker 1: the prevalence of all of these different ways to surveil 903 00:45:14,280 --> 00:45:18,799 Speaker 1: and track also allows us to track, in the same 904 00:45:18,800 --> 00:45:22,720 Speaker 1: way that, like, police, law enforcement, watches people through their phones, 905 00:45:22,760 --> 00:45:24,600 Speaker 1: but also a hell of a lot of cops are 906 00:45:24,600 --> 00:45:29,680 Speaker 1: getting filmed doing fucked up shit now, right? Now, again, 907 00:45:30,280 --> 00:45:32,640 Speaker 1: the balance of the cuts, I don't think, is going 908 00:45:32,680 --> 00:45:34,400 Speaker 1: to work out in our favor, but it's not 909 00:45:34,520 --> 00:45:36,680 Speaker 1: going to be nothing on them either. And 910 00:45:36,760 --> 00:45:39,239 Speaker 1: you're right, I think there are, there are some things 911 00:45:39,320 --> 00:45:41,879 Speaker 1: that we will learn in the future about the people 912 00:45:41,960 --> 00:45:44,120 Speaker 1: in power in the world that it wouldn't have 913 00:45:44,160 --> 00:45:46,160 Speaker 1: been possible for us to learn in the past, or 914 00:45:46,200 --> 00:45:50,200 Speaker 1: may not be possible even right now. And, 915 00:45:50,239 --> 00:45:52,000 Speaker 1: if we learn that about people in power, then they 916 00:45:52,040 --> 00:45:54,000 Speaker 1: can't weaponize it as much against the people that aren't 917 00:45:54,000 --> 00:45:56,120 Speaker 1: in power.
Right, yeah. Yeah, you know, one thing that 918 00:45:56,200 --> 00:45:58,719 Speaker 1: I'm, because I'm thinking a lot about the fact that 919 00:45:58,960 --> 00:46:03,080 Speaker 1: a bunch of folks in the reproductive healthcare industry have pointed 920 00:46:03,120 --> 00:46:09,120 Speaker 1: out that right wingers have started using drones to follow 921 00:46:09,200 --> 00:46:11,239 Speaker 1: people home from, like, Planned Parenthoods, and follow them 922 00:46:11,239 --> 00:46:13,160 Speaker 1: to their cars, to like build databases of the people 923 00:46:13,160 --> 00:46:16,960 Speaker 1: who are going to places to potentially, like, do that 924 00:46:17,040 --> 00:46:21,040 Speaker 1: kind of reproductive healthcare that these folks don't think should exist. Um, 925 00:46:21,400 --> 00:46:23,920 Speaker 1: the other side of it, though, is that, um, it 926 00:46:24,120 --> 00:46:28,920 Speaker 1: is also possible to surveil them. Um. And it will 927 00:46:28,960 --> 00:46:32,799 Speaker 1: be possible to track the people doing that sort of thing, 928 00:46:32,840 --> 00:46:34,560 Speaker 1: and it will be possible to do that in terms 929 00:46:34,600 --> 00:46:36,640 Speaker 1: of, like, legal accountability. And it will be possible to 930 00:46:36,640 --> 00:46:41,120 Speaker 1: do that for the people who embrace, uh, questionably legal 931 00:46:41,239 --> 00:46:46,320 Speaker 1: tactics for, for frustrating those efforts, um, or illegal tactics 932 00:46:46,360 --> 00:46:50,160 Speaker 1: for frustrating those efforts. They have access to the same technology. Um. 933 00:46:50,520 --> 00:46:53,200 Speaker 1: And again, it's, it is a knife that will 934 00:46:53,280 --> 00:46:57,040 Speaker 1: cut everybody, um. And I guess that's better than just 935 00:46:57,200 --> 00:47:00,320 Speaker 1: one person getting cut in this situation. That's, that's the 936 00:47:00,360 --> 00:47:02,319 Speaker 1: concern I have, right? I agree with that.
Like I said, 937 00:47:02,360 --> 00:47:05,160 Speaker 1: technology, it's a weapon, and it's weaponized in all directions, 938 00:47:05,200 --> 00:47:06,799 Speaker 1: depending on how you use it, for good or for bad. 939 00:47:06,920 --> 00:47:08,719 Speaker 1: And so this is the same place I come to 940 00:47:08,800 --> 00:47:10,719 Speaker 1: when it comes to the gun control argument. I mean, 941 00:47:11,400 --> 00:47:16,759 Speaker 1: we can, no, no, no, the same, the same problem, right? 942 00:47:17,080 --> 00:47:21,480 Speaker 1: Because if we allow only one side to have all 943 00:47:21,560 --> 00:47:23,920 Speaker 1: of the control and power and understanding of the technology, 944 00:47:24,280 --> 00:47:26,640 Speaker 1: then we ourselves are at a huge deficit. We 945 00:47:26,800 --> 00:47:30,200 Speaker 1: cannot defend ourselves or fight back. So when it comes 946 00:47:30,239 --> 00:47:32,800 Speaker 1: to this kind of data and technology, knowing the basic 947 00:47:32,880 --> 00:47:35,560 Speaker 1: fundamentals of what you can do to protect yourself, understanding 948 00:47:35,600 --> 00:47:38,960 Speaker 1: the reality of what the surveillance state or corporation is, 949 00:47:39,360 --> 00:47:41,319 Speaker 1: and then doing your best to not make it easy 950 00:47:41,440 --> 00:47:43,839 Speaker 1: for them, is at least one step forward. But if 951 00:47:43,880 --> 00:47:46,879 Speaker 1: we don't own this technology, if we don't own the tech, 952 00:47:47,440 --> 00:47:50,120 Speaker 1: someone else will, and they will use it against us. 953 00:47:50,200 --> 00:47:52,879 Speaker 1: It's as simple as that.
And, like, there's super simple 954 00:47:52,920 --> 00:47:54,279 Speaker 1: stuff. Like, I was gonna bring this up, like, 955 00:47:54,880 --> 00:47:56,640 Speaker 1: you can't see video because it's a podcast, but, like, 956 00:47:56,680 --> 00:48:00,160 Speaker 1: there's these cool glasses out there called Reflectacles, but 957 00:48:00,239 --> 00:48:03,400 Speaker 1: I'm showing you, Robert, and they look like regular sunglasses. 958 00:48:03,480 --> 00:48:06,880 Speaker 1: But when you put them on, they reflect 959 00:48:07,040 --> 00:48:10,920 Speaker 1: IR light and actually mess with cameras in a 960 00:48:10,960 --> 00:48:12,719 Speaker 1: way that turns your face into a ball 961 00:48:12,800 --> 00:48:15,399 Speaker 1: of light. So you can wear these, you can wear 962 00:48:15,400 --> 00:48:17,359 Speaker 1: these so-called Reflectacles. You can wear them and just walk 963 00:48:17,400 --> 00:48:20,080 Speaker 1: around the mall, and all the cameras get blown out 964 00:48:20,160 --> 00:48:23,160 Speaker 1: by your, by your glasses. Like, doing that just because 965 00:48:23,239 --> 00:48:26,160 Speaker 1: you can, it's kind of fun. That's the hot shit. 966 00:48:26,520 --> 00:48:29,319 Speaker 1: That's the shit I was promised that, that at least 967 00:48:29,400 --> 00:48:32,759 Speaker 1: does exist. It's not everything I had hoped it would 968 00:48:32,800 --> 00:48:34,680 Speaker 1: be in terms of its ability, but it is, like, 969 00:48:34,800 --> 00:48:37,000 Speaker 1: that kind of stuff rules, and I will be picking 970 00:48:37,040 --> 00:48:39,520 Speaker 1: up a pair of those. Um. Well, we should probably 971 00:48:39,560 --> 00:48:42,080 Speaker 1: close out. I did want to note, because I mentioned this, um, 972 00:48:42,200 --> 00:48:43,640 Speaker 1: I got something a little wrong when I was talking 973 00:48:43,680 --> 00:48:46,719 Speaker 1: about the facial recognition ban. Um.
It is an 974 00:48:46,920 --> 00:48:50,520 Speaker 1: ordinance in the City of Portland itself. Um. It's the 975 00:48:50,600 --> 00:48:53,000 Speaker 1: first city that has done this, and it prohibits the 976 00:48:53,120 --> 00:48:57,480 Speaker 1: use of public facial recognition technology by all private businesses 977 00:48:57,520 --> 00:49:00,759 Speaker 1: in the city. Um. So that is the scope of 978 00:49:00,840 --> 00:49:02,680 Speaker 1: the ban, a ban that exists in Portland. I 979 00:49:02,719 --> 00:49:04,600 Speaker 1: recommend looking it up. It is the kind of thing 980 00:49:04,680 --> 00:49:08,399 Speaker 1: that I would support everyone pushing forward in their city. Um. 981 00:49:08,760 --> 00:49:11,759 Speaker 1: Because, again, the more holes you can make in this thing, 982 00:49:11,960 --> 00:49:13,960 Speaker 1: the better. Yeah. I don't want to put that down. 983 00:49:14,040 --> 00:49:15,719 Speaker 1: That's a good thing. But the challenge of this is, 984 00:49:15,760 --> 00:49:17,880 Speaker 1: just like I mentioned earlier, moving the data out of 985 00:49:17,920 --> 00:49:20,920 Speaker 1: the confines. Like the minute photos, like I 986 00:49:21,040 --> 00:49:23,120 Speaker 1: take my iPhone and scan the crowd and then put 987 00:49:23,160 --> 00:49:25,880 Speaker 1: that picture up on the internet, it's not under their 988 00:49:25,960 --> 00:49:31,160 Speaker 1: jurisdiction, and all that happens on every face in that. Yep, 989 00:49:31,760 --> 00:49:34,160 Speaker 1: and that is, again, well, we'll do another episode at 990 00:49:34,200 --> 00:49:35,920 Speaker 1: some point about things that you can do to just 991 00:49:36,040 --> 00:49:39,480 Speaker 1: get like there. That's a whole different bag of tricks. Um, 992 00:49:39,640 --> 00:49:42,040 Speaker 1: but this has been really useful and really valuable, Carl. 993 00:49:42,200 --> 00:49:44,000 Speaker 1: Do you want to plug anything before we roll out here?
994 00:49:44,800 --> 00:49:46,719 Speaker 1: Not much, just my normal thing. If you're interested in 995 00:49:46,760 --> 00:49:49,160 Speaker 1: this kind of content, but with a more firearms oriented thing, 996 00:49:49,200 --> 00:49:51,400 Speaker 1: you can find me at InRange dot tv. But 997 00:49:51,520 --> 00:49:54,160 Speaker 1: you'll also find some information security stuff there as well. 998 00:49:54,239 --> 00:49:57,360 Speaker 1: I cover that intermittently when it applies to both topics. 999 00:49:57,440 --> 00:50:00,640 Speaker 1: So if you, even if you disagree, but 1000 00:50:00,719 --> 00:50:03,360 Speaker 1: appreciate my approach to this, come check me out. I 1001 00:50:03,440 --> 00:50:07,319 Speaker 1: appreciate it. Awesome. Check out Carl, check out InRange TV, 1002 00:50:08,080 --> 00:50:11,759 Speaker 1: and continue to listen to podcasts, because the only thing 1003 00:50:12,960 --> 00:50:16,560 Speaker 1: that will save us is podcasts. I don't know that I'd say the 1004 00:50:16,560 --> 00:50:37,919 Speaker 1: same, but it's good for business. Oh, It Could Happen Here, 1005 00:50:38,320 --> 00:50:41,800 Speaker 1: um, is the podcast where we're talking about things falling apart. 1006 00:50:41,960 --> 00:50:45,560 Speaker 1: And, you know, a place where things have fallen apart 1007 00:50:45,640 --> 00:50:48,600 Speaker 1: a bit is large chunks of Ukraine, due to a 1008 00:50:49,000 --> 00:50:52,920 Speaker 1: Russian invasion. Um. And, you know, we've chatted about this 1009 00:50:52,960 --> 00:50:54,600 Speaker 1: a bit on the show. We've had some interviews with 1010 00:50:54,680 --> 00:50:56,520 Speaker 1: some folks who are living and fighting over there.
And 1011 00:50:56,600 --> 00:51:00,319 Speaker 1: today we're going to talk with Jake Hanrahan, uh, friend 1012 00:51:00,400 --> 00:51:03,760 Speaker 1: of the pod, um, who has been over a couple 1013 00:51:03,800 --> 00:51:08,120 Speaker 1: of times this year, including since the more expanded conflict began. 1014 00:51:08,200 --> 00:51:10,880 Speaker 1: He has just released a new documentary on the Popular 1015 00:51:10,960 --> 00:51:15,040 Speaker 1: Front YouTube called Ukraine's Anti-Fascist Football Hooligans Fighting the 1016 00:51:15,120 --> 00:51:18,759 Speaker 1: Russian Invasion. Jake, how you doing? Thanks for having me back. 1017 00:51:19,080 --> 00:51:22,560 Speaker 1: Thanks for being on. Now, Jake. First off, I guess 1018 00:51:22,600 --> 00:51:27,960 Speaker 1: we can get into YouTube censorship stuff, but, um, I 1019 00:51:28,000 --> 00:51:30,920 Speaker 1: want to chat about, like, how this story came about 1020 00:51:31,000 --> 00:51:33,720 Speaker 1: and when you kind of got in contact with these people. 1021 00:51:33,800 --> 00:51:36,840 Speaker 1: Because, kind of in brief, what you have, you know, 1022 00:51:36,960 --> 00:51:40,440 Speaker 1: the CliffsNotes that you hear from, like, folks who 1023 00:51:40,640 --> 00:51:42,320 Speaker 1: have kind of an axe to grind, is that, like, 1024 00:51:42,960 --> 00:51:45,960 Speaker 1: you know, Ukraine is all neo-Nazis and the government's 1025 00:51:46,000 --> 00:51:47,960 Speaker 1: all run by neo-Nazis. And the reality is that 1026 00:51:48,080 --> 00:51:51,560 Speaker 1: Ukraine obviously has a substantial Nazi problem.
And as with 1027 00:51:51,719 --> 00:51:55,000 Speaker 1: any country where you have a substantial Nazi problem and 1028 00:51:55,440 --> 00:51:58,239 Speaker 1: some degree of freedom in terms of, you know, your 1029 00:51:58,280 --> 00:52:01,719 Speaker 1: ability to organize for political purposes, you also have 1030 00:52:01,800 --> 00:52:03,839 Speaker 1: a shitload of people who are anti-fascists and who 1031 00:52:03,880 --> 00:52:07,240 Speaker 1: have been fighting those fascists in the street, um, often 1032 00:52:07,320 --> 00:52:10,680 Speaker 1: with intense levels of violence. Um. And this is a 1033 00:52:10,760 --> 00:52:13,840 Speaker 1: story about a group of those people, um, who have 1034 00:52:13,960 --> 00:52:17,680 Speaker 1: now kind of retooled their organization and capacity towards fighting 1035 00:52:17,719 --> 00:52:21,400 Speaker 1: the Russian invasion. Yeah, man, exactly that. I mean, so 1036 00:52:21,560 --> 00:52:23,320 Speaker 1: what I wanted to do with Popular Front, you know, 1037 00:52:23,560 --> 00:52:27,360 Speaker 1: I've been reporting from Ukraine since 2016, and I've been 1038 00:52:27,400 --> 00:52:29,160 Speaker 1: there more than ten times on the ground in the 1039 00:52:29,239 --> 00:52:32,279 Speaker 1: Donbas, like, way before, you know, people were focused on 1040 00:52:32,320 --> 00:52:35,000 Speaker 1: the area and before the invasions, so I was very 1041 00:52:35,040 --> 00:52:39,120 Speaker 1: aware of, yeah, there is a significant fascist element to 1042 00:52:39,320 --> 00:52:42,560 Speaker 1: the militias out there. But it's the same with any country 1043 00:52:42,560 --> 00:52:44,480 Speaker 1: in Europe that would have a war, it would have the 1044 00:52:44,560 --> 00:52:46,560 Speaker 1: exact same thing. Trust me, if we had it in Britain, 1045 00:52:46,920 --> 00:52:49,520 Speaker 1: we would have a similar issue. You know, Eastern Europe.
1046 00:52:49,520 --> 00:52:52,920 Speaker 1: Obviously it's a little bit more hardcore, um, but that's 1047 00:52:52,960 --> 00:52:54,800 Speaker 1: the way it is. That's Eastern Europe for you. And 1048 00:52:54,880 --> 00:52:56,520 Speaker 1: I will mention just at the top as well, I 1049 00:52:56,600 --> 00:52:59,600 Speaker 1: would argue that Russia has a much worse neo-Nazi problem. 1050 00:52:59,719 --> 00:53:04,799 Speaker 1: Um, more than fifteen people were killed by 1051 00:53:04,840 --> 00:53:08,680 Speaker 1: an actual neo-Nazi serial killer gang in Moscow that 1052 00:53:08,840 --> 00:53:12,080 Speaker 1: filmed these attacks. They have a massive neo-Nazi party. 1053 00:53:12,800 --> 00:53:15,640 Speaker 1: Um, you know, they're exporting Nazis all across Europe, 1054 00:53:15,680 --> 00:53:18,560 Speaker 1: and we know there are several, um, you know, well 1055 00:53:18,640 --> 00:53:21,400 Speaker 1: trained neo-Nazi battalions fighting for the pro-Russian side. So 1056 00:53:21,440 --> 00:53:23,719 Speaker 1: it's neither here nor there. Yes, there's Nazi problems in 1057 00:53:23,719 --> 00:53:25,719 Speaker 1: the region, but I didn't want to constantly be on 1058 00:53:25,800 --> 00:53:28,279 Speaker 1: this back foot, like, no, actually, yes, there's a Nazi problem, 1059 00:53:28,280 --> 00:53:30,560 Speaker 1: but not this, not that. I was like, how can 1060 00:53:30,640 --> 00:53:32,680 Speaker 1: we do a documentary that's kind of a positive way 1061 00:53:32,719 --> 00:53:35,320 Speaker 1: to be like, well, instead of saying no, not everyone 1062 00:53:35,480 --> 00:53:37,400 Speaker 1: is this, or having to film with a unit and 1063 00:53:37,480 --> 00:53:40,080 Speaker 1: then being like, actually, these guys are fascists. How can 1064 00:53:40,160 --> 00:53:45,000 Speaker 1: I show, you know... like, they're uncomfortable. Yeah, right, like, 1065 00:53:45,160 --> 00:53:49,520 Speaker 1: oh, a Totenkopf again.
Like, it was like, 1066 00:53:49,640 --> 00:53:51,680 Speaker 1: how can I kind of put a doc out there 1067 00:53:52,160 --> 00:53:54,920 Speaker 1: where it's like, oh, no, actually, like, here's a different 1068 00:53:55,000 --> 00:53:57,480 Speaker 1: side to it. And, you know, this group obviously, as 1069 00:53:57,520 --> 00:54:00,880 Speaker 1: soon as the war started... again, Ukraine is a country 1070 00:54:00,920 --> 00:54:03,360 Speaker 1: of forty four million people, and it's a very diverse, 1071 00:54:03,680 --> 00:54:08,000 Speaker 1: a very smart, very open country, in terms of people 1072 00:54:08,000 --> 00:54:09,879 Speaker 1: will tell you what they think, and they will argue 1073 00:54:09,880 --> 00:54:12,319 Speaker 1: with you, and, you know... you can 1074 00:54:12,400 --> 00:54:15,520 Speaker 1: have, like, really serious discussions with people about politics there 1075 00:54:15,560 --> 00:54:19,080 Speaker 1: and not fall out, you know. Um, so they're very, 1076 00:54:19,200 --> 00:54:21,600 Speaker 1: I think, like, a very clever people, really nice people. 1077 00:54:21,640 --> 00:54:24,759 Speaker 1: I love Ukraine. I love Ukrainians. So to me, 1078 00:54:24,880 --> 00:54:26,680 Speaker 1: it was, I knew about the players. Like, yeah, of course, 1079 00:54:26,719 --> 00:54:29,640 Speaker 1: there's a massive anti-fascist element in Ukraine. Okay, it's 1080 00:54:29,680 --> 00:54:33,320 Speaker 1: definitely smaller than the fascist element, but already since the 1081 00:54:33,400 --> 00:54:36,879 Speaker 1: war started, there's Ecoplatform, there's Kharkiv hardcore, there's 1082 00:54:36,920 --> 00:54:40,640 Speaker 1: the Resistance Committee, there's Hoods Hoods Klan.
There's Operation Solidarity, like, 1083 00:54:41,080 --> 00:54:44,480 Speaker 1: there's the Makhno machine gun repair unit. Like, 1084 00:54:44,560 --> 00:54:49,120 Speaker 1: there's so many different anti-fascist, left wing elements to 1085 00:54:49,200 --> 00:54:50,960 Speaker 1: the conflict. They just get a lot less 1086 00:54:51,000 --> 00:54:54,680 Speaker 1: attention, because the fascists have got really good at propaganda 1087 00:54:54,760 --> 00:54:57,080 Speaker 1: over the years. And, let's be honest, a lot 1088 00:54:57,160 --> 00:54:59,360 Speaker 1: of the fascist groups are fighting in the East, and 1089 00:54:59,520 --> 00:55:03,080 Speaker 1: right now it's kind of... Yeah, well, it's 1090 00:55:03,520 --> 00:55:06,040 Speaker 1: all hands on deck, right? It's like, everyone's like, yeah, okay, 1091 00:55:06,560 --> 00:55:09,520 Speaker 1: we don't really care, like, we just want to not die, 1092 00:55:09,600 --> 00:55:13,879 Speaker 1: which is understandable. So my point is, um, I looked 1093 00:55:13,920 --> 00:55:16,239 Speaker 1: at this group, the Resistance Committee, which is this 1094 00:55:16,360 --> 00:55:21,840 Speaker 1: kind of anti-authoritarian, um, you know, coalition of various 1095 00:55:21,920 --> 00:55:25,600 Speaker 1: different units. They have RevDia under their wing, which 1096 00:55:25,719 --> 00:55:27,400 Speaker 1: is an anarchist group in Ukraine that I made a 1097 00:55:27,440 --> 00:55:29,759 Speaker 1: documentary with a few years ago. So I was looking 1098 00:55:29,760 --> 00:55:31,399 Speaker 1: at, maybe we'll do a doc on RevDia again 1099 00:55:31,440 --> 00:55:33,400 Speaker 1: now that they're fighting on the front. But then I 1100 00:55:33,440 --> 00:55:36,000 Speaker 1: see this other group with them, Hoods Hoods Klan, and 1101 00:55:36,080 --> 00:55:38,239 Speaker 1: it's like, what? Who? Like, firstly, the name is kind 1102 00:55:38,239 --> 00:55:41,600 Speaker 1: of weird in the US.
That brings up some unpleasant 1103 00:55:41,640 --> 00:55:44,879 Speaker 1: connotations in the US. Yeah, I mean, it didn't really 1104 00:55:45,080 --> 00:55:47,840 Speaker 1: click to me. But what does, like, hoods, 1105 00:55:48,000 --> 00:55:51,040 Speaker 1: hoods mean? So basically, when they would go and do... 1106 00:55:51,239 --> 00:55:53,279 Speaker 1: you know, when they would go and beat up fascists, 1107 00:55:53,719 --> 00:55:57,400 Speaker 1: they'd be like, right, hoods up, because you're, like, 1108 00:55:57,480 --> 00:55:59,360 Speaker 1: putting your hoodie up so you don't get, like, spotted, 1109 00:55:59,480 --> 00:56:03,279 Speaker 1: right, before the act. Hoods up. Um. There's footage of 1110 00:56:03,360 --> 00:56:05,400 Speaker 1: them beating up Nazis as well, chanting, they had a 1111 00:56:05,480 --> 00:56:08,319 Speaker 1: chant, like, you know, to put the fucking fear 1112 00:56:08,360 --> 00:56:11,719 Speaker 1: into them. Yeah, yeah. And then Klan, I 1113 00:56:11,760 --> 00:56:14,480 Speaker 1: mean, the Ukrainian transliteration of clan, it's with the K. 1114 00:56:15,080 --> 00:56:19,800 Speaker 1: It's not about the, you know... it's just, also, anglicization 1115 00:56:19,920 --> 00:56:23,720 Speaker 1: can lead to some unfortunate things, right, right. But also, 1116 00:56:23,840 --> 00:56:26,200 Speaker 1: you know, they're smart guys, and at first I thought 1117 00:56:26,239 --> 00:56:27,600 Speaker 1: this wasn't true, but then I spoke to them and it 1118 00:56:27,719 --> 00:56:30,000 Speaker 1: was true. They were kind of aware. They're like, yeah, 1119 00:56:30,160 --> 00:56:33,680 Speaker 1: HHK, Hoods Hoods Klan, they're kind of trolling KKK. 1120 00:56:34,000 --> 00:56:36,320 Speaker 1: Like, it's like a second meaning, because in Ukraine, 1121 00:56:36,400 --> 00:56:38,680 Speaker 1: you know, they've got that culture. They're very cheeky.
They 1122 00:56:38,760 --> 00:56:40,759 Speaker 1: think it's very funny to be like, ha ha, you know, 1123 00:56:41,040 --> 00:56:43,120 Speaker 1: fuck you. Um. So for them, they were like, yeah, 1124 00:56:43,160 --> 00:56:46,120 Speaker 1: we're basically trolling the fascists, like, they hear Hoods Hoods 1125 00:56:46,160 --> 00:56:49,160 Speaker 1: Klan and they're like, surprise, sorry, we're anti-fascists, you 1126 00:56:49,200 --> 00:56:52,200 Speaker 1: know what I mean, your head's broken. So it was 1127 00:56:52,280 --> 00:56:54,000 Speaker 1: kind of that vibe, and, you know, they didn't really 1128 00:56:54,040 --> 00:56:55,840 Speaker 1: think about it. And when I asked Anton, you know, 1129 00:56:56,160 --> 00:56:58,080 Speaker 1: he's, like, kind of the de facto leader, he 1130 00:56:58,239 --> 00:56:59,920 Speaker 1: told me this, and then he was 1131 00:57:00,040 --> 00:57:02,480 Speaker 1: like, I just kind of wanted to piss people off as well. Um, 1132 00:57:02,680 --> 00:57:05,080 Speaker 1: and you gotta remember, these guys started over ten 1133 00:57:05,160 --> 00:57:08,200 Speaker 1: years ago, before, you know, politics was as online as 1134 00:57:08,280 --> 00:57:10,760 Speaker 1: it is, um, and they started off in the hardcore 1135 00:57:10,840 --> 00:57:13,040 Speaker 1: punk scene. Now, you know, I'm sure you know, like, 1136 00:57:13,120 --> 00:57:15,279 Speaker 1: you know, hardcore punk, especially in Europe, is like a 1137 00:57:15,480 --> 00:57:20,040 Speaker 1: very, very exciting, very fun, very happy and, like, gnarly 1138 00:57:20,280 --> 00:57:22,360 Speaker 1: fucking scene. So for them, it was like, yeah, we're 1139 00:57:22,440 --> 00:57:24,160 Speaker 1: Hoods Hoods Klan, like, you know what I mean. 1140 00:57:24,280 --> 00:57:27,280 Speaker 1: But unfortunately, some people in America are like, why did 1141 00:57:27,320 --> 00:57:29,200 Speaker 1: they go with Hoods Hoods Klan?
I don't believe that they're 1142 00:57:29,200 --> 00:57:32,560 Speaker 1: anti-fascist. It's just like, mate, there's over seventy videos 1143 00:57:32,600 --> 00:57:36,880 Speaker 1: of them beating up Nazis successfully. But it's a whole 1144 00:57:36,960 --> 00:57:39,920 Speaker 1: continent that doesn't have the same history as the United States, 1145 00:57:40,040 --> 00:57:42,800 Speaker 1: right? You can't... I mean, yeah, I mean, even if 1146 00:57:42,840 --> 00:57:46,080 Speaker 1: you said in England, like, KKK, like, now, people would 1147 00:57:46,080 --> 00:57:48,520 Speaker 1: be like, who? Oh yeah, like, yeah, I've heard of that. 1148 00:57:48,600 --> 00:57:50,640 Speaker 1: It's not like we didn't have it here, like that, 1149 00:57:50,760 --> 00:57:52,880 Speaker 1: you know. Yeah, it's one of those. So, I mean, 1150 00:57:52,960 --> 00:57:55,480 Speaker 1: one of the things that, um, that's interesting here, that 1151 00:57:55,600 --> 00:57:57,800 Speaker 1: you hit on in your documentary, is, like, these folks, 1152 00:57:58,200 --> 00:58:01,160 Speaker 1: these are not just, like, um, anti- 1153 00:58:01,200 --> 00:58:04,840 Speaker 1: authoritarian folks, they're very much committed to anti-racism, 1154 00:58:04,920 --> 00:58:08,240 Speaker 1: which is, um, you know... in a place like Ukraine, where 1155 00:58:09,640 --> 00:58:12,160 Speaker 1: the history of there being, you know, folks who are 1156 00:58:12,240 --> 00:58:15,080 Speaker 1: not white is not quite as extensive as it is 1157 00:58:15,120 --> 00:58:16,800 Speaker 1: in a lot of places, it's really interesting to me 1158 00:58:16,840 --> 00:58:19,840 Speaker 1: to have people who are kind of organizing specifically for 1159 00:58:19,920 --> 00:58:23,040 Speaker 1: that purpose. Um, and I think it's really cool. Yeah, yeah, 1160 00:58:23,120 --> 00:58:24,880 Speaker 1: it is really cool. And it's, for them...
What I 1161 00:58:25,000 --> 00:58:28,240 Speaker 1: found very fascinating is it's just natural. So, you know, 1162 00:58:28,480 --> 00:58:30,560 Speaker 1: I asked, you know, about their political ideology. Some of 1163 00:58:30,600 --> 00:58:33,000 Speaker 1: them are like, well, some of us are anarchists, some 1164 00:58:33,160 --> 00:58:35,800 Speaker 1: of us are kind of anti-fascists but otherwise kind 1165 00:58:35,840 --> 00:58:38,840 Speaker 1: of apolitical. And, you know, it's very simple 1166 00:58:38,920 --> 00:58:40,680 Speaker 1: for them. It's like... I asked, how 1167 00:58:40,720 --> 00:58:42,520 Speaker 1: come you guys are anti-fascists, and they're like, well, 1168 00:58:42,840 --> 00:58:44,760 Speaker 1: we just see life differently, like, you know. It's like, 1169 00:58:45,200 --> 00:58:49,200 Speaker 1: obviously, like, there was no big political theory. It was 1170 00:58:49,240 --> 00:58:51,240 Speaker 1: just like, no, basically they were like, it's 1171 00:58:51,280 --> 00:58:53,920 Speaker 1: just wrong, you know, like, fascism is just wrong. And 1172 00:58:54,040 --> 00:58:57,000 Speaker 1: we're tough guys, you know, and we 1173 00:58:57,080 --> 00:58:59,040 Speaker 1: wanted to be the ones that said no, we're 1174 00:58:59,080 --> 00:59:02,280 Speaker 1: not the fascists, we're the anti-fascists. And luckily for them, 1175 00:59:02,360 --> 00:59:05,080 Speaker 1: they had a really good friendship group and a very 1176 00:59:05,200 --> 00:59:07,400 Speaker 1: solid group who were all very good at combat sports. 1177 00:59:07,960 --> 00:59:10,240 Speaker 1: And, like, in the doc, you know, Anton says, our 1178 00:59:10,360 --> 00:59:13,760 Speaker 1: enemy is almost every other Ukrainian football firm in the 1179 00:59:13,800 --> 00:59:17,240 Speaker 1: whole of the country.
But if you ask even their enemies, 1180 00:59:17,440 --> 00:59:20,040 Speaker 1: they will tell you, like, yeah, unfortunately, those guys are tough, 1181 00:59:20,200 --> 00:59:24,280 Speaker 1: you know, they can fight. You know, they have to be. Yeah, exactly, 1182 00:59:24,360 --> 00:59:25,840 Speaker 1: they had to be. They were like, we had to 1183 00:59:25,920 --> 00:59:28,760 Speaker 1: be, you know. So, I mean, I do my research. 1184 00:59:28,840 --> 00:59:32,640 Speaker 1: I found, um, kind of a fascist football ultras 1185 00:59:32,840 --> 00:59:37,120 Speaker 1: forum in Eastern Europe that banned any mention of Hoods Hoods Klan, 1186 00:59:37,680 --> 00:59:39,480 Speaker 1: and it kind of boiled down to the fact they 1187 00:59:39,520 --> 00:59:42,040 Speaker 1: were just so embarrassed that so many of the fascist 1188 00:59:42,080 --> 00:59:45,640 Speaker 1: groups were getting beaten up, like, by anti-fascists, 1189 00:59:45,640 --> 00:59:47,760 Speaker 1: and often outnumbered, you know. It even got to a 1190 00:59:47,800 --> 00:59:50,760 Speaker 1: point where, with Hoods Hoods Klan, they wouldn't even 1191 00:59:50,800 --> 00:59:53,640 Speaker 1: talk to them to do, like, arranged fights anymore in 1192 00:59:53,680 --> 00:59:56,560 Speaker 1: the field. So instead of quitting, Hoods Hoods Klan said, okay, 1193 00:59:56,640 --> 00:59:58,200 Speaker 1: then when we see you, we'll just beat you up 1194 00:59:58,240 --> 00:59:59,840 Speaker 1: in the subway, we'll beat you up in the street, 1195 01:00:00,480 --> 01:00:02,280 Speaker 1: you know. And a lot of people might say, oh, 1196 01:00:02,320 --> 01:00:04,480 Speaker 1: well, this is violent. So for me, the football hooliganism 1197 01:00:04,600 --> 01:00:07,200 Speaker 1: side of it, I don't see an issue with it personally.
1198 01:00:07,280 --> 01:00:10,240 Speaker 1: I mean, they're not attacking anyone innocent, they're not attacking 1199 01:00:10,320 --> 01:00:12,600 Speaker 1: bystanders. It was all very contained. It was all 1200 01:00:12,760 --> 01:00:15,240 Speaker 1: very, you know... that was their thing, you know. 1201 01:00:15,440 --> 01:00:17,600 Speaker 1: So that, to me, is whatever. And when you're 1202 01:00:17,640 --> 01:00:20,680 Speaker 1: talking about neo-Nazi groups that were, I mean, in Ukraine, 1203 01:00:20,720 --> 01:00:25,240 Speaker 1: they've stabbed up the Roma community, they're destroying LGBT events, 1204 01:00:25,840 --> 01:00:27,800 Speaker 1: and, you know, the Klan were just like, no, we're 1205 01:00:27,800 --> 01:00:29,560 Speaker 1: not about that, we don't think you should 1206 01:00:29,600 --> 01:00:31,800 Speaker 1: do that. And so they formed, and for ten years 1207 01:00:31,840 --> 01:00:35,040 Speaker 1: they were fighting. But now they have called a truce, 1208 01:00:35,280 --> 01:00:37,640 Speaker 1: because, you know, Anton explains in the doc. 1209 01:00:37,720 --> 01:00:41,280 Speaker 1: He says, look, there's a bigger problem now. Because Ukraine 1210 01:00:41,320 --> 01:00:44,000 Speaker 1: is actually not a Nazi junta, as the Kremlin says. 1211 01:00:44,240 --> 01:00:46,640 Speaker 1: It's actually quite easy to kind of, you know... it's 1212 01:00:46,640 --> 01:00:49,560 Speaker 1: a very small subset in the relative size 1213 01:00:49,600 --> 01:00:52,240 Speaker 1: of the actual military. So, you know, actually, for them, 1214 01:00:52,280 --> 01:00:53,880 Speaker 1: they said, well, yeah, it makes sense. We put all 1215 01:00:53,920 --> 01:00:57,480 Speaker 1: our other political differences aside, because this is way bigger.
1216 01:00:57,520 --> 01:00:59,720 Speaker 1: You're talking about one of the most powerful militaries on 1217 01:00:59,800 --> 01:01:03,120 Speaker 1: earth invading our country and killing our people. I mean, 1218 01:01:03,160 --> 01:01:06,120 Speaker 1: we've seen the massacres in Bucha and Irpin. Um, 1219 01:01:06,200 --> 01:01:09,880 Speaker 1: you know, civilians killed, hands behind their backs, executed 1220 01:01:09,880 --> 01:01:12,480 Speaker 1: in the street. Um, thirty of the people killed in 1221 01:01:12,520 --> 01:01:15,880 Speaker 1: Bucha were children. Like, you know, this is just insane. 1222 01:01:15,960 --> 01:01:17,840 Speaker 1: So for them, they were like, yeah, we can, 1223 01:01:18,200 --> 01:01:20,520 Speaker 1: we can call the truce. You know, we don't like them, 1224 01:01:20,800 --> 01:01:22,440 Speaker 1: but right now we're not going to beat each other 1225 01:01:22,600 --> 01:01:24,960 Speaker 1: up on the front line. Um. But I think it 1226 01:01:25,040 --> 01:01:28,160 Speaker 1: really kind of shows the testament of how serious Hoods 1227 01:01:28,200 --> 01:01:31,040 Speaker 1: Hoods Klan are about the anti-fascism, that even whilst in 1228 01:01:31,120 --> 01:01:35,840 Speaker 1: the truce, most of them actually still joined the Resistance Committee, 1229 01:01:35,840 --> 01:01:39,160 Speaker 1: the anti-authoritarian groups, so they're not just directly next 1230 01:01:39,240 --> 01:01:42,520 Speaker 1: to fascist battalions. But again, you know, a lot is 1231 01:01:42,600 --> 01:01:44,360 Speaker 1: changing out there on the front now, I think. Yeah, 1232 01:01:44,400 --> 01:01:45,680 Speaker 1: I don't know. Anton said to me, he was like, 1233 01:01:45,680 --> 01:01:47,240 Speaker 1: I'll be honest with you, like, we didn't put this 1234 01:01:47,280 --> 01:01:48,600 Speaker 1: in the doc, he said, I'll be honest.
I think 1235 01:01:48,640 --> 01:01:51,240 Speaker 1: after this war, a lot of these far right guys 1236 01:01:51,360 --> 01:01:55,520 Speaker 1: might change their minds, because now we see what totalitarianism 1237 01:01:55,640 --> 01:01:58,080 Speaker 1: brings: death, you know what I mean. Whether 1238 01:01:58,160 --> 01:02:00,760 Speaker 1: that's wishful thinking or not, I'm not sure, but 1239 01:02:00,960 --> 01:02:03,560 Speaker 1: you know what I'm saying. Because obviously, like... I mean, 1240 01:02:03,600 --> 01:02:06,520 Speaker 1: I would obviously, I would hope that that's 1241 01:02:06,560 --> 01:02:08,880 Speaker 1: what happens, but I tend to doubt it. But yeah, 1242 01:02:09,000 --> 01:02:11,040 Speaker 1: the thing that scares me, of course, is there's just 1243 01:02:11,240 --> 01:02:13,720 Speaker 1: as, at least as much a chance that, you know, 1244 01:02:13,760 --> 01:02:17,080 Speaker 1: they get more powerful. Um, which is again part of 1245 01:02:17,160 --> 01:02:19,880 Speaker 1: why it's important for folks like Hoods Hoods Klan to be 1246 01:02:20,080 --> 01:02:24,520 Speaker 1: organizing and getting weapons and being prepared, because, like, yes, 1247 01:02:25,200 --> 01:02:28,000 Speaker 1: if that conflict comes after the war, you know, 1248 01:02:28,320 --> 01:02:31,439 Speaker 1: you don't want the fascist militias to be the best 1249 01:02:32,360 --> 01:02:35,880 Speaker 1: armed and most organized. Yeah, and this is the issue, 1250 01:02:35,880 --> 01:02:38,080 Speaker 1: you know. But I think for them, it's like, okay, 1251 01:02:38,160 --> 01:02:39,880 Speaker 1: we'll deal with that when it comes, you know. Like, 1252 01:02:40,280 --> 01:02:43,560 Speaker 1: I think they're very aware that this war is going nowhere, 1253 01:02:43,760 --> 01:02:46,400 Speaker 1: you know. And, you know, they say in our doc, oh, 1254 01:02:46,480 --> 01:02:48,480 Speaker 1: we just want to go and kill Russian pigs.
I mean, 1255 01:02:48,640 --> 01:02:51,120 Speaker 1: you know what they mean. I mean, some people 1256 01:02:51,160 --> 01:02:52,800 Speaker 1: are like, ooh, that's really bad. I was like, mate, 1257 01:02:52,960 --> 01:02:56,320 Speaker 1: you're talking about... It's a war, right? 1258 01:02:56,760 --> 01:03:00,080 Speaker 1: They were guarding the areas where the massacres happened, you know. 1259 01:03:00,520 --> 01:03:03,360 Speaker 1: Hoods Hoods Klan got shelled trying to get civilians out 1260 01:03:03,400 --> 01:03:08,360 Speaker 1: of Borodianka when the Russians were shelling. You're talking women and children. Yeah, 1261 01:03:08,360 --> 01:03:11,720 Speaker 1: I'm surprised they said it that mildly, you know. Like, yeah, 1262 01:03:11,800 --> 01:03:13,320 Speaker 1: like, you know, it's a war, man, it is what it is. 1263 01:03:13,600 --> 01:03:17,680 Speaker 1: And also, they're football hooligans, they're wild people, you know. Yeah, 1264 01:03:17,760 --> 01:03:20,680 Speaker 1: it's, um... I mean, that is kind of interesting, though. 1265 01:03:20,720 --> 01:03:24,720 Speaker 1: I'm curious, um, do you have, kind of, ah, an 1266 01:03:24,760 --> 01:03:27,480 Speaker 1: assessment of what kind of numbers they're looking at, like, 1267 01:03:27,560 --> 01:03:29,800 Speaker 1: how many folks they've actually got in the field on 1268 01:03:29,840 --> 01:03:33,680 Speaker 1: a regular basis? Yeah, so the Resistance Committee is, I 1269 01:03:33,720 --> 01:03:37,080 Speaker 1: don't know, like, fifty to a hundred right now. Hoods Hoods Klan, 1270 01:03:37,200 --> 01:03:40,080 Speaker 1: they have, like, maybe twenty to thirty guys 1271 01:03:40,240 --> 01:03:42,760 Speaker 1: in that group. But then they also have other people 1272 01:03:43,240 --> 01:03:46,440 Speaker 1: that join different units in the East.
So they 1273 01:03:46,480 --> 01:03:48,600 Speaker 1: were, like, already military, so they didn't have to go, 1274 01:03:48,800 --> 01:03:50,920 Speaker 1: you know, form a militia, they just joined the military. So 1275 01:03:51,000 --> 01:03:55,760 Speaker 1: there's, like, quite a strong Hoods Hoods Klan, um, mortar group. Um, 1276 01:03:55,920 --> 01:03:57,720 Speaker 1: and I know that some of the footage 1277 01:03:57,760 --> 01:04:00,920 Speaker 1: we included in our documentary, where, um, a Russian tank 1278 01:04:01,000 --> 01:04:03,040 Speaker 1: gets blown up, like, very close quarters, it gets 1279 01:04:03,080 --> 01:04:05,000 Speaker 1: hit with the Javelin, he's like a hundred feet away, 1280 01:04:05,320 --> 01:04:07,040 Speaker 1: that was a Hoods Hoods Klan attack. That was one 1281 01:04:07,040 --> 01:04:09,560 Speaker 1: of their guys doing it, you know. So yeah, so 1282 01:04:09,720 --> 01:04:12,960 Speaker 1: they're all over the place. Um. Unfortunately, due 1283 01:04:13,000 --> 01:04:15,920 Speaker 1: to various bureaucracy within the Territorial Defense, I do think 1284 01:04:16,000 --> 01:04:19,440 Speaker 1: that the Resistance Committee might have to split up to 1285 01:04:19,560 --> 01:04:21,200 Speaker 1: actually get to the front, you know what I mean? 1286 01:04:21,280 --> 01:04:23,240 Speaker 1: Like, they're probably gonna have to join other units, 1287 01:04:23,440 --> 01:04:26,439 Speaker 1: because there's some issues with, you know, various people... 1288 01:04:27,040 --> 01:04:28,720 Speaker 1: they're just not sending them out there. It's not because 1289 01:04:28,720 --> 01:04:30,560 Speaker 1: they're anti-fascist or anything. It's nothing to do with that. 1290 01:04:30,680 --> 01:04:33,520 Speaker 1: It's because, you know, it's corruption, man. There's 1291 01:04:33,560 --> 01:04:36,600 Speaker 1: some corruption emerging.
Some commanders just want to 1292 01:04:36,640 --> 01:04:38,640 Speaker 1: sit around and not actually have to go to 1293 01:04:38,720 --> 01:04:42,160 Speaker 1: the front. Um. Whereas, you know, the fighters themselves are 1294 01:04:42,200 --> 01:04:44,480 Speaker 1: desperate, because they're like, you know, our people are dying. 1295 01:04:44,600 --> 01:04:46,600 Speaker 1: We want to avenge them and we want to stop it. 1296 01:04:46,920 --> 01:04:49,720 Speaker 1: So, you know, right now Hoods Hoods Klan are essentially on their way. 1297 01:04:49,960 --> 01:04:51,720 Speaker 1: They're doing a lot more training right now. They've been 1298 01:04:51,720 --> 01:04:53,560 Speaker 1: given the go ahead. Yeah, they're going to the east. 1299 01:04:54,000 --> 01:04:55,800 Speaker 1: Um, and as far as I know, they're kind 1300 01:04:55,840 --> 01:04:58,720 Speaker 1: of en route, obviously stopping off doing training. I think 1301 01:04:58,760 --> 01:05:00,840 Speaker 1: they're going to be an RPG unit, 1302 01:05:00,920 --> 01:05:02,720 Speaker 1: so they'll be at very close quarters, you know what 1303 01:05:02,760 --> 01:05:04,800 Speaker 1: I'm saying. So it's going to be gnarly for them. 1304 01:05:05,160 --> 01:05:07,000 Speaker 1: These guys, as you stated, kind of started 1305 01:05:07,000 --> 01:05:08,920 Speaker 1: out as a friend group, right? Like, this 1306 01:05:09,040 --> 01:05:13,200 Speaker 1: isn't a political party. These guys didn't 1307 01:05:13,200 --> 01:05:15,640 Speaker 1: start as ideological comrades. They were, like, buddies who were 1308 01:05:15,720 --> 01:05:18,120 Speaker 1: into the same team and into the same kind of 1309 01:05:18,160 --> 01:05:21,200 Speaker 1: combat sports.
And now they're, you know... 1310 01:05:21,320 --> 01:05:23,040 Speaker 1: some of them are gonna be dying in front 1311 01:05:23,040 --> 01:05:27,240 Speaker 1: of the others, which is, like, a difficulty, I think. Um, 1312 01:05:27,640 --> 01:05:30,680 Speaker 1: I'm interested in kind of how are they actually 1313 01:05:30,880 --> 01:05:33,440 Speaker 1: organizing, sort of, in the field? Or is it just, 1314 01:05:33,680 --> 01:05:36,040 Speaker 1: as I've heard of a number of, like, militia units, 1315 01:05:36,120 --> 01:05:38,160 Speaker 1: kind of along the lines of the Ukrainian military? Or 1316 01:05:38,200 --> 01:05:41,680 Speaker 1: have they kind of adopted different organizational styles 1317 01:05:41,760 --> 01:05:44,040 Speaker 1: in their hoods, hoods units, as befits, sort of, 1318 01:05:44,800 --> 01:05:48,960 Speaker 1: their unique kind of origins? Yeah, that's a good question. Well, 1319 01:05:49,000 --> 01:05:52,520 Speaker 1: I mean, it's kind of tricky, because essentially, you know, 1320 01:05:52,680 --> 01:05:54,640 Speaker 1: I guess they formed as a militia, you know, 1321 01:05:54,720 --> 01:05:57,200 Speaker 1: as soon as the war started, they got guns. But then, 1322 01:05:57,360 --> 01:05:59,560 Speaker 1: you know, Anton was like, we have everything from the 1323 01:05:59,600 --> 01:06:02,560 Speaker 1: anti-fascist networks, everything we need apart from the weapons. 1324 01:06:02,880 --> 01:06:05,120 Speaker 1: So they had to sign up as a part of 1325 01:06:05,800 --> 01:06:08,800 Speaker 1: the Territorial Defense to get the weapons. So they're under the Territorial 1326 01:06:08,840 --> 01:06:11,400 Speaker 1: Defense, as are, you know, hundreds of different people 1327 01:06:11,480 --> 01:06:14,800 Speaker 1: that did the same thing. Um. So luckily for Hoods Hoods Klan, 1328 01:06:14,880 --> 01:06:17,400 Speaker 1: I think because they're such close friends... I mean, you
I mean, you 1329 01:06:17,440 --> 01:06:19,120 Speaker 1: can see it in the doc, you know. Even 1330 01:06:19,320 --> 01:06:21,360 Speaker 1: the subtitle of our doc is like, you know, this 1331 01:06:21,480 --> 01:06:25,080 Speaker 1: is a film about friendship, violence and resistance, because 1332 01:06:25,120 --> 01:06:27,160 Speaker 1: that's essentially what it is, you know. So they're very 1333 01:06:27,200 --> 01:06:30,120 Speaker 1: close friends. So commanders have recognized that. Yeah, this 1334 01:06:30,280 --> 01:06:32,520 Speaker 1: is a group that is disciplined as well. A lot 1335 01:06:32,560 --> 01:06:35,240 Speaker 1: of them are straight edge, which is actually a discipline 1336 01:06:35,280 --> 01:06:37,600 Speaker 1: in itself, you know what I mean. So they're very 1337 01:06:37,640 --> 01:06:40,120 Speaker 1: well disciplined. They're very good. You know, the training is 1338 01:06:40,240 --> 01:06:42,320 Speaker 1: very good. They know what they're doing. But they have 1339 01:06:42,480 --> 01:06:46,360 Speaker 1: like a commander that is from the Territorial Defense, if 1340 01:06:46,400 --> 01:06:48,240 Speaker 1: you like. He's not Hoods Hoods Klan. He's 1341 01:06:48,440 --> 01:06:51,640 Speaker 1: been assigned as a commander sort of thing. Um. So 1342 01:06:52,080 --> 01:06:54,600 Speaker 1: they're being taught just the same kind of tactics as 1343 01:06:54,600 --> 01:06:57,400 Speaker 1: anybody else. As they're an RPG unit, I think, you know, 1344 01:06:57,520 --> 01:06:59,400 Speaker 1: there will be a lot of close quarters stuff, but 1345 01:06:59,440 --> 01:07:02,480 Speaker 1: they're just doing a lot of arms training. 1346 01:07:02,880 --> 01:07:05,240 Speaker 1: There's, you know, Constantine in the doc. One of them 1347 01:07:05,360 --> 01:07:07,240 Speaker 1: is like, I just want to get better, faster.
They're 1348 01:07:07,280 --> 01:07:10,720 Speaker 1: just, they're very focused on being, like, not 1349 01:07:10,800 --> 01:07:12,640 Speaker 1: an elite unit, but they want to get it perfect. 1350 01:07:12,640 --> 01:07:14,360 Speaker 1: They're not just like, yeah, let's go and kill. They're like, 1351 01:07:14,440 --> 01:07:16,040 Speaker 1: no, we have to be good, you know what I mean. 1352 01:07:16,120 --> 01:07:17,840 Speaker 1: We have to go in there and have the same 1353 01:07:17,920 --> 01:07:20,920 Speaker 1: discipline and organization as we had in the streets when 1354 01:07:20,960 --> 01:07:23,120 Speaker 1: we were fighting. There was a reason that they were 1355 01:07:23,200 --> 01:07:25,800 Speaker 1: renowned as being a good street-fighting football hooligan 1356 01:07:25,880 --> 01:07:30,520 Speaker 1: firm despite being completely outnumbered. It's because they had good discipline. Um, 1357 01:07:30,840 --> 01:07:33,600 Speaker 1: they're tough, they trained, and also, because they're good friends, 1358 01:07:33,720 --> 01:07:35,800 Speaker 1: they will have each other's back. It's not a hobby 1359 01:07:35,840 --> 01:07:38,920 Speaker 1: for them, it's a lifestyle, you know. Um, yeah, 1360 01:07:38,920 --> 01:07:40,600 Speaker 1: just so much went into it, you know. Hoods Hoods Klan 1361 01:07:40,680 --> 01:07:44,640 Speaker 1: started off the back of, um, anti-fascist punk 1362 01:07:44,680 --> 01:07:48,200 Speaker 1: hardcore in Ukraine, and then that itself was a scene, 1363 01:07:48,240 --> 01:07:50,320 Speaker 1: and then the football hooliganism, and then, yeah, now 1364 01:07:50,440 --> 01:07:52,680 Speaker 1: it's crazy, really. It's honestly one of the 1365 01:07:52,760 --> 01:07:56,640 Speaker 1: most fascinating stories I've covered. Now they're a fucking frontline unit, 1366 01:07:56,680 --> 01:07:58,720 Speaker 1: you know.
It's sad, and I hope to god 1367 01:07:58,840 --> 01:08:01,120 Speaker 1: nothing happens to any of them. Probably the nicest guys 1368 01:08:01,160 --> 01:08:04,760 Speaker 1: I've filmed with, you know. Um, and yeah, it's 1369 01:08:04,880 --> 01:08:06,920 Speaker 1: a good question. It's very tricky 1370 01:08:07,000 --> 01:08:08,840 Speaker 1: to know how it's going to happen for them once 1371 01:08:08,920 --> 01:08:11,280 Speaker 1: they're on the front. I mean, Anton, the main guy, 1372 01:08:11,360 --> 01:08:15,040 Speaker 1: he has served before, joined the militia to fight in 1373 01:08:15,080 --> 01:08:17,439 Speaker 1: the Donbas. So they do have some experience, you know. 1374 01:08:18,200 --> 01:08:20,880 Speaker 1: And it does seem like, um, kind of the natural 1375 01:08:21,200 --> 01:08:24,479 Speaker 1: skills that they've been developing fit, because there's, broadly 1376 01:08:24,520 --> 01:08:27,560 Speaker 1: speaking, from my understanding, kind of two main types of 1377 01:08:27,960 --> 01:08:31,960 Speaker 1: combat going on in Ukraine. There's what you're seeing 1378 01:08:32,000 --> 01:08:34,560 Speaker 1: a lot in, you know, the Donbas, which is 1379 01:08:35,040 --> 01:08:38,240 Speaker 1: this kind of like meat grinder, like frontline shit, and 1380 01:08:38,280 --> 01:08:40,120 Speaker 1: then there's sort of the seek and destroy kind of 1381 01:08:40,160 --> 01:08:42,960 Speaker 1: stuff where you've got people sort of hunting convoys and 1382 01:08:43,120 --> 01:08:45,840 Speaker 1: doing ambushes. And it does strike me like these 1383 01:08:45,880 --> 01:08:48,680 Speaker 1: guys' talents would lend themselves more to the ambushes. 1384 01:08:48,720 --> 01:08:51,200 Speaker 1: I mean, there's not really any talent that helps you 1385 01:08:51,320 --> 01:08:54,240 Speaker 1: in the sitting-in-a-trench meat grinder 1386 01:08:54,320 --> 01:08:56,920 Speaker 1: kind of shit.
But obviously you don't have that choice 1387 01:08:57,000 --> 01:09:02,519 Speaker 1: when you're serving under, you know, the national military. Yeah, yeah, 1388 01:09:02,600 --> 01:09:04,760 Speaker 1: I think you're right. Like, they would be much 1389 01:09:04,960 --> 01:09:09,920 Speaker 1: better, um, placed as, like, you know, I guess like 1390 01:09:10,000 --> 01:09:13,080 Speaker 1: a kind of shoot-and-scoot kind of unit. Exactly. Yeah, 1391 01:09:13,200 --> 01:09:14,800 Speaker 1: and I think they will be, because, you know, they 1392 01:09:14,920 --> 01:09:17,880 Speaker 1: train with RPGs. Um, some of their fighters already have 1393 01:09:18,000 --> 01:09:22,479 Speaker 1: Javelins on the front, um, and LAWs. So yeah, 1394 01:09:22,520 --> 01:09:24,280 Speaker 1: I think that's where it would be bad, if they just 1395 01:09:24,360 --> 01:09:26,120 Speaker 1: put them in some kind of meat grinder position, 1396 01:09:26,200 --> 01:09:28,479 Speaker 1: which very much could happen, you know. I mean, it's 1397 01:09:28,520 --> 01:09:31,160 Speaker 1: bad for anybody, let's be realistic. It's getting very bad 1398 01:09:31,200 --> 01:09:35,479 Speaker 1: in the Donbas right now.
Yeah, yeah, yeah. Um, 1399 01:09:35,760 --> 01:09:37,479 Speaker 1: I mean, and that's one of the... there's been posts 1400 01:09:37,520 --> 01:09:41,080 Speaker 1: and stuff from people talking about, like, um, you know, 1401 01:09:41,320 --> 01:09:43,880 Speaker 1: a lot of the fuck ups that are happening, 1402 01:09:44,160 --> 01:09:46,800 Speaker 1: because, you know, Ukraine started this war with everyone being 1403 01:09:46,880 --> 01:09:50,960 Speaker 1: kind of overwhelmed by the competence of their military effort, 1404 01:09:51,000 --> 01:09:52,920 Speaker 1: and now that things in the Donbas have 1405 01:09:53,040 --> 01:09:56,519 Speaker 1: turned into this kind of ugly slog. Um, there's been 1406 01:09:56,640 --> 01:09:59,439 Speaker 1: some, you know, units getting hit 1407 01:09:59,479 --> 01:10:01,240 Speaker 1: by their own artillery fire, the kind of messy 1408 01:10:01,320 --> 01:10:04,080 Speaker 1: stuff that happens when you have a fight like this, 1409 01:10:04,360 --> 01:10:06,960 Speaker 1: right? Like, it is unavoidable, um, when you 1410 01:10:07,080 --> 01:10:10,719 Speaker 1: have a situation like has developed in the Donbas. 1411 01:10:10,800 --> 01:10:15,400 Speaker 1: But that doesn't make it any less unpleasant to endure 1412 01:10:15,479 --> 01:10:18,240 Speaker 1: as an actual soldier. Like, it's just, it's one of 1413 01:10:18,280 --> 01:10:21,280 Speaker 1: those... I mean, there's only so much that, like, competence 1414 01:10:21,320 --> 01:10:23,759 Speaker 1: and training can do if you wind up getting squeezed 1415 01:10:23,760 --> 01:10:26,960 Speaker 1: into that kind of position.
Yeah. And this 1416 01:10:27,160 --> 01:10:30,360 Speaker 1: is why a lot of people, even Ukrainians... I 1417 01:10:30,360 --> 01:10:33,760 Speaker 1: actually had a conversation with a Ukrainian friend yesterday that was saying, like, 1418 01:10:34,040 --> 01:10:36,200 Speaker 1: you know, the situation is so bad in the East. 1419 01:10:36,960 --> 01:10:40,280 Speaker 1: We really need to be honest about this, because, you know, 1420 01:10:40,360 --> 01:10:42,320 Speaker 1: if people think it's going better than it is, okay, 1421 01:10:42,360 --> 01:10:43,960 Speaker 1: it's good for morale, but it's not good for the 1422 01:10:44,000 --> 01:10:45,960 Speaker 1: guys on the ground, like, they're not going to get 1423 01:10:46,000 --> 01:10:48,800 Speaker 1: what they need. And the reality is that it's getting 1424 01:10:48,840 --> 01:10:51,680 Speaker 1: really bad, and it's not anything to do with incompetence 1425 01:10:51,760 --> 01:10:54,360 Speaker 1: from the fighters. It's just the war, the level of fighting is 1426 01:10:54,439 --> 01:10:57,080 Speaker 1: so hot, and, you know, Russia has learned from 1427 01:10:57,120 --> 01:11:00,160 Speaker 1: its mistakes, unfortunately, from the start, where they got 1428 01:11:00,280 --> 01:11:02,519 Speaker 1: fucked up. But now, you know, things are getting a 1429 01:11:02,560 --> 01:11:06,360 Speaker 1: little bit hairy. Um, Ukrainians are doing, like, an incredible effort. 1430 01:11:06,880 --> 01:11:10,600 Speaker 1: But again, it's like, yeah, you're talking about decades and 1431 01:11:10,760 --> 01:11:15,759 Speaker 1: decades of armor and, you know, um, weapons that Russia 1432 01:11:15,840 --> 01:11:17,920 Speaker 1: has, and it's all very well us being like, oh, 1433 01:11:17,960 --> 01:11:21,200 Speaker 1: their armor is blah blah. I doubt it, you know, 1434 01:11:21,280 --> 01:11:23,920 Speaker 1: I very much doubt that.
Um, it doesn't look like that, 1435 01:11:24,120 --> 01:11:26,160 Speaker 1: certainly from the people I'm talking to in the East. 1436 01:11:26,200 --> 01:11:30,040 Speaker 1: You know, so I think, again, when, you know, 1437 01:11:30,240 --> 01:11:32,599 Speaker 1: Ukrainians are like, well, we do need more weapons, it's 1438 01:11:32,600 --> 01:11:34,519 Speaker 1: because they need more weapons, you know what I mean, 1439 01:11:34,600 --> 01:11:36,920 Speaker 1: they really do. Well, this is like one of the... 1440 01:11:38,000 --> 01:11:40,759 Speaker 1: This is, uh, one of the things that's difficult 1441 01:11:40,760 --> 01:11:43,599 Speaker 1: to, I think, get across to people, um, because there 1442 01:11:43,720 --> 01:11:46,280 Speaker 1: is such a, you know, we are dealing with 1443 01:11:46,400 --> 01:11:50,240 Speaker 1: the legacy of decades of shipping weapons places, um, and 1444 01:11:50,400 --> 01:11:52,519 Speaker 1: that not helping the conflict in a lot of 1445 01:11:52,800 --> 01:11:55,759 Speaker 1: ways, um, and decades of stories, 1446 01:11:55,840 --> 01:11:57,519 Speaker 1: like, you know, all the weapons that got sent to 1447 01:11:57,600 --> 01:11:59,800 Speaker 1: the Iraqi government and then wound up in ISIS's 1448 01:12:00,040 --> 01:12:02,920 Speaker 1: armory and shit, um, which creates kind of an easy 1449 01:12:03,040 --> 01:12:05,439 Speaker 1: narrative for folks who are like, well, you know, you're 1450 01:12:05,479 --> 01:12:07,800 Speaker 1: just trying to prolong the conflict and making it worse 1451 01:12:07,880 --> 01:12:10,880 Speaker 1: by shipping in weapons. But the reality is one side 1452 01:12:10,920 --> 01:12:14,160 Speaker 1: of this war has a substantial percentage of all of 1453 01:12:14,240 --> 01:12:17,439 Speaker 1: the artillery that exists on the planet. Yeah. And the 1454 01:12:17,520 --> 01:12:21,160 Speaker 1: other side does not. Yeah.
And I do understand the 1455 01:12:21,280 --> 01:12:24,439 Speaker 1: argument, though. Like, I totally get it. 1456 01:12:25,080 --> 01:12:27,080 Speaker 1: I lived through the early two thousands as well, I 1457 01:12:27,200 --> 01:12:30,640 Speaker 1: understand it. It's like, war isn't a template. It's not 1458 01:12:30,720 --> 01:12:33,719 Speaker 1: like, this happened there, so this will happen there, or whatever. 1459 01:12:33,800 --> 01:12:35,800 Speaker 1: And it's like, you have to weigh up, no matter 1460 01:12:35,920 --> 01:12:37,960 Speaker 1: what, bad is going to come from this. Do you 1461 01:12:38,040 --> 01:12:41,240 Speaker 1: want the bad to be, okay, there's a problem with 1462 01:12:41,600 --> 01:12:44,720 Speaker 1: arms in Eastern Ukraine, in Eastern Europe, which there 1463 01:12:44,720 --> 01:12:47,160 Speaker 1: already is, and it gets worse? Or do you want 1464 01:12:47,479 --> 01:12:50,280 Speaker 1: the bad problem to be, Russia has taken over the 1465 01:12:50,320 --> 01:12:54,000 Speaker 1: whole country, massacred everybody, and is undoubtedly going to 1466 01:12:54,080 --> 01:12:57,040 Speaker 1: try and move into other countries? Do you want AIDS 1467 01:12:57,400 --> 01:12:59,600 Speaker 1: or do you want cancer? I don't know, you know 1468 01:12:59,680 --> 01:13:01,920 Speaker 1: what I mean? Indeed. Do you 1469 01:13:02,000 --> 01:13:04,599 Speaker 1: want the lesson from this to be that if you're 1470 01:13:04,640 --> 01:13:07,360 Speaker 1: just willing to burn a couple of hundred thousand human 1471 01:13:07,439 --> 01:13:11,639 Speaker 1: lives, as a state like Russia or any other state, 1472 01:13:11,800 --> 01:13:15,920 Speaker 1: you can easily gain access to, you know, a pile 1473 01:13:16,000 --> 01:13:18,519 Speaker 1: of wealth right in the shape of a country? Um, 1474 01:13:18,840 --> 01:13:21,880 Speaker 1: which isn't a positive.
It's not like a good lesson 1475 01:13:22,040 --> 01:13:23,960 Speaker 1: for anyone to take out of this. But, like, 1476 01:13:24,600 --> 01:13:28,639 Speaker 1: if that's the lesson, right? Yeah, yeah, no, that's the reality. 1477 01:13:28,760 --> 01:13:31,479 Speaker 1: Like, it's all very nice having a fifty-tweet Twitter 1478 01:13:31,560 --> 01:13:34,000 Speaker 1: thread about why this, that and the other should or shouldn't happen, 1479 01:13:34,720 --> 01:13:37,240 Speaker 1: but that's just completely removed from real life. I mean, 1480 01:13:37,360 --> 01:13:39,160 Speaker 1: real life is, it's going to be very bad, 1481 01:13:39,320 --> 01:13:41,920 Speaker 1: very nasty, no matter what happens, and you just have 1482 01:13:42,040 --> 01:13:44,320 Speaker 1: to weigh it. Oh, I don't like NATO, oh, I don't 1483 01:13:44,360 --> 01:13:46,599 Speaker 1: like this. Yeah, I don't either, but I care about 1484 01:13:46,640 --> 01:13:49,000 Speaker 1: people that die for no reason, you know. Like, I 1485 01:13:49,080 --> 01:13:51,720 Speaker 1: think that's the real issue. Um, I think people need 1486 01:13:51,800 --> 01:13:53,840 Speaker 1: to stand with the people, you know, and if that 1487 01:13:54,000 --> 01:13:57,640 Speaker 1: means, okay, use the tools that you have, okay. Like, 1488 01:13:57,920 --> 01:13:59,519 Speaker 1: oh, I don't like NATO. Yeah, but they're going 1489 01:13:59,560 --> 01:14:01,640 Speaker 1: to give them weapons. Do you think that Ukrainians like 1490 01:14:01,800 --> 01:14:04,800 Speaker 1: having Russian firearms? Probably not. But they also don't give 1491 01:14:04,800 --> 01:14:07,280 Speaker 1: a shit, because they shoot. It's that simple, you know. 1492 01:14:07,760 --> 01:14:10,960 Speaker 1: Kind of coming back to the subject of your documentary.
Um, 1493 01:14:11,360 --> 01:14:13,400 Speaker 1: if weapons are going to be going over there, and 1494 01:14:13,520 --> 01:14:16,439 Speaker 1: by god they are, um, I would hope that 1495 01:14:16,680 --> 01:14:18,680 Speaker 1: as many of them as possible are going into the 1496 01:14:18,760 --> 01:14:23,360 Speaker 1: hands of people like Hoods Hoods Klan, right? Yeah, 1497 01:14:24,040 --> 01:14:28,080 Speaker 1: um, yeah, I mean, that is a... yeah, 1498 01:14:28,200 --> 01:14:31,200 Speaker 1: a lot. There's definitely, and this isn't from them telling me, 1499 01:14:31,280 --> 01:14:34,240 Speaker 1: but it's just from research I've done, there's definitely a 1500 01:14:34,320 --> 01:14:37,720 Speaker 1: discrepancy in terms of which groups get what weapons. And 1501 01:14:38,360 --> 01:14:40,960 Speaker 1: it's not based on ideology, but it's definitely based on 1502 01:14:41,560 --> 01:14:44,000 Speaker 1: some serious bureaucracy that needs to be sorted out. You know, 1503 01:14:44,200 --> 01:14:46,720 Speaker 1: I have some Western volunteers that I know that 1504 01:14:46,800 --> 01:14:49,240 Speaker 1: are on the front right now, and they're saying, like, 1505 01:14:49,439 --> 01:14:51,680 Speaker 1: for some reason, you know, one unit that is not 1506 01:14:51,800 --> 01:14:55,960 Speaker 1: an RPG unit, for example, will have more rockets than 1507 01:14:56,040 --> 01:14:58,560 Speaker 1: the RPG unit, you know. And it's like, what? Like, 1508 01:14:58,800 --> 01:15:00,360 Speaker 1: and that's not because they've used more. It's, 1509 01:15:00,439 --> 01:15:03,799 Speaker 1: it's supply lines again. It's not even corruption, 1510 01:15:03,840 --> 01:15:05,920 Speaker 1: often, it's just supply lines are wrecked or whatever. But 1511 01:15:06,320 --> 01:15:07,880 Speaker 1: it has to be addressed, has to be looked at. 1512 01:15:07,880 --> 01:15:09,880 Speaker 1: I mean, I'm no tactician.
I don't know anything about 1513 01:15:09,920 --> 01:15:12,160 Speaker 1: that side of things. I'm just basing it on what, you know, 1514 01:15:12,240 --> 01:15:14,000 Speaker 1: people are telling me, because, you know, I like to 1515 01:15:14,040 --> 01:15:16,840 Speaker 1: talk to them and hear what's happening. Yeah. Um, I 1516 01:15:16,880 --> 01:15:19,680 Speaker 1: think we should move into, you know, when I 1517 01:15:19,720 --> 01:15:23,080 Speaker 1: pull up your documentary on YouTube, which is, again, 1518 01:15:23,200 --> 01:15:26,599 Speaker 1: for folks at home, titled Ukraine's Anti-Fascist Football Hooligans 1519 01:15:26,680 --> 01:15:29,120 Speaker 1: Fighting the Russian Invasion, the first thing that I see 1520 01:15:29,240 --> 01:15:36,280 Speaker 1: is "this video may be inappropriate for some users." Um, yeah, well, 1521 01:15:36,320 --> 01:15:38,880 Speaker 1: and we've talked a lot on our various shows 1522 01:15:38,920 --> 01:15:41,519 Speaker 1: on this network about all of the fascist propaganda that 1523 01:15:41,600 --> 01:15:43,720 Speaker 1: you can find, and not even find, on YouTube, that 1524 01:15:43,760 --> 01:15:45,559 Speaker 1: will be like spoon-fed to you if you wind 1525 01:15:45,640 --> 01:15:49,439 Speaker 1: up watching a video game review or something. Um, yeah, 1526 01:15:49,640 --> 01:15:51,679 Speaker 1: this is something that you've been dealing with on Popular Front. 1527 01:15:51,720 --> 01:15:53,640 Speaker 1: Somebody seems to have like an axe to grind with 1528 01:15:53,760 --> 01:15:56,000 Speaker 1: you guys. I don't know. Maybe it's just the algorithm, 1529 01:15:56,160 --> 01:15:59,800 Speaker 1: but... I'll be honest, I felt like it was down 1530 01:16:00,000 --> 01:16:03,120 Speaker 1: to the algorithm until this recent one. Right.
So yeah, 1531 01:16:03,160 --> 01:16:04,640 Speaker 1: like you said, if people want to find it, I mean, 1532 01:16:04,680 --> 01:16:06,880 Speaker 1: the doc's called Frontline Hooligans, but yeah, 1533 01:16:07,360 --> 01:16:10,479 Speaker 1: for SEO, um, yeah, it's Ukraine's Anti- 1534 01:16:10,520 --> 01:16:15,320 Speaker 1: Fascists Fighting Russian Invasion. Um. But yeah, the second it 1535 01:16:15,439 --> 01:16:17,680 Speaker 1: was uploaded, it got age restricted. Now, that to me 1536 01:16:17,880 --> 01:16:19,880 Speaker 1: is very odd. I don't get why. There's no gore 1537 01:16:19,960 --> 01:16:22,880 Speaker 1: in it. Um. Okay, yeah, there's violence, but there's a 1538 01:16:22,920 --> 01:16:26,120 Speaker 1: guideline where you can show violence if it's relevant 1539 01:16:26,160 --> 01:16:28,800 Speaker 1: to reporting, which obviously it is, because it's an 1540 01:16:28,840 --> 01:16:31,880 Speaker 1: anti-fascist football hooligan firm fighting Russia. So of course 1541 01:16:31,920 --> 01:16:33,920 Speaker 1: we're going to show what that looks like. But there's 1542 01:16:33,960 --> 01:16:38,120 Speaker 1: no gore. Um, there's just, it's just lads 1543 01:16:38,200 --> 01:16:41,800 Speaker 1: hanging out, talking about their lives 1544 01:16:41,880 --> 01:16:44,040 Speaker 1: now they've been tipped upside down, and how they really 1545 01:16:44,160 --> 01:16:46,840 Speaker 1: dislike the far right. Now, to give you an idea 1546 01:16:46,840 --> 01:16:49,240 Speaker 1: of how messed up this is, um, there's a real, 1547 01:16:49,439 --> 01:16:53,400 Speaker 1: a real parasite YouTuber.
He's called Danny Mullen, and he 1548 01:16:53,600 --> 01:16:56,720 Speaker 1: has a video on YouTube where him and his 1549 01:16:56,880 --> 01:17:00,559 Speaker 1: friend, both of them scoundrels, go to the border, 1550 01:17:00,840 --> 01:17:05,679 Speaker 1: and the whole video is trying to get with, quote, 1551 01:17:05,800 --> 01:17:09,280 Speaker 1: like, hot Ukrainian refugees. Now, it's the most disgusting thing 1552 01:17:09,320 --> 01:17:11,439 Speaker 1: you've ever seen. They're preying on young girls, some of 1553 01:17:11,520 --> 01:17:15,040 Speaker 1: them are very clearly underage. Um. And that is monetized. 1554 01:17:15,160 --> 01:17:18,640 Speaker 1: That is monetized, and it is even on the algorithm. 1555 01:17:18,720 --> 01:17:21,360 Speaker 1: I found it because I was watching Ukraine war stuff 1556 01:17:21,400 --> 01:17:24,640 Speaker 1: and it was put onto my recommended. Now, these are 1557 01:17:24,640 --> 01:17:27,280 Speaker 1: the biggest parasites you've ever seen in your life. Um. 1558 01:17:27,360 --> 01:17:29,920 Speaker 1: And they have hundreds and hundreds of thousands of subscribers, 1559 01:17:29,960 --> 01:17:33,320 Speaker 1: and they're making money from content like that, that is 1560 01:17:33,439 --> 01:17:36,320 Speaker 1: not age restricted. There is no censorship thing, there is 1561 01:17:36,400 --> 01:17:39,480 Speaker 1: no message saying this might be offensive. But a documentary 1562 01:17:39,479 --> 01:17:44,120 Speaker 1: which is honest journalism covering anti-fascists fighting one of 1563 01:17:44,160 --> 01:17:47,920 Speaker 1: the worst invasions we've seen in Europe is suddenly deemed 1564 01:17:48,640 --> 01:17:51,320 Speaker 1: inappropriate and is age restricted on YouTube. Tell me what's 1565 01:17:51,320 --> 01:17:53,560 Speaker 1: going on there. That doesn't seem right to me.
So 1566 01:17:53,760 --> 01:17:55,840 Speaker 1: basically YouTube, by doing that, is saying, we're actually 1567 01:17:55,920 --> 01:17:59,920 Speaker 1: happy to make money off of people that exploit 1568 01:18:00,080 --> 01:18:03,439 Speaker 1: underage Ukrainian refugees, but we're not happy for people showing 1569 01:18:03,439 --> 01:18:05,600 Speaker 1: the world a different side to the war. That to 1570 01:18:05,720 --> 01:18:08,080 Speaker 1: me is madness. Like, it doesn't make any sense to me, 1571 01:18:08,200 --> 01:18:10,559 Speaker 1: you know. And it's nothing but soft censorship. Some people 1572 01:18:10,560 --> 01:18:12,720 Speaker 1: are telling me it's not censorship. Of course 1573 01:18:12,760 --> 01:18:15,479 Speaker 1: it is censorship. This is the way the world works now. Yeah, 1574 01:18:15,560 --> 01:18:17,840 Speaker 1: because, I mean, a huge chunk of the success or 1575 01:18:17,880 --> 01:18:20,840 Speaker 1: the visibility of anything that you're putting on YouTube is 1576 01:18:20,920 --> 01:18:23,479 Speaker 1: whether or not the algorithm is going to, like, suggest 1577 01:18:23,560 --> 01:18:26,120 Speaker 1: it to people, even people who have watched you or 1578 01:18:26,160 --> 01:18:28,519 Speaker 1: other stuff like that. Not even talking about, like, suggesting it 1579 01:18:28,560 --> 01:18:30,560 Speaker 1: to somebody who's never heard of Popular Front, but, like, 1580 01:18:31,000 --> 01:18:33,439 Speaker 1: people who have watched multiple things that you've done and 1581 01:18:33,560 --> 01:18:36,719 Speaker 1: are just on YouTube.
The thing that would make sense 1582 01:18:36,920 --> 01:18:38,400 Speaker 1: is, for when you put out a new thing, 1583 01:18:38,439 --> 01:18:39,880 Speaker 1: for YouTube to be like, hey, we know 1584 01:18:39,960 --> 01:18:41,800 Speaker 1: you've watched their shit, check out this. But that's not 1585 01:18:41,880 --> 01:18:43,760 Speaker 1: going to happen for a lot of folks because of 1586 01:18:43,800 --> 01:18:47,320 Speaker 1: this kind of thing, which is, yeah, fucked up. Yeah, 1587 01:18:47,680 --> 01:18:50,240 Speaker 1: yeah, right. And it's like, it's not just me, 1588 01:18:50,360 --> 01:18:52,639 Speaker 1: like, I mean, it's other people it's happening to as well. 1589 01:18:53,520 --> 01:18:55,559 Speaker 1: And basically, what it is is, if we wanted 1590 01:18:55,600 --> 01:18:58,400 Speaker 1: the doc to somehow be allowed to be monetized, or not 1591 01:18:58,439 --> 01:19:00,439 Speaker 1: even monetized, I don't even want it monetized, the whole 1592 01:19:00,479 --> 01:19:04,160 Speaker 1: channel is demonetized, I just want it to not be restricted, 1593 01:19:04,280 --> 01:19:07,040 Speaker 1: because that is an algorithm torpedo. And, you know, it's 1594 01:19:07,040 --> 01:19:09,960 Speaker 1: like, I would have to re-cut the whole documentary, essentially 1595 01:19:10,040 --> 01:19:13,040 Speaker 1: censor myself, my own journalism, make, excuse me, 1596 01:19:13,360 --> 01:19:16,400 Speaker 1: make the integrity of the doc weaker, just for 1597 01:19:16,560 --> 01:19:19,040 Speaker 1: people to be able to see it. Like, this is war, this 1598 01:19:19,240 --> 01:19:22,000 Speaker 1: is real life.
It's just really depressing, you know. 1599 01:19:22,200 --> 01:19:25,360 Speaker 1: And this is something, I mean, YouTube, Twitter, Facebook have 1600 01:19:25,439 --> 01:19:27,680 Speaker 1: all been guilty of degrees of this, but, 1601 01:19:28,360 --> 01:19:31,599 Speaker 1: of all the things that are allowed 1602 01:19:31,640 --> 01:19:34,720 Speaker 1: to spread unchecked on those platforms, they have this consistent, 1603 01:19:35,720 --> 01:19:38,800 Speaker 1: maybe because it's easier to algorithmically go after, but this, 1604 01:19:39,040 --> 01:19:43,439 Speaker 1: this consistent pattern of going after war journalism. Um, and, 1605 01:19:43,600 --> 01:19:46,080 Speaker 1: like, what's happening to your documentary is 1606 01:19:46,160 --> 01:19:48,719 Speaker 1: a piece of this. But, like, the much scarier piece is, 1607 01:19:48,840 --> 01:19:51,880 Speaker 1: a tremendous amount of documentation of war crimes in places 1608 01:19:51,960 --> 01:19:55,800 Speaker 1: like Syria has been deleted, um, kind of automatically over 1609 01:19:55,880 --> 01:19:58,559 Speaker 1: the years, which means that, like, again, evidence of crimes 1610 01:19:58,600 --> 01:20:01,240 Speaker 1: against humanity has been lost forever because of these kind 1611 01:20:01,320 --> 01:20:05,600 Speaker 1: of, like, purges of war content that, um, I 1612 01:20:05,680 --> 01:20:08,720 Speaker 1: don't think are actually protecting anybody from anything, but are 1613 01:20:08,840 --> 01:20:14,160 Speaker 1: perhaps even making things worse. Yeah, of course. And 1614 01:20:14,240 --> 01:20:19,000 Speaker 1: it allows, um... look, Russian propaganda or whatever, like, 1615 01:20:19,400 --> 01:20:20,960 Speaker 1: people are going to seek that out and they're going 1616 01:20:21,000 --> 01:20:24,200 Speaker 1: to digest it whatever way they can.
So then surely 1617 01:20:24,240 --> 01:20:26,880 Speaker 1: you should say, okay, take the brakes off, you 1618 01:20:26,960 --> 01:20:28,680 Speaker 1: know, if you care. Which, I mean, YouTube is a 1619 01:20:28,760 --> 01:20:31,240 Speaker 1: media platform. You would think that they would say, okay, well, 1620 01:20:31,600 --> 01:20:34,280 Speaker 1: this is kind of our duty to balance it out, 1621 01:20:34,360 --> 01:20:37,080 Speaker 1: to allow all the free information. I'm not even saying, 1622 01:20:37,439 --> 01:20:40,000 Speaker 1: oh yeah, let through all Russian propaganda. I think people have 1623 01:20:40,080 --> 01:20:42,040 Speaker 1: a choice to see whatever they want to see, even 1624 01:20:42,080 --> 01:20:44,720 Speaker 1: if it is completely ridiculous. But the fact that 1625 01:20:44,920 --> 01:20:48,000 Speaker 1: they're censoring the stuff that you would think is 1626 01:20:48,080 --> 01:20:51,880 Speaker 1: okay to see, because, for, I know, you 1627 01:20:51,920 --> 01:20:53,720 Speaker 1: know, our content, that you won't find a lie in 1628 01:20:53,800 --> 01:20:56,439 Speaker 1: that documentary. You know, we're very honest, very frank with 1629 01:20:56,520 --> 01:20:59,840 Speaker 1: the situation. We're not whitewashing fascism in Ukraine. Um, 1630 01:21:00,000 --> 01:21:02,600 Speaker 1: and we're certainly not putting out Russian propaganda. We're just 1631 01:21:02,680 --> 01:21:05,880 Speaker 1: telling an interesting journalistic story. So you would think, as 1632 01:21:05,920 --> 01:21:08,120 Speaker 1: a media platform, that would be, like, right up 1633 01:21:08,120 --> 01:21:10,000 Speaker 1: their street. But it's not really a media platform, it's 1634 01:21:10,000 --> 01:21:12,360 Speaker 1: a money-making platform, and, you know, they 1635 01:21:12,479 --> 01:21:15,920 Speaker 1: just survive off adverts. Yep. Um.
And I think that 1636 01:21:16,360 --> 01:21:19,080 Speaker 1: is kind of where we're gonna leave off 1637 01:21:19,160 --> 01:21:20,559 Speaker 1: for the day, unless you have... do you have anything 1638 01:21:20,600 --> 01:21:23,760 Speaker 1: else you wanted to get into on your documentary, Jake? No, man, 1639 01:21:23,840 --> 01:21:25,360 Speaker 1: I just, I guess the last thing I would say 1640 01:21:25,400 --> 01:21:28,000 Speaker 1: is I want people to kind of know that there 1641 01:21:28,040 --> 01:21:31,760 Speaker 1: are many different factions out there. This isn't, you know... 1642 01:21:31,920 --> 01:21:33,840 Speaker 1: I saw some comment being like, oh, you found the 1643 01:21:33,880 --> 01:21:37,160 Speaker 1: only anti-fascists in Ukraine. It's like, no, 1644 01:21:37,400 --> 01:21:40,200 Speaker 1: there's literally, I've been documenting them, there's thousands, there's 1645 01:21:40,240 --> 01:21:43,519 Speaker 1: so many, you know. And not just, like, oh, anti-fascists, yeah, 1646 01:21:43,560 --> 01:21:46,080 Speaker 1: this is what we believe, like, people forming units. 1647 01:21:46,120 --> 01:21:50,960 Speaker 1: There's a whole pipeline of, um, anti-authoritarian activists there. 1648 01:21:51,080 --> 01:21:55,560 Speaker 1: There's loads. And generally, like, Ukrainians are happy. You know, 1649 01:21:55,640 --> 01:21:57,000 Speaker 1: they'll take what help they can get. It's not 1650 01:21:57,040 --> 01:21:59,840 Speaker 1: like Ukrainians are like, goddamn those anti-fascists. No, 1651 01:22:00,400 --> 01:22:01,760 Speaker 1: they love them. They love them the same way they 1652 01:22:01,800 --> 01:22:04,240 Speaker 1: love anybody that's defending the country. You know.
It's 1653 01:22:04,280 --> 01:22:06,479 Speaker 1: just normal. And I think people should really, you know, 1654 01:22:06,680 --> 01:22:08,240 Speaker 1: if they want to watch our doc as well, like, 1655 01:22:08,360 --> 01:22:10,639 Speaker 1: if they can share it, that would be great, because 1656 01:22:11,040 --> 01:22:13,360 Speaker 1: it's a struggle to get people to 1657 01:22:13,360 --> 01:22:16,080 Speaker 1: watch it now, because it's been torpedoed 1658 01:22:16,200 --> 01:22:18,280 Speaker 1: on the algorithm. So if they go to YouTube dot 1659 01:22:18,320 --> 01:22:21,479 Speaker 1: com slash Popular Front, they'll find it, it's the first thing there. 1660 01:22:21,520 --> 01:22:23,960 Speaker 1: But yeah, if people can share it, that would be great. All right, 1661 01:22:24,080 --> 01:22:27,200 Speaker 1: we'll check it out. Again, the title is Ukraine's Anti-fascist 1662 01:22:27,240 --> 01:22:30,719 Speaker 1: Football Hooligans Fighting the Russian Invasion, on the Popular 1663 01:22:30,760 --> 01:22:33,599 Speaker 1: Front YouTube channel. We're also going to have a link 1664 01:22:33,680 --> 01:22:35,960 Speaker 1: in the bio if you are someone who doesn't like 1665 01:22:36,240 --> 01:22:39,920 Speaker 1: to type things. Yeah, thank you, Jake. All right, everybody, 1666 01:22:40,120 --> 01:22:57,400 Speaker 1: that's the episode. Welcome to It Could Happen Here, a 1667 01:22:57,520 --> 01:23:01,920 Speaker 1: show that is currently taking place in the wake of the death of abortion 1668 01:23:02,080 --> 01:23:07,800 Speaker 1: rights in the US. And yeah, it's not good.
Um, 1669 01:23:08,680 --> 01:23:12,439 Speaker 1: with me to talk about this is Shereen, is Sophie, 1670 01:23:13,280 --> 01:23:19,080 Speaker 1: is Garrison, and is Robert Evans. And okay, so 1671 01:23:19,240 --> 01:23:21,200 Speaker 1: one of the things that's been happening in 1672 01:23:21,320 --> 01:23:25,559 Speaker 1: the immediate wake of the Supreme Court decision that has 1673 01:23:25,800 --> 01:23:28,519 Speaker 1: destroyed Roe v. Wade is there's been a lot of discussion 1674 01:23:28,520 --> 01:23:31,240 Speaker 1: about the abortion rights movement in Mexico. And by discussion, 1675 01:23:31,360 --> 01:23:34,719 Speaker 1: I mean, in sort of classical American fashion, people saw 1676 01:23:34,880 --> 01:23:38,560 Speaker 1: exactly one meme and reposted it, and that's now the 1677 01:23:38,720 --> 01:23:42,519 Speaker 1: sum of, like, all American knowledge about the abortion struggle 1678 01:23:42,560 --> 01:23:45,600 Speaker 1: in Mexico. So to try to get a deeper understanding 1679 01:23:45,720 --> 01:23:48,000 Speaker 1: of what's been going on in Mexico and how the 1680 01:23:48,040 --> 01:23:52,160 Speaker 1: struggle for abortion was won there, we're talking to Erica Yamada, 1681 01:23:52,240 --> 01:23:54,800 Speaker 1: who's a feminist and human rights activist born and raised 1682 01:23:54,840 --> 01:23:58,679 Speaker 1: in Mexico. Erica, thank you so much for joining us. Yeah, 1683 01:23:58,760 --> 01:24:02,800 Speaker 1: thank you all so much, Chris, Shereen, Sophie, Garrison 1684 01:24:02,880 --> 01:24:06,560 Speaker 1: and Robert. I'm so honored and excited to be 1685 01:24:06,760 --> 01:24:10,280 Speaker 1: here and very grateful to be considered to share about 1686 01:24:10,360 --> 01:24:15,719 Speaker 1: the struggle for abortion rights in Mexico.
So before starting 1687 01:24:15,840 --> 01:24:18,519 Speaker 1: this discussion, I would like to share a little bit 1688 01:24:18,600 --> 01:24:22,360 Speaker 1: about myself and the organization I work in, to have 1689 01:24:22,880 --> 01:24:26,920 Speaker 1: some background about the experiences and data I will be 1690 01:24:27,120 --> 01:24:32,200 Speaker 1: communicating in this space. Um, I have been involved in 1691 01:24:32,439 --> 01:24:36,479 Speaker 1: many agendas for girls' and women's rights for approximately eight 1692 01:24:36,680 --> 01:24:40,679 Speaker 1: years now. I am currently part of the Women Deliver 1693 01:24:40,760 --> 01:24:43,559 Speaker 1: twenty twenty class, and I also work in the non- 1694 01:24:43,640 --> 01:24:50,280 Speaker 1: governmental organization Gender Equality: Citizenship, Work and Family, that has 1695 01:24:50,360 --> 01:24:54,880 Speaker 1: over twenty five years of experience working for sexual and 1696 01:24:55,120 --> 01:24:59,960 Speaker 1: reproductive rights in Mexico, particularly for the access to legal 1697 01:25:00,000 --> 01:25:05,840 Speaker 1: and safe abortion. Our organization promotes and advocates for 1698 01:25:05,960 --> 01:25:12,960 Speaker 1: the sexual and reproductive health and rights of youth through 1699 01:25:13,120 --> 01:25:15,720 Speaker 1: Ddeser. Ddeser is the network for 1700 01:25:15,920 --> 01:25:19,960 Speaker 1: sexual and reproductive rights in Mexico that has presence in 1701 01:25:20,120 --> 01:25:25,880 Speaker 1: twelve states, and we focus mainly on marginalized communities.
We 1702 01:25:26,000 --> 01:25:32,120 Speaker 1: support children, youth and women and advocate for change at local, regional, 1703 01:25:32,400 --> 01:25:37,960 Speaker 1: and national level, and their axes are contributing to 1704 01:25:38,080 --> 01:25:43,839 Speaker 1: decriminalize abortion, guaranteeing access to health services, and generating 1705 01:25:44,800 --> 01:25:49,880 Speaker 1: a favorable public opinion about women's right to decide. We 1706 01:25:50,000 --> 01:25:54,000 Speaker 1: are also part of the National Pro-Choice Alliance in 1707 01:25:54,160 --> 01:26:00,880 Speaker 1: Mexico, an effort of five organizations: Gender Equality, the Population 1708 01:26:01,040 --> 01:26:06,679 Speaker 1: Council, Ipas Central America, Catholics for the Right to Decide 1709 01:26:07,080 --> 01:26:13,040 Speaker 1: and GIRE, each with different expertise regarding abortion. Together 1710 01:26:13,479 --> 01:26:18,280 Speaker 1: we have worked on comprehensive strategies that include the legal, 1711 01:26:18,560 --> 01:26:28,360 Speaker 1: social, religious, ethical and research aspects of abortion. And well, 1712 01:26:29,200 --> 01:26:32,560 Speaker 1: I would like to start off, like, sharing some of 1713 01:26:32,680 --> 01:26:37,560 Speaker 1: the context and the legal situation of abortion in Mexico, 1714 01:26:37,720 --> 01:26:44,240 Speaker 1: if it's okay. In our country, voluntary abortion, 1715 01:26:44,960 --> 01:26:50,599 Speaker 1: uh, during the first twelve weeks of pregnancy is legalized only 1716 01:26:50,920 --> 01:26:55,800 Speaker 1: in certain states. Mexico City, the capital, was the only 1717 01:26:55,920 --> 01:26:59,840 Speaker 1: state in the whole country that decriminalized abortion, in 1718 01:27:00,000 --> 01:27:04,880 Speaker 1: two thousand seven.
Twelve years later, in two thousand nineteen, 1719 01:27:05,280 --> 01:27:09,400 Speaker 1: the state of Oaxaca became the second state to 1720 01:27:09,439 --> 01:27:15,240 Speaker 1: ensure access to this health service. Afterwards, two thousand twenty 1721 01:27:15,320 --> 01:27:20,559 Speaker 1: one was historic. It was a very, very historic year. 1722 01:27:20,600 --> 01:27:27,640 Speaker 1: Four states, Hidalgo, Veracruz, Baja California and 1723 01:27:27,880 --> 01:27:33,240 Speaker 1: Colima, also decriminalized abortion. Then this year, two thousand 1724 01:27:33,360 --> 01:27:38,000 Speaker 1: twenty two, three other states have been added to this 1725 01:27:38,240 --> 01:27:45,320 Speaker 1: list: Sinaloa, Guerrero and Baja California Sur. This means nine out of 1726 01:27:45,479 --> 01:27:50,200 Speaker 1: thirty two states have decriminalized abortion. In the other 1727 01:27:50,840 --> 01:27:55,559 Speaker 1: states of the Mexican Republic, abortion is only allowed under 1728 01:27:55,800 --> 01:28:01,800 Speaker 1: certain grounds established by the law of each entity. For example, 1729 01:28:02,040 --> 01:28:06,240 Speaker 1: if it was a spontaneous abortion, if the pregnancy was 1730 01:28:06,360 --> 01:28:11,479 Speaker 1: due to non-consensual insemination, if the woman's life is 1731 01:28:11,560 --> 01:28:17,800 Speaker 1: in danger of death, if the product has serious genetic alterations, 1732 01:28:18,800 --> 01:28:24,760 Speaker 1: if the pregnancy causes health effects, among other reasons. It 1733 01:28:25,240 --> 01:28:30,519 Speaker 1: depends on the penal code of each state. And I 1734 01:28:30,600 --> 01:28:34,320 Speaker 1: also must add that pregnancy due to rape is the 1735 01:28:34,520 --> 01:28:40,120 Speaker 1: only indication that permits legal abortion in all states. And 1736 01:28:40,360 --> 01:28:44,320 Speaker 1: now coming back to what Chris said, that there was, 1737 01:28:44,439 --> 01:28:49,040 Speaker 1: like, a meme.
I think, uh, you refer to the 1738 01:28:49,280 --> 01:28:54,960 Speaker 1: meme of the public protests. Yeah, yeah, yeah, the black- 1739 01:28:55,040 --> 01:29:00,080 Speaker 1: clad female protesters attacking, is it, I couldn't tell, I 1740 01:29:00,120 --> 01:29:02,080 Speaker 1: don't recall, a city hall or a police station 1741 01:29:02,200 --> 01:29:07,040 Speaker 1: or something. I have also seen some of these 1742 01:29:07,120 --> 01:29:12,040 Speaker 1: media reports that say that we achieved legal 1743 01:29:12,160 --> 01:29:17,600 Speaker 1: abortion thanks to these radical public demonstrations. And well, it 1744 01:29:17,800 --> 01:29:22,720 Speaker 1: is undeniable that among the most significant achievements is the 1745 01:29:22,920 --> 01:29:28,200 Speaker 1: mobilization of feminists and women to eradicate violence and 1746 01:29:28,520 --> 01:29:35,520 Speaker 1: demand justice. Mexico has demonstrated to the world the revolutionary 1747 01:29:35,640 --> 01:29:39,760 Speaker 1: progress with the mass feminist protests, and this image is 1748 01:29:40,000 --> 01:29:44,959 Speaker 1: from two thousand nineteen. It was a huge feminist protest 1749 01:29:45,520 --> 01:29:50,040 Speaker 1: that condemned violence against women, especially sexual and femicide 1750 01:29:50,080 --> 01:29:55,519 Speaker 1: violence, police brutality, and the impunity that permeates the 1751 01:29:55,640 --> 01:30:00,799 Speaker 1: governmental system. We received a lot of international media 1752 01:30:00,800 --> 01:30:04,800 Speaker 1: attention, and it has been one of the, like, 1753 01:30:04,920 --> 01:30:10,720 Speaker 1: recent highlights of the feminist movement in our country. But, 1754 01:30:11,240 --> 01:30:16,720 Speaker 1: like, the struggle for abortion entails so much more, and 1755 01:30:17,200 --> 01:30:21,120 Speaker 1: yes, it did have some influence.
For example, in two 1756 01:30:21,240 --> 01:30:26,559 Speaker 1: thousand twenty, feminists in two states, Quintana Roo and Puebla, took 1757 01:30:26,720 --> 01:30:31,240 Speaker 1: their local congresses and demanded the discussion of abortion initiatives, 1758 01:30:32,000 --> 01:30:36,320 Speaker 1: and they have put this agenda on the table. It 1759 01:30:36,560 --> 01:30:41,880 Speaker 1: is worth mentioning that the struggle for abortion 1760 01:30:42,080 --> 01:30:48,320 Speaker 1: goes back so many years. Feminists have been fighting for 1761 01:30:48,439 --> 01:30:53,160 Speaker 1: reproductive rights, including the access to legal abortion, for 1762 01:30:53,400 --> 01:30:59,040 Speaker 1: decades now. The progress regarding this struggle has unfolded historically 1763 01:30:59,360 --> 01:31:04,479 Speaker 1: during the recent years for many other reasons. One thing 1764 01:31:04,560 --> 01:31:06,880 Speaker 1: I want to go back to a little bit to 1765 01:31:07,000 --> 01:31:11,000 Speaker 1: talk about is, you were talking about the protests being 1766 01:31:11,080 --> 01:31:15,439 Speaker 1: pro-abortion protests but also talking about anti-femicide and anti- 1767 01:31:16,640 --> 01:31:18,400 Speaker 1: violence stuff, and I was wondering if you could talk 1768 01:31:18,400 --> 01:31:22,240 Speaker 1: about the anti-femicide campaigns too, because that's been 1769 01:31:22,280 --> 01:31:25,080 Speaker 1: a really big part of this that gets basically no 1770 01:31:25,240 --> 01:31:30,960 Speaker 1: coverage in the US. Well, in Mexico, eleven women are 1771 01:31:31,080 --> 01:31:37,600 Speaker 1: murdered every day. We have a huge femicide problem that 1772 01:31:37,760 --> 01:31:43,160 Speaker 1: has been silenced by the government, even by the president, 1773 01:31:43,280 --> 01:31:49,920 Speaker 1: who minimizes this horrible situation.
So in two thousand nineteen 1774 01:31:50,479 --> 01:31:58,280 Speaker 1: there was a, uh, emblematic case where police officers raped 1775 01:31:58,439 --> 01:32:04,320 Speaker 1: and tortured a girl, and that's how these protests, uh, 1776 01:32:04,840 --> 01:32:10,200 Speaker 1: started, and since two thousand nineteen, like, most 1777 01:32:11,040 --> 01:32:20,720 Speaker 1: feminist protests have been regarding the violence against women. But, uh, 1778 01:32:21,960 --> 01:32:26,200 Speaker 1: I would also like to add, about the struggle for abortion, 1779 01:32:27,360 --> 01:32:31,280 Speaker 1: I think that in the global South, la marea verde, the 1780 01:32:31,439 --> 01:32:38,680 Speaker 1: Green Tide, played, like, the most fundamental role. Uh, 1781 01:32:38,960 --> 01:32:42,280 Speaker 1: this movement, which came from Argentina, is one of the 1782 01:32:42,400 --> 01:32:48,120 Speaker 1: main successes that strengthened the struggle for abortion rights and 1783 01:32:48,240 --> 01:32:53,599 Speaker 1: even the feminist movement in Mexico. It expanded in many 1784 01:32:53,720 --> 01:32:58,200 Speaker 1: countries, including Mexico. Here we have a national Green Tide 1785 01:32:58,320 --> 01:33:01,759 Speaker 1: and many local Green Tides in all of our states, 1786 01:33:02,840 --> 01:33:07,240 Speaker 1: and these collectives have played a large role demanding social 1787 01:33:07,360 --> 01:33:13,400 Speaker 1: and legal decriminalization of abortion across the country. And there 1788 01:33:13,560 --> 01:33:20,400 Speaker 1: is also an increase of networks that provide self-managed 1789 01:33:20,479 --> 01:33:26,719 Speaker 1: abortion information and accompaniment services, which have contributed to fighting 1790 01:33:26,800 --> 01:33:31,479 Speaker 1: the stigma that still surrounds abortion.
And the Green Tide 1791 01:33:31,640 --> 01:33:36,360 Speaker 1: and the feminist movement, it has become, like, how 1792 01:33:36,439 --> 01:33:41,840 Speaker 1: do you say, it's been merged, and, like, feminist 1793 01:33:41,960 --> 01:33:46,639 Speaker 1: movements and the Green Tide fight for legal and safe abortion, 1794 01:33:46,720 --> 01:33:51,200 Speaker 1: but also to eradicate the violence against girls and women. Yeah, 1795 01:33:51,280 --> 01:33:55,400 Speaker 1: that makes sense. And about the Green Tide, 1796 01:33:55,880 --> 01:33:59,200 Speaker 1: I have two questions about the Green Tide. One is, 1797 01:34:00,160 --> 01:34:03,280 Speaker 1: what kinds of tactics have Green Tide groups been doing? 1798 01:34:03,520 --> 01:34:08,720 Speaker 1: And also, how linked has the international movement been? 1799 01:34:08,880 --> 01:34:13,240 Speaker 1: Like, how closely have these organizations 1800 01:34:13,280 --> 01:34:19,760 Speaker 1: been communicating and working together across the different countries? Okay, uh, 1801 01:34:20,080 --> 01:34:24,960 Speaker 1: since the Green Tide came from Argentina, like, 1802 01:34:26,960 --> 01:34:31,759 Speaker 1: how do you say, the communication comes from regional 1803 01:34:31,920 --> 01:34:36,479 Speaker 1: countries in Latin America, and Mexico has been learning from 1804 01:34:36,720 --> 01:34:41,240 Speaker 1: these Latin American countries' experiences. We have seen the 1805 01:34:41,439 --> 01:34:46,920 Speaker 1: feminist movements, the protests, also more in the South, and 1806 01:34:47,720 --> 01:34:54,080 Speaker 1: the green handkerchief is this very, very powerful symbol 1807 01:34:54,560 --> 01:34:58,800 Speaker 1: of legal and safe abortion, and these have also contributed 1808 01:34:59,000 --> 01:35:03,920 Speaker 1: to the social decriminalization of abortion.
And wearing the 1809 01:35:04,800 --> 01:35:11,400 Speaker 1: green handkerchiefs in the protests also means demanding this 1810 01:35:13,200 --> 01:35:18,599 Speaker 1: health service. And one of our tactics is, of course, 1811 01:35:19,240 --> 01:35:25,840 Speaker 1: pressuring the government. In Mexico, political will, primarily from the 1812 01:35:26,560 --> 01:35:32,840 Speaker 1: left-leaning ruling party, has been fundamental for the decriminalization. 1813 01:35:34,360 --> 01:35:37,439 Speaker 1: With the new government that arrived in two thousand 1814 01:35:37,560 --> 01:35:47,080 Speaker 1: eighteen, headed by Andrés Manuel López Obrador, we have more allies and progressive legislators. 1815 01:35:47,800 --> 01:35:53,519 Speaker 1: So due to the majority, uh, that this political party 1816 01:35:53,640 --> 01:35:59,559 Speaker 1: has in many local congresses, the feminists of each state 1817 01:35:59,760 --> 01:36:04,080 Speaker 1: have been able to pressure and work with these legislators 1818 01:36:04,520 --> 01:36:10,719 Speaker 1: and keep pushing this agenda. That's awesome. I think something 1819 01:36:10,760 --> 01:36:13,080 Speaker 1: that I'm still stuck on is that at the very 1820 01:36:13,200 --> 01:36:18,280 Speaker 1: least, all the states agree that abortion is okay if 1821 01:36:18,280 --> 01:36:20,679 Speaker 1: it happened from rape. Is that what you said earlier? 1822 01:36:20,760 --> 01:36:25,080 Speaker 1: Like, that's the one? We have a federal law. 1823 01:36:25,560 --> 01:36:31,640 Speaker 1: It's the NOM zero forty six, the official Mexican norm, that 1824 01:36:32,080 --> 01:36:36,519 Speaker 1: states that abortion is legal if the pregnancy is due 1825 01:36:36,600 --> 01:36:41,960 Speaker 1: to rape, and all the states, all the public officials, 1826 01:36:42,040 --> 01:36:47,960 Speaker 1: have this obligation to ensure that this happens.
1827 01:36:48,040 --> 01:36:54,000 Speaker 1: But sometimes, um, like, we have so many prejudices 1828 01:36:54,320 --> 01:37:00,320 Speaker 1: that sometimes even doctors don't respect the law. But by 1829 01:37:00,439 --> 01:37:03,559 Speaker 1: norm it should be legal. And it's not that they 1830 01:37:03,600 --> 01:37:08,519 Speaker 1: all agree, it's the norm. Yeah, it's just, so, 1831 01:37:08,960 --> 01:37:12,160 Speaker 1: I mean, it definitely has its flaws, and people with 1832 01:37:12,200 --> 01:37:17,160 Speaker 1: their own biases. But, like, here, usually the rapists will 1833 01:37:17,160 --> 01:37:19,800 Speaker 1: have more rights and protection than the person that got raped. 1834 01:37:19,880 --> 01:37:21,840 Speaker 1: Like, the family is allowed to sue the 1835 01:37:21,920 --> 01:37:25,839 Speaker 1: person that got an abortion, for example. It's insane. 1836 01:37:26,080 --> 01:37:29,320 Speaker 1: But so, then, here, a 1837 01:37:29,400 --> 01:37:32,519 Speaker 1: lot of the bigotry comes from, like, Christianity and religion. 1838 01:37:32,640 --> 01:37:34,479 Speaker 1: Is it the same? Like, is that the baseline for 1839 01:37:35,040 --> 01:37:41,520 Speaker 1: the opposition there too? Yes, because Mexico is a predominantly 1840 01:37:41,760 --> 01:37:47,160 Speaker 1: Catholic country, and abortion entails many controversies due to 1841 01:37:47,320 --> 01:37:51,080 Speaker 1: the different positions that come from these religious stances, 1842 01:37:51,200 --> 01:37:56,040 Speaker 1: stances that ignore and deny the access to this 1843 01:37:56,240 --> 01:38:01,440 Speaker 1: service and deny it's a human rights issue.
And religious 1844 01:38:01,880 --> 01:38:06,519 Speaker 1: anti-rights groups, or, how do you say, anti- 1845 01:38:06,800 --> 01:38:11,960 Speaker 1: choice groups, have a powerful presence and are actively hindering 1846 01:38:12,280 --> 01:38:17,920 Speaker 1: law proposals regarding this topic. The prejudices and stigma are 1847 01:38:18,040 --> 01:38:25,400 Speaker 1: present even amongst healthcare providers, and sometimes, uh, the religious people, 1848 01:38:26,080 --> 01:38:31,840 Speaker 1: they pressure these healthcare providers, the legislators. For example, every 1849 01:38:31,960 --> 01:38:36,040 Speaker 1: time there is a law in a local congress, there are 1850 01:38:36,280 --> 01:38:42,599 Speaker 1: so many religious groups outside the Congress. They are, how 1851 01:38:42,680 --> 01:38:48,200 Speaker 1: do you say, like, bothering the legislators. They even get 1852 01:38:48,280 --> 01:38:54,799 Speaker 1: their personal numbers and they are harassing them. Yes, harassing 1853 01:38:55,000 --> 01:39:00,080 Speaker 1: is the word. They're harassing them. So yes, they have 1854 01:39:00,600 --> 01:39:04,280 Speaker 1: a lot of power, a lot of money, and this 1855 01:39:04,400 --> 01:39:11,120 Speaker 1: affects even the states where abortion is legal, because, as 1856 01:39:11,160 --> 01:39:17,280 Speaker 1: I said before, sometimes doctors deny it even if it's 1857 01:39:17,360 --> 01:39:24,160 Speaker 1: requested under the legal indications. So yes, it's a problem.
1858 01:39:25,880 --> 01:39:30,000 Speaker 1: I'm curious what you see as kind of the value 1859 01:39:30,080 --> 01:39:33,320 Speaker 1: of the street actions that were carried out, as opposed 1860 01:39:33,360 --> 01:39:37,240 Speaker 1: to, um, kind of the actual organization on the 1861 01:39:37,360 --> 01:39:40,120 Speaker 1: legislative side of things. Like, to what degree do you 1862 01:39:40,200 --> 01:39:44,479 Speaker 1: think both contributed to, you know, the successes that you 1863 01:39:44,479 --> 01:39:50,440 Speaker 1: all have seen? I think both were very, very important 1864 01:39:50,520 --> 01:39:58,800 Speaker 1: to the recent successes. The public demonstrations helped the feminist 1865 01:39:59,080 --> 01:40:07,200 Speaker 1: movement strengthen. Like, these recent protests 1866 01:40:07,280 --> 01:40:11,280 Speaker 1: have been, what do you say, it has been 1867 01:40:11,520 --> 01:40:15,519 Speaker 1: where the most women have gone out to the streets, 1868 01:40:15,840 --> 01:40:19,799 Speaker 1: taken the streets, and it has helped, because the government 1869 01:40:19,920 --> 01:40:25,360 Speaker 1: has responded to some of our requests. But also it 1870 01:40:25,560 --> 01:40:33,680 Speaker 1: is extremely important to talk about the organization. And also, uh, 1871 01:40:34,000 --> 01:40:37,320 Speaker 1: something I didn't mention and that I would like to 1872 01:40:37,400 --> 01:40:42,320 Speaker 1: emphasize is that in two thousand twenty one, the Supreme 1873 01:40:42,400 --> 01:40:47,320 Speaker 1: Court of Justice in Mexico ruled favorably in four abortion- 1874 01:40:47,400 --> 01:40:53,400 Speaker 1: related cases, and this provided us with progressive jurisprudence 1875 01:40:54,040 --> 01:40:59,880 Speaker 1: and legal interpretations in favor of recognizing and increasing abortion rights.
1876 01:41:00,640 --> 01:41:05,639 Speaker 1: So this has, how do you say, this has served 1877 01:41:05,800 --> 01:41:13,400 Speaker 1: our movement and all the argumentation to push the decriminalizations. 1878 01:41:14,520 --> 01:41:18,799 Speaker 1: And well, about the four cases. In the first case, 1879 01:41:19,240 --> 01:41:26,040 Speaker 1: the Supreme Court declared that limitations to access legal abortion 1880 01:41:26,320 --> 01:41:30,760 Speaker 1: after rape must be removed. In the second case, it 1881 01:41:30,920 --> 01:41:40,120 Speaker 1: declared that the absolute criminalization of abortion with consent is unconstitutional. Uh, 1882 01:41:41,520 --> 01:41:47,240 Speaker 1: and in the third case, it declared that the 1883 01:41:47,479 --> 01:41:52,080 Speaker 1: protection of life from the moment of conception is unconstitutional. 1884 01:41:53,520 --> 01:41:58,520 Speaker 1: And in the fourth case, the court ruled that legislative 1885 01:41:58,560 --> 01:42:04,200 Speaker 1: reforms broadening the boundaries of conscientious objection in the federal 1886 01:42:04,320 --> 01:42:10,160 Speaker 1: health law are unconstitutional. And the Supreme Court is the 1887 01:42:10,640 --> 01:42:16,840 Speaker 1: highest court of justice in Mexico, and all judges should 1888 01:42:16,920 --> 01:42:22,920 Speaker 1: respect what they established. And well, unfortunately it doesn't happen 1889 01:42:23,240 --> 01:42:28,360 Speaker 1: in all states. And it is, like, the most 1890 01:42:28,439 --> 01:42:32,560 Speaker 1: important precedent we have right now, and it is 1891 01:42:32,760 --> 01:42:40,200 Speaker 1: fundamental for our argumentation in local congresses. Has 1892 01:42:40,240 --> 01:42:42,439 Speaker 1: the national government done anything at all to try to 1893 01:42:42,520 --> 01:42:47,080 Speaker 1: force the states who are, like, not following the rulings to, 1894 01:42:48,120 --> 01:42:54,519 Speaker 1: like, accept the rulings?
No, because our president, uh, he, 1895 01:42:56,080 --> 01:43:00,920 Speaker 1: he is very neutral on this topic, and he has 1896 01:43:02,120 --> 01:43:08,160 Speaker 1: spoken against feminist movements, and he thinks that any protest 1897 01:43:08,439 --> 01:43:15,200 Speaker 1: means, like, uh, conservatives against his liberal government. So now 1898 01:43:15,520 --> 01:43:22,919 Speaker 1: we don't have this support from the national government, 1899 01:43:23,120 --> 01:43:27,680 Speaker 1: although, as I mentioned before, we have a lot of allies, and 1900 01:43:28,080 --> 01:43:35,200 Speaker 1: in many instances that have helped to pressure state 1901 01:43:35,439 --> 01:43:39,639 Speaker 1: public officials to respect the law and to keep 1902 01:43:39,680 --> 01:43:48,439 Speaker 1: pushing this agenda. The president, I'm just curious, and 1903 01:43:49,280 --> 01:43:52,439 Speaker 1: I'm ignorant, but the president, like, well, how is 1904 01:43:52,479 --> 01:43:55,320 Speaker 1: he received by the general public? Like, what, people's, like, 1905 01:43:55,800 --> 01:43:57,880 Speaker 1: is he neutral because he's a coward, because he 1906 01:43:57,920 --> 01:44:00,960 Speaker 1: doesn't want to lose votes? All right, 1907 01:44:01,040 --> 01:44:05,800 Speaker 1: what's the response from the public? He still has 1908 01:44:06,200 --> 01:44:11,760 Speaker 1: a lot of support from the majority. He 1909 01:44:12,080 --> 01:44:17,639 Speaker 1: is one of the first, how do you say, 1910 01:44:17,760 --> 01:44:23,719 Speaker 1: progressive presidents, although we have been very disappointed by many 1911 01:44:23,840 --> 01:44:29,519 Speaker 1: of his actions, for instance the increase of militarization and 1912 01:44:30,080 --> 01:44:36,559 Speaker 1: the criminalization of feminists, of human rights activists, of journalists.
However, 1913 01:44:37,400 --> 01:44:41,280 Speaker 1: it is the first time in so many years that 1914 01:44:41,720 --> 01:44:48,960 Speaker 1: a president talks about poor indigenous people, that, uh, sends 1915 01:44:49,000 --> 01:44:53,840 Speaker 1: support to rural communities. So he still has a lot 1916 01:44:53,960 --> 01:45:01,000 Speaker 1: of support. Mm. Um. One thing that, I don't know 1917 01:45:01,080 --> 01:45:03,160 Speaker 1: how much, I don't know how much you want to 1918 01:45:03,200 --> 01:45:06,559 Speaker 1: get into it, but, um, we talked to some people, 1919 01:45:08,240 --> 01:45:09,840 Speaker 1: oh god, I don't remember how many months ago now, 1920 01:45:10,040 --> 01:45:12,080 Speaker 1: but we talked to some people a while back who 1921 01:45:12,200 --> 01:45:17,920 Speaker 1: were, um, doing trans rights organizing in Mexico, and they 1922 01:45:17,960 --> 01:45:22,960 Speaker 1: were talking a lot about how, um, 1923 01:45:22,960 --> 01:45:31,200 Speaker 1: I guess, like, anti-choice conservative groups 1924 01:45:31,240 --> 01:45:35,080 Speaker 1: have been using, um, have been using sort of organized 1925 01:45:35,120 --> 01:45:38,479 Speaker 1: transphobic groups as a way to sort of divert attention 1926 01:45:38,520 --> 01:45:41,679 Speaker 1: away from the abortion struggle and the femicide struggle into 1927 01:45:41,960 --> 01:45:47,000 Speaker 1: stuff that doesn't, like, challenge the status quo. And yeah, 1928 01:45:47,000 --> 01:45:48,439 Speaker 1: and I was wondering, I just wanted to talk about 1929 01:45:48,439 --> 01:45:50,880 Speaker 1: that a little bit. Yes, thank you so much for 1930 01:45:51,200 --> 01:46:00,320 Speaker 1: talking about this. Transphobia in the feminist movements is horrible.
1931 01:46:00,520 --> 01:46:10,520 Speaker 1: Like, the transphobic feminists have been getting to conservative public officials, 1932 01:46:10,840 --> 01:46:16,200 Speaker 1: they have been approaching religious groups, and they have even 1933 01:46:16,520 --> 01:46:21,800 Speaker 1: affected the abortion agenda, because some of our laws include 1934 01:46:22,000 --> 01:46:27,960 Speaker 1: people with the capacity to get pregnant. So these health 1935 01:46:28,040 --> 01:46:34,680 Speaker 1: services include, ah, trans men and non-binary people. But 1936 01:46:34,920 --> 01:46:40,360 Speaker 1: these transphobic feminists have been, how do you say, obstaculizing 1937 01:46:41,240 --> 01:46:47,519 Speaker 1: this struggle because of these prejudices, and it is 1938 01:46:47,880 --> 01:46:52,960 Speaker 1: very, very sad. And some of the 1939 01:46:53,360 --> 01:47:00,280 Speaker 1: main and most famous references in feminism have been 1940 01:47:00,800 --> 01:47:06,320 Speaker 1: joining this transphobic side. And yes, they are approaching 1941 01:47:06,520 --> 01:47:11,320 Speaker 1: the ultra-right, and they have been hindering not 1942 01:47:11,520 --> 01:47:18,639 Speaker 1: only trans people's rights, but now women's rights in general. Yeah, 1943 01:47:18,680 --> 01:47:22,360 Speaker 1: I think, was it, I'm trying to remember off the top 1944 01:47:22,400 --> 01:47:23,800 Speaker 1: of my head, I think that there was 1945 01:47:23,840 --> 01:47:25,920 Speaker 1: a picture that was going around, that was some of 1946 01:47:26,000 --> 01:47:29,840 Speaker 1: the organizers from one of the, like, transphobic feminist collectives, 1947 01:47:30,280 --> 01:47:37,320 Speaker 1: like, taking pictures with, sleeping held around. Yeah, I 1948 01:47:37,400 --> 01:47:39,479 Speaker 1: think it was sleeping held around. Yeah, but 1949 01:47:42,040 --> 01:47:46,560 Speaker 1: I don't know. I haven't seen that.
But there was 1950 01:47:46,960 --> 01:47:52,080 Speaker 1: a forum some weeks ago. It was a forum in 1951 01:47:52,200 --> 01:47:56,680 Speaker 1: the National Autonomous University of Mexico, and it was a 1952 01:47:57,160 --> 01:48:02,280 Speaker 1: feminist discussion, and most of the feminists who are 1953 01:48:02,600 --> 01:48:06,679 Speaker 1: so, so famous in all Latin America started to say 1954 01:48:06,760 --> 01:48:15,040 Speaker 1: some transphobic points. So, yes, this anti-rights movement is 1955 01:48:15,280 --> 01:48:20,320 Speaker 1: very present in feminism. Yeah, I guess the other 1956 01:48:20,439 --> 01:48:22,280 Speaker 1: thing on that point that I was wondering is, how 1957 01:48:22,360 --> 01:48:26,479 Speaker 1: have, like, pro-trans feminists been sort of fighting back 1958 01:48:26,520 --> 01:48:31,479 Speaker 1: against these people? Has that been happening a lot? Uh, well, 1959 01:48:31,560 --> 01:48:37,080 Speaker 1: we're trying, but it has been very, very difficult, because 1960 01:48:38,000 --> 01:48:43,480 Speaker 1: literally there are transphobic people everywhere, everywhere, I mean in government 1961 01:48:44,320 --> 01:48:50,000 Speaker 1: and non-governmental organizations and institutions, and the majority of 1962 01:48:50,120 --> 01:48:57,960 Speaker 1: the people are not, how do you say, um, socially conscious 1963 01:48:58,080 --> 01:49:05,080 Speaker 1: about trans rights, so transphobic people have 1964 01:49:05,400 --> 01:49:12,280 Speaker 1: so much more power. But, uh, sometimes we denounce 1965 01:49:12,400 --> 01:49:19,880 Speaker 1: it on social media, we report it to international organizations, 1966 01:49:20,840 --> 01:49:24,640 Speaker 1: and, like, we have all the human rights narrative and 1967 01:49:24,880 --> 01:49:30,479 Speaker 1: argumentation in our favor, but it is difficult because there 1968 01:49:30,520 --> 01:49:36,599 Speaker 1: are so many transphobes everywhere. Um.
We have also 1969 01:49:38,600 --> 01:49:44,759 Speaker 1: contacted international organizations to publicly say that, for example, 1970 01:49:44,880 --> 01:49:47,640 Speaker 1: if you want to access a certain grant, you have 1971 01:49:47,840 --> 01:49:55,840 Speaker 1: to have an inclusive position. In what other ways? Uh, we, 1972 01:49:57,280 --> 01:50:02,519 Speaker 1: like, the trans movement has strengthened so much since two 1973 01:50:02,640 --> 01:50:09,400 Speaker 1: thousand nineteen, because in Mexico City, um, uh, a 1974 01:50:09,600 --> 01:50:18,240 Speaker 1: law to recognize trans children and adolescents was pushed for 1975 01:50:18,400 --> 01:50:23,439 Speaker 1: the first time, like, via the administrative way. 1976 01:50:24,520 --> 01:50:30,160 Speaker 1: So there has been, how do you say, a commission 1977 01:50:30,240 --> 01:50:38,519 Speaker 1: of trans organizations and collectives. So I think that is 1978 01:50:38,640 --> 01:50:45,200 Speaker 1: the most noteworthy progress. Yeah, I guess there's 1979 01:50:45,240 --> 01:50:47,920 Speaker 1: been a lot of people, like, looking to the Green 1980 01:50:48,000 --> 01:50:51,679 Speaker 1: Tide and looking to sort of the broader Latin American 1981 01:50:51,920 --> 01:50:55,720 Speaker 1: feminist movement for sort of inspiration, and also for sort 1982 01:50:55,720 --> 01:50:58,960 Speaker 1: of tactical advice. And I was wondering, what, like, what 1983 01:50:59,080 --> 01:51:01,360 Speaker 1: advice would you give people in the US who are 1984 01:51:01,439 --> 01:51:05,240 Speaker 1: coming into this fight now, and where would you send 1985 01:51:05,320 --> 01:51:08,840 Speaker 1: people to learn more stuff about it? Mm hmm.
There are 1986 01:51:08,960 --> 01:51:14,800 Speaker 1: some key points I consider relevant. Firstly, the 1987 01:51:15,520 --> 01:51:20,920 Speaker 1: visibility of the pro-choice agenda and the social decriminalization 1988 01:51:21,080 --> 01:51:25,640 Speaker 1: of abortion. When we talk about legal abortion, we 1989 01:51:25,800 --> 01:51:30,800 Speaker 1: have to emphasize a lot also the social decriminalization. 1990 01:51:31,200 --> 01:51:37,040 Speaker 1: It is very important to work on strategies to 1991 01:51:37,240 --> 01:51:42,960 Speaker 1: reduce stigma and demonstrate that abortion is a common reproductive 1992 01:51:43,200 --> 01:51:48,680 Speaker 1: event that must be approached using a gender perspective and the 1993 01:51:48,920 --> 01:51:56,839 Speaker 1: human rights framework. We encourage public dissemination of the legal, 1994 01:51:57,320 --> 01:52:02,960 Speaker 1: medical, and social information, with hard, sustained data from 1995 01:52:03,200 --> 01:52:09,200 Speaker 1: international organizations that position abortion as a human 1996 01:52:09,360 --> 01:52:14,280 Speaker 1: right and an essential health service. And related to this 1997 01:52:14,640 --> 01:52:21,439 Speaker 1: first point, the narrative and the argumentation: we have to 1998 01:52:21,600 --> 01:52:26,519 Speaker 1: focus, um, on the access to safe and legal abortion as 1999 01:52:26,600 --> 01:52:31,440 Speaker 1: a human rights issue, which means it's a governmental obligation 2000 01:52:31,680 --> 01:52:36,360 Speaker 1: to ensure access to this service.
In our case, Mexico 2001 01:52:36,560 --> 01:52:42,519 Speaker 1: has national and international commitments regarding girls' and women's rights, 2002 01:52:42,600 --> 01:52:47,120 Speaker 1: and I'm pretty sure the United States also has these commitments, 2003 01:52:47,360 --> 01:52:52,000 Speaker 1: so it's their obligation, it's the government's obligation, to 2004 01:52:52,200 --> 01:52:59,200 Speaker 1: ensure it. And also, regarding the narrative, we have to work 2005 01:52:59,400 --> 01:53:05,519 Speaker 1: on naturalizing abortion and encourage people to stop using this 2006 01:53:05,720 --> 01:53:11,360 Speaker 1: word as a crime. Abortion is a human right, and 2007 01:53:11,880 --> 01:53:15,960 Speaker 1: it is a reproductive event in the life of women 2008 01:53:16,200 --> 01:53:19,960 Speaker 1: and people with the capacity to get pregnant. And it's 2009 01:53:20,000 --> 01:53:25,240 Speaker 1: a reproductive event that has always existed and will always exist, 2010 01:53:25,400 --> 01:53:33,000 Speaker 1: either naturally or induced. And some of the organizations that 2011 01:53:33,320 --> 01:53:38,479 Speaker 1: I know of here that can provide information are 2012 01:53:38,920 --> 01:53:45,439 Speaker 1: the pro-choice alliance organizations; Catholics for the Right to Decide, 2013 01:53:45,560 --> 01:53:50,720 Speaker 1: they can give the religious and ethical arguments; um, my 2014 01:53:50,960 --> 01:53:57,760 Speaker 1: organization, Gender Equality, we have the social argumentation. We 2015 01:53:57,880 --> 01:54:02,320 Speaker 1: accompany and work with the girls and women. 2016 01:54:02,600 --> 01:54:05,400 Speaker 1: We are in twelve states and we are in 2017 01:54:05,479 --> 01:54:10,720 Speaker 1: the mobilizations. We are in the state and the local congresses. 2018 01:54:11,960 --> 01:54:21,880 Speaker 1: Also GIRE. GIRE, in Spanish, is a group of information.
They have 2019 01:54:22,280 --> 01:54:29,000 Speaker 1: all the legal expertise, uh, and they worked on these reforms 2020 01:54:29,120 --> 01:54:34,760 Speaker 1: and laws that decriminalized abortion. We have Ipas. Ipas 2021 01:54:35,160 --> 01:54:40,280 Speaker 1: is an international organization, and they are medical experts, and 2022 01:54:40,640 --> 01:54:46,840 Speaker 1: they provide all types of data and information regarding this 2023 01:54:47,080 --> 01:54:53,080 Speaker 1: part. And the Population Council, they are the experts on 2024 01:54:53,840 --> 01:55:02,480 Speaker 1: monitoring and investigation, and they have many research papers. And, well, 2025 01:55:03,160 --> 01:55:09,240 Speaker 1: there are also, like, other pages that can give 2026 01:55:09,320 --> 01:55:16,080 Speaker 1: information, for example, about this self-induced abortion. The 2027 01:55:17,360 --> 01:55:22,320 Speaker 1: World Health Organization has a protocol. It's a public protocol 2028 01:55:23,200 --> 01:55:27,560 Speaker 1: for self-induced abortion, and it is completely safe to 2029 01:55:27,640 --> 01:55:35,720 Speaker 1: do it at home. Well, I really appreciate all the information. 2030 01:55:36,240 --> 01:55:40,080 Speaker 1: Uh, yeah, thank you so much. Yeah, thank you so much. 2031 01:55:40,200 --> 01:55:44,120 Speaker 1: I think it's really helpful to hear, um, what other 2032 01:55:44,200 --> 01:55:46,520 Speaker 1: countries have done in the same struggle. It's, like, so 2033 01:55:46,680 --> 01:55:49,440 Speaker 1: similar but different at the same time, because 2034 01:55:49,480 --> 01:55:51,680 Speaker 1: we've dealt with similar things, like TERFs and 2035 01:55:52,600 --> 01:55:56,720 Speaker 1: religious, like, opposition and everything.
So it's really helpful, I think, 2036 01:55:57,680 --> 01:56:01,760 Speaker 1: to see, to realize that, first of all, it is 2037 01:56:01,840 --> 01:56:04,040 Speaker 1: a basic human right, like, it's not even, it's, like, 2038 01:56:04,120 --> 01:56:07,640 Speaker 1: internationally an issue. And then just to see how other 2039 01:56:07,680 --> 01:56:11,840 Speaker 1: people have organized is really important, I think. Yes, and 2040 01:56:12,400 --> 01:56:16,760 Speaker 1: now I believe that we have, like, kind of a 2041 01:56:16,880 --> 01:56:24,360 Speaker 1: similar situation, where, well, it's a situation of legal discrimination 2042 01:56:24,880 --> 01:56:28,880 Speaker 1: in which only women who live in, or have the resources 2043 01:56:28,960 --> 01:56:32,760 Speaker 1: to travel to, the states that have decriminalized abortion 2044 01:56:32,880 --> 01:56:38,080 Speaker 1: can exercise the right to a voluntary legal interruption of 2045 01:56:38,160 --> 01:56:43,920 Speaker 1: their pregnancy. Am I right? Because, like, I don't 2046 01:56:43,960 --> 01:56:47,480 Speaker 1: know much about the situation in the United States, but 2047 01:56:47,600 --> 01:56:52,240 Speaker 1: I know that it is legal in some states, right? Yeah, yeah, 2048 01:56:53,880 --> 01:56:57,440 Speaker 1: some states. And then, like, in contrast to that, it's, 2049 01:56:57,520 --> 01:57:02,040 Speaker 1: like, illegal even in case of rape, and, like, the 2050 01:57:02,440 --> 01:57:04,480 Speaker 1: people that have been raped can be sued. It's, like, 2051 01:57:04,520 --> 01:57:08,520 Speaker 1: a very, like, up and down kind of balance. Um. 2052 01:57:09,240 --> 01:57:11,920 Speaker 1: But yeah, there's definitely both that exist, and I think 2053 01:57:11,960 --> 01:57:16,760 Speaker 1: that's where it becomes really hard to extinguish the bad side.
2054 01:57:17,680 --> 01:57:21,080 Speaker 1: It's part of our struggle to decriminalize abortion 2055 01:57:21,160 --> 01:57:25,520 Speaker 1: in the other states, because women who live 2056 01:57:25,600 --> 01:57:29,840 Speaker 1: in poverty and marginalized conditions, who want to have an 2057 01:57:29,880 --> 01:57:34,720 Speaker 1: abortion but reside in states where it's illegal, kind 2058 01:57:34,720 --> 01:57:40,680 Speaker 1: of do so under illegal circumstances. So it's also a 2059 01:57:41,800 --> 01:57:48,920 Speaker 1: class problem. It is, yeah, definitely. And also, in Mexico there 2060 01:57:48,960 --> 01:57:53,040 Speaker 1: are some states that even criminalize a spontaneous abortion, when 2061 01:57:53,360 --> 01:57:59,360 Speaker 1: it wasn't even induced, and instead of calling an ambulance, 2062 01:58:00,040 --> 01:58:03,920 Speaker 1: some people call the cops when a woman is 2063 01:58:04,080 --> 01:58:12,280 Speaker 1: dying because of a spontaneous abortion. So, yes, and this 2064 01:58:12,520 --> 01:58:18,680 Speaker 1: has caused also a public health problem, affecting girls and 2065 01:58:18,800 --> 01:58:23,000 Speaker 1: women in more vulnerable situations, who live in 2066 01:58:23,120 --> 01:58:29,880 Speaker 1: the most restrictive contexts: rural and indigenous communities, also migrants, 2067 01:58:31,000 --> 01:58:37,680 Speaker 1: girls and women victims of sexual abuse, women with disabilities, 2068 01:58:38,040 --> 01:58:45,880 Speaker 1: among others. And always, always, always, the most vulnerable 2069 01:58:46,000 --> 01:58:52,720 Speaker 1: women are more susceptible to getting unsafe clandestine abortions, 2070 01:58:52,840 --> 01:59:00,320 Speaker 1: which can lead to infections, hemorrhaging, uh, injury to 2071 01:59:00,480 --> 01:59:06,880 Speaker 1: internal organs.
And even, there are some places, like, 2072 01:59:07,160 --> 01:59:11,800 Speaker 1: in communities where there is not even access to 2073 01:59:12,040 --> 01:59:17,720 Speaker 1: internet or to basic health services. And, mm, 2074 01:59:18,920 --> 01:59:24,800 Speaker 1: women are still dying because of unsafe abortions. Yeah, 2075 01:59:24,880 --> 01:59:31,600 Speaker 1: and they are, like, a hundred percent preventable. Yeah, no, totally, yeah, 2076 01:59:32,360 --> 01:59:37,040 Speaker 1: thank you, you're doing amazing. Um, but it's interesting, because 2077 01:59:37,080 --> 01:59:39,720 Speaker 1: that's true. I think regardless of the country, the most 2078 01:59:39,840 --> 01:59:42,320 Speaker 1: vulnerable are the most affected. Whether it's, I mean, it's 2079 01:59:42,360 --> 01:59:45,360 Speaker 1: a class issue, it's a race issue, it's a disability issue, 2080 01:59:45,400 --> 01:59:48,960 Speaker 1: it's, like, all these things. I mean, rich people 2081 01:59:48,960 --> 01:59:52,800 Speaker 1: will get abortions either way, like, privileged people will always 2082 01:59:52,840 --> 01:59:56,160 Speaker 1: have a route to take care of themselves. Um. So 2083 01:59:57,160 --> 01:59:59,680 Speaker 1: it's just, I don't know, it's unfortunate seeing how, 2084 01:59:59,800 --> 02:00:06,920 Speaker 1: like, humans have functioned regardless of the country that they've built. Yes, 2085 02:00:07,360 --> 02:00:13,800 Speaker 1: it's sad, and criminalizing abortion does not reduce its practice. I 2086 02:00:13,920 --> 02:00:19,720 Speaker 1: think that prohibiting it will not end its practice, and 2087 02:00:20,080 --> 02:00:25,520 Speaker 1: it only increases the probabilities of unsafe procedures, and 2088 02:00:25,880 --> 02:00:31,560 Speaker 1: it increases the stigma and prejudices, and it even strengthens 2089 02:00:31,720 --> 02:00:37,320 Speaker 1: these anti-rights and anti-choice groups.
But when abortion is 2090 02:00:37,440 --> 02:00:41,120 Speaker 1: performed in a safe and appropriate manner, it 2091 02:00:41,280 --> 02:00:47,600 Speaker 1: is even less risky than childbirth, among other interventions. And 2092 02:00:48,720 --> 02:00:52,720 Speaker 1: also, for example, it is much safer for a 2093 02:00:52,800 --> 02:00:57,360 Speaker 1: girl to have an abortion than, to what you said, 2094 02:00:57,520 --> 02:01:02,840 Speaker 1: than to continue when the pregnancy is, 2095 02:01:02,920 --> 02:01:08,240 Speaker 1: like, threatening her life. Yeah, yes. Um, well, that's why 2096 02:01:08,440 --> 02:01:13,640 Speaker 1: we have to keep fighting for people. I was really 2097 02:01:14,160 --> 02:01:18,760 Speaker 1: getting down there. Yes, and here in Mexico, like, bills 2098 02:01:18,880 --> 02:01:24,080 Speaker 1: continue to be promoted in different states, we keep forming 2099 02:01:24,240 --> 02:01:29,720 Speaker 1: and strengthening alliances, and we have to strengthen these alliances 2100 02:01:29,920 --> 02:01:33,960 Speaker 1: with all types of sectors. And that's why the 2101 02:01:34,080 --> 02:01:39,440 Speaker 1: alliances work, for example, because we work with the religious sectors, 2102 02:01:39,720 --> 02:01:45,240 Speaker 1: we work also with legislators, with doctors, health care providers, 2103 02:01:45,880 --> 02:01:52,480 Speaker 1: even in schools, and with the general public. So, uh, 2104 02:01:52,880 --> 02:02:01,800 Speaker 1: it is a collective effort and a collective commitment. Mm hmm, yeah, 2105 02:02:03,200 --> 02:02:06,360 Speaker 1: very true. I have nothing to add. I can't say it 2106 02:02:06,400 --> 02:02:09,280 Speaker 1: better than that. Um. So, thank you so much for 2107 02:02:10,520 --> 02:02:14,000 Speaker 1: joining us, and I'm going to step away now. Thank 2108 02:02:14,080 --> 02:02:18,840 Speaker 1: you as well. Of course, yeah, and I guess, uh,
the 2109 02:02:18,960 --> 02:02:20,680 Speaker 1: last thing: well, a, do you have anything else 2110 02:02:20,720 --> 02:02:24,040 Speaker 1: you want to say? And then, b, where can 2111 02:02:24,080 --> 02:02:27,000 Speaker 1: people find you on the internet, like, if they want to? 2112 02:02:27,440 --> 02:02:30,160 Speaker 1: And do you have other organizations and stuff you 2113 02:02:30,200 --> 02:02:35,080 Speaker 1: want to promote? Uh, I'm, like, Erica Yamava, you know, 2114 02:02:35,320 --> 02:02:41,040 Speaker 1: my social media, and the organization I work in, it's 2115 02:02:41,240 --> 02:02:47,520 Speaker 1: I Giva the Middle Silvavania, Bravai from herere, but the 2116 02:02:48,080 --> 02:02:52,400 Speaker 1: National Network for Reproductive Rights, where we are in twelve states, 2117 02:02:52,760 --> 02:02:58,600 Speaker 1: it's called ddeser, but it's d, d, e, s, 2118 02:02:58,880 --> 02:03:03,960 Speaker 1: e, r. And you can find those, uh, in most 2119 02:03:04,040 --> 02:03:10,000 Speaker 1: of these states. And we can provide information regarding abortion 2120 02:03:11,160 --> 02:03:15,040 Speaker 1: if you write to us. And also, something I would 2121 02:03:15,080 --> 02:03:19,120 Speaker 1: like to say is that even after it's legalized, 2122 02:03:19,280 --> 02:03:23,560 Speaker 1: one must continue to ensure that these abortion services 2123 02:03:23,640 --> 02:03:28,160 Speaker 1: are, how do they say, implemented, and that 2124 02:03:28,480 --> 02:03:33,080 Speaker 1: they can reach all girls and women. That it 2125 02:03:33,200 --> 02:03:40,200 Speaker 1: must be guaranteed on paper and in practice. And, yes, 2126 02:03:40,360 --> 02:03:47,560 Speaker 1: with emphasis on reaching the most underserved and vulnerable populations. 2127 02:03:48,360 --> 02:03:52,760 Speaker 1: Mm hmm. All right, well, uh, I think that's probably 2128 02:03:52,800 --> 02:03:54,960 Speaker 1: going to do it for us today.
Erica, thank you 2129 02:03:55,120 --> 02:03:59,879 Speaker 1: so much for talking with us. It's a wealth of information. 2130 02:04:00,000 --> 02:04:05,200 Speaker 1: It's really valuable. Thank you. Yeah, and thank you all 2131 02:04:05,360 --> 02:04:23,440 Speaker 1: for listening. That's your episode for the day. Welcome to 2132 02:04:23,640 --> 02:04:27,520 Speaker 1: It Could Happen Here. I'm Garrison. With me today is Chris, 2133 02:04:28,240 --> 02:04:31,240 Speaker 1: and today we'll be talking about something that I was 2134 02:04:31,360 --> 02:04:34,160 Speaker 1: wanting to do for Pride Month, but time kind of 2135 02:04:34,240 --> 02:04:38,360 Speaker 1: got away from me. But we'll be talking with Noah Adams, 2136 02:04:38,600 --> 02:04:42,760 Speaker 1: who does research into the kind of crossover between autism 2137 02:04:42,960 --> 02:04:46,040 Speaker 1: and being trans. I know we've briefly mentioned some, 2138 02:04:46,200 --> 02:04:49,400 Speaker 1: like, rhetoric around this on our War on Trans People 2139 02:04:49,560 --> 02:04:52,920 Speaker 1: series episodes, and Noah was kind enough to reach out 2140 02:04:52,960 --> 02:04:56,120 Speaker 1: to me to be willing to discuss this more in depth. 2141 02:04:56,360 --> 02:05:01,120 Speaker 1: Greetings. Hello, thank you. It's always Pride Month 2142 02:05:01,160 --> 02:05:04,960 Speaker 1: somewhere in the summer. So, that's true. That's true. Um, so, 2143 02:05:05,280 --> 02:05:07,960 Speaker 1: I guess, let's, I first want to kind of hear 2144 02:05:08,160 --> 02:05:10,600 Speaker 1: how you personally kind of got into this 2145 02:05:10,720 --> 02:05:13,280 Speaker 1: field of research, and then maybe kind of clarify what 2146 02:05:13,600 --> 02:05:19,600 Speaker 1: exactly your field of research is. Sure. Um, well, 2147 02:05:19,800 --> 02:05:21,960 Speaker 1: I guess, where do you start?
I mean, I'm 2148 02:05:22,000 --> 02:05:25,760 Speaker 1: a trans person and I'm also autistic, so it's sort 2149 02:05:25,760 --> 02:05:29,440 Speaker 1: of a natural, yeah, crossover for me. Um, I got 2150 02:05:29,560 --> 02:05:36,760 Speaker 1: started in trans research, uh, or trans activism, doing suicidality work, 2151 02:05:37,560 --> 02:05:41,720 Speaker 1: um, it's such a long time ago now, but in, um, 2152 02:05:42,160 --> 02:05:44,560 Speaker 1: I think, two thousand and six, myself and my best 2153 02:05:44,600 --> 02:05:49,280 Speaker 1: friend cycled across Canada to bring awareness to trans suicidality, 2154 02:05:50,000 --> 02:05:53,480 Speaker 1: um, and in memoriam of a person, a friend, who 2155 02:05:53,840 --> 02:05:58,320 Speaker 1: committed suicide. Um. So, you know, we went to 2156 02:05:58,360 --> 02:06:02,440 Speaker 1: a lot of different communities in Canada, including Saskatoon, did 2157 02:06:02,560 --> 02:06:05,600 Speaker 1: talks, you know, did talks with local trans communities 2158 02:06:05,600 --> 02:06:09,560 Speaker 1: about suicidality prevention and stuff, uh, and met a 2159 02:06:09,600 --> 02:06:13,120 Speaker 1: lot of great people. Um. And then I came back 2160 02:06:13,360 --> 02:06:17,160 Speaker 1: and I did my Master of Social Work, also on 2161 02:06:17,280 --> 02:06:20,560 Speaker 1: trans suicidality research, kind of looking at how there's a 2162 02:06:20,560 --> 02:06:23,280 Speaker 1: lot of different research out there, and who knows, you know, 2163 02:06:23,640 --> 02:06:25,440 Speaker 1: there's a lot of different rates that are given, all 2164 02:06:25,560 --> 02:06:28,240 Speaker 1: high, and where are we supposed to, you know, 2165 02:06:28,400 --> 02:06:31,560 Speaker 1: fall on that? Um. And then I finished that, and 2166 02:06:31,840 --> 02:06:36,360 Speaker 1: I was kind of tired of doing suicidality work.
Yeah, 2167 02:06:36,800 --> 02:06:39,600 Speaker 1: so to speak, it's a little bit exhausting. Um. 2168 02:06:39,920 --> 02:06:42,800 Speaker 1: And, kind of, I have a much darker 2169 02:06:42,880 --> 02:06:46,760 Speaker 1: sense of humor than I used to. Yeah. Um, so, 2170 02:06:47,800 --> 02:06:51,760 Speaker 1: I was kind of drifting into autism work, 2171 02:06:51,880 --> 02:06:56,240 Speaker 1: and a friend, uh, Jake Pyne, who's a 2172 02:06:56,320 --> 02:07:00,960 Speaker 1: professor now at York University, um, suggested I kind of 2173 02:07:01,320 --> 02:07:04,320 Speaker 1: move into that area. And here I am. So, 2174 02:07:04,560 --> 02:07:07,600 Speaker 1: with the kind of crossover between being trans and autistic, 2175 02:07:07,800 --> 02:07:11,240 Speaker 1: and, uh, I guess I'm not the best, I 2176 02:07:11,440 --> 02:07:14,600 Speaker 1: don't know, I consciously don't kind of know 2177 02:07:14,720 --> 02:07:17,800 Speaker 1: what the most appropriate language is. Would you say, 2178 02:07:17,800 --> 02:07:20,040 Speaker 1: would you prefer to say that you, like, 2179 02:07:20,360 --> 02:07:24,440 Speaker 1: have autism or you are autistic? Um, I mean, I 2180 02:07:24,560 --> 02:07:29,240 Speaker 1: think it's pretty universal in the autism community to, uh, 2181 02:07:29,480 --> 02:07:32,920 Speaker 1: talk about identity-first language. So, yeah, exactly, autism kind 2182 02:07:32,960 --> 02:07:36,800 Speaker 1: of leads, and that's, yeah. I mean, I guess 2183 02:07:36,840 --> 02:07:41,400 Speaker 1: I'd say I'm autistic. Most people don't say person with autism. Yeah. Yeah. 2184 02:07:42,800 --> 02:07:46,200 Speaker 1: So, with that, how have you kind of, what 2185 02:07:46,320 --> 02:07:48,840 Speaker 1: initially got you to?
You know, because we see a 2186 02:07:48,880 --> 02:07:53,360 Speaker 1: lot of propaganda and stuff trying to almost, like, take 2187 02:07:53,400 --> 02:07:58,400 Speaker 1: away people's agency around both being trans and, 2188 02:07:58,840 --> 02:08:02,760 Speaker 1: um, and being autistic, right? There's a 2189 02:08:02,840 --> 02:08:05,560 Speaker 1: lot of propaganda, especially from the TERFs 2190 02:08:05,600 --> 02:08:09,160 Speaker 1: in the UK, who really started this out and really accelerated 2191 02:08:09,200 --> 02:08:12,680 Speaker 1: on this point. Um. And I mean, there's 2192 02:08:12,760 --> 02:08:17,000 Speaker 1: a whole bunch of basically autism exterminationist groups out there, 2193 02:08:17,640 --> 02:08:19,040 Speaker 1: um, and a whole bunch of other kinds 2194 02:08:19,040 --> 02:08:24,600 Speaker 1: of problems around this. When these kinds of 2195 02:08:24,680 --> 02:08:27,840 Speaker 1: crossovers started happening, what 2196 02:08:27,960 --> 02:08:29,840 Speaker 1: kind of prompted you to see this and be like, hey, 2197 02:08:30,120 --> 02:08:32,840 Speaker 1: here's this thing that needs to be researched, and here's 2198 02:08:32,840 --> 02:08:35,200 Speaker 1: how I'm going to go about it? Because you see 2199 02:08:35,200 --> 02:08:37,560 Speaker 1: a lot of people talk about this thing, but it's 2200 02:08:37,600 --> 02:08:41,400 Speaker 1: always generally to, like, attack trans people, um, or attack 2201 02:08:41,440 --> 02:08:45,000 Speaker 1: autistic people. I mean, you know, there's a lot there. 2202 02:08:45,120 --> 02:08:47,960 Speaker 1: I mean, I would say, you know, my sort of 2203 02:08:48,480 --> 02:08:51,720 Speaker 1: seedling of interest is, I just really don't like injustice.
2204 02:08:52,440 --> 02:08:54,640 Speaker 1: I really, I mean, that's such a broad thing to 2205 02:08:54,680 --> 02:08:58,160 Speaker 1: say, but, you know, injustice against people for 2206 02:08:58,240 --> 02:08:59,600 Speaker 1: the sake of who they are really just kind of 2207 02:08:59,640 --> 02:09:03,400 Speaker 1: pisses me off. Um. And, you know, I mean, when 2208 02:09:03,440 --> 02:09:05,400 Speaker 1: you pick a research topic, you kind of pick something 2209 02:09:05,440 --> 02:09:08,320 Speaker 1: that you're willing to spend hours and hours and hours 2210 02:09:08,360 --> 02:09:12,560 Speaker 1: in a library, or, you know, um, a virtual library, 2211 02:09:12,760 --> 02:09:14,960 Speaker 1: what have you, just kind of plowing away at it. 2212 02:09:15,720 --> 02:09:17,960 Speaker 1: And it seemed like I was pissed off enough 2213 02:09:18,000 --> 02:09:20,200 Speaker 1: at the injustice of the way autistic people, the 2214 02:09:20,240 --> 02:09:22,640 Speaker 1: way trans people are treated, and especially the way I 2215 02:09:22,720 --> 02:09:25,360 Speaker 1: think we're ignored by both of these, by, 2216 02:09:25,440 --> 02:09:29,600 Speaker 1: you know, TERFs and trans-exclusionary folks. Um.
2217 02:09:31,120 --> 02:09:34,080 Speaker 1: I really feel like we're an easy target, where, you know, 2218 02:09:34,160 --> 02:09:36,760 Speaker 1: autistic people, or, for that matter, people with 2219 02:09:36,840 --> 02:09:39,840 Speaker 1: developmental disabilities, or people with any kind of, 2220 02:09:40,320 --> 02:09:44,600 Speaker 1: um, and I'm using, having finger quotes on, any 2221 02:09:44,680 --> 02:09:48,040 Speaker 1: kind of impairment-based disability, feel like a soft target 2222 02:09:48,320 --> 02:09:51,640 Speaker 1: for people that just want to attack trans folks, right? Like, 2223 02:09:51,680 --> 02:09:56,520 Speaker 1: because we're a group that, it's so incomprehensible 2224 02:09:56,560 --> 02:09:58,760 Speaker 1: to them that we would be able to speak for ourselves. 2225 02:10:00,280 --> 02:10:03,480 Speaker 1: So, you know, I mean, I don't even 2226 02:10:03,600 --> 02:10:08,360 Speaker 1: think that, I don't think they care about autistic people, 2227 02:10:08,400 --> 02:10:11,000 Speaker 1: but I don't think it even occurs to them 2228 02:10:11,040 --> 02:10:13,880 Speaker 1: that autistic people, and trans autistic folks, might 2229 02:10:13,920 --> 02:10:18,440 Speaker 1: have something to say for themselves. So, what's kind of 2230 02:10:18,640 --> 02:10:22,960 Speaker 1: the scope of your research been, for however 2231 02:10:23,040 --> 02:10:26,240 Speaker 1: long you've been working on this? It's for 2232 02:10:26,880 --> 02:10:31,040 Speaker 1: a PhD program, correct? Yeah. Well, I started doing, I 2233 02:10:31,120 --> 02:10:34,280 Speaker 1: wrote a book on trans autistic folks.
That's sort of 2234 02:10:34,320 --> 02:10:39,560 Speaker 1: a series of interviews with folks, um, and, you know, 2235 02:10:39,800 --> 02:10:42,160 Speaker 1: I mean, I just asked them, like, about their lives, 2236 02:10:42,240 --> 02:10:43,800 Speaker 1: and what it was like to be trans, and what 2237 02:10:43,880 --> 02:10:45,520 Speaker 1: it was like to be autistic, and what it's like 2238 02:10:45,640 --> 02:10:48,160 Speaker 1: to, you know, try to transition as an autistic person, 2239 02:10:48,200 --> 02:10:50,000 Speaker 1: and a lot of stuff came out of that around, 2240 02:10:50,440 --> 02:10:52,600 Speaker 1: you know, how difficult it can be for folks that 2241 02:10:52,680 --> 02:10:55,520 Speaker 1: are both to access trans healthcare and to sort 2242 02:10:55,520 --> 02:10:58,360 Speaker 1: of navigate their way in the world. Um. And this, 2243 02:10:58,920 --> 02:11:01,200 Speaker 1: my PhD work, it's sort of an outgrowth 2244 02:11:01,240 --> 02:11:06,360 Speaker 1: of that. So I'm looking specifically at how trans autistic 2245 02:11:06,520 --> 02:11:10,720 Speaker 1: community groups, sort of grassroots group formations, are forming, 2246 02:11:10,880 --> 02:11:15,440 Speaker 1: and what their goals are. How do you, like, 2247 02:11:16,680 --> 02:11:19,160 Speaker 1: how do you go about, like, ethical research into this topic? 2248 02:11:19,200 --> 02:11:22,360 Speaker 1: Because, definitely, like, you know, there's a certain way 2249 02:11:22,360 --> 02:11:24,440 Speaker 1: to, like, there's a certain way to be like, I'm 2250 02:11:24,480 --> 02:11:28,160 Speaker 1: researching autism and trans people, to be like, huh, that's 2251 02:11:28,200 --> 02:11:30,200 Speaker 1: a little bit of a side-eye, right? Because 2252 02:11:30,240 --> 02:11:32,160 Speaker 1: of how some of, because of how, like, the TERF 2253 02:11:32,200 --> 02:11:34,760 Speaker 1: groups talk about it.
Obviously being trans and autistic are 2254 02:11:34,840 --> 02:11:38,800 Speaker 1: two completely different things. But, like, yeah, I was 2255 02:11:38,840 --> 02:11:42,040 Speaker 1: definitely wondering, like, what types 2256 02:11:42,080 --> 02:11:44,960 Speaker 1: of, like, ways does ethical research go about it, so that 2257 02:11:45,040 --> 02:11:47,920 Speaker 1: you actually understand people when you're talking to them? 2258 02:11:47,960 --> 02:11:50,880 Speaker 1: It's not about, like, here, we need to, 2259 02:11:51,040 --> 02:11:53,680 Speaker 1: we need to, like, solve these problems, because they're not 2260 02:11:53,720 --> 02:11:56,560 Speaker 1: problems to be solved, but it's research into living 2261 02:11:56,640 --> 02:12:00,600 Speaker 1: people who are, like, having lives. Yeah, I mean, that's 2262 02:12:00,600 --> 02:12:03,880 Speaker 1: a great question. Um, I'm sorting through that 2263 02:12:04,040 --> 02:12:06,560 Speaker 1: myself right now, because I'm just, you know, working through 2264 02:12:06,640 --> 02:12:11,840 Speaker 1: my research ethics proposal. Um, but I think you just 2265 02:12:11,920 --> 02:12:14,640 Speaker 1: have to be really honest and open, and really 2266 02:12:14,680 --> 02:12:16,440 Speaker 1: write all this stuff out, like, how are you going 2267 02:12:16,480 --> 02:12:18,120 Speaker 1: to contact people and what are you going to talk 2268 02:12:18,160 --> 02:12:20,880 Speaker 1: to them about, and showing other people what you're doing 2269 02:12:21,440 --> 02:12:25,880 Speaker 1: and being very open to that process, if that makes sense. Yeah.
Yeah, 2270 02:12:26,840 --> 02:12:30,120 Speaker 1: in what types of ways do you see the crossover, 2271 02:12:30,280 --> 02:12:33,720 Speaker 1: kind of, between ableism and transphobia? And, like, how 2272 02:12:33,880 --> 02:12:36,920 Speaker 1: have you seen that crossover be used to kind 2273 02:12:36,960 --> 02:12:39,480 Speaker 1: of hurt both trans people and people who are autistic, 2274 02:12:39,560 --> 02:12:42,880 Speaker 1: and people who are both? You know, I mean, I 2275 02:12:42,960 --> 02:12:46,880 Speaker 1: think the most explicit way has been, you know, 2276 02:12:46,960 --> 02:12:49,800 Speaker 1: I see a lot of articles by the Guardian or 2277 02:12:49,880 --> 02:12:53,480 Speaker 1: the Daily Mail, you know, bringing up the specter of 2278 02:12:53,760 --> 02:12:58,800 Speaker 1: autistic people being overrepresented, overrepresented again in finger quotes, uh, 2279 02:12:59,720 --> 02:13:02,880 Speaker 1: among the trans population going to gender clinics, and 2280 02:13:03,120 --> 02:13:06,560 Speaker 1: there isn't ever any explanation after they say that. 2281 02:13:06,680 --> 02:13:11,960 Speaker 1: The scaremongering is just saying there's autistic people supposedly overrepresented 2282 02:13:12,120 --> 02:13:16,760 Speaker 1: among trans folks, oh no, as if it doesn't 2283 02:13:16,800 --> 02:13:20,880 Speaker 1: need saying. It's assumed that that's appalling, you know, 2284 02:13:20,960 --> 02:13:24,200 Speaker 1: but I'd like a little bit more explanation.
There's 2285 02:13:24,360 --> 02:13:26,760 Speaker 1: a lot said by how they frame 2286 02:13:26,840 --> 02:13:28,560 Speaker 1: it and, like, what they 2287 02:13:28,600 --> 02:13:32,200 Speaker 1: don't say, um, really. Like, it's all framed 2288 02:13:32,320 --> 02:13:36,360 Speaker 1: as if this is, you know, something that everyone recognizes 2289 02:13:36,480 --> 02:13:40,800 Speaker 1: as, like, a problem. Um, and countering that is really 2290 02:13:40,840 --> 02:13:43,760 Speaker 1: challenging because it is. Yeah, because, like, again, you're 2291 02:13:43,760 --> 02:13:46,520 Speaker 1: doing research into this specific crossover, 2292 02:13:47,120 --> 02:13:49,760 Speaker 1: and what kinds of stuff have you kind of learned 2293 02:13:50,040 --> 02:13:54,600 Speaker 1: throughout your research about this? I mean, it's interesting, 2294 02:13:54,720 --> 02:13:58,440 Speaker 1: like, that attitude is also represented in the academic literature. 2295 02:13:58,520 --> 02:14:01,400 Speaker 1: Like, over the, I'd say over the 2296 02:14:01,520 --> 02:14:05,600 Speaker 1: last five years, the literature on the crossover of 2297 02:14:05,720 --> 02:14:09,240 Speaker 1: autism and trans folks has, like, skyrocketed. Like, I 2298 02:14:09,320 --> 02:14:13,200 Speaker 1: always say, in 2020 alone, something like a hundred 2299 02:14:13,240 --> 02:14:16,760 Speaker 1: and fifty articles were published, whereas two years before that, 2300 02:14:16,960 --> 02:14:21,760 Speaker 1: maybe twenty. Um, and the vast majority of them 2301 02:14:22,160 --> 02:14:27,800 Speaker 1: are mentioning the, um, the co-occurrence, uh, in passing.
2302 02:14:28,080 --> 02:14:30,320 Speaker 1: So they're saying, oh, well, we read these other things 2303 02:14:30,680 --> 02:14:33,960 Speaker 1: where autism and trans identity co-occur, so thus you 2304 02:14:34,000 --> 02:14:37,720 Speaker 1: should be very careful prior to providing trans healthcare because 2305 02:14:37,800 --> 02:14:39,760 Speaker 1: they might be autistic. Yeah. That's the other thing I 2306 02:14:39,800 --> 02:14:43,440 Speaker 1: wanted to talk about, the medical gatekeeping aspect, 2307 02:14:44,160 --> 02:14:47,560 Speaker 1: and, like, you especially see this with TERFs. You know, 2308 02:14:47,600 --> 02:14:50,480 Speaker 1: there's a lot of, like, infantilization with the TERF rhetoric 2309 02:14:50,560 --> 02:14:53,360 Speaker 1: around this, and then that kind of leads to this 2310 02:14:53,480 --> 02:14:56,760 Speaker 1: type of medical gatekeeping. Yeah, I just think, you know, 2311 02:14:56,880 --> 02:14:58,440 Speaker 1: I saw this in the 2312 02:14:58,520 --> 02:15:01,120 Speaker 1: book with the interviews I did, and I see 2313 02:15:01,120 --> 02:15:04,040 Speaker 1: it in so many other places, and especially in conversation with folks, 2314 02:15:04,760 --> 02:15:07,960 Speaker 1: is that, you know, the problem seems to be, if 2315 02:15:08,080 --> 02:15:12,600 Speaker 1: you tell an unexpected narrative to the person in charge 2316 02:15:12,640 --> 02:15:15,480 Speaker 1: of gatekeeping you for transgender healthcare, you're going to 2317 02:15:15,560 --> 02:15:18,320 Speaker 1: make them nervous. And if you make them nervous, they're 2318 02:15:18,320 --> 02:15:21,080 Speaker 1: not necessarily gonna say no, but they're gonna say, 2319 02:15:21,920 --> 02:15:24,400 Speaker 1: to themselves at least, oh, let's wait and see.
And 2320 02:15:24,640 --> 02:15:28,960 Speaker 1: for autistic folks, waiting and seeing might mean forever, right? 2321 02:15:29,000 --> 02:15:31,600 Speaker 1: Like, I talked to folks in the book, uh, that, 2322 02:15:32,680 --> 02:15:35,760 Speaker 1: you know, without mentioning actual cities, because of the, 2323 02:15:36,320 --> 02:15:39,080 Speaker 1: you know, the particular situation of this person. But let's, 2324 02:15:39,160 --> 02:15:42,760 Speaker 1: you know, let's say he lived in New Orleans, uh, 2325 02:15:42,880 --> 02:15:45,200 Speaker 1: and he wasn't able to access trans healthcare in New 2326 02:15:45,320 --> 02:15:49,720 Speaker 1: Orleans because it just wasn't available to folks who were autistic. 2327 02:15:50,320 --> 02:15:54,160 Speaker 1: And so he ended up moving to Chicago, 2328 02:15:54,280 --> 02:15:57,000 Speaker 1: you know, he moved to Chicago specifically to get trans healthcare, 2329 02:15:57,040 --> 02:16:00,800 Speaker 1: which shows a level of capacity that they're implying, in 2330 02:16:00,880 --> 02:16:03,080 Speaker 1: the context of trans healthcare in New Orleans, that 2331 02:16:03,200 --> 02:16:05,800 Speaker 1: he's not capable of. But, you know, he can uproot 2332 02:16:05,840 --> 02:16:09,240 Speaker 1: his whole life, move across the country, find a doctor.
2333 02:16:09,280 --> 02:16:12,080 Speaker 1: And then he talked to the doctor at an informed 2334 02:16:12,120 --> 02:16:15,680 Speaker 1: consent health clinic in Chicago, and they were like, oh yeah, 2335 02:16:15,800 --> 02:16:17,640 Speaker 1: we knew that you were from this city, and we 2336 02:16:17,720 --> 02:16:19,800 Speaker 1: knew that you were autistic before you told us, because 2337 02:16:19,840 --> 02:16:23,080 Speaker 1: there's this whole pipeline of autistic trans folks making the 2338 02:16:23,200 --> 02:16:27,280 Speaker 1: migration to Chicago from this particular city because they can't 2339 02:16:27,280 --> 02:16:31,080 Speaker 1: get healthcare. I mean, like, you know, I'm also thinking 2340 02:16:31,120 --> 02:16:34,680 Speaker 1: about, you know, like, kids trying to come out as trans 2341 02:16:35,600 --> 02:16:38,640 Speaker 1: um, who have autism or have any other kind of, 2342 02:16:39,680 --> 02:16:44,280 Speaker 1: you know, quote unquote developmental disability, um, and, like, just 2343 02:16:44,760 --> 02:16:47,360 Speaker 1: all of the ways that that can be used to 2344 02:16:47,520 --> 02:16:51,840 Speaker 1: gaslight kids about their gender identity. Um, I know in 2345 02:16:51,959 --> 02:16:54,680 Speaker 1: your book you mentioned stuff around, like, self-discovery and 2346 02:16:55,000 --> 02:16:58,480 Speaker 1: coming out and issues with family. Um, what kind of 2347 02:16:58,600 --> 02:17:00,680 Speaker 1: things have you heard about that 2348 02:17:00,760 --> 02:17:06,160 Speaker 1: in terms of how kids 2349 02:17:06,200 --> 02:17:10,400 Speaker 1: are figuring out gender stuff while also having this 2350 02:17:10,480 --> 02:17:13,400 Speaker 1: whole other thing that people use to kind of, you know, 2351 02:17:13,680 --> 02:17:17,000 Speaker 1: add on to their experience?
I mean, I think, you know, 2352 02:17:17,120 --> 02:17:19,800 Speaker 1: one of the things I noticed, especially in, you know, 2353 02:17:20,040 --> 02:17:23,600 Speaker 1: sort of trans autistic autobiographies, is that, you know, gender 2354 02:17:23,680 --> 02:17:26,280 Speaker 1: doesn't really make an inherent sort of sense to a 2355 02:17:26,320 --> 02:17:28,840 Speaker 1: lot of autistic folks. I mean, it doesn't make sense 2356 02:17:28,879 --> 02:17:31,120 Speaker 1: to me too. But, I mean, I have something 2357 02:17:31,200 --> 02:17:33,000 Speaker 1: going on in my brain. I don't know what it is. 2358 02:17:33,040 --> 02:17:36,920 Speaker 1: I don't think it's autism, but yeah, gender's never made 2359 02:17:36,959 --> 02:17:40,240 Speaker 1: sense to me either. And I think, like, 2360 02:17:40,879 --> 02:17:43,240 Speaker 1: for autistic people especially, where you come across things that 2361 02:17:43,320 --> 02:17:46,800 Speaker 1: don't make an inherent sort of sense, it's difficult to 2362 02:17:46,879 --> 02:17:49,560 Speaker 1: accept them. Like, so much in the world doesn't make 2363 02:17:49,600 --> 02:17:51,840 Speaker 1: inherent sense, but that can be a real sticking point 2364 02:17:51,879 --> 02:17:55,040 Speaker 1: for autistic folks. So, you know, what 2365 02:17:55,120 --> 02:17:57,480 Speaker 1: I seem to see a lot of is that, by 2366 02:17:57,560 --> 02:17:59,720 Speaker 1: the time folks come out, well, first of all, 2367 02:17:59,720 --> 02:18:02,200 Speaker 1: although, you know, I don't want to 2368 02:18:02,280 --> 02:18:04,240 Speaker 1: quote any particular kind of research on it because I 2369 02:18:04,280 --> 02:18:06,480 Speaker 1: think the jury is still out, but it seems like 2370 02:18:07,160 --> 02:18:10,400 Speaker 1: autistic people are more likely to identify as non-binary 2371 02:18:11,480 --> 02:18:15,040 Speaker 1: um, or to just not identify with gender at all. Um.
2372 02:18:15,480 --> 02:18:17,720 Speaker 1: And it does seem like, by the time folks come 2373 02:18:17,760 --> 02:18:21,840 Speaker 1: out as trans, whatever, you know, permutation of that there 2374 02:18:22,000 --> 02:18:25,200 Speaker 1: is for them, um, they've tried just about every 2375 02:18:25,240 --> 02:18:28,320 Speaker 1: other identity they can, you know, they can try out. 2376 02:18:28,920 --> 02:18:32,480 Speaker 1: Like, especially, you know, I mean, we're aware that there 2377 02:18:32,520 --> 02:18:37,440 Speaker 1: are social stigmas and social expectations around gender. 2378 02:18:38,280 --> 02:18:41,280 Speaker 1: So I think, for a lot of autistic folks, 2379 02:18:41,360 --> 02:18:43,920 Speaker 1: they're going to try to fit that, even though it 2380 02:18:43,959 --> 02:18:45,520 Speaker 1: doesn't make sense to them, they're going to try to 2381 02:18:45,600 --> 02:18:48,160 Speaker 1: fit within that because they know it exists, so that 2382 02:18:48,320 --> 02:18:50,520 Speaker 1: by the time they come out as trans, like, 2383 02:18:52,160 --> 02:18:56,280 Speaker 1: or male or female or what have you, like, they 2384 02:18:56,440 --> 02:19:01,520 Speaker 1: pretty well know. Does that make any kind of sense? Yeah. No, 2385 02:19:01,640 --> 02:19:03,480 Speaker 1: I mean, I think that there's a lot of misconceptions 2386 02:19:03,520 --> 02:19:07,160 Speaker 1: people have about, I mean, both being trans and 2387 02:19:07,320 --> 02:19:11,800 Speaker 1: being autistic, let alone being both. Is 2388 02:19:11,840 --> 02:19:16,320 Speaker 1: there any, like, yeah, like, what's the common misconceptions about this, 2389 02:19:16,440 --> 02:19:18,800 Speaker 1: kind of on, like, a broader level, would you 2390 02:19:18,879 --> 02:19:23,640 Speaker 1: like to dispel? Um?
Sure, yeah. I mean, I think 2391 02:19:23,640 --> 02:19:27,000 Speaker 1: a lot of people get told, you know, you can't 2392 02:19:27,040 --> 02:19:30,200 Speaker 1: be autistic because you're too articulate, 2393 02:19:30,720 --> 02:19:32,440 Speaker 1: or, you know, you have too much of an opinion. 2394 02:19:32,480 --> 02:19:35,400 Speaker 1: Autistic people can't have an opinion of themselves, of their 2395 02:19:35,440 --> 02:19:38,720 Speaker 1: own life. And yeah, that's gross. I'm paraphrasing, but I 2396 02:19:38,760 --> 02:19:43,320 Speaker 1: think that's what it is. Yeah, absolutely. Um, and then, 2397 02:19:43,840 --> 02:19:46,720 Speaker 1: you know, trans folks get told, it's not uncommon to 2398 02:19:46,760 --> 02:19:49,680 Speaker 1: get told, oh, well, you can't be autistic because you're trans, 2399 02:19:50,640 --> 02:19:53,879 Speaker 1: so you're sort of in this no-person's land. 2400 02:19:55,640 --> 02:19:59,120 Speaker 1: That's such a, it's such, like, an ontological attack on 2401 02:19:59,240 --> 02:20:03,680 Speaker 1: someone's being. It's really, like, that's, you know, like, 2402 02:20:04,879 --> 02:20:08,240 Speaker 1: you know, already, like, again, just being solely trans or autistic, 2403 02:20:08,320 --> 02:20:10,480 Speaker 1: you get some of that, and then together it's like 2404 02:20:10,600 --> 02:20:12,640 Speaker 1: it's just attacking every kind of part of you that 2405 02:20:12,720 --> 02:20:15,760 Speaker 1: you're trying to figure out. Um, well, I mean, 2406 02:20:15,840 --> 02:20:18,760 Speaker 1: in terms of attacking people rhetorically, it's sort of 2407 02:20:19,200 --> 02:20:22,120 Speaker 1: the perfect weapon, because you can make autistic trans people 2408 02:20:22,160 --> 02:20:27,040 Speaker 1: into whatever you want them to be, convenient to you.
What 2409 02:20:27,320 --> 02:20:30,200 Speaker 1: kinds of stuff do you think people should know about 2410 02:20:30,280 --> 02:20:33,320 Speaker 1: this to help, you know, 2411 02:20:33,360 --> 02:20:35,400 Speaker 1: either, like, counter some of, like, the rhetoric around this, 2412 02:20:35,840 --> 02:20:39,480 Speaker 1: or just to gain better, like, personal understanding, right, if 2413 02:20:39,520 --> 02:20:41,080 Speaker 1: they have, you know, people in their 2414 02:20:41,120 --> 02:20:43,800 Speaker 1: lives who are like this, or maybe they suspect that 2415 02:20:43,959 --> 02:20:47,120 Speaker 1: they're trans and they're autistic? Like, what 2416 02:20:47,240 --> 02:20:48,720 Speaker 1: kinds of stuff would you like 2417 02:20:48,920 --> 02:20:53,160 Speaker 1: people to be more aware of about this intersection? Well, 2418 02:20:53,280 --> 02:20:55,920 Speaker 1: I guess I remember a story someone from the book 2419 02:20:55,959 --> 02:20:59,680 Speaker 1: told me, about how, you know, his 2420 02:20:59,800 --> 02:21:02,000 Speaker 1: best friend is trans and autistic as well 2421 02:21:02,200 --> 02:21:05,080 Speaker 1: and has a number of physical disabilities, and he's 2422 02:21:05,160 --> 02:21:07,440 Speaker 1: sort of the caretaker for him. Um, 2423 02:21:08,120 --> 02:21:11,000 Speaker 1: and he was kind of talking to him about how, oh, well, 2424 02:21:11,120 --> 02:21:12,720 Speaker 1: I don't know if I'm trans, and I don't know 2425 02:21:12,800 --> 02:21:14,440 Speaker 1: if I should, you know, if I should use that 2426 02:21:14,640 --> 02:21:17,680 Speaker 1: label, or, like, you know, maybe it's not the right 2427 02:21:17,760 --> 02:21:21,080 Speaker 1: thing to do or it's bad or something.
And his 2428 02:21:21,200 --> 02:21:23,520 Speaker 1: friend was like, well, you know, you're not 2429 02:21:23,640 --> 02:21:25,840 Speaker 1: a hormone vampire. Like, you're not gonna, like, suck 2430 02:21:26,200 --> 02:21:28,840 Speaker 1: the hormones out of somebody else and hurt them 2431 02:21:28,879 --> 02:21:31,440 Speaker 1: by taking away their testosterone. Like, oh, 2432 02:21:31,959 --> 02:21:36,959 Speaker 1: I wish it worked like that! Um, you know, I 2433 02:21:37,000 --> 02:21:41,320 Speaker 1: would be a trans vampire. Um, this is about 2434 02:21:41,360 --> 02:21:43,960 Speaker 1: you and what makes you comfortable. 2435 02:21:44,120 --> 02:21:47,120 Speaker 1: You're not hurting anybody else by being yourself. And I think, 2436 02:21:47,840 --> 02:21:52,959 Speaker 1: you know, autistic folks, like anybody else, you know, worry 2437 02:21:53,040 --> 02:21:57,760 Speaker 1: about, I mean, we're just as susceptible to the 2438 02:21:57,879 --> 02:22:01,800 Speaker 1: attacks on trans folks, yeah, as anybody else, right? Like, 2439 02:22:01,920 --> 02:22:04,040 Speaker 1: and you worry that, like, well, maybe this isn't the 2440 02:22:04,160 --> 02:22:06,440 Speaker 1: right thing to do, but, like, what are you hurting 2441 02:22:06,640 --> 02:22:09,800 Speaker 1: by exploring it? That doesn't mean you have to 2442 02:22:09,879 --> 02:22:12,160 Speaker 1: be trans, or you have to transition, or you 2443 02:22:12,200 --> 02:22:15,120 Speaker 1: can't change your mind, like, but it doesn't 2444 02:22:15,160 --> 02:22:18,200 Speaker 1: hurt anybody, even you, to just be open to it. 2445 02:22:18,480 --> 02:22:23,000 Speaker 1: Even just, like, temporarily trying out different names or pronouns, right, 2446 02:22:23,040 --> 02:22:25,959 Speaker 1: it can be, like, such a big deal, um, 2447 02:22:26,800 --> 02:22:30,200 Speaker 1: and it can be very incidental. Like, it doesn't.
It 2448 02:22:30,240 --> 02:22:33,200 Speaker 1: doesn't need to be so cataclysmic. Right, that's something that 2449 02:22:33,280 --> 02:22:38,000 Speaker 1: you can experiment with, and it's fine. Right, yeah, 2450 02:22:38,280 --> 02:22:41,320 Speaker 1: you don't lock yourself into anything. Um, but of course, 2451 02:22:41,360 --> 02:22:43,840 Speaker 1: you know, when it's about your personal sense of identity, 2452 02:22:44,120 --> 02:22:47,240 Speaker 1: of course it feels much bigger. Well, I think 2453 02:22:47,320 --> 02:22:49,760 Speaker 1: people worry about what other folks, I mean, obviously 2454 02:22:49,800 --> 02:22:52,840 Speaker 1: people worry about what other folks will think and what it'll mean 2455 02:22:52,959 --> 02:22:56,040 Speaker 1: for them. Um, I don't know. I mean, it 2456 02:22:56,080 --> 02:22:58,960 Speaker 1: seems like a strange comparison to make, but I don't know 2457 02:22:59,040 --> 02:23:03,480 Speaker 1: if you've seen Crimes of the Future? I've not yet. Oh, 2458 02:23:03,560 --> 02:23:06,800 Speaker 1: it's really good. It's the most recent David 2459 02:23:06,840 --> 02:23:10,240 Speaker 1: Cronenberg movie. And there's this great, I'm going to give 2460 02:23:10,280 --> 02:23:13,000 Speaker 1: away the end of the movie, so spoilers. I 2461 02:23:13,120 --> 02:23:16,000 Speaker 1: know, we're turning this into a movie podcast, what I've 2462 02:23:16,000 --> 02:23:19,000 Speaker 1: always wanted.
There's this great scene at the end, though, 2463 02:23:19,040 --> 02:23:22,120 Speaker 1: where Viggo Mortensen is, like, in this, he has this 2464 02:23:22,240 --> 02:23:25,440 Speaker 1: special, like, very David Cronenberg-y bone chair that he has 2465 02:23:25,520 --> 02:23:27,480 Speaker 1: to, like, be in to move him around while he's eating, 2466 02:23:28,200 --> 02:23:31,000 Speaker 1: and he finally is convinced by his partner to, like, 2467 02:23:31,480 --> 02:23:35,000 Speaker 1: try the plastic chocolate bar that's, you know, supposed to 2468 02:23:35,080 --> 02:23:37,440 Speaker 1: be, like, it's a whole digestive thing, I won't get 2469 02:23:37,480 --> 02:23:40,080 Speaker 1: into it. But, you know, there's this moment of realization, 2470 02:23:40,160 --> 02:23:41,960 Speaker 1: and, like, he's been avoiding this for the 2471 02:23:42,000 --> 02:23:44,280 Speaker 1: whole movie, and he, like, tries it, and he's eating 2472 02:23:44,320 --> 02:23:48,720 Speaker 1: it, and suddenly there's this realization moment on 2473 02:23:48,800 --> 02:23:52,280 Speaker 1: his face where he's like, oh, this didn't have to 2474 02:23:52,320 --> 02:23:57,600 Speaker 1: be so difficult. Like, society doesn't want me to 2475 02:23:57,680 --> 02:23:59,520 Speaker 1: do this, and it's seen as a crime, and 2476 02:23:59,600 --> 02:24:02,760 Speaker 1: it's seen as terrible. But actually, when you cross 2477 02:24:02,840 --> 02:24:08,000 Speaker 1: that Rubicon, it wasn't as bad as you thought.
Yeah, 2478 02:24:08,200 --> 02:24:11,320 Speaker 1: I mean, especially, even if 2479 02:24:11,320 --> 02:24:14,200 Speaker 1: you're not, like, coming out to everybody you know at 2480 02:24:14,200 --> 02:24:16,000 Speaker 1: the same time, right, you can start off 2481 02:24:16,040 --> 02:24:17,480 Speaker 1: with a small group of people that you know are 2482 02:24:17,920 --> 02:24:20,000 Speaker 1: gonna be with you, and you try it out with 2483 02:24:20,080 --> 02:24:22,920 Speaker 1: them, and if you like it, then great, that's 2484 02:24:23,000 --> 02:24:25,160 Speaker 1: a really good sign. If you start it 2485 02:24:25,200 --> 02:24:26,840 Speaker 1: and you're like, this doesn't feel right, then you 2486 02:24:26,920 --> 02:24:28,680 Speaker 1: don't need to commit. Like, it's not a thing, right? 2487 02:24:29,600 --> 02:24:33,480 Speaker 1: That Rubicon can feel so big sometimes, yeah, and it 2488 02:24:33,600 --> 02:24:36,680 Speaker 1: feels like you're jumping across a giant, like, the 2489 02:24:36,760 --> 02:24:39,560 Speaker 1: Grand Canyon, but really all it is is you're stepping across, 2490 02:24:40,200 --> 02:24:43,040 Speaker 1: you know, a small stream, and you can step right 2491 02:24:43,080 --> 02:24:47,320 Speaker 1: back across there if you didn't like it. Yeah. So, 2492 02:24:47,560 --> 02:24:49,440 Speaker 1: what kind of things would we like to see happen around, 2493 02:24:49,520 --> 02:24:53,119 Speaker 1: like, the medical gatekeeping so that it's less fucked up? 2494 02:24:54,040 --> 02:24:56,520 Speaker 1: I mean, you know, I'm actually at 2495 02:24:56,720 --> 02:25:01,840 Speaker 1: a conference in Belgium right now for trans health, sort 2496 02:25:01,840 --> 02:25:07,000 Speaker 1: of medical trans health stuff.
Um, and, you know, I mean, 2497 02:25:07,040 --> 02:25:08,720 Speaker 1: I think one of the things I keep coming back 2498 02:25:08,800 --> 02:25:11,920 Speaker 1: to is, you don't need to treat autistic people in 2499 02:25:12,000 --> 02:25:14,280 Speaker 1: the realm of trans healthcare any differently than you do 2500 02:25:14,360 --> 02:25:18,360 Speaker 1: anyone else, anywhere else. Like, especially in the gatekeeper 2501 02:25:18,440 --> 02:25:22,440 Speaker 1: model, we have, like, either you have the capacity 2502 02:25:22,520 --> 02:25:25,680 Speaker 1: to consent or you don't. That's the test, 2503 02:25:26,280 --> 02:25:29,600 Speaker 1: and I have lots of thoughts about that, but that's 2504 02:25:29,640 --> 02:25:32,480 Speaker 1: for another day. But, you know, whether you meet those 2505 02:25:32,560 --> 02:25:34,640 Speaker 1: tests or not, it should not be any different just 2506 02:25:34,720 --> 02:25:38,120 Speaker 1: because you're autistic. Um, would you like to, I guess, talk 2507 02:25:38,560 --> 02:25:42,320 Speaker 1: briefly about your book? Um, you know, what it's like, 2508 02:25:42,600 --> 02:25:44,320 Speaker 1: what the scope of it is, where people can 2509 02:25:44,400 --> 02:25:49,760 Speaker 1: find it? Um, Trans and Autistic: Stories 2510 02:25:49,760 --> 02:25:53,920 Speaker 1: from Life at the Intersection, from Jessica Kingsley Publishers. Um, 2511 02:25:54,440 --> 02:25:57,240 Speaker 1: it was out in... I think people can find it 2512 02:25:57,280 --> 02:25:59,760 Speaker 1: on Amazon or wherever you buy books. I'm sure Powell's 2513 02:25:59,800 --> 02:26:05,040 Speaker 1: bookstore over there in Portland has it. Um, yeah, 2514 02:26:05,160 --> 02:26:07,360 Speaker 1: it's a series of interviews with folks who are 2515 02:26:07,600 --> 02:26:10,640 Speaker 1: trans and autistic.
I sat down with folks and 2516 02:26:10,920 --> 02:26:13,200 Speaker 1: asked them about their life and what's going on 2517 02:26:13,320 --> 02:26:15,880 Speaker 1: and what that looks like, and then I sort of, uh, 2518 02:26:17,000 --> 02:26:19,879 Speaker 1: you know, tried to transcribe that into a text, 2519 02:26:19,959 --> 02:26:21,959 Speaker 1: into a narrative form, and put that in a book, 2520 02:26:22,000 --> 02:26:25,280 Speaker 1: and here we are. That sounds wonderful. Um, I see 2521 02:26:25,320 --> 02:26:27,359 Speaker 1: the listing on Amazon 2522 02:26:27,520 --> 02:26:31,240 Speaker 1: dot CA for our Canadian folks 2523 02:26:31,280 --> 02:26:35,480 Speaker 1: as well. Um, yeah, thank you so much for talking 2524 02:26:35,560 --> 02:26:38,240 Speaker 1: about this. Um, are there any other kind of random 2525 02:26:38,320 --> 02:26:39,880 Speaker 1: thoughts that you would like to 2526 02:26:39,959 --> 02:26:44,480 Speaker 1: mention that we haven't brought up yet? Sure. Um, 2527 02:26:45,320 --> 02:26:47,800 Speaker 1: you know, I always like to plug Reubs Walsh's work. 2528 02:26:47,959 --> 02:26:50,680 Speaker 1: You know, they do a lot of 2529 02:26:50,760 --> 02:26:54,440 Speaker 1: work in trans autistic stuff too, and they kind of 2530 02:26:54,520 --> 02:26:58,400 Speaker 1: look at why more people may be trans and autistic. Uh, 2531 02:26:58,520 --> 02:27:01,520 Speaker 1: and one of the things they've found is that 2532 02:27:01,640 --> 02:27:06,600 Speaker 1: it may be that, uh, autistic people are both less 2533 02:27:06,640 --> 02:27:09,960 Speaker 1: capable of hiding the fact that they're trans and 2534 02:27:10,640 --> 02:27:14,240 Speaker 1: less capable of caring, um, or caring about hiding it, 2535 02:27:14,800 --> 02:27:20,720 Speaker 1: so it may be more obvious that there is a co-occurrence.
2536 02:27:20,760 --> 02:27:25,280 Speaker 1: There's not an actual excess of co-occurrence, let's say. Yeah, 2537 02:27:25,320 --> 02:27:27,720 Speaker 1: of course. Yeah, that was definitely in the 2538 02:27:27,800 --> 02:27:33,080 Speaker 1: back of my mind. Yeah. Huh. Well, again, thank 2539 02:27:33,160 --> 02:27:37,160 Speaker 1: you so much for coming to talk with us. Yeah, 2540 02:27:37,879 --> 02:27:39,440 Speaker 1: can I plug a couple of things? 2541 02:27:39,560 --> 02:27:46,320 Speaker 1: Plug away! This is still your time. Okay. Um, 2542 02:27:46,800 --> 02:27:50,640 Speaker 1: so I'm leading a refugee sponsorship group, a group 2543 02:27:50,720 --> 02:27:53,480 Speaker 1: of five, for a trans guy from the Middle East, 2544 02:27:53,920 --> 02:27:57,920 Speaker 1: and we're raising funds through the Metropolitan Community Church in Toronto. 2545 02:27:58,360 --> 02:28:00,640 Speaker 1: We've got to raise a certain amount before we can 2546 02:28:00,720 --> 02:28:03,760 Speaker 1: put the application in, and I can give you that link, 2547 02:28:03,879 --> 02:28:07,360 Speaker 1: but it's at CanadaHelps dot org, and the name 2548 02:28:07,520 --> 02:28:13,760 Speaker 1: is Trans Proud, trans and proud, the URL, 2549 02:28:13,840 --> 02:28:15,959 Speaker 1: but it's kind of long. If 2550 02:28:16,040 --> 02:28:17,400 Speaker 1: you send me that link, I will put it 2551 02:28:17,480 --> 02:28:21,560 Speaker 1: in the description for people to click on. Awesome. And 2552 02:28:21,680 --> 02:28:25,280 Speaker 1: you can find me at Noah Adams on Twitter, because 2553 02:28:25,280 --> 02:28:28,119 Speaker 1: I got in early enough to get my name. Yeah.
Wow, 2554 02:28:28,720 --> 02:28:31,520 Speaker 1: March two thousand nine, just right on the cusp. 2555 02:28:32,320 --> 02:28:35,879 Speaker 1: Right. Well, again, thank you so much for reaching out 2556 02:28:36,080 --> 02:28:39,520 Speaker 1: to talk about this intersection. Hopefully, if anyone was interested in 2557 02:28:40,000 --> 02:28:43,199 Speaker 1: what we were talking about, um, please check out Noah's book, 2558 02:28:43,879 --> 02:28:47,400 Speaker 1: um, to just read a whole bunch of stories 2559 02:28:47,680 --> 02:28:53,080 Speaker 1: from people about this. Yeah. Awesome, thank you. Yeah, 2560 02:28:53,200 --> 02:28:55,480 Speaker 1: all right, everyone, that does it for us today. See you 2561 02:28:55,480 --> 02:29:14,480 Speaker 1: on the other side. Oh yeah, Sophie, that's how 2562 02:29:14,560 --> 02:29:18,480 Speaker 1: we opened the episode. I didn't think anything could be 2563 02:29:18,640 --> 02:29:20,600 Speaker 1: more appalling than that other thing that you said. 2564 02:29:20,680 --> 02:29:23,240 Speaker 1: What? I was talking about Brett Kavanaugh and 2565 02:29:23,320 --> 02:29:27,720 Speaker 1: Clarence Thomas wrapped together so tightly that you can't tell 2566 02:29:27,800 --> 02:29:33,120 Speaker 1: where one person's skin begins and the other's ends. I 2567 02:29:33,280 --> 02:29:37,080 Speaker 1: walked you right into it, just like Neil Gorsuch walked 2568 02:29:37,200 --> 02:29:39,440 Speaker 1: right into that and then decided, you know what, in 2569 02:29:39,600 --> 02:29:43,680 Speaker 1: for a penny, in for a pound. Um.
This is 2570 02:29:43,760 --> 02:29:47,800 Speaker 1: It Could Happen Here, the podcast about serious problems, where 2571 02:29:47,879 --> 02:29:51,920 Speaker 1: we talk about them seriously, and sometimes about the Supreme 2572 02:29:52,000 --> 02:29:55,920 Speaker 1: Court having a threesome, like that 2573 02:29:56,000 --> 02:29:57,960 Speaker 1: cruise ship where there was a threesome and then a 2574 02:29:58,000 --> 02:30:01,320 Speaker 1: giant fist fight. How's everybody doing today? I think the 2575 02:30:01,400 --> 02:30:05,000 Speaker 1: opening will work better if we just bleep it out. And yeah, yes, 2576 02:30:05,160 --> 02:30:09,280 Speaker 1: always bleep it out, um, except for right there. So 2577 02:30:09,320 --> 02:30:12,960 Speaker 1: I feel like today we should chat about one of 2578 02:30:13,040 --> 02:30:16,960 Speaker 1: the many things that's a problem, which is a 2579 02:30:17,040 --> 02:30:20,640 Speaker 1: specific piece of disinformation that is spreading, and not quite 2580 02:30:20,720 --> 02:30:24,280 Speaker 1: like wildfire. It's more spreading, like, in the background, like 2581 02:30:24,440 --> 02:30:27,640 Speaker 1: monkeypox, on the internet. This is not, like, the 2582 02:30:27,800 --> 02:30:32,520 Speaker 1: number one piece of, like, conspiratorial nonsense 2583 02:30:32,600 --> 02:30:34,720 Speaker 1: that's getting around, but it's getting around deeply, and I'm 2584 02:30:34,760 --> 02:30:36,880 Speaker 1: seeing it on the left and the right.
If you 2585 02:30:37,080 --> 02:30:39,440 Speaker 1: spend any time at all on social media, 2586 02:30:40,080 --> 02:30:44,080 Speaker 1: which statistically you do, you've probably seen a bunch of 2587 02:30:44,240 --> 02:30:49,680 Speaker 1: stories and, like, freaked-out posts about fires and arson 2588 02:30:50,120 --> 02:30:54,560 Speaker 1: at agricultural facilities and factories; 'food factories' is often 2589 02:30:54,600 --> 02:30:57,480 Speaker 1: how it's phrased. I think the post I saw about 2590 02:30:57,560 --> 02:31:01,160 Speaker 1: it that was sort of most emblematic was someone being like, hey, 2591 02:31:01,480 --> 02:31:05,640 Speaker 1: you know, uh, you're probably not aware that some huge 2592 02:31:05,720 --> 02:31:08,040 Speaker 1: number of chickens died in this fire recently and a 2593 02:31:08,080 --> 02:31:11,240 Speaker 1: bunch of cows died in this field, but if you were, 2594 02:31:11,360 --> 02:31:15,280 Speaker 1: it's the only thing you'd be talking about. And the 2595 02:31:15,440 --> 02:31:18,520 Speaker 1: idea, kind of, that people are pushing when 2596 02:31:18,560 --> 02:31:24,720 Speaker 1: they, uh, catastrophize this is that there's this massive rash 2597 02:31:24,879 --> 02:31:29,480 Speaker 1: of attacks on American food infrastructure, um, in a year 2598 02:31:29,560 --> 02:31:32,760 Speaker 1: when we're already due for a food crisis because of 2599 02:31:32,840 --> 02:31:36,520 Speaker 1: the war in Ukraine, and, um, it's going to be this big, 2600 02:31:36,959 --> 02:31:40,160 Speaker 1: like, looming disaster, and some, like, shady group is trying 2601 02:31:40,240 --> 02:31:45,080 Speaker 1: to starve everybody. Um, and we brought in a friend 2602 02:31:45,240 --> 02:31:47,600 Speaker 1: to talk about this, because it is not at 2603 02:31:47,640 --> 02:31:50,720 Speaker 1: all what the people who are kind of catastrophizing are 2604 02:31:50,720 --> 02:31:53,640 Speaker 1: saying. Um.
And I wanted to introduce Carl to the show. Carl, 2605 02:31:53,959 --> 02:31:57,039 Speaker 1: how are you doing, buddy? You know, living life 2606 02:31:57,080 --> 02:32:02,039 Speaker 1: in a one-party state. Yeah, yeah. Um, I don't know, man, 2607 02:32:02,080 --> 02:32:03,960 Speaker 1: there's a lot of parties these days, like the one 2608 02:32:04,000 --> 02:32:07,520 Speaker 1: on that cruise ship. Uh, or is the Forward 2609 02:32:07,680 --> 02:32:15,440 Speaker 1: Party our favorite? This is a big Yang Gang podcast. Um. Now, Carl, 2610 02:32:16,000 --> 02:32:17,800 Speaker 1: you and I have been buddies on the old Twitter for 2611 02:32:17,879 --> 02:32:20,080 Speaker 1: a while. You were the origin of one of the 2612 02:32:20,280 --> 02:32:24,760 Speaker 1: terms that we use a lot on this show. Um, 2613 02:32:26,120 --> 02:32:28,520 Speaker 1: and, uh, yeah, I wanted to talk 2614 02:32:28,560 --> 02:32:31,160 Speaker 1: to you because this is a pretty potent 2615 02:32:31,240 --> 02:32:35,440 Speaker 1: piece of weaponized unreality. Um, you have been tracking 2616 02:32:35,520 --> 02:32:38,920 Speaker 1: this for a while on kind of your own. Yeah, well, 2617 02:32:40,080 --> 02:32:42,480 Speaker 1: this is one of those ones that sits in 2618 02:32:42,680 --> 02:32:46,000 Speaker 1: between a lot of the other conspiracies, right? So, like 2619 02:32:46,240 --> 02:32:50,880 Speaker 1: you said, it's kind of the background operating thing 2620 02:32:51,080 --> 02:32:56,280 Speaker 1: right now. Um. And, you know, so when we think 2621 02:32:56,320 --> 02:32:58,920 Speaker 1: of the big conspiracies right now, they kind of revolve 2622 02:32:58,959 --> 02:33:02,200 Speaker 1: around what they always did: depopulation, weird NWO, 2623 02:33:02,480 --> 02:33:07,080 Speaker 1: like, secret society stuff, the Q brand of that.
However, 2624 02:33:07,120 --> 02:33:08,720 Speaker 1: we want to look at this is a little bit 2625 02:33:08,840 --> 02:33:14,720 Speaker 1: different because this is more overtly political, right So, this 2626 02:33:14,959 --> 02:33:19,279 Speaker 1: is this is looking to not just dig the hole 2627 02:33:19,800 --> 02:33:22,920 Speaker 1: of well, everyone's out to get us Bill Gates is 2628 02:33:22,959 --> 02:33:25,520 Speaker 1: buying all the farmland, you know, the crazy stuff we 2629 02:33:25,640 --> 02:33:27,960 Speaker 1: normally I mean, you know that's right in this, but 2630 02:33:28,120 --> 02:33:32,920 Speaker 1: it's not the center part um. And yeah, I've been 2631 02:33:32,959 --> 02:33:36,680 Speaker 1: looking at this for a few months now since I 2632 02:33:37,000 --> 02:33:39,400 Speaker 1: first saw it, and I first saw kind of traces 2633 02:33:39,440 --> 02:33:44,440 Speaker 1: of this right after the invasion of Ukraine started, so 2634 02:33:44,840 --> 02:33:50,040 Speaker 1: early March. Things started to kind of shift and nothing, 2635 02:33:50,520 --> 02:33:53,000 Speaker 1: you know, post here and there that are now missing, 2636 02:33:53,120 --> 02:33:56,160 Speaker 1: stuff like that, the kind of classic Well, let's test 2637 02:33:56,200 --> 02:33:58,800 Speaker 1: the waters. Let's see how people accept the idea that 2638 02:33:58,959 --> 02:34:02,400 Speaker 1: maybe something out you know, in the in the conspiratorial 2639 02:34:02,480 --> 02:34:07,039 Speaker 1: way is going on just asking questions type yeah, exactly exactly. 2640 02:34:07,120 --> 02:34:09,720 Speaker 1: It's the just asking questions It said, just well maybe 2641 02:34:09,760 --> 02:34:13,400 Speaker 1: think about it kind of thing. 
And those those pique 2642 02:34:13,440 --> 02:34:17,200 Speaker 1: my interest because those tend to be test balloons, and 2643 02:34:18,160 --> 02:34:20,600 Speaker 1: for this kind of thing, I had a weird you know, 2644 02:34:21,040 --> 02:34:23,480 Speaker 1: there are weird feelings you kind of get when you watch 2645 02:34:23,600 --> 02:34:30,960 Speaker 1: some of this as much as we do. Yes, yeah, 2646 02:34:31,520 --> 02:34:33,760 Speaker 1: and you kind of sense when the thing has 2647 02:34:33,879 --> 02:34:37,560 Speaker 1: enough ingredients to catch on exactly, and especially when they're 2648 02:34:37,600 --> 02:34:41,800 Speaker 1: super kind of inflammatory ingredients, right, you know, the Bill 2649 02:34:41,879 --> 02:34:45,199 Speaker 1: Gates um buying all the farmland is a good example, 2650 02:34:45,320 --> 02:34:49,200 Speaker 1: not quite as inflammatory, but catches on because people, you know, 2651 02:34:49,320 --> 02:34:53,600 Speaker 1: it's it's the social pareidolia thing. There's always like, there's 2652 02:34:53,600 --> 02:34:55,720 Speaker 1: always this, I mean, and this is something again that's 2653 02:34:55,760 --> 02:34:57,920 Speaker 1: a broader thing with conspiracies. There's always a germ of truth. 2654 02:34:57,959 --> 02:34:59,800 Speaker 1: The germ of truth with that is that Bill Gates 2655 02:34:59,879 --> 02:35:02,039 Speaker 1: owns a lot of farmland. Now, if you compare 2656 02:35:02,080 --> 02:35:04,800 Speaker 1: what he has bought to the total quantity of farmland, 2657 02:35:04,959 --> 02:35:07,760 Speaker 1: it's like, yeah, it's a fraction, point oh three 2658 02:35:08,320 --> 02:35:10,400 Speaker 1: or something.
I mean, it's an absolutely tiny amount of 2659 02:35:10,440 --> 02:35:13,400 Speaker 1: the total, right, Yeah, Because this this country is, I 2660 02:35:13,480 --> 02:35:16,680 Speaker 1: don't know, if you've looked at a map recently, a pretty sizeable country, 2661 02:35:16,840 --> 02:35:20,040 Speaker 1: the United States of America, kind of a big place 2662 02:35:20,720 --> 02:35:23,920 Speaker 1: when you actually look. Yeah, and so the kernel of 2663 02:35:24,040 --> 02:35:28,640 Speaker 1: truth is there. There are fires, right, there are industrial accidents, 2664 02:35:28,760 --> 02:35:32,760 Speaker 1: there's weird stuff that happens in big industrial situations. We 2665 02:35:32,879 --> 02:35:36,040 Speaker 1: have a large industrial farming situation in this country. So 2666 02:35:36,120 --> 02:35:39,000 Speaker 1: you see it. And I think part of what makes 2667 02:35:39,160 --> 02:35:41,560 Speaker 1: the kind of the idea that oh, this is suddenly 2668 02:35:41,680 --> 02:35:44,960 Speaker 1: happening and it's suddenly like a massive problem easier to 2669 02:35:45,000 --> 02:35:48,360 Speaker 1: sell to people is that most Americans know next to 2670 02:35:48,520 --> 02:35:50,720 Speaker 1: nothing about the food supply and how it works. Like 2671 02:35:50,800 --> 02:35:53,680 Speaker 1: if you have, because I grew up in and around farms, 2672 02:35:53,680 --> 02:35:56,040 Speaker 1: I've been a lot of my life in agricultural areas. 2673 02:35:56,920 --> 02:36:01,640 Speaker 1: Farms and things related to farms catch on fire fucking constantly. 2674 02:36:02,200 --> 02:36:05,200 Speaker 1: You may not be Yeah, they're they're like I think 2675 02:36:05,240 --> 02:36:09,120 Speaker 1: they said there are five thousand annually, about fifteen 2676 02:36:09,160 --> 02:36:13,840 Speaker 1: a day. I mean, it's it's giant fields of dried grain.
2677 02:36:14,080 --> 02:36:17,000 Speaker 1: It's deeds of dried grain, and it's shipped like silos 2678 02:36:17,080 --> 02:36:19,920 Speaker 1: full of like flour and stuff, which is like there's 2679 02:36:20,040 --> 02:36:25,320 Speaker 1: nothing like, yeah, silos explode like like the like a 2680 02:36:25,480 --> 02:36:29,920 Speaker 1: a silo full of grain is slightly less explosive than 2681 02:36:30,080 --> 02:36:33,560 Speaker 1: like a military like missile or some ship. Like they're 2682 02:36:33,959 --> 02:36:36,560 Speaker 1: like they detonate if you catch them at the right 2683 02:36:36,600 --> 02:36:39,320 Speaker 1: way exactly. And like I know here in Minnesota a 2684 02:36:39,400 --> 02:36:41,600 Speaker 1: few years ago and there's video of a floating around. 2685 02:36:41,640 --> 02:36:43,720 Speaker 1: You know, there was a you know, a corn, a 2686 02:36:43,800 --> 02:36:47,560 Speaker 1: corn silo split and the dust goes out and something 2687 02:36:47,680 --> 02:36:50,040 Speaker 1: you know, a car or engine because it's hot, sparks 2688 02:36:50,080 --> 02:36:52,320 Speaker 1: it off and it's a fireball. You know, So these 2689 02:36:52,360 --> 02:36:56,440 Speaker 1: things happen, and I can remember there was one of 2690 02:36:56,480 --> 02:36:59,000 Speaker 1: the last things I saw and I went and covered 2691 02:36:59,040 --> 02:37:01,560 Speaker 1: in Texas before I moved, was there's this little town 2692 02:37:01,600 --> 02:37:04,280 Speaker 1: called West which is not in West Texas, it's in 2693 02:37:04,360 --> 02:37:09,200 Speaker 1: North Texas. It's in between, um, Dallas and Waco, which 2694 02:37:09,320 --> 02:37:12,800 Speaker 1: is in between Dallas and Austin, because no, Waco is 2695 02:37:12,840 --> 02:37:16,240 Speaker 1: not a destination. 
And they had this big god it 2696 02:37:16,360 --> 02:37:19,680 Speaker 1: was some sort of what was the I'm gonna google 2697 02:37:19,720 --> 02:37:23,400 Speaker 1: what the facility was, uh, but it was it was 2698 02:37:23,520 --> 02:37:27,160 Speaker 1: this like, um yeah, it was a fertilizer factory and 2699 02:37:27,440 --> 02:37:29,920 Speaker 1: it caught on fire. There's a terrifying video of this 2700 02:37:30,000 --> 02:37:32,400 Speaker 1: guy with his daughter watching it and it goes off 2701 02:37:33,080 --> 02:37:36,600 Speaker 1: like like a fucking fuel air bomb, this massive explosion. 2702 02:37:36,640 --> 02:37:40,040 Speaker 1: It killed the entire town's fire department, like all of 2703 02:37:40,120 --> 02:37:46,240 Speaker 1: them dead in a second. It was basically fucking ANFO. 2704 02:37:46,400 --> 02:37:49,959 Speaker 1: And because it happened, this was like 2013, I think, um, 2705 02:37:50,160 --> 02:37:52,600 Speaker 1: it never, it's just this big tragedy. If it had 2706 02:37:52,600 --> 02:37:54,440 Speaker 1: happened a couple of years later, there would have been 2707 02:37:54,520 --> 02:37:58,080 Speaker 1: like a conspiracy attached to it. It was just yeah, 2708 02:37:58,240 --> 02:38:01,800 Speaker 1: it was just slightly too early, but like this 2709 02:38:01,879 --> 02:38:03,840 Speaker 1: shit happened. I mean, the point that 2710 02:38:04,240 --> 02:38:06,080 Speaker 1: we're making here is that like this shit 2711 02:38:06,200 --> 02:38:08,520 Speaker 1: happens all the time, and per the numbers we were 2712 02:38:08,560 --> 02:38:13,000 Speaker 1: quoting earlier, there's no evidence whatsoever that there is a 2713 02:38:13,200 --> 02:38:16,959 Speaker 1: higher number of these events this year than there 2714 02:38:17,040 --> 02:38:19,800 Speaker 1: ever is. Um.
Basically, one of the things that we've 2715 02:38:19,879 --> 02:38:22,160 Speaker 1: seen is as of like the spring of this year, 2716 02:38:23,360 --> 02:38:28,039 Speaker 1: a list has been compiled, um, mainly in places like Gateway 2717 02:38:28,080 --> 02:38:30,800 Speaker 1: Pundit and Zero Hedge, where they've got like a hundred 2718 02:38:30,879 --> 02:38:34,160 Speaker 1: different events on it, and it looks very compelling 2719 02:38:34,160 --> 02:38:36,440 Speaker 1: when you just see this list of, and there's this fire, 2720 02:38:36,480 --> 02:38:38,680 Speaker 1: and this many chickens died here, and this many cows died, 2721 02:38:38,720 --> 02:38:41,199 Speaker 1: and there was this explosion. But again, if you actually 2722 02:38:41,240 --> 02:38:44,240 Speaker 1: look at the number of events that are expected in 2723 02:38:44,320 --> 02:38:47,320 Speaker 1: a year, there's nothing abnormal about this, and in fact, 2724 02:38:47,400 --> 02:38:50,600 Speaker 1: it's pretty middle of the road for any year. And 2725 02:38:50,760 --> 02:38:53,560 Speaker 1: like the bird, the bird culls, right, like that's a 2726 02:38:53,600 --> 02:38:57,000 Speaker 1: great example of this being just absolutely out of the 2727 02:38:57,160 --> 02:39:00,120 Speaker 1: park conspiracy land. I mean, there's a massive avian 2728 02:39:00,240 --> 02:39:03,840 Speaker 1: flu epidemic going on right now that's killed more birds, 2729 02:39:04,160 --> 02:39:08,000 Speaker 1: you know, than in the last ten years. And so when 2730 02:39:08,040 --> 02:39:10,360 Speaker 1: you start talking about, you know, three hundred, you know, 2731 02:39:10,480 --> 02:39:14,720 Speaker 1: three hundred million birds worldwide being culled, whatever the massive 2732 02:39:14,800 --> 02:39:19,440 Speaker 1: number is, it's funny how avian flu does that, and that's 2733 02:39:19,440 --> 02:39:23,040 Speaker 1: a response.
But when you get into the Zero Hedge, 2734 02:39:23,400 --> 02:39:26,920 Speaker 1: who is really pushing this right now, world, that's one 2735 02:39:26,959 --> 02:39:28,800 Speaker 1: of the top ones on the list. And it also 2736 02:39:28,920 --> 02:39:30,920 Speaker 1: makes, you know, they have their little maps up right 2737 02:39:30,959 --> 02:39:33,720 Speaker 1: now with all the drop pins that show, right, they 2738 02:39:33,879 --> 02:39:37,439 Speaker 1: love doing that, and you see this in other conspiracies. 2739 02:39:37,440 --> 02:39:39,400 Speaker 1: I think one of the big ones that that kind 2740 02:39:39,400 --> 02:39:42,080 Speaker 1: of was a little, I don't know, on the edge 2741 02:39:42,120 --> 02:39:45,640 Speaker 1: of of of getting mainstream recently was like the conspiracy 2742 02:39:45,680 --> 02:39:49,800 Speaker 1: about people disappearing at national parks where it's like mapped, yeah, 2743 02:39:49,959 --> 02:39:58,360 Speaker 1: exactly like it, yeah, and it's like, yeah, man, um, people, 2744 02:39:58,520 --> 02:40:01,600 Speaker 1: there's three hundred million people in the United States, like, and 2745 02:40:01,879 --> 02:40:04,280 Speaker 1: also people go missing while hiking, and one of the, 2746 02:40:04,520 --> 02:40:06,680 Speaker 1: like, a bunch of stuff isn't on that list, like 2747 02:40:06,720 --> 02:40:08,800 Speaker 1: the number of those people who were found again and 2748 02:40:08,879 --> 02:40:11,120 Speaker 1: what, exactly, a lot of people just like slip and 2749 02:40:11,240 --> 02:40:13,360 Speaker 1: fall and are never seen again because they fell off 2750 02:40:13,360 --> 02:40:17,280 Speaker 1: a cliff. Yes, national parks are kind of dangerous, funnily 2751 02:40:17,360 --> 02:40:20,880 Speaker 1: enough, once you're off the trail. Yeah, that's why they're fun. Yeah exactly.
2752 02:40:21,360 --> 02:40:23,480 Speaker 1: There was, there was a whole Missing 411 documentary 2753 02:40:23,520 --> 02:40:25,440 Speaker 1: made a few years ago about this person who went missing, 2754 02:40:25,480 --> 02:40:27,800 Speaker 1: like, were they, were they dropped into a secret 2755 02:40:27,840 --> 02:40:31,480 Speaker 1: underground government bunker where they were abducted? And like a 2756 02:40:31,600 --> 02:40:33,560 Speaker 1: year later they found his body at the bottom of 2757 02:40:33,600 --> 02:40:36,680 Speaker 1: a cliff. Yeah, and like it, it doesn't, you know, 2758 02:40:37,000 --> 02:40:40,400 Speaker 1: that doesn't talk about the horrible stuff done with, like, 2759 02:40:41,000 --> 02:40:44,199 Speaker 1: especially in Canada with all of the missing indigenous women. 2760 02:40:45,360 --> 02:40:47,960 Speaker 1: It is actually, it is actually a big problem. But I 2761 02:40:48,040 --> 02:40:49,560 Speaker 1: mean to, back to, back to the fact, back to 2762 02:40:49,760 --> 02:40:51,640 Speaker 1: the farming thing. I think what all of these, you 2763 02:40:51,680 --> 02:40:56,000 Speaker 1: know, stories show is just the innate horror of industrial farming. 2764 02:40:56,400 --> 02:41:03,039 Speaker 1: It's actually scary. Yeah, yeah, it is absolutely scary, um, 2765 02:41:03,800 --> 02:41:07,119 Speaker 1: but it's also like normal scary. Like the thing that's 2766 02:41:07,120 --> 02:41:11,120 Speaker 1: scary is that the system of industrial farming is incredibly dangerous. 2767 02:41:11,160 --> 02:41:14,000 Speaker 1: And like, if you actually want to be properly horrified 2768 02:41:14,040 --> 02:41:17,080 Speaker 1: about something relating to food production, look at how many 2769 02:41:17,200 --> 02:41:20,160 Speaker 1: people die because they get sucked into bogs of pigshit 2770 02:41:21,879 --> 02:41:25,560 Speaker 1: or drown in grain silos, or drown in grain silos.
I mean people, legit, 2771 02:42:27,480 --> 02:42:30,800 Speaker 1: whole families. One person will fall in the grain silo 2772 02:42:30,840 --> 02:42:33,240 Speaker 1: and they'll try to get him out, the whole family. 2773 02:42:33,600 --> 02:42:36,200 Speaker 1: I know, I know people who have, who have died 2774 02:42:36,240 --> 02:42:40,879 Speaker 1: that way because I grew up in a very agricultural area. Yeah, 2775 02:42:41,560 --> 02:42:43,440 Speaker 1: a lot of this is just like people don't know 2776 02:42:43,560 --> 02:42:46,920 Speaker 1: the country, but, Shereen, yes, um, so industrial. I mean 2777 02:42:47,000 --> 02:42:49,640 Speaker 1: like, yeah, for me, for me, someone that hasn't grown 2778 02:42:49,680 --> 02:42:53,959 Speaker 1: up in any agricultural area at all, and this is, yeah. 2779 02:42:54,720 --> 02:42:58,520 Speaker 1: Grain is like, so it's like quicksand, it sucks you in. 2780 02:42:59,120 --> 02:43:01,320 Speaker 1: It takes you to the bottom. If you don't spread 2781 02:43:01,320 --> 02:43:03,440 Speaker 1: out immediately, you're going down and there's really no way 2782 02:43:03,520 --> 02:43:06,880 Speaker 1: to save somebody. It's, stay pretty, stay the fuck away 2783 02:43:06,959 --> 02:43:10,080 Speaker 1: from grain silos. Yeah, do not play around grain silos, 2784 02:43:10,879 --> 02:43:13,480 Speaker 1: around with the grain silos. It has, it has, it 2785 02:43:13,600 --> 02:43:15,840 Speaker 1: has killed entire families because people will try to save 2786 02:43:15,920 --> 02:43:19,720 Speaker 1: each other, then they get sucked down, and it's, it's pretty... Yeah, 2787 02:43:20,000 --> 02:43:24,520 Speaker 1: it's bad. When you have livestock, livestock poop, and sometimes 2788 02:43:24,560 --> 02:43:26,480 Speaker 1: that poop is super useful. Chicken shit's one of the 2789 02:43:26,520 --> 02:43:29,720 Speaker 1: best fertilizers ever. Chicken shit, very very useful.
2790 02:42:30,200 --> 02:42:33,600 Speaker 1: Pigshit is like nasty, it's toxic. It is very hard 2791 02:42:33,640 --> 02:42:36,920 Speaker 1: to do anything with. If it's in the ground long enough, 2792 02:42:36,959 --> 02:42:39,200 Speaker 1: it's a bio... Well, theoretically, if you were to like 2793 02:42:39,360 --> 02:42:41,480 Speaker 1: really care about it, you could, you could make a 2794 02:42:41,640 --> 02:42:44,360 Speaker 1: use of it given enough time. But there's so many 2795 02:42:44,480 --> 02:42:47,280 Speaker 1: pigs, because our hunger for bacon is insatiable, that you 2796 02:42:47,360 --> 02:42:51,440 Speaker 1: wind up with this massive toxic sludge. 2797 02:42:51,480 --> 02:42:53,080 Speaker 1: So in the chunk of the country in which most 2798 02:42:53,120 --> 02:42:56,080 Speaker 1: of the pigs come from, there are these huge pigshit 2799 02:42:56,240 --> 02:42:59,640 Speaker 1: bogs; like, there are countries smaller than the bogs 2800 02:42:59,680 --> 02:44:01,520 Speaker 1: of pig shit that we have in the United States, 2801 02:44:01,600 --> 02:44:03,320 Speaker 1: and people die in them all the time. They get 2802 02:44:03,320 --> 02:44:06,720 Speaker 1: sucked down into the pig shit, or you suffocate because 2803 02:44:06,760 --> 02:44:08,840 Speaker 1: one of them bursts. I mean, there's so 2804 02:44:08,920 --> 02:44:11,080 Speaker 1: many weird things because of the methane and 2805 02:44:11,200 --> 02:44:14,800 Speaker 1: hydrogen sulfide. So it's just like bad things around 2806 02:44:14,879 --> 02:44:19,240 Speaker 1: farms all the time, and that's just, that's just farming, 2807 02:44:19,640 --> 02:44:22,520 Speaker 1: and that's ultimately what we are seeing here.
If 2808 02:43:22,560 --> 02:43:24,680 Speaker 1: you want to like actually analyze the thing that is 2809 02:43:24,720 --> 02:43:29,160 Speaker 1: happening um with all of these conspiracies, it's, it's what's 2810 02:43:29,160 --> 02:43:32,360 Speaker 1: called the frequency illusion, which is the idea that, like, 2811 02:43:32,760 --> 02:43:35,199 Speaker 1: if you've ever, I don't know, if somebody, when somebody 2812 02:43:35,280 --> 02:43:37,600 Speaker 1: like teaches, like, you learn a new word, right, or 2813 02:43:37,600 --> 02:43:39,480 Speaker 1: you, like, you hear about a historic event, and then 2814 02:43:39,520 --> 02:43:45,160 Speaker 1: you keep seeing it everywhere. This is something that an author 2815 02:43:45,240 --> 02:43:48,160 Speaker 1: that Garrison and I quite like, Robert Anton Wilson, played 2816 02:43:48,200 --> 02:43:50,280 Speaker 1: with a lot. Um. It's why, like, twenty three is 2817 02:43:50,440 --> 02:43:53,240 Speaker 1: one thing you'll notice in like Hollywood movies and TV 2818 02:43:53,360 --> 02:43:55,960 Speaker 1: shows; if you look out for twenty three, they're fucking everywhere 2819 02:43:56,000 --> 02:43:58,160 Speaker 1: because a whole bunch of people who got into Hollywood 2820 02:43:58,160 --> 02:44:00,879 Speaker 1: are fans of the same guy. And there's this conspiracy 2821 02:44:00,920 --> 02:44:02,720 Speaker 1: with the number twenty three people stick in. It's all 2822 02:44:02,760 --> 02:44:05,440 Speaker 1: over the fucking wire. It's in a bunch of shit. Um, 2823 02:44:06,120 --> 02:44:08,840 Speaker 1: and it's, it's, yeah, at the base of things, like, right, 2824 02:44:09,080 --> 02:44:12,520 Speaker 1: humans are pareidolic, right. So we're looking for patterns in 2825 02:44:12,600 --> 02:44:15,880 Speaker 1: static; that's what we do.
It's part like in my mind, 2826 02:44:15,920 --> 02:44:19,520 Speaker 1: it's part of our like ancestral uh, you know, deep 2827 02:44:19,560 --> 02:44:23,200 Speaker 1: in the past protection, right, how we cont me? Yeah, 2828 02:44:23,560 --> 02:44:25,920 Speaker 1: exactly what's that? And it's how you look for monsters 2829 02:44:25,959 --> 02:44:28,080 Speaker 1: in the woods, you know. It's like when we're looking 2830 02:44:28,240 --> 02:44:31,480 Speaker 1: for eyes in the dark. That's part of it. And 2831 02:44:31,640 --> 02:44:35,400 Speaker 1: so you know, we tend to find meaning in points 2832 02:44:35,920 --> 02:44:38,600 Speaker 1: and then try and connect them because that's how we work. 2833 02:44:38,760 --> 02:44:41,840 Speaker 1: And so this is a great example of this because 2834 02:44:41,879 --> 02:44:44,800 Speaker 1: it hasn't gone full Q level yet where it's just 2835 02:44:46,440 --> 02:44:50,280 Speaker 1: absurd to be absurd. The shield itself, like you can 2836 02:44:50,360 --> 02:44:54,000 Speaker 1: see where people are trying to pick together points that 2837 02:44:54,320 --> 02:44:58,320 Speaker 1: normally are just industrial accidents. 
And you know, some of 2838 02:44:58,360 --> 02:45:01,240 Speaker 1: the stuff I saw early on, for like the cow 2839 02:45:01,360 --> 02:45:05,160 Speaker 1: death posts and the stuff related to climate change, what 2840 02:45:05,320 --> 02:45:09,520 Speaker 1: you really were seeing was people trying to make order 2841 02:45:09,959 --> 02:45:13,320 Speaker 1: out of what is just chaotic accidents and now and 2842 02:45:13,440 --> 02:45:19,640 Speaker 1: now yeah, yeah, no, it's it's it's something you rarely 2843 02:45:19,720 --> 02:45:24,119 Speaker 1: actually see in the cascade of you know, conspiracy theory 2844 02:45:24,200 --> 02:45:27,560 Speaker 1: like this so overtly and it's been really interesting for 2845 02:45:27,640 --> 02:45:31,640 Speaker 1: me watching that because you know, as someone who's far 2846 02:45:31,800 --> 02:45:36,400 Speaker 1: too into watching people melt their brains. UM. This this 2847 02:45:36,600 --> 02:45:39,120 Speaker 1: kind of lays out some of the ways that this 2848 02:45:39,280 --> 02:45:42,480 Speaker 1: works for all of us, UM, and I think it 2849 02:45:42,520 --> 02:45:45,720 Speaker 1: also offers a roadmap in certain ways to like see 2850 02:45:45,800 --> 02:45:48,640 Speaker 1: past it and be able to correct it for yourself 2851 02:45:48,760 --> 02:45:51,480 Speaker 1: so you don't get into the same Oh there are 2852 02:45:51,480 --> 02:45:55,320 Speaker 1: a thousand points a light here, let's fall all of them. Yeah, 2853 02:45:55,640 --> 02:45:57,960 Speaker 1: it's UM. One of the things that's interesting. 
So, like 2854 02:45:58,480 --> 02:46:03,160 Speaker 1: we just called it the recency bias or the 2855 02:46:03,240 --> 02:46:06,120 Speaker 1: frequency illusion; there's also the recency illusion, which is like 2856 02:46:06,200 --> 02:46:09,400 Speaker 1: the belief that things that you have like noticed only 2857 02:46:09,520 --> 02:46:12,280 Speaker 1: recently are a recent phenomenon rather than things that go 2858 02:46:12,400 --> 02:46:14,640 Speaker 1: back a long time. These are kind of interrelated. 2859 02:46:14,720 --> 02:46:17,720 Speaker 1: But this, this sort of phenomenon that we're seeing is 2860 02:46:17,760 --> 02:46:22,120 Speaker 1: often called the Baader-Meinhof phenomenon. And that's, so, 2861 02:46:22,440 --> 02:46:25,560 Speaker 1: so the Baader, I'm pretty, yeah, the Baader- 2862 02:46:25,560 --> 02:46:29,360 Speaker 1: Meinhof group was, it's also called the Red Army Faction. UM. 2863 02:46:29,520 --> 02:46:33,120 Speaker 1: It was a, yeah, it was a West German terrorist 2864 02:46:33,280 --> 02:46:37,600 Speaker 1: organization from like the seventies. Like this is not 2865 02:46:37,879 --> 02:46:41,720 Speaker 1: a recent thing, but there was an article about them 2866 02:46:41,879 --> 02:46:45,680 Speaker 1: in like a Minnesota St. Paul newspaper in nineteen ninety-four 2867 02:46:45,840 --> 02:46:47,680 Speaker 1: that happened to be one of the first newspapers with 2868 02:46:47,720 --> 02:46:53,280 Speaker 1: an online comment page. Oh no, well yeah, so this 2869 02:46:53,440 --> 02:46:55,199 Speaker 1: is like, you'll always hear it referred to as the 2870 02:46:55,200 --> 02:46:57,040 Speaker 1: Baader-Meinhof phenomenon.
It has nothing to do with this 2871 02:46:57,200 --> 02:46:59,800 Speaker 1: terrorist group other than the fact that one commenter 2872 02:47:00,480 --> 02:47:03,720 Speaker 1: saw an article about them, um, within a couple of 2873 02:47:03,800 --> 02:47:06,520 Speaker 1: hours of someone else in their life telling them about 2874 02:47:06,560 --> 02:47:09,080 Speaker 1: the group, and so they named it in the comment 2875 02:47:09,120 --> 02:47:12,320 Speaker 1: section the Baader-Meinhof phenomenon because, yeah, like, it's, 2876 02:47:12,360 --> 02:47:16,200 Speaker 1: it's, which is itself an example of the phenomenon. Um, but 2877 02:47:16,440 --> 02:47:19,400 Speaker 1: like, that's, it is. It is. It's a thing that 2878 02:47:19,480 --> 02:47:22,880 Speaker 1: people do for, again, good reason. Like, like you said, 2879 02:47:22,959 --> 02:47:26,640 Speaker 1: like if you're a fucking hunter gatherer and you notice that, like, oh, 2880 02:47:26,760 --> 02:47:29,560 Speaker 1: after a rainstorm is when the big cats come out 2881 02:47:29,640 --> 02:47:31,600 Speaker 1: and hunt. And like if somebody, if one of your 2882 02:47:31,640 --> 02:47:34,800 Speaker 1: friends gets eaten by like a tiger, it's probably after 2883 02:47:34,959 --> 02:47:38,720 Speaker 1: a rainstorm. You associate after the rainstorm with danger, which 2884 02:47:38,760 --> 02:47:43,359 Speaker 1: is like good, right. Like, I live inside urban environments; 2885 02:47:43,480 --> 02:47:46,960 Speaker 1: usually, usually this becomes less useful as relating to 2886 02:47:47,040 --> 02:47:52,080 Speaker 1: more of our like instinctual practices. Learning to recognize this 2887 02:47:52,280 --> 02:47:56,520 Speaker 1: like first step of delusion is really important.
Um, I 2888 02:47:56,560 --> 02:48:00,560 Speaker 1: don't think decisions in the future, right, But I think 2889 02:48:00,600 --> 02:48:03,119 Speaker 1: it's much more similar than we realized to like how 2890 02:48:03,200 --> 02:48:09,120 Speaker 1: people think of religion, because even religions people are Yeah, 2891 02:48:09,160 --> 02:48:12,600 Speaker 1: like what you're saying is like there's so much chaos. 2892 02:48:12,720 --> 02:48:15,040 Speaker 1: People can't make sense of the world. And just like religion, 2893 02:48:15,080 --> 02:48:18,360 Speaker 1: you're trying to make order out of disorder and look 2894 02:48:18,400 --> 02:48:20,320 Speaker 1: for signs, to look for patterns. It's like an element 2895 02:48:20,360 --> 02:48:23,760 Speaker 1: of magical thinking where yes, you look for reasons that 2896 02:48:23,959 --> 02:48:27,080 Speaker 1: this has meaning, so I understand where they're coming from. 2897 02:48:29,000 --> 02:48:31,240 Speaker 1: And so the problem, again, the problem is not with 2898 02:48:31,360 --> 02:48:33,800 Speaker 1: your brain, because this is not like a bad thing 2899 02:48:33,920 --> 02:48:35,520 Speaker 1: your brain is doing. It's just a thing your brain 2900 02:48:35,680 --> 02:48:38,840 Speaker 1: is doing. The problem is that this is one of 2901 02:48:38,920 --> 02:48:42,560 Speaker 1: the easiest ways that bad faith actors can take advantage 2902 02:48:42,600 --> 02:48:45,240 Speaker 1: of you and other people, and so in terms of 2903 02:48:45,320 --> 02:48:49,040 Speaker 1: protecting yourself and others from it. 
And again, one of 2904 02:48:49,080 --> 02:48:50,640 Speaker 1: the problems with this, and one of the things that 2905 02:48:50,760 --> 02:48:54,840 Speaker 1: makes it so much more difficult: twenty years ago 2906 02:48:55,040 --> 02:48:57,439 Speaker 1: the Baader-Meinhof, obviously the Baader-Meinhof phenomenon 2907 02:48:57,560 --> 02:48:59,200 Speaker 1: was as much of a thing, as that dude in 2908 02:48:59,240 --> 02:49:03,440 Speaker 1: the fucking comments page at that Minnesota paper proves. But there 2909 02:49:03,560 --> 02:49:06,120 Speaker 1: was less shit coming at you, so you kind of 2910 02:49:06,200 --> 02:49:08,080 Speaker 1: had, even if you might get caught for a little 2911 02:49:08,120 --> 02:49:10,040 Speaker 1: bit and they're like, oh, is there something weird going 2912 02:49:10,080 --> 02:49:14,000 Speaker 1: on with this, this German terrorist group, um, you kind 2913 02:49:14,040 --> 02:49:16,840 Speaker 1: of had the space in your head and the space 2914 02:49:16,920 --> 02:49:19,720 Speaker 1: in your media diet to like actually parse that out 2915 02:49:19,840 --> 02:49:22,960 Speaker 1: and calm down. But today it all comes at you 2916 02:49:23,040 --> 02:49:24,959 Speaker 1: in a flood. There's like three new fucked up Supreme 2917 02:49:25,000 --> 02:49:27,480 Speaker 1: Court decisions. Oh, and now all of the food factories 2918 02:49:27,520 --> 02:49:29,480 Speaker 1: are on fire and all of the chickens are dead, 2919 02:49:29,520 --> 02:49:33,400 Speaker 1: and this war in Ukraine is actually elevating the food prices, 2920 02:49:33,440 --> 02:49:36,600 Speaker 1: and it all compounds on itself.
If you, when you 2921 02:49:36,800 --> 02:49:40,800 Speaker 1: start seeing something new like this come into your media 2922 02:49:40,879 --> 02:49:43,640 Speaker 1: diet that seems scary, one of the first things you 2923 02:49:43,680 --> 02:49:46,760 Speaker 1: should do is just try to get a handle on 2924 02:49:47,600 --> 02:49:52,520 Speaker 1: the raw numbers. This, well, this is a complexity, yeah, 2925 02:49:52,600 --> 02:49:54,720 Speaker 1: you know, you know, this is a complexity issue. That's 2926 02:49:54,720 --> 02:49:56,640 Speaker 1: how I like to look at it. And that's exactly 2927 02:49:56,680 --> 02:49:59,039 Speaker 1: one of the great ways to kind of 2928 02:49:59,200 --> 02:50:02,560 Speaker 1: disrupt the complex nature of this, and the amount of 2929 02:50:02,600 --> 02:50:05,240 Speaker 1: it you're taking in, is just to start breaking it down. 2930 02:50:05,360 --> 02:50:07,720 Speaker 1: Numbers are great, right, Like if you can look and 2931 02:50:07,800 --> 02:50:12,520 Speaker 1: see there are eighteen thousand instances of industrial accidents leading to X, 2932 02:50:12,720 --> 02:50:15,360 Speaker 1: Y, or Z and five thousand fires, you start to 2933 02:50:15,440 --> 02:50:18,800 Speaker 1: really get yourself into a better position to understand what's 2934 02:50:18,800 --> 02:50:21,680 Speaker 1: being thrown at you. Yeah, but I don't think most 2935 02:50:21,720 --> 02:50:26,080 Speaker 1: people can actually understand what those numbers mean.
Like there's, 2936 02:50:26,200 --> 02:50:28,680 Speaker 1: like, they're large numbers, but I don't think people understand 2937 02:50:28,800 --> 02:50:30,879 Speaker 1: like that means a lot of that stuff is happening 2938 02:50:31,040 --> 02:50:33,560 Speaker 1: versus just like one or two things you hear about, 2939 02:50:33,640 --> 02:50:35,920 Speaker 1: and you don't realize that, probability wise, that it's like 2940 02:50:36,040 --> 02:50:39,480 Speaker 1: insignificant, because I don't think those numbers make sense. I 2941 02:50:39,480 --> 02:50:42,040 Speaker 1: mean, even to me sometimes, I can't, I can't picture 2942 02:50:42,200 --> 02:50:45,320 Speaker 1: so many things. So I think it's, I don't know, 2943 02:50:45,520 --> 02:50:48,400 Speaker 1: maybe it's just like a deficit in how our brains 2944 02:50:48,440 --> 02:50:50,640 Speaker 1: work, to be able to understand what the numbers mean. 2945 02:50:50,760 --> 02:50:54,920 Speaker 1: But you can try to compare them to previous years, right, 2946 02:50:54,959 --> 02:50:58,560 Speaker 1: you can, exactly, you can expand what you're relating to, 2947 02:50:58,760 --> 02:51:00,760 Speaker 1: right; if you're, if you're looking at everything from 2948 02:51:00,840 --> 02:51:04,360 Speaker 1: March to June 2022, you're like, whoa, this is a 2949 02:51:04,400 --> 02:51:06,320 Speaker 1: lot of stuff just in these few months, but if 2950 02:51:06,320 --> 02:51:08,800 Speaker 1: you compare that to every preceding year for the past 2951 02:51:08,840 --> 02:51:11,680 Speaker 1: few years, it's like, oh, this actually isn't irregular; this 2952 02:51:11,879 --> 02:51:14,280 Speaker 1: is, this is still fucked up, but it's 2953 02:51:14,280 --> 02:51:17,520 Speaker 1: actually kind of normalized. Um, and it's not, it's not 2954 02:51:18,400 --> 02:51:22,000 Speaker 1: an abnormal phenomenon right now.
And so even if you can't understand what the numbers are, you can still compare them to previous things. But yeah, I mean, that does require more work than just looking at a meme, right?

And the reason this stuff works is because people know how to exploit this part of our brains really well. Not that this part of the brain is useless, right? It has uses, you can play with it, but it's also exploitable. And that's the thing you want to be aware of: trying to be cognizant of whether the information you're taking in is exploiting this pathway, and then choosing how you want to circumvent some of those mental effects.

Well, and as humans, we have a real issue with this kind of brain hacking, and it's something we're all just kind of catching up to right now and understanding. And we still don't fully understand some of this.
But, you know, a lot of the stuff I initially worked off of for the concept of weaponized unreality talks about social engineering, the way phreaking and hacking were done back in the day, and this is so similar to that in certain ways that it's kind of shocking, right? Like, it's a conspiracy, but it's also a management tool, a memory management and, you know, ultimately a reality management tool. And giving it numbers, looking at context like that, does take time, but some of these are probably going to be hard and fast rules going forward for interacting with the digital world, because this is going to be how it is for a long while.

There's a book that was kind of considered to be the foundational text of, or at least a strategic document of, the Islamic State, called The Management of Savagery.
And the title gives away what you're doing, right? You're engaging in acts of savagery, terrorist attacks that kill innocent people, that exist to disrupt the state that you're in. You're attempting to build kind of a milieu of savagery, which then provides you the opportunity to take and exert power. And what we're seeing here is, like, the management of cognitive biases, right?

Exactly.

The management of these weird little evolutionary holdovers in your brain that don't quite work in the modern world. But if you understand what's happening, you can take advantage of them, and you can trick people into thinking things are happening that aren't. It's the same, you know. You can see this.
The right does this very effectively in a lot of the anti-trans stuff they've been doing. Obviously, if you look at the population of trans and gay people, some number of people in that community are going to do things that are bad, right, because it's a population of human beings, and because the country is large enough. If you get people hyper-focused on here's a story, here's another story, here's two, here's three stories: now, does that mean there's any kind of actual systemic problem? No. That community is no more likely to do things that are bad than any other community. But if you get people focused on each of those stories in their head, they feel like there's an epidemic, and like, well, we have to get a handle on it.
It's the same thing that gets done with, like, Islamic terrorism, right? Where it's like, yes, since nine eleven there have actually not been that many acts of Islamic terrorism in the United States. Extremely fucking uncommon, much less common than right-wing, homegrown terrorism. But the media doesn't really cover one of those kinds of terrorism and loves to cover the other. So you get people periodically tricked into thinking that they're under direct threat from the Islamic State or whatever.

Right. Well, and I think, going to that point, it's a reality filter. It's a way to selectively filter out things that would counter the narrative that you're trying to push overall. And I think that's what's interesting to me about this in a lot of ways: we're seeing a filter being set up that only allows people into one lane of thought. And we've seen what the end result of that is, with radicalization and the things that come along with these kinds of conspiracies.
But it's really been very wild to watch, from, you know, the nineteenth of April till now, where we're seeing it. You know, Cernovich is doing it, any one of the guys you can think of is doing it.

Yeah, exactly. Tucker ran a couple of things on this and kind of interspersed it with his, you know, white male virility shit. We're in a weird place where these things are starting to be able to be played with and on each other, and that kind of filtering, you know, starts to get people onboarded from a conspiracy into, you know, what we're seeing now, the white nationalist, Christian nationalist movement that's become that thing. And, you know, for me, that's where my interest stems from, because of this idea of weaponizing unreality, seeing what happened in Russia when that happened, and seeing this kind of thing, which is so similar to that filtering and that narrative shift and building that goes on in that world. It's been, you know... staring into a void feels bad.
3053 02:56:38,160 --> 02:56:40,040 Speaker 1: Sometimes this is just one where it's like, oh this 3054 02:56:40,200 --> 02:56:43,680 Speaker 1: is terrible, and it's just the beginning of it. Every 3055 02:56:43,720 --> 02:56:45,800 Speaker 1: once in a while, the void stares back and you're like, 3056 02:56:45,920 --> 02:56:48,720 Speaker 1: oh boy, oh yeah no, and that's exactly I mean, 3057 02:56:48,840 --> 02:56:51,520 Speaker 1: that's uh, that's the problem is sometimes it just stares 3058 02:56:51,560 --> 02:56:54,200 Speaker 1: you right in the eyes and tells you, yeah, I'm here, 3059 02:56:54,760 --> 02:56:59,080 Speaker 1: and that's a bad feeling. Yeah. Well, I think that's 3060 02:56:59,120 --> 02:57:02,600 Speaker 1: more or less we needed to talk. That would be like, 3061 02:57:02,800 --> 02:57:04,480 Speaker 1: you know, like one of the one of the ways 3062 02:57:04,520 --> 02:57:07,160 Speaker 1: to combat this, if you can, is honestly, creating your 3063 02:57:07,200 --> 02:57:11,240 Speaker 1: own memetic graphs is really useful because these things spread 3064 02:57:11,280 --> 02:57:15,080 Speaker 1: so fast when they're in images. Images of of dates 3065 02:57:15,160 --> 02:57:18,680 Speaker 1: and instances spread like wildfire. Um So, if you can 3066 02:57:18,920 --> 02:57:22,440 Speaker 1: make your own which compares it to previous years, say hey, 3067 02:57:22,840 --> 02:57:25,200 Speaker 1: this actually isn't a new pattern, that's something that's been 3068 02:57:25,280 --> 02:57:27,560 Speaker 1: this is this is just what happens in industrial farming. 3069 02:57:28,280 --> 02:57:31,800 Speaker 1: I think spreading it via memetic images is one of 3070 02:57:31,879 --> 02:57:34,720 Speaker 1: the if there is a way to combat it, that's 3071 02:57:34,760 --> 02:57:37,000 Speaker 1: probably one of the core ways to go about it, 3072 02:57:37,520 --> 02:57:40,680 Speaker 1: just to get how fast those things spread. 
And again, you can see it: I've seen some people trying to push back against, you know, this idea that there's been this massive crime surge in San Francisco and stuff, and it uses the same tactics, right?

Yeah, absolutely.

You have, like, a couple of videos of people shoplifting or something. And is that kind of crime actually up? Well, no, it really isn't. But it doesn't matter. Or is it any higher there than it is in someplace like Duluth, where no videos are coming out? No, it's not. But the first thing you have to be aware of is the phenomenon, like, the way in which they're taking advantage of you. And then you have to kind of deter, and you have to use the tactics they're using against them. And one of the things that is effective is these graphs with, like, numbers and dates and shit on them. People love to feel like they're looking at research.
But yeah, at the same time, not to be, I don't know, negative about this at all, but in my mind, this is like a modern-day version of someone starting their own religion and making people the sheep of this following, and then having them turn into whatever it is, whether it's Christian nationalism or whatever. But just like in religion, if people are presented with science, they don't care, you know what I mean? You'll present them with, I don't know... there's some people that...

It's not science. It's about... everyone wants to have access to special, secret knowledge. Everyone wants to have esoteric knowledge that no one else has. So these graphs are still compelling in the first place, because you're like, oh, no one else knows all of these things, no one else has laid it out in this manner. So if you can present your information in that same style, say, hey, no one knows that this is actually part of this overall thing that's been going on for years and it's about industrial farming, then you hope that that will spread.
Then that spreads, because it infects the same point in someone's brain, right? We want to feel smart, we want to feel unique, we want to have, like, esoteric knowledge. So if you can frame it to fit that same mold, then it's not science, it's just playing with the same tactics that got them convinced of this in the first place.

Exactly.

Yeah, I think it's true that if somebody is a committed believer in whatever, like Mike Cernovich or something, you're not convincing them. The thing that they're doing that's dangerous is they're quote unquote pilling a lot of, like, random people into the problem. That scares those people, and when those people get scared, they're willing to accept shit they wouldn't otherwise accept. And I think those people you can get to step down from the ledge. Because one thing we do want... this is also a problem.
But, like, you think about climate change, right, and how much of the denial of climate change is not based around getting people to reject the idea entirely, but, when people bring up a specific problem, being like, well, look at this weird new piece of technology that some kid developed, this is going to fix it, and then you get to not worry about it. Right? So if somebody suddenly starts freaking out about agricultural fires for the first time and you're like, actually, they're lower than they are in normal years, this isn't a problem, then maybe you can get their brain to go, okay, I won't worry about that, because I don't want more things to worry about, I've just been given them. That's targeting the ledge people we're talking about.

Yeah, you're not getting to true believers. You're not getting to true believers at this point with any of this stuff. For the most part, you know, that takes a wholly different level of work.
I mean, that's in the ballpark, in my mind, of the radicalization, right? You're in a wholly different ballpark. And if you can target the people who are thinking about jumping into the pool, too, they tend to, if you do change their mind, become some of the biggest proponents of trying to get other people they might know off the ledge.

Well, yeah, it's very similar, and it's something I've seen even in my friend circles, you know, talking to people who five years ago were fully, you know, in the whole let's-do-Donald-Trump-for-the-lulz thing. You know, now those are the same people who are telling their friends, oh shit, we have a Christian nationalist movement that's trying to overthrow democracy. And that's a huge, you know, like, that's a huge help to everyone, right? You want more people saying the truth to people who might not hear it from someone like us, and who can internalize it and infiltrate.
You know, the truth takes a lot more work than fiction, unfortunately, but once it starts to work, it's a compounding thing. And the truth tends to really set people free, as corny as that is. If people find out they've been lied to, they want to know why it worked, and that works in our favor, and the truth's favor. And reality is the thing we've, you know, got to protect at all costs, because we're getting tidal-waved by unreality, and that's a problem for all of us, for different reasons.

That's a more uplifting note, I think, than a couple of minutes ago.

Yeah. All right, well, there we go. Go, I don't know, fix it.

Yeah, go fix things.

Yeah, go fix things. Don't go swimming in grain silos. And, uh, avoid grain silos. Always avoid grain silos.

Hey, we'll be back Monday with more episodes every week from now until the heat death of the universe. It Could Happen Here is a production of Cool Zone Media.
For more podcasts from Cool Zone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at coolzonemedia.com/sources. Thanks for listening.