1 00:00:01,720 --> 00:00:05,920 Speaker 1: Cool Zone Media. Hey everybody, Robert Evans here, and I wanted 2 00:00:05,960 --> 00:00:08,960 Speaker 1: to let you know this is a compilation episode, so 3 00:00:09,039 --> 00:00:12,639 Speaker 1: every episode of the week that just happened is here 4 00:00:12,760 --> 00:00:16,680 Speaker 1: in one convenient and with somewhat less ads package for 5 00:00:16,760 --> 00:00:18,840 Speaker 1: you to listen to in a long stretch if you want. 6 00:00:19,360 --> 00:00:21,400 Speaker 1: If you've been listening to the episodes every day this week, 7 00:00:21,440 --> 00:00:23,200 Speaker 1: there's going to be nothing new here for you, but 8 00:00:23,560 --> 00:00:24,919 Speaker 1: you can make your own decisions. 9 00:00:27,360 --> 00:00:29,360 Speaker 2: Hi, everyone, welcome to It Could Happen Here. It's me, 10 00:00:29,560 --> 00:00:32,400 Speaker 2: James, today with a terrible cold, as you can probably tell, 11 00:00:32,760 --> 00:00:35,360 Speaker 2: but still very important to listen today because I'm talking 12 00:00:35,360 --> 00:00:38,560 Speaker 2: to Andreina, an organizer from Ktown for All up in LA, 13 00:00:39,040 --> 00:00:40,680 Speaker 2: and we're going to talk about the fires in LA 14 00:00:40,800 --> 00:00:42,519 Speaker 2: and the mutual aid response and what you can do 15 00:00:42,520 --> 00:00:42,840 Speaker 2: to help. 16 00:00:42,960 --> 00:00:44,120 Speaker 3: So welcome to the show, 17 00:00:43,960 --> 00:00:45,360 Speaker 4: Andreina. Thank you for having me, 18 00:00:45,440 --> 00:00:45,800 Speaker 3: James. 19 00:00:46,440 --> 00:00:48,440 Speaker 2: Yeah, thanks for being here. I know you guys are 20 00:00:48,440 --> 00:00:51,559 Speaker 2: really busy right now.
So to begin with, like in 21 00:00:51,600 --> 00:00:53,599 Speaker 2: case this has missed people, and there's a lot there's 22 00:00:53,600 --> 00:00:56,520 Speaker 2: a lot of news happening right now, can you explain 23 00:00:56,600 --> 00:00:59,160 Speaker 2: what's been going on in LA with respect to fires 24 00:00:59,360 --> 00:01:01,160 Speaker 2: for the last two, three days? 25 00:01:01,200 --> 00:01:03,880 Speaker 4: So, about three or four days ago, we got a warning 26 00:01:03,960 --> 00:01:06,680 Speaker 4: that we were going to be experiencing high winds up 27 00:01:06,720 --> 00:01:09,920 Speaker 4: to fifty miles per hour, which is nuts, and they 28 00:01:09,920 --> 00:01:12,120 Speaker 4: were going to be coming from the desert. So this 29 00:01:12,240 --> 00:01:14,759 Speaker 4: is just like a barrage of hot wind. So we 30 00:01:14,760 --> 00:01:18,600 Speaker 4: were preparing to have to replace tents and tarps, because 31 00:01:18,840 --> 00:01:21,560 Speaker 4: you know, man-made structures that people are surviving with 32 00:01:21,640 --> 00:01:24,680 Speaker 4: can't survive that kind of wind. But when we hear 33 00:01:24,720 --> 00:01:28,640 Speaker 4: that wind here in Southern California, we immediately think fire, sadly, 34 00:01:28,680 --> 00:01:32,959 Speaker 4: because any little, you know, a cigarette butt, an electrical spark, 35 00:01:33,120 --> 00:01:36,040 Speaker 4: you know, like when it's this dry, it's enough 36 00:01:36,080 --> 00:01:39,240 Speaker 4: to cause devastation, which is exactly what's happened. There are 37 00:01:39,319 --> 00:01:42,240 Speaker 4: about seven fires right now spread around the perimeter of 38 00:01:42,280 --> 00:01:45,720 Speaker 4: Los Angeles County that have been started and then spread 39 00:01:45,840 --> 00:01:49,440 Speaker 4: massively by these giant winds everywhere, so the embers are 40 00:01:49,480 --> 00:01:52,680 Speaker 4: being picked up.
Thankfully, the wind has settled down, but 41 00:01:52,800 --> 00:01:55,960 Speaker 4: the wind itself has prevented, you know, the big water 42 00:01:56,040 --> 00:01:59,880 Speaker 4: tankers from flying, which has led to the massive destruction 43 00:02:00,040 --> 00:02:03,200 Speaker 4: that you saw in the Palisades and other areas. 44 00:02:03,280 --> 00:02:06,600 Speaker 4: You know, the entire water fleet being grounded for a 45 00:02:06,640 --> 00:02:10,040 Speaker 4: while just meant that it was burning with no control, 46 00:02:10,400 --> 00:02:14,120 Speaker 4: relying on on-the-ground firefighters. So what we've seen 47 00:02:14,240 --> 00:02:18,360 Speaker 4: is just mass devastation, thousands of homes lost. I think 48 00:02:18,440 --> 00:02:22,720 Speaker 4: there is a death tally, thankfully very low, at about tennish, 49 00:02:22,840 --> 00:02:26,200 Speaker 4: I think I've heard from this morning with confirmation. But yeah, 50 00:02:26,280 --> 00:02:28,040 Speaker 4: that's all we're facing right now. 51 00:02:28,800 --> 00:02:32,600 Speaker 2: Yeah, it's pretty devastating, like whole neighborhoods have gone, right? 52 00:02:32,639 --> 00:02:36,639 Speaker 2: I think I saw like two thousand structures have already 53 00:02:36,680 --> 00:02:39,600 Speaker 2: been burned. And like, as you said, if people aren't 54 00:02:39,639 --> 00:02:42,000 Speaker 2: in the United States or not familiar with how fire 55 00:02:42,240 --> 00:02:45,160 Speaker 2: is, like, out here in the Western United States, 56 00:02:45,680 --> 00:02:48,160 Speaker 2: it's a lot of air-dropping fire retardant and 57 00:02:48,200 --> 00:02:51,760 Speaker 2: air-dropping water, and without that, it's very hard to 58 00:02:51,800 --> 00:02:53,639 Speaker 2: get enough water to where it needs to be.
And 59 00:02:53,680 --> 00:02:55,400 Speaker 2: I believe at one point they actually ran out of 60 00:02:55,400 --> 00:02:57,919 Speaker 2: water in water towers right up in Palisades. 61 00:02:58,520 --> 00:03:03,080 Speaker 4: Yeah, the fire hydrants ran dry in some areas, which 62 00:03:03,120 --> 00:03:05,840 Speaker 4: is terrifying to think of. And we were warned. I'm 63 00:03:05,840 --> 00:03:09,480 Speaker 4: in the Koreatown neighborhood. We were warned about low water pressure. 64 00:03:09,960 --> 00:03:12,680 Speaker 4: And I do know that some areas in Los Angeles, 65 00:03:12,760 --> 00:03:16,160 Speaker 4: particularly in that region, are being warned to boil water 66 00:03:16,400 --> 00:03:18,760 Speaker 4: and that their water is unsafe to drink right now. 67 00:03:19,120 --> 00:03:21,440 Speaker 2: Yeah, I've seen that too. There was a boil water 68 00:03:21,520 --> 00:03:24,800 Speaker 2: warning for, yeah, lots of places. So as a result 69 00:03:25,040 --> 00:03:27,840 Speaker 2: of these fires and all the destruction they've caused, I 70 00:03:27,840 --> 00:03:29,440 Speaker 2: think I saw it was it one hundred and fifty 71 00:03:29,480 --> 00:03:31,959 Speaker 2: thousand odd people have been displaced now, is that? 72 00:03:32,360 --> 00:03:33,720 Speaker 3: Is that right? Is that a good number? 73 00:03:33,919 --> 00:03:36,160 Speaker 4: I saw something large like that of just the people 74 00:03:36,240 --> 00:03:39,600 Speaker 4: that have been evacuated. Right north of me was the 75 00:03:39,640 --> 00:03:43,280 Speaker 4: Sunset fire, and that was very concerningly close to the 76 00:03:43,360 --> 00:03:47,120 Speaker 4: Koreatown neighborhood that is generally never concerned about fires 77 00:03:47,160 --> 00:03:50,560 Speaker 4: because we're so in the concrete jungle, like, we're so insulated.
78 00:03:51,040 --> 00:03:54,360 Speaker 4: I think that's the closest we've come to devastation, and 79 00:03:54,400 --> 00:03:56,480 Speaker 4: we were really stressed out last night just keeping an 80 00:03:56,480 --> 00:03:59,360 Speaker 4: eye on the news, because that's, you know, not even 81 00:03:59,400 --> 00:04:02,320 Speaker 4: two miles away from the core of the densest 82 00:04:02,360 --> 00:04:03,880 Speaker 4: neighborhood of Los Angeles. 83 00:04:04,280 --> 00:04:07,160 Speaker 2: Right, yeah. I guess again, if people aren't familiar, like, fires 84 00:04:07,160 --> 00:04:10,480 Speaker 2: destroy property and kill people every year here, and 85 00:04:10,520 --> 00:04:13,680 Speaker 2: climate change has meant that they have become worse and worse. 86 00:04:14,040 --> 00:04:16,159 Speaker 3: But in the middle of a city, 87 00:04:16,200 --> 00:04:19,520 Speaker 2: you're generally not worried about fires, because the resources will 88 00:04:19,520 --> 00:04:22,159 Speaker 2: be spent to defend that property, right? Like this is 89 00:04:22,160 --> 00:04:25,440 Speaker 2: this is a very unique situation, to see huge parts 90 00:04:25,440 --> 00:04:26,560 Speaker 2: of a city burning down. 91 00:04:27,240 --> 00:04:34,960 Speaker 4: Yeah, particularly the Palisades, which is historically a significantly wealthy neighborhood. Yeah, 92 00:04:35,600 --> 00:04:39,640 Speaker 4: you know, a den of celebrity and Hollywood elites, and 93 00:04:39,800 --> 00:04:43,000 Speaker 4: seeing it devastated just kind of drives home the point 94 00:04:43,040 --> 00:04:45,760 Speaker 4: that, you know, you have wealth that insulates you from 95 00:04:45,880 --> 00:04:48,680 Speaker 4: the worst of what we're facing, but that only goes 96 00:04:48,720 --> 00:04:51,599 Speaker 4: so far.
I saw that there was a couple of 97 00:04:51,600 --> 00:04:55,600 Speaker 4: wealthy people on Twitter begging for private firefighting forces 98 00:04:55,640 --> 00:04:59,000 Speaker 4: to come save their homes. Famously, the same ones that 99 00:04:59,040 --> 00:05:02,360 Speaker 4: are talking about tax evasion and how smart they are to 100 00:05:02,400 --> 00:05:06,400 Speaker 4: do real estate, you know, maneuvering to not pay into 101 00:05:06,440 --> 00:05:10,440 Speaker 4: the social system that helps in these times. Clearly we're 102 00:05:10,640 --> 00:05:15,080 Speaker 4: severely underfunded and severely undermanaged when it comes to the 103 00:05:15,120 --> 00:05:17,240 Speaker 4: government stepping in during these emergencies. 104 00:05:18,080 --> 00:05:21,560 Speaker 2: Yeah, and like that's something I want to address, because 105 00:05:21,600 --> 00:05:25,960 Speaker 2: I think in every natural disaster that I've covered, the 106 00:05:26,040 --> 00:05:27,960 Speaker 2: reason it becomes a disaster, I guess, is because the 107 00:05:28,000 --> 00:05:31,080 Speaker 2: state's incapable of responding in a way that protects people, 108 00:05:31,640 --> 00:05:34,680 Speaker 2: and in almost every case it's people who have to step 109 00:05:34,480 --> 00:05:35,680 Speaker 3: up and look after one another. 110 00:05:35,839 --> 00:05:39,560 Speaker 2: So we should talk about the response of the LA 111 00:05:40,279 --> 00:05:43,520 Speaker 2: city and county governments, and then I'd love to talk 112 00:05:43,560 --> 00:05:45,039 Speaker 2: about the mutual aid response after that. 113 00:05:45,640 --> 00:05:48,800 Speaker 4: Yeah, from what we've seen here in Ktown, if you 114 00:05:48,880 --> 00:05:54,159 Speaker 4: weren't immediately evacuated, there's nothing.
All of our outreach folks 115 00:05:54,200 --> 00:05:56,560 Speaker 4: that were out talking to all of our unhoused 116 00:05:56,600 --> 00:05:59,480 Speaker 4: neighbors here in the area, which are in the hundreds, 117 00:06:00,000 --> 00:06:01,920 Speaker 4: first of all, didn't know what was going on. They 118 00:06:01,920 --> 00:06:04,160 Speaker 4: saw the sky, they assumed there was a fire nearby, 119 00:06:04,240 --> 00:06:07,120 Speaker 4: but they didn't know the swath of the devastation and 120 00:06:07,160 --> 00:06:10,359 Speaker 4: that we were generally threatened as well. They didn't have 121 00:06:10,400 --> 00:06:14,360 Speaker 4: any supplies, and in some areas of Los Angeles we've 122 00:06:14,360 --> 00:06:17,640 Speaker 4: heard as of this morning and yesterday that sweeps have continued. 123 00:06:17,680 --> 00:06:20,839 Speaker 4: So the city has continued throwing away tents from the 124 00:06:20,839 --> 00:06:23,440 Speaker 4: people living on the streets. And then for the housed 125 00:06:23,480 --> 00:06:27,599 Speaker 4: people that have been displaced, there are shelter designations that 126 00:06:27,640 --> 00:06:30,479 Speaker 4: they've set up. Pan Pacific Park is one of them 127 00:06:30,520 --> 00:06:34,919 Speaker 4: for Hollywood. There's one in Pasadena, you know, and the like. 128 00:06:35,120 --> 00:06:37,560 Speaker 4: But it seems to be, you know, a hodgepodge of, 129 00:06:38,000 --> 00:06:42,200 Speaker 4: you know, disorganization, and a lot of, you know, mutual 130 00:06:42,200 --> 00:06:44,720 Speaker 4: aid folks on the ground being the ones to direct 131 00:06:44,720 --> 00:06:48,200 Speaker 4: people and gather the supplies. I have not heard of, 132 00:06:48,440 --> 00:06:51,200 Speaker 4: you know, a very formalized system. There is no 133 00:06:51,279 --> 00:06:54,520 Speaker 4: word on any kind of significant assistance for people who 134 00:06:54,560 --> 00:06:56,640 Speaker 4: have lost their homes at the moment.
I don't know 135 00:06:56,640 --> 00:06:59,120 Speaker 4: if the Red Cross is gonna set a staging zone 136 00:06:59,200 --> 00:07:01,560 Speaker 4: up or anything, but I do know that the people 137 00:07:01,600 --> 00:07:05,640 Speaker 4: who are setting up, you know, places for people to go, food, water, 138 00:07:06,400 --> 00:07:11,120 Speaker 4: even pet care, things like that, have been just random volunteers, 139 00:07:11,160 --> 00:07:14,480 Speaker 4: you know. I'm in this chat group, Mutual Aid LA, that 140 00:07:14,760 --> 00:07:17,480 Speaker 4: spurred up, you know, literally just on Signal the day that 141 00:07:17,520 --> 00:07:20,160 Speaker 4: the fire started, that has a thousand people on it 142 00:07:20,560 --> 00:07:25,800 Speaker 4: mobilizing and distributing and volunteering to move people from one 143 00:07:25,840 --> 00:07:27,960 Speaker 4: area of the city to the other. You know, I 144 00:07:28,000 --> 00:07:29,720 Speaker 4: have this person who needs a place to stay, like, 145 00:07:29,760 --> 00:07:32,800 Speaker 4: who's got a list of places that are open? Because 146 00:07:32,800 --> 00:07:37,400 Speaker 4: when you have disasters this big, you need help quickly. Yeah, 147 00:07:37,440 --> 00:07:41,720 Speaker 4: and bureaucracy just doesn't, you know, that's not built 148 00:07:41,760 --> 00:07:41,960 Speaker 4: for that. 149 00:07:42,640 --> 00:07:43,280 Speaker 3: Yeah, it's not. 150 00:07:43,480 --> 00:07:46,800 Speaker 2: And like, we've definitely seen that there was just a 151 00:07:46,840 --> 00:07:49,440 Speaker 2: failure of the state to respond, like, in the way 152 00:07:49,480 --> 00:07:51,120 Speaker 2: that it needed to, as quickly as it needed to. 153 00:07:51,320 --> 00:07:54,320 Speaker 2: And it's really it's wonderful to see people picking up slack, 154 00:07:54,360 --> 00:07:56,480 Speaker 2: like, of course it is.
It's really beautiful that people 155 00:07:57,520 --> 00:07:59,920 Speaker 2: show up for each other in these times. There's something 156 00:08:00,040 --> 00:08:04,200 Speaker 2: about that that I obviously, like, find really affirming. That's 157 00:08:04,200 --> 00:08:07,040 Speaker 2: maybe why I do this for a living. But yeah, 158 00:08:07,080 --> 00:08:08,720 Speaker 2: it's really beautiful to see. It doesn't mean that we 159 00:08:08,720 --> 00:08:11,960 Speaker 2: should forget that, like, the state has capacity that it 160 00:08:12,040 --> 00:08:14,960 Speaker 2: is using, as you said, to displace people who are unhoused. 161 00:08:15,560 --> 00:08:18,880 Speaker 2: It could be using that capacity to bring masks to people, 162 00:08:19,000 --> 00:08:21,440 Speaker 2: to bring food to people, to create shelter for people. 163 00:08:21,480 --> 00:08:24,200 Speaker 2: It's not. It's choosing to harass people who live on 164 00:08:24,240 --> 00:08:24,600 Speaker 2: the street. 165 00:08:25,160 --> 00:08:27,520 Speaker 4: Yeah, and this is something we see repeatedly. You know, 166 00:08:28,120 --> 00:08:30,320 Speaker 4: it hasn't rained in LA for about eight months, but 167 00:08:30,400 --> 00:08:33,840 Speaker 4: when it did rain, we had historical rains last year. 168 00:08:33,880 --> 00:08:37,520 Speaker 4: In particular, we had a cold front where folks die 169 00:08:37,600 --> 00:08:39,880 Speaker 4: every time, and we know folks are going to die 170 00:08:39,960 --> 00:08:42,240 Speaker 4: every time it rains here in LA. We have more 171 00:08:42,280 --> 00:08:45,439 Speaker 4: people that die of hypothermia in Los Angeles than New 172 00:08:45,480 --> 00:08:49,520 Speaker 4: York and San Francisco combined every year, because hypothermia actually 173 00:08:49,559 --> 00:08:51,840 Speaker 4: doesn't require it to be freezing to set in.
It 174 00:08:51,920 --> 00:08:55,719 Speaker 4: just requires you to be at around sixty degrees and 175 00:08:55,880 --> 00:08:59,320 Speaker 4: be wet, which is very common on the streets here 176 00:08:59,320 --> 00:09:02,719 Speaker 4: of LA. We've seen people get frostbite from having their 177 00:09:02,760 --> 00:09:05,760 Speaker 4: skin against cold concrete, you know, over the night while 178 00:09:05,760 --> 00:09:08,839 Speaker 4: it's raining, and our electeds know this. When I first 179 00:09:08,840 --> 00:09:12,160 Speaker 4: started doing this work, there was a slogan that we 180 00:09:12,160 --> 00:09:14,960 Speaker 4: were chanting, "four a day in LA," and that was 181 00:09:15,000 --> 00:09:17,560 Speaker 4: the number of unhoused people that died every day. And 182 00:09:17,600 --> 00:09:21,080 Speaker 4: now we're at about six or seven. We request, you know, 183 00:09:21,160 --> 00:09:24,760 Speaker 4: through Freedom of Information Act requests, the coroner's report 184 00:09:24,800 --> 00:09:27,120 Speaker 4: every year of how many people died, and that number 185 00:09:27,160 --> 00:09:29,920 Speaker 4: is only growing. And the government knows this. They know 186 00:09:30,000 --> 00:09:31,880 Speaker 4: every time we have a heat wave that there are 187 00:09:31,920 --> 00:09:35,439 Speaker 4: seventy thousand people sleeping on the streets, sleeping in their cars. 188 00:09:35,840 --> 00:09:38,520 Speaker 4: They know that during the winter, you know, people are 189 00:09:38,520 --> 00:09:40,920 Speaker 4: out there in the cold and the rain.
And I 190 00:09:41,360 --> 00:09:44,040 Speaker 4: talk to people who aren't in the organizing space, and 191 00:09:44,080 --> 00:09:47,040 Speaker 4: they ask me, like, well, aren't there, you know, insert 192 00:09:47,080 --> 00:09:49,600 Speaker 4: service here that you think there should be, you know, 193 00:09:49,880 --> 00:09:52,520 Speaker 4: right now during the fires. Like, aren't there vans picking 194 00:09:52,559 --> 00:09:55,120 Speaker 4: people up and taking them to shelter? And it's like, 195 00:09:55,160 --> 00:09:58,640 Speaker 4: that would be wonderful, but there's not. There's never any 196 00:09:58,760 --> 00:10:01,720 Speaker 4: vans picking people up. No, even when they open up 197 00:10:01,720 --> 00:10:06,320 Speaker 4: cooling shelters and warming shelters, the number one barrier we 198 00:10:06,360 --> 00:10:08,240 Speaker 4: heard from people in the streets is, how would I 199 00:10:08,280 --> 00:10:10,360 Speaker 4: get there? And when I get there, they make me 200 00:10:10,559 --> 00:10:12,480 Speaker 4: not bring my stuff in, so it's all going to 201 00:10:12,480 --> 00:10:15,760 Speaker 4: get stolen. There's just all of these barriers that the 202 00:10:15,880 --> 00:10:20,520 Speaker 4: city is just completely, you know, purposely neglecting. They could 203 00:10:20,559 --> 00:10:22,320 Speaker 4: talk to any of us on how to run a 204 00:10:22,360 --> 00:10:26,280 Speaker 4: successful, you know, warming or cooling shelter. They don't, you know, 205 00:10:26,320 --> 00:10:29,040 Speaker 4: they have no interest in what we have to say. 206 00:10:29,480 --> 00:10:32,600 Speaker 4: In fact, our city council person here in Ktown doesn't respond 207 00:10:32,640 --> 00:10:35,959 Speaker 4: to any of our inquiries at all. She just flat 208 00:10:36,000 --> 00:10:38,880 Speaker 4: out doesn't respond to us whenever we email her with 209 00:10:38,920 --> 00:10:42,160 Speaker 4: concerns or questions.
And that's kind of how we've been, 210 00:10:42,640 --> 00:10:45,320 Speaker 4: you know, working, just with the knowledge that we don't 211 00:10:45,360 --> 00:10:48,800 Speaker 4: have the support of these agencies, and in fact they 212 00:10:48,800 --> 00:10:51,840 Speaker 4: are the opposition. You know, we're the ones having to organize 213 00:10:51,840 --> 00:10:53,079 Speaker 4: around them and what they're doing. 214 00:10:53,720 --> 00:10:57,320 Speaker 2: Yeah, it's sadly not that dissimilar here. Like, every time 215 00:10:57,360 --> 00:10:58,840 Speaker 2: it rains, people will die, 216 00:10:59,040 --> 00:11:00,520 Speaker 3: every time we have a heat wave. 217 00:11:01,880 --> 00:11:03,640 Speaker 2: I remember they found the remains of an unhoused 218 00:11:03,679 --> 00:11:05,280 Speaker 2: person a couple of years ago, and they thought the 219 00:11:05,280 --> 00:11:08,199 Speaker 2: person had been burned, like, by fire, and it 220 00:11:08,240 --> 00:11:10,720 Speaker 2: turned out they had just been exposed to massive amounts 221 00:11:10,720 --> 00:11:13,560 Speaker 2: of heat. And yeah, I remember a couple of years ago, 222 00:11:13,600 --> 00:11:16,839 Speaker 2: just to give an anecdote, it was I think above 223 00:11:16,840 --> 00:11:19,120 Speaker 2: one hundred degrees in town. It was so hot, and 224 00:11:19,160 --> 00:11:20,880 Speaker 2: I was in the riverbed, like, I had this 225 00:11:20,920 --> 00:11:24,400 Speaker 2: big insulated backpack to give people cold water, and just 226 00:11:24,640 --> 00:11:28,920 Speaker 2: like dozens of people were in terrible distress. And yeah, 227 00:11:28,920 --> 00:11:33,000 Speaker 2: there was no presence of police, fire, or anyone to help.
228 00:11:33,080 --> 00:11:36,520 Speaker 2: Right? Like, we have these sometimes billion-dollar police departments 229 00:11:36,600 --> 00:11:40,439 Speaker 2: in these cities, and people are still unsafe, and they 230 00:11:40,480 --> 00:11:44,440 Speaker 2: don't feel safe reaching out to any government agencies, because 231 00:11:44,480 --> 00:11:46,280 Speaker 2: these government agencies are the same ones, as you say, 232 00:11:46,320 --> 00:11:48,839 Speaker 2: that throw away their shit, that destroy all the little 233 00:11:48,840 --> 00:11:50,800 Speaker 2: things that they've been trying to build up to get into, 234 00:11:50,880 --> 00:11:52,720 Speaker 2: you know, like a better situation in life. 235 00:11:53,440 --> 00:11:57,000 Speaker 4: Yeah, and I think there's this sense of, like, apathy 236 00:11:57,040 --> 00:11:59,760 Speaker 4: that has built, and rightfully so, from the people that 237 00:11:59,800 --> 00:12:03,720 Speaker 4: live on the streets, where we've, you know, relayed messages 238 00:12:03,720 --> 00:12:05,959 Speaker 4: that we've heard, like, hey, two one one says they 239 00:12:06,000 --> 00:12:08,840 Speaker 4: have one hundred shelter beds tonight, call and see if 240 00:12:08,880 --> 00:12:11,680 Speaker 4: you can get in, and they're like, okay, you know, 241 00:12:11,840 --> 00:12:13,800 Speaker 4: like, I'll give it a shot, you know. And it's 242 00:12:14,080 --> 00:12:18,680 Speaker 4: very well received, because we understand the amount of disappointment 243 00:12:18,679 --> 00:12:21,080 Speaker 4: these people have gone through when they do the Care 244 00:12:21,160 --> 00:12:24,360 Speaker 4: Plus sweeps, which is in itself such an evil name, yeah, 245 00:12:24,920 --> 00:12:27,440 Speaker 4: when they throw all their stuff away. When they show 246 00:12:27,520 --> 00:12:30,160 Speaker 4: up and they do Care Plus, they show up with 247 00:12:30,200 --> 00:12:33,400 Speaker 4: a social worker first, which, if I was a social worker, 248
00:12:33,440 --> 00:12:36,120 Speaker 4: I'd be kicking and screaming about how damaging that is, 249 00:12:36,160 --> 00:12:38,840 Speaker 4: that right before they throw away everything that an unhoused 250 00:12:38,880 --> 00:12:42,200 Speaker 4: person owns, they send in a lone social worker 251 00:12:42,320 --> 00:12:45,360 Speaker 4: to write their names and maybe their numbers down and 252 00:12:45,640 --> 00:12:47,640 Speaker 4: tell them that the shelters are full but they'll get 253 00:12:47,640 --> 00:12:49,280 Speaker 4: back to them, and then they have all of their 254 00:12:49,280 --> 00:12:52,959 Speaker 4: belongings thrown away. I can't imagine the harm that has 255 00:12:53,040 --> 00:12:58,320 Speaker 4: done for just trusting services even when they're available, you know, 256 00:12:58,720 --> 00:13:02,240 Speaker 4: accessing them and then giving them your information. I have one 257 00:13:02,280 --> 00:13:05,439 Speaker 4: person who, rightfully so, told me they have trauma about 258 00:13:05,480 --> 00:13:08,640 Speaker 4: filling out forms, because they've done this three hundred times. 259 00:13:08,640 --> 00:13:11,760 Speaker 4: You know, they said something incredible: they've been counting 260 00:13:11,800 --> 00:13:14,480 Speaker 4: how many times they've filled the same forms out to 261 00:13:15,000 --> 00:13:18,440 Speaker 4: have it lead nowhere. And I can't imagine, you know, 262 00:13:18,559 --> 00:13:22,560 Speaker 4: that kind of resilience. Now, with this devastation, there's going 263 00:13:22,600 --> 00:13:24,160 Speaker 4: to be a lot of homeowners who are going to 264 00:13:24,200 --> 00:13:27,360 Speaker 4: experience that firsthand. I'm seeing a lot of people that 265 00:13:27,360 --> 00:13:29,680 Speaker 4: are homeless for the first time ever in their lives, 266 00:13:29,720 --> 00:13:32,600 Speaker 4: like in their late fifties.
And these are people that 267 00:13:32,679 --> 00:13:35,760 Speaker 4: have owned homes, that have worked careers, that have, you know, 268 00:13:36,679 --> 00:13:38,360 Speaker 4: lived their whole life as you're supposed to in 269 00:13:38,360 --> 00:13:42,160 Speaker 4: the United States, and then in their elder years befall 270 00:13:42,240 --> 00:13:45,040 Speaker 4: some sort of disaster, or Social Security doesn't pay anymore, 271 00:13:45,440 --> 00:13:48,400 Speaker 4: and they are severely shocked when I tell them what 272 00:13:48,520 --> 00:13:51,719 Speaker 4: the landscape of our social safety net looks like. I've 273 00:13:51,720 --> 00:13:53,560 Speaker 4: had people ask me, like, where do I go to 274 00:13:53,600 --> 00:13:56,959 Speaker 4: sign up for free housing? And I have to tell them, 275 00:13:57,160 --> 00:14:00,000 Speaker 4: you know, the wait list for vouchers is fifteen years 276 00:14:00,160 --> 00:14:03,000 Speaker 4: long, and it's a lottery. The list is closed 277 00:14:03,040 --> 00:14:06,000 Speaker 4: because it's so full. You can apply to senior housing, 278 00:14:06,120 --> 00:14:08,240 Speaker 4: but that's about a ten year wait, you know. That 279 00:14:08,559 --> 00:14:10,520 Speaker 4: I have to be the one to tell them that, 280 00:14:10,640 --> 00:14:13,719 Speaker 4: and that sort of shock, I think, is going to 281 00:14:13,760 --> 00:14:16,640 Speaker 4: be hitting a lot of folks that have never tried 282 00:14:16,640 --> 00:14:18,280 Speaker 4: to access services before. 283 00:14:18,800 --> 00:14:21,760 Speaker 2: Yeah, definitely. Let's take a little break here for some 284 00:14:21,960 --> 00:14:35,480 Speaker 2: advertisements and then we'll come back. All right, we're back.
285 00:14:36,040 --> 00:14:39,800 Speaker 2: So yeah, I think anyone who's familiar with the situation 286 00:14:39,920 --> 00:14:44,200 Speaker 2: facing unhoused people in Southern California will understand that there 287 00:14:44,280 --> 00:14:47,120 Speaker 2: is not a safety net, and that's about to become 288 00:14:47,480 --> 00:14:51,880 Speaker 2: more profoundly obvious than ever for thousands of people. Let's 289 00:14:51,880 --> 00:14:54,920 Speaker 2: talk about the way that people are helping to take 290 00:14:54,960 --> 00:14:57,280 Speaker 2: care of one another, because I think that's what 291 00:14:57,360 --> 00:15:00,640 Speaker 2: always happens in these situations. So let's talk about the 292 00:15:00,680 --> 00:15:03,240 Speaker 2: mutual aid effort. Maybe you could, like, talk about some 293 00:15:03,280 --> 00:15:04,760 Speaker 2: of the groups, talk about some of the things you've 294 00:15:04,800 --> 00:15:06,760 Speaker 2: been doing, and then I want to get onto how 295 00:15:06,760 --> 00:15:08,680 Speaker 2: people can help if they're in town, and how people 296 00:15:08,720 --> 00:15:10,520 Speaker 2: can help if they are a long way away. 297 00:15:11,000 --> 00:15:11,280 Speaker 1: Yeah. 298 00:15:11,440 --> 00:15:15,520 Speaker 4: In LA, we have a very robust network of mutual 299 00:15:15,560 --> 00:15:19,760 Speaker 4: aid groups that have been built by force, honestly, via 300 00:15:19,880 --> 00:15:23,360 Speaker 4: this government. I think a lot of them have started 301 00:15:23,440 --> 00:15:27,480 Speaker 4: up to step in. There's just no denying all over 302 00:15:27,640 --> 00:15:30,840 Speaker 4: LA that there's this crisis, because you walk outside of 303 00:15:30,880 --> 00:15:34,040 Speaker 4: your house and there are people sleeping on your street, 304 00:15:34,160 --> 00:15:37,000 Speaker 4: you know, there's people digging through your garbage.
So we've 305 00:15:37,000 --> 00:15:41,240 Speaker 4: seen this blossoming of mutual aid groups all over the city, 306 00:15:41,760 --> 00:15:45,920 Speaker 4: and we, in times of crisis, you know, will spark 307 00:15:46,000 --> 00:15:49,560 Speaker 4: up a Signal group that grows from zero to thousands 308 00:15:49,560 --> 00:15:53,080 Speaker 4: of people overnight that are willing to jump in and 309 00:15:53,120 --> 00:15:57,680 Speaker 4: get their hands dirty to coalesce and find resources. You know, 310 00:15:57,760 --> 00:16:00,240 Speaker 4: here's where we're buying masks. This store is out, don't 311 00:16:00,240 --> 00:16:02,840 Speaker 4: go to this one, go to that one. Who's reimbursing 312 00:16:02,880 --> 00:16:06,480 Speaker 4: people for gas, et cetera, et cetera. And it's 313 00:16:07,160 --> 00:16:09,480 Speaker 4: normal people. You know, I have a full-time job. 314 00:16:09,960 --> 00:16:13,040 Speaker 4: My friends here in Ktown for All, some are teachers, 315 00:16:13,040 --> 00:16:17,480 Speaker 4: some are in the movie industry, you know, some are 316 00:16:17,720 --> 00:16:21,480 Speaker 4: random lawyers, you know, that will take their time out 317 00:16:20,920 --> 00:16:26,120 Speaker 4: to do this work. And I think that it's beautiful 318 00:16:26,160 --> 00:16:28,600 Speaker 4: in the sense that we get people the help they 319 00:16:28,640 --> 00:16:33,600 Speaker 4: need, and it's never enough, which is crushing. Here in Ktown, 320 00:16:34,200 --> 00:16:38,000 Speaker 4: we give supplies to about four hundred or so unhoused 321 00:16:38,040 --> 00:16:41,600 Speaker 4: people a week minimum, and that is hygiene supplies, 322 00:16:41,760 --> 00:16:46,320 Speaker 4: tents, blankets. We connect them to any services that they 323 00:16:46,400 --> 00:16:48,880 Speaker 4: might ask us to connect them to, driving them to 324 00:16:48,920 --> 00:16:52,080 Speaker 4: the hospital, et cetera.
And this has been going on 325 00:16:52,160 --> 00:16:55,360 Speaker 4: for the last five years. And Ktown for All specifically 326 00:16:55,400 --> 00:16:59,520 Speaker 4: started as a counter-protest, because there was an attempt 327 00:16:59,520 --> 00:17:02,920 Speaker 4: to build a shelter here in Koreatown and some homeowners 328 00:17:03,000 --> 00:17:05,840 Speaker 4: organized against it. They marched down Wilshire and shut it down. 329 00:17:05,920 --> 00:17:08,800 Speaker 4: And our founders found each other because they were the 330 00:17:08,840 --> 00:17:13,399 Speaker 4: only five people holding up We Want Shelter signs, and 331 00:17:13,520 --> 00:17:16,600 Speaker 4: just started doing distribution themselves. And I think that's one 332 00:17:16,800 --> 00:17:19,800 Speaker 4: thing that I would really suggest to folks: it's 333 00:17:19,840 --> 00:17:22,320 Speaker 4: not as intimidating as it seems to start one of 334 00:17:22,359 --> 00:17:25,840 Speaker 4: these projects. It's literally you and a couple of friends 335 00:17:25,880 --> 00:17:29,639 Speaker 4: who decide that you're going to do something, and you 336 00:17:29,840 --> 00:17:32,560 Speaker 4: acknowledge that you can't do everything and that you'll never 337 00:17:32,600 --> 00:17:35,120 Speaker 4: be able to meet the need, because what we need 338 00:17:35,240 --> 00:17:38,359 Speaker 4: is a government who cares about people. But in the meanwhile, 339 00:17:38,760 --> 00:17:41,400 Speaker 4: we're going to do the best we can, and the 340 00:17:41,440 --> 00:17:43,879 Speaker 4: lives of the, you know, now four hundred or so 341 00:17:44,000 --> 00:17:46,480 Speaker 4: people that we see every week are a little better 342 00:17:46,560 --> 00:17:48,040 Speaker 4: because we decided to do that. 343 00:17:48,720 --> 00:17:50,919 Speaker 2: Yeah, I think that's really important to say, that like 344 00:17:51,440 --> 00:17:54,520 Speaker 2: it can seem really overwhelming.
This is an email I 345 00:17:54,560 --> 00:17:56,359 Speaker 2: get almost every week, like, how do I start a 346 00:17:56,440 --> 00:17:59,640 Speaker 2: mutual aid group? Like, if you can make a sandwich, 347 00:17:59,680 --> 00:18:01,880 Speaker 2: then you can start a mutual aid group. 348 00:18:01,920 --> 00:18:04,400 Speaker 2: Like, just go and feed people who are hungry. If 349 00:18:04,440 --> 00:18:07,199 Speaker 2: someone's cold, give them a blanket. It doesn't have 350 00:18:07,320 --> 00:18:10,080 Speaker 2: to be like, you don't have to read seventeen books, 351 00:18:10,080 --> 00:18:11,919 Speaker 2: you know, and be starting a 501(c)(3) 352 00:18:12,000 --> 00:18:13,040 Speaker 2: and stuff. 353 00:18:13,080 --> 00:18:15,000 Speaker 3: You just need to do things. 354 00:18:15,119 --> 00:18:18,840 Speaker 2: And I think, especially as we're going into a new administration, 355 00:18:19,600 --> 00:18:22,679 Speaker 2: we're going to see the state being more hostile to 356 00:18:22,680 --> 00:18:26,760 Speaker 2: people who are already marginalized. And like, the best advice I 357 00:18:26,760 --> 00:18:29,000 Speaker 2: have for people is to get off the internet and 358 00:18:29,040 --> 00:18:31,199 Speaker 2: to get into the streets and just do something. It 359 00:18:31,200 --> 00:18:33,159 Speaker 2: doesn't matter that you won't be able to 360 00:18:33,240 --> 00:18:35,600 Speaker 2: do everything, not right away, maybe one day we will, 361 00:18:35,720 --> 00:18:38,560 Speaker 2: but like doing something is a lot better than doing nothing, 362 00:18:38,680 --> 00:18:41,880 Speaker 2: and I guarantee it is also much better for you 363 00:18:42,160 --> 00:18:43,959 Speaker 2: and your mental health. Like, I feel so much better when 364 00:18:43,960 --> 00:18:45,879 Speaker 2: I'm able to help people.
I wouldn't be able to 365 00:18:45,880 --> 00:18:47,520 Speaker 2: do the job I do at the border if I 366 00:18:47,560 --> 00:18:50,600 Speaker 2: wasn't also able to help people. Like, it helps 367 00:18:50,600 --> 00:18:53,560 Speaker 2: me feel like I'm not part of the problem, I guess, 368 00:18:53,720 --> 00:18:56,439 Speaker 2: or like we're doing something about it at least. What 369 00:18:56,480 --> 00:18:59,000 Speaker 2: are people doing right now to help people who are 370 00:18:59,000 --> 00:19:01,439 Speaker 2: impacted by the fire? What are the needs that are 371 00:19:01,440 --> 00:19:03,400 Speaker 2: arising, and how are people meeting them? 372 00:19:03,760 --> 00:19:07,320 Speaker 4: Yeah, well, Ktown for All focuses here in the Ktown neighborhood, 373 00:19:07,320 --> 00:19:10,720 Speaker 4: and what we've particularly focused on is mask distribution. People 374 00:19:10,760 --> 00:19:13,920 Speaker 4: are sitting out, and it's literally raining ash in some areas, 375 00:19:14,400 --> 00:19:17,320 Speaker 4: and are sitting in the soot. So there's that. There's 376 00:19:17,359 --> 00:19:22,840 Speaker 4: basic tent and tarp gathering, meals. So many emergency services 377 00:19:22,880 --> 00:19:26,760 Speaker 4: shut down during disasters, you know, makes sense, but a 378 00:19:26,760 --> 00:19:29,200 Speaker 4: lot of food kitchens that people would get meals from 379 00:19:29,200 --> 00:19:32,119 Speaker 4: are not open right now. So it's getting people food, 380 00:19:32,119 --> 00:19:35,760 Speaker 4: getting people water, just enough to survive. In other areas, 381 00:19:35,880 --> 00:19:39,560 Speaker 4: folks are gathering supplies. There's All Power Books, that is 382 00:19:40,040 --> 00:19:44,159 Speaker 4: a big distribution site right now. Palms Mutual Aid out in 383 00:19:44,200 --> 00:19:47,200 Speaker 4: the Palms area is doing a lot of really great work.
384 00:19:47,520 --> 00:19:50,040 Speaker 4: The South Bay got swept last night, so South Bay 385 00:19:50,119 --> 00:19:53,720 Speaker 4: Mutual Aid Club is replacing tents this morning. There's a 386 00:19:53,720 --> 00:19:56,560 Speaker 4: lot of the pet mutual aid groups who are gathering 387 00:19:56,600 --> 00:19:59,520 Speaker 4: pet food and finding foster homes for a lot of 388 00:19:59,560 --> 00:20:03,040 Speaker 4: the found dogs and cats. It's just, I mean, I 389 00:20:03,080 --> 00:20:05,479 Speaker 4: can't even list the amount of people right now that 390 00:20:05,520 --> 00:20:08,960 Speaker 4: are like in their vehicles doing drop offs to, you know, 391 00:20:09,040 --> 00:20:13,280 Speaker 4: The Sidewalk Project. There's a big Skid Row distribution point 392 00:20:13,560 --> 00:20:16,679 Speaker 4: that is building up, crowdsourcing insulin, things that you 393 00:20:16,680 --> 00:20:18,879 Speaker 4: don't think about that people ran out of their house 394 00:20:19,320 --> 00:20:21,320 Speaker 4: that they need to live. They don't have time to 395 00:20:21,320 --> 00:20:23,920 Speaker 4: go get a prescription, right, you know, at a primary 396 00:20:23,960 --> 00:20:26,960 Speaker 4: care provider. Like, we need albuterol because people are 397 00:20:26,960 --> 00:20:30,080 Speaker 4: having asthma attacks. So there's these kind of burdens that 398 00:20:30,200 --> 00:20:33,680 Speaker 4: mutual aid projects get around, because people, A, don't have to 399 00:20:33,720 --> 00:20:36,400 Speaker 4: fill out any forms, they don't have to wait. If 400 00:20:36,400 --> 00:20:39,120 Speaker 4: we have it, you're going to be handed it. And 401 00:20:39,520 --> 00:20:43,280 Speaker 4: you know, even medical providers as part of our projects 402 00:20:44,000 --> 00:20:47,560 Speaker 4: have become a really big support, as people on the 403 00:20:47,560 --> 00:20:50,480 Speaker 4: streets are often very disabled.
We have a lot of 404 00:20:50,480 --> 00:20:54,520 Speaker 4: folks with diabetes, like diabetic open wounds, like just very 405 00:20:54,560 --> 00:20:58,400 Speaker 4: horrible injuries that need constant care. All Power Bookstore has 406 00:20:58,440 --> 00:21:01,680 Speaker 4: a free clinic, All Power Clinic, and they offer free 407 00:21:01,680 --> 00:21:04,080 Speaker 4: medical care and come with us on our routes here 408 00:21:04,119 --> 00:21:07,399 Speaker 4: in Ktown to offer free treatment for folks. And I 409 00:21:07,400 --> 00:21:11,160 Speaker 4: think that's something that is going to only grow, as 410 00:21:11,200 --> 00:21:15,760 Speaker 4: you said, as this administration occurs. Homelessness rose eighteen percent 411 00:21:15,800 --> 00:21:18,960 Speaker 4: in the last year, and that's only been the case 412 00:21:19,320 --> 00:21:22,880 Speaker 4: every year since we started counting. There is no way 413 00:21:22,920 --> 00:21:26,600 Speaker 4: this administration is going to institute rent control or anything 414 00:21:26,800 --> 00:21:30,879 Speaker 4: that keeps people from being displaced. One mutual aid project that 415 00:21:30,880 --> 00:21:33,520 Speaker 4: I think people overlook often is the tenants unions, the 416 00:21:33,680 --> 00:21:37,800 Speaker 4: LA Tenants Union, mobilizing to care for their members, checking 417 00:21:37,880 --> 00:21:41,960 Speaker 4: in on their disabled members. These kind of community based 418 00:21:42,040 --> 00:21:45,120 Speaker 4: organizations where people know people, they know who to check 419 00:21:45,200 --> 00:21:48,680 Speaker 4: up on, they know who's vulnerable. Those kind of organizations 420 00:21:48,680 --> 00:21:50,960 Speaker 4: are invaluable in emergencies like these.
421 00:21:51,680 --> 00:21:53,919 Speaker 2: Yeah, definitely. And like, one good thing that can 422 00:21:53,960 --> 00:21:56,800 Speaker 2: come out of this is that we can build stronger communities, 423 00:21:57,000 --> 00:22:00,000 Speaker 2: right, and hopefully folks who are finding themselves 424 00:22:00,119 --> 00:22:02,760 Speaker 2: depending on mutual aid for the first time can realize that, 425 00:22:02,840 --> 00:22:05,320 Speaker 2: like, they can participate in that. And I know there 426 00:22:05,320 --> 00:22:07,920 Speaker 2: are folks already who have lost their homes who 427 00:22:07,920 --> 00:22:10,560 Speaker 2: are still out there helping other people, driving around, rescuing 428 00:22:10,600 --> 00:22:11,399 Speaker 2: people and stuff. 429 00:22:12,119 --> 00:22:14,480 Speaker 4: Yeah. And I think we say this all the time 430 00:22:14,680 --> 00:22:18,480 Speaker 4: in the homelessness space, you know: you're closer to being 431 00:22:18,520 --> 00:22:20,960 Speaker 4: homeless than you are to being a billionaire. And I 432 00:22:20,960 --> 00:22:24,520 Speaker 4: think this is one of the most direct examples. Like, 433 00:22:24,600 --> 00:22:28,440 Speaker 4: these people might have been well off maybe a month 434 00:22:28,560 --> 00:22:32,159 Speaker 4: or two ago, and now they have zero. You know, 435 00:22:32,200 --> 00:22:34,639 Speaker 4: they're going to be fighting with insurance companies for maybe 436 00:22:34,680 --> 00:22:39,119 Speaker 4: five years, you know, some of them, and hopefully, 437 00:22:39,400 --> 00:22:41,719 Speaker 4: you know, they end up recovering. But I hope they 438 00:22:41,720 --> 00:22:46,520 Speaker 4: don't forget that climate change and emergency disasters are a 439 00:22:46,520 --> 00:22:51,000 Speaker 4: great equalizer.
And the people that show their faces, they're 440 00:22:51,000 --> 00:22:55,359 Speaker 4: not the politicians, they're not the lobbyists, they're not, you know, 441 00:22:55,480 --> 00:23:00,320 Speaker 4: the Democratic Party, you know, TM. It's your neighbor who 442 00:23:00,520 --> 00:23:03,920 Speaker 4: has a mask for you. It's me, someone random from 443 00:23:03,960 --> 00:23:06,760 Speaker 4: down the block, who got a couple friends together, who 444 00:23:06,840 --> 00:23:10,480 Speaker 4: has water for you. You know, like, that's who comes through, 445 00:23:10,560 --> 00:23:14,120 Speaker 4: and that's who you need to care for all the time, 446 00:23:14,200 --> 00:23:16,960 Speaker 4: including your unhoused neighbors that are around you all 447 00:23:17,000 --> 00:23:20,520 Speaker 4: the time, who live in your community and who face 448 00:23:20,600 --> 00:23:23,560 Speaker 4: this emergency every day. You know, they don't know where 449 00:23:23,560 --> 00:23:25,360 Speaker 4: they're going to sleep every night, they don't know where 450 00:23:25,359 --> 00:23:28,240 Speaker 4: their next meal is coming from. Every day, they get 451 00:23:28,280 --> 00:23:32,200 Speaker 4: their stuff destroyed by the state, you know, regularly, if 452 00:23:32,240 --> 00:23:35,800 Speaker 4: not once a week, very frequently. And this 453 00:23:35,920 --> 00:23:38,280 Speaker 4: is really sad, but I hope it forces some empathy 454 00:23:38,320 --> 00:23:41,359 Speaker 4: in people who otherwise don't think about themselves in this 455 00:23:41,440 --> 00:23:45,159 Speaker 4: context of being a human that needs food, water, and shelter, 456 00:23:45,840 --> 00:23:46,960 Speaker 4: you know, the basics. 457 00:23:47,440 --> 00:23:50,120 Speaker 2: Yeah, talking of food, water, and shelter, the things I need 458 00:23:50,160 --> 00:23:52,719 Speaker 2: as well, and so to pay for them, I have 459 00:23:52,800 --> 00:24:07,480 Speaker 2: to pivot to ads.
Now. Okay, we're back. I think 460 00:24:07,720 --> 00:24:09,680 Speaker 2: that was a really good plug for, like, why mutual 461 00:24:09,720 --> 00:24:12,240 Speaker 2: aid is important, and hopefully there are people who are 462 00:24:12,240 --> 00:24:15,639 Speaker 2: listening, right, or people who are finding themselves for the 463 00:24:15,680 --> 00:24:18,800 Speaker 2: first time interested in helping, seeing a crisis. A lot 464 00:24:18,800 --> 00:24:20,560 Speaker 2: of people like will ask me if they can come 465 00:24:20,560 --> 00:24:23,000 Speaker 2: help at the border, and of course you can, but 466 00:24:23,080 --> 00:24:24,879 Speaker 2: you should also help in your own community, because there 467 00:24:24,880 --> 00:24:27,280 Speaker 2: are people who need you there, and obviously that's very 468 00:24:27,320 --> 00:24:30,560 Speaker 2: true in LA right now. So I want to give 469 00:24:30,680 --> 00:24:33,520 Speaker 2: some resources, some ways people can help. If people are 470 00:24:33,520 --> 00:24:36,239 Speaker 2: listening in LA, what are some, like, I know there 471 00:24:36,240 --> 00:24:39,080 Speaker 2: are all kinds of efforts, but what are some concrete 472 00:24:39,119 --> 00:24:41,400 Speaker 2: things they can do or some places they can go, 473 00:24:41,960 --> 00:24:45,360 Speaker 2: if they're in a situation where they're not massively impacted 474 00:24:45,400 --> 00:24:47,159 Speaker 2: by the fires and they want to help other people? 475 00:24:47,640 --> 00:24:48,959 Speaker 3: What are some things they can do? 476 00:24:49,560 --> 00:24:52,600 Speaker 4: You're free to follow Ktown for All on Instagram. We 477 00:24:52,680 --> 00:24:59,720 Speaker 4: are constantly uploading on our stories year round: fundraisers, resource requests, 478 00:25:00,040 --> 00:25:03,560 Speaker 4: GoFundMes, et cetera.
We really try to stay connected 479 00:25:03,600 --> 00:25:06,560 Speaker 4: with the LA mutual aid network, and honestly, once you 480 00:25:06,600 --> 00:25:08,199 Speaker 4: follow one of us, you kind of follow all of 481 00:25:08,240 --> 00:25:11,240 Speaker 4: us, because we're very supportive of each other's efforts. Mutual 482 00:25:11,280 --> 00:25:13,639 Speaker 4: Aid LA is a good hub. They have a magazine 483 00:25:13,680 --> 00:25:16,119 Speaker 4: that gets published every month that has a list of 484 00:25:16,920 --> 00:25:19,960 Speaker 4: mutual aid programs all over LA. If you can't come 485 00:25:20,000 --> 00:25:22,679 Speaker 4: out on physical outreach with us, which we do 486 00:25:22,760 --> 00:25:25,560 Speaker 4: every Saturday except the first Saturday of the month, 487 00:25:25,560 --> 00:25:29,400 Speaker 4: when we do our planning meeting, you're free to help us, 488 00:25:29,680 --> 00:25:32,840 Speaker 4: you know, connect with others. You're free to help us financially. 489 00:25:33,440 --> 00:25:36,560 Speaker 4: But we also, you know, funny you mentioned this, James, 490 00:25:36,560 --> 00:25:38,720 Speaker 4: but if you DM us and you're like, hey, I 491 00:25:38,800 --> 00:25:41,119 Speaker 4: want to talk to someone about starting a project in 492 00:25:41,160 --> 00:25:44,200 Speaker 4: my region, I'm so happy to hop on a Zoom 493 00:25:44,200 --> 00:25:47,560 Speaker 4: with you, tell you how we do our distribution, tell 494 00:25:47,560 --> 00:25:50,119 Speaker 4: you how we make our maps of encampments, tell you 495 00:25:50,200 --> 00:25:54,480 Speaker 4: how we, you know, fund and route outreach. Always happy to 496 00:25:54,520 --> 00:25:56,720 Speaker 4: share that knowledge. And people message us all the time: Can 497 00:25:56,760 --> 00:26:00,439 Speaker 4: we start a Neighborhood for All chapter? And we're like, 498 00:26:00,440 --> 00:26:02,800 Speaker 4: we're so honored that you would do that.
You don't have to ask, 499 00:26:02,840 --> 00:26:06,280 Speaker 4: but you're totally welcome to. And so we have Pasadena 500 00:26:06,359 --> 00:26:09,480 Speaker 4: for All that is doing great work, and Pasadena for 501 00:26:09,560 --> 00:26:13,200 Speaker 4: All is definitely always in need of support. They are 502 00:26:13,240 --> 00:26:16,760 Speaker 4: in a huge disaster zone. Altadena, Pasadena, like, all those 503 00:26:16,800 --> 00:26:20,840 Speaker 4: areas have been evacuated. Palms Mutual Aid too. But yeah, if you 504 00:26:20,840 --> 00:26:23,240 Speaker 4: want to stay connected, you know, follow us on Instagram, 505 00:26:23,280 --> 00:26:27,280 Speaker 4: Ktown for All, same on Twitter, same on Bluesky, and 506 00:26:27,359 --> 00:26:32,080 Speaker 4: we'll hopefully be your way into the LA mutual aid scene. 507 00:26:32,320 --> 00:26:35,000 Speaker 4: We're always so happy to support anyone else doing this work. 508 00:26:35,720 --> 00:26:38,480 Speaker 4: And while we focus in the Ktown neighborhood, LA is 509 00:26:38,520 --> 00:26:42,000 Speaker 4: a giant place, and if you have any neighborhoods in 510 00:26:42,040 --> 00:26:45,399 Speaker 4: Los Angeles that you feel passionate about or need extra attention, 511 00:26:45,560 --> 00:26:47,520 Speaker 4: you know, we'll always be the ones to uplift those. 512 00:26:48,200 --> 00:26:49,879 Speaker 3: Yeah, that's really cool. I think it's really important that 513 00:26:49,920 --> 00:26:50,399 Speaker 3: we share. 514 00:26:50,600 --> 00:26:53,200 Speaker 2: Like, one of my friends, when we were doing border stuff, 515 00:26:53,200 --> 00:26:55,280 Speaker 2: made a website where we documented all the stuff we 516 00:26:55,320 --> 00:26:57,639 Speaker 2: did so that it was open source and available to people, 517 00:26:57,720 --> 00:27:01,080 Speaker 2: like how we built shelters and so on. But yeah, 518 00:27:01,119 --> 00:27:03,280 Speaker 2: we don't need to reinvent the wheel every time.
Like 519 00:27:03,320 --> 00:27:05,320 Speaker 2: we can all help each other get that start and 520 00:27:05,320 --> 00:27:07,200 Speaker 2: not make the mistakes we all make. So that's 521 00:27:07,200 --> 00:27:09,320 Speaker 2: really cool that people can reach out to you. What 522 00:27:09,440 --> 00:27:11,520 Speaker 2: about if they're a long way away and they just 523 00:27:11,520 --> 00:27:13,480 Speaker 2: want to send some money? They want to help and 524 00:27:13,880 --> 00:27:15,160 Speaker 2: they've got money they want to share. 525 00:27:15,720 --> 00:27:19,160 Speaker 4: Yeah, you're always welcome to Venmo us, Ktown for All, 526 00:27:19,440 --> 00:27:22,199 Speaker 4: same on Venmo. We have a PayPal link, we have 527 00:27:22,240 --> 00:27:25,360 Speaker 4: a website, ktownforall dot org. We are a 528 00:27:25,400 --> 00:27:27,840 Speaker 4: 501(c)(3). If you'd like to donate in, 529 00:27:27,920 --> 00:27:31,240 Speaker 4: you know, some kind of corporate fancy way, 530 00:27:31,320 --> 00:27:34,440 Speaker 4: feel free to DM us. We just got that figured out. 531 00:27:34,600 --> 00:27:38,400 Speaker 4: But yeah, all of our money gets spent directly on 532 00:27:38,640 --> 00:27:41,040 Speaker 4: material goods. We don't have any employees, we don't have 533 00:27:41,080 --> 00:27:44,080 Speaker 4: any overhead.
Our volunteers are up to their necks in 534 00:27:44,160 --> 00:27:47,920 Speaker 4: baby wipes usually when we get, you know, sock donations 535 00:27:47,920 --> 00:27:50,720 Speaker 4: and things like that. And honestly, we prefer it that 536 00:27:50,760 --> 00:27:54,720 Speaker 4: way, just because, you know, we know what nonprofit 537 00:27:54,760 --> 00:27:58,399 Speaker 4: requirements are like and that kind of burden that that 538 00:27:58,480 --> 00:28:01,120 Speaker 4: places on mutual aid projects, and we're trying to avoid them, 539 00:28:01,480 --> 00:28:04,719 Speaker 4: so every dime still goes to supplies. And I know 540 00:28:05,119 --> 00:28:09,760 Speaker 4: every mutual aid project, J-Town Action in Japantown as well, operates 541 00:28:09,760 --> 00:28:12,680 Speaker 4: in a very similar model. I would just suggest people 542 00:28:12,720 --> 00:28:16,880 Speaker 4: get plugged in to Mutual Aid LA. Follow us on Instagram, 543 00:28:17,240 --> 00:28:20,720 Speaker 4: feel free to send any money. We're constantly on our stories 544 00:28:20,960 --> 00:28:25,920 Speaker 4: uploading GoFundMes and Venmos and stuff. I really appreciate the 545 00:28:25,920 --> 00:28:29,360 Speaker 4: help, even from out of the country, and hope that one 546 00:28:29,440 --> 00:28:33,159 Speaker 4: day orgs like ours are not needed anymore, because we 547 00:28:33,200 --> 00:28:34,159 Speaker 4: live in a great world. 548 00:28:34,640 --> 00:28:35,800 Speaker 3: Yeah, yeah, that'd be nice. 549 00:28:36,040 --> 00:28:38,160 Speaker 2: Is there anything else? Like, do you have any bottlenecks 550 00:28:38,240 --> 00:28:40,520 Speaker 2: or particular shortages that you want to shout out that 551 00:28:40,600 --> 00:28:41,520 Speaker 2: the audience can maybe 552 00:28:41,400 --> 00:28:41,880 Speaker 3: help you with?
553 00:28:42,480 --> 00:28:47,160 Speaker 4: We're always looking for staples, so those are tents and 554 00:28:47,320 --> 00:28:52,240 Speaker 4: tarps, constantly. Those are often the most expensive items people 555 00:28:52,280 --> 00:28:55,120 Speaker 4: have to purchase. Tents go for about thirty to forty dollars 556 00:28:55,720 --> 00:28:58,800 Speaker 4: each, and the government throws a lot of them 557 00:28:58,840 --> 00:29:02,960 Speaker 4: away every week. So, those items. Feel free to always 558 00:29:03,040 --> 00:29:05,080 Speaker 4: DM me if you have some that you would like 559 00:29:05,120 --> 00:29:07,680 Speaker 4: to drop off. But I will say, mutual aid orgs 560 00:29:07,720 --> 00:29:11,520 Speaker 4: are really good at building connections directly with vendors, and 561 00:29:11,560 --> 00:29:13,960 Speaker 4: we usually get like a discount when buying in bulk. 562 00:29:14,480 --> 00:29:16,960 Speaker 4: So I would really love to shake people from their 563 00:29:16,960 --> 00:29:18,680 Speaker 4: fear of donating cash. 564 00:29:18,760 --> 00:29:19,480 Speaker 1: Yeah, yeah. 565 00:29:19,440 --> 00:29:21,800 Speaker 4: I know a lot of folks feel comfortable like buying an 566 00:29:21,800 --> 00:29:24,320 Speaker 4: item, because you know that that's the item that's given out. 567 00:29:24,320 --> 00:29:27,040 Speaker 4: But sometimes we get a better deal buying a thousand 568 00:29:27,040 --> 00:29:30,760 Speaker 4: of those tents, and your dollar goes farther. So, you know, 569 00:29:31,280 --> 00:29:33,680 Speaker 4: tents, blankets. And again, don't be afraid to do this 570 00:29:33,760 --> 00:29:35,800 Speaker 4: by yourself. Like, you can go to Home Depot and 571 00:29:35,800 --> 00:29:39,400 Speaker 4: buy a tent and hand it to someone. You can go to 572 00:29:39,480 --> 00:29:41,840 Speaker 4: Home Depot and buy masks right now and hand them 573 00:29:41,840 --> 00:29:44,200 Speaker 4: to someone.
You don't have to wait for a group 574 00:29:44,320 --> 00:29:48,120 Speaker 4: like this to be around and to help, particularly if 575 00:29:48,160 --> 00:29:49,719 Speaker 4: your neighborhood needs you. 576 00:29:50,600 --> 00:29:52,400 Speaker 3: Yeah, and it's a really good message. It's a good 577 00:29:52,400 --> 00:29:53,040 Speaker 3: place to end. 578 00:29:53,320 --> 00:29:55,480 Speaker 2: Just to remind everyone, it's at Ktown for All on 579 00:29:55,520 --> 00:29:58,200 Speaker 2: Instagram and Ktown for All on Venmo. 580 00:29:58,320 --> 00:30:00,280 Speaker 1: Right, yep, great, thanks so 581 00:30:00,360 --> 00:30:01,720 Speaker 4: much. Thank you so much. 582 00:30:15,680 --> 00:30:21,000 Speaker 1: Oh, It Could Happen Here, a podcast from CES, the 583 00:30:21,120 --> 00:30:25,760 Speaker 1: Consumer Electronics Show, twenty twenty five. I am here with 584 00:30:26,000 --> 00:30:30,720 Speaker 1: my friend and work partner, Garrison Davis. We have been 585 00:30:31,520 --> 00:30:35,280 Speaker 1: trotting the boards, the boards being the Las Vegas Convention 586 00:30:35,440 --> 00:30:39,920 Speaker 1: Center, all day. Garrison, today you started earlier than I 587 00:30:39,960 --> 00:30:43,080 Speaker 1: did, because I was catastrophically hungover after getting very drunk 588 00:30:43,120 --> 00:30:45,720 Speaker 1: with the priest last night. Yeah, we had a nice 589 00:30:45,720 --> 00:30:49,560 Speaker 1: dinner and then we set out to experience a fresh 590 00:30:49,600 --> 00:30:51,280 Speaker 1: new hell. And in this case, that fresh new hell 591 00:30:51,440 --> 00:30:54,280 Speaker 1: was what the AI bros have ready for your children. 592 00:30:54,400 --> 00:30:57,480 Speaker 5: No, it's funny how we both stumbled across AI products 593 00:30:57,520 --> 00:31:00,520 Speaker 5: for kids like the same day, at the exact same time. 594 00:31:00,600 --> 00:31:03,040 Speaker 1: Uh huh.
Yeah, it really is remarkable. Like, yeah, 595 00:31:03,160 --> 00:31:05,160 Speaker 1: I guess in part just because, like, AI is such 596 00:31:05,200 --> 00:31:08,320 Speaker 1: a focus. I think it has something to do with 597 00:31:08,360 --> 00:31:11,680 Speaker 1: what you saw some of yesterday, and I had 598 00:31:11,680 --> 00:31:13,760 Speaker 1: caught a little the day before, where they're like, yeah, 599 00:31:13,760 --> 00:31:15,800 Speaker 1: they don't really like this stuff, we're gonna have to 600 00:31:15,840 --> 00:31:18,400 Speaker 1: get around it. Like, obviously this is inevitable, but like 601 00:31:18,440 --> 00:31:21,000 Speaker 1: people really also seem to not enjoy it very much. 602 00:31:21,520 --> 00:31:24,840 Speaker 1: No one can explain why. But I think that this 603 00:31:24,920 --> 00:31:26,479 Speaker 1: may be like, okay, well, if we get them when 604 00:31:26,480 --> 00:31:28,880 Speaker 1: they're young enough, if we train these kids, we can 605 00:31:28,920 --> 00:31:30,760 Speaker 1: force this on them and they'll have no choice but 606 00:31:30,880 --> 00:31:31,360 Speaker 1: to like it. 607 00:31:31,680 --> 00:31:33,880 Speaker 5: And it's interesting you say that, because the first thing 608 00:31:33,880 --> 00:31:35,520 Speaker 5: I did today was go to a panel at the 609 00:31:35,600 --> 00:31:42,360 Speaker 5: Venetian titled Raising AI Kids Responsibly, which is maybe the 610 00:31:42,400 --> 00:31:44,480 Speaker 5: best title for any single panel. 611 00:31:44,600 --> 00:31:46,320 Speaker 1: Yeah, that's that's fucked up. 612 00:31:47,000 --> 00:31:50,400 Speaker 5: The description was: a new generation of kids are being 613 00:31:50,440 --> 00:31:53,360 Speaker 5: brought up with AI technologies as a part of their lives. 614 00:31:54,160 --> 00:31:58,160 Speaker 5: How does this affect their learning, entertainment and socialization? Which 615 00:31:58,200 --> 00:32:00,360 Speaker 5: is a good question.
Yeah, we should be asking that; 616 00:32:00,920 --> 00:32:03,840 Speaker 5: more people should. There were four people on the panel: 617 00:32:04,080 --> 00:32:09,280 Speaker 5: Karen Ruth Wong from IDEO Play Lab Partnerships, Nilo Lewick 618 00:32:09,400 --> 00:32:13,440 Speaker 5: from Skyrocket Toys, Melissa Hunter from Family Video Network, and 619 00:32:13,520 --> 00:32:16,520 Speaker 5: Joshua Garrett from Readyland. And I'll talk about 620 00:32:16,560 --> 00:32:19,440 Speaker 5: all these different companies and people in a 621 00:32:19,560 --> 00:32:22,560 Speaker 5: sec. Yeah. So the panel started with Karen Ruth Wong 622 00:32:22,600 --> 00:32:25,800 Speaker 5: from IDEO, which is the company that first partnered with 623 00:32:25,840 --> 00:32:28,360 Speaker 5: Sesame Workshop to start making online apps. So, 624 00:32:28,480 --> 00:32:30,240 Speaker 5: you know, that was interesting to me, because Sesame 625 00:32:30,240 --> 00:32:33,080 Speaker 5: Workshop generally puts a lot of care into, like, 626 00:32:33,160 --> 00:32:35,680 Speaker 5: you know, making media for children.
This is a company 627 00:32:35,720 --> 00:32:37,960 Speaker 5: that works with them, so I was interested in what 628 00:32:38,080 --> 00:32:41,640 Speaker 5: she was going to say. And basically she talked not 629 00:32:41,760 --> 00:32:45,200 Speaker 5: about any products that her company's making, but instead research 630 00:32:45,360 --> 00:32:49,600 Speaker 5: into how, like, AI is affecting Gen Z, how Gen 631 00:32:49,720 --> 00:32:52,600 Speaker 5: Z wants to, like, interact with AI, and talked about 632 00:32:52,600 --> 00:32:54,440 Speaker 5: a whole bunch of research that her company has been 633 00:32:54,480 --> 00:32:57,280 Speaker 5: doing for the past few years on, like, what people, 634 00:32:57,400 --> 00:32:59,800 Speaker 5: you know, my age and, you know, younger, what their 635 00:32:59,800 --> 00:33:03,120 Speaker 5: attitudes are towards this thing that has become like an 636 00:33:03,120 --> 00:33:06,680 Speaker 5: increasingly encroaching part of their lives. I'm just going to 637 00:33:06,920 --> 00:33:08,479 Speaker 5: play a series of clips. 638 00:33:08,720 --> 00:33:11,480 Speaker 1: Couldn't be more excited. 639 00:33:11,520 --> 00:33:13,160 Speaker 6: So I was sharing this morning a little bit about what we're learning. 640 00:33:14,080 --> 00:33:17,080 Speaker 7: The question is: what if the tech savvy generation isn't 641 00:33:17,120 --> 00:33:19,120 Speaker 7: buying anymore?
We have a lot of 642 00:33:19,200 --> 00:33:21,880 Speaker 7: interesting opinions and assumptions in our heads that these are 643 00:33:21,880 --> 00:33:23,479 Speaker 7: the ones that are going to be the first users 644 00:33:23,480 --> 00:33:25,440 Speaker 7: and the first viewers, and in many ways they are, 645 00:33:25,800 --> 00:33:28,760 Speaker 7: but they're also the ones with the most informed 646 00:33:28,800 --> 00:33:32,440 Speaker 7: opinions, not just about how badly the tech feels, how 647 00:33:32,520 --> 00:33:35,120 Speaker 7: cringey some of it may be landing, but also how 648 00:33:35,160 --> 00:33:37,680 Speaker 7: it's affecting their sense of humanity. 649 00:33:38,000 --> 00:33:39,880 Speaker 1: That's fascinating. Yeah. 650 00:33:39,960 --> 00:33:42,160 Speaker 5: The very first thing, this is literally like minutes 651 00:33:42,160 --> 00:33:44,040 Speaker 5: into the panel, like after they do their introductions, 652 00:33:44,080 --> 00:33:46,680 Speaker 5: the first thing they talk about is how Gen Z 653 00:33:47,120 --> 00:33:49,240 Speaker 5: is both an early adopter of new tech, but they're 654 00:33:49,240 --> 00:33:51,240 Speaker 5: also kind of the most AI critical. 655 00:33:51,480 --> 00:33:51,720 Speaker 8: Yeah. 656 00:33:51,800 --> 00:33:53,000 Speaker 1: Yeah, yeah, it's cringey. 657 00:33:53,280 --> 00:33:56,240 Speaker 5: Yeah, like how it feels cringey, and not just that, 658 00:33:56,400 --> 00:34:00,320 Speaker 5: how it's affecting people's sense of humanity, and viewing them, like, 659 00:34:00,360 --> 00:34:02,080 Speaker 5: you know, in some ways, as like an obstacle to 660 00:34:02,120 --> 00:34:04,440 Speaker 5: get over.
But also, this is, I'm not sure how 661 00:34:04,480 --> 00:34:06,720 Speaker 5: I feel about, like, you know, Karen and the 662 00:34:06,720 --> 00:34:09,879 Speaker 5: company she's representing here, because in some ways I felt 663 00:34:09,960 --> 00:34:13,720 Speaker 5: like she's probably actually good. She just had to frame 664 00:34:13,920 --> 00:34:16,000 Speaker 5: all of the things she was saying as, like, shocking 665 00:34:16,040 --> 00:34:19,040 Speaker 5: revelations to all these tech bros, be like, yeah, actually, 666 00:34:19,400 --> 00:34:22,520 Speaker 5: it turns out kids surprisingly don't want their lives run 667 00:34:22,520 --> 00:34:22,960 Speaker 5: by AI. 668 00:34:23,120 --> 00:34:25,520 Speaker 1: Yeah, don't want to communicate only with AI. 669 00:34:25,760 --> 00:34:27,319 Speaker 5: I actually liked what she was saying. It's just 670 00:34:27,360 --> 00:34:29,560 Speaker 5: her presentation of it felt kind of odd at times, 671 00:34:29,840 --> 00:34:31,560 Speaker 5: because of who the audience is. 672 00:34:31,640 --> 00:34:33,160 Speaker 1: Do you get the feeling that 673 00:34:33,200 --> 00:34:37,080 Speaker 1: she was, like, a bad person trying to help, like, 674 00:34:37,239 --> 00:34:40,560 Speaker 1: other bad people sell poison to children, or somebody who 675 00:34:40,600 --> 00:34:43,000 Speaker 1: was trying to, like, in a way that these guys 676 00:34:43,040 --> 00:34:44,920 Speaker 1: would listen to, tell them that what they're doing isn't 677 00:34:45,000 --> 00:34:47,799 Speaker 5: working? Maybe like twenty eighty. So, like, a little 678 00:34:47,840 --> 00:34:49,759 Speaker 5: bit of, like, yeah, we have to sell some of this, 679 00:34:50,120 --> 00:34:53,720 Speaker 5: but mostly it felt like trying to inform people about 680 00:34:53,880 --> 00:34:56,600 Speaker 5: how this isn't really what people want and, you know, 681 00:34:56,640 --> 00:34:59,200 Speaker 5: it has a lot of actual drawbacks.
Here's a clip 682 00:34:59,239 --> 00:35:01,440 Speaker 5: of Karen talking about the sort of questions that they're 683 00:35:01,480 --> 00:35:04,239 Speaker 5: asking kids to, you know, get data on how they 684 00:35:04,239 --> 00:35:05,040 Speaker 5: feel about AI. 685 00:35:06,239 --> 00:35:10,040 Speaker 7: Here's a few provocative ones. We really put out tangible expressions 686 00:35:10,480 --> 00:35:10,719 Speaker 7: of what it 687 00:35:10,719 --> 00:35:12,520 Speaker 6: would be like to interact with 688 00:35:12,600 --> 00:35:16,240 Speaker 7: a potential AI tool. And so we asked questions like, okay, 689 00:35:16,360 --> 00:35:19,240 Speaker 7: you've recently had a friend breakup. What kind of intervention 690 00:35:19,360 --> 00:35:21,560 Speaker 7: do you want? Do you want someone to counsel you 691 00:35:21,600 --> 00:35:24,040 Speaker 7: through that process, or do you want someone to kind 692 00:35:24,040 --> 00:35:26,040 Speaker 7: of replace that friend for the time being, just so you 693 00:35:26,080 --> 00:35:29,279 Speaker 7: can, you know, back yourself out from that relationship. So 694 00:35:29,360 --> 00:35:32,799 Speaker 7: by asking really tangible questions, by putting prototypes in front 695 00:35:32,800 --> 00:35:34,799 Speaker 7: of youth, we were able to co-design and view 696 00:35:35,160 --> 00:35:39,520 Speaker 7: insights. This one always gets all audience members. We 697 00:35:39,640 --> 00:35:43,920 Speaker 7: put out a provocational expression of: imagine you could have 698 00:35:44,360 --> 00:35:48,319 Speaker 7: an AI trained on your preferences, on your personality, live 699 00:35:48,360 --> 00:35:51,280 Speaker 7: your life for you. Imagine they can swipe your Tinder 700 00:35:51,280 --> 00:35:55,040 Speaker 7: for you. They would have the initial conversations, or they 701 00:35:55,040 --> 00:35:57,640 Speaker 7: would go through the awkward introductions, you know, new person 702 00:35:57,680 --> 00:35:58,440 Speaker 7: in school.
703 00:35:58,800 --> 00:36:00,640 Speaker 6: And we heard some really interesting things. 704 00:36:01,120 --> 00:36:02,880 Speaker 7: I want to go on a bad date for myself 705 00:36:02,880 --> 00:36:04,280 Speaker 7: and I want to have that conversation. 706 00:36:05,160 --> 00:36:09,440 Speaker 6: There was a really interesting sign that being able to 707 00:36:09,440 --> 00:36:11,160 Speaker 6: live life for yourself is a badge of honor. 708 00:36:12,440 --> 00:36:15,040 Speaker 5: Being able to live life for yourself is a badge 709 00:36:15,160 --> 00:36:15,680 Speaker 5: of honor. 710 00:36:15,920 --> 00:36:18,480 Speaker 1: Amazing that human beings don't want a robot to replace 711 00:36:18,560 --> 00:36:21,360 Speaker 1: them in such drudgery as the search for love and 712 00:36:21,480 --> 00:36:26,200 Speaker 1: human connection. Incredible that teens aren't interested in letting 713 00:36:26,239 --> 00:36:28,440 Speaker 1: a robot go on dates for them. 714 00:36:28,640 --> 00:36:30,640 Speaker 5: No, it's super interesting, and like even the first 715 00:36:30,840 --> 00:36:33,720 Speaker 5: thing she said about, you know, you like lost some friends. 716 00:36:34,200 --> 00:36:36,239 Speaker 5: Do you want an AI to, like, you know, like 717 00:36:36,400 --> 00:36:38,160 Speaker 5: counsel you, or like, you know, like talk 718 00:36:38,200 --> 00:36:41,200 Speaker 5: about your feelings, or do you want a friend replacement? 719 00:36:41,800 --> 00:36:45,080 Speaker 5: And no, people don't want a friend replacement. And this 720 00:36:45,239 --> 00:36:47,600 Speaker 5: even, like, the question of, like, you know, AI 721 00:36:48,040 --> 00:36:50,239 Speaker 5: swiping your Tinder for you, trying to figure out what 722 00:36:50,320 --> 00:36:53,719 Speaker 5: your preferences are. No, like gen Z wants to live 723 00:36:53,800 --> 00:36:56,400 Speaker 5: life for themselves. It's like, it's odd because I.
724 00:36:57,320 --> 00:36:59,239 Speaker 1: Because that's what being a person is, that's what it is to be a person. 725 00:36:59,280 --> 00:37:01,120 Speaker 5: That's right. But like, it's all how that's framed as 726 00:37:01,160 --> 00:37:02,920 Speaker 5: like a surprising revelation. 727 00:37:03,120 --> 00:37:04,720 Speaker 1: Wow, these kids want to live lives. 728 00:37:05,320 --> 00:37:06,839 Speaker 5: So yeah, it was kind of an 729 00:37:06,840 --> 00:37:09,399 Speaker 5: odd place to go to. She highlighted that the key 730 00:37:09,480 --> 00:37:12,560 Speaker 5: areas of tension in AI for gen Z are 731 00:37:12,880 --> 00:37:17,480 Speaker 5: twofold: creative expression and human relationships. These are the 732 00:37:17,520 --> 00:37:21,759 Speaker 5: two biggest things that people are concerned about, is how 733 00:37:21,960 --> 00:37:24,359 Speaker 5: it will affect your ability to, you know, make art, 734 00:37:24,480 --> 00:37:26,640 Speaker 5: be creative, and what it means for, like, you know, 735 00:37:26,840 --> 00:37:29,080 Speaker 5: relationships as a human being, right, especially if you're being 736 00:37:29,080 --> 00:37:31,640 Speaker 5: asked questions about, you know, would you let an AI 737 00:37:32,239 --> 00:37:34,520 Speaker 5: like meet someone that you want to date first, have 738 00:37:35,719 --> 00:37:37,800 Speaker 5: them go through like a first, like, fake AI 739 00:37:37,960 --> 00:37:42,280 Speaker 5: date to, like, get through like icebreaker questions 740 00:37:42,400 --> 00:37:42,760 Speaker 5: or something. 741 00:37:43,239 --> 00:37:46,080 Speaker 1: The amount of people I meet who feel that way 742 00:37:46,120 --> 00:37:49,120 Speaker 1: about like their digital twins, or like who take pride 743 00:37:49,239 --> 00:37:51,640 Speaker 1: in having like an AI trained off of their social 744 00:37:51,760 --> 00:37:54,600 Speaker 1: media posts at events like these.
It's shocking to me 745 00:37:54,760 --> 00:37:58,600 Speaker 1: because, like, do you feel good about saying that? A 746 00:37:58,920 --> 00:38:01,480 Speaker 1: chatbot that you feel like it is you, that you 747 00:38:01,560 --> 00:38:04,920 Speaker 1: have trained the chatbot to be a reasonable simulacrum of yourself. 748 00:38:04,960 --> 00:38:08,200 Speaker 1: Do you feel good about thinking that? Does that make 749 00:38:08,239 --> 00:38:10,080 Speaker 1: you happy about yourself? 750 00:38:10,320 --> 00:38:10,480 Speaker 9: Well? 751 00:38:10,560 --> 00:38:13,960 Speaker 5: And the data that person was talking about showed no, like, yeah, 752 00:38:14,000 --> 00:38:16,840 Speaker 5: people actually don't want these things. Like no, that actually 753 00:38:17,000 --> 00:38:19,439 Speaker 5: isn't what anyone wants out of life. This isn't 754 00:38:19,440 --> 00:38:21,640 Speaker 5: what anyone wants out of this technology, right? Like we 755 00:38:21,760 --> 00:38:23,520 Speaker 5: use AI all the time, you know, like, you know, 756 00:38:23,719 --> 00:38:26,239 Speaker 5: like autocomplete. It has a whole bunch of, like, 757 00:38:26,239 --> 00:38:27,480 Speaker 5: you know, pretty basic uses. 758 00:38:27,719 --> 00:38:30,160 Speaker 1: Yeah, it saves me from having to spell certain words too 759 00:38:30,200 --> 00:38:30,680 Speaker 1: many times. 760 00:38:30,840 --> 00:38:32,920 Speaker 5: Yeah, but we don't want it to, like, go on 761 00:38:33,120 --> 00:38:35,200 Speaker 5: dates for us. And the whole part of being human 762 00:38:35,360 --> 00:38:37,520 Speaker 5: is having, you know, a degree of bad experiences, and 763 00:38:37,640 --> 00:38:40,520 Speaker 5: that helps shape us as people. And this isn't 764 00:38:40,560 --> 00:38:42,600 Speaker 5: like a hurdle to get over. This is like a 765 00:38:42,719 --> 00:38:44,799 Speaker 5: part of what it means to be human.
And then 766 00:38:45,000 --> 00:38:46,440 Speaker 5: she kind of talked about that a little bit more 767 00:38:46,719 --> 00:38:48,240 Speaker 5: in this last clip that I'll play. 768 00:38:50,440 --> 00:38:53,279 Speaker 6: The next one here: I prefer to give opportunities to 769 00:38:53,320 --> 00:38:54,520 Speaker 6: people over technology. 770 00:38:55,120 --> 00:38:58,479 Speaker 7: I think these are the ones, and again, they've seen 771 00:38:59,000 --> 00:39:03,879 Speaker 7: what it's like to feel replaced. I'll definitely share a lot more, 772 00:39:04,000 --> 00:39:07,319 Speaker 7: but starting off with a few key learnings: gen Zs 773 00:39:07,360 --> 00:39:09,560 Speaker 7: value advice and perspective from lived experience. 774 00:39:10,040 --> 00:39:11,720 Speaker 6: There's something about designing for friction. 775 00:39:12,080 --> 00:39:15,440 Speaker 7: I'm gonna highlight, like, a design for friction in our age 776 00:39:15,520 --> 00:39:18,840 Speaker 7: of optimization, and our age of assuming that everything should 777 00:39:18,840 --> 00:39:20,800 Speaker 7: move as fast as possible to make life as smooth 778 00:39:20,840 --> 00:39:24,279 Speaker 7: as possible. There's something about the challenge, and that comes 779 00:39:24,320 --> 00:39:27,120 Speaker 7: back to play, right? Why would we spend so much 780 00:39:27,200 --> 00:39:30,160 Speaker 7: time trying to hit a ball several hundred yards away? 781 00:39:31,000 --> 00:39:33,680 Speaker 6: There's something about the joy of achieving, the joy 782 00:39:33,520 --> 00:39:36,759 Speaker 7: of overcoming challenge, the joy of getting through your first 783 00:39:36,840 --> 00:39:39,600 Speaker 7: friend breakup, your boyfriend or girlfriend breakup. 784 00:39:39,480 --> 00:39:40,640 Speaker 6: That makes you into a person.
785 00:39:41,320 --> 00:39:44,320 Speaker 7: And as many times as helicopter parents, or as people 786 00:39:44,440 --> 00:39:47,719 Speaker 7: who design technology, assume that the smoothest possible path 787 00:39:48,040 --> 00:39:49,280 Speaker 7: is the best possible path. 788 00:39:49,320 --> 00:39:53,320 Speaker 1: There's some pushback there, some pushback, some pushback to 789 00:39:53,400 --> 00:39:57,359 Speaker 1: the idea that, like, you should live a life, your 790 00:39:57,400 --> 00:39:59,040 Speaker 1: one precious life should be lived. 791 00:39:59,400 --> 00:40:02,120 Speaker 5: No, there's a bunch of interesting stuff there. Gen 792 00:40:02,200 --> 00:40:04,480 Speaker 5: Z has great fears about being replaced. Yeah, you know, 793 00:40:04,640 --> 00:40:09,080 Speaker 5: like workforce replacement. Gen Z prefers to actually 794 00:40:09,120 --> 00:40:12,160 Speaker 5: like make connections and network with other people our age 795 00:40:12,200 --> 00:40:13,560 Speaker 5: and actually, like, share opportunities. 796 00:40:13,800 --> 00:40:14,000 Speaker 1: Yeah. 797 00:40:14,239 --> 00:40:16,399 Speaker 5: In previous panels, this was something that was also talked 798 00:40:16,400 --> 00:40:18,800 Speaker 5: about, how millennials were way more, like, selective about, like, 799 00:40:18,880 --> 00:40:21,520 Speaker 5: sharing, like, employment opportunities, because they were, like, so focused 800 00:40:21,560 --> 00:40:23,480 Speaker 5: on, like, making sure that they make it, and there's 801 00:40:23,480 --> 00:40:26,759 Speaker 5: a lot more, like, open collaboration and sharing opportunities. 802 00:40:26,760 --> 00:40:28,960 Speaker 1: It's harder, so you guys have to be better about that. 803 00:40:29,440 --> 00:40:33,160 Speaker 5: Yeah, yeah, no, talking about, you know, like, designing for friction. 804 00:40:33,719 --> 00:40:36,120 Speaker 5: There's, like, there's value in something being challenging.
805 00:40:36,280 --> 00:40:39,600 Speaker 1: That was very interesting, the surprise about that, because 806 00:40:39,600 --> 00:40:42,319 Speaker 1: it is, it is this kind of, I'm sure most 807 00:40:42,400 --> 00:40:45,040 Speaker 1: of these people were born to wealth and privilege, and 808 00:40:45,120 --> 00:40:47,440 Speaker 1: the first thing that people do with money, the primary 809 00:40:47,520 --> 00:40:50,439 Speaker 1: reason to have money, is to reduce friction. The fact 810 00:40:50,440 --> 00:40:53,640 Speaker 1: that that's surprising to anyone, that, like, no, friction's 811 00:40:53,760 --> 00:40:56,680 Speaker 1: necessary, otherwise you're not a person. I mean, it's like that. 812 00:40:56,880 --> 00:40:59,759 Speaker 1: It's like the ghoul we saw the other night, right, 813 00:40:59,880 --> 00:41:03,560 Speaker 1: like, you know, they're just not really people, you know. 814 00:41:04,280 --> 00:41:05,759 Speaker 5: One thing she kind of closed on in this section 815 00:41:05,880 --> 00:41:08,640 Speaker 5: is talking about how gen Z does not trust AI 816 00:41:08,800 --> 00:41:12,120 Speaker 5: to understand the nuance of their lives, especially in this 817 00:41:12,200 --> 00:41:15,640 Speaker 5: age of, like, tech optimization. Like, that misses a part 818 00:41:15,719 --> 00:41:17,919 Speaker 5: of what it means to, like, you know, feel proud 819 00:41:17,960 --> 00:41:19,319 Speaker 5: of yourself and the work that you've done. 820 00:41:19,440 --> 00:41:19,640 Speaker 4: Yeah.
821 00:41:19,800 --> 00:41:21,160 Speaker 5: Something she did talk about at the very end of 822 00:41:21,200 --> 00:41:23,440 Speaker 5: the panel was, like, how they hadn't factored in, like, 823 00:41:23,760 --> 00:41:26,239 Speaker 5: gen Z, you know, and people in general, right, 824 00:41:26,440 --> 00:41:28,839 Speaker 5: will feel proud about, you know, making a piece 825 00:41:28,880 --> 00:41:32,479 Speaker 5: of art, yeah, and they don't have that same sense 826 00:41:32,520 --> 00:41:35,879 Speaker 5: of pride for an AI generated image, no, whether it's 827 00:41:35,880 --> 00:41:39,719 Speaker 5: like a screenplay, whether it's whatever. Someone gave an example of, like, 828 00:41:39,920 --> 00:41:42,120 Speaker 5: you know, I have a kid who does creative stuff. 829 00:41:42,400 --> 00:41:44,840 Speaker 5: They edit videos, right, and there are AI tools that 830 00:41:44,920 --> 00:41:47,839 Speaker 5: make editing videos, like, easier. But if the AI does 831 00:41:47,920 --> 00:41:50,680 Speaker 5: all the work, they don't feel happy about that. Like, 832 00:41:50,800 --> 00:41:53,319 Speaker 5: they don't feel proud, they don't feel like they've actually 833 00:41:53,360 --> 00:41:55,680 Speaker 5: achieved something. And you have to feel proud about the 834 00:41:55,719 --> 00:41:57,799 Speaker 5: work that you've done, so, like, there's actually a sense 835 00:41:57,840 --> 00:41:59,960 Speaker 5: of, like, ownership over, like, the art that we create. 836 00:42:00,040 --> 00:42:00,160 Speaker 7: Yeah. 837 00:42:00,680 --> 00:42:05,920 Speaker 5: An exact quote was, quote, you can't eliminate life's formative aspects. 838 00:42:06,719 --> 00:42:08,600 Speaker 1: Which is like, yes, it is called life. Yeah, 839 00:42:09,000 --> 00:42:10,320 Speaker 1: otherwise you don't ever have anything. 840 00:42:10,520 --> 00:42:13,279 Speaker 5: I'm happy someone at CES is saying this, the fact 841 00:42:13,320 --> 00:42:14,760 Speaker 5: that it needs to be said
842 00:42:14,719 --> 00:42:18,480 Speaker 1: At all, very bleak, very sad. It's really bleak. Yeah, 843 00:42:18,840 --> 00:42:24,640 Speaker 1: dating people, making friends, being social, doing whatever it is 844 00:42:24,760 --> 00:42:27,760 Speaker 1: you do for a living as yourself is what life 845 00:42:27,960 --> 00:42:29,080 Speaker 1: is like. 846 00:42:29,440 --> 00:42:29,640 Speaker 10: Yeah. 847 00:42:29,680 --> 00:42:31,160 Speaker 5: I think the last thing that she talked about is, like, 848 00:42:31,400 --> 00:42:35,279 Speaker 5: gen Z aren't technophobes, but they do have strong boundaries. 849 00:42:35,440 --> 00:42:35,640 Speaker 11: Yeah. 850 00:42:35,719 --> 00:42:38,080 Speaker 5: Good, and they have to reinforce their own sense of 851 00:42:38,160 --> 00:42:43,920 Speaker 5: self, because we're constantly being bombarded with, you know, slop content, influencers, podcasts, 852 00:42:44,040 --> 00:42:47,080 Speaker 5: live streams, like everything, you know, TikTok, social media. So 853 00:42:47,200 --> 00:42:50,120 Speaker 5: we have strong boundaries on how tech, like, integrates 854 00:42:50,480 --> 00:42:52,440 Speaker 5: into our lives. And a lot of the ways these 855 00:42:52,480 --> 00:42:55,520 Speaker 5: tech bros want AI to, like, become more invasive, we 856 00:42:55,600 --> 00:42:57,400 Speaker 5: are not super into, no. 857 00:42:57,760 --> 00:43:00,520 Speaker 1: Like, all they're offering people is, like, this machine 858 00:43:00,600 --> 00:43:03,600 Speaker 1: will do everything that you actually want to do with 859 00:43:03,719 --> 00:43:06,040 Speaker 1: your time, and also you won't have a job. Like, 860 00:43:06,160 --> 00:43:08,320 Speaker 1: that's what big tech is promising gen Z. 861 00:43:08,600 --> 00:43:11,280 Speaker 5: Yeah, so that's how I started my day. 862 00:43:10,960 --> 00:43:15,600 Speaker 1: Speaking of gen Z.
Z stands for zillions of dollars that 863 00:43:15,719 --> 00:43:28,719 Speaker 1: we'll get if you listen to these ads. And we're back. 864 00:43:29,680 --> 00:43:32,880 Speaker 5: So unfortunately, that panel wasn't just talking about how kids 865 00:43:33,000 --> 00:43:35,480 Speaker 5: maybe don't want AI to run their lives. It also 866 00:43:35,600 --> 00:43:38,399 Speaker 5: had two other people from AI products. The first one 867 00:43:38,760 --> 00:43:42,120 Speaker 5: that I'll mention is called Readyland, which I think 868 00:43:42,120 --> 00:43:44,279 Speaker 5: partnered with Amazon to some degree. It 869 00:43:44,320 --> 00:43:48,040 Speaker 5: at least uses, like, Amazon Alexa. It's essentially a choose-your-own-adventure 870 00:43:48,040 --> 00:43:50,800 Speaker 5: storybook with, like, a natural physical copy that 871 00:43:50,920 --> 00:43:53,040 Speaker 5: Alexa will read to you, and you can talk to 872 00:43:53,200 --> 00:43:55,760 Speaker 5: it, so you can talk to characters and choose different pathways. 873 00:43:56,040 --> 00:43:57,680 Speaker 5: I was more skeptical of that at first, because I 874 00:43:57,760 --> 00:44:00,640 Speaker 5: just don't like AIs reading books to kids. But it became more 875 00:44:00,760 --> 00:44:04,160 Speaker 5: like an interactive story thing, and it actually seemed kind 876 00:44:04,239 --> 00:44:07,360 Speaker 5: of good at what it was doing. And then the 877 00:44:07,440 --> 00:44:10,960 Speaker 5: guy behind it clarified Readyland is not using AI 878 00:44:11,120 --> 00:44:15,080 Speaker 5: to generate new content for kids.
It's all, like, pre-programmed, 879 00:44:15,120 --> 00:44:18,000 Speaker 5: like, human-written paths, you know, just with so many variables 880 00:44:18,080 --> 00:44:20,160 Speaker 5: already built in, based on, you know, like, if you're 881 00:44:20,200 --> 00:44:22,239 Speaker 5: making food in one of these books, or, like, you know, 882 00:44:22,280 --> 00:44:24,160 Speaker 5: a kid wants to go on, like, a weird side quest. 883 00:44:24,480 --> 00:44:27,680 Speaker 5: The AI already has, like, stuff for how to handle that. Yeah, 884 00:44:27,680 --> 00:44:28,839 Speaker 5: it knows how to say these words and knows how 885 00:44:28,840 --> 00:44:31,240 Speaker 5: to stitch together these things. But it's not actually generating 886 00:44:31,640 --> 00:44:34,719 Speaker 5: new content itself. If everything is, like, pre-baked, 887 00:44:34,719 --> 00:44:37,440 Speaker 5: it can be assembled in many different ways. Okay, so 888 00:44:37,560 --> 00:44:39,759 Speaker 5: every time you read a book to the kid, it'll 889 00:44:39,800 --> 00:44:43,120 Speaker 5: be slightly different, because the kid will respond to certain 890 00:44:43,200 --> 00:44:45,960 Speaker 5: plot elements. The kid can, like, talk to characters, 891 00:44:46,239 --> 00:44:49,280 Speaker 5: ask questions. So, like, this was actually pretty interesting. 892 00:44:49,560 --> 00:44:52,000 Speaker 5: The fact that it's simply not even generating new content 893 00:44:52,280 --> 00:44:55,400 Speaker 5: makes it miles better than any of these other AI 894 00:44:55,600 --> 00:44:56,320 Speaker 5: kids' products. 895 00:44:56,360 --> 00:44:58,440 Speaker 1: That it's actually just kind of using some of the 896 00:44:58,480 --> 00:45:00,759 Speaker 1: tech that makes up AI, exactly, to allow you to make 897 00:45:01,239 --> 00:45:03,920 Speaker 1: something humans wrote more reactive, exactly.
898 00:45:04,080 --> 00:45:05,640 Speaker 5: Yeah. So, like, it's actually, like, a 899 00:45:05,680 --> 00:45:08,680 Speaker 5: pretty interesting piece of technology, and it's not just Alexa 900 00:45:09,000 --> 00:45:11,720 Speaker 5: reading a storybook. It has, like, a large interactive 901 00:45:11,719 --> 00:45:14,239 Speaker 5: element, which, you know, that makes the Alexa part, you know, 902 00:45:14,560 --> 00:45:17,880 Speaker 5: actually useful. And then there was this other product, what 903 00:45:18,080 --> 00:45:20,359 Speaker 5: was this, what was this one called? It's from 904 00:45:20,440 --> 00:45:24,600 Speaker 5: a company called Skyrocket Toys, Poe the AI Teddy Bear or 905 00:45:24,680 --> 00:45:29,960 Speaker 5: something like that, okay, Poe the AI Bear, which does 906 00:45:30,160 --> 00:45:34,439 Speaker 5: generate live content with guardrails, he did say. Oh good. 907 00:45:34,960 --> 00:45:38,320 Speaker 5: But the AI content both comes from the input and 908 00:45:38,440 --> 00:45:41,399 Speaker 5: the output. Talking about guardrails, you know, he said, 909 00:45:41,440 --> 00:45:44,680 Speaker 5: you know, ChatGPT does have internal guardrails, but the 910 00:45:44,840 --> 00:45:50,399 Speaker 5: reliability is suspect, which it certainly is, considering just last 911 00:45:50,440 --> 00:45:52,360 Speaker 5: week there was a piece of news about ChatGPT 912 00:45:52,480 --> 00:45:53,600 Speaker 5: helping someone build a bomb. 913 00:45:53,840 --> 00:45:56,400 Speaker 1: Yeah, yeah, which they used in just this magical city. 914 00:45:56,680 --> 00:45:59,719 Speaker 5: Yes. So he did say that, like, guardrail reliability can 915 00:45:59,760 --> 00:46:01,440 Speaker 5: be suspect, but there is a difference when you have, 916 00:46:01,600 --> 00:46:03,960 Speaker 5: certainly, like, more, like, child-friendly features turned on.
917 00:46:04,480 --> 00:46:07,400 Speaker 5: But he admitted that, like, moderation is part of the challenge. 918 00:46:07,840 --> 00:46:11,360 Speaker 5: I don't know. Basically, how this works is you 919 00:46:11,719 --> 00:46:14,080 Speaker 5: have an app synced up with this AI teddy bear 920 00:46:14,160 --> 00:46:16,480 Speaker 5: that talks with a not very pleasing voice. 921 00:46:16,640 --> 00:46:17,400 Speaker 1: Oh, I've got to hear it. 922 00:46:17,960 --> 00:46:19,319 Speaker 5: Do you want me to pull this up? Yes? 923 00:46:19,440 --> 00:46:19,960 Speaker 1: Absolutely. 924 00:46:20,239 --> 00:46:22,799 Speaker 5: Okay, yeah. But basically you put in a whole bunch 925 00:46:22,800 --> 00:46:24,640 Speaker 5: of story inputs, being like, I want the story set 926 00:46:24,719 --> 00:46:27,320 Speaker 5: in this place, I want it featuring these types of characters, 927 00:46:27,400 --> 00:46:29,360 Speaker 5: I want this archetype to be the villain. It 928 00:46:29,440 --> 00:46:31,880 Speaker 5: has, like, dozens if not hundreds of, like, 929 00:46:31,960 --> 00:46:34,520 Speaker 5: archetypal things that you can, like, click, and then the 930 00:46:34,600 --> 00:46:36,960 Speaker 5: teddy bear will generate a new story. So it 931 00:46:37,080 --> 00:46:39,880 Speaker 5: is generating new content, but with, like, pre-baked characters. Okay, 932 00:46:40,280 --> 00:46:42,759 Speaker 5: so then it'll stitch together the story. The weirder you 933 00:46:43,000 --> 00:46:45,360 Speaker 5: make the variables, the weirder the story is going to be. Well, 934 00:46:45,440 --> 00:46:46,759 Speaker 5: let me play a clip for Robert here. 935 00:46:49,400 --> 00:46:53,399 Speaker 12: It's a bright and shiny January morning, the perfect time 936 00:46:53,560 --> 00:46:56,680 Speaker 12: for another story.
Did you know that in Las Vegas, 937 00:46:56,800 --> 00:47:00,960 Speaker 12: where our story takes place, they have a gigantic wheel called the 938 00:47:01,080 --> 00:47:01,720 Speaker 12: High Roller? 939 00:47:02,239 --> 00:47:04,200 Speaker 6: It's taller than the Statue of Liberty. 940 00:47:04,480 --> 00:47:07,320 Speaker 1: So it pulls in real-world events and places 941 00:47:07,400 --> 00:47:10,280 Speaker 1: based on the setting it chooses. 942 00:47:10,080 --> 00:47:13,080 Speaker 12: What if I told you there's a mystery waiting to be unraveled at the 943 00:47:13,200 --> 00:47:14,920 Speaker 12: Consumer Electronics Show? 944 00:47:15,640 --> 00:47:21,399 Speaker 1: There's excitement here. And jeez, that guy, like, sitting there 945 00:47:21,520 --> 00:47:24,120 Speaker 1: talking, almost rolling his eyes at his own product while 946 00:47:24,160 --> 00:47:26,840 Speaker 1: it yaps in his lap, is a perfect, like, he 947 00:47:27,000 --> 00:47:29,680 Speaker 1: clearly didn't think about how that would look, because it 948 00:47:29,800 --> 00:47:32,720 Speaker 1: does not make an appealing ad for the product. 949 00:47:33,000 --> 00:47:36,720 Speaker 5: No, it doesn't sound good. So yeah, they generated 950 00:47:36,760 --> 00:47:40,400 Speaker 5: a story set at CES in Las Vegas, and he 951 00:47:40,440 --> 00:47:42,880 Speaker 5: would occasionally interrupt the bear to, like, explain what it 952 00:47:43,000 --> 00:47:46,160 Speaker 5: was doing. So that was the other product, not nearly 953 00:47:46,360 --> 00:47:48,719 Speaker 5: as polished or really as thoughtful as, like, 954 00:47:48,800 --> 00:47:52,279 Speaker 5: the AI storybook. But, you know, maybe if you 955 00:47:52,560 --> 00:47:55,799 Speaker 5: are tired of having to, you know, talk to your kid, 956 00:47:55,880 --> 00:47:57,640 Speaker 5: you can just get one of these teddy bears to, 957 00:47:57,800 --> 00:47:59,360 Speaker 5: yeah, throw in front of them and raise it.
958 00:47:59,400 --> 00:48:01,120 Speaker 1: I mean, it looks, I think it could probably handle 959 00:48:01,160 --> 00:48:03,120 Speaker 1: all of the physical contact they need too, so you 960 00:48:03,160 --> 00:48:05,279 Speaker 1: don't even need to ever touch your child. And in fact, 961 00:48:05,560 --> 00:48:07,920 Speaker 1: you can just have ChatGPT route that through the 962 00:48:08,000 --> 00:48:11,400 Speaker 1: bear and never even see your own flesh and blood. 963 00:48:11,400 --> 00:48:13,839 Speaker 1: Like, I think ideally you would have them cut out 964 00:48:13,920 --> 00:48:16,840 Speaker 1: of there, you know, really surgically remove that baby, you know, 965 00:48:16,960 --> 00:48:19,359 Speaker 1: a month or two early, and that way you can 966 00:48:19,440 --> 00:48:21,800 Speaker 1: kind of absolutely minimize the amount of time that you 967 00:48:21,920 --> 00:48:23,800 Speaker 1: ever spend in contact with your spawn. 968 00:48:24,280 --> 00:48:26,439 Speaker 5: One other thing I will add is that the Readyland 969 00:48:26,520 --> 00:48:29,560 Speaker 5: guy, the AI storybook, particularly when talking about, 970 00:48:29,560 --> 00:48:31,759 Speaker 5: you know, the importance of guardrails, he said that there's 971 00:48:31,840 --> 00:48:35,400 Speaker 5: multiple levels to safety. Right, an AI kids' robot that 972 00:48:35,600 --> 00:48:37,840 Speaker 5: swears, right, it's one thing that's pretty easy to 973 00:48:37,920 --> 00:48:39,680 Speaker 5: avoid, actually. Like, that's pretty easy. 974 00:48:39,719 --> 00:48:41,839 Speaker 1: There's a limited number of swear words, right. 975 00:48:41,800 --> 00:48:43,080 Speaker 5: And you could just block out 976 00:48:43,080 --> 00:48:44,680 Speaker 5: certain things from happening. Yeah, you can build that in.
977 00:48:44,880 --> 00:48:47,120 Speaker 5: But another aspect that's really important to safety is, like, 978 00:48:47,239 --> 00:48:50,000 Speaker 5: the accuracy of the things it's saying, right? Like, what 979 00:48:50,120 --> 00:48:52,480 Speaker 5: if it's saying something that's supposed to be, you know, 980 00:48:52,640 --> 00:48:55,600 Speaker 5: some, like, factual statement about the world that 981 00:48:55,719 --> 00:48:58,160 Speaker 5: just, like, isn't true or can actually, like, lead to danger? Right, 982 00:48:58,400 --> 00:49:00,480 Speaker 5: what if it tells your kid to do something which 983 00:49:00,520 --> 00:49:02,399 Speaker 5: is actually kind of dangerous? Or what if it says, 984 00:49:02,719 --> 00:49:04,920 Speaker 5: like, not even directly telling them, but, you know, 985 00:49:05,040 --> 00:49:06,880 Speaker 5: it says something that, if the kid then tries to 986 00:49:06,960 --> 00:49:09,239 Speaker 5: do that, it's really dangerous? And, like, this is why 987 00:49:09,600 --> 00:49:12,120 Speaker 5: their storybook program, you know, does not generate new content. 988 00:49:12,320 --> 00:49:14,720 Speaker 5: So everything it says is, like, already pre-approved. 989 00:49:14,800 --> 00:49:17,120 Speaker 5: Like, it already is going to have, you know, like, 990 00:49:17,320 --> 00:49:22,960 Speaker 5: verified safe, you know, sentences, versus this AI 991 00:49:23,320 --> 00:49:25,560 Speaker 5: teddy bear. Because it is generating new content, you know, 992 00:49:25,719 --> 00:49:28,200 Speaker 5: it could, if things go horribly wrong, you know, talk 993 00:49:28,239 --> 00:49:31,279 Speaker 5: about drinking bleach. You know. Yeah, theoretically, you know, it's 994 00:49:31,440 --> 00:49:33,799 Speaker 5: just, like, something, you know, like, things can go wrong.
995 00:49:34,160 --> 00:49:36,239 Speaker 5: So it's not just about, you know, avoiding bad words 996 00:49:36,360 --> 00:49:38,640 Speaker 5: or talking about sex or, you know, those types of, 997 00:49:38,719 --> 00:49:41,759 Speaker 5: like, inappropriate things. It's also making sure it's not, 998 00:49:41,840 --> 00:49:44,600 Speaker 5: like, hallucinating or saying things that could, like, lead to, 999 00:49:44,640 --> 00:49:46,000 Speaker 5: like, dangerous situations. 1000 00:49:46,400 --> 00:49:51,399 Speaker 1: Right. Well, the good news is that I don't think 1001 00:49:51,520 --> 00:49:54,200 Speaker 1: these are going to be wildly successful products. I mean, 1002 00:49:54,239 --> 00:49:59,000 Speaker 1: I guess we'll see. But these are super expensive, and like, 1003 00:49:59,320 --> 00:50:00,759 Speaker 1: did you get a price point for that bear? 1004 00:50:01,000 --> 00:50:02,640 Speaker 5: I did not hear a price point for the bear. 1005 00:50:02,840 --> 00:50:04,840 Speaker 1: I'm curious as to what they're going to be charging 1006 00:50:04,920 --> 00:50:06,640 Speaker 1: for it. I mean, we'll see if any of this 1007 00:50:06,680 --> 00:50:10,440 Speaker 1: stuff really does take off. I wouldn't consider it 1008 00:50:10,440 --> 00:50:12,960 Speaker 1: optimism to hope this stuff takes off, but like, they 1009 00:50:13,000 --> 00:50:16,279 Speaker 1: don't seem like great products to me, so I guess 1010 00:50:16,360 --> 00:50:20,240 Speaker 1: we'll see. I read something very interesting that is related, exactly, 1011 00:50:20,400 --> 00:50:22,000 Speaker 1: and it probably was, he might have been talking about, 1012 00:50:22,080 --> 00:50:24,799 Speaker 1: like, that weird bear or something. I read something very 1013 00:50:24,840 --> 00:50:28,279 Speaker 1: interesting on the subject of, like, AI children's toys from 1014 00:50:28,360 --> 00:50:31,080 Speaker 1: a guy who was, like, an AI developer.
This was 1015 00:50:31,120 --> 00:50:34,480 Speaker 1: from a post on Twitter by Alex Volkov. I got 1016 00:50:34,520 --> 00:50:36,319 Speaker 1: my six year old daughter an AI toy for her 1017 00:50:36,360 --> 00:50:39,320 Speaker 1: birthday that arrived for Christmas instead. She unpacked it all excited. 1018 00:50:39,400 --> 00:50:41,719 Speaker 1: I explained that this isn't like other toys, that this 1019 00:50:41,840 --> 00:50:43,680 Speaker 1: one has AI in it. She of course knows what 1020 00:50:43,760 --> 00:50:46,359 Speaker 1: AI is, seeing the things I've built and interacted with them, 1021 00:50:46,680 --> 00:50:49,640 Speaker 1: chatted with ChatGPT in Santa Mode, knows that Daddy 1022 00:50:49,760 --> 00:50:53,480 Speaker 1: is doing AI, et cetera. So a very interesting experiment happened. 1023 00:50:53,520 --> 00:50:56,439 Speaker 1: After Magical Toys reached out and fixed the issue referenced below, 1024 00:50:56,719 --> 00:50:58,759 Speaker 1: she started playing with this dino, chatted with it, and 1025 00:50:58,840 --> 00:51:01,280 Speaker 1: then learned to turn it off and didn't want to talk anymore. 1026 00:51:01,520 --> 00:51:03,320 Speaker 1: She still loves playing with it, dressed it up. It 1027 00:51:03,400 --> 00:51:05,239 Speaker 1: now has paper shoes and a top hat that we 1028 00:51:05,320 --> 00:51:07,080 Speaker 1: made together. But every time I asked her if she'd 1029 00:51:07,280 --> 00:51:09,799 Speaker 1: like to chat with it, she says no. A few 1030 00:51:09,840 --> 00:51:11,760 Speaker 1: times I turned it back on and she did speak 1031 00:51:11,800 --> 00:51:13,480 Speaker 1: with it for a bit, and then she just turned 1032 00:51:13,520 --> 00:51:16,680 Speaker 1: it off again, not wanting to engage. I gently asked why, 1033 00:51:16,760 --> 00:51:19,120 Speaker 1: and I wasn't really able to understand where the resistance is. 1034 00:51:19,200 --> 00:51:21,040 Speaker 1: It's not weird to her.
In fact, at one point 1035 00:51:21,120 --> 00:51:23,400 Speaker 1: she was pretending that Dinah was a baby and it was 1036 00:51:23,480 --> 00:51:25,600 Speaker 1: turned on. So I told her, let's ask it to 1037 00:51:25,640 --> 00:51:28,080 Speaker 1: pretend to be a baby, and it obliged and said okay, 1038 00:51:28,239 --> 00:51:31,239 Speaker 1: So we asked it to cry. Granted, they don't have 1039 00:51:31,320 --> 00:51:34,080 Speaker 1: an amazing advanced voice mode like OpenAI, so it 1040 00:51:34,120 --> 00:51:36,160 Speaker 1: did its best, but it sounded weird, which made her 1041 00:51:36,239 --> 00:51:39,160 Speaker 1: laugh really hard. It was basically making crying sounds like talking, 1042 00:51:40,040 --> 00:51:42,360 Speaker 1: and also there are still technical issues. The voice is 1043 00:51:42,440 --> 00:51:44,720 Speaker 1: sometimes choppy, so it could be that it's still uncanny 1044 00:51:44,800 --> 00:51:47,560 Speaker 1: for her. I'm honestly fascinated about why the AI aspect 1045 00:51:47,600 --> 00:51:49,399 Speaker 1: of this didn't connect with my six year old. 1046 00:51:49,520 --> 00:51:51,479 Speaker 5: Because it's creepy, because it's people. 1047 00:51:51,920 --> 00:51:56,799 Speaker 1: They don't like it. Nobody wants this. Yeah, ick, yeah, ick. 1048 00:51:57,520 --> 00:51:59,360 Speaker 1: I know this is a sample size of one kid here, 1049 00:51:59,360 --> 00:52:01,359 Speaker 1: and I'm sure many many things will change as she'll 1050 00:52:01,400 --> 00:52:04,000 Speaker 1: grow and learn to interact with more AIs in different forms. 1051 00:52:04,320 --> 00:52:07,760 Speaker 1: But the first toy contact was interestingly almost a complete failure. 1052 00:52:09,280 --> 00:52:10,600 Speaker 5: That is interesting. 1053 00:52:11,280 --> 00:52:13,200 Speaker 1: Yeah, I find that fucking fascinating. 1054 00:52:13,640 --> 00:52:16,600 Speaker 5: Yeah, No one wants this, even six year olds.
You're like, hey, 1055 00:52:17,040 --> 00:52:19,400 Speaker 5: I would prefer a regular toy you can play with. 1056 00:52:19,680 --> 00:52:22,239 Speaker 1: I would prefer to pretend it's a robot, but I 1057 00:52:22,320 --> 00:52:25,960 Speaker 1: don't want it to be a robot that talks to me. 1058 00:52:26,080 --> 00:52:28,560 Speaker 5: The AI bear is fifty dollars on Amazon. 1059 00:52:28,600 --> 00:52:31,279 Speaker 1: Oh that's not bad. Actually, no, that's good. Okay, good, 1060 00:52:31,400 --> 00:52:33,080 Speaker 1: all right, Well maybe maybe we can. 1061 00:52:32,960 --> 00:52:35,920 Speaker 5: Even maybe order one and see and see what we 1062 00:52:36,000 --> 00:52:39,000 Speaker 5: can get out of it. Yeah, all right, we're gonna 1063 00:52:39,000 --> 00:52:42,160 Speaker 5: go on another break and return to talk once again 1064 00:52:42,360 --> 00:52:56,160 Speaker 5: about AI products for your children. Okay, we're back. 1065 00:52:57,440 --> 00:52:59,560 Speaker 1: So we went and saw something else today while you 1066 00:53:00,000 --> 00:53:03,120 Speaker 1: were at a different chunk of the event talking to 1067 00:53:03,239 --> 00:53:06,480 Speaker 1: yet another flying car company that promises to revolutionize the 1068 00:53:06,520 --> 00:53:08,600 Speaker 1: ease with which we can all do nine elevens. 1069 00:53:09,160 --> 00:53:12,320 Speaker 1: Super excited for that future. By the way, I stumbled 1070 00:53:12,400 --> 00:53:16,000 Speaker 1: upon the booth for a company called TCL, a pretty 1071 00:53:16,000 --> 00:53:18,040 Speaker 1: big company, fairly large, Yeah, large company, make a 1072 00:53:18,080 --> 00:53:20,399 Speaker 1: lot of TVs, stuff like that. They had a couple 1073 00:53:20,440 --> 00:53:22,640 Speaker 1: of things. They had an AI laundry machine. 1074 00:53:22,760 --> 00:53:24,040 Speaker 5: So many AI laundry bots.
1075 00:53:24,200 --> 00:53:26,759 Speaker 1: Yeah, this one was the worst because it's like this 1076 00:53:26,920 --> 00:53:32,640 Speaker 1: little almost a soft rounded pyramid shape. It hangs your laundry. 1077 00:53:32,760 --> 00:53:34,719 Speaker 1: They say they can't do folding yet, so it just 1078 00:53:34,760 --> 00:53:37,480 Speaker 1: sort of like picks up dry laundry and holds it. 1079 00:53:38,239 --> 00:53:39,520 Speaker 5: Like it just suspends it in the air. 1080 00:53:39,600 --> 00:53:42,120 Speaker 1: It suspends it in the air inside of itself. And 1081 00:53:42,200 --> 00:53:44,360 Speaker 1: also it can only do a kilogram of laundry. The 1082 00:53:44,400 --> 00:53:48,160 Speaker 1: only thing they had in there was like handkerchiefs and scarves, 1083 00:53:48,360 --> 00:53:51,320 Speaker 1: So it's like probably a couple of thousand dollars, but 1084 00:53:51,400 --> 00:53:54,840 Speaker 1: you can, AI can clean your handkerchiefs and scarves. 1085 00:53:54,600 --> 00:53:56,640 Speaker 5: As opposed to my regular washing machine. 1086 00:53:56,760 --> 00:53:58,440 Speaker 1: Yeah, and they had a washing machine that it can 1087 00:53:58,560 --> 00:54:01,680 Speaker 1: identify and count exactly what clothes are in it, how 1088 00:54:01,719 --> 00:54:03,480 Speaker 1: many of them there are, and it'll tell you the 1089 00:54:03,560 --> 00:54:06,759 Speaker 1: soil level and yada yada, yada yada. Like I'm sure 1090 00:54:06,800 --> 00:54:10,000 Speaker 1: some people changing will want this shit, but it's like, yeah, 1091 00:54:10,360 --> 00:54:13,319 Speaker 1: only people who have a lot of money and want 1092 00:54:13,360 --> 00:54:15,239 Speaker 1: to spend it on a laundry machine, because I don't 1093 00:54:15,239 --> 00:54:18,239 Speaker 1: see that it actually reduces the amount of work you 1094 00:54:18,400 --> 00:54:21,040 Speaker 1: need to do at this point.
But the thing they 1095 00:54:21,120 --> 00:54:23,440 Speaker 1: had at the booth that caught my eye was a 1096 00:54:23,560 --> 00:54:27,640 Speaker 1: robot toy for kids. AI Space Me is the name 1097 00:54:27,719 --> 00:54:31,000 Speaker 1: of the robot. Baby Yoda was a partial inspiration because 1098 00:54:31,080 --> 00:54:34,879 Speaker 1: like Furby, Yeah, poor Furby, there's some porg in there. 1099 00:54:34,920 --> 00:54:38,120 Speaker 1: It's a two part toy. The interior part is like 1100 00:54:38,239 --> 00:54:42,400 Speaker 1: a swaddled up almost looking little porg thing with a cute 1101 00:54:42,480 --> 00:54:45,359 Speaker 1: face and the eyes are reasonably good, Like they did 1102 00:54:45,400 --> 00:54:47,880 Speaker 1: a decent job of the eyes not looking creepy, but 1103 00:54:48,080 --> 00:54:51,000 Speaker 1: like they blink and change color and contract and expand, 1104 00:54:51,080 --> 00:54:53,600 Speaker 1: and then it's got like two little flapper arms that 1105 00:54:53,719 --> 00:54:56,920 Speaker 1: can like wiggle, and it's seated inside, almost like the 1106 00:54:56,960 --> 00:55:00,840 Speaker 1: aliens in Independence Day, it's seated inside like this large 1107 00:55:01,000 --> 00:55:04,759 Speaker 1: rolling body frame that allows it to move around on 1108 00:55:04,880 --> 00:55:08,000 Speaker 1: the ground. And so it's supposed to like be your 1109 00:55:08,320 --> 00:55:10,960 Speaker 1: child's friend. And the first thing that was upsetting to 1110 00:55:11,000 --> 00:55:13,160 Speaker 1: me because they had this video ad that would play 1111 00:55:13,440 --> 00:55:16,360 Speaker 1: every so often and it was very creepy, and you know, 1112 00:55:16,520 --> 00:55:19,520 Speaker 1: I thought back to when we were doing the interview 1113 00:55:19,600 --> 00:55:21,600 Speaker 1: with the guy who had like the robot for old people.
1114 00:55:21,640 --> 00:55:23,319 Speaker 1: He was like, it's very important that it not tell 1115 00:55:23,400 --> 00:55:26,000 Speaker 1: them it loves them. That it like always reiterate that 1116 00:55:26,080 --> 00:55:28,719 Speaker 1: it's a false thing. This robot just keeps telling the kid, 1117 00:55:28,760 --> 00:55:31,360 Speaker 1: I love you, like I care for you. When the 1118 00:55:31,440 --> 00:55:33,200 Speaker 1: lady did a demo, she was like, it's a toy 1119 00:55:33,280 --> 00:55:35,840 Speaker 1: that actually knows and cares about your child, and like, no, 1120 00:55:35,960 --> 00:55:38,480 Speaker 1: it's not. No, it's not. Don't say that. That shouldn't 1121 00:55:38,480 --> 00:55:40,320 Speaker 1: be legal for you to say that. For you to 1122 00:55:40,400 --> 00:55:42,719 Speaker 1: sell this to children and tell them it's an intelligent 1123 00:55:42,840 --> 00:55:45,920 Speaker 1: being that loves them is like deeply abusive in my opinion, 1124 00:55:46,000 --> 00:55:49,800 Speaker 1: Like that is actually child abuse because it's not alive anyway. 1125 00:55:50,320 --> 00:55:53,840 Speaker 1: So I had to bring Garrison over because you needed 1126 00:55:53,880 --> 00:55:54,239 Speaker 1: to see it. 1127 00:55:55,200 --> 00:55:56,239 Speaker 5: Oh, I saw it, I did. 1128 00:55:56,480 --> 00:55:59,120 Speaker 1: Yeah, And I'm going to play a little clip from 1129 00:55:59,160 --> 00:56:02,600 Speaker 1: the ad, so I want you to hear the way 1130 00:56:02,640 --> 00:56:04,040 Speaker 1: this thing sounds. 1131 00:56:05,800 --> 00:56:10,600 Speaker 12: Everywhere learning shut and growingly amy you reminds us sound. 1132 00:56:22,719 --> 00:56:23,680 Speaker 5: Oh my god. 1133 00:56:25,120 --> 00:56:28,799 Speaker 1: I found that profoundly upsetting, disturbing. Yeah, your kid can 1134 00:56:28,880 --> 00:56:30,600 Speaker 1: like pick it up and like walk with it.
It'll 1135 00:56:30,680 --> 00:56:32,800 Speaker 1: like talk to them, It'll make up stories. It'll like 1136 00:56:33,600 --> 00:56:36,440 Speaker 1: look at pictures your kid draws and then generate them 1137 00:56:36,440 --> 00:56:39,600 Speaker 1: into like live AI videos. You can put a pin 1138 00:56:39,680 --> 00:56:42,120 Speaker 1: on and it will record stuff that your kid does 1139 00:56:42,200 --> 00:56:43,960 Speaker 1: and play it back to you at night as a video. 1140 00:56:44,040 --> 00:56:46,480 Speaker 1: So again absolutely minimizing the amount of time you have 1141 00:56:46,560 --> 00:56:47,439 Speaker 1: to spend with your child. 1142 00:56:47,680 --> 00:56:48,640 Speaker 10: It's in the car. 1143 00:56:49,040 --> 00:56:52,399 Speaker 1: Yeah, it takes over your car, so that like it's 1144 00:56:52,640 --> 00:56:55,280 Speaker 1: talking to you from the screens in your car. 1145 00:56:55,360 --> 00:56:57,600 Speaker 5: Like the video, like taking taking this thing every, like 1146 00:56:57,800 --> 00:57:01,080 Speaker 5: everywhere the kid goes. It's like the kid's main interaction 1147 00:57:01,239 --> 00:57:04,759 Speaker 5: with the world, Yeah, is with this little rolling like 1148 00:57:05,080 --> 00:57:08,279 Speaker 5: plastic Furby, and yeah, like like talking about like like 1149 00:57:08,400 --> 00:57:11,400 Speaker 5: expressing like love, and like how damaging this must be 1150 00:57:11,560 --> 00:57:13,120 Speaker 5: for like a four year old to have. Like the 1151 00:57:13,200 --> 00:57:16,320 Speaker 5: first thing that it constantly expressed like love and affection 1152 00:57:16,480 --> 00:57:19,240 Speaker 5: for is this little rolling robot that's that you're gonna 1153 00:57:19,240 --> 00:57:21,080 Speaker 5: throw in the garbage in like you know, four years 1154 00:57:21,440 --> 00:57:23,320 Speaker 5: when you're when you're like too old for it.
How 1155 00:57:23,480 --> 00:57:26,600 Speaker 5: like traumatizing and like deeply fucked up that's gonna be 1156 00:57:26,720 --> 00:57:29,200 Speaker 5: for your, for like your sense of self and like 1157 00:57:29,320 --> 00:57:30,439 Speaker 5: love and affection. 1158 00:57:31,080 --> 00:57:33,240 Speaker 1: The mix of things that we're trying to have this do. 1159 00:57:33,600 --> 00:57:35,800 Speaker 1: Like the other ones were billed as toys, this was 1160 00:57:35,840 --> 00:57:39,000 Speaker 1: billed as like a friend for your child as well 1161 00:57:39,000 --> 00:57:41,080 Speaker 1: as like a home assistant. Yeah, it's supposed to also 1162 00:57:41,200 --> 00:57:43,200 Speaker 1: act as like, it'll change, that you can hook it 1163 00:57:43,240 --> 00:57:46,120 Speaker 1: into your smart home so it can change the temperature. 1164 00:57:46,200 --> 00:57:48,360 Speaker 1: Like they did a little in person demo where like 1165 00:57:48,440 --> 00:57:50,760 Speaker 1: a woman pretending to be a mom talked with it 1166 00:57:50,840 --> 00:57:53,320 Speaker 1: about like planning, planned a birthday party for her kid 1167 00:57:53,440 --> 00:57:55,480 Speaker 1: with it, Yeah, and it like put food in her 1168 00:57:55,560 --> 00:57:58,760 Speaker 1: Amazon cart and like changed the temperature inside because more 1169 00:57:58,800 --> 00:58:01,480 Speaker 1: people were coming over. One of the things they advertise 1170 00:58:01,600 --> 00:58:04,120 Speaker 1: is a security mode, where it like travels around your 1171 00:58:04,120 --> 00:58:06,440 Speaker 1: house at night and acts as a sentry watching your 1172 00:58:06,520 --> 00:58:09,800 Speaker 1: home, like wild stuff.
1173 00:58:10,440 --> 00:58:13,280 Speaker 5: No, it's, it was, Honestly, I've seen a few like 1174 00:58:13,360 --> 00:58:15,760 Speaker 5: disturbing things, you know, all of like the new 1175 00:58:15,880 --> 00:58:18,400 Speaker 5: drone tech to have like solar powered drones that can 1176 00:58:18,440 --> 00:58:20,240 Speaker 5: stay in the air to drop bombs is like bad, 1177 00:58:20,520 --> 00:58:23,919 Speaker 5: But like this type of stuff is like really dehumanizing. 1178 00:58:24,480 --> 00:58:26,360 Speaker 5: It really like viscerally upsets me. 1179 00:58:27,640 --> 00:58:32,000 Speaker 1: Yeah, and I think probably very bad for children. Everything 1180 00:58:32,080 --> 00:58:35,720 Speaker 1: they showed us was incredibly curated, Like when we 1181 00:58:35,840 --> 00:58:37,840 Speaker 1: watched this live thing where she was having a very 1182 00:58:37,920 --> 00:58:41,800 Speaker 1: fluid conversation with it, that was clearly scripted. Yes, and 1183 00:58:42,200 --> 00:58:45,360 Speaker 1: so I wonder how well this thing actually works in practice. 1184 00:58:45,720 --> 00:58:48,120 Speaker 5: We never got an actual like live demo, no. 1185 00:58:48,240 --> 00:58:50,960 Speaker 1: Because they always show it perfectly recognizing the kid, perfectly 1186 00:58:51,080 --> 00:58:53,680 Speaker 1: recognizing like what's in their, you know, little kid drawings 1187 00:58:53,760 --> 00:58:56,680 Speaker 1: and stuff, which it's supposed to be able to make into beautiful, 1188 00:58:57,360 --> 00:59:00,880 Speaker 1: creepily shiny AI moving versions and stuff. So like I 1189 00:59:00,960 --> 00:59:04,000 Speaker 1: wonder how much less good it's going to be in 1190 00:59:04,160 --> 00:59:06,160 Speaker 1: reality than the thing that they've showed us, but it's 1191 00:59:06,200 --> 00:59:09,800 Speaker 1: definitely some amount shittier than what they've displayed already.
And 1192 00:59:10,160 --> 00:59:11,520 Speaker 1: part of why I think that is, like we went 1193 00:59:11,600 --> 00:59:13,680 Speaker 1: to check out the booth that this other, the South 1194 00:59:13,720 --> 00:59:16,400 Speaker 1: Korean company just called, I think, SK, had, like a, 1195 00:59:16,760 --> 00:59:19,880 Speaker 1: they called it a quantum security camera that was AI enabled. 1196 00:59:20,080 --> 00:59:22,280 Speaker 1: And then thinking about how like in the ads, it 1197 00:59:22,360 --> 00:59:24,800 Speaker 1: always like recognized the kid and its parents and a 1198 00:59:24,920 --> 00:59:27,480 Speaker 1: drawing accurately, Well, this one, when I flipped off the 1199 00:59:27,560 --> 00:59:30,320 Speaker 1: camera with both middle fingers, recognized it and wrote up 1200 00:59:30,320 --> 00:59:32,760 Speaker 1: a description of a man giving the camera a thumbs up. 1201 00:59:33,320 --> 00:59:35,360 Speaker 1: Like I'm really curious for when these things hit the 1202 00:59:35,480 --> 00:59:38,600 Speaker 1: market and people start buying them, like what sort of 1203 00:59:38,680 --> 00:59:41,240 Speaker 1: fucked up stuff it'll do, and how kind of big 1204 00:59:41,360 --> 00:59:44,160 Speaker 1: the seams are. I don't expect a long life for 1205 00:59:44,240 --> 00:59:46,760 Speaker 1: this thing, which is going to be funnier because like 1206 00:59:47,280 --> 00:59:49,920 Speaker 1: there was already a big eight hundred dollar like children's 1207 00:59:49,960 --> 00:59:53,840 Speaker 1: companion AI toy that failed last year and the company 1208 00:59:53,960 --> 00:59:56,640 Speaker 1: shut off access to them, and like so parents had 1209 00:59:56,680 --> 00:59:59,160 Speaker 1: to explain to their kids who had bonded with this 1210 00:59:59,320 --> 01:00:02,480 Speaker 1: thing that it was dying forever.
And that's especially exciting 1211 01:00:02,520 --> 01:00:04,840 Speaker 1: to me because they've built a robot that talks 1212 01:00:04,880 --> 01:00:06,760 Speaker 1: to your kid and tells it it loves them, and 1213 01:00:06,880 --> 01:00:09,520 Speaker 1: eventually that robot is going to be taken away from 1214 01:00:09,560 --> 01:00:12,960 Speaker 1: the child by the company when it no longer becomes profitable. 1215 01:00:13,000 --> 01:00:16,040 Speaker 1: And that's, I'm excited for that, like, new ground in 1216 01:00:16,080 --> 01:00:20,040 Speaker 1: how to fuck up kids. Anyway, that's what I got, Garrison. 1217 01:00:20,160 --> 01:00:23,600 Speaker 5: What an uplifting adventure once again? 1218 01:00:23,680 --> 01:00:27,200 Speaker 1: Yeah? No, that's all. Yeah, it's all great. All right, everybody, 1219 01:00:27,560 --> 01:00:29,920 Speaker 1: Well this has been Behind the Bastards. No it's not, 1220 01:00:30,400 --> 01:00:31,080 Speaker 1: or no it's not. 1221 01:00:31,240 --> 01:00:31,560 Speaker 5: What is this? 1222 01:00:31,720 --> 01:00:34,560 Speaker 1: This has been It Could Happen Here, a podcast by 1223 01:00:34,600 --> 01:00:36,760 Speaker 1: somebody who is slowly going insane. 1224 01:00:36,960 --> 01:00:39,560 Speaker 5: Yeah, because we're like four days in Vegas now, we 1225 01:00:39,680 --> 01:00:40,760 Speaker 5: still have one more day. 1226 01:00:40,880 --> 01:00:43,000 Speaker 1: I'm out of my mind. I'm completely broken. 1227 01:00:43,680 --> 01:00:47,040 Speaker 5: Uh. Hopefully tomorrow we'll have our final of our like 1228 01:00:47,160 --> 01:00:49,760 Speaker 5: on the ground coverage with our, with our CES best 1229 01:00:49,840 --> 01:00:52,360 Speaker 5: in show. Yeah, so that's always going to be maybe 1230 01:00:52,400 --> 01:00:52,959 Speaker 5: a high note. 1231 01:00:53,440 --> 01:00:55,080 Speaker 1: So see you there. Whoo.
1232 01:01:07,040 --> 01:01:10,240 Speaker 13: Welcome to It Could Happen Here, a podcast increasingly about it 1233 01:01:10,440 --> 01:01:13,840 Speaker 13: having happened. We have spent a long time on this 1234 01:01:14,000 --> 01:01:17,560 Speaker 13: show talking about what the second Trump administration is going 1235 01:01:17,600 --> 01:01:20,120 Speaker 13: to be for trans people and you know, go listen 1236 01:01:20,160 --> 01:01:22,200 Speaker 13: to those episodes. The short version is that it is 1237 01:01:22,240 --> 01:01:24,520 Speaker 13: going to be very very bad. We're facing care bans, 1238 01:01:24,560 --> 01:01:28,480 Speaker 13: we're facing federal funding bans. Things are about to get 1239 01:01:29,000 --> 01:01:33,640 Speaker 13: unbelievably bleak. But this campaign didn't come out of nowhere. 1240 01:01:33,680 --> 01:01:36,560 Speaker 13: It is the culmination of almost a decade's worth of 1241 01:01:36,640 --> 01:01:40,320 Speaker 13: fighting by the right. And I think we have a 1242 01:01:40,440 --> 01:01:44,840 Speaker 13: tendency to treat the right's campaign against trans people as 1243 01:01:45,760 --> 01:01:48,960 Speaker 13: something abstract right, as a sort of abstract political debate, 1244 01:01:50,200 --> 01:01:51,959 Speaker 13: or even if it affects us, we tend to treat 1245 01:01:52,080 --> 01:01:55,040 Speaker 13: the subjects, the immediate subject of the harassment, as sort 1246 01:01:55,040 --> 01:01:59,919 Speaker 13: of these distant, famous figures. But the issue with looking 1247 01:01:59,920 --> 01:02:03,680 Speaker 13: at it this way is that the harassment, the hatred, 1248 01:02:03,760 --> 01:02:07,000 Speaker 13: the violence is happening to real people, with real names 1249 01:02:07,040 --> 01:02:10,040 Speaker 13: and faces, who live lives exactly like yours.
The difference 1250 01:02:10,040 --> 01:02:12,520 Speaker 13: between you sitting in your house right now and someone 1251 01:02:12,560 --> 01:02:16,280 Speaker 13: whose face is on TV is about the difference between 1252 01:02:16,600 --> 01:02:18,960 Speaker 13: whether a few right wing journalists cover who you are. 1253 01:02:19,840 --> 01:02:21,880 Speaker 13: So today we're going to be talking to someone who 1254 01:02:21,960 --> 01:02:26,720 Speaker 13: has been subject to almost the entire spectrum and range 1255 01:02:27,360 --> 01:02:30,479 Speaker 13: of the sort of emergent far right campaign against trans people, 1256 01:02:30,480 --> 01:02:35,320 Speaker 13: who has seen basically the entire campaign against trans people 1257 01:02:35,400 --> 01:02:38,640 Speaker 13: evolve specifically in the far right's harassment against them, and 1258 01:02:39,520 --> 01:02:42,400 Speaker 13: that person is the artist and musician Precious Child out 1259 01:02:42,440 --> 01:02:44,800 Speaker 13: of LA. Welcome to the show. I wish it was 1260 01:02:44,880 --> 01:02:46,160 Speaker 13: under better circumstances. 1261 01:02:47,160 --> 01:02:48,720 Speaker 11: Thank you, Thank you for having me, Mia. 1262 01:02:48,720 --> 01:02:52,000 Speaker 1: I'm glad to be here. Yeah, and I'm excited to 1263 01:02:52,040 --> 01:02:52,360 Speaker 1: talk to you. 1264 01:02:52,640 --> 01:02:57,240 Speaker 13: I'm slightly apprehensive in the sense that, my god, this 1265 01:02:57,320 --> 01:02:58,120 Speaker 13: stuff sucks. 1266 01:02:58,520 --> 01:03:03,280 Speaker 11: But yeah, oh you know, it's it's our lives. What 1267 01:03:03,480 --> 01:03:04,000 Speaker 11: can we do? 1268 01:03:04,600 --> 01:03:04,800 Speaker 1: Yeah? 1269 01:03:04,920 --> 01:03:05,480 Speaker 11: What will we do? 1270 01:03:06,440 --> 01:03:06,640 Speaker 8: Wow?
1271 01:03:07,640 --> 01:03:10,880 Speaker 13: Yes, that's the question for the end of the episode, 1272 01:03:10,960 --> 01:03:12,200 Speaker 13: is what are we going to do about all of 1273 01:03:12,240 --> 01:03:16,760 Speaker 13: this shit? But let's go back to sort of the beginning. 1274 01:03:16,840 --> 01:03:18,560 Speaker 13: Can you, can you sort of talk about your first 1275 01:03:18,680 --> 01:03:21,520 Speaker 13: encounter with, I guess at that point, what was a 1276 01:03:22,440 --> 01:03:26,280 Speaker 13: not especially mainstream part of the religious right back in, 1277 01:03:26,360 --> 01:03:27,360 Speaker 13: right, around twenty eighteen. 1278 01:03:27,880 --> 01:03:31,400 Speaker 8: Yeah, so I've been, I've been making music as Precious 1279 01:03:31,480 --> 01:03:34,400 Speaker 8: Child for almost a decade, and it was my very 1280 01:03:34,440 --> 01:03:37,840 Speaker 8: first album that I put out, one called Trapped, that 1281 01:03:38,080 --> 01:03:41,960 Speaker 8: had this track on it titled Phantom, and that was 1282 01:03:42,040 --> 01:03:46,200 Speaker 8: an instrumental track with just some kind of whispery vocals. 1283 01:03:46,280 --> 01:03:46,400 Speaker 5: You know. 1284 01:03:46,520 --> 01:03:49,360 Speaker 8: It wasn't a song per se. It was experimental, and 1285 01:03:49,560 --> 01:03:51,360 Speaker 8: I put out a music video with it, and it 1286 01:03:51,480 --> 01:03:55,240 Speaker 8: was pretty, it's pretty creepy and there's flashing lights. You know, 1287 01:03:55,280 --> 01:03:58,520 Speaker 8: if you think of movies from the eighties like Hellraiser, 1288 01:03:59,120 --> 01:04:00,960 Speaker 8: it's kind of like that, you know, like kind of 1289 01:04:01,640 --> 01:04:07,680 Speaker 8: evocative of some type of greater supernatural horror.
And the 1290 01:04:08,280 --> 01:04:12,080 Speaker 8: far right at that time, the far right, vintage twenty eighteen, 1291 01:04:12,680 --> 01:04:16,280 Speaker 8: they found it and started reporting it en masse and 1292 01:04:16,480 --> 01:04:19,040 Speaker 8: tagging their friends and saying report this, report this. And 1293 01:04:19,280 --> 01:04:24,840 Speaker 8: this was on Instagram and Facebook and on YouTube as well. 1294 01:04:25,480 --> 01:04:28,880 Speaker 8: And that video, like, as I said, you know, it's creepy, 1295 01:04:29,400 --> 01:04:33,120 Speaker 8: but it's, there's nothing political in it. There's a little 1296 01:04:33,160 --> 01:04:37,880 Speaker 8: bit of, like, of blood, but there's no gore. But 1297 01:04:38,040 --> 01:04:42,240 Speaker 8: they found it unsettling and explicitly satanic, that's what 1298 01:04:42,360 --> 01:04:42,800 Speaker 8: they said. 1299 01:04:43,320 --> 01:04:48,440 Speaker 11: It's satanic. And that was my first brush with 1300 01:04:48,560 --> 01:04:48,840 Speaker 11: the right. 1301 01:04:49,120 --> 01:04:51,720 Speaker 13: Yeah, and that's really interesting to me that it's specifically 1302 01:04:51,800 --> 01:04:54,600 Speaker 13: the satanic angle that they're taking, because it's like it's 1303 01:04:54,880 --> 01:04:58,280 Speaker 13: like in this early enough phase that they're still sort 1304 01:04:58,280 --> 01:05:01,720 Speaker 13: of developing their reasons to be angry.
They haven't quite 1305 01:05:01,800 --> 01:05:05,880 Speaker 13: like metastasized transphobia as like their driving thing, so they're 1306 01:05:05,920 --> 01:05:09,960 Speaker 13: kind of, they're reaching back into this kind of satanic 1307 01:05:10,040 --> 01:05:14,320 Speaker 13: panic era, like the weird nineties and two thousands stuff 1308 01:05:14,440 --> 01:05:17,280 Speaker 13: that, like, when I was growing up, the town 1309 01:05:17,320 --> 01:05:19,240 Speaker 13: that I grew up in was super religious, and like you know, 1310 01:05:19,360 --> 01:05:22,360 Speaker 13: we had to have lists of, like, if you were 1311 01:05:22,360 --> 01:05:24,520 Speaker 13: inviting like a friend in high school to a party, 1312 01:05:25,160 --> 01:05:27,520 Speaker 13: whose parents could know that it was a Halloween party 1313 01:05:27,560 --> 01:05:30,120 Speaker 13: and whose parents couldn't, because they would freak out about 1314 01:05:30,120 --> 01:05:32,880 Speaker 13: it, which is, it's like that kind of thing, which I 1315 01:05:32,920 --> 01:05:36,360 Speaker 13: don't know, it feels like almost quaint now, even even 1316 01:05:36,440 --> 01:05:37,600 Speaker 13: as the stuff's escalated. 1317 01:05:37,800 --> 01:05:40,640 Speaker 8: But yeah, it was, as I said, it was a 1318 01:05:40,680 --> 01:05:44,280 Speaker 8: moral panic. And their point was that I was 1319 01:05:44,360 --> 01:05:47,720 Speaker 8: immoral for making art like this, and it was, 1320 01:05:48,000 --> 01:05:50,360 Speaker 8: this is the same thing that's happening today. I'm 1321 01:05:50,520 --> 01:05:52,440 Speaker 8: immoral for the art that I make. 1322 01:05:53,000 --> 01:05:54,840 Speaker 1: And it's not just my art, but it's me. 1323 01:05:55,520 --> 01:05:55,840 Speaker 10: It's me.
1324 01:05:56,240 --> 01:06:00,680 Speaker 8: Yeah, And I think that's perhaps what has changed as well, 1325 01:06:01,000 --> 01:06:03,440 Speaker 8: Like before they were saying that this is a satanic 1326 01:06:03,520 --> 01:06:06,919 Speaker 8: evil person because they're making this art, and now they're 1327 01:06:06,920 --> 01:06:10,040 Speaker 8: saying this is a satanic evil person making satanic evil art. 1328 01:06:10,960 --> 01:06:14,760 Speaker 13: Yeah, And I think part of the focus on art here, right, 1329 01:06:15,040 --> 01:06:19,280 Speaker 13: is this kind of mirrored reflection of the sort of, 1330 01:06:19,520 --> 01:06:21,960 Speaker 13: I mean, of the original Nazis, Right, Like one of 1331 01:06:22,000 --> 01:06:24,160 Speaker 13: their big things was just like cracking down on quote 1332 01:06:24,240 --> 01:06:27,560 Speaker 13: unquote degenerate art, and they had like these like quote 1333 01:06:27,640 --> 01:06:30,680 Speaker 13: unquote degenerate art festivals of just like Jewish artists and 1334 01:06:30,720 --> 01:06:32,600 Speaker 13: people whose art they didn't like. It was, you know, 1335 01:06:32,720 --> 01:06:34,959 Speaker 13: like, a thing that was like a significant factor 1336 01:06:35,000 --> 01:06:36,480 Speaker 13: in their rise. And I think there's this sort of 1337 01:06:36,600 --> 01:06:39,960 Speaker 13: mirror of it here, but I don't know, starting in 1338 01:06:40,040 --> 01:06:44,640 Speaker 13: a weirder place in some ways, like starting more out 1339 01:06:44,640 --> 01:06:50,720 Speaker 13: of this very very weird like Christian moral panic shit.
1340 01:06:51,360 --> 01:06:54,200 Speaker 13: That's, I guess, if you want to look at how this, 1341 01:06:55,080 --> 01:06:56,960 Speaker 13: you know, plays out, like that's kind of where it 1342 01:06:57,080 --> 01:06:59,480 Speaker 13: is in like twenty eighteen, Right, this is the first 1343 01:06:59,560 --> 01:07:03,160 Speaker 13: bathroom bill passed by twenty eighteen, it was North Carolina. But 1344 01:07:03,200 --> 01:07:06,680 Speaker 13: there's, you know, there's a huge backlash to it, and 1345 01:07:06,800 --> 01:07:08,800 Speaker 13: that's something that's I think very different than now, where 1346 01:07:08,840 --> 01:07:10,880 Speaker 13: like all of this anti-trans shit is happening and everyone's 1347 01:07:10,880 --> 01:07:13,520 Speaker 13: just kind of going eh. So, do you want to 1348 01:07:13,520 --> 01:07:18,880 Speaker 13: talk about the second time you became targeted by the far right? 1349 01:07:18,920 --> 01:07:23,480 Speaker 8: Yeah, I mean realistically, like this has been 1350 01:07:23,560 --> 01:07:27,880 Speaker 8: pretty constant throughout my life as a public artist. And 1351 01:07:28,520 --> 01:07:31,000 Speaker 8: there was another, there was another track on that album 1352 01:07:31,120 --> 01:07:34,280 Speaker 8: that was also targeted, one titled My Little 1353 01:07:34,400 --> 01:07:39,240 Speaker 8: Problem Violet Door, that has some more provocative imagery than 1354 01:07:39,320 --> 01:07:43,520 Speaker 8: the track Phantom, has some nudity, and that was the 1355 01:07:43,560 --> 01:07:48,840 Speaker 8: collaboration between myself and an artist who is trans themselves, 1356 01:07:49,240 --> 01:07:54,120 Speaker 8: Kid out of Brazil, and that has again some body 1357 01:07:54,200 --> 01:07:58,400 Speaker 8: horror in it. There's a commentary about gender norms and 1358 01:07:58,880 --> 01:08:03,320 Speaker 8: plastic surgery and identity, but it wasn't explicitly political.
Again, 1359 01:08:03,440 --> 01:08:06,160 Speaker 8: it was kind of a surreal body horror video. And 1360 01:08:06,280 --> 01:08:09,600 Speaker 8: that was brigade-reported not in twenty eighteen, but in 1361 01:08:09,760 --> 01:08:16,040 Speaker 8: twenty nineteen, and actually taken down from YouTube and then reinstated. 1362 01:08:17,200 --> 01:08:21,000 Speaker 8: And that video is notable because, as a result of 1363 01:08:21,080 --> 01:08:24,040 Speaker 8: what's going on today, YouTube took that down despite it 1364 01:08:24,120 --> 01:08:28,320 Speaker 8: being up for five years without a problem, that, you know, 1365 01:08:28,360 --> 01:08:31,920 Speaker 8: had tens of thousands of views, and now it's gone. 1366 01:08:32,720 --> 01:08:34,920 Speaker 11: So that was the second time. 1367 01:08:35,720 --> 01:08:38,960 Speaker 13: Yeah, and that one I think is interesting too in 1368 01:08:39,080 --> 01:08:41,880 Speaker 13: the sense of like that one's like a lot more, 1369 01:08:43,360 --> 01:08:45,479 Speaker 13: it's more overtly trans, and it's also, I think the 1370 01:08:45,600 --> 01:08:49,479 Speaker 13: more trans you are, the more like very 1371 01:08:49,600 --> 01:08:52,720 Speaker 13: obviously trans it is. And this is I guess something 1372 01:08:52,760 --> 01:08:57,920 Speaker 13: that's very common among trans artists, is this kind of 1373 01:08:58,080 --> 01:09:01,080 Speaker 13: like art that's an exploration of sort of body horror. 1374 01:09:01,720 --> 01:09:05,479 Speaker 13: You know, I mean, I'm not going to project onto it.
1375 01:09:05,560 --> 01:09:07,840 Speaker 13: I don't know if this is specifically what you're doing, but like, 1376 01:09:07,920 --> 01:09:09,920 Speaker 13: you know, there's a lot of it that's body horror 1377 01:09:09,960 --> 01:09:13,400 Speaker 13: as this sort of metaphor for dysphoria, and like it's 1378 01:09:13,439 --> 01:09:15,240 Speaker 13: this way of sort of thinking about the things that 1379 01:09:15,320 --> 01:09:17,040 Speaker 13: are happening to your body, things that have been done 1380 01:09:17,080 --> 01:09:19,000 Speaker 13: to your body, and the things that you're doing back 1381 01:09:19,400 --> 01:09:19,680 Speaker 13: to it. 1382 01:09:20,280 --> 01:09:22,240 Speaker 11: Yeah, yeah, absolutely. 1383 01:09:22,720 --> 01:09:25,360 Speaker 8: You know, I didn't really think too much about the 1384 01:09:25,479 --> 01:09:29,639 Speaker 8: concepts in that direction when I made it, or, 1385 01:09:30,160 --> 01:09:33,400 Speaker 8: okay, I didn't at that time. However, 1386 01:09:34,120 --> 01:09:37,000 Speaker 8: the truth is, for much of my young life, I 1387 01:09:37,080 --> 01:09:40,040 Speaker 8: felt out of body and I wanted my flesh to 1388 01:09:40,160 --> 01:09:44,800 Speaker 8: match my personal vision of myself and my identity. And 1389 01:09:44,920 --> 01:09:48,320 Speaker 8: that was something I struggled with for quite a long time. 1390 01:09:48,640 --> 01:09:49,280 Speaker 11: When I was young. 1391 01:09:49,360 --> 01:09:53,320 Speaker 8: I didn't have access to the information and communities that 1392 01:09:53,400 --> 01:09:56,519 Speaker 8: are out there now that support trans people. Yeah, and 1393 01:09:57,120 --> 01:10:02,160 Speaker 8: I have had some gender-confirming procedures done, but not 1394 01:10:02,320 --> 01:10:04,080 Speaker 8: as many as I think I would have when I 1395 01:10:04,200 --> 01:10:09,519 Speaker 8: was younger.
I did feel existential discordance, and I don't 1396 01:10:09,520 --> 01:10:11,160 Speaker 8: know if that's a word, but if it's not, I'm 1397 01:10:11,200 --> 01:10:14,880 Speaker 8: going to coin it, because I'm pretty sure I just did. I 1398 01:10:14,920 --> 01:10:20,519 Speaker 8: didn't feel in concordance with my flesh, and so, 1399 01:10:21,120 --> 01:10:25,280 Speaker 8: you know, I came to experiment with what the boundaries were 1400 01:10:25,600 --> 01:10:29,439 Speaker 8: of my fleshy identity and my existence in my art. 1401 01:10:30,000 --> 01:10:32,080 Speaker 13: Yeah. And I think there's something about the way that 1402 01:10:32,160 --> 01:10:33,600 Speaker 13: your art works and the way that a lot of 1403 01:10:33,720 --> 01:10:35,920 Speaker 13: queer art is, where there's, you know, and this isn't to 1404 01:10:35,960 --> 01:10:38,080 Speaker 13: say that all queer art is like this, but there's definitely like 1405 01:10:38,680 --> 01:10:39,400 Speaker 13: an edge to it. 1406 01:10:39,479 --> 01:10:40,200 Speaker 1: There's stuff going on. 1407 01:10:40,280 --> 01:10:43,799 Speaker 13: There's body horror things happening, there's like eighties aesthetic stuff, 1408 01:10:44,000 --> 01:10:47,439 Speaker 13: and I think it conflicts with this kind of weird 1409 01:10:47,920 --> 01:10:52,200 Speaker 13: conceit that everything is like happy and cozy, sort of kitsch 1410 01:10:52,320 --> 01:10:54,240 Speaker 13: aesthetic thing that a lot of like this kind of 1411 01:10:54,280 --> 01:10:56,800 Speaker 13: fascism is really into.
They use that kind of 1412 01:10:56,920 --> 01:11:00,200 Speaker 13: aesthetic sensibility as a weapon to go 1413 01:11:00,280 --> 01:11:04,080 Speaker 13: after stuff that they oppose for more overtly political reasons. 1414 01:11:04,120 --> 01:11:06,800 Speaker 13: They can do this kind of like, hey, look at 1415 01:11:06,800 --> 01:11:10,439 Speaker 13: this disgusting thing, et cetera, et cetera kind of attack 1416 01:11:10,800 --> 01:11:13,400 Speaker 13: on queer art as a result of this kind of 1417 01:11:13,439 --> 01:11:16,560 Speaker 13: like fascist kitsch aesthetic thing that's kind of like this, 1418 01:11:16,760 --> 01:11:19,719 Speaker 13: you know, this sort of like cultural norm in our society. 1419 01:11:19,760 --> 01:11:22,720 Speaker 13: And I think, I haven't fully worked out 1420 01:11:22,720 --> 01:11:24,400 Speaker 13: the political implications of this, but I think there's this 1421 01:11:24,520 --> 01:11:28,520 Speaker 13: kind of connection between their weaponization of like revulsion 1422 01:11:28,640 --> 01:11:32,920 Speaker 13: and their weaponization of this reaction to like anything that's kind 1423 01:11:32,760 --> 01:11:35,639 Speaker 1: of like horror that they kind of like use as 1424 01:11:35,680 --> 01:11:36,479 Speaker 1: a political attack. 1425 01:11:37,240 --> 01:11:40,599 Speaker 11: Yeah, you know, I think gender is horrifying, period. 1426 01:11:40,920 --> 01:11:44,519 Speaker 8: Yeah, not just for queer people, but also for cis people. 1427 01:11:44,800 --> 01:11:48,080 Speaker 8: Like, I'm gonna defend cis people here for a second. 1428 01:11:48,640 --> 01:11:52,480 Speaker 11: So cis people struggle with gender dysphoria. 1429 01:11:53,120 --> 01:11:55,960 Speaker 8: I don't know if I would say every bit as much 1430 01:11:56,000 --> 01:11:58,760 Speaker 8: as trans people, but they sure fucking struggle with it. 1431 01:11:59,040 --> 01:11:59,200 Speaker 4: Yeah.
1432 01:11:59,240 --> 01:12:02,320 Speaker 8: For instance, an example I will give is facial hair. 1433 01:12:02,880 --> 01:12:06,160 Speaker 8: Lots of people that were assigned male at birth, they 1434 01:12:06,280 --> 01:12:08,960 Speaker 8: fret and worry over their facial hair. 1435 01:12:09,240 --> 01:12:12,000 Speaker 11: Is it too much? Is it too little? 1436 01:12:12,400 --> 01:12:14,600 Speaker 8: And people that are assigned female at birth, you know, 1437 01:12:14,680 --> 01:12:17,760 Speaker 8: if they have facial hair, you know, they fret over it. 1438 01:12:17,920 --> 01:12:19,400 Speaker 8: You know, will people see it? Do I have to 1439 01:12:19,439 --> 01:12:21,880 Speaker 8: bleach it? Do I have to pluck it? People fret 1440 01:12:21,960 --> 01:12:25,559 Speaker 8: over, like, their freaking jawlines. Like, you know, if 1441 01:12:25,600 --> 01:12:28,439 Speaker 8: you search online, like, masculine jawline, how do I get one? 1442 01:12:28,760 --> 01:12:32,200 Speaker 8: There's a huge community out there of assigned male at 1443 01:12:32,280 --> 01:12:34,679 Speaker 8: birth cis men that are trying to get more defined 1444 01:12:34,720 --> 01:12:37,920 Speaker 8: jawlines because they feel that their genetics are presenting 1445 01:12:38,000 --> 01:12:42,320 Speaker 8: them as a non-optimal male, quote unquote, and of course, 1446 01:12:42,400 --> 01:12:45,360 Speaker 8: same thing for quote unquote females, like, do I have 1447 01:12:45,479 --> 01:12:47,000 Speaker 8: the feminine body? 1448 01:12:47,560 --> 01:12:49,600 Speaker 11: Is it curvy enough in the right way? Is my 1449 01:12:49,720 --> 01:12:53,400 Speaker 11: waist slim enough? You know? Am I just brick-shaped?
1450 01:12:54,000 --> 01:12:57,040 Speaker 8: And then all the industries around that, and that I 1451 01:12:57,120 --> 01:13:01,000 Speaker 8: think is horrifying, and everyone goes through the struggles with it, 1452 01:13:01,080 --> 01:13:04,400 Speaker 8: and very few people are lucky enough to embody the 1453 01:13:05,439 --> 01:13:08,880 Speaker 8: ideals of gender that we thrust upon ourselves. 1454 01:13:09,479 --> 01:13:11,320 Speaker 11: And to me, that is a tragedy. 1455 01:13:11,840 --> 01:13:13,880 Speaker 13: Yeah. And I think that sort of like fear and 1456 01:13:13,960 --> 01:13:16,479 Speaker 13: that sort of like grinding experience of being forced to 1457 01:13:16,840 --> 01:13:20,000 Speaker 13: like perform a gender in a certain way, well, okay, 1458 01:13:20,000 --> 01:13:22,120 Speaker 13: I'm not going to say perform because the Butler scholars 1459 01:13:22,160 --> 01:13:23,160 Speaker 13: are going to get extremely 1460 01:13:22,960 --> 01:13:23,280 Speaker 1: mad at me. 1461 01:13:23,560 --> 01:13:24,600 Speaker 5: But the 1462 01:13:26,320 --> 01:13:27,680 Speaker 13: way in which we're forced to sort of live up 1463 01:13:27,680 --> 01:13:30,439 Speaker 13: to these standards that are sort of nonsense, I think 1464 01:13:31,560 --> 01:13:33,599 Speaker 13: it gets to this thing where, you know, you could 1465 01:13:33,640 --> 01:13:35,760 Speaker 13: either sort of like muddle through it and try to 1466 01:13:35,840 --> 01:13:38,720 Speaker 13: ignore the distance as much as you can, you can 1467 01:13:39,040 --> 01:13:42,320 Speaker 13: attempt to fight it, or you can get extremely mad 1468 01:13:42,360 --> 01:13:44,439 Speaker 13: at everyone else who's trying to do something about it, 1469 01:13:44,760 --> 01:13:48,120 Speaker 13: and I think we're seeing an explosion of the last option.
Yeah, 1470 01:13:48,640 --> 01:13:51,200 Speaker 13: Unfortunately we need to go to ads, which is another 1471 01:13:51,280 --> 01:13:53,880 Speaker 13: thing that drives a whole bunch of this. Luckily, these 1472 01:13:53,920 --> 01:13:57,599 Speaker 13: are audio ads, so hopefully they're not driving beauty standards. 1473 01:13:57,720 --> 01:14:01,280 Speaker 13: But, you know, who knows, people are wizards. They'll find 1474 01:14:01,320 --> 01:14:03,439 Speaker 13: a way. Brought to you by Maybelline. 1475 01:14:14,040 --> 01:14:15,559 Speaker 1: And we are back. 1476 01:14:15,680 --> 01:14:17,679 Speaker 13: I mean, I guess the thing we should say is, highway patrol 1477 01:14:17,840 --> 01:14:21,000 Speaker 13: does enforce gender norms on people. 1478 01:14:21,360 --> 01:14:27,320 Speaker 8: Oh, they sure do. Makeup, another example of hideous gender 1479 01:14:27,360 --> 01:14:27,840 Speaker 8: norms. 1480 01:14:28,160 --> 01:14:28,360 Speaker 1: Yeah. 1481 01:14:28,600 --> 01:14:32,120 Speaker 8: So you, listener, you're hearing me right, and you're hearing 1482 01:14:32,160 --> 01:14:36,439 Speaker 8: my voice. And this is another thing that all people 1483 01:14:36,520 --> 01:14:40,040 Speaker 8: fret about. It's not just trans people. Lots of AFABs. 1484 01:14:40,320 --> 01:14:42,759 Speaker 8: I know, I'm just gonna say AFAB and AMAB, okay. 1485 01:14:43,280 --> 01:14:45,720 Speaker 8: Lots of AFABs I know, you know, they have 1486 01:14:45,840 --> 01:14:49,040 Speaker 8: pretty darned deep voices naturally, and I've talked with 1487 01:14:49,080 --> 01:14:51,320 Speaker 8: them privately and they say that they worry about how 1488 01:14:51,400 --> 01:14:54,439 Speaker 8: husky their voice is when they just relax. And then 1489 01:14:54,600 --> 01:14:57,800 Speaker 8: same thing for AMABs, they talk about worrying if their 1490 01:14:57,880 --> 01:15:02,320 Speaker 8: voice is squeaky and thin.
I talk like a freaking 1491 01:15:02,400 --> 01:15:07,160 Speaker 8: wrestler from WWE, you know, like people, people like that. 1492 01:15:07,960 --> 01:15:09,479 Speaker 11: People struggle with that too. 1493 01:15:09,800 --> 01:15:13,040 Speaker 8: So yeah, just something as simple as our appearance 1494 01:15:13,080 --> 01:15:18,320 Speaker 8: and our voice, you know, we're just torturing ourselves. And ultimately, 1495 01:15:18,400 --> 01:15:21,719 Speaker 8: I gotta say, Mia, you know, I'm, I'm a trans woman. However, 1496 01:15:21,960 --> 01:15:25,280 Speaker 8: ultimately I'm a gender abolitionist, because this shit sucks. 1497 01:15:26,080 --> 01:15:29,320 Speaker 5: Yeah, it's, it's not, it's not great. It's not a 1498 01:15:29,439 --> 01:15:30,559 Speaker 5: good time for anyone 1499 01:15:30,600 --> 01:15:35,360 Speaker 13: at all. Yeah, speaking of bad times, this isn't even 1500 01:15:35,360 --> 01:15:37,240 Speaker 13: an ad pivot. I just do this now, which is really bad. 1501 01:15:37,280 --> 01:15:39,000 Speaker 13: I do it to people in my daily life. They're like, 1502 01:15:39,080 --> 01:15:41,040 Speaker 13: why are you ad-pivoting me? And I'm like, oh God. 1503 01:15:42,960 --> 01:15:46,240 Speaker 13: But yeah, I guess, God, this and the sort of 1504 01:15:46,360 --> 01:15:51,000 Speaker 13: racialization aspect, and, I don't know, this is the aspect 1505 01:15:51,040 --> 01:15:55,360 Speaker 13: of zones of gender performance. God damn, I keep saying 1506 01:15:55,400 --> 01:15:58,000 Speaker 13: performance. I literally mean that you were performing it, 1507 01:15:58,080 --> 01:16:01,559 Speaker 13: as in you were acting, and not the Butler thing 1508 01:16:01,680 --> 01:16:03,120 Speaker 13: of you perform it to make it real. 1509 01:16:03,520 --> 01:16:04,760 Speaker 5: Please don't yell at me 1510 01:16:04,800 --> 01:16:05,720 Speaker 1: in the comments.
1511 01:16:06,360 --> 01:16:10,000 Speaker 13: I had a guy who was a writer on the 1512 01:16:10,080 --> 01:16:13,760 Speaker 13: Big Bang Theory yell at me specifically about that on Twitter. 1513 01:16:13,840 --> 01:16:14,400 Speaker 6: What times. 1514 01:16:16,120 --> 01:16:17,000 Speaker 1: It's really okay. 1515 01:16:17,080 --> 01:16:20,320 Speaker 13: Sorry, I'm about to re-rail this, partially because this 1516 01:16:20,520 --> 01:16:25,800 Speaker 13: next part is really depressing. But so a while back 1517 01:16:26,160 --> 01:16:29,080 Speaker 13: on this show, my, my co-host Garrison, who is, 1518 01:16:29,800 --> 01:16:32,800 Speaker 13: I don't know, probably having an absolutely terrible time at 1519 01:16:32,840 --> 01:16:36,320 Speaker 13: the Consumer Electronics Show right now, covered a specific far 1520 01:16:36,439 --> 01:16:39,200 Speaker 13: right panic that became known as the Wi Spa controversy. 1521 01:16:39,360 --> 01:16:41,519 Speaker 13: Do you want to talk about how the right sucked 1522 01:16:41,600 --> 01:16:44,120 Speaker 13: you into that shit, because Jesus Christ. 1523 01:16:44,800 --> 01:16:45,000 Speaker 4: Yeah.
1524 01:16:45,160 --> 01:16:47,280 Speaker 8: So you know, in twenty eighteen, as I said, I 1525 01:16:47,360 --> 01:16:49,160 Speaker 8: put out this album, and then I put out another, 1526 01:16:49,320 --> 01:16:52,519 Speaker 8: and I started, I was touring the country and Canada 1527 01:16:52,640 --> 01:16:56,360 Speaker 8: and stuff and doing shows pretty much constantly, and then 1528 01:16:56,800 --> 01:17:01,120 Speaker 8: twenty twenty happened and I became involved in the George Floyd 1529 01:17:01,439 --> 01:17:07,120 Speaker 8: uprising and the Black Lives Matter marches and protesting, and 1530 01:17:07,439 --> 01:17:12,240 Speaker 8: so I began to livestream those protests and marches 1531 01:17:12,720 --> 01:17:16,160 Speaker 8: with the specific intent of contextualizing what the heck was 1532 01:17:16,240 --> 01:17:19,920 Speaker 8: going on on the streets to people watching, because a 1533 01:17:19,960 --> 01:17:23,200 Speaker 8: lot of people, you know, regardless of their politics, did 1534 01:17:23,280 --> 01:17:26,479 Speaker 8: not understand what the issues were. And the thing is, 1535 01:17:26,560 --> 01:17:30,519 Speaker 8: in LA there were a lot of continual police murders 1536 01:17:30,640 --> 01:17:33,640 Speaker 8: even through the riots. And by police murders, I mean 1537 01:17:33,680 --> 01:17:37,160 Speaker 8: the cops shooting unarmed black people in the back 1538 01:17:37,280 --> 01:17:41,120 Speaker 8: as they were running away, or executions, or shooting them 1539 01:17:41,160 --> 01:17:44,720 Speaker 8: in the car, that type of thing.
And so I 1540 01:17:44,840 --> 01:17:46,920 Speaker 8: was explaining that to the viewers, like, this is why 1541 01:17:47,000 --> 01:17:49,400 Speaker 8: people are in the streets, this is a specific issue, 1542 01:17:49,680 --> 01:17:52,320 Speaker 8: these are the laws surrounding it, and why these actions 1543 01:17:52,400 --> 01:17:56,519 Speaker 8: by the police are not just horrifying, they're also illegal. 1544 01:17:57,479 --> 01:17:59,679 Speaker 8: And so I was doing that and I became pretty 1545 01:17:59,760 --> 01:18:04,120 Speaker 8: darn visible and popular, like I was maybe one of 1546 01:18:04,200 --> 01:18:11,080 Speaker 8: the top five best-known activist or racial justice voices. 1547 01:18:12,000 --> 01:18:15,599 Speaker 8: And I was targeted by a right wing activist who 1548 01:18:15,800 --> 01:18:20,439 Speaker 8: was known for blocking the vaccination clinics at Dodger Stadium, 1549 01:18:21,040 --> 01:18:25,720 Speaker 8: specifically because I was visible, because I was trans, and 1550 01:18:25,880 --> 01:18:29,080 Speaker 8: so she went for me and posted and said that 1551 01:18:29,240 --> 01:18:32,320 Speaker 8: I was a transgender individual who was in the women's 1552 01:18:32,320 --> 01:18:36,240 Speaker 8: spa at Wi Spa and I was sexually harassing people, 1553 01:18:36,720 --> 01:18:40,200 Speaker 8: and that went viral on social media. It was covered 1554 01:18:40,240 --> 01:18:42,960 Speaker 8: on Fox News for a week. I was getting constant 1555 01:18:43,120 --> 01:18:48,200 Speaker 8: death threats, and you know, I was doxxed. It was 1556 01:18:48,280 --> 01:18:52,600 Speaker 8: pretty terrible, especially because I was not that person in 1557 01:18:52,720 --> 01:18:55,479 Speaker 8: the spa, and I was only picked because 1558 01:18:55,600 --> 01:18:59,200 Speaker 8: of my visibility.
My response to 1559 01:18:59,280 --> 01:19:02,000 Speaker 8: that was, no, I didn't immediately say, yeah, it wasn't 1560 01:19:02,040 --> 01:19:03,800 Speaker 8: me, it wasn't me, leave me alone, leave me alone. 1561 01:19:04,400 --> 01:19:08,519 Speaker 8: I didn't do that, because I knew that if I 1562 01:19:08,640 --> 01:19:10,840 Speaker 8: said that, then they would just pick and attack some 1563 01:19:10,920 --> 01:19:15,200 Speaker 8: other trans person. Yeah, and, you know, I know that, 1564 01:19:15,520 --> 01:19:18,400 Speaker 8: like, the shit that the right wing machine 1565 01:19:19,520 --> 01:19:22,000 Speaker 8: enacts, if it happens to one of us, it 1566 01:19:22,120 --> 01:19:24,120 Speaker 8: can happen to all of us, and it likely will. 1567 01:19:24,720 --> 01:19:26,360 Speaker 13: Yeah. And I mean, I think the thing that it 1568 01:19:26,439 --> 01:19:28,720 Speaker 13: reminds me of the most is something we've also covered on the show. 1569 01:19:29,240 --> 01:19:31,640 Speaker 13: This is one of the problems with talking about this. 1570 01:19:31,760 --> 01:19:34,280 Speaker 13: It's like, I've been doing this for so long that, like, there's 1571 01:19:34,400 --> 01:19:36,040 Speaker 13: very few things that I can say where I can't 1572 01:19:36,120 --> 01:19:37,760 Speaker 13: be like, I've said this on the show before. But 1573 01:19:37,880 --> 01:19:39,760 Speaker 13: we spent a lot of time, typically Garrison and I spend 1574 01:19:39,760 --> 01:19:42,160 Speaker 13: a lot of time, covering the way that every time 1575 01:19:42,200 --> 01:19:44,800 Speaker 13: there's a mass shooter, the right immediately just like picks 1576 01:19:44,800 --> 01:19:47,639 Speaker 13: a random trans person and goes, it was this person.
Yeah, 1577 01:19:48,000 --> 01:19:50,240 Speaker 13: and this reminds me a lot of the same thing, 1578 01:19:50,240 --> 01:19:53,439 Speaker 13: although this is more targeted. Like, we've invented a fake 1579 01:19:53,520 --> 01:19:56,439 Speaker 13: controversy about a trans person, and then we're going to 1580 01:19:56,960 --> 01:20:00,200 Speaker 13: like also just pick a random famous, well, not 1581 01:20:00,240 --> 01:20:02,720 Speaker 13: that famous, but like a random trans person that we 1582 01:20:02,880 --> 01:20:03,840 Speaker 13: know about and don't like. 1583 01:20:04,520 --> 01:20:09,360 Speaker 8: It's my understanding that this twenty twenty one Wi Spa 1584 01:20:09,439 --> 01:20:13,080 Speaker 8: controversy that I was targeted for became something of 1585 01:20:13,880 --> 01:20:15,040 Speaker 8: a right wing playbook. 1586 01:20:15,160 --> 01:20:15,320 Speaker 4: Yep. 1587 01:20:15,400 --> 01:20:17,880 Speaker 8: It was after that that they started saying, oh, this 1588 01:20:18,000 --> 01:20:21,600 Speaker 8: and that person's trans, and before that they didn't have 1589 01:20:21,800 --> 01:20:25,200 Speaker 8: a real moral panic around trans people, unless you look 1590 01:20:25,200 --> 01:20:28,639 Speaker 8: all the way back to the North Carolina South Carolina 1591 01:20:28,680 --> 01:20:29,200 Speaker 8: bathroom ban. 1592 01:20:30,600 --> 01:20:32,280 Speaker 13: Yeah, I mean, I think, I think there's an interesting 1593 01:20:32,320 --> 01:20:37,320 Speaker 13: intermediary thing too. Where, so my friend Vicky Osterweil, who, 1594 01:20:38,040 --> 01:20:41,679 Speaker 13: depending on when this episode comes out, you'll be hearing 1595 01:20:41,720 --> 01:20:45,240 Speaker 13: from either right before this episode or right after, 1596 01:20:45,360 --> 01:20:47,800 Speaker 13: had another version of this, where she wrote a book 1597 01:20:47,800 --> 01:20:50,400 Speaker 13: called In Defense of Looting.
It came out in twenty twenty 1598 01:20:50,520 --> 01:20:54,120 Speaker 13: and for like three months, really two months, became 1599 01:20:54,160 --> 01:20:57,000 Speaker 13: like the giant figure that everyone who didn't like 1600 01:20:57,120 --> 01:21:00,599 Speaker 13: the uprising was just like taking it out on, so like, 1601 01:21:00,880 --> 01:21:02,519 Speaker 13: sitting in Congress, people were like 1602 01:21:02,600 --> 01:21:06,000 Speaker 13: denouncing this book that she wrote. 1603 01:21:06,120 --> 01:21:09,519 Speaker 13: Basically every media outlet like specifically had 1604 01:21:09,560 --> 01:21:12,200 Speaker 13: their editorial people going, like, this book is evil. 1605 01:21:12,960 --> 01:21:13,759 Speaker 1: Vicky's evil. 1606 01:21:14,520 --> 01:21:16,240 Speaker 13: And I think that was also like this moment of 1607 01:21:16,479 --> 01:21:19,479 Speaker 13: deep connection between the backlash to the uprisings and the 1608 01:21:19,479 --> 01:21:22,439 Speaker 13: anti-trans backlash, because the people who, you know, are 1609 01:21:22,520 --> 01:21:28,480 Speaker 13: trying to maintain a white supremacist gender system, like, intimately, themselves, 1610 01:21:29,160 --> 01:21:31,200 Speaker 13: even if they don't understand it on a theoretical level, 1611 01:21:31,320 --> 01:21:34,000 Speaker 13: understand that these things are preserving the same 1612 01:21:34,479 --> 01:21:37,720 Speaker 13: systems of violence, and so they picked us as sort 1613 01:21:37,760 --> 01:21:40,840 Speaker 13: of like the wedge point to break this thing apart. 1614 01:21:40,920 --> 01:21:44,479 Speaker 13: And I think with Vicky they hadn't really figured out 1615 01:21:44,600 --> 01:21:48,080 Speaker 13: how to do it.
And I think it was like 1616 01:21:48,160 --> 01:21:50,880 Speaker 13: specifically your case, the Wi Spa thing, with you 1617 01:21:51,000 --> 01:21:52,519 Speaker 13: being put in as the figure of the Wi Spa thing. 1618 01:21:52,640 --> 01:21:55,400 Speaker 13: This is where they like actually really figured out how 1619 01:21:55,439 --> 01:21:59,160 Speaker 13: to do the whole thing. And yeah, there's just something 1620 01:21:59,240 --> 01:22:01,680 Speaker 13: really bleak about how effective it was, and the 1621 01:22:01,760 --> 01:22:04,680 Speaker 13: fact that it's like, these are just people, I don't know, 1622 01:22:04,760 --> 01:22:06,720 Speaker 13: like this isn't a thing that's happening to sort of 1623 01:22:06,840 --> 01:22:09,360 Speaker 13: like abstract figures. It's just like, yeah, people I'm having 1624 01:22:09,439 --> 01:22:13,320 Speaker 13: conversations with. How they 1625 01:22:13,320 --> 01:22:15,800 Speaker 1: did this, just, I don't know. 1626 01:22:16,760 --> 01:22:19,040 Speaker 8: I wish I had more analyses, but yeah, I 1627 01:22:19,120 --> 01:22:24,280 Speaker 8: think that trans people, like, to, to greater America, to most, 1628 01:22:24,720 --> 01:22:28,840 Speaker 8: to a lot of America, I'll say, are, are sensational. 1629 01:22:29,479 --> 01:22:31,840 Speaker 11: You know, people, people imagine 1630 01:22:32,000 --> 01:22:36,040 Speaker 8: chicks with dicks and dudes without dicks, and 1631 01:22:37,160 --> 01:22:40,160 Speaker 8: so I think that's really exciting for a lot of people, for 1632 01:22:40,280 --> 01:22:43,360 Speaker 11: better or for worse. I think for, for worse. 1633 01:22:43,720 --> 01:22:47,800 Speaker 8: But yeah, I think that's just like people and their bodies, right? Like, 1634 01:22:48,120 --> 01:22:50,280 Speaker 8: you know, I guess some people walk around thinking all 1635 01:22:50,360 --> 01:22:52,040 Speaker 8: the time about other people's crotches.
1636 01:22:53,520 --> 01:22:57,439 Speaker 11: I'm not going to say that's a bad thing. Crush 1637 01:22:57,520 --> 01:22:58,639 Speaker 11: the first of your scene. 1638 01:23:00,400 --> 01:23:03,160 Speaker 13: Okay, you know. But on the other hand, right, there's 1639 01:23:03,200 --> 01:23:06,120 Speaker 13: this, there's this aspect in which, like, you know, there's 1640 01:23:06,160 --> 01:23:09,000 Speaker 13: the sensationalism, but then there's also the experience of being 1641 01:23:09,040 --> 01:23:11,360 Speaker 13: a trans person, which is like, I too am trying to 1642 01:23:11,400 --> 01:23:14,360 Speaker 13: find a way to not pay rent. Like, that's, like, 1643 01:23:14,960 --> 01:23:15,320 Speaker 13: I don't know. 1644 01:23:16,400 --> 01:23:20,040 Speaker 8: Yeah, so back to, back to Wi Spa. Yeah, that 1645 01:23:20,160 --> 01:23:22,640 Speaker 8: was a pretty terrible experience for me, and I'm not 1646 01:23:22,720 --> 01:23:23,560 Speaker 8: going to lie about it. 1647 01:23:23,800 --> 01:23:23,960 Speaker 5: You know. 1648 01:23:24,080 --> 01:23:28,360 Speaker 8: I, I'm glad that I stood up for, for myself. 1649 01:23:28,439 --> 01:23:30,479 Speaker 8: I'm glad I stood up for trans people, that I 1650 01:23:30,560 --> 01:23:34,879 Speaker 8: didn't pass the buck. It was also really difficult and traumatic, 1651 01:23:35,360 --> 01:23:38,879 Speaker 8: and I didn't appreciate the death threats. I didn't appreciate 1652 01:23:39,040 --> 01:23:42,640 Speaker 8: being doxxed. And, you know, people have come after me 1653 01:23:43,000 --> 01:23:47,080 Speaker 8: my, my whole life. Like, I present, I think just 1654 01:23:47,280 --> 01:23:51,439 Speaker 8: naturally, physically, like I present as being genderqueer, like 1655 01:23:51,600 --> 01:23:55,040 Speaker 8: I've been pretty, like, appearing, like, looking fem my entire life. 1656 01:23:55,360 --> 01:23:58,400 Speaker 8: That had nothing to do with my internal identity.
I've 1657 01:23:58,439 --> 01:24:03,200 Speaker 8: been weird my entire life. I've perpetually been curious and 1658 01:24:03,880 --> 01:24:07,280 Speaker 8: provocative and interested in things that were provocative, and so 1659 01:24:08,000 --> 01:24:11,400 Speaker 8: I've been harassed my whole life. However, until Wi Spa, 1660 01:24:11,840 --> 01:24:15,880 Speaker 8: I'd never had to experience strangers by the hundreds saying that they're 1661 01:24:15,920 --> 01:24:19,240 Speaker 8: gonna hunt me down and fucking shoot me. Yeah, I 1662 01:24:19,360 --> 01:24:21,360 Speaker 8: never know if someone will recognize me when I'm out 1663 01:24:21,479 --> 01:24:23,640 Speaker 8: and be like, hey buddy. 1664 01:24:23,400 --> 01:24:23,800 Speaker 5: I know you. 1665 01:24:24,640 --> 01:24:26,840 Speaker 13: Yeah. And it's one of these things where, like, the 1666 01:24:27,200 --> 01:24:30,320 Speaker 13: more of a target is on you, the more likely it 1667 01:24:30,439 --> 01:24:32,280 Speaker 13: is to happen. And even people who don't have targets 1668 01:24:32,320 --> 01:24:34,640 Speaker 13: on them, like, I know people, you know, who've never 1669 01:24:34,720 --> 01:24:37,040 Speaker 13: experienced anything like this that have still just like had 1670 01:24:37,120 --> 01:24:39,880 Speaker 13: attacks on them. And it's fucking terrifying. 1671 01:24:40,040 --> 01:24:41,880 Speaker 1: It is an absolutely terrible 1672 01:24:42,600 --> 01:24:44,560 Speaker 13: way to have to live, a terrible sort of 1673 01:24:45,160 --> 01:24:48,480 Speaker 13: thing to have to experience, especially as it's just intensifying. 1674 01:24:49,120 --> 01:24:51,040 Speaker 1: So yeah, that was, that was twenty twenty one, and 1675 01:24:51,120 --> 01:24:51,720 Speaker 1: that changed me. 1676 01:24:52,400 --> 01:24:55,400 Speaker 8: That experience of being targeted and just picked on out 1677 01:24:55,439 --> 01:24:58,439 Speaker 8: of the freaking blue, that changed me as an artist.
1678 01:24:59,160 --> 01:25:02,560 Speaker 8: And at the same time it also firmly established me 1679 01:25:02,800 --> 01:25:06,479 Speaker 8: as a sort of celebrity. And I want to speak 1680 01:25:06,520 --> 01:25:10,840 Speaker 8: to that, because there's different tiers of celebrity, you know. 1681 01:25:11,000 --> 01:25:17,320 Speaker 8: At the, yes, at the top, there's the tier that's 1682 01:25:17,400 --> 01:25:20,519 Speaker 8: known as, I get a Christmas card from Tom Cruise 1683 01:25:20,760 --> 01:25:25,200 Speaker 8: every year, and that's an actual thing, and that's the 1684 01:25:25,280 --> 01:25:27,760 Speaker 8: A List. And you know you're on the A List 1685 01:25:27,800 --> 01:25:31,600 Speaker 8: because Tom Cruise sends you a Christmas card. And at the 1686 01:25:31,600 --> 01:25:35,639 Speaker 8: bottom is me, who's been in mass media many times 1687 01:25:35,720 --> 01:25:37,759 Speaker 8: now but has none of the benefits. 1688 01:25:37,920 --> 01:25:38,120 Speaker 5: Yeah. 1689 01:25:38,680 --> 01:25:41,719 Speaker 8: You know, I make very little money as an artist, 1690 01:25:42,360 --> 01:25:45,840 Speaker 8: and I don't have an entourage. I can tour and 1691 01:25:45,920 --> 01:25:48,920 Speaker 8: I play, play some shows, and, you know, some of 1692 01:25:48,960 --> 01:25:51,559 Speaker 8: them are sold out. I'm playing a show in LA 1693 01:25:51,760 --> 01:25:54,360 Speaker 8: that's sold out, but I'm not playing large venues. No, 1694 01:25:54,520 --> 01:25:57,240 Speaker 8: I maybe play like, you know, three hundred, five hundred 1695 01:25:57,280 --> 01:26:00,240 Speaker 8: capacity tops, and most of the places I play are like 1696 01:26:00,360 --> 01:26:02,760 Speaker 8: small clubs that hold one hundred and fifty or so. Yeah, 1697 01:26:02,920 --> 01:26:07,120 Speaker 8: and so I don't, I don't have the protection that 1698 01:26:07,800 --> 01:26:12,360 Speaker 8: most celebrities inherently have. I don't have book deals.
I 1699 01:26:12,439 --> 01:26:15,439 Speaker 8: don't have movie deals. I don't even have an agent. 1700 01:26:15,920 --> 01:26:17,280 Speaker 8: I don't even have a record label. 1701 01:26:18,120 --> 01:26:18,360 Speaker 5: Yeah. 1702 01:26:19,200 --> 01:26:23,080 Speaker 11: So I'm the freaking D tier. I'm the D tier, 1703 01:26:24,360 --> 01:26:25,360 Speaker 11: and they're coming after me. 1704 01:26:26,600 --> 01:26:26,840 Speaker 10: Yeah. 1705 01:26:26,920 --> 01:26:28,320 Speaker 13: And it's, I mean, and that's the thing about the 1706 01:26:28,360 --> 01:26:31,360 Speaker 13: sort of, like, this status of like niche D tier 1707 01:26:31,400 --> 01:26:34,200 Speaker 13: internet micro celebrity, is like, I don't, like, I feel 1708 01:26:34,200 --> 01:26:36,479 Speaker 13: like I got the best possible version of it, where 1709 01:26:36,520 --> 01:26:40,160 Speaker 13: like I got a job that pays, slightly. I think 1710 01:26:40,200 --> 01:26:42,880 Speaker 13: I might have hit, no, I'm still, I'm still below 1711 01:26:43,000 --> 01:26:45,040 Speaker 13: the median salary of a cis man in the 1712 01:26:45,120 --> 01:26:46,800 Speaker 1: US, but I'm approaching it. 1713 01:26:47,240 --> 01:26:50,799 Speaker 13: We're getting closer every year, every union fight, we approach 1714 01:26:51,320 --> 01:26:55,240 Speaker 13: median cis man salary.
But like, yeah, the situation I 1715 01:26:55,320 --> 01:26:57,840 Speaker 13: got is effectively the equivalent of winning the trans lottery, right? 1716 01:26:58,120 --> 01:27:00,840 Speaker 13: Like, this is about the best you could possibly hope 1717 01:27:00,880 --> 01:27:03,760 Speaker 13: for if you're a trans person and you get famous. Like, yeah, 1718 01:27:04,000 --> 01:27:05,920 Speaker 13: I mean, like, I get threats too, right? Like, 1719 01:27:06,000 --> 01:27:10,160 Speaker 13: nothing anywhere near the scale that you get. And mostly 1720 01:27:10,240 --> 01:27:14,160 Speaker 13: what happens is, like, I mean, like, single-digit numbers 1721 01:27:14,400 --> 01:27:18,439 Speaker 13: of trans people in the US have the kind of 1722 01:27:18,560 --> 01:27:22,760 Speaker 13: protection that actual celebrity gives you, and for everyone else, celebrity 1723 01:27:22,920 --> 01:27:25,400 Speaker 13: is just another... it's just a giant target painted on you. 1724 01:27:26,240 --> 01:27:28,840 Speaker 13: And yes, you have none of the benefits and all 1725 01:27:28,920 --> 01:27:31,720 Speaker 13: of the sort of, hey, here is one hundred and 1726 01:27:31,760 --> 01:27:35,000 Speaker 13: fifty million people who absolutely hate you and who've been 1727 01:27:35,040 --> 01:27:36,960 Speaker 13: primed and targeted, like, specifically at you. 1728 01:27:38,120 --> 01:27:41,880 Speaker 8: Yeah. It's something that I wonder about. Like, why 1729 01:27:41,920 --> 01:27:44,599 Speaker 8: would they do something like this? Why would Fox News 1730 01:27:44,680 --> 01:27:47,160 Speaker 8: talk about me?
Why would the founder of the 1731 01:27:47,280 --> 01:27:52,160 Speaker 8: Proud Boys, Gavin McInnes, make that old video about me? Proud 1732 01:27:52,200 --> 01:27:54,559 Speaker 8: Boys is a terrorist group, if you don't know, listener. 1733 01:27:54,800 --> 01:27:58,360 Speaker 8: They're recognized as a terrorist group by Canada, not here, 1734 01:27:58,479 --> 01:27:59,800 Speaker 8: because, like, we're just cool with that. 1735 01:28:00,080 --> 01:28:05,640 Speaker 11: Here, they're a white supremacist terrorist group. So why me? 1736 01:28:06,439 --> 01:28:09,960 Speaker 11: Why, why would a congressperson come after me? 1737 01:28:10,600 --> 01:28:16,559 Speaker 8: And my own hypothesis is that they punch down because 1738 01:28:17,040 --> 01:28:22,120 Speaker 8: they secretly believe that they themselves are weak, and attacking me, 1739 01:28:22,439 --> 01:28:25,720 Speaker 8: attacking people like me, is a fight that they can win. 1740 01:28:27,080 --> 01:28:30,600 Speaker 8: And my view is that it's not a fight that 1741 01:28:30,680 --> 01:28:34,519 Speaker 8: they can win, because they've already lost. They're trying to 1742 01:28:34,560 --> 01:28:38,920 Speaker 8: get power in small, small ways and false ways, in 1743 01:28:39,000 --> 01:28:41,599 Speaker 8: my opinion, and they're trying to get money. First of all, 1744 01:28:41,640 --> 01:28:44,320 Speaker 8: they want money, you know, they want their views, they 1745 01:28:44,360 --> 01:28:48,240 Speaker 8: want their donations, and that's the entire top of the 1746 01:28:48,280 --> 01:28:52,679 Speaker 8: pyramid for them. But for me, though, I'm a fucking artist, 1747 01:28:53,080 --> 01:28:57,960 Speaker 8: and power is something that I've always been developing, because 1748 01:28:58,000 --> 01:29:02,040 Speaker 8: I've sought to know myself.
I've sought to understand who 1749 01:29:02,120 --> 01:29:06,200 Speaker 8: I am and why, and that extends to myself as a 1750 01:29:06,240 --> 01:29:08,760 Speaker 8: queer person. To come to understand myself, who I am 1751 01:29:08,800 --> 01:29:11,680 Speaker 8: as a queer person, that's not something they'll ever have. 1752 01:29:12,240 --> 01:29:16,000 Speaker 8: So I've already fucking won. And same with all 1753 01:29:16,080 --> 01:29:18,640 Speaker 8: the other queer people that are under attack, all you 1754 01:29:18,720 --> 01:29:23,959 Speaker 8: other D-tier queer celebrities out there, we fucking won. 1755 01:29:25,720 --> 01:29:26,400 Speaker 1: Hopefully well. 1756 01:29:26,479 --> 01:29:28,600 Speaker 13: And I think part of this is also, like, the 1757 01:29:28,680 --> 01:29:30,920 Speaker 13: reason this campaign is happening is because they're trying to 1758 01:29:30,960 --> 01:29:35,160 Speaker 13: stop the tide from coming in. And they saw how 1759 01:29:35,400 --> 01:29:38,439 Speaker 13: far in the tide had already come, and now they're 1760 01:29:38,520 --> 01:29:44,080 Speaker 13: trying to, like, dam off the tide. And, you know, like, 1761 01:29:44,400 --> 01:29:47,840 Speaker 13: probably it's not going to work. But the only way 1762 01:29:47,880 --> 01:29:50,760 Speaker 13: that it can is if everyone just, like, sits here, 1763 01:29:50,920 --> 01:29:53,599 Speaker 13: does nothing, and lets them just keep building and building 1764 01:29:53,680 --> 01:29:56,680 Speaker 13: and building more walls. It's something that is within our 1765 01:29:56,760 --> 01:30:00,400 Speaker 13: power to resist. We just have to actually do it, right? 1766 01:30:00,479 --> 01:30:02,400 Speaker 13: You have to actually organize, you have to talk to 1767 01:30:02,439 --> 01:30:04,120 Speaker 13: the people around you, you have to go get them 1768 01:30:04,160 --> 01:30:07,360 Speaker 13: to do things to resist this.
And if we do, yeah, well, 1769 01:30:07,400 --> 01:30:09,559 Speaker 13: like, with the things that we've already won, the things 1770 01:30:09,560 --> 01:30:11,200 Speaker 13: that we are going to win are going to stick. 1771 01:30:11,280 --> 01:30:13,679 Speaker 13: But if not, like, things are going to get really, 1772 01:30:13,840 --> 01:30:17,680 Speaker 13: really bad, really quickly. And yeah, and speaking of things 1773 01:30:17,760 --> 01:30:21,599 Speaker 13: getting very bad very quickly, here are some more ads 1774 01:30:22,000 --> 01:30:36,519 Speaker 13: before we get back to this. We are back. Yeah, 1775 01:30:37,040 --> 01:30:41,800 Speaker 13: you've been specifically targeted by a sitting US Congresswoman, Nancy Mace, 1776 01:30:41,960 --> 01:30:45,519 Speaker 13: who is the person who actually... I don't know if 1777 01:30:45,520 --> 01:30:48,320 Speaker 13: we talked about the bathroom stuff here yet, but she's 1778 01:30:48,479 --> 01:30:51,120 Speaker 13: the sort of person behind an attempt to get trans 1779 01:30:51,160 --> 01:30:52,960 Speaker 13: people to not be able to use the bathroom on 1780 01:30:53,080 --> 01:30:57,679 Speaker 13: Capitol Hill. She's become a leading anti-trans figure in Congress. 1781 01:30:57,760 --> 01:31:00,559 Speaker 13: Literally every single thing that she tweets about is about 1782 01:31:00,600 --> 01:31:03,439 Speaker 13: trans women and how they should be put in men's jails, 1783 01:31:03,760 --> 01:31:07,400 Speaker 13: which is just an incredibly cynical ploy to make a 1784 01:31:07,400 --> 01:31:11,000 Speaker 13: bunch of people get horribly raped and killed, which is 1785 01:31:11,760 --> 01:31:13,479 Speaker 13: one of the predominant things that happens when we get 1786 01:31:13,520 --> 01:31:16,960 Speaker 13: put in men's prisons.
And she specifically came after you. 1787 01:31:17,000 --> 01:31:19,560 Speaker 13: Do you want to talk about how that happened, and the 1788 01:31:19,720 --> 01:31:24,120 Speaker 13: latest sort of... a congresswoman tweets and a fucking 1789 01:31:24,200 --> 01:31:26,200 Speaker 13: social media company does their bidding? 1790 01:31:27,000 --> 01:31:30,680 Speaker 8: Yeah, so right at the end of twenty twenty-four, 1791 01:31:30,800 --> 01:31:36,720 Speaker 8: I think it was on December twenty-eighth, I was 1792 01:31:37,040 --> 01:31:42,439 Speaker 8: doxxed by a right-wing troll nazi that has doxxed 1793 01:31:43,520 --> 01:31:47,920 Speaker 8: multiple friends of mine, activist friends, artist friends, and 1794 01:31:48,520 --> 01:31:53,640 Speaker 8: they pointed out in a tweet how my art was 1795 01:31:53,760 --> 01:31:57,719 Speaker 8: on YouTube, specifically my music videos were on YouTube, calling 1796 01:31:57,880 --> 01:32:01,360 Speaker 8: for violence, and how I was an evil trans person, 1797 01:32:02,280 --> 01:32:06,320 Speaker 8: and they added, like, at-symbol YouTube and said that 1798 01:32:07,280 --> 01:32:09,599 Speaker 8: these videos are in violation of your terms of service. 1799 01:32:10,200 --> 01:32:11,400 Speaker 11: Why are they still up? 1800 01:32:11,880 --> 01:32:15,360 Speaker 8: And Nancy Mace saw that, because if you look at 1801 01:32:15,439 --> 01:32:19,400 Speaker 8: her Twitter, it's all just these doxxes of trans people 1802 01:32:19,720 --> 01:32:24,080 Speaker 8: and perpetual, like, rage bait about the queer menace, the 1803 01:32:24,200 --> 01:32:28,919 Speaker 8: trans menace. And so she saw that, retweeted, and said, YouTube, 1804 01:32:29,240 --> 01:32:32,360 Speaker 8: this squarely violates your terms of service. Why haven't you 1805 01:32:32,400 --> 01:32:36,559 Speaker 8: done anything?
And then immediately following that, my videos were 1806 01:32:36,600 --> 01:32:39,799 Speaker 8: taken down, the ones that were mentioned in these tweets. 1807 01:32:40,600 --> 01:32:44,120 Speaker 8: And as I said before, one of these videos, 1808 01:32:44,600 --> 01:32:46,840 Speaker 8: the one that was of my little problem, had been up 1809 01:32:46,920 --> 01:32:48,880 Speaker 8: for seven years. 1810 01:32:49,520 --> 01:32:51,800 Speaker 1: Yeah, for a long time. 1811 01:32:52,360 --> 01:32:52,599 Speaker 6: Yeah. 1812 01:32:53,040 --> 01:32:58,519 Speaker 8: And YouTube's terms of service, they're very clear, and I 1813 01:32:58,640 --> 01:33:01,360 Speaker 8: do my best to stay within YouTube's terms of service 1814 01:33:01,880 --> 01:33:03,559 Speaker 8: so my work doesn't get taken down. 1815 01:33:04,000 --> 01:33:11,000 Speaker 11: And they state that stuff like violence, minimal nudity, that is 1816 01:33:11,080 --> 01:33:13,920 Speaker 8: allowed within the context of art, within the context of 1817 01:33:14,120 --> 01:33:18,840 Speaker 8: music videos. And so my videos, you know, they weren't 1818 01:33:18,920 --> 01:33:21,920 Speaker 8: designed to... this is from the terms of service... they 1819 01:33:21,960 --> 01:33:26,720 Speaker 8: weren't designed to sexually titillate or gratify. That's exactly what 1820 01:33:26,840 --> 01:33:30,080 Speaker 8: it says in the terms. Now, they weren't 1821 01:33:30,600 --> 01:33:34,960 Speaker 8: recreations of real-life violence, and they weren't real-life violence, 1822 01:33:35,240 --> 01:33:37,920 Speaker 8: but they were still removed at the 1823 01:33:38,000 --> 01:33:42,559 Speaker 8: behest, at the easy click press, of Nancy Mace.
1824 01:33:43,360 --> 01:33:45,599 Speaker 13: Yeah. And I think there's a couple of things 1825 01:33:45,680 --> 01:33:47,880 Speaker 13: going on here, one of which is, you know, so 1826 01:33:48,040 --> 01:33:51,080 Speaker 13: we've seen this with Facebook in the last... I guess 1827 01:33:51,120 --> 01:33:52,559 Speaker 13: when this comes out it will be like a week ago. 1828 01:33:52,680 --> 01:33:55,800 Speaker 13: But, you know, Facebook has instated policies that allow 1829 01:33:55,920 --> 01:33:59,160 Speaker 13: you to basically say slurs against queer people, allow you 1830 01:33:59,240 --> 01:34:01,640 Speaker 13: to call queer people mentally ill and stuff like that, 1831 01:34:01,800 --> 01:34:04,479 Speaker 13: that very specifically you could only do to queer people. 1832 01:34:04,520 --> 01:34:06,760 Speaker 13: You can't do it to anyone else. And I think 1833 01:34:06,800 --> 01:34:09,320 Speaker 13: there's this sort of trend here of... I don't know, 1834 01:34:09,479 --> 01:34:13,000 Speaker 13: with Facebook, I wouldn't say that it's, like, compliance with 1835 01:34:13,240 --> 01:34:15,400 Speaker 13: the sort of new Trump regime, because, like, this is 1836 01:34:15,479 --> 01:34:17,400 Speaker 13: just what Facebook is, right? Like, they did the Rohingya 1837 01:34:17,439 --> 01:34:20,320 Speaker 13: genocide, like the genocide in Tigray too, that 1838 01:34:20,479 --> 01:34:23,120 Speaker 13: was also a Facebook thing. So they've always just been 1839 01:34:23,560 --> 01:34:26,280 Speaker 13: evil, and they have been sort of looking for the 1840 01:34:26,360 --> 01:34:29,439 Speaker 13: excuse that they needed to, like, drop the hammer on us.
1841 01:34:29,600 --> 01:34:32,160 Speaker 13: But I think with YouTube, to some extent too, what we're 1842 01:34:32,200 --> 01:34:34,599 Speaker 13: seeing right now is this kind of, like, mask-coming-off 1843 01:34:34,640 --> 01:34:38,400 Speaker 13: moment, where people are realizing that with Trump in power, 1844 01:34:38,479 --> 01:34:41,599 Speaker 13: they can just drop the hammer on a queer artist, 1845 01:34:41,720 --> 01:34:46,120 Speaker 13: specifically, like, on a trans artist, because now they 1846 01:34:46,240 --> 01:34:48,840 Speaker 13: have this sort of, like, backing to do this stuff, 1847 01:34:49,000 --> 01:34:51,960 Speaker 13: and the right has, you know, realized that they can 1848 01:34:52,120 --> 01:34:54,280 Speaker 13: tell YouTube, take this video down, and they'll do it. 1849 01:34:55,680 --> 01:35:01,280 Speaker 13: And that's a really terrifying precedent in a lot 1850 01:35:01,320 --> 01:35:03,719 Speaker 13: of ways. And also it's very... you know, I was like, yeah, obviously, 1851 01:35:03,720 --> 01:35:05,439 Speaker 13: because pointing out hypocrisy does nothing. But, like, 1852 01:35:06,080 --> 01:35:10,479 Speaker 13: I'm trying to think of a more explicit demonstration of 1853 01:35:10,600 --> 01:35:13,160 Speaker 13: censorship than a member of the government says that something 1854 01:35:13,200 --> 01:35:14,960 Speaker 13: should be taken down, and it's taken down. 1855 01:35:15,360 --> 01:35:18,439 Speaker 10: It's like, it's really... 1856 01:35:18,479 --> 01:35:23,600 Speaker 8: So, you know, it makes me kind of afraid, honestly, because, you know, 1857 01:35:23,760 --> 01:35:27,600 Speaker 8: before, I was a victim of a moral panic, and 1858 01:35:27,840 --> 01:35:33,120 Speaker 8: now my work is effectively being disappeared with little fanfare. 1859 01:35:33,920 --> 01:35:37,200 Speaker 8: So, you know, what's going to happen next?
What 1860 01:35:37,360 --> 01:35:41,240 Speaker 8: will we see just made invisible and unseen? And I 1861 01:35:41,360 --> 01:35:43,599 Speaker 8: know that in this country I have freedom of speech, 1862 01:35:44,120 --> 01:35:46,640 Speaker 8: but that's really bullshit. We all know that. Like, 1863 01:35:46,720 --> 01:35:48,840 Speaker 8: I'm not going to reach many people if I stand 1864 01:35:48,920 --> 01:35:51,400 Speaker 8: on a street corner at the park and yell at people. 1865 01:35:51,960 --> 01:35:55,040 Speaker 11: What matters nowadays is the freedom of reach that 1866 01:35:55,200 --> 01:35:59,160 Speaker 8: these social media platforms control, that are themselves controlled now 1867 01:35:59,280 --> 01:36:03,839 Speaker 8: by the Republican Party. And what happens when our freedom 1868 01:36:03,880 --> 01:36:08,080 Speaker 8: of reach is annihilated, and then suddenly trans people are 1869 01:36:08,320 --> 01:36:09,560 Speaker 8: actually invisible? 1870 01:36:10,880 --> 01:36:13,960 Speaker 11: We're very close to that. I think Nancy proved that. 1871 01:36:15,000 --> 01:36:18,479 Speaker 13: Yeah, and disappearing people from the mainstream is the first 1872 01:36:18,520 --> 01:36:20,599 Speaker 13: step for how you destroy a people. 1873 01:36:20,920 --> 01:36:21,160 Speaker 5: Yeah. 1874 01:36:21,439 --> 01:36:27,639 Speaker 8: Art is perhaps the loudest way a person can speak, 1875 01:36:28,560 --> 01:36:33,400 Speaker 8: and I know that's why she came after my art. 1876 01:36:33,680 --> 01:36:36,479 Speaker 13: There's something just incredibly galling about watching this whole thing happen.
1877 01:36:36,560 --> 01:36:38,360 Speaker 13: And then, like, the next tweet is, again, a sitting 1878 01:36:38,400 --> 01:36:40,519 Speaker 13: member of the US government saying that trans women should 1879 01:36:40,520 --> 01:36:42,840 Speaker 13: be put in men's prisons, and it's like, okay, one 1880 01:36:42,880 --> 01:36:45,960 Speaker 13: of these is considered violence by sort of the media machine. 1881 01:36:46,080 --> 01:36:46,840 Speaker 1: One of them isn't. 1882 01:36:47,320 --> 01:36:50,559 Speaker 8: Yeah, does she even actually do work for the people 1883 01:36:50,680 --> 01:36:54,800 Speaker 8: of South Carolina? Like, from what I saw, all she does is 1884 01:36:55,040 --> 01:36:58,000 Speaker 8: just, like, start shit with trans people. Like, she's 1885 01:36:58,080 --> 01:37:01,599 Speaker 8: another lousy politician trying to be an entertainer. And she's 1886 01:37:01,640 --> 01:37:05,560 Speaker 8: just a knockoff product of a bootleg Trump, that's what 1887 01:37:05,680 --> 01:37:08,840 Speaker 8: she is. And she's not even good at her fucking politics. 1888 01:37:08,960 --> 01:37:13,160 Speaker 8: Like, she's tried to get this bathroom ban disallowing 1889 01:37:13,720 --> 01:37:16,360 Speaker 8: trans people from using the bathroom at the US Capitol, and her 1890 01:37:16,439 --> 01:37:20,680 Speaker 8: own freaking party kicked the bill out 1891 01:37:20,680 --> 01:37:23,840 Speaker 8: of the bylaws for this year, so she couldn't even 1892 01:37:23,880 --> 01:37:26,960 Speaker 8: get that. And yeah, so I guess she thinks she 1893 01:37:27,040 --> 01:37:31,040 Speaker 8: can get a win by harassing me, harassing 1894 01:37:31,080 --> 01:37:33,720 Speaker 8: my art, trying to get people to come after me. 1895 01:37:34,360 --> 01:37:37,120 Speaker 13: You know, these people are not as powerful as they 1896 01:37:37,200 --> 01:37:39,280 Speaker 13: want you to believe.
Right, a lot of their stuff 1897 01:37:39,400 --> 01:37:43,360 Speaker 13: just fails. But they will only fail if people are 1898 01:37:43,400 --> 01:37:45,680 Speaker 13: willing to resist and people are willing to stop them. 1899 01:37:46,439 --> 01:37:48,760 Speaker 13: And that's the thing that's needed in this moment. 1900 01:37:49,000 --> 01:37:53,400 Speaker 13: It is organization, you know, it is action. 1901 01:37:53,680 --> 01:37:56,599 Speaker 13: It is now, it is now the time 1902 01:37:56,840 --> 01:38:00,280 Speaker 13: to go do whatever political activity thing you've been 1903 01:38:00,439 --> 01:38:02,439 Speaker 13: wondering about, like, should I be organizing a union? Should I 1904 01:38:02,520 --> 01:38:05,920 Speaker 13: be, like, setting up strikes? Should I be doing street demonstrations? 1905 01:38:05,960 --> 01:38:09,480 Speaker 13: It's like, yeah, it's time, it's time to go. Because otherwise... 1906 01:38:09,560 --> 01:38:11,320 Speaker 13: you know, and I think this is something that, like, 1907 01:38:12,520 --> 01:38:15,320 Speaker 13: every trans person now understands intimately, and I think most 1908 01:38:15,360 --> 01:38:19,760 Speaker 13: people don't, which is that right now it's us. But, 1909 01:38:20,200 --> 01:38:23,479 Speaker 13: you know, in two years, assuming we're all still alive, 1910 01:38:23,560 --> 01:38:25,000 Speaker 13: there's a very good chance that it's going to be 1911 01:38:25,160 --> 01:38:27,320 Speaker 13: you, like, showing up on this show because a fucking 1912 01:38:27,360 --> 01:38:30,800 Speaker 13: congressperson has deliberately intervened to destroy your life. And 1913 01:38:31,280 --> 01:38:33,720 Speaker 13: I would rather we had stopped this before it got 1914 01:38:33,760 --> 01:38:36,800 Speaker 13: to any of us. But they're going to come for 1915 01:38:36,880 --> 01:38:40,840 Speaker 13: you too, unless we stop them.
Thank you so much 1916 01:38:41,120 --> 01:38:43,559 Speaker 13: for coming on the show. And where can people find 1917 01:38:43,680 --> 01:38:47,000 Speaker 13: you and find your art and, yeah, support you? 1918 01:38:47,360 --> 01:38:49,559 Speaker 8: Yeah, thanks for having me, Mia. I really like talking 1919 01:38:49,720 --> 01:38:54,120 Speaker 8: with you. It's been very good. And listeners, you can 1920 01:38:54,160 --> 01:38:58,400 Speaker 8: find my work on Spotify. You can also check out 1921 01:38:58,439 --> 01:39:03,280 Speaker 8: my website at Precious Child dot com, and please sign 1922 01:39:03,360 --> 01:39:06,120 Speaker 8: up for my mailing list there as well. That is 1923 01:39:06,240 --> 01:39:09,360 Speaker 8: the best and most direct way for me to stay 1924 01:39:09,400 --> 01:39:12,679 Speaker 8: in touch with my friends and fans. I'm also Precious 1925 01:39:12,760 --> 01:39:17,960 Speaker 8: Child on Instagram; on TikTok, I am The Last Precious Child. 1926 01:39:18,880 --> 01:39:22,320 Speaker 8: I also will be doing live shows this spring and 1927 01:39:22,439 --> 01:39:26,400 Speaker 8: summer in the US, and so follow me on my 1928 01:39:26,479 --> 01:39:29,240 Speaker 8: website and on social media to stay up to date 1929 01:39:29,320 --> 01:39:31,519 Speaker 8: on that, and come say hi in person. 1930 01:39:31,880 --> 01:39:34,880 Speaker 13: Hell yeah, we will have links to all of that 1931 01:39:35,240 --> 01:39:38,640 Speaker 13: in the description. So yeah, go check it out and 1932 01:39:39,640 --> 01:39:41,959 Speaker 13: resist the creep of fascism. 1933 01:39:42,920 --> 01:39:45,880 Speaker 8: There's one thing I want to add about what I said 1934 01:39:45,920 --> 01:39:49,360 Speaker 8: earlier about personal power and how I had to develop 1935 01:39:49,360 --> 01:39:53,240 Speaker 8: my personal power by getting to know myself.
I want 1936 01:39:53,280 --> 01:39:56,439 Speaker 8: to tell trans people out there, you queer people and 1937 01:39:56,600 --> 01:40:00,519 Speaker 8: your allies: the first thing is getting to know yourselves, and the 1938 01:40:00,840 --> 01:40:03,760 Speaker 8: next thing is, like, fuck these fucking laws, fuck these 1939 01:40:03,840 --> 01:40:07,479 Speaker 8: fucking lawmakers. Yeah, get to know each other and strengthen 1940 01:40:07,600 --> 01:40:11,320 Speaker 8: our bonds with each other, because those are bigger than 1941 01:40:11,439 --> 01:40:14,120 Speaker 8: any type of oppressive laws that are put upon us. 1942 01:40:14,600 --> 01:40:17,320 Speaker 8: And it's only by the strength that we develop with 1943 01:40:17,520 --> 01:40:21,559 Speaker 8: each other, within each other, that we will persevere. 1944 01:40:38,760 --> 01:40:41,680 Speaker 1: Welcome back to It Could Happen Here, a podcast about it, 1945 01:40:42,000 --> 01:40:46,000 Speaker 1: the Consumer Electronics Show, happening here, to everyone. And of 1946 01:40:46,080 --> 01:40:48,680 Speaker 1: course it is in fact happening to everyone, because over 1947 01:40:48,720 --> 01:40:51,439 Speaker 1: the course of the day, all of our subjects here, 1948 01:40:51,560 --> 01:40:55,000 Speaker 1: all of our experts here, have watched different kinds of 1949 01:40:55,120 --> 01:40:57,600 Speaker 1: dudes explain the different kinds of jobs they want to 1950 01:40:57,640 --> 01:41:00,080 Speaker 1: replace with a chatbot that was trained on Reddit. 1951 01:41:00,960 --> 01:41:03,519 Speaker 1: So I'm going to go around the circle and introduce 1952 01:41:03,640 --> 01:41:06,519 Speaker 1: our guests today. First off, we've got the great Ed 1953 01:41:06,640 --> 01:41:09,439 Speaker 1: Ongweso Jr. Ed, thank you for being here. Thanks for 1954 01:41:09,479 --> 01:41:13,479 Speaker 1: having me on.
We've got Garrison Davis, who's also great, 1955 01:41:13,560 --> 01:41:15,360 Speaker 1: but I'm not gonna say it at the same time 1956 01:41:15,400 --> 01:41:17,599 Speaker 1: because I don't want Ed's compliment to feel like less. 1957 01:41:18,200 --> 01:41:20,880 Speaker 1: But you're contractually obligated to not mind. 1958 01:41:21,840 --> 01:41:28,759 Speaker 5: Yes, thank you, boss. Great to be here, as always. 1959 01:41:29,400 --> 01:41:34,200 Speaker 1: Very natural, very natural. Hi, hi, hello, hello, hello, hello, 1960 01:41:34,360 --> 01:41:37,559 Speaker 1: thank you. This is your first CES as well. That's right. 1961 01:41:37,880 --> 01:41:41,400 Speaker 1: Your first time being a journalist? Also true. How do 1962 01:41:41,479 --> 01:41:44,439 Speaker 1: you feel doing the job that Alex Garland has just 1963 01:41:44,560 --> 01:41:47,960 Speaker 1: reminded us in the movie Civil War is a fundamentally noble, 1964 01:41:48,040 --> 01:41:50,200 Speaker 1: perfect endeavor only practiced by heroes? 1965 01:41:50,400 --> 01:41:53,240 Speaker 14: I love wearing a dress shirt and tie and just 1966 01:41:53,320 --> 01:41:54,360 Speaker 14: getting very drunk. 1967 01:41:54,640 --> 01:41:57,599 Speaker 1: Yeah, you were very surprised when I gave you your gun, 1968 01:41:57,720 --> 01:42:00,800 Speaker 1: but you can't be a journalist without one. Yeah, yeah, 1969 01:42:01,439 --> 01:42:06,800 Speaker 1: I'm playing with it without the safety. And last but 1970 01:42:06,960 --> 01:42:10,320 Speaker 1: certainly not least, in fact, maybe better than some people 1971 01:42:10,400 --> 01:42:12,160 Speaker 1: in the room. Again, I'm not going to say who. 1972 01:42:12,280 --> 01:42:14,400 Speaker 1: You can wonder that for yourself. Feel in. 1973 01:42:16,120 --> 01:42:16,320 Speaker 4: Rath. 1974 01:42:16,840 --> 01:42:20,479 Speaker 15: Thanks, I agree that I'm pretty good.
How much better 1975 01:42:20,520 --> 01:42:22,439 Speaker 15: am I than how many people in this room? 1976 01:42:22,439 --> 01:42:24,479 Speaker 1: I'm not... that's not even really something I was, 1977 01:42:24,640 --> 01:42:26,840 Speaker 1: like, talking about. Yeah, exactly, because we haven't gotten 1978 01:42:26,880 --> 01:42:29,280 Speaker 1: those numbers back from OpenAI. Yeah, it would be 1979 01:42:29,640 --> 01:42:32,160 Speaker 1: irresponsible to speculate at the moment. Yeah, I got away 1980 01:42:32,200 --> 01:42:36,439 Speaker 1: for three. So, what I want to do here, I 1981 01:42:36,479 --> 01:42:38,160 Speaker 1: think this is kind of our roll-up. We spent 1982 01:42:38,240 --> 01:42:40,040 Speaker 1: our last day on the floor. I want to go 1983 01:42:40,120 --> 01:42:41,840 Speaker 1: around, and I'll start first. You guys have a second 1984 01:42:41,880 --> 01:42:44,639 Speaker 1: to get your thoughts together. What comes to mind immediately 1985 01:42:44,720 --> 01:42:45,960 Speaker 1: is like, this is the thing that I had the 1986 01:42:46,040 --> 01:42:48,680 Speaker 1: most positive reaction to, and this is the thing that 1987 01:42:48,720 --> 01:42:51,160 Speaker 1: I had the most negative reaction to. I think it 1988 01:42:51,200 --> 01:42:53,439 Speaker 1: is a solid way for us to start out, and 1989 01:42:53,520 --> 01:42:56,679 Speaker 1: I think my most negative reaction, obviously, was the Amy 1990 01:42:56,920 --> 01:43:01,360 Speaker 1: artificial child best friend toy, which was deeply upsetting and uncomfortable, 1991 01:43:02,080 --> 01:43:03,800 Speaker 1: and I hated both that.
Like, I could tell from 1992 01:43:03,800 --> 01:43:07,559 Speaker 1: an industrial design standpoint, pretty good design. Like, it looked 1993 01:43:07,600 --> 01:43:10,240 Speaker 1: like something like, oh, kids'll think that's cute. And from 1994 01:43:10,320 --> 01:43:13,160 Speaker 1: a this-is-our-intent-for-this-product standpoint, it 1995 01:43:13,240 --> 01:43:15,519 Speaker 1: felt like a replacement for the love of adults in 1996 01:43:15,560 --> 01:43:18,200 Speaker 1: the life of a small child, which I thought was, 1997 01:43:18,280 --> 01:43:21,000 Speaker 1: like, evil in a profound way. And I guess the 1998 01:43:21,200 --> 01:43:24,680 Speaker 1: best thing that I saw... I'm not perfectly competent at 1999 01:43:24,720 --> 01:43:27,720 Speaker 1: this point to, like, analyze how well it worked, but 2000 01:43:27,800 --> 01:43:29,840 Speaker 1: from the demo I saw, I was very impressed with 2001 01:43:30,120 --> 01:43:34,439 Speaker 1: Naqi. Naqi basically reads facial micro-emotions in order to let 2002 01:43:34,479 --> 01:43:38,519 Speaker 1: people control the computer, not exclusively, but especially if they're 2003 01:43:38,560 --> 01:43:41,080 Speaker 1: quadriplegic or whatever. Like, I thought that was really interesting, 2004 01:43:41,120 --> 01:43:43,400 Speaker 1: and it's the kind of thing... because, honestly, I might 2005 01:43:43,479 --> 01:43:46,200 Speaker 1: loop that in with... there was an AI-assisted, like, 2006 01:43:46,320 --> 01:43:49,080 Speaker 1: cane for people who were blind. There was another device 2007 01:43:49,120 --> 01:43:51,599 Speaker 1: that let you control a computer through, like, facial 2008 01:43:51,640 --> 01:43:53,920 Speaker 1: movements in your mouth.
It was like a retainer. All 2009 01:43:54,040 --> 01:43:56,320 Speaker 1: the stuff that's like, oh, these are, like... really, people 2010 01:43:56,520 --> 01:43:59,760 Speaker 1: care a lot about helping somebody regain the ability to 2011 01:44:00,120 --> 01:44:03,120 Speaker 1: utilize technology to let them reconnect to the world. That's, 2012 01:44:03,280 --> 01:44:06,160 Speaker 1: like, the opposite of replacing a child's parents with a 2013 01:44:06,240 --> 01:44:09,040 Speaker 1: toy. Ed, you're in the hot seat next. 2014 01:44:09,520 --> 01:44:09,680 Speaker 4: You know. 2015 01:44:09,800 --> 01:44:15,040 Speaker 16: The thing I loved the most was obviously the Global 2016 01:44:15,120 --> 01:44:17,960 Speaker 16: Pavilion for connecting Web3 businesses 2017 01:44:17,479 --> 01:44:19,000 Speaker 1: across crypto, blockchain. 2018 01:44:19,320 --> 01:44:25,840 Speaker 16: Oh yeah, DeFi, fintech, CBDCs, which are central bank 2019 01:44:25,920 --> 01:44:30,840 Speaker 16: digital currencies, and legal advocacy. You know, this made my 2020 01:44:31,000 --> 01:44:35,960 Speaker 16: heart flutter, because, you know what, even when you think 2021 01:44:36,040 --> 01:44:41,240 Speaker 16: they're down, crypto finds a way to squirm into your life. 2022 01:44:41,640 --> 01:44:43,960 Speaker 16: It really is the zombie of the tech world. Yeah, 2023 01:44:44,120 --> 01:44:48,719 Speaker 16: because it's dead and yet it's undead. It's constantly trying 2024 01:44:48,760 --> 01:44:49,479 Speaker 16: to crawl back. 2025 01:44:50,000 --> 01:44:53,719 Speaker 1: Somehow the fact that it's dead makes it more dangerous now. Yeah, exactly. 2026 01:44:54,640 --> 01:44:57,280 Speaker 1: It's specifically a zombie. I will try to figure out 2027 01:44:57,280 --> 01:45:00,799 Speaker 1: what's the vampire, but specifically crypto is the zombie. 2028 01:45:00,880 --> 01:45:01,080 Speaker 13: Yeah. 2029 01:45:01,120 --> 01:45:03,320 Speaker 1: Yeah.
When I first read the line, 2030 01:45:03,600 --> 01:45:06,400 Speaker 1: that is not dead which can eternal lie, and with strange 2031 01:45:06,479 --> 01:45:09,280 Speaker 1: aeons even death may die. Yeah, it would. 2032 01:45:10,680 --> 01:45:15,280 Speaker 16: When I think of, you know, the Twenty Eight Years Later 2033 01:45:15,439 --> 01:45:17,440 Speaker 16: trailer where they used that poem. 2034 01:45:19,160 --> 01:45:20,240 Speaker 1: Three six. 2035 01:45:20,600 --> 01:45:23,040 Speaker 16: You know, like, I'm just guessing where the price of 2036 01:45:23,080 --> 01:45:26,680 Speaker 16: bitcoin is gonna go. I think we're at the 2037 01:45:26,760 --> 01:45:28,640 Speaker 16: beginning of a golden age, not for us but for 2038 01:45:28,720 --> 01:45:33,400 Speaker 16: the grifters. Oh God. Next week, when our dear Golden 2039 01:45:33,400 --> 01:45:37,440 Speaker 16: Boy, or Orange Boy, gets elected... sorry, inaugurated. 2040 01:45:37,760 --> 01:45:38,800 Speaker 16: Has he already won the election? 2041 01:45:39,920 --> 01:45:40,280 Speaker 1: He did. 2042 01:45:40,640 --> 01:45:40,760 Speaker 4: Right? 2043 01:45:41,840 --> 01:45:45,160 Speaker 5: Well, that's, that's, that's debatable. I think there were some 2044 01:45:45,400 --> 01:45:50,639 Speaker 5: very curious irregularities in those days. It's not straightforward from here. 2045 01:45:50,680 --> 01:45:54,920 Speaker 1: Don't be encouraging the BlueAnon people. Let him, 2046 01:45:54,960 --> 01:45:58,200 Speaker 1: let him have it, let him have it. You're right, 2047 01:45:58,400 --> 01:46:01,479 Speaker 1: he wasn't shot at all. That was all an AI trick. Yeah, 2048 01:46:02,760 --> 01:46:04,760 Speaker 1: an AR-15 would blow your whole head off that way. 2049 01:46:05,760 --> 01:46:05,920 Speaker 5: You know. 2050 01:46:06,040 --> 01:46:07,720 Speaker 16: The thing I actually did like the most, similar to you, 2051 01:46:07,840 --> 01:46:11,040 Speaker 16: I really did like the assistive tech.
I mean the 2052 01:46:11,080 --> 01:46:14,280 Speaker 16: stuff that is for people who are disabled, not able 2053 01:46:14,360 --> 01:46:18,560 Speaker 16: bodied, experiencing either cognitive decline or, you know, the 2054 01:46:18,720 --> 01:46:20,880 Speaker 16: neurodegenerative things, or paralyzed. 2055 01:46:20,920 --> 01:46:20,960 Speaker 7: Like. 2056 01:46:21,240 --> 01:46:24,040 Speaker 16: This is actual stuff that we need a lot more 2057 01:46:24,880 --> 01:46:27,439 Speaker 16: investment in, development, and I assume maybe the scale-up 2058 01:46:27,439 --> 01:46:29,840 Speaker 16: production of it, and figure out ways that it can 2059 01:46:29,960 --> 01:46:33,120 Speaker 16: be offered to people in a variety or in a 2060 01:46:33,160 --> 01:46:34,439 Speaker 16: spectrum of use cases. 2061 01:46:34,560 --> 01:46:34,680 Speaker 5: Right. 2062 01:46:35,280 --> 01:46:38,439 Speaker 16: I think the stuff that I did not like, hmm, 2063 01:46:39,600 --> 01:46:42,120 Speaker 16: you know, I didn't really care for a lot of 2064 01:46:42,280 --> 01:46:49,360 Speaker 16: the luxury surveillance stuff, you know, the fake CGMs that, 2065 01:46:49,680 --> 01:46:53,080 Speaker 16: you know, I'll never forget this woman telling someone right 2066 01:46:53,160 --> 01:46:55,519 Speaker 16: next to me, it's a... It was a medical device 2067 01:46:55,560 --> 01:46:58,400 Speaker 16: that when I asked, she looks at me and goes, no, 2068 01:47:00,200 --> 01:47:06,120 Speaker 16: it's not a medical device. 2069 01:47:05,320 --> 01:47:07,720 Speaker 1: We had a beautiful moment where it was this like 2070 01:47:08,160 --> 01:47:10,800 Speaker 1: it was like a set of smart goggles, which there 2071 01:47:10,840 --> 01:47:12,400 Speaker 1: were a lot of, that had like night vision, but 2072 01:47:12,439 --> 01:47:15,840 Speaker 1: also it had like threat assessments. 
So the specific thing 2073 01:47:15,840 --> 01:47:18,040 Speaker 1: they bragged is like it can help a police officer 2074 01:47:18,160 --> 01:47:22,000 Speaker 1: identify if somebody has a gun. Right, And this this 2075 01:47:22,280 --> 01:47:24,799 Speaker 1: was right after we had gone to an AI security 2076 01:47:24,880 --> 01:47:26,680 Speaker 1: camera that I had flipped off with both hands and 2077 01:47:26,720 --> 01:47:30,439 Speaker 1: it had identified as a man giving the thumbs up. And 2078 01:47:31,080 --> 01:47:33,400 Speaker 1: I don't feel great about it identifying the gun. 2079 01:47:34,520 --> 01:47:37,960 Speaker 16: Luxury surveillance for health, luxury surveillance for face recognition. Also 2080 01:47:38,080 --> 01:47:40,120 Speaker 16: like they had it in litter boxes and shit, don't 2081 01:47:40,200 --> 01:47:40,439 Speaker 16: need that. 2082 01:47:40,880 --> 01:47:41,920 Speaker 10: I really don't need that. 2083 01:47:41,960 --> 01:47:44,160 Speaker 16: Why does the litter box need to be connected to the internet? 2084 01:47:44,479 --> 01:47:45,519 Speaker 16: Why does it need a camera? 2085 01:47:46,560 --> 01:47:48,639 Speaker 1: You know, that does make me think of a better world 2086 01:47:48,680 --> 01:47:51,439 Speaker 1: where we have exactly as much money and focus on AI, 2087 01:47:51,600 --> 01:47:55,560 Speaker 1: but it's all integrating it into cat-focused products, like 2088 01:47:55,720 --> 01:47:59,960 Speaker 1: fifty billion dollars being poured into cat tech. Translate whatever your 2089 01:48:00,040 --> 01:48:04,120 Speaker 1: cat is saying into French perfectly. Your cat can make 2090 01:48:04,200 --> 01:48:07,840 Speaker 1: deals with the Chinese, and by the way, we've hooked him 2091 01:48:07,920 --> 01:48:10,280 Speaker 1: up to venture capital. He has an open line to 2092 01:48:10,439 --> 01:48:12,800 Speaker 1: SoftBank. Siri, why you know that? 2093 01:48:13,040 --> 01:48:15,720 Speaker 16: I would love this translation. 
Let's help my cat make 2094 01:48:15,800 --> 01:48:19,080 Speaker 16: some deals. Help me figure out why or how he 2095 01:48:19,240 --> 01:48:22,000 Speaker 16: learned to open my door, you know, things like this, 2096 01:48:22,680 --> 01:48:24,200 Speaker 16: but what we get is AI bullshit. 2097 01:48:24,360 --> 01:48:26,160 Speaker 1: I want to see a guy dressed as Steve Jobs 2098 01:48:26,200 --> 01:48:28,439 Speaker 1: be like, ladies and gentlemen, we have finally done it. 2099 01:48:28,720 --> 01:48:31,840 Speaker 1: We have gotten across the concept of death to a cat, 2100 01:48:32,280 --> 01:48:33,559 Speaker 1: and I understand there. 2101 01:48:33,680 --> 01:48:41,400 Speaker 15: Mortalent actually that like common ad where he's talking about AI, 2102 01:48:41,520 --> 01:48:43,720 Speaker 15: but he's like, we taught proofs to a dog. 2103 01:48:44,800 --> 01:48:49,439 Speaker 1: Got turtleback on what a dog would think about. It's 2104 01:48:49,560 --> 01:48:52,480 Speaker 1: just a dog sitting at a table smoking a cigarette. 2105 01:48:54,840 --> 01:48:59,800 Speaker 1: The future. Mm hmm. Garrison, you're up. 2106 01:49:01,160 --> 01:49:05,839 Speaker 5: Best of CES, I think, was definitely the VLC media 2107 01:49:05,960 --> 01:49:09,800 Speaker 5: booth at Eureka Park, where they they had like they 2108 01:49:09,840 --> 01:49:12,200 Speaker 5: had big, big traffic cones on their head, wearing them 2109 01:49:12,280 --> 01:49:14,240 Speaker 5: like wizard hats with huge cloaks. 2110 01:49:14,280 --> 01:49:15,519 Speaker 1: They were dressed as wizards. 2111 01:49:15,800 --> 01:49:17,800 Speaker 5: They were dressed as wizards, and we walked up to 2112 01:49:17,880 --> 01:49:19,160 Speaker 5: them and they said. 
2113 01:49:19,160 --> 01:49:21,639 Speaker 1: Let's let's start VLC, folks. If you if you don't 2114 01:49:21,720 --> 01:49:23,519 Speaker 1: know this, this was especially relevant to those of us 2115 01:49:23,560 --> 01:49:26,479 Speaker 1: who pirated a lot. It's a media app that allows 2116 01:49:26,520 --> 01:49:30,120 Speaker 1: you to basically play any kind of like any video file. Audio file, yes, 2117 01:49:30,520 --> 01:49:32,920 Speaker 1: or audio file. And now it will automatically give you 2118 01:49:33,000 --> 01:49:35,840 Speaker 1: subtitles too, using local AI that's not like reaching to 2119 01:49:35,880 --> 01:49:36,800 Speaker 1: the cloud or anything to do 2120 01:49:36,880 --> 01:49:39,799 Speaker 5: it, because putting subtitles on pirated media can sometimes 2121 01:49:39,840 --> 01:49:42,040 Speaker 5: be really hard. So they said, we have something that 2122 01:49:42,160 --> 01:49:45,320 Speaker 5: analyzes the audio that's being spoken in whatever media you're watching, 2123 01:49:45,360 --> 01:49:47,280 Speaker 5: and we will put subtitles up for you. We 2124 01:49:47,439 --> 01:49:50,439 Speaker 5: walked up and we're like, so what do you have here? Like, 2125 01:49:50,680 --> 01:49:54,240 Speaker 5: we are not selling anything. We have nothing to sell you. 2126 01:49:55,080 --> 01:49:57,280 Speaker 1: In this beautiful... they're French. So it was in this 2127 01:49:57,520 --> 01:50:00,080 Speaker 1: like, yeah, yeah, wonderful accent. I'm not going to the 2128 01:50:00,760 --> 01:50:02,799 Speaker 1: degree of like, I don't give a fuck about anything 2129 01:50:02,840 --> 01:50:06,000 Speaker 1: else at this stupid goddamn show, that they gave off. 2130 01:50:06,120 --> 01:50:07,320 Speaker 1: They exuded it. And 2131 01:50:07,439 --> 01:50:10,000 Speaker 5: they're they're they're by far the coolest, because of something, 2132 01:50:10,160 --> 01:50:11,200 Speaker 5: Robert, you said to them. 
2133 01:50:11,400 --> 01:50:13,600 Speaker 1: I walked up and I was like, VLC is a 2134 01:50:13,680 --> 01:50:16,040 Speaker 1: very popular app. They just crossed six billion downloads. I've 2135 01:50:16,040 --> 01:50:18,360 Speaker 1: been using them for almost as long as you've been alive. 2136 01:50:18,400 --> 01:50:19,960 Speaker 1: And I walked up and I was like, I've been 2137 01:50:20,040 --> 01:50:23,240 Speaker 1: using your product for fifteen years in order to pirate 2138 01:50:23,360 --> 01:50:27,880 Speaker 1: media. And they said, very nonchalantly, keep going, keep keep going, 2139 01:50:29,760 --> 01:50:32,479 Speaker 1: keep doing that, keep doing that. I'm obsessed. 2140 01:50:32,560 --> 01:50:33,720 Speaker 3: Yeah, that's amazing. 2141 01:50:33,920 --> 01:50:36,240 Speaker 1: About this too, because, like, it's a good app. 2142 01:50:36,280 --> 01:50:38,000 Speaker 1: I have also used it. I saw the guy in 2143 01:50:38,080 --> 01:50:39,559 Speaker 1: the hat and I was like, oh, it's the VLC 2144 01:50:39,840 --> 01:50:41,400 Speaker 1: from, you know, from on your desktop. 2145 01:50:41,720 --> 01:50:43,920 Speaker 15: And then I was like, that's that's stupid. I don't 2146 01:50:43,920 --> 01:50:45,479 Speaker 15: need to talk to the man. He's wearing a hat 2147 01:50:45,560 --> 01:50:47,400 Speaker 15: and a cape. And I'm glad that you followed through 2148 01:50:47,520 --> 01:50:49,800 Speaker 15: as a journalist, pushed aside your instinct to be like, 2149 01:50:49,880 --> 01:50:52,360 Speaker 15: do not approach a stranger in a cape. 2150 01:50:52,920 --> 01:50:54,160 Speaker 1: Garrison does not have that instinct. 2151 01:50:55,040 --> 01:50:58,560 Speaker 5: No, no. Quite the contrary, I feel a 2152 01:50:58,640 --> 01:50:59,519 Speaker 5: magnetic attraction. 2153 01:51:01,479 --> 01:51:03,320 Speaker 1: This is why I keep an AirTag on them. 2154 01:51:04,280 --> 01:51:05,479 Speaker 1: A great way to get abducted. 
2155 01:51:07,400 --> 01:51:12,639 Speaker 5: I think similarly, obviously all of the AI stuff for kids, 2156 01:51:12,680 --> 01:51:15,000 Speaker 5: all of like the AI slop, is like obviously bad. 2157 01:51:15,040 --> 01:51:17,120 Speaker 5: We've talked about that a lot already. The other thing 2158 01:51:17,200 --> 01:51:19,120 Speaker 5: that's like kind of like the worst is, similar to 2159 01:51:19,280 --> 01:51:21,559 Speaker 5: what you said, Ed, like the level of surveillance tech. 2160 01:51:21,760 --> 01:51:23,800 Speaker 5: I tried out multiple AI systems that are supposed to 2161 01:51:23,960 --> 01:51:27,920 Speaker 5: like detect and predict behavior based on facial expressions or 2162 01:51:28,000 --> 01:51:32,120 Speaker 5: gesture, and this is really tricky. There was one at 2163 01:51:32,200 --> 01:51:35,200 Speaker 5: Eureka Park. It's a South Korean company that's powered, I 2164 01:51:35,240 --> 01:51:37,880 Speaker 5: believe, by Samsung with money, and also they've access to 2165 01:51:37,960 --> 01:51:43,240 Speaker 5: like their training data. They're called Visomatic. And specifically why 2166 01:51:43,400 --> 01:51:45,519 Speaker 5: this exists: it is a camera that you can put 2167 01:51:45,560 --> 01:51:47,439 Speaker 5: on a, put on a computer. It will it will 2168 01:51:47,520 --> 01:51:50,360 Speaker 5: detect where your face is pointing and where your eyes 2169 01:51:50,400 --> 01:51:53,360 Speaker 5: are paying attention to. And the reason why this exists 2170 01:51:53,760 --> 01:51:56,680 Speaker 5: is for online test taking. It's so people don't like 2171 01:51:56,760 --> 01:51:59,880 Speaker 5: look at their phone to like cheat, so it tracks 2172 01:52:00,040 --> 01:52:01,559 Speaker 5: where your eyes are moving, and if your eyes look 2173 01:52:01,640 --> 01:52:04,280 Speaker 5: down too much, it's gonna flag it as someone's possibly cheating. 
2174 01:52:04,400 --> 01:52:07,679 Speaker 5: So this was obviously like introduced after the pandemic. There's 2175 01:52:07,680 --> 01:52:09,960 Speaker 5: a lot a lot of online test taking. Samsung uses 2176 01:52:10,040 --> 01:52:12,400 Speaker 5: this tech themselves for any kind of like online exams 2177 01:52:12,600 --> 01:52:14,920 Speaker 5: that they as a company will put on, you know, 2178 01:52:15,080 --> 01:52:18,080 Speaker 5: whether it's like for people, students, employees. But they also 2179 01:52:18,160 --> 01:52:20,080 Speaker 5: had like other features where you could switch it. I 2180 01:52:20,120 --> 01:52:21,439 Speaker 5: assume it's doing all the same work, it just 2181 01:52:21,600 --> 01:52:24,160 Speaker 5: is placed differently on the monitor, and instead it can, 2182 01:52:24,200 --> 01:52:26,759 Speaker 5: you know, do like object detection, you know, what you're wearing, 2183 01:52:27,560 --> 01:52:30,000 Speaker 5: and the general like behavior analysis, if you seem like 2184 01:52:30,040 --> 01:52:33,400 Speaker 5: you're behaving suspiciously, which is something that we tried at 2185 01:52:33,439 --> 01:52:36,320 Speaker 5: the SK booth, which is also a South Korean company, for their 2186 01:52:36,439 --> 01:52:40,560 Speaker 5: own like like surveillance detection. But I asked Visomatic, like like, 2187 01:52:40,720 --> 01:52:42,040 Speaker 5: what kind of use cases do you see for this 2188 01:52:42,120 --> 01:52:44,880 Speaker 5: beyond test taking? You know, like, yeah, general surveillance. Like, yeah, 2189 01:52:45,800 --> 01:52:48,560 Speaker 5: we want to learn how to like predict or like 2190 01:52:48,680 --> 01:52:52,559 Speaker 5: analyze potentially suspicious human behavior, as we were walking by 2191 01:52:52,640 --> 01:52:56,280 Speaker 5: the SK version. 
One quite funny thing is, as I 2192 01:52:56,400 --> 01:52:59,400 Speaker 5: walked by, it at first identified me as a 2193 01:52:59,479 --> 01:53:03,679 Speaker 5: blonde woman holding a cup. It then changed and said 2194 01:53:03,720 --> 01:53:06,799 Speaker 5: blonde person, which I think is pretty, it's pretty deep. 2195 01:53:07,160 --> 01:53:09,360 Speaker 1: Very progressive. It's doing the opposite of a Facebook. 2196 01:53:09,479 --> 01:53:12,679 Speaker 5: Yeah, it could sense the pronouns. It's like, hmm, maybe 2197 01:53:13,320 --> 01:53:18,240 Speaker 5: maybe not, maybe not a woman, maybe not a blonde person, 2198 01:53:19,760 --> 01:53:22,320 Speaker 5: but but yes, that was, you know, something that was 2199 01:53:22,360 --> 01:53:26,120 Speaker 5: like quite well done. Specifically the Visomatic stuff, like, very functional. 2200 01:53:26,439 --> 01:53:28,160 Speaker 5: It could tell when I was looking at the screen, 2201 01:53:28,360 --> 01:53:30,240 Speaker 5: when I was looking at my phone. It could tell 2202 01:53:30,360 --> 01:53:32,960 Speaker 5: from like various various angles, like what I was holding, 2203 01:53:33,040 --> 01:53:35,599 Speaker 5: what I was looking at, where my attention was being directed. 2204 01:53:35,960 --> 01:53:36,240 Speaker 14: Like it was. 2205 01:53:36,320 --> 01:53:38,720 Speaker 5: It was very well done. It was very accurate, but, 2206 01:53:38,800 --> 01:53:39,840 Speaker 5: you know, possibly scary. 2207 01:53:40,720 --> 01:53:43,920 Speaker 1: Well, speaking of possibly scary, the sponsors of this podcast 2208 01:53:44,600 --> 01:53:46,400 Speaker 1: don't know who they are, could be the Washington State 2209 01:53:46,479 --> 01:53:49,160 Speaker 1: Highway Patrol again, in which case, thank you boys for 2210 01:53:49,280 --> 01:53:53,639 Speaker 1: your noble service on our nation's roads. 
I'm not saying 2211 01:53:53,720 --> 01:53:56,160 Speaker 1: that because I got pulled over the other week and 2212 01:53:56,200 --> 01:53:58,240 Speaker 1: I'm really trying to fight a case right now. I 2213 01:53:58,240 --> 01:54:13,439 Speaker 1: would never do that. Anyway. Thanks guys, and we're back. 2214 01:54:15,720 --> 01:54:16,320 Speaker 11: Is it my turn? 2215 01:54:16,479 --> 01:54:17,120 Speaker 1: It is your turn. 2216 01:54:17,600 --> 01:54:20,360 Speaker 5: Okay. So I'm going to introduce our special white woman 2217 01:54:20,520 --> 01:54:24,840 Speaker 5: correspondent to give us some exciting breaking news in the 2218 01:54:25,000 --> 01:54:27,040 Speaker 5: white woman tech development world. 2219 01:54:27,920 --> 01:54:31,400 Speaker 14: Okay, so the first one is positive. For context, I'm 2220 01:54:31,439 --> 01:54:34,840 Speaker 14: a trans woman. And one of the booths that was 2221 01:54:34,920 --> 01:54:39,480 Speaker 14: pretty interesting was this group. Were they French, Gare, 2222 01:54:39,520 --> 01:54:43,480 Speaker 14: you remember? I, I, you know, they're they're they're European. 2223 01:54:43,640 --> 01:54:46,120 Speaker 14: They're called Eli, Eli Health. 2224 01:54:46,200 --> 01:54:47,080 Speaker 10: It's E L I. 2225 01:54:47,680 --> 01:54:52,840 Speaker 14: Anyways, this is an at-home hormone tester. So it 2226 01:54:53,000 --> 01:54:57,400 Speaker 14: is saliva-based. It's like a little disposable package. Currently 2227 01:54:57,520 --> 01:55:01,440 Speaker 14: they only advertise cortisol and progesterone, but they have 2228 01:55:01,640 --> 01:55:05,720 Speaker 14: plans for estradiol and other hormones, testosterone as well, 2229 01:55:05,800 --> 01:55:11,600 Speaker 14: testosterone as well. 
Sorry. And yeah, so you swab your 2230 01:55:11,640 --> 01:55:15,360 Speaker 14: mouth in the morning or evening, and then you wait, 2231 01:55:15,600 --> 01:55:18,480 Speaker 14: what was it, like, fifteen minutes, twenty minutes, and then 2232 01:55:18,600 --> 01:55:23,040 Speaker 14: you scan this little like QR thing on the device, 2233 01:55:23,560 --> 01:55:27,880 Speaker 14: and your phone calculates what your levels are. And this 2234 01:55:28,040 --> 01:55:33,400 Speaker 14: has very interesting implications for like the DIY hormone market 2235 01:55:33,680 --> 01:55:37,880 Speaker 14: or use case. I started DIY and did my own 2236 01:55:38,040 --> 01:55:41,040 Speaker 14: like blood tests, but a lot of like trans kids 2237 01:55:41,080 --> 01:55:42,280 Speaker 14: don't have access to that. 2238 01:55:42,560 --> 01:55:47,480 Speaker 3: So this is a, it's a good idea if it's actually 2239 01:55:47,200 --> 01:55:50,480 Speaker 14: effective. Like, we don't have hands-on yet, we haven't 2240 01:55:50,520 --> 01:55:52,360 Speaker 14: tested it yet, but I would love to do a 2241 01:55:52,440 --> 01:55:55,440 Speaker 14: comparison of like testing my own levels and then trying 2242 01:55:55,560 --> 01:55:57,839 Speaker 14: this. Very interesting, very intriguing. 2243 01:55:58,040 --> 01:56:00,320 Speaker 5: Yeah, we will certainly, as soon as possible. How's this 2244 01:56:00,760 --> 01:56:03,840 Speaker 5: compared to the regular like mail-in blood tests, which 2245 01:56:03,920 --> 01:56:05,520 Speaker 5: is like the current way to do it, but that 2246 01:56:05,680 --> 01:56:08,960 Speaker 5: requires shipping your blood to a laboratory, and that's maybe 2247 01:56:09,040 --> 01:56:12,600 Speaker 5: not always the best or even like convenient. 
So being 2248 01:56:12,600 --> 01:56:14,720 Speaker 5: able to test this just at home, without shipping any 2249 01:56:14,760 --> 01:56:18,240 Speaker 5: of your DNA to some random laboratory, would be really, 2250 01:56:18,520 --> 01:56:19,040 Speaker 5: really cool. 2251 01:56:19,520 --> 01:56:25,560 Speaker 14: Right. There's no insurance involved. This is completely, supposedly, closed source. 2252 01:56:25,800 --> 01:56:28,040 Speaker 1: From what y'all were telling me earlier today, when you 2253 01:56:28,320 --> 01:56:30,880 Speaker 1: explained this to me, it sounded kind of like the 2254 01:56:30,960 --> 01:56:34,480 Speaker 1: people making this have an understanding of the dangers inherent, 2255 01:56:34,600 --> 01:56:37,000 Speaker 1: particularly to the trans community, and why they might want 2256 01:56:37,040 --> 01:56:39,840 Speaker 1: to use this, and a focus on privacy for that reason. 2257 01:56:40,160 --> 01:56:43,520 Speaker 1: I didn't press them on that because, I don't know, 2258 01:56:43,680 --> 01:56:47,920 Speaker 1: I follow CES, you know, wide variety of... Yeah. 2259 01:56:48,040 --> 01:56:50,440 Speaker 5: No, we tried to extract as much intel as possible 2260 01:56:50,720 --> 01:56:53,600 Speaker 5: about kind of what their future plans are, but not 2261 01:56:53,720 --> 01:56:55,520 Speaker 5: specifically like on that level. 2262 01:56:55,520 --> 01:56:58,400 Speaker 1: But privacy, like, they seemed like they had a reasonably 2263 01:56:58,400 --> 01:56:59,400 Speaker 1: good understanding. 2264 01:56:59,080 --> 01:57:00,840 Speaker 5: Of course, it's because it's because it is your own 2265 01:57:00,880 --> 01:57:03,520 Speaker 5: like DNA and hormones, you know. Like, I do not 2266 01:57:03,680 --> 01:57:05,720 Speaker 5: know if this company is even thinking about trans people, if 2267 01:57:05,760 --> 01:57:08,680 Speaker 5: it is trans friendly, but it could be used by 2268 01:57:08,800 --> 01:57:10,040 Speaker 5: trans people regardless. 
2269 01:57:10,480 --> 01:57:14,360 Speaker 1: Yeah, much like a Glock. Exactly, exactly. 2270 01:57:15,120 --> 01:57:16,280 Speaker 14: The potential is great. 2271 01:57:16,920 --> 01:57:20,240 Speaker 11: And then probably my least favorite goofs. 2272 01:57:20,480 --> 01:57:23,240 Speaker 14: I have to call out some other white women, my 2273 01:57:23,600 --> 01:57:27,840 Speaker 14: so-called boho white women. It's Evject. 2274 01:57:28,080 --> 01:57:32,840 Speaker 1: And is that spelled E V J E C T? Okay. 2275 01:57:33,320 --> 01:57:34,320 Speaker 4: And what this is? 2276 01:57:34,360 --> 01:57:34,760 Speaker 1: Oh god. 2277 01:57:34,880 --> 01:57:39,960 Speaker 14: Yes, this is a special plug for the charging port 2278 01:57:40,080 --> 01:57:45,520 Speaker 14: of your EV. So the idea is a nefarious party 2279 01:57:45,880 --> 01:57:49,320 Speaker 14: sees you and your fancy EV and approaches you, and 2280 01:57:49,440 --> 01:57:53,400 Speaker 14: you need a quick getaway. Their words, their words, their words, 2281 01:57:53,520 --> 01:57:56,200 Speaker 14: by the way. Like, they see your fancy car and 2282 01:57:56,320 --> 01:58:02,520 Speaker 14: you're targeted. So this device will like eject, you 2283 01:58:02,600 --> 01:58:05,839 Speaker 14: can just drive away from the charger, the power cord. 2284 01:58:07,240 --> 01:58:11,720 Speaker 1: Leaving broken pieces of plastic in both? Yes, in the charger. 2285 01:58:12,160 --> 01:58:15,920 Speaker 1: Like, this is not reusable, single use, one time use. 2286 01:58:16,240 --> 01:58:16,440 Speaker 3: Yes. 2287 01:58:17,200 --> 01:58:21,200 Speaker 14: So yeah, all those all those people targeting so-called 2288 01:58:21,360 --> 01:58:21,880 Speaker 14: white women. 2289 01:58:23,360 --> 01:58:25,720 Speaker 15: This is, finally someone is serving the community of people 2290 01:58:25,760 --> 01:58:27,320 Speaker 15: that think that if you find a zip 
2291 01:58:27,200 --> 01:58:31,320 Speaker 5: tie on your car door handle, it's MS-13. That was 2292 01:58:31,360 --> 01:58:33,120 Speaker 5: the first thing I said. As soon as soon as 2293 01:58:33,160 --> 01:58:35,280 Speaker 5: we walked away, I was like, this product wins the 2294 01:58:35,400 --> 01:58:39,080 Speaker 5: Cool Zone Media Award for the most white woman product. It's 2295 01:58:39,080 --> 01:58:41,720 Speaker 5: specifically, right, like if you see a slice of cheese 2296 01:58:41,840 --> 01:58:46,040 Speaker 5: on your windshield, you're already targeted. Anyway, this is, this 2297 01:58:46,200 --> 01:58:48,600 Speaker 5: is that exact demographic of people who think they're going 2298 01:58:48,640 --> 01:58:51,960 Speaker 5: to get trafficked in like your local like Olive Garden 2299 01:58:52,280 --> 01:58:53,120 Speaker 5: parking lots. 2300 01:58:53,160 --> 01:58:59,320 Speaker 1: Gangs stalk Americans at the Tesla charging station in Brentwood, California, 2301 01:58:59,600 --> 01:59:03,600 Speaker 1: where the average income's like in the eight figures. I 2302 01:59:03,680 --> 01:59:05,800 Speaker 1: gotta say, though, you're being very unfair to them. It 2303 01:59:05,920 --> 01:59:08,080 Speaker 1: was so nice of them to put down the phone 2304 01:59:08,080 --> 01:59:10,120 Speaker 1: where they were doom scrolling TikTok, to look at all 2305 01:59:10,160 --> 01:59:12,240 Speaker 1: of the different reasons their kids are going to be abducted. 2306 01:59:12,280 --> 01:59:15,200 Speaker 15: While we're talking about this product, you know, they're like, 2307 01:59:15,400 --> 01:59:19,080 Speaker 15: finally someone's gonna do something about it. Create a disposable 2308 01:59:19,120 --> 01:59:19,880 Speaker 15: piece of plastic. 2309 01:59:20,080 --> 01:59:22,480 Speaker 16: You notice that guy is always sitting down at the 2310 01:59:22,640 --> 01:59:26,360 Speaker 16: gym and the coffee shop and the gas station. 
2311 01:59:27,040 --> 01:59:27,880 Speaker 1: This is what he doesn't get. 2312 01:59:28,240 --> 01:59:28,680 Speaker 3: Be careful. 2313 01:59:28,800 --> 01:59:30,480 Speaker 1: I feel like I could have upsold them, and like, 2314 01:59:30,520 --> 01:59:32,360 Speaker 1: what if we put some explosives in this, you know, 2315 01:59:32,520 --> 01:59:34,520 Speaker 1: really like keep them off of the 2316 01:59:34,600 --> 01:59:39,680 Speaker 16: car, like, blow, create a diversion inside of it, called 2317 01:59:39,800 --> 01:59:42,440 Speaker 16: the police station and took a picture of him. 2318 01:59:42,400 --> 01:59:46,200 Speaker 1: And someone scared a lady driving a Vibe, that's one 2319 01:59:46,240 --> 01:59:47,160 Speaker 1: of the electric 2320 01:59:46,880 --> 01:59:51,080 Speaker 5: cars, right. It's like a crocodile tail as as as 2321 01:59:51,160 --> 01:59:55,839 Speaker 5: it whips around, immobilizing anyone in the vicinity. 2322 01:59:56,000 --> 01:59:58,840 Speaker 1: We're calling it the iguana, and it does it with enough 2323 01:59:58,920 --> 02:00:04,120 Speaker 1: force to break a grown man. Yeah, yes. Okay, David Roth. 2324 02:00:04,600 --> 02:00:07,400 Speaker 15: So there's a lot of, I guess I gathered less 2325 02:00:07,600 --> 02:00:09,480 Speaker 15: than in years past, that this was at one point 2326 02:00:09,560 --> 02:00:11,200 Speaker 15: like basically a car show. There is not a lot 2327 02:00:11,240 --> 02:00:14,200 Speaker 15: of transit stuff this time around. I didn't get to 2328 02:00:14,200 --> 02:00:16,600 Speaker 15: see very much of it, but I did have, I 2329 02:00:16,680 --> 02:00:18,879 Speaker 15: guess this is both my best and my worst experience, 2330 02:00:18,960 --> 02:00:21,880 Speaker 15: the most powerful transit experience of my life. 2331 02:00:22,200 --> 02:00:23,360 Speaker 1: So I live in New York City. 
I take the 2332 02:00:23,400 --> 02:00:26,000 Speaker 1: subway pretty much everywhere I go, and you know, it 2333 02:00:26,080 --> 02:00:28,240 Speaker 1: has its ups and downs. For the most part, it's good. 2334 02:00:28,280 --> 02:00:30,560 Speaker 15: It moves like twelve hundred people through a tunnel at 2335 02:00:30,720 --> 02:00:33,120 Speaker 15: thirty-odd miles an hour, and for the most part, 2336 02:00:33,120 --> 02:00:35,520 Speaker 15: everybody leaves everybody else alone, or, you know, watches videos 2337 02:00:35,520 --> 02:00:38,080 Speaker 15: on their phone and stuff. But I knew that there 2338 02:00:38,160 --> 02:00:41,120 Speaker 15: had to be a better way, and at the Las 2339 02:00:41,240 --> 02:00:44,400 Speaker 15: Vegas Convention Center, I got to experience it. 2340 02:00:44,880 --> 02:00:47,360 Speaker 1: You're familiar with Elon Musk, serial entrepreneur? 2341 02:00:48,080 --> 02:00:50,720 Speaker 15: Yeah. So he invented something called the Hyperloop, which 2342 02:00:50,800 --> 02:00:52,400 Speaker 15: is a car that goes through a tunnel that's the 2343 02:00:52,520 --> 02:00:55,440 Speaker 15: exact same size as the car, at eleven miles an hour, 2344 02:00:56,120 --> 02:00:58,760 Speaker 15: and it takes, there's, someone has to drive it, and 2345 02:00:58,840 --> 02:01:01,400 Speaker 15: also someone has to help you into the car. But 2346 02:01:01,480 --> 02:01:04,040 Speaker 15: you can fit up to three additional people into the car, 2347 02:01:04,200 --> 02:01:07,480 Speaker 15: so that ratio, everyone, I know, yes, right. So yeah, 2348 02:01:07,800 --> 02:01:12,480 Speaker 15: you got two people moving three people two hundred yards 2349 02:01:13,200 --> 02:01:16,040 Speaker 15: at the speed of like a brisk walk. Now, David, 2350 02:01:16,040 --> 02:01:19,400 Speaker 15: this kind of technology wasn't possible just a few decades ago, right? Exactly. 2351 02:01:19,440 --> 02:01:21,360 Speaker 1: I mean, this was the sort of thing. 
Though there 2352 02:01:21,440 --> 02:01:25,760 Speaker 1: had been tunnels, they were mostly used by animals, voles, miners, yes, right, 2353 02:01:25,800 --> 02:01:28,280 Speaker 1: and thought yeah, and that was mostly for pirates in 2354 02:01:28,320 --> 02:01:30,560 Speaker 1: at least one movie I saw recently. Yes, but no 2355 02:01:30,600 --> 02:01:32,680 Speaker 1: one had thought about it as a transit sort of thing. 2356 02:01:32,760 --> 02:01:34,320 Speaker 15: It was more of like a place where you 2357 02:01:34,360 --> 02:01:38,240 Speaker 15: would go if you needed to get copper. And of course, 2358 02:01:38,720 --> 02:01:41,000 Speaker 15: but in this case, so this is like where it's 2359 02:01:41,040 --> 02:01:43,560 Speaker 15: good to have, and this, I guess every CES is 2360 02:01:43,600 --> 02:01:46,480 Speaker 15: like this, this was my first, to be reminded that 2361 02:01:46,760 --> 02:01:49,800 Speaker 15: there are visionaries out there who are like, what if 2362 02:01:49,840 --> 02:01:53,080 Speaker 15: you put car through hole? What if, instead of a 2363 02:01:53,160 --> 02:01:56,640 Speaker 15: thing that moves multiple people at once, you had a 2364 02:01:56,680 --> 02:01:59,320 Speaker 15: thing that took exactly the same number of people to 2365 02:01:59,480 --> 02:02:02,920 Speaker 15: move a number of people slightly more than that? Yes. Yeah, 2366 02:02:03,440 --> 02:02:05,160 Speaker 15: so that was cool. I mean, it's just like fun 2367 02:02:05,240 --> 02:02:06,920 Speaker 15: to see like where this stuff is going. And I 2368 02:02:07,000 --> 02:02:09,640 Speaker 15: really wonder if we're not going to start seeing things 2369 02:02:09,720 --> 02:02:13,320 Speaker 15: like cars on the streets of American cities, you know, 2370 02:02:13,520 --> 02:02:18,920 Speaker 15: like it could be okay, David. I mean, most of 2371 02:02:18,960 --> 02:02:20,720 Speaker 15: the, obviously this is something we'd agree on. 
Last year 2372 02:02:20,800 --> 02:02:22,280 Speaker 15: was like three or four good things. You all said them. 2373 02:02:22,320 --> 02:02:25,600 Speaker 15: I thought the accessibility tech stuff was the the stuff 2374 02:02:25,640 --> 02:02:28,720 Speaker 15: that made me feel good about what was going on here, 2375 02:02:28,760 --> 02:02:30,200 Speaker 15: and there was a great deal of stuff that made 2376 02:02:30,200 --> 02:02:32,200 Speaker 15: me feel like pretty bad about what was going on. Yeah, 2377 02:02:32,360 --> 02:02:35,400 Speaker 15: up to and including like the surveillance stuff, beyond the, 2378 02:02:35,800 --> 02:02:40,680 Speaker 15: you know, like advanced Samsung-powered snitch tech, so that nobody, 2379 02:02:40,800 --> 02:02:43,600 Speaker 15: whatever, your boss can tell if you're really looking 2380 02:02:43,680 --> 02:02:46,120 Speaker 15: at the Zoom that you're on. I don't really love 2381 02:02:46,200 --> 02:02:48,680 Speaker 15: that personally. But for me, a lot of the 2382 02:02:49,120 --> 02:02:51,680 Speaker 15: smart home stuff is a real drag, like just in the 2383 02:02:51,760 --> 02:02:54,400 Speaker 15: sense that it clearly, first of all, beyond being like 2384 02:02:54,440 --> 02:02:58,800 Speaker 15: sort of unnecessary, there's a level of just willingly giving 2385 02:02:58,880 --> 02:03:03,400 Speaker 15: over your agency over the small moments that make, you 2386 02:03:03,480 --> 02:03:05,440 Speaker 15: know, human life human life, and just being like, I 2387 02:03:05,480 --> 02:03:08,040 Speaker 15: would really love it if just like an artificial intelligence 2388 02:03:08,080 --> 02:03:10,839 Speaker 15: could pick my pants out for the day. I'll simply 2389 02:03:10,920 --> 02:03:14,000 Speaker 15: stand here waiting for that to happen. Yeah, just fucking 2390 02:03:14,080 --> 02:03:17,400 Speaker 15: grim, actually. Like, didn't really care for it. 
I feel 2391 02:03:17,400 --> 02:03:20,120 Speaker 15: like you gotta like, what are you using that time 2392 02:03:20,200 --> 02:03:20,440 Speaker 15: to do? 2393 02:03:21,520 --> 02:03:24,200 Speaker 1: Yeah? What are you getting? What are you optimizing from 2394 02:03:24,280 --> 02:03:27,200 Speaker 1: yourself by not having like pieces of like the thing 2395 02:03:27,320 --> 02:03:29,360 Speaker 1: that a human being does, which is like pick your 2396 02:03:29,440 --> 02:03:32,720 Speaker 1: clothes right? Yeah, I wonder how you feel about this 2397 02:03:32,760 --> 02:03:34,360 Speaker 1: because you and I have been going to CES for 2398 02:03:34,400 --> 02:03:37,000 Speaker 1: and I guess a broadly similar number of years. Like 2399 02:03:37,080 --> 02:03:38,880 Speaker 1: I've never been to CES. Oh really this is your No, 2400 02:03:39,000 --> 02:03:40,880 Speaker 1: I'm a fucking sports writer man, Like, this is a 2401 02:03:40,920 --> 02:03:43,640 Speaker 1: lot of out here because Ed got me a folding head. 2402 02:03:43,800 --> 02:03:46,360 Speaker 1: I have like the dead-eyed CES veterans. Oh yeah, well 2403 02:03:46,400 --> 02:03:47,520 Speaker 1: I'm very tired. Yeah. 2404 02:03:47,760 --> 02:03:49,680 Speaker 15: This is the thing with like I think, as far 2405 02:03:49,720 --> 02:03:51,640 Speaker 15: as I can tell, it seems like it's a loop 2406 02:03:52,200 --> 02:03:54,400 Speaker 15: where you more or less like you start out it's 2407 02:03:54,440 --> 02:03:56,520 Speaker 15: too much, you get big-eyed right away, and then 2408 02:03:56,560 --> 02:03:58,880 Speaker 15: you just sort of feel zombified.
But then we have 2409 02:03:59,000 --> 02:04:01,360 Speaker 15: talked to people over the last few days that are like, 2410 02:04:01,400 --> 02:04:04,120 Speaker 15: you know, I remember like fourteen CESes ago that was 2411 02:04:04,200 --> 02:04:08,080 Speaker 15: pretty good, like and they're also tired and also deranged 2412 02:04:08,080 --> 02:04:08,520 Speaker 15: by this point. 2413 02:04:08,600 --> 02:04:10,920 Speaker 1: Yeah, the first time someone showed me a tablet computer, 2414 02:04:11,040 --> 02:04:13,600 Speaker 1: I was like, oh man, science has given me everything 2415 02:04:13,680 --> 02:04:15,960 Speaker 1: I want, like, and I guess it's I don't know. 2416 02:04:16,200 --> 02:04:18,720 Speaker 1: Do you remember like when the last one was that 2417 02:04:18,880 --> 02:04:25,680 Speaker 1: you felt like even sort of that stirring. Yeah, twenty 2418 02:04:25,880 --> 02:04:29,080 Speaker 1: like eleven or twelve when they did. I got to 2419 02:04:29,120 --> 02:04:31,640 Speaker 1: see inductive charging of a car for the first time, 2420 02:04:31,680 --> 02:04:34,080 Speaker 1: and it like was so big. The Las Vegas Convention 2421 02:04:34,240 --> 02:04:37,560 Speaker 1: Center is like the size of a city, and 2422 02:04:37,720 --> 02:04:40,760 Speaker 1: seeing like the lights in that whole convention center dim 2423 02:04:40,960 --> 02:04:45,960 Speaker 1: as they were doing it was very inefficient because our 2424 02:04:46,080 --> 02:04:48,240 Speaker 1: wild Yeah, but that was just like that was like, 2425 02:04:48,400 --> 02:04:51,080 Speaker 1: oh wow, this is kind of like amazing that this 2426 02:04:51,240 --> 02:04:55,360 Speaker 1: is even possible. But yeah, not really since, not really since. 2427 02:04:56,200 --> 02:04:57,840 Speaker 1: That's why I'm really glad that there's lights in the 2428 02:04:57,920 --> 02:05:02,280 Speaker 1: Hyperloop tunnel.
Yeah, otherwise it'd be unless something goes wrong, 2429 02:05:02,320 --> 02:05:04,520 Speaker 1: would it started to seem kind of grim? Otherwise? Well, 2430 02:05:04,560 --> 02:05:07,440 Speaker 1: I the smart home stuff is interesting because that has 2431 02:05:07,480 --> 02:05:09,080 Speaker 1: been as long as I've been going to these they've 2432 02:05:09,080 --> 02:05:12,560 Speaker 1: been trying to sell people on smart homes, and I 2433 02:05:12,680 --> 02:05:15,640 Speaker 1: don't think I've ever gotten a good idea of what 2434 02:05:15,800 --> 02:05:18,440 Speaker 1: a smart home is that I think a person would want. 2435 02:05:18,680 --> 02:05:20,960 Speaker 1: I can think of two things a person would want right. 2436 02:05:21,280 --> 02:05:23,360 Speaker 1: One of them is, it would be nice if, like 2437 02:05:23,440 --> 02:05:25,480 Speaker 1: I didn't have to think about playing music. I could 2438 02:05:25,560 --> 02:05:27,880 Speaker 1: just like tell my house to play the music I wanted, 2439 02:05:27,880 --> 02:05:29,720 Speaker 1: and it would play the music and I could hear 2440 02:05:29,760 --> 02:05:31,280 Speaker 1: it everywhere and I didn't have to fuss with a 2441 02:05:31,320 --> 02:05:33,720 Speaker 1: bunch of shit. And the second is, what if I'm 2442 02:05:33,720 --> 02:05:35,960 Speaker 1: coming home from vacation and my house is cold, it 2443 02:05:36,000 --> 02:05:37,720 Speaker 1: would be nice to turn on the heat or like 2444 02:05:37,760 --> 02:05:40,960 Speaker 1: an hour before I get home. And one of those 2445 02:05:41,040 --> 02:05:43,400 Speaker 1: things you'd use every day, and one of those things 2446 02:05:43,440 --> 02:05:45,360 Speaker 1: it's not really viable to base a business off of. 
2447 02:05:45,480 --> 02:05:47,880 Speaker 1: But like, they keep trying to find new ways to 2448 02:05:47,960 --> 02:05:50,680 Speaker 1: stick computers in my house, and I don't know, does 2449 02:05:50,720 --> 02:05:52,440 Speaker 1: anyone else have anything they want out of a fucking 2450 02:05:52,520 --> 02:05:53,040 Speaker 1: smart home? 2451 02:05:53,320 --> 02:05:53,360 Speaker 5: No? 2452 02:05:53,560 --> 02:05:55,880 Speaker 15: I mean, I like, it's not an accident that my 2453 02:05:56,040 --> 02:05:58,440 Speaker 15: apartment is basically going to be in the year two 2454 02:05:58,440 --> 02:06:01,280 Speaker 15: thousand and five forever. I mean, it's expensive to do all 2455 02:06:01,320 --> 02:06:03,240 Speaker 15: this stuff. This is the bit that hits with so many 2456 02:06:03,280 --> 02:06:06,120 Speaker 15: of these demos, like just you start to notice how 2457 02:06:06,600 --> 02:06:09,840 Speaker 15: incredibly grandiose the residences in which all of this stuff 2458 02:06:09,920 --> 02:06:12,760 Speaker 15: is being sort of like postulated as being useful. 2459 02:06:12,840 --> 02:06:17,160 Speaker 15: It's like the Lexus December to Remember sales event 2460 02:06:17,280 --> 02:06:18,240 Speaker 15: type energy, just a. 2461 02:06:18,280 --> 02:06:19,920 Speaker 1: Big fuck, what lives do you live? 2462 02:06:20,040 --> 02:06:20,240 Speaker 16: Yeah? 2463 02:06:20,400 --> 02:06:23,360 Speaker 15: This also we've talked about this on Ed's show that 2464 02:06:23,520 --> 02:06:25,160 Speaker 15: like there's a lot of stuff here that feels like 2465 02:06:25,440 --> 02:06:28,000 Speaker 15: like the first fifteen minutes of a George Romero movie, 2466 02:06:28,200 --> 02:06:30,720 Speaker 15: like just getting you set for Eventually there's going to 2467 02:06:30,720 --> 02:06:32,880 Speaker 15: be a lot of like, you know, disembowelings and hideous 2468 02:06:32,880 --> 02:06:36,240 Speaker 15: shambling zombies.
Yeah, and smart home not a bad horror 2469 02:06:36,320 --> 02:06:39,640 Speaker 15: movie concept. I don't think it's a great consumer concept. 2470 02:06:39,800 --> 02:06:44,160 Speaker 1: Yeah, speaking of great consumer concepts, the ads for this podcast. 2471 02:06:54,280 --> 02:06:56,120 Speaker 1: All right, we're back, and I want to close this 2472 02:06:56,240 --> 02:06:58,960 Speaker 1: out by asking everybody a question, which is, how do 2473 02:06:59,000 --> 02:07:00,960 Speaker 1: you feel about where tech is going? 2474 02:07:02,600 --> 02:07:05,240 Speaker 16: I think we're going to hell. I think we are 2475 02:07:05,440 --> 02:07:09,839 Speaker 16: getting wrapped up very fast into the sweet abyss. 2476 02:07:10,640 --> 02:07:14,120 Speaker 16: I'm worried about the fact that so much of the 2477 02:07:14,280 --> 02:07:19,600 Speaker 16: tech is oriented around surveillance, around precursor forms of prepping, 2478 02:07:20,440 --> 02:07:26,480 Speaker 16: around very soft forms of like perfection and optimization that 2479 02:07:26,800 --> 02:07:30,360 Speaker 16: rhyme with eugenics. I'm like, I don't like the direction 2480 02:07:30,480 --> 02:07:33,960 Speaker 16: that a lot of this stuff is going, but also 2481 02:07:34,040 --> 02:07:35,840 Speaker 16: there I don't know what to do about it because 2482 02:07:35,840 --> 02:07:39,240 Speaker 16: so much of it is driven by private interest, right, 2483 02:07:39,320 --> 02:07:44,960 Speaker 16: It's like venture capitalists, well capitalized individuals and firms that 2484 02:07:45,040 --> 02:07:47,760 Speaker 16: they're connected to decide what we get to get pushed 2485 02:07:48,040 --> 02:07:49,280 Speaker 16: and these corporations, you.
2486 02:07:49,320 --> 02:07:52,040 Speaker 1: Know, yeah, the nature of like you can you can 2487 02:07:52,080 --> 02:07:54,920 Speaker 1: really tell that a lot of like the health products 2488 02:07:54,960 --> 02:07:58,720 Speaker 1: are very optimized for like rich tech executives. Like there's 2489 02:07:58,720 --> 02:08:00,560 Speaker 1: a lot of sleep products that rely on you 2490 02:08:00,680 --> 02:08:03,839 Speaker 1: being willing to like bathe yourself in speakers, playing binaural 2491 02:08:03,920 --> 02:08:07,040 Speaker 1: beats while you slept, and like different devices measure 2492 02:08:07,080 --> 02:08:09,080 Speaker 1: you, like do an ECG, and it's like I don't know, 2493 02:08:09,640 --> 02:08:11,280 Speaker 1: my aunt's not going to do that. 2494 02:08:11,560 --> 02:08:13,560 Speaker 16: Oh yeah, you know, like I was, you know, I was. 2495 02:08:13,880 --> 02:08:15,160 Speaker 11: I talked with my partner about this. 2496 02:08:15,200 --> 02:08:17,200 Speaker 16: They have type one diabetes, they have a CGM, they 2497 02:08:17,360 --> 02:08:19,480 Speaker 16: use it constantly, and they're We've been talking about and 2498 02:08:19,560 --> 02:08:22,120 Speaker 16: thinking about writing about how there's been a crop of 2499 02:08:22,280 --> 02:08:24,560 Speaker 16: devices that are like trying to push onto this idea 2500 02:08:24,640 --> 02:08:28,160 Speaker 16: that you need to have close monitoring of it to 2501 02:08:28,640 --> 02:08:31,400 Speaker 16: preempt if you are going to be prediabetic, or to 2502 02:08:31,480 --> 02:08:33,760 Speaker 16: optimize what you're eating throughout the day, but that you know, 2503 02:08:34,280 --> 02:08:36,640 Speaker 16: when you actually dig into what they're doing, it's like 2504 02:08:37,280 --> 02:08:40,320 Speaker 16: part of this track of rhetoric where it's like, wow, 2505 02:08:40,480 --> 02:08:42,800 Speaker 16: you know, if your sugar slightly goes out, it's because 2506
02:08:42,800 --> 02:08:44,600 Speaker 16: you're being a bad person. It's because you're eating the 2507 02:08:44,680 --> 02:08:46,920 Speaker 16: way that you shouldn't. It's because there's a moral failing 2508 02:08:47,040 --> 02:08:49,640 Speaker 16: or character failing there that this tech can help purify 2509 02:08:49,760 --> 02:08:51,440 Speaker 16: you of and you can be your best self, which 2510 02:08:51,480 --> 02:08:52,960 Speaker 16: is really just like not. 2511 02:08:53,640 --> 02:08:54,320 Speaker 1: Large, you know. 2512 02:08:54,560 --> 02:08:59,280 Speaker 16: And I feel that sort of rhetoric lurking behind a 2513 02:08:59,360 --> 02:09:02,920 Speaker 16: lot of the biometric surveillance stuff, even though there 2514 02:09:02,960 --> 02:09:04,080 Speaker 16: are applications that are not that. 2515 02:09:04,400 --> 02:09:06,880 Speaker 1: Yeah, you know, it's kind of focused on like the sin, 2516 02:09:07,080 --> 02:09:09,400 Speaker 1: the health sins that you're committing. We spent a decent 2517 02:09:09,440 --> 02:09:11,440 Speaker 1: amount of this week hanging out with a Catholic priest, 2518 02:09:11,520 --> 02:09:13,960 Speaker 1: and I do feel like several tech companies were the 2519 02:09:14,000 --> 02:09:18,720 Speaker 1: ones trying to sell us indulgences, right, yeah, yeah, all 2520 02:09:18,800 --> 02:09:19,200 Speaker 1: right, Gare. 2521 02:09:20,080 --> 02:09:23,200 Speaker 5: There's small improvements for consumer tech. 2522 02:09:23,360 --> 02:09:23,480 Speaker 4: Right. 2523 02:09:23,600 --> 02:09:25,840 Speaker 5: This is a very consumer based where it's supposed to 2524 02:09:25,880 --> 02:09:28,920 Speaker 5: be a consumer based tech show. There's products like the 2525 02:09:29,000 --> 02:09:32,520 Speaker 5: Shokz headphones, which every year get a little bit better.
2526 02:09:32,960 --> 02:09:36,040 Speaker 5: I tried their bone conducting headphones last year, which are 2527 02:09:36,480 --> 02:09:38,120 Speaker 5: very good. They work underwater. 2528 02:09:38,360 --> 02:09:40,160 Speaker 1: If you're deaf in an ear you can listen to 2529 02:09:40,240 --> 02:09:41,960 Speaker 1: your music the way you used to be able to. 2530 02:09:42,080 --> 02:09:44,880 Speaker 5: Yeah, very cool stuff. This year they have what they 2531 02:09:44,960 --> 02:09:47,960 Speaker 5: called air conductive. I don't quite know how it works, 2532 02:09:48,320 --> 02:09:50,240 Speaker 5: but it does work. You can't hear it if you're 2533 02:09:50,240 --> 02:09:52,600 Speaker 5: standing like two, three feet away. There's no sound bleed, 2534 02:09:52,880 --> 02:09:54,920 Speaker 5: but I hear music in the middle of my head 2535 02:09:55,280 --> 02:09:57,520 Speaker 5: despite having to not put an earbud actually like in 2536 02:09:57,680 --> 02:10:02,520 Speaker 5: my ear. They're super useful, work great, really good sound quality, durable. 2537 02:10:02,560 --> 02:10:04,280 Speaker 1: I'm on year two of the same pair that I 2538 02:10:04,400 --> 02:10:08,240 Speaker 1: run with every single day, like sweat, rain. Great products. 2539 02:10:08,240 --> 02:10:11,720 Speaker 5: It's like small improvements, right, It's not it's not necessarily 2540 02:10:11,800 --> 02:10:16,000 Speaker 5: like revolutionizing hearing, but it's it's very very small improvements. 2541 02:10:16,440 --> 02:10:19,520 Speaker 5: Whereas the other kind of big, big trend, which 2542 02:10:19,560 --> 02:10:22,400 Speaker 5: isn't necessarily like wholly consumer based. It's kind of what 2543 02:10:22,480 --> 02:10:25,480 Speaker 5: these larger companies are trying to move towards is I 2544 02:10:25,560 --> 02:10:32,240 Speaker 5: feel like they're trying to replace friendship with this form 2545 02:10:32,320 --> 02:10:35,880 Speaker 5: of like technology and like AI enabled technology.
You used 2546 02:10:35,880 --> 02:10:38,080 Speaker 5: to have friends to get recommended new music. You used 2547 02:10:38,080 --> 02:10:40,160 Speaker 5: to have like friends to like tell you about new 2548 02:10:40,240 --> 02:10:43,480 Speaker 5: stuff that they're interested in. No longer. Now you have 2549 02:10:43,520 --> 02:10:45,480 Speaker 5: an AI agent that can do that for you. You 2550 02:10:45,720 --> 02:10:48,800 Speaker 5: you don't need you don't need friends to help kind 2551 02:10:48,800 --> 02:10:52,880 Speaker 5: of talk about, you know, you had a rough breakup. Instead, 2552 02:10:52,920 --> 02:10:56,040 Speaker 5: you can have a short term replacement. Using AI, you 2553 02:10:56,120 --> 02:10:59,640 Speaker 5: could have a friend replacement of a girlfriend replacement. It's 2554 02:10:59,680 --> 02:11:02,120 Speaker 5: all these things are trying to replace the core concept 2555 02:11:02,160 --> 02:11:05,160 Speaker 5: of friendship, even as like as even as like a baby, 2556 02:11:05,240 --> 02:11:07,920 Speaker 5: even as a toddler. Your first friend doesn't need to 2557 02:11:07,960 --> 02:11:10,000 Speaker 5: be people you meet outside. It can be this little 2558 02:11:10,080 --> 02:11:12,640 Speaker 5: hovering robot you have in the living room that can 2559 02:11:12,680 --> 02:11:15,520 Speaker 5: also organize your fridge, tell you what to tell you what. 2560 02:11:17,320 --> 02:11:18,840 Speaker 5: We'll roll around your house in the middle of the 2561 02:11:18,920 --> 02:11:21,880 Speaker 5: night with cameras, and that could be your first friend. 2562 02:11:22,560 --> 02:11:25,760 Speaker 5: It's replacing the core concept of friendship. It's this move 2563 02:11:25,840 --> 02:11:29,800 Speaker 5: towards complete optimization of every aspect of human life. Because 2564 02:11:29,800 --> 02:11:32,280 Speaker 5: as smooth as possible, that completely ignores like what it 2565 02:11:32,360 --> 02:11:33,040 Speaker 5: means to be human. 
2566 02:11:33,120 --> 02:11:36,600 Speaker 1: It's the fascinating difference between that elder care robot ElliQ, 2567 02:11:36,840 --> 02:11:38,920 Speaker 1: which was clearly a man with a tremendous amount of 2568 02:11:38,960 --> 02:11:41,440 Speaker 1: empathy trying to design a device to help people, and 2569 02:11:41,520 --> 02:11:43,520 Speaker 1: what I usually see with AI, which is trying to 2570 02:11:43,600 --> 02:11:47,120 Speaker 1: design a device to remove the need for human empathy. 2571 02:11:47,600 --> 02:11:49,480 Speaker 1: Like I went to a there was a vet app 2572 02:11:49,600 --> 02:11:53,120 Speaker 1: called Laika that's like ChatGPT for veterinarians, and they 2573 02:11:53,160 --> 02:11:54,640 Speaker 1: were like, yeah, you know what, most of it we 2574 02:11:54,760 --> 02:11:57,400 Speaker 1: focused initially on like technical questions, so like if I 2575 02:11:57,440 --> 02:12:00,480 Speaker 1: have these symptoms, what can that mean? But what they're asking 2576 02:12:00,600 --> 02:12:02,240 Speaker 1: us is like we would really like advice on how 2577 02:12:02,280 --> 02:12:04,200 Speaker 1: to talk to people whose pets are going to die? 2578 02:12:04,720 --> 02:12:07,160 Speaker 1: And I was like, do you, are vets not getting 2579 02:12:07,200 --> 02:12:09,840 Speaker 1: that out of school, because that's like that's a big 2580 02:12:09,960 --> 02:12:12,480 Speaker 1: part of being a vet, Like do they need 2581 02:12:12,600 --> 02:12:13,680 Speaker 1: ChatGPT for this?
2582 02:12:13,960 --> 02:12:16,640 Speaker 5: I saw this other company that was like it was 2583 02:12:16,680 --> 02:12:19,680 Speaker 5: designed to help you get over the loss of your pet, 2584 02:12:20,000 --> 02:12:22,120 Speaker 5: where you could you could pump tons of photos of 2585 02:12:22,160 --> 02:12:26,040 Speaker 5: your pet into this AI into this AI generator and 2586 02:12:26,320 --> 02:12:29,440 Speaker 5: it will generate new images and this is proven to 2587 02:12:29,480 --> 02:12:32,400 Speaker 5: help you move on from loss, which is literally a 2588 02:12:32,520 --> 02:12:37,680 Speaker 5: Nathan Fielder joke from like seven years ago, seven years ago, 2589 02:12:37,960 --> 02:12:40,880 Speaker 5: and like, no, you should talk with your friends about that. 2590 02:12:41,080 --> 02:12:43,400 Speaker 5: That's why you are a human. That's how you can 2591 02:12:43,480 --> 02:12:46,720 Speaker 5: move on from loss. You have to make new connections. 2592 02:12:47,360 --> 02:12:50,760 Speaker 5: Poorly AI generated images of your cat aren't going to 2593 02:12:50,840 --> 02:12:56,680 Speaker 5: help you move on, Like what why? Anyway, Replacing friendship 2594 02:12:56,800 --> 02:12:58,040 Speaker 5: is the thing that I see a lot of the 2595 02:12:58,120 --> 02:13:00,600 Speaker 5: tech world wanting to do, maybe because they don't understand 2596 02:13:00,680 --> 02:13:05,680 Speaker 5: real human relationships that aren't like innately transactional. I'm not sure, 2597 02:13:05,800 --> 02:13:08,040 Speaker 5: but that is like a huge trend that I've seen 2598 02:13:08,080 --> 02:13:09,280 Speaker 5: multiple multiple people mention. 
2599 02:13:09,760 --> 02:13:13,440 Speaker 14: All right, Zi, So I've worked in this industry for 2600 02:13:13,680 --> 02:13:17,200 Speaker 14: like three years now and this is my first big convention, 2601 02:13:18,080 --> 02:13:21,120 Speaker 14: and I would say, uh, this is just affirmed pretty 2602 02:13:21,160 --> 02:13:24,360 Speaker 14: much all of my disillusions with the tech world, and 2603 02:13:25,280 --> 02:13:29,880 Speaker 14: most of it's just nonsense. And maybe the positive people 2604 02:13:29,920 --> 02:13:30,880 Speaker 14: are onto some stuff. 2605 02:13:32,640 --> 02:13:34,920 Speaker 1: Well you say that, but I really do think via 2606 02:13:35,000 --> 02:13:38,560 Speaker 1: dox has kind of revolutionized the way in which mysterious 2607 02:13:38,640 --> 02:13:43,480 Speaker 1: fogs kill large numbers of the Maybe, but don't name 2608 02:13:43,560 --> 02:13:46,320 Speaker 1: it something so sinister. Yeah, yeah, it's if you were 2609 02:13:46,360 --> 02:13:47,720 Speaker 1: to be like this is the thing that keeps your 2610 02:13:47,760 --> 02:13:49,720 Speaker 1: apples fresh for a long time. That would be great. 2611 02:13:49,760 --> 02:13:51,840 Speaker 1: I would just don't call it apple fresh, yes, but 2612 02:13:51,960 --> 02:13:52,840 Speaker 1: call it apple fresh. 2613 02:13:53,920 --> 02:13:55,520 Speaker 5: By the way, you should listen to You should listen 2614 02:13:55,520 --> 02:13:58,760 Speaker 5: to Better Offline to hear context for veradox, which we 2615 02:13:58,840 --> 02:14:01,240 Speaker 5: discussed in the last episode of our daily CES coverage 2616 02:14:01,320 --> 02:14:05,240 Speaker 5: over there with the wonderful Ed Zitron. But essentially baradox is 2617 02:14:05,280 --> 02:14:09,320 Speaker 5: this mist that gets sprayed on produce, which allegedly helps 2618 02:14:09,400 --> 02:14:12,120 Speaker 5: it stay shelf stable for a few more days. 2619 02:14:14,360 --> 02:14:17,919 Speaker 1: Exactly.
So maybe that shelf stable mist will also translate 2620 02:14:17,960 --> 02:14:18,640 Speaker 1: to waking. 2621 02:14:18,480 --> 02:14:21,040 Speaker 15: Up the dead possibly, But you don't know that it's 2622 02:14:21,160 --> 02:14:22,600 Speaker 15: gonna do You don't know that it's not going to 2623 02:14:22,640 --> 02:14:23,320 Speaker 15: do that, right right. 2624 02:14:25,440 --> 02:14:28,160 Speaker 1: As a journalist, sorry, you have to ask these questions. 2625 02:14:28,240 --> 02:14:31,200 Speaker 5: And we discussed that way more in depth on Better Offline. 2626 02:14:31,280 --> 02:14:33,640 Speaker 15: Yeah, we do discuss whether the ability to bring red 2627 02:14:33,720 --> 02:14:36,720 Speaker 15: leaf lettuce back to life does have any repercussions 2628 02:14:37,160 --> 02:14:40,080 Speaker 15: in a Pet Sematary sort of way for your possibly 2629 02:14:40,120 --> 02:14:40,720 Speaker 15: dead loved ones. 2630 02:14:41,840 --> 02:14:45,080 Speaker 1: David, it's me sorry. A lot of good points. 2631 02:14:45,160 --> 02:14:47,480 Speaker 15: I mean, I think garyin ever both made the point 2632 02:14:47,520 --> 02:14:51,520 Speaker 15: about the sort of sociopathic like thread of a lot 2633 02:14:51,560 --> 02:14:53,640 Speaker 15: of this just sort of like an inability to understand 2634 02:14:53,720 --> 02:14:57,520 Speaker 15: not just what people might want from a technology, I think, 2635 02:14:57,560 --> 02:15:00,400 Speaker 15: which is to fuel not I mean, there probably are people. 2636 02:15:00,440 --> 02:15:02,000 Speaker 1: I imagine that. It's like if you were the guy, 2637 02:15:02,280 --> 02:15:05,120 Speaker 1: the dude that's like trying to age himself backwards. You know, 2638 02:15:05,320 --> 02:15:08,360 Speaker 1: he's like Bryan Johnson. Bryan Johnson. We love Bryan's.
2639 02:15:09,480 --> 02:15:12,160 Speaker 15: Yeah, but he like I feel like he would have 2640 02:15:12,240 --> 02:15:15,600 Speaker 15: been walking through this clapping his hands with delight. 2641 02:15:17,680 --> 02:15:21,480 Speaker 5: A day, drinking his son's blood time for Yeah, it's a. 2642 02:15:21,600 --> 02:15:25,320 Speaker 1: It's I drink his son's blood pretty rights and it's 2643 02:15:25,360 --> 02:15:25,800 Speaker 1: not bad. 2644 02:15:25,720 --> 02:15:28,840 Speaker 15: The high quality plot, but that it felt like it 2645 02:15:28,960 --> 02:15:30,440 Speaker 15: was that that there was a lot of this sort 2646 02:15:30,480 --> 02:15:35,600 Speaker 15: of like an optimization onto like transcending being human at all. 2647 02:15:35,760 --> 02:15:38,480 Speaker 15: And I don't think I mean again, there probably are 2648 02:15:38,520 --> 02:15:40,920 Speaker 15: people that want that. They certainly have money. I don't 2649 02:15:41,040 --> 02:15:45,160 Speaker 15: imagine that. I think what most people would like. I mean, 2650 02:15:45,200 --> 02:15:47,280 Speaker 15: then you don't expect technology to make you feel more human. 2651 02:15:47,480 --> 02:15:49,280 Speaker 15: But something I've been thinking. 2652 02:15:49,160 --> 02:15:52,000 Speaker 1: About a lot. We're talking about this a lot on Better. 2653 02:15:51,880 --> 02:15:55,640 Speaker 15: Offline, But there's a passivity that a lot of this 2654 02:15:56,040 --> 02:15:58,840 Speaker 15: sort of seems to be forcing onto people where you're 2655 02:15:58,880 --> 02:16:01,280 Speaker 15: just like sort of are happening to you that make 2656 02:16:01,320 --> 02:16:03,200 Speaker 15: your life more efficient and convenient. 2657 02:16:03,320 --> 02:16:05,640 Speaker 1: And I don't think that I want that. 
2658 02:16:06,280 --> 02:16:08,840 Speaker 15: I mean, I'm older than and poorer than the you 2659 02:16:08,920 --> 02:16:11,600 Speaker 15: know market that I think they're aiming for with this, 2660 02:16:11,760 --> 02:16:14,680 Speaker 15: But I'm certainly old enough to remember, as you said, 2661 02:16:14,760 --> 02:16:17,600 Speaker 15: like finding music, like that's a thing that yeah, you know, 2662 02:16:17,680 --> 02:16:19,400 Speaker 15: your friends tell you about it. And in my case, 2663 02:16:19,400 --> 02:16:22,760 Speaker 15: I mean again just being in my middle age, you 2664 02:16:22,880 --> 02:16:24,440 Speaker 15: like go to a store and you flip through shit, 2665 02:16:24,560 --> 02:16:28,240 Speaker 15: like there's a distinction between finding something and being given 2666 02:16:28,440 --> 02:16:30,720 Speaker 15: something or being fed something, like you're a foie 2667 02:16:30,800 --> 02:16:33,440 Speaker 15: gras goose and it's just getting sort of piped into 2668 02:16:33,480 --> 02:16:35,160 Speaker 15: your brain and life and being. 2669 02:16:35,520 --> 02:16:38,080 Speaker 1: And I think it's an important distinction. I think that 2670 02:16:38,240 --> 02:16:38,800 Speaker 1: little bit of. 2671 02:16:38,840 --> 02:16:42,200 Speaker 15: Agency of having some sense of doing the things that 2672 02:16:42,320 --> 02:16:45,160 Speaker 15: you want to do, like I would imagine that, well, 2673 02:16:45,240 --> 02:16:47,480 Speaker 15: I don't have to imagine it technology that helps you 2674 02:16:47,600 --> 02:16:49,560 Speaker 15: do that as opposed to doing it for you. 2675 02:16:49,720 --> 02:16:50,120 Speaker 1: I think that. 2676 02:16:51,800 --> 02:16:53,880 Speaker 15: I don't want stuff that makes me feel less human.
2677 02:16:54,080 --> 02:16:55,880 Speaker 15: I don't want stuff that makes me feel more like 2678 02:16:56,080 --> 02:16:58,440 Speaker 15: I'm in a fucking matrix pod and I think that 2679 02:16:59,680 --> 02:17:01,720 Speaker 15: a lot of the stuff that was out there seemed 2680 02:17:01,800 --> 02:17:02,720 Speaker 15: targeted towards the. 2681 02:17:04,200 --> 02:17:07,720 Speaker 1: Matrix pod dwelling community. I think that's that's about the 2682 02:17:07,760 --> 02:17:11,360 Speaker 1: best line we could go out on, like, that's that's yeah, 2683 02:17:11,640 --> 02:17:12,120 Speaker 1: you nailed it. 2684 02:17:12,200 --> 02:17:12,400 Speaker 4: Thanks. 2685 02:17:12,440 --> 02:17:14,360 Speaker 1: I thought I crushed that one. Yeah you did. You did, 2686 02:17:14,760 --> 02:17:18,160 Speaker 1: great job, Dave. Where can people find your workday defector 2687 02:17:18,240 --> 02:17:21,080 Speaker 1: dot com? Why let me do that without crushing my water? No, no, no, no, 2688 02:17:21,200 --> 02:17:21,680 Speaker 1: that's okay. 2689 02:17:21,959 --> 02:17:22,080 Speaker 5: Let me. 2690 02:17:22,280 --> 02:17:25,320 Speaker 1: That's a load bearing piece of content. Defector dot com 2691 02:17:25,440 --> 02:17:26,959 Speaker 1: is the website and at. 2692 02:17:27,280 --> 02:17:30,720 Speaker 16: Big black Jack a bit on Twitter and Blue Sky. 2693 02:17:31,080 --> 02:17:33,920 Speaker 16: This Machine Kills is my podcast. Tech Bubble dot Soapstack 2694 02:17:33,959 --> 02:17:35,320 Speaker 16: dot com is the newsletter. 2695 02:17:35,800 --> 02:17:38,520 Speaker 1: Hell yeah, do you want to tell people how to 2696 02:17:38,600 --> 02:17:38,920 Speaker 1: find you? 2697 02:17:39,600 --> 02:17:43,759 Speaker 14: Zi hat new old woman on Twitter with okay zeros. 2698 02:17:44,879 --> 02:17:45,920 Speaker 5: Zeros for neo. 
2699 02:17:45,959 --> 02:17:48,720 Speaker 1: Zeros, All right, everybody, Well, uh, that's going to do 2700 02:17:48,800 --> 02:17:50,680 Speaker 1: it for us here at It Could Happen Here and 2701 02:17:50,800 --> 02:17:53,160 Speaker 1: our week at CES. You know, just try to 2702 02:17:53,280 --> 02:17:55,879 Speaker 1: hug your loved ones until the Viera ox sweeps through 2703 02:17:56,160 --> 02:18:00,520 Speaker 1: all of their homes and neighborhoods. Yeah, it's in the room, 2704 02:18:00,600 --> 02:18:01,200 Speaker 1: It's in the room. 2705 02:18:16,200 --> 02:18:19,160 Speaker 13: Welcome to It Could Happen Here, a podcast about things falling apart, 2706 02:18:19,240 --> 02:18:21,480 Speaker 13: how to put them back together again. I am your host, 2707 02:18:21,560 --> 02:18:26,120 Speaker 13: Mia Wong, returned for the holidays, returned rejuvenated, returned refreshed, 2708 02:18:26,920 --> 02:18:29,280 Speaker 13: returned to do something a little bit different. 2709 02:18:30,080 --> 02:18:31,000 Speaker 1: In the coming weeks. 2710 02:18:31,080 --> 02:18:32,720 Speaker 13: We're going to be doing a lot of nitty gritty 2711 02:18:32,720 --> 02:18:34,600 Speaker 13: analysis of the coming wave of fascism. 2712 02:18:35,000 --> 02:18:38,039 Speaker 1: But what we haven't really been doing as much. 2713 02:18:38,160 --> 02:18:39,879 Speaker 5: What I want to take some time to do today 2714 02:18:40,120 --> 02:18:43,360 Speaker 5: is to talk about fascism at a sort of macro level 2715 02:18:43,480 --> 02:18:46,640 Speaker 5: and what it looks like right now, and also talk 2716 02:18:46,800 --> 02:18:49,480 Speaker 5: about an extremely cooked guy.
2717 02:18:49,480 --> 02:18:54,199 Speaker 9: Who blew himself up in a Cybertruck outside of the Trump building, 2718 02:18:55,720 --> 02:19:01,600 Speaker 9: and with me to talk about this is writer, organizer, agitator, 2719 02:19:01,800 --> 02:19:06,080 Speaker 9: doer of so many different things that like, I don't 2720 02:19:06,080 --> 02:19:08,440 Speaker 9: know someone's going to write a great biography in like 2721 02:19:08,800 --> 02:19:09,680 Speaker 9: one hundred years. 2722 02:19:09,959 --> 02:19:10,200 Speaker 5: It is. 2723 02:19:10,280 --> 02:19:12,040 Speaker 1: It is the one and only Vicky Osterweil. 2724 02:19:13,360 --> 02:19:13,680 Speaker 10: Thank you. 2725 02:19:14,560 --> 02:19:16,760 Speaker 17: Sorry, I couldn't keep the giggle down long enough for 2726 02:19:16,840 --> 02:19:19,120 Speaker 17: you to get to the intro before you're you find 2727 02:19:19,160 --> 02:19:19,880 Speaker 17: people could hear me. 2728 02:19:22,280 --> 02:19:23,600 Speaker 1: Oh, I'm glad to have you here. 2729 02:19:23,720 --> 02:19:27,160 Speaker 13: And part part of this the initial thing that was like, okay, 2730 02:19:27,200 --> 02:19:30,000 Speaker 13: we need to do this was I saw you called 2731 02:19:30,160 --> 02:19:33,840 Speaker 13: all of this the years of lead paint, and that 2732 02:19:34,080 --> 02:19:36,680 Speaker 13: is just it has stuck in my mind for 2733 02:19:36,800 --> 02:19:39,080 Speaker 13: every single second of every day since then.
2734 02:19:40,080 --> 02:19:43,320 Speaker 17: Yeah, yeah, I was writing for the journal that I 2735 02:19:43,360 --> 02:19:45,600 Speaker 17: am working and fundraising for, go check us out, 2736 02:19:45,800 --> 02:19:48,760 Speaker 17: but I wrote a piece about how unpleasant the cyberpunk 2737 02:19:48,800 --> 02:19:51,720 Speaker 17: dystopia is in the face of, you know, that sort 2738 02:19:51,720 --> 02:19:54,080 Speaker 17: of, that image of the Cybertruck on fire outside 2739 02:19:54,080 --> 02:19:56,680 Speaker 17: the Trump Hotel. And about, you know, as we were 2740 02:19:56,760 --> 02:19:58,960 Speaker 17: about to talk about, Matthew Livelsberger, I think is how 2741 02:19:58,959 --> 02:20:03,080 Speaker 17: it's pronounced, who's the Green Beret and big Trump fan 2742 02:20:03,280 --> 02:20:06,040 Speaker 17: who thought blowing up a Cybertruck outside of the 2743 02:20:06,080 --> 02:20:08,600 Speaker 17: Trump Hotel would start not a race war, but like 2744 02:20:08,760 --> 02:20:10,320 Speaker 17: the purging of Democratic politicians.
2745 02:20:10,400 --> 02:20:12,920 Speaker 13: Is that what we think his, yeah, vision was? Now 2746 02:20:13,080 --> 02:20:17,120 Speaker 13: that seems to be it, like politicians, and like it's 2747 02:20:17,200 --> 02:20:19,160 Speaker 13: kind of an evolution of the like purge the deep 2748 02:20:19,240 --> 02:20:21,480 Speaker 13: state thing where he wants Democrats gone from like the 2749 02:20:21,720 --> 02:20:24,879 Speaker 13: army and right right, you know, so it's the kind 2750 02:20:24,920 --> 02:20:28,560 Speaker 13: of more generic version of like the sort of Nazi 2751 02:20:28,640 --> 02:20:30,320 Speaker 13: fantasy of the day of the rope from the Turner 2752 02:20:30,400 --> 02:20:32,800 Speaker 13: Diaries that has kind of like metastasized into all this right 2753 02:20:32,840 --> 02:20:34,960 Speaker 13: wing culture where they have their own sort of like 2754 02:20:35,560 --> 02:20:40,120 Speaker 13: less race-war-y or like less anti-Semitic versions of it. Yes, 2755 02:20:40,360 --> 02:20:43,560 Speaker 13: and that's apparently what this guy was trying to start 2756 02:20:43,680 --> 02:20:46,440 Speaker 13: off by exactly blowing himself up with a 2757 02:20:46,480 --> 02:20:48,800 Speaker 5: truck full of fireworks in front of a Trump hotel. 2758 02:20:50,080 --> 02:20:53,160 Speaker 17: So basically, this guy, despite being a Green Beret, which, 2759 02:20:53,280 --> 02:20:55,880 Speaker 17: like say what you will, arguably some of the most 2760 02:20:56,160 --> 02:20:59,320 Speaker 17: trained and experienced murderers in the world, you know, whatever 2761 02:20:59,360 --> 02:21:01,039 Speaker 17: else you say about them, and this is important. 2762 02:21:01,080 --> 02:21:04,040 Speaker 13: You know, I'm not sure there's any capacity drop in 2763 02:21:04,120 --> 02:21:06,600 Speaker 13: the world that is greater than the drop from like 2764 02:21:06,800 --> 02:21:09,240 Speaker 13: Green Beret to like former Green Beret.
2765 02:21:09,400 --> 02:21:13,400 Speaker 1: This guy was active duty, so like right, yes, yes, yes, exactly. 2766 02:21:14,120 --> 02:21:15,480 Speaker 1: This wasn't even like a cooked vet. 2767 02:21:15,560 --> 02:21:17,520 Speaker 17: This is a guy who is like in the shit, 2768 02:21:18,240 --> 02:21:20,320 Speaker 17: and we know that he was drinking the kool aid 2769 02:21:20,320 --> 02:21:21,520 Speaker 17: because he used ChatGPT, 2770 02:21:22,040 --> 02:21:24,360 Speaker 10: it just turned out today, to help plan his attack. 2771 02:21:24,800 --> 02:21:30,560 Speaker 17: But unfortunately, despite his murder expertise, which is undeniable, the Cybertruck, 2772 02:21:30,680 --> 02:21:33,640 Speaker 17: like all Teslas, is designed mostly to endanger the people 2773 02:21:33,680 --> 02:21:38,400 Speaker 17: inside it, because they won't sue Tesla, because they're already 2774 02:21:38,640 --> 02:21:41,200 Speaker 17: huge super fans. And what I really mean, of course, 2775 02:21:41,280 --> 02:21:43,840 Speaker 17: is that they have terrible safety protocols. And the Cybertruck, 2776 02:21:44,280 --> 02:21:45,960 Speaker 17: which is like a twelve year old's idea of a 2777 02:21:46,040 --> 02:21:49,800 Speaker 17: good idea, has an incredibly, incredibly firm stainless steel 2778 02:21:49,879 --> 02:21:52,760 Speaker 17: body which does not crumple and does not take damage, 2779 02:21:53,120 --> 02:21:57,200 Speaker 17: which means that your frail human body inside it in 2780 02:21:57,280 --> 02:22:00,400 Speaker 17: an accident bashes against a wall of steel metal, very 2781 02:22:00,560 --> 02:22:04,520 Speaker 17: dangerous to be inside.
But the car doesn't take damage, 2782 02:22:04,760 --> 02:22:06,520 Speaker 17: and that means that if you leave a bomb in it, 2783 02:22:07,280 --> 02:22:09,960 Speaker 17: the sides of the car were fine, so the explosion 2784 02:22:10,000 --> 02:22:12,680 Speaker 17: went straight up right, so it did no damage to 2785 02:22:12,720 --> 02:22:14,440 Speaker 17: the hotel. It's not clear if he intended that, but 2786 02:22:14,520 --> 02:22:16,080 Speaker 17: it seems like he probably wanted to do a 2787 02:22:16,040 --> 02:22:17,080 Speaker 10: little damage at the hotel. 2788 02:22:17,400 --> 02:22:20,520 Speaker 17: Most people who are doing suicide bombings want that, I 2789 02:22:20,560 --> 02:22:25,600 Speaker 17: would imagine. So anyway, all this is to say, you 2790 02:22:25,680 --> 02:22:27,879 Speaker 17: know, this guy who's like an active duty Green Beret 2791 02:22:27,959 --> 02:22:30,280 Speaker 17: who believes for some reason that attacking a Trump hotel 2792 02:22:30,480 --> 02:22:33,400 Speaker 17: in an Elon Musk car will somehow lead to the 2793 02:22:33,480 --> 02:22:37,560 Speaker 17: murder of Democrats. But he's so tech pilled that he 2794 02:22:37,680 --> 02:22:40,240 Speaker 17: takes a Cybertruck, which doesn't even work as a 2795 02:22:40,320 --> 02:22:43,720 Speaker 17: bomb, and dies in it and just leaves this like 2796 02:22:44,120 --> 02:22:46,560 Speaker 17: horrible image. And I mean, you know, I'm being flippant 2797 02:22:46,560 --> 02:22:48,760 Speaker 17: about this, like it's an awful thing obviously, but no one 2798 02:22:48,800 --> 02:22:50,880 Speaker 17: else was hurt except himself. I mean, the image was 2799 02:22:50,920 --> 02:22:53,240 Speaker 17: everywhere on social media for like the last three days, 2800 02:22:53,280 --> 02:22:55,160 Speaker 17: of that, of that Cybertruck on fire outside of 2801 02:22:55,160 --> 02:22:57,960 Speaker 17: the Trump towers.
Yeah, it was the perfect image of 2802 02:22:58,000 --> 02:22:59,520 Speaker 17: a thing I had already been thinking of as the 2803 02:22:59,600 --> 02:23:01,920 Speaker 17: years of lead paint. So I wrote an essay around that, basically. 2804 02:23:03,080 --> 02:23:05,600 Speaker 13: Yeah, So I want to start talking about this by 2805 02:23:05,680 --> 02:23:07,760 Speaker 13: getting a little bit into what the years of lead are, 2806 02:23:07,879 --> 02:23:11,760 Speaker 13: because I imagine it's some of you, like there's probably, 2807 02:23:11,959 --> 02:23:14,360 Speaker 13: like I don't know, there's probably several thousand of you 2808 02:23:14,480 --> 02:23:17,000 Speaker 13: who are obsessive nerds about the years of lead and 2809 02:23:17,240 --> 02:23:19,320 Speaker 13: like know the name of every single guy who was 2810 02:23:19,400 --> 02:23:23,360 Speaker 13: implicated in these car bombings, but for everyone else who's normal. 2811 02:23:23,560 --> 02:23:25,800 Speaker 13: And I count myself among the non-normal people because 2812 02:23:25,800 --> 02:23:27,880 Speaker 13: I did. I spent about two years going down the 2813 02:23:27,959 --> 02:23:30,440 Speaker 13: years of lead rabbit hole and destroyed my brain. But 2814 02:23:30,640 --> 02:23:33,600 Speaker 13: the years of lead were this thing in roughly the 2815 02:23:33,680 --> 02:23:36,960 Speaker 13: seventies and the eighties, in Italy, where as a response 2816 02:23:37,160 --> 02:23:39,280 Speaker 13: to the sort of rising power of the left through 2817 02:23:39,320 --> 02:23:41,880 Speaker 13: the sixties, and like the giant uprisings in nineteen sixty-eight.
2818 02:23:42,080 --> 02:23:44,080 Speaker 13: And Italy's kind of different from the rest of Europe 2819 02:23:44,120 --> 02:23:47,440 Speaker 13: because in Italy, you know, like in France, for example, 2820 02:23:47,480 --> 02:23:49,400 Speaker 13: France has this huge uprising in May sixty-eight, 2821 02:23:49,440 --> 02:23:51,760 Speaker 13: like they nearly knock off the government, like workers councils 2822 02:23:51,800 --> 02:23:54,360 Speaker 13: have seized control of the factories, like they lose this 2823 02:23:54,440 --> 02:23:56,400 Speaker 13: whole battle, like there's, you know, the president's like fleeing 2824 02:23:56,400 --> 02:23:59,600 Speaker 13: in a helicopter. But then after that, like they kind 2825 02:23:59,640 --> 02:24:03,800 Speaker 13: of never seriously threaten the French government again. In Italy, 2826 02:24:03,879 --> 02:24:05,000 Speaker 13: that is not true. 2827 02:24:05,000 --> 02:24:05,680 Speaker 5: Like sixty-eight. 2828 02:24:05,760 --> 02:24:07,840 Speaker 13: In Italy, there's a very similar thing going on, but 2829 02:24:07,879 --> 02:24:09,400 Speaker 13: like the seizure of the factories has been going on 2830 02:24:09,680 --> 02:24:11,879 Speaker 13: since, like, I mean, stuff like this has been happening 2831 02:24:11,879 --> 02:24:15,480 Speaker 13: since the fifties, and it really only stops in nineteen 2832 02:24:15,480 --> 02:24:17,720 Speaker 13: seventy seven when like they have one last big push 2833 02:24:17,800 --> 02:24:20,880 Speaker 13: uprising and it fails.
So as a way to contain this, 2834 02:24:21,000 --> 02:24:25,800 Speaker 13: the Italian government develops this strategy of backing right wing 2835 02:24:25,920 --> 02:24:28,720 Speaker 13: terror groups and then also orchestrating left wing terror groups, 2836 02:24:28,720 --> 02:24:30,640 Speaker 13: and by terror groups I mean, like, the most famous 2837 02:24:30,920 --> 02:24:32,680 Speaker 13: thing in this is called the Bologna train station bombing in 2838 02:24:32,720 --> 02:24:36,360 Speaker 13: nineteen eighty. It kills eighty-five people, wounds like two 2839 02:24:36,520 --> 02:24:40,200 Speaker 13: hundred and ninety. Like it's a really, really horrific attack, 2840 02:24:40,200 --> 02:24:42,039 Speaker 13: and it's immediately blamed on an anarchist group. It turns 2841 02:24:42,080 --> 02:24:43,760 Speaker 13: out it's not an anarchist group. It is a state- 2842 02:24:43,840 --> 02:24:47,039 Speaker 13: backed, like, fascist group. And yeah, like there are other 2843 02:24:47,120 --> 02:24:49,240 Speaker 13: ones. I will pass over to Vicky to talk about, 2844 02:24:49,280 --> 02:24:52,080 Speaker 13: like, the other terrible shit that they did. 2845 02:24:52,440 --> 02:24:55,039 Speaker 17: Well, that bombing kind of ends, in some ways ends 2846 02:24:55,080 --> 02:24:57,000 Speaker 17: the years of lead, you could end it there. 2847 02:24:57,040 --> 02:24:58,520 Speaker 10: It's sort of the last big terrorist moment. 2848 02:24:58,920 --> 02:25:01,000 Speaker 17: The first thing, the event that like sort of after 2849 02:25:01,080 --> 02:25:02,760 Speaker 17: sixty-eight kind of starts it is this thing called 2850 02:25:02,800 --> 02:25:06,400 Speaker 17: the Piazza Fontana bombing in Milan, which is like an 2851 02:25:06,480 --> 02:25:08,920 Speaker 17: agriculture bank, I think is what it's called.
It's just, 2852 02:25:09,000 --> 02:25:11,760 Speaker 17: like, seventeen people are killed, almost one hundred people 2853 02:25:11,800 --> 02:25:13,920 Speaker 17: are wounded, and the first thing that the police do 2854 02:25:14,040 --> 02:25:16,680 Speaker 17: is they blame anarchists, in sixty-eight as well. And 2855 02:25:16,720 --> 02:25:18,880 Speaker 17: there's a famous, there's a famous case of this anarchist 2856 02:25:19,000 --> 02:25:22,560 Speaker 17: organizer named Pinelli who is arrested, and then while he 2857 02:25:22,720 --> 02:25:26,360 Speaker 17: is under interrogation, falls out of the window of the 2858 02:25:26,400 --> 02:25:27,800 Speaker 17: police department to his death. 2859 02:25:28,640 --> 02:25:28,840 Speaker 1: Yep. 2860 02:25:29,160 --> 02:25:31,640 Speaker 17: It has still never been proven that he was pushed. 2861 02:25:31,760 --> 02:25:35,760 Speaker 17: The police claimed he jumped out after they interrogated him 2862 02:25:35,840 --> 02:25:36,320 Speaker 17: really hard. 2863 02:25:36,480 --> 02:25:36,680 Speaker 1: Yeah. 2864 02:25:36,760 --> 02:25:37,000 Speaker 10: Sure. 2865 02:25:37,160 --> 02:25:39,400 Speaker 17: Oh, like, there's a very famous Italian play about it 2866 02:25:39,480 --> 02:25:42,039 Speaker 17: by Dario Fo called the Death of an Anarchist. So anyways, 2867 02:25:42,080 --> 02:25:44,720 Speaker 17: they blame the anarchists, they literally murder a leading anarchist 2868 02:25:44,920 --> 02:25:47,800 Speaker 17: printer and organizer, and then of course it turns out 2869 02:25:47,800 --> 02:25:50,720 Speaker 17: that it was this terrorist group called Ordine Nuovo, which was, 2870 02:25:50,920 --> 02:25:52,959 Speaker 17: you know, this neo fascist group that had, let's say, 2871 02:25:53,080 --> 02:25:56,600 Speaker 17: significant overlap with parts of the Italian state.
And I 2872 02:25:56,680 --> 02:25:58,760 Speaker 17: think like one way of understanding the years of lead, 2873 02:25:58,840 --> 02:26:00,800 Speaker 17: I think that might be easy for people who aren't familiar 2874 02:26:00,879 --> 02:26:02,840 Speaker 17: with it, is that it's, it's like a very low 2875 02:26:02,959 --> 02:26:05,520 Speaker 17: level civil war. It's, it's, I think the closest thing 2876 02:26:05,560 --> 02:26:08,680 Speaker 17: we can maybe think of is the Troubles in Northern Ireland. Yeah, 2877 02:26:08,800 --> 02:26:10,600 Speaker 17: and the reason those were a little different was because 2878 02:26:10,600 --> 02:26:12,840 Speaker 17: a lot of those attacks were happening in England, whereas 2879 02:26:12,959 --> 02:26:14,600 Speaker 17: like the, you know, the movement was in Ireland. But 2880 02:26:15,040 --> 02:26:18,560 Speaker 17: this is very similar, which is like there's these armed wings, 2881 02:26:19,040 --> 02:26:20,680 Speaker 17: both on the right and the left, that are like 2882 02:26:21,320 --> 02:26:24,119 Speaker 17: both meeting in combat and sort of fighting each other. 2883 02:26:24,320 --> 02:26:27,640 Speaker 17: But in this instance, rather than a colonial occupation that 2884 02:26:27,640 --> 02:26:32,000 Speaker 17: they're fighting against, the Italian government was literally both paying 2885 02:26:32,120 --> 02:26:35,000 Speaker 17: for and arming the fascists and instructing them to frame the 2886 02:26:35,080 --> 02:26:35,960 Speaker 17: left for these attacks. 2887 02:26:36,160 --> 02:26:38,200 Speaker 13: Yeah, and there's, I mean, there's other stuff too. We're 2888 02:26:38,240 --> 02:26:40,320 Speaker 13: not going to get into the kidnapping of Aldo Moro here. 2889 02:26:40,600 --> 02:26:43,320 Speaker 13: I have explained this on the show at some point. 2890 02:26:43,400 --> 02:26:43,960 Speaker 1: I think it's in.
2891 02:26:44,240 --> 02:26:45,640 Speaker 13: I think it's in, if you go to the Hall 2892 02:26:45,640 --> 02:26:47,720 Speaker 13: of Weed episode we did where we talk about conspiracies, 2893 02:26:47,720 --> 02:26:50,920 Speaker 13: I've explained that whole thing. But like the goal of this, right, 2894 02:26:51,160 --> 02:26:53,200 Speaker 13: the reason that, you know, they're, they're giving all of 2895 02:26:53,240 --> 02:26:55,520 Speaker 13: these weapons to these, like, stay-behind networks that are 2896 02:26:55,520 --> 02:26:57,760 Speaker 13: designed to, like, fight a Soviet invasion, and like, and 2897 02:26:57,879 --> 02:27:01,400 Speaker 13: having all these bombings, was specifically something they call the 2898 02:27:01,400 --> 02:27:04,800 Speaker 13: strategy of tension, which is a strategy of promoting sort 2899 02:27:04,840 --> 02:27:08,240 Speaker 13: of mass violence and promoting terror as a strategy to 2900 02:27:08,360 --> 02:27:11,120 Speaker 13: drive people back towards the state. Because the idea was, 2901 02:27:11,160 --> 02:27:13,200 Speaker 13: and this, and this seems to have worked, you know, 2902 02:27:13,360 --> 02:27:15,480 Speaker 13: you scare people enough by the fact that there's, you know, 2903 02:27:15,520 --> 02:27:17,560 Speaker 13: there's bombs going off all the time, people are getting killed, 2904 02:27:17,560 --> 02:27:19,800 Speaker 13: people are getting kidnapped, there's all of this just like 2905 02:27:20,120 --> 02:27:23,960 Speaker 13: horror happening, and the goal is to get people to 2906 02:27:24,040 --> 02:27:26,039 Speaker 13: turn to the state for, you know, sort of order 2907 02:27:26,120 --> 02:27:28,760 Speaker 13: and security, and like stop doing all of this uprising stuff, 2908 02:27:28,800 --> 02:27:30,280 Speaker 13: because we need, you know, we need the sort of 2909 02:27:30,360 --> 02:27:30,920 Speaker 13: terror to end. 2910 02:27:31,959 --> 02:27:32,879 Speaker 5: And it was.
2911 02:27:32,879 --> 02:27:35,880 Speaker 13: Extremely effective, and the sort of knowledge of this has, 2912 02:27:36,160 --> 02:27:38,680 Speaker 13: I guess, proliferated through the American left in the last, 2913 02:27:39,600 --> 02:27:43,040 Speaker 13: like, decade, and that has led to a lot of, 2914 02:27:44,360 --> 02:27:47,680 Speaker 13: I think, kind of unhelpful comparisons. You will hear people 2915 02:27:47,760 --> 02:27:50,840 Speaker 13: sometimes talk about, like, American Gladio, which is, Gladio is 2916 02:27:50,879 --> 02:27:52,840 Speaker 13: those stay-behind networks that were armed by the 2917 02:27:52,920 --> 02:27:54,640 Speaker 13: Italian state and used as sort of the basis of 2918 02:27:54,680 --> 02:27:58,200 Speaker 13: these neo fascist groups, and use it to refer to this 2919 02:27:58,280 --> 02:28:00,240 Speaker 13: sort of, like, what's happening in the US, 2920 02:28:00,240 --> 02:28:02,960 Speaker 13: and that's not really what's happening. And this is where 2921 02:28:02,959 --> 02:28:04,480 Speaker 13: I want to pass it to Vicky to talk about 2922 02:28:04,680 --> 02:28:07,039 Speaker 13: sort of the characteristics of what we're calling the Years 2923 02:28:07,080 --> 02:28:09,119 Speaker 13: of Lead Paint and how they're sort of different from 2924 02:28:09,160 --> 02:28:10,120 Speaker 13: the Italian ones. 2925 02:28:10,680 --> 02:28:15,240 Speaker 17: Yeah, in classic American fashion, everything is more chaotic and autonomous, yes, 2926 02:28:15,959 --> 02:28:20,080 Speaker 17: and widely proliferated, and also widely proliferated all over America: 2927 02:28:20,440 --> 02:28:21,360 Speaker 17: products and services. 2928 02:28:21,680 --> 02:28:22,240 Speaker 10: Did I do again? 2929 02:28:22,360 --> 02:28:37,920 Speaker 1: Let's support this podcast. That's a good one. We are back, 2930 02:28:37,920 --> 02:28:39,640 Speaker 1: all right, Years of Lead Paint. Let's go.
2931 02:28:40,160 --> 02:28:42,840 Speaker 17: Yes, right. So I actually think, you know, as you 2932 02:28:42,920 --> 02:28:45,080 Speaker 17: were saying that, I think actually a thing that might 2933 02:28:45,120 --> 02:28:47,840 Speaker 17: be the closest to Gladio, and it's not Gladio, because 2934 02:28:47,879 --> 02:28:49,480 Speaker 17: that was very conscious and it was like these stay- 2935 02:28:49,520 --> 02:28:53,520 Speaker 17: behind networks are organized explicitly. But the US state's defense 2936 02:28:53,760 --> 02:28:57,760 Speaker 17: of the Second Amendment and of, like, assault rifle availability, 2937 02:28:57,920 --> 02:29:00,959 Speaker 17: and making the US the sort of home for military 2938 02:29:01,000 --> 02:29:04,200 Speaker 17: surplus, because obviously, like, the military industrial complex sells lots 2939 02:29:04,200 --> 02:29:06,600 Speaker 17: of guns, it's a very helpful thing, that producing a 2940 02:29:06,680 --> 02:29:09,600 Speaker 17: reign of mass shooters who also operate in a sort 2941 02:29:09,640 --> 02:29:11,960 Speaker 17: of years of lead, terroristic, sort of strategy of tension way, 2942 02:29:12,280 --> 02:29:14,720 Speaker 17: I think might actually be close. But you can tell 2943 02:29:14,800 --> 02:29:17,800 Speaker 17: that that's very disorganized. Yeah, it's very distributed through the 2944 02:29:17,840 --> 02:29:21,920 Speaker 17: social, it's done by, you know, volunteers, right. 2945 02:29:22,320 --> 02:29:22,520 Speaker 4: Yeah. 2946 02:29:22,600 --> 02:29:24,320 Speaker 13: And also the people who are doing the Years of 2947 02:29:24,440 --> 02:29:27,520 Speaker 13: Lead are unbelievably cynical about it, right, like they don't, 2948 02:29:27,600 --> 02:29:30,680 Speaker 13: they don't believe any of this shit, right, yes, yes, 2949 02:29:30,920 --> 02:29:33,320 Speaker 13: no exactly, whereas the Second Amendment guys.
Like that stuff 2950 02:29:33,360 --> 02:29:35,800 Speaker 13: is driven a lot by sort of like hardline true 2951 02:29:35,840 --> 02:29:39,120 Speaker 13: believers who aren't trying to sort of like fuel a 2952 02:29:39,160 --> 02:29:41,440 Speaker 13: bunch of mass shootings to push people towards, like, 2953 02:29:41,520 --> 02:29:43,560 Speaker 13: increasingly right wing politics. That's sort of like not 2954 02:29:44,640 --> 02:29:46,240 Speaker 13: what they were trying to do, but that's sort of, 2955 02:29:46,280 --> 02:29:48,360 Speaker 13: you know, that's the effect of a lot of this stuff. 2956 02:29:48,720 --> 02:29:49,320 Speaker 10: Yeah, it wasn't. 2957 02:29:49,320 --> 02:29:51,400 Speaker 17: It wasn't a conscious effort at all. But that's also 2958 02:29:51,520 --> 02:29:53,560 Speaker 17: not the years of lead paint. That's just like a 2959 02:29:53,640 --> 02:29:57,560 Speaker 17: similar thing. The years of lead paint, which is obviously, 2960 02:29:57,720 --> 02:30:00,480 Speaker 17: like, which is a joke about this big reactionary 2961 02:30:00,520 --> 02:30:02,960 Speaker 17: myth from, like, the Freakonomics guys, I think, that like 2962 02:30:03,040 --> 02:30:05,920 Speaker 17: the rise in crime, like, correlated to, like, the use 2963 02:30:05,959 --> 02:30:08,080 Speaker 17: of lead paint in children's bedrooms, which is 2964 02:30:08,080 --> 02:30:10,720 Speaker 13: really funny, because for the Freakonomics guy, that, that is 2965 02:30:10,800 --> 02:30:14,920 Speaker 13: a downright left wing theory by his standards. 2966 02:30:14,560 --> 02:30:17,440 Speaker 10: Yeah exactly, or maybe it was a dude directing it.
2967 02:30:17,440 --> 02:30:20,640 Speaker 17: I don't even remember now. Anyway, so it became, it 2968 02:30:20,680 --> 02:30:23,359 Speaker 17: became a meme to, like, talk about sort of boomers 2969 02:30:23,400 --> 02:30:26,400 Speaker 17: and Generation X people, you know, having the lead 2970 02:30:26,480 --> 02:30:29,880 Speaker 17: in their gasoline and in their walls cause all this stuff. 2971 02:30:30,080 --> 02:30:33,520 Speaker 17: Obviously I'm not advocating that kind of, like, ableist insult 2972 02:30:33,560 --> 02:30:35,480 Speaker 17: when I talk about this, it is a memetic way 2973 02:30:35,480 --> 02:30:37,000 Speaker 17: of making fun of that concept. 2974 02:30:37,160 --> 02:30:40,480 Speaker 10: But all of that to say, they 2975 02:30:40,440 --> 02:30:43,800 Speaker 17: have completely drunk the kool aid, right. The fascists, as 2976 02:30:43,840 --> 02:30:45,600 Speaker 17: you're saying, yeah, they knew what they were doing. They 2977 02:30:45,680 --> 02:30:48,199 Speaker 17: knew they were framing the left. They were like making 2978 02:30:48,280 --> 02:30:50,440 Speaker 17: it up.
But like a lot of people on the 2979 02:30:50,560 --> 02:30:52,880 Speaker 17: right in Italy, yeah, yeah, in Italy, excuse me, in 2980 02:30:52,879 --> 02:30:55,160 Speaker 17: Italy in the sixties and in the actual years of lead, 2981 02:30:55,400 --> 02:30:59,240 Speaker 17: years of lead paint, you've got people genuinely probably believing 2982 02:30:59,280 --> 02:31:03,240 Speaker 17: that January was Antifah, like people whose friends were there, 2983 02:31:03,879 --> 02:31:06,680 Speaker 17: you know, yeah, like stuff like Q and the other 2984 02:31:06,760 --> 02:31:08,440 Speaker 17: thing that the reason that this is years of lead 2985 02:31:08,480 --> 02:31:11,440 Speaker 17: pain and not the first Trump administration is because during 2986 02:31:11,480 --> 02:31:14,720 Speaker 17: the first Trump administration there was actually pretty pretty well 2987 02:31:14,840 --> 02:31:17,400 Speaker 17: organized on the ground fascist movements and they could they 2988 02:31:17,440 --> 02:31:19,440 Speaker 17: could certainly come back in the US right now. 2989 02:31:19,560 --> 02:31:21,040 Speaker 10: There's no reason they couldn't. 2990 02:31:21,200 --> 02:31:23,800 Speaker 13: Yeah, And it's also worth talking about we'll be covering 2991 02:31:23,879 --> 02:31:26,360 Speaker 13: this on the show, like at some point in the 2992 02:31:26,400 --> 02:31:28,000 Speaker 13: future when we've had time to go through the documents. 2993 02:31:28,040 --> 02:31:31,199 Speaker 13: But there was recently a massive from distributed to Nile Secrets, 2994 02:31:31,320 --> 02:31:35,680 Speaker 13: a massive drop of stuff on the militia movements from 2995 02:31:35,720 --> 02:31:37,600 Speaker 13: a guy who infiltrated it. It's a very good Republic 2996 02:31:37,640 --> 02:31:40,240 Speaker 13: of story talking about the guy that will link in 2997 02:31:40,400 --> 02:31:44,519 Speaker 13: the show notes. 
So, like, the militia movement has survived, 2998 02:31:44,640 --> 02:31:46,280 Speaker 13: but the kind of stuff that like we saw in, 2999 02:31:46,400 --> 02:31:48,600 Speaker 13: like, twenty seventeen, twenty eighteen, twenty twenty, 3000 02:31:48,480 --> 02:31:49,240 Speaker 9: like, is not. 3001 02:31:49,400 --> 02:31:53,360 Speaker 17: Yeah, the Proud Boys, QAnon, the folks who 3002 02:31:53,480 --> 02:31:55,520 Speaker 17: made up J6 and the folks who made up 3003 02:31:55,600 --> 02:31:58,520 Speaker 17: the alt right, you know, broadly were largely defeated by 3004 02:31:58,520 --> 02:32:01,600 Speaker 17: anti-fascists in the street. And then the people who 3005 02:32:01,640 --> 02:32:04,480 Speaker 17: remained, QAnon folks, who were, I think, you know, some 3006 02:32:04,600 --> 02:32:07,600 Speaker 17: of those people were pretty hardcore neo Nazis obviously, but 3007 02:32:07,720 --> 02:32:10,520 Speaker 17: a lot of those folks were confused Internet boomers, right, 3008 02:32:10,959 --> 02:32:14,160 Speaker 17: and like those people mostly got discouraged by the repression. 3009 02:32:14,200 --> 02:32:16,920 Speaker 17: The repression I think successfully sort of put an end 3010 02:32:16,959 --> 02:32:18,160 Speaker 17: to that organized Q stuff. 3011 02:32:18,560 --> 02:32:18,760 Speaker 1: Yeah. 3012 02:32:18,800 --> 02:32:20,680 Speaker 13: Well, and also, and I also have talked about on 3013 02:32:20,720 --> 02:32:21,920 Speaker 13: this show, the other thing that put an end to 3014 02:32:22,000 --> 02:32:24,240 Speaker 13: that was that the Daily Wire figured out that you 3015 02:32:24,280 --> 02:32:26,400 Speaker 13: could use literally the exact same structure of Q 3016 02:32:26,520 --> 02:32:28,480 Speaker 13: Anon, but they make it about trans people. Yes, 3017 02:32:28,720 --> 02:32:31,040 Speaker 13: and that has been unbelievably effective.
3018 02:32:31,200 --> 02:32:34,600 Speaker 17: Now the strategy as a media strategy has continued, but 3019 02:32:34,760 --> 02:32:38,000 Speaker 17: as an on the ground organizing principle, it's not that functional. 3020 02:32:38,120 --> 02:32:40,800 Speaker 10: Yeah, it's not, which is very lucky. But what that 3021 02:32:40,920 --> 02:32:42,080 Speaker 10: means is that Trump has come to 3022 02:32:42,120 --> 02:32:45,600 Speaker 17: power without a ground movement in the same way that 3023 02:32:45,640 --> 02:32:47,640 Speaker 17: he had in twenty fifteen, twenty sixteen. Like that was 3024 02:32:47,640 --> 02:32:49,800 Speaker 17: a real movement. His rallies were really well attended. His 3025 02:32:49,959 --> 02:32:52,320 Speaker 17: rallies this election, people left early. You know, it was 3026 02:32:52,440 --> 02:32:54,600 Speaker 17: like, it was like going to see a losing team 3027 02:32:54,680 --> 02:32:56,480 Speaker 17: at their last home game of the season, you know, 3028 02:32:56,920 --> 02:32:58,400 Speaker 17: was the vibe at those rallies. 3029 02:32:58,800 --> 02:33:01,320 Speaker 13: Yeah, to use a very specific example, it's like the 3030 02:33:01,520 --> 02:33:04,400 Speaker 13: vibe is like the last games of the Oakland Athletics 3031 02:33:04,440 --> 02:33:06,960 Speaker 13: before they were fucking run out, by their 3032 02:33:06,959 --> 02:33:09,600 Speaker 13: owner, moved to Las Vegas, exactly, where like they've 3033 02:33:09,640 --> 02:33:13,119 Speaker 13: had an incredibly disappointing season, like deliberately, by the owner, 3034 02:33:13,160 --> 02:33:15,160 Speaker 13: who decided to make, who made a bad team so 3035 02:33:15,200 --> 02:33:17,440 Speaker 13: people wouldn't fight him, like, moving the team to Las Vegas, 3036 02:33:17,879 --> 02:33:19,039 Speaker 13: like it's like that kind of. 3037 02:33:19,160 --> 02:33:21,960 Speaker 17: Yeah, those are the vibes.
And yet of course the Democrats, 3038 02:33:22,000 --> 02:33:26,320 Speaker 17: in their infinite, infinite capacities, lost the election. And so 3039 02:33:27,080 --> 02:33:28,600 Speaker 17: what that means, though, is that, is that you have 3040 02:33:28,680 --> 02:33:31,600 Speaker 17: this moment where actually the right has as much power 3041 02:33:32,240 --> 02:33:34,920 Speaker 17: in the federal government as it's ever had. You know, 3042 02:33:35,000 --> 02:33:37,720 Speaker 17: the resistance is, you know, they, you know, they're very 3043 02:33:37,760 --> 02:33:40,640 Speaker 17: proud of legally handing power to the man and ending 3044 02:33:40,720 --> 02:33:41,840 Speaker 17: all of his charges or whatever. 3045 02:33:42,360 --> 02:33:45,080 Speaker 10: But the street movement is disorganized. 3046 02:33:45,240 --> 02:33:48,640 Speaker 17: So you have this gap between the two, where there's 3047 02:33:48,720 --> 02:33:52,920 Speaker 17: this really powerful media apparatus, Fox News, Truth Social, X 3048 02:33:53,000 --> 02:33:55,360 Speaker 17: the Everything App, you know, all of these, like, all 3049 02:33:55,400 --> 02:33:57,760 Speaker 17: these places where the fascists, you know, and I guess 3050 02:33:57,800 --> 02:34:00,280 Speaker 17: Meta has now just officially announced they're, like, going to 3051 02:34:00,400 --> 02:34:03,119 Speaker 17: remove all content restrictions or whatever today, 3052 02:34:03,240 --> 02:34:05,240 Speaker 10: I mean, you know, when we're recording this. So it's 3053 02:34:05,320 --> 02:34:06,280 Speaker 10: just, there's this 3054 02:34:06,520 --> 02:34:09,800 Speaker 17: huge spectacular apparatus, but there isn't this on the ground organization.
3055 02:34:09,920 --> 02:34:14,240 Speaker 17: You get people like this Green Beret who has been 3056 02:34:14,280 --> 02:34:19,560 Speaker 17: really radicalized, made angry, desperate, and like is blowing not 3057 02:34:19,720 --> 02:34:23,280 Speaker 17: even the Trump hotel up, which would be a nonsensical 3058 02:34:23,320 --> 02:34:25,840 Speaker 17: thing to do, but like literally failing to blow the 3059 02:34:25,879 --> 02:34:29,840 Speaker 17: Trump Hotel up in an attempt to start the race 3060 02:34:29,920 --> 02:34:33,560 Speaker 17: war by getting Democrats hung. So it's still kind of 3061 02:34:33,600 --> 02:34:36,640 Speaker 17: strategy of tension stuff, right, the imagination of, as you said, 3062 02:34:36,680 --> 02:34:38,880 Speaker 17: the Turner Diaries, or this sort of, like, you know, 3063 02:34:39,040 --> 02:34:41,760 Speaker 17: the right wing terror networks in the US. You know, 3064 02:34:41,840 --> 02:34:45,039 Speaker 17: there's a reason they're obsessed with attacking electrical power grids. 3065 02:34:45,120 --> 02:34:47,640 Speaker 10: Right, they think if they cause enough chaos, like, 3066 02:34:47,680 --> 02:34:49,880 Speaker 17: you will return everything to the Hobbesian world of 3067 02:34:49,920 --> 02:34:51,800 Speaker 17: all against all, and you'll get a race war and 3068 02:34:51,879 --> 02:34:54,560 Speaker 17: everything will fall apart, whatever. It's, you know, it's step one, 3069 02:34:55,000 --> 02:34:57,320 Speaker 17: kill my family, step two, question mark, question mark, step three, 3070 02:34:57,480 --> 02:34:58,760 Speaker 17: white supremacist revolution. 3071 02:34:59,160 --> 02:35:01,959 Speaker 10: It's horrifying. I mean, it's a horrifying, horrifying idea, but.
3072 02:35:02,040 --> 02:35:05,680 Speaker 17: That's happening in these groups that have really really they 3073 02:35:05,879 --> 02:35:08,800 Speaker 17: believe I think genuinely that, Like I think the right 3074 02:35:08,920 --> 02:35:12,440 Speaker 17: does not understand the difference between like Nancy Pelosi and 3075 02:35:12,480 --> 02:35:15,560 Speaker 17: Assata Shakur, Like they see them both as equally dangerous. 3076 02:35:15,600 --> 02:35:15,680 Speaker 5: Right. 3077 02:35:15,680 --> 02:35:16,600 Speaker 10: They hate Liz Cheney. 3078 02:35:16,760 --> 02:35:18,600 Speaker 17: Yeah, Like in the final days of the election, she was 3079 02:35:18,640 --> 02:35:20,200 Speaker 17: the person they were saying we're gonna go after her, 3080 02:35:20,640 --> 02:35:22,000 Speaker 17: like Liz Cheney like. 3081 02:35:22,160 --> 02:35:26,560 Speaker 13: Really like yeah, it's like the closest parallel I could 3082 02:35:26,560 --> 02:35:27,920 Speaker 13: think of is like there was a faction of 3083 02:35:27,959 --> 02:35:29,880 Speaker 13: people during the Cold War who thought that like the 3084 02:35:29,959 --> 02:35:33,640 Speaker 13: Sino-Soviet split between Russia and China was like faked, 3085 02:35:34,000 --> 02:35:36,640 Speaker 13: and like there were literally guys murdering each other, like 3086 02:35:36,800 --> 02:35:40,280 Speaker 13: Chinese and Russian troops were firing artillery at each other 3087 02:35:40,480 --> 02:35:43,080 Speaker 13: like on the border like in sixty nine, right, like 3088 02:35:43,360 --> 02:35:46,080 Speaker 13: and there were people who were convinced for the entire 3089 02:35:46,280 --> 02:35:50,359 Speaker 13: Cold War, even as like as China is invading Vietnam, 3090 02:35:50,879 --> 02:35:53,760 Speaker 13: were completely convinced that the entire thing is a ploy 3091 02:35:54,320 --> 02:35:56,800 Speaker 13: and that like secretly the USSR 3092 02:35:56,920 --> 02:35:59,400 Speaker 13: and
the PRC are working together, and these are not 3093 02:35:59,680 --> 02:36:01,440 Speaker 13: like you know, some random guys that's like, these are 3094 02:36:01,480 --> 02:36:03,039 Speaker 13: these are like the guys at like the peak 3095 02:36:03,080 --> 02:36:06,160 Speaker 13: of conservative power are absolutely convinced that this is true. 3096 02:36:06,160 --> 02:36:07,520 Speaker 13: And this is I think, yeah, like this this is 3097 02:36:07,520 --> 02:36:09,879 Speaker 13: the kind of thing we're in now. Just like these 3098 02:36:09,879 --> 02:36:12,160 Speaker 13: people are completely cooked. They don't they don't have any 3099 02:36:12,160 --> 02:36:15,080 Speaker 13: analytical ability whatsoever. They just they actually have drunk their 3100 02:36:15,120 --> 02:36:15,720 Speaker 13: own kool aid. 3101 02:36:16,000 --> 02:36:18,160 Speaker 10: There was just a scoop. Sorry, jumping in really quick. 3102 02:36:18,160 --> 02:36:19,560 Speaker 17: There was a scoop right before we got on to 3103 02:36:19,640 --> 02:36:22,680 Speaker 17: record that the Heritage Foundation, you know, authors of Project 3104 02:36:22,760 --> 02:36:23,160 Speaker 17: 2025. 3105 02:36:23,360 --> 02:36:25,680 Speaker 10: Their new big plan is to go after Wikipedia. 3106 02:36:26,240 --> 02:36:29,440 Speaker 17: They want to take down Wikipedia, like because because that's 3107 02:36:29,480 --> 02:36:32,160 Speaker 17: a place you can verify facts at, right, They've already 3108 02:36:32,160 --> 02:36:34,160 Speaker 17: got the Post, They've got the Times, Like, what are 3109 02:36:34,160 --> 02:36:35,840 Speaker 17: they gonna do? They gotta go after Wikipedia. This is 3110 02:36:35,879 --> 02:36:39,039 Speaker 17: the kind of like level of unreality they're trying to build. 3111 02:36:39,600 --> 02:36:41,560 Speaker 5: Yeah, and do you know what else builds a world 3112 02:36:41,720 --> 02:36:43,440 Speaker 5: of unreality and then attempts to sell it to you?
3113 02:36:43,760 --> 02:36:46,840 Speaker 1: Ooh, products and services that support this podcast. 3114 02:36:47,280 --> 02:36:59,800 Speaker 13: Yes we are back. I'm very proud of that one. 3115 02:36:59,840 --> 02:37:01,640 Speaker 13: That that's one of the best ones I've ever done. 3116 02:37:01,680 --> 02:37:03,560 Speaker 13: And I just completely off the top of my head, 3117 02:37:04,480 --> 02:37:06,000 Speaker 13: just came back better than ever. 3118 02:37:07,080 --> 02:37:08,240 Speaker 1: She's never been so back. 3119 02:37:09,480 --> 02:37:13,480 Speaker 13: So I want to move a little bit from the 3120 02:37:13,720 --> 02:37:16,080 Speaker 13: just what does the state look like, how pilled are these 3121 02:37:16,120 --> 02:37:19,120 Speaker 13: people kind of thing to I want to talk a 3122 02:37:19,160 --> 02:37:21,080 Speaker 13: bit about the sort of macro thing that's going on 3123 02:37:21,200 --> 02:37:23,880 Speaker 13: here because I think part of what's happening here and 3124 02:37:24,080 --> 02:37:26,640 Speaker 13: it's become kind of unfashionable in academia to talk about 3125 02:37:26,680 --> 02:37:31,320 Speaker 13: neoliberalism because everyone got obsessed with like the Chips Act 3126 02:37:31,520 --> 02:37:34,160 Speaker 13: and like the capacity of the state or whatever. But 3127 02:37:34,720 --> 02:37:36,800 Speaker 13: I think, actually, if you want to understand what's going 3128 02:37:36,800 --> 02:37:39,600 Speaker 13: on here, a good place to go is like going 3129 02:37:39,680 --> 02:37:42,680 Speaker 13: back here to David Graeber, and he has this line 3130 02:37:42,920 --> 02:37:46,520 Speaker 13: talking about neoliberalism. I think this might, God, I should 3131 02:37:46,520 --> 02:37:48,520 Speaker 13: have actually looked up where this quote is from before 3132 02:37:48,560 --> 02:37:51,040 Speaker 13: I quote it. I think it might be from The 3133 02:37:51,120 --> 02:37:54,760 Speaker 13: Shock of Victory.
But he has this line about how neoliberalism, 3134 02:37:55,040 --> 02:37:58,119 Speaker 13: when given a choice between making their system actually work 3135 02:37:58,520 --> 02:38:01,880 Speaker 13: and making it seem like alternatives to neoliberalism are impossible, 3136 02:38:01,920 --> 02:38:04,959 Speaker 13: it will always choose making the alternatives seem impossible, because 3137 02:38:04,959 --> 02:38:07,560 Speaker 13: that's what neoliberalism is, right. This is, you know, the 3138 02:38:07,680 --> 02:38:11,439 Speaker 13: sort of maxim of Margaret Thatcher: there is no alternative. 3139 02:38:12,000 --> 02:38:14,160 Speaker 13: It is a system that is designed to destroy all 3140 02:38:14,160 --> 02:38:16,879 Speaker 13: alternatives, you know, and this includes the possibility 3141 02:38:16,920 --> 02:38:21,040 Speaker 13: of a future. And the goal of this, and this 3142 02:38:21,160 --> 02:38:23,440 Speaker 13: is I think the sort of dominant affect of 3143 02:38:23,640 --> 02:38:27,440 Speaker 13: the Years of Lead Paint, is this induced helplessness. Yeah, 3144 02:38:27,480 --> 02:38:28,880 Speaker 13: this is something I would ask you about, 3145 02:38:29,040 --> 02:38:30,840 Speaker 13: the sort of like induced helplessness of this moment. 3146 02:38:32,360 --> 02:38:34,800 Speaker 10: Yeah, yeah, I was sort of vibing with what you're saying. 3147 02:38:34,800 --> 02:38:37,040 Speaker 17: But yeah, I think a lot of people online have 3148 02:38:37,160 --> 02:38:39,440 Speaker 17: accepted sort of, you know, don't give in in advance, right.
3149 02:38:39,560 --> 02:38:42,480 Speaker 17: But like, I think one big thing that has been 3150 02:38:42,720 --> 02:38:45,480 Speaker 17: part of the Biden like strategy of counter revolution and 3151 02:38:45,520 --> 02:38:47,000 Speaker 17: part of what's been going on over the last four years, 3152 02:38:47,000 --> 02:38:49,200 Speaker 17: but indeed over the last four decades as well as 3153 02:38:49,240 --> 02:38:52,200 Speaker 17: sort of part of neoliberalism, is like the idea that 3154 02:38:52,320 --> 02:38:56,119 Speaker 17: you actually really can't do stuff yourself. You need a market, 3155 02:38:56,560 --> 02:38:59,119 Speaker 17: you need assistance, you need a professional, you need an 3156 02:38:59,160 --> 02:39:03,119 Speaker 17: expert to make a choice, right, and any choice made otherwise, 3157 02:39:03,520 --> 02:39:04,879 Speaker 17: you know, is doomed to failure. 3158 02:39:05,040 --> 02:39:05,160 Speaker 4: Right. 3159 02:39:05,400 --> 02:39:08,560 Speaker 17: And I think part of why Trump feels to people, 3160 02:39:08,760 --> 02:39:13,080 Speaker 17: some people, like he's resisting neoliberalism is because he's like, no, no, no, 3161 02:39:13,200 --> 02:39:15,120 Speaker 17: I don't listen to experts. I don't listen to anyone 3162 02:39:15,200 --> 02:39:17,840 Speaker 17: except my gut. I just do what I want. Versus the 3163 02:39:18,040 --> 02:39:22,200 Speaker 17: incredibly exhausting and miserabilist strategy of the previous thirty years 3164 02:39:22,240 --> 02:39:24,600 Speaker 17: of politics, which is you get a ton of expert 3165 02:39:24,720 --> 02:39:27,640 Speaker 17: reviews and then you do a political change that moves 3166 02:39:27,680 --> 02:39:30,960 Speaker 17: things like twelve percent one way, you know, nudge politics 3167 02:39:31,000 --> 02:39:32,520 Speaker 17: as like Barack Obama loved or whatever. 3168 02:39:32,600 --> 02:39:34,800 Speaker 10: Right, So that's sort of like there's there's that sense.
3169 02:39:34,840 --> 02:39:37,039 Speaker 10: But then on the individual sense, it's. 3170 02:39:37,000 --> 02:39:40,920 Speaker 17: Also about distributing the workplaces and breaking down the possibility 3171 02:39:40,959 --> 02:39:43,600 Speaker 17: of labor solidarity, right, because part of what the sixties 3172 02:39:43,760 --> 02:39:45,840 Speaker 17: was and the reason the sixties lasted so long in 3173 02:39:45,879 --> 02:39:48,520 Speaker 17: Italy is because Italy had the biggest factories and had 3174 02:39:48,600 --> 02:39:50,960 Speaker 17: like the last in Western Europe. They had the 3175 02:39:51,080 --> 02:39:54,440 Speaker 17: last folks still becoming proletarians from peasantry, like coming up 3176 02:39:54,480 --> 02:39:58,000 Speaker 17: from Sicily. So they had these like massive, massive factories 3177 02:39:58,280 --> 02:40:00,880 Speaker 17: that had these like crazy strikes over and over again. So 3178 02:40:01,160 --> 02:40:04,200 Speaker 17: the distribution of labor, you know, with globalization, neoliberalism and 3179 02:40:04,200 --> 02:40:07,560 Speaker 17: blah blah blah, breaking down the labor workforce. Like, we also 3180 02:40:07,640 --> 02:40:11,360 Speaker 17: are very helpless individually in our workplaces, right, and like 3181 02:40:11,480 --> 02:40:13,640 Speaker 17: we go to the HR department to get help, right. 3182 02:40:13,800 --> 02:40:16,200 Speaker 10: Where we sort of get self care. We like work 3183 02:40:16,240 --> 02:40:16,720 Speaker 10: on ourselves. 3184 02:40:16,760 --> 02:40:19,000 Speaker 17: We get therapy, you know, and our boss offers us, 3185 02:40:19,080 --> 02:40:22,600 Speaker 17: you know, thoughts and prayers right when when things are hard. 3186 02:40:22,920 --> 02:40:28,200 Speaker 17: But like there's a big attempt to allow people to 3187 02:40:28,720 --> 02:40:31,440 Speaker 17: define themselves sort of the carrot.
The carrot of the 3188 02:40:31,480 --> 02:40:33,360 Speaker 17: sixties was like, you know, you get to like have 3189 02:40:33,480 --> 02:40:36,560 Speaker 17: an identity, like, Okay, we won't be officially racist. 3190 02:40:36,640 --> 02:40:39,440 Speaker 10: Yeah, quote unquote, you know, okay, we won't be officially sexist. 3191 02:40:39,480 --> 02:40:41,800 Speaker 17: And they claim, okay, whatever, none of that's true, but 3192 02:40:41,879 --> 02:40:44,440 Speaker 17: they but they sort of sell that, and then they say, 3193 02:40:44,720 --> 02:40:47,680 Speaker 17: but in return, you have to like do all of 3194 02:40:47,760 --> 02:40:48,280 Speaker 17: the self work. 3195 02:40:48,360 --> 02:40:50,360 Speaker 10: You have to be an identity in the marketplace. 3196 02:40:51,080 --> 02:40:54,680 Speaker 17: So basically you get exhausted because like even choosing what 3197 02:40:55,000 --> 02:40:59,160 Speaker 17: shoes to wear becomes like both an identity defining question 3198 02:40:59,640 --> 02:41:04,960 Speaker 17: and an exhausting slog through debt structures and infinite marketplaces, right, 3199 02:41:05,080 --> 02:41:08,280 Speaker 17: like yeah, and so that in, you know, spoonie world 3200 02:41:08,360 --> 02:41:10,560 Speaker 17: we called that sort of choice paralysis, right, And I think 3201 02:41:10,720 --> 02:41:13,520 Speaker 17: that's probably accepted as well, that like you have so 3202 02:41:13,680 --> 02:41:16,119 Speaker 17: much choice that you feel absolutely helpless in the face 3203 02:41:16,160 --> 02:41:18,680 Speaker 17: of it. You can't do anything, and so that produces 3204 02:41:18,720 --> 02:41:21,400 Speaker 17: a craving for authoritarianism, for authority, right. That's another thing 3205 02:41:21,440 --> 02:41:22,520 Speaker 17: people want is like. 3206 02:41:22,600 --> 02:41:24,720 Speaker 10: Someone else decide for me. I'm sick of thinking about this.
3207 02:41:25,800 --> 02:41:27,520 Speaker 13: Yeah, And that's I think been one of the most 3208 02:41:27,560 --> 02:41:30,800 Speaker 13: important aspects of everything that's been happening right now is 3209 02:41:30,879 --> 02:41:33,320 Speaker 13: this sort of strategy of exhaustion and this demand for 3210 02:41:33,440 --> 02:41:35,160 Speaker 13: someone else to make choices for you, to free you 3211 02:41:35,240 --> 02:41:38,200 Speaker 13: from this just like this endless nightmare of like trying 3212 02:41:38,240 --> 02:41:40,800 Speaker 13: to figure out which healthcare plan you're supposed to buy 3213 02:41:41,080 --> 02:41:41,680 Speaker 13: and shit. 3214 02:41:41,520 --> 02:41:43,320 Speaker 5: Like that and oh my god, you know, and the 3215 02:41:43,400 --> 02:41:46,360 Speaker 5: right has a bunch of alternatives here right with like 3216 02:41:46,440 --> 02:41:48,760 Speaker 5: this is the fantasy of what tradwives is. It's like 3217 02:41:48,800 --> 02:41:50,400 Speaker 5: what if someone else did your thinking for you. 3218 02:41:50,560 --> 02:41:54,200 Speaker 13: It's also the entire logic behind AI, right, and behind 3219 02:41:54,240 --> 02:41:56,360 Speaker 13: this sort of AI agents thing that they're like pushing 3220 02:41:56,440 --> 02:41:58,840 Speaker 13: right now. Go listen to our CES coverage and you'll 3221 02:41:58,920 --> 02:41:59,640 Speaker 13: hear much about it. 3222 02:42:00,200 --> 02:42:02,640 Speaker 1: Is like what if someone just like planned your life 3223 02:42:02,720 --> 02:42:02,920 Speaker 1: for you? 3224 02:42:03,040 --> 02:42:03,160 Speaker 4: Right? 3225 02:42:03,640 --> 02:42:04,880 Speaker 13: What if you could talk to a machine and it 3226 02:42:04,879 --> 02:42:06,320 Speaker 13: would plan all your trips and it would tell you 3227 02:42:06,360 --> 02:42:07,760 Speaker 13: what to eat, and it would tell you how to live.
3228 02:42:07,840 --> 02:42:10,600 Speaker 5: And this is you know, this is also the structure 3229 02:42:10,600 --> 02:42:11,400 Speaker 5: of how cults work. 3230 02:42:11,640 --> 02:42:13,039 Speaker 1: Like this is why cults have been able to. 3231 02:42:13,760 --> 02:42:17,040 Speaker 13: Attract people that, like, with the media conception of 3232 02:42:17,080 --> 02:42:18,560 Speaker 13: cults, you wouldn't think would be in them. That's why 3233 02:42:18,560 --> 02:42:21,200 Speaker 13: there are so many engineers in cults, because they're like a 3234 02:42:21,200 --> 02:42:23,120 Speaker 13: bunch of people who have to make choices constantly, and 3235 02:42:23,480 --> 02:42:25,000 Speaker 13: the cult is like, hey, what if I just like 3236 02:42:25,120 --> 02:42:28,160 Speaker 13: made all of these choices for you? And this is ultimately, 3237 02:42:28,240 --> 02:42:29,600 Speaker 13: you know, we talked about this a little bit before. 3238 02:42:29,640 --> 02:42:32,440 Speaker 13: This is ultimately part of what's going on with like 3239 02:42:32,520 --> 02:42:35,879 Speaker 13: Trumpism, right, because Trump is also to some extent, 3240 02:42:36,000 --> 02:42:37,920 Speaker 13: like if you're in this movement, like you no longer 3241 02:42:38,000 --> 02:42:38,800 Speaker 13: have to choose anymore. 3242 02:42:38,959 --> 02:42:40,720 Speaker 1: You just you know, here is the guy. The guy 3243 02:42:40,840 --> 02:42:41,760 Speaker 1: is going to do the thing for you.
3244 02:42:41,840 --> 02:42:45,360 Speaker 13: This is also if you go back to your original 3245 02:42:45,480 --> 02:42:48,440 Speaker 13: sort of conceptions of fascism, right, it's about the sort 3246 02:42:48,440 --> 02:42:52,080 Speaker 13: of populace delegates their will into the single heroic individual, 3247 02:42:52,120 --> 02:42:54,600 Speaker 13: and the single heroic individual like acts outside of the 3248 02:42:55,240 --> 02:42:56,840 Speaker 13: bonds of the system in order to preserve it and 3249 02:42:56,920 --> 02:42:59,320 Speaker 13: like does all this stuff for you. And I think 3250 02:42:59,360 --> 02:43:03,520 Speaker 13: there's a combination of that with this sort of paralysis 3251 02:43:04,320 --> 02:43:07,960 Speaker 13: and exhaustion, particularly like exhaustion and anxiety also, and this 3252 02:43:08,040 --> 02:43:09,920 Speaker 13: is something that is very well documented that, you know, 3253 02:43:10,000 --> 02:43:11,480 Speaker 13: we aren't going to get into fully here. But 3254 02:43:11,920 --> 02:43:13,320 Speaker 13: all of the stuff we've been talking about about the 3255 02:43:13,360 --> 02:43:16,400 Speaker 13: information space, where there's this constant deluge of just nonsense. 3256 02:43:16,440 --> 02:43:20,080 Speaker 13: And that's designed specifically, not even necessarily to convince you 3257 02:43:20,160 --> 02:43:21,880 Speaker 13: that something is true, but to convince you that it's 3258 02:43:21,920 --> 02:43:24,440 Speaker 13: impossible to figure out what is happening and to make 3259 02:43:24,480 --> 02:43:27,360 Speaker 13: you just give up.
And when you're refusing to make 3260 02:43:27,520 --> 02:43:30,840 Speaker 13: a choice between like was there a gas attack in 3261 02:43:30,920 --> 02:43:32,920 Speaker 13: Syria or was it like staged by the rebels as 3262 02:43:32,920 --> 02:43:35,600 Speaker 13: a false flag, right, your refusing to make the choice 3263 02:43:36,160 --> 02:43:38,640 Speaker 13: has the effect of legitimizing both of them and also 3264 02:43:38,760 --> 02:43:40,920 Speaker 13: removes you from sort of the field of play of action. 3265 02:43:41,879 --> 02:43:43,480 Speaker 13: And this has been a really important part of this 3266 02:43:43,600 --> 02:43:45,320 Speaker 13: to sort of demobilize the left. Like it's part of 3267 02:43:45,440 --> 02:43:48,440 Speaker 13: what the sort of Tulsi Gabbard gambit was, right, was 3268 02:43:48,480 --> 02:43:49,960 Speaker 13: that you could take a bunch of this sort of 3269 02:43:50,040 --> 02:43:53,560 Speaker 13: like rising nominally anti imperialist thing and you could just 3270 02:43:53,959 --> 02:43:56,640 Speaker 13: do this shit to them. And you know, now, Tulsi 3271 02:43:56,640 --> 02:43:59,920 Speaker 13: Gabbard is like one of the big people in Trump world, right. 3272 02:44:00,080 --> 02:44:02,840 Speaker 13: I think, what's his name? I disrespect him by not 3273 02:44:02,959 --> 02:44:04,800 Speaker 13: remembering his name, but I should, for the podcast. Steve 3274 02:44:04,840 --> 02:44:06,760 Speaker 13: Bannon put it well when he said, just flood the 3275 02:44:06,840 --> 02:44:09,320 Speaker 13: zone with shit, right. It's sort of the strategy. You 3276 02:44:09,520 --> 02:44:13,400 Speaker 13: just release so much terrible information that it doesn't matter.
3277 02:44:13,879 --> 02:44:16,440 Speaker 13: And this is how Trump also like kept ahead of 3278 02:44:16,520 --> 02:44:19,000 Speaker 13: his you know, many scandals, as he would just like say 3279 02:44:19,040 --> 02:44:21,320 Speaker 13: the next most outrageous thing, and you know, you'd have 3280 02:44:21,400 --> 02:44:23,440 Speaker 13: to commit to responding to one, but he was already 3281 02:44:23,440 --> 02:44:25,000 Speaker 13: at the next thing. And it was just a sort 3282 02:44:25,040 --> 02:44:29,840 Speaker 13: of like amplifying, amplifying wave of like chaos and nonsense 3283 02:44:30,400 --> 02:44:32,720 Speaker 13: that you eventually, yeah, you get bowled over by it, 3284 02:44:32,800 --> 02:44:35,080 Speaker 13: you get exhausted. And I think, you know, you mentioned 3285 02:44:35,120 --> 02:44:38,200 Speaker 13: healthcare markets, and I think, like that's really that's really 3286 02:44:38,360 --> 02:44:40,600 Speaker 13: telling too, because we've just like lived through a pandemic. 3287 02:44:40,879 --> 02:44:43,560 Speaker 13: We're in the midst of a pandemic. Covid is in 3288 02:44:43,640 --> 02:44:45,840 Speaker 13: another wave that like no one has named right now, 3289 02:44:46,320 --> 02:44:50,200 Speaker 13: and no one even mentioned healthcare, let alone the pandemic, 3290 02:44:50,360 --> 02:44:52,800 Speaker 13: during the election of twenty twenty four.
Yeah, So part 3291 02:44:52,840 --> 02:44:55,240 Speaker 13: of what's been going on too is that there has 3292 02:44:55,320 --> 02:44:57,880 Speaker 13: been this mass push by the Biden administration of the 3293 02:44:57,959 --> 02:45:01,320 Speaker 13: Democrats to make us forget what happened in twenty twenty 3294 02:45:01,920 --> 02:45:03,560 Speaker 13: in terms of the uprising, yeah, and then make us 3295 02:45:03,560 --> 02:45:07,439 Speaker 13: forget the pandemic, which is so unpopular and which continuing 3296 02:45:07,560 --> 02:45:11,920 Speaker 13: to actually prevent would have done significant damage to the economy. Right, 3297 02:45:12,000 --> 02:45:13,400 Speaker 13: it was already pretty bad for it, and it would 3298 02:45:13,440 --> 02:45:15,200 Speaker 13: have continued to get worse. So everyone had to be 3299 02:45:15,240 --> 02:45:16,000 Speaker 13: forced back to work. 3300 02:45:16,080 --> 02:45:19,080 Speaker 17: How do you force people back to work who evidently 3301 02:45:19,160 --> 02:45:22,480 Speaker 17: care about each other and their own safety? You lie to them, 3302 02:45:22,640 --> 02:45:24,800 Speaker 17: you confuse them about what's actually going on. Right, So 3303 02:45:25,240 --> 02:45:28,360 Speaker 17: there's been this huge priming of the pump for this 3304 02:45:28,560 --> 02:45:31,840 Speaker 17: strategy by Biden and the Democrats, and by our own 3305 02:45:31,920 --> 02:45:34,520 Speaker 17: exhaustion over the pandemic and the fact that we had 3306 02:45:34,520 --> 02:45:35,840 Speaker 17: to go back to work, so we had to get 3307 02:45:35,879 --> 02:45:38,160 Speaker 17: over the cognitive dissonance of that.
So all of these 3308 02:45:38,200 --> 02:45:43,280 Speaker 17: factors together have produced a psychic stew culturally in which 3309 02:45:43,520 --> 02:45:47,080 Speaker 17: people are very susceptible to just throwing up their hands 3310 02:45:47,120 --> 02:45:48,760 Speaker 17: and going, I don't know, whatever. 3311 02:45:49,720 --> 02:45:49,920 Speaker 14: Yeah. 3312 02:45:50,040 --> 02:45:52,400 Speaker 13: But on the other hand, the strategy of the Years 3313 02:45:52,440 --> 02:45:54,400 Speaker 13: of Lead was a strategy born of strength, right. The 3314 02:45:54,480 --> 02:45:58,160 Speaker 13: Years of Lead Paint? This is not a strategy built 3315 02:45:58,280 --> 02:46:01,400 Speaker 13: by people who have an incredibly solid grasp on power. 3316 02:46:01,680 --> 02:46:01,800 Speaker 6: Right. 3317 02:46:02,440 --> 02:46:05,760 Speaker 13: The actual base that put Trump in power, right, and 3318 02:46:05,840 --> 02:46:08,920 Speaker 13: their actual political base is incredibly brittle. Right, they are 3319 02:46:09,040 --> 02:46:12,960 Speaker 13: about to tank the entire global economy like by 3320 02:46:13,080 --> 02:46:15,240 Speaker 13: putting like fifty percent tariffs on like every 3321 02:46:15,280 --> 02:46:18,480 Speaker 13: single country in the world. Okay, let's let's be 3322 02:46:18,480 --> 02:46:21,800 Speaker 13: accurate here. That's on Chinese, Mexican and Canadian goods, 3323 02:46:21,840 --> 02:46:23,760 Speaker 13: which is like, okay, like I'm gonna I'm gonna ask you, 3324 02:46:23,840 --> 02:46:25,480 Speaker 13: as an exercise to the reader, to go look up 3325 02:46:25,520 --> 02:46:27,240 Speaker 13: the places that the US imports things from. 3326 02:46:29,240 --> 02:46:31,400 Speaker 5: Right, So, like, you know, this is how you resist. 3327 02:46:33,280 --> 02:46:34,400 Speaker 5: This is, this is how you resist.
3328 02:46:34,440 --> 02:46:36,600 Speaker 13: Here, your learned helplessness, is by going and researching 3329 02:46:36,640 --> 02:46:40,359 Speaker 13: things for yourself. But you know, they're about to annihilate 3330 02:46:40,400 --> 02:46:42,480 Speaker 13: the entire economy when the thing that brought them into power 3331 02:46:42,600 --> 02:46:46,240 Speaker 13: was fury at rising prices. Right, these fucking arrogant bastards 3332 02:46:46,400 --> 02:46:48,520 Speaker 13: have sown the winds and they are going to reap 3333 02:46:48,560 --> 02:46:51,480 Speaker 13: the fucking whirlwinds. The basis of this fucking, of this 3334 02:46:51,760 --> 02:46:54,200 Speaker 13: entire strategy, you know, and I ask you this, like, 3335 02:46:54,280 --> 02:46:57,120 Speaker 13: dear listener, do you think these people can hold three 3336 02:46:57,240 --> 02:47:00,360 Speaker 13: hundred and thirty million people in line by sheer force? 3337 02:47:01,080 --> 02:47:01,800 Speaker 1: No, of course not. 3338 02:47:01,879 --> 02:47:03,520 Speaker 13: There's no fucking way. This is the most heavily armed 3339 02:47:03,520 --> 02:47:05,400 Speaker 13: population that has ever existed in human history. 3340 02:47:05,720 --> 02:47:05,879 Speaker 1: Right. 3341 02:47:07,160 --> 02:47:11,480 Speaker 13: This strategy is a strategy that is built around getting 3342 02:47:11,520 --> 02:47:14,240 Speaker 13: your compliance. Yes. And if they can't get your compliance 3343 02:47:14,320 --> 02:47:16,840 Speaker 13: by you agreeing with them, they're going to attempt to 3344 02:47:16,879 --> 02:47:19,400 Speaker 13: get your compliance by just taking you out of the equation. 3345 02:47:19,640 --> 02:47:19,760 Speaker 6: Right. 3346 02:47:20,200 --> 02:47:22,440 Speaker 13: They need you scared, They need you confused, They need 3347 02:47:22,480 --> 02:47:25,520 Speaker 13: you completely convinced of your own helplessness.
They need you 3348 02:47:25,640 --> 02:47:28,280 Speaker 13: to forget that, as the old song says, in your 3349 02:47:28,440 --> 02:47:30,959 Speaker 13: hands is placed a power greater than their hoarded gold, 3350 02:47:31,440 --> 02:47:34,680 Speaker 13: greater than the might of armies magnified a thousandfold. They 3351 02:47:34,720 --> 02:47:36,440 Speaker 13: need you to forget the next line of the song, 3352 02:47:36,920 --> 02:47:39,040 Speaker 13: which goes, we can bring to birth a new world 3353 02:47:39,240 --> 02:47:42,640 Speaker 13: from the ashes of the old when the union makes 3354 02:47:42,720 --> 02:47:46,160 Speaker 13: us strong. And this is the entire fucking thing, right, 3355 02:47:46,520 --> 02:47:48,800 Speaker 13: If these people were actually strong, they would not need 3356 02:47:48,920 --> 02:47:51,680 Speaker 13: an entire strategy that was based around political demobilization. 3357 02:47:52,360 --> 02:47:53,080 Speaker 10: Yeah, exactly. 3358 02:47:53,879 --> 02:47:55,280 Speaker 11: And the thing is right. 3359 02:47:55,600 --> 02:47:58,280 Speaker 13: The thing about this moment is that basically everyone is 3360 02:47:58,320 --> 02:48:03,200 Speaker 13: incredibly disorganized. However, that means that just literally any 3361 02:48:03,360 --> 02:48:06,600 Speaker 13: random person can just take the things that you know 3362 02:48:06,720 --> 02:48:10,840 Speaker 13: how to do and start organizing. The system is designed 3363 02:48:10,959 --> 02:48:13,000 Speaker 13: to make sure that you don't do that. And guess what, 3364 02:48:13,200 --> 02:48:16,880 Speaker 13: it's not very hard for you to pick up the 3365 02:48:16,959 --> 02:48:18,360 Speaker 13: things that you know how to do, for you to 3366 02:48:18,520 --> 02:48:20,680 Speaker 13: use the relationships and people you know in your life 3367 02:48:20,959 --> 02:48:23,000 Speaker 13: to get together with them and to go do things.
3368 02:48:23,520 --> 02:48:27,039 Speaker 13: And they are fucking terrified of this. Yes, their entire 3369 02:48:27,120 --> 02:48:29,480 Speaker 13: strategy is to make sure that you simply do not do this. 3370 02:48:29,720 --> 02:48:32,560 Speaker 13: And every single one of you has the power to 3371 02:48:32,720 --> 02:48:35,240 Speaker 13: do this. And I know this because I also was 3372 02:48:35,320 --> 02:48:37,880 Speaker 13: just some random dipshit. Like I was just literally a 3373 02:48:38,000 --> 02:48:41,600 Speaker 13: random college student, right, Like, I was just some asshole, 3374 02:48:41,840 --> 02:48:43,560 Speaker 13: and I just started doing things right. And I got 3375 02:48:43,600 --> 02:48:45,560 Speaker 13: together with my friends and we fucking we made a 3376 02:48:45,680 --> 02:48:47,720 Speaker 13: tenants union and we did anti-ICE stuff, and we 3377 02:48:47,760 --> 02:48:50,120 Speaker 13: did all of this shit. And it wasn't that like 3378 02:48:50,200 --> 02:48:51,840 Speaker 13: any of us are any different than you. We just, 3379 02:48:52,200 --> 02:48:54,160 Speaker 13: you know, decided one day we were going to do it. 3380 02:48:54,200 --> 02:48:57,640 Speaker 13: And, to return one last time to David Graeber, 3381 02:48:57,959 --> 02:49:00,240 Speaker 13: one of one of his most famous quotes is, the ultimate 3382 02:49:00,360 --> 02:49:02,199 Speaker 13: hidden truth of this world is that it is something 3383 02:49:02,240 --> 02:49:04,680 Speaker 13: that we make and could just as easily make differently. 3384 02:49:05,640 --> 02:49:08,920 Speaker 13: And everyone who is in power right now is absolutely 3385 02:49:09,040 --> 02:49:11,680 Speaker 13: terrified of the idea of you making this world differently, 3386 02:49:11,840 --> 02:49:12,960 Speaker 13: and together we can do that. 3387 02:49:13,840 --> 02:49:15,200 Speaker 10: Yes, that's exactly right.
3388 02:49:15,280 --> 02:49:17,360 Speaker 17: And another thing that I think is really powerful about 3389 02:49:17,400 --> 02:49:20,800 Speaker 17: getting started in that way is that all of those 3390 02:49:20,920 --> 02:49:25,360 Speaker 17: false choices they become so much less important. And actually, 3391 02:49:25,440 --> 02:49:27,600 Speaker 17: when you have a real goal that you and your 3392 02:49:27,640 --> 02:49:30,240 Speaker 17: friends have made together, that you're building towards, it's actually 3393 02:49:30,240 --> 02:49:32,800 Speaker 17: a lot easier to make choices, to make decisions, yeah, 3394 02:49:32,800 --> 02:49:34,760 Speaker 17: because you would know what you need for the next step, 3395 02:49:34,879 --> 02:49:36,200 Speaker 17: or you'll have an idea of it. You might make 3396 02:49:36,240 --> 02:49:38,800 Speaker 17: a mistake, you might be wrong, but each step along 3397 02:49:38,879 --> 02:49:41,280 Speaker 17: that way, like, it's an easier way to do this 3398 02:49:41,480 --> 02:49:44,480 Speaker 17: and to feel the power of real choices rather than 3399 02:49:44,520 --> 02:49:47,640 Speaker 17: the false choices of like do you want your AI 3400 02:49:48,080 --> 02:49:52,400 Speaker 17: from Grok or do you want it from ChatGPT, Right, 3401 02:49:52,840 --> 02:49:55,960 Speaker 17: And obviously like that's a joke, but it's true that 3402 02:49:56,640 --> 02:50:00,760 Speaker 17: they aren't offering us anything anymore. They they have decided, 3403 02:50:00,879 --> 02:50:03,440 Speaker 17: they have decided that what we get is stomped. We 3404 02:50:03,520 --> 02:50:05,640 Speaker 17: get stomped on. That's what they've agreed to give us, 3405 02:50:06,000 --> 02:50:09,240 Speaker 17: is like getting stomped on.
Like, okay, that was always 3406 02:50:09,520 --> 02:50:11,640 Speaker 17: what they wanted to give us in the past, but 3407 02:50:12,040 --> 02:50:15,279 Speaker 17: they might learn very, very quickly, in reaping the whirlwind, 3408 02:50:15,640 --> 02:50:19,920 Speaker 17: that the reason that a century of American politicians have 3409 02:50:20,160 --> 02:50:23,440 Speaker 17: tipped their hat to democratic norms and have tried really 3410 02:50:23,520 --> 02:50:27,600 Speaker 17: hard to preserve the niceties of the government is because 3411 02:50:28,120 --> 02:50:31,600 Speaker 17: they have a slightly fresher memory of the French Revolution 3412 02:50:31,720 --> 02:50:34,320 Speaker 17: and the guillotines, which haunt them, or the Haitian Revolution, 3413 02:50:34,440 --> 02:50:36,760 Speaker 17: which is the real fear lurking behind the fear of 3414 02:50:36,800 --> 02:50:39,800 Speaker 17: the French. Yeah, when the slaves rose up and destroyed 3415 02:50:39,879 --> 02:50:42,520 Speaker 17: the sugar plantations of Haiti, and it has been punished 3416 02:50:42,520 --> 02:50:43,000 Speaker 17: ever since. 3417 02:50:43,520 --> 02:50:44,920 Speaker 10: The point being that. 3418 02:50:45,480 --> 02:50:49,720 Speaker 17: These things that they are doing, this overwhelming, this flooding the zone 3419 02:50:49,800 --> 02:50:52,320 Speaker 17: with shit, as Mia says, is from a position of 3420 02:50:52,360 --> 02:50:56,240 Speaker 17: weakness, because when they were strong, when they were strong, 3421 02:50:56,320 --> 02:50:58,800 Speaker 17: Obama was a sign of strength. 
We can 3422 02:50:58,840 --> 02:51:01,600 Speaker 17: elect a black person, a black man, in this racist country, 3423 02:51:01,959 --> 02:51:04,600 Speaker 17: and he can just go on hope and he can 3424 02:51:04,840 --> 02:51:07,480 Speaker 17: actually make very few changes and he'll still be incredibly popular, 3425 02:51:08,040 --> 02:51:11,440 Speaker 17: like, even through a huge economic collapse. Right, that was 3426 02:51:11,520 --> 02:51:14,320 Speaker 17: a sort of strong gesture. Trump is a sign of 3427 02:51:14,520 --> 02:51:18,840 Speaker 17: real senescence, and I use the phrase advisedly. And there 3428 02:51:18,959 --> 02:51:20,440 Speaker 17: are a lot of holes. 3429 02:51:20,640 --> 02:51:22,520 Speaker 10: And they have drunk the Kool-Aid. The right has 3430 02:51:22,600 --> 02:51:23,320 Speaker 10: drunk the Kool-Aid. 3431 02:51:23,560 --> 02:51:28,080 Speaker 17: They don't know the difference between Democrats and anarchists, not really, 3432 02:51:28,360 --> 02:51:31,080 Speaker 17: they genuinely don't really know the difference. Some of them do, 3433 02:51:31,280 --> 02:51:33,880 Speaker 17: their philosophers do, but the main ones on the street 3434 02:51:33,920 --> 02:51:36,080 Speaker 17: have no idea about the difference. That gives us a 3435 02:51:36,160 --> 02:51:38,600 Speaker 17: lot of space to move. That gives us a lot 3436 02:51:38,680 --> 02:51:41,680 Speaker 17: of space to take action, to build things that are 3437 02:51:41,680 --> 02:51:45,560 Speaker 17: invisible to them, and that might be invisible to social media, 3438 02:51:46,040 --> 02:51:49,800 Speaker 17: which is a place built around reinforcing our helplessness. 
In 3439 02:51:49,879 --> 02:51:52,440 Speaker 17: many ways, the strategies we have to take will be 3440 02:51:52,680 --> 02:51:54,920 Speaker 17: less visible, in many ways, I think, than they were 3441 02:51:55,080 --> 02:51:56,640 Speaker 17: in previous times, and they're going to have to be, 3442 02:51:56,720 --> 02:52:00,800 Speaker 17: of necessity, because MAGA is basically, you know, it's the 3443 02:52:00,879 --> 02:52:03,000 Speaker 17: Eye of Sauron, and if it lands on you, like, 3444 02:52:03,360 --> 02:52:06,360 Speaker 17: you're in trouble. But if it doesn't, like, you can 3445 02:52:06,480 --> 02:52:07,920 Speaker 17: just kind of move, and if you don't, you know, 3446 02:52:08,080 --> 02:52:11,880 Speaker 17: run into any trouble, like, you can get a 3447 02:52:11,920 --> 02:52:13,720 Speaker 17: lot done. I think that's as much as I'll say 3448 02:52:13,720 --> 02:52:16,120 Speaker 17: about that. But there's a lot to do, and there's 3449 02:52:16,160 --> 02:52:17,960 Speaker 17: a lot of movements to make and a lot of 3450 02:52:18,000 --> 02:52:20,520 Speaker 17: building to do that will both give you a sense 3451 02:52:20,560 --> 02:52:23,440 Speaker 17: of power and solve these big problems for you and 3452 02:52:23,520 --> 02:52:26,640 Speaker 17: your community. And if enough people start doing that, then 3453 02:52:26,680 --> 02:52:28,280 Speaker 17: that will take away all their power. 3454 02:52:30,959 --> 02:52:34,120 Speaker 1: Hey, we'll be back Monday with more episodes every week 3455 02:52:34,240 --> 02:52:36,080 Speaker 1: from now until the heat death of the universe. 3456 02:52:36,720 --> 02:52:39,160 Speaker 4: It Could Happen Here is a production of Cool Zone Media. 
3457 02:52:39,360 --> 02:52:42,400 Speaker 4: For more podcasts from Cool Zone Media, visit our website 3458 02:52:42,520 --> 02:52:46,039 Speaker 4: coolzonemedia dot com, or check us out on the iHeartRadio app, 3459 02:52:46,160 --> 02:52:49,680 Speaker 4: Apple Podcasts, or wherever you listen to podcasts. You can 3460 02:52:49,760 --> 02:52:52,080 Speaker 4: now find sources for It Could Happen Here listed directly 3461 02:52:52,120 --> 02:52:54,360 Speaker 4: in episode descriptions. Thanks for listening.