Speaker 1: Welcome to Worst Year Ever, a production of iHeartRadio. Together, everything... so don't, don't, don't. Oh, I did my part today. What else is going on? I don't know. I've done all of my work for the week. Oh good, good, good job. What are you gonna do with your time now? I don't know. Maybe fish? What kind of fish? Um, I don't know. Like, uh, whatever's in the creek. I'll just throw needles in the creek until I get a fish. Yeah. So you fish by tossing needles, like, like shooting them, kind of? Like, just throwing handfuls of them, just handfuls of needles into a creek until I get a fish. Okay, so it's like a trap, kind of. Like, you put it down and then hope they swim into the needle. Yeah, they swim into the needles. Like how we used to fish. Yeah, like the old times. Yes, you see, like the ancient ancients. Yeah, in the old days they used to fish with needles. That's how it works. Yeah, basic anthropology here. Well, it's so nice that before we've even started recording, you're done with your work for the week. That's so nice for you. Yeah, no, I'm done. All right, well, you guys have a good recording. Wait... oh wait, wait, wait. Oh, how was your weekend? Can I fish too? It was... how was my weekend? What did I do? I don't even know. You came to a party at my house. I did go to a party at your house, right? How much... time is meaningless now. Yeah. Yeah. How was your weekend? Did you go to a party at your house? I did. I threw a party at my house. It's tiring. It's fun. It was tiring. Parties, am I right? Anyway, Sophie, how was your weekend? This is fun. This is what people tune in for, right? I got my flu shot and then I got a fever, and my arm hurt. But I got my flu shot. She gives me fever. Thanks. Fever... fever when you touch me, fever when you hold me tight. My goal is to just sing songs on most episodes. Remember when this was a news podcast? Barely.
I've been thinking about that. I've been thinking about the evolution of this show a lot. There's a lot... the devolution, that's the right term, because now there's a lot of crossover between what we're doing and all the various other projects. But it's such a nice podcast. Too many podcasts. But, like, I like this group of people, you know, and so it's fun. I like talking about the news with you guys. But it is interesting to be like, what are we gonna talk about this week? How do we space out the other shows? Who has the time? And we all show up and we're like, hey, it's friends that we get to talk to about chaos. Um. But I was messaging with Robert at some point... I don't know, time is funny like that, um... and recalling that the original mandate for the show was to unpack the election and to travel around and talk to people, and now I mostly talk to you from here. I am hopeful that we can start to branch out and do some projects like that here, maybe one day. You guys want to talk quickly about the giant tungsten cube? Um, more than anything. So, I got no plans. In the crypto world, oh, a minor tungsten shortage, at least within one or two companies, has been caused because, for reasons of basically just a meme, crypto bros have been buying tungsten cubes. Tungsten is an extremely dense metal. You make, like, armor-piercing ammo out of it, for one thing. Because, like, a little bit of tungsten... like, a four-inch tungsten cube is like forty pounds. It's just, like, a weird, dense metal. Some people have been buying cubes just because they think it's neat, and it became kind of a crypto meme, and so this company, Midwest Tungsten, decided to capitalize on it by making the biggest tungsten cube they possibly could, which was, like... I'm trying to look at it... it's, uh, fourteen inches around, and it weighs almost two thousand pounds.
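Those two weights are consistent with tungsten's density. Here's a quick back-of-the-envelope check in Python (a minimal sketch, assuming solid cubes of pure tungsten at its standard density of roughly 19.3 g/cm³, and reading the "fourteen inches" as the cube's side length; the constants and function name are illustrative, not anything from the episode):

    # Sanity-check the cube weights quoted above.
    # Assumption: solid, pure tungsten at ~19.3 g/cm^3 (standard density).
    TUNGSTEN_G_PER_CM3 = 19.3
    CM_PER_INCH = 2.54
    GRAMS_PER_POUND = 453.6

    def tungsten_cube_pounds(side_inches: float) -> float:
        """Weight in pounds of a solid tungsten cube with the given side length."""
        volume_cm3 = (side_inches * CM_PER_INCH) ** 3
        return volume_cm3 * TUNGSTEN_G_PER_CM3 / GRAMS_PER_POUND

    print(round(tungsten_cube_pounds(4)))   # ~45 lbs: "like forty pounds"
    print(round(tungsten_cube_pounds(14)))  # ~1913 lbs: "almost two thousand pounds"

So a fourteen-inch tungsten cube really does come out to just under a ton.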
They're selling it, but because it would be a nightmare to actually move anywhere, they're not really selling it. They're selling it as an NFT, um, starting at two grand. It will probably go for millions, um. But you don't actually own the cube. But if you buy the cube as an NFT, you can come see it once a year at their office. A time share. Yeah. Well, no, only one person gets to see it once a year, and if you later sell it, if that person has already visited and touched the cube earlier that year, you can't do it for a whole other year. I don't understand. I'm going to be honest, and I'm sure a lot of people will lose respect for me: I don't understand it. I don't understand. I don't understand bitcoin mining. I don't understand. Okay, so here's what you need to understand. If you made money in illegal drugs, or if you made it and you just want to avoid taxes on it, if you've committed a bunch of crimes and you need a way to, like, kind of wash and scrub your money the way you would your laundry, you can use crypto to do that. And that's why... that's what all of these, like, giant crypto transactions are: somebody trying to launder money. Um, it's all a big fucking sketchy grift propped up by a handful of true believers and a bunch more people who recognize that pretending to be true believers is a great way to keep a pretty good money laundering scheme going. Um, so that's all you need to understand. Yeah, nothing else matters. Um, we did a two-parter about it that will come out, that everyone's going to find as frustrating as I found researching it. But it's just dumb, and it's stupid. Um... yep. On Bastards? Or one of our shows? On Bastards. Yeah. It's just all dumb. But the fucking... yeah, the fucking cube. It's just, like... it's the fucking cube. Yeah. Good, good for you guys. Like, that's what you want to do.
And that's... because, like, money is stupid and fake. It is. It always is. Like, currency is fiat currency. That's why, like, with the fucking precious metals, people are just like, okay, well, what's gold worth, really? What's gold worth? Unless you're making a space suit, gold's worth nothing. Like, exactly. So, like, whatever. And this kind of, like, points to that. So in a way, I'm like, good for you guys, you're illustrating how stupid money is. Um. But maybe the cube isn't worth it. Yeah, maybe the cube isn't worth it. Maybe the fucking terrible ape drawings aren't worth it. Maybe the automatically generated lion pictures aren't worth hundreds of thousands or millions of dollars. Maybe it's all very dumb. But at the same time, if people will pay for something, that's what it's worth, because that's how capitalism works, which is why Tim Pool owns a big house. And Hasan owns a big house too. It's whatever. It's the entire... it's the system we live under. The two genders: Hasan and... The reality is that NFTs are really, really dumb, but also, when you really think about it, not inherently dumber than the rest of currency. It's a sideshow. Yeah. Read David Graeber's Debt: The First 5,000 Years. Um, I have to. Yeah, you'll like it. It's accessibly written. It will clear up an enormous number of important questions you have about the way that currency works. And it's also very well written, because David Graeber was a genius. Um. I'm either getting less smart, or I'm just so oversaturated that it's hard to sit down and read things. But that's the recommendation, and I'll do it. You have to force yourself to read. I force myself to do everything, Robert. So do I. Sophie will tell you. The only thing... the human condition. Yeah, the human condition. I thought being an adult would be better than this. That's just the problem. The only difference is...
When I was a kid, there was somebody else making me do stuff, and so it was easier than it is now. Now I gotta go read a goddamn book, because I used to have so much fucking ability to focus on books, and yeah, now I have to read, like, three at a time in order to read one. Yeah, it's really hard. It's ADHD, is what it probably is. Yeah, but my ADHD medicine doesn't seem to help. It just makes me ready to go garden instead. It makes me really ready to focus on anything other than the thing I have to do. Gardening is vastly better. Anyway, we've drifted... and I blame myself. Tokyo. We've drifted... we've Tokyo drifted away from the cube. The cube. The cube. We've stopped gleaning the cube. Romancing the cube. Romancing the square. Romancing the stone. Romancing the cube. Once a year. Yeah, once a year, in a supervised visit. Only... number one trending on Twitter had better be the fucking cube. Yeah, it'd better be. I mean, actually, number one trending... yeah, we'll talk... I guess we should talk about that now. Go ahead. I know you want to do this. Only in theory. We talked about it last week, and again, we're a culture show now. We only... we talk about movies. Can we talk about Succession later? Because I'm finally... Yes, we may. Um. So I guess my question is: is Tim Pool trending because of his additional thread on Squid Game, or is it about his tweets about Dune yesterday? It's a good question. Both are silly. It's probably both. Yeah, Tim Pool's just a very funny little character. Um, and sometimes... is he doing a character? Is he? Oh, here it is. Okay: "It's surprising that people think a show about people being forced into equality, standing in food lines, where they are underfed and used as fodder for the elites, does not represent communism in any way. If Squid Game was about capitalism, then rich people would start off advantaged and not equal to the other players." Whatever.
The rich people aren't playing in the squid game, are they? Here's the thing. Here's the thing, and I'm not gonna spoil anything. Um, I don't know who's watched it all, who's not watched it all. You don't... I have to catch up on Succession. That's fine. I can't watch Squid Game before I can watch Dune. That's fine. I'm not gonna spoil anything. Um. The statement "if Squid Game was about capitalism, then rich people would start off advantaged and not equal to the other players"... first of all, Tim: finish the show. Finish the fucking show. Watch to the end of the fucking show, Tim. Second of all, they're all in debt. That's the plot of the show. They're all in massive debt. So there are... that's the plot of the show. They're all in massive debt, and they have to play the squid game. So what's your point, Tim? But, um... and this is not... "I just think some players would have inherited advantages toward winning the money." First of all, Tim, they kind of do. Second of all, there's a game in Squid Game that is set up like a race, and there are some people in the front when they start, and there are some people in the back, and it's presented as being an equal game, even though you can literally look and see that it's not. Physically, you can see that it's not equal, because they're not all starting at the same time. It's just very silly. Um. And he closes this thread out by saying "there are critiques of capitalism if you try hard enough." And again... first of all, there's an episode... I'm... I'm trying... Katie, close your ears. The second episode. The second episode, spoilers: in the second episode, they leave the squid games. They're like, oh wow, the squid games are dangerous, we're all gonna die. So they leave, and the second episode is them going back to the capitalist world. They're all in debt, they're all miserable, everyone around them is going to die.
The episode is called "Hell," and they go back to the games for the rest of the show. Yeah. You can... yes, you can put your headphones back on. It's just so silly to pretend like, "if you try hard enough, there are critiques." No, they are very specific critiques of it. You can misinterpret it all you want, but you can't claim that you have to try very hard to get to that critique. Even Ben Shapiro today did a whole video reviewing Squid Game, and he's like, it's about how communism is good. And, like... I mean, kind of... like, not. But they specifically attack North Korea in it too. Yeah. No, yes. And also, like, by critiquing all the problems with capitalism, you're not inherently saying communism is good. Like, that's not how it works. Um. So, like, even then... it's, like, not a very good review, but he at least says it's a good show whose politics are bad. He gives it a seven out of ten for watchability and a one out of ten for politics. So, like, at the bare minimum, you have to be able to recognize what the show is. Um. And it's just very silly. It doesn't matter. None of this matters. I don't know why I care so much. Um. I just, like, dislike disingenuous grifters, that kind of thing. I don't know if he's disingenuous or not... like, part of it... just, like: do you think this is how he thinks, how he absorbs media? He does? I think that... I think it's his tribalism. Here's another thing that I can say, us versus them. Here's another way that I can just argue against you. I do think that this is, um, purposeful ignorance. I don't know. I guess so. Like, because even with Dune, he's like, "I didn't like it, it was boring and slow, and I fell asleep a bunch and didn't even finish it, and it's bad." And, like, even that is so silly to me. First of all, it's not a slow movie. I don't understand that critique.
It's, like, pretty engaging the entire time. But it's the kind of thing where he can't come up with, like, "oh, it's woke, so I'm gonna talk about it," so he has to fall back on, like, "I didn't watch it, it was boring." Um. Which you totally could, because it is woke. Like, from the sixties. Like, sixties wokeism. But, like, there are very clear themes that he's talking about. Yeah, I mean, the overall theme of the books is that, number one, colonialism inherently destroys everything, not just the colonized but the colonizers; that heroes are a trap, uh, and inherently... just, the concept is inherently destructive; and that, given enough time, the oppressed will become the oppressors. Uh, there's a lot going on in Dune. All of it is, in fact, very woke, because Frank Herbert ate his body weight in psychedelic mushrooms while wandering the Oregon Dunes every month. Uh, that's what... that's what the other parts of the books are about. Like, um, big mushrooms fan. He's a huge mushrooms fan. Spice, um. And it's just... yeah, it's very silly. Like, because you could watch that movie and go, "the SJWs got to Dune." You would be inaccurate, because it's always been Dune. But, like, it's just... like, if you can't... I've even seen people like, "oh, it's boring," like, "they cast a black woman as this character that's not," and then, like, people reply back to their, uh, their thread, like, "I saw it, and it's really good, it's faithful, and, like, beautiful stuff, and she did a great job." Like, yeah, you just want to get mad. Like... this is the book where the first special-boy hero counts how many people he's murdered in multiples of Hitler and then kills himself, and his son becomes a giant worm. Like, it's just... I haven't participated in much of this because I took my headphones off for a while and never quite caught up. I think this is a perfect time for some products and services and the other things that we've definitely figured out already.
Together, everything... And we are not talking about movies or TV anymore, right? Or is it all about that time Alec Baldwin killed that lady? Yeah. This is... we really should have led with this, um, segment, but it's actually... it's so awful. It's hard to lead from jokes straight into this story. But yeah, the headline is that Alec Baldwin accidentally shot and killed their DP. Yeah. You know that meme where it's, like, the galaxy brains, the brains going bigger? The tiniest-brain version of this is "Alec Baldwin killed somebody," and then the medium one is "actually, the armorer on set killed somebody," and then the biggest galaxy brain is "actually, Alec Baldwin helped to kill somebody, because he was the producer on a movie that deliberately cut corners, leading to this person's death." In this series of... yeah, there's a lot to go over here. There's the right's reaction, there's what happened on set, there is what happened on set against the backdrop of the IATSE negotiations. Um, it's a lot, and it's been hard. You know, Cody and I both know a lot of people in the entertainment industry; you guys do too, you've lived here. But this has just really hit everybody pretty hard. So, as most people probably are aware, or maybe aren't, of the little details, um: this set was a somewhat low-budget set, I guess comparatively speaking, but not really... I mean, still tons of money. Um, but they were rushing to get production done, especially in case IATSE did strike. Uh, they're halfway through this shoot. There had been multiple, multiple instances of gun... of safety concerns: three misfires, I think three misfires, with this one gun in the week prior. Um. Uh, even outside of that, the crew had been promised accommodations near set, but instead, once they arrived, they were told that they would be housed fifty miles away.
So they are driving back and forth fifty miles every day to set. And schedule and safety concerns... and the morning of the accident, I don't have the number in front of me, but the majority of the camera crew walked off, walked off set, um, as well as, yes... the armorers, the people that... the props masters had left, um. And so instead of meeting the demands of their crew or, you know, being responsible, they brought in local non-union props masters and an armorer. And I feel very bad for this young armorer... and I don't feel very bad... I have to say, I'm sure this is a horrifying moment for these people that were... these non-union hires. And she's young and, you know, been brought in for this opportunity, and they're being rushed as well. Like... ultimately, the responsibility does lie with production, and it lies with the AD. The AD, who apparently had been fired from other sets for the same thing, like, actually on another set of his. And he has been known as the guy that comes in when the real AD leaves, when people leave because of it. He's willing to come in and do what it takes. And the AD, if you guys don't know, is the assistant director. And it's not about directing, um, the action on camera. It's basically running the set. Your AD is the person running the safety. Every AD I've worked with is militant about safety, and they run the whole crew. So the buck does stop there. But even more than that, the buck stops at the producers, i.e., Alec Baldwin, who hired this person and made that choice. And I am so fucking furious. And yes, I feel terrible for him, because, my god, you have to live with that for the rest of your life. But also, yeah, you have to live with this for the rest of your life. Yeah. I mean, there's so much that went horribly wrong that shouldn't have. Back in our Cracked days,
I interviewed somebody who was, like, one of the most well-known armorers in, uh, in Hollywood. Like, he's worked with Keanu Reeves, he's worked with, like, Tom Cruise. Like, he's one of... he not just, like, provides union armorers, but also provides weapons. And when you have those people, who are, like, the proper IATSE armorers... it's been decades since a fatal incident with a firearm as a result of, like, a union armorer. Um, God, I think maybe sometime in the eighties is the last one I remember hearing about. And it's because they have all of these different redundancies. Like, for one thing, when you're doing a scene like this: number one, the barrel of the functional firearm is never supposed to be pointed at a person. Even in scenes when it looks like that's happening, you have it canted some way, especially since, like, they're pointing at a camera, right, which is a dangerous setup. But number one, you make sure the barrel is not pointed directly at a person. That's an armorer's job. And number two, even if the barrel isn't supposed to be pointed directly at a person, you have a sheet of bulletproof plexiglass in between the camera and the gun. Um, there's a bunch of shit that's supposed to happen. Like, for one thing, the way the transfer of the gun was done: the AD just picked it up and said "cold weapon" and handed it to Baldwin. Number one, the AD is never, ever, ever, fucking ever supposed to put a gun, a functional firearm, in the hands of an actor. That is the job of the armorer. And when the armorer does it, he is supposed to open the cylinder or drop the magazine, whichever kind of weapon it is, and show you down the barrel. The actor is supposed to look down the barrel and see, okay, there is nothing in this gun, this is not a loaded firearm, in order to know that it's a cold weapon. Another... like, the most fucked up thing about this is what is now coming out.
So, for one thing, there are some situations in which you might use a live round, like a functional bullet, as opposed to a blank, um... and a blank is pretty close to a live round, it just doesn't have, like, a full bullet on the end. Um. There are some situations in which you would fire a live round from a gun, um, on a movie set, but you would never keep the blanks and the live, functional bullets in the same place. They were kept in the same place, next to each other, in this case. And what's come out now is that members of the crew were going out and shooting actual ammunition out of these guns off hours. And that's how... yeah, that's the thing that's coming out. And I don't know, it hasn't been hard-confirmed, but there's been a couple of articles on it. Because the live bullet was in the gun, it went through a person and into another person. That's what's so confusing, because people say it's live ammunition, or then that it's a blank, but a blank is still "live." Here's one thing I need to say, though. Go ahead, Robert, clarify. One of the things... because obviously a blank can kill. People have died from blanks. It has happened before. A blank can absolutely kill someone. But not like this. This bullet killed someone and hit another person, went through them. A blank will never do that. A blank is not going to pierce one person's body and go into a separate person. Yeah, this was an actual... Here's the thing. God, there's so much... there's so many things. You do not need to have a firing gun on set at all at this point. We don't... you don't. There are arguments about, like, oh, you want that flash, you want this, all of that. For the majority... and I've been watching, seeing all these posts from cinematographers... it's almost impossible to capture that on camera. They always try, and they almost always do it in post anyway. Um. The training... the fact that it actually spooks most actors out... and it's much easier.
They want to talk about the noise and the reaction? You can bang something together, and it gives that jolt to get an authentic performance. It just doesn't make sense. Yeah. The most popular movies are... they're all CGI Marvel movies. So you do a fake gun. I hope that's not the only lesson that's learned, though. It seems like... no, I know. I'm just saying, like, I see a lot of, like, "just do the special effects." Special effects... like, yeah, there are other aspects of, like, the safety of workers on these sets that need to... like what I already said about how they left because of the safety conditions on set in general. One of my issues with people being like "we should just never have actual guns on set" is that, well, that's kind of like saying we should destroy the career of all of the IATSE armorers who never made a mistake and who, in hundreds of movies, were perfectly safe, because that's what they trained to do... because a crew hired someone who never should have been an armorer and wasn't with IATSE. Like, "we should destroy this entire career field," um... which, like... I would say I wouldn't... but you would still have a need for these skilled people. Um, because there are other types of weapons that are used on set. We're talking specifically about guns. They would still need to have a professional handling them, but they do not need to actively be fired. That's it. Like, you're not CGI-ing a whole gun in, you know what I mean? Like... but, like, the fact that there's firing happening... that shouldn't be happening, you know what I mean? I mean, that's my perspective. It's been done. Like, I guess my issue would be, like, this: the issue here isn't that. Uh, the issue here is that, like, things that never should have been done, if they'd followed the industry standard, were why somebody died here.
Absolutely, absolutely. People are missing the bigger-picture point of what this is about. Yeah. There's been a bunch of fucking Westerns filmed with old-timey guns, where you don't have as many of the options for, like, sim munitions and stuff with these antique guns, and it's fine, because they don't do the variety... the huge variety of things that went horribly awry in this. I don't know. It's one of those things where, like, um... I just... I'm comprehensively frustrated at every aspect of this, because the thing that got people killed was that decades of best practices were ignored. Like, the people whose jobs are going to be eliminated by this switch... which, I don't know, fine, maybe that's the way it always was going to evolve... but the people whose jobs are going to be eliminated are the people who spent years developing a series of best practices to make this not possible when they are involved. And I do agree with both of you guys about this, and I agree completely that the energy and the outrage is being misspent, and again, we're missing the actual point of... again, again, again: this is set against the backdrop of the IATSE negotiations, which is all about this exact thing. I would want to look into, um, what removing firing guns would actually mean... I mean, I would like to look into what that would mean, um, in terms of armorers and how that affects their jobs, because my gut would be that there are still other jobs, like, they would still have a job, you know, it's just changing slightly. However... I might... I mean, I could be wrong, you know. Um. To me, the... but yeah, I agree with you guys. Um. Should we also talk about how the right is reacting to this? Oh God. How's the right's... "Guns don't kill people. Alec Baldwin with a gun kills people." Yeah. That's... okay, gross. But also... yeah, but you're missing... again, missing the point.
You're not understanding the way... It's not the fact that he held the gun. It's the choices he made. It's the union busting that kills people. It's removing the trained professionals who actually know how to function with these weapons. And you could even, like... the actual fucking... if you wanted to take an actual right-wing tack on this that wasn't, like, completely batshit stupid and wrong, it could be, like: well, this is partly what happens when you have a bunch of people who rail against firearms but don't functionally understand them. Because if Alec Baldwin, um, had any understanding of guns, he would have known instinctively: number one, you always check to see whether the gun is loaded. And number two, even if you know it's unloaded, you treat it like a loaded gun, which means not pointing the barrel at a human being. That's the actual, like... if you want to take the reasonable gun-culture approach, it's that if he had been raised with firearms and understood them in any kind of functional capacity, this also wouldn't have happened, because you don't point a gun at a person and pull the trigger, because there's no such thing as an unloaded gun. Like... that's the actual reasonable take you could have about it, as opposed to what they're doing. Well, yeah, he's the Trump... he's the Trump impression guy. So, yeah, they're just trying to get their little dunks in. Yeah. Everything about this is just comprehensively heartbreaking, especially since... this is, like, everything turning into a goddamn culture war issue while a woman who had two children is dead, and they're not going to get to see their mom anymore, and she's not going to get to live the remainder of what probably would have been a long life, for what amounts to a stupid cost-cutting decision and a series of careless actions by a whole lot of people. A tremendous number of people had to be careless for this to happen.
Um, it's just it sucks, 530 00:32:09,760 --> 00:32:13,479 Speaker 1: really bad. I just now started to feel bad, you know. 531 00:32:14,600 --> 00:32:16,520 Speaker 1: Um No, I didn't just now start to feel bad, 532 00:32:16,520 --> 00:32:19,880 Speaker 1: but thinking about her camera crew who walked off that day, 533 00:32:20,280 --> 00:32:22,760 Speaker 1: and she'd been really torn because she'd been advocating and 534 00:32:22,800 --> 00:32:25,280 Speaker 1: fighting for them and really fighting for the safety conditions 535 00:32:25,280 --> 00:32:27,760 Speaker 1: on set. But it was she was the deep p 536 00:32:28,280 --> 00:32:32,240 Speaker 1: you know, cinematographer and this was her first DP, like 537 00:32:32,840 --> 00:32:37,600 Speaker 1: so she needed to stick around and see this project through. 538 00:32:37,600 --> 00:32:39,440 Speaker 1: But she was in a bad And so then I 539 00:32:39,520 --> 00:32:41,320 Speaker 1: just thought about all the people that left and how 540 00:32:41,320 --> 00:32:44,080 Speaker 1: they must feel that the day they walk out this happened. 541 00:32:44,080 --> 00:32:45,560 Speaker 1: And not that this is in any way their fault. 542 00:32:45,560 --> 00:32:48,760 Speaker 1: They're doing the thing they need to do, you know, 543 00:32:49,000 --> 00:32:51,840 Speaker 1: they were in fact trying to stop this from happen, 544 00:32:51,880 --> 00:33:00,560 Speaker 1: stop this from happening. Yeah, it's everything is terrible. And 545 00:33:00,680 --> 00:33:03,720 Speaker 1: also everything about this is terrible. Yeah, and I don't 546 00:33:04,160 --> 00:33:08,560 Speaker 1: and again and it's so just like everything literally everything, 547 00:33:08,720 --> 00:33:12,240 Speaker 1: the government, climate change, COVID, all of it. You see 548 00:33:12,280 --> 00:33:14,320 Speaker 1: the writing on the wall as to what the problems are, 549 00:33:14,360 --> 00:33:18,400 Speaker 1: and you also see how the people in charge spin 550 00:33:18,480 --> 00:33:22,440 Speaker 1: it or take the wrong point, purposefully obfuskating things so 551 00:33:22,480 --> 00:33:25,640 Speaker 1: that they can continue on, continue on, and things don't change. 552 00:33:25,880 --> 00:33:29,200 Speaker 1: Like will this change the negotiations? I don't know. Will 553 00:33:29,240 --> 00:33:31,840 Speaker 1: this affect anything in the long run? I don't know. 554 00:33:33,120 --> 00:33:35,880 Speaker 1: I mean in terms of the actual set conditions that 555 00:33:35,920 --> 00:33:40,360 Speaker 1: need to be fixed. Um, does that make sense? Yeah, 556 00:33:40,560 --> 00:33:49,400 Speaker 1: I'm rambling. All right, let's take an ad break, right, 557 00:33:51,320 --> 00:34:09,920 Speaker 1: breather everything, So don't don't yeah, baby, yeah, suck on that. 558 00:34:11,920 --> 00:34:14,640 Speaker 1: I don't know why I told people to suck on that. Yeah, 559 00:34:14,880 --> 00:34:17,000 Speaker 1: I mean, I mean they can. That was weird, right, 560 00:34:17,160 --> 00:34:20,560 Speaker 1: I shouldn't cut it out? No, no, no, no, keep 561 00:34:20,600 --> 00:34:25,120 Speaker 1: it in. Um, not necessary contact weird just like maybe 562 00:34:25,160 --> 00:34:30,160 Speaker 1: not necessary, maybe unnecessary, a little aggressive, not necessarily necessary, 563 00:34:30,440 --> 00:34:35,640 Speaker 1: not necessarily Now what you guys got what you want 564 00:34:35,640 --> 00:34:41,080 Speaker 1: to talk about? Um? I don't know. Uh. Facebook killing everyone? 
565 00:34:41,320 --> 00:34:44,560 Speaker 1: The website? Yeah, the website, the website that is committed 566 00:34:44,600 --> 00:34:49,120 Speaker 1: to destroying the world. Um, that one. Is it because 567 00:34:49,320 --> 00:34:52,600 Speaker 1: they want to or because they have to? Because it's 568 00:34:52,680 --> 00:34:58,280 Speaker 1: because destroying the world increases engagement on their platform? Okay, 569 00:34:58,400 --> 00:35:01,200 Speaker 1: so this does and that's the only way they make money. 570 00:35:01,239 --> 00:35:07,560 Speaker 1: Now Yeah, so the world has to go because they're 571 00:35:07,560 --> 00:35:13,000 Speaker 1: pivoting to kids. Yeah, they are interested. Kids are interested 572 00:35:13,080 --> 00:35:18,520 Speaker 1: in perhaps fire bombing the Facebook offices. Um, which where 573 00:35:18,520 --> 00:35:21,680 Speaker 1: that to happen? I don't encourage it, but where that 574 00:35:21,800 --> 00:35:24,799 Speaker 1: to happen? These days? I think it would be funny. 575 00:35:25,120 --> 00:35:31,800 Speaker 1: Um where that to happen? Um? Yeah? I like, what 576 00:35:32,200 --> 00:35:33,960 Speaker 1: how are we doing on that? Is that incitement to 577 00:35:34,040 --> 00:35:38,560 Speaker 1: that cross the line? Where are we ignoring? You? What 578 00:35:38,560 --> 00:35:40,839 Speaker 1: what just just said? Are we bleeping that entire thing? 579 00:35:40,960 --> 00:35:43,560 Speaker 1: Or are we believing that? Did that go too far? 580 00:35:44,160 --> 00:35:47,160 Speaker 1: All I'm gonna say is if things were just Mark 581 00:35:47,239 --> 00:35:51,920 Speaker 1: Zuckerberg would be taken into custody by an international group 582 00:35:52,120 --> 00:35:55,600 Speaker 1: of shall we say, a vengeance squad. He would be 583 00:35:55,600 --> 00:35:59,640 Speaker 1: tried in the Hague right where right where um you 584 00:35:59,680 --> 00:36:03,080 Speaker 1: know Herman Garring was tried, uh, And then he would 585 00:36:03,120 --> 00:36:05,160 Speaker 1: be hung where Herman Garring would have been hung if 586 00:36:05,160 --> 00:36:10,799 Speaker 1: Garring hadn't snuck opiates into an an overdosed in order 587 00:36:10,840 --> 00:36:16,560 Speaker 1: to avoid um. Anyway, Mark Zuckerberg should be legally executed 588 00:36:16,560 --> 00:36:19,800 Speaker 1: by the international community is what I think actually, along 589 00:36:19,800 --> 00:36:22,360 Speaker 1: with Cheryl Sandberg and most of the most of the 590 00:36:22,360 --> 00:36:24,640 Speaker 1: people who have been running Facebook for the last decade 591 00:36:26,400 --> 00:36:29,840 Speaker 1: or yeah, sure, give him a bunch of opiates. Yeah, whatever, 592 00:36:29,880 --> 00:36:33,400 Speaker 1: I don't care how it goes. One of the fun facts. 593 00:36:33,440 --> 00:36:36,399 Speaker 1: So basically, a bunch of shipload of stuff came out 594 00:36:36,520 --> 00:36:40,560 Speaker 1: recently about Facebook, perhaps too much, all at once. I 595 00:36:40,560 --> 00:36:42,720 Speaker 1: want to throw that out. All the kind of stuff 596 00:36:42,760 --> 00:36:45,839 Speaker 1: came out on on Monday, all this stuff, I mean, 597 00:36:47,680 --> 00:36:50,600 Speaker 1: and it's so much to sift through that it's almost 598 00:36:50,640 --> 00:36:53,560 Speaker 1: like people aren't touching it, you know. Anyway, go ahead, Roberts. 
599 00:36:53,600 --> 00:36:55,400 Speaker 1: A lot of journalists are going through it, but it's going to 600 00:36:55,480 --> 00:37:01,600 Speaker 1: be a lot for, like, the general public. Yeah, there's so much. Um. 601 00:37:01,680 --> 00:37:04,000 Speaker 1: One of the things that came out that's most immediately 602 00:37:04,040 --> 00:37:07,759 Speaker 1: relevant is that in April of two thousand and twenty, um, 603 00:37:08,239 --> 00:37:11,960 Speaker 1: like the early days of the pandemic, internal Facebook 604 00:37:11,960 --> 00:37:15,600 Speaker 1: employees started looking at the spread of coronavirus-related misinformation, 605 00:37:16,239 --> 00:37:19,040 Speaker 1: um, and they found out that if they, if they 606 00:37:19,080 --> 00:37:24,080 Speaker 1: limited serial resharers, because serial resharers, people who reshare a 607 00:37:24,120 --> 00:37:28,600 Speaker 1: lot of content, tended to be resharing mostly misinformation. So 608 00:37:28,880 --> 00:37:32,840 Speaker 1: internal Facebook employees were like, hey, what if we, uh, 609 00:37:33,120 --> 00:37:37,920 Speaker 1: limited people from boosting content, um, that, like, our news 610 00:37:37,960 --> 00:37:40,880 Speaker 1: feed predicts kind of falls into this category, because that 611 00:37:40,920 --> 00:37:44,160 Speaker 1: stuff tends to be misinformation. And their early tests showed 612 00:37:44,160 --> 00:37:46,080 Speaker 1: that if they had done this, it would reduce the 613 00:37:46,480 --> 00:37:50,080 Speaker 1: sharing of coronavirus misinformation by thirty-eight percent, and Mark 614 00:37:50,160 --> 00:37:55,080 Speaker 1: Zuckerberg said no. Um. The exact quote was from Anna Stepanov, 615 00:37:55,120 --> 00:37:58,480 Speaker 1: who's, um, a director at Facebook, who was, 616 00:37:58,520 --> 00:38:00,839 Speaker 1: like, speaking for Zuckerberg. She said, 617 00:38:00,880 --> 00:38:03,200 Speaker 1: Mark doesn't think we could go broad. We wouldn't launch 618 00:38:03,280 --> 00:38:05,080 Speaker 1: if there was a material tradeoff with MSI, 619 00:38:05,239 --> 00:38:11,160 Speaker 1: which is, like, their, um, uh, engagement statistic. So that, 620 00:38:11,160 --> 00:38:14,279 Speaker 1: that's like one of a billion different examples. But like, 621 00:38:14,440 --> 00:38:16,359 Speaker 1: they knew that, oh hey, we could do this one 622 00:38:16,360 --> 00:38:19,879 Speaker 1: thing that would reduce coronavirus misinformation by thirty-eight percent, but it's going 623 00:38:19,920 --> 00:38:24,319 Speaker 1: to reduce our profitability, so we will not take that step. Yeah, 624 00:38:24,320 --> 00:38:26,799 Speaker 1: it's the, it's like with Twitter, how they're like, yeah, 625 00:38:26,880 --> 00:38:28,560 Speaker 1: we could get rid of some Nazis, but it'd also 626 00:38:28,640 --> 00:38:32,360 Speaker 1: get rid of some Republican politicians, so we can't. But 627 00:38:32,400 --> 00:38:39,920 Speaker 1: at least Twitter put, like, caveats, you know. Or Twitter 628 00:38:40,000 --> 00:38:43,560 Speaker 1: also does, like, if you're going to retweet and you 629 00:38:43,600 --> 00:38:46,600 Speaker 1: haven't clicked the link, if you're just knee-jerk resharing, 630 00:38:47,040 --> 00:38:49,200 Speaker 1: they say, you haven't read the article yet, could you 631 00:38:49,239 --> 00:38:52,960 Speaker 1: do that? That's another thing that... what's the, um, whistleblower's 632 00:38:53,080 --> 00:38:58,280 Speaker 1: name? The Facebook whistleblower?
She mentioned that specifically, like, like, that's 633 00:38:58,320 --> 00:39:03,239 Speaker 1: been talked about. Um, yeah, so much to go through. You 634 00:39:03,320 --> 00:39:07,080 Speaker 1: pointed out, um, a few things that feel true to 635 00:39:07,080 --> 00:39:09,799 Speaker 1: me and are true. So much of this, I mean, 636 00:39:09,840 --> 00:39:14,160 Speaker 1: not just COVID, but just the disinformation in general, is 637 00:39:14,239 --> 00:39:19,040 Speaker 1: because Facebook, in their quote-unquote attempt to seem, like, nonpartisan, 638 00:39:19,400 --> 00:39:23,160 Speaker 1: they've actively been, I mean, Mark Zuckerberg has actively been 639 00:39:23,200 --> 00:39:28,480 Speaker 1: taking meetings, you know, with different members of the Republican Party. 640 00:39:28,560 --> 00:39:33,480 Speaker 1: He routinely courts them, and you know, it's all part 641 00:39:33,480 --> 00:39:37,040 Speaker 1: of this. It's all part of that positioning and the 642 00:39:37,120 --> 00:39:39,160 Speaker 1: choices that they make. But yes, money is the 643 00:39:39,200 --> 00:39:42,680 Speaker 1: be-all end-all here, and that's 644 00:39:42,680 --> 00:39:45,719 Speaker 1: where they make their money. Because I know they're focusing 645 00:39:45,760 --> 00:39:48,880 Speaker 1: on, they're gonna, you know, do Facebook for kids, but the 646 00:39:49,000 --> 00:39:54,719 Speaker 1: youths don't go on Facebook. It's for the olds to get 647 00:39:54,800 --> 00:40:00,439 Speaker 1: grabbed with racist shit. And then they're like, yeah, 648 00:40:00,440 --> 00:40:04,279 Speaker 1: we're retooling, with the quote that he said, retooling 649 00:40:04,280 --> 00:40:06,719 Speaker 1: to make serving young adults the north star rather than 650 00:40:06,760 --> 00:40:11,080 Speaker 1: optimizing for older people. And like, so you're just gonna, 651 00:40:11,080 --> 00:40:16,399 Speaker 1: like, poison kids more instead of poisoning adults. Like, that's 652 00:40:16,400 --> 00:40:19,200 Speaker 1: the goal, right? Like, we already know that, like, Instagram 653 00:40:19,280 --> 00:40:24,359 Speaker 1: is destroying, like, young girls, and, uh, like, all, all 654 00:40:24,400 --> 00:40:26,520 Speaker 1: the things that, like, these apps are doing with their 655 00:40:26,560 --> 00:40:28,839 Speaker 1: algorithms and feeds. It's like, if you're, so, if 656 00:40:28,840 --> 00:40:32,080 Speaker 1: you're optimizing for young people instead of older people, then 657 00:40:32,120 --> 00:40:35,680 Speaker 1: you're just getting them earlier. You're fucking them up earlier 658 00:40:35,719 --> 00:40:40,040 Speaker 1: instead of doing it to boomers. So maybe that's not good. 659 00:40:41,200 --> 00:40:46,680 Speaker 1: I mean, all of it's not good. Also, it's not 660 00:40:46,719 --> 00:40:53,480 Speaker 1: even about, um, banning topics or removing the topics from Facebook. 661 00:40:54,080 --> 00:40:57,520 Speaker 1: I mean, I mean, that's part of it, but it's also 662 00:40:59,680 --> 00:41:05,680 Speaker 1: them, yes, actively promoting those things, and they know that 663 00:41:05,840 --> 00:41:09,319 Speaker 1: it increases their bottom line.
They know. I will go 664 00:41:09,360 --> 00:41:12,960 Speaker 1: even further, Katie: they are actively promoting toxic, untrue, and 665 00:41:13,040 --> 00:41:16,759 Speaker 1: violent content, because doing so damages the brains of their 666 00:41:16,880 --> 00:41:19,640 Speaker 1: users in a way that keeps those users on and 667 00:41:19,719 --> 00:41:22,759 Speaker 1: engaged with the website. And so damaging the brains of 668 00:41:22,800 --> 00:41:26,319 Speaker 1: their users permanently is profitable to Facebook. So they made 669 00:41:26,680 --> 00:41:32,200 Speaker 1: a, a distinct, uh, documented choice to cause brain damage 670 00:41:32,239 --> 00:41:35,359 Speaker 1: to millions of people in order to boost their profitability. 671 00:41:35,480 --> 00:41:37,360 Speaker 1: We're talking about it through the lens of how this 672 00:41:37,400 --> 00:41:41,000 Speaker 1: affects us here in America, but also to be unpacked 673 00:41:41,040 --> 00:41:43,920 Speaker 1: and sifted through with all of this is the international side. The 674 00:41:44,120 --> 00:41:49,400 Speaker 1: influence in other countries includes even ethnic cleansings in India 675 00:41:49,480 --> 00:41:54,040 Speaker 1: and Myanmar, just as a start, uh, in Ethiopia, 676 00:41:54,520 --> 00:42:00,600 Speaker 1: um, in, in Egypt. Yeah, pro-government propaganda in other places, 677 00:42:00,840 --> 00:42:09,480 Speaker 1: you know. Like, anyway, it's truly the destruction of the world. Yeah, 678 00:42:09,520 --> 00:42:13,360 Speaker 1: there's a really good Ryan Broderick article on his Substack. 679 00:42:13,560 --> 00:42:15,480 Speaker 1: I think, well, is this a Substack or not? Anyway, 680 00:42:15,560 --> 00:42:21,040 Speaker 1: his blog, Garbage Day, um. Yeah. And, and Broderick is 681 00:42:21,080 --> 00:42:23,680 Speaker 1: someone who's kind of reported on some of the same 682 00:42:23,760 --> 00:42:27,160 Speaker 1: stuff; like, we're both some of the only, like, English 683 00:42:27,280 --> 00:42:30,120 Speaker 1: language writers to have written about fucking Dogolachan, which was 684 00:42:30,719 --> 00:42:33,680 Speaker 1: like Brazil's 8chan kind of equivalent. It's like a 685 00:42:33,680 --> 00:42:37,200 Speaker 1: mass-shooter fan site. He wrote like a short little 686 00:42:37,200 --> 00:42:41,359 Speaker 1: thing about the Facebook Papers, um, which you can also find, 687 00:42:41,360 --> 00:42:44,400 Speaker 1: by the way, if you're interested in kind of learning 688 00:42:44,400 --> 00:42:47,719 Speaker 1: how to start looking into the Facebook Papers: you 689 00:42:47,800 --> 00:42:50,720 Speaker 1: go to Protocol. If you just type Protocol Facebook Papers 690 00:42:50,719 --> 00:42:53,279 Speaker 1: into Google, it'll bring you to Protocol's site. They've got like 691 00:42:53,280 --> 00:42:56,319 Speaker 1: an aggregation thing that's aggregating all of the articles so 692 00:42:56,360 --> 00:42:58,680 Speaker 1: far written about the Facebook Papers. It makes it easy 693 00:42:58,719 --> 00:43:02,319 Speaker 1: to kind of see what's actually happening here. Um.
But 694 00:43:02,400 --> 00:43:05,160 Speaker 1: Ryan Broderick wrote a short little article about this that 695 00:43:05,200 --> 00:43:08,719 Speaker 1: I think gets at how everyone who's been... I've been 696 00:43:08,760 --> 00:43:10,640 Speaker 1: reporting around the edges of this issue, like a bunch 697 00:43:10,640 --> 00:43:13,120 Speaker 1: of people, like Ben Collins, like Brandy Zadrozny, like, 698 00:43:13,120 --> 00:43:16,520 Speaker 1: like, there's about a million of us. Well, there's at 699 00:43:16,560 --> 00:43:17,960 Speaker 1: least a few dozen of us who have been, like, 700 00:43:18,000 --> 00:43:20,480 Speaker 1: writing around the edges of these stories and trying to, 701 00:43:20,560 --> 00:43:23,960 Speaker 1: like, prod at this stuff for a while. And Broderick 702 00:43:24,000 --> 00:43:26,319 Speaker 1: gets at, I think, how we're all feeling and 703 00:43:26,360 --> 00:43:29,600 Speaker 1: how everyone should feel about this, about the Facebook Papers 704 00:43:29,640 --> 00:43:32,360 Speaker 1: leak. Quote: I'll be honest, I'm not sure what to 705 00:43:32,400 --> 00:43:34,200 Speaker 1: do with all this. I'm not sure what more we 706 00:43:34,280 --> 00:43:36,719 Speaker 1: need to know before something is done, nor am I 707 00:43:36,760 --> 00:43:39,040 Speaker 1: even sure anything can be done. Now, after almost a 708 00:43:39,080 --> 00:43:41,040 Speaker 1: decade of writing about it and talking about it, I 709 00:43:41,080 --> 00:43:43,600 Speaker 1: honestly feel numb about the whole thing. It's why I don't 710 00:43:43,600 --> 00:43:45,880 Speaker 1: write a lot about Facebook in Garbage Day. There are 711 00:43:45,880 --> 00:43:48,080 Speaker 1: simply other parts of the Internet that need attention. Just 712 00:43:48,080 --> 00:43:50,080 Speaker 1: think about how different things would be if an entire 713 00:43:50,120 --> 00:43:53,440 Speaker 1: generation of reporters and technologists weren't forced to be unpaid 714 00:43:53,480 --> 00:43:56,799 Speaker 1: janitors for a bloated tech monopoly. My hope is that 715 00:43:56,840 --> 00:43:58,840 Speaker 1: the sheer scale of these leaks isn't too much for 716 00:43:58,880 --> 00:44:01,520 Speaker 1: the average reader or politician to wrap their heads around, 717 00:44:01,719 --> 00:44:04,440 Speaker 1: but they haven't affected the company's stock price. But this 718 00:44:04,560 --> 00:44:07,600 Speaker 1: much is clear: Facebook knew all along. Their own employees 719 00:44:07,640 --> 00:44:10,160 Speaker 1: were desperately trying to get anyone inside the company to 720 00:44:10,200 --> 00:44:13,520 Speaker 1: listen as their products radicalized their own friends and family members, 721 00:44:13,760 --> 00:44:15,520 Speaker 1: and as they were breaking the world. They had an 722 00:44:15,600 --> 00:44:18,920 Speaker 1: army of spokespeople publicly and privately gaslighting and intimidating 723 00:44:18,960 --> 00:44:21,560 Speaker 1: reporters and researchers who were trying to ring the alarm bell. 724 00:44:21,880 --> 00:44:24,239 Speaker 1: They knew all along, and they simply did not give 725 00:44:24,280 --> 00:44:30,840 Speaker 1: a shit. Yeah, I don't know. Um, yeah, they still don't. Uh, 726 00:44:31,400 --> 00:44:33,360 Speaker 1: Zuck said, yeah, what we're seeing is a coordinated effort 727 00:44:33,400 --> 00:44:36,319 Speaker 1: to selectively use leaked documents to paint a false picture of 728 00:44:36,320 --> 00:44:42,120 Speaker 1: our company. Um, did the... are the documents false? Yeah?
Mark? 729 00:44:42,520 --> 00:44:50,040 Speaker 1: Is everything in here false? Saying that is such blatant PR bullshit, corporate spin. 730 00:44:50,360 --> 00:44:52,960 Speaker 1: And I could say that with confidence because I'm almost 731 00:44:52,960 --> 00:44:57,360 Speaker 1: caught up on Succession. Good, good. And again, I understand 732 00:44:57,640 --> 00:45:05,600 Speaker 1: how this works, you know, I'm seeing it happen. Anyway, Katie, 733 00:45:05,800 --> 00:45:12,600 Speaker 1: love that for you. Thank you. I had to contribute something. 734 00:45:13,520 --> 00:45:15,920 Speaker 1: I've talked plenty this episode. I've talked, I've had 735 00:45:15,960 --> 00:45:18,719 Speaker 1: thoughts. Do we want to end the episode? Are we talking 736 00:45:18,719 --> 00:45:24,640 Speaker 1: about the Rittenhouse thing or no? Yeah, it just sucks, 737 00:45:24,680 --> 00:45:31,680 Speaker 1: uh, the, um... Yeah, the judge is, um, saying that 738 00:45:33,680 --> 00:45:43,880 Speaker 1: they can't refer to the victims as victims, but terrorists, terrorists, arsonists. 739 00:45:44,880 --> 00:45:52,279 Speaker 1: I thought that Republicans hated activist judges. Yeah, it's, it's 740 00:45:52,320 --> 00:45:57,280 Speaker 1: fun, because the argument, which isn't a completely fallacious argument, 741 00:45:57,320 --> 00:45:59,719 Speaker 1: is that, okay, if you call them victims, then you 742 00:45:59,800 --> 00:46:04,560 Speaker 1: are convicting someone while they're on trial. Which, okay, 743 00:46:04,600 --> 00:46:07,160 Speaker 1: but then, if you can call them arsonists, when 744 00:46:07,200 --> 00:46:09,200 Speaker 1: none of them were convicted of arson, none of 745 00:46:09,239 --> 00:46:11,759 Speaker 1: the people he killed were convicted of arson, then you are 746 00:46:11,880 --> 00:46:17,600 Speaker 1: convicting someone else. Exactly, yes. If you were just saying 747 00:46:17,719 --> 00:46:19,960 Speaker 1: the people that he shot, and if that was the 748 00:46:20,000 --> 00:46:22,000 Speaker 1: only way you allowed anyone to refer to them, I 749 00:46:22,719 --> 00:46:26,960 Speaker 1: think that would be fine. That's, that's at least... Yes. No 750 00:46:27,000 --> 00:46:30,640 Speaker 1: one's arguing about whether he shot these people. That's accurate, and 751 00:46:30,680 --> 00:46:34,040 Speaker 1: you're not, you're not convicting anyone by saying the people 752 00:46:34,120 --> 00:46:39,319 Speaker 1: that he shot, because he objectively shot them. Um. I would, 753 00:46:39,480 --> 00:46:42,040 Speaker 1: I would, I would say, okay, well, that's someone being fair. 754 00:46:42,080 --> 00:46:45,080 Speaker 1: But this is obviously, like, clearly not that. This is 755 00:46:45,120 --> 00:46:48,520 Speaker 1: clearly activist, to call them terrorists instead. Like, the, 756 00:46:49,640 --> 00:46:52,400 Speaker 1: it's like the opposite of what he's trying to avoid, 757 00:46:52,560 --> 00:46:57,520 Speaker 1: and also even more intense. Yeah, I'd like to read 758 00:46:57,560 --> 00:47:01,840 Speaker 1: this tweet from Ben Sheehan: The Kenosha County judge 759 00:47:01,880 --> 00:47:05,240 Speaker 1: in the Kyle Rittenhouse case, Bruce Schroeder, has been serving 760 00:47:05,239 --> 00:47:08,960 Speaker 1: since '84. He keeps getting re-elected to six 761 00:47:09,040 --> 00:47:12,080 Speaker 1: year terms, running unopposed. He's one of those six hundred 762 00:47:12,160 --> 00:47:14,600 Speaker 1: judge elections on your ballot where you're like, who is this?
763 00:47:14,680 --> 00:47:18,040 Speaker 1: Does this matter? It does. Um? Yeah, I think that's 764 00:47:18,080 --> 00:47:21,359 Speaker 1: a very good point. And a little reminder, this man has 765 00:47:21,400 --> 00:47:24,239 Speaker 1: been, has run... he's been serving since '84, and he's 766 00:47:24,400 --> 00:47:31,280 Speaker 1: run unopposed. Yeah, I know, it's great. Yeah. That said, 767 00:47:31,640 --> 00:47:34,759 Speaker 1: often the choice is between one judge and another judge, both 768 00:47:34,840 --> 00:47:36,480 Speaker 1: of whom are the kind of people who want to 769 00:47:36,520 --> 00:47:40,840 Speaker 1: be judges. Yeah. And how can you know? It's just, 770 00:47:40,880 --> 00:47:44,000 Speaker 1: how can you know? I don't know. I would like 771 00:47:44,120 --> 00:47:48,080 Speaker 1: to see Mark Zuckerberg, Sheryl Sandberg, and most of the 772 00:47:48,120 --> 00:47:55,080 Speaker 1: Facebook executives from the last decade or so arrested by 773 00:47:55,080 --> 00:47:59,840 Speaker 1: an international, um, tribunal and, uh, and, and executed in 774 00:47:59,880 --> 00:48:02,160 Speaker 1: the Hague. I think that would be rad. I think 775 00:48:02,160 --> 00:48:07,960 Speaker 1: that would be... not prejudging the verdict, but the trial, you know. 776 00:48:08,040 --> 00:48:10,120 Speaker 1: You will get all of the kids on board with 777 00:48:10,360 --> 00:48:13,200 Speaker 1: your company if you were hung in the Hague. The 778 00:48:13,360 --> 00:48:18,040 Speaker 1: kids think that's pog, they think that's one hundred gecs 779 00:48:18,239 --> 00:48:25,000 Speaker 1: of cool. Yeah, yeah, then Mark's real cheugy. Yeah, very cheugy, 780 00:48:25,120 --> 00:48:30,040 Speaker 1: if you were to not be hung in the Hague. 781 00:48:30,960 --> 00:48:34,399 Speaker 1: I think we did it. That, is that... Yeah, that's, 782 00:48:34,680 --> 00:48:39,320 Speaker 1: that's the episode. That's how we're ending this shit. Oh good, 783 00:48:40,000 --> 00:48:47,000 Speaker 1: I love it. Based. Yeah, based ending. I'm, like, desperately trying 784 00:48:47,040 --> 00:48:50,000 Speaker 1: to think of some slang to throw at you, but, 785 00:48:50,280 --> 00:48:52,400 Speaker 1: I mean, I think... You know, there is another 786 00:48:52,440 --> 00:48:57,040 Speaker 1: thing that's just happened recently on the TikToks that's pretty funny. Um. 787 00:48:57,600 --> 00:49:03,719 Speaker 1: The random fucking targeting matrix that is TikTok memes has 788 00:49:03,840 --> 00:49:08,200 Speaker 1: landed on the Mountain Goats song No Children, which, if 789 00:49:08,239 --> 00:49:11,240 Speaker 1: you've ever been in a divorce that nearly killed both parties, 790 00:49:11,440 --> 00:49:13,640 Speaker 1: is a pretty good song about a divorce that nearly 791 00:49:13,719 --> 00:49:16,759 Speaker 1: kills both people involved. Um, and now the kids are 792 00:49:17,120 --> 00:49:24,520 Speaker 1: into it. Yeah, they've landed on an incredibly dark anthem 793 00:49:24,560 --> 00:49:28,920 Speaker 1: about, like, a divorce that cripples you emotionally for 794 00:49:28,960 --> 00:49:37,040 Speaker 1: the rest of your life. A very American metaphor. For kids. That's cool. 795 00:49:37,480 --> 00:49:40,320 Speaker 1: Listen to the rest of their catalog, really good music, 796 00:49:41,520 --> 00:49:45,480 Speaker 1: yeah, good stuff. Yeah, The Mountain Goats have a lot of bangers.
797 00:49:45,480 --> 00:49:49,200 Speaker 1: Not just that one song, that's fine, but just, like, expand, 798 00:49:49,280 --> 00:49:51,600 Speaker 1: you know. Yeah, go for the rest of the Goats. 799 00:49:51,640 --> 00:49:54,799 Speaker 1: They got a song about D&D in there. Yeah, 800 00:49:57,640 --> 00:50:02,160 Speaker 1: great bard. Yeah, TikTok's doing that, and also, like, making 801 00:50:02,239 --> 00:50:05,239 Speaker 1: young girls, like, think that they have to... Yeah, 802 00:50:05,480 --> 00:50:09,640 Speaker 1: there's lots of stuff that's less good. But I do 803 00:50:09,719 --> 00:50:12,080 Speaker 1: like that when someone went to... I think it was 804 00:50:12,160 --> 00:50:14,160 Speaker 1: Vice went to the Mountain Goats, you know, the main 805 00:50:14,280 --> 00:50:16,359 Speaker 1: guy, who at some points has been the only 806 00:50:16,440 --> 00:50:19,520 Speaker 1: guy in the Mountain Goats, um, went to him for comment. 807 00:50:19,600 --> 00:50:22,080 Speaker 1: He was like, yeah, I'm not getting on TikTok, absolutely not. 808 00:50:22,160 --> 00:50:25,560 Speaker 1: I'm doing, I'm taking no actions. Like, all 809 00:50:25,560 --> 00:50:29,320 Speaker 1: these bands, their song goes viral on TikTok and they suddenly, 810 00:50:29,320 --> 00:50:33,239 Speaker 1: like, fucking, um, Fleetwood Mac gets a TikTok, and he's 811 00:50:33,320 --> 00:50:37,160 Speaker 1: just like, no, I'm not doing that. I am 812 00:50:37,200 --> 00:50:40,920 Speaker 1: fifty-four goddamn years old. I'm not getting on TikTok now. 813 00:50:41,040 --> 00:50:48,640 Speaker 1: I'm just going to continue doing the thing that I do. Yeah, alright, guys, 814 00:50:48,400 --> 00:51:00,480 Speaker 1: come back next week. Bye. Oh wow, bye. Weeping 815 00:51:00,719 --> 00:51:07,680 Speaker 1: so dull and it's not... again, I tried, Daniel. Worst 816 00:51:07,719 --> 00:51:10,000 Speaker 1: Year Ever is a production of I Heart Radio. For 817 00:51:10,040 --> 00:51:12,439 Speaker 1: more podcasts from I Heart Radio, visit the I Heart 818 00:51:12,480 --> 00:51:15,359 Speaker 1: Radio app, Apple Podcasts, or wherever you listen to your 819 00:51:15,360 --> 00:51:16,080 Speaker 1: favorite shows.