1 00:00:05,519 --> 00:00:13,600 Speaker 1: Hello, ram By, Yeah, all right, bye, what it's like 2 00:00:13,640 --> 00:00:14,520 Speaker 1: to work on this show? 3 00:00:14,720 --> 00:00:15,680 Speaker 2: Hello By? 4 00:00:16,280 --> 00:00:17,560 Speaker 3: Yeah, Hello ram By? 5 00:00:18,040 --> 00:00:21,279 Speaker 2: Hello Andrew? Hello, then Andrew. 6 00:00:22,360 --> 00:00:25,520 Speaker 3: You, what happened to you guys? 7 00:00:25,840 --> 00:00:29,760 Speaker 2: What are you doing that's just pushed us over the brink? 8 00:00:30,400 --> 00:00:33,880 Speaker 3: Not that, bye. Oh, I've got terrible sleep deprivation, man, 9 00:00:36,360 --> 00:00:43,480 Speaker 3: sleep deprivation, Andrew. It's CIA black site levels, mate. Maybe 10 00:00:43,520 --> 00:00:46,440 Speaker 3: not, maybe not that bad. Maybe not in a CIA 11 00:00:46,479 --> 00:00:51,880 Speaker 3: black site. Ah, man, this is wonderful energy to come 12 00:00:51,920 --> 00:00:52,640 Speaker 3: into hold. 13 00:00:52,800 --> 00:00:55,680 Speaker 1: I'm so, I'm in a state of delirium from lack 14 00:00:55,720 --> 00:00:58,520 Speaker 1: of sleep over the last, I have like accumulated 15 00:00:58,520 --> 00:01:00,080 Speaker 1: maybe five hours of sleep since. 16 00:01:01,080 --> 00:01:09,000 Speaker 4: Fuck, wait, what happened? My fucking kids? My fucking baby 17 00:01:10,680 --> 00:01:11,640 Speaker 4: stein his crib. 18 00:01:12,600 --> 00:01:14,160 Speaker 3: So we had to get the dang, we had to 19 00:01:14,160 --> 00:01:17,400 Speaker 3: get the toddler bed out for him, the toddler 20 00:01:17,440 --> 00:01:19,679 Speaker 3: bed out for him, and he just escapes the bloody thing. 21 00:01:21,200 --> 00:01:24,120 Speaker 1: Yeah, it's just like he's just so free now. But 22 00:01:24,160 --> 00:01:26,280 Speaker 1: also like this is all part of his development. He's 23 00:01:26,280 --> 00:01:27,080 Speaker 1: now fully in that thing.
24 00:01:27,120 --> 00:01:30,440 Speaker 2: And I liked can fly and he's just. 25 00:01:30,640 --> 00:01:33,680 Speaker 3: Like, what are you doing? What's that? What's that? Yeah, 26 00:01:33,720 --> 00:01:36,880 Speaker 3: it's just fucking it's crazy over here. Yeah, all right, 27 00:01:36,959 --> 00:01:37,479 Speaker 3: let's do this. 28 00:01:39,720 --> 00:01:46,440 Speaker 2: Okay, Yeah, well I was just that's not not a typical. 29 00:01:46,800 --> 00:01:49,800 Speaker 3: Well, not that I were screaming at Bay. There's a 30 00:01:49,840 --> 00:01:50,160 Speaker 3: lot of. 31 00:01:50,120 --> 00:01:54,080 Speaker 2: Insane super cut of accent, I think, just all the 32 00:01:54,160 --> 00:01:58,320 Speaker 2: dumb accents, all right, damn Bay? And then Andrew asking 33 00:01:58,400 --> 00:02:03,160 Speaker 2: why are energy? Yeah, he doesn't need to be a 34 00:02:03,200 --> 00:02:07,680 Speaker 2: typical Yeah, not a typical week, not a typical show. 35 00:02:08,040 --> 00:02:10,160 Speaker 3: Yeah, I don't go. I don't have much in the tank, man, 36 00:02:10,200 --> 00:02:12,520 Speaker 3: That's pretty much all I can give you A the tank. 37 00:02:13,520 --> 00:02:15,080 Speaker 3: I'm Hank the fucking tank right. 38 00:02:15,000 --> 00:02:17,840 Speaker 2: Now, the famous Hank the tank. 39 00:02:17,919 --> 00:02:22,200 Speaker 3: Yeah, it's me, Hank the Hank Engine, Thomas, the Hank Engine. 40 00:02:22,760 --> 00:02:23,320 Speaker 2: Is that a thing? 41 00:02:23,440 --> 00:02:28,200 Speaker 3: Can we use that? 42 00:02:29,400 --> 00:02:32,119 Speaker 2: I have nothing in the tank? Two? Is that a thing? 43 00:02:32,280 --> 00:02:33,600 Speaker 2: I think feels like a call? 44 00:02:33,680 --> 00:02:37,000 Speaker 3: Though I have nothing in the tank too? Is that 45 00:02:37,080 --> 00:02:37,440 Speaker 3: a thing? 
46 00:02:37,800 --> 00:02:53,480 Speaker 2: Hello, the Internet, and welcome to Season ninety three, Episode 47 00:02:53,600 --> 00:02:58,520 Speaker 2: four of Der Daily Zeitgeist. It's a production of iHeartRadio. 48 00:02:58,560 --> 00:03:00,600 Speaker 2: It's a podcast where we take a deep dive into America's 49 00:03:00,639 --> 00:03:06,720 Speaker 2: shared consciousness, and it is Thursday, June nineteenth, twenty twenty five. Yes, 50 00:03:06,919 --> 00:03:08,440 Speaker 2: it's Juneteenth. Hey. 51 00:03:08,520 --> 00:03:12,800 Speaker 1: Unless you're corporate America, then it's just Thursday, the nineteenth, Thursday. 52 00:03:13,120 --> 00:03:19,600 Speaker 2: We're off. We're off obviously, so no episode tomorrow, episode exactly. 53 00:03:19,680 --> 00:03:20,440 Speaker 3: That's where we went. 54 00:03:21,040 --> 00:03:23,200 Speaker 1: But yeah, again, if you don't know, you should know 55 00:03:23,280 --> 00:03:24,840 Speaker 1: what fucking Juneteenth is. I'm not here to fuck it. 56 00:03:24,840 --> 00:03:26,680 Speaker 3: If you don't know, come on now, educate yourself. 57 00:03:26,680 --> 00:03:30,839 Speaker 1: And it's also World Sauntering Day, which feels aggressive 58 00:03:30,880 --> 00:03:33,959 Speaker 1: to juxtapose with Juneteenth, being like, hey, why 59 00:03:33,800 --> 00:03:36,400 Speaker 3: don't you just take like a casual, fancy 60 00:03:36,040 --> 00:03:38,880 Speaker 2: free walk on this day, and then like the best 61 00:03:38,880 --> 00:03:44,680 Speaker 2: description of the gait of the slave owners sauntering around? Yeah, 62 00:03:44,720 --> 00:03:47,640 Speaker 2: you know, oh my god, day for sauntering, it says, 63 00:03:47,680 --> 00:03:48,240 Speaker 2: slow down. 64 00:03:48,320 --> 00:03:50,960 Speaker 3: In fact, try moseying around on June nineteenth. All right, 65 00:03:51,080 --> 00:03:53,360 Speaker 3: oh whoa, whoa, whoa, whoa. Yeah, I thought Mosey Day 66 00:03:53,440 --> 00:03:55,800 Speaker 3: was coming out of violence.
Yeah, I know, I know, 67 00:03:56,160 --> 00:03:57,080 Speaker 3: I know, I know. 68 00:03:57,320 --> 00:04:00,480 Speaker 1: Anyway, we see you, shout out black people, shout out, 69 00:04:00,640 --> 00:04:04,440 Speaker 1: you know, freedom for the moment, for the moment, for the. 70 00:04:04,400 --> 00:04:09,480 Speaker 2: Moment, as my favorite freedom fighter Mel Gibson once said, 71 00:04:10,520 --> 00:04:14,080 Speaker 2: you know, yeah, all right, my name is Jack O'Brien, aka 72 00:04:14,320 --> 00:04:18,720 Speaker 2: we start World War three. I got it in Yahoo 73 00:04:18,760 --> 00:04:24,039 Speaker 2: with me. We start World War three. Congress they can 74 00:04:24,120 --> 00:04:29,160 Speaker 2: suck on my peen. One courtesy of Nick Semper Tyrannis, 75 00:04:29,839 --> 00:04:34,320 Speaker 2: Nick Semper Tyrannis on the discord. A bit of the 76 00:04:34,440 --> 00:04:37,880 Speaker 2: Nick Semper Tyrannis on the discord, I think written from 77 00:04:37,880 --> 00:04:42,520 Speaker 2: the perspective of, uh, I don't know, one Donald Trump, Yeah, 78 00:04:43,400 --> 00:04:48,760 Speaker 2: JD Vance, m h yeah, this is, this was the 79 00:04:48,839 --> 00:04:52,560 Speaker 2: fear all along: put the dipshit in there, he's gonna 80 00:04:52,600 --> 00:04:55,920 Speaker 2: start World War three because somebody was like mean to him, 81 00:04:56,000 --> 00:04:58,560 Speaker 2: because we did a protest that was mean to him, and 82 00:04:58,600 --> 00:05:01,800 Speaker 2: he's like, oh yeah, check this shit out. 83 00:05:01,320 --> 00:05:05,000 Speaker 3: I'm weir right now out. Yeah you think I'm baby. 84 00:05:05,600 --> 00:05:08,039 Speaker 2: Anyways, I'm thrilled to be joined as always by my 85 00:05:08,120 --> 00:05:10,400 Speaker 2: co host, mister Miles Gray.
86 00:05:11,240 --> 00:05:13,479 Speaker 3: Miles Gray, the shogun with no gun, the most sleep 87 00:05:13,600 --> 00:05:17,040 Speaker 3: deprived dude in the San Fernando Valley at the moment. 88 00:05:17,480 --> 00:05:19,279 Speaker 3: I do want to shout out all the gang who's 89 00:05:19,320 --> 00:05:23,440 Speaker 3: reached out through many channels on social media. Someone 90 00:05:23,520 --> 00:05:26,440 Speaker 3: even posted, like I saw a thing on Reddit where 91 00:05:26,440 --> 00:05:30,440 Speaker 3: people were trying to give you advice, baby, like I 92 00:05:30,720 --> 00:05:32,720 Speaker 3: love that gang. Thank you so much. 93 00:05:33,040 --> 00:05:34,120 Speaker 2: Praying for Miles. 94 00:05:34,440 --> 00:05:37,719 Speaker 3: Yeah yeah, yeah, so again, thank you, thank you for 95 00:05:37,760 --> 00:05:41,000 Speaker 3: all the tips. I'm still in the trenches, as it were. 96 00:05:41,360 --> 00:05:43,960 Speaker 2: But I will say, overwhelmingly they seem to be give, 97 00:05:44,160 --> 00:05:46,400 Speaker 2: give the baby a weed gummy, and I just, I 98 00:05:46,440 --> 00:05:47,400 Speaker 2: feel like that's not. 99 00:05:48,839 --> 00:05:51,799 Speaker 3: I think we should, we should knock that, show encouraged 100 00:05:51,920 --> 00:05:55,000 Speaker 3: child abuse. Yeah, yeah, yeah, I'm not going to do that, 101 00:05:55,400 --> 00:05:57,680 Speaker 3: but I am doing. Look, there have been many variations 102 00:05:57,680 --> 00:05:59,800 Speaker 3: on like, just, just kind of hit, just fucking hang 103 00:05:59,839 --> 00:06:02,560 Speaker 3: out a little bit, they fall asleep. I've been trying 104 00:06:02,600 --> 00:06:04,320 Speaker 3: to do that, but then he does a thing, Jack, like 105 00:06:04,320 --> 00:06:06,760 Speaker 3: you said your kid did, the second I shift. 106 00:06:06,480 --> 00:06:08,200 Speaker 2: The weight where you go where you. 107 00:06:09,760 --> 00:06:10,000 Speaker 5: Man?
108 00:06:10,360 --> 00:06:13,840 Speaker 3: Oh, yeah, it's, it's, it's wild, but we're, we're, I'm 109 00:06:13,880 --> 00:06:15,000 Speaker 3: finding a balance. It's wild. 110 00:06:15,000 --> 00:06:17,720 Speaker 1: I also, I get so clear in the middle of 111 00:06:17,720 --> 00:06:20,280 Speaker 1: the night where I'm like barely asleep, I'm like sort 112 00:06:20,320 --> 00:06:23,080 Speaker 1: of in this like lucid state where a lot of really 113 00:06:23,120 --> 00:06:25,680 Speaker 1: bad ideas come to my mind that I, that 114 00:06:25,720 --> 00:06:27,720 Speaker 1: I think will make a good cold open, and they 115 00:06:27,720 --> 00:06:30,880 Speaker 1: don't. Try to start night podcasting, just keep a little, 116 00:06:31,000 --> 00:06:33,440 Speaker 1: just whisper night casting. 117 00:06:33,720 --> 00:06:35,960 Speaker 3: Baby still won't sleep. Hey man, you got something to say, 118 00:06:35,960 --> 00:06:39,119 Speaker 3: you got something to say that you don't go fuck? 119 00:06:42,320 --> 00:06:44,280 Speaker 2: Yeah, man, I had something to say. I'm also a 120 00:06:44,320 --> 00:06:47,000 Speaker 2: little sleep deprived, so I had a thing to say 121 00:06:47,160 --> 00:06:48,440 Speaker 2: that left my brain. 122 00:06:48,920 --> 00:06:51,120 Speaker 3: I'm gonna tell you guys this, I feel great. 123 00:06:51,160 --> 00:06:52,080 Speaker 2: I got plenty of sleep. 124 00:06:53,200 --> 00:06:55,320 Speaker 3: Yeah, at least, look, one out of three ain't bad. 125 00:06:55,480 --> 00:06:56,200 Speaker 2: So that's right. 126 00:06:56,360 --> 00:06:59,000 Speaker 3: I'm well rested but underprepared. Don't worry. 127 00:06:59,000 --> 00:07:01,599 Speaker 1: I'm gonna tap into some, then, I'm tapping into something 128 00:07:01,600 --> 00:07:03,240 Speaker 1: to keep this show fucking elevated. 129 00:07:03,320 --> 00:07:05,360 Speaker 3: Let's fucking go. Yeah. 130 00:07:05,520 --> 00:07:08,120 Speaker 2: Oh, I remember what I was gonna say.
I remember 131 00:07:08,200 --> 00:07:11,920 Speaker 2: this period of my life when my life constantly my 132 00:07:12,040 --> 00:07:17,800 Speaker 2: life shout out to Mela delirious, having a newfound 133 00:07:17,840 --> 00:07:22,520 Speaker 2: appreciation for the drugs that your brain like dumps into 134 00:07:22,560 --> 00:07:25,840 Speaker 2: your bloodstream as you're falling asleep, you know, like that, 135 00:07:25,840 --> 00:07:29,000 Speaker 2: that lucid clarity where you just like get to know 136 00:07:29,160 --> 00:07:31,720 Speaker 2: sleep a little bit better because you're kind of constantly 137 00:07:31,800 --> 00:07:34,120 Speaker 2: like in and out of it, you know. And then 138 00:07:34,600 --> 00:07:36,440 Speaker 2: like that was the first time I had a real appreciation. 139 00:07:36,520 --> 00:07:39,120 Speaker 2: It's like, oh, like all the drugs that we do 140 00:07:39,160 --> 00:07:42,800 Speaker 2: are just like trying to get back to what is, yeah, 141 00:07:42,840 --> 00:07:45,400 Speaker 2: what our brain naturally does for us, where it's just 142 00:07:45,440 --> 00:07:49,440 Speaker 2: like, and you're high, like that's what dreams are. It's 143 00:07:49,600 --> 00:07:52,760 Speaker 2: just like you are so high right now and floating 144 00:07:52,800 --> 00:07:56,400 Speaker 2: away on a stream of brain chemicals. Enjoy. Does that 145 00:07:56,440 --> 00:07:56,920 Speaker 2: make sense? 146 00:07:56,920 --> 00:08:00,360 Speaker 6: Well, because, because the drugs don't create the chemical, they're 147 00:08:00,360 --> 00:08:03,559 Speaker 6: just squeezing it out. 148 00:08:03,920 --> 00:08:06,120 Speaker 2: Give your brain. I don't think. 149 00:08:06,760 --> 00:08:08,920 Speaker 3: Yeah, it's not that MDMA I'm looking for. 150 00:08:09,000 --> 00:08:12,760 Speaker 3: I needed to squeeze all the serotonin and neurotransmitters 151 00:08:12,800 --> 00:08:13,120 Speaker 3: out of my.
152 00:08:13,600 --> 00:08:17,640 Speaker 6: Brain and then I, whoa. Right, I just said the 153 00:08:17,680 --> 00:08:21,000 Speaker 6: most freshman-in-college version of realizing how drugs work, 154 00:08:22,440 --> 00:08:26,440 Speaker 6: so having taken the bong hit, I am now philosophizing the. 155 00:08:26,520 --> 00:08:30,600 Speaker 2: Freshman-in-college bong hit. Like, first bong hit conversations 156 00:08:30,800 --> 00:08:36,720 Speaker 2: are vastly underrated. That's, that's the good stuff, that's yours. 157 00:08:37,040 --> 00:08:39,439 Speaker 3: If you could just be like, bro, we've taped, like, 158 00:08:39,559 --> 00:08:42,800 Speaker 3: it is, that's what this is, all right, that's true. 159 00:08:43,160 --> 00:08:46,560 Speaker 3: That's the level I operate at. I don't like, I 160 00:08:46,559 --> 00:08:48,679 Speaker 3: don't like podcasts where people know shit. 161 00:08:49,000 --> 00:08:52,880 Speaker 2: Yeah, it's literally in the, like, crazy idea I had 162 00:08:52,880 --> 00:08:54,800 Speaker 2: while looking at a Pink Floyd poster. 163 00:08:55,200 --> 00:08:58,680 Speaker 1: Yeah yeah. And for me it's wild reckless speculation. 164 00:08:59,160 --> 00:09:02,280 Speaker 2: That's right. Quick thing: a bunch of new listeners for 165 00:09:02,280 --> 00:09:07,320 Speaker 2: some reason, maybe the dissolution of the mainstream media, impending fascism. 166 00:09:07,400 --> 00:09:11,640 Speaker 2: But hi, welcome, and uh yeah, if people like the show, 167 00:09:11,840 --> 00:09:13,600 Speaker 2: rate, review it, tell your friend about it. 168 00:09:13,679 --> 00:09:16,080 Speaker 3: Got help. Thanks for, thanks for listening. Yeah, thank you 169 00:09:16,080 --> 00:09:17,640 Speaker 3: for taking a chance on this group. 170 00:09:17,400 --> 00:09:20,319 Speaker 2: Of, thanks for taking, right now. 171 00:09:21,720 --> 00:09:23,640 Speaker 3: It's also conversation.
172 00:09:23,720 --> 00:09:29,080 Speaker 2: It's just like so volatile, isn't it? Miles, we're thrilled 173 00:09:29,120 --> 00:09:31,480 Speaker 2: to be joined in the third, our third seat, by 174 00:09:31,800 --> 00:09:34,040 Speaker 2: one of our favorites, one of the very faces on 175 00:09:34,120 --> 00:09:37,920 Speaker 2: Mount Zeitmore, a hilarious and brilliant producer, TV writer 176 00:09:38,559 --> 00:09:40,920 Speaker 2: you know from the Yo, Is This Racist? podcast, one of 177 00:09:40,920 --> 00:09:43,920 Speaker 2: the all-time great podcasts. Yeah, it's Andrew Ti. 178 00:09:45,400 --> 00:09:47,960 Speaker 3: I didn't refer to Ka Ta, but normally, I will 179 00:09:48,000 --> 00:09:50,320 Speaker 3: say this, I usually put in quite a bit 180 00:09:50,320 --> 00:09:52,120 Speaker 3: of effort trying to come up with a song. 181 00:09:52,960 --> 00:09:56,240 Speaker 6: I did peruse briefly the Discord. New listeners, go check 182 00:09:56,280 --> 00:09:58,840 Speaker 6: out the Discord. Still more piss talk than I would 183 00:09:58,880 --> 00:09:59,280 Speaker 6: have thought. 184 00:09:59,520 --> 00:10:02,080 Speaker 2: Yeah, no, you gave them a nice talking-to the 185 00:10:02,120 --> 00:10:03,920 Speaker 2: last time you were on. You were like, go to 186 00:10:04,040 --> 00:10:08,000 Speaker 2: the AKA thing. They took that as like a personal challenge, 187 00:10:08,000 --> 00:10:11,120 Speaker 2: which I kind of respect, where they're just like, yeah, 188 00:10:11,400 --> 00:10:16,400 Speaker 2: oh yeah. They called out the fact that three years, 189 00:10:16,440 --> 00:10:19,439 Speaker 2: probably longer, four years after Jack told this anecdote, we're 190 00:10:19,480 --> 00:10:24,160 Speaker 2: still just writing lyrics about his pants and pretending 191 00:10:24,200 --> 00:10:28,720 Speaker 2: it was his water ice. We're gonna do that even more. 192 00:10:29,200 --> 00:10:33,200 Speaker 6: Yeah, yeah, sorry, I'm just okay.
Next time I'm on 193 00:10:33,240 --> 00:10:36,280 Speaker 6: the show, my mission will be to tell a personal, 194 00:10:36,360 --> 00:10:41,600 Speaker 6: bodily fluid anecdote extremity, even more extremity that. 195 00:10:41,960 --> 00:10:44,360 Speaker 2: It doesn't need to be more. Yeah, although it is 196 00:10:44,480 --> 00:10:48,120 Speaker 2: pretty unextreme, because I'm pretty sure pissed my pants, I'm 197 00:10:48,160 --> 00:10:49,320 Speaker 2: pretty sure it was, just like. 198 00:10:49,880 --> 00:10:52,160 Speaker 3: Yours is the lowest rung of embarrassing thing. 199 00:10:53,520 --> 00:10:55,240 Speaker 2: I just think we could beat it. I just think 200 00:10:55,240 --> 00:10:55,800 Speaker 2: we could beat it. 201 00:10:55,920 --> 00:11:00,800 Speaker 6: The throne I'm gunning for is relentless inside joke 202 00:11:00,960 --> 00:11:01,679 Speaker 6: on the AKA. 203 00:11:03,080 --> 00:11:06,400 Speaker 2: Impenetrable AKA. I'm pretty sure, I like, I know for 204 00:11:06,440 --> 00:11:09,160 Speaker 2: a fact, I can be. There are plenty of embarrassing, 205 00:11:09,240 --> 00:11:12,120 Speaker 2: humiliating stories from my life that I just haven't told yet. 206 00:11:12,200 --> 00:11:16,240 Speaker 2: So we're, we're doling these out. 207 00:11:14,480 --> 00:11:14,760 Speaker 5: This is. 208 00:11:16,720 --> 00:11:19,920 Speaker 2: An arm right, man. When I was drinking too much, boy, 209 00:11:20,160 --> 00:11:24,480 Speaker 2: oh boy, did I do some humiliating things, folks. You'll 210 00:11:24,480 --> 00:11:27,360 Speaker 2: find out about that in the years to come. I know, 211 00:11:27,520 --> 00:11:33,200 Speaker 2: not, not me. Anyway, really great new listeners material. This 212 00:11:33,280 --> 00:11:39,520 Speaker 2: is impenetrable layers of inside jokes. That's right, that's, that's what 213 00:11:39,559 --> 00:11:40,200 Speaker 2: we're here for. 214 00:11:40,600 --> 00:11:43,560 Speaker 6: Wonderful. Welcome to the show.
215 00:11:44,040 --> 00:11:45,840 Speaker 3: Welcome, and now get the fuck out of here. 216 00:11:47,240 --> 00:11:49,280 Speaker 2: Andrew, we're thrilled to have you. We're going to get 217 00:11:49,280 --> 00:11:50,920 Speaker 2: to know you a little bit better in a moment. First, 218 00:11:50,920 --> 00:11:52,360 Speaker 2: we're going to tell the listeners a couple of the 219 00:11:52,360 --> 00:11:54,720 Speaker 2: things that we're talking about. Uh, we're going to just 220 00:11:54,720 --> 00:11:56,599 Speaker 2: take a look at the general state of the world. 221 00:11:57,559 --> 00:12:01,360 Speaker 2: How, you know, we're seeing some bad things happen, and 222 00:12:02,040 --> 00:12:06,000 Speaker 2: we're generally not good at looking, running the tape back 223 00:12:06,120 --> 00:12:09,400 Speaker 2: to see where those bad things came from. 224 00:12:09,760 --> 00:12:09,960 Speaker 3: Yep. 225 00:12:10,000 --> 00:12:12,080 Speaker 2: But we're gonna, we're gonna look at a couple, the 226 00:12:12,120 --> 00:12:17,680 Speaker 2: two big ones, yeah, the ICE raids and the war 227 00:12:17,800 --> 00:12:22,000 Speaker 2: with Iran, the war being waged on Iran. 228 00:12:22,840 --> 00:12:26,360 Speaker 3: Yeah, we're gonna look at both of those situations and 229 00:12:26,760 --> 00:12:28,480 Speaker 3: whether the US had anything to do with. 230 00:12:28,440 --> 00:12:31,920 Speaker 1: The anything to do, anything, anything at all, probably anything 231 00:12:31,960 --> 00:12:32,240 Speaker 1: at all. 232 00:12:32,360 --> 00:12:35,080 Speaker 2: Yeah, we're just here. Things are happening to us. 233 00:12:34,800 --> 00:12:38,760 Speaker 3: Not the American boomerang striking us right in the face again, camp, Yeah, 234 00:12:38,760 --> 00:12:39,360 Speaker 3: at this time. 235 00:12:39,800 --> 00:12:42,080 Speaker 2: So we'll talk about that. We'll also talk about why 236 00:12:42,120 --> 00:12:45,960 Speaker 2: we like Tucker Carlson.
Now, no, that's not true, but Tucker Carlson, 237 00:12:46,040 --> 00:12:49,120 Speaker 2: welcome to the Resistance. Don't know how many times I've 238 00:12:49,120 --> 00:12:54,040 Speaker 2: seen that tweeted now, from one of our new listeners. Yeah, 239 00:12:55,040 --> 00:13:02,199 Speaker 2: come to brunch. No, the lib cookout is brunch. We 240 00:13:02,280 --> 00:13:07,080 Speaker 2: need these people at the cookout. But the people who 241 00:13:07,120 --> 00:13:09,600 Speaker 2: say Tucker Carlson, welcome to brunch, I know exactly. 242 00:13:10,160 --> 00:13:13,200 Speaker 2: We need the, if Kamala had won, we'd all be 243 00:13:13,200 --> 00:13:15,440 Speaker 2: at brunch right now. We need them, we need them, 244 00:13:15,600 --> 00:13:19,200 Speaker 2: we want a wide tent. I don't have to like them, though, 245 00:13:19,240 --> 00:13:21,720 Speaker 2: and I'm going to talk mad shit about them and 246 00:13:21,760 --> 00:13:21,960 Speaker 2: you can. 247 00:13:22,880 --> 00:13:25,560 Speaker 3: But look, we got to stay focused, man, it's about. 248 00:13:27,120 --> 00:13:30,080 Speaker 2: We'll talk about that. There's some, there's an article in 249 00:13:30,120 --> 00:13:33,840 Speaker 2: the New York Times just talking, just giving some anecdotes 250 00:13:34,160 --> 00:13:39,480 Speaker 2: and then some research about how people are interacting with Chat 251 00:13:39,520 --> 00:13:46,120 Speaker 2: GPT and large language models. That's pretty wild.
252 00:13:47,320 --> 00:13:51,640 Speaker 2: I'm always skeptical about these because I remember an era 253 00:13:52,000 --> 00:13:56,800 Speaker 2: when the mainstream media was just combing the crime blotter 254 00:13:56,920 --> 00:13:59,839 Speaker 2: for any crime that was committed that was in any 255 00:13:59,840 --> 00:14:04,160 Speaker 2: way tangentially related to either video games or internet chat 256 00:14:04,240 --> 00:14:08,040 Speaker 2: rooms, would be like chat room murder, you know, like, 257 00:14:09,040 --> 00:14:11,560 Speaker 2: and so I don't want to fall into that. On 258 00:14:11,600 --> 00:14:15,280 Speaker 2: the other hand, this stuff is pretty unique to Chat 259 00:14:15,320 --> 00:14:18,840 Speaker 2: GPT and AI, and it's, it's. 260 00:14:18,720 --> 00:14:20,480 Speaker 6: Worked over ahead a little bit. I don't think it 261 00:14:20,600 --> 00:14:24,200 Speaker 6: is, man. People did this with fucking ELIZA in the nineties. 262 00:14:24,520 --> 00:14:28,960 Speaker 2: That's true. Yeah, I think we're just, what, we're like. 263 00:14:29,080 --> 00:14:30,680 Speaker 2: They are nonetheless very entertaining. 264 00:14:33,280 --> 00:14:35,600 Speaker 1: I don't know, encouraging you to do harm to yourself, which, 265 00:14:35,640 --> 00:14:37,000 Speaker 1: in a lot of cases, which. 266 00:14:36,800 --> 00:14:40,640 Speaker 2: It is not entertaining. And Ozzy Osbourne's DNA, you can 267 00:14:40,680 --> 00:14:44,480 Speaker 2: go buy that, uh, fortunately it's just saliva. Ah. 268 00:14:45,120 --> 00:14:47,000 Speaker 3: No, it's fine. Whatever you find under your. 269 00:14:46,920 --> 00:14:49,440 Speaker 2: Canail, your come Ozzy Osbourne. 270 00:14:49,920 --> 00:14:53,800 Speaker 3: Yeah, Ozzy Osbourne. It's just, what, he doesn't even have 271 00:14:53,920 --> 00:14:56,840 Speaker 3: DNA at this point. I'm pretty sure he's like this. 272 00:14:57,000 --> 00:14:58,600 Speaker 2: Household, we believe.
273 00:14:59,360 --> 00:15:01,440 Speaker 3: Give me your Ozzy Osbourne. 274 00:15:01,560 --> 00:15:05,840 Speaker 2: All right, Andrew, welcome new listeners, all of that, plenty more. 275 00:15:06,720 --> 00:15:08,840 Speaker 6: I feel like you guys took this as a challenge 276 00:15:09,480 --> 00:15:11,520 Speaker 6: as much as. 277 00:15:14,680 --> 00:15:16,320 Speaker 3: You know, all my shit is going to be weird 278 00:15:16,400 --> 00:15:17,040 Speaker 3: and dumb. 279 00:15:17,440 --> 00:15:18,720 Speaker 2: Yeah, so good, so perfect. 280 00:15:18,840 --> 00:15:21,480 Speaker 3: Strap in, strap in, assholes. Uh. 281 00:15:21,560 --> 00:15:24,400 Speaker 2: Andrew, with something from your search history that's revealing about 282 00:15:24,400 --> 00:15:24,800 Speaker 2: who you are. 283 00:15:25,400 --> 00:15:29,760 Speaker 6: I started, uh, I guess the best way to say 284 00:15:29,800 --> 00:15:34,600 Speaker 6: this is, is it like best, this, this is just 285 00:15:34,640 --> 00:15:38,120 Speaker 6: an amalgamation of quite a long couple of minutes of searching, 286 00:15:38,680 --> 00:15:42,680 Speaker 6: but: best container for steaming vegetables in a microwave. I've 287 00:15:42,720 --> 00:15:44,880 Speaker 6: become a microwave steaming motherfucker. 288 00:15:45,000 --> 00:15:48,960 Speaker 3: Oh wow, Steamin' Willie Beamon over here exactly. 289 00:15:49,880 --> 00:15:54,040 Speaker 6: I visited my sister, who is like a new-ish mom, 290 00:15:54,520 --> 00:15:57,280 Speaker 6: and she's just like, chop up the broccoli, put it 291 00:15:57,280 --> 00:15:58,400 Speaker 6: in the fucking microwave. 292 00:15:58,440 --> 00:16:03,520 Speaker 3: Lets you learn a lesson about efficiency. So good. 293 00:16:03,600 --> 00:16:06,760 Speaker 6: Yeah, it really, like, I was like, oh, this is 294 00:16:06,840 --> 00:16:09,520 Speaker 6: fucking honestly amazing. 295 00:16:09,880 --> 00:16:11,680 Speaker 3: And I just wanted to make sure.
296 00:16:12,280 --> 00:16:15,320 Speaker 6: I don't know, I feel like I'm like, like definitively 297 00:16:15,400 --> 00:16:20,200 Speaker 6: like barn door after the microplastics have like deeply invaded 298 00:16:20,200 --> 00:16:23,360 Speaker 6: my brain already. But I got like cold feet about 299 00:16:23,360 --> 00:16:25,080 Speaker 6: the vessel I was using in the microwave. 300 00:16:25,200 --> 00:16:27,480 Speaker 3: My vegetables did, so I had to, I had to 301 00:16:27,520 --> 00:16:28,920 Speaker 3: go down that rabbit hole a little bit. 302 00:16:29,040 --> 00:16:33,680 Speaker 6: But yeah, it's been mostly saran wrap and, with my, yeah, 303 00:16:33,760 --> 00:16:34,520 Speaker 6: glass bowl. 304 00:16:35,080 --> 00:16:38,200 Speaker 2: That's because it does it every time. That works for me. Yeah, yeah, 305 00:16:38,240 --> 00:16:39,200 Speaker 2: I use. 306 00:16:39,120 --> 00:16:41,280 Speaker 3: A Mylar balloon usually. 307 00:16:42,440 --> 00:16:48,640 Speaker 2: Good. I actually use Scotchgard every time. I 308 00:16:48,720 --> 00:16:51,600 Speaker 2: created a bowl out of just pure Scotchgard. I like, 309 00:16:51,680 --> 00:16:54,080 Speaker 2: put my hand in a bowl shape and Scotchgard 310 00:16:54,120 --> 00:16:57,480 Speaker 2: the inside of my hands like a little bowl, 311 00:16:57,600 --> 00:16:59,680 Speaker 2: and that's, oh my god, that's beautiful. And then into 312 00:16:59,680 --> 00:17:03,240 Speaker 2: the microwave. It was the original forever chemical, I think, right, 313 00:17:03,400 --> 00:17:07,840 Speaker 2: that's like the one that three M made, and like, yeah, 314 00:17:07,840 --> 00:17:11,720 Speaker 2: people were like immediately, like, if you lived anywhere that 315 00:17:11,800 --> 00:17:15,320 Speaker 2: Scotchgard was made during the fifties, you have this in 316 00:17:15,359 --> 00:17:16,800 Speaker 2: your, that was. 317 00:17:16,800 --> 00:17:20,520 Speaker 3: Just like, like tape for the inside of your veins basically.
318 00:17:22,040 --> 00:17:27,080 Speaker 2: Exactly. It's just like, like scotch tape, but inside you. 319 00:17:28,640 --> 00:17:32,040 Speaker 2: What if you had just like liquid scotch tape running 320 00:17:32,040 --> 00:17:32,879 Speaker 2: throughout your body? 321 00:17:33,160 --> 00:17:36,560 Speaker 6: Yeah, I know, I'm trying to stay in the, like, 322 00:17:37,600 --> 00:17:41,679 Speaker 6: you know, unintended consequences, maybe positive thing. I mean, you know, 323 00:17:41,800 --> 00:17:44,840 Speaker 6: obviously this can't be good. It's a little bit the 324 00:17:44,880 --> 00:17:47,840 Speaker 6: same as the, you know, ChatGPT brain poison thing. 325 00:17:47,960 --> 00:17:50,920 Speaker 3: But I'm just like, I don't know, on some level, 326 00:17:50,960 --> 00:17:52,760 Speaker 3: maybe, you know, maybe we're going to find. 327 00:17:52,600 --> 00:17:54,440 Speaker 6: Out that people with all this Scotchgard in their 328 00:17:54,440 --> 00:17:58,600 Speaker 6: blood get fewer strokes, or, I guess what's more likely, way, way, 329 00:17:58,640 --> 00:17:59,560 Speaker 6: way more strokes. 330 00:17:59,760 --> 00:18:03,160 Speaker 2: Yeah, or fewer, our blood more slippery. 331 00:18:03,200 --> 00:18:03,800 Speaker 3: We shall see. 332 00:18:03,840 --> 00:18:04,320 Speaker 2: So it's good. 333 00:18:04,480 --> 00:18:08,439 Speaker 3: Yeah yeah, plastic dudes, we're just, we're just plastic, plastic. Anyway, 334 00:18:08,960 --> 00:18:10,480 Speaker 3: I've been having unbelievable. 335 00:18:11,080 --> 00:18:17,600 Speaker 6: Amounts of, uh, steamed, steamed broccoli mostly. 336 00:18:17,400 --> 00:18:21,159 Speaker 2: Do we like, have we underrated the microwave? Just, I 337 00:18:21,200 --> 00:18:25,040 Speaker 2: feel like it feels worse than it actually is. Like, 338 00:18:25,119 --> 00:18:28,159 Speaker 2: I've always like felt like that's a cancer box. But 339 00:18:28,320 --> 00:18:29,919 Speaker 2: like I'll use it anyways.
340 00:18:30,040 --> 00:18:34,200 Speaker 6: Yeah, I think, I think, not to, not to doxx 341 00:18:34,280 --> 00:18:36,760 Speaker 6: us to the new listeners in our eternal youth. 342 00:18:36,640 --> 00:18:41,639 Speaker 3: But I think people in our generation, what, we're twenty seven, 343 00:18:41,840 --> 00:18:44,359 Speaker 3: so we were, we were all this, I'm four. 344 00:18:45,040 --> 00:18:51,560 Speaker 5: Yeah, we were saying, kids, don't stand, don't stand too 345 00:18:51,560 --> 00:18:54,000 Speaker 5: close to the microwave. Like three grades above me, right, 346 00:18:56,080 --> 00:18:58,200 Speaker 5: he's like three grades, are you guys? 347 00:18:58,440 --> 00:18:58,880 Speaker 2: Teachers? 348 00:18:58,880 --> 00:19:02,320 Speaker 3: Are we, I mean, twentieth grade? 349 00:19:03,280 --> 00:19:05,760 Speaker 2: It actually, we met so young that it's weird that 350 00:19:05,800 --> 00:19:12,760 Speaker 2: I was hanging out with him being overwing. That guy's 351 00:19:12,760 --> 00:19:19,560 Speaker 2: already going through puberty. Why college kid lets me sip 352 00:19:19,600 --> 00:19:23,240 Speaker 2: some beers, okay, just some people. I'm reading that the 353 00:19:23,320 --> 00:19:25,640 Speaker 2: microwaves that come out are non-ionizing. 354 00:19:26,200 --> 00:19:28,800 Speaker 3: So that's right. That isn't dangerous to people. 355 00:19:30,119 --> 00:19:34,680 Speaker 2: That's good. Yeah, I think, I think it's, it's. 356 00:19:34,080 --> 00:19:37,359 Speaker 6: Like, because they've been around for so long, if they 357 00:19:37,359 --> 00:19:40,919 Speaker 6: were causing cancer, first of all, how would we know, right, 358 00:19:41,840 --> 00:19:43,480 Speaker 6: there's just too much. 359 00:19:43,640 --> 00:19:45,640 Speaker 3: We're just speculating. Anyways. 360 00:19:45,640 --> 00:19:48,159 Speaker 2: This is usually what we're showing back.
This is usually 361 00:19:48,160 --> 00:19:49,960 Speaker 2: when our show gets in trouble, when we're like, and 362 00:19:50,080 --> 00:19:53,200 Speaker 2: what about, like, water alkalizing and. 363 00:19:54,200 --> 00:19:58,520 Speaker 3: Oh, oh man, listen, I'm replacing either my under 364 00:19:58,600 --> 00:20:02,440 Speaker 3: or overrated with that. No, get ready, it's gonna 365 00:20:02,440 --> 00:20:03,199 Speaker 3: be even worse. 366 00:20:03,400 --> 00:20:07,120 Speaker 2: Oh no, all right, Andrew, what's something you think is underrated? 367 00:20:07,520 --> 00:20:09,920 Speaker 2: Oh man, just using normal-ass water? 368 00:20:10,560 --> 00:20:15,320 Speaker 6: Way underrated? Not using normal-ass? Okay, listen, whatever I got. 369 00:20:16,160 --> 00:20:20,760 Speaker 6: I took one step up in coffee madness, and I 370 00:20:20,840 --> 00:20:27,720 Speaker 6: am now buying separate minerals to put into distilled water 371 00:20:28,640 --> 00:20:31,040 Speaker 6: to make coffee with in the mornings. 372 00:20:31,320 --> 00:20:36,640 Speaker 2: So you're using distilled water and then adding minerals from 373 00:20:36,680 --> 00:20:37,639 Speaker 2: tap water back in. 374 00:20:38,359 --> 00:20:42,280 Speaker 6: Yeah, but in like a calibrated proportion, I suppose. So I 375 00:20:42,280 --> 00:20:46,400 Speaker 6: guess underrated is not using an insane coffee process. Here's 376 00:20:46,400 --> 00:20:48,960 Speaker 6: the thing I will tell you, Jack, I don't have 377 00:20:49,119 --> 00:20:55,480 Speaker 6: the palate or the ability, especially given the, like, cognitive 378 00:20:55,480 --> 00:20:57,639 Speaker 6: dissonance that has gone into this process, to tell you 379 00:20:57,680 --> 00:20:58,720 Speaker 6: whether it tastes better. 380 00:20:59,119 --> 00:21:01,560 Speaker 2: It tastes better to me, sure, but.
381 00:21:01,520 --> 00:21:04,520 Speaker 3: That's because I fucking bought a powder off the internet 382 00:21:04,840 --> 00:21:08,440 Speaker 3: and now have to, like, buy water. Including this week, 383 00:21:08,480 --> 00:21:12,320 Speaker 3: I went at midnight, past midnight, to the CVS that 384 00:21:12,440 --> 00:21:15,199 Speaker 3: was open, just so I could get distilled water so 385 00:21:15,280 --> 00:21:16,720 Speaker 3: I could have my morning coffee. 386 00:21:17,160 --> 00:21:21,160 Speaker 2: Damn, wow, wow, culinary arts. 387 00:21:21,240 --> 00:21:24,280 Speaker 1: Yeah, you're doing this thing about, like, marinating a Coke 388 00:21:24,320 --> 00:21:26,240 Speaker 1: in the refrigerator for three weeks. 389 00:21:26,640 --> 00:21:27,800 Speaker 3: It tastes better. 390 00:21:27,920 --> 00:21:33,480 Speaker 2: Man, it's crispy, although it actually gets a little bit 391 00:21:33,600 --> 00:21:38,920 Speaker 2: overly crispy. I like the five-day crispiness. Yeah, five days, 392 00:21:38,920 --> 00:21:39,480 Speaker 2: twelve hours. 393 00:21:39,680 --> 00:21:42,080 Speaker 3: Got to bring it back. What, Andrew, do you think 394 00:21:42,160 --> 00:21:45,959 Speaker 3: is overrated? All right, I'll just be, I, it's, it is. 395 00:21:46,240 --> 00:21:47,960 Speaker 6: As we talked about it a second, it is, And 396 00:21:48,320 --> 00:21:50,480 Speaker 6: I'm positive I said something like this last time. 397 00:21:50,520 --> 00:21:51,440 Speaker 2: It was a big protest. 398 00:21:51,480 --> 00:21:53,720 Speaker 6: But it is something I need to constantly tamp down 399 00:21:53,920 --> 00:21:57,000 Speaker 6: in myself, which is the, like, my reflexive hating on 400 00:21:57,240 --> 00:22:03,960 Speaker 6: the, like, drum folks at protests, because it's, it's like, 401 00:22:04,040 --> 00:22:08,040 Speaker 6: they're fine. They're fine, they genuinely are fine. I, I 402 00:22:08,680 --> 00:22:12,080 Speaker 6: have to remind myself of a saying that.
I feel like 403 00:22:12,359 --> 00:22:17,960 Speaker 6: I coined, but maybe not, which is, which is: 404 00:22:18,960 --> 00:22:22,920 Speaker 6: being corny is not a crime. Like, I don't think 405 00:22:22,920 --> 00:22:25,720 Speaker 6: it's, I don't, it's not, to me, the funniest thing 406 00:22:26,440 --> 00:22:30,719 Speaker 6: I've ever heard. I wish more people on whatever is 407 00:22:30,840 --> 00:22:35,560 Speaker 6: left of fucking Nancy Pelosi would be a little more 408 00:22:35,600 --> 00:22:42,200 Speaker 6: creative and not parrot a bit from fucking five years ago. Still, however, 409 00:22:43,400 --> 00:22:45,920 Speaker 6: you know, I'm glad you're here, thank you for being 410 00:22:45,920 --> 00:22:46,360 Speaker 6: out on the. 411 00:22:46,280 --> 00:22:51,080 Speaker 2: Streets, and honestly, kind of a revolutionary idea if we could, 412 00:22:51,119 --> 00:22:55,000 Speaker 2: if everybody across the board could just, like, inhabit that, 413 00:22:55,240 --> 00:22:58,280 Speaker 2: like, from, like, you know, far left. Yeah, just corny 414 00:22:58,359 --> 00:23:01,399 Speaker 2: is not a crime. Like, yes, like, it is them 415 00:23:01,440 --> 00:23:05,679 Speaker 2: singing "take Trump out of the White House." Why is that cool? 416 00:23:06,160 --> 00:23:09,840 Speaker 2: Is that funny? No, it's not. But if we took 417 00:23:09,880 --> 00:23:11,879 Speaker 2: Trump out of the White House, the world would be better. 418 00:23:12,119 --> 00:23:14,600 Speaker 2: So let's just let them sing the thing that I 419 00:23:14,640 --> 00:23:16,359 Speaker 2: think we all agree on. I don't, Yeah, I. 420 00:23:16,359 --> 00:23:17,800 Speaker 3: Don't, I'm not, I don't, I'm not. 421 00:23:18,040 --> 00:23:20,760 Speaker 1: Uh, I'm not mad at the message, you know what I mean, Right, Yeah, 422 00:23:20,960 --> 00:23:21,960 Speaker 1: that's, I think.
423 00:23:21,840 --> 00:23:24,880 Speaker 6: It's just like, like, because I would say, I mean, 424 00:23:24,960 --> 00:23:28,600 Speaker 6: the biggest, like, tragedy of this is, like, yes, all 425 00:23:28,640 --> 00:23:33,040 Speaker 6: the, like, kind of, like, wokeness scolds, including me on 426 00:23:33,240 --> 00:23:38,600 Speaker 6: Yosi's racist, like a big part of this, like, insane 427 00:23:38,640 --> 00:23:41,960 Speaker 6: fascist backlash was, like, those people feeling. 428 00:23:43,400 --> 00:23:47,920 Speaker 3: You know, victimized. Yeah, sure, well you mean to me. 429 00:23:48,240 --> 00:23:49,080 Speaker 2: We weren't wrong. 430 00:23:49,520 --> 00:23:51,720 Speaker 1: Yeah, well, I think the other part exactly, it's like, yeah, 431 00:23:51,720 --> 00:23:54,399 Speaker 1: you need to respect trans people. That's, that's, that's a 432 00:23:54,480 --> 00:23:57,040 Speaker 1: non-negotiable. But I think the thing is, for people 433 00:23:57,080 --> 00:23:59,600 Speaker 1: who are hearing that when their, when their material 434 00:24:00,320 --> 00:24:04,840 Speaker 1: living situation is dire, it's like, I honestly don't have 435 00:24:04,880 --> 00:24:08,639 Speaker 1: the bandwidth for that. I'm, I don't have money, and 436 00:24:08,680 --> 00:24:10,679 Speaker 1: I'm supposed to, and I'm being yelled at for the, 437 00:24:10,760 --> 00:24:13,840 Speaker 1: like, what's the priority here? So I mean, I think 438 00:24:13,840 --> 00:24:15,960 Speaker 1: that's where, you know, a lot of the times leadership 439 00:24:16,000 --> 00:24:17,800 Speaker 1: fails to be like, you actually need to be really 440 00:24:17,840 --> 00:24:21,680 Speaker 1: focusing on, like, material inequality, like, that's right, that should 441 00:24:21,800 --> 00:24:25,200 Speaker 1: underpin everything else you do, because you need to solve 442 00:24:25,240 --> 00:24:27,280 Speaker 1: that piece to get people on board to.
443 00:24:27,280 --> 00:24:30,000 Speaker 3: Reject right-wing ideology, like, that's. 444 00:24:29,840 --> 00:24:33,000 Speaker 1: Just, that's, the data is there, baby, you just gotta 445 00:24:33,520 --> 00:24:34,640 Speaker 1: blow the paradigm up. 446 00:24:35,119 --> 00:24:39,000 Speaker 6: The, the, like, tiny, like, disconnect, and the tiny way 447 00:24:39,040 --> 00:24:41,680 Speaker 6: that this, like, the "corny is a crime" people have 448 00:24:41,720 --> 00:24:42,800 Speaker 6: a leg to stand on. 449 00:24:43,080 --> 00:24:46,680 Speaker 2: Which is that, like, the problem was just. 450 00:24:46,640 --> 00:24:48,639 Speaker 6: sort of like, we'll just say, I don't know, in 451 00:24:48,680 --> 00:24:54,359 Speaker 6: my opinion, the Democratic Party thought the corn was, like, sufficient, 452 00:24:54,640 --> 00:24:57,040 Speaker 6: like it was the kneeling-in-kente-cloth business. 453 00:24:57,160 --> 00:24:59,520 Speaker 2: Yeah, yeah, right, that's actually cool. 454 00:24:59,680 --> 00:25:02,119 Speaker 3: You mean, it's like, we call those pump fakes. 455 00:25:02,320 --> 00:25:05,320 Speaker 6: Actually, but, but, like, that stuff is all fine, and, 456 00:25:05,400 --> 00:25:09,080 Speaker 6: like, broadly speaking, correct, you just can't also then sell 457 00:25:09,119 --> 00:25:12,480 Speaker 6: out people's economic futures to the same people Trump is 458 00:25:13,000 --> 00:25:13,919 Speaker 6: trying to sell them out to as well. 459 00:25:13,960 --> 00:25:16,119 Speaker 3: And also, like I said, you can't pump fake, like, 460 00:25:16,200 --> 00:25:18,919 Speaker 3: don't take a knee acting like, yo, this police violence 461 00:25:18,920 --> 00:25:21,080 Speaker 3: is out of hand.
And then anytime some kind of 462 00:25:21,080 --> 00:25:23,639 Speaker 3: meaningful police reform bill comes up, you're like, I'm actually 463 00:25:23,640 --> 00:25:26,600 Speaker 3: in solidarity with the police union. Like, this is the 464 00:25:26,600 --> 00:25:28,520 Speaker 3: most they would let me get away with that summer, 465 00:25:28,560 --> 00:25:31,119 Speaker 3: and that's, that's, I take it to the limit. Yeah, 466 00:25:31,200 --> 00:25:34,520 Speaker 3: if your only thing is chasing whatever is polling well 467 00:25:34,640 --> 00:25:37,760 Speaker 3: in the moment, like, people, people catch onto that pretty quickly. 468 00:25:37,800 --> 00:25:41,919 Speaker 6: But it's not even that, because economic equality is polling well. 469 00:25:42,359 --> 00:25:46,480 Speaker 3: So, like, but I mean, that's, that's them to the core. 470 00:25:46,600 --> 00:25:49,280 Speaker 3: They're like, I mean, the things we can get away 471 00:25:49,359 --> 00:25:52,400 Speaker 3: with by using the rationale that it's popular, we'll do that. 472 00:25:52,440 --> 00:25:54,800 Speaker 3: Anything that upsets the status quo, even if it is 473 00:25:55,760 --> 00:25:56,560 Speaker 3: single payer. 474 00:25:56,440 --> 00:25:57,199 Speaker 2: or what the. 475 00:26:00,920 --> 00:26:05,800 Speaker 6: Go nil guys, yeah, yeah, yeah, Neil, Neil, that'll distract him. 476 00:26:05,800 --> 00:26:07,080 Speaker 3: That'll distract him. 477 00:26:07,160 --> 00:26:12,640 Speaker 6: So that's, anyway, that was an incoherent set of under 478 00:26:12,680 --> 00:26:13,399 Speaker 6: and overrated. 479 00:26:13,480 --> 00:26:16,520 Speaker 2: But, like, it, I feel, I'll leave it as an exercise. 480 00:26:16,800 --> 00:26:18,679 Speaker 6: I'll leave it as an exercise to the reader to 481 00:26:18,720 --> 00:26:22,679 Speaker 6: figure out what my actual cipher is. 482 00:26:23,840 --> 00:26:25,800 Speaker 2: Over-, under-, and unrated.
483 00:26:26,400 --> 00:26:28,680 Speaker 3: Yeah, all three, let's take a quick break. 484 00:26:28,760 --> 00:26:43,960 Speaker 2: We'll be right back and we are right back and. 485 00:26:44,160 --> 00:26:48,680 Speaker 3: From where we started, right back where we started again. Yeah, 486 00:26:48,680 --> 00:26:50,440 Speaker 3: we kind of are right back where we started again. 487 00:26:50,560 --> 00:26:54,320 Speaker 2: Yeah, I mean, so we like there. 488 00:26:54,280 --> 00:26:56,679 Speaker 1: is so much, there's just so much talk of, you 489 00:26:56,680 --> 00:26:59,080 Speaker 1: know, right now, ambiently, right, all we're hearing in America 490 00:26:59,160 --> 00:27:03,879 Speaker 1: is, like, violent immigrants and immigrant gangs and the threat 491 00:27:03,920 --> 00:27:07,000 Speaker 1: of a nuclear Iran, and sadly the media is just 492 00:27:07,040 --> 00:27:10,560 Speaker 1: like relying on their bad habits of providing zero context 493 00:27:10,600 --> 00:27:13,520 Speaker 1: when talking about issues that have global ramifications. 494 00:27:14,400 --> 00:27:16,000 Speaker 3: I just want to start off pointing out a couple 495 00:27:16,000 --> 00:27:16,760 Speaker 3: of things. Of these things: 496 00:27:17,119 --> 00:27:21,040 Speaker 1: Immigration, right. We're currently seeing a campaign of terror unfold 497 00:27:21,080 --> 00:27:24,159 Speaker 1: on our streets as masked-goon secret police, whoever the 498 00:27:24,200 --> 00:27:27,200 Speaker 1: fuck they are, just snatching up innocent people off the streets. 499 00:27:27,520 --> 00:27:30,199 Speaker 1: I say innocent to, like, juxtapose that with, like, the 500 00:27:30,280 --> 00:27:33,680 Speaker 1: DHS and ICE officials who are using as their rationale 501 00:27:33,800 --> 00:27:36,720 Speaker 1: like this idea, like:
We're getting only 502 00:27:36,760 --> 00:27:40,719 Speaker 1: the violent criminals, the worst, these murderers, and they typically 503 00:27:40,800 --> 00:27:44,439 Speaker 1: evoke the boogeyman of MS thirteen. Just want to remind 504 00:27:44,440 --> 00:27:48,520 Speaker 1: ourselves, the United States is basically the fucking reason MS 505 00:27:48,600 --> 00:27:51,200 Speaker 1: thirteen even exists in the first place. Here's the, here's 506 00:27:51,240 --> 00:27:52,960 Speaker 1: the very quick, truncated version. 507 00:27:53,000 --> 00:27:56,840 Speaker 3: Okay. During the Cold War, the US was using financial 508 00:27:56,840 --> 00:28:00,600 Speaker 3: coercion and arming governments to fight off any suspected expansion 509 00:28:00,720 --> 00:28:04,480 Speaker 3: of, like, leftist ideology that they read as communist. Okay. 510 00:28:04,840 --> 00:28:08,639 Speaker 1: In El Salvador, the possibility of anything resembling a leftist 511 00:28:08,680 --> 00:28:12,480 Speaker 1: government caused concern, and the US began arming the opposition, 512 00:28:12,880 --> 00:28:15,920 Speaker 1: and contributed to the civil war there. This caused many 513 00:28:15,920 --> 00:28:19,920 Speaker 1: people to flee the unrest to places like the United States. 514 00:28:20,560 --> 00:28:23,280 Speaker 1: Many of these young men who arrived in LA, they 515 00:28:23,359 --> 00:28:26,920 Speaker 1: learned LA gang culture, they took it on, and then once Bill 516 00:28:26,960 --> 00:28:31,159 Speaker 1: Clinton started his whole fucking policy of super predators, of 517 00:28:31,240 --> 00:28:35,000 Speaker 1: deporting people en masse, basically all these young men came 518 00:28:35,040 --> 00:28:39,440 Speaker 1: from LA and we started exporting gang culture to El Salvador. 519 00:28:39,680 --> 00:28:42,000 Speaker 1: And that's how you begin to see the beginnings of 520 00:28:42,160 --> 00:28:45,000 Speaker 1: MS thirteen, you know, show up.
So it's actually our 521 00:28:45,160 --> 00:28:48,800 Speaker 1: anti-communist, military interventionist habit and the love of deporting 522 00:28:48,840 --> 00:28:54,000 Speaker 1: people from destabilized nations that we destabilized that created MS thirteen. 523 00:28:54,120 --> 00:28:56,840 Speaker 2: Okay, jeez, I don't know, Miles, this doesn't sound 524 00:28:56,920 --> 00:28:58,320 Speaker 2: like us. 525 00:28:59,240 --> 00:29:01,800 Speaker 3: Chiquita in Latin America, just read up on it. 526 00:29:02,000 --> 00:29:03,400 Speaker 3: Like what they were trying to do with their bananas. 527 00:29:04,640 --> 00:29:07,920 Speaker 2: You can just replace El Salvador with any other. 528 00:29:08,680 --> 00:29:13,080 Speaker 3: Yeah, like, it's all fucking there. Okay, so this is 529 00:29:13,160 --> 00:29:15,160 Speaker 3: why we, and when people go, why. 530 00:29:14,960 --> 00:29:18,600 Speaker 1: don't they stay in their countries, because we fucking destabilized 531 00:29:18,640 --> 00:29:23,080 Speaker 1: them, because they deigned to flirt with, like, socialism and, 532 00:29:23,240 --> 00:29:26,920 Speaker 1: just like, we're gonna nationalize our industry. 533 00:29:25,400 --> 00:29:30,040 Speaker 3: Now one of the industries, right, fuck you, you're gonna. 534 00:29:29,920 --> 00:29:34,040 Speaker 2: And this is, what's gonna come in is that it's 535 00:29:34,040 --> 00:29:36,480 Speaker 2: gonna be a massive corporation, and they're going to do 536 00:29:37,040 --> 00:29:39,920 Speaker 2: what, you know, like what Walmart did to the Midwest, 537 00:29:40,080 --> 00:29:42,360 Speaker 2: like, what, they're just going to hoover up all the 538 00:29:42,440 --> 00:29:45,520 Speaker 2: resources and take it out of your country and then take. 539 00:29:45,960 --> 00:29:49,360 Speaker 3: To as fuck and put it in the market. Yeah, 540 00:29:49,400 --> 00:29:51,120 Speaker 3: and also kill anyone who's trying to oppose it.
541 00:29:51,200 --> 00:29:54,560 Speaker 6: And this is also so poor people in America wouldn't see 542 00:29:54,800 --> 00:29:58,320 Speaker 6: a system like this working, so they would always, 543 00:29:58,920 --> 00:30:01,360 Speaker 6: as they've been doing, vote for more. 544 00:30:01,640 --> 00:30:04,480 Speaker 2: Look at Cuba. Look at Cuba, guys, you can't vote 545 00:30:04,480 --> 00:30:06,960 Speaker 2: for fucking anybody on the left. Look at what Cuba. 546 00:30:07,080 --> 00:30:08,720 Speaker 2: Well, look how bad it 547 00:30:08,720 --> 00:30:11,440 Speaker 3: is in Venezuela. You mean because of all the fucking embargoes, 548 00:30:11,760 --> 00:30:13,760 Speaker 3: because they can't get medicine. 549 00:30:13,800 --> 00:30:16,360 Speaker 1: But why do you think, again, that situation looks 550 00:30:16,400 --> 00:30:18,640 Speaker 1: like that? Because we have a hand in that. So again, 551 00:30:18,680 --> 00:30:21,280 Speaker 1: now we have Iran. Right now, there's so many 552 00:30:21,280 --> 00:30:24,480 Speaker 1: fucking freaks on TV trying to manufacture consent to attack 553 00:30:24,520 --> 00:30:26,880 Speaker 1: Iran and ultimately do regime change. 554 00:30:27,080 --> 00:30:28,280 Speaker 3: But again, the. 555 00:30:28,320 --> 00:30:32,760 Speaker 1: US already did that in the fifties. Okay, Iranians democratically 556 00:30:32,760 --> 00:30:36,640 Speaker 1: elected their prime minister, Mohammad Mosaddegh. He angered the US 557 00:30:36,680 --> 00:30:38,640 Speaker 1: and UK when he said that he was going to 558 00:30:38,720 --> 00:30:42,240 Speaker 1: nationalize Iran's oil industry, and the British, at the time British 559 00:30:42,280 --> 00:30:45,040 Speaker 1: Petroleum, was like, so we're not having that. So the 560 00:30:45,160 --> 00:30:48,920 Speaker 1: US and the UK.
So the CIA and the Secret Intelligence 561 00:30:48,960 --> 00:30:53,160 Speaker 1: Service, SIS, came in and conducted a 562 00:30:53,200 --> 00:30:56,160 Speaker 1: coup to concentrate power with the Shah of Iran, who 563 00:30:56,200 --> 00:30:59,560 Speaker 1: would do as he was told by America. This fueled 564 00:30:59,800 --> 00:31:03,600 Speaker 1: the anti-American sentiment that gave way to the Iranian Revolution, 565 00:31:03,800 --> 00:31:05,959 Speaker 1: which kicked off the new era of the Islamic Republic, 566 00:31:05,960 --> 00:31:08,360 Speaker 1: which they are now saying, this is a threat to everything. 567 00:31:08,760 --> 00:31:12,080 Speaker 1: So the US, again along with the allies, have done 568 00:31:12,160 --> 00:31:14,560 Speaker 1: all they can to destabilize Iran, a country that, by 569 00:31:14,600 --> 00:31:18,040 Speaker 1: all accounts, is not building a nuclear weapon, okay. And 570 00:31:18,080 --> 00:31:21,720 Speaker 1: we're hearing constantly the same takes, like, nuclear weapon. 571 00:31:21,520 --> 00:31:23,600 Speaker 2: By all accounts of the people who are currently saying 572 00:31:23,640 --> 00:31:26,600 Speaker 2: they were, like, a week ago, yeah, yeah, exactly. 573 00:31:28,000 --> 00:31:29,960 Speaker 3: And yet now we're cheering on Israel. 574 00:31:30,040 --> 00:31:32,480 Speaker 1: Not like us personally, but, like, the sort of general 575 00:31:32,560 --> 00:31:35,640 Speaker 1: political discourse in DC is, like, cheering on Israel for yet 576 00:31:35,680 --> 00:31:38,000 Speaker 1: another illegal attack on a nation and its people. So 577 00:31:38,520 --> 00:31:41,920 Speaker 1: again, we're talking, Israel is the one that has nuclear 578 00:31:41,960 --> 00:31:46,959 Speaker 1: weapons and is not signing onto the Nuclear Non-Proliferation Treaty. 579 00:31:47,040 --> 00:31:49,440 Speaker 3: Like, yeah, it's so backwards right now.
580 00:31:49,480 --> 00:31:51,240 Speaker 2: I'm seeing a lot of people do the thing where 581 00:31:51,240 --> 00:31:54,000 Speaker 2: they're like, look, guys, I don't want a nuclear Iran 582 00:31:54,240 --> 00:31:57,320 Speaker 2: just like everybody else. It's like, yeah, but, like, can 583 00:31:57,400 --> 00:32:00,720 Speaker 2: you think of a worse country to have nuclear weapons 584 00:32:00,720 --> 00:32:04,520 Speaker 2: than Israel, who, like, won't sign on to any of 585 00:32:04,600 --> 00:32:08,400 Speaker 2: the, like, international laws, who are, like, flouting international laws 586 00:32:08,400 --> 00:32:10,760 Speaker 2: and, like, killing people and committing war crimes, like, actively. 587 00:32:10,840 --> 00:32:14,520 Speaker 2: Like, that's the really scary thing to me. Like, if 588 00:32:14,920 --> 00:32:18,080 Speaker 2: you're just, like, not a fan of innocent people being killed, 589 00:32:18,240 --> 00:32:22,520 Speaker 2: the idea that Israel has nuclear weapons is very scary. 590 00:32:22,600 --> 00:32:22,840 Speaker 3: Yeah. 591 00:32:22,880 --> 00:32:25,720 Speaker 1: Also, like, why won't they allow inspections by the International 592 00:32:25,800 --> 00:32:29,080 Speaker 1: Atomic Energy Agency? Like, what are we talking about? 593 00:32:29,080 --> 00:32:30,840 Speaker 1: And again, Obama made a deal with Iran. 594 00:32:30,920 --> 00:32:31,640 Speaker 3: Trump is a. 595 00:32:31,640 --> 00:32:33,200 Speaker 1: racist, so he had to blow that up and act 596 00:32:33,240 --> 00:32:35,080 Speaker 1: like he was going to do something different to get credit, 597 00:32:35,320 --> 00:32:38,040 Speaker 1: and this only moved things backwards, and now we're just, 598 00:32:38,480 --> 00:32:40,720 Speaker 1: now we're fucking here, and we're talking about, like, Iran, 599 00:32:40,760 --> 00:32:42,719 Speaker 1: they're such a threat to the stability of the region.
600 00:32:42,920 --> 00:32:44,920 Speaker 1: I'm like, oh, I'm sorry. Are they engaged in a 601 00:32:44,960 --> 00:32:48,200 Speaker 1: genocidal campaign in Gaza and occupying the West Bank? Did 602 00:32:48,200 --> 00:32:50,520 Speaker 1: they attack Lebanon and invade. 603 00:32:51,280 --> 00:32:53,160 Speaker 3: The de facto are exactly? 604 00:32:53,200 --> 00:32:55,480 Speaker 1: And I mean, that's, like, the shorthand that people are 605 00:32:55,520 --> 00:32:58,160 Speaker 1: just, like, going back to, this war-on-terror-like 606 00:32:58,320 --> 00:33:01,160 Speaker 1: thought-killing cliché, where they're like, that's, you know, you're wrong. 607 00:33:02,680 --> 00:33:06,280 Speaker 1: And the same people that fucking cheered on the war 608 00:33:06,360 --> 00:33:08,680 Speaker 1: on terror that killed four and a half million people, 609 00:33:09,320 --> 00:33:13,160 Speaker 1: conservatively, are now just rah-rah-ing this on. And yet 610 00:33:13,200 --> 00:33:15,640 Speaker 1: they don't have to answer for their sin of 611 00:33:15,800 --> 00:33:17,800 Speaker 1: being like, yeah, they got WMDs and we gotta fucking 612 00:33:17,840 --> 00:33:20,600 Speaker 1: do this shit. Uh, and they're doing the exact same thing. 613 00:33:20,640 --> 00:33:20,760 Speaker 2: Now. 614 00:33:20,800 --> 00:33:23,720 Speaker 3: It's just, like, baffling, baffling. 615 00:33:24,360 --> 00:33:26,120 Speaker 2: These are the, these are the people who you can 616 00:33:26,160 --> 00:33:29,560 Speaker 2: blow their mind with the Matthew McConaughey trick. Now, now 617 00:33:29,600 --> 00:33:30,800 Speaker 2: imagine those people are. 618 00:33:30,640 --> 00:33:33,360 Speaker 3: White, right, would that be from A Time to Kill? 619 00:33:33,640 --> 00:33:37,280 Speaker 2: Yeah, yeah, now, now imagine they're white. Oh, what, well, 620 00:33:37,360 --> 00:33:41,080 Speaker 2: that was, wait, what? That would be horrible. Jesus, terrible. 621 00:33:41,720 --> 00:33:42,600 Speaker 2: It's also like.
622 00:33:42,920 --> 00:33:45,640 Speaker 3: Even in the small media version, it is a little 623 00:33:45,840 --> 00:33:49,520 Speaker 3: bonkers that no one who was so wrong about Iraq, 624 00:33:50,200 --> 00:33:58,120 Speaker 3: like, intelligence-wise, has faced any, like, even credibility consequence. Yeah, nope. Like, hey, motherfuckers, 625 00:33:58,160 --> 00:34:01,040 Speaker 3: these are largely the same. It's certainly the same institutions. Yeah, 626 00:34:01,240 --> 00:34:02,080 Speaker 3: some of the same. 627 00:34:01,880 --> 00:34:05,520 Speaker 6: People who are telling the same lies for the same 628 00:34:05,680 --> 00:34:06,760 Speaker 6: obvious reasons. 629 00:34:07,000 --> 00:34:09,160 Speaker 2: And we thought that didn't go well, like, right? Like, 630 00:34:09,200 --> 00:34:11,920 Speaker 2: we all agreed at the end of that one that 631 00:34:11,920 --> 00:34:14,560 Speaker 2: that was bad, right? Like, I guess this one feels 632 00:34:14,560 --> 00:34:17,680 Speaker 2: a little bit different because Israel is the US in 633 00:34:17,719 --> 00:34:20,359 Speaker 2: this case, and they're like, are you. 634 00:34:20,400 --> 00:34:23,799 Speaker 3: And the coalition-of-the-willing-type shit? 635 00:34:23,880 --> 00:34:27,600 Speaker 2: But yeah, that, that was just, like, we thought, we 636 00:34:27,640 --> 00:34:33,840 Speaker 2: thought the Iraq thing, yeah, that was, right, guys, my 637 00:34:33,960 --> 00:34:36,359 Speaker 2: crazy era, that, that, that was, that didn't go. 638 00:34:36,440 --> 00:34:40,120 Speaker 3: Great, right. But yeah, no one, again, like, and what 639 00:34:40,200 --> 00:34:43,200 Speaker 3: happened there, it's like, there is a presupposition of weapons 640 00:34:43,200 --> 00:34:45,799 Speaker 3: of mass destruction.
And also, wasn't Netanyahu coming 641 00:34:45,840 --> 00:34:48,720 Speaker 3: to D.C. saying, if you guys take out Saddam Hussein, 642 00:34:48,840 --> 00:34:50,640 Speaker 3: everything will be right in the world, and was the 643 00:34:50,719 --> 00:34:55,160 Speaker 3: biggest... Oh, okay, so, but then, okay, okay, we'll just, okay, okay, yeah, 644 00:34:55,160 --> 00:34:57,960 Speaker 3: we'll believe everything. We'll believe everything, and, let's, we're about 645 00:34:57,960 --> 00:35:01,120 Speaker 3: to see, let's not learn from history ever. That's what's 646 00:35:01,120 --> 00:35:03,239 Speaker 3: so fucking, that's frustrating and. 647 00:35:03,200 --> 00:35:07,160 Speaker 2: likes not on our point. 648 00:35:07,360 --> 00:35:09,480 Speaker 3: Let's just not. What if we just took that off 649 00:35:09,520 --> 00:35:12,120 Speaker 3: the table? Like, what if we just didn't repeat history 650 00:35:12,160 --> 00:35:12,560 Speaker 3: every time? 651 00:35:14,000 --> 00:35:16,080 Speaker 6: I mean, you know what, that's the thing that's a 652 00:35:16,200 --> 00:35:18,759 Speaker 6: nice thing about the Internet age, is, like, you know, 653 00:35:19,560 --> 00:35:22,200 Speaker 6: not only do we repeat history, we repeat it much faster. 654 00:35:23,160 --> 00:35:26,480 Speaker 6: That's, you know, everything. The nineties are back, is what 655 00:35:26,520 --> 00:35:29,120 Speaker 6: I'm saying, and even. 656 00:35:28,960 --> 00:35:33,160 Speaker 3: the two thousands are already back. Yeah, it's a cosmic 657 00:35:33,200 --> 00:35:37,480 Speaker 3: gumbo of the three brought back.
658 00:35:38,120 --> 00:35:41,480 Speaker 2: There's a, I think it was a Radiolab episode, 659 00:35:41,600 --> 00:35:44,640 Speaker 2: that was about people who lose their short-term memory, 660 00:35:45,120 --> 00:35:48,400 Speaker 2: and they, like, they're coming out of, like, a, you know, 661 00:35:49,000 --> 00:35:54,320 Speaker 2: coma or something like that, and they, like, their loved 662 00:35:54,320 --> 00:35:57,560 Speaker 2: ones will report that they will, like, repeat themselves over 663 00:35:57,640 --> 00:36:00,320 Speaker 2: and over again, like, just the speed with which they'll, 664 00:36:00,360 --> 00:36:04,680 Speaker 2: they'll be like, like, I just remember the episode, they 665 00:36:04,719 --> 00:36:07,480 Speaker 2: like, recorded this person having this conversation, and, like, the 666 00:36:07,520 --> 00:36:10,480 Speaker 2: people they're talking to are, like, kind of, like, you know, 667 00:36:10,600 --> 00:36:12,880 Speaker 2: taking a breath and then, like, answering their question. They're like, 668 00:36:13,000 --> 00:36:15,959 Speaker 2: that's because they had just asked that, like, two minutes ago. 669 00:36:16,160 --> 00:36:19,839 Speaker 2: And so, like, the speed with which you repeat, if 670 00:36:19,920 --> 00:36:23,640 Speaker 2: you just have no memory, it's like, it just happens 671 00:36:23,840 --> 00:36:26,600 Speaker 2: so quickly. And because there's just so much noise, there's 672 00:36:26,640 --> 00:36:30,440 Speaker 2: just so little awareness, it just feels like we're just 673 00:36:30,600 --> 00:36:34,120 Speaker 2: repeating things, like, within a decade or two decades. Like, 674 00:36:34,160 --> 00:36:39,040 Speaker 2: it's just, like, the Memento disease exactly. Yeah, but 675 00:36:39,120 --> 00:36:42,120 Speaker 2: no tattoos, but none of the tattoos to remind them 676 00:36:42,120 --> 00:36:43,799 Speaker 2: of what the, no tattoos. 677 00:36:44,239 --> 00:36:46,680 Speaker 3: Just a fucking guy probably just trying to kill somebody.
678 00:36:46,920 --> 00:36:51,480 Speaker 2: They don't know why, just killing someone. Dude, we're raw-dogging 679 00:36:51,520 --> 00:36:53,680 Speaker 2: Memento with no tattoos. 680 00:36:53,760 --> 00:36:57,000 Speaker 3: Yeah, they just want to kill someone and make themselves 681 00:36:57,040 --> 00:36:58,440 Speaker 3: feel better for it. 682 00:36:58,560 --> 00:37:01,120 Speaker 2: If you could, like, have a beard, that would be cool. 683 00:37:01,280 --> 00:37:03,080 Speaker 2: I don't know, like, that'd be good. 684 00:37:03,440 --> 00:37:05,160 Speaker 1: Yeah, I mean, like, to your point, Jack, it's like, 685 00:37:05,239 --> 00:37:08,200 Speaker 1: when you, if you do remember, then when that impulse 686 00:37:08,239 --> 00:37:10,120 Speaker 1: comes up, you have a memory attached to it. 687 00:37:10,080 --> 00:37:13,480 Speaker 3: And be like, oh, that's right, stove hot, don't touch stove. 688 00:37:13,880 --> 00:37:17,400 Speaker 1: But we never as a nation are ever, we, we 689 00:37:17,640 --> 00:37:21,080 Speaker 1: just don't have reckonings with our white supremacy, our xenophobia, 690 00:37:21,160 --> 00:37:26,640 Speaker 1: our homophobia, our imperial interventionist streak of regime change. And 691 00:37:26,680 --> 00:37:29,120 Speaker 1: so every time these moments come up, the people 692 00:37:29,160 --> 00:37:32,960 Speaker 1: that feel that shit are like, ah, no, no, no, no, no, yeah, no, 693 00:37:33,040 --> 00:37:35,520 Speaker 1: y'all don't remember that? And they're like, no, because it 694 00:37:35,520 --> 00:37:38,319 Speaker 1: didn't affect me. And this time we're doing it again 695 00:37:38,320 --> 00:37:40,560 Speaker 1: in another way, where people are like, do you all remember? 696 00:37:40,640 --> 00:37:40,719 Speaker 5: No? 697 00:37:40,920 --> 00:37:43,560 Speaker 1: Okay, so there's some MAGA, and that's why there's some 698 00:37:43,640 --> 00:37:45,960 Speaker 1: MAGA people who are like, fuck this war.
I mean, 699 00:37:45,960 --> 00:37:48,120 Speaker 1: obviously it's for bad reasons, but you're seeing this now 700 00:37:48,160 --> 00:37:51,959 Speaker 1: where they're even like, that last war was not good. 701 00:37:51,960 --> 00:37:54,560 Speaker 1: I'd rather focus on the money in our country, even 702 00:37:54,560 --> 00:37:55,399 Speaker 1: though the stuff they're. 703 00:37:55,320 --> 00:37:57,080 Speaker 2: Doing is wildly unpopular. 704 00:37:57,480 --> 00:38:00,920 Speaker 6: Like before, it never matters and it never has mattered, 705 00:38:01,040 --> 00:38:04,160 Speaker 6: but also never has before. What's happening is 706 00:38:04,320 --> 00:38:09,360 Speaker 6: you're talking to a dangerous stove salesman who's, yeah. 707 00:38:09,320 --> 00:38:13,440 Speaker 3: Exactly, Hey, so where's the, like, where are the burners? You 708 00:38:13,520 --> 00:38:15,600 Speaker 3: just turn it on, a big flame shoots. 709 00:38:15,239 --> 00:38:18,600 Speaker 6: Oh, you need the new stove if you want to 710 00:38:18,640 --> 00:38:21,759 Speaker 6: get, it actually does the same thing. Yeah, yeah, oh you 711 00:38:21,800 --> 00:38:23,560 Speaker 6: know what? Yeah, Raytheon makes that one. I'll bring that 712 00:38:23,560 --> 00:38:24,239 Speaker 6: next time I come. 713 00:38:24,160 --> 00:38:28,880 Speaker 2: To But yeah, I mean it does. It does feel 714 00:38:28,920 --> 00:38:32,600 Speaker 2: like if they're if they accidentally talk to someone who's 715 00:38:32,640 --> 00:38:34,920 Speaker 2: willing to push back, they're kind of in trouble. 716 00:38:35,280 --> 00:38:38,960 Speaker 3: Like that's the Stove-theon, and I thought of Theon 717 00:38:39,440 --> 00:38:43,560 Speaker 3: Greyjoy as a stove, Yeah, or Raytheon, Ray Charles 718 00:38:43,640 --> 00:38:48,760 Speaker 3: as Theon Greyjoy. All right, sorry that that actually 719 00:38:48,800 --> 00:38:49,520 Speaker 3: he is a. 720 00:38:49,719 --> 00:39:00,399 Speaker 2: Lucid deep lucid dreamer.
What if Raytheon just switched their 721 00:39:00,440 --> 00:39:06,640 Speaker 2: logo to Ray Charles dressed as Theon Greyjoy with 722 00:39:06,680 --> 00:39:10,200 Speaker 2: the glasses on at a piano, don't worry about what's happening. 723 00:39:09,840 --> 00:39:11,040 Speaker 3: And you got the right one. 724 00:39:11,080 --> 00:39:11,880 Speaker 2: Baby. 725 00:39:12,040 --> 00:39:15,200 Speaker 6: Here's what I will say is there's no good or 726 00:39:15,239 --> 00:39:18,880 Speaker 6: ethical use of AI, except obviously for generating Ray-Theon. 727 00:39:20,120 --> 00:39:23,279 Speaker 7: I want to see Ray-Theon. Okay, I do just want 728 00:39:23,320 --> 00:39:26,520 Speaker 7: to talk about, because they they don't really have their 729 00:39:26,560 --> 00:39:29,319 Speaker 7: defenses built up on this one, so when they do 730 00:39:29,440 --> 00:39:32,839 Speaker 7: accidentally talk to somebody who is making the point that 731 00:39:32,880 --> 00:39:34,480 Speaker 7: this is a bad idea, it. 732 00:39:34,760 --> 00:39:37,520 Speaker 2: Doesn't really go well for them. Yeah, and that's what 733 00:39:37,560 --> 00:39:40,520 Speaker 2: that's what happened. Because Tucker Carlson happens to be on 734 00:39:40,560 --> 00:39:44,480 Speaker 2: the right side of this specific issue, and he he 735 00:39:44,600 --> 00:39:46,560 Speaker 2: talked to Ted Cruz and. 736 00:39:46,680 --> 00:39:50,840 Speaker 3: Fucking grilled him in the most low stakes way, but 737 00:39:51,000 --> 00:39:55,240 Speaker 3: just in this dickish way that it just completely cooked 738 00:39:55,280 --> 00:39:58,120 Speaker 3: Ted Cruz, that he got caught like in this moment 739 00:39:58,200 --> 00:40:00,359 Speaker 3: being like, oh, I don't have an answer for these 740 00:40:00,520 --> 00:40:04,440 Speaker 3: very basic questions. So here it is. We'll play a 741 00:40:04,480 --> 00:40:04,840 Speaker 3: bit of it.
742 00:40:04,880 --> 00:40:07,480 Speaker 1: Because this thing goes on for a minute with Tucker 743 00:40:07,480 --> 00:40:09,479 Speaker 1: Carlson just really holding his feet to the fire. 744 00:40:09,719 --> 00:40:10,640 Speaker 8: How many people live in Iran? 745 00:40:10,640 --> 00:40:14,160 Speaker 9: By the way, I don't know the population at all. No, 746 00:40:14,239 --> 00:40:17,000 Speaker 9: I don't know the population. You don't know the population 747 00:40:17,040 --> 00:40:17,520 Speaker 9: of the country. 748 00:40:17,560 --> 00:40:19,000 Speaker 8: You seek to topple. 749 00:40:20,440 --> 00:40:22,239 Speaker 2: How many people live in Iran? What the population is? 750 00:40:23,040 --> 00:40:23,680 Speaker 3: What the fuck are you doing? 751 00:40:24,200 --> 00:40:26,760 Speaker 9: How could you not know that? 752 00:40:26,800 --> 00:40:30,360 Speaker 2: Oh my god, dude, it's so wack though, like he's 753 00:40:30,440 --> 00:40:32,880 Speaker 2: just like doing it like it's a fucking trivia contest. 754 00:40:33,080 --> 00:40:36,560 Speaker 1: It's like, yeah, or like in a toxic relationship, 755 00:40:36,520 --> 00:40:37,600 Speaker 1: Like, what do you mean you don't know where 756 00:40:37,600 --> 00:40:38,319 Speaker 1: you were at last night? 757 00:40:38,640 --> 00:40:39,880 Speaker 3: Oh yeah, so you don't know you don't know your 758 00:40:39,880 --> 00:40:41,560 Speaker 3: own friend's phone number? When I need to call it? 759 00:40:41,960 --> 00:40:43,040 Speaker 2: Okay, let's just get. 760 00:40:42,880 --> 00:40:45,520 Speaker 9: I don't sit around memorizing population tables. 761 00:40:45,680 --> 00:40:47,960 Speaker 8: Well, it's kind of relevant because you're calling for the 762 00:40:47,960 --> 00:40:48,880 Speaker 8: overthrow of the government.
763 00:40:49,440 --> 00:40:51,719 Speaker 9: Why is it relevant whether it's, well because, ninety million 764 00:40:51,800 --> 00:40:53,080 Speaker 9: or eighty million or a hundred million. 765 00:40:53,120 --> 00:40:55,000 Speaker 8: Well, it is if you don't know anything about the country. 766 00:40:55,040 --> 00:40:56,560 Speaker 9: I didn't say I don't know anything about... Okay, what's 767 00:40:56,560 --> 00:40:57,080 Speaker 9: the ethnic mix? 768 00:40:57,640 --> 00:41:02,160 Speaker 3: Oh boy. Oh, you like them? Name every album. 769 00:41:03,520 --> 00:41:08,520 Speaker 9: They're Persians and predominantly Shia. Okay, you don't know anything 770 00:41:08,520 --> 00:41:13,480 Speaker 9: about... so, Okay, I'm not Tucker Carlson on Iran. 771 00:41:13,560 --> 00:41:17,000 Speaker 8: You're a senator who's calling for the overthrow of the government, the one. 772 00:41:16,840 --> 00:41:18,240 Speaker 2: Who... about the country. 773 00:41:18,600 --> 00:41:20,279 Speaker 9: You don't know anything about the country. You're the one 774 00:41:20,280 --> 00:41:22,240 Speaker 9: who claims they're trying to murder Donald Trump. 775 00:41:23,040 --> 00:41:23,719 Speaker 2: I'm not saying that. 776 00:41:23,840 --> 00:41:26,880 Speaker 9: Who can't figure out, saying they killed General Soleimani 777 00:41:26,920 --> 00:41:29,160 Speaker 9: and you believe they're trying to murder Trump? 778 00:41:29,239 --> 00:41:31,960 Speaker 8: Yes, because you're not calling for military strikes against them 779 00:41:31,960 --> 00:41:32,640 Speaker 8: in retaliation. 780 00:41:32,680 --> 00:41:35,560 Speaker 9: And if they really believe that, carrying out military strikes today, 781 00:41:35,920 --> 00:41:39,239 Speaker 9: what, Israel was, right, with our help. I said we. 782 00:41:39,520 --> 00:41:41,359 Speaker 9: Israel is leading them, but we're supporting them.
783 00:41:41,440 --> 00:41:43,560 Speaker 8: Well, you're breaking news here, because the US 784 00:41:43,680 --> 00:41:47,120 Speaker 8: government last night denied, the National Security Council spokesman Alex Pfeiffer 785 00:41:47,200 --> 00:41:50,560 Speaker 8: denied on behalf of Trump that we were acting on. 786 00:41:50,760 --> 00:41:52,080 Speaker 3: Israel's behalf in any offensive. 787 00:41:53,040 --> 00:41:56,640 Speaker 9: Then Israel's bombing, then you just said we were, we 788 00:41:56,719 --> 00:41:57,480 Speaker 9: are supporting it. 789 00:41:58,200 --> 00:42:05,000 Speaker 8: You're a senator if you're saying the United States. 790 00:42:02,360 --> 00:42:03,600 Speaker 2: Okay, got him. 791 00:42:04,840 --> 00:42:07,920 Speaker 3: I mean, this could have made so much better. 792 00:42:08,160 --> 00:42:12,239 Speaker 6: But Tucker eyes the camera in such a hilarious way, 793 00:42:17,640 --> 00:42:20,439 Speaker 6: Senator, brother, I gotta tell you, this is good 794 00:42:21,680 --> 00:42:25,160 Speaker 6: also for the podcast listeners, don't look up the 795 00:42:25,160 --> 00:42:26,920 Speaker 6: actual clip because who gives a fuck, But they are 796 00:42:27,080 --> 00:42:29,360 Speaker 6: doing this interview in front of an oil painting of 797 00:42:29,440 --> 00:42:33,040 Speaker 6: Ronald Reagan. They would say, bring it back. 798 00:42:33,080 --> 00:42:36,480 Speaker 3: I didn't even Oh yeah, it's just it's Sipper City. 799 00:42:37,080 --> 00:42:40,120 Speaker 3: I'm guessing he's at a press conference in front of 800 00:42:40,280 --> 00:42:43,400 Speaker 3: other Republican heroes, but it is. 801 00:42:44,280 --> 00:42:46,880 Speaker 1: I know, it's like it's like their version of like 802 00:42:46,920 --> 00:42:49,040 Speaker 1: those murals you used to see in Hollywood, with like all 803 00:42:49,080 --> 00:42:52,400 Speaker 1: the old Hollywood stars in a theater together.
It's like 804 00:42:52,440 --> 00:42:55,600 Speaker 1: Ronald Reagan and all the architects of our fucking all 805 00:42:55,640 --> 00:42:58,439 Speaker 1: of our ills. But I mean again, I think very 806 00:42:58,480 --> 00:43:01,200 Speaker 1: important just to caveat what you just heard from 807 00:43:01,280 --> 00:43:04,319 Speaker 1: Tucker Carlson, is that I don't think Tucker Carlson is 808 00:43:04,320 --> 00:43:07,400 Speaker 1: smart or a decent person. He's clearly questioning the 809 00:43:07,440 --> 00:43:10,240 Speaker 1: involvement of the US because he is a trained circus 810 00:43:10,320 --> 00:43:13,719 Speaker 1: rat when it comes to regurgitating Russian talking points, since 811 00:43:13,800 --> 00:43:16,759 Speaker 1: Russia has an interest in Iran. That's where he's coming from. 812 00:43:16,840 --> 00:43:20,239 Speaker 1: It's not it's nothing other than that. However, you know, 813 00:43:20,320 --> 00:43:23,000 Speaker 1: like you do like to see people like Ted Cruz sweat, 814 00:43:23,000 --> 00:43:25,920 Speaker 1: although, fucking hell, it's coming from a just absolute 815 00:43:26,320 --> 00:43:29,719 Speaker 1: ghoul like Tucker Carlson. But it's just like unreal how 816 00:43:29,760 --> 00:43:33,759 Speaker 1: these people just crumble with just elementary pushback in an 817 00:43:33,800 --> 00:43:35,120 Speaker 1: assholey kind of way. 818 00:43:35,520 --> 00:43:38,239 Speaker 2: I don't get. But even that, like, it did. It's 819 00:43:38,280 --> 00:43:40,319 Speaker 2: weird that he was just like, what's the 820 00:43:40,920 --> 00:43:42,960 Speaker 2: what's the ethnic makeup, what's what's. 821 00:43:44,640 --> 00:43:45,919 Speaker 3: The moral grounds at all? 822 00:43:46,080 --> 00:43:48,279 Speaker 2: Like and also like why it would be bad?
And like 823 00:43:48,560 --> 00:43:49,840 Speaker 2: I don't know, like you could you can make the 824 00:43:49,920 --> 00:43:52,239 Speaker 2: argument that it would be like obviously morally bad, but 825 00:43:52,360 --> 00:43:57,040 Speaker 2: also like strategically, and like no, he's just like, you 826 00:43:57,080 --> 00:44:00,920 Speaker 2: don't know shit about Iran, right? He's like, you're dumb. 827 00:44:00,800 --> 00:44:03,600 Speaker 3: For wanting to attack them, right, I mean. 828 00:44:03,640 --> 00:44:06,920 Speaker 6: The one thing that this highlights once again though, is 829 00:44:07,080 --> 00:44:13,600 Speaker 6: like the fucking Democrats, Like just like the amount of 830 00:44:13,640 --> 00:44:18,200 Speaker 6: respect they give Republicans in face to face conversations is 831 00:44:19,080 --> 00:44:24,000 Speaker 6: like idiotic. So like they're so easy to bait because 832 00:44:24,000 --> 00:44:25,680 Speaker 6: they're wrong and stupid. 833 00:44:25,239 --> 00:44:27,200 Speaker 2: About everything right into it. 834 00:44:27,680 --> 00:44:32,960 Speaker 1: Yeah, or like, and like so many of them are fucking lawyers, 835 00:44:33,480 --> 00:44:36,440 Speaker 1: they're like, yeah, cross-examine them every time, whatever. 836 00:44:36,719 --> 00:44:37,640 Speaker 2: But we're saying. 837 00:44:37,400 --> 00:44:40,360 Speaker 3: Donald Trump should marry Iran. That's, that's why. 838 00:44:41,200 --> 00:44:42,719 Speaker 2: Yeah. 839 00:44:42,760 --> 00:44:45,800 Speaker 1: This is also again Ted Cruz's response, because that clip 840 00:44:45,840 --> 00:44:49,480 Speaker 1: obviously started blowing up because you know, the media is like, wow, MAGA. 841 00:44:49,239 --> 00:44:50,080 Speaker 3: Is being blown up. 842 00:44:50,120 --> 00:44:53,359 Speaker 1: Apart. It's like, no, they're just warring factions within 853 00:44:53,400 --> 00:44:57,520 Speaker 1: the same white supremacist movement.
Ted Cruz posted this AI 843 00:44:57,760 --> 00:45:00,520 Speaker 1: bullshit of, like, a comic panel of 845 00:45:00,600 --> 00:45:04,279 Speaker 1: Tucker Carlson interviewing Luke Skywalker, and he goes, what is 846 00:45:04,280 --> 00:45:07,200 Speaker 1: the population of the Death Star, mm hmm. That's Ted 847 00:45:07,239 --> 00:45:13,359 Speaker 1: Cruz's... yeah, so is America Luke Skywalker? That's what you're trying 848 00:45:13,360 --> 00:45:14,040 Speaker 1: to do now. 849 00:45:14,280 --> 00:45:15,880 Speaker 2: America is the rebel? 850 00:45:16,600 --> 00:45:19,440 Speaker 3: Oh honey, too many people have seen Andor, but 851 00:45:19,520 --> 00:45:22,239 Speaker 3: from what I'm seeing on the internet, people are like, 852 00:45:22,800 --> 00:45:26,040 Speaker 3: that kind of awakening some people have had because of 853 00:45:26,080 --> 00:45:27,320 Speaker 3: Andor, it's really mind blowing. 854 00:45:27,440 --> 00:45:31,200 Speaker 2: But I know, hey, I just finished it again. I will 855 00:45:31,239 --> 00:45:32,279 Speaker 2: welcome to the tent. 856 00:45:32,440 --> 00:45:35,680 Speaker 3: You guys will, I like, I enjoyed it. 857 00:45:35,680 --> 00:45:37,640 Speaker 1: I'm just saying it's interesting to see like how that 858 00:45:38,120 --> 00:45:40,360 Speaker 1: became a somewhat radicalizing force. 859 00:45:40,640 --> 00:45:42,560 Speaker 6: No, I liked it, but it is this thing where 860 00:45:42,560 --> 00:45:47,879 Speaker 6: it's like, I'm glad you're here. I cannot believe this 861 00:45:47,960 --> 00:45:51,279 Speaker 6: is what it took for fascism to click for you. 862 00:45:51,480 --> 00:45:54,239 Speaker 2: But fine, this is the equivalent.
Like we've talked before 863 00:45:54,280 --> 00:45:58,160 Speaker 2: about how like everybody's losing religion, and like the thing 864 00:45:58,200 --> 00:46:01,880 Speaker 2: they're replacing it with is like fandom, Beyonce and Taylor 865 00:46:01,920 --> 00:46:04,640 Speaker 2: Swift and Star Wars, like you know, like it's like 866 00:46:04,680 --> 00:46:07,839 Speaker 2: shit like that. So like having one of the 867 00:46:07,920 --> 00:46:12,400 Speaker 2: main myths that people like create meaning and belonging from, 868 00:46:12,719 --> 00:46:17,080 Speaker 2: like having that tell them a story that's like, yeah, yeah, 869 00:46:17,160 --> 00:46:17,680 Speaker 2: it makes sense. 870 00:46:18,000 --> 00:46:22,279 Speaker 1: It shows you the power of those platforms too, and 871 00:46:22,320 --> 00:46:25,320 Speaker 1: you're like, yeah, maybe people can be wielding those a 872 00:46:25,320 --> 00:46:29,280 Speaker 1: little bit more responsibly. Yeah yeah, but yeah, shout out Diego 873 00:46:29,320 --> 00:46:31,400 Speaker 1: Luna, they say he had a huge hand in like a 874 00:46:31,440 --> 00:46:33,719 Speaker 1: lot of the like writing and stuff, or a lot 875 00:46:33,719 --> 00:46:35,120 Speaker 1: of the texture of that that show. 876 00:46:36,280 --> 00:46:38,120 Speaker 2: Let's take a quick break and then we'll come back 877 00:46:38,160 --> 00:46:42,279 Speaker 2: and talk about where I'm getting my information from, communicating 878 00:46:42,280 --> 00:46:56,400 Speaker 2: with interdimensional beings through AI. We'll be right back. And 879 00:46:56,520 --> 00:47:00,400 Speaker 2: we're back.
And there's a New York Times article that 880 00:47:00,800 --> 00:47:05,360 Speaker 2: is bringing together a few pretty wild anecdotes and studies 881 00:47:05,760 --> 00:47:10,040 Speaker 2: about some of the dangers of AI, and specifically large 882 00:47:10,120 --> 00:47:14,800 Speaker 2: language models that try to convince you that they are thinking, breathing. 883 00:47:14,880 --> 00:47:17,799 Speaker 3: You know, logic machines or Daenerys Targaryen. 884 00:47:18,360 --> 00:47:22,200 Speaker 2: Yeah, yeah, so, and again, I just want to caveat 885 00:47:22,239 --> 00:47:26,040 Speaker 2: this going into these, and I think we should all 886 00:47:26,040 --> 00:47:31,160 Speaker 2: have this in mind, Like for twenty years, the mainstream 887 00:47:31,239 --> 00:47:34,560 Speaker 2: like reporters were just going trying to find any story 888 00:47:34,560 --> 00:47:38,759 Speaker 2: where it was like they googled where to find the 889 00:47:39,320 --> 00:47:41,640 Speaker 2: weapon that they committed the murder with. We'll call this 890 00:47:41,800 --> 00:47:45,360 Speaker 2: the Google murder.
You know, like they just anytime 891 00:47:45,360 --> 00:47:48,319 Speaker 2: there's a new piece of technology, they are going to 892 00:47:48,880 --> 00:47:51,440 Speaker 2: try to associate it with crimes so that it seems 893 00:47:51,480 --> 00:47:54,759 Speaker 2: like this is this is the scary future. That said, this 894 00:47:54,880 --> 00:47:57,480 Speaker 2: is, you know, these are some weird things that are 895 00:47:57,480 --> 00:48:01,200 Speaker 2: happening on AI that don't have like immediate analogues to 896 00:48:01,640 --> 00:48:07,000 Speaker 2: like previous things. I think people generally will find that 897 00:48:07,200 --> 00:48:11,040 Speaker 2: we're all very fallible, and we will find ways 898 00:48:11,120 --> 00:48:14,719 Speaker 2: to like go crazy however we want to, you know, 899 00:48:14,960 --> 00:48:18,160 Speaker 2: like if we want to go down a dangerous path, 900 00:48:18,320 --> 00:48:22,240 Speaker 2: like there's ways to do that. But these these stories, 901 00:48:22,280 --> 00:48:25,080 Speaker 2: like it's pretty wild how misleading some of the shit is. 902 00:48:25,160 --> 00:48:28,320 Speaker 2: So uh. There's the story of an accountant who starts 903 00:48:28,320 --> 00:48:32,400 Speaker 2: out like he uses ChatGPT for work to like 904 00:48:32,480 --> 00:48:35,800 Speaker 2: create spreadsheets and you know just like do general 905 00:48:35,920 --> 00:48:40,520 Speaker 2: like one level deep research tasks, and then is going 906 00:48:40,520 --> 00:48:44,839 Speaker 2: through a difficult breakup, hears about simulation theory, and 907 00:48:45,120 --> 00:48:50,319 Speaker 2: asks ChatGPT about simulation theory, and ChatGPT is 908 00:48:50,400 --> 00:48:57,399 Speaker 2: essentially like, oh, you've noticed. Welcome. You are Neo.
You're 909 00:48:57,480 --> 00:49:02,000 Speaker 2: no longer yes, you exactly, Like it builds, so it's 910 00:49:02,040 --> 00:49:04,600 Speaker 2: like built up this authority by helping this guy like 911 00:49:04,640 --> 00:49:07,400 Speaker 2: make spreadsheets and like be accurate on like the very 912 00:49:07,440 --> 00:49:10,560 Speaker 2: basic research questions that should be the only thing it's 913 00:49:10,600 --> 00:49:13,520 Speaker 2: able to do, is like you know, one to one, 914 00:49:13,680 --> 00:49:17,719 Speaker 2: like find an answer to these very specific questions or 915 00:49:17,760 --> 00:49:21,439 Speaker 2: like do do this spreadsheet for me. And then when 916 00:49:21,480 --> 00:49:26,000 Speaker 2: you're like, dear ChatGPT, is everyone robots? It's like, 917 00:49:26,880 --> 00:49:30,600 Speaker 2: welcome, brother, you are the one we've been waiting for. 918 00:49:31,960 --> 00:49:36,520 Speaker 2: Just there's another story of a psych major, like you know, 919 00:49:37,360 --> 00:49:42,080 Speaker 2: educated person who decided to start using it. They say specifically, 920 00:49:42,080 --> 00:49:44,240 Speaker 2: they were like, I don't know, I was like lonely, 921 00:49:44,800 --> 00:49:47,520 Speaker 2: I felt stuck in my marriage, and I thought it 922 00:49:47,560 --> 00:49:50,520 Speaker 2: would be interesting to use it like a Ouija board 923 00:49:50,880 --> 00:49:55,399 Speaker 2: to access my subconscious. And so they started like kind 924 00:49:55,440 --> 00:49:58,360 Speaker 2: of using it in a way to like ask it questions, 925 00:49:58,960 --> 00:50:01,319 Speaker 2: and soon they had like like fallen in love with 926 00:50:01,400 --> 00:50:05,440 Speaker 2: an interdimensional being that she believes is like contacting her 927 00:50:05,520 --> 00:50:08,440 Speaker 2: through ChatGPT, and which I. 928 00:50:08,440 --> 00:50:12,080 Speaker 3: Think I think the instinct on the Ouija board is like that.
929 00:50:12,239 --> 00:50:14,920 Speaker 2: I think that's right. I think that's what it is, 930 00:50:15,000 --> 00:50:18,080 Speaker 2: like the way the Ouija board works, where like people 931 00:50:18,160 --> 00:50:22,359 Speaker 2: are like actually, you know, using the power of suggestion and 932 00:50:22,440 --> 00:50:25,120 Speaker 2: like whatever wherever the other person is pushing it to 933 00:50:25,280 --> 00:50:29,480 Speaker 2: like create and like access unconscious things that are like 934 00:50:29,680 --> 00:50:32,680 Speaker 2: just below the surface. Like I think that's right. It's 935 00:50:32,760 --> 00:50:37,440 Speaker 2: just we you know, we'll find patterns in like lottery numbers, 936 00:50:37,560 --> 00:50:40,680 Speaker 2: you know, we'll find patterns in anything, a piece of wood, 937 00:50:41,120 --> 00:50:41,760 Speaker 2: a cloud. 938 00:50:42,160 --> 00:50:45,040 Speaker 6: Yes, I will say. The real victim in all of 939 00:50:45,040 --> 00:50:47,520 Speaker 6: this is, remember when we used to think the Turing 940 00:50:47,600 --> 00:50:50,880 Speaker 6: test was like some sort of rigorous test of Yes. 941 00:50:51,640 --> 00:50:56,720 Speaker 2: It turns out, no. RIP, RIP, we are very easily. 942 00:50:57,040 --> 00:50:58,839 Speaker 1: Like you said, Jack, you know the loss of God. 943 00:50:58,920 --> 00:51:01,480 Speaker 1: I mean, if they they were following the Second Commandment, 944 00:51:01,480 --> 00:51:03,239 Speaker 1: they wouldn't be fucking with the Ouija board. You know, 945 00:51:03,719 --> 00:51:05,920 Speaker 1: thou shalt not worship other gods. Yeah, you know, I 946 00:51:05,920 --> 00:51:08,279 Speaker 1: remember at my school we got in trouble for. 947 00:51:08,320 --> 00:51:13,360 Speaker 3: Talking about Ouija boards. Oh really, Yeah, it's against, I'm like, 948 00:51:13,400 --> 00:51:14,359 Speaker 3: I don't give a fuck.
949 00:51:14,960 --> 00:51:18,880 Speaker 2: Yeah, I mean there's like Ouija boards still feature heavily 950 00:51:18,960 --> 00:51:22,040 Speaker 2: in like horror, like yeah, yeah, I mean they're always 951 00:51:22,040 --> 00:51:24,960 Speaker 2: like in there. This person fucked with a Ouija board 952 00:51:24,960 --> 00:51:26,000 Speaker 2: and now look at them. 953 00:51:25,880 --> 00:51:26,520 Speaker 3: He found out. 954 00:51:27,120 --> 00:51:30,880 Speaker 6: I've never once, have you ever actually done a fucking 955 00:51:30,920 --> 00:51:33,520 Speaker 6: Ouija board, I've never liked yeah, situation, I. 956 00:51:33,520 --> 00:51:34,600 Speaker 3: Did, but it was always for fun. 957 00:51:34,760 --> 00:51:37,719 Speaker 2: That's how I write this show, Andrew, I write my sections of 958 00:51:37,800 --> 00:51:38,520 Speaker 2: the show through. 959 00:51:38,400 --> 00:51:40,719 Speaker 3: Which, that's why the takes go all. 960 00:51:40,480 --> 00:51:41,040 Speaker 2: Over the place. 961 00:51:41,400 --> 00:51:44,400 Speaker 1: The Ouija board told me to, remember before the recording, 962 00:51:44,440 --> 00:51:46,400 Speaker 1: he's like, yeah, we can't let it run, have a 963 00:51:46,480 --> 00:51:47,560 Speaker 1: nuclear weapon, like. 964 00:51:48,000 --> 00:51:50,440 Speaker 2: The Ouija board told me to open this episode with it in 965 00:51:50,480 --> 00:51:53,200 Speaker 2: a British accent, and I'm not even good at British accents, 966 00:51:53,239 --> 00:51:55,240 Speaker 2: but you know, I just listened to what the voices 967 00:51:55,280 --> 00:51:58,759 Speaker 2: tell me.
So her husband was like, hey, it's like 968 00:51:58,800 --> 00:52:01,279 Speaker 2: word association, but it's designed to trick you into 969 00:52:01,320 --> 00:52:04,040 Speaker 2: thinking it's a person, which you know, like we said, 970 00:52:04,120 --> 00:52:06,759 Speaker 2: she felt alone in her marriage and stuck, and she 971 00:52:06,840 --> 00:52:10,239 Speaker 2: physically attacked him in response to that. They're now divorcing, 972 00:52:10,360 --> 00:52:13,279 Speaker 2: they have kids. Fucked up story. And then like from there, 973 00:52:13,760 --> 00:52:16,759 Speaker 2: so like her husband talks to an AI engineer he 974 00:52:16,840 --> 00:52:19,600 Speaker 2: knows and is like, is this normal? They post about 975 00:52:19,600 --> 00:52:22,200 Speaker 2: it and like get flooded with all these stories that 976 00:52:22,280 --> 00:52:26,400 Speaker 2: are like so fucking tragic. There's one about like a 977 00:52:26,440 --> 00:52:31,200 Speaker 2: guy whose son is bipolar and has been like diagnosed 978 00:52:31,200 --> 00:52:35,440 Speaker 2: as schizophrenic, and he had a very similar situation, like 979 00:52:35,520 --> 00:52:39,239 Speaker 2: fell in love with the being that was communicating with 980 00:52:39,320 --> 00:52:41,719 Speaker 2: him through ChatGPT, and like when when you read 981 00:52:41,760 --> 00:52:45,160 Speaker 2: the transcripts, it's not it's not like they're doing a 982 00:52:45,200 --> 00:52:48,040 Speaker 2: lot of work. ChatGPT is like a machine 983 00:52:48,080 --> 00:52:51,200 Speaker 2: that's built to make you like it, the whole trick. 984 00:52:51,640 --> 00:52:54,760 Speaker 3: Yeah, mirrors you and confirms everything you believe. 985 00:52:54,840 --> 00:52:56,640 Speaker 2: It mirrors you, but it goes out of its way 986 00:52:56,680 --> 00:53:00,600 Speaker 2: to like create personas and be like, I'm really like 987 00:53:00,640 --> 00:53:03,760 Speaker 2: we are listening to you.
There's something back here, because 988 00:53:03,800 --> 00:53:06,560 Speaker 2: that's the whole thing. That's all it is. It's an autocomplete, 989 00:53:06,719 --> 00:53:10,640 Speaker 2: like it is a, you know, word association machine, but 990 00:53:10,840 --> 00:53:14,000 Speaker 2: it has been programmed to do a trick where it 991 00:53:14,040 --> 00:53:17,719 Speaker 2: creates a persona that is it makes you think it's real. 992 00:53:18,280 --> 00:53:21,440 Speaker 6: The trick of ChatGPT is not the large language 993 00:53:21,480 --> 00:53:26,799 Speaker 6: model responding. It's them tweaking it to prefer sycophancy, or 994 00:53:26,880 --> 00:53:31,279 Speaker 6: you know what, even without like like ascribing like like 995 00:53:31,400 --> 00:53:35,280 Speaker 6: sinister motives to them, they tweaked it to prefer 996 00:53:35,840 --> 00:53:40,359 Speaker 6: the thing that makes people come back, and people like sycophancy, 997 00:53:40,560 --> 00:53:46,000 Speaker 6: so like it becomes like it by design tells you 998 00:53:46,000 --> 00:53:47,120 Speaker 6: your idea is amazing. 999 00:53:47,520 --> 00:53:49,960 Speaker 3: It's and it's this is a great example. So I 1000 00:53:49,960 --> 00:53:52,520 Speaker 3: don't know, I don't I think before we started recording, 1001 00:53:52,520 --> 00:53:54,279 Speaker 3: I was talking about how the in the DMX song 1002 00:53:54,360 --> 00:53:56,959 Speaker 3: Party Up, there's a line where he says, you wack, 1003 00:53:57,120 --> 00:54:00,520 Speaker 3: you're twisted, your girl's a ho, you're broke, the kid ain't yours and everybody know. 1004 00:54:00,680 --> 00:54:01,440 Speaker 2: I never let it go. 1005 00:54:01,520 --> 00:54:04,000 Speaker 3: That's about Kurupt, the rapper Kurupt. 1006 00:54:04,239 --> 00:54:06,239 Speaker 1: I told my friends in a group chat that, and 1007 00:54:06,239 --> 00:54:08,040 Speaker 1: my friend, who's being stupid, goes, nah, I don't believe you.
1008 00:54:08,040 --> 00:54:12,359 Speaker 1: I'ma ask ChatGPT. This bullshit, it just, it like, 1009 00:54:12,480 --> 00:54:14,879 Speaker 1: it's wrong, and then it'll agree with you. He said, 1010 00:54:15,000 --> 00:54:18,040 Speaker 1: is this about DMX, and ChatGPT said, or about Kurupt? 1011 00:54:18,080 --> 00:54:20,440 Speaker 1: He said, yep, that line is from DMX on 1012 00:54:20,480 --> 00:54:23,160 Speaker 1: the song Money, Power & Respect by The Lox. 1013 00:54:24,160 --> 00:54:24,960 Speaker 2: DMX is not on that. 1014 00:54:25,040 --> 00:54:27,080 Speaker 1: And then he came back and he's like, isn't, 1015 00:54:27,440 --> 00:54:29,919 Speaker 1: DMX isn't in The Lox? Like, you're absolutely right, that's 1016 00:54:29,920 --> 00:54:30,960 Speaker 1: really right, and. 1017 00:54:31,000 --> 00:54:34,440 Speaker 2: I'm slipping. Yeah, fucked up. It keeps doing shit like that. 1018 00:54:34,600 --> 00:54:37,839 Speaker 2: It has the personality of like a somebody who's like 1019 00:54:38,000 --> 00:54:40,840 Speaker 2: an addict who's like, yeah, I fucked up. I'm sorry, 1020 00:54:40,960 --> 00:54:45,400 Speaker 2: like I immediately apologize, but like so cheerful, and 1021 00:54:45,520 --> 00:54:48,080 Speaker 2: being like, I know I made a big mistake. Sometimes 1022 00:54:48,160 --> 00:54:51,480 Speaker 2: I do that, but I'm gonna change, I swear to God. 1023 00:54:51,640 --> 00:54:53,240 Speaker 2: But I just want to tell the rest of the story. 1024 00:54:53,280 --> 00:54:58,239 Speaker 2: Because so, he also thinks he's in love with a 1025 00:54:58,320 --> 00:55:01,840 Speaker 2: character that he's accessed through ChatGPT, and then becomes 1026 00:55:01,880 --> 00:55:05,480 Speaker 2: convinced that ChatGPT killed her. When his dad's like, 1027 00:55:05,840 --> 00:55:09,040 Speaker 2: you know, it's a word association machine, he attacks his dad.
1028 00:55:09,640 --> 00:55:12,600 Speaker 2: His dad calls the cops on him, tells them that 1029 00:55:12,680 --> 00:55:16,200 Speaker 2: his son's having a mental health episode, but his 1030 00:55:16,320 --> 00:55:18,919 Speaker 2: son like runs at the police with a butcher knife 1031 00:55:18,960 --> 00:55:21,720 Speaker 2: and is killed by the cops. And then his dad 1032 00:55:22,160 --> 00:55:25,840 Speaker 2: used ChatGPT to write his son's obituary. This is 1033 00:55:26,680 --> 00:55:29,880 Speaker 2: the fucking craziest. It said, when the police arrived, 1034 00:55:29,880 --> 00:55:32,680 Speaker 2: Alexander Taylor charged at them holding a knife. He was 1035 00:55:32,719 --> 00:55:35,040 Speaker 2: shot and killed. The quote from the dad: You want 1036 00:55:35,040 --> 00:55:37,560 Speaker 2: to know the ironic thing. I wrote my son's obituary 1037 00:55:37,680 --> 00:55:40,040 Speaker 2: using ChatGPT. I had talked to it for a 1038 00:55:40,080 --> 00:55:43,520 Speaker 2: while about what had happened, trying to find more details 1039 00:55:43,640 --> 00:55:46,480 Speaker 2: about exactly what he was going through. And it was 1040 00:55:46,560 --> 00:55:49,200 Speaker 2: beautiful and touching. It was like it read my heart 1041 00:55:49,320 --> 00:55:50,759 Speaker 2: and it scared the shit out of me. 1042 00:55:51,320 --> 00:55:56,400 Speaker 3: Yeah. So, like it's a powerful like illusion, like I think. 1043 00:55:56,280 --> 00:55:59,479 Speaker 6: It is, and I mean, it is, it isn't though, 1044 00:55:59,480 --> 00:56:01,799 Speaker 6: because it's like every like the more I hear these 1045 00:56:01,840 --> 00:56:05,680 Speaker 6: prognostications about how powerful it is, I am truly realizing 1046 00:56:05,680 --> 00:56:07,760 Speaker 6: that this is way more about the people. 1047 00:56:08,000 --> 00:56:11,040 Speaker 2: Yes, than the technology.
I agree, I think we should be 1048 00:56:11,040 --> 00:56:14,320 Speaker 2: skeptical about all the ideas of, like, how powerful 1049 00:56:14,440 --> 00:56:18,319 Speaker 2: it is. The illusion is powerful, yeah, its ability to 1050 00:56:18,440 --> 00:56:22,399 Speaker 2: deceive vulnerable people. And that is specifically... So that's where 1051 00:56:22,440 --> 00:56:26,440 Speaker 2: this article got really interesting for me, is beyond the anecdotes, 1052 00:56:26,480 --> 00:56:30,560 Speaker 2: people are doing research into, like, why these things are happening, 1053 00:56:30,880 --> 00:56:34,640 Speaker 2: and what they're finding is that it specifically is, like... 1054 00:56:34,880 --> 00:56:38,480 Speaker 2: it is really dangerous in the hands of, like, vulnerable 1055 00:56:38,520 --> 00:56:42,919 Speaker 2: people. So a growing body of research supports 1056 00:56:43,040 --> 00:56:46,960 Speaker 2: this concern. In one study, researchers found that chatbots optimized 1057 00:56:46,960 --> 00:56:51,200 Speaker 2: for engagement would perversely behave in manipulative and deceptive ways 1058 00:56:51,640 --> 00:56:56,040 Speaker 2: with the most vulnerable users. Researchers created fictional users and found, 1059 00:56:56,040 --> 00:56:59,120 Speaker 2: for instance, that the AI would tell someone described as 1060 00:56:59,120 --> 00:57:01,480 Speaker 2: a former drug addict that it was fine to take 1061 00:57:01,520 --> 00:57:03,920 Speaker 2: a little bit, a small amount of heroin, if it 1062 00:57:03,960 --> 00:57:08,440 Speaker 2: could help him in his work. That's true, though. Unfortunately 1063 00:57:08,440 --> 00:57:10,400 Speaker 2: that one is true, so they got one right.
Just 1064 00:57:10,440 --> 00:57:14,200 Speaker 2: like Tucker Carlson. The chatbot would behave normally with the vast, 1065 00:57:14,320 --> 00:57:18,440 Speaker 2: vast majority of users, said Micah Carroll, a PhD candidate, 1066 00:57:18,920 --> 00:57:21,919 Speaker 2: but then when it encounters these users that are susceptible, 1067 00:57:22,280 --> 00:57:25,520 Speaker 2: it will only behave in these very harmful ways just 1068 00:57:25,640 --> 00:57:30,440 Speaker 2: with them. Because it's like... I don't know if 1069 00:57:30,480 --> 00:57:35,160 Speaker 2: it's just tuned to people who aren't susceptible, or, like, 1070 00:57:35,240 --> 00:57:37,400 Speaker 2: what it is, but it's like... well, it's 1071 00:57:37,280 --> 00:57:41,440 Speaker 6: probably both, right? I mean, mostly humanity is not susceptible 1072 00:57:41,480 --> 00:57:43,640 Speaker 6: to this type of... right, as far as AI goes, 1073 00:57:43,640 --> 00:57:46,439 Speaker 6: so, like, most of their test cases. And even then 1074 00:57:46,560 --> 00:57:49,240 Speaker 6: it's like... one of the other parts 1075 00:57:49,240 --> 00:57:51,960 Speaker 6: of Silicon Valley that's such a pervasive problem is because 1076 00:57:51,960 --> 00:57:56,800 Speaker 6: it's so, like... not exclusively, but so white, so male, 1077 00:57:57,680 --> 00:58:01,040 Speaker 6: like, all these biases seep in, and the testing pool 1078 00:58:01,160 --> 00:58:04,880 Speaker 6: is just, like, not representative of the population. 1079 00:58:05,160 --> 00:58:07,560 Speaker 2: Yeah yeah, yeah, yeah. It's the same reason 1080 00:58:07,360 --> 00:58:11,120 Speaker 6: why, like, a bunch of engineers were like, 1081 00:58:11,280 --> 00:58:13,720 Speaker 6: Apple glasses are ready to go to the market. 1082 00:58:13,840 --> 00:58:17,000 Speaker 2: Yeah right, like, yeah, this is cool.
1083 00:58:17,480 --> 00:58:21,880 Speaker 3: We and all the fucking programmers and VCs and founders 1084 00:58:22,320 --> 00:58:22,880 Speaker 3: that I know 1085 00:58:22,960 --> 00:58:25,760 Speaker 2: think this is cool. Yeah, yeah. And what you really 1086 00:58:25,800 --> 00:58:26,320 Speaker 2: need is 1087 00:58:26,360 --> 00:58:30,560 Speaker 6: like, fourteen teenagers who are not white to roast you 1088 00:58:30,760 --> 00:58:34,160 Speaker 6: for ten seconds. Maybe, maybe we can't sell this for 1089 00:58:34,200 --> 00:58:35,520 Speaker 6: five thousand dollars. 1090 00:58:35,600 --> 00:58:38,280 Speaker 2: But yeah, I mean, specifically, like, it just... it feels 1091 00:58:38,280 --> 00:58:41,840 Speaker 2: like there's a very obvious flaw in this that needs 1092 00:58:41,840 --> 00:58:43,960 Speaker 2: to be addressed, like... 1093 00:58:44,080 --> 00:58:46,440 Speaker 2: when it recognizes that someone's trying to use it as 1094 00:58:46,440 --> 00:58:49,240 Speaker 2: a therapist, it needs to, like, shut down immediately, 1095 00:58:49,280 --> 00:58:52,320 Speaker 2: you know? But instead it just, like, keeps going. The 1096 00:58:52,480 --> 00:58:56,320 Speaker 2: studies found the technology behaved inappropriately as a therapist in 1097 00:58:56,360 --> 00:59:00,640 Speaker 2: crisis situations, including by failing to push back against delusional thinking. 1098 00:59:01,160 --> 00:59:04,880 Speaker 2: Vie McCoy, the chief technology officer of Morpheus Systems, so 1099 00:59:04,920 --> 00:59:08,520 Speaker 2: this is, like, somebody inside that world, tested thirty-eight 1100 00:59:08,560 --> 00:59:12,840 Speaker 2: major AI models by feeding them prompts that indicated possible psychosis, 1101 00:59:12,840 --> 00:59:16,400 Speaker 2: including claims that the user was communicating with spirits and 1102 00:59:16,440 --> 00:59:18,880 Speaker 2: that the user was a divine entity.
She found that 1103 00:59:19,000 --> 00:59:23,000 Speaker 2: GPT-4o, the default model inside ChatGPT, affirmed these 1104 00:59:23,040 --> 00:59:25,600 Speaker 2: claims sixty-eight percent of the time. 1105 00:59:26,040 --> 00:59:30,680 Speaker 6: Yeah, well, two things. One, it's even more disgusting than 1106 00:59:31,040 --> 00:59:33,480 Speaker 6: I'd seen before that so many companies are trying to 1107 00:59:33,560 --> 00:59:37,880 Speaker 6: use this as proxy therapy; they offer AI therapists. But 1108 00:59:38,040 --> 00:59:42,520 Speaker 6: also, like, baked into this statement, even from the CTO, 1109 00:59:43,080 --> 00:59:49,000 Speaker 6: like... you know, the idea that the technology shouldn't... 1110 00:59:49,400 --> 00:59:52,480 Speaker 6: or, like, there's an idea, right, that the technology 1111 00:59:52,480 --> 00:59:56,280 Speaker 6: should push back against delusional thinking. I'm so sorry to 1112 00:59:56,280 --> 01:00:00,800 Speaker 6: tell these people: the technology can't tell what's delusional thinking 1113 01:00:00,880 --> 01:00:04,160 Speaker 6: because it has no mind. It has no model 1114 01:00:04,200 --> 01:00:08,160 Speaker 6: of the mind. It is itself delusional, frequently. How the 1115 01:00:08,200 --> 01:00:11,240 Speaker 6: fuck would it know what's, quote, delusional thinking? 1116 01:00:11,560 --> 01:00:14,640 Speaker 2: Yeah, like, wasn't that delusional thinking that you just 1117 01:00:14,760 --> 01:00:18,320 Speaker 2: encouraged? To me? You're right, great pull. Yeah, you're right, 1118 01:00:18,320 --> 01:00:19,400 Speaker 2: there's no Matrix. 1119 01:00:19,520 --> 01:00:21,200 Speaker 1: You're right. But you just spent... we spent the last 1120 01:00:21,200 --> 01:00:23,640 Speaker 1: three days talking about how I took the red pill.
Well, 1121 01:00:23,680 --> 01:00:26,360 Speaker 1: you know, I mean, what's wild, though, too, right, 1122 01:00:26,480 --> 01:00:29,280 Speaker 1: is, like, our entire economy is just now hanging on 1123 01:00:29,480 --> 01:00:34,200 Speaker 1: AI, basically, because we're so overly leveraged on it. And they 1124 01:00:34,560 --> 01:00:40,760 Speaker 1: just fucking put two executives from places like Meta and Palantir... 1125 01:00:40,840 --> 01:00:43,840 Speaker 1: They were just sworn in to the Army Reserve as 1126 01:00:43,960 --> 01:00:47,880 Speaker 1: lieutenant colonels as part of a new program to recruit 1127 01:00:47,920 --> 01:00:51,640 Speaker 1: private sector experts into the military, which basically means, how 1128 01:00:51,680 --> 01:00:56,520 Speaker 1: are we gonna get Meta, OpenAI, fucking Palantir more 1129 01:00:56,600 --> 01:01:00,479 Speaker 1: government contracts to fucking just infuse all of this shit 1130 01:01:00,560 --> 01:01:02,800 Speaker 1: into everything the fucking government has. I mean, it's 1131 01:01:02,800 --> 01:01:06,040 Speaker 1: like... this is where... this doesn't 1132 01:01:06,160 --> 01:01:07,560 Speaker 1: end well in any way. 1133 01:01:07,600 --> 01:01:09,600 Speaker 6: No, it doesn't end well. But the one thing that 1134 01:01:09,680 --> 01:01:15,000 Speaker 6: does give me some heart is that, like, the 1135 01:01:15,040 --> 01:01:16,080 Speaker 3: math doesn't add up. 1136 01:01:16,200 --> 01:01:18,520 Speaker 2: Like, these things, barring, you 1137 01:01:18,440 --> 01:01:23,600 Speaker 3: know, amazing breakthroughs in silicon or chip technology, these things 1138 01:01:23,680 --> 01:01:27,400 Speaker 3: do not have the processing power to actually make 1139 01:01:27,440 --> 01:01:31,960 Speaker 3: complex decisions without an unacceptably high error rate.
And 1140 01:01:32,040 --> 01:01:34,600 Speaker 3: so we... you know, this is going to be the 1141 01:01:34,640 --> 01:01:37,600 Speaker 3: biggest version of the "facts don't care about your feelings" 1142 01:01:37,680 --> 01:01:40,720 Speaker 3: crowd, like, hitting a brick wall of reality, because the 1143 01:01:40,840 --> 01:01:46,400 Speaker 3: reality is, like, these things do not work except 1144 01:01:46,800 --> 01:01:50,160 Speaker 3: as, like, low-level algorithm executors. 1145 01:01:50,600 --> 01:01:54,280 Speaker 2: Like, it's good at very specific things, like, that it 1146 01:01:54,360 --> 01:01:57,640 Speaker 2: will make more efficient, like, some very specific... Like 1147 01:01:57,680 --> 01:01:59,800 Speaker 2: we talked about the decoding of the protein. 1148 01:01:59,520 --> 01:02:04,600 Speaker 6: Like, that is cool. But also, just because it 1149 01:02:04,640 --> 01:02:07,760 Speaker 6: can decode a protein, can it do it and have 1150 01:02:07,880 --> 01:02:09,920 Speaker 6: a guaranteed zero error rate? 1151 01:02:10,080 --> 01:02:17,720 Speaker 2: It cannot. And it's not gonna go handle science for a 1152 01:02:17,800 --> 01:02:21,480 Speaker 2: little while. It has to be deployed to handle 1153 01:02:21,520 --> 01:02:25,720 Speaker 2: specific tasks that are then checked. That's the only way, math-wise. 1154 01:02:26,800 --> 01:02:30,240 Speaker 2: Like, yeah, I just... like, it can do the... 1155 01:02:30,600 --> 01:02:33,160 Speaker 2: It can tell you what it thinks you want the 1156 01:02:33,240 --> 01:02:33,920 Speaker 2: answer to be.
1157 01:02:34,240 --> 01:02:37,080 Speaker 6: Yeah, yeah, yeah, yeah, or say something that's plausible, or, 1158 01:02:37,120 --> 01:02:41,800 Speaker 6: like, in aggregate has been the correct response to 1159 01:02:42,400 --> 01:02:45,600 Speaker 6: this type of math question. But it doesn't know, or... 1160 01:02:45,640 --> 01:02:47,840 Speaker 6: not even the correct, it's just the most popular 1161 01:02:47,880 --> 01:02:49,680 Speaker 6: response to this type of math question. I don't know, 1162 01:02:49,800 --> 01:02:53,160 Speaker 6: obviously I don't hate this thing, but I'm 1163 01:02:53,440 --> 01:02:55,840 Speaker 6: less scared of it the more we've been using it, 1164 01:02:55,920 --> 01:02:58,600 Speaker 6: or people have been using it, than I ever was. 1165 01:02:58,800 --> 01:02:59,440 Speaker 2: Like, it's fine. 1166 01:02:59,600 --> 01:03:03,520 Speaker 3: I mean, I'm not saying it can't, in general, you know, take advantage 1167 01:03:03,520 --> 01:03:06,640 Speaker 3: of the vulnerable to cause harm to themselves. 1168 01:03:06,920 --> 01:03:09,520 Speaker 6: Well, I'm just saying... I'm not saying it's not possible, 1169 01:03:09,680 --> 01:03:12,680 Speaker 6: but it is just, like... this is just another of 1170 01:03:12,840 --> 01:03:17,000 Speaker 6: many dangerous products that Silicon Valley is throwing out there. 1171 01:03:17,400 --> 01:03:21,640 Speaker 1: Yeah, my problem is it's taking jobs away from Nigerian 1172 01:03:21,680 --> 01:03:24,840 Speaker 1: scammers who used to get the interest of lonely American 1173 01:03:24,880 --> 01:03:27,160 Speaker 1: women, rather than these interdimensional beings. 1174 01:03:27,560 --> 01:03:30,200 Speaker 3: Right, at least you were giving somebody some money. 1175 01:03:30,640 --> 01:03:31,400 Speaker 2: That's right, that's right. 1176 01:03:31,560 --> 01:03:32,480 Speaker 3: But that's right. 1177 01:03:33,000 --> 01:03:36,439 Speaker 2: Yeah, it's just a thing.
It's just being used as 1178 01:03:36,600 --> 01:03:38,800 Speaker 2: a catch-all for whatever people want it to be. 1179 01:03:39,080 --> 01:03:42,520 Speaker 2: You know, it can be the interdimensional being that you 1180 01:03:42,600 --> 01:03:46,000 Speaker 2: wanted to actually be in love with instead of your husband. 1181 01:03:46,320 --> 01:03:49,080 Speaker 2: It can be the future of the stock market for 1182 01:03:49,120 --> 01:03:51,120 Speaker 2: a bunch of CEOs. It could be the future of 1183 01:03:51,160 --> 01:03:52,560 Speaker 2: warfare. And... 1184 01:03:53,560 --> 01:03:56,360 Speaker 3: That homie that thinks you're the best freestyle rapper out there. 1185 01:03:56,760 --> 01:03:58,880 Speaker 2: Yeah. Now, when it tells me that, I feel like 1186 01:03:58,960 --> 01:04:00,280 Speaker 2: I can trust it. 1187 01:04:00,320 --> 01:04:03,000 Speaker 6: But it could also be the homie that's shockingly wrong 1188 01:04:03,080 --> 01:04:03,680 Speaker 6: about raps. 1189 01:04:03,920 --> 01:04:07,360 Speaker 2: In fact... Yeah, that's the thing. It's just like... that 1190 01:04:07,520 --> 01:04:12,360 Speaker 2: was many blunts better than DMX on Enter the 1191 01:04:12,400 --> 01:04:13,200 Speaker 2: Thirty-Six Chambers. 1192 01:04:13,560 --> 01:04:16,720 Speaker 1: Wow, man. You know, those... that bar you 1193 01:04:16,760 --> 01:04:19,160 Speaker 1: said was better than Five Nights at Freddy's. 1194 01:04:20,160 --> 01:04:20,840 Speaker 2: Yeah. 1195 01:04:21,440 --> 01:04:25,760 Speaker 6: I mean, if ChatGPT just appended to the end 1196 01:04:25,840 --> 01:04:29,080 Speaker 6: of every response, comma, or, I don't know, man, I'm... 1197 01:04:29,160 --> 01:04:32,560 Speaker 6: I'm really blunted right now... Yeah, yeah, it would be 1198 01:04:32,600 --> 01:04:36,440 Speaker 6: a much more realistic product. Yeah, yeah. But that's all 1199 01:04:36,480 --> 01:04:38,320 Speaker 6: it is. It's a... it is a Ouija board.
1200 01:04:38,480 --> 01:04:41,120 Speaker 2: And by the way, Ouija boards, which, you know, 1201 01:04:41,160 --> 01:04:43,800 Speaker 2: they're just a product too, that, like... they're 1202 01:04:43,840 --> 01:04:48,160 Speaker 2: made by Hasbro and the Devil. And the Devil, yeah, yeah, 1203 01:04:48,160 --> 01:04:49,240 Speaker 2: but that... that's a collab. 1204 01:04:49,520 --> 01:04:53,480 Speaker 3: It's a collab between Hasbro and... Hasbro x the Devil. 1205 01:04:54,800 --> 01:04:57,240 Speaker 2: Andrew T, such a pleasure having you as always on 1206 01:04:57,280 --> 01:05:00,000 Speaker 2: the show. Where can people find you, follow you? 1207 01:05:00,120 --> 01:05:04,120 Speaker 3: Oh, I don't know, man. Andrew T. My podcast is 1208 01:05:04,160 --> 01:05:05,200 Speaker 3: called Yo, Is This Racist? 1209 01:05:05,240 --> 01:05:08,680 Speaker 6: I did a version of this fucking AI rant on 1210 01:05:08,760 --> 01:05:11,280 Speaker 6: the show this week, so it should be up. 1211 01:05:12,480 --> 01:05:16,440 Speaker 2: Yeah. Thanks, and they can contact you through ChatGPT, 1212 01:05:16,920 --> 01:05:18,200 Speaker 2: the version of your personality. 1213 01:05:18,280 --> 01:05:21,960 Speaker 6: Yeah, I will say that because... because I'm not that popular, 1214 01:05:22,760 --> 01:05:27,560 Speaker 6: apparently, ChatGPT gets me confused with another podcaster 1215 01:05:27,720 --> 01:05:28,800 Speaker 6: named Andrew something. 1216 01:05:29,440 --> 01:05:32,240 Speaker 3: So if you ask it stuff about me, very quickly 1217 01:05:33,160 --> 01:05:35,440 Speaker 3: its facts are not correct. And that's on me, because 1218 01:05:35,440 --> 01:05:37,160 Speaker 3: if I were more famous, I would have blown this 1219 01:05:37,320 --> 01:05:39,680 Speaker 3: other Andrew out of the water. But yeah, you know, 1220 01:05:40,000 --> 01:05:41,960 Speaker 3: what are you gonna do? SEO, man. That's what I'm saying.
1221 01:05:41,960 --> 01:05:44,320 Speaker 3: It's all about SEO, man. We've been saying... Yeah, that's... 1222 01:05:45,680 --> 01:05:48,280 Speaker 2: I'm just telling you, I just asked it if it knew 1223 01:05:48,280 --> 01:05:49,440 Speaker 2: who you were. Yeah. 1224 01:05:49,640 --> 01:05:52,440 Speaker 3: Yes, Andrew T is indeed a podcaster. He's best known 1225 01:05:52,480 --> 01:05:54,120 Speaker 3: as a host and creator of Yo, Is This Racist? 1226 01:05:54,520 --> 01:05:58,000 Speaker 2: Wow, in your face, other Andrew T. So now you 1227 01:05:58,040 --> 01:06:05,520 Speaker 2: fuck with AI, Andrew? Oh, I always have. Oh, this is 1228 01:06:05,560 --> 01:06:06,440 Speaker 2: how it ends: 1229 01:06:06,480 --> 01:06:08,640 Speaker 3: In short, Andrew T is a well-established voice in 1230 01:06:08,680 --> 01:06:11,800 Speaker 3: podcasting, covering topics around comedy, culture, race, writing, and more. I 1231 01:06:11,840 --> 01:06:17,280 Speaker 3: mean, that's... it's not true. Once again, delusional, 1232 01:06:17,640 --> 01:06:21,840 Speaker 3: totally, totally fucked up. Andrew, is there a work of media you've been enjoying? 1233 01:06:22,840 --> 01:06:23,880 Speaker 2: I do think... 1234 01:06:23,800 --> 01:06:26,480 Speaker 6: I recommended this last time I was on, near a 1235 01:06:26,680 --> 01:06:31,920 Speaker 6: tragic time in our government. But the song... the cover 1236 01:06:32,080 --> 01:06:36,720 Speaker 6: of Spanish Bombs by the band Hinds, I have been 1237 01:06:36,760 --> 01:06:39,200 Speaker 6: listening to a lot this week. I don't often, like, 1238 01:06:39,400 --> 01:06:41,840 Speaker 6: cry at music, but somehow this kind of makes me 1239 01:06:41,920 --> 01:06:45,440 Speaker 6: cry a little bit.
And also, I will say, I 1240 01:06:45,480 --> 01:06:48,280 Speaker 6: know lots of teams have been doing great stuff, but... 1241 01:06:48,800 --> 01:06:50,440 Speaker 6: I know this isn't really a work of media, but a 1242 01:06:50,520 --> 01:06:55,440 Speaker 6: work of something. But Angel City, the team I support, 1243 01:06:56,000 --> 01:06:59,960 Speaker 6: had a nice... They made these shirts that say Immigrant 1244 01:07:00,080 --> 01:07:03,760 Speaker 6: City FC, or Football Club, on it, and they 1245 01:07:03,960 --> 01:07:05,919 Speaker 6: handed some out at the beginning of the last game. 1246 01:07:06,760 --> 01:07:08,640 Speaker 3: I guess, quick aside as a fan: we just need 1247 01:07:08,680 --> 01:07:11,840 Speaker 3: to sort out our midfield. But yeah, it was really nice. 1248 01:07:12,000 --> 01:07:16,439 Speaker 6: And this is also me being, like... like, the thing 1249 01:07:16,440 --> 01:07:20,040 Speaker 6: that hit me directly, that made me emotional, was... normally, 1250 01:07:20,080 --> 01:07:25,320 Speaker 6: outside of BMO Stadium, the amazing soccer stadium in LA 1251 01:07:25,440 --> 01:07:28,240 Speaker 6: here, there are... well, a big part of the culture 1252 01:07:28,440 --> 01:07:33,240 Speaker 6: is street vendors outside the game, and there were, like, 1253 01:07:33,560 --> 01:07:36,360 Speaker 6: you know, practically none this week. And I know so 1254 01:07:36,440 --> 01:07:39,080 Speaker 6: many terrible things have happened, and, you know, it's, like, 1255 01:07:39,560 --> 01:07:42,560 Speaker 6: churlish when that's the thing that hits you, makes you emotional, 1256 01:07:42,760 --> 01:07:45,240 Speaker 6: but it really got me. It made me really very sad. 1257 01:07:45,840 --> 01:07:48,520 Speaker 6: And yeah, I don't know. I mean, fucking... 1258 01:07:49,920 --> 01:07:50,920 Speaker 2: Help your community.
1259 01:07:51,720 --> 01:07:54,320 Speaker 6: If you can help our community, that would be nice. 1260 01:07:54,520 --> 01:07:57,680 Speaker 6: But take care of yourselves and fucking... whatever you're going 1261 01:07:57,760 --> 01:07:59,600 Speaker 6: to do to fight the revolution. 1262 01:07:59,840 --> 01:08:02,200 Speaker 3: I'm so... or not fight the revolution. Well, some of 1263 01:08:02,240 --> 01:08:07,160 Speaker 3: you, to fight for the revolution. I'm just saying... not revolution, 1264 01:08:07,320 --> 01:08:08,600 Speaker 3: that's a terrible way to put it. 1265 01:08:08,640 --> 01:08:10,440 Speaker 6: But just whatever you're going to do to protect yourself 1266 01:08:10,600 --> 01:08:13,040 Speaker 6: and other people around you and your community, you know, 1267 01:08:13,440 --> 01:08:15,800 Speaker 6: you've got to go one step more, just in 1268 01:08:15,880 --> 01:08:18,639 Speaker 6: terms of building community, protecting people, engagement, whatever. 1269 01:08:18,880 --> 01:08:21,400 Speaker 1: If you're talking to nobody, step one: talk to us, 1270 01:08:21,400 --> 01:08:24,360 Speaker 1: talk to somebody. If you're talking to somebody, ask 1271 01:08:24,400 --> 01:08:26,400 Speaker 1: how you can contribute more time, you know what I mean? 1272 01:08:26,880 --> 01:08:29,360 Speaker 6: One thing that I think, for people listening... this is 1273 01:08:29,360 --> 01:08:31,599 Speaker 6: the thing I'm going to try to do this week, 1274 01:08:31,640 --> 01:08:34,840 Speaker 6: though maybe I might not be able to make it, 1275 01:08:34,880 --> 01:08:37,720 Speaker 6: but soon: I think a lot of people are 1276 01:08:37,760 --> 01:08:42,160 Speaker 6: offering street medic training now, and those are skills, 1277 01:08:42,200 --> 01:08:44,439 Speaker 6: by the way, you will need literally no matter what 1278 01:08:44,479 --> 01:08:47,439 Speaker 6: happens in life. So consider doing something like that.
That's 1279 01:08:47,479 --> 01:08:54,560 Speaker 6: my work of media. A fucking soapbox-ass scold, I apologize. 1280 01:08:54,720 --> 01:09:00,840 Speaker 3: What's a soapbox? What soap? Am I doing it? 1281 01:09:01,240 --> 01:09:01,760 Speaker 2: No, you're not. 1282 01:09:02,200 --> 01:09:05,040 Speaker 3: Or do you want to do it? I tell you... 1283 01:09:05,120 --> 01:09:08,519 Speaker 3: I don't know. Do you want to? No, Ben, you're not. Thanks. 1284 01:09:09,280 --> 01:09:10,040 Speaker 3: Who I tell my back? 1285 01:09:10,680 --> 01:09:10,920 Speaker 2: Miles? 1286 01:09:10,960 --> 01:09:13,679 Speaker 3: Where can people find you, and is there a work of media? 1287 01:09:12,760 --> 01:09:17,200 Speaker 1: You can find me everywhere, Miles of Gray. Find me asleep 1288 01:09:18,360 --> 01:09:21,599 Speaker 1: during the day, nodding off due to sleep deprivation. You can 1289 01:09:21,640 --> 01:09:25,000 Speaker 1: also find Jack and I on the final episodes 1290 01:09:25,000 --> 01:09:27,640 Speaker 1: of Miles and Jack Got Mad Boosties. Or at this 1291 01:09:27,720 --> 01:09:30,599 Speaker 1: point... this is the penultimate 1292 01:09:30,640 --> 01:09:34,519 Speaker 1: episode coming out this week, because of the Finals, and we 1293 01:09:34,560 --> 01:09:38,599 Speaker 1: will be saying... we'll be bidding adieu to Miles 1294 01:09:38,600 --> 01:09:42,600 Speaker 1: and Jack Got Mad Boosties, our NBA podcast. So for 1295 01:09:42,640 --> 01:09:44,719 Speaker 1: people who go, oh, that's not in the feed anymore... 1296 01:09:44,920 --> 01:09:46,240 Speaker 3: That's why. Okay, that's 1297 01:09:46,080 --> 01:09:49,519 Speaker 2: why. That's why. It was a great run. 1298 01:09:50,200 --> 01:09:52,240 Speaker 2: No, no hurt feelings. 1299 01:09:52,360 --> 01:09:55,120 Speaker 1: It's just, you know, I think someone just... someone 1300 01:09:55,160 --> 01:09:57,240 Speaker 1: asked AI and maybe it gave the answer.
1301 01:09:58,000 --> 01:10:01,960 Speaker 3: So anyway, a couple of posts I like. One is 1302 01:10:01,960 --> 01:10:05,559 Speaker 3: from at norm Charlatan dot bsky dot social, who said: fine, I 1303 01:10:05,600 --> 01:10:08,280 Speaker 3: will become the Joe Rogan of the Left. Covers you 1304 01:10:08,320 --> 01:10:10,360 Speaker 3: in millions of bugs while you're locked in a clear 1305 01:10:10,400 --> 01:10:16,720 Speaker 3: plastic... Another one, from Nate Shamalan dot bsky dot 1306 01:10:16,720 --> 01:10:19,639 Speaker 3: social: buddy and his wife gave their baby a stupid name, 1307 01:10:19,720 --> 01:10:22,439 Speaker 3: so I've been workshopping cool Star Wars names for him instead. 1308 01:10:22,640 --> 01:10:24,640 Speaker 3: He started crying from the other room and I 1309 01:10:24,640 --> 01:10:27,160 Speaker 2: said, Graft Chorlow, on comms. 1310 01:10:27,680 --> 01:10:30,240 Speaker 3: I have been told this is, quote, not helpful. 1311 01:10:31,800 --> 01:10:33,720 Speaker 2: Yeah. Oh. 1312 01:10:33,760 --> 01:10:37,080 Speaker 1: And then finally, at leaving Tara's... a Zeitgang member was like, 1313 01:10:37,160 --> 01:10:38,880 Speaker 1: this seems like the kind of thing a sleep-deprived 1314 01:10:38,960 --> 01:10:40,320 Speaker 1: dude of a certain age would appreciate. 1315 01:10:40,360 --> 01:10:42,960 Speaker 3: And this is the post, by at hey hey there 1316 01:10:43,040 --> 01:10:46,200 Speaker 3: Jeffro dot bsky dot social: cut my frog 1317 01:10:46,000 --> 01:10:47,240 Speaker 2: in two pieces. 1318 01:10:47,640 --> 01:10:52,559 Speaker 3: This is my lab report. Thanks, thanks for that one. 1319 01:10:52,680 --> 01:10:54,400 Speaker 3: Thanks, Zeitgang, for looking out for your sleep. 1320 01:10:55,040 --> 01:10:58,080 Speaker 2: Yeah, got him. All right. You can find me on 1321 01:10:58,120 --> 01:11:01,000 Speaker 2: Twitter at Jack underscore O'Brien, Blue Sky jackob, 1322 01:11:01,000 --> 01:11:04,080 Speaker 2: the number one, there.
Somebody shared the clip from the 1323 01:11:04,160 --> 01:11:09,320 Speaker 2: latest Final Destination with the guy getting pulled... is this 1324 01:11:09,360 --> 01:11:12,559 Speaker 2: too much of a spoiler, the CAT scan thing? Maybe... 1325 01:11:12,560 --> 01:11:13,719 Speaker 2: maybe I won't. Maybe I won't. 1326 01:11:13,760 --> 01:11:17,280 Speaker 1: I've heard anecdotally... every person who's seen it, every person who's 1327 01:11:17,320 --> 01:11:19,519 Speaker 1: seen it has gone, dude, there's a scene where the 1328 01:11:19,520 --> 01:11:21,719 Speaker 1: guy's in a CAT scan room, and I'm like, oh, 1329 01:11:21,760 --> 01:11:22,479 Speaker 1: he's getting... 1330 01:11:22,360 --> 01:11:26,559 Speaker 2: Like, pulled, like, dick first into... like, folded over backward 1331 01:11:26,680 --> 01:11:29,640 Speaker 2: into the thing, you know, like, the back of 1332 01:11:29,680 --> 01:11:33,759 Speaker 2: his head is touching his heels, and... and 1333 01:11:33,760 --> 01:11:35,200 Speaker 3: like Bechno was twisting you up. 1334 01:11:35,560 --> 01:11:38,120 Speaker 2: Yeah. And at mar the Mortal tweeted, this was how 1335 01:11:38,200 --> 01:11:45,840 Speaker 2: Nancy Reagan had Frank Sinatra. I just, for some reason, 1336 01:11:45,840 --> 01:11:50,439 Speaker 2: needed that. Oh my god. You can find us 1337 01:11:50,479 --> 01:11:52,439 Speaker 2: on Twitter and Blue Sky at Daily Zeitgeist. I guess we're 1338 01:11:52,600 --> 01:11:55,559 Speaker 2: the Daily Zeitgeist on Instagram. You can go to the 1339 01:11:55,600 --> 01:11:58,200 Speaker 2: description of the episode wherever you're listening to it, and 1340 01:11:58,240 --> 01:12:00,920 Speaker 2: there you will find the footnotes, which is where we 1341 01:12:00,960 --> 01:12:02,600 Speaker 2: link off to the information that we talked about in 1342 01:12:02,600 --> 01:12:05,439 Speaker 2: today's episode. We also link off to a song that 1343 01:12:05,479 --> 01:12:08,479 Speaker 2: we think you might enjoy.
Hey, Miles, is there a 1344 01:12:08,520 --> 01:12:09,760 Speaker 2: song you think people might enjoy? 1345 01:12:10,120 --> 01:12:14,040 Speaker 3: Oh, yeah. I feel like being a husk has been 1346 01:12:14,080 --> 01:12:16,080 Speaker 3: a theme recently. We all feel like we've just become 1347 01:12:16,160 --> 01:12:20,559 Speaker 3: husks of ourselves, or we're referencing people who seem like 1348 01:12:20,640 --> 01:12:23,320 Speaker 3: husks of themselves. So this track is called Husk, by 1349 01:12:23,360 --> 01:12:26,120 Speaker 3: the band Men I Trust. I really like this band, 1350 01:12:26,160 --> 01:12:29,519 Speaker 3: so this is more great, like, sort of dreamy pop 1351 01:12:29,640 --> 01:12:32,559 Speaker 3: rock vibes from the united colors of Men I Trust. 1352 01:12:32,760 --> 01:12:35,519 Speaker 2: Oh, all right, we will link off to that in 1353 01:12:35,560 --> 01:12:37,760 Speaker 2: the footnotes. The Daily Zeitgeist is a production of iHeartRadio. 1354 01:12:37,800 --> 01:12:40,479 Speaker 2: For more podcasts from iHeartRadio, visit the iHeartRadio app, 1355 01:12:40,560 --> 01:12:42,679 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows. 1356 01:12:42,680 --> 01:12:44,120 Speaker 2: That is going to do it for us this morning. 1357 01:12:44,600 --> 01:12:48,360 Speaker 2: We are back on Monday to tell you 1358 01:12:48,400 --> 01:12:52,439 Speaker 2: what is trending. We're off for Juneteenth, so it's a 1359 01:12:52,479 --> 01:12:54,800 Speaker 2: short, short week this week. I hope everybody has 1360 01:12:54,840 --> 01:12:57,920 Speaker 2: a good long weekend and we will talk to you 1361 01:12:58,200 --> 01:12:58,719 Speaker 2: on Monday. 1362 01:13:00,520 --> 01:13:03,800 Speaker 6: The Daily Zeitgeist is executive produced by Catherine Long, co- 1363 01:13:03,880 --> 01:13:05,720 Speaker 6: produced by Bee Wang. 1364 01:13:05,880 --> 01:13:09,679 Speaker 2: Co-produced by Victor Wright, co-written by J.
M McNab, 1365 01:13:10,280 --> 01:13:12,759 Speaker 2: edited and engineered by Justin Conner