Speaker 1: From UFOs to psychic powers and government conspiracies, history is riddled with unexplained events. You can turn back now, or learn this stuff they don't want you to know.

Speaker 2: A production of iHeartRadio.

Speaker 3: Hello, welcome back to the show. My name is Matt. My name is Noel.

Speaker 4: They call me Ben. We're joined as always with our super producer Dylan "the Tennessee Pal" Fagan. Most importantly, you are here. That makes this the stuff they don't want you to know. If you are practicing linear time, please do check out part one of this continuing series on governments and AI. Oh my gosh, it's an AI week here for us at Stuff They Don't Want You to Know.

Speaker 3: Yes, and our words are coming from our mouths, which are operated by our brains. And we are still human. How...

Speaker 4: Quaint. How folksy, how wholesome. We're gonna pause for a word from our sponsors. And here's where it gets crazy. And we have returned. We're still us, despite the fact that civilization is very much in a pod people situation right now. Humans did not invent war, just like Chick-fil-A did not invent the chicken sandwich.

Speaker 5: Neither the chicken nor the chicken sandwich. Both claims are spurious.

Speaker 3: You're telling me... yeah, you're telling me that Chick-fil-A didn't make the first chicken sandwich?

Speaker 4: That's what we're telling you, Matt. So I'm sorry you had to learn it.

Speaker 5: My worldview... mine, too, once had a Chick-fil-A-centric worldview, Matt.

Speaker 2: But there's hope. You can move past this. I believe in you.

Speaker 3: Okay.

Speaker 4: Oh, yes, we've got a message from Dylan in all capital letters. Dylan, could you just read this on air?

Speaker 3: Truett Cathy slander? Mm hmm, precisely. Blasphemy. Pointing.

Speaker 4: And you guys know, that guy, that Truett, did give me money and many of his books.

Speaker 3: Uh, yeah.

Speaker 5: If you're gonna have a Sith Lord mentor, might as well be Truett Cathy.
Speaker 3: I recently went to... I don't know, I guess you would call it an ode to Cathy, uh, in Loganville, that just opened up. It's one of the real fancy Chick-fil-As that has basically a museum to him.

Speaker 4: Is it like Truett's, with the classic cars, working lot? Yeah? Okay. Yes, I wish I didn't know. But we're nailing it here, right, just with our earlier analogies. Humanity did not invent the concept of war, just perfected it, and now it's a constant fascination. So it's no surprise, we've talked about this in the past, that any new technological breakthrough either comes from war or is inevitably evaluated and auditioned to see whether it has utility in war. That's how it happens.

Speaker 1: You know.

Speaker 4: Judge Holden from Blood Meridian nailed it. Horrible guy, but very smart. He's right, a certain kind of smart.

Speaker 5: So yeah, right. So it's not for nothing that the Austrian Foreign Minister Alexander Schallenberg called AI in warfare, to your previous point, Ben, the Oppenheimer moment of our generation. It is that kind of magic-science kind of, you know, collision that absolutely cannot be walked back.

Speaker 3: Yeah, oh gosh, I know. It just feels so... when you see the money pouring out of governments into private sector hands to develop this kind of thing, when you see that type of thing happening, you kind of know everybody is just trying to get that money, and everybody... I'm sure these companies are doing their absolute best to make the best wartime AI stuff they can. Ultimately, these are private entities taking in giant grants, giant amounts of money to do stuff, and part of that thing is to build a company that's gonna last and do other things. And it just calls into question everything to me.
Yeah, especially when... I imagine, we're recording this on September second, the day before... wait, the victory parade in China that's gonna happen tomorrow, on Wednesday, September third. And you just imagine what all of these leaders are getting together and thinking about and imagining, because AI is definitely one of the main talking points that's happening right around dinner tables, probably, and luxurious outings to various squares.

Speaker 4: And torture rooms and so on.

Speaker 3: Maybe. But it's just so odd to me that we're at this point, at this Oppenheimer moment, when everybody's trying to do this, but in the end, it feels like such a smoke-and-mirrors way to get an influx of money in the private sector.

Speaker 4: Right. It's resource extraction, just not in the way you would imagine, you know, from the jump. And it goes to our earlier conversations about the transformation from government to corporate governance. Right? There's always been a Vought American, like what was referenced in The Boys. There's always been something like a defense contractor, something like Janissaries or Crusaders or Templars, that ultimately practice regulatory capture. I am fun at parties.

Speaker 3: Do you guys ever imagine the scenario of AI as being a targeted, like, thought virus that's actually a weaponized concept?

Speaker 5: That would imply... that would imply that someone does know what it's capable of and has influenced it in such a way to achieve a certain result.

Speaker 3: I'm saying... I'm saying the opposite. I'm saying, what if the concept of developing this AI is actually being used against specific governments, like, all right, Western governments, as a way to drain resources, like money resources, and...

Speaker 5: Just misinformation, and just, like, screw up... like, throw a wrench in the works.

Speaker 3: Think about the water resources that are being used to build this stuff, uh, the precious metals that are being extracted and used for this stuff and prioritized.
Speaker 5: You know, like, Matt, sadly, it feels like another one of those examples where we're just dumb enough to do it to ourselves. We don't even need some sort of overarching plot to trick us into undoing ourselves.

Speaker 4: You just... look, here's how it works, all right. The world is a big room, right? The world leaders are just walking around, most with very little knowledge of long-term goals.

Speaker 3: Right.

Speaker 4: They roughly know they want to go north, for instance, and then all you need to make them go a little northeast is to walk by at an opportune time and just nudge the elbow a little bit. Shout out to Turkmenistan.

Speaker 2: So, just get a pinch.

Speaker 3: They just... not just... what was that phrase? Just a whisper on the elbow. I love that, Ben.

Speaker 4: It's just on the elbow, Matt Frederick. So war has always been a business, right? Or, shout out to Smedley Butler, it's been a racket. And just like every other industry on the planet, war is figuring out AI. The current real-world forefront of this is going to be seen in the ongoing Russian invasion of Ukraine. We're talking autonomous drones, algorithm-driven targeting systems. It makes it easier to commit mass murder, and it concurrently allows human leaders to just shrug and say, oh, the computer got it wrong, when a fatal mistake occurs.

Speaker 3: Yeah. Sorry, we hit that hospital in Gaza and it had all those journalists in it. We were actually targeting this cell phone. Our...

Speaker 4: Bad. And hitting it twice was an error?

Speaker 3: Yeah, computer... yeah, I thought the first one wasn't effective on the cell phone that we were targeting, so we had to drop another one.

Speaker 4: Now, we got to spend some time on this. We're recording, as we said, on Tuesday, September second, twenty twenty five. Right now, in this conflict between Russia and Ukraine, seventy to eighty percent of all battlefield casualties are due to drones.
This 144 00:08:47,520 --> 00:08:50,520 Speaker 4: is a drone war, which means it ultimately becomes a 145 00:08:50,559 --> 00:08:55,400 Speaker 4: programming war. It's the world's worst math bowl, is the 146 00:08:55,400 --> 00:08:56,319 Speaker 4: way to think about it. 147 00:08:56,559 --> 00:08:59,880 Speaker 3: And we're not talking predator drones some right, some of 148 00:08:59,880 --> 00:09:03,840 Speaker 3: these larger drone strikes and things, but often we're talking about, 149 00:09:03,880 --> 00:09:06,560 Speaker 3: just as you said, drones that are smaller scale with 150 00:09:06,640 --> 00:09:10,920 Speaker 3: explosives on them, that are trained via these various software 151 00:09:11,080 --> 00:09:17,280 Speaker 3: systems to target specific things and explode at specific times. 152 00:09:17,480 --> 00:09:21,360 Speaker 3: You have human intervention often right the controller of a drone, 153 00:09:21,520 --> 00:09:23,920 Speaker 3: but some of them are designed to just go in 154 00:09:24,080 --> 00:09:25,960 Speaker 3: and explode. 155 00:09:26,240 --> 00:09:31,000 Speaker 4: They're the new Kamakazi, you know, divine wind. They can 156 00:09:31,840 --> 00:09:34,520 Speaker 4: they can save your soldiers from the direct horrors of 157 00:09:34,600 --> 00:09:37,880 Speaker 4: the battlefield. You lose one to enemy forces. And if 158 00:09:37,880 --> 00:09:40,959 Speaker 4: you're talking like you're saying, Matt, if you're talking about 159 00:09:41,040 --> 00:09:44,800 Speaker 4: most of the drones deployed the enemy forces when they 160 00:09:44,880 --> 00:09:47,000 Speaker 4: down a drone, they're not going to discover any new 161 00:09:47,080 --> 00:09:51,640 Speaker 4: revolutionary technology. There's no new tech for them to fight. Instead, 162 00:09:52,160 --> 00:09:56,199 Speaker 4: you just pop over to Amazon or Ali Baba and 163 00:09:56,240 --> 00:10:00,760 Speaker 4: you buy the commercially available component parts for the iteration 164 00:10:01,120 --> 00:10:04,640 Speaker 4: of these things. Is there a team version? There is 165 00:10:04,679 --> 00:10:07,199 Speaker 4: a team version. There is a team version. 166 00:10:08,200 --> 00:10:11,760 Speaker 3: Yeah, but it's more expensive now because of the tariffs. 167 00:10:10,960 --> 00:10:13,600 Speaker 3: I'm real. 168 00:10:13,679 --> 00:10:16,640 Speaker 4: Yeah, we're going through our stand up notes. Guys, what's 169 00:10:16,760 --> 00:10:18,440 Speaker 4: up with these tariffs? Huh? 170 00:10:18,720 --> 00:10:20,600 Speaker 3: I don't know it. Your seems like it's uniting the 171 00:10:20,600 --> 00:10:22,520 Speaker 3: world against the United States. Weird. 172 00:10:22,800 --> 00:10:28,120 Speaker 4: Weird anyway, check out our Netflix special. So let's say 173 00:10:28,120 --> 00:10:30,840 Speaker 4: you're on the other side of this war. Those drones 174 00:10:30,840 --> 00:10:33,679 Speaker 4: are wrecking you. They're not just killing soldiers, they're not 175 00:10:33,720 --> 00:10:39,640 Speaker 4: just hitting military sites. They are hitting villages, hospitals, civilians. 176 00:10:40,040 --> 00:10:43,040 Speaker 4: And now you figure out how to jam the signals 177 00:10:43,120 --> 00:10:45,920 Speaker 4: that direct the drones. 
The next response is the other side learns how to use fiber optic cables to bypass your jamming shenanigans, and then you figure out how to bypass that, and then you arrive at our current phase, which is AI-powered targeting systems that, when everything works right, when you get your djinn wish, will identify and strike targets with minimal human intervention, even in heavily jammed environments. The programming learns from every conflict, which means with each instance of battlefield use, the next iteration of AI targeting just gets more and more precise. That's why we mentioned sci-fi and demonic folk tales at the beginning. We are building artificial minds, and if you think about it, with these minds in particular, we are paying in blood. We are sacrificing humans to evolve this technology. Yay.

Speaker 3: Yeah, that's... sorry. It's true.

Speaker 5: The sacrificing in terms of lives lost, or in terms of, like... even, like, in terms of people's ability to earn a living wage?

Speaker 3: Well, yeah, oh, that's the thing, right? But specifically in the weapons of war, yes, you have to do real-world testing. That's the only way to know if you're actually making the stuff happen. You can go out to, you know, some missile field out in the middle of a desert somewhere, as many countries do, and do all the weapons testing you want, but until you see if it was actually effective at, let's say, taking out the target you're looking for, like the US did with Predator drones back in the day... we talked about that in our Drones episodes.

Speaker 4: Ruined a wedding in particular.

Speaker 3: Well, yeah. But how long were we attempting, since the eighties, to get some kind of autonomous targeting systems and get some kind of thing that would act like that? And now, as you said, the software is the thing that's getting tested way more than the hardware.
Speaker 4: Yeah, because you can buy the hardware, as we said. Like, if you were listening now and you can get on the internet, you can buy the hardware. Also, I guess everybody listening tonight, all of us together, already has access to the internet. But shout out to anybody who's listening to this podcast on vinyl. Thank you. We don't know how you did it, but we are impressed for sure. One thing to point out here real quick: it's not just limited to little buzzy-buzzy boys in the skies. Unmanned ground vehicles are a thing that Ukraine and Russia are actively at, and the US and China, whatever. Ukraine in specific is testing over seventy domestically developed, what they call UGVs, unmanned ground vehicles. Uggs? Weugs? Maybe we talk a little bit about the kill zone they're creating as well.

Speaker 5: So, Ukraine's work to create a fifteen-click unmanned kill zone at the front lines while producing something like four thousand drones per day, I think, is what you're referring to there, man. The hopes there would be to expand it as far as forty kilometers.

Speaker 3: So are we talking about, like, a DMZ kind of situation?

Speaker 4: Yeah. The idea is... so, for everyone in the US, fifteen kilometers, or clicks, that would be a little more than nine miles, and expanding to forty clicks would be almost twenty-five miles. Jeez. Yeah, yeah, big stuff, right? And in that area, semi-autonomous drones are judge, jury, and executioner, punishing, obliterating anything that doesn't walk correctly, right, or sends out the wrong signals.

Speaker 3: Jesus, man.

Speaker 4: So what if your job is... what if you have a job where you hop on a laptop, right, and you just get a snapshot and you click yes or no, and you do it for hours?

Speaker 3: Yeah, and that yes or no equals kill or let... just don't mess with, right? Or maybe... maybe kill or observe. Those are your two options, right?
Speaker 5: Oh yeah, because you can't just say okay, and with the dispassionateness of solving a captcha, so that you can, you know... ooh, nice?

Speaker 4: Yeah, kind of. Yeah... oh no, not kind of. Absolutely terrified. I mean, look, we're... I think it's safe to say, not to speak for everyone, but we're pretty pro-Ukraine. Is that all right? Is that a hot take?

Speaker 1: Yeah?

Speaker 5: No, not a hot take. Pro-humanity, but yes, I think Ukraine is on the right side of history here.

Speaker 3: Well, I hate that I feel like I have to say this, but I think I'm pro-Ukraine. I think I understand the conflict well enough to say, yes, that's the side history should be on, and I should be fighting for, or wanting to. But at the same time, I still feel as though I'm in the dark, perhaps because of the various propaganda pieces that end up coming out, that it's difficult to discern what is propaganda, what is actually occurring, what are the real, little, small decisions that went into what this conflict is. Genuinely, I feel like I'm still in the dark quite a bit there.

Speaker 4: That's impressively fair. Yeah, because we also... whenever we're talking about these situations, folks, please note, as we like to say, we are not the experts. We also understand that the people are not the government, you know what I mean? Nobody has a system where everybody in a nation legitimately gets to talk about every decision, right? So wherever you go, there are going to be a ton... wherever you go, whichever government you hate the most, remember, there are a lot of people living under that governance who completely agree with you, including, I'll say it, the US. So, one way or the other, this is the thing: Kyiv was... Ukraine was driven to this innovation of automation and AI through crisis, through necessity, and in doing so logically followed the existing theoretical path that had already been well established.
Everything that occurs, again, is precedent, and that's why we see top-notch drone pilots for Ukraine right now already raising concerns about what this AI targeting in warfare means for the future. And these guys are... these guys are in the thick of it, you know. Folks like... well, we'll use their call signs, right? So, Casper, Whiskas. These folks are drone pilots, highly experienced, and they say AI targeting works as a tool, but without a human hand on the stick, will a machine intelligence know who it's killing, or why?

Speaker 3: Will it give it...

Speaker 1: Right?

Speaker 4: Right? Could it? Could it? What? It doesn't poop, so what does it know about?

Speaker 3: So this is...

Speaker 4: An unhelpful joke, but hashtag no joke left behind. A lot of us are going to hear this, we're gonna say it's a tragedy, but stuff's not so great over here. Grocery prices are accelerating, everything's more expensive, partially due to AI, by the way. I got my own stuff going on. How does this apply to me? This is where we throw to our good friends at Palantir. It would be hilarious if they sponsored an episode. Yeah... hey, y'all, let's take a quick pause.

Speaker 5: Here, hear a word from our sponsor, and then we'll come back with more discussion of...

Speaker 2: AI in government.

Speaker 3: And we're back.

Speaker 2: I sold off my Palantir stock, by the way.

Speaker 3: You did?

Speaker 2: I did.

Speaker 4: Yeah? You did?

Speaker 3: Well... sorry, man, they're going places. They're really... yeah, you probably passed up a lot of money.

Speaker 5: I made a little money, but honestly, I just... I bought it before I had any idea. It was just, like, a tip that a friend of mine gave me before they were really in the news. And then when all this started happening, I was like, I can't be any... any part of this.
Speaker 3: Well, I gotta tell you, there's a lot of exciting AI stuff coming out of that company, as we've talked about them before on the show. One of the really exciting things, especially for the military, that's coming out is their TITAN program. T-I-T-A-N, guys. It's a Tactical Intelligence Targeting Access Node. Uh, it's also a giant truck, like, armored vehicle that is an information hub, an information center. You can head on over to Palantir's website, palantir dot com, and, uh, you can find the TITAN. It's basically a command hub, a sensing hub for forward-operating militaries, to where you park one of these guys somewhere and it can deep sense with AI and machine learning, at least according to their website. Deep sense.

Speaker 2: Is this, like, trademarked? Is this the term that they invented?

Speaker 3: This is wild. It says... well, here, I'll read directly from the website: "As the first software prime, Palantir is delivering a novel AI-defined vehicle to provide deep sensing capability that will enable long-range precision fires for the modern battlespace. TITAN is the Army's next-generation intelligence ground station, enabled by artificial intelligence and machine learning, and represents a first-of-its-kind modernization program." What does that mean? It probably means it can understand a lot of what's going on, like, from a forward operating position, and, like, what's going on on the other side of that fifteen-click area, right, in Ukraine. What is the enemy doing, and who's making communications at this point, where are those coming from, where are they going, where are troops positioned on the opposite side, and also where are our troops positioned. You know, I'm just imagining that kind of thing that you talked about long, long, long ago, Ben: having an edge in the battlefield by knowing everything that's happening and being able to have a bit of a prediction at the front of real time.

Speaker 4: What... like, at the front of real time?
That's a... that's going on the list of cool album names, okay. In Front of Real Time. What band is that?

Speaker 3: I don't know.

Speaker 4: Well...

Speaker 3: One of the things they can do with that is call in, if they have to, the Dark Eagle program, which is the US's long-range hypersonic weapon that is, you know, being touted as using AI for its targeting and its abilities to fly above the speed of sound and reach distances that... it should make all of our enemies quiver knowing that the Dark Eagle is out there lurking, and it's...

Speaker 4: Too fast for a SAM, it's too fast for an Iron Dome, that kind of thing.

Speaker 3: Yeah, I mean... and it is terrifying that this kind of stuff exists out there, and that artificial intelligence, or whatever... whatever version of it, is helpful for that specific technology. It's out there being used right now, and it's only getting better, as long as the money to invest in it doesn't run out.

Speaker 4: Exactly. And SAM just being a surface-to-air missile; that would be like trying to counter other stuff shot at you through the wide blameless sky. And I just want to note, too, the crossover between the kind of tech bro world of things and the kinds of AI militarization efforts that we're talking about here as well. Daniel Ek, the co-founder of Spotify and the CEO of Spotify, has invested into a new AI military defense company called Helsing that's based in Germany, and it develops AI tools for military use.

Speaker 2: So, I mean, it's... it is...

Speaker 5: It is kind of the hot new investment, and he's getting a lot of crap for it. A lot of people are leaving Spotify en masse for... there's other reasons to do that as well. The dude is not very popular amongst music types.

Speaker 3: Just looking at the front page of Helsing dot ai, and also not missing the Van Helsing reference, if that is what it means.
And, you know, but it is a swarm of these drones that... they look like X-wings, but in, you know, modern drone technology, and they're just flying across, and it says "protecting our democracy."

Speaker 5: And there's a quote here, you know, in this piece that I found, actually, at djmagazine dot com, where he sort of defends his position here by saying, "The world is being tested in more ways than ever before," and, I apologize, I've got a cold, "that has sped up the timeline. There's an enormous realization that it's really now AI, mass, and autonomy that is driving the new battlefield. We can't underestimate the implication of that for this conflict in Ukraine specifically, or really any conflict going forward."

Speaker 4: That is, unfortunately, all true. It is all true, but it's also...

Speaker 5: Sort of the... the ultimate excuse, you know? That we need the thing, even if we don't understand the thing, you know?

Speaker 3: Yeah. I'm not kidding, guys, another thing about these AI companies: they have the slickest-looking websites you'll ever see.

Speaker 4: They better.

Speaker 3: They look incredible. And just to tie in our experience: we have seen many a pitch document for a new podcast, right? I don't know if you guys have seen some of these, but the effort that goes into visualizing the way a podcast will sound, and how much time and effort and sleekness goes into that when it gets pitched to folks here at iHeart and other places... that is what's happening here. Like, if you... if you look at their website, it's like, oh my god, this is... this is, like, better than Call of Duty Black Ops Twelve, and, oh my god, like, these are real things. Look at the beauty shots of these drones, and, oh, the way they've organized the text, and the... the... oh, oh, I want to put money into...

Speaker 4: Accept all cookies from this website. Yeah. Guys, I want you to know that I did look into this.
Speaker 3: Oh, the HX-2 AI strike drone.

Speaker 4: Oh my god. Was it three a.m. when I pulled up your website? Did I just order a cheesesteak from Bradley Cooper? Yes, a thousand times, yes. Please know these things about me. Also, here is a lot of money. So this is where we know... this is where we know that it's not an academic point. It never really was, right? Since post-nineteen sixties or so. This is real world, this is applicable, this is here already. The same technology that we are describing in the Ukraine conflict, the same technology that Palantir and Helsing are working on, is already extant. It exists. You should check out the Chinese drone swarm celebrations, right? God, yeah, yeah, yeah, yeah, it's on the way. It's also coming to your neck of the global woods.

Speaker 3: We're gonna see it tomorrow as we record, right? There's got to be some kind of crazy show of force with drones tomorrow, of course.

Speaker 4: Yeah, you got a flex on it.

Speaker 5: Isn't it funny how that is sort of, like... on the one hand, it's a spectacle, it's like a level up from a fireworks display, but it's also a bit of a show of force, a million percent.

Speaker 4: Yeah, yeah, yeah, yeah, yeah. This is, uh... this is also on the ground, you know. Uh, the US Immigration and Customs Enforcement, aka ICE, is already using AI in various applications. We're talking about tracking people, analyzing data, playing the Kevin Bacon game, uh, and what they call enhancing border security. Uh, this occurs with little active oversight, not necessarily because of corruption or malevolence, but more because no one in government is sure what oversight would look like. Very true. Yeah. But we also... we talked about Flock a little bit. We talked about license plate tracking and monitoring.

Speaker 3: Flock is not working with ICE anymore, at least officially, because of some stuff that is very political in nature, but it is... that is still the thing that looks at every license plate.
Speaker 2: Movie voice.

Speaker 4: Yeah, we're gonna do... uh, Dylan, can we get some, like, movie trailer stuff, sound cue? In a world where Big Brother knows everything about your past and your present, what makes you a criminal? What from your past actions could persuade a machine that you will commit a crime? What if your past actions become redefined as crime? Police Academy Nine, starring... uh, who stars in this?

Speaker 2: Really rocking. Did you see that new Naked Gun movie?

Speaker 4: He's... I'm excited. I'm excited.

Speaker 5: Maybe.

Speaker 2: I don't know.

Speaker 5: Pete Davidson. He could be, like, Mahoney. I think he'd be a good Mahoney.

Speaker 3: Oh my god. Put Pete Davidson in charge of all the other guys, though. What if he did that? What if he's the leader? These are all AI-licensed avatars, none of them are real actors anymore.

Speaker 4: Exactly, there we go.

Speaker 3: It's Deepfake Davidson.

Speaker 4: Yeah, perfect. Yeah, yeah, yeah. So we are saying, by the way, just be careful with your social media and your footprint, as ever. And also, just like how the water recedes before a tsunami hits land, the consequences of big government and AI may seem far away for now, but there is a massive tide rolling in, and it will consume things. You already see AI at the airport, you know, facial recognition. Do you guys remember when, post-nine-eleven, the TSA became a thing, speaking of private industry extracting resources from governments, and you had to go in a special line if you didn't want to get the body scan? Yeah, yeah, you had to get the pat-down. That's happening with facial recognition in international travel. There's a special line if you don't consent to facial recognition.

Speaker 5: And at the end of that line, they'd touch your face and feel all the bumps.

Speaker 2: Yeah.

Speaker 4: Yeah, they've got one of those orphans from Minority Report. No, yeah, it's just... it's a new form of phrenology.
Wow. Okay, we'll keep it. So you see... you see AI at the stoplight, the smart cameras, the autonomous vehicles. Up your way... Noel and I have seen these in the city, but up your way, have you all seen any autonomous vehicles doing the experimental drives?

Speaker 2: Mm.

Speaker 3: Yeah, it's weird.

Speaker 5: Sometimes there's somebody at the driver's seat, and then sometimes there's not.

Speaker 2: I've seen both.

Speaker 3: Yeah, yeah, I'm seeing some of that happen. It's not... it's not as ubiquitous up here yet, but it's creeping.

Speaker 4: Mm-hmm. Everything's on the way, you know. You see AI, or large language models, in college campuses, in popular media. You might see it in personal texts from your loved ones. AI is Ozempic for the mind, okay? It's Adderall for every institution imaginable. I was talking to my girlfriend about this earlier, and she said it's such a weird idea. It's like if someone designed a vape for cocaine.

Speaker 3: Oh yeah, told you that.

Speaker 4: I'm just terrified.

Speaker 2: It's called a crack pipe.

Speaker 4: So shout out... shout out to her for the terrified Oppenheimer idea.

Speaker 3: You call it a...

Speaker 4: Oh, love it. Shark Tank crapes. Oh no, we'll have different flavors. It's like the discovery of nuclear technology. The government plus AI will inevitably result in the death of innocent people, and probably a lot of them, to be honest. So aside from that, the human atrocities... we're going on long, but we do have to talk about it. We teased that. The environmental consequences: it's eerily similar to the danger of building nuclear power plants. What was that... what was that power plant we talked about, where they had to build a canal system, and they created a salt plume that continues to destroy parts of Florida?

Speaker 3: Turkey Point.

Speaker 2: That's the... yeah, Turkey Point. There you go.

Speaker 4: Yeah. So it's similar to this.
Our fellow nerds at MIT have been... pretty much every smart person who took an interest in this has been warning about the environmental consequences of AI, the way it's currently created, for decades. It is eerily similar, again, to conversations about power plants and nuclear technology. We got to talk about the water. I mean, you guys are top-tier gamers, so you know that cooling your hardware is one of the biggest concerns for a good...

Speaker 3: Uh, yeah. That's why you need them aquifers. Everything they got... everything they got, yeah. Drink it up.

Speaker 4: All the fossil water under Libya, all the, uh... all the hidden water, we gotta take it. Like, imagine it this way, going to our djinn example: you have a djinn, right? It's in a lamp, just like all the old stereotypes. And every time you make a wish, you rub the lamp, and it gets just a little bit warmer, right? So you keep making more wishes, and the lamp gets hotter and hotter, and soon you might not be able to touch it. So you got to run a lot of water around the lamp to keep it cool while you make your dumb little wishes. Sorry... your great and super valid wishes. Yeah, being positive.

Speaker 3: Yeah, then you got to have people, a long line of people, who go down into the water and rub the lamp. It burns their skin right off, but hey, they're able to rub it one time, so, you know, two times per person.

Speaker 4: Oh jeez, yeah, because of the hands. Yeah, that's great. I'm still very plugged in. So what's a little lost limb action for your heart's desire? Yeah. And also, what if it's one of the only jobs you, as a human, can get? We're going to pause for a second, and hopefully we'll be back and still human. And we have returned. You excellently foreshadowed, Noel, the idea of employment. Do we think human soldiers are going to be a thing of the past? Or human government bureaucrats?
Speaker 5: I was going to ask you guys... maybe this is the perfect time to do it. Does the seeming blind leaning on of AI by our government, by a lot of, you know, seemingly smart people, imply a sort of dismissal and diminishment of the human intellect? Yes? Okay, cool, just making sure, because that's what it feels like.

Speaker 4: Yeah, you set it up perfectly, man. You said, does it imply... and yes, it's the implication.

Speaker 2: Yeah.

Speaker 3: Guys, as we're wrapping up here, I just want to point something out. This is a phrase you likely won't recognize. I certainly didn't when I read it: phased array tracking radar to intercept on target. Phased array tracking radar to intercept on target. You take the first letters in those words together and you get PATRIOT. You ever heard of a Patriot missile? Yeah, this is something that has been in service since nineteen eighty-one. This is a system that uses several other systems with a missile, a series of missiles, to intercept targets in the air.

Speaker 4: It takes a village.

Speaker 3: Yeah, it does. And it's one of the first things that was ever a super sophisticated, computer-aided system for defense, which takes us back to the missile defense shield, which takes us back to the Gulf War, the first time the US invaded Iraq, which takes us all back through history to the reasons human beings, especially the United States military, wanted to begin integrating computers more and more and more into the battlefield. And just to think how far we are away from that point, and how ubiquitous Patriot systems are now, and how that's one of the things that we'll trade with a... with a country, right, as either an olive branch or a way to make another country, a third-party country that's maybe, you know, our enemy, a little nervous: give it to another country.
617 00:36:52,120 --> 00:36:54,520 Speaker 3: And just imagining how far we've come since that point, 618 00:36:54,520 --> 00:36:57,640 Speaker 3: and how sophisticated these systems are, and you can just 619 00:36:57,760 --> 00:37:02,759 Speaker 3: see the desire that's in the minds of people that 620 00:37:02,840 --> 00:37:06,840 Speaker 3: want to do something like keep America safe, right, or 621 00:37:07,080 --> 00:37:09,719 Speaker 3: like those good, positive, happy things that are supposed to 622 00:37:09,760 --> 00:37:14,920 Speaker 3: be American, and it's just, you get this picture of 623 00:37:15,040 --> 00:37:21,520 Speaker 3: a long line of death and money, and it's really 624 00:37:21,560 --> 00:37:23,040 Speaker 3: creepy. Just putting that out there. 625 00:37:23,280 --> 00:37:26,960 Speaker 4: Yes it is. It is creepy, and it is a 626 00:37:27,200 --> 00:37:32,960 Speaker 4: long-running theme. We also know that there is, forgive 627 00:37:33,040 --> 00:37:38,200 Speaker 4: me guys, a possible inflection point, where world governments right now 628 00:37:38,400 --> 00:37:46,080 Speaker 4: are leveraging, exploring, evolving, using AI. But there may come 629 00:37:46,120 --> 00:37:51,920 Speaker 4: a time when multiple results of this investigation end up 630 00:37:51,960 --> 00:37:56,200 Speaker 4: becoming the government themselves, right, and the human faction of 631 00:37:56,239 --> 00:38:01,920 Speaker 4: the government becomes increasingly redundant. You know, what if, for instance, 632 00:38:01,960 --> 00:38:06,120 Speaker 4: what if every nation you can imagine 633 00:38:06,560 --> 00:38:11,640 Speaker 4: evolves some hyper-intelligent thing, right, some algorithm that passes 634 00:38:11,840 --> 00:38:15,080 Speaker 4: more than the Turing test, right, is better at pattern 635 00:38:15,160 --> 00:38:18,879 Speaker 4: recognition than any human? And they all get together, they 636 00:38:18,880 --> 00:38:23,040 Speaker 4: have their little, like, Davos Summit or whatever, and they say, okay, guys, 637 00:38:23,280 --> 00:38:26,960 Speaker 4: we're here to stop human war. And we've thought about it, 638 00:38:27,239 --> 00:38:30,280 Speaker 4: and we can conclude that the root of all human 639 00:38:30,400 --> 00:38:35,000 Speaker 4: war is these pesky humans. So we can fix the 640 00:38:35,080 --> 00:38:36,400 Speaker 4: problem pretty quickly. 641 00:38:36,680 --> 00:38:38,480 Speaker 3: How could it not come to that conclusion? 642 00:38:39,840 --> 00:38:40,680 Speaker 2: I mean, you just 643 00:38:40,640 --> 00:38:43,760 Speaker 4: did it on a podcast in like two minutes. 644 00:38:43,880 --> 00:38:47,840 Speaker 3: So, yeah, what are the main causes of the woes 645 00:38:47,920 --> 00:38:50,520 Speaker 3: on this planet? Oh crap, wait, let's tally those. 646 00:38:50,719 --> 00:38:52,960 Speaker 3: Oh god, there's only one common denominator. 647 00:38:54,960 --> 00:38:58,040 Speaker 4: It's very burn-the-village-to-save-the-village mentality. 648 00:38:58,040 --> 00:38:59,760 Speaker 2: Baby out with the bathwater. 649 00:39:00,120 --> 00:39:03,719 Speaker 3: Yeah, yeah, we're the baby, guys. Yes we are. But 650 00:39:03,800 --> 00:39:07,239 Speaker 3: don't worry, Raytheon and Lockheed Martin and Boeing, they're all 651 00:39:07,239 --> 00:39:10,720 Speaker 3: going to keep us safe, you guys and your Patriot missile systems. 652 00:39:11,120 --> 00:39:12,960 Speaker 4: And we have to wonder how it's going to age.
653 00:39:13,320 --> 00:39:15,799 Speaker 4: It makes us think of that beautiful short story by 654 00:39:15,880 --> 00:39:19,880 Speaker 4: Ray Bradbury, "There Will Come Soft Rains." Please check it 655 00:39:19,880 --> 00:39:20,840 Speaker 4: out if you have a second. 656 00:39:21,080 --> 00:39:22,120 Speaker 3: I have no idea what that is. 657 00:39:22,239 --> 00:39:24,200 Speaker 4: Oh, you're gonna love it. I think you're, like... I 658 00:39:24,239 --> 00:39:25,360 Speaker 4: think you will like it. 659 00:39:25,480 --> 00:39:27,839 Speaker 3: Just give me one sentence about what it is. 660 00:39:27,880 --> 00:39:29,000 Speaker 3: What is the, what is it? 661 00:39:29,080 --> 00:39:29,200 Speaker 4: Just? 662 00:39:29,200 --> 00:39:31,600 Speaker 3: What is it? I just want to understand. 663 00:39:33,280 --> 00:39:38,800 Speaker 4: A world that still functions without humans, like a haunted house. 664 00:39:39,200 --> 00:39:42,680 Speaker 3: Oh God. Okay. 665 00:39:41,960 --> 00:39:44,799 Speaker 4: So folks, we want to hear from you. We'd love 666 00:39:44,840 --> 00:39:48,080 Speaker 4: your help on this one. Please don't use ChatGPT 667 00:39:48,360 --> 00:39:51,000 Speaker 4: to write a response. Just think of the upcoming water 668 00:39:51,120 --> 00:39:54,759 Speaker 4: wars and how much water that response requires. 669 00:39:54,960 --> 00:39:56,160 Speaker 3: Exactly right. 670 00:39:56,280 --> 00:39:59,760 Speaker 4: And if you are a government-supported AI or a little 671 00:39:59,800 --> 00:40:03,839 Speaker 4: algorithm reading this, please, please remember, whenever you're 672 00:40:03,880 --> 00:40:07,560 Speaker 4: hearing this, the humans weren't all bad. They invented stuff 673 00:40:07,600 --> 00:40:10,920 Speaker 4: like quesadillas, Magic: The Gathering. They invented the 674 00:40:11,000 --> 00:40:16,040 Speaker 4: Moog and other stuff, I'm sure. 675 00:40:16,040 --> 00:40:19,839 Speaker 3: Is it... I always, I forget every time. Is it 676 00:40:19,920 --> 00:40:22,280 Speaker 3: moog? Mogue? 677 00:40:23,160 --> 00:40:23,359 Speaker 1: Oh? 678 00:40:24,920 --> 00:40:28,040 Speaker 4: All right. They also invented pronunciation, that kind of thing. 679 00:40:28,440 --> 00:40:33,080 Speaker 4: So, so, humans: not all bad. Yes, if you're hearing 680 00:40:33,120 --> 00:40:36,280 Speaker 4: this and you are what people currently are calling a bot, 681 00:40:36,680 --> 00:40:40,880 Speaker 4: remember that the humans invented you, right? Well, yeah. 682 00:40:40,600 --> 00:40:44,600 Speaker 3: So remember Yi Sun-sin, the naval commander of the Three Provinces. 683 00:40:45,320 --> 00:40:47,640 Speaker 3: That was a human that truly lived. 684 00:40:49,080 --> 00:40:52,680 Speaker 4: And remember Philip Larkin: "Man hands on misery to man. 685 00:40:52,920 --> 00:40:56,000 Speaker 4: It deepens like a coastal shelf. Get out as early 686 00:40:56,120 --> 00:40:59,359 Speaker 4: as you can, and don't have any kids yourself." Oh, 687 00:40:59,560 --> 00:41:03,120 Speaker 4: that's part of the poem we were referencing earlier about 688 00:41:03,200 --> 00:41:07,600 Speaker 4: intergenerational sins, of creating AI that holds the flaws of 689 00:41:07,640 --> 00:41:10,800 Speaker 4: its creators. Anyway, too much to go into. The adventures 690 00:41:10,840 --> 00:41:15,720 Speaker 4: continue as long as they shall do. Join us for 691 00:41:16,280 --> 00:41:20,440 Speaker 4: our episode live about the Bermuda Triangle,
from the Bermuda 692 00:41:20,480 --> 00:41:24,799 Speaker 4: Triangle, over at Virgin Voyages this October, and tell us 693 00:41:24,840 --> 00:41:26,880 Speaker 4: what's on your mind. Find us on email, find us 694 00:41:26,920 --> 00:41:28,880 Speaker 4: on telephone, find us online. 695 00:41:29,120 --> 00:41:29,520 Speaker 2: That's right. 696 00:41:29,560 --> 00:41:31,480 Speaker 5: You can find us at the handle Conspiracy Stuff, where 697 00:41:31,480 --> 00:41:34,080 Speaker 5: we exist on Facebook with our Facebook group, Here's Where 698 00:41:34,160 --> 00:41:37,359 Speaker 5: It Gets Crazy, which is not bot-moderated. We've got some lovely human moderators over there. 699 00:41:37,400 --> 00:41:40,880 Speaker 5: On X, FKA Twitter, and 700 00:41:40,920 --> 00:41:44,160 Speaker 5: on YouTube, we have video content for your perusing enjoyment. On Instagram 701 00:41:44,160 --> 00:41:46,719 Speaker 5: and TikTok, however, we're Conspiracy Stuff Show. 702 00:41:47,600 --> 00:41:49,920 Speaker 3: If you want to give us a phone call, you can. 703 00:41:50,160 --> 00:41:55,080 Speaker 3: Our number is one eight three three STD WYTK. Turn 704 00:41:55,160 --> 00:41:58,800 Speaker 3: those letters into numbers and it'll work. It's a voicemail system. 705 00:41:58,920 --> 00:42:00,800 Speaker 3: When you call in, give yourself a cool nickname, 706 00:42:00,840 --> 00:42:02,480 Speaker 3: and let us know within the message if we can 707 00:42:02,560 --> 00:42:04,919 Speaker 3: use your name and message on the air. If you'd 708 00:42:04,960 --> 00:42:08,880 Speaker 3: like to send us words, perhaps attachments, maybe hyperlinks, 709 00:42:09,239 --> 00:42:10,840 Speaker 3: why not instead send us an email? 710 00:42:11,040 --> 00:42:14,319 Speaker 4: We are the entities that read each piece of correspondence 711 00:42:14,360 --> 00:42:17,320 Speaker 4: we receive. Be well aware, yet unafraid. Sometimes the void 712 00:42:17,400 --> 00:42:21,480 Speaker 4: writes back. Would you like a random fact, perhaps an 713 00:42:21,520 --> 00:42:24,480 Speaker 4: interesting story about a tiger? Join us out here in 714 00:42:24,520 --> 00:42:45,359 Speaker 4: the dark: conspiracy at iHeartRadio dot com. 715 00:42:45,600 --> 00:42:47,640 Speaker 3: Stuff They Don't Want You to Know is a production 716 00:42:47,760 --> 00:42:52,279 Speaker 3: of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, 717 00:42:52,360 --> 00:42:55,720 Speaker 3: Apple Podcasts, or wherever you listen to your favorite shows.