Speaker 1: Root for us.

Speaker 2: If it doesn't work, you're just not using enough. You're listening to SOFREP Radio: special operations, military news, and straight talk with the guys in the community.

Speaker 3: You guys have probably seen the ads already for the BearSkin hoodie, right? Well, if you've been waiting to get yourself one, now is the time. It's officially hoodie season: cold mornings, windy days, surprise snow. And right now BearSkin is running their biggest deal of the year. We're talking sixty percent off for Black Friday and Cyber Monday. Now, this isn't some old fleece. This thing's built with three hundred forty GSM BearSkin fleece, ten pockets, and a rugged athletic fit that actually looks good on you. Plus, if it starts pouring, you can zip up the heavy storm rain jacket and instantly level up to full waterproof protection. You'll get free US shipping from their Kentucky warehouse and gear that'll last you season after season. So here's what you need to do to get sixty percent off: text RAD to three six nine twelve.
Speaker 3: Again, that's R-A-D to three six nine twelve, and they'll text you a link so you can lock in your sixty percent off before the sale disappears. Don't wait till the weather gets really bad. Grab a BearSkin now while this Black Friday deal is live. And one more thing: when you support BearSkin, you are also supporting The Fallen Outdoors and the Hope For The Warriors veterans program. So you're not just buying great gear, you're backing a cause that matters. Grab your BearSkin while this Black Friday deal is still live.

Speaker 3: It is Rad, and I'm totally not surprised to be right here talking to you. But I'm acting like it. So, what's up? This is SOFREP Radio, and welcome to your first episode or video if this is the first time you're tuning in. If you've listened before and you're a veteran of SOFREP Radio, welcome back. You know who I am, you know how this is about to go, and you probably know who my guest is already. But first, before I introduce you to my guest, say it with me: the merch store, SOFREP dot com forward slash merch. Go check it out.
Speaker 3: We've got lots of cool items. The holidays are upon us. You might want to get a little something for yourself. Spoil yourself, go ahead and spoil me, spoil someone else around you, get them a little something. And then second is our book club, and that's SOFREP dot com forward slash book hyphen club. It's a gem for your brain, reading a book. Knowledge is power. You hear it all the time; we can't stress it enough around here. We have so many folks that come on the show that write books, that read books. It's just so powerful to hear their message. So please go read a book and check out SOFREP dot com forward slash book hyphen club, where it's curated for viewers like you by people like Brandon Webb and others in the special operations community. Okay, so the book that I would like to hopefully try to get into the book club, if my guest will work with us, is by Anthony Vinci. Let me just read this to you correctly. He is the author of the book The Fourth Intelligence Revolution: The Future of Espionage and the Battle to Save America.
Speaker 3: He served as the first Chief Technology Officer and Associate Director for Capabilities at the National Geospatial-Intelligence Agency, also known as NGA. Earlier in his career, he served as an intelligence officer in Iraq, Africa, and Asia. After leaving the world of intelligence, Anthony became an executive at a private equity firm. He is now the CEO of Vico, an AI company. What does AI stand for again? Automated interface? No, it's actually artificial intelligence.

Speaker 4: Thank you.

Speaker 3: Yeah, I forget that, dude. I forget I'm living in AI days. He has a company that's democratizing intelligence analysis and super-empowering analysts. Anthony is also an Adjunct Senior Fellow at the Center for a New American Security. He received his PhD in International Relations from the London School of Economics. I want to welcome you to the show and not butcher your bio. It's nice to have you here.

Speaker 5: Thanks for having me, man. Super excited.

Speaker 3: So, intelligence. Was that Army, or what military branch were you in with this intelligence work?
Speaker 5: Well, I was a civilian intelligence officer. I was with DIA, and I was a case officer. So that's somebody who goes out in the field, recruits spies, and gets them to provide information.

Speaker 3: So did you, like, go from "I want to be on the baseball team" to a spy, to the civilian spy sector?

Speaker 1: You know, I, like a lot of other guys, was of a generation where nine eleven happened when I was young. And as soon as that happened, I was like, I want to get in the game. My dad was a Marine, my brother is a Marine. I just sort of assumed that I would be serving. And I realized that, you know, the intelligence world was the right one for me. I'd already lived abroad; I knew what it was like. This is how to get in that game, and this was important for the country at that time.

Speaker 3: That's amazing. So did you just, like, google how to apply to this DIA? Is that what you said? Or CIA?
Speaker 4: What was the acronym you threw out there?

Speaker 5: DIA, Defense Intelligence.

Speaker 4: Yeah, okay, okay, yeah.

Speaker 5: That's literally the way you do it. I mean, that's what most people do. You just sort of google, and you're like, how do I go do this job? I talked to some friends as well who kind of knew a little bit about that world, and they sort of pointed me in different directions and were like, here's the place. And you literally put your application in online.

Speaker 3: How old were you?

Speaker 5: Oh man, I was in my mid-twenties. I think I was probably twenty-seven when I applied.

Speaker 3: Wow. Yeah, that's crazy. That's incredible. I mean, hey, I believe it. You speak other languages as well?

Speaker 1: At the time, I spoke a little bit of Arabic. It's completely gone.

Speaker 4: Oh, I see, I see, I see. That's what they all say.

Speaker 3: That's what they all say. Now, inshallah, yeah, yeah, inshallah, yeah. Awesome. You see, it's okay. Yeah, yes, I love that.
Speaker 3: And so, you know, shukran. See, here, it's good, it's good. I just have questions, you know what I mean, for you, dude. And my listeners are like, "Rad, ask him questions, bro." Crack the egg, you know? How do we crack the egg and scramble this? So, as a DIA operative...

Speaker 5: Yeah, intelligence officer.

Speaker 3: What are you looking for? I mean, are you literally, like... Okay, I'm thinking of The Saint with Val Kilmer, where he was like, "I am Saint so-and-so," with different loadouts, different faces. Mission Impossible, you know. Is that something close to how I should be thinking about you?

Speaker 5: I like to think about it like this. You want to know what's going on in, you know, southern Afghanistan or Iraq or China. And, you know, we're Americans. We grew up here. We probably don't know that many people there. How are you going to get that information? Right? And the way to get that is you want to get a local guy. You want to get somebody who's got that placement and access.
Speaker 5: He lives there, he knows people. Maybe, you know, back during the kind of global-war-on-terrorism days, maybe this guy had friends who were involved with terrorist groups, someone like that. Or maybe he works at a government agency that you want to get information from. You've got to go get one of those guys. You're not going to walk in yourself, right? And so what you do is you learn how to find those people and recruit them to come work for you. You're essentially convincing them to commit treason, you know, on their own, right?

Speaker 3: Yeah, it's like you've got to get human intelligence. So HUMINT is what you're going after there: human intelligence from somebody. And there's OSINT, open-source intelligence, where you can find all the information about somebody online, and they're like, "How did you know all this about me?" It's like, well, you wrote a daily journal every day. I knew you were going to be on vacation for eight days because you posted every single day.
Speaker 1: Some people say about HUMINT that it's the "int," or the intelligence, of last resort. So first, maybe, yeah, you try to look it up online with some open-source intelligence. Maybe you use signals intelligence, you know, or geospatial intelligence, where you use imagery, maybe from satellites. But if you really can't find it any other way, if the only way to get in that room or to get that piece of information is to get another human being to do it, then that's where somebody like me came in.

Speaker 3: Okay, now let's flash backwards to World War Two. Okay, you have people inside of our own, you know, allies who sympathize with what Hitler is doing in World War Two, and they're like, "Oh, it's no big deal, you know, it's not affecting us." And you've got people inside trying to, like, you know, thwart any type of, like, you know, problem against him. Now, you probably have an understanding of World War Two stuff. I mean, you know, there's spy versus spy.
Speaker 3: You know, they're, like, trying to get in and, you know, thwart things. Do you have any conversation about that?

Speaker 5: I mean, look, this is the nature of intelligence, right? We have guys on our side who want to collect information on the other guys. They don't want us to collect that information, so they have counterintelligence. They're trying to find, like I was saying, if I recruit a spy to go spy on your government agency, they're trying to find that guy. So it's just back and forth. It's that spy versus spy, like you said. And that's been going on; that goes back way further than World War Two. That's been going on since the dawn of humanity. Some people, you know, some people call intelligence the second-oldest profession. What's special about World War Two for America, though, is that I think of it as the Big Bang for American intelligence, because we didn't really have centralized intelligence before that. We kind of had military intelligence.
Speaker 1: During a war, we would spin up, you know, a reconnaissance and intelligence function, and we would turn it off during peacetime. We didn't like to have intelligence. And in World War Two we created something called the OSS, which was really our first spy agency.

Speaker 3: Right. And again, that was in conjunction with, like, kind of the British, right? You know, the whole thing. And the guy out of Australia, I think, who created the OSS, who was like a dual spy, and it was like, what side is he really on? Well, he was buried with American honors, so there's kind of that, you know. I can't remember the spy's name; forgive me for forgetting. I've actually done a book interview about him. But yeah, you're right, the OSS. Okay. And now that led to what we know as the CIA.

Speaker 5: Yeah. I mean, look, we had such a distaste for intelligence in America. There was this guy, Henry Stimson. He was the Secretary of State and the Secretary of War. He sort of notoriously said, "Gentlemen do not read each other's mail."
Speaker 1: That was how we felt. We felt like it was distasteful, it was ungentlemanly, to spy, in a way. And actually, after World War Two, Truman shut down the OSS, and he said he didn't want there to be an American Gestapo. So for a couple of years there, at the beginning of the Cold War, we didn't have any spy agency. You know, we kind of had some stuff going on in the Army and the State Department, and that was it. And then in nineteen forty-seven they set up the CIA, because we realized, you know, look, we're going up against a big boy now. We were in a Cold War with the Soviet Union, and they had the KGB, and those guys don't mess around. We needed our own: our own permanent, peacetime, professional agency.

Speaker 3: And that was the CIA, right? They just took everything under their umbrella that was not military related per se, you know.
Speaker 3: We've all seen Clear and Present Danger, from Tom Clancy, where, you know, Willem Dafoe and his men are just, like, cut off. It's like, just let them get out of it by themselves, leave them there. They send these guys in to do a mission, and then they just cancel the mission. Rambo's the same thing, Rambo: First Blood Part II. He goes in to get the POWs out. It's all, like, under the CIA guys and all this discreet, you know, clandestine stuff, and then all of a sudden at the very end it's like, yeah, just don't pick him up. He gets there and gets the mission done. He finds the POWs. They didn't want him to find that. They didn't want to find the intelligence.

Speaker 1: Yeah. The OSS did both intelligence collection and special operations, right? And the OSS was in many ways kind of the precursor to the Green Berets, and Aaron Bank was a member. And then the SEALs: the OSS had the first maritime unit well before the Navy. They were willing to, you know, use scuba gear before anybody else.
Speaker 1: And the CIA kind of continued that, because they had both intelligence collection, this is by law, and covert action. And in covert action, you're doing those things that other agencies can't do. You're doing things that are deniable by the government. Some of them can be direct action. Some of them could be, you know, influencing other people, say politically. Some of them could be supporting, you know, opposition groups in other countries. There are lots of types of covert action. They had that piece as well as information collection. And really, intelligence is about information collection and analysis. That's the vast majority of what's done. You know, the president needs to make a decision; somebody's got to give him information to make that decision.

Speaker 3: Right. You would think that they would want to have this briefing, a daily briefing, given to them. Hey, here's what's going on on the net, you know. You might want to think about it. They're doing this over in this country, or they're trying to do this to our country.
Speaker 3: Do you have any concern about that? Or is it just a gentlemen's game, sir? Yeah, you know, let's not look at their mail. It's like, what, we got their mail? Hmm, hold the light.

Speaker 5: I mean, now we seem to be perfectly comfortable.

Speaker 3: Completely, yeah. I mean, you know, I guess covert operations would be, like, you know, in Afghanistan, when Mazar-i-Sharif was under, you know, suppression. I think the first casualty was Mike Spann, who was CIA, you know, KIA, killed in action, during that Mazar-i-Sharif fight. That would be the same. He's on the ground, you know, under that guise, and we all know about him because he was KIA'd, unfortunately. And, you know, I mean, you guys are putting yourselves in the line of fire. You know, you're doing things to make sure that guys like myself can have this podcast, you know, under the guise of the American flag, freedom, right? So we can do the things that we want to continue to do, and our neighbors too. And I don't think anybody's really ready.
Speaker 3: I asked this question around my circle here in Salt Lake, you know, to people around me. I'm like, are you ready for a hypersonic missile to come blow up the car wash while you're standing there? Are we ready? Are you ready for that? Because, like, what's going on in, like, Ukraine, and all of a sudden a missile comes in and just blows up the park. You know, people are just having a normal day, and then the sounds, and the tinnitus in the ears, and then the ringing, and then all of that when the smoke settles, and the calamity. It's like, are we ready for that here? I don't think so. You know, we're just walking around kind of like, oh, we believe in guys like yourself, or the Navy SEALs that run the sites, or the Green Berets doing their jobs in the closeness of night.

Speaker 5: Yeah.
Speaker 1: Yeah. Look, there are guys downrange in the military and intelligence who kind of take the ultimate risks, right? And for a lot of intelligence guys, it might just be you. It might be one guy, right? You're just out there, and maybe there's no backup, right? And if something goes wrong, you only have, you know, your wits and your planning to kind of get out of that. And that can be really scary, and it takes a lot of courage, it's true. It takes a lot of bravery. The military obviously has the same, and there are a lot of guys in the intelligence community who served in the military before they came in.

Speaker 4: You know, you say these guys are downrange. Don't you think we're all kind of downrange today, with the internet and, you know... I mean, a little bit of this. I think that there's a war going on also, right? Versus, like, the missiles and the guns and the trenches.
There's these keyboard wars that are going 350 00:17:50,760 --> 00:17:53,679 Speaker 3: on right now that maybe people aren't paying attention to, 351 00:17:53,680 --> 00:17:56,960 Speaker 3: where they're collecting data, or there's bad actors in the 352 00:17:56,960 --> 00:17:59,639 Speaker 3: world that are trying to gather information based off of 353 00:17:59,640 --> 00:18:00,680 Speaker 3: your posts. 354 00:18:02,080 --> 00:18:04,679 Speaker 1: You hit the nail on the head. There used to 355 00:18:04,720 --> 00:18:07,000 Speaker 1: be, they called it in the Cold War, the Fulda 356 00:18:07,000 --> 00:18:09,560 Speaker 1: Gap. You know, this idea that the Soviet Union would 357 00:18:09,600 --> 00:18:12,000 Speaker 1: invade Europe through the Fulda Gap; it's kind of a 358 00:18:12,080 --> 00:18:13,520 Speaker 1: region in Central Europe. 359 00:18:14,080 --> 00:18:16,000 Speaker 5: Now that Fulda Gap's on our cell phone. 360 00:18:16,480 --> 00:18:20,120 Speaker 1: Right, we're all the edge, we are all the perimeter, 361 00:18:21,000 --> 00:18:22,640 Speaker 1: and they are 362 00:18:22,480 --> 00:18:23,720 Speaker 5: trying to collect information. 363 00:18:24,040 --> 00:18:28,240 Speaker 1: And when I say they, I mean China, Russia, Iran, 364 00:18:29,000 --> 00:18:32,840 Speaker 1: North Korea, places like that. And they are trying 365 00:18:32,840 --> 00:18:35,240 Speaker 1: to collect information from people, but they're also trying to 366 00:18:35,320 --> 00:18:39,560 Speaker 1: change their minds. You know, they've realized that the American 367 00:18:39,600 --> 00:18:43,320 Speaker 1: citizen is the soft underbelly of this nation. 368 00:18:44,560 --> 00:18:44,760 Speaker 4: Right. 369 00:18:44,920 --> 00:18:49,439 Speaker 1: We are the ones who vote, we're the ones who 370 00:18:48,960 --> 00:18:51,840 Speaker 5: run for Congress. We're the ones who work in the military.
371 00:18:51,920 --> 00:18:53,960 Speaker 5: We are the ones who work at a company 372 00:18:53,960 --> 00:18:57,720 Speaker 1: that maybe builds technology like AI for the military. 373 00:18:58,160 --> 00:18:59,480 Speaker 5: We are the soft underbelly. 374 00:18:59,480 --> 00:19:03,359 Speaker 1: And so for decades we kind of got used to 375 00:19:03,720 --> 00:19:07,200 Speaker 1: this idea that it's like presidents and generals and colonels 376 00:19:07,280 --> 00:19:10,600 Speaker 1: and people like that were the targets, and now we're 377 00:19:10,640 --> 00:19:13,080 Speaker 1: all the targets. And that 378 00:19:13,080 --> 00:19:18,240 Speaker 5: is a massive change, and really the core of 379 00:19:19,200 --> 00:19:21,840 Speaker 1: what my book's about and what I wanted to write about, 380 00:19:21,880 --> 00:19:26,159 Speaker 1: because I don't think people realize what's really going on. 381 00:19:26,359 --> 00:19:28,159 Speaker 5: You see stuff in the news. 382 00:19:28,240 --> 00:19:31,800 Speaker 1: You hear, oh, TikTok's a danger, or, oh, there's 383 00:19:31,840 --> 00:19:34,840 Speaker 1: another hack of a healthcare system, and it's just these blips, 384 00:19:35,560 --> 00:19:38,000 Speaker 1: but they all add up to something, which is we're 385 00:19:38,000 --> 00:19:41,240 Speaker 1: all being targeted, and now we need to all protect 386 00:19:41,240 --> 00:19:44,120 Speaker 1: ourselves because we're all being targeted. 387 00:19:45,200 --> 00:19:47,440 Speaker 3: I mean, what about these companies that, you know, I've 388 00:19:47,560 --> 00:19:49,960 Speaker 3: done it, where I've spit in the tube 389 00:19:49,960 --> 00:19:51,879 Speaker 3: and I've sent it off to see how the family 390 00:19:51,920 --> 00:19:54,560 Speaker 3: all relates to each other's DNA, you know, and then 391 00:19:54,600 --> 00:19:57,320 Speaker 3: you hear that, you know, all that information
392 00:19:57,000 --> 00:20:00,000 Speaker 4: could be just sold to somebody else as a business. 393 00:20:00,760 --> 00:20:03,080 Speaker 4: Is it that bad, you know what? Yeah. 394 00:20:03,640 --> 00:20:04,680 Speaker 5: It's much worse than that. 395 00:20:05,240 --> 00:20:09,199 Speaker 1: So you spit in that tube, and first, it can be stolen; 396 00:20:10,160 --> 00:20:12,560 Speaker 1: maybe somebody can hack in and take it from the company. 397 00:20:13,000 --> 00:20:16,240 Speaker 1: But what's even worse is that most likely that company, 398 00:20:16,280 --> 00:20:20,159 Speaker 1: even if they were American, probably sent your spit to 399 00:20:20,320 --> 00:20:24,760 Speaker 1: China to have it tested there, because, 400 00:20:25,000 --> 00:20:28,119 Speaker 1: like everything else, it's just cheaper to 401 00:20:28,200 --> 00:20:31,800 Speaker 1: do things in China. On purpose, for some of these 402 00:20:31,800 --> 00:20:35,880 Speaker 1: things, they're subsidizing businesses like this, and so they 403 00:20:36,240 --> 00:20:40,320 Speaker 1: most likely do actually have a lot of people's DNA 404 00:20:40,800 --> 00:20:45,480 Speaker 1: sitting over there. And there are companies like BGI Genomics, 405 00:20:45,560 --> 00:20:50,800 Speaker 1: a big Chinese pharmaceutical and biotechnology company. And not 406 00:20,880 --> 00:20:53,679 Speaker 1: only are they collecting DNA from people all over the 407 00:20:53,680 --> 00:20:59,400 Speaker 1: world, there is evidence that they 408 00:20:59,800 --> 00:21:04,439 Speaker 1: are in contracts with the People's Liberation Army, with 409 00:21:04,480 --> 00:21:06,280 Speaker 1: the PLA. So 410 00:21:06,320 --> 00:21:07,800 Speaker 5: that information,
411 00:21:09,000 --> 00:21:14,919 Speaker 1: that DNA, your genetics, is possibly going directly to the 412 00:21:14,960 --> 00:21:18,360 Speaker 1: People's Liberation Army, and who knows who else, most likely 413 00:21:18,440 --> 00:21:22,879 Speaker 1: their intelligence community as well, because in China, their intelligence 414 00:21:22,880 --> 00:21:25,880 Speaker 1: community can ask for any information they want from any 415 00:21:25,960 --> 00:21:29,480 Speaker 1: Chinese citizen or any Chinese company, any time they want, 416 00:21:29,480 --> 00:21:31,520 Speaker 1: and there's no recourse. They can just go in and 417 00:21:31,560 --> 00:21:35,200 Speaker 1: take it and say, yep, that's mine. 418 00:21:36,160 --> 00:21:38,480 Speaker 3: Interesting, you know what I mean? Because, like, I guess 419 00:21:38,520 --> 00:21:40,280 Speaker 3: you could start thinking about, like, if they can just... 420 00:21:40,400 --> 00:21:42,960 Speaker 3: I mean, we all sign up to Instagram, 421 00:21:43,000 --> 00:21:45,119 Speaker 3: or we all sign up to the local website and 422 00:21:45,320 --> 00:21:48,560 Speaker 3: upload our credit card information, and, you know, it's just like, oh, hey, 423 00:21:48,600 --> 00:21:51,399 Speaker 3: buy now with three taps of your thumb on, you know, 424 00:21:52,400 --> 00:21:54,960 Speaker 3: on... I don't even want to say their website, 425 00:21:55,000 --> 00:21:56,920 Speaker 3: you know, because I don't want to give them any, 426 00:21:57,080 --> 00:21:59,360 Speaker 3: like, respect about that. But, you know, they're not running 427 00:21:59,359 --> 00:22:04,160 Speaker 3: a proper humanitarian, you know, workforce. They're using underage 428 00:22:04,240 --> 00:22:07,720 Speaker 3: kids to make your clothing, and it's fast fashion.
But 429 00:22:07,800 --> 00:22:10,719 Speaker 3: yet you sign right up and you just say, oh, 430 00:22:10,760 --> 00:22:13,720 Speaker 3: it's only six bucks versus thirteen dollars locally, let me 431 00:22:13,760 --> 00:22:15,160 Speaker 3: just buy it for six bucks and have it sent 432 00:22:15,200 --> 00:22:15,399 Speaker 3: to me. 433 00:22:15,680 --> 00:22:17,560 Speaker 4: But they have all that information from you. 434 00:22:17,560 --> 00:22:19,760 Speaker 3: You could have shopped locally and supported the local neighbor 435 00:22:19,800 --> 00:22:22,600 Speaker 3: that sells the same thing. But, you know, they do 436 00:22:22,680 --> 00:22:25,760 Speaker 3: sell it so much cheaper because they just want to 437 00:22:25,840 --> 00:22:31,600 Speaker 3: gather the info, which we just give, thinking it's just 438 00:22:31,640 --> 00:22:32,240 Speaker 3: a normal thing. 439 00:22:33,320 --> 00:22:35,639 Speaker 1: We just give it away, and we don't think about 440 00:22:35,640 --> 00:22:40,879 Speaker 1: it as a security concern. Most of 441 00:22:40,960 --> 00:22:46,240 Speaker 1: us don't think like intelligence officers, and for decades, or 442 00:22:46,280 --> 00:22:49,119 Speaker 1: centuries, of American life, we didn't have to, and 443 00:22:49,160 --> 00:22:53,959 Speaker 1: now we kind of do. Weirdly, we have all learned 444 00:22:54,000 --> 00:22:57,480 Speaker 1: to protect our computers, right? We have. 445 00:22:57,800 --> 00:22:59,200 Speaker 5: You know, we teach five-year- 446 00:22:59,040 --> 00:23:02,000 Speaker 1: olds cybersecurity, and we say, you know, you have to 447 00:23:02,080 --> 00:23:04,679 Speaker 1: use a password, and, you know, 448 00:23:04,720 --> 00:23:06,840 Speaker 1: if you see a weird email, don't open it. 449 00:23:06,840 --> 00:23:08,200 Speaker 5: It might be a phishing thing.
450 00:23:08,880 --> 00:23:12,639 Speaker 1: So we've learned to protect our computers, but we've never 451 00:23:13,080 --> 00:23:17,000 Speaker 1: learned to protect ourselves, to protect our own minds. We 452 00:23:17,040 --> 00:23:20,320 Speaker 1: don't even think about that as a security issue, but 453 00:23:20,400 --> 00:23:25,120 Speaker 1: it is, right? We need to start thinking about 454 00:23:25,160 --> 00:23:29,720 Speaker 1: it that way and training people, even training kids, to 455 00:23:29,840 --> 00:23:33,280 Speaker 1: think about, hey, when somebody's collecting your information, when you're 456 00:23:33,280 --> 00:23:37,040 Speaker 1: giving away information, it might not just be, you know, 457 00:23:37,160 --> 00:23:40,560 Speaker 1: to sign up for an email list you want to 458 00:23:40,560 --> 00:23:42,920 Speaker 1: be on. It might be going somewhere bad, 459 00:23:43,280 --> 00:23:46,159 Speaker 1: and those people might not have your best interests at heart. 460 00:23:47,160 --> 00:23:49,520 Speaker 1: And it's even worse than them just trying to maybe 461 00:23:49,600 --> 00:23:52,600 Speaker 1: steal some money from you. They might be doing this 462 00:23:52,760 --> 00:23:56,200 Speaker 1: to harm your community or harm your country.
463 00:23:57,000 --> 00:23:59,440 Speaker 3: I mean, they might try to figure out what makes 464 00:23:59,520 --> 00:24:02,320 Speaker 3: us tick in a bio way, to use, you know, 465 00:24:02,560 --> 00:24:06,520 Speaker 3: bio on us that we could not, you know, 466 00:24:06,560 --> 00:24:09,840 Speaker 3: vaccinate against, or, you know, thwart or stop with any type 467 00:24:09,880 --> 00:24:12,679 Speaker 3: of, like, over-the-counter medication. It's just like, 468 00:24:12,800 --> 00:24:14,879 Speaker 3: what do you do, you know? It's like, here comes 469 00:24:14,880 --> 00:24:18,560 Speaker 3: a super flu. And I say that because there's 470 00:24:18,560 --> 00:24:20,959 Speaker 3: a movie called The Sword in the Stone, 471 00:24:21,000 --> 00:24:24,560 Speaker 3: and it's a Disney cartoon, and it has Merlin and 472 00:24:25,000 --> 00:24:27,600 Speaker 3: Arthur, and he pulls the sword. But during that, he 473 00:24:27,680 --> 00:24:34,200 Speaker 3: battles Mim. Okay, Merlin battles another wizard, Madam Mim, and they 474 00:24:34,240 --> 00:24:37,000 Speaker 3: battle. They turn into fiery dragons, and then 475 00:24:37,040 --> 00:24:39,280 Speaker 3: they turn into a mouse and an elephant, and then, 476 00:24:39,720 --> 00:24:44,119 Speaker 3: at the end, he turns into a biological cold, and 477 00:24:44,200 --> 00:24:46,240 Speaker 3: she's in bed with a thermometer, and 478 00:24:46,080 --> 00:24:49,840 Speaker 4: he's like, I've turned into the cold, bro. You 479 00:24:49,880 --> 00:24:52,240 Speaker 4: can't get rid of that, you know what I mean? 480 00:24:52,320 --> 00:24:52,680 Speaker 4: And so, 481 00:24:54,680 --> 00:24:57,959 Speaker 3: to kind of bring it full circle, you know, biological 482 00:24:58,280 --> 00:25:04,480 Speaker 3: warfare, obviously minus the wizards, okay, it could be out there, 483 00:25:04,680 --> 00:25:08,960 Speaker 3: you know. It has to be something on the intelligence circuit, right?
484 00:25:09,960 --> 00:25:11,840 Speaker 5: Well, if you want to get dark for a 485 00:25:11,840 --> 00:25:12,800 Speaker 4: second. Yeah, I do. 486 00:25:14,119 --> 00:25:18,000 Speaker 1: During apartheid in South Africa, the South African government had 487 00:25:18,040 --> 00:25:24,080 Speaker 1: a program of biological warfare targeted specifically at black people, that 488 00:25:24,000 --> 00:25:26,360 Speaker 5: would affect black people and not white people. So they 489 00:25:26,440 --> 00:25:29,720 Speaker 5: went there. Now, this was decades ago. 490 00:25:29,880 --> 00:25:33,800 Speaker 1: They didn't have the capability to genetically engineer a disease, 491 00:25:34,640 --> 00:25:39,399 Speaker 1: but they kind of showed that it's possible for 492 00:25:39,440 --> 00:25:43,520 Speaker 1: a country to do this. And it is extremely scary 493 00:25:43,600 --> 00:25:48,120 Speaker 1: to think of today, where people have seen how bad 494 00:25:48,280 --> 00:25:53,000 Speaker 1: a pandemic can get economically, how many people can die. 495 00:25:53,119 --> 00:25:57,879 Speaker 1: And we do have the engineering capability to engineer diseases, 496 00:25:58,600 --> 00:26:01,119 Speaker 1: and we are talking about a country like China that 497 00:26:01,280 --> 00:26:05,760 Speaker 1: is sitting on the genetic information of millions of people, 498 00:26:07,040 --> 00:26:11,040 Speaker 1: and it is conceivable that that would happen. That isn't 499 00:26:11,160 --> 00:26:15,960 Speaker 1: science fiction anymore. That's within the world of the possible. 500 00:26:16,640 --> 00:26:16,720 Speaker 4: Now. 501 00:26:16,760 --> 00:26:19,719 Speaker 5: I don't know if that's happening, and hopefully it's not.
502 00:26:20,560 --> 00:26:24,280 Speaker 1: But this is where, you know, the intelligence community and 503 00:26:24,320 --> 00:26:26,760 Speaker 1: the military and people like this come in, is to 504 00:26:26,760 --> 00:26:30,080 Speaker 1: try to protect us from those threats. But we also 505 00:26:30,200 --> 00:26:33,840 Speaker 1: need to protect ourselves and be careful about where we 506 00:26:33,920 --> 00:26:36,679 Speaker 1: send information, like our own genetics. 507 00:26:37,280 --> 00:26:39,919 Speaker 3: We all need to be our own little internal intelligence 508 00:26:39,960 --> 00:26:41,119 Speaker 3: officer inside. 509 00:26:41,359 --> 00:26:42,640 Speaker 4: We all need to be a little bit 510 00:26:42,520 --> 00:26:48,439 Speaker 3: more cautious with our personal details, finances, and things that 511 00:26:48,480 --> 00:26:52,439 Speaker 3: we post up online, or what seems so innocent. You know, 512 00:26:52,480 --> 00:26:55,320 Speaker 3: you open up a cruise ship app and you're entering 513 00:26:55,320 --> 00:26:57,960 Speaker 3: your information, and is their site secure? 514 00:26:58,560 --> 00:27:01,439 Speaker 4: Yeah, you know, and it's the small guys that can 515 00:27:01,480 --> 00:27:01,840 Speaker 4: get hit. 516 00:27:02,680 --> 00:27:06,359 Speaker 1: Think like a spy, you know. Be kind of a 517 00:27:06,400 --> 00:27:13,080 Speaker 1: citizen intelligence officer. That may be necessary. And look, I 518 00:27:13,119 --> 00:27:13,920 Speaker 1: don't think 519 00:27:13,720 --> 00:27:15,760 Speaker 5: that it's like we all need to be sitting around 520 00:27:15,880 --> 00:27:20,240 Speaker 5: paranoid all the time. Sure, but, you know, we're all raised... 521 00:27:20,720 --> 00:27:22,840 Speaker 1: you grow up and you're taught, okay, this is a 522 00:27:22,880 --> 00:27:25,600 Speaker 1: bad neighborhood, be a little more careful, don't leave your 523 00:27:25,680 --> 00:27:29,520 Speaker 1: valuables in the car.
Put your wallet into your front pocket, right? Like, 524 00:27:29,560 --> 00:27:33,000 Speaker 1: we're all raised in a way to protect ourselves. 525 00:27:33,240 --> 00:27:35,520 Speaker 5: And by the way, in the Cold War, we were raised... 526 00:27:35,720 --> 00:27:37,600 Speaker 1: you know, people were raised to hide under their desks 527 00:27:37,600 --> 00:27:41,520 Speaker 1: in case of a nuclear missile coming in, and today 528 00:27:41,880 --> 00:27:44,520 Speaker 1: kids are taught to hide in case of an active shooter. 529 00:27:44,680 --> 00:27:47,840 Speaker 5: So children realize there are bad things. We know there 530 00:27:47,840 --> 00:27:48,760 Speaker 5: are bad things 531 00:27:48,480 --> 00:27:51,439 Speaker 1: in the world, and it's just prudent to say there 532 00:27:51,440 --> 00:27:55,040 Speaker 1: are other bad things. And maybe that's collecting information, and 533 00:27:55,119 --> 00:27:57,400 Speaker 1: maybe that's trying to convince me of something. Maybe when 534 00:27:57,400 --> 00:28:01,760 Speaker 1: I'm on, you know, something like TikTok, which is Chinese- 535 00:28:01,760 --> 00:28:05,920 Speaker 1: owned, and they not only can, but, it's been 536 00:28:06,000 --> 00:28:10,560 Speaker 1: proven, do use it for censorship and other information control, 537 00:28:11,280 --> 00:28:14,960 Speaker 1: then you should be prudent, and we should be aware 538 00:28:15,040 --> 00:28:18,080 Speaker 1: of these kinds of threats and act accordingly. 539 00:28:19,000 --> 00:28:20,679 Speaker 4: I mean, even video games. I can think of a 540 00:28:20,760 --> 00:28:21,760 Speaker 4: very popular
541 00:28:21,400 --> 00:28:23,880 Speaker 3: game out there. And I usually like to, like, invest 542 00:28:24,200 --> 00:28:27,199 Speaker 3: in a company that has video games, and if you 543 00:28:27,240 --> 00:28:28,720 Speaker 3: have to, like, buy in on the video game for 544 00:28:28,840 --> 00:28:31,800 Speaker 3: points or bucks or, you know, any of that kind 545 00:28:31,840 --> 00:28:34,120 Speaker 3: of thing, I'm like, well, if they're going to collect 546 00:28:34,160 --> 00:28:36,359 Speaker 3: money from everybody, what's their stock value? 547 00:28:36,480 --> 00:28:38,960 Speaker 4: That's my first thing. So this one company I looked up. 548 00:28:39,760 --> 00:28:41,520 Speaker 3: You know, my daughter was into this game, and it 549 00:28:41,560 --> 00:28:42,840 Speaker 3: was very popular, and I was like, what are you 550 00:28:42,840 --> 00:28:44,840 Speaker 3: spending money on there? She's like, oh, this game. And 551 00:28:44,880 --> 00:28:46,680 Speaker 3: I was like, oh, what's that game called? Right? That's 552 00:28:46,680 --> 00:28:48,440 Speaker 3: all I wanted to know. And I walked away, and 553 00:28:48,480 --> 00:28:50,280 Speaker 3: I'm like, why is she spending, you know, twelve bucks to 554 00:28:50,320 --> 00:28:53,000 Speaker 3: get points on this game to build her loadout? 555 00:28:53,520 --> 00:28:55,160 Speaker 3: So I Google it, and I'm like, oh, it's 556 00:28:55,160 --> 00:28:58,600 Speaker 3: owned by only three people, not in America, out of China, 557 00:28:59,160 --> 00:29:01,880 Speaker 3: or, you know, in that area of Asia. And 558 00:29:02,240 --> 00:29:04,480 Speaker 3: I was just like, well, how do I buy stock? 559 00:29:04,560 --> 00:29:06,440 Speaker 3: I'm like, is this openly traded? It's like, no, 560 00:29:06,480 --> 00:29:09,400 Speaker 3: it's privately owned.
I'm like, wow, you know, 561 00:29:09,760 --> 00:29:14,000 Speaker 3: three people have access to this huge video game platform, 562 00:29:14,400 --> 00:29:17,719 Speaker 3: and they're just collecting everybody's data, your strokes, your keystrokes, 563 00:29:17,760 --> 00:29:20,520 Speaker 3: everything that you're doing through that game. Nobody really reads, 564 00:29:20,800 --> 00:29:25,200 Speaker 3: I think, the user agreements, the privacy agreements. And if 565 00:29:25,200 --> 00:29:27,240 Speaker 3: they are, I mean, are they honestly the real thing? 566 00:29:27,280 --> 00:29:29,640 Speaker 4: But still, I mean, you're doing it. 567 00:29:29,680 --> 00:29:31,280 Speaker 5: You just did it right there, right? You did some 568 00:29:31,400 --> 00:29:32,640 Speaker 5: due diligence. I did. 569 00:29:32,720 --> 00:29:35,680 Speaker 1: Yeah, you thought like an intelligence officer. You kind of 570 00:29:35,760 --> 00:29:38,480 Speaker 1: learned something and you dug into it, and that's 571 00:29:38,480 --> 00:29:41,600 Speaker 1: what people will need to do. And we can do 572 00:29:41,680 --> 00:29:45,240 Speaker 1: that for anything, right? We can go and say, hey, 573 00:29:45,240 --> 00:29:48,200 Speaker 1: there's a new thing. We're so used to this technology as 574 00:29:48,280 --> 00:29:50,760 Speaker 1: American-owned, right? Like it's safe in some way, 575 00:29:50,760 --> 00:29:53,560 Speaker 1: and it's regulated, isn't it? 576 00:29:53,840 --> 00:29:56,240 Speaker 5: But all of a sudden, yeah, all of a sudden, 577 00:29:56,360 --> 00:29:59,720 Speaker 5: like, some of it's not, right? And so it just 578 00:29:59,800 --> 00:30:02,560 Speaker 5: takes, you know, that one minute to say, 579 00:30:02,480 --> 00:30:04,600 Speaker 1: let me dive a little deeper into this 580 00:30:05,120 --> 00:30:08,280 Speaker 1: and see what's really going on. And are 581 00:30:08,360 --> 00:30:09,680 Speaker 1: people talking about
582 00:30:09,360 --> 00:30:11,440 Speaker 5: this being a problem? Is it a threat? Is my 583 00:30:11,520 --> 00:30:13,920 Speaker 5: information going somewhere I don't want it? 584 00:30:14,200 --> 00:30:16,440 Speaker 1: Is it known that this is maybe a state-owned 585 00:30:16,520 --> 00:30:20,080 Speaker 1: enterprise by the Chinese government? And just 586 00:30:20,360 --> 00:30:22,560 Speaker 1: literally, it might take a minute, may take five minutes. 587 00:30:22,600 --> 00:30:25,120 Speaker 1: You put it into ChatGPT, and then you can 588 00:30:25,160 --> 00:30:28,520 Speaker 1: make a decision. And that doesn't mean not doing anything. 589 00:30:28,600 --> 00:30:31,000 Speaker 1: We're all going to buy Chinese stuff, right? Like it's 590 00:30:31,040 --> 00:30:31,960 Speaker 1: impossible not to. 591 00:30:32,040 --> 00:30:33,800 Speaker 4: I'm not against that, right? Exactly. 592 00:30:33,840 --> 00:30:37,760 Speaker 1: I'm just saying, but now you're making an informed decision, 593 00:30:38,120 --> 00:30:40,760 Speaker 1: like a president would, right? And you're saying, 594 00:30:40,480 --> 00:30:42,040 Speaker 5: okay, I know this. 595 00:30:42,200 --> 00:30:45,160 Speaker 1: Now I can assess the risk, and I'm willing to 596 00:30:45,200 --> 00:30:46,480 Speaker 1: live with it, or maybe I'm not. 597 00:30:46,760 --> 00:30:49,400 Speaker 5: And we need to do it for our kids too, because 598 00:30:49,160 --> 00:30:51,440 Speaker 4: they make intelligence, exactly. 599 00:30:51,520 --> 00:30:53,840 Speaker 3: So I gathered the intelligence and then made a decision, 600 00:30:54,320 --> 00:30:56,040 Speaker 3: and just was like, well, I told my daughter.
I 601 00:30:56,040 --> 00:30:57,640 Speaker 3: was like, hey, do you ever think about this? Because 602 00:30:57,680 --> 00:30:59,720 Speaker 3: the other things I've invested in there seem to be 603 00:30:59,760 --> 00:31:02,480 Speaker 3: openly traded, or on the stock market, you know, and 604 00:31:02,480 --> 00:31:04,440 Speaker 3: I find that to be a little bit, you know... 605 00:31:04,480 --> 00:31:08,400 Speaker 3: they probably follow some type of ethical code. 606 00:31:08,920 --> 00:31:13,440 Speaker 1: Yeah, yeah, there's some transparency. Like, you could be kind 607 00:31:13,440 --> 00:31:16,840 Speaker 1: of sure they're going to follow some rules. They probably 608 00:31:16,920 --> 00:31:21,680 Speaker 1: have some level of cybersecurity and, you know, government regulation, right? 609 00:31:21,520 --> 00:31:24,160 Speaker 5: And so you made that assessment, right? And you... 610 00:31:24,360 --> 00:31:28,080 Speaker 3: Because there's going to be that open-source intelligence, 611 00:31:28,160 --> 00:31:31,200 Speaker 3: or the intelligence we're giving somebody through these apps or these, 612 00:31:31,920 --> 00:31:34,240 Speaker 4: you know, social media platforms. 613 00:31:34,240 --> 00:31:37,840 Speaker 3: And it's even the current big-dog social media platforms, 614 00:31:37,840 --> 00:31:39,960 Speaker 3: all the main four, whoever you go to, you know, 615 00:31:40,080 --> 00:31:42,000 Speaker 3: all of them out there. It's like they're all gathering 616 00:31:42,000 --> 00:31:46,040 Speaker 3: our information. And it's not just China, it's America. It's 617 00:31:46,080 --> 00:31:48,880 Speaker 3: like these guys are in America and they're gathering information, 618 00:31:49,640 --> 00:31:52,440 Speaker 3: you know, for their new glasses that you are going 619 00:31:52,520 --> 00:31:54,520 Speaker 3: to wear now, and that's going to zoom right into 620 00:31:54,520 --> 00:31:56,720 Speaker 3: your cerebral cortex.
621 00:31:57,080 --> 00:31:59,000 Speaker 4: All this information just flooding in, you know. 622 00:31:59,080 --> 00:32:02,040 Speaker 3: It's like, you know, we're just getting pegged for all 623 00:32:02,040 --> 00:32:02,600 Speaker 3: of our info. 624 00:32:03,480 --> 00:32:08,200 Speaker 1: Yeah. I mean, look, the same thing does happen. American 625 00:32:08,240 --> 00:32:13,160 Speaker 1: companies take information, and that's how that whole economy works 626 00:32:12,800 --> 00:32:13,880 Speaker 5: in social media. 627 00:32:13,960 --> 00:32:16,880 Speaker 1: But there is a big difference between an American company 628 00:32:16,880 --> 00:32:18,400 Speaker 1: and a Chinese one when it comes to this. 629 00:32:18,520 --> 00:32:22,440 Speaker 5: You know, one, look, they're also American citizens 630 00:32:22,080 --> 00:32:25,840 Speaker 1: primarily running these companies, and they have to abide by 631 00:32:25,920 --> 00:32:28,640 Speaker 1: laws, and if they do something wrong, they can be 632 00:32:28,800 --> 00:32:29,480 Speaker 1: brought to court. 633 00:32:29,600 --> 00:32:31,560 Speaker 5: That's not the case in China. If the 634 00:32:31,760 --> 00:32:36,120 Speaker 1: CEO of that company does something that is very illegal 635 00:32:36,160 --> 00:32:38,160 Speaker 1: in America, there's nothing we can do about it. That 636 00:32:38,200 --> 00:32:41,560 Speaker 1: guy lives in Beijing or Shanghai or something, and there's nothing 637 00:32:41,480 --> 00:32:42,400 Speaker 4: coming. That's all China. 638 00:32:42,520 --> 00:32:45,640 Speaker 3: In China, it's like what's yours is theirs, and what's 639 00:32:45,720 --> 00:32:48,720 Speaker 3: theirs is theirs in China. So the Chinese are like, hey... 640 00:32:48,960 --> 00:32:54,040 Speaker 3: I don't see many people in the provinces running, you know, crypto mining. 641 00:32:54,160 --> 00:32:54,880 Speaker 4: You know what I'm saying?
642 00:32:55,440 --> 00:32:57,640 Speaker 3: All the people that live out in the different areas, 643 00:32:57,680 --> 00:33:02,360 Speaker 3: that are disenfranchised from the rest of the big dogs 644 00:33:02,360 --> 00:33:04,520 Speaker 3: in China... a lot of folks don't realize 645 00:33:04,520 --> 00:33:07,719 Speaker 3: that those people are struggling to just maintain a life. 646 00:33:07,920 --> 00:33:09,600 Speaker 3: And then there's the people at the very top that 647 00:33:09,720 --> 00:33:12,400 Speaker 3: just are running the whole show and just gathering all 648 00:33:12,440 --> 00:33:14,240 Speaker 3: the information. Like you said, they can just go to 649 00:33:14,280 --> 00:33:16,400 Speaker 3: Company A and say, give us all your information. So 650 00:33:16,440 --> 00:33:19,560 Speaker 3: everything over there really just works for, you know, the 651 00:33:19,600 --> 00:33:20,720 Speaker 3: government of China. 652 00:33:21,320 --> 00:33:23,360 Speaker 5: Yeah, exactly. I mean, you bring up a good point. Look, 653 00:33:23,440 --> 00:33:26,560 Speaker 5: this isn't about Chinese people. No, Chinese people are like 654 00:33:26,600 --> 00:33:27,320 Speaker 5: American people. 655 00:33:27,360 --> 00:33:29,920 Speaker 1: They're just living their life, working at a job, doing 656 00:33:29,960 --> 00:33:33,000 Speaker 1: their thing. It's the government, it's the regime, it's the 657 00:33:33,000 --> 00:33:37,760 Speaker 1: Communist Party, and that party is, I believe, bad 658 00:33:37,960 --> 00:33:41,520 Speaker 1: for the Chinese people. I mean, they commit, you know, 659 00:33:42,280 --> 00:33:46,960 Speaker 1: atrocities on people, and this is well documented. These are 660 00:33:47,000 --> 00:33:50,440 Speaker 1: not good people, and they do, you know, they want 661 00:33:50,480 --> 00:33:52,000 Speaker 1: to do worse to us.
662 00:33:52,640 --> 00:33:54,600 Speaker 5: You know, think of what they're doing to their own 663 00:33:54,640 --> 00:33:55,480 Speaker 5: people, and 664 00:33:55,440 --> 00:33:57,560 Speaker 1: what they would do to us if they, 665 00:33:57,640 --> 00:34:00,440 Speaker 1: you know... they consider us in some way 666 00:34:00,440 --> 00:34:01,080 Speaker 1: as an enemy. 667 00:34:01,560 --> 00:34:03,320 Speaker 5: And so... but that 668 00:34:03,360 --> 00:34:08,120 Speaker 1: also gets at how it is different there. Everything 669 00:34:08,200 --> 00:34:10,880 Speaker 1: is controlled from the top down, right? I mean, 670 00:34:11,800 --> 00:34:14,640 Speaker 1: there are other sources of power, I mean, provincial power, 671 00:34:14,840 --> 00:34:17,080 Speaker 1: and, you know, the big companies. 672 00:34:17,120 --> 00:34:21,400 Speaker 5: But really, at the end of the day, Xi Jinping 673 00:34:20,760 --> 00:34:23,839 Speaker 1: can do whatever he wants, and he can dig down 674 00:34:23,880 --> 00:34:26,759 Speaker 1: deep and tell anybody to do what he wants. And 675 00:34:27,680 --> 00:34:30,120 Speaker 1: we don't have that here. And that's a big difference. 676 00:34:30,600 --> 00:34:33,319 Speaker 1: And there is a separation. Sure, maybe 677 00:34:33,360 --> 00:34:36,319 Speaker 1: you're on Instagram, or you're on, you know, 678 00:34:36,440 --> 00:34:38,839 Speaker 1: you're on X, or you're on, you know, a social media 679 00:34:38,880 --> 00:34:43,000 Speaker 1: company here, but our government can't just go 680 00:34:43,560 --> 00:34:45,480 Speaker 1: and walk in that door and say, give me all 681 00:34:45,520 --> 00:34:48,080 Speaker 1: your data, and I want 682 00:34:48,120 --> 00:34:50,160 Speaker 1: you to take this person and kick them off, and 683 00:34:50,200 --> 00:34:51,200 Speaker 1: I want you to silence this.
684 00:34:51,360 --> 00:34:53,440 Speaker 5: They don't have control. We don't do that. We have 685 00:34:53,480 --> 00:34:55,800 Speaker 5: a separation. They don't have that separation. 686 00:34:55,920 --> 00:34:59,960 Speaker 4: Then what do you think about our current president, Trump, 687 00:34:59,680 --> 00:35:04,080 Speaker 3: owning a social media platform like that, with his Truth platform, 688 00:35:04,120 --> 00:35:07,120 Speaker 3: where he does capture everybody's information on there? 689 00:35:08,880 --> 00:35:12,560 Speaker 1: Yeah, look, I think there's a slippery slope where... I 690 00:35:12,600 --> 00:35:16,480 Speaker 1: think everybody believes in freedom of speech in this country, 691 00:35:16,520 --> 00:35:18,880 Speaker 1: and the last thing any of us want to happen 692 00:35:19,880 --> 00:35:24,759 Speaker 1: is for our government to control what we're saying. I 693 00:35:24,840 --> 00:35:28,560 Speaker 1: know I feel this way. I think everybody does. It's 694 00:35:28,640 --> 00:35:31,719 Speaker 1: fundamental to who we are. And so we want to 695 00:35:31,760 --> 00:35:34,680 Speaker 1: look at our government and know that it's not trying 696 00:35:34,719 --> 00:35:38,239 Speaker 1: to control what we say, what we do, at a 697 00:35:38,480 --> 00:35:44,680 Speaker 1: really deep, fundamental level as citizens. And so we've just 698 00:35:44,760 --> 00:35:50,520 Speaker 1: got to hope that the system, the laws in place, 699 00:35:50,640 --> 00:35:53,400 Speaker 1: the checks and balances in place, ensure 700 00:35:53,120 --> 00:35:56,800 Speaker 5: that any actions by 701 00:35:56,640 --> 00:36:00,400 Speaker 1: anybody in the government, president or anybody else, don't infringe 702 00:36:00,440 --> 00:36:02,000 Speaker 1: on those basic rights. 703 00:36:02,960 --> 00:36:07,160 Speaker 5: And if they do, we stop that from happening.
704 00:36:07,320 --> 00:36:10,440 Speaker 1: You know, by making it illegal, forcing a divestiture, sure, 705 00:36:10,880 --> 00:36:15,160 Speaker 1: you know, forcing some separation. And to me, that's just 706 00:36:15,320 --> 00:36:18,080 Speaker 1: like core to who we are as a nation. 707 00:36:20,040 --> 00:36:21,600 Speaker 4: That's very good to hear, very good to hear. 708 00:36:21,920 --> 00:36:24,880 Speaker 3: Now, you know, with all of these different things like 709 00:36:25,040 --> 00:36:29,799 Speaker 3: medical records, financial records, the data breaches that happen here 710 00:36:29,800 --> 00:36:30,319 Speaker 3: and there that 711 00:36:30,280 --> 00:36:33,880 Speaker 4: people just seem to blow off. You know, oh, hey, don't 712 00:36:33,640 --> 00:36:37,920 Speaker 3: forget that so-and-so's credit checking company, you know, 713 00:36:38,000 --> 00:36:39,920 Speaker 3: the one that tells you your credit score, has had a breach. 714 00:36:40,440 --> 00:36:42,359 Speaker 3: Do you want to join the class action and get 715 00:36:42,400 --> 00:36:46,520 Speaker 3: twelve cents back? It's like, well, wait, twelve cents back? 716 00:36:46,520 --> 00:36:47,480 Speaker 3: What got breached? 717 00:36:47,640 --> 00:36:49,960 Speaker 4: Right? And so what's that being used for? 718 00:36:50,200 --> 00:36:52,759 Speaker 3: And, you know, who took that, and what are the 719 00:36:52,800 --> 00:36:55,680 Speaker 3: actors like? And, you know, I guess we are just 720 00:36:55,719 --> 00:36:59,000 Speaker 3: so trusting with our devices that are in our hand. 721 00:36:59,400 --> 00:37:02,279 Speaker 3: Living in the US or wherever you may live, you 722 00:37:02,280 --> 00:37:05,160 Speaker 3: may feel just confident in your hometown, wherever that's at. 723 00:37:05,200 --> 00:37:06,040 Speaker 4: Even if you're in England.
724 00:37:06,080 --> 00:37:07,920 Speaker 3: You may feel like you have some freedom, right, freedom 725 00:37:07,960 --> 00:37:11,160 Speaker 3: of looking at your phone. And now there's 726 00:37:11,320 --> 00:37:13,320 Speaker 3: this thing on your phone, and you're in your country, 727 00:37:13,400 --> 00:37:16,080 Speaker 3: but that thing's not in your country. It's on your phone, 728 00:37:16,320 --> 00:37:20,680 Speaker 3: but it's coming from a third party like, uh, North 729 00:37:20,760 --> 00:37:25,840 Speaker 3: Korea or, you know, Russia or China, or, like you said, 730 00:37:25,920 --> 00:37:28,680 Speaker 3: you know, any of the different axis of evil, whoever 731 00:37:28,719 --> 00:37:31,759 Speaker 3: they decide to call them today. You know, but it's 732 00:37:31,840 --> 00:37:33,279 Speaker 3: right here, so you feel kind of safe. You're 733 00:37:33,280 --> 00:37:35,000 Speaker 3: in your own home. You're like, what's the worst that 734 00:37:35,000 --> 00:37:35,480 Speaker 3: can happen? 735 00:37:36,960 --> 00:37:39,239 Speaker 5: That's the trick, right, that's the mind trick. 736 00:37:39,280 --> 00:37:42,640 Speaker 1: It's almost like we can't even comprehend what that means, 737 00:37:42,760 --> 00:37:45,400 Speaker 1: that there can be this threat coming through this little 738 00:37:45,560 --> 00:37:49,080 Speaker 1: black box in my hand. But it does, and it 739 00:37:49,120 --> 00:37:52,759 Speaker 1: can change how you think about the world or how 740 00:37:52,760 --> 00:37:56,879 Speaker 1: your kids think about the world.
And I think there's 741 00:37:56,880 --> 00:37:59,600 Speaker 1: something else going on as well, which is, like, for 742 00:37:59,640 --> 00:38:02,920 Speaker 1: a long time we looked at those breaches, like you're saying, 743 00:38:03,000 --> 00:38:06,000 Speaker 1: you know, somebody took, you know, this financial data, and 744 00:38:06,160 --> 00:38:09,520 Speaker 1: we saw it as crime, right. So we just think about 745 00:38:09,560 --> 00:38:12,960 Speaker 1: it as, like, some hackers, you know, like, visualize, you know, 746 00:38:13,040 --> 00:38:16,560 Speaker 1: some guy in a hoodie, you know, hacking away, 747 00:38:16,600 --> 00:38:18,560 Speaker 1: just trying to make some money 748 00:38:18,719 --> 00:38:20,000 Speaker 1: by stealing some credit cards. 749 00:38:20,040 --> 00:38:21,680 Speaker 5: Like, that's where we're at in our minds. 750 00:38:21,920 --> 00:38:24,880 Speaker 1: And that's how we treated it, and so we 751 00:38:25,200 --> 00:38:28,120 Speaker 1: monetized it. We're like, okay, you get hacked, you 752 00:38:28,160 --> 00:38:30,480 Speaker 1: get twelve cents back, like you're saying. 753 00:38:30,880 --> 00:38:33,160 Speaker 5: That's not how it is now. 754 00:38:33,239 --> 00:38:38,400 Speaker 1: It's not just a financial crime anymore, right, it's not 755 00:38:39,000 --> 00:38:42,239 Speaker 1: just hackers. These are nation states doing it, and they're 756 00:38:42,280 --> 00:38:47,320 Speaker 1: doing it for national security reasons. They're trying to harm 757 00:38:47,360 --> 00:38:51,200 Speaker 1: the country. They're trying to, you know, disrupt an election. 758 00:38:51,440 --> 00:38:57,399 Speaker 1: They're trying to have fellow citizens be angry with each 759 00:38:57,400 --> 00:39:00,400 Speaker 1: other, because they know that will make us weaker.
760 00:39:00,840 --> 00:39:06,480 Speaker 1: They're trying to steal, you know, information to create technologies 761 00:39:06,520 --> 00:39:09,800 Speaker 1: that would be used in a war against us. Right, 762 00:39:09,960 --> 00:39:14,080 Speaker 1: so these are not financial crimes anymore. These are national 763 00:39:14,160 --> 00:39:16,359 Speaker 1: security crimes that affect all 764 00:39:16,280 --> 00:39:19,000 Speaker 5: of us, and we need to start thinking 765 00:39:18,600 --> 00:39:22,640 Speaker 1: differently about it as individuals, but also as companies and 766 00:39:22,800 --> 00:39:23,760 Speaker 1: also as a government. 767 00:39:24,280 --> 00:39:26,480 Speaker 5: You know, we can't let this stuff slide anymore. We 768 00:39:26,560 --> 00:39:27,319 Speaker 5: have to look at it 769 00:39:27,239 --> 00:39:30,120 Speaker 1: and say, no, our citizens are being targeted. We 770 00:39:30,160 --> 00:39:32,440 Speaker 1: need to go after this, and we need to stop 771 00:39:32,480 --> 00:39:36,440 Speaker 1: it from happening. And a fine, you know, just fining 772 00:39:36,480 --> 00:39:39,120 Speaker 1: a company, it's not going to cut it. Like, 773 00:39:39,160 --> 00:39:42,040 Speaker 1: it's not just like, oh, they lost some money for people, 774 00:39:42,040 --> 00:39:43,000 Speaker 1: but we can insure it. 775 00:39:43,000 --> 00:39:43,480 Speaker 5: It's fine. 776 00:39:43,560 --> 00:39:45,200 Speaker 1: It's like, no, they harmed us, and we need to 777 00:39:45,200 --> 00:39:48,880 Speaker 1: stop them. We need to deter them from doing this again. 778 00:39:51,080 --> 00:39:53,880 Speaker 3: I love it. No, great conversation. I hope, you know, 779 00:39:54,120 --> 00:39:57,640 Speaker 3: if you come away from this, you know, podcast episode, 780 00:39:57,880 --> 00:39:58,920 Speaker 3: that you, 781 00:39:58,920 --> 00:40:00,560 Speaker 4: our listener, our viewer.
782 00:40:00,600 --> 00:40:03,760 Speaker 3: It's just to be a little more cognizant or aware 783 00:40:04,160 --> 00:40:07,239 Speaker 3: of your, A, your situational awareness around you. 784 00:40:07,320 --> 00:40:09,799 Speaker 4: Okay, always looking around, making sure you're good. B, 785 00:40:10,440 --> 00:40:13,839 Speaker 3: what you're uploading, situational awareness. And then, C, 786 00:40:14,320 --> 00:40:17,480 Speaker 3: what you're joining and the information you're giving. And I 787 00:40:17,560 --> 00:40:20,760 Speaker 3: know that, like, places are legit. And I'm not saying 788 00:40:20,840 --> 00:40:24,480 Speaker 3: stop, you know, doing business with your local bank and stuff. 789 00:40:24,480 --> 00:40:26,440 Speaker 3: But you just got to be aware that if you're 790 00:40:26,480 --> 00:40:31,279 Speaker 3: just opting into some crazy ChatGPT app or some 791 00:40:31,440 --> 00:40:34,040 Speaker 3: type of filter that says, I'm gonna make you look 792 00:40:34,080 --> 00:40:41,200 Speaker 3: like you have a lion's face, okay, just kind of just grow 793 00:40:41,239 --> 00:40:43,279 Speaker 3: your beard out and be a lion. Okay, you just 794 00:40:43,320 --> 00:40:45,479 Speaker 3: got to grow it. Let it go, grow your own mane. 795 00:40:46,080 --> 00:40:47,480 Speaker 3: But you get what I'm saying. 796 00:40:47,320 --> 00:40:51,719 Speaker 1: Right. Yeah, I think that's exactly the takeaway. 797 00:40:51,960 --> 00:40:58,279 Speaker 1: And look, the takeaway is also that it's everybody, and 798 00:40:58,400 --> 00:41:07,600 Speaker 1: it includes soldiers, it includes retired military right there. They 799 00:41:07,680 --> 00:41:10,319 Speaker 1: especially want to target people like that. And I know 800 00:41:10,400 --> 00:41:14,120 Speaker 1: people are listening to this right now. Like, they look 801 00:41:14,160 --> 00:41:19,640 Speaker 1: at you as somebody they probably cannot defeat on the battlefield.
802 00:41:21,040 --> 00:41:25,480 Speaker 5: They want to try to defeat you before the 803 00:41:25,480 --> 00:41:28,600 Speaker 5: battle even happens by changing your mind. 804 00:41:28,840 --> 00:41:34,000 Speaker 1: A win for them is we don't even fight for Taiwan, 805 00:41:34,120 --> 00:41:34,920 Speaker 1: we just give it up. 806 00:41:35,680 --> 00:41:35,879 Speaker 4: Right. 807 00:41:36,040 --> 00:41:38,920 Speaker 1: That's called cognitive warfare, when you try to win a 808 00:41:38,960 --> 00:41:42,759 Speaker 1: fight without even fighting, by changing people's minds. And this 809 00:41:42,920 --> 00:41:46,759 Speaker 1: is ultimately where it's going for them, is how do 810 00:41:46,840 --> 00:41:50,120 Speaker 1: I win that battle over Taiwan or whatever it is 811 00:41:50,600 --> 00:41:52,040 Speaker 1: before it even happens, so I 812 00:41:52,000 --> 00:41:55,160 Speaker 5: don't have to fight you on the battlefield. Or 813 00:41:55,080 --> 00:41:58,920 Speaker 1: maybe I win it against your kids, right, and I 814 00:41:59,040 --> 00:42:01,239 Speaker 1: change their minds. So when they grow up and they 815 00:42:01,239 --> 00:42:04,359 Speaker 1: turn eighteen and they enlist, they don't want to fight 816 00:42:04,400 --> 00:42:07,640 Speaker 1: for Taiwan either. Or when they become a colonel or 817 00:42:07,800 --> 00:42:10,120 Speaker 1: general one day and have to decide on a war plan, 818 00:42:10,200 --> 00:42:11,600 Speaker 1: they say, you know what, I don't want to fight 819 00:42:11,640 --> 00:42:14,319 Speaker 1: this war, right? Like, that's the win for them. And 820 00:42:14,400 --> 00:42:16,440 Speaker 1: that's just a really different way to think about the 821 00:42:16,440 --> 00:42:18,600 Speaker 1: world, in a way none of us are used to 822 00:42:18,640 --> 00:42:19,640 Speaker 1: thinking about the world. 823 00:42:20,000 --> 00:42:21,400 Speaker 5: But that's the new world we live in.
824 00:42:21,920 --> 00:42:23,920 Speaker 3: And, you know, President Biden said this when he was 825 00:42:23,960 --> 00:42:26,440 Speaker 3: in office: you know, the next war is not 826 00:42:26,480 --> 00:42:29,000 Speaker 3: going to be a bullet shot. It's going to be 827 00:42:29,040 --> 00:42:32,480 Speaker 3: a keystroke. You know, it's just going to be that 828 00:42:32,640 --> 00:42:35,480 Speaker 3: kind of warfare. Somebody's going to just trigger somebody with 829 00:42:35,520 --> 00:42:37,279 Speaker 3: a keystroke, and then it's just going to be set, 830 00:42:37,320 --> 00:42:41,399 Speaker 3: and then off it goes, you know, online, per se, 831 00:42:41,480 --> 00:42:42,680 Speaker 3: whatever that keystroke is. 832 00:42:42,719 --> 00:42:44,160 Speaker 4: But I kind of believe in that. 833 00:42:44,320 --> 00:42:46,600 Speaker 3: You know, I can see that. And I know that, 834 00:42:46,840 --> 00:42:49,120 Speaker 3: like, in Ukraine right now, it is very bullets on target, 835 00:42:49,160 --> 00:42:51,880 Speaker 3: but it's also very much propaganda coming into 836 00:42:51,800 --> 00:42:54,120 Speaker 4: my feed and your feed. And, like, this 837 00:42:54,000 --> 00:42:56,279 Speaker 3: is what's going on, and it's showing us what it 838 00:42:56,360 --> 00:42:59,840 Speaker 3: thinks we need to see, because this is a supercomputer 839 00:43:00,600 --> 00:43:03,440 Speaker 3: right here, and I am not a supercomputer. 840 00:43:04,239 --> 00:43:04,879 Speaker 5: Yeah. 841 00:43:04,960 --> 00:43:08,560 Speaker 1: There's this author William Gibson, he wrote this book called Neuromancer, 842 00:43:08,680 --> 00:43:10,359 Speaker 1: he's kind of a science fiction author, and he has 843 00:43:10,400 --> 00:43:13,000 Speaker 1: this great quote. He said, the future is here, 844 00:43:13,120 --> 00:43:14,919 Speaker 1: it's just not evenly distributed. 845 00:43:15,560 --> 00:43:18,080 Speaker 5: And so when we look at Ukraine, you're
846 00:43:17,920 --> 00:43:21,120 Speaker 1: seeing that future. Yeah, look, they're in trenches in the mud. 847 00:43:21,200 --> 00:43:23,120 Speaker 1: That's a real war over there. There's a lot of 848 00:43:23,120 --> 00:43:26,680 Speaker 1: guys dying. But they're also showing the future of that war. 849 00:43:26,800 --> 00:43:30,000 Speaker 1: These guys are flying drones against each other. They're updating 850 00:43:30,000 --> 00:43:34,000 Speaker 1: those drones all the time. They're most likely doing cyber 851 00:43:34,040 --> 00:43:35,800 Speaker 1: attacks and information 852 00:43:35,400 --> 00:43:36,960 Speaker 5: attacks and all these kinds of things. 853 00:43:37,200 --> 00:43:40,440 Speaker 1: So we're seeing both, right, and we're seeing that, you know, 854 00:43:40,560 --> 00:43:43,799 Speaker 1: unevenly distributed future, and that is going to be our 855 00:43:43,840 --> 00:43:46,279 Speaker 1: future. Probably the next war we fight, it is going to 856 00:43:46,360 --> 00:43:48,719 Speaker 1: be that keystroke. It's going to be robots. We're still 857 00:43:48,719 --> 00:43:51,479 Speaker 1: gonna have guys on the ground. You know, there are gonna 858 00:43:51,480 --> 00:43:53,560 Speaker 1: be guys doing, like, you know, what I did, you know, 859 00:43:53,560 --> 00:43:55,360 Speaker 1: when I was in Iraq, out 860 00:43:55,120 --> 00:43:58,200 Speaker 5: there collecting information, recruiting sources. 861 00:43:58,320 --> 00:44:01,880 Speaker 1: There are gonna be SOF guys out there, you know, doing 862 00:44:02,040 --> 00:44:05,759 Speaker 1: their thing on an island somewhere in the South Pacific. 863 00:44:06,280 --> 00:44:07,839 Speaker 5: But a lot of it is going
864 00:44:07,840 --> 00:44:11,400 Speaker 1: to be these drones, and those drones or those AI 865 00:44:11,520 --> 00:44:15,120 Speaker 1: systems were made by somebody here who went to an 866 00:44:15,120 --> 00:44:19,600 Speaker 1: American college, maybe, and works at a company in the 867 00:44:19,680 --> 00:44:23,160 Speaker 1: Bay Area or in Austin or something, and that person is 868 00:44:23,160 --> 00:44:25,560 Speaker 5: being targeted, right, because 869 00:44:25,280 --> 00:44:29,200 Speaker 1: whoever that guy or gal is, 870 00:44:29,560 --> 00:44:33,360 Speaker 1: they want that person to not make that software, right, 871 00:44:33,360 --> 00:44:35,799 Speaker 1: or if they do make it, they want to 872 00:44:36,040 --> 00:44:38,840 Speaker 1: take that software and use it themselves 873 00:44:38,880 --> 00:44:40,080 Speaker 1: as well. 874 00:44:40,120 --> 00:44:42,279 Speaker 3: Like the family that was escorted out of the US, 875 00:44:42,360 --> 00:44:43,960 Speaker 3: and they had to take their kids, who didn't know 876 00:44:44,000 --> 00:44:46,880 Speaker 3: their parents were Russian spies. There was that 877 00:44:47,000 --> 00:44:50,800 Speaker 3: trade between Russia and the US, like a prisoner 878 00:44:50,800 --> 00:44:53,279 Speaker 3: swap, and they gave up this family that had been 879 00:44:53,320 --> 00:44:55,120 Speaker 3: captured and, like, sent them back to Russia. 880 00:44:55,120 --> 00:44:58,360 Speaker 4: And the kids were like, what? Yeah, yeah, the 881 00:44:58,360 --> 00:45:00,080 Speaker 3: whole time, you know, living in 882 00:45:00,120 --> 00:45:04,680 Speaker 3: the US. You know, this is current, that's like, this 883 00:45:04,800 --> 00:45:05,320 Speaker 3: is happening. 884 00:45:05,400 --> 00:45:10,040 Speaker 5: You know, there's this kind of story where
885 00:45:11,880 --> 00:45:16,920 Speaker 1: It talks about if the US and the Russians and 886 00:45:17,000 --> 00:45:20,560 Speaker 1: the Chinese each wanted to get some sand from a 887 00:45:20,600 --> 00:45:26,640 Speaker 1: beach somewhere, right? The US, we'd get, you know, 888 00:45:26,680 --> 00:45:29,120 Speaker 1: we'd get this satellite, and it'd be a billion dollar 889 00:45:29,200 --> 00:45:31,680 Speaker 1: satellite, and we'd send it up into space, and it 890 00:45:32,040 --> 00:45:35,000 Speaker 1: would use a special sensor that, you know, Nobel Prize 891 00:45:35,000 --> 00:45:38,320 Speaker 1: winning physicists made, that would scan the beach and understand 892 00:45:38,400 --> 00:45:42,080 Speaker 1: the chemistry of the sand. The Russians, they would send 893 00:45:42,080 --> 00:45:44,600 Speaker 1: in a submarine, and they would have their Spetsnaz, like, 894 00:45:44,680 --> 00:45:47,680 Speaker 1: special operations guy, in the middle of the night, swim 895 00:45:47,760 --> 00:45:49,480 Speaker 1: up to the beach and grab, you know, a little 896 00:45:49,600 --> 00:45:55,000 Speaker 1: sample of the sand. The Chinese, they would invite a 897 00:45:55,080 --> 00:45:59,080 Speaker 1: thousand Chinese tourists to go to that beach, and they 898 00:45:59,080 --> 00:46:02,919 Speaker 1: would suntan all day, hang out, go swimming, get back 899 00:46:02,960 --> 00:46:06,480 Speaker 1: on the plane. And when they landed in Shanghai, the 900 00:46:06,560 --> 00:46:10,400 Speaker 1: Chinese MSS, their CIA, would come in, take their towels, 901 00:46:10,760 --> 00:46:13,239 Speaker 1: shake out the sand from the towels, and they would 902 00:46:13,239 --> 00:46:14,800 Speaker 1: have more sand than the rest of us could ever 903 00:46:14,840 --> 00:46:17,880 Speaker 1: collect, and it would cost almost nothing.
Right? And 904 00:46:17,920 --> 00:46:21,280 Speaker 1: it's just a different mentality of how you collect information 905 00:46:21,440 --> 00:46:24,040 Speaker 1: and how you do things. And, like, to your point, 906 00:46:24,640 --> 00:46:27,080 Speaker 1: you know, they're going to come in and 907 00:46:27,239 --> 00:46:29,560 Speaker 1: just think about it in nontraditional ways that we're 908 00:46:30,360 --> 00:46:30,840 Speaker 1: not used to. 909 00:46:32,080 --> 00:46:32,239 Speaker 4: Right. 910 00:46:32,280 --> 00:46:34,120 Speaker 3: There's always someone out there trying to think of another 911 00:46:34,160 --> 00:46:38,239 Speaker 3: way to be devious, and so we have to keep 912 00:46:38,760 --> 00:46:43,680 Speaker 3: training guys like yourself and others to be just as devious. 913 00:46:43,680 --> 00:46:47,160 Speaker 3: Because I've talked to undercover FBI agents who helped 914 00:46:47,160 --> 00:46:50,319 Speaker 3: thwart Russian spies inside the FBI. The dude had to 915 00:46:50,320 --> 00:46:53,520 Speaker 3: become the same thing. Yeah, he had to, like, learn 916 00:46:53,560 --> 00:46:55,759 Speaker 3: and live the same path for five years. You know, 917 00:46:55,880 --> 00:46:57,480 Speaker 3: he's like, talk about spy. 918 00:46:58,400 --> 00:47:01,840 Speaker 1: Yeah, this is the fun thing about being an intelligence officer. 919 00:47:01,960 --> 00:47:05,920 Speaker 5: Right, you are, you're a paid criminal. 920 00:47:06,480 --> 00:47:08,520 Speaker 5: You're a criminal on behalf of your country.
921 00:47:08,800 --> 00:47:12,760 Speaker 1: You're breaking laws, right? You're thinking about devious 922 00:47:12,800 --> 00:47:19,000 Speaker 1: ways to get things, to steal things, information usually, from 923 00:47:19,120 --> 00:47:22,000 Speaker 1: other countries, and thinking of clever ways to do it, 924 00:47:22,040 --> 00:47:24,560 Speaker 1: and you're doing it to give your own country an 925 00:47:24,640 --> 00:47:26,280 Speaker 1: unfair advantage. 926 00:47:26,360 --> 00:47:29,000 Speaker 5: Right? This isn't a fair game. If we're in a 927 00:47:29,080 --> 00:47:30,880 Speaker 5: fair fight, we've already lost. 928 00:47:31,200 --> 00:47:34,200 Speaker 1: The idea is to get information unfairly, give us an 929 00:47:34,280 --> 00:47:37,480 Speaker 1: unfair advantage, by having people who go out there and are 930 00:47:38,280 --> 00:47:39,440 Speaker 1: devious people. 931 00:47:39,920 --> 00:47:41,280 Speaker 5: You have to be like that a little. 932 00:47:41,680 --> 00:47:44,520 Speaker 1: The balance, though, is you also have to have a 933 00:47:44,600 --> 00:47:47,920 Speaker 1: huge amount of integrity, because you can never turn that back 934 00:47:47,800 --> 00:47:49,879 Speaker 5: on your own country. And that's what a lot of people 935 00:47:49,920 --> 00:47:51,600 Speaker 5: don't realize. When 936 00:47:51,400 --> 00:47:54,239 Speaker 1: you're recruited as an intelligence officer, the number one thing 937 00:47:54,800 --> 00:47:57,759 Speaker 1: for being recruited is, do you have integrity? If you 938 00:47:57,760 --> 00:47:59,840 Speaker 1: don't have that, then you're not allowed to play in 939 00:47:59,840 --> 00:48:00,480 Speaker 1: this sandbox. 940 00:48:00,520 --> 00:48:00,799 Speaker 5: That's it. 941 00:48:00,880 --> 00:48:04,040 Speaker 3: You're going home, right? Because again, like the gentleman that 942 00:48:04,080 --> 00:48:05,400 Speaker 3: created the OSS, or was a
943 00:48:05,360 --> 00:48:07,319 Speaker 4: big part of that in the very beginning. You know, 944 00:48:07,360 --> 00:48:07,800 Speaker 4: they all 945 00:48:07,640 --> 00:48:10,960 Speaker 3: thought that he fought for multiple sides. And they're like, well, 946 00:48:10,960 --> 00:48:12,760 Speaker 3: is he fighting for the Russians? Is he fighting 947 00:48:12,800 --> 00:48:14,480 Speaker 3: for England? Is he fighting for America? 948 00:48:14,600 --> 00:48:16,719 Speaker 4: You know? And again, when I mentioned that he was buried 949 00:48:16,719 --> 00:48:20,200 Speaker 3: with American honors, et cetera, you know, for his service, 950 00:48:20,920 --> 00:48:25,280 Speaker 3: that's what he should be recognized as, not the other things. 951 00:48:25,280 --> 00:48:29,120 Speaker 3: But through history it's like, well, was his loyalty to 952 00:48:29,160 --> 00:48:32,000 Speaker 3: the US the whole time? Well, he had to do 953 00:48:32,120 --> 00:48:36,200 Speaker 3: things that seemed like it wasn't in order for it 954 00:48:36,239 --> 00:48:36,480 Speaker 3: to be. 955 00:48:37,520 --> 00:48:42,240 Speaker 1: Yeah, it's true. There is a long 956 00:48:42,680 --> 00:48:46,120 Speaker 1: history of people for whom you may not even 957 00:48:46,160 --> 00:48:49,120 Speaker 1: know the real story. But some people do know the 958 00:48:49,160 --> 00:48:51,479 Speaker 1: real story. And they were out there protecting the nation. 959 00:48:52,239 --> 00:48:53,480 Speaker 5: And 960 00:48:53,719 --> 00:48:56,280 Speaker 1: I think you're talking about 961 00:48:56,320 --> 00:48:58,400 Speaker 1: some of the MI6 guys and the 962 00:48:58,400 --> 00:49:01,480 Speaker 1: British who helped us build this. We didn't know how 963 00:49:01,480 --> 00:49:02,480 Speaker 1: to do it. 964 00:49:02,640 --> 00:49:03,120 Speaker 5: We didn't have
965 00:49:03,239 --> 00:49:06,000 Speaker 1: We didn't have a history of this, and, you know, 966 00:49:06,840 --> 00:49:10,600 Speaker 1: President Roosevelt at the time appointed this guy, Bill Donovan. 967 00:49:10,719 --> 00:49:14,239 Speaker 1: They called him Wild Bill Donovan, and he 968 00:49:14,680 --> 00:49:16,560 Speaker 1: was a lawyer. He was an amazing guy. 969 00:49:17,000 --> 00:49:21,040 Speaker 1: He was a Republican who worked for Roosevelt. They disagreed 970 00:49:21,080 --> 00:49:24,200 Speaker 1: with each other politically, but Roosevelt knew Donovan was 971 00:49:24,239 --> 00:49:26,560 Speaker 1: the right guy, and he said, I need you to make 972 00:49:27,320 --> 00:49:30,680 Speaker 1: our intelligence agency. It was called the COI at first. And 973 00:49:31,880 --> 00:49:35,160 Speaker 1: Donovan was also a bit of a devious guy, 974 00:49:36,000 --> 00:49:39,440 Speaker 1: and he put it together. He learned from the British. 975 00:49:39,760 --> 00:49:42,680 Speaker 1: They helped us set it up, and he went out 976 00:49:42,719 --> 00:49:46,560 Speaker 1: there, and he didn't 977 00:49:46,280 --> 00:49:49,160 Speaker 5: think like a normal person at that time. He was like, well, 978 00:49:49,160 --> 00:49:50,600 Speaker 5: who do I need? 979 00:49:50,760 --> 00:49:52,880 Speaker 1: Yeah, I need some guys who went to Yale and 980 00:49:52,920 --> 00:49:55,920 Speaker 1: were lawyers and whatever. But, you know, I also need 981 00:49:55,920 --> 00:49:58,600 Speaker 1: this guy who's in prison, because he's the best 982 00:49:58,640 --> 00:50:02,680 Speaker 1: guy at cracking safes in the world, and I'm going 983 00:50:02,719 --> 00:50:04,840 Speaker 1: to go get him out of prison, and I'm going 984 00:50:04,880 --> 00:50:06,800 Speaker 1: to have him come teach me how to do it. 985 00:50:07,000 --> 00:50:09,120 Speaker 5: He would recruit magicians who
986 00:50:08,960 --> 00:50:10,920 Speaker 1: could do sleight of hand and teach it. Like, 987 00:50:11,000 --> 00:50:14,160 Speaker 1: he would recruit people who knew how to do this stuff 988 00:50:14,280 --> 00:50:17,640 Speaker 5: and give them a chance, and they made us 989 00:50:17,800 --> 00:50:18,320 Speaker 5: better at it. 990 00:50:18,320 --> 00:50:22,000 Speaker 1: And he created that whole culture that we still have 991 00:50:22,120 --> 00:50:24,920 Speaker 1: today in special forces and in the CIA and other 992 00:50:24,920 --> 00:50:27,880 Speaker 1: intelligence agencies, which is, like, get the job done. 993 00:50:27,920 --> 00:50:30,560 Speaker 1: Get it done with integrity for your own country, but 994 00:50:30,840 --> 00:50:33,480 Speaker 1: you've got to do the mission when 995 00:50:33,480 --> 00:50:34,240 Speaker 1: you're out in the field. 996 00:50:35,360 --> 00:50:35,520 Speaker 4: Yeah. 997 00:50:35,560 --> 00:50:37,719 Speaker 3: It's like that movie Catch Me If You Can with 998 00:50:37,840 --> 00:50:41,200 Speaker 3: Leonardo DiCaprio and Tom Hanks. You know, no spoilers here, 999 00:50:41,239 --> 00:50:43,640 Speaker 3: but if you haven't seen it, towards the end, they're like, well, 1000 00:50:43,680 --> 00:50:46,080 Speaker 3: we want to hire you. You know, 1001 00:50:46,160 --> 00:50:47,360 Speaker 3: you know how to do all these checks. 1002 00:50:47,480 --> 00:50:49,000 Speaker 4: We need you. Or, 1003 00:50:48,880 --> 00:50:51,400 Speaker 3: like, you know, the people that worked at the Pentagon 1004 00:50:51,440 --> 00:50:53,799 Speaker 3: that made a fake parking pass because they got tired 1005 00:50:53,840 --> 00:50:56,640 Speaker 3: of walking so far to the Pentagon, so they 1006 00:50:56,680 --> 00:50:59,040 Speaker 3: made a pass, and they were just getting away with, 1007 00:50:59,120 --> 00:51:01,600 Speaker 3: like, parking in the admirals' parking or something like that.
1008 00:51:01,680 --> 00:51:03,920 Speaker 3: And then one day someone's like, how come this guy 1009 00:51:04,000 --> 00:51:05,000 Speaker 3: keeps parking here? 1010 00:51:05,480 --> 00:51:07,360 Speaker 4: And they said, where'd you get this pass? 1011 00:51:07,400 --> 00:51:10,360 Speaker 3: And he's like, ah, I guess I made it. And 1012 00:51:10,360 --> 00:51:11,399 Speaker 3: they're like, can you make more? 1013 00:51:13,800 --> 00:51:14,600 Speaker 5: Yeah? 1014 00:51:14,840 --> 00:51:16,680 Speaker 4: Can you make more? You see. 1015 00:51:17,040 --> 00:51:20,279 Speaker 3: He's like, oh, yeah, no problem, no problem, you know. Okay, cool. Yeah, 1016 00:51:20,320 --> 00:51:23,399 Speaker 3: The Dirty Dozen was, you know, a good movie, right? 1017 00:51:23,600 --> 00:51:26,000 Speaker 3: I need these guys to do this job. 1018 00:51:26,520 --> 00:51:28,720 Speaker 5: Yeah, I think that's exactly right. 1019 00:51:28,840 --> 00:51:32,080 Speaker 1: And there's a place for those people still, you know, 1020 00:51:32,320 --> 00:51:37,000 Speaker 1: in the intelligence world, for doing that, for creating cover 1021 00:51:37,080 --> 00:51:39,520 Speaker 1: documents and things like that. It still has to get done. 1022 00:51:39,600 --> 00:51:41,279 Speaker 5: We still need devious people 1023 00:51:41,120 --> 00:51:44,759 Speaker 1: who are good at something and oftentimes obsessed with it, 1024 00:51:44,960 --> 00:51:47,879 Speaker 1: like, this is what they love doing, and you get 1025 00:51:47,880 --> 00:51:50,279 Speaker 1: to do it. I mean, it's just the coolest job ever. 1026 00:51:50,640 --> 00:51:53,200 Speaker 1: Like, you get to do these things, and you're somebody like 1027 00:51:53,200 --> 00:51:53,920 Speaker 4: me. Cool job. 1028 00:51:55,239 --> 00:51:57,480 Speaker 5: I love to live abroad, I love to travel.
1029 00:51:58,000 --> 00:52:01,560 Speaker 1: I love the idea of doing operations, figuring out hard problems, 1030 00:52:01,560 --> 00:52:03,439 Speaker 1: and you get to go do that every day. 1031 00:52:04,280 --> 00:52:04,440 Speaker 4: You know. 1032 00:52:04,520 --> 00:52:06,839 Speaker 1: The dirty secret is you still spend half your time 1033 00:52:07,000 --> 00:52:09,880 Speaker 1: doing paperwork, like every other job. But for that 1034 00:52:09,960 --> 00:52:11,640 Speaker 1: other half, when you're out in the field, 1035 00:52:11,440 --> 00:52:12,280 Speaker 5: it's amazing. 1036 00:52:12,280 --> 00:52:14,840 Speaker 1: And there are people who do that in technology and 1037 00:52:14,880 --> 00:52:17,439 Speaker 1: all sorts of things. And it's like, you know, look, 1038 00:52:17,480 --> 00:52:20,640 Speaker 1: you could go make software at any company in this 1039 00:52:20,760 --> 00:52:23,960 Speaker 1: country if you're a great software developer or you have 1040 00:52:24,000 --> 00:52:25,520 Speaker 1: a degree in math or something. 1041 00:52:26,239 --> 00:52:27,480 Speaker 5: You know, you could be making 1042 00:52:27,239 --> 00:52:31,520 Speaker 1: ad tech that sells more, you know, widgets to people. 1043 00:52:31,920 --> 00:52:34,719 Speaker 1: Or you could go work at the NSA or the 1044 00:52:34,760 --> 00:52:38,719 Speaker 1: CIA or DIA or any of these agencies, and you could 1045 00:52:38,719 --> 00:52:41,200 Speaker 1: be making software, but you're doing it for a higher cause. 1046 00:52:41,800 --> 00:52:44,160 Speaker 1: And trust me, it's going to be way more interesting 1047 00:52:44,320 --> 00:52:47,879 Speaker 1: than making software that sells more widgets. 1048 00:52:47,800 --> 00:52:49,759 Speaker 4: Just an analyst, quote unquote.
1049 00:52:51,120 --> 00:52:54,240 Speaker 3: Now, listen, you've been a great guest on the show, 1050 00:52:54,480 --> 00:52:57,200 Speaker 3: going back and forth just all about the spy game 1051 00:52:57,280 --> 00:52:59,160 Speaker 3: and talking about your book, you know. And I 1052 00:52:59,200 --> 00:53:02,520 Speaker 3: want to just reiterate that, you know, Anthony Vinci, 1053 00:53:02,560 --> 00:53:04,360 Speaker 3: who is here with us today, is the author of 1054 00:53:04,440 --> 00:53:08,600 Speaker 3: the book The Fourth Intelligence Revolution: The Future of Espionage 1055 00:53:08,760 --> 00:53:11,279 Speaker 3: and the Battle to Save America. So I want to 1056 00:53:11,280 --> 00:53:13,440 Speaker 3: plug that again and make sure that gets said 1057 00:53:13,280 --> 00:53:14,240 Speaker 4: clearly out here. 1058 00:53:14,440 --> 00:53:17,400 Speaker 3: You know, you've put time and thought into writing this 1059 00:53:17,520 --> 00:53:20,600 Speaker 3: and having it out there, you know. So make sure 1060 00:53:20,600 --> 00:53:24,239 Speaker 3: that if you do pick up Anthony's book, if 1061 00:53:24,280 --> 00:53:27,040 Speaker 3: you buy it online, then you leave a review where 1062 00:53:27,040 --> 00:53:29,960 Speaker 3: you bought it once you've read it. Put down 1063 00:53:30,000 --> 00:53:31,799 Speaker 3: "great book," or, you know, "I could have 1064 00:53:31,840 --> 00:53:34,840 Speaker 3: done this better," or whatever. Just put it in the comments, 1065 00:53:35,160 --> 00:53:37,120 Speaker 3: so that way, you know, I'm sure he'll check it 1066 00:53:37,160 --> 00:53:39,759 Speaker 3: out, or, you know, everybody that's involved with the book, 1067 00:53:39,800 --> 00:53:40,640 Speaker 3: and it's a good thing 1068 00:53:40,480 --> 00:53:42,800 Speaker 4: to do for the author. That way they get some feedback.
1069 00:53:42,840 --> 00:53:45,560 Speaker 3: So my big push is just to leave comments where 1070 00:53:45,600 --> 00:53:47,480 Speaker 3: they buy their book, and if they can't buy it locally, 1071 00:53:47,800 --> 00:53:49,960 Speaker 3: then buy it wherever they can. That's my first, that's 1072 00:53:49,960 --> 00:53:56,279 Speaker 3: my other thing. And with that said, we've had you 1073 00:53:56,320 --> 00:54:01,440 Speaker 3: for just about an hour. You have served, you've 1074 00:54:01,680 --> 00:54:05,080 Speaker 3: gone into different places that I probably don't want to 1075 00:54:05,080 --> 00:54:07,960 Speaker 3: think about, and so thanks for doing that, thanks for 1076 00:54:08,000 --> 00:54:11,120 Speaker 3: figuring things out, thanks for being OCD about whatever it 1077 00:54:11,160 --> 00:54:14,160 Speaker 3: is that you're passionate about, that, you know, the agency 1078 00:54:14,200 --> 00:54:17,560 Speaker 3: loves you, to have your gig and to write your book. 1079 00:54:17,640 --> 00:54:20,400 Speaker 3: So, you know, I know you're a PhD and I 1080 00:54:20,400 --> 00:54:23,160 Speaker 3: didn't want to call you a doctor, so I'm not 1081 00:54:23,200 --> 00:54:25,880 Speaker 3: going to call you a doctor. But a PhD is 1082 00:54:25,920 --> 00:54:28,200 Speaker 3: pretty badass, bro, and I just want to let you 1083 00:54:28,239 --> 00:54:31,040 Speaker 3: know that, you know, congratulations on that. And if you 1084 00:54:31,080 --> 00:54:33,120 Speaker 3: ever want to be back on SOFREP Radio with me, 1085 00:54:33,520 --> 00:54:36,080 Speaker 3: you know you have an open platform to talk about 1086 00:54:36,080 --> 00:54:39,160 Speaker 3: your book further, or any other stories that we may 1087 00:54:39,160 --> 00:54:40,800 Speaker 3: not have even touched base on that you want to 1088 00:54:40,800 --> 00:54:42,840 Speaker 3: talk about; you can bring it to light here.
I 1089 00:54:42,880 --> 00:54:44,040 Speaker 3: just want to let you know you have an open 1090 00:54:44,200 --> 00:54:45,680 Speaker 3: door here at SOFREP Radio. 1091 00:54:46,040 --> 00:54:48,520 Speaker 1: Thanks, thanks so much, thanks for having me on, and 1092 00:54:48,600 --> 00:54:50,560 Speaker 1: thanks for what you do. You know, it's important to 1093 00:54:50,560 --> 00:54:53,000 Speaker 1: get these stories out there, and it means a lot 1094 00:54:53,040 --> 00:54:55,239 Speaker 1: to people, especially, you know, young people trying to 1095 00:54:55,280 --> 00:54:56,960 Speaker 1: figure out what to do with their lives, and so 1096 00:54:57,040 --> 00:54:58,160 Speaker 1: I appreciate what you do. 1097 00:54:58,600 --> 00:55:00,520 Speaker 4: So thank you. Well, thank you so much, thank you 1098 00:55:00,560 --> 00:55:00,959 Speaker 4: so much. 1099 00:55:01,040 --> 00:55:04,719 Speaker 3: And with that said, on behalf of Anthony 1100 00:55:04,920 --> 00:55:07,520 Speaker 3: and my team here at SOFREP and Brandon Webb 1101 00:55:07,600 --> 00:55:08,800 Speaker 3: and everybody 1102 00:55:08,440 --> 00:55:11,600 Speaker 4: who has helped support me to be on this and be 1103 00:55:11,719 --> 00:55:12,160 Speaker 4: the host, 1104 00:55:12,360 --> 00:55:14,239 Speaker 3: I want to say thank you. And to everyone out 1105 00:55:14,239 --> 00:55:17,240 Speaker 3: there that listens and comments and reaches out through emails, 1106 00:55:17,440 --> 00:55:19,520 Speaker 3: continue to do so. I love to get that information 1107 00:55:19,680 --> 00:55:22,680 Speaker 3: and I digest it, and then who knows what I 1108 00:55:22,719 --> 00:55:24,799 Speaker 3: do with it next; I might call you out. So 1109 00:55:25,200 --> 00:55:29,640 Speaker 3: with that said, be kind to somebody.
Watch what you're downloading, 1110 00:55:30,000 --> 00:55:33,680 Speaker 3: watch what you're uploading, be aware at all times, and 1111 00:55:33,719 --> 00:55:35,520 Speaker 3: from all of us here at SOFREP, this is RAD 1112 00:55:35,560 --> 00:55:36,200 Speaker 3: saying peace. 1113 00:55:52,000 --> 00:55:54,360 Speaker 2: You've been listening to SOFREP Radio.