Speaker 1: Ah, brute force. If it doesn't work, you're just not using enough. You're listening to SOFREP Radio: special operations, military news, and straight talk with the guys in the community. Yeah.

Speaker 1: I came in and took a very hard position, very hard position. And you know what? Now we have this great relationship, and let's see what happens. We're gonna have another meeting. Chairman Kim would like to have another meeting. We have a meeting. I was really being tough, and so was he, and we'd go back and forth. And then we fell in love. Okay? No, really. He wrote me beautiful letters, and they're great letters. We fell in love.

Speaker 1: SOFREP Radio, on time, on target. I had to start with that, because we don't comment on everything Trump says here on SOFREP Radio, but I know that both of us had a strong reaction to that clip. We both tweeted about it. Every day there's something kind of unheard of in this administration, something that's never been done. But when I hear something like that, I have a reaction to it. I think everybody should.
I mean, when you hear it, my reaction is kind of like: is he having early-onset dementia? Is this Alzheimer's? Because even a foolish person knows that if you say something like that, it gets you negative attention. But it's Trump. Maybe he is doing it just for attention, but it's just a totally bizarre comment. I think he enjoys negative attention, though. Yeah, I think so too. The rest of the clip, right after that comment about falling in love, he says something which I think is actually even more important than the comments about falling in love. Let me see if I can find the longer clip. Well, you remember back when Obama was talking to Iran, his administration was negotiating with Iran, and people were freaking out. They were like, you can't negotiate with Iran, you can't talk, you need to cut them off. And Ron Paul was like, why can't we negotiate with Iran?
We negotiated with the Soviet Union. And there are huge debates about who you're allowed to talk to. But to be fair, because I know someone's going to point this out: Barack Obama did not say he fell in love with Khamenei. No, no, they were just negotiating. That's it. It was just diplomacy. Let me see if this is the bigger clip. This might be the longer clip. No, this is a parody of some sort. All right, let's see. This is it: "And yes, I agree, the rhetoric was unbelievably harsh at the beginning. We fell in love. But you know what? Now they'll say, just like, Donald Trump said they fell in love. How horrible. How horrible is that? So unpresidential. And I always tell you, it's so easy to be presidential. But instead of having ten thousand people outside trying to get into this packed arena, we'd have about two hundred people standing right there." Okay, all right. That's what's really interesting.
Because he tells you. He's straight up telling you what he's about, what his M.O. is: to cause those controversies, to not be a typical president, but to make these bombastic statements in public, on Twitter, at news conferences, whatever the case is, because he wants that attention. He comes from a reality TV background. And he's not wrong on this particular topic. It's like, okay, Donald, you packed this arena with people instead of having one of these boring political rallies where just a handful of people show up. Which is the reason he was able to energize the base more than someone like Jeb Bush, who had to do rallies where he literally said to people, "Please clap," at the end of a sentence. Right. No one was enthusiastic, no one cared. But Trump is a showman. And I think it's interesting, because in this regard Trump really is like a postmodern president. He is the facade. There's nothing inside, there's no content, there's no value underneath his surface.
And he kind of communicates that himself in very, very plain language. I mean, I think he makes it very clear: look, I'm a showman, and that's what it's about for me. So he's kind of the president that doesn't offer any real content. He doesn't really even take himself seriously, and he doesn't care that you know he's not a serious president. His supporters know that he's not serious, and his supporters don't care that he's not a serious president. It's like the first time we've completely moved on. That's why I say he's postmodern, he's post-political: no one really cares that it's all a charade. We all know it's a charade, and no one gives a shit that it's a charade. That's very interesting to me. Yeah. So I was going to mention another angle on this, and personally I'm in the same boat as you. I do know that there's gonna be feedback saying we're bashing President Trump. Personally, I think I've been very fair, but when you hear something like that... We try to be fair.
But yes, I bash President Trump at times. I bashed President Obama at times. These people are very prominent political leaders. They are not above criticism. I don't know why people sometimes think you're not allowed to criticize the president. I mean, grow up. So here's my stance on it, and this is actually me seeing things from all sides. I think you guys will hear where I'm coming from. We had Sam Faddis on the podcast, who's got a great background in intelligence. You were on with me. That was at the height of the controversy over Trump meeting with North Korea, and I asked Sam his feelings about it. To me, Sam does seem pretty pro-Trump, but I think what he said was entirely fair. What he said was that, you know, Trump makes some outrageous tweets, he uses some very harsh rhetoric, and he said, I would prefer that he didn't tweet this ridiculous stuff.
But, he said, you have people in the administration, I think he mentioned Mike Pompeo, who were very hawkish on North Korea. So what he was saying was that even though Trump is using this rhetoric, it's a good thing that we're using diplomacy, that we're meeting, that we're starting to see some nuclear facilities dismantled. And have we? We have seen some, but there have also been conflicting reports about things not being dismantled. No, I mean, yeah, until we send inspectors. I don't take their word for it, but I've seen reports on both sides. It's out there in the news. But I agree, we don't know what's really happening. Not to take anything away, though: I do agree that the Trump administration has made some forward progress on this. It remains to be seen what comes of it, if anything substantial happens. But I also don't want to minimize that or bash Trump. I agree in that regard. I think he has made some forward progress.
So Faddis, if you listen back to that episode, was definitely very optimistic on it, and thought that the Trump administration was doing a good job being diplomatic with North Korea. But the point he made that was interesting, and that conflicts with this, is that Sam insisted the Trump administration has people in it who are very hawkish on North Korea. So when I hear that, and then I hear "I fell in love" with this dictator who's, you know, killed people, had people living under subhuman conditions, it's pretty outrageous. It's a criminal regime. Yeah, it should be outrageous to all of us. And it's just insane to me that Trump goes all out in bashing, let's say, someone in the liberal political establishment like Chuck Schumer, or say his own attorney general, Jeff Sessions. Yes. And then he'll handle with kid gloves, or say he fell in love with, a ruthless, evil, fucking dictator. Yes. I am by no means a fan of, let's say, someone like Chuck Schumer. But do I believe Chuck Schumer is working to topple the United States?
No, He's 153 00:09:46,320 --> 00:09:49,640 Speaker 1: a member of the government with a different ideology than 154 00:09:49,679 --> 00:09:54,640 Speaker 1: myself for Donald Trump. And it's just people will try 155 00:09:54,679 --> 00:09:56,679 Speaker 1: to draw what we've talked about this actually when Nick 156 00:09:56,720 --> 00:09:59,600 Speaker 1: Colfin was here at comparisons between Obama and Trump, and 157 00:09:59,760 --> 00:10:02,520 Speaker 1: it really is impossible to do because you've never heard 158 00:10:02,880 --> 00:10:06,000 Speaker 1: Obama use this type of rhetoric or any president. But 159 00:10:06,520 --> 00:10:08,360 Speaker 1: that's what I was saying. I mean, Trump is a 160 00:10:08,440 --> 00:10:12,680 Speaker 1: unique postmodern president or post political president. You can't compare 161 00:10:12,720 --> 00:10:15,000 Speaker 1: him to George W. Bush, you can't compare him to 162 00:10:15,000 --> 00:10:17,480 Speaker 1: Bill Clinton. I mean he and I mean I think 163 00:10:17,480 --> 00:10:19,199 Speaker 1: his supporters would tell you the same thing. It's that 164 00:10:19,280 --> 00:10:21,400 Speaker 1: Trump is his own man. He's he's different than them, 165 00:10:21,400 --> 00:10:24,360 Speaker 1: and that's why we voted for him. Yeah, most people 166 00:10:24,400 --> 00:10:27,320 Speaker 1: would agree, but there are always those defenders who come 167 00:10:27,320 --> 00:10:29,520 Speaker 1: out and say, well, what about when Obama said, you know, 168 00:10:29,679 --> 00:10:32,280 Speaker 1: called ISIS the JV team or something like that, that's 169 00:10:32,280 --> 00:10:34,920 Speaker 1: a little, yeah, a little bit different. I agree, Um, 170 00:10:34,960 --> 00:10:37,280 Speaker 1: and ISIS was the JV team. 
Like, I'm sorry to burst anybody's bubble there, but ISIS was always JV. That's basically why they got wiped out, too. AQ is still around, and AQ, they're not JV. They're like GQ. Al-Qaeda is like the GQ of terrorist organizations. They're the cool kids, they're slick. But yeah, ISIS was JV. So I think Obama was right about that. And you could see the total fucking hysteria around ISIS just a couple of years ago. You know, there were people, we've talked about it here before, people were in such a crazy, irrational panic about ISIS. They were acting like ISIS was about to come to, you know, New York City, here where we are, and institute Sharia law, and your wife was going to have to wear a burka. It was just totally irrational, crazy inflation of that threat. And now here we are in 2018, and people are like, Syria? What the fuck? Syria? What's going on? You know? No one cares. We've moved on. Attention spans are just so short.
And we talk a 191 00:11:42,520 --> 00:11:44,959 Speaker 1: little bit about Syria too, because there's some interesting stuff 192 00:11:45,000 --> 00:11:47,840 Speaker 1: happening there. Yeah. Absolutely, Um. I mean if you want 193 00:11:47,840 --> 00:11:50,040 Speaker 1: to get into that now, that's fine. We have I 194 00:11:50,080 --> 00:11:53,560 Speaker 1: know that we're dialing. We're dialing up justin Carol in 195 00:11:53,640 --> 00:11:55,120 Speaker 1: just a few minutes, so I don't know if you 196 00:11:55,120 --> 00:11:57,240 Speaker 1: want to wait we end out how long? No, we 197 00:11:57,320 --> 00:12:00,840 Speaker 1: go over it briefly. I mean what's interesting is maddest 198 00:12:01,160 --> 00:12:04,800 Speaker 1: Um making this comment that we're gonna send more advisors, 199 00:12:04,960 --> 00:12:08,360 Speaker 1: more people into Syria, into that northern Syrian region that 200 00:12:08,440 --> 00:12:11,600 Speaker 1: the Kurds called java Um where we've had people for 201 00:12:11,720 --> 00:12:17,160 Speaker 1: quite a while. And the estimates there's something like two 202 00:12:17,200 --> 00:12:20,439 Speaker 1: thousand troops in northern Syria, which is a hell of 203 00:12:20,480 --> 00:12:23,199 Speaker 1: a lot of people. Um, now that the fighting is 204 00:12:23,280 --> 00:12:26,520 Speaker 1: kind of winding down, I would think that we would 205 00:12:26,520 --> 00:12:28,720 Speaker 1: want to have like a handful of od As over there. 206 00:12:28,760 --> 00:12:30,400 Speaker 1: And I'm not sure why we want to have two 207 00:12:30,400 --> 00:12:35,600 Speaker 1: thousand people, but hey whatever. UM. 
The announcement that we're pushing more people into Syria is interesting because of what has been trying to happen. I think we talked about this on the last podcast too: why we're there, why we still have troops there. I think there's always been an understanding that the Kurds were gonna have to negotiate with the regime, with the Assad regime, and negotiate their reentry into... I hesitate to call it the Syrian state, considering everything they've been through, but let's say a consolidation, a reorganization of the Syrian state. They would have to negotiate that and negotiate their way back in, probably with some Kurdish autonomy and some local governance, but they would still become part of the Syrian state. This is so interesting to me, because what it signals, I think, is that our policy is changing, that we are shifting our focus to put the emphasis on anti-Iran operations in Syria. So we've bounced around. We went from being anti-Assad, "Assad must go," and we pushed that through covert operations.
Then we changed and said, okay, Assad can stay, but we still have to defeat ISIS. And now it's, okay, ISIS is more or less defeated, but we're gonna stay, and now we're gonna shift our focus and try to stir shit up with Iran. And all of this, in my view, is about having John Bolton in the White House and his obsession with Iran. I don't know what the long-term plan is there, maybe there isn't one, but I think they want to go stir shit up with Iran in Syria. In their view, they probably want to deal Iran a strategic defeat in Syria, or at least that's probably one of the conversations going on in the White House right now: that will help open up the door for us to go and do Iran. That's the long-term plan. It's so interesting to me, because one of the reasons someone like Jeb Bush was defeated...
And, you know, there are people critical of John McCain. They look at the George Bush and Dick Cheney administration and say these are the neocons who want us to intervene in every part of the world, and they saw Trump as a departure from that. That's why a lot of Trump supporters were critical of us going into Syria, or continuing anything in Syria in the first place. And to me, John Bolton is a guy who probably would have been appointed under a Jeb Bush administration. It's not a departure at all. Well, it's actually interesting to bring this up. I'm reading this book right now called The Great Delusion by John Mearsheimer, who is one of the prominent intellectuals in the field of political science. The subtitle, the byline on it, is Liberal Dreams and International Realities.
What he's talking about in the book is how, since the end of the Cold War, we've had these administrations, Bill Clinton, George Bush, Barack Obama, and they have all had this ideology, to varying extents, that we need to go and start wars around the world and try to make other countries look like us, that we make them become liberal democracies, and that's going to spread more human rights, more freedom, and lead to a more stable international system. And what Mearsheimer is saying is that this foreign policy has been a fucking disaster, and it has led to instability. We've crashed regimes, we've caused chaos, and it has led to an imbalance in global power. I'm still at the beginning of the book; there's a lot more to the argument he's making, I'm sure. The book just came out, and he doesn't really cover the Trump administration, because he's like, well, I don't have enough data yet.
278 00:16:28,240 --> 00:16:30,360 Speaker 1: We have to see what what the admitte, what Trump 279 00:16:30,440 --> 00:16:32,920 Speaker 1: is gonna do before we write a whole book about it, right, 280 00:16:32,960 --> 00:16:39,240 Speaker 1: So um, so he he says. My impression though, is 281 00:16:39,280 --> 00:16:43,920 Speaker 1: that Trump and his people have this acknowledgement that this 282 00:16:44,000 --> 00:16:48,440 Speaker 1: foreign policy that the past three administrations, past three presidents 283 00:16:48,440 --> 00:16:51,480 Speaker 1: have followed didn't work, that it doesn't work, and that 284 00:16:51,520 --> 00:16:53,640 Speaker 1: they're going to try to do something different. But now 285 00:16:53,680 --> 00:16:58,960 Speaker 1: what we're seeing is even Trump is curtailing uh uh, 286 00:16:59,200 --> 00:17:04,240 Speaker 1: he is bowing down to that sort of ideology that 287 00:17:04,280 --> 00:17:07,520 Speaker 1: we have to go around and fight all these dictatorships 288 00:17:07,520 --> 00:17:10,960 Speaker 1: around the world. I think certain people are have been 289 00:17:11,000 --> 00:17:13,959 Speaker 1: forced on Trump, and that's been written before, that that 290 00:17:14,040 --> 00:17:17,120 Speaker 1: Mike Pence was basically chosen for him, that putting someone 291 00:17:17,200 --> 00:17:21,720 Speaker 1: like Rent's previous initially in the administration was pushed on him. 292 00:17:22,080 --> 00:17:25,040 Speaker 1: And I think someone like John Bolton is someone that 293 00:17:25,040 --> 00:17:27,640 Speaker 1: that establishment wants to be a part of his administered. 294 00:17:27,640 --> 00:17:31,240 Speaker 1: I mean, even Barack Obama tried to push back against 295 00:17:31,640 --> 00:17:35,159 Speaker 1: this foreign policy ideology. Um. Some people would call it 296 00:17:35,240 --> 00:17:38,000 Speaker 1: neo conservative. Someone people would call it neo liberal. 
But it's all rooted in the same idea: that we can go around and force the world to look like us. And Obama tried to push back against it in several regards, and, you know, the system fought back. He was mildly successful, and I think he failed in other regards. He was careful about dealing with Ukraine. I think he was hesitant about Syria, but we still went into Syria, and all kinds of fun and games happened there, and drone strikes, all sorts of different stupid stuff that I think he at least wanted to scale back. And I don't think he was entirely successful in that. So, like you're saying about the Trump administration, there is a system. There is an almost cultish foreign policy ideology in Washington, D.C., and that system is bigger and more powerful, and in some ways darker, than any one presidential administration. And I know that sounds very conspiratorial.
And some people will probably be upset that I even use that kind of language. But I don't think it is, because I'll give you two examples here. If you look back, a lot of people forget about this stuff, but I've rewatched it: when George W. Bush was running for president, he was actually running on a very non-interventionist policy. Correct. He said things that were similar to what Ron Paul said. Correct. And then he became very interventionist. Barack Obama campaigned on a lot of ideas that were considered, you know, utopian, like dismantling atomic weapons and things like that, and that was certainly not how they ended up governing as presidents. And yeah, George W. Bush, deep down, he's a good person. And you're right, he had no intention whatsoever of going around the world starting wars. No, he was criticized for that. That was not his M.O. That was not what he intended when he took office.
333 00:19:41,119 --> 00:19:45,600 Speaker 1: But his mind, his thought process, was captured by 334 00:19:45,800 --> 00:19:48,359 Speaker 1: the likes of, you know, Dick Cheney and Don Rumsfeld, 335 00:19:48,440 --> 00:19:50,919 Speaker 1: and they talked him into some really bad decisions. And 336 00:19:51,000 --> 00:19:54,400 Speaker 1: nine eleven certainly helped. Nine eleven, of course. And 337 00:19:54,480 --> 00:19:57,480 Speaker 1: so, of course, we went into Afghanistan and 338 00:19:57,480 --> 00:20:00,359 Speaker 1: we're fighting al Qaeda. But then in two thousand and three, 339 00:20:00,400 --> 00:20:02,800 Speaker 1: there were people in that administration who saw it as 340 00:20:02,840 --> 00:20:06,040 Speaker 1: an opportunity to go around the world and fight everybody. 341 00:20:06,080 --> 00:20:08,320 Speaker 1: And the two thousand and three decision was 342 00:20:08,480 --> 00:20:12,879 Speaker 1: a disaster for our country. From a foreign 343 00:20:12,880 --> 00:20:16,520 Speaker 1: policy standpoint, it was a disaster, and we're still feeling 344 00:20:16,520 --> 00:20:20,359 Speaker 1: the repercussions of that to this day. And then after 345 00:20:20,440 --> 00:20:25,600 Speaker 1: that failed, Dick Cheney wanted to invade Syria, and 346 00:20:25,760 --> 00:20:29,680 Speaker 1: W was like, no. To his credit, I mean, 347 00:20:29,720 --> 00:20:33,639 Speaker 1: he at least learned his lesson from that failure. He was like, no, 348 00:20:34,080 --> 00:20:36,800 Speaker 1: I think we're good. But there are people like 349 00:20:36,880 --> 00:20:40,080 Speaker 1: Cheney and like Bolton who are just completely nuts.
350 00:20:40,800 --> 00:20:42,800 Speaker 1: And it's like, you want to go and invade these countries, 351 00:20:42,880 --> 00:20:45,560 Speaker 1: topple all these countries, and then it's on our soldiers 352 00:20:45,960 --> 00:20:49,480 Speaker 1: to fight and bleed and die in the streets of 353 00:20:49,520 --> 00:20:51,920 Speaker 1: a place like Iraq or, if they have their way, 354 00:20:51,920 --> 00:20:56,639 Speaker 1: Iran. And there's no accountability for the 355 00:20:56,640 --> 00:21:00,000 Speaker 1: people who make these failed policy decisions, but there sure 356 00:21:00,000 --> 00:21:02,919 Speaker 1: as hell is accountability for our troops who end up 357 00:21:02,920 --> 00:21:05,000 Speaker 1: in the middle of all of that shit. And there's 358 00:21:05,040 --> 00:21:08,439 Speaker 1: accountability for the people who live in these countries. And 359 00:21:08,640 --> 00:21:11,840 Speaker 1: you know, the Iraqi people, I mean, it's impossible to 360 00:21:11,880 --> 00:21:15,760 Speaker 1: describe the amount of suffering that they've gone through. Yeah, 361 00:21:15,840 --> 00:21:19,160 Speaker 1: I couldn't agree more. And it is something I always 362 00:21:19,200 --> 00:21:21,200 Speaker 1: think about when you see guys come home from combat 363 00:21:21,280 --> 00:21:25,560 Speaker 1: missing limbs, and the question of, was it worth it? 364 00:21:25,600 --> 00:21:27,480 Speaker 1: Why are we there? I mean, when 365 00:21:27,840 --> 00:21:30,080 Speaker 1: one of these kids dies over there, can you go 366 00:21:30,119 --> 00:21:32,040 Speaker 1: to their parents and be like, it was worth it? 367 00:21:32,720 --> 00:21:35,440 Speaker 1: I mean, as painful as it is, with all 368 00:21:35,640 --> 00:21:40,520 Speaker 1: our soldiers who died taking Normandy, we can at least say, look, 369 00:21:40,600 --> 00:21:44,280 Speaker 1: those guys fundamentally changed the world for the better.
370 00:21:44,760 --> 00:21:48,280 Speaker 1: If we had not gone and fought the Nazis, the 371 00:21:48,359 --> 00:21:51,320 Speaker 1: world would be a very dark place. It's not 372 00:21:51,359 --> 00:21:52,920 Speaker 1: to say that it would be like, oh, what's the 373 00:21:53,000 --> 00:21:56,040 Speaker 1: TV show on Amazon where the Nazis and the Japanese take over? 374 00:21:56,080 --> 00:21:57,720 Speaker 1: I know what you're talking about, The Man in 375 00:21:57,760 --> 00:22:00,880 Speaker 1: the High Castle. I don't think the world would 376 00:22:00,880 --> 00:22:04,119 Speaker 1: look like that. But fascism would be a predominant political 377 00:22:04,160 --> 00:22:06,919 Speaker 1: force in global politics. I mean, the world would be 378 00:22:06,920 --> 00:22:11,920 Speaker 1: a very nasty place. But the sacrifice that our soldiers 379 00:22:11,960 --> 00:22:16,080 Speaker 1: made going and fighting the Nazis in Europe fundamentally changed 380 00:22:16,119 --> 00:22:18,560 Speaker 1: the world for the better. And that's a fact. But 381 00:22:19,000 --> 00:22:23,480 Speaker 1: when these guys die in Afghanistan, when they die in Iraq, 382 00:22:23,720 --> 00:22:26,440 Speaker 1: or if we send them to go die in Iran, 383 00:22:26,760 --> 00:22:30,400 Speaker 1: can we really look at that and say that sacrifice mattered? 384 00:22:31,000 --> 00:22:35,960 Speaker 1: And I can't say that in good conscience. Yeah, and 385 00:22:35,960 --> 00:22:38,320 Speaker 1: these are questions we all have to think about. 386 00:22:38,520 --> 00:22:41,040 Speaker 1: I do want to talk about some changes happening at 387 00:22:41,080 --> 00:22:43,040 Speaker 1: the website. I want to get to a great email 388 00:22:43,080 --> 00:22:45,480 Speaker 1: that was sent to us. I want to talk about 389 00:22:45,520 --> 00:22:48,000 Speaker 1: a Medal of Honor recipient who needs your help.
But 390 00:22:48,080 --> 00:22:50,640 Speaker 1: I know we have Justin Carroll standing by, so let's 391 00:22:50,640 --> 00:22:53,600 Speaker 1: get over to him, joining us for the first time 392 00:22:53,680 --> 00:22:57,920 Speaker 1: on Software Radio: Justin Carroll. To give some background, 393 00:22:57,960 --> 00:23:00,080 Speaker 1: and I'm looking forward to you getting into the backstory: 394 00:23:00,200 --> 00:23:04,639 Speaker 1: former MARSOC operator, has a background in privacy and security 395 00:23:04,680 --> 00:23:08,200 Speaker 1: with USSOCOM, co-author of several 396 00:23:08,280 --> 00:23:11,360 Speaker 1: books on that topic, including the latest, which is ComSec. 397 00:23:11,800 --> 00:23:14,160 Speaker 1: And then with that same co-author, you co-host 398 00:23:14,160 --> 00:23:19,199 Speaker 1: a podcast called The Complete Privacy and Security Podcast. Before 399 00:23:19,320 --> 00:23:21,680 Speaker 1: we get into the privacy and security stuff, which I'm 400 00:23:21,680 --> 00:23:24,960 Speaker 1: looking forward to, I was saying to Jack that, to 401 00:23:25,200 --> 00:23:28,320 Speaker 1: my knowledge, Fred Galvin, and then we had another guest 402 00:23:28,359 --> 00:23:32,199 Speaker 1: on with Fred Galvin, are the only actual MARSOC operators 403 00:23:32,240 --> 00:23:35,280 Speaker 1: we've had on the podcast. I'm pretty sure. Not too many. 404 00:23:35,640 --> 00:23:37,359 Speaker 1: I think those are the only ones. We have guys 405 00:23:37,400 --> 00:23:40,440 Speaker 1: who have worked as a part of MARSOC, but who 406 00:23:40,440 --> 00:23:44,320 Speaker 1: weren't actual MARSOC operators. So I would love 407 00:23:44,359 --> 00:23:47,440 Speaker 1: to hear your background just getting into MARSOC. Nick Kaufman 408 00:23:47,560 --> 00:23:49,760 Speaker 1: was telling me some stuff, but I'd love to hear 409 00:23:49,760 --> 00:23:53,000 Speaker 1: it from you. Okay, absolutely.
So I was in a 410 00:23:53,119 --> 00:23:57,800 Speaker 1: Force Reconnaissance platoon back in late two thousand five, 411 00:23:57,880 --> 00:24:00,040 Speaker 1: going into early two thousand six. We were on a 412 00:24:00,080 --> 00:24:04,919 Speaker 1: workup for another Iraq deployment when, you know, 413 00:24:04,960 --> 00:24:07,520 Speaker 1: the higher-ups kind of read the tea leaves and saw 414 00:24:07,520 --> 00:24:12,399 Speaker 1: that MARSOC was definitely happening. So they assembled what 415 00:24:12,600 --> 00:24:15,240 Speaker 1: ended up being Fox Company, which is the kind of 416 00:24:15,280 --> 00:24:18,440 Speaker 1: infamous company that got kicked out of Afghanistan. Yeah, 417 00:24:18,480 --> 00:24:22,879 Speaker 1: Galvin's company. Yeah, so Galvin was my company commander 418 00:24:22,920 --> 00:24:25,919 Speaker 1: for that deployment, and actually he and I, he 419 00:24:26,000 --> 00:24:29,760 Speaker 1: has a podcast as well called The Go Commando Show. Yeah, 420 00:24:29,800 --> 00:24:33,520 Speaker 1: he and I talked about this at length. 421 00:24:33,640 --> 00:24:36,680 Speaker 1: I'm glad he does. He's a good guy. Yeah, yeah, 422 00:24:37,359 --> 00:24:40,240 Speaker 1: fantastic guy. But yeah, so anyway, I 423 00:24:40,320 --> 00:24:42,320 Speaker 1: just happened to be in the right place, right time, 424 00:24:42,480 --> 00:24:44,560 Speaker 1: or right place, wrong time, or however you want to 425 00:24:44,560 --> 00:24:47,560 Speaker 1: look at that. And I got pulled over from 426 00:24:47,640 --> 00:24:51,320 Speaker 1: Third Platoon, Second Force Reconnaissance Company, into what at 427 00:24:51,320 --> 00:24:54,520 Speaker 1: the time was Fourth Platoon, but what became Fox Company, 428 00:24:54,920 --> 00:24:59,040 Speaker 1: Second Marine Special Operations Battalion.
Interesting. So what made 429 00:24:59,080 --> 00:25:01,160 Speaker 1: you want to be a part of that unit? 430 00:25:03,200 --> 00:25:08,520 Speaker 1: Man, that's such a big question. I definitely, 431 00:25:08,600 --> 00:25:10,800 Speaker 1: I don't know, just some kind of innate 432 00:25:11,000 --> 00:25:13,240 Speaker 1: draw to that type of thing and that 433 00:25:13,280 --> 00:25:17,240 Speaker 1: type of lifestyle and that type of military service. And 434 00:25:18,160 --> 00:25:22,200 Speaker 1: unfortunately, when I got back, nothing was a closed loop 435 00:25:22,240 --> 00:25:25,440 Speaker 1: at the time; there were no guarantees of anything. And after 436 00:25:25,480 --> 00:25:28,720 Speaker 1: that deployment, I was on the old Marine Corps recruiting 437 00:25:28,760 --> 00:25:32,040 Speaker 1: duty hit list, and was not down 438 00:25:32,080 --> 00:25:35,439 Speaker 1: for that. And as a communicator, as a platoon communicator, 439 00:25:35,480 --> 00:25:37,840 Speaker 1: I was also on the hit list to go back 440 00:25:37,880 --> 00:25:41,400 Speaker 1: to Comm Chief School with no guarantee whatsoever of coming 441 00:25:41,400 --> 00:25:44,320 Speaker 1: back to MARSOC. So at that time, I 442 00:25:44,400 --> 00:25:46,600 Speaker 1: decided to call it quits on the military. 443 00:25:47,359 --> 00:25:49,600 Speaker 1: Well, what was it like? I mean, so you 444 00:25:49,640 --> 00:25:51,960 Speaker 1: were like one of those initial guys helping them stand 445 00:25:52,040 --> 00:25:55,680 Speaker 1: up MARSOC. And I say this as somebody who, 446 00:25:55,880 --> 00:25:57,840 Speaker 1: well, I was one of the guys who helped stand up 447 00:25:57,880 --> 00:26:01,560 Speaker 1: a new Special Forces battalion, just as 448 00:26:01,600 --> 00:26:04,040 Speaker 1: a nug. I didn't do anything special, but 449 00:26:04,080 --> 00:26:05,720 Speaker 1: I was one of those initial guys.
I just know, 450 00:26:06,320 --> 00:26:08,000 Speaker 1: I mean, it was a big nutroll for us. I 451 00:26:08,040 --> 00:26:10,480 Speaker 1: just wonder if you could tell that experience of kind 452 00:26:10,480 --> 00:26:14,600 Speaker 1: of starting up the unit from scratch, right? Yeah, yeah. 453 00:26:14,640 --> 00:26:16,639 Speaker 1: Like I said, they took the bones of 454 00:26:16,680 --> 00:26:19,040 Speaker 1: what had been Fourth Platoon and said, you guys are 455 00:26:19,080 --> 00:26:22,359 Speaker 1: the main effort. And, you know, they grabbed a 456 00:26:22,359 --> 00:26:25,119 Speaker 1: bunch of bodies from other platoons, and we were kind of 457 00:26:25,119 --> 00:26:27,760 Speaker 1: a mishmash of people, and we were a much bigger platoon. 458 00:26:28,520 --> 00:26:32,400 Speaker 1: I think it was close to forty people, which, I mean, 459 00:26:32,400 --> 00:26:35,439 Speaker 1: is just evidence of what a growing 460 00:26:35,480 --> 00:26:38,880 Speaker 1: pain this was. Now platoons are structured much more like, 461 00:26:39,359 --> 00:26:41,560 Speaker 1: you know, or teams are structured much more like 462 00:26:41,680 --> 00:26:44,920 Speaker 1: ODAs, with three teams per company. But at the time, 463 00:26:44,960 --> 00:26:48,720 Speaker 1: we, I think, envisioned ourselves as kind of a 464 00:26:48,840 --> 00:26:52,600 Speaker 1: strike force, a heavy assault force kind of mentality. 465 00:26:53,200 --> 00:26:55,600 Speaker 1: So that's one big thing right out 466 00:26:55,600 --> 00:26:58,000 Speaker 1: of the gate that has changed massively since I left. 467 00:26:58,080 --> 00:27:01,560 Speaker 1: But no one knew where we were going. 468 00:27:01,600 --> 00:27:05,400 Speaker 1: That deployment, we started working up in like March 469 00:27:05,520 --> 00:27:09,080 Speaker 1: or April of two thousand and six.
With the intent to 470 00:27:09,119 --> 00:27:12,240 Speaker 1: deploy in early two thousand seven. And up to November, 471 00:27:12,320 --> 00:27:14,520 Speaker 1: December of that year, we still thought we were going to 472 00:27:14,560 --> 00:27:17,600 Speaker 1: Iraq, and then, last-minute change, we're going to Afghanistan. 473 00:27:18,200 --> 00:27:21,880 Speaker 1: So the uncertainty was a big piece. The hallmark 474 00:27:21,920 --> 00:27:24,440 Speaker 1: of that deployment, and as I talked about a lot 475 00:27:24,520 --> 00:27:29,159 Speaker 1: on Fred's show, was we did have pretty much unlimited 476 00:27:29,160 --> 00:27:34,119 Speaker 1: training support, man. That deployment cycle is kind 477 00:27:34,160 --> 00:27:37,320 Speaker 1: of infamous for just the amount of training we got. 478 00:27:37,359 --> 00:27:41,040 Speaker 1: And, you know, so typically there's a work hard, 479 00:27:41,119 --> 00:27:43,879 Speaker 1: play hard mentality, or there had been in the 480 00:27:43,920 --> 00:27:46,760 Speaker 1: Force Reconnaissance platoons, and this was very much a 481 00:27:46,800 --> 00:27:50,360 Speaker 1: work hard, work hard mentality through this entire workup. 482 00:27:50,440 --> 00:27:53,080 Speaker 1: And then, I don't think I want to get 483 00:27:53,080 --> 00:27:57,760 Speaker 1: into the details of that deployment itself too much. 484 00:27:57,920 --> 00:28:00,359 Speaker 1: I'd much rather let someone like Fred address that; he 485 00:28:00,440 --> 00:28:03,239 Speaker 1: dealt with it much more firsthand. But I would say 486 00:28:03,320 --> 00:28:06,280 Speaker 1: uncertainty is the watchword of that whole thing.
We've 487 00:28:06,320 --> 00:28:08,720 Speaker 1: had Fred on twice to discuss in depth, you know, 488 00:28:08,760 --> 00:28:16,119 Speaker 1: the anguish. Yeah. But yeah, a lot of people trying 489 00:28:16,160 --> 00:28:21,280 Speaker 1: to thrive in an environment of uncertainty, and, 490 00:28:22,480 --> 00:28:25,520 Speaker 1: competing, or perhaps not a competing goal with that, but 491 00:28:26,040 --> 00:28:28,640 Speaker 1: concurrent with that, or comorbid with that, there's also 492 00:28:28,680 --> 00:28:30,200 Speaker 1: a bunch of people that want to go out the 493 00:28:30,240 --> 00:28:32,720 Speaker 1: door and, you know, make a good 494 00:28:32,720 --> 00:28:35,879 Speaker 1: impression on the SOF community or on the military at large, 495 00:28:35,960 --> 00:28:40,320 Speaker 1: and do an awesome job. And for better 496 00:28:40,400 --> 00:28:42,960 Speaker 1: or worse, what happened to us happened to us, I guess. Well, 497 00:28:43,080 --> 00:28:45,880 Speaker 1: what I'm wondering as well is, since MARSOC is 498 00:28:45,920 --> 00:28:49,120 Speaker 1: so new to the community, I mean, people like Jack, 499 00:28:49,280 --> 00:28:51,040 Speaker 1: you know, grew up as kids saying, I want 500 00:28:51,040 --> 00:28:52,880 Speaker 1: to be an Army Ranger, or people saying, I want 501 00:28:52,880 --> 00:28:54,920 Speaker 1: to be a Navy SEAL. Did you want to go 502 00:28:54,960 --> 00:28:57,400 Speaker 1: into special operations, and this is just what you 503 00:28:57,400 --> 00:29:01,960 Speaker 1: fell into? Yeah, yeah, you're absolutely right. So I went 504 00:29:01,960 --> 00:29:04,840 Speaker 1: to the recruiter, the Army recruiter, and said, hey man, 505 00:29:04,920 --> 00:29:06,600 Speaker 1: I want to be a Ranger. He's like, yeah, we 506 00:29:06,600 --> 00:29:08,560 Speaker 1: can definitely make that happen.
It'll be a couple of 507 00:29:08,600 --> 00:29:10,880 Speaker 1: weeks before we get you out of the door. So 508 00:29:10,880 --> 00:29:12,680 Speaker 1: I was like, all right, I'll be right back. So I 509 00:29:12,720 --> 00:29:16,080 Speaker 1: walked over to the Marine recruiter's office, and he said, yeah man, 510 00:29:16,280 --> 00:29:17,800 Speaker 1: if you just want to be a Marine, we can 511 00:29:17,800 --> 00:29:19,800 Speaker 1: make that happen, like, in a couple of days. So 512 00:29:20,040 --> 00:29:22,720 Speaker 1: I think, like, two days later, 513 00:29:22,760 --> 00:29:26,960 Speaker 1: I was showing up at Parris Island. I made a 514 00:29:27,080 --> 00:29:30,760 Speaker 1: very snap, probably not the wisest, decision, in all honesty, 515 00:29:30,840 --> 00:29:34,320 Speaker 1: but it worked out great for me. But I would 516 00:29:34,440 --> 00:29:37,360 Speaker 1: definitely, if I were mentoring someone going into the military, 517 00:29:37,440 --> 00:29:41,160 Speaker 1: say, take a wiser and more cautious approach to 518 00:29:41,720 --> 00:29:44,360 Speaker 1: how you actually do that. Well, and now, I mean, 519 00:29:44,400 --> 00:29:47,719 Speaker 1: MARSOC has a pretty well-established pipeline and everything like that. 520 00:29:47,760 --> 00:29:50,120 Speaker 1: I mean, it's much more formalized for, you know, let's 521 00:29:50,120 --> 00:29:52,480 Speaker 1: say, a young Marine who wants to go down that 522 00:29:52,600 --> 00:29:56,760 Speaker 1: road. Agreed, and I got a chance to witness that firsthand. 523 00:29:56,880 --> 00:29:58,880 Speaker 1: A couple of years after I got out, I went 524 00:29:58,960 --> 00:30:02,120 Speaker 1: back to the Special Operations Schoolhouse as a full-time- 525 00:30:02,160 --> 00:30:04,960 Speaker 1: equivalent contractor.
I worked there for almost five years, and 526 00:30:05,000 --> 00:30:07,840 Speaker 1: I got to deal with ITC specifically quite 527 00:30:07,880 --> 00:30:10,440 Speaker 1: a bit. I didn't deal so much with the 528 00:30:10,680 --> 00:30:13,840 Speaker 1: preselection and selection events, but I did deal with 529 00:30:13,920 --> 00:30:15,960 Speaker 1: ITC a lot. And, you know, even in 530 00:30:17,240 --> 00:30:21,120 Speaker 1: eleven years' time, they've developed quite a bit of institutional 531 00:30:21,200 --> 00:30:24,400 Speaker 1: maturity relative to what they had when I was still 532 00:30:24,400 --> 00:30:27,080 Speaker 1: in the Marine Corps, and there is a much more 533 00:30:27,240 --> 00:30:31,960 Speaker 1: formalized pathway of "this is where you go." For, 534 00:30:32,240 --> 00:30:35,160 Speaker 1: let's say, a young person thinking about joining the Marines, 535 00:30:35,240 --> 00:30:36,719 Speaker 1: I mean, could you walk us through a little bit 536 00:30:36,720 --> 00:30:39,920 Speaker 1: about what the pipeline looks like? Because, I mean, God knows 537 00:30:39,960 --> 00:30:44,440 Speaker 1: we've probably discussed the Special Forces Qualification Course and BUD/S 538 00:30:44,480 --> 00:30:46,040 Speaker 1: to the point that people are just sick to their 539 00:30:46,080 --> 00:30:49,120 Speaker 1: stomach with it, but we haven't really talked too much 540 00:30:49,160 --> 00:30:53,920 Speaker 1: about the MARSOC end of it. Yeah, and 541 00:30:54,000 --> 00:30:57,200 Speaker 1: again, this may have changed. I left the schoolhouse 542 00:30:57,320 --> 00:31:01,400 Speaker 1: a while back, so things may have changed slightly, but the 543 00:31:01,520 --> 00:31:04,800 Speaker 1: general picture of it should look similar.
You have to 544 00:31:04,840 --> 00:31:07,960 Speaker 1: be in the Marine Corps to apply, and MARSOC, 545 00:31:08,080 --> 00:31:10,719 Speaker 1: I don't think, has ever met its manning goals, so 546 00:31:10,880 --> 00:31:15,120 Speaker 1: recruitment is constant and there are always spaces to fill. So, 547 00:31:16,000 --> 00:31:18,400 Speaker 1: I'm not sure how it is in other branches, 548 00:31:18,800 --> 00:31:20,920 Speaker 1: but I kind of have my biases 549 00:31:20,920 --> 00:31:23,320 Speaker 1: and my preconceived notions, and I feel like in other 550 00:31:23,360 --> 00:31:25,400 Speaker 1: branches it's easier: if you say, I want to be 551 00:31:25,440 --> 00:31:28,600 Speaker 1: a SEAL, or I want to be a Special Forces soldier, 552 00:31:29,440 --> 00:31:31,840 Speaker 1: it's maybe easier to get that opportunity to go to selection. 553 00:31:31,880 --> 00:31:34,320 Speaker 1: The Marine Corps is very jealous of its talent pool, 554 00:31:34,840 --> 00:31:38,840 Speaker 1: and it's kind of infamous for infantry platoon commanders 555 00:31:38,880 --> 00:31:41,280 Speaker 1: not letting their best guy go to a sniper school 556 00:31:41,280 --> 00:31:43,400 Speaker 1: selection or whatever, because they don't want to lose that guy. 557 00:31:43,480 --> 00:31:48,840 Speaker 1: But basically, you have your opportunity to attend selection, 558 00:31:49,360 --> 00:31:53,760 Speaker 1: and I can't speak about that too much. I 559 00:31:54,000 --> 00:31:56,560 Speaker 1: do have decent insight into the process. It's a lot 560 00:31:56,600 --> 00:31:59,840 Speaker 1: of rucking and land navigation and that sort of thing. 561 00:31:59,840 --> 00:32:03,680 Speaker 1: And if you're selected, you get orders to MARSOC.
562 00:32:04,240 --> 00:32:09,440 Speaker 1: There is a pre-assessment, basically a conditioning 563 00:32:09,440 --> 00:32:11,440 Speaker 1: phase, where you're in a platoon for 564 00:32:11,480 --> 00:32:16,200 Speaker 1: a few weeks, getting a lot of PT, a 565 00:32:16,200 --> 00:32:18,080 Speaker 1: lot of time in the pool, getting ready to go 566 00:32:18,120 --> 00:32:19,800 Speaker 1: to ITC. And then ITC is 567 00:32:19,840 --> 00:32:23,840 Speaker 1: the actual 0372 MOS-producing school, and 568 00:32:23,920 --> 00:32:26,200 Speaker 1: at the time ITC was, I'm going to say, 569 00:32:26,240 --> 00:32:31,520 Speaker 1: about nine months long. And it starts out with basic skills, 570 00:32:31,600 --> 00:32:34,840 Speaker 1: which is small unit tactics kind of thing. 571 00:32:35,000 --> 00:32:38,280 Speaker 1: I think this has been trimmed down a 572 00:32:38,280 --> 00:32:41,200 Speaker 1: little bit, but there was a fairly in-depth, 573 00:32:41,440 --> 00:32:44,920 Speaker 1: relative to most of the other services, CQB package. One 574 00:32:44,960 --> 00:32:48,120 Speaker 1: thing that has been increased quite a bit and amplified 575 00:32:48,200 --> 00:32:51,240 Speaker 1: quite a bit is a special reconnaissance phase of that 576 00:32:51,320 --> 00:32:56,040 Speaker 1: course that's several weeks long, a FID/unconventional warfare phase, 577 00:32:56,440 --> 00:32:59,600 Speaker 1: and then it culminates with a big exercise down in 578 00:33:00,080 --> 00:33:05,240 Speaker 1: South Carolina. I'm definitely leaving a couple of things out.
579 00:33:05,280 --> 00:33:08,760 Speaker 1: There's Raider Spirit, which is a big, you know, 580 00:33:08,800 --> 00:33:12,360 Speaker 1: typical military nutroll exercise where you're not sleeping for 581 00:33:12,400 --> 00:33:14,560 Speaker 1: two weeks and rucking around all over the place and 582 00:33:15,360 --> 00:33:17,160 Speaker 1: that sort of thing. But that's kind of the general 583 00:33:17,240 --> 00:33:20,240 Speaker 1: gist of it. And once you graduate ITC, 584 00:33:21,320 --> 00:33:24,480 Speaker 1: you'll either be given orders to a language school or 585 00:33:24,480 --> 00:33:26,720 Speaker 1: some sort of follow-on school, but most likely you're 586 00:33:26,720 --> 00:33:29,120 Speaker 1: going to go to a battalion, get dumped into 587 00:33:29,160 --> 00:33:32,640 Speaker 1: a company, and pick up with your team. I'm wondering 588 00:33:32,680 --> 00:33:36,719 Speaker 1: if there are any particulars in personality or anything like 589 00:33:36,760 --> 00:33:39,719 Speaker 1: that that you think will thrive in MARSOC. I mean, 590 00:33:39,800 --> 00:33:41,760 Speaker 1: Jack would correct me if I'm wrong here, but I 591 00:33:41,800 --> 00:33:44,840 Speaker 1: feel like when I meet Special Forces guys who come 592 00:33:44,840 --> 00:33:48,360 Speaker 1: on the podcast, they're guys who are typically interested in 593 00:33:48,440 --> 00:33:52,520 Speaker 1: learning other languages; they're interested in embedding in other cultures, more 594 00:33:52,640 --> 00:33:56,960 Speaker 1: so than other special operations branches, just from meeting guys. 595 00:33:57,520 --> 00:33:59,160 Speaker 1: I don't know if you think it's a 596 00:33:59,160 --> 00:34:01,840 Speaker 1: fair assessment.
I mean, each of these units has its 597 00:34:01,880 --> 00:34:03,840 Speaker 1: own kind of culture. I mean, we 598 00:34:03,920 --> 00:34:05,840 Speaker 1: have so much in common with one another, but they 599 00:34:05,840 --> 00:34:08,520 Speaker 1: do also, I mean, have their own unique unit culture. 600 00:34:08,640 --> 00:34:10,799 Speaker 1: So yeah, if you want to comment on what 601 00:34:10,840 --> 00:34:14,160 Speaker 1: that MARSOC individual is like and what that unit culture 602 00:34:14,200 --> 00:34:17,360 Speaker 1: is like, that'd be awesome. Man, I would be afraid 603 00:34:17,400 --> 00:34:22,400 Speaker 1: to put that into too tight a box. 604 00:34:22,440 --> 00:34:25,000 Speaker 1: I think there's a lot of differences 605 00:34:25,520 --> 00:34:28,360 Speaker 1: in, you know, kind of attitudes and cultures across the 606 00:34:28,400 --> 00:34:31,120 Speaker 1: different SOF branches. But I also think there's a lot 607 00:34:31,120 --> 00:34:34,600 Speaker 1: of common threads through all those: adventure-seeking type personalities 608 00:34:34,680 --> 00:34:38,759 Speaker 1: and people that are willing to push 609 00:34:38,840 --> 00:34:43,680 Speaker 1: themselves, and that sort of thing. I 610 00:34:43,680 --> 00:34:45,319 Speaker 1: don't know, I guess I'd prefer to focus on that 611 00:34:45,400 --> 00:34:48,319 Speaker 1: rather than the differences. And largely that's because I had 612 00:34:48,440 --> 00:34:51,879 Speaker 1: very limited insight into MARSOC as a whole 613 00:34:51,960 --> 00:34:54,440 Speaker 1: when I was at the schoolhouse. Although I helped out 614 00:34:54,440 --> 00:34:58,560 Speaker 1: with ITC occasionally as needed, primarily I taught for 615 00:34:58,800 --> 00:35:01,640 Speaker 1: one of the sensitive activities courses that we held there.
616 00:35:02,239 --> 00:35:06,400 Speaker 1: And, well, actually the course runs there, so typically 617 00:35:06,680 --> 00:35:09,839 Speaker 1: the people that were coming to the course were E-5s, 618 00:35:10,480 --> 00:35:13,040 Speaker 1: E-6s, and E-7s with four or five 619 00:35:13,080 --> 00:35:16,439 Speaker 1: deployments under their belt, and I saw a very 620 00:35:16,520 --> 00:35:22,600 Speaker 1: limited cross-section of the people that actually staff MARSOC. Interesting. Okay. Well, 621 00:35:22,800 --> 00:35:25,879 Speaker 1: I know that your background today, and what you do 622 00:35:26,400 --> 00:35:29,320 Speaker 1: podcasts on and what you've been writing several books about, 623 00:35:29,480 --> 00:35:32,600 Speaker 1: is really more in the realm of privacy and security. 624 00:35:33,080 --> 00:35:37,480 Speaker 1: So let's get into how you transitioned into that world. Okay. 625 00:35:37,560 --> 00:35:40,600 Speaker 1: So immediately after I got out of the military, I 626 00:35:40,680 --> 00:35:43,359 Speaker 1: was unemployed for less than, like, a month. I got 627 00:35:43,440 --> 00:35:47,600 Speaker 1: picked up on a contract working overseas, and a couple 628 00:35:47,600 --> 00:35:49,920 Speaker 1: of things happened during that. I had the opportunity to 629 00:35:50,000 --> 00:35:53,840 Speaker 1: interact with a much more joint kind of environment, 630 00:35:53,880 --> 00:35:57,840 Speaker 1: with other government agencies, and I had the opportunity to 631 00:35:58,000 --> 00:36:00,040 Speaker 1: rub up against some people that had some skills that 632 00:36:00,320 --> 00:36:02,439 Speaker 1: I didn't, and some things that I became interested in. 633 00:36:02,920 --> 00:36:05,520 Speaker 1: I also had my identity stolen during one of those trips, 634 00:36:05,800 --> 00:36:08,759 Speaker 1: which was not a pleasant experience to deal with.
And 635 00:36:08,800 --> 00:36:11,120 Speaker 1: then, after I finished that up, 636 00:36:11,160 --> 00:36:14,280 Speaker 1: I went to work at the Schoolhouse. The course that I 637 00:36:14,400 --> 00:36:16,799 Speaker 1: ended up teaching there for most of my time at 638 00:36:16,840 --> 00:36:21,320 Speaker 1: the Schoolhouse involved driving around in civilian clothes, civilian cars, 639 00:36:21,360 --> 00:36:24,399 Speaker 1: long hair, all that good stuff, and everything we did 640 00:36:24,560 --> 00:36:28,800 Speaker 1: involved a computer. And the course director there was forward- 641 00:36:28,800 --> 00:36:32,720 Speaker 1: thinking enough to realize, that's probably a problem. That's probably 642 00:36:32,719 --> 00:36:35,320 Speaker 1: not a great idea: to have all this operational data, 643 00:36:35,360 --> 00:36:38,200 Speaker 1: all this historical operational data and future plans 644 00:36:38,200 --> 00:36:42,640 Speaker 1: and everything, on a computer that's leaving the FOB, or 645 00:36:42,680 --> 00:36:45,200 Speaker 1: going in front of the hard line, or leaving the embassy, 646 00:36:45,360 --> 00:36:47,239 Speaker 1: or, you know, whatever model you want to put that 647 00:36:47,280 --> 00:36:50,880 Speaker 1: into, without some sort of protection on it. So I 648 00:36:50,960 --> 00:36:53,440 Speaker 1: kind of volunteered: hey, this is the thing I'm 649 00:36:53,520 --> 00:36:56,400 Speaker 1: kind of interested in, so I'll take the reins on this. 650 00:36:56,480 --> 00:36:59,120 Speaker 1: And I got the opportunity to go to some courses related 651 00:36:59,160 --> 00:37:02,760 Speaker 1: to things like that, and it became a massive area 652 00:37:02,760 --> 00:37:06,080 Speaker 1: of personal interest and really just a huge hobby that 653 00:37:06,200 --> 00:37:10,960 Speaker 1: I turned into a job, for better or worse.
It 654 00:37:11,040 --> 00:37:15,319 Speaker 1: sounds like Mark Giaconia going and teaching himself coding and 655 00:37:15,360 --> 00:37:17,600 Speaker 1: programming in his spare time. He was one of our 656 00:37:18,080 --> 00:37:20,720 Speaker 1: guests a few episodes back, who's an Eighteen 657 00:37:20,760 --> 00:37:24,319 Speaker 1: Bravo, and very similar to your 658 00:37:24,320 --> 00:37:26,319 Speaker 1: story, actually, as far as how he got into the 659 00:37:26,360 --> 00:37:31,960 Speaker 1: tech side of things. Yeah, that's awesome, actually. 660 00:37:32,320 --> 00:37:36,239 Speaker 1: I've listened to that episode kind of recently. Yes, 661 00:37:36,320 --> 00:37:40,160 Speaker 1: it's a recent show. So yeah. But yeah, I 662 00:37:40,600 --> 00:37:42,919 Speaker 1: basically just picked that up and ran with it, and, 663 00:37:43,520 --> 00:37:45,799 Speaker 1: kind of out of the blue, with way more 664 00:37:45,840 --> 00:37:48,680 Speaker 1: confidence than I should have had, I'm like, you know what, 665 00:37:48,680 --> 00:37:50,680 Speaker 1: I'm going to write a book on this. I wrote 666 00:37:50,719 --> 00:37:53,400 Speaker 1: a couple of books on that several years ago that 667 00:37:53,520 --> 00:37:56,600 Speaker 1: I would be really embarrassed if anybody could actually purchase now. 668 00:37:56,760 --> 00:38:00,560 Speaker 1: But I had the opportunity, through the 669 00:38:00,600 --> 00:38:04,360 Speaker 1: Schoolhouse, through training and instructor development training we booked there, 670 00:38:04,440 --> 00:38:06,840 Speaker 1: to meet the guy that eventually became my co-author 671 00:38:06,880 --> 00:38:10,040 Speaker 1: and co-host on The Complete Privacy and Security Podcast.
We wrote this huge five-hundred-page book called The Complete Privacy and Security Desk Reference, which in all honesty is a little bit dated now. That came out in twenty sixteen, and we've since updated it with a second edition. And then I recently wrote ComSec, and I work for a company in the kind of cyber domain that does a lot of this kind of non-traditional communication instruction and infrastructure support to DoD and the intelligence community.

I feel like with all the stuff going on with social media lately, it really couldn't be a more relevant topic. And I'd actually love to get into something you posted, Jack, on Twitter recently.
I'd love to hear Justin's take on how a few of these phones now unlock via facial recognition, or any biometrics, and how easy it is for a court to compel or coerce you into unlocking your phone or device if you use biometrics like a thumbprint or a face scan. Because there are some constitutional protections on, let's say, your passcode, because that's knowledge that you have, but your face does not constitute knowledge in the eyes of the law. But anyway, yeah, Justin, if you want to go more in depth on that topic, that'd be awesome.

Yeah, I would love to, and you may have to cut me off here because I can definitely go for a minute. So there are several problems inherent with this.
First and foremost, I look at this through most of my customer base, which is the IC and DoD. If you're traveling overseas, you may or may not have the choice to provide your face, and especially military members, who statistically stand a very high risk of being killed or captured, that sort of thing, where your face exists and can be used to unlock that device, especially in an operational context, which could be gravely damaging to US and Western interests. There are much more mundane problems with this in the civilian space. Like you mentioned, you may or may not be compelled to provide some knowledge-based factor to unlock that phone, but your face is public information, because anyone can see it every time you walk out in public. It's not considered secret, so it's much more readily available for the courts to demand that you provide it, or for someone to just hold your phone in front of your face. If anyone has these phones, there is the opportunity to opt out of that and use a passcode instead, which I do recommend.
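To put rough numbers behind the passcode recommendation: Apple's platform security documentation describes an intentionally slow key-derivation step of roughly 80 ms per passcode guess, which is what makes even a short passcode expensive to brute-force on-device. A back-of-the-envelope sketch; the keyspaces and the flat one-guess-per-80-ms rate are illustrative assumptions, and real devices add escalating lockout delays on top:

```python
# Rough worst-case brute-force time for device passcodes, assuming an
# attacker must pay the Secure Enclave's key-derivation cost (~80 ms)
# on every guess. Illustrative only: real devices also enforce
# escalating lockout delays between failed attempts.

SECONDS_PER_GUESS = 0.08

def worst_case_days(keyspace: int) -> float:
    """Days needed to exhaust the full keyspace at one guess per 80 ms."""
    return keyspace * SECONDS_PER_GUESS / 86_400

for label, keyspace in [
    ("4-digit PIN", 10**4),
    ("6-digit PIN", 10**6),
    ("8-char lowercase passphrase", 26**8),
]:
    print(f"{label}: {worst_case_days(keyspace):,.1f} days worst case")
```

Even under these generous assumptions, a six-digit PIN already costs about a day of uninterrupted guessing, and a modest alphanumeric passcode pushes the worst case into centuries.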
I don't have anything that takes facial recognition yet, but I would still choose to use a password. The problem with that is Apple, who, and I support a lot of what they say, they're a little bit on their high horse about protecting consumer security and privacy, but they're making things like this more and more accessible, and they're making them the default. And I don't imagine there are very many consumers who are going to accept the convenience penalty of not using Face ID to open their phones. And Apple is not the only one, but I'll pick on them, since they are a little bit in their ivory tower of judgment about privacy and security, pointing the finger at everyone else. I think if that's what Apple truly had in their best interest, it would be an option deep inside the settings that you could unlock if you really wanted to, instead of it being the default. And I think there are some systemic problems in just the gross public insecurity of accounts and devices and everything else.

I just want to jump in there on one thing.
I do remember when Apple first announced the facial recognition stuff to unlock a phone, it was almost like a selling point. It was in the commercials that they ran on, you know, TV and online. And I think especially young people were like, oh my god, that's so cool, it can recognize my face. It wasn't seen as a negative for privacy.

There's no question that it was a selling point. And I'm sure it was seen as a negative for privacy by some crazy people like me, and I'll definitely raise my hand in the who's-the-paranoid-person-with-the-tinfoil-hat-in-the-room contest, but also by some forward-thinking individuals. I'm far from the only category of person that thinks about this type of thing. But generally, across the board, in public, I don't think people care. I don't think people really understand the risk, and I think it's a much bigger systemic risk than people realize. It's probably not going to impact any one individual, but if it does impact that one individual, then the statistics don't matter.
But I also think, like I hinted at earlier, it creates much bigger systemic problems. You know, Apple is very vocal about the idea that we can't unlock phones, that law enforcement can bring us these phones and there's no way we can unlock them, and then they roll out this, which completely short-circuits that whole process. So it just seems terribly disingenuous to me. And to caveat this even a little further, law enforcement is not typically in the threat model that I talk about. Almost every single person I teach is affiliated with the DoD, a federal law enforcement agency, or the intelligence community in some capacity, so law enforcement is not their chief concern. But if an exploit is available to law enforcement, it's also available to everyone else. So I do think this is a big concern, even though law enforcement is not the chief threat actor that I'm worried about. And additionally, this pertains more to, you know, guys who are soldiers, people who are in the intelligence community, and so on.
But those biometric systems can be spoofed. I mean, you can go into someone's hotel room, lift their fingerprints off the dresser, transfer the fingerprint, and use it to unlock these devices. These techniques exist. So it's just another way that, really, you do not need to be using those biometric locks on your phone. And another problem with biometric tokens is that they're immensely complex, much more complex than any passcode you and I could remember to unlock our phones. The problem is, once those tokens are stolen or lost or spilled in some way, there's no way to change them. So with a thing like the OPM breach, where fingerprint cards might have been accessed, your fingerprint is no longer a valid or safe authentication token, because now it's out in the open world. And your phone, just the form factor of the phone itself: it's got this beautiful tempered glass screen that picks up fingerprints really, really well, on the exact device you're trying to protect with a fingerprint.
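The point about biometric tokens being unchangeable once spilled can be made concrete with a sketch. The credential model below is entirely hypothetical, not any real authentication API; it just shows why a leaked passcode is a recoverable event while a leaked fingerprint template is not:

```python
import hashlib

# Sketch: a passcode is a revocable secret, a biometric template is not.
# Hypothetical model for illustration, not a real authentication API.

class Credential:
    def __init__(self, secret: str, revocable: bool):
        self.digest = hashlib.sha256(secret.encode()).hexdigest()
        self.revocable = revocable

    def rotate(self, new_secret: str) -> bool:
        """After a breach, swap in a new secret, if the factor allows it."""
        if not self.revocable:
            return False  # you only get one set of fingerprints
        self.digest = hashlib.sha256(new_secret.encode()).hexdigest()
        return True

passcode = Credential("514-270-981", revocable=True)
fingerprint = Credential("minutiae-template-bytes", revocable=False)

assert passcode.rotate("a-new-code")   # breach contained: old digest useless
assert not fingerprint.rotate("n/a")   # leaked forever (cf. the OPM breach)
```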
It's kind of tantamount to leaving a Post-it note on your computer monitor with your username and password on it.

Interesting, that is true. What about the benefit in, let's say, the San Bernardino shooter case, where they were trying to break into this guy's iPhone? Had he had facial recognition on there, you know, they wouldn't have cared at that point if they'd flashed the phone in front of the dead guy's face and unlocked it.

Oh man, you've opened a door that I didn't want to open. I'd definitely be glad to rant about this. And first let me caveat that I'm not anti law enforcement. Like, three weeks ago I spoke at a conference at DEA headquarters. I'm not anti law enforcement at all, but I do think there's a system of meaningful, powerful checks and balances for a reason. But essentially, I look at that as political maneuvering on the part of the FBI. So every piece of information that could have been obtained from that phone could have probably been obtained somewhere else.
811 00:46:12,840 --> 00:46:15,480 Speaker 1: For example, to see who this individual had called from 812 00:46:15,480 --> 00:46:17,840 Speaker 1: that device, they probably could have just looked at his 813 00:46:17,880 --> 00:46:22,120 Speaker 1: phone record to see who he actually called to. This 814 00:46:22,200 --> 00:46:26,680 Speaker 1: individual also took the time to destroy several other electronic devices, 815 00:46:27,440 --> 00:46:29,680 Speaker 1: basically throwing them a fire and burning them. A couple 816 00:46:29,719 --> 00:46:31,840 Speaker 1: of laptops and a couple of other phones, So this 817 00:46:31,880 --> 00:46:35,840 Speaker 1: one probably wasn't the super high yield device that it 818 00:46:35,960 --> 00:46:39,600 Speaker 1: was proposed to be in the media. And I forget 819 00:46:39,640 --> 00:46:41,759 Speaker 1: the specifics because it's been a little while since I've 820 00:46:41,760 --> 00:46:43,960 Speaker 1: talked about this, but for a brief period about twenty 821 00:46:44,000 --> 00:46:46,720 Speaker 1: four hours after the shooting, the FBI could have gotten 822 00:46:46,719 --> 00:46:50,320 Speaker 1: access to this individual's iCloud account, and either someone dropped 823 00:46:50,320 --> 00:46:52,920 Speaker 1: the ball or someone chose not to. There's definitely some 824 00:46:52,960 --> 00:46:56,279 Speaker 1: finger pointing back and forth about what exactly transpired, but 825 00:46:56,320 --> 00:46:59,560 Speaker 1: they lost that access and the password was changed or 826 00:46:59,640 --> 00:47:02,520 Speaker 1: something that is not you're not going to read a 827 00:47:02,560 --> 00:47:06,239 Speaker 1: headline about in the New York Times. But there were 828 00:47:06,280 --> 00:47:09,239 Speaker 1: ways to access that information without having to get into 829 00:47:09,320 --> 00:47:13,759 Speaker 1: that phone. And James Comey, uh, well, I guess he's 830 00:47:13,760 --> 00:47:17,360 Speaker 1: out now. 
But, well, James Comey, as head of the FBI, said several times, we have sixty phones that we can't unlock. And that's a powerful number. It's a big headline to put in a newspaper or on CNN, but it's really meaningless, because what we should be asking is: well, how many crimes have you not solved because you can't get into these phones? You probably solved most of those through some other methodology. I don't think the quote-unquote going dark problem is as big as it's purported to be. But maybe that's just me.

Yeah. I remember I had a conversation with a friend of mine who, you know, was working for an intelligence service, and he was telling me, when we pull iPhones off a target, be it in, say, Yemen or something like that, we'll pull like three or four iPhones off target and take them to our people, and they'll be like, okay, choose one. And we're like, what are you talking about? And they're like, we don't have the resources to crack all four of these phones, so choose the one you want, the one you really want.
And so they do have methodologies to break into the iPhone, but apparently it's not easy.

Yeah, iPhones are generally considered to be far more secure than Android devices, and take this for what it's worth; I don't have a dog in the fight. I have an iPhone, but I fully comprehend and understand that it is the best of a bad situation in a binary marketplace where I have a choice of one of two device types. The iPhone, against, I guess, almost every practical threat actor, against malware, against attacks on the data at rest on that device, is going to be vastly superior to Android devices, generally speaking, though not a hundred percent of the time, because Android devices vary so widely. But yeah, iPhones are very hard to get into. But I also have no illusions that a sufficiently sophisticated adversary with enough time and, most importantly, money, and I guess thirdly, focus on me specifically, could get into my phone if they really wanted to.
Yeah, well, I mean, there are so many workarounds and things like that. And a lot of times in this conversation, what gets lost is, I mean, if you're really dealing with a bad actor, they're basically going to be like, okay, give me your passcode or I'm going to hit you in the head with this monkey wrench, you know.

Yeah, absolutely. Rubber-hose decryption is, like, the industry punchline for that. But yeah, you're absolutely right. And if it comes to a situation like that, every single one of us has our breaking point, our point where it's like, okay, have the damn password, I don't care.

So tell us about the latest book, ComSec, and what you cover in there, for people who are enjoying this interview and may be interested in picking it up.

Okay. As further evidence that I am not anti law enforcement, I wrote this in conjunction with a full-time law enforcement officer who also has a very avid interest in privacy and security. And I'll give credit where credit is due: this was his idea.
He's like, hey man, I've got an idea for this book, I can't write all of it myself, are you interested in collaborating on this? And I was like, yeah, definitely. So basically we spend the entire first chapter talking about the insecurity of smartphones, or, I guess more accurately, the lack of privacy inherent in smartphones. There's a distinction between security and privacy, and everybody tends to lump those into the same category. But your iPhone, or even your Android phone, if it's modern, running the latest version of Android, updated, and not filled with bloatware applications, is probably vastly more secure than you could ever make your Windows, Mac, or, I'll say it, even Linux computer. Smartphones are much smaller operating systems. They generally have just much tighter code, and they're just smaller. There's less chance of some shoddy code being in the operating system of that device.
Privacy, though, is where they're a disaster, because basically what we've married up is the really old technology of a mobile phone, which has to run basically a second operating system on a second processor that you can't even see, and that's the GSM modem that communicates with the cellular network. And because the firmware for that modem has to be approved by the FCC before it's released, there's this super-long update cycle, which means it's almost constantly out of date and there are almost constantly valid attack vectors against it. And because it touches that network, anyone who can maliciously access that network can access any device they choose to access. So what we've coupled that with is a device with at least two cameras in it, in any modern smartphone, a device with up to four or five, maybe six microphones. Any modern iPhone has four completely discrete microphones in it, an accelerometer, and a GPS.
So basically, if I get backdoor access to that insecure processor in your iPhone, which is basically only there to let you communicate with the cellular network, I can potentially access your camera and your microphones. I can see your location data. I can access your accelerometer data and see if you're driving down the road, if you're walking down the street, if your phone is sitting on a table, if it's in portrait or landscape mode. And that is a pretty serious potential privacy invasion, especially for law enforcement officers working against very well-funded, sophisticated actors, for intelligence personnel, for military personnel abroad. That pretty much gives away everything you need to mount a successful attack against any of those parties. And Drew and I just take it a little further and provide that information. And it's not like there's a government source of information for intelligence people. This is still a very budding field.
But we wanted to make that available to anyone who said, yeah, I really want to understand the risk, and kind of choose my own adventure as to how far into this rabbit hole I actually want to go.

Yeah, it sounds like, well, it's an uphill battle trying to educate the public, of course, but I think it's something that people are going to be increasingly interested in, you know, protecting their data, especially younger generations who are going to grow up with this technology.

Yeah, and that is an encouraging phenomenon that I have noticed recently. There have been indicators that people are starting to care a little bit more. For the first time in years, fewer people have actually joined Facebook, and a pretty substantial percentage of people have left Facebook. Not substantial to Facebook, but as a totality of humanity, a large number of people have left. People are starting to ask about things like privacy policies.
Congress is holding hearings with the big FAANG corporations, Facebook, Apple, Amazon, Netflix, and Google, to ask about their privacy policies and how they're protecting data. So yeah, there are some slightly encouraging signs on the horizon that maybe there is a sea change, and we're in its very, very early stages here.

Yeah, I mean, it's hard to tell what is just bluff and bluster, but there was that story that the Trump administration was considering antitrust investigations against, you know, Google and Facebook and some of these big corporations.

Yeah, and, uh, you know, I'd be afraid to comment on that. That's a little outside my range of expertise, but we've seen hints of that in the EU. I use an email service called ProtonMail, a really easy, out-of-the-box encrypted email service, which sued Google a year or so ago because Google had suppressed Proton's ability to rank in search results for searches for things like secure email or encrypted email.
And Google also took a one-point-three-billion-dollar judgment in the EU for that sort of thing. Very interesting.

Well, you know what I was going to ask you about as well: beyond the actual companies, some of the app developers seem to have, you know, breach-of-privacy issues. I've even heard about, for example, Bible-reading apps that have access to your microphone, and people have said, why should an app for someone who wants to read verses from the Bible have access to your microphone? What are they using it for? And there seem to be plenty of examples of that type of thing.

Absolutely, there's no question about it. And I think the most notorious example is the quote-unquote flashlight app. Before phones had a flash on the camera that could be used as a flashlight, it would basically make your screen white at the brightest intensity, so you could use your screen as a flashlight.
And by choosing to install that app, you also said, yes, you can access my contacts, my location information, my microphone, my camera, my camera roll. And that's the historical, really traditional mission creep of apps like that, which ask for superfluous permissions basically so they can capture that data and sell it. And with things like the latest Android rollout, Android P, which I don't remember the candy name of, Google actually enhanced what app developers can get access to. So now they can get access to things like Wi-Fi round-trip time, Wi-Fi RTT, which is even more invasive, because that actually lets them measure the distance you are from the router at any given time. And if you really stop and think about that, that router is blasting out a bubble of, you know, a certain diameter from that access point, and if an app can constantly measure your distance from it, it can basically map your entire house. It can map your office, just based on that.
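The round-trip-time measurement described here reduces to simple physics: radio travels at the speed of light, so the one-way distance is the speed of light times the round-trip time, divided by two. That is the principle behind 802.11mc fine timing measurement, which backs Android's Wi-Fi RTT API. A small sketch of the calculation; the sample RTT value is invented for illustration:

```python
# Estimate distance from a Wi-Fi access point given a measured
# round-trip time, the principle behind 802.11mc fine timing
# measurement (Android Wi-Fi RTT). Sample RTT value is invented.

C = 299_792_458  # speed of light in m/s

def distance_m(rtt_ns: float) -> float:
    """One-way distance: the signal covers the path twice per round trip."""
    return C * (rtt_ns * 1e-9) / 2

# A phone about 5 m from the router sees a round trip of roughly 33 ns:
print(f"{distance_m(33.4):.2f} m")  # prints 5.01 m
```

A stream of measurements like this from a fixed access point is exactly what lets an app infer movement and room geometry, which is the mapping concern raised above.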
And that is a permission 1006 00:57:27,920 --> 00:57:31,000 Speaker 1: that's by default granted to app developers if they choose 1007 00:57:31,000 --> 00:57:33,240 Speaker 1: to access that information, and my guess is that many will. 1008 00:57:33,760 --> 00:57:36,280 Speaker 1: Roomba is another example of this. Roomba is made by 1009 00:57:36,400 --> 00:57:39,640 Speaker 1: iRobot, and it does map out your house, and iRobot has talked about 1010 00:57:39,680 --> 00:57:43,439 Speaker 1: sharing that map data with companies like Amazon. I could go into 1011 00:57:43,520 --> 00:57:47,280 Speaker 1: example after example of this, and what the user 1012 00:57:47,400 --> 00:57:49,960 Speaker 1: really needs to do first and foremost is look judiciously 1013 00:57:50,080 --> 00:57:53,920 Speaker 1: at applications. With every single app you install, you're basically 1014 00:57:53,960 --> 00:57:58,080 Speaker 1: making a choice to put a company inside the secure 1015 00:57:58,120 --> 00:58:01,040 Speaker 1: perimeter of your device, saying yes, you can see a 1016 00:58:01,040 --> 00:58:03,440 Speaker 1: lot of information that I'm not even aware you can see. 1017 00:58:03,920 --> 00:58:06,040 Speaker 1: You can see what Wi-Fi networks I'm on, and 1018 00:58:06,080 --> 00:58:09,600 Speaker 1: by default, extrapolate my location based off that, even 1019 00:58:09,640 --> 00:58:12,640 Speaker 1: if I don't permit you to access the geo coordinates 1020 00:58:12,720 --> 00:58:16,000 Speaker 1: off my GPS chip through location services. You can see 1021 00:58:16,720 --> 00:58:19,240 Speaker 1: what my IMEI is, so that number will 1022 00:58:19,280 --> 00:58:21,840 Speaker 1: never change on this phone. You'll always know this 1023 00:58:21,920 --> 00:58:25,320 Speaker 1: is me and this is where I'm at. Just this wealth, 1024 00:58:25,680 --> 00:58:28,440 Speaker 1: this enormity of information.
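Because the IMEI never changes, it works as a join key: records collected by otherwise unrelated apps can be merged into one profile of a single device. A toy sketch of that joining step, using entirely invented data and field names (no real broker's schema):

```python
# Two hypothetical data sets collected by different apps, both keyed
# on the device's permanent IMEI (a made-up example number).
flashlight_app_data = {"356938035643809": {"home_wifi": "Linksys-42"}}
bible_app_data = {"356938035643809": {"last_location": "dispensary parking lot"}}

def merge_on_imei(*datasets):
    """Join per-device records from multiple apps into one profile."""
    profiles = {}
    for data in datasets:
        for imei, fields in data.items():
            profiles.setdefault(imei, {}).update(fields)
    return profiles

profiles = merge_on_imei(flashlight_app_data, bible_app_data)
print(profiles["356938035643809"])
```

Neither app alone knows much; joined on the permanent identifier, the combined record is what the speaker calls the "wealth of information."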
So the first thing you should 1025 00:58:28,440 --> 00:58:31,760 Speaker 1: ask is: why do I really need this app? And 1026 00:58:31,800 --> 00:58:33,800 Speaker 1: what is it going to ask for? And maybe make 1027 00:58:33,800 --> 00:58:35,920 Speaker 1: a decision about, yeah, I don't really need this. 1028 00:58:36,000 --> 00:58:38,800 Speaker 1: The next thing is, if you do choose to 1029 00:58:38,800 --> 00:58:40,960 Speaker 1: put that app on your phone, first and foremost, keep 1030 00:58:40,960 --> 00:58:45,040 Speaker 1: it updated, and secondly, go through those permissions and 1031 00:58:45,480 --> 00:58:48,000 Speaker 1: limit what it actually needs access to. There's no reason 1032 00:58:48,080 --> 00:58:50,720 Speaker 1: for a Bible reading app to have access to your 1033 00:58:50,720 --> 00:58:53,720 Speaker 1: camera unless it performs some function, like you can hold 1034 00:58:53,720 --> 00:58:56,160 Speaker 1: it in front of the actual Bible and it will 1035 00:58:56,600 --> 00:58:59,680 Speaker 1: read from that or something. I'm struggling to come 1036 00:58:59,760 --> 00:59:02,440 Speaker 1: up with a plausible example for why that might be necessary. 1037 00:59:02,480 --> 00:59:05,640 Speaker 1: But in some cases there is some function, like scanning 1038 00:59:05,680 --> 00:59:08,240 Speaker 1: QR codes, that might be enhanced by allowing access to 1039 00:59:08,240 --> 00:59:11,720 Speaker 1: the camera. But also understand the risk: that access is 1040 00:59:11,800 --> 00:59:15,080 Speaker 1: granted all the time. I just point out, and I 1041 00:59:15,080 --> 00:59:18,400 Speaker 1: don't say this to be overly alarmist,
I think these 1042 00:59:18,440 --> 00:59:21,880 Speaker 1: issues matter from a standpoint of, like, your personal privacy, 1043 00:59:22,160 --> 00:59:25,120 Speaker 1: as we've been speaking about, but I'd also encourage people 1044 00:59:25,160 --> 00:59:28,200 Speaker 1: to think about this as sort of a more long term, 1045 00:59:28,240 --> 00:59:32,440 Speaker 1: big picture, society-wide issue. Because if you 1046 00:59:32,560 --> 00:59:34,920 Speaker 1: ever look at that book 1047 00:59:35,000 --> 00:59:39,400 Speaker 1: IBM and the Holocaust, the Nazis were involved in 1048 00:59:39,560 --> 00:59:44,040 Speaker 1: using technology and punch card computing to categorize people, 1049 00:59:44,680 --> 00:59:47,880 Speaker 1: using technology to create what ended up being, you know, 1050 00:59:48,960 --> 00:59:52,480 Speaker 1: a genocide machine. And I'm not at all saying that's 1051 00:59:52,520 --> 00:59:57,160 Speaker 1: what's happening here in America today, but this was also 1052 00:59:57,240 --> 01:00:01,120 Speaker 1: with pretty primitive technology, right? My point with this is, 1053 01:00:01,200 --> 01:00:03,520 Speaker 1: I mean, think about everything we're talking about, all these 1054 01:00:03,520 --> 01:00:07,960 Speaker 1: devices that map out our entire lives without our knowledge 1055 01:00:07,960 --> 01:00:11,360 Speaker 1: in a lot of cases. The next time some dictator 1056 01:00:11,440 --> 01:00:13,880 Speaker 1: in the world comes around and they try to start 1057 01:00:13,960 --> 01:00:17,320 Speaker 1: up death camps and they start some insane project to 1058 01:00:17,480 --> 01:00:21,320 Speaker 1: catalog people and categorize people and schedule them for extermination,
1059 01:00:21,800 --> 01:00:25,600 Speaker 1: with all of these apps and tracking technologies 1060 01:00:25,800 --> 01:00:30,560 Speaker 1: and artificial intelligence coming on the horizon, the next 1061 01:00:30,600 --> 01:00:34,760 Speaker 1: holocaust is going to be something terrifying, terrifying like the 1062 01:00:34,800 --> 01:00:38,240 Speaker 1: world has never seen before. And that's why I'm 1063 01:00:38,280 --> 01:00:40,280 Speaker 1: not saying this to be alarmist. I'm saying this because 1064 01:00:40,320 --> 01:00:44,480 Speaker 1: I want people to consider the larger social implications of 1065 01:00:44,520 --> 01:00:48,000 Speaker 1: these technologies that we allow into our lives and what 1066 01:00:48,120 --> 01:00:50,240 Speaker 1: kind of impact they're gonna have on the human race 1067 01:00:50,360 --> 01:00:53,520 Speaker 1: in the coming decades. And even looking at that on 1068 01:00:53,560 --> 01:00:58,919 Speaker 1: a slightly smaller, maybe very slightly less alarmist scale, look 1069 01:00:58,920 --> 01:01:03,120 Speaker 1: at things like marijuana dispensaries. So chances are, if you 1070 01:01:03,360 --> 01:01:06,680 Speaker 1: frequent or have visited a marijuana dispensary, that information is 1071 01:01:06,720 --> 01:01:10,160 Speaker 1: recorded somewhere by your phone, because we don't tend 1072 01:01:10,240 --> 01:01:12,920 Speaker 1: to leave our phones behind anywhere. We take our phones 1073 01:01:12,960 --> 01:01:16,600 Speaker 1: absolutely everywhere. So let's say the current administration wanted to 1074 01:01:16,680 --> 01:01:18,920 Speaker 1: really crack down on that and said, you know, 1075 01:01:18,960 --> 01:01:23,000 Speaker 1: subpoenaed the phone companies and said, give me a 1076 01:01:23,000 --> 01:01:24,680 Speaker 1: list of everyone who has ever set 1077 01:01:24,680 --> 01:01:27,720 Speaker 1: foot into a marijuana dispensary.
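The query described, "everyone who has ever set foot in this location," is mechanically just a radius filter over stored location pings. A minimal sketch with made-up coordinates and user names:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

# Hypothetical stored pings: (user, latitude, longitude).
pings = [
    ("alice", 39.7392, -104.9903),  # right at the target address
    ("bob",   39.7500, -104.9500),  # a few kilometers away
]

def visitors(pings, lat, lon, radius_m=50):
    """Everyone whose phone ever reported a ping within radius_m of the point."""
    return {u for u, la, lo in pings if haversine_m(la, lo, lat, lon) <= radius_m}

print(visitors(pings, 39.7392, -104.9903))  # {'alice'}
```

Which is exactly why "curiosity visits" get swept up too: the filter only sees presence, not intent.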
That's probably a capability that's 1078 01:01:27,840 --> 01:01:30,240 Speaker 1: very realistic. There's a lot of problems with that, right? 1079 01:01:30,240 --> 01:01:31,840 Speaker 1: Because a lot of people that step in there step 1080 01:01:31,920 --> 01:01:33,800 Speaker 1: in there just out of curiosity, or they step in there 1081 01:01:33,840 --> 01:01:37,680 Speaker 1: with someone else, but it definitely creates this perception 1082 01:01:37,680 --> 01:01:40,480 Speaker 1: of guilt. We could go the other way 1083 01:01:40,480 --> 01:01:43,000 Speaker 1: with other political parties and say things like, you know, 1084 01:01:43,040 --> 01:01:46,120 Speaker 1: gun ownership and gun control: give me everyone that's 1085 01:01:46,200 --> 01:01:49,160 Speaker 1: visited a gun store or gun range or liked the 1086 01:01:49,320 --> 01:01:52,480 Speaker 1: NRA on Facebook. That creates a similar type 1087 01:01:52,520 --> 01:01:56,200 Speaker 1: of problem, and a way to categorize people that may 1088 01:01:56,280 --> 01:01:59,680 Speaker 1: or may not be factually accurate. And even things like 1089 01:01:59,720 --> 01:02:04,160 Speaker 1: face recognition: facial recognition technology with enough data samples is 1090 01:02:04,160 --> 01:02:07,600 Speaker 1: insanely good, but there's still an alarming number of false 1091 01:02:07,640 --> 01:02:11,720 Speaker 1: positives that occur out of most facial recognition databases. And 1092 01:02:11,760 --> 01:02:14,240 Speaker 1: I think it was the ACLU that recently 1093 01:02:14,280 --> 01:02:17,720 Speaker 1: did a test of a facial recognition program that rolled 1094 01:02:17,720 --> 01:02:20,920 Speaker 1: out in Florida, and basically they took the photographs of 1095 01:02:20,920 --> 01:02:23,640 Speaker 1: all the sitting members of Congress and ran them against 1096 01:02:23,640 --> 01:02:27,760 Speaker 1: this facial recognition database.
There were like fifteen quote unquote 1097 01:02:27,760 --> 01:02:30,520 Speaker 1: matches of, like, this is this person who's wanted for this, 1098 01:02:31,000 --> 01:02:34,040 Speaker 1: and it's actually a sitting senator whose face is 1099 01:02:34,120 --> 01:02:39,120 Speaker 1: being displayed there, matched with that offender. And 1100 01:02:39,120 --> 01:02:41,720 Speaker 1: not that there aren't false positives when we 1101 01:02:41,800 --> 01:02:45,800 Speaker 1: put things into the frailty of human memory and human, uh, 1102 01:02:46,080 --> 01:02:49,840 Speaker 1: computational power; for sure, there definitely are. But placing 1103 01:02:49,880 --> 01:02:52,240 Speaker 1: all this in the hands of computers, with these massive 1104 01:02:52,320 --> 01:02:55,120 Speaker 1: databases of information about us, much of which we have 1105 01:02:55,200 --> 01:02:58,320 Speaker 1: forgotten about but which a computer will never forget, kind of 1106 01:02:58,320 --> 01:03:01,680 Speaker 1: paints a bit of a dystopian picture. Yeah, it can. 1107 01:03:01,800 --> 01:03:04,120 Speaker 1: And I mean, I really encourage people to think through 1108 01:03:04,120 --> 01:03:07,840 Speaker 1: the implications, I mean, on a more practical, pragmatic 1109 01:03:07,880 --> 01:03:11,880 Speaker 1: sense for their own personal security and privacy, but also, 1110 01:03:12,080 --> 01:03:14,920 Speaker 1: you know, as a country, as a society. There are 1111 01:03:14,960 --> 01:03:18,640 Speaker 1: some very important and critical decisions I think we have 1112 01:03:18,760 --> 01:03:21,560 Speaker 1: to make that impact the future of the human race.
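The false-positive worry here is really a base-rate problem: even a very accurate matcher, run against a population where almost everyone is innocent, produces mostly wrong matches. A back-of-the-envelope sketch with assumed numbers (the rates below are illustrative, not any vendor's published figures):

```python
# Assumed numbers, for illustration only.
population = 1_000_000        # faces scanned
offenders = 100               # people actually on the watchlist
true_positive_rate = 0.99     # matcher catches 99% of real offenders
false_positive_rate = 0.001   # 0.1% of innocent people wrongly matched

true_hits = offenders * true_positive_rate                    # ~99
false_hits = (population - offenders) * false_positive_rate   # ~1000

# Probability that any given "match" is actually a real offender.
precision = true_hits / (true_hits + false_hits)
print(round(precision, 2))  # 0.09: over 90% of matches are wrong people
```

Under those assumptions, fifteen mismatched members of Congress is not a fluke; it is what the arithmetic predicts.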
1113 01:03:21,960 --> 01:03:29,920 Speaker 1: Quite frankly. Yeah, yeah, absolutely. I'm almost 1114 01:03:29,960 --> 01:03:32,600 Speaker 1: at a loss here; at one time, I'm at a 1115 01:03:32,640 --> 01:03:34,840 Speaker 1: loss for words, and I have so many things to say, 1116 01:03:34,840 --> 01:03:36,440 Speaker 1: I can't get them all out. But yeah, we tell 1117 01:03:36,520 --> 01:03:40,640 Speaker 1: Facebook absolutely everything, and even if it's not explicit, a great 1118 01:03:40,680 --> 01:03:42,760 Speaker 1: amount of it can be inferred just from something as 1119 01:03:42,800 --> 01:03:45,920 Speaker 1: simple as a single like or two, which, when compared 1120 01:03:45,960 --> 01:03:49,400 Speaker 1: with that same like against millions of other Facebook users, 1121 01:03:49,520 --> 01:03:53,320 Speaker 1: can probably lump you into a category more accurately, maybe, 1122 01:03:53,320 --> 01:03:55,800 Speaker 1: than even you would have put yourself into that category. 1123 01:03:56,160 --> 01:04:00,200 Speaker 1: And no president in US history has ever given 1124 01:04:00,240 --> 01:04:05,320 Speaker 1: presidential power back. There's just this, you know, creeping concentration 1125 01:04:05,360 --> 01:04:08,640 Speaker 1: of power in the hands of very few people. And 1126 01:04:09,240 --> 01:04:12,560 Speaker 1: with this technology, with this, you know, quote unquote absolute 1127 01:04:12,560 --> 01:04:17,360 Speaker 1: power that we're building, or this incredibly intimate knowledge of 1128 01:04:17,440 --> 01:04:22,320 Speaker 1: nearly every single American that we're building. It's, man, it's 1129 01:04:22,400 --> 01:04:25,880 Speaker 1: just primed to be abused. And, well, the thing is, we're 1130 01:04:25,920 --> 01:04:29,880 Speaker 1: handing this information off in most cases to corporations. They're 1131 01:04:29,880 --> 01:04:33,160 Speaker 1: not accountable to the public in any tangible way.
They're 1132 01:04:33,160 --> 01:04:37,440 Speaker 1: not run by elected representatives. And we do it voluntarily. 1133 01:04:37,520 --> 01:04:40,560 Speaker 1: We click okay on the little EULA without 1134 01:04:40,600 --> 01:04:43,760 Speaker 1: reading the damn thing. But in a lot of ways, 1135 01:04:43,800 --> 01:04:47,000 Speaker 1: I mean, it's almost like, if this information is in 1136 01:04:47,040 --> 01:04:48,920 Speaker 1: the hands of our government, at least you know it's 1137 01:04:48,960 --> 01:04:52,680 Speaker 1: a democratic government that the people voted for. 1138 01:04:52,720 --> 01:04:54,960 Speaker 1: In the case of a Google or an Apple, that's 1139 01:04:54,960 --> 01:04:58,240 Speaker 1: not the case. I would like to think so. Yeah, 1140 01:04:58,280 --> 01:05:00,200 Speaker 1: I would like to think that it's in the hands 1141 01:05:00,200 --> 01:05:02,160 Speaker 1: of something like that. But also, this kind of 1142 01:05:02,160 --> 01:05:06,560 Speaker 1: short-circuits due process. Right? Fifty years ago, 1143 01:05:06,600 --> 01:05:09,280 Speaker 1: if I really wanted to know about someone and know 1144 01:05:09,400 --> 01:05:10,959 Speaker 1: the things that they had in their home, I would 1145 01:05:10,960 --> 01:05:12,400 Speaker 1: have to get a search warrant. I would have 1146 01:05:12,400 --> 01:05:14,800 Speaker 1: to go execute that search warrant.
Now, I can probably 1147 01:05:14,800 --> 01:05:17,520 Speaker 1: just pull up their Facebook page or their Instagram, look 1148 01:05:17,520 --> 01:05:19,360 Speaker 1: at the photos that they've taken inside their home, and 1149 01:05:19,360 --> 01:05:21,760 Speaker 1: have a pretty good idea of, okay, this 1150 01:05:21,800 --> 01:05:23,760 Speaker 1: is what's going on in this person's life, because they've 1151 01:05:23,760 --> 01:05:26,920 Speaker 1: probably told people that, without a thorough understanding that 1152 01:05:27,680 --> 01:05:33,000 Speaker 1: anyone can look at that, including governments. And another problem, 1153 01:05:33,040 --> 01:05:36,640 Speaker 1: as I see it: publicly available information, PAI, is 1154 01:05:36,720 --> 01:05:41,120 Speaker 1: defined by Department of Defense directives and Presidential directives as 1155 01:05:41,120 --> 01:05:45,560 Speaker 1: any information that can be purchased. So frequently, cell phone 1156 01:05:45,600 --> 01:05:48,000 Speaker 1: companies like Verizon, AT&T, Sprint, T-Mobile 1157 01:05:48,360 --> 01:05:51,560 Speaker 1: sell all their location data to third-party companies that 1158 01:05:51,600 --> 01:05:53,840 Speaker 1: can then sell it to the government, and there's a 1159 01:05:53,920 --> 01:05:57,040 Speaker 1: veil of protection there in, you know, instead of being 1160 01:05:57,080 --> 01:05:59,680 Speaker 1: able to search by phone number, there's an anonymized user 1161 01:05:59,720 --> 01:06:02,560 Speaker 1: ID or something. But this kind of short-circuits that 1162 01:06:02,840 --> 01:06:07,360 Speaker 1: system of checks and balances. If companies that generate all 1163 01:06:07,360 --> 01:06:09,560 Speaker 1: this data can sell it to someone else and then 1164 01:06:09,640 --> 01:06:12,400 Speaker 1: it can just be purchased by the government, it's considered 1165 01:06:12,440 --> 01:06:15,360 Speaker 1: publicly available information.
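The "anonymized user ID" veil is thin, because movement patterns are themselves identifying: whoever sleeps at your home address and spends weekdays at your office is almost certainly you. A toy sketch of that re-identification step, with invented IDs and coarse grid coordinates:

```python
# Hypothetical "anonymized" records: pseudonym -> most frequent
# nighttime and daytime grid cells (coarse lat/lon squares).
anonymized = {
    "user-7f3a": {"night": (39.74, -104.99), "day": (39.75, -104.95)},
    "user-c210": {"night": (40.01, -105.27), "day": (40.02, -105.25)},
}

# Publicly known facts about a target: home and workplace cells.
target_home = (39.74, -104.99)
target_work = (39.75, -104.95)

def reidentify(records, home, work):
    """Return the pseudonymous IDs whose routine matches the target's."""
    return [uid for uid, r in records.items()
            if r["night"] == home and r["day"] == work]

print(reidentify(anonymized, target_home, target_work))  # ['user-7f3a']
```

No phone number needed: the routine itself is the identifier, which is why swapping the number for a random ID does little to restore the checks and balances the speaker describes.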
It's considered something that you've put into 1166 01:06:15,360 --> 01:06:18,919 Speaker 1: the public domain. And most people probably, if you sat 1167 01:06:18,920 --> 01:06:21,480 Speaker 1: down and had a conversation, would realize, yeah, okay, all 1168 01:06:21,480 --> 01:06:23,800 Speaker 1: my photos on Facebook, I made the conscious decision to 1169 01:06:23,840 --> 01:06:26,760 Speaker 1: put those up, so maybe those have less legal protections. 1170 01:06:26,960 --> 01:06:29,080 Speaker 1: But I don't think most people are really thinking through 1171 01:06:29,120 --> 01:06:31,640 Speaker 1: the cause and effect of having a cellular phone that's 1172 01:06:31,680 --> 01:06:35,680 Speaker 1: creating a permanent, tangible record of where you are twenty 1173 01:06:35,680 --> 01:06:40,400 Speaker 1: four hours a day, and the real, unstated 1174 01:06:40,440 --> 01:06:44,880 Speaker 1: realization that that information is available essentially to whoever can 1175 01:06:44,880 --> 01:06:49,120 Speaker 1: pay for it. Another consequence that we haven't mentioned yet, 1176 01:06:49,400 --> 01:06:52,040 Speaker 1: and that I can connect with current happenings in the world: 1177 01:06:52,360 --> 01:06:55,840 Speaker 1: we have this Brett Kavanaugh hearing happening, where they're basically 1178 01:06:55,880 --> 01:06:58,280 Speaker 1: trying to piece together this guy's life through a fucking 1179 01:06:58,360 --> 01:07:01,720 Speaker 1: desk calendar. You know this is going to change 1180 01:07:01,720 --> 01:07:03,960 Speaker 1: in the future: when someone runs for political office, when 1181 01:07:03,960 --> 01:07:07,120 Speaker 1: someone is appointed to a judge position, we're going to need 1182 01:07:07,160 --> 01:07:09,080 Speaker 1: to look through his metadata.
Yeah, they're gonna 1183 01:07:09,080 --> 01:07:12,960 Speaker 1: be able to say, like, on such-and-such a date in July you checked into 1184 01:07:13,000 --> 01:07:15,760 Speaker 1: this person's house, with this party where, you know, 1185 01:07:15,840 --> 01:07:18,640 Speaker 1: something illegal occurred. Or they'll be able to say that, 1186 01:07:18,720 --> 01:07:23,080 Speaker 1: you know, you liked this organization that's tied to something nefarious. 1187 01:07:23,160 --> 01:07:25,800 Speaker 1: Or, you know, I literally think in the debates, you know, 1188 01:07:26,000 --> 01:07:28,720 Speaker 1: years from now, they're gonna be like, Jack Murphy, 1189 01:07:28,720 --> 01:07:35,120 Speaker 1: on some date in June you tweeted out "fuck my life." Like, yeah, right. 1190 01:07:35,320 --> 01:07:37,760 Speaker 1: Well, that's what Justin was saying about: 1191 01:07:37,840 --> 01:07:39,920 Speaker 1: is there like a guilt-by-association kind of thing? 1192 01:07:39,960 --> 01:07:43,240 Speaker 1: Because it's like, if you have a copy of Mein 1193 01:07:43,280 --> 01:07:46,320 Speaker 1: Kampf on your bookshelf, maybe that's because you are a 1194 01:07:46,360 --> 01:07:50,240 Speaker 1: World War Two enthusiast or historian, and, you know, you 1195 01:07:50,280 --> 01:07:54,200 Speaker 1: really believe in reading all of the texts that impacted 1196 01:07:54,200 --> 01:07:57,640 Speaker 1: the conflict, so naturally you would read Hitler's book Mein Kampf. 1197 01:07:57,800 --> 01:07:59,680 Speaker 1: But that doesn't mean you're a Nazi just because you 1198 01:07:59,720 --> 01:08:03,160 Speaker 1: have that on your shelf, right? But now, you know, 1199 01:08:03,400 --> 01:08:06,240 Speaker 1: is the new version of that going to be, oh, well, 1200 01:08:06,280 --> 01:08:10,000 Speaker 1: you liked this, you know, page on Facebook?
And I 1201 01:08:10,120 --> 01:08:12,360 Speaker 1: tell you, I'm an investigative journalist, so I'm on all 1202 01:08:12,440 --> 01:08:16,800 Speaker 1: kinds of weird websites and I investigate extremists and 1203 01:08:16,840 --> 01:08:21,240 Speaker 1: all sorts of unpleasant people. Could you easily, you know, 1204 01:08:21,400 --> 01:08:24,880 Speaker 1: just making a surface analysis of my metadata, you might 1205 01:08:24,920 --> 01:08:27,800 Speaker 1: conclude that, you know, I'm just a horrible human being. 1206 01:08:28,000 --> 01:08:30,599 Speaker 1: You know, you might be with J.R. Pio, who they called 1207 01:08:30,680 --> 01:08:35,639 Speaker 1: the anti-American author. Yes, anti-government, anti-government, 1208 01:08:35,720 --> 01:08:39,599 Speaker 1: yeah, yeah. And give me five minutes with anyone's 1209 01:08:39,600 --> 01:08:42,280 Speaker 1: search history, and there is something incriminating in there, something 1210 01:08:42,320 --> 01:08:46,120 Speaker 1: that could be spun to paint you in a bad light. 1211 01:08:46,240 --> 01:08:49,800 Speaker 1: And that's absolutely the same thing with Facebook. And in 1212 01:08:49,840 --> 01:08:53,519 Speaker 1: many cases, people don't even realize they're liking something. 1213 01:08:53,600 --> 01:08:56,639 Speaker 1: Do you guys remember Ashley Madison dot com? Is it still around 1214 01:08:56,680 --> 01:08:59,320 Speaker 1: or not? I think it is, actually, because I saw 1215 01:08:59,400 --> 01:09:02,759 Speaker 1: something about it the other day. But I actually 1216 01:09:02,800 --> 01:09:07,759 Speaker 1: know somebody who met a woman, like probably the only 1217 01:09:07,760 --> 01:09:11,280 Speaker 1: woman that was actually on that website. And I really 1218 01:09:11,320 --> 01:09:13,120 Speaker 1: mean a friend, I'm not saying like "asking for a 1219 01:09:13,200 --> 01:09:17,599 Speaker 1: friend," like, this is literally, I know somebody.
If you're listening, Benny, yeah, 1220 01:09:17,840 --> 01:09:21,519 Speaker 1: if my wife is listening, yeah, actually hooked up 1221 01:09:21,560 --> 01:09:23,840 Speaker 1: with somebody on that website. But yeah, anyway, I'm sorry, 1222 01:09:23,840 --> 01:09:26,519 Speaker 1: go ahead. No, no, that's interesting. That is the 1223 01:09:26,680 --> 01:09:28,880 Speaker 1: first, and I bring this up in every single class, that's 1224 01:09:28,880 --> 01:09:31,360 Speaker 1: the first story I've ever heard of anyone meeting an 1225 01:09:31,360 --> 01:09:36,360 Speaker 1: actual female human. Yeah, Ashley Madison dot com. But Ashley 1226 01:09:36,400 --> 01:09:42,040 Speaker 1: Madison decided to do people a favor and figured that, 1227 01:09:42,160 --> 01:09:43,920 Speaker 1: you know what, people don't want to have to click 1228 01:09:43,960 --> 01:09:45,719 Speaker 1: the like button. So, you know what, if they're logged 1229 01:09:45,720 --> 01:09:49,200 Speaker 1: into Facebook, or if they have that Facebook login credential 1230 01:09:49,280 --> 01:09:52,000 Speaker 1: in a cookie on their computer, when they come to 1231 01:09:52,040 --> 01:09:54,479 Speaker 1: Ashley Madison dot com, we assume that they like the site, 1232 01:09:54,680 --> 01:09:56,280 Speaker 1: so we're just going to go ahead and click the 1233 01:09:56,320 --> 01:09:59,120 Speaker 1: like button for them. So, even if you made a 1234 01:09:59,120 --> 01:10:03,599 Speaker 1: conscious effort to segregate those two factors of your life, 1235 01:10:03,640 --> 01:10:07,200 Speaker 1: those two areas of your life, Ashley Madison just clicked 1236 01:10:07,280 --> 01:10:10,080 Speaker 1: that like button for you by virtue of the fact 1237 01:10:10,120 --> 01:10:14,000 Speaker 1: that you visited that website.
So now you are among 1238 01:10:14,040 --> 01:10:16,280 Speaker 1: those people who at the time, and this is a 1239 01:10:16,320 --> 01:10:18,080 Speaker 1: couple of years ago, at the time could be Graph 1240 01:10:18,120 --> 01:10:21,599 Speaker 1: Searched as likers of Ashley Madison dot com. Your name 1241 01:10:21,600 --> 01:10:24,400 Speaker 1: appears in that list of someone who's done that, completely 1242 01:10:24,439 --> 01:10:28,280 Speaker 1: without your consent, completely without your knowledge, yet you're guilty 1243 01:10:28,280 --> 01:10:34,639 Speaker 1: by association. Yeah, it's unbelievable how they just make these 1244 01:10:34,680 --> 01:10:37,639 Speaker 1: connections, and, yeah, you're right. I think people think 1245 01:10:37,680 --> 01:10:40,960 Speaker 1: that they're separating different factors of their 1246 01:10:41,040 --> 01:10:44,439 Speaker 1: life online, and then they get tied together as you 1247 01:10:44,479 --> 01:10:47,320 Speaker 1: just described. It's the old classic. I mean, I always tell 1248 01:10:47,400 --> 01:10:49,840 Speaker 1: people, when you see a dude who's carrying two cell 1249 01:10:49,880 --> 01:10:53,080 Speaker 1: phones, that tells you, well, what they're doing is 1250 01:10:53,080 --> 01:10:55,519 Speaker 1: they have two networks they're running, and they're trying 1251 01:10:55,560 --> 01:10:58,639 Speaker 1: to keep them compartmentalized. I once heard the comedian Jim 1252 01:10:58,680 --> 01:11:00,920 Speaker 1: Norton describe it as: you have a good-boy 1253 01:11:00,960 --> 01:11:04,000 Speaker 1: phone and a naughty-boy phone. Well, that's what 1254 01:11:04,240 --> 01:11:06,040 Speaker 1: it is.
It means either you're running 1255 01:11:06,040 --> 01:11:08,800 Speaker 1: two intelligence networks you don't want to know about each other, 1256 01:11:09,280 --> 01:11:11,640 Speaker 1: or you're running two women that you don't want to 1257 01:11:11,640 --> 01:11:14,400 Speaker 1: know about each other. It's one or the other. Yeah, 1258 01:11:14,439 --> 01:11:17,240 Speaker 1: "here's to our wives and girlfriends, may they never meet." 1259 01:11:17,600 --> 01:11:22,559 Speaker 1: But even then, if those two cell phones 1260 01:11:22,600 --> 01:11:25,920 Speaker 1: are co-located, even if they're turned off, if they're 1261 01:11:25,960 --> 01:11:28,000 Speaker 1: co-located in the same place at the same time 1262 01:11:28,040 --> 01:11:31,120 Speaker 1: and they have the same pattern of movement, there's 1263 01:11:31,160 --> 01:11:34,280 Speaker 1: really no secret, depending on how high you're able to, 1264 01:11:34,360 --> 01:11:36,200 Speaker 1: or how deep, depending on how you want to look 1265 01:11:36,200 --> 01:11:40,800 Speaker 1: at the data. I'm wondering if there's any devices 1266 01:11:41,000 --> 01:11:45,320 Speaker 1: that you personally, Justin, like, will not buy, or stuff 1267 01:11:45,320 --> 01:11:48,200 Speaker 1: that you just will not participate in, because you 1268 01:11:48,200 --> 01:11:50,840 Speaker 1: think it's too invasive. I mean, for example, when I 1269 01:11:50,880 --> 01:11:53,040 Speaker 1: see these people with Alexa in their home, and you 1270 01:11:53,080 --> 01:11:55,480 Speaker 1: know it's recording what they're saying at all times, possibly, 1271 01:11:55,880 --> 01:11:58,479 Speaker 1: I think that that is not something I need in 1272 01:11:58,479 --> 01:12:01,960 Speaker 1: my life.
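The co-location linking described a moment ago, two phones tied together by the same pattern of movement, is simple to compute once both location histories exist: count how often the devices appear in the same place at the same time. A minimal sketch with hypothetical ping logs:

```python
# Hypothetical ping logs: sets of (hour, grid_cell) observations per device.
phone_a = {(9, "cell-12"), (13, "cell-40"), (21, "cell-12")}
phone_b = {(9, "cell-12"), (13, "cell-40"), (21, "cell-12")}
phone_c = {(9, "cell-88"), (13, "cell-40"), (21, "cell-03")}

def colocation_score(a, b):
    """Fraction of observations where two devices share a time and place."""
    return len(a & b) / max(len(a | b), 1)

print(colocation_score(phone_a, phone_b))  # 1.0: almost certainly one owner
print(colocation_score(phone_a, phone_c))  # 0.2: casual overlap only
```

A "good-boy phone" and a "naughty-boy phone" carried in the same pocket score like phone_a and phone_b: the compartmentalization exists only at the SIM level, not in the data.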
I'm wondering, someone with your knowledge base, if 1273 01:12:02,000 --> 01:12:05,439 Speaker 1: there's something that you're like... Good question. Any devices that 1274 01:12:05,520 --> 01:12:08,360 Speaker 1: you just will not touch, any social media you will 1275 01:12:08,400 --> 01:12:12,200 Speaker 1: not touch? What are, like, the big things out there 1276 01:12:12,280 --> 01:12:15,200 Speaker 1: that, like, Joe Schmo, any American, should just stay 1277 01:12:15,240 --> 01:12:18,200 Speaker 1: the hell away from? So: I have no social media 1278 01:12:18,320 --> 01:12:20,880 Speaker 1: presence at all except for a Twitter feed that all 1279 01:12:20,880 --> 01:12:23,400 Speaker 1: it does is repost my blog posts, and if I'm 1280 01:12:23,400 --> 01:12:25,519 Speaker 1: being honest, I'm probably going to kill that soon because 1281 01:12:25,560 --> 01:12:28,040 Speaker 1: it's not really doing anything for me. No Facebook, 1282 01:12:28,040 --> 01:12:32,800 Speaker 1: no Instagram, no MySpace, no, I don't know whatever else. 1283 01:12:33,240 --> 01:12:40,439 Speaker 1: IoT devices specifically, especially really aggressive things like Internet-connected cameras, 1284 01:12:40,479 --> 01:12:43,240 Speaker 1: they're just too easy to make a mistake in the 1285 01:12:43,280 --> 01:12:46,000 Speaker 1: configuration of and leave those things open. There's a website 1286 01:12:46,000 --> 01:12:49,080 Speaker 1: called Shodan, s-h-o-d-a-n dot 1287 01:12:49,160 --> 01:12:53,400 Speaker 1: io, that basically is just a huge repository of connections 1288 01:12:53,400 --> 01:12:58,000 Speaker 1: to unsecured Internet devices. And I will admit no 1289 01:12:58,040 --> 01:13:00,200 Speaker 1: wrongdoing here, but I've seen people do this:
Open 1290 01:13:00,280 --> 01:13:04,040 Speaker 1: up this website and connect to these open nanny cams, 1291 01:13:04,200 --> 01:13:08,640 Speaker 1: security cameras, traffic cameras, and just watch some people in 1292 01:13:08,680 --> 01:13:11,719 Speaker 1: an office in Japan work all day. Or their Internet 1293 01:13:11,720 --> 01:13:15,000 Speaker 1: connected home systems that let you turn your lights on, 1294 01:13:15,080 --> 01:13:18,080 Speaker 1: turn your heat up, turn your air conditioning up remotely. 1295 01:13:18,800 --> 01:13:22,320 Speaker 1: People leave those unsecured, and it's possible through 1296 01:13:22,360 --> 01:13:24,960 Speaker 1: that interface to turn people's lights off and turn their 1297 01:13:24,960 --> 01:13:27,760 Speaker 1: heat up and unlock their doors in many cases. I 1298 01:13:28,120 --> 01:13:30,880 Speaker 1: will stay far, far away from any of those things. 1299 01:13:30,920 --> 01:13:35,920 Speaker 1: And then things like Alexa or Google Home or whatever 1300 01:13:35,960 --> 01:13:39,479 Speaker 1: the Apple competitor is, with that, I'm typically, well, I'm 1301 01:13:39,520 --> 01:13:42,200 Speaker 1: not a fan at all of those as well, and 1302 01:13:42,520 --> 01:13:45,879 Speaker 1: those are not always recording, but they are always listening. 1303 01:13:46,040 --> 01:13:48,840 Speaker 1: And we had a recent case of Alexa being used in 1304 01:13:48,880 --> 01:13:52,679 Speaker 1: a murder investigation. A murder occurred in this house, and they 1305 01:13:52,840 --> 01:13:55,479 Speaker 1: subpoenaed Amazon and said, give us everything you have 1306 01:13:55,680 --> 01:13:59,800 Speaker 1: from this Alexa during this date-time range, and that was 1307 01:14:00,040 --> 01:14:03,519 Speaker 1: a supplement to that murder investigation. So those things are always there.
1308 01:14:03,600 --> 01:14:07,599 Speaker 1: And Amazon specifically... you know, I bash Facebook all 1309 01:14:07,600 --> 01:14:09,880 Speaker 1: the time, I have historically bashed Google all the time. 1310 01:14:10,200 --> 01:14:12,240 Speaker 1: Amazon is the one that's starting to concern me a 1311 01:14:12,240 --> 01:14:16,160 Speaker 1: little bit more. Amazon rolled out a massive facial recognition 1312 01:14:16,200 --> 01:14:21,560 Speaker 1: system to government entities called Rekognition. And it's not Recognition 1313 01:14:21,600 --> 01:14:24,200 Speaker 1: with a C, it's Rekognition with a K, to put 1314 01:14:24,200 --> 01:14:27,320 Speaker 1: even a little more Big Brother twist on that. 1315 01:14:28,080 --> 01:14:32,920 Speaker 1: Amazon has access to massive repositories of data through Amazon Web 1316 01:14:32,960 --> 01:14:37,599 Speaker 1: Services, AWS. Amazon recently rolled out Amazon 1317 01:14:37,680 --> 01:14:42,000 Speaker 1: Key, which lets a delivery driver actually open the door 1318 01:14:42,040 --> 01:14:46,679 Speaker 1: to your home through, basically, a smart connected door lock. 1319 01:14:47,560 --> 01:14:50,200 Speaker 1: And I think... well, I don't think, I know: 1320 01:14:50,320 --> 01:14:54,240 Speaker 1: IoT devices also create systemic problems, because they're 1321 01:14:54,280 --> 01:14:58,759 Speaker 1: typically unpatched and people are only vaguely aware that they're connected 1322 01:14:58,760 --> 01:15:02,320 Speaker 1: to the Internet. Meanwhile, smart thermostats and smart refrigerators 1323 01:15:02,320 --> 01:15:06,360 Speaker 1: and smart microwaves all have an Internet connection, 1324 01:15:06,360 --> 01:15:09,200 Speaker 1: and they're all running a tiny computer that's pretty easily hacked.
1325 01:15:09,479 --> 01:15:12,679 Speaker 1: So these have been hacked and used in several massive- 1326 01:15:12,760 --> 01:15:16,640 Speaker 1: scale distributed denial of service attacks. Basically, let's say I 1327 01:15:16,640 --> 01:15:18,200 Speaker 1: want to take down a website. I would just send 1328 01:15:18,240 --> 01:15:20,840 Speaker 1: a billion computers to this website at the same time 1329 01:15:20,840 --> 01:15:23,920 Speaker 1: and keep them coming back so no legitimate customer can 1330 01:15:23,960 --> 01:15:28,799 Speaker 1: actually get through. And when you think about the scale 1331 01:15:28,840 --> 01:15:31,360 Speaker 1: of that and then apply that to something like our 1332 01:15:31,760 --> 01:15:34,599 Speaker 1: power grids in the United States, which are typically run 1333 01:15:34,920 --> 01:15:38,439 Speaker 1: on Internet-connected SCADA systems, that gets a little concerning 1334 01:15:38,439 --> 01:15:41,720 Speaker 1: as well. I had to get your expert opinion on this, 1335 01:15:41,840 --> 01:15:45,919 Speaker 1: out of curiosity. I've heard people claim that, let's say 1336 01:15:46,080 --> 01:15:48,759 Speaker 1: they don't own a cat. They've never searched for anything 1337 01:15:48,760 --> 01:15:51,000 Speaker 1: about cats. It's not, you know, a part of their life, 1338 01:15:51,360 --> 01:15:53,240 Speaker 1: and they'll be out with, like, let's say I'll be 1339 01:15:53,240 --> 01:15:55,920 Speaker 1: out with Jack, we have a twenty-minute conversation about 1340 01:15:56,000 --> 01:15:58,960 Speaker 1: cats while my phone is near me, and all of 1341 01:15:59,000 --> 01:16:01,840 Speaker 1: a sudden, they start seeing ads pop up on Facebook 1342 01:16:01,920 --> 01:16:05,760 Speaker 1: or Amazon for cat food. Is that real, or is 1343 01:16:05,760 --> 01:16:11,559 Speaker 1: that paranoia? I think there's probably, there's definitely an element 1344 01:16:11,600 --> 01:16:14,519 Speaker 1: of truth in that.
I honestly don't think, 1345 01:16:14,520 --> 01:16:17,000 Speaker 1: and if anybody were willing to throw that stone, it 1346 01:16:17,040 --> 01:16:19,000 Speaker 1: would be me, I don't think Facebook is listening to 1347 01:16:19,080 --> 01:16:21,519 Speaker 1: everything you say. When you consider the amount of data 1348 01:16:21,560 --> 01:16:24,160 Speaker 1: it would use to transmit all that back, the amount 1349 01:16:24,200 --> 01:16:27,480 Speaker 1: of bandwidth that would consume, the amount of, like, processing 1350 01:16:27,479 --> 01:16:31,479 Speaker 1: power it would take to process basically every adult human 1351 01:16:31,479 --> 01:16:34,880 Speaker 1: in the United States twenty-four seven, and 1352 01:16:35,080 --> 01:16:37,639 Speaker 1: lots of people in other countries as well, I don't 1353 01:16:37,640 --> 01:16:41,240 Speaker 1: really see that as totally feasible or doable right now. 1354 01:16:41,680 --> 01:16:44,479 Speaker 1: But I think what probably happens 1355 01:16:44,520 --> 01:16:47,400 Speaker 1: is people leak that information or that interest or that 1356 01:16:47,520 --> 01:16:51,599 Speaker 1: idea through some other avenue. Maybe they mentioned something 1357 01:16:51,640 --> 01:16:54,920 Speaker 1: in an email. And also, it doesn't have to 1358 01:16:54,960 --> 01:16:58,000 Speaker 1: be a direct correlation to cats, right? Facebook can probably say, yeah, 1359 01:16:58,000 --> 01:17:01,120 Speaker 1: I know, if this person likes this particular show, they're 1360 01:17:01,120 --> 01:17:04,040 Speaker 1: probably into cats. Uh, and Facebook can do that very, 1361 01:17:04,120 --> 01:17:07,639 Speaker 1: very accurately, because they have millions, hundreds of millions, billions 1362 01:17:07,640 --> 01:17:10,960 Speaker 1: potentially of data points to compare you as an individual to.
1363 01:17:11,280 --> 01:17:13,639 Speaker 1: So as much as I would like to say, yes, 1364 01:17:13,680 --> 01:17:15,400 Speaker 1: Facebook is listening to all your stuff, you need to 1365 01:17:15,400 --> 01:17:17,680 Speaker 1: take Facebook off your phone, which I do think you 1366 01:17:17,680 --> 01:17:23,040 Speaker 1: should do. Um, I just don't think they are. Interesting. Well, 1367 01:17:23,040 --> 01:17:25,840 Speaker 1: this has been a great conversation. Yeah, you know, you 1368 01:17:25,840 --> 01:17:30,240 Speaker 1: see the latest compromise where, um, what happened with Facebook? 1369 01:17:30,280 --> 01:17:32,960 Speaker 1: Like fifty million people were compromised or something like that, 1370 01:17:33,560 --> 01:17:35,840 Speaker 1: and, uh, apparently I was one of them, because I 1371 01:17:35,880 --> 01:17:38,680 Speaker 1: got logged out of Facebook on my phone and I 1372 01:17:38,720 --> 01:17:41,000 Speaker 1: was like, you know what, I'm just not gonna log 1373 01:17:41,040 --> 01:17:46,120 Speaker 1: back in. That's great, man. My hat's off to you. 1374 01:17:46,920 --> 01:17:48,840 Speaker 1: I already know that you're still going to keep it, though, 1375 01:17:48,920 --> 01:17:51,160 Speaker 1: at least for desktop. I feel like we end up 1376 01:17:51,280 --> 01:17:53,639 Speaker 1: using it so much for what we do. I'm gonna, 1377 01:17:53,680 --> 01:17:55,840 Speaker 1: I'm gonna, I'm not saying I'm gonna get off Facebook, 1378 01:17:55,880 --> 01:17:58,120 Speaker 1: because I need it for work, but I don't think 1379 01:17:58,120 --> 01:17:59,639 Speaker 1: I'm gonna put it back. I don't think I'm gonna 1380 01:17:59,720 --> 01:18:01,360 Speaker 1: use it on my iPhone anymore. I think I'm just 1381 01:18:01,360 --> 01:18:03,160 Speaker 1: going to delete the app. I'm not on the app.
1382 01:18:03,280 --> 01:18:05,360 Speaker 1: I just use it through my browser, because the app 1383 01:18:05,400 --> 01:18:09,040 Speaker 1: does seem very invasive. Yeah, so if you are going 1384 01:18:09,080 --> 01:18:11,400 Speaker 1: to use Facebook, that would be my recommendation. At least 1385 01:18:11,400 --> 01:18:13,320 Speaker 1: get it off the device, where it doesn't have access 1386 01:18:13,320 --> 01:18:16,040 Speaker 1: to all this location information, all your Wi-Fi 1387 01:18:16,080 --> 01:18:18,080 Speaker 1: SSIDs, not only the ones you connect to, but the ones your 1388 01:18:18,080 --> 01:18:21,840 Speaker 1: phone sees. Just a wealth of information that you're 1389 01:18:21,840 --> 01:18:24,760 Speaker 1: giving it close, persistent access to when you put it on 1390 01:18:24,800 --> 01:18:27,759 Speaker 1: that device and put it in your pocket. The browser's 1391 01:18:27,760 --> 01:18:30,960 Speaker 1: the way to go. Yeah. Well, like I said, this 1392 01:18:31,040 --> 01:18:34,920 Speaker 1: has been a fascinating conversation. I really enjoyed it. Um, 1393 01:18:34,960 --> 01:18:37,559 Speaker 1: before we wrap up, do you want to talk about 1394 01:18:37,560 --> 01:18:40,200 Speaker 1: your podcast and your latest book that you have out, 1395 01:18:40,240 --> 01:18:43,599 Speaker 1: and anything else you want to plug? Yeah, definitely. So, 1396 01:18:43,800 --> 01:18:46,760 Speaker 1: uh, ComSec is the latest book that's out, that just 1397 01:18:46,840 --> 01:18:50,400 Speaker 1: came out a few months ago. Um, if you're the 1398 01:18:50,400 --> 01:18:52,439 Speaker 1: tinfoil-hat guy that really wants to dig into this, 1399 01:18:52,560 --> 01:18:54,360 Speaker 1: or if you just want a little bit more awareness 1400 01:18:54,439 --> 01:18:56,960 Speaker 1: of exactly how invasive cell phones are, that would be 1401 01:18:56,960 --> 01:18:59,519 Speaker 1: an awesome place to start.
I'm not doing a whole 1402 01:18:59,520 --> 01:19:03,200 Speaker 1: lot on the Complete Privacy and Security Podcast lately, but 1403 01:19:03,240 --> 01:19:07,439 Speaker 1: there is a massive backlog of episodes that would 1404 01:19:07,479 --> 01:19:10,280 Speaker 1: be a great place to start if you're beginning this 1405 01:19:10,400 --> 01:19:12,519 Speaker 1: journey towards more privacy, or you just want to have some 1406 01:19:12,560 --> 01:19:15,519 Speaker 1: more awareness of some of the issues. 1407 01:19:15,680 --> 01:19:19,559 Speaker 1: And my main effort right now is a podcast I 1408 01:19:19,600 --> 01:19:23,000 Speaker 1: started called Across the Peak, and it is kind of 1409 01:19:23,040 --> 01:19:26,599 Speaker 1: a generalist podcast, because I got tired of talking doom 1410 01:19:26,600 --> 01:19:31,320 Speaker 1: and gloom, security and privacy nightmares, every single week. So, um, 1411 01:19:31,320 --> 01:19:34,120 Speaker 1: this is just my fun manly-shit podcast. How to 1412 01:19:34,200 --> 01:19:35,800 Speaker 1: change a tire, how to build a fire, how to 1413 01:19:35,840 --> 01:19:39,360 Speaker 1: make love to a woman, kind of thing. Very cool, man. Well, 1414 01:19:39,439 --> 01:19:41,759 Speaker 1: this is usually the part of the interview where I say, 1415 01:19:41,880 --> 01:19:45,360 Speaker 1: you know, where can we follow you, on Twitter, on Instagram, 1416 01:19:45,360 --> 01:19:49,400 Speaker 1: on Facebook? But I already have the answer to that. Yep, 1417 01:19:49,720 --> 01:19:53,000 Speaker 1: uh, yeah, you just answered your own question. Awesome, man. 1418 01:19:53,080 --> 01:19:55,080 Speaker 1: Well, this is great. If you're ever in New York, 1419 01:19:55,120 --> 01:19:56,880 Speaker 1: we'd love to have you. Yeah, yeah, I mean definitely.
I 1420 01:19:56,880 --> 01:19:59,120 Speaker 1: mean, these issues come up all the time with, uh, 1421 01:19:59,200 --> 01:20:02,840 Speaker 1: everything from the OPM hack to Facebook being compromised. 1422 01:20:02,840 --> 01:20:04,479 Speaker 1: So, I mean, I'm sure we'll have to get you 1423 01:20:04,520 --> 01:20:06,760 Speaker 1: on again to talk about some of those things as 1424 01:20:06,800 --> 01:20:09,880 Speaker 1: they, as they will no doubt, occur in the future. 1425 01:20:10,360 --> 01:20:13,639 Speaker 1: Yeah, okay, awesome. Thank you guys, I really appreciate it. Yeah, 1426 01:20:13,640 --> 01:20:17,559 Speaker 1: thank you. Thanks, man. Great conversation with Justin Carroll. I 1427 01:20:17,560 --> 01:20:19,599 Speaker 1: didn't think we'd go as long as we did. 1428 01:20:19,720 --> 01:20:22,519 Speaker 1: But once you go down that rabbit hole of 1429 01:20:22,720 --> 01:20:25,439 Speaker 1: information breaching... And you know, by the way, if 1430 01:20:25,479 --> 01:20:27,799 Speaker 1: you guys liked that interview, you're gonna love the authors 1431 01:20:27,800 --> 01:20:30,000 Speaker 1: of that book that's in front of you, LikeWar, 1432 01:20:30,320 --> 01:20:34,000 Speaker 1: which ties into this type of stuff, the weaponization of social media. 1433 01:20:34,439 --> 01:20:37,320 Speaker 1: Um, so I'm actually taking a trip this 1434 01:20:37,360 --> 01:20:40,960 Speaker 1: week to the left coast, and I'm going to take 1435 01:20:40,960 --> 01:20:43,479 Speaker 1: this book with me and start flipping through it and 1436 01:20:43,520 --> 01:20:45,800 Speaker 1: reading it. Yeah, we'll get both of those authors on, 1437 01:20:45,840 --> 01:20:48,439 Speaker 1: because it's a co-authored book. I mean, this is 1438 01:20:48,520 --> 01:20:52,320 Speaker 1: this is pretty topical stuff. Yeah.
So if you guys 1439 01:20:52,400 --> 01:20:55,599 Speaker 1: liked this episode, be on the lookout for an episode 1440 01:20:55,640 --> 01:20:58,519 Speaker 1: that you're really gonna dig. Um, man, I feel like 1441 01:20:58,560 --> 01:21:01,760 Speaker 1: after that conversation, if there are FEMA camps, we're all 1442 01:21:01,800 --> 01:21:05,439 Speaker 1: headed for them. That's what I was saying, is that, 1443 01:21:05,520 --> 01:21:08,200 Speaker 1: you know, our government has these massive data centers, and 1444 01:21:08,479 --> 01:21:11,639 Speaker 1: they're using them to spy on foreign powers and so forth. 1445 01:21:11,760 --> 01:21:14,720 Speaker 1: And there are legitimate concerns about civil liberties in this 1446 01:21:14,800 --> 01:21:18,200 Speaker 1: country and surveillance. Um, but at least there are some 1447 01:21:18,240 --> 01:21:20,880 Speaker 1: protections on it. You know, there are FISA warrants, you know, 1448 01:21:20,920 --> 01:21:23,080 Speaker 1: wiretapping warrants, things like that, that have to be 1449 01:21:23,160 --> 01:21:27,600 Speaker 1: obtained through the court system. My main fear on this 1450 01:21:27,720 --> 01:21:31,800 Speaker 1: subject is what happens when there is some sort of... 1451 01:21:31,800 --> 01:21:35,360 Speaker 1: because, I mean, there will be in history another Stalin, 1452 01:21:35,479 --> 01:21:40,360 Speaker 1: another Hitler, another Mao. Uh, there's gonna be another crazy 1453 01:21:40,439 --> 01:21:43,479 Speaker 1: lunatic that wants to institute some sort of social cleansing.
1454 01:21:43,520 --> 01:21:47,680 Speaker 1: And with the technological tools at the fingertips of an 1455 01:21:47,880 --> 01:21:52,960 Speaker 1: evil, tyrannical government, I mean, the holocaust that will ensue 1456 01:21:53,000 --> 01:21:56,560 Speaker 1: will just make anything we've seen previously pale in comparison, 1457 01:21:56,640 --> 01:22:00,000 Speaker 1: because of the way that people are tracked and 1458 01:22:00,000 --> 01:22:04,880 Speaker 1: categorized and cataloged. Um, it's gonna be horrendous. It's gonna 1459 01:22:04,960 --> 01:22:08,400 Speaker 1: be something, a nightmare, that we can't even imagine. Uh, 1460 01:22:08,439 --> 01:22:12,559 Speaker 1: and that's my big fear for the human 1461 01:22:12,640 --> 01:22:15,479 Speaker 1: race going into the future. It's something that has not 1462 01:22:15,600 --> 01:22:19,120 Speaker 1: happened yet. It's when these technologies are really used by 1463 01:22:19,240 --> 01:22:24,960 Speaker 1: a criminal regime. Yeah, what you're saying sounds slightly Alex Jones, 1464 01:22:25,080 --> 01:22:27,519 Speaker 1: but at the same time it's very realistic. I mean, 1465 01:22:27,800 --> 01:22:30,840 Speaker 1: you just have to, like... unfortunately, we have to be 1466 01:22:30,880 --> 01:22:35,320 Speaker 1: realistic with ourselves that, you know, evil exists in the world. 1467 01:22:35,600 --> 01:22:39,480 Speaker 1: You know, it's not going away. And these technologies 1468 01:22:39,560 --> 01:22:42,559 Speaker 1: exist in the world. We know this. What happens when 1469 01:22:42,600 --> 01:22:46,080 Speaker 1: these two are combined? Um, and I mean, it will 1470 01:22:46,160 --> 01:22:49,759 Speaker 1: happen somewhere, someplace. Um, we don't know where or when, 1471 01:22:49,800 --> 01:22:53,120 Speaker 1: but it will happen one day. Um.
All right, I 1472 01:22:53,120 --> 01:22:56,200 Speaker 1: have an email sent here to sofrep dot radio 1473 01:22:56,200 --> 01:22:59,360 Speaker 1: at sofrep dot com. Ian and Jack, first, I 1474 01:22:59,360 --> 01:23:02,040 Speaker 1: would like to say the audio has been sounding really 1475 01:23:02,040 --> 01:23:04,160 Speaker 1: good when the guests come on. Not sure if Ian 1476 01:23:04,200 --> 01:23:08,040 Speaker 1: has done anything differently, but I can really hear an improvement. Um, 1477 01:23:08,080 --> 01:23:11,280 Speaker 1: the one thing I can mention actually is, if you 1478 01:23:11,360 --> 01:23:14,840 Speaker 1: compare where we are now to the last facility that 1479 01:23:14,880 --> 01:23:17,559 Speaker 1: we were in, it's a lot quieter. It's a lot quieter here. 1480 01:23:17,600 --> 01:23:21,160 Speaker 1: It's just a better space. Um, you know, I play 1481 01:23:21,160 --> 01:23:23,000 Speaker 1: around with the board a little bit to make sure 1482 01:23:23,040 --> 01:23:25,519 Speaker 1: everything sounds crisp. And yeah, I think lately the audio 1483 01:23:25,600 --> 01:23:29,200 Speaker 1: has been good, so I do appreciate the compliment there. Uh, second, 1484 01:23:29,320 --> 01:23:32,519 Speaker 1: I have a question for Jack. Why have China's efforts 1485 01:23:32,560 --> 01:23:35,919 Speaker 1: to influence US policy gone less noticed than Russia's 1486 01:23:35,920 --> 01:23:39,559 Speaker 1: efforts in the American media? Why have subjects such as 1487 01:23:39,560 --> 01:23:43,920 Speaker 1: the Confucius Institutes and Chinese influence in academic settings not 1488 01:23:44,120 --> 01:23:47,599 Speaker 1: been spoken about as much as voting machine hacking? Currently 1489 01:23:47,640 --> 01:23:51,000 Speaker 1: studying abroad at Sciences Po in Reims, France. Am I 1490 01:23:51,000 --> 01:23:54,519 Speaker 1: saying that correctly?
I'm not sure, actually. Um, but 1491 01:23:54,560 --> 01:23:57,559 Speaker 1: they also have a dual program with Jack's alma mater, 1492 01:23:57,680 --> 01:24:01,240 Speaker 1: he mentions. Uh, and it's refreshing to hear quality news reporting, 1493 01:24:01,400 --> 01:24:05,360 Speaker 1: especially after this week. Thanks. And that's from Trey. Cool. No, 1494 01:24:05,520 --> 01:24:08,840 Speaker 1: that's a really, really good question. And, um, I wish 1495 01:24:08,880 --> 01:24:12,600 Speaker 1: I knew why. Um, I think that Russia is the 1496 01:24:12,640 --> 01:24:14,960 Speaker 1: big bad enemy that's kind of easy for us to 1497 01:24:14,960 --> 01:24:18,880 Speaker 1: wrap our minds around. Um, largely, maybe a lot 1498 01:24:18,960 --> 01:24:20,880 Speaker 1: of it is institutional. It's just like, you know, we 1499 01:24:20,920 --> 01:24:23,240 Speaker 1: had the Cold War and a lot of Americans grew 1500 01:24:23,320 --> 01:24:25,800 Speaker 1: up fearing the Red Menace, so maybe it's easier for 1501 01:24:25,880 --> 01:24:28,439 Speaker 1: us to understand. You could also tie it to Trump, 1502 01:24:28,479 --> 01:24:30,839 Speaker 1: and I feel like American news media is all Trump, 1503 01:24:30,880 --> 01:24:35,280 Speaker 1: all the time, and so that's a more recent phenomenon. 1504 01:24:35,320 --> 01:24:41,400 Speaker 1: But yeah, um, the election, quote unquote, hacking. Uh, interestingly, 1505 01:24:41,400 --> 01:24:43,680 Speaker 1: there was no voting...
I'm sorry, there was 1506 01:24:43,720 --> 01:24:47,879 Speaker 1: no hacking of the voting machines. Um, the Russians 1507 01:24:47,880 --> 01:24:51,479 Speaker 1: did not do that. Although, I was gonna say, 1508 01:24:51,520 --> 01:24:54,080 Speaker 1: and I talked about this on a previous episode, at 1509 01:24:54,120 --> 01:24:57,720 Speaker 1: that Black Hat, that hacking convention, they demonstrated that 1510 01:24:57,760 --> 01:25:01,400 Speaker 1: they could be hacked. Exactly. Yeah, and I think it's 1511 01:25:01,439 --> 01:25:04,160 Speaker 1: definitely something we need to protect, um, you know, our 1512 01:25:04,320 --> 01:25:07,400 Speaker 1: democratic process against. But the Russians didn't hack into our 1513 01:25:07,439 --> 01:25:11,280 Speaker 1: voting machines. What they did do was attempt a social 1514 01:25:11,360 --> 01:25:16,800 Speaker 1: engineering project, a disinformation operation, against the American public. Uh, 1515 01:25:16,840 --> 01:25:19,160 Speaker 1: and it goes back to our conversation with Justin. I mean, 1516 01:25:19,200 --> 01:25:22,400 Speaker 1: one good thing about that, I think, um, and all 1517 01:25:22,400 --> 01:25:24,880 Speaker 1: the press coverage, is at least people are kind of 1518 01:25:24,880 --> 01:25:27,400 Speaker 1: waking up to how social media can be used by 1519 01:25:27,439 --> 01:25:31,880 Speaker 1: some bad actors in some really nefarious ways. Um, as 1520 01:25:31,880 --> 01:25:36,599 Speaker 1: far as, like, why people aren't talking more about China, um, 1521 01:25:36,840 --> 01:25:39,320 Speaker 1: I think we have a really hard time understanding China. 1522 01:25:39,400 --> 01:25:41,640 Speaker 1: We have a really hard time understanding their culture. We 1523 01:25:41,680 --> 01:25:44,560 Speaker 1: have a hard time understanding their strategy, because their strategy 1524 01:25:44,640 --> 01:25:49,240 Speaker 1: is very different than anyone else's we've really ever faced. Um.
1525 01:25:49,320 --> 01:25:52,120 Speaker 1: And it's just difficult for Americans to wrap their minds 1526 01:25:52,160 --> 01:25:55,840 Speaker 1: around a slowly escalating threat. We do very well in 1527 01:25:55,920 --> 01:25:59,960 Speaker 1: responding to immediate threats, the Pearl Harbors, the nine elevens. 1528 01:26:00,360 --> 01:26:04,080 Speaker 1: Americans jump to it; you'll have Americans lined up outside the 1529 01:26:04,080 --> 01:26:07,160 Speaker 1: recruiting station ready to go kick some ass for Uncle Sam. 1530 01:26:07,880 --> 01:26:10,799 Speaker 1: We do very well with that. But slowly escalating threats 1531 01:26:11,040 --> 01:26:15,240 Speaker 1: we don't do well with at all. Um, these more subtle, nuanced, 1532 01:26:15,640 --> 01:26:18,880 Speaker 1: long-term strategies that the Chinese employ are something we 1533 01:26:18,920 --> 01:26:22,439 Speaker 1: don't do well in responding to. Although it does appear, 1534 01:26:22,960 --> 01:26:25,080 Speaker 1: just over the last couple of years, that there's a 1535 01:26:25,200 --> 01:26:30,519 Speaker 1: lot more media coverage of the Confucius Institutes, um, Chinese 1536 01:26:30,520 --> 01:26:33,040 Speaker 1: student spies. I mean, I remember when I first wrote 1537 01:26:33,040 --> 01:26:35,679 Speaker 1: about that, and people were, like, beside themselves, like falling 1538 01:26:35,720 --> 01:26:37,960 Speaker 1: out of their chair, like, Jack Murphy, you racist, how 1539 01:26:38,000 --> 01:26:41,519 Speaker 1: can you say this? Now there's mainstream media articles about 1540 01:26:41,520 --> 01:26:45,439 Speaker 1: how China uses student spies. Um.
So I can sense that, 1541 01:26:45,520 --> 01:26:48,080 Speaker 1: you know, in the fabric of America, we 1542 01:26:48,160 --> 01:26:51,960 Speaker 1: are becoming a little bit more responsive to this, 1543 01:26:51,960 --> 01:26:55,160 Speaker 1: the Chinese strategy and how they plan to supplant us 1544 01:26:55,200 --> 01:26:58,040 Speaker 1: as a global power. Um, the question is, is it 1545 01:26:58,120 --> 01:27:01,600 Speaker 1: too little, too late? Uh, um, so I think that, 1546 01:27:01,680 --> 01:27:03,840 Speaker 1: I think there are some changes, there are some 1547 01:27:03,880 --> 01:27:07,040 Speaker 1: shifts happening, and Americans are becoming a little bit 1548 01:27:07,040 --> 01:27:10,400 Speaker 1: more aware that China is up to a lot of 1549 01:27:10,439 --> 01:27:13,600 Speaker 1: the same games that Russia is, um, just in a 1550 01:27:13,720 --> 01:27:18,120 Speaker 1: very different way. Yeah. Um, I also wanted to mention 1551 01:27:18,200 --> 01:27:21,080 Speaker 1: something about giving back to guys in the community. Um, 1552 01:27:21,200 --> 01:27:24,120 Speaker 1: I don't seem to know this guy's last name from 1553 01:27:24,120 --> 01:27:28,040 Speaker 1: looking at the website, but Ron. So yesterday President Trump 1554 01:27:28,080 --> 01:27:32,840 Speaker 1: awarded Ron Shurer, or Shrewer, I'm sorry if I can't 1555 01:27:32,840 --> 01:27:36,360 Speaker 1: pronounce his name, that's my fault. Um, but Sergeant Ron 1556 01:27:37,000 --> 01:27:40,400 Speaker 1: was awarded the Medal of Honor, uh, for actions 1557 01:27:40,439 --> 01:27:44,040 Speaker 1: in Afghanistan. Um.
But what I was told is that 1558 01:27:44,120 --> 01:27:47,200 Speaker 1: he is struggling to fight cancer, and, uh, we found, 1559 01:27:47,520 --> 01:27:50,080 Speaker 1: you know, his GoFundMe account to help pay 1560 01:27:50,120 --> 01:27:54,000 Speaker 1: for his treatment, and, um, you know, I donated 1561 01:27:54,280 --> 01:27:57,000 Speaker 1: some money. I really like donating directly to the people 1562 01:27:57,640 --> 01:28:00,680 Speaker 1: who need it and to the families involved. So I 1563 01:28:00,680 --> 01:28:02,600 Speaker 1: mean, GoFundMe is actually kind of cool in 1564 01:28:02,600 --> 01:28:06,160 Speaker 1: that regard, that you know you can do that. Um, 1565 01:28:06,200 --> 01:28:07,760 Speaker 1: so yeah, if you guys want to look 1566 01:28:07,800 --> 01:28:09,720 Speaker 1: it up, I'll link to it on the article. But 1567 01:28:09,760 --> 01:28:12,000 Speaker 1: it's, um, GoFundMe, and you can look up 1568 01:28:12,040 --> 01:28:15,559 Speaker 1: Help Ron Keep Up the Fight. They're doing really well, though, 1569 01:28:15,600 --> 01:28:19,839 Speaker 1: because he's currently at seventy-five thousand, and they're looking 1570 01:28:19,880 --> 01:28:22,400 Speaker 1: to get to a hundred thousand. I was told that the 1571 01:28:22,400 --> 01:28:27,200 Speaker 1: White House sped up his Medal of Honor citation because 1572 01:28:27,680 --> 01:28:30,800 Speaker 1: he has cancer. Yeah, so, I mean, you don't want 1573 01:28:30,800 --> 01:28:35,280 Speaker 1: to be awarded that posthumously. I mean, it's horrible, man, 1574 01:28:35,720 --> 01:28:39,639 Speaker 1: it's horrible. I wonder if it's not burn pit related, 1575 01:28:40,120 --> 01:28:43,960 Speaker 1: but I have no idea. And, um, but man, that's 1576 01:28:44,200 --> 01:28:47,320 Speaker 1: that's a raw deal.
Yeah. So look up GoFundMe, 1577 01:28:47,439 --> 01:28:50,160 Speaker 1: Help Ron Keep Up the Fight, and you should be 1578 01:28:50,200 --> 01:28:53,519 Speaker 1: able to find it, or I'll link to it on 1579 01:28:53,560 --> 01:28:57,080 Speaker 1: our own website. Which brings me to the last stuff 1580 01:28:57,120 --> 01:29:00,160 Speaker 1: that I wanted to mention here. So we're gonna have 1581 01:29:00,320 --> 01:29:04,960 Speaker 1: a site up very soon devoted to SOFREP Radio. On 1582 01:29:05,080 --> 01:29:07,120 Speaker 1: top of that, we're gonna have an app devoted to 1583 01:29:07,120 --> 01:29:09,960 Speaker 1: SOFREP Radio, because I've been asked by people about what's going 1584 01:29:10,080 --> 01:29:11,759 Speaker 1: on with, you know, what used to be the SOFREP 1585 01:29:11,760 --> 01:29:15,120 Speaker 1: app. We do have the NEWSREP app, which 1586 01:29:15,120 --> 01:29:17,880 Speaker 1: is now available on both the iPhone and Android. 1587 01:29:18,479 --> 01:29:21,759 Speaker 1: And I also just launched a Facebook page, just because 1588 01:29:21,800 --> 01:29:25,320 Speaker 1: SOFREP Radio is now a separate thing. So facebook dot com 1589 01:29:25,439 --> 01:29:29,200 Speaker 1: slash sofrepradio. That's a brand new thing that we launched, 1590 01:29:29,240 --> 01:29:31,720 Speaker 1: so just like it on there. I'll post up all 1591 01:29:31,800 --> 01:29:34,519 Speaker 1: the stuff that you need. But the email address is 1592 01:29:34,560 --> 01:29:36,920 Speaker 1: still, because I had some people asking me about this, sofrep 1593 01:29:36,960 --> 01:29:40,559 Speaker 1: dot radio at sofrep dot com. Um, yeah, 1594 01:29:40,560 --> 01:29:42,320 Speaker 1: so a lot of cool stuff on the horizon. I 1595 01:29:42,360 --> 01:29:45,160 Speaker 1: saw a template of what the website looks like. It 1596 01:29:45,200 --> 01:29:48,679 Speaker 1: looks great, um, and I'm looking forward to that going public,
1597 01:29:48,760 --> 01:29:51,320 Speaker 1: What should be in the next couple of days. Last, 1598 01:29:51,360 --> 01:29:53,519 Speaker 1: as we get out of here, I always like to 1599 01:29:53,560 --> 01:29:55,639 Speaker 1: tell you what we have going on here at Hurricane 1600 01:29:55,680 --> 01:29:58,800 Speaker 1: Group with our clubs, with all of our media going on, 1601 01:29:59,320 --> 01:30:02,800 Speaker 1: be sure to check out Crate Club. The exclusive collaboration 1602 01:30:02,880 --> 01:30:06,360 Speaker 1: watch we did with n FW Watches is a big 1603 01:30:06,400 --> 01:30:09,360 Speaker 1: item in the premium create. We have different tiers of 1604 01:30:09,400 --> 01:30:12,479 Speaker 1: membership depending on how prepared you want to be, and 1605 01:30:12,640 --> 01:30:15,760 Speaker 1: gift options are available as well. Scott Whittner from the 1606 01:30:15,760 --> 01:30:18,120 Speaker 1: load out room and the guys are currently working on 1607 01:30:18,240 --> 01:30:24,160 Speaker 1: bringing you custom products in everything from sunglass cases to 1608 01:30:24,320 --> 01:30:27,479 Speaker 1: E d C bags and other manly products. It's a 1609 01:30:27,479 --> 01:30:30,639 Speaker 1: club fore men by men. You can check that all 1610 01:30:30,680 --> 01:30:34,320 Speaker 1: out at Crate Club dot Us. Once again, that's Create 1611 01:30:34,400 --> 01:30:38,320 Speaker 1: Club dot Us. If you're a dog owner, check this out. 1612 01:30:38,320 --> 01:30:41,280 Speaker 1: You're gonna love this. We've just partnered with Kuna. Kuna 1613 01:30:41,320 --> 01:30:48,080 Speaker 1: has the word what the C word? Kuna? What? Wait? 1614 01:30:48,080 --> 01:30:50,519 Speaker 1: What makes you mention that? Though I don't know I 1615 01:30:50,960 --> 01:30:52,920 Speaker 1: that I was like, is that connected to something news 1616 01:30:52,920 --> 01:30:56,200 Speaker 1: worthy right now? Anyway? You shouldn't use the C words. 
1617 01:30:56,800 --> 01:30:59,120 Speaker 1: They have a team of trained canine handlers picking 1618 01:30:59,120 --> 01:31:02,840 Speaker 1: out a box for you each month of healthy treats and 1619 01:31:02,920 --> 01:31:06,040 Speaker 1: training aids. It's custom built for your dog's size and 1620 01:31:06,120 --> 01:31:09,320 Speaker 1: age as well. The products are US-sourced, all natural, 1621 01:31:09,680 --> 01:31:12,240 Speaker 1: and they not only promote a healthy diet, but also 1622 01:31:12,280 --> 01:31:15,400 Speaker 1: promote being active with your dog. So whether we're talking 1623 01:31:15,400 --> 01:31:18,240 Speaker 1: a pit bull or a Chihuahua, whatever size dog you have, 1624 01:31:18,360 --> 01:31:20,519 Speaker 1: this is just what you're looking for. You can see 1625 01:31:20,520 --> 01:31:24,000 Speaker 1: all of that at Kuna dot Dog. That's Kuna dot Dog. 1626 01:31:24,280 --> 01:31:27,400 Speaker 1: It's efficient for you, and your dog will appreciate it as well, 1627 01:31:27,439 --> 01:31:30,960 Speaker 1: of course. And that's C-U-N-A dot D-O-G. 1628 01:31:31,800 --> 01:31:35,639 Speaker 1: I just mentioned Sam Faddis earlier in the podcast, and, 1629 01:31:36,040 --> 01:31:38,720 Speaker 1: as you might know if you're a subscriber to the 1630 01:31:38,760 --> 01:31:40,960 Speaker 1: Spec Ops Channel, Sam Faddis was a part of that 1631 01:31:41,120 --> 01:31:44,360 Speaker 1: last intelligence Inside the Team Room that we did. So 1632 01:31:44,479 --> 01:31:46,439 Speaker 1: if you want to join and become a member of 1633 01:31:46,439 --> 01:31:49,440 Speaker 1: the Spec Ops Channel and get all that exclusive footage 1634 01:31:49,920 --> 01:31:53,559 Speaker 1: at a fifty percent discount on membership, uh, go to spec 1635 01:31:53,640 --> 01:31:56,840 Speaker 1: ops channel dot com.
We also have Training Cell on 1636 01:31:56,960 --> 01:32:00,320 Speaker 1: there, and, uh, yeah, all that great content for four dollars 1637 01:32:01,040 --> 01:32:04,080 Speaker 1: a month. The app is up: spec ops channel dot com. 1638 01:32:04,960 --> 01:32:06,720 Speaker 1: As I said, go to the Facebook page, though, for 1639 01:32:06,800 --> 01:32:10,160 Speaker 1: SOFREP Radio, facebook dot com slash sofrepradio, because I 1640 01:32:10,240 --> 01:32:13,400 Speaker 1: just started that up. The Twitter following, the Instagram following 1641 01:32:13,439 --> 01:32:16,599 Speaker 1: continue to grow, so I guess we're working kind of backward. 1642 01:32:16,640 --> 01:32:19,679 Speaker 1: It's like Facebook, I feel like, is the last thing. Yeah, 1643 01:32:20,000 --> 01:32:22,200 Speaker 1: it kind of is. In a lot of ways we've been 1644 01:32:22,240 --> 01:32:25,760 Speaker 1: on Twitter and Instagram. Instagram is growing pretty rapidly, which 1645 01:32:25,840 --> 01:32:29,439 Speaker 1: is great. Um, gets our visibility up. Uh, and I 1646 01:32:29,560 --> 01:32:32,400 Speaker 1: guess that's it. Anything else from you before we wrap 1647 01:32:32,520 --> 01:32:35,040 Speaker 1: up here? No, man, I think that's it. Um, I'm 1648 01:32:35,120 --> 01:32:38,000 Speaker 1: heading off to a wedding this week for a friend 1649 01:32:38,080 --> 01:32:41,280 Speaker 1: of ours. Yeah. Um, so that'll be pretty... if you 1650 01:32:41,320 --> 01:32:44,840 Speaker 1: want to mention who? But not right now, maybe. Uh, well, 1651 01:32:44,920 --> 01:32:49,160 Speaker 1: we'll see how things go. Um, but his bachelor 1652 01:32:49,320 --> 01:32:51,519 Speaker 1: party is going to be like range day, I think. 1653 01:32:51,600 --> 01:32:54,320 Speaker 1: So we're going out shooting. Nice. Do you 1654 01:32:54,479 --> 01:32:58,760 Speaker 1: prefer that over strippers? Yeah?
No, man, I, uh, you know, 1655 01:32:58,880 --> 01:33:01,880 Speaker 1: at my own wedding, I remember my friend that 1656 01:33:01,960 --> 01:33:03,640 Speaker 1: I went through the Q course with, and we were in 1657 01:33:03,760 --> 01:33:07,680 Speaker 1: 5th Special Forces Group together, and, um, it 1658 01:33:07,760 --> 01:33:10,040 Speaker 1: was at my own wedding and he was there with 1659 01:33:10,120 --> 01:33:12,640 Speaker 1: his wife, and I was having a conversation with his 1660 01:33:12,760 --> 01:33:14,880 Speaker 1: wife, and I was describing, you know, a 1661 01:33:15,120 --> 01:33:17,960 Speaker 1: very mundane bachelor party. I wouldn't even call it that. 1662 01:33:18,479 --> 01:33:20,920 Speaker 1: And I was like, yeah, you know, I partied a 1663 01:33:20,960 --> 01:33:23,360 Speaker 1: lot in my twenties, and you know, I'm kind of 1664 01:33:23,400 --> 01:33:25,400 Speaker 1: done with all that. And she's like, yeah, Jack, I know. 1665 01:33:26,240 --> 01:33:31,960 Speaker 1: I'm like, oh fuck, my reputation precedes me. So yeah. 1666 01:33:32,200 --> 01:33:34,640 Speaker 1: Wasn't your bachelor party, like, you playing Dungeons and 1667 01:33:34,680 --> 01:33:37,000 Speaker 1: Dragons with, um, a couple of 1668 01:33:37,080 --> 01:33:41,760 Speaker 1: guys who work at the website and, uh, my daughter? Yeah, man, 1669 01:33:41,880 --> 01:33:46,040 Speaker 1: twenty dollar bills, it's crazy. Um. So yeah, no, I 1670 01:33:46,080 --> 01:33:50,880 Speaker 1: don't feel any need to go to a gentleman's club. Um. 1671 01:33:51,080 --> 01:33:53,360 Speaker 1: And especially, like, after you've done a bunch of research 1672 01:33:53,920 --> 01:33:56,439 Speaker 1: and stuff on, like, human trafficking, it kind of takes 1673 01:33:56,520 --> 01:33:59,640 Speaker 1: the allure and the fun out of it. You see a 1674 01:33:59,680 --> 01:34:01,800 Speaker 1: lot of that at a strip club.
It's all 1675 01:34:02,000 --> 01:34:05,400 Speaker 1: attached to it. It's all connected, you know. So I mean, 1676 01:34:06,000 --> 01:34:08,879 Speaker 1: the girls that you meet, and I have, that are strippers 1677 01:34:09,479 --> 01:34:12,599 Speaker 1: genuinely are sometimes just girls trying to get through college. 1678 01:34:12,640 --> 01:34:16,400 Speaker 1: Oh yeah, no, those absolutely exist. Um. But 1679 01:34:16,520 --> 01:34:18,519 Speaker 1: a lot of times what happens is that those are 1680 01:34:18,600 --> 01:34:21,599 Speaker 1: girls who have been previously trafficked, those are girls who 1681 01:34:21,640 --> 01:34:25,280 Speaker 1: will be trafficked in the future, or traffickers are washing 1682 01:34:25,320 --> 01:34:28,320 Speaker 1: their money. They're laundering the drug or the trafficking money 1683 01:34:28,760 --> 01:34:33,680 Speaker 1: through legitimate businesses like strip clubs. Um. So they're all 1684 01:34:33,920 --> 01:34:37,840 Speaker 1: associated and connected, and, um, it's hard to separate one 1685 01:34:37,920 --> 01:34:40,719 Speaker 1: from the other. And I'm not saying that every strip 1686 01:34:40,800 --> 01:34:43,800 Speaker 1: club is illegitimate or that every girl you see in 1687 01:34:43,800 --> 01:34:48,120 Speaker 1: a strip club is trafficked, but the rates of how 1688 01:34:48,280 --> 01:34:51,360 Speaker 1: often those women are trafficked are way higher than most 1689 01:34:51,400 --> 01:34:53,560 Speaker 1: people suspect.
And it just takes a lot of 1690 01:34:53,600 --> 01:34:56,080 Speaker 1: fun out of it, when you know otherwise a strip 1691 01:34:56,160 --> 01:34:58,559 Speaker 1: club is just, you know, it's just a big 1692 01:34:58,640 --> 01:35:00,519 Speaker 1: tease and you can go there and it's just fun 1693 01:35:00,600 --> 01:35:03,400 Speaker 1: and whatever, you know. But once you learn some kind 1694 01:35:03,439 --> 01:35:05,720 Speaker 1: of, like, icky stuff behind the scenes, it kind of 1695 01:35:05,760 --> 01:35:08,479 Speaker 1: takes the allure and the fun out of going 1696 01:35:08,560 --> 01:35:11,439 Speaker 1: to a strip club. Someone listening to this podcast right now is 1697 01:35:11,479 --> 01:35:12,920 Speaker 1: like, why do you got to ruin 1698 01:35:13,040 --> 01:35:15,720 Speaker 1: my fun? Yeah. I feel like... fun police, fun 1699 01:35:15,800 --> 01:35:19,240 Speaker 1: police inbound. It could be my ignorance, but I feel 1700 01:35:19,240 --> 01:35:22,479 Speaker 1: like I associate that way more with, like, prostitution, not 1701 01:35:22,640 --> 01:35:26,439 Speaker 1: with strippers. I think, again, prostitution and trafficking, I mean, 1702 01:35:26,520 --> 01:35:29,080 Speaker 1: it's all associated with one another. But I just 1703 01:35:29,160 --> 01:35:31,680 Speaker 1: think of girls I've met in my own life who 1704 01:35:31,720 --> 01:35:34,920 Speaker 1: are strippers that are like, yeah, it's just good money, 1705 01:35:35,040 --> 01:35:37,800 Speaker 1: it's easy money. For some of them that absolutely is, you know. 1706 01:35:37,880 --> 01:35:41,599 Speaker 1: And there are some prostitutes who are independent entrepreneurs. They're 1707 01:35:41,640 --> 01:35:44,840 Speaker 1: not forced into it. They're women making money, and God 1708 01:35:44,920 --> 01:35:47,439 Speaker 1: bless them.
You know, if that's your choice, you know, 1709 01:35:47,560 --> 01:35:51,519 Speaker 1: I'm not here to judge. Um. But a lot of 1710 01:35:51,600 --> 01:35:54,960 Speaker 1: them are not there by choice. So that's 1711 01:35:55,000 --> 01:35:58,519 Speaker 1: when it becomes a problem. Jack Murphy ruining 1712 01:35:58,560 --> 01:36:01,400 Speaker 1: your fun. Sorry, guys. Yeah. Um, sorry, guys. I 1713 01:36:01,560 --> 01:36:03,840 Speaker 1: like naked women as much as the next guy. Um, 1714 01:36:04,120 --> 01:36:06,479 Speaker 1: but yeah, some of the fun of strip clubs 1715 01:36:06,520 --> 01:36:09,080 Speaker 1: has been kind of killed for me personally. All right, well, 1716 01:36:09,120 --> 01:36:12,479 Speaker 1: I'll see you next week. Um, next episode, I am 1717 01:36:12,520 --> 01:36:15,320 Speaker 1: going to do a best of Navy SEALs. It's only 1718 01:36:15,360 --> 01:36:19,160 Speaker 1: fair, because we've done a best of Army special operations where 1719 01:36:19,200 --> 01:36:21,800 Speaker 1: we had Delta Force guys. I'm apparently not allowed to 1720 01:36:21,840 --> 01:36:25,439 Speaker 1: say Delta Force, according to some emails. A Delta Force guy 1721 01:36:25,640 --> 01:36:27,720 Speaker 1: sent an email, I showed you that, where 1722 01:36:27,720 --> 01:36:29,479 Speaker 1: they're like, you don't say the D word, you say 1723 01:36:29,520 --> 01:36:32,160 Speaker 1: the Unit, you say... you know. Okay, 1724 01:36:32,320 --> 01:36:35,559 Speaker 1: and that's fucking stupid. Okay, here's my little 1725 01:36:35,640 --> 01:36:37,559 Speaker 1: rant at the end of this podcast. That's fucking dumb, 1726 01:36:37,640 --> 01:36:40,760 Speaker 1: and I'll tell you why it's dumb.
So if I 1727 01:36:40,840 --> 01:36:43,720 Speaker 1: go on this podcast and I say, oh, Ian, 1728 01:36:44,080 --> 01:36:47,360 Speaker 1: you know, I overheard someone saying the N word, or 1729 01:36:47,439 --> 01:36:50,560 Speaker 1: any time you say the N word, every time I 1730 01:36:50,640 --> 01:36:53,960 Speaker 1: say the N word, you're thinking that word in your head. 1731 01:36:54,479 --> 01:36:56,840 Speaker 1: You're thinking that racial slur. I'm making you think it. 1732 01:36:57,360 --> 01:37:01,080 Speaker 1: So what's the difference between saying the N word and 1733 01:37:01,240 --> 01:37:03,599 Speaker 1: actually saying the racial slur? Well, this guy was saying, 1734 01:37:03,640 --> 01:37:05,680 Speaker 1: you don't say the D word. He was saying you 1735 01:37:05,760 --> 01:37:08,519 Speaker 1: just don't say it, you call them, I can't remember right, 1736 01:37:08,800 --> 01:37:10,920 Speaker 1: you say that they're in the Unit, I don't know. 1737 01:37:12,400 --> 01:37:14,639 Speaker 1: And now I'm telling you why it's stupid. Because 1738 01:37:14,800 --> 01:37:17,840 Speaker 1: if I say the Unit, what am I making you think? 1739 01:37:18,600 --> 01:37:22,920 Speaker 1: Delta Force. Everybody fucking knows exactly what... Even the uninitiated, more 1740 01:37:23,000 --> 01:37:26,400 Speaker 1: people, I think, in the mainstream media, if 1741 01:37:26,400 --> 01:37:28,679 Speaker 1: they hear it reported, they know, they have an idea 1742 01:37:28,680 --> 01:37:31,599 Speaker 1: of it. I can whisper into this microphone, the Unit. 1743 01:37:32,120 --> 01:37:35,439 Speaker 1: Everyone knows what you're saying, bro. Like, what's the 1744 01:37:35,600 --> 01:37:39,160 Speaker 1: point behind it?
And I understand, that's probably because there's some 1745 01:37:39,240 --> 01:37:43,280 Speaker 1: sort of classification around the official name, that is, 1746 01:37:43,360 --> 01:37:47,320 Speaker 1: 1st Special Forces Operational Detachment-Delta. Probably there's a piece 1747 01:37:47,320 --> 01:37:50,200 Speaker 1: of paperwork somewhere that says that's a top secret unit name. 1748 01:37:50,680 --> 01:37:54,640 Speaker 1: So that's why people just say the Unit, especially in, 1749 01:37:54,800 --> 01:37:56,800 Speaker 1: like, a public setting. They don't want to drop the 1750 01:37:56,920 --> 01:37:59,720 Speaker 1: D bomb. I think guys on the show, too, who 1751 01:37:59,760 --> 01:38:02,720 Speaker 1: have been there don't say it, right. And that's why I'm saying 1752 01:38:02,760 --> 01:38:07,000 Speaker 1: it's just stupid. Like, we've had Special Mission Unit, I 1753 01:38:07,120 --> 01:38:11,439 Speaker 1: was a Special Mission Unit operator, from a recent guest. Everybody 1754 01:38:11,560 --> 01:38:14,320 Speaker 1: knows what the fuck you're saying. And I understand why 1755 01:38:14,439 --> 01:38:17,439 Speaker 1: some of those guys are cautious about it. But again, 1756 01:38:17,520 --> 01:38:19,160 Speaker 1: I mean, if I come on here and I say 1757 01:38:19,320 --> 01:38:22,040 Speaker 1: the N word, everyone knows. What's the difference 1758 01:38:22,080 --> 01:38:26,280 Speaker 1: between saying the actual word and saying the euphemism when 1759 01:38:26,400 --> 01:38:29,559 Speaker 1: every swinging dick knows exactly what you're talking about?
1760 01:38:29,880 --> 01:38:32,760 Speaker 1: Do any of these guys, not the guy emailing, but 1761 01:38:32,840 --> 01:38:34,439 Speaker 1: any of the guests who have kind of come on, 1762 01:38:34,520 --> 01:38:36,680 Speaker 1: do you think they worry about being, like, disciplined? 1763 01:38:37,080 --> 01:38:40,479 Speaker 1: They worry about being socially shamed. Um, almost like 1764 01:38:40,560 --> 01:38:43,400 Speaker 1: how people say whoring out the Trident, like that. Former 1765 01:38:43,520 --> 01:38:46,599 Speaker 1: members of the Unit, former members of Delta Force who 1766 01:38:46,600 --> 01:38:50,680 Speaker 1: are operators, um, that unit and people who served there 1767 01:38:50,720 --> 01:38:52,799 Speaker 1: and in JSOC will engage in, like, a social 1768 01:38:52,880 --> 01:38:55,640 Speaker 1: shaming game, you know. Oh, you're saying this, 1769 01:38:55,760 --> 01:38:57,800 Speaker 1: you're not supposed to say that. And they'll, like, 1770 01:38:57,920 --> 01:39:01,120 Speaker 1: shame them and make them an outcast, and, you know, 1771 01:39:01,240 --> 01:39:03,840 Speaker 1: you can become persona non grata with the Unit. Nobody 1772 01:39:03,920 --> 01:39:07,479 Speaker 1: wants that, you know, um, which I understand. But again, 1773 01:39:07,720 --> 01:39:10,360 Speaker 1: like, my point is, okay, so you go around 1774 01:39:10,400 --> 01:39:13,160 Speaker 1: and you say the Unit, or you say JSOC operator 1775 01:39:13,240 --> 01:39:16,160 Speaker 1: or Special Mission Unit operator. Like, what's the fucking 1776 01:39:16,280 --> 01:39:19,400 Speaker 1: point... what's the difference between saying that and saying Delta?
1777 01:39:20,040 --> 01:39:22,439 Speaker 1: Like, it's just, I mean, at some point you're just 1778 01:39:22,520 --> 01:39:25,120 Speaker 1: looking in the mirror, stroking yourself off, thinking that, like, 1779 01:39:25,200 --> 01:39:28,040 Speaker 1: you're saying some clever euphemism that no one can decipher 1780 01:39:28,160 --> 01:39:31,479 Speaker 1: or something. I don't understand the logic behind that. But anyway, 1781 01:39:31,560 --> 01:39:34,000 Speaker 1: that's my rant for the day. Others can 1782 01:39:34,120 --> 01:39:36,679 Speaker 1: go around and say whatever they want, but I'm not gonna 1783 01:39:37,200 --> 01:39:40,919 Speaker 1: go around saying the Unit. That was my reaction 1784 01:39:41,000 --> 01:39:43,720 Speaker 1: to the email, um, when I saw it. It was 1785 01:39:43,760 --> 01:39:46,760 Speaker 1: from a while ago, but anyway, I was reminded of it. 1786 01:39:46,920 --> 01:39:50,360 Speaker 1: But yeah. So I had that best of Army special 1787 01:39:50,479 --> 01:39:54,160 Speaker 1: operations recently where we had Delta Force guys on, Special 1788 01:39:54,360 --> 01:39:59,280 Speaker 1: Forces guys on, and Army Rangers on, uh, compiled it all together. 1789 01:39:59,360 --> 01:40:02,479 Speaker 1: Guys like Nate Boyer, guys like Todd Opalski, 1790 01:40:02,760 --> 01:40:06,040 Speaker 1: George Hand, and people really enjoyed it. So I figured I'd 1791 01:40:06,040 --> 01:40:08,120 Speaker 1: do the same for Navy SEALs, because we've had countless 1792 01:40:08,160 --> 01:40:11,400 Speaker 1: Navy SEALs. No disrespect to any of our operators, you know, 1793 01:40:11,680 --> 01:40:13,880 Speaker 1: and I know they go around and talk about being 1794 01:40:13,960 --> 01:40:17,400 Speaker 1: in the Unit. Uh, you know, they do what 1795 01:40:17,479 --> 01:40:19,760 Speaker 1: they feel is appropriate, no problem then. You know, I 1796 01:40:19,840 --> 01:40:24,200 Speaker 1: have an immense amount of respect for the Unit.
I really, 1797 01:40:24,280 --> 01:40:25,880 Speaker 1: I really do. I have 1798 01:40:25,960 --> 01:40:27,760 Speaker 1: a lot of respect for those guys. I think it's 1799 01:40:27,800 --> 01:40:30,120 Speaker 1: more so than the guy who emailed, who was like, Ian, 1800 01:40:30,320 --> 01:40:34,800 Speaker 1: you sounded ignorant saying Delta Force, I love the Unit. 1801 01:40:35,840 --> 01:40:39,040 Speaker 1: Uh, no, I have an immense amount of respect for 1802 01:40:39,080 --> 01:40:41,280 Speaker 1: those guys, and that's why we've always been very nice 1803 01:40:41,360 --> 01:40:44,400 Speaker 1: to them on film when they come on the show. Um, 1804 01:40:45,080 --> 01:40:48,840 Speaker 1: I mean, most of them are just terrific guys, absolutely, 1805 01:40:48,880 --> 01:40:52,000 Speaker 1: and some of them are like our most listened-to shows. People 1806 01:40:52,080 --> 01:40:56,400 Speaker 1: loved Rob Trevino, people loved, uh, Todd Opalski. And you 1807 01:40:56,439 --> 01:40:58,280 Speaker 1: know what, I should mention, I was actually just on 1808 01:40:58,360 --> 01:41:01,880 Speaker 1: the phone with Nick Kaufman, and I believe that Rob 1809 01:41:01,960 --> 01:41:05,240 Speaker 1: Trevino said to George Hand, after he was on the show, 1810 01:41:05,400 --> 01:41:08,960 Speaker 1: his book did, like, tremendous numbers. And then the most 1811 01:41:09,080 --> 01:41:12,479 Speaker 1: recent example of that was having Mark Giaconia on, which 1812 01:41:12,520 --> 01:41:16,519 Speaker 1: you brought to my attention. Mark's book was relatively unknown. 1813 01:41:16,720 --> 01:41:19,439 Speaker 1: I believe it had about four reviews on Amazon, I 1814 01:41:19,520 --> 01:41:21,680 Speaker 1: saw, and it shot to like number two in its 1815 01:41:21,800 --> 01:41:25,600 Speaker 1: category and number two for the e-book on Amazon. 1816 01:41:26,400 --> 01:41:29,280 Speaker 1: And I think it was Iraq War, that category.
And if 1817 01:41:29,280 --> 01:41:31,920 Speaker 1: you look at the other books in that category, they 1818 01:41:32,000 --> 01:41:35,320 Speaker 1: have like hundreds or in some cases thousands of reviews. 1819 01:41:35,360 --> 01:41:39,120 Speaker 1: We're talking about books like Rob O'Neill's Operator, um, Sean 1820 01:41:39,200 --> 01:41:44,599 Speaker 1: Parnell's first book. And then you see Mark Giaconia suddenly 1821 01:41:44,960 --> 01:41:47,880 Speaker 1: number two in the category, and to be honest, there's 1822 01:41:47,920 --> 01:41:51,080 Speaker 1: no explanation for it other than him coming on. So yeah. Yeah, well, 1823 01:41:51,160 --> 01:41:54,280 Speaker 1: and I'm so happy to lend this platform to, um, 1824 01:41:54,479 --> 01:41:59,200 Speaker 1: you know, former members of the Unit, and especially... and 1825 01:41:59,320 --> 01:42:02,720 Speaker 1: Rob Trevino was in the Unit, um, a great guy. 1826 01:42:02,920 --> 01:42:06,759 Speaker 1: And, I mean, uh, Mark Giaconia is a Special 1827 01:42:06,840 --> 01:42:09,680 Speaker 1: Forces soldier, and I'm so glad that we can lend 1828 01:42:09,720 --> 01:42:13,920 Speaker 1: this platform to them and their voice and, um, you know, 1829 01:42:14,040 --> 01:42:15,960 Speaker 1: kind of pump that up, give them the little bump, 1830 01:42:16,439 --> 01:42:19,280 Speaker 1: and, you know, help them get their word out, to get 1831 01:42:19,320 --> 01:42:22,000 Speaker 1: their voice out there. Um, I think it's so important 1832 01:42:22,040 --> 01:42:24,800 Speaker 1: that people hear that. And I mean, it's better... 1833 01:42:25,280 --> 01:42:27,160 Speaker 1: you don't need to hear everything from me. I mean, 1834 01:42:27,240 --> 01:42:30,000 Speaker 1: go read it directly from the horse's mouth.
1835 01:42:30,040 --> 01:42:32,599 Speaker 1: Read what Rob Trevino has to say, read what Mark 1836 01:42:32,600 --> 01:42:36,120 Speaker 1: Giaconia has to say, and, uh, you know, I think 1837 01:42:36,160 --> 01:42:38,600 Speaker 1: it's amazing, and I'm so happy that we can be 1838 01:42:38,680 --> 01:42:41,759 Speaker 1: a part of that. One last piece of breaking 1839 01:42:41,840 --> 01:42:46,639 Speaker 1: gossip as we wrap the podcast. So Task and Purpose has apparently 1840 01:42:46,720 --> 01:42:50,200 Speaker 1: announced a new editor in chief. I don't know if 1841 01:42:50,240 --> 01:42:54,200 Speaker 1: you're familiar with the chairman, Mao... Paul Szoldra. It's spelled 1842 01:42:54,439 --> 01:42:57,200 Speaker 1: S-Z-O-L-D-R-A. I have no idea 1843 01:42:57,240 --> 01:42:59,800 Speaker 1: who that is. Well, this is their fifth... this is 1844 01:42:59,800 --> 01:43:03,520 Speaker 1: their fifth editor in chief in like four months. So, interesting. 1845 01:43:04,320 --> 01:43:07,599 Speaker 1: All right, thanks as always for checking us out, guys, 1846 01:43:07,960 --> 01:43:10,879 Speaker 1: and we'll be back with a new episode on Friday. 1847 01:43:12,479 --> 01:43:16,800 Speaker 1: You've been listening to SOFREP Radio. New episodes up 1848 01:43:16,840 --> 01:43:20,760 Speaker 1: every Wednesday and Friday. Follow the show on Instagram and 1849 01:43:20,880 --> 01:43:22,920 Speaker 1: Twitter at SOFREP Radio.