Ah, brute force. If it doesn't work, you're just not using enough.

You're listening to SOFREP Radio: special operations, military news, and straight talk with the guys in the community. SOFREP dot com. On time, on target.

Very excited to have John Robb coming on. And I know you've been a follower of his work for years now.

Yeah, I started reading John Robb's website years ago. It's called Global Guerrillas. And he does a lot of theorizing and study of emerging tactics and strategies. He's a very smart guy on essentially the future of warfare, the future of conflict, but also a lot of other subjects. You know, he's talked a lot about AI. You can even get into some of the cultural, and I guess you could say mega, trends in national and international politics, where things are going in the coming years. So we'll definitely talk to him about all that. And he himself is also a former SOF guy. I believe he was in SOF aviation in the Air Force.

Yeah, he was. I was looking up his bio. But he took a very unconventional career path after. I mean, other than the writing, he's not writing about the typical stuff that former military guys usually write about.

No, he's a futurist in many ways. I feel like he's able to see what's coming at us in the future and the way things are developing, and it's often very different than what you'll hear from other sources. When I started reading his work, around 2005, 2006, John Robb was looking primarily at the insurgency brewing in Iraq at that time, and he was seeing how quickly that insurgency evolved. Of course, he can comment on his own work himself. But he sees these things, and in his thinking about tactics and strategy, he's just light years ahead of where the Department of Defense is today.
And I could see that when I was, you know, a young soldier. But he's still out there, he's still doing good work. So we'll get into all of that with him.

That's cool. Yeah, I'm excited to hear from him. I wanted to mention, actually, the first thing that I think I should get into: Oliver North named president of the NRA. And because of that, right when I heard the news, I was like, I've got to grab a clip of when he was on SOFREP Radio. So I put that up there for people to hear. I would love to have him back on to talk about the news. I know that there's obviously some controversy on Twitter about, you know, his past, Iran-Contra. I think it's a good pick. I think Oliver North is a very well-liked guy, and I'm glad he's the president.

He's very controversial. I mean, when I heard the news, I thought that the NRA was deliberately trolling, honestly. Like, I think they picked him because they knew he'd be such an inflammatory figure for the left.

That's true. But I don't know, I've met the guy, I like the guy. I think he's very smart. And also, people bringing up things like Iran-Contra from decades ago, that has nothing to do with the gun rights issue. It shouldn't detract from him being the spokesman for a gun rights organization, in my opinion.

I mean, yes and no. Anybody who gets appointed to a position, people are going to look into their past and see how clean you are or not. Which, you know, brings us to our beloved New York Attorney General.

Yes, I wanted to mention that as well. And the reason I thought it was interesting to bring up is because we got into a whole discussion last episode about the whole feminism movement and toxic masculinity. And here we have this guy who was this huge advocate for the Me Too movement, for feminism.
He was involved in the civil suit against Harvey Weinstein, I believe.

Yeah. Like, I even pulled up a tweet of his. I wrote something about it, but I'll read it. I like the tweet he sent out where he was talking about how slavery is still coded into our legal system and we need to fight it. And it turns out he was calling his girlfriend his brown slave, telling her to call him master, and slapping her around and stuff. And according to them, from what I've read, not consensual at all. But he's trying to play it off. And yeah, I like this tweet: "I'm a proud feminist and I am committed to using the privilege of my office to advance equality, as every elected official should be." And you pretty much said this last episode, that these guys who go over the top to say how feminist they are, they tend to be...

Yeah, they're throwing up a smokescreen.

Yeah. When you're virtue signaling that much, when you're out there hot-dogging it, you know, hashtagging it: Me Too, I'm a feminist, I'm a male feminist, I'm with you girls, let's go. I mean, those people who are really out there hot-dogging it, I never trust them. I always feel like they're throwing up a smokescreen. It's like a guy who's incredibly homophobic; you're like, I think the dude's overcompensating for something, you know what I mean? And that's the case with this guy too. He's throwing up a smokescreen. I mean, it seemed pretty clear. I read the New Yorker article last night, and this guy just treats women like shit. I mean literally, it's like four different women, slapping girls around.

Well, there are going to be more than four women. That's just the four main sources in the article. Spitting on women, slapping them around, hitting them, getting drunk all the time, which is not necessarily a crime.
But getting intoxicated and beating women and calling them whores and sluts and all this kind of stuff, it's like, yeah.

Yeah. And considering that he resigned, I think it was a few hours after the article was published, that's a pretty big admission of guilt there, I would say.

Yeah. I read that article in the New Yorker last night after BK actually mentioned it, and I was like, what are you talking about? What are you talking about, dude? And he's like, oh, your attorney general is going down, man. And so I went and read the article, and I finished reading it, and I walked downstairs and told my wife, I was like, yeah, our attorney general is fucked, he had it coming. And then like an hour later, she turns to me and she's like, yeah, he resigned.

You know what I hate too? I saw an article on CNN that said it was like yet another politician who has let us down, as if, you know, they're tied to him. Some op-ed. And it's just like, I never feel that way, because I don't think you should put your faith in politicians, and I don't think politicians are on our side.

Yeah, I don't think any politician is aligned with me. I don't think any one of them is my friend or gives a shit about me. The only reason a politician gives a shit about you is because you can get them into office. And once you get them into office, their primary goal becomes to get themselves reelected and to get you to keep them in office. It's not like, "Hey, Jack, I'm your buddy, we're allied, bro, we're the same." No, we're not. You know, politicians are part of an elite class, and they are self-interested actors making sure that they get what they want and you don't. It's as simple as that. And the only reason they do anything is because we demand it of them and we hold their feet to the fire.
So thinking that, like, oh, he's our guy, it's a bunch of bullshit. You know, and I respect the New Yorker for sticking to principles and prioritizing the truth ahead of a political agenda. I mean, it's the New Yorker magazine; it's fair to say that these are, you know, the intellectual liberal elites of New York City. But they went after this guy despite him being a prominent Democrat. And a lot of people would say, oh, well, he's too important for the movement, he's too important for our politics. Like we saw happen with Al Franken. Remember all of that?

Al Franken. Yeah.

You know, "he's a Democrat, we need to keep him in office, he's allied." And it's like, I don't know, that's such a weird thing for me to hear.

I would also think the New Yorker, yes, I give them credit, but when you have a scoop that big, I would think they're not going to pass it up no matter who it's about.

I think there's still tremendous pressure internally, and they're going to catch heat from their own people, their own peer group. You know, just like I catch heat from my peer group because I say things about the military that people in the military, or veterans, may not want out there. But I have to prioritize the truth ahead of what's comfortable, because that's my job. And I think the New Yorker did the same in their case, in their culture, with their peers in journalism. And here, you know, the quote-unquote liberal elite of New York City, I'm sure, will criticize them for some of those decisions. But they went ahead and did the right thing.

Yeah. Like I said, I also read the article, like yourself. Pretty horrifying accounts, definitely. And the women were very clear, because, like I said, his excuse was that this was just role play. And he has an M.O.
Right. Across all these different women, the abuse follows a pattern. It's pretty much the same.

Yeah. To be fair, though, I actually didn't even know the name Eric Schneiderman until now.

No, I never paid attention to him. Like I said, I don't think politicians are our friends.

Yeah, that's why I always... I mean, I wouldn't say all conservatives feel this way, but I guess principled constitutional conservatives have this idea of, just follow the Constitution and kind of get out of my way. And I think, you know, you do see it with Trump. I can't just say the left, but they have this thing where it's almost like they fall in love with these politicians, the way they fawned over Hillary.

And it absolutely happened with Obama, and it happened with Trump. They absolutely fell in love. It's incredible to watch. And oftentimes it's the same people who criticized the Obama zombies as being, like, cult followers drinking the Kool-Aid. They did the same thing with Trump. They fell in love with the guy.

Yeah. And I never try to be too, you know... I'll criticize whoever it is, whether they're on the right or the left, if they're doing something I don't agree with. I mean, the only politician I ever saw where I was like, I agree with this man on just about everything, was Ron Paul. But even he has said a few things here and there that I've been critical of. I remember he wrote some pretty inflammatory and, I thought, just uncalled-for stuff on Twitter about Chris Kyle after he died.

I don't remember that.

I do remember that. I thought it was really uncalled for. But yeah, I don't think you should fall in love with politicians generally. We're all flawed human beings, and if you want them to uphold your principles and all that, that's great. But sometimes this is how they turn out to be.
And especially in New York, seeing this pattern: Anthony Weiner, now this guy. I mean, this goes back to ancient Rome. Why do I even have to tell people that politicians aren't our friends? These people are elites. They like to lord over us, they like to abuse us, and the only reason they don't is because we hold them accountable. Yeah, that's it. I mean, there are no... You know, the conservative politicians are not our friends, the liberal politicians are not our friends. Really, they're all a bunch of assholes who go to the same cocktail parties. I mean, they all hang out together. You want to talk about collusion? These people collude with one another to stay in power.

Yeah, which they have to, to be fair. I mean, I also don't want to see us in a... Well, yeah, a state of civil war, which some people do want to see, where it's just like, Democrats aren't even talking to Republicans, they won't discuss anything, they won't work on any legislation together. Like, that's not a good way of doing things.

No, it's not. And I'm definitely not saying that our politicians should not work together, or that us, as politically interested people, shouldn't negotiate. I mean, that's democracy: you come together and you negotiate. You're not going to agree on all issues, but maybe you'll find some common ground. And if we don't do that, then legislation is never going to get passed on any subject. And oftentimes the best periods of American history were when we had, like, two separate branches of government, when we had a Democratic Congress and a Republican president. People always talk about Reagan and Tip O'Neill, or Newt Gingrich and Bill Clinton. Those are years that a lot of people think were the best years of our country. Yeah.
Well, I don't know if I necessarily agree, but I'm just... I think a lot of times when we have one branch in total power...

It's... Well, the left and the right are just two sides of the same coin, and they desperately need each other. They need each other. It's like pro wrestling.

Yeah, yeah, it's pro wrestling.

Because when Republicans are in office, they need someone to blame for their failures, and when Democrats are in power, they need someone they can blame for their failures. So, I mean, these two need each other. If it was just one party, then they would be responsible for everything. And so what does one party become? It becomes something like communism. It becomes something that claims to be utopian. It has to be. You know, you're sounding like Soviet Russia or North Korea. It's like, man, you guys got it really good, we're living in a utopia, it's those people over there in Western civilization who are really screwed. It just turns into this nutty nonsense. Or, you know, "Trump is a fascist" on the other side. People come up with all these weird ideas about politics, how all the left is destroying America, or all the right are a bunch of neo-fascists and they're destroying America. It's like, no, these people are all basically the same. They have way more in common with each other than they do with any of us, and they all desperately, you know, the left and right desperately need one another in order to perpetuate the system that they exist in.

Yeah. I know we want to talk about this article that ties into John Robb's stuff, the "Meet the Renegades of the Intellectual Dark Web" piece.

I don't know if that necessarily ties directly into Robb's work, but John Robb has commented on sort of cultural trends and things like that, and we can ask him about it. He might have some thoughts. I thought it was a good article, though.
You sent it to me, and it talks about these people who are not in the mainstream going on podcasts, like Joe Rogan's. The New York Times article, what's it called? It's "Meet the Renegades of the Intellectual Dark Web." It spotlights people that I'm actually a fan of. I know you're a fan. Jordan Peterson you like a lot. Christina Hoff Sommers is talked about in there. Ben Shapiro, who I greatly like.

I know a little bit about Shapiro, and I'm familiar with Peterson's work, but the others mentioned in the article I don't really know so well. Sam Harris.

But I think the point was that these are people you're generally not seeing. Ben Shapiro you do see on mainstream media now and again, but generally, I guess, you're not seeing them. And these are people open to discussing their views with other people on the left and on the right. They don't necessarily have anything in common, but these people mentioned in the article will sit down and have an intellectual debate amongst one another, and their views contradict the current narrative. It's like there's a narrative that just feels like it's being shoved down the throats of people who are conservative, but maybe not even conservative, just people who are not on the far left, you know.

Yeah, there are a lot of those who I feel are being alienated because they're, you know, moderate Democrats, people like Rick Ungar, who we've had on Brandon's podcast. By the way, speaking of the far left, I was going to mention this article I commented on on Twitter. Did you see this? This is from CNN, and it's like, you wonder why CNN just becomes more and more unpopular. This is by some guy, Richard Edmund Vargas: guns alone don't kill people, patriarchy kills people. And the whole article goes into all these problems of the patriarchy.
And this totally ties into the virtue signaling thing; it's a man who wrote this article. That stuff is so weird. And they say things like, we live in this patriarchy, a lot of things I've never experienced in life. But here it says: "Patriarchy is a social system that defines men as being inherently violent, dominant, and controlling, while rewarding them with power for being that way. It's no secret, especially these days, that we live in a patriarchal society. Why are we continually surprised when a man takes up arms and commits a mass murder?" I mean, I feel like these people live in an alternate reality. And, you know, you don't see these op-eds on CNN about psychotropic drugs. I mean, mental health is being discussed, but I think that's a major issue. And then there's the major issue they of course will not discuss, and I know you talked about Dana Loesch saying it at the NRA convention: they make these mass shooters famous, and it gets them ratings.

Yeah, I don't think CNN will talk about that. But that's a complicated issue too. I mean, should the media just not report on mass shootings? I don't believe that. But at the same time, do we need 24/7 coverage of every shooting? You know, like the way we had after, say, Columbine. That was just all over the news for days on end.

An example I can think of: do you remember the Tsarnaev brother on the cover of Rolling Stone magazine? Looking at that made me angry, because, I mean, it's not necessarily that he was on the cover, but...

Yeah, they were trying to turn him into, like, a rock star.

Yeah, it was just totally provocative. They did that to piss people off and get some attention, I think. And it worked.
But yeah, that's why I tell all of my friends, if you ever get a call from Rolling Stone, hang up the phone. I've gotten calls from them, and emails. I made the mistake of talking to one Rolling Stone journalist early on in my journalistic career, and I never talked to them again. I know another guy, I don't know if he still works there or not, but he wanted to talk to me about an article, actually about Wayne Simmons, and I refused. Because he was working at Rolling Stone, I was like, look, it's nothing personal, but I don't trust your editors. I don't trust that outlet. I think they have serious ethical issues. And, you know, if you decide to go work for another magazine or another newspaper or something, we can talk. That's fine. But I'm not going to have anything to do with Rolling Stone. And I would advise other people not to; I'd tell them the same thing.

Yeah, it's understandable, especially after that article, and other stuff. I mean, remember, that was like one of those campus gang rape stories?

Yeah, I think so. And it turned out it wasn't true.

Well, yeah, there's been so much of that. I may have mentioned this on the podcast before, but I went to, like, middle school and part of high school with one of the Duke lacrosse kids who was accused of being a rapist. And it turns out, if you watch the 30 for 30 on it, he wasn't even there when it went down. A lot of those guys weren't.

What I wrote about in my book, and I don't have any, like, firsthand knowledge about this at all, except that I was there when this whole fiasco was happening, was the mattress girl at Columbia. I was actually there the first day, when she got out of the van lugging the mattress with her. And she carried the mattress that she was supposedly raped on around campus as a piece of performance art, and was given course credit for that.
So I walk up to school the first day of the semester, and there she is getting out of the van, and the paparazzi are all around her, taking pictures, flashes everywhere. And I didn't even know what it was about at the time. I was like, what in the world? Like, another day in New York City, you know? And she claimed that she had been raped on campus by another student, a student from Germany, and he got raked over the coals. And I mean, it looks like he just did not do it. Like, the whole case fell apart and everything. So he's probably an innocent victim in all of that.

Holy shit.

So, for the first time on SOFREP Radio: John Robb, former Air Force special operations pilot and military analyst, now really most known as an author and entrepreneur. He's penned several books, but right now he's doing the Global Guerrillas Report. And we're excited to have you on. I mean, there's a lot to talk about, and you have just such an interesting background. I came upon your work when I was in the military, and I was reading your Global Guerrillas blog. This was, like, right smack dab in the middle of the insurgency in Iraq, and I just found that you were, you know, light years ahead of where everyone else was as far as understanding what was going on over there. And actually, I saw you speak at Austin Peay University in Clarksville. I was in 5th Group. I saw on your website, or something like that, that you announced it, and I went over there and saw you speak. Very interesting stuff. I felt like there were more people in DoD who should probably be listening to you.

Well, cool.

That's how I came across your work, and I hope other people go and check out your Global Guerrillas website, which is really interesting.
And I guess I was wondering if we could start off talking a little bit about where we've come since that time frame, like, say, from when you started writing the website about the evolution of insurgencies and unconventional warfare, I guess you could say, to today and where we're at now.

Okay. Well, I started writing the site back in 2004, and I did it because, you know, what I was seeing coming out of the DoD and being reported in the press wasn't making sense to me, particularly in light of my background. And also, for the years in the run-up to 2004, I had been in software companies and I had been immersed in the Internet. I was one of the, you know, early, probably the first, full-time Internet analysts back then. And there were a lot of things going on in the Internet that had applicability to what I was seeing in Iraq, that, you know, kind of connected the dots or made more sense to me.

And one of the things I pulled out was this idea of open source insurgency: this idea that you're not going to see these big ideological insurgencies anymore, whether communism or, say, Palestinian nationalism back in the seventies, with these very hierarchical structures, with these different committees that are mirror images of the nation-state. You have the propaganda element, and you have the, you know, supply bureau in the insurgency. This new type of insurgency worked like open source software, with some modifications. And I had to figure those out as I was going along, and I started writing the Global Guerrillas blog in real time, and I was getting good feedback as I was going.
And I wanted to make sure also that when I was writing the theory, I tied it to actual events, because there's nothing worse than really getting into theory crafting and, you know, going way out on a limb, or going way out into space, and having this beautiful theory that's elegant but, you know, doesn't explain anything and isn't predictive. And what I found in kind of modeling the insurgency is this idea that there are lots of small groups that can work together to take on a much larger foe. You know, sixty, seventy different groups in Iraq, all with their different motivations for fighting, coming together to, you know, share resources, share knowledge, sometimes indirectly, sometimes directly, and putting together an insurgency that can actually, you know, do amazingly well and be amazingly innovative.

I don't know what your thoughts are, but I feel like we've seen almost the culmination of this theory in Syria. And, you know, the New York Times and others have come out with articles recently talking about how, like, ISIS has had this fairly well-administered state. But the way I perceived things is that it looked like there were many different groups on the ground, and they were coming together in a very ad hoc manner, from battle to battle, almost like a very kind of laissez-faire type of capitalism, if I were to draw some sort of parallel. They'd just kind of be thrown together for one battle, and then the alliance would completely shift, and a different group of people would come together for the next battle. What do you make of how that insurgency kind of jumped from Iraq to Syria, in some ways, and how it evolved?

Well, you know, many of the same lessons that were learned by the Iraqi insurgents were ported to the Syrian insurgency. And, you know, things like... I describe it as stigmergy.
It's the kind of communication that ants have. You know, ants don't talk to each other directly. What they do is they leave a chemical trail to a piece of food, and each ant that travels that trail and is successful in finding the food, you know, reinforces the trail. And that kind of decentralized information sharing, you know, allows them to act as a group. And in the case of an insurgency, if you have some level of success in conducting an attack, and it's covered by the press, or it's picked up as a success and it's described, what other groups do is they just copy it. Just like in open source software: you copy it and then you apply it.

What was interesting also about the Syrian insurgency is that, you know, ISIS was able to go online and create kind of a decentralized version of it online, to do the kind of recruiting that it was able to pull off. I mean, that was a, you know, pretty major innovation. It was able to pull in about thirty thousand people.

And also franchising itself out, or attempting to, in Libya, Somalia. I think there were attempts in Nigeria, and I believe in the Philippines.

Absolutely. They tried to, or you saw local groups trying to aspire and call themselves ISIS. Like that sort of plug-and-play ideology, or insurgency.
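The trail-reinforcement mechanism Robb describes above, stigmergy, is also the core loop of ant colony optimization in computing. For readers who want to see the dynamic concretely, here is a minimal Python sketch of that loop; the trail names, lengths, and parameter values are invented for illustration and are not taken from anything said in the episode.

```python
import random

# Minimal sketch of stigmergic (pheromone-trail) coordination.
# Trails, lengths, and parameters here are hypothetical, for illustration only.

PATHS = {"short": 2.0, "medium": 4.0, "long": 8.0}  # trail lengths (made up)
pheromone = {name: 1.0 for name in PATHS}           # every trail starts equal

EVAPORATION = 0.1  # trails fade unless they keep getting walked
DEPOSIT = 1.0      # a successful trip reinforces the trail that was used

def choose_trail():
    """Pick a trail with probability proportional to its pheromone level."""
    total = sum(pheromone.values())
    r = random.uniform(0, total)
    for name, level in pheromone.items():
        r -= level
        if r <= 0:
            return name
    return name  # floating-point edge case: fall back to the last trail

for _ in range(200):
    trail = choose_trail()
    for name in pheromone:                 # evaporation weakens every trail
        pheromone[name] *= 1 - EVAPORATION
    pheromone[trail] += DEPOSIT / PATHS[trail]  # shorter trip, stronger signal

print(max(pheromone, key=pheromone.get))  # usually prints "short"
```

No agent here sees the whole map, and nobody issues orders; the best trail wins because each successful trip leaves a stronger signal for the next walker to copy. That is the same copy-what-worked dynamic Robb attributes to open source insurgencies.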
Yeah, and it was pretty effective. They were able to play the whole caliphate angle, and this idea of fealty, which is kind of a... It's a very medieval concept, different than loyalty. Fealty is, well, you pledge to uphold your liege lord's goals in everything that you do, and you're given incredible amounts of leeway and independence in doing that. You've pledged yourself to your liege, and it's a, you know, very decentralized kind of thing. It's not... In feudal societies, there wasn't that kind of direct command and control of a bureaucracy or a hierarchy. Each of the lords was operating independently, but they operated in their liege lord's interest. That's fealty.

I was just going to say, they had to operate like that because they didn't have the telecommunications infrastructure. And I imagine, you know, you could probably comment on the agrarian practices playing into it as well.

Yeah. I mean, but they ported a version of that to the online world, and they were able to, you know, incentivize some people to take up arms and conduct relatively disconnected attacks. I mean, they didn't go through the standard planning process and the kind of hand-holding and support that you typically see with terrorist organizations. In order to pledge their fealty, they were obligated to do all the planning on their own and then conduct the attack, and the attack itself was actually the pledge. And you saw that in numerous instances, in recent attacks in Europe and even in the US.

In the press, they often say that ISIS is barbaric and primitive and medieval, and I suppose they really are all of those things. But what do you make of the appeal of a group like that in 2015, '16, '17, '18? Why does such a primitive, barbaric group have an enduring appeal to people around the world? And I mean, it's not the majority of people, of course, but a significant minority of people are still drawn to that kind of thing.

Oh, well, there are, you know, proximate factors, having to do with, you know, bad local economies and lack of opportunity. If you don't have a job, you can't get married, that kind of thing. But there's also this kind of sense that, you know, globalization makes everything relative. And we're seeing this in the States too.
That is, you know, once you lower the barriers to everything, everything is as good as everything else. The stuff you have, cultures, countries: you can't hold up one over the other. And that creates a huge vacuum, kind of a big loss in people.

Yeah, people don't know what to fill that with.

Yeah, they just... They don't have this kind of group that they belong to, this kind of tribalism. It's kind of called fictive kinship: this idea that you have this kind of cohesive narrative that unites you, and you have, you know, traditions and other things that make you part of a group, and that the other people that are part of that group are your people, that you give some level of loyalty to, and you protect them and they protect you. In a globalized world, that doesn't exist. This global commercial culture is just devoid of that. And that belonging, you know, you've got the Jungian idea that it's almost hardwired into our beings. We're always, you know, searching for that kind of belonging. So what happens is that when nationalism...

You know, you've heard of the post-Westphalian stuff out of Martin van Creveld. I mean, I think a lot of people have talked about it, even, you know, some more notable people, Henry Kissinger and such: whether we're moving into some system like that. I know you've commented on us perhaps moving into, like, a post-state society.

Well, it's the decline of the state, or it's kind of a hollowing out of the state as we globalize. And, you know, that creates this kind of vacuum, and people are filling it with all sorts of things they can get their hands on. And if they're not, in some cases... You know, I kind of connect the, you know, high rates of alcoholism and the declining longevity of Russians after the fall of the Soviet Union. You know, they're drinking.
593 00:32:56,760 --> 00:32:58,720 Speaker 1: You know, the guys in the prime of life are drinking themselves to death. And we're seeing this today to a certain extent in the US with the opioid crisis. It's not the young people or the old people, who you'd normally expect to have problems with addiction; almost all the deaths are occurring in the thirty-five to fifty-four range. People have lost their personal narrative. Correct, yeah, they've lost that kind of stuff that connects them. And when you talk about this, it reminds me: have you seen the Adam Curtis documentary HyperNormalisation? Yeah, I thought that was quite good. Yeah, it sounds exactly like what you're talking about. That we don't have a social, societal-scale narrative. Everyone knows it doesn't work, that these stories we're told by our leaders don't function, they don't mean anything anymore, but we have nothing to replace them with. Correct, and that's a big problem. That's what I'm trying to do with the reports, as I'm working through these different things: what the open source element has done to our politics as we move towards an open source, nonlinear kind of politics, and then trying to figure out AI, basically the big social AIs. I mean, we've seen, at least with Facebook and YouTube and others, that the biggest real-world, real-time application of artificial intelligence is being done in these big social networks, and these AIs are touching two and a half billion people right now, influencing them and managing them. I was grappling with that for a bit, and with what China is doing with it, and with this idea of declining tribalism: we don't have this fictive kinship that connects us as Americans anymore, or less so, and what we're seeing is a rise of kind of an identity politics.
We're seeing, you know, existential crises, with people just basically killing themselves. And for the first time in modern history, US life expectancy has gone down for the second year in a row; in a modern country, it's gone down. I was looking over your website a little bit before we got started, and one of the things that stuck out to me: you said we're in the middle of uncharted nonlinear territory when it comes to politics. I think you touched upon that, but could you expand a little bit on what you mean, that we're kind of going out into the deep end of the pool without really knowing where the bottom is? Yeah. The problem is that our political parties and this left-right kind of spectrum started falling apart around the turn of the century, and we didn't really have anything to replace them with. And then we saw the elites making bad decision after bad decision. Excuse me, our experts making bad decisions. Yeah, yeah. Nine eleven happens and we invade Iraq, which doesn't make any sense; we were trying to connect the two. Or we allow the whole banking and derivatives thing to get totally out of hand and let people make amazing, you know, large bets, and then the financial crisis comes and they don't have to pay any price for doing it: we make them whole, and we pay, we socialize the cost of it. And after that, particularly the financial crisis, we saw a rise of lots of populist movements, as they've been called, but basically they're groups of individuals, each with their own motivation, coming together in kind of open source networks that express their disgust with how things are being run.
660 00:36:41,880 --> 00:36:45,120 Speaker 1: The Occupy movement. Yeah, we saw the Occupy movement, we saw the Tea Party movement, and then most recently we saw an insurgency very similar to the kind of insurgency I was describing in Iraq, the open source insurgency, or the kind of open source protests that made it possible to topple Tunisia and topple Egypt, and then we saw a flare-up in Syria. We saw an open source election campaign that propelled Trump, and Trump was like this wrecking ball that was sent to Washington to just express displeasure. And no matter what he did, all these people with all their different motivations kind of said: we're here for one reason, just to get him to Washington to cause trouble. Right. It's interesting, because these are people primarily on the right, but they're expressing this very Leninist attitude of, like, smash the state. Yeah, exactly, and that insurgency has basically overrun the Republican Party. A lot of that was Steve Bannon. You know, like he said himself, he's a Leninist, and he was a big believer... but he was influenced by that book. Do you know what it is, John? There was a book that was a huge influence on him. It was called, like... I don't know if it's Every Hundred Years or something like that, but he was a believer that after a certain amount of time the entire political system would basically turn on its head, and you have to collapse the whole system and start new. And that was kind of what he wanted to do with Trump. Yeah, it was a cycle theory of history. Yeah, I'm not a big fan of the cycle theories. I mean, you can always look at those hundred-year cycles. But I liked that one, because I looked it up: it's called The Fourth Turning: What the Cycles of History Tell Us About America's Next Rendezvous with Destiny. Apparently, and I'm looking at an article from Business Insider here, he
was obsessed with this book, and I've heard that before. Yeah, he liked that a lot. He definitely had a good nose for what was going on with the insurgency. He saw the insurgency rising and he fed it. You saw the economic discontent, you saw the sense of betrayal that was driving a lot of this. There are just lots of things that were, yeah, pushed on people very, very quickly. I mean, things like Chinese trade, trade with China. You know, in the nineties we had a similar problem with Japan, and when it hit about sixty billion dollars in a trade deficit and was smashing the US auto industry, Baker, who was Secretary of State at the time, basically told Japan: knock it off, keep it at sixty, until you figure out how to either import more or freeze it where it is. And that allowed Detroit to recover and create the, you know, viable car industry that we see today. The problem is that in about two thousand two, the Chinese trade deficit was hitting about [unintelligible] billion, and we were focused on nine eleven, Iraq, and Afghanistan, and we didn't do anything about it. It just swept across the whole country, and by the late aughts you had stories of Chinese companies just buying all of the machine tools out of factories across the United States. It's very similar to the way the Russians were stripping Germany of its industry after World War Two. It was just devastating, and the trade came on so quickly and it wasn't managed; that rate of onset, you know, swamped us. And you can actually see a lot of connection between the counties that were impacted heavily by Chinese trade and their vote for Trump.
I mean, I'm interested in your thoughts on China, because we've talked about that quite a bit on here in the past, and you know some of my feelings: that China is, you know, like a freight train trying to supplant the United States as, you know, a global superpower, global hegemon, or however you prefer to think about it. Yeah, well, there are a couple of things. There's that Belt and Road Initiative, right, basically trying to set up a global transportation infrastructure that they own, that they hold the debt on, that they manage, that they provide security for. And that means that if you're a US company trying to trade in Africa, or you're trying to trade in Asia, and you're trying to ship goods or get goods shipped to you, what you're going to run into is that your stuff takes two to three weeks longer to arrive, or it gets hit with an extra tariff, or it disappears; it doesn't get the kind of protection the Chinese goods get. It's the end of, what's the term, innocent passage? Right. Yeah, it's a very ambitious and very expensive endeavor. And given that we're in no condition to fight it or counter it or anything... usually what would happen is that you'd try to fight a kind of rearguard action and just break it a little bit, slow it down, make it more expensive. Right. Well, and during the Cold War there was a so-called war for the Third World, where there was the theory that the Soviets were going to foment revolutions in all these Third World countries and essentially box in the United States by limiting our freedom of movement around the world. And at the time we had a national agenda, or identity, you know, a national narrative, the way you talked about. Today we're countering China and we don't have any of that.
764 00:42:35,280 --> 00:42:39,319 Speaker 1: And the thing that we have on our side, I guess, is that the kind of instability, the kind of nonlinearity that's hitting us because of globalization, is just about to hit China too. Well, I'm really interested to hear you expand on that, because, you know, with the Chinese control measures and social credit schemes that we've talked about, how is that nonlinear aspect going to hit China? Yeah. I've framed it as trying to maintain coherence, you know, socio-political coherence. Coherence is a really cool word. It's this idea that you can think logically and clearly, that you can connect the facts in a way that makes sense, that's logically connected and aligned. In a globalized environment that's tough, because all the interconnectivity and complexity make cause and effect kind of break down in an extremely complex system. It's been impacting us, and we have yet to figure out a solution for it; we'll probably come up with a messy one, given our past. China did exactly the opposite. They now understand that they have this incredibly complex system internally as well as externally, and they're going to have to deal with it. And what they decided to do is build a social credit system. They're doing it in part by outsourcing part of it to the big social networks, yeah, you know, WeChat and Alibaba, and they're also doing it in part through each of the big cities; they are implementing versions of it.
And what that's trying to do is gamify this whole idea of what's right and what's wrong in society. If you do things that are loyal and within the public morality, you get points, and if you do things that they deem immoral or unethical, or if you criticize the state, or if you say nasty stuff online, you lose points. And the way they designed the system is that if you have a high social credit score, you can get all sorts of great benefits: they can give you, you know, the Schengen visa, go visit all the different European countries; you can get access to loans and different opportunities you wouldn't be able to get access to otherwise. But if you get a low score, that low score can prevent you from getting loans, it can prevent you from renting or buying property, it can prevent you from traveling on trains and airplanes. If it gets to the point where you're blacklisted, you even get a different dial tone in certain parts of China, so that when people call you it makes a funny sound that identifies you as one of the people on the blacklist. But the architecture of it, the way it works as a network, is really the killer here: your score impacts your family's score, your score impacts your coworkers and your friends online. So if you drag them down, peer pressure kicks in. Correct, it's all connected, and it has a kind of Confucian, you know, Han morality baked into it to provide kind of social cohesion. It's a kind of system that's so locked down, though, that it has the potential of really killing off any kind of innovation, anyone stepping outside the line. Yeah.
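A minimal sketch, in Python, of the networked scoring mechanic described here: points for approved behavior, penalties for criticism, and a spillover rule so that one person's penalty drags down their family and friends. All names, point values, thresholds, and the spillover fraction are illustrative assumptions, not details of the actual Chinese system.

    # Toy model of a networked social-credit score (illustrative only;
    # point values, thresholds, and the spillover rule are assumptions).
    from collections import defaultdict

    SPILLOVER = 0.1  # assumed fraction of a score change passed to contacts

    class ToyCreditLedger:
        def __init__(self):
            self.scores = defaultdict(lambda: 1000.0)  # assumed starting score
            self.contacts = defaultdict(set)

        def link(self, a, b):
            # Record a family / coworker / friend tie (bidirectional).
            self.contacts[a].add(b)
            self.contacts[b].add(a)

        def adjust(self, person, delta):
            # Apply a score change, then pass a fraction of it to contacts;
            # this network effect is what makes peer pressure kick in.
            self.scores[person] += delta
            for peer in self.contacts[person]:
                self.scores[peer] += delta * SPILLOVER

        def privileges(self, person):
            # Invented thresholds standing in for visas, loans, travel bans.
            s = self.scores[person]
            if s >= 1100:
                return "easy visas and loans"
            if s <= 900:
                return "blacklisted: no trains, planes, or loans"
            return "neutral"

    ledger = ToyCreditLedger()
    ledger.link("wei", "li")            # hypothetical people
    ledger.adjust("wei", -150)          # e.g. criticizing the state online
    print(ledger.scores["li"])          # 985.0: the contact is dragged down too
    print(ledger.privileges("wei"))     # blacklisted: ...

The one-line takeaway: because adjust() touches the whole neighborhood, the cheapest way to protect your own score is to police the people around you.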
Absolutely. I mean, I think the Chinese Communist government has already done a pretty good job at that, hence the corporate espionage and technology transfers and everything else that have gone on over the last couple of decades. But, I mean, human beings don't really like being put in boxes, as history shows us. Where do you see this all going in the future for China, especially when you factor in some of the other chaotic aspects, the uncertainties that exist in the modern world that are going to impact China as well? Well, the thing is, I think the system actually could be pretty effective in terms of providing them stability in the short term, because it's going to combine government information and private information in a way that we haven't seen. Even your purchasing habits: they don't have a retail infrastructure like we do, in China everything is mostly done online and delivered, so all of your purchasing is being factored into this thing. And this peer pressure, in a society that is used to peer pressure, or accepts it, could provide them a lot of stability. But over the long term, I think they're going to run into the same problem that communism ran into: by leaving out the market element, market decision making, and sticking only with the bureaucratic decision making model, they weren't able to innovate and come up with stuff like computers; at least the kind of computers, the kind of innovations, that allowed us to pull way ahead and create a very complex, wildly complex economy by the mid-eighties. That's probably going to be their problem too. But if they can stay stable while we fall apart, right, then maybe it's a price that's worth paying.
I mean, do you see that happening here in the United States? You had just mentioned earlier these so-called political insurgencies, ideas about smashing the state, not having a political or social identity. Where is it going to go in the United States? How is that going to happen? Yeah, that's an open question. I was kind of debating whether or not to do a report on it. What the reports do for me is, they're so ruthlessly logical: I walk through the argument and it kind of teaches me as I go; I learn as I'm writing, because I write in this very structured style, a paragraph, bullet, bullet, bullet. It's really easy to read, but the logic is, like, inexorable as it goes through. I was debating whether I was going to write a report on the potential for civil conflict in the States. And there's a rising potential, in terms of all of these different centers of gravity that have kind of erupted, and a lot of the older structures are starting to be delegitimized. I mean, institutional delegitimization is rampant now. You look at everybody, I mean every single institution in the United States, even the US military, is in decline; everything from the Department of Justice to the Boy Scouts, it seems. Oh yeah, and every time we try to fix a crisis in particular, we end up destroying an institution. So right now we're destroying the FBI's credibility; it's just getting shot to hell, and that can't be good. The distrust is growing, and there's an increasing amount of... this is kind of interesting to me, this similarity to our Civil War: you'll see jurisdictions disregarding laws, yes, and they won't comply, and they'll start passing laws that do funky
things. Like, Connecticut just yesterday passed a bill that said they'd give all their electoral votes to the winner of the national popular vote. That's legal, though, is it? Yeah, I guess it is. Really? Yeah. So say everybody votes in Connecticut, and say all the votes go to a Republican candidate, which is not likely for Connecticut, but say they did, and then the national popular vote goes Democratic: they would switch that whole state and give it to the Democrat. It's like a form of, what, anarcho-capitalism or something? Yeah, whatever we want. Yeah. Not only are they kind of negating the votes of their citizens, it's a thing that makes a connection that goes around the electoral process. It's what I call a political derivative: like a financial derivative, it makes a connection that goes around the market, right? I was in financial derivatives for a bit back in the nineties, and a lot of the work was just trying to make non-market connections that linked different industries in ways that allowed you to bet or reduce risk. And all these connections, because they didn't go through the clearing house, in this case the election, in that case the market, created all sorts of instabilities. When things went bad, these things started moving, and companies started going bankrupt, and that risk got spread globally, and it didn't go through the markets; it went along this parallel network, this parallel spider web of derivatives. The same thing you can see happening in politics: all these different connections being made, a weird kind of political derivative, like we saw in Connecticut, or California not complying with immigration law.
You end up with these pockets of instability and, you know, drives to kind of say, okay, we're not going to listen to anything you say. Or the converse: the state tries to enforce the laws in a heavy-handed way, which causes a reaction. By the way, I just pulled up the article about this, since we were talking about it. The one catch here is that this will not kick in, and I'm talking about Connecticut, unless it's backed by enough states and other voting areas to claim a majority of Electoral College votes. So it says, in addition to Connecticut, the other jurisdictions in the pact are California, Hawaii, Illinois, Maryland, Massachusetts, New Jersey, New York, Rhode Island, Vermont, and Washington State, along with D.C. But I highly doubt it's going to pass in all those areas, so I'm guessing what that means is, unless it passes in quite a bit of the map, this probably won't actually happen. Yeah, potentially. I mean, it's just the effort. Yeah, it's a lot of people who, I guess, feel disenfranchised by the Electoral College system. And it's a weird thing, because this is not the first time that the person who won the popular vote didn't win the electoral vote. But I think it's almost like this attitude in the country that Trump somehow, like, cheated his way to the presidency, and it's like, he won fair and square on the rules that we have. Yeah, it's a funny situation. I mean, the Electoral College was set up so you wouldn't have regional candidates, right, right, to protect the farmers and people. It kind of spreads things out, to make sure that you have candidates that are at least semi middle-of-the-road. You couldn't have a candidate who does well in one region and, because that candidate gets the vote in that region, swamps the system; that would cause, you know, a breakup.
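For concreteness, the compact's trigger is simple arithmetic: it only takes effect once member jurisdictions control a majority, 270, of the 538 electoral votes. A quick sketch using the members listed in the article above, with electoral-vote counts from the 2010 apportionment (the counts are the assumption here; the threshold rule is the compact's own):

    # Does the National Popular Vote compact take effect yet?
    # Electoral-vote counts per the 2010 apportionment.
    MEMBERS = {
        "California": 55, "Connecticut": 7, "Hawaii": 4, "Illinois": 20,
        "Maryland": 10, "Massachusetts": 11, "New Jersey": 14,
        "New York": 29, "Rhode Island": 4, "Vermont": 3,
        "Washington": 12, "D.C.": 3,
    }
    THRESHOLD = 270  # majority of the 538 electoral votes

    total = sum(MEMBERS.values())   # 172
    if total >= THRESHOLD:
        print("Compact is active")
    else:
        print(f"Dormant: {total} votes, needs {THRESHOLD - total} more")

With Connecticut included, the members hold 172 electoral votes, which is why the law stays dormant: it would need another 98 before any state's electors actually switch.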
964 00:54:47,560 --> 00:54:49,720 Speaker 1: And in this case, what we're seeing is that the extreme popularity of the Dems in California and on the East Coast has created kind of a barbell of votes. You have, you know, huge support in California, and you have a pretty tight race across most of the rest of the country until you get to the East Coast, and then you have again those big margins in Massachusetts and New York for the Dems. Incidentally, I think that's the reason why the polls were so off going into the last presidential election: they were looking at the national polls and seeing a significant lead for Hillary, but they didn't account for the regionalization, the regional bias. John, were you one of the, I would say, few people in the media who predicted a Trump victory? Well, I said it was too close to call. And then that morning, I said, man, this sounds like a Trump victory, and people were, like, all quiet on the thing. And then as the day went on, you just get the feel for the thing, and it's like, oh boy, this is going to go in a way that people didn't expect. I was on the, you know, The_Donald on Reddit. I've seen it. Yes. Yeah, it's, like, thousands of people, and it was like the big kind of centipede engine for supporting the Trump insurgency. And they had interviewed Trump, and then they interviewed me the next week, which is kind of cool. They were asking questions and they were actually pretty respectful, and the moderators didn't let them savage me, even though I was kind of trying to stay more middle of the road, because I like the kind of perspective it provides me. You know, if you're too much in one camp or another, you can't see what's going on.
It tends to blind you a bit. And they were kind of disheartened and demoralized, in that the press seemed to be just savaging Trump and it didn't look like he was going to make it. And I said: he's being held up by this insurgency, and the insurgency is pushing him forward, and it won't matter what the press says. Even if you get that Access Hollywood thing, which would have crushed anyone else... but he was backed by an insurgency, and it wasn't Trump per se, it was the insurgency that said that didn't matter. So, all these people... you see, with an open source insurgency it's like with open source software. When I create a piece of open source software, what I do is I demonstrate that it works, which means it's plausible, and I also show that it has some promise; so it's called a plausible promise. And then other people look at that piece of software, or some piece of hardware that I built, and they go, wow, that's cool, I can see it doing this and scratching my itch, you know, this thing could do something really cool for me. And they may see it as promising for something entirely different than the person that initially put it up, than me. But it's that simple plausible promise of the software or the hardware that gets all these different people to come together and work on it and push it forward, and then they differentiate it later down the development cycle. Same thing with the politics: the plausible promise was to get Trump into office, you know, so he could be that wrecking ball. And he proved that he was plausible because he had some early victories, unexpected victories, or did surprisingly well in some of the early debates.
People held onto that one simple thing, that plausible promise, that one simple goal that united all these different people with all these different motivations, and they were all pushing it forward; something that only online networks can really do well, and do at scale. And we saw the development of this first kind of online political party on the right, namely the alt-right, and it put Trump in office. What's going on on the left is pretty interesting too; it may actually swamp the Democratic Party in this next presidential go-round. Well, what's happening there? Yeah, it's different. It's a different kind of open source insurgency. Whereas on the right they came together over Trump, on the left they're coming together over these kind of moral propositions, which is kind of a public morality, a consensus morality. You know, there's this Me Too thing, and there's this Never Again thing with the guns, and, you know, this asymmetry of capacity for violence. And then there's this thing about using disparaging words or hate speech online, or in public life, or anytime it's recorded. And then, you know, there's a little bit of American social credit scoring going on, I believe. Yeah, it's basically a social credit score. It's creating a public morality. It's filling the gap, this vacuum that was created with the decline of the nation state and the decline of this kind of fictive kinship that we've had. And they're creating this new morality in real time; we're seeing it built. And it's being enforced, and it's being enforced through shame, man. They find an individual that's violated this thing, you see it a lot with Me Too, and they, you know, pillory them, and they destroy their careers and make it impossible for them to work.
1069 01:00:56,960 --> 01:00:59,400 Speaker 1: It really is strange too, because I feel like, you know, two of those three things, I think the gun laws in America and our free speech laws, are what separate us from the culture of Europe. You know, in many places... yeah, I mean, for example, in many parts of Europe, if you write a book denying that the Holocaust happened, you will be put in jail, or for defaming the royal family. Yeah, and these are things that they consider hate speech. And in America we really don't have anything called hate speech; there's no clear line being drawn. Oh yeah, it's really wild to see these things develop. You know, how do you develop a morality when you're multicultural, truly multicultural? I mean, what morality crosses those lines, so it applies to everybody? What ethics? What regulates or limits public behavior when you don't have, you know, a single culture or a single thing that connects you otherwise? This kind of consensus morality is kind of a tough concept to get your head around, but it's being built online, and it's being enforced online, and the enforcement action, through shaming, can be pretty brutal. Some of these guys that they get their hands on, it's just... they're done. I mean, they're not working anymore. And there was an interesting thing, and I put a little brief together and put it into the report stream. It was about what they call the Shitty Media Men list.
Yeah, it was the thing that came out, like, in December. There was this young woman who was working in media, and she thought: you know what, there's this kind of rumor mill, these kind of back-room warnings that women give to each other about certain guys, so what if I make this kind of shaming process, this back-room information, more tangible, so it can be shared? And so what she did was create a spreadsheet she called the Shitty Media Men list, which listed all these different guys who were known for getting drunk and being assholes, or for as little as having, you know, creepy lunch meetings, all right. And she put about a dozen people in that spreadsheet, and she floated it among different writers and media people, and it took off. I mean, within twenty-four hours it was up to seventy people; there were thousands of people investing time, hours and hours and hours, building this list out. I mean, I've been in software a long time, and if you float a prototype piece of software and it gets serious people investing incredible amounts of time and effort into building it out, and you can just see the ramp, you can feel this thing just zoom... man, you've got something on your hands. You've got something that's explosive. And it got up to, you know, seventy-two people, and she pulled it, because it was starting to get to people who wanted to report on it in the press, and to shut it down, and people were discussing it and, you know, very mad at it. I included some examples I was able to pull off it.
But it's the kind of thing that, if you take it to the next level, you put it into a blockchain, for instance, where it can't be erased. And you can even bring it up to another level, where you can attach names and addresses. Even facial recognition is now really, really easy: put the picture of a person in, and it just runs through generic facial recognition and says, okay, this is a person off the Shitty Media Men list, or whatever shitty-men list. I mean, there are some people who have it coming. I mean, our Attorney General here in New York probably, yeah, got what he had coming to him. But doesn't this sound like some sort of neo-McCarthyism at a certain point? Yeah, it's the same problem as with the social credit system in China; it can grow unchecked. I mean, in China, if you have a problem with your score, there's no legal redress, because it's not done in the courts. It's all done on a voluntary basis, that's what the government says, and there's no way of actually forcing them to revise your score or do anything about it. The same could happen here, and it could get to the point where it's, you know, either an untrue allegation, or somebody being accused of something very, very minor, definitely disputable, who ends up getting hammered: not getting a job, not getting opportunities, having problems in their social life, everything. And you could apply the same thing to any of the other morality things.
If you said any kind of racial slur, or said anything that people could call sexist or whatever, that could be captured off your Twitter, or captured off a video phone, and that would be used as evidence that would then be put into a list like this. I mean, not many Americans, I feel, really truly understand this. But when you go traveling in other parts of the world, places in the Middle East and such, to sit down and have a conversation like we're having now, or even a more benign conversation, maybe just some superficial talk about politics in a cafe or a restaurant: that would not happen. That never happens in many countries around the world, because people are so afraid of the regime. And it's not even a fear, it's just an understanding that that's part of the social bargain you have with the regime. Yep. And it feels like we're kind of moving toward that. I mean, I've had some funny experiences where, you know, people in the media are afraid to talk in front of their peers, but they know who I am, and when we're by ourselves they'll kind of give me a little elbow and be like, yeah man, don't believe the media narratives. They're like closeted conservatives. Oh, I get that all the time from people. Yeah, it's like they can't say certain things in certain ways, because they'll get hammered, they'll be unemployable, people won't work with them. And this has the potential to become explicit, and if it's part of a blockchain, or baked in in the right way, it's unstoppable. I mean, it's not something you could, like, go to the courts and get expunged. You know, there's no one to talk to, you can't clear your name. Correct.
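A minimal sketch of why a blockchain-style list resists expungement, as described here: each entry's hash covers the previous entry's hash, so quietly editing or deleting anything breaks verification of every entry after it. This is a generic hash chain in Python, not any particular blockchain, and the payload strings are hypothetical.

    # Append-only hash chain: editing any past entry is detectable.
    import hashlib, json

    def entry_hash(prev_hash, payload):
        data = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
        return hashlib.sha256(data.encode()).hexdigest()

    chain = []

    def append(payload):
        # Each new entry commits to the hash of the one before it.
        prev = chain[-1]["hash"] if chain else "genesis"
        chain.append({"prev": prev, "payload": payload,
                      "hash": entry_hash(prev, payload)})

    def verify():
        # Walk the chain and recompute every hash from scratch.
        prev = "genesis"
        for entry in chain:
            if entry["prev"] != prev or entry["hash"] != entry_hash(prev, entry["payload"]):
                return False
            prev = entry["hash"]
        return True

    append("allegation about person A")   # hypothetical entries
    append("allegation about person B")
    print(verify())                       # True
    chain[0]["payload"] = "expunged"      # attempt to quietly clear a name
    print(verify())                       # False: the tampering is visible

There is no administrator to petition; the only way the record changes without tripping verify() is to rebuild and redistribute the whole chain, which is exactly the "no one to talk to" property being described.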
It's just, yeah, 1194 01:07:56,200 --> 01:08:01,000 Speaker 1: it's like this, you know, global blacklist that 1195 01:08:01,360 --> 01:08:04,360 Speaker 1: kind of strangles us, and it becomes as oppressive as 1196 01:08:04,400 --> 01:08:08,320 Speaker 1: the Chinese system, but something that we don't really have 1197 01:08:08,440 --> 01:08:11,880 Speaker 1: much control over, you know. Um, I think this 1198 01:08:12,040 --> 01:08:14,360 Speaker 1: is kind of a well-known phenomenon: when 1199 01:08:14,360 --> 01:08:18,679 Speaker 1: societies are shaken up and it reshuffles our mental deck, 1200 01:08:18,720 --> 01:08:21,400 Speaker 1: it takes a couple of generations for people to kind 1201 01:08:21,400 --> 01:08:25,320 Speaker 1: of fully understand the new system. Um, social media is 1202 01:08:25,320 --> 01:08:28,000 Speaker 1: a very new development for us. And you know, I 1203 01:08:28,320 --> 01:08:30,559 Speaker 1: grew up... I'm young enough that I came up and 1204 01:08:30,560 --> 01:08:33,519 Speaker 1: grew up as we moved from analog to digital, and 1205 01:08:33,600 --> 01:08:35,080 Speaker 1: you know, then we got AOL and our 1206 01:08:35,080 --> 01:08:37,400 Speaker 1: 56K modem, and watched this whole development of 1207 01:08:37,439 --> 01:08:40,800 Speaker 1: social media. Um, do you think that, you know, say, 1208 01:08:41,000 --> 01:08:44,320 Speaker 1: with our kids or grandkids, eventually there's 1209 01:08:44,320 --> 01:08:46,920 Speaker 1: gonna be a maturing in society and we'll start to 1210 01:08:47,000 --> 01:08:50,599 Speaker 1: use social media in more responsible, um, dare I say 1211 01:08:50,720 --> 01:08:57,240 Speaker 1: adult, ways? Yeah, I think the way 1212 01:08:57,240 --> 01:09:01,040 Speaker 1: that that actually gets used is that kind 1213 01:09:01,040 --> 01:09:05,679 Speaker 1: of this open source network, kind of organizational form, gets 1214 01:09:05,880 --> 01:09:13,240 Speaker 1: tamed. Just like we tamed institutions, bureaucracy, to provide 1215 01:09:13,320 --> 01:09:18,960 Speaker 1: us positive functionality and decision making; we've tamed markets to 1216 01:09:19,000 --> 01:09:21,559 Speaker 1: a large extent to help us with our decision making, 1217 01:09:21,560 --> 01:09:25,439 Speaker 1: societal decision making. We also tamed tribalism. We 1218 01:09:25,600 --> 01:09:28,800 Speaker 1: changed it, we abstracted it and turned it into, uh, 1219 01:09:29,360 --> 01:09:33,920 Speaker 1: the mechanism behind religions, and also behind nationalism. 1220 01:09:33,920 --> 01:09:37,920 Speaker 1: You know, nationalism is a form of tribalism, um. 1221 01:09:37,960 --> 01:09:41,920 Speaker 1: And those have all served useful purposes, and, you know, 1222 01:09:42,520 --> 01:09:45,400 Speaker 1: they can be used in bad ways. I mean, bureaucracy 1223 01:09:45,640 --> 01:09:48,519 Speaker 1: can be used as a bad thing: 1224 01:09:48,520 --> 01:09:50,400 Speaker 1: in a dictator's hands, it can make them very 1225 01:09:50,439 --> 01:09:53,880 Speaker 1: efficient at killing people. Or it can be used as 1226 01:09:53,920 --> 01:09:56,960 Speaker 1: a good thing in terms of, you know, mobilizing resources 1227 01:09:57,000 --> 01:10:00,439 Speaker 1: to undertake big projects that are positive.
But so this, 1228 01:10:00,439 --> 01:10:03,639 Speaker 1: this networked form of organization made possible by social networking, 1229 01:10:03,680 --> 01:10:07,479 Speaker 1: these, you know, these open source networks like I was 1230 01:10:07,520 --> 01:10:11,760 Speaker 1: talking about in the political sense... 1231 01:10:11,760 --> 01:10:13,160 Speaker 1: We're going to have to figure out how to tame 1232 01:10:13,200 --> 01:10:15,040 Speaker 1: them and make them part of our system, because they 1233 01:10:15,080 --> 01:10:19,439 Speaker 1: can do things that, uh, the other, older systems, the 1234 01:10:19,479 --> 01:10:25,120 Speaker 1: other, older organizations, can't do. Right. And there's people, 1235 01:10:25,720 --> 01:10:29,080 Speaker 1: probably like you and I, who both, you know, have read, 1236 01:10:29,280 --> 01:10:31,920 Speaker 1: you know, science fiction and also just seen technology, and 1237 01:10:32,280 --> 01:10:35,720 Speaker 1: have these great aspirations for what technology can do. Um, 1238 01:10:35,760 --> 01:10:38,959 Speaker 1: but I feel like this is actually a positive development: 1239 01:10:38,960 --> 01:10:43,080 Speaker 1: that because of the Trump election and the charges of 1240 01:10:43,160 --> 01:10:46,599 Speaker 1: collusion and Russian meddling, not the collusion, I should 1241 01:10:46,600 --> 01:10:50,639 Speaker 1: say the Russian meddling, the disinformation campaigns, um, people 1242 01:10:50,640 --> 01:10:53,320 Speaker 1: are becoming more aware of social media and the negative 1243 01:10:53,360 --> 01:11:00,719 Speaker 1: ways that it can be used. Yeah. Of course, 1244 01:11:00,760 --> 01:11:04,720 Speaker 1: that has the potential of causing all sorts of things: you know, 1245 01:11:05,160 --> 01:11:08,240 Speaker 1: setting in motion a lot of constraints, or the big 1246 01:11:08,240 --> 01:11:12,559 Speaker 1: social networking companies starting to change their AIs, which... Right. Yeah, 1247 01:11:13,000 --> 01:11:15,479 Speaker 1: one of the reports I was writing about, 1248 01:11:15,479 --> 01:11:17,880 Speaker 1: the social AIs, is that, you know, now that we're 1249 01:11:17,880 --> 01:11:21,120 Speaker 1: putting pressure on Facebook and YouTube and others, 1250 01:11:21,160 --> 01:11:27,639 Speaker 1: they'll start training their AIs to minimize complaints about content: 1251 01:11:28,520 --> 01:11:31,400 Speaker 1: minimize complaints about what is fake news, or minimize complaints about what 1252 01:11:31,560 --> 01:11:34,400 Speaker 1: is, you know, obnoxious, or hate speech, or 1253 01:11:34,439 --> 01:11:37,240 Speaker 1: the like. And when you start training an AI like that, 1254 01:11:37,520 --> 01:11:43,880 Speaker 1: um, it gets really good at kind of minimizing 1255 01:11:44,000 --> 01:11:47,120 Speaker 1: the complaints. It iterates towards that goal. You know, it's 1256 01:11:47,240 --> 01:11:49,760 Speaker 1: interacting with billions of people and it changes things: you know, 1257 01:11:50,120 --> 01:11:53,200 Speaker 1: it deletes certain things, or soft-bans certain types of content, 1258 01:11:53,360 --> 01:11:57,599 Speaker 1: or it wipes out visibility of content, some uploaded 1259 01:11:57,680 --> 01:12:01,760 Speaker 1: video, before anyone even sees it. Uh.
And if 1260 01:12:01,760 --> 01:12:05,599 Speaker 1: it moves the complaint meter down, you know, it reduces complaints, 1261 01:12:05,640 --> 01:12:09,519 Speaker 1: then that feature gets kept, and it iterates again and 1262 01:12:09,520 --> 01:12:15,000 Speaker 1: again and again. The unintended side effect of something like 1263 01:12:15,040 --> 01:12:18,560 Speaker 1: that is that we end up having people, you know, surrounded, 1264 01:12:18,640 --> 01:12:21,160 Speaker 1: you know, kind of encased in this big ball of 1265 01:12:21,200 --> 01:12:23,800 Speaker 1: cotton, where they don't see anything negative or anything that 1266 01:12:23,840 --> 01:12:26,400 Speaker 1: they disagree with. Oh, I saw that during 1267 01:12:26,439 --> 01:12:29,880 Speaker 1: the election, where I had, um, conservative friends who were 1268 01:12:29,920 --> 01:12:31,680 Speaker 1: like, kind of reaching out. It's kind of funny to 1269 01:12:31,680 --> 01:12:34,559 Speaker 1: see. And they're saying, like, can a Hillary 1270 01:12:34,600 --> 01:12:37,080 Speaker 1: supporter please tell me why they support Hillary? Like, I'm 1271 01:12:37,120 --> 01:12:39,760 Speaker 1: not trying to be, uh, you know, a dick. I'm 1272 01:12:39,800 --> 01:12:42,760 Speaker 1: seriously asking, like, I'm interested to hear your point of view. 1273 01:12:42,800 --> 01:12:44,920 Speaker 1: And I thought it was interesting to see people posting 1274 01:12:44,960 --> 01:12:47,400 Speaker 1: stuff like that, because what it means is they're literally 1275 01:12:47,479 --> 01:12:50,640 Speaker 1: not seeing the other side of 1276 01:12:50,680 --> 01:12:54,360 Speaker 1: the political spectrum. Yeah. And that was just, you know, 1277 01:12:54,439 --> 01:12:57,800 Speaker 1: a cross-section of the kind of friends that you have 1278 01:12:58,360 --> 01:13:01,679 Speaker 1: and the stuff that they post, because they're 1279 01:13:01,680 --> 01:13:03,600 Speaker 1: posting a lot, because they get a positive, kind of 1280 01:13:03,600 --> 01:13:08,559 Speaker 1: serotonin kick whenever they get a like. Uh, it 1281 01:13:08,600 --> 01:13:11,400 Speaker 1: gets even worse, far worse, when the AIs start 1282 01:13:11,479 --> 01:13:13,920 Speaker 1: kicking in, and you don't even see the 1283 01:13:14,000 --> 01:13:16,400 Speaker 1: things. I mean, you know, they're talking about, you know, 1284 01:13:16,479 --> 01:13:19,000 Speaker 1: millions of videos not even showing up on YouTube because 1285 01:13:19,600 --> 01:13:24,479 Speaker 1: they're screened before they even hit... yeah, or they're screened 1286 01:13:24,560 --> 01:13:27,320 Speaker 1: before they hit you. They may be visible to other 1287 01:13:27,360 --> 01:13:29,840 Speaker 1: people. And you could have people, you know, 1288 01:13:29,880 --> 01:13:32,280 Speaker 1: three years from now going, oh, I don't have any 1289 01:13:32,280 --> 01:13:34,759 Speaker 1: problems with Trump, or I don't have any problems with whoever, 1290 01:13:35,360 --> 01:13:38,640 Speaker 1: because I haven't seen any negative news. Right, I 1291 01:13:38,640 --> 01:13:42,120 Speaker 1: haven't seen anyone disagree or say it was terrible. 1292 01:13:42,280 --> 01:13:44,680 Speaker 1: And now another element is added to that with the 1293 01:13:44,720 --> 01:13:46,880 Speaker 1: monetization of everything.
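The training loop described just above, keep whatever change "moves the complaint meter down," is essentially greedy optimization against a single metric. A toy sketch in Python (every name and number is invented; real recommender training is far more complex, but the failure mode is the same):

```python
import random

def complaints(threshold: float, posts: list[float]) -> int:
    # Each post has a "controversy" score in [0, 1]. Posts at or below the
    # visibility threshold get shown; shown controversial posts draw complaints.
    shown = [p for p in posts if p <= threshold]
    return sum(1 for p in shown if p > 0.5)

random.seed(0)
posts = [random.random() for _ in range(1000)]

threshold = 1.0                    # start by showing everything
best = complaints(threshold, posts)

for _ in range(500):               # iterate "again and again and again"
    candidate = min(1.0, max(0.0, threshold + random.uniform(-0.05, 0.05)))
    score = complaints(candidate, posts)
    if score <= best:              # keep any change that lowers complaints
        threshold, best = candidate, score

# The threshold drifts to 0.5 or below: everything remotely contested is
# soft-banned, complaints hit zero, and a lot of speech disappears with them.
print(f"final threshold={threshold:.2f}, complaints={best}")
```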
I mean, I've talked about it on 1294 01:13:46,920 --> 01:13:49,200 Speaker 1: the show: we do a live stream now, and 1295 01:13:49,240 --> 01:13:52,000 Speaker 1: like a fraction of a fraction of our Facebook audience 1296 01:13:52,040 --> 01:13:54,960 Speaker 1: is seeing that show up in their feed. And, you know, 1297 01:13:55,240 --> 01:13:57,840 Speaker 1: it's kind of weird, and it's also them trying 1298 01:13:57,880 --> 01:13:59,600 Speaker 1: to make money, but yeah, it is you living in 1299 01:13:59,680 --> 01:14:01,800 Speaker 1: an echo chamber. And then I've also seen the thing 1300 01:14:01,840 --> 01:14:04,479 Speaker 1: recently where there's a way you could look up on 1301 01:14:04,600 --> 01:14:10,040 Speaker 1: your profile how Facebook has you, um, like, situated as 1302 01:14:10,040 --> 01:14:15,519 Speaker 1: a conservative, liberal, moderate. And from my perspective, yeah, 1303 01:14:15,520 --> 01:14:17,400 Speaker 1: it has me as a moderate. I don't have 1304 01:14:17,400 --> 01:14:19,760 Speaker 1: anything listed under my political views, and I guess it's 1305 01:14:19,800 --> 01:14:22,280 Speaker 1: just because of, like, who my friends are, and, you know, 1306 01:14:22,320 --> 01:14:24,479 Speaker 1: I have friends of all different backgrounds. But yeah, it 1307 01:14:24,880 --> 01:14:27,720 Speaker 1: is pretty interesting that, that way, you know, 1308 01:14:27,760 --> 01:14:31,160 Speaker 1: if you're listed as a conservative, they're targeting conservative ads 1309 01:14:31,200 --> 01:14:33,240 Speaker 1: to you. And I mean, it is a whole lot 1310 01:14:33,280 --> 01:14:35,360 Speaker 1: of social engineering. Like, it does make me want to 1311 01:14:35,400 --> 01:14:39,400 Speaker 1: get off of there. But that's also the, um, that's 1312 01:14:39,400 --> 01:14:41,519 Speaker 1: just the world we live in now, is that 1313 01:14:41,560 --> 01:14:44,040 Speaker 1: they know, you know, what you're clicking on, and what 1314 01:14:44,120 --> 01:14:46,240 Speaker 1: type of stuff you're responding to, and what stores you 1315 01:14:46,320 --> 01:14:50,360 Speaker 1: shop at, both online and offline, and, uh, it's a 1316 01:14:50,400 --> 01:14:53,080 Speaker 1: whole lot of Big Brother stuff going on. I think 1317 01:14:53,080 --> 01:14:55,120 Speaker 1: you're probably better off staying on. I mean, you know, 1318 01:14:55,160 --> 01:14:57,720 Speaker 1: I saw a lot of... I mean, granted, I was 1319 01:14:57,760 --> 01:14:59,960 Speaker 1: involved in a lot of the early tech that became 1320 01:15:00,080 --> 01:15:03,240 Speaker 1: social networking. So, you know, I joined a company in 1321 01:15:03,240 --> 01:15:04,760 Speaker 1: two thousand and one that built, you know, the 1322 01:15:04,800 --> 01:15:09,320 Speaker 1: first networked blogging tool, did RSS and all that other 1323 01:15:09,320 --> 01:15:11,439 Speaker 1: stuff, you know, that kind of networked blogging that 1324 01:15:11,560 --> 01:15:14,240 Speaker 1: became Facebook and Twitter and stuff. So I've been kind 1325 01:15:14,280 --> 01:15:17,120 Speaker 1: of a big proponent of this stuff really early on, 1326 01:15:18,160 --> 01:15:22,360 Speaker 1: um, and, uh, I found it's actually more positive 1327 01:15:22,400 --> 01:15:27,000 Speaker 1: than negative, even now. Because if you disconnect, you lose 1328 01:15:27,040 --> 01:15:29,920 Speaker 1: all those people and all those connections. You know, you 1329 01:15:30,080 --> 01:15:32,880 Speaker 1: become a nonentity.
I've seen people just disappear. I mean, 1330 01:15:33,640 --> 01:15:36,800 Speaker 1: you know, people that would be great to interact with, 1331 01:15:36,840 --> 01:15:40,840 Speaker 1: they just become... who are they talking to? A few 1332 01:15:40,840 --> 01:15:42,559 Speaker 1: people. They can't talk about the 1333 01:15:42,600 --> 01:15:45,200 Speaker 1: topics that they're interested in, because they 1334 01:15:45,200 --> 01:15:48,280 Speaker 1: don't have access to them online. Right, right. Especially, you 1335 01:15:48,280 --> 01:15:51,559 Speaker 1: know, smarter people. Do we really want them disconnecting, people 1336 01:15:51,560 --> 01:15:54,200 Speaker 1: who, you know, really have something to say on a subject? 1337 01:15:54,240 --> 01:15:56,360 Speaker 1: And then what we're going to get is, like, teenage 1338 01:15:56,400 --> 01:16:00,200 Speaker 1: girls on Snapchat, you know, having that dog filter there 1339 01:16:00,240 --> 01:16:04,680 Speaker 1: for their face. That's gonna be our social media. I 1340 01:16:04,720 --> 01:16:08,599 Speaker 1: don't know. Go ahead, John. Yeah, no, I mean, what's interesting, 1341 01:16:08,640 --> 01:16:13,280 Speaker 1: I mean, at least from a security perspective, is everything 1342 01:16:13,280 --> 01:16:16,439 Speaker 1: that we're talking about here is magnified globally. Because, you know, 1343 01:16:16,880 --> 01:16:18,439 Speaker 1: people are talking about, say, the decline of 1344 01:16:18,479 --> 01:16:22,720 Speaker 1: Facebook in the US, but, you know, Facebook's adding five 1345 01:16:23,080 --> 01:16:30,040 Speaker 1: hundred thousand new users every day. What's the number? Five hundred thousand daily? Daily. Yeah, 1346 01:16:30,240 --> 01:16:32,479 Speaker 1: and you know, it's two and a half billion people, 1347 01:16:32,560 --> 01:16:34,400 Speaker 1: and it's gonna slow down because they're gonna saturate the 1348 01:16:34,479 --> 01:16:40,439 Speaker 1: planet outside of China. But it's replaced the local news. 1349 01:16:41,400 --> 01:16:44,200 Speaker 1: It's replaced the kind of news source for a good 1350 01:16:44,240 --> 01:16:46,800 Speaker 1: portion of the world. So if you're, like, in Cambodia, 1351 01:16:46,920 --> 01:16:50,800 Speaker 1: this is the news source. Wouldn't you say? I mean, 1352 01:16:50,840 --> 01:16:55,400 Speaker 1: it's connected, but I would say Twitter especially. Um, you 1353 01:16:55,439 --> 01:16:58,000 Speaker 1: know, when you've seen the protests in 1354 01:16:58,000 --> 01:17:00,800 Speaker 1: the Middle East and all that stuff, you 1355 01:17:00,800 --> 01:17:02,679 Speaker 1: found out about it first on Twitter. I found out 1356 01:17:02,680 --> 01:17:05,320 Speaker 1: about the bin Laden death on Twitter first. When 1357 01:17:05,680 --> 01:17:08,960 Speaker 1: news hits, it's way before the mainstream media. Oh yeah, 1358 01:17:09,000 --> 01:17:11,960 Speaker 1: Twitter is definitely, um, you know... it's basically a 1359 01:17:12,000 --> 01:17:15,920 Speaker 1: big RSS feed. It's great at broadcasting, you know, 1360 01:17:16,040 --> 01:17:18,519 Speaker 1: and as a news source, um, it's 1361 01:17:18,520 --> 01:17:25,080 Speaker 1: actually even probably more destabilizing than Facebook.
Facebook is limited 1362 01:17:25,080 --> 01:17:28,439 Speaker 1: to a certain number of friends, and you don't have 1363 01:17:28,520 --> 01:17:32,519 Speaker 1: that kind of broadcast aspect. It had a little 1364 01:17:32,520 --> 01:17:36,280 Speaker 1: bit of a broadcast aspect when, uh, the 1365 01:17:36,280 --> 01:17:40,759 Speaker 1: AIs that they had initially built were focused 1366 01:17:40,760 --> 01:17:43,880 Speaker 1: on finding viral pieces of content that people tended to 1367 01:17:43,920 --> 01:17:47,880 Speaker 1: like and then propagating them. So instead of, like, having 1368 01:17:47,920 --> 01:17:50,200 Speaker 1: only a fraction of the people actually see your video 1369 01:17:51,600 --> 01:17:56,360 Speaker 1: on Facebook, kind of that negative damper, uh, it would 1370 01:17:56,400 --> 01:17:59,360 Speaker 1: take that video, and then ten times your audience would 1371 01:17:59,400 --> 01:18:02,799 Speaker 1: see it, if it had a strong initial reaction. 1372 01:18:03,040 --> 01:18:06,679 Speaker 1: Also, and this could just be the way I see things, 1373 01:18:06,720 --> 01:18:09,120 Speaker 1: you'd have a better view on this, but I feel 1374 01:18:09,120 --> 01:18:11,759 Speaker 1: like people pay more attention to what's trending on Twitter 1375 01:18:11,880 --> 01:18:14,360 Speaker 1: than on Facebook. You know, if you see something in 1376 01:18:14,400 --> 01:18:16,720 Speaker 1: those top three trends, if it's a name, if 1377 01:18:16,720 --> 01:18:19,559 Speaker 1: it's a country, you're like, all right, what's going on here? 1378 01:18:19,600 --> 01:18:21,920 Speaker 1: And I want to have the news before everybody else. 1379 01:18:22,280 --> 01:18:25,439 Speaker 1: It's definitely a much cleaner way of getting news from 1380 01:18:25,439 --> 01:18:27,599 Speaker 1: a direct source. You can find that one person, 1381 01:18:28,040 --> 01:18:31,800 Speaker 1: that live, kind of person-on-the-spot, and 1382 01:18:32,240 --> 01:18:34,559 Speaker 1: you can directly connect to them, directly follow them, and 1383 01:18:34,600 --> 01:18:40,559 Speaker 1: watch what they're doing. It's awesome. Yeah, but 1384 01:18:40,800 --> 01:18:42,519 Speaker 1: it does put a lot of power in some of 1385 01:18:42,560 --> 01:18:46,000 Speaker 1: the big hubs, which you don't get on Facebook. I mean, 1386 01:18:46,320 --> 01:18:48,240 Speaker 1: maybe it's because of the circle I'm in, but I 1387 01:18:48,280 --> 01:18:51,880 Speaker 1: feel like Twitter is just saturated by journalists and academics 1388 01:18:51,920 --> 01:18:54,439 Speaker 1: who are, like, trading snarky barbs at one another on 1389 01:18:54,479 --> 01:18:57,280 Speaker 1: a daily basis. I'm not a big fan 1390 01:18:57,320 --> 01:19:00,599 Speaker 1: of Twitter. Yeah, there's really two pieces to that: 1391 01:19:00,640 --> 01:19:04,000 Speaker 1: a lot of what we're seeing, that kind 1392 01:19:04,000 --> 01:19:08,680 Speaker 1: of consensus morality, is being built by that group, right, 1393 01:19:09,160 --> 01:19:13,000 Speaker 1: and they're using that.
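As a rough illustration of the amplification behavior described a moment ago (the numbers and names are invented, not any platform's real ranking algorithm): posts that clear an early-engagement bar get their audience multiplied, and everything else gets the damper.

```python
# Illustrative only: the shape of the behavior described above, not
# Facebook's actual system. Ordinary posts reach a small fraction of
# followers; a strong initial reaction triggers a large multiplier.

BASELINE_REACH = 0.05    # fraction of followers who see an ordinary post
VIRAL_MULTIPLIER = 10.0  # "ten times your audience would see it"
VIRAL_THRESHOLD = 0.10   # early engagement rate that triggers the boost

def projected_audience(followers: int, early_engagement: float) -> int:
    reach = BASELINE_REACH * followers       # the default damper
    if early_engagement >= VIRAL_THRESHOLD:
        reach *= VIRAL_MULTIPLIER            # the viral boost
    return int(reach)

print(projected_audience(100_000, 0.02))  # damped post: 5,000 viewers
print(projected_audience(100_000, 0.25))  # boosted post: 50,000 viewers
```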
It's, I'd say, probably, you know, 1394 01:19:13,040 --> 01:19:18,679 Speaker 1: two million people in the government bureaucracy and in media, 1395 01:19:19,400 --> 01:19:23,479 Speaker 1: in academia. Um, you know, it's much smaller, the group 1396 01:19:23,520 --> 01:19:26,360 Speaker 1: that's, you know, truly active, probably a hundred thousand or 1397 01:19:26,439 --> 01:19:30,840 Speaker 1: less, who are actually driving this kind of consensus morality forward. 1398 01:19:31,040 --> 01:19:35,160 Speaker 1: It's like the new power elite. Yeah, exactly. And it's 1399 01:19:35,200 --> 01:19:38,080 Speaker 1: great at shaming. You can shame a person really 1400 01:19:38,120 --> 01:19:41,680 Speaker 1: quickly on Twitter. I mean, the way that Trump has 1401 01:19:41,760 --> 01:19:44,760 Speaker 1: used it has really changed everything. Before he holds 1402 01:19:44,760 --> 01:19:47,400 Speaker 1: a press conference, before he talks to the media, goes 1403 01:19:47,439 --> 01:19:49,360 Speaker 1: on the news, he just tweets what he wants. And 1404 01:19:49,520 --> 01:19:52,240 Speaker 1: I remember there was actually a really, uh, I mean, 1405 01:19:52,280 --> 01:19:54,880 Speaker 1: it's just a funny but interesting tweet of Trump's, 1406 01:19:54,920 --> 01:19:58,240 Speaker 1: where he said... I don't remember what he 1407 01:19:58,320 --> 01:20:00,839 Speaker 1: was talking about specifically, but he said, like, me tweeting 1408 01:20:00,840 --> 01:20:02,800 Speaker 1: out this stuff before I hold the press conference and 1409 01:20:02,800 --> 01:20:07,240 Speaker 1: all that, he's like, it's not presidential, it's modern-day presidential. Right, 1410 01:20:07,280 --> 01:20:10,840 Speaker 1: and it's kind of true. Like, this is the future. 1411 01:20:10,960 --> 01:20:13,280 Speaker 1: I don't think the president following Trump is going to 1412 01:20:13,360 --> 01:20:16,120 Speaker 1: be off of Twitter and not doing, you know, somewhat 1413 01:20:16,160 --> 01:20:19,559 Speaker 1: similar things. Well, yeah, Twitter definitely allowed him to route 1414 01:20:19,560 --> 01:20:23,480 Speaker 1: around the media and directly connect with the insurgency, 1415 01:20:23,520 --> 01:20:27,719 Speaker 1: and it was very effective. It's also allowed 1416 01:20:27,760 --> 01:20:30,200 Speaker 1: the insurgency to interact with him.
So what they did 1417 01:20:30,320 --> 01:20:31,840 Speaker 1: is that whenever they came up with a meme or 1418 01:20:31,880 --> 01:20:36,719 Speaker 1: some idea that was, you know, what they thought 1419 01:20:36,760 --> 01:20:38,680 Speaker 1: was good, it would be propagated and would show up 1420 01:20:38,680 --> 01:20:41,439 Speaker 1: in Trump's feed and Trump would find it. You know, 1421 01:20:41,479 --> 01:20:43,640 Speaker 1: it would go right past the media minders and all the 1422 01:20:43,680 --> 01:20:46,960 Speaker 1: people that were massaging, you know, the information 1423 01:20:46,960 --> 01:20:49,840 Speaker 1: going into Trump. He'd see that thing, and, 1424 01:20:49,840 --> 01:20:52,720 Speaker 1: what made him the perfect kind of 1425 01:20:52,800 --> 01:20:54,639 Speaker 1: vehicle for this is that he would pick that up 1426 01:20:55,800 --> 01:20:59,959 Speaker 1: and use it, tweeting the thing of him body-slamming CNN, 1427 01:21:00,120 --> 01:21:03,240 Speaker 1: which was originally, like, Vince McMahon. Oh yeah, he picks 1428 01:21:03,320 --> 01:21:07,160 Speaker 1: up all these ideas, you know, right off his Twitter feed. John, 1429 01:21:07,800 --> 01:21:10,360 Speaker 1: I was wondering if you could just backtrack one second. 1430 01:21:10,360 --> 01:21:12,679 Speaker 1: I was wondering if you could finish that thought about 1431 01:21:12,800 --> 01:21:15,920 Speaker 1: how there's, you know, perhaps a hundred thousand people who 1432 01:21:15,960 --> 01:21:19,759 Speaker 1: are engaged in this building of consensus morality on social media, 1433 01:21:20,160 --> 01:21:22,080 Speaker 1: and kind of flesh out that thought that you 1434 01:21:22,120 --> 01:21:23,960 Speaker 1: were making, because that's really important, I think, and I 1435 01:21:23,960 --> 01:21:33,760 Speaker 1: haven't heard anyone quite phrase it that way. Oh, well, um, yeah, 1436 01:21:33,800 --> 01:21:37,840 Speaker 1: it's a decentralized network, like we 1437 01:21:37,840 --> 01:21:43,479 Speaker 1: were describing earlier. Yeah, the way I 1438 01:21:43,560 --> 01:21:45,839 Speaker 1: kind of framed it, you know, the framework I built 1439 01:21:45,880 --> 01:21:50,040 Speaker 1: for myself about this, is that I 1440 01:21:50,080 --> 01:21:52,519 Speaker 1: put it into kind of the dimensions of war. So 1441 01:21:52,600 --> 01:21:55,960 Speaker 1: you have, you know, the physical and the psychological 1442 01:21:56,000 --> 01:22:01,599 Speaker 1: and the moral, um, you know, the kind 1443 01:22:01,640 --> 01:22:05,280 Speaker 1: of a Boydian framework. And the psychological, kind of maneuver 1444 01:22:05,360 --> 01:22:10,680 Speaker 1: warfare, was the insurgency: you know, very memetic, uh, you know, 1445 01:22:10,840 --> 01:22:17,400 Speaker 1: lots of disinformation, lots of, uh... they would swarm on targets, 1446 01:22:18,439 --> 01:22:24,120 Speaker 1: threaten them, yell at them, shout them down. Very, very effective, uh, 1447 01:22:24,760 --> 01:22:30,840 Speaker 1: constantly moving. Uh, the moral... this kind of 1448 01:22:30,840 --> 01:22:33,920 Speaker 1: new moral network obviously fights in the moral space. Um.
1449 01:22:34,120 --> 01:22:36,640 Speaker 1: And it builds by, you know, 1450 01:22:36,680 --> 01:22:39,960 Speaker 1: creating this consensus, and then it fights, you know, 1451 01:22:40,000 --> 01:22:42,880 Speaker 1: through this shaming and by this, you know, disconnection: it 1452 01:22:42,960 --> 01:22:48,240 Speaker 1: disconnects you from the network, and that disconnection actually 1453 01:22:48,280 --> 01:22:53,479 Speaker 1: does you damage. Because, you know, right now, so much 1454 01:22:53,520 --> 01:22:57,840 Speaker 1: of what we do, both personally and, you know, financially, 1455 01:22:58,000 --> 01:23:00,160 Speaker 1: you know, through our jobs, is done online. If you 1456 01:23:00,160 --> 01:23:04,320 Speaker 1: become disconnected, you become less valuable, uh, you're truncated, 1457 01:23:04,400 --> 01:23:09,440 Speaker 1: your life is diminished. And that's been very, very effective 1458 01:23:10,080 --> 01:23:13,800 Speaker 1: in terms of, you know, how they actually 1459 01:23:13,920 --> 01:23:19,639 Speaker 1: achieve success. Um, Twitter has been an amazing vehicle 1460 01:23:19,680 --> 01:23:22,320 Speaker 1: for that, much more so than Facebook. You know, Facebook 1461 01:23:22,360 --> 01:23:25,200 Speaker 1: allowed, you know, the small groups, the kind of 1462 01:23:25,240 --> 01:23:29,639 Speaker 1: insurgency, to operate below the radar, because, you know, there's 1463 01:23:29,640 --> 01:23:31,800 Speaker 1: not one place on Facebook where you can see all the 1464 01:23:31,840 --> 01:23:38,080 Speaker 1: flow. And, um, an interesting kind of, uh, spin-off 1465 01:23:38,080 --> 01:23:41,360 Speaker 1: from this whole Cambridge Analytica thing, because the Cambridge Analytica 1466 01:23:41,400 --> 01:23:44,040 Speaker 1: thing is that Facebook didn't sell data. It allowed a researcher 1467 01:23:44,120 --> 01:23:47,680 Speaker 1: to gather some data, using an app with voluntary participants, 1468 01:23:47,800 --> 01:23:51,320 Speaker 1: that gathered a teeny bit from a lot of people, um, 1469 01:23:51,360 --> 01:23:53,080 Speaker 1: in order to kind of learn more about how the 1470 01:23:53,120 --> 01:23:57,160 Speaker 1: network worked. And then this guy turned around 1471 01:23:57,160 --> 01:23:59,200 Speaker 1: and sold it to Cambridge Analytica for a million bucks. 1472 01:23:59,320 --> 01:24:01,160 Speaker 1: So he made a lot of money and he 1473 01:24:01,320 --> 01:24:04,839 Speaker 1: kind of betrayed the agreement that he had with Facebook 1474 01:24:04,880 --> 01:24:10,000 Speaker 1: over this. Now Facebook is shutting down all these researchers, so 1475 01:24:11,640 --> 01:24:14,040 Speaker 1: the ability to actually see what's going on across the 1476 01:24:14,080 --> 01:24:17,160 Speaker 1: board is going to be less so than ever. 1477 01:24:17,240 --> 01:24:21,040 Speaker 1: And that kind of, you know, murky world of small networks, 1478 01:24:21,080 --> 01:24:24,720 Speaker 1: you know, people with two, three, four friends, uh, is, 1479 01:24:24,760 --> 01:24:30,920 Speaker 1: you know, perfect for kind of the insurgency to operate in. Um, 1480 01:24:31,000 --> 01:24:33,479 Speaker 1: there's no real top-down 1481 01:24:33,560 --> 01:24:37,040 Speaker 1: enforcement except through Facebook, if they use AIs to do it.
1482 01:24:38,439 --> 01:24:39,800 Speaker 1: What do you... I mean, if you were to make 1483 01:24:39,840 --> 01:24:45,240 Speaker 1: a prediction for these big social media tech companies, Facebook, Twitter, etcetera, 1484 01:24:45,920 --> 01:24:48,559 Speaker 1: where do you think that's gonna go, um, in the future? 1485 01:24:48,840 --> 01:24:51,200 Speaker 1: I mean, you kind of pointed out, no, it's not 1486 01:24:51,240 --> 01:24:53,759 Speaker 1: going to become the next MySpace; they're still adding 1487 01:24:53,760 --> 01:24:56,800 Speaker 1: a lot of new users. Um, is there going to 1488 01:24:56,880 --> 01:24:59,280 Speaker 1: be, like, some antitrust issues? What do you think 1489 01:24:59,320 --> 01:25:04,720 Speaker 1: is going to happen? Um, well, what's happening in the 1490 01:25:04,880 --> 01:25:09,200 Speaker 1: US is that they are basically relying on Facebook to 1491 01:25:09,880 --> 01:25:14,040 Speaker 1: maintain stability. They've outsourced social stability to Facebook. Our 1492 01:25:14,120 --> 01:25:19,640 Speaker 1: government has? Yeah, I mean, basically, um. And that's the deal, um, 1493 01:25:19,680 --> 01:25:23,599 Speaker 1: that they provide us insight into all these different countries 1494 01:25:23,640 --> 01:25:27,800 Speaker 1: around the world. You know, 1495 01:25:27,800 --> 01:25:29,519 Speaker 1: with two and a half billion users, they know more 1496 01:25:29,560 --> 01:25:32,240 Speaker 1: about what's going on in 1497 01:25:32,240 --> 01:25:34,760 Speaker 1: places around the world than a lot of 1498 01:25:34,800 --> 01:25:36,600 Speaker 1: the countries do themselves. Now, that's one hell of a 1499 01:25:36,600 --> 01:25:40,040 Speaker 1: good census data set, if you can get it. Oh yeah, and 1500 01:25:40,160 --> 01:25:44,200 Speaker 1: it's moving towards a global census. And Facebook is also 1501 01:25:44,320 --> 01:25:47,599 Speaker 1: getting to the point where, um... they did some AI 1502 01:25:47,640 --> 01:25:50,320 Speaker 1: facial recognition work. About a year and a half ago, 1503 01:25:50,600 --> 01:25:53,320 Speaker 1: some data came out that said they built a 1504 01:25:53,400 --> 01:25:58,360 Speaker 1: database of several million people, you know, users of Facebook, 1505 01:25:58,920 --> 01:26:03,000 Speaker 1: and, um, they trained their AI to find them, 1506 01:26:03,040 --> 01:26:05,240 Speaker 1: too: if you gave it a new picture of 1507 01:26:05,280 --> 01:26:07,800 Speaker 1: one of the people in that database, it could find 1508 01:26:07,880 --> 01:26:13,120 Speaker 1: you in five seconds out of a million people. Now 1509 01:26:13,160 --> 01:26:16,200 Speaker 1: it can find you out of billions. And the interesting thing 1510 01:26:16,200 --> 01:26:18,120 Speaker 1: about this AI is, once it's trained, it can run 1511 01:26:18,120 --> 01:26:26,080 Speaker 1: on a cell phone. So, you know, that's like 1512 01:26:26,240 --> 01:26:28,759 Speaker 1: a perfect tool for a lot of, you know, 1513 01:26:28,760 --> 01:26:32,240 Speaker 1: what we'd like to do globally, um, in terms 1514 01:26:32,240 --> 01:26:35,800 Speaker 1: of identifying people of interest. And when you're saying, 1515 01:26:35,880 --> 01:26:39,160 Speaker 1: you know, look at people globally, I talked 1516 01:26:39,200 --> 01:26:42,519 Speaker 1: about this just earlier.
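A sketch of how that kind of lookup typically works, with made-up data (this is not Facebook's actual system): a trained model reduces each face photo to an embedding vector once, and matching a new photo is then a nearest-neighbor search over the stored vectors, which is cheap enough that the matching step can plausibly run on a phone once the embedding model is trained.

```python
import numpy as np

# Hypothetical sketch of embedding-based face lookup. A trained model maps
# each face photo to a fixed-length vector; matching a new photo is then a
# single similarity search over the stored vectors.

rng = np.random.default_rng(42)
D = 128                                  # typical face-embedding dimension
N = 100_000                              # stand-in for "a million people"

database = rng.normal(size=(N, D)).astype(np.float32)
database /= np.linalg.norm(database, axis=1, keepdims=True)

def find_match(query: np.ndarray) -> tuple[int, float]:
    query = query / np.linalg.norm(query)
    scores = database @ query            # cosine similarity vs. everyone
    best = int(np.argmax(scores))
    return best, float(scores[best])

# Simulate "a new picture of someone already in the database": their
# stored embedding plus noise from a different pose or lighting.
person = 12_345
new_photo = database[person] + 0.1 * rng.normal(size=D).astype(np.float32)

idx, score = find_match(new_photo)
print(idx == person, round(score, 3))    # one matrix multiply finds them
```

At real scale, systems swap the brute-force multiply for an approximate nearest-neighbor index, but the principle is the same.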
A lot of people credit, like, 1517 01:26:42,560 --> 01:26:45,760 Speaker 1: the Arab Spring happening because of Twitter. You know, like, 1518 01:26:45,840 --> 01:26:48,680 Speaker 1: how far down the line is it when they can 1519 01:26:48,760 --> 01:26:51,719 Speaker 1: no longer suppress the people of North Korea from seeing 1520 01:26:51,760 --> 01:26:53,760 Speaker 1: what's going on? Well, it's interesting that it can be 1521 01:26:53,800 --> 01:26:57,400 Speaker 1: a tool for revolution or a tool for social control. Yeah, 1522 01:26:57,479 --> 01:26:59,160 Speaker 1: I think it's going to end up being more social 1523 01:26:59,200 --> 01:27:02,360 Speaker 1: control as we go forward. You know, I termed it 1524 01:27:02,479 --> 01:27:07,000 Speaker 1: kind of the Long Night. Um, you know, you 1525 01:27:07,040 --> 01:27:09,800 Speaker 1: know what happened when bureaucracy of the twentieth century went bad: 1526 01:27:09,800 --> 01:27:12,840 Speaker 1: it created a lot of, you know, very efficient 1527 01:27:12,840 --> 01:27:15,800 Speaker 1: dictatorships, yeah, killing machines, or just, you know, 1528 01:27:16,280 --> 01:27:20,120 Speaker 1: incredible oppression. And now we can build systems, 1529 01:27:20,400 --> 01:27:24,720 Speaker 1: these networks, that can serve the same purpose. And, 1530 01:27:24,840 --> 01:27:29,240 Speaker 1: you know, using AI, 1531 01:27:29,560 --> 01:27:34,200 Speaker 1: it's not just censoring, like, the broadcast; it 1532 01:27:34,240 --> 01:27:37,800 Speaker 1: can, you know, censor the living room conversations as well. 1533 01:27:39,680 --> 01:27:42,439 Speaker 1: You know, it's like, you know, not just having a 1534 01:27:42,439 --> 01:27:45,800 Speaker 1: censor in the broadcast studio; it's in every single living room, 1535 01:27:45,920 --> 01:27:49,479 Speaker 1: every single discussion, every water cooler discussion that's done via 1536 01:27:49,479 --> 01:27:55,680 Speaker 1: the network. Yeah. And this is 1537 01:27:55,680 --> 01:27:59,120 Speaker 1: how Facebook and Twitter and others, you know, stay off 1538 01:27:59,160 --> 01:28:02,679 Speaker 1: the government radar: they provide the social stability 1539 01:28:02,680 --> 01:28:07,720 Speaker 1: and they make sure that Trump doesn't happen next time, um, 1540 01:28:07,760 --> 01:28:13,559 Speaker 1: and they ensure that, you know, destabilizing ideas and 1541 01:28:13,680 --> 01:28:16,760 Speaker 1: people and groups don't get any purchase. I mean, the 1542 01:28:16,800 --> 01:28:19,280 Speaker 1: way you describe it, and I don't disagree with you, 1543 01:28:19,800 --> 01:28:22,679 Speaker 1: is that, like, Facebook is our new government, that they're 1544 01:28:22,680 --> 01:28:25,160 Speaker 1: almost like our new god or new religion that has 1545 01:28:25,200 --> 01:28:28,840 Speaker 1: divine power, a kingmaker. Well, they do, they're 1546 01:28:28,880 --> 01:28:30,599 Speaker 1: serving a purpose. I mean, you see a lot 1547 01:28:30,600 --> 01:28:34,519 Speaker 1: of corporate governance activism now coming out 1548 01:28:34,520 --> 01:28:37,479 Speaker 1: of companies like BlackRock. I mean, BlackRock is 1549 01:28:37,520 --> 01:28:41,720 Speaker 1: like this major influence over, like, fourteen trillion dollars. Yeah, 1550 01:28:41,840 --> 01:28:44,680 Speaker 1: like the size of the U.S. economy. Um.
That 1551 01:28:44,720 --> 01:28:48,840 Speaker 1: has, you know, through boards of directors, control over 1552 01:28:48,880 --> 01:28:55,160 Speaker 1: that kind of thing. And I remember hearing 1553 01:28:55,200 --> 01:28:59,200 Speaker 1: an interview with Fink, I think, who is the CEO of 1554 01:28:59,200 --> 01:29:01,839 Speaker 1: BlackRock, and he said, you know, I'm not concerned 1555 01:29:01,840 --> 01:29:06,160 Speaker 1: about Putin, Putin and his gang and most of 1556 01:29:06,160 --> 01:29:10,439 Speaker 1: the other bad guys around the world, because they're 1557 01:29:10,439 --> 01:29:13,720 Speaker 1: in my network. All their financial net worth, everything that 1558 01:29:13,760 --> 01:29:16,120 Speaker 1: they own, everything their kids own, everything their, you know, 1559 01:29:16,920 --> 01:29:20,519 Speaker 1: entire family owns, is in my network. I can see them, 1560 01:29:20,880 --> 01:29:24,200 Speaker 1: I can see where it goes. So they're not going 1561 01:29:24,280 --> 01:29:27,120 Speaker 1: to rock the boat too much, because they're part of 1562 01:29:27,120 --> 01:29:29,559 Speaker 1: the system. Yeah. And the only thing that gave him a 1563 01:29:29,560 --> 01:29:31,639 Speaker 1: little pause was ISIS, because it was a black hole, 1564 01:29:31,680 --> 01:29:33,439 Speaker 1: like an ink blot in the system. It 1565 01:29:33,479 --> 01:29:36,320 Speaker 1: wasn't part of this big financial network. It was, like, 1566 01:29:36,439 --> 01:29:42,360 Speaker 1: spreading this kind of parallel system that was disconnected, dark. Yeah. 1567 01:29:42,560 --> 01:29:47,720 Speaker 1: That thing scared him a bit. But, you know, 1568 01:29:47,760 --> 01:29:49,840 Speaker 1: so BlackRock is now, like, pushing on the 1569 01:29:49,920 --> 01:29:52,960 Speaker 1: gun control front, and we're gonna 1570 01:29:52,960 --> 01:29:55,040 Speaker 1: see a lot of companies start to move in this direction. 1571 01:29:55,160 --> 01:29:59,360 Speaker 1: You know, they're responding to that consensus morality 1572 01:29:59,400 --> 01:30:02,959 Speaker 1: that's developed. They're becoming partners in that, and they're enforcing 1573 01:30:03,000 --> 01:30:07,440 Speaker 1: it by, uh, you know, reducing investments in gun companies, 1574 01:30:07,920 --> 01:30:11,639 Speaker 1: and they're putting pressure on them to modify what they sell, 1575 01:30:13,000 --> 01:30:16,240 Speaker 1: um, if they're retailers or, uh, you know, producers: 1576 01:30:16,360 --> 01:30:19,439 Speaker 1: here's the things that you should make and shouldn't make. Um, 1577 01:30:19,520 --> 01:30:21,519 Speaker 1: and eventually, you know, that gets to a point where 1578 01:30:21,920 --> 01:30:24,760 Speaker 1: I think it's possible to imagine a scenario where you 1579 01:30:24,760 --> 01:30:28,920 Speaker 1: get to a point where, um, if you own a weapon, 1580 01:30:31,080 --> 01:30:33,640 Speaker 1: it's going to be tough in large parts of the 1581 01:30:33,680 --> 01:30:36,519 Speaker 1: country to get a job or do certain things, 1582 01:30:36,960 --> 01:30:40,519 Speaker 1: because of social shaming. And 1583 01:30:40,560 --> 01:30:43,240 Speaker 1: I know people already who won't let their 1584 01:30:43,280 --> 01:30:47,600 Speaker 1: kids go play at a house that has a gun. Unbelievable. 1585 01:30:47,800 --> 01:30:49,920 Speaker 1: Well, they make it a point, they ask the question.
And if that... 1586 01:30:50,240 --> 01:30:52,920 Speaker 1: if you're listed as a person who 1587 01:30:52,920 --> 01:30:58,200 Speaker 1: owns a gun, then that could make you unfit to 1588 01:30:58,240 --> 01:31:01,959 Speaker 1: go to the PTA. This is extremely Nineteen 1589 01:31:02,000 --> 01:31:04,599 Speaker 1: Eighty-Four, for anyone who's read it. Well, I mean, we're already 1590 01:31:04,600 --> 01:31:08,320 Speaker 1: seeing it with the division over Donald Trump, 1591 01:31:08,439 --> 01:31:11,000 Speaker 1: because, I mean, people will be like, oh, your husband 1592 01:31:11,080 --> 01:31:13,200 Speaker 1: voted for Donald Trump, why don't you divorce him, he's 1593 01:31:13,200 --> 01:31:15,799 Speaker 1: a horrible person. And you're like, well, he's not necessarily 1594 01:31:15,800 --> 01:31:17,840 Speaker 1: a horrible person. He's like everyone else in America: he 1595 01:31:17,880 --> 01:31:19,560 Speaker 1: looked at a ballot that had two choices on it 1596 01:31:19,560 --> 01:31:23,439 Speaker 1: and he picked one. Oh, it definitely hurts 1597 01:31:23,439 --> 01:31:25,400 Speaker 1: you at work. I mean, my son works in 1598 01:31:25,439 --> 01:31:30,120 Speaker 1: New York, he's a software developer, and, um, you saw that: 1599 01:31:30,200 --> 01:31:32,519 Speaker 1: you know, there was this one young guy who 1600 01:31:32,560 --> 01:31:35,160 Speaker 1: was a great developer, a great guy, very nice to everybody, 1601 01:31:35,160 --> 01:31:37,920 Speaker 1: but he was identified as a Trump supporter. Yikes. I 1602 01:31:37,960 --> 01:31:41,320 Speaker 1: saw people get crucified where I worked, man. Um, 1603 01:31:41,320 --> 01:31:44,000 Speaker 1: where I worked, like, the next day in the women's bathroom, 1604 01:31:44,040 --> 01:31:46,120 Speaker 1: I found out because they put it up on social media, 1605 01:31:46,400 --> 01:31:49,800 Speaker 1: they created, like, a shrine to Hillary on the mirror, 1606 01:31:50,200 --> 01:31:52,559 Speaker 1: and it had, like, all of these, you know, 1607 01:31:52,800 --> 01:31:56,360 Speaker 1: women's empowerment things and pictures of Hillary Clinton. And the 1608 01:31:56,400 --> 01:31:58,879 Speaker 1: only reason it got taken down is there were a handful 1609 01:31:58,960 --> 01:32:01,280 Speaker 1: of female Trump supporters there that were like, I'm not 1610 01:32:01,360 --> 01:32:04,320 Speaker 1: okay with this. What happened? What happened to the guy 1611 01:32:04,320 --> 01:32:08,160 Speaker 1: in your son's office? Well, a good portion of the 1612 01:32:08,200 --> 01:32:11,280 Speaker 1: office didn't want to work with him: we won't work 1613 01:32:11,280 --> 01:32:16,040 Speaker 1: on projects with him. Wow. Unbelievable ostracizing. Yeah, he had 1614 01:32:16,040 --> 01:32:18,919 Speaker 1: to go do something else. There really wasn't any choice if 1615 01:32:19,040 --> 01:32:20,639 Speaker 1: no one's going to work with you. And then it's 1616 01:32:20,680 --> 01:32:23,479 Speaker 1: like... and then you could argue the opposite, saying, you know, hey, 1617 01:32:23,479 --> 01:32:26,080 Speaker 1: he's one of the best. I mean, very, very personable, 1618 01:32:26,280 --> 01:32:28,880 Speaker 1: wouldn't say a negative thing, versus, you know, other people 1619 01:32:28,920 --> 01:32:32,280 Speaker 1: who are really nasty. Yeah, yeah, they just didn't have that 1620 01:32:32,280 --> 01:32:35,280 Speaker 1: political problem. Well...
I mean, meanwhile, you can have people like that, 1621 01:32:35,360 --> 01:32:38,360 Speaker 1: like that Attorney General of ours, who, you know, is 1622 01:32:38,439 --> 01:32:41,200 Speaker 1: very open: Me Too, I'm on your side, girls. But 1623 01:32:41,280 --> 01:32:47,599 Speaker 1: in reality, you know, he's a monster. Very different. Yeah. You know, 1624 01:32:47,960 --> 01:32:51,120 Speaker 1: it's important to remember that the US has a 1625 01:32:51,240 --> 01:32:54,080 Speaker 1: very puritanical streak. I mean, you look at, I mean, 1626 01:32:54,160 --> 01:32:57,880 Speaker 1: look at Prohibition. Yeah, I mean, there's a reason... 1627 01:32:57,920 --> 01:32:59,720 Speaker 1: I guess there was a good reason why 1628 01:32:59,760 --> 01:33:02,240 Speaker 1: we had Prohibition, because Americans drank, like, four times more 1629 01:33:02,280 --> 01:33:05,560 Speaker 1: than any other country in the world up until Prohibition. 1630 01:33:05,600 --> 01:33:08,720 Speaker 1: I mean, we were just pickled. I mean, 1631 01:33:08,720 --> 01:33:10,640 Speaker 1: I can't believe the amount of hard alcohol we used 1632 01:33:10,640 --> 01:33:16,360 Speaker 1: to drink relative to everybody else. And, you know, what 1633 01:33:16,400 --> 01:33:20,400 Speaker 1: other country could shut down alcohol use? Well, we also, 1634 01:33:20,920 --> 01:33:25,160 Speaker 1: I mean, we expect moral purity, I feel like, from people, 1635 01:33:25,320 --> 01:33:27,879 Speaker 1: you know, especially in how we build up our heroes 1636 01:33:27,960 --> 01:33:30,680 Speaker 1: and then we like to tear them down. It's like, 1637 01:33:31,040 --> 01:33:34,720 Speaker 1: I mean, nobody is a saint. Yeah. 1638 01:33:36,200 --> 01:33:39,080 Speaker 1: But we're in this kind of uncharted territory where Trump 1639 01:33:39,080 --> 01:33:42,200 Speaker 1: broke all those rules, so none of that old shaming worked. 1640 01:33:43,200 --> 01:33:45,640 Speaker 1: And, you know, in kind of hyper-reaction to that, 1641 01:33:45,720 --> 01:33:48,360 Speaker 1: this new kind of puritanism, this new moral 1642 01:33:48,439 --> 01:33:54,160 Speaker 1: network, has emerged, this consensus morality, to kind of 1643 01:33:54,160 --> 01:33:58,920 Speaker 1: refashion, you know, this public morality in a way 1644 01:33:58,960 --> 01:34:02,160 Speaker 1: that they can in a post-Trump world. And I 1645 01:34:02,240 --> 01:34:05,800 Speaker 1: saw you had written something, uh, back on the gun 1646 01:34:05,880 --> 01:34:08,599 Speaker 1: issue that I thought was interesting, saying that what you're 1647 01:34:08,600 --> 01:34:12,879 Speaker 1: seeing is these mega-corporations, these transnational corporations, are stepping 1648 01:34:12,920 --> 01:34:16,479 Speaker 1: in to fill a perceived void in governance. They're seeing 1649 01:34:16,479 --> 01:34:19,640 Speaker 1: that our government isn't taking action, that at least a 1650 01:34:19,720 --> 01:34:24,040 Speaker 1: large portion of Americans want some sort of action, um, 1651 01:34:24,080 --> 01:34:28,040 Speaker 1: and so you're seeing corporations, unelected corporations, step 1652 01:34:28,080 --> 01:34:32,800 Speaker 1: into that void. Yeah, it's all kind of 1653 01:34:33,120 --> 01:34:35,760 Speaker 1: tied to that whole post-Westphalian, you know, 1654 01:34:35,800 --> 01:34:40,160 Speaker 1: hollowing out of the nation state.
I mean, look, there 1655 01:34:40,160 --> 01:34:42,200 Speaker 1: was a guy, Philip Bobbitt, who wrote a book, The Shield 1656 01:34:42,200 --> 01:34:44,519 Speaker 1: of Achilles, back in two thousand and one or so, 1657 01:34:44,920 --> 01:34:47,599 Speaker 1: and, uh, it was, like, very popular at the time, 1658 01:34:47,600 --> 01:34:49,160 Speaker 1: and he said we're moving from a nation state to 1659 01:34:49,200 --> 01:34:52,599 Speaker 1: a market state. And the idea is, a nation state 1660 01:34:52,640 --> 01:34:55,559 Speaker 1: has borders, and, you know, it provides a 1661 01:34:55,600 --> 01:34:57,920 Speaker 1: promise to people: we will protect you and we will provide 1662 01:34:57,920 --> 01:35:04,040 Speaker 1: you economic advancement. And what we saw with globalization 1663 01:35:04,160 --> 01:35:06,120 Speaker 1: is that we moved towards something called the market state, 1664 01:35:06,160 --> 01:35:10,400 Speaker 1: and a market state, like, it doesn't say it's going to 1665 01:35:10,479 --> 01:35:13,519 Speaker 1: protect you per se. It's gonna give you some level 1666 01:35:13,560 --> 01:35:16,280 Speaker 1: of protection, but the main thing it really provides 1667 01:35:16,280 --> 01:35:21,840 Speaker 1: you is, you know, opportunity. And, you know, it does 1668 01:35:21,880 --> 01:35:25,440 Speaker 1: that by lowering barriers: lowering barriers to, you know, immigration, 1669 01:35:26,000 --> 01:35:31,639 Speaker 1: lowering cultural barriers, gender barriers. All 1670 01:35:31,680 --> 01:35:35,080 Speaker 1: the barriers that were up are wiped out 1671 01:35:35,160 --> 01:35:40,440 Speaker 1: and you're now directly connected to the world. Wow. And 1672 01:35:40,479 --> 01:35:45,559 Speaker 1: we're in that kind of market state situation now, and, um, 1673 01:35:45,600 --> 01:35:47,280 Speaker 1: you know, a good portion of people are saying, wait 1674 01:35:47,280 --> 01:35:53,280 Speaker 1: a second, uh, we liked a few barriers, we liked 1675 01:35:53,280 --> 01:35:56,080 Speaker 1: some level of, you know, social safety nets, and 1676 01:35:56,160 --> 01:35:59,559 Speaker 1: we liked certain ethical and moral behavior, you know, by 1677 01:35:59,560 --> 01:36:05,320 Speaker 1: our, um, establishment elites, and this new market state 1678 01:36:05,360 --> 01:36:08,559 Speaker 1: doesn't have that. It's, it's hollow, I think. 1679 01:36:08,600 --> 01:36:10,920 Speaker 1: I mean, the premise of globalization is that every 1680 01:36:11,000 --> 01:36:15,120 Speaker 1: human being is in competition with every other human being globally. Yeah, 1681 01:36:15,160 --> 01:36:20,920 Speaker 1: but in a market state, you're just directly in competition. 1682 01:36:20,920 --> 01:36:24,840 Speaker 1: There's no help from the government, the state. Yeah, the state 1683 01:36:24,920 --> 01:36:28,040 Speaker 1: is just making that ability to compete, the ability to 1684 01:36:28,040 --> 01:36:33,160 Speaker 1: get that information, easy and without any barrier to it. 1685 01:36:33,600 --> 01:36:36,240 Speaker 1: And, you know, that allows a lot of bad behavior 1686 01:36:36,280 --> 01:36:39,439 Speaker 1: at the corporate level and the elite level, and, um, 1687 01:36:39,479 --> 01:36:43,639 Speaker 1: it makes government very ineffective.
It gets less and less 1688 01:36:43,680 --> 01:36:48,040 Speaker 1: capable of actually undertaking anything. It loses control of its borders, 1689 01:36:48,040 --> 01:36:50,880 Speaker 1: it loses control of its people flow, it 1690 01:36:50,920 --> 01:36:53,720 Speaker 1: loses control of its finances, and it loses control of 1691 01:36:53,760 --> 01:36:59,519 Speaker 1: its economy, trade, uh, obviously, you know, control of the information flow, 1692 01:37:00,080 --> 01:37:03,640 Speaker 1: control of the media. All the things that 1693 01:37:03,800 --> 01:37:06,880 Speaker 1: used to, you know, prop up the nation state disappear 1694 01:37:07,080 --> 01:37:11,439 Speaker 1: or diminish substantially. Well, I guess to kind of finish 1695 01:37:11,479 --> 01:37:16,440 Speaker 1: off this interview, which has been pretty wide-ranging but fascinating: um, 1696 01:37:16,479 --> 01:37:19,160 Speaker 1: I know you've done some work on resiliency and things 1697 01:37:19,200 --> 01:37:20,559 Speaker 1: like this, and I just want to kind of pick 1698 01:37:20,560 --> 01:37:24,840 Speaker 1: your brain, your current thoughts about potential models that might 1699 01:37:24,880 --> 01:37:28,680 Speaker 1: work for us in the future, um, some solutions. I 1700 01:37:28,720 --> 01:37:31,040 Speaker 1: mean, there probably is no, like, silver bullet, of course, 1701 01:37:31,120 --> 01:37:35,480 Speaker 1: but, um, some potential models for how the future economy, 1702 01:37:35,640 --> 01:37:38,200 Speaker 1: future governance, and I guess what it all comes down 1703 01:37:38,200 --> 01:37:40,559 Speaker 1: to is how we relate to one another as 1704 01:37:40,640 --> 01:37:44,559 Speaker 1: human beings, um, how could you potentially see that 1705 01:37:44,680 --> 01:37:50,040 Speaker 1: functioning in the future? Well, there's, um, you know, the 1706 01:37:50,080 --> 01:37:53,360 Speaker 1: thing I got into, you know, after the financial crisis: 1707 01:37:53,439 --> 01:37:57,880 Speaker 1: this idea of, you know, decentralized resilience, that you 1708 01:37:58,000 --> 01:38:01,839 Speaker 1: produce more locally, you have more local control, you connect 1709 01:38:01,880 --> 01:38:04,720 Speaker 1: with the bigger system on your own terms, instead of 1710 01:38:05,040 --> 01:38:09,440 Speaker 1: having the terms dictated to you because of your dependence. 1711 01:38:10,360 --> 01:38:12,639 Speaker 1: So you produce more energy, and you produce more food, 1712 01:38:12,680 --> 01:38:16,760 Speaker 1: and, uh, you'd have local communities and networks that 1713 01:38:16,800 --> 01:38:19,519 Speaker 1: would protect you and sustain you if things broke down. 1714 01:38:20,880 --> 01:38:23,639 Speaker 1: And that made sense, because we saw that the big 1715 01:38:23,760 --> 01:38:29,160 Speaker 1: global system, it almost broke. Yeah. And, you know, it's 1716 01:38:29,240 --> 01:38:31,559 Speaker 1: very easy now: you break one thing 1717 01:38:31,600 --> 01:38:36,919 Speaker 1: in one piece of the infrastructure, the global infrastructure, and 1718 01:38:36,560 --> 01:38:40,640 Speaker 1: it generates cascades of failure that go everywhere, and they 1719 01:38:40,680 --> 01:38:44,600 Speaker 1: amplify and things get worse and worse, uh, and 1720 01:38:44,640 --> 01:38:46,439 Speaker 1: things break down for quite a long time.
You could 1721 01:38:46,479 --> 01:38:50,519 Speaker 1: end up without power and without, uh, supplies for weeks 1722 01:38:50,520 --> 01:38:55,479 Speaker 1: at a time, um. Or if you're, like, Puerto Rico, 1723 01:38:55,520 --> 01:38:57,320 Speaker 1: if you're outside of this kind of golden circle of 1724 01:38:57,360 --> 01:39:00,720 Speaker 1: people who get the kind of high-end support, uh, 1725 01:39:00,960 --> 01:39:04,120 Speaker 1: you'll be short-changed. You won't even get any of that. 1726 01:39:04,479 --> 01:39:06,360 Speaker 1: Then you're on your own, once things break, for 1727 01:39:06,400 --> 01:39:11,080 Speaker 1: a lot longer, um. And, you know, there's lots of 1728 01:39:11,080 --> 01:39:13,040 Speaker 1: cool stuff you can do at the individual level to 1729 01:39:13,080 --> 01:39:16,400 Speaker 1: make your house more productive. You know, I put in 1730 01:39:16,439 --> 01:39:19,760 Speaker 1: a generator, for instance. It allowed me, you know, 1731 01:39:19,760 --> 01:39:24,519 Speaker 1: when we lost power for two, three days, it was 1732 01:39:24,560 --> 01:39:27,200 Speaker 1: a nonentity for me. It 1733 01:39:27,200 --> 01:39:30,000 Speaker 1: didn't faze me. Within five seconds (I'm sorry about the dogs), 1734 01:39:30,240 --> 01:39:33,439 Speaker 1: within five seconds the generator is on, and it 1735 01:39:33,520 --> 01:39:37,840 Speaker 1: was a whole-house generator, and, you know, uh, it 1736 01:39:37,880 --> 01:39:40,600 Speaker 1: seemed like just a normal day. Um. And then it 1737 01:39:40,680 --> 01:39:43,439 Speaker 1: made it possible for me to provide, 1738 01:39:44,280 --> 01:39:47,639 Speaker 1: you know, recharging and hot showers to anyone else who 1739 01:39:47,640 --> 01:39:51,080 Speaker 1: needed it in my neighborhood, who didn't have a generator. 1740 01:39:51,200 --> 01:39:57,680 Speaker 1: So it made everyone's life better. Um. But, you know, that 1741 01:39:57,800 --> 01:40:03,120 Speaker 1: kind of resilience may not work, um, particularly well if 1742 01:40:03,160 --> 01:40:08,680 Speaker 1: there's kind of civil strife, or violent civil strife. That 1743 01:40:08,800 --> 01:40:13,280 Speaker 1: becomes an entirely different beast, and how that plays out, 1744 01:40:13,400 --> 01:40:19,200 Speaker 1: you know, in this kind of modern environment, is 1745 01:40:19,840 --> 01:40:21,920 Speaker 1: a problem. I mean, what we've seen is that 1746 01:40:21,960 --> 01:40:25,519 Speaker 1: everything falls apart. There's, you know, dozens 1747 01:40:25,560 --> 01:40:28,920 Speaker 1: of groups and there's violence everywhere, and no one can 1748 01:40:28,960 --> 01:40:32,640 Speaker 1: agree on anything, and, um, the atrocities just mount and 1749 01:40:32,720 --> 01:40:36,439 Speaker 1: things break down, and, um, you don't have time to 1750 01:40:36,479 --> 01:40:39,840 Speaker 1: actually plant things, and you won't get, you know, 1751 01:40:39,880 --> 01:40:42,000 Speaker 1: any kind of real benefit from the kind of local 1752 01:40:42,040 --> 01:40:45,120 Speaker 1: resilience that you would have if you were focused on 1753 01:40:45,240 --> 01:40:48,479 Speaker 1: those kinds of systems. So we are going to have to 1754 01:40:48,520 --> 01:40:51,639 Speaker 1: find some sort of renewed social contract in some shape 1755 01:40:51,720 --> 01:40:56,360 Speaker 1: or form. Yeah, that's it exactly.
So the 1756 01:40:56,360 --> 01:40:58,639 Speaker 1: way we get through this is to find some new 1757 01:40:58,720 --> 01:41:03,519 Speaker 1: source of fictive kinship. Is there something that unites us, connects us, 1758 01:41:04,479 --> 01:41:08,960 Speaker 1: so that when you interact with somebody, uh, they're not 1759 01:41:09,040 --> 01:41:11,479 Speaker 1: an enemy, they're not somebody... you know, 1760 01:41:11,680 --> 01:41:14,120 Speaker 1: if you transact with them, you're not trying to, you know, 1761 01:41:15,560 --> 01:41:18,160 Speaker 1: get one over on them or do damage 1762 01:41:18,200 --> 01:41:21,200 Speaker 1: to them. Um, that you can trust people, when you 1763 01:41:21,240 --> 01:41:24,400 Speaker 1: see them, implicitly, because they're part of this larger group. 1764 01:41:24,760 --> 01:41:26,719 Speaker 1: And fictive kinship is different from the kind of blood 1765 01:41:26,800 --> 01:41:29,280 Speaker 1: kinship that we've had, you know, in the initial clans 1766 01:41:29,280 --> 01:41:31,920 Speaker 1: and stuff. This is the idea that it's expansive. It's 1767 01:41:31,960 --> 01:41:34,840 Speaker 1: like anyone who follows these rules and goes through these 1768 01:41:34,920 --> 01:41:38,600 Speaker 1: rituals and believes in this narrative can join, you know. 1769 01:41:38,720 --> 01:41:41,880 Speaker 1: And, you know, it's the same thing that... 1770 01:41:42,200 --> 01:41:45,080 Speaker 1: I mean, when modern humans came out of Africa and 1771 01:41:45,479 --> 01:41:50,360 Speaker 1: ran into the Neanderthals: Neanderthals were clannish at best, 1772 01:41:50,560 --> 01:41:53,960 Speaker 1: extended families at worst, and 1773 01:41:54,200 --> 01:41:57,479 Speaker 1: they were, you know, real kin. I mean, they 1774 01:41:57,479 --> 01:42:00,200 Speaker 1: didn't have this kind of stored information in the 1775 01:42:00,240 --> 01:42:03,400 Speaker 1: social structure, and they couldn't expand beyond 1776 01:42:03,479 --> 01:42:07,200 Speaker 1: that, you know, that DNA relationship, um, 1777 01:42:07,240 --> 01:42:09,760 Speaker 1: and it limited them. And here they were coming up 1778 01:42:09,800 --> 01:42:17,160 Speaker 1: against tribes of people, with multiple families that were not 1779 01:42:17,280 --> 01:42:22,599 Speaker 1: related in the traditional sense, um, and they believed 1780 01:42:22,600 --> 01:42:26,639 Speaker 1: in the narrative. They saw a kinship in the narrative 1781 01:42:26,680 --> 01:42:29,200 Speaker 1: and through the rituals, and the narrative told 1782 01:42:29,240 --> 01:42:33,400 Speaker 1: them that they were special, uh, that there was 1783 01:42:33,439 --> 01:42:36,400 Speaker 1: a reason why they worked together and they joined together, 1784 01:42:36,680 --> 01:42:40,439 Speaker 1: and that they were better off together than apart, um. 1785 01:42:40,800 --> 01:42:43,920 Speaker 1: We have to find that. Yes, I hope so. I 1786 01:42:43,920 --> 01:42:46,720 Speaker 1: hope some people wake up to that idea. I don't 1787 01:42:46,720 --> 01:42:48,320 Speaker 1: know what it is. I mean, I don't know. 1788 01:42:48,479 --> 01:42:49,800 Speaker 1: You know, I looked at the global level and I 1789 01:42:49,800 --> 01:42:53,000 Speaker 1: didn't see anything. I mean, there isn't anything. Those 1790 01:42:53,360 --> 01:42:56,240 Speaker 1: those ideas haven't been invented yet. Yeah.
1791-1825 01:42:56,240 --> 01:44:40,200 Speaker 1: If you look at the environmental narrative, we're a plague on the world, we're destroying it. If you look at the commercial, corporate world, we're exploiting everybody and making them work for pittances, and we're concentrating all the income. If you look at the national level, at what's left of the nation-state, it's all colonial exploitation and wars and tyrannies and the like. So I don't know of any kind of cohesive global narrative. I don't know what could replace what we're seeing with the decline of the nation-state. But we have to have something.

Speaker 1: Interesting. Well, John, I think we've probably taken up enough of your time today. We've covered a ton of different topics here, and when you're covering such broad concepts it's hard to dig down into them. But I heard you articulate a number of ideas that I hadn't really thought of; I hadn't connected the dots in that way. So this has been very helpful, and I think our listeners will really enjoy it as well.

Speaker 1: Oh cool, it's been a lot of fun, and thanks for having me on. What I do is try to come up with frameworks, in the reports and in the way I think, and the frameworks help you deal with the onslaught of new information and kind of categorize it. If you're a decision maker, it unfreezes you. Sometimes when things change too fast, you're frozen; you're unable to act, unable to make decisions. If you have a framework, you can get through. And the framework doesn't have to be correct, just mostly correct. It frees you up to make the decisions you have to make, and it makes you feel more confident about moving forward in the world. And that's what I try to do.
1826-1859 01:44:40,240 --> 01:46:20,800 Speaker 1: Also, I didn't want to interrupt John's train of thought a while back in the interview when he said it, but I'm pretty sure you're the first person on the podcast to use the word "memetic." It went totally over my head at first, and then I had to process it, and then I was like, oh, "meme" is in there, memetic. I'm going to start using that word. So thank you for expanding my vocabulary.

Speaker 1: Well, yeah, this was a lot of fun. Thanks.

Speaker 1: Great, and once again, that's at John Robb on Twitter, who, as you can tell from this interview, is definitely someone you want to follow. And also his book, it's called Brave New War, right?

Speaker 1: Yeah, I read it, but it was a while back. It was definitely an eye-opener, mostly about insurgency, so people should go read that.

Speaker 1: Yeah, and then my report, the Global Guerrillas Report, is on Patreon; it's John Robb, or Patreon slash John Robb.

Speaker 1: Yeah, I saw that as well. So you have a Patreon, so check that out, check out the Global Guerrillas Report. And once again it's John Robb with two Bs, so at John Robb, R-O-B-B, on Twitter and on Patreon. We really appreciate you coming on; this has been a great episode.

Speaker 1: Oh, my pleasure, thanks guys.

Speaker 1: Yeah, and at some point, I should say, I'd love to have you back on to talk about your Air Force days and special operations. I think that could be a whole other episode.

Speaker 1: Oh yeah, our audience is definitely interested in that stuff.

Speaker 1: All right, pretty cool. All right, thanks John.

Speaker 1: All right, take care, guys. It was my pleasure.

Speaker 1: Thanks again to John Robb for coming on. Excellent interview there. I think we went really in depth. And no, we didn't get into any of his special operations background, and I'm sure a lot of you would have liked to have heard that.
1860-1892 01:46:20,840 --> 01:48:07,400 Speaker 1: So I think that a part two is in order at some point, and if you want to hear that, let us know at SOFREP Radio on Twitter, and then individually at Ian Scotto and at Jack Murphy RGR on Twitter. Of course, there's only one club out there with gear handpicked by special operations military veterans from several branches, and that, of course, is Crate Club. Past items we've had in our premium crates have been a survival belt, which included a stainless steel knife, fire starter, bottle opener, and an LED flashlight. Great item; I know both of us have it. Also in a premium crate we had a ballistic shield insert for your backpack made by Crye Precision. Crate Club is really stepping it up right now, though, as time progresses, by putting out custom products that you're not going to find anywhere else. We have different tiers of membership depending on how prepared you want to be, and gift options are available as well. So if this is something you want to give, for example, if you're a female listener, to your husband, or to a friend, check it out. It's at Crate Club dot us. Once again, that's Crate Club dot us, to see all of that and all of the different tiers. For you dog owners, check this out, you're gonna love this. We've just partnered with Cuna. They have a team of trained canine handlers picking out a box for your dog each month of healthy treats and training aids. It's custom built to your dog's size and age as well. US-sourced products, all natural, and they not only promote a healthy diet but also promote being active with your dog. You can see all of that at Cuna dot dog. That's Cuna dot dog. It's efficient for you, and your dog will appreciate it as well, of course. And that's spelled C-U-N-A dot D-O-G. We just sent out the first box and people are loving it.
1893-1927 01:48:07,439 --> 01:49:55,720 Speaker 1: We have a multivitamin in there for your dog, a toy for your dog, and it's all just great stuff that they're going to really love, and it's going to be helpful to you as a dog owner. Also, as a reminder for all of those who are listening: for a limited time, you can receive a fifty percent discounted membership to the Spec Ops Channel, our channel that offers the most exclusive shows, documentaries, and interviews covering the most exciting military content today. The Spec Ops Channel premiere show, Training Cell, follows former Special Operations Forces as they participate in the most advanced training in the country: everything from shooting schools, defensive driving, jungle and winter warfare, climbing, and much more. Again, you can watch this content by subscribing to the Spec Ops Channel, and that's at spec ops channel dot com, where you can take advantage of a limited-time offer of fifty percent off your membership. That's only four ninety-nine a month, and the app is available now on iOS for you iPhone users; Android will be available early next month. I think that wraps it up. We went pretty long here. You know what I'm gonna plug, and I have nothing to do with it, I was telling you before: check out Cobra Kai on YouTube. It's fucking great. I love it, and everybody who's watched it has really enjoyed it. And like I say on the podcast pretty often, I'm not even a big movie or show guy.

Speaker 1: On a recent show you were saying how everything is a remake, everything is a reboot now, and I would agree, and they're usually very poorly done. This is a reboot that kept that classic eighties feel of the original Karate Kid, but updated it to now. I'm loving it, and I'm only two episodes in, but I'm probably gonna watch some more of that when I get home. Yeah, I know this has nothing to do with anything, but I'm throwing it out there because it's of interest to me.
1928-1938 01:49:55,840 --> 01:50:38,599 Speaker 1: Well, thanks for checking this out. We've got a lot of other great stuff lined up for you for the rest of the month, and we appreciate all of you.

Speaker 1: You've been listening to SOFREP Radio. New episodes are up every Wednesday and Friday. For all of the great content from our veteran journalists, join us and become a Team Room member today at sofrep dot com. Follow the show on Instagram and Twitter at SOFREP Radio, and be sure to also check out the Power of Thought podcast, hosted by Hurricane Group CEO and Navy SEAL sniper instructor Brandon Webb.