Brute force: if it doesn't work, you're just not using enough. You're listening to SOFREP Radio, special operations, military news, and straight talk with the guys in the community.

SOFREP Radio: on time, on target. Very excited to have Peter Singer, or P.W. Singer, on with us. I don't know which I should refer to you as, so let's go with P.W.; that works. P.W. Singer, because I know on Twitter you're Peter W. Singer, so I'm not, you know, giving anything away here. The book is LikeWar: The Weaponization of Social Media. P.W. Singer is also a senior fellow at New America, which I'm actually interested to get into and hear about what New America is exactly. But we're excited to talk about the book, and it's pretty cool that you co-authored it. The book came out very recently and was sent to us. Jack read it cover to cover, and I'm excited to dig into it.

Well, thanks for having me.

Absolutely, yeah, really a pleasure, Peter. I also have your previous book, Ghost Fleet, on my bookshelf, and I have yet to read it, I'll fess up to that, but I did read this book. You know, we take a certain amount of pride in actually reading the books of the authors that we have on the show, and we try to make it through as many of them as we can, and I found this book to be very interesting. It really encompasses the history of communications mediums; you went back to the printing press and wire cables, the first transatlantic cables, and then talked about the weaponization of social media and how it can be used to propagandize: how states can use it to propagandize other states, or how states can use it for internal suppression in their own countries.
So it's a very interesting kind of top-to-bottom history of what is really an emerging subject that we're just trying to come to grips with, I think, here in the United States. So I guess the first question I wanted to start off with, although it isn't covered in your book because it's happening in real time, is this: we're going into midterm elections, and we're seeing the headlines that there could be meddling in these elections like there was in 2016. I was wondering if you could talk a little bit about how things have changed since then, any new trends you're seeing, any new actions that are being taken, perhaps by foreign governments or by our own government, to protect the sovereignty and the validity, I suppose the legitimacy, of our elections.

Sure. So first, thanks again for having me on. Really appreciate it. The title of the book is LikeWar. If you think of cyber war as the hacking of networks, like war is about the hacking of the people on the networks, through likes and lies, driving ideas viral. It covers everything from the use of this by organizations like ISIS, to Russian disinformation operations, to how it was the secret sauce for the Trump campaign, to how we're seeing it used by celebrities like Taylor Swift and the like. In many ways, we're finding that social media is not just the nervous system of the modern world, which we use for everything from dating to business marketing to you and I having this conversation and then posting it up on YouTube and the like. It's also become a battlefield. And what's fascinating about that battlefield is that you have these wildly diverse actors who end up using almost the same exact tactics. So you have Taylor Swift and Junaid Hussain, who was ISIS's top recruiter, operating in much the same way, because online their goal is the same, even though they have very different real-world goals.
And as we talk about in the book, it's not just about winning the web; it allows you to win your battles off the web. The examples range from how you can use it to win an election to how ISIS used it to win the First Battle of Mosul, and the like.

So, to your question about what played out in 2016 and then what's playing out right now: obviously, we saw a pretty massive campaign that was conducted by Russia. We were not the first target of this. They were using very similar tactics against the Ukrainians, and around Brexit, and there was very much a bit of American arrogance, a "well, no, that couldn't be done to us." And then came the campaign. The details of it have kind of dribbled out over time, and the data now shows that it was much, much bigger than I think most people appreciate. So, as an example, the Russian false information that was pumped out on Facebook: over 146 million Americans saw it in their Facebook feeds. That's half the population. And that's just Facebook; you're also looking at content on Twitter, on Instagram, et cetera. And again, much like ISIS, much like a good marketer for a new movie, they moved across the platforms, where they would use Reddit to push out ideas but then move the images over to another space. And it was incredibly cheap. At least per the latest information, the annual budget for the Russian operation was about twelve million dollars. As you all know, the Pentagon sneezes out twelve million dollars on a slow day. But it was effective. And we can have a debate about whether it swung the election; that's for the historians. What we do know is that the attackers think it worked, because they're back at it.
They never stopped, and we've seen some examples popping up since then. This again is one of the key tactics: they're jumping into other conversations, trying to leverage them to their own ends. So we've seen the Russian disinformation campaigns try to wedge into everything from the protests in Charlottesville, to the controversy over protests during the NFL anthem, to the Nike boycott, to the elections. It's continuing. But what's important about it is that other actors are out there watching and learning and saying, hey, that worked for them, we can do it too. And so we've seen it pop up in everything from the Brazilian election campaign that's going on right now to the Mexican election. We've seen domestic actors learn from these tactics and apply them. So the Kavanaugh hearing, as an example, the Supreme Court justice hearing: we saw both sides of that debate using the very same, what we call, LikeWar tactics to try and win their real-world battle. And the reason, again, is because people think it works. And frankly, you know I'm biased, but I would say it does; the data shows that it does work.

Now, okay, what do we do about it? What's government doing about it? I can give kind of a longer answer, but basically the challenge right now is that the United States may have been the nation that invented the Internet, but we're the nation that governments, and particularly militaries, around the world look at and say, don't let what has happened to America happen to us. We're the example of what not to do. So if you look at the measures the Baltics have put into place, et cetera, they'll cite, look at what happened to the Americans, this is why we're doing X, Y, and Z. So there is a greater awareness of these campaigns. We're even weaving them into our own operations, particularly in the special operations community.
There's a really interesting back and forth that we play out in the book of how ISIS was using certain tactics in taking over Mosul that surprised us, and it's part of the story of how they defeated a much larger force. But when it comes time a couple of years later for us to take back Mosul, we're using many of the same tactics; we've been watching and learning them, even using some of the tactics the Russians were using against our election. So there's this kind of learning process that's going on. But when it comes to overall strategy and policy, particularly beyond the military, we're not yet dealing well with it.

The same process has played out at the companies. I liken it to the companies having gone through the stages of grief. They were initially in denial about what was playing out on their networks, that their networks had become these kinds of war zones. You can see this with Zuckerberg, immediately afterward, saying it's a, quote, "pretty crazy idea" that we could have had these problems of fake news and disinformation, and even more, that it could have influenced people's votes. And what's ironic is he's saying that at the exact same moment that Facebook is advertising to political campaigns that Facebook is the best place to influence someone's vote, and they've got all the data to back it up. So he goes from saying it's a pretty crazy idea two years ago, to now Facebook has created a, quote, "war room" to try and manage this problem, and Zuckerberg describes himself as being in a, quote, "arms race" against Russian disinformation campaigns. But it points again to one of these sea changes we talk about in the book: some of the most important actors in war and politics right now are a handful of tech inventors, where basically the policy changes that they decide on, at Facebook or Twitter, tilt the battlefield one way or the other.
And obviously, they were kind of tilted to advantage disinformation warriors and terrorist groups. Now we're trying to get them to tilt it back, but you also get these really fascinating challenges of whether they should have that power or not. So I'll end on a great illustration of the change. For several years, Facebook basically ignored how it was being used as a space not just to call for mass killings in Myanmar, but to coordinate around them. And this played out. But then a couple of weeks ago Facebook said, you know what, the generals who are the leaders of Myanmar, you're kicked off Facebook. And this is notable because in Myanmar, Facebook pretty much is the Internet. It's everyone; it's not that you're using multiple different platforms, it's all funneled through that. So basically, a private company just decided to make it more difficult for a nation's leaders to speak to their citizens. I would argue for very good reasons, because those leaders were calling for mass killings. But it shows the crazy changes that have played out in a very short period of time, when you circle back to Zuckerberg inventing Facebook basically as a means for students to rate who was hot or not, and now they've got this power in war and politics.

There's this great meme to that effect where it says something like, you created a system to find out who was the hotter girl, but you've influenced an election, or, you know, I don't remember exactly what it says. Really, it's where Zuckerberg's got his hand on his face, like, oh my god, what have I done? No, I think it's a fascinating subject, and I'm glad that you put it together in a readable format, rather than people trying to scoot around the Internet and read previous work that you or others have done, trying to piece it all together on their own.
One thing that I think you put your finger on in the book which is interesting is that some of these foreign state actors exploit the freedom that we have in the Western world, things like freedom of speech and freedom of expression, and use these as vehicles to propagandize the American public. And it seems that it's something that, as Americans, we're very naive about; we have this impression that people out in the world are genuinely good actors overall. We don't have our skeptic's hat on when we view some of this information. I was wondering if you had any conclusions about why, and whether you think Americans are particularly susceptible to this type of propaganda, or is it just human nature across the board?

So first, I wanted to thank you for pointing out that idea of the approach of the book. One of the things that we tried to do is not just tell the story. Every chapter begins with a fascinating story, an interesting character, and the like, because that's actually one of the lessons of what works in this space: the power of narrative. It's, in turn, frankly, why a lot of our own campaigns haven't done well compared to, say, what ISIS or Russia has been putting together. We've forgotten the power of narratives. So we tried to have that in the book, and it also just reflected the way we researched it, where we went around interviewing all the key players, and from them you get these really fascinating stories. What does the creator of the Internet think about what's happened to his baby? What does everyone from a line officer in the military, all the way up to generals, to extremist group recruiters, to reality stars, what do they all think about what's played out in this? And we tried to be careful, too, and this goes to the core of your question.
There's a lot of pessimism surrounding both social media in general right now and, in particular, how America is doing at it. And we tried, every time we talked about a phenomenon, to talk about both the good and the bad side of it: talk about a good guy using it to win their wars, but also here's how a bad guy is using it. A good example is this: one of the changes that social media has brought, in particular to the operating space for your audience, is that when you have this combination of all these sensors out there and everyone able to share on social media, it has essentially ended secrets. A great example would be the bin Laden raid. It was supposed to be highly classified, and yet you had a Pakistani cafe owner who was up late at night in Abbottabad. He hears the helicopters coming in, and what does he do? The now natural thing: he goes on to social media and complains. But his complaints about one helicopter, second helicopter, first explosion, second explosion, they double as live battlefield reports. And I've been involved more personally in examples of this, where we had to go tell people in SOCOM, hey, this operation that you think is super secretive in Syria or in Libya, here's where people are talking about it online.

Can I tell you the craziest thing in terms of the bin Laden raid? I swear to you, and I know I'm not the only person who had this experience: I learned about the bin Laden raid, when it happened, on Twitter from Tom Green, the comedian Tom Green, and he was tweeting about it before Fox News, before CNN, before MSNBC. I don't know how he was privy to it.
I'm sure someone he was following tweeted about it, but that just shows the absurdity of social media: that's who you were learning this from, some guy who was a popular comedian in the late nineties and early two thousands, before any news network.

So there was some kind of chain connection between the original cafe owner, the guy who was on the scene, and basically enough people retweeted it and then it spreads out. So you have this sort of example of an end of secrecy. On one hand you can say, wow, this unveils operations, and people will say, well, I don't like that. On the other hand, we tell the story of a group of volunteer, work-from-home Sherlock Holmeses, online detectives who use social media to track down war crimes that wouldn't otherwise be detected. For example, the team that proved that Russia was behind the shoot-down of the airliner over Ukraine a couple of years ago, killing 298 civilians. So you get this wonderful back and forth that comes out of this.

Now, your question, though, is why has it been so bad for the US? I think it's not just a particularly American phenomenon, though maybe we're a little more prone to it. When things go bad with technologies and policy, it's usually a combination of two things: arrogance and ignorance. Arrogance: it might be the arrogance of "this couldn't happen to us," the way we talked about. This may be something that we see hitting other nations, but it couldn't happen to us. Or this may be being used to fight against other nations' militaries, but we don't have to pay attention to it. And again, all the phenomena we talk about in the book had a history to them.
Things that were playing out against us in 2016 were happening earlier, things that ISIS was doing, et cetera; we were seeing Hamas do that to the Israelis back in earlier periods. So there's first a kind of arrogance of, this couldn't happen to us. There's also a little bit of arrogance among the technology creators: of course the thing I create is going to be used for good, not asking, okay, how might bad actors misuse it? And then you have what the book is really trying to address, the side of ignorance. If these technologies have created a set of new rules of the game, new rules for how you operate, new rules for the nature of secrecy, new rules for how you win battles, et cetera, then we had better learn those new rules. And the book tells the tale of how the people who understand the rules, whether they are ISIS or Taylor Swift, are using that to win their wars, and in turn, the ones who don't understand the rules are the ones getting their clocks cleaned.

And that, again, is something where we need to catch up. You can see that in everything from US Senate hearings where senators are flailing, I mean, you may have seen the hearing where they were interacting with Mark Zuckerberg and just showing how poorly they understood it, it was just embarrassing, but also in examples that have really flummoxed the special operations community. So, from conversations with people in this field, they'll talk about how there's a very different way that they've approached messaging: essentially, they'll spend days and days creating PowerPoints to find that one perfect message that's targeted at one person, both of which are not how you win in this space. You've got to do it quickly, there's not one perfect message, you have to proliferate it out widely, and you're not trying to go after that one individual. It's about operating at scale.
But so I have these kinds of conversations with people in the community, and they say, we go in believing that we will lose the battle of the narrative. And the battle of the narrative, particularly when you're talking about counterterrorism and counterinsurgency, that's the battle that might matter most. That's the battle that really will decide not just the adversary's recruiting, but the interactions with the local populace and the like. And we're going into these situations, and again, this is a complaint from folks that I've met with, their complaint is that they believe they're going in on the losing side. So we've got to understand the new rules to remedy that.

Well, aren't there also a lot of legal and policy restrictions as far as how the US military goes about this? We can use the term strategic messaging or propaganda, I guess it all depends on your point of view. But there are things like the Smith-Mundt Act, I believe, which is one of the main ones. It actually applies to the State Department, I believe, but DoD accepts it as a norm: that if we attempt to do some sort of strategic messaging on the Internet, there's no guarantee that American citizens won't in turn see that, quote unquote, propaganda and be influenced by it.

So there are definite challenges in terms of laws that were set up for a different era, policies that haven't caught up to this new kind of technology. You mentioned one on the external messaging side. One that's made us particularly vulnerable on the internal messaging side is how we handle political commercials. It's not perfect, but it's regulated on radio and TV. On social media, political ads are regulated in the same way that skywriting is, that is, the smoke that comes out of the plane; they're actually regulated under the very same law.
And of course social media does not just disappear and that's it, only seen by that one person there. It proliferates, it reaches wider, and it's more influential than TV or radio ads right now. So there's a lot of policy and legal catching up that needs to go on. But I will, you know, push back at you in some way. There are often complaints, particularly within the military, that we can't do X, we can't do Y. The reality is often that the things preventing us from doing X and Y are not law. It's internal culture, it's internal bureaucracy, it's sticking to an old doctrine.

A failure of imagination.

Yeah, exactly. And then there's another thing where, let's just admit, there are some things that we do in secret squirrel land that are different than what we complain about. So, for example, and I can say it, maybe you can't, there are examples of contracts put out by the US military to allow one person to control multiple social media accounts simultaneously under false personas. That's basically the Russian disinformation model; we've just learned how to copycat them. We do it bureaucratically, put out a contract for it, and we were just using it. These are among many of the lessons learned more recently. This was for the CENTCOM AOR. So there are some things we do in secret squirrel land; it's not that there's nothing we can do in this space.

Things may have evolved, but I believe actually the FBI has done the best job with persona building, probably better than the military has. But that's another question.

Well, the real question in terms of who has done the best persona building: look beyond government, look at corporate models, look at non-state actor models.
So here's a good illustration. In the book, we have this back and forth in one section of Hillary Clinton versus Wendy's. Hillary Clinton is a real person, but she did not come across as real online, and that was because she had a highly bureaucratic messaging campaign, with echoes of the US military. As many as eleven different people would weigh in on a single tweet from her. By contrast, Wendy's, which is not a person, it's a hamburger chain. Wendy's may have once been a little girl, but it's not right now. Wendy's is known as one of the most effective corporate actors online because it's viewed as real. And again, this is me saying "real" not just as in, oh, that's nice. It's one of the five attributes that we explore in the book: every time something goes viral, every time something wins, it has one of these attributes, and one is this idea of authenticity. And again, you can have that back and forth between learning from what a Wendy's does versus the Hillary Clinton campaign model. That's been one of the challenges of our messaging against, say, ISIS: ISIS was viewed as more real. It had a model that really leveraged that, first in its efforts against its rivals like al-Qaeda, and then in its efforts against the coalition.

Can I also just add, every time Hillary Clinton during the campaign tried to sound culturally relevant or authentic, it really fell on its face. The example I think of first was, do you remember when she was on the campaign trail and she said, I don't know who invented Pokemon Go, but I hope that you all Pokemon Go to the polls? That one went viral, but not because of its authenticity. It was just like, this woman does not know how to come off as a real person.
And this again, if we pull back, and I mean to get kind of wonky with you, connects to a larger discussion. So the book, I mean, we've talked about examples from Taylor Swift to Wendy's, but it also blends in the history of the blitzkrieg and ISIS military operations. One of the changes that's played out goes back to the very first democracies. You have writing from Aristotle about how one of the twists of a democracy in ancient Greece is that it's a government of the people, but then you need a new class of a particular kind of person, and Aristotle comes up with the concept of a politician. So it's someone who is of the people, they're not a king, they're not a priest, they're of the people, but they have decided that they are better than the people and that they should be the leader of the people. And so that's created something that's both a feature and a contradiction, and politicians have long had to do this balancing act of, I'm of you, but, you know, I'm better than you.

Well, then that really leads us into technology. So with the early newspapers, you get the stories in America of, well, he was born in a log cabin. And then with TV you get this basically entire industry of diners in Iowa and New Hampshire that are supported by politicians visiting them to try and show that they're real. What social media brought in is now the individual politician, it's them directly. It's not through the media, it's not through the middle. And if you're looking at Trump, even among people who, and this was from polling, for example, during the Republican nomination, so other Republicans who liked his rivals, whether it's Rubio or Bush or whatever.
In the polling, even when they said they didn't like his policy, the one attribute that they would say about him was, well, he's real, his authenticity. And so that has created a model where, again, whether it was Trump or, to shift over, Junaid Hussain, ISIS's top recruiter, whatever, that attribute of being viewed as real, of being viewed as authentic, is something that wins on the web, and in turn, those who are viewed as inauthentic lose. But what's fascinating about it, and this comes from the interviews, again, whether it's terrorist recruiters to generals to the reality stars: no one is real online. It's all performative, right? It's a conscious authenticity. So when Junaid Hussain or Taylor Swift is engaging with a fan, whether it's an ISIS fan or a music fan, they're doing it in the knowledge that the world is watching. When Donald Trump is saying things the way he does, he's doing it consciously, knowing that he's doing it for a purpose. And that's one of the things, again, where if you look at the military side of it, we've faced a challenge with that kind of approach.

Well, I think there's that debate with filmmakers, isn't there, about whether or not anything on camera can be authentic and real, because the moment you have a camera on you, you're aware of it, you know you're being watched, and you behave differently.

And now we get to the next part of this battle space, maybe even beyond you acting "real," in quotation marks, online. We're moving into a space where that imagery can be manipulated, changed. So we have the challenge of not only falling within our own little echo chambers, where, whether it's US military operations in Iraq or what you think of Kavanaugh as a Supreme Court justice, each of us has our own truths, our own reality.
But now we have the potential of weaving in new technology like artificial intelligence, where you can literally manipulate that imagery and take things that truly are not real and make them look as if they are real. These are called deepfakes, and this is the idea of creating what look like truths, an image or a video. The examples range from speeches that politicians never gave to people appearing in movies that they never starred in. We will see that weaponized too, so, images of US soldiers committing an atrocity that they never did. And so that's the next stage of these battles that looms, and it again points to the need for us to get a handle on these new rules of the game, because if this is a new realm, a new domain of war, we're in the biplane stage of it. The weapons, the tactics, they're still early stage, and ISIS was just the first user. The groups that follow are going to get better at it, do more. In turn, another change that looms for the operating environment, I think for this audience in particular: think of all the changes that have happened with social media so far, whether it's news, politics, or war, and only half the world is online right now. The other half is still to come. So, to circle back to that example of the bin Laden raid, that happened in a country where, at the time it played out, if I remember the data correctly, less than ten percent of the nation was on social media. That's not the case now, and it's certainly not going to be the case moving forward. So all these phenomena are only going to grow, which again means we need to learn about them.

I know we're probably running short on time, but I wanted to get this question in for sure before we come to a close.
This is more of a psychological question than a technological question, but it's touched upon in your book. It's the notion of what happens when people are confronted with the fact that they are propagating false information; I think there's something very interesting that happens in people's minds. And I think we've all had this experience where someone posts something that's completely ridiculous. I remember someone posting this meme of Dr. Ford, the woman who had accused Kavanaugh of sexual assault, and it was a picture of her looking like a promiscuous woman, in a bikini with a bottle of champagne in her hand, implying that she's a promiscuous woman and this and that. And I pointed out, look, that's not her in the photo, that quote you've thrown on the meme is not real, this isn't real. And I find over and over again that when you confront people with that, what they come back with, when they're presented with clear evidence, is, okay, this particular meme may not be factual, but the point is true. What is happening in people's minds? And why is it so hard for people? Why do they cling so desperately to false information, such that despite whatever evidence you give them, you just can't talk them out of it? It's like this deep-seated emotional, ideological need to believe in something.

Yeah, and we could treat this space as if it's a technology space, but it goes back to politics and war. They're about humans; they're about us. And this space itself was designed to push those mental, be they psychological or emotional, buttons. The very design of it. So, for example, there is a reason why, when a message pops up on Facebook, it's in red. It's in red because that fires the part of your brain that really wants to make red go away. It's not just that you notice it.
You want to make away the fires 590 00:36:35,360 --> 00:36:38,359 Speaker 1: that sort of anticipation. Notice that they don't tell you, 591 00:36:38,800 --> 00:36:42,080 Speaker 1: um the importance of the message. Uh. You know. It 592 00:36:42,160 --> 00:36:45,640 Speaker 1: could be the most important thing in the world, or 593 00:36:45,640 --> 00:36:48,800 Speaker 1: it could be an annoying UM person that you barely 594 00:36:48,800 --> 00:36:51,520 Speaker 1: remember from high school reaching out. But you've got to 595 00:36:51,640 --> 00:36:53,840 Speaker 1: go there to get it. So, you know, the design 596 00:36:53,880 --> 00:36:58,600 Speaker 1: of the space UM does that. And it's the same 597 00:36:59,239 --> 00:37:05,360 Speaker 1: in terms of how we UM interact with conspiracy theories 598 00:37:05,360 --> 00:37:12,000 Speaker 1: and falsehoods. UM. The more that we are exposed to them, 599 00:37:12,120 --> 00:37:15,319 Speaker 1: the more we fall prey to them UM. And that 600 00:37:15,440 --> 00:37:19,319 Speaker 1: is both you as an individual UM. And you know, 601 00:37:19,360 --> 00:37:21,759 Speaker 1: all the data shows that if you've posted kind of 602 00:37:21,800 --> 00:37:25,279 Speaker 1: one falsehood, your sort of defenses are broken down to 603 00:37:25,440 --> 00:37:29,080 Speaker 1: doing it more and more. UM. Also scarily, the more 604 00:37:29,200 --> 00:37:33,120 Speaker 1: you post conspiracy theories UM, the more likely the more 605 00:37:33,160 --> 00:37:39,520 Speaker 1: prone you are to extremism, more likely to to voice UM, hate, violence, 606 00:37:39,719 --> 00:37:42,080 Speaker 1: you name it. There's there is a connection. Even when 607 00:37:42,120 --> 00:37:46,000 Speaker 1: the conspiracy theory is not about politics. It might be UM, 608 00:37:46,040 --> 00:37:48,840 Speaker 1: you know, the anti vax or the anti vaccine movement 609 00:37:48,920 --> 00:37:50,960 Speaker 1: or the flat earth movement. It's kind of like if 610 00:37:50,960 --> 00:37:53,560 Speaker 1: you think of it as like a gateway drug. But 611 00:37:53,680 --> 00:37:58,000 Speaker 1: it's also you make everyone else in your network more 612 00:37:58,040 --> 00:38:01,960 Speaker 1: prone to it, so you're aching down the defenses of 613 00:38:02,000 --> 00:38:06,279 Speaker 1: your friends and family UM. And uh So there's you know, 614 00:38:06,320 --> 00:38:08,520 Speaker 1: there's all of these kind of psychological issues, but it 615 00:38:08,520 --> 00:38:11,000 Speaker 1: actually I want to end on UM you know, kind 616 00:38:11,000 --> 00:38:14,680 Speaker 1: of a more positive note. Uh. This points to the 617 00:38:14,760 --> 00:38:19,239 Speaker 1: need for not just national level response. You know, we 618 00:38:19,320 --> 00:38:23,320 Speaker 1: talked about how we've got to adjust our military training 619 00:38:23,440 --> 00:38:27,880 Speaker 1: doctrine to UM. There's models of how the Estonias of 620 00:38:27,920 --> 00:38:31,360 Speaker 1: the world have developed better defenses UM for their nation 621 00:38:31,440 --> 00:38:36,040 Speaker 1: without losing their democracy. Without compromising their rights. Um, there's 622 00:38:36,040 --> 00:38:38,880 Speaker 1: things that the companies can do, but they're also points 623 00:38:38,920 --> 00:38:41,480 Speaker 1: to what you and I need to do as individuals. 
624 00:38:41,520 --> 00:38:45,480 Speaker 1: And I really like the parallel of public health here, 625 00:38:45,480 --> 00:38:49,520 Speaker 1: where in public health each of us there's a role 626 00:38:49,600 --> 00:38:53,520 Speaker 1: for government, there's a role for industry, but it also 627 00:38:53,600 --> 00:38:55,719 Speaker 1: comes down to what you and I do, and part 628 00:38:55,719 --> 00:38:58,360 Speaker 1: of that is having an ethic to um not just 629 00:38:58,440 --> 00:39:02,000 Speaker 1: protect ourselves, but protect everyone around us. So, for example, 630 00:39:02,040 --> 00:39:05,719 Speaker 1: I teach my kids um to cover their mouth when 631 00:39:05,719 --> 00:39:10,040 Speaker 1: they cough. Now, first I don't say well, because I 632 00:39:10,080 --> 00:39:12,920 Speaker 1: teach my kids hygiene. There's no role for government in 633 00:39:12,960 --> 00:39:15,920 Speaker 1: public health, or there's no role for a private industry. 634 00:39:15,920 --> 00:39:19,000 Speaker 1: There's no role for hospitals, but they in turn rely 635 00:39:19,160 --> 00:39:20,920 Speaker 1: on me teaching my kids that. But go back to 636 00:39:20,920 --> 00:39:23,440 Speaker 1: that idea of cover your mouth when you cough. In 637 00:39:23,440 --> 00:39:27,080 Speaker 1: no way, shape or form does that defend only you. 638 00:39:27,719 --> 00:39:31,000 Speaker 1: It's about you taking responsibility for defending everyone else in 639 00:39:31,040 --> 00:39:33,480 Speaker 1: your network. And it's the same phenomena when it comes 640 00:39:33,520 --> 00:39:39,360 Speaker 1: to these issues of disinformation, fake news, falsehoods. We are 641 00:39:39,600 --> 00:39:43,160 Speaker 1: not just targets of it trying to break down our 642 00:39:43,200 --> 00:39:48,920 Speaker 1: own defenses. We are also the combatants. We decide whether 643 00:39:48,960 --> 00:39:52,280 Speaker 1: it goes viral or not. So you know, to that person, 644 00:39:52,440 --> 00:39:55,920 Speaker 1: it's not just hey, this is false it's hey, you 645 00:39:56,000 --> 00:40:01,279 Speaker 1: are responsible for the spreading of that falsehood, and you're 646 00:40:01,400 --> 00:40:05,720 Speaker 1: part of the attack against all of your friends and family. 647 00:40:06,239 --> 00:40:08,560 Speaker 1: And so that's where again we need to you know, 648 00:40:08,600 --> 00:40:12,239 Speaker 1: kind of message this as it's not just about defending you, 649 00:40:12,920 --> 00:40:16,000 Speaker 1: it's not just about defending the nation. It's about defending 650 00:40:16,040 --> 00:40:19,640 Speaker 1: your friends and family. And you know that's hopefully some 651 00:40:19,680 --> 00:40:21,759 Speaker 1: of the sort of the positive side lessons that comes 652 00:40:21,760 --> 00:40:24,080 Speaker 1: out of this project is to better equip all of 653 00:40:24,160 --> 00:40:26,920 Speaker 1: us in that kind of battles that are playing out 654 00:40:27,360 --> 00:40:29,400 Speaker 1: in the very same space that we post you know, 655 00:40:29,440 --> 00:40:32,160 Speaker 1: all the good fun things of social media, be it 656 00:40:32,239 --> 00:40:37,600 Speaker 1: our birthday party images to uh great interviews with people online. 
657 00:40:38,000 --> 00:40:40,439 Speaker 1: Do you think there's a role for UM, let's say, 658 00:40:40,440 --> 00:40:43,960 Speaker 1: bringing Civics classes back to high schools and updating them 659 00:40:43,960 --> 00:40:46,839 Speaker 1: to teach children like, Look, this is part of your 660 00:40:46,840 --> 00:40:50,480 Speaker 1: responsibility as a citizen. You know, you vote, you go 661 00:40:50,600 --> 00:40:53,080 Speaker 1: to jury duty, maybe you will elect to serve in 662 00:40:53,120 --> 00:40:58,359 Speaker 1: the military, and also stop spreading fake news. Please, absolutely, 663 00:40:58,400 --> 00:41:01,080 Speaker 1: you know what you're getting at is um gital literacy 664 00:41:01,280 --> 00:41:03,920 Speaker 1: and when we look at the other nations that that 665 00:41:04,040 --> 00:41:07,799 Speaker 1: handle this space really well, they have digital literacy campaigns 666 00:41:07,880 --> 00:41:10,080 Speaker 1: they teach in their schools, and not just like an 667 00:41:10,080 --> 00:41:15,120 Speaker 1: elementary school level, but um, it's also woven into for example, 668 00:41:15,640 --> 00:41:18,560 Speaker 1: media campaigns and the like. Because of course, as we 669 00:41:18,600 --> 00:41:21,920 Speaker 1: all know, UM, it's not the twelve year olds that 670 00:41:21,920 --> 00:41:26,400 Speaker 1: are spreading most of the conspiracy online. Uh, it's um. Frankly, 671 00:41:26,440 --> 00:41:30,799 Speaker 1: you're in my elders I joke that baby boomers. Facebook 672 00:41:30,880 --> 00:41:36,040 Speaker 1: is like the tide pods for baby boomers. Um. But so, 673 00:41:36,200 --> 00:41:38,200 Speaker 1: but that's what you know. The nations be they like 674 00:41:38,239 --> 00:41:41,640 Speaker 1: in Estonia, Sweden, who have gotten really good at this 675 00:41:42,640 --> 00:41:44,719 Speaker 1: unsurprising because they were the first to be targeted by 676 00:41:44,719 --> 00:41:47,239 Speaker 1: the rushes of the world. That's what they have. But 677 00:41:47,360 --> 00:41:50,879 Speaker 1: there's a real strange example of um, what's playing out 678 00:41:50,920 --> 00:41:56,000 Speaker 1: in this The United States government has helped pay for 679 00:41:56,080 --> 00:42:02,320 Speaker 1: a digital literacy program in Ukraine tell Ukrainian students better 680 00:42:02,400 --> 00:42:08,440 Speaker 1: recognize disinformation online, but we don't have that for our 681 00:42:08,440 --> 00:42:12,080 Speaker 1: own kids. And it again points back to some of 682 00:42:12,080 --> 00:42:16,400 Speaker 1: the the sort of the strangeness of this space that UM, 683 00:42:16,480 --> 00:42:19,480 Speaker 1: we parts of our government recognized the problem and are 684 00:42:19,480 --> 00:42:22,000 Speaker 1: trying to help better defend other nations, but because of 685 00:42:22,040 --> 00:42:24,000 Speaker 1: a kind of a dysfunctionality and a part of the 686 00:42:24,080 --> 00:42:30,160 Speaker 1: ship at home, we're more open to being taken advantage of, exploited, etcetera. 687 00:42:30,280 --> 00:42:32,759 Speaker 1: But again it points to how we shouldn't throw our 688 00:42:32,760 --> 00:42:34,640 Speaker 1: hands up and say, you know, there's nothing that we 689 00:42:34,680 --> 00:42:37,000 Speaker 1: can do. Actually, there are things that we can do. 690 00:42:37,280 --> 00:42:39,560 Speaker 1: They're proven to work. 
It's just are we going to 691 00:42:39,600 --> 00:42:43,200 Speaker 1: implement them at the national level, at the business level, 692 00:42:43,520 --> 00:42:46,680 Speaker 1: at our own personal levels. I have a couple of 693 00:42:46,760 --> 00:42:51,799 Speaker 1: questions before we actually got question. Definitely, if you have 694 00:42:51,880 --> 00:42:53,560 Speaker 1: to go, if you have to go where we can 695 00:42:53,640 --> 00:42:57,399 Speaker 1: wrap it up, it's not a problem. Let's let's I'll 696 00:42:57,480 --> 00:43:01,880 Speaker 1: hit Okay, So just a really quick one. I'm wondering, 697 00:43:02,360 --> 00:43:05,200 Speaker 1: you know, from everything that we've discussed in this past 698 00:43:05,280 --> 00:43:07,880 Speaker 1: forty five minutesters so you know, I think it's easy 699 00:43:07,880 --> 00:43:10,000 Speaker 1: to infer you're not a fan of a guy like 700 00:43:10,040 --> 00:43:14,880 Speaker 1: Alex Jones, but from the libertarian perspective of myself and 701 00:43:15,200 --> 00:43:17,640 Speaker 1: I think Jack on some level, I actually just want 702 00:43:17,680 --> 00:43:19,480 Speaker 1: to know what you think of like the shadow banning 703 00:43:19,520 --> 00:43:21,759 Speaker 1: of a guy like Alex Jones, where it's like, you know, 704 00:43:22,000 --> 00:43:24,880 Speaker 1: he's being banned from all this social media, and is 705 00:43:25,000 --> 00:43:27,640 Speaker 1: a good precedent I guess yes, any even being banned 706 00:43:27,640 --> 00:43:31,480 Speaker 1: from PayPal, which really stops him from being able to, 707 00:43:32,640 --> 00:43:35,719 Speaker 1: you know, just get financial compensation for what he's doing, 708 00:43:35,760 --> 00:43:38,080 Speaker 1: whether it's fake or not. And I do think it 709 00:43:38,160 --> 00:43:41,279 Speaker 1: sets a pretty crazy precedent of you know, who's going 710 00:43:41,320 --> 00:43:44,000 Speaker 1: to be banned next from there. So let's let's set 711 00:43:44,160 --> 00:43:47,920 Speaker 1: the first thing. UM use the term shadow banning. Shadow banning, 712 00:43:48,840 --> 00:43:54,600 Speaker 1: so there is the conspiracy theory of my messages are 713 00:43:54,719 --> 00:43:59,440 Speaker 1: not getting out, UM somehow they're secretly doing something behind 714 00:43:59,520 --> 00:44:04,439 Speaker 1: the scene. That's shadow banning. And then there is d platforming, 715 00:44:04,800 --> 00:44:08,120 Speaker 1: which is you are kicked off. So shadow banning is 716 00:44:08,120 --> 00:44:12,239 Speaker 1: actually a really good example of the I'm going to 717 00:44:12,400 --> 00:44:16,880 Speaker 1: use a kind word shenanigans that UH people like Jones 718 00:44:16,920 --> 00:44:20,120 Speaker 1: and others were pulling off by claiming that there were 719 00:44:20,200 --> 00:44:23,799 Speaker 1: things being done by secret powers to them, whether it 720 00:44:24,000 --> 00:44:30,120 Speaker 1: was I'm being shadow banned. Two more UM dangerous to society, 721 00:44:30,239 --> 00:44:34,520 Speaker 1: UH claiming that UM mass killings were falsehoods, that there 722 00:44:34,560 --> 00:44:41,160 Speaker 1: were little kid actors, UM spreading disinformation about terror campaigns, etcetera, etcetera. 723 00:44:41,360 --> 00:44:42,920 Speaker 1: So but let's even be just I you know I'm 724 00:44:42,960 --> 00:44:45,719 Speaker 1: gonna push what I'm pushing back as and maybe you 725 00:44:45,719 --> 00:44:48,279 Speaker 1: didn't even intend it, but that the very idea of shadows. 
No, 726 00:44:48,400 --> 00:44:51,040 Speaker 1: I know, to be fair, we've been shadow banned on 727 00:44:51,120 --> 00:44:54,000 Speaker 1: some level, you know, with Crate Club, with with guys 728 00:44:54,040 --> 00:44:56,600 Speaker 1: who have weapons, and then we get flagged for gun sales, 729 00:44:56,680 --> 00:45:00,680 Speaker 1: and you know that I would put that ob there's 730 00:45:00,719 --> 00:45:05,040 Speaker 1: there's the conspiracy theory of shadow banning versus what you 731 00:45:05,080 --> 00:45:09,000 Speaker 1: were really getting at is d platforming. You're kicked off 732 00:45:09,040 --> 00:45:12,160 Speaker 1: the network. You don't have access to the network. Um, 733 00:45:12,200 --> 00:45:17,440 Speaker 1: and uh, here's my own take on this. Um, you 734 00:45:17,560 --> 00:45:21,440 Speaker 1: have a right to free speech. You do not have 735 00:45:21,560 --> 00:45:25,800 Speaker 1: an inherent right to be part of a privately owned 736 00:45:25,920 --> 00:45:31,120 Speaker 1: network and do not have a right to spread falsehood 737 00:45:31,239 --> 00:45:36,319 Speaker 1: that is dangerous to society on that network. So you know, 738 00:45:36,400 --> 00:45:39,840 Speaker 1: and again even by the fact that Alex Jones cannot 739 00:45:39,840 --> 00:45:43,960 Speaker 1: even say, well, I have a contract with Facebook or Twitter. 740 00:45:44,040 --> 00:45:47,760 Speaker 1: I paid them x No, literally, you didn't pay them anything. 741 00:45:48,239 --> 00:45:51,040 Speaker 1: You are on their network and they get to decide 742 00:45:51,080 --> 00:45:53,600 Speaker 1: the rules of that network. You still have the right 743 00:45:53,640 --> 00:45:56,479 Speaker 1: to say what you want. They don't have to let 744 00:45:56,520 --> 00:45:59,360 Speaker 1: you on their network. Now what they have chosen to 745 00:45:59,400 --> 00:46:02,920 Speaker 1: do through out history, going all the way back to 746 00:46:03,960 --> 00:46:07,480 Speaker 1: the my Spaces, the six degrees is for the most part, 747 00:46:07,560 --> 00:46:12,200 Speaker 1: the networks have been ambivalent about what's playing out on them. 748 00:46:12,440 --> 00:46:15,200 Speaker 1: They have not wanted to do what's known as content moderation. 749 00:46:15,239 --> 00:46:17,840 Speaker 1: They have not wanted to police what people say. But 750 00:46:17,920 --> 00:46:21,600 Speaker 1: then every um couple of years they get pressure from 751 00:46:21,640 --> 00:46:25,440 Speaker 1: two directions. They get pressure from the other users to 752 00:46:25,520 --> 00:46:28,960 Speaker 1: other customers of that network saying this is something that 753 00:46:28,960 --> 00:46:32,640 Speaker 1: that if you keep allowing this, we're going to leave. 754 00:46:33,120 --> 00:46:35,799 Speaker 1: And then the second is the threat of government intervention. 755 00:46:36,120 --> 00:46:38,880 Speaker 1: So the very first time there was content moderation, it 756 00:46:39,000 --> 00:46:44,800 Speaker 1: was over something like, uh, child pornography. Basically, the the 757 00:46:44,920 --> 00:46:48,680 Speaker 1: users of the early networks say, you know what, we 758 00:46:48,760 --> 00:46:50,840 Speaker 1: don't like that this is in this space. If you 759 00:46:50,920 --> 00:46:54,360 Speaker 1: don't clean it up, we might leave, and government is saying, 760 00:46:54,800 --> 00:46:57,319 Speaker 1: guess what, We're going to intervene. So there's this is 761 00:46:57,360 --> 00:47:00,720 Speaker 1: what's played out. 
And initially it was um that everyone 762 00:47:00,760 --> 00:47:05,359 Speaker 1: could agree on, seemingly like child pornography. Then after nine 763 00:47:05,400 --> 00:47:11,160 Speaker 1: eleven it moved into things like, for example, UM terrorists videos, 764 00:47:11,200 --> 00:47:17,440 Speaker 1: and initially it was terrorist videos of UM beheading, idea attacks, 765 00:47:17,600 --> 00:47:22,640 Speaker 1: soldiers dying and so the companies go after that. Then 766 00:47:22,800 --> 00:47:28,799 Speaker 1: it is well, not just violent imagery, it's inspiration of 767 00:47:29,480 --> 00:47:34,160 Speaker 1: mass killing, inspiration of extremism. And that wasn't easy enough 768 00:47:34,320 --> 00:47:37,560 Speaker 1: seemingly issue when it was something like al Qaeda. Then 769 00:47:37,560 --> 00:47:40,280 Speaker 1: it grows a little bit thornier when it's, for example, 770 00:47:40,320 --> 00:47:44,319 Speaker 1: incidents like Charlottesville. Um. And so there's basically been this 771 00:47:44,400 --> 00:47:47,560 Speaker 1: kind of continual back and forth, and we're seeing, you know, 772 00:47:47,600 --> 00:47:50,520 Speaker 1: kind of what I'm getting out is the company's um. 773 00:47:50,719 --> 00:47:55,040 Speaker 1: They've more and more taken on a political role that 774 00:47:55,120 --> 00:47:57,839 Speaker 1: reflects the idea that they're running what are again these 775 00:47:57,880 --> 00:48:01,680 Speaker 1: battlefields of political debate, but also with real effect on 776 00:48:01,760 --> 00:48:04,719 Speaker 1: real battle fields. So my own personal take on this 777 00:48:04,920 --> 00:48:11,640 Speaker 1: is again, UM, this is uh. There are actors online 778 00:48:12,320 --> 00:48:15,440 Speaker 1: who you have a right to free speech, You have 779 00:48:15,480 --> 00:48:20,440 Speaker 1: a right even to lie. The platform doesn't have to 780 00:48:20,520 --> 00:48:24,319 Speaker 1: allow you to be a danger to the broader society, 781 00:48:24,840 --> 00:48:27,560 Speaker 1: and because you don't have a contractual relationship with them, 782 00:48:27,560 --> 00:48:30,279 Speaker 1: they have the right to boot you. So um, in 783 00:48:30,320 --> 00:48:35,360 Speaker 1: the case of Jones, then a policy, he violated the policy. 784 00:48:35,640 --> 00:48:37,320 Speaker 1: But I want to hit something else because it circles 785 00:48:37,360 --> 00:48:40,360 Speaker 1: back to something that's really crucial to the broader discussion. 786 00:48:41,840 --> 00:48:44,440 Speaker 1: What was fascinating and we now we have the data 787 00:48:44,520 --> 00:48:48,719 Speaker 1: on it is in the Russian campaigns, they did not 788 00:48:48,920 --> 00:48:55,600 Speaker 1: just inject disinformation into our own body politic, the fake 789 00:48:55,680 --> 00:49:00,120 Speaker 1: ads and things like that. They also using both our 790 00:49:00,160 --> 00:49:04,640 Speaker 1: body accounts and sock puppet accounts, um would retreat, they 791 00:49:04,640 --> 00:49:09,360 Speaker 1: would try and elevate the discussion of certain actors within 792 00:49:09,440 --> 00:49:11,799 Speaker 1: our own discourse. And now you can literally lay out 793 00:49:11,800 --> 00:49:15,240 Speaker 1: the data and see who was it that Russia wanted 794 00:49:15,320 --> 00:49:20,160 Speaker 1: their voices magnified as a way of harming the United States. 
795 00:49:22,080 --> 00:49:25,600 Speaker 1: And you have this very small number of what are 796 00:49:25,640 --> 00:49:30,360 Speaker 1: known as super spreaders. They're basically sort of toxic voices 797 00:49:30,400 --> 00:49:33,920 Speaker 1: that have a larger effect than the than the rest 798 00:49:33,960 --> 00:49:38,120 Speaker 1: of kind of the general populace, and by several orders 799 00:49:38,160 --> 00:49:44,040 Speaker 1: of magnitude. It was figures like Alex Jones, the people 800 00:49:44,080 --> 00:49:46,960 Speaker 1: behind pizza Gate. Again circling back to one of the 801 00:49:47,000 --> 00:49:51,480 Speaker 1: prior questions, it was remarkable that the Russians identified the 802 00:49:51,560 --> 00:49:56,800 Speaker 1: worst conspiracy theorists as the best way to harm America. 803 00:49:57,280 --> 00:50:00,640 Speaker 1: So I don't feel bad about those folks being platform 804 00:50:01,000 --> 00:50:02,879 Speaker 1: They can still do what when you can still run 805 00:50:02,880 --> 00:50:05,480 Speaker 1: out in the street and yell at people, But I 806 00:50:05,520 --> 00:50:08,360 Speaker 1: don't feel bad that, um, you know, the companies decided 807 00:50:08,360 --> 00:50:09,880 Speaker 1: they gave them a set of rules and said you 808 00:50:10,000 --> 00:50:13,040 Speaker 1: violated the rules, you're off. Well that uh, I actually 809 00:50:13,160 --> 00:50:15,319 Speaker 1: have that written down on my piece of paper here too, 810 00:50:15,400 --> 00:50:17,560 Speaker 1: that I was going to ask you about the murky 811 00:50:17,560 --> 00:50:21,560 Speaker 1: and interesting relationship between bright bart Info Wars, and RT. 812 00:50:21,800 --> 00:50:23,480 Speaker 1: But I think people will have to go and read 813 00:50:23,480 --> 00:50:26,120 Speaker 1: your book and start through some of that on their 814 00:50:26,120 --> 00:50:28,480 Speaker 1: own and draw their own conclusions. That's a that's a 815 00:50:28,480 --> 00:50:30,680 Speaker 1: whole other can of worms. Uh. Maybe we can do 816 00:50:30,719 --> 00:50:33,160 Speaker 1: another interview sometime in the future and again into some 817 00:50:33,200 --> 00:50:35,919 Speaker 1: of these subjects. And the last really quick thing I said. 818 00:50:36,080 --> 00:50:39,040 Speaker 1: One last thing was because I mentioned this at the beginning, 819 00:50:39,120 --> 00:50:40,839 Speaker 1: just if you can give us a quick brief of 820 00:50:40,880 --> 00:50:43,319 Speaker 1: what New America is since you're a senior fellow there. 821 00:50:44,120 --> 00:50:46,839 Speaker 1: So New America is a think tank where a non 822 00:50:46,920 --> 00:50:54,080 Speaker 1: governmental um uh, nonpartisan, nonprofit organization. We basically wrestle with 823 00:50:54,640 --> 00:50:58,960 Speaker 1: the questions that happened when policy and technology come crashing together. 824 00:50:59,400 --> 00:51:05,359 Speaker 1: So you have people writing books, reports, um and articles, 825 00:51:05,400 --> 00:51:08,000 Speaker 1: and they're on everything from you know, what's the future 826 00:51:08,000 --> 00:51:10,400 Speaker 1: of the internet too. We have a team that's working 827 00:51:10,440 --> 00:51:12,640 Speaker 1: on the future of war. 
So, for example, the Future 828 00:51:12,680 --> 00:51:18,320 Speaker 1: of War Team has everything from UH historians a recently 829 00:51:18,320 --> 00:51:22,560 Speaker 1: retired Army officers actually writing the Army's UM official history 830 00:51:22,600 --> 00:51:27,759 Speaker 1: of doctrinal change since UM UH nine eleven two. UM. 831 00:51:27,760 --> 00:51:32,040 Speaker 1: We have scientists too, we have lawyers. Do we have 832 00:51:32,480 --> 00:51:36,120 Speaker 1: former Air Force acquisition officers? Do we have former members 833 00:51:36,200 --> 00:51:39,640 Speaker 1: of the community that's in this audience a former member 834 00:51:39,640 --> 00:51:42,680 Speaker 1: of Navy Seal Team six, for example, And so everyone's 835 00:51:42,800 --> 00:51:46,640 Speaker 1: essentially wrestling with these questions of and the Future War 836 00:51:46,680 --> 00:51:50,279 Speaker 1: Team what will happen next in this space? Overall, the 837 00:51:50,320 --> 00:51:53,799 Speaker 1: mission of the organization is UM to basically help us 838 00:51:53,960 --> 00:52:00,400 Speaker 1: understand change and us defined as government, media and the 839 00:52:00,440 --> 00:52:04,280 Speaker 1: wider public. Awesome, Well, thanks for coming on once again. 840 00:52:04,440 --> 00:52:07,520 Speaker 1: The book is like War the weaponization of social Media. 841 00:52:07,600 --> 00:52:09,799 Speaker 1: He's the co author of the book. UH. You could 842 00:52:09,840 --> 00:52:13,400 Speaker 1: follow Peter on Twitter or PW on Twitter at Peter W. 843 00:52:13,719 --> 00:52:18,320 Speaker 1: Singer and the website is PW Singer dot com. Thanks 844 00:52:18,360 --> 00:52:21,160 Speaker 1: again so much for coming on. We appreciate it. Thank 845 00:52:21,200 --> 00:52:24,080 Speaker 1: you very much. I really appreciate your time today. Thanks Peter. 846 00:52:25,280 --> 00:52:27,759 Speaker 1: All right, and this will be up tomorrow and I'll 847 00:52:27,800 --> 00:52:29,840 Speaker 1: be in touch when it is all right, thank you, 848 00:52:30,200 --> 00:52:35,840 Speaker 1: all right, have a good afternoon. Interesting stuff there. Uh. 849 00:52:35,880 --> 00:52:38,319 Speaker 1: I will be the first to say, and people might 850 00:52:38,320 --> 00:52:39,680 Speaker 1: be like, why didn't you push back on them a 851 00:52:39,719 --> 00:52:41,879 Speaker 1: little bit more? Uh, you know, because I don't feel 852 00:52:41,880 --> 00:52:43,799 Speaker 1: like what we do is like a split screen debate show. 853 00:52:43,800 --> 00:52:45,759 Speaker 1: We like to give people a platform to get their 854 00:52:45,800 --> 00:52:49,040 Speaker 1: work out there. I did not agree with him on 855 00:52:49,040 --> 00:52:53,200 Speaker 1: on everything. To be completely honest, Um, whenever we talk 856 00:52:53,280 --> 00:52:57,080 Speaker 1: about the d platforming of people, I totally separate the 857 00:52:57,120 --> 00:53:01,319 Speaker 1: constitutional issue of it with just my personal feelings towards it. 858 00:53:01,840 --> 00:53:05,680 Speaker 1: What they're doing to people like Alex Jones is completely constitutional, 859 00:53:05,760 --> 00:53:10,160 Speaker 1: completely legal. I've never said anything um different that that 860 00:53:10,200 --> 00:53:12,960 Speaker 1: it's violating is free speech, not violating his free speech. 861 00:53:13,000 --> 00:53:15,040 Speaker 1: The government is not putting men jail or anything like that. 
862 00:53:15,280 --> 00:53:17,880 Speaker 1: But I actually do think it sets a scary precedent. 863 00:53:17,920 --> 00:53:21,920 Speaker 1: On the other end of I don't like anyone dictating 864 00:53:22,160 --> 00:53:24,879 Speaker 1: what is the dangerous narrative, what is fake news. I'd 865 00:53:24,960 --> 00:53:29,080 Speaker 1: like people to you know, self educate on on cross 866 00:53:29,160 --> 00:53:31,719 Speaker 1: checking sources and that type of thing and knowing what 867 00:53:31,880 --> 00:53:34,960 Speaker 1: is real what is fake. But even when you talked about, 868 00:53:35,000 --> 00:53:37,520 Speaker 1: you know, educating kids on this type of thing, we 869 00:53:37,560 --> 00:53:39,520 Speaker 1: all have our own personal bias, and a left wing 870 00:53:39,560 --> 00:53:41,680 Speaker 1: teacher is gonna have a very different view of what 871 00:53:41,840 --> 00:53:45,320 Speaker 1: is fake news, what is okay. But then, speaking of bias, 872 00:53:45,680 --> 00:53:47,879 Speaker 1: look at the assumption you just made. You just jump 873 00:53:47,960 --> 00:53:50,319 Speaker 1: to the conclusion that the teacher teaching the classes left 874 00:53:50,320 --> 00:53:53,000 Speaker 1: wing and that they're going to teach the kids with 875 00:53:53,239 --> 00:53:57,000 Speaker 1: a left wing agenda in a civics class. Well, here's 876 00:53:57,000 --> 00:53:59,600 Speaker 1: the thing, and so I have education at all, it's 877 00:53:59,600 --> 00:54:01,120 Speaker 1: get rid of it because it's all going to be 878 00:54:01,120 --> 00:54:03,960 Speaker 1: politically biased. I don't have to change the restructure the 879 00:54:03,960 --> 00:54:06,160 Speaker 1: whole thing. But that's the whole different subject. But I 880 00:54:06,200 --> 00:54:09,600 Speaker 1: spoke about this with you prior to us recording. Um, 881 00:54:09,680 --> 00:54:13,080 Speaker 1: the people who run you know, the main voices behind Twitter, 882 00:54:13,520 --> 00:54:16,839 Speaker 1: Facebook are all pretty openly left wing. We know what 883 00:54:17,000 --> 00:54:20,200 Speaker 1: Mark Zuckerberg is not banning liberals on the platform. I 884 00:54:20,200 --> 00:54:22,880 Speaker 1: gave you the example before recorded of you know, Kathy 885 00:54:22,960 --> 00:54:25,680 Speaker 1: Griffin putting up the Trump head. She wasn't banned from 886 00:54:25,680 --> 00:54:29,200 Speaker 1: anywhere after that. I mean, I didn't don't performed her anywhere. 887 00:54:29,320 --> 00:54:31,600 Speaker 1: I don't have the whole history of who's been banned 888 00:54:31,600 --> 00:54:35,480 Speaker 1: and who hasn't. Um there are I mean, I'll tell 889 00:54:35,480 --> 00:54:38,120 Speaker 1: you on Twitter. There are some pretty prominent neo Nazis 890 00:54:38,120 --> 00:54:40,040 Speaker 1: who have not been banned and like nothing has happened 891 00:54:40,040 --> 00:54:42,880 Speaker 1: to them. It seems to me that it's a question 892 00:54:43,040 --> 00:54:46,520 Speaker 1: of when they reach a certain level of notoriety. Like 893 00:54:46,560 --> 00:54:49,880 Speaker 1: Alex Jones. Alex Jones would have coasted by fine, but 894 00:54:50,040 --> 00:54:52,959 Speaker 1: he just deliberately courted so much controversy and he got 895 00:54:52,960 --> 00:54:55,560 Speaker 1: what he wanted. He pushed up into the national debate. 
896 00:54:55,880 --> 00:54:58,600 Speaker 1: He was on television all the time, people are talking 897 00:54:58,640 --> 00:55:01,600 Speaker 1: about him, and now only the spotlight is on this guy, 898 00:55:01,960 --> 00:55:04,160 Speaker 1: and there are enough people piste off about the stuff 899 00:55:04,200 --> 00:55:07,040 Speaker 1: he was saying that you know that that Sandy hook 900 00:55:07,120 --> 00:55:11,240 Speaker 1: never happened, that the parents of dead children or crisis actors, 901 00:55:11,600 --> 00:55:15,040 Speaker 1: people just got fed up with it, and that that's 902 00:55:15,120 --> 00:55:19,360 Speaker 1: kind of the consumer or customer pushback against these companies 903 00:55:19,360 --> 00:55:20,880 Speaker 1: that are like, you gotta do something about this. You 904 00:55:20,920 --> 00:55:25,320 Speaker 1: can't let this idiot run rampant like this. But every 905 00:55:25,360 --> 00:55:27,799 Speaker 1: i'd say half the country, as in the people voted 906 00:55:27,800 --> 00:55:30,360 Speaker 1: for Trump or always half the people who voted were 907 00:55:30,440 --> 00:55:33,680 Speaker 1: equally as shocked and offended by Kathy Griffin holding up 908 00:55:33,719 --> 00:55:36,279 Speaker 1: the Trump head. Saw it as a violent threat to 909 00:55:36,280 --> 00:55:40,160 Speaker 1: towards our president and nothing was done, you know to her. 910 00:55:40,239 --> 00:55:42,600 Speaker 1: I mean, in my personal opinion, yes, that was that 911 00:55:42,680 --> 00:55:45,719 Speaker 1: pictures inappropriate and it's not something I would ever do 912 00:55:45,920 --> 00:55:50,560 Speaker 1: or advocate. But there's a difference between that one dumb 913 00:55:50,600 --> 00:55:55,799 Speaker 1: picture and Alex Jones is NonStop ranting about how nine 914 00:55:55,800 --> 00:55:58,800 Speaker 1: eleven is an inside job, about how the CIA is 915 00:55:58,840 --> 00:56:00,640 Speaker 1: out to kill all of us, and about how they're 916 00:56:00,640 --> 00:56:03,399 Speaker 1: putting chemicals in the water that make the frogs gay. Yeah, 917 00:56:03,520 --> 00:56:06,680 Speaker 1: it's completely fucking different. That's like apples and oranges. My 918 00:56:06,840 --> 00:56:10,560 Speaker 1: question though, is always you know who dictates what can 919 00:56:10,640 --> 00:56:12,960 Speaker 1: be the platforms what can and I know it's the 920 00:56:12,960 --> 00:56:16,319 Speaker 1: owners of these companies who do happen to mostly be 921 00:56:16,520 --> 00:56:19,720 Speaker 1: left wing if you look at the prominent people. Um 922 00:56:19,760 --> 00:56:21,560 Speaker 1: and then, like I mentioned to him, I didn't really 923 00:56:21,600 --> 00:56:24,920 Speaker 1: go into it in detail, but with our own company, uh, 924 00:56:25,239 --> 00:56:28,320 Speaker 1: great club has really been screwed over by these people 925 00:56:28,400 --> 00:56:31,480 Speaker 1: who you know you know this. It went from you 926 00:56:31,520 --> 00:56:33,319 Speaker 1: can't have pictures in the you can have a video 927 00:56:33,320 --> 00:56:35,719 Speaker 1: in the head of a guy holding a gun, all right, 928 00:56:35,800 --> 00:56:37,360 Speaker 1: we won't have it. You can have a video on 929 00:56:37,400 --> 00:56:39,120 Speaker 1: the head of a guy holding a knife, all right. 930 00:56:39,320 --> 00:56:41,920 Speaker 1: The way this guy is holding a flashlight towards the camera, 931 00:56:42,120 --> 00:56:45,439 Speaker 1: very intimidating. 
You know, that was I mean, I am 932 00:56:45,480 --> 00:56:49,800 Speaker 1: concerned on on I say, I'm very concerned honestly because 933 00:56:49,800 --> 00:56:54,280 Speaker 1: what what these social media platforms are doing politics aside, 934 00:56:54,320 --> 00:56:57,080 Speaker 1: just remove partisan politics for a moment, is that it 935 00:56:57,200 --> 00:57:01,239 Speaker 1: is a global wide social engineering project, and let's just 936 00:57:01,280 --> 00:57:03,120 Speaker 1: call it what it is. And I think Peter was 937 00:57:03,160 --> 00:57:06,200 Speaker 1: acknowledging that as well, that these companies are these are 938 00:57:06,239 --> 00:57:12,719 Speaker 1: mega corporations, they're transnational in nature. Um, they I mean 939 00:57:12,760 --> 00:57:16,800 Speaker 1: they have a profound influence on our cognitive process. Uh, 940 00:57:16,840 --> 00:57:19,920 Speaker 1: they have a profound impact on elections, on how we 941 00:57:20,040 --> 00:57:23,760 Speaker 1: view political issues, how we view policy issues, and all 942 00:57:23,800 --> 00:57:26,120 Speaker 1: of these things. So, yeah, you have to call it 943 00:57:26,160 --> 00:57:28,360 Speaker 1: what it is. At a certain point, these companies are 944 00:57:28,360 --> 00:57:33,160 Speaker 1: involved in social engineering, and um, they respond to what 945 00:57:33,240 --> 00:57:36,520 Speaker 1: customers want. They respond to request from our government because 946 00:57:37,040 --> 00:57:39,080 Speaker 1: none of these companies want to get slapped with anti 947 00:57:39,200 --> 00:57:42,320 Speaker 1: trust suits. So the government tells them to do something, 948 00:57:42,360 --> 00:57:46,800 Speaker 1: they're probably going to comply. Um, And that that is concerning. 949 00:57:46,800 --> 00:57:49,160 Speaker 1: I mean there there is the whole you know George 950 00:57:49,280 --> 00:57:53,560 Speaker 1: orwell not D eighty four complex happening. Um, it just 951 00:57:53,640 --> 00:57:56,520 Speaker 1: removed the partisan politics from the conversation for a moment. 952 00:57:56,800 --> 00:58:00,200 Speaker 1: And I mean, these companies have a strong hand and 953 00:58:00,360 --> 00:58:03,560 Speaker 1: engineering how we think. And that's a concern regardless, and 954 00:58:03,600 --> 00:58:06,360 Speaker 1: it's something that we have to think. And um, I 955 00:58:06,400 --> 00:58:10,600 Speaker 1: think that is why, Yeah, parents have to educate their children, 956 00:58:10,880 --> 00:58:13,600 Speaker 1: um about social media. And I think there is a 957 00:58:13,680 --> 00:58:16,880 Speaker 1: role for you know, the public school system to teach 958 00:58:17,000 --> 00:58:18,880 Speaker 1: kids about you know, when I was a kid, I 959 00:58:18,880 --> 00:58:21,240 Speaker 1: mean I remember they would talk to us about read 960 00:58:21,240 --> 00:58:23,560 Speaker 1: in the newspaper and how to analyze the news and 961 00:58:23,560 --> 00:58:27,320 Speaker 1: how how to think about claims and are very different, um. 962 00:58:27,360 --> 00:58:29,120 Speaker 1: And and kids need to be taught to think the 963 00:58:29,160 --> 00:58:31,240 Speaker 1: same way about what they see on the internet, what 964 00:58:31,280 --> 00:58:34,960 Speaker 1: they see on social media. UM. 
You know, Peter was 965 00:58:35,000 --> 00:58:37,320 Speaker 1: talking about how we're going into an era now where 966 00:58:37,880 --> 00:58:41,480 Speaker 1: uh clips, new video clips are going to be outright faked, 967 00:58:41,840 --> 00:58:44,920 Speaker 1: um as in their completely false from top to bottom. 968 00:58:44,960 --> 00:58:47,400 Speaker 1: But what we've seen, what we're already seeing is the 969 00:58:47,440 --> 00:58:50,400 Speaker 1: you'll see clips that are video that is shot in 970 00:58:50,480 --> 00:58:53,640 Speaker 1: one place, but then it's presented online is something else. 971 00:58:54,000 --> 00:58:56,240 Speaker 1: So I can think of that's going to be crazy. 972 00:58:56,280 --> 00:58:58,320 Speaker 1: I can think of a few a few examples off 973 00:58:58,360 --> 00:59:00,600 Speaker 1: the top of my head. There was one clip it 974 00:59:00,720 --> 00:59:04,000 Speaker 1: was UM. There was a it was an economic um depression, 975 00:59:04,040 --> 00:59:08,880 Speaker 1: an economic crisis in France UM in the early two thousands, 976 00:59:09,160 --> 00:59:11,440 Speaker 1: and there were people riding in the streets about it. 977 00:59:11,760 --> 00:59:13,800 Speaker 1: And that clip was put on Facebook and it was 978 00:59:13,840 --> 00:59:16,960 Speaker 1: portrayed as these are Muslims riding in the streets of 979 00:59:17,040 --> 00:59:19,640 Speaker 1: Paris and flipping over cars and burning them, and that's 980 00:59:19,680 --> 00:59:22,680 Speaker 1: just completely not what was happening there. So we're already 981 00:59:22,720 --> 00:59:26,200 Speaker 1: seeing those kinds of things or UM in South Africa. 982 00:59:26,280 --> 00:59:28,960 Speaker 1: There's a lot of clips making the rounds UM and 983 00:59:29,000 --> 00:59:31,760 Speaker 1: I see them on on social media as well, showing 984 00:59:31,800 --> 00:59:34,800 Speaker 1: atrocities that didn't even happen in South Africa. It's what 985 00:59:34,880 --> 00:59:38,640 Speaker 1: Evan Barlow was talking about when we had him on UM. 986 00:59:38,800 --> 00:59:41,000 Speaker 1: So yeah, it's a mess, and kids need to be 987 00:59:41,000 --> 00:59:43,880 Speaker 1: taught in adults too, I mean baby boomers or they 988 00:59:43,960 --> 00:59:46,680 Speaker 1: kill me. They're like fake they're like fake news machines, 989 00:59:46,760 --> 00:59:48,720 Speaker 1: but they're kind of the left end right, they're kind 990 00:59:48,720 --> 00:59:50,320 Speaker 1: of they're kind of a lost cause. I don't know 991 00:59:50,320 --> 00:59:53,320 Speaker 1: what we can do about them, but but you know, UM, 992 00:59:53,640 --> 00:59:56,040 Speaker 1: our kids just need to be taught better about how 993 00:59:56,080 --> 00:59:59,440 Speaker 1: to discern fact from fiction. I wish I could find. 994 00:59:59,600 --> 01:00:02,000 Speaker 1: So I'm trying to find this clip and I don't 995 01:00:02,040 --> 01:00:06,400 Speaker 1: know if I will. But Jenkweager from Young Turks debated 996 01:00:06,440 --> 01:00:09,080 Speaker 1: Tucker Carlson about a lot of this stuff at politic 997 01:00:09,240 --> 01:00:12,000 Speaker 1: On And I'm not like a viewer of Tucker Carlson. 998 01:00:12,040 --> 01:00:14,200 Speaker 1: I don't really watch any televised news. But he made 999 01:00:14,200 --> 01:00:17,680 Speaker 1: some really interesting points about you know, what is hate speech? 
1000 01:00:17,760 --> 01:00:20,120 Speaker 1: And you know there was there's never been this precedent 1001 01:00:20,240 --> 01:00:24,480 Speaker 1: set that. I mean, I think anybody who's read the 1002 01:00:24,520 --> 01:00:28,520 Speaker 1: Constitution studies that the freedom of speech was meant to 1003 01:00:28,560 --> 01:00:32,080 Speaker 1: protect unpopular speech. Also we wouldn't need it, and he 1004 01:00:32,160 --> 01:00:34,800 Speaker 1: spoke about like do we really have free speech now? 1005 01:00:34,800 --> 01:00:38,040 Speaker 1: He's like, this is really setting a precedent where you 1006 01:00:38,120 --> 01:00:40,720 Speaker 1: say something online that people don't like, you can't really 1007 01:00:40,720 --> 01:00:43,960 Speaker 1: work a regular job. You'll be you'll be fine, that's true. 1008 01:00:44,000 --> 01:00:46,720 Speaker 1: But the United States government doesn't protect you from being 1009 01:00:46,800 --> 01:00:49,440 Speaker 1: socially ostracized. No, But it's it's setting a different We're 1010 01:00:49,480 --> 01:00:52,360 Speaker 1: in a very different world basically, I think, than we 1011 01:00:52,360 --> 01:00:55,040 Speaker 1: were twenty years ago, ten years ago. I wish you 1012 01:00:55,040 --> 01:00:57,040 Speaker 1: could find the quip. I'm probably gonna find it once 1013 01:00:57,080 --> 01:01:00,400 Speaker 1: we've stopped we've talked about We've talked about UM people 1014 01:01:00,440 --> 01:01:03,480 Speaker 1: trying to UM engage in d platforming in the past 1015 01:01:03,520 --> 01:01:06,840 Speaker 1: on this podcast, and you know how people will there. 1016 01:01:06,880 --> 01:01:10,120 Speaker 1: They are like these little cliques, they're like cults, especially 1017 01:01:10,200 --> 01:01:12,720 Speaker 1: on Twitter, where they will they'll twitter mob you if 1018 01:01:12,760 --> 01:01:15,600 Speaker 1: they don't like, you know, your opinion about something. Yeah, 1019 01:01:15,640 --> 01:01:17,680 Speaker 1: and but that but that's not a First Amendment issue, 1020 01:01:17,720 --> 01:01:19,600 Speaker 1: No it's not. But people can go and do stupid 1021 01:01:19,640 --> 01:01:21,959 Speaker 1: things if they want. I agree, And you can still 1022 01:01:21,960 --> 01:01:24,240 Speaker 1: engage in your free speech and be off of all 1023 01:01:24,280 --> 01:01:28,959 Speaker 1: these social networks. Um. But yeah, one of the interesting things, 1024 01:01:28,960 --> 01:01:31,520 Speaker 1: you know, when we're talking last show with Benny about 1025 01:01:31,520 --> 01:01:35,840 Speaker 1: the transgender issue, it's becoming um. You know, if you 1026 01:01:35,880 --> 01:01:38,040 Speaker 1: have a certain opinion on that issue, you're in the 1027 01:01:38,040 --> 01:01:42,120 Speaker 1: hate speech category and stuff like that does does bother me. 1028 01:01:42,200 --> 01:01:45,120 Speaker 1: I certainly don't have a problem with any grown adult 1029 01:01:45,160 --> 01:01:47,480 Speaker 1: doing what they want to do, but we should be 1030 01:01:47,520 --> 01:01:50,000 Speaker 1: allowed to have a debate on it with that. Yeah, absolutely, 1031 01:01:50,000 --> 01:01:53,000 Speaker 1: But we also need to stop being snowflakes collectively. 
And 1032 01:01:53,040 --> 01:01:55,040 Speaker 1: like somebody says, you know, I think you're a bad 1033 01:01:55,080 --> 01:01:57,480 Speaker 1: person because you said that, and it's like, well, okay, 1034 01:01:58,080 --> 01:01:59,760 Speaker 1: I think I'm a bad person, Like I'm not going 1035 01:01:59,800 --> 01:02:02,080 Speaker 1: to whine about how you're trying to shut down my speech. 1036 01:02:02,160 --> 01:02:04,080 Speaker 1: Like I'm a grown man. I'm gonna say what I'm 1037 01:02:04,080 --> 01:02:06,600 Speaker 1: gonna say, and I accept the repercussions of that. But 1038 01:02:06,640 --> 01:02:09,200 Speaker 1: I am worried about the d platforming of people because 1039 01:02:09,320 --> 01:02:10,800 Speaker 1: they have a different view on this. You know, it's 1040 01:02:10,880 --> 01:02:13,919 Speaker 1: it's not all people. It goes from people who think 1041 01:02:13,960 --> 01:02:17,040 Speaker 1: the kids in Sandy Hook were child actors or you 1042 01:02:17,080 --> 01:02:20,120 Speaker 1: know whatever. What what's the next step? What does it 1043 01:02:20,240 --> 01:02:22,800 Speaker 1: Does it go against like say, all gun owners and 1044 01:02:22,840 --> 01:02:26,400 Speaker 1: we start to ostracize that about Yeah? Is that something 1045 01:02:26,400 --> 01:02:28,400 Speaker 1: that's going to happen? And I think it'll be interesting 1046 01:02:28,440 --> 01:02:30,520 Speaker 1: to see how this plays out over the next decade. 1047 01:02:31,040 --> 01:02:33,560 Speaker 1: Is it gonna be like you saw this um bifurcation 1048 01:02:33,760 --> 01:02:37,400 Speaker 1: in televised news? Um? You know, CNN was kind of 1049 01:02:37,400 --> 01:02:40,160 Speaker 1: the middle of the road. They weaned left, but they 1050 01:02:40,160 --> 01:02:43,400 Speaker 1: were more center then. What we saw was MSNBC and 1051 01:02:43,440 --> 01:02:46,040 Speaker 1: Fox News. And now you have your left wing news 1052 01:02:46,080 --> 01:02:48,080 Speaker 1: and you have your right wing news, and CNN has 1053 01:02:48,080 --> 01:02:51,640 Speaker 1: gotten pretty left. Yeah, they have. Um, are we going 1054 01:02:51,680 --> 01:02:55,000 Speaker 1: to see this with social media at some point? Are 1055 01:02:55,000 --> 01:02:57,920 Speaker 1: you gonna see um? You know, some right wing billionaire 1056 01:02:58,280 --> 01:03:01,760 Speaker 1: fund their own social media. That's a safe space for 1057 01:03:02,120 --> 01:03:05,520 Speaker 1: conservatives or you know whatever. We're going to call them 1058 01:03:05,760 --> 01:03:09,000 Speaker 1: right wingers. It's it's hard to say because at this 1059 01:03:09,120 --> 01:03:12,640 Speaker 1: point I want to say that Facebook is a monopoly 1060 01:03:12,760 --> 01:03:16,560 Speaker 1: and Google and all these monopolies. However, being objective about this, 1061 01:03:16,920 --> 01:03:19,919 Speaker 1: I do remember in middle school in the early two 1062 01:03:19,960 --> 01:03:24,040 Speaker 1: thousand's having a history teacher talk about how Microsoft had 1063 01:03:24,080 --> 01:03:26,480 Speaker 1: a monopoly and the Internet Explorer. You know, there was 1064 01:03:26,520 --> 01:03:29,320 Speaker 1: no competition. I remember Bill Gates was pulled in front 1065 01:03:29,360 --> 01:03:31,840 Speaker 1: of Congress if I remember correctly. Yeah, this was the 1066 01:03:31,880 --> 01:03:34,640 Speaker 1: time where there were some repercussions. They had to they 1067 01:03:34,640 --> 01:03:36,560 Speaker 1: had to stop doing some things. 
Are split up part 1068 01:03:36,600 --> 01:03:39,080 Speaker 1: of the company, didn't they. I don't. I'm sorry, I'm sorry, 1069 01:03:39,120 --> 01:03:41,720 Speaker 1: I don't remember exactly A long time ago. That's the 1070 01:03:42,040 --> 01:03:44,840 Speaker 1: great thing about doing a podcast, just spur the moment ideas. 1071 01:03:44,880 --> 01:03:47,680 Speaker 1: And this isn't well researched or anything. But this was 1072 01:03:47,720 --> 01:03:49,920 Speaker 1: at the time where the most popular mac was remember 1073 01:03:49,920 --> 01:03:52,840 Speaker 1: those like those billboards of you can get this in 1074 01:03:52,960 --> 01:03:55,560 Speaker 1: pink and orange and all these crazy yards and no 1075 01:03:55,600 --> 01:03:58,240 Speaker 1: one was buying it. So this was long before the iPod, 1076 01:03:58,360 --> 01:04:01,720 Speaker 1: long before you know, the uh it had something. It 1077 01:04:01,760 --> 01:04:04,720 Speaker 1: had something to do with how they were bundling Windows 1078 01:04:04,880 --> 01:04:08,200 Speaker 1: with um with Internet Explorers. They're saying, like, if you 1079 01:04:08,240 --> 01:04:12,720 Speaker 1: buy this operating system, then it's implied that you have 1080 01:04:12,800 --> 01:04:17,400 Speaker 1: to use the browser. Um yeah, I mean, I'm sorry. 1081 01:04:17,400 --> 01:04:19,520 Speaker 1: I don't have my like Wikipedia facts at the top 1082 01:04:19,600 --> 01:04:21,840 Speaker 1: of my head, but this is something that, yeah, some 1083 01:04:22,040 --> 01:04:25,800 Speaker 1: past incident that'd be interesting to delve into a little bit. Yeah. 1084 01:04:25,880 --> 01:04:28,520 Speaker 1: So part of me wants to say that these social 1085 01:04:28,560 --> 01:04:31,680 Speaker 1: networks have a monopoly, that there's never gonna be any competition. 1086 01:04:31,960 --> 01:04:34,360 Speaker 1: But then the other part of me wants to say, 1087 01:04:34,440 --> 01:04:37,320 Speaker 1: you know, people believe that about Windows, about you know, 1088 01:04:37,360 --> 01:04:41,240 Speaker 1: other things. Yeah, yeah, it'll it'll reach a point, and um, 1089 01:04:41,480 --> 01:04:45,160 Speaker 1: some of these social media platforms are destined to die. Um. 1090 01:04:45,240 --> 01:04:46,960 Speaker 1: You know, I don't think Twitter is going to be 1091 01:04:46,960 --> 01:04:49,560 Speaker 1: around in ten years. No, I don't think it's gonna 1092 01:04:49,600 --> 01:04:53,320 Speaker 1: be around the way it is today. Um, Facebook will 1093 01:04:53,360 --> 01:04:57,600 Speaker 1: be around, I think, Um, Instagram, I don't know. I 1094 01:04:57,640 --> 01:04:59,760 Speaker 1: mean some of them, some of them are very popular, 1095 01:05:00,240 --> 01:05:02,680 Speaker 1: but I don't know what their longevity is gonna be. 1096 01:05:03,040 --> 01:05:04,560 Speaker 1: You know, some of them I think will turn into 1097 01:05:04,600 --> 01:05:08,560 Speaker 1: like my Space. Well, yeah, we'll see. Um. You know, 1098 01:05:08,680 --> 01:05:11,560 Speaker 1: there's two news articles I just want to bring up 1099 01:05:11,640 --> 01:05:14,640 Speaker 1: real quick. But before I do, be sure to check 1100 01:05:14,680 --> 01:05:17,080 Speaker 1: out Create Club, which I've been talking about a little bit, 1101 01:05:17,280 --> 01:05:21,280 Speaker 1: the long anticipated collaboration watch we did with NFW Watches 1102 01:05:21,320 --> 01:05:24,160 Speaker 1: and is in the next premium crate. 
We have different 1103 01:05:24,160 --> 01:05:26,960 Speaker 1: tiers of membership depending on how prepared you want to be, 1104 01:05:27,320 --> 01:05:29,960 Speaker 1: and gift options are available as well. Scott Whittner from 1105 01:05:30,000 --> 01:05:32,440 Speaker 1: the load out room and the guys are currently working 1106 01:05:32,480 --> 01:05:38,360 Speaker 1: on bringing custom products in uh, everything from sunglass cases 1107 01:05:38,400 --> 01:05:41,000 Speaker 1: to E d C bags and other manly products as 1108 01:05:41,040 --> 01:05:44,560 Speaker 1: the club fore Men by men. Uh. You can check 1109 01:05:44,640 --> 01:05:47,720 Speaker 1: that all out at Create Club dot us. Once again, 1110 01:05:47,720 --> 01:05:50,439 Speaker 1: that's Create Club dot us. Everything is currently shipping. Jack 1111 01:05:50,480 --> 01:05:52,920 Speaker 1: and I actually got a look over at the warehouse 1112 01:05:53,000 --> 01:05:56,160 Speaker 1: like live feed, which is pretty cool, uh for your 1113 01:05:56,200 --> 01:05:59,520 Speaker 1: dog owners. Actually, I got my crate yesterday and I'm 1114 01:05:59,560 --> 01:06:02,160 Speaker 1: not sure if it's the the category of it is 1115 01:06:02,160 --> 01:06:04,600 Speaker 1: that the prokrate. It's a smaller one, but it had 1116 01:06:04,600 --> 01:06:10,440 Speaker 1: a whole bunch of stuff in it. It had sunglasses case, um, 1117 01:06:10,520 --> 01:06:14,440 Speaker 1: what did it have? Some bottle openers had there's a 1118 01:06:14,560 --> 01:06:19,400 Speaker 1: there's a spear fishing spear in there. That I haven't 1119 01:06:19,400 --> 01:06:22,280 Speaker 1: put together. What else There's a whole bunch of stuff 1120 01:06:22,320 --> 01:06:25,600 Speaker 1: in there. Um, yeah, it's pretty cool. Yeah, that's great 1121 01:06:25,600 --> 01:06:28,160 Speaker 1: to see. Man, they're doing an awesome job for your 1122 01:06:28,200 --> 01:06:30,360 Speaker 1: dog owners. Check this out. You're gonna love this. You've 1123 01:06:30,360 --> 01:06:33,000 Speaker 1: heard me talk about it. We've partnered with Kuna, who 1124 01:06:33,080 --> 01:06:35,640 Speaker 1: is a team of trained canine handlers picking out a 1125 01:06:35,640 --> 01:06:38,240 Speaker 1: box for your dog every month of healthy treats and 1126 01:06:38,320 --> 01:06:41,440 Speaker 1: training aids. It's custom built for your dog size and 1127 01:06:41,560 --> 01:06:44,840 Speaker 1: age as well. The products are US sourced, all natural, 1128 01:06:45,160 --> 01:06:47,560 Speaker 1: and they not only promote a healthy diet, but also 1129 01:06:47,640 --> 01:06:50,200 Speaker 1: promote being active with your dog. So whether we're talking 1130 01:06:50,200 --> 01:06:52,760 Speaker 1: a pitbull or at Chuawa, this is just what we're 1131 01:06:52,840 --> 01:06:55,000 Speaker 1: looking for. You can see all of that at Kuna 1132 01:06:55,080 --> 01:06:58,360 Speaker 1: dot Dog. That's kuned dot dog. It's fishing for you. 1133 01:06:58,360 --> 01:07:00,920 Speaker 1: Your dog will appreciate it as well, of course. And 1134 01:07:00,960 --> 01:07:04,560 Speaker 1: that's spelled c U N A dot d O G. 1135 01:07:05,200 --> 01:07:06,919 Speaker 1: And I know Benny was saying you gotta get signed 1136 01:07:06,960 --> 01:07:10,479 Speaker 1: up for COONa because you have a dog. Unfortunately don't anymore. Uh. 
1137 01:07:10,520 --> 01:07:12,880 Speaker 1: And then also as a reminder for those listening, for 1138 01:07:12,920 --> 01:07:17,040 Speaker 1: a limited time, you can receive a fifty discounted membership 1139 01:07:17,080 --> 01:07:20,080 Speaker 1: to the spec ops Channel. That's our channel that offers 1140 01:07:20,080 --> 01:07:24,560 Speaker 1: the most exclusive shows, documentaries and interviews covering the most 1141 01:07:24,560 --> 01:07:28,840 Speaker 1: exciting military content today. The spec ops channel premier show 1142 01:07:28,920 --> 01:07:33,760 Speaker 1: Training Cell follows former Special Operations Forces as they participate 1143 01:07:33,800 --> 01:07:36,320 Speaker 1: in the most advanced training in the country, everything from 1144 01:07:36,320 --> 01:07:42,000 Speaker 1: shooting schools, defensive driving, jungle and winter warfare and much more. Again, 1145 01:07:42,040 --> 01:07:44,400 Speaker 1: you can watch this content on the spec ops channel 1146 01:07:44,800 --> 01:07:47,600 Speaker 1: at spec ops channel dot com and take advantage of 1147 01:07:47,600 --> 01:07:50,880 Speaker 1: a limited time offer a fifty off your membership. That's 1148 01:07:50,920 --> 01:07:54,160 Speaker 1: only four a month. Uh. The app is up spec 1149 01:07:54,160 --> 01:07:59,000 Speaker 1: ops channel dot com and the software radio Android app 1150 01:07:59,080 --> 01:08:02,240 Speaker 1: is now, So we're on Android. Weren't I iPhone? All that? 1151 01:08:02,800 --> 01:08:06,400 Speaker 1: And I'm gonna take one last sip of this as 1152 01:08:06,440 --> 01:08:09,200 Speaker 1: this is running out. But when you know, it's funny, 1153 01:08:09,400 --> 01:08:12,240 Speaker 1: I got my coffee. I killed my coffee. So someone 1154 01:08:12,360 --> 01:08:14,720 Speaker 1: is gonna say, because it always happens on Twitter to 1155 01:08:14,760 --> 01:08:17,080 Speaker 1: be why the hell are you drinking Starbucks and you 1156 01:08:17,080 --> 01:08:21,360 Speaker 1: should be drinking black rifle coffee only I whipped hard. 1157 01:08:22,000 --> 01:08:25,759 Speaker 1: You know what's funny? Um? So Starbucks always gets blasted? 1158 01:08:25,880 --> 01:08:29,080 Speaker 1: Is this um like super left wing corporation? They're definitely 1159 01:08:29,439 --> 01:08:32,519 Speaker 1: left wing, but they're not anti veteran. Remember them in 1160 01:08:32,600 --> 01:08:35,120 Speaker 1: Black Rifle for having like a whole thing. They actually 1161 01:08:35,200 --> 01:08:38,519 Speaker 1: hire a lot of veterans, but they also were hiring 1162 01:08:38,560 --> 01:08:42,439 Speaker 1: Syrian you know, refugees, and somehow this got painted as 1163 01:08:43,000 --> 01:08:45,000 Speaker 1: why can't you give those jobs to vets? And it's 1164 01:08:45,000 --> 01:08:47,680 Speaker 1: not a one or the other. They do both. And 1165 01:08:47,720 --> 01:08:50,120 Speaker 1: I do know this because I worked for Senator Bill 1166 01:08:50,200 --> 01:08:52,680 Speaker 1: Bradley's like high up on the board at Starbucks. And 1167 01:08:53,000 --> 01:08:55,240 Speaker 1: when I say they're left waning, yeah, they're pretty open 1168 01:08:55,280 --> 01:08:58,639 Speaker 1: about that. But they they're also very veteran friendly. And 1169 01:08:59,160 --> 01:09:01,720 Speaker 1: I guess it's because of those T shirts that came out. 
1170 01:09:01,720 --> 01:09:03,599 Speaker 1: I don't know if it was Grunt Style or something 1171 01:09:03,640 --> 01:09:07,040 Speaker 1: where it was like, you know, vets before refugees and 1172 01:09:07,080 --> 01:09:09,479 Speaker 1: stuff like that. And these are two very separate issues, 1173 01:09:09,520 --> 01:09:13,000 Speaker 1: and I do hate how that gets put into one category. 1174 01:09:13,080 --> 01:09:17,280 Speaker 1: So that's really some mouth breathing type shit. Um. 1175 01:09:17,400 --> 01:09:19,200 Speaker 1: I don't know. I've gone on my whole rant before 1176 01:09:19,240 --> 01:09:22,400 Speaker 1: about how, like, I'm not concerned with a company's politics. 1177 01:09:22,520 --> 01:09:25,160 Speaker 1: I don't drink Starbucks coffee unless I have to, but 1178 01:09:25,280 --> 01:09:28,000 Speaker 1: not because of politics. I just don't like the 1179 01:09:28,040 --> 01:09:30,000 Speaker 1: taste of their coffee. It's not very good. It's the 1180 01:09:30,040 --> 01:09:32,960 Speaker 1: only coffee that, like, any time I have any other coffee, 1181 01:09:32,960 --> 01:09:35,080 Speaker 1: it just doesn't do it for me. Yeah, I mean 1182 01:09:35,120 --> 01:09:37,840 Speaker 1: strictly Starbucks. Well, I mean we're kind of spoiled here, 1183 01:09:37,880 --> 01:09:40,200 Speaker 1: I mean, in New York City. You know, there's people, 1184 01:09:40,400 --> 01:09:42,639 Speaker 1: you know, all kinds of little coffee shops making their 1185 01:09:42,640 --> 01:09:45,800 Speaker 1: own coffee, so they're just never getting to me. I 1186 01:09:45,800 --> 01:09:49,880 Speaker 1: don't know why. Starbucks coffee just tastes burnt 1187 01:09:49,960 --> 01:09:52,519 Speaker 1: to me. And I've heard other people say that too, 1188 01:09:52,560 --> 01:09:56,280 Speaker 1: man. Um. The last things I want to mention: I 1189 01:09:56,400 --> 01:09:58,760 Speaker 1: saw the cover of the New York Times today, the 1190 01:09:58,840 --> 01:10:01,880 Speaker 1: report about China. You know, they're saying that they 1191 01:10:01,920 --> 01:10:05,720 Speaker 1: are tapping the president's iPhone and he's being urged to 1192 01:10:05,720 --> 01:10:08,320 Speaker 1: switch over to something other than an iPhone, but he 1193 01:10:08,360 --> 01:10:11,479 Speaker 1: continues to use it. So that's definitely a concern, that 1194 01:10:11,520 --> 01:10:15,040 Speaker 1: they're hearing his private conversations. During the Obama administration, there 1195 01:10:15,080 --> 01:10:18,000 Speaker 1: was a story about how Yankee White, which 1196 01:10:18,080 --> 01:10:23,200 Speaker 1: is the classified communication system that the White House uses, 1197 01:10:23,720 --> 01:10:26,000 Speaker 1: um, had been compromised by the Russians, and that 1198 01:10:26,080 --> 01:10:29,920 Speaker 1: story disappeared really quick. Don't hear much about that. And 1199 01:10:30,320 --> 01:10:32,960 Speaker 1: there's an article up on the News Rep by Joe 1200 01:10:33,040 --> 01:10:36,960 Speaker 1: la Fabe, who we've had on, about China and Palestine 1201 01:10:37,040 --> 01:10:40,160 Speaker 1: signing a new trade agreement and strengthening political ties, 1202 01:10:40,200 --> 01:10:43,719 Speaker 1: which I thought was interesting, um, because that definitely shows 1203 01:10:43,800 --> 01:10:47,040 Speaker 1: China getting more in bed with countries that we're hostile towards.
1204 01:10:47,320 --> 01:10:50,400 Speaker 1: I also do wonder, you know, Palestine isn't exactly like 1205 01:10:50,439 --> 01:10:53,080 Speaker 1: an industrious country, like, you know, what are they even trading? 1206 01:10:54,040 --> 01:10:56,759 Speaker 1: A good question. I don't know anything about this trade agreement, 1207 01:10:56,800 --> 01:10:58,240 Speaker 1: so I'm going to have to go and do some 1208 01:10:58,280 --> 01:11:02,040 Speaker 1: research myself. But the article's up at the News Rep 1209 01:11:02,080 --> 01:11:08,440 Speaker 1: dot com. China is spreading its wings and its tentacles everywhere. Yeah. Um, 1210 01:11:08,479 --> 01:11:12,360 Speaker 1: well that's it for us, Ben. Uh. We'll be back 1211 01:11:12,720 --> 01:11:18,280 Speaker 1: actually Wednesday with episode four fucking hundred. I can't believe it. 1212 01:11:18,280 --> 01:11:21,200 Speaker 1: It's insane, that's wild. Yeah, I'm excited for it. I 1213 01:11:21,200 --> 01:11:23,360 Speaker 1: won't give away who we're having on, but I think it'll 1214 01:11:23,360 --> 01:11:26,360 Speaker 1: be pretty cool. Um. I think we'll get some great 1215 01:11:26,400 --> 01:11:29,800 Speaker 1: stories out of it. And please do, like 1216 01:11:29,840 --> 01:11:32,440 Speaker 1: I said, we use this show as a platform for anybody, 1217 01:11:32,920 --> 01:11:35,280 Speaker 1: um, that we think has something interesting that 1218 01:11:35,320 --> 01:11:37,280 Speaker 1: the audience will love. So please do pick up LikeWar: 1219 01:11:37,400 --> 01:11:41,400 Speaker 1: The Weaponization of Social Media by P. W. Singer, 1220 01:11:41,400 --> 01:11:43,400 Speaker 1: and then you should look at the cover. Now, who's 1221 01:11:43,400 --> 01:11:48,519 Speaker 1: the other co-author? Um, it is Emerson T. Brooking. Yes, 1222 01:11:48,600 --> 01:11:50,960 Speaker 1: we were originally gonna have both of them on, um, but 1223 01:11:51,040 --> 01:11:53,559 Speaker 1: I think Emerson was tied up. But it was a 1224 01:11:53,560 --> 01:11:55,680 Speaker 1: great interview. I read the book. I mean it is 1225 01:11:55,720 --> 01:12:00,800 Speaker 1: a good, um, concise history of this emerging topic that, 1226 01:12:01,280 --> 01:12:03,559 Speaker 1: like I said, we haven't fully come to grips 1227 01:12:03,600 --> 01:12:06,320 Speaker 1: with, um. So if you want to learn about the 1228 01:12:06,360 --> 01:12:10,439 Speaker 1: weaponization of social media and read a more sort of, um, 1229 01:12:10,720 --> 01:12:13,160 Speaker 1: I don't necessarily want to say it's purely academic, it's 1230 01:12:13,160 --> 01:12:15,200 Speaker 1: not like reading a white paper, but they do have 1231 01:12:15,240 --> 01:12:18,200 Speaker 1: all their source citations, because if you're writing about fake news, 1232 01:12:18,320 --> 01:12:21,680 Speaker 1: you want to do your homework, right? Um. So it 1233 01:12:21,760 --> 01:12:24,480 Speaker 1: does give a good history of the transmission 1234 01:12:24,520 --> 01:12:27,560 Speaker 1: of information throughout history and how it can be weaponized, 1235 01:12:27,760 --> 01:12:31,200 Speaker 1: um, with the focus on today. So I definitely give 1236 01:12:31,200 --> 01:12:33,560 Speaker 1: it a read if you're interested in that topic. Absolutely 1237 01:12:33,640 --> 01:12:36,280 Speaker 1: check it out. We're out. We'll be back next episode, 1238 01:12:36,320 --> 01:13:11,639 Speaker 1: episode four hundred.
Yeah, yeah, you've been listening to SOFREP 1239 01:13:11,680 --> 01:13:16,840 Speaker 1: Radio. New episodes up every Wednesday and Friday. Follow 1240 01:13:16,880 --> 01:13:20,400 Speaker 1: the show on Instagram and Twitter at SOFREP Radio