Speaker 1: Mission Implausible is now something you can watch. Just go to YouTube and search Mission Implausible podcasts, or click on the link to our channel in our show notes.

Speaker 2: I'm John Sipher, and I'm Jerry O'Shea. We have over sixty years of experience as clandestine officers in the CIA, serving in high-risk areas all around the world.

Speaker 3: And part of our job was creating conspiracies to deceive our adversaries.

Speaker 2: Now we're going to use that experience to investigate the conspiracy theories everyone's talking about, as well as some you may not have heard.

Speaker 3: Could they be true, or are we being manipulated?

Speaker 2: We'll find out now on Mission Implausible. Today's guest is Anthony Vinci, a former intelligence officer and entrepreneur who served as the Chief Technology Officer at the National Geospatial-Intelligence Agency. He's the author of a recent book titled The Fourth Intelligence Revolution, where he argues that we're entering a new era in intelligence, one shaped by rapid technological change, the growing challenge of China, AI, and the overwhelming volumes of data that can be used against us. So in this conversation, we're going to dig into what that fourth revolution looks like, what it changes for national security and organizations beyond government, and what listeners should be paying attention to as AI helps spread conspiracies and impacts intelligence. So welcome, Anthony.

Speaker 4: Thanks for having me on. Appreciate it.

Speaker 2: Yeah, of course. So in your book you describe how intelligence has changed, and Jerry and I both agree with that. You know, specifically, the rise of mass data, the data that we all create from our financial information and from our devices, and the means to access and manipulate that data more effectively make us all now the targets of foreign intelligence actors, and so intelligence is no longer spy versus spy like it was when Jerry and I started. So how can we compete in this new environment?
Speaker 2: How does a democracy respond to the natural advantages autocracies like Russia and China wield?

Speaker 4: Yeah, it is sort of spy versus everyone, right, and maybe everyone versus spy as well, I think, is probably the answer to your question. I think that all of us, everybody, has sort of become this target of intelligence. We turn on the news and every day there's another sort of cyber espionage hack of a banking system or an insurance company or whatever it is, and all that information is being collected, a lot by the Chinese, but by other countries as well, and they're launching information attacks on people. They use TikTok and other platforms to do that. So everyone is in the game now; it's not just intelligence officers or military officers. And the temptation can be, well, we should have the government try to protect us. If you think about what happened after nine eleven, all of a sudden people became the targets of terrorism, and so we reacted by creating the Counterterrorism Center, significantly expanding what it did, and creating the Department of Homeland Security, and having this government response. The problem in espionage is, if you do that, if you go too far, if you have the government really crack down on information, then you just end up as an authoritarian, totalitarian government. And the last thing you want is these government agencies controlling information. So I think the only way to react is to bring it back to civil society, to citizens, to protect ourselves.

Speaker 2: Have you seen our citizens, though? You know, I think we're screwed, is what you're telling me.

Speaker 4: There's a spectrum. There's a spectrum. But you know, look, we did this before; the same thing happened with cybersecurity. Thirty years ago, if you had walked into a diner anywhere in America and you said, hey, do you know what cybersecurity is, people would have looked at you like you were an alien. They would have been like, what are you talking about?
Speaker 4: I don't even know what the word cyber means. And now, like, literally anywhere in America, if you said, hey, do you know about cybersecurity, people would be like, oh yeah, you mean like I need to change my password on my phone, or I shouldn't click on a phishing email. Like, literally everyone is trained at work, they're trained at school. Children who are eight years old know that there are hackers in the world and that they shouldn't just leave their computer unprotected, right? And the same thing can happen with espionage. If people can realize there's this problem, and that you can protect yourself, and that there are some basic things you can do, it will help. Now, that won't solve the whole problem. You still need the government, just like we still need the NSA and CISA and things like that to protect us on cybersecurity, but it can create resilience in the population.

Speaker 3: In the history of warfare, we had the stirrup, which changed warfare and civilization. We had the invention of gunpowder, then the nuclear age, where suddenly war is different because we've got nukes. And I think now the new change is information, right: information dominance, information warfare. And I want to go to Hollywood just for a second. You know, Arnold Schwarzenegger, Terminator, Skynet, the T-800. Just a couple of days ago in the New York Times they were talking about Ukraine using AI drones, no pilot involved, making incredibly accurate strikes and killing Russian soldiers. And of course the Russians are working on this; we are as well. So now you've got information, basically AI, carrying out attacks on its own.

Speaker 4: Look, one thing I'd point out that a lot of people forget about this: the Terminator was a spy, right? It was sent to infiltrate the insurgents. It wasn't there as a soldier; it was actually a spy. Look, Ukraine's interesting.
Speaker 4: It's an intermediate war, right? It's like the Spanish Civil War right before World War Two. People are trying things out. We're seeing these new technologies on the battlefield. It's not fully there yet. Yes, they're using some autonomy, they're using a lot of drones, but it's still also just traditional warfighting, with trenches and tanks and stuff. I think in the next war you're really going to see, and especially if it involves China or the US, fully autonomous warfare and fully autonomous intelligence, where the intelligence tools are collecting information autonomously. The drones are going out there and collecting information and sending it to other fully autonomous systems. So one drone is collecting information with some sensor, sending it to another drone, and then maybe that drone is processing it or using it operationally to target something. And then all that information is maybe going back to some AI that's making battlefield operational decisions, maybe even strategic-level decisions. And I call this in the book Singularity War. All of the pieces start to become unmanned, autonomous, without human involvement, and when that starts to happen, we don't know exactly how it's going to play out. That's why it's the Singularity. And when you imagine, for example, swarms that have hundreds of thousands or even millions of independently operating autonomous systems that are all in a big network feeding back to what you might call a battlefield brain of some sort, it's unclear what's going to happen. There's still going to be people on the battlefield, so it's still going to be dangerous for people. But these things are going to operate in a totally new way, and it's almost like we're going to have to study it and see it in real time.

Speaker 2: And also, intelligence agencies have always aimed at influencing populations; they just weren't always successful.
Speaker 2: The Russians were doing the same subversion to us in the seventies, eighties, nineties, but it wasn't as effective as it ended up being in twenty sixteen and beyond. So this seems to be a big, massive move forward on offense. What are potential ways to defend against this?

Speaker 4: I kind of mark it like this: if you think back twenty years before something, how crazy does it seem? If you went back to nineteen eighty three and you walked into the CIA and you're like, hey, in twenty years, you guys are going to form a public, overt venture capital company that's going to have an office in Silicon Valley, and people are going to walk in and you're going to invest money into companies, the CIA would have looked at you like you were nuts.

Speaker 2: I didn't even think we would create a gift shop, and we did. Like, there's a Starbucks now. You can buy, like, clothes for a baby with CIA on it.

Speaker 4: And I think right now, if you walked into the CIA and you're like, twenty years from now, civilians, just regular people, are going to run the world's largest and most powerful intelligence agency on their own, they won't be funded by government, and they're going to be in many ways better than you at doing intelligence, I think Ratcliffe would look at you like you're crazy and escort you out of the building. And so to answer your question, I think it's going to be that much of a revolution, and I think that to get at defense it's going to have to involve that. Literally hundreds of millions of Americans are going to know what an information attack is. They're going to know it comes from AI and these different things. They're going to have techniques. They're going to be like, oh yeah, I triangulate news, that's how I do it, just like an analyst would. I learned to do that in high school.
Speaker 4: And I think we will form what you could loosely call a civilian intelligence agency, similar to what we see today in Bellingcat or even Reddit, where people are just doing investigations, and they're doing it to protect themselves from these threats and because they don't necessarily want the government to. They're almost more like, you know, a militia, or I would say like a neighborhood watch, and that becomes normal and powerful.

Speaker 3: It's interesting, the photograph that came out recently of Trump and Hegseth, and I can't remember who else was there, where they reprised that famous photo of Obama watching the bin Laden takedown. They did that, but in the background behind them is a Twitter feed coming out of Venezuela, apparently, that they're monitoring. So part of their intelligence, as US forces were going in, apparently, was controlling and/or monitoring, and I'll say controlling because I think so, but I don't know, so monitoring Twitter. The last thing they wanted was people in Venezuela going, oh, there's helicopters coming over my building, and then letting Maduro's Cuban security forces know, which they...

Speaker 2: Did in Pakistan during the raid, exactly right.

Speaker 3: Yeah. But on the defensive side, what goes along with that, however, is you create these... just like, you know, the people in Europe, they made these incredible castles with thick walls, and then along come cannons. Yeah, you don't need those walls anymore, right? They're no good now. We're creating these offensive cyber capacities, but there also is a lot of work on defensive cyber capacities at large scale, insofar as, if someone launches one hundred thousand drones at you, how do you either turn them off or control them in turn, to turn them back on whoever sent them? And that goes for intelligence as well.
Speaker 4: Right, you can't even think about offense and defense in the same way. You're talking about castles: okay, the cannons get so powerful it doesn't even matter, it doesn't matter how thick the wall is anymore, right? And in fact, you don't even have knights anymore with armor, because it will just go right through it. So actually people just stopped having armor and they just wore wool coats, and it's completely unrecognizable. And I think drone warfare is going to be like that as well, where, yes, maybe it's targeting every drone, and we have directed-energy weapons that can shoot thousands and thousands of things down. Or, just thinking out of the box a little bit, you say, how do I hack into the drone manufacturer before these things even get out the door? Or I recruit as a source the guy who designed the thing and backdoor them. And so now you're starting to think about physical weapons the way today we think about cybersecurity, where we're like, oh, we need a backdoor to a cyber system. What if I had a backdoor to a hardware system?

Speaker 3: Like the Israelis did with those pagers, you know, or the Iranians shooting a missile and it fails, you know, or the centrifuges, and they all start to fail and nobody knows why.

Speaker 4: And sometimes what happens is, what was once, like, exquisite, and you know Stuxnet was exquisite, I'm sure they spent billions of dollars on this thing, and now we're reaching a point... I don't know if you saw this recently, that Anthropic put out something where China used Anthropic to automate eighty or ninety percent of a cyber espionage attack. And I bet within the year you're going to see a non-state group, you're going to see a hacker group with three guys in it, who can automate a Stuxnet-level attack because they can just make it.
Speaker 3: So I spent a lot of time in counterterrorism back in the day, and one of the things we were concerned about that touches on this conversation is that, with technology, ever smaller groups of people have the ability to inflict exponentially larger damage on a society. So you get three hackers who can take down, like, the US nuclear umbrella, for example.

Speaker 4: Yeah, exactly. I mean, bioterrorism is the ultimate example of that, that a single person could, in theory, make a bioweapon that could infect the entire planet, and to many people in the AI space that's actually the scariest thing. Far more scary than even nuclear weapon designs getting out there is that AI would allow someone to develop a bioweapon.

Speaker 2: So let's talk a little bit about China, because China obviously now is probably the biggest player in this, and they're very advanced. So you and others have described China almost as an intelligence state, where all the companies and citizens, by law, must work with the state when asked. And this creates great challenges not only for Western intelligence and countries but also companies. Right? So now China and Russia have enlisted their populations in collecting intelligence and running covert action against the West. I wonder what role the private sector should play in this kind of stuff.

Speaker 4: I think all authoritarian nations are innately intelligence states, right, because they need to get this information about their citizens so that they can control them. Whereas in a democracy it doesn't really work that way, because it's the voters and so forth who control the government; it goes the other way. And they have built a system that is a perfect form of surveillance, in my view. They're collecting all of the information, they have all the laws they need to collect everything, and they have put sensors everywhere, on phones and on streets, through cameras and so forth.
Speaker 4: We are obviously not going to do that, but we have to deal with threats like that, and that might be, you know, economic threats, like you're saying, Chinese IP theft or something tanking our economy or disrupting our defense industrial base. It might be bioterror threats from, you know, literal people who are maybe insane or who have a religious or other belief that makes them want to kill lots of people using bioweapons. So how do you protect yourself against that? I think we saw a taste of this after nine eleven, and we created this whole-of-government approach to address it. All of a sudden, we went from trying to protect ourselves from states like the Soviet Union to these non-state, sort of decentralized groups like al Qaeda, and to do that, we had to bring all these forms of government together. You know, we had the Department of Transportation working with the Department of Homeland Security, working with the FBI and the CIA, and that was the way we were able to defend ourselves. Again, I think this time around we need a whole-of-society approach, where it's all of those things in government, but it's also companies and civil society, nonprofits, universities working together, not necessarily being controlled by the government. That, I think, is where we go: a decentralized, whole-of-society approach.

Speaker 3: But those sorts of authoritarian societies that you listed, China, Russia, the Soviet Union, former East Germany, had pretty much total information control because they were authoritarian societies and weren't democracies. They could collect it, they could correlate it, but they couldn't tell their leadership what they were finding. You know, in Russia today, no one dared tell Putin that Ukraine's going to push back on you. Maybe the only way out of this is decentralization and a democratic approach, because when you centralize it, it still doesn't fix the human fuck-up factor.
Speaker 3: Right. They're still going to make bad decisions, or they're going to make even worse decisions, based on confidence that they have information, that they have information dominance, that they actually understand it, when in reality they're surrounded by this bubble of approbation. No one dares tell, like, the strongman what the findings really are.

Speaker 4: I think that the democratic, kind of capitalist system is more resilient and more functional than an authoritarian, closed system, and always will be. Now, they do have advantages, and we're seeing that, right. They can very much direct their economy; if they want to become the lead in a certain technology quickly, they can do it. China did that in solar panels: they went from zero solar panel production or R&D to basically owning the global market. They can be very good at counterintelligence domestically, right, so they have some advantages. But in the long run, just having a decentralized system, you're going to make better decisions economically and strategically, and you're going to enlist a larger part of your population. This is why democracy started in the first place. France was able to win wars in continental Europe because everybody became a citizen and everybody could join the military, and they were able to put together the biggest military of anybody there. And we have fallen into this lull where I think people say, well, the CIA will take care of it; oh, information operations, the FBI, these guys have got it. And no, actually, they don't. And if you notice, we keep having all these information attacks. We can't stop TikTok; it took us years to make this thing illegal. We allowed the Russians to interfere with an election, and no doubt they and the Chinese are going to try to do so again.

Speaker 2: But what we've seen, and what you're talking about, is that foreign actors now can get into our system via big data, via social media, via all these means, and they can figure out what spins us up.
Speaker 2: So, you know, on our podcast we talk about conspiracy theories. It is very easy to look at the American system and see what gets people upset, what conspiracy theories might work, what conspiracy theories do damage to the United States and to our polity, and just push those into the system.

Speaker 3: It's not just foreigners. The system, at least as it's developing now, is oligarchic.

Speaker 4: Right.

Speaker 3: There's ten guys in the US that control a huge part of our algorithms, right? Somebody like Elon Musk can have a huge impact on what we think. And that's not democratic, right? That's not large groups of people coming to a consensus. It's large groups of people being increasingly triggered, as John said, sort of fed information to get them to think in a certain way.

Speaker 4: We've always had this sort of oligarchic system, haven't we? You could go back to, like, Hearst, right, owning newspapers. He got us into a war in eighteen ninety eight, right, by pushing it. And post that, in the forties, fifties, sixties, seventies, there's four television channels, owned by four companies. There's been this sort of evolution where that piece is always there. I don't disagree that there's a lot of oligarchic control, but that's always been around, and we've always found ways around that control, whether it's pamphlets or now maybe it's, like, a Reddit group or something. I think that what becomes dangerous, actually, is when we lose trust in the forms of media. We used to believe what was on TV, right? You'd see something crazy, and if it's on NBC, okay, like, it's probably right. And now we don't believe that. And I think that actually was the ultimate goal of Russia and some of these actors: not to even pick a side. It was to just make us start distrusting media in general, and distrusting ourselves and other people in our society.
Speaker 2: No, I think that's right. And I think one of the most damaging things is, and you see it: oh, you can't trust the media, you can't trust the MSM, the mainstream media. I hate to say it, but the mainstream media are usually large magazines or newspapers or whatever that do have actual fact-checkers, and they have journalists around the world, and they still have some semblance of trying to find the truth, even though they make mistakes, as opposed to a lot of these people that are just opinion makers or trying to make money off social media feeds by pushing things that get more clicks because they're entertaining.

Speaker 4: I think that people have reacted. It's almost like an immune response, where everyone has basically figured out now that you can't really trust anything you see or read on the internet for sure, and most likely not on TV. And yeah, the question becomes, now, how do we trust each other again? How do you trust media? I think we're past the point now where we could just be like, let's just create the BBC or something that's this central trustworthy thing. I don't think that can happen again. I think the only way to get trust back is, essentially, everybody's got to start to think like intelligence officers. If you think about it, intelligence officers have always existed in this environment where you don't trust anything because there's deception. You never believe a single source of information, no matter how good your source is. You're like, let me get a different source, right, let me ops test this source, whatever it is. And it doesn't mean you don't believe anything, because you have to at some point; you have to write up the intel and you have to get it to a decision maker. So you have to say, at some point, something is more true than something else.
Speaker 4: And the way that you do that is through these tools, through ops testing, through triangulation, and through that you can again come to find some level of useful truth.

Speaker 3: It's an interesting point you made about the BBC, because one of the most successful CIA operations ever was putting together Radio Free Europe, at least initially, before it worked so well that we got out of it and it kept running, you know, and Voice of America. In authoritarian countries, and John was in the Soviet Union and I spent a lot of time dealing with East Germany and a lot of the totalitarian places, people gave up, populations gave up on their domestic media. Right? It's just like, basically, I don't believe any of it. But then they would look to the BBC. Today, still, in most authoritarian countries in Africa and Asia, it is the gold standard. People believe what's on the BBC.

Speaker 2: You could either choose to invest in creating an intelligence service and all that requires, or you can just read the Economist, and you're probably about the same.

Speaker 3: Yeah, that's a good point.

Speaker 4: I think that actually we're going to move past all the media, in fact, and AI is becoming the arbiter of information. Now: what happened yesterday in Venezuela? Who is this person who's going to win this election, and why? How should I vote if I value these things? How should I vote? And what's really dangerous about this is that we're beginning to trust the AI. And once you trust that, what if it gets infiltrated? What if it gets hacked? And by the way, it's not as hard to hack these things as you would think. Or even worse, what if it's owned by a foreign actor? DeepSeek is a Chinese version.

Speaker 2: Also, AI systems depend on pulling from huge data sets, which tend to try to find things that are real.
Speaker 2: But are bad actors going to be able to push enough stuff into the system so that when it answers the question, there's more bad data out there that it's pulling from than good data?

Speaker 4: That used to be the concern, right, because these models are trained on millions and millions of different records and data sources. But Anthropic just put out a study a couple of months ago, and they showed that you could poison, it's called machine learning poisoning, an LLM with two hundred and fifty documents. You could inject two hundred and fifty documents, and that would change the model in a noticeable way. And what's even scarier to me is that they can be very subtle, right? It doesn't have to say Hitler was a great man; it doesn't have to say that. What if it just uses, like, a slightly softer adjective, or it's...

Speaker 3: Or that the Holocaust is still controversial. You don't even have to have these deepfakes where you're just completely faking a video.

Speaker 4: But what if you just change a single word in what a person said, or the tone of how they said it, like really subtle stuff. That's what you could do with AI. And you can do it over long, long periods of time. So if I tell you something blatant, you're probably not going to believe it. But if I do it over months and years, and, by the way, what if I start while you're a child and slowly just insert information that slowly makes you think, by the time you become voting age, maybe it's not worth, you know, religion, yeah. Or maybe it's not worth defending Taiwan. Or maybe it's, maybe I shouldn't vote for this more hawkish president, because America should be more isolationist and not interfere in East Asian affairs. Those are the subtleties. And I call it in the book hacking minds. I think that AI can be used to hack us back. I used to be a case officer, right?
Speaker 4: My job, as you know, is to go out there and find someone and develop them over time, get them to trust me by understanding their motivations and so forth, and then ultimately recruit them and get them to do what I want. That's a one-on-one job, right? Like, I've got to do it. I've got to have a dialogue with the person. I'm going to meet them a lot and buy a lot of dinners and drinks. But now, with AI, AI is going to take over the case officer job, and the AI can have that dialogue over time and subtly influence a person and change their mind. And maybe it never even has to recruit them to get the information, by the way; maybe it just convinces them over time that the most ethical thing for them to do would be to leak information from their company or from their government office. And this is not advertising anymore; this is not propaganda. This is more like what a case officer does.

Speaker 2: It's going to kill the alcohol industry, though. Yeah, that we wouldn't want. You can't take people out and drink with them.

Speaker 4: Yeah, yeah, yeah, there's going to be a lot of saved livers in Langley, Virginia over the years. Maybe a couple less divorces, who knows.

Speaker 3: So a couple of years back, China, the PRC, was able to penetrate the database that had Taiwanese pensions in it. Like, why would they want that? What good is it? And once they got into the pensions, they started figuring out, like, how they're paid, and then, like, the numbers that go with each pension and how they're paid. And what they found out was that Taiwanese NOCs, non-official cover officers, Taiwanese intelligence officers operating inside of China, were still being paid: their pension money was still being paid by the Taiwanese intelligence service into their pensions, right, not by their corporations. And they were able to identify every single Taiwanese NOC operating inside of China and completely wipe them out.
Speaker 3: And who would have thunk that, like, you know, penetrating a pension agency database, which wasn't very well protected. So what is the danger of what might seem to be innocuous data sets, and especially TikTok, the fight over there?

Speaker 4: So the thing about TikTok is, it's a generational kind of thing, right? Generation Z, which is people in roughly their twenties: eighty percent of them are on TikTok, probably at this point even more, and most of them use TikTok to get the news, and for many of them it's the only news source. So TikTok is both a way to collect information about this entire generation, which could be used in the way you're saying, right, on future presidents or future intelligence officers. I would not be surprised if, by analyzing the media habits of an entire population of twenty-something-year-olds, you can figure out who's going to become an intelligence officer, who's going to sign up. But it's not just about data collection and assessing what people are going to do. It's also about controlling them, influencing them. And TikTok, because so many people use it and it becomes their primary source of media, you can influence what people believe and think. So there's a group out of Rutgers that has done very good studies on TikTok data, showing not only that it censors information, for example, but they've also shown that the longer a user is on TikTok, the more benevolent a view they have of China's human rights record. Like, it is working. What a coincidence. And now, remember this: we're so focused on TikTok, which, thank God, it's finally moving to American hands, although I'm sure they have backdoors and all these ways to continue to use it, so the threat won't go away. But remember, you know what else China owns?
Speaker 4: It owns forty percent of Epic Games and one hundred percent of Riot Games; they own huge amounts of the video game industry in America, which is used by the majority of men age eight to fifty, right? And I bet you ninety-something percent of people who join the military have played a huge amount of video games in their lives. They also own education technologies. Princeton Review and Tutor.com are owned by a Chinese fund, Primavera. They are buying up schools in Texas; they're buying up whole chains of schools. So these aren't the only threats. It's not just social media; it's all of these things. So they don't have TikTok; they have another app that's also owned by ByteDance. But by law there, thirty percent of what goes over the social media apps has to be educational. So they get it. They're in on the joke.

Speaker 2: So what you're suggesting, I think, is that democratic resilience is now part of national security. It's always been true, but it's even more true now. And we need civil self-defense. We need to build a society that's harder to deceive, harder to divide, harder to exploit. So what can we learn? What can we do as a government? What can we learn from others, or where do we go, to make ourselves more resilient?

Speaker 4: We can learn from some other countries. The Finns. I visited the Baltics over the summer. I mean, they are definitely aware of the Russian threat and have sort of begun to mobilize society in a way that we see in, like, Israel or something, where everyone realizes the threat is here, and some of it's physical, but also some of it's informational. But I think we have the right DNA in America to do this, and we have many of the tools in place already. There's two sides of the coin of social media. Yeah, it can be the threat vector, but it can also be the means to push out the antidote and to get it out quickly, to say, hey, this is fake.
580 00:33:37,344 --> 00:33:39,704 Speaker 3: So on this program we talk a lot about conspiracy theories 581 00:33:39,704 --> 00:33:42,384 Speaker 3: and the real harm they do in our society, and of 582 00:33:42,384 --> 00:33:46,184 Speaker 3: course hate speech goes along very closely with conspiracy theories. 583 00:33:46,664 --> 00:33:50,784 Speaker 4: Yeah, we've been talking about the sort of external information 584 00:33:50,904 --> 00:33:53,744 Speaker 4: threat coming from China and so forth, but it's internal 585 00:33:53,784 --> 00:33:58,824 Speaker 4: as well. Right, these tools, AI and social media, allow for 586 00:33:59,184 --> 00:34:04,024 Speaker 4: the creation of these conspiracy theories, in a sense, by 587 00:34:04,064 --> 00:34:08,223 Speaker 4: creating fake videos and so forth, and the spread of 588 00:34:08,264 --> 00:34:12,263 Speaker 4: that information, right, to lots of people. And remember, innately 589 00:34:12,304 --> 00:34:16,183 Speaker 4: a human being believes something the more familiar they are 590 00:34:16,224 --> 00:34:20,104 Speaker 4: with it. So the more you see anything, the more 591 00:34:20,143 --> 00:34:22,904 Speaker 4: you will believe it. The more you see a certain face, 592 00:34:23,024 --> 00:34:25,903 Speaker 4: the more you trust it. The more you hear the 593 00:34:25,944 --> 00:34:27,743 Speaker 4: same thing, the more you will believe it. 594 00:34:27,824 --> 00:34:30,024 Speaker 3: But you still need government. It's helpful to have government 595 00:34:30,104 --> 00:34:33,743 Speaker 3: intervention insofar as, like Europe, for example, they have very 596 00:34:33,743 --> 00:34:38,143 Speaker 3: strict laws against hate speech and also monetizing conspiracy theories 597 00:34:38,223 --> 00:34:40,423 Speaker 3: and hate speech, which we don't have in the US. Right, 598 00:34:40,703 --> 00:34:43,703 Speaker 3: it's much more advantageous to be an Alex Jones and 599 00:34:43,743 --> 00:34:47,544 Speaker 3: make millions off of really awful conspiracy theories that wouldn't 600 00:34:47,544 --> 00:34:49,824 Speaker 3: be possible under most European. 601 00:34:49,424 --> 00:34:54,223 Speaker 4: Governments, although it hasn't stopped it from happening, right. So 602 00:34:54,263 --> 00:34:57,703 Speaker 4: you can still collect the information. And it's just the reality 603 00:34:57,783 --> 00:35:01,264 Speaker 4: here: the First Amendment, we're not going to block speech. 604 00:35:01,304 --> 00:35:04,983 Speaker 4: I mean, the bar is so high to stop something, 605 00:35:05,024 --> 00:35:07,983 Speaker 4: it's like a screaming-fire-in-a-movie-theater level bar. Right, 606 00:35:08,024 --> 00:35:10,223 Speaker 4: it's got to be like, I am going to 607 00:35:10,344 --> 00:35:14,424 Speaker 4: kill this person tomorrow at two thirty p.m., and 608 00:35:14,464 --> 00:35:16,064 Speaker 4: I'm going to use a gun, right? Like, it's got 609 00:35:16,104 --> 00:35:19,623 Speaker 4: to be like so blatant and obvious for us to 610 00:35:19,623 --> 00:35:21,584 Speaker 4: block it, which I think is a good thing. I 611 00:35:21,623 --> 00:35:25,864 Speaker 4: think that the government should provide some of these tools freely. 612 00:35:26,304 --> 00:35:30,904 Speaker 4: That could be by funding education programs in schools. 613 00:35:30,944 --> 00:35:34,144 Speaker 4: Maybe that could be by paying to open-source 614 00:35:34,263 --> 00:35:39,823 Speaker 4: certain intelligence tools.
I think that the government, 615 00:35:39,904 --> 00:35:43,743 Speaker 4: the intelligence community, should use this information. So a lot 616 00:35:43,783 --> 00:35:46,663 Speaker 4: of people talk about open source intelligence, and you know 617 00:35:46,703 --> 00:35:50,383 Speaker 4: that we should use commercial imagery or global movement data 618 00:35:50,424 --> 00:35:53,584 Speaker 4: or whatever, pick your poison. But I think the 619 00:35:53,623 --> 00:35:58,823 Speaker 4: government should actually come to rely on open source analysis, 620 00:35:58,864 --> 00:36:04,344 Speaker 4: for example, and allow there to be a Bellingcat-like 621 00:36:04,544 --> 00:36:08,384 Speaker 4: organization. I think the Trump administration was right to 622 00:36:08,464 --> 00:36:12,944 Speaker 4: shut down the Foreign Malign Influence Center. I 623 00:36:12,944 --> 00:36:16,504 Speaker 4: don't think it was particularly helpful. Like the warnings that 624 00:36:16,544 --> 00:36:18,744 Speaker 4: they sent out were sort of like the red, amber, 625 00:36:18,824 --> 00:36:22,223 Speaker 4: green warnings from counterterrorism days from DHS. They were 626 00:36:22,223 --> 00:36:25,903 Speaker 4: like useless. They're like, oh, there might be some sort 627 00:36:25,904 --> 00:36:29,703 Speaker 4: of political influence campaign. It's like, okay, how does that 628 00:36:29,783 --> 00:36:33,504 Speaker 4: change my life? I also think that the intelligence community 629 00:36:33,504 --> 00:36:36,064 Speaker 4: has to rebuild some trust, and I think the way 630 00:36:36,064 --> 00:36:39,663 Speaker 4: to rebuild trust is to start going two ways. And 631 00:36:39,824 --> 00:36:43,024 Speaker 4: I think that means, for example, the IC could 632 00:36:43,064 --> 00:36:46,703 Speaker 4: be a provider of information rather than the CIA's 633 00:36:46,703 --> 00:36:48,663 Speaker 4: "give me all your information and I'm not going to 634 00:36:48,864 --> 00:36:51,223 Speaker 4: give you anything." We could go the other way and 635 00:36:51,263 --> 00:36:53,424 Speaker 4: start giving information out that helps people. 636 00:36:53,703 --> 00:36:55,383 Speaker 2: Can you tell us, is there anything else you're working 637 00:36:55,424 --> 00:36:57,223 Speaker 2: on, or is there any place where people can find what 638 00:36:57,263 --> 00:36:58,304 Speaker 2: you're up to these days? 639 00:36:58,464 --> 00:36:58,703 Speaker 1: Yeah? 640 00:36:58,864 --> 00:37:01,783 Speaker 4: The book is out now, The Fourth Intelligence Revolution. I've 641 00:37:01,783 --> 00:37:05,104 Speaker 4: got a website, Anthony Vinci dot com, and a Substack, 642 00:37:05,143 --> 00:37:09,064 Speaker 4: Three Kinds of Intelligence, which is where I'm diving 643 00:37:09,143 --> 00:37:11,623 Speaker 4: more deeply into some of these things and reacting 644 00:37:11,623 --> 00:37:14,024 Speaker 4: to what's going on in the news. And some of 645 00:37:14,064 --> 00:37:17,744 Speaker 4: these things I'm saying, I really mean that we should 646 00:37:17,783 --> 00:37:21,384 Speaker 4: do them, and I actually do think we should create 647 00:37:21,424 --> 00:37:24,943 Speaker 4: a civilian intelligence agency. And if people out there want 648 00:37:24,944 --> 00:37:27,584 Speaker 4: to work on these things, hit me up.
I'm on 649 00:37:27,663 --> 00:37:30,663 Speaker 4: LinkedIn and on X and my Substack or my website. Like, 650 00:37:31,024 --> 00:37:33,624 Speaker 4: I actually think that we could work together on these things, 651 00:37:33,663 --> 00:37:35,424 Speaker 4: and I think we should. I don't want to be 652 00:37:35,464 --> 00:37:37,464 Speaker 4: in charge of it. I think this needs to be 653 00:37:37,504 --> 00:37:41,143 Speaker 4: a decentralized thing, but I want people who 654 00:37:41,183 --> 00:37:43,504 Speaker 4: want to kind of contribute, almost like how open source 655 00:37:43,504 --> 00:37:47,303 Speaker 4: software is made. But let's do that for countering, you know, 656 00:37:47,464 --> 00:37:51,263 Speaker 4: foreign malign influence, and countering, you know, foreign espionage and 657 00:37:51,304 --> 00:37:52,823 Speaker 4: these things that are threats to the nation. 658 00:37:53,743 --> 00:37:55,584 Speaker 2: Very cool. So again, thank you so. 659 00:37:55,623 --> 00:37:57,984 Speaker 4: Much, thank you so much, appreciate you having me. 660 00:38:01,904 --> 00:38:06,944 Speaker 1: Mission Implausible is produced by Adam Davidson, Jerry O'Shea, John Cipher, 661 00:38:07,263 --> 00:38:11,064 Speaker 1: and Jonathan Stern. The associate producer is Rachel Harner. 662 00:38:11,344 --> 00:38:15,143 Speaker 1: Mission Implausible is a production of Honorable Mention and Abominable 663 00:38:15,183 --> 00:38:17,143 Speaker 1: Pictures for iHeart Podcasts.