Speaker 1: Welcome to Tech Stuff. This is The Story. I'm Oz Woloshyn, here with Karah Preiss.

Speaker 3: Hi Oz.

Speaker 1: Hi Karah. So I've been very, very excited about today's story. It's a deep dive with the author of a column that has the headline "A band of innovators reimagines the spy game for a world with no cover." Here's the author.

Speaker 2: He began to realize that the future of intelligence was going to be written in zeros and ones, that it was going to become a technology war, an algorithm war.

Speaker 1: The story is by David Ignatius. He's a journalist at the Washington Post with a reputation for knowing the inside workings of the CIA better than almost anyone else in the world. One of the things I love about doing Tech Stuff is that we get to look under the hood of how tech is revolutionizing places that you might not expect, in some cases the most unexpected places, one of which is, as we insiders like to call it, "the agency": the CIA. And what's interesting about David's piece is that it makes clear spycraft itself is at an inflection point because of AI.

Speaker 3: So tell me a little bit more about why this caught your interest.

Speaker 1: Well, I grew up as a boy in Britain and have become a British man, so of course James Bond fantasies are totally organic to who I became as a person. But I'm just beyond intrigued by this conceptually, because traditional spycraft was all about obscuring your identity from other people. All those crazy disguises and prosthetics and stuff you see in movies like Mission: Impossible are based on real CIA technologies, or at least many of them are. And David told me that the art of disguise has gotten so good that you can easily change your race, even your gender. And that's just the stuff we know about. But now intelligence agencies are facing a radical new problem, which is: how do you trick a machine?
Speaker 3: So while you might have a mask that completely changes your appearance, a retina scan or a fingerprint could completely blow your cover.

Speaker 1: That's right. So the story began with a guy who worked with the CIA, who warned a few years ago that computer vision would soon be able to identify people at a far-off distance just based on the signature of how they walked. It's called gait analysis, and at the time he was laughed out of the room as a scaremonger, but it turns out he'd actually seen into the future. It's almost impossible nowadays for humans to outsmart machines because of something that David called, and I love this phrase, "digital dust." What that refers to is the data signatures we leave behind no matter how hard we try. Here's the real kicker: not leaving digital dust could be just as revealing as leaving it. In the past, spies could slip under cover with just a story. But now, if they don't have decades of online activity, LinkedIn, Instagram, Facebook, etc., to support that story, it just doesn't make any sense to their adversaries. In fact, as David put it, the harder you try to hide, the more visible you become. And all of this leads to some existential questions about the future of spying. How does the CIA adapt? What happens if it doesn't? And who's responsible for dragging it into the future? One of those people is the guy I mentioned, the person who warned about gait analysis. He's now innovating on the outside and trying to sell into the CIA. But to start with, I wanted to know how the CIA has tried to adapt already, and how it's going. Here's my conversation with David Ignatius.

Speaker 1: You chart in the piece how a number of poorly executed technological ideas have led to networks being dismantled in places like China and Iran. I assume by networks being dismantled that means people getting arrested and maybe even killed.
Speaker 2: Not only killed. Some of the executions were gruesomely done, in ways that made sure our overhead satellites could see them. So it is true that our attempts to come up with clever technologies have sometimes been half-baked. I'll give you an example that's been published, by me and others. The agency had a seemingly very clever method for communicating with recruited agents, where it would give them access to websites that fit their personalities. Let's suppose somebody was a Liverpool soccer fan, so it'd be a Liverpool fan site. But embedded in that fan site was a template for communicating directly and very secretly through the Internet, through a VPN, a hole in the Internet, back to Langley. The problem was that these things all had the same back end electronically, so once you stripped away the nominal cover... You know, there were ones for Rasta fans in the West Indies, there were ones for country music fans. You could go to dozens of these dedicated sites, but they all had the same back end, which was really about covert communications. And in both Iran and China this secret was assessed and then ruthlessly exploited. So you have to be careful with technology. When you think you're being smart, you have to go back and look at it again, because you may simply be being obvious in a different way you hadn't considered.

Speaker 1: One of the other examples that comes up in the piece is American commandos, I believe in Syria, whose cell phones came from Fort Bragg, right? And so it was very easy for an outside party to locate where they were.

Speaker 2: So there was a very secret location in northeastern Syria, in the Kurdish-controlled area. I know about it because I probably went there five times as an embed with the Joint Special Operations Command, which is our most secret and really our best military force, and which was running operations there to destroy ISIS.
Speaker 2: Remember how frightening ISIS was. So a well-meaning guy who was in the commercial advertising business, whose business was picking up the little emissions from your cell phones that tell Waze where you are on the highway, where a cop car has pulled over, wanted to help refugees who were fleeing Syria in the early days of the war. So he just bought all this data from Syrian telecom companies. It was very cheap, because nobody had any commercial applications whatsoever for that area. And then he overlaid telephones in the area of Fort Bragg, North Carolina, which is where JSOC and all of our most secret units operate, and he found there were all these pings at the cement factory, supposedly an abandoned French cement factory. The advertisers were just like, oh my god, look at all this. So there's an example where commercial innovation was just much faster than the creativity, or the ability to detect new threats and opportunities, on the intelligence side. And that illustrates, I think, Oz, what is in some ways the biggest and most positive change that's happened. Starting in nineteen ninety-nine, then-CIA director George Tenet was very farsighted in seeing that the CIA was falling behind the pace of innovation of Silicon Valley. Tenet had the good sense to create a CIA venture capital fund, which he called In-Q-Tel, and rather than getting the usual intel bureaucrat to run it, he went outside and picked a man named Gilman Louie, who'd been running a video game company. And Gilman Louie began going out and looking for smart ideas that could help in intelligence missions, recognizing that the pace of innovation in the private sector simply couldn't be matched by government. The government had to find a way to use these technologies and, if possible, work with the people who were creating them.
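The overlay David describes is, at its core, a set intersection over commercial ad-tech location pings: collect the device IDs seen inside one geofence, collect the IDs seen inside another, and look at who shows up in both. Here is a minimal sketch of that idea; the field names, coordinates, and pings below are invented for illustration, not drawn from any real dataset.

```python
# Toy sketch of the "overlay" idea: intersect the advertising IDs seen
# inside one geofence with those seen inside another. All fields and
# coordinates below are hypothetical illustration, not real data.

def in_fence(lat, lon, fence):
    """Crude bounding-box test; a real system would use proper geometry."""
    (lat_min, lat_max), (lon_min, lon_max) = fence
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

def devices_seen(pings, fence):
    """Collect the ad IDs whose location pings fall inside a fence."""
    return {p["ad_id"] for p in pings if in_fence(p["lat"], p["lon"], fence)}

# Rough, illustrative bounding boxes only.
FORT_BRAGG = ((35.05, 35.20), (-79.10, -78.90))
NE_SYRIA = ((36.30, 36.60), (40.60, 41.10))

pings = [
    {"ad_id": "device-A", "lat": 35.14, "lon": -79.01},  # seen near the base
    {"ad_id": "device-A", "lat": 36.45, "lon": 40.85},   # ...and in Syria
    {"ad_id": "device-B", "lat": 36.50, "lon": 40.90},   # Syria only
]

overlap = devices_seen(pings, FORT_BRAGG) & devices_seen(pings, NE_SYRIA)
print(overlap)  # {'device-A'}: the phones that moved between the two fences
```

The unsettling part of David's story is that location data granular enough to support exactly this kind of join was simply for sale.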
Speaker 2: So that got the intelligence community operating at the speed of innovation, if you will, and I think it has certainly accelerated the pace of change. That's happened in Britain and among a lot of our allies too. So this began with a good idea of George Tenet's, but now I think it's pretty much universally understood that the intelligence world, once totally closed, has to connect with this wildly open and innovative world we're talking about.

Speaker 1: I saw in your piece those investments in Palantir and Anduril, and presumably, if you were a venture investor, you would be beyond cock-a-hoop with those returns. As the CIA, though, I mean, financial performance is great, but is the CIA in some sense threatening its own relevance by outsourcing so much innovation?

Speaker 2: Versus the private sector, the CIA is trying to keep up. They have technical advisory boards, and there are some very patriotic people in tech companies who work with them. But we have this interesting problem: Elon Musk, to take one example, came up with an extraordinarily powerful system, Starlink, with now three thousand satellites in low Earth orbit providing broadband signals just about everywhere, and that became the command and control network for the Ukrainian military. But that then puts enormous power in the hands of a private-sector entrepreneur. So what happens if Elon Musk decides: I have to sell Teslas in China, I have to make Teslas in China, I don't want to, I'm sick of this Ukraine war, so I'm going to cut off their ability to get those signals? There have been moments when he has in fact threatened to do that and then pulled back. But it just illustrates that the use of private technology is a double-edged sword. Yes, it accelerates the pace of your ability to innovate, but it makes you more dependent on these entrepreneurs, who hopefully share your national interests and will protect and advance your secret advantage. But you can't be sure of that, and so that's a puzzle
I think people are still struggling with.

Speaker 1: Characterize where the CIA is on these issues and questions that you've been raising, because the sense I get from the piece is that you feel they're way behind. I mean, there's this great quote from a former CIA director, David Norman, who says that if Henry Ford had gone to transportation customers and asked what they wanted, they would have said faster horses. That's what the CIA has been trying to build: faster horses. And it's pretty damning.

Speaker 2: It is pretty damning. They deny that they're as out of it as the piece suggests. The question is how fast you can move to fully adapt. You can be aware of something, but fully adapting to it is a different matter. People are still trying. I think "faster horses" is a little unfair, but they're still trying to think, within the existing paradigm for espionage: How can we do it better? How do we hide better? How do we find ways to capture their signals without being observed ourselves? But what's needed, people say, is something really very new, a whole new way of thinking about operating. Is John Ratcliffe, the CIA director, a person who's creative and disruptive enough to orchestrate that transformation? We'll see. But legacy systems have a momentum, a sort of weight. Look at aircraft carriers in the Navy. Twenty years ago we knew that they were all vulnerable, that they'd disappear in the first minutes or hours of any attack. But they're still out there, and good luck trying to get rid of them.

Speaker 1: After the break, the story of an elite soldier turned tech founder who's creating an AI superagent. Stay with us.

Speaker 1: As David highlights in the piece, the CIA's technology challenge is a little-noted example of a transformation that's happening in every area of defense and security today.
Speaker 1: Smart machines can outwit humans now. Even for Tim Cook, even for the CEOs of Silicon Valley technology companies, knowing how to balance defending the core product with integrating this wave of new technology that is so fast-moving is hard; even Google is struggling with it. So how does a government organization cope, saddled with the bureaucracy that even the most forward-thinking government organizations come with? Is there an overall strategic response to this problem, this crisis?

Speaker 2: So I think the obvious answer is that new ideas in the intelligence business that are really powerful, that allow you to collect secrets that you didn't have before, that open new areas for collection and analysis, become irresistible. These secrets are so powerful. Once you learn to read somebody's mail, to listen to their phone calls, it's irresistible, because policymakers, once they've got that conversation between the General Secretary and his chief of staff, they want it. They want to get the good stuff. So there's always a demand from policymakers for the very best intelligence, and if technology can allow people to get more of that, or sustain the flow of it, certainly the demand will be there. I would just note, if you look at what the CIA and other agencies were able to know about Russian intentions in late twenty twenty-one, when Europe and even Ukraine said no, the Russians aren't going to attack, Bill Burns and his colleagues kept going out and saying, yes they are. And they had detailed intelligence. They were reading Russian intentions like a phone book. It's still not clear just how they had such precise intelligence, but they knew right where the Russians were coming, and that's how Ukraine in the early days of the war was able to be so successful. We knew where they were coming: they were coming to the airport just west of Kyiv. We knew exactly what they were going to try to do, and bang, there were people there waiting for them, and they just took them out.
Speaker 2: But the electronic coordination of all the systems that have to operate simultaneously is something we don't think about. You know, if you've got an hour of time and you've got all these different multiple fires, different drones, different systems, that cannot be done by human beings. It's too complicated. So there's a way in which the war in Ukraine was an algorithm war, with complex systems for handling data in a way that had never been done in warfare before. Something of that complexity, managed electronically, simultaneously: you've never seen that before.

Speaker 1: So the diagnosis is actually that rumors of the demise may be exaggerated. What you're really looking at in the piece is more continued relevance on a five-to-ten-year horizon.

Speaker 2: So I'm looking at the people who are trying to change the paradigm. But the point that should encourage people, if you're an American, is that there are a lot of really smart people out there. Many of them got frustrated inside the CIA. It's pretty bureaucratic these days, and it's gotten a little political, to put it mildly, people say. And they're out there trying to create companies that are going to be good for their former colleagues, to solve problems that they couldn't solve while they were in operations. And so people are coming up with, as I say, quite innovative ideas.

Speaker 1: Talk a bit about Aaron Brown's story, if you don't mind: how you first met him, how his insights within the agency were rejected, and how he came to design Lumbra and even engage with Sam Altman in the very early days of ChatGPT's release.

Speaker 2: Brown was an Army Ranger, and if you don't know anything about what they do at Ranger School, these are the guys who can climb up a cliff face and run ten miles with a fifty-pound pack on their back and get in and out of incredible places.
Speaker 2: They're tough fighters, and they end up going to these special units. And like many very good soldiers, Aaron ended up getting detailed to the CIA and worked in its counterterrorism operations in the days of the pursuit of Osama bin Laden. All the while, he had a deep engineer's interest in electronics and was trying to think about the tools of his craft and whether they were adequate. And when the bin Laden work was done, he began to worry about the vulnerability of officers overseas to this technology that could recognize the way they walked. And just think about it: if you've got recordings of everybody who enters all the airports of China, everybody who enters the US embassy or all the consulates that we have, you could end up having a library that you can then run against people all over China. You say, well, that guy from the US embassy, what's he doing in Chengdu? Why did he go to that forest that's ten miles out of town? What's going on here? So he left the agency a year, eighteen months ago, and with a friend started this company, Lumbra. And his basic idea concerns not simply these magnificent AIs that are moving toward what we call superintelligence, without our being quite sure what that means, but computers that begin to be able to think in real, human-like ways and begin to give themselves instructions, maybe by writing their own programs, and begin to be able to speak to each other; that's one thing superintelligence will be able to do. And his insight was that just as our brilliant brains need hands and legs, the ability to get to what we need, to assemble it and then process it, computers are going to need agents, agentic AI, that will help them assemble disparate pieces of information, think, and then analyze and make sense of it as no human being could.
Speaker 2: So he decided that that was going to be his area: this central nervous system that's going to connect the AI brain with all the other parts of the system that will make it most effective. He in turn introduced me to other friends, a guy who was thinking about this problem of how our cell phones give off our identity while we still have to communicate. So how are our officers going to communicate without giving away their position? And it turns out he came up with an incredibly ingenious technology. Essentially, there are three or four levels of identifiers in our phones that we're not aware of that give off our identification, and he found a way to bounce those identifiers among the thousands of users of his system, so you can't really tell where any particular signal is coming from, even if you capture it. It's called Cape. A third company that was interesting to me is called Strider. It's based in Salt Lake City, Utah. Essentially, what they're doing is reading other people's digital dust. We're not the only ones who leave digital dust. The Chinese are so intent on monitoring their own population, they've got cameras on every street, and it's not to watch us, it's to watch Chinese people. And it turns out that that's an entryway for companies like Strider to collect an amazing amount of information. I mean, I've looked at what they can get, and it's pretty incredible. Imagine how long it would take you to do that kind of forensics in a free digital world. So that's something that that company is doing. They think they probably do it better than any internal agency in the government. The fact that they were willing to talk about it with me in detail, and I didn't sneak them into telling me, means that they're fairly confident they can continue to do it. So those are three little examples of companies that are playing around at the frontier, and I hope people will find that encouraging.
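Cape hasn't published its design, so this is only a guess at the mechanism, but the "bouncing" David describes suggests rotating the identifiers a phone presents across a large pool of users, so that no identifier maps stably to any one device over time. A toy sketch of that idea, purely illustrative and not Cape's actual system:

```python
# Toy illustration of "bouncing" identifiers among users: each epoch, every
# device draws a fresh identifier from a shared pool, so an observer who
# captures an identifier can't tie it to one device over time.
# Hypothetical mechanism only; not how Cape (or any real carrier) works.
import random

POOL = [f"ID-{n:04d}" for n in range(10_000)]  # shared identifier pool

def assign_epoch(devices, rng):
    """Randomly map each device to a distinct identifier for this epoch."""
    return dict(zip(devices, rng.sample(POOL, len(devices))))

rng = random.Random(0)
devices = ["phone-A", "phone-B", "phone-C"]

for epoch in range(3):
    mapping = assign_epoch(devices, rng)
    print(epoch, mapping["phone-A"])  # a different identifier each epoch
```

Under a scheme like this, capturing "ID-0042" at one tower tells an observer very little, because in the next epoch that identifier may belong to a different user entirely.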
Speaker 2: I would hate to think that we were just a slumbering, sloppy giant with the intelligence equivalent of aircraft carriers just sitting there waiting to get blown out of the water. I don't want that.

Speaker 1: We've talked a lot about concerns about the CIA falling behind technologically. Do you have any concerns about a swing too hard in the other direction, onboarding too many untested private technologies that could create risk to American citizens, or systemic risk? I mean, what's the flip-side concern to everything we've been talking about?

Speaker 2: So, you know, this is a period where we're seeing that the powers of government can be misused in ways that to me are quite disturbing. When law enforcement is federalized, you see overreach, and you could see a technological version of that overreach. You could see the application of facial recognition software. In China, a citizen can't go out of his town without permission. He gets a score based on his performance at work, rating his social merit. He gets a little gold star if he's a dutiful citizen supporting the Chinese Communist Party. We don't want that. And we don't want our schools, I don't think, to become rigid in what they instruct. We want creativity, not uniformity, and these tools can create uniformity, sadly. So, you know, we have to remember that the power of our federal government is overwhelming, and we want it to be strong in dealing with our adversaries. But if that power begins to be used against American citizens, or in inappropriate ways around the world, everybody should be watching. So that's, I think, my biggest challenge as a journalist. I think this revolution is needed and beneficial, but I think the dangers are ones that we have to keep in mind. You can't be so worried about the danger that you just stop and say we're not going to do it; that would be a mistake. But you do have to be vigilant all the time about what could happen.
Speaker 1: Well, David Ignatius, thank you so much for joining us on Tech Stuff today for a fascinating conversation, and I hope you'll join us again before too long.

Speaker 2: With pleasure, Oz. Thank you very much.

Speaker 1: Tech Stuff. I am Oz Woloshyn, and I'm Karah Preiss. This episode was produced by Eliza Dennis, Melissa Slaughter, and Tyler Hill. It was executive produced by me, Karah Preiss, and Kate Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode, and Kyle Murdoch wrote our theme song.

Speaker 3: Join us on Friday for The Week in Tech, as Oz and I run through the tech headlines you may have missed. Please rate, review, and reach out to us at tech stuff podcast at gmail dot com.