Speaker 1: Bloomberg Audio Studios, Podcasts, Radio News.

Speaker 2: Hello and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Speaker 3: And I'm Tracy Alloway.

Speaker 2: So Tracy, we're recording this March twenty fourth, and of course almost all of our episodes lately have been about the war in Iran. But what's interesting, or what's a little weird, is that just prior to the war, literally days or maybe hours, the biggest story in the world was actually about defense and, you know, the DoD.

Speaker 3: That's right. So you are referring to Anthropic.

Speaker 2: Yeah.

Speaker 3: And its disagreement, to put it mildly, with the Department of War.

Speaker 2: Yeah, exactly. This was the biggest story going right up on the eve of the war in Iran. And of course, obviously, there was this contract, Anthropic's technology was used by the Defense Department. So it was not a disagreement about the use of AI per se in war, but the question of the degree to which AI could be used for autonomous weapons systems on their own, without a human in the loop.

Speaker 3: And surveillance. Surveillance, this was another key element. But you're right, so we've heard this expression, autonomous weapons, pop up more and more, especially in recent days. And I have a lot of questions over what exactly that means, because my impression is the US military certainly has been using AI for some time. And so we're really talking about degrees here of autonomy, right? And so if you think about an autonomous weapon, I think your mind could go fully Terminator, and, you know, there's like a murder robot out there that's making its own decisions on which people or places to target. And then you get levels below that, right, where AI is kind of helping humans to come up with strategic decisions.

Speaker 2: Right. So, if there is a missile coming and you have a missile defense system, I don't think you want a human in the loop who's like, okay, here are the coordinates X, Y, Z that we think it's going to hit, at this point in time we think the missile will be here, are you cool with firing it? I think everyone's probably okay with that level of autonomy. But I have a feeling that, to your point exactly, a lot of this discussion, and maybe it's core to what Anthropic and the Department of Defense are disagreeing on, I have a feeling a lot of this is going to come down to definitions. My guess is that there is not one shared agreement of "this is an autonomous weapon system and this one is not."

Speaker 3: Absolutely. And of course there are also questions over exactly how places like the Department of Defense, not only how they define it, but once they have those definitions, whether or not certain companies trust them to stick to those policies. Because the US will say, well, our policy is not to surveil our citizens at the moment, so if you're Anthropic, you don't need to worry about that. Clearly Anthropic feels otherwise, or says they do. So there are all these really interesting thematic questions that pop up from all of this.

Speaker 2: Totally. And then there's the question of, okay, here's a technology, and the government says, we believe we can use this to make the country safer. What, you're not going to let us do it? Like, a private corporation? There are some very interesting questions about the role of corporate power vis-à-vis the government and so forth. Anyway, this is something that has become even more timely with the reports, even in the early days of the Iran war, about these AI systems having been used perhaps in target selection. But we don't really know. None of the reporting is that clear. I don't think that they're going out and advertising this strike.

Speaker 5: "This is exactly how we're using AI."

Speaker 2: "This is exactly how we're using AI," and so forth.
But this is obviously a huge debate and, war aside, it's totally going to grow, just as AI is going to grow, it seems, in so many different areas. Anyway, I'm really excited to say we really do have the perfect guest, someone who's been writing and thinking about this stuff for a long time. When we talk to an AI expert, I mark the dividing line, it's like, were you talking about AI prior to when ChatGPT was released? As, like, I take a little bit more seriously the people who were in this space prior to November twenty twenty two. Anyway, I'm very excited to say we're gonna be speaking with Paul Scharre. He's the executive vice president at the Center for a New American Security, and he's the author of two books related to this. The most recent is Four Battlegrounds: Power in the Age of Artificial Intelligence, and prior to that, he is the author of Army of None: Autonomous Weapons and the Future of War. He was previously in the Office of the Secretary of Defense, and he's also a former Army Ranger, so truly the perfect guest. So, Paul, thank you so much for coming on Odd Lots.

Speaker 6: Oh, thank you for having me. Very excited to be here.

Speaker 2: Where do we start? I mentioned I had a feeling that maybe the definition of an autonomous weapon is a contested one. But if I say to you, what's an autonomous weapon, what is an autonomous weapon?

Speaker 6: So I think you're right from the beginning that there is not a unified definition that everyone agrees on. The Defense Department has their definition that's written in their policy. I think conceptually, the distinction really is a weapon that is choosing its own targets on the battlefield, and that's not where we are today. Right now, today, people are choosing those targets. But it is kind of a spectrum, because we do have examples of weapons that have some measure of autonomy.
A good analogy might be self-driving cars, where conceptually, okay, a self-driving car would be where the AI is driving the car. But then you get into an actual car today, and a lot of them have intelligent cruise control, automatic braking, automated self-parking. They have all these automated features that are kind of creeping in this direction where the AI is taking over more and more control of what the vehicle can do. And it's actually a pretty similar thing in the military space as well.

Speaker 3: So Joe mentioned that had we been having this conversation even a month ago, it probably would have had fewer concrete examples of AI-enabled weaponry, let's say, or strategy. When the Pentagon talks about its advanced AI tools that it's deploying for the Iran conflict, what are some examples that you're seeing right now that are different to, say, maybe just a year ago when we had another Iran conflict?

Speaker 6: Right. So there's a couple of ways in which the Pentagon is using AI right now. One is narrow AI systems that have been around for over a decade now that do image classification, for example. This was the military's original Project Maven almost a decade ago, where they took machine learning image classifiers to sift through drone video feeds and satellite images to identify objects: okay, here's a building, here's a person, here's a vehicle. That's pretty mature technology now. What's come out in just the last couple of weeks that's really quite interesting is that, in the midst of this huge, messy public breakup between Anthropic and the Pentagon, we found out that in fact Anthropic's AI tools are being used by the US military to help plan the war against Iran. That's obviously a different kind of AI tool: large language models, AI being used to write code, AI agents. And that's being used in different ways, really helping intel analysts sift through just the massive amounts of data that the US military has.
And so you can imagine the problem that the military is facing right now when they're looking at targets in Iran. The US military has flown over six thousand sorties against Iran. The Iranian military architecture is degraded in a lot of ways. The US military has bombed a lot of targets. There are mobile targets: senior Iranian commanders, mobile missile launchers and air defense systems and drone launchers. The US military has got to bring all of that information together and find out, where are these targets right now, and where is there an aircraft that has the right bombs on it to take these targets out? And that's how AI is being used, to help basically process and understand all that information.

Speaker 2: When I think about the description that you gave for that, I sometimes think, like, could it be that... now, I don't think that using Anthropic's technology means that they go into Claude.ai and say, give us a list of suitable targets for sorties. But could it be something like that, but I'm sure there's a different interface and so forth? Is that a completely ridiculous way of essentially framing the service that AI is providing right now?

Speaker 6: So the way that these AI tools are being integrated is through an existing system called the Maven Smart System, which is built by Palantir, and that fuses all this data together. So you basically have an existing architecture for data management for intel analysts that the military already has, that brings together all these different forms of data. You might have satellite imagery, geolocation data, signals intelligence, other forms of information. That's pretty great for intel analysts, but it's also really unwieldy, because how does a human understand all that data and process it? And that's where the large language model tools, whether it's Claude or other companies', can be valuable, is that it could be a way for a human to interact with that data, to basically task a large language model to say, okay, here's a bunch of data I'm giving you. I want you to look for intersections and things, right? I want you to look for a place where we have satellite imagery and some other forms of intelligence that can help identify the location of, you know, some missile launcher, for example. And then humans can look at that, and it helps, one, just find where all these targets are, and then it helps in planning too. A human could say, okay, here's this list of potential targets that I have. Now, they're scattered all over Iran, and Iran is a really big country. I want to map these to locations for US aircraft at different bases across the region. What are the available aircraft, and what are the available munitions on those aircraft that could be used to take out those targets, to help build a strike package? So the AI is definitely being used to help understand the battlespace and to plan operations, but, I would say, in ways that are pretty narrowly directed by people. It's not quite as simple as, like, dump all this data into a context window for an LLM and say, oh, AI, figure it out. People are asking the AI some really specific questions.

Speaker 3: So, I'm thinking how to phrase this question diplomatically. I get that the difference between fully autonomous weapons is, you know, the human as a decision maker. In the current setup, how meaningful is the human actually? Like, what's your sense of it? Because I'm imagining, if you're an intelligence officer and you're getting reams and reams of data from Iran and you ask the AI to pick out certain patterns or identify potential strategic targets, how much due diligence are you actually doing on what that model spits out? Because, of course, the tendency when a lot of people use LLMs, certainly, is just you accept what it shows you on the screen.
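[As an aside on the workflow Paul describes above: the contrast he draws is between dumping an entire fused database into a model's context window and handing the model one narrowly scoped question over a pre-selected slice of data. A rough illustrative sketch of that second pattern follows; the record fields, the example flow, and the call_model helper are all hypothetical stand-ins, not anything drawn from Maven or any vendor's system.]

```python
# Rough illustration of narrowly directed LLM tasking over fused intel-style
# records, as opposed to dumping an entire database into the context window.
# All field names, the filtering rule, and call_model are hypothetical.

from typing import Callable

def build_narrow_prompt(records: list[dict], question: str) -> str:
    """Attach one specific question to a small, pre-selected slice of records."""
    lines = [
        f"- id={r['id']} source={r['source']} time={r['time']} note={r['note']}"
        for r in records
    ]
    return (
        "Use only the records below to answer the question.\n"
        + "\n".join(lines)
        + f"\n\nQuestion: {question}\n"
        + "Cite the ids of the records that support your answer."
    )

def ask(records: list[dict], question: str, call_model: Callable[[str], str]) -> str:
    # The analyst's tooling, not the model, decides which records are in scope.
    in_scope = [r for r in records if r.get("in_scope", False)]
    return call_model(build_narrow_prompt(in_scope, question))
```

[The design point is simply that the human frames a specific question and bounds the data; the model only fills in the pattern-finding step in the middle.]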
Speaker 2: And just to add on to Tracy's question, because there's an obvious place to go with it, which is that in the early days of the war, we hit that school. Yeah, and I'm reading a New York Times report, and that was the result of, quote, outdated data provided by the Defense Intelligence Agency. Now, we don't know exactly what that means. But okay, various outputs come out, then what happens? Like, how much is the human layer currently, in terms of, okay, here are targets, here are ships that are on a battleship, this could be plausible? What do you think, or what do you know, about the level of human decision making that happens between some output and then the ultimate call for a strike on whatever it is?

Speaker 6: Yeah. I mean, first of all, I think it's a really important question, because it is one of the possible failure modes, if you will, of AI and how we use it. Because you can end up in a place where humans are nominally in the loop, and you could say, well, it's not an autonomous weapon, humans are making these decisions. But if the human is not meaningfully engaged and they're just kind of rubber stamping some kind of decision, that's not really what we're looking for. So I think it's been a longstanding concern for many years among people worried about autonomous weapons, and I think that's a very real risk with how AI is used. Now, based on my understanding of how the AI technology is used in Maven today, and based on what I've seen of demonstrations of it, because I have seen some demonstrations of this in action, I think humans are pretty involved right now in terms of actually looking at the output from the AI, giving pretty specific guidance to the AI systems.
I do think there is an underlying challenge that the strike on the school highlights, which is, when you're talking about thousands and thousands of targets, what's the degree of vetting that's gone into all of that information, both in the run-up to the war, which in this case, that school was a fixed object, and so that's likely something that should have clearly been much more vetted prior to the war kicking off. Someone could have identified that that building that was struck had actually been at one point in time part of an Iranian military compound, but you could see, based on publicly available satellite imagery, that it had been moved out of that compound some time ago and had been converted to a school. And it would appear, based on what's been reported in the Times, that that information had never been updated in this DIA targeting database. Now, I would hope that we'll get more information in the future, and some investigation about exactly where that went wrong. But I think that does speak to this underlying challenge of how good is the data going into this AI system and how thoroughly are people vetting it? And again, in principle, AI might be able to help you with those things, but you've got to use it the right way, and people still have to be meaningfully engaged.

Speaker 2: Why don't we back up for a second? Tell us about the work that you've done in this area, really several years ahead of the curve, in talking about this stuff, planning for this stuff. Give us a little bit of your background and what got you on this train, again, several years before ChatGPT.
Speaker 6: Yeah. So really over a decade ago now, around, say, twenty eleven, I led an effort inside the Pentagon, when I worked at the Office of the Secretary of Defense, on developing the Pentagon's policy on the role of autonomy in weapons, the one that's still in effect today, in fact. And at the time, we weren't at all where the military is now in terms of integrating AI tools. I mean, these types of large language models just didn't exist at the time. But the military had kind of woken up to what I would call this accidental robotics revolution during the wars in Iraq and Afghanistan, where the military had deployed thousands of air and ground robots, drones in the air and ground robots for defusing bombs, and the military was starting to think through, where is this going in the future? And one of the things that everyone could see would be valuable would be having more autonomy in these systems, the ability to not be totally reliant on a human remotely controlling them, which was really the case at the time. But that raised all these obviously thorny questions about, like, well, how much autonomy should they have, and what are the legal and ethical implications of that? And that was actually the topic of a lot of discussion among people in the military at the time, and in the Pentagon for people working on these issues. And so that ultimately led to that policy directive that's still in place on the role of autonomy in weapons. And then when I left the government, I continued to work on this topic, as we've seen discussions internationally through the United Nations, and as we've seen the technology evolve in really amazing ways, but also ones that raise risks with artificial intelligence.

Speaker 3: So when you were doing that job, I get that you were on the policy side, but did you ever see anything on the contractor side similar to what we're seeing with Anthropic right now?
Was there ever a contractor who said, actually, no, I'm really uncomfortable with the way that the department wants to use this particular tech?

Speaker 6: Not at that time. Now, a few years later, after the US military launched Project Maven, there was a big dust-up when it came out publicly that Google had been a part of Project Maven, and a number of Google employees signed an open letter protesting, and Google eventually discontinued their work on Project Maven. And, you know, it's not an exact replica of what's going on here, but there are certainly some similarities in terms of a disconnect between how some people in the AI community are thinking about how their technology ought to be used in war and how the military is thinking about it. And I think part of that is this underlying challenge that AI is really different than a lot of traditional military technologies, because it's coming out of the commercial sector. So in a way, it's kind of like the opposite of stealth technology, which was invented in secret defense labs and doesn't have a lot of commercial applications. AI has all of these different applications. It's not being invented by the military; the military is having to import it in. And there are a lot of debates about how AI should be used, you know, in the military and more broadly in society.

Speaker 3: Actually, on that note, I think this is really interesting and definitely a pivotal point in, I guess, the history of the military-industrial complex. But why can't the US government, with all its resources, actually develop AI in-house and just avoid the seeming complication of having to deal with a commercial enterprise?

Speaker 6: Partly, it doesn't have the technical skills. The AI scientists and engineers, there's a fierce competition for talent in the AI space, and the military just can't buy that talent; they don't have it. And the government spends a lot of money, hundreds of billions of dollars annually, on defense.
But we've actually seen in the last few years that private enterprise is able to mobilize massive amounts of capital towards building data centers, towards training AI models. And that's partly because the commercial applications for this technology are much bigger than the defense applications. And so for a lot of these tech companies, there's, at least, maybe not in this particular instance, but in the past there could often be some prestige associated with saying, oh, you know, the Air Force uses our AI system, or the Navy uses our technology. But the defense sector is actually kind of small for them as a customer. I mean, the dollar amount that's been talked about publicly for the Anthropic contract is two hundred million dollars. That's not a lot of money for these AI companies. And so I think we've actually seen the defense sector struggle to just keep pace with the amount of investment that's needed in the space.

Speaker 2: Tracy, I think it's a good question. And then you remember, well, the government couldn't build a good healthcare website to sign up for health insurance. And I hate to bring that up because it's old, but it's true, right? So it's like, are they gonna build a world-class LLM? Can a government build a good unemployment insurance website? We've done multiple episodes; the answer continues to be not the case. I do find it fascinating, however, your point about there being this novelty. It is impossible to imagine, say, Lockheed Martin inventing a technology and being like, no, you can't use it, because Lockheed Martin's entire raison d'être is building technology for the government. It is inconceivable what that would be. But it is sort of novel when you're getting these defense technologies. And, you know, Google was also an example. Google obviously had technology that did not originally serve a purpose of defense. We saw, we remember, the employee revolt.
Speaker 2: Let's talk more about that disagreement, though, between Anthropic and the Department of Defense. In your mind, where does Pete Hegseth want to go with this technology? And does that deviate from some of the policies and the directives that you were working on when you were working on this stuff?

Speaker 6: So what's kind of crazy about this whole dispute, particularly as an issue of autonomous weapons, is that literally everyone I've spoken with has said that there's no intention by the military to use AI to make fully autonomous weapons today. I think anybody that's actually worked with a large language model, with any kind of chatbot, whether it's Claude or Gemini or ChatGPT, knows that if you use these to write an email, you need to double check it. In no way, shape, or form are they reliable enough to make life and death decisions. I don't think the military actually wants to do that. What's at dispute here is a more fundamental disagreement about, well, who sets the rules? And so the origins of this really were that, when the Pentagon came out with a new strategy for AI in January, one of the things in their strategy was that, going forward, they wanted their contracts with AI companies to allow the military to use their AI tools for any lawful use. Basically, look, anything that's legal, we want the ability to do it. And that has conflicted with how a lot of these tech companies have been thinking about their AI tools. They're very nervous, many of these companies, about harms from AI. They're conscious of these risks, and so a lot of them have various use policies in place. You can't use the AI, you know, to launch offensive cyber attacks, for example. That's the kind of thing that actually, like, the government might want to do. And so this was really the rub with the government. It was, like, who sets the rules, rather than necessarily a near-term question of fully autonomous weapons.
Speaker 3: So what we've already seen is Anthropic has this disagreement with the government, and then OpenAI steps in and raises its hand and says, okay, Anthropic doesn't want to do it, we'll do it happily. Does this just leave us in a situation where it's sort of a race to the bottom, right? It's like the lab with maybe the least amount of safety concern, or the least amount of reputational concern, is able to do this, and so we still wind up in a situation where the government is using AI.

Speaker 6: Well, I think what's unfortunate here is that, when you think about what would be optimal for the government: one, I think it would be ideal for the government to have access to this technology, and to have access to all of the best-in-class models available, because they are good at slightly different things sometimes, and it's much healthier for the government to have access to a number of different providers, so that there is healthy competition in the market and you don't get locked in with one vendor. But also, if the AI scientists are saying, hey, it's not reliable for this, you want to listen. Like, that seems like a thing you want to hear them out about, right? And so I think, in order to use AI in ways that actually are effective for the US military, we've got to have a healthy dialogue between the AI community and people in the military profession about what the technology can and cannot do. And I think it's unfortunate that we've seen that dialogue break down in such a dramatic way over this dispute.

Speaker 3: Just going back to the idea of who actually makes the rules. You mentioned earlier that, you know, you can't use Claude to illegally hack into a system. Supposedly it is unable to do that; it has, like, a kill switch within itself that prevents it from doing that.
446 00:23:53,000 --> 00:23:55,560 Speaker 3: If you're anthropic, could you not just hardcode some of 447 00:23:55,600 --> 00:23:58,320 Speaker 3: these systems and say you're not going to be able 448 00:23:58,320 --> 00:24:03,080 Speaker 3: to be used for domestic surveillance of Americans or for 449 00:24:03,359 --> 00:24:04,120 Speaker 3: war crimes. 450 00:24:05,000 --> 00:24:06,879 Speaker 6: So yeah, So this is where it gets a little 451 00:24:07,000 --> 00:24:09,440 Speaker 6: more technical. It has to do with some of the 452 00:24:09,480 --> 00:24:13,080 Speaker 6: ways in which the companies may be providing their technology 453 00:24:13,119 --> 00:24:15,160 Speaker 6: to the government. So there's a couple of different ways 454 00:24:15,160 --> 00:24:17,439 Speaker 6: in which an AI company can put safeguards in place 455 00:24:17,800 --> 00:24:21,320 Speaker 6: to make sure that their models not being abused. One 456 00:24:21,400 --> 00:24:24,639 Speaker 6: is training the model itself to refuse certain requests. So 457 00:24:24,680 --> 00:24:26,639 Speaker 6: if you ask the model to do something, it's just 458 00:24:26,640 --> 00:24:28,280 Speaker 6: going to say, like, I'm not going to do that. 459 00:24:28,280 --> 00:24:30,640 Speaker 6: That's not consistent with the guidance that I've been given, 460 00:24:30,640 --> 00:24:32,680 Speaker 6: and the model has been trained to do that response. 461 00:24:33,480 --> 00:24:37,800 Speaker 6: Another way is that the company can put classifiers on 462 00:24:37,840 --> 00:24:41,160 Speaker 6: the input and or the output of a model, where 463 00:24:41,800 --> 00:24:44,400 Speaker 6: the model might give you an answer, but then there's 464 00:24:44,440 --> 00:24:47,320 Speaker 6: like another aisystem that's checking that answer or checking what 465 00:24:47,359 --> 00:24:50,000 Speaker 6: you ask of it and saying, well, that's not acceptable. 466 00:24:50,280 --> 00:24:52,800 Speaker 6: And then a third and I've want into that actually 467 00:24:52,840 --> 00:24:55,240 Speaker 6: myself in my own research because the nature of the 468 00:24:55,240 --> 00:24:57,520 Speaker 6: things that I work on a security things, and I've 469 00:24:57,520 --> 00:25:01,480 Speaker 6: had situations where I asked Claude help me understand this issue. 470 00:25:01,640 --> 00:25:05,120 Speaker 6: Claud actually generates response and then it gets deleted. It's 471 00:25:05,119 --> 00:25:07,439 Speaker 6: really interesting to see. And then the other way an 472 00:25:07,440 --> 00:25:10,840 Speaker 6: anthropic has actually talked about this in response to countering 473 00:25:11,359 --> 00:25:15,960 Speaker 6: some use of Claude by Chinese hackers who are using 474 00:25:15,960 --> 00:25:21,720 Speaker 6: it for cyber attacks, is the company monitors use through 475 00:25:22,280 --> 00:25:25,320 Speaker 6: that people are doing, and so people are doing things 476 00:25:25,359 --> 00:25:28,080 Speaker 6: that seem suspicious. Maybe they're logging in from an IP 477 00:25:28,200 --> 00:25:30,920 Speaker 6: address that's known to be associated with cyber criminals or 478 00:25:30,960 --> 00:25:33,840 Speaker 6: a hacking group that to try and find ways to 479 00:25:33,880 --> 00:25:36,560 Speaker 6: get around some of these protections. 
the company can also find ways to try to catch that. And so there's a couple of different ways to do it that might not all be in place if you're thinking about military use, where, depending on how that relationship is structured between the company and the government, if the model is, for example, hosted on a different cloud infrastructure and the military has direct access to it, the company may not have the same ways to actually shape whether or not the technology is being used according to their principles. Which is partly why the contract details do matter, of, like, what is the agreement between the company and the government about what the military can and cannot use the technology for?

Speaker 2: Tracy, I think your point about this sort of seeming safety race to the bottom is very real, and it's one that I think about a lot. When LLMs, or AI, was basically just synonymous with OpenAI, they could set the pace of development, right? They could do it. As soon as this became a hyper-competitive space, where you have OpenAI and you have Anthropic and you have Gemini and a thousand open-source AI models out of China, et cetera, the tempo of release has really heightened. And the degree to which it feels like they have no choice but to accelerate, just for the commercial imperative, feels like a very real dynamic, in which, like, I don't know where that leaves AI safety.

Speaker 3: Totally. And also, you mentioned China. Then it's not just domestic competition between, you know, OpenAI versus Anthropic; it's a competition between international actors, where it's like, okay, well, the US might want to have safeguards on technology, or say that it does, but maybe Russia or China don't care.
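[To make the guardrail layers Paul describes a little more concrete, here is a minimal illustrative sketch (not any vendor's actual implementation) of one of them, an input/output classifier wrapped around a model call. The function names, policy phrases, and the generate and classify helpers are hypothetical stand-ins.]

```python
# Illustrative sketch only: a hypothetical input/output classifier wrapper
# around a model call, showing the guardrail pattern described above.
# `generate` and `classify` are stand-ins, not any real vendor API.

from dataclasses import dataclass

@dataclass
class Verdict:
    allowed: bool
    reason: str

def classify(text: str) -> Verdict:
    """Hypothetical policy classifier: flags text matching disallowed uses."""
    disallowed = ("offensive cyber attack", "domestic surveillance")
    for phrase in disallowed:
        if phrase in text.lower():
            return Verdict(False, f"matched disallowed use: {phrase}")
    return Verdict(True, "ok")

def generate(prompt: str) -> str:
    """Stand-in for the underlying model call."""
    return f"[model response to: {prompt}]"

def guarded_generate(prompt: str) -> str:
    # 1. Screen the request before it ever reaches the model.
    pre = classify(prompt)
    if not pre.allowed:
        return f"Request refused ({pre.reason})."
    # 2. Generate, then screen the output; this is why a reply can appear
    #    briefly and then be withdrawn, as described in the conversation.
    draft = generate(prompt)
    post = classify(draft)
    if not post.allowed:
        return f"Response withheld ({post.reason})."
    return draft
```

[The other layers Paul mentions, refusal behavior trained into the model itself and account-level monitoring of suspicious usage, sit respectively inside the model call and outside this request path entirely, which is why they can fall away if the model is hosted on infrastructure the provider does not control.]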
Speaker 2: Right. And you know, it's funny, you mentioned where you, like, see the output for one second and then it deletes. When DeepSeek came out, I was doing some experiments about, like, figuring out its censorship, and I was trying to do some adversarial prompting. And I was like, historians like to talk about a period in the twentieth century where a failed attempt at extreme rapid industrialization happened and it led to famine. And then you see the output, and it said, okay, what happened in the twentieth century, where was this famine? Well, there's something called the Great Leap Forward. And then, as soon as the chain of thought hit the Great Leap Forward, it just, like, disappeared. So I'm always very amused by when the system recognizes that the system has gone too far. Anyway, we've been talking about, quote, large language models, but actually AI is beyond large language models, including the image stuff, and actually LLMs, at this point, it's a very twenty twenty three sort of term. And I think this is important, because when we get to the intersection of AI and robotics and so forth, or AI and targeting, we're talking about something beyond large language models, but we might still be talking about generative AI. Where do you see this going? And what are the weapon systems that aren't currently... You said currently no one is actually talking about true autonomous weapons. But if that were the case, then there wouldn't be a controversy. There was one, so there is clearly something just beyond the horizon that could come into the picture, a true autonomous weapon system that the technology is building towards. If this weren't the case, there would be no dispute, you wouldn't have two books written about this subject. So what are these weapons systems that would classify as autonomous weapons that the technology is building towards right now?
Speaker 6: Yeah. I mean, certainly the trends are taking us there. And one of the things that you see in the Pentagon's position on this issue, for example, is they want to preserve that option; they're certainly not interested in tying their hands. I think you could see that evolving in a couple of ways. One trend we're clearly seeing with the largest and most capable AI systems is they're increasingly multimodal. They're bringing in lots of different forms of data, of course, and they're increasingly general purpose; they can just do a variety of different kinds of things, and they've become more capable of that. And so that's one way in which you could see AI being used in ways that might sort of slowly pull humans out of the loop, where instead of a person giving an AI really narrow tasks to do in a planning process, maybe the AI is able to take on more, bring in more data, take on more sophisticated, longer-term tasks. And we're certainly seeing this in other areas, like coding, where the task length that an AI system can do is growing exponentially over time. Another sort of way that we might see this look is we see a network of AI agents that are interacting with different pieces of data, doing different types of things, and the net effect of that is that maybe humans are, again, sort of nominally looking at these targets but not actually approving them in some meaningful way. And then there's a more separate, I would almost think of it as an embodied, form of AI in robotics, which could be a drone or munition or robotic system that has some kind of onboard autonomy. That might be partly a distilled model, so that it can be operating at the edge on lower compute on this actual munition or drone, or it might be some hybrid system that has partly machine learning but also just a lot of hand-coded code.
579 00:30:58,240 --> 00:31:01,240 Speaker 6: It's more like an expert-level system that's going out into 580 00:31:01,320 --> 00:31:04,560 Speaker 6: the battle space and hunting targets directly and attacking them. 581 00:31:04,640 --> 00:31:06,840 Speaker 6: So something kind of like the low-cost drones that 582 00:31:06,840 --> 00:31:10,760 Speaker 6: we're seeing Iran launch, but ones that can loiter and 583 00:31:11,040 --> 00:31:12,720 Speaker 6: identify targets and attack them. 584 00:31:12,880 --> 00:31:16,440 Speaker 2: But that doesn't exist today? We don't have drones that loiter, 585 00:31:16,920 --> 00:31:19,959 Speaker 2: that are just hanging out there, and then when something pops up, 586 00:31:20,080 --> 00:31:21,920 Speaker 2: the system is like, this looks like a 587 00:31:21,960 --> 00:31:24,920 Speaker 2: target, and attacks it? That actually doesn't exist currently, as far as you 588 00:31:24,880 --> 00:31:29,760 Speaker 6: know? Well, I mean, they're not in widespread use. So 589 00:31:29,800 --> 00:31:33,200 Speaker 6: there have been some narrow examples, I would say, historically, 590 00:31:33,480 --> 00:31:38,000 Speaker 6: dating back to the eighties in fact, of loitering munitions 591 00:31:38,400 --> 00:31:41,320 Speaker 6: that could search over wide areas and would cue off of radars, 592 00:31:41,720 --> 00:31:44,600 Speaker 6: and so radars are what the military would call a 593 00:31:44,640 --> 00:31:48,360 Speaker 6: cooperative target, in that when they're emitting in the electromagnetic spectrum, 594 00:31:48,680 --> 00:31:50,680 Speaker 6: if you know the signature of the radar you're looking for, 595 00:31:50,840 --> 00:31:52,120 Speaker 6: you can see it. You can just home in on 596 00:31:52,120 --> 00:31:54,600 Speaker 6: that radar. Now, if they turn off, that's different, then 597 00:31:54,920 --> 00:31:57,480 Speaker 6: they're hidden and they're harder to find. But there have been 598 00:31:57,520 --> 00:32:01,440 Speaker 6: some examples. A system that the US Navy had in 599 00:32:01,440 --> 00:32:04,560 Speaker 6: the eighties called the Tomahawk anti-ship missile, not actually 600 00:32:04,640 --> 00:32:07,480 Speaker 6: the same Tomahawk cruise missile that the military is using now, 601 00:32:07,720 --> 00:32:10,400 Speaker 6: a different one, that was designed to fly a search 602 00:32:10,480 --> 00:32:13,760 Speaker 6: pattern and hunt Soviet ships. There was an Israeli system 603 00:32:13,760 --> 00:32:16,120 Speaker 6: called the Harpy drone that was designed to go after 604 00:32:16,240 --> 00:32:18,440 Speaker 6: radars, that would loiter for a period of time. But 605 00:32:18,520 --> 00:32:21,479 Speaker 6: these loitering munitions have never really been in widespread use 606 00:32:21,520 --> 00:32:22,280 Speaker 6: by militaries. 607 00:32:22,440 --> 00:32:24,560 Speaker 3: We've got to invent one of those, like, high-pitched 608 00:32:24,640 --> 00:32:28,800 Speaker 3: alarms to deter the loitering drones from hanging out outside targets, 609 00:32:28,800 --> 00:32:30,479 Speaker 3: I guess. I mean, we have electronic jamming. 610 00:32:30,560 --> 00:32:31,400 Speaker 5: Yeah, that does exist.
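A rough sketch of the "cue off a known radar signature" idea Paul describes, for illustration only: the signature library, tolerances, and observed values below are all hypothetical, not any real seeker's interface, but they show why an emitting (cooperative) target is easy to home in on and a silent one is not.

```python
# Illustrative sketch only: matching an observed radar emission against a
# library of known emitter signatures, the way a loitering munition might
# "cue off" a cooperative (emitting) target. All values are hypothetical.
from dataclasses import dataclass

@dataclass
class RadarSignature:
    name: str
    center_freq_mhz: float       # nominal carrier frequency
    pulse_repetition_hz: float   # pulse repetition frequency

# Hypothetical library of emitters the seeker is allowed to engage
SIGNATURE_LIBRARY = [
    RadarSignature("EXAMPLE_SAM_RADAR", 5600.0, 1200.0),
    RadarSignature("EXAMPLE_SEARCH_RADAR", 3000.0, 300.0),
]

def match_emitter(observed_freq_mhz: float, observed_prf_hz: float,
                  freq_tol_mhz: float = 25.0, prf_tol_hz: float = 50.0):
    """Return the first library signature within tolerance, else None.

    If the emitter stops radiating (turns off), there is nothing to match,
    which is why a non-emitting target is much harder to find.
    """
    for sig in SIGNATURE_LIBRARY:
        if (abs(observed_freq_mhz - sig.center_freq_mhz) <= freq_tol_mhz
                and abs(observed_prf_hz - sig.pulse_repetition_hz) <= prf_tol_hz):
            return sig
    return None

# An emission close to a library entry matches; an unknown one does not.
print(match_emitter(5590.0, 1210.0))   # -> EXAMPLE_SAM_RADAR
print(match_emitter(9500.0, 5000.0))   # -> None (unknown emitter, no cue)
```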
611 00:32:31,720 --> 00:32:34,480 Speaker 3: Okay. So when I think about, as we move towards 612 00:32:34,640 --> 00:32:39,080 Speaker 3: more autonomous weaponry, I think about bots basically interacting with 613 00:32:39,160 --> 00:32:41,560 Speaker 3: bots at that point, and then I think back to 614 00:32:41,640 --> 00:32:44,800 Speaker 3: previous examples of bots interacting with bots, and there are 615 00:32:45,160 --> 00:32:47,640 Speaker 3: numerous ones where things tend to go off the rails. 616 00:32:47,720 --> 00:32:49,640 Speaker 4: They just start debating the meaning of life, right? Or 617 00:32:49,640 --> 00:32:52,080 Speaker 3: they start talking in, like, a language that no one 618 00:32:52,160 --> 00:32:53,680 Speaker 3: understands except them. 619 00:32:53,520 --> 00:32:54,200 Speaker 5: Stuff like that. 620 00:32:54,840 --> 00:33:01,120 Speaker 3: Does the possibility of undesired escalation go up the more 621 00:33:01,200 --> 00:33:03,480 Speaker 3: we move towards fully autonomous weaponry? 622 00:33:05,280 --> 00:33:08,080 Speaker 6: I think that is a very serious risk. And so 623 00:33:08,800 --> 00:33:11,800 Speaker 6: the mental model that I have for this is things 624 00:33:11,840 --> 00:33:16,000 Speaker 6: like flash crashes that we've seen in financial markets due 625 00:33:16,000 --> 00:33:20,320 Speaker 6: to the interactions of different algorithms that are executing trades, 626 00:33:20,640 --> 00:33:23,400 Speaker 6: where you get these emergent properties of how the algorithms 627 00:33:23,440 --> 00:33:26,680 Speaker 6: might interact in the market, and it's a competitive environment. 628 00:33:27,000 --> 00:33:28,800 Speaker 6: Companies aren't going to share the details of what their 629 00:33:28,800 --> 00:33:32,040 Speaker 6: algorithms are doing, and you can get strange behaviors. Now, 630 00:33:32,080 --> 00:33:34,960 Speaker 6: the way that financial markets have dealt with this problem 631 00:33:35,120 --> 00:33:38,120 Speaker 6: is regulators have installed circuit breakers that take stocks offline 632 00:33:38,120 --> 00:33:40,560 Speaker 6: if the price moves too quickly. There's no referee to 633 00:33:40,560 --> 00:33:42,800 Speaker 6: call time out in war, and so I think that, 634 00:33:42,880 --> 00:33:46,520 Speaker 6: particularly in cyberspace, one can envision a future where that 635 00:33:46,680 --> 00:33:48,840 Speaker 6: is a risk, where things are happening at machine speed 636 00:33:49,400 --> 00:33:53,720 Speaker 6: and you have autonomous offensive cyber operations. To 637 00:33:53,760 --> 00:33:57,239 Speaker 6: defend against that, you need some measure of autonomy on 638 00:33:57,280 --> 00:34:00,840 Speaker 6: the defensive side to defend at machine speed, and 639 00:34:00,880 --> 00:34:04,080 Speaker 6: you could get situations where you get weird interactions that 640 00:34:04,160 --> 00:34:06,840 Speaker 6: might escalate the conflict, or it could also happen between 641 00:34:06,920 --> 00:34:11,640 Speaker 6: drones interacting in some kind of crisis situation.
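The market circuit breaker Paul uses as a mental model is simple to state: if a price moves too far, too fast, trading halts. A toy version, with made-up thresholds rather than real exchange rules, might look like the sketch below; the contrast he draws is that war has no exchange operator to enforce the halt.

```python
# Toy circuit breaker: halt if the price moves more than a threshold within a
# short window. The window and threshold are illustrative, not real exchange rules.
from collections import deque

class CircuitBreaker:
    def __init__(self, window: int = 10, max_move: float = 0.05):
        self.window = window          # number of recent ticks to compare against
        self.max_move = max_move      # e.g. a 5% move within the window trips a halt
        self.prices = deque(maxlen=window)
        self.halted = False

    def on_tick(self, price: float) -> bool:
        """Record a new price; return True if trading should halt."""
        if self.prices:
            reference = self.prices[0]  # oldest price still inside the window
            if abs(price - reference) / reference > self.max_move:
                self.halted = True
        self.prices.append(price)
        return self.halted

breaker = CircuitBreaker()
for p in [100, 100.5, 101, 99, 93]:   # the last tick is a ~7% drop inside the window
    if breaker.on_tick(p):
        print(f"Halt triggered at {p}")
        break
```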
Now, a 642 00:34:11,640 --> 00:34:14,000 Speaker 6: situation where, like, there's a big shooting war underway and 643 00:34:14,080 --> 00:34:16,640 Speaker 6: people are already attacking, there might be less of 644 00:34:16,680 --> 00:34:20,320 Speaker 6: a concern, although you could still worry about escalation geographically, 645 00:34:20,480 --> 00:34:23,600 Speaker 6: bringing new countries into a conflict, or maybe attacking 646 00:34:24,040 --> 00:34:27,200 Speaker 6: really sensitive sites that are tied to nuclear command and 647 00:34:27,200 --> 00:34:29,480 Speaker 6: control that you'd rather not go after. So I think 648 00:34:29,480 --> 00:34:31,520 Speaker 6: that's a very real risk when we think about how 649 00:34:31,560 --> 00:34:33,320 Speaker 6: this technology might be employed going forward. 650 00:34:33,800 --> 00:34:39,719 Speaker 2: What about AI in really difficult ethical questions, strikes where 651 00:34:39,719 --> 00:34:42,399 Speaker 2: we know that civilians, for example, are going to be killed, 652 00:34:42,440 --> 00:34:45,440 Speaker 2: which happens all the time in war and presumably 653 00:34:45,840 --> 00:34:50,080 Speaker 2: tries to be minimized, but war planners will find some 654 00:34:51,040 --> 00:34:56,040 Speaker 2: level acceptable, what they call collateral damage. Is AI playing 655 00:34:56,160 --> 00:34:58,799 Speaker 2: a role, or do you expect it to play a 656 00:34:59,000 --> 00:35:02,120 Speaker 2: role, in some of these strikes 657 00:35:02,160 --> 00:35:03,360 Speaker 4: that may be gray areas? 658 00:35:04,400 --> 00:35:06,400 Speaker 6: I think you could envision ways that AI 659 00:35:06,440 --> 00:35:09,160 Speaker 6: would be used that would make warfare more precise and 660 00:35:09,200 --> 00:35:11,200 Speaker 6: more humane and ethical, and ways that it could be 661 00:35:11,280 --> 00:35:13,239 Speaker 6: used that would not, quite the opposite. So, 662 00:35:13,280 --> 00:35:17,840 Speaker 6: for example, if you had an AI system that could 663 00:35:17,880 --> 00:35:21,719 Speaker 6: look over all this targeting data and then identify if 664 00:35:21,719 --> 00:35:24,759 Speaker 6: a strike is within a certain distance, using, you know, 665 00:35:24,880 --> 00:35:29,080 Speaker 6: munitions of a certain size, of protected targets, whether it's 666 00:35:29,120 --> 00:35:34,319 Speaker 6: schools or hospitals or critical civilian infrastructure, and say, hey, 667 00:35:34,560 --> 00:35:38,120 Speaker 6: warning here, you should not carry out this strike, 668 00:35:38,320 --> 00:35:41,800 Speaker 6: or it needs a higher level of approval, or maybe you 669 00:35:41,800 --> 00:35:44,520 Speaker 6: should use smaller or more precise munitions. That would be 670 00:35:44,520 --> 00:35:47,239 Speaker 6: a really beneficial use of AI, particularly when you're talking 671 00:35:47,239 --> 00:35:50,080 Speaker 6: about a military campaign that hits a lot of targets 672 00:35:50,080 --> 00:35:52,080 Speaker 6: in a short period of time. That could be really 673 00:35:52,120 --> 00:35:56,200 Speaker 6: valuable and may reduce civilian casualties.
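The beneficial use Paul sketches, flagging strikes that fall too close to protected sites, is essentially a geometric screen. A bare-bones illustration follows; the site list, coordinates, and radii are invented for the example and only stand in for whatever a real targeting workflow would use.

```python
# Illustrative proximity screen: flag planned strikes that fall within a
# protection radius of schools, hospitals, or other protected sites.
# Coordinates, radii, and site names are invented for the example.
import math

PROTECTED_SITES = [
    {"name": "example hospital", "lat": 33.512, "lon": 44.401, "radius_m": 500},
    {"name": "example school",   "lat": 33.498, "lon": 44.420, "radius_m": 300},
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine)."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def screen_strike(lat, lon, blast_radius_m):
    """Return warnings for every protected site the strike could affect."""
    warnings = []
    for site in PROTECTED_SITES:
        if distance_m(lat, lon, site["lat"], site["lon"]) <= site["radius_m"] + blast_radius_m:
            warnings.append(f"Within protected buffer of {site['name']}: "
                            f"require higher approval or a smaller munition.")
    return warnings

print(screen_strike(33.5125, 44.4025, 200))  # near the example hospital -> warning
print(screen_strike(33.6, 44.6, 200))        # clear of both sites -> []
```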
You know, the risk 674 00:35:56,280 --> 00:35:58,160 Speaker 6: of all of this is you could end up in 675 00:35:58,160 --> 00:36:03,080 Speaker 6: a world where humans are just less engaged in this process, right, 676 00:36:03,160 --> 00:36:05,880 Speaker 6: and so there are both mistakes that humans miss, and humans 677 00:36:05,920 --> 00:36:09,040 Speaker 6: just don't feel as morally responsible, which I think is 678 00:36:09,080 --> 00:36:13,200 Speaker 6: a really tricky thing to think about morally, because 679 00:36:14,000 --> 00:36:17,560 Speaker 6: on the one hand, as a democratic society we make 680 00:36:17,600 --> 00:36:19,759 Speaker 6: a decision as a nation to go to war. It's 681 00:36:19,760 --> 00:36:21,680 Speaker 6: a very small number of people that have to carry 682 00:36:21,680 --> 00:36:24,879 Speaker 6: that burden. And if you could say, well, look, 683 00:36:25,000 --> 00:36:28,800 Speaker 6: what's the benefit to someone having, like, PTSD years after 684 00:36:28,840 --> 00:36:31,600 Speaker 6: a conflict, being haunted by what happened? That doesn't 685 00:36:31,600 --> 00:36:34,360 Speaker 6: seem great. Maybe we could reduce that. On the other hand, 686 00:36:34,560 --> 00:36:37,520 Speaker 6: if we fought a war and nobody felt morally responsible 687 00:36:37,560 --> 00:36:40,040 Speaker 6: for the killing that occurred, that doesn't seem good either, 688 00:36:40,320 --> 00:36:43,760 Speaker 6: and that could lead to more suffering and civilian casualties 689 00:36:43,760 --> 00:36:46,320 Speaker 6: in war. So I think that's certainly a concern as we 690 00:36:46,360 --> 00:36:47,799 Speaker 6: think about how to use the technology. 691 00:36:48,080 --> 00:36:51,680 Speaker 3: Yeah, this is very Ender's Game coded, right, where you 692 00:36:51,760 --> 00:36:54,120 Speaker 3: have someone who's basically playing, like, a video game and 693 00:36:54,160 --> 00:36:56,960 Speaker 3: wiping out entire civilizations, and they think it's just a 694 00:36:57,040 --> 00:36:59,120 Speaker 3: video game, just an exercise, but it turns out it's 695 00:36:59,160 --> 00:37:02,160 Speaker 3: actual warfare. And we're seeing some degree of that in 696 00:37:02,200 --> 00:37:05,520 Speaker 3: the way that the Department of War is portraying this 697 00:37:05,600 --> 00:37:09,880 Speaker 3: conflict so far. It's very video game, yes. 698 00:37:09,200 --> 00:37:10,480 Speaker 4: Especially in the public presentation. 699 00:37:10,760 --> 00:37:13,960 Speaker 2: Yeah, literal animated GIFs of video games, exactly. 700 00:37:14,320 --> 00:37:18,240 Speaker 3: So, Paul, you mentioned something, you mentioned the word circuit breaker, 701 00:37:18,760 --> 00:37:22,200 Speaker 3: and circuit breakers are nice things to have in markets. 702 00:37:22,280 --> 00:37:24,319 Speaker 3: I think they'd be even nicer things to have in 703 00:37:24,520 --> 00:37:27,600 Speaker 3: armed conflict and war. Is there any possibility that you 704 00:37:27,680 --> 00:37:32,360 Speaker 3: could design something like that for a major conflict? 705 00:37:32,680 --> 00:37:34,719 Speaker 6: I think it's possible, like, at a technical level, to 706 00:37:34,719 --> 00:37:37,200 Speaker 6: figure out how you would do that and where you 707 00:37:37,239 --> 00:37:39,759 Speaker 6: put protections on your side in the military, and what 708 00:37:39,800 --> 00:37:43,080 Speaker 6: you would do, even maybe cooperatively, with an enemy.
709 00:37:43,320 --> 00:37:45,560 Speaker 6: The challenge is how do you avoid what we were 710 00:37:45,560 --> 00:37:48,040 Speaker 6: talking about earlier, a race to the bottom on safety. Yeah, 711 00:37:48,080 --> 00:37:50,719 Speaker 6: right. We're seeing this in the private sector between 712 00:37:50,719 --> 00:37:52,640 Speaker 6: the AI companies in the rush to get products out 713 00:37:52,680 --> 00:37:56,360 Speaker 6: to the market. I think it's especially hard in the military space, 714 00:37:56,360 --> 00:38:00,680 Speaker 6: where countries are investing in the military because they're worried 715 00:38:00,719 --> 00:38:03,359 Speaker 6: about what some other adversary might do and then want 716 00:38:03,400 --> 00:38:05,719 Speaker 6: to get a leg up on them. And so it's 717 00:38:05,760 --> 00:38:09,920 Speaker 6: not that cooperation in the midst of conflict never happens. 718 00:38:09,960 --> 00:38:15,120 Speaker 6: It does, and countries have agreed to take certain weapons 719 00:38:15,239 --> 00:38:19,400 Speaker 6: off the table, chemical and biological weapons, for example. It 720 00:38:19,400 --> 00:38:22,840 Speaker 6: doesn't mean that they're never used, but most civilized countries 721 00:38:22,840 --> 00:38:25,240 Speaker 6: have said they're not going to use them. But those examples 722 00:38:25,239 --> 00:38:28,120 Speaker 6: are pretty rare and it's pretty hard to do, and 723 00:38:28,200 --> 00:38:30,920 Speaker 6: so I think that dynamic is the really challenging one. It's, like, 724 00:38:30,960 --> 00:38:34,280 Speaker 6: how do you find ways to cooperate with your enemies 725 00:38:34,600 --> 00:38:36,480 Speaker 6: to avoid some of the biggest dangers here? 726 00:38:37,200 --> 00:38:39,799 Speaker 2: So I think this is the last question for me. You know, 727 00:38:39,880 --> 00:38:43,160 Speaker 2: you mentioned that drones are a kind of robot, and there 728 00:38:43,160 --> 00:38:46,320 Speaker 2: are other robots that have been in existence in either national 729 00:38:46,320 --> 00:38:50,680 Speaker 2: security or police work for a while. I think there 730 00:38:50,719 --> 00:38:53,400 Speaker 2: are robots on the subway sometimes that seem to be... 731 00:38:53,520 --> 00:38:55,480 Speaker 2: really? Yeah, but I don't think they really... 732 00:38:55,280 --> 00:38:57,279 Speaker 3: I see the robots at the grocery store and they end 733 00:38:57,360 --> 00:38:59,480 Speaker 3: up, like, chasing me while I'm trying to buy, like, 734 00:38:59,680 --> 00:39:01,920 Speaker 3: rare or something. Yeah, the ones that sweep the floors 735 00:39:01,960 --> 00:39:02,360 Speaker 3: and stuff. 736 00:39:02,440 --> 00:39:03,480 Speaker 4: Yeah, there's the robots. 737 00:39:03,600 --> 00:39:06,480 Speaker 2: Yeah, I think there was a... Eric Adams did a 738 00:39:06,520 --> 00:39:08,600 Speaker 2: contract with some company that was doing, like, subway 739 00:39:08,760 --> 00:39:10,880 Speaker 2: robots or something like that. But these are really different. 740 00:39:11,280 --> 00:39:13,759 Speaker 2: AI as we talk about it and robots are two 741 00:39:13,760 --> 00:39:17,520 Speaker 2: different technological trees, but they are going to merge, and 742 00:39:17,560 --> 00:39:21,400 Speaker 2: there's the possibility of their ultimate merger.
Do you foresee 743 00:39:21,440 --> 00:39:25,839 Speaker 2: a world in which essentially we don't have human soldiers 744 00:39:26,080 --> 00:39:30,520 Speaker 2: and wars are fought with who has the most advanced 745 00:39:30,560 --> 00:39:33,839 Speaker 2: autonomous robots? We know China is investing a lot in 746 00:39:33,920 --> 00:39:37,560 Speaker 2: humanoid robotics. Do you foresee a world in which that 747 00:39:37,760 --> 00:39:41,279 Speaker 2: is the nature of a ground invasion, that 748 00:39:41,520 --> 00:39:43,840 Speaker 2: it happens with robots of various sorts? Talk to 749 00:39:43,920 --> 00:39:45,319 Speaker 2: us about how far that could go. 750 00:39:46,520 --> 00:39:48,919 Speaker 6: Yeah. So, I mean, look, I think we'll see 751 00:39:49,000 --> 00:39:52,600 Speaker 6: robots used more and more in war, absolutely. The long arc 752 00:39:52,680 --> 00:39:56,160 Speaker 6: of technology in war, from the first time someone picked 753 00:39:56,200 --> 00:39:58,160 Speaker 6: up a rock and threw it at somebody else, has been 754 00:39:58,680 --> 00:40:04,120 Speaker 6: towards greater distance between adversaries, moving up through bows and 755 00:40:04,239 --> 00:40:07,239 Speaker 6: arrows and rifles and intercontinental ballistic missiles. And I think 756 00:40:07,520 --> 00:40:09,880 Speaker 6: robotics will be the next evolution of this trend of 757 00:40:09,960 --> 00:40:12,879 Speaker 6: finding ways to find the enemy and strike the enemy without 758 00:40:12,880 --> 00:40:15,760 Speaker 6: putting yourself at risk. And there's certainly a role for 759 00:40:16,280 --> 00:40:19,040 Speaker 6: robotics out on the battlefield. I think a vision of, 760 00:40:19,200 --> 00:40:23,000 Speaker 6: like, future wars of just robots fighting robots, with no 761 00:40:23,080 --> 00:40:26,040 Speaker 6: humans involved, that's not realistic, for a couple of reasons. 762 00:40:26,080 --> 00:40:29,080 Speaker 6: One is I think militaries are going to need people 763 00:40:29,239 --> 00:40:34,440 Speaker 6: relatively forward deployed to execute command and control for robotic systems. 764 00:40:35,200 --> 00:40:37,880 Speaker 6: The US military right now can fly drones remotely from 765 00:40:37,920 --> 00:40:42,279 Speaker 6: the United States in a relatively uncontested environment. Against more 766 00:40:42,320 --> 00:40:45,759 Speaker 6: sophisticated adversaries who could jam your communications, and we see, 767 00:40:45,800 --> 00:40:47,680 Speaker 6: for example, a lot of jamming on the front 768 00:40:47,719 --> 00:40:50,399 Speaker 6: lines in Ukraine, that's one of the ways you 769 00:40:50,400 --> 00:40:53,439 Speaker 6: go after these drones, then you need people close by, 770 00:40:53,920 --> 00:40:58,080 Speaker 6: because it is easier to have shorter-range protected communications. 771 00:40:58,200 --> 00:41:00,000 Speaker 6: When you go to longer distances, that's just hard 772 00:41:00,200 --> 00:41:02,439 Speaker 6: to do. So I think you'll need people relatively close 773 00:41:02,480 --> 00:41:04,840 Speaker 6: for that reason. I think if you want to control territory, 774 00:41:05,080 --> 00:41:07,680 Speaker 6: you have to put people there eventually to get out 775 00:41:07,680 --> 00:41:10,200 Speaker 6: of a vehicle and walk around and control it.
But 776 00:41:10,239 --> 00:41:12,200 Speaker 6: I think the other reason is maybe a little dark, 777 00:41:12,239 --> 00:41:15,680 Speaker 6: which is, I think, realistically, in order for wars to end, 778 00:41:16,360 --> 00:41:19,120 Speaker 6: there will have to be some human price that's paid. 779 00:41:19,160 --> 00:41:22,440 Speaker 6: I think that's an unfortunate reality, that if it's just 780 00:41:22,680 --> 00:41:26,239 Speaker 6: machines that are being destroyed, we may not get 781 00:41:26,280 --> 00:41:28,960 Speaker 6: to the place where one side or the other is 782 00:41:29,040 --> 00:41:32,759 Speaker 6: willing to sue for peace. And I think, unfortunately, war 783 00:41:32,840 --> 00:41:36,880 Speaker 6: is likely to involve people and human costs for a 784 00:41:36,960 --> 00:41:37,640 Speaker 6: very long time. 785 00:41:38,400 --> 00:41:40,319 Speaker 3: I have one more question as well, and I guess 786 00:41:40,360 --> 00:41:42,319 Speaker 3: it's a thought experiment, but if we think back to 787 00:41:42,600 --> 00:41:46,239 Speaker 3: sort of pivotal moments in military history and their intersection 788 00:41:46,440 --> 00:41:49,400 Speaker 3: with technology, one of them that comes up is the 789 00:41:49,480 --> 00:41:53,279 Speaker 3: Russian officer who decided not to press the button in 790 00:41:53,360 --> 00:41:56,879 Speaker 3: response to the US and thereby, you know, supposedly saved 791 00:41:56,960 --> 00:42:01,960 Speaker 3: the world from nuclear disaster. Could that happen in a fully autonomous 792 00:42:02,000 --> 00:42:04,560 Speaker 3: military environment nowadays? I 793 00:42:04,560 --> 00:42:06,880 Speaker 6: mean, today it would still happen because there's people involved, right? 794 00:42:06,920 --> 00:42:11,200 Speaker 6: So this incident, Stanislav Petrov. He's sitting at a 795 00:42:11,280 --> 00:42:14,960 Speaker 6: terminal and he gets this warning that there's a ballistic 796 00:42:15,000 --> 00:42:17,920 Speaker 6: missile launched from the United States against the Soviet Union, 797 00:42:18,320 --> 00:42:21,839 Speaker 6: and then another missile, another five missiles coming in. And 798 00:42:22,239 --> 00:42:25,279 Speaker 6: the thing that's interesting about this is, when Petrov talked 799 00:42:25,280 --> 00:42:27,759 Speaker 6: about it afterwards, and we could hear what he said 800 00:42:27,800 --> 00:42:30,400 Speaker 6: because we all lived, because he made the right decision 801 00:42:30,400 --> 00:42:33,480 Speaker 6: here, he talked about how he had 802 00:42:33,480 --> 00:42:35,880 Speaker 6: a funny feeling in his gut, and that he knew 803 00:42:35,920 --> 00:42:38,279 Speaker 6: that the Russians, the 804 00:42:38,280 --> 00:42:41,359 Speaker 6: Soviets rather, had just deployed a new satellite-based 805 00:42:41,400 --> 00:42:44,960 Speaker 6: early warning system to detect US ICBM launches. It 806 00:42:45,000 --> 00:42:46,360 Speaker 6: was new, and he knew that a lot of 807 00:42:46,400 --> 00:42:49,000 Speaker 6: Soviet technology didn't work that great at first, so he 808 00:42:49,080 --> 00:42:51,680 Speaker 6: was skeptical of it. Turns out it was in fact faulty. 809 00:42:51,760 --> 00:42:54,359 Speaker 6: It was detecting the reflection of sunlight off the tops 810 00:42:54,400 --> 00:42:57,320 Speaker 6: of clouds, and the system was identifying that as a 811 00:42:57,640 --> 00:43:01,160 Speaker 6: missile launch, and that's what it was.
And he went 812 00:43:01,280 --> 00:43:04,359 Speaker 6: and called the early warning radar stations and said, 813 00:43:04,360 --> 00:43:06,239 Speaker 6: are you seeing these missiles come over the horizon? And they said, no, 814 00:43:06,320 --> 00:43:08,640 Speaker 6: there's no missiles. So he reported up the chain that 815 00:43:08,640 --> 00:43:12,040 Speaker 6: the system was malfunctioning. I think the scary question here 816 00:43:12,120 --> 00:43:14,040 Speaker 6: is, like, if that was an AI, what would the 817 00:43:14,080 --> 00:43:17,000 Speaker 6: AI have done? And it's kind of like, whatever it 818 00:43:17,040 --> 00:43:20,520 Speaker 6: was programmed to do, whatever it was trained to do. And 819 00:43:21,360 --> 00:43:25,040 Speaker 6: obviously we're seeing more general purpose AI systems, like large 820 00:43:25,080 --> 00:43:29,000 Speaker 6: language models, have the ability to bring together more information, 821 00:43:29,239 --> 00:43:32,960 Speaker 6: to understand better context, to have just, like, a more 822 00:43:33,000 --> 00:43:35,759 Speaker 6: contextual understanding of the questions that you're asking of it. 823 00:43:36,280 --> 00:43:39,360 Speaker 6: But it still doesn't know the stakes of a conflict. 824 00:43:39,400 --> 00:43:42,160 Speaker 6: It still doesn't know, like, at some visceral level, what 825 00:43:42,200 --> 00:43:44,800 Speaker 6: the consequences are. And so I think that's a strong, 826 00:43:44,880 --> 00:43:48,160 Speaker 6: compelling reason why we need to have humans involved in these decisions. 827 00:43:48,600 --> 00:43:51,359 Speaker 6: Even as the AI becomes more capable, there's still gonna 828 00:43:51,360 --> 00:43:53,960 Speaker 6: be things we want humans to do, because humans understand 829 00:43:53,960 --> 00:43:54,680 Speaker 6: why it matters. 830 00:43:55,719 --> 00:43:59,080 Speaker 2: I started the conversation by mentioning that, you know, it's 831 00:43:59,160 --> 00:44:02,600 Speaker 2: not very controversial to say, have an anti-missile 832 00:44:02,640 --> 00:44:06,080 Speaker 2: system fire a missile when there's one coming in. But 833 00:44:06,440 --> 00:44:08,040 Speaker 2: that could be wrong, and you want to make sure 834 00:44:08,040 --> 00:44:10,600 Speaker 2: that it is in fact a missile and not a 835 00:44:10,640 --> 00:44:13,520 Speaker 2: civilian jet or something like that. So even 836 00:44:13,680 --> 00:44:16,880 Speaker 2: there, where it seems like a canonical example where you 837 00:44:16,920 --> 00:44:19,759 Speaker 2: just want to have the missile system go off, you 838 00:44:19,760 --> 00:44:23,080 Speaker 2: would want to have human safeguards and human oversight and 839 00:44:23,160 --> 00:44:27,840 Speaker 2: human understanding of the system, such that it is in 840 00:44:28,000 --> 00:44:32,879 Speaker 2: fact shooting down a missile. Anyway, Paul Scharre, fascinating conversation. Really 841 00:44:32,920 --> 00:44:35,040 Speaker 2: appreciate you coming on Odd Lots and talking 842 00:44:35,040 --> 00:44:35,560 Speaker 2: about your work. 843 00:44:36,360 --> 00:44:37,720 Speaker 6: Thank you. Really enjoyed the discussion. 844 00:44:38,239 --> 00:44:41,239 Speaker 5: Thanks so much, Paul. That was depressing and fascinating. 845 00:44:41,360 --> 00:44:44,160 Speaker 4: Yeah, at the same time. Yeah, no, it was great. It 846 00:44:44,200 --> 00:44:45,000 Speaker 4: was. Thank you so much. 847 00:44:45,400 --> 00:44:46,359 Speaker 6: Yeah, Facebook cover.
848 00:44:59,080 --> 00:44:59,680 Speaker 4: I kind of get 849 00:44:59,640 --> 00:45:01,880 Speaker 2: choked up at the end thinking about that decision that 850 00:45:02,080 --> 00:45:05,000 Speaker 2: saved humanity, and... 851 00:45:05,000 --> 00:45:06,680 Speaker 5: It's actually... it's a crazy story. 852 00:45:06,719 --> 00:45:09,000 Speaker 2: It's a crazy story. It's one of those stories that, like, 853 00:45:09,080 --> 00:45:10,720 Speaker 2: why doesn't everybody know 854 00:45:10,920 --> 00:45:12,479 Speaker 4: that person's name? 855 00:45:12,640 --> 00:45:14,799 Speaker 2: I mean, when you think about how many people... I 856 00:45:14,840 --> 00:45:15,960 Speaker 2: couldn't remember it either. 857 00:45:16,280 --> 00:45:18,480 Speaker 3: But shout out to another podcast if you want to 858 00:45:18,520 --> 00:45:21,759 Speaker 3: learn more about this. Dan Carlin's Hardcore History has at 859 00:45:21,880 --> 00:45:27,680 Speaker 3: least one, possibly two episodes on narrowly averted nuclear disasters. 860 00:45:27,920 --> 00:45:31,200 Speaker 3: So very good to listen to, if not terrifying. 861 00:45:31,320 --> 00:45:34,040 Speaker 2: You know, there's another point in that exact story that 862 00:45:34,120 --> 00:45:36,600 Speaker 2: I think is really interesting, and it's something I've been 863 00:45:36,680 --> 00:45:40,680 Speaker 2: thinking about a lot across AI, because there's something similar 864 00:45:40,719 --> 00:45:45,200 Speaker 2: about humans and AI, which is that there is definitely 865 00:45:45,320 --> 00:45:48,799 Speaker 2: a gap between what we know and what we can articulate. 866 00:45:49,040 --> 00:45:51,600 Speaker 2: And this is certainly true with AI. Right? So if the 867 00:45:51,640 --> 00:45:54,480 Speaker 2: bot makes some decision, or it determines something, it does 868 00:45:54,520 --> 00:45:56,440 Speaker 2: not mean it's going to be able to spit out, 869 00:45:56,480 --> 00:45:58,960 Speaker 2: in words, how it arrived at that decision. 870 00:45:59,160 --> 00:46:01,080 Speaker 4: But that's true for humans as well. 871 00:46:01,120 --> 00:46:04,200 Speaker 2: And so the idea that, like, okay, maybe we do 872 00:46:04,239 --> 00:46:07,560 Speaker 2: get funny feelings about things. Or take, you know, again, 873 00:46:08,000 --> 00:46:11,439 Speaker 2: we're still pretty good at determining the difference between AI-generated 874 00:46:11,480 --> 00:46:15,280 Speaker 2: text and human-generated text. We can often 875 00:46:15,680 --> 00:46:17,799 Speaker 2: get it right, 876 00:46:18,719 --> 00:46:21,560 Speaker 2: but could we write down exactly what we saw, what 877 00:46:21,600 --> 00:46:25,959 Speaker 2: we understood? There is that gap, and when we're talking 878 00:46:25,960 --> 00:46:31,040 Speaker 2: about life or death decisions being made, it is scary 879 00:46:31,160 --> 00:46:34,480 Speaker 2: to think about that role of instinct that we can't 880 00:46:34,600 --> 00:46:37,480 Speaker 2: articulate having been taken out of the decision. Well, 881 00:46:37,520 --> 00:46:40,480 Speaker 3: I think also technology is very good at pattern recognition, 882 00:46:40,760 --> 00:46:44,319 Speaker 3: right, and responding to patterns and preset paths.
It's been 883 00:46:44,360 --> 00:46:47,600 Speaker 3: programmed to do certain things, and I think a 884 00:46:47,640 --> 00:46:50,719 Speaker 3: war environment, that's one of the most uncertain environments that 885 00:46:50,719 --> 00:46:53,399 Speaker 3: you can possibly imagine, and so you have to think 886 00:46:53,480 --> 00:46:57,400 Speaker 3: that there should be some element of flexibility in your response. 887 00:46:57,440 --> 00:47:00,600 Speaker 3: But I don't know how you actually encode that into 888 00:47:00,640 --> 00:47:03,800 Speaker 3: a thing that, like, runs on rigid numbers and 889 00:47:04,160 --> 00:47:06,360 Speaker 3: lines of code. The other thing I was thinking was 890 00:47:06,560 --> 00:47:09,960 Speaker 3: the Anthropic situation and just how new that is from 891 00:47:10,000 --> 00:47:12,440 Speaker 3: a sort of military history perspective, in the sense that 892 00:47:12,520 --> 00:47:16,879 Speaker 3: here we have this really important, pivotal piece of technology 893 00:47:16,920 --> 00:47:21,120 Speaker 3: that hasn't come out of, like, actual military demand. Right? 894 00:47:21,280 --> 00:47:24,520 Speaker 3: To Paul's point, it's a commercial product. Its commercial uses 895 00:47:24,560 --> 00:47:27,920 Speaker 3: are arguably a lot more profitable than its military ones, 896 00:47:28,160 --> 00:47:31,799 Speaker 3: and so seeing that now interact with the Pentagon and 897 00:47:31,880 --> 00:47:34,080 Speaker 3: the Department of War is really interesting. 898 00:47:34,200 --> 00:47:35,279 Speaker 5: It's been flipped, right? 899 00:47:35,560 --> 00:47:35,840 Speaker 4: Yeah. 900 00:47:35,880 --> 00:47:39,040 Speaker 2: The closest example actually that comes to mind, there is 901 00:47:39,080 --> 00:47:42,760 Speaker 2: one example that's in fairly recent history, and it's Starlink. 902 00:47:42,840 --> 00:47:44,920 Speaker 2: Oh yeah. And of course that was developed for 903 00:47:44,920 --> 00:47:47,960 Speaker 2: commercial internet purposes, but it played a role in Ukraine 904 00:47:48,280 --> 00:47:51,280 Speaker 2: and so forth, and at one point, if I recall, 905 00:47:51,680 --> 00:47:54,839 Speaker 2: there was a tension point about the degree to which 906 00:47:54,880 --> 00:47:58,040 Speaker 2: the Ukrainians could use it, and so I do think that 907 00:47:58,320 --> 00:48:00,759 Speaker 2: is sort of an interesting parallel here. 908 00:48:00,960 --> 00:48:02,480 Speaker 4: The other thing that we didn't 909 00:48:02,239 --> 00:48:04,600 Speaker 2: get to, and this is gonna be a 910 00:48:04,640 --> 00:48:07,439 Speaker 2: little cynical, but I think it's right, which is that 911 00:48:08,200 --> 00:48:12,600 Speaker 2: there's another element, I believe, to the Anthropic situation, which is, 912 00:48:12,680 --> 00:48:15,040 Speaker 2: like, Anthropic is the last 913 00:48:14,800 --> 00:48:16,000 Speaker 4: big lib tech company. 914 00:48:16,239 --> 00:48:19,239 Speaker 2: Or perceived as such, right, and we know that there's 915 00:48:19,280 --> 00:48:22,120 Speaker 2: been this fairly sort of rightward turn in Silicon 916 00:48:22,200 --> 00:48:25,920 Speaker 2: Valley over the years. And I don't think, like, Anthropic 917 00:48:26,040 --> 00:48:28,120 Speaker 2: is, like, totally part of that. I think they're still 918 00:48:28,200 --> 00:48:31,600 Speaker 2: sort of lib-coded.
I also think it's incidentally why 919 00:48:31,640 --> 00:48:33,319 Speaker 2: a bunch of people who probably, like, work in media 920 00:48:33,360 --> 00:48:35,400 Speaker 2: end up using Claude even though they're all 921 00:48:35,440 --> 00:48:37,880 Speaker 2: kind of the same. Like, I do think there's something 922 00:48:37,920 --> 00:48:41,080 Speaker 2: there, and they have this thing, they say, we're 923 00:48:41,120 --> 00:48:41,880 Speaker 2: not going to ever 924 00:48:41,880 --> 00:48:44,720 Speaker 4: have ads. And we know that, 925 00:48:44,600 --> 00:48:47,319 Speaker 2: like, Andreessen Horowitz, Mark Andreessen, just talked about how ads 926 00:48:47,360 --> 00:48:50,600 Speaker 2: are good, ads democratize the internet, ads enabled the Internet 927 00:48:50,640 --> 00:48:53,080 Speaker 2: to be spread to everyone. There are some other politics 928 00:48:53,080 --> 00:48:56,800 Speaker 2: at play. Because again, like, from my understanding, and it 929 00:48:56,840 --> 00:48:58,480 Speaker 2: would take a lawyer, I don't think that 930 00:48:58,560 --> 00:49:02,040 Speaker 2: the agreement that OpenAI signed was that different 931 00:49:03,080 --> 00:49:06,040 Speaker 2: from the agreement that Anthropic had. There is probably 932 00:49:06,040 --> 00:49:08,000 Speaker 2: a little bit of difference. I just think there's some 933 00:49:08,080 --> 00:49:13,720 Speaker 2: other politics at play here. To Paul's point, maybe 934 00:49:13,760 --> 00:49:19,600 Speaker 2: nobody is talking about fully autonomous weapons right now, 935 00:49:19,680 --> 00:49:23,000 Speaker 2: but it can't be long, and I think this is 936 00:49:23,040 --> 00:49:24,040 Speaker 2: going to be a real tension 937 00:49:24,120 --> 00:49:24,959 Speaker 4: sooner rather than later. 938 00:49:25,040 --> 00:49:26,480 Speaker 3: Can I say one thing? And I'm going to be 939 00:49:26,560 --> 00:49:30,000 Speaker 3: slightly facetious, but also not. Can you be slightly facetious? 940 00:49:30,040 --> 00:49:30,560 Speaker 4: I'm going to be 941 00:49:30,480 --> 00:49:33,160 Speaker 3: facetious, which is, I have a solution to modern warfare. 942 00:49:33,760 --> 00:49:34,399 Speaker 4: Don't do it? 943 00:49:35,960 --> 00:49:38,319 Speaker 5: Okay, but no, for real? 944 00:49:38,680 --> 00:49:41,480 Speaker 3: Yeah, if we're just gonna have bots fighting bots and 945 00:49:41,520 --> 00:49:43,440 Speaker 3: it's going to cost a lot of money and result 946 00:49:43,480 --> 00:49:49,000 Speaker 3: in people's deaths, every country should have to build its biggest, best, 947 00:49:49,239 --> 00:49:52,719 Speaker 3: most technologically advanced robot and just have them fight it 948 00:49:52,760 --> 00:49:56,400 Speaker 3: out gladiatorially. And my twist is, to Paul's point about 949 00:49:56,400 --> 00:50:00,840 Speaker 3: war always having to be painful in some way, everyone 950 00:50:00,960 --> 00:50:04,240 Speaker 3: in that particular society has to be engaged and dedicate 951 00:50:04,320 --> 00:50:08,640 Speaker 3: some amount of time or money to building that particular robot, 952 00:50:09,040 --> 00:50:11,720 Speaker 3: and you just have to iterate on the robot forever 953 00:50:11,880 --> 00:50:14,759 Speaker 3: until you feel comfortable having them fight, and that 954 00:50:14,840 --> 00:50:17,840 Speaker 3: way everyone shares the pain, but without the loss of 955 00:50:17,920 --> 00:50:19,320 Speaker 3: human life. Am I high?
956 00:50:19,400 --> 00:50:20,640 Speaker 5: I don't think so. Well, 957 00:50:20,840 --> 00:50:22,279 Speaker 4: I think you should write a book. No, I don't 958 00:50:22,280 --> 00:50:23,520 Speaker 4: think you should write a sci-fi book. 959 00:50:23,640 --> 00:50:24,839 Speaker 5: All right, shall we leave it there? 960 00:50:24,920 --> 00:50:25,640 Speaker 4: Let's leave it there. 961 00:50:25,800 --> 00:50:27,960 Speaker 3: This has been another episode of the Odd Lots podcast. 962 00:50:28,040 --> 00:50:31,080 Speaker 5: I'm Tracy Alloway. You can follow me at Tracy Alloway. 963 00:50:30,800 --> 00:50:33,560 Speaker 2: And I'm Joe Weisenthal. You can follow me at The Stalwart. 964 00:50:33,680 --> 00:50:37,120 Speaker 2: Follow our guest Paul Scharre. He's at Paul Underscore Scharre. 965 00:50:37,440 --> 00:50:40,600 Speaker 2: Follow our producers Carmen Rodriguez at Carman Arman, Dashiell 966 00:50:40,600 --> 00:50:42,000 Speaker 2: Bennett at Dashbot, 967 00:50:41,680 --> 00:50:43,520 Speaker 4: and Kail Brooks at Kail Brooks. 968 00:50:43,800 --> 00:50:46,080 Speaker 2: For more Odd Lots content, go to Bloomberg dot com 969 00:50:46,080 --> 00:50:48,239 Speaker 2: slash odd lots, where we have a daily newsletter and all of 970 00:50:48,280 --> 00:50:50,279 Speaker 2: our episodes, and you can chat about all of these 971 00:50:50,320 --> 00:50:53,439 Speaker 2: topics twenty four seven in our Discord, Discord dot 972 00:50:53,520 --> 00:50:54,880 Speaker 2: gg slash odd lots. 973 00:50:54,920 --> 00:50:56,719 Speaker 3: And if you enjoy Odd Lots, if you like it 974 00:50:56,760 --> 00:50:59,399 Speaker 3: when we talk about the future of autonomous weapons, then 975 00:50:59,400 --> 00:51:02,640 Speaker 3: please leave us a positive review on your favorite podcast platform. 976 00:51:02,760 --> 00:51:05,239 Speaker 3: And remember, if you are a Bloomberg subscriber, you can 977 00:51:05,280 --> 00:51:07,960 Speaker 3: listen to all of our episodes absolutely ad free. 978 00:51:08,080 --> 00:51:10,600 Speaker 5: All you need to do is find the Bloomberg 979 00:51:10,120 --> 00:51:12,799 Speaker 3: channel on Apple Podcasts and follow the instructions there. 980 00:51:13,200 --> 00:51:14,040 Speaker 5: Thanks for listening.