1 00:00:02,520 --> 00:00:07,400 Speaker 1: Bloomberg Audio Studios, podcasts, radio news.
2 00:00:07,840 --> 00:00:11,200 Speaker 2: AI in full focus this morning. The standoff between Anthropic 3 00:00:11,480 --> 00:00:14,360 Speaker 2: and the Pentagon over military safeguards is ramping up. The 4 00:00:14,400 --> 00:00:17,160 Speaker 2: tech giant facing a deadline today to accept conditions or 5 00:00:17,239 --> 00:00:21,239 Speaker 2: be blacklisted. The Anthropic CEO Dario Amodei saying in a statement, 6 00:00:21,360 --> 00:00:24,239 Speaker 2: quote, these threats do not change our position. We cannot 7 00:00:24,239 --> 00:00:26,680 Speaker 2: in good conscience accede to their request. The US 8 00:00:26,760 --> 00:00:29,760 Speaker 2: Under Secretary of Defense, Emil Michael, responding to the statement, 9 00:00:29,800 --> 00:00:33,320 Speaker 2: calling Amodei a liar with a God complex. The Under 10 00:00:33,320 --> 00:00:36,280 Speaker 2: Secretary joins us now for more. Under Secretary Michael, welcome 11 00:00:36,320 --> 00:00:38,720 Speaker 2: to the program, sir. Calling someone a liar is pretty 12 00:00:38,720 --> 00:00:40,400 Speaker 2: strong language. What's he lying about?
13 00:00:40,800 --> 00:00:45,159 Speaker 1: Well, what happened is we've been negotiating in good faith 14 00:00:45,760 --> 00:00:49,440 Speaker 1: on the Department of War side for about three months, and 15 00:00:49,560 --> 00:00:53,640 Speaker 1: we were working pretty diligently, and we sent over a proposal 16 00:00:53,800 --> 00:00:57,280 Speaker 1: that we thought made a lot of concessions to the 17 00:00:57,400 --> 00:01:01,400 Speaker 1: language that Anthropic wanted, and then, you know, without any notice, 18 00:01:01,800 --> 00:01:05,400 Speaker 1: when we thought we were getting close, they published an article 19 00:01:05,920 --> 00:01:09,160 Speaker 1: saying that they were breaking off talks well before the deadline, 20 00:01:09,200 --> 00:01:13,200 Speaker 1: which is generally not good partner-oriented practice, if you 21 00:01:13,240 --> 00:01:14,160 Speaker 1: will.
22 00:01:14,240 --> 00:01:16,200 Speaker 2: So, Emil, this is what they're asking for, as far as we understand, 23 00:01:16,400 --> 00:01:18,399 Speaker 2: and let me share this with the audience. It doesn't 24 00:01:18,400 --> 00:01:21,280 Speaker 2: want its technology used for surveillance of US citizens or 25 00:01:21,360 --> 00:01:24,119 Speaker 2: for autonomous lethal strikes without a human in the loop. 26 00:01:24,600 --> 00:01:26,520 Speaker 2: That's the ask. What were the concessions?
27 00:01:27,959 --> 00:01:32,040 Speaker 1: Yeah, the concessions are pretty simple. We agreed in writing 28 00:01:32,800 --> 00:01:37,960 Speaker 1: to ensure that the Department of War was following all laws 29 00:01:38,120 --> 00:01:41,880 Speaker 1: and regulations, including the National Security Act of nineteen forty-seven, 30 00:01:42,280 --> 00:01:45,720 Speaker 1: the FISA Act, and all other applicable laws and regulations, 31 00:01:45,920 --> 00:01:49,720 Speaker 1: because mass surveillance of Americans is already illegal, so we 32 00:01:49,720 --> 00:01:53,480 Speaker 1: were offering to put all of that language in and affirm 33 00:01:53,840 --> 00:01:56,320 Speaker 1: that we were following all laws in the contract.
When 34 00:01:56,320 --> 00:01:59,440 Speaker 1: it comes to autonomous weapons, similarly, we said we'll follow 35 00:01:59,480 --> 00:02:03,280 Speaker 1: all laws, including a DoD Directive that's been in place 36 00:02:03,280 --> 00:02:07,520 Speaker 1: for years that governs how we would use any such weapons. 37 00:02:07,920 --> 00:02:10,720 Speaker 1: And then we affirmed that there would be human oversight 38 00:02:11,639 --> 00:02:15,400 Speaker 1: over every part of the 39 00:02:15,440 --> 00:02:19,280 Speaker 1: development process, engagement process, or use of autonomous weapons. 40 00:02:19,680 --> 00:02:22,600 Speaker 1: And he didn't like the words, I guess, "as appropriate" 41 00:02:22,720 --> 00:02:26,200 Speaker 1: at the end of that sentence. But we believe 42 00:02:26,800 --> 00:02:30,040 Speaker 1: we conceded to all their substantive demands. So it was 43 00:02:30,080 --> 00:02:33,600 Speaker 1: surprising that out of nowhere they'd cut off negotiations.
44 00:02:33,720 --> 00:02:36,679 Speaker 3: So it appears that your differences are actually minor. Would 45 00:02:36,760 --> 00:02:37,480 Speaker 3: that be accurate?
46 00:02:39,080 --> 00:02:42,000 Speaker 1: Yeah. That's what was surprising: usually if your differences 47 00:02:42,000 --> 00:02:43,560 Speaker 1: are minor, you get in a room, you try to 48 00:02:43,600 --> 00:02:47,440 Speaker 1: hash them out, and instead, without any notice, you know, he 49 00:02:47,560 --> 00:02:52,480 Speaker 1: publishes something about his conscience and then doesn't engage. And 50 00:02:52,720 --> 00:02:55,600 Speaker 1: it was difficult to understand why, because we were working 51 00:02:55,680 --> 00:02:59,160 Speaker 1: pretty diligently on this, and we were at the final 52 00:02:59,200 --> 00:03:02,320 Speaker 1: stages, a few words here or there, where we agreed 53 00:03:02,320 --> 00:03:04,760 Speaker 1: to what they wanted in substance. So it was 54 00:03:05,200 --> 00:03:09,400 Speaker 1: very surprising, given that we negotiate with hundreds of technology 55 00:03:09,440 --> 00:03:11,720 Speaker 1: companies and this is the only one we've ever had 56 00:03:11,720 --> 00:03:12,600 Speaker 1: that behavior from.
57 00:03:12,840 --> 00:03:15,320 Speaker 3: Are you still weighing using the Defense Production Act to 58 00:03:15,400 --> 00:03:19,119 Speaker 3: compel Anthropic to basically have to provide its product? Or potentially 59 00:03:19,240 --> 00:03:22,639 Speaker 3: you're still weighing making it a supply chain risk, which 60 00:03:23,160 --> 00:03:26,840 Speaker 3: we heard from Anthropic, saying that these are almost contradictory proposals.
61 00:03:28,760 --> 00:03:32,839 Speaker 1: They're two different things, and I think, depending on how 62 00:03:32,840 --> 00:03:37,160 Speaker 1: today goes, at five o'clock the Secretary of War, Pete Hegseth, 63 00:03:37,200 --> 00:03:40,080 Speaker 1: gets to make the decision on how to reply. I've 64 00:03:40,240 --> 00:03:44,720 Speaker 1: maintained my openness to continue dialogue through that deadline. But 65 00:03:44,800 --> 00:03:47,240 Speaker 1: they seem to have launched sort of a PR campaign 66 00:03:47,240 --> 00:03:51,440 Speaker 1: that was planned well before these negotiations sort of restarted 67 00:03:51,520 --> 00:03:56,800 Speaker 1: on Tuesday. So I don't know. Their behavior is frankly unpredictable. 68 00:03:56,840 --> 00:03:58,080 Speaker 1: I'm not sure what to expect.
69 00:03:58,440 --> 00:04:01,640 Speaker 3: So do you plan on having more talks today, or 70 00:04:01,720 --> 00:04:03,720 Speaker 3: does a decision just need to be made by today?
71 00:04:04,880 --> 00:04:09,280 Speaker 1: I offered more talks, so long as they're in 72 00:04:09,320 --> 00:04:13,560 Speaker 1: good faith. We're always open to talks. And we set 73 00:04:13,720 --> 00:04:17,159 Speaker 1: a deadline, and we meant the deadline, and up until 74 00:04:17,160 --> 00:04:19,279 Speaker 1: that deadline, I'm open to more talks, and I told 75 00:04:19,279 --> 00:04:19,840 Speaker 1: them so.
76 00:04:19,839 --> 00:04:22,159 Speaker 3: So what does happen at five oh one today 77 00:04:22,240 --> 00:04:25,719 Speaker 3: if there's no agreement between the DoD and Anthropic?
78 00:04:26,800 --> 00:04:30,359 Speaker 1: That's up to Secretary Hegseth. We have some courses of action 79 00:04:30,480 --> 00:04:34,800 Speaker 1: that he's considering. And ultimately this comes down to the 80 00:04:34,800 --> 00:04:38,560 Speaker 1: war fighter, right. It comes down to, for any AI 81 00:04:38,640 --> 00:04:42,920 Speaker 1: system we might use, are we using it to protect 82 00:04:43,120 --> 00:04:46,000 Speaker 1: our war fighters in the right way? Are we using 83 00:04:46,040 --> 00:04:48,520 Speaker 1: it to sort of give them the best tools to 84 00:04:48,560 --> 00:04:50,960 Speaker 1: be efficient and to be lethal when they have to 85 00:04:50,960 --> 00:04:55,440 Speaker 1: be lethal? And that's the primary thing for Secretary Hegseth, 86 00:04:55,920 --> 00:04:58,760 Speaker 1: and we told them that. Ultimately, at the end of 87 00:04:58,760 --> 00:05:02,680 Speaker 1: the day, we follow all laws, but 88 00:05:02,920 --> 00:05:06,520 Speaker 1: we can't let any one company stand between us and 89 00:05:06,720 --> 00:05:10,239 Speaker 1: the war fighter, because they don't make the rules. Congress 90 00:05:10,279 --> 00:05:13,320 Speaker 1: makes the rules, the President signs them, we execute them, 91 00:05:13,360 --> 00:05:14,359 Speaker 1: and we do so safely.
92 00:05:14,400 --> 00:05:16,320 Speaker 3: Well, speaking of the President, has he weighed in on 93 00:05:16,400 --> 00:05:19,919 Speaker 3: this specifically? Have there been discussions with him and Secretary 94 00:05:20,000 --> 00:05:21,600 Speaker 3: Hegseth, and yourself as well?
95 00:05:23,720 --> 00:05:25,880 Speaker 1: This has all been internal to the Department of War 96 00:05:25,960 --> 00:05:27,120 Speaker 1: so far.
97 00:05:27,200 --> 00:05:29,920 Speaker 3: Okay. And then when it comes to potentially the other AI 98 00:05:30,040 --> 00:05:33,760 Speaker 3: negotiations underway, what is going on right now with xAI 99 00:05:34,200 --> 00:05:37,120 Speaker 3: and getting Grok on classified networks?
100 00:05:38,279 --> 00:05:42,680 Speaker 1: Well, I think the smart approach, when I came 101 00:05:42,720 --> 00:05:45,200 Speaker 1: in at the Department of War about nine months ago, 102 00:05:45,720 --> 00:05:47,800 Speaker 1: was to look at what we were using AI for, and 103 00:05:47,839 --> 00:05:50,800 Speaker 1: it was some pretty minimal use cases.
And given the 104 00:05:50,839 --> 00:05:53,680 Speaker 1: power of the technology, the potential power to do good 105 00:05:54,640 --> 00:05:57,560 Speaker 1: for the US military, both from an efficiency standpoint and 106 00:05:57,600 --> 00:05:59,920 Speaker 1: a strength standpoint, I wanted to make sure we have 107 00:06:00,080 --> 00:06:02,320 Speaker 1: a lot of options. So we went around and we've 108 00:06:02,360 --> 00:06:08,800 Speaker 1: launched Google for unclassified networks, we've signed xAI for classified 109 00:06:08,839 --> 00:06:11,520 Speaker 1: and non-classified networks, and we want to continue to 110 00:06:11,560 --> 00:06:15,239 Speaker 1: provide options to all of our components here at DoW, 111 00:06:15,680 --> 00:06:17,479 Speaker 1: and that's what we'll continue to do. And it's just 112 00:06:17,600 --> 00:06:20,159 Speaker 1: smart to have more than one option so that we 113 00:06:20,200 --> 00:06:22,479 Speaker 1: can see the strengths and weaknesses of each model and 114 00:06:22,600 --> 00:06:26,239 Speaker 1: learn from them as the AI revolution begins here.
115 00:06:26,279 --> 00:06:30,680 Speaker 3: Under Secretary Michael, is lethal autonomy really so critical for 116 00:06:30,920 --> 00:06:32,320 Speaker 3: future national security?
117 00:06:34,520 --> 00:06:36,400 Speaker 1: It is. I mean, if you think about it from 118 00:06:36,480 --> 00:06:40,320 Speaker 1: a defense standpoint, whether it's a drone swarm that's coming 119 00:06:40,320 --> 00:06:45,000 Speaker 1: at a military base, whether it's a hypersonic missile coming 120 00:06:45,000 --> 00:06:48,719 Speaker 1: at the United States, where the reaction time, against 121 00:06:48,760 --> 00:06:53,080 Speaker 1: sort of however many weapons are coming at you, means you 122 00:06:53,240 --> 00:06:55,159 Speaker 1: want to be able to take them down 123 00:06:55,640 --> 00:07:00,480 Speaker 1: potentially faster than a human could alone, if that's how 124 00:07:00,520 --> 00:07:03,000 Speaker 1: it's done. And we're learning from the Russia-Ukraine war, 125 00:07:03,080 --> 00:07:05,560 Speaker 1: with the drone swarms and so on, and with the 126 00:07:05,600 --> 00:07:08,400 Speaker 1: new weapons that have been developed all over the world 127 00:07:08,440 --> 00:07:10,760 Speaker 1: that change the game of warfare, 128 00:07:11,640 --> 00:07:14,520 Speaker 1: that we've got to respond and defend ourselves in any 129 00:07:14,520 --> 00:07:16,800 Speaker 1: way we can. The question is, do you have a 130 00:07:16,880 --> 00:07:20,080 Speaker 1: human on the loop to make sure that we're monitoring 131 00:07:20,080 --> 00:07:23,600 Speaker 1: these systems? And that's what we proposed in writing, in language 132 00:07:23,640 --> 00:07:26,560 Speaker 1: that we always have human oversight over these things. But 133 00:07:26,640 --> 00:07:28,840 Speaker 1: they're necessary given what's happening in the world.
134 00:07:29,400 --> 00:07:34,720 Speaker 2: Clearly, the technology these companies are producing is tremendously powerful. Yet 135 00:07:34,760 --> 00:07:37,400 Speaker 2: you believe this individual is both a liar and has 136 00:07:37,400 --> 00:07:40,280 Speaker 2: a god complex. How concerned should the American public be 137 00:07:41,160 --> 00:07:43,960 Speaker 2: about individuals like this running companies like this, if that's 138 00:07:43,960 --> 00:07:45,760 Speaker 2: what you believe this person is?
139 00:07:47,200 --> 00:07:50,560 Speaker 1: Yeah, I think there's some concern when, you know, 140 00:07:50,560 --> 00:07:55,280 Speaker 1: you have leaders of some of these companies talking about 141 00:07:55,480 --> 00:08:00,720 Speaker 1: unemploying seventy million Americans, the lawsuits they're under for scraping 142 00:08:00,840 --> 00:08:05,800 Speaker 1: content from content publishers, the billion-dollar lawsuits against 143 00:08:05,840 --> 00:08:10,119 Speaker 1: them for using that content to make profits. And then really 144 00:08:10,120 --> 00:08:14,400 Speaker 1: what's concerning is them making their own policies that sort of 145 00:08:15,360 --> 00:08:18,880 Speaker 1: could sit on top of democratic policies that are 146 00:08:18,960 --> 00:08:21,520 Speaker 1: voted on by the people, passed by Congress, signed by 147 00:08:21,520 --> 00:08:24,480 Speaker 1: the President. You do have to worry: are they 148 00:08:24,520 --> 00:08:27,040 Speaker 1: taking it too far? Are they trying to 149 00:08:27,160 --> 00:08:30,880 Speaker 1: impose their own views on the American people in an 150 00:08:31,000 --> 00:08:33,559 Speaker 1: undemocratic way? So I think those things are going to 151 00:08:33,600 --> 00:08:36,360 Speaker 1: be things we're grappling with in society for the next 152 00:08:36,360 --> 00:08:38,480 Speaker 1: several years as these companies get bigger and bigger.
153 00:08:38,559 --> 00:08:40,480 Speaker 2: We're looking forward to having a conversation with you, sir. 154 00:08:40,559 --> 00:08:43,040 Speaker 2: Thanks for making time for us this morning. Emil Michael there, 155 00:08:43,200 --> 00:08:45,000 Speaker 2: the Under Secretary of Defense.