1 00:00:09,840 --> 00:00:12,880 Speaker 1: Yo, what's going on? It's Dexter. So technology does not 2 00:00:13,039 --> 00:00:15,920 Speaker 1: slow down for anybody. And sometimes I wish that I 3 00:00:15,920 --> 00:00:18,520 Speaker 1: could just talk to y'all live about what's happening right now. 4 00:00:19,160 --> 00:00:21,120 Speaker 1: But then we were talking about it, we realized, wait 5 00:00:21,120 --> 00:00:24,000 Speaker 1: a second, we can talk to you live. So we're 6 00:00:24,000 --> 00:00:26,319 Speaker 1: gonna try something new here. Every once in a while, 7 00:00:26,320 --> 00:00:28,639 Speaker 1: we're gonna talk about something that's happening right now and 8 00:00:28,720 --> 00:00:31,400 Speaker 1: let people jump into the audience to ask questions, and 9 00:00:31,440 --> 00:00:33,720 Speaker 1: we'll bring a guest along. And so what you're about 10 00:00:33,760 --> 00:00:36,240 Speaker 1: to listen to is our first episode that we live 11 00:00:36,280 --> 00:00:39,239 Speaker 1: streamed on Wednesday, March third. So if you want to 12 00:00:39,240 --> 00:00:41,440 Speaker 1: catch the next one as we're recording it and maybe 13 00:00:41,440 --> 00:00:44,199 Speaker 1: even ask some questions, you can subscribe to our YouTube 14 00:00:44,240 --> 00:00:46,280 Speaker 1: page, and you can catch the links to that in 15 00:00:46,360 --> 00:00:57,200 Speaker 1: the description. So everyone, this is, we're doing something different here. 16 00:00:57,240 --> 00:01:01,319 Speaker 1: We're doing the first live episode of Kill Switch. We're 17 00:01:01,360 --> 00:01:03,720 Speaker 1: gonna probably try to do these maybe once a month, 18 00:01:04,200 --> 00:01:06,480 Speaker 1: just to give people the opportunity to actually jump into the 19 00:01:06,560 --> 00:01:10,560 Speaker 1: chat and talk about stuff that is happening right now. 20 00:01:10,640 --> 00:01:13,360 Speaker 1: So thanks for bearing with us while we experiment a 21 00:01:13,400 --> 00:01:17,160 Speaker 1: little bit. But today what I'd like to talk about 22 00:01:17,520 --> 00:01:21,160 Speaker 1: is what is happening with Anthropic, what's going on with 23 00:01:21,200 --> 00:01:23,760 Speaker 1: OpenAI, what's going on with the US government. To 24 00:01:23,800 --> 00:01:26,440 Speaker 1: talk about it today, our guest is Will Knight, who's 25 00:01:26,480 --> 00:01:29,360 Speaker 1: a senior writer at Wired. He's been writing about AI 26 00:01:29,480 --> 00:01:32,360 Speaker 1: for a while. He's got a weekly AI newsletter called 27 00:01:32,400 --> 00:01:35,040 Speaker 1: AI Lab. Will, welcome to Kill Switch. 28 00:01:35,080 --> 00:01:37,280 Speaker 2: Well, thank you for having me. I'm happy to 29 00:01:37,280 --> 00:01:40,120 Speaker 2: be on the first experimental live stream. 30 00:01:40,480 --> 00:01:43,679 Speaker 1: Yeah, hey, look, we'll see. Everything's an experiment at this point, 31 00:01:43,720 --> 00:01:48,080 Speaker 1: for better or for worse. Sure. So I want to 32 00:01:48,160 --> 00:01:51,520 Speaker 1: start this off with a timeline, okay. On the twenty seventh, 33 00:01:52,080 --> 00:01:55,280 Speaker 1: Donald Trump gets on Truth Social and says that Claude 34 00:01:55,320 --> 00:01:59,480 Speaker 1: is a radical left woke company. Okay, yeah, out of nowhere. 35 00:02:00,040 --> 00:02:02,720 Speaker 1: People are like, what, why is he saying this?
Pete Hegseth 36 00:02:02,800 --> 00:02:06,840 Speaker 1: gets on Twitter and says that Claude, Anthropic, the 37 00:02:06,840 --> 00:02:10,200 Speaker 1: company that makes Claude, is a supply chain risk, which 38 00:02:10,200 --> 00:02:12,320 Speaker 1: most people have never thought about what that even means. 39 00:02:13,000 --> 00:02:17,040 Speaker 1: The next day, OpenAI, who makes ChatGPT, announces 40 00:02:17,080 --> 00:02:19,680 Speaker 1: that they're working with the Department of War, and then 41 00:02:19,800 --> 00:02:23,880 Speaker 1: Donald Trump announces that we're bombing Iran. As I say 42 00:02:23,919 --> 00:02:27,480 Speaker 1: that, that sounds like a completely unhinged sequence of events. 43 00:02:28,360 --> 00:02:30,880 Speaker 1: Please tell me where I'm wrong, because I must be 44 00:02:30,919 --> 00:02:31,640 Speaker 1: missing something. 45 00:02:32,120 --> 00:02:34,320 Speaker 2: I mean, I think I think there's a lot that's 46 00:02:34,960 --> 00:02:37,840 Speaker 2: that's right in there and a lot that's unhinged. But 47 00:02:38,639 --> 00:02:40,639 Speaker 2: I mean to sort of set the scene a little bit, 48 00:02:41,040 --> 00:02:43,920 Speaker 2: you know. So this goes back. So the use of 49 00:02:44,240 --> 00:02:49,440 Speaker 2: Claude in classified military systems has its origins last year. 50 00:02:49,520 --> 00:02:52,799 Speaker 2: Last summer, Anthropic was the first to get involved in 51 00:02:53,160 --> 00:02:55,480 Speaker 2: working with the military. You know, they've pursued a very 52 00:02:55,880 --> 00:02:58,519 Speaker 2: business-focused approach, and I think that sort of led them 53 00:02:58,520 --> 00:03:00,440 Speaker 2: to focus on the DoD, which is like one 54 00:03:00,480 --> 00:03:02,840 Speaker 2: of the biggest businesses in the US. So they ended 55 00:03:02,919 --> 00:03:05,880 Speaker 2: up supplying Claude, their model, a version of it which 56 00:03:05,960 --> 00:03:11,400 Speaker 2: runs on the Pentagon's own systems for military use on 57 00:03:11,520 --> 00:03:15,480 Speaker 2: classified systems. In many ways, this is totally remarkable because 58 00:03:16,160 --> 00:03:18,320 Speaker 2: just a few years ago, everybody in the AI industry 59 00:03:18,440 --> 00:03:20,480 Speaker 2: was saying, one thing we don't want to do is 60 00:03:20,520 --> 00:03:22,800 Speaker 2: work with the military, and people were staging, you know, 61 00:03:22,880 --> 00:03:26,120 Speaker 2: walkouts over this, right, So it shows you how the 62 00:03:26,160 --> 00:03:29,000 Speaker 2: tech world has really pivoted on that. What happened more 63 00:03:29,040 --> 00:03:33,800 Speaker 2: recently was that, it seems, and you know, this is 64 00:03:33,840 --> 00:03:38,360 Speaker 2: somewhat disputed, but there have been reports that when President Maduro 65 00:03:38,520 --> 00:03:42,280 Speaker 2: was captured, Claude was used, and somebody at Anthropic may 66 00:03:42,320 --> 00:03:46,520 Speaker 2: have raised questions about that. This seems like it snowballed. 67 00:03:46,840 --> 00:03:49,760 Speaker 2: And of the carve-outs in the original contract 68 00:03:49,840 --> 00:03:53,680 Speaker 2: signed last year, there were two. One was against mass 69 00:03:53,680 --> 00:03:56,800 Speaker 2: surveillance of US citizens, and the other was that the 70 00:03:56,840 --> 00:04:00,160 Speaker 2: model shouldn't be used to build fully autonomous weapons.
And 71 00:04:01,280 --> 00:04:06,200 Speaker 2: the DoD, or the DoW, said that they wanted to change 72 00:04:06,240 --> 00:04:08,520 Speaker 2: the contract, essentially, which is an unusual thing to do, 73 00:04:08,760 --> 00:04:11,760 Speaker 2: but they wanted to allow all lawful use. And 74 00:04:12,080 --> 00:04:15,200 Speaker 2: to some degree, this dispute seems like it's a little 75 00:04:15,200 --> 00:04:17,520 Speaker 2: bit like people, you know, each side talking past each 76 00:04:17,520 --> 00:04:19,920 Speaker 2: other to some degree. And I think that the DoW, 77 00:04:20,080 --> 00:04:24,160 Speaker 2: the Department of War, became very concerned about a tech 78 00:04:24,200 --> 00:04:27,200 Speaker 2: company saying how it could and couldn't use the technology. 79 00:04:27,400 --> 00:04:30,360 Speaker 2: And to be fair, that is quite an unusual idea, right. 80 00:04:30,400 --> 00:04:33,680 Speaker 2: It's like somebody selling a missile or a jet fighter 81 00:04:33,720 --> 00:04:35,760 Speaker 2: to the Pentagon and then later on saying we don't 82 00:04:35,760 --> 00:04:37,320 Speaker 2: want you to use it in this way or that way. 83 00:04:38,080 --> 00:04:40,680 Speaker 2: But then again, AI is a really different technology, 84 00:04:40,760 --> 00:04:43,760 Speaker 2: it's very new, it's very experimental, and I believe Anthropic, 85 00:04:44,200 --> 00:04:46,719 Speaker 2: especially among AI companies, is very focused on safety, and 86 00:04:46,720 --> 00:04:49,920 Speaker 2: they have concerns just about how it could end up 87 00:04:49,960 --> 00:04:50,640 Speaker 2: being used. 88 00:04:50,960 --> 00:04:54,240 Speaker 1: Yeah, well, so let's jump into this. So Anthropic, you know, 89 00:04:54,640 --> 00:04:57,680 Speaker 1: this is a statement that is put on anthropic dot com. 90 00:04:57,760 --> 00:05:00,400 Speaker 1: So this is their own, you know, PR statement here, 91 00:05:00,920 --> 00:05:03,000 Speaker 1: and the title is Statement from Dario Amodei on 92 00:05:03,040 --> 00:05:06,320 Speaker 1: our discussions with the Department of War, and it starts 93 00:05:06,360 --> 00:05:09,720 Speaker 1: out the first line, I believe deeply in the existential 94 00:05:09,720 --> 00:05:12,800 Speaker 1: importance of using AI to defend the United States and 95 00:05:12,839 --> 00:05:17,840 Speaker 1: other democracies and to defeat our autocratic adversaries. So already, 96 00:05:17,839 --> 00:05:20,240 Speaker 1: out of the gate, that's kind of an interesting thing 97 00:05:20,440 --> 00:05:23,320 Speaker 1: for, again, a company in Silicon Valley to be saying, 98 00:05:23,480 --> 00:05:27,560 Speaker 1: even just in the past couple of years, really. But 99 00:05:27,680 --> 00:05:30,360 Speaker 1: it seems like the two big points of contention here, 100 00:05:30,440 --> 00:05:34,600 Speaker 1: there's two. He says, using these systems for mass domestic 101 00:05:34,800 --> 00:05:39,560 Speaker 1: surveillance is incompatible with democratic values. So spying on other people, 102 00:05:39,880 --> 00:05:42,640 Speaker 1: we're okay with that. Spying on people in the US, 103 00:05:42,960 --> 00:05:45,320 Speaker 1: we're not okay with that. Right. And then the next 104 00:05:45,320 --> 00:05:49,400 Speaker 1: thing is, quote, frontier AI systems are simply not reliable 105 00:05:49,520 --> 00:05:54,400 Speaker 1: enough to power fully autonomous weapons. Both of those sound 106 00:05:54,760 --> 00:05:59,239 Speaker 1: totally okay and reasonable.
Not only do they sound okay 107 00:05:59,279 --> 00:06:02,120 Speaker 1: and reasonable, but I mean, look, I've spent a whole 108 00:06:02,120 --> 00:06:05,600 Speaker 1: lot of time around people on the pretty far right, 109 00:06:06,680 --> 00:06:09,200 Speaker 1: you know, for work, right, and they will tell 110 00:06:09,240 --> 00:06:12,919 Speaker 1: you, I don't want no government spying on me, and 111 00:06:13,000 --> 00:06:16,680 Speaker 1: I don't trust these computers. Right, where's the beef? What's 112 00:06:16,720 --> 00:06:17,520 Speaker 1: the problem here? 113 00:06:18,200 --> 00:06:20,640 Speaker 2: That's a great question. You know, I've spent a lot 114 00:06:20,680 --> 00:06:24,919 Speaker 2: of time talking to people in the Pentagon and military 115 00:06:25,520 --> 00:06:29,520 Speaker 2: defense companies, and nobody believes that those frontier models are 116 00:06:29,520 --> 00:06:32,679 Speaker 2: ready, or even remotely near ready, to be used in autonomous weapons. 117 00:06:33,360 --> 00:06:35,760 Speaker 2: And consistently the government has said, we have no desire to 118 00:06:35,839 --> 00:06:39,160 Speaker 2: do either of those things. I think what 119 00:06:39,240 --> 00:06:42,200 Speaker 2: it really came down to was that they didn't just, 120 00:06:43,120 --> 00:06:46,280 Speaker 2: you know, take a knee and agree to change the 121 00:06:46,360 --> 00:06:48,960 Speaker 2: terms of their contract. And as you say, you know, 122 00:06:49,720 --> 00:06:54,120 Speaker 2: it culminated in Trump and Hegseth and others putting out 123 00:06:54,120 --> 00:06:58,080 Speaker 2: these very, very, you know, as often happens, they were 124 00:06:58,080 --> 00:06:59,680 Speaker 2: sort of classifying them all of a sudden as the 125 00:06:59,720 --> 00:07:03,359 Speaker 2: radical left, which is pretty incredible for a company supplying 126 00:07:03,360 --> 00:07:06,400 Speaker 2: technology that's working on classified US systems. 127 00:07:06,520 --> 00:07:10,080 Speaker 1: Right. So this is the post that Donald Trump made 128 00:07:10,120 --> 00:07:13,680 Speaker 1: on his social media network Truth Social, and this, I'm 129 00:07:13,720 --> 00:07:16,600 Speaker 1: reading from the top, and I'm reading verbatim. The United 130 00:07:16,600 --> 00:07:20,120 Speaker 1: States of America will never allow a radical left woke 131 00:07:20,280 --> 00:07:25,280 Speaker 1: company to dictate how a great military fights and wins wars. 132 00:07:25,480 --> 00:07:28,120 Speaker 1: That decision belongs to your commander in chief and the 133 00:07:28,120 --> 00:07:31,400 Speaker 1: tremendous leaders I appoint to run our military. And then 134 00:07:31,480 --> 00:07:33,200 Speaker 1: he finally gets to what he's talking about. He says, 135 00:07:33,200 --> 00:07:37,760 Speaker 1: the left wing nut jobs at Anthropic have made a disastrous 136 00:07:37,800 --> 00:07:41,160 Speaker 1: mistake trying to strong-arm the Department of War and 137 00:07:41,200 --> 00:07:43,520 Speaker 1: force them to obey their terms of service instead of 138 00:07:43,520 --> 00:07:44,320 Speaker 1: our Constitution. 139 00:07:44,760 --> 00:07:45,320 Speaker 3: Yeah. 140 00:07:45,400 --> 00:07:47,400 Speaker 2: I mean, like a lot of that doesn't really even 141 00:07:47,920 --> 00:07:51,120 Speaker 2: make sense. I think Anthropic is not a radical left 142 00:07:51,720 --> 00:07:56,160 Speaker 2: organization by any stretch.
It's since come out that 143 00:07:56,200 --> 00:07:59,240 Speaker 2: the DoD was negotiating with them, that they 144 00:07:59,240 --> 00:08:01,880 Speaker 2: were trying to get this wording kind of massaged into 145 00:08:01,880 --> 00:08:04,040 Speaker 2: a format that everybody was happy with. And it doesn't 146 00:08:04,080 --> 00:08:06,840 Speaker 2: seem like that would have been impossible. It feels like 147 00:08:07,560 --> 00:08:10,880 Speaker 2: some people within the administration lost patience. And I think 148 00:08:10,880 --> 00:08:12,960 Speaker 2: a lot of it came down to just power and 149 00:08:14,360 --> 00:08:18,560 Speaker 2: the Department of Defense and the leadership not being able 150 00:08:18,640 --> 00:08:20,960 Speaker 2: to handle being denied that by a tech company that 151 00:08:21,000 --> 00:08:23,720 Speaker 2: they could very easily see as, you know, part 152 00:08:23,760 --> 00:08:26,800 Speaker 2: of this sort of, you know, coastal elite and, in 153 00:08:26,840 --> 00:08:30,160 Speaker 2: the case of Anthropic, woke and liberal. I mean, it's 154 00:08:30,200 --> 00:08:32,880 Speaker 2: another reflection of how AI is also sort 155 00:08:32,920 --> 00:08:36,640 Speaker 2: of a cultural lightning rod, right, suddenly 156 00:08:36,640 --> 00:08:41,559 Speaker 2: it becomes this kind of left versus right thing. So, I mean, 157 00:08:41,760 --> 00:08:46,199 Speaker 2: you know, that post kind of does unfortunately sum 158 00:08:46,280 --> 00:08:49,480 Speaker 2: up the entire situation, which is slightly bewildering when you 159 00:08:49,920 --> 00:08:52,800 Speaker 2: know that now they've taken the action to try 160 00:08:52,800 --> 00:08:56,440 Speaker 2: to classify, as you said, Anthropic as a supply chain risk, 161 00:08:56,880 --> 00:08:59,040 Speaker 2: which is something that's only ever been applied to foreign nations. 162 00:08:59,040 --> 00:09:02,319 Speaker 2: And we're doing this to one of the foremost AI 163 00:09:02,440 --> 00:09:05,280 Speaker 2: companies in the United States, one that supposedly, 164 00:09:05,360 --> 00:09:09,200 Speaker 2: you know, should be one of our national champions. It doesn't 165 00:09:09,240 --> 00:09:13,560 Speaker 2: sound to me like they're really promoting American technology. If 166 00:09:13,559 --> 00:09:15,319 Speaker 2: you're too keen to do that, I think, 167 00:09:15,360 --> 00:09:16,920 Speaker 2: you know, there's a real danger you're going 168 00:09:17,000 --> 00:09:21,160 Speaker 2: to harm this nascent, very 169 00:09:21,160 --> 00:09:24,439 Speaker 2: important American company, and also maybe dissuade other companies from working 170 00:09:24,480 --> 00:09:26,880 Speaker 2: with the government or the Pentagon if they feel that 171 00:09:26,880 --> 00:09:27,839 Speaker 2: that's going to happen. 172 00:09:27,640 --> 00:09:32,480 Speaker 1: Right. Yeah. So the supply chain risk thing, again, 173 00:09:32,480 --> 00:09:35,720 Speaker 1: that's the language that Pete Hegseth used. What does it 174 00:09:35,800 --> 00:09:40,319 Speaker 1: mean to call Anthropic a supply chain risk? What does 175 00:09:40,360 --> 00:09:41,960 Speaker 1: that even mean? Yeah?
176 00:09:42,000 --> 00:09:43,599 Speaker 2: I mean, you know, well, the first thing is to 177 00:09:43,640 --> 00:09:49,760 Speaker 2: say that this supply chain risk designation is incredibly unusual, extraordinary, 178 00:09:49,880 --> 00:09:53,040 Speaker 2: and just the kind of thing that, I think, the 179 00:09:53,080 --> 00:09:56,480 Speaker 2: sort of left field thinking that we've become accustomed to 180 00:09:56,600 --> 00:09:59,440 Speaker 2: with the administration, because it's usually applied to, you know, 181 00:10:00,040 --> 00:10:02,400 Speaker 2: a company that you want to remove from the US 182 00:10:02,520 --> 00:10:05,439 Speaker 2: supply chain because you're worried that their software or their 183 00:10:05,440 --> 00:10:09,800 Speaker 2: hardware might contain backdoors, might be a threat to national 184 00:10:09,800 --> 00:10:13,000 Speaker 2: security because it could be used by an adversary. So 185 00:10:13,120 --> 00:10:14,839 Speaker 2: this is the very sort of all or nothing. You 186 00:10:15,280 --> 00:10:18,160 Speaker 2: either agree with everything we say, or you are the enemy. 187 00:10:18,920 --> 00:10:21,880 Speaker 2: I mean, that's the kind of tenor, which sounds familiar 188 00:10:21,880 --> 00:10:26,000 Speaker 2: to me generally. The most extreme version of that designation, 189 00:10:26,120 --> 00:10:27,760 Speaker 2: and we haven't seen the full details of it because 190 00:10:27,760 --> 00:10:30,000 Speaker 2: they haven't produced it yet, would mean that any 191 00:10:30,360 --> 00:10:33,240 Speaker 2: company who's working with the DoD couldn't use your technology, 192 00:10:33,240 --> 00:10:36,280 Speaker 2: which is vast. I mean, Amazon, which has invested 193 00:10:36,320 --> 00:10:39,920 Speaker 2: billions in Anthropic, has massive contracts with the DoD, so 194 00:10:39,960 --> 00:10:43,000 Speaker 2: it's an enormous thing to designate them. It also doesn't 195 00:10:43,000 --> 00:10:45,400 Speaker 2: really make sense because, you know, they were also saying 196 00:10:45,440 --> 00:10:47,880 Speaker 2: we may invoke the Defense Production Act, which has been 197 00:10:48,000 --> 00:10:51,000 Speaker 2: used in wartime, and actually also during the Cold 198 00:10:51,080 --> 00:10:56,559 Speaker 2: War, to force companies to prioritize production and supply because 199 00:10:56,760 --> 00:11:01,240 Speaker 2: what they were making is so important. So it's, wait, 200 00:11:01,600 --> 00:11:03,400 Speaker 2: I think this may be, yeah, this may be 201 00:11:03,400 --> 00:11:06,480 Speaker 2: a legal problem that the DoD has, and I 202 00:11:06,520 --> 00:11:09,160 Speaker 2: think, you know, Anthropic has signaled 203 00:11:09,200 --> 00:11:11,760 Speaker 2: that it's going to take some legal action. So 204 00:11:11,800 --> 00:11:14,800 Speaker 2: it's difficult to argue, on one hand, it's so vital 205 00:11:14,840 --> 00:11:17,520 Speaker 2: we may invoke the Defense Production Act, but then we 206 00:11:17,559 --> 00:11:19,880 Speaker 2: may also designate you a supply chain risk. 207 00:11:20,400 --> 00:11:24,600 Speaker 1: Yeah, like you're too dangerous for us to use, but 208 00:11:25,200 --> 00:11:28,840 Speaker 1: you're so useful that we want to use your product. 209 00:11:28,559 --> 00:11:31,640 Speaker 2: But we want to force you to supply it, big one. 210 00:11:32,280 --> 00:11:35,600 Speaker 1: Yeah, like nobody can have it, but we want it.
Okay, 211 00:11:35,880 --> 00:11:49,480 Speaker 1: what are we doing here? So basically the next day 212 00:11:50,480 --> 00:11:54,640 Speaker 1: OpenAI signs something with the government. How does that 213 00:11:54,720 --> 00:11:57,120 Speaker 1: sequence of events... like, what kind of sense are you 214 00:11:57,160 --> 00:11:57,840 Speaker 1: able to make of that? 215 00:11:58,160 --> 00:12:02,720 Speaker 2: Yeah, it is very interesting that Anthropic's chief rival. And 216 00:12:03,000 --> 00:12:06,040 Speaker 2: quick reminder that Anthropic was founded by people who left 217 00:12:06,400 --> 00:12:10,079 Speaker 2: OpenAI because they said that company was not developing AI 218 00:12:10,559 --> 00:12:15,600 Speaker 2: with enough focus on safety. And they're really, really bitter rivals. 219 00:12:15,640 --> 00:12:18,360 Speaker 2: And as it was going on, Sam Altman was 220 00:12:18,400 --> 00:12:21,960 Speaker 2: working through an alternative deal. The deal that they signed 221 00:12:22,000 --> 00:12:25,760 Speaker 2: seems extremely similar to the one that Anthropic was trying 222 00:12:25,760 --> 00:12:28,520 Speaker 2: to hash out, has similar carve-outs, well, it sort 223 00:12:28,520 --> 00:12:31,360 Speaker 2: of doesn't actually carve them out, but it specifies 224 00:12:31,360 --> 00:12:36,000 Speaker 2: those things, that mass surveillance is unlawful and that nobody wants 225 00:12:36,040 --> 00:12:37,240 Speaker 2: to build autonomous weapons. 226 00:12:37,559 --> 00:12:41,880 Speaker 1: So are the details or the terms of the contract 227 00:12:41,880 --> 00:12:45,880 Speaker 1: that OpenAI signed with the Department of War, are they 228 00:12:45,920 --> 00:12:51,199 Speaker 1: materially different from what Anthropic did? Because this is from CNBC. 229 00:12:51,679 --> 00:12:55,040 Speaker 1: Altman said, and I want to read this here, quote, 230 00:12:55,120 --> 00:12:57,959 Speaker 1: I believe that we will hopefully have the best models 231 00:12:57,960 --> 00:13:00,440 Speaker 1: and we've encouraged the government to be willing to work 232 00:13:00,480 --> 00:13:03,679 Speaker 1: with us, even if our safety stack annoys them. But 233 00:13:03,679 --> 00:13:05,800 Speaker 1: there will be at least one other actor, which I 234 00:13:05,840 --> 00:13:09,520 Speaker 1: assume will be xAI, which effectively will say, we'll 235 00:13:09,559 --> 00:13:14,040 Speaker 1: do whatever you want. So the vibe here seems to 236 00:13:14,080 --> 00:13:17,280 Speaker 1: be like, okay, well, listen, at least we're the adults in 237 00:13:17,280 --> 00:13:19,880 Speaker 1: the room. They're gonna let Grok do whatever Grok wants 238 00:13:19,880 --> 00:13:22,280 Speaker 1: to do, whatever, whatever Trump wants to do. They're just 239 00:13:22,320 --> 00:13:24,719 Speaker 1: gonna let it do it, drop bombs wherever. At least 240 00:13:24,760 --> 00:13:26,240 Speaker 1: we're going to be some adults in the room. But 241 00:13:26,960 --> 00:13:30,360 Speaker 1: is OpenAI doing something fundamentally different? Did they cede 242 00:13:30,400 --> 00:13:33,480 Speaker 1: more ground than Anthropic did, or is it the same thing? 243 00:13:35,280 --> 00:13:36,920 Speaker 2: You know, yeah, there's a lot going on there. I 244 00:13:37,200 --> 00:13:41,120 Speaker 2: think that they did cede slightly more ground in 245 00:13:41,160 --> 00:13:45,200 Speaker 2: that they basically agreed to a contract that specified all 246 00:13:45,280 --> 00:13:49,560 Speaker 2: lawful use.
They changed the contract to make, oh but, you know, 247 00:13:49,640 --> 00:13:51,800 Speaker 2: to be clear, here's what is lawful and what 248 00:13:51,840 --> 00:13:54,880 Speaker 2: the DoD doesn't want to do. But I think there 249 00:13:54,880 --> 00:13:56,760 Speaker 2: isn't a huge amount of difference. I mean, I think 250 00:13:56,800 --> 00:14:00,000 Speaker 2: what you're seeing with that internal statement was really, 251 00:14:01,200 --> 00:14:04,600 Speaker 2: you know, Sam, OpenAI, willing to take a bit 252 00:14:04,640 --> 00:14:09,720 Speaker 2: of flak, because it's just incredibly important to get a 253 00:14:09,720 --> 00:14:11,920 Speaker 2: win over Anthropic, to be seen to be very on 254 00:14:12,000 --> 00:14:14,040 Speaker 2: board with the US government, because you want more and 255 00:14:14,120 --> 00:14:17,840 Speaker 2: more involvement, more and more contracts. And they've since, you know, 256 00:14:17,880 --> 00:14:20,000 Speaker 2: OpenAI has since got quite a lot of blowback. 257 00:14:20,640 --> 00:14:22,920 Speaker 1: Yeah, well, let's talk about the blowback here. I mean, 258 00:14:23,120 --> 00:14:27,360 Speaker 1: this is on TechCrunch: US uninstalls of ChatGPT's 259 00:14:27,400 --> 00:14:31,440 Speaker 1: mobile app went up almost three hundred percent, so 260 00:14:31,600 --> 00:14:40,640 Speaker 1: people were uninstalling ChatGPT and started installing Claude, Anthropic's Claude. 261 00:14:40,720 --> 00:14:44,120 Speaker 1: And so there's a weird way in which there's this, 262 00:14:45,040 --> 00:14:47,280 Speaker 1: and I really hesitate to call this a game, but 263 00:14:47,280 --> 00:14:49,800 Speaker 1: there's a really weird way in which there's 264 00:14:49,840 --> 00:14:53,520 Speaker 1: this game being played over who gets to be the 265 00:14:53,600 --> 00:14:57,160 Speaker 1: contractor for the Department of War. But then also we're 266 00:14:57,240 --> 00:14:59,720 Speaker 1: still fighting for consumers here. 267 00:15:00,720 --> 00:15:03,800 Speaker 2: Definitely. And it is interesting, the sort of perception 268 00:15:03,840 --> 00:15:06,120 Speaker 2: of who's the good guy, and you had, you know, 269 00:15:06,200 --> 00:15:10,920 Speaker 2: Katy Perry was subscribing to Claude Pro and posting about 270 00:15:10,920 --> 00:15:14,120 Speaker 2: it, and this, you know, sort of interesting cultural moment. 271 00:15:14,160 --> 00:15:17,040 Speaker 2: And I think the thing that struck me 272 00:15:17,160 --> 00:15:20,560 Speaker 2: was like, do people really know how Claude is being 273 00:15:20,640 --> 00:15:23,640 Speaker 2: used by the military? Do they know what Anthropic was 274 00:15:23,680 --> 00:15:26,920 Speaker 2: willing to do? I think it's gotten a little oversimplified, 275 00:15:26,960 --> 00:15:29,040 Speaker 2: to my mind, and I think that one of the 276 00:15:29,160 --> 00:15:32,680 Speaker 2: risks there is that we ought to be, rather than necessarily 277 00:15:32,760 --> 00:15:36,320 Speaker 2: uninstalling something or posting about who you think is best, 278 00:15:37,120 --> 00:15:39,080 Speaker 2: really pushing for more of an explanation about how the 279 00:15:39,160 --> 00:15:42,400 Speaker 2: technology is being used and developed and tested. Like, what 280 00:15:42,520 --> 00:15:45,240 Speaker 2: does it mean that it's used on classified systems? There 281 00:15:45,240 --> 00:15:48,200 Speaker 2: should be more accountability there.
And the fact of the 282 00:15:48,200 --> 00:15:51,160 Speaker 2: matter is there is nothing legally putting the onus 283 00:15:51,160 --> 00:15:53,440 Speaker 2: on anyone, nothing prohibiting the development of 284 00:15:53,480 --> 00:15:57,640 Speaker 2: autonomous weapons, and there isn't really a legal framework around 285 00:15:57,720 --> 00:16:00,520 Speaker 2: that sort of thing. If, you know, civilians were affected by 286 00:16:00,520 --> 00:16:03,600 Speaker 2: fully autonomous weapons, you suddenly have a situation where nobody 287 00:16:03,680 --> 00:16:06,640 Speaker 2: is legally accountable. And so these are things that really 288 00:16:06,640 --> 00:16:09,560 Speaker 2: need to be figured out. And that doesn't mean a 289 00:16:09,600 --> 00:16:12,720 Speaker 2: fully autonomous weapon; it could be that you've automated part of 290 00:16:12,760 --> 00:16:15,280 Speaker 2: the so-called kill chain and so decisions were kind 291 00:16:15,280 --> 00:16:17,960 Speaker 2: of automated away. I think it is really important to 292 00:16:18,000 --> 00:16:20,800 Speaker 2: have more focus on this and to have more discussion 293 00:16:20,840 --> 00:16:24,120 Speaker 2: of it. And you know, one of the reasons, honestly, 294 00:16:24,440 --> 00:16:27,680 Speaker 2: as someone who's looked at the DoD and the work 295 00:16:27,680 --> 00:16:31,080 Speaker 2: going on around autonomy, is that unfortunately it is the 296 00:16:31,120 --> 00:16:33,040 Speaker 2: case that we're just going to have more and more autonomy, 297 00:16:33,400 --> 00:16:36,160 Speaker 2: and most analysts expect that there are going to be swarms 298 00:16:36,200 --> 00:16:39,440 Speaker 2: of drones attacking systems, and the truth is you just 299 00:16:39,480 --> 00:16:41,600 Speaker 2: won't be able to have humans in the loop always. 300 00:16:41,760 --> 00:16:43,520 Speaker 2: So this is going to become a really big issue. 301 00:16:43,520 --> 00:16:47,080 Speaker 2: And it's not necessarily about Anthropic's models in those situations, 302 00:16:47,080 --> 00:16:50,120 Speaker 2: but the question of who's accountable, how do we know 303 00:16:50,160 --> 00:16:52,600 Speaker 2: that the systems are reliable? That is going to become 304 00:16:52,680 --> 00:16:54,200 Speaker 2: more and more important. So I think it's good that 305 00:16:54,200 --> 00:16:57,000 Speaker 2: the situation is getting some attention. I do think that 306 00:16:57,040 --> 00:17:00,320 Speaker 2: the positive thing is that it is important for people 307 00:17:00,320 --> 00:17:03,520 Speaker 2: who understand this technology really, really well to maybe 308 00:17:03,560 --> 00:17:06,680 Speaker 2: have a role in at least influencing how it's used. 309 00:17:06,840 --> 00:17:08,560 Speaker 2: And that doesn't mean, you know, like you get to 310 00:17:08,600 --> 00:17:10,920 Speaker 2: say yes or no to a mission. It just means, 311 00:17:10,920 --> 00:17:15,119 Speaker 2: perhaps, that you, say, have some input on how you 312 00:17:15,160 --> 00:17:17,640 Speaker 2: ought to try and deploy it within the Pentagon. 313 00:17:17,600 --> 00:17:20,720 Speaker 1: I think, I mean, how does that functionally work, though? Maybe 314 00:17:20,720 --> 00:17:23,240 Speaker 1: that's the question here. That is the question. Yeah, can 315 00:17:23,280 --> 00:17:28,240 Speaker 1: you have a contract with the government and tell them no, 316 00:17:28,400 --> 00:17:31,320 Speaker 1: you can't do this?
And I'm asking that partially because 317 00:17:31,720 --> 00:17:34,320 Speaker 1: certainly there is a lot of talk of, well, wait 318 00:17:34,320 --> 00:17:37,520 Speaker 1: a second, the Iran strike. Was Claude used there? 319 00:17:38,000 --> 00:17:41,119 Speaker 2: Yeah. To answer the first question, I think that is 320 00:17:41,200 --> 00:17:43,919 Speaker 2: a really good question. How do companies, how does the 321 00:17:43,920 --> 00:17:47,719 Speaker 2: public understand how things are developed, what the rules are 322 00:17:47,720 --> 00:17:51,920 Speaker 2: around them. And there are specified rules around autonomous weapons, 323 00:17:51,960 --> 00:17:55,439 Speaker 2: around surveillance, but this is also a technology that's different, 324 00:17:55,840 --> 00:17:58,600 Speaker 2: that can misbehave in surprising ways and can be kind 325 00:17:58,640 --> 00:18:01,080 Speaker 2: of vulnerable in surprising ways. So I mean it needs 326 00:18:01,119 --> 00:18:05,000 Speaker 2: a bit more scrutiny. The Washington Post has reported, and I 327 00:18:05,000 --> 00:18:08,440 Speaker 2: think it seems very likely true, that Claude 328 00:18:08,520 --> 00:18:12,080 Speaker 2: was absolutely used in the Iran operation as part of 329 00:18:12,080 --> 00:18:14,960 Speaker 2: this system called Maven, which was built by Palantir, 330 00:18:15,080 --> 00:18:18,160 Speaker 2: which includes a bunch of things, not just a language model. 331 00:18:18,200 --> 00:18:21,119 Speaker 2: It can, you know, analyze maps and images and 332 00:18:21,760 --> 00:18:24,679 Speaker 2: do mission planning and so on. But yeah, as part of that. 333 00:18:24,760 --> 00:18:26,840 Speaker 2: And so now, one of the things that was kind 334 00:18:26,840 --> 00:18:29,280 Speaker 2: of crazy with Trump's initial statement was that he said, 335 00:18:29,720 --> 00:18:33,560 Speaker 2: I'm ordering everybody to stop working with Anthropic because they're 336 00:18:33,760 --> 00:18:36,000 Speaker 2: all these terrible people. But there'll be a six month 337 00:18:36,040 --> 00:18:38,560 Speaker 2: period where we carry on doing it. Right, it can't 338 00:18:38,600 --> 00:18:41,960 Speaker 2: be that bad, and we're going to use, we're about 339 00:18:42,000 --> 00:18:45,040 Speaker 2: to use them in this, you know, incredible operation in 340 00:18:45,800 --> 00:18:46,520 Speaker 2: the Middle East. 341 00:18:47,000 --> 00:18:51,479 Speaker 1: So yeah, I mean, we're laughing. It's terrible. I mean, 342 00:18:51,520 --> 00:18:54,760 Speaker 1: it's ridiculous and also terrible. I suppose it can be 343 00:18:54,760 --> 00:18:59,080 Speaker 1: two things at the same time. And because we're doing 344 00:18:59,119 --> 00:19:02,520 Speaker 1: this live, I wanted to take a few questions. And 345 00:19:02,840 --> 00:19:06,040 Speaker 1: one of the questions that I got, actually, previously I 346 00:19:06,080 --> 00:19:09,399 Speaker 1: posted on Instagram, does anybody have any questions? And somebody posted, 347 00:19:09,480 --> 00:19:12,160 Speaker 1: and I think they were kind of joking, but they said, hey, 348 00:19:12,160 --> 00:19:15,520 Speaker 1: can we just make all the countries' AIs fight against 349 00:19:15,520 --> 00:19:17,600 Speaker 1: each other? Can we just have them fight instead of 350 00:19:17,640 --> 00:19:21,080 Speaker 1: the real war?
But it's kind of funny that they 351 00:19:21,119 --> 00:19:24,200 Speaker 1: say that, because there's, I don't know if you've seen 352 00:19:24,200 --> 00:19:29,960 Speaker 1: this article in the New Scientist, but basically different AIs, 353 00:19:30,000 --> 00:19:32,960 Speaker 1: and the headline here kind of says it all: AIs 354 00:19:33,040 --> 00:19:37,359 Speaker 1: can't stop recommending nuclear strikes in wargame simulations. And the 355 00:19:37,400 --> 00:19:42,320 Speaker 1: subhead: leading AIs from OpenAI, Anthropic and Google opted 356 00:19:42,359 --> 00:19:46,040 Speaker 1: to use nuclear weapons in simulated wargames in ninety five 357 00:19:46,080 --> 00:19:47,440 Speaker 1: percent of cases. 358 00:19:48,520 --> 00:19:52,000 Speaker 2: Right. Yeah. So I read that paper. It's important to 359 00:19:52,080 --> 00:19:55,840 Speaker 2: know that this person was just trying to understand what 360 00:19:55,880 --> 00:19:57,720 Speaker 2: the kind of behavior of models would be. They're 361 00:19:57,720 --> 00:19:59,959 Speaker 2: not advocating doing that, and I don't think anybody is. 362 00:20:00,240 --> 00:20:02,520 Speaker 2: I think actually this is one of the, like, not 363 00:20:02,680 --> 00:20:06,520 Speaker 2: letting AI have a role in nuclear weapon decision making 364 00:20:06,640 --> 00:20:08,960 Speaker 2: is one thing that the US and China have actually 365 00:20:09,000 --> 00:20:11,480 Speaker 2: agreed on, which is extremely unusual these days. 366 00:20:11,520 --> 00:20:11,680 Speaker 1: Right. 367 00:20:11,720 --> 00:20:14,840 Speaker 2: So, but it's unsurprising to me that if you have 368 00:20:14,880 --> 00:20:17,879 Speaker 2: a model trained on the entire Internet, it's going to 369 00:20:18,400 --> 00:20:23,760 Speaker 2: tend toward escalation and not be very, very expert in 370 00:20:24,440 --> 00:20:27,440 Speaker 2: how to avoid, you know, it's not going to be 371 00:20:27,840 --> 00:20:30,040 Speaker 2: someone who spent their life worrying about that sort of thing. 372 00:20:30,080 --> 00:20:33,160 Speaker 2: And on the question of whether AIs could fight each other, 373 00:20:33,359 --> 00:20:37,679 Speaker 2: the truth is, like, you are already seeing in certain conflicts, 374 00:20:38,080 --> 00:20:41,560 Speaker 2: you know, autonomous systems, semi-autonomous systems, or remotely operated systems 375 00:20:42,160 --> 00:20:43,960 Speaker 2: fighting other semi-autonomous systems. 376 00:20:44,000 --> 00:20:44,119 Speaker 1: Right. 377 00:20:44,119 --> 00:20:47,040 Speaker 2: And some people may argue that that would remove people 378 00:20:47,080 --> 00:20:50,800 Speaker 2: from harm's way. That is not the reality of a conflict. 379 00:20:50,840 --> 00:20:52,640 Speaker 2: There's going to be plenty of people in harm's way. 380 00:20:53,240 --> 00:20:55,639 Speaker 2: And the question is, you know, how do you do 381 00:20:55,720 --> 00:20:59,200 Speaker 2: that in ways that are reliable and don't make mistakes. 382 00:20:59,280 --> 00:21:02,040 Speaker 2: And there has been talk in recent years of these 383 00:21:02,080 --> 00:21:04,679 Speaker 2: sorts of things like existential risks, but long before that, 384 00:21:04,720 --> 00:21:08,200 Speaker 2: there are lots of systemic dangers of using this technology 385 00:21:08,200 --> 00:21:12,040 Speaker 2: in any critical system.
So you know, just as you 386 00:21:12,080 --> 00:21:14,719 Speaker 2: have hallucinations if you're using a chatbot, or just as 387 00:21:14,760 --> 00:21:17,840 Speaker 2: it gets simple reasoning wrong in a very convincing way, 388 00:21:18,240 --> 00:21:19,880 Speaker 2: that is what we should be worried about, I think, 389 00:21:19,920 --> 00:21:23,439 Speaker 2: in how this and other language models are used 390 00:21:23,480 --> 00:21:24,800 Speaker 2: by the military, by the Pentagon. 391 00:21:25,200 --> 00:21:25,520 Speaker 1: You know what. 392 00:21:25,640 --> 00:21:29,320 Speaker 2: I once talked to a former naval officer in 393 00:21:29,320 --> 00:21:32,120 Speaker 2: the US Navy whose job it was to, you know, 394 00:21:32,440 --> 00:21:34,920 Speaker 2: be in command of these nuclear submarines, and he said 395 00:21:35,200 --> 00:21:37,120 Speaker 2: one of the first things he did when he got 396 00:21:37,119 --> 00:21:38,960 Speaker 2: the job was get permission to go and talk to 397 00:21:38,960 --> 00:21:42,400 Speaker 2: his counterpart in China, go meet him and spend time there. 398 00:21:42,560 --> 00:21:45,760 Speaker 2: The reason being, because if something went wrong, if one 399 00:21:45,760 --> 00:21:48,320 Speaker 2: of the systems went wrong or a person made 400 00:21:48,359 --> 00:21:52,159 Speaker 2: a mistake, you know, or say there's a collision between 401 00:21:52,200 --> 00:21:54,719 Speaker 2: two vessels, and you know, you've got a risk of escalation. 402 00:21:55,200 --> 00:21:56,560 Speaker 2: He wants to be able to pick up the phone 403 00:21:56,640 --> 00:21:58,000 Speaker 2: and talk to someone he knows. He doesn't want that 404 00:21:58,040 --> 00:22:00,080 Speaker 2: to be the first time he's spoken to them, and 405 00:22:00,119 --> 00:22:02,919 Speaker 2: so that sort of shows the importance of the human element. 406 00:22:03,000 --> 00:22:06,720 Speaker 2: And that's not something that, you know, we talk about 407 00:22:07,119 --> 00:22:09,440 Speaker 2: AGI and intelligence being solved, but that is 408 00:22:09,480 --> 00:22:12,160 Speaker 2: a very human thing that no language model really has, 409 00:22:12,800 --> 00:22:17,080 Speaker 2: they don't have that kind of human intelligence, right. 410 00:22:17,240 --> 00:22:19,960 Speaker 1: Yeah, ChatGPT can't take Grok out for coffee. 411 00:22:20,400 --> 00:22:23,320 Speaker 2: Yeah, no, well, no, not yet. I mean, that's a 412 00:22:23,359 --> 00:22:25,600 Speaker 2: frightening thought, it is. I don't think anybody would want 413 00:22:25,640 --> 00:22:26,280 Speaker 2: to take Grok out. 414 00:22:26,400 --> 00:22:38,560 Speaker 3: Yeah. 415 00:22:39,160 --> 00:22:42,040 Speaker 1: So there's a question also here in the chat that 416 00:22:42,119 --> 00:22:45,439 Speaker 1: I want to get to, so b masters is asking, saying 417 00:22:46,080 --> 00:22:49,640 Speaker 1: they read some sources that say that Anthropic was actually 418 00:22:49,680 --> 00:22:52,600 Speaker 1: willing to give in to the Department of War's demands, 419 00:22:52,800 --> 00:22:56,120 Speaker 1: but the communication broke down. The CEO just doubled down 420 00:22:56,160 --> 00:22:58,640 Speaker 1: on the ethical argument, just as a PR move. I mean, 421 00:22:58,640 --> 00:23:01,040 Speaker 1: in your reporting, have you seen anything that 422 00:23:01,080 --> 00:23:02,000 Speaker 1: would substantiate that?
423 00:23:02,680 --> 00:23:06,639 Speaker 2: I couldn't tell you definitively if they were about to 424 00:23:06,720 --> 00:23:10,440 Speaker 2: cave to the DoD's terms. I have heard that 425 00:23:10,480 --> 00:23:12,840 Speaker 2: they were very close to reaching an agreement, which 426 00:23:12,840 --> 00:23:15,960 Speaker 2: sounded to me like a compromise between the two, 427 00:23:16,040 --> 00:23:18,440 Speaker 2: and it wasn't very far off. I think it's gonna 428 00:23:18,440 --> 00:23:20,080 Speaker 2: be too easy to get caught up in 429 00:23:21,520 --> 00:23:24,560 Speaker 2: the drama of these companies and miss the 430 00:23:24,560 --> 00:23:27,879 Speaker 2: bigger picture, which is, well, hang on, maybe we should 431 00:23:27,920 --> 00:23:30,240 Speaker 2: have more of a discussion about how these things are 432 00:23:30,280 --> 00:23:32,880 Speaker 2: used, what are the benefits, what are the... It's important 433 00:23:32,920 --> 00:23:34,760 Speaker 2: to say that I don't think you should not use 434 00:23:35,160 --> 00:23:38,280 Speaker 2: AI in defense. It's all about how reliably you can 435 00:23:38,320 --> 00:23:40,680 Speaker 2: do it. What are the limits? What accountability is there? 436 00:23:41,000 --> 00:23:43,720 Speaker 2: How do citizens sort of know how that's being 437 00:23:43,800 --> 00:23:44,520 Speaker 2: used in their name? 438 00:23:44,640 --> 00:23:44,840 Speaker 3: Right? 439 00:23:44,880 --> 00:23:47,400 Speaker 2: And, you know, Trump wrote in that message 440 00:23:47,400 --> 00:23:51,160 Speaker 2: that he decides how wars are fought, and in some way, 441 00:23:51,240 --> 00:23:53,800 Speaker 2: right, he decides when we invade Iran, but he doesn't 442 00:23:53,800 --> 00:23:57,439 Speaker 2: decide the terms of war, like, you know, what is 443 00:23:57,880 --> 00:24:01,160 Speaker 2: a war crime? That is decided by society at large. 444 00:24:01,160 --> 00:24:02,760 Speaker 2: And that's sort of the bigger thing, that we 445 00:24:02,960 --> 00:24:05,119 Speaker 2: need to be asking these questions when it comes to the 446 00:24:05,160 --> 00:24:08,960 Speaker 2: new capabilities that AI is going to give governments, 447 00:24:09,040 --> 00:24:11,680 Speaker 2: which is they would be able to do new kinds 448 00:24:11,680 --> 00:24:14,359 Speaker 2: of mass surveillance you could never, you know, never imagine before. Right, 449 00:24:14,400 --> 00:24:16,760 Speaker 2: you'd be able to find patterns you couldn't think 450 00:24:16,760 --> 00:24:18,880 Speaker 2: of, or you'll be able to sort of automate 451 00:24:18,920 --> 00:24:22,639 Speaker 2: that much more. So, I think, yeah, that's the bigger 452 00:24:22,680 --> 00:24:24,600 Speaker 2: thing for me, and I think we should remember that, 453 00:24:24,800 --> 00:24:27,800 Speaker 2: you know, real lives, real human beings are at stake here, 454 00:24:27,880 --> 00:24:30,680 Speaker 2: and that war is hell, and it is something you 455 00:24:30,760 --> 00:24:31,960 Speaker 2: want to try and avoid.
456 00:24:31,760 --> 00:24:35,000 Speaker 1: Right. Yeah, you were talking about the bigger picture, and 457 00:24:35,040 --> 00:24:37,919 Speaker 1: also you mentioned Palantir, which, as it happens, I 458 00:24:38,320 --> 00:24:40,720 Speaker 1: just a little bit ago talked to Makena Kelly, your 459 00:24:40,880 --> 00:24:44,719 Speaker 1: colleague, about Palantir, and we actually have an episode 460 00:24:44,720 --> 00:24:48,240 Speaker 1: about that dropping next week. But this kind of 461 00:24:48,280 --> 00:24:51,960 Speaker 1: seems to be part of a larger trend, like you said, 462 00:24:52,560 --> 00:24:56,919 Speaker 1: big tech companies who were very interested in working with 463 00:24:57,240 --> 00:25:02,560 Speaker 1: the government, but specifically working with the government in military capacities, 464 00:25:03,680 --> 00:25:07,240 Speaker 1: which feels new, and in some ways it's not. Of course, 465 00:25:07,400 --> 00:25:09,680 Speaker 1: in some ways it's not. Some of the origins of 466 00:25:09,720 --> 00:25:12,160 Speaker 1: things like the Internet, you know, do come from the military. 467 00:25:12,520 --> 00:25:15,159 Speaker 1: But in terms of the way that we think about 468 00:25:15,359 --> 00:25:18,800 Speaker 1: the culture of Silicon Valley, I think this turn has 469 00:25:18,800 --> 00:25:22,359 Speaker 1: been a little bit unsettling but also surprising for a 470 00:25:22,400 --> 00:25:24,679 Speaker 1: lot of people. But it does seem to be a trend 471 00:25:24,920 --> 00:25:28,920 Speaker 1: right now. Where do you see this going, say, even 472 00:25:28,960 --> 00:25:29,879 Speaker 1: over the next year or so? 473 00:25:31,080 --> 00:25:33,040 Speaker 2: I mean, that's a great question. That is a really 474 00:25:33,040 --> 00:25:39,560 Speaker 2: interesting question. It's hard to imagine AI companies becoming disentangled, 475 00:25:39,760 --> 00:25:43,440 Speaker 2: because AI is so strategically important, and the incentive 476 00:25:43,440 --> 00:25:46,719 Speaker 2: for working with the government is enormous. Right, the amount 477 00:25:46,720 --> 00:25:48,919 Speaker 2: of data, the amount of compute, the amount of money 478 00:25:49,520 --> 00:25:51,840 Speaker 2: that the government has to throw at those companies is 479 00:25:51,880 --> 00:25:54,600 Speaker 2: so huge that it would be surprising to me if 480 00:25:54,640 --> 00:25:58,080 Speaker 2: everybody said, our conscience is not clear, so we're going 481 00:25:58,160 --> 00:26:01,280 Speaker 2: to rein that in. So if you imagine that we're 482 00:26:01,280 --> 00:26:04,040 Speaker 2: going to have more and more tension, more and more conflict, 483 00:26:04,880 --> 00:26:08,080 Speaker 2: there will be pressure on them, too, for everybody to 484 00:26:08,640 --> 00:26:11,639 Speaker 2: use AI in new ways that might be concerning, 485 00:26:12,280 --> 00:26:17,840 Speaker 2: besides the use in military situations, which I think we 486 00:26:17,840 --> 00:26:20,240 Speaker 2: should try and have transparency on.
I think it 487 00:26:20,320 --> 00:26:23,760 Speaker 2: is really telling that Anthropic had this concern about mass 488 00:26:23,760 --> 00:26:29,840 Speaker 2: surveillance, because I think language models are preternaturally brilliant at 489 00:26:30,359 --> 00:26:35,560 Speaker 2: parsing large amounts of text and audio and making sense 490 00:26:35,600 --> 00:26:38,520 Speaker 2: of it, finding patterns and making predictions, like we haven't 491 00:26:38,520 --> 00:26:41,040 Speaker 2: had anything like that, and that's exactly what you might 492 00:26:41,080 --> 00:26:43,840 Speaker 2: want to do in surveillance. So the temptation is 493 00:26:43,880 --> 00:26:47,600 Speaker 2: going to be huge, and yeah, I think it will. 494 00:26:47,640 --> 00:26:50,280 Speaker 2: I think it will be very interesting to see if 495 00:26:50,320 --> 00:26:53,359 Speaker 2: this turns into more of a pushback, if more people... 496 00:26:54,840 --> 00:26:57,159 Speaker 2: One thing is that tech workers have a ridiculous, I 497 00:26:57,160 --> 00:27:00,000 Speaker 2: mean especially AI engineers, right, have an incredible amount of 498 00:27:00,119 --> 00:27:02,880 Speaker 2: clout these days because they're so valuable. And I think 499 00:27:02,880 --> 00:27:06,080 Speaker 2: this is, you know, one reason why Sam Altman and 500 00:27:06,119 --> 00:27:10,520 Speaker 2: others are going to extraordinary lengths to try and explain 501 00:27:10,600 --> 00:27:13,960 Speaker 2: and to appease their staff and maybe 502 00:27:14,000 --> 00:27:15,600 Speaker 2: even change tack because of that. 503 00:27:15,840 --> 00:27:17,720 Speaker 1: Yeah, well, there were there were a bunch of workers, 504 00:27:17,760 --> 00:27:20,639 Speaker 1: I mean hundreds of workers from OpenAI and Google 505 00:27:20,640 --> 00:27:25,760 Speaker 1: who signed an open letter basically supporting Anthropic. There is 506 00:27:26,760 --> 00:27:30,520 Speaker 1: something of a little bit of pushback, if I can 507 00:27:30,560 --> 00:27:33,800 Speaker 1: call it that, from inside of these companies, and even, 508 00:27:33,840 --> 00:27:37,520 Speaker 1: you know, at the consumer angle, people uninstalling ChatGPT 509 00:27:37,680 --> 00:27:42,600 Speaker 1: and installing Claude. The question, though, becomes, are both of 510 00:27:42,600 --> 00:27:45,240 Speaker 1: those things enough? Is the pushback from inside the company 511 00:27:45,359 --> 00:27:47,640 Speaker 1: enough? Is a bunch of people deciding that, hey, I'm 512 00:27:47,640 --> 00:27:51,680 Speaker 1: gonna uninstall OpenAI and use the woke Claude, which 513 00:27:51,680 --> 00:27:53,600 Speaker 1: I think we've established is maybe not, maybe 514 00:27:53,680 --> 00:27:56,600 Speaker 1: not as woke as some people might have given 515 00:27:56,680 --> 00:28:00,440 Speaker 1: it credit for, what does that do? Does that 516 00:28:00,840 --> 00:28:01,440 Speaker 1: move the needle at all? 517 00:28:02,400 --> 00:28:03,920 Speaker 2: I don't think it does at this stage. I think 518 00:28:04,040 --> 00:28:06,720 Speaker 2: the question is does it go from hundreds of workers 519 00:28:06,720 --> 00:28:11,600 Speaker 2: to thousands, and does the pushback from the public, which 520 00:28:11,600 --> 00:28:15,080 Speaker 2: already seems to be fading somewhat, does that turn into 521 00:28:15,080 --> 00:28:18,160 Speaker 2: something, maybe with further incidents, that has a meaningful impact 522 00:28:18,240 --> 00:28:21,560 Speaker 2: on the bottom line, you know.
And it is sort of 523 00:28:21,600 --> 00:28:24,359 Speaker 2: hard these days, right, for people. It's harder for people 524 00:28:24,400 --> 00:28:26,919 Speaker 2: to push back and draw lines when it comes to 525 00:28:26,960 --> 00:28:29,879 Speaker 2: the government. It seems like everybody's a bit more curtailed 526 00:28:29,920 --> 00:28:32,679 Speaker 2: and scared, right. But at some point, you know, 527 00:28:32,720 --> 00:28:35,400 Speaker 2: as we've seen with some of the protests, there 528 00:28:35,720 --> 00:28:37,320 Speaker 2: is a sort of breaking point, and I think there 529 00:28:37,359 --> 00:28:39,600 Speaker 2: needs to be a bigger pushback. But I think 530 00:28:39,640 --> 00:28:45,560 Speaker 2: we probably should also have more nuance around what that 531 00:28:45,600 --> 00:28:48,040 Speaker 2: pushback really is, you know, just saying Anthropic 532 00:28:48,200 --> 00:28:52,120 Speaker 2: is better. It's actually maybe, like, maybe everybody should get 533 00:28:52,120 --> 00:28:54,720 Speaker 2: together, and all these companies should be saying, hey, 534 00:28:54,760 --> 00:28:57,560 Speaker 2: we're not totally sure this stuff is ready even for 535 00:28:58,680 --> 00:29:01,560 Speaker 2: half the use cases, whatever it might be, and 536 00:29:01,720 --> 00:29:04,479 Speaker 2: then have the government explain to people how it's being used. I mean, 537 00:29:04,520 --> 00:29:07,480 Speaker 2: it's one of the things that Americans can do, you know, 538 00:29:07,640 --> 00:29:10,760 Speaker 2: is push political leaders, in theory, to do that. So 539 00:29:10,920 --> 00:29:12,280 Speaker 2: that's what I would hope for. 540 00:29:12,120 --> 00:29:16,040 Speaker 1: Well, thank you so much for coming through and 541 00:29:16,080 --> 00:29:17,960 Speaker 1: talking about this. It's heavy stuff, but we got to 542 00:29:18,000 --> 00:29:19,280 Speaker 1: talk about it. Thank you for real. 543 00:29:19,440 --> 00:29:21,120 Speaker 2: You're very welcome, yeah, thanks for having me. 544 00:29:24,280 --> 00:29:26,880 Speaker 1: Thank you so much for listening to another episode of 545 00:29:26,960 --> 00:29:29,000 Speaker 1: Kill Switch. You can email us if you want to 546 00:29:29,040 --> 00:29:32,840 Speaker 1: talk at kill switch at kaleidoscope dot NYC, or on 547 00:29:32,880 --> 00:29:35,840 Speaker 1: Instagram at kill switch pod, and if you like 548 00:29:35,840 --> 00:29:38,320 Speaker 1: what you're hearing, maybe leave us a review. It helps 549 00:29:38,360 --> 00:29:40,840 Speaker 1: other people find the show, which helps us keep doing 550 00:29:40,840 --> 00:29:43,040 Speaker 1: our thing. And as I said at the beginning of 551 00:29:43,080 --> 00:29:45,920 Speaker 1: the show, Kill Switch is on YouTube, so if you 552 00:29:45,960 --> 00:29:48,040 Speaker 1: want to catch the next one live, the link for 553 00:29:48,080 --> 00:29:51,880 Speaker 1: that and everything else is in the show notes. Kill 554 00:29:51,920 --> 00:29:54,920 Speaker 1: Switch is hosted by me, Dexter Thomas. It's produced by 555 00:29:54,960 --> 00:29:59,000 Speaker 1: Sena Ozaki, Darluck Potts, and Julia Nutter. Our theme song 556 00:29:59,120 --> 00:30:02,680 Speaker 1: is by me and Kyle Murdoch. From Kaleidoscope, our executive 557 00:30:02,680 --> 00:30:06,880 Speaker 1: producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. 558 00:30:06,960 --> 00:30:11,000 Speaker 1: From iHeart, our executive producers are Katrina Norvell and Nikki Ettore.
559 00:30:11,360 --> 00:30:11,880 Speaker 1: Catch you on the 560 00:30:11,920 --> 00:30:24,000 Speaker 3: next one. Goodbye.