WEBVTT - The Story: Are the US and China in a Tech War? w/ Jake Sullivan

0:00:13.119 --> 0:00:16.880
<v Speaker 1>Welcome to Tech Stuff. I'm Oz Woloshyn. On this podcast,

0:00:16.960 --> 0:00:20.520
<v Speaker 1>we cover emerging technologies and how they reflect but also

0:00:20.680 --> 0:00:26.720
<v Speaker 1>shape the way we live. Artificial intelligence, robotics, energy, batteries,

0:00:27.080 --> 0:00:31.640
<v Speaker 1>space exploration. More often than not, though, the conversation comes

0:00:31.640 --> 0:00:35.599
<v Speaker 1>back to the same place, China. It's a topic and

0:00:35.640 --> 0:00:38.800
<v Speaker 1>a country that everyone has an opinion about, but where

0:00:38.880 --> 0:00:43.800
<v Speaker 1>few have true insight. Today's guest is different. According to

0:00:43.840 --> 0:00:46.559
<v Speaker 1>a headline last year in Wired magazine, he is, quote,

0:00:46.880 --> 0:00:50.320
<v Speaker 1>the American who waged a tech war on China. The

0:00:50.400 --> 0:00:54.160
<v Speaker 1>article continues, China is racing to unseat the US as

0:00:54.200 --> 0:00:58.040
<v Speaker 1>the world's technological superpower, not if Jake Sullivan can help it.

0:00:58.600 --> 0:01:00.040
<v Speaker 1>As soon as I read the article, I knew I

0:01:00.120 --> 0:01:02.560
<v Speaker 1>wanted to have Jake Sullivan on the show.

0:01:03.280 --> 0:01:06.320
<v Speaker 1>He served as National Security Advisor for the duration of

0:01:06.400 --> 0:01:10.080
<v Speaker 1>Joe Biden's presidency, and according to my queries on Grok,

0:01:10.200 --> 0:01:13.679
<v Speaker 1>Gemini, and ChatGPT, he spent more time with his

0:01:13.760 --> 0:01:18.240
<v Speaker 1>Chinese counterparts, including Xi Jinping, than any US national security

0:01:18.280 --> 0:01:22.360
<v Speaker 1>advisor since Henry Kissinger. But today, a conversation about waging

0:01:22.360 --> 0:01:25.840
<v Speaker 1>a tech war on China, autonomous weapons on the battlefield

0:01:25.880 --> 0:01:29.560
<v Speaker 1>of Ukraine and far beyond, and what comes next for

0:01:29.640 --> 0:01:32.960
<v Speaker 1>a man dubbed a future potential president by Hillary Clinton.

0:01:33.520 --> 0:01:35.000
<v Speaker 1>Jake Sullivan, welcome to Tech Stuff.

0:01:35.360 --> 0:01:36.240
<v Speaker 2>Thanks for having me.

0:01:36.319 --> 0:01:38.640
<v Speaker 1>I wanted to start by asking you this, which is

0:01:39.200 --> 0:01:41.800
<v Speaker 1>you are known as a foreign policy person, right, I mean,

0:01:41.840 --> 0:01:45.560
<v Speaker 1>working in the State Department, working on the Iran nuclear

0:01:45.600 --> 0:01:51.240
<v Speaker 1>deal during the Obama administration, National Security Advisor under Joe Biden,

0:01:52.040 --> 0:01:54.560
<v Speaker 1>and yet more and more you seem to be a

0:01:54.600 --> 0:01:58.080
<v Speaker 1>tech policy person. And I'm curious you know how that

0:01:58.120 --> 0:02:02.120
<v Speaker 1>shift began and when you started to see tech as

0:02:02.200 --> 0:02:04.800
<v Speaker 1>such a crucial lens for national security.

0:02:05.280 --> 0:02:09.680
<v Speaker 2>It really began for me after twenty sixteen. I

0:02:09.720 --> 0:02:12.760
<v Speaker 2>served as a senior policy advisor on Hillary Clinton's campaign,

0:02:12.960 --> 0:02:14.800
<v Speaker 2>and it was over the course of that campaign that

0:02:14.880 --> 0:02:19.560
<v Speaker 2>I came to recognize first the degree to which the

0:02:19.600 --> 0:02:23.560
<v Speaker 2>intersection of domestic policy and foreign policy was where everything

0:02:23.680 --> 0:02:29.120
<v Speaker 2>was happening. And second, at that intersection was technology, both

0:02:29.240 --> 0:02:31.639
<v Speaker 2>what we were doing in the United States to invest

0:02:31.840 --> 0:02:35.280
<v Speaker 2>at the cutting edge and what was happening in the

0:02:35.360 --> 0:02:38.440
<v Speaker 2>US China tech competition. So I spent my years out

0:02:38.440 --> 0:02:41.840
<v Speaker 2>of government really digging into a set of technology, economic

0:02:41.960 --> 0:02:44.800
<v Speaker 2>and national security issues and when I came in as

0:02:44.880 --> 0:02:48.280
<v Speaker 2>National Security Advisor for the first time ever the National

0:02:48.360 --> 0:02:52.160
<v Speaker 2>Security Council, we stood up a Directorate on Technology and

0:02:52.240 --> 0:02:56.160
<v Speaker 2>National Security so that we could really create an engine

0:02:56.240 --> 0:02:58.680
<v Speaker 2>room for the kind of policy work I felt needed

0:02:58.720 --> 0:03:01.280
<v Speaker 2>to be done to make sure that the United States

0:03:01.320 --> 0:03:04.040
<v Speaker 2>was in the best possible position to make technology work

0:03:04.080 --> 0:03:05.560
<v Speaker 2>for us rather than against us.

0:03:06.160 --> 0:03:08.760
<v Speaker 1>Yeah, it's interesting. When you, when you leave office,

0:03:09.200 --> 0:03:12.200
<v Speaker 1>you know, where you go for an exit interview. When

0:03:12.200 --> 0:03:15.440
<v Speaker 1>you sat for the Wired magazine piece, why there? And

0:03:15.480 --> 0:03:17.600
<v Speaker 1>did you like the headline the American who waged a

0:03:17.639 --> 0:03:18.720
<v Speaker 1>tech war against China?

0:03:19.120 --> 0:03:21.080
<v Speaker 2>Great question on the headline. I can't say I love

0:03:21.120 --> 0:03:23.560
<v Speaker 2>the headline because I personally, you may have noticed, don't

0:03:23.639 --> 0:03:25.040
<v Speaker 2>use the phrase tech war.

0:03:25.240 --> 0:03:26.919
<v Speaker 1>No, it would be, it would be ill-advised for

0:03:27.520 --> 0:03:29.520
<v Speaker 1>someone in your position to use the word war, I

0:03:29.520 --> 0:03:30.840
<v Speaker 1>guess when talking about China.

0:03:31.000 --> 0:03:33.880
<v Speaker 2>Yeah, exactly, And I don't think of it in those terms. Actually,

0:03:33.919 --> 0:03:35.760
<v Speaker 2>I think of it as on a much more common

0:03:35.800 --> 0:03:39.160
<v Speaker 2>sense basis, what do we need to do to promote

0:03:39.360 --> 0:03:41.840
<v Speaker 2>the advancement of American technology and what do we need

0:03:41.880 --> 0:03:46.160
<v Speaker 2>to do to protect American technology from being used against

0:03:46.280 --> 0:03:50.320
<v Speaker 2>us by competitors like the People's Republic of China, and

0:03:50.320 --> 0:03:52.600
<v Speaker 2>that was the strategy that we pursued. It was not

0:03:52.880 --> 0:03:56.680
<v Speaker 2>a war, but it was hard-nosed in the sense

0:03:56.760 --> 0:03:58.760
<v Speaker 2>that we felt that for too long we had been

0:03:58.800 --> 0:04:01.800
<v Speaker 2>sending our most advanced dual-use technologies to China for

0:04:01.880 --> 0:04:04.200
<v Speaker 2>them to turn back against us, and we weren't going

0:04:04.240 --> 0:04:06.840
<v Speaker 2>to allow that to happen, particularly when it came to

0:04:06.880 --> 0:04:08.920
<v Speaker 2>advanced semiconductors.

0:04:08.480 --> 0:04:12.120
<v Speaker 1>When it comes to China specifically, I mean, what was

0:04:12.120 --> 0:04:16.280
<v Speaker 1>your worst fear about China getting an edge and what

0:04:16.320 --> 0:04:19.159
<v Speaker 1>was your mission to prevent that from happening?

0:04:19.640 --> 0:04:23.279
<v Speaker 2>Well, look, I've said since the start of my time,

0:04:23.560 --> 0:04:26.160
<v Speaker 2>from when I came in as National Security Advisor, that

0:04:27.360 --> 0:04:31.320
<v Speaker 2>China makes no bones about the fact that it seeks

0:04:31.360 --> 0:04:36.160
<v Speaker 2>to become the world's leading economic, diplomatic, and technological power.

0:04:36.920 --> 0:04:40.880
<v Speaker 2>And my view is that the United States is better off,

0:04:40.920 --> 0:04:43.039
<v Speaker 2>and frankly, the world is better off if the United

0:04:43.080 --> 0:04:47.640
<v Speaker 2>States sustains its position as the world's leading power, including

0:04:47.640 --> 0:04:50.960
<v Speaker 2>the world's leading technology power. That is the way that

0:04:51.040 --> 0:04:53.680
<v Speaker 2>we are most likely to be able to ensure that

0:04:53.760 --> 0:04:57.480
<v Speaker 2>this technological revolution which you have spoken to so many

0:04:57.560 --> 0:05:00.520
<v Speaker 2>different brilliant people about works more for us than it

0:05:00.560 --> 0:05:03.800
<v Speaker 2>works against us, and so I worried that China was

0:05:03.839 --> 0:05:08.520
<v Speaker 2>going to achieve supremacy or dominance in the key technologies

0:05:08.560 --> 0:05:10.640
<v Speaker 2>of the future. And it wasn't just my worry. I think

0:05:10.680 --> 0:05:13.159
<v Speaker 2>a lot of people back in twenty twenty one were

0:05:13.240 --> 0:05:15.920
<v Speaker 2>essentially saying, you know, the US is going to lose

0:05:15.920 --> 0:05:18.120
<v Speaker 2>out to China when it comes to AI and other

0:05:18.279 --> 0:05:22.600
<v Speaker 2>areas too. China has too many advantages. The United States

0:05:22.680 --> 0:05:24.600
<v Speaker 2>is too disorganized. It's just not going to get its

0:05:24.600 --> 0:05:27.120
<v Speaker 2>act together. And I was determined to make sure that

0:05:27.160 --> 0:05:29.960
<v Speaker 2>we competed as effectively and vigorously as possible so that

0:05:30.000 --> 0:05:31.760
<v Speaker 2>we could sustain the edge. But I also knew it

0:05:31.800 --> 0:05:34.040
<v Speaker 2>was going to be a tough competition because China has

0:05:34.120 --> 0:05:35.880
<v Speaker 2>a lot of capability to bring to bear in this

0:05:36.000 --> 0:05:37.960
<v Speaker 2>area as well as in other areas, some of which

0:05:37.960 --> 0:05:40.280
<v Speaker 2>you mentioned at the outset of the podcast, whether it's

0:05:40.400 --> 0:05:44.360
<v Speaker 2>robotics or autonomy or you name it. So I can't

0:05:44.360 --> 0:05:46.360
<v Speaker 2>say I went to sleep at night with a single

0:05:46.400 --> 0:05:49.760
<v Speaker 2>fear other than I felt it was my job to

0:05:49.839 --> 0:05:54.160
<v Speaker 2>help position the United States as best as possible in

0:05:54.200 --> 0:05:57.919
<v Speaker 2>this really sustained competition with China over the future of

0:05:58.400 --> 0:06:01.400
<v Speaker 2>key technologies that are going to define so much of

0:06:01.520 --> 0:06:04.840
<v Speaker 2>national security, but also economics and society in the years ahead,

0:06:05.120 --> 0:06:07.480
<v Speaker 2>and AI is as central to that as any.

0:06:08.160 --> 0:06:12.280
<v Speaker 1>There is this theory called the Thucydides trap, which has

0:06:12.320 --> 0:06:15.200
<v Speaker 1>been popularized by Graham Allison, who I know is a

0:06:15.240 --> 0:06:17.720
<v Speaker 1>colleague at Harvard now and kind of a mentor,

0:06:17.760 --> 0:06:20.040
<v Speaker 1>which essentially says that you know, if you look at

0:06:20.040 --> 0:06:23.600
<v Speaker 1>the history of a declining great power and a rising

0:06:23.680 --> 0:06:27.040
<v Speaker 1>great power in, should we say, our ten cases,

0:06:27.080 --> 0:06:29.360
<v Speaker 1>that will necessarily lead to war? And I think that's

0:06:29.360 --> 0:06:31.880
<v Speaker 1>been kind of like a bipartisan framework for thinking about

0:06:32.120 --> 0:06:35.800
<v Speaker 1>US and China. Is that a framework that you believe

0:06:35.880 --> 0:06:38.679
<v Speaker 1>to be true, that there is about a fifty percent

0:06:38.800 --> 0:06:42.320
<v Speaker 1>chance, that there's inevitability of a capital-W war? Like,

0:06:42.360 --> 0:06:45.120
<v Speaker 1>how do you apply that framework to your thinking about

0:06:45.200 --> 0:06:47.280
<v Speaker 1>China and then in turn your policy on tech.

0:06:47.680 --> 0:06:51.160
<v Speaker 2>I think Graham Allison's book is excellent. It deserves to

0:06:51.200 --> 0:06:53.760
<v Speaker 2>be read by everybody. I have it here in my

0:06:53.880 --> 0:06:56.400
<v Speaker 2>office at Harvard, and not just because he's my colleague

0:06:56.400 --> 0:07:00.640
<v Speaker 2>down the hall. But I do not believe that it

0:07:00.720 --> 0:07:05.119
<v Speaker 2>is excellent because it is describing an inevitable conflict between

0:07:05.160 --> 0:07:06.760
<v Speaker 2>the US and China, that this is some kind of

0:07:06.839 --> 0:07:08.680
<v Speaker 2>law of nature. In fact, he points out in the

0:07:08.720 --> 0:07:11.680
<v Speaker 2>book that there are cases where a rising power and

0:07:11.800 --> 0:07:16.280
<v Speaker 2>an established power didn't go to war. And my view is

0:07:16.320 --> 0:07:19.360
<v Speaker 2>that the US and China are going to be in

0:07:19.400 --> 0:07:24.840
<v Speaker 2>a sustained competition. It will be intense, it will be challenging,

0:07:25.560 --> 0:07:28.560
<v Speaker 2>but it is critical, it is vital that we manage

0:07:28.600 --> 0:07:32.000
<v Speaker 2>that competition so that it does not spill over into

0:07:32.040 --> 0:07:35.400
<v Speaker 2>conflict or confrontation. And my view is we can do that,

0:07:36.000 --> 0:07:38.160
<v Speaker 2>and I actually think the blueprint for that was laid

0:07:38.160 --> 0:07:40.120
<v Speaker 2>out over the four years of the Biden administration. We

0:07:40.120 --> 0:07:43.600
<v Speaker 2>were able to enhance America's competitive position, take tough actions,

0:07:44.120 --> 0:07:47.200
<v Speaker 2>and also engage in sustained diplomacy so that we didn't

0:07:47.320 --> 0:07:50.160
<v Speaker 2>move in the direction of war with China, which would

0:07:50.200 --> 0:07:52.800
<v Speaker 2>be a catastrophe not just for the US and China,

0:07:52.840 --> 0:07:53.840
<v Speaker 2>but for the entire world.

0:07:54.400 --> 0:07:56.440
<v Speaker 1>So let's talk about some of the tough measures. You

0:07:56.480 --> 0:08:00.960
<v Speaker 1>mentioned two countries, Holland and Japan, and this kind of

0:08:01.000 --> 0:08:06.120
<v Speaker 1>fascinating secret meeting that you convened with your Dutch and Japanese counterparts.

0:08:07.200 --> 0:08:08.200
<v Speaker 1>What were you trying to achieve?

0:08:08.840 --> 0:08:11.640
<v Speaker 2>So I'll be careful in what I say, because having

0:08:11.680 --> 0:08:15.040
<v Speaker 2>some discretion in the diplomatic relationships with Japan and the

0:08:15.080 --> 0:08:18.600
<v Speaker 2>Netherlands was critical to being able to achieve the kind

0:08:18.640 --> 0:08:23.320
<v Speaker 2>of consensus we were able to achieve on semiconductor manufacturing

0:08:23.320 --> 0:08:27.200
<v Speaker 2>equipment export controls. So, for your listeners, there's two types

0:08:27.240 --> 0:08:30.600
<v Speaker 2>of export controls. There's the controls on very high-end chips,

0:08:31.080 --> 0:08:35.240
<v Speaker 2>think H one hundred Nvidia chips or the Blackwells,

0:08:35.840 --> 0:08:41.160
<v Speaker 2>and then there are controls on the manufacturing equipment used

0:08:41.600 --> 0:08:45.880
<v Speaker 2>to make the highest end AI chips and other high

0:08:45.960 --> 0:08:51.320
<v Speaker 2>end chips. And that equipment is primarily produced by three countries,

0:08:51.520 --> 0:08:55.840
<v Speaker 2>the United States, the Netherlands, and Japan, and so having

0:08:55.880 --> 0:08:59.040
<v Speaker 2>all three of us come together around a common understanding

0:08:59.520 --> 0:09:04.040
<v Speaker 2>of what components and what machines are necessary for the manufacture

0:09:04.040 --> 0:09:08.320
<v Speaker 2>of semiconductors would be controlled was essential for such controls

0:09:08.360 --> 0:09:14.720
<v Speaker 2>to be effective, and that required persuasion, consultation, really laying

0:09:14.760 --> 0:09:17.520
<v Speaker 2>out the challenge as we saw it, and it also

0:09:17.600 --> 0:09:21.440
<v Speaker 2>required a deep level of technical detail, which is why

0:09:21.800 --> 0:09:24.240
<v Speaker 2>this really was not just about the diplomats or the

0:09:24.320 --> 0:09:27.079
<v Speaker 2>national security advisors being in the room. It was about

0:09:27.120 --> 0:09:30.200
<v Speaker 2>technical experts from each of the three countries really working

0:09:30.240 --> 0:09:32.520
<v Speaker 2>through the reality of this so that it wasn't just

0:09:32.520 --> 0:09:33.640
<v Speaker 2>a fly-by-night effort.

0:09:34.200 --> 0:09:38.640
<v Speaker 1>One policy goal was to prevent China from buying advanced

0:09:38.679 --> 0:09:42.120
<v Speaker 1>chips from companies like Nvidia, and the other was to

0:09:42.120 --> 0:09:46.079
<v Speaker 1>prevent China from developing the technology to produce them domestically,

0:09:46.120 --> 0:09:49.840
<v Speaker 1>which was more what the Japanese-Dutch diplomatic effort was about.

0:09:50.000 --> 0:09:56.679
<v Speaker 1>Around January twentieth, literally the day you left office, DeepSeek,

0:09:56.760 --> 0:10:01.920
<v Speaker 1>the Chinese AI company, announces a new reasoning model

0:10:02.280 --> 0:10:06.320
<v Speaker 1>that performs on par with the best American models and

0:10:06.440 --> 0:10:09.280
<v Speaker 1>was able to be developed in spite of the export controls. Now,

0:10:09.320 --> 0:10:11.400
<v Speaker 1>some people say they claim it's because they actually were

0:10:11.400 --> 0:10:14.199
<v Speaker 1>able to develop such a sophisticated technology that they didn't need

0:10:14.200 --> 0:10:17.280
<v Speaker 1>as many chips. Others say they just got around the

0:10:17.280 --> 0:10:19.640
<v Speaker 1>export controls. I don't know if you have a

0:10:19.640 --> 0:10:21.680
<v Speaker 1>hypothesis about which of those is true.

0:10:22.360 --> 0:10:24.600
<v Speaker 1>But either way, what was your reaction to this moment?

0:10:24.640 --> 0:10:26.560
<v Speaker 1>I mean, someone said this was basically the moment that

0:10:26.920 --> 0:10:28.800
<v Speaker 1>everything you were working on was designed to prevent.

0:10:29.840 --> 0:10:31.960
<v Speaker 2>Well, keep in mind, I'm glad you

0:10:32.040 --> 0:10:33.640
<v Speaker 2>used the word moment because it was a moment. It

0:10:33.640 --> 0:10:35.920
<v Speaker 2>was a moment in time in an ongoing competition, and

0:10:35.960 --> 0:10:39.000
<v Speaker 2>of course since then, OpenAI has come out with o3.

0:10:39.360 --> 0:10:42.040
<v Speaker 2>DeepSeek has come out with yet another version of

0:10:42.080 --> 0:10:44.959
<v Speaker 2>its R one model, and we'll see more models coming

0:10:44.960 --> 0:10:49.360
<v Speaker 2>out as we go. For me, DeepSeek sort of

0:10:49.360 --> 0:10:52.440
<v Speaker 2>showed me, number one, the power of a concerted PR

0:10:52.520 --> 0:10:56.880
<v Speaker 2>campaign by China to effectively say resistance is futile. These

0:10:56.920 --> 0:11:00.360
<v Speaker 2>export controls have failed. Give up, America. I would just

0:11:00.440 --> 0:11:02.920
<v Speaker 2>point out that it was the number one issue they

0:11:03.040 --> 0:11:05.200
<v Speaker 2>raised with me in every meeting, which suggests to me

0:11:05.320 --> 0:11:07.920
<v Speaker 2>they didn't think the export controls were totally useless.

0:11:08.040 --> 0:11:10.040
<v Speaker 1>The number one issue is, why do you try to prevent

0:11:10.120 --> 0:11:11.200
<v Speaker 1>us from buying chips?

0:11:11.480 --> 0:11:16.080
<v Speaker 2>Right? Why have you imposed these semiconductor export controls on us?

0:11:16.600 --> 0:11:19.199
<v Speaker 2>And you know, I spent a lot of time explaining

0:11:19.280 --> 0:11:21.760
<v Speaker 2>our theory of small yard, high fence, and what we

0:11:21.800 --> 0:11:26.719
<v Speaker 2>saw is the difference between a national security logic to

0:11:27.040 --> 0:11:30.040
<v Speaker 2>export controls and some kind of economic or technology blockade,

0:11:30.040 --> 0:11:31.840
<v Speaker 2>which I do not believe that we are engaged in

0:11:31.880 --> 0:11:32.320
<v Speaker 2>with China.

0:11:32.640 --> 0:11:35.400
<v Speaker 1>They've responded, just to your small yard, high fence metaphor,

0:11:35.440 --> 0:11:37.679
<v Speaker 1>with big yard, iron curtain, right? Correct?

0:11:37.960 --> 0:11:41.640
<v Speaker 2>That is what they said. And I said, well,

0:11:41.679 --> 0:11:44.280
<v Speaker 2>we see it very differently. I see it as a

0:11:44.320 --> 0:11:46.360
<v Speaker 2>small yard and a high fence. And part of the

0:11:46.360 --> 0:11:48.439
<v Speaker 2>reason I know it's small is I know in the

0:11:48.480 --> 0:11:50.600
<v Speaker 2>meetings that I'm conducting how much we're keeping out of it,

0:11:50.679 --> 0:11:53.960
<v Speaker 2>how much continuing technology trade there is between the US

0:11:53.960 --> 0:11:56.440
<v Speaker 2>and China. But the other thing about DeepSeek

0:11:56.440 --> 0:11:59.040
<v Speaker 2>that I think is really important for people to recognize

0:11:59.240 --> 0:12:04.679
<v Speaker 2>is the big breakthrough was really an OpenAI breakthrough

0:12:04.679 --> 0:12:07.400
<v Speaker 2>that then DeepSeek more or less replicated and then

0:12:07.440 --> 0:12:11.560
<v Speaker 2>improved upon. And I don't actually know exactly, I'm not

0:12:11.640 --> 0:12:14.040
<v Speaker 2>inside DeepSeek, so I can't say, but it seems

0:12:14.040 --> 0:12:17.680
<v Speaker 2>on the balance of the evidence that the DeepSeek

0:12:17.760 --> 0:12:20.400
<v Speaker 2>story is the story that involves them having access to

0:12:20.440 --> 0:12:24.160
<v Speaker 2>a range of Western chips, some of which they got

0:12:24.600 --> 0:12:27.480
<v Speaker 2>before the export controls ever went into place, and some

0:12:27.679 --> 0:12:30.960
<v Speaker 2>they got between the first set in twenty two and

0:12:31.000 --> 0:12:33.840
<v Speaker 2>the second set in twenty three, when Nvidia basically

0:12:33.920 --> 0:12:37.840
<v Speaker 2>designed around the controls, what they call the H eight hundred.

0:12:38.480 --> 0:12:41.800
<v Speaker 2>So there was an H one hundred, which was at

0:12:41.840 --> 0:12:46.600
<v Speaker 2>one point the most elite chip. Nvidia basically reduced

0:12:46.600 --> 0:12:49.320
<v Speaker 2>the interconnect speed to get around the control, created the

0:12:49.480 --> 0:12:52.400
<v Speaker 2>H eight hundred for the Chinese market. That got fixed

0:12:52.400 --> 0:12:55.160
<v Speaker 2>in twenty three. And so I think the right way

0:12:55.200 --> 0:12:58.720
<v Speaker 2>to think about the export control story is it started

0:12:58.720 --> 0:13:02.840
<v Speaker 2>in twenty two. The first draft of those export controls

0:13:02.840 --> 0:13:05.400
<v Speaker 2>in twenty two we learned lessons from because this was

0:13:05.440 --> 0:13:08.800
<v Speaker 2>a new undertaking, and one of those lessons involved how

0:13:08.800 --> 0:13:12.280
<v Speaker 2>to make sure that there couldn't be this easy design around,

0:13:12.280 --> 0:13:14.360
<v Speaker 2>this easy workaround, and that's why we updated them in

0:13:14.400 --> 0:13:17.480
<v Speaker 2>twenty three and updated them again in twenty four. So

0:13:18.280 --> 0:13:22.439
<v Speaker 2>these are not going to be one hundred percent foolproof.

0:13:23.240 --> 0:13:26.080
<v Speaker 2>What they are designed to do is, to the maximum

0:13:26.080 --> 0:13:30.760
<v Speaker 2>extent possible, make sure that the highest end national security

0:13:30.840 --> 0:13:34.640
<v Speaker 2>related technology, these very high end AI chips are not

0:13:35.120 --> 0:13:37.680
<v Speaker 2>going to China to be used against America or its allies.

0:13:38.040 --> 0:13:41.280
<v Speaker 2>And you know, I can't guarantee zero leakage, or that it's

0:13:41.280 --> 0:13:44.640
<v Speaker 2>not a dynamic scenario where China and others are working

0:13:44.720 --> 0:13:46.679
<v Speaker 2>to get around them. We just have to keep at

0:13:46.720 --> 0:13:49.160
<v Speaker 2>it and continue to learn and continue to update.

0:13:49.679 --> 0:13:52.400
<v Speaker 1>So just to play it back to you, your view

0:13:52.400 --> 0:13:59.119
<v Speaker 1>is far from controls on exports of advanced chips being futile.

0:13:59.200 --> 0:14:02.400
<v Speaker 1>And even if DeepSeek were able to distill knowledge

0:14:02.520 --> 0:14:07.280
<v Speaker 1>from OpenAI's models, there is a value in these

0:14:07.320 --> 0:14:11.200
<v Speaker 1>export controls. And maybe DeepSeek's success was because the

0:14:11.240 --> 0:14:15.480
<v Speaker 1>export controls weren't effective until let's say twenty three, and

0:14:15.600 --> 0:14:18.640
<v Speaker 1>really your legacy will be making sure the next DeepSeek

0:14:18.679 --> 0:14:19.680
<v Speaker 1>moment doesn't happen.

0:14:20.160 --> 0:14:23.560
<v Speaker 2>Yeah, I mean, I think realistically time will tell. What

0:14:23.600 --> 0:14:26.600
<v Speaker 2>I would just argue is I believe it's already had

0:14:26.640 --> 0:14:30.920
<v Speaker 2>an impact, because I think it has placed quite a

0:14:30.960 --> 0:14:35.520
<v Speaker 2>limit on China's access to an incredibly important resource for

0:14:35.600 --> 0:14:39.960
<v Speaker 2>frontier AI development, compute, and that we can expect that

0:14:40.400 --> 0:14:44.880
<v Speaker 2>impact to be seen over the course of the rest

0:14:44.880 --> 0:14:47.640
<v Speaker 2>of twenty five, twenty six, twenty seven. You know, if

0:14:47.680 --> 0:14:51.400
<v Speaker 2>you look at public statements from DeepSeek's CEO, one

0:14:51.440 --> 0:14:54.120
<v Speaker 2>of the things he talks about is access to compute

0:14:54.200 --> 0:14:57.480
<v Speaker 2>being a challenge for him, something the Chinese government doesn't

0:14:57.520 --> 0:15:01.720
<v Speaker 2>like to acknowledge, but it's something he's sort of very

0:15:01.920 --> 0:15:04.840
<v Speaker 2>publicly talked about. And China's effort to have its own

0:15:04.920 --> 0:15:08.280
<v Speaker 2>chip has not met with the kind of success that

0:15:08.360 --> 0:15:11.480
<v Speaker 2>the Chinese propagandists want to suggest it has, either in

0:15:11.560 --> 0:15:13.960
<v Speaker 2>terms of their performance or in terms of the scale

0:15:14.000 --> 0:15:17.000
<v Speaker 2>of what they're capable of producing. They cannot produce in

0:15:17.040 --> 0:15:20.800
<v Speaker 2>a year anything remotely resembling what, for example, Nvidia

0:15:20.880 --> 0:15:23.840
<v Speaker 2>can produce, not to mention the Google TPUs or Trainium

0:15:24.880 --> 0:15:29.720
<v Speaker 2>or other American-designed silicon. So I think those core

0:15:30.320 --> 0:15:34.080
<v Speaker 2>observations are intact sitting here today in June of twenty

0:15:34.120 --> 0:15:36.760
<v Speaker 2>twenty five, and now let's see how things play out.

0:15:36.960 --> 0:15:39.520
<v Speaker 1>I want to talk about Nvidia for a moment. I mean, obviously,

0:15:39.560 --> 0:15:41.600
<v Speaker 1>Jensen Huang, the CEO of Nvidia, has been out

0:15:41.640 --> 0:15:47.040
<v Speaker 1>there loudly and publicly castigating in effect your policy and

0:15:47.080 --> 0:15:50.320
<v Speaker 1>also saying, you know, necessity is the mother of invention

0:15:50.440 --> 0:15:53.320
<v Speaker 1>and in fact US export controls, and we've heard this

0:15:53.440 --> 0:16:00.520
<v Speaker 1>argument before, spurred this great development in Chinese domestic

0:16:00.520 --> 0:16:02.680
<v Speaker 1>technology. How do you respond?

0:16:03.040 --> 0:16:06.320
<v Speaker 2>Well, first, it's an interesting statement, the suggestion that

0:16:06.360 --> 0:16:09.400
<v Speaker 2>if DeepSeek had more Nvidia chips, it would not

0:16:09.680 --> 0:16:13.840
<v Speaker 2>have invented R one. I mean, that is, to me,

0:16:13.880 --> 0:16:18.200
<v Speaker 2>a kind of bizarre statement. But the argument that I

0:16:18.320 --> 0:16:24.160
<v Speaker 2>frequently encounter in this area is, aha, look, China's motivated

0:16:24.280 --> 0:16:26.560
<v Speaker 2>when it comes to AI and motivated when it comes

0:16:26.560 --> 0:16:30.120
<v Speaker 2>to semiconductor production. Because you are placing limits on the

0:16:30.200 --> 0:16:33.720
<v Speaker 2>chips and the manufacturing equipment they can access, you've motivated them.

0:16:33.880 --> 0:16:36.760
<v Speaker 2>And this is really Jensen's argument, and I think that

0:16:36.840 --> 0:16:41.400
<v Speaker 2>this misses just the facts and the sequence of what's

0:16:41.480 --> 0:16:44.320
<v Speaker 2>unfolded over the last decade. It was China who came

0:16:44.360 --> 0:16:46.920
<v Speaker 2>out years before these export controls were ever in place

0:16:47.360 --> 0:16:49.760
<v Speaker 2>and announced to the world that they were going to

0:16:49.800 --> 0:16:52.920
<v Speaker 2>be the world leader in AI by twenty thirty. And

0:16:52.960 --> 0:16:55.480
<v Speaker 2>that one of the areas that Made in China twenty

0:16:55.520 --> 0:16:57.480
<v Speaker 2>twenty five, by the way, a policy that was put

0:16:57.520 --> 0:16:59.640
<v Speaker 2>in place more than a decade ago, was going to

0:16:59.680 --> 0:17:05.040
<v Speaker 2>focus on was chips. And so the semiconductor export controls

0:17:05.240 --> 0:17:09.000
<v Speaker 2>did not create this big push by China. The semiconductor

0:17:09.040 --> 0:17:12.760
<v Speaker 2>export controls responded to this big push by China. And

0:17:12.840 --> 0:17:15.520
<v Speaker 2>so I think that this narrative of necessity as the

0:17:15.520 --> 0:17:18.760
<v Speaker 2>mother of invention kind of gets the sequence exactly backwards.

0:17:19.119 --> 0:17:22.399
<v Speaker 2>It suggests that the big Chinese push all happened post

0:17:22.440 --> 0:17:26.880
<v Speaker 2>export controls, when in fact it long predated the export controls,

0:17:26.880 --> 0:17:29.320
<v Speaker 2>and the export controls were in part motivated by saying, well,

0:17:29.320 --> 0:17:31.359
<v Speaker 2>wait a second, China's kind of come out to the

0:17:31.359 --> 0:17:32.760
<v Speaker 2>world and said, this is what we want to do,

0:17:32.880 --> 0:17:35.480
<v Speaker 2>we're telling all of you, and us saying, well, we're not

0:17:35.520 --> 0:17:38.320
<v Speaker 2>going to make it easier for you, because you know,

0:17:38.440 --> 0:17:40.679
<v Speaker 2>we want to sustain our advantages.

0:17:40.240 --> 0:17:42.600
<v Speaker 1>Just to play devil's advocate, though. I mean, obviously, any

0:17:42.920 --> 0:17:45.720
<v Speaker 1>economy has a demand side and a supply side, right,

0:17:45.800 --> 0:17:49.399
<v Speaker 1>and clearly you know you're right. The demand side from

0:17:49.440 --> 0:17:52.040
<v Speaker 1>the Chinese Communist Party and the government was there: get

0:17:52.080 --> 0:17:56.040
<v Speaker 1>to AI supremacy by twenty thirty, onshore chip manufacturing.

0:17:56.800 --> 0:18:00.240
<v Speaker 1>But there's a lot of demand-side mandates from the

0:18:00.280 --> 0:18:05.159
<v Speaker 1>Chinese Communist Party which don't translate into enormous successes. The

0:18:05.240 --> 0:18:09.159
<v Speaker 1>supply-side issue, which was US policy saying, well,

0:18:09.160 --> 0:18:11.959
<v Speaker 1>you can't have this from us, may have ultimately been

0:18:11.960 --> 0:18:15.680
<v Speaker 1>a much more powerful and motivating signal to Chinese technology

0:18:15.720 --> 0:18:18.040
<v Speaker 1>companies than the dictates of their own government.

0:18:18.720 --> 0:18:21.000
<v Speaker 2>Well, I would make a couple points in response to that.

0:18:21.080 --> 0:18:23.720
<v Speaker 2>The first is that in a bunch of the other

0:18:23.880 --> 0:18:27.400
<v Speaker 2>key areas that China also identified in Made in China twenty

0:18:27.480 --> 0:18:31.880
<v Speaker 2>twenty five, you've seen huge progress in China, huge progress, right.

0:18:32.000 --> 0:18:35.200
<v Speaker 2>So somehow the demand and supply sides lined up when

0:18:35.240 --> 0:18:39.560
<v Speaker 2>it came to things like robotics or electric vehicles, or

0:18:39.840 --> 0:18:42.040
<v Speaker 2>you go down the list of what they identified as

0:18:42.040 --> 0:18:44.440
<v Speaker 2>their sectors. It's not like AI stands out as something

0:18:44.440 --> 0:18:46.800
<v Speaker 2>that all of a sudden blipped off the map because

0:18:46.800 --> 0:18:49.359
<v Speaker 2>of our export controls. So I find it a little

0:18:49.359 --> 0:18:55.200
<v Speaker 2>hard to square with what has been a very distinctive

0:18:56.200 --> 0:19:00.879
<v Speaker 2>presidentially dictated policy, of which there are not many

0:19:01.160 --> 0:19:03.520
<v Speaker 2>that look like Made in China twenty twenty five, or

0:19:03.560 --> 0:19:06.359
<v Speaker 2>where they have put as much energy as they have

0:19:06.480 --> 0:19:07.199
<v Speaker 2>in this area.

0:19:07.760 --> 0:19:10.000
<v Speaker 1>I know that your Chinese counterparts often liked to ask

0:19:10.040 --> 0:19:13.200
<v Speaker 1>you this kind of rhetorical question, where's the line between

0:19:13.200 --> 0:19:18.359
<v Speaker 1>economic policy and national security? A philosophical question. And I'm curious,

0:19:18.720 --> 0:19:21.520
<v Speaker 1>you know, did you and President Biden look with envy

0:19:22.160 --> 0:19:25.920
<v Speaker 1>at President Xi's ability to onboard massive new energy onto

0:19:25.960 --> 0:19:30.760
<v Speaker 1>the grid, to dictate the decision making at the top

0:19:30.840 --> 0:19:32.959
<v Speaker 1>Chinese tech firms in a way that the US government

0:19:33.000 --> 0:19:36.200
<v Speaker 1>and the tech firms are increasingly or often in tension? I mean,

0:19:36.880 --> 0:19:39.720
<v Speaker 1>was there something about the Chinese model in terms of

0:19:39.800 --> 0:19:44.480
<v Speaker 1>preparing for this new industrial and AI revolution that you

0:19:44.480 --> 0:19:45.439
<v Speaker 1>wish you could have borrowed?

0:19:45.840 --> 0:19:48.800
<v Speaker 2>Well, look, anytime you're working at the White House and

0:19:48.840 --> 0:19:55.280
<v Speaker 2>you're dealing with bureaucracy, red tape, regulation, the thought is

0:19:55.280 --> 0:19:57.359
<v Speaker 2>never far from your mind. Hey, if I was just

0:19:57.520 --> 0:20:00.800
<v Speaker 2>all powerful and President Biden was all powerful, we could

0:20:00.840 --> 0:20:03.280
<v Speaker 2>just not have to waste time with any of this.

0:20:03.680 --> 0:20:06.080
<v Speaker 2>But then you pause and you say, hey, wait a second, Actually,

0:20:06.080 --> 0:20:09.040
<v Speaker 2>democracy is a pretty good form of government, one. And

0:20:09.040 --> 0:20:14.000
<v Speaker 2>two, the American technology model, which is messier, which is

0:20:14.040 --> 0:20:21.119
<v Speaker 2>more decentralized, which is in many ways more maddening, also

0:20:21.200 --> 0:20:23.240
<v Speaker 2>really works. And it's a good thing to bet on.

0:20:24.000 --> 0:20:26.560
<v Speaker 2>And that's why I think we're in such a disturbing

0:20:26.640 --> 0:20:29.320
<v Speaker 2>moment right now, because, look, some of the advantages

0:20:29.320 --> 0:20:31.879
<v Speaker 2>the US has. One is the ability to attract talent

0:20:32.240 --> 0:20:35.200
<v Speaker 2>from all over the world. Another is this ecosystem of

0:20:35.240 --> 0:20:39.960
<v Speaker 2>basic research funding supplied by the government, research universities, and

0:20:40.040 --> 0:20:42.840
<v Speaker 2>the private sector. And two of those three pillars are

0:20:42.880 --> 0:20:46.320
<v Speaker 2>being knocked out by this administration. It strikes me that

0:20:47.080 --> 0:20:50.680
<v Speaker 2>the most recent announcement saying basically, we're going to try

0:20:50.680 --> 0:20:53.680
<v Speaker 2>to really dramatically reduce the number of Chinese grad students

0:20:53.680 --> 0:20:58.200
<v Speaker 2>and undergraduate students and researchers in the United States, all

0:20:58.200 --> 0:21:00.160
<v Speaker 2>of these are self-harming moves because all of them

0:21:00.640 --> 0:21:04.800
<v Speaker 2>take away from this unique model the US has used

0:21:04.840 --> 0:21:08.840
<v Speaker 2>to build and sustain an innovation edge over time. So

0:21:08.880 --> 0:21:11.240
<v Speaker 2>the Chinese have their way of doing it. We have

0:21:11.320 --> 0:21:14.280
<v Speaker 2>to look at that and say, that's a formidable competitor.

0:21:14.800 --> 0:21:18.119
<v Speaker 2>But it is precisely that observation that we are dealing

0:21:18.119 --> 0:21:21.399
<v Speaker 2>with a formidable competitor that motivated us to say, what

0:21:21.440 --> 0:21:22.919
<v Speaker 2>do we need to do on the promote side, what

0:21:22.920 --> 0:21:25.840
<v Speaker 2>do we need to do to push the boundaries, and

0:21:25.880 --> 0:21:27.400
<v Speaker 2>then what do we need to do on the protect

0:21:27.440 --> 0:21:30.320
<v Speaker 2>side to make sure that our most advanced technologies aren't

0:21:30.320 --> 0:21:33.560
<v Speaker 2>being used against us? So it was that sense

0:21:33.640 --> 0:21:38.399
<v Speaker 2>of real understanding that this was going to be a

0:21:38.440 --> 0:21:41.679
<v Speaker 2>hard race in many of these different technology areas that

0:21:41.800 --> 0:21:44.440
<v Speaker 2>motivated us to take the policy steps that we took.

0:21:46.560 --> 0:21:50.240
<v Speaker 1>After the break, Jake Sullivan on how the Biden administration

0:21:50.840 --> 0:21:56.200
<v Speaker 1>was preparing for the possibility of AGI, a system

0:21:56.480 --> 0:22:00.760
<v Speaker 1>that is superior to humans in effectively all cognitive tasks,

0:22:01.400 --> 0:22:04.400
<v Speaker 1>and also Jake talks about the role of autonomous weapons

0:22:04.400 --> 0:22:12.520
<v Speaker 1>in current conflicts from Ukraine to Gaza. I want to

0:22:12.560 --> 0:22:16.719
<v Speaker 1>zoom out a bit, Jake. One of the most interesting

0:22:16.920 --> 0:22:21.280
<v Speaker 1>conversations I've seen this year was between Ezra Klein

0:22:21.320 --> 0:22:25.119
<v Speaker 1>and Ben Buchanan, the White House Special

0:22:25.119 --> 0:22:29.119
<v Speaker 1>Advisor on AI for the Biden administration. Essentially,

0:22:29.160 --> 0:22:32.919
<v Speaker 1>they had a conversation where Ben Buchanan said it was

0:22:32.960 --> 0:22:38.200
<v Speaker 1>the belief of the Biden administration that artificial general intelligence

0:22:38.400 --> 0:22:41.119
<v Speaker 1>is going to come within the next four years, i.e.,

0:22:41.640 --> 0:22:44.840
<v Speaker 1>Not just AI that helps humans do tasks, but a

0:22:44.880 --> 0:22:50.240
<v Speaker 1>system that is superior to humans in effectively all cognitive tasks.

0:22:50.880 --> 0:22:53.160
<v Speaker 1>Was that your view, from those conversations you were part of?

0:22:53.800 --> 0:22:57.160
<v Speaker 2>I would amend it slightly to say that a premise

0:22:57.200 --> 0:22:59.840
<v Speaker 2>of our approach, both on the national security side and

0:22:59.840 --> 0:23:02.720
<v Speaker 2>writ large, in the process that Ben and Bruce

0:23:02.800 --> 0:23:06.399
<v Speaker 2>Reed led, was that that was a distinct possibility, and

0:23:06.480 --> 0:23:09.480
<v Speaker 2>therefore we had to plan and prepare for it. Not

0:23:10.200 --> 0:23:13.600
<v Speaker 2>a bold prediction that it would certainly arrive, but that

0:23:13.840 --> 0:23:16.320
<v Speaker 2>it was and remains a distinct possibility.

0:23:16.800 --> 0:23:19.440
<v Speaker 1>Where did you fall in the debate about how probable this is?

0:23:20.080 --> 0:23:23.639
<v Speaker 2>My own personal view has always been, I'm not sure,

0:23:24.400 --> 0:23:28.719
<v Speaker 2>because, and the reason I'm definitely not sure about this

0:23:28.800 --> 0:23:34.960
<v Speaker 2>one is that incredibly smart people have incredibly different timetables

0:23:36.080 --> 0:23:41.639
<v Speaker 2>for when AGI will arrive. And what I have to

0:23:41.640 --> 0:23:46.320
<v Speaker 2>do as National Security Advisor is take that spectrum of

0:23:46.359 --> 0:23:51.520
<v Speaker 2>opinion and say, Okay, is there a sufficient degree of

0:23:51.560 --> 0:23:54.280
<v Speaker 2>credibility we would assign to the proposition that it could

0:23:54.320 --> 0:23:56.760
<v Speaker 2>come in the next four years? If the answer is yes,

0:23:57.119 --> 0:24:00.560
<v Speaker 2>then we need to operate against that assumption because it

0:24:00.600 --> 0:24:03.639
<v Speaker 2>means we don't have much time to get prepared for

0:24:03.760 --> 0:24:06.480
<v Speaker 2>what that will mean from a national security perspective, an

0:24:06.520 --> 0:24:10.960
<v Speaker 2>economic perspective, a social perspective, you know, the impact on

0:24:11.440 --> 0:24:16.320
<v Speaker 2>every facet of human life. And so we were operating

0:24:16.640 --> 0:24:20.320
<v Speaker 2>under the premise that this was a distinct possibility, but

0:24:20.520 --> 0:24:23.439
<v Speaker 2>not that it was a certainty, because we couldn't be certain.

0:24:23.920 --> 0:24:27.840
<v Speaker 2>And I remain fascinated by the AGI debate today because

0:24:28.000 --> 0:24:30.840
<v Speaker 2>a lot of the content I consume in the debate

0:24:30.920 --> 0:24:36.840
<v Speaker 2>over AI is about these really quite wildly different perceptions

0:24:36.880 --> 0:24:39.720
<v Speaker 2>of both where we are and when the next breakthroughs

0:24:39.720 --> 0:24:43.160
<v Speaker 2>are going to come and how. And I find that

0:24:43.359 --> 0:24:46.440
<v Speaker 2>just so interesting. It reminds me a little bit of

0:24:46.960 --> 0:24:49.680
<v Speaker 2>the very intense debates over the future of the Chinese economy,

0:24:49.720 --> 0:24:52.119
<v Speaker 2>where you can find people saying it's going to be

0:24:52.119 --> 0:24:54.520
<v Speaker 2>an unstoppable juggernaut, you can find people who are saying,

0:24:54.600 --> 0:24:58.400
<v Speaker 2>basically it's hobbled by intractable problems, and everywhere

0:24:58.440 --> 0:25:02.680
<v Speaker 2>in between. Again, in that area, I get asked,

0:25:02.720 --> 0:25:05.520
<v Speaker 2>where did you fall on that debate over the

0:25:05.560 --> 0:25:08.800
<v Speaker 2>Chinese economy? I said, similarly, I didn't take a fixed,

0:25:08.880 --> 0:25:11.200
<v Speaker 2>determined view because I had to prepare for a range

0:25:11.240 --> 0:25:12.400
<v Speaker 2>of different contingencies.

0:25:13.040 --> 0:25:15.760
<v Speaker 1>I'm curious what probability, as a percentage, you would assign.

0:25:17.080 --> 0:25:21.560
<v Speaker 1>How on Earth do you respond to this transformative hypothetical

0:25:21.560 --> 0:25:22.920
<v Speaker 1>in terms of creating policy?

0:25:23.720 --> 0:25:26.600
<v Speaker 2>Well, in a number of ways. I mean, for starters,

0:25:26.800 --> 0:25:29.320
<v Speaker 2>a lot of the work around alignment, safety, and

0:25:29.359 --> 0:25:32.640
<v Speaker 2>security was work that was really stimulated by our administration

0:25:32.920 --> 0:25:36.479
<v Speaker 2>in concert and coordination with countries around the world. You know,

0:25:36.800 --> 0:25:42.880
<v Speaker 2>the Bletchley Park meeting in the UK, followed by successive summits. Obviously,

0:25:42.880 --> 0:25:45.399
<v Speaker 2>this administration takes a different view on the question of

0:25:45.400 --> 0:25:46.840
<v Speaker 2>safety and alignment than we did.

0:25:47.160 --> 0:25:49.679
<v Speaker 1>Vance was in Europe basically saying that guardrails are

0:25:49.720 --> 0:25:51.200
<v Speaker 1>off, innovate away.

0:25:51.359 --> 0:25:54.760
<v Speaker 2>Yeah, exactly, exactly. The second thing was actually how we

0:25:54.760 --> 0:25:57.320
<v Speaker 2>would deal with China directly on the issue of AI risk.

0:25:57.760 --> 0:25:59.800
<v Speaker 2>So not only what's it going to take for the

0:25:59.840 --> 0:26:02.680
<v Speaker 2>US to remain in the lead at the frontier, but also,

0:26:04.119 --> 0:26:06.520
<v Speaker 2>you know, how do we have a real dialogue for

0:26:06.640 --> 0:26:10.800
<v Speaker 2>a joint interest in managing risks to all of humanity,

0:26:10.960 --> 0:26:15.480
<v Speaker 2>including all Americans and all Chinese. And President Biden and

0:26:15.560 --> 0:26:19.119
<v Speaker 2>President Xi agreed to launch that dialogue when they met

0:26:19.200 --> 0:26:21.639
<v Speaker 2>at the summit in San Francisco in twenty twenty three.

0:26:22.080 --> 0:26:24.040
<v Speaker 2>We had a first session of it in twenty four.

0:26:24.359 --> 0:26:26.000
<v Speaker 2>I don't know if that's going to continue in this

0:26:26.040 --> 0:26:27.040
<v Speaker 2>administration or not.

0:26:27.560 --> 0:26:31.280
<v Speaker 1>What did they talk about? Was the framework to step

0:26:31.320 --> 0:26:35.120
<v Speaker 1>beyond national borders and consider a threat to humanity as

0:26:35.240 --> 0:26:38.440
<v Speaker 1>the presidents of the two most powerful countries in the world?

0:26:38.440 --> 0:26:39.840
<v Speaker 1>And what were those conversations like?

0:26:40.040 --> 0:26:42.080
<v Speaker 2>Well, first, I don't want to overstate how far the

0:26:42.119 --> 0:26:47.679
<v Speaker 2>first one got. These kinds of diplomatic engagements, especially between

0:26:47.840 --> 0:26:52.159
<v Speaker 2>two countries that are at once having dialogue and competing,

0:26:53.160 --> 0:26:55.919
<v Speaker 2>are always a bit tentative, and both sides come with

0:26:55.960 --> 0:26:59.160
<v Speaker 2>their cards a little close to their vest. But that

0:26:59.200 --> 0:27:05.240
<v Speaker 2>meeting did begin to expose a common understanding of

0:27:05.280 --> 0:27:08.000
<v Speaker 2>some of the big risks, including the risk at the

0:27:08.040 --> 0:27:13.200
<v Speaker 2>convergence of biology or biotech and artificial intelligence, the risk

0:27:13.240 --> 0:27:18.280
<v Speaker 2>of misalignment, misaligned AI causing all manner of harm, and

0:27:18.280 --> 0:27:22.080
<v Speaker 2>the risk of proliferation, AI getting into the hands of

0:27:22.600 --> 0:27:25.480
<v Speaker 2>bad actors who would choose to threaten both the US

0:27:25.480 --> 0:27:28.159
<v Speaker 2>and China or other nation states for that matter, or

0:27:28.280 --> 0:27:32.040
<v Speaker 2>just seek generally to destabilize. So that first session, I

0:27:32.080 --> 0:27:36.040
<v Speaker 2>would say, was a warm up and it still remains

0:27:36.240 --> 0:27:38.439
<v Speaker 2>a task for the US and China to have a

0:27:38.560 --> 0:27:41.520
<v Speaker 2>much deeper, sustained dialogue on this, something that I very

0:27:41.600 --> 0:27:44.840
<v Speaker 2>much support, even as I support the United States continuing

0:27:44.920 --> 0:27:47.879
<v Speaker 2>to do the work to stay in the lead. I

0:27:47.920 --> 0:27:51.480
<v Speaker 2>think it is not the essence of JD Vance's message

0:27:51.600 --> 0:27:55.560
<v Speaker 2>in Paris earlier this year, which was, we're in for

0:27:55.600 --> 0:27:58.880
<v Speaker 2>a race, so, you know, nothing that will slow us down,

0:28:00.040 --> 0:28:03.360
<v Speaker 2>no regulation of any kind. I think we need to race

0:28:03.400 --> 0:28:05.840
<v Speaker 2>as fast as we can, but we also do need

0:28:05.880 --> 0:28:10.359
<v Speaker 2>guardrails to ensure safety and alignment. I guess the

0:28:10.400 --> 0:28:12.159
<v Speaker 2>word safety is a bit out of vogue with this

0:28:12.200 --> 0:28:15.720
<v Speaker 2>administration, so call it security and alignment. We do need

0:28:15.760 --> 0:28:19.720
<v Speaker 2>those. Now, at the same time, we have to pay

0:28:19.720 --> 0:28:22.800
<v Speaker 2>attention to the fact that if we're pursuing a bunch

0:28:22.840 --> 0:28:26.040
<v Speaker 2>of guardrails, maybe alongside other like-minded states, and China's

0:28:26.080 --> 0:28:29.000
<v Speaker 2>choosing the no guardrail approach, you know, that could give

0:28:29.040 --> 0:28:32.000
<v Speaker 2>them an advantage in the race and the competition as

0:28:32.040 --> 0:28:35.200
<v Speaker 2>we go forward. So one of the things I draw

0:28:35.240 --> 0:28:40.640
<v Speaker 2>upon as a partial analogy, because it is completely imperfect,

0:28:41.280 --> 0:28:45.800
<v Speaker 2>is the fact that we were, you know, simultaneously building

0:28:45.840 --> 0:28:49.640
<v Speaker 2>up our nuclear arsenals with the Soviet Union and also

0:28:50.200 --> 0:28:53.800
<v Speaker 2>beginning to think about both proliferation and the US and

0:28:53.800 --> 0:28:57.880
<v Speaker 2>the Soviets worked together on the Non-Proliferation Treaty and

0:28:58.080 --> 0:29:01.680
<v Speaker 2>on arms control, and over the series of agreements from the

0:29:01.680 --> 0:29:05.320
<v Speaker 2>seventies onward, we came to understandings about arms control and

0:29:05.360 --> 0:29:11.120
<v Speaker 2>even arms reduction, even as we were developing more sophisticated weapons,

0:29:11.160 --> 0:29:15.160
<v Speaker 2>more sophisticated targeting, more sophisticated intelligence, in the same way. So

0:29:15.360 --> 0:29:17.959
<v Speaker 2>I think there are some lessons to be learned from that,

0:29:18.680 --> 0:29:23.400
<v Speaker 2>But basically the answer is that we have to essentially

0:29:23.440 --> 0:29:26.440
<v Speaker 2>feel our way across the river on the stones, and

0:29:26.480 --> 0:29:29.560
<v Speaker 2>there isn't like a map or a guidebook that will

0:29:29.600 --> 0:29:32.520
<v Speaker 2>tell you how to strike this balance. That is a

0:29:32.560 --> 0:29:35.440
<v Speaker 2>matter that is more art than science, that requires steady

0:29:35.480 --> 0:29:38.880
<v Speaker 2>and determined leadership from the President and the White House,

0:29:39.200 --> 0:29:42.840
<v Speaker 2>and also requires a lot of technical expertise being brought

0:29:42.840 --> 0:29:45.880
<v Speaker 2>in house. I think the current administration's attitude is just

0:29:45.960 --> 0:29:48.680
<v Speaker 2>let it rip, and that concerns me, and I think

0:29:48.680 --> 0:29:52.960
<v Speaker 2>it puts more impetus, frankly, on the private sector to

0:29:53.040 --> 0:29:56.040
<v Speaker 2>begin thinking through how it deals with the guardrails question,

0:29:56.480 --> 0:29:59.040
<v Speaker 2>because it cannot just punt the question to the White House,

0:29:59.040 --> 0:30:00.480
<v Speaker 2>since the White House has kind of said, we're not

0:30:00.480 --> 0:30:01.120
<v Speaker 2>going to deal with that.

0:30:00.960 --> 0:30:05.240
<v Speaker 1>Pivoting away from AGI but towards something which

0:30:05.320 --> 0:30:10.280
<v Speaker 1>is frankly no less terrifying, autonomous weapons systems. A few

0:30:10.360 --> 0:30:14.719
<v Speaker 1>days before we recorded this interview, Russia experienced, you know,

0:30:14.720 --> 0:30:19.080
<v Speaker 1>what some are calling its Pearl Harbor, a fleet of

0:30:19.200 --> 0:30:24.720
<v Speaker 1>Ukrainian drones far away inside Russian borders knocking out some

0:30:24.800 --> 0:30:28.200
<v Speaker 1>of their nuclear aircraft. Were you surprised by this and

0:30:28.240 --> 0:30:35.240
<v Speaker 1>how did you contend with the theory of swarming autonomous

0:30:35.240 --> 0:30:38.680
<v Speaker 1>weapons becoming a reality within your term?

0:30:39.480 --> 0:30:43.440
<v Speaker 2>Well, to take the second question first, I mean there's

0:30:43.480 --> 0:30:46.560
<v Speaker 2>the nation-state challenge and then there's the non-state

0:30:46.640 --> 0:30:49.600
<v Speaker 2>actor challenge. On the nation-state challenge, you know, the

0:30:49.640 --> 0:30:54.400
<v Speaker 2>Pentagon has quite sophisticated, quite adaptable planning processes for new

0:30:54.440 --> 0:30:58.520
<v Speaker 2>weapons systems, new types of technology, and new tactics coming on,

0:30:59.160 --> 0:31:02.080
<v Speaker 2>and they do a tremendous amount of war gaming and

0:31:02.160 --> 0:31:04.640
<v Speaker 2>testing around that. One of the things that I was

0:31:04.720 --> 0:31:09.160
<v Speaker 2>very focused on as National Security Advisor was ensuring that

0:31:09.480 --> 0:31:13.600
<v Speaker 2>our entire national security enterprise, the Defense Department, the intelligence community,

0:31:13.840 --> 0:31:16.560
<v Speaker 2>and then on the financial sanctions side, the Treasury Department

0:31:16.640 --> 0:31:19.000
<v Speaker 2>or the Export Control side of the Commerce Department was

0:31:19.080 --> 0:31:24.040
<v Speaker 2>thinking about applications of both offense and defense when it

0:31:24.080 --> 0:31:27.920
<v Speaker 2>comes to AI. So towards the end of twenty twenty four,

0:31:28.000 --> 0:31:34.000
<v Speaker 2>President Biden issued a National Security Memorandum on artificial intelligence.

0:31:34.520 --> 0:31:37.200
<v Speaker 2>Part of that was around how do we ensure we're

0:31:37.240 --> 0:31:43.320
<v Speaker 2>capable of defending against this type of thing, a drone swarm,

0:31:43.480 --> 0:31:48.000
<v Speaker 2>AI enabled autonomous weapons of various kinds, and that worries

0:31:48.040 --> 0:31:50.360
<v Speaker 2>me greatly. But I feel like, okay, we have a

0:31:50.440 --> 0:31:53.080
<v Speaker 2>lane for that. We can work to be in a

0:31:53.120 --> 0:31:57.240
<v Speaker 2>position to deter or defend against that or counteract that.

0:31:57.920 --> 0:31:59.600
<v Speaker 2>The thing that keeps me up at night is that

0:31:59.640 --> 0:32:05.960
<v Speaker 2>democratization of lethal technology. It's the extent to which an individual,

0:32:06.000 --> 0:32:11.040
<v Speaker 2>a group, a highly motivated organization, whether criminal or terrorist

0:32:11.120 --> 0:32:15.320
<v Speaker 2>or otherwise, can get its hands on highly precise and

0:32:15.400 --> 0:32:19.880
<v Speaker 2>capable and lethal systems that they can pinpoint target very

0:32:19.920 --> 0:32:24.360
<v Speaker 2>far away. And what the Ukraine case shows us is,

0:32:24.440 --> 0:32:28.840
<v Speaker 2>with some degree of resourcefulness and inventiveness, very far away

0:32:28.880 --> 0:32:32.320
<v Speaker 2>can really mean very far away from one's borders, including

0:32:32.360 --> 0:32:36.800
<v Speaker 2>in quite protected parts of a big country like Russia.

0:32:37.080 --> 0:32:39.200
<v Speaker 2>That's a pretty scary thought, even for the homeland of

0:32:39.200 --> 0:32:42.520
<v Speaker 2>the United States and that's something that if I were

0:32:42.560 --> 0:32:47.160
<v Speaker 2>sitting as National Security Advisor today, I'd be diving deep

0:32:47.280 --> 0:32:50.640
<v Speaker 2>into the Ukraine case to say, what exactly does this

0:32:50.760 --> 0:32:54.320
<v Speaker 2>tell us about the vulnerability and risk here in the

0:32:54.400 --> 0:32:57.760
<v Speaker 2>United States. Was I surprised by what the Ukrainians did? Yes

0:32:57.760 --> 0:33:02.480
<v Speaker 2>and no. No, I'm not surprised that they're constantly adaptive

0:33:02.520 --> 0:33:05.240
<v Speaker 2>and capable, and frankly, I'm quite proud of the role

0:33:05.280 --> 0:33:08.720
<v Speaker 2>that we played in helping to fund and seed their

0:33:09.160 --> 0:33:11.680
<v Speaker 2>drone program over the years. But they deserve the credit

0:33:11.840 --> 0:33:14.800
<v Speaker 2>for really taking that drone program to an incredible level

0:33:15.040 --> 0:33:17.600
<v Speaker 2>and then coming up with an operation as sophisticated and

0:33:17.640 --> 0:33:21.719
<v Speaker 2>complex as this. So yes, like the actual operation was

0:33:21.960 --> 0:33:25.320
<v Speaker 2>surprising in a kind of, you know, holy cow, I'm

0:33:25.320 --> 0:33:28.440
<v Speaker 2>really impressed by that way. But the fact that they

0:33:28.480 --> 0:33:31.800
<v Speaker 2>are doing this kind of thing is just a credit

0:33:32.000 --> 0:33:34.160
<v Speaker 2>to their bravery and skill.

0:33:35.200 --> 0:33:44.480
<v Speaker 1>In Gaza, there are reports of autonomous weapons being used by Israel making

0:33:44.520 --> 0:33:52.800
<v Speaker 1>their own kill decisions, including kill decisions that include civilian deaths.

0:33:54.320 --> 0:33:56.680
<v Speaker 1>Do you know that to be true?

0:33:57.600 --> 0:34:02.440
<v Speaker 2>I do not know that to be true. The Israeli Defense Forces indicated

0:34:02.520 --> 0:34:06.240
<v Speaker 2>in conversations with their counterparts in the Pentagon and the

0:34:06.240 --> 0:34:10.840
<v Speaker 2>State Department that they have a human in the decision

0:34:10.840 --> 0:34:15.280
<v Speaker 2>making loop when strikes are taken. So I can't speak

0:34:15.320 --> 0:34:18.160
<v Speaker 2>to the reports, which I've also seen, and I don't

0:34:18.200 --> 0:34:22.880
<v Speaker 2>know about every individual case, but that's the communication that

0:34:22.880 --> 0:34:26.120
<v Speaker 2>occurred between the US and the IDF.

0:34:26.120 --> 0:34:27.399
<v Speaker 1>When I talk to people around the world, I mean, there are two

0:34:27.520 --> 0:34:30.879
<v Speaker 1>things that come up. One is the withdrawal from Afghanistan,

0:34:31.800 --> 0:34:36.080
<v Speaker 1>and the other is the failure to constrain Israel's response

0:34:37.160 --> 0:34:40.200
<v Speaker 1>to the Hamas attacks on October seventh in terms of

0:34:40.239 --> 0:34:46.560
<v Speaker 1>civilian casualties in Gaza. How much did that frustrate your

0:34:46.560 --> 0:34:53.879
<v Speaker 1>ability to bring together a coalition around what I think your

0:34:53.880 --> 0:35:01.200
<v Speaker 1>mission was and is, which is that democratic values infuse

0:35:02.239 --> 0:35:03.040
<v Speaker 1>tech leadership?

0:35:04.120 --> 0:35:07.400
<v Speaker 2>Well, I think about these cases quite differently, you know,

0:35:07.600 --> 0:35:10.279
<v Speaker 2>sitting here today in twenty twenty five, I think the

0:35:10.360 --> 0:35:12.760
<v Speaker 2>United States is much better off that we're not entering

0:35:12.760 --> 0:35:15.480
<v Speaker 2>our twenty fifth year or more in Afghanistan sending American

0:35:15.480 --> 0:35:16.960
<v Speaker 2>men and women to fight and die there. So I

0:35:16.960 --> 0:35:20.560
<v Speaker 2>think President Biden got the big call right, and the

0:35:20.600 --> 0:35:25.640
<v Speaker 2>withdrawal itself was challenging and it was tragic. But when

0:35:25.680 --> 0:35:27.960
<v Speaker 2>you end a war after twenty years, with all of

0:35:28.000 --> 0:35:30.320
<v Speaker 2>the decisions and the pathologies that have piled up, it's

0:35:30.320 --> 0:35:31.840
<v Speaker 2>not going to be easy. It was never going to

0:35:31.880 --> 0:35:34.919
<v Speaker 2>be easy. And anyone who would suggest otherwise, I think,

0:35:35.760 --> 0:35:39.680
<v Speaker 2>you know, does not have credibility. So I

0:35:40.280 --> 0:35:43.880
<v Speaker 2>think that, you know, I would do things differently. And

0:35:43.960 --> 0:35:46.640
<v Speaker 2>I've said this repeatedly publicly in terms of that actual

0:35:46.719 --> 0:35:48.600
<v Speaker 2>draw down, and we learned a lot of lessons from it.

0:35:48.800 --> 0:35:52.600
<v Speaker 2>But fundamentally it is a good thing the United States

0:35:52.640 --> 0:35:55.920
<v Speaker 2>is not in Afghanistan now. When it comes to Gaza,

0:35:55.960 --> 0:36:01.239
<v Speaker 2>it's just an absolute damn tragedy in every respect. Just

0:36:01.320 --> 0:36:06.120
<v Speaker 2>the gut wrenching images to this day, innocent people dying,

0:36:06.600 --> 0:36:11.319
<v Speaker 2>innocent people going without food, the tragedy going back to

0:36:11.360 --> 0:36:14.680
<v Speaker 2>October seventh itself, and the largest massacre of

0:36:14.719 --> 0:36:18.440
<v Speaker 2>Jews since the Holocaust, the tragedy of the hostages and

0:36:18.480 --> 0:36:24.400
<v Speaker 2>their families. All of these are just awful. And I

0:36:24.440 --> 0:36:28.960
<v Speaker 2>spent from October seventh of twenty twenty three to January twentieth

0:36:29.000 --> 0:36:32.279
<v Speaker 2>of twenty twenty five living with the burden that we

0:36:33.000 --> 0:36:36.239
<v Speaker 2>couldn't stop it until the very end when we had

0:36:36.239 --> 0:36:38.920
<v Speaker 2>a cease fire and hostage deal in place. As President

0:36:38.960 --> 0:36:42.840
<v Speaker 2>Biden left office. And I wrestle with that. I wrestle

0:36:42.880 --> 0:36:43.680
<v Speaker 2>with that every day.

0:36:44.080 --> 0:36:46.520
<v Speaker 1>I wanted to ask you, was there more the administration could

0:36:46.560 --> 0:36:50.000
<v Speaker 1>have done to prevent some of that carnage and tragedy?

0:36:50.719 --> 0:36:53.120
<v Speaker 2>Look, that's a question I will keep asking myself and

0:36:53.480 --> 0:36:58.080
<v Speaker 2>keep asking others. You know, the main argument people make is,

0:36:58.360 --> 0:37:00.799
<v Speaker 2>you know, just cut off weapons to Israel. And I

0:37:00.840 --> 0:37:03.399
<v Speaker 2>think one thing we were very much contending with over

0:37:03.440 --> 0:37:05.839
<v Speaker 2>the course of twenty twenty four is Israel wasn't just

0:37:05.960 --> 0:37:10.960
<v Speaker 2>facing Hamas, they were facing Hezbollah, the Syrian militias, the

0:37:10.960 --> 0:37:14.279
<v Speaker 2>Iraqi militias, the Houthis, and even Iran itself. So they

0:37:14.280 --> 0:37:16.560
<v Speaker 2>were being attacked on multiple fronts. And it's hard

0:37:16.600 --> 0:37:19.239
<v Speaker 2>to walk away from an ally in a circumstance like that.

0:37:19.440 --> 0:37:22.480
<v Speaker 2>But you know, I spent a lot of time personally,

0:37:23.320 --> 0:37:25.560
<v Speaker 2>day in day out in my office working on the

0:37:25.560 --> 0:37:29.240
<v Speaker 2>issue of humanitarian assistance and getting food and medicine into Gaza,

0:37:29.239 --> 0:37:32.000
<v Speaker 2>and we didn't get enough in. But I think we

0:37:32.080 --> 0:37:35.880
<v Speaker 2>did get a significant quantity in, not enough for what

0:37:35.920 --> 0:37:36.600
<v Speaker 2>people needed.

0:37:36.800 --> 0:37:40.000
<v Speaker 1>I mean, the real question is how much is moral

0:37:40.040 --> 0:37:46.680
<v Speaker 1>authority, the United States' moral authority, required to exercise a

0:37:46.760 --> 0:37:52.080
<v Speaker 1>kind of soft power that is critical to prevail in,

0:37:52.360 --> 0:37:55.240
<v Speaker 1>if we don't want to call it a tech war, the tech competition with China?

0:37:56.080 --> 0:37:59.080
<v Speaker 2>Look, I think that our ability to work closely with

0:37:59.480 --> 0:38:03.239
<v Speaker 2>like minded democracies who share our values, who have a

0:38:03.320 --> 0:38:07.320
<v Speaker 2>vision for the world that is positive, that is consistent

0:38:07.360 --> 0:38:10.480
<v Speaker 2>with the vision we have for ourselves as a country,

0:38:11.719 --> 0:38:15.280
<v Speaker 2>this is a critical part of the long term competition

0:38:15.360 --> 0:38:19.560
<v Speaker 2>with China. And I actually believe that one of the

0:38:19.600 --> 0:38:23.320
<v Speaker 2>things that has been painful to watch is the extent

0:38:23.360 --> 0:38:26.360
<v Speaker 2>to which President Trump, in approaching the competition with China,

0:38:26.400 --> 0:38:28.080
<v Speaker 2>has chosen basically to go to war with all of

0:38:28.080 --> 0:38:30.680
<v Speaker 2>our allies at the same time, rather than try to

0:38:30.680 --> 0:38:34.279
<v Speaker 2>work with them to try to come up with a

0:38:34.320 --> 0:38:37.640
<v Speaker 2>common and united strategy for, for lack of a better term,

0:38:37.719 --> 0:38:41.000
<v Speaker 2>the free world to prevail in this competition.

0:38:42.040 --> 0:38:46.960
<v Speaker 1>You know, obviously you've been asked recently by Politico on stage:

0:38:47.120 --> 0:38:50.359
<v Speaker 1>were you surprised by President Biden's debate performance, you know,

0:38:50.840 --> 0:38:53.080
<v Speaker 1>what did you know? You said, yes, I was surprised,

0:38:53.080 --> 0:38:55.239
<v Speaker 1>and no, he was always sharp when I interacted with him.

0:38:55.920 --> 0:38:58.160
<v Speaker 1>So I don't want to ask you to rehearse that again.

0:38:58.280 --> 0:39:00.560
<v Speaker 1>But I guess, as somebody who's been a kind of

0:39:00.600 --> 0:39:04.320
<v Speaker 1>leader in the Democratic Party for, you know, fifteen plus years,

0:39:05.239 --> 0:39:09.600
<v Speaker 1>how does the party get over this? Who knew what when?

0:39:10.200 --> 0:39:11.800
<v Speaker 1>A series of questions.

0:39:12.160 --> 0:39:16.120
<v Speaker 2>Look, obviously, I've been involved in political campaigns in the past,

0:39:17.320 --> 0:39:20.200
<v Speaker 2>so you know I have personal opinions on lots of

0:39:20.200 --> 0:39:23.640
<v Speaker 2>things about politics. But in the Biden administration, I served

0:39:23.640 --> 0:39:26.200
<v Speaker 2>as National Security Advisor and my job was to focus

0:39:26.239 --> 0:39:32.160
<v Speaker 2>on the national security issues. So I haven't weighed in

0:39:32.239 --> 0:39:36.120
<v Speaker 2>on the big political to-ing and fro-ing around all this.

0:39:36.239 --> 0:39:39.080
<v Speaker 2>All I've been able to do is relate my experience

0:39:39.600 --> 0:39:42.279
<v Speaker 2>with the president in the Oval, in the Situation Room,

0:39:43.440 --> 0:39:46.239
<v Speaker 2>and that's where I will continue to be in this

0:39:46.280 --> 0:39:49.640
<v Speaker 2>conversation, and let other people hash out where the party goes

0:39:49.640 --> 0:39:50.040
<v Speaker 2>from here.

0:39:51.080 --> 0:39:53.640
<v Speaker 1>Just to close, coming back to the kind

0:39:53.640 --> 0:39:57.920
<v Speaker 1>of widest sweep of technology and history and the role

0:39:57.960 --> 0:40:02.200
<v Speaker 1>of the US. You've talked, I think, about how tech

0:40:02.280 --> 0:40:06.200
<v Speaker 1>history has had these two waves to date, the first

0:40:06.200 --> 0:40:08.680
<v Speaker 1>one being the onset of the Internet, the hope it

0:40:08.800 --> 0:40:13.759
<v Speaker 1>carried of democratization, and the second one being how some repressive

0:40:13.800 --> 0:40:17.440
<v Speaker 1>governments were able to harness this technology for spying, for harassment,

0:40:17.520 --> 0:40:23.959
<v Speaker 1>for repression. What do you hope the third wave will

0:40:23.960 --> 0:40:27.480
<v Speaker 1>be, and how do we get there?

0:40:27.520 --> 0:40:31.560
<v Speaker 2>Well, I hope the third wave will avoid,

0:40:32.600 --> 0:40:35.600
<v Speaker 2>in my view, a certain blind optimism of the first wave,

0:40:35.640 --> 0:40:37.840
<v Speaker 2>where we thought, hey, this is so great, it's going

0:40:37.920 --> 0:40:42.440
<v Speaker 2>to mean that freedom and democracy reign everywhere, but also

0:40:42.760 --> 0:40:46.400
<v Speaker 2>avoid a sense of doom and dread from the second

0:40:46.400 --> 0:40:50.680
<v Speaker 2>wave that you know, we're all screwed. I think that

0:40:50.880 --> 0:40:55.960
<v Speaker 2>the third wave, if it goes right, will mean that

0:40:56.000 --> 0:40:59.719
<v Speaker 2>we are able to harness the most exciting opportunities of

0:40:59.719 --> 0:41:05.040
<v Speaker 2>artificial intelligence for human health, for human well being,

0:41:07.440 --> 0:41:11.120
<v Speaker 2>you know, for advancing the capacity of people to communicate

0:41:11.480 --> 0:41:14.240
<v Speaker 2>better as opposed to worse, and that we will mitigate

0:41:14.280 --> 0:41:17.759
<v Speaker 2>the downside risks. We're not going to eliminate them. We're

0:41:17.760 --> 0:41:19.239
<v Speaker 2>going to have to live with them. They're going to

0:41:19.239 --> 0:41:21.080
<v Speaker 2>be real and acute, but that we mitigate them, and

0:41:21.120 --> 0:41:24.600
<v Speaker 2>that the opportunities end up creating a circumstance where AI

0:41:24.719 --> 0:41:26.759
<v Speaker 2>is working much more for us than against us. Can

0:41:26.800 --> 0:41:28.799
<v Speaker 2>we achieve that? I don't know. I think this is

0:41:28.840 --> 0:41:31.520
<v Speaker 2>going to be a close-run thing, because I'm

0:41:31.560 --> 0:41:36.560
<v Speaker 2>definitely not a doomer, but I am certainly nervous about

0:41:36.680 --> 0:41:39.600
<v Speaker 2>whether we have the tools and the capacity and the

0:41:39.640 --> 0:41:43.640
<v Speaker 2>foresight right now to build the kind of guardrails necessary

0:41:43.719 --> 0:41:47.200
<v Speaker 2>that will both allow AI to flourish for the positive

0:41:47.719 --> 0:41:52.759
<v Speaker 2>but also you know, guard against the downside. I don't

0:41:52.800 --> 0:41:55.360
<v Speaker 2>think that we're right now on track, and I think

0:41:55.480 --> 0:41:59.040
<v Speaker 2>that if this administration isn't going to step up to

0:41:59.280 --> 0:42:01.799
<v Speaker 2>take leadership in this, it's going to require others, many

0:42:01.840 --> 0:42:05.120
<v Speaker 2>outside of government, to take leadership in this, because it

0:42:05.200 --> 0:42:09.120
<v Speaker 2>is a project that we cannot dally on given the

0:42:09.120 --> 0:42:11.640
<v Speaker 2>possibility that this is coming at us very very fast.

0:42:17.000 --> 0:42:19.120
<v Speaker 1>Jake Sullivan, thank you so much for joining Tech Stuff today.

0:42:19.320 --> 0:42:20.160
<v Speaker 2>Thanks for having me.

0:42:39.040 --> 0:42:42.560
<v Speaker 1>For Tech Stuff, I'm Oz Woloshyn. This episode was produced

0:42:42.600 --> 0:42:47.400
<v Speaker 1>by Eliza Dennis and Adriana Tapia. It was executive produced

0:42:47.400 --> 0:42:51.279
<v Speaker 1>by me, Karah Preiss, and Kate Osborne for Kaleidoscope and

0:42:51.400 --> 0:42:56.200
<v Speaker 1>Katrina Norvell for iHeart Podcasts. Jack Insley mixed this episode and

0:42:56.320 --> 0:42:59.680
<v Speaker 1>Kyle Murdoch wrote our theme song. Join us on Friday

0:42:59.680 --> 0:43:01.879
<v Speaker 1>for The Week in Tech, when we'll run through the tech

0:43:01.880 --> 0:43:05.480
<v Speaker 1>headlines you may have missed. And please rate, review, and

0:43:05.640 --> 0:43:09.040
<v Speaker 1>reach out to us at tech Stuff podcast at gmail

0:43:09.080 --> 0:43:16.400
<v Speaker 1>dot com.