1 00:00:04,760 --> 00:00:07,760 Speaker 1: On this episode of Newt's World. On Wednesday, July 2 00:00:07,880 --> 00:00:12,520 Speaker 1: twenty third, President Trump released Winning the Race: America's AI 3 00:00:12,640 --> 00:00:17,960 Speaker 1: Action Plan, which includes three pillars: one, accelerate AI innovation; 4 00:00:18,400 --> 00:00:23,400 Speaker 1: two, build American AI infrastructure; and three, lead in international 5 00:00:23,400 --> 00:00:27,360 Speaker 1: AI diplomacy and security. The United States is in a race 6 00:00:27,400 --> 00:00:31,720 Speaker 1: to achieve global dominance in artificial intelligence. The country that 7 00:00:31,800 --> 00:00:35,160 Speaker 1: leads with the largest AI ecosystem will set global AI 8 00:00:35,280 --> 00:00:39,800 Speaker 1: standards and realize broad economic and military benefits. We're on 9 00:00:39,800 --> 00:00:43,440 Speaker 1: the verge of a new industrial revolution, an information revolution, 10 00:00:43,800 --> 00:00:47,080 Speaker 1: and a renaissance. Here to talk about the AI Action Plan, 11 00:00:47,560 --> 00:00:50,760 Speaker 1: I'm really pleased to welcome back my guest, Neil Chilson. 12 00:00:51,120 --> 00:00:53,920 Speaker 1: He is the former chief technologist for the Federal Trade Commission 13 00:00:54,120 --> 00:01:06,760 Speaker 1: and current head of AI policy at the Abundance Institute. Neil, welcome, 14 00:01:06,800 --> 00:01:09,080 Speaker 1: and thank you for joining me once again on Newt's World. 15 00:01:09,480 --> 00:01:10,280 Speaker 2: Thanks for having me. 16 00:01:11,000 --> 00:01:13,960 Speaker 1: You know, one of President Trump's first actions in January 17 00:01:14,360 --> 00:01:17,840 Speaker 1: was to revoke President Biden's Executive Order on AI and 18 00:01:17,959 --> 00:01:22,399 Speaker 1: sign Executive Order fourteen one seventy nine, Removing Barriers to 19 00:01:22,440 --> 00:01:26,560 Speaker 1: American Leadership in Artificial Intelligence, directing the creation of an 20 00:01:26,560 --> 00:01:30,279 Speaker 1: AI action plan. From your standpoint, how does the Trump 21 00:01:30,319 --> 00:01:34,200 Speaker 1: administration's philosophy on AI differ from the Biden approach? 22 00:01:35,280 --> 00:01:37,440 Speaker 2: You know, it's one hundred and eighty degrees different. The 23 00:01:37,480 --> 00:01:40,680 Speaker 2: Biden Executive Order, which was the third longest in history, 24 00:01:41,080 --> 00:01:44,480 Speaker 2: laid out a long series of things that the administration 25 00:01:44,680 --> 00:01:48,520 Speaker 2: was concerned about with AI. It framed AI as a 26 00:01:48,640 --> 00:01:54,240 Speaker 2: risk primarily, and tried to deploy the entire federal government 27 00:01:54,320 --> 00:02:00,000 Speaker 2: to mitigate that risk. The Trump administration repealed that Executive Order, 28 00:02:00,440 --> 00:02:02,840 Speaker 2: and this Action Plan lays out their plan, and it 29 00:02:02,880 --> 00:02:06,240 Speaker 2: takes a very different framework. It looks at AI as 30 00:02:06,280 --> 00:02:08,720 Speaker 2: a giant opportunity, as you framed it up. It's an 31 00:02:09,240 --> 00:02:13,359 Speaker 2: industrial and information revolution wrapped in with a renaissance, and it 32 00:02:13,880 --> 00:02:17,120 Speaker 2: has huge potential benefits for the United States if we 33 00:02:17,240 --> 00:02:19,200 Speaker 2: seize it and if we win the race.
And so 34 00:02:19,639 --> 00:02:23,200 Speaker 2: the entire Action Plan is under that frame, and the 35 00:02:23,200 --> 00:02:28,600 Speaker 2: policies that it suggests are largely commercially driven, with some 36 00:02:28,960 --> 00:02:32,959 Speaker 2: boosters from the government, including some deregulatory efforts that sort 37 00:02:32,960 --> 00:02:35,200 Speaker 2: of form the backbone of several of the sections, some 38 00:02:35,360 --> 00:02:37,760 Speaker 2: of the pillars that you talked about. So it really 39 00:02:37,840 --> 00:02:41,520 Speaker 2: is very different both in approach and policy from the 40 00:02:41,520 --> 00:02:42,480 Speaker 2: Biden Administration. 41 00:02:42,840 --> 00:02:45,160 Speaker 1: Well, you know, back in January, the Trump White House 42 00:02:45,160 --> 00:02:49,280 Speaker 1: announced the creation of Stargate, a public private partnership with 43 00:02:49,480 --> 00:02:53,920 Speaker 1: five hundred billion dollars in projected investment. In your judgment, 44 00:02:53,960 --> 00:02:57,840 Speaker 1: how significant is this for US AI competitiveness? 45 00:02:58,400 --> 00:03:02,800 Speaker 2: To compete globally on AI, we need a lot of infrastructure. 46 00:03:02,840 --> 00:03:05,760 Speaker 2: That's energy infrastructure, but it's also data centers, these big 47 00:03:05,760 --> 00:03:10,960 Speaker 2: computer warehouses that do all the actual work for the 48 00:03:11,040 --> 00:03:14,480 Speaker 2: AI algorithms, and so I think those types of investments 49 00:03:14,520 --> 00:03:17,760 Speaker 2: are super important. They're critical. There was another one that 50 00:03:17,800 --> 00:03:22,480 Speaker 2: Trump just announced, along with some Pennsylvania folks, last 51 00:03:22,520 --> 00:03:24,919 Speaker 2: week or maybe the week before, that was around ninety 52 00:03:24,960 --> 00:03:28,600 Speaker 2: billion dollars in investment. And so these types of large investments, 53 00:03:28,720 --> 00:03:32,120 Speaker 2: primarily driven by the private sector, are really important to 54 00:03:32,919 --> 00:03:36,000 Speaker 2: continued US leadership in artificial intelligence. 55 00:03:36,600 --> 00:03:40,880 Speaker 1: AI actually relies on enormous amounts of computing, and enormous 56 00:03:40,880 --> 00:03:43,840 Speaker 1: amounts of computing really rely on a tremendous amount of electricity. 57 00:03:44,480 --> 00:03:48,200 Speaker 1: There's a whole infrastructure behind whatever happens when we go 58 00:03:48,240 --> 00:03:50,320 Speaker 1: online and interact with AI. 59 00:03:51,720 --> 00:03:54,600 Speaker 2: That's why one of the three pillars in the AI 60 00:03:54,680 --> 00:03:57,960 Speaker 2: Action Plan looks at clearing away some of the regulatory 61 00:03:58,000 --> 00:04:00,520 Speaker 2: barriers that make it so hard to build new energy 62 00:04:01,160 --> 00:04:03,360 Speaker 2: and connect it to the grid. In the United States, 63 00:04:03,400 --> 00:04:06,360 Speaker 2: we have a real challenge here as far as supply 64 00:04:06,480 --> 00:04:08,880 Speaker 2: on energy. We've sort of flattened out since the seventies. 65 00:04:08,960 --> 00:04:12,600 Speaker 2: We haven't built a lot of new capacity. That's a problem, 66 00:04:12,640 --> 00:04:16,800 Speaker 2: given that it would be great for Americans who are 67 00:04:16,839 --> 00:04:19,920 Speaker 2: using all sorts of services if energy was too cheap 68 00:04:19,960 --> 00:04:23,000 Speaker 2: to meter. We have the technological ability to do that.
69 00:04:23,360 --> 00:04:25,800 Speaker 2: Lots of times regulatory barriers get in the way. 70 00:04:25,839 --> 00:04:28,279 Speaker 1: I have a friend who's very deeply into modular nuclear 71 00:04:28,839 --> 00:04:31,719 Speaker 1: and just signed a contract I think with Amazon to 72 00:04:31,760 --> 00:04:34,800 Speaker 1: put a modular nuclear power plant next to every one 73 00:04:34,839 --> 00:04:36,520 Speaker 1: of their computer centers. 74 00:04:37,480 --> 00:04:38,240 Speaker 2: That's a great idea. 75 00:04:38,440 --> 00:04:40,640 Speaker 1: I love it. Before we go much deeper, I 76 00:04:40,640 --> 00:04:43,799 Speaker 1: was going to ask about the recommendations that the Abundance 77 00:04:43,800 --> 00:04:46,120 Speaker 1: Institute has issued. But first I think it'd be helpful 78 00:04:46,160 --> 00:04:48,360 Speaker 1: for most people if you tell us just for a minute about 79 00:04:48,360 --> 00:04:49,640 Speaker 1: the Abundance Institute. 80 00:04:50,080 --> 00:04:54,640 Speaker 2: So, the Abundance Institute is a mission-driven nonprofit that's 81 00:04:54,760 --> 00:04:59,760 Speaker 2: very focused on emerging technologies. We're unabashed optimists about the 82 00:05:00,000 --> 00:05:04,120 Speaker 2: ability of technology to drive widespread human prosperity. We see 83 00:05:04,160 --> 00:05:06,839 Speaker 2: from the lessons of history that that has been what 84 00:05:07,000 --> 00:05:09,680 Speaker 2: technology has done in the past. But it doesn't just 85 00:05:09,720 --> 00:05:12,480 Speaker 2: happen by itself. It takes two things. It takes a 86 00:05:12,520 --> 00:05:15,719 Speaker 2: culture that embraces innovation rather than fears it, and it 87 00:05:15,720 --> 00:05:19,160 Speaker 2: takes a regulatory environment that enables it. And so we 88 00:05:19,320 --> 00:05:23,279 Speaker 2: are dedicated to fostering both of those situations. I spend 89 00:05:23,279 --> 00:05:25,440 Speaker 2: a lot of my time on the policy side and 90 00:05:25,520 --> 00:05:27,560 Speaker 2: a lot of focus on AI, but we do a 91 00:05:27,600 --> 00:05:30,640 Speaker 2: lot of work in energy, including small modular reactors like 92 00:05:30,680 --> 00:05:34,520 Speaker 2: you mentioned and the policy environment for that, and we're looking 93 00:05:34,560 --> 00:05:37,400 Speaker 2: at other technologies that are coming down the pike that 94 00:05:37,520 --> 00:05:42,080 Speaker 2: will need both policy and cultural support to really reach 95 00:05:42,160 --> 00:05:44,800 Speaker 2: the full impact that they could have in bringing benefits to people. 96 00:05:45,560 --> 00:05:48,719 Speaker 1: In March, your institute issued a comment on a request for 97 00:05:48,760 --> 00:05:52,640 Speaker 1: information on the development of an Artificial Intelligence Action Plan, 98 00:05:53,440 --> 00:05:56,480 Speaker 1: and there's something like a thousand AI-related bills under 99 00:05:56,520 --> 00:05:59,800 Speaker 1: consideration at the state level. What's your sense about 100 00:06:00,160 --> 00:06:04,680 Speaker 1: to what degree we can allow states to micro-regulate, and to 101 00:06:04,680 --> 00:06:07,840 Speaker 1: what degree this needs to become a federal preemption? 102 00:06:08,960 --> 00:06:11,039 Speaker 2: There is a real challenge here because AI is a 103 00:06:11,160 --> 00:06:15,840 Speaker 2: general purpose technology and defining it is quite difficult.
It's 104 00:06:15,880 --> 00:06:19,160 Speaker 2: a moving target, and so I often say to state regulators, 105 00:06:19,720 --> 00:06:21,920 Speaker 2: what if you just replaced what you're talking about as 106 00:06:22,000 --> 00:06:25,560 Speaker 2: AI with advanced computing, and then asked whether or not 107 00:06:25,640 --> 00:06:29,360 Speaker 2: your legislation makes sense? And I think the challenge is 108 00:06:30,040 --> 00:06:33,400 Speaker 2: so much of this flood of state regulation has a 109 00:06:33,480 --> 00:06:36,960 Speaker 2: reach that goes beyond the borders of the state. No doubt, 110 00:06:36,960 --> 00:06:40,159 Speaker 2: there are plenty of applications of artificial intelligence in things 111 00:06:40,240 --> 00:06:44,520 Speaker 2: like insurance where the states have a traditional role of 112 00:06:45,640 --> 00:06:48,880 Speaker 2: governing those types of uses and should continue to. That's 113 00:06:48,920 --> 00:06:53,159 Speaker 2: part of our constitutional federalist system. But when states start 114 00:06:53,200 --> 00:06:57,680 Speaker 2: passing laws where the effects are very broad, way beyond 115 00:06:57,680 --> 00:07:00,680 Speaker 2: their borders, in fact, could be global, then I think 116 00:07:00,760 --> 00:07:03,320 Speaker 2: it's incumbent on the US federal government to step in 117 00:07:03,360 --> 00:07:05,240 Speaker 2: and say, hey, actually, some of this is our role, 118 00:07:05,360 --> 00:07:08,680 Speaker 2: right? And we've seen some examples from states like New 119 00:07:08,760 --> 00:07:12,640 Speaker 2: York, where sponsors of bills that take a sort of comprehensive, 120 00:07:13,280 --> 00:07:17,800 Speaker 2: broad-scope regulatory approach to AI have 121 00:07:17,840 --> 00:07:20,920 Speaker 2: stated that they want to have a national 122 00:07:21,080 --> 00:07:24,800 Speaker 2: or a global effect, and that really is something that 123 00:07:24,840 --> 00:07:29,040 Speaker 2: goes beyond the framework of the Constitution, and we're talking 124 00:07:29,040 --> 00:07:31,720 Speaker 2: about things that really should be governed at the federal 125 00:07:31,800 --> 00:07:35,440 Speaker 2: level. This AI Action Plan recognizes that challenge and 126 00:07:35,480 --> 00:07:37,480 Speaker 2: that problem, and I think it starts to move in 127 00:07:37,520 --> 00:07:39,800 Speaker 2: the right direction to say, like, hey, actually there's a 128 00:07:39,800 --> 00:07:41,960 Speaker 2: strong federal role here. 129 00:07:42,000 --> 00:07:45,240 Speaker 1: Well, and I think that the founding fathers' intent was that 130 00:07:45,240 --> 00:07:50,240 Speaker 1: there would be certain interstate commerce things that have to 131 00:07:50,280 --> 00:07:54,800 Speaker 1: be dealt with nationally. As the former Federal Trade Commission 132 00:07:54,840 --> 00:07:58,600 Speaker 1: chief technologist, you had a front row seat to Trump's twenty 133 00:07:58,720 --> 00:08:04,720 Speaker 1: nineteen and twenty twenty executive orders on artificial intelligence. I'm really 134 00:08:04,760 --> 00:08:08,800 Speaker 1: curious how does the first Trump administration compare with what 135 00:08:08,840 --> 00:08:11,000 Speaker 1: the second Trump administration is now doing. 136 00:08:11,600 --> 00:08:15,680 Speaker 2: There are some big differences. Primarily, I think the first 137 00:08:15,720 --> 00:08:19,400 Speaker 2: Trump administration took longer to get going, right?
There wasn't 138 00:08:19,440 --> 00:08:21,600 Speaker 2: the team on the bench ready to go. And so 139 00:08:22,280 --> 00:08:24,440 Speaker 2: I think that one thing we're seeing here is that 140 00:08:24,680 --> 00:08:28,160 Speaker 2: this AI Action Plan is a great example of this 141 00:08:28,960 --> 00:08:34,320 Speaker 2: second administration really having an agenda coming in, being ready 142 00:08:34,360 --> 00:08:37,960 Speaker 2: to start executing on it, putting this together based on 143 00:08:38,000 --> 00:08:40,079 Speaker 2: an EO that came out the first day that Trump 144 00:08:40,160 --> 00:08:43,160 Speaker 2: was in office, and then executing on what is 145 00:08:43,200 --> 00:08:47,240 Speaker 2: a quite comprehensive set of ideas in such a short time, 146 00:08:47,280 --> 00:08:50,320 Speaker 2: I think, just demonstrates the difference in the level of 147 00:08:50,400 --> 00:08:53,400 Speaker 2: preparedness to hit the ground running that this administration has 148 00:08:53,559 --> 00:08:54,600 Speaker 2: compared to the first term. 149 00:09:10,320 --> 00:09:16,160 Speaker 1: Explain for average citizens why it is so important for 150 00:09:16,240 --> 00:09:20,160 Speaker 1: the US to have global dominance in artificial intelligence. 151 00:09:21,320 --> 00:09:25,240 Speaker 2: So, as a general purpose technology, artificial intelligence is important 152 00:09:25,280 --> 00:09:29,760 Speaker 2: because intelligence is important. If you can put more intellectual 153 00:09:29,800 --> 00:09:33,120 Speaker 2: horsepower behind solving problems, and you can offload some of 154 00:09:33,120 --> 00:09:35,920 Speaker 2: that to computers, we can really expand the number of 155 00:09:35,960 --> 00:09:43,719 Speaker 2: problems that we solve. And this means in spaces like healthcare, transportation, entertainment, defense. 156 00:09:44,480 --> 00:09:47,440 Speaker 2: And so part of the reason that this is important 157 00:09:47,520 --> 00:09:50,120 Speaker 2: is that the US right now leads in this space. 158 00:09:50,240 --> 00:09:55,480 Speaker 2: We have major competitors, though; in particular, China is developing 159 00:09:55,520 --> 00:09:58,680 Speaker 2: their own tools to solve these same types of problems. 160 00:09:59,120 --> 00:10:02,040 Speaker 2: And this is particularly concerning, I think, in the defense space, 161 00:10:02,520 --> 00:10:08,160 Speaker 2: but also just economically. The US will benefit if the 162 00:10:08,200 --> 00:10:13,120 Speaker 2: majority of the world adopts our technology and our infrastructure, 163 00:10:13,559 --> 00:10:16,480 Speaker 2: our AI stack, as they would call it, to build 164 00:10:17,040 --> 00:10:20,600 Speaker 2: the problem-solving tools of the future. And so I 165 00:10:20,640 --> 00:10:22,800 Speaker 2: think that's why it's really important for the US to 166 00:10:22,840 --> 00:10:25,920 Speaker 2: stay ahead here.
There is also a cultural component, which 167 00:10:25,960 --> 00:10:30,760 Speaker 2: is that artificial intelligence models capture a lot of the 168 00:10:30,840 --> 00:10:34,440 Speaker 2: ideas of a country, and so I think just from 169 00:10:34,480 --> 00:10:38,960 Speaker 2: that aspect, we would much prefer that the US ideas 170 00:10:39,040 --> 00:10:43,920 Speaker 2: of liberty, of capitalism, of entrepreneurship are those that fuel 171 00:10:44,840 --> 00:10:47,760 Speaker 2: a lot of the tools that other people use around 172 00:10:47,760 --> 00:10:52,520 Speaker 2: the world, rather than Chinese Communist Party-inflected AI models. 173 00:10:52,840 --> 00:10:56,199 Speaker 1: It's been fascinating to me that the United States has 174 00:10:56,280 --> 00:11:01,679 Speaker 1: really emphasized innovation and the Europeans have really emphasized regulation. 175 00:11:02,679 --> 00:11:06,320 Speaker 1: It's really a fascinating case study in two cultural approaches. 176 00:11:06,320 --> 00:11:08,679 Speaker 1: And as you know, the European Union now has a 177 00:11:08,720 --> 00:11:12,280 Speaker 1: Code of Practice on Artificial Intelligence, I guess, under what's 178 00:11:12,320 --> 00:11:16,000 Speaker 1: called the AI Act, which would put in place a lot of 179 00:11:16,040 --> 00:11:20,000 Speaker 1: different controls, if you will. What's your sense of the 180 00:11:20,000 --> 00:11:20,880 Speaker 1: European approach? 181 00:11:21,600 --> 00:11:24,720 Speaker 2: So it's fascinating. The Europeans were really far along the road. 182 00:11:24,760 --> 00:11:27,680 Speaker 2: They were so advanced in AI regulation that they were 183 00:11:27,679 --> 00:11:34,000 Speaker 2: writing a very comprehensive EU AI Act before ChatGPT came out, 184 00:11:34,480 --> 00:11:37,600 Speaker 2: and in fact, when ChatGPT came out and it 185 00:11:37,720 --> 00:11:39,920 Speaker 2: was very different than the way that they had been 186 00:11:39,960 --> 00:11:41,960 Speaker 2: thinking about AI, they had to do, like, sort of 187 00:11:42,000 --> 00:11:45,839 Speaker 2: emergency revisions to their EU AI Act, so it was almost 188 00:11:45,880 --> 00:11:48,560 Speaker 2: out of date even before it passed. And so this 189 00:11:48,640 --> 00:11:52,040 Speaker 2: code of practice is a sort of voluntary, I'm air 190 00:11:52,080 --> 00:11:54,680 Speaker 2: quoting that for your listeners, it's a sort of voluntary 191 00:11:54,720 --> 00:11:59,920 Speaker 2: set of commitments, and if companies sign on to them, 192 00:12:00,000 --> 00:12:02,280 Speaker 2: the European Union will say, okay, we're going to 193 00:12:02,360 --> 00:12:05,760 Speaker 2: assume you're complying if you follow all of these particular conditions. 194 00:12:05,800 --> 00:12:08,880 Speaker 2: But as you point out, those conditions are extremely detailed, 195 00:12:09,040 --> 00:12:12,240 Speaker 2: they're extremely onerous, and some of the companies have refused. 196 00:12:12,280 --> 00:12:14,360 Speaker 2: Some of the US companies have refused to sign on, 197 00:12:15,440 --> 00:12:18,240 Speaker 2: probably assuming that the European Union is going to sue 198 00:12:18,240 --> 00:12:20,840 Speaker 2: them no matter what. And that's probably a good assumption 199 00:12:21,000 --> 00:12:24,840 Speaker 2: in this space.
And so that approach of regulate first 200 00:12:25,480 --> 00:12:29,400 Speaker 2: and maybe hope to carve out some innovation later has 201 00:12:29,440 --> 00:12:32,120 Speaker 2: been the European approach, and it's failed pretty badly, and 202 00:12:32,120 --> 00:12:35,160 Speaker 2: in fact, I think there is some recognition within Europe 203 00:12:35,240 --> 00:12:38,160 Speaker 2: that this is a problem. There has been some pushback. 204 00:12:38,400 --> 00:12:42,800 Speaker 2: Some of the major European companies, like Airbus, have signed 205 00:12:42,840 --> 00:12:46,400 Speaker 2: a letter to the European Union saying that the EU AI 206 00:12:46,440 --> 00:12:50,640 Speaker 2: Act implementation needs to be delayed because nobody can figure 207 00:12:50,640 --> 00:12:53,439 Speaker 2: out how to comply with it and still build useful products. 208 00:12:53,679 --> 00:12:57,800 Speaker 2: And some of the countries that have AI companies: France 209 00:12:58,000 --> 00:13:00,640 Speaker 2: has a company called Mistral, which does quite a lot 210 00:13:00,679 --> 00:13:03,839 Speaker 2: of innovative AI work. France has been pushing back pretty 211 00:13:03,840 --> 00:13:05,599 Speaker 2: hard on the EU AI Act as well. So you know 212 00:13:05,640 --> 00:13:07,400 Speaker 2: it's bad when France is pushing back on it. 213 00:13:07,640 --> 00:13:10,760 Speaker 1: That's right. And I understand that Meta has basically said 214 00:13:10,760 --> 00:13:12,400 Speaker 1: they're not going to touch it. 215 00:13:12,440 --> 00:13:14,240 Speaker 2: That's right. I think Meta is one of those companies that has said, 216 00:13:14,280 --> 00:13:16,680 Speaker 2: the European Union is going to sue us no matter 217 00:13:16,720 --> 00:13:18,839 Speaker 2: what we do in this space, we're not going to 218 00:13:18,960 --> 00:13:21,800 Speaker 2: do this voluntary commitment that would essentially paint the target 219 00:13:21,800 --> 00:13:25,160 Speaker 2: on our back after the European Union regulators have sort 220 00:13:25,200 --> 00:13:28,840 Speaker 2: of loaded the bow and arrow. And so Meta has 221 00:13:28,920 --> 00:13:30,000 Speaker 2: chosen not to sign on. 222 00:13:30,320 --> 00:13:33,680 Speaker 1: Could the Europeans paint themselves into a corner where literally 223 00:13:33,760 --> 00:13:36,920 Speaker 1: the really biggest companies could just decide not to be there? 224 00:13:37,800 --> 00:13:39,920 Speaker 2: I think that's a possibility. I mean, I think what 225 00:13:39,960 --> 00:13:43,800 Speaker 2: we've seen even more so is that the Chinese, especially 226 00:13:43,800 --> 00:13:47,000 Speaker 2: in the hardware space, the Chinese developers, have rushed in 227 00:13:47,120 --> 00:13:51,280 Speaker 2: to fill the gap where American companies have been sort 228 00:13:51,280 --> 00:13:56,600 Speaker 2: of pushed out by the Europeans. The Chinese companies rush 229 00:13:56,640 --> 00:13:58,520 Speaker 2: in to sort of fill that gap. Huawei did that 230 00:13:58,559 --> 00:14:01,400 Speaker 2: with 5G hardware. I think there's a real concern, 231 00:14:01,480 --> 00:14:03,240 Speaker 2: and you see this in 232 00:14:03,280 --> 00:14:07,160 Speaker 2: the AI Action Plan, that if we don't get the global market, 233 00:14:08,080 --> 00:14:13,280 Speaker 2: if we somehow keep our allies and our friends from 234 00:14:13,360 --> 00:14:17,760 Speaker 2: adopting US tech, then they'll end up adopting Chinese tech.
235 00:14:17,800 --> 00:14:21,120 Speaker 2: And so I think China sees that as the game plan, 236 00:14:21,640 --> 00:14:23,360 Speaker 2: and I think we need to do what we can 237 00:14:23,480 --> 00:14:25,520 Speaker 2: to make sure that our friends and allies can build 238 00:14:25,560 --> 00:14:28,040 Speaker 2: on the American tech stack. That's really important. 239 00:14:28,360 --> 00:14:34,240 Speaker 1: We've really seen the dramatic tightening of export controls for 240 00:14:34,400 --> 00:14:37,680 Speaker 1: chips and for other things that go into AI. Candidly, 241 00:14:37,720 --> 00:14:41,200 Speaker 1: I can't quite tell which generation of chips are now 242 00:14:41,320 --> 00:14:44,640 Speaker 1: embargoed and which chips are okay, and how big the 243 00:14:44,640 --> 00:14:47,560 Speaker 1: difference is in terms of capabilities. What's your sense of that 244 00:14:47,600 --> 00:14:52,120 Speaker 1: whole effort to control exports, particularly to China, of very, 245 00:14:52,240 --> 00:14:53,240 Speaker 1: very advanced chips? 246 00:14:54,000 --> 00:14:56,800 Speaker 2: So I think that was the initial strategy of the 247 00:14:56,800 --> 00:15:00,320 Speaker 2: Biden administration. It had a very sweeping rule called the Diffusion Rule, which 248 00:15:00,720 --> 00:15:03,080 Speaker 2: actually set a lot of export limits on a lot 249 00:15:03,120 --> 00:15:06,160 Speaker 2: of our friends and allies, countries like Israel and Portugal, 250 00:15:06,480 --> 00:15:09,560 Speaker 2: and there wasn't really a clear reason why, but that 251 00:15:09,600 --> 00:15:13,040 Speaker 2: continued into the Trump administration. I think we're seeing, especially 252 00:15:13,040 --> 00:15:18,040 Speaker 2: with the ongoing development of China's own infrastructure, that if 253 00:15:18,080 --> 00:15:20,680 Speaker 2: we don't sell them some level of chips, they will 254 00:15:20,680 --> 00:15:24,200 Speaker 2: build their own capacity, and we're just letting their own 255 00:15:24,280 --> 00:15:28,160 Speaker 2: domestic providers proceed without any competition if we don't sell 256 00:15:28,240 --> 00:15:32,080 Speaker 2: chips into there. And so the Trump administration did allow 257 00:15:32,280 --> 00:15:35,560 Speaker 2: for what I would say are about three-generations-back chips. 258 00:15:35,800 --> 00:15:39,040 Speaker 2: They're about seventeen percent of the capacity of the top 259 00:15:39,480 --> 00:15:42,240 Speaker 2: line GPUs that we have right now, and so they're 260 00:15:42,240 --> 00:15:44,840 Speaker 2: not nearly as good as the cutting-edge technology, but 261 00:15:45,080 --> 00:15:47,880 Speaker 2: they are being allowed to be sold into China, and 262 00:15:48,000 --> 00:15:50,520 Speaker 2: I think that makes a ton of sense. If we 263 00:15:50,560 --> 00:15:54,320 Speaker 2: see China as a rival, we still want them building 264 00:15:54,360 --> 00:15:57,160 Speaker 2: on the American tech stack. I think that's great because 265 00:15:57,200 --> 00:16:00,200 Speaker 2: it means that even if their models get exported elsewhere, 266 00:16:00,560 --> 00:16:03,040 Speaker 2: they'll run the best on the hardware that they were 267 00:16:03,080 --> 00:16:05,880 Speaker 2: trained on, and right now, we want that to be 268 00:16:06,000 --> 00:16:06,960 Speaker 2: the American hardware.
269 00:16:07,920 --> 00:16:12,280 Speaker 1: What would you say in this ongoing race, is it 270 00:16:12,400 --> 00:16:15,040 Speaker 1: essentially between the US and China, or are there other 271 00:16:15,120 --> 00:16:18,440 Speaker 1: countries who have significant AI capabilities? 272 00:16:19,160 --> 00:16:22,200 Speaker 2: So it is essentially right now between the US and China 273 00:16:22,760 --> 00:16:27,840 Speaker 2: on many of the dimensions. China has the largest collection 274 00:16:28,160 --> 00:16:33,240 Speaker 2: of software developers in the world, and so they are 275 00:16:33,600 --> 00:16:38,800 Speaker 2: the major competitor. There are lots of other regions that 276 00:16:38,880 --> 00:16:43,040 Speaker 2: want to get into the AI business, but they'll largely 277 00:16:43,120 --> 00:16:47,240 Speaker 2: probably build on either US or Chinese hardware and software. 278 00:16:47,280 --> 00:16:49,880 Speaker 2: And so when we look at the deals that the 279 00:16:49,880 --> 00:16:53,560 Speaker 2: Trump administration struck with some of the Middle East countries, 280 00:16:53,720 --> 00:16:56,840 Speaker 2: the UAE and some others, part of the motivation there is 281 00:16:56,920 --> 00:17:00,720 Speaker 2: these countries have a lot of capital to spend, and they 282 00:17:00,760 --> 00:17:04,280 Speaker 2: want to build large data centers and develop AI, and 283 00:17:04,400 --> 00:17:06,840 Speaker 2: if we don't make that deal with them, China will. 284 00:17:23,160 --> 00:17:27,080 Speaker 1: There's a lot of concern that somehow artificial intelligence relates 285 00:17:27,119 --> 00:17:31,439 Speaker 1: back to Schwarzenegger movies and all that stuff. Do you 286 00:17:31,480 --> 00:17:34,639 Speaker 1: see AI as a serious threat or as a tool? 287 00:17:35,119 --> 00:17:36,360 Speaker 1: How do you think about it? 288 00:17:36,880 --> 00:17:39,520 Speaker 2: I think of AI as a tool. Like I said, 289 00:17:39,520 --> 00:17:42,239 Speaker 2: I think of it as advanced computing. When I was 290 00:17:42,320 --> 00:17:45,240 Speaker 2: first getting into computers, the cutting edge of AI was 291 00:17:45,480 --> 00:17:49,000 Speaker 2: chess playing, and now everybody's phone can do that. When 292 00:17:49,040 --> 00:17:51,280 Speaker 2: I was in high school, I think the cutting edge 293 00:17:51,320 --> 00:17:56,480 Speaker 2: was optical recognition, the idea that a computer could recognize an object. 294 00:17:56,600 --> 00:18:00,440 Speaker 2: Now everybody's phone can do that. Text recognition, voice recognition, 295 00:18:00,640 --> 00:18:03,399 Speaker 2: voice generation: these were all at some point considered the 296 00:18:03,400 --> 00:18:07,040 Speaker 2: cutting edge of artificial intelligence. Now they're on everybody's phone; 297 00:18:07,040 --> 00:18:10,240 Speaker 2: nobody really calls them artificial intelligence anymore. I think that's 298 00:18:10,359 --> 00:18:13,480 Speaker 2: going to continue to be the trend. As this specific 299 00:18:13,680 --> 00:18:17,959 Speaker 2: set of chatbot tools gets integrated into lots of different services, 300 00:18:18,240 --> 00:18:21,119 Speaker 2: we'll probably stop calling it artificial intelligence and we'll just 301 00:18:21,119 --> 00:18:23,919 Speaker 2: call it computers again. There'll be something new that is 302 00:18:23,960 --> 00:18:26,240 Speaker 2: being generated. And so I really do think of it 303 00:18:26,280 --> 00:18:29,520 Speaker 2: as a tool. These are not personalities. They don't have will.
304 00:18:30,040 --> 00:18:34,640 Speaker 2: They are really sophisticated tools that can help us think 305 00:18:34,720 --> 00:18:38,280 Speaker 2: through patterns in language and music and art. But they 306 00:18:38,320 --> 00:18:41,199 Speaker 2: don't replace humans, I don't think, and I think we 307 00:18:41,280 --> 00:18:46,639 Speaker 2: can really use them to amplify our intellectual abilities, like 308 00:18:46,680 --> 00:18:49,200 Speaker 2: a steam engine. But a steam engine, you know, wasn't 309 00:18:49,200 --> 00:18:51,440 Speaker 2: a muscle, it wasn't a human, just because it could 310 00:18:51,440 --> 00:18:53,080 Speaker 2: do some of the things that a human could do. 311 00:18:53,160 --> 00:18:55,480 Speaker 2: And so I think of them very much like that, 312 00:18:55,560 --> 00:18:56,879 Speaker 2: a steam engine for the mind. 313 00:18:57,520 --> 00:18:59,600 Speaker 1: Yeah, of course, the era of the steam engine also 314 00:18:59,680 --> 00:19:02,760 Speaker 1: is the era of Frankenstein and of the whole notion of 315 00:19:02,960 --> 00:19:04,920 Speaker 1: these artificially developed monsters. 316 00:19:05,400 --> 00:19:08,840 Speaker 2: The mythology around being replaced by the things that we 317 00:19:08,920 --> 00:19:12,399 Speaker 2: build is very, very old, right, and so that fear 318 00:19:12,520 --> 00:19:15,800 Speaker 2: is somehow ingrained in humans, and I think it's just 319 00:19:15,840 --> 00:19:18,920 Speaker 2: being played out in this new context of intelligence. 320 00:19:19,240 --> 00:19:22,640 Speaker 1: The one area, when we talk about exports, etc., 321 00:19:23,280 --> 00:19:27,640 Speaker 1: it's sort of astonishing what percent of the world's advanced 322 00:19:27,680 --> 00:19:31,879 Speaker 1: chips are made on Taiwan. Doesn't that make Taiwan in a 323 00:19:31,920 --> 00:19:36,359 Speaker 1: sense strategic in ways that people don't often think about? 324 00:19:37,240 --> 00:19:40,480 Speaker 2: I think you're absolutely right. What the Taiwanese have been 325 00:19:40,520 --> 00:19:45,400 Speaker 2: able to do with building chips at scale, and very 326 00:19:45,440 --> 00:19:50,280 Speaker 2: quickly developing the types of tools to manufacture, in a 327 00:19:50,400 --> 00:19:53,800 Speaker 2: very consistent and high-yield way, the types of chips 328 00:19:53,800 --> 00:19:56,840 Speaker 2: that power so many of the devices, is pretty impressive. 329 00:19:56,880 --> 00:19:59,600 Speaker 2: It's pretty unique, frankly. I mean, it does mean that 330 00:19:59,640 --> 00:20:04,479 Speaker 2: Taiwan is strategically important to the US in the supply 331 00:20:04,600 --> 00:20:06,720 Speaker 2: chain of semiconductors, for sure. 332 00:20:07,119 --> 00:20:09,520 Speaker 1: The other thing which has always surprised me, and maybe 333 00:20:09,520 --> 00:20:14,320 Speaker 1: you can help explain it, is modern chip production requires 334 00:20:14,400 --> 00:20:17,760 Speaker 1: a massive amount of investment to build these modern chip 335 00:20:17,760 --> 00:20:23,080 Speaker 1: plants. The level of controlled detail at which they 336 00:20:23,200 --> 00:20:26,520 Speaker 1: manufacture is an astonishing achievement. 337 00:20:26,840 --> 00:20:29,680 Speaker 2: It really is.
And part of that is because they 338 00:20:29,720 --> 00:20:33,479 Speaker 2: produce at such high yields, they produce at such scale, 339 00:20:33,480 --> 00:20:35,080 Speaker 2: and that's the only way you can get the cost 340 00:20:35,160 --> 00:20:39,280 Speaker 2: of these very, very complicated devices down enough so that 341 00:20:39,560 --> 00:20:42,440 Speaker 2: normal consumers can buy them. And so your iPhone, if 342 00:20:42,440 --> 00:20:44,720 Speaker 2: we had to produce them at a smaller scale, would 343 00:20:44,800 --> 00:20:48,280 Speaker 2: cost maybe ten times what it costs right now, or 344 00:20:48,320 --> 00:20:50,560 Speaker 2: it might not even be physically possible to make if 345 00:20:50,600 --> 00:20:52,320 Speaker 2: you didn't make it at that high scale. So the 346 00:20:52,320 --> 00:20:55,040 Speaker 2: amount of capital investment that these companies had to put 347 00:20:55,080 --> 00:21:00,719 Speaker 2: in to build manufacturing capability is really high. Some policy 348 00:21:00,800 --> 00:21:03,399 Speaker 2: choices way back when made it harder for us 349 00:21:03,400 --> 00:21:06,760 Speaker 2: to maintain that capacity here. We benefited a lot from 350 00:21:06,800 --> 00:21:10,600 Speaker 2: the ability to purchase that capability from other parts of 351 00:21:10,640 --> 00:21:13,000 Speaker 2: the world. Now there's a push to bring it back, 352 00:21:13,000 --> 00:21:16,000 Speaker 2: but I think people are realizing it's not just the dollars. 353 00:21:16,040 --> 00:21:20,639 Speaker 2: There's a lot of skilled personnel and talent and know 354 00:21:20,760 --> 00:21:24,800 Speaker 2: how that is really important in being able to produce 355 00:21:24,840 --> 00:21:28,679 Speaker 2: the type of quality and scale that Taiwan has been 356 00:21:28,720 --> 00:21:29,480 Speaker 2: able to achieve. 357 00:21:29,680 --> 00:21:32,119 Speaker 1: Didn't the Taiwanese discover that when they had an 358 00:21:32,160 --> 00:21:35,720 Speaker 1: opportunity and a commitment to build a very large facility 359 00:21:35,720 --> 00:21:38,080 Speaker 1: in Arizona, they just couldn't find the workforce? 360 00:21:38,480 --> 00:21:40,000 Speaker 2: Yeah, it was very hard. In fact, I think they 361 00:21:40,040 --> 00:21:42,080 Speaker 2: had to bring a bunch of people from Taiwan. They 362 00:21:42,119 --> 00:21:44,640 Speaker 2: had to try to persuade them to move from Taiwan 363 00:21:44,680 --> 00:21:47,199 Speaker 2: to Arizona, and I think they really did 364 00:21:47,280 --> 00:21:49,120 Speaker 2: struggle with that. I think some of that has been 365 00:21:49,160 --> 00:21:51,000 Speaker 2: mitigated a little bit, you know, over time. I think 366 00:21:51,000 --> 00:21:53,000 Speaker 2: they've been able to work out some of that, but 367 00:21:53,080 --> 00:21:55,520 Speaker 2: it certainly is a barrier. You know, it's hard to 368 00:21:55,560 --> 00:21:58,879 Speaker 2: port a culture from one place to another, and bringing 369 00:21:58,920 --> 00:22:02,840 Speaker 2: the culture that enabled Taiwan to build these massive factories 370 00:22:02,880 --> 00:22:04,960 Speaker 2: to Arizona has been a little bit challenging. 371 00:22:05,600 --> 00:22:10,359 Speaker 1: Apparently it requires such an intense focus on detail that 372 00:22:10,480 --> 00:22:14,080 Speaker 1: it's not part of the traditional American patterns.
Let me 373 00:22:14,119 --> 00:22:15,600 Speaker 1: ask you this: when you think 374 00:22:15,640 --> 00:22:21,400 Speaker 1: about your institute, what would you say are the major developments 375 00:22:21,440 --> 00:22:23,480 Speaker 1: of the near future that you all are focused on? 376 00:22:24,200 --> 00:22:26,520 Speaker 2: You know, I'm very focused on AI, but again, that's 377 00:22:26,560 --> 00:22:29,480 Speaker 2: a general purpose technology, so to be a little more specific, 378 00:22:29,600 --> 00:22:32,879 Speaker 2: I am very excited and very interested in the implications 379 00:22:32,920 --> 00:22:38,879 Speaker 2: of artificial intelligence for medicine. In particular, there is 380 00:22:38,920 --> 00:22:41,399 Speaker 2: a growing body of evidence that we are able to 381 00:22:41,520 --> 00:22:45,840 Speaker 2: use some of these models to identify new uses for 382 00:22:45,960 --> 00:22:49,399 Speaker 2: already approved drugs, and many of these uses can be 383 00:22:49,520 --> 00:22:53,439 Speaker 2: targeted towards the types of diseases that maybe only one 384 00:22:53,480 --> 00:22:56,240 Speaker 2: hundred people in the world suffer from at a time. 385 00:22:56,560 --> 00:23:01,840 Speaker 2: Those types of diseases never get specific drug research because 386 00:23:01,920 --> 00:23:03,959 Speaker 2: it doesn't make sense to try to build a product 387 00:23:04,000 --> 00:23:07,480 Speaker 2: for one hundred people. But in aggregate, there are many 388 00:23:07,600 --> 00:23:10,560 Speaker 2: of these types of diseases, and so if we can repurpose, 389 00:23:10,600 --> 00:23:14,080 Speaker 2: if we can use these AI models to repurpose existing 390 00:23:14,160 --> 00:23:18,320 Speaker 2: approved drugs to solve or mitigate some of these diseases, 391 00:23:18,359 --> 00:23:21,000 Speaker 2: I think there's a really huge impact that we could 392 00:23:21,000 --> 00:23:23,760 Speaker 2: have on people's lives right now, even with the technology 393 00:23:23,840 --> 00:23:25,320 Speaker 2: we have right now, and that's only going to get 394 00:23:25,359 --> 00:23:27,640 Speaker 2: better in the future. And so I'm very excited about 395 00:23:27,640 --> 00:23:31,119 Speaker 2: this type of drug personalization, the ability to target some 396 00:23:31,200 --> 00:23:35,280 Speaker 2: of these very difficult-to-solve medical problems that many 397 00:23:35,320 --> 00:23:36,160 Speaker 2: people suffer from. 398 00:23:36,560 --> 00:23:38,800 Speaker 1: I've been talking for years about the idea of turning 399 00:23:39,240 --> 00:23:42,680 Speaker 1: disabilities into capabilities, and it seems to me we're right 400 00:23:42,720 --> 00:23:45,639 Speaker 1: at the edge of the technologies which are literally going 401 00:23:45,720 --> 00:23:46,600 Speaker 1: to make that possible. 402 00:23:47,359 --> 00:23:49,320 Speaker 2: I think you're right, and it's a really exciting time 403 00:23:49,359 --> 00:23:50,240 Speaker 2: to be in the space.
404 00:23:50,520 --> 00:23:53,160 Speaker 1: That's why I think, when you think about all this stuff, 405 00:23:53,640 --> 00:23:57,560 Speaker 1: I'm very optimistic that if we can avoid a nuclear 406 00:23:57,600 --> 00:24:01,200 Speaker 1: war and avoid losing to China, that over the next 407 00:24:01,200 --> 00:24:03,880 Speaker 1: twenty or thirty years, we really will have an extraordinary 408 00:24:03,880 --> 00:24:08,760 Speaker 1: golden age of new developments, new technologies, new opportunities. And 409 00:24:08,880 --> 00:24:11,440 Speaker 1: sort of being at a place called the Abundance Institute, 410 00:24:11,560 --> 00:24:12,760 Speaker 1: you almost have to believe that. 411 00:24:13,160 --> 00:24:15,119 Speaker 2: I was going to say, it's an abundant future. I 412 00:24:15,119 --> 00:24:17,200 Speaker 2: think if we get some of these policy things right, 413 00:24:17,240 --> 00:24:19,280 Speaker 2: and I think the AI Action Plan does move us 414 00:24:19,320 --> 00:24:20,320 Speaker 2: in that direction. 415 00:24:20,320 --> 00:24:22,920 Speaker 1: I hope so. Because this is going to continue to evolve, 416 00:24:23,280 --> 00:24:25,879 Speaker 1: it's going to be a lively process to watch. So 417 00:24:25,960 --> 00:24:28,760 Speaker 1: I hope you'll consider in the future coming back and keeping 418 00:24:28,840 --> 00:24:32,399 Speaker 1: us updated as all these opportunities keep developing, and I 419 00:24:32,400 --> 00:24:35,280 Speaker 1: hope people take a serious look at the Abundance Institute. Neil, 420 00:24:35,359 --> 00:24:37,440 Speaker 1: I want to thank you for joining me. I want 421 00:24:37,440 --> 00:24:40,119 Speaker 1: to let our listeners know they can find Winning the 422 00:24:40,200 --> 00:24:44,359 Speaker 1: Race: America's AI Action Plan at whitehouse dot gov, 423 00:24:44,640 --> 00:24:47,879 Speaker 1: and they can learn more about your organization, the Abundance 424 00:24:47,920 --> 00:24:52,600 Speaker 1: Institute, by visiting your website at abundance dot institute. 425 00:24:53,240 --> 00:24:54,919 Speaker 2: That's right. Thank you so much for having me on. 426 00:25:00,000 --> 00:25:02,200 Speaker 1: My guest, Neil Chilson. You can get a link 427 00:25:02,240 --> 00:25:05,280 Speaker 1: to Winning the Race: America's AI Action Plan on our 428 00:25:05,320 --> 00:25:08,639 Speaker 1: show page at newtsworld dot com. Newt's World is produced 429 00:25:08,640 --> 00:25:13,000 Speaker 1: by Gingrich 360 and iHeartMedia. Our executive producer is Guarnsey Sloan. 430 00:25:13,440 --> 00:25:16,840 Speaker 1: Our researcher is Rachel Peterson. The artwork for the show 431 00:25:16,880 --> 00:25:19,920 Speaker 1: was created by Steve Penley. Special thanks to the team 432 00:25:19,960 --> 00:25:23,480 Speaker 1: at Gingrich 360. If you've been enjoying Newt's World, I hope 433 00:25:23,520 --> 00:25:25,960 Speaker 1: you'll go to Apple Podcasts and both rate us with 434 00:25:26,040 --> 00:25:29,040 Speaker 1: five stars and give us a review so others can 435 00:25:29,119 --> 00:25:32,360 Speaker 1: learn what it's all about. Right now, listeners of Newt's World 436 00:25:32,400 --> 00:25:36,440 Speaker 1: can sign up for my three free weekly columns at gingrich360 437 00:25:36,480 --> 00:25:41,040 Speaker 1: dot com slash newsletter. I'm Newt Gingrich. This is Newt's World.