Speaker 1: Bloomberg Audio Studios: podcasts, radio, news.

Speaker 2: President Trump signed an executive order aimed at limiting state-level regulation of AI. The move is supported by tech leaders who have argued local rules could stifle innovation. We're joined by David Sacks, the White House AI and Crypto Czar and Senior Advisor. And David, I think it is a really good place to start with your work with the President in consulting and advising on the formulation of this executive order. What was the problem that you were trying to solve for, and what is it that you said to the President about why this EO was the right approach to focus on state-level laws?

Speaker 3: Well, thanks for having me.

Speaker 4: The problem that we see is that you've got one thousand different bills going through state legislatures right now to regulate AI, and over one hundred measures already passed. Some of these bills are contradictory, and you've got fifty different states running in fifty different directions.

Speaker 3: That type of compliance
Speaker 4: Regime is going to be very hard for small companies and startups, especially innovators, to comply with. And so what we need is a single federal or national framework for AI regulation, and that's what the President has supported. And by the way, he's supported this for a long time. If you go back to his July speech on AI, he called for a single national framework then. And what we've done with the EO now is to make clear that that is the administration's policy, and to task members of the administration to work with Congress to try and enact that framework through legislation, because ultimately this needs to be a law, and in the meantime to create tools that the administration can use to push back on examples of the most onerous and excessive state regulations.

Speaker 2: David, there is of course some pushback, you know, on the executive order, from the states themselves, from other Republicans. You know, as you know, I studied the July speech and strategy closely; a big part of it, you know, was infrastructure-related and about deregulation.
The concern about this latest executive order is that while it addresses your concerns about many different pieces of state regulation, it does not provide for a single federal framework.

Speaker 4: Well, at the end of the day, that single federal framework has to be enacted through law, and we need Congress to do that. And so the President has asked Congress to do that, and he's tasked members of the administration to work with Congress to produce that framework. In the meantime, what we've done here is articulate a set of principles. We said what values are important to us. We said that we want to protect child safety; that's important. We want to respect copyright. We want to preserve the ability of local communities to choose what infrastructure is in their communities. We're not seeking to preempt the states in any of those areas. So this is an important set of principles that we have put forth. And at the same time, the EO provides for a number of tools that can be used to push back on excessive state regulation. And let me just illustrate why I think this is so necessary.
What 58 00:03:12,840 --> 00:03:16,040 Speaker 4: we're really talking about here is regulation of AI models 59 00:03:16,080 --> 00:03:20,560 Speaker 4: and algorithms. Well, think about how an AI model is developed. 60 00:03:20,600 --> 00:03:23,840 Speaker 4: You can have developers in one state of multiple states 61 00:03:23,919 --> 00:03:26,640 Speaker 4: writing the code. It can then be trained in a 62 00:03:26,720 --> 00:03:29,679 Speaker 4: data center in another state. You then can have inference 63 00:03:29,760 --> 00:03:33,200 Speaker 4: happen in another state, and the entire service is provided 64 00:03:33,240 --> 00:03:37,560 Speaker 4: over the Internet using national telecommunications infrastructure. So you're dealing 65 00:03:37,600 --> 00:03:39,440 Speaker 4: there with at least four different states and all of 66 00:03:39,520 --> 00:03:43,600 Speaker 4: them can lay claim to regulating those AI models, and 67 00:03:43,680 --> 00:03:46,720 Speaker 4: those regulations can be in contradiction with each other. Even 68 00:03:46,760 --> 00:03:49,520 Speaker 4: Democrat governors have admitted this as a problem. So just 69 00:03:49,600 --> 00:03:51,800 Speaker 4: the other day, Kathy hoklel the governor of New York 70 00:03:52,120 --> 00:03:56,520 Speaker 4: It basically said that she might prefer to enact California's 71 00:03:56,600 --> 00:03:59,120 Speaker 4: SB fifty three, which is a regulation that they just 72 00:03:59,160 --> 00:04:01,960 Speaker 4: passed in California, rather than the bill that her own 73 00:04:01,960 --> 00:04:04,840 Speaker 4: assembly gave her, the Raise Act, because she sees that, wait, 74 00:04:04,840 --> 00:04:07,280 Speaker 4: do we really want to create this patchwork in different regulations. 
Speaker 4: So even Democratic governors are realizing this is a problem, and if they all run in different directions, then we're going to end up with a patchwork or a mishmash of regulations that are impossible for companies to comply with. What the President is calling for here is just common sense. We want to get to a single national framework of compliance as opposed to fifty states running in different directions.

Speaker 1: Meanwhile, Kathy Hochul actually is getting a bit of criticism, perhaps for narrowing, and what some are saying is bowing down to business. David, I'm really interested in how you oppose that view, because there is anxiety in the population: AI versus jobs, AI versus energy bills. How are you giving them the sense that we haven't seen federal government, and indeed now state governments, just handing over the reins to big tech billionaires, as people call them?

Speaker 3: Right.

Speaker 4: No, I understand there's a lot of fear out there about AI and job loss specifically, and a lot of those fears have been drummed up.
Let me just say on the job loss question, because I think this is really important, that Yale just released a study, and it showed that in the thirty-three months after the launch of ChatGPT, there was no discernible disruption to the US job market. None. They said no discernible disruption. And in fact, if you look right now, more jobs are being created than are being lost. So this whole idea of job losses just isn't true. There was an article in the Wall Street Journal just last week talking about the construction boom that's happening, that's benefiting construction workers like electricians, like plumbers, like workers who pour concrete or hang drywall.

Speaker 3: Their wages are up thirty percent.

Speaker 4: Because of this infrastructure boom that's happening right now. And there's actually a job shortage in many of those trades, meaning we need more workers going into those trades. So what we're seeing right now is an overall AI boom that's benefiting the economy. You know, the GDP growth rate was tracking about four percent, and up to half of that has been attributed to AI.
So I just think that this narrative about job loss has been blown out of proportion. Certainly there could be job displacement in the future, but we haven't seen any of that so far.

Speaker 3: It's been quite the opposite. It has been job gains.

Speaker 2: David, final one on the EO, if I may. You know what this EO allows for: is it the sort of hope that it will lead to the DOJ suing states like New York and California? And if that's the case, you know, the President and the administration's confidence that you'd win them?

Speaker 4: Well, that is one of the tools that is in the EO: the DOJ has been tasked to form a litigation task force that would have the ability to push back on excessively burdensome state laws, laws that may be unconstitutional, violate the First Amendment, things like that. By the way, the DOJ already had that power, so this is not a novel power. But what's being done here in the EO is we're marshaling all the resources of the federal government behind the strategy of the President to create a national framework.
Now, in terms of what laws we'd go after, that's a decision that hasn't been made. We haven't decided whether California and New York should be targets in that way. The one that I think is probably the most excessive is this Colorado law that seeks to prohibit algorithmic discrimination. What that basically says is that if an AI model has a disparate impact on a protected group, then that model is violating the law. Model developers, by the way, have no idea how to comply with this, because they're not aware of all the downstream uses of their model. I mean, if a business decides to use an AI model in a hiring decision, for example, that business is already on the hook for discrimination. So how would the model developer know that it was being used in that way? But what Colorado is trying to do there is get their ideology inserted into the model.

Speaker 3: That's very concerning to us.

Speaker 4: We think there's a First Amendment issue there. But look, we haven't made any decisions in terms of how that litigation task force is to be used.
Speaker 1: David, briefly: all of this is set in the context of US versus China and a need to run forward on AI development. Meanwhile, a busy week: H200s might indeed be able to get to China. How many do you think you'll do in volumes, and what do you think the appetite is of China to buy Nvidia's more sophisticated chips?

Speaker 3: Well, it's interesting.

Speaker 4: I just saw an article that said that China was rejecting the H200, so apparently they don't want them. And I think the reason for that is they want semiconductor independence. The same way that the United States wanted to be energy independent, they want to be semiconductor independent. So they're rejecting our chips. And that's part of the calculation that goes into the decision of what we authorize to be sold to China. The US policy has always been that we don't allow the leading-edge chips, and we're not.

Speaker 3: This is the H200 chip.
Speaker 4: It was state of the art a couple of years ago, but now it's been superseded by the newer Blackwell architecture and the Rubin architecture that's coming out next year. So this is now a lagging chip, not a leading chip. But what you see is China is not taking them, because they want to prop up and subsidize Huawei.

Speaker 3: They want to create a national champion.

Speaker 4: And that was part of our calculation: the logic of selling not the best but lagging chips to China is that you can take market share away from Huawei. But I think the Chinese government's figured that out, and that's why they're not allowing them.

Speaker 1: David Sacks, we always wish we had more time. White House AI and Crypto Czar, we thank you for joining us today on the executive order.