Speaker 1: Bloomberg Audio Studios. Podcasts, radio, news.

Speaker 2: This is the first-ever AI Expo for National Competitiveness. So where are we in US competitiveness when it comes to artificial intelligence?

Speaker 3: Well, the good news is the US is way ahead of China and everybody else, and I think that's going to continue for a while. To me, national competitiveness is the challenge for the next ten or twenty years, because the Chinese are really focused on dominating certain industries, and we need to compete with them and make sure we win. In the case of artificial intelligence, we are well ahead of China, two or three years probably, which in my world is an eternity. I think we're in pretty good shape. And the crazy valuations, all the money, all the new experimentation, this sort of enormous adoption of AI that's occurring has a lot of implications for society and is very good for business.

Speaker 2: What about, ultimately, the question of regulating artificial intelligence? Is there any risk that we stifle innovation in so doing?

Speaker 3: There's always a risk of premature regulation. My simplest example there is Europe. I've spent ten years trying to convince Europe to actually innovate instead of regulate, and they just keep regulating. The current act, the EU AI Act, is essentially regulation, not investment in the future. So you can see that Europe is highly unlikely to be relevant. China, of course, is struggling because of chip shortages and so forth, although they're ready to win if they can get the hardware that they need. And the rest of the world is not focused enough on this. So the good news, from my position in terms of America, and here we are in DC for national competitiveness, is that we're the likely winner if we don't screw it up.

Speaker 2: What could screw it up? Could it be a lack of government focus? How would you grade the current administration when it comes to a focus on innovation and funding it?
Speaker 3: We've been working with the US government, and they've taken a largely hands-off, you know, "tell us what's going on" approach, which has worked pretty well. There are now voluntary commitments from all the companies. That's a good first step. I'm sure there will be areas of real problems, you know, cyber dangers, biological dangers and so forth, and I would expect in the next ten years you'll see real regulations in some of these spaces because of potential harm. Today, it's unlikely you're going to have serious harm, unless you think that the deepfakes are a serious harm, which they may be in some cases. So we're okay at the moment.

Speaker 2: Would you worry about the next administration, whether it was indeed under Biden, or if it was a Trump administration?

Speaker 3: We are steadfastly bipartisan on this issue, because the development of AI benefits everybody in the nation. It benefits in terms of, you know, curing cancer, helping with climate change, new materials, making energy more efficient in distribution and use, and so forth and so on. National competitiveness should be a bipartisan issue. In each administration, you have to get to know them, you have to get them up to speed. One of the strange things about America is that we change over the entire government every four or eight years, and then you have this whole new bunch of people who have to learn both how the government works and also they come in with an agenda. I'm not too worried about it, because everybody benefits from this, whether you're a Republican or a Democrat, whether you're a liberal or a conservative.

Speaker 2: It's also a nuanced relationship that corporates have to work out with the United States, and indeed with China, at the moment. And I'm really interested that you say China is perhaps being pushed behind because of its lack of access to chips. Are we currently navigating that nuance well?

Speaker 3: Well, the Trump administration, followed by the Biden administration, did a good job in getting those rules in place.
It's a good example, and not a common example, of how a targeted intervention has a positive, measured outcome from the standpoint of national security. So I think we're good there. It's very important. I led the National Security Commission on Artificial Intelligence a few years ago for the Congress. It's very important that we get more money out of the government for basic research. We're building intelligence, a new form of non-human intelligence that will change the world forever, in a good way. And furthermore, it needs to be done with our partner countries. These are the Five Eyes, the UK, that sort of thing, some of the European countries. And also it needs to be done with American values. So, using China as the bogeyman, imagine if they were in charge of the Internet and they were in charge of the rules, and you imagine how very different your experience as a user on the Internet would be today. You can show up, you can be anonymous, you can do whatever you want within reason. That kind of freedom is central to the spirit of the Internet, and it's important that it be preserved.

Speaker 2: Talking about a bogeyman, and a China-related bogeyman, TikTok became the bogeyman that many are now trying to regulate, to indeed ban, or to force to sell. There's been some reporting that you've been interested, alongside Mnuchin, in potentially purchasing US TikTok. Would you still be?

Speaker 3: I'm not currently looking at that. I looked at it for a while. My personal view on this is you're better off regulating than banning or taking judicial action. All the big tech companies are now in the hands of the DOJ in various legal fights. I would prefer to see a regulatory regime that sort of has the right incentives and the right prohibitions for all of these things.
My own view of TikTok is that TikTok is not really social media, it's really television, and that you could regulate television by the equivalent of the equal-time rule, but somehow we're not having that conversation.

Speaker 2: What's interesting is, of course, as you say, many big tech companies are currently entangled, from a time perspective, with the DOJ, one of them being where we know you hail from; you are of course the previous CEO of Google. How would you rate, from a personal perspective, where Google is in the AI competition race?

Speaker 3: Well, it's interesting that almost all of the technology you're seeing today was invented at Google about five to ten years ago. And it's one of the sort of great things that during that period, in my estimate, roughly two thirds of the world's talent in AI was working at Google, so Google had a very, very strong head start. I think Google is a complicated place; there's a lot going on. OpenAI in particular came out with GPT-3, 3.5 and 4 and did a fantastic job. I think that was the competitive nudge that has gotten Google back in the game. What I like now is you have these two huge companies, Microsoft and OpenAI together, and Google, Google Alphabet, again operating together, and you have them putting billions of dollars of hardware and enormous software teams into inventing this new future. This is not to take away from Anthropic, another competitor; obviously whatever Elon is doing with x.ai, he's raising a huge amount of money; and Meta is in the process of releasing a four-hundred-billion-parameter model that looks really, really good. So that competition is the right answer to the regulators and to everyone's question. Competition brings out enormous value for consumers. And eventually people will see this and they'll say, oh my god, look at what this does, and it's free. And of course all that money was paid out of the markets.
Billions of dollars are going in right now, and they'll monetize it somehow.

Speaker 2: Does it have to be led by the big corporates? How much optimism do you have that the startup culture is what's going to breed the innovation as well, you as an active investor, as well as someone who's trying to build competitiveness from a national perspective?

Speaker 3: I've tried to invest in most of the companies, because I frankly can't figure out which ones are going to be successful. One of the most interesting startup successes is Anthropic, which is now valued very high. It started off as a bunch of founders who came out of OpenAI, who had invented some of the technology but not all of it, and they built an unusually good product. Their big partner is Amazon, and that again makes sense to me, right, because they can do it faster than Amazon; they're very, very focused. So this new industrial structure of these deep partnerships is not a bad thing.

Look, the important thing is that the arrival of these large language models is a new economic problem in our industry, because of the amount of capital. In my fifty years of doing software, we never had capital constraints. We just had software people and the software; we always had whatever capital we needed. Now, in order to do anything, you need ten thousand or twenty thousand Nvidia H100s or B200s, which cost forty thousand dollars a pop. So you do the math. All of a sudden, your idea to get started has to have a billion-dollar valuation. That's a new event in the software industry, and unfortunately one which I think tends to reward large companies at the expense of the smaller companies. The best scenario is where the large companies are investing heavily and the small companies are right there trying to invent the new future. And every once in a while, one of those small companies becomes some huge monolith. Look at Meta, look at TikTok. You get the idea.
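To make the speaker's "do the math" concrete, here is a minimal back-of-the-envelope sketch using only the figures quoted above; the GPU counts and the forty-thousand-dollar unit price are as stated in the conversation, not verified prices:

```python
# Back-of-the-envelope GPU cluster cost, using the figures quoted in the interview.
UNIT_PRICE_USD = 40_000  # per Nvidia H100/B200, as quoted; not a verified price

for gpu_count in (10_000, 20_000):
    total = gpu_count * UNIT_PRICE_USD
    print(f"{gpu_count:,} GPUs x ${UNIT_PRICE_USD:,} each = ${total:,}")

# Output:
# 10,000 GPUs x $40,000 each = $400,000,000
# 20,000 GPUs x $40,000 each = $800,000,000
```

That is hundreds of millions of dollars of hardware before any software is written, which is why, on the speaker's account, a startup's idea has to carry a billion-dollar valuation from the outset.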
Speaker 2: You mentioned a global company there with TikTok. What about European startups? Could we have a giant out of Europe, or has regulation ended that? Would you invest in a European startup?

Speaker 3: I have been investing in France personally, because the French ecosystem, in particular under President Macron, is much more Silicon Valley-like. It's going to be rare to see the successes in Europe that you see in America and the UK, and the fundamental reason is they can't get the capital and the regulatory structure is so prohibitive. It's really sad, because the European talent pool is amazing. The opportunity to build companies in terms of impact is huge, and I think frankly the government and the regulation are slowing them down. So I'm not writing them off. I'm doing some investing, but so far the action will be in the US, the UK and Canada, where the regulatory structures are much more permissive to economic investment.

Speaker 2: And when we started this conversation, you said the US is leading as long as they don't screw it up. One piece of advice to ensure that the US doesn't?

Speaker 3: Let's start by saying the US capital system is extraordinary. You can raise billions of dollars based on an idea and a promise of a scale business. Let's not screw that up. The second thing is, let's not regulate these companies before they do something. Give them a chance to figure out how to monetize these spaces and build new businesses. Look at the tech companies that define the US stock market. Look at the value created. For the last couple of years, the sort of Lucky Seven, the Famous Four, whatever you want to call them, have driven the US stock market as the primary drivers of wealth for the whole nation. Right? So the game that we're in is one where we want to create another one, and another one, and another one.
That opportunity is how economics works in America, and we're in a position to provide global platforms at scale using the combination of American research, American policy, and all of this money that's raisable. This is a godsend. Every other country in the world would want this. Let's keep it going.