Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: The former Google CEO Eric Schmidt joins us on his latest book, Genesis: Artificial Intelligence, Hope, and the Human Spirit. Eric joins us now for more. Eric, welcome to the program, sir. I want to pick up on that one word, hope, and ask you how much hope I should have that this goes right, and start with this quote from Foreign Affairs magazine. It reads as follows: "In cases where some humans might face off militarily or diplomatically against the highly AI-enabled state or against AI itself, humans could struggle to survive, much less compete. Such an intermediate order could witness an internal implosion of societies and an uncontrollable explosion of external conflicts." So you've got to calm me down, Eric. Should I have hope that this plays out?

Speaker 3: Well, let's start with hope. Your show is incredible; previous guests talked about two percent productivity. Imagine that productivity goes to five percent a year because of this technology. So there's every reason to think that there are enormous new businesses, inventions, science, health, and so forth to get invented. But the same is also true for the invention of conflict and war. And if you think about it, it makes absolutely no sense to put somebody in a fighter jet up in the air and have it be shot at by another missile. It makes much more sense for that fighter pilot to be sitting on the ground and having a swarm of the equivalent of those jets, let's just call them powerful robots, powerful drones, that act in synchrony to achieve the objective of the attacker or to defend yourself. The future of war is autonomous and networked and AI-driven.
Speaker 4: This is something that a lot of people have focused on, in particular how the US is going to evolve in this manner, as well as how China is going to evolve in this manner, and how they're going to use some of the space satellites to help coordinate that effort. Eric, do you feel like right now one superpower is winning versus the other? What is the best way to gain preeminence in the military AI sphere?

Speaker 3: Well, when Doctor Kissinger and I went on his last trip to China, I was convinced from that meeting that China was two to two and a half years behind us. But it turns out I was wrong. In the last couple of weeks, China has brought out software models, what are called large language models, that are rivals of the best American ones, which I never thought would be possible. One is called Qwen, another is called Hunyuan. And it looks like they've caught up, or they're very close behind. And it looks like China has decided that it's another part of its industrial policy, along with its focus on cellular, dominating the battery industry, dominating the car industry, et cetera. So they're willing to spend the money. What's interesting is that it does frame the narrative of China versus the US as the defining narrative. And one of the more interesting questions, which we discussed at some length in the book, is what happens to deterrence, and what happens to all the other one hundred and ninety-five countries that are not China and not the US. These questions are really important because the power of this technology upends society in so many ways: economically, the way we govern, the way we use language, and so forth and so on.

Speaker 4: So there are a lot of questions within this, and this could be a half-hour discussion, but one of them is who should be financing and who should be leading the national efforts to gain preeminence in this sphere. And you have to wonder, you talk about China as a state-organized, state-funded type of effort that is coordinated in that manner; how should it be directed in a place that does consider itself capitalist, like the United States?

Speaker 3: Well, the Chinese model is what is called civil-military fusion, where they subsidize their biggest companies. No one in America, not even in the companies, thinks that the largest American companies need to be subsidized. But let me make an argument that if general intelligence, that is, human-level intelligence, is to be invented, it should be invented in the United States and under the control of Americans and the American government. And I don't mean managed, but I mean under the legal control of our country and our citizens and our democracy. And getting there first is a big deal. A number of us have talked for some time that there should be an AI for America initiative, where the government works closely with the private companies and figures out a way to make sure that we stay ahead. In truth, however, the financial industry that you all invented, for which we are incredibly grateful, has been incredibly generous in terms of the ability to raise the billions of dollars that are required. And I'd state, by the way, that you guys have done such a good job in raising money, and we've done such a good job in actually getting the computer scientists around the world and the data and so on and so on. What we now need is energy. We are prolific consumers of energy, and we're going to run out. One estimate is that the US will run out of all sources of energy by twenty twenty-eight at our current growth rate.

Speaker 5: There's another topic you discussed in this book, which is using AI to check AI. How can that work? What do you think about it?

Speaker 3: One of the things that happens in AI systems is you have compositional generation. What happens is you take one piece and another piece and you interlink them, and they've never been linked together before, and it looks like when you combine those two, you get emergent properties. So that's in and of itself interesting. So the question is, what is the limit of that? How far up can that system go? I think it will get there, and I think that it's probable that we can build systems that are, the technical term is, superintelligent, where you have a single system that is at the ninetieth percentile of physics, math, chemistry, and arts and so forth. No human can do that. It looks like these systems will not only be available in the next five years, because we already have examples of passing these tests already, but also that they'll be broadly available for all of society. That has huge implications. We've never had an experiment where each and every one of us has a polymath, you know, the Einstein-type scientist, at our beck and call. And furthermore, that scientist, whatever you want to call him or her or it, is capable of writing code, doing agents, and making things happen.

Speaker 5: How do you think the United States competes with China when we have an incoming administration that wants to put up tariff walls, and even this administration, which has been putting export controls on some of our high-tech semiconductors?

Speaker 3: I am not in favor of broad tariffs. I never have been. I'm the son of an economist and a grandson of an economist, and tariffs are essentially taxes. What I am in favor of is restrictions or limitations for strategic reasons. It's really important that we win in this race against China. And so the things that the Trump administration and the Biden administration did to limit, for example, hardware access were really smart. And I know because I was part of a commission that recommended it way back when, and in fact it was adopted.

Speaker 1: Great job.
Speaker 3: So the question here is, ignoring the tariff question, which I just don't like, and I think it's just another tax: how do you control and limit what China does, and how do you amplify what America does? And you sit there and you go, well, we're doing pretty well, right? Everybody's making a lot of money. There's all this growth. NVIDIA is doing well. All the tech companies are doing well. But let me explain what happens. You have a slope that's like this, and it keeps going up and up and up with humans. At some point, the industry believes that there will be AI scientists, that is, non-human scientists, and he or she who gets there first gets a slope like this, and all of a sudden, boom, we're really, really growing fast. These network-effect businesses are what we do in our competitive environment against each other under US regulation. I want that to be true globally for the US.

Speaker 2: Eric, I can tell you we're all in favor of having you back soon to continue the conversation. Congratulations on the new book. We appreciate your time. The former Google CEO Eric Schmidt.