1 00:00:02,440 --> 00:00:08,280 Speaker 1: Bloomberg Audio Studios, Podcasts, Radio News. We start with the 2 00:00:08,360 --> 00:00:10,760 Speaker 1: US economy and some signs that the consumer may be 3 00:00:10,920 --> 00:00:13,920 Speaker 1: pulling back. We've seen this week the retail sales numbers, 4 00:00:14,120 --> 00:00:16,400 Speaker 1: and we welcome back our very special contributor, Larry Summers 5 00:00:16,440 --> 00:00:18,919 Speaker 1: of Harvard. So we had the retail sales numbers, which 6 00:00:18,920 --> 00:00:21,440 Speaker 1: came in lower than expected, and the housing starts were 7 00:00:21,480 --> 00:00:24,759 Speaker 1: really down substantially. Are we seeing a slowing economy? 8 00:00:25,520 --> 00:00:28,040 Speaker 2: We may be, and it's certainly not at the pace 9 00:00:28,160 --> 00:00:31,920 Speaker 2: that it once was. I think it's a real question 10 00:00:32,080 --> 00:00:37,040 Speaker 2: whether we're really seeing a profound slowing or month to 11 00:00:37,159 --> 00:00:41,479 Speaker 2: month fluctuations. My guess is this is still in the 12 00:00:41,520 --> 00:00:46,080 Speaker 2: world of month to month fluctuations with an underlying picture 13 00:00:46,600 --> 00:00:52,480 Speaker 2: of continuing growth. But you can't be sure, and I'd 14 00:00:52,520 --> 00:00:56,120 Speaker 2: certainly agree that the data has been more on the 15 00:00:56,240 --> 00:00:59,360 Speaker 2: slow side for the last month or two than on 16 00:00:59,440 --> 00:01:00,280 Speaker 2: the rapid. 17 00:01:00,600 --> 00:01:03,120 Speaker 1: So last week we spoke. We talked about the possible 18 00:01:03,160 --> 00:01:06,200 Speaker 1: economic policies of a Donald Trump if he were returned 19 00:01:06,400 --> 00:01:08,119 Speaker 1: to the White House, because he had floated the idea 20 00:01:08,160 --> 00:01:11,640 Speaker 1: of maybe having tariffs replace some or all of income taxes.
21 00:01:11,760 --> 00:01:14,479 Speaker 1: You were not very enthusiastic about that. Since then, he's 22 00:01:14,480 --> 00:01:18,080 Speaker 1: responded to you very specifically on a podcast called All 23 00:01:18,200 --> 00:01:20,959 Speaker 1: In, where they asked him about, well, how he responded 24 00:01:21,000 --> 00:01:23,959 Speaker 1: to your thoughts on tariffs, and he said he respected you. 25 00:01:24,080 --> 00:01:26,840 Speaker 1: He gave you a nice plaudit; he said he respects you. You 26 00:01:26,600 --> 00:01:28,920 Speaker 1: speak your mind, is I think what he said. But 27 00:01:28,920 --> 00:01:31,080 Speaker 1: at the same time, he really likes tariffs, he said, 28 00:01:31,120 --> 00:01:34,600 Speaker 1: because it shows the power of a country, both economically 29 00:01:34,680 --> 00:01:38,280 Speaker 1: and politically. What do you make of his endorsement of 30 00:01:38,319 --> 00:01:41,040 Speaker 1: tariffs as a major tool of policy? 31 00:01:41,520 --> 00:01:47,640 Speaker 2: I don't see the evidence for his belief. First of all, 32 00:01:48,000 --> 00:01:51,920 Speaker 2: the tariffs he proposes are going to be levied against Canada, 33 00:01:52,000 --> 00:01:56,680 Speaker 2: they're going to be levied against our traditional European allies. 34 00:01:57,320 --> 00:02:01,400 Speaker 2: If they are a tool of power and intimidation, it 35 00:02:01,440 --> 00:02:04,280 Speaker 2: seems to me they should be used much more selectively 36 00:02:04,880 --> 00:02:10,359 Speaker 2: than he has proposed. Second, when you launch attacks, then 37 00:02:10,520 --> 00:02:15,200 Speaker 2: others respond and the whole thing can spiral. The classic 38 00:02:15,320 --> 00:02:19,200 Speaker 2: example of a major tariff policy in American history was 39 00:02:19,320 --> 00:02:25,359 Speaker 2: Smoot-Hawley, and it contributed to making the Depression great.
40 00:02:25,760 --> 00:02:29,480 Speaker 2: As I look around the world at countries that have 41 00:02:29,720 --> 00:02:35,440 Speaker 2: seen tariffs as the center of their economic strategy, to 42 00:02:35,560 --> 00:02:40,400 Speaker 2: make a nationalist point vis-a-vis other countries, the 43 00:02:40,600 --> 00:02:45,480 Speaker 2: examples look like places like Argentina and a number of 44 00:02:45,520 --> 00:02:51,120 Speaker 2: places in Latin America where it has not been so successful. 45 00:02:51,520 --> 00:02:58,920 Speaker 2: So I'd like to see what the case is. But 46 00:02:59,440 --> 00:03:05,480 Speaker 2: this is a case where it's about as universal among 47 00:03:05,800 --> 00:03:13,000 Speaker 2: economists who have studied these things as anything that you shouldn't pursue systematic, 48 00:03:13,160 --> 00:03:19,720 Speaker 2: across-the-board tariffs for long periods of time. 49 00:03:20,280 --> 00:03:22,760 Speaker 1: Let's talk about change, because there's an awful lot of 50 00:03:22,760 --> 00:03:25,119 Speaker 1: things that are changing around us right now. Whether it's 51 00:03:25,120 --> 00:03:29,440 Speaker 1: climate, which you've talked about a fair amount, geopolitics, the global economy, 52 00:03:29,480 --> 00:03:31,400 Speaker 1: there are a lot of changes going on at the same time. 53 00:03:31,760 --> 00:03:34,920 Speaker 1: Are we going through a particularly tumultuous time of change 54 00:03:34,960 --> 00:03:37,640 Speaker 1: right now, do you think, around us? Or has it 55 00:03:37,680 --> 00:03:38,160 Speaker 1: always been thus? 56 00:03:38,200 --> 00:03:43,600 Speaker 2: Here's the sense I have, David. I was fortunate 57 00:03:43,720 --> 00:03:50,560 Speaker 2: enough to welcome baby granddaughter Francis Joanne into the world 58 00:03:50,640 --> 00:03:55,680 Speaker 2: when my daughter had my first granddaughter, about ten days ago.
59 00:03:56,120 --> 00:03:59,320 Speaker 2: So I've been thinking about her life, and it made 60 00:03:59,360 --> 00:04:04,480 Speaker 2: me think about my grandmother. My grandmother lived from nineteen 61 00:04:04,600 --> 00:04:09,720 Speaker 2: hundred to nineteen seventy-four. She saw indoor plumbing come, 62 00:04:10,080 --> 00:04:15,160 Speaker 2: she saw electricity come, she saw first the telephone, then radio, 63 00:04:15,440 --> 00:04:23,599 Speaker 2: then TV and movies come. She saw air conditioning, she saw antibiotics, 64 00:04:23,680 --> 00:04:28,839 Speaker 2: which made childhood death a rarity. She saw the ability 65 00:04:28,880 --> 00:04:32,880 Speaker 2: to no longer ride on horses, but instead to be 66 00:04:32,920 --> 00:04:38,440 Speaker 2: able to fly across the country in five hours. My 67 00:04:38,640 --> 00:04:44,240 Speaker 2: life is now almost as long as hers was; yours too. We've 68 00:04:44,279 --> 00:04:48,400 Speaker 2: seen a lot of history. We've seen a lot of change. Yes, 69 00:04:48,600 --> 00:04:53,560 Speaker 2: we've seen computers. We've seen the cell phone. We've seen, yes, 70 00:04:53,800 --> 00:05:01,560 Speaker 2: the Bloomberg terminal. We've seen more modern financial markets. But 71 00:05:01,680 --> 00:05:05,119 Speaker 2: I think you'd have to agree that we've seen much, 72 00:05:05,279 --> 00:05:11,640 Speaker 2: much less change than my grandmother's generation did. I have 73 00:05:11,680 --> 00:05:16,919 Speaker 2: a suspicion that my granddaughter is going to witness history 74 00:05:17,800 --> 00:05:24,960 Speaker 2: like my grandmother did. And most importantly, I think we're 75 00:05:25,000 --> 00:05:29,640 Speaker 2: going to see a step change with what happens in 76 00:05:30,440 --> 00:05:36,360 Speaker 2: artificial intelligence.
As I've said before, the wheel was awfully fundamental, 77 00:05:36,920 --> 00:05:39,880 Speaker 2: but once you have the wheel, you don't automatically get 78 00:05:39,880 --> 00:05:44,800 Speaker 2: more and better wheels. Same thing with electricity. But artificial 79 00:05:44,880 --> 00:05:50,920 Speaker 2: intelligence has the capacity to make better artificial intelligence, and 80 00:05:50,960 --> 00:05:55,880 Speaker 2: that puts in place a kind of upward exponential ratchet that 81 00:05:56,320 --> 00:06:01,200 Speaker 2: isn't a feature of any other technological change. So my 82 00:06:01,360 --> 00:06:07,480 Speaker 2: daughter's gonna witness seismic change, and my granddaughter is gonna 83 00:06:07,520 --> 00:06:10,760 Speaker 2: witness seismic change, and the issue is going to be 84 00:06:11,480 --> 00:06:17,560 Speaker 2: can we manage it so we avoid the catastrophes that 85 00:06:17,880 --> 00:06:24,440 Speaker 2: were also part of my grandmother's interval on this earth. 86 00:06:25,080 --> 00:06:27,240 Speaker 1: I just want to pick up on the artificial intelligence point, 87 00:06:27,279 --> 00:06:31,920 Speaker 1: because you brought generative artificial intelligence to us here at 88 00:06:31,960 --> 00:06:34,159 Speaker 1: Wall Street Week right about the time, I think, ChatGPT 89 00:06:34,200 --> 00:06:36,120 Speaker 1: came out. Now, since then you've gone on the 90 00:06:36,120 --> 00:06:37,960 Speaker 1: board of OpenAI, so I won't ask about 91 00:06:38,279 --> 00:06:41,320 Speaker 1: secret things at OpenAI. But from what you understand 92 00:06:41,440 --> 00:06:45,640 Speaker 1: now about generative AI, has your view changed about how 93 00:06:45,680 --> 00:06:47,800 Speaker 1: big it could be and also about what the possible 94 00:06:47,839 --> 00:06:50,400 Speaker 1: dangers are? Because we know now more than we did then.
95 00:06:51,200 --> 00:06:58,479 Speaker 2: Look, I think, David, this can be transcendent in its 96 00:06:58,520 --> 00:07:04,000 Speaker 2: importance for the reason that I just described. 97 00:07:04,640 --> 00:07:10,280 Speaker 2: It has this recursive aspect, where it's able to improve itself, 98 00:07:11,040 --> 00:07:17,480 Speaker 2: and that differentiates it from other technologies. When will we get 99 00:07:17,960 --> 00:07:22,160 Speaker 2: there? Very hard to know. How much will there be 100 00:07:23,320 --> 00:07:32,040 Speaker 2: last-mile problems that will stop some uses? That's another 101 00:07:32,320 --> 00:07:36,800 Speaker 2: very important question where I don't think anybody can give 102 00:07:37,360 --> 00:07:45,080 Speaker 2: a completely confident answer. God knows, as companies, as a society, 103 00:07:46,120 --> 00:07:51,480 Speaker 2: we cannot leave war to generals, and we cannot leave 104 00:07:52,120 --> 00:07:59,600 Speaker 2: AI only to AI developers. That's why it's absolutely essential 105 00:08:00,280 --> 00:08:07,600 Speaker 2: that public authorities take a strong role here to make 106 00:08:07,640 --> 00:08:12,600 Speaker 2: sure this technology is used for good. But equally, any 107 00:08:12,680 --> 00:08:16,800 Speaker 2: effort to stop this, or to just slow it down 108 00:08:17,280 --> 00:08:22,000 Speaker 2: for the sake of slowing it down without also thinking 109 00:08:22,040 --> 00:08:26,800 Speaker 2: about its positive development, would be to cede the field 110 00:08:27,400 --> 00:08:31,120 Speaker 2: to the irresponsible, would be to cede the field to 111 00:08:31,880 --> 00:08:36,240 Speaker 2: potential adversaries of the United States, would be to cede 112 00:08:36,360 --> 00:08:42,560 Speaker 2: the field to those whose vision of AI might be 113 00:08:42,679 --> 00:08:47,240 Speaker 2: as a tool of totalitarianism rather than as a tool 114 00:08:47,880 --> 00:08:55,880 Speaker 2: of human emancipation.
That's why there's a phrase that I 115 00:08:56,200 --> 00:09:03,240 Speaker 2: like that OpenAI often uses: responsible 116 00:09:03,400 --> 00:09:09,080 Speaker 2: iterative deployment. That is, you proceed in stages 117 00:09:09,800 --> 00:09:14,640 Speaker 2: and you're very focused on doing it responsibly. Now, that's 118 00:09:14,760 --> 00:09:21,680 Speaker 2: easier said than done and requires enormous thought, but I 119 00:09:21,720 --> 00:09:25,880 Speaker 2: don't think we have any other viable alternative. 120 00:09:26,200 --> 00:09:28,000 Speaker 1: Larry, thank you so very much for being with us 121 00:09:28,000 --> 00:09:30,880 Speaker 1: again this week. That's our special contributor, Larry Summers of 122 00:09:31,000 --> 00:09:31,440 Speaker 1: Harvard.