1 00:00:00,280 --> 00:00:07,200 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. 2 00:00:08,760 --> 00:00:11,680 Speaker 2: Maybe that will be the story for the global economy 3 00:00:11,920 --> 00:00:15,760 Speaker 2: in twenty twenty six. Yes, the risks are there, but 4 00:00:15,880 --> 00:00:19,720 Speaker 2: actually it's the momentum and the optimism and the AI 5 00:00:19,800 --> 00:00:22,720 Speaker 2: revolution which continue to dominate the narrative. 6 00:00:33,000 --> 00:00:36,000 Speaker 3: I'm Stephanie Flanders, head of Government and Economics at Bloomberg, 7 00:00:36,320 --> 00:00:39,239 Speaker 3: and this is Trumponomics, the podcast that looks at the 8 00:00:39,280 --> 00:00:42,320 Speaker 3: economic world of Donald Trump, how he's already shaped the 9 00:00:42,320 --> 00:00:45,040 Speaker 3: global economy and what on earth is going to happen next. 10 00:00:49,040 --> 00:00:52,280 Speaker 3: And this week it's a traditional Trumponomics look ahead to 11 00:00:52,360 --> 00:00:55,160 Speaker 3: twenty twenty six. As we look forward, I hope, to 12 00:00:55,200 --> 00:00:57,800 Speaker 3: the holidays, I wanted to take a deep breath and 13 00:00:57,840 --> 00:01:00,279 Speaker 3: think a little bit about what's happened in the past year, 14 00:01:00,840 --> 00:01:03,280 Speaker 3: and a lot about what it means for the year 15 00:01:03,320 --> 00:01:06,360 Speaker 3: ahead. And the Christmas bundle I've selected to help me 16 00:01:06,440 --> 00:01:09,080 Speaker 3: with this, and I hope provide just the right amount 17 00:01:09,080 --> 00:01:13,800 Speaker 3: of illumination, are, in Washington, Mario Parker, our managing editor 18 00:01:13,840 --> 00:01:18,399 Speaker 3: for US Politics, amazingly joining Trumponomics for the first time. Mario, hello. 19 00:01:18,680 --> 00:01:19,880 Speaker 4: I am very happy to be here. 20 00:01:19,920 --> 00:01:20,280 Speaker 4: Thank you.
21 00:01:20,600 --> 00:01:23,600 Speaker 3: Also in Washington, a frequent visitor to the pod, Tom Orlik, 22 00:01:23,720 --> 00:01:25,600 Speaker 3: chief economist for Bloomberg Economics. 23 00:01:25,760 --> 00:01:27,440 Speaker 2: Great to be here, Stephanie. 24 00:01:27,560 --> 00:01:31,920 Speaker 3: And in London, welcome back Parmy Olson, a Bloomberg Opinion columnist covering technology, 25 00:01:31,959 --> 00:01:35,959 Speaker 3: whose book Supremacy: AI, ChatGPT, and the Race That 26 00:01:35,959 --> 00:01:38,720 Speaker 3: Will Change the World was the FT's Business Book of 27 00:01:38,760 --> 00:01:41,360 Speaker 3: the Year for twenty twenty four. Parmy, welcome back. 28 00:01:41,920 --> 00:01:42,840 Speaker 5: Thank you for having me. 29 00:01:49,880 --> 00:01:52,280 Speaker 3: I mean I wanted the three of you for various reasons, 30 00:01:52,280 --> 00:01:54,200 Speaker 3: some of them pretty obvious. You've been right at the 31 00:01:54,240 --> 00:01:56,440 Speaker 3: heart of the action this year, and I find you 32 00:01:56,800 --> 00:02:02,000 Speaker 3: always wise about what's to come. The way we always 33 00:02:02,000 --> 00:02:04,040 Speaker 3: start these things, where I like to start, is just 34 00:02:04,160 --> 00:02:06,280 Speaker 3: for each of us to think: what was the most 35 00:02:07,080 --> 00:02:12,120 Speaker 3: memorable moment of this year? So Mario, I'll put you 36 00:02:12,160 --> 00:02:14,120 Speaker 3: on the spot. What was the most memorable moment for you? 37 00:02:15,080 --> 00:02:18,760 Speaker 6: I think it was, specific to this podcast 38 00:02:18,880 --> 00:02:22,800 Speaker 6: very much, the Liberation Day rollout, where it started. 39 00:02:22,960 --> 00:02:24,400 Speaker 4: There were fits and starts.
40 00:02:24,440 --> 00:02:28,519 Speaker 6: There were problems with the White House's copying machine, there 41 00:02:28,520 --> 00:02:32,320 Speaker 6: were typographical errors on some of the cards, and there 42 00:02:32,480 --> 00:02:35,160 Speaker 6: was just a lot of confusion, and I think that 43 00:02:35,320 --> 00:02:38,440 Speaker 6: set the stage for what we saw with both the 44 00:02:38,560 --> 00:02:41,640 Speaker 6: tariff rollout and the way the many things that 45 00:02:41,639 --> 00:02:44,680 Speaker 6: the administration promised did not quite come to fruition, 46 00:02:45,200 --> 00:02:48,520 Speaker 6: but also the fast pace and the chaos that we've 47 00:02:48,560 --> 00:02:52,480 Speaker 6: seen wrought by the administration, deliberately so, where they want 48 00:02:52,520 --> 00:02:55,120 Speaker 6: to move fast and quote unquote break things, and we've 49 00:02:55,120 --> 00:02:56,520 Speaker 6: seen that throughout the rest of the year. 50 00:02:56,639 --> 00:02:59,240 Speaker 4: So that was the most memorable moment, I think. Tom? 51 00:02:59,720 --> 00:03:03,640 Speaker 2: There was a moment just before the US made that 52 00:03:03,680 --> 00:03:08,840 Speaker 2: fateful decision to join Israel in attacking Iran's nuclear facilities, 53 00:03:09,280 --> 00:03:14,920 Speaker 2: and there was still intense speculation about whether Operation Midnight Hammer, 54 00:03:15,120 --> 00:03:17,720 Speaker 2: as it was called, would go ahead or not, and 55 00:03:17,840 --> 00:03:21,799 Speaker 2: President Trump took to Truth Social and said, I may 56 00:03:21,840 --> 00:03:25,880 Speaker 2: do it, I may not do it. Only I will decide, 57 00:03:26,360 --> 00:03:29,000 Speaker 2: and I expect everyone else is way ahead of me on this.
58 00:03:29,200 --> 00:03:32,320 Speaker 2: But for me, that was a real clarifying moment, right, 59 00:03:32,639 --> 00:03:38,200 Speaker 2: and it clarified that the intense uncertainty, the kind of 60 00:03:38,720 --> 00:03:42,480 Speaker 2: presidency as a soap opera with a cliffhanger at the 61 00:03:42,600 --> 00:03:44,760 Speaker 2: end of every episode to keep you tuning in the 62 00:03:44,800 --> 00:03:48,360 Speaker 2: next day, that wasn't a bug of some of the 63 00:03:48,520 --> 00:03:51,760 Speaker 2: trade policies and security policies we'd seen in the previous 64 00:03:51,800 --> 00:03:55,560 Speaker 2: few months. It was a feature, a deliberate strategy to 65 00:03:55,640 --> 00:04:00,840 Speaker 2: allow the President to very successfully dominate the conversation, dominate 66 00:04:00,920 --> 00:04:04,360 Speaker 2: the headlines, keep himself at the front of everyone's consciousness. 67 00:04:05,040 --> 00:04:06,120 Speaker 4: Parmy? 68 00:04:06,120 --> 00:04:07,840 Speaker 1: Well, the highlight for me was actually at the very beginning 69 00:04:07,840 --> 00:04:10,800 Speaker 1: of this year in January, when a little company in 70 00:04:11,240 --> 00:04:16,039 Speaker 1: China called DeepSeek released an AI model that caused 71 00:04:16,080 --> 00:04:17,960 Speaker 1: a freak out in Silicon Valley. 72 00:04:18,360 --> 00:04:20,560 Speaker 5: This was an AI model that was as good as 73 00:04:20,520 --> 00:04:24,040 Speaker 1: ChatGPT, but developed for what apparently was a fraction 74 00:04:24,360 --> 00:04:27,760 Speaker 1: of the price, and it was an open model; some 75 00:04:27,800 --> 00:04:30,080 Speaker 1: would call it open source, so they essentially put the 76 00:04:30,080 --> 00:04:32,760 Speaker 1: blueprints of this AI model on the Internet for anybody 77 00:04:33,200 --> 00:04:36,040 Speaker 1: to develop on themselves.
And I thought it was just 78 00:04:36,080 --> 00:04:40,840 Speaker 1: such an interesting twist on this effort by the Trump 79 00:04:40,920 --> 00:04:46,080 Speaker 1: administration, and previous White Houses before, to restrict chip exports 80 00:04:46,160 --> 00:04:49,679 Speaker 1: to China to try and ensure that the US would 81 00:04:49,680 --> 00:04:54,680 Speaker 1: stay ahead on tech. And yet the constraint bred innovation: 82 00:04:54,960 --> 00:04:58,120 Speaker 1: the lack of chips led this one company to try 83 00:04:58,160 --> 00:05:02,440 Speaker 1: and do more with less. And since then, DeepSeek 84 00:05:02,480 --> 00:05:05,000 Speaker 1: isn't quite as popular as I would have thought it 85 00:05:05,040 --> 00:05:08,200 Speaker 1: would be by this point in time, but other Chinese 86 00:05:08,279 --> 00:05:11,560 Speaker 1: open source models are becoming much more popular, in particular 87 00:05:11,600 --> 00:05:14,960 Speaker 1: Alibaba's Qwen. So lots of Silicon Valley startups are 88 00:05:15,120 --> 00:05:18,799 Speaker 1: using models from China now, and I think that's just 89 00:05:18,839 --> 00:05:23,119 Speaker 1: such an interesting highlight for this ascendancy we're seeing from China, 90 00:05:23,160 --> 00:05:25,560 Speaker 1: both in AI and also in robotics. 91 00:05:26,240 --> 00:05:28,760 Speaker 3: And it's funny for me because yours slightly relates to 92 00:05:28,760 --> 00:05:30,880 Speaker 3: one of mine.
If I'm completely honest, if I think 93 00:05:30,920 --> 00:05:33,200 Speaker 3: of what was the most memorable, I think watching in 94 00:05:33,200 --> 00:05:35,960 Speaker 3: the office, watching in real time the feed coming 95 00:05:36,000 --> 00:05:39,160 Speaker 3: in of that Zelensky, the sort of car crash Zelensky 96 00:05:39,600 --> 00:05:42,640 Speaker 3: session in the White House with the President, the visceral 97 00:05:42,960 --> 00:05:45,200 Speaker 3: reaction we all had of oh my god, this is 98 00:05:45,279 --> 00:05:48,240 Speaker 3: really very bad. But if I think about what was 99 00:05:48,279 --> 00:05:51,880 Speaker 3: the most eye opening, it was when we'd all been 100 00:05:52,440 --> 00:05:55,839 Speaker 3: completely focused on the US, and I went to a 101 00:05:55,920 --> 00:05:59,360 Speaker 3: sort of reasonably high level panel of senior people involved in global business, 102 00:05:59,600 --> 00:06:03,000 Speaker 3: some of them based in Hong Kong, while others were in the US, 103 00:06:03,520 --> 00:06:06,119 Speaker 3: and I'd asked them on this panel, you know, what's 104 00:06:06,160 --> 00:06:08,560 Speaker 3: the thing that surprised you so far this year?
Assuming 105 00:06:08,600 --> 00:06:10,920 Speaker 3: they would all talk about Donald Trump, every single one 106 00:06:10,960 --> 00:06:13,720 Speaker 3: of them pointed to a technological advance that they'd been 107 00:06:13,760 --> 00:06:17,000 Speaker 3: surprised that China had made, whether it was in electric 108 00:06:17,160 --> 00:06:19,839 Speaker 3: vehicles, or I'm sure a couple of them were talking 109 00:06:19,839 --> 00:06:23,000 Speaker 3: about DeepSeek, and it just sort of made me think, ah, 110 00:06:23,480 --> 00:06:26,040 Speaker 3: maybe China's just going to win this, even as we 111 00:06:26,080 --> 00:06:28,320 Speaker 3: spend an awful lot of time watching every move coming 112 00:06:28,320 --> 00:06:31,000 Speaker 3: out of the White House. So, Tom, partly because of that: 113 00:06:31,640 --> 00:06:34,560 Speaker 3: how the rest of the world has absorbed and responded 114 00:06:34,600 --> 00:06:37,039 Speaker 3: to Trumponomics has obviously been a big theme for the year, 115 00:06:37,800 --> 00:06:41,039 Speaker 3: the big surprise being that countries, in many ways 116 00:06:41,080 --> 00:06:43,720 Speaker 3: almost all of them apart from China, did not retaliate 117 00:06:44,040 --> 00:06:47,400 Speaker 3: against the tariffs, and that has had its own consequences. 118 00:06:47,480 --> 00:06:51,960 Speaker 3: But how do you see the global ramifications of mister 119 00:06:51,960 --> 00:06:55,440 Speaker 3: Trump's economic policies playing out now in twenty twenty six? 120 00:06:55,480 --> 00:06:58,159 Speaker 3: We've had stage one. What's stage two going to look like? 121 00:06:58,760 --> 00:07:01,919 Speaker 2: So I think if we think about trade, if we 122 00:07:01,960 --> 00:07:06,120 Speaker 2: think about security, if we think about AI, if we 123 00:07:06,160 --> 00:07:09,920 Speaker 2: think about the Fed, the twenty twenty five story isn't 124 00:07:09,960 --> 00:07:13,360 Speaker 2: over, right.
The consequences are still going to be playing 125 00:07:13,400 --> 00:07:16,680 Speaker 2: out in twenty twenty six. On trade, I think one 126 00:07:16,720 --> 00:07:19,920 Speaker 2: of the surprises is that we've not seen this enormous 127 00:07:19,960 --> 00:07:24,640 Speaker 2: hike in tariffs playing out immediately in higher prices for 128 00:07:24,800 --> 00:07:29,360 Speaker 2: US consumers and lower profits for US businesses. But I 129 00:07:29,400 --> 00:07:33,000 Speaker 2: think that pass-through of tariffs to the rest of 130 00:07:33,000 --> 00:07:37,280 Speaker 2: the economy, higher prices at the shops, lower margins for 131 00:07:37,440 --> 00:07:41,520 Speaker 2: US businesses, potentially a hit to US stocks, that's still 132 00:07:41,560 --> 00:07:44,320 Speaker 2: something that will play out in the early months of 133 00:07:44,400 --> 00:07:48,040 Speaker 2: twenty twenty six. If we think about security, I think 134 00:07:48,080 --> 00:07:50,440 Speaker 2: one of the big things we're looking for in twenty 135 00:07:50,480 --> 00:07:56,400 Speaker 2: twenty six is resolution of that terrible ongoing conflict in Ukraine. 136 00:07:56,720 --> 00:07:59,360 Speaker 2: If we think about the situation right now, well, it 137 00:07:59,480 --> 00:08:02,640 Speaker 2: seems like that resolution could be on terms which are 138 00:08:02,720 --> 00:08:06,280 Speaker 2: very favorable to Moscow, and that's something which has significant 139 00:08:06,280 --> 00:08:10,880 Speaker 2: economic consequences. We're already seeing Germany and other European nations 140 00:08:11,200 --> 00:08:15,400 Speaker 2: responding with a significant increase in their defense spending. Good 141 00:08:15,440 --> 00:08:18,960 Speaker 2: news for growth, good news for defense stocks, bad news 142 00:08:19,000 --> 00:08:22,119 Speaker 2: for countries already struggling under a heavy burden of debt.
143 00:08:22,520 --> 00:08:26,000 Speaker 2: On AI, well, one of the good news stories in 144 00:08:26,040 --> 00:08:29,920 Speaker 2: twenty twenty five was just the enormous optimism about AI, 145 00:08:30,080 --> 00:08:32,760 Speaker 2: and I'm sure we'll hear more about that from Parmy later. 146 00:08:33,160 --> 00:08:36,679 Speaker 2: Big question for twenty twenty six: is that optimism going 147 00:08:36,720 --> 00:08:40,840 Speaker 2: to be sustained? And on central banks, well, huge challenges 148 00:08:40,880 --> 00:08:44,959 Speaker 2: to Fed independence in twenty twenty five. In twenty twenty six, 149 00:08:45,160 --> 00:08:47,360 Speaker 2: Trump's going to be able to pick the chair of 150 00:08:47,400 --> 00:08:48,280 Speaker 2: the Federal Reserve. 151 00:08:49,320 --> 00:08:51,040 Speaker 3: And I think probably all of those things we're 152 00:08:51,040 --> 00:08:52,960 Speaker 3: going to end up touching on. If I just think 153 00:08:53,000 --> 00:08:56,040 Speaker 3: about China, because that was such a focus of the 154 00:08:56,080 --> 00:08:59,719 Speaker 3: administration's attention at the start of the year and has 155 00:08:59,760 --> 00:09:03,520 Speaker 3: been such a dominant theme, if you sort of look back: 156 00:09:04,080 --> 00:09:09,240 Speaker 3: is China stronger or weaker economically going into twenty twenty 157 00:09:09,280 --> 00:09:11,480 Speaker 3: six as a result of Trumponomics? 158 00:09:12,320 --> 00:09:15,400 Speaker 2: So I think about it on a couple of dimensions, Stephanie. 159 00:09:15,440 --> 00:09:19,600 Speaker 2: The first one is: how hawkish on China is Donald Trump? 160 00:09:19,760 --> 00:09:22,240 Speaker 2: And it's actually quite hard to answer that question. In 161 00:09:22,280 --> 00:09:26,760 Speaker 2: one respect, he's America's biggest China hawk.
Until he descended 162 00:09:26,880 --> 00:09:31,120 Speaker 2: that Golden Escalator to start his presidential bid in twenty sixteen, 163 00:09:31,559 --> 00:09:34,920 Speaker 2: the view in Washington, DC was, yeah, there's some problems 164 00:09:34,960 --> 00:09:37,440 Speaker 2: with China, we've got some issues, but we can work together. 165 00:09:37,640 --> 00:09:41,000 Speaker 2: And it was Trump who was responsible for that enormous pivot, 166 00:09:41,040 --> 00:09:45,080 Speaker 2: that big reorientation, saying, actually, China's not our friend, not 167 00:09:45,160 --> 00:09:47,960 Speaker 2: someone we're going to work with; China's our rival and 168 00:09:48,000 --> 00:09:50,600 Speaker 2: our adversary. But if you look at the details of 169 00:09:50,679 --> 00:09:55,640 Speaker 2: China policy right now, well, tariffs have come down, controls 170 00:09:55,679 --> 00:09:58,720 Speaker 2: on China's access to the chips which power the AI 171 00:09:58,840 --> 00:10:03,480 Speaker 2: revolution have been relaxed. Apparently Trump got on the phone 172 00:10:03,520 --> 00:10:08,000 Speaker 2: with Japan's Prime Minister Takaichi and said, cool it on Taiwan, 173 00:10:08,120 --> 00:10:11,680 Speaker 2: you don't want to upset President Xi. Right? So the 174 00:10:11,720 --> 00:10:15,920 Speaker 2: big reorientation starting in twenty sixteen positions Trump as 175 00:10:15,920 --> 00:10:19,720 Speaker 2: an enormous China hawk; the details of policy in the 176 00:10:19,760 --> 00:10:21,880 Speaker 2: last few months, a bit less hawkish, 177 00:10:22,040 --> 00:10:22,280 Speaker 4: I think. 178 00:10:22,320 --> 00:10:24,480 Speaker 2: The other big question for China, the other dimension I 179 00:10:24,520 --> 00:10:27,160 Speaker 2: would think about this on, is kind of short term, long term.
180 00:10:27,400 --> 00:10:32,400 Speaker 2: So short term, China's facing a crisis in its property sector, 181 00:10:32,679 --> 00:10:35,960 Speaker 2: a collapse in real estate, the consequences of years of overbuilding, 182 00:10:36,120 --> 00:10:39,160 Speaker 2: and when you've got weak domestic demand, what you really 183 00:10:39,200 --> 00:10:42,720 Speaker 2: want is strong global demand, strong exports, to keep your 184 00:10:42,760 --> 00:10:47,199 Speaker 2: economy going. Clearly, facing new high tariffs from the United 185 00:10:47,240 --> 00:10:52,160 Speaker 2: States is not very constructive. Longer term, though, there's a 186 00:10:52,280 --> 00:10:57,840 Speaker 2: narrative in China that democracy plus capitalism doesn't work: democracy 187 00:10:57,880 --> 00:11:02,320 Speaker 2: plus capitalism ends, if you're going to be generous, in plutocracy, 188 00:11:02,640 --> 00:11:05,480 Speaker 2: if you're going to be less generous, in kleptocracy. And 189 00:11:05,520 --> 00:11:08,360 Speaker 2: I think when Beijing looks at what's happening in Washington, DC, 190 00:11:08,760 --> 00:11:12,040 Speaker 2: that's very much their understanding of how things are playing out, 191 00:11:12,440 --> 00:11:14,760 Speaker 2: and I think long term they think that can only 192 00:11:14,800 --> 00:11:18,440 Speaker 2: be positive for China and China's emergence as a global power. 193 00:11:19,480 --> 00:11:21,480 Speaker 3: I mean, there was quite a lot of focus, just 194 00:11:21,559 --> 00:11:23,679 Speaker 3: in the weeks leading up to the end of the 195 00:11:23,760 --> 00:11:27,960 Speaker 3: year, on the trade surplus still being enormous in China, and 196 00:11:28,000 --> 00:11:32,000 Speaker 3: China having quite successfully diverted a lot of exports that are 197 00:11:32,000 --> 00:11:35,000 Speaker 3: not going to the US to other parts of the world.
Parmy, 198 00:11:35,040 --> 00:11:38,000 Speaker 3: we're going to get into the relative strength, I would 199 00:11:38,040 --> 00:11:42,240 Speaker 3: suggest, of the US economy this year. Some pain may 200 00:11:42,240 --> 00:11:45,000 Speaker 3: be still to come, but I think there's a general 201 00:11:45,040 --> 00:11:48,719 Speaker 3: feeling that the impact of AI investment has been a 202 00:11:48,720 --> 00:11:51,400 Speaker 3: big part of that, supporting the economy and indeed just 203 00:11:51,480 --> 00:11:54,360 Speaker 3: making it hard to read the economy. Do you see 204 00:11:54,720 --> 00:11:58,679 Speaker 3: that level of capex, as Tom talked about, continuing next 205 00:11:58,760 --> 00:12:01,439 Speaker 3: year? And as we see this rollout of AI, 206 00:12:02,240 --> 00:12:05,240 Speaker 3: what are the potential bear traps there, waiting down the 207 00:12:05,320 --> 00:12:07,439 Speaker 3: road for the administration or the economy? 208 00:12:08,280 --> 00:12:10,320 Speaker 5: Well, I think you're absolutely right in a lot of ways. 209 00:12:10,440 --> 00:12:13,280 Speaker 1: The AI boom has given the Trump administration a lot of 210 00:12:13,320 --> 00:12:16,120 Speaker 1: cover because it's just made it look like everything is 211 00:12:16,160 --> 00:12:20,280 Speaker 1: going so well. And in aggregate, the Magnificent Seven tech 212 00:12:20,320 --> 00:12:25,000 Speaker 1: stocks have seen their market valuations increase by trillions of 213 00:12:25,040 --> 00:12:27,880 Speaker 1: dollars just in the last three years since ChatGPT 214 00:12:28,000 --> 00:12:31,280 Speaker 1: came out. I personally think there is going to be 215 00:12:31,360 --> 00:12:35,880 Speaker 1: some kind of correction in twenty twenty six, and for 216 00:12:35,920 --> 00:12:38,400 Speaker 1: all the reasons that people are saying, which is that 217 00:12:39,040 --> 00:12:44,240 Speaker 1: there is very high risk of near term monetization not happening.
218 00:12:44,480 --> 00:12:48,160 Speaker 1: In other words, there's been this huge buildout of infrastructure 219 00:12:48,240 --> 00:12:53,200 Speaker 1: anticipating all this demand that probably will come in eventually, 220 00:12:53,600 --> 00:12:55,720 Speaker 1: but it's just not going to come in fast enough 221 00:12:55,720 --> 00:12:58,040 Speaker 1: to meet the expectations of investors. 222 00:12:58,400 --> 00:13:00,160 Speaker 5: And this is exactly why we saw 223 00:13:00,480 --> 00:13:04,040 Speaker 1: booms and busts in the dot-com era, early two thousands. 224 00:13:04,120 --> 00:13:06,320 Speaker 1: You had this big rollout of fiber optic cables that 225 00:13:06,400 --> 00:13:08,840 Speaker 1: eventually got used, but just didn't get used quickly enough. 226 00:13:08,960 --> 00:13:11,079 Speaker 1: Same with the railroads in the nineteenth century. I think 227 00:13:11,120 --> 00:13:15,680 Speaker 1: we'll just see perhaps that the sparks will be leaked 228 00:13:15,720 --> 00:13:19,040 Speaker 1: reports from OpenAI showing a kind of soft demand 229 00:13:19,200 --> 00:13:22,640 Speaker 1: for their enterprise AI products, or something in the earnings 230 00:13:22,679 --> 00:13:27,800 Speaker 1: reports from Microsoft or Google or Amazon's AWS, just showing 231 00:13:27,800 --> 00:13:30,680 Speaker 1: that it's actually been quite difficult to get businesses to 232 00:13:30,800 --> 00:13:33,520 Speaker 1: adopt these AI services, because that's where 233 00:13:33,360 --> 00:13:34,920 Speaker 5: all the money is right now. 234 00:13:34,960 --> 00:13:39,760 Speaker 1: The huge success of generative AI is consumer based. You've 235 00:13:39,760 --> 00:13:42,920 Speaker 1: got nine hundred million people, that's about ten percent 236 00:13:42,960 --> 00:13:47,679 Speaker 1: of the global population, using ChatGPT every week. Every week.
237 00:13:47,720 --> 00:13:52,120 Speaker 1: That's astoundingly successful from a market dominance perspective, but it's 238 00:13:52,160 --> 00:13:54,320 Speaker 1: not really making money for 239 00:13:54,080 --> 00:13:56,400 Speaker 5: OpenAI, because very few of those people are paying subscriptions. 240 00:13:56,720 --> 00:13:59,400 Speaker 1: The real money is going to come when businesses start 241 00:13:59,440 --> 00:14:02,160 Speaker 1: paying for access to that service, and that's just 242 00:14:02,280 --> 00:14:05,840 Speaker 1: taking longer than these companies need. So the more signals 243 00:14:05,840 --> 00:14:09,040 Speaker 1: we get of that lack of demand, which I guess will 244 00:14:09,040 --> 00:14:11,199 Speaker 1: happen next year, the more likely we'll have some kind 245 00:14:11,280 --> 00:14:14,679 Speaker 1: of decline in the markets. But I do think ultimately 246 00:14:15,640 --> 00:14:18,800 Speaker 1: that'll probably work out quite well for the biggest 247 00:14:18,840 --> 00:14:22,200 Speaker 1: tech companies, because they're the ones that can weather the storm. 248 00:14:22,240 --> 00:14:24,000 Speaker 5: They're so well capitalized. 249 00:14:24,560 --> 00:14:27,680 Speaker 1: It's the startups and the smaller companies who will really struggle, 250 00:14:28,000 --> 00:14:31,080 Speaker 1: who will see their valuations fall, and then here comes 251 00:14:31,080 --> 00:14:33,920 Speaker 1: an opportunity for big tech firms to acqui-hire those 252 00:14:33,920 --> 00:14:37,040 Speaker 1: companies or buy those companies, acquire all their talent and 253 00:14:37,120 --> 00:14:40,360 Speaker 1: IP, and just become even stronger out of all of this. 254 00:14:41,240 --> 00:14:42,240 Speaker 4: And that's a very good point.
255 00:14:42,440 --> 00:14:44,960 Speaker 3: In your book, your award winning book, you chart the 256 00:14:45,000 --> 00:14:48,520 Speaker 3: history of the crucial bits of the AI revolution, and 257 00:14:48,560 --> 00:14:51,120 Speaker 3: I think in the process were reporting very closely on 258 00:14:51,560 --> 00:14:54,160 Speaker 3: the individuals in the industry. And of course the tech 259 00:14:54,200 --> 00:14:59,160 Speaker 3: bros are unusually close, or at least people who have 260 00:14:59,280 --> 00:15:02,960 Speaker 3: the back of the AI industry and these big tech 261 00:15:02,960 --> 00:15:06,280 Speaker 3: companies are very close to the White House, and sort 262 00:15:06,280 --> 00:15:08,360 Speaker 3: of famously have the ear of the President. Some of 263 00:15:08,360 --> 00:15:11,200 Speaker 3: the kind of inconsistencies in China policy that Tom mentioned 264 00:15:11,800 --> 00:15:15,720 Speaker 3: are a reflection of that, the decision to allow Nvidia 265 00:15:15,840 --> 00:15:18,960 Speaker 3: to sell some of these more advanced chips to China. 266 00:15:19,120 --> 00:15:23,680 Speaker 3: If we start to see, Parmy, that AI is costing jobs, 267 00:15:24,400 --> 00:15:29,960 Speaker 3: or being associated with other social ills, like we've already 268 00:15:30,000 --> 00:15:33,480 Speaker 3: seen with social media, if it starts being associated with 269 00:15:33,560 --> 00:15:35,800 Speaker 3: higher electricity costs for a lot of people as they 270 00:15:35,920 --> 00:15:38,960 Speaker 3: roll out these data processing centers, is there nervousness in 271 00:15:39,000 --> 00:15:42,000 Speaker 3: Silicon Valley about the kind of political exposure they have now? 272 00:15:42,040 --> 00:15:45,240 Speaker 3: It's great to be influencing policy, but it then means 273 00:15:45,240 --> 00:15:48,359 Speaker 3: they're a bit more implicated, I suppose. 274 00:15:48,040 --> 00:15:49,200 Speaker 5: But what choice do they have?
275 00:15:49,320 --> 00:15:51,560 Speaker 1: I think right now their view is: this is the 276 00:15:52,840 --> 00:15:56,680 Speaker 1: strategy that has worked till now, which is to lobby 277 00:15:56,760 --> 00:16:00,880 Speaker 1: as hard as you can to try and prevent any 278 00:16:00,920 --> 00:16:04,440 Speaker 1: other regulations from coming down the pipeline that are going 279 00:16:04,480 --> 00:16:08,960 Speaker 1: to affect you. So that approach seems to be working with 280 00:16:09,120 --> 00:16:12,040 Speaker 1: the Trump administration. As soon as Trump got into power, 281 00:16:12,560 --> 00:16:15,640 Speaker 1: he rescinded the executive order that Biden had put in 282 00:16:15,680 --> 00:16:19,200 Speaker 1: place on AI that called on tech companies to do 283 00:16:19,240 --> 00:16:22,720 Speaker 1: some very basic stuff around auditing of their algorithms. 284 00:16:22,840 --> 00:16:23,720 Speaker 5: That was scrapped. 285 00:16:24,200 --> 00:16:26,560 Speaker 1: And now there's of course talk of trying to prevent 286 00:16:26,720 --> 00:16:31,040 Speaker 1: states from implementing their own AI rules, because that might 287 00:16:31,120 --> 00:16:34,880 Speaker 1: create this kind of patchwork effect of regulation that will 288 00:16:35,000 --> 00:16:40,160 Speaker 1: hinder innovation. I personally find that completely exaggerated and overblown. 289 00:16:40,640 --> 00:16:43,040 Speaker 5: That has not been an issue in the past with 290 00:16:43,200 --> 00:16:44,200 Speaker 5: privacy laws. 291 00:16:44,680 --> 00:16:48,680 Speaker 1: Usually companies will find the state with the strictest law, 292 00:16:48,880 --> 00:16:52,640 Speaker 1: like California, and then they'll just implement that across the board; 293 00:16:52,680 --> 00:16:55,680 Speaker 1: it's very workable.
So I think this kind of narrative 294 00:16:55,800 --> 00:17:00,240 Speaker 1: spun by large companies, around if you regulate us you will 295 00:17:00,280 --> 00:17:02,040 Speaker 1: hinder innovation for everyone. 296 00:17:02,440 --> 00:17:04,119 Speaker 5: And we see this in Europe as well. 297 00:17:04,640 --> 00:17:07,720 Speaker 1: It's just not true, but somehow it has really taken 298 00:17:07,800 --> 00:17:10,200 Speaker 1: hold. With Europe, the real issue is we just don't 299 00:17:10,280 --> 00:17:13,040 Speaker 1: have enough funding on this side of the pond for 300 00:17:13,160 --> 00:17:13,920 Speaker 1: tech startups. 301 00:17:13,960 --> 00:17:15,160 Speaker 5: The issue is not regulation. 302 00:17:15,800 --> 00:17:19,440 Speaker 1: The real problem is there's a real vacuum of thoughtful, 303 00:17:19,720 --> 00:17:23,760 Speaker 1: specific regulation, rules and standards that are going to govern 304 00:17:23,840 --> 00:17:27,959 Speaker 1: how this incredibly transformative technology is used. It's addictive for some people, 305 00:17:28,000 --> 00:17:32,200 Speaker 1: it is actually affecting people's mental health already, and there are 306 00:17:32,240 --> 00:17:35,520 Speaker 1: no rules out there to govern how it is used. 307 00:17:47,840 --> 00:17:51,800 Speaker 3: Mario, you've been nodding patiently through some of this. I mean, clearly, 308 00:17:51,880 --> 00:17:55,000 Speaker 3: the sort of the buzzword as we end twenty twenty five, 309 00:17:55,119 --> 00:17:58,520 Speaker 3: at least sort of politically, has been affordability. And 310 00:17:58,600 --> 00:18:00,440 Speaker 3: there is a bit of a read-across from 311 00:18:00,480 --> 00:18:03,480 Speaker 3: this AI discussion.
And I know people who are being 312 00:18:04,359 --> 00:18:07,520 Speaker 3: told that the big increase in electricity supply that's going 313 00:18:07,560 --> 00:18:10,159 Speaker 3: to be needed for their area, which everyone knows is 314 00:18:10,240 --> 00:18:14,000 Speaker 3: due to data processing centers and big tech companies, at 315 00:18:14,040 --> 00:18:17,119 Speaker 3: least everyone in that area knows, is going to double 316 00:18:17,240 --> 00:18:20,320 Speaker 3: or sometimes even treble their electricity costs. Is that going 317 00:18:20,400 --> 00:18:23,119 Speaker 3: to be one of the things that could continue to 318 00:18:23,240 --> 00:18:26,760 Speaker 3: stoke questions about affordability and cost of living? 319 00:18:27,280 --> 00:18:30,199 Speaker 6: Absolutely. And I was nodding with just the comments that 320 00:18:30,240 --> 00:18:32,280 Speaker 6: you all have been making, just because I see the 321 00:18:32,359 --> 00:18:36,040 Speaker 6: through line of it all politically, right. So as Parmy 322 00:18:36,160 --> 00:18:39,119 Speaker 6: kind of mentioned or alluded to, you saw the influence 323 00:18:39,200 --> 00:18:42,600 Speaker 6: of the tech sector on display last week two ways, 324 00:18:42,680 --> 00:18:46,400 Speaker 6: right, with the easing of the restrictions on Nvidia's H200 325 00:18:46,440 --> 00:18:50,159 Speaker 6: chips. But then also Donald Trump, I 326 00:18:50,200 --> 00:18:56,160 Speaker 6: mean almost verbatim repeated the rationale for AI not 327 00:18:56,480 --> 00:18:59,359 Speaker 6: having a patchwork with the states, and that it would 328 00:18:59,600 --> 00:19:03,040 Speaker 6: inhibit the innovation of the industry as well, but 329 00:19:03,160 --> 00:19:05,919 Speaker 6: also the sugar high, right, that it provides in an 330 00:19:05,960 --> 00:19:09,480 Speaker 6: economy that Americans are pretty sour on and giving Trump 331 00:19:09,480 --> 00:19:12,919 Speaker 6: pretty bad marks.
The sugar high. That one bright spot 332 00:19:13,320 --> 00:19:16,920 Speaker 6: is that aides are showing the president the proliferation of 333 00:19:17,040 --> 00:19:20,639 Speaker 6: data centers, the building of data centers across the country, and 334 00:19:20,680 --> 00:19:23,720 Speaker 6: he sees that as a win, appealing to his sense 335 00:19:23,800 --> 00:19:26,760 Speaker 6: that the US can lead the way with this 336 00:19:26,920 --> 00:19:29,720 Speaker 6: leading-edge, this cutting-edge technology as well. 337 00:19:30,200 --> 00:19:33,560 Speaker 4: Now, the other side of that coin is 338 00:19:33,480 --> 00:19:35,760 Speaker 6: the fact that it leaves him vulnerable, he and the 339 00:19:35,800 --> 00:19:39,560 Speaker 6: party, vulnerable to arguments about affordability. Even as he touts 340 00:19:39,680 --> 00:19:46,720 Speaker 6: lower energy costs for gasoline, etc., electricity prices have skyrocketed, 341 00:19:46,760 --> 00:19:49,119 Speaker 6: and we saw that play out in some of the 342 00:19:49,200 --> 00:19:54,560 Speaker 6: elections that we saw in November, where Republicans lost pretty 343 00:19:54,560 --> 00:19:58,359 Speaker 6: handily in several places, and affordability became a salient issue 344 00:19:58,400 --> 00:20:02,720 Speaker 6: and electricity prices became a salient issue. The other part 345 00:20:02,760 --> 00:20:05,959 Speaker 6: of it politically is, as you both have alluded to, 346 00:20:06,920 --> 00:20:12,320 Speaker 6: Americans still don't quite understand all of the nuances of AI. 347 00:20:12,680 --> 00:20:16,880 Speaker 6: The administration, or the industry, hasn't been able to explain 348 00:20:16,960 --> 00:20:20,439 Speaker 6: it in a way that ordinary Americans can really grasp.
349 00:20:20,800 --> 00:20:23,560 Speaker 6: They see it as something that's fun. They see it 350 00:20:23,600 --> 00:20:26,320 Speaker 6: as something that can be confusing. They see it as 351 00:20:26,320 --> 00:20:28,720 Speaker 6: something that could be toxic to mental health at times. 352 00:20:29,160 --> 00:20:31,800 Speaker 4: They see it as a threat to their livelihoods and jobs. 353 00:20:31,960 --> 00:20:35,399 Speaker 6: Right, and then you add on the affordability issue as well, 354 00:20:35,480 --> 00:20:38,040 Speaker 6: and so we'll see that continue to play out in 355 00:20:38,040 --> 00:20:38,840 Speaker 6: twenty twenty six. 356 00:20:39,040 --> 00:20:43,440 Speaker 3: The President has been on something of an affordability tour, 357 00:20:43,640 --> 00:20:46,680 Speaker 3: quite a few big events on this, and yet it's 358 00:20:46,800 --> 00:20:51,840 Speaker 3: not clear what levers he's going to pull, because in 359 00:20:51,880 --> 00:20:54,920 Speaker 3: some ways he doesn't recognize that there is an affordability crisis. 360 00:20:55,000 --> 00:20:58,960 Speaker 3: As we go into the midterms, how crucial is that 361 00:20:59,160 --> 00:21:01,199 Speaker 3: going to be? Do you think that there's going to 362 00:21:01,240 --> 00:21:05,840 Speaker 3: be actual moves by the administration, more direct efforts, or 363 00:21:05,880 --> 00:21:07,439 Speaker 3: is the White House going to still feel like this 364 00:21:07,560 --> 00:21:10,640 Speaker 3: is really just about messaging more than substance? 365 00:21:11,880 --> 00:21:13,200 Speaker 4: I think a little bit of both, right.
366 00:21:13,240 --> 00:21:15,119 Speaker 6: I think the problem for the White House is the 367 00:21:15,160 --> 00:21:17,439 Speaker 6: fact that Donald Trump, going back to our interview with 368 00:21:17,520 --> 00:21:20,720 Speaker 6: him in summer of twenty twenty four, has looked at 369 00:21:20,800 --> 00:21:23,880 Speaker 6: tariffs and argued that tariffs don't have a material effect 370 00:21:23,960 --> 00:21:27,600 Speaker 6: on consumer prices. Well, what you've seen over the last 371 00:21:27,680 --> 00:21:30,959 Speaker 6: month and change or so is some of the reversals 372 00:21:31,080 --> 00:21:35,240 Speaker 6: on foodstuffs, right, bananas, coffee as well. So you 373 00:21:35,320 --> 00:21:38,320 Speaker 6: see the administration also trying to pull levers on things 374 00:21:38,359 --> 00:21:41,240 Speaker 6: like beef. So it's an acknowledgment in some ways that 375 00:21:41,480 --> 00:21:46,160 Speaker 6: tariffs are having this additive cost to the consumer from 376 00:21:46,280 --> 00:21:49,160 Speaker 6: the administration, even as Trump has argued otherwise. So you've got 377 00:21:49,160 --> 00:21:53,600 Speaker 6: this discordant message coming from the administration. And quite frankly, 378 00:21:53,880 --> 00:21:56,760 Speaker 6: if anyone saw the president's speech at the rally, his heart 379 00:21:56,960 --> 00:22:00,679 Speaker 6: just wasn't in this affordability message. I mean, he all 380 00:22:00,800 --> 00:22:04,000 Speaker 6: but said that his chief of staff, Susie Wiles, told 381 00:22:04,040 --> 00:22:06,440 Speaker 6: him, you need to go out here and campaign on this. 382 00:22:06,960 --> 00:22:10,040 Speaker 6: So he almost sounded like someone who was grudgingly told you've 383 00:22:10,080 --> 00:22:12,720 Speaker 6: got to go outside and do something that you really 384 00:22:12,720 --> 00:22:13,280 Speaker 6: don't want to do.
385 00:22:14,840 --> 00:22:17,600 Speaker 3: It's convenient for the purposes of this podcast to feel 386 00:22:17,600 --> 00:22:19,880 Speaker 3: that the economy is central to everything, but we should step 387 00:22:19,920 --> 00:22:21,560 Speaker 3: back and ask: what is the state of the US 388 00:22:21,640 --> 00:22:23,840 Speaker 3: economy going into twenty six? I mean, I know many 389 00:22:23,840 --> 00:22:27,400 Speaker 3: of us, certainly a year ago, if we'd been told 390 00:22:27,520 --> 00:22:29,480 Speaker 3: what the administration was going to do, what was going 391 00:22:29,560 --> 00:22:32,479 Speaker 3: to happen in terms of tariffs and uncertainty and other things, 392 00:22:32,800 --> 00:22:36,200 Speaker 3: we probably would not have expected it to be as 393 00:22:36,240 --> 00:22:39,680 Speaker 3: strong or inflation to be as low. But what does 394 00:22:39,720 --> 00:22:42,080 Speaker 3: it look like now, and what are you watching going 395 00:22:42,080 --> 00:22:45,119 Speaker 3: into twenty six on just the strength of the economy? 396 00:22:46,000 --> 00:22:48,040 Speaker 2: So I think there's quite a lot to be optimistic 397 00:22:48,040 --> 00:22:50,920 Speaker 2: about on the US economy heading into twenty twenty six. 398 00:22:51,040 --> 00:22:53,600 Speaker 2: Inflation's a bit higher than the FED would like it 399 00:22:53,640 --> 00:22:57,200 Speaker 2: to be, but it's certainly not spiraling higher. Growth's pretty 400 00:22:57,280 --> 00:23:02,480 Speaker 2: robust looking into twenty twenty six. We've got easier monetary conditions. 401 00:23:02,840 --> 00:23:06,040 Speaker 2: The FED is cutting interest rates. We think they're going 402 00:23:06,080 --> 00:23:08,719 Speaker 2: to be cutting quite a lot more in the year ahead. 403 00:23:09,000 --> 00:23:13,080 Speaker 2: We've got supportive fiscal policy, the one big beautiful bill 404 00:23:13,720 --> 00:23:18,600 Speaker 2: cutting taxes for big US businesses.
We've got a bonfire 405 00:23:18,920 --> 00:23:22,000 Speaker 2: of regulations. We can argue about whether or not that's 406 00:23:22,160 --> 00:23:26,880 Speaker 2: socially optimal, but certainly it sparks animal spirits. 407 00:23:27,040 --> 00:23:30,640 Speaker 2: And of course we've got the AI revolution and all 408 00:23:30,680 --> 00:23:34,760 Speaker 2: that means for market optimism and for capital spending. So 409 00:23:34,800 --> 00:23:37,200 Speaker 2: if you put all of those things together, I think 410 00:23:37,359 --> 00:23:40,480 Speaker 2: the base case for US growth in twenty twenty six, 411 00:23:40,640 --> 00:23:45,040 Speaker 2: US growth heading towards the midterms, is pretty optimistic. There's 412 00:23:45,080 --> 00:23:50,479 Speaker 2: stuff that could go wrong. AI optimism could evaporate. A 413 00:23:50,520 --> 00:23:56,000 Speaker 2: new FED chair could potentially overplay their hand, bringing concerns 414 00:23:56,000 --> 00:24:01,760 Speaker 2: about lost FED independence back into perspective, pushing interest rates higher. 415 00:24:02,000 --> 00:24:05,360 Speaker 2: So some risks there, but the base case heading into 416 00:24:05,400 --> 00:24:07,720 Speaker 2: twenty twenty six, I think, is pretty optimistic. 417 00:24:08,800 --> 00:24:11,119 Speaker 3: Well, you mentioned the FED, and obviously the independence of 418 00:24:11,119 --> 00:24:13,880 Speaker 3: the FED has been a great focus. There's been this 419 00:24:14,000 --> 00:24:17,320 Speaker 3: kind of horse race around who was going to be 420 00:24:17,440 --> 00:24:20,520 Speaker 3: named by the President as the new chairman of the FED. 421 00:24:20,720 --> 00:24:23,440 Speaker 3: The betting as we're recording this is still on Kevin 422 00:24:23,480 --> 00:24:25,840 Speaker 3: Hassett, but there's been some desire from the White 423 00:24:25,880 --> 00:24:30,360 Speaker 3: House to still preserve the element of surprise.
But that's 424 00:24:30,400 --> 00:24:32,960 Speaker 3: a power that the president has always had. But there 425 00:24:33,000 --> 00:24:36,199 Speaker 3: is uncertainty about whether the president also has the power 426 00:24:36,280 --> 00:24:39,439 Speaker 3: to fire members of the FED Board. I mean, Mario, 427 00:24:39,480 --> 00:24:42,520 Speaker 3: there are some pretty crucial Supreme Court decisions coming down the track, 428 00:24:42,560 --> 00:24:43,920 Speaker 3: and that's one of them. 429 00:24:44,640 --> 00:24:45,680 Speaker 4: Yes, absolutely. 430 00:24:45,840 --> 00:24:49,040 Speaker 6: One of the cases stems from a nineteen thirties era law 431 00:24:49,520 --> 00:24:55,200 Speaker 6: that prohibited presidents from firing people at independent agencies of government. 432 00:24:55,640 --> 00:24:58,680 Speaker 6: This one is an FTC case that we're quite eager 433 00:24:58,720 --> 00:25:01,159 Speaker 6: to get the results of, but the Supreme Court is 434 00:25:01,200 --> 00:25:05,040 Speaker 6: already signaling that it will likely side with the president. 435 00:25:05,200 --> 00:25:11,520 Speaker 6: Another extraordinary expansion of executive authority that we've seen. Another case, obviously, 436 00:25:11,720 --> 00:25:15,879 Speaker 6: is the president's use of IEEPA in order to enact 437 00:25:15,920 --> 00:25:20,160 Speaker 6: some of the tariffs. He's characterized that as essentially make 438 00:25:20,240 --> 00:25:23,040 Speaker 6: or break for the country. We'll see whether or not 439 00:25:23,119 --> 00:25:26,640 Speaker 6: the Supreme Court sides with him, but make 440 00:25:26,760 --> 00:25:27,960 Speaker 6: no mistake about it, 441 00:25:28,000 --> 00:25:29,000 Speaker 4: what we've seen is 442 00:25:29,040 --> 00:25:32,040 Speaker 6: this Supreme Court grant the president just an 443 00:25:32,160 --> 00:25:34,080 Speaker 6: enormous amount of executive leeway.
444 00:25:34,119 --> 00:25:37,399 Speaker 3: Here, Tom, there's the Lisa Cook judgment. There's been some 445 00:25:37,880 --> 00:25:40,440 Speaker 3: question about whether or not there would be a line 446 00:25:40,520 --> 00:25:44,800 Speaker 3: drawn between the FED and the other independent, or supposedly so 447 00:25:44,880 --> 00:25:50,560 Speaker 3: called independent, agencies, where the Supreme Court appears set to say 448 00:25:50,600 --> 00:25:52,679 Speaker 3: that the president can kind of do what he wants 449 00:25:52,720 --> 00:25:56,080 Speaker 3: with almost all the agencies. But they have suggested earlier 450 00:25:56,119 --> 00:25:58,720 Speaker 3: in the year that the Federal Reserve had a slightly 451 00:25:58,720 --> 00:26:02,879 Speaker 3: different status, which might affect whether they ruled differently in 452 00:26:02,920 --> 00:26:05,480 Speaker 3: the case of Lisa Cook, which we've talked about before 453 00:26:05,240 --> 00:26:06,360 Speaker 4: on this program. 454 00:26:06,400 --> 00:26:09,160 Speaker 3: But how important is that, do you think, in terms 455 00:26:09,160 --> 00:26:11,440 Speaker 3: of the way the Fed is viewed? Because at the moment, 456 00:26:11,480 --> 00:26:15,000 Speaker 3: obviously the markets seem to be quite relaxed about the 457 00:26:15,040 --> 00:26:17,840 Speaker 3: pressure that the White House is putting on the Central Bank. 458 00:26:18,760 --> 00:26:21,720 Speaker 2: So a few years ago we talked about the twilight 459 00:26:21,760 --> 00:26:25,320 Speaker 2: of the economic idols, right, a kind of slightly grandiose term. 460 00:26:25,400 --> 00:26:29,159 Speaker 2: Can't remember who came up with it, but the idea was, well, 461 00:26:29,520 --> 00:26:34,040 Speaker 2: the populists have smashed down the door of so many institutions, 462 00:26:34,160 --> 00:26:41,159 Speaker 2: right, the universities, the media, the law firms, the scientific profession.
463 00:26:41,640 --> 00:26:44,520 Speaker 2: Maybe they'll come for the central banks. Right, the Federal 464 00:26:44,560 --> 00:26:48,760 Speaker 2: Reserve has failed on its basic task of controlling inflation. 465 00:26:49,440 --> 00:26:52,080 Speaker 2: The Federal Reserve back then had to kick out two 466 00:26:52,200 --> 00:26:57,320 Speaker 2: members of its governing council because of concerns about insider trading. 467 00:26:57,520 --> 00:27:01,840 Speaker 2: This is a vulnerable institution. The populists have knocked over 468 00:27:01,920 --> 00:27:04,680 Speaker 2: so many other institutions. Maybe they'll come for the FED 469 00:27:04,760 --> 00:27:07,280 Speaker 2: as well. And sure enough, here we are at the 470 00:27:07,359 --> 00:27:10,119 Speaker 2: end of twenty twenty five, looking ahead into twenty twenty six, 471 00:27:10,280 --> 00:27:13,800 Speaker 2: and the populists are certainly banging at the Fed's door. 472 00:27:14,080 --> 00:27:14,359 Speaker 4: Right. 473 00:27:14,440 --> 00:27:18,040 Speaker 2: We already have Stephen Miran, the chair of the Council 474 00:27:18,040 --> 00:27:21,800 Speaker 2: of Economic Advisers for President Trump, serving on the FED board. 475 00:27:22,160 --> 00:27:25,480 Speaker 2: We have the court case against Lisa Cook that you mentioned, 476 00:27:25,720 --> 00:27:29,160 Speaker 2: which could open up another board seat for the President 477 00:27:29,200 --> 00:27:32,840 Speaker 2: to fill. And we have Chair Powell exiting in May 478 00:27:33,119 --> 00:27:36,320 Speaker 2: and scope for the President to appoint his own chair. 479 00:27:36,520 --> 00:27:39,520 Speaker 2: And whoever he picks, no matter how credible they are, 480 00:27:39,920 --> 00:27:43,080 Speaker 2: is going to be laboring under the suspicion that they've 481 00:27:43,119 --> 00:27:46,480 Speaker 2: made some kind of low-rate loyalty pledge to the 482 00:27:46,520 --> 00:27:48,480 Speaker 2: president in order to get the job.
483 00:27:48,800 --> 00:27:49,080 Speaker 4: Now. 484 00:27:49,600 --> 00:27:53,640 Speaker 2: We haven't seen the markets reacting to this much so far, 485 00:27:54,080 --> 00:27:58,440 Speaker 2: but it really is quite consequential. An independent Federal Reserve 486 00:27:58,840 --> 00:28:03,240 Speaker 2: is a fundamental underpinning of market confidence that the US 487 00:28:03,320 --> 00:28:07,800 Speaker 2: will be serious about controlling inflation. And that's a fundamental 488 00:28:07,880 --> 00:28:10,800 Speaker 2: underpinning of the role of the dollar as the world's 489 00:28:10,800 --> 00:28:15,680 Speaker 2: reserve currency and US treasuries as the world's safe asset. 490 00:28:16,040 --> 00:28:20,040 Speaker 2: If that confidence is undermined, well, the status of the dollar, 491 00:28:20,600 --> 00:28:24,159 Speaker 2: the status of the treasury market, are both open to question. 492 00:28:24,560 --> 00:28:27,320 Speaker 3: This kind of maximalist view of presidential power that we've 493 00:28:27,320 --> 00:28:30,360 Speaker 3: seen so far, Mario, I mean, how does that play politically? 494 00:28:30,359 --> 00:28:32,840 Speaker 3: Do you think there is a chance of this Congress, 495 00:28:33,000 --> 00:28:35,760 Speaker 3: previously rather feeble, biting back? 496 00:28:36,800 --> 00:28:39,600 Speaker 6: Yeah, that's a great word for it. It's been mostly 497 00:28:39,640 --> 00:28:42,640 Speaker 6: feeble for the better part of this year. But we 498 00:28:42,800 --> 00:28:46,000 Speaker 6: have seen signs of life in recent months. 499 00:28:46,080 --> 00:28:46,240 Speaker 4: Right. 500 00:28:46,280 --> 00:28:50,440 Speaker 6: We saw, obviously, the evergreen here is the split between 501 00:28:50,560 --> 00:28:53,920 Speaker 6: the president and Marjorie Taylor Greene.
We're also seeing it at 502 00:28:53,920 --> 00:28:58,280 Speaker 6: the state level, and if democracy functions the way it 503 00:28:58,320 --> 00:29:02,160 Speaker 6: should, it could funnel up to Congress as well. 504 00:29:02,200 --> 00:29:05,160 Speaker 6: But we've seen pushback at some of the redistricting efforts 505 00:29:05,200 --> 00:29:10,560 Speaker 6: as well. We've seen the Congress, bipartisan, Republicans even, kind 506 00:29:10,600 --> 00:29:13,920 Speaker 6: of blanching a bit at the boat strikes in the Caribbean, 507 00:29:14,400 --> 00:29:17,760 Speaker 6: particularly the one in which there was what's called a double 508 00:29:17,840 --> 00:29:20,720 Speaker 6: tap on a boat as well. 509 00:29:20,800 --> 00:29:24,360 Speaker 6: And we've also started to see, during 510 00:29:24,400 --> 00:29:27,440 Speaker 6: the shutdown, the President call for Republicans to eliminate the 511 00:29:27,480 --> 00:29:30,840 Speaker 6: filibuster; that was a line too far for them as well, 512 00:29:30,920 --> 00:29:33,880 Speaker 6: one they didn't heed. And then obviously, and again I 513 00:29:33,960 --> 00:29:38,120 Speaker 6: mentioned Marjorie Taylor Greene, the fact that the Epstein files 514 00:29:38,200 --> 00:29:40,960 Speaker 6: and the legislation there, they forced the President to do 515 00:29:41,000 --> 00:29:43,840 Speaker 6: an about-face on that issue as well. So we're 516 00:29:43,920 --> 00:29:48,680 Speaker 6: starting to see some semblance of life.
Maybe that's because 517 00:29:48,720 --> 00:29:52,240 Speaker 6: of what the polls are bearing out, with some erosion 518 00:29:52,440 --> 00:29:55,920 Speaker 6: among parts of the MAGA base, the pressure that we 519 00:29:56,080 --> 00:30:00,720 Speaker 6: see on congresspeople, representatives, as we head toward the 520 00:30:00,760 --> 00:30:05,400 Speaker 6: midterms as well. So with that kind of tension, we 521 00:30:05,440 --> 00:30:07,640 Speaker 6: may see some checks there on the president's power. 522 00:30:08,160 --> 00:30:11,240 Speaker 3: You've both mentioned that the White House has been trying 523 00:30:11,280 --> 00:30:14,320 Speaker 3: to establish that it's going to be the president and 524 00:30:14,400 --> 00:30:18,640 Speaker 3: certainly the federal government that dictates a single set of rules 525 00:30:19,360 --> 00:30:22,760 Speaker 3: for AI. But we also know there are plenty of individual states, 526 00:30:22,800 --> 00:30:25,760 Speaker 3: not least California, and indeed some in Congress, that would 527 00:30:25,800 --> 00:30:28,000 Speaker 3: like to have more of a role and not just 528 00:30:28,120 --> 00:30:30,120 Speaker 3: leave it to the president. What should we be watching 529 00:30:30,200 --> 00:30:33,400 Speaker 3: on that? What are the key debates we should be 530 00:30:33,440 --> 00:30:35,960 Speaker 3: looking out for next year if we're kind of concerned 531 00:30:36,000 --> 00:30:38,800 Speaker 3: about the form that AI is going to take?
532 00:30:39,600 --> 00:30:44,400 Speaker 1: Personally, I think this allowing states to roll out their 533 00:30:44,400 --> 00:30:47,479 Speaker 1: own laws is actually quite a healthy approach, because then 534 00:30:47,520 --> 00:30:51,080 Speaker 1: it almost becomes like a lab. Each state becomes like 535 00:30:51,160 --> 00:30:55,040 Speaker 1: its own laboratory for running an experiment on what law 536 00:30:55,120 --> 00:30:58,480 Speaker 1: works best, whether it's on tackling deep fakes or deep 537 00:30:58,520 --> 00:31:02,280 Speaker 1: fake porn or spam or fraud, or whatever approach you 538 00:31:02,320 --> 00:31:04,760 Speaker 1: want to take. And there are so many different ways 539 00:31:04,760 --> 00:31:11,680 Speaker 1: to approach AI. Centralizing regulation isn't always the best approach. 540 00:31:11,760 --> 00:31:14,680 Speaker 1: And I hate to say this, because when the European 541 00:31:14,760 --> 00:31:17,240 Speaker 1: Union first rolled out its AI Act, I was so 542 00:31:17,320 --> 00:31:20,720 Speaker 1: excited and I thought, this is the first comprehensive law 543 00:31:20,840 --> 00:31:23,560 Speaker 1: addressing AI, and it's so great and we need this 544 00:31:23,640 --> 00:31:25,680 Speaker 1: and they're moving so quickly. And that was like 545 00:31:25,760 --> 00:31:28,160 Speaker 1: maybe two and a half years ago, and now I'm 546 00:31:28,200 --> 00:31:32,040 Speaker 1: just so disappointed in how it has turned out, because 547 00:31:32,800 --> 00:31:36,440 Speaker 1: they've delayed it, they've blamed the standards organizations that are 548 00:31:36,440 --> 00:31:38,360 Speaker 1: trying to make it more specific, which is what it 549 00:31:38,400 --> 00:31:41,600 Speaker 1: really needs. It's all just been quite a big disaster.
550 00:31:42,360 --> 00:31:45,480 Speaker 1: But to answer your question about what to watch out for, 551 00:31:46,160 --> 00:31:49,479 Speaker 1: I would also just watch out for what's happening in 552 00:31:49,520 --> 00:31:52,800 Speaker 1: the court system, because some of the lawsuits that have 553 00:31:52,920 --> 00:31:56,960 Speaker 1: been brought, that have targeted OpenAI, that have 554 00:31:57,000 --> 00:32:00,720 Speaker 1: targeted Anthropic over copyright infringement, I mean, these are pretty 555 00:32:00,720 --> 00:32:05,440 Speaker 1: big cases. There are several cases against 556 00:32:05,560 --> 00:32:09,240 Speaker 1: OpenAI from more than a dozen families of people 557 00:32:09,680 --> 00:32:13,360 Speaker 1: who've experienced some kind of serious mental health 558 00:32:13,680 --> 00:32:17,360 Speaker 1: harm from using ChatGPT, and these are getting a 559 00:32:17,400 --> 00:32:20,760 Speaker 1: lot of attention, and OpenAI has publicly addressed them 560 00:32:20,760 --> 00:32:22,160 Speaker 1: and talked about making changes. 561 00:32:22,440 --> 00:32:24,040 Speaker 5: So I almost feel like, actually, the 562 00:32:24,000 --> 00:32:27,240 Speaker 1: court system is having much more of an impact than 563 00:32:27,280 --> 00:32:29,720 Speaker 1: any kind of legislation is right now. 564 00:32:30,680 --> 00:32:33,280 Speaker 3: One of the surprises that we've had over the last 565 00:32:33,720 --> 00:32:36,720 Speaker 3: few years actually has been the strength, the relative strength, 566 00:32:36,880 --> 00:32:38,600 Speaker 3: of US productivity 567 00:32:38,640 --> 00:32:39,040 Speaker 4: growth. 568 00:32:39,240 --> 00:32:41,560 Speaker 3: Economists tend to be obsessed with it, because making more 569 00:32:41,560 --> 00:32:43,600 Speaker 3: stuff with the same number of people is how you 570 00:32:43,840 --> 00:32:47,520 Speaker 3: get richer.
A key part of having AI feed into 571 00:32:47,840 --> 00:32:50,760 Speaker 3: continued productivity growth is going to be, as you mentioned, 572 00:32:50,840 --> 00:32:54,240 Speaker 3: the diffusion across the economy. Historically, 573 00:32:55,000 --> 00:32:57,479 Speaker 3: if I look at Europe, certainly looking at 574 00:32:57,480 --> 00:32:59,880 Speaker 3: the UK, but I think in other places as well, 575 00:33:00,440 --> 00:33:03,680 Speaker 3: it's the diffusion of technology that's been lacking. Actually, the 576 00:33:03,680 --> 00:33:07,360 Speaker 3: best companies have had the best know-how, but it's 577 00:33:07,400 --> 00:33:10,080 Speaker 3: not spread through the economy nearly as well as it 578 00:33:10,160 --> 00:33:12,920 Speaker 3: has in the US. Regulation is going to be an 579 00:33:12,920 --> 00:33:15,200 Speaker 3: important part of a rollout, but its diffusion is the 580 00:33:15,240 --> 00:33:18,760 Speaker 3: other big piece. What should we be looking at there? 581 00:33:18,800 --> 00:33:21,600 Speaker 3: Because obviously it also feeds into whether at least 582 00:33:21,600 --> 00:33:25,120 Speaker 3: some of those high prices for AI companies are justified. 583 00:33:25,880 --> 00:33:27,959 Speaker 5: Yeah, that's a really great point to bring up, 584 00:33:28,000 --> 00:33:31,240 Speaker 1: actually, and it's a point that politicians and lawmakers have 585 00:33:31,280 --> 00:33:34,920 Speaker 1: been making: that actually, in Europe, we might not be 586 00:33:35,080 --> 00:33:38,400 Speaker 1: developing the big models that everybody's using in Silicon Valley, 587 00:33:38,920 --> 00:33:42,520 Speaker 1: but hey, let's use that as an opportunity.
Let's capitalize 588 00:33:42,520 --> 00:33:45,320 Speaker 1: on all this money that Silicon Valley is spending and 589 00:33:45,360 --> 00:33:48,960 Speaker 1: all this infrastructure, and let's just use this technology to 590 00:33:49,000 --> 00:33:52,400 Speaker 1: make ourselves more productive. So to your point exactly, this 591 00:33:52,440 --> 00:33:55,760 Speaker 1: is about winning the AI race. What does that even mean? 592 00:33:56,400 --> 00:33:59,400 Speaker 1: Does it mean that you have developed the most capable 593 00:33:59,400 --> 00:34:03,240 Speaker 1: AI model from a company in your country? Or does 594 00:34:03,280 --> 00:34:05,040 Speaker 1: it just mean that the citizens of your country have 595 00:34:05,080 --> 00:34:08,480 Speaker 1: actually adopted this technology much more quickly than anyone else 596 00:34:09,040 --> 00:34:12,200 Speaker 1: and are exploiting it much more quickly? The government of Estonia, 597 00:34:12,320 --> 00:34:14,680 Speaker 1: for instance. They've a population of one and a half 598 00:34:14,760 --> 00:34:16,960 Speaker 1: million people, and they're the first in the world to 599 00:34:17,040 --> 00:34:20,240 Speaker 1: roll out ChatGPT to all their schools. It's an 600 00:34:20,360 --> 00:34:24,480 Speaker 1: educational version of ChatGPT. And someone in Silicon 601 00:34:24,560 --> 00:34:27,880 Speaker 1: Valley has built this, but they're tweaking it. They're trying 602 00:34:27,920 --> 00:34:30,759 Speaker 1: to make it less of an answer engine and more 603 00:34:30,760 --> 00:34:33,640 Speaker 1: of a question engine, where the bot doesn't just give 604 00:34:33,680 --> 00:34:35,799 Speaker 1: the kids the answer all the time, but asks them 605 00:34:35,880 --> 00:34:38,560 Speaker 1: questions to get them thinking more.
Now, if they can 606 00:34:38,600 --> 00:34:40,200 Speaker 1: make that work, they're going to be doing the rest 607 00:34:40,239 --> 00:34:42,560 Speaker 1: of the world a favor, but they'll also be doing 608 00:34:42,560 --> 00:34:46,280 Speaker 1: a huge favor to their students and to their educational system. 609 00:34:58,239 --> 00:35:01,239 Speaker 3: We've talked about on this show the possibility of a 610 00:35:01,280 --> 00:35:04,879 Speaker 3: stock market correction, or even something bigger than that, due 611 00:35:04,920 --> 00:35:09,920 Speaker 3: to reduced confidence in that handful of AI companies, tech companies. 612 00:35:10,360 --> 00:35:12,560 Speaker 3: Tom, I know that the economists have done a bit 613 00:35:12,560 --> 00:35:15,400 Speaker 3: of analysis on this. What's our guess in terms of 614 00:35:15,400 --> 00:35:17,560 Speaker 3: what the impact on the real economy would be of 615 00:35:17,600 --> 00:35:19,960 Speaker 3: that kind of stock market correction? I mean, given that, 616 00:35:20,000 --> 00:35:22,680 Speaker 3: I remember, we saw in the tech bubble bursting 617 00:35:23,000 --> 00:35:25,720 Speaker 3: in two thousand and two thousand and one that actually 618 00:35:25,760 --> 00:35:28,200 Speaker 3: didn't have much impact on the economy. Could we expect 619 00:35:28,239 --> 00:35:29,400 Speaker 3: something like that this time? 620 00:35:30,800 --> 00:35:31,000 Speaker 4: Yeah. 621 00:35:31,040 --> 00:35:33,640 Speaker 2: So it's a big question, Stephanie, and it's one where 622 00:35:33,680 --> 00:35:36,920 Speaker 2: it's all too easy to wave your hands around in generalities. 623 00:35:37,080 --> 00:35:40,560 Speaker 2: What we've done at Bloomberg Economics is take the 624 00:35:40,719 --> 00:35:44,319 Speaker 2: Fed's big model of the US economy, which is called FRB/US, 625 00:35:44,520 --> 00:35:47,319 Speaker 2: and we modeled a specific scenario.
We looked at what 626 00:35:47,320 --> 00:35:51,480 Speaker 2: would happen if stock prices dropped around twenty to twenty-five 627 00:35:51,520 --> 00:35:54,520 Speaker 2: percent and there was a widening of credit spreads for 628 00:35:54,600 --> 00:35:58,000 Speaker 2: corporate borrowers. And if you plug that shock into the 629 00:35:58,040 --> 00:36:01,560 Speaker 2: FRB/US model, what it tells you is, well, you have 630 00:36:01,600 --> 00:36:04,520 Speaker 2: a blow to confidence, you have a blow to wealth 631 00:36:04,719 --> 00:36:09,280 Speaker 2: that starts hitting consumption. It's more expensive for corporates to borrow, 632 00:36:09,360 --> 00:36:12,880 Speaker 2: so it starts hitting investment as well. And as those 633 00:36:12,920 --> 00:36:16,239 Speaker 2: shocks hit the economy, in the end you have a 634 00:36:16,280 --> 00:36:19,520 Speaker 2: blow to GDP growth of around zero point seventy five 635 00:36:19,560 --> 00:36:22,920 Speaker 2: percent and a blow to employment, or a boost to 636 00:36:23,000 --> 00:36:26,680 Speaker 2: unemployment, of around zero point five percent. So those are 637 00:36:26,719 --> 00:36:30,480 Speaker 2: macro significant shocks. You'd see them in the numbers. And 638 00:36:30,800 --> 00:36:34,160 Speaker 2: if you have a snowball effect, well, things could be 639 00:36:34,239 --> 00:36:38,240 Speaker 2: even worse still. Those numbers are numbers which would damage 640 00:36:38,280 --> 00:36:41,760 Speaker 2: the growth trajectory for the US economy in twenty twenty six; 641 00:36:41,960 --> 00:36:43,800 Speaker 2: they wouldn't entirely derail 642 00:36:43,440 --> 00:36:46,320 Speaker 3: it. And in the short term, if the bubble did burst, 643 00:36:46,480 --> 00:36:49,239 Speaker 3: or at least you had a big correction, given that, 644 00:36:49,280 --> 00:36:53,200 Speaker 3: relative to other countries, there's quite widespread holding of stocks.
645 00:36:53,239 --> 00:36:56,160 Speaker 3: There are a lot of households that feel that they've 646 00:36:56,200 --> 00:36:58,960 Speaker 3: got quite a lot richer as a result of this, 647 00:36:59,239 --> 00:37:01,080 Speaker 3: the run-up in stocks. Do you think we 648 00:37:01,080 --> 00:37:03,759 Speaker 3: should worry about the short-term economic consequence of that? 649 00:37:04,640 --> 00:37:06,760 Speaker 2: Yeah, I think I'd worry about it in a couple 650 00:37:06,800 --> 00:37:10,080 Speaker 2: of respects, Stephanie. The first, as you mentioned, is the 651 00:37:10,120 --> 00:37:14,080 Speaker 2: stock market. US households hold a lot of stocks. They 652 00:37:14,120 --> 00:37:17,759 Speaker 2: feel richer, they feel more inclined to consume, if the 653 00:37:17,800 --> 00:37:20,440 Speaker 2: stock market is going up. If the stock market is 654 00:37:20,440 --> 00:37:25,080 Speaker 2: going down, that dynamic swings into reverse. Secondly, the funding 655 00:37:25,440 --> 00:37:29,279 Speaker 2: for the AI infrastructure build-out is also tied to 656 00:37:29,760 --> 00:37:34,720 Speaker 2: market confidence in AI, right. And if that confidence evaporates, well, 657 00:37:35,239 --> 00:37:38,480 Speaker 2: companies like Google and Meta are still going to have 658 00:37:38,520 --> 00:37:42,200 Speaker 2: the money to spend, but there's an entire ecosystem of 659 00:37:42,360 --> 00:37:45,400 Speaker 2: other AI startups which are going to suffer, and that 660 00:37:45,440 --> 00:37:47,240 Speaker 2: would be an additional negative for growth.
661 00:37:48,040 --> 00:37:50,319 Speaker 3: There was a lot of discussion earlier in the year, 662 00:37:50,600 --> 00:37:53,879 Speaker 3: especially in the wake of the Big Beautiful Bill, that 663 00:37:54,600 --> 00:37:58,200 Speaker 3: there would be pressure on the bond market, and indeed 664 00:37:58,239 --> 00:38:00,279 Speaker 3: you would expect it if you look at the high level 665 00:38:00,280 --> 00:38:03,000 Speaker 3: of government borrowing in the US at a time when 666 00:38:03,000 --> 00:38:07,720 Speaker 3: the economy is pretty strong. There's certainly a very unsustainable 667 00:38:07,760 --> 00:38:10,080 Speaker 3: debt path. You see the path of debt as a 668 00:38:10,120 --> 00:38:12,360 Speaker 3: share of GDP just continuing to go up, which is 669 00:38:12,440 --> 00:38:16,480 Speaker 3: kind of by definition unsustainable. And yet it's one of 670 00:38:16,520 --> 00:38:20,640 Speaker 3: the few countries, certainly developed economies, where the ten year 671 00:38:21,040 --> 00:38:24,759 Speaker 3: yield in the second half of the year was lower 672 00:38:24,880 --> 00:38:26,759 Speaker 3: than it had 673 00:38:26,800 --> 00:38:29,600 Speaker 3: been at the start. Most governments are sort of dealing 674 00:38:29,600 --> 00:38:31,839 Speaker 3: with a higher cost of borrowing. That has been less 675 00:38:31,840 --> 00:38:35,000 Speaker 3: true in the US. Despite all of this, Mario, does 676 00:38:35,040 --> 00:38:38,400 Speaker 3: the administration kind of feel that it doesn't have to 677 00:38:38,440 --> 00:38:41,799 Speaker 3: worry about fiscal risks at all, and is it going 678 00:38:41,840 --> 00:38:44,400 Speaker 3: to feel sort of vindicated by the fact that, despite 679 00:38:45,120 --> 00:38:47,879 Speaker 3: the passing of that bill, you know, investors just don't 680 00:38:47,880 --> 00:38:49,040 Speaker 3: seem to be worried about it.
681 00:38:50,680 --> 00:38:53,799 Speaker 6: So as you say, one feature of both Trump one point 682 00:38:53,880 --> 00:38:56,040 Speaker 6: zero and two point zero is just the absence of 683 00:38:56,080 --> 00:38:59,200 Speaker 6: the president speaking about fiscal policy. That's something that had 684 00:38:59,239 --> 00:39:03,920 Speaker 6: traditionally been a tried and true point of American conservatism. 685 00:39:04,400 --> 00:39:07,120 Speaker 6: When it has come up in Trump two point zero, 686 00:39:07,160 --> 00:39:10,200 Speaker 6: you've seen the president rationalize it with both his 687 00:39:10,320 --> 00:39:13,840 Speaker 6: tariff policy, saying that the US is going to 688 00:39:13,880 --> 00:39:18,279 Speaker 6: take in these massive amounts of tariff funds, but 689 00:39:18,320 --> 00:39:22,360 Speaker 6: then also the output from the One Big Beautiful Bill, 690 00:39:22,520 --> 00:39:26,400 Speaker 6: the supercharging of the economy from his different policies, the 691 00:39:26,520 --> 00:39:32,719 Speaker 6: regulatory easing as well, and he's rationalized it in that way. 692 00:39:33,680 --> 00:39:35,000 Speaker 3: One thing I'm going to be watching.
I don't know 693 00:39:35,000 --> 00:39:40,440 Speaker 3: about you, Tom, but we might see, ironically, if 694 00:39:40,440 --> 00:39:44,480 Speaker 3: the president's power is kind of checked by the midterms, 695 00:39:45,000 --> 00:39:48,080 Speaker 3: you might find that the last few months of next 696 00:39:48,160 --> 00:39:51,799 Speaker 3: year are actually a tougher time in terms of the 697 00:39:51,800 --> 00:39:57,280 Speaker 3: bond market and investors than the previous few months, because 698 00:39:57,880 --> 00:40:00,680 Speaker 3: we saw in the UK that it's not just the 699 00:40:00,680 --> 00:40:02,919 Speaker 3: policies you have, it's kind of the way it's being done, 700 00:40:03,040 --> 00:40:05,840 Speaker 3: and the feeling that there are no constraints or no possible 701 00:40:06,000 --> 00:40:08,800 Speaker 3: checks if you end up with sort of genuine gridlock 702 00:40:08,960 --> 00:40:13,240 Speaker 3: in Congress at a time when naturally there are also questions 703 00:40:13,280 --> 00:40:18,200 Speaker 3: about whether the Fed is going to tighten policy as needed 704 00:40:18,440 --> 00:40:21,480 Speaker 3: if inflation starts heading up again, which many people see 705 00:40:21,520 --> 00:40:23,879 Speaker 3: as potentially happening in the second half of next year. 706 00:40:24,480 --> 00:40:26,880 Speaker 3: I think things could get a little bit ugly in 707 00:40:26,920 --> 00:40:30,840 Speaker 3: the bond markets. Ironically, even though in theory the president 708 00:40:30,880 --> 00:40:34,200 Speaker 3: looks a bit less powerful, that will actually worry some 709 00:40:34,440 --> 00:40:36,440 Speaker 3: in the markets. But who knows. Tom, what are you 710 00:40:36,520 --> 00:40:39,320 Speaker 3: going to be watching? What's the kind of wild card?
711 00:40:39,640 --> 00:40:42,880 Speaker 3: Now we're reaching the end and we're thinking about potential 712 00:40:42,960 --> 00:40:45,600 Speaker 3: risks or just, yeah, things that we need to be 713 00:40:45,640 --> 00:40:48,120 Speaker 3: watching in twenty twenty six that we might not otherwise be 714 00:40:48,160 --> 00:40:48,960 Speaker 3: paying attention to. 715 00:40:49,360 --> 00:40:51,160 Speaker 2: Yeah, maybe I could just pick up briefly on that 716 00:40:51,239 --> 00:40:54,160 Speaker 2: last thought, Stephanie, I think that's really interesting. So I 717 00:40:54,160 --> 00:40:57,960 Speaker 2: think traditionally we think about US presidents who are constrained 718 00:40:58,040 --> 00:41:01,879 Speaker 2: by congressional gridlock at home trying to do more abroad. Right, 719 00:41:02,000 --> 00:41:04,960 Speaker 2: I can't get anything through Congress, let me pursue an 720 00:41:04,960 --> 00:41:07,600 Speaker 2: aggressive foreign policy on one dimension or another. 721 00:41:07,880 --> 00:41:08,080 Speaker 4: Right. 722 00:41:08,080 --> 00:41:10,440 Speaker 2: I think the possibility, as you're pointing to, which 723 00:41:10,440 --> 00:41:13,560 Speaker 2: hadn't occurred to me before, is, well, President Trump may 724 00:41:13,600 --> 00:41:16,000 Speaker 2: be gridlocked in Congress. But if he's got his man in 725 00:41:16,080 --> 00:41:20,760 Speaker 2: at the Fed and he's unembarrassed about pulling that lever, well, 726 00:41:20,960 --> 00:41:24,000 Speaker 2: perhaps that's an unconstrained option for him. 727 00:41:24,280 --> 00:41:24,520 Speaker 1: Right.
728 00:41:25,000 --> 00:41:28,600 Speaker 2: So it could be that there's some dynamic between loss 729 00:41:28,600 --> 00:41:32,400 Speaker 2: of power at the midterms and greater capacity and willingness 730 00:41:32,440 --> 00:41:35,320 Speaker 2: to influence the Fed, which could potentially play out in 731 00:41:35,360 --> 00:41:38,400 Speaker 2: a pretty negative way for the US bond market. In 732 00:41:38,480 --> 00:41:41,400 Speaker 2: terms of wild cards for twenty twenty six, I guess 733 00:41:41,400 --> 00:41:45,040 Speaker 2: my wild card is everything goes great. Right, I'll tell you 734 00:41:45,080 --> 00:41:47,839 Speaker 2: why I was thinking that. So, Bloomberg Economics, one place 735 00:41:47,840 --> 00:41:49,640 Speaker 2: in the world we pay a lot of attention to 736 00:41:50,160 --> 00:41:53,640 Speaker 2: is Taiwan, because Taiwan sits at this kind of nexus 737 00:41:53,640 --> 00:41:58,360 Speaker 2: of geopolitical and trade risk. Right, concern that Xi Jinping 738 00:41:58,480 --> 00:42:03,680 Speaker 2: has ambitions to extend Chinese control to Taipei, concern that 739 00:42:04,120 --> 00:42:08,160 Speaker 2: tariffs would be a real negative for an economy which 740 00:42:08,320 --> 00:42:11,719 Speaker 2: relies more on exports than pretty much any other economy 741 00:42:11,719 --> 00:42:14,920 Speaker 2: in the world. Guess what Taiwan's GDP growth was in 742 00:42:14,960 --> 00:42:16,759 Speaker 2: the first three quarters of twenty twenty five? 743 00:42:17,200 --> 00:42:17,839 Speaker 3: Pretty high. 744 00:42:18,160 --> 00:42:21,160 Speaker 2: I think you're going to tell me seven percent? Seven 745 00:42:21,200 --> 00:42:25,080 Speaker 2: percent growth, right. So, all of these very real concerns 746 00:42:25,120 --> 00:42:29,239 Speaker 2: about trade, all of these very real concerns about geopolitics.
747 00:42:29,640 --> 00:42:33,080 Speaker 2: Through it all, this rebel island, which should be at 748 00:42:33,080 --> 00:42:37,879 Speaker 2: the absolute nexus of these risks, continuing to significantly outperform, 749 00:42:38,239 --> 00:42:43,239 Speaker 2: precisely because of that optimism about AI supercharging demand for 750 00:42:43,280 --> 00:42:47,080 Speaker 2: Taiwan's chips. Maybe, in a microcosm, that will be the 751 00:42:47,120 --> 00:42:50,919 Speaker 2: story for the global economy in twenty twenty six. Yes, 752 00:42:51,280 --> 00:42:55,279 Speaker 2: the risks are there, but actually it's the momentum and 753 00:42:55,320 --> 00:42:59,320 Speaker 2: the optimism and the AI revolution which continue to dominate 754 00:42:59,360 --> 00:42:59,880 Speaker 2: the narrative. 755 00:43:01,040 --> 00:43:03,560 Speaker 3: So, Mario, are you girding yourself for things to go 756 00:43:03,680 --> 00:43:09,360 Speaker 3: really right next year as you contemplate your Christmas holidays? 757 00:43:10,360 --> 00:43:14,000 Speaker 6: No, absolutely. I mean, it's similar to Tom, kind of. 758 00:43:14,520 --> 00:43:16,640 Speaker 6: His answer was kind of similar to what I was 759 00:43:16,640 --> 00:43:19,840 Speaker 6: thinking as well. I mean, in the last fifty 760 00:43:19,920 --> 00:43:22,759 Speaker 6: or sixty years or so, only two times has a 761 00:43:22,800 --> 00:43:26,760 Speaker 6: president been able to buck the historical midterm trend 762 00:43:26,800 --> 00:43:30,279 Speaker 6: and keep their majority or expand it. Last month was 763 00:43:30,320 --> 00:43:32,719 Speaker 6: a bad month for the president and he continues to 764 00:43:32,760 --> 00:43:36,960 Speaker 6: be in a rut. Right now, Democrats are ebullient in 765 00:43:37,040 --> 00:43:39,760 Speaker 6: some ways and looking toward a blue wave. 766 00:43:40,480 --> 00:43:44,400 Speaker 4: But what if it doesn't happen? What if it doesn't materialize?
767 00:43:44,480 --> 00:43:47,319 Speaker 6: This is a president who was indicted, who went through 768 00:43:47,440 --> 00:43:51,840 Speaker 6: court cases, who was impeached twice, and still ended up 769 00:43:51,880 --> 00:43:52,880 Speaker 6: back in the White House. 770 00:43:53,280 --> 00:43:55,600 Speaker 4: What if he ends up with a hot hand electorally 771 00:43:55,680 --> 00:43:58,080 Speaker 4: again and keeps control of Congress next year? 772 00:43:58,880 --> 00:44:01,160 Speaker 3: It's interesting that we see that as the wildcard. Well, 773 00:44:01,160 --> 00:44:02,880 Speaker 3: funnily enough, and of course you can tell from this 774 00:44:02,960 --> 00:44:05,759 Speaker 3: that none of us coordinated, mine was a combination of 775 00:44:05,800 --> 00:44:08,600 Speaker 3: those two, in the sense that I do think a lot 776 00:44:08,640 --> 00:44:11,839 Speaker 3: more could go right economically than people might expect. I 777 00:44:11,880 --> 00:44:14,720 Speaker 3: was just looking today, noticing that both the trade deficit 778 00:44:14,760 --> 00:44:17,839 Speaker 3: and the federal deficit have been falling in recent months, 779 00:44:17,840 --> 00:44:19,719 Speaker 3: and in fact, the budget deficit has been falling for 780 00:44:19,800 --> 00:44:22,279 Speaker 3: most of this year. In that sense, you could say, 781 00:44:22,920 --> 00:44:26,239 Speaker 3: big picture, President Trump's policies are working. If he was 782 00:44:26,719 --> 00:44:30,000 Speaker 3: coming in wanting to shrink the trade deficit, well, that 783 00:44:30,120 --> 00:44:33,399 Speaker 3: is very much what has been happening. He has had 784 00:44:33,400 --> 00:44:37,239 Speaker 3: a big impact on the Chinese imports coming into the US, 785 00:44:37,320 --> 00:44:41,480 Speaker 3: as he suggested.
Inflation has been pretty stable, growth is 786 00:44:41,520 --> 00:44:44,480 Speaker 3: pretty stable, as we've noted, productivity is a bit higher 787 00:44:44,480 --> 00:44:47,960 Speaker 3: than it has been elsewhere. I guess the kicker I 788 00:44:47,960 --> 00:44:49,839 Speaker 3: would add to that is I think that things could 789 00:44:49,880 --> 00:44:53,080 Speaker 3: go right in the big picture economically, and we could 790 00:44:53,080 --> 00:44:55,040 Speaker 3: all be giving a certain amount of credit to the 791 00:44:55,080 --> 00:44:57,440 Speaker 3: President for that, and he'll still lose the midterms, because 792 00:44:57,440 --> 00:44:59,239 Speaker 3: if we've learned anything in the last few years, it's 793 00:44:59,280 --> 00:45:02,560 Speaker 3: that households don't necessarily give credit for those kind of 794 00:45:02,560 --> 00:45:05,560 Speaker 3: big picture macro achievements if they're still seeing quite a 795 00:45:05,600 --> 00:45:08,520 Speaker 3: lot of micro pain. And I think they will be 796 00:45:08,520 --> 00:45:10,560 Speaker 3: still seeing a bit of micro pain, and even the 797 00:45:10,600 --> 00:45:13,120 Speaker 3: shrinking of that trade deficit will be reflecting a bit 798 00:45:13,160 --> 00:45:16,520 Speaker 3: of pain on the ground, because US manufacturers will be 799 00:45:16,640 --> 00:45:19,920 Speaker 3: under pressure from those tariffs. So a combination: things go 800 00:45:20,000 --> 00:45:23,719 Speaker 3: really well, but not so well for the president despite that, 801 00:45:23,840 --> 00:45:26,400 Speaker 3: and then he will fulminate at how unfair it is. 802 00:45:26,960 --> 00:45:29,759 Speaker 3: And just finally, Parmy, from you, your wild card 803 00:45:29,840 --> 00:45:31,239 Speaker 3: or the thing that you're going to be watching that 804 00:45:31,320 --> 00:45:34,160 Speaker 3: we might not otherwise be watching in twenty six.
805 00:45:34,719 --> 00:45:38,040 Speaker 5: Well, my wild card isn't a particularly wild one. 806 00:45:38,239 --> 00:45:41,319 Speaker 1: It would just be to look at Google and how 807 00:45:41,400 --> 00:45:44,920 Speaker 1: much it continues to move ahead in the race against 808 00:45:44,920 --> 00:45:47,600 Speaker 1: OpenAI. If you think about the story of 809 00:45:47,640 --> 00:45:50,319 Speaker 1: the hare and the tortoise, I would really put 810 00:45:50,320 --> 00:45:52,400 Speaker 1: OpenAI as the hare and Google 811 00:45:52,040 --> 00:45:52,920 Speaker 5: as the tortoise. 812 00:45:53,000 --> 00:45:55,920 Speaker 1: They started off on the back foot, way behind, when 813 00:45:55,960 --> 00:45:58,880 Speaker 1: ChatGPT came out three years ago, and they have just 814 00:45:58,960 --> 00:46:03,160 Speaker 1: gradually caught up. And now their latest model, Gemini, is 815 00:46:03,280 --> 00:46:08,080 Speaker 1: better than OpenAI's latest model by some benchmarks, 816 00:46:08,440 --> 00:46:10,680 Speaker 1: and they're gaining ground in terms of market share. 817 00:46:10,920 --> 00:46:11,440 Speaker 5: They've got this 818 00:46:11,640 --> 00:46:17,319 Speaker 1: very cautious, corporate, scientific approach to AI, which is a very 819 00:46:17,360 --> 00:46:19,680 Speaker 1: different animal than OpenAI, which is much more 820 00:46:20,080 --> 00:46:23,560 Speaker 1: product oriented: let's make this an engaging product with 821 00:46:23,600 --> 00:46:26,240 Speaker 1: lots of features that are going to keep people engaged 822 00:46:26,280 --> 00:46:30,360 Speaker 1: for as long as possible. So maybe Google's 823 00:46:30,480 --> 00:46:34,520 Speaker 1: kind of utilitarian approach to building AI is not 824 00:46:34,600 --> 00:46:37,520 Speaker 1: as exciting, it's a little bit more boring, but they 825 00:46:37,560 --> 00:46:40,719 Speaker 1: are slowly gaining ground.
And of course they are the 826 00:46:40,960 --> 00:46:44,480 Speaker 1: established company, and so I would not be surprised to 827 00:46:44,520 --> 00:46:48,440 Speaker 1: see them actually maybe surpass OpenAI, certainly in terms 828 00:46:48,520 --> 00:46:51,040 Speaker 1: of the capabilities of their AI models, next year. 829 00:46:51,080 --> 00:46:54,520 Speaker 3: All right, Google, the boring company to watch. 830 00:46:55,040 --> 00:46:58,680 Speaker 3: Thank you very much, Mario Parker, Parmy Olson, Tom Orlik, 831 00:46:58,719 --> 00:46:59,200 Speaker 3: thank you so 832 00:46:59,239 --> 00:47:03,680 Speaker 4: much, thank you, thank you, thanks Stephanie, and 833 00:47:03,719 --> 00:47:10,680 Speaker 3: happy New Year. Thanks for listening to Trumponomics from Bloomberg. 834 00:47:10,719 --> 00:47:13,200 Speaker 3: It was hosted by me, Stephanie Flanders. I was joined 835 00:47:13,200 --> 00:47:18,120 Speaker 3: by Bloomberg Opinion columnist Parmy Olson, Bloomberg Economics chief economist 836 00:47:18,239 --> 00:47:22,760 Speaker 3: Tom Orlik, and our managing editor for US Government, Mario Parker. 837 00:47:26,719 --> 00:47:29,560 Speaker 3: Trumponomics was produced by Summer Sadi and Moses and Am, 838 00:47:29,840 --> 00:47:33,160 Speaker 3: with help from Amy Keen, and special thanks to Rachel 839 00:47:33,239 --> 00:47:37,920 Speaker 3: Lewis Chrisky. Sound design was by Blake Maples, and Sage 840 00:47:37,920 --> 00:47:41,560 Speaker 3: Bowman is Bloomberg's head of podcasts. To help others find 841 00:47:41,560 --> 00:47:44,840 Speaker 3: the show, please rate and review us highly wherever you 842 00:47:44,920 --> 00:47:47,839 Speaker 3: listen to your podcasts, and I hope whatever you're doing 843 00:47:48,040 --> 00:47:51,600 Speaker 3: you have a really great holiday.