1 00:00:02,920 --> 00:00:10,799 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. You're listening to the 2 00:00:10,840 --> 00:00:15,000 Speaker 1: Bloomberg Intelligence podcast. Catch us live weekdays at ten am 3 00:00:15,040 --> 00:00:17,520 Speaker 1: Eastern on Apple CarPlay and Android Auto with the 4 00:00:17,520 --> 00:00:21,400 Speaker 1: Bloomberg Business App. Listen on demand wherever you get your podcasts, 5 00:00:21,640 --> 00:00:24,000 Speaker 1: or watch us live on YouTube. 6 00:00:25,360 --> 00:00:27,640 Speaker 2: There's a story out there, and the headline doesn't help 7 00:00:27,640 --> 00:00:29,120 Speaker 2: me because I really don't know what's going on here. 8 00:00:29,440 --> 00:00:34,560 Speaker 2: Microsoft and Apple drop OpenAI board plans as scrutiny grows. 9 00:00:35,120 --> 00:00:36,400 Speaker 3: I thought we wanted the big 10 00:00:36,240 --> 00:00:39,680 Speaker 2: tech names to be supporting AI, and I don't know 11 00:00:39,720 --> 00:00:40,440 Speaker 2: what's going on here. 12 00:00:40,560 --> 00:00:42,479 Speaker 4: So it was like a freebie seat, isn't it? Like 13 00:00:42,560 --> 00:00:43,320 Speaker 4: they're just watching? 14 00:00:43,600 --> 00:00:46,199 Speaker 3: Yeah, exactly, but they tell me it's important. 15 00:00:46,240 --> 00:00:48,519 Speaker 2: And Dina Bass, who's a really good technology reporter, she's been 16 00:00:48,520 --> 00:00:51,880 Speaker 2: a great technology reporter for, I'm gonna say, thirty-plus years. 17 00:00:51,880 --> 00:00:55,440 Speaker 2: I was reading her stuff way before she came to Bloomberg. 18 00:00:55,600 --> 00:00:57,800 Speaker 2: All right, so I'm gonna self-admit I don't know 19 00:00:57,800 --> 00:00:59,480 Speaker 2: what's going on here. So I said, please get me 20 00:00:59,520 --> 00:01:01,840 Speaker 2: Mandeep Singh, because I'm just guessing he knows what's 21 00:01:01,880 --> 00:01:04,200 Speaker 2: going on, Mandeep Singh.
He's a tech analyst at 22 00:01:04,200 --> 00:01:08,039 Speaker 2: Bloomberg Intelligence. Microsoft and Apple drop plans to take board 23 00:01:08,240 --> 00:01:12,440 Speaker 2: roles at OpenAI in a surprise decision that underscores 24 00:01:12,480 --> 00:01:16,440 Speaker 2: growing regulatory scrutiny of Big Tech's influence over artificial intelligence. 25 00:01:17,440 --> 00:01:18,200 Speaker 3: What's going on here? 26 00:01:18,760 --> 00:01:22,400 Speaker 5: Well, so, I mean if you roll back through last 27 00:01:22,480 --> 00:01:25,680 Speaker 5: year when we had the OpenAI drama, you know, 28 00:01:25,880 --> 00:01:30,440 Speaker 5: the board firing Sam Altman and then OpenAI adding 29 00:01:30,520 --> 00:01:36,400 Speaker 5: four new members, and I think everyone so far has 30 00:01:36,480 --> 00:01:41,080 Speaker 5: been just sort of interested in how OpenAI has 31 00:01:41,120 --> 00:01:44,000 Speaker 5: evolved since then, in the sense that even 32 00:01:44,040 --> 00:01:49,040 Speaker 5: though they have added independent board members, Microsoft, you know, 33 00:01:49,280 --> 00:01:53,000 Speaker 5: and now Apple, probably would have joined the board. It 34 00:01:53,000 --> 00:01:57,560 Speaker 5: doesn't make sense when you have vested interests. In 35 00:01:57,600 --> 00:01:59,880 Speaker 5: the case of Microsoft, they have a forty-nine per 36 00:02:00,080 --> 00:02:04,520 Speaker 5: cent stake in OpenAI and they're the primary cloud 37 00:02:04,520 --> 00:02:08,000 Speaker 5: provider for OpenAI. How can you do a good 38 00:02:08,080 --> 00:02:11,080 Speaker 5: job of being an independent board member? And that's where, 39 00:02:11,160 --> 00:02:14,080 Speaker 5: you know, the regulatory scrutiny comes into play, because the 40 00:02:14,160 --> 00:02:19,040 Speaker 5: regulators obviously are concerned about how Microsoft bought Inflection AI.
41 00:02:19,120 --> 00:02:22,680 Speaker 5: Even though it wasn't an acquisition, they got all the 42 00:02:22,720 --> 00:02:26,680 Speaker 5: employees of Inflection AI. So that's when it comes down 43 00:02:26,720 --> 00:02:28,440 Speaker 5: to the optics of it. 44 00:02:28,560 --> 00:02:32,120 Speaker 4: Here, break down for me again what the relationship is 45 00:02:32,160 --> 00:02:35,800 Speaker 4: now with OpenAI and Apple versus OpenAI and Microsoft. 46 00:02:36,120 --> 00:02:39,560 Speaker 5: So Microsoft has a forty-nine percent stake in OpenAI. 47 00:02:39,680 --> 00:02:42,200 Speaker 5: They made a ten-billion-dollar investment, and so you 48 00:02:42,240 --> 00:02:44,440 Speaker 5: could argue, if you're an investor in a company, you 49 00:02:44,480 --> 00:02:47,080 Speaker 5: should get a board seat, as do Sequoia and some 50 00:02:47,200 --> 00:02:50,440 Speaker 5: other private investors that OpenAI has. Apple, on the 51 00:02:50,480 --> 00:02:54,480 Speaker 5: other hand, just partnered with OpenAI, which is going 52 00:02:54,560 --> 00:02:58,280 Speaker 5: to be one of the providers of Gen AI models on 53 00:02:58,360 --> 00:03:03,200 Speaker 5: their iOS and other devices, and so it's more of 54 00:03:03,240 --> 00:03:06,600 Speaker 5: a customer-supplier relationship, if you want to call it 55 00:03:06,600 --> 00:03:09,480 Speaker 5: that, where Apple is using OpenAI as a supplier. 56 00:03:09,880 --> 00:03:11,919 Speaker 5: So why were they getting a board seat? Well, you could 57 00:03:12,040 --> 00:03:15,000 Speaker 5: argue they could be an independent board member, like Fidji Simo 58 00:03:15,160 --> 00:03:18,440 Speaker 5: is one of the board members for OpenAI. She's 59 00:03:18,480 --> 00:03:23,200 Speaker 5: the CEO of Instacart.
Now, I think that's where the 60 00:03:23,360 --> 00:03:27,760 Speaker 5: fact that these technologies are so intertwined, in terms of 61 00:03:28,040 --> 00:03:32,400 Speaker 5: AI being deployed on Apple devices, you could argue the 62 00:03:32,480 --> 00:03:36,160 Speaker 5: independence of the board comes into question. And given how 63 00:03:36,240 --> 00:03:40,320 Speaker 5: important AI ethics and guardrails and all the things that 64 00:03:40,440 --> 00:03:44,680 Speaker 5: regulators care about, that everyone cares about, frankly, I think 65 00:03:44,720 --> 00:03:48,880 Speaker 5: the independence of the board was questionable, and these companies 66 00:03:48,960 --> 00:03:52,600 Speaker 5: decided to proactively not join the board, which I 67 00:03:52,640 --> 00:03:55,200 Speaker 5: think is good for optics. 68 00:03:55,760 --> 00:04:01,040 Speaker 2: So Microsoft is under scrutiny in Europe and by US 69 00:04:01,280 --> 00:04:07,160 Speaker 2: regulators into, I guess, their alleged dominance of AI. What's 70 00:04:07,200 --> 00:04:10,120 Speaker 2: the view on the street about that risk to Microsoft? 71 00:04:10,160 --> 00:04:12,120 Speaker 2: And I don't know, because at some point maybe Apple 72 00:04:12,120 --> 00:04:13,320 Speaker 2: is out of it, but certainly Microsoft. 73 00:04:13,680 --> 00:04:16,800 Speaker 5: I mean, the best way to look at it is 74 00:04:17,360 --> 00:04:21,440 Speaker 5: these companies require a lot of compute for training their models. 75 00:04:21,440 --> 00:04:24,440 Speaker 5: So whether it's OpenAI, Anthropic, any of the 76 00:04:24,600 --> 00:04:28,920 Speaker 5: independent Gen AI model providers, they need a lot of compute. Now, 77 00:04:28,920 --> 00:04:32,880 Speaker 5: in the case of Microsoft, the arrangement really, you know, 78 00:04:33,320 --> 00:04:37,560 Speaker 5: serves their interests, because OpenAI ends up using Microsoft 79 00:04:37,600 --> 00:04:40,640 Speaker 5: compute for training their models.
80 00:04:40,680 --> 00:04:42,799 Speaker 3: What does that mean? What is compute? 81 00:04:42,839 --> 00:04:47,040 Speaker 5: So think of all the Nvidia GPUs that OpenAI needs, 82 00:04:47,400 --> 00:04:51,800 Speaker 5: but it's deployed on Microsoft Cloud. So if I need 83 00:04:52,040 --> 00:04:55,800 Speaker 5: compute to train the models, I'm not setting up my own 84 00:04:56,040 --> 00:04:59,120 Speaker 5: data center. Anthropic, OpenAI, they're not setting up 85 00:04:59,200 --> 00:05:03,279 Speaker 5: data centers. They're using Microsoft data centers to get access 86 00:05:03,320 --> 00:05:06,520 Speaker 5: to that compute. And that's where they could have gone 87 00:05:06,560 --> 00:05:09,920 Speaker 5: to Amazon or Google to get that compute. The reason 88 00:05:09,960 --> 00:05:12,760 Speaker 5: they're going to Microsoft is because of, you know, 89 00:05:12,839 --> 00:05:15,640 Speaker 5: the stake they have and the board seat. And so 90 00:05:16,040 --> 00:05:19,640 Speaker 5: if you are OpenAI and Microsoft is sitting on 91 00:05:19,680 --> 00:05:22,080 Speaker 5: that board, are you really going to pick any of 92 00:05:22,120 --> 00:05:26,000 Speaker 5: the alternative cloud providers for training your models? Probably not. 93 00:05:26,960 --> 00:05:31,120 Speaker 4: Do you feel like OpenAI will ever go public 94 00:05:31,520 --> 00:05:35,120 Speaker 4: or be bought by Microsoft? Like, how does that situation 95 00:05:35,320 --> 00:05:35,840 Speaker 4: play out? 96 00:05:36,160 --> 00:05:40,799 Speaker 5: I mean, everyone knows there are antitrust concerns around Big Tech buying 97 00:05:40,839 --> 00:05:43,720 Speaker 5: any of these companies, so an acquisition is definitely out 98 00:05:43,720 --> 00:05:47,160 Speaker 5: of the window. And that's why, the way Microsoft got 99 00:05:47,200 --> 00:05:50,760 Speaker 5: all the employees of Inflection AI, it was a pseudo-acquisition.
100 00:05:51,160 --> 00:05:53,880 Speaker 5: You get all of the employees, you pay six hundred 101 00:05:53,920 --> 00:05:57,159 Speaker 5: million dollars, and you're not calling it an acquisition. So 102 00:05:57,320 --> 00:06:00,320 Speaker 5: in the case of OpenAI, obviously that's not gonna happen. 103 00:06:00,320 --> 00:06:02,960 Speaker 5: But Microsoft does have a forty-nine percent stake. 104 00:06:03,360 --> 00:06:07,040 Speaker 5: Google bought DeepMind for six hundred million. Now, they bought it, 105 00:06:07,160 --> 00:06:09,560 Speaker 5: you know, a few years back. But it just goes 106 00:06:09,600 --> 00:06:13,640 Speaker 5: to show how, you know, these companies position themselves for 107 00:06:13,680 --> 00:06:16,599 Speaker 5: the future and they end up buying a lot of 108 00:06:16,640 --> 00:06:19,920 Speaker 5: the startups. So I don't see an acquisition on the cards. 109 00:06:19,960 --> 00:06:22,640 Speaker 5: But OpenAI is unique in the sense it's 110 00:06:22,680 --> 00:06:26,479 Speaker 5: a nonprofit and then they are looking to build a 111 00:06:26,480 --> 00:06:29,760 Speaker 5: business out of it. So their structure was sort of 112 00:06:30,279 --> 00:06:33,240 Speaker 5: very weird to begin with. And then they had that 113 00:06:33,360 --> 00:06:35,960 Speaker 5: board trouble where it was a three-person board that 114 00:06:36,040 --> 00:06:39,760 Speaker 5: fired their CEO. Now they are expanding with independent board members. 115 00:06:39,800 --> 00:06:42,919 Speaker 5: So I don't think they've got it right still, but 116 00:06:43,000 --> 00:06:46,080 Speaker 5: at least, you know, not having Microsoft and Apple on 117 00:06:46,120 --> 00:06:48,840 Speaker 5: the board is good in terms of optics. And that's 118 00:06:48,839 --> 00:06:50,160 Speaker 5: what I can conclude. 119 00:06:49,760 --> 00:06:51,120 Speaker 3: Here on Surveillance.
120 00:06:51,400 --> 00:06:53,960 Speaker 2: Earlier today we had a London-based hedge fund which I 121 00:06:53,960 --> 00:06:56,200 Speaker 2: had never heard of, Ruffer, with about 122 00:06:56,200 --> 00:06:58,320 Speaker 2: thirty-eight billion dollars in assets under management. 123 00:06:58,480 --> 00:07:00,599 Speaker 3: They have a bearish call on the market. 124 00:07:00,600 --> 00:07:03,000 Speaker 2: They think it's so overheated, everything's overheated, blah blah blah, and 125 00:07:03,000 --> 00:07:05,800 Speaker 2: that includes AI. Now, they're saying they believe in AI, but 126 00:07:05,880 --> 00:07:08,080 Speaker 2: of course they've been wrong, by their own admission, and 127 00:07:08,080 --> 00:07:10,480 Speaker 2: they've been trailing their peers because the market's been going up. 128 00:07:10,960 --> 00:07:13,239 Speaker 2: But the AI call is simply that, yeah, AI is transformative, 129 00:07:13,280 --> 00:07:14,400 Speaker 2: blah blah blah blah, 130 00:07:14,480 --> 00:07:15,480 Speaker 3: but we don't even 131 00:07:15,400 --> 00:07:18,120 Speaker 2: know what the uses are. And all the 132 00:07:18,120 --> 00:07:20,800 Speaker 2: money that's being invested in AI, explain to me what 133 00:07:20,800 --> 00:07:23,320 Speaker 2: the return on investment is on that. How do you 134 00:07:23,840 --> 00:07:25,240 Speaker 2: answer that or deal with that? 135 00:07:25,760 --> 00:07:28,960 Speaker 5: I mean, we know some tangible use cases right now. 136 00:07:29,080 --> 00:07:33,120 Speaker 5: One is, from a developer copilot perspective, AI works, 137 00:07:33,440 --> 00:07:37,360 Speaker 5: and this thing is getting deployed at every company. Developers 138 00:07:37,400 --> 00:07:41,200 Speaker 5: are using copilot. Same thing with customer service chatbots. This 139 00:07:41,240 --> 00:07:46,000 Speaker 5: thing is productivity-enhancing and this is getting deployed.
So 140 00:07:46,280 --> 00:07:49,040 Speaker 5: I think it's hard to argue that there is no 141 00:07:49,160 --> 00:07:53,320 Speaker 5: tangible use case. But look, the valuations have run up, 142 00:07:53,520 --> 00:07:55,920 Speaker 5: and there will be a disconnect at some point where 143 00:07:56,040 --> 00:07:59,880 Speaker 5: valuations are way ahead of reality. And that's where you, 144 00:08:00,160 --> 00:08:02,400 Speaker 5: as an investor, you got to pick your spots. I mean, 145 00:08:02,600 --> 00:08:05,960 Speaker 5: a TSMC will continue to benefit; whether it's Nvidia, 146 00:08:06,000 --> 00:08:08,720 Speaker 5: whether it's Apple or anyone else that needs the latest 147 00:08:08,760 --> 00:08:11,040 Speaker 5: chips for AI, they will continue to benefit. 148 00:08:11,640 --> 00:08:14,360 Speaker 2: And I think I can kind of write the press 149 00:08:14,400 --> 00:08:17,880 Speaker 2: release from Nvidia or any company that's really been 150 00:08:17,920 --> 00:08:21,760 Speaker 2: lifted by AI. You know, it's almost kind of a 151 00:08:21,800 --> 00:08:24,360 Speaker 2: customer fatigue. They've already bought all this stuff, they've already 152 00:08:24,360 --> 00:08:25,360 Speaker 2: invested all this money. 153 00:08:25,680 --> 00:08:26,880 Speaker 3: Our customers have pulled 154 00:08:26,640 --> 00:08:28,840 Speaker 2: three to four years of capex forward and spent it 155 00:08:28,880 --> 00:08:31,720 Speaker 2: over the last twenty-four months, and we're seeing, instead 156 00:08:31,720 --> 00:08:33,760 Speaker 2: of growing forty-eight percent, we're gonna grow thirty percent. 157 00:08:34,720 --> 00:08:37,200 Speaker 2: That's a bad day for tech and for the market, 158 00:08:37,360 --> 00:08:38,560 Speaker 2: I think. And it's 159 00:08:38,480 --> 00:08:41,480 Speaker 5: not going to be uniform.
So some sectors will see 160 00:08:41,800 --> 00:08:45,640 Speaker 5: very tangible productivity benefits of AI, some others will have 161 00:08:45,679 --> 00:08:48,760 Speaker 5: a delayed benefit or maybe no benefit, and that's where 162 00:08:48,800 --> 00:08:51,040 Speaker 5: the disconnect will start to show up. But right now 163 00:08:51,080 --> 00:08:54,520 Speaker 5: we are in that, you know, mode where everyone believes 164 00:08:54,559 --> 00:08:56,839 Speaker 5: AI is the next big thing and it will have 165 00:08:57,040 --> 00:08:58,120 Speaker 5: productivity boosts. 166 00:08:58,160 --> 00:09:00,240 Speaker 4: And then just one quick point on what Paul was 167 00:09:00,240 --> 00:09:04,360 Speaker 4: talking about too, just that the monetization part. Like you say, 168 00:09:04,360 --> 00:09:07,240 Speaker 4: the use cases are there, but then how do you monetize it? Yeah, 169 00:09:07,679 --> 00:09:09,600 Speaker 4: like we don't know what that really looks like yet. 170 00:09:09,600 --> 00:09:11,720 Speaker 4: Like everyone's gonna keep wanting Nvidia chips. Okay, I 171 00:09:11,760 --> 00:09:14,520 Speaker 4: get that. But then whatever the use case is for that, okay, 172 00:09:14,800 --> 00:09:16,960 Speaker 4: then you got to monetize it. And at some point 173 00:09:17,520 --> 00:09:19,080 Speaker 4: that's gonna have to be proven. 174 00:09:19,360 --> 00:09:21,839 Speaker 5: I mean, look at cloud. So when cloud came on 175 00:09:21,840 --> 00:09:25,760 Speaker 5: the scene, everyone said, okay, it's a secular trend. But 176 00:09:26,120 --> 00:09:29,680 Speaker 5: the monetization came about over time, and there are certain 177 00:09:29,800 --> 00:09:34,200 Speaker 5: cloud-native apps that really did very well as a business. 178 00:09:34,480 --> 00:09:38,240 Speaker 5: So that's where an AI-native app, I think, is 179 00:09:38,280 --> 00:09:41,120 Speaker 5: going to do exceedingly well.
Whereas if you're changing a 180 00:09:41,200 --> 00:09:44,120 Speaker 5: business and you're overlaying AI, you may or may not 181 00:09:44,160 --> 00:09:46,120 Speaker 5: be successful, but we'll find out. 182 00:09:46,559 --> 00:09:48,920 Speaker 3: Good time to be a tech analyst. He's the best. 183 00:09:49,000 --> 00:09:51,120 Speaker 4: Yeah, I know, like you can ask the dumbest questions 184 00:09:51,160 --> 00:09:54,319 Speaker 4: and he's like right there for you. Yeah, I appreciate it. 185 00:09:54,320 --> 00:09:56,640 Speaker 3: It's amazing. Mandeep, thank you so much for joining us. 186 00:09:56,640 --> 00:10:00,480 Speaker 2: Mandeep Singh, senior tech industry analyst from Bloomberg Intelligence, coming 187 00:10:00,520 --> 00:10:03,199 Speaker 2: into the New York office, unlike his management team. So 188 00:10:03,160 --> 00:10:05,120 Speaker 3: we appreciate that. Thank you, Mandeep. 189 00:10:06,679 --> 00:10:10,560 Speaker 1: You're listening to the Bloomberg Intelligence Podcast. Catch us live 190 00:10:10,640 --> 00:10:14,160 Speaker 1: weekdays at ten am Eastern on Apple CarPlay and Android 191 00:10:14,200 --> 00:10:17,000 Speaker 1: Auto with the Bloomberg Business app. You can also listen 192 00:10:17,080 --> 00:10:20,200 Speaker 1: live on Amazon Alexa from our flagship New York station. 193 00:10:20,559 --> 00:10:25,560 Speaker 1: Just say, Alexa, play Bloomberg eleven thirty. 194 00:10:25,480 --> 00:10:27,280 Speaker 4: Chair Jay Powell, he is speaking at the House Financial 195 00:10:27,280 --> 00:10:29,440 Speaker 4: Services Committee in his opening statement. We will bring you 196 00:10:29,480 --> 00:10:32,600 Speaker 4: the Q and A when that kicks off. Steve Matthews, 197 00:10:32,600 --> 00:10:36,120 Speaker 4: Bloomberg Federal Reserve reporter, he joins us now. Is there anything 198 00:10:36,120 --> 00:10:38,760 Speaker 4: we're going to hear different today than yesterday?
199 00:10:40,040 --> 00:10:42,520 Speaker 6: I think the plan is not to hear anything different. 200 00:10:42,559 --> 00:10:46,800 Speaker 6: I mean, Powell's plan yesterday was pretty clearly to try 201 00:10:46,920 --> 00:10:49,600 Speaker 6: not to make news. And he was asked repeatedly about, 202 00:10:49,800 --> 00:10:52,360 Speaker 6: are we going to get a rate cut in September 203 00:10:52,679 --> 00:10:56,280 Speaker 6: or July or whenever, and he said, I'm not going 204 00:10:56,360 --> 00:11:00,120 Speaker 6: to give any guidance on rates. And I think he 205 00:11:00,200 --> 00:11:04,480 Speaker 6: was trying to be very balanced. In most of the 206 00:11:04,559 --> 00:11:11,040 Speaker 6: recent testimony, the semi-annual Humphrey-Hawkins testimony, he has 207 00:11:11,120 --> 00:11:14,040 Speaker 6: tried not to make a lot of news because, you know, 208 00:11:14,200 --> 00:11:18,800 Speaker 6: this is a particularly political situation, it's right before 209 00:11:18,880 --> 00:11:22,680 Speaker 6: the election. He doesn't want to make a lot of news. 210 00:11:22,720 --> 00:11:25,480 Speaker 6: So I think that you're going to see the same 211 00:11:25,559 --> 00:11:27,400 Speaker 6: tone today, Alex. 212 00:11:27,480 --> 00:11:29,800 Speaker 3: One of the things I love about Bloomberg and Bloomberg News 213 00:11:29,720 --> 00:11:33,960 Speaker 2: is we actually have a dedicated reporter for the Federal Reserve. 214 00:11:34,200 --> 00:11:35,720 Speaker 3: How cool is that? Nobody else does that. 215 00:11:35,840 --> 00:11:36,760 Speaker 4: No, it's really cool. 216 00:11:36,840 --> 00:11:39,719 Speaker 3: Yeah, Steve, yeah, you're the best. Steve, thanks so much.
217 00:11:40,760 --> 00:11:43,199 Speaker 2: I want to ask, you just mentioned the election. Help 218 00:11:43,280 --> 00:11:46,960 Speaker 2: us understand how the Fed thinks about the timing of any 219 00:11:47,240 --> 00:11:50,400 Speaker 2: rate movement and the election. When is too close? 220 00:11:50,520 --> 00:11:53,880 Speaker 4: Are you stealing John Tucker's question? He's totally stealing John 221 00:11:53,880 --> 00:11:55,600 Speaker 4: Tucker's question. Sorry, go ahead, Steve. 222 00:11:55,720 --> 00:12:00,079 Speaker 6: There is a theory out there among Fed watchers that 223 00:12:00,160 --> 00:12:03,319 Speaker 6: the Fed would prefer not to move in September. It's 224 00:12:03,360 --> 00:12:06,800 Speaker 6: the last meeting before the election. If they move in September, 225 00:12:06,840 --> 00:12:10,640 Speaker 6: there will be complaints, oh, you're doing it for political reasons. 226 00:12:11,080 --> 00:12:14,520 Speaker 6: What Powell has said about this is they don't think 227 00:12:14,559 --> 00:12:17,640 Speaker 6: about it at all. If September is the right time, 228 00:12:17,720 --> 00:12:22,439 Speaker 6: they will move in September, and, you know, it doesn't 229 00:12:22,720 --> 00:12:27,320 Speaker 6: enter their discussions or act as an influence at all. So, 230 00:12:27,960 --> 00:12:30,200 Speaker 6: you know, some people believe there is a little bit 231 00:12:30,280 --> 00:12:34,000 Speaker 6: higher bar to move in September. But, you know, if 232 00:12:34,040 --> 00:12:36,040 Speaker 6: that's what the data says is the right time, then 233 00:12:36,120 --> 00:12:37,120 Speaker 6: they would go ahead and move.
234 00:12:37,520 --> 00:12:39,559 Speaker 4: I'm going to bring in what Joe Weisenthal was talking 235 00:12:39,559 --> 00:12:42,240 Speaker 4: about in his column today, saying that yesterday the headline 236 00:12:42,280 --> 00:12:44,720 Speaker 4: was Powell says that labor is not a source of 237 00:12:44,720 --> 00:12:48,400 Speaker 4: inflationary pressures now, and his point was like, well, that's 238 00:12:48,480 --> 00:12:51,679 Speaker 4: kind of the game, right? Like if the Fed no 239 00:12:51,760 --> 00:12:55,680 Speaker 4: longer sees the risks of that wage-price spiral and 240 00:12:55,679 --> 00:12:58,520 Speaker 4: that inflation spiral from the labor market, which is what 241 00:12:58,559 --> 00:13:01,440 Speaker 4: the Fed can control, if the Fed looks at it 242 00:13:01,480 --> 00:13:03,800 Speaker 4: that way, that could set him up for a cut. 243 00:13:04,040 --> 00:13:05,920 Speaker 4: What do you think about that? 244 00:13:05,920 --> 00:13:09,520 Speaker 6: That's certainly true. I agree with Joe. Joe's a smart guy. 245 00:13:11,160 --> 00:13:14,600 Speaker 6: Over the last three years, Powell has basically said, it's 246 00:13:14,600 --> 00:13:19,319 Speaker 6: all about inflation. We have one-sided risks. We want 247 00:13:19,360 --> 00:13:22,720 Speaker 6: to return to price stability. The labor market is fine. 248 00:13:23,200 --> 00:13:26,160 Speaker 6: Now he's saying there are two-sided risks. There's risks 249 00:13:26,160 --> 00:13:29,240 Speaker 6: on inflation, there's risks on the labor market. And that 250 00:13:29,360 --> 00:13:32,520 Speaker 6: does set up a rate cut. I mean, because, you 251 00:13:32,520 --> 00:13:36,400 Speaker 6: know, they don't want to see the labor market soften further. 252 00:13:37,520 --> 00:13:41,320 Speaker 2: Steve, to make a rate change, either up or down.
253 00:13:41,400 --> 00:13:44,280 Speaker 2: Does it have to be unanimous, or what does the 254 00:13:44,280 --> 00:13:44,960 Speaker 2: Fed like to do? 255 00:13:45,040 --> 00:13:49,160 Speaker 6: Typically it doesn't have to be unanimous. Powell has kept 256 00:13:49,160 --> 00:13:54,360 Speaker 6: things unanimous for the last couple of years. But I 257 00:13:54,400 --> 00:13:57,840 Speaker 6: think, you know, particularly in the political environment, he would 258 00:13:57,920 --> 00:14:01,800 Speaker 6: much prefer that there be a unanimous vote. But there are 259 00:14:01,840 --> 00:14:05,720 Speaker 6: a couple of hawks on the committee, so that 260 00:14:05,800 --> 00:14:08,200 Speaker 6: may be tough to get. I mean, you probably will 261 00:14:08,240 --> 00:14:08,880 Speaker 6: get a dissent. 262 00:14:10,520 --> 00:14:12,720 Speaker 4: What do you think we're going to get tomorrow with inflation? Like, 263 00:14:12,760 --> 00:14:14,760 Speaker 4: what do you think the Fed... I mean, Jay Powell yesterday 264 00:14:14,800 --> 00:14:16,720 Speaker 4: definitely talked about how it's still going to be inflation 265 00:14:16,920 --> 00:14:19,960 Speaker 4: ex-services, that core number, right? What do you think 266 00:14:19,960 --> 00:14:20,600 Speaker 4: we're going to learn? 267 00:14:21,920 --> 00:14:25,360 Speaker 6: I mean, all the estimates are that the CPI is 268 00:14:25,400 --> 00:14:28,640 Speaker 6: going to come in relatively low, and the Fed, you 269 00:14:28,680 --> 00:14:32,840 Speaker 6: have to remember, is focused on PCE inflation. CPI goes 270 00:14:32,880 --> 00:14:36,760 Speaker 6: into it; PPI, which comes out a little bit later, 271 00:14:37,080 --> 00:14:41,560 Speaker 6: will also go into that. And sometimes CPI 272 00:14:41,640 --> 00:14:45,080 Speaker 6: can be misleading. After PPI you'll get some estimates of 273 00:14:45,120 --> 00:14:50,360 Speaker 6: what their preferred measure of inflation is for the month.
274 00:14:50,400 --> 00:14:54,280 Speaker 6: But the estimates are things are coming down. Housing 275 00:14:54,320 --> 00:15:00,280 Speaker 6: prices are coming down, and they know rents have been 276 00:15:00,280 --> 00:15:03,080 Speaker 6: coming down, and that's delayed in the official data. 277 00:15:04,200 --> 00:15:07,960 Speaker 2: Steve, the Fed had, you know, a knock, certainly, 278 00:15:08,000 --> 00:15:11,760 Speaker 2: that they were too late to raise rates here and 279 00:15:11,840 --> 00:15:14,880 Speaker 2: that they really missed the boat there. Does that influence 280 00:15:15,760 --> 00:15:17,760 Speaker 2: kind of their thinking today, or do they just kind of 281 00:15:17,920 --> 00:15:20,400 Speaker 2: slough that off? 282 00:15:20,440 --> 00:15:23,920 Speaker 6: It does influence their thinking in the respect that, you 283 00:15:23,960 --> 00:15:26,960 Speaker 6: know, they don't want to see a recurrence of inflation, 284 00:15:27,200 --> 00:15:29,000 Speaker 6: and if they're going to err on one side or 285 00:15:29,000 --> 00:15:31,520 Speaker 6: the other, they want to err on the side of 286 00:15:31,640 --> 00:15:34,080 Speaker 6: not seeing a recurrence. So they're afraid that if inflation 287 00:15:34,240 --> 00:15:38,520 Speaker 6: were to pick up again, that would be really problematic 288 00:15:38,560 --> 00:15:40,080 Speaker 6: for inflation expectations. 289 00:15:40,400 --> 00:15:42,440 Speaker 2: All right, Steve, thanks so much. We really appreciate it. Steve 290 00:15:42,480 --> 00:15:45,440 Speaker 2: Matthews, Federal Reserve reporter for Bloomberg News. 291 00:15:45,760 --> 00:15:50,239 Speaker 1: This is the Bloomberg Intelligence podcast, available on Apple, Spotify, 292 00:15:50,440 --> 00:15:53,360 Speaker 1: and anywhere else you get your podcasts.
Listen live 293 00:15:53,440 --> 00:15:57,040 Speaker 1: each weekday ten am to noon Eastern on Bloomberg dot Com, 294 00:15:57,160 --> 00:16:00,120 Speaker 1: the iHeartRadio app, TuneIn, and the Bloomberg 295 00:16:00,200 --> 00:16:03,040 Speaker 1: Business App. You can also watch us live every weekday 296 00:16:03,080 --> 00:16:05,800 Speaker 1: on YouTube and always on the Bloomberg terminal.