1 00:00:02,960 --> 00:00:10,840 Speaker 1: Bloomberg Audio Studios, podcasts, radio, news. You're listening to the 2 00:00:10,880 --> 00:00:15,040 Speaker 1: Bloomberg Intelligence podcast. Catch us live weekdays at ten am 3 00:00:15,080 --> 00:00:18,079 Speaker 1: Eastern on Apple CarPlay and Android Auto with the Bloomberg 4 00:00:18,120 --> 00:00:21,440 Speaker 1: Business app. Listen on demand wherever you get your podcasts, 5 00:00:21,640 --> 00:00:24,040 Speaker 1: or watch us live on YouTube. 6 00:00:25,120 --> 00:00:27,360 Speaker 2: Let's stay on the EV discussion. We can do that 7 00:00:27,400 --> 00:00:30,840 Speaker 2: with Steve Man, Global Autos and Industrials Research Analyst 8 00:00:30,880 --> 00:00:33,879 Speaker 2: for Bloomberg Intelligence, joining us from Princeton, New Jersey. We got 9 00:00:33,880 --> 00:00:35,760 Speaker 2: a little camera down there, so that is an incentive 10 00:00:35,760 --> 00:00:38,800 Speaker 2: for the folks to stay in Princeton, I guess, Steve Man. 11 00:00:39,120 --> 00:00:42,160 Speaker 2: Don't get me started. Steve Man joins us though. Hey, Steve, 12 00:00:42,240 --> 00:00:45,920 Speaker 2: talk to us about, I guess, President Biden. We're talking 13 00:00:45,920 --> 00:00:51,520 Speaker 2: about tariffs on China EVs and other strategic sectors. What's 14 00:00:51,520 --> 00:00:52,200 Speaker 2: going on there? 15 00:00:53,400 --> 00:00:57,000 Speaker 3: It's no surprise that they're going to do that. In 16 00:00:57,120 --> 00:01:01,600 Speaker 3: terms of the impact on the Chinese automakers, obviously nothing, 17 00:01:01,600 --> 00:01:05,240 Speaker 3: because they don't sell any vehicles to the US at 18 00:01:05,240 --> 00:01:07,959 Speaker 3: the moment. They have no plans to. You know, BYD 19 00:01:08,120 --> 00:01:12,440 Speaker 3: actually explicitly said they have no plans to sell 20 00:01:12,520 --> 00:01:15,720 Speaker 3: cars in the US at the moment. The only impact 21 00:01:15,760 --> 00:01:19,800 Speaker 3: is really on the battery side. But I think what 22 00:01:20,000 --> 00:01:22,479 Speaker 3: the government is trying to do is really to protect 23 00:01:22,880 --> 00:01:26,640 Speaker 3: a lot of investments that have already been made to onshore 24 00:01:27,080 --> 00:01:30,959 Speaker 3: EV battery production. You know, through the Inflation Reduction Act, 25 00:01:31,240 --> 00:01:34,800 Speaker 3: there's over one hundred billion dollars already spent on 26 00:01:34,959 --> 00:01:39,800 Speaker 3: onshoring EV battery and other supply chain components for EVs 27 00:01:40,080 --> 00:01:43,479 Speaker 3: into the US. So I think for automakers 28 00:01:43,520 --> 00:01:46,680 Speaker 3: there's no impact, but it seems like it's 29 00:01:46,680 --> 00:01:51,440 Speaker 3: more about posturing for the election later this year. 30 00:01:51,480 --> 00:01:53,120 Speaker 4: I have so many thoughts and feelings on this, but 31 00:01:53,160 --> 00:01:55,320 Speaker 4: I do think the timing is so interesting. So we 32 00:01:55,400 --> 00:01:58,720 Speaker 4: break the story on the day that Zeekr is going 33 00:01:58,760 --> 00:02:01,520 Speaker 4: public in the US market. I don't know how that 34 00:02:01,600 --> 00:02:05,480 Speaker 4: feels: you can buy stock of a China EV company, 35 00:02:05,600 --> 00:02:08,240 Speaker 4: but you can't buy the China EV car. 36 00:02:08,760 --> 00:02:10,480 Speaker 5: Uh. Is that weird to you?
37 00:02:11,800 --> 00:02:14,840 Speaker 3: Uh, it's probably a good option to have, I think. 38 00:02:15,000 --> 00:02:15,120 Speaker 6: Uh. 39 00:02:15,480 --> 00:02:19,120 Speaker 3: China EV sales are still growing relatively strongly. I mean, 40 00:02:19,120 --> 00:02:23,240 Speaker 3: it's slower this year than last year, but it's, 41 00:02:23,280 --> 00:02:26,160 Speaker 3: you know, no different than what NIO has done, 42 00:02:26,520 --> 00:02:29,239 Speaker 3: what XPeng and Li Auto have done. All these 43 00:02:29,240 --> 00:02:31,960 Speaker 3: companies have listed in the US. I think for the 44 00:02:32,000 --> 00:02:34,960 Speaker 3: most part, the purpose of this listing is really to 45 00:02:35,160 --> 00:02:38,200 Speaker 3: build the brand cachet. You know, they can actually say 46 00:02:38,240 --> 00:02:41,760 Speaker 3: that we're a US-listed company; that goes really well 47 00:02:41,800 --> 00:02:44,920 Speaker 3: with a lot of Chinese consumers. So if you 48 00:02:44,960 --> 00:02:48,919 Speaker 3: look at some of the sales, NIO sales, Li Auto 49 00:02:49,000 --> 00:02:51,919 Speaker 3: and XPeng sales after they listed in the US, they 50 00:02:51,960 --> 00:02:56,680 Speaker 3: actually jumped in their home turf. 51 00:02:56,800 --> 00:02:59,160 Speaker 2: I have to backtrack. I mean, you know, Alex is 52 00:02:59,160 --> 00:03:00,640 Speaker 2: way ahead of me on this. Can you tell me 53 00:03:00,680 --> 00:03:02,680 Speaker 2: what Zeekr is? It's the first time in 54 00:03:02,720 --> 00:03:05,320 Speaker 2: my life I've heard about Zeekr. It's a Chinese- 55 00:03:05,440 --> 00:03:08,480 Speaker 2: manufactured electric car. Correct? That's right. 56 00:03:08,720 --> 00:03:14,960 Speaker 3: It's a premium EV maker. It actually competes with 57 00:03:15,080 --> 00:03:18,920 Speaker 3: Tesla and the NIOs, and they sell 58 00:03:19,000 --> 00:03:22,960 Speaker 3: on average around thirty-seven to forty thousand US dollars 59 00:03:23,560 --> 00:03:28,400 Speaker 3: per vehicle. It's part of Geely, the parent company of 60 00:03:28,480 --> 00:03:32,600 Speaker 3: the listed Geely Auto (175 HK). So this 61 00:03:32,639 --> 00:03:35,480 Speaker 3: is part of Geely. You know, a lot 62 00:03:35,480 --> 00:03:37,520 Speaker 3: of the technology that Zeekr 63 00:03:37,640 --> 00:03:42,080 Speaker 3: has is also shared with Volvo; their head of R 64 00:03:42,120 --> 00:03:44,400 Speaker 3: and D is actually in Gothenburg, Sweden. 65 00:03:44,960 --> 00:03:47,120 Speaker 2: All right. Just for the IPO folks out there, this 66 00:03:47,160 --> 00:03:49,560 Speaker 2: is a company that priced their IPO last night at 67 00:03:49,600 --> 00:03:51,720 Speaker 2: twenty-one dollars a share. We just got a spread; 68 00:03:51,840 --> 00:03:54,720 Speaker 2: it just popped up on my terminal. Bid twenty-three, ask 69 00:03:55,000 --> 00:03:58,120 Speaker 2: is twenty-five. Has not opened yet; that's where the 70 00:03:58,120 --> 00:03:59,680 Speaker 2: bid-ask spread is. We'll keep you up to date 71 00:03:59,800 --> 00:04:02,160 Speaker 2: on that throughout the day.
So again, a cool new 72 00:04:02,240 --> 00:04:05,760 Speaker 2: IPO out there, which is good. Goldman Sachs, Morgan Stanley, 73 00:04:05,800 --> 00:04:08,720 Speaker 2: Bank of America and China International Capital Corporation are the underwriters, 74 00:04:08,800 --> 00:04:11,560 Speaker 2: so some big-time names supporting this company. 75 00:04:11,640 --> 00:04:14,880 Speaker 4: What's interesting is that Geely, the parent company, has 76 00:04:14,920 --> 00:04:17,880 Speaker 4: indicated it will subscribe to more than ninety percent of 77 00:04:17,920 --> 00:04:18,400 Speaker 4: the stock. 78 00:04:18,720 --> 00:04:21,440 Speaker 5: So yeah, so clearly there's still 79 00:04:21,279 --> 00:04:25,960 Speaker 4: wariness that exists in the EV market when that takes place. 80 00:04:26,160 --> 00:04:28,520 Speaker 5: Here is my deep question. It's not very deep, by 81 00:04:28,560 --> 00:04:30,720 Speaker 5: the way. If your goal is to 82 00:04:30,600 --> 00:04:33,479 Speaker 4: decarbonize the world as fast as possible, why would you 83 00:04:33,480 --> 00:04:35,960 Speaker 4: not flood the US market with cheap 84 00:04:36,040 --> 00:04:39,000 Speaker 4: China EV cars? And if that's not the goal, and 85 00:04:39,040 --> 00:04:42,600 Speaker 4: you want to green the economy, but also do it 86 00:04:42,600 --> 00:04:43,960 Speaker 4: profitably for US companies, 87 00:04:44,000 --> 00:04:46,359 Speaker 5: that's okay, just let's say that. I mean, is that... 88 00:04:47,520 --> 00:04:49,120 Speaker 5: Can I say that out loud? Can I say the 89 00:04:49,200 --> 00:04:51,720 Speaker 5: quiet part out loud? Steve? Does this make sense to you? 90 00:04:52,760 --> 00:04:52,960 Speaker 7: Yeah? 91 00:04:53,040 --> 00:04:55,320 Speaker 3: I mean, I think you have a very legitimate question. 92 00:04:56,440 --> 00:05:01,000 Speaker 3: But I think, you know, with the geopolitical 93 00:05:01,240 --> 00:05:05,239 Speaker 3: wrangling between the two countries, you know, that's 94 00:05:05,240 --> 00:05:08,279 Speaker 3: where this comes into play, right? You know, 95 00:05:08,360 --> 00:05:11,400 Speaker 3: do we flood the US? Do we protect the US industry? 96 00:05:11,440 --> 00:05:14,280 Speaker 3: I think especially during this year, when there's 97 00:05:14,320 --> 00:05:18,400 Speaker 3: a lot of, you know, to and fro related 98 00:05:18,440 --> 00:05:21,960 Speaker 3: to the election, I think there's gonna be more and 99 00:05:22,000 --> 00:05:25,599 Speaker 3: more discussion around, you know, do we let 100 00:05:25,720 --> 00:05:30,200 Speaker 3: Chinese products come in, for example, EV batteries and solar panels. 101 00:05:30,640 --> 00:05:34,279 Speaker 3: So it's a very legitimate question. I think the government 102 00:05:34,320 --> 00:05:35,320 Speaker 3: will have to answer that for you. 103 00:05:36,120 --> 00:05:37,760 Speaker 2: All right, Steve, thanks so much for joining us. Steve 104 00:05:37,800 --> 00:05:40,760 Speaker 2: Man, Global Autos and Industrials Research Analyst for Bloomberg Intelligence.
105 00:05:41,480 --> 00:05:43,320 Speaker 2: You know, Alex, I was up in Boston yesterday meeting 106 00:05:43,360 --> 00:05:46,799 Speaker 2: with some of the really smart folks at BCG, Boston 107 00:05:47,120 --> 00:05:49,159 Speaker 2: Consulting Group, and one of the folks I spoke to 108 00:05:49,240 --> 00:05:52,960 Speaker 2: was the head of their EV consulting practice, and he acknowledged 109 00:05:52,960 --> 00:05:56,560 Speaker 2: that the industry, you know, probably put out some inferior products, 110 00:05:57,279 --> 00:05:59,640 Speaker 2: didn't price them well. There's, you know, 111 00:05:59,640 --> 00:06:02,160 Speaker 2: kind of a bumpy start to the EV business. 112 00:06:02,160 --> 00:06:04,200 Speaker 2: But he says, that being said, a lot of good 113 00:06:04,200 --> 00:06:06,200 Speaker 2: product is coming. Oh yeah, they're figuring it out and they 114 00:06:06,200 --> 00:06:10,160 Speaker 2: will lower their cost structure. And he thinks ultimately, you know, 115 00:06:10,320 --> 00:06:12,720 Speaker 2: within a reasonable period of time, whether that's five years, 116 00:06:12,760 --> 00:06:15,039 Speaker 2: ten years, he thinks forty percent of the cars in 117 00:06:15,080 --> 00:06:17,760 Speaker 2: the US will be EVs. Oh interesting. So that's... and 118 00:06:17,800 --> 00:06:20,480 Speaker 2: I've heard that number a little bit more frequently recently 119 00:06:20,480 --> 00:06:22,120 Speaker 2: because people are trying to make sense of whether we could 120 00:06:22,160 --> 00:06:24,480 Speaker 2: ever be at one hundred percent EVs. And I think 121 00:06:24,720 --> 00:06:26,719 Speaker 2: ultimately some people would like you to believe that, but 122 00:06:26,760 --> 00:06:28,760 Speaker 2: that may or may not happen. But I kind of get 123 00:06:28,920 --> 00:06:31,640 Speaker 2: where they're coming from. They're kind of thinking about forty 124 00:06:31,640 --> 00:06:33,640 Speaker 2: percent within that, you know, kind of ten-year time frame. 125 00:06:33,920 --> 00:06:37,479 Speaker 1: I want to think about the implications for electricity production, 126 00:06:37,560 --> 00:06:39,200 Speaker 1: the grid and balancing the grid. 127 00:06:39,680 --> 00:06:43,000 Speaker 5: You were talking to the right people. Yes, they need 128 00:06:43,040 --> 00:06:43,520 Speaker 5: more of it. 129 00:06:43,680 --> 00:06:45,680 Speaker 4: And you want to know why utilities are some of 130 00:06:45,680 --> 00:06:47,479 Speaker 4: the best performing stocks in the S&P this year? 131 00:06:47,560 --> 00:06:50,120 Speaker 5: I mean, they're defensive names. Are they defensive anymore? 132 00:06:50,160 --> 00:06:52,680 Speaker 4: They're growth stocks now. They have actual growth in their 133 00:06:52,720 --> 00:06:54,239 Speaker 4: business because of stuff like EVs. 134 00:06:54,600 --> 00:06:55,640 Speaker 5: I have so many more thoughts. 135 00:06:57,160 --> 00:07:00,720 Speaker 1: You're listening to the Bloomberg Intelligence Podcast. Catch us 136 00:07:00,760 --> 00:07:04,000 Speaker 1: live weekdays at ten am Eastern on Apple CarPlay and 137 00:07:04,160 --> 00:07:07,080 Speaker 1: Android Auto with the Bloomberg Business app. You can also 138 00:07:07,160 --> 00:07:10,360 Speaker 1: listen live on Amazon Alexa from our flagship New York 139 00:07:10,400 --> 00:07:13,760 Speaker 1: station. Just say Alexa, play Bloomberg eleven thirty. 140 00:07:15,720 --> 00:07:18,960 Speaker 4: Joanne Hsu joins us.
She's the University of Michigan Surveys of 141 00:07:19,000 --> 00:07:20,000 Speaker 4: Consumers Director. 142 00:07:20,080 --> 00:07:22,440 Speaker 5: These are her numbers. She does the data. 143 00:07:23,400 --> 00:07:25,920 Speaker 4: Joanne, can you just walk us through that decline of 144 00:07:26,040 --> 00:07:28,240 Speaker 4: US consumer sentiment, right at a six-month low 145 00:07:28,280 --> 00:07:29,440 Speaker 5: now? What led us here? 146 00:07:30,840 --> 00:07:33,120 Speaker 7: So it's not just the worst of both worlds. I 147 00:07:33,160 --> 00:07:35,760 Speaker 7: would say it's worse 148 00:07:35,840 --> 00:07:39,280 Speaker 7: on a number of dimensions. Not only do consumers expect 149 00:07:39,320 --> 00:07:43,680 Speaker 7: inflation to rise a bit in the year ahead, but 150 00:07:43,720 --> 00:07:47,320 Speaker 7: they are also expecting unemployment to worsen, and they're also 151 00:07:47,400 --> 00:07:53,440 Speaker 7: expecting interest rates to rise. And so these are all 152 00:07:53,520 --> 00:07:56,200 Speaker 7: things that had kind of been in a holding pattern 153 00:07:56,240 --> 00:08:00,280 Speaker 7: for most of twenty twenty-four, and suddenly in the 154 00:08:00,280 --> 00:08:03,400 Speaker 7: month of May, consumers are seeing a deterioration on all 155 00:08:03,440 --> 00:08:04,360 Speaker 7: of these dimensions. 156 00:08:04,640 --> 00:08:08,320 Speaker 2: So, Joanne, I'd just like to call out the magnitude here. Again, 157 00:08:08,360 --> 00:08:10,720 Speaker 2: the consensus was seventy-six point two for the University 158 00:08:10,800 --> 00:08:15,560 Speaker 2: of Michigan sentiment indicator, seventy-seven point two last period. It 159 00:08:15,600 --> 00:08:17,880 Speaker 2: came in at sixty-seven point four, and on an 160 00:08:17,960 --> 00:08:20,400 Speaker 2: order of magnitude, I don't remember seeing those types of 161 00:08:20,440 --> 00:08:23,920 Speaker 2: variances in the past. How unusual is this May reading 162 00:08:23,960 --> 00:08:24,240 Speaker 2: to you? 163 00:08:25,520 --> 00:08:28,720 Speaker 7: This is a statistically significant decrease, but I wouldn't 164 00:08:28,760 --> 00:08:31,800 Speaker 7: call it a plummeting or a plunge. It definitely feels 165 00:08:31,840 --> 00:08:35,480 Speaker 7: like that because the last four months, you know, since January, 166 00:08:35,559 --> 00:08:38,800 Speaker 7: we've been only seeing little wiggles of one, two points 167 00:08:39,240 --> 00:08:42,920 Speaker 7: per month. Essentially, we've had no change for four months, 168 00:08:43,120 --> 00:08:45,960 Speaker 7: and prior to that we had two months of really 169 00:08:46,000 --> 00:08:50,200 Speaker 7: strong surges. So this is the first real downward movement 170 00:08:50,360 --> 00:08:53,640 Speaker 7: we've seen in quite a bit of time. So I 171 00:08:53,679 --> 00:08:56,120 Speaker 7: think it is jarring to people who watch the data, 172 00:08:56,200 --> 00:08:59,440 Speaker 7: but it's not a plunge. It is a decline. It 173 00:08:59,480 --> 00:09:01,680 Speaker 7: is a significant decline, but it's not a plunge. 174 00:09:01,760 --> 00:09:05,319 Speaker 4: You mentioned that those surveyed are worried about interest rates, 175 00:09:05,559 --> 00:09:07,439 Speaker 4: and they're worried that interest rates are going to rise.
176 00:09:07,679 --> 00:09:10,000 Speaker 4: Am I hearing that correctly? Not that the Fed is 177 00:09:10,040 --> 00:09:12,280 Speaker 4: going to keep them steady or not cut, but they 178 00:09:12,320 --> 00:09:14,160 Speaker 4: expect higher interest rates? 179 00:09:14,840 --> 00:09:17,560 Speaker 7: Well, more people expect higher interest rates than last month. 180 00:09:17,760 --> 00:09:22,240 Speaker 7: Fewer people expect interest rates to fall. So I believe 181 00:09:22,280 --> 00:09:25,240 Speaker 7: only about a quarter of consumers expect interest rates 182 00:09:25,280 --> 00:09:29,720 Speaker 7: to fall in the year ahead. So overall, consumer expectations 183 00:09:29,720 --> 00:09:32,080 Speaker 7: over interest rates are worse than they were last month. 184 00:09:32,960 --> 00:09:35,200 Speaker 2: Joanne, thank you so much for joining us. Really appreciate 185 00:09:35,240 --> 00:09:39,360 Speaker 2: you hopping on there. Joanne Hsu, Surveys of Consumers Director for 186 00:09:39,800 --> 00:09:42,400 Speaker 2: the University of Michigan. 187 00:09:43,920 --> 00:09:47,800 Speaker 1: You're listening to the Bloomberg Intelligence podcast. Catch us live 188 00:09:47,880 --> 00:09:50,520 Speaker 1: weekdays at ten am Eastern on Apple CarPlay and 189 00:09:50,640 --> 00:09:53,560 Speaker 1: Android Auto with the Bloomberg Business app. Listen on demand 190 00:09:53,640 --> 00:09:57,280 Speaker 1: wherever you get your podcasts, or watch us live on YouTube. 191 00:09:59,040 --> 00:10:01,040 Speaker 4: You have the S&P just up by two-tenths 192 00:10:01,040 --> 00:10:03,600 Speaker 4: of one percent, so we're definitely off the highs, but still. 193 00:10:03,800 --> 00:10:07,160 Speaker 4: Kim Forrest, founder and CIO of Bokeh Capital Partners, joins us now. 194 00:10:07,280 --> 00:10:11,160 Speaker 4: Kim, help me understand what's happening: survey data, ISM services, 195 00:10:11,360 --> 00:10:14,440 Speaker 4: NFIB, UMich pointing to things that are 196 00:10:14,480 --> 00:10:18,600 Speaker 4: not that good. Equity market still near record highs, corporate profits 197 00:10:18,640 --> 00:10:21,040 Speaker 4: high, earnings holding up really, really well. 198 00:10:21,600 --> 00:10:22,840 Speaker 5: What gives? 199 00:10:24,800 --> 00:10:25,160 Speaker 6: Well, 200 00:10:25,280 --> 00:10:28,839 Speaker 8: I would say that the investors are looking past this 201 00:10:29,040 --> 00:10:33,840 Speaker 8: information and they are planning for a longer time period 202 00:10:33,920 --> 00:10:34,760 Speaker 8: than the consumer. 203 00:10:34,960 --> 00:10:35,800 Speaker 6: We'll put it that way. 204 00:10:36,760 --> 00:10:38,920 Speaker 5: But does that mean that... So how do you trade 205 00:10:38,920 --> 00:10:39,280 Speaker 5: on that? 206 00:10:39,520 --> 00:10:42,680 Speaker 4: Like, do you trust the underlying data and say, look, 207 00:10:42,840 --> 00:10:44,959 Speaker 4: I mean the survey data, and say, okay, maybe we're 208 00:10:44,960 --> 00:10:46,960 Speaker 4: going to hit some speed bumps, let me protect myself 209 00:10:46,960 --> 00:10:47,200 Speaker 4: for that? 210 00:10:47,679 --> 00:10:48,480 Speaker 5: Or do you say it 211 00:10:48,400 --> 00:10:52,520 Speaker 4: doesn't matter, because these themes, momentum, in effect a rotation, 212 00:10:52,360 --> 00:10:54,480 Speaker 5: or AI, all of that is just going to outweigh it? 213 00:10:56,800 --> 00:10:58,240 Speaker 6: I think it's a little bit of both.
214 00:10:58,480 --> 00:11:01,320 Speaker 8: First, I think that the consumer once again is 215 00:11:01,400 --> 00:11:04,920 Speaker 8: being shocked by things they do pretty much at least 216 00:11:05,080 --> 00:11:07,600 Speaker 8: once a week, if not more often, and that's filling 217 00:11:07,679 --> 00:11:11,360 Speaker 8: up their car or truck with gasoline. And the second 218 00:11:11,360 --> 00:11:14,440 Speaker 8: thing they do is go to the grocery store or 219 00:11:14,520 --> 00:11:18,959 Speaker 8: a restaurant, and prices are just crazy. So everybody has 220 00:11:19,080 --> 00:11:23,560 Speaker 8: recency bias, and I believe strongly that food prices in 221 00:11:23,640 --> 00:11:27,920 Speaker 8: particular and gas are just, you know, demoralizing. 222 00:11:27,960 --> 00:11:30,120 Speaker 6: Have you gone to the grocery store lately? I have. 223 00:11:30,559 --> 00:11:34,480 Speaker 6: It is, you know, it's just crazy. 224 00:11:34,559 --> 00:11:38,440 Speaker 8: You just can't believe the basket of groceries costs 225 00:11:38,520 --> 00:11:41,480 Speaker 8: that much, right? That's the first thing. And I 226 00:11:41,600 --> 00:11:46,440 Speaker 8: do think that we have great expectations, probably deservedly so, 227 00:11:46,720 --> 00:11:50,800 Speaker 8: although not on what I think Wall Street's timeline is, 228 00:11:51,200 --> 00:11:55,560 Speaker 8: for AI to increase productivity. It's been a decade or 229 00:11:55,559 --> 00:11:59,000 Speaker 8: more since we've had something, a technology, that we think 230 00:11:59,040 --> 00:12:02,319 Speaker 8: could deliver productivity in 231 00:12:02,280 --> 00:12:03,320 Speaker 6: a meaningful way. 232 00:12:04,559 --> 00:12:09,560 Speaker 8: And I think that's what has captured investors' imagination around AI. 233 00:12:10,200 --> 00:12:13,079 Speaker 2: And Kim, I was up in Boston visiting the 234 00:12:13,120 --> 00:12:17,880 Speaker 2: Boston Consulting Group yesterday, and the senior consultants there, 235 00:12:17,920 --> 00:12:21,559 Speaker 2: you ask them, what are your CEO clients asking you about? 236 00:12:21,600 --> 00:12:26,800 Speaker 2: And to a person: AI. I mean, it's just amazing. 237 00:12:26,840 --> 00:12:29,440 Speaker 2: So, Kim, as an investor, and as you talk to 238 00:12:29,440 --> 00:12:31,960 Speaker 2: your clients and as you talk to other money managers, 239 00:12:32,480 --> 00:12:36,599 Speaker 2: how do you suggest people really get a fundamental exposure 240 00:12:36,720 --> 00:12:38,800 Speaker 2: to AI? Do they just run out and hold their 241 00:12:38,840 --> 00:12:41,760 Speaker 2: nose and buy Nvidia? Or how do you recommend 242 00:12:41,800 --> 00:12:43,760 Speaker 2: that investors kind of get some exposure here? 243 00:12:44,920 --> 00:12:46,960 Speaker 8: Well, there are going to be people that want to 244 00:12:47,000 --> 00:12:50,360 Speaker 8: own Nvidia, because, again, I love the concept of 245 00:12:50,440 --> 00:12:53,199 Speaker 8: recency bias: it's gone up a lot, so it 246 00:12:53,240 --> 00:12:56,559 Speaker 8: should keep going up, you know, forever and ever. And 247 00:12:56,920 --> 00:12:59,640 Speaker 8: I think that's true to some extent.
But we are 248 00:12:59,720 --> 00:13:02,760 Speaker 8: probably going to hit a pothole where they miss 249 00:13:02,800 --> 00:13:06,679 Speaker 8: expectations, and that's just because of the, and I'm getting 250 00:13:06,679 --> 00:13:09,000 Speaker 8: in the weeds here, but go with me, the data 251 00:13:09,040 --> 00:13:11,400 Speaker 8: center build-out. I don't care if it's for regular 252 00:13:11,480 --> 00:13:14,640 Speaker 8: data centers or AI. You reach a point at which 253 00:13:14,640 --> 00:13:17,480 Speaker 8: you stop spending and you use the technology you have 254 00:13:17,600 --> 00:13:20,199 Speaker 8: in a building for a while, and then you add more. 255 00:13:20,559 --> 00:13:23,320 Speaker 8: So we're at the beginning of the build-out of AI, 256 00:13:23,520 --> 00:13:27,720 Speaker 8: so everybody wants to own Nvidia. I think AMD 257 00:13:27,840 --> 00:13:30,280 Speaker 8: and Intel are fine companies that are going to be 258 00:13:30,320 --> 00:13:32,800 Speaker 8: able to participate in this, and they are going to 259 00:13:33,040 --> 00:13:37,520 Speaker 8: push Nvidia to probably sell their product at a 260 00:13:37,559 --> 00:13:42,319 Speaker 8: somewhat lower margin. And I think that both Intel and 261 00:13:42,400 --> 00:13:46,079 Speaker 8: AMD can play in this space. So those are some ideas. 262 00:13:46,480 --> 00:13:50,120 Speaker 8: I also think people should look further and keep their 263 00:13:50,120 --> 00:13:53,120 Speaker 8: eye out for good software that is going to answer 264 00:13:53,200 --> 00:13:56,440 Speaker 8: some problems. And what I mean by that is that 265 00:13:56,480 --> 00:14:00,840 Speaker 8: it's not entertainment only; that, you know, those CEOs would 266 00:14:00,920 --> 00:14:06,120 Speaker 8: want to buy it to enhance their productivity. And that, 267 00:14:06,240 --> 00:14:08,960 Speaker 8: I think, is the surest way to having a happy 268 00:14:09,040 --> 00:14:11,880 Speaker 8: portfolio in five to seven years. 269 00:14:12,480 --> 00:14:14,520 Speaker 4: So, huh, a new thing in five to seven years. 270 00:14:14,520 --> 00:14:15,719 Speaker 4: And I'm going to ask you about the next two 271 00:14:15,720 --> 00:14:17,880 Speaker 4: and a half weeks. But we do get Nvidia earnings 272 00:14:17,880 --> 00:14:20,880 Speaker 4: on May twenty-second. I was talking to Amy Wu Silverman 273 00:14:20,920 --> 00:14:23,440 Speaker 4: of RBC. She runs all their derivative strategy, and she 274 00:14:23,520 --> 00:14:25,480 Speaker 4: was saying this is the first time that they haven't 275 00:14:25,520 --> 00:14:29,560 Speaker 4: seen a lot of sentiment upside going into Nvidia. There's 276 00:14:29,560 --> 00:14:33,080 Speaker 4: not the exuberance that we're used to seeing. What do 277 00:14:33,120 --> 00:14:34,800 Speaker 4: you think... how do you think the market's 278 00:14:34,800 --> 00:14:37,520 Speaker 4: going to react and take to Nvidia's earnings? 279 00:14:40,280 --> 00:14:40,960 Speaker 6: Well, they're going 280 00:14:40,960 --> 00:14:43,480 Speaker 8: to do what Wall Street does, which is overreact in 281 00:14:43,520 --> 00:14:46,080 Speaker 8: the short term, whichever way that is, and then get 282 00:14:46,080 --> 00:14:48,760 Speaker 8: it right in the long term. I mean, I hate 283 00:14:48,760 --> 00:14:51,880 Speaker 8: to be so perfunctory about it, but that is the 284 00:14:51,960 --> 00:14:54,680 Speaker 8: last twenty-five years of my life.
You know, 285 00:14:54,720 --> 00:14:57,000 Speaker 8: you get that: if it is higher than expected, you're 286 00:14:57,000 --> 00:15:00,280 Speaker 8: going to get a huge sugar rush upwards. If it misses a 287 00:15:00,320 --> 00:15:03,480 Speaker 8: little bit, you're going to get a selloff, because 288 00:15:03,520 --> 00:15:06,440 Speaker 8: there are people that are expecting this thing to just 289 00:15:06,560 --> 00:15:09,640 Speaker 8: keep marching up straight, you know, that forty-five degree 290 00:15:09,680 --> 00:15:10,480 Speaker 8: angle to the right. 291 00:15:10,680 --> 00:15:10,880 Speaker 6: Right. 292 00:15:12,800 --> 00:15:15,440 Speaker 8: But I don't think you should give up hope. This 293 00:15:15,600 --> 00:15:17,720 Speaker 8: is just what markets do in the short term. 294 00:15:18,320 --> 00:15:22,960 Speaker 4: So you mentioned AMD and Intel. What else? What other 295 00:15:23,000 --> 00:15:26,200 Speaker 4: stocks do you like to play this trend? You mentioned Synopsys; 296 00:15:26,520 --> 00:15:28,480 Speaker 4: SNPS is the ticker. 297 00:15:28,560 --> 00:15:29,400 Speaker 5: What's that company? 298 00:15:30,520 --> 00:15:36,000 Speaker 8: Well, it's for people that design chips, and we're going 299 00:15:36,080 --> 00:15:39,440 Speaker 8: to need not just the AI chips, but to get 300 00:15:39,680 --> 00:15:42,400 Speaker 8: real productivity, we're going to have to have armies of 301 00:15:43,120 --> 00:15:46,600 Speaker 8: Internet of Things devices, and those will be robots, those will 302 00:15:46,680 --> 00:15:49,360 Speaker 8: be sensors, and they're all going to be talking 303 00:15:49,640 --> 00:15:52,760 Speaker 8: back to the main computer and have the main computer 304 00:15:52,880 --> 00:15:53,640 Speaker 8: tell them what to 305 00:15:53,520 --> 00:15:56,800 Speaker 6: do, or at least gather that information. So there are 306 00:15:56,800 --> 00:15:57,400 Speaker 6: going to have to 307 00:15:57,320 --> 00:16:01,960 Speaker 8: be more chips made and improvements to chips made. So 308 00:16:02,200 --> 00:16:07,120 Speaker 8: that is how Synopsys benefits: from companies having to 309 00:16:07,120 --> 00:16:12,360 Speaker 8: buy additional licenses or additional capabilities to its products. So 310 00:16:13,360 --> 00:16:16,760 Speaker 8: I'm a strong believer. There's another company called Cadence. They 311 00:16:16,840 --> 00:16:20,320 Speaker 8: are also in the same area, but we like Synopsys 312 00:16:20,560 --> 00:16:23,479 Speaker 8: because of its breadth of product offerings. 313 00:16:24,280 --> 00:16:27,720 Speaker 4: What about... what do you make of the consumer space? 314 00:16:27,760 --> 00:16:30,720 Speaker 4: Do you like anything in the consumer space? The 315 00:16:30,760 --> 00:16:34,440 Speaker 4: read-through from earnings has been really confusing, like Starbucks and 316 00:16:34,560 --> 00:16:37,360 Speaker 4: McDonald's warning on the consumer, but then Dutch Bros. just 317 00:16:37,800 --> 00:16:39,680 Speaker 4: crushed it yesterday. 318 00:16:39,720 --> 00:16:40,440 Speaker 5: What are we learning? 319 00:16:41,960 --> 00:16:42,720 Speaker 6: We're learning we 320 00:16:42,680 --> 00:16:46,400 Speaker 8: have a very picky consumer, as we should, right? I mean, 321 00:16:46,440 --> 00:16:49,120 Speaker 8: I'm a picky consumer. I'm thinking you're a picky consumer.
322 00:16:49,600 --> 00:16:51,800 Speaker 8: I might get an eye roll on this one, but 323 00:16:51,960 --> 00:16:56,200 Speaker 8: I really love companies like Urban Outfitters that know that 324 00:16:56,280 --> 00:17:02,240 Speaker 8: they can satisfy this really picky base of a customer set. 325 00:17:02,280 --> 00:17:05,199 Speaker 8: They're not trying to sell product to everybody. They know 326 00:17:05,280 --> 00:17:08,359 Speaker 8: who their consumer is, and they have been, through the 327 00:17:08,440 --> 00:17:12,280 Speaker 8: decades, able to delight and surprise this consumer, to get 328 00:17:12,320 --> 00:17:14,080 Speaker 8: her to open her wallet. And it is a her 329 00:17:14,440 --> 00:17:17,680 Speaker 8: in their case, to get her to open the wallet and 330 00:17:17,720 --> 00:17:21,320 Speaker 8: to spend for the new looks. And I think you 331 00:17:21,480 --> 00:17:24,920 Speaker 8: need to know... you don't want 332 00:17:24,920 --> 00:17:29,400 Speaker 8: to buy mass market. You want to buy narrow, niche 333 00:17:29,480 --> 00:17:32,720 Speaker 8: kinds of products from companies that do it well. 334 00:17:33,000 --> 00:17:35,879 Speaker 4: So interesting. It's like a very different story than just 335 00:17:35,920 --> 00:17:36,840 Speaker 4: even a few years ago, right? 336 00:17:36,920 --> 00:17:40,119 Speaker 5: So what other... yeah, go ahead. 337 00:17:40,600 --> 00:17:41,199 Speaker 6: Yeah. 338 00:17:41,040 --> 00:17:44,080 Speaker 8: I think that they do well in any kind of 339 00:17:44,200 --> 00:17:49,680 Speaker 8: market because they have this small consumer base 340 00:17:49,880 --> 00:17:53,680 Speaker 8: that has money and they want to look fresh and new, 341 00:17:53,840 --> 00:17:57,200 Speaker 8: and that is what their secret is. They keep inventing 342 00:17:57,320 --> 00:18:00,400 Speaker 8: and finding those designers and finding those shapes we don't 343 00:18:00,400 --> 00:18:01,200 Speaker 8: have in our closet. 344 00:18:01,960 --> 00:18:04,080 Speaker 5: Do you... do you shop at Urban Outfitters? 345 00:18:04,640 --> 00:18:05,199 Speaker 6: Not really. 346 00:18:05,320 --> 00:18:07,040 Speaker 8: I'm kind of a hippie, but not that much of 347 00:18:07,080 --> 00:18:08,800 Speaker 8: a hippie, but I get what they 348 00:18:08,760 --> 00:18:09,520 Speaker 6: do, you know. 349 00:18:10,680 --> 00:18:11,119 Speaker 5: Yeah, I know. 350 00:18:11,160 --> 00:18:13,840 Speaker 9: I was like, you're extending this into a fashion discussion, 351 00:18:13,960 --> 00:18:14,840 Speaker 9: you know. Yeah, and... 352 00:18:15,040 --> 00:18:15,760 Speaker 6: Yeah, we do that. 353 00:18:16,040 --> 00:18:16,480 Speaker 5: We do that. 354 00:18:17,040 --> 00:18:19,320 Speaker 4: I've known Kim for a long time, and inevitably the 355 00:18:19,480 --> 00:18:22,160 Speaker 4: end of our interview always goes to fashion. We've had 356 00:18:22,160 --> 00:18:24,480 Speaker 4: this, like, hope and dream that one day we'll actually 357 00:18:24,520 --> 00:18:26,120 Speaker 4: go shopping in New York together, but that has yet 358 00:18:26,160 --> 00:18:26,800 Speaker 4: to materialize. 359 00:18:26,840 --> 00:18:29,320 Speaker 6: Yes. Oh my gosh, I can't wait. 360 00:18:29,880 --> 00:18:32,000 Speaker 4: Come on, I'm across the street from Bloomingdale's. We'll hit the 361 00:18:32,000 --> 00:18:34,840 Speaker 4: sales, the sample stores. Okay, Kim, I'll let you go get 362 00:18:34,880 --> 00:18:37,240 Speaker 4: back to your real job.
Kim Forrest, founder and CIO 363 00:18:37,359 --> 00:18:41,000 Speaker 4: of Bokeh Capital Partners, standing by. 364 00:18:41,240 --> 00:18:45,160 Speaker 1: You're listening to the Bloomberg Intelligence Podcast. Catch us live 365 00:18:45,240 --> 00:18:48,280 Speaker 1: weekdays at ten am Eastern on Apple CarPlay and 366 00:18:48,280 --> 00:18:51,200 Speaker 1: Android Auto with the Bloomberg Business app. You can also 367 00:18:51,280 --> 00:18:54,760 Speaker 1: listen live on Amazon Alexa from our flagship New York station. 368 00:18:55,119 --> 00:18:58,879 Speaker 1: Just say Alexa, play Bloomberg eleven thirty. 369 00:18:59,600 --> 00:19:04,120 Speaker 4: Here's a super interesting report that came out earlier in the week. 370 00:19:04,440 --> 00:19:07,880 Speaker 4: You know the story, right? The great resignation. People left 371 00:19:07,960 --> 00:19:12,760 Speaker 4: jobs during COVID, they got new jobs. Worker satisfaction isn't great, 372 00:19:12,840 --> 00:19:17,320 Speaker 4: particularly among women. And yet, is the grass greener? It's 373 00:19:17,320 --> 00:19:19,679 Speaker 4: something that, if you've ever contemplated leaving your job for 374 00:19:19,720 --> 00:19:22,639 Speaker 4: something else, you constantly think about: are there problems 375 00:19:22,680 --> 00:19:24,680 Speaker 4: at every workplace, or am I the problem and 376 00:19:24,720 --> 00:19:27,000 Speaker 4: I'm bringing it with me? Well, now we may have 377 00:19:27,200 --> 00:19:30,879 Speaker 4: some answers to this. Allan Schweyer is principal researcher of 378 00:19:30,960 --> 00:19:34,199 Speaker 4: human capital over at the Conference Board, and they had 379 00:19:34,240 --> 00:19:36,920 Speaker 4: a survey out that said the workers who jumped ship 380 00:19:37,040 --> 00:19:41,200 Speaker 4: during COVID are now regretting it. Allan, thanks for coming. 381 00:19:41,240 --> 00:19:43,480 Speaker 4: This is great. I love this story. Can you walk 382 00:19:43,520 --> 00:19:44,560 Speaker 4: us through your findings? 383 00:19:45,920 --> 00:19:46,120 Speaker 2: Yeah? 384 00:19:46,200 --> 00:19:49,879 Speaker 10: Sure. So that one specifically was a big surprise to 385 00:19:50,000 --> 00:19:52,200 Speaker 10: us, because just last year when we ran the survey, 386 00:19:52,280 --> 00:19:56,520 Speaker 10: or in twenty twenty-two, I should say, it was completely opposite 387 00:19:56,600 --> 00:20:01,600 Speaker 10: to what we found this time. The measure of job satisfaction 388 00:20:01,800 --> 00:20:05,760 Speaker 10: was up amongst those who switched jobs. But this year, 389 00:20:06,600 --> 00:20:10,600 Speaker 10: overall job satisfaction was down by over five percentage points, 390 00:20:11,000 --> 00:20:13,280 Speaker 10: and it was down across most of the twenty-six 391 00:20:13,320 --> 00:20:16,480 Speaker 10: components of satisfaction that we measure. So a big swing. 392 00:20:16,680 --> 00:20:20,560 Speaker 10: We agree with your summary that, you know, 393 00:20:20,640 --> 00:20:24,080 Speaker 10: sometimes the grass isn't necessarily greener on the other side, 394 00:20:24,560 --> 00:20:27,720 Speaker 10: and it could be for a range of factors that this 395 00:20:27,880 --> 00:20:32,120 Speaker 10: is happening. But we definitely see that this year those 396 00:20:32,160 --> 00:20:36,200 Speaker 10: who have switched jobs during the pandemic are less satisfied 397 00:20:36,240 --> 00:20:37,159 Speaker 10: than those who stayed.
398 00:20:37,840 --> 00:20:40,440 Speaker 2: Do we know what was the primary reason people did 399 00:20:40,480 --> 00:20:43,600 Speaker 2: switch jobs during the pandemic? Was it any different than other times, 400 00:20:43,600 --> 00:20:46,119 Speaker 2: which is, you know, maybe higher pay or just a 401 00:20:46,119 --> 00:20:46,959 Speaker 2: better opportunity? 402 00:20:48,119 --> 00:20:52,199 Speaker 10: Yeah, there were really pay increases during the pandemic that 403 00:20:52,240 --> 00:20:55,840 Speaker 10: we hadn't seen for well over a decade. That's fairly 404 00:20:55,880 --> 00:20:59,359 Speaker 10: well documented too, that some workers were receiving twenty, thirty 405 00:20:59,359 --> 00:21:04,000 Speaker 10: percent better offers just for switching jobs. And we think 406 00:21:04,040 --> 00:21:07,159 Speaker 10: that, you know, maybe some people took those offers without 407 00:21:07,400 --> 00:21:10,480 Speaker 10: really thinking, you know, about the other elements that lead 408 00:21:10,520 --> 00:21:13,080 Speaker 10: to happiness at work and contentment at work, like who 409 00:21:13,080 --> 00:21:15,879 Speaker 10: you work with and whether the organization is going to 410 00:21:15,920 --> 00:21:20,040 Speaker 10: invest in your career, career growth and many other things. 411 00:21:20,280 --> 00:21:23,280 Speaker 10: So they went for the money, maybe, and now they're 412 00:21:23,320 --> 00:21:24,320 Speaker 10: starting to regret it. 413 00:21:24,720 --> 00:21:27,680 Speaker 4: I mean, it's so true. Like, the older I get, 414 00:21:27,720 --> 00:21:31,119 Speaker 4: the more that I try and really distill down what 415 00:21:31,320 --> 00:21:34,080 Speaker 4: makes me happy day to day, and it's so helpful. 416 00:21:34,080 --> 00:21:35,399 Speaker 4: I mean, it took me a long time to figure 417 00:21:35,400 --> 00:21:37,240 Speaker 4: that out, but it's so helpful in understanding it, and 418 00:21:37,280 --> 00:21:40,120 Speaker 4: sometimes staying put is the answer. You had a great 419 00:21:40,119 --> 00:21:42,240 Speaker 4: point that says once an employee hits a 420 00:21:42,359 --> 00:21:45,360 Speaker 4: three-year mark, satisfaction increases substantially. 421 00:21:45,440 --> 00:21:46,120 Speaker 5: Talk us through that. 422 00:21:47,280 --> 00:21:49,920 Speaker 10: Yeah, that's also a difference from what we've seen, not 423 00:21:49,960 --> 00:21:52,119 Speaker 10: only in our own research, but there's a good amount 424 00:21:52,160 --> 00:21:54,920 Speaker 10: of research out there that shows that usually it's people 425 00:21:54,920 --> 00:21:57,840 Speaker 10: who have shorter tenure: they start a new job, 426 00:21:57,880 --> 00:22:02,520 Speaker 10: they're happy, there's a honeymoon period, and their engagement and 427 00:22:02,560 --> 00:22:06,600 Speaker 10: their satisfaction levels are usually higher than people with more tenure. 428 00:22:06,640 --> 00:22:10,200 Speaker 10: But this year, in our satisfaction survey, we saw the opposite, 429 00:22:10,320 --> 00:22:13,280 Speaker 10: and that goes with the finding that job switchers are 430 00:22:13,359 --> 00:22:16,280 Speaker 10: less satisfied than job stayers, maybe for some of the 431 00:22:16,320 --> 00:22:19,320 Speaker 10: same reasons. Those who switch jobs...
They're the people who 432 00:22:19,320 --> 00:22:22,080 Speaker 10: are in the positions with less than three years' tenure, 433 00:22:22,640 --> 00:22:25,600 Speaker 10: and they're seeing that, yeah, things aren't quite as good 434 00:22:25,640 --> 00:22:29,840 Speaker 10: as they might have hoped, even if their wages are 435 00:22:29,920 --> 00:22:34,360 Speaker 10: higher than if they had stayed. So that's really good 436 00:22:34,400 --> 00:22:39,119 Speaker 10: reinforcement that those two elements that we examine in this 437 00:22:39,240 --> 00:22:41,520 Speaker 10: report really complement each other. 438 00:22:42,000 --> 00:22:44,160 Speaker 2: And, Allan, your survey kind of goes to an issue 439 00:22:44,200 --> 00:22:46,240 Speaker 2: Alex and I were just talking about today. The least 440 00:22:46,280 --> 00:22:50,879 Speaker 2: satisfied group is fully on-site workers. The hybrid model 441 00:22:51,119 --> 00:22:54,919 Speaker 2: wins the day. I guess the hybrid model, that is 442 00:22:55,080 --> 00:22:57,439 Speaker 2: the new normal for the foreseeable future, isn't it? 443 00:22:58,560 --> 00:23:01,800 Speaker 10: Yeah, I think it is. I think employers, or most 444 00:23:01,800 --> 00:23:04,800 Speaker 10: employers, though not all, see the benefits of the hybrid model. 445 00:23:05,280 --> 00:23:08,160 Speaker 10: You get the best of both worlds if it's done right. 446 00:23:09,600 --> 00:23:11,960 Speaker 10: Some employees who can work from home may be 447 00:23:11,960 --> 00:23:15,240 Speaker 10: more productive at home for some types of work, but 448 00:23:15,280 --> 00:23:18,639 Speaker 10: when it comes to collaboration, building trust and relationships, and 449 00:23:18,640 --> 00:23:21,919 Speaker 10: innovation, they're at work for several days a week. So 450 00:23:22,240 --> 00:23:23,920 Speaker 10: if they do it right, they're getting the best of 451 00:23:23,960 --> 00:23:27,480 Speaker 10: both worlds. But what's really interesting is that 452 00:23:27,520 --> 00:23:30,159 Speaker 10: workers see this too. You know, even though we 453 00:23:30,200 --> 00:23:32,360 Speaker 10: see sometimes that workers say, I'd love to work five 454 00:23:32,440 --> 00:23:36,320 Speaker 10: days remote, maybe they also see the benefits of being 455 00:23:36,320 --> 00:23:38,160 Speaker 10: with their colleagues two or three days a week. 456 00:23:38,440 --> 00:23:40,360 Speaker 4: There's literally no bone in my body that ever wants 457 00:23:40,400 --> 00:23:42,280 Speaker 4: to work at home again. Like, the eight weeks during 458 00:23:42,320 --> 00:23:44,159 Speaker 4: COVID with a camera set up in my living room was 459 00:23:44,200 --> 00:23:47,840 Speaker 4: more than enough. Are we noticing any distinction in terms 460 00:23:47,840 --> 00:23:50,520 Speaker 4: of the types of jobs we're talking about, in terms 461 00:23:50,520 --> 00:23:54,480 Speaker 4: of, say, you know, women versus men or any ethnicity? 462 00:23:56,760 --> 00:24:00,399 Speaker 10: We did see big differences with women. For the seventh 463 00:24:00,440 --> 00:24:04,040 Speaker 10: year in a row, women are significantly less satisfied than men, 464 00:24:04,160 --> 00:24:08,000 Speaker 10: and that gap grew between twenty twenty-two and twenty 465 00:24:08,040 --> 00:24:13,080 Speaker 10: twenty-three.
And, you know, the main reasons or 466 00:24:13,200 --> 00:24:19,240 Speaker 10: drivers of that dissatisfaction are around pay, whether that's wages 467 00:24:19,359 --> 00:24:26,080 Speaker 10: or bonuses or even promotions, and also around the flexibility 468 00:24:26,119 --> 00:24:28,879 Speaker 10: of work. So by that I mean some of the 469 00:24:28,920 --> 00:24:32,959 Speaker 10: benefits like vacation and family leave. And it's not too surprising, 470 00:24:33,000 --> 00:24:35,800 Speaker 10: because we see that even in twenty twenty-four women 471 00:24:35,840 --> 00:24:38,040 Speaker 10: are making eighty-two point five cents on the dollar 472 00:24:38,080 --> 00:24:40,880 Speaker 10: versus men, and that most of the burden of care, 473 00:24:41,000 --> 00:24:44,520 Speaker 10: whether it's for kids or for parents, and most 474 00:24:44,520 --> 00:24:47,320 Speaker 10: of the household work still falls on women. So this 475 00:24:47,440 --> 00:24:50,040 Speaker 10: is really reflected in the results. And for the past seven 476 00:24:50,080 --> 00:24:53,520 Speaker 10: years we did not collect data on race. We are 477 00:24:53,560 --> 00:24:56,800 Speaker 10: contemplating collecting data on race and ethnicity next year. 478 00:24:57,880 --> 00:25:01,399 Speaker 2: Hey, Allan, how important are some of the, I don't know, 479 00:25:01,440 --> 00:25:06,440 Speaker 2: the fringe perks, whether it's a foosball table, food, you know, 480 00:25:06,600 --> 00:25:09,600 Speaker 2: a coffee machine? These are things that really came to the forefront 481 00:25:09,680 --> 00:25:12,720 Speaker 2: in the pandemic as employers tried to lure employees 482 00:25:12,720 --> 00:25:14,200 Speaker 2: back to the office. It seems like a lot of 483 00:25:14,240 --> 00:25:16,199 Speaker 2: companies made a lot of investments in that kind of stuff. 484 00:25:17,160 --> 00:25:20,680 Speaker 10: Yeah, I think you're right. We don't measure that specifically. 485 00:25:20,720 --> 00:25:23,359 Speaker 10: We don't have a measure for sort of some of 486 00:25:23,359 --> 00:25:26,639 Speaker 10: those extra perks that you're talking about. But we know, 487 00:25:27,080 --> 00:25:30,320 Speaker 10: you know, that organizations are doing their best 488 00:25:30,400 --> 00:25:33,720 Speaker 10: to lure people to the office, and they're trying not 489 00:25:33,840 --> 00:25:38,320 Speaker 10: to command people to the office, with those extra perks. 490 00:25:38,760 --> 00:25:41,639 Speaker 10: And it's probably a good idea. Whether it has as 491 00:25:41,720 --> 00:25:44,800 Speaker 10: much impact as a good health plan, probably not, or 492 00:25:44,880 --> 00:25:48,240 Speaker 10: even a good, strong, you know, talent culture where people 493 00:25:48,280 --> 00:25:52,440 Speaker 10: collaborate and respect each other, probably not. But some 494 00:25:52,480 --> 00:25:55,720 Speaker 10: of those things definitely help to entice people back to 495 00:25:55,760 --> 00:25:56,440 Speaker 10: the office. 496 00:25:56,880 --> 00:25:59,520 Speaker 4: Just give me my coffee, you know. Like, I definitely don't care 497 00:25:59,640 --> 00:26:02,440 Speaker 4: about, slash, want a foosball table. Allan, thanks a lot. Really 498 00:26:02,480 --> 00:26:04,960 Speaker 4: interesting stuff. Great to sort of break all of that down. 499 00:26:05,160 --> 00:26:09,480 Speaker 4: Allan Schweyer, principal researcher, human capital at the Conference Board, 500 00:26:09,560 --> 00:26:12,560 Speaker 4: joining us from Savannah, Georgia.
I mean, none of this 501 00:26:12,640 --> 00:26:15,359 Speaker 4: surprises me. But I also wonder if our tolerance is 502 00:26:15,480 --> 00:26:20,240 Speaker 4: changing what work-life balance means, in a very different way. 503 00:26:20,400 --> 00:26:23,159 Speaker 2: Yes, I think that has changed for a 504 00:26:23,240 --> 00:26:26,359 Speaker 2: lot of people, and they express it in different ways, you know: 505 00:26:26,440 --> 00:26:30,960 Speaker 2: leaving the workforce, only working places that have hybrid, 506 00:26:31,080 --> 00:26:33,480 Speaker 2: all that kind of stuff. So yeah, then there are 507 00:26:33,480 --> 00:26:35,560 Speaker 2: the special people like us who just come back every 508 00:26:35,600 --> 00:26:36,160 Speaker 2: day of the week. 509 00:26:36,320 --> 00:26:36,520 Speaker 5: Yeah. 510 00:26:36,520 --> 00:26:38,520 Speaker 4: And to be fair, like, my husband does most of 511 00:26:38,520 --> 00:26:40,600 Speaker 4: the childcare and he always has, so I just want 512 00:26:40,640 --> 00:26:42,720 Speaker 4: to shout out that I'm not one of those women 513 00:26:42,720 --> 00:26:43,960 Speaker 4: that have to do more of the childcare. 514 00:26:44,000 --> 00:26:44,800 Speaker 5: We love the husband. 515 00:26:46,320 --> 00:26:50,200 Speaker 1: You're listening to the Bloomberg Intelligence Podcast. Catch us live 516 00:26:50,280 --> 00:26:53,200 Speaker 1: weekdays at ten am Eastern on Apple CarPlay 517 00:26:52,840 --> 00:26:54,400 Speaker 10: and Android Auto with 518 00:26:54,359 --> 00:26:57,199 Speaker 1: the Bloomberg Business app. You can also listen live on 519 00:26:57,280 --> 00:27:00,560 Speaker 1: Amazon Alexa from our flagship New York station. Just say 520 00:27:00,600 --> 00:27:02,920 Speaker 1: Alexa, play Bloomberg eleven thirty. 521 00:27:05,000 --> 00:27:09,159 Speaker 2: All right, let's turn the conversation to AI. Yesterday I 522 00:27:09,160 --> 00:27:13,240 Speaker 2: attended the Boston Consulting Group EDGE Expo in Boston. Had 523 00:27:13,280 --> 00:27:15,440 Speaker 2: a lot of great conversations with the folks at BCG, 524 00:27:15,520 --> 00:27:18,720 Speaker 2: a lot of really smart people thinking about some important 525 00:27:18,720 --> 00:27:21,320 Speaker 2: issues for their clients. I first spoke with Steve Mills, 526 00:27:21,440 --> 00:27:25,359 Speaker 2: chief AI ethics officer. He focuses on the risks for 527 00:27:25,520 --> 00:27:29,040 Speaker 2: businesses in forging ahead with AI without considering how to 528 00:27:29,160 --> 00:27:32,320 Speaker 2: do that in a responsible, ethical manner. I began by 529 00:27:32,359 --> 00:27:34,480 Speaker 2: asking him to talk about his role and how he 530 00:27:34,520 --> 00:27:37,200 Speaker 2: thinks about AI from an ethics perspective. Let's take 531 00:27:37,200 --> 00:27:38,080 Speaker 2: a listen. 532 00:27:38,040 --> 00:27:40,280 Speaker 9: The way we think about it, and particularly as you start 533 00:27:40,320 --> 00:27:44,600 Speaker 9: talking about generative AI: a responsible generative AI system needs 534 00:27:44,640 --> 00:27:47,200 Speaker 9: to be proficient, meaning it does the thing 535 00:27:47,080 --> 00:27:48,560 Speaker 11: we want well. Okay, you know. 536 00:27:48,640 --> 00:27:51,080 Speaker 9: So if it's a question-answer system, it can actually 537 00:27:51,080 --> 00:27:53,840 Speaker 9: accurately answer questions, and that's key to driving value with 538 00:27:53,920 --> 00:27:56,320 Speaker 9: these systems. That's the first piece.
It needs to be 539 00:27:56,359 --> 00:27:59,120 Speaker 9: safe and equitable. So these are the things like bias 540 00:27:59,200 --> 00:28:03,000 Speaker 9: and, you know, harmful language and the like. It needs 541 00:28:03,040 --> 00:28:05,320 Speaker 9: to be secure, and then it needs to be compliant 542 00:28:05,359 --> 00:28:08,040 Speaker 9: with laws and regulations. And so as we're building AI 543 00:28:08,160 --> 00:28:10,520 Speaker 9: on behalf of our clients, as we're helping clients navigate 544 00:28:10,560 --> 00:28:12,800 Speaker 9: these issues, those are the things we're trying to guard against. 545 00:28:12,880 --> 00:28:15,919 Speaker 2: What are some of the common risks that are 546 00:28:15,960 --> 00:28:19,680 Speaker 2: really out there for implementing AI across a product, across 547 00:28:19,680 --> 00:28:22,480 Speaker 2: a service, that you've seen so far? Because it feels 548 00:28:22,600 --> 00:28:24,760 Speaker 2: like we're still in the very, very early innings of 549 00:28:25,119 --> 00:28:26,240 Speaker 2: this AI discussion. 550 00:28:26,920 --> 00:28:29,679 Speaker 9: Yeah, I mean, the most common things, particularly, you know, 551 00:28:29,800 --> 00:28:32,840 Speaker 9: we see a lot of chatbots, for example, are 552 00:28:32,840 --> 00:28:33,520 Speaker 11: things like 553 00:28:35,040 --> 00:28:39,040 Speaker 9: biased language or, you know, sexist language in a subtle way. 554 00:28:39,080 --> 00:28:41,240 Speaker 9: Typically it's not overt, but it can come across in 555 00:28:41,280 --> 00:28:46,160 Speaker 9: a subtle way. Or security flaws, you know, when you 556 00:28:47,400 --> 00:28:50,120 Speaker 9: issue a certain prompt to the system, it reveals sensitive data. 557 00:28:50,240 --> 00:28:54,040 Speaker 9: We've seen examples of, you know, people being able, through 558 00:28:54,040 --> 00:28:57,760 Speaker 9: smart prompting of the system, to manipulate it to offer products 559 00:28:57,800 --> 00:29:01,960 Speaker 9: for, you know, next to nothing, basically. You know, inaccurate 560 00:29:02,080 --> 00:29:04,440 Speaker 9: answers, or what people refer to 561 00:29:04,480 --> 00:29:08,000 Speaker 9: often as hallucinations. But, you know, again, basically the 562 00:29:08,080 --> 00:29:10,400 Speaker 9: wrong answer coming out of the system that the company 563 00:29:10,440 --> 00:29:12,520 Speaker 9: is then held liable for after the fact. So these 564 00:29:12,520 --> 00:29:14,640 Speaker 9: are all things I'm sort of picking from the headlines 565 00:29:14,640 --> 00:29:15,200 Speaker 9: that we've seen. 566 00:29:15,520 --> 00:29:18,240 Speaker 2: It seems to me that before you guys walk into 567 00:29:18,240 --> 00:29:20,920 Speaker 2: my office, I think I have to have a commitment 568 00:29:20,960 --> 00:29:23,520 Speaker 2: to do this thing ethically before I spend dollar one. 569 00:29:23,880 --> 00:29:25,080 Speaker 2: Is that kind of the message you want to 570 00:29:25,080 --> 00:29:26,760 Speaker 2: get across to these people? Like, listen, guys, you've got 571 00:29:26,800 --> 00:29:29,560 Speaker 2: to buy into this, implement this, because this is a 572 00:29:29,600 --> 00:29:32,240 Speaker 2: powerful tool, maybe more powerful than we even know 573 00:29:32,280 --> 00:29:32,800 Speaker 2: at this point. 574 00:29:33,040 --> 00:29:34,160 Speaker 11: Yeah, that's exactly right.
575 00:29:34,200 --> 00:29:38,480 Speaker 9: I always say you cannot deploy and scale gen AI without 576 00:29:38,640 --> 00:29:42,240 Speaker 9: responsible AI. And it kind of comes in two ways. 577 00:29:42,280 --> 00:29:46,680 Speaker 9: One is, organizations are realizing there are these risks, there's regulation; 578 00:29:47,760 --> 00:29:50,680 Speaker 9: you need this in place to minimize those risks so 579 00:29:50,720 --> 00:29:53,560 Speaker 9: you can do it confidently. At the same time, we 580 00:29:53,640 --> 00:29:56,840 Speaker 9: see it as a source of value, both directly and indirectly. 581 00:29:56,920 --> 00:30:00,000 Speaker 9: What I mean by that is companies with mature responsible 582 00:30:00,200 --> 00:30:02,360 Speaker 9: AI programs, and this came from research we did with MIT: 583 00:30:03,360 --> 00:30:06,920 Speaker 9: if they have a mature responsible AI program, they report higher 584 00:30:06,960 --> 00:30:11,000 Speaker 9: customer engagement, better customer retention, higher customer trust, better 585 00:30:11,080 --> 00:30:13,600 Speaker 9: long-term profitability. That's sort of the direct piece, and faster 586 00:30:13,640 --> 00:30:14,280 Speaker 9: innovation too, 587 00:30:14,320 --> 00:30:15,400 Speaker 11: by the way. 588 00:30:15,480 --> 00:30:18,840 Speaker 9: The indirect piece is they also report they get more 589 00:30:18,960 --> 00:30:21,560 Speaker 9: value from their AI investment, which I think is really 590 00:30:21,640 --> 00:30:24,640 Speaker 9: interesting, and the intuition there is many of the things 591 00:30:24,680 --> 00:30:27,720 Speaker 9: you do to build product responsibly just make better product 592 00:30:27,800 --> 00:30:30,160 Speaker 9: in the first place. And so it's not really that 593 00:30:30,280 --> 00:30:32,160 Speaker 9: much of a surprise that if you do those things, 594 00:30:32,480 --> 00:30:33,320 Speaker 9: you build better product. 595 00:30:33,360 --> 00:30:35,680 Speaker 11: Do you get more value from your AI? Are there, 596 00:30:36,200 --> 00:30:36,920 Speaker 11: in your experience... 597 00:30:36,960 --> 00:30:39,479 Speaker 2: Have you had companies that do better than others, that 598 00:30:39,520 --> 00:30:42,400 Speaker 2: make a bigger commitment? Or what's kind of like a 599 00:30:42,400 --> 00:30:45,360 Speaker 2: best use case from your perspective? What's your, not your pitch, 600 00:30:45,400 --> 00:30:48,040 Speaker 2: but what's your best idea, best practice, when you go 601 00:30:48,040 --> 00:30:48,880 Speaker 2: in to talk to clients? 602 00:30:49,960 --> 00:30:51,920 Speaker 9: Well, I mean, we always talk about, you know, what 603 00:30:51,960 --> 00:30:54,800 Speaker 9: we see as a mature responsible AI program. That, you know, 604 00:30:54,880 --> 00:30:58,440 Speaker 9: at its heart, there's basically a strategy that's really 605 00:30:58,480 --> 00:31:02,440 Speaker 9: bridging between corporate values and the AI strategy of the company itself, 606 00:31:02,520 --> 00:31:04,200 Speaker 9: and this is a means to bring those two things 607 00:31:04,240 --> 00:31:07,720 Speaker 9: together. And then there's, you know, the governance, the processes, 608 00:31:07,800 --> 00:31:11,520 Speaker 9: all the tools, but ultimately you want this underpinned by 609 00:31:11,520 --> 00:31:12,800 Speaker 9: a culture of responsibility. 610 00:31:13,400 --> 00:31:15,160 Speaker 11: Who checks that?
611 00:31:15,560 --> 00:31:19,600 Speaker 2: Like I'm wondering, like, are auditors being trained now not 612 00:31:19,800 --> 00:31:23,920 Speaker 2: only to look at financial statements and controls 613 00:31:24,440 --> 00:31:27,440 Speaker 2: and so on and so forth, but also, you know, 614 00:31:27,600 --> 00:31:31,960 Speaker 2: data, really. Like Bloomberg, we take data security extraordinarily seriously 615 00:31:32,000 --> 00:31:34,280 Speaker 2: as a data-driven company. I would think that almost 616 00:31:34,360 --> 00:31:38,360 Speaker 2: any company now that is employing some level of AI, 617 00:31:39,440 --> 00:31:40,280 Speaker 11: there's going to be risk. 618 00:31:40,960 --> 00:31:41,760 Speaker 2: Who checks on that? 619 00:31:42,480 --> 00:31:46,040 Speaker 9: Yeah, it's a great question, and we're honestly extremely nascent 620 00:31:46,160 --> 00:31:49,480 Speaker 9: in this space. There are no standards yet for AI. 621 00:31:50,320 --> 00:31:52,960 Speaker 9: Therefore there's really no audit. Now you've got folks like 622 00:31:52,960 --> 00:31:55,720 Speaker 9: the Responsible AI Institute, which is a nonprofit that has 623 00:31:55,760 --> 00:31:59,440 Speaker 9: created a certification standard which some companies are following. I 624 00:31:59,480 --> 00:32:02,840 Speaker 9: think this is going to change extremely rapidly. The EU just 625 00:32:02,840 --> 00:32:06,160 Speaker 9: passed the AI Act. Very soon we'll see standards associated 626 00:32:06,200 --> 00:32:07,520 Speaker 9: with that, and so I think we will start to 627 00:32:07,520 --> 00:32:10,520 Speaker 9: see this really come to the fore very, very quickly. 628 00:32:10,560 --> 00:32:13,360 Speaker 2: It's interesting you mention Europe, because we've seen, just with technology 629 00:32:13,400 --> 00:32:16,840 Speaker 2: in general, Europe and privacy, for example, data privacy, Europe 630 00:32:16,880 --> 00:32:19,520 Speaker 2: ahead of the US. And I would argue, you know, 631 00:32:19,520 --> 00:32:20,959 Speaker 2: if you go all the way back to the eighties 632 00:32:21,080 --> 00:32:23,840 Speaker 2: and early nineties with Microsoft in terms of just dominance 633 00:32:23,840 --> 00:32:27,600 Speaker 2: of US technology. Do you expect other parts of the 634 00:32:27,600 --> 00:32:30,360 Speaker 2: world to maybe be a little bit more out front 635 00:32:30,360 --> 00:32:32,920 Speaker 2: on some of the regulations of this new technology vis-à-vis 636 00:32:32,960 --> 00:32:36,400 Speaker 2: the US? Because, I know, BCG's a global business, 637 00:32:36,600 --> 00:32:37,280 Speaker 2: you see everybody. 638 00:32:37,440 --> 00:32:39,760 Speaker 9: Yeah, I mean, we're starting to see a patchwork of regulation. 639 00:32:40,120 --> 00:32:43,040 Speaker 9: The EU, I would say, is the first place we're 640 00:32:43,080 --> 00:32:47,520 Speaker 9: seeing comprehensive AI-specific regulation. But in the US the 641 00:32:47,880 --> 00:32:51,760 Speaker 9: administration has directed the executive branch enforcement agencies to take 642 00:32:51,800 --> 00:32:54,280 Speaker 9: existing regulations and apply them to AI, and so we're 643 00:32:54,280 --> 00:32:58,920 Speaker 9: seeing that with Consumer Financial Protection, Fair Housing, FDA. 644 00:32:59,080 --> 00:32:59,280 Speaker 11: You know. 645 00:32:59,440 --> 00:33:02,880 Speaker 9: So while we in the US don't have AI-specific 646 00:33:02,880 --> 00:33:06,480 Speaker 9: regulation, there is certainly regulation applied to AI right now.
647 00:33:06,520 --> 00:33:09,680 Speaker 9: So this is the situation we're in, this odd 648 00:33:09,720 --> 00:33:13,000 Speaker 9: patchwork emerging, and US states, honestly, are passing regulation as well. 649 00:33:13,080 --> 00:33:16,360 Speaker 2: Okay, yeah, interesting. Are there certain industries today, 650 00:33:16,400 --> 00:33:18,760 Speaker 2: here again, early stages of AI, that seem to be 651 00:33:18,840 --> 00:33:23,080 Speaker 2: more open to making the investments in AI, making the 652 00:33:23,120 --> 00:33:27,280 Speaker 2: commitment to AI? I would think technology companies, probably, 653 00:33:27,360 --> 00:33:30,240 Speaker 2: as opposed to, I don't know, a manufacturing company in 654 00:33:30,280 --> 00:33:32,720 Speaker 2: the Midwest that says this doesn't really apply to me. 655 00:33:33,040 --> 00:33:34,320 Speaker 11: What are your discussions? 656 00:33:34,320 --> 00:33:34,760 Speaker 2: How do they go? 657 00:33:34,960 --> 00:33:35,160 Speaker 6: Yeah. 658 00:33:35,160 --> 00:33:38,920 Speaker 9: I mean I think big technology companies are absolutely at 659 00:33:38,920 --> 00:33:42,000 Speaker 9: the forefront of this space. You know, from years ago 660 00:33:42,080 --> 00:33:45,840 Speaker 9: they were working on these topics. The other place we're 661 00:33:45,880 --> 00:33:49,200 Speaker 9: seeing a lot of maturity is the highly regulated industries: 662 00:33:49,240 --> 00:33:54,000 Speaker 9: healthcare, insurance, finance. Not surprising, in that they've historically 663 00:33:54,000 --> 00:33:58,080 Speaker 9: had model risk management functions. You know, they've got regulators 664 00:33:58,080 --> 00:34:00,280 Speaker 9: looking at them, and so they've been very proactive. I 665 00:34:00,280 --> 00:34:05,120 Speaker 9: think other industries, there are absolutely very mature companies in 666 00:34:05,160 --> 00:34:06,520 Speaker 9: every industry, but I think if 667 00:34:06,360 --> 00:34:10,040 Speaker 11: we're talking broad brushstrokes, others are lagging behind. 668 00:34:10,080 --> 00:34:13,000 Speaker 11: I was just talking and mentioned to somebody else, AI seems to 669 00:34:13,040 --> 00:34:14,160 Speaker 11: have come out of nowhere. 670 00:34:14,680 --> 00:34:17,080 Speaker 2: And how do you think about the evolution of AI? 671 00:34:17,200 --> 00:34:20,040 Speaker 2: Is this just the big data discussion we had five 672 00:34:20,120 --> 00:34:22,759 Speaker 2: years ago, or is it just the commitment we made 673 00:34:22,760 --> 00:34:25,480 Speaker 2: in the eighties and nineties that, oh boy, everything's going digital, 674 00:34:25,480 --> 00:34:28,239 Speaker 2: we better start spending more on it? How do you 675 00:34:28,239 --> 00:34:30,359 Speaker 2: guys think about the evolution of AI? Because I could 676 00:34:30,400 --> 00:34:33,000 Speaker 2: see a CEO or a board member saying to you, 677 00:34:33,360 --> 00:34:36,640 Speaker 2: I don't understand it. Haven't we been spending on technology 678 00:34:36,640 --> 00:34:38,080 Speaker 2: for the past twenty-five years? 679 00:34:38,640 --> 00:34:42,920 Speaker 9: I mean, I really believe that the generative AI explosion 680 00:34:42,920 --> 00:34:46,320 Speaker 9: we've seen is an inflection point. It is something fundamentally different, 681 00:34:46,920 --> 00:34:49,560 Speaker 9: and I don't think we fully know how it will 682 00:34:49,800 --> 00:34:53,200 Speaker 9: impact everything.
But I mean, you're seeing the degree of 683 00:34:53,239 --> 00:34:56,560 Speaker 9: hype around it. Yes, there is some hype, but it's 684 00:34:56,600 --> 00:34:58,719 Speaker 9: well founded. I mean, I think we're talking on par 685 00:34:58,960 --> 00:35:01,920 Speaker 9: with the Internet in some ways of how it could 686 00:35:01,920 --> 00:35:07,319 Speaker 9: be transformative inside organizations. And we're still discovering the capabilities, right? 687 00:35:07,360 --> 00:35:09,759 Speaker 9: That's sort of the unique thing about it, the emergent capabilities 688 00:35:09,760 --> 00:35:12,839 Speaker 9: we see. And so each generation of 689 00:35:12,840 --> 00:35:15,080 Speaker 9: model that's being released is just so much more performant 690 00:35:15,120 --> 00:35:16,080 Speaker 9: and can do so much more. 691 00:35:16,080 --> 00:35:17,799 Speaker 11: And so the question is, where does it end? 692 00:35:17,800 --> 00:35:20,000 Speaker 9: But even with what we have today, there are so 693 00:35:20,080 --> 00:35:21,919 Speaker 9: many applications we haven't explored yet. 694 00:35:22,080 --> 00:35:25,200 Speaker 11: So what's kind of your, when you walk in to see 695 00:35:25,239 --> 00:35:26,799 Speaker 11: a client, how do you lead off? 696 00:35:26,840 --> 00:35:29,120 Speaker 2: Do you just say, 697 00:35:29,120 --> 00:35:30,960 Speaker 2: my goal today is to get you to commit, 698 00:35:31,880 --> 00:35:36,000 Speaker 2: or consider committing, to an ethical implementation of your tech investment? 699 00:35:36,040 --> 00:35:37,520 Speaker 2: Is that kind of your pitch, or how do 700 00:35:37,560 --> 00:35:38,720 Speaker 2: you walk in when you see a client? 701 00:35:38,800 --> 00:35:41,239 Speaker 9: Yeah, in essence, it's, you know, it's the message I 702 00:35:41,680 --> 00:35:44,160 Speaker 9: sort of led with earlier, that you cannot scale GenAI 703 00:35:44,440 --> 00:35:47,759 Speaker 9: without doing it in a responsible way. And it's twofold. Like, yes, 704 00:35:47,800 --> 00:35:51,800 Speaker 9: there's absolutely a risk mitigation play, both for the company 705 00:35:51,840 --> 00:35:54,120 Speaker 9: but even more importantly for customers, right, because you have 706 00:35:54,160 --> 00:35:56,520 Speaker 9: the potential to create unintended harm to individuals, so, 707 00:35:56,719 --> 00:35:58,960 Speaker 9: like, full stop, we need to avoid that. But from 708 00:35:58,960 --> 00:36:02,000 Speaker 9: a corporate perspective, there's certainly risk, but there's also 709 00:36:02,040 --> 00:36:05,319 Speaker 9: this value side. It's not just about mitigating risk, it's 710 00:36:05,360 --> 00:36:07,759 Speaker 9: about making sure you realize the value. And so that's 711 00:36:07,800 --> 00:36:11,800 Speaker 9: always the message, because I think the risk is motivating 712 00:36:11,840 --> 00:36:14,399 Speaker 9: to a point. But then it's, okay, if you want 713 00:36:14,440 --> 00:36:16,560 Speaker 9: to make this investment in AI anyway, if you want 714 00:36:16,600 --> 00:36:18,520 Speaker 9: to see that value, you need to make this investment 715 00:36:18,520 --> 00:36:19,600 Speaker 9: to do it in a responsible way. 716 00:36:19,640 --> 00:36:23,520 Speaker 2: And, I guess, is that message getting through, 717 00:36:23,640 --> 00:36:25,359 Speaker 11: do you think? I think it is.
718 00:36:25,440 --> 00:36:29,920 Speaker 9: I think companies are still, like, there's, I would say, 719 00:36:29,960 --> 00:36:33,640 Speaker 9: two things we're seeing. One is, the topic of 720 00:36:33,719 --> 00:36:37,160 Speaker 9: responsible AI has not been elevated senior enough inside organizations 721 00:36:37,200 --> 00:36:39,319 Speaker 11: yet. I argue it's a C-suite issue. It has 722 00:36:39,360 --> 00:36:40,880 Speaker 11: to be. If AI is a C-suite issue, this 723 00:36:40,960 --> 00:36:41,360 Speaker 11: needs to be. 724 00:36:41,800 --> 00:36:44,720 Speaker 9: Too often we're seeing it a few layers down inside 725 00:36:44,760 --> 00:36:48,120 Speaker 9: the organization, and it's people who are passionate and brilliant 726 00:36:48,160 --> 00:36:50,319 Speaker 9: and want to do good things, but they don't have 727 00:36:50,360 --> 00:36:52,360 Speaker 9: the clout inside the organization to really drive the kind of 728 00:36:52,400 --> 00:36:52,879 Speaker 9: change you need. 729 00:36:52,920 --> 00:36:54,520 Speaker 11: I mean, I mentioned, this is a cultural change. You 730 00:36:54,520 --> 00:36:55,759 Speaker 11: need senior people. 731 00:36:55,680 --> 00:36:57,680 Speaker 9: So that's one. And then two, they're just not putting 732 00:36:57,680 --> 00:37:01,319 Speaker 9: the right amount of resourcing in. I think 733 00:37:01,400 --> 00:37:04,080 Speaker 9: this is being seen as, you know, smaller dollar, we 734 00:37:04,080 --> 00:37:05,040 Speaker 9: can do a few little 735 00:37:04,920 --> 00:37:07,640 Speaker 11: things and do it responsibly. It's a much bigger investment. 736 00:37:07,719 --> 00:37:10,160 Speaker 9: And if you think about, you brought up privacy earlier, 737 00:37:10,200 --> 00:37:13,560 Speaker 9: like GDPR, or cyber, sure. I mean, if you look 738 00:37:13,600 --> 00:37:15,440 Speaker 9: back at some of the data, we're talking tens of 739 00:37:15,440 --> 00:37:17,840 Speaker 9: millions of dollars to get compliant with GDPR. It's the 740 00:37:17,880 --> 00:37:19,520 Speaker 9: same thing here, and companies need to be ready for 741 00:37:19,560 --> 00:37:21,080 Speaker 9: that degree of investment. 742 00:37:21,719 --> 00:37:22,080 Speaker 11: All right. 743 00:37:22,080 --> 00:37:25,239 Speaker 2: That was Steve Mills, chief AI ethics officer at the 744 00:37:25,239 --> 00:37:28,280 Speaker 2: Boston Consulting Group. I spoke with him up in Boston yesterday. 745 00:37:28,480 --> 00:37:31,400 Speaker 2: And what was interesting: A, that they have a chief 746 00:37:31,440 --> 00:37:34,040 Speaker 2: AI ethics officer. Wouldn't have thought of that, didn't even 747 00:37:34,040 --> 00:37:36,879 Speaker 2: think about it. But B, they say at BCG, when 748 00:37:36,880 --> 00:37:38,359 Speaker 2: they go to talk to one of their clients about 749 00:37:38,400 --> 00:37:40,960 Speaker 2: AI and making the investment in AI and the commitment 750 00:37:41,000 --> 00:37:43,920 Speaker 2: in AI, they say, listen, we're not going to engage 751 00:37:43,920 --> 00:37:47,799 Speaker 2: with you unless you make the commitment that ethics, and 752 00:37:47,840 --> 00:37:49,640 Speaker 2: the management of ethics, is going to be a board-level 753 00:37:49,800 --> 00:37:52,840 Speaker 2: commitment, because if not, there are so many pitfalls 754 00:37:52,840 --> 00:37:55,000 Speaker 2: that we don't even know about.
And before you start 755 00:37:55,080 --> 00:37:57,399 Speaker 2: rushing off to make your investments in technology, you better 756 00:37:57,440 --> 00:38:00,080 Speaker 2: recognize that you need to manage the risk here. So 757 00:38:00,600 --> 00:38:02,759 Speaker 2: from their perspective, at least at BCG, that's how they 758 00:38:02,760 --> 00:38:04,719 Speaker 2: try to engage with their clients. 759 00:38:05,080 --> 00:38:08,080 Speaker 5: What is the, what's the reaction to that, did you say? 760 00:38:08,040 --> 00:38:09,719 Speaker 2: I think they get it. I think they get it because 761 00:38:09,719 --> 00:38:11,600 Speaker 2: I think he was saying the clients are 762 00:38:11,640 --> 00:38:15,840 Speaker 2: basically like, you know, we'll spend the money to upgrade 763 00:38:15,880 --> 00:38:17,760 Speaker 2: our tech and do all that kind of stuff because 764 00:38:17,760 --> 00:38:20,279 Speaker 2: we can't be left behind. That's the biggest risk: 765 00:38:21,000 --> 00:38:24,240 Speaker 2: the CEOs tell BCG, we can't risk getting left behind. 766 00:38:24,320 --> 00:38:26,239 Speaker 2: So their tendency is just to spend, spend, spend, 767 00:38:26,320 --> 00:38:29,440 Speaker 2: spend, without really thinking about how to do it smartly 768 00:38:29,560 --> 00:38:32,719 Speaker 2: over time and do it responsibly. They feel like if 769 00:38:32,760 --> 00:38:35,480 Speaker 2: they don't, their competitors will and they'll be left in 770 00:38:35,520 --> 00:38:38,640 Speaker 2: the dust. So very interesting discussion about AI and how 771 00:38:38,680 --> 00:38:41,759 Speaker 2: it's really involved in all their discussions. 772 00:38:41,200 --> 00:38:45,920 Speaker 4: And apparently there'll be a new C-suite role: chief AI ethics officer. 773 00:38:46,080 --> 00:38:47,919 Speaker 5: That will now be something to think about. 774 00:38:48,200 --> 00:38:52,680 Speaker 1: This is the Bloomberg Intelligence Podcast, available on Apple, Spotify, 775 00:38:52,920 --> 00:38:55,799 Speaker 1: and anywhere else you get your podcasts. Listen live 776 00:38:55,920 --> 00:38:59,480 Speaker 1: each weekday, ten am to noon Eastern, on Bloomberg dot com, 777 00:39:00,280 --> 00:39:03,200 Speaker 1: the radio app, TuneIn, and the Bloomberg Business app. You 778 00:39:03,239 --> 00:39:06,440 Speaker 1: can also watch us live every weekday on YouTube and 779 00:39:06,600 --> 00:39:08,120 Speaker 1: always on the Bloomberg Terminal.