1 00:00:00,040 --> 00:00:10,719 Speaker 1: Bloomberg Audio Studios, Podcasts, radio news. 2 00:00:11,680 --> 00:00:14,840 Speaker 2: Welcome to the Daybreak Asia podcast. I'm Doug Krizner. We 3 00:00:14,960 --> 00:00:18,479 Speaker 2: begin today with Nvidia. After the US closed, the company 4 00:00:18,520 --> 00:00:22,560 Speaker 2: issued revenue guidance for the current quarter above estimates. Sales 5 00:00:22,560 --> 00:00:25,400 Speaker 2: could be as much as seventy eight billion dollars plus 6 00:00:25,480 --> 00:00:28,640 Speaker 2: or minus two percent, and Nvidia CEO Jensen Huang 7 00:00:28,760 --> 00:00:33,600 Speaker 2: said the enterprise adoption of AI agents is skyrocketing. For 8 00:00:33,640 --> 00:00:36,000 Speaker 2: a closer look now, I'm joined by Daniel Newman. He 9 00:00:36,120 --> 00:00:39,199 Speaker 2: is the CEO at the Futurum Group. Daniel, thank you 10 00:00:39,240 --> 00:00:41,839 Speaker 2: for being here. Give me your take on what we 11 00:00:41,880 --> 00:00:42,760 Speaker 2: heard from Nvidia. 12 00:00:43,479 --> 00:00:46,519 Speaker 3: I think we knew coming into this print that the 13 00:00:47,080 --> 00:00:49,680 Speaker 3: numbers, both for this quarter and the outlook, were going 14 00:00:49,760 --> 00:00:52,640 Speaker 3: to be very positive. I think the market has been 15 00:00:52,720 --> 00:00:56,560 Speaker 3: in this bit of an uncertain space, almost in between 16 00:00:56,600 --> 00:01:00,000 Speaker 3: whether AI is a bubble or AI is so valuable 17 00:01:00,240 --> 00:01:04,680 Speaker 3: that it is going to disrupt and basically eliminate entire industries. 18 00:01:04,880 --> 00:01:07,560 Speaker 3: But Nvidia sits in this really good position where 19 00:01:07,600 --> 00:01:13,080 Speaker 3: it's providing the quote unquote arms for this revolution.
And 20 00:01:13,840 --> 00:01:16,880 Speaker 3: with those capex numbers, we knew forty, fifty percent of 21 00:01:16,880 --> 00:01:19,400 Speaker 3: those numbers would go to Nvidia. We're seeing it show 22 00:01:19,440 --> 00:01:22,240 Speaker 3: up in this big data center number and this forward 23 00:01:22,280 --> 00:01:27,400 Speaker 3: guidance that was almost ten percent above expectations. It just furthers 24 00:01:27,480 --> 00:01:29,880 Speaker 3: that this isn't a short term story. There isn't some 25 00:01:30,080 --> 00:01:33,720 Speaker 3: near term pullback in either capex or demand for its products. 26 00:01:34,000 --> 00:01:37,039 Speaker 3: It shows that not only the hyperscalers, but these next 27 00:01:37,040 --> 00:01:42,119 Speaker 3: generation clouds, enterprises, sovereign nations are all continuing to bet 28 00:01:42,160 --> 00:01:45,440 Speaker 3: big on building out their AI futures with Nvidia hardware. 29 00:01:45,640 --> 00:01:48,560 Speaker 2: So we also heard from Salesforce after the bell. Kind 30 00:01:48,560 --> 00:01:51,640 Speaker 2: of interesting, because this company faced a lot of scrutiny 31 00:01:52,280 --> 00:01:57,320 Speaker 2: when the conversation focused on disruption from artificial intelligence, and 32 00:01:57,400 --> 00:02:01,520 Speaker 2: the outlook from Salesforce was a little underwhelming. Which 33 00:02:01,600 --> 00:02:04,520 Speaker 2: companies do you think are most at risk of being 34 00:02:04,600 --> 00:02:07,240 Speaker 2: threatened by some of the advancements that we're seeing in 35 00:02:07,320 --> 00:02:08,200 Speaker 2: AI right now? 36 00:02:09,120 --> 00:02:12,440 Speaker 3: Yeah. So Salesforce is a great example of a company 37 00:02:12,480 --> 00:02:15,040 Speaker 3: that's kind of been the baby thrown out 38 00:02:15,040 --> 00:02:15,880 Speaker 3: with the bathwater. 39 00:02:17,480 --> 00:02:21,000 Speaker 4: CRM, Salesforce, Oracle, ServiceNow.
40 00:02:21,040 --> 00:02:24,560 Speaker 3: Some of these big enterprise software companies that have 41 00:02:24,960 --> 00:02:29,560 Speaker 3: massive data moats and are deeply entrenched inside of enterprises, 42 00:02:29,960 --> 00:02:32,640 Speaker 3: we just aren't seeing it. The CIOs we're talking to 43 00:02:32,800 --> 00:02:36,560 Speaker 3: aren't seriously considering any sort of replacement, and basically their 44 00:02:36,639 --> 00:02:39,200 Speaker 3: numbers have been more of a continuation of the trend 45 00:02:39,200 --> 00:02:43,320 Speaker 3: lines that preceded this AI revolution. Having said that, there 46 00:02:43,440 --> 00:02:45,760 Speaker 3: is some software that seems to be at risk. We 47 00:02:46,080 --> 00:02:48,840 Speaker 3: see the software companies that were sort of features that 48 00:02:48,960 --> 00:02:52,760 Speaker 3: ended up becoming large SaaS platforms that ended up going 49 00:02:52,800 --> 00:02:55,160 Speaker 3: public at high valuations with high multiples. 50 00:02:55,400 --> 00:02:58,959 Speaker 4: Say, a software company like Expensify that just does expenses. 51 00:02:59,000 --> 00:03:03,240 Speaker 3: It's one feature that can be augmented by an AI 52 00:03:03,520 --> 00:03:07,320 Speaker 3: vibe coded tool. Some of the design platforms like Canva, 53 00:03:07,400 --> 00:03:09,440 Speaker 3: we think, are at risk when you see what Google 54 00:03:09,520 --> 00:03:12,440 Speaker 3: is doing with Nano Banana and other AI platforms that 55 00:03:12,480 --> 00:03:15,880 Speaker 3: are making design really easy. So there are softwares that 56 00:03:15,919 --> 00:03:19,160 Speaker 3: are at risk of being disrupted meaningfully by AI.
We 57 00:03:19,200 --> 00:03:21,760 Speaker 3: look at the ones that have significant data moats, that 58 00:03:21,880 --> 00:03:25,520 Speaker 3: have large entrenched enterprise utilization, and of course those that 59 00:03:25,600 --> 00:03:31,600 Speaker 3: have, like, cross border transactions, major governance compliance. Those softwares 60 00:03:31,639 --> 00:03:33,639 Speaker 3: don't seem to be in trouble. Also, some of the 61 00:03:33,720 --> 00:03:37,440 Speaker 3: key technologies that are, like, in the design of semiconductors 62 00:03:37,440 --> 00:03:41,040 Speaker 3: and chips. We saw companies like Synopsys being sold off on this. 63 00:03:41,240 --> 00:03:44,120 Speaker 3: I don't see them being disrupted. And also the security companies. 64 00:03:44,120 --> 00:03:47,800 Speaker 3: We see security and AI as symbiotic. Anthropic had 65 00:03:47,840 --> 00:03:51,480 Speaker 3: a major breach by China. They need CrowdStrike, they need 66 00:03:51,520 --> 00:03:54,320 Speaker 3: Palo Alto. So some software is at risk, but I think 67 00:03:54,760 --> 00:03:56,560 Speaker 3: the selling has been largely overblown. 68 00:03:56,760 --> 00:04:00,160 Speaker 2: So I'm glad that you mentioned Anthropic, Daniel, because today 69 00:04:00,200 --> 00:04:04,440 Speaker 2: the company indicated that it's basically loosening its commitment to 70 00:04:04,520 --> 00:04:08,760 Speaker 2: managing its guardrails where AI is concerned. Some people were 71 00:04:08,760 --> 00:04:10,720 Speaker 2: saying this is one of the most dramatic shifts that 72 00:04:10,760 --> 00:04:13,360 Speaker 2: we have seen lately in the industry. Back in twenty twenty three, 73 00:04:14,040 --> 00:04:18,400 Speaker 2: Anthropic said that it would delay AI development that might be dangerous.
74 00:04:18,440 --> 00:04:21,279 Speaker 2: But we just learned yesterday that the company will no 75 00:04:21,400 --> 00:04:25,520 Speaker 2: longer do so if it believes it lacks a significant 76 00:04:25,600 --> 00:04:28,520 Speaker 2: lead over a competitor. And in the context of this, 77 00:04:28,720 --> 00:04:32,640 Speaker 2: earlier in the week, Defense Secretary Pete Hegseth was giving 78 00:04:32,720 --> 00:04:37,440 Speaker 2: Anthropic until Friday to open its AI technology for unrestricted 79 00:04:37,880 --> 00:04:42,040 Speaker 2: military use or risk losing its government contracts. So talk 80 00:04:42,080 --> 00:04:46,760 Speaker 2: to me about regulatory risk here, or at the very least, 81 00:04:47,080 --> 00:04:50,720 Speaker 2: how the government may get involved to kind of influence 82 00:04:50,720 --> 00:04:52,679 Speaker 2: where this industry moves going forward. 83 00:04:53,560 --> 00:04:57,440 Speaker 3: This is a really difficult situation for Dario and 84 00:04:57,520 --> 00:05:02,240 Speaker 3: Anthropic, because if he acquiesces, he looks like he's giving up 85 00:05:02,520 --> 00:05:05,000 Speaker 3: sort of his long term position of being the more 86 00:05:05,040 --> 00:05:09,479 Speaker 3: ethical AI platform, the one that would move more slowly, 87 00:05:09,560 --> 00:05:13,400 Speaker 3: that would potentially be the protector of this innovation.
But 88 00:05:13,480 --> 00:05:15,720 Speaker 3: on the other side of this, if he doesn't do it, 89 00:05:16,839 --> 00:05:22,080 Speaker 3: OpenAI, Google, you know, many of the other companies 90 00:05:22,200 --> 00:05:26,479 Speaker 3: that are pursuing leadership in this space likely 91 00:05:26,560 --> 00:05:30,719 Speaker 3: will. And so the question is, by potentially loosening 92 00:05:30,800 --> 00:05:33,880 Speaker 3: some guardrails, can he still be influential in trying to 93 00:05:34,000 --> 00:05:38,320 Speaker 3: drive some policy, some level of guardrails, some awareness to 94 00:05:38,360 --> 00:05:42,320 Speaker 3: the public markets of what is at stake as we 95 00:05:42,400 --> 00:05:46,480 Speaker 3: continue to see the incredible power of its technology and 96 00:05:46,920 --> 00:05:49,320 Speaker 3: the others that are in this space. I don't think 97 00:05:49,320 --> 00:05:52,360 Speaker 3: there's a win for Anthropic here. If they hold their ground, 98 00:05:52,560 --> 00:05:54,960 Speaker 3: they're going to lose significant business. They're going to be 99 00:05:54,960 --> 00:05:57,640 Speaker 3: at risk of being under immense pressure. We've seen how, 100 00:05:57,760 --> 00:06:00,720 Speaker 3: you know, the various administrations can use all kinds of tactics 101 00:06:00,720 --> 00:06:03,919 Speaker 3: to create pressure on companies, and of course their ability 102 00:06:03,960 --> 00:06:06,560 Speaker 3: to influence the future becomes less than 103 00:06:06,480 --> 00:06:09,880 Speaker 4: it is today. So to some extent, I
104 00:06:09,880 --> 00:06:12,640 Speaker 3: doubt this is what he truly wants. But I think 105 00:06:12,680 --> 00:06:16,720 Speaker 3: his choices are: be at the table, potentially while maybe not 106 00:06:17,120 --> 00:06:22,440 Speaker 3: completely living his ideal, or maybe be taken away from 107 00:06:22,440 --> 00:06:24,279 Speaker 3: the opportunity to be at the table and have no 108 00:06:24,400 --> 00:06:27,440 Speaker 3: influence at all on a really important shift into the future. 109 00:06:27,640 --> 00:06:31,320 Speaker 2: We've seen a lot of volatility in the equity market 110 00:06:31,440 --> 00:06:34,000 Speaker 2: around some of these AI names this week after that 111 00:06:34,120 --> 00:06:38,920 Speaker 2: report from a little known firm called Citrini Research. It 112 00:06:39,000 --> 00:06:43,800 Speaker 2: seemed to outline some potential downside risk from AI on 113 00:06:43,839 --> 00:06:48,400 Speaker 2: a number of industries. And this report used several hypothetical 114 00:06:48,440 --> 00:06:52,560 Speaker 2: scenarios to kind of sketch the future. Future is in 115 00:06:52,640 --> 00:06:54,839 Speaker 2: the name of your firm. So if you had to 116 00:06:54,920 --> 00:06:58,719 Speaker 2: kind of forecast, Daniel, what the next five years may 117 00:06:58,760 --> 00:07:02,479 Speaker 2: look like with artificial intelligence and the advancements that 118 00:07:02,520 --> 00:07:04,839 Speaker 2: we're talking about, what do you come away with? 119 00:07:06,000 --> 00:07:09,920 Speaker 3: Yeah, that report was quite the doomsday report. However, 120 00:07:09,920 --> 00:07:13,200 Speaker 3: there were iterations that were created that had greater levels 121 00:07:13,200 --> 00:07:17,920 Speaker 3: of optimism. We see many trillions of dollars of economic 122 00:07:17,960 --> 00:07:21,440 Speaker 3: growth that will come with the efficiencies and productivity that 123 00:07:21,600 --> 00:07:22,760 Speaker 3: will be generated by AI.
124 00:07:23,360 --> 00:07:25,600 Speaker 4: Where we see some potential 125 00:07:25,160 --> 00:07:29,280 Speaker 3: risk is the digestion period that we've historically seen during 126 00:07:29,440 --> 00:07:32,400 Speaker 3: massive transformative periods. You know, whether it's been steam or 127 00:07:32,400 --> 00:07:36,400 Speaker 3: electricity or the Internet, it generally took place over a much 128 00:07:36,560 --> 00:07:41,320 Speaker 3: longer time. This particular innovation cycle has happened in a 129 00:07:41,440 --> 00:07:44,520 Speaker 3: very short period of time. And so many things, many, 130 00:07:44,880 --> 00:07:48,320 Speaker 3: many roles and jobs and careers. You hear, you know, 131 00:07:48,360 --> 00:07:50,800 Speaker 3: will we need lawyers, will we need doctors, will we 132 00:07:50,840 --> 00:07:56,160 Speaker 3: need engineers, marketers, designers, salespeople. All these things have never 133 00:07:56,200 --> 00:07:58,080 Speaker 3: been put into question the same way. But if you 134 00:07:58,160 --> 00:08:00,160 Speaker 3: do look at history, and I think it's important that 135 00:08:00,200 --> 00:08:03,440 Speaker 3: people do, with every revolution there were roles that 136 00:08:03,440 --> 00:08:06,400 Speaker 3: people said, wow, will that go away? Just like the 137 00:08:06,400 --> 00:08:08,640 Speaker 3: people that used to walk the streets of small towns 138 00:08:08,840 --> 00:08:12,240 Speaker 3: and light the gas lamps at night, the original gas lighters, 139 00:08:12,280 --> 00:08:14,320 Speaker 3: right? With electricity, those would no longer be roles, or 140 00:08:14,320 --> 00:08:17,760 Speaker 3: they would no longer need people to chisel wheels out 141 00:08:17,800 --> 00:08:20,240 Speaker 3: of stone, Doug. But the net of it is 142 00:08:20,280 --> 00:08:23,320 Speaker 3: that we ended up building industries that were exponentially larger. 143 00:08:24,160 --> 00:08:25,640 Speaker 4: Manufacturing at scale.
144 00:08:26,400 --> 00:08:30,360 Speaker 3: The Internet created entire new industries and marketplaces that did 145 00:08:30,360 --> 00:08:32,240 Speaker 3: not exist, and the largest companies in the world that 146 00:08:32,280 --> 00:08:34,480 Speaker 3: employ the most people, many of which were 147 00:08:34,400 --> 00:08:35,440 Speaker 4: created in that era. 148 00:08:35,640 --> 00:08:38,160 Speaker 3: So I think the hardest part for most people is 149 00:08:38,200 --> 00:08:41,560 Speaker 3: to visualize the unknown. And while I can't concretely say 150 00:08:41,600 --> 00:08:43,920 Speaker 3: what will be created, what I can say is this 151 00:08:44,000 --> 00:08:47,040 Speaker 3: innovation is creating an immense amount of additional productivity. It's 152 00:08:47,040 --> 00:08:49,760 Speaker 3: giving humans the potential to five and ten x what 153 00:08:49,800 --> 00:08:55,079 Speaker 3: they're able to output, and new roles, new opportunities will emerge. 154 00:08:55,760 --> 00:08:58,080 Speaker 3: But there is a risk for a short period of 155 00:08:58,160 --> 00:09:01,280 Speaker 3: time that there will be downside because of how fast this 156 00:09:01,360 --> 00:09:04,680 Speaker 3: innovation has happened, and, you know, humanity has never 157 00:09:04,720 --> 00:09:06,199 Speaker 3: seen anything change this quickly. 158 00:09:06,320 --> 00:09:09,880 Speaker 2: So as I'm listening to you, I'm thinking of Amara's law. 159 00:09:10,160 --> 00:09:13,760 Speaker 2: It says that people tend to overestimate the short term 160 00:09:13,800 --> 00:09:19,280 Speaker 2: impact of new technologies and underestimate their longer term effects. 161 00:09:19,800 --> 00:09:21,640 Speaker 2: Is that what you're saying? Does it really apply in 162 00:09:21,720 --> 00:09:22,199 Speaker 2: this case?
163 00:09:22,920 --> 00:09:24,640 Speaker 3: There's a little bit of that, and there's a little 164 00:09:24,679 --> 00:09:27,600 Speaker 3: bit of Jevons paradox, in the sense that as we 165 00:09:27,679 --> 00:09:31,280 Speaker 3: continue to become more efficient as human beings, we will 166 00:09:31,280 --> 00:09:34,760 Speaker 3: continue to scale industries at a great rate. You know, 167 00:09:34,800 --> 00:09:37,640 Speaker 3: we've talked about GDP growth numbers in the twenties 168 00:09:37,640 --> 00:09:40,520 Speaker 3: and thirties of trillions. There will be work to support 169 00:09:40,520 --> 00:09:43,080 Speaker 3: that that will not all be done by robots and 170 00:09:43,120 --> 00:09:46,280 Speaker 3: by AI. We need multi sided marketplaces. We need to 171 00:09:46,280 --> 00:09:49,680 Speaker 3: fulfill goods and services. People will still travel, people will 172 00:09:49,679 --> 00:09:52,760 Speaker 3: still get into autonomous vehicles, people will still want to 173 00:09:52,840 --> 00:09:56,360 Speaker 3: eat. You know, the things that we do that 174 00:09:56,480 --> 00:10:00,520 Speaker 3: create economic value will still be done and consumed. However, 175 00:10:00,800 --> 00:10:04,559 Speaker 3: some of that monotonous work, some of those lower 176 00:10:04,679 --> 00:10:08,319 Speaker 3: value processes that people are grinding away at their desks, 177 00:10:08,440 --> 00:10:11,720 Speaker 3: moving data from one cell to the next, or moving 178 00:10:11,760 --> 00:10:15,560 Speaker 3: papers from one email inbox to another, sending documents to 179 00:10:15,559 --> 00:10:18,120 Speaker 3: be signed, things like that will be automated. But people 180 00:10:18,160 --> 00:10:21,920 Speaker 3: will continue to evolve like we have all throughout history. 181 00:10:22,000 --> 00:10:25,000 Speaker 3: But the speed is the risk.
But I think in 182 00:10:25,040 --> 00:10:28,120 Speaker 3: good time, what you mentioned with Amara's law does 183 00:10:28,200 --> 00:10:28,839 Speaker 3: become the truth. 184 00:10:29,120 --> 00:10:31,800 Speaker 2: So Daniel, before I let you go, give me your 185 00:10:31,920 --> 00:10:35,360 Speaker 2: sense of where investors can find the best opportunities right 186 00:10:35,400 --> 00:10:36,960 Speaker 2: now in the info tech space. 187 00:10:37,840 --> 00:10:41,559 Speaker 3: Yeah, there's really two places that we are focusing on. 188 00:10:41,559 --> 00:10:45,439 Speaker 3: One is, we look at where everything remains constrained: compute, 189 00:10:45,920 --> 00:10:49,520 Speaker 3: memory, power. These things are a multi year trend, and 190 00:10:49,559 --> 00:10:51,720 Speaker 3: while there has been a lot of volatility, we've seen 191 00:10:51,880 --> 00:10:55,400 Speaker 3: Nvidia settle a little bit, we've seen Micron move quite 192 00:10:55,440 --> 00:10:58,640 Speaker 3: a bit up. We're still seeing the need for power 193 00:10:59,040 --> 00:11:01,640 Speaker 3: and rare earths. All of those areas are going to 194 00:11:01,640 --> 00:11:03,520 Speaker 3: continue to see growth as we see a number 195 00:11:03,559 --> 00:11:06,560 Speaker 3: as high as four trillion dollars in data center infrastructure 196 00:11:06,840 --> 00:11:09,280 Speaker 3: by the end of the decade. On the other hand, 197 00:11:09,320 --> 00:11:11,920 Speaker 3: there is some opportunity in the arbitrage and in the 198 00:11:12,320 --> 00:11:15,000 Speaker 3: hard selloff that a number of companies have faced.
We 199 00:11:15,400 --> 00:11:19,360 Speaker 3: believe companies like IBM, like I mentioned, ServiceNow, 200 00:11:19,440 --> 00:11:22,200 Speaker 3: Salesforce, have been sold at a rate that 201 00:11:22,400 --> 00:11:25,720 Speaker 3: is not appropriate based on the business and metrics that 202 00:11:25,760 --> 00:11:29,199 Speaker 3: they have delivered. There is some risk of AI disruption, 203 00:11:29,559 --> 00:11:32,160 Speaker 3: but we think the names I mentioned and several others 204 00:11:32,200 --> 00:11:35,840 Speaker 3: of those deep moat enterprise software companies will continue to 205 00:11:35,880 --> 00:11:38,440 Speaker 3: perform at a high level. The 206 00:11:38,520 --> 00:11:41,800 Speaker 3: rerating may be real, but the upside still exists. 207 00:11:41,920 --> 00:11:43,880 Speaker 2: Daniel, we'll leave it there. Always a pleasure. Thanks 208 00:11:43,880 --> 00:11:46,360 Speaker 2: so much. Daniel Newman there. He is the CEO at 209 00:11:46,440 --> 00:11:50,040 Speaker 2: the Futurum Group, joining us here on the Daybreak Asia podcast. 210 00:11:57,360 --> 00:11:59,800 Speaker 2: Welcome back to the Daybreak Asia Podcast. I'm Doug Krizner, 211 00:12:00,200 --> 00:12:02,839 Speaker 2: and as I mentioned a moment ago, Nvidia gave 212 00:12:02,880 --> 00:12:06,840 Speaker 2: another bullish forecast for quarterly revenue. This would suggest the 213 00:12:06,880 --> 00:12:10,720 Speaker 2: massive buildout in AI compute remains on track. Now, this 214 00:12:10,800 --> 00:12:15,520 Speaker 2: could underpin earnings expectations across the Asian semiconductor supply chain, 215 00:12:15,679 --> 00:12:19,640 Speaker 2: especially where high bandwidth memory chip makers are concerned. And 216 00:12:19,679 --> 00:12:23,160 Speaker 2: that's where we begin our conversation with Kieran Calder. Kieran 217 00:12:23,400 --> 00:12:26,959 Speaker 2: is head of equity research for Asia at UBP.
He 218 00:12:27,040 --> 00:12:30,440 Speaker 2: spoke earlier with Bloomberg TV hosts Heidi Stroud-Watts and 219 00:12:30,480 --> 00:12:30,960 Speaker 2: Shery Ahn. 220 00:12:31,120 --> 00:12:32,680 Speaker 5: We have this situation where, of course, all of the 221 00:12:32,720 --> 00:12:36,480 Speaker 5: macro factors are still at play, ranging from everything 222 00:12:36,480 --> 00:12:41,000 Speaker 5: political, related to tariffs, to central banks globally, 223 00:12:41,520 --> 00:12:43,800 Speaker 5: but at the moment it is really all about that 224 00:12:43,920 --> 00:12:46,200 Speaker 5: tech space. What do you make of the price action 225 00:12:46,320 --> 00:12:49,720 Speaker 5: that we've seen? Do the numbers from Nvidia take away 226 00:12:50,360 --> 00:12:54,160 Speaker 5: the fear that clearly was dominating the trading up until 227 00:12:54,200 --> 00:12:55,720 Speaker 5: now for the last few days? 228 00:12:58,160 --> 00:12:58,680 Speaker 4: Good morning. 229 00:12:58,760 --> 00:13:01,560 Speaker 6: So obviously the numbers from Nvidia are 230 00:13:01,600 --> 00:13:05,000 Speaker 6: a big relief. As a bellwether of, you know, the 231 00:13:05,040 --> 00:13:09,920 Speaker 6: AI trade, at least on the infrastructure capex build out side, 232 00:13:11,400 --> 00:13:15,520 Speaker 6: Nvidia needs to keep delivering and having a 233 00:13:15,520 --> 00:13:18,760 Speaker 6: strong outlook, so that's pretty positive. If you look, 234 00:13:18,920 --> 00:13:20,200 Speaker 6: you know, over the course of this week, of course, 235 00:13:20,240 --> 00:13:23,559 Speaker 6: we had this Citrini report over the weekend talking about 236 00:13:23,559 --> 00:13:25,920 Speaker 6: a hypothetical scenario a couple of years down the road 237 00:13:25,960 --> 00:13:30,920 Speaker 6: where we've had, you know, the AI build out and 238 00:13:31,000 --> 00:13:35,360 Speaker 6: the implementation.
It basically takes out a lot of different sectors, 239 00:13:35,360 --> 00:13:40,920 Speaker 6: and I think probably we've passed peak worry about 240 00:13:40,920 --> 00:13:44,040 Speaker 6: that sort of scenario. In fact, if you put that 241 00:13:44,080 --> 00:13:46,960 Speaker 6: report into ChatGPT itself, it really says, don't worry 242 00:13:47,000 --> 00:13:49,320 Speaker 6: too much about this. This is a hypothetical scenario looking 243 00:13:49,360 --> 00:13:54,400 Speaker 6: at identifying risks going forward. So we think peak fear 244 00:13:54,640 --> 00:13:58,320 Speaker 6: has probably passed, and it's an opportunity, we think, to 245 00:13:58,480 --> 00:14:04,520 Speaker 6: recalibrate into the names that have a decent near term outlook. 246 00:14:04,559 --> 00:14:07,720 Speaker 6: And for us, it's really the sort of AI 247 00:14:07,880 --> 00:14:11,920 Speaker 6: capex build out. So names like Nvidia as 248 00:14:11,920 --> 00:14:14,800 Speaker 6: well as some of the semiconductor manufacturers, I think, are 249 00:14:15,480 --> 00:14:16,720 Speaker 6: a good place to be at the moment. 250 00:14:20,680 --> 00:14:24,680 Speaker 7: Kieran, how long do you think we sort of should 251 00:14:24,680 --> 00:14:26,720 Speaker 7: be looking at in terms of the time for that 252 00:14:26,800 --> 00:14:30,600 Speaker 7: separation of the sustainable gainers within that AI space, and 253 00:14:30,640 --> 00:14:33,640 Speaker 7: those that, you know, clearly, particularly in the likes 254 00:14:33,640 --> 00:14:36,000 Speaker 7: of Jamie Dimon's world, are something that you should be 255 00:14:36,040 --> 00:14:37,200 Speaker 7: a lot more cautious about? 256 00:14:40,160 --> 00:14:42,600 Speaker 6: Well, I mean, I think the timeline is now. If you look, 257 00:14:42,640 --> 00:14:45,200 Speaker 6: you know, particularly at JPMorgan,
They had a mini investor 258 00:14:45,280 --> 00:14:46,960 Speaker 6: day this week, and they highlighted that they're going to 259 00:14:46,960 --> 00:14:50,080 Speaker 6: be spending almost twenty billion dollars this year, about half 260 00:14:50,120 --> 00:14:54,280 Speaker 6: of which is investment in platforms and AI capabilities. You know, 261 00:14:54,360 --> 00:14:58,840 Speaker 6: big corporates, including financials, which are able to spend and 262 00:14:58,840 --> 00:15:02,200 Speaker 6: implement AI will benefit from that, and they'll be 263 00:15:02,200 --> 00:15:05,000 Speaker 6: able to help their clients. And names that are a 264 00:15:05,000 --> 00:15:07,680 Speaker 6: little bit more exposed, you know, will have some work 265 00:15:07,720 --> 00:15:11,040 Speaker 6: to do. So obviously there's a sharp contrast between 266 00:15:11,160 --> 00:15:14,000 Speaker 6: Nvidia results and Salesforce results. So Salesforce has a bit 267 00:15:14,040 --> 00:15:17,560 Speaker 6: of work to do to, you know, adapt their business 268 00:15:17,560 --> 00:15:20,080 Speaker 6: model to, you know, the new environment. 269 00:15:22,280 --> 00:15:24,320 Speaker 8: Kieran, can you at this point also tell us who the 270 00:15:24,320 --> 00:15:27,880 Speaker 8: winners or losers could be across Asia as well? 271 00:15:30,480 --> 00:15:33,200 Speaker 6: I wish I could. I mean, obviously, 272 00:15:33,280 --> 00:15:36,720 Speaker 6: we think that the semiconductor names are an 273 00:15:36,760 --> 00:15:39,320 Speaker 6: obvious place to be. So, you know, the big ones, 274 00:15:39,520 --> 00:15:43,320 Speaker 6: TSMC and Samsung, which of course have done well. 275 00:15:44,040 --> 00:15:47,840 Speaker 6: We think there are some critical companies in Japan 276 00:15:47,960 --> 00:15:51,920 Speaker 6: in the supply chain, which we think at this 277 00:15:51,960 --> 00:15:54,040 Speaker 6: stage is also a place to be.
I think it's 278 00:15:54,040 --> 00:15:58,280 Speaker 6: probably too difficult now to pick the winners on 279 00:15:58,320 --> 00:16:01,840 Speaker 6: the implementation side. So, you know, while software names have 280 00:16:01,920 --> 00:16:04,640 Speaker 6: been really taken to the cleaners, I think it's maybe 281 00:16:04,680 --> 00:16:07,120 Speaker 6: a little bit too early to start to pick through 282 00:16:07,120 --> 00:16:10,120 Speaker 6: there and try to identify winners, as we're still at 283 00:16:10,120 --> 00:16:11,000 Speaker 6: the build out stage. 284 00:16:12,440 --> 00:16:15,920 Speaker 8: Yeah, lots of component makers, also chemical suppliers, across Japan, 285 00:16:16,000 --> 00:16:19,320 Speaker 8: right, Kieran. But here we also do have the idiosyncratic 286 00:16:19,360 --> 00:16:22,320 Speaker 8: story around the Bank of Japan, what Prime Minister Takaichi wants 287 00:16:22,320 --> 00:16:25,680 Speaker 8: when it comes to those rate hikes, the weakness of 288 00:16:25,760 --> 00:16:29,720 Speaker 8: the Japanese yen and the acceleration of the losses. Is 289 00:16:29,720 --> 00:16:32,160 Speaker 8: this a risk, or is this just more upside for 290 00:16:32,200 --> 00:16:33,160 Speaker 8: those exporters? 291 00:16:36,600 --> 00:16:39,640 Speaker 6: Well, so, okay, the yen is a difficult issue for 292 00:16:40,920 --> 00:16:45,840 Speaker 6: Japan, obviously. You know, the Bank of Japan is obviously 293 00:16:45,920 --> 00:16:49,080 Speaker 6: very cautious, and with the new appointments 294 00:16:50,080 --> 00:16:53,760 Speaker 6: and the comments from the Prime Minister, a rate hike, 295 00:16:54,280 --> 00:16:56,560 Speaker 6: a further rate hike, is probably pretty unlikely 296 00:16:56,280 --> 00:16:56,840 Speaker 4: at this point.
297 00:16:57,920 --> 00:17:00,800 Speaker 6: On the other hand, the weak yen is very problematic, 298 00:17:00,920 --> 00:17:05,520 Speaker 6: especially for Japanese households, because so much of, you know, 299 00:17:05,640 --> 00:17:09,400 Speaker 6: the household basket is imported. Mostly all energy is imported, 300 00:17:09,440 --> 00:17:14,720 Speaker 6: lots of food is imported, so imported inflation is a 301 00:17:14,720 --> 00:17:18,320 Speaker 6: big problem, not only for the economy but specifically for households. 302 00:17:19,480 --> 00:17:22,520 Speaker 6: This is likely to continue, and I think, you know, 303 00:17:22,560 --> 00:17:25,040 Speaker 6: the exporters obviously benefit from translation of earnings, but 304 00:17:25,320 --> 00:17:28,920 Speaker 6: at some point it's just too weak for the domestic economy. 305 00:17:29,280 --> 00:17:32,399 Speaker 6: She's bought herself four years. Hopefully there'll be, 306 00:17:33,080 --> 00:17:35,399 Speaker 6: you know, some measures taken sooner. 307 00:17:35,640 --> 00:17:38,479 Speaker 2: That was Kieran Calder, head of equity research for Asia 308 00:17:38,520 --> 00:17:42,480 Speaker 2: at UBP, speaking with Bloomberg TV hosts Heidi Stroud-Watts 309 00:17:42,520 --> 00:17:45,520 Speaker 2: and Shery Ahn, bringing you their conversation here on the 310 00:17:45,600 --> 00:17:51,000 Speaker 2: Daybreak Asia podcast. Thanks for listening to today's episode of 311 00:17:51,040 --> 00:17:55,160 Speaker 2: the Bloomberg Daybreak Asia Edition podcast. Each weekday, we look 312 00:17:55,160 --> 00:17:58,960 Speaker 2: at the stories shaping markets, finance, and geopolitics in the 313 00:17:59,000 --> 00:18:02,080 Speaker 2: Asia Pacific.
You can find us on Apple, Spotify, 314 00:18:02,200 --> 00:18:05,720 Speaker 2: the Bloomberg Podcast YouTube channel, or anywhere else you listen. 315 00:18:06,119 --> 00:18:09,000 Speaker 2: Join us again tomorrow for insight on the market moves 316 00:18:09,080 --> 00:18:13,600 Speaker 2: from Hong Kong to Singapore and Australia. I'm Doug Krizner, 317 00:18:13,760 --> 00:18:15,200 Speaker 2: and this is Bloomberg.