1 00:00:02,720 --> 00:00:10,560 Speaker 1: Bloomberg Audio Studios, podcasts, radio news. You're listening to the 2 00:00:10,560 --> 00:00:14,480 Speaker 1: Bloomberg Intelligence Podcast. Catch us live weekdays at ten am 3 00:00:14,560 --> 00:00:18,479 Speaker 1: Eastern on Apple CarPlay and Android Auto with the Bloomberg Business App, 4 00:00:18,600 --> 00:00:21,799 Speaker 1: Listen on demand wherever you get your podcasts, or watch 5 00:00:21,920 --> 00:00:23,400 Speaker 1: us live on YouTube. 6 00:00:23,560 --> 00:00:26,880 Speaker 2: Let's take it one step further. Gene Munster, managing partner, 7 00:00:26,920 --> 00:00:29,360 Speaker 2: co-founder of Loup Ventures, really one of the leading 8 00:00:29,880 --> 00:00:33,000 Speaker 2: voices on all things technology really for the last twenty 9 00:00:33,040 --> 00:00:37,040 Speaker 2: five, thirty years. We appreciate getting a few minutes of 10 00:00:37,120 --> 00:00:39,640 Speaker 2: his time. Gene, I'm just gonna speak for myself, but 11 00:00:39,680 --> 00:00:43,120 Speaker 2: I probably speak for most of my radio and YouTube audience. 12 00:00:43,320 --> 00:00:45,520 Speaker 2: I did not know what DeepSeek was, DeepSeek 13 00:00:45,680 --> 00:00:50,200 Speaker 2: was, before this morning, but now I'm learning. What should 14 00:00:50,200 --> 00:00:52,120 Speaker 2: I be learning? What should my takeaway be today? 15 00:00:54,400 --> 00:00:58,840 Speaker 3: I think at the highest level this whole chapter is about the following. 16 00:00:59,000 --> 00:01:02,520 Speaker 3: The first takeaway is that investors are nervous that this 17 00:01:03,120 --> 00:01:05,800 Speaker 3: eighty five percent move in the Nasdaq over the past couple 18 00:01:05,840 --> 00:01:08,480 Speaker 3: of years is creating an environment where any sort of 19 00:01:09,319 --> 00:01:11,840 Speaker 3: ripple in what that AI trade looks like is going 20 00:01:11,880 --> 00:01:14,399 Speaker 3: to cause a significant impact on some of these valuations.
21 00:01:14,400 --> 00:01:16,640 Speaker 3: So I think this is a temperature check 22 00:01:16,640 --> 00:01:18,520 Speaker 3: in terms of where the market is at. As far 23 00:01:18,520 --> 00:01:21,399 Speaker 3: as kind of the company itself is concerned, DeepSeek, 24 00:01:21,760 --> 00:01:26,920 Speaker 3: what's most important here is that they are advancing, or 25 00:01:27,120 --> 00:01:30,399 Speaker 3: presumably advancing, the cost of training these models, which you've talked 26 00:01:30,400 --> 00:01:32,800 Speaker 3: a lot about on the show today. And why that's 27 00:01:32,840 --> 00:01:35,920 Speaker 3: important is there's a question about how much hardware is 28 00:01:35,959 --> 00:01:38,920 Speaker 3: needed for that, and that obviously impacts the big hardware companies, 29 00:01:38,920 --> 00:01:42,160 Speaker 3: and then separately, how does that accelerate the adoption? So 30 00:01:42,800 --> 00:01:46,479 Speaker 3: the truth is always several layers below the surface, 31 00:01:46,520 --> 00:01:48,840 Speaker 3: and I think when it comes to DeepSeek there 32 00:01:48,880 --> 00:01:52,400 Speaker 3: are still some pretty significant unanswered questions, and I'll start 33 00:01:52,400 --> 00:01:54,360 Speaker 3: with probably the biggest one, which is that this 34 00:01:54,880 --> 00:01:57,520 Speaker 3: five to six million dollar training number that has the 35 00:01:57,560 --> 00:02:02,600 Speaker 3: market upside down today, that was not their most recent model. 36 00:02:02,600 --> 00:02:05,800 Speaker 3: They have an R1 model. It's unclear, you know, maybe 37 00:02:05,800 --> 00:02:08,440 Speaker 3: that was twenty five, fifty, one hundred million dollars to 38 00:02:08,480 --> 00:02:12,600 Speaker 3: train it.
It probably was below what the most recent 39 00:02:12,680 --> 00:02:16,200 Speaker 3: version of OpenAI's model, which is anticipated or estimated 40 00:02:16,200 --> 00:02:19,120 Speaker 3: to be about five hundred million dollars in terms 41 00:02:19,120 --> 00:02:22,280 Speaker 3: of training. So I think big picture here is that 42 00:02:23,280 --> 00:02:26,760 Speaker 3: how we are delivering AI is evolving and there may 43 00:02:26,760 --> 00:02:30,639 Speaker 3: be some pretty significant improvements in terms of the ability 44 00:02:30,680 --> 00:02:32,919 Speaker 3: to do that at a lower cost. 45 00:02:33,400 --> 00:02:36,200 Speaker 4: So to that point, Gene, how much, if we look at 46 00:02:36,280 --> 00:02:38,720 Speaker 4: Nvidia in particular, it's down about two hundred and eighty 47 00:02:38,720 --> 00:02:41,920 Speaker 4: three points. What part of that is justified? 48 00:02:44,080 --> 00:02:46,240 Speaker 3: So I think the big picture here, I think that 49 00:02:46,320 --> 00:02:49,440 Speaker 3: this is an overreaction. I think when you look at 50 00:02:49,440 --> 00:02:52,200 Speaker 3: a company like Nvidia, it's a company we do own, 51 00:02:52,320 --> 00:02:55,080 Speaker 3: and I think that this is an overreaction. And just 52 00:02:55,240 --> 00:02:58,120 Speaker 3: specifically around that, if we can kind of zero in 53 00:02:58,200 --> 00:03:02,120 Speaker 3: on this concept that these models are becoming more efficient, 54 00:03:02,240 --> 00:03:05,040 Speaker 3: let's just take DeepSeek at face value that they've 55 00:03:05,040 --> 00:03:09,200 Speaker 3: had some sort of improvement. That improvement, if 56 00:03:09,240 --> 00:03:13,360 Speaker 3: you believe that the US tech companies are competent, 57 00:03:13,400 --> 00:03:16,720 Speaker 3: is around some architectures that have been talked about 58 00:03:16,760 --> 00:03:20,720 Speaker 3: for the last couple months.
So those companies, the big 59 00:03:21,160 --> 00:03:25,520 Speaker 3: tech companies, the companies that are behind all the announcements 60 00:03:25,600 --> 00:03:31,239 Speaker 3: last week, Meta increasing their capex spend, I think 61 00:03:31,280 --> 00:03:35,160 Speaker 3: that that is all a belief that they 62 00:03:35,240 --> 00:03:37,560 Speaker 3: still need this hardware. So when it comes back to 63 00:03:37,840 --> 00:03:41,000 Speaker 3: AI and comes back to the Nvidia trade, 64 00:03:41,080 --> 00:03:45,200 Speaker 3: I believe, if they are reducing the costs to train 65 00:03:45,280 --> 00:03:49,000 Speaker 3: these models, that I actually can build, I think, a 66 00:03:49,040 --> 00:03:52,880 Speaker 3: credible case that that could increase the demand for some 67 00:03:52,960 --> 00:03:57,000 Speaker 3: of this hardware if we are getting closer to general intelligence. 68 00:03:57,000 --> 00:03:59,920 Speaker 3: If that potential is even closer, it's just going to 69 00:03:59,960 --> 00:04:02,840 Speaker 3: be an insatiable arms race. It's going to continue. And 70 00:04:02,880 --> 00:04:05,440 Speaker 3: so I understand why Nvidia is down today. I 71 00:04:05,480 --> 00:04:07,800 Speaker 3: think it's actually a healthy thing. But ultimately I think 72 00:04:07,840 --> 00:04:09,480 Speaker 3: that their business is going to be just fine. We 73 00:04:09,520 --> 00:04:11,680 Speaker 3: have to wait twenty one trading days before we hear 74 00:04:11,840 --> 00:04:13,960 Speaker 3: from at least Nvidia on that, but I think 75 00:04:13,960 --> 00:04:14,760 Speaker 3: it's going to be fine.
76 00:04:15,440 --> 00:04:20,240 Speaker 2: So, Gene, was it a coincidence that on Friday, Meta 77 00:04:20,520 --> 00:04:24,640 Speaker 2: ups its capex forecast dramatically, by a magnitude I've never seen, 78 00:04:24,680 --> 00:04:28,520 Speaker 2: from fifty billion to sixty five billion, and then today 79 00:04:28,560 --> 00:04:31,440 Speaker 2: we get this news on DeepSeek? Was DeepSeek, 80 00:04:31,520 --> 00:04:34,400 Speaker 2: was that, kind of, I don't know, coincidence? 81 00:04:35,800 --> 00:04:39,359 Speaker 3: I think that's just purely coincidence. But 82 00:04:39,560 --> 00:04:41,840 Speaker 3: I do kind of go back to what we were just 83 00:04:41,880 --> 00:04:43,799 Speaker 3: talking about a minute ago. I think it's important 84 00:04:43,800 --> 00:04:47,359 Speaker 3: to note that that announcement from Meta, if you 85 00:04:47,400 --> 00:04:50,000 Speaker 3: assume Meta is competent, and I believe 86 00:04:50,000 --> 00:04:53,640 Speaker 3: that they are, that announcement came with the full knowledge 87 00:04:53,640 --> 00:04:56,159 Speaker 3: of what DeepSeek was doing. So it's new to 88 00:04:56,240 --> 00:04:59,080 Speaker 3: most of us today. But for these companies who have 89 00:04:59,200 --> 00:05:02,719 Speaker 3: been making these huge investments, the concept of what Deep 90 00:05:02,760 --> 00:05:05,200 Speaker 3: Seek had been trying to build has been out there. They've had 91 00:05:05,240 --> 00:05:07,839 Speaker 3: a model that's been out. This has been something that's 92 00:05:07,880 --> 00:05:10,040 Speaker 3: been known for the last couple of months, and so 93 00:05:10,080 --> 00:05:13,159 Speaker 3: I think that is important.
I think again 94 00:05:13,200 --> 00:05:16,600 Speaker 3: that this commentary on Friday from Zuckerberg about the 95 00:05:16,640 --> 00:05:23,120 Speaker 3: increased capex factors in some of these potential breakthroughs that 96 00:05:23,160 --> 00:05:25,119 Speaker 3: we've seen with DeepSeek. I want to stop short 97 00:05:25,120 --> 00:05:27,960 Speaker 3: of saying it's definitively a breakthrough because there are a lot of 98 00:05:28,040 --> 00:05:30,240 Speaker 3: unknowns around how good the models are and what the 99 00:05:30,279 --> 00:05:34,800 Speaker 3: true costs are. But I think that the AI investment 100 00:05:34,839 --> 00:05:37,840 Speaker 3: phase is still alive and well. This is still very early. 101 00:05:38,920 --> 00:05:41,880 Speaker 4: So if this could be an overreaction, we don't know, 102 00:05:42,160 --> 00:05:45,760 Speaker 4: fair enough. In Nvidia, obviously an overreaction in that stock, the 103 00:05:45,800 --> 00:05:48,520 Speaker 4: way you look at it. Does it make you rethink, 104 00:05:48,640 --> 00:05:56,560 Speaker 4: though, a rating of forty one times estimated P/E? 105 00:05:55,839 --> 00:05:58,760 Speaker 3: So from where my head's at, I'm thinking about calendar 106 00:05:58,800 --> 00:06:01,400 Speaker 3: twenty six at this point, and based on the calendar 107 00:06:01,400 --> 00:06:03,359 Speaker 3: twenty six numbers, I think it's closer to like a 108 00:06:03,400 --> 00:06:06,320 Speaker 3: twenty five, and I think that they can grow earnings 109 00:06:06,360 --> 00:06:08,839 Speaker 3: more than that. This is basically where the rubber 110 00:06:08,880 --> 00:06:12,119 Speaker 3: hits the road, is that ultimately what happens in calendar 111 00:06:12,200 --> 00:06:14,039 Speaker 3: twenty six, all this stuff we're talking about with Deep 112 00:06:14,080 --> 00:06:17,320 Speaker 3: Seek today really impacts Nvidia in twenty six.
And if 113 00:06:17,360 --> 00:06:20,719 Speaker 3: in fact this dramatically changes the kind of hardware 114 00:06:20,720 --> 00:06:22,880 Speaker 3: that people need, if people need less hardware, then I'm going to be wrong. 115 00:06:23,000 --> 00:06:26,479 Speaker 3: But if this does create this accelerated arms race, then 116 00:06:26,520 --> 00:06:28,120 Speaker 3: I think that they're going to grow faster. And so 117 00:06:28,520 --> 00:06:31,279 Speaker 3: I consider a one PEG on that twenty six earnings 118 00:06:31,360 --> 00:06:36,119 Speaker 3: is actually an attractive valuation relative to the rest of the group. 119 00:06:36,160 --> 00:06:38,000 Speaker 3: So I'm comfortable with the valuation. 120 00:06:38,800 --> 00:06:41,280 Speaker 2: Just a red headline crossing the Bloomberg terminal as we speak: 121 00:06:41,480 --> 00:06:46,200 Speaker 2: DeepSeek says it is subject to a large scale malicious attack. 122 00:06:46,360 --> 00:06:48,480 Speaker 2: So that's a headline. We'll have more reporting on that 123 00:06:48,640 --> 00:06:51,520 Speaker 2: going forward. Gene, just real quickly, we're going to hear 124 00:06:51,520 --> 00:06:54,760 Speaker 2: from some of the other big tech companies, Microsoft, Meta, Tesla. 125 00:06:54,760 --> 00:06:56,440 Speaker 2: How do you think they're going to address this issue? 126 00:06:57,360 --> 00:06:59,719 Speaker 3: Obviously this is going to be the front and center 127 00:06:59,760 --> 00:07:02,600 Speaker 3: topic, and I think they're probably going to address 128 00:07:02,600 --> 00:07:06,200 Speaker 3: it by saying they still have plans to invest meaningfully 129 00:07:06,200 --> 00:07:08,440 Speaker 3: more in calendar twenty five over twenty four when it 130 00:07:08,480 --> 00:07:10,920 Speaker 3: comes to capex, and I think that that will play 131 00:07:11,000 --> 00:07:14,000 Speaker 3: a part in reassuring. We're a long ways away from that.
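The valuation shorthand Gene uses here, a PEG (price/earnings-to-growth) ratio of one, just divides a forward P/E by the expected earnings growth rate. A minimal sketch of that arithmetic, using hypothetical round numbers echoing the ones discussed, not actual Nvidia estimates:

```python
# Sketch of the PEG arithmetic referenced above.
# All numbers are hypothetical, for illustration only.

def forward_pe(price: float, forward_eps: float) -> float:
    """Forward P/E: share price divided by expected earnings per share."""
    return price / forward_eps

def peg_ratio(pe: float, growth_pct: float) -> float:
    """PEG: forward P/E divided by expected earnings growth rate (in percent)."""
    return pe / growth_pct

# A stock at 41x this year's earnings can look like ~25x two years out
# if earnings are expected to grow quickly over that span.
pe_calendar_26 = 25.0   # hypothetical forward P/E on calendar-2026 earnings
expected_growth = 25.0  # hypothetical earnings growth, in percent

peg = peg_ratio(pe_calendar_26, expected_growth)
print(f"PEG on calendar-26 numbers: {peg:.1f}")  # a PEG of 1.0
```

A PEG near 1 is the conventional rough marker of a valuation in line with growth, which is the sense in which "a one PEG" is framed as attractive relative to the group.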
132 00:07:14,040 --> 00:07:15,920 Speaker 3: We got two and a half trading days away before 133 00:07:15,920 --> 00:07:17,760 Speaker 3: we start to get some of that commentary, before the 134 00:07:18,440 --> 00:07:20,160 Speaker 3: mic gets turned over to the other side of the 135 00:07:20,400 --> 00:07:21,080 Speaker 3: equation here. 136 00:07:21,520 --> 00:07:23,360 Speaker 2: All right, Gene, thank you so much for joining us. 137 00:07:23,400 --> 00:07:25,640 Speaker 2: I know you're super busy today. Appreciate getting a few 138 00:07:25,640 --> 00:07:28,560 Speaker 2: minutes of your time. Gene Munster, managing partner, co-founder 139 00:07:28,880 --> 00:07:31,520 Speaker 2: of Loup Ventures, taking a little bit of, I guess, 140 00:07:31,560 --> 00:07:33,720 Speaker 2: a longer term view and a broader view of 141 00:07:33,720 --> 00:07:37,840 Speaker 2: what this can mean for the industry. 142 00:07:36,680 --> 00:07:40,560 Speaker 1: You're listening to the Bloomberg Intelligence Podcast. Catch the program 143 00:07:40,680 --> 00:07:44,080 Speaker 1: live weekdays at ten am Eastern on Apple CarPlay and Android 144 00:07:44,080 --> 00:07:47,080 Speaker 1: Auto with the Bloomberg Business app. You can also listen 145 00:07:47,200 --> 00:07:50,440 Speaker 1: live on Amazon Alexa from our flagship New York station, 146 00:07:51,000 --> 00:07:53,680 Speaker 1: Just say Alexa, play Bloomberg eleven thirty. 147 00:07:54,240 --> 00:07:56,120 Speaker 4: All right, let's get more on this tech sell off 148 00:07:56,120 --> 00:07:59,040 Speaker 4: here. Kim Forrest, founder and CIO of Bokeh Capital Partners. 149 00:07:59,080 --> 00:08:02,560 Speaker 4: She's also a former chips person, like she did the stuff, 150 00:08:02,600 --> 00:08:03,720 Speaker 4: so she knows the things. 151 00:08:04,120 --> 00:08:04,480 Speaker 5: Clearly. 152 00:08:04,520 --> 00:08:08,240 Speaker 4: The DeepSeek news is upending most of the tech industry, right.
153 00:08:08,600 --> 00:08:10,600 Speaker 4: You get the SOX index down hard, you get 154 00:08:10,600 --> 00:08:13,720 Speaker 4: all the hyperscalers are down hard, all the chip makers, 155 00:08:13,760 --> 00:08:17,840 Speaker 4: all the power makers all down hard. Is this the 156 00:08:17,880 --> 00:08:20,200 Speaker 4: reckoning or is this a buy-the-dip moment? 157 00:08:21,880 --> 00:08:26,000 Speaker 5: Well, that is the question, isn't it. I would say 158 00:08:26,000 --> 00:08:28,160 Speaker 5: it's a little bit of both. I think this is 159 00:08:28,200 --> 00:08:33,800 Speaker 5: a good level set, a re-leveling, a resetting of our expectations. So 160 00:08:34,240 --> 00:08:36,320 Speaker 5: the first thing is, we have to really see if 161 00:08:36,360 --> 00:08:41,720 Speaker 5: this is true. Now, apparently the company, Deep 162 00:08:41,800 --> 00:08:46,560 Speaker 5: Seek, is an open source company. They have released how 163 00:08:46,600 --> 00:08:50,439 Speaker 5: they've done this, but you know, replication is key, so 164 00:08:50,520 --> 00:08:53,360 Speaker 5: we're gonna have to see if this actually works. That's 165 00:08:53,440 --> 00:08:57,760 Speaker 5: thing number one. Like, don't get too upset about major 166 00:08:57,840 --> 00:09:01,680 Speaker 5: developments until you know that they're real major developments, right? 167 00:09:01,760 --> 00:09:05,400 Speaker 5: So that's thing one. Thing two is, and I think 168 00:09:05,440 --> 00:09:08,040 Speaker 5: this has been troubling me for a very long time, 169 00:09:09,120 --> 00:09:15,439 Speaker 5: especially with like last week's Stargate announcement. It seems as 170 00:09:15,480 --> 00:09:21,040 Speaker 5: if people have been expecting AI, which we don't know 171 00:09:21,040 --> 00:09:23,719 Speaker 5: what the payoff is yet, we have a pretty good 172 00:09:23,760 --> 00:09:26,120 Speaker 5: idea it might help us out.
But we're going to 173 00:09:26,160 --> 00:09:29,760 Speaker 5: cover the earth essentially in data centers and use far 174 00:09:29,840 --> 00:09:32,800 Speaker 5: more power than we create right now. So those two 175 00:09:32,840 --> 00:09:35,440 Speaker 5: things are kind of troubling, that people have been making 176 00:09:35,480 --> 00:09:39,199 Speaker 5: investments in companies that are going to produce far more than they 177 00:09:39,240 --> 00:09:42,400 Speaker 5: can produce now. And do you see what I'm saying here? 178 00:09:42,440 --> 00:09:47,080 Speaker 5: We had these really big expectations, and I didn't think 179 00:09:47,080 --> 00:09:49,160 Speaker 5: that they were going to come to happen in the short 180 00:09:49,240 --> 00:09:52,520 Speaker 5: or long term because of the, well, physical limitations of it. 181 00:09:53,160 --> 00:09:55,200 Speaker 2: So Kim, at this very early stage, I think I'm 182 00:09:55,200 --> 00:09:58,120 Speaker 2: probably representative of most of our listeners and viewers. I had 183 00:09:58,160 --> 00:10:00,679 Speaker 2: not heard of DeepSeek before this morning. Sure, and 184 00:10:00,720 --> 00:10:03,120 Speaker 2: so that's calling into question kind of what we think 185 00:10:03,160 --> 00:10:06,560 Speaker 2: we understood about AI and its implications, not just 186 00:10:06,600 --> 00:10:09,920 Speaker 2: for technology but for the stock market in general. My only question 187 00:10:09,960 --> 00:10:11,720 Speaker 2: that I think I have now that has any relevance 188 00:10:11,800 --> 00:10:16,160 Speaker 2: is, do I have to rethink my spending associated with AI? 189 00:10:18,280 --> 00:10:21,280 Speaker 5: It depends on how long your timeframe is. If it's 190 00:10:21,320 --> 00:10:24,760 Speaker 5: the very short term, probably not. You know, you'll 191 00:10:24,760 --> 00:10:27,920 Speaker 5: get rewarded because, well, orders are in and all that 192 00:10:28,000 --> 00:10:30,320 Speaker 5: kind of good stuff.
And by short term I'm talking 193 00:10:30,320 --> 00:10:33,080 Speaker 5: a year. But let's say three to five years, maybe. 194 00:10:33,920 --> 00:10:36,720 Speaker 5: And why is that? Well, if we can really train 195 00:10:36,880 --> 00:10:42,600 Speaker 5: models more rapidly with less input, which is a good thing, right, 196 00:10:42,679 --> 00:10:45,040 Speaker 5: where were we going to get all this electricity? That 197 00:10:45,240 --> 00:10:49,680 Speaker 5: was like my biggest question. But anyhow, back to your question, 198 00:10:50,240 --> 00:10:56,760 Speaker 5: I think, you know, this is part of being an 199 00:10:56,800 --> 00:11:00,960 Speaker 5: investor: understanding how long is my play? Am I keeping this? 200 00:11:01,559 --> 00:11:05,080 Speaker 5: So that's part of the problem. We have two more 201 00:11:05,120 --> 00:11:09,200 Speaker 5: problems that are not being clarified by DeepSeek. One is, 202 00:11:09,720 --> 00:11:12,080 Speaker 5: and remember I used to do this, so I'm way 203 00:11:12,080 --> 00:11:16,400 Speaker 5: down the road from most investors in thinking about things. 204 00:11:16,679 --> 00:11:20,200 Speaker 5: But as I use it, I'm not getting the results 205 00:11:20,240 --> 00:11:22,960 Speaker 5: that I need. And by that I mean it's not 206 00:11:23,200 --> 00:11:26,600 Speaker 5: right enough. The error rate that I get back from 207 00:11:26,640 --> 00:11:30,080 Speaker 5: the questions that I'm asking is anywhere between five and 208 00:11:30,160 --> 00:11:34,040 Speaker 5: maybe thirty percent, which, that's huge. Like, I can't just 209 00:11:34,280 --> 00:11:37,920 Speaker 5: use AI. I have to babysit AI. So these are 210 00:11:38,000 --> 00:11:41,520 Speaker 5: problems that aren't solved by anybody at this point. And 211 00:11:41,640 --> 00:11:44,760 Speaker 5: then there is, how are we actually going to use it?
212 00:11:44,800 --> 00:11:48,040 Speaker 5: I think there was, I think, a Bloomberg report that 213 00:11:48,200 --> 00:11:51,319 Speaker 5: was saying that probably two hundred thousand people could be 214 00:11:51,440 --> 00:11:56,480 Speaker 5: laid off, those lower level, entry level people in finance 215 00:11:56,800 --> 00:12:00,480 Speaker 5: and then that mid tier that are now, you know, 216 00:12:00,600 --> 00:12:05,720 Speaker 5: kind of like the AI for investment banking and equity research. 217 00:12:06,640 --> 00:12:09,839 Speaker 5: But if we were that wrong, we wouldn't last, right? Like, 218 00:12:10,200 --> 00:12:13,439 Speaker 5: if you're five percent wrong in investment banking, your career 219 00:12:13,520 --> 00:12:17,440 Speaker 5: is really short. So you know, these are problems that 220 00:12:17,559 --> 00:12:22,000 Speaker 5: aren't being solved. So that's another issue for investors: 221 00:12:22,360 --> 00:12:24,440 Speaker 5: how long is it going to be till these 222 00:12:24,480 --> 00:12:26,160 Speaker 5: get right enough to replace a person? 223 00:12:26,640 --> 00:12:28,400 Speaker 4: That's such a great point, because I was also reading 224 00:12:28,400 --> 00:12:30,640 Speaker 4: an article that talked about the limitations so far that 225 00:12:30,679 --> 00:12:35,520 Speaker 4: have been seen about, gee, what is it, DeepSeek? 226 00:12:35,520 --> 00:12:39,640 Speaker 4: I want to say geek seek. DeepSeek, it was its 227 00:12:40,080 --> 00:12:44,120 Speaker 4: inability to discuss Xi Jinping or to give real 228 00:12:44,200 --> 00:12:47,959 Speaker 4: answers on Tiananmen Square, just those topics that are 229 00:12:47,960 --> 00:12:51,200 Speaker 4: near and dear to China's heart. And that's in part 230 00:12:51,240 --> 00:12:53,920 Speaker 4: some of the issue that you're talking about. So who 231 00:12:53,960 --> 00:12:55,480 Speaker 4: wins in this right now? 232 00:12:56,559 --> 00:13:00,760 Speaker 5: Who wins?
Well, I think investors are losing in the 233 00:13:00,840 --> 00:13:04,800 Speaker 5: short term. But if you, like me, think that AI 234 00:13:05,080 --> 00:13:09,800 Speaker 5: is a path to productivity, and productivity always wins, 235 00:13:09,840 --> 00:13:14,359 Speaker 5: I think you could be a winner. I think maybe companies 236 00:13:14,520 --> 00:13:18,880 Speaker 5: like OpenAI and Microsoft, you know, by dint of 237 00:13:19,679 --> 00:13:23,200 Speaker 5: you know, investing in them, and then everybody else who's 238 00:13:23,240 --> 00:13:28,000 Speaker 5: developing AI, they might be winners. Maybe the, what is 239 00:13:28,040 --> 00:13:30,200 Speaker 5: it, Stargate? I don't know why I can never remember 240 00:13:30,200 --> 00:13:32,560 Speaker 5: that. That's my hang-up here. I can never remember that 241 00:13:33,000 --> 00:13:35,160 Speaker 5: because it has nothing to do with stars or gates. 242 00:13:35,520 --> 00:13:39,280 Speaker 5: That Stargate might be the loser, and those people, because 243 00:13:39,280 --> 00:13:42,880 Speaker 5: we're not necessarily going to need to cover the earth 244 00:13:42,880 --> 00:13:43,760 Speaker 5: in data centers. 245 00:13:43,920 --> 00:13:45,240 Speaker 4: Right, yep. All 246 00:13:45,240 --> 00:13:47,559 Speaker 2: right, Kim, thanks so much for joining us. Always appreciate 247 00:13:47,559 --> 00:13:51,000 Speaker 2: getting your perspective and your technology background. Kim Forrest, founder 248 00:13:51,320 --> 00:13:56,080 Speaker 2: and chief investment officer of Bokeh Capital Partners, joining us 249 00:13:56,160 --> 00:13:57,920 Speaker 2: from Pittsburgh via the Zoom thing. 250 00:14:00,040 --> 00:14:04,000 Speaker 1: You're listening to the Bloomberg Intelligence Podcast. Catch us live weekdays 251 00:14:04,000 --> 00:14:07,040 Speaker 1: at ten am Eastern on Apple CarPlay and Android Auto with 252 00:14:07,120 --> 00:14:10,240 Speaker 1: the Bloomberg Business App.
Listen on demand wherever you get 253 00:14:10,280 --> 00:14:13,520 Speaker 1: your podcasts, or watch us live on YouTube. 254 00:14:14,160 --> 00:14:16,720 Speaker 4: Happy Monday, everybody. Alix Steel here alongside Paul Sweeney. This 255 00:14:16,720 --> 00:14:19,400 Speaker 4: is Bloomberg Intelligence Radio. We are broadcasting live from our 256 00:14:19,560 --> 00:14:23,040 Speaker 4: Interactive Brokers studio right here in Midtown Manhattan. Also check 257 00:14:23,120 --> 00:14:25,920 Speaker 4: us out on YouTube dot com. We clearly have the 258 00:14:25,920 --> 00:14:29,880 Speaker 4: tech sell off underway, really concentrated in certain names, the 259 00:14:29,920 --> 00:14:33,040 Speaker 4: Mag seven obviously, Nvidia, the chip stocks, although Apple's up, 260 00:14:33,080 --> 00:14:35,200 Speaker 4: I should say, but Nvidia obviously getting pounded, the 261 00:14:35,240 --> 00:14:37,400 Speaker 4: chip stocks getting pounded as well. So we wanted to 262 00:14:37,800 --> 00:14:40,960 Speaker 4: continue the conversation about AI and whether you really need to 263 00:14:41,000 --> 00:14:45,040 Speaker 4: rethink this investment thesis. Joining us now is Josh Pantony. 264 00:14:45,120 --> 00:14:48,840 Speaker 4: He's CEO of Boosted.ai. Now, the company has helped 265 00:14:48,920 --> 00:14:53,080 Speaker 4: dozens of investment managers, whose AUM totals over one trillion 266 00:14:53,120 --> 00:14:56,440 Speaker 4: dollars, implement machine learning in their portfolios. 267 00:14:56,880 --> 00:15:01,720 Speaker 4: First off, Josh, have you seen this new geek seek? Geek seek? 268 00:15:01,720 --> 00:15:04,000 Speaker 4: Why do I keep saying geek seek? It's not geek 269 00:15:04,000 --> 00:15:10,680 Speaker 4: seek, it's DeepSeek. Have you tried DeepSeek? Have 270 00:15:10,720 --> 00:15:12,720 Speaker 4: you talked to your clients about it? What do you think? 271 00:15:13,720 --> 00:15:16,240 Speaker 6: Yeah. So, I mean, it came out I think last Monday.
272 00:15:16,680 --> 00:15:19,200 Speaker 6: We've already run a whole suite of benchmarks against it, 273 00:15:19,200 --> 00:15:20,480 Speaker 6: tried that out over a bunch of things, and I'd 274 00:15:20,520 --> 00:15:24,120 Speaker 6: say we're pretty familiar with it. I think it's impressive 275 00:15:24,440 --> 00:15:27,560 Speaker 6: in a number of ways. It's impressive how much 276 00:15:27,600 --> 00:15:29,800 Speaker 6: power you get for the cost. I think the flip 277 00:15:29,840 --> 00:15:33,520 Speaker 6: side is, it's actually not quite something that we could 278 00:15:33,560 --> 00:15:36,000 Speaker 6: even consider implementing into our process. And I think 279 00:15:36,000 --> 00:15:37,360 Speaker 6: there's a lot of clients that will reach the same 280 00:15:37,400 --> 00:15:40,240 Speaker 6: conclusion after they start working with it. Why is that? 281 00:15:42,080 --> 00:15:42,160 Speaker 5: So? 282 00:15:42,480 --> 00:15:44,880 Speaker 6: A lot of how we help our clients is with 283 00:15:44,960 --> 00:15:48,640 Speaker 6: things like trying to identify risk and trying 284 00:15:49,400 --> 00:15:51,440 Speaker 6: to build out these different workflows for different parts of 285 00:15:51,440 --> 00:15:53,840 Speaker 6: the financial process. And a really big part of that 286 00:15:54,200 --> 00:15:56,520 Speaker 6: is related to trust. You need to be able to 287 00:15:56,560 --> 00:15:58,480 Speaker 6: give answers that you're very confident in. You need to 288 00:15:58,480 --> 00:16:01,480 Speaker 6: give answers that you can sort of deeply intuit 289 00:16:01,520 --> 00:16:03,200 Speaker 6: where it's coming from. And I think one of the 290 00:16:03,320 --> 00:16:05,200 Speaker 6: challenges of this model is that it has some very 291 00:16:05,280 --> 00:16:08,160 Speaker 6: clear political biases right off the bat.
So if I'm 292 00:16:08,200 --> 00:16:10,880 Speaker 6: trying to use it to, say, understand what my portfolio 293 00:16:10,880 --> 00:16:14,400 Speaker 6: exposure is to China, it's going to be a challenge to 294 00:16:14,520 --> 00:16:17,400 Speaker 6: use it for things like that. Also, a lot of 295 00:16:17,440 --> 00:16:20,080 Speaker 6: the things we're most excited about are things like being 296 00:16:20,120 --> 00:16:21,960 Speaker 6: able to do computer use, being able to sort of 297 00:16:21,960 --> 00:16:24,480 Speaker 6: teach the machine how to, like, interact with different apps 298 00:16:24,480 --> 00:16:27,960 Speaker 6: and things on the computer, image recognition, trying to extract 299 00:16:27,960 --> 00:16:30,520 Speaker 6: out charts and things like that. And a lot of 300 00:16:30,560 --> 00:16:32,760 Speaker 6: those kinds of capabilities it actually doesn't really have, 301 00:16:32,760 --> 00:16:34,680 Speaker 6: and doesn't seem to be as sophisticated in. So it can 302 00:16:34,720 --> 00:16:37,920 Speaker 6: do very well in certain benchmarks, but when it actually 303 00:16:37,960 --> 00:16:39,560 Speaker 6: comes to trying to apply it for at least most 304 00:16:39,600 --> 00:16:41,800 Speaker 6: of the use cases that we're doing, it's not quite 305 00:16:41,880 --> 00:16:42,440 Speaker 6: there yet. 306 00:16:42,640 --> 00:16:44,920 Speaker 4: What is there for you right now? Like, what does work? 307 00:16:46,440 --> 00:16:48,400 Speaker 6: I think the most interesting thing for me is actually 308 00:16:48,440 --> 00:16:50,440 Speaker 6: on the reasoning side.
So of course, like, one of 309 00:16:50,440 --> 00:16:52,480 Speaker 6: the really big pushes OpenAI has had right now is, 310 00:16:52,520 --> 00:16:55,600 Speaker 6: like, with o1 and o3, you've got these models coming 311 00:16:55,640 --> 00:16:58,080 Speaker 6: in where in theory they can take a task, break 312 00:16:58,120 --> 00:17:01,520 Speaker 6: it into subcomponents, and then start executing those components. And 313 00:17:01,560 --> 00:17:05,239 Speaker 6: I think in that area it's actually quite good. I 314 00:17:05,280 --> 00:17:08,160 Speaker 6: also think, you know, just from a cost basis, it's 315 00:17:08,200 --> 00:17:10,680 Speaker 6: extremely impressive what they've been able to do. There's been some 316 00:17:11,160 --> 00:17:14,000 Speaker 6: controversy about exactly how much it costs to actually train 317 00:17:14,040 --> 00:17:16,440 Speaker 6: the model. What I can confirm, though, is the actual 318 00:17:16,480 --> 00:17:19,440 Speaker 6: inference cost, the cost of running the model, is extremely 319 00:17:19,520 --> 00:17:21,199 Speaker 6: cheap compared to what you're getting with, like, o1 320 00:17:21,240 --> 00:17:23,240 Speaker 6: and o3, and so just the fact it's even 321 00:17:23,320 --> 00:17:25,480 Speaker 6: possible to do that is a major technology breakthrough. 322 00:17:26,320 --> 00:17:31,600 Speaker 2: If nothing else, does DeepSeek just highlight the cost 323 00:17:31,840 --> 00:17:36,840 Speaker 2: issue and potentially the, I don't know, the movement down 324 00:17:37,000 --> 00:17:38,960 Speaker 2: in costs of implementing AI? 325 00:17:40,480 --> 00:17:42,920 Speaker 6: Yeah, the way I tend to think about it is, 326 00:17:43,320 --> 00:17:46,760 Speaker 6: the cheaper it is to train AI systems, the more 327 00:17:47,040 --> 00:17:49,320 Speaker 6: advanced capabilities you can train.
328 00:17:49,359 --> 00:17:52,240 Speaker 4: I think we just lost your audio. Josh, you there? Yes, 329 00:17:52,320 --> 00:17:52,680 Speaker 4: I am 330 00:17:52,560 --> 00:17:54,120 Speaker 5: here. Still hear me? We're good? 331 00:17:54,160 --> 00:17:57,199 Speaker 6: Okay, perfect. Yeah. The way I like to think 332 00:17:57,200 --> 00:18:00,160 Speaker 6: about it is, the cheaper it is to train these 333 00:18:00,400 --> 00:18:04,600 Speaker 6: AI systems, the more advanced capabilities you can train; 334 00:18:04,760 --> 00:18:07,360 Speaker 6: the more advanced capabilities you can train, the more use 335 00:18:07,400 --> 00:18:10,280 Speaker 6: cases you unlock; the more use cases you unlock, the more 336 00:18:10,320 --> 00:18:12,520 Speaker 6: you actually see it accelerate. So I actually think this 337 00:18:12,560 --> 00:18:16,040 Speaker 6: model will probably cause an acceleration in the capabilities of 338 00:18:16,040 --> 00:18:18,720 Speaker 6: other models that are getting built as more folks start 339 00:18:18,760 --> 00:18:20,199 Speaker 6: to adopt it. So for me, I actually see it 340 00:18:20,200 --> 00:18:21,120 Speaker 6: as completely positive. 341 00:18:22,480 --> 00:18:26,159 Speaker 4: What, outside of DeepSeek, what other models, like, have 342 00:18:26,280 --> 00:18:27,800 Speaker 4: you worked with that you like? 343 00:18:27,920 --> 00:18:31,280 Speaker 6: Yeah, so, I mean, behind the scenes, we work 344 00:18:31,320 --> 00:18:33,640 Speaker 6: with Anthropic models, we work with OpenAI models. 345 00:18:34,280 --> 00:18:36,760 Speaker 6: We work with a bunch of fine-tuned models that 346 00:18:36,800 --> 00:18:40,040 Speaker 6: we build in-house ourselves. You know, I like to 347 00:18:40,040 --> 00:18:42,840 Speaker 6: sort of describe it as an orchestra of different types 348 00:18:42,840 --> 00:18:44,840 Speaker 6: of models.
And one of the things we've sort of 349 00:18:44,880 --> 00:18:48,520 Speaker 6: noticed is there's a lot of differences in capabilities. So 350 00:18:48,520 --> 00:18:50,840 Speaker 6: something like the Anthropic models tend to 351 00:18:50,880 --> 00:18:53,120 Speaker 6: be better at a very long context window, where you try 352 00:18:53,119 --> 00:18:56,600 Speaker 6: and handle, like, huge amounts of text, whereas something like 353 00:18:56,640 --> 00:18:58,479 Speaker 6: the GPT-4 models tend to be a little bit 354 00:18:58,480 --> 00:19:01,800 Speaker 6: better at, like, foreign language and just sort of the 355 00:19:03,240 --> 00:19:06,240 Speaker 6: general verbiage it uses as it's giving outputs. So we 356 00:19:06,520 --> 00:19:07,280 Speaker 6: use a whole bunch. 357 00:19:08,760 --> 00:19:11,280 Speaker 2: How are your clients in the asset management business, 358 00:19:12,240 --> 00:19:14,320 Speaker 2: Josh, using AI these days, in these early days? 359 00:19:15,680 --> 00:19:17,320 Speaker 6: Yeah. So the way I like to think about it 360 00:19:17,359 --> 00:19:18,920 Speaker 6: is we give them the ability to create this sort 361 00:19:18,960 --> 00:19:22,399 Speaker 6: of team of little AI workers, where you teach them 362 00:19:22,440 --> 00:19:24,200 Speaker 6: how to do some kind of task, and then they're 363 00:19:24,200 --> 00:19:26,439 Speaker 6: going to do that task on a continuous basis. So 364 00:19:26,880 --> 00:19:29,040 Speaker 6: let's say you had to do something like write an investment 365 00:19:29,040 --> 00:19:31,080 Speaker 6: thesis on a company, or let's say you had to 366 00:19:31,119 --> 00:19:34,080 Speaker 6: do something like write an ESG report, or let's say 367 00:19:34,119 --> 00:19:38,800 Speaker 6: you wanted to continuously monitor the world for any kinds 368 00:19:38,840 --> 00:19:41,520 Speaker 6: of updates to something that might happen in that space.
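[Editor's note] The "orchestra of models" idea, sending each request to whichever model family handles it best, can be sketched as a simple routing function. The capability labels, thresholds, and backend names below are illustrative assumptions, not Boosted.ai's actual routing logic.

```python
# Sketch of capability-based model routing: pick a backend from coarse
# features of the request. All model names and thresholds are invented
# for illustration.

ROUTES = {
    "long_context": "anthropic-claude",  # better at very long context windows
    "multilingual": "gpt-4",             # better at foreign-language output
    "reasoning": "deepseek-r1",          # cheap step-by-step reasoning
}

def pick_model(text: str, language: str = "en") -> str:
    """Choose a backend model for one request."""
    if len(text) > 50_000:     # huge amounts of text -> long-context model
        return ROUTES["long_context"]
    if language != "en":       # non-English work -> multilingual model
        return ROUTES["multilingual"]
    return ROUTES["reasoning"]  # default: cheap reasoning model
```

In practice a router like this sits in front of the whole "orchestra," so each fine-tuned or third-party model only sees the requests it is strongest at.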
369 00:19:42,040 --> 00:19:45,119 Speaker 6: These are all examples of sort of workflows you can 370 00:19:45,119 --> 00:19:47,000 Speaker 6: teach the system and have the system start to automate. 371 00:19:48,200 --> 00:19:50,160 Speaker 4: Well, we really appreciate your time. It was so good 372 00:19:50,160 --> 00:19:52,080 Speaker 4: to get that perspective. It's good to kind of get 373 00:19:52,080 --> 00:19:55,600 Speaker 4: the user mindset in for this. Josh Pantony, CEO 374 00:19:55,640 --> 00:19:59,119 Speaker 4: of Boosted dot AI, on DeepSeek and sort of 375 00:19:59,160 --> 00:20:00,560 Speaker 4: the pros and the cons there. 376 00:20:01,480 --> 00:20:06,199 Speaker 1: This is the Bloomberg Intelligence podcast, available on Apple, Spotify, 377 00:20:06,400 --> 00:20:09,840 Speaker 1: and anywhere else you get your podcasts. Listen live each 378 00:20:09,880 --> 00:20:13,640 Speaker 1: weekday ten am to noon Eastern on Bloomberg dot com, 379 00:20:13,760 --> 00:20:17,280 Speaker 1: the iHeartRadio app, tune In, and the Bloomberg Business app. 380 00:20:17,720 --> 00:20:20,800 Speaker 1: You can also watch us live every weekday on YouTube 381 00:20:21,040 --> 00:20:23,280 Speaker 1: and always on the Bloomberg terminal.