1 00:00:02,520 --> 00:00:05,439 Speaker 1: Bloomberg Audio Studios, podcasts. 2 00:00:05,760 --> 00:00:10,479 Speaker 2: Radio newsman Mandeep Singh joins us right now. Did 3 00:00:10,520 --> 00:00:14,280 Speaker 2: you know what DeepSeek was before twelve or eighteen 4 00:00:14,400 --> 00:00:15,040 Speaker 2: hours ago? 5 00:00:15,400 --> 00:00:18,160 Speaker 3: I did, yes. And so what did you know about 6 00:00:18,200 --> 00:00:19,280 Speaker 3: them two days ago? 7 00:00:20,079 --> 00:00:23,160 Speaker 1: That it is a competitor in the LLM space. They've 8 00:00:23,160 --> 00:00:26,520 Speaker 1: been training their own model and it's used mostly in 9 00:00:26,560 --> 00:00:27,800 Speaker 1: the East Asian region. 10 00:00:27,960 --> 00:00:30,560 Speaker 2: Okay, what I learned, thank you Zerohedge for this, 11 00:00:30,600 --> 00:00:35,720 Speaker 2: from Morgan Brown at Dropbox, is basically they've said, we're 12 00:00:35,720 --> 00:00:38,199 Speaker 2: not going to be as perfect, as pristine as the 13 00:00:38,280 --> 00:00:42,040 Speaker 2: others out to thirty two decimal points. We're just going 14 00:00:42,080 --> 00:00:47,560 Speaker 2: to get it done out to eight decimal points. 15 00:00:47,680 --> 00:00:49,239 Speaker 2: Is that what this is all about, is they're not 16 00:00:49,240 --> 00:00:50,120 Speaker 2: going to be perfect? 17 00:00:50,479 --> 00:00:52,760 Speaker 1: I think there's more to it than just, you know, 18 00:00:53,000 --> 00:00:57,240 Speaker 1: using fewer floating point operations than OpenAI 19 00:00:57,280 --> 00:01:01,279 Speaker 1: and Anthropic and others. And in this case, they 20 00:01:01,800 --> 00:01:06,679 Speaker 1: basically focused on hardware efficiency, using the hardware in the 21 00:01:06,720 --> 00:01:10,399 Speaker 1: most efficient fashion. They had the benefit of all these 22 00:01:10,600 --> 00:01:14,160 Speaker 1: LLMs being out there.
They used Meta's Llama as a 23 00:01:14,200 --> 00:01:17,880 Speaker 1: reference point. It's another open source model, and they figured 24 00:01:17,880 --> 00:01:21,040 Speaker 1: out a way to do this more efficiently than everyone else. 25 00:01:21,080 --> 00:01:24,360 Speaker 1: I mean, everyone is focused on scale right now; they 26 00:01:24,560 --> 00:01:26,240 Speaker 1: focused on hardware efficiency. 27 00:01:26,240 --> 00:01:27,720 Speaker 3: Call the Llama Meta. 28 00:01:27,760 --> 00:01:31,800 Speaker 2: It's Facebook. It's the grandson of the original Llama on 29 00:01:31,840 --> 00:01:32,360 Speaker 2: Saturday Night. 30 00:01:32,680 --> 00:01:36,280 Speaker 4: There you go, right, very good. Mandeep, you've got 31 00:01:36,280 --> 00:01:39,440 Speaker 4: an analyst at Bloomberg Intelligence in Hong Kong, Robert Lee. I'm 32 00:01:39,480 --> 00:01:42,480 Speaker 4: reading his research literally as we speak. And for folks 33 00:01:42,880 --> 00:01:45,600 Speaker 4: on the Bloomberg terminal, go to BI and check it out 34 00:01:45,600 --> 00:01:46,920 Speaker 4: if you want to figure out what's going 35 00:01:46,959 --> 00:01:48,960 Speaker 4: on with DeepSeek and what China's doing with AI. 36 00:01:49,520 --> 00:01:53,920 Speaker 4: Bloomberg Intelligence has got the data and the research there. 37 00:01:55,080 --> 00:01:59,520 Speaker 4: Can China support a real competitive AI environment? It seems 38 00:01:59,520 --> 00:02:04,000 Speaker 4: like their government is restricting some of the technology there.
39 00:02:04,360 --> 00:02:08,080 Speaker 1: Well, so now we know the export controls probably weren't 40 00:02:08,200 --> 00:02:11,480 Speaker 1: as effective as they were supposed to be here, you know, 41 00:02:11,560 --> 00:02:14,440 Speaker 1: in terms of restricting the access to Nvidia's latest chips, 42 00:02:14,800 --> 00:02:17,799 Speaker 1: and some people are saying they did use, you know, 43 00:02:17,960 --> 00:02:21,680 Speaker 1: the latest H100 chips, albeit they were not 44 00:02:21,800 --> 00:02:25,000 Speaker 1: of the same scale as were available to OpenAI 45 00:02:25,160 --> 00:02:28,959 Speaker 1: and the others. But look, we don't know the extent 46 00:02:29,120 --> 00:02:32,280 Speaker 1: of the hardware that was used for training. All I 47 00:02:32,320 --> 00:02:34,680 Speaker 1: can say, you know, based on the fact that 48 00:02:34,720 --> 00:02:37,760 Speaker 1: they have a model comparable in performance to the others: 49 00:02:38,240 --> 00:02:42,279 Speaker 1: it shows that they clearly have a combination of algorithms 50 00:02:42,280 --> 00:02:45,239 Speaker 1: and compute to compete with the others. 51 00:02:45,320 --> 00:02:48,240 Speaker 2: Most of the financials, the revenue growth models that 52 00:02:48,320 --> 00:02:52,040 Speaker 2: you and BI have of all these fancy AI people, 53 00:02:52,160 --> 00:02:53,440 Speaker 2: are they now at risk? 54 00:02:53,760 --> 00:02:54,359 Speaker 3: Absolutely? 55 00:02:54,360 --> 00:02:57,120 Speaker 1: I mean, look at OpenAI's o1 Pro model. 56 00:02:57,160 --> 00:03:01,560 Speaker 1: They're charging two hundred dollars per month because it's the 57 00:03:01,560 --> 00:03:04,520 Speaker 1: best model they have to offer. The fact that you 58 00:03:04,600 --> 00:03:07,639 Speaker 1: have an open source model that is comparable to the 59 00:03:07,760 --> 00:03:09,600 Speaker 1: model that OpenAI is charging for.
60 00:03:09,840 --> 00:03:11,720 Speaker 3: You're saying it's comparable. 61 00:03:11,400 --> 00:03:13,480 Speaker 1: I mean it is, yeah. I mean, look at the 62 00:03:13,520 --> 00:03:16,799 Speaker 1: benchmarks, and that's why, you know, all these models have 63 00:03:16,919 --> 00:03:20,480 Speaker 1: common benchmarks. This is within one to two percentage points 64 00:03:20,560 --> 00:03:21,400 Speaker 1: of that benchmark. 65 00:03:21,480 --> 00:03:23,120 Speaker 3: I know you don't do buy, hold, sell, but are 66 00:03:23,120 --> 00:03:24,760 Speaker 3: we going to sell here? Are we going to see 67 00:03:24,800 --> 00:03:27,560 Speaker 3: Wall Street put a sell on Nvidia? Can you 68 00:03:27,639 --> 00:03:28,080 Speaker 3: predict that? 69 00:03:28,800 --> 00:03:31,040 Speaker 1: I mean, right now, the fact that Meta raised their 70 00:03:31,120 --> 00:03:35,040 Speaker 1: capex on Friday. To me, earnings season is where we 71 00:03:35,080 --> 00:03:37,600 Speaker 1: will find out what all these companies end up doing. 72 00:03:37,680 --> 00:03:40,560 Speaker 1: So, you know, all this has happened in the last 73 00:03:40,920 --> 00:03:44,080 Speaker 1: forty eight hours, where Meta raised their capex, DeepSeek 74 00:03:44,200 --> 00:03:46,880 Speaker 1: came out, and suddenly everyone is going crazy. 75 00:03:46,960 --> 00:03:50,200 Speaker 2: Paul, the President lined up with the Son of Japan 76 00:03:50,720 --> 00:03:52,760 Speaker 2: to do a ginormous tech deal as well. 77 00:03:52,920 --> 00:03:54,440 Speaker 3: You okay there? Look at you. 78 00:03:55,000 --> 00:03:59,080 Speaker 4: I'm doing social here. Are you doing social? We're doing 79 00:03:59,120 --> 00:04:02,120 Speaker 4: that out on the social. I'm saying, take a look at that. So, Mandeep, 80 00:04:02,200 --> 00:04:05,680 Speaker 4: what do we do here?
As we think about AI, 81 00:04:05,720 --> 00:04:09,600 Speaker 4: obviously from a technology perspective, it has been the story for 82 00:04:09,640 --> 00:04:12,680 Speaker 4: the last two years at least. How do you think 83 00:04:12,680 --> 00:04:15,600 Speaker 4: about it now? Has anything changed in the last twenty 84 00:04:15,640 --> 00:04:16,359 Speaker 4: four hours for you? 85 00:04:16,960 --> 00:04:20,679 Speaker 1: A lot of software companies will feel, you know, they 86 00:04:20,800 --> 00:04:23,479 Speaker 1: can do their own AI now and not have to 87 00:04:23,560 --> 00:04:28,120 Speaker 1: rely on the big hyperscalers, because the narrative was, if 88 00:04:28,160 --> 00:04:32,120 Speaker 1: you can't spend upfront on capex, then you don't have 89 00:04:32,200 --> 00:04:34,640 Speaker 1: a play when it comes to the foundational model. 90 00:04:35,000 --> 00:04:38,240 Speaker 1: And so suddenly everyone feels empowered that they can do 91 00:04:38,320 --> 00:04:40,960 Speaker 1: their own AI with this development. 92 00:04:40,920 --> 00:04:44,039 Speaker 4: I see Nvidia, Broadcom, some of these big chip makers that 93 00:04:44,080 --> 00:04:46,000 Speaker 4: have had such a huge run on the AI side, 94 00:04:46,120 --> 00:04:49,760 Speaker 4: they're down ten, eleven, twelve percent here today. Is 95 00:04:49,839 --> 00:04:52,680 Speaker 4: that realistic? Does that seem reasonable to you? 96 00:04:53,080 --> 00:04:55,640 Speaker 1: I mean, with semi companies, we know the risk is 97 00:04:55,680 --> 00:04:59,159 Speaker 1: always that cyclical element. And if we are calling that 98 00:04:59,279 --> 00:05:01,640 Speaker 1: this is the top of the cycle in terms 99 00:05:01,680 --> 00:05:05,479 Speaker 1: of semi demand and estimates aren't going up anymore, then, 100 00:05:05,600 --> 00:05:07,800 Speaker 1: you know, you will see this sort of stock reaction.
101 00:05:07,880 --> 00:05:09,920 Speaker 1: But I go back to my point about Meta raising 102 00:05:09,960 --> 00:05:13,279 Speaker 1: their capex projection target to one hundred billion. 103 00:05:13,279 --> 00:05:15,960 Speaker 5: We haven't talked about that fifty 104 00:05:16,000 --> 00:05:20,080 Speaker 5: billion. Meta capex was twenty five billion in 105 00:05:20,160 --> 00:05:23,480 Speaker 5: twenty nineteen, then it was fifty billion for this year. 106 00:05:23,920 --> 00:05:26,680 Speaker 4: Now they're saying it's going to be sixty five. I 107 00:05:26,760 --> 00:05:28,600 Speaker 4: feel like, when I look at those numbers, it 108 00:05:28,600 --> 00:05:32,240 Speaker 4: feels like drunken sailor time. Did they have 109 00:05:32,320 --> 00:05:34,760 Speaker 4: a real strategy behind that spending, or are they just 110 00:05:34,760 --> 00:05:36,960 Speaker 4: saying, we need to be in this game, we need 111 00:05:37,000 --> 00:05:39,520 Speaker 4: to spend whatever we need to spend, typical Meta type 112 00:05:39,520 --> 00:05:40,000 Speaker 4: of spending? 113 00:05:40,200 --> 00:05:42,960 Speaker 1: Yeah, and I think there is that aspect where you 114 00:05:43,120 --> 00:05:45,559 Speaker 1: want to be that front end player when it comes 115 00:05:45,560 --> 00:05:48,440 Speaker 1: to the LLMs. But in the case of Meta, the challenge, 116 00:05:48,880 --> 00:05:50,880 Speaker 1: in addition to the fact that, you know, the DeepSeek 117 00:05:51,000 --> 00:05:53,440 Speaker 1: model is out there, is they don't have a cloud business. 118 00:05:53,520 --> 00:05:55,800 Speaker 1: You look at Microsoft, you look at Amazon, you look 119 00:05:55,839 --> 00:05:58,279 Speaker 1: at Google: they have a cloud business to 120 00:05:58,400 --> 00:06:01,680 Speaker 1: monetize those GPUs.
Even if, let's say, DeepSeek came 121 00:06:01,760 --> 00:06:05,080 Speaker 1: up with a better model, Microsoft can use that capacity 122 00:06:05,080 --> 00:06:06,440 Speaker 1: for inferencing on the cloud. 123 00:06:06,600 --> 00:06:10,000 Speaker 3: They can generate cloud revenue. Mandeep, what are the meetings 124 00:06:10,200 --> 00:06:12,880 Speaker 3: like in Silicon Valley now? 125 00:06:13,279 --> 00:06:15,600 Speaker 2: I mean, these guys never get up before nine am, 126 00:06:15,839 --> 00:06:18,160 Speaker 2: but today they're gonna get up at six am their time. 127 00:06:18,400 --> 00:06:21,760 Speaker 2: They're gonna be rocking at nine am Surveillance time. What 128 00:06:21,800 --> 00:06:24,599 Speaker 2: are the meetings gonna be like at Google? What's the 129 00:06:24,640 --> 00:06:26,440 Speaker 2: meeting gonna be like for Zuckerberg? 130 00:06:27,160 --> 00:06:30,279 Speaker 1: Oh, so I think, again, given these companies are reporting 131 00:06:30,320 --> 00:06:33,400 Speaker 1: earnings next week, right now they have to figure out 132 00:06:33,839 --> 00:06:35,960 Speaker 1: what is it that they relay to the investors in 133 00:06:36,040 --> 00:06:39,520 Speaker 1: terms of the scaling laws. Like, up until now, the 134 00:06:39,560 --> 00:06:43,159 Speaker 1: biggest debate was, will the scaling laws hold in twenty 135 00:06:43,200 --> 00:06:47,280 Speaker 1: twenty five? And based on this development, we won't be talking 136 00:06:47,320 --> 00:06:48,200 Speaker 1: about scaling laws. 137 00:06:48,279 --> 00:06:51,159 Speaker 3: Do they replicate what DeepSeek is doing? 138 00:06:51,839 --> 00:06:55,080 Speaker 1: Absolutely. I think they will use some of the innovations 139 00:06:55,120 --> 00:06:57,400 Speaker 1: that DeepSeek has shown in their paper, and given 140 00:06:57,440 --> 00:07:00,720 Speaker 1: they have open sourced it.
My bet is everyone cares 141 00:07:00,720 --> 00:07:04,520 Speaker 1: about hardware efficiency. Even if Microsoft is spending eighty billion 142 00:07:04,560 --> 00:07:07,640 Speaker 1: on AI capex, they want to use that capex efficiently. 143 00:07:07,720 --> 00:07:11,240 Speaker 2: The smartest thing I've heard from the technologist Paul Sweeney, 144 00:07:11,400 --> 00:07:13,880 Speaker 2: your Ted talk on this, right, your Ted talk. 145 00:07:14,120 --> 00:07:16,680 Speaker 3: You almost wore a tie to your Ted thing. Exactly. 146 00:07:17,080 --> 00:07:18,520 Speaker 3: They're drunken sailors. 147 00:07:18,680 --> 00:07:21,080 Speaker 2: I mean, that's the smartest thing I've heard. Do you 148 00:07:21,120 --> 00:07:22,880 Speaker 2: see capex responsibility? 149 00:07:23,360 --> 00:07:24,240 Speaker 3: No? I think. 150 00:07:24,920 --> 00:07:29,160 Speaker 1: I mean, to my mind, all this capex can be used, 151 00:07:29,240 --> 00:07:32,000 Speaker 1: you know, for the future, and it's quite fungible. It's 152 00:07:32,040 --> 00:07:34,880 Speaker 1: not like laying a fiber network and, oh, it's gonna 153 00:07:34,920 --> 00:07:39,880 Speaker 1: go to waste. Compute is always used in some form. Now, yes, 154 00:07:39,960 --> 00:07:43,040 Speaker 1: they may have overpaid for the GPUs. You could argue, 155 00:07:43,560 --> 00:07:45,920 Speaker 1: was it worth paying thirty thousand dollars for a GPU? 156 00:07:46,080 --> 00:07:48,840 Speaker 1: Probably not, but still this compute is still useful. 157 00:07:48,880 --> 00:07:51,120 Speaker 2: Mandeep Singh, folks. We're gonna go around equities, bonds, 158 00:07:51,160 --> 00:07:53,400 Speaker 2: currencies, commodities again. We're off four percent. 159 00:07:53,840 --> 00:07:56,960 Speaker 3: No, sit down, you're not done yet. You know where 160 00:07:57,040 --> 00:07:57,520 Speaker 3: you're going. 161 00:07:58,240 --> 00:07:59,320 Speaker 4: Two segments you should.
162 00:07:59,080 --> 00:08:01,720 Speaker 3: See, it's an English breakfast on Monday. It's like this huge 163 00:08:01,760 --> 00:08:02,800 Speaker 3: kind of a thing. 164 00:08:02,960 --> 00:08:06,640 Speaker 2: The bond market: yields in big time, which you'd expect 165 00:08:06,720 --> 00:08:10,120 Speaker 2: here. The two-year yield four point one eight percent, in 166 00:08:10,320 --> 00:08:14,520 Speaker 2: nine basis points; the ten-year yield in eleven basis points. 167 00:08:14,560 --> 00:08:17,360 Speaker 2: First thing I looked at: the real yield crushes in 168 00:08:17,520 --> 00:08:20,400 Speaker 2: ten basis points, from a two point two zero to 169 00:08:20,400 --> 00:08:23,000 Speaker 2: two point one zero, some of the angst there in 170 00:08:23,040 --> 00:08:23,680 Speaker 2: the market. 171 00:08:23,880 --> 00:08:25,560 Speaker 3: And what I look at, and it's not a good 172 00:08:25,600 --> 00:08:26,240 Speaker 3: number yet: 173 00:08:26,120 --> 00:08:29,000 Speaker 2: the VIX twenty one point five four, not thirty, 174 00:08:29,080 --> 00:08:31,040 Speaker 2: but from fifteen to twenty one. 175 00:08:31,040 --> 00:08:33,719 Speaker 4: That gets your attention, Paul. Mandeep, the concern here: 176 00:08:34,120 --> 00:08:37,760 Speaker 4: the DeepSeek product could be as good an AI 177 00:08:37,880 --> 00:08:40,920 Speaker 4: solution as what some of the Western companies are providing, 178 00:08:41,280 --> 00:08:43,880 Speaker 4: but at a lower cost. Is that the bottom line here? 179 00:08:44,000 --> 00:08:46,400 Speaker 1: Yeah. I mean, look at the app stores. Deep 180 00:08:46,400 --> 00:08:49,679 Speaker 1: Seek is the top app right now here in the US, 181 00:08:49,760 --> 00:08:53,800 Speaker 1: and everyone is using it for a multitude of chat. 182 00:08:54,360 --> 00:08:56,160 Speaker 3: No, I don't mean to interrupt, but this is important. 183 00:08:56,559 --> 00:09:00,959 Speaker 2: Lisa Mateo's not using it, Tom Keene is not using it.
184 00:09:01,480 --> 00:09:04,720 Speaker 2: An adult like you or Joe Weisenthal, who's knee deep 185 00:09:04,720 --> 00:09:05,360 Speaker 2: into this stuff. 186 00:09:05,400 --> 00:09:09,280 Speaker 3: Good morning, Joe. Joey. Joe's not up yet. Okay, Joe 187 00:09:09,280 --> 00:09:12,960 Speaker 3: Weisenthal is going to bring up DeepSeek and bring up 188 00:09:13,040 --> 00:09:16,400 Speaker 2: ChatGPT at one of the different paid levels. Can 189 00:09:16,440 --> 00:09:17,400 Speaker 2: you tell the difference? 190 00:09:18,559 --> 00:09:21,800 Speaker 1: Look, I mean, there is an element of personalization that 191 00:09:21,920 --> 00:09:24,880 Speaker 1: these apps provide when it comes to asking the 192 00:09:24,920 --> 00:09:27,560 Speaker 1: type of questions you're asking. But if you have a 193 00:09:27,600 --> 00:09:30,400 Speaker 1: generic query, DeepSeek is going to give you an 194 00:09:30,440 --> 00:09:32,240 Speaker 1: answer that's comparable to ChatGPT. 195 00:09:32,679 --> 00:09:36,839 Speaker 4: I'm looking, Tom. Tell me you're on DeepSeek now. 196 00:09:36,880 --> 00:09:41,640 Speaker 4: I Googled the top apps: number one, DeepSeek; number 197 00:09:41,640 --> 00:09:47,400 Speaker 4: two, ChatGPT; and number three, Paramount Plus, for Landman 198 00:09:47,480 --> 00:09:50,160 Speaker 4: and for Yellowstone. That's what I'm talking about. 199 00:09:49,880 --> 00:09:51,960 Speaker 2: We had a Landman weekend at home. I said, 200 00:09:51,960 --> 00:09:53,559 Speaker 2: Sweeney's, like, deep into it. 201 00:09:53,400 --> 00:09:54,120 Speaker 4: And number seven, 202 00:09:54,200 --> 00:09:57,600 Speaker 3: Fox Sports. Get one more in here, one more in here, 203 00:09:57,640 --> 00:09:59,080 Speaker 3: because Mandeep's coming back at eight. 204 00:09:59,280 --> 00:10:02,520 Speaker 4: What do you need?
What do you think you're gonna 205 00:10:02,520 --> 00:10:04,720 Speaker 4: hear from some of these technology companies next week 206 00:10:04,760 --> 00:10:07,680 Speaker 4: as they talk about China, as they talk about this? 207 00:10:08,360 --> 00:10:09,480 Speaker 4: What is the DeepSeek thing? 208 00:10:09,840 --> 00:10:10,040 Speaker 3: Yeah? 209 00:10:10,080 --> 00:10:13,920 Speaker 1: I mean, the whole aspect around capex and scaling laws 210 00:10:13,960 --> 00:10:17,560 Speaker 1: and what to do with the latest GPUs. Are you 211 00:10:17,800 --> 00:10:22,200 Speaker 1: spending more on training or inferencing? Everything is on the 212 00:10:22,240 --> 00:10:24,479 Speaker 1: table in terms of hardware efficiency. 213 00:10:24,679 --> 00:10:27,520 Speaker 2: I have no idea what inferencing is. A clinic from 214 00:10:27,600 --> 00:10:30,240 Speaker 2: Mandeep Singh of Bloomberg Intelligence.