00:00:00,480 Speaker 1: Ryan Bridge: Tech companies, well, they've been up and down like a yo-yo this week since the DeepSeek thing happened. Interesting timing, because Microsoft and Meta have reported their annual results today. Sam Dickie from Fisher Funds is with us. Sam, g'day. Good evening, Ryan, how are you going? Good, thank you. Quick throwback: what was the straw that broke the camel's back here?

00:00:22,040 Speaker 2: Yeah, that's an interesting question, because it wasn't all new news. DeepSeek has been openly using these innovative techniques to drive down the cost to train these AI models over the last year or so. But it was kind of threefold. It was the release of its most recent model, R1, on the 25th of January, which showed significantly improved results, which you guys have spoken about this week. Plus the talking heads picked it up at Davos later that week. Plus, critically, the DeepSeek app went from number thirty-one on the Apple App Store to number one over the weekend. So that was really the straw that broke the camel's back: the consumer voted with their feet.

00:00:59,680 Speaker 1: So the significant breakthrough, obviously, is that they managed to do this at all. Well, a lot of questions have been asked about how much US tech was used to make this, and how palatable the platform will be for American users, or, you know, users around the rest of the world outside of China.

00:01:18,680 Speaker 2: That's right. It is important to talk about one of the breakthroughs. There were many, but the simplest one to understand, the one that stood out, was pure reinforcement learning. In plain English, pure reinforcement learning is quite simple: don't teach one of these AI models to solve every puzzle on Earth using every bit of data on Earth, like how most of the US large language models have been trained.
Just reward it if it does a good job with a digital pat on the back: a plus one for the right answer and a minus one for the wrong answer. And this sort of trial-and-error way of learning has proved to be far more efficient and cost-effective, as the model, just like we do, loves the reward and kind of sorts itself out. In terms of the controversy, and people saying they don't believe the six-million-dollar cost to train the model, and the controversy around OpenAI saying that DeepSeek piggybacked off its existing ChatGPT models: we don't know. We just know that the arms race, and the chip war that you and I spoke about late last year, is really hotting up.

00:02:20,320 Speaker 1: So basically what you're talking about there, the pure reinforcement, is like positive reinforcement. How did the biggest, you know, AI companies in the world miss this?

00:02:33,000 Speaker 2: Well, that's right, they were certainly aware of it. Google, for example, used pure reinforcement learning. The word "pure", by the way: a lot of the companies use human-feedback reinforcement learning, but this is sort of digital reinforcement learning with no human feedback or oversight. Google used that technique with one of its much older models, called AlphaGo, in twenty seventeen. But this pure reinforcement learning technique historically had shortcomings, which we can go into in a minute, and Western AI companies either weren't quick enough to solve those shortcomings or probably weren't incentivised to solve them, given it was an AI boom, capital was freely available, and they had access to cutting-edge AI chips, which the Chinese did not. And it seems like, perversely, the US restricting China from accessing these cutting-edge computer chips to train the AI models forced DeepSeek's hand, forced them into a corner, and forced them to get there quicker.
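[Editor's note: a minimal Python sketch of the distinction Sam is drawing between "pure" reinforcement learning and the human-feedback variant. The function names and the exact-match grading are illustrative assumptions, not DeepSeek's or Google's actual code.]

    # Illustrative only -- not any lab's real implementation.

    def pure_rl_reward(model_answer: str, correct_answer: str) -> float:
        """'Pure' RL: the reward is computed automatically, with no human
        in the loop -- plus one for a right answer, minus one for a wrong one."""
        return 1.0 if model_answer.strip() == correct_answer.strip() else -1.0

    def rlhf_reward(human_preference_score: float) -> float:
        """RL from human feedback (RLHF), the variant most US labs have used:
        the score comes from human raters' preferences, not an automatic check."""
        return human_preference_score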
00:03:31,040 Speaker 1: So perverse indeed. So can you just... because I'm still struggling a bit with pure reinforcement learning. Is this a person using the AI app, and the machine gets an answer right, and you say, "Yes, you've got that right," and then it understands that it's doing something right? Or is this a different process?

00:03:48,800 Speaker 2: No. So that's the inferencing; that's when we're using it. This is the pre-training, so the training of the model. So, really simplistically, you know the models have been getting bigger and bigger (this is excluding DeepSeek), and they've been getting trained on, you know, most of the text on the internet in the world, and a lot of the photos and a lot of the numbers and a lot of the videos, and they've been getting taught to solve every puzzle on Earth using every bit of data. But this DeepSeek technique was, like I said, like getting a digital pat on the back when it got an answer right while it was training the model. So not when we're inferencing the model. I kind of think of it as: instead of studying for a maths exam by going through all the maths textbooks, you just do practice exams and get better and better and better marks, until eventually you're getting sort of ninety-eight percent on the practice exams, but you don't bother studying any of the source material.
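[Editor's note: a toy, runnable version of that "practice exam" loop. The tiny arithmetic exam, the candidate answers, and the score table are hypothetical stand-ins for a real model and benchmark; only the plus-one/minus-one trial-and-error mechanic, applied at training time rather than at inference time, reflects what is described above.]

    import random
    from collections import defaultdict

    EXAMS = [("2+2", "4"), ("3*3", "9"), ("10-7", "3")]   # practice questions
    CANDIDATES = ["3", "4", "9"]                          # possible answers

    scores = defaultdict(float)   # (question, answer) -> learned preference

    def answer(question, explore=0.1):
        # Occasionally try a random answer; otherwise pick the best-scored one.
        if random.random() < explore:
            return random.choice(CANDIDATES)
        return max(CANDIDATES, key=lambda a: scores[(question, a)])

    for _ in range(2000):                        # sit practice exams repeatedly
        q, correct = random.choice(EXAMS)
        a = answer(q)
        reward = 1.0 if a == correct else -1.0   # the digital pat on the back
        scores[(q, a)] += 0.1 * reward           # reinforce or discourage

    # After training, the greedy answers should be right -- ['4', '9', '3'] --
    # even though the "model" never saw a worked solution.
    print([answer(q, explore=0.0) for q, _ in EXAMS])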
00:04:41,960 Speaker 1: Wow. And then you're just reinforcing, like teaching your dog how to sit with a piece of... every time. Okay, so let's talk Microsoft and Meta. How did their results look today? What are they saying about all of this?

00:04:55,480 Speaker 2: Yeah, so Microsoft said that this happens every technology cycle, and we see sort of ten-x improvements in cost and efficiency. In other words, costs come down by ninety percent in most tech cycles. And they also said DeepSeek have had some real innovations, so they're underwriting the innovations, some of this reinforcement learning and some of the other things we haven't discussed, that will likely get copied almost immediately. Microsoft more or less said that verbatim. And they also said the big beneficiaries of these normal sort of big step changes in efficiency are us, Ryan, the customers, as prices come down. Now, Meta and Zuck, he sounded a bit flat-footed. He said DeepSeek did a number of novel things we are still digesting, but Meta will copy them, and it's too early to have a strong opinion on what this means for capital spending levels. So some pretty interesting comments from big tech there.

00:05:52,480 Speaker 1: Especially when you consider how much they're planning. I think, was it eighty-seven billion dollars or something they were planning on spending on AI?

00:05:59,560 Speaker 2: Well, Microsoft, yeah, Microsoft's eighty billion dollars of capex total, but yes, a lot of that will be data centers and AI chips. And Meta underwrote a sort of sixty-billion-dollar check a few days ago, even...

00:06:13,640 Speaker 1: ...before its results. Because it'll surely have a huge effect on where you put that, right?

00:06:18,960 Speaker 2: Well, that's why I do think it's really interesting that Zuckerberg is saying "we are still digesting" and "it's too early to have a strong opinion on what this means for capital spending levels", despite the fact that two or three days ago, in the last week, he underwrote a sixty-billion-dollar capex program.

00:06:33,279 Speaker 1: Exactly right. So what are the implications for investors? How should investors think about this?

00:06:39,600 Speaker 2: Well, I think we do need to answer sort of four or five critical questions which we don't have the answers to yet. So firstly, and this is a very simple one, the model has only been out for less than a week, so let's see how the reviews go from all these brand-new customers. You know, it went from number thirty-one on the Apple App Store to number one. I've tested it, Ryan, and it's always jammed and tells me to try again later. So let's see whether the consumers are loving the model.
The second one is: does this mean that companies like Nvidia and AMD, you guys have spoken about this earlier in the week, have to cut the prices of their cutting-edge accelerated compute chips? And related to that, the third question is: if so, will there be a big enough spike in demand, like we've seen over previous technology cycles, to offset the price cuts? We saw that with flat-screen TVs: producers got way more efficient at producing them, prices of flat-screen TVs collapsed, and demand skyrocketed. And the final thing is: what does all this mean for power demand, which was exploding for AI data centers? But as far as investors go, the market has shot first and will answer those questions we discussed later, and it's going to take time to get answers to those questions. And if Zuckerberg is still digesting, as he said a few hours ago, it's hard for the average investor to have clarity yet. But one final point I'll leave you with, Ryan: I think we can expect more volatility, not just in stock prices, but in how the China-versus-US chip war we discussed last year ratchets up. So we really do need to buckle up, I think.

00:08:03,200 Speaker 1: Sam, thanks for that. Sam Dickie, Fisher Funds. Great to have you on, as always.

00:08:06,400 Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.