1 00:00:00,560 --> 00:00:03,720 Speaker 1: Alrighty, and this is, this is The Daily... This is 2 00:00:03,720 --> 00:00:15,160 Speaker 1: the Daily OS. Oh, now it makes sense. Good morning 3 00:00:15,160 --> 00:00:17,439 Speaker 1: and welcome to the Daily OS. It's Wednesday, the twenty 4 00:00:17,520 --> 00:00:20,920 Speaker 1: ninth of January. I'm Sam. I'm Zara. What if creating 5 00:00:21,040 --> 00:00:25,200 Speaker 1: artificial intelligence programs was actually cheaper than we thought? How 6 00:00:25,239 --> 00:00:29,480 Speaker 1: would AI companies, governments and investors react if something faster, 7 00:00:29,800 --> 00:00:33,080 Speaker 1: more accurate, and cheaper than the current AI giants like 8 00:00:33,200 --> 00:00:37,120 Speaker 1: ChatGPT suddenly burst onto the scene? Well, today we're 9 00:00:37,120 --> 00:00:39,479 Speaker 1: going to talk about DeepSeek, this new company that 10 00:00:39,520 --> 00:00:42,320 Speaker 1: has really shaken up the stock market and the way 11 00:00:42,320 --> 00:00:46,280 Speaker 1: that we're all feeling about artificial intelligence. It's a pretty remarkable 12 00:00:46,280 --> 00:00:46,880 Speaker 1: story, Zara.

13 00:00:51,000 --> 00:00:53,240 Speaker 2: I think at the beginning of this week, the number 14 00:00:53,240 --> 00:00:56,840 Speaker 2: of people that could name, identify, and explain what DeepSeek 15 00:00:56,880 --> 00:00:59,600 Speaker 2: is would have been few and far between, 16 00:00:59,600 --> 00:01:03,200 Speaker 2: myself included. I'd say potentially you included in that as well. 17 00:01:03,720 --> 00:01:06,240 Speaker 2: And now it's all anyone's talking about. It is literally 18 00:01:06,240 --> 00:01:11,000 Speaker 2: dominating every single headline across the world today. Sam, take 19 00:01:11,080 --> 00:01:13,600 Speaker 2: us back to the beginning. AI is a very very 20 00:01:13,600 --> 00:01:17,720 Speaker 2: complicated space.
But what exactly is DeepSeek and why 21 00:01:17,760 --> 00:01:19,160 Speaker 2: the hell are we hearing so much about it? 22 00:01:19,560 --> 00:01:22,640 Speaker 1: So DeepSeek is a Chinese AI company that just launched 23 00:01:22,640 --> 00:01:25,800 Speaker 1: its latest AI model called R one. Now when I 24 00:01:25,840 --> 00:01:27,520 Speaker 1: say model, just think of a brain. 25 00:01:27,800 --> 00:01:28,400 Speaker 3: I can do that. 26 00:01:28,480 --> 00:01:30,560 Speaker 1: All of these AI companies are trying to work out 27 00:01:30,600 --> 00:01:33,760 Speaker 1: how to create the best AI brain. Some are better 28 00:01:33,800 --> 00:01:37,679 Speaker 1: at solving mathematical problems, others are better at expressing, reasoning, 29 00:01:37,840 --> 00:01:41,479 Speaker 1: or being emotional, trying to mimic human emotion. And of course, 30 00:01:41,480 --> 00:01:43,920 Speaker 1: the goal for every AI company is to try and 31 00:01:44,000 --> 00:01:46,480 Speaker 1: build a model or a brain that can do everything. 32 00:01:46,680 --> 00:01:49,200 Speaker 2: Okay. And so when you're talking about AI companies, an 33 00:01:49,200 --> 00:01:52,880 Speaker 2: example of that would be OpenAI, creator of ChatGPT? Exactly. 34 00:01:52,960 --> 00:01:55,400 Speaker 1: But there are many companies, and there are many companies 35 00:01:55,520 --> 00:01:58,320 Speaker 1: that are in the AI space without making models, and 36 00:01:58,320 --> 00:02:01,360 Speaker 1: we'll talk about that later as well. Now, what's fascinating 37 00:02:01,360 --> 00:02:04,040 Speaker 1: about DeepSeek is that they've built a model that 38 00:02:04,120 --> 00:02:07,760 Speaker 1: they say performs as well as the best American AI systems, 39 00:02:08,000 --> 00:02:10,160 Speaker 1: but at a fraction of the cost.
So to put 40 00:02:10,200 --> 00:02:13,239 Speaker 1: that in context, just last week we saw major US 41 00:02:13,280 --> 00:02:17,520 Speaker 1: companies like Microsoft, Oracle and OpenAI, which you just mentioned, 42 00:02:17,840 --> 00:02:22,800 Speaker 1: pledge five hundred billion US dollars to build new AI infrastructure. 43 00:02:23,080 --> 00:02:25,440 Speaker 1: Then we also had an announcement from Meta. They came 44 00:02:25,440 --> 00:02:28,000 Speaker 1: out separately and said that they were committing sixty five 45 00:02:28,080 --> 00:02:31,680 Speaker 1: billion US dollars to build infrastructure. There is so much 46 00:02:31,800 --> 00:02:35,560 Speaker 1: money being spent on this technology at the moment. Then 47 00:02:35,560 --> 00:02:38,359 Speaker 1: DeepSeek comes along and they say they built their 48 00:02:38,360 --> 00:02:42,320 Speaker 1: system for about six million dollars, so million versus billion. 49 00:02:42,880 --> 00:02:45,239 Speaker 1: One expert I was reading called it a joke of 50 00:02:45,320 --> 00:02:45,840 Speaker 1: a budget. 51 00:02:46,120 --> 00:02:49,160 Speaker 2: So there are clearly a lot of differences between these companies. 52 00:02:49,320 --> 00:02:51,160 Speaker 2: But all of the companies that you were just talking 53 00:02:51,200 --> 00:02:55,040 Speaker 2: about before are American companies. DeepSeek is a Chinese... 54 00:02:54,720 --> 00:02:57,519 Speaker 1: Chinese AI company that says it can do what American 55 00:02:57,520 --> 00:03:00,320 Speaker 1: companies can do for a fraction of the cost. 56 00:03:00,440 --> 00:03:02,720 Speaker 2: Okay. So then tell me more about DeepSeek, like, 57 00:03:02,760 --> 00:03:04,280 Speaker 2: how is it possible that they're doing this? 58 00:03:04,320 --> 00:03:06,639 Speaker 3: Are they telling the truth? Where have they come from? 59 00:03:06,919 --> 00:03:10,000 Speaker 1: It's a really interesting story.
So it started as a 60 00:03:10,160 --> 00:03:13,960 Speaker 1: side hustle project of a Chinese, it's an absurd side hustle, 61 00:03:13,960 --> 00:03:16,840 Speaker 1: it's crazy, of a Chinese hedge fund called High-Flyer. 62 00:03:16,680 --> 00:03:17,919 Speaker 3: Sounds like a jewelry brand. 63 00:03:18,520 --> 00:03:20,960 Speaker 1: So these are coders who had been employed to build 64 00:03:21,000 --> 00:03:23,640 Speaker 1: systems that would trade stocks and be able to predict 65 00:03:24,120 --> 00:03:27,280 Speaker 1: for clients when to buy and sell. And they started 66 00:03:27,320 --> 00:03:30,320 Speaker 1: playing around on the side of that with different ways 67 00:03:30,320 --> 00:03:32,960 Speaker 1: to build AI products that could be used by consumers, 68 00:03:33,000 --> 00:03:35,560 Speaker 1: so by you and me. The founder then spun off 69 00:03:35,800 --> 00:03:38,440 Speaker 1: DeepSeek as a separate company to this hedge fund 70 00:03:38,720 --> 00:03:39,880 Speaker 1: only in twenty twenty three. 71 00:03:40,000 --> 00:03:41,440 Speaker 3: Oh wow, so very recent. 72 00:03:41,680 --> 00:03:46,200 Speaker 1: Yeah, and they started only recently playing around with old GPUs, 73 00:03:46,960 --> 00:03:49,240 Speaker 1: which is one of the key tools you need 74 00:03:49,280 --> 00:03:52,600 Speaker 1: to build an AI machine, that they had available to them. 75 00:03:52,880 --> 00:03:54,160 Speaker 3: What are GPUs? 76 00:03:54,240 --> 00:03:55,480 Speaker 1: So let's break it down. I'm going to use a 77 00:03:55,560 --> 00:03:59,360 Speaker 1: car analogy.
Let's pretend that Ferrari were spending billions of 78 00:03:59,400 --> 00:04:02,320 Speaker 1: dollars a year on new car engines to try and 79 00:04:02,360 --> 00:04:04,560 Speaker 1: figure out how to win the Formula One, and then 80 00:04:04,600 --> 00:04:07,480 Speaker 1: all of a sudden, engineers at Holden, you know, a 81 00:04:07,520 --> 00:04:10,720 Speaker 1: really unsexy car company, started playing around with a 82 00:04:10,760 --> 00:04:14,400 Speaker 1: Ferrari part made fifteen years ago, and they approached the 83 00:04:14,520 --> 00:04:17,000 Speaker 1: challenge of making a fast car from a different perspective 84 00:04:17,200 --> 00:04:20,120 Speaker 1: with these old parts, and actually they've made a faster car. 85 00:04:20,440 --> 00:04:22,000 Speaker 1: That's basically what's happened here. 86 00:04:22,080 --> 00:04:25,239 Speaker 2: Okay. So what you're saying is that this Chinese company, 87 00:04:25,279 --> 00:04:30,960 Speaker 2: DeepSeek, has created this basically beast of an AI machine, 88 00:04:31,240 --> 00:04:34,520 Speaker 2: using old parts, old GPUs, which are just the 89 00:04:34,560 --> 00:04:37,279 Speaker 2: little chips, yeah sure, rather than building new ones. 90 00:04:37,120 --> 00:04:38,679 Speaker 1: And getting exactly the same results. 91 00:04:38,880 --> 00:04:40,960 Speaker 3: Okay. And is that why the cost is lower? 92 00:04:41,080 --> 00:04:44,600 Speaker 1: Exactly. And so they say, though, that it's not actually 93 00:04:44,760 --> 00:04:47,560 Speaker 1: the new GPUs, the new chips, that you need to 94 00:04:47,600 --> 00:04:51,719 Speaker 1: make really good AI, it's their innovative training techniques. So 95 00:04:52,160 --> 00:04:55,080 Speaker 1: they say they've found ways to make AI models think 96 00:04:55,279 --> 00:04:58,640 Speaker 1: more efficiently.
And there was one really key part of 97 00:04:58,680 --> 00:05:01,360 Speaker 1: the research that they say makes their AI different, and 98 00:05:01,440 --> 00:05:05,640 Speaker 1: that's recognizing that AI has 'aha' moments. This is 99 00:05:05,680 --> 00:05:08,440 Speaker 1: actually what they say. They say that AI has 100 00:05:08,560 --> 00:05:11,520 Speaker 1: 'aha' moments where they start processing, trying to answer 101 00:05:11,560 --> 00:05:13,560 Speaker 1: a question you put to them, and then they realize 102 00:05:13,560 --> 00:05:15,680 Speaker 1: something that might not be on the exact path they 103 00:05:15,680 --> 00:05:18,440 Speaker 1: were following. They've trained their AI to take that route, 104 00:05:18,600 --> 00:05:21,080 Speaker 1: so they say, oh, I've actually just thought of this, this, 105 00:05:21,160 --> 00:05:23,880 Speaker 1: and this, and then go down that route. So instead 106 00:05:23,920 --> 00:05:26,400 Speaker 1: of it explicitly trying to get to the answer as 107 00:05:26,440 --> 00:05:29,320 Speaker 1: fast and direct as possible, it almost kind of goes 108 00:05:29,400 --> 00:05:31,480 Speaker 1: with the flow a bit more. And I'm really simplifying 109 00:05:31,480 --> 00:05:31,760 Speaker 1: this here. 110 00:05:31,760 --> 00:05:33,280 Speaker 3: This train is legitimately heading. 111 00:05:33,320 --> 00:05:37,080 Speaker 2: So what you're saying here is that they are suggesting 112 00:05:37,200 --> 00:05:40,520 Speaker 2: that their technology is superior because they've trained it to 113 00:05:40,600 --> 00:05:41,320 Speaker 2: think differently. 114 00:05:41,720 --> 00:05:45,440 Speaker 1: Essentially. I think that we're probably summarizing artificial intelligence. Yeah, 115 00:05:45,440 --> 00:05:48,320 Speaker 1: and I think we've probably summarized like twenty five years 116 00:05:48,320 --> 00:05:53,120 Speaker 1: of quantum physics and quantum computing into six sentences.
But yes, essentially, 117 00:05:53,320 --> 00:05:56,000 Speaker 1: they've figured out how to make old technology think differently 118 00:05:56,040 --> 00:05:58,200 Speaker 1: and be as good as new technology. Okay, that's what 119 00:05:58,240 --> 00:06:01,240 Speaker 1: you need to know. Okay, sure. And so the US AI 120 00:06:01,320 --> 00:06:03,960 Speaker 1: companies get wind of this, and they start testing this 121 00:06:04,080 --> 00:06:07,960 Speaker 1: Chinese AI, yep, and they give DeepSeek and Open 122 00:06:08,000 --> 00:06:10,840 Speaker 1: AI's top models, so the best ChatGPT models that 123 00:06:10,880 --> 00:06:13,680 Speaker 1: you can possibly get. They give both models the 124 00:06:13,720 --> 00:06:18,080 Speaker 1: same questions, and DeepSeek produces similar, if not better, answers. 125 00:06:18,160 --> 00:06:20,239 Speaker 1: When was this? This week? Last week? 126 00:06:20,520 --> 00:06:21,599 Speaker 3: A panic? 127 00:06:21,920 --> 00:06:24,640 Speaker 1: Exactly. Now there is an important caveat that I want 128 00:06:24,640 --> 00:06:28,280 Speaker 1: to mention here. These US AI companies have said that 129 00:06:28,400 --> 00:06:31,520 Speaker 1: whilst the DeepSeek model is impressive at writing and 130 00:06:31,600 --> 00:06:35,400 Speaker 1: general problem solving, it does struggle with certain specific tasks, 131 00:06:35,480 --> 00:06:39,400 Speaker 1: so things like, you know, booking a flight. Yeah, 132 00:06:38,160 --> 00:06:42,640 Speaker 1: it does struggle with certain specific tasks, numbers. So 133 00:06:42,720 --> 00:06:45,040 Speaker 1: now there's going to be a whole independent testing phase 134 00:06:45,360 --> 00:06:46,880 Speaker 1: that will give us some clearer results. 135 00:06:47,000 --> 00:06:47,279 Speaker 3: Okay.
136 00:06:47,360 --> 00:06:50,039 Speaker 2: The reason that the AI industry basically is freaking out 137 00:06:50,400 --> 00:06:53,480 Speaker 2: is because this new player has come along, is doing something 138 00:06:53,520 --> 00:06:55,839 Speaker 2: for a fraction of the cost, is getting similar results, 139 00:06:56,200 --> 00:06:57,880 Speaker 2: and it's not an American company. 140 00:06:58,040 --> 00:07:01,599 Speaker 1: Yeah, okay, and it challenges this fundamental assumption that we 141 00:07:01,760 --> 00:07:05,400 Speaker 1: have about AI development, which is this conventional thinking that 142 00:07:05,440 --> 00:07:08,800 Speaker 1: you needed the most expensive, the most cutting edge computer chips, 143 00:07:09,000 --> 00:07:11,440 Speaker 1: particularly those made by one company, Nvidia, which 144 00:07:11,480 --> 00:07:13,200 Speaker 1: is one of the most valuable companies in the world. 145 00:07:13,240 --> 00:07:16,080 Speaker 1: We've talked about it on this podcast before, to build 146 00:07:16,080 --> 00:07:19,200 Speaker 1: the best AI systems, and that's why tech companies like 147 00:07:19,200 --> 00:07:21,680 Speaker 1: Google and Meta have been spending insane amounts of money 148 00:07:21,960 --> 00:07:25,840 Speaker 1: building the infrastructure that they say they need to produce 149 00:07:25,880 --> 00:07:27,960 Speaker 1: these AI systems, and investors have gone with it. 150 00:07:28,040 --> 00:07:31,360 Speaker 2: Okay. But I guess the reason that the tech world 151 00:07:31,440 --> 00:07:33,840 Speaker 2: is in so much disarray this week is because that 152 00:07:33,920 --> 00:07:37,160 Speaker 2: type of logic might not necessarily be the only 153 00:07:36,920 --> 00:07:39,760 Speaker 1: way. Exactly, and DeepSeek says that might not be true.
154 00:07:39,880 --> 00:07:42,880 Speaker 1: And what's more, DeepSeek have released their program as 155 00:07:43,000 --> 00:07:46,360 Speaker 1: open source, and what that means is they're an open book. 156 00:07:46,680 --> 00:07:49,400 Speaker 1: They've taken the Holden around the racetrack, popped open 157 00:07:49,440 --> 00:07:51,200 Speaker 1: the hood and said, whoever wants to come and have 158 00:07:51,240 --> 00:07:52,520 Speaker 1: a look, as much as 159 00:07:52,360 --> 00:07:54,239 Speaker 3: you want. Why can't they just be copied? 160 00:07:54,560 --> 00:07:57,800 Speaker 1: They say that they don't care really if they're copied. 161 00:07:57,920 --> 00:08:00,080 Speaker 1: That this technology... think about it like, you know, 162 00:08:00,160 --> 00:08:01,760 Speaker 1: there's been a lot of examples in the last twenty 163 00:08:01,760 --> 00:08:03,960 Speaker 1: four hours about the space race. They've kind of said, 164 00:08:03,960 --> 00:08:06,200 Speaker 1: everyone's trying to get to the moon. Here's how we're 165 00:08:06,200 --> 00:08:09,520 Speaker 1: doing it, hoping that the entire industry becomes more profitable. 166 00:08:09,640 --> 00:08:11,840 Speaker 1: It's just a different way of doing business. It's very 167 00:08:11,880 --> 00:08:14,240 Speaker 1: common in the tech world to release something out to 168 00:08:14,360 --> 00:08:16,440 Speaker 1: the public. And I want to bring up one new 169 00:08:16,520 --> 00:08:18,880 Speaker 1: term here, and that's the idea of a company having 170 00:08:18,880 --> 00:08:22,200 Speaker 1: a moat, m-o-a-t. So think about a 171 00:08:22,240 --> 00:08:25,800 Speaker 1: competitive advantage that surrounds a company, and literally think about 172 00:08:25,800 --> 00:08:30,640 Speaker 1: a castle with a moat around it. A moat. Yeah yeah. 173 00:08:30,680 --> 00:08:33,280 Speaker 1: And that's what makes a company hard to copy.
So 174 00:08:33,480 --> 00:08:36,320 Speaker 1: if a company has a lead that far ahead, yeah, 175 00:08:36,320 --> 00:08:39,680 Speaker 1: but if you have a wide moat, then other companies 176 00:08:39,679 --> 00:08:42,559 Speaker 1: can't copy you quickly and easily. And so for these 177 00:08:42,600 --> 00:08:47,080 Speaker 1: AI companies, these really valuable stocks, their moats are not 178 00:08:47,120 --> 00:08:49,160 Speaker 1: only the knowledge they have and the expertise, but just 179 00:08:49,160 --> 00:08:51,720 Speaker 1: how much cash they've got, because the common reasoning has 180 00:08:51,720 --> 00:08:55,080 Speaker 1: always been to build great AI, you need fifty, sixty, 181 00:08:55,120 --> 00:08:56,840 Speaker 1: seventy, one hundred, five hundred billion. 182 00:08:57,200 --> 00:08:57,720 Speaker 3: I understand. 183 00:08:57,720 --> 00:09:01,120 Speaker 2: So you're saying these companies, their competitive advantage is that they've 184 00:09:01,200 --> 00:09:05,320 Speaker 2: raised billions of dollars, and therefore the average Joe 185 00:09:05,400 --> 00:09:07,120 Speaker 2: on the street can't go and build the same tech 186 00:09:07,160 --> 00:09:09,320 Speaker 2: as them because they can't raise a bajillion 187 00:09:08,920 --> 00:09:11,839 Speaker 1: dollars. Until now, until now when DeepSeek comes out and 188 00:09:11,760 --> 00:09:14,000 Speaker 2: says, actually, you don't need a bajillion dollars, you only 189 00:09:14,000 --> 00:09:16,640 Speaker 2: need a little bit, only six million, and you can build 190 00:09:16,679 --> 00:09:17,880 Speaker 2: something just as good. 191 00:09:18,040 --> 00:09:18,480 Speaker 3: Okay. 192 00:09:18,640 --> 00:09:22,280 Speaker 2: So what has all of this kind of knowledge and 193 00:09:22,400 --> 00:09:26,400 Speaker 2: information done to the market, because there has been quite 194 00:09:26,440 --> 00:09:29,080 Speaker 2: a significant economic kind of market impact.
195 00:09:29,160 --> 00:09:31,800 Speaker 1: Yeah, it's been really dramatic. So, as I mentioned earlier, 196 00:09:31,800 --> 00:09:34,280 Speaker 1: Nvidia, the company that makes the computer chips 197 00:09:34,559 --> 00:09:37,520 Speaker 1: that AI companies buy to build these big AI brains, 198 00:09:38,000 --> 00:09:40,640 Speaker 1: they lost a whopping, get ready, five hundred and eighty 199 00:09:40,720 --> 00:09:44,160 Speaker 1: nine billion US dollars in market value on the 200 00:09:44,240 --> 00:09:47,320 Speaker 1: US stock market yesterday. They went down sixteen point one percent, 201 00:09:47,760 --> 00:09:50,520 Speaker 1: and the overall US market went down three point one 202 00:09:50,559 --> 00:09:53,240 Speaker 1: percent. Crazy. You have to remember that it's not just 203 00:09:53,280 --> 00:09:56,319 Speaker 1: companies that are directly related to AI, but there are 204 00:09:56,320 --> 00:09:59,080 Speaker 1: so many adjacent companies that have business in AI. You know, 205 00:09:59,120 --> 00:10:02,880 Speaker 1: for example, energy companies, they would be relying on the 206 00:10:02,880 --> 00:10:05,160 Speaker 1: fact that everyone's building these big data centers. They're going 207 00:10:05,200 --> 00:10:07,079 Speaker 1: to need to plug that all into a wall somewhere. 208 00:10:07,320 --> 00:10:10,559 Speaker 1: So their stocks went down. The entire market really did 209 00:10:10,600 --> 00:10:12,720 Speaker 1: freak out about the fact that there's this new way 210 00:10:12,720 --> 00:10:13,400 Speaker 1: of doing things. 211 00:10:13,559 --> 00:10:16,680 Speaker 2: And it's not just the market that responds. It is 212 00:10:16,720 --> 00:10:20,160 Speaker 2: also this kind of geopolitical angle. I've mentioned that 213 00:10:20,200 --> 00:10:23,320 Speaker 2: DeepSeek is Chinese and OpenAI and the other AI 214 00:10:23,360 --> 00:10:27,000 Speaker 2: companies are American. Talk me through that geopolitical angle.
215 00:10:27,080 --> 00:10:29,680 Speaker 1: Yeah. So the US has been trying to maintain its 216 00:10:29,960 --> 00:10:32,800 Speaker 1: competitive lead in AI. They say there's a two horse 217 00:10:32,880 --> 00:10:35,320 Speaker 1: race between them and China, and the way that they're 218 00:10:35,320 --> 00:10:38,120 Speaker 1: doing that is by restricting the computer chips that are 219 00:10:38,200 --> 00:10:40,880 Speaker 1: allowed to enter China. So, to go back to our 220 00:10:40,920 --> 00:10:44,360 Speaker 1: Ferrari example, they are not letting new models of 221 00:10:44,360 --> 00:10:47,160 Speaker 1: the Ferrari engine into China because they don't want Chinese 222 00:10:47,160 --> 00:10:51,479 Speaker 1: AI companies to rival them. But DeepSeek's success suggests 223 00:10:51,640 --> 00:10:55,880 Speaker 1: that that restriction might actually be backfiring in two possible ways. 224 00:10:55,960 --> 00:10:58,400 Speaker 1: Either the chips aren't as crucial as we all think 225 00:10:58,480 --> 00:11:01,719 Speaker 1: to building these models, or Chinese companies have worked out 226 00:11:01,760 --> 00:11:03,319 Speaker 1: ways to get around the restrictions. 227 00:11:03,480 --> 00:11:05,000 Speaker 3: Yeah. So interesting. 228 00:11:05,240 --> 00:11:07,200 Speaker 2: And when I hear about that, the one thing that 229 00:11:07,240 --> 00:11:10,360 Speaker 2: I think about is how Donald Trump would be responding 230 00:11:10,360 --> 00:11:12,960 Speaker 2: to this, because we all know that Trump's platform is 231 00:11:13,120 --> 00:11:17,640 Speaker 2: about growing specifically American companies and wealth. How has he 232 00:11:17,760 --> 00:11:18,559 Speaker 2: responded to this? 233 00:11:19,120 --> 00:11:21,360 Speaker 1: Well, this has become a core focus.
I mean, already 234 00:11:21,440 --> 00:11:23,559 Speaker 1: in his first week, full week of being president, he 235 00:11:23,559 --> 00:11:26,840 Speaker 1: announced a five hundred billion dollar program in collaboration with 236 00:11:26,920 --> 00:11:31,280 Speaker 1: some really big AI companies. Yeah, and then just yesterday, 237 00:11:31,760 --> 00:11:33,800 Speaker 1: talking about DeepSeek, he said that this is a 238 00:11:33,800 --> 00:11:36,360 Speaker 1: wake up call for the AI industry and that we 239 00:11:36,440 --> 00:11:39,200 Speaker 1: need to be quote laser focused on competing to win. 240 00:11:39,800 --> 00:11:42,199 Speaker 3: Wow, it's really like the Space Race. 241 00:11:42,480 --> 00:11:45,680 Speaker 1: It's very much the Space Race. Trump's close advisor Elon 242 00:11:45,760 --> 00:11:48,160 Speaker 1: Musk, he... Speaking of the Space Race? Yes, he accused 243 00:11:48,160 --> 00:11:50,120 Speaker 1: DeepSeek of lying about how many chips it needed to 244 00:11:50,200 --> 00:11:52,520 Speaker 1: run its system. So he essentially said, what you're doing is 245 00:11:52,520 --> 00:11:55,600 Speaker 1: simply impossible, but there is certainly a bit of panic there. 246 00:11:55,800 --> 00:11:58,800 Speaker 1: Trump said that he might use executive orders to increase 247 00:11:58,880 --> 00:12:01,959 Speaker 1: the amount of public resources like electricity and energy 248 00:12:02,000 --> 00:12:05,720 Speaker 1: supplies and building materials to just build these data centers 249 00:12:05,760 --> 00:12:08,440 Speaker 1: that US AI companies say they need as quick as they can. 250 00:12:08,800 --> 00:12:10,680 Speaker 2: Sam, I want to finish this chat by just bringing 251 00:12:10,720 --> 00:12:14,720 Speaker 2: it home. And I guess contextualizing this in an Australian setting, 252 00:12:15,240 --> 00:12:19,280 Speaker 2: what are the implications for Australia's tech sector?
What does 253 00:12:19,280 --> 00:12:20,480 Speaker 2: this story mean for us? 254 00:12:20,920 --> 00:12:23,679 Speaker 1: Well, pretty much every country besides the US has had 255 00:12:23,720 --> 00:12:26,360 Speaker 1: the same reaction, including Australia, which is that this is 256 00:12:26,400 --> 00:12:29,040 Speaker 1: good news. We've all known that the US has a 257 00:12:29,160 --> 00:12:32,520 Speaker 1: major competitive advantage in AI, and AI is seen as 258 00:12:32,520 --> 00:12:35,400 Speaker 1: a really important part of any country's innovation strategy. So 259 00:12:35,760 --> 00:12:38,760 Speaker 1: our Industry and Science Minister Ed Husic, he said yesterday 260 00:12:38,760 --> 00:12:41,440 Speaker 1: that he expects to see more of these cheaper and 261 00:12:41,520 --> 00:12:44,680 Speaker 1: more creative AI solutions in years to come as this 262 00:12:44,760 --> 00:12:47,199 Speaker 1: becomes more mainstream, and that it kind of evens the 263 00:12:47,240 --> 00:12:49,800 Speaker 1: playing field a little bit, because we don't need five 264 00:12:49,880 --> 00:12:53,240 Speaker 1: hundred billion dollars. We just need a side hustle in 265 00:12:54,280 --> 00:12:54,680 Speaker 1: our jobs. 266 00:12:54,720 --> 00:12:56,560 Speaker 3: How did our side hustle lead to that? 267 00:12:56,800 --> 00:12:58,840 Speaker 1: I know, but I think, look, this is still a 268 00:12:58,840 --> 00:13:02,240 Speaker 1: privately held company, DeepSeek, so DeepSeek isn't listed anywhere. 269 00:13:02,400 --> 00:13:04,480 Speaker 1: We've seen how the stock markets responded. On the 270 00:13:04,480 --> 00:13:07,320 Speaker 1: stock market in Australia, tech and AI companies were 271 00:13:07,360 --> 00:13:10,199 Speaker 1: hit pretty hard yesterday, but our stock market isn't as 272 00:13:10,240 --> 00:13:12,800 Speaker 1: reliant on those big tech giants as in the US.
273 00:13:13,640 --> 00:13:15,960 Speaker 1: But it's a really seismic shift in the way that 274 00:13:15,960 --> 00:13:18,120 Speaker 1: we think about AI, and I'm going to have to 275 00:13:18,120 --> 00:13:19,600 Speaker 1: ask ChatGPT and see what happens now. 276 00:13:20,880 --> 00:13:23,640 Speaker 2: Well, thank you for explaining that story. I actually feel 277 00:13:23,679 --> 00:13:25,319 Speaker 2: like I completely understand it now. 278 00:13:25,280 --> 00:13:25,839 Speaker 1: So do you. 279 00:13:28,320 --> 00:13:28,880 Speaker 3: Well, there you go. 280 00:13:29,080 --> 00:13:31,080 Speaker 2: Thank you so much for joining us for another episode 281 00:13:31,120 --> 00:13:33,880 Speaker 2: of The Daily OS and supporting our independent newsroom. We're 282 00:13:33,880 --> 00:13:36,200 Speaker 2: going to be back later today with the headlines, but 283 00:13:36,360 --> 00:13:37,800 Speaker 2: until then, have a great day. 284 00:13:40,640 --> 00:13:42,960 Speaker 1: My name is Lily Madden and I'm a proud Arrernte 285 00:13:43,200 --> 00:13:47,959 Speaker 1: Bundjalung Kalkutungu woman from Gadigal Country. The Daily OS acknowledges 286 00:13:48,080 --> 00:13:50,199 Speaker 1: that this podcast is recorded on the lands of the 287 00:13:50,240 --> 00:13:53,800 Speaker 1: Gadigal people and pays respect to all Aboriginal and Torres 288 00:13:53,800 --> 00:13:56,760 Speaker 1: Strait Islander nations. We pay our respects to the 289 00:13:56,760 --> 00:13:59,480 Speaker 1: first peoples of these countries, both past and present.