Speaker 1: Alright, and this is... this is The Daily Aus.
Speaker 2: Oh, now it makes sense. Good morning and welcome to The Daily Aus. It's Friday, the twenty-first of November. I'm Sam Koslowski.
Speaker 1: I'm Zara Seidler.
Speaker 2: This week, technology stocks have been making headlines for all the wrong reasons. Major tech companies like Nvidia and Meta have seen their share prices drop significantly, with some analysts warning that we might be witnessing the beginning of a tech and AI bubble bursting. On today's podcast, we're going to break down what this actually means, how we got here, and whether this really does look like the infamous dot-com crash from the year two thousand.
Speaker 1: Alright, Sam. First things first, I do want to get onto bubbles; bubbles feel like an important thing to get through. But I think a good starting point is to understand what's happening with tech stocks, because this is all related, right?
Speaker 2: Yeah. So November has been the worst month for tech stocks in at least a year.
Speaker 1: Sorry to all listeners who have tech stocks, which is...
Speaker 2: As we'll discuss, pretty much every Australian, and I'll get to that in a bit. But the Nasdaq, which is a stock market index that tracks America's largest companies, and most of those largest companies now are tech companies, fell about three percent over the past week. And that's billions and billions of dollars.
Speaker 1: It's funny, because three percent sounds so minor. Is it serious?
Speaker 2: It is, but some of the world's biggest tech companies have actually seen sharper drops than that. So Nvidia, the company that makes the computer chips that power AI, dropped around ten percent during November. It's now bounced back; we'll put a pin in that and come back to it later. Meta, which owns Facebook and Instagram, has fallen by almost twenty percent in the past month. And then there's this group of companies that Wall Street calls the Magnificent Seven: Apple, Microsoft, Google, Amazon, Nvidia, Meta and Tesla. They're kind of the seven most powerful tech companies on the planet. As a group, they've dropped around an average of five percent this month.
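Sam's aside that a three percent fall is "billions and billions of dollars" is just percentage arithmetic applied to a very large base. A minimal sketch; the thirty-trillion-dollar index value below is an illustrative assumption, not a figure from the episode:

```python
def dollar_move(market_cap: float, pct_change: float) -> float:
    """Dollar value gained or lost when a market moves by pct_change percent."""
    return market_cap * pct_change / 100

# Illustrative assumption: an index with a total market cap of $30 trillion.
INDEX_CAP = 30e12

loss = dollar_move(INDEX_CAP, -3)  # the roughly 3% weekly fall discussed above
print(f"A 3% fall wipes out about ${abs(loss) / 1e9:,.0f} billion")
```

The point being that on a base this large, even a single-digit percentage move translates into hundreds of billions of dollars.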
Speaker 1: Okay. I really want to understand why that drop has happened. But you started this by talking about a bubble. So, really quickly, what do you mean when you're talking about a bubble? And then, secondary to that, take me through why this is actually happening.
Speaker 2: So I think they should have called it a balloon, not a bubble, and I know the terminology police will come for me for that. I like to think about it like an actual balloon which is being inflated beyond its safe capacity. A market bubble occurs when stock prices rise so much, so quickly, and so far above what these companies are actually worth that something pops it. And once it pops, there's a very loud bang and everything comes crashing down. The whole idea behind a bubble is that the prices we're seeing on the stock market aren't being driven by rationality, or by calculations that make sense. Instead, they're being driven by enthusiasm, by FOMO, and by this almost-belief that these prices are just going to keep rising forever.
And so the classic example that everyone uses is the dot-com bubble from the late nineties.
Speaker 1: Yeah.
Speaker 2: So that was when internet companies saw their stock prices go absolutely nuts, purely based on excitement that there was this World Wide Web and we were going to make a lot of money from the internet, even though most of these companies weren't actually making any profit. And when that bubble burst, in about March two thousand, the Nasdaq, which I talked about before, fell seventy-seven percent.
Speaker 1: Wow.
Speaker 2: Yeah. Then your next question was about why, right?
Speaker 1: Yeah, sorry to do the double prompt. But I guess, so you've said bubble, balloon, whatever we want to call it. The theory behind that is that something can get too big and it can just burst. Pop.
Speaker 2: Yep, pop.
Speaker 1: Is that what's happening here?
Speaker 2: Yeah. Well, it hasn't happened yet, but the fear, especially this week, is that we are about to see that happen.
Speaker 1: Okay.
Speaker 2: And there's a couple of key reasons why people are starting to argue that.
And the first is about money: just the sheer amount of money being put into these companies. So tech companies have been spending unbelievable amounts of money building the infrastructure they think they need to make a lot of money from AI for the next ten, twenty, thirty years. And we're talking about, you know, specialised computer chips, massive data centres. In twenty twenty-five alone, the world's largest tech companies are expected to spend about three to four hundred billion US dollars on infrastructure. And the problem is that while they're spending all of that money, we don't actually know for sure whether AI is going to generate enough long-term revenue and profit for these companies to make all of that back.
Speaker 1: And they're investing now in the hopes that the returns are far greater whenever that's realised down the track.
Speaker 2: Exactly. They're placing a huge bet. And one analyst I was reading said tech companies have to spend to keep up with massive demand and to get out in front of the pack, but that demand hasn't yet translated fully into longer-term profits.
Speaker 1: So interesting.
Speaker 2: Yeah. And another concern is that the companies are finding it's taking a long time to actually make that money back. So let's say that it costs...
Speaker 1: We work in media; that takes extra long.
Speaker 2: But, you know, they were expecting it to take one to two years to make money back from an individual user, say with a ChatGPT subscription or a Claude subscription. They're now saying it's going to take two to four years to see a return on that investment. And, you know, that is a longer business model to have to support, which is why people are getting a little bit wobbly.
Speaker 1: Okay, so you're saying there are some business models that are, I guess, resting on some hypotheticals that haven't necessarily been realised. And then there's, on the other hand, huge volumes of cash being spent in the hopes that it does get realised, especially when it comes to AI. Have investors basically decided that these companies aren't worth as much as they thought they were, even a few months ago? Is that why we're seeing the decline?
Speaker 2: Exactly.
And this is not investors saying these companies are going to fail and aren't worth anything. It's just saying they're not worth as much as the hype has been telling us they're going to be right now.
Speaker 1: I was fairly fresh to the world of valuing companies when we started our company, and I was shocked by how much of the economy rests on hype.
Speaker 2: Yeah.
Speaker 1: When you're looking from the outside in, you can think it's, as you were saying, this mathematical equation, this rationality. But really, so much of it comes down to hype. And all the hype has been around AI; there's nothing else anyone has been talking about. AI and Ozempic, I'd say, are the two things people have been speaking about. And now are we saying perhaps the hype was unfounded?
Speaker 2: Well, yeah, there's definitely a cooler temperature on this hype now. But I do have to somewhat defend the hype people, of which you are one too.
I think this discussion always needs a bit of balance, because it's very hard to predict something that doesn't exist yet; that's innovation. So it's very hard for these companies to say with confidence, "this is what we think we're worth, so therefore we're going to sell shares at this rate." And the other thing to remember is that if you look at the calculations of how much these companies are worth based on how much they're making, things were way more hyped before the two thousand dot-com bubble burst. You remember, it was like yesterday... it was great, I was four and a half years old. So there's the example of Cisco, a company that took a huge hit in the dot-com bubble. There's a thing in finance called a P/E ratio: it's basically how much a share is going to cost you based on how much the company is making; it's a score. Cisco had a P/E ratio of one hundred and forty-eight before the bubble burst in two thousand. Nvidia's current P/E is about twenty-six.
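The P/E comparison Sam makes is simple to reproduce: a price-to-earnings ratio is just the share price divided by earnings per share. A minimal sketch; the share prices and per-share earnings below are hypothetical inputs chosen only so the ratios come out at the figures quoted in the episode (roughly 148 for Cisco in 2000, roughly 26 for Nvidia now):

```python
def pe_ratio(share_price: float, earnings_per_share: float) -> float:
    """Price-to-earnings ratio: dollars paid per dollar of annual earnings."""
    return share_price / earnings_per_share

# Hypothetical inputs, chosen only to reproduce the ratios quoted above.
cisco_2000 = pe_ratio(share_price=74.0, earnings_per_share=0.50)   # 148.0
nvidia_now = pe_ratio(share_price=182.0, earnings_per_share=7.0)   # 26.0

print(f"Cisco (2000): {cisco_2000:.0f}x earnings, Nvidia (now): {nvidia_now:.0f}x")
```

Read it as "dollars paid today per dollar of current annual profit": the higher the multiple, the more the price leans on expected future growth, which is why a 148x reading looked so much frothier than a 26x one.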
So it's on another scale to what we saw in the early two thousands, but it is getting dangerously high.
Speaker 1: We'll be back with the rest of today's deep dive after a short note from today's sponsor. Okay, so there is, I guess, an example of this happening previously, but you've just said it was on another scale. I want to turn now to something else I've been reading about: circular investment in the AI industry.
Speaker 2: I find that so interesting. So circular investment happens when tech companies invest in each other's companies. One example would be last month, when Nvidia and Microsoft said they would invest fifteen billion dollars in a company called Anthropic, which makes another popular AI product called Claude. At the same time, Anthropic committed to purchasing thirty billion dollars' worth of computing infrastructure, chips and servers and so on, from Microsoft, which runs on Nvidia systems. So everybody is kind of buying things off each other and then investing in each other's companies. And that is, according to critics, similar to the dot-com bubble, where everyone was interconnected.
So you might be asking: why is that a problem? Well, it's a problem because if one domino falls over, then they all kind of do. And it could also be a problem in the sense that you might hear a headline saying that company X has raised however many hundreds of billions, but maybe that's not real money. It's kind of just, "if you help us out with some chips, we'll help you out with some computing power over there." These are in-kind transactions; it's not actually real cash. But yeah, the domino effect in a circular investment situation can be quite quick: if one goes down, the rest go down.
Speaker 1: I was having a conversation with somebody about the fact that there are a few people the world, it seems, trusts, so that when they sound the alarm, everybody goes, "yep, something is wrong here." Has that happened?
Speaker 2: It has happened. And you'll know Michael Burry; he's...
Speaker 1: The investor? I didn't know his name, but we know him.
Speaker 2: He's the investor who was made very notable for our generation in The Big Short, the movie.
He was the investor who basically said, "I think the housing market is about to collapse," and he shorted the housing market. He recently disclosed, I think about a week and a half ago, that his fund had purchased a billion US dollars' worth of bets against Nvidia, so taking a short position against Nvidia. Essentially, he's betting that Nvidia's stock price is going to fall dramatically. Then there's Peter Thiel, who's another billionaire tech investor. He sold his entire Nvidia stake this week; it was worth about one hundred million US dollars.
Speaker 1: Crazy.
Speaker 2: Crazy.
Speaker 1: It's crazy also that the actions of an individual can influence so much. So we've got massive spending, high prices, circular investment, all things that I've just learned about, and big investors betting against tech stocks. What are the companies themselves saying?
Speaker 2: They're saying there's nothing to worry about. They're saying everything's okay, and some of them do have a point. I mean, they're saying it's completely different from the dot-com bubble, and the whole P/E ratio thing that I said before...
They're relying on that a lot. But some of them are profitable, and in the late nineties almost none of the internet companies that went bust were profitable. So yesterday Nvidia revealed to the market that its profits in the past three months were up sixty-five percent from this time last year. In the last three months they've made thirty-two billion US dollars in profit. They have banked thirty-two billion in profit, and that is just one quarter. Nvidia's stock price did jump on that news, unsurprisingly; it went up three percent. That's why I said at the beginning that it had a rough couple of weeks but seems to be on the recovery now. The other point that AI companies make is that we all genuinely are using AI more than we did a year ago, and there is genuine demand. They're not making this up. They are showing that almost every business on the planet, and every school, every network, is using AI. So they have demand that they think they can meet.
And so, you know, they're saying that data centres are running at about eighty percent capacity, and if they get much higher than that, there's a danger they'll cap out. So that's why they need to build more data centres. It is expensive, yeah, but it's not like no one's using this and they're just building more and more for the sake of it.
Speaker 1: Okay. So we started the podcast by hypothesising about whether or not there is a bubble, or balloon, or whatever we want to call it. Is there?
Speaker 2: Well, if I knew that answer, I would have The Big Short 2 made about me. And that, I think, is a really important point to remember: no one knows. You hear a lot about the ones who get a pick right, and they get made into a movie with Christian Bale playing them as a character. But there are a lot who don't get it right, and even Michael Burry, for example, has gotten stuff wrong as well.
What I think is happening, though, is what analysts are calling a potentially healthy cool-off.
Speaker 1: So it's, like, correcting itself?
Speaker 2: Yeah, think about it like a big chill pill for the market. I like to think about it like a really fast car on a highway. If you spin out at one hundred and fifty kilometres an hour, you can totally lose control. But if you spin out at sixty kilometres an hour, you can kind of correct yourself. That's almost what investors are trying to do now: let's just take the temperature down and calm things down. But people are losing a lot of money in that process.
Speaker 1: Yeah. Well, I think that's a good point to end on, because it is always nice to bring something back to our listeners and try to make sense of it for them. So can you just, I guess, finish on what it means for somebody, whether or not they're an investor in the share market?
Speaker 2: Well, I think if you have a super fund, you are invested in the share market.
You just might not, at least at our age, realise that investment for another couple of decades. But most super funds, if not all of them, would have some sort of exposure to, or buy into, the companies we've talked about today. And when you talk about a company like Nvidia, its value is eight percent of the S&P 500, which is the index that lists five hundred of the largest companies in America, and most Australian super funds have some kind of S&P 500 exposure. Hopefully, over time, that will be a really good thing as the economy grows. But it is important to remember that we're all shareholders at some level, mainly through our super if we don't have shares directly. The only other thing I wanted to say, before we all panic that things are about to burst, is that, yeah, there was a big decline in the last month, but if you look at the last year, things are already a lot better than they were this time last year.
So, you know, you can picture the big graph in your head: there's been a dip, but the trend is still solid. So over the long term, the kind of long-term investing philosophy still stands true. But I think the really important bit here is that AI companies, for really the first time, are having to properly justify the value they have to shareholders.
Speaker 1: There you go. So interesting, Sam. Thank you for taking us through that, really appreciate it. And thank you for joining us for another episode of The Daily Aus. We will be back as usual with the headlines this evening, but until then, have a great day.
Speaker 3: My name is Lily Madden and I'm a proud Arrernte, Bundjalung and Kalkadoon woman from Gadigal Country. The Daily Aus acknowledges that this podcast is recorded on the lands of the Gadigal people and pays respect to all Aboriginal and Torres Strait Islander nations.
Speaker 1: We pay our respects to the first peoples of these countries, both past and present.