Sonya: Kia ora, welcome to Shared Lunch. I'm Sonya. Today we have a special episode: we're going to talk about AI and investing. We have a new feature coming to Sharesies called AI Search, and we'll talk more about it in this episode. But before we get started, here's some important information.

Disclaimer: Investing involves risk. You might lose the money you start with. We recommend talking to a licensed financial advisor. We also recommend reading product disclosure documents before deciding to invest. Everything you're about to see and hear is current at the time of recording.

Sonya: Today we're joined by Richard Clark, who is one of the co-founders of Sharesies, and Luke Petett, who is the founder and CEO of Telescope. We're talking about AI, which is a hot topic right now and something that's really of interest to us, as we're launching a new tool here at Sharesies called AI Search. So we want to talk about that, but also just get a chance to talk about AI with two people who are deep in that world. So let's kick off the conversation. First, maybe we'll start with you, Richard. Can you tell us a bit about you and how you've come to be joining us today?

Richard: Yeah, sure. So I'm one of the technical co-founders of Sharesies, and I came to Sharesies after a long time in the tech industry. AI has always been really interesting to me: the idea of computers doing everything, being just like people in some ways. It's science fiction everywhere, so that was always fascinating. Ever since I started programming I've tried to make things, little companions or whatever, and they were all very good.
But in the last few years we've seen this incredible sea change, and it has been so, so interesting to watch something go from nothing, to science fiction, to right there on your computer where you're able to play with it. My recent history with it has been really engaging with these new changes: building tools, chatting with people like Luke here, and really trying to understand how this is going to impact everything, and how we can use these things to do really, really interesting stuff.

Sonya: And how about you, Luke?

Luke: Yeah, well, first of all, as the founder of Telescope, I'm definitely trying to keep up with AI. It's wild, as Richard pointed out; it's just moving so quickly, even for someone well and truly involved in the industry. As an engineer, I feel like I understand the fundamentals, but it is crazy how quickly things are moving. And it certainly appeals. I mean, the idea that you can have a system make intelligent choices under unusual conditions: it feels so human, the way it works. It has actually shifted my thinking about how we see humans. I kind of feel like some sort of finely tuned model that's had all these experiences through my life as well.

Sonya: You mentioned you're from Telescope. For anyone who's never heard of Telescope, do you want to give us the one-liner of what Telescope is?

Luke: Yeah. So we're a generative AI company based in Australia, a small team, but we're growing.
We're working with brokers and publishers and fintechs all across the world, solving problems to do with capital markets and finance. Most of those problems, we've found, have to do with the overwhelming nature of capital markets, of investing, of the choices we have when it comes to investing. We offer copilots, AI tools, to help navigate that journey when it comes to making a decision, or thinking about a single stock, or looking for something that matters to you, and just providing more information to those investors.

Sonya: Yeah. Now I want to take a step back. For anyone who doesn't have their finger on the pulse, who's just reading the headlines and hearing what's going on, can we help fill in a few of those gaps? Can you tell us what's been happening that's driving this real kick into another gear?

Richard: From where we've been sitting at Sharesies, and from my own personal interest, it started probably three to four years ago, largely with a technology called GPT-2. You're probably aware of GPT-4, and you might have heard about ChatGPT in the headlines; GPT-2 was an earlier version of this. When it turned up, it was very interesting. It was nothing like what we're able to see today, but it had those hints. Now, with GPT-4 and some of the more recent things from Google and Anthropic and other companies involved in this space, it's doing really interesting stuff, constantly and consistently bringing new and interesting perspectives or creating new opportunities. And I think this is the point where you really have to be paying attention to this thing as an organization, because the potential is enormous and we just don't know what we can and can't do with it. We're still very much finding out.
Luke: I'd love to explain how it works as well.

Sonya: Oh, there you go. Don't hold back.

Luke: I was thinking about this this morning, and it's very hard sometimes to articulate these complex systems in simple ways. But just think about a simple sentence: "This morning I grabbed my surfboard and I caught an epic..." You know I'm most likely to say "wave", and we experience this kind of auto-predict on our phones. If you think about a sentence in a conversation, it follows certain patterns, and there's probability and mathematics involved in what's most likely to be that final word. But maybe it was actually "I caught an epic fish", so what's the probability of me saying that? It's probably very low. What OpenAI did in the beginning was say: you know what, let's take auto-predict and put it through a large cluster of computing units, of GPUs, and let's look at all of those relationships on a time series. Almost think of it like a 3D space: visualize the 3D space of language. You think about these kinds of relationships between words, but then you start to think of them in a multi-dimensional manner, about relationships between paragraphs and the question that's been asked. That's really all they did, and it's kind of fascinating that maybe this is all intelligence really is.
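[Editor's note: to make the auto-predict idea above concrete, here is a deliberately tiny sketch in Python. It's a word-pair (bigram) counter over a made-up corpus, not how GPT models actually work; they learn these probabilities with billions of parameters and condition on the whole conversation, but the core idea of scoring likely next words is the same.]

```python
# Toy next-word predictor: count which words follow "epic" in a tiny,
# made-up corpus, then turn the counts into probabilities. Real large
# language models learn this with billions of parameters instead of a
# lookup table, and condition on far more context than one word.
from collections import Counter

corpus = [
    "this morning i grabbed my surfboard and i caught an epic wave",
    "we caught an epic wave at the beach",
    "he caught an epic fish off the rocks",
    "she caught an epic wave before work",
]

follows_epic = Counter()
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        if prev == "epic":
            follows_epic[nxt] += 1

total = sum(follows_epic.values())
for word, count in follows_epic.most_common():
    print(f"P({word!r} | 'epic') = {count / total:.2f}")

# Prints: P('wave' | 'epic') = 0.75 and P('fish' | 'epic') = 0.25.
# "Wave" is the likely completion; "fish" is possible but improbable,
# which is exactly the intuition in the surfboard example above.
```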
Sonya: Yeah. I've used it to try and create meal plans for my one-and-a-half-year-old, you know, just stuff where I'm like: please help me. And then when you think of the mission we're on here, which is trying to create more empowerment for people when it comes to money, and some of the things you two have talked about, what's possible here and where AI is at, I'm really interested to get your thoughts on how you see this playing out for people when it comes to money or investing.

Luke: The irony is that if this video ends up on YouTube, it's probably going to be part of the training for a future model, so maybe this is our moment to share our ideas. Personally, I think about the best investors of all time, like Munger and Buffett and Howard Marks, the whole long list of value investors, the ones that actually study financial markets and history and take a multifaceted approach to investing. They've spent a good part of thirty, forty, in Munger's case maybe seventy years of their lives on it. I think Munger famously read for thirty-eight hours a week and would allocate two hours a week to meetings. So let's presume he spent seventy years of his investing career reading all sorts of literature, not just annual filings and SEC filings, but physics, politics, history. I've calculated it's around about two billion words that Munger has possibly consumed, maybe two to ten billion words, in his lifetime. And then the question is, how much of that does he recall? That's what formed his knowledge base; he's trained himself on that content. So first of all, there's no denying that the compound knowledge AI models are already accumulating, the intelligence they're building on, is growing at a huge rate, and it will outperform somebody like Munger or Buffett or whoever.
And so I think the opportunity in capital markets, when it comes to investing, is really interesting, because maybe it levels the playing field. If we have access to these models, and they become super intelligent, and we just know how to use them in the right way, and they're available publicly, not privately run by a hedge fund in New York, then maybe that makes the markets more efficient, and probably more fair. If a balance sheet comes out on the stock market and you have something that can read it and understand it when you can't, maybe you know what to do. So yeah, I think maybe it creates more opportunity for abundance when it comes to investing. That's my view on where it could go over the next ten years, and I'm an optimist. I'd like to hope it won't be held by, you know, concentrated powers on Wall Street. That's my view.

Richard: I would love to read thirty-eight hours a week. I love reading, and that sounds great. But the other reflection I have, and I think it's very key to us here at Sharesies, is that so much about wealth is personal: to you, to your family, to your situation, all of these things. Until a technology like AI turned up, nothing could take all of that context into account as it provides you with insights and tailors the information you're given. Maybe you're a particle physicist. You don't need somebody to give you baby words on the nature of particle physics, but you're trying to evaluate a company that's working in that field and you don't understand the financial terms.
Something that can understand what you know, what you don't, the gaps you have in your understanding, and what's relevant for you really changes the picture, in my view, because people can really leverage the opportunities they have personally, solve the problems they have personally, and apply their personal values once they get these tools. And I think we've talked about this several times: for the moment, anyway, we're anticipating AI acting in a copilot role. It's sitting there like a friend, a buddy. It's not making decisions for us; it's helping us understand the things that are important to us. When that happens, it feels really, really magical and really, really useful. Sonya's heard me say this many, many times: people need to be building an intuition about these tools and how they work, because without that you can't use them half as effectively. Almost all of our listeners who've heard anything about this stuff will have heard equal amounts of "wow, this is amazing" massive hype, and also "oh, hallucinations, they lie, they make stuff up". The trick is that you get way better outcomes if you invest the time to understand the strengths and weaknesses. Spend some time playing with them; it doesn't really matter whether you're doing serious work or just toying around.

Luke: Have you tried the audio feature within ChatGPT, where you just have it on in your car and talk to it? That's fascinating. I've done it with my kids. I picked them up from school (we've got three kids) and took a moment to explain my kids' ages, their names, and their interests. Then I asked ChatGPT to go around the car one by one and ask each of them an interesting question that they have to answer, and it becomes a homework task. I just drive and let this AI agent take over the conversation in the car, and they love it. It's fascinating.
Sonya: I love it. It's like your own little family podcast.

Luke: Yeah. Well, remember the feeling when we had an Encyclopedia Britannica, or even a map of the world? It's the same experience for them, just at a much deeper level.

Sonya: Yeah, great. So, talking about getting stuck in: one thing I think about is that sometimes the problems don't change; it's the solutions, all the ways we might solve those problems, that become different as technologies change. One thing we're launching soon, working with Telescope, is AI Search. One of the problems that exists, if people are investing, is that the top questions are: what should I invest in? How should I think about this? What do I need to know? People want to build their knowledge around investing and their confidence in that space. We're launching this in New Zealand. But maybe, Richard, do you want to give us a bit of insight into what AI Search is?

Richard: Yeah, yeah. So AI Search, for us, tackles what we call the discovery problem: how do you find the thing you're actually interested in, whether you know it well or not? AI Search is a new take on this. The idea is that you can give us a concept. You can search on a concept or an idea, a change in the world that might be true, and then ask how that would impact what you might look to invest in, or what you might look to think about.
Luke: I think the initial problem we were solving at Telescope, even prior to working with Sharesies, was that scenario playing: a situation, a theme, an event, something personal that mattered to you, and stepping through the butterfly effect, if you've heard that term before: the cause and effect, and then the cause and effect of that. So you go through this process of second- and third-order thinking, which is an interesting topic in itself, but you actually just rely on the one point seven trillion parameters that OpenAI has in their model to rationally search for stocks based on that theme. So it not only unpacks the idea that you had, it also rationally picks the stocks that match the idea. And what's really interesting is that it changes the way you then interact with the real world, because you suddenly realize you have this tool available. For example, we had a user take their EV to the mechanic, and they were told: oh, the tires have a bit of wear on them, but nothing unusual for an EV, that's just EV wear. And it's like, what do you mean by EV wear? You put that into Sharesies' AI Search and it actually explains it: EVs get twenty to twenty-five percent more wear on their tires because the car is heavier and has more torque. So an interesting side effect of the electrification of vehicles is that tire companies are starting to benefit, with their revenues growing. That's an interesting idea that you probably wouldn't traditionally have been able to just search.
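[Editor's note: for a sense of what a theme-to-stocks query like the EV-tire example might look like under the hood, here is a minimal sketch using OpenAI's public chat completions API. This is not Sharesies' or Telescope's actual implementation; the prompt, model choice, and output format are illustrative assumptions, and any stock ideas a model returns would still need checking against real market data.]

```python
# Hypothetical sketch of a theme-to-stocks search. NOT the actual
# Sharesies/Telescope implementation. Assumes the openai Python package
# is installed and OPENAI_API_KEY is set; the model choice is illustrative.
from openai import OpenAI

client = OpenAI()

theme = "EVs are heavier and have more torque, so their tires wear out faster"

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model; any capable chat model would do
    messages=[
        {
            "role": "system",
            "content": (
                "You help investors explore themes. Given a theme, walk "
                "through its second- and third-order effects, then list "
                "listed companies that plausibly benefit, with a one-line "
                "rationale for each. This is research, not financial advice."
            ),
        },
        {"role": "user", "content": theme},
    ],
)

print(response.choices[0].message.content)
```

[In a real product, the raw model output would presumably be constrained to a known universe of tickers and validated before being shown to investors, which is part of what makes these tools more than a single API call.]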
Luke: So going into the Sharesies app and just throwing in these ideas, from the experiences you have in your career or at home, means thinking about the products and services you use in a different way and being able to search on them. You can put in something that an ETF doesn't already cover. What about just female CEOs? Unfortunately that's not an ETF, it's not an investment product, but perhaps it's something you want to invest in, so you can search for it and then choose the single stocks that you actually believe in.

Sonya: The opportunity, I think, is in helping people build context. Unfortunately, there's no yes-or-no answer to anything in the world of investing. But what's really fascinating is that the more you understand the richness around your decisions, the assumptions and the context, the more you can recognize connections that you might not have immediately made. Being able to explore, build that knowledge base, see some of that logic, and flesh those things out is super exciting, and I can't wait for this to launch, and to get feedback from our investors on how it's helping them search, get more interested, and build a portfolio that's right for them. I think we've come to the end of our time now. I'm just so grateful to have you both here sharing your insights in what's been a really fascinating conversation, and sharing some cool tools that people can use to get stuck in. Thanks heaps for sharing.

Luke: Thank you.