Speaker 1: Hello and welcome to The Australian's Money Puzzle podcast. I'm James Kirby. Welcome aboard, everybody. It looks very much like twenty twenty five is shaping up as the year artificial intelligence really changes the game for investors. In today's show I've got two guests: financial advisor James Gerard, who you know, and an AI chatbot in the form of Microsoft's Copilot. In each segment of the show, I start by interviewing the AI Copilot, and then my very human financial advisor guest, James Gerard, reviews the quality of what's been offered for free by this remarkable new technology. Okay, stand by, here we go. Thank you for coming on.

Speaker 2: Thanks for having me, James.

Speaker 1: Okay, let's get into it. I really just want to figure out how to start. You know, how do I begin as an investor? Are there certain books or very well known courses perhaps I could do?

Speaker 2: Starting out as an investor, you want to look at the basics first. There's a classic book called The Intelligent Investor by Benjamin Graham that's really useful, and you can find online courses too.
Speaker 2: It's about getting a good grasp on the fundamentals before diving into the market.

Speaker 1: Yeah, thank you. Okay. Well, the thing is, I looked at the Benjamin Graham books and they were written a long time ago. I find those books very difficult; they're beyond me, really. I'm looking for something much more elementary, you know, like an introduction to personal investing.

Speaker 2: Got it. The older stuff can be a bit dense. For something more digestible, you might check out A Random Walk Down Wall Street by Burton Malkiel. It's a bit more approachable for beginners and talks about the basics of investing in the stock market.

Speaker 1: Yeah, very good. I like that idea. Knowing the basics of the stock market, that would be useful. Of course, it's again a long time since Burton Malkiel wrote that book, and it's a bit dense. So which is better, can you tell me: the shares or the ETFs?

Speaker 2: ETFs have really changed the game. They offer diversification and tend to have lower fees compared to buying individual shares.
Speaker 2: It's not necessarily about one being better than the other. It depends on your strategy. ETFs can be a good starting point for beginners because of the built-in diversification. What's your goal with investing?

Speaker 1: My goal? Look, my goal with investing is like any other goal, which is to make as much money as possible. I suppose one of the dilemmas for me is whether I buy shares or exchange-traded funds. As you say, maybe they're interchangeable. But which do you think: should I buy shares or property? Because house prices are also going up pretty fast.

Speaker 2: Property can be a solid investment, especially if you're thinking long term. It's a tangible asset and has potential for appreciation and rental income. Shares and ETFs are more liquid, but property offers a different kind of stability. Diversifying across asset classes could balance out your portfolio. What's your risk tolerance?

Speaker 1: My risk tolerance? My risk tolerance is normal.
Speaker 1: I suppose one of the things I'm wondering, though, is if I'm paying down my home loan right at the moment, my mortgage, maybe I'd be better off doing some investing. Are there any surveys that suggest which is better? And keep in mind I live in Australia.

Speaker 2: It really depends on your specific situation. Paying down your mortgage can be a safe bet because it reduces debt and can save you on interest. Investing could potentially offer higher returns, but it comes with risks. Consider speaking with a financial advisor who knows the Australian market.

Speaker 1: Well, okay, yes. One of the things I'm hoping to get, I suppose, is, you know, without going to an advisor, maybe some information from you, because you seem to be across investment data. One of the things we have here is this choice between putting money aside for the pension, you know, for later life, or putting money into the mortgage. It's difficult to assess how much you need in this area. How much do you need to retire? How do you estimate these things?

Speaker 2: Retirement needs can vary greatly.
Speaker 2: A common rule of thumb is aiming for around seventy to eighty percent of your current income annually in retirement. This includes your pension savings and any investments, but it really depends on your lifestyle, health care needs, and life expectancy. It's a balancing act.

Speaker 1: Oh yeah? Where did you get that number for the rule of thumb, the seventy to eighty percent? It sounds a bit high to me. I'd love to have that level of income in my own retirement.

Speaker 2: The seventy to eighty percent figure comes from financial planning guidelines. It's based on the idea that retirees might spend less on things like commuting or saving for retirement, but it can vary. Some folks aim for more, especially if they plan to travel or have high medical costs.

Speaker 1: Okay, okay. And what would you say financial planning guidelines are? Which guidelines are you referring to?

Speaker 2: I'm talking about advice from financial planners and experts.

Speaker 1: Can you name who the financial planners are? I mean, is it an association of some sort? Is there a qualification here?

Speaker 2: Sure.
Speaker 2: These guidelines often come from financial planning associations and experts worldwide. In Australia, you've got bodies like the Financial Planning Association (FPA) or the Association of Financial Advisers (AFA). Financial planners typically need qualifications and must meet certain standards to give advice.

Speaker 1: Do you know how many financial advisers there are in Australia? I'd like to find one.

Speaker 2: There are thousands of financial advisors. I am not sure of the exact number, but finding one might be easier through the Financial Planning Association or the Australian Securities and Investments Commission website.

Speaker 1: Okay, pretty good, thank you. Well, James Gerard, human financial advisor, that was pretty spooky. I found it spooky because I'd been talking at that point to the bot (these are edited versions you're hearing, folks) for about half an hour, and I was starting... It never gets emotional, of course, but you know, I was starting to respond like I was talking to a character. The first thing I think we have to put on the table is that it's probably right.
Speaker 1: I mean, for a lot of the elementary financial advice, why on earth would you pay three thousand a year to get basic questions answered when you can get them answered with AI? Am I being brutally simplistic, or is that true?

Speaker 3: I think it's true, James. People will be jumping onto AI more and more to type in these basic questions, and that's a threat to all financial advisors who are just focusing on the basics. So I think what's going to happen with the financial advisory industry is that everybody will need to skill up and level up. Where there are very complex situations, or not even complex situations, complex decisions to be made, where there are maybe five to ten different potential pathways someone could go with an investment or the way they structure something, it's very difficult to capture that in a very full discussion and overlay that judgment, that human judgment. Because AI is taking data, it's listening to the input, and it's whacking out what it thinks is best. But it does lack that human element, and I'm not sure if that can ever be replicated, because we have to take it to another degree.
Speaker 3: Whereas when we sit in video calls or face-to-face meetings with clients, as human financial advisors we look at people's body language, how they react, and we position our communication in a way that is in line with how the person would like to receive it. Some people (in our stereotype, engineers) love information, whereas salespeople and directors just want short, snappy answers. So that's a very simple example of how we adapt, and AI can't see you.

Speaker 1: That's EQ, right? It has the IQ, but it's never going to have the EQ. That's an interesting point. And there's one other thing you mentioned, which I think is the killer point, really: you have to know the question to ask. In an almost paradoxical way, part of this amazing leap in information supply from AI means that knowing the questions to ask is more important than ever. So you would think financial literacy is less important, but in a way it's more important, because the people who can harness this are the people who will ask the right questions.
Speaker 1: But you're never going to know the right questions unless you have some basics, I think. And then, in terms of your industry, basically the whole thing moves up. Is that the ultimate outcome here?

Speaker 3: Yeah, I believe so. I'd bet my last dollar that there are financial advisors out there today who are typing basic questions into AI engines, like what's the current contribution cap, what's the catch-up concessional? So financial advisors are doing it every day. Moms and dads and the general public are going to do it as well. So the financial advisors need to skill up. And one little AI tip would be that if you don't know what questions to ask, or you think there may be things you're not completely aware of, you can actually ask AI: all right, this is my situation, I want to know this, but I'm not sure what questions to ask. Can you help me? Tell me what questions I should be asking.

Speaker 1: All right, we're going to keep these segments short. There's a lot to get through.
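A quick aside before the next segment: the seventy-to-eighty-percent rule of thumb the bot quoted earlier is simple multiplication. A minimal sketch, for illustration only (the function name and the example salary are ours, not from the conversation, and none of this is financial advice):

```python
def retirement_income_target(current_income: float,
                             low_pct: int = 70,
                             high_pct: int = 80) -> tuple[float, float]:
    """Annual retirement income range implied by the 70-80% rule of thumb."""
    return current_income * low_pct / 100, current_income * high_pct / 100

# Someone earning $100,000 a year would target roughly $70,000-$80,000
# a year in retirement under this rule of thumb.
low, high = retirement_income_target(100_000)
print(low, high)  # 70000.0 80000.0
```

As the bot itself hedged, lifestyle, health care needs and life expectancy can push a real target well outside this band.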
Speaker 1: So what we're going to do in the next segment, you'll hear in a moment.

Speaker 1: Hello and welcome back to The Australian's Money Puzzle podcast. I'm James Kirby, and my guest today is the Microsoft Copilot AI chatbot. We've been going fairly well so far. I want to lift it a gear now and ask the Copilot a bit more elaborate, a bit more advanced questions, about super, for instance. So, okay, can you tell me about super in Australia? I live in Australia. How much can I put into super each year, do you know?

Speaker 2: For the twenty twenty three to twenty twenty four financial year, you can contribute up to twenty seven thousand, five hundred dollars in concessional, before-tax contributions.

Speaker 1: Uh huh, very good. Okay, that's really all I'm interested in: twenty seven five hundred. We'll take it on board. That's actually a bit out of date, because in fact it's gone up to thirty thousand more recently. But still. Now, when I put the money into super, what happens? Is it tax free? Does it go straight in, or is there tax involved?
Speaker 2: When you put your before-tax contributions into super, like salary sacrifice, they're generally taxed at fifteen percent. So it's not tax free, but it's a lower tax rate compared to most people's income tax rate.

Speaker 1: I understand. Okay, yeah, very useful. And in terms of how much I can put into super, there's a certain amount that goes in by law, right, that I must put in. The government makes me put in a certain amount. How much is that?

Speaker 2: Yep, there's a compulsory amount. For the twenty twenty three to twenty twenty four financial year, employers must contribute at least eleven percent of your earnings to your super. This is called the superannuation guarantee.

Speaker 1: Okay, right, very good. The superannuation guarantee, that's the minimum, as such, that I must put into my super. Okay, can I take... it's my money, right? So can I take money out of super before I retire?

Speaker 2: Generally, accessing super before retirement is not allowed unless under specific conditions like severe financial hardship or certain medical situations. Once you hit the preservation age, you can typically access it.
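The two numbers the bot gives here, the eleven percent superannuation guarantee and the fifteen percent contributions tax, combine with simple arithmetic. A rough sketch using the bot's (now superseded, 2023-24) figures; the function names are ours, and the real rules have many more wrinkles:

```python
def super_guarantee(annual_earnings: float, sg_pct: int = 11) -> float:
    """Compulsory employer contribution at the quoted 2023-24 SG rate of 11%.
    (As noted in the show, the rate has since risen.)"""
    return annual_earnings * sg_pct / 100

def net_into_super(concessional: float, tax_pct: int = 15) -> float:
    """Amount left in the fund after the standard 15% tax on before-tax
    (concessional) contributions."""
    return concessional * (100 - tax_pct) / 100

sg = super_guarantee(90_000)   # employer pays in 9900.0
landed = net_into_super(sg)    # 8415.0 remains after the 15% contributions tax
print(sg, landed)
```

The point of the sketch is the shape of the calculation, not the rates, which change by financial year.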
Speaker 1: The preservation age? What's the preservation age? What does that mean?

Speaker 2: Your preservation age is the minimum age you can generally access your super. In Australia, it ranges from fifty five to sixty, depending on when you were born.

Speaker 1: Uh huh, okay, okay. Now can I ask you, there are some changes going on in super, where this new tax is coming in on super over three million. I thought super was tax free. I thought all income in super was tax free.

Speaker 2: Super income is generally taxed lower than regular income, but it's not always tax free. From July twenty twenty five, balances over three million dollars will face higher tax rates on the earnings. The idea is to ensure fairness in the system. It might not affect everyone, but it's worth being aware and seeking advice if your balance is approaching that level.

Speaker 1: And is that the law now? Is that actually the law?

Speaker 2: Yes, it's set to become law starting in the financial year twenty twenty five to twenty six.

Speaker 1: Thank you very much. Okay, James Gerard, the human financial advisor, you were listening to that.
Speaker 1: I think it's amazing that a US-based AI app could give us such precise answers. This is the sort of thing that people ask financial advisors at an elementary level when they go in: how much can I put in super? How much can I put in pre-tax, et cetera, post-tax? What did you think of the answers?

Speaker 3: I thought it was very interesting. The answers were mainly technically correct. But when it talked about contribution caps, it gave a former financial year's number. Currently it's thirty thousand dollars for pre-tax super contributions.

Speaker 1: So it was wrong. Okay, I spotted that. Obviously I didn't want to get into an argument with a bot on air, but as such, it was wrong. Then again, to be fair, it was not so much wrong as out of date. It was right a couple of years ago, but it's wrong right now. So why is that, do you think?
Speaker 3: I think it's just that the underlying data that's been sucked into these various AI models may not be up to date as yet, but that's a minor issue that will improve as they get more linkage to contemporary websites and access to live data. I noticed that when it came to the mandatory superannuation contributions, it did say twenty twenty three to twenty twenty four financial year, and it said eleven percent. But we're not in twenty twenty three to twenty four.

Speaker 1: No. But then again, you know, to be fair, they were correct figures. They were just out of date, that's right. And if that was an employee of yours, one of your many employees, tell me, what would you have said in the performance review? Would you have said they could have done better?

Speaker 3: They would have had to go brush up on their technical skills, because that's just not up to scratch. We can't go advising clients with inaccurate technical information. And another area was around the preservation age. Actually, this one was wrong.
Speaker 3: It said fifty five, but it's actually sixty now. So fifty five was quite a while ago with the preservation age, yeah.

Speaker 1: Quite a while ago? That's quite a while ago. That was actually completely wrong. It's been sixty for years.

Speaker 3: That's right. And the other observation is that, in my opinion, it comes down to the prompting, what you say to the AI bot, in terms of what the answer is. So with the super contributions, you asked it what's the rate of tax on super contributions, and that came out at fifteen percent for concessional contributions. But if your income is above two hundred and fifty thousand, it's actually thirty percent contribution tax. So unless you knew that there's a different tax for people above and below two hundred and fifty thousand, the average Australian who's going to use ChatGPT or some other AI machine to understand their finances may not get the full picture, because they're not asking the right questions.
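James Gerard's point about the two-tier contributions tax can be sketched in a few lines. This is a deliberately simplified illustration of the threshold he describes: in reality, Division 293 levies the extra fifteen percent only on the lesser of the contributions and the income above the threshold, so treat this as a sketch of the idea, not a calculator:

```python
def contributions_tax_rate(income_for_div293: float,
                           threshold: float = 250_000) -> float:
    """Simplified two-tier rate: 15% standard, an effective 30% once
    relevant income exceeds the $250,000 Division 293 threshold."""
    return 0.30 if income_for_div293 > threshold else 0.15

print(contributions_tax_rate(120_000))  # 0.15
print(contributions_tax_rate(300_000))  # 0.3
```

This is exactly the kind of hidden branch his example warns about: someone who never asks about the threshold only ever sees the fifteen percent answer.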
Speaker 1: Okay, very interesting. Even assuming, well, let's put to one side the fact that they were out of date: there's no context, and there is really relevant context here. So, for instance, the answer was true up to an income of two hundred and fifty thousand, isn't that right? And if you were over two hundred and fifty thousand in annual income, there's a thing called Division two ninety three which kicks in, and there's another tax on super. And gosh, I'm hardly surprised that AI can't keep up with it, because we're all struggling to keep up with it. Okay, we'll be back in a moment with some really interesting questions about the future of financial advice.

Speaker 1: Hello, welcome back to The Australian's Money Puzzle podcast. I'm James Kirby. It's the first time that I've had two guests on the show, one human and one AI bot, which is Microsoft's Copilot. We've had a bit of fun basically asking the bot about how to start in financial advice.
Speaker 1: Then we sort of lifted our game a bit and asked, you know, mildly advanced questions about super, and it wasn't as good on that part, I have to say. Now I'm just going to ask in this segment about the future of financial advice itself and what might happen. And I think it's an extraordinary piece, for the simple reason that initially you'll hear the bot give a bland answer about job losses. I push it a second time and, like a human, it takes a second read and actually gives a better answer. It's really interesting. Now, this time I want to ask something more broad. Can you tell me, will there be job losses in the future due to artificial intelligence, due to AI?

Speaker 2: AI's impact on jobs is a complex topic. While some jobs may be affected, AI also creates new opportunities.

Speaker 1: Yeah, yeah, I know, sure. But I'm asking you: do you think there will be job losses from AI in the next few years?

Speaker 2: Absolutely. It's likely that AI will lead to job losses in some sectors due to automation. However, it will also create new roles and industries.
328 00:18:18,920 --> 00:18:22,200 Speaker 1: Okay. The industry I'm particularly interested in is financial advice, 329 00:18:22,720 --> 00:18:26,000 Speaker 1: and of course that's the area, invariably, our other guests 330 00:18:26,040 --> 00:18:29,000 Speaker 1: on the show are often from. What can AI offer 331 00:18:29,640 --> 00:18:33,040 Speaker 1: in the business of financial advice and investment information? 332 00:18:34,040 --> 00:18:37,040 Speaker 2: AI can offer a lot in the financial advisory space. 333 00:18:37,680 --> 00:18:42,600 Speaker 2: It can analyze vast amounts of data quickly, provide personalized 334 00:18:42,680 --> 00:18:46,760 Speaker 2: investment advice based on individual goals, and even predict market 335 00:18:46,760 --> 00:18:53,160 Speaker 2: trends with advanced algorithms. Plus, it can streamline administrative tasks, 336 00:18:53,680 --> 00:18:56,960 Speaker 2: allowing advisors to focus more on client relationships. 337 00:18:57,800 --> 00:19:00,280 Speaker 1: Yeah. What would be the risk then for the lay 338 00:19:00,320 --> 00:19:05,280 Speaker 1: listener or investor taking AI information exclusively for their 339 00:19:05,359 --> 00:19:06,159 Speaker 1: financial advice? 340 00:19:07,600 --> 00:19:12,800 Speaker 2: Relying solely on AI for financial advice carries risks. There's 341 00:19:12,800 --> 00:19:16,240 Speaker 2: the potential for over-reliance on algorithms that might not 342 00:19:16,400 --> 00:19:22,160 Speaker 2: capture the nuances of individual circumstances or market anomalies. Plus, 343 00:19:22,400 --> 00:19:26,000 Speaker 2: without human judgment, there's a risk of missing the emotional 344 00:19:26,119 --> 00:19:28,719 Speaker 2: or psychological aspects of investing. 345 00:19:29,480 --> 00:19:31,879 Speaker 1: Okay, thank you very much. Great to have you on 346 00:19:31,920 --> 00:19:35,240 Speaker 1: the show.
Nice to talk to you, and, well, I 347 00:19:35,240 --> 00:19:37,120 Speaker 1: don't know if we'll have you on again, but thank 348 00:19:37,200 --> 00:19:38,320 Speaker 1: you for coming on the show. 349 00:19:38,800 --> 00:19:39,720 Speaker 2: Thanks for having me. 350 00:19:40,240 --> 00:19:44,639 Speaker 1: Take care. Well, that was a pretty interesting interview, certainly 351 00:19:44,640 --> 00:19:46,359 Speaker 1: one of the more interesting interviews I've ever done, I 352 00:19:46,400 --> 00:19:48,520 Speaker 1: can tell you that, folks, because I'm not used to 353 00:19:48,560 --> 00:19:53,120 Speaker 1: talking to bots, and most people aren't. James, what did 354 00:19:53,160 --> 00:19:57,000 Speaker 1: you think about the general quality of the information? I mean, 355 00:19:57,040 --> 00:20:00,119 Speaker 1: some of it was seriously precise, and 356 00:20:00,160 --> 00:20:03,840 Speaker 1: actually the precise stuff was inaccurate, I thought. But then 357 00:20:03,880 --> 00:20:06,120 Speaker 1: the general stuff wasn't bad. What do you think? 358 00:20:06,960 --> 00:20:09,679 Speaker 3: Yeah, it was okay, I would say. I'd rate it 359 00:20:09,720 --> 00:20:11,600 Speaker 3: as sort of like a seven out of ten. It 360 00:20:11,720 --> 00:20:14,479 Speaker 3: covered a lot of broad principles, a lot of just 361 00:20:14,520 --> 00:20:17,040 Speaker 3: general information that applies to a lot of people. But 362 00:20:17,320 --> 00:20:18,959 Speaker 3: maybe those books are a little bit off; they 363 00:20:18,960 --> 00:20:21,000 Speaker 3: probably wouldn't be the first two that I'd give to 364 00:20:21,280 --> 00:20:23,720 Speaker 3: everybody, because they are a little bit complex in nature. 365 00:20:23,720 --> 00:20:25,520 Speaker 1: But they're old books, and they're hard to read, and 366 00:20:25,680 --> 00:20:27,000 Speaker 1: they're both American.
367 00:20:26,960 --> 00:20:29,919 Speaker 3: That's right. So it probably just lacked that context. 368 00:20:30,040 --> 00:20:31,879 Speaker 3: A little bit of context around it would have been good: 369 00:20:31,920 --> 00:20:34,119 Speaker 3: I'm an Australian investor, I'm forty five years old, so 370 00:20:34,160 --> 00:20:36,520 Speaker 3: on and so forth. And, I don't know, it might still 371 00:20:36,520 --> 00:20:38,680 Speaker 3: give even the same ones. But again, it comes back 372 00:20:38,720 --> 00:20:42,280 Speaker 3: to these issues around these AI models; they're still relatively new. 373 00:20:42,760 --> 00:20:45,480 Speaker 1: Yes. Okay, now I thought it was good 374 00:20:45,520 --> 00:20:51,320 Speaker 1: on the basics there, like ETFs, you know, and having goals. 375 00:20:51,840 --> 00:20:54,040 Speaker 1: I thought that was quite good. I mean, they were 376 00:20:54,160 --> 00:20:57,280 Speaker 1: very general answers, but they were very general questions. 377 00:20:57,520 --> 00:20:59,480 Speaker 1: What did you think? Did you find any fault 378 00:20:59,520 --> 00:21:03,600 Speaker 1: with the basic principles or approach that it took? 379 00:21:05,359 --> 00:21:08,959 Speaker 3: I had no concerns with what it said generally about ETFs, 380 00:21:09,040 --> 00:21:12,280 Speaker 3: but I guess the next step with an AI financial 381 00:21:12,280 --> 00:21:14,359 Speaker 3: adviser would be to say, well, which ones for me? 382 00:21:14,600 --> 00:21:17,399 Speaker 3: And then that just opens up this whole conversation that 383 00:21:17,480 --> 00:21:19,879 Speaker 3: typically someone would have with a face-to-face advisor, 384 00:21:20,000 --> 00:21:24,320 Speaker 3: because the AI financial advisor can look at the whole 385 00:21:24,400 --> 00:21:26,679 Speaker 3: list of ETFs that are available through the Australian 386 00:21:26,680 --> 00:21:30,280 Speaker 3: share market.
But it's not going to be as good 387 00:21:30,320 --> 00:21:33,000 Speaker 3: at picking. Well, what about timing, market conditions, should 388 00:21:33,000 --> 00:21:35,080 Speaker 3: you dollar cost average in? It can do the 389 00:21:35,119 --> 00:21:38,080 Speaker 3: basics on risk return metrics, but it's going to take 390 00:21:38,119 --> 00:21:40,680 Speaker 3: a very big conversation. Like, you need to know quite 391 00:21:40,680 --> 00:21:42,200 Speaker 3: a bit to be able to get the best out 392 00:21:42,200 --> 00:21:43,440 Speaker 3: of it, to get what you want. 393 00:21:43,840 --> 00:21:45,840 Speaker 1: Yeah, it'll be interesting to talk to it the morning 394 00:21:45,880 --> 00:21:49,920 Speaker 1: after a crash. That would be really interesting, because it 395 00:21:49,960 --> 00:21:53,159 Speaker 1: would be unemotional, unlike the human advisors and 396 00:21:53,200 --> 00:21:56,560 Speaker 1: stockbrokers that we might be talking to. Okay, now 397 00:21:56,640 --> 00:21:58,399 Speaker 1: I'll tell you, well, I'll tell you what I didn't like. 398 00:21:58,880 --> 00:22:01,560 Speaker 1: At one stage, it said retirement needs can vary greatly; 399 00:22:01,600 --> 00:22:04,560 Speaker 1: a common rule of thumb is aiming for seventy 400 00:22:04,560 --> 00:22:09,040 Speaker 1: to eighty percent of your current income in retirement. Oh, 401 00:22:09,280 --> 00:22:11,800 Speaker 1: I don't know. I tell you, that sounds too high to me; 402 00:22:11,840 --> 00:22:14,000 Speaker 1: that sounds way ambitious, I reckon. Most Australians do not 403 00:22:14,080 --> 00:22:17,879 Speaker 1: retire with seventy to eighty percent of their previous income 404 00:22:18,119 --> 00:22:20,239 Speaker 1: and wouldn't even aspire to do so. Is it too 405 00:22:20,359 --> 00:22:22,280 Speaker 1: high or was it in the ballpark?
406 00:22:23,400 --> 00:22:25,720 Speaker 3: I think it's too high. And I think this is 407 00:22:25,720 --> 00:22:28,200 Speaker 3: maybe one of the issues at present with these AI 408 00:22:28,520 --> 00:22:31,120 Speaker 3: chat engines: they learn from these things called 409 00:22:31,359 --> 00:22:34,240 Speaker 3: large language models. So it's basically a database of information 410 00:22:34,400 --> 00:22:36,400 Speaker 3: that is in there; when you type a question, 411 00:22:36,440 --> 00:22:38,360 Speaker 3: it looks into that and then spits out the answer. 412 00:22:38,400 --> 00:22:41,399 Speaker 3: And it's probably going more for a theoretical answer. So 413 00:22:41,560 --> 00:22:44,160 Speaker 3: in some textbook or some website somewhere, it probably says 414 00:22:44,200 --> 00:22:46,360 Speaker 3: that you should aim to retire on seventy to eighty 415 00:22:46,400 --> 00:22:50,200 Speaker 3: percent of your pre-retirement income. However, when you overlay 416 00:22:50,240 --> 00:22:52,840 Speaker 3: that with reality, as you say, most people don't. So 417 00:22:52,880 --> 00:22:55,399 Speaker 3: there's a difference between what theory says and what actually 418 00:22:55,400 --> 00:22:57,760 Speaker 3: happens day to day, which AI may not be so 419 00:22:57,840 --> 00:22:59,320 Speaker 3: good at distinguishing. 420 00:22:59,119 --> 00:23:01,359 Speaker 1: Actually, that brings me back, in a nice way to 421 00:23:01,359 --> 00:23:04,320 Speaker 1: close this show, to one of the very first things 422 00:23:04,320 --> 00:23:08,600 Speaker 1: we were told when we started in journalism, which is: 423 00:23:08,720 --> 00:23:11,760 Speaker 1: there is no such thing as a stupid question, only 424 00:23:11,880 --> 00:23:15,040 Speaker 1: stupid answers. And we might leave it right there. Thank 425 00:23:15,080 --> 00:23:18,040 Speaker 1: you very much, James Gerard.
I won't go through the 426 00:23:18,080 --> 00:23:20,800 Speaker 1: protocol of thanking a bot, because I still like to 427 00:23:20,800 --> 00:23:23,040 Speaker 1: think that I only talk to humans. So thank you, 428 00:23:23,119 --> 00:23:26,320 Speaker 1: James Gerard of financialadvisor dot com dot au. Thank you, James. 429 00:23:26,320 --> 00:23:29,240 Speaker 1: My pleasure, lovely to have you on the show. Keep 430 00:23:29,280 --> 00:23:32,000 Speaker 1: the emails rolling, folks: the money puzzle at theaustralian 431 00:23:32,080 --> 00:23:33,200 Speaker 1: dot com dot au.