1 00:00:06,080 --> 00:00:07,840 Speaker 1: Welcome to Fear and Greed Q and A where we 2 00:00:07,880 --> 00:00:11,959 Speaker 1: ask and answer questions about business, investing, economics, politics and more. 3 00:00:12,000 --> 00:00:15,640 Speaker 1: I'm Sean Aylmer. Artificial intelligence is quickly becoming one of 4 00:00:15,720 --> 00:00:19,480 Speaker 1: the biggest strategic investments for major companies, and Telstra is 5 00:00:19,640 --> 00:00:22,720 Speaker 1: performing a balancing act, expanding its use of AI to 6 00:00:22,760 --> 00:00:26,960 Speaker 1: improve efficiency in customer service while not overcapitalizing on 7 00:00:27,040 --> 00:00:30,800 Speaker 1: the technology. Michael Ackland is Telstra's chief financial officer. He's 8 00:00:30,800 --> 00:00:33,960 Speaker 1: also just won the Investor Engagement Excellence Award at the 9 00:00:34,000 --> 00:00:36,559 Speaker 1: CFO Awards. Michael, welcome to Fear and Greed Q and 10 00:00:36,600 --> 00:00:39,720 Speaker 1: A. Thanks Sean. First up, just on the award, the 11 00:00:39,880 --> 00:00:44,479 Speaker 1: Investor Engagement Award. How the hell do you engage with, 12 00:00:44,680 --> 00:00:47,040 Speaker 1: what, more than a million investors? That is quite the 13 00:00:47,120 --> 00:00:47,720 Speaker 1: job you've got. 14 00:00:48,440 --> 00:00:51,639 Speaker 2: Yeah, no, absolutely. And luckily we have a whole lot 15 00:00:51,640 --> 00:00:57,040 Speaker 2: of analysts and institutional investors that we can engage with 16 00:00:57,080 --> 00:01:00,640 Speaker 2: deeply, and they amplify our communication around what we're doing. 17 00:01:00,680 --> 00:01:03,480 Speaker 2: But it is, it's a big job, and it's all 18 00:01:03,520 --> 00:01:06,119 Speaker 2: about alignment, making sure that our 19 00:01:06,160 --> 00:01:08,080 Speaker 2: message is aligned with our strategy and we do what 20 00:01:08,080 --> 00:01:08,880 Speaker 2: we say we're going to do. 21 00:01:08,959 --> 00:01:11,240 Speaker 3: So it's like many other parts of life. 22 00:01:11,760 --> 00:01:13,959 Speaker 1: So one thing about Telstra: there are a lot of 23 00:01:13,959 --> 00:01:16,920 Speaker 1: your shareholders who want a dividend, expect a dividend, 24 00:01:16,920 --> 00:01:19,160 Speaker 1: and in the last results last month they got an 25 00:01:19,160 --> 00:01:21,400 Speaker 1: increased dividend. They've got a share buyback, and I don't know 26 00:01:21,400 --> 00:01:23,160 Speaker 1: whether they're happy with share buybacks or whether they'd prefer 27 00:01:23,240 --> 00:01:26,200 Speaker 1: dividends. In real terms, how 28 00:01:26,200 --> 00:01:29,520 Speaker 1: do you balance all that? Because obviously you've got institutional shareholders, 29 00:01:29,720 --> 00:01:33,640 Speaker 1: but you've got a lot of very small shareholders. How 30 00:01:33,640 --> 00:01:34,960 Speaker 1: do you balance those two things? 31 00:01:35,520 --> 00:01:38,440 Speaker 2: Yeah, well, I think we've had a pretty clear investment 32 00:01:38,480 --> 00:01:40,800 Speaker 2: narrative for some time. And you rightly point out the dividend. 33 00:01:41,160 --> 00:01:43,800 Speaker 2: Many people own us for that dividend, and they own 34 00:01:43,880 --> 00:01:46,920 Speaker 2: us as a defensive stock and a stock that's going 35 00:01:46,959 --> 00:01:49,800 Speaker 2: to be a reliable cash flow generator and one that 36 00:01:49,840 --> 00:01:53,160 Speaker 2: they can trust.
And you know, that's really important, and 37 00:01:53,160 --> 00:01:56,160 Speaker 2: we think about that when we think about our strategy. 38 00:01:55,720 --> 00:01:58,080 Speaker 3: We're in it for the long term. 39 00:01:58,800 --> 00:02:02,920 Speaker 2: I like to say we're here for the shareholders that 40 00:02:02,960 --> 00:02:05,280 Speaker 2: are going to be the holders, not those that are 41 00:02:05,280 --> 00:02:07,880 Speaker 2: in for a short time but a good time. And 42 00:02:07,920 --> 00:02:10,640 Speaker 2: I think that's really important with a stock and a 43 00:02:10,680 --> 00:02:13,600 Speaker 2: company like Telstra, with our history and our assets, that 44 00:02:14,480 --> 00:02:17,359 Speaker 2: we really align those things around being there for long 45 00:02:17,440 --> 00:02:19,320 Speaker 2: term investors and long term holders of our stock. 46 00:02:19,680 --> 00:02:21,560 Speaker 1: Okay. I'll get into AI in a moment. 47 00:02:21,639 --> 00:02:24,400 Speaker 1: Just one final thing, sort of on that corporate strategy. 48 00:02:24,560 --> 00:02:26,360 Speaker 1: You kind of sit between the market and the company, 49 00:02:26,400 --> 00:02:29,920 Speaker 1: being a CFO. Investors are looking at Telstra right 50 00:02:29,960 --> 00:02:32,160 Speaker 1: now, and certainly in the mobile space you've 51 00:02:32,200 --> 00:02:35,120 Speaker 1: done really well; other spaces are a bit more difficult. 52 00:02:35,160 --> 00:02:37,200 Speaker 1: What are investors talking to you about? 53 00:02:38,200 --> 00:02:40,400 Speaker 2: Yeah, well it is interesting, and we will get to AI. 54 00:02:40,440 --> 00:02:42,120 Speaker 2: As you said, they ask a lot 55 00:02:42,120 --> 00:02:46,760 Speaker 2: of questions about AI. Absolutely. But I think they're also 56 00:02:47,680 --> 00:02:53,280 Speaker 2: always trying to understand how our physical infrastructure plays into 57 00:02:53,360 --> 00:02:57,320 Speaker 2: our business. So that physical connectivity infrastructure that spreads out 58 00:02:57,360 --> 00:03:01,400 Speaker 2: all over the country, all those ducts and pits and 59 00:03:01,520 --> 00:03:05,240 Speaker 2: pipes and cables and towers and all of that, that 60 00:03:05,360 --> 00:03:07,440 Speaker 2: just sort of makes that magic happen that we all 61 00:03:07,639 --> 00:03:10,679 Speaker 2: experience when we pick up our phone. They really sort 62 00:03:10,680 --> 00:03:12,760 Speaker 2: of tease that out. You know, how big is that, 63 00:03:13,360 --> 00:03:16,760 Speaker 2: how big is that moat, how could that get disrupted 64 00:03:17,639 --> 00:03:19,880 Speaker 2: by different players? And I think they really make a 65 00:03:19,880 --> 00:03:22,960 Speaker 2: bet on that asset base that we have and just 66 00:03:22,960 --> 00:03:25,640 Speaker 2: how important that is for the economy and for communities 67 00:03:25,680 --> 00:03:27,800 Speaker 2: and for all of us in the way that we 68 00:03:27,840 --> 00:03:28,680 Speaker 2: connect every day. 69 00:03:28,919 --> 00:03:32,120 Speaker 1: Okay, let's talk AI.
I am sure that there are a 70 00:03:32,120 --> 00:03:34,520 Speaker 1: lot of boards and senior management teams around the country 71 00:03:34,639 --> 00:03:38,680 Speaker 1: who have invested plenty in AI, and they've probably said 72 00:03:38,720 --> 00:03:41,680 Speaker 1: to management, hey, we've got this great AI, now make 73 00:03:41,760 --> 00:03:44,760 Speaker 1: it work. Now, there's going to be a disconnect 74 00:03:44,760 --> 00:03:47,960 Speaker 1: there at times because of the world of AI, and I 75 00:03:47,960 --> 00:03:51,600 Speaker 1: think we're talking to you because you have really distinguished 76 00:03:52,200 --> 00:03:55,520 Speaker 1: between, I suppose, the reality of what you can do 77 00:03:56,480 --> 00:04:00,840 Speaker 1: and that vague promise of AI changing the world. 78 00:04:01,080 --> 00:04:04,440 Speaker 2: Yeah. And it's like a bet you make on any technology. 79 00:04:04,720 --> 00:04:07,200 Speaker 2: You know, first of all, you need to make sure 80 00:04:07,240 --> 00:04:08,920 Speaker 2: that the first principle of what you're doing is you're 81 00:04:08,960 --> 00:04:11,520 Speaker 2: aligning to your strategy and you're making sure that 82 00:04:11,800 --> 00:04:14,360 Speaker 2: the technology isn't the strategy. Your strategy is the strategy 83 00:04:14,360 --> 00:04:16,640 Speaker 2: and the technology supports it. And I think, you know, 84 00:04:16,680 --> 00:04:19,000 Speaker 2: that's been really important, one of the things we think 85 00:04:19,040 --> 00:04:21,719 Speaker 2: about with AI. We are making a bet on it. 86 00:04:21,760 --> 00:04:24,120 Speaker 2: We want to be an AI-first company. We want 87 00:04:24,160 --> 00:04:27,840 Speaker 2: to make sure that we're brilliant at utilizing AI. And 88 00:04:27,880 --> 00:04:30,279 Speaker 2: so there's probably two things that we think about. One 89 00:04:30,320 --> 00:04:34,240 Speaker 2: is AI companies are making an enormous amount of investment. I think 90 00:04:34,279 --> 00:04:36,880 Speaker 2: the estimate is over seven hundred billion dollars of capex 91 00:04:36,920 --> 00:04:39,919 Speaker 2: this year across those AI companies. That investment's going to 92 00:04:39,920 --> 00:04:43,080 Speaker 2: look for a return. And we experienced this with cloud, 93 00:04:43,320 --> 00:04:46,120 Speaker 2: so being really disciplined on our architecture, making sure that 94 00:04:46,200 --> 00:04:49,640 Speaker 2: we don't lock ourselves into single vendors, 95 00:04:49,680 --> 00:04:51,680 Speaker 2: that we set ourselves up so that we could swap 96 00:04:51,680 --> 00:04:54,560 Speaker 2: between different vendors, that we understand the costs of the 97 00:04:54,640 --> 00:04:58,159 Speaker 2: tokens and the storage and the cloud compute. So making 98 00:04:58,160 --> 00:05:00,640 Speaker 2: sure that we're really thinking about that deeply, and 99 00:05:00,680 --> 00:05:02,880 Speaker 2: we engage all over the world with experts to make 100 00:05:02,920 --> 00:05:05,960 Speaker 2: sure that we're really understanding where people think things are heading. 101 00:05:06,240 --> 00:05:09,920 Speaker 2: And then the second one is just around that balance 102 00:05:09,960 --> 00:05:12,520 Speaker 2: between, you want to sort of send people out and get 103 00:05:12,560 --> 00:05:15,240 Speaker 2: them to innovate.
So we did that by giving lots 104 00:05:15,240 --> 00:05:17,560 Speaker 2: and lots of people access to Copilot and training our 105 00:05:17,560 --> 00:05:20,080 Speaker 2: people with our AI academy. And you sort of let 106 00:05:20,080 --> 00:05:22,279 Speaker 2: a thousand fires burn for a while, but then you 107 00:05:22,320 --> 00:05:24,680 Speaker 2: have to start to focus down on the areas where 108 00:05:24,680 --> 00:05:27,159 Speaker 2: you really can see the value. And I mean we're 109 00:05:27,160 --> 00:05:31,320 Speaker 2: looking at, like everyone, customer engagement, how you deal with customers. 110 00:05:31,480 --> 00:05:33,480 Speaker 2: AI is a big opportunity there, and 111 00:05:33,480 --> 00:05:36,960 Speaker 2: we're seeing real benefits in call center handle times, in 112 00:05:37,839 --> 00:05:41,919 Speaker 2: people using generative AI chatbots, and more stuff happening on digital. 113 00:05:42,240 --> 00:05:44,680 Speaker 2: But we also look at our network operations and how 114 00:05:44,720 --> 00:05:45,640 Speaker 2: we run the network. 115 00:05:46,200 --> 00:05:46,839 Speaker 3: Can we make 116 00:05:46,760 --> 00:05:51,240 Speaker 2: faster decisions in the network, respond to alarms faster, optimize 117 00:05:51,279 --> 00:05:54,320 Speaker 2: our network in different ways as well. And then there's just 118 00:05:54,320 --> 00:05:57,560 Speaker 2: that whole area, and we've all heard this recently, around the 119 00:05:57,600 --> 00:06:00,159 Speaker 2: way it's fundamentally changing the way we develop software. We 120 00:06:00,200 --> 00:06:04,000 Speaker 2: spend about a billion dollars a year internally developing software, 121 00:06:04,360 --> 00:06:07,320 Speaker 2: and we're seeing significant changes in that 122 00:06:07,320 --> 00:06:09,280 Speaker 2: space as well, as people use it to do coding 123 00:06:09,360 --> 00:06:11,520 Speaker 2: and speed up the way they do coding, and we've 124 00:06:11,520 --> 00:06:13,200 Speaker 2: seen that with a lot of software companies as well. 125 00:06:13,279 --> 00:06:17,560 Speaker 2: So: disciplined on the infrastructure we set up to exploit AI. 126 00:06:17,600 --> 00:06:19,200 Speaker 2: We're not an AI company, we're going to be an 127 00:06:19,240 --> 00:06:23,600 Speaker 2: AI-first company using it. And then disciplined in developing 128 00:06:23,640 --> 00:06:26,040 Speaker 2: capabilities in our people, but then being focused on where 129 00:06:26,080 --> 00:06:28,520 Speaker 2: we think we can get the benefits. 130 00:06:28,520 --> 00:06:31,200 Speaker 1: I get what you're saying. Let's talk about the first one, 131 00:06:31,240 --> 00:06:34,560 Speaker 1: I mean the chatbot example, to begin with. We 132 00:06:34,600 --> 00:06:36,400 Speaker 1: all go to ChatGPT, or maybe we don't, but most 133 00:06:36,400 --> 00:06:38,520 Speaker 1: of us go to ChatGPT now and think, I've 134 00:06:38,520 --> 00:06:41,279 Speaker 1: got a headache, what does it mean? It's kind of a 135 00:06:41,320 --> 00:06:43,440 Speaker 1: similar sort of thing. I have a problem with a 136 00:06:43,440 --> 00:06:46,560 Speaker 1: mobile or a fixed landline, and I go to 137 00:06:46,600 --> 00:06:49,000 Speaker 1: ChatGPT and it helps, or I go to Telstra using 138 00:06:49,279 --> 00:06:52,680 Speaker 1: one of the large language models and it helps. How 139 00:06:52,720 --> 00:06:55,080 Speaker 1: good is that right now?
Because I'd say five years 140 00:06:55,080 --> 00:06:58,600 Speaker 1: ago or two years ago, if I rang Telstra and 141 00:06:58,800 --> 00:07:01,400 Speaker 1: ended up in a queue, I'd be frustrated. They'd take a 142 00:07:01,400 --> 00:07:03,800 Speaker 1: long time. How much difference has that made? 143 00:07:04,240 --> 00:07:05,800 Speaker 3: Yeah. So there's two things. 144 00:07:05,800 --> 00:07:08,240 Speaker 2: One is, we've got a chatbot that supports our call 145 00:07:08,240 --> 00:07:12,040 Speaker 2: center agents when they're talking to you, and we've seen, 146 00:07:13,120 --> 00:07:16,640 Speaker 2: you know, probably a twenty to thirty percent reduction 147 00:07:16,680 --> 00:07:20,360 Speaker 2: in answer times on like-for-like calls, as the agents 148 00:07:20,680 --> 00:07:22,880 Speaker 2: are better at answering the questions and they're more accurate 149 00:07:22,880 --> 00:07:25,800 Speaker 2: in answering the questions. But we had to clean up 150 00:07:25,800 --> 00:07:28,520 Speaker 2: a lot of our data to make that chatbot work. 151 00:07:28,960 --> 00:07:30,800 Speaker 2: And then the other one that we only launched recently 152 00:07:30,960 --> 00:07:34,720 Speaker 2: was our first generative AI chatbot for customers. That's at 153 00:07:34,720 --> 00:07:36,480 Speaker 2: the moment through the website. It will be launched through 154 00:07:36,520 --> 00:07:40,200 Speaker 2: the app soon. And that compares to the old chatbots 155 00:07:40,200 --> 00:07:43,080 Speaker 2: that we experienced, which were more machine learning, sort of 156 00:07:43,120 --> 00:07:47,720 Speaker 2: linear thinking, which weren't great for many of us. If 157 00:07:47,720 --> 00:07:50,040 Speaker 2: you got stuck in the wrong doom loop, you couldn't 158 00:07:50,040 --> 00:07:51,720 Speaker 2: get out of trouble. So what we've seen with the 159 00:07:51,800 --> 00:07:57,440 Speaker 2: generative AI chatbot with customers is we've seen the containment rates, 160 00:07:57,440 --> 00:07:59,120 Speaker 2: so the number of people who resolved their issue in 161 00:07:59,160 --> 00:08:03,800 Speaker 2: there, essentially triple. So that's really exciting. 162 00:08:03,840 --> 00:08:05,360 Speaker 3: We've gone from probably, 163 00:08:05,760 --> 00:08:08,280 Speaker 2: you know, fifteen percent of people who were able to 164 00:08:08,320 --> 00:08:10,400 Speaker 2: satisfy themselves within the chatbot and didn't have to go 165 00:08:10,440 --> 00:08:13,160 Speaker 2: to a person, to now about forty five percent with 166 00:08:13,320 --> 00:08:15,760 Speaker 2: generative AI, and I think that'll get better. We're 167 00:08:15,800 --> 00:08:19,480 Speaker 2: all experiencing that when we're using ChatGPT or 168 00:08:19,560 --> 00:08:23,560 Speaker 2: Claude or Microsoft Copilot. We're seeing that it's getting better 169 00:08:23,600 --> 00:08:26,080 Speaker 2: and better, and we're also seeing that in our ability 170 00:08:26,080 --> 00:08:27,680 Speaker 2: to apply it with customers. 171 00:08:28,200 --> 00:08:31,280 Speaker 1: At the enterprise level then, how do you know where 172 00:08:31,360 --> 00:08:35,520 Speaker 1: AI will work for you and where it can add value? 173 00:08:36,240 --> 00:08:37,920 Speaker 1: Because there are a lot of areas where it won't 174 00:08:37,920 --> 00:08:38,320 Speaker 1: work for you, I'm guessing.
175 00:08:38,360 --> 00:08:41,600 Speaker 2: Yeah, absolutely. And this is where it's so 176 00:08:41,679 --> 00:08:47,000 Speaker 2: important too. The most important capability with AI is your 177 00:08:47,040 --> 00:08:49,600 Speaker 2: own people, who know your processes, who know your business. 178 00:08:50,280 --> 00:08:53,320 Speaker 2: And so that's why the first step we took on AI, well, 179 00:08:54,040 --> 00:08:56,200 Speaker 2: the first step was we got very focused on cleaning 180 00:08:56,280 --> 00:08:58,480 Speaker 2: up our data. We thought that was really important, simplifying 181 00:08:58,480 --> 00:09:03,559 Speaker 2: our environment. But then training our staff. We've partnered with Accenture, 182 00:09:03,559 --> 00:09:06,720 Speaker 2: which has been well publicized, with a joint venture with 183 00:09:07,600 --> 00:09:09,800 Speaker 2: twelve or thirteen hundred people in that joint venture, who 184 00:09:09,800 --> 00:09:12,240 Speaker 2: are our employees, and our employees are going after this. 185 00:09:12,320 --> 00:09:16,760 Speaker 2: But across our sort of twenty-thousand-strong workforce, it's 186 00:09:16,840 --> 00:09:19,920 Speaker 2: training in AI, giving them access to a corporate-level 187 00:09:19,960 --> 00:09:23,360 Speaker 2: Copilot so that they can try it. And I really believe 188 00:09:23,400 --> 00:09:25,200 Speaker 2: that it's our people who are going to tell us 189 00:09:25,240 --> 00:09:27,920 Speaker 2: whether or not AI is going to work for those 190 00:09:27,960 --> 00:09:31,320 Speaker 2: processes. And it's only by getting more and 191 00:09:31,360 --> 00:09:35,040 Speaker 2: more of your staff engaged, learning, experimenting, both privately but 192 00:09:35,080 --> 00:09:37,560 Speaker 2: also in the workplace, that I think they're going to 193 00:09:37,559 --> 00:09:38,800 Speaker 2: tell us what works and what doesn't. 194 00:09:40,040 --> 00:09:43,600 Speaker 1: You've talked about overcapitalizing on AI as well. How 195 00:09:43,640 --> 00:09:46,120 Speaker 1: do you know whether you're overcapitalizing on AI? 196 00:09:46,960 --> 00:09:49,719 Speaker 3: Well, like most overcapitalizing experiences, you 197 00:09:49,679 --> 00:09:53,760 Speaker 1: know afterwards. Hindsight is twenty-twenty. 198 00:09:53,800 --> 00:09:56,439 Speaker 3: Hindsight. Hindsight is a wonderful thing. 199 00:09:56,520 --> 00:10:02,440 Speaker 2: I mean, it's absolutely the crunchy issue, because, you 200 00:10:02,520 --> 00:10:04,000 Speaker 2: know, it's a classic: you don't want to move 201 00:10:04,040 --> 00:10:06,520 Speaker 2: too slow, you don't want to move too fast. And 202 00:10:06,559 --> 00:10:08,280 Speaker 2: I think the thing that we're focused on, and I'll go 203 00:10:08,360 --> 00:10:10,960 Speaker 2: back to that, is being really disciplined on the architecture 204 00:10:11,000 --> 00:10:12,960 Speaker 2: we set up.
So one of the things we've done 205 00:10:13,840 --> 00:10:17,480 Speaker 2: within our Accenture JV is to start to look 206 00:10:17,480 --> 00:10:19,640 Speaker 2: at what we're calling an AI control plane, which is 207 00:10:20,520 --> 00:10:25,240 Speaker 2: giving us greater insight into what's the right 208 00:10:25,559 --> 00:10:28,679 Speaker 2: LLM to use for what outcome, to manage and track 209 00:10:28,760 --> 00:10:31,599 Speaker 2: token costs, which could get out of control, to be 210 00:10:31,640 --> 00:10:34,720 Speaker 2: able to plug in and plug out of things when 211 00:10:34,720 --> 00:10:36,720 Speaker 2: they're either working or not working, so that we can 212 00:10:36,800 --> 00:10:40,839 Speaker 2: control that cost curve of capitalization as we go forward. 213 00:10:40,880 --> 00:10:44,160 Speaker 2: But I think the point is you won't know, and 214 00:10:44,360 --> 00:10:46,719 Speaker 2: the real trick is to keep yourself as flexible as 215 00:10:46,720 --> 00:10:49,720 Speaker 2: possible so that you can react as you learn more. 216 00:10:50,360 --> 00:10:52,600 Speaker 1: Yeah. And so the point there is not being 217 00:10:52,640 --> 00:10:55,400 Speaker 1: tied to one supplier for the next five years, but actually 218 00:10:55,679 --> 00:10:57,040 Speaker 1: moving between them as necessary. 219 00:10:57,200 --> 00:11:00,760 Speaker 2: Yes, absolutely, keep that commercial tension, that optionality. 220 00:11:01,040 --> 00:11:03,800 Speaker 1: Okay. One final question, and it always comes back to staff, 221 00:11:03,800 --> 00:11:06,959 Speaker 1: and Telstra being such an iconic company: how will 222 00:11:07,040 --> 00:11:09,120 Speaker 1: you manage that? There will be job losses, as there 223 00:11:09,120 --> 00:11:10,719 Speaker 1: have to be, but there'll also be job gains on 224 00:11:10,760 --> 00:11:13,840 Speaker 1: the other hand in that area. How do you manage 225 00:11:13,840 --> 00:11:14,720 Speaker 1: that as a company? 226 00:11:15,679 --> 00:11:15,880 Speaker 3: Yeah. 227 00:11:15,880 --> 00:11:19,120 Speaker 2: Absolutely. And what we've focused on is training staff, 228 00:11:19,160 --> 00:11:22,720 Speaker 2: giving staff that capability so that they're more capable 229 00:11:22,280 --> 00:11:23,719 Speaker 3: in an AI world and an AI era. 230 00:11:23,760 --> 00:11:26,560 Speaker 2: And I think that's all you can do, is help 231 00:11:26,600 --> 00:11:31,679 Speaker 2: prepare people to be as capable of responding to what 232 00:11:31,720 --> 00:11:33,120 Speaker 2: will be a new environment 233 00:11:33,080 --> 00:11:33,640 Speaker 3: as you can. 234 00:11:34,400 --> 00:11:38,560 Speaker 2: Obviously, any decision you make where staffing levels move up 235 00:11:38,640 --> 00:11:42,360 Speaker 2: or down is something that is taken incredibly seriously. And 236 00:11:42,440 --> 00:11:45,280 Speaker 2: you know, we've got, as I said, as a company, 237 00:11:45,679 --> 00:11:48,920 Speaker 2: brilliant physical assets and unique physical assets that are core 238 00:11:49,040 --> 00:11:52,280 Speaker 2: to our competitive advantage. But we've also got, you know, 239 00:11:52,559 --> 00:11:56,480 Speaker 2: tens of thousands of really passionate staff that are really 240 00:11:56,559 --> 00:11:59,480 Speaker 2: core to our competitive advantage as well.
And we recognize 241 00:11:59,480 --> 00:12:02,280 Speaker 2: that history of what Telstra has been for people and 242 00:12:02,280 --> 00:12:04,080 Speaker 2: what it will continue to be for our staff. And 243 00:12:04,120 --> 00:12:09,000 Speaker 2: so investing in those capabilities I think is sort of 244 00:12:09,000 --> 00:12:11,240 Speaker 2: the best lever that companies can pull to make sure 245 00:12:11,280 --> 00:12:14,240 Speaker 2: that, one, as I said before, they're better at using 246 00:12:14,280 --> 00:12:16,120 Speaker 2: the AI, because if your people know how to use it, 247 00:12:16,360 --> 00:12:19,200 Speaker 2: they're going to be better at identifying the opportunities, but 248 00:12:19,280 --> 00:12:22,000 Speaker 2: second, setting them up in the best way for the future. 249 00:12:22,400 --> 00:12:24,000 Speaker 1: Michael, thanks for talking to Fear and Greed. 250 00:12:24,320 --> 00:12:25,840 Speaker 3: Absolute pleasure. Thanks Sean. That was 251 00:12:25,840 --> 00:12:28,040 Speaker 1: Telstra CFO Michael Ackland. I'm Sean Aylmer, 252 00:12:28,080 --> 00:12:29,959 Speaker 1: and this is Fear and Greed Q and A.