Speaker 1: So this is the newest skill. It's very forward facing. It's a skill that would not have existed two years ago.

Speaker 2: That's Tim Duggan, best-selling author of Work Backwards, and someone who spends a lot of time thinking about how work is evolving. And if you're looking to get a promotion this year, or make some big steps up the career ladder, can I suggest that simply working harder might not be the thing that gets you there? I know, slightly controversial. LinkedIn recently released data showing that seventy percent of the skills we will need at work over the next five years are about to change, which means doubling down on what you're already good at might not be the safest strategy. So I invited Tim on to unpack the five skills that will actually make you more promotable this year, and they're quite possibly not the ones you're spending time on. So if you've been feeling the pace of change and wondering to yourself, am I keeping up? By the end of this episode, you'll have at least one clear skill to focus on building next.

Welcome to How I Work, a show about habits, rituals, and strategies for optimizing your day. I'm your host, Doctor Amantha Imber.

Tim, I wanted to chat. We've known each other for a couple of years now, and I was really struck by one of your columns that you recently wrote for the Sydney Morning Herald, titled "Want to land a promotion this year? Here are five key skills that you will need." And I think that this was inspired by a statistic that LinkedIn released around how the world of work and the skills that we need are changing. Can you tell me about that?

Speaker 1: Sure thing. And can I just say how wonderful it is to be back. I always really love chatting with you, both on and off the microphone. There is so much upheaval happening at the moment, and it feels like, you know, you talk about it, I talk about it.
But it does feel like now we're starting to see a bunch of these data points actually come to life, and one of them was some data from LinkedIn where they looked at the skills that we're going to need for work over the next five years. So what they actually found, and this was a bit alarming for me, was that seventy percent of the skills that we're going to need at work are about to change, so they're not the skills that we currently have. And it made me really go down a rabbit warren of figuring out what are these skills that we're going to need, because you and I think a lot about how we're going to work in the future, and it turns out that there's a bunch of skills which we're actually not really that well prepared for at the moment.

Speaker 2: So I want to go through the five skills that you wrote about. And the first skill that you talk about is judgment, and I would love to know, like, what does good judgment look like at work?

Speaker 1: Judgment's one of these ones that you'd never normally really think about. I think most of us assume that we have good judgment. Most of us assume that we know what we're talking about, that we can decide if something is good or something is bad. And what we're actually seeing, particularly with AI at the moment, and a lot of these skills, surprise surprise, have to do with the impact that AI is having on the workforce, and judgment is one of those. Because with all of the work that our agents are starting to do for us, we assume we are the ones that look at this work. We obviously put prompts in and we get outputs from it, and agentic AI completes tasks for us. But we're the ones that then have to decide: is this good? Is this a hallucination? Should I be using this? Should I be forwarding this to my boss?
A very, very simple example of this is some of the AI that's filtering now into Gmail and into Outlook and Microsoft, so things like Gemini, where it says, I can compose this email for you, and sometimes, without you even asking, it gives you the text of what you should say. And judgment is about deciding: should I do this? Is this the right person to send this to? And is this even good? Is this what I want to say? So all of these things are skills that we haven't really had to think that actively about before in the workplace.

Speaker 2: This one really scares me, because I look at how the number of grad jobs is being reduced across a lot of industries, a lot of organizations. And when I think about judgment, I think about, well, that's experience. That's doing things over and over again, and then doing different things over the course of years or maybe decades, and through those experiences you build up your judgment muscle. But there's obviously got to be another way, because that's simply not going to work when a lot of more junior jobs are now being done by AI. So do you have any thoughts or strategies that you think about when it comes to how we can build our judgment muscle?

Speaker 1: Yeah, and judgment and experience are interrelated, but they're not the exact same thing. So I think you can get better at judgment through experience: the more you do something, the more you start realizing, is this a good idea or not. However, when we talk about AI, all of us have roughly the same amount of experience at the moment. AI is still that new that if someone says that they are an AI expert, they're just someone who I think has been paying attention and has consciously decided to do that in recent years. So the most interesting part about this is you can increase your judgment levels by doing a lot of this stuff.
At the moment, I think most people are kind of at around the same level. If you think about the skill set of how we think about AI, no one's really that far ahead, because no one has had years and years and decades of experience, and that's actually quite rare when you think about it in terms of the skills that we're going to need. So I see that as a real positive, because it means no one is that far ahead that you can't catch up at the moment.

Speaker 2: I think that's super encouraging. Something I think about, you know, in terms of how we can improve our ability to judge something, or at least think critically: a couple of tools that I will often come back to. First is, I love a good pre-mortem, particularly when I'm making a big decision or my team is on the cusp of making a big decision. We will, and certainly have quite often at Inventium, run pre-mortems where we imagine that we've done this thing, and then it's like three or six or twelve months out into the future, and it's either been a total success or a total failure, and we think about, okay, what are the conditions and what has happened that has led to either one of those outcomes? And then often when we think through that, we can work backwards and either go, actually, I don't think we should do this, I don't think we should implement this idea, or we can improve it through projecting ourselves into the future.

Speaker 1: I adore the idea of the pre-mortem as well. I'm currently writing my newest newsletter for this month and it's actually on the idea of pre-mortems. And it's because I recently held a big workshop for twenty people that I'm mentoring, and as part of that, everyone has goals, and so they've got these twelve-month goals and what they want to get to.
And what we did was a pre-mortem exercise to say, if we're here in twelve months' time and you haven't hit your goals, what is the most likely reason for that? By doing that, all of a sudden everyone got out what their biggest obstacle was, what their biggest fear was. And in doing that, you can then start figuring out, okay, these are the things I need to do in order to get around that. The way that I normally think about pre-mortems is normally in a negative way, thinking about, if this is going to fail, this is why it failed. So I find it interesting that you think about it in a positive way as well: how could we succeed?

Speaker 2: Yeah, I remember we ran one many years ago before we implemented our four-day week. So we've run a four-day week at Inventium for about five and a half years now.

Speaker 1: You are the longest-running four-day week that I know of, Amantha.

Speaker 2: I think we were the first in Australia to implement it, so that's kind of cool. But yeah, I remember we got the whole team together and we did a pre-mortem, and we looked at, what if it's a really big success, and what if it's a really big failure? And I still remember that, because it then influenced the guiding principles that we still have to this day around the four-day week. Something else I love, and I think I've learned this... just before we started recording, we were talking about a few authors that we like, and the Heath brothers, Chip and Dan Heath, came up, and one of my favorite books of theirs is Decisive. And I'm pretty sure that this comes from Decisive. But when we've got a big decision to make, and again, we're still sort of talking about judgment and how we can improve how good our critical thinking is, particularly around decisions, it's about not just making either/or or whether-or-not decisions.
So a whether-or-not decision is: should I do the thing, or should I not do the thing? And typically those kinds of decisions fail, I think, about fifty percent of the time. But they're really, really common decisions. And so what Chip and Dan Heath talk about is thinking about, well, what are some other plausible alternatives, so that you're not making whether-or-not decisions? You're always deciding: should it be this thing, or should it be that thing, or that other thing? And that's something I will often come back to when I'm making decisions. I'll catch myself if it feels like a whether-or-not decision, and if it is, then I'll spend more time thinking about, well, what are some other really reasonable alternatives?

Speaker 1: I love that. That's excellent. I haven't read that. I've read a lot of Chip and Dan's stuff, but I haven't read Decisive. And now that makes me want to pick up the book and find more insights like that.

Speaker 2: It's so good. Okay, let's move on to skill number two, which is storytelling. So tell me, what does it mean to be good at storytelling in the work context?

Speaker 1: So storytelling is this really interesting one. And some of these skills might sound like they're soft skills, but they're actually really hard, and really useful once you can figure them out. And storytelling came from a report The Wall Street Journal wrote a few weeks ago, where they looked at the number of job listings with the word storyteller in them, and they found that in the US alone it had doubled in the last year. So there are now about seventy thousand jobs that have the word storyteller somewhere in them. And once again, a lot of these are reactions to things happening in AI, where the whole point of some of the content and some of the ideas that get produced is that they are the median, or they are the average, or they are the most expected.
So storytelling is, how can you find a way through all that noise? When there is so much noise out there, how do you find a way through? It might be telling your company's story in a marketing sense, it could be telling your own personal story within a business, or it could be helping the CEO tell their story. There are lots of different ways that storytelling abounds, but it's a really fascinating and quite hard skill to master.

Speaker 2: I love this one. And I've got to do a shout-out to Matthew Dicks, who I think is one of the masters of storytelling out there. I interviewed him on How I Work maybe three or four years ago; I'll link to that episode in the show notes, because I think that it's really hard to craft a good story. And in the last few weeks, I've been working on a new keynote around psychological safety, because a particular company had wanted me to speak to, I think, the top two hundred leaders around how to build psych safety. And it was funny going through the process of creating a new keynote, because aside from, obviously, the science and, you know, what's been shown in the research to drive psych safety, I knew that in order for the keynote to land and to actually connect with these leaders, I needed to share stories from my own life. And I think that was the most time-consuming part of constructing that keynote. It wasn't the knowledge around, okay, what drives psych safety and what are some, you know, really practical science-based tools that I can share, but thinking about, what are the best personal stories that I can share that will help connect with the audience on an emotional level? And then digging even deeper, going, okay, how do I best construct each of these stories? How do I create stakes so that the stories are not just anecdotes? I would love to know, for you, Tim, do you have any go-to frameworks or tools or anything like that when you think about constructing stories?
I feel like you're a great storyteller, particularly in your books. I feel like you're quite masterful at that.

Speaker 1: That's funny. I'm currently writing my fourth book at the moment, and I went through the exact same thinking that you did with your keynote, which is: you've got to have some personal stories in there, how this affects you, but how do you actually tell them in a way that's relevant and interesting, and has a beginning, a middle, and an end? There are lots of different frameworks out there. The Hero's Journey is a really great example of a framework that people can look up. But when I think of storytelling, the simplest thing that I think of, and the way that I always try to teach people storytelling, is that storytelling is best when it's shared from one person to another person. So if you're going to give someone a story, how are they going to then tell it to the next person, who tells it to the next person, who tells it to the next person? So the simplicity of an idea, the ability to crystallize something down to its simplest form, because I actually think that storytelling is all about how you can communicate in the simplest way possible. So anytime I am writing a book and thinking of a story, or creating a keynote and trying to come up with a story, I think about that person in the audience, or that reader: how are they going to tell it to the person next to them? And that's the way that I really try and make it simple. To do that, I get rid of complicated things. I make sure that there's no jargon that people don't understand. I make sure that it has really simple, clear messages all the way through. And that's kind of one of my tests.

Speaker 2: I like that one. Okay, let me share one that I think is really simple. It's so easy to remember, and I'll often come back to it when I'm trying to construct a story.
And you might have come across this one. It's called And, But, Therefore. How it works is you start with the "and": this happened. What is the scenario, the world that we're building here? Then the "but" is where you introduce the tension, because stories need tension, otherwise they are boring. And then you finish with the "therefore", because stories need an outcome; they need a moral, they need a lesson. So with "and" we introduce, you know, the world, the scenario; with "but" we introduce the tension and the stakes, which obviously need to get resolved; and then the "therefore" is the take-home from that story. I think that that's just a super simple little framework to come back to.

Speaker 1: That's great. I love the simplicity within that. The best ideas are the ones that you can just really pass on to someone. And they're really memorable as well, which is why things like acronyms are so important, because acronyms are a way of being able to remember things easily, when you can spell something out and be like, ah, that's that method: it's the four-letter method, it's, you know, the LIFE method, the HEART method, or whatever it is. I'm a big, big fan of acronyms, so much so that my monthly newsletter is called OUTLET, which stands for One Useful Thing, Literally, Every Time.

Speaker 2: That's cool.

Speaker 1: There are acronyms throughout my entire career. I always bring them in.

Speaker 2: That's awesome. OUTLET. Coming up next, Tim and I discuss three other skills that are going to be critical if you want to get ahead in your career this year. We're going to be talking about how to actually work with AI without outsourcing your brain. We're going to get into this idea of unlearning, which, quite frankly, most of us resist because it's really hard.
And we finish on Tim's favorite skill of them all, conflict and how to manage it, and Tim shares the one tiny word that has completely changed how he manages any kind of conflict in his own life.

Speaker 4: If you're looking for more tips to improve the way you work and live, I write a short weekly newsletter that contains tactics I've discovered that have helped me personally. You can sign up for that at Amantha dot com. That's Amantha dot com.

Speaker 2: Okay, let's move on to skill number three, which is collaborative intelligence. What does this mean, Tim?

Speaker 1: Yeah, so this is probably the newest skill. It's very forward facing. It's a skill that would not have existed, I would hesitate to say, two years ago, with the speed at which things are changing. And this is what Jim Wilson from Accenture calls the missing middle, which is about collaboration between the robots, meaning AI and agentic agents and all the tools that everyone listening to this is using, because they listen to you and they know how much you use them and how helpful they are, and the humans. That link between the robots and the humans, that kind of collaboration that happens between AI and humans, is really one of the new skill sets that people are looking for. And part of this, the way I think about it, is that it's a little bit about identity and about how we see AI. If you only see it as a threat, then you're not going to take most advantage of this collaborative intelligence, whereas if you see it as an opportunity, an idea that one plus one equals three, or one plus one might be three hundred soon, the way that AI development is going. Collaborative intelligence is really the intelligence that we are going to need to collaborate as effectively as we can with AI.
Speaker 2: How do you think about this in your work, and how you see AI as a thought partner or someone you delegate to? Like, how do you conceptualize it?

Speaker 1: Yeah, I think about it a lot, because I use it a lot. I'll give the example of the book writing, because it's been the big focus of my time. This was the first book, out of the three books I've written so far, where AI was extremely useful as a research and inspiration partner. Extremely useful, almost to the extent that I now find it hard to imagine; I could go back to writing how I used to research, but not in the same way. My red line is that I don't let AI write anything. I still want that to be my own words, my own ideas, my own way of speaking, because I really enjoy the human element of that. But as far as a research assistant goes: the ability to comb through research reports, to find obscure pieces of information and data that I would not have been able to find anywhere else, to sense check. However, going back to one of the earlier skills, judgment: I have collaborative intelligence because I collaborate with it, but the judgment I apply is that every single thing it gives me, I still go back and double-check. So especially research, especially reports that it kind of gives: if it says seventy-six percent of people believe this, I need to go back and read it and find it and make sure that's actually what it's saying, because that's the judgment part, which I think is really important, as opposed to letting things go through without deciding, is that actually correct? I think when you write books, you have a real sense of responsibility that you need to make sure that everything in there is as accurate as it can be.

Speaker 2: Oh my gosh, one hundred percent.
I also used it, for book number five, a lot as a research assistant. I was probably working with it for, I don't know, two hours a day just on research. Like, firstly, in terms of uncovering new studies, using a combination of Consensus dot app, which is fantastic at uncovering academic research when you know what keywords to type in, but then also using deep research. I think I'd mainly go to Perplexity, Gemini and ChatGPT, with sort of Perplexity's deep research.

Speaker 1: For book writing, I actually saw something you shared recently talking about that, and it made me... I used the same thing, Perplexity's deep research, for book research. Really good.

Speaker 2: Yeah, also I find Gemini's deep research is really good as well, better than ChatGPT's, I found. But then, I mean, there's so much verification that needs to be done when you're using AI for research. And even, I've found, when you feed it a PDF and ask it to summarize the results or the statistical analysis, even though the numbers are right there, you still have to verify everything. Even when you're feeding it the source, it's still hallucinating sometimes, I find.

Speaker 1: I think that skepticism is really important, and you need that, and that's judgment. So we're both learning that skill as we're doing it. For people listening to this who want to figure out how to get better at this skill: use AI as much as you can, but don't believe everything it says. Double-check it, you know, figure out if it's correct. Fascinating to think, then... I was recently back and forth to Perth from Sydney three times in three weeks, so it was about thirty hours on the plane just back and forth, and it was all writing time. It was actually excellent, because I love writing on planes. And even though there is internet on the plane, I kind of didn't use the internet, because I wanted to be in a bit of my own bubble.
And that was fascinating, because that was thirty hours without AI, thirty hours without my research assistant, and it was a different type of writing. That, for me, was really important, because it was me doing a bit of the storytelling, a bit of the crafting, a bit of the joy that I kind of get from doing that. Whereas, I think, when you have an AI prompt next to you, there is the temptation to just constantly ask it questions, and to almost become a bit too factual and lose some of the creative side of that. So part of collaborative intelligence is knowing when you should use AI, when you shouldn't use AI, and how to use it best. And, Amantha, the fact that you shared on LinkedIn recently all the different tools that you're using for different things, that is an example of collaborative intelligence.

Speaker 2: I find that one of the ways that most of my workflows have changed, when I think about how I'm interacting even just with my computer, is that, of the time that I'm inputting stuff into the computer, I reckon I'm probably about fifty percent voice and fifty percent typing, whereas I would say a year ago I was one hundred percent typing. And one of my pet hates is receiving an AI-written email that's had minimal or no human judgment, which is becoming very, very common. I loathe that; it just makes me want to vomit on the email. But I find that, you know, because we work with thousands of people, training them on AI skills, it's probably the number one use case: if people are using AI for anything, it's writing emails. Which is then problematic, because their skills are generally pretty basic if people just have one use case for using AI.
And so my kind of go-to now, unless it's quite a long and complex email and I'm using AI to work with me on what I want to say, or to make what I want to say more succinct: generally my workflow, and this is with emails as well as some other forms of writing, is I will just use Wispr Flow, where I press the function key on my computer and I just speak the email, and then I edit it. And it sounds like me; it's what I want to say. I think I've learned to speak fairly succinctly, but it could still be more succinct. And that is such a change. And I feel like with collaborative intelligence and how we communicate with other humans, a lot of that is, how do we bring some humanness into what we are trying to say to other people? So I love that you're still doing the writing and that's a hard line for you, and I hear that a lot from people that are sort of writers first, if that makes sense.

Speaker 1: Yeah, it's interesting. I think it's important that everyone has different lines, and that the lines might change over time. But right now, I think so much when I write, and I think I've trained my brain to think at the speed at which I type, and I edit in my head as I type, and I can move words around, and that's how I process my thoughts and my emotions, and for me that's really useful. I find talking a really interesting way of... it's almost, if you think about divergent and convergent thinking: for me, talking is kind of divergent, because all these ideas come out, and then writing is convergent, when I decide, no, that's my take on this. And I love the fact that when I do write, it helps the world make sense, it helps me make sense, it helps my own thoughts make sense. I find all of that really important.
Speaker 2: So let's talk about skill number four, which is unlearning, which is kind of counterintuitive, because organizations spend a lot of money on learning and development. So what is unlearning?

Speaker 1: So this is one of these skills that is probably the most ethereal of them all. It's probably the hardest to actually figure out how you really upskill in it. But the concept of unlearning comes from the fact that things are changing so much. Every time you look, there is a new tool, there are new ways of working. I think all of us are feeling this acceleration that's happening. And unlearning is about the ability to forget what you think you know, and really to be able to roll with the punches about what is coming up next. So it's a really interesting skill, and a lot of it is really a mindset shift, which is that some of these things are new and some of them are a bit scary, and are you going to see them as an opportunity, or again see them as a threat?

Speaker 2: So, as humans, we're programmed to resist unlearning. We've got biases like confirmation bias, which, you know, really have us looking for things that confirm our worldviews. So, Tim, I'm curious, do you have any habits or strategies that you use to deliberately unlearn in your own life?

Speaker 1: Yeah, it's a really hard one to teach someone, how to unlearn, and I think it's something that we should, number one, be conscious of when we come across new things: am I just holding onto the old way of doing things because that's all I know? Something that's really helped me is I am a proud optimist. I consider myself, I think even on my LinkedIn profile it says, an optimist who loves thinking about big ideas. And it's something I really got from my parents, and my dad in particular.
And I love that, because it means when I see something new, or if I see a bit of an obstacle in my way, I generally tend to think about the positive that comes from that. And unlearning is really a bit of an optimistic character trait, because it's about looking at something and, instead of thinking, oh God, I need to now learn this whole new thing all over again because I thought I knew how something worked, unlearning is the ability to really focus on what the positives are in figuring something new out for the first time, and unlearning the way you've done it in the past.

Speaker 2: Now, I've saved your most favorite skill for last, which is conflict management. Tell me, why is this your favorite one, Tim?

Speaker 1: I mean, I know it's not fair to have favorite skills. It's like choosing a favorite child. However, I do have a favorite skill, and I think it's my favorite because it's something that I have historically not been the best at. It's something that I think most of us are historically not the best at. And it's conflict. It's learning what happens when two humans, in all of our vulnerabilities and all of our expectations and hopes and intentions and miscommunications, get in a room together and we conflict with each other. And it's something that I've been thinking about a lot. So I've spent the last few years researching, traveling the world, writing about this, speaking to people from professors at Oxford University through to producers of The Real Housewives, to try and understand what conflict is and how we can be better at it. And the reason it's important is that conflict is everywhere we look, particularly in a workplace. It's estimated that we spend about three hours a week, every single week, dealing with conflict, so it has this real cost. Yet most of us are really bad at it. Most of us are not taught the skills.
Most of us are conflict avoiders, somewhere between sixty and ninety percent of people, depending on which study you look at. And the reason why we do that is because it is hard, and it's messy, and it's difficult. And the way that technology is going, we are interacting more and more on screens, with computers, with AI, and we're removing a lot of that conflict that exists in our life. So it's one of these skills where, with where everything is going, if you fast forward through the next ten years, the ability to interact with other humans is going to be something that's really prized, not only in the workplace but in your life.

Speaker 2: So you've done all this research into getting better at conflict, and you've spoken to, I mean, I can't believe you've spoken to a producer from The Real Housewives. I mean, that would have been gold. Tell me, what would be one or two of the most powerful strategies that you've learned for helping someone like me, who is not great at conflict and who tries to avoid it but knows that that is not a good strategy? Like, what can I do, Tim, to get better at it?

Speaker 1: Yes, there's a bunch of things that you can do, and number one is, I have a new book coming out, and it doesn't come out until October, so you might need to wait until then. Continue to be bad at conflict until the book comes out. No, I think one of the best things you can do is be aware of what your conflict style is. So generally there's two types of people: there's conflict seekers and conflict avoiders. And most people tend to fall or evolve into conflict avoiders, and it's because conflict seekers are a rare breed of people who have sometimes unhealthily leaned into that. There is this middle ground of people who are potentially not conflict avoiders or conflict seekers, but somewhere in the middle of the two. So I think having clarity around what type of person you are in conflict.
606 00:31:12,960 --> 00:31:16,280 Speaker 1: Are you someone who avoids conflict or seeks conflict? That 607 00:31:16,400 --> 00:31:21,280 Speaker 1: is really important. Knowing where your conflict style sits is really important, 608 00:31:21,720 --> 00:31:23,360 Speaker 1: knowing where you want to 609 00:31:23,360 --> 00:31:25,560 Speaker 3: get to, and knowing how to get there. 610 00:31:26,200 --> 00:31:28,480 Speaker 1: Those are some of the big questions that I have 611 00:31:28,560 --> 00:31:33,080 Speaker 1: been really thinking about for the last few years, and 612 00:31:33,200 --> 00:31:36,920 Speaker 1: it all basically comes down to when your amygdala takes over. 613 00:31:37,080 --> 00:31:41,040 Speaker 1: When you're in conflict and you have these emotions rising 614 00:31:41,120 --> 00:31:46,360 Speaker 1: inside you, you really have to try to have some 615 00:31:46,440 --> 00:31:49,880 Speaker 1: kind of clarity, some kind of intellectual awareness, some kind 616 00:31:49,880 --> 00:31:52,360 Speaker 1: of understanding of what's happening to you. Because unless you 617 00:31:52,400 --> 00:31:55,800 Speaker 1: can actually manually override that, you're not going to get anywhere. 618 00:31:56,160 --> 00:32:00,080 Speaker 1: So the biggest piece of advice I have is to 619 00:32:00,520 --> 00:32:03,400 Speaker 1: really just try and understand conflict a little bit more. 620 00:32:03,560 --> 00:32:05,560 Speaker 1: That will serve you in immeasurable ways. 621 00:32:05,920 --> 00:32:08,600 Speaker 2: Can I ask, like, what's one thing that you've started 622 00:32:08,600 --> 00:32:11,480 Speaker 2: to do or that you've found yourself doing in your 623 00:32:11,520 --> 00:32:15,480 Speaker 2: own life when you find you have conflict with your 624 00:32:15,760 --> 00:32:21,120 Speaker 2: husband or friends or people that you work with? Like, 625 00:32:21,160 --> 00:32:24,440 Speaker 2: what's the behavior that you've seen shift? 626 00:32:24,480 --> 00:32:28,120 Speaker 1: It's fascinating. The more I understand it, the more I'm able to 627 00:32:28,960 --> 00:32:31,920 Speaker 1: find different strategies. And I'll give you one really simple 628 00:32:31,960 --> 00:32:34,520 Speaker 1: strategy here. As I've spent the last few 629 00:32:34,640 --> 00:32:39,280 Speaker 1: years thinking about this topic, I went and studied 630 00:32:39,280 --> 00:32:41,080 Speaker 1: to become a nationally accredited mediator. 631 00:32:41,120 --> 00:32:43,360 Speaker 3: As part of it, I traveled the world. I've spoken 632 00:32:43,440 --> 00:32:47,000 Speaker 3: to dozens and dozens of people on this topic, and 633 00:32:46,960 --> 00:32:50,479 Speaker 1: I realized that there's actually one really powerful word in 634 00:32:50,520 --> 00:32:54,480 Speaker 1: conflict that actually gets a little bit forgotten, and the 635 00:32:54,520 --> 00:32:59,480 Speaker 1: word is how. So if you can put how into 636 00:32:59,480 --> 00:33:02,520 Speaker 1: a sentence when you're in conflict with somebody, it's actually 637 00:33:02,600 --> 00:33:05,920 Speaker 1: the most powerful word that you can use. And the 638 00:33:06,000 --> 00:33:09,560 Speaker 1: reason for that is that how forces someone else to 639 00:33:09,600 --> 00:33:12,200 Speaker 1: have empathy for you.
So I'll give you an example 640 00:33:12,200 --> 00:33:14,840 Speaker 1: of that. When you are in a complex situation 641 00:33:15,520 --> 00:33:17,880 Speaker 1: with somebody, let's say it's your partner, or it's somebody 642 00:33:17,920 --> 00:33:20,480 Speaker 1: that you work closely with, you can frame a 643 00:33:20,560 --> 00:33:25,040 Speaker 1: question with how. So: How am I meant to resolve this? 644 00:33:25,760 --> 00:33:28,800 Speaker 1: How can we come to a solution together here? How 645 00:33:28,840 --> 00:33:31,240 Speaker 1: can I use these tools in order to try and 646 00:33:31,960 --> 00:33:33,800 Speaker 1: get this situation from A to B? 647 00:33:34,240 --> 00:33:34,840 Speaker 3: Whatever it is. 648 00:33:34,840 --> 00:33:37,520 Speaker 1: As soon as you say how, you're inviting the other 649 00:33:37,600 --> 00:33:41,600 Speaker 1: person to co-create some of the answer to it, 650 00:33:41,680 --> 00:33:45,960 Speaker 1: and that simple trick of forced empathy is actually really, 651 00:33:45,960 --> 00:33:48,680 Speaker 1: really amazing. So that's something that I use every day now. 652 00:33:48,880 --> 00:33:51,640 Speaker 2: I love that, Tim. I am going to think about 653 00:33:51,640 --> 00:33:54,200 Speaker 2: that and I'm going to put that into practice. Now, 654 00:33:55,040 --> 00:33:59,560 Speaker 2: for listeners that have heard these five skills and they're like, 655 00:33:59,640 --> 00:34:01,720 Speaker 2: that's great, but what's just like one thing that I 656 00:34:01,760 --> 00:34:06,360 Speaker 2: should start doing today to make myself more promotable this year? 657 00:34:06,640 --> 00:34:08,080 Speaker 2: What advice would you give them? 658 00:34:08,280 --> 00:34:12,640 Speaker 1: Yeah, I think sometimes when you listen to wonderful podcasts 659 00:34:12,680 --> 00:34:16,120 Speaker 1: like this, you can get a bit overwhelmed with thinking, 660 00:34:16,800 --> 00:34:17,239 Speaker 1: what are 661 00:34:17,120 --> 00:34:18,800 Speaker 3: these skills that these people are 662 00:34:18,600 --> 00:34:22,400 Speaker 1: rabbiting on about, and how can I actually apply 663 00:34:22,520 --> 00:34:25,760 Speaker 1: them to my life? My advice would be to listen 664 00:34:25,960 --> 00:34:29,960 Speaker 1: back to what these five skills were. You don't need 665 00:34:30,000 --> 00:34:32,759 Speaker 1: to improve all of them all at once. That is 666 00:34:32,840 --> 00:34:35,840 Speaker 1: not what this is about. But listen to these 667 00:34:36,000 --> 00:34:38,680 Speaker 1: five skills, and we'll just run through them here just 668 00:34:38,680 --> 00:34:42,640 Speaker 1: so everyone understands. The first one was judgment, the second 669 00:34:42,640 --> 00:34:47,280 Speaker 1: one is storytelling, the third one's collaborative intelligence, the fourth 670 00:34:47,280 --> 00:34:50,440 Speaker 1: one is unlearning, and the fifth one is conflict management. 671 00:34:51,160 --> 00:34:55,240 Speaker 1: So my advice is, listen to those five skills, think about 672 00:34:55,440 --> 00:35:01,640 Speaker 1: which of them you instinctively, naturally, optimistically want 673 00:35:01,640 --> 00:35:04,439 Speaker 1: to get better at, and then think of what's one 674 00:35:04,520 --> 00:35:07,239 Speaker 1: thing you can do in that space.
So that's, I 675 00:35:07,280 --> 00:35:10,440 Speaker 1: think, a really simple way of not getting overwhelmed by 676 00:35:10,440 --> 00:35:13,240 Speaker 1: all of these and just thinking, wow, storytelling is interesting. 677 00:35:13,760 --> 00:35:15,880 Speaker 1: I might just look up a couple of frameworks on 678 00:35:15,960 --> 00:35:18,960 Speaker 1: storytelling and try and do that. Conflict's interesting. I might 679 00:35:19,320 --> 00:35:22,040 Speaker 1: think about conflict and try and understand that more. So, 680 00:35:22,239 --> 00:35:24,360 Speaker 1: just choose one of them and one simple 681 00:35:24,120 --> 00:35:24,839 Speaker 3: step that you can do. 682 00:35:24,960 --> 00:35:27,440 Speaker 2: I love that. Tim, thank you for coming back on 683 00:35:27,520 --> 00:35:30,120 Speaker 2: How I Work. And obviously you're going to come back 684 00:35:30,360 --> 00:35:33,680 Speaker 2: when your book is out to deep dive into the 685 00:35:33,760 --> 00:35:37,040 Speaker 2: area of conflict, which I personally find so interesting and 686 00:35:37,120 --> 00:35:41,040 Speaker 2: I'm sure listeners are absolutely going to want to know 687 00:35:41,120 --> 00:35:43,719 Speaker 2: more about that. So Tim, thank you, thank you, thank 688 00:35:43,760 --> 00:35:44,919 Speaker 2: you for coming on How I Work. 689 00:35:45,040 --> 00:35:46,600 Speaker 3: Thank you, Amantha, thank you for having me again. 690 00:35:46,920 --> 00:35:50,520 Speaker 2: So my advice, if you're contemplating taking action on all 691 00:35:50,680 --> 00:35:55,960 Speaker 2: five skills, is just: don't. Don't try to improve all 692 00:35:56,160 --> 00:36:00,680 Speaker 2: five skills at once. Just pick one. The next time 693 00:36:00,680 --> 00:36:04,040 Speaker 2: you're about to hit send on something AI drafted, or 694 00:36:04,160 --> 00:36:07,480 Speaker 2: you feel that flicker of tension in a tricky conversation, 695 00:36:08,160 --> 00:36:11,240 Speaker 2: just pause and ask yourself, which skill am I practicing 696 00:36:11,360 --> 00:36:14,480 Speaker 2: right now, or could I practice right now? Is this 697 00:36:14,520 --> 00:36:18,239 Speaker 2: a moment for judgment, or for storytelling, or maybe for 698 00:36:18,400 --> 00:36:22,600 Speaker 2: using that tiny, powerful word how? Pick one skill, try 699 00:36:22,680 --> 00:36:25,759 Speaker 2: one small shift, and I'll see you next week on 700 00:36:25,840 --> 00:36:29,160 Speaker 2: How I Work. If you like today's show, make sure 701 00:36:29,239 --> 00:36:32,239 Speaker 2: you hit follow on your podcast app to be alerted 702 00:36:32,280 --> 00:36:33,760 Speaker 2: when new episodes drop. 703 00:36:34,280 --> 00:36:36,800 Speaker 4: How I Work was recorded on the traditional land of 704 00:36:36,840 --> 00:36:39,040 Speaker 4: the Wurundjeri people, part of the Kulin Nation.