Speaker 1: Daniel Murray. Welcome to the podcast.
Speaker 2: Thanks Paul, great to be here with you.
Speaker 1: Exciting times, Daniel. Your book, when did it release? Very recently?
Speaker 2: Yeah, a couple of days ago it came out in bookshops, so it was a bit of a surreal moment of walking in. I was actually in Sydney, went into the Dymocks there on Pitt Street, and yeah, I got to go in and see it on the shelf, next to all those other wonderful books and authors. It's pretty exciting, to be honest.
Speaker 1: That's the cool bit, isn't it? When you actually see it in a bookstore and then you go, shit, this is real.
Speaker 2: It's felt real for a while, but you're right, when you see it... You know, I was there underneath Diary of a CEO Steven Bartlett's book and some other familiar faces around it. It's exciting, very exciting.
Speaker 1: Yeah, well, you're in good company underneath there. All right, so tell me, and it's just because I've just finished my second book, how long did it take you from the conception to actually getting the final edits off and off to print? How long do you reckon?
Speaker 2: Oh, look, it was a bit of a weird one, because I've been writing bits of this book for a couple of years. Some of the frameworks and tools that I've developed and put in the book have been around for a bit longer. It was probably a good twelve months or so from, okay, I'm going to make this a proper book, and really going through and refining and pulling pieces together, to the part where a friend of mine said, don't self-publish, talk to my friends at Wiley. And that sort of started that process: putting a proposal together, getting them across the line, and then, as you probably know, the proofing, the editing, the bloodbath that comes back when you realize you can't spell or punctuate as well as you thought.
Yeah, it was a bit of a journey, a bit of a journey, and good to be at the end of it.
Speaker 1: Now, your book is all about empathy. So, just top level, give us a kind of definition of empathy. What is that? And what are the overlaps and the differences between empathy, sympathy, and kindness?
Speaker 2: Yeah, there's a lot in there. There are lots of clinical or academic definitions, but with the book I really wanted to make it accessible for people to practice and to put it into the world, to do empathy, not just to think about it. So I discuss empathy as understanding why people do what they do, just as simple as that. So, what drives people? And sometimes that's emotions. Sometimes it's very deeply an emotional drive. Sometimes it's much more strategic or logical or rational. There's a whole bunch of things that drive us. We're pretty complex human beings. So it's really trying to understand what's going on in another person and what's driving them. I think some of the misconceptions... the first one I talk about, which might be surprising to people, is I don't think empathy is walking in someone else's shoes. And the reason why I don't think that's the case is, if I was really struggling with something, really having a hard time in a situation, you might find that situation isn't very difficult at all. You might walk in my shoes and go, actually, not a big deal. Your experience in my shoes is not important to me. My experience is important to me. And so if I really want to understand someone, I've got to think, it's not about me. It's not about me as a good proxy or any of those sorts of things. It's about understanding somebody else and what's going on in their world.
Speaker 1: Yeah, it's less walking in their shoes and more being in their personalized hallucination of reality, is what I talk about.
Your experience is a personalized hallucination of reality, because all of your life experiences, your genetics, go into the mix, and it's all very personalized, isn't it?
Speaker 2: Absolutely, absolutely. And you know, I was chatting to someone the other day about this. They were saying, oh, look, we're having this conflict in our workplace, et cetera, and this happens in all walks of life: this is the truth, and this is what that person thinks is happening. And I always say, well, don't worry about the truth. Just think about these plausible narratives. So your plausible narrative, does it sound good? Wonderful. You like it? Well done, you. Their plausible narrative is something different to yours. Deciding who's right and wrong is a waste of time. Deciding how we start to understand each other's narratives, so we can bring these storylines together, is a much, much better way of thinking about it. Because at the end of the day, to answer your other question about empathy, sympathy, kindness, my objective is to make empathy something that helps us bring these narratives, not to be the same, but to blend them together, so that we can work together, we can interact with each other, we can get along and create a better, more collegiate or harmonious sort of society, where we're not picking these us-and-them battles all the time. And this is where I think empathy can be that tool. Sympathy, the way I understand sympathy, is my feelings about your situation. So it's very much self-centered: how I feel about what's going on for you. And where that relates to things like compassion and kindness: if I feel bad for you and I take action that I think will help you, or I take action to relieve my feelings of feeling bad about it. You know, like there's a poor person who looks homeless on the side of the road. If I throw them two dollars, I feel less guilty, good on me.
That is not helping that person necessarily at all, and it's really not about the other person, it's about me. Yeah, so that's where sympathy can drive, I think, misaligned compassion or kindness at times. Or it's a bit more like niceness, I'm trying to be nice to the person. Whereas I would say being kind is, hey, what's going on in their world? What's really happening for them? What do they need? And how can I help them with that? And that takes more than just sympathizing. It takes empathizing, and it's a much more challenging thing to do, and not something I think we can or should do with everyone all the time. But when we really see something we want to step into and understand, empathy is that tool for us to build that deep understanding, so we can take better decisions.
Speaker 1: And I want to come back to "all the time", because as you describe it, that takes time and effort, right, to actually do that. So I want to circle back to that in a minute. But this idea of a plausible narrative, I actually really like that, because it links to what I said about people's view of reality being a personalized hallucination. And anybody who says that this is absolutely the truth is full of mad dog shit, in my view. You know, any scientist will tell you there are very few things in science that are absolute truths. It's just about certainty that they're not due to chance, and these sorts of things, right? So when did you come up with this idea of describing it as a plausible narrative? And when you came up with that, did you just kind of sit back and go, I've nailed it?
Speaker 2: I didn't sit back and think, I've nailed it, because that just might be my own delusion, right? But it's about sitting with people.
And this happened a lot working in big organizations, where someone would say, and I'm sure you've seen this in your work, this person is doing this because, and then they run off this huge sentence or storyline. And I could sit back and go, but what if it's not because of that? And what happens for us, to go back to your point, what do brains like? It's certainty. They don't like uncertainty. And so the narrative, the reality that we make up, is our brain saying, you've got this, you understand everything, everything's going to be okay, I'll put it in this box and everything's going to be sweet. It can be quite soothing in the moment, but really destructive in the long run. And so what I kept finding, even working in big organizations at times, was we'd have the most beautiful plans, the best PowerPoint deck and all that sort of stuff. You see it in communities where people go, oh, this is how it's going to work, it all makes perfect sense. But then when you add humans to the mix, who've got all of their weird, wonderful craziness, then things can go astray really fast. And I was chatting with a friend about it the other day. Share bicycles, share scooters, you know, if you've got them in Melbourne and Sydney, where you can go and jump on those little e-bikes and stuff. And they make perfect sense: a great, sensible, logical way of moving around a city. Think about the effort it takes to pick one up and throw it in the river. That's real effort. Yet people do that. You go, well, no one would go to that effort, that seems like a lot of wasted energy, to go and throw that thing in the river. However, people seem to do it all the time.
And so if you don't understand some of these strange and weird narratives that people are building in their heads that justify the behavior, you're never going to be able to stop those things from happening, or even find better ways to work with people going forward. So, as a mathematician at heart, which is a strange thing to admit to sometimes, I'd love it if the world was like maths.
Speaker 1: Predictable, certain. And yes, it's just not.
Speaker 2: And so the narrative idea gives us a bit of flexibility, gives us some room to move, gives us a bit of a way of getting away from that brain's desire to put things in black and white.
Speaker 1: Yeah, and I think, look, everybody's been guilty, all the listeners have been guilty, of going, why can't this dickhead see things the way they are?
Speaker 2: Right?
Speaker 1: Of course, we've all been there. Because that is our plausible narrative, which to us seems not plausible but certain, and the truth. And so I really do like this idea of saying a plausible narrative, because what it does, as well as make you think about how they might be seeing the situation, is it puts a little seed of doubt into your own interpretation, the one that you hold at absolutely one hundred percent.
Speaker 2: Well, of course. I'll give you a really simple example of this. Road rage goes down when people go on holidays. So when people are at home, we're busy, we're trying to drop the kids off at school, pick up something, go to an appointment, whatever the case might be. Road rage: I'm the first person beeping the horn, you arsehole, get out of the road, right? When we go on holidays, we're driving along the Amalfi Coast, we're looking out, how beautiful is this? We're the ones puttering along and someone's behind us beeping. Settle down, what's wrong with you? Relax, look how beautiful it is.
So we sort of forget that we're very subjective in the way we see the world, and we change at different times, in different moments, with whole different stimulus going on. So I like this idea of narrative, because it just says, okay, our brain's just creating this storyline, but I have some power to adapt the storyline. And there isn't an absolute truth, it's just lots of different ways that it could happen, and then I've got flexibility. I think it's far more empowering to think, well, I can shift between these different narratives, than to say, no, this is the truth and everyone else is wrong, and therefore, you know, let's burn them at the stake or whatever. Ludicrous, in my view.
Speaker 1: That concept of how beautiful the world is when people are on holiday, it just reminded me, as you were talking about that, of the Good Samaritan study. I don't know if you've heard of this one, and I'll probably get some details slightly wrong, so you can pull me up, and if you don't, listeners will pull me up. But basically, my memory of this is there were a bunch of people who were brought in, it was a university study, and they were read a paragraph. Some of them were primed with the parable of the Good Samaritan, and others were just read a control thing, and then they were told they had to get across to do a talk on the other side of the quad or something in the university. And there was an old person who dropped a heap of books, and they were looking to see who would stop and who didn't stop, right? But they actually manipulated the condition, where some of the people had plenty of time and some of them were quite tight for time. And it wasn't actually whether or not they read the Good Samaritan that drove whether or not they stopped to help. It was whether they had time. So even when people had read the
Good Samaritan, but they thought there wasn't a lot of flex in their day, they just walked right past, right? It sort of goes to your point about how our own personal situation actually changes how we interact with reality.
Speaker 2: One hundred percent. And I think they were actually theological students.
Speaker 1: That I did get right, actually, yeah.
Speaker 2: So they should have stopped. Well, we all should stop, potentially. But your point is a really good one, which is, your brain prioritizes looking after you. As much as we'd like to think it doesn't, it's designed to look after you. It's perfectly designed, and has been created this way over, you know, lots of time of evolution, to work in social groups, yes, absolutely, to care about others because of that social nature. But its underlying function is to keep you safe. And if that means I'm really busy and there are objects in my way, there are problems in my way, how do I get past them as fast as possible to meet what I'm trying to do? That's exactly what we should expect it to do. And so this is where empathy takes work. It takes effort. I have to actually put in some stoppages and blockages on the natural path I would take, of look after number one, to think, okay, I'm going to do something different now, I'm going to interrupt that natural flow and try something else. Which is hard, and we should acknowledge that it's hard, because if we don't acknowledge it's hard, then what we tend to do is go, well, those students who didn't stop, they must be terrible people, they're bad, and obviously they don't believe in their religion or whatever, right? But that's not a useful way of looking at these people. They're just humans, and we're all fallible.
Speaker 1: Yeah, yeah. Now, that does bring in the question that I was going to ask about the time and effort.
So, because this does take time, and it does take effort, to actually slow down, challenge your own assumptions, and actually go, so how might Daniel see this situation, you know, what might he be thinking? And we're in fast-paced businesses, like, when people are going, I don't have time for this empathy stuff, that's all very nice, but there's deadlines here, whether that's an individual in a team or a leader. What would you say to those people who are going, yeah, Daniel, I get all of this, and it's great when I have time, but I've got a bloody deadline to meet?
Speaker 2: Yeah, agreed. So let me think of it this way. There's the old phrase, measure twice, cut once.
Speaker 1: Yes, love it, love it. Yeah.
Speaker 2: Why does that exist? Well, because we know that when we rush, when we're not careful in doing certain activities, there's a good chance we can make mistakes and errors. In most big organizations now, if someone said, hey, we've got to make this decision, we could either do some measurement and study, get some data behind it, or we can just guess, I would suggest to you that in the vast majority of organizations today, people would say, maybe let's grab at least some data. Let's not just guess, right, let's get some data, let's get some numbers in. What do the reports say? And you know, the idea of data-driven decision making, people go, oh, absolutely, we should be getting lots of data, and I go, one hundred percent agree. Mathematician, couldn't agree with you more. However, some of the things that influence humans the most aren't things you can measure and put in a spreadsheet very easily. So, things like emotions and values and relationships and culture. You know, if you asked about my relationship with my wife, I could say, my relationship with my wife is seventy-three. What would you do with that number? Okay, what does it mean? How interesting or important is that number in making a decision? It's pretty much useless.
But if you understood the relationship I had with my wife, and you were trying to sell me something, or lead me, or help me make a decision in my financial world, for example, that would be pretty important data, would you agree? The seventy-three, I don't know what to do with that. But if you spoke to me and understood the emotions and the relationship and all those other hard-to-measure but really important pieces, you'd make a better decision. And so what I would say to people is, if you're going to be working with another human, and you need to make decisions or you need to take actions around this person, are you telling me that it's not worth understanding that data? Which, in most cases when you're dealing with another person, that's all the gold, right, that's where all the really important stuff is. So yes, it's hard. Yes, it's challenging. Can you make decisions without it? Of course you can, sometimes you have to. My challenge back would be, you can do better if you understand. If you're talking on the phone to someone who's had an insurance claim, understanding what the context of their life is at the moment is going to be much, much more important for you to make a decision than, what's the policy, how do I put you through this sausage factory and spit you out at the other end, right?
Speaker 1: Yeah, absolutely. Now, when I'm running some workshops with leaders, particularly, I do a high performance teams workshop, and in part of that I get everybody sitting around in a circle, with no tables in the way, and then I just throw it out there, right, we're going to talk about the most challenging thing we've ever had to deal with emotionally, right? And I generally lead off on that, and it's a tough session for a lot of people.
And you can see straight away some people tensing up and going, oh shit. And I actually say to them, now your brain is whizzing around going, fucking hell, should I talk about this, or will I just talk about the lesser one? And I say, go hard, right. And then the longer people wait, the more nervous they get. But it often becomes very emotional. But afterwards, when we debrief, people go, oh my god, I now really understand why you do what you do. And I'll also get the people to say how it's had an impact on them and their view of reality. And that's a very powerful session for getting people to really... it's not so much about understanding the individuals, but just understanding how everybody has got shit going on in their lives that impacts why and how they do things, sometimes profoundly. Now, that is a powerful session, but it takes a fair bit of time. So what would you say, then, to people who are operating in teams, whether they're a leader or not, about how they can... are there any questions, or is there a method that they can use, to help them understand someone else's view of the world without having to sit around and get the tissues out?
Speaker 2: It's a great question, and it sounds like a fascinating workshop, I'd love to be a fly on the wall in some of those. I'm sure some really interesting things come out. Here's the challenge I have back with the question, though: if you're looking for a trick to build trust quickly, it's not going to work. Yeah, you might have heard of this one before, Paul, talked about it before, but the model I always think about for inside people's brains is an elephant with a man riding on its back. Have you heard this one? So the rider is that conscious, smart, intelligent, you know, thoughtful, pragmatic brain, the strategic part of our brain, and the elephant is our emotional core.
It's that intuitive and instinctive piece. And the rider thinks he is in full control, until the elephant has strong emotions and decides it's game off. Then it's all over to the elephant, man. And what you're talking about here is getting people to really expose some of that emotional core. The reason why it's so powerful is because it builds trust. Elephants are the things that determine if I trust someone or not. Not logic, not rationality, correct? And you know, if I said to you, tell me someone you really trust, tell me the exact things they did to build that trust, and now what I'm going to do is do those same things, so then you have to trust me the same, right? You would say no, because of the way you asked the question, I definitely don't trust you, this sounds dodgy, right? Because it's emotional. It's decided by the elephant. You feel trust, you don't think trust. And so if you want to build that relationship of trust, then you do have to invest the time to actually show genuine care and concern for that person. You have to genuinely demonstrate you want to know. One of the easiest ways to do it is actually ask someone a question and then just listen. That's really interesting, tell me more about that. How did that feel? And just shut up and listen. Don't try and bring your experience in. Don't try and, oh well, yeah, that happened to me once too.
Speaker 1: Oh, how many fucking people do that, right? As soon as people start talking, they take over the conversation and make it about them. And it's difficult not to do it, isn't it? Because it just triggers something in your brain, and you want to kind of share and show that you're connected. But that can backfire, really, can't it?
Speaker 2: Oh well, of course. Again, it's natural, your brain goes, oh, what about me, what about me, how do I relate this to my experience? And it does take conscious effort.
I remember working with a brilliant CEO, a gentleman named Peter Harmer, who ran IAG at the time. We were in a meeting and he got really upset, worked up about something, and he put his hands on the desk and went, calm down, Peter. And you know, he was obviously very emotionally engaged in this conversation, and the elephant was ready to jump on in, but he knew that to make a good decision in the team, he had to try and let others have space. When the CEO comes in and says, I think we should do this, everyone else goes, okay, and just defers, and you don't necessarily get a great outcome. So it's important to be able to park our elephants, as much as we might feel sad or gutted or upset, whatever it might be. And you see this in people like Louis Theroux, you see this in really great interviewers; people like Andrew Denton do this very well. They sit with things without putting themselves into it, without letting their emotions overwhelm it. And it's a hard thing to do, but really important to build that trust.
Speaker 1: Louis Theroux, I think, is exceptional at it, really, really good. Yeah, you just need to watch Louis Theroux to see how to really help somebody to open up. And it reminds me, actually, of motivational interviewing. Are you familiar with motivational interviewing? Which I started using years ago, and it was a bit of a game changer for me, actually, in dealing with clients. And the OARS approach that they have: asking open-ended questions, right, so that you get somebody to talk; then giving people affirmations, you know, telling them something that you like or admire or whatever; reflective listening, to your point earlier, to actually listen and reflect back the emotions, I'm hearing that you're frustrated, or whatever it may be, right; and then in the OARS approach there's summarizing, but that's really about change talk, you're summarizing change.
That's when you're in a behavior change thing. But it's the open-ended questions, the affirmations... for me, it's the reflective listening, because when people feel that they've been heard and understood, that is when you start to create that connection and that deeper level of trust, isn't it?
Speaker 2: Absolutely. You know, what I see in people like Louis Theroux is he can sit in a place where he's talking to someone who he vehemently disagrees with, but he's still open to hearing, you know, not rushing to challenge and shoot down their ideas and thoughts, and because of that you create this space for them to share more and share more. And again, people say to me, oh, but it's wrong, they shouldn't say that, they're saying something that's terrible. And my response is always, but do you understand why they're saying it? And if you don't understand why they're saying it, you're shutting them off too fast. And if you tell someone, I think what you're saying is terrible, often what people do is they don't stop thinking bad ideas, they just stop saying them. And therefore you don't get access to the information that drives it. And we've got to be able to build those bridges sometimes.
Speaker 1: How important is curiosity for building empathy?
Speaker 2: Vital. So when I started researching and studying this idea of empathy as a way of making better decisions and working more effectively in my career, I wanted a process, and I couldn't find a process for building empathy. So I developed one. And the first step in the process is what I call being consciously curious. And what I mean by being consciously curious is acknowledging I have beliefs, I have biases, I have heuristics, I have assumptions. I've got a mind, I've got a head full of stuff that's ready to make a judgment within a millisecond of anything I see. It's all there, ready to go, happy days. My brain thinks it's all correct and I'm awesome.
But what I need to do to be consciously curious is go, okay, I'm here to meet Paul, I'm just going to put all that stuff to the side. I'm fronting up with a blank sheet of paper, and I'm really interested. And so this conscious curiosity is easy to say, hard to do. But that's the fundamental part of empathy. The reason why it is so important is, if I'm not consciously curious, what I'll do is ask questions and explore with a view to validating what I already think is true.
Speaker 1: Exactly. And I think that's hugely important. I really like that as a first step, actually, because all people need to do is just pause this podcast and google "list of cognitive biases", and you will see a huge list of cognitive biases. And if you read through it, you realize that most of us do most of these things quite a lot of the time, right? And we have, as you said, all these heuristics, all these shortcuts that we have to have to be able to exist in the world. But it makes our thinking become a little bit more narrow. And if you hang out with the same people, and if you're going to test your assumptions on people who generally agree with you, you're just bouncing off the echo chamber, aren't you?
Speaker 2: Absolutely, and we're becoming even better experts at doing that than we've ever been before. You know, you don't have to look far into something like Facebook to find groups of people who are just constantly seeking more and more data that validates what they believe, and completely ignoring anything to the contrary. And we've done it for a long time. Humans are experts at doing this, have done it for millennia. But, and I said this at a conference the other day, Paul, the world has never been more complex than it is today.
Right? Never ever in human history has it been as complex as it is today. And it is the simplest it will ever be again. Today is the simplest. In the future, things are only scaling up in complexity. And the more we train ourselves to be sure, to be closed, to think, you know, I don't want to make it political, but going back to the good old days, making it great again, it's all complete rubbish. We have to be open and curious and working together to find new ways of connecting in this future. Otherwise we'll just keep diverging, keep creating more groups of us and them, and I think that's the path where humanity starts to fall apart and do terrible things to each other.
Speaker 1: Look, I think you're right. And you see it, and I think a lot of it has started in the United States, that polarization of the us and them, and the lack of debate, and it's now spreading around the world. And as you will know, social media has got a lot to do with it, with echo chambers. I think people are becoming more and more unconsciously rigid rather than consciously curious, and it's a bad thing. So I love your first thing in the process. Tell us about the other steps in the process, let's explore those a little bit.
Speaker 2: Yeah, so there are four steps that we've got to go through. Being consciously curious is first, then it's to openly explore. Openly explore, the second step, is about asking questions, sitting in people's experience, trying to gather as much data as we can. And what I mean by openly explore is really not rushing to find the answers I want. And you know, if we take the US as a good example, people there are very black and white on things like gun laws. Oh well, people are pro guns because they like guns, because they're gun-toting hillbillies and want to shoot everything.
Sure, that's a good first-thought assumption, very fast. But what if it's that they're actually deeply afraid of not being able to protect their family? Because love for my family is something that I can relate to really strongly, and if I had a deep fear of not being able to protect them, well, I could empathize with that pretty quickly. How do I explore that? How do I ask questions to understand that, and understand the real context behind that? What else would cause people to think that way? This openly explore is about sitting in that unknown world and just gathering that data.
Speaker 1: On this, sorry, Daniel, to jump in, but on this, do you have to cultivate a sense of openness? And the reason I'm asking that question: I mean, people at work, a lot of them will have done, like, a Myers-Briggs test, or, I can't remember what the other one is, the DISC test, and these sorts of personality ones. But most serious psychologists will tell you that they're kind of pop psychology, and the most validated one is the Big Five personality traits, one of which is openness to experience. I'm just wondering, I don't know if you've seen any research, whether that is linked to empathy, whether people who are higher in openness are actually higher in empathy. So that's one question. Then the second one is, with this openly explore, do you have to prime yourself to be open to having new knowledge and having your assumptions challenged?
Speaker 2: Yeah, I think they fit together. So I haven't seen research on that in particular, though I know the Big Five. What I always say with those personality profiles is there are different underlying habits that we have, and neurologically, what's going on here is our brain is connecting these different neural pathways. There are trillions of them, incredibly complex, but the more we use them, the more they become habitual and the go-to.
And so what we know 600 00:34:54,600 --> 00:34:57,719 Speaker 2: is that after a certain period of our lifetime, we 601 00:34:57,760 --> 00:35:00,480 Speaker 2: start to form pretty strong habits and we keep reinforcing 602 00:35:00,480 --> 00:35:05,239 Speaker 2: those pathways. This is important because some 603 00:35:05,280 --> 00:35:07,560 Speaker 2: people think, well, that's just how I am and 604 00:35:07,640 --> 00:35:09,120 Speaker 2: I can't change. Bullshit. 605 00:35:09,320 --> 00:35:10,520 Speaker 1: No, that's right. 606 00:35:10,960 --> 00:35:14,600 Speaker 3: That's how you've become because of your repeated behaviors. Donald 607 00:35:14,640 --> 00:35:17,600 Speaker 3: Hebb, the Canadian neuroscientist, right: nerve cells that fire together, 608 00:35:17,640 --> 00:35:18,560 Speaker 3: wire together. 609 00:35:18,800 --> 00:35:22,080 Speaker 2: Exactly right. So neuroplasticity means you can always change, your 610 00:35:22,320 --> 00:35:25,600 Speaker 2: entire life. There's no sort of stop to that. But 611 00:35:26,080 --> 00:35:30,560 Speaker 2: you change through deliberate practice, not by accident. And so yes, 612 00:35:30,640 --> 00:35:33,200 Speaker 2: you have to prime yourself, particularly if you're not someone 613 00:35:33,239 --> 00:35:37,080 Speaker 2: who is always that curious. You have to prime yourself 614 00:35:37,120 --> 00:35:40,520 Speaker 2: to go, okay, this is a time when I really 615 00:35:40,600 --> 00:35:43,600 Speaker 2: want to understand what the hell's going on here. This 616 00:35:43,760 --> 00:35:45,840 Speaker 2: is a time where I need to be able to 617 00:35:45,880 --> 00:35:49,640 Speaker 2: park my assumptions and just sit with the uncertainty and 618 00:35:49,640 --> 00:35:54,000 Speaker 2: the discomfort that will exist in this next conversation. I'm 619 00:35:54,000 --> 00:35:58,839 Speaker 2: going to actively stop going, oh but what about, or no, 620 00:35:58,920 --> 00:36:03,080 Speaker 2: that's not right. It takes practice, and you've 621 00:36:03,080 --> 00:36:07,080 Speaker 2: got to catch yourself sometimes, knowing that you thought you 622 00:36:07,120 --> 00:36:08,359 Speaker 2: weren't going to do it, and then you just did 623 00:36:08,400 --> 00:36:11,000 Speaker 2: it anyway because you're human, and I'm going to practice 624 00:36:11,000 --> 00:36:14,440 Speaker 2: again and practice again. So yes, I'm sure there are 625 00:36:14,480 --> 00:36:17,399 Speaker 2: people who are much better at this, though I'd also question the word 626 00:36:17,520 --> 00:36:20,920 Speaker 2: naturally and what that means, people who have become 627 00:36:20,920 --> 00:36:22,920 Speaker 2: better at this over time, and there are people who 628 00:36:22,920 --> 00:36:24,560 Speaker 2: are a long way down the scale and need to 629 00:36:24,560 --> 00:36:28,880 Speaker 2: work harder. Yes, that's all true. But the deliberate practice 630 00:36:28,920 --> 00:36:30,719 Speaker 2: will make you better no matter where you are on 631 00:36:30,719 --> 00:36:31,440 Speaker 2: that scale. 632 00:36:31,680 --> 00:36:33,600 Speaker 3: And that's the point of the framework, right, is that 633 00:36:33,760 --> 00:36:38,279 Speaker 3: you deliberately practice the framework.
So we've got firstly being consciously 634 00:36:38,360 --> 00:36:43,759 Speaker 3: curious, and then openly explore, where I assume there's 635 00:36:43,800 --> 00:36:47,239 Speaker 3: an element of asking skillful questions in there, and then 636 00:36:47,280 --> 00:36:49,920 Speaker 3: asking people to tell you more and exploring emotions. 637 00:36:50,960 --> 00:36:53,279 Speaker 1: What's the what's the third one? 638 00:36:54,000 --> 00:36:59,279 Speaker 2: Yeah, before that, the other thing is you can brainstorm and 639 00:37:00,040 --> 00:37:04,959 Speaker 2: ideate outside of yourself about another person when you don't 640 00:37:04,960 --> 00:37:07,239 Speaker 2: have access to talk to them. So, you know, if 641 00:37:07,239 --> 00:37:11,080 Speaker 2: I was to think, why might someone become a suicide bomber? 642 00:37:11,480 --> 00:37:14,040 Speaker 2: I don't agree with that. I think it's a terrible thing to do. I 643 00:37:14,280 --> 00:37:18,120 Speaker 2: don't think it's a great action to take. However, if 644 00:37:18,120 --> 00:37:21,279 Speaker 2: I just assumed, oh, that's because they're evil, okay, tick 645 00:37:21,320 --> 00:37:23,960 Speaker 2: a box, that's an easy assumption for me to make. 646 00:37:24,000 --> 00:37:28,080 Speaker 2: It's a short, quick assumption. Is it particularly useful? What 647 00:37:28,120 --> 00:37:29,560 Speaker 2: do I do now? Do I go around to people 648 00:37:29,600 --> 00:37:31,920 Speaker 2: and go, hey, make sure you're not evil, don't be evil? 649 00:37:32,360 --> 00:37:34,839 Speaker 2: Is that going to really have an impact? No. So 650 00:37:34,920 --> 00:37:37,319 Speaker 2: even without knowing them or talking to them, I 651 00:37:37,320 --> 00:37:40,880 Speaker 2: can probably imagine that it's a pretty effective way of 652 00:37:40,920 --> 00:37:44,399 Speaker 2: causing damage when you don't have many resources. It could 653 00:37:44,400 --> 00:37:46,759 Speaker 2: be driven by religion, it could be driven by a parent, 654 00:37:46,800 --> 00:37:48,480 Speaker 2: it could be driven by love. It could be 655 00:37:48,520 --> 00:37:50,879 Speaker 2: driven by politics and a sense of nationalism. 656 00:37:51,239 --> 00:37:54,759 Speaker 3: It can be, it's huge, and a big one 657 00:37:54,880 --> 00:37:57,840 Speaker 3: is a lack of hope in your current situation. 658 00:37:57,719 --> 00:38:01,480 Speaker 2: Absolutely, lack of hope. It could be wanting my children, 659 00:38:01,600 --> 00:38:05,000 Speaker 2: my generations of family, to have a better life, and 660 00:38:05,000 --> 00:38:07,640 Speaker 2: this is the only course of action I feel empowered 661 00:38:07,680 --> 00:38:12,800 Speaker 2: to take. So when I start to brainstorm there, again, 662 00:38:12,920 --> 00:38:16,160 Speaker 2: I don't think any of these are good, valid reasons 663 00:38:16,160 --> 00:38:19,080 Speaker 2: to do that, but that's not really the point. The 664 00:38:19,160 --> 00:38:21,120 Speaker 2: point is, if I'm going to try and influence and 665 00:38:21,200 --> 00:38:24,239 Speaker 2: change the decision someone's going to make, I've got to 666 00:38:24,280 --> 00:38:27,360 Speaker 2: have visibility of these things, or at least understand 667 00:38:27,360 --> 00:38:29,600 Speaker 2: that these things could exist, because it gives me more 668 00:38:29,640 --> 00:38:33,680 Speaker 2: starting points to have conversations and explore further.
So it 669 00:38:33,800 --> 00:38:37,040 Speaker 2: is asking questions, but it's not the only way. If 670 00:38:37,080 --> 00:38:39,919 Speaker 2: you've worked at a big company, sometimes people up there 671 00:38:40,040 --> 00:38:42,719 Speaker 2: make a decision and you sit there and go, what 672 00:38:42,760 --> 00:38:47,120 Speaker 2: are these idiots doing? Are they there to make our life hard? 673 00:38:48,320 --> 00:38:50,600 Speaker 2: And quickly I'll go to the team: they're idiots, 674 00:38:50,600 --> 00:38:53,000 Speaker 2: they don't like us, we're on the outer. I 675 00:38:53,080 --> 00:38:56,280 Speaker 2: make an assumption, I start communicating that assumption, it spreads 676 00:38:56,320 --> 00:38:59,560 Speaker 2: like wildfire. If instead I sat with my team and I said, hey, 677 00:38:59,600 --> 00:39:01,759 Speaker 2: they thought this, and one of my team goes, 'cause 678 00:39:01,800 --> 00:39:04,640 Speaker 2: they're dickheads. Okay, they might be. What else could be 679 00:39:04,719 --> 00:39:05,560 Speaker 2: driving them? 680 00:39:05,760 --> 00:39:06,120 Speaker 1: Mhm? 681 00:39:06,600 --> 00:39:09,080 Speaker 2: You know, that gives us so much more opportunity. 682 00:39:09,080 --> 00:39:10,920 Speaker 2: And what it then does is it allows us to 683 00:39:10,920 --> 00:39:14,839 Speaker 2: do step three, which is challenge my models. What are 684 00:39:14,840 --> 00:39:18,040 Speaker 2: those assumptions, biases, heuristics, you know, all those 685 00:39:18,080 --> 00:39:21,680 Speaker 2: things I've already got? What's the story, the plausible narrative, again? 686 00:39:21,719 --> 00:39:25,040 Speaker 2: What was it before? Now I'm going 687 00:39:25,080 --> 00:39:29,640 Speaker 2: to make a conscious decision to potentially alter it. Right, 688 00:39:29,680 --> 00:39:32,480 Speaker 2: before, I thought they are all just evil. Now 689 00:39:32,520 --> 00:39:35,080 Speaker 2: I go, well, actually, some people are going to be 690 00:39:35,120 --> 00:39:39,200 Speaker 2: doing it for different reasons. And once I can change 691 00:39:39,200 --> 00:39:42,640 Speaker 2: my mind and alter my mental models, then I can 692 00:39:42,800 --> 00:39:46,960 Speaker 2: use them. All the power is back to me, isn't it? Now, 693 00:39:47,600 --> 00:39:51,879 Speaker 1: for some people, the force is very strong in them. 694 00:39:52,000 --> 00:39:55,480 Speaker 3: What I mean by that is the sense that 695 00:39:55,480 --> 00:39:58,319 Speaker 3: I am right and my model is right. 696 00:39:59,560 --> 00:40:04,360 Speaker 3: Are there any guidance or techniques that you 697 00:40:04,400 --> 00:40:09,760 Speaker 3: can give to people on being open to challenging their model, 698 00:40:09,920 --> 00:40:13,279 Speaker 3: or to deliberately challenge their model? Because I think it's 699 00:40:13,320 --> 00:40:18,680 Speaker 3: a hugely important thing. I mean, I will often, just 700 00:40:18,719 --> 00:40:20,440 Speaker 3: to give you a kind of case in point that 701 00:40:21,480 --> 00:40:25,520 Speaker 3: leverages off what you said earlier on, I tend to 702 00:40:25,600 --> 00:40:30,680 Speaker 3: read news from opposite sides of the political spectrum, right, 703 00:40:30,719 --> 00:40:35,640 Speaker 3: and it's a really interesting experiment going onto CNN 704 00:40:36,000 --> 00:40:38,120 Speaker 3: and then jumping across to Fox News, and it's like 705 00:40:38,120 --> 00:40:40,600 Speaker 3: two different worlds.
But even here in Australia, you read 706 00:40:40,600 --> 00:40:43,879 Speaker 3: the Guardian and then you read the Australian, right, and 707 00:40:44,640 --> 00:40:46,360 Speaker 3: very, very different takes. 708 00:40:46,400 --> 00:40:47,840 Speaker 1: And it gets you then 709 00:40:47,800 --> 00:40:52,080 Speaker 3: to see that, yes, there are different ways of interpreting things, 710 00:40:52,080 --> 00:40:54,080 Speaker 3: of viewing the world, all of this sort of stuff. 711 00:40:54,080 --> 00:40:57,440 Speaker 3: So what sort of things or what advice would you 712 00:40:57,480 --> 00:41:02,239 Speaker 3: give to people to be more deliberate about that, or 713 00:41:02,320 --> 00:41:05,480 Speaker 3: to help them become more effective at challenging their model? 714 00:41:05,239 --> 00:41:08,920 Speaker 1: Because I think that this is this is really huge. 715 00:41:09,360 --> 00:41:14,239 Speaker 2: It's it's hard, because there isn't an easy way. I 716 00:41:14,239 --> 00:41:17,640 Speaker 2: think doing things like you're doing are really, really useful: 717 00:41:18,040 --> 00:41:21,399 Speaker 2: exposing yourself to stuff you don't agree with, being able 718 00:41:21,400 --> 00:41:23,279 Speaker 2: to sit there and go, hey, I can hold this, 719 00:41:23,360 --> 00:41:25,160 Speaker 2: I can look at this idea, I can hold it 720 00:41:25,160 --> 00:41:27,160 Speaker 2: in my mind, I can interrogate it and think about 721 00:41:27,160 --> 00:41:32,719 Speaker 2: it without believing it, without accepting that it's true. And 722 00:41:33,120 --> 00:41:35,719 Speaker 2: one of the things I often say, particularly for people making 723 00:41:35,760 --> 00:41:39,080 Speaker 2: decisions, leaders in teams, even in relationships, is: 724 00:41:40,440 --> 00:41:41,799 Speaker 2: do you want to be right, or do you want 725 00:41:41,800 --> 00:41:45,760 Speaker 2: to get the right outcome? Because sometimes you don't get both. 726 00:41:46,239 --> 00:41:47,319 Speaker 1: Yeah, yeah, and... 727 00:41:48,800 --> 00:41:51,440 Speaker 2: It's obvious, right, we want the better outcome. 728 00:41:52,120 --> 00:41:55,080 Speaker 1: You can get both, but you'll piss people off. 729 00:41:55,200 --> 00:41:59,600 Speaker 2: Potentially, potentially. Well, sometimes you can, yeah. But I know 730 00:41:59,719 --> 00:42:02,720 Speaker 2: lots of people who will burn things to the ground 731 00:42:03,200 --> 00:42:06,280 Speaker 2: to stand on the smoldering ashes and say, well, see, 732 00:42:06,480 --> 00:42:09,719 Speaker 2: I was right. Take that, Paul, I was right. 733 00:42:11,080 --> 00:42:16,520 Speaker 2: Cool, where are we now, right? Instead of saying, hey, how 734 00:42:16,520 --> 00:42:19,799 Speaker 2: do we move forward here? And we saw this. This 735 00:42:19,960 --> 00:42:22,839 Speaker 2: is a study done quite a long time ago, and 736 00:42:23,080 --> 00:42:26,200 Speaker 2: the modern context has probably made it worse, where they 737 00:42:26,200 --> 00:42:29,480 Speaker 2: would show a very similar news article to people who 738 00:42:29,480 --> 00:42:32,919 Speaker 2: are pro-Palestinian and pro-Israeli, and both of them 739 00:42:33,080 --> 00:42:37,640 Speaker 2: would report back how biased the same article was, 740 00:42:38,080 --> 00:42:43,880 Speaker 2: in opposite directions. When we see the world, 741 00:42:43,960 --> 00:42:46,600 Speaker 2: we don't see it with any sort of objectivity.
We 742 00:42:46,719 --> 00:42:49,920 Speaker 2: see it through huge amounts of bias. And one of 743 00:42:49,960 --> 00:42:52,600 Speaker 2: the tips I always give people is to go, but 744 00:42:52,719 --> 00:42:56,239 Speaker 2: what if you're wrong? So in any situation, you know, 745 00:42:56,280 --> 00:42:58,080 Speaker 2: let's say we make a decision in a business: we need 746 00:42:58,120 --> 00:43:02,040 Speaker 2: to, you know, get customers to buy it, and it must have been 747 00:43:02,040 --> 00:43:05,000 Speaker 2: because it was too expensive. Okay, cool, write that down. 748 00:43:05,120 --> 00:43:09,279 Speaker 2: What if we're wrong? Because if we're wrong, we might 749 00:43:09,320 --> 00:43:12,120 Speaker 2: well reduce the price and still not sell anything. Right, 750 00:43:12,120 --> 00:43:14,200 Speaker 2: we might do a whole bunch of marketing on this 751 00:43:14,320 --> 00:43:17,279 Speaker 2: new price and get no results. It would be bad, 752 00:43:17,320 --> 00:43:21,120 Speaker 2: wouldn't it? Okay, cool. Before we just dive headfirst into 753 00:43:21,160 --> 00:43:24,920 Speaker 2: this shallow pool here, let's step back and go, what 754 00:43:24,960 --> 00:43:28,759 Speaker 2: if we're wrong? What are the other things? And interestingly enough, 755 00:43:28,800 --> 00:43:32,319 Speaker 2: I think there's elements from practices like risk management which 756 00:43:32,360 --> 00:43:33,399 Speaker 2: are really useful here. 757 00:43:33,719 --> 00:43:36,040 Speaker 3: Yeah, yeah, I was just about to say that stuff. 758 00:43:36,040 --> 00:43:39,920 Speaker 3: Go ahead with this, because this is... 759 00:43:40,000 --> 00:43:43,400 Speaker 2: Just good practice, to go, what else could be true? 760 00:43:43,800 --> 00:43:46,680 Speaker 2: How can we change this? What if we're wrong? And 761 00:43:46,960 --> 00:43:52,759 Speaker 2: I really challenge anyone to tell me that they know 762 00:43:52,880 --> 00:43:55,080 Speaker 2: all the answers, they've got it all right. You know, 763 00:43:55,120 --> 00:43:57,400 Speaker 2: people, when they say, oh, I know everything and 764 00:43:57,440 --> 00:44:02,799 Speaker 2: I've got everything right, immediately cause others to go, bullshit. Yeah, 765 00:44:02,960 --> 00:44:06,200 Speaker 2: unless you've become president, you know. Yeah. 766 00:44:06,239 --> 00:44:09,640 Speaker 3: And I think, I think the best teams will 767 00:44:09,719 --> 00:44:14,560 Speaker 3: actually have some people, you know, if they're 768 00:44:14,680 --> 00:44:16,640 Speaker 3: coming to a decision or they think this is 769 00:44:16,680 --> 00:44:20,320 Speaker 3: the decision, have some people actually go away and create 770 00:44:20,360 --> 00:44:21,800 Speaker 3: a case for why we're wrong. 771 00:44:22,480 --> 00:44:27,120 Speaker 1: Right. So why did this fail, if you project forward? 772 00:44:27,239 --> 00:44:30,480 Speaker 3: Because that's all about challenging assumptions that we often have, 773 00:44:30,560 --> 00:44:35,719 Speaker 3: those assumptions that are either maybe not wrong, but 774 00:44:35,800 --> 00:44:39,720 Speaker 3: maybe not complete, and can lead you down the wrong path, 775 00:44:39,840 --> 00:44:42,000 Speaker 3: right, if you don't challenge those assumptions.
776 00:44:43,000 --> 00:44:44,680 Speaker 2: And look, one of the things that I think is 777 00:44:44,719 --> 00:44:48,880 Speaker 2: so important is the conversation around DEI and the attack on 778 00:44:49,280 --> 00:44:51,680 Speaker 2: diversity that's going on, and all that sort of stuff 779 00:44:51,719 --> 00:44:57,040 Speaker 2: that's floating around. I get why it's 780 00:44:57,080 --> 00:45:00,239 Speaker 2: probably gone too far in some spaces. But sure, let's 781 00:45:00,239 --> 00:45:02,440 Speaker 2: get back to the nuts and bolts of this. If 782 00:45:02,440 --> 00:45:06,080 Speaker 2: we can get people who see the world differently surrounding 783 00:45:06,160 --> 00:45:08,880 Speaker 2: us and adding different perspectives and different points of view, 784 00:45:09,440 --> 00:45:12,920 Speaker 2: that makes us stronger. Without doubt, I want to 785 00:45:12,960 --> 00:45:15,359 Speaker 2: get people who can see the things that I can't see. 786 00:45:15,520 --> 00:45:17,440 Speaker 2: If I just fill the room with people who believe 787 00:45:17,440 --> 00:45:20,840 Speaker 2: what I believe, I've got good horsepower, I'm just 788 00:45:20,920 --> 00:45:22,560 Speaker 2: kind of maybe heading in the wrong 789 00:45:22,400 --> 00:45:25,520 Speaker 1: direction. And very quickly, very quickly. Exactly right. 790 00:45:25,560 --> 00:45:27,719 Speaker 2: And so, Paul, this is where, how do we, how 791 00:45:27,719 --> 00:45:29,680 Speaker 2: do we get people to do this sort of stuff? 792 00:45:29,840 --> 00:45:33,400 Speaker 2: Well, be interested and curious about people who disagree 793 00:45:33,440 --> 00:45:36,880 Speaker 2: with you, bring them closer to you and go, that's fascinating. 794 00:45:36,920 --> 00:45:39,480 Speaker 2: I totally disagree, but I'm interested. 795 00:45:39,800 --> 00:45:42,319 Speaker 1: And be open, yeah, in that. 796 00:45:42,360 --> 00:45:45,560 Speaker 2: Space, and don't try and get an answer or decide who's right. 797 00:45:46,680 --> 00:45:49,080 Speaker 2: Throw that stuff away and just sit with it and go, 798 00:45:49,520 --> 00:45:52,480 Speaker 2: there's something in here. And you know, that's where we 799 00:45:52,600 --> 00:45:57,040 Speaker 2: find so much gold, when we do it authentically. But 800 00:45:57,080 --> 00:45:59,640 Speaker 2: it takes that trust building that we talked about. The 801 00:45:59,640 --> 00:46:04,200 Speaker 2: exercises you mentioned earlier become really crucial, because if 802 00:46:04,239 --> 00:46:07,520 Speaker 2: I ask people who don't trust me to tell me 803 00:46:07,560 --> 00:46:11,280 Speaker 2: what they think, they will modify what they say based 804 00:46:11,320 --> 00:46:14,759 Speaker 2: on the relationship we have. What they think, all 805 00:46:14,760 --> 00:46:16,880 Speaker 2: that sort of stuff, clouds it, and it gets real messy, 806 00:46:16,920 --> 00:46:17,600 Speaker 2: real fast.
807 00:46:17,760 --> 00:46:20,520 Speaker 3: And then often they will agree in the meeting, and 808 00:46:20,560 --> 00:46:23,000 Speaker 3: then they will go out and they will sabotage the 809 00:46:23,080 --> 00:46:26,640 Speaker 3: decision. Because I often say this: people need to 810 00:46:26,719 --> 00:46:30,280 Speaker 3: feel heard if they're going to be committed to a decision. 811 00:46:30,320 --> 00:46:33,040 Speaker 3: Even if they don't agree with that decision, if they've 812 00:46:33,080 --> 00:46:37,040 Speaker 3: actually been heard and it's been discussed, then they're much 813 00:46:36,880 --> 00:46:38,000 Speaker 1: more likely to get on board. 814 00:46:38,040 --> 00:46:40,319 Speaker 3: But if you just shut them down, often they'll go 815 00:46:40,320 --> 00:46:44,439 Speaker 3: out and sabotage, and it fucking destroys organizations. I've 816 00:46:44,480 --> 00:46:47,319 Speaker 3: seen it, and it spreads like a cancer. To your 817 00:46:47,360 --> 00:46:49,759 Speaker 3: point earlier on, then they go to their team and 818 00:46:49,800 --> 00:46:52,439 Speaker 3: go, these dickheads, and blah blah blah blah blah, right, 819 00:46:52,480 --> 00:46:56,160 Speaker 3: and then it just gathers legs and spreads 820 00:46:56,160 --> 00:46:59,680 Speaker 3: like a cancer. Absolutely. So we've got the first three: 821 00:46:59,719 --> 00:47:03,799 Speaker 3: be consciously curious, openly explore, challenge your 822 00:47:03,840 --> 00:47:04,480 Speaker 3: mental models. 823 00:47:04,480 --> 00:47:05,600 Speaker 1: And then what's the fourth one? 824 00:47:06,120 --> 00:47:08,319 Speaker 2: And to your point, it's what I call leading with 825 00:47:08,360 --> 00:47:10,840 Speaker 2: empathy. What I mean by that is, 826 00:47:10,920 --> 00:47:17,239 Speaker 2: make decisions, take action, navigate through this world with your 827 00:47:17,360 --> 00:47:22,480 Speaker 2: newfound data and understanding about the situation. So the 828 00:47:22,480 --> 00:47:25,120 Speaker 2: point that you just raised there is an exact example 829 00:47:25,160 --> 00:47:28,400 Speaker 2: I talk about often. You have to make a decision, 830 00:47:28,520 --> 00:47:30,759 Speaker 2: maybe as a leader or a business owner or whatever it is, 831 00:47:30,960 --> 00:47:33,400 Speaker 2: but you've got to make a call. We're going to 832 00:47:33,400 --> 00:47:37,040 Speaker 2: do X or Y. You talk to your team, you 833 00:47:37,120 --> 00:47:40,319 Speaker 2: gather information, you get their perspectives, you hear them. But 834 00:47:40,360 --> 00:47:42,080 Speaker 2: then at the end of the day, you've got to 835 00:47:42,080 --> 00:47:44,640 Speaker 2: make a call. And sixty percent of your team thought 836 00:47:44,640 --> 00:47:48,279 Speaker 2: it was A, twenty percent B, and the rest thought both 837 00:47:48,280 --> 00:47:52,440 Speaker 2: were terrible. What do you do? You have to be 838 00:47:52,480 --> 00:47:55,439 Speaker 2: able to use all that information to go, team, I've 839 00:47:55,480 --> 00:47:58,239 Speaker 2: heard you, I've listened, this is what we're doing now. 840 00:47:58,320 --> 00:48:01,920 Speaker 2: At best, sixty percent of them might be a little bit 841 00:48:01,920 --> 00:48:05,399 Speaker 2: happy that you've chosen their option, but what about the rest?
842 00:48:05,640 --> 00:48:08,880 Speaker 2: And to your point, as a leader, your job is 843 00:48:08,920 --> 00:48:10,800 Speaker 2: not to go, well, I made the decision because of 844 00:48:10,840 --> 00:48:14,000 Speaker 2: the majority, I made the decision because it'll make most 845 00:48:14,000 --> 00:48:16,719 Speaker 2: people happy, or I'm not making a decision because I 846 00:48:16,800 --> 00:48:19,560 Speaker 2: need to make everyone happy and not everyone agrees. All 847 00:48:19,560 --> 00:48:23,080 Speaker 2: of that stuff is not leadership, in my view. Leadership 848 00:48:23,120 --> 00:48:26,360 Speaker 2: is saying, I've listened. I have to own this decision. 849 00:48:26,520 --> 00:48:29,319 Speaker 2: This is what we're doing, this is why, this is 850 00:48:29,360 --> 00:48:31,800 Speaker 2: how I've thought about it, and we're going to 851 00:48:31,800 --> 00:48:37,840 Speaker 2: take action. And you know, that's a real challenge for 852 00:48:37,920 --> 00:48:41,160 Speaker 2: lots of people in organizations, to be able to 853 00:48:41,320 --> 00:48:44,879 Speaker 2: say, hand on heart, I've made a call. 854 00:48:45,160 --> 00:48:47,080 Speaker 2: Now, to your point, if I do that but I've 855 00:48:47,120 --> 00:48:49,360 Speaker 2: not listened to anyone, it's the captain's call from, 856 00:48:49,480 --> 00:48:53,440 Speaker 2: you know, behind closed doors, then everyone in that 857 00:48:53,520 --> 00:48:57,200 Speaker 2: room's got the right to have the shits. 858 00:48:58,200 --> 00:48:59,879 Speaker 2: But if I've listened to you and I can sit 859 00:49:00,040 --> 00:49:01,400 Speaker 2: and come back to you and go, hey Paul, I 860 00:49:01,400 --> 00:49:04,000 Speaker 2: know this decision isn't the one you would have wanted. 861 00:49:04,160 --> 00:49:07,000 Speaker 2: I understand this, but look, this is what we 862 00:49:07,000 --> 00:49:08,839 Speaker 2: were facing, this is the sort of stuff, this is how 863 00:49:08,880 --> 00:49:11,200 Speaker 2: I thought through it, and I know this isn't exactly 864 00:49:11,239 --> 00:49:13,279 Speaker 2: what you wanted. So I wanted to talk you 865 00:49:13,360 --> 00:49:15,160 Speaker 2: through this because I want to make sure we're on 866 00:49:15,200 --> 00:49:16,880 Speaker 2: the same page with how we're going to move forward, 867 00:49:17,880 --> 00:49:20,600 Speaker 2: and it's really important that we've got your commitment to 868 00:49:20,920 --> 00:49:21,919 Speaker 2: take those steps now. 869 00:49:23,160 --> 00:49:24,719 Speaker 3: I love the way that you bring that up, because 870 00:49:24,760 --> 00:49:28,040 Speaker 3: as you were talking about all of this, I'm thinking 871 00:49:28,040 --> 00:49:31,800 Speaker 3: about the High Performance Teams model that I use that 872 00:49:31,960 --> 00:49:35,480 Speaker 3: was created by Pat Lencioni. And at the bottom 873 00:49:35,760 --> 00:49:40,879 Speaker 3: is trust, but it's a deeper trust based on vulnerability 874 00:49:40,880 --> 00:49:43,720 Speaker 3: and understanding, not predictive trust, which is all the stuff 875 00:49:43,760 --> 00:49:48,280 Speaker 3: that you've talked about. Then it's about engaging in productive conflict, 876 00:49:48,360 --> 00:49:51,040 Speaker 3: as he calls it. So it's that debate, hearing the other 877 00:49:51,160 --> 00:49:53,560 Speaker 3: sides of the story, which is very aligned to what 878 00:49:53,600 --> 00:49:58,640 Speaker 3: you're saying.
Then it's about commitment, right, because only when 879 00:49:58,680 --> 00:50:01,480 Speaker 3: you've had the productive conflict, you've had the conversations, you've 880 00:50:01,480 --> 00:50:04,880 Speaker 3: explored other ideas, can you expect people to commit, and 881 00:50:04,920 --> 00:50:07,480 Speaker 3: they have to commit even if they don't agree with it. 882 00:50:07,680 --> 00:50:11,239 Speaker 3: Everybody has to commit. And then the next step is 883 00:50:11,400 --> 00:50:14,920 Speaker 3: holding people accountable to what they've actually committed to and 884 00:50:14,960 --> 00:50:18,200 Speaker 3: to their plans, which I think is really critical. 885 00:50:18,239 --> 00:50:23,800 Speaker 3: So it's very, very aligned with that High Performance Teams model. 886 00:50:24,280 --> 00:50:28,040 Speaker 1: And I wanted to go back to 887 00:50:28,320 --> 00:50:32,320 Speaker 3: where we talked about that diversity of opinion, 888 00:50:32,440 --> 00:50:36,279 Speaker 3: because you know, the whole diversity thing, as you said, 889 00:50:36,360 --> 00:50:40,520 Speaker 3: is very polarizing, right? There's people who 890 00:50:40,520 --> 00:50:42,200 Speaker 3: think we need more of it, there's people who think 891 00:50:42,200 --> 00:50:44,000 Speaker 3: we've gone too far and we need less of it. 892 00:50:44,440 --> 00:50:47,320 Speaker 3: So how do you get that balance 893 00:50:47,800 --> 00:50:51,360 Speaker 3: as a leader? Let's think particularly, if you're an individual, 894 00:50:52,000 --> 00:50:55,440 Speaker 3: you've got a team, how much 895 00:50:55,280 --> 00:50:59,239 Speaker 1: diversity do we need? Like, where does it stop? And 896 00:50:59,239 --> 00:51:00,600 Speaker 1: and then, 897 00:51:02,080 --> 00:51:06,120 Speaker 3: is it, you know, are you looking to pick people 898 00:51:06,600 --> 00:51:11,160 Speaker 3: who have an opinion but aren't emotionally completely welded to 899 00:51:11,200 --> 00:51:14,560 Speaker 3: that opinion, or are there any guidelines here about how 900 00:51:14,640 --> 00:51:19,200 Speaker 3: to make this process efficient and not just get completely 901 00:51:19,320 --> 00:51:22,680 Speaker 3: stuck because there's fifteen points of view that people are 902 00:51:22,760 --> 00:51:24,279 Speaker 3: just like, this is the way it has to be? 903 00:51:25,040 --> 00:51:30,279 Speaker 2: How do we lock that in? Let's start by going, wow, 904 00:51:30,320 --> 00:51:34,600 Speaker 2: it's gray, and it's messy, and the complexity explodes 905 00:51:34,640 --> 00:51:35,920 Speaker 2: as soon as you get a bunch of people in 906 00:51:35,920 --> 00:51:38,279 Speaker 2: a room. Okay, so let's accept that that's just the 907 00:51:38,400 --> 00:51:41,759 Speaker 2: nature of the beast here, we're stepping into it. Therefore, 908 00:51:42,239 --> 00:51:46,000 Speaker 2: the wrong way to go about diversity and inclusion is by 909 00:51:46,719 --> 00:51:51,360 Speaker 2: metricating it, with specific processes that say that there is 910 00:51:51,400 --> 00:51:55,440 Speaker 2: a simple and clear answer to this huge, messy, you know, 911 00:51:55,680 --> 00:52:00,960 Speaker 2: completely complex space.
So I think the answer to your 912 00:52:01,000 --> 00:52:03,200 Speaker 2: question is not, well, you put a quota in place, 913 00:52:03,239 --> 00:52:04,880 Speaker 2: so you do this, you do that. And that's not 914 00:52:04,920 --> 00:52:08,920 Speaker 2: to say that those can't be useful tools. But my 915 00:52:09,120 --> 00:52:11,680 Speaker 2: real question, before people even look at this whole what 916 00:52:12,680 --> 00:52:15,080 Speaker 2: diversity do we need, is what are you trying to achieve? 917 00:52:16,200 --> 00:52:18,920 Speaker 2: What's your outcome here? What are you trying to do? So, 918 00:52:18,960 --> 00:52:21,239 Speaker 2: where I work with mining companies like 919 00:52:21,560 --> 00:52:24,520 Speaker 2: BHP and Fortescue, they're trying to bring more 920 00:52:24,520 --> 00:52:28,160 Speaker 2: people in, trying to bring in more resources, and they need more 921 00:52:28,200 --> 00:52:32,000 Speaker 2: female workers, because otherwise fifty percent of the population is 922 00:52:32,080 --> 00:52:35,440 Speaker 2: just ruled out because of their gender. Well, that's stupid. 923 00:52:36,280 --> 00:52:39,479 Speaker 2: They've also found that women are often much better 924 00:52:39,480 --> 00:52:44,759 Speaker 2: at driving machinery, because they don't push it too hard. Exactly. 925 00:52:45,560 --> 00:52:49,160 Speaker 2: Put that aside. So how do you navigate through it? 926 00:52:49,200 --> 00:52:51,800 Speaker 2: I think the first thing is to go, it's going 927 00:52:51,840 --> 00:52:54,880 Speaker 2: to be hard. Here's the paradox we have to grapple with. 928 00:52:55,239 --> 00:52:57,880 Speaker 2: We like working with people who are like us. 929 00:52:58,040 --> 00:53:01,480 Speaker 2: It's easy. We finish each other's sentences, we agree, it 930 00:53:01,520 --> 00:53:04,120 Speaker 2: feels nice. We need to work with people who are 931 00:53:04,160 --> 00:53:07,560 Speaker 2: not like us, but it's harder, it takes work, and 932 00:53:07,600 --> 00:53:11,400 Speaker 2: it takes time to build trust. So how do you 933 00:53:11,440 --> 00:53:14,440 Speaker 2: do that? Well, I think this is where you have 934 00:53:14,520 --> 00:53:19,080 Speaker 2: to be inclusive of a whole bunch of factors like gender, race, 935 00:53:19,120 --> 00:53:21,640 Speaker 2: all that stuff. You can be very inclusive of those things, 936 00:53:21,960 --> 00:53:26,880 Speaker 2: but exclusive on your values. This is what we stand for, 937 00:53:27,239 --> 00:53:30,160 Speaker 2: this is our mission, this is what we believe. And 938 00:53:30,200 --> 00:53:35,120 Speaker 2: I don't mean values like integrity, honesty, communication, right, I mean, 939 00:53:36,120 --> 00:53:38,600 Speaker 2: let's really bring these down a few levels.
I wrote 940 00:53:38,600 --> 00:53:40,919 Speaker 2: another book on this before, but let's bring these 941 00:53:41,000 --> 00:53:44,080 Speaker 2: down a few levels to the we-will-always and 942 00:53:44,120 --> 00:53:47,759 Speaker 2: we-will-never type statements of how we will show up, 943 00:53:47,800 --> 00:53:50,200 Speaker 2: what we believe, what we're trying to achieve, how we'll 944 00:53:50,239 --> 00:53:53,919 Speaker 2: treat each other. Because it's in that context of those 945 00:53:54,000 --> 00:53:57,280 Speaker 2: values that we can interact in a way that creates trust, 946 00:53:57,360 --> 00:54:01,640 Speaker 2: creates certainty, creates that sense that we're aligned, and then 947 00:54:01,680 --> 00:54:05,040 Speaker 2: you can have lots of diversity in argument, but we're 948 00:54:05,080 --> 00:54:07,840 Speaker 2: on the same page because we're still really committed to 949 00:54:07,880 --> 00:54:08,520 Speaker 2: the mission we're on. 950 00:54:08,960 --> 00:54:10,400 Speaker 1: Yeah, I like it. I like it. 951 00:54:10,440 --> 00:54:16,680 Speaker 3: And that requires that businesses have proper values, not the 952 00:54:16,680 --> 00:54:20,360 Speaker 3: fluffy bullshit that you and I have seen so many times. 953 00:54:20,640 --> 00:54:24,320 Speaker 3: I'm always like, your values have to be, or should 954 00:54:24,480 --> 00:54:28,040 Speaker 3: be, behaviorally based, where you, Daniel, can say to me, Paul, 955 00:54:28,600 --> 00:54:31,200 Speaker 3: you're not behaving in accordance with this value. They should 956 00:54:31,200 --> 00:54:35,319 Speaker 3: be there as a guide or a North 957 00:54:35,360 --> 00:54:37,120 Speaker 3: Star for making decisions. 958 00:54:38,360 --> 00:54:38,960 Speaker 1: I like that. 959 00:54:39,080 --> 00:54:43,360 Speaker 3: So we want the diversity of opinion and background and 960 00:54:43,440 --> 00:54:46,680 Speaker 3: race and all of that stuff, but we're in the 961 00:54:46,680 --> 00:54:49,319 Speaker 3: same company that has the same values, and it's these 962 00:54:49,480 --> 00:54:52,319 Speaker 3: values that we're using to make these decisions. 963 00:54:52,800 --> 00:54:56,680 Speaker 2: Absolutely. And so, to your point where you asked earlier, 964 00:54:56,800 --> 00:54:59,640 Speaker 2: you know, if we're making decisions and people aren't agreeing 965 00:54:59,719 --> 00:55:02,400 Speaker 2: and they don't want to commit, you know, we've said, 966 00:55:03,200 --> 00:55:07,840 Speaker 2: maybe we're an organization that says, to drive the 967 00:55:07,880 --> 00:55:11,200 Speaker 2: amount of energy and electricity we need to be functional in 968 00:55:11,239 --> 00:55:15,920 Speaker 2: the next twenty years, we can't just use green energy, right? 969 00:55:15,920 --> 00:55:18,560 Speaker 2: We've looked at the numbers. It's just not going to work. Now, 970 00:55:18,600 --> 00:55:22,520 Speaker 2: someone in my team might go, I absolutely vehemently disagree 971 00:55:22,560 --> 00:55:27,840 Speaker 2: with that, and we might say, well, okay, I understand it. 972 00:55:27,920 --> 00:55:30,040 Speaker 2: I respect your decision. I can see how that would 973 00:55:30,040 --> 00:55:32,120 Speaker 2: be a problem. But if that's going to be an 974 00:55:32,160 --> 00:55:35,919 Speaker 2: issue for you, then let's talk about how we part 975 00:55:35,960 --> 00:55:40,239 Speaker 2: ways respectfully and graciously and carefully. You know,
I want to 976 00:55:40,280 --> 00:55:42,040 Speaker 2: be kind to you, I want to support you in that, 977 00:55:42,320 --> 00:55:44,319 Speaker 2: but what I don't want is for you to sit here 978 00:55:44,360 --> 00:55:47,480 Speaker 2: with this cognitive dissonance every day, that you're in a 979 00:55:47,520 --> 00:55:49,680 Speaker 2: place that you don't think is doing the right thing, 980 00:55:50,239 --> 00:55:52,319 Speaker 2: because that's not good for us and it's not good 981 00:55:52,320 --> 00:55:56,439 Speaker 2: for you. And you know, sometimes people think empathy is being 982 00:55:56,520 --> 00:56:00,799 Speaker 2: nice to people. It's not. It's being kind, and you know, 983 00:56:00,960 --> 00:56:04,719 Speaker 2: kindness is saying, hey Paul, we need this standard from 984 00:56:04,760 --> 00:56:07,480 Speaker 2: you and you're not making it. So maybe this isn't 985 00:56:07,480 --> 00:56:09,680 Speaker 2: the right spot, maybe this isn't the right job, maybe 986 00:56:09,719 --> 00:56:11,960 Speaker 2: this isn't the right role. Maybe we need to do 987 00:56:11,960 --> 00:56:14,440 Speaker 2: a lot more training. But this gap is a big problem, 988 00:56:14,840 --> 00:56:16,680 Speaker 2: and I want to support you and help you be 989 00:56:16,719 --> 00:56:19,319 Speaker 2: successful and happy. If this is not going to happen here, 990 00:56:19,600 --> 00:56:22,839 Speaker 2: what can we do? That's empathy for you, and that's 991 00:56:22,840 --> 00:56:26,000 Speaker 2: caring about you, not coddling you and going, oh well, 992 00:56:26,760 --> 00:56:28,920 Speaker 2: let's just give it a few more months, while everyone 993 00:56:28,920 --> 00:56:30,840 Speaker 2: else in the team is pissed off because you're not 994 00:56:30,920 --> 00:56:33,080 Speaker 2: pulling your weight. All that stuff goes on, right, and 995 00:56:33,080 --> 00:56:36,920 Speaker 2: we just destroy people's confidence in their future career because 996 00:56:36,920 --> 00:56:41,960 Speaker 2: we don't care enough, or we're not brave 997 00:56:42,120 --> 00:56:45,880 Speaker 2: enough to hold that uncomfortable conversation and sit in that 998 00:56:45,960 --> 00:56:47,880 Speaker 2: emotion and be real with someone. 999 00:56:48,640 --> 00:56:51,719 Speaker 3: Yeah, look, I completely agree on that, Daniel. This 1000 00:56:51,760 --> 00:56:57,480 Speaker 3: has been excellent. I really like the idea of empathy 1001 00:56:57,520 --> 00:57:00,880 Speaker 3: being based on that plausible narrative and the process, the 1002 00:57:00,920 --> 00:57:04,560 Speaker 3: four-step process that you have. For anybody who wants 1003 00:57:04,600 --> 00:57:08,480 Speaker 3: to have a better connection with other humans, unless you 1004 00:57:08,560 --> 00:57:10,480 Speaker 3: want to go and live on an island by yourself, 1005 00:57:11,440 --> 00:57:14,800 Speaker 3: The Empathy Gap, I think, would be a very useful resource. 1006 00:57:15,800 --> 00:57:18,680 Speaker 3: Where can people go to find out more about you, 1007 00:57:18,840 --> 00:57:21,080 Speaker 3: buy the book, book you as a speaker, if they want to 1008 00:57:21,080 --> 00:57:23,360 Speaker 3: bring you in to consult in their business? What's the best 1009 00:57:23,360 --> 00:57:24,120 Speaker 3: place to send them? 1010 00:57:24,600 --> 00:57:28,280 Speaker 2: Yeah, look, Daniel Murray dot au is the easiest handle 1011 00:57:28,360 --> 00:57:31,600 Speaker 2: to find me. That's my website.
You can come and 1012 00:57:31,600 --> 00:57:34,640 Speaker 2: connect with me. I'm on LinkedIn all the time, as 1013 00:57:34,640 --> 00:57:38,840 Speaker 2: many of us are these days, and Empathic Consulting dot 1014 00:57:38,880 --> 00:57:42,680 Speaker 2: com is our business site as well. But any of 1015 00:57:42,680 --> 00:57:46,320 Speaker 2: those paths will lead through to me. And yeah, the 1016 00:57:46,360 --> 00:57:49,480 Speaker 2: book's hopefully everywhere now, we're working hard to get it everywhere 1017 00:57:49,520 --> 00:57:55,840 Speaker 2: we can, and I always, always welcome those conversations. You know, Paul, 1018 00:57:55,880 --> 00:57:58,120 Speaker 2: I've really enjoyed this. We could talk for hours. 1019 00:57:58,160 --> 00:58:01,320 Speaker 2: But I had a person who messaged me on 1020 00:58:01,360 --> 00:58:03,920 Speaker 2: LinkedIn and they really challenged me. They said, I think this 1021 00:58:04,000 --> 00:58:06,520 Speaker 2: is bullshit. You know, I think empathy and sympathy are 1022 00:58:06,560 --> 00:58:09,560 Speaker 2: largely the same thing, and it's a weakness. And I 1023 00:58:09,720 --> 00:58:13,080 Speaker 2: really enjoyed that conversation because it gave me insight into 1024 00:58:13,600 --> 00:58:16,040 Speaker 2: what's really going on in the world around us. And 1025 00:58:16,160 --> 00:58:19,320 Speaker 2: I'd encourage anyone, if you've heard this, listened to 1026 00:58:19,320 --> 00:58:22,360 Speaker 2: this and gone, yeah, but it's not quite right, you know, 1027 00:58:22,400 --> 00:58:25,160 Speaker 2: I want to push back: reach out. I'm not 1028 00:58:25,200 --> 00:58:27,960 Speaker 2: going to tell you you're wrong, and you know, I 1029 00:58:28,000 --> 00:58:29,760 Speaker 2: don't want you to blow smoke up my arse about it. 1030 00:58:29,960 --> 00:58:32,720 Speaker 2: I want us to really talk about it, because the 1031 00:58:33,160 --> 00:58:35,160 Speaker 2: future is going to be shaped by all of us 1032 00:58:35,720 --> 00:58:38,320 Speaker 2: if we work together, or it's going to be torn 1033 00:58:38,360 --> 00:58:40,480 Speaker 2: apart if we just want to be right and 1034 00:58:41,000 --> 00:58:42,000 Speaker 2: throw sticks at each other. 1035 00:58:42,440 --> 00:58:44,360 Speaker 1: Yeah. Very cool. Excellent, Daniel. 1036 00:58:44,680 --> 00:58:47,600 Speaker 3: Thank you, and I wish you all the 1037 00:58:47,680 --> 00:58:48,680 Speaker 3: success with your book. 1038 00:58:49,320 --> 00:58:56,440 Speaker 2: Cheers mate, Thank you.