Speaker 1: G'day, team. I hope you're well. Those of you who know me pretty well know that I share a lot of stuff, obviously through the podcast, but also on social media. I pump out a lot of stuff on Instagram. Sometimes it's a thought, one thought, one idea, or a couple. Sometimes it's a story. Sometimes it's serious, sometimes it's silly, sometimes it's funny, and sometimes it's philosophical. And the way it kind of works with me is I don't really have a strategy or a plan. I don't get up every day and think, what did I share the last ten days? What topics have I covered recently? What does the world need to hear? It's very organic. And what happens quite often is I'll have a thought or an idea, and I'll just come up with one or two sentences that kind of capture that idea or that message, and it might be in a funny way or a cheeky way or a sweary way.
Speaker 1: And quite often those one-, two-, three- or four-sentence posts have the most impact, get the most traction, get the most eyes, get the most attention, and so on. So there's a real attraction, I guess, to writing stuff that's short, cheeky, funny and impactful in that way. But there's another part of me that thinks deeply, very deeply. Some people would say too deeply, and that's okay. And for better or worse, my mind really is always thinking about all the stuff we talk about, you know: other people, human behavior, us as a species and as a world, as a planet full of eight billion or so human beings, the dichotomies and the division and the love and the hate. And sometimes I'll write something deep, and it'll turn into not a few words but a paragraph, and then two paragraphs, and then I've written this thing that's probably too long to share anywhere. What interests me is that some of the things I can write in two or three minutes seem to get the most attention and traction, and to me are, you know, not that impressive. If I'm going to be honest, I don't think they're that impressive. I don't think they're that clever or funny. But the things I'm talking about, the things that are often cheeky or sweary or inappropriate, get huge reach. Many, many of my whiteboards have got well over a million views. One of them's got somewhere close to twenty million, and when I read it, I'm like, yeah, it's kind of interesting and a bit funny, but it's not that fucking good, right? And this, I guess, is the dichotomy of trying to be real and be authentic and share your thoughts and ideas, but also, you know, have an audience and share content that people want to read, that is relevant and connects, that's not going to take people an hour to read, that's not overly complicated, that's all of the things that people want.
Speaker 1: Anyway, with that long-winded intro: I started just thinking out loud, writing some stuff, well, thinking to myself, I guess, with the idea of potentially sharing it. I just wrote a few things, and that became a paragraph, and then that became a bigger paragraph, and my intention was to share it on social media. I could definitely still share it as a post, but whether or not anyone would read it, I don't know. And to be honest, I don't even know if sharing it here is a great idea. I think some of you will resonate with what I'm about to share with you, and some of you won't. My intention is not to change the world or push anyone in a direction, or to highlight any specific cause, but rather just to think deeply and to try to turn down the emotion. I understand the emotion. I understand the anger. I understand the outrage. I understand all of it, and I felt all of it as well, and I still feel those things. But I am fascinated with the why behind it all.
Speaker 1: And I just think that sometimes it's worth trying to understand the things that we hate, and I'll explain why in a moment. So what I'm about to share with you, I'm going to read. I don't often read, you know. I just riff and freestyle, as I'm doing right now. I just talk off the cuff normally, which doesn't always produce a good outcome, as you know. But I'm going to read for the next few minutes. I'm going to read you what I wrote, because I think some of you will find it maybe helpful, maybe a little bit enlightening, or maybe worth exploring a little further. Some of you won't, and that's okay, of course. Alrighty, so here we go. Like most of you, like most Aussies, I was shocked, saddened, and if I'm being honest, I was angry about what transpired at Bondi on Sunday. And I'm also well aware that my shock and my sadness and anger doesn't change anything. It doesn't fix anything, it doesn't help anyone, and I'm well aware it's not a solution to anything. And typically I don't really like to, I don't feel compelled to,
Speaker 1: I tend not to comment on such significant public events or tragedies, because in the grand scheme of things, really, who needs to hear what I think? Honestly. And in the complete overwhelm of media and social media commentary, the outcry, the newspapers, the stuff on repeat on TV, I don't think I'm important in that. I know I'm not important in that. And maybe another person just sharing their feelings isn't particularly helpful. So rather than reiterate and revisit the obvious tragedy, devastation and sadness of what happened, I thought I would think out loud about what's actually happening underneath the behavior, what's happening underneath the violence. What's psychologically and sociologically and culturally and emotionally happening behind that behavior, that violence, and that terror? In simple terms, I want to explore the unseen behind the scene. I want to explore the psychology that drives the action, the thinking that drives the behavior.
Speaker 1: But before I open this door, I want to say something really clearly, and that is: to understand hatred, and to understand evil and violence, is not to condone it. I'm of the opinion that we need to understand the minds and the motivation and the beliefs and the ideology of people who perpetrate horrible acts, who do horrible things, who hurt people, so that we can better prepare, combat and protect ourselves and others, protect the people we love. To not understand the "enemy", in inverted commas, is an overwhelming psychological and strategic disadvantage. We need to understand them, and if we don't, we are way behind. As I said, it's a disadvantage, and not only in war but also in life, the life, or the lives, that we saw torn apart this past Sunday. In fact, it's in our interest, and the interest of everyone that we love and want to protect, to understand what we hate and to understand those who would seek to do us harm.
Speaker 1: How can we overcome what we don't understand? How can we anticipate potential actions and attacks if we don't seek first to understand? And to begin to understand, it's really important to realize that most people who do evil things don't believe they are evil, or evildoers. As uncomfortable as that makes us, and as deluded and indoctrinated as they may be, most terrorists absolutely believe they are doing the right thing. They believe they're doing the good thing. Bad people doing bad things don't think they are bad people doing bad things. They can't see that. They don't have that level of self-awareness. They live in that confirmation bias, they live in that echo chamber of belief and ideology that says that you and I, in this case, are the enemy. Consider a teenager. Think about this. Think about a teenager walking into a marketplace strapped with a vest full of explosives, ready to kill himself because of a belief, a way of thinking, that has been programmed into him since he could walk.
Speaker 1: Not only is the killing of innocents not evil in his mind, it's noble, it's courageous, it's a higher calling. In his group, with his indoctrination, he's actually becoming a martyr. He'll be celebrated for his sacrifice and courage by his family and by those in his group. And of course, to you and I, this thinking is fucking horrible, and it is fucking horrible. It is pure evil. But to them, and to him, it is not. And this is where we can seek to understand, so that we can perhaps prepare, we can perhaps anticipate. And this doesn't just correlate to, you know, big, horrible, devastating events like we saw on the weekend. This correlates and relates to life. You need to understand, even in the context of what I'm talking about.
Speaker 1: I believe it's important for you and I to understand the sociopaths that surround us, the ones we work with, the ones we know, the ones we need to interact with, so that we can prepare ourselves, so that we can protect ourselves and others from what might be the consequences of interacting with or being around that person. And what sits beneath this kind of violence, the violence we saw on Sunday? It is rarely just an impulse, or momentary madness, or a single bad decision. It's almost always the end point of a long psychological conveyor belt: identity fused with ideology, belonging tied to belief, meaning outsourced to a cause, and moral responsibility handed over to a narrative. I've spoken about this before, but when a person's sense of self, when their identity, when their version of who they are becomes inseparable from an ideology, a tribe, or, in some religions, a promised future of eternal bliss, then critical thinking no longer plays a role in the decision-making and action-taking process. Nuance dies first. Being able to make nuanced decisions in the middle of a programmed mind doesn't happen.
Speaker 1: Nuance dies, then empathy dies, then the humanity of the other person dies. And at that point, violence isn't experienced by them as violence. It's experienced by them as obedience and necessity and duty, and even destiny. And this is how ordinary human psychology, under the right, or perhaps more accurately wrong, conditions can produce extraordinary cruelty. Not because the brain suddenly stops working, but because it starts working exactly the way it was programmed to. The uncomfortable truth is that the psychological ingredients that make this possible are not exotic or rare. They're human. Our brains crave certainty and belonging, meaning and identity. And when fear, grievance, humiliation or perceived injustice are layered on top of those needs, and then exploited by simplistic narratives that divide the world into good and evil, us and them, heroes and enemies, then that slope becomes dangerously slippery. Most of us will never commit violence. But the mechanisms of dehumanization, tribal thinking and moral outsourcing exist on a spectrum, and they don't begin with bombs or bullets. They begin with stories we stop questioning, people we stop humanizing, and beliefs we stop holding lightly. Now, having said all of that, I fully acknowledge that these are just words. These are just my words, these are just my thoughts, and I'm just a flawed bloke trying to figure stuff out, like you. I'm just a guy trying to understand human behavior and how the world works, and hopefully to do some good and to share some love and to encourage some of you to think critically. I want to support you along the way, and maybe I can shine a bit of light in the darkness of what's going on at the moment. Love, Harps.