Speaker 1: Hi there, team. Welcome to another installment of the Bloody You Project with Bloody David Gillespie. Bloody Tiffany Cook is not here; she's on sabbatical. Do you know where she is?
Speaker 2: No idea.
Speaker 1: She's in India, all right? And what is she doing there, exactly? Well, she's gone for a bit of sightseeing. She's gone with a group. She's staying with a bunch of... I should have paid more attention.
Speaker 2: You've got no idea what she's doing.
Speaker 1: No, no, no. She's spending some time, like a retreat. And I know that... I saw a post today. She just finished spending, I think, two nights in a monastery or something, so she's just out and about seeing another very different part of the world. And if her reports are anything to go by, she seems to be enjoying it. But she'll be back next time, so you can ask her about India. But yeah, that's where she is. Have you ever been there?
Speaker 2: Only very briefly, as part of a business thing I did. I was there for maybe two or three days, in Delhi I'm thinking. So no, I haven't really been to India.
Speaker 1: Well, you have, but you really didn't see anything.
Speaker 2: No. No.
Speaker 1: I went to Fiji last year for eighteen hours because I had to do a gig, and I literally got in, slept, got up. I wasn't even there a day. Actually, it would have been, yeah, probably sixteen hours or something, to do a forty-five-minute keynote.
Speaker 2: The lifestyles of the rich and famous.
Speaker 1: Hardly, hardly.
Speaker 2: I can top that one. I used to commute regularly to San Francisco in my tech startup days. And by commute, I mean I would fly there every fortnight: spend a fortnight there, a fortnight back here.
There was one time I had to come back to Brisbane for a wedding on a Saturday night and then be back the following morning. I think I was actually in town maybe less than ten hours between flights back from the US, and each flight is a twelve-hour flight, so that was a bit hectic.
Speaker 1: I think it's more than that, isn't it? Even then, I think it's fourteen.
Speaker 2: Yeah, I'm probably measuring the bit from Sydney to San Francisco, which was the bit where I got to sleep.
Speaker 1: Yeah, well, you're up the pointy end, right, so you get there quicker than the rest of us.
Speaker 2: Oh, that's right, that's exactly right. I've talked to you about that analogy with the education system at some time in the past, haven't I? It's like us all being on a plane, and the plane gets to London at the same time. Some of us pay a lot more for our seats and have a nicer ride, but at the end of the day, we still all get to London at the same time.
Speaker 1: Oh yeah. I'm not posh... well, clearly I'm not posh, but if I fly internationally in my old age, I tend to fly business if I can. I'm not trying to sound like a snob. I did a gig in Ireland a few years ago, and I'd never been there until that gig, and it was going to cost them too much, and I said, fly me business and put me up somewhere nice, and then don't pay me for the gig. And they're like, okay. So I got a free trip to Ireland and went business, and my mum's mum was born in Ireland, so it was just like a free little holiday. Have you been to Ireland?
Speaker 2: Yeah, once again very briefly, to inspect a software facility. So I've seen an airport and the inside of an office building.
Speaker 1: Wow, you must have done a lot of travel. I bet you're glad those days are over.
Speaker 2: Very glad. Yes, it was a big reason why I stopped doing that.
Speaker 1: Now, of course, we are not a travel show.
Speaker 2: We aren't.
Speaker 1: No, we're not. You wrote an article which I applauded on LinkedIn, and I'm sure it's up on your website or somewhere else. But if you're not on LinkedIn, everyone, get on LinkedIn, and if you don't follow David Gillespie, follow him. He wrote an article yesterday called Beyond First Impressions: Escaping the Psychopath's Charm. And as much as you don't want to hear this, it kind of intersects with my stuff a little bit, not the psychopathy stuff, but reading people and first impressions and all of that. I found it fascinating, and you were talking essentially about how what you see up front is often not what you get further down the line.
Speaker 2: It came from... you know, I look around all the time for interesting bits and pieces, and this one sort of came through my feed yesterday morning, because I'm trying to put something out every day now, just to keep me in practice. Yesterday morning there was this experiment out of Duke University in the United States. And look, you'd be familiar with these kinds of experiments because you're in the area of psychology, and there are a lot of BS experiments done in psychology, but this one struck me as a little bit interesting. What they did was they asked people to value sort of a garage-sale box of stuff. So they'd give people a closed box of stuff, and then they'd say, right, open the box up and give us a value for the things that you find in it. And I think it was maybe five or ten boxes they'd do that with and ask them to value. Now, of course, being a psychology experiment, there's a trick, a hidden trick.
And the hidden trick here is that all the boxes have exactly the same value, but they don't tell them that. The boxes are arranged differently, though. In some of the boxes, the items at the top are much more valuable than the items at the bottom. So you might find a necklace or something at the top of one box, and someone's old shoes or something at the bottom of it. But the value of the boxes was all approximately equal. What they found is that people always valued the boxes where they found the more valuable things first much, much higher than the ones where they found the more valuable things last, or in the middle. And what they said was, okay, this proves something that's been proven quite a bit in psychology, which is that first impressions count a lot. Our first impression of something, and in this case it's the first impression of the box, is that this is a more valuable box, because the first thing I saw was more valuable. Even if I eventually got to something just as valuable in another box, my first impression had a lot more weight. And so I considered that other box, where I found someone's old sneakers at the top, to be less valuable than the one where I found the gold chain at the top. So this confirmed previous research, and that in itself is interesting. The twist in this research is what they did next. They got that first round of valuations, just like everyone else does. But then there was another group, a control group if you like, and they said to those people: don't give us a value now; we'd like you to come back tomorrow. Go home, think about it, come back tomorrow and tell us what you think the value of the boxes is. And what invariably happened is that when people came back the next day and gave their assessment of the value, they were much more likely to get it right.
So they were much more likely to say, you know what, these boxes are all about the same value, and that was the correct answer. So if you gave people time, if you gave them a night to sleep on it and do whatever else (I mean, who knows, they might have googled things, done some valuations themselves, whatever), it didn't matter. The point was that when they came back twenty-four hours later, they got the right answer. They'd had time to properly digest it. And this was an interesting study, because if you translate that into relationships or workplaces or anything else, and start to think about the evolutionary reason why we operate on first impressions, that to me is where it starts to become really interesting. I always find it interesting when you ask the question why. Why would humans have evolved to favour first impressions? Because this is not something unique to people who do psychology studies. As you would know, this is everybody, all the time. We judge things constantly. We do judge a book by its cover. We do make a snap judgment about whether we want to pick the book up and read it, whether we want to watch the television show, whether we even want to listen to Craig or his guest, within the first few seconds of hearing about it. And the reason we do that is because our brains and bodies are optimized to save energy, and one of the most expensive things we can do is think. It is an absolute energy hog. Twenty-five percent of the energy we use is spent thinking, and that's for an organ that occupies about two percent of our body mass. So it's a massive energy hog. Now, in these days of food everywhere, all the time, far too much of it, we probably don't think too much about energy scarcity. But imagine you needed to find the food to supply that twenty-five percent, and food was really scarce, as it is in any hunter-gatherer society.
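[Aside: a quick sanity check on those round figures, taking the quoted 25 percent of energy use and 2 percent of body mass at face value rather than as precise measurements: 0.25 / 0.02 = 12.5, so gram for gram the brain would be burning roughly twelve and a half times the whole-body average, which is the "energy hog" claim in numbers.]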
Speaker 2: I don't know if you do... do you watch that show Alone that's been on SBS, where people survive out in the wilderness? Think of those people and how keen they would be not to waste energy. When you're getting one fish to eat every three days, if you're lucky, you're not wasting energy on anything. And our bodies are optimized that way; our brains are optimized that way. So we take lots of shortcuts, and one of the shortcuts we take is first impressions. Evolution has told us that most of the time we're right. More often than not, if we make a judgment in the first few seconds, we're right. And that was the rule being applied by these people when they looked at the boxes in this garage-sale experiment. The first thing they saw was valuable. First impression: this is a valuable box. No more thinking required; look at all that energy I've saved. But if you force people to expend the energy, if you force people to actually think about it, you get a different result. Now, why I say this is important with psychopaths is that psychopaths rely on us making judgments on first impressions, because everything about a psychopath is fake. They lie about everything, and they spend their entire lives working very hard on a first impression that sells you. Most people, when they first encounter a psychopath, will tell you that they are the most charming, fabulous person they have ever met. And that's because they are very, very good at reflecting back to you exactly what you want to see. They are masters of first impressions. And if we only operate on a first impression, we will get sucked in every single time, both as an employer and in relationships. Operate on a first impression and you are in danger.
So I guess the tip out of all of this is: take the time. Get the extra night into the process, get the twenty-four hours, get the backup research. You can and will probably make the correct decision if you put the first impression aside and say, you know what, I'm not making a decision about this until tomorrow. Give your brain time to digest everything it saw. If it's a job application, give yourself time to follow a process of checking the information you're being given, because it's too late after you've handed over the job or entered into a relationship with this person. So that's the lesson out of this, I think. Yes, we do work on first impressions, but there's a really easy cure: time. Give yourself time to spend that cognitive power, to spend that thinking on whether your first impression is right. Had you asked those people with the garage boxes on the first day which was the most valuable box, they would have said what everyone else said: the one I saw with the most valuable thing at the top. But don't get them to make a decision there and then; ask them the next day and you get a very different answer. And that's a rule we should be thinking about in our whole lives: give yourself time, when encountering new information, to decide whether your first impression is correct.
Speaker 1: Yeah, one of the things I advise people is to trust slowly and respect slowly. And I don't say that because I want them to think everybody is a potential enemy. I just say, you know, how many people do you meet where initially they're exactly what you said, charismatic, charming, affable, likable, articulate and all of that, and then down the track they're nothing like that, or their energy is totally different?
And it's quite a significant percentage of people. Even people who say to me, when we first started going out it was like this, and then three months passed, and then this curtain got drawn back on the actual person, and it had been, you know, a fucking charade for three months, or six months, or one week sometimes. But did you ever read that book Blink by Malcolm Gladwell?
Speaker 2: Yeah, I did. Yes, the whole book about first impressions. So that part of this research is really old news. That book and lots of others, plenty of research, confirm that yes, we do in fact work on first impressions. If anyone needed proof of that, there's a mountain of research that proves we do. The interesting part of this study is how you fix that. And maybe it's a statement of the bleeding obvious, give yourself time, but this experiment actually proves it.
Speaker 1: So, for want of a better term, when are snap judgments okay? Because we don't always have the luxury of time, right?
Speaker 2: When you're starving to death on Alone, then snap judgments are a pretty good idea, because you want to conserve energy, and the cost of expending energy thinking about things is real, and maybe more than the cost of getting it wrong if you make a snap judgment. And you might say, well, where's your evidence for that, Gillespie? Well, the evidence is that evolution favors it. We wouldn't have gotten to this point with this built-in facility if evolution didn't say that people who tend to make snap judgments tend to survive better than people who don't.
Speaker 1: Yes, yes. And I mean, there are a lot of instances where we have to, especially... not so much in twenty twenty-four, but still, I guess, sociologically in a way.
But evolutionarily, there were so many dangerous situations and so many predators, and so much inherent risk in the world that we, well, not me and you, but our ancestors lived in, that you had to be constantly vigilant and constantly, or at least regularly, making decisions quickly, because it was a lot more life and death back then.
Speaker 2: It's how we developed. I mean, I know you don't like me going off topic on these things, but it's interesting how we developed. The thing that is essentially a sixth sense about what other humans are thinking uses that exact capacity. In our evolutionary environment, there's the quick and the dead, and there's that old joke about running from a bear: I don't have to run faster than the bear, I just have to run faster than you.
Speaker 1: Yeah, that's right.
Speaker 2: And that applies in an evolutionary sense, which is that we developed a facility to communicate with each other about danger which did not involve having to think about communicating. We developed the ability to read each other's movements, expressions, even tone of voice, and understand danger without the word danger actually being said. And that is done using a facility in the human brain called mirror neurons: essentially, we have neurons that mimic the things they see in others at very high speed, effectively instantaneous mirroring. Even before we have time to think about what's happening, our brain has already figured out what's happening from what it's read from other humans and the way they're interacting. Now, when you apply that in modern society, what it means is that we can read each other's emotions on autopilot. It's almost like we are directly connected to another person. When we see another person experience pain, we experience it too. We feel it, and we call that empathy.
But it's 322 00:19:16,880 --> 00:19:24,640 Speaker 2: actually a really powerful mechanism that was evolved to protect 323 00:19:24,680 --> 00:19:28,640 Speaker 2: us from danger, which is we know about something being 324 00:19:28,720 --> 00:19:31,560 Speaker 2: dangerous before we can even think about it being dangerous, 325 00:19:31,600 --> 00:19:34,160 Speaker 2: before even the person that it's happening to can think 326 00:19:34,200 --> 00:19:38,040 Speaker 2: about it being dangerous. It's all happening in milliseconds and 327 00:19:38,160 --> 00:19:41,880 Speaker 2: protecting us, and we're running and then five minutes later, 328 00:19:41,960 --> 00:19:44,439 Speaker 2: as we're running along we start to think about what 329 00:19:44,520 --> 00:19:47,960 Speaker 2: exactly am I running from. I don't know, but my 330 00:19:48,040 --> 00:19:53,160 Speaker 2: brain said run. And that facility is the thing that 331 00:19:53,320 --> 00:19:59,399 Speaker 2: psychopaths lack. By the way, they cannot they cannot read 332 00:19:59,600 --> 00:20:03,880 Speaker 2: another person's emotions on autopilot like that. They actually have 333 00:20:03,960 --> 00:20:07,520 Speaker 2: to do the thinking bit, and that delay while they 334 00:20:07,560 --> 00:20:11,680 Speaker 2: think about what we're feeling is often detectable. So one 335 00:20:11,720 --> 00:20:14,720 Speaker 2: of the surest ways to detect whether a person in 336 00:20:14,800 --> 00:20:18,960 Speaker 2: a group of people may be a psychopath is watch 337 00:20:19,000 --> 00:20:22,080 Speaker 2: how the group reacts to something traumatic happening, either on 338 00:20:22,119 --> 00:20:25,440 Speaker 2: television or in a movie or in real life, or 339 00:20:25,480 --> 00:20:28,280 Speaker 2: someone telling a story about something traumatic in their life. 340 00:20:29,280 --> 00:20:32,880 Speaker 2: Don't look at them, look at the people listening and 341 00:20:33,000 --> 00:20:38,440 Speaker 2: see how they react to that event. Most people will 342 00:20:38,480 --> 00:20:43,840 Speaker 2: react automatically and in almost at light speed in terms 343 00:20:43,840 --> 00:20:48,400 Speaker 2: of real time reaction. Some people will have to think 344 00:20:48,400 --> 00:20:51,280 Speaker 2: about it, and that delay will be noticeable. 345 00:20:52,920 --> 00:20:55,760 Speaker 1: Yeah, do you know how I know I'm not a sociopath, 346 00:20:56,320 --> 00:20:58,600 Speaker 1: But a sociopath would say what I just said, by 347 00:20:58,640 --> 00:21:04,199 Speaker 1: the way, exactly what a sociopath would say. But I 348 00:21:04,320 --> 00:21:07,520 Speaker 1: cry about four times a day when I watched emotional 349 00:21:07,560 --> 00:21:11,879 Speaker 1: shit on Instagram, like some soldier coming home and his 350 00:21:12,080 --> 00:21:14,960 Speaker 1: kids running up to him or her. I'm like, oh 351 00:21:15,040 --> 00:21:18,040 Speaker 1: my god, I've. 352 00:21:16,880 --> 00:21:18,359 Speaker 2: Got some bad news for you there, Craig. 353 00:21:19,200 --> 00:21:20,440 Speaker 1: Don't tell me they're not real. 354 00:21:21,400 --> 00:21:23,720 Speaker 2: No, No, I don't know. Maybe they are, maybe they're not. 355 00:21:23,760 --> 00:21:28,639 Speaker 2: I don't know. But your reaction is interesting, girl, So 356 00:21:29,480 --> 00:21:34,600 Speaker 2: blubbering like a baby when something emotional is presented to you. 357 00:21:36,320 --> 00:21:38,440 Speaker 1: So there's a chance I'm still a psychopath. 
Speaker 2: That's a possibility. It's actually an indication of a failure of the higher-order control mechanisms over an emotional response.
Speaker 1: So psychopaths... shut up, Gillespie, shut up.
Speaker 2: It's not that psychopaths can't experience emotions; it's that they lack the higher-order control of emotions. One of the things that quite a few studies have identified about psychopaths is that they do feel the four basic emotions, one of which is sadness, really, really powerfully, and so they will inexplicably cry in response to something emotional being projected, when others around them don't find it that intense.
Speaker 1: Wow. Okay, so you've just cleared that up for me.
Speaker 2: Now, I'm not trying to diagnose anything there, Craig. I'm just saying that it's not the get-out-of-jail-free card you think it might be.
Speaker 1: Oh, that's a pity. You know, there is a term in psychology for what you're talking about, understanding how other people think, which is called theory of mind. That's the capacity to understand someone else's perspective.
Speaker 2: It's a bit different, theory of mind.
Speaker 1: It's a bit different, but it's kind of... it's having an insight into someone else's mind, kind of.
Speaker 2: Well, theory of mind is important because, as far as we know, we're the only species capable of doing it, which is understanding what another person's perspective is. There are famous examples, the chimpanzee experiments, where they spent endless effort teaching chimpanzees to talk, taught them whole alphabets, et cetera, et cetera, and the chimpanzees comprehensively failed theory of mind every single time. Because never once, even after they could talk, essentially by pointing to symbols and all that sort of thing, never once did they ask the human what they thought or how they reacted. Because to them, the concept that the human was thinking too was an impossible concept.
And that's what theory of mind 393 00:23:51,160 --> 00:23:54,600 Speaker 2: is is putting yourself in somebody else's shoes. Now, the 394 00:23:54,760 --> 00:23:57,879 Speaker 2: interesting thing about theory of mind is psychopaths are really 395 00:23:57,920 --> 00:24:00,440 Speaker 2: really good at it. They have to spend their life 396 00:24:00,480 --> 00:24:02,320 Speaker 2: doing it because the rest of us are doing it 397 00:24:02,359 --> 00:24:06,480 Speaker 2: automatically on empathy autopilot, where our mirror and your owns 398 00:24:06,480 --> 00:24:09,840 Speaker 2: are firing. We understand what someone's thinking without even thinking 399 00:24:09,840 --> 00:24:12,680 Speaker 2: about it, whereas a psychopath actually has to think about 400 00:24:12,720 --> 00:24:16,800 Speaker 2: it and then decide to imitate that behavior. So theory 401 00:24:16,840 --> 00:24:20,600 Speaker 2: of mind is something that psychopaths are really really excellent at, 402 00:24:21,720 --> 00:24:24,800 Speaker 2: and that's possibly why I think the theory of mind 403 00:24:24,800 --> 00:24:29,240 Speaker 2: study suggests that most people can't do theory of mind 404 00:24:29,840 --> 00:24:33,760 Speaker 2: under the age of about three, I think is about 405 00:24:33,800 --> 00:24:36,199 Speaker 2: the number. So that when they've done the studies on 406 00:24:36,680 --> 00:24:38,919 Speaker 2: very young children, they you know that, you know, the 407 00:24:38,960 --> 00:24:41,720 Speaker 2: sorts of studies they do where they they ask them 408 00:24:41,720 --> 00:24:43,840 Speaker 2: to think, you know, is the lully under you know, 409 00:24:43,960 --> 00:24:46,480 Speaker 2: under that hat or under that one, and what would 410 00:24:46,480 --> 00:24:48,200 Speaker 2: the person have been thinking and all that sort of thing. 411 00:24:48,560 --> 00:24:51,440 Speaker 2: That doesn't work on a young child. But a young 412 00:24:51,520 --> 00:24:55,119 Speaker 2: child is capable of empathy, very very capable of empathy. 413 00:24:55,920 --> 00:24:58,080 Speaker 2: But they're not capable of theory of mind. 414 00:24:58,520 --> 00:25:00,600 Speaker 1: And I think that's the thing is they can understand 415 00:25:00,640 --> 00:25:02,960 Speaker 1: how other people think. They just don't care. 416 00:25:03,520 --> 00:25:07,560 Speaker 2: That's right. They have a very good understanding of how 417 00:25:07,600 --> 00:25:10,399 Speaker 2: you think, but the only purpose in understanding that is 418 00:25:10,400 --> 00:25:11,159 Speaker 2: to manipulate you. 419 00:25:11,840 --> 00:25:17,919 Speaker 1: Well, that actually gives them a strategic and sociological advantage really, 420 00:25:18,160 --> 00:25:21,480 Speaker 1: that they can understand how you think. You know. That's 421 00:25:22,359 --> 00:25:26,720 Speaker 1: have you in all your businesses? I guess over the 422 00:25:26,800 --> 00:25:30,600 Speaker 1: years you've employed a fair few people or no, like yourself, 423 00:25:30,640 --> 00:25:38,399 Speaker 1: have you higher? What's your I've employed about five hundred people, 424 00:25:38,680 --> 00:25:42,880 Speaker 1: not anymore, but over the years, And did you fuck 425 00:25:42,920 --> 00:25:46,200 Speaker 1: it up? Did you employed people and then three months? 426 00:25:46,200 --> 00:25:47,040 Speaker 2: Absolutely? 427 00:25:47,920 --> 00:25:48,080 Speaker 1: Oh? 428 00:25:48,160 --> 00:25:52,000 Speaker 2: Absolutely. 
Speaker 2: Absolutely. I think anyone who says to you, no, I've always employed perfectly, I've never ever made a mistake with someone I've employed, is lying, absolutely lying. Because the reality is we do go with first impressions. We do believe what is put on a resume. We do believe what people tell us, because all humans default to believing. Remember, we've talked before about this tit-for-tat thing in human interaction: the thing that holds us together as a community is that we default to believing what other humans tell us. The only time we don't believe what they tell us is when they prove that they are a liar, and even then, if they seek forgiveness and then are honest, we'll give them another chance. So our default is to believe what people tell us, and unfortunately that works against us. That, plus the first impression in a job interview, means we've got a very low probability of detecting a psychopath. But the cure to that is give yourself time and check their references.
Speaker 1: Exactly. I made a fair few mistakes, I'll be honest. But also, the other thing too is, you know, it's almost like the distance between person and persona. In the interview you get the show, the Craig Show or whatever it is. It's like, well, this is the person you're trying to be. This is a person trying to be what they think they need to be to get over the line in this context.
Speaker 2: Yep, and the same thing applies when you're potentially seeking out a mate. If you're on a first date with someone, it's a job interview, but it's less formal, and you've got less backup, should you choose to use it, than when you're employing someone. Even though a lot of people think, you know what, my judgment is fabulous, I love this person, they are terrific, I'm not going to check anything. And that's what a lot of people do.
But they could 465 00:27:53,760 --> 00:27:56,960 Speaker 2: check it if they wanted to. They could verify the references. 466 00:27:57,000 --> 00:27:59,040 Speaker 2: They could talk to people who've worked for this person, 467 00:27:59,040 --> 00:28:01,800 Speaker 2: which is what they should do. They could make sure 468 00:28:01,840 --> 00:28:04,919 Speaker 2: they understand what every minute of this person's career look like, 469 00:28:05,040 --> 00:28:07,440 Speaker 2: not just the bits they've chosen to highlight on their resume. 470 00:28:08,080 --> 00:28:14,840 Speaker 2: They could do that. But when you're potentially interviewing a partner, 471 00:28:14,920 --> 00:28:17,920 Speaker 2: let's say you don't have all of that. All you've 472 00:28:17,920 --> 00:28:21,840 Speaker 2: got is the facade, and you've got to give yourself 473 00:28:21,880 --> 00:28:26,919 Speaker 2: time if you cannot decide based on the first impressions 474 00:28:26,960 --> 00:28:30,840 Speaker 2: of that facade, this person is ideal for me, because 475 00:28:31,480 --> 00:28:34,560 Speaker 2: the more ideal they are, honestly, the more likely they 476 00:28:34,600 --> 00:28:38,400 Speaker 2: are a psychopath, because that's what psychopaths are good at. 477 00:28:38,440 --> 00:28:42,880 Speaker 2: Doing which is giving you the mask you want to see. 478 00:28:44,280 --> 00:28:46,960 Speaker 1: So the great news is everyone, if you think you've 479 00:28:46,960 --> 00:28:50,120 Speaker 1: found your dream partner, you've probably found a psychopath. 480 00:28:50,680 --> 00:28:58,400 Speaker 2: So don't sign your house over to them. 481 00:28:59,640 --> 00:29:04,120 Speaker 1: Yourself sometime and do a prenup whatever that is. We'll 482 00:29:04,160 --> 00:29:07,840 Speaker 1: say goodbye fair but mate as always entertaining and we 483 00:29:07,960 --> 00:29:08,720 Speaker 1: appreciate you. 484 00:29:08,760 --> 00:29:11,440 Speaker 2: Thanks mate, absolute pleasure, see you, Greg