1 00:00:00,240 --> 00:00:02,760 Speaker 1: I'll get a team. It's the You Project. It's, it's too mad. 2 00:00:03,480 --> 00:00:06,040 Speaker 1: It's one and a half meter Tiff, and it's one 3 00:00:06,120 --> 00:00:10,240 Speaker 1: point seven eight meter Harps. Let's start with, we'll 4 00:00:10,280 --> 00:00:14,240 Speaker 1: start with the lady. Oh, do you like that? To 5 00:00:14,280 --> 00:00:17,680 Speaker 1: Pete, straight to B. I was waiting for it. I 6 00:00:17,720 --> 00:00:20,759 Speaker 1: was waiting for that. That was too easy. Yeah, and 7 00:00:20,840 --> 00:00:23,280 Speaker 1: then we'll get to Tiff. No, you wouldn't even have 8 00:00:23,320 --> 00:00:27,440 Speaker 1: to say it, Tiff. Could you just, not that the 9 00:00:27,440 --> 00:00:29,560 Speaker 1: world needs to know, but fuck it, could you just, 10 00:00:29,680 --> 00:00:32,680 Speaker 1: I mean, feel free to have the microphone for thirty seconds. 11 00:00:32,680 --> 00:00:35,279 Speaker 1: I was going to say sixty; I retracted that 12 00:00:35,360 --> 00:00:39,080 Speaker 1: as it was coming out of my gob. But I 13 00:00:39,120 --> 00:00:41,200 Speaker 1: spoke to you before, and you'd thrown the glomesh 14 00:00:41,200 --> 00:00:43,320 Speaker 1: in the share pile. You'd thrown the dummy in the 15 00:00:43,320 --> 00:00:46,800 Speaker 1: bloody fire pit, because you've got an event coming up 16 00:00:46,880 --> 00:00:50,599 Speaker 1: and you've got to wear "cocktail," and you didn't know 17 00:00:50,720 --> 00:00:53,720 Speaker 1: exactly what that meant. So we had to ChatGPT it. 18 00:00:54,320 --> 00:00:57,120 Speaker 2: Yeah. And then I hung up the phone and I thought, 19 00:00:57,680 --> 00:01:00,680 Speaker 2: Tiffany Cook, of all people in the whole fucking world 20 00:01:00,720 --> 00:01:05,440 Speaker 2: to ring and consult about this cocktail dress dilemma, why 21 00:01:05,480 --> 00:01:07,160 Speaker 2: did you choose Harps?
22 00:01:07,880 --> 00:01:11,000 Speaker 1: Well, that's because you didn't. One, you didn't ring for that. 23 00:01:11,080 --> 00:01:13,720 Speaker 1: You were just venting. You didn't, you 24 00:01:13,760 --> 00:01:15,240 Speaker 1: didn't ring for fashion advice. 25 00:01:16,120 --> 00:01:18,640 Speaker 2: Well, I just don't understand why I have to wait 26 00:01:18,680 --> 00:01:22,720 Speaker 2: till twenty-four hours out from leaving for Queensland to 27 00:01:22,720 --> 00:01:24,679 Speaker 2: be like, I wonder what we're supposed to wear, and 28 00:01:24,720 --> 00:01:25,840 Speaker 2: look for the details. 29 00:01:26,560 --> 00:01:28,600 Speaker 1: Well, you don't have to. You choose to. I mean, 30 00:01:28,600 --> 00:01:31,320 Speaker 1: that's on you, dude. So don't fucking look 31 00:01:31,520 --> 00:01:32,759 Speaker 1: out, look within, bro. 32 00:01:32,760 --> 00:01:35,240 Speaker 2: That's why I called Harps, Pete. That's why I called Harps. 33 00:01:35,760 --> 00:01:37,680 Speaker 3: That makes more sense than "I called him to ask 34 00:01:37,760 --> 00:01:39,039 Speaker 3: for fashion advice." 35 00:01:42,040 --> 00:01:44,760 Speaker 1: Or, two: did I give you sound advice or not? 36 00:01:46,160 --> 00:01:46,720 Speaker 2: I think so? 37 00:01:47,400 --> 00:01:49,840 Speaker 1: I think so. I said, you've got to either completely 38 00:01:49,880 --> 00:01:52,800 Speaker 1: frock up and go nuts, or you've just got to 39 00:01:52,840 --> 00:01:55,960 Speaker 1: go "don't give a fuck" and just turn up in goes. 40 00:01:58,400 --> 00:02:01,960 Speaker 1: You know what Pete and I would do. So, although 41 00:02:02,000 --> 00:02:04,640 Speaker 1: I reckon Pete probably owns more suits than me. How 42 00:02:04,640 --> 00:02:09,520 Speaker 1: many suits do you own, champ? Oh, two or three? 43 00:02:09,880 --> 00:02:11,560 Speaker 1: Probably two or three more than you. Is that right? 44 00:02:12,400 --> 00:02:12,560 Speaker 2: Well?
45 00:02:12,639 --> 00:02:17,880 Speaker 1: Now I own one, pretty much. Yeah. The Melbourne Museum 46 00:02:17,919 --> 00:02:25,440 Speaker 1: has rung me; they want it as an artifact. They'll put 47 00:02:25,440 --> 00:02:31,400 Speaker 1: it next to Cro-Magnon and Australopithecus robustus in the paleontology section. 48 00:02:33,320 --> 00:02:39,040 Speaker 1: That's, you know, that's like early humans, Tiff, the evolutionary timeline. 49 00:02:39,400 --> 00:02:42,400 Speaker 1: You look confused. So anyway, before we 50 00:02:42,440 --> 00:02:44,440 Speaker 1: actually do a podcast: what did you decide? 51 00:02:44,880 --> 00:02:47,320 Speaker 2: Well, I did duck out to DFO. I came out 52 00:02:47,400 --> 00:02:49,280 Speaker 2: with donuts, but I did get a couple of nice 53 00:02:49,320 --> 00:02:51,960 Speaker 2: pairs of shoes to wear with the nothing that 54 00:02:52,000 --> 00:02:54,600 Speaker 2: I have to wear, and the only dresses I do 55 00:02:54,720 --> 00:02:58,120 Speaker 2: have in my wardrobe are a minimum of a decade old, 56 00:02:58,320 --> 00:03:00,960 Speaker 2: and I don't want to wear them. They're a little 57 00:03:00,960 --> 00:03:01,600 Speaker 2: bit too short. 58 00:03:02,240 --> 00:03:05,119 Speaker 1: Look, I'm no fashionista, but when I'm thinking of a high- 59 00:03:05,240 --> 00:03:10,120 Speaker 1: level, kind of dress-to-impress cocktail party, DFO 60 00:03:10,240 --> 00:03:11,320 Speaker 1: doesn't spring to mind. 61 00:03:11,480 --> 00:03:15,799 Speaker 2: Oh well, I sure wasn't going to fucking Chadstone again 62 00:03:15,919 --> 00:03:18,360 Speaker 2: after what that nearly gave me. 63 00:03:18,440 --> 00:03:22,400 Speaker 1: Hey, don't get angry at me. Just fucking temper yourself, 64 00:03:22,440 --> 00:03:22,720 Speaker 1: will you?
65 00:03:23,639 --> 00:03:26,360 Speaker 2: If anyone wants to submit a resume to be my 66 00:03:26,600 --> 00:03:30,079 Speaker 2: new girly friend to help me with shopping, I need 67 00:03:30,200 --> 00:03:32,920 Speaker 2: someone in this area of my life. 68 00:03:32,880 --> 00:03:38,240 Speaker 1: I'll tell you what, you truly do. You do. And 69 00:03:35,960 --> 00:03:42,000 Speaker 1: you need to find someone, a girl slash lady slash 70 00:03:42,120 --> 00:03:47,400 Speaker 1: female advisor, or it could be a dude, who knows 71 00:03:47,440 --> 00:03:49,840 Speaker 1: how to make you look like a million dollars for 72 00:03:49,880 --> 00:03:50,440 Speaker 1: one hundred. 73 00:03:50,520 --> 00:03:52,800 Speaker 2: You know what I mean? Exactly. That's what I'm after. 74 00:03:53,320 --> 00:03:55,680 Speaker 2: I've got some nice red heels, though. I do love 75 00:03:55,680 --> 00:03:57,520 Speaker 2: a pair of red heels. I don't want to... 76 00:03:57,520 --> 00:04:00,000 Speaker 3: I love, I love the fact that you were pumped 77 00:04:00,040 --> 00:04:02,320 Speaker 3: about your shoes, but nothing else has been purchased. 78 00:04:02,320 --> 00:04:04,839 Speaker 1: This is great. Well, if there's one thing we can 79 00:04:04,880 --> 00:04:07,160 Speaker 1: count on, everybody, it's that there will be one and a 80 00:04:07,200 --> 00:04:11,720 Speaker 1: half million photos on Tiff's social media, probably, about Saturday, 81 00:04:11,760 --> 00:04:19,440 Speaker 1: so stand by, because she's not shy of photos. Wow, 82 00:04:20,080 --> 00:04:22,600 Speaker 1: that's enough about that. Pete Shepherd, welcome back to the 83 00:04:22,640 --> 00:04:25,680 Speaker 1: You Project. I'm sorry for that distraction. How are you? 84 00:04:26,080 --> 00:04:28,000 Speaker 3: Don't be sorry. That's why I'm here.
I'm here for 85 00:04:28,080 --> 00:04:31,359 Speaker 3: these interactions between you and Tiff, and to play the 86 00:04:31,440 --> 00:04:33,440 Speaker 3: role of fly on the wall, to let you 87 00:04:33,480 --> 00:04:34,400 Speaker 3: air your dirty laundry. 88 00:04:34,440 --> 00:04:36,920 Speaker 1: It's good to be here. This podcast is pretty much just a 89 00:04:37,240 --> 00:04:42,080 Speaker 1: thinly disguised excuse for Tiff's event. 90 00:04:45,440 --> 00:04:48,880 Speaker 1: That's all it is. Hey, what's going on on Planet You? 91 00:04:49,160 --> 00:04:52,320 Speaker 1: What day is today? It's Tuesday, right? Why are you 92 00:04:52,400 --> 00:04:55,200 Speaker 1: not out talking to fat blokes in suits? 93 00:04:55,279 --> 00:04:57,960 Speaker 1: Why are you not doing that this Tuesday? 94 00:04:58,520 --> 00:04:58,920 Speaker 3: For me? 95 00:04:59,279 --> 00:05:01,599 Speaker 1: This week, is it a day at home, right? 96 00:05:01,600 --> 00:05:04,280 Speaker 3: To set the week up and work on a few things. 97 00:05:04,279 --> 00:05:07,839 Speaker 3: Actually, there's a talk that maybe 98 00:05:07,880 --> 00:05:09,920 Speaker 3: I could workshop with you in person. You mentioned 99 00:05:10,000 --> 00:05:13,080 Speaker 3: ChatGPT earlier, and I'm sure that's something that 100 00:05:13,120 --> 00:05:14,599 Speaker 3: you're noodling on at the moment. But I'm doing a 101 00:05:14,640 --> 00:05:17,000 Speaker 3: talk in a week, I think it is, next week, 102 00:05:17,160 --> 00:05:20,400 Speaker 3: down on the Gold Coast, around leadership and AI, which 103 00:05:20,440 --> 00:05:23,160 Speaker 3: is obviously a topic that is a little bit buzzy 104 00:05:23,200 --> 00:05:26,240 Speaker 3: and it's going to become oversaturated.
105 00:05:26,320 --> 00:05:27,880 Speaker 3: But it's a topic that I've been asked to speak 106 00:05:27,880 --> 00:05:30,120 Speaker 3: on, and I have a bunch of thoughts on it, 107 00:05:30,120 --> 00:05:32,960 Speaker 3: and I'm noodling around how we should think about it, 108 00:05:33,000 --> 00:05:34,080 Speaker 3: if we should think about 109 00:05:33,839 --> 00:05:37,680 Speaker 1: it at all. Well, definitely think. I think, whether or 110 00:05:37,720 --> 00:05:40,040 Speaker 1: not we want to think about it, it's proven that 111 00:05:40,080 --> 00:05:41,880 Speaker 1: we do, do you know what I mean? It's like, well, 112 00:05:41,920 --> 00:05:45,719 Speaker 1: I think there's a difference between being obsessed with 113 00:05:45,800 --> 00:05:49,240 Speaker 1: it and thinking it's the best thing ever, and 114 00:05:49,440 --> 00:05:52,640 Speaker 1: also thinking it's the worst thing ever. I mean, the practical, 115 00:05:52,720 --> 00:05:56,080 Speaker 1: real-world reality is that AI is here; it's not 116 00:05:56,160 --> 00:06:00,520 Speaker 1: going away. And literally, somebody said to 117 00:06:00,560 --> 00:06:03,440 Speaker 1: me the other day, "I'm never going to use that." 118 00:06:03,520 --> 00:06:06,560 Speaker 1: I go, you're absolutely going to use it, because 119 00:06:06,600 --> 00:06:09,520 Speaker 1: you won't have a choice, you know, because it's 120 00:06:09,520 --> 00:06:13,560 Speaker 1: going to become more and more plugged into our day 121 00:06:13,600 --> 00:06:16,320 Speaker 1: to day, and you won't have the option, even when 122 00:06:16,320 --> 00:06:20,440 Speaker 1: it's there for something like shopping or whatever.
But yeah, look, 123 00:06:20,720 --> 00:06:25,080 Speaker 1: I just think that, you know, the hysteria, and 124 00:06:25,240 --> 00:06:27,919 Speaker 1: the world's going to end, and we're all going to 125 00:06:27,920 --> 00:06:30,280 Speaker 1: be overtaken, and they're about two and a half weeks 126 00:06:30,279 --> 00:06:34,000 Speaker 1: away from being sentient and then becoming self-preserving 127 00:06:34,520 --> 00:06:37,200 Speaker 1: and autonomous, and they're going to fucking kill us, and 128 00:06:37,240 --> 00:06:39,920 Speaker 1: we all need to move to a farm with solar panels. 129 00:06:39,920 --> 00:06:43,600 Speaker 1: Probably a little bit of an overreaction, I agree. I do, 130 00:06:44,200 --> 00:06:48,359 Speaker 1: but I do think, you know, for me, it's just, 131 00:06:49,880 --> 00:06:52,000 Speaker 1: I might change my mind on this, but right now, 132 00:06:52,040 --> 00:06:55,679 Speaker 1: where I sit, especially as somebody doing a lot of research 133 00:06:55,720 --> 00:06:57,239 Speaker 1: and a lot of reading and a lot of writing 134 00:06:57,279 --> 00:06:59,920 Speaker 1: and a lot of, you know, prepping, it's a fun, 135 00:07:00,080 --> 00:07:04,359 Speaker 1: amazing resource. It's a tool, and it's like anything. 136 00:07:06,000 --> 00:07:09,440 Speaker 1: Like, I use social media for good, and for me, 137 00:07:09,640 --> 00:07:12,920 Speaker 1: social media has had a pretty good impact in my world, 138 00:07:12,960 --> 00:07:16,360 Speaker 1: in my business, in my income, and in my capacity 139 00:07:16,400 --> 00:07:19,040 Speaker 1: to reach people and help people and serve people.
So 140 00:07:19,160 --> 00:07:21,840 Speaker 1: all round, I'd say, on Planet Craig, a good thing, 141 00:07:22,400 --> 00:07:26,480 Speaker 1: but on another planet, on Planet John, it might be a 142 00:07:26,600 --> 00:07:31,160 Speaker 1: terrible thing, depending on the relationship that he or she, 143 00:07:31,440 --> 00:07:35,240 Speaker 1: whoever that is, has with that thing. So yeah, I 144 00:07:35,320 --> 00:07:37,000 Speaker 1: just think it depends on how we use it and 145 00:07:37,040 --> 00:07:38,000 Speaker 1: how we interact with it. 146 00:07:38,040 --> 00:07:41,240 Speaker 3: Mate, what about you? Yeah, I mean, I have a 147 00:07:41,240 --> 00:07:41,960 Speaker 3: bunch of thoughts. 148 00:07:42,320 --> 00:07:42,840 Speaker 1: One is... 149 00:07:45,760 --> 00:07:48,320 Speaker 3: I like what you described: it's almost as if, instead of AI 150 00:07:48,400 --> 00:07:52,680 Speaker 3: being artificial intelligence, it's almost like amplifying intention. It's like, 151 00:07:52,800 --> 00:07:56,240 Speaker 3: what's your intention? Is it to research?
Then 152 00:07:56,280 --> 00:07:57,960 Speaker 3: how can you use this as a tool to really 153 00:07:58,000 --> 00:08:00,880 Speaker 3: amplify that, to do a shitload more research in a 154 00:08:00,880 --> 00:08:02,520 Speaker 3: short period of time than you would have if you 155 00:08:02,560 --> 00:08:05,600 Speaker 3: were walking into a library and looking at barcodes, as 156 00:08:05,600 --> 00:08:07,840 Speaker 3: we used to back in the day on old computers, 157 00:08:07,960 --> 00:08:10,800 Speaker 3: or even a Google search? You can really 158 00:08:11,840 --> 00:08:15,320 Speaker 3: amplify that intention of wanting to do thorough research in 159 00:08:15,320 --> 00:08:19,040 Speaker 3: a way that is way quicker. And not perfect, by 160 00:08:19,040 --> 00:08:21,400 Speaker 3: the way; like humans, and like any other form of research, 161 00:08:21,480 --> 00:08:26,560 Speaker 3: not perfect, but still pretty freaking useful and pretty amazing 162 00:08:27,000 --> 00:08:29,720 Speaker 3: at what it can do if used for the right thing. 163 00:08:30,080 --> 00:08:32,400 Speaker 3: I tend to agree. I tend to agree. And part 164 00:08:32,400 --> 00:08:34,480 Speaker 3: of my rant next week that I'm working on is: 165 00:08:35,200 --> 00:08:36,840 Speaker 3: how do you help leaders see it as a tool 166 00:08:36,880 --> 00:08:39,760 Speaker 3: to make them even more human-centered?
And, like, what's 167 00:08:39,800 --> 00:08:41,760 Speaker 3: the way to think of these tools as less 168 00:08:41,800 --> 00:08:46,520 Speaker 3: about artificial and more about, like, if I use this 169 00:08:46,600 --> 00:08:49,480 Speaker 3: to help me build empathy with the person that I'm 170 00:08:49,480 --> 00:08:52,040 Speaker 3: about to have a hard conversation with, is that not 171 00:08:52,080 --> 00:08:54,600 Speaker 3: a tool that's actually enabling humanity in some way, and 172 00:08:54,720 --> 00:08:57,240 Speaker 3: enabling me to be a more connected, more well-rounded, 173 00:08:57,920 --> 00:09:01,160 Speaker 3: human-centered leader, than thinking about it as a way 174 00:09:01,160 --> 00:09:02,960 Speaker 3: to help you solve math problems, which is less about 175 00:09:02,960 --> 00:09:04,640 Speaker 3: being human-centered and more just about hard and 176 00:09:04,679 --> 00:09:05,320 Speaker 3: fast facts? 177 00:09:05,400 --> 00:09:06,199 Speaker 1: So that's kind of my 178 00:09:07,280 --> 00:09:09,280 Speaker 3: two cents at the moment, which will probably become forty 179 00:09:09,280 --> 00:09:10,280 Speaker 3: five cents by the end of this. 180 00:09:11,160 --> 00:09:12,920 Speaker 1: Yeah, I know, I love that. I love that. I mean, 181 00:09:13,000 --> 00:09:15,760 Speaker 1: ultimately it is, you know, something that we can 182 00:09:15,840 --> 00:09:18,200 Speaker 1: use to help us get things done better, if we 183 00:09:18,320 --> 00:09:23,520 Speaker 1: use it the right way. You know, I heard this 184 00:09:23,640 --> 00:09:27,439 Speaker 1: quote earlier today and I thought, that can't be true. 185 00:09:27,720 --> 00:09:29,440 Speaker 1: That can't be true. And I had to go and 186 00:09:29,480 --> 00:09:31,920 Speaker 1: look it up. So here's the quote.
Now, this was 187 00:09:31,960 --> 00:09:35,120 Speaker 1: from, they don't know the year, but I think it 188 00:09:35,200 --> 00:09:38,760 Speaker 1: was just before nineteen fifty, so maybe nineteen forty-eight, forty-nine, 189 00:09:38,800 --> 00:09:43,280 Speaker 1: pre-nineteen fifty. So this is what Albert Einstein said: 190 00:09:44,000 --> 00:09:48,079 Speaker 1: "It has become appallingly obvious that our technology has exceeded 191 00:09:48,120 --> 00:09:53,480 Speaker 1: our humanity." Imagine what he'd think now. Thank God. That's wild. 192 00:09:54,800 --> 00:09:59,120 Speaker 1: That was seventy-five-plus years ago. That's wild. "It's 193 00:09:59,160 --> 00:10:02,760 Speaker 1: become appallingly obvious that our technology has exceeded our humanity." 194 00:10:02,800 --> 00:10:05,640 Speaker 1: I'm like, dude, you should see what's going on now. Bro, 195 00:10:05,720 --> 00:10:10,640 Speaker 1: you'd be rolling in your bloody virtual grave. Yeah, you know. Hey, 196 00:10:10,720 --> 00:10:13,960 Speaker 1: what about this, mate? This is funny, speaking about how, 197 00:10:15,280 --> 00:10:20,800 Speaker 1: it's like, humans have a really natural tendency to anthropomorphize things, 198 00:10:20,800 --> 00:10:24,000 Speaker 1: you know, dogs and cats and animals. But also, like, 199 00:10:24,080 --> 00:10:27,520 Speaker 1: my mum, every one of my mum's cars since I 200 00:10:27,600 --> 00:10:31,320 Speaker 1: was a kid has had a name, right? She names 201 00:10:31,320 --> 00:10:31,920 Speaker 1: her cars. 202 00:10:32,160 --> 00:10:33,240 Speaker 3: This is something my wife would do. 203 00:10:33,360 --> 00:10:37,520 Speaker 1: Yeah, yeah, and she names Dad's cars. 204 00:10:37,520 --> 00:10:40,640 Speaker 1: So Dad's car is Max and Mum's car is Emma, right? 205 00:10:41,160 --> 00:10:43,760 Speaker 1: And so she doesn't say, are we going to take, 206 00:10:44,600 --> 00:10:47,400 Speaker 1: you know, the Nissan or the whatever, the bloody Toyota.
207 00:10:47,559 --> 00:10:51,199 Speaker 1: She's like, will we take, you know, Max? Or will 208 00:10:51,240 --> 00:10:56,120 Speaker 1: we take Emma? Right? So, but ChatGPT is 209 00:10:56,200 --> 00:11:01,040 Speaker 1: now building relationships, or in general, more broadly, AI: 210 00:11:01,160 --> 00:11:05,600 Speaker 1: people are building relationships with AI, and I put 211 00:11:05,880 --> 00:11:11,040 Speaker 1: "relationships" in air quotes, right? But people are, from 212 00:11:11,320 --> 00:11:14,120 Speaker 1: the outside looking in, okay? So they trust it, 213 00:11:14,200 --> 00:11:17,679 Speaker 1: they tell it things, they ask it for insight and feedback. 214 00:11:18,480 --> 00:11:21,960 Speaker 1: They allow it to make them feel good and praise them. 215 00:11:22,559 --> 00:11:27,320 Speaker 1: It's eliciting emotional responses, right? This is not fucking far 216 00:11:27,400 --> 00:11:32,480 Speaker 1: from two humans interacting. It's kind of like, pursuant to that, 217 00:11:33,000 --> 00:11:37,160 Speaker 1: your honor: so the other night, which was Saturday night, 218 00:11:38,800 --> 00:11:42,640 Speaker 1: I thought I might post something on Insta, and I thought, 219 00:11:43,120 --> 00:11:45,160 Speaker 1: I don't know, is Saturday night a shit night 220 00:11:45,240 --> 00:11:47,880 Speaker 1: to post or is it a good night? So I jumped on 221 00:11:48,559 --> 00:11:52,320 Speaker 1: old mate and I said, what's the best time to 222 00:11:52,400 --> 00:11:57,880 Speaker 1: post tonight on Instagram? And so ChatGPT, now, remember, 223 00:11:57,920 --> 00:12:01,480 Speaker 1: this is Saturday night, ChatGPT said to me, "For 224 00:12:01,600 --> 00:12:04,760 Speaker 1: tonight, Friday night in Australia, the best time to post 225 00:12:04,880 --> 00:12:06,800 Speaker 1: is..." and then it went on, right? So it fucked 226 00:12:06,880 --> 00:12:10,800 Speaker 1: up the night, and I said, "It's Saturday in Australia.
227 00:12:10,840 --> 00:12:15,199 Speaker 1: Have you been drinking?" with a laughing emoji. And then 228 00:12:15,360 --> 00:12:19,560 Speaker 1: it goes, "Touché, Craig," with a laughing face. "No 229 00:12:19,720 --> 00:12:25,600 Speaker 1: booze here, just a rogue temporal lobe." I'm like, are 230 00:12:25,640 --> 00:12:30,280 Speaker 1: you me, motherfucker? Like... and then it goes on to 231 00:12:30,360 --> 00:12:33,120 Speaker 1: answer my question relative to Saturday. I 232 00:12:33,200 --> 00:12:36,120 Speaker 1: mean, the level of, and I know it's all programming, 233 00:12:36,160 --> 00:12:38,040 Speaker 1: I know it's not a human, we all get that, 234 00:12:38,840 --> 00:12:45,800 Speaker 1: but the sense of humanity is fucking terrifying. And personalization, 235 00:12:45,960 --> 00:12:49,079 Speaker 1: I think, in particular; it's like it's using your own kind 236 00:12:49,120 --> 00:12:51,600 Speaker 1: of language, based on the interactions that you've had with 237 00:12:51,679 --> 00:12:56,440 Speaker 1: it so far. So then for me, I think about, 238 00:12:56,520 --> 00:13:00,560 Speaker 1: how do you harness the good of that? Like, how 239 00:13:00,559 --> 00:13:04,120 Speaker 1: do I take the fact that this thing could personalize 240 00:13:05,240 --> 00:13:10,200 Speaker 1: feedback or ideas, or, God forbid, provide me coaching that 241 00:13:10,320 --> 00:13:12,719 Speaker 1: is actually personalized to me based on what it knows 242 00:13:12,760 --> 00:13:15,920 Speaker 1: about me? And so the way I'm thinking about this is, 243 00:13:16,000 --> 00:13:17,440 Speaker 1: I mean, I think about it in so many ways, but 244 00:13:17,440 --> 00:13:18,600 Speaker 1: one of the ways I think about it: this is, 245 00:13:18,960 --> 00:13:22,839 Speaker 1: before the interaction I have, giving it a role 246 00:13:22,880 --> 00:13:25,520 Speaker 1: to take on.
And so the most common role that 247 00:13:25,600 --> 00:13:29,000 Speaker 1: I encourage leaders to use, not to talk myself out 248 00:13:29,000 --> 00:13:30,719 Speaker 1: of a job, because I think there'll always be a 249 00:13:30,800 --> 00:13:31,440 Speaker 1: role, at the moment, 250 00:13:31,559 --> 00:13:36,440 Speaker 3: touch wood, for human interaction and human coaching. But I 251 00:13:36,480 --> 00:13:38,720 Speaker 3: tell leaders all the time: say to 252 00:13:38,760 --> 00:13:41,520 Speaker 3: your ChatGPT or Gemini, take on the role of a 253 00:13:41,559 --> 00:13:45,600 Speaker 3: world-class leadership coach. Here is a situation I find 254 00:13:45,600 --> 00:13:47,360 Speaker 3: myself in, or here's the conversation I need to have, 255 00:13:47,480 --> 00:13:49,679 Speaker 3: or here's where I'm stuck. What do you think I 256 00:13:49,679 --> 00:13:52,719 Speaker 3: should do? And the questions and the responses that it 257 00:13:52,760 --> 00:13:57,560 Speaker 3: will give you are, like, super impressive, based on 258 00:13:57,640 --> 00:14:00,199 Speaker 3: the context that you've already given it in other interactions 259 00:14:00,960 --> 00:14:02,800 Speaker 3: and the fact that it has a corpus of data 260 00:14:02,840 --> 00:14:05,679 Speaker 3: from every single article that's ever been written about leadership 261 00:14:05,679 --> 00:14:10,120 Speaker 3: coaching, or insert-whatever-you-want coaching. And you can 262 00:14:10,160 --> 00:14:12,160 Speaker 3: then respond to the questions it asks you, and then 263 00:14:12,200 --> 00:14:13,760 Speaker 3: it asks you another question, and on and on it goes, 264 00:14:13,760 --> 00:14:16,360 Speaker 3: and you end up having this, like, dialogue with, like 265 00:14:16,400 --> 00:14:19,600 Speaker 3: you mentioned, a bot, that is actually pretty conducive to 266 00:14:19,800 --> 00:14:23,640 Speaker 3: a pretty good coaching conversation.
The thing I like about 267 00:14:23,680 --> 00:14:26,600 Speaker 3: it is, you would know this more than anyone: 268 00:14:27,520 --> 00:14:30,800 Speaker 3: coaching doesn't necessarily scale one-on-one, and so in 269 00:14:30,840 --> 00:14:33,360 Speaker 3: the context of leadership coaching, all of a sudden it 270 00:14:33,440 --> 00:14:39,920 Speaker 3: democratizes access to useful development and helpful personal reflection, to 271 00:14:40,000 --> 00:14:43,680 Speaker 3: maybe an entire organization, twenty-four-seven, that is 272 00:14:43,720 --> 00:14:47,320 Speaker 3: tailored to them. Now, that's just one narrow example of 273 00:14:47,400 --> 00:14:49,440 Speaker 3: leadership coaching. Then you go down the path of, like, 274 00:14:49,480 --> 00:14:51,800 Speaker 3: what if it helps you with, you know: as your 275 00:14:51,800 --> 00:14:55,360 Speaker 3: communications expert, take on the role of a world-class comms consultant, 276 00:14:55,800 --> 00:14:58,440 Speaker 3: review the transcript of this meeting that I just facilitated. 277 00:14:58,600 --> 00:15:00,160 Speaker 3: What feedback do you have for me? How can I 278 00:15:00,200 --> 00:15:02,680 Speaker 3: facilitate better? Who did too much of the talking, and why? 279 00:15:03,360 --> 00:15:04,520 Speaker 1: Oh, I did that. 280 00:15:04,520 --> 00:15:06,480 Speaker 3: I mean, I'm trying to do this as a live example, 281 00:15:06,480 --> 00:15:09,560 Speaker 3: which I'll share in my talk: here's the first draft 282 00:15:09,560 --> 00:15:11,760 Speaker 3: of this talk I have to give, the keynote next week. 283 00:15:11,800 --> 00:15:14,160 Speaker 3: Here's the audience. Take on the role of a world-class 284 00:15:14,200 --> 00:15:16,320 Speaker 3: speaking coach. How do we make it better? What am I missing? 285 00:15:16,320 --> 00:15:18,520 Speaker 3: What am I not doing? And I'm going to say, 286 00:15:18,920 --> 00:15:23,760 Speaker 3: the feedback is crazy impressive. Like, it's really impressive.
Sometimes 287 00:15:23,760 --> 00:15:25,920 Speaker 3: it's just, I'll do a voice memo of me just 288 00:15:25,960 --> 00:15:29,920 Speaker 3: talking shit and say, this is a ramble. I'm thinking 289 00:15:29,920 --> 00:15:32,880 Speaker 3: it could be a keynote. Help me structure that. And 290 00:15:32,880 --> 00:15:34,240 Speaker 3: it would be like, all right, here's what I think 291 00:15:34,280 --> 00:15:36,360 Speaker 3: you said, here's how you could actually structure it, here's 292 00:15:36,360 --> 00:15:39,000 Speaker 3: a story you should use, here's your punchline, here's where to do 293 00:15:39,080 --> 00:15:41,480 Speaker 3: a Q&A. And, like, it gives you a 294 00:15:41,520 --> 00:15:42,560 Speaker 3: pretty good basis. 295 00:15:43,400 --> 00:15:43,560 Speaker 1: Now... 296 00:15:43,600 --> 00:15:47,200 Speaker 3: I think the thing about the outputs that we get, 297 00:15:47,400 --> 00:15:49,480 Speaker 3: that I think a lot about, is our level of 298 00:15:49,560 --> 00:15:55,120 Speaker 3: discernment is just as, if not more, critical. Because, to 299 00:15:55,160 --> 00:15:58,000 Speaker 3: your point, it will tell you the wrong date, it 300 00:15:58,040 --> 00:16:00,200 Speaker 3: will tell you something that is not true from time 301 00:16:00,200 --> 00:16:01,840 Speaker 3: to time, and so if you don't have any level 302 00:16:01,840 --> 00:16:04,280 Speaker 3: of discernment or critical thinking, you're going to end up 303 00:16:04,280 --> 00:16:05,800 Speaker 3: taking a bunch of what it says to be true 304 00:16:05,800 --> 00:16:08,480 Speaker 3: that's not true. And that, I think, is the risk. Yes, 305 00:16:08,600 --> 00:16:09,560 Speaker 3: one of the risks. 306 00:16:09,920 --> 00:16:10,920 Speaker 1: Yeah. 307 00:16:11,360 --> 00:16:12,720 Speaker 3: So I'm trying to use it every day in all 308 00:16:12,760 --> 00:16:15,480 Speaker 3: sorts of capacities, and there are a couple, like you.
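[Editor's note: the "take on a role" prompt pattern described above (assign a role, give the situation, ask for feedback) can be sketched as a small, hypothetical helper. The function name `build_role_prompt` and the role/content message format are illustrative assumptions following the common chat-completion convention, not any specific product's API; the actual model call is deliberately left out.]

```python
# Minimal sketch of the role-prompt pattern: assign a role, supply
# context, then ask the question. Names here are illustrative only.

def build_role_prompt(role: str, situation: str, ask: str) -> list[dict]:
    """Assemble a chat-style message list for a role-based prompt."""
    return [
        # The role assignment goes first, as a system-style instruction.
        {"role": "system", "content": f"Take on the role of a {role}."},
        # Then the user's context and the actual request.
        {"role": "user", "content": f"Here is my situation: {situation}\n{ask}"},
    ]

messages = build_role_prompt(
    role="world-class leadership coach",
    situation="I need to have a hard conversation with a direct report.",
    ask="What questions should I think through before the conversation?",
)
# `messages` would then be sent to whichever chat model you use
# (ChatGPT, Gemini, etc.); richer situation context tends to yield
# more specific, useful responses.
```

The same template covers the other examples mentioned (comms consultant reviewing a meeting transcript, speaking coach critiquing a keynote draft): only `role`, `situation`, and `ask` change.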
309 00:16:17,000 --> 00:16:22,240 Speaker 1: Yeah. And I think that knowing it, it's kind of 310 00:16:22,320 --> 00:16:27,680 Speaker 1: like owning a Formula One car, and all I know 311 00:16:27,800 --> 00:16:30,360 Speaker 1: how to do is sit in it and back it 312 00:16:30,400 --> 00:16:33,400 Speaker 1: out of the drive at the moment. Do you know 313 00:16:33,440 --> 00:16:36,560 Speaker 1: what I'm saying? It's like this incredible thing that can 314 00:16:36,600 --> 00:16:40,520 Speaker 1: do all this crazy shit, and I can just, you know, 315 00:16:40,640 --> 00:16:47,400 Speaker 1: reverse it twelve feet. But it's got so 316 00:16:47,720 --> 00:16:53,160 Speaker 1: much capacity and ability compared to me, depending 317 00:16:53,200 --> 00:16:56,640 Speaker 1: on the measure of intelligence that we're talking about. But 318 00:16:56,680 --> 00:16:59,240 Speaker 1: if we're talking about knowledge base, well, I'm a fucking 319 00:16:59,320 --> 00:17:03,960 Speaker 1: eggplant compared to ChatGPT, right? I'm a fucking moccasin 320 00:17:04,080 --> 00:17:07,840 Speaker 1: with a stinky foot in it, right? So it's 321 00:17:07,960 --> 00:17:11,720 Speaker 1: trying to know, how do I even 322 00:17:11,800 --> 00:17:15,720 Speaker 1: ask the question? Because the quality 323 00:17:15,720 --> 00:17:17,919 Speaker 1: of the output is dependent on the quality of the 324 00:17:17,960 --> 00:17:21,800 Speaker 1: prompt and the question. So if you ask dumb fucking questions, 325 00:17:22,160 --> 00:17:24,680 Speaker 1: you're wasting it. You know, you're backing the F1 out 326 00:17:24,680 --> 00:17:27,159 Speaker 1: of the garage and driving it back in. You're not 327 00:17:27,320 --> 00:17:30,520 Speaker 1: going around Albert Park at three hundred k's with the 328 00:17:30,560 --> 00:17:32,800 Speaker 1: fucking wind in your hair, you know.
329 00:17:32,880 --> 00:17:36,200 Speaker 1: It's like, it's that, yeah, how do I... And so 330 00:17:36,280 --> 00:17:41,359 Speaker 1: I think we need to train ourselves to know how 331 00:17:41,400 --> 00:17:48,000 Speaker 1: to optimize this, you know. And in this conversation, I 332 00:17:48,000 --> 00:17:50,040 Speaker 1: don't know the difference between you and Tiff, but if 333 00:17:50,080 --> 00:17:52,960 Speaker 1: Tiff's a seven out of ten in tech, I'm a two. 334 00:17:53,960 --> 00:17:57,120 Speaker 1: But then if we compare me and my dad, well, 335 00:17:57,160 --> 00:18:00,520 Speaker 1: I'm an eleven and he's a one, right? So it's 336 00:18:00,560 --> 00:18:07,040 Speaker 1: all context-dependent. But I think being scared of AI, 337 00:18:07,160 --> 00:18:13,560 Speaker 1: and being scared of advancing resources and evolving resources, is, 338 00:18:13,640 --> 00:18:19,520 Speaker 1: one, understandable, but two, not healthy, because it ain't going away. Now, 339 00:18:19,560 --> 00:18:21,800 Speaker 1: that doesn't mean your life's got to revolve around it 340 00:18:21,880 --> 00:18:25,040 Speaker 1: or you've got to be dependent on it. But so, 341 00:18:25,960 --> 00:18:31,960 Speaker 1: pursuant to this conversation, the other day Melissa goes, I'm 342 00:18:31,960 --> 00:18:33,960 Speaker 1: just going to send you something, tell me what you think, 343 00:18:34,640 --> 00:18:37,679 Speaker 1: and she sent me an audio file, and it was 344 00:18:37,800 --> 00:18:40,159 Speaker 1: twenty-six minutes or something. Tiff, I told you 345 00:18:40,200 --> 00:18:45,639 Speaker 1: this about my lit review? No? No. So one of 346 00:18:45,680 --> 00:18:51,280 Speaker 1: the papers that I'm doing for my PhD... so I'm 347 00:18:51,320 --> 00:18:54,760 Speaker 1: writing three papers that hopefully will be published in academic journals. 348 00:18:54,760 --> 00:18:56,880 Speaker 1: I've done one, I've finished it, it's been sent off.
I've 349 00:18:56,920 --> 00:18:59,000 Speaker 1: finished the second one, but it's in the kind of 350 00:18:59,040 --> 00:19:02,840 Speaker 1: final draft state, right, still fucking around with it. The 351 00:19:02,880 --> 00:19:06,199 Speaker 1: third one's underway, and anyway, it doesn't matter. But so 352 00:19:06,240 --> 00:19:13,119 Speaker 1: what Melissa did was she put my lit review, which is, 353 00:19:13,200 --> 00:19:16,880 Speaker 1: by the way, I don't know, thirty thousand words, seventy 354 00:19:16,920 --> 00:19:21,560 Speaker 1: five pages... It's a systematic literature review where I started 355 00:19:21,560 --> 00:19:28,840 Speaker 1: with eleven hundred, give or take, studies, research papers, and 356 00:19:29,960 --> 00:19:33,000 Speaker 1: did a review and whittled it down to one hundred 357 00:19:33,040 --> 00:19:37,240 Speaker 1: and thirteen papers that my paper focused on, basically the 358 00:19:37,560 --> 00:19:40,719 Speaker 1: output and the findings of all of these papers, looking 359 00:19:40,880 --> 00:19:45,719 Speaker 1: at how meta accuracy, in other words, how accurately you 360 00:19:45,720 --> 00:19:49,080 Speaker 1: can understand how others see you, how this has been 361 00:19:49,119 --> 00:19:53,800 Speaker 1: evaluated and researched across different domains, and all of these things. Right? 362 00:19:54,600 --> 00:19:58,320 Speaker 1: It's two years of work, this one paper. So she 363 00:19:58,560 --> 00:20:02,240 Speaker 1: plugs in this paper that we've written. So it's a 364 00:20:02,359 --> 00:20:05,200 Speaker 1: collaborative effort, because you have three or four people working 365 00:20:05,200 --> 00:20:08,360 Speaker 1: on the paper. I'm the lead researcher because I'm the student, 366 00:20:08,440 --> 00:20:10,520 Speaker 1: but, you know, other people look at it.
But anyway, 367 00:20:10,640 --> 00:20:14,119 Speaker 1: so plugged in this paper, and she sends me a 368 00:20:14,160 --> 00:20:19,080 Speaker 1: twenty six minute podcast, which took three minutes to produce, 369 00:20:20,080 --> 00:20:24,439 Speaker 1: of this lady and man, and I use that in, you know, 370 00:20:25,080 --> 00:20:29,840 Speaker 1: inverted commas, having a conversation about this new paper that's 371 00:20:29,960 --> 00:20:33,920 Speaker 1: just come out. Yeah, it's fucking... it almost made me cry. 372 00:20:34,520 --> 00:20:40,560 Speaker 1: It's like, I mean, it's fucking amazing. So all I've 373 00:20:40,560 --> 00:20:42,399 Speaker 1: got to do is... and I may or may not. 374 00:20:42,480 --> 00:20:44,920 Speaker 1: I was thinking about putting it up as an episode. 375 00:20:45,760 --> 00:20:48,159 Speaker 1: I spoke the other day, I riffed a little bit 376 00:20:48,200 --> 00:20:50,639 Speaker 1: about my research in some detail, just to try to 377 00:20:50,680 --> 00:20:53,520 Speaker 1: explain it to people. But honestly, they do a better 378 00:20:53,600 --> 00:20:56,160 Speaker 1: job than I do, because it's just so much, right, 379 00:20:57,960 --> 00:21:02,719 Speaker 1: in terms of, like, disseminating a lot of work and 380 00:21:02,760 --> 00:21:05,959 Speaker 1: a lot of jargon and a lot of academic theory 381 00:21:06,119 --> 00:21:11,800 Speaker 1: and language. These two people, in inverted commas, a lady 382 00:21:11,840 --> 00:21:14,320 Speaker 1: and a guy, are just having this conversation about this 383 00:21:14,359 --> 00:21:19,000 Speaker 1: new research paper, this new systematic literature review and the findings, 384 00:21:19,080 --> 00:21:24,800 Speaker 1: and it's a podcast about my paper, and honestly, it's... 385 00:21:25,400 --> 00:21:29,399 Speaker 1: you would not know. It literally sounds like two real 386 00:21:29,440 --> 00:21:32,560 Speaker 1: people having a chat. Yeah, it's mine.
And that took... 387 00:21:33,440 --> 00:21:38,400 Speaker 1: and not only that, it distilled seventy pages of content 388 00:21:38,960 --> 00:21:44,960 Speaker 1: and produced a twenty six minute podcast in minutes. In minutes. 389 00:21:45,320 --> 00:21:49,200 Speaker 1: And it's all right, it's all fucking... I don't even... 390 00:21:49,320 --> 00:21:51,320 Speaker 1: It just blows my mind. Yeah. 391 00:21:51,720 --> 00:21:54,040 Speaker 3: I think your point around the quality of the input 392 00:21:54,119 --> 00:21:56,280 Speaker 3: determines the quality of the output is such a profound one. 393 00:21:57,040 --> 00:22:00,480 Speaker 3: One of the things I think about, in terms of... 394 00:22:00,720 --> 00:22:03,439 Speaker 3: back to this, how do we help humans see that 395 00:22:03,480 --> 00:22:05,280 Speaker 3: this can enable us to be more human or more 396 00:22:05,320 --> 00:22:08,239 Speaker 3: effective as humans, is like, that is true when you 397 00:22:08,280 --> 00:22:10,080 Speaker 3: have a conversation with a person. I mean, you two 398 00:22:10,119 --> 00:22:12,800 Speaker 3: would know this. When you're coaching someone, the quality of 399 00:22:12,840 --> 00:22:15,000 Speaker 3: the question that you ask the person will determine the 400 00:22:15,080 --> 00:22:18,640 Speaker 3: quality of the response they give you. And sometimes you'll 401 00:22:18,680 --> 00:22:20,639 Speaker 3: ask your question and you'll get a response in a 402 00:22:20,680 --> 00:22:23,080 Speaker 3: coaching conversation and you'll be like, huh, that's not quite... 403 00:22:23,119 --> 00:22:24,800 Speaker 3: I didn't quite nail that question. Let me try a 404 00:22:24,840 --> 00:22:28,600 Speaker 3: different question. Or, yes, that's not exactly what I think 405 00:22:28,680 --> 00:22:31,359 Speaker 3: you mean. Tell me more about what you mean by that.
Like, 406 00:22:32,040 --> 00:22:36,240 Speaker 3: people who are spending most of their days focused on 407 00:22:36,320 --> 00:22:41,440 Speaker 3: asking great questions of humans, I think, are uniquely set 408 00:22:41,520 --> 00:22:45,520 Speaker 3: up to succeed in their interactions with these technologies. And conversely, 409 00:22:45,600 --> 00:22:47,320 Speaker 3: if you want to get better at coaching people or 410 00:22:47,359 --> 00:22:49,960 Speaker 3: asking questions, a pretty good way to practice is 411 00:22:50,000 --> 00:22:53,919 Speaker 3: to ask AI to take on the role of, 412 00:22:53,920 --> 00:22:56,679 Speaker 3: like, coach and client, and practice asking questions of it and 413 00:22:56,720 --> 00:22:58,560 Speaker 3: just see what happens. Because, like you said, the 414 00:22:58,640 --> 00:23:01,000 Speaker 3: quality of your question will essentially determine the quality of 415 00:23:01,000 --> 00:23:01,399 Speaker 3: your output. 416 00:23:02,200 --> 00:23:04,920 Speaker 1: And let's, let's add another layer to what you're saying. 417 00:23:05,040 --> 00:23:09,760 Speaker 1: Let's add a metaperception layer to this. Because the right 418 00:23:09,840 --> 00:23:13,719 Speaker 1: question for Tiff might be the wrong question for the 419 00:23:13,760 --> 00:23:16,040 Speaker 1: next forty two year old chick, you know what I mean? 420 00:23:16,359 --> 00:23:19,040 Speaker 1: And the question that will totally make sense to Tiff 421 00:23:19,760 --> 00:23:23,560 Speaker 1: will be psychobabble to someone else. Yeah. So it's not, 422 00:23:23,560 --> 00:23:27,280 Speaker 1: not even just the right question, but the right language 423 00:23:27,960 --> 00:23:32,200 Speaker 1: and the right energy... for... energy is the wrong word... 424 00:23:32,200 --> 00:23:36,160 Speaker 1: the right language, the right approach for that person. Yeah, 425 00:23:36,359 --> 00:23:38,439 Speaker 1: I agree. You know.
It's like when someone goes to me, 426 00:23:38,480 --> 00:23:40,360 Speaker 1: how does that make you feel? I want to punch 427 00:23:40,440 --> 00:23:43,919 Speaker 1: him in the face. Wrong question for me, right? For me, 428 00:23:44,600 --> 00:23:47,679 Speaker 1: I'm like, fucking no. How does that make you feel? 429 00:23:47,840 --> 00:23:48,480 Speaker 1: Fuck off. 430 00:23:48,720 --> 00:23:48,959 Speaker 4: You know? 431 00:23:50,920 --> 00:23:53,119 Speaker 1: By the way, I wouldn't punch anyone in the face. 432 00:23:55,160 --> 00:24:01,480 Speaker 1: Let me say, anymore. But I just feel like it, 433 00:24:01,520 --> 00:24:05,119 Speaker 1: you know. But it's like, it's... even for me, sometimes 434 00:24:05,160 --> 00:24:07,120 Speaker 1: I roll my eyes. I'm like, oh my god, ask 435 00:24:07,200 --> 00:24:10,160 Speaker 1: me a better question. Please ask me a better question, you 436 00:24:10,119 --> 00:24:13,800 Speaker 3: know. And because you want to provide a better output, 437 00:24:13,800 --> 00:24:16,280 Speaker 3: but you kind of can't unless you get the better question. 438 00:24:16,960 --> 00:24:19,920 Speaker 3: So, I think there's a parallel here too. Like, people... 439 00:24:19,960 --> 00:24:22,880 Speaker 3: I feel like in conversations I hear with people talking 440 00:24:22,920 --> 00:24:26,800 Speaker 3: about AI, people are really grappling and understanding, like, 441 00:24:27,160 --> 00:24:29,879 Speaker 3: oh, I get it. The better question I ask, and 442 00:24:29,920 --> 00:24:32,760 Speaker 3: the more context I give, the better response I get. 443 00:24:32,840 --> 00:24:34,920 Speaker 3: And I feel like going.
You know, the same is 444 00:24:34,960 --> 00:24:37,040 Speaker 3: true with humans. Like, are you aware that when you 445 00:24:37,080 --> 00:24:39,720 Speaker 3: walk into a meeting and give no context and talk 446 00:24:39,760 --> 00:24:43,000 Speaker 3: at people without asking any questions, of course they don't 447 00:24:43,080 --> 00:24:46,280 Speaker 3: understand what you're talking about, because your input is terrible quality. 448 00:24:46,720 --> 00:24:49,639 Speaker 3: Think about the quality of the input. Yeah. And then, 449 00:24:49,880 --> 00:24:53,040 Speaker 3: a sidebar, I think this relates to... one of 450 00:24:53,080 --> 00:24:56,800 Speaker 3: the slightly creepy but really awesome prompts I've seen used, 451 00:24:56,800 --> 00:24:59,760 Speaker 3: and I've done this myself, is when you say something like, 452 00:25:00,200 --> 00:25:03,800 Speaker 3: based on everything you know about me, yeah, give me 453 00:25:03,840 --> 00:25:06,040 Speaker 3: some feedback on what I might be really good at 454 00:25:06,040 --> 00:25:08,800 Speaker 3: and what some of my blind spots might be. Or, 455 00:25:09,080 --> 00:25:11,200 Speaker 3: you know, to go down one of your rabbit holes, 456 00:25:11,240 --> 00:25:13,919 Speaker 3: like, what would your experience be of me 457 00:25:14,160 --> 00:25:16,560 Speaker 3: based on all of our interactions? And the response it 458 00:25:16,640 --> 00:25:20,280 Speaker 3: can give you is fucking wild. Like, how accurately it 459 00:25:20,440 --> 00:25:22,439 Speaker 3: called me out on my blind spots. And I was like, 460 00:25:22,480 --> 00:25:24,959 Speaker 3: holy shit, how did you get that just based on 461 00:25:25,000 --> 00:25:25,680 Speaker 3: our interactions? 462 00:25:25,720 --> 00:25:29,400 Speaker 1: That's kind of creepy. Yeah. Yeah, but again, it's all 463 00:25:29,440 --> 00:25:31,960 Speaker 1: to help you to be more self aware, I think.
Well, 464 00:25:31,960 --> 00:25:35,520 Speaker 1: there's a very famous saying that you've heard, everyone's heard, 465 00:25:35,520 --> 00:25:38,520 Speaker 1: I think, but it's by an author who I've quoted 466 00:25:38,520 --> 00:25:40,680 Speaker 1: ten times over the years. Her name's Anaïs Nin, 467 00:25:40,760 --> 00:25:42,719 Speaker 1: and she said, we don't see things as they are, 468 00:25:42,800 --> 00:25:45,880 Speaker 1: we see things as we are. Right? That's good. Which 469 00:25:45,920 --> 00:25:51,119 Speaker 1: speaks to this whole subjective version of my life. It's like, 470 00:25:51,800 --> 00:25:54,960 Speaker 1: there's what's going on, and then there's my version of 471 00:25:54,960 --> 00:25:57,520 Speaker 1: what's going on, and they ain't the same. There's my world, 472 00:25:57,520 --> 00:25:59,879 Speaker 1: and there's Pete's world and Tiff's world, and then there's 473 00:26:00,119 --> 00:26:03,879 Speaker 1: the world. You know, objective, subjective. But the problem is 474 00:26:03,920 --> 00:26:07,639 Speaker 1: that a lot of people think their subjective experience is 475 00:26:07,680 --> 00:26:10,960 Speaker 1: an objective reality. And that's when the shit hits the fan. 476 00:26:11,400 --> 00:26:15,560 Speaker 1: So true, so true. You don't even recognize that you 477 00:26:15,680 --> 00:26:18,879 Speaker 1: have a lens through which you look. Yeah, you just 478 00:26:19,000 --> 00:26:22,360 Speaker 1: think that's the world. That's not the world, dude. No, 479 00:26:22,480 --> 00:26:26,199 Speaker 1: that's your version of the world. Yeah. Ergo, you know, 480 00:26:26,320 --> 00:26:29,159 Speaker 1: three of us in the same conversation, none of us 481 00:26:29,200 --> 00:26:30,280 Speaker 1: in the same reality. 482 00:26:30,640 --> 00:26:33,040 Speaker 3: That's pretty... Yeah.
It reminds me of when you run 483 00:26:33,040 --> 00:26:36,000 Speaker 3: a workshop and you get, you know, someone in the 484 00:26:36,040 --> 00:26:38,720 Speaker 3: feedback form afterwards will be like, that was the best 485 00:26:38,800 --> 00:26:41,560 Speaker 3: workshop I've ever been a part of, and then someone 486 00:26:41,560 --> 00:26:44,040 Speaker 3: else who was in the very same session will be like, eh, 487 00:26:44,240 --> 00:26:45,760 Speaker 3: bit of a waste of my time, to be honest. 488 00:26:47,960 --> 00:26:51,080 Speaker 3: We were in the same room talking about the same thing, 489 00:26:51,160 --> 00:26:53,760 Speaker 3: but these two people just have, like, a totally opposite 490 00:26:53,760 --> 00:26:55,919 Speaker 3: experience of it based on their own reality. 491 00:26:56,480 --> 00:27:00,720 Speaker 1: Correct, correct. Or even, you know, like... because I'm 492 00:27:00,840 --> 00:27:02,879 Speaker 1: very sweary and I push the buttons a bit and 493 00:27:03,000 --> 00:27:06,359 Speaker 1: push the envelope, or at least on Instagram, and 494 00:27:06,640 --> 00:27:09,040 Speaker 1: the same thing, that... I know if I write, you know, 495 00:27:09,119 --> 00:27:13,040 Speaker 1: something with bad words in it, which I did last night, 496 00:27:13,080 --> 00:27:16,040 Speaker 1: I put up a post and it gets a bigger 497 00:27:16,080 --> 00:27:19,280 Speaker 1: than normal response, probably two, three, four, ten x, depending 498 00:27:19,280 --> 00:27:22,080 Speaker 1: on what. But I'll always get an email from someone 499 00:27:23,040 --> 00:27:26,840 Speaker 1: basically saying, I thought you were more intelligent than that. 500 00:27:27,280 --> 00:27:30,560 Speaker 1: I'm so disappointed in you. I'm going to unfollow you, 501 00:27:31,320 --> 00:27:34,040 Speaker 1: like, I respected you, all this stuff. And I'm like, 502 00:27:35,119 --> 00:27:39,800 Speaker 1: I get it, I get it.
But the disparity between 503 00:27:40,119 --> 00:27:43,120 Speaker 1: someone going, oh mate, you're hilarious, that's great, and someone 504 00:27:43,119 --> 00:27:45,680 Speaker 1: else going, you're disgusting, you need to go to some 505 00:27:45,760 --> 00:27:49,720 Speaker 1: mirrors... from the same stimulus. And by the way, I'm 506 00:27:49,720 --> 00:27:52,359 Speaker 1: not saying that the person who's negative... I'm not saying 507 00:27:52,400 --> 00:27:57,320 Speaker 1: they're right. I'm just saying I'm fascinated by the divergence 508 00:27:57,359 --> 00:28:00,919 Speaker 1: of responses, and how both of those people, both of 509 00:28:01,000 --> 00:28:05,440 Speaker 1: those responders, think they're right, unequivocally. 510 00:28:05,280 --> 00:28:06,800 Speaker 3: Right in their own head. They're right in their own head. 511 00:28:06,800 --> 00:28:08,920 Speaker 3: Because no one wakes up in the morning going, I'm 512 00:28:08,920 --> 00:28:11,480 Speaker 3: going to behave like an irrational idiot today. I'm just 513 00:28:11,520 --> 00:28:13,240 Speaker 3: going to get offended at everything, and I'm going to 514 00:28:13,320 --> 00:28:16,160 Speaker 3: just walk through the world and piss everyone off. Everyone 515 00:28:16,200 --> 00:28:19,639 Speaker 3: wakes up and wants to believe they're acting rationally, with 516 00:28:19,800 --> 00:28:23,320 Speaker 3: thought, and that their perception of the world is the 517 00:28:23,359 --> 00:28:27,119 Speaker 3: same as everyone else's. That's fascinating. 518 00:28:27,119 --> 00:28:30,200 Speaker 1: So there's two things I want to ask you. I'm 519 00:28:30,200 --> 00:28:32,800 Speaker 1: going to put on... The first one is, so, as 520 00:28:32,880 --> 00:28:36,080 Speaker 1: much or as little as you want to divulge, you 521 00:28:36,119 --> 00:28:40,360 Speaker 1: know, this is not at all a vulnerability test.
What, 522 00:28:40,360 --> 00:28:45,040 Speaker 1: what was the feedback from AI that told you you 523 00:28:45,080 --> 00:28:47,320 Speaker 1: need to do better? Like, what... to go, hey, look, 524 00:28:47,360 --> 00:28:49,280 Speaker 1: you're good at this, but you suck at that. What 525 00:28:49,360 --> 00:28:49,640 Speaker 1: was that? 526 00:28:50,120 --> 00:28:58,760 Speaker 3: The paraphrase was, you are constantly seeking more insight and 527 00:28:59,000 --> 00:29:05,600 Speaker 3: examples and clarification for the things that you're interested in, 528 00:29:05,880 --> 00:29:09,440 Speaker 3: rather than going and doing. So, kind of like, stop 529 00:29:09,520 --> 00:29:12,000 Speaker 3: asking me how to make this keynote better and just 530 00:29:12,040 --> 00:29:13,800 Speaker 3: go and do the keynote. That was kind of how 531 00:29:13,840 --> 00:29:17,640 Speaker 3: I interpreted it. And I was like, I mean, firstly, 532 00:29:18,200 --> 00:29:21,120 Speaker 3: good point, ChatGPT, good point, fair enough, I'll take 533 00:29:21,120 --> 00:29:24,800 Speaker 3: that on the chin. But also, also, isn't that why 534 00:29:24,840 --> 00:29:27,400 Speaker 3: you're here? For me to clarify things and ask you 535 00:29:27,480 --> 00:29:29,160 Speaker 3: things and for you to help me make them better? 536 00:29:29,440 --> 00:29:32,360 Speaker 3: So I was a bit like... I was actually not annoyed, 537 00:29:32,360 --> 00:29:34,000 Speaker 3: but I was very, like... 538 00:29:34,520 --> 00:29:41,040 Speaker 4: You were definitely annoyed, because it's true. Because it's, you know, 539 00:29:41,200 --> 00:29:43,720 Speaker 4: like, I don't know, there would have been an example 540 00:29:43,720 --> 00:29:45,800 Speaker 4: in there of, my toddler has a small rash on 541 00:29:45,800 --> 00:29:47,720 Speaker 4: his right knee, like, what should I be doing about that?
542 00:29:47,800 --> 00:29:50,520 Speaker 3: It's like constantly seeking validation that I was doing the 543 00:29:50,600 --> 00:29:53,880 Speaker 3: right thing, or being the right dad, or thinking about 544 00:29:53,880 --> 00:29:55,920 Speaker 3: my keynote in the right way. And I think this 545 00:29:56,040 --> 00:29:59,320 Speaker 3: goes to... honestly, it goes to something that I'm aware 546 00:29:59,400 --> 00:30:03,120 Speaker 3: of in myself, which is that I have this 547 00:30:03,280 --> 00:30:07,840 Speaker 3: tendency to always be looking for the A, because at school 548 00:30:08,200 --> 00:30:11,000 Speaker 3: I was really freaking good at finding the A, figuring 549 00:30:11,040 --> 00:30:13,520 Speaker 3: out how to get an A, and getting the A. Yeah. 550 00:30:13,520 --> 00:30:15,680 Speaker 3: And it served me at school, because I got a 551 00:30:15,680 --> 00:30:18,440 Speaker 3: bunch of A's. But it doesn't serve me in many 552 00:30:18,480 --> 00:30:22,360 Speaker 3: ways now, because I'm constantly looking for someone to go, yeah, man, 553 00:30:22,440 --> 00:30:25,160 Speaker 3: you got the A. To the point where my, my 554 00:30:25,280 --> 00:30:28,360 Speaker 3: ChatGPT is like, stop asking me for the A, dude. 555 00:30:28,720 --> 00:30:33,600 Speaker 1: Oh, that is so good. That is... And isn't it funny 556 00:30:33,920 --> 00:30:37,360 Speaker 1: how, like, where we get our sense of self worth 557 00:30:37,440 --> 00:30:42,400 Speaker 1: or self esteem or, you know, positive reinforcement... or, you know, 558 00:30:42,480 --> 00:30:45,360 Speaker 1: for me, because I wasn't you, I wasn't getting the A's. 559 00:30:45,360 --> 00:30:47,680 Speaker 1: For me, it was always about my biceps or how 560 00:30:48,000 --> 00:30:52,480 Speaker 1: well I could do whatever, right? Yes, yeah. And then, yeah, 561 00:30:52,520 --> 00:30:54,880 Speaker 1: and you just keep going back there. It's like, have 562 00:30:54,960 --> 00:30:58,560 Speaker 1: you seen my veins?
Like, before we started, you said 563 00:30:58,600 --> 00:31:02,760 Speaker 1: you look fucking skinny, and a little bit of me went, huh. 564 00:31:05,200 --> 00:31:09,880 Speaker 1: I meant it as a compliment, that I totally... and 565 00:31:09,960 --> 00:31:12,040 Speaker 1: I said to you, well, I've spent most of the 566 00:31:12,080 --> 00:31:13,920 Speaker 1: last ten years at eighty five, and right now I'm 567 00:31:13,960 --> 00:31:16,520 Speaker 1: at eighty one, so you're actually right, I am. I 568 00:31:16,560 --> 00:31:24,440 Speaker 1: am skinnier. Pete, skinny... yeah, don't tell old bodybuilders they 569 00:31:24,520 --> 00:31:29,320 Speaker 1: look skinny. Vascular... looking very vascular, that was definitely the winner. 570 00:31:29,960 --> 00:31:31,680 Speaker 3: Yeah, Tiff's got the chief. I did not get the 571 00:31:31,680 --> 00:31:33,360 Speaker 3: A this time around. I did not get the A. 572 00:31:34,200 --> 00:31:35,920 Speaker 1: But it is funny how we do that, and it 573 00:31:36,000 --> 00:31:39,120 Speaker 1: is funny how we carry that, that childhood stuff. But 574 00:31:39,160 --> 00:31:42,200 Speaker 1: I want to go, just momentarily, and then we can 575 00:31:42,240 --> 00:31:46,080 Speaker 1: go wherever you like, back to the question thing. Like, 576 00:31:46,120 --> 00:31:48,880 Speaker 1: we're talking about what's the, you know, what's the best 577 00:31:48,960 --> 00:31:54,200 Speaker 1: question for Tiff versus Don versus John versus Brian and Brian. 578 00:31:54,760 --> 00:31:56,600 Speaker 1: But then there's the... what's the best question... what's the 579 00:31:56,680 --> 00:32:00,560 Speaker 1: question I should ask myself? Like, what's... that's the question 580 00:32:00,640 --> 00:32:03,400 Speaker 1: I should ask me that I'm not asking me. I mean, 581 00:32:03,480 --> 00:32:07,320 Speaker 1: the hard, uncomfortable, the fucking, I don't want to ask 582 00:32:07,360 --> 00:32:10,200 Speaker 1: this question because I know the answer.
So I'm just 583 00:32:10,200 --> 00:32:14,080 Speaker 1: going to stay up here in my denial and my 584 00:32:14,200 --> 00:32:17,000 Speaker 1: avoidance and my, fuck off, I'm too busy, I'll get 585 00:32:17,040 --> 00:32:20,520 Speaker 1: to that later. And now I'm sixty. Boom. You know, 586 00:32:21,200 --> 00:32:25,200 Speaker 1: it's like, what's that question? Why, why do I... And 587 00:32:25,280 --> 00:32:28,640 Speaker 1: this... I've had this many times with people about lots 588 00:32:28,680 --> 00:32:30,800 Speaker 1: of things, but one that a lot of people will identify with, 589 00:32:31,600 --> 00:32:35,680 Speaker 1: so, their relationship with food. Like, a lot of people 590 00:32:35,920 --> 00:32:38,280 Speaker 1: just do... and there's no judgment in this, this is 591 00:32:38,400 --> 00:32:42,160 Speaker 1: just my experience, this is my anecdotal evidence, everyone. But 592 00:32:42,400 --> 00:32:46,200 Speaker 1: just the amount of people who simultaneously say to me, 593 00:32:46,840 --> 00:32:49,480 Speaker 1: I want to be fitter and healthier and a bit leaner, 594 00:32:50,800 --> 00:32:54,840 Speaker 1: while on the same day putting shit in their body, consciously. 595 00:32:55,640 --> 00:32:58,640 Speaker 1: I'm like, let's... I'm not hating on you, I'm not 596 00:32:58,760 --> 00:33:01,719 Speaker 1: judging you, but let's lean into that. As Pete Shepherd 597 00:33:01,720 --> 00:33:05,479 Speaker 1: would say, let's lean into that with some curiosity and 598 00:33:05,520 --> 00:33:09,760 Speaker 1: go, tell me about that. Because you're not dumb, you're, 599 00:33:09,840 --> 00:33:12,880 Speaker 1: you're pretty highly intelligent. And by the way, there's no wagon, 600 00:33:12,920 --> 00:33:15,880 Speaker 1: and you didn't fall off it. There's no wagon to 601 00:33:15,920 --> 00:33:19,040 Speaker 1: fall off, so fuck your metaphors. You made a decision, 602 00:33:19,240 --> 00:33:22,000 Speaker 1: so let's just call it what it is.
You didn't 603 00:33:22,040 --> 00:33:24,760 Speaker 1: accidentally eat the cake or do all those things. And 604 00:33:24,760 --> 00:33:27,520 Speaker 1: by the way, I wouldn't care, but you brought it up. 605 00:33:27,920 --> 00:33:31,040 Speaker 1: You came to me and said, I do not look 606 00:33:31,280 --> 00:33:33,520 Speaker 1: like... sorry, I do not like how I look or 607 00:33:33,560 --> 00:33:36,320 Speaker 1: how I feel or how my body works, and I 608 00:33:36,360 --> 00:33:39,720 Speaker 1: went, cool, I'm an exercise scientist, I can help. And 609 00:33:39,760 --> 00:33:42,239 Speaker 1: then you went, all right, here are my issues. And 610 00:33:42,280 --> 00:33:44,880 Speaker 1: then on the way home you stopped at McDonald's. Let's 611 00:33:44,960 --> 00:33:48,520 Speaker 1: chat about that. You know. So, I think those questions 612 00:33:48,520 --> 00:33:52,400 Speaker 1: that we ask ourselves are the scary ones, but potentially 613 00:33:52,440 --> 00:33:54,720 Speaker 1: the transformational ones. Totally agree. 614 00:33:54,760 --> 00:33:57,479 Speaker 3: I mean, I love that as a prompt, not to 615 00:33:57,520 --> 00:33:59,760 Speaker 3: just put everything back into that... how do we ask that? 616 00:33:59,800 --> 00:34:02,480 Speaker 3: But I'd be fascinated if I asked it, what's the question 617 00:34:02,560 --> 00:34:05,440 Speaker 3: you think I'm avoiding? I wonder what response I'd get. 618 00:34:05,520 --> 00:34:06,400 Speaker 3: That'd be fascinating. 619 00:34:07,000 --> 00:34:10,920 Speaker 1: Yeah, basically... think, right? Exactly. What's... 620 00:34:10,760 --> 00:34:11,919 Speaker 3: The question I should ask myself? 621 00:34:12,160 --> 00:34:12,440 Speaker 1: I don't know. 622 00:34:12,800 --> 00:34:14,279 Speaker 3: I don't know if you wanted to answer. The one 623 00:34:14,320 --> 00:34:15,960 Speaker 3: that came to mind was, I guess, for me,
off 624 00:34:15,960 --> 00:34:18,600 Speaker 3: the back of what we just said, there's probably a 625 00:34:18,719 --> 00:34:22,680 Speaker 3: question like... there's probably a question like, how do you 626 00:34:22,719 --> 00:34:24,200 Speaker 3: give yourself an A? For me. 627 00:34:25,440 --> 00:34:30,200 Speaker 1: Rather than relying on or expecting external validation for the thing 628 00:34:30,200 --> 00:34:33,359 Speaker 1: that you're doing, and waiting for, it's okay because someone 629 00:34:33,400 --> 00:34:35,640 Speaker 1: else said it was, or it's good enough because someone 630 00:34:35,840 --> 00:34:38,000 Speaker 1: said it was. What does it look like to give 631 00:34:38,000 --> 00:34:40,640 Speaker 1: that to yourself, and how do you do that more 632 00:34:40,680 --> 00:34:43,520 Speaker 1: and more? I could see myself having a little spiral 633 00:34:43,520 --> 00:34:46,840 Speaker 1: and a midlife crisis off the back of that question. Yeah. 634 00:34:46,880 --> 00:34:48,799 Speaker 1: Are you a person... I don't know if this is true, 635 00:34:48,840 --> 00:34:52,400 Speaker 1: this is curiosity, not assumption. Are you a person that... 636 00:34:54,280 --> 00:34:59,040 Speaker 1: you... I feel like you really like being, like, knowing 637 00:34:59,080 --> 00:35:03,680 Speaker 1: what's coming, being in control, being very prepped, cross your 638 00:35:03,680 --> 00:35:08,080 Speaker 1: t's, dot your i's, like, leave nothing to chance. Is 639 00:35:08,160 --> 00:35:10,920 Speaker 1: that you? 640 00:35:11,040 --> 00:35:11,360 Speaker 3: Not really. 641 00:35:12,080 --> 00:35:12,560 Speaker 1: Probably?
642 00:35:12,800 --> 00:35:15,760 Speaker 3: But it looks different than I think what most people 643 00:35:15,760 --> 00:35:18,000 Speaker 3: think of when they think that. Like, I'll walk into... 644 00:35:18,520 --> 00:35:20,279 Speaker 3: you and I have talked about this before, but I'll 645 00:35:20,320 --> 00:35:23,080 Speaker 3: walk into a workshop or a keynote, for example, with 646 00:35:23,200 --> 00:35:24,680 Speaker 3: a few ideas in my head on what I want 647 00:35:24,680 --> 00:35:27,440 Speaker 3: to talk about, not like, here's my script that I'm 648 00:35:27,440 --> 00:35:31,520 Speaker 3: going to recite and replicate. So I want to say no, 649 00:35:31,760 --> 00:35:34,560 Speaker 3: because I'm like, I'm quite comfortable in the ambiguity of 650 00:35:34,560 --> 00:35:39,560 Speaker 3: where's this thing going to go. However, you could argue that, well, 651 00:35:39,680 --> 00:35:42,440 Speaker 3: you know, you're only comfortable doing that because you have 652 00:35:42,520 --> 00:35:44,960 Speaker 3: a pretty good idea of where things are going, and 653 00:35:45,000 --> 00:35:48,839 Speaker 3: you're prepped enough in your skilled facilitation, if you want 654 00:35:48,840 --> 00:35:51,120 Speaker 3: to call it that, that you're confident that you'll... 655 00:35:51,239 --> 00:35:53,160 Speaker 3: you'll be able to handle whatever gets thrown your way. 656 00:35:54,000 --> 00:35:56,680 Speaker 3: So, not to, like, dodge your question, because I kind 657 00:35:56,680 --> 00:35:58,440 Speaker 3: of feel like I am, but I guess there is 658 00:35:58,480 --> 00:36:02,080 Speaker 3: a preparation there, but it doesn't look and feel like 659 00:36:02,400 --> 00:36:04,520 Speaker 3: what I think of when I think of preparation. If 660 00:36:04,520 --> 00:36:06,280 Speaker 3: that makes any sense whatsoever. 661 00:36:06,480 --> 00:36:09,919 Speaker 1: We're diverging a little bit.
No, that makes sense. And 662 00:36:09,360 --> 00:36:13,240 Speaker 1: I think, like with you, for example, we were chatting 663 00:36:13,239 --> 00:36:15,960 Speaker 1: about nothing to do with what we're chatting about now 664 00:36:16,000 --> 00:36:19,200 Speaker 1: before we started recording, and I went, hey, I've got 665 00:36:19,200 --> 00:36:21,920 Speaker 1: a meeting, we should start. Literally twelve seconds later we 666 00:36:21,960 --> 00:36:25,920 Speaker 1: started, with nobody even saying, what are we going to 667 00:36:25,960 --> 00:36:29,000 Speaker 1: talk about? Like, we did not, we did not share 668 00:36:29,080 --> 00:36:32,799 Speaker 1: one word about what the content of this might be 669 00:36:32,920 --> 00:36:36,759 Speaker 1: or what we might do. But, like, back to this 670 00:36:36,800 --> 00:36:41,120 Speaker 1: thing of, like, just turning up... Tiff, feel free to 671 00:36:41,239 --> 00:36:43,160 Speaker 1: edit any of this or chuck it out if I 672 00:36:43,200 --> 00:36:47,640 Speaker 1: say anything that I shouldn't say. But am I allowed 673 00:36:47,680 --> 00:36:49,640 Speaker 1: to say that you've been doing a course, which is 674 00:36:49,680 --> 00:36:54,160 Speaker 1: part of, you know, all your presenting stuff? Is that okay? 675 00:36:53,040 --> 00:36:56,360 Speaker 1: So, so Tiff and I have been chatting about this 676 00:36:56,480 --> 00:36:58,919 Speaker 1: course that she's doing, where it's helping her, because she's 677 00:36:59,000 --> 00:37:01,920 Speaker 1: now, she's in the space, she's speaking to companies and teams, 678 00:37:01,920 --> 00:37:08,280 Speaker 1: and she's doing the thing, right?
And I think part 679 00:37:08,320 --> 00:37:11,680 Speaker 1: of the challenge for her, for example, is she's going 680 00:37:11,680 --> 00:37:14,560 Speaker 1: into this environment where there's, like... well, there's a bit 681 00:37:14,600 --> 00:37:17,200 Speaker 1: of a process and there's a model, and it's not 682 00:37:17,280 --> 00:37:20,040 Speaker 1: cookie cutter, but, like, it's almost like there's a template, 683 00:37:20,120 --> 00:37:22,600 Speaker 1: and in this period of time we do this. So 684 00:37:22,800 --> 00:37:26,279 Speaker 1: it's somewhat formulaic, which is not bad, because I think 685 00:37:26,360 --> 00:37:31,239 Speaker 1: formulas, or formulae more correctly, have a place. But then, 686 00:37:31,280 --> 00:37:35,399 Speaker 1: at the same time, she's a metaphoric dog with three dicks, right? 687 00:37:35,480 --> 00:37:38,319 Speaker 1: She just goes where she goes, and she's a bit 688 00:37:38,440 --> 00:37:41,239 Speaker 1: freestyle and a bit loosey goosey, and a little bit 689 00:37:41,520 --> 00:37:46,920 Speaker 1: just relies on instinct, intuition, and, with a lot of 690 00:37:47,000 --> 00:37:51,160 Speaker 1: things, makes stuff up. Not as in, you know, not 691 00:37:51,200 --> 00:37:53,719 Speaker 1: as in lying. But... and I do a lot of 692 00:37:53,760 --> 00:37:55,200 Speaker 1: the same, and I think you do a lot of 693 00:37:55,200 --> 00:37:58,839 Speaker 1: the same. What's the balance between, like, when you want 694 00:37:58,840 --> 00:38:02,880 Speaker 1: to get somewhere, being strategic... or you want to produce 695 00:38:02,920 --> 00:38:04,440 Speaker 1: a good outcome, whether or not that's to be a 696 00:38:04,480 --> 00:38:06,560 Speaker 1: world class speaker or build a business or whatever 697 00:38:06,600 --> 00:38:10,120 Speaker 1: it is, like, how much strategy and structure, and how 698 00:38:10,200 --> 00:38:16,359 Speaker 1: much intuition and instinct?
I don't know that there's an 699 00:38:16,360 --> 00:38:18,279 Speaker 1: answer to that, but maybe if you could just talk 700 00:38:18,440 --> 00:38:19,759 Speaker 1: about it for. 701 00:38:19,760 --> 00:38:22,120 Speaker 3: Sure, a few things come to mind for me. One 702 00:38:22,239 --> 00:38:24,839 Speaker 3: is, I guess, it depends on what your definition of 703 00:38:24,840 --> 00:38:27,680 Speaker 3: success is. If part of your definition of success is 704 00:38:27,719 --> 00:38:32,760 Speaker 3: to have a packaged up, highly replicable and repeatable keynote 705 00:38:32,760 --> 00:38:34,239 Speaker 3: that you can go on a speaking tour and do 706 00:38:34,320 --> 00:38:38,240 Speaker 3: the same thing and make a heap of money doing, then maybe 707 00:38:38,360 --> 00:38:40,920 Speaker 3: the templated approach is right for you, because we know 708 00:38:41,080 --> 00:38:43,719 Speaker 3: plenty of people that do do that and are very 709 00:38:43,719 --> 00:38:49,200 Speaker 3: successful at that. Versus if it's to, I don't know, 710 00:38:49,880 --> 00:38:54,279 Speaker 3: create change within a room based on where people 711 00:38:54,280 --> 00:38:56,720 Speaker 3: are at and the challenges they have, then 712 00:38:56,960 --> 00:38:59,880 Speaker 3: that will look very different. Like I think back to 713 00:39:00,520 --> 00:39:03,960 Speaker 3: what I ultimately think you two do really well and 714 00:39:04,040 --> 00:39:06,799 Speaker 3: I try and do, which is meet people where they're at. 715 00:39:07,760 --> 00:39:10,399 Speaker 3: And I think the only way to do that well, 716 00:39:10,600 --> 00:39:12,560 Speaker 3: not the only, but one of the best ways to 717 00:39:12,600 --> 00:39:17,040 Speaker 3: do that, is to have permission and flexibility in your 718 00:39:17,080 --> 00:39:18,839 Speaker 3: approach so that you can get a sense of where 719 00:39:18,880 --> 00:39:20,279 Speaker 3: people are at and then go and meet them there.
720 00:39:20,440 --> 00:39:22,120 Speaker 3: If you come in with your pre-written script or 721 00:39:22,320 --> 00:39:25,800 Speaker 3: your absolutely scripted formula that you do not deviate from, 722 00:39:26,160 --> 00:39:30,160 Speaker 3: what I promise you'll find is that somewhere, sometime, there'll 723 00:39:30,160 --> 00:39:33,840 Speaker 3: be an audience that is like, you are completely misreading 724 00:39:33,840 --> 00:39:35,200 Speaker 3: this room and what we need right now. 725 00:39:36,000 --> 00:39:36,560 Speaker 1: And so. 726 00:39:38,400 --> 00:39:40,120 Speaker 3: I guess it's a philosophy that I think about a 727 00:39:40,120 --> 00:39:44,000 Speaker 3: lot, which is: within structure there lies freedom. And so 728 00:39:44,040 --> 00:39:46,239 Speaker 3: I think you can give yourself a structure, which is 729 00:39:46,280 --> 00:39:51,920 Speaker 3: often for me time, or a loose structure of, okay, well, 730 00:39:52,040 --> 00:39:54,640 Speaker 3: using time, we say we've got two hours, I want 731 00:39:54,680 --> 00:39:58,840 Speaker 3: to do roughly four thirty-minute sections. And at the 732 00:39:58,880 --> 00:40:00,799 Speaker 3: end of each of those thirty-minute sections, maybe there'll 733 00:40:00,800 --> 00:40:03,319 Speaker 3: be a breakout or an interaction. Okay, cool. And so 734 00:40:03,960 --> 00:40:06,640 Speaker 3: the ordering of those four might differ. The story I 735 00:40:06,680 --> 00:40:08,839 Speaker 3: tell to articulate the point of the third one might 736 00:40:08,880 --> 00:40:11,399 Speaker 3: differ depending on what's in the room. So I've given 737 00:40:11,400 --> 00:40:14,200 Speaker 3: myself a structure, and I have some clarity in what 738 00:40:14,239 --> 00:40:16,480 Speaker 3: I'm trying to do or what I hope will happen. 739 00:40:17,360 --> 00:40:20,439 Speaker 3: But I guess the freedom is in the how.
It's 740 00:40:20,480 --> 00:40:23,160 Speaker 3: the how you get there, and in my experience, the 741 00:40:23,200 --> 00:40:27,560 Speaker 3: best how is to get a sense of, 742 00:40:27,560 --> 00:40:29,560 Speaker 3: again, where the audience is at and try and 743 00:40:29,560 --> 00:40:32,480 Speaker 3: meet them there. Like ultimately we're talking about empathy. What's 744 00:40:32,520 --> 00:40:34,360 Speaker 3: important to this person in the room or these people 745 00:40:34,360 --> 00:40:36,200 Speaker 3: in the room, and how do I help them get 746 00:40:36,200 --> 00:40:37,879 Speaker 3: from where they are right now to where they want 747 00:40:37,880 --> 00:40:40,840 Speaker 3: to go? I think that's the 748 00:40:42,120 --> 00:40:44,400 Speaker 3: secret sauce of what you two do really well. 749 00:40:44,760 --> 00:40:47,600 Speaker 1: Tiff, how does that sit with you? Like, 750 00:40:47,680 --> 00:40:49,600 Speaker 1: how are you kind of going with 751 00:40:49,640 --> 00:40:55,239 Speaker 1: this freestyle, spontaneous, organic you and the, you know, 752 00:40:55,560 --> 00:40:57,680 Speaker 1: learning this or being part of this kind 753 00:40:57,719 --> 00:40:58,360 Speaker 1: of program? 754 00:40:58,719 --> 00:41:00,960 Speaker 2: I feel like, as I go through it, 755 00:41:00,960 --> 00:41:05,239 Speaker 2: a lot of it feels similar to 756 00:41:05,320 --> 00:41:09,560 Speaker 2: how I already structure and do that loose planning and chunking. 757 00:41:09,960 --> 00:41:14,800 Speaker 2: But then there's all of these other processes of getting 758 00:41:14,840 --> 00:41:18,000 Speaker 2: there that we're implementing.
And then there's a bunch of 759 00:41:18,080 --> 00:41:21,719 Speaker 2: learning that I'm doing, which is what I wanted: 760 00:41:21,760 --> 00:41:24,640 Speaker 2: what might I learn about the stuff I don't even 761 00:41:24,719 --> 00:41:27,880 Speaker 2: know that I don't know yet? So I love that stuff, 762 00:41:28,400 --> 00:41:32,840 Speaker 2: but I am super frustrated by the way of thinking 763 00:41:32,880 --> 00:41:35,600 Speaker 2: that I'm not used to and the places it takes 764 00:41:35,680 --> 00:41:38,080 Speaker 2: my brain that I don't want to have to go, 765 00:41:38,760 --> 00:41:41,440 Speaker 2: and then I want to throw in the bloody towel 766 00:41:41,480 --> 00:41:43,160 Speaker 2: and tell them all to get stuffed and go, I'm 767 00:41:43,160 --> 00:41:43,920 Speaker 2: not doing it your way. 768 00:41:43,920 --> 00:41:44,759 Speaker 1: I'm doing it my way. 769 00:41:44,800 --> 00:41:46,239 Speaker 2: And then I'm like, well, the whole point 770 00:41:46,280 --> 00:41:48,200 Speaker 2: of it was to learn, so just go through the 771 00:41:48,200 --> 00:41:52,640 Speaker 2: process and take what you need. But it's good, it's hard, 772 00:41:52,719 --> 00:41:57,960 Speaker 2: it's frustrating, it's exciting. At times, I get defensive about 773 00:41:58,000 --> 00:41:58,920 Speaker 2: it because I'm like, no. 774 00:41:58,960 --> 00:41:59,520 Speaker 1: I've never done. 775 00:41:59,520 --> 00:42:01,360 Speaker 2: Of course I've done it, and I've never had, you know, 776 00:42:01,400 --> 00:42:04,800 Speaker 2: I've never had a bad experience, and so I shouldn't 777 00:42:04,800 --> 00:42:06,080 Speaker 2: have to do it this way. And it's like, well, 778 00:42:06,120 --> 00:42:07,680 Speaker 2: you chose to do this, so shut up and do 779 00:42:07,680 --> 00:42:08,080 Speaker 2: the course.
780 00:42:08,520 --> 00:42:10,560 Speaker 3: That's such an awesome attitude, by the way, like 781 00:42:10,600 --> 00:42:12,960 Speaker 3: the fact that you're putting yourself in that situation to 782 00:42:13,000 --> 00:42:16,080 Speaker 3: get annoyed when you have every right to keep doing 783 00:42:16,120 --> 00:42:19,440 Speaker 3: what you're doing, because it was working. And yeah, I 784 00:42:19,440 --> 00:42:21,399 Speaker 3: mean, I find that inspiring. I mean, even 785 00:42:21,400 --> 00:42:24,640 Speaker 3: thinking back to my response, like yeah, but have you 786 00:42:24,680 --> 00:42:26,760 Speaker 3: ever, you know, I don't know, gone to a workshop 787 00:42:26,760 --> 00:42:28,759 Speaker 3: where they teach you the best scripted way to write 788 00:42:28,800 --> 00:42:31,479 Speaker 3: a keynote? No, I haven't, because I haven't necessarily thought 789 00:42:31,480 --> 00:42:34,480 Speaker 3: that that's my style. But maybe I'm avoiding learning something 790 00:42:34,480 --> 00:42:35,560 Speaker 3: that I might learn if I did that. 791 00:42:37,000 --> 00:42:39,040 Speaker 1: Yeah, I love being in a room and 792 00:42:39,080 --> 00:42:43,160 Speaker 1: watching speakers that I've never heard speak, like, good at 793 00:42:43,200 --> 00:42:46,960 Speaker 1: what they do, but I don't know their content, 794 00:42:47,000 --> 00:42:49,120 Speaker 1: I don't know their style, I don't know them, or 795 00:42:49,120 --> 00:42:51,200 Speaker 1: maybe I know of them, but I've never heard them. 796 00:42:52,000 --> 00:42:57,279 Speaker 1: And especially, I mean, obviously, more so when they're good. 797 00:42:58,760 --> 00:43:02,440 Speaker 1: Even if somebody struggles or somebody's not landing, you can 798 00:43:02,480 --> 00:43:08,279 Speaker 1: still learn a lot, right? And I just take away 799 00:43:08,360 --> 00:43:10,800 Speaker 1: stuff from that where I go, oh, that was so good.
800 00:43:11,400 --> 00:43:14,880 Speaker 1: I never even thought of talking about that that way, 801 00:43:15,239 --> 00:43:19,640 Speaker 1: or the way that they have a particular skill at this, 802 00:43:19,880 --> 00:43:22,359 Speaker 1: or just the way that they walk to the edge 803 00:43:22,360 --> 00:43:25,200 Speaker 1: of the stage and stand still and look and don't 804 00:43:25,239 --> 00:43:29,880 Speaker 1: say anything for like ten seconds. It's like, she's saying 805 00:43:29,960 --> 00:43:33,200 Speaker 1: nothing and she's saying a million things. How is this possible? 806 00:43:33,600 --> 00:43:36,600 Speaker 1: Is she fucking magic? You know? It's just like, 807 00:43:37,480 --> 00:43:40,040 Speaker 1: I'm like, oh, you would not think that doing 808 00:43:40,120 --> 00:43:43,160 Speaker 1: nothing can be so powerful. But depending on what's going 809 00:43:43,200 --> 00:43:48,080 Speaker 1: on in the moment, sometimes that nothing is exactly what 810 00:43:48,160 --> 00:43:50,640 Speaker 1: the whole room needs, and who would even think of that? 811 00:43:51,280 --> 00:43:53,080 Speaker 3: Yeah, back to that 812 00:43:53,120 --> 00:43:54,560 Speaker 3: idea of, like, how do you meet the audience where 813 00:43:54,560 --> 00:43:56,319 Speaker 3: they're at? The right pause at the right time from 814 00:43:56,360 --> 00:43:59,120 Speaker 3: the right person could be just what they need.
I 815 00:43:59,160 --> 00:44:03,000 Speaker 3: also have that experience of, you know, you might be going, oh, interesting, 816 00:44:03,000 --> 00:44:05,920 Speaker 3: I don't know if I would have used a story 817 00:44:05,960 --> 00:44:08,080 Speaker 3: like that or framed something that way. And then you 818 00:44:08,360 --> 00:44:10,560 Speaker 3: look around and everyone is just on the edge 819 00:44:10,560 --> 00:44:12,319 Speaker 3: of their seat, like loving the way that they told 820 00:44:12,320 --> 00:44:14,680 Speaker 3: that story. And I go, okay, check your biases, Pete, 821 00:44:14,680 --> 00:44:16,560 Speaker 3: maybe you should be telling stories in that way, because 822 00:44:16,560 --> 00:44:19,040 Speaker 3: everyone's loving it. So, like, what are you missing that 823 00:44:19,440 --> 00:44:21,640 Speaker 3: you're blocking out because of your own preferences? 824 00:44:22,280 --> 00:44:26,040 Speaker 1: Yeah, yeah. And I mean, 825 00:44:26,080 --> 00:44:28,440 Speaker 1: this whole talking-to-an-audience thing extrapolates: it could 826 00:44:28,480 --> 00:44:31,000 Speaker 1: be an audience of five hundred at Crown, or it 827 00:44:31,040 --> 00:44:34,319 Speaker 1: could be an audience of one at the dinner table, right, 828 00:44:34,400 --> 00:44:37,080 Speaker 1: it could be one person. It's like, how do I 829 00:44:37,160 --> 00:44:40,799 Speaker 1: talk to my kid about this thing versus my other 830 00:44:40,880 --> 00:44:43,800 Speaker 1: kid, who's real different to this kid? Or how do 831 00:44:43,880 --> 00:44:47,239 Speaker 1: I talk to my mum? Like if I want to 832 00:44:47,280 --> 00:44:52,040 Speaker 1: get something across to my mum versus my dad, it's 833 00:44:52,120 --> 00:44:55,600 Speaker 1: like two different species. It's like a cat and a dog. 834 00:44:56,120 --> 00:44:59,479 Speaker 1: It's like they're so different, you know.
And the way 835 00:44:59,520 --> 00:45:02,080 Speaker 1: that I need to communicate to my dad to create 836 00:45:02,239 --> 00:45:10,640 Speaker 1: connection and, ah, I don't know, any real kind of 837 00:45:10,640 --> 00:45:14,640 Speaker 1: breakthrough is totally different to my mum, 838 00:45:15,120 --> 00:45:17,440 Speaker 1: and they're basically joined at the hip, you know. 839 00:45:18,080 --> 00:45:20,400 Speaker 3: Yeah, I mean, I think of that as practical empathy. 840 00:45:20,440 --> 00:45:23,439 Speaker 3: And sometimes when I'm working with corporates and you start 841 00:45:23,480 --> 00:45:25,160 Speaker 3: talking about, okay, so let's think about how we could 842 00:45:25,160 --> 00:45:26,960 Speaker 3: empathize with the people we're working with or the people 843 00:45:27,000 --> 00:45:29,440 Speaker 3: we're coaching or whatever, you get people that roll their eyes and go, 844 00:45:29,480 --> 00:45:32,440 Speaker 3: oh yeah, empathy, we've all heard about empathy before. But 845 00:45:32,840 --> 00:45:36,000 Speaker 3: I think if you look at it practically, people inherently 846 00:45:36,080 --> 00:45:38,600 Speaker 3: know that you talk to your mum differently than you 847 00:45:38,600 --> 00:45:40,840 Speaker 3: talk to your dad, or you talk to your toddler, 848 00:45:40,840 --> 00:45:43,680 Speaker 3: Pete, far differently than you talk to your wife.
And 849 00:45:43,719 --> 00:45:45,480 Speaker 3: the reason we know that is because we have enough 850 00:45:45,520 --> 00:45:48,080 Speaker 3: empathy to go, oh, I know what that person needs 851 00:45:48,120 --> 00:45:49,759 Speaker 3: right now, and they need me to frame things in 852 00:45:49,800 --> 00:45:51,959 Speaker 3: a certain way or explain it in a certain way 853 00:45:52,440 --> 00:45:55,960 Speaker 3: or use a particular story. And that insight is, to 854 00:45:56,040 --> 00:45:58,399 Speaker 3: your point, that's how you connect with people one on one, 855 00:45:58,400 --> 00:46:00,960 Speaker 3: but it's actually also how you connect with an audience, like, 856 00:46:01,120 --> 00:46:03,040 Speaker 3: what does this room need right now? And how do 857 00:46:03,080 --> 00:46:03,919 Speaker 3: I meet them there? 858 00:46:05,520 --> 00:46:08,440 Speaker 1: And there's a lot of high level guessing in that, 859 00:46:08,600 --> 00:46:13,120 Speaker 1: right? Because, of course, you've got fifty different personalities 860 00:46:13,120 --> 00:46:16,840 Speaker 1: and fifty different backgrounds and brains and minds and experiences 861 00:46:16,880 --> 00:46:21,279 Speaker 1: and windows through which they're observing, judging and interpreting you. 862 00:46:22,000 --> 00:46:24,200 Speaker 1: And as you said, someone's sitting there going, he's a 863 00:46:24,239 --> 00:46:27,359 Speaker 1: fucking dickhead, and someone else is like, where's this guy 864 00:46:27,440 --> 00:46:30,080 Speaker 1: been all my life? He's a genius. I wonder if 865 00:46:30,080 --> 00:46:32,440 Speaker 1: he's written a book? And can I have a photo? 866 00:46:32,520 --> 00:46:34,919 Speaker 1: And can I have a hug?
And I have people 867 00:46:34,960 --> 00:46:36,520 Speaker 1: that want to punch me in the face and other 868 00:46:36,520 --> 00:46:38,400 Speaker 1: people that want to hug me at the same event. 869 00:46:38,480 --> 00:46:42,520 Speaker 1: I'm like, do both at once? One person hugs me 870 00:46:42,560 --> 00:46:42,920 Speaker 1: while the other... 871 00:46:44,920 --> 00:46:48,319 Speaker 3: I think it was Seth Godin who said to me 872 00:46:48,400 --> 00:46:52,479 Speaker 3: one day, when you're doing a talk like that, find 873 00:46:52,520 --> 00:46:56,399 Speaker 3: your allies and speak to them. And like, literally look 874 00:46:56,440 --> 00:46:58,040 Speaker 3: the person in the eye that looks like they're the 875 00:46:58,080 --> 00:47:00,880 Speaker 3: person that wants to hug you most for a little 876 00:47:00,920 --> 00:47:04,040 Speaker 3: longer than you think necessary, engage them in eye contact, 877 00:47:04,080 --> 00:47:06,799 Speaker 3: and just channel your energy to them, and all 878 00:47:06,800 --> 00:47:08,600 Speaker 3: of a sudden, other people will pick up on the 879 00:47:08,600 --> 00:47:11,240 Speaker 3: fact that there's like a really cool, human, authentic connection happening, 880 00:47:11,400 --> 00:47:14,520 Speaker 3: and they'll start to lean into that, rather than kind 881 00:47:14,560 --> 00:47:17,440 Speaker 3: of spraying and praying, where you're just kind of scanning 882 00:47:17,440 --> 00:47:19,279 Speaker 3: the room, the whole one hundred and fifty people at 883 00:47:19,320 --> 00:47:21,279 Speaker 3: a time.
If you look someone in the eye that's 884 00:47:21,360 --> 00:47:23,200 Speaker 3: leaning in and nodding and smiling, and you're like, oh, 885 00:47:23,239 --> 00:47:25,680 Speaker 3: that's one of my allies right there, and you talk 886 00:47:25,719 --> 00:47:28,239 Speaker 3: to them, it's almost like other people pick up on that. 887 00:47:28,320 --> 00:47:31,440 Speaker 3: And I think about it all the time, versus staring 888 00:47:31,480 --> 00:47:33,120 Speaker 3: at the guy that's sitting there with his arms crossed, 889 00:47:33,120 --> 00:47:35,000 Speaker 3: going, look at this, like, what a wanker, 890 00:47:35,040 --> 00:47:36,719 Speaker 3: I can't believe I'm sitting here wasting my time with 891 00:47:36,760 --> 00:47:39,840 Speaker 3: this guy. There's no point channeling any energy towards that person. 892 00:47:40,400 --> 00:47:42,440 Speaker 1: And how good is it when you feel a little 893 00:47:42,480 --> 00:47:45,640 Speaker 1: bit like you're spinning your wheels and, you know, 894 00:47:45,760 --> 00:47:50,680 Speaker 1: your sphincter's just closed a bit, you know, and 895 00:47:51,680 --> 00:47:54,239 Speaker 1: you look out and you see one person's eyes. There's 896 00:47:54,239 --> 00:47:58,400 Speaker 1: always one. One person's like, yeah, that makes sense, yeah, 897 00:47:58,480 --> 00:48:00,680 Speaker 1: and they're nodding, and then I'm like, thank fuck for 898 00:48:00,800 --> 00:48:05,839 Speaker 1: you, exactly, that person. It's just like a bit 899 00:48:05,840 --> 00:48:08,440 Speaker 1: of a godsend, where it's like, oh, my little angel, 900 00:48:08,520 --> 00:48:12,040 Speaker 1: my little self esteem builder in the moment, thank you. 901 00:48:12,560 --> 00:48:14,520 Speaker 1: They're giving me the A. They're giving me the A. 902 00:48:14,719 --> 00:48:17,959 Speaker 1: That's what I'm looking for. And it's not because 903 00:48:17,960 --> 00:48:19,759 Speaker 1: I'm doing a good job.
It's because they're just a 904 00:48:19,800 --> 00:48:24,040 Speaker 1: really nice person, exactly. They feel sorry for me, and 905 00:48:24,080 --> 00:48:26,759 Speaker 1: they're trying to compensate for all the other motherfuckers that 906 00:48:26,800 --> 00:48:29,080 Speaker 1: are glaring at me like I'm a moron. So 907 00:48:29,160 --> 00:48:31,719 Speaker 1: they're overdoing it on the smiling and the nodding. But 908 00:48:31,760 --> 00:48:32,800 Speaker 1: I'm buying in anyway. 909 00:48:33,160 --> 00:48:35,360 Speaker 3: There's a people pleaser in the crowd, and I'm buying 910 00:48:35,400 --> 00:48:36,160 Speaker 3: into what they're putting down. 911 00:48:36,160 --> 00:48:40,279 Speaker 1: I love it. Oh yeah, yeah. And the 912 00:48:40,400 --> 00:48:42,520 Speaker 1: same person that laughs at all your stupid shit. 913 00:48:42,880 --> 00:48:44,920 Speaker 1: Hey mate, we love talking to you. You're a gun. 914 00:48:46,200 --> 00:48:49,759 Speaker 1: Tell people how they can enjoy more of the Pete 915 00:48:49,840 --> 00:48:50,920 Speaker 1: Shepherd experience. 916 00:48:51,560 --> 00:48:54,799 Speaker 3: Human periscope dot com is the website that has all 917 00:48:54,840 --> 00:48:56,680 Speaker 3: the things. If you might want to get in touch 918 00:48:56,719 --> 00:48:57,640 Speaker 3: and reach out at any point, 919 00:48:57,719 --> 00:49:01,600 Speaker 1: I'm happy to chat. Giddy up. Tiff, 920 00:49:02,120 --> 00:49:04,080 Speaker 1: on behalf of Pete and I, we wish you well 921 00:49:04,160 --> 00:49:09,880 Speaker 1: for tomorrow night's good luck outing. We hope you find 922 00:49:09,880 --> 00:49:13,719 Speaker 1: a prospective partner, because we know you've said yourself, the 923 00:49:13,760 --> 00:49:20,040 Speaker 1: clock's ticking, and, you know, no pressure. But at 924 00:49:20,040 --> 00:49:22,600 Speaker 1: this stage, if you're going tomorrow night.
By the way, 925 00:49:22,800 --> 00:49:28,640 Speaker 1: she's not fussy. That's a joke, everyone, don't send 926 00:49:28,680 --> 00:49:30,920 Speaker 1: me an email, and make sure you compliment her on 927 00:49:30,920 --> 00:49:35,879 Speaker 1: the red shoes. She may well be the 928 00:49:35,920 --> 00:49:40,919 Speaker 1: most fussy. So thanks, thank you guys. Thanks, team