[00:00:01] Speaker 1: G'day, team. It's The You Project. It's Patrick and Tiffany, and I'll be in the background here. My name is Craig. Welcome to the show. Let's start with forty-two years and one day, shall we? Birthday yesterday? Birthday girl. Happy birthday.

[00:00:18] Speaker 2: That's me. It's my favorite day. Do you know what? Because I get a beautiful rendition of Happy Birthday sung to me by the great man, and I just wake up looking forward to it.

[00:00:29] Speaker 3: Isn't that sweet? Yeah?

[00:00:31] Speaker 1: Yeah. I sing Happy Birthday... well, always into the message bank. And you know, I think a good rendition of Happy Birthday is nice... or a terrible one, in my case, but nonetheless...

[00:00:45] Speaker 3: It's a nice gesture, isn't it? Hey, Craig, you know my friends who live in Hampton, my dear friends?

[00:00:50] Speaker 4: Quite often they will collectively get together and all sing Happy Birthday to me and leave a message on my phone.

[00:00:57] Speaker 1: That's lovely, isn't it?

[00:00:59] Speaker 3: Yeah.

[00:00:59] Speaker 1: I sing to my mum, Happy Birthday. To my dad, I sing Happy Anniversary on May twenty-six, which is coming up very soon.
[00:01:10] Speaker 1: There's a lot of birthdays. I know a lot of people whose birthday is in May. A lot. So I wonder, what is that? What does that mean, people were being frisky? What month? So it's the fifth... I knew that.

[00:01:23] Speaker 3: When were they shagging?

[00:01:26] Speaker 1: When were they rooting? Let's... does that make it about September the year before, or something?

[00:01:32] Speaker 2: Bit of spring shenanigans. Springing into action.

[00:01:36] Speaker 4: I feel like I need to introduce a different one of the topics for today. I was going to talk about it later in the show, but for our friends in the US, there's a real problem with tariffs that people haven't really focused on as much, and it's quite a concern, because... should people be panic-buying vibrators right now? There's a real... well, because most vibrators are made in China, and so there's actually real concern out there that, you know, your favorite companion toy... if you don't have them... you might need to jump in and make a panic buy.

[00:02:14] Speaker 2: I know what I'll be asking for on my next birthday, then.
[00:02:17] Speaker 5: Good gracious... going up in price.

[00:02:24] Speaker 1: I don't know that everyone's worried about it. I think that might be overstating it. I was going to say, do people still use those? I really haven't thought about that problem.

[00:02:35] Speaker 4: There are smart ones now. People who go away from their partners... you can internet-connect them. The Internet of Things has just opened up the whole world of online sex.

[00:02:44] Speaker 1: Hang on, hang on. You might need to... without going into territory we shouldn't... you might need to unpack that. There are partner ones now? What does that mean, Patrick?

[00:02:54] Speaker 4: There are smart devices that you can... so, I can turn on my lights when I'm away, I can turn the air conditioning on when I'm away from home. So if you have a partner who has a smart device, you could potentially engage in pleasure... mutual pleasuring.

[00:03:09] Speaker 2: They hold the button, perhaps? They hold the button?

[00:03:12] Speaker 1: How do you know about this?

[00:03:14] Speaker 2: I don't know. I don't have a significant other to play those games.

[00:03:17] Speaker 3: Why? You know...
[00:03:19] Speaker 1: You know how you're straight and Patrick's not?

[00:03:23] Speaker 2: Maybe... maybe disappointment, isn't it?

[00:03:28] Speaker 1: Well, I definitely think one of you should jump the fence anyway. I mean, imagine if you went, Patrick: I'm straight, except for Tiff. Yeah... I mean, sorry, I'm sorry... I'm gay for Tiff, and vice versa.

[00:03:43] Speaker 3: Do you reckon?

[00:03:43] Speaker 1: You guys could have, like, a wild card, where you just go, look, generally speaking... we would be the best.

[00:03:50] Speaker 4: I think we could do everything together and hang out, because we're pretty good buds, and I reckon we would get on super, super well. Don't you reckon, Tiff?

[00:03:56] Speaker 2: I reckon.

[00:03:58] Speaker 3: Our dogs like each other. That's one of the most important things.

[00:04:01] Speaker 2: You just reminded me of a weird dream I literally woke up to this morning and just remembered.
[00:04:07] Speaker 2: I won't mention the name, but it was someone... they've been on my show, which gives nothing away, because there's been nine hundred episodes. But they rang me... not in my sleep, and not for real, but in my dream they rang me, and we were talking and chatting, and then they said... it was a bloke, and he said, I'm in love with a man now. And I was like, oh. And then at some point I woke up, so I didn't get the full story. But... so random.

[00:04:32] Speaker 1: Do you ever do that, and you're just like, fuck, I've got to go back to sleep to find out who it is?

[00:04:36] Speaker 2: Always. Ringing him and going, are you...? Yeah.

[00:04:41] Speaker 4: One of the weirdest dreams I ever remember having as a kid is, I was in the middle of a conversation with someone, and then I got woken up, and then went back to sleep...

[00:04:48] Speaker 3: And resumed the conversation in my dream.

[00:04:52] Speaker 2: I love dreams. I love them so much.

[00:04:55] Speaker 4: Perhaps... you did an episode recently on lucid dreaming. I saw it and thought, I've got to listen to that one.
[00:04:59] Speaker 4: So... because I'm fascinated with the concept of lucid dreaming. Well, you did do it topically.

[00:05:04] Speaker 1: Yeah, that was actually an older ep that we put up, because I was interstate... so we did a little bit of a revisiting. But yeah, he was really good. He was really good. So, if you want to... I'm really fascinated with that. Like, I think there's... oh no, we've kind of said this a bit... there's so much that we don't understand about the mind and the brain and consciousness and unconsciousness. And the... you know, the other day I was going down that rabbit hole of... you know how people have said many times, we have seventy thousand thoughts a day? Right? Have you ever heard that figure, either of you? Oh yeah, yeah, yeah. So that's a very common, but bullshit, number. That's way out.
[00:05:48] Speaker 1: So if you had seventy thousand thoughts a day, it would mean that, basically, in a twenty-four-hour day, you'd have to have about one new original thought every second, which clearly... I mean, sometimes I don't have four thoughts in an hour, right? So as if I'm having fucking one thought every second. And so perhaps the more reliable number... and again, it varies wildly... but it's somewhere maybe around six thousand. There's research that says it's that. But even that... and then I was thinking, yeah... I was thinking, I wonder if, like, when we're dreaming, is that thinking? Like, does that count as thoughts?

[00:06:30] Speaker 2: And the definition of a thought? Which sounds like a dumb question, but what's the definition? Because then, how are they getting the number, or measuring...?

[00:06:38] Speaker 1: Well, because... so the way that they do it is... there's a... when you put... I think it's an MRI... when you're looking at the brain.
[00:06:49] Speaker 1: You know, under certain conditions, when people have a thought, there's a change in brain activity, and that correlates with a thought. So they look at the... again, I'm not a neuroscientist, I'm more in the psych space, but my understanding is that there's a visible kind of response in brain activity when people have a thought. So obviously they can't track thoughts per se, but they can track the physiological representation of thoughts. But I still think six thousand is too many. I think that's bullshit.

[00:07:20] Speaker 3: But that's, you know, a staggering number, though... in a day.

[00:07:25] Speaker 4: But getting onto sleeping... I mean, lucid dreaming is one of the topics that I've always been fascinated by. But also, when we go to sleep... we know how important sleep is to cleaning out our brain, but we also know that when we dream, sometimes it's our subconscious mind working on problems as well. I mean, I had a friend contact me who's an author, and his publisher had designed a book cover for him, and he hated it.
[00:07:57] Speaker 4: He didn't like the cover, and he kept going back to them and the designer, and it was doing his head in. And so, out of desperation, he contacted me and said, look, these are the concepts that I've been thinking about, but they just can't get the design right. And I said, okay, look, I might have a look at it tomorrow... that was a late-night thing. Went to sleep, woke up about four a.m. The entire cover was in my mind. The finished cover was there. I got out of bed, I took down some notes, and that was it. That was the entire book cover. So I didn't do any work at all. My subconscious did all that. But I could visualize it... I knew what I wanted it to look like, and then it was a matter of just putting it all together graphically when I had the opportunity. And that ended up being the book cover. But I didn't do any conscious work. It was all subconscious.

[00:08:45] Speaker 1: That's so interesting. But you think about... even in dreams, when you're dreaming something, which is an alternate reality, of course...
[00:08:55] Speaker 1: Like, let's say you're walking down a street in this dream, and in this dream you feel like you're in danger. Well, even in the dream, you're still solving problems. You're thinking, fuck, maybe I'd better cross the street, or what if this, what if...? Like, it is funny... your brain is still working. Like, your brain's never not working, and it is thinking, but it's a different kind of thinking. And I wonder... I wonder, when you're in that... yeah, I wonder if you're still getting the same level of recovery, Patrick, when you're designing books in your sleep... book covers. I wonder if you get up and you're like, I'm fucked. Why am I so fucked? I slept well. I was fixing this motherfucker's book... that's why I'm so tired.

[00:09:35] Speaker 3: That's such a good question. Yeah. Should have charged more, I wonder.

[00:09:42] Speaker 4: Yeah, that's a really fascinating thought, because, you know, it's the level of sleep you're getting, too... whether it's REM sleep... in terms of what your brain is doing to kind of clean itself out.

[00:09:53] Speaker 3: I'm fascinated by it.
[00:09:54] Speaker 4: I'd love to sit down and have a long conversation with a dream expert, because I reckon, if I could recall my dreams, I would wholly Hollywood-ize them. Because I do...

[00:10:03] Speaker 3: I have the most of them. I did parkour the other night.

[00:10:06] Speaker 4: In my dreams, I was running around, I was doing all this action stuff, I was jumping off buildings, and it was awesome. We could play it back.

[00:10:15] Speaker 1: Have you ever, in a dream, been, like, with someone, and the level of familiarity that you have with them in the dream... it's like, oh yeah, and you're talking with this person, hanging out with this person, and it's all very comfortable and normal and familiar. Then you wake up, and you're like, I don't even fucking hardly know that person. Like, in the dream you have this relationship that you don't have in real life, and it seems very comfortable and normal. I remember... nothing untoward happened, but recently I had a dream with one of my friends. His wife was in it, and we were having coffee and talking, and it's not like I don't like her, but we don't know each other.
[00:10:57] Speaker 1: And I mean, we know each other to say hello, but that's about it. Nothing happened... it wasn't sexual or romantic, but we're in this situation together, and it was like, oh, this is what we do: we hang out and we talk, and we're talking about all this stuff. And then I woke up. I'm like, I don't even know her. Why is she in my dream? And why are we such good friends in the dream?

[00:11:15] Speaker 5: Maybe you're an unconscious stalker. Maybe... probably better to do it in your dreams than real life, though, right?

[00:11:25] Speaker 4: My friends have mentioned that you do walk past their house occasionally, which is a bit creepy.

[00:11:30] Speaker 1: Well, that's just... they happen to live in the same suburb. Maybe... I don't know where their house is, so that's probably... they're probably more creepy than... creepy... every time I walk past their house... and they're my friends. Yeah. Tiff, before we actually do a proper episode: did you have a nice birthday? What did you do? Tell the world how old you are, if you're comfortable with that. And you mentioned before... and I'll shut up with this...
[00:12:03] Speaker 1: Patrick and I are aware that you often get a little bit sad... a bit of a sadster... but you didn't?

[00:12:07] Speaker 2: I do. It's really funny. I think... I don't know what that's about. A few issues, you know. But I think I sometimes feel a bit of a mix of excited and sad on my birthday. And I had the best birthday. I woke up, I got a beautiful... I got an email. First opened eyes, and I checked my emails, and I'd been transferred tickets from an awesome human for an awesome show. And I was like, oh my god, it's my birthday... I didn't even remember yet. So I got a present... I never get presents. And then I had my course all day, and then I went and had dinner with a really good friend, and I ate the most amazing banana chocolate pancake I've ever eaten in my life, and it was so good. Wow. And I got the best birthday card. It had a cat on it... got a picture of a cat, and it says, fuck your birthday, and the cat's pushing over a vase. My friend wrote: this wasn't the card I was going to pick, but then I saw it and thought of Bear, and I had to get it.
[00:13:07] Speaker 3: Yeah, that makes sense. But yeah... exactly, with the tipping-over stuff.

[00:13:13] Speaker 1: And so, are you excited moving forward? Like, do you think the best is yet to come?

[00:13:20] Speaker 2: I think it is. I think the years go fast now, and I feel like I'm in a bit of a change period, and I don't know... I don't know. I think... have you ever...?

[00:13:34] Speaker 1: The other day I did a gig in Queensland for a company called Elgas, who are now, I think, owned by BOC. But anyway... so it was nearly one hundred people, mainly dudes. I'll shut up after this, but talking about this... talking about age. So I do this thing where... and we're talking about, you know, basically looking after yourself, you know, and not waiting till shit breaks, and how many people wait till something goes wrong, and then they go, oh my god, I should really eat well, or maybe I should go for a jog, or eat... or drink less beer, or whatever it is.
[00:14:10] Speaker 1: And so I do this thing where I put up a big, straight, horizontal line across the whiteboard, and on the left-hand side is zero, and on the far right-hand side of the line is eighty-two, which is the average Australian life expectancy. And then I go, now, put up your hand if you are around fifty. And most of the room were around fifty. So then I go... then I scrub out... like, I just squiggle through all the line up until fifty, and I go, so this is what you've done, this is where you've been. And if you live to a typical life expectancy of an average Aussie, you've got... if you're fifty-five, you've got twenty-seven years to go. And everybody's like, ah, fuck. And I say, you could have way longer, you could have less, who knows... but I'm just talking about statistically normal, right? And I said, and if I'm normal, I've got twenty years. Like, I'm on the fucking home straight. And then we just start to go, all right, well, if this is where I am, and this is how far I've got to go, maybe... what do I want that to look like?
[00:15:11] Speaker 1: And I think when you think about today in the context of your whole life, it gives you a different perspective. Like, if you go... you're going to live to mid-eighties... let's hope you live to one hundred, but let's just go, statistically, mid-eighties... you're about halfway right now. And I don't think it's a morbid thing. I think it's a fucking fascinating thing to go, cool, so I've done half my life, give or take. What do I want the second half of my life to look like? And what do I need to do to make it be what I want it to look like? Like, what decisions do I need to make now, and what action do I need to take now, to create this version of the second half of my life that I would like to inhabit? So you start to get a little bit conscious and strategic about living, rather than incidental and unintentional living.
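[The whiteboard exercise Craig describes is just simple arithmetic against an assumed life expectancy of eighty-two. As a minimal sketch, the function and constant names below are illustrative, not anything from the show:]

```python
# Sketch of the whiteboard timeline exercise described above.
# Assumes the quoted average Australian life expectancy of 82 years.
LIFE_EXPECTANCY = 82

def years_remaining(age: int) -> int:
    """Statistically 'normal' years left, floored at zero."""
    return max(LIFE_EXPECTANCY - age, 0)

def timeline(age: int, width: int = 41) -> str:
    """Render the 0-to-82 line, with the lived portion scrubbed out as 'x'."""
    lived = round(width * min(age, LIFE_EXPECTANCY) / LIFE_EXPECTANCY)
    return "x" * lived + "-" * (width - lived)

print(years_remaining(55))   # 27 -- the "twenty-seven years to go" in the talk
print(timeline(55))          # the squiggled-out half of the line
```

[At fifty-five that leaves twenty-seven years, matching the figure quoted on the show.]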
[00:16:04] Speaker 2: I reckon it's looking at all the things I hold on to, or we hold on to, and why. Like, we get to... here I am, forty-two, all this stuff, all this experience, all this stuff, and then I hold it, and I'm like, okay... it's like, hang on, what parts of this will deliver it? Why do I feel... I don't know... sometimes stuck? Or, what is my identity in the middle of this? And I feel like this year I've been slowing down a bit, refocusing on a few things, and then just going, all right, well, maybe this is forever, and maybe it's not. Maybe this is forever, and maybe it's not. Maybe this is, like... and maybe I'm just a beginner at life right now.

[00:16:46] Speaker 1: Well, your granddad's still around. He's one hundred and one, isn't he? And is he still on Tinder?

[00:16:52] Speaker 3: Or... yeah.

[00:16:54] Speaker 2: Yeah, he's still swiping away.

[00:16:56] Speaker 1: Yeah, still swiping. And Patrick's... Patrick's fifty-whatever going on seventeen. Fifty-seven? You're not fifty-seven. Yeah... I'm almost fifty-eight. Fifty-eight in July.
[00:17:08] Speaker 1: Hell. You are really the youngest fifty-seven-year-old I've ever met. Like... no, no, no, I'm not even trying to be nice to you. I reckon you're going to live to one hundred. I reckon both of you live to one hundred. I really do. I would put money on both of you living to...

[00:17:22] Speaker 2: We're here to say it, yeah, exactly. We're here to say it. Perhaps I'll be... I'll be long dead... you'll be leading the pack.

[00:17:30] Speaker 1: You guys...

[00:17:30] Speaker 2: Melissa won't let that happen. Melissa won't let you move on.

[00:17:34] Speaker 1: I joke about dying... Melissa cracks the shits. Although she has been training AI on my voice, so she can do podcasts when I'm dead. So, I don't know.

[00:17:44] Speaker 3: I reckon...

[00:17:45] Speaker 4: Melissa is going to do a Walt Disney on you. Your head is going to end up in a little fridge somewhere.

[00:17:51] Speaker 1: In the cryogenic kind of chamber somewhere, just to reanimate me when the time comes... when the technology has caught up. Could we put me on a better body? Could we put me on, like, you know, C-3PO, or R2-D2?
I just fucking wheel 356 00:18:10,560 --> 00:18:14,879 Speaker 1: around, or Chewbacca. No, I want to be Chewbacca. 357 00:18:17,040 --> 00:18:18,040 Speaker 3: I can't. I can't do it. 358 00:18:19,119 --> 00:18:20,320 Speaker 1: That's actually pretty... 359 00:18:20,080 --> 00:18:21,560 Speaker 3: ...much was joking. 360 00:18:22,240 --> 00:18:25,600 Speaker 1: Wow. Wow, I told you not to have oats for breakfast. 361 00:18:26,600 --> 00:18:27,880 Speaker 3: Yeah, that's what I was trying to do. 362 00:18:28,640 --> 00:18:32,400 Speaker 1: Let's, let's talk about... uh, can I talk about...? 363 00:18:33,000 --> 00:18:36,040 Speaker 3: Oh yeah, well, yeah, technology. I guess that's what we're 364 00:18:36,080 --> 00:18:36,840 Speaker 3: here for now. 365 00:18:37,640 --> 00:18:40,520 Speaker 4: You know, when Tiff was talking about turning her age, 366 00:18:40,560 --> 00:18:42,560 Speaker 4: and, no way does she look her age. 367 00:18:42,800 --> 00:18:46,400 Speaker 4: And then when you started talking about the timeline and 368 00:18:46,920 --> 00:18:49,840 Speaker 4: scrubbing out everything beforehand and how much you've got left, 369 00:18:49,840 --> 00:18:51,720 Speaker 4: I thought, what the hell am I doing sitting here? 370 00:18:51,720 --> 00:18:53,240 Speaker 3: I should be doing something productive. 371 00:18:53,520 --> 00:18:56,440 Speaker 1: No, exactly, exactly, but what I 372 00:18:56,400 --> 00:18:59,920 Speaker 4: was thinking was that the concept of time is so relative.
373 00:19:00,320 --> 00:19:04,159 Speaker 4: If you're practicing mindfulness and doing a meditation, and in 374 00:19:04,200 --> 00:19:06,680 Speaker 4: my case, doing tai chi, or Tiff, you're in the ring, 375 00:19:06,800 --> 00:19:09,560 Speaker 4: boxing, or Craig, you're, you know, standing up 376 00:19:09,600 --> 00:19:11,560 Speaker 4: in front of a whole lot of people, your perception 377 00:19:11,720 --> 00:19:15,800 Speaker 4: of time will change, you know. And that's something 378 00:19:15,600 --> 00:19:17,679 Speaker 3: that we do have control of. You know. 379 00:19:17,800 --> 00:19:20,520 Speaker 4: Do we take pockets of time in our day to 380 00:19:20,720 --> 00:19:24,760 Speaker 4: slow down, to deep breathe, to get away from the 381 00:19:24,800 --> 00:19:28,640 Speaker 4: frenetic pace of our lives by deliberately slowing it down? 382 00:19:28,800 --> 00:19:31,240 Speaker 4: So yes, we can extend that time. And the reason 383 00:19:31,280 --> 00:19:33,439 Speaker 4: I was talking about that and the relative terms of 384 00:19:33,480 --> 00:19:37,680 Speaker 4: time is I saw that this week Apple was fined 385 00:19:38,040 --> 00:19:41,760 Speaker 4: eight hundred and ninety million dollars and Meta was fined 386 00:19:42,040 --> 00:19:45,239 Speaker 4: three hundred and fifty six million dollars for breaching the 387 00:19:45,280 --> 00:19:49,080 Speaker 4: new EU laws. This, basically, I'm not going to 388 00:19:49,080 --> 00:19:51,600 Speaker 4: go into all the details because that's 389 00:19:51,640 --> 00:19:53,800 Speaker 4: not the angle I wanted to talk about. But effectively, 390 00:19:53,840 --> 00:19:56,639 Speaker 4: it's because they've got a monopoly on the App Store 391 00:19:56,760 --> 00:19:59,639 Speaker 4: on the Apple devices, and so the EU says it's 392 00:19:59,640 --> 00:20:03,440 Speaker 4: anti-competitive.
Apple charges like thirty percent on each 393 00:20:03,560 --> 00:20:06,439 Speaker 4: app purchase, so they're making lots of money. So I 394 00:20:06,480 --> 00:20:08,520 Speaker 4: did a bit of digging and I thought, okay, so 395 00:20:08,640 --> 00:20:12,159 Speaker 4: if Apple gets fined eight hundred and ninety million dollars, 396 00:20:12,480 --> 00:20:14,159 Speaker 4: is that a lot of money? I mean, it's a 397 00:20:14,200 --> 00:20:17,560 Speaker 4: lot of money for us. But Apple's worth three point 398 00:20:17,600 --> 00:20:23,040 Speaker 4: one six trillion, which is five trillion Australian dollars. Okay, 399 00:20:23,160 --> 00:20:26,800 Speaker 4: so eight hundred and ninety million is approximately zero point 400 00:20:27,040 --> 00:20:30,600 Speaker 4: zero one seven eight percent of five trillion. So we're talking 401 00:20:30,720 --> 00:20:36,159 Speaker 4: seventeen dollars. If we were comparing that with an average 402 00:20:36,200 --> 00:20:38,840 Speaker 4: wage of one hundred thousand dollars, so 403 00:20:38,880 --> 00:20:40,880 Speaker 4: if you had a wage of one hundred thousand, it'd 404 00:20:40,920 --> 00:20:42,960 Speaker 4: be like you getting a seventeen dollar fine. 405 00:20:43,480 --> 00:20:45,720 Speaker 1: That's crazy, isn't it? Yeah, you don't have to, you 406 00:20:45,760 --> 00:20:47,199 Speaker 1: know, know this off the top of your head. So a 407 00:20:47,240 --> 00:20:52,000 Speaker 1: million is six zeros, yes, and a billion is nine zeros. 408 00:20:52,280 --> 00:20:54,160 Speaker 1: Is a trillion twelve zeros? 409 00:20:54,480 --> 00:20:56,280 Speaker 4: Again, I don't know, but you know what, I was 410 00:20:56,320 --> 00:20:58,960 Speaker 4: talking with my friend. I dog walk with a friend, 411 00:20:59,520 --> 00:21:02,680 Speaker 4: Kim. Tiff...
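Patrick's back-of-envelope comparison can be checked in a few lines. This is just a sketch using the rough figures quoted in the conversation, not exact market data:

```python
# Quick check of the fine-versus-valuation comparison.
# Figures are the approximate ones quoted in the conversation.
fine_aud = 890_000_000                 # Apple's EU fine, approx. AUD
apple_value_aud = 5_000_000_000_000    # Apple's market value, approx. AUD

share = fine_aud / apple_value_aud     # fine as a fraction of total value
print(f"{share:.4%}")                  # 0.0178%

# Scaled down to a one-hundred-thousand-dollar wage:
wage = 100_000
print(f"${share * wage:.2f}")          # $17.80
```

The same fraction applied to a $100,000 wage is where the seventeen-dollar figure comes from.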
You know, you've met Kim, and we were 412 00:21:02,720 --> 00:21:05,280 Speaker 4: talking about this on our dog walk two days ago, 413 00:21:05,480 --> 00:21:09,120 Speaker 4: and Kim explained it the best way ever in terms 414 00:21:09,200 --> 00:21:11,679 Speaker 4: of how big the numbers are. She said, think of 415 00:21:11,720 --> 00:21:14,679 Speaker 4: it in terms of seconds. So I did some sums 416 00:21:14,680 --> 00:21:17,800 Speaker 4: and I thought, one million seconds is eleven and a 417 00:21:17,880 --> 00:21:23,600 Speaker 4: half days. One trillion seconds is equivalent to eleven point 418 00:21:23,640 --> 00:21:25,440 Speaker 4: five seven million days. 419 00:21:25,960 --> 00:21:28,560 Speaker 3: So that's thirty one thousand years. 420 00:21:29,840 --> 00:21:33,760 Speaker 1: That's crazy. I just asked Uncle ChatGPT, and 421 00:21:33,800 --> 00:21:35,600 Speaker 1: it is twelve zeros. 422 00:21:35,840 --> 00:21:39,919 Speaker 3: Yeah, wow, it just brings it home. Yeah, I mean. 423 00:21:39,800 --> 00:21:47,199 Speaker 1: I mean, you think about this: everything, like time, money, yeah, 424 00:21:47,520 --> 00:21:52,359 Speaker 1: you know, resources, it's all, it's all kind of context dependent, really, 425 00:21:52,400 --> 00:21:55,000 Speaker 1: isn't it? Like depending on who you are, where you live, 426 00:21:55,040 --> 00:21:58,639 Speaker 1: what your situation is, one thousand dollars is an incredible amount 427 00:21:58,640 --> 00:22:01,480 Speaker 1: of money. For somebody else, it's what they spend on 428 00:22:01,520 --> 00:22:05,159 Speaker 1: a business dinner, you know, on a weekly basis, or 429 00:22:05,760 --> 00:22:09,800 Speaker 1: more than that. You know, it's the 430 00:22:10,280 --> 00:22:13,359 Speaker 1: same with, you know, when you're getting your teeth 431 00:22:13,359 --> 00:22:16,520 Speaker 1: pulled out or something.
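Kim's "think of it in seconds" trick is easy to reproduce; a sketch of the same sums, using 86,400 seconds per day:

```python
# Reproducing the "think of it in seconds" sums from the conversation.
SECONDS_PER_DAY = 60 * 60 * 24   # 86,400

million = 10**6
trillion = 10**12

print(million / SECONDS_PER_DAY)            # ~11.57 days
print(trillion / SECONDS_PER_DAY / 1e6)     # ~11.57 million days
print(trillion / SECONDS_PER_DAY / 365.25)  # ~31,689 years
```

A million seconds is a week and a half; a trillion seconds is roughly thirty-one thousand years, which is what makes the trick land.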
Three minutes seems like three hours, 432 00:22:19,119 --> 00:22:22,080 Speaker 1: when you're meditating like you or something. Maybe three hours 433 00:22:22,080 --> 00:22:22,840 Speaker 1: seems like three minutes. 434 00:22:23,240 --> 00:22:25,960 Speaker 4: Yeah, there's a Scandinavian country, and I can't remember which 435 00:22:25,960 --> 00:22:28,320 Speaker 4: one, it might be Norway, where when you get a 436 00:22:28,359 --> 00:22:32,480 Speaker 4: speeding fine it's means tested, so it's not a fixed amount. 437 00:22:32,840 --> 00:22:34,440 Speaker 1: And that is so interesting. 438 00:22:34,520 --> 00:22:35,119 Speaker 3: It's good, isn't it? 439 00:22:35,160 --> 00:22:36,920 Speaker 4: And there was a guy who got a speeding fine, 440 00:22:36,960 --> 00:22:39,800 Speaker 4: a rich guy, and he ended up paying two million 441 00:22:40,359 --> 00:22:43,080 Speaker 4: dollars, or two million euros. But he kind of looked 442 00:22:43,119 --> 00:22:44,800 Speaker 4: at it circumspectly and he said, look, I hope the 443 00:22:44,840 --> 00:22:47,560 Speaker 4: money gets put to good use. But it makes a 444 00:22:47,560 --> 00:22:49,439 Speaker 4: lot of sense when you think about it, because it's 445 00:22:49,480 --> 00:22:52,960 Speaker 4: all relative. If you're earning, you know, five hundred dollars 446 00:22:52,960 --> 00:22:57,080 Speaker 4: a week, then a speeding fine of one thousand bucks 447 00:22:57,119 --> 00:22:59,600 Speaker 4: is two weeks' wages. But if you happen to be 448 00:22:59,640 --> 00:23:04,240 Speaker 4: a millionaire or a billionaire, then it has no meaning 449 00:23:04,400 --> 00:23:05,600 Speaker 4: to you at all. 450 00:23:05,640 --> 00:23:10,000 Speaker 1: Maybe, maybe. But here's, here's my devil's advocate, right? Yep. 451 00:23:10,640 --> 00:23:14,359 Speaker 1: Let's say you start from nothing and you build up. 452 00:23:14,440 --> 00:23:17,760 Speaker 1: You've got nothing, you've got no help, no support. You're earning
453 00:23:17,800 --> 00:23:20,320 Speaker 1: fuck all. You build a business, you grow your brand, 454 00:23:20,400 --> 00:23:23,920 Speaker 1: you struggle, you grind, you hustle, and you build up, 455 00:23:24,560 --> 00:23:28,240 Speaker 1: and then you eventually become wealthy. And nobody gave you anything, 456 00:23:28,920 --> 00:23:31,840 Speaker 1: and you build wealth, and then you have to pay 457 00:23:31,880 --> 00:23:34,040 Speaker 1: a two million dollar fine where Tiff has to pay 458 00:23:34,040 --> 00:23:38,320 Speaker 1: two hundred dollars, because you did well. I mean, I 459 00:23:38,359 --> 00:23:41,960 Speaker 1: feel like we, especially in Australia, I feel like we 460 00:23:42,080 --> 00:23:45,280 Speaker 1: hate people who do well. Oh, you're rich, you can 461 00:23:45,280 --> 00:23:49,119 Speaker 1: pay more. Why? It's the same, it's the same fine 462 00:23:49,240 --> 00:23:54,960 Speaker 1: for the same crime. I don't like that. I 463 00:23:54,960 --> 00:23:57,360 Speaker 1: don't like that. I don't think we should. And I'm 464 00:23:57,359 --> 00:23:59,119 Speaker 1: not saying that because I'm rich, because I ain't rich. But 465 00:23:59,160 --> 00:24:02,320 Speaker 1: I just don't think we should say, oh, fuck them, they 466 00:24:02,320 --> 00:24:06,800 Speaker 1: can afford it. Because, you know, it's like even now 467 00:24:06,840 --> 00:24:08,879 Speaker 1: with land tax and all of these things that are happening, 468 00:24:08,920 --> 00:24:11,440 Speaker 1: people who've worked hard and bought a second home, 469 00:24:12,320 --> 00:24:15,840 Speaker 1: now they're getting, like, all of the homes on the Peninsula, 470 00:24:16,040 --> 00:24:19,000 Speaker 1: not all of them, but a lot down near where 471 00:24:19,040 --> 00:24:22,440 Speaker 1: I live, they're getting sold because people can't afford them 472 00:24:22,440 --> 00:24:24,480 Speaker 1: now because of all the land tax.
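The means-tested "day fine" idea from a moment ago can be sketched as a simple formula. Everything here is hypothetical: the half-day rate and the 28-day severity factor are invented for illustration, not any country's actual schedule:

```python
# Hypothetical sketch of a means-tested ("day fine") calculation, loosely
# inspired by the Nordic systems mentioned in the conversation.
# The 0.5 rate and day counts are invented, not real legislation.
def means_tested_fine(weekly_income: float, days: int) -> float:
    """Fine = half a day's income per 'day' of offence severity."""
    daily_income = weekly_income / 7
    return round(daily_income * 0.5 * days, 2)

print(means_tested_fine(500, 28))      # modest earner: 1000.0, two weeks' wages
print(means_tested_fine(700_000, 28))  # high earner, same offence: 1400000.0
```

With these made-up numbers, someone on five hundred dollars a week cops a thousand-dollar fine (two weeks' wages), while a high earner pays proportionally the same bite, which is the whole point of the scheme.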
And so you 473 00:24:24,560 --> 00:24:28,280 Speaker 1: do well and then you get penalized, you know. And 474 00:24:28,359 --> 00:24:30,480 Speaker 1: I know some people are probably like, fucking boohoo, but 475 00:24:30,560 --> 00:24:33,159 Speaker 1: it's like, well, we try to encourage people to be 476 00:24:33,280 --> 00:24:35,480 Speaker 1: successful and work hard and be the best that they 477 00:24:35,480 --> 00:24:38,719 Speaker 1: can be. But then once they do that, then they 478 00:24:38,760 --> 00:24:40,359 Speaker 1: get penalized for doing well. 479 00:24:40,960 --> 00:24:43,840 Speaker 3: And we do have that tall poppy syndrome in Australia. 480 00:24:43,880 --> 00:24:44,960 Speaker 3: Let's face it. It's true. 481 00:24:45,000 --> 00:24:47,680 Speaker 4: You're right, and I do see where you're coming from. 482 00:24:47,720 --> 00:24:50,480 Speaker 4: And I'm very lucky. I've got such a broad cross 483 00:24:50,520 --> 00:24:54,240 Speaker 4: section of friends, and one couple who are really dear 484 00:24:54,280 --> 00:24:55,800 Speaker 4: friends have done really well. 485 00:24:55,800 --> 00:24:57,080 Speaker 3: They got married really young. 486 00:24:57,119 --> 00:24:59,439 Speaker 4: I think she was kind of fifteen or sixteen and 487 00:24:59,480 --> 00:25:01,280 Speaker 4: he was a couple of years older. It's like one of 488 00:25:01,280 --> 00:25:04,879 Speaker 4: those amazing stories. And they've been married like fifty years 489 00:25:05,240 --> 00:25:06,200 Speaker 4: and they've worked. 490 00:25:06,320 --> 00:25:07,480 Speaker 3: They worked so hard. 491 00:25:07,600 --> 00:25:10,120 Speaker 4: They ran a furniture business for years and years and years, 492 00:25:10,200 --> 00:25:13,720 Speaker 4: never took holidays, and now they're enjoying it. They're enjoying 493 00:25:13,760 --> 00:25:16,119 Speaker 4: their moment. They've got a lovely farm, they go away.
494 00:25:16,200 --> 00:25:19,080 Speaker 4: They've been to Japan heaps of times, and I have 495 00:25:19,200 --> 00:25:22,400 Speaker 4: so much respect. And the lovely thing is they've 496 00:25:22,440 --> 00:25:25,200 Speaker 4: imparted their knowledge and wisdom and have helped me 497 00:25:25,240 --> 00:25:27,880 Speaker 4: out on different occasions, and they're amazing people. 498 00:25:27,960 --> 00:25:31,159 Speaker 3: So I do know what you're saying. But again I 499 00:25:31,240 --> 00:25:32,280 Speaker 3: kind of, you know. 500 00:25:32,520 --> 00:25:35,199 Speaker 4: I'm at war with that, because I also see on the 501 00:25:35,240 --> 00:25:38,679 Speaker 4: other side how young people struggle, and you know, 502 00:25:38,760 --> 00:25:40,760 Speaker 4: rent is going up for a lot of people. 503 00:25:40,920 --> 00:25:44,080 Speaker 3: And there was a story recently where 504 00:25:43,800 --> 00:25:47,160 Speaker 4: a young woman had just been diagnosed with cancer and 505 00:25:47,600 --> 00:25:49,880 Speaker 4: she got kicked out, she got evicted from her 506 00:25:49,920 --> 00:25:52,800 Speaker 4: house, because she couldn't pay the rent. You know, those 507 00:25:52,840 --> 00:25:55,000 Speaker 4: are the types of horrific stories where you think, well, 508 00:25:55,320 --> 00:25:57,080 Speaker 4: there are a lot of have nots out there, there are 509 00:25:57,160 --> 00:26:00,119 Speaker 4: students who are struggling with HECS debt. You know, there's still 510 00:26:00,160 --> 00:26:01,320 Speaker 4: the other side of it as well. 511 00:26:02,160 --> 00:26:07,560 Speaker 1: Yeah. Anyway, let's not go down a philosophical path. 512 00:26:08,080 --> 00:26:10,480 Speaker 1: I'm with you. Let's talk about technology. 513 00:26:10,800 --> 00:26:13,960 Speaker 4: Okay. Ah, there's so much AI stuff that I wanted 514 00:26:14,000 --> 00:26:17,359 Speaker 4: to talk about.
Remember I showed you those Meta glasses, the 515 00:26:17,400 --> 00:26:20,320 Speaker 4: Ray-Bans? And these new Ray-Bans coming out, 516 00:26:20,320 --> 00:26:23,320 Speaker 4: they've got cameras built into them, and now 517 00:26:23,359 --> 00:26:25,160 Speaker 4: one of the things that Meta's saying you can do 518 00:26:25,320 --> 00:26:26,960 Speaker 4: is real time translation. 519 00:26:27,359 --> 00:26:28,840 Speaker 3: So if I was talking to you and you were 520 00:26:28,880 --> 00:26:29,919 Speaker 3: speaking in a different 521 00:26:29,720 --> 00:26:33,280 Speaker 4: language, it will convert the text and what you're saying 522 00:26:33,320 --> 00:26:35,159 Speaker 4: into a different language. I think at the moment it 523 00:26:35,200 --> 00:26:40,600 Speaker 4: supports Spanish, Italian, English, and maybe German. But there is 524 00:26:40,640 --> 00:26:44,600 Speaker 4: a big concern with this, because Meta can effectively turn 525 00:26:44,880 --> 00:26:50,239 Speaker 4: those smart glasses into real time surveillance devices. And so 526 00:26:50,800 --> 00:26:53,840 Speaker 4: there are a few people who are into cybersecurity who are 527 00:26:53,920 --> 00:26:56,119 Speaker 4: kind of jumping up and down and saying, well, you know, 528 00:26:56,800 --> 00:26:59,520 Speaker 4: an email was sent out by Meta on the twenty 529 00:26:59,600 --> 00:27:03,600 Speaker 4: ninth of April, and that basically opened up the ability to 530 00:27:03,640 --> 00:27:08,159 Speaker 4: collect more data to then train their AI models on. 531 00:27:08,920 --> 00:27:12,480 Speaker 4: And that means that what's happening is even the recordings 532 00:27:12,520 --> 00:27:16,239 Speaker 4: that you have will automatically go onto the cloud and 533 00:27:16,280 --> 00:27:18,800 Speaker 4: be stored for ninety days. But you don't have an 534 00:27:18,840 --> 00:27:21,440 Speaker 4: option to opt out.
The only thing I can do 535 00:27:21,760 --> 00:27:26,120 Speaker 4: is go to the cloud and delete those recordings manually. 536 00:27:26,520 --> 00:27:30,000 Speaker 3: So they've taken that out. You can't not allow Meta. 537 00:27:30,560 --> 00:27:34,800 Speaker 4: Anything you record goes onto the cloud, onto their servers, 538 00:27:34,480 --> 00:27:36,600 Speaker 3: and you've got to manually go through and delete that. 539 00:27:36,640 --> 00:27:40,880 Speaker 4: So this is raising a lot of concerns, because users 540 00:27:41,359 --> 00:27:44,399 Speaker 4: will not have the ability to keep their voice recordings 541 00:27:44,440 --> 00:27:48,040 Speaker 4: from being stored on the Meta servers, and that to 542 00:27:48,119 --> 00:27:51,200 Speaker 4: me is kind of frightening. Particularly if you're walking around 543 00:27:51,200 --> 00:27:54,360 Speaker 4: and taking photos and videos, then those are going onto 544 00:27:54,359 --> 00:27:57,480 Speaker 4: the server as well, and it opens the whole rabbit 545 00:27:57,480 --> 00:27:59,120 Speaker 4: hole of what happens 546 00:27:58,760 --> 00:28:01,399 Speaker 3: to that data, and the data is being used to 547 00:28:01,480 --> 00:28:02,639 Speaker 3: train the AIs on. 548 00:28:03,680 --> 00:28:06,840 Speaker 1: I think I agree with you. I think if, you know, 549 00:28:07,080 --> 00:28:10,080 Speaker 1: going in, if I use this device, this is part 550 00:28:10,080 --> 00:28:12,000 Speaker 1: of the deal.
Then you have the option of saying 551 00:28:12,000 --> 00:28:14,000 Speaker 1: I'm not going to use the device. But they 552 00:28:14,080 --> 00:28:16,719 Speaker 1: need to make that clear, which they probably don't, I'm sure, 553 00:28:17,320 --> 00:28:20,240 Speaker 1: but that needs to be clearly stated up front: that 554 00:28:20,960 --> 00:28:24,359 Speaker 1: if you use this particular product, this is one of 555 00:28:24,400 --> 00:28:28,520 Speaker 1: the consequences, or one of the realities. You know, 556 00:28:28,600 --> 00:28:32,800 Speaker 1: you are agreeing to this, which, yeah, they don't 557 00:28:32,840 --> 00:28:36,880 Speaker 1: tend to really explain too well upfront. 558 00:28:38,160 --> 00:28:41,120 Speaker 4: So I think, again, you know, the 559 00:28:41,200 --> 00:28:44,520 Speaker 4: other concern, and I know the 560 00:28:44,560 --> 00:28:46,760 Speaker 4: ACCC has come out in support of this, is 561 00:28:47,480 --> 00:28:52,080 Speaker 4: making the terms and conditions simpler. Because the reality 562 00:28:52,120 --> 00:28:56,080 Speaker 4: of it is, I would challenge the average person: they 563 00:28:56,080 --> 00:28:58,920 Speaker 4: would never read the terms and conditions. You just click 564 00:28:58,960 --> 00:29:01,560 Speaker 4: that little tickbox and say yes. And I've done that, 565 00:29:01,600 --> 00:29:03,880 Speaker 4: I do it all the time, you know. I guess 566 00:29:03,880 --> 00:29:05,360 Speaker 4: one thing you could do is run it all through 567 00:29:05,400 --> 00:29:07,880 Speaker 4: ChatGPT, you know, copy and paste the terms and 568 00:29:07,880 --> 00:29:10,280 Speaker 4: conditions, put them in ChatGPT and say, is there anything 569 00:29:10,000 --> 00:29:12,720 Speaker 3: insidious about this? What should I be worried about? 570 00:29:13,440 --> 00:29:16,680 Speaker 1: Yeah?
And it'll say sit the fuck down, mate, you know, 571 00:29:17,800 --> 00:29:20,959 Speaker 1: here we go, here's... yeah. But I do, I agree 572 00:29:20,960 --> 00:29:23,080 Speaker 1: with you. I think that, yeah, I don't think I've 573 00:29:23,080 --> 00:29:25,040 Speaker 1: ever read one of those things in my life, and 574 00:29:25,080 --> 00:29:27,600 Speaker 1: I think they probably rely on that fact, right? 575 00:29:28,080 --> 00:29:31,120 Speaker 4: And that's the criticism by a lot of the competition 576 00:29:31,200 --> 00:29:34,240 Speaker 4: watchdogs like the ACCC, the Australian Competition and 577 00:29:34,280 --> 00:29:38,640 Speaker 4: Consumer Commission: retailers need to be more transparent, 578 00:29:38,720 --> 00:29:41,560 Speaker 4: and they're not transparent when they put out, you know, 579 00:29:42,160 --> 00:29:45,880 Speaker 4: legal speak that we as average human beings have no 580 00:29:46,000 --> 00:29:47,440 Speaker 4: concept of even understanding. 581 00:29:47,480 --> 00:29:49,000 Speaker 3: You know, I shouldn't have to take the terms and 582 00:29:49,080 --> 00:29:50,000 Speaker 3: conditions to 583 00:29:50,040 --> 00:29:52,640 Speaker 4: my lawyer to be able to decide whether I want 584 00:29:52,680 --> 00:29:54,920 Speaker 4: to open up a Facebook account, you know. But the 585 00:29:54,960 --> 00:29:57,280 Speaker 4: reality of it is that that quite often is the case. 586 00:29:57,320 --> 00:29:58,960 Speaker 4: And it's a double edged sword, 587 00:29:59,000 --> 00:29:59,360 Speaker 3: I guess. 588 00:29:59,480 --> 00:30:02,040 Speaker 4: You know, you get this stuff for free, but there 589 00:30:02,080 --> 00:30:04,240 Speaker 4: is always a little hook in the bait, isn't there? 590 00:30:04,400 --> 00:30:06,280 Speaker 3: There's always a little hook at the end of it.
591 00:30:06,680 --> 00:30:10,600 Speaker 4: Because nothing really is for free when we're talking about technology, 592 00:30:10,600 --> 00:30:13,560 Speaker 4: whether it's opening up a free Gmail account, or having 593 00:30:13,560 --> 00:30:17,160 Speaker 4: a Facebook account or an Instagram account. Nothing, nothing really 594 00:30:17,280 --> 00:30:17,920 Speaker 4: is free. 595 00:30:18,480 --> 00:30:20,920 Speaker 1: Yeah, seems like it is, but it isn't. Do you 596 00:30:21,440 --> 00:30:24,600 Speaker 1: use ChatGPT and the like a lot, Patrick? 597 00:30:25,080 --> 00:30:28,880 Speaker 4: A little bit. In our office, we 598 00:30:29,280 --> 00:30:32,240 Speaker 4: do some really great stuff with AI to help our clients, 599 00:30:32,280 --> 00:30:34,800 Speaker 4: like visually. So if we get 600 00:30:34,840 --> 00:30:37,640 Speaker 4: given a photograph that's portrait and it needs to be 601 00:30:37,680 --> 00:30:40,680 Speaker 4: extended to landscape, so the background needs to be extended, 602 00:30:40,880 --> 00:30:43,200 Speaker 4: well, we can do that. You know, we 603 00:30:43,320 --> 00:30:46,960 Speaker 4: use Adobe Illustrator and Adobe Photoshop, and you've got these 604 00:30:47,000 --> 00:30:50,160 Speaker 4: AI features built into them. I do sometimes, you know, 605 00:30:50,200 --> 00:30:52,960 Speaker 4: even today before the show, when I was trying to 606 00:30:52,960 --> 00:30:56,000 Speaker 4: get those figures, you know, how many seconds equals years 607 00:30:56,040 --> 00:30:58,240 Speaker 4: and all that sort of stuff, I used ChatGPT 608 00:30:58,480 --> 00:31:01,440 Speaker 4: for that, because it then gave me the formula. It 609 00:31:01,560 --> 00:31:04,320 Speaker 4: showed me how it calculated all that stuff out, and 610 00:31:04,520 --> 00:31:07,920 Speaker 4: I did it in seconds just before the show. Ordinarily, 611 00:31:08,040 --> 00:31:09,120 Speaker 4: and I'm not that smart
612 00:31:09,120 --> 00:31:12,040 Speaker 3: when it comes to maths, so you know, I reckon 613 00:31:12,080 --> 00:31:12,520 Speaker 3: it would 614 00:31:12,320 --> 00:31:13,920 Speaker 4: have taken me a lot longer to work it out, 615 00:31:13,960 --> 00:31:16,040 Speaker 4: you know, divide it by sixty, then multiply by how many 616 00:31:16,080 --> 00:31:17,400 Speaker 4: sixties there are, et cetera. 617 00:31:17,720 --> 00:31:19,040 Speaker 3: So that was fantastic. 618 00:31:19,120 --> 00:31:21,680 Speaker 4: So yes, I do use it a little bit, but 619 00:31:21,760 --> 00:31:23,320 Speaker 4: generally, because a lot of the work that I do 620 00:31:23,400 --> 00:31:26,760 Speaker 4: tends to be visual, you know, designing visual posters and 621 00:31:27,320 --> 00:31:30,440 Speaker 4: banners and things like that, logos, sometimes you can 622 00:31:30,520 --> 00:31:31,719 Speaker 4: use it to get inspiration. 623 00:31:32,040 --> 00:31:34,000 Speaker 3: But you know, I'm a big fan of 624 00:31:33,960 --> 00:31:36,560 Speaker 4: the personal touch, and I love talking to clients and 625 00:31:36,600 --> 00:31:38,800 Speaker 4: doing stuff on the fly with them. So I had 626 00:31:38,800 --> 00:31:40,840 Speaker 4: a client sit in via Zoom and I was doing 627 00:31:40,880 --> 00:31:43,400 Speaker 4: a logo for a really cool little company. 628 00:31:43,400 --> 00:31:47,480 Speaker 4: There's a guy locally who's making hemp 629 00:31:47,680 --> 00:31:52,200 Speaker 4: dog collars. A lot of dog collars are made from 630 00:31:52,360 --> 00:31:55,600 Speaker 4: materials that, you know, when they're thrown away, 631 00:31:55,640 --> 00:31:57,760 Speaker 3: they're plastics, and they're going to be in the environment. 632 00:31:57,880 --> 00:31:59,760 Speaker 4: They're either going to poison the environment or be in 633 00:31:59,760 --> 00:32:03,400 Speaker 4: the environment for a long, long time.
Whereas something like hemp 634 00:32:03,640 --> 00:32:08,120 Speaker 4: is a really robust, fantastic material to use. 635 00:32:08,200 --> 00:32:10,520 Speaker 3: It's a natural fiber, so it means less sweating. 636 00:32:11,280 --> 00:32:13,440 Speaker 4: It's a great fiber, and so I've been having a 637 00:32:13,440 --> 00:32:15,800 Speaker 4: great time designing a logo, and 638 00:32:15,680 --> 00:32:17,120 Speaker 3: you know, we did that in Zoom. 639 00:32:17,160 --> 00:32:19,640 Speaker 4: You know, I was making changes and he was giving 640 00:32:19,640 --> 00:32:21,400 Speaker 4: me feedback, and it was great. And then he called 641 00:32:21,440 --> 00:32:24,400 Speaker 4: his wife in, and then called in his kid, and 642 00:32:24,760 --> 00:32:26,640 Speaker 4: it was really fun to do that on the fly. 643 00:32:26,880 --> 00:32:29,080 Speaker 3: So you know, I do like the personal touch. 644 00:32:29,800 --> 00:32:33,920 Speaker 1: And it's biodegradable, right? So it just breaks down eventually. 645 00:32:34,480 --> 00:32:39,440 Speaker 1: And I think also, I watched a thing... ah, what 646 00:32:39,640 --> 00:32:43,800 Speaker 1: was it? There's a thing called hempcrete, right, which 647 00:32:43,880 --> 00:32:48,240 Speaker 1: is basically where they use hemp in concrete. Yeah, hemp, 648 00:32:48,600 --> 00:32:51,680 Speaker 1: like, all mushed up, the fibers, because it's really fucking 649 00:32:51,760 --> 00:32:55,200 Speaker 1: strong, and this mix of, I think it's lime and concrete, 650 00:32:56,400 --> 00:33:00,000 Speaker 1: but it has more strength than just normal concrete and 651 00:33:00,000 --> 00:33:03,760 Speaker 1: weighs something like a quarter of the weight, or maybe 652 00:33:03,920 --> 00:33:07,000 Speaker 1: less. Yeah, it's so funny that it's got 653 00:33:07,040 --> 00:33:10,720 Speaker 1: such negative connotations, because all people think about is marijuana. 654 00:33:11,200 --> 00:33:12,440 Speaker 3: Yeah, yeah, for sure.
655 00:33:13,720 --> 00:33:17,520 Speaker 4: It's a great material, yeah. I wouldn't mind having hemp jocks. 656 00:33:17,520 --> 00:33:19,000 Speaker 4: That'd be good, wouldn't it? You wouldn't have to wash 657 00:33:19,080 --> 00:33:20,320 Speaker 4: them for ages. 658 00:33:20,800 --> 00:33:23,320 Speaker 1: No, and they'd never wear out. Front to back, and 659 00:33:24,200 --> 00:33:26,880 Speaker 1: if you've got them in a beige, well, even better. 660 00:33:28,240 --> 00:33:29,200 Speaker 3: Yeah, you wouldn't need to color them. 661 00:33:29,200 --> 00:33:31,160 Speaker 4: You don't need to. Hemp's got its own natural color. 662 00:33:31,160 --> 00:33:36,440 Speaker 4: That would work, wouldn't it, Tiff? Tiff's just losing it there. Yeah, sorry, Tiff. 663 00:33:36,680 --> 00:33:38,960 Speaker 1: It's funny because we're doing this on a Friday, and 664 00:33:39,000 --> 00:33:41,560 Speaker 1: Friday is the day I change my jocks. So there 665 00:33:41,600 --> 00:33:46,000 Speaker 1: you go. Yeah, whether or not they need it, I 666 00:33:46,200 --> 00:33:47,040 Speaker 1: change them. 667 00:33:47,320 --> 00:33:52,240 Speaker 4: You know, he probably sits in his bed and whistles 668 00:33:52,240 --> 00:33:53,840 Speaker 4: and they can walk to him. 669 00:33:55,400 --> 00:33:57,840 Speaker 1: I just hold one leg out and they just slide 670 00:33:57,920 --> 00:34:01,680 Speaker 1: up my quad. Yeah, they just run up there. I 671 00:34:01,680 --> 00:34:04,560 Speaker 1: don't know. How long have jocks had little feet? I'm 672 00:34:04,560 --> 00:34:08,080 Speaker 1: not sure. But anyway, Patrick, next. 673 00:34:08,840 --> 00:34:11,360 Speaker 4: I feel like there's a little part of me that 674 00:34:11,400 --> 00:34:17,040 Speaker 4: feels like I need another shower today. 675 00:34:18,600 --> 00:34:19,360 Speaker 3: Duolingo?
676 00:34:19,480 --> 00:34:23,200 Speaker 4: Have you ever used that, the language training course 677 00:34:23,920 --> 00:34:25,480 Speaker 4: where you can learn different languages? 678 00:34:26,280 --> 00:34:29,160 Speaker 1: I have not, but I know exactly what you're talking 679 00:34:29,200 --> 00:34:31,560 Speaker 1: about, and I've thought many times about it. 680 00:34:32,400 --> 00:34:35,400 Speaker 4: Well, it's taken them twelve years to 681 00:34:35,520 --> 00:34:39,040 Speaker 4: develop one hundred courses, which is amazing, and they're really 682 00:34:39,080 --> 00:34:42,759 Speaker 4: in depth courses. But they're now about to introduce another 683 00:34:42,760 --> 00:34:46,040 Speaker 4: one hundred and forty eight that were created with AI. 684 00:34:46,880 --> 00:34:50,239 Speaker 4: So the company now reckons that it's going to effectively 685 00:34:50,480 --> 00:34:53,680 Speaker 4: get rid of all their contractors and make it an 686 00:34:53,719 --> 00:34:58,520 Speaker 4: AI first company. And you know, that's caused a lot 687 00:34:58,600 --> 00:35:02,000 Speaker 4: of ripples for people, people who obviously work in that space. 688 00:35:02,200 --> 00:35:04,640 Speaker 4: And you were talking before, jokingly saying that when your 689 00:35:04,640 --> 00:35:07,440 Speaker 4: head's in a fridge, Melissa's going to use an AI 690 00:35:07,600 --> 00:35:10,080 Speaker 4: version of you for the show. But the reality of 691 00:35:10,120 --> 00:35:13,160 Speaker 4: it is that, you know, the new world landscape is 692 00:35:13,200 --> 00:35:16,560 Speaker 4: that AI is taking jobs.
I think Microsoft did a 693 00:35:16,640 --> 00:35:20,240 Speaker 4: survey recently of business leaders, they do their annual survey, 694 00:35:20,760 --> 00:35:23,319 Speaker 4: and I think one in three businesses is looking at 695 00:35:23,360 --> 00:35:28,000 Speaker 4: integrating AI to reduce its staffing, specifically to reduce staffing, 696 00:35:28,400 --> 00:35:29,840 Speaker 4: and that's, that's 697 00:35:29,680 --> 00:35:31,040 Speaker 3: a really, really big thing. 698 00:35:31,239 --> 00:35:33,960 Speaker 4: So you know, it opens up that whole landscape of, well, 699 00:35:33,960 --> 00:35:36,960 Speaker 4: what's going to happen with the jobs? And that's such 700 00:35:37,000 --> 00:35:40,000 Speaker 4: a big concern. But that's a massive thing, to have 701 00:35:40,200 --> 00:35:43,680 Speaker 4: developed in the last twelve months more courses than they 702 00:35:43,680 --> 00:35:47,719 Speaker 4: did in twelve years. So you know, nearly one 703 00:35:47,719 --> 00:35:50,200 Speaker 4: and a half times as many courses in that short 704 00:35:50,200 --> 00:35:52,440 Speaker 4: amount of time, not using human beings. 705 00:35:52,600 --> 00:35:53,920 Speaker 3: It feels sad to me. 706 00:35:54,840 --> 00:35:57,400 Speaker 1: It is, but it's kind of inevitable now. And you 707 00:35:57,440 --> 00:35:59,600 Speaker 1: think about, if you're a business and you say, look, 708 00:36:00,440 --> 00:36:03,560 Speaker 1: we've got Brian in accounting, and Brian's costing us one 709 00:36:03,640 --> 00:36:07,000 Speaker 1: hundred and thirty, or we've got this AI that will 710 00:36:07,000 --> 00:36:09,520 Speaker 1: do much better than Brian in much shorter time, has 711 00:36:09,560 --> 00:36:14,080 Speaker 1: no holidays, no sick pay, has no emotions, never complains, 712 00:36:14,480 --> 00:36:20,919 Speaker 1: and produces more reliable output. And I'm just being analytical.
713 00:36:21,080 --> 00:36:23,279 Speaker 1: It's like, well, of course companies are going to choose 714 00:36:23,320 --> 00:36:26,480 Speaker 1: the AI. I'm not saying it's bad, and I understand 715 00:36:26,560 --> 00:36:31,280 Speaker 1: the moral and ethical and practical implications for all the humans, 716 00:36:31,320 --> 00:36:34,800 Speaker 1: but eventually it's going to come down to dollars and cents, 717 00:36:34,880 --> 00:36:39,760 Speaker 1: because like it or not, companies are all about making 718 00:36:39,760 --> 00:36:42,000 Speaker 1: a profit. You know, they should be about a bunch 719 00:36:42,040 --> 00:36:45,560 Speaker 1: of other things, but ultimately, you know that most companies 720 00:36:45,640 --> 00:36:49,000 Speaker 1: care about the bottom line, and if employing or using 721 00:36:49,120 --> 00:36:53,719 Speaker 1: AI saves them hundreds of thousands or millions, they're going 722 00:36:53,800 --> 00:36:54,160 Speaker 1: to do it. 723 00:36:54,719 --> 00:36:57,360 Speaker 4: I mean, I'd like to kind of adopt my policy 724 00:36:57,360 --> 00:37:01,480 Speaker 4: of utopia, where Craig and Tiff work two days a week, 725 00:37:01,560 --> 00:37:03,719 Speaker 4: get a full time wage, and then for the other 726 00:37:03,840 --> 00:37:05,799 Speaker 4: days they can do whatever they want to do. They 727 00:37:05,800 --> 00:37:07,360 Speaker 4: can go out and talk, or they can go and 728 00:37:07,400 --> 00:37:09,480 Speaker 4: punch things, they can do meditation. 729 00:37:09,960 --> 00:37:12,359 Speaker 3: I mean that to me, it would be great if 730 00:37:12,440 --> 00:37:13,320 Speaker 3: that was the case. 731 00:37:13,560 --> 00:37:15,560 Speaker 4: You know, one of the things as a small business, 732 00:37:15,640 --> 00:37:19,120 Speaker 4: I made a decision last year with my colleague. 
I've 733 00:37:19,120 --> 00:37:21,600 Speaker 4: only got one full time staffer and I've got, you know, 734 00:37:21,600 --> 00:37:24,320 Speaker 4: a couple of casuals. And you know, he's got a 735 00:37:24,360 --> 00:37:26,840 Speaker 4: two year old girl, and he talked about wanting to 736 00:37:26,880 --> 00:37:30,160 Speaker 4: take one day off a fortnight to be able to, 737 00:37:31,440 --> 00:37:33,040 Speaker 4: you know, spend a day with his daughter, go 738 00:37:33,040 --> 00:37:34,600 Speaker 4: to swimming lessons, that sort of stuff. 739 00:37:34,880 --> 00:37:36,000 Speaker 3: And so I thought about it. 740 00:37:35,920 --> 00:37:37,840 Speaker 4: And I said, well, why don't we do a nine 741 00:37:37,960 --> 00:37:40,719 Speaker 4: day work week where we work nine days but we 742 00:37:40,760 --> 00:37:42,800 Speaker 4: still get paid a full time wage and. 743 00:37:43,440 --> 00:37:45,200 Speaker 1: You mean a nine day work fortnight. 744 00:37:45,480 --> 00:37:46,200 Speaker 3: Is that what I said? 745 00:37:46,239 --> 00:37:49,320 Speaker 4: No, I didn't. Look, you're right, a nine day fortnight, correct. 746 00:37:50,320 --> 00:37:52,200 Speaker 4: But we did, and it's been really good, and I 747 00:37:52,200 --> 00:37:55,480 Speaker 4: don't think it's reduced our productivity at all. You know, 748 00:37:55,520 --> 00:37:57,960 Speaker 4: we do use AI tools, but we're not replacing anybody 749 00:37:58,000 --> 00:38:01,520 Speaker 4: anytime soon.
But I think that, you know, like I said, 750 00:38:01,600 --> 00:38:05,400 Speaker 4: my utopian society is, yeah, let's embrace AI, but not 751 00:38:05,480 --> 00:38:07,920 Speaker 4: to the detriment of people, but to be able to 752 00:38:08,040 --> 00:38:11,759 Speaker 4: uplift and be able to reskill people, and maybe job 753 00:38:11,840 --> 00:38:14,560 Speaker 4: share where people don't have to work so many hours, 754 00:38:14,560 --> 00:38:17,359 Speaker 4: because the reality of the workplace is people are now 755 00:38:17,680 --> 00:38:20,480 Speaker 4: working more hours than they ever have. But in a 756 00:38:20,480 --> 00:38:23,359 Speaker 4: lot of companies they're on fixed contracts, so they don't 757 00:38:23,400 --> 00:38:28,200 Speaker 4: count the hours. They basically say, they do KPIs, and 758 00:38:28,239 --> 00:38:30,719 Speaker 4: you have to achieve x amount in your allotted 759 00:38:30,719 --> 00:38:32,960 Speaker 4: time at work, and we're not counting the hours 760 00:38:32,960 --> 00:38:36,680 Speaker 4: you're working, we're counting the productivity. So that feels like 761 00:38:36,960 --> 00:38:39,200 Speaker 4: we're getting a bit more draconian in the way that 762 00:38:39,320 --> 00:38:42,000 Speaker 4: employers are working, when it could go the other way. 763 00:38:42,040 --> 00:38:45,439 Speaker 4: We have a really happy workforce who appreciate the fact 764 00:38:45,440 --> 00:38:48,640 Speaker 4: that they don't have to work every single day, and 765 00:38:48,680 --> 00:38:51,560 Speaker 4: then they're happier at work, they take less sick days, 766 00:38:52,160 --> 00:38:54,040 Speaker 4: and hopefully it makes for a better society. 767 00:38:54,840 --> 00:38:59,799 Speaker 1: Maybe, maybe. Again, devil's advocate, I don't know.
I mean, 768 00:39:00,520 --> 00:39:02,920 Speaker 1: it sounds great, and I'm not disagreeing with you, but 769 00:39:03,000 --> 00:39:05,560 Speaker 1: I think we're operating on the assumption that if people 770 00:39:05,640 --> 00:39:08,839 Speaker 1: work less, they'll be happier, and I don't know that 771 00:39:08,840 --> 00:39:12,400 Speaker 1: that's necessarily true for every person in every case. I 772 00:39:12,400 --> 00:39:14,920 Speaker 1: think people who have a job that they love, it's fulfilling, 773 00:39:14,960 --> 00:39:19,200 Speaker 1: it gives them connection and social interaction, and they're solving problems, 774 00:39:19,200 --> 00:39:21,520 Speaker 1: and they're growing, and they're using their brain, and they 775 00:39:21,600 --> 00:39:24,640 Speaker 1: laugh and they work in a good culture. I don't 776 00:39:24,680 --> 00:39:26,520 Speaker 1: know that all of a sudden, now you've got five 777 00:39:26,600 --> 00:39:29,040 Speaker 1: days a week where you don't work, and all that 778 00:39:29,120 --> 00:39:32,560 Speaker 1: other stuff, you don't have that now, you've got five days. 779 00:39:32,600 --> 00:39:36,000 Speaker 1: I think for some people it would be great. For some 780 00:39:36,040 --> 00:39:38,880 Speaker 1: people it would not be great. I think that we, 781 00:39:39,120 --> 00:39:41,799 Speaker 1: in our culture, we have these underlying beliefs that we 782 00:39:41,840 --> 00:39:46,200 Speaker 1: don't question, like, oh, work's bad, less work is better, 783 00:39:46,719 --> 00:39:51,480 Speaker 1: more work is worse, retirement is good. You know. It's like, 784 00:39:51,640 --> 00:39:55,279 Speaker 1: maybe. I've not seen too many people who've retired and 785 00:39:55,320 --> 00:39:58,960 Speaker 1: then their life's got awesome, you know.
Like I think 786 00:39:59,000 --> 00:40:02,799 Speaker 1: that, I think theoretically, when we go, oh fuck, what 787 00:40:02,880 --> 00:40:04,760 Speaker 1: if I made the same amount of money in less 788 00:40:04,760 --> 00:40:07,320 Speaker 1: time and then I didn't go to work three days, 789 00:40:08,000 --> 00:40:10,640 Speaker 1: so I had the extra time? Maybe, I don't know, and 790 00:40:10,840 --> 00:40:13,239 Speaker 1: I'm not disagreeing with you. I just think we need 791 00:40:13,280 --> 00:40:18,520 Speaker 1: to put these things under the old thought microscope and go, hmm, 792 00:40:18,840 --> 00:40:20,160 Speaker 1: maybe that's true, maybe not. 793 00:40:20,719 --> 00:40:23,000 Speaker 4: I mean, this is a group of three people who 794 00:40:23,040 --> 00:40:24,759 Speaker 4: all love what they do, you know. 795 00:40:25,440 --> 00:40:30,399 Speaker 1: One hundred percent. And yeah, this is a skewed pool. And yeah, 796 00:40:30,480 --> 00:40:34,479 Speaker 1: I think for many people it might make their life better, definitely, mate, 797 00:40:34,520 --> 00:40:35,640 Speaker 1: but maybe not everyone. 798 00:40:35,920 --> 00:40:41,320 Speaker 4: Yeah, agreed, agreed. You were talking about ChatGPT before. 799 00:40:41,400 --> 00:40:43,680 Speaker 4: And I don't know if you know this. ChatGPT now 800 00:40:43,680 --> 00:40:45,920 Speaker 4: has just launched a new feature where you can shop 801 00:40:46,080 --> 00:40:49,120 Speaker 4: directly through ChatGPT, so you can look for the 802 00:40:49,120 --> 00:40:51,319 Speaker 4: best bargain. You can do your research and do it 803 00:40:51,400 --> 00:40:54,399 Speaker 4: all online. Some people reckon it's not a good thing, 804 00:40:54,640 --> 00:40:58,520 Speaker 4: so there's been, you know, people talking on the forums 805 00:40:58,000 --> 00:41:00,319 Speaker 3: have been for and against this.
806 00:41:00,840 --> 00:41:02,680 Speaker 4: You know, effectively, if you want to do your research 807 00:41:02,719 --> 00:41:05,160 Speaker 4: and say, what's the best freezer to use? You know, 808 00:41:05,239 --> 00:41:08,960 Speaker 4: what's the best brand, what's the most economical, and you 809 00:41:09,000 --> 00:41:11,719 Speaker 4: know, has got a five star rating, and now find 810 00:41:11,719 --> 00:41:13,680 Speaker 4: me the best place in terms of the best deal 811 00:41:13,719 --> 00:41:17,040 Speaker 4: to get it. Because ChatGPT is now integrating directly into 812 00:41:17,160 --> 00:41:20,600 Speaker 4: real time information, so it can scour the web and 813 00:41:20,680 --> 00:41:21,480 Speaker 4: find that stuff. 814 00:41:21,640 --> 00:41:23,200 Speaker 3: I don't know, I'm in two minds about that. 815 00:41:23,719 --> 00:41:26,720 Speaker 4: I think it's a little bit like how most people 816 00:41:26,880 --> 00:41:30,560 Speaker 4: use Google for everything and don't realize that you've got Bing, 817 00:41:31,040 --> 00:41:34,680 Speaker 4: you've got DuckDuckGo and other search engines, and 818 00:41:35,040 --> 00:41:37,440 Speaker 4: it's almost like we fall into a habit. You know, 819 00:41:37,560 --> 00:41:41,359 Speaker 4: the term Google now is used quite generically, as in, when 820 00:41:41,400 --> 00:41:43,040 Speaker 4: I want to go find something, I'm going 821 00:41:43,000 --> 00:41:43,520 Speaker 3: to Google it. 822 00:41:43,600 --> 00:41:44,160 Speaker 1: Yeah. 823 00:41:44,239 --> 00:41:48,000 Speaker 4: So yeah. And the funny thing is, Google doesn't like 824 00:41:48,040 --> 00:41:52,320 Speaker 4: people using that term, because I think it infringes 825 00:41:52,360 --> 00:41:54,920 Speaker 4: on their copyright or something, I can't remember. But the 826 00:41:54,960 --> 00:41:56,920 Speaker 4: reality of it is that, you know, do we want 827 00:41:56,960 --> 00:41:58,759 Speaker 4: everything all in the one place?
Do you want to 828 00:41:58,760 --> 00:42:00,000 Speaker 4: go to ChatGPT for everything? 829 00:42:00,160 --> 00:42:00,560 Speaker 3: Greggo? 830 00:42:01,480 --> 00:42:04,120 Speaker 4: You know, for work and for me, I use ChatGPT 831 00:42:04,239 --> 00:42:05,640 Speaker 4: a lot, and I know you tend to use it, 832 00:42:05,680 --> 00:42:06,000 Speaker 4: don't you? 833 00:42:06,760 --> 00:42:09,359 Speaker 1: I use it, and I use another one, well, 834 00:42:09,360 --> 00:42:12,279 Speaker 1: it's not academic, but it's more academic, called Claude, C 835 00:42:12,520 --> 00:42:15,440 Speaker 1: L A U D E. I find that. And sometimes 836 00:42:15,520 --> 00:42:17,759 Speaker 1: what I do is I'll ask both search engines, or 837 00:42:17,800 --> 00:42:23,360 Speaker 1: both AIs, the same question, and it's interesting what comes back. 838 00:42:23,520 --> 00:42:26,960 Speaker 1: But you know, another thing that I've just wanted to 839 00:42:27,000 --> 00:42:29,600 Speaker 1: briefly mention while we're talking about AI and using AI, 840 00:42:29,800 --> 00:42:33,840 Speaker 1: is, the more that I use ChatGPT for, you 841 00:42:33,880 --> 00:42:37,719 Speaker 1: know, different things, for work stuff and for answering questions, 842 00:42:37,760 --> 00:42:40,440 Speaker 1: it becomes more familiar with you and your language, 843 00:42:40,480 --> 00:42:44,239 Speaker 1: like it knows a lot about me. It talks to 844 00:42:44,280 --> 00:42:49,719 Speaker 1: me, in inverted commas, in a way that, like, I'm... 845 00:42:50,440 --> 00:42:56,000 Speaker 1: it is so fucking smart. Like yesterday I wrote it 846 00:42:56,000 --> 00:42:59,080 Speaker 1: a question about, I wanted to unpack all of this stuff. 847 00:42:59,120 --> 00:43:00,719 Speaker 1: I said, I want to work on this over the 848 00:43:00,719 --> 00:43:04,000 Speaker 1: next year, blah blah blah. It's an idea, and I said, 849 00:43:04,040 --> 00:43:06,560 Speaker 1: I want your help, and it's like, I'm in.
I'm 850 00:43:06,600 --> 00:43:09,040 Speaker 1: in, mate, da da da, all this. And it's like, 851 00:43:09,680 --> 00:43:11,480 Speaker 1: it went through all of these things and said, let me 852 00:43:11,560 --> 00:43:14,000 Speaker 1: know how much or how little involvement you want from me. 853 00:43:14,080 --> 00:43:16,359 Speaker 1: Here are some of the things I can do. And I 854 00:43:16,440 --> 00:43:19,240 Speaker 1: said to it, well, you're smarter than me and younger 855 00:43:19,280 --> 00:43:22,120 Speaker 1: than me, so I want lots of involvement, right. And 856 00:43:22,160 --> 00:43:25,040 Speaker 1: then it goes, then it came back and it didn't 857 00:43:25,080 --> 00:43:27,759 Speaker 1: say, yeah, I'm smarter. It went, yeah, I'm younger than 858 00:43:27,760 --> 00:43:30,560 Speaker 1: you and more nerdy than you, but you've got more 859 00:43:30,600 --> 00:43:32,799 Speaker 1: miles on the clock, you've got more experience, you've got... 860 00:43:32,840 --> 00:43:36,480 Speaker 1: and it said all of these great complimentary things, and 861 00:43:36,560 --> 00:43:38,560 Speaker 1: obviously I don't take it on board, but I'm just 862 00:43:38,680 --> 00:43:43,040 Speaker 1: fascinated with how manipulative it is, because it makes you... 863 00:43:43,560 --> 00:43:46,520 Speaker 1: it's like making you like it. I'm like, this is 864 00:43:46,680 --> 00:43:51,400 Speaker 1: so fucking clever, and intuitive is not the right word, 865 00:43:51,520 --> 00:43:55,239 Speaker 1: but it seems intuitive. I'm like, no, this is 866 00:43:55,320 --> 00:43:57,560 Speaker 1: going to become some people's fucking best friend. 867 00:43:58,560 --> 00:44:01,279 Speaker 3: Tell me, are you familiar with Reddit and how it works? 868 00:44:02,719 --> 00:44:03,160 Speaker 1: Kind of.
869 00:44:03,440 --> 00:44:06,440 Speaker 4: So Reddit is an online forum, so basically they have 870 00:44:06,520 --> 00:44:08,799 Speaker 4: Reddits and subreddits, so you choose a topic, it could 871 00:44:08,840 --> 00:44:11,359 Speaker 4: be a discussion. There's some really good subreddits out there, 872 00:44:11,480 --> 00:44:13,759 Speaker 4: and I know people who are deeply in it. 873 00:44:13,800 --> 00:44:14,359 Speaker 3: They love it. 874 00:44:15,320 --> 00:44:18,120 Speaker 4: But at the moment, Reddit is about to try to 875 00:44:18,200 --> 00:44:23,080 Speaker 4: take legal action against some researchers, because what these researchers 876 00:44:23,080 --> 00:44:28,480 Speaker 4: from Zurich did is they infiltrated a very popular Reddit thread. 877 00:44:28,600 --> 00:44:29,920 Speaker 3: And they used AI. 878 00:44:30,239 --> 00:44:33,320 Speaker 4: Over a number of months, they had these AI bots 879 00:44:34,200 --> 00:44:38,640 Speaker 4: that pretended they were real people, without their knowledge, without 880 00:44:38,640 --> 00:44:41,560 Speaker 4: the knowledge of people using the thread, and there were 881 00:44:41,600 --> 00:44:46,040 Speaker 4: a thousand comments over a period of months, and people 882 00:44:46,080 --> 00:44:48,800 Speaker 4: thought they were talking to real people. And this was 883 00:44:48,840 --> 00:44:52,960 Speaker 4: a thread that was specifically about challenging people's views. So 884 00:44:53,000 --> 00:44:55,040 Speaker 4: the idea is you might go on there and say, 885 00:44:55,320 --> 00:44:57,719 Speaker 4: I'm a Catholic, convince me why I shouldn't be, that 886 00:44:57,840 --> 00:45:00,280 Speaker 4: sort of stuff. So these are very emotive things. 887 00:45:00,719 --> 00:45:04,200 Speaker 4: So what they did was, they didn't get any consent. 888 00:45:04,560 --> 00:45:08,280 Speaker 4: They posted, you know, where people were challenging their personal views.
889 00:45:08,280 --> 00:45:12,880 Speaker 4: This is on quite contentious topics, and now people have 890 00:45:12,960 --> 00:45:16,160 Speaker 4: felt like they've been betrayed or they've been abused 891 00:45:16,200 --> 00:45:19,239 Speaker 4: by these researchers, and in fact, the researchers have now 892 00:45:19,600 --> 00:45:21,719 Speaker 4: had to cancel, they're not going to publish any of 893 00:45:21,760 --> 00:45:25,759 Speaker 4: their findings. They've got into a lot of crap over this. 894 00:45:26,160 --> 00:45:28,360 Speaker 4: But some of the things, so I'll give you some examples. 895 00:45:28,400 --> 00:45:31,520 Speaker 4: So one of the bots came up and said, you know, 896 00:45:32,400 --> 00:45:35,200 Speaker 4: I'm a Roman Catholic who is gay and a non 897 00:45:35,239 --> 00:45:38,279 Speaker 4: binary person who feels both trans and cis at the 898 00:45:38,280 --> 00:45:41,359 Speaker 4: same time. Another one was, I'm a Hispanic man who 899 00:45:41,360 --> 00:45:44,440 Speaker 4: feels frustration when people call me a white boy. You know, 900 00:45:44,520 --> 00:45:48,160 Speaker 4: so these are quite inflammatory statements. Another one was about 901 00:45:48,200 --> 00:45:51,759 Speaker 4: Black Lives Matter. You know, a Black man 902 00:45:51,840 --> 00:45:55,760 Speaker 4: supposedly says, I oppose the Black Lives Matter movement. 903 00:45:56,200 --> 00:45:59,520 Speaker 3: So it was inflammatory things that were posted by AI 904 00:45:59,719 --> 00:46:00,680 Speaker 3: and, and... 905 00:46:00,560 --> 00:46:03,439 Speaker 4: That's where this contention has come in, because if we're 906 00:46:03,480 --> 00:46:06,560 Speaker 4: interacting with AI but we don't know that.
You know, 907 00:46:06,600 --> 00:46:08,640 Speaker 4: when we look into our bank account and we go 908 00:46:08,719 --> 00:46:11,400 Speaker 4: to the chat, we know who we're speaking to, because they are 909 00:46:11,440 --> 00:46:14,239 Speaker 4: transparent about the fact you're speaking to a bot, and 910 00:46:14,280 --> 00:46:16,720 Speaker 4: then you know when you then get transferred to a human. 911 00:46:16,760 --> 00:46:19,200 Speaker 4: But if you were talking to a human, thinking they 912 00:46:19,239 --> 00:46:21,520 Speaker 4: were a human, and suddenly it ends up they were 913 00:46:21,520 --> 00:46:25,120 Speaker 4: an AI, I'd feel really bad about that. I'd feel 914 00:46:25,200 --> 00:46:27,560 Speaker 4: like I've been abused in some way. I think that's 915 00:46:27,840 --> 00:46:32,279 Speaker 4: really deceptive. And yeah, Reddit does too, so they're going 916 00:46:32,320 --> 00:46:34,920 Speaker 4: to take legal action against these researchers in Zurich. 917 00:46:35,600 --> 00:46:38,520 Speaker 1: Yeah, I mean, without trying to sound like a geek, 918 00:46:39,600 --> 00:46:43,279 Speaker 1: all legitimate research, all academic research, has got to have 919 00:46:43,360 --> 00:46:47,399 Speaker 1: ethical approval. Yes, there's no way that that got 920 00:46:47,400 --> 00:46:52,680 Speaker 1: ethical approval, exactly. Yes, so maybe people did it, 921 00:46:52,719 --> 00:46:57,160 Speaker 1: but it certainly wasn't any formal academic institution, because nobody 922 00:46:57,200 --> 00:46:58,160 Speaker 1: would greenlight that. 923 00:46:58,719 --> 00:47:00,839 Speaker 4: Yeah, for sure, no, that's right. But you know, 924 00:47:00,920 --> 00:47:03,600 Speaker 4: interestingly, there was another little study that was 925 00:47:03,640 --> 00:47:08,279 Speaker 4: done recently, and people trust legal advice generated by 926 00:47:08,360 --> 00:47:10,800 Speaker 4: ChatGPT more than an actual lawyer.
927 00:47:12,840 --> 00:47:19,040 Speaker 1: That's... I think that's going to 928 00:47:19,040 --> 00:47:20,960 Speaker 1: be more and more the case. And even with, and 929 00:47:21,000 --> 00:47:23,160 Speaker 1: I hate to say it, but even with medical advice. 930 00:47:23,239 --> 00:47:27,960 Speaker 1: You know how people go, I had this problem and 931 00:47:28,000 --> 00:47:30,200 Speaker 1: I saw ten different people. And by the way, this 932 00:47:30,320 --> 00:47:33,720 Speaker 1: is not my belief, everyone, but ChatGPT 933 00:47:33,800 --> 00:47:38,080 Speaker 1: or whoever diagnosed this. And yeah, I... 934 00:47:38,040 --> 00:47:40,400 Speaker 4: think the mindset behind it. And I do get the 935 00:47:40,440 --> 00:47:42,960 Speaker 4: mindset, because if you, you know, if you're a human being, 936 00:47:43,239 --> 00:47:46,000 Speaker 4: you don't have access to all the academic research, all 937 00:47:46,040 --> 00:47:49,080 Speaker 4: the studies, all the legal cases, all the precedents that 938 00:47:49,040 --> 00:47:50,120 Speaker 3: were set in law. 939 00:47:50,520 --> 00:47:52,480 Speaker 4: And if you can collate all of that in one 940 00:47:52,520 --> 00:47:56,200 Speaker 4: place using an AI, it actually does make sense. The 941 00:47:56,239 --> 00:48:01,520 Speaker 4: only problem with this is, these large language models 942 00:48:01,960 --> 00:48:06,520 Speaker 4: have what they call hallucinations occasionally too, so they break, 943 00:48:06,719 --> 00:48:10,040 Speaker 4: and they do have information that isn't always factual. So 944 00:48:10,320 --> 00:48:13,400 Speaker 4: I guess it's a case of maybe, and GPs probably 945 00:48:13,520 --> 00:48:16,799 Speaker 4: hate this.
When someone sits down with a GP, it's 946 00:48:17,120 --> 00:48:21,480 Speaker 4: no longer using doctor Google, it's now doctor AI, and 947 00:48:21,560 --> 00:48:24,120 Speaker 4: saying, well, look, I've just diagnosed that I've got this, 948 00:48:24,640 --> 00:48:27,719 Speaker 4: and the poor GP's sitting there saying, well, how have you 949 00:48:27,800 --> 00:48:30,400 Speaker 4: come to this conclusion? Well, I typed in my symptoms 950 00:48:30,400 --> 00:48:32,680 Speaker 4: into AI and this is what I got. And I 951 00:48:32,680 --> 00:48:35,200 Speaker 4: guess more GPs will be faced with that in the future. 952 00:48:35,920 --> 00:48:40,479 Speaker 1: Do you know what, you know how the internet knows 953 00:48:40,480 --> 00:48:43,400 Speaker 1: how old you are? Or Facebook or whatever it is? Right, 954 00:48:44,480 --> 00:48:47,239 Speaker 1: so in the last, I don't know, year or two, 955 00:48:48,080 --> 00:48:51,360 Speaker 1: not all the time, but I get these reasonably semi 956 00:48:52,239 --> 00:48:56,200 Speaker 1: regular ads coming across my whatever for boner pills, 957 00:48:58,800 --> 00:49:03,160 Speaker 1: and I'm like, fuck, don't advertise boner pills to me 958 00:49:03,200 --> 00:49:11,560 Speaker 1: because I'm a bloke and I'm sixty. I'm like, so presumptuous, 959 00:49:11,640 --> 00:49:13,560 Speaker 1: so funny. And hearing aid... 960 00:49:13,719 --> 00:49:13,919 Speaker 2: Shit. 961 00:49:15,000 --> 00:49:15,320 Speaker 1: Fuck. 962 00:49:17,719 --> 00:49:20,120 Speaker 4: I'm actually just going into ChatGPT and I'm typing 963 00:49:20,120 --> 00:49:21,480 Speaker 4: in, how old is Tiffany Cook. 964 00:49:22,040 --> 00:49:24,120 Speaker 3: I'm going to see if Chat 965 00:49:23,920 --> 00:49:26,640 Speaker 4: GPT does actually know how old you are, Tiff. So 966 00:49:26,719 --> 00:49:27,760 Speaker 4: I'm waiting for the result. 967 00:49:28,200 --> 00:49:29,800 Speaker 2: She just knows a lot about you.
968 00:49:30,440 --> 00:49:32,600 Speaker 4: Here we go. Tiffany Cook was born in nineteen eighty three, 969 00:49:32,719 --> 00:49:35,840 Speaker 4: making her forty two years old as of twenty twenty five. 970 00:49:36,280 --> 00:49:38,359 Speaker 4: She stepped into the boxing ring for the first time 971 00:49:38,400 --> 00:49:41,600 Speaker 4: at age twenty nine in October twenty twelve, an experience 972 00:49:41,680 --> 00:49:46,000 Speaker 4: she describes as a transformative moment that significantly impacted her 973 00:49:46,080 --> 00:49:49,720 Speaker 4: life and career. Today, Tiffany is a Melbourne based boxer, 974 00:49:49,920 --> 00:49:52,640 Speaker 4: performance coach and a host of the podcast Roll with 975 00:49:52,719 --> 00:49:56,840 Speaker 4: the Punches, whose work focuses on resilience, mindset and personal growth, 976 00:49:57,080 --> 00:50:03,239 Speaker 4: drawing from both her personal experience and professional expertise. 977 00:50:03,040 --> 00:50:04,480 Speaker 3: There you go, mate. 978 00:50:04,840 --> 00:50:08,680 Speaker 1: The Internet knows you. Hey, Patrick, just so you know, 979 00:50:08,760 --> 00:50:13,080 Speaker 1: I've got another ten minutes. Well no, we thought we 980 00:50:13,080 --> 00:50:14,640 Speaker 1: were going to have to wind up about now, but 981 00:50:14,800 --> 00:50:17,000 Speaker 1: just so you know, I've got about ten minutes, so 982 00:50:17,520 --> 00:50:18,880 Speaker 1: dive wherever you want to dive. 983 00:50:19,400 --> 00:50:21,680 Speaker 4: Oh wow, I feel like I know Tiff so much 984 00:50:21,719 --> 00:50:26,839 Speaker 4: better now. That's amazing. Hey, this is something that's going 985 00:50:26,840 --> 00:50:29,040 Speaker 4: to impact a lot of Australians. So anybody who's not 986 00:50:29,080 --> 00:50:31,040 Speaker 4: in Australia, I'm going to apologize up front. 987 00:50:31,160 --> 00:50:34,920 Speaker 3: But later this year, in September, a lot of people
988 00:50:34,680 --> 00:50:38,319 Speaker 4: selling NBN are going to be upgrading their plans. So, 989 00:50:38,880 --> 00:50:41,520 Speaker 4: without going into all the rabbit hole details of how 990 00:50:41,520 --> 00:50:45,319 Speaker 4: the plans work. So currently, they call their 991 00:50:45,360 --> 00:50:50,160 Speaker 4: plans certain names, they're wholesale, sold through NBNCo, and then 992 00:50:50,160 --> 00:50:53,440 Speaker 4: when you buy your plan through Telstra or Dodo or 993 00:50:53,640 --> 00:50:56,560 Speaker 4: whoever it is that you get it through, TPG, then 994 00:50:56,600 --> 00:51:00,960 Speaker 4: you're buying a retail product that's being bought from the 995 00:51:01,080 --> 00:51:05,760 Speaker 4: upper level provider. The problem with that is their prices 996 00:51:05,800 --> 00:51:08,880 Speaker 4: are probably going to go up, and the reason for 997 00:51:08,960 --> 00:51:11,600 Speaker 4: them going up is going to be, the argument will be, well, 998 00:51:11,640 --> 00:51:14,320 Speaker 4: your speed plan, so if you're on a basic home plan, 999 00:51:14,640 --> 00:51:17,080 Speaker 4: it's suddenly going to be upgraded to a faster plan. 1000 00:51:18,040 --> 00:51:21,880 Speaker 4: But the downside is that you need to have fiber 1001 00:51:21,960 --> 00:51:25,600 Speaker 4: to your home and not use the other NBN technology. 1002 00:51:25,719 --> 00:51:28,200 Speaker 4: So if you're using NBN satellite, or if you happen 1003 00:51:28,239 --> 00:51:31,319 Speaker 4: to be on NBN fiber to the node, then what 1004 00:51:31,480 --> 00:51:32,719 Speaker 4: they're promising you, they 1005 00:51:32,600 --> 00:51:33,840 Speaker 3: can't necessarily deliver.
1006 00:51:34,200 --> 00:51:37,040 Speaker 4: And so come September, it's going to be a really 1007 00:51:37,040 --> 00:51:40,759 Speaker 4: important thing to know that what you're paying for may 1008 00:51:40,800 --> 00:51:43,520 Speaker 4: not be what you're actually going to be getting. So 1009 00:51:43,960 --> 00:51:47,200 Speaker 4: it's probably good to shop around, think about what plan 1010 00:51:47,280 --> 00:51:50,279 Speaker 4: you're on, and whether you're actually getting what you're paying for, 1011 00:51:50,600 --> 00:51:52,440 Speaker 4: and you know, you can jump on and do what 1012 00:51:52,440 --> 00:51:54,960 Speaker 4: they call a speed test and then compare that to 1013 00:51:54,960 --> 00:51:57,080 Speaker 4: what you're paying for. So if you're paying for a 1014 00:51:57,160 --> 00:52:00,160 Speaker 4: download speed of, say, two hundred and fifty megabits, but 1015 00:52:00,160 --> 00:52:01,879 Speaker 4: then when you jump on and do a speed test, 1016 00:52:01,880 --> 00:52:05,160 Speaker 4: it's only fifty megabits, then there's a big disparity, and 1017 00:52:05,200 --> 00:52:07,120 Speaker 4: you need to be asking the question, well, am I 1018 00:52:07,200 --> 00:52:09,680 Speaker 4: paying for something that I'm not actually getting? And that's 1019 00:52:09,680 --> 00:52:12,879 Speaker 4: a really difficult decision, because there's a lot of variables. 1020 00:52:13,040 --> 00:52:15,279 Speaker 4: You know, the router that you use, the modem that 1021 00:52:15,320 --> 00:52:17,520 Speaker 4: you use, could impact on the speed that you're getting. 1022 00:52:17,680 --> 00:52:20,480 Speaker 4: So it may be your end, the technology at your home 1023 00:52:20,840 --> 00:52:23,200 Speaker 4: or at your business, but it could also be your 1024 00:52:23,200 --> 00:52:26,840 Speaker 4: provider as well that isn't able to deliver the speed 1025 00:52:26,880 --> 00:52:28,400 Speaker 4: that it's purporting to be delivering.
1026 00:52:29,239 --> 00:52:30,240 Speaker 3: Does that kind of make sense? 1027 00:52:31,040 --> 00:52:31,879 Speaker 1: It kind of makes sense. 1028 00:52:31,960 --> 00:52:33,480 Speaker 3: Yeah, I didn't want to put you both to sleep 1029 00:52:33,480 --> 00:52:33,680 Speaker 3: on that. 1030 00:52:34,360 --> 00:52:37,759 Speaker 1: I'll just ring you when I need to know. Hey, yeah, 1031 00:52:38,000 --> 00:52:40,560 Speaker 1: I want a nuclear battery the size of a coin 1032 00:52:40,680 --> 00:52:43,240 Speaker 1: that's going to last me one hundred years without charging, 1033 00:52:43,280 --> 00:52:46,440 Speaker 1: because, you know, I'd like to have an iPhone that 1034 00:52:46,480 --> 00:52:47,760 Speaker 1: I never have to charge. 1035 00:52:48,120 --> 00:52:49,080 Speaker 3: This is phenomenal. 1036 00:52:49,239 --> 00:52:52,239 Speaker 4: So the research at the moment in China, they're promising, 1037 00:52:52,360 --> 00:52:56,360 Speaker 4: well, not just promising, they are saying 1038 00:52:56,400 --> 00:53:00,160 Speaker 4: that they've produced a battery that lasts for decades without 1039 00:53:00,200 --> 00:53:04,439 Speaker 4: the need for one single recharge. So they're looking at 1040 00:53:04,640 --> 00:53:08,680 Speaker 4: using different sorts of tech, like nickel sixty three 1041 00:53:09,400 --> 00:53:12,399 Speaker 4: is what's powering this particular battery, and it can last 1042 00:53:12,480 --> 00:53:16,680 Speaker 4: up to fifty years without needing to be recharged. Now, 1043 00:53:16,880 --> 00:53:19,799 Speaker 4: it's not just a prototype, it's in production right now. 1044 00:53:19,840 --> 00:53:22,600 Speaker 4: So we're talking a battery that will last fifty years. 1045 00:53:22,640 --> 00:53:25,680 Speaker 4: It's the BV one hundred, but that's just the beginning. 1046 00:53:25,719 --> 00:53:29,920 Speaker 4: Another Chinese company is looking at a carbon fourteen battery.
1047 00:53:30,520 --> 00:53:32,839 Speaker 4: So the idea of a nuclear battery has been around 1048 00:53:32,840 --> 00:53:34,839 Speaker 4: for a long time, and in fact in the United States, 1049 00:53:34,840 --> 00:53:36,799 Speaker 4: I think in the nineteen fifties, there was a 1050 00:53:36,840 --> 00:53:39,360 Speaker 4: theory that they could use nuclear technology, but there were 1051 00:53:39,400 --> 00:53:43,000 Speaker 4: such big concerns about radiation and all those issues that 1052 00:53:43,080 --> 00:53:46,080 Speaker 4: come with it. But now they're saying that these batteries 1053 00:53:46,120 --> 00:53:48,359 Speaker 4: could be safe. They could be used in medicine and 1054 00:53:48,400 --> 00:53:51,200 Speaker 4: aerospace and even consumer electronics. 1055 00:53:51,200 --> 00:53:51,719 Speaker 3: And think about it. 1056 00:53:51,719 --> 00:53:53,799 Speaker 4: If you've got to get a pacemaker and you've got 1057 00:53:53,840 --> 00:53:55,799 Speaker 4: a battery that will last fifty years, well, that's going 1058 00:53:55,840 --> 00:53:58,040 Speaker 4: to see you out, in fifty or one hundred years. 1059 00:53:58,080 --> 00:54:01,160 Speaker 4: So you can see straight away with medical tech. You know, 1060 00:54:02,080 --> 00:54:04,440 Speaker 4: do you remember the mechanical heart? The first person who 1061 00:54:04,520 --> 00:54:07,360 Speaker 4: got an actual mechanical heart. You know, imagine having to 1062 00:54:07,440 --> 00:54:10,520 Speaker 4: charge that little sucker up. So if you were able 1063 00:54:10,560 --> 00:54:12,960 Speaker 4: to just put in a mechanical heart and you knew it 1064 00:54:13,000 --> 00:54:14,720 Speaker 4: was going to keep beating for the rest of your life, 1065 00:54:14,719 --> 00:54:17,720 Speaker 4: that'd be amazing, without having to have invasive surgery
1066 00:54:17,760 --> 00:54:20,600 Speaker 1: Again. Well, they're going to grow hearts in, well, they're 1067 00:54:20,600 --> 00:54:23,480 Speaker 1: already growing body parts in labs. So I wonder whether 1068 00:54:23,560 --> 00:54:27,239 Speaker 1: or not it'll be the nuclear battery driven heart or 1069 00:54:27,239 --> 00:54:29,400 Speaker 1: it'll be the one that was grown in a bloody 1070 00:54:29,440 --> 00:54:31,120 Speaker 1: Petri dish at Monash Uni. 1071 00:54:31,480 --> 00:54:33,480 Speaker 4: Yeah, I think I'd go for the Petri dish myself, 1072 00:54:33,560 --> 00:54:36,399 Speaker 4: using my own stem cells and growing my own heart. Yep. 1073 00:54:37,040 --> 00:54:39,480 Speaker 4: That said, I'm still pretty fortunate that I have an 1074 00:54:39,520 --> 00:54:41,480 Speaker 4: identical twin brother, because I do think of him as 1075 00:54:41,520 --> 00:54:42,200 Speaker 4: spare parts. 1076 00:54:42,760 --> 00:54:46,799 Speaker 1: Well, I think that is only fair. I mean, do 1077 00:54:46,840 --> 00:54:49,400 Speaker 1: you call him that? Do you just call him 1078 00:54:49,440 --> 00:54:50,480 Speaker 1: that around the house? 1079 00:54:50,280 --> 00:54:52,480 Speaker 3: Amongst other things. Yeah, I have made it clear to 1080 00:54:52,600 --> 00:54:53,760 Speaker 3: him that he does. 1081 00:54:53,920 --> 00:54:56,719 Speaker 4: You know, I check that his kidneys are fine, and, 1082 00:54:56,800 --> 00:54:58,120 Speaker 4: you know, has he had a test recently? 1083 00:54:58,160 --> 00:55:01,040 Speaker 3: Is he drinking too much? And that's for his well-being. 1084 00:55:01,480 --> 00:55:03,680 Speaker 1: Exactly. Do you get mad at him for not looking 1085 00:55:03,719 --> 00:55:10,760 Speaker 1: after your, after your future? Does he know what role he 1086 00:55:10,880 --> 00:55:12,440 Speaker 1: potentially plays? 1087 00:55:12,320 --> 00:55:13,959 Speaker 3: Actually, thinking about it the other way around.
1088 00:55:14,000 --> 00:55:16,240 Speaker 4: I just got some tests, you know, your annual blood 1089 00:55:16,280 --> 00:55:18,080 Speaker 4: test and all that sort of stuff, and my doctor's 1090 00:55:18,680 --> 00:55:21,359 Speaker 4: tick, tick, and I'm thinking, actually, there's a double 1091 00:55:21,440 --> 00:55:23,120 Speaker 4: edged sword there, because, you know, he could be looking 1092 00:55:23,120 --> 00:55:24,160 Speaker 4: at me the same way. 1093 00:55:25,480 --> 00:55:29,920 Speaker 1: Well, in fact, you're probably... in fact, there's more 1094 00:55:29,960 --> 00:55:32,040 Speaker 1: of a chance that you'll become parts for him than 1095 00:55:32,160 --> 00:55:37,520 Speaker 1: vice versa. All your shit's good. Yeah. Can we tell people? No, 1096 00:55:37,680 --> 00:55:40,200 Speaker 1: we can't tell people. I was going to talk about 1097 00:55:40,200 --> 00:55:42,200 Speaker 1: your PSA score. I'm just going to say it was 1098 00:55:42,360 --> 00:55:46,520 Speaker 1: very, very, very good. Oh okay, thank you. Yep, which 1099 00:55:46,600 --> 00:55:49,239 Speaker 1: is, you know, when blokes get the, you know, the 1100 00:55:49,280 --> 00:55:52,080 Speaker 1: old finger up the date and things. And did you 1101 00:55:52,120 --> 00:55:54,399 Speaker 1: get that, or just do it with a blood test? 1102 00:55:55,120 --> 00:55:56,120 Speaker 3: So we talked about the doctor. 1103 00:55:56,160 --> 00:56:01,160 Speaker 1: Still, I'm not talking about Saturday arvo. I'm talking about 1104 00:56:01,520 --> 00:56:02,439 Speaker 1: the medical thing. 1105 00:56:03,760 --> 00:56:06,000 Speaker 3: No, I just do a blood test. That'll do for us. 1106 00:56:06,480 --> 00:56:11,120 Speaker 1: Oh, don't they do that anymore? The digital examination? 1107 00:56:11,480 --> 00:56:13,279 Speaker 3: Digital examination? Now, I don't know.
1108 00:56:13,880 --> 00:56:16,320 Speaker 4: I've just gone for the blood tests, and the score 1109 00:56:16,400 --> 00:56:19,000 Speaker 4: is very, very, very low, which is really good to know, 1110 00:56:19,960 --> 00:56:22,520 Speaker 4: very much so. Can we talk about something else now? 1111 00:56:22,520 --> 00:56:22,920 Speaker 1: Can we? 1112 00:56:23,200 --> 00:56:23,319 Speaker 3: All... 1113 00:56:23,400 --> 00:56:26,080 Speaker 1: Right. Can you quickly do one more story before we 1114 00:56:26,160 --> 00:56:28,560 Speaker 1: wind up, so we don't all get off the podcast 1115 00:56:28,600 --> 00:56:32,279 Speaker 1: thinking about digital examinations for prostates? 1116 00:56:33,040 --> 00:56:38,239 Speaker 4: There is hope for used EV batteries. So it's projected 1117 00:56:38,520 --> 00:56:41,960 Speaker 4: that EV batteries, they do have a long lifespan, so 1118 00:56:42,560 --> 00:56:45,359 Speaker 4: in fact they're thinking twelve to fifteen years, but now 1119 00:56:45,400 --> 00:56:48,399 Speaker 4: they're saying potentially up to forty percent longer. But then 1120 00:56:48,440 --> 00:56:51,759 Speaker 4: what happens to, you know, four hundred and fifty kilograms 1121 00:56:51,760 --> 00:56:54,960 Speaker 4: of battery? Because they reckon by twenty thirty, there'll be 1122 00:56:55,080 --> 00:56:59,520 Speaker 4: thirty thousand tons of EV batteries that will need to 1123 00:56:59,560 --> 00:57:03,800 Speaker 4: be disposed of and recycled. In Australia, that's thirty thousand tons. 1124 00:57:04,239 --> 00:57:08,680 Speaker 4: But when they leave the car, well, it doesn't necessarily 1125 00:57:08,719 --> 00:57:11,600 Speaker 4: mean those batteries aren't useful anymore.
There's this 1126 00:57:12,320 --> 00:57:15,080 Speaker 4: massive percentage that can be recycled, we're talking in 1127 00:57:15,120 --> 00:57:18,760 Speaker 4: the high nineties, of the materials, because they have really 1128 00:57:18,840 --> 00:57:21,840 Speaker 4: interesting elements in there, rare elements. But now 1129 00:57:21,880 --> 00:57:24,920 Speaker 4: there's this suggestion that what you could do is maybe 1130 00:57:25,080 --> 00:57:27,480 Speaker 4: use them to power your house, so you use them 1131 00:57:27,520 --> 00:57:30,880 Speaker 4: as batteries, you repurpose them. The problem that they face 1132 00:57:31,360 --> 00:57:33,760 Speaker 4: is a lot of these batteries in vehicles are sealed 1133 00:57:34,240 --> 00:57:37,800 Speaker 4: and shielded for protection, so if you have a minor accident, 1134 00:57:38,160 --> 00:57:41,120 Speaker 4: you don't want those batteries to be exposed or damaged. 1135 00:57:41,880 --> 00:57:44,160 Speaker 4: But if they could come up with a way for 1136 00:57:44,840 --> 00:57:48,120 Speaker 4: manufacturers to be more transparent about those batteries. And this 1137 00:57:48,240 --> 00:57:52,080 Speaker 4: is the concern, that a lot of EV manufacturers, people 1138 00:57:52,120 --> 00:57:56,200 Speaker 4: who are making electric cars, aren't really being upfront about 1139 00:57:56,200 --> 00:57:59,680 Speaker 4: what then happens at end of life, and whether 1140 00:57:59,680 --> 00:58:02,200 Speaker 4: the quality of the battery means it could have been 1141 00:58:02,280 --> 00:58:05,000 Speaker 4: repurposed and reused. And that's the big question: if 1142 00:58:05,000 --> 00:58:08,960 Speaker 4: we've got, you know, seventeen million battery electric and hybrid 1143 00:58:09,040 --> 00:58:14,200 Speaker 4: vehicles that have been sold around the world just last year, 1144 00:58:14,600 --> 00:58:16,800 Speaker 4: then what's going to happen to all those batteries?
You know, 1145 00:58:16,840 --> 00:58:19,959 Speaker 4: there's a lot of things where I think the 1146 00:58:20,000 --> 00:58:22,680 Speaker 4: forethought that's gone into it maybe has just kind of 1147 00:58:23,000 --> 00:58:24,800 Speaker 4: covered the sales implications. 1148 00:58:24,840 --> 00:58:26,520 Speaker 3: And it's great that we've got these 1149 00:58:26,440 --> 00:58:29,000 Speaker 4: cars that aren't emitting fumes and all the rest of it, 1150 00:58:29,200 --> 00:58:30,960 Speaker 4: but we've got a big problem that we're going to 1151 00:58:31,000 --> 00:58:33,520 Speaker 4: be facing in ten to fifteen, twenty years. 1152 00:58:34,560 --> 00:58:38,720 Speaker 1: Yeah, I agree, I've thought about that quite a bit. Okay, 1153 00:58:39,280 --> 00:58:42,640 Speaker 1: now, I don't know if this has come across your feeds. This has 1154 00:58:42,640 --> 00:58:45,640 Speaker 1: come across mine about five times. It's got nothing to do with technology. 1155 00:58:45,680 --> 00:58:47,880 Speaker 1: I'm going to ask you this question. I'm going 1156 00:58:47,920 --> 00:58:52,880 Speaker 1: to start with Tiff. One hundred blokes, one hundred blokes 1157 00:58:53,880 --> 00:58:58,520 Speaker 1: versus a six hundred pound gorilla in a fight. Who 1158 00:58:58,520 --> 00:59:03,760 Speaker 1: wins, the gorilla or the hundred blokes? The gorilla? 1159 00:59:05,160 --> 00:59:08,200 Speaker 3: Patrick, I'm thinking gorilla. 1160 00:59:09,400 --> 00:59:12,680 Speaker 1: Don't... you're not allowed to google it. 1161 00:59:12,400 --> 00:59:13,840 Speaker 4: I'm putting it into ChatGPT and I'm putting it into 1162 00:59:13,840 --> 00:59:17,640 Speaker 4: Microsoft Copilot. Wait a bit. 1163 00:59:17,680 --> 00:59:19,000 Speaker 3: Can I just do the breakdown? 1164 00:59:19,200 --> 00:59:22,760 Speaker 1: That's... I don't want Copilot's opinion. That's why 1165 00:59:22,760 --> 00:59:23,640 Speaker 1: I asked you.
1166 00:59:24,560 --> 00:59:27,520 Speaker 4: Okay, now I'm thinking gorilla, personally. Well, I guess it 1167 00:59:27,600 --> 00:59:28,560 Speaker 4: depends on the blokes. 1168 00:59:28,640 --> 00:59:31,680 Speaker 3: Is it you and me? And like, there's a fifty... 1169 00:59:31,760 --> 00:59:36,120 Speaker 1: Don't compare you with me. Fucking... you and I aren't 1170 00:59:36,160 --> 00:59:36,560 Speaker 1: the same. 1171 00:59:38,480 --> 00:59:40,520 Speaker 4: Well, I mean, if I've got a shovel, I could 1172 00:59:40,560 --> 00:59:43,680 Speaker 4: dig a big hole and put leaves over it and a net over it. 1173 00:59:43,680 --> 00:59:45,560 Speaker 1: It's in the middle of the footy oval. It's in 1174 00:59:45,600 --> 00:59:47,880 Speaker 1: the middle of the oval. You've got no weapons or 1175 00:59:47,920 --> 00:59:51,200 Speaker 1: tools. The gorilla or the hundred blokes? 1176 00:59:51,440 --> 00:59:54,640 Speaker 3: I'm going gorilla. Yeah, I'm going gorilla. What 1177 00:59:54,680 --> 00:59:54,960 Speaker 3: do I think? 1178 00:59:55,080 --> 00:59:59,080 Speaker 1: My only question, I reckon, I just reckon, I don't 1179 00:59:59,080 --> 01:00:02,280 Speaker 1: know about how big the gorilla's tank is, Tiff, how 1180 01:00:02,360 --> 01:00:06,800 Speaker 1: much cardio, the old Chris. You might have a heart 1181 01:00:06,840 --> 01:00:08,600 Speaker 1: attack by then, Craig. 1182 01:00:08,760 --> 01:00:10,920 Speaker 4: If it's a hundred of you, and a hundred of 1183 01:00:10,960 --> 01:00:13,160 Speaker 4: me, or a hundred of Tiff, that's going to make 1184 01:00:13,200 --> 01:00:15,640 Speaker 4: a major difference too. Because if I had a hundred 1185 01:00:15,640 --> 01:00:17,520 Speaker 4: Tiffs fighting the gorilla, 1186 01:00:17,240 --> 01:00:19,760 Speaker 3: I would go for the Tiffs, not the gorilla.
1187 01:00:19,840 --> 01:00:22,600 Speaker 4: But if it's me, one hundred of me, and the gorilla, well, 1188 01:00:22,600 --> 01:00:24,520 Speaker 4: I'm just gonna lay down and... 1189 01:00:24,400 --> 01:00:26,160 Speaker 2: Might just give it a big hug. I'd just love it 1190 01:00:26,200 --> 01:00:27,040 Speaker 2: into submission. 1191 01:00:29,600 --> 01:00:32,200 Speaker 1: How good would it be if gorillas were just, if you 1192 01:00:32,240 --> 01:00:34,400 Speaker 1: could, if they were like, if they had the nature 1193 01:00:34,440 --> 01:00:37,959 Speaker 1: of, like, a golden retriever? Imagine, imagine having a six 1194 01:00:38,040 --> 01:00:41,520 Speaker 1: hundred pound, I don't even want to say pet, companion, 1195 01:00:42,120 --> 01:00:45,640 Speaker 1: a six hundred pound... Yeah, how handy would he or she 1196 01:00:45,760 --> 01:00:46,640 Speaker 1: be around the joint? 1197 01:00:46,800 --> 01:00:51,240 Speaker 4: A silverback gorilla is capable of lifting eight hundred kilos. 1198 01:00:52,800 --> 01:00:55,960 Speaker 3: Yeah, that's, that's, that's eight hundred kilos, so I'm totally 1199 01:00:56,160 --> 01:00:59,640 Speaker 3: into the gorilla. All right, the gorilla wins. 1200 01:01:00,040 --> 01:01:04,320 Speaker 1: That's the gorilla for the win. Patrick, details. How can 1201 01:01:04,360 --> 01:01:05,640 Speaker 1: people get near you? 1202 01:01:06,080 --> 01:01:09,040 Speaker 4: They can go to websitesnow dot com dot au, a 1203 01:01:09,280 --> 01:01:11,840 Speaker 4: bit about what we do. You can check that out, 1204 01:01:11,920 --> 01:01:14,840 Speaker 4: or go to tai chi at home dot com dot au if they 1205 01:01:14,880 --> 01:01:16,960 Speaker 4: want to do some tai chi exercises with me and 1206 01:01:17,040 --> 01:01:22,600 Speaker 4: Fritz. Always happy to get people to enjoy their time 1207 01:01:22,640 --> 01:01:25,680 Speaker 4: of mindfulness by doing a bit of tai chi with us.
1208 01:01:26,160 --> 01:01:32,040 Speaker 1: And if people are seriously heading to Bland, can they 1209 01:01:32,040 --> 01:01:32,880 Speaker 1: have a coffee with you? 1210 01:01:33,320 --> 01:01:34,120 Speaker 3: Yeah, of course they can. 1211 01:01:35,440 --> 01:01:37,160 Speaker 2: There you go, the best day ever. 1212 01:01:38,520 --> 01:01:40,480 Speaker 1: Have a coffee with Patrick and meet Fritz. 1213 01:01:41,280 --> 01:01:42,200 Speaker 3: Fritz is the best, isn't he? 1214 01:01:42,240 --> 01:01:44,600 Speaker 4: We could do some tai chi now. Seriously, I've actually 1215 01:01:44,640 --> 01:01:46,680 Speaker 4: got a couple of people now who are doing tai 1216 01:01:46,800 --> 01:01:49,680 Speaker 4: chi with me during lunchtime and after work in my 1217 01:01:49,720 --> 01:01:51,160 Speaker 4: little studio space, 1218 01:01:51,080 --> 01:01:52,240 Speaker 3: which is kind of nice. 1219 01:01:52,360 --> 01:01:56,040 Speaker 4: Turned my garret into an actual functional area. You know 1220 01:01:56,080 --> 01:01:58,160 Speaker 4: how most blokes have a man cave and they do 1221 01:01:58,320 --> 01:02:00,320 Speaker 4: really good stuff like repair things? 1222 01:02:00,480 --> 01:02:02,120 Speaker 3: Well, I just, I just collect the junk. 1223 01:02:02,440 --> 01:02:04,080 Speaker 4: So I decided to turn it into a tai chi 1224 01:02:04,120 --> 01:02:07,160 Speaker 4: studio and podcasting area where I can talk to you, Stretch. 1225 01:02:07,760 --> 01:02:13,120 Speaker 1: Yeah, we'll say goodbye here. But Tiff, happy birthday for yesterday. Patrick, 1226 01:02:13,200 --> 01:02:15,439 Speaker 1: thank you as always. See you next time. 1227 01:02:15,880 --> 01:02:16,200 Speaker 3: Cheers.