1 00:00:01,040 --> 00:00:04,000 Speaker 1: I'll get a team and see project Patrick's back. Fuck, yeah, 2 00:00:04,040 --> 00:00:06,760 Speaker 1: of course he is. We love him. Tiff's here just 3 00:00:06,800 --> 00:00:13,800 Speaker 1: trying to keep us two alpha males in line, and 4 00:00:13,800 --> 00:00:17,480 Speaker 1: and even I laugh at that. We'll start with the Lady. 5 00:00:19,320 --> 00:00:23,720 Speaker 1: Oh so many things, the Lady, the conversation, Tiff. Yeah, yeah, 6 00:00:23,760 --> 00:00:27,080 Speaker 1: morning Patrick? See what I did there? 7 00:00:27,320 --> 00:00:27,440 Speaker 2: No? 8 00:00:27,920 --> 00:00:32,560 Speaker 1: Just kidding, Hi, Tiff, my favorite day of the week. 9 00:00:33,240 --> 00:00:35,800 Speaker 1: Ah fuck? How good are these shows where we just 10 00:00:35,960 --> 00:00:41,960 Speaker 1: throw caution to the wind, and you know, wind to 11 00:00:42,040 --> 00:00:44,839 Speaker 1: the wind. Now, I will tell you listeners, just like 12 00:00:45,240 --> 00:00:48,080 Speaker 1: thirty seconds ago, Patrick was telling us about the sounds 13 00:00:48,080 --> 00:00:52,720 Speaker 1: that possums make when they're rooting and how unpleasant it is. 14 00:00:53,000 --> 00:00:56,440 Speaker 1: Right now, that's literally So if you've heard that and 15 00:00:56,480 --> 00:01:00,560 Speaker 1: you decide to stick around, I blame you, right, don't 16 00:01:00,560 --> 00:01:03,480 Speaker 1: get precious moving forward. If you know that we were 17 00:01:03,520 --> 00:01:06,920 Speaker 1: just talking about the sounds, the audible kind of experience 18 00:01:06,959 --> 00:01:09,640 Speaker 1: of rooting possums, and you choose to stay here, that's 19 00:01:09,720 --> 00:01:13,399 Speaker 1: on you. Patrick. Just give us a quick snapshot of 20 00:01:13,440 --> 00:01:14,600 Speaker 1: what that sound was. Again. 21 00:01:14,920 --> 00:01:18,399 Speaker 3: I just had a really awful thought.
Let's just make 22 00:01:18,440 --> 00:01:18,959 Speaker 3: this clear. 23 00:01:19,120 --> 00:01:22,399 Speaker 2: When we talk about rooting possums, it's two possums rooting 24 00:01:22,440 --> 00:01:22,880 Speaker 2: each other. 25 00:01:23,640 --> 00:01:26,559 Speaker 1: Oh, I didn't even think of anything else. So that's on. 26 00:01:26,440 --> 00:01:30,440 Speaker 3: You, just the way you phrased it. Then I had this. 27 00:01:30,680 --> 00:01:38,320 Speaker 1: Okay, okay, possums having no one's listening already, don't give 28 00:01:38,480 --> 00:01:42,800 Speaker 1: don't give us a demo. Jeez, how are you? Great? 29 00:01:42,840 --> 00:01:43,080 Speaker 3: Now? 30 00:01:46,400 --> 00:01:49,440 Speaker 1: Well, well that the reason that possums came up is 31 00:01:49,480 --> 00:01:56,040 Speaker 1: because your wolf, your violent wolf, attacked a possum this week. 32 00:01:56,320 --> 00:01:59,640 Speaker 4: She snatched She snatched a possum off the top of 33 00:01:59,720 --> 00:02:01,960 Speaker 4: a fence. And when I took her out for a walk 34 00:02:02,000 --> 00:02:04,040 Speaker 4: one night, I nearly died. I didn't know what I 35 00:02:04,080 --> 00:02:06,240 Speaker 4: was going to do. Fortunately she let it go and 36 00:02:06,360 --> 00:02:08,880 Speaker 4: it just toddled off. But last night I went for 37 00:02:08,880 --> 00:02:10,520 Speaker 4: a walk and I'm pretty sure it was the same 38 00:02:10,600 --> 00:02:13,040 Speaker 4: possum sitting on top of the same fence and 39 00:02:13,160 --> 00:02:14,600 Speaker 4: just sat there looking at us. 40 00:02:15,600 --> 00:02:20,160 Speaker 1: Yeah, the audacity, the audacity.
Well, do you know that 41 00:02:20,320 --> 00:02:23,200 Speaker 1: like whippets, Tiff's dog is a whippet for the three 42 00:02:23,200 --> 00:02:25,680 Speaker 1: of you who don't know that, for the three of 43 00:02:25,680 --> 00:02:30,359 Speaker 1: you in the world who don't know that, but whippets 44 00:02:30,360 --> 00:02:33,079 Speaker 1: are like super athletes. So of course she's going to 45 00:02:33,160 --> 00:02:33,480 Speaker 1: get that. 46 00:02:34,520 --> 00:02:37,359 Speaker 4: Yeah, but if you've seen her ever, try and launch 47 00:02:37,400 --> 00:02:40,000 Speaker 4: at a tree before thinking that she might catch a possum. 48 00:02:40,080 --> 00:02:43,200 Speaker 4: She lands on her own back, so she's not the super. 49 00:02:42,880 --> 00:02:47,640 Speaker 1: Athlete that well. Patrick's dog, Fritz is basically the Humphrey 50 00:02:47,680 --> 00:02:53,560 Speaker 1: B. Bear of the canine world. So so Lerna's got 51 00:02:53,560 --> 00:02:55,640 Speaker 1: her covered. She's mild. 52 00:02:57,360 --> 00:02:58,359 Speaker 3: He can put it on though. 53 00:02:58,720 --> 00:03:01,840 Speaker 2: He's a pretty active dog, but he's more interested in 54 00:03:01,880 --> 00:03:04,600 Speaker 2: peeing on every single tree in the town that I 55 00:03:04,639 --> 00:03:16,320 Speaker 2: live in. It's like, yeah, exactly, Oh God, hi, mag 56 00:03:17,240 --> 00:03:19,360 Speaker 2: the one that we have left. So I just wanted 57 00:03:19,360 --> 00:03:21,400 Speaker 2: to say hello to the one listener we have. 58 00:03:23,400 --> 00:03:25,440 Speaker 1: Should we keep starting the show again like we did 59 00:03:25,560 --> 00:03:25,919 Speaker 1: last time? 60 00:03:26,040 --> 00:03:26,200 Speaker 2: No? 61 00:03:26,320 --> 00:03:31,079 Speaker 1: Please now, now, Patrick, I. 62 00:03:30,040 --> 00:03:32,359 Speaker 3: Can we go back to the start? Sorry?
63 00:03:32,400 --> 00:03:33,800 Speaker 2: I just want to I feel like I need to 64 00:03:33,840 --> 00:03:38,000 Speaker 2: explain that if you've ever heard two possums at two 65 00:03:38,040 --> 00:03:41,040 Speaker 2: am in the morning going for it outside your bedroom window, 66 00:03:41,440 --> 00:03:44,680 Speaker 2: it sounds like World War Three has broken out. Seriously, 67 00:03:44,720 --> 00:03:47,080 Speaker 2: it's the scariest thing if you get woken up by 68 00:03:47,080 --> 00:03:50,360 Speaker 2: two possums going at it. You know what I'm talking about? 69 00:03:52,200 --> 00:03:54,520 Speaker 1: Can I just ask, how do yeah, how do you 70 00:03:54,560 --> 00:03:56,600 Speaker 1: know they're not fighting? Did you go out and go 71 00:03:56,720 --> 00:03:57,520 Speaker 1: let me have a look? 72 00:03:57,600 --> 00:03:57,720 Speaker 2: No? 73 00:03:57,960 --> 00:04:01,360 Speaker 1: No, hang on me? Oh just can you yep. 74 00:04:01,400 --> 00:04:04,320 Speaker 2: Ah, I think because afterwards one of them rolled over 75 00:04:04,360 --> 00:04:05,360 Speaker 2: and had a cigarette. 76 00:04:05,680 --> 00:04:09,120 Speaker 1: All right, well then definitely rooting yeah yeah, and the 77 00:04:09,160 --> 00:04:17,400 Speaker 1: other one, the other one was washing up. Oh fucking hell. 78 00:04:18,120 --> 00:04:20,760 Speaker 1: And that was how the You Project ended. They had 79 00:04:20,800 --> 00:04:23,800 Speaker 1: a good run. They almost got to two thousand episodes. 80 00:04:24,160 --> 00:04:28,239 Speaker 1: Patrick's Yeah. To this day, Craig Harper still blames Patrick. 81 00:04:30,560 --> 00:04:35,120 Speaker 1: Speaking of Patrick Patrick, Yeah, I've started listening to a 82 00:04:35,160 --> 00:04:37,359 Speaker 1: new book. I think I've shared with you and the 83 00:04:37,440 --> 00:04:41,080 Speaker 1: listeners that like my my switching off my brain is 84 00:04:41,160 --> 00:04:45,120 Speaker 1: listening to fiction.
I've never listened to or read fiction 85 00:04:45,440 --> 00:04:48,680 Speaker 1: much in my life. I've never listened to a fiction 86 00:04:48,760 --> 00:04:52,680 Speaker 1: book until about four months ago. And yes, I tend 87 00:04:52,680 --> 00:04:55,440 Speaker 1: to listen more than physically read books these days, so 88 00:04:55,800 --> 00:04:58,440 Speaker 1: sue me. But anyway, I started listening to a book 89 00:04:59,240 --> 00:05:04,800 Speaker 1: that you would love, love. I'm listening, I'm like, ah, fuck, 90 00:05:04,839 --> 00:05:07,080 Speaker 1: this is him, this is a him book. So I've been 91 00:05:07,440 --> 00:05:09,560 Speaker 1: I've been listening to books that are about good guys 92 00:05:09,600 --> 00:05:12,480 Speaker 1: and bad guys and covert operatives and snipers and all 93 00:05:12,520 --> 00:05:15,839 Speaker 1: that shit that you'd be like, yawn. This is called 94 00:05:15,839 --> 00:05:20,280 Speaker 1: the Hail Mary Project. It's all about it's all about space. 95 00:05:20,360 --> 00:05:25,920 Speaker 1: You're sitting in your spaceship there. It's all about this, dude. 96 00:05:26,720 --> 00:05:29,320 Speaker 1: I won't ruin it, but it is. It is so 97 00:05:29,560 --> 00:05:32,080 Speaker 1: you and you would love it. It's called the Hail 98 00:05:32,160 --> 00:05:34,760 Speaker 1: Mary Project. They're making a film out of it next year. 99 00:05:35,640 --> 00:05:37,480 Speaker 1: I'm pretty sure the film ain't going to be as 100 00:05:37,480 --> 00:05:39,800 Speaker 1: good as the book. But I actually put out 101 00:05:39,800 --> 00:05:43,520 Speaker 1: a call, the Hail Mary Project. I put out a 102 00:05:43,520 --> 00:05:44,280 Speaker 1: call on my. 103 00:05:45,400 --> 00:05:47,880 Speaker 2: I can't find it. All I can find is Project 104 00:05:47,880 --> 00:05:48,520 Speaker 2: tail Mary. 105 00:05:49,320 --> 00:05:54,120 Speaker 1: That's it, Project tail Mary. That's it.
Sorry, Ed, that's 106 00:05:54,120 --> 00:05:57,000 Speaker 1: all right for him and I talking over each other. 107 00:05:57,279 --> 00:05:59,040 Speaker 4: Yes, yes, that is happening a lot. 108 00:05:59,720 --> 00:06:06,000 Speaker 1: Yeah, sorry everyone, Patrick, I blame you anyway. It's called 109 00:06:06,040 --> 00:06:09,599 Speaker 1: Project Hail Mary, or as I call it, projectail Mary. 110 00:06:11,320 --> 00:06:13,520 Speaker 1: Patrick would love it, and some of you, I think 111 00:06:13,520 --> 00:06:15,359 Speaker 1: would love it. Now I'll shut up after this. I 112 00:06:15,400 --> 00:06:18,160 Speaker 1: put out a call on my Facebook. I don't know 113 00:06:18,200 --> 00:06:20,719 Speaker 1: why Facebook. It seems to be more suited for stuff 114 00:06:20,760 --> 00:06:23,279 Speaker 1: like this where I went, I need something fiction to 115 00:06:23,320 --> 00:06:26,800 Speaker 1: listen to. I've finished the Gray Man series, the Terminal List. 116 00:06:26,800 --> 00:06:30,200 Speaker 1: I'm getting a little bit yawny and bored, and about 117 00:06:31,000 --> 00:06:33,919 Speaker 1: well one hundred and fifty people sent me ideas, but 118 00:06:34,000 --> 00:06:36,320 Speaker 1: about twenty of them sent the same idea. It was that 119 00:06:36,360 --> 00:06:39,560 Speaker 1: one book that I'd never heard of, so I downloaded it. 120 00:06:39,560 --> 00:06:40,400 Speaker 1: It's really good. 121 00:06:40,680 --> 00:06:42,960 Speaker 3: I'll love a listening Are you are you? 122 00:06:43,000 --> 00:06:44,080 Speaker 1: Are you a listener or no? 123 00:06:44,880 --> 00:06:47,560 Speaker 3: Me totally. I'm a compulsive listener. 124 00:06:48,040 --> 00:06:50,719 Speaker 2: First thing I do when I wake up is start listening 125 00:06:50,720 --> 00:06:52,560 Speaker 2: to my audio book, and the last thing I do 126 00:06:52,640 --> 00:06:54,800 Speaker 2: before I go to bed is switch back on my 127 00:06:54,880 --> 00:06:55,799 Speaker 2: audio book.
128 00:06:56,120 --> 00:06:57,400 Speaker 3: So it's a big. 129 00:06:57,200 --> 00:06:58,720 Speaker 1: part of what I do a lot. 130 00:06:58,880 --> 00:07:01,000 Speaker 2: I go to the gym and I'm listening to my 131 00:07:01,040 --> 00:07:05,000 Speaker 2: audio book, except when I do cardio, then I switch 132 00:07:05,080 --> 00:07:08,840 Speaker 2: over to Linkin Park because I actually row quicker if 133 00:07:08,839 --> 00:07:10,679 Speaker 2: I'm listening to music when I'm rowing. 134 00:07:11,560 --> 00:07:13,040 Speaker 3: Now for you, for. 135 00:07:13,000 --> 00:07:17,080 Speaker 1: You youngsters, Linkin Park's a band with music and everything 136 00:07:17,160 --> 00:07:22,280 Speaker 1: and singers and instrumentalists. What do you does that fire 137 00:07:22,280 --> 00:07:23,120 Speaker 1: you up? Does it? 138 00:07:23,240 --> 00:07:23,800 Speaker 3: Absolutely? 139 00:07:23,800 --> 00:07:25,720 Speaker 2: They just they released a new album because they had 140 00:07:25,760 --> 00:07:28,160 Speaker 2: a really long breakup, it was about ten years after their 141 00:07:28,240 --> 00:07:31,440 Speaker 2: lead singer passed away tragically, and they've got a new 142 00:07:31,480 --> 00:07:33,960 Speaker 2: lead singer and the latest album is amazing. 143 00:07:34,800 --> 00:07:37,760 Speaker 1: Really yeah, all right, what's it called? I'm writing it down. 144 00:07:37,800 --> 00:07:39,080 Speaker 1: I'm going to have a squizz. 145 00:07:39,440 --> 00:07:41,320 Speaker 3: Linkin Park's latest album. I don't know. I've got it 146 00:07:41,360 --> 00:07:42,920 Speaker 3: on vinyl and I can't even remember the name. 147 00:07:43,400 --> 00:07:47,280 Speaker 1: That's all right, I'll find it. Tiff, do you listen 148 00:07:47,320 --> 00:07:47,960 Speaker 1: to books or no? 149 00:07:48,560 --> 00:07:48,760 Speaker 4: Yeah? 150 00:07:48,760 --> 00:07:48,960 Speaker 3: I do.
151 00:07:49,200 --> 00:07:53,720 Speaker 4: Actually, somebody on your thread said To Kill 152 00:07:53,760 --> 00:07:55,600 Speaker 4: a Mockingbird and I was like, oh, might go back 153 00:07:55,600 --> 00:07:57,600 Speaker 4: to an old classic. So I just finished listening to 154 00:07:57,600 --> 00:07:59,560 Speaker 4: that thanks to whoever mentioned. 155 00:07:59,200 --> 00:08:02,080 Speaker 1: That, right? What was it like? 156 00:08:02,760 --> 00:08:04,080 Speaker 3: It was good because. 157 00:08:03,840 --> 00:08:06,800 Speaker 4: I don't listen to I normally don't listen to fiction. 158 00:08:06,880 --> 00:08:13,000 Speaker 4: I normally listen to nonfiction and read fiction right right right, 159 00:08:13,200 --> 00:08:15,320 Speaker 4: But same as you when I go for a walk 160 00:08:15,360 --> 00:08:16,880 Speaker 4: of a night. I thought that'd be nice, and it 161 00:08:16,920 --> 00:08:19,960 Speaker 4: was just a nice I really liked the writing style 162 00:08:20,000 --> 00:08:20,800 Speaker 4: and the language. 163 00:08:21,360 --> 00:08:21,880 Speaker 3: It was good. 164 00:08:22,960 --> 00:08:26,200 Speaker 1: Before we do the actual show, Patrick, your top two 165 00:08:26,200 --> 00:08:31,440 Speaker 1: books, recommendations that might have somewhat broadish appeal 166 00:08:31,680 --> 00:08:31,880 Speaker 1: or not. 167 00:08:32,559 --> 00:08:36,160 Speaker 2: Yeah, The Alchemist is an interesting book. 168 00:08:38,440 --> 00:08:40,320 Speaker 1: Or whatever his name is. You know. 169 00:08:40,400 --> 00:08:43,040 Speaker 2: One of my favorite books. I absolutely love it, The 170 00:08:43,080 --> 00:08:45,840 Speaker 2: Little Prince.
But I think one of my when I 171 00:08:45,920 --> 00:08:49,120 Speaker 2: was at school, one of the most profound readings I 172 00:08:49,160 --> 00:08:53,280 Speaker 2: ever had was a book called I Am David and 173 00:08:53,320 --> 00:08:56,160 Speaker 2: It's about this, And I think I may have been 174 00:08:56,160 --> 00:08:58,400 Speaker 2: in primary school when I read it, so it had 175 00:08:58,400 --> 00:08:59,960 Speaker 2: a lot of impact on me. But it was about 176 00:09:00,120 --> 00:09:04,320 Speaker 2: this kid in a prisoner of war camp who escapes 177 00:09:04,600 --> 00:09:07,320 Speaker 2: and his journey, and his journey is he's looking for 178 00:09:07,360 --> 00:09:11,280 Speaker 2: his mother. It's amazing, really really interesting story. So it's 179 00:09:11,280 --> 00:09:13,560 Speaker 2: called it's very very short, it's only a tiny book. 180 00:09:14,000 --> 00:09:17,240 Speaker 2: But you know, I think it had a really profound impact 181 00:09:17,280 --> 00:09:20,559 Speaker 2: on me as a kid, and it made me kind 182 00:09:20,600 --> 00:09:27,079 Speaker 2: of polarized, the notion that someone could be so oppressed 183 00:09:27,360 --> 00:09:32,080 Speaker 2: and you know, can be so good still and not 184 00:09:32,280 --> 00:09:38,360 Speaker 2: being you know, overwhelmingly jilted by what's happened to them, 185 00:09:38,640 --> 00:09:40,600 Speaker 2: and can be so loving and caring as well. 186 00:09:40,640 --> 00:09:43,920 Speaker 3: Now it's an amazing little book. Perfect. You, Tiff? 187 00:09:46,080 --> 00:09:49,160 Speaker 4: I can't remember the names of them really. 188 00:09:51,520 --> 00:09:57,720 Speaker 1: All right, Okay, Patrick, welcome to the You Project. Great 189 00:09:57,760 --> 00:09:58,320 Speaker 1: to have you. 190 00:09:59,360 --> 00:09:59,920 Speaker 3: Thanks mate. 191 00:10:00,920 --> 00:10:02,680 Speaker 1: What do you want to chat about? I'm not even 192 00:10:02,679 --> 00:10:04,520 Speaker 1: going to point you.
I'm going to let you lead 193 00:10:04,559 --> 00:10:09,560 Speaker 1: the rest of the endeavor here on this go anyway, 194 00:10:09,720 --> 00:10:10,079 Speaker 1: what's the. 195 00:10:10,120 --> 00:10:11,280 Speaker 3: chance that you reckon? 196 00:10:11,320 --> 00:10:13,480 Speaker 2: He can kind of go through a whole show without 197 00:10:13,640 --> 00:10:15,880 Speaker 2: actually trying to steer it. 198 00:10:16,640 --> 00:10:19,600 Speaker 1: I'm not going to say I'm not going to participate No. 199 00:10:19,640 --> 00:10:22,560 Speaker 3: I didn't say that. I'm just looking for the iceberg now. Look. 200 00:10:22,720 --> 00:10:25,240 Speaker 3: So are you a Rod Stewart fan? 201 00:10:26,559 --> 00:10:28,520 Speaker 1: I used to be a Rod Stewart fan when I 202 00:10:28,600 --> 00:10:32,960 Speaker 1: was young. But Rod Stewart's older than God, so he's 203 00:10:33,040 --> 00:10:38,640 Speaker 1: probably, he's probably past his best years. God bless him, though. Ah, 204 00:10:38,840 --> 00:10:40,400 Speaker 1: is this what you want to tell me about the 205 00:10:40,480 --> 00:10:41,120 Speaker 1: running thing? 206 00:10:41,600 --> 00:10:41,640 Speaker 3: No? 207 00:10:42,000 --> 00:10:45,360 Speaker 2: No, this recent concert that he did, he did this 208 00:10:45,600 --> 00:10:51,360 Speaker 2: bizarre AI tribute to Ozzy Osbourne who passed away recently. 209 00:10:51,720 --> 00:10:56,360 Speaker 2: And what he did was they had Ozzy Osbourne in 210 00:10:56,440 --> 00:11:01,880 Speaker 2: heaven taking selfies with other dead singers. Oh. It 211 00:11:01,960 --> 00:11:05,800 Speaker 2: was the tackiest, like Bob Marley, Tina Turner, Kurt Cobain, 212 00:11:06,240 --> 00:11:09,839 Speaker 2: Michael Jackson, George Michael, Freddie Mercury. 213 00:11:09,960 --> 00:11:13,200 Speaker 3: And you know, people were stunned, and not in a 214 00:11:13,240 --> 00:11:16,800 Speaker 3: good way. It was so weird.
I jumped on and 215 00:11:16,800 --> 00:11:17,920 Speaker 3: had to look at some of the clips. 216 00:11:17,920 --> 00:11:21,440 Speaker 2: So obviously lots of people posted it to socials, but 217 00:11:22,000 --> 00:11:24,840 Speaker 2: it was probably the best example of the worst. 218 00:11:24,559 --> 00:11:29,360 Speaker 3: way you could use AI. It was terrible. It was 219 00:11:29,480 --> 00:11:30,240 Speaker 3: really tacky. 220 00:11:31,160 --> 00:11:35,600 Speaker 1: Also leads me to wonder, what are the what are 221 00:11:35,600 --> 00:11:41,760 Speaker 1: the KPIs to get into heaven? Exactly what criteria did 222 00:11:41,760 --> 00:11:44,000 Speaker 1: you need to meet? I don't know. I don't know 223 00:11:44,000 --> 00:11:47,800 Speaker 1: about one or two of them anyway. Anyway, it says, 224 00:11:48,080 --> 00:11:50,800 Speaker 1: you know, judge not, lest ye be judged. All right, 225 00:11:50,880 --> 00:11:54,319 Speaker 1: I'll leave that stuff to God or whomever's or whomever's 226 00:11:54,400 --> 00:11:54,880 Speaker 1: in charge. 227 00:11:55,320 --> 00:11:59,720 Speaker 2: I think it wasn't his idea maybe or maybe some 228 00:11:59,800 --> 00:12:02,280 Speaker 2: person steered him in the. 229 00:12:02,200 --> 00:12:04,880 Speaker 3: Wrong direction, but it was. It was very, very weird. 230 00:12:04,960 --> 00:12:07,520 Speaker 2: But it's worth having a look at for that morbid 231 00:12:07,880 --> 00:12:10,960 Speaker 2: oh dear that you know, the car crash kind of 232 00:12:11,200 --> 00:12:15,000 Speaker 2: can't turn away. Look, it's an interesting one, but very 233 00:12:15,120 --> 00:12:21,320 Speaker 2: very bizarre. Anyway. Yeah, I know you're a big user 234 00:12:21,480 --> 00:12:26,560 Speaker 2: of AI for different things and for particularly assisting you, 235 00:12:26,559 --> 00:12:27,199 Speaker 2: you know, in. 236 00:12:27,200 --> 00:12:31,559 Speaker 3: everyday activities, and I don't know it's as much.
Do 237 00:12:31,600 --> 00:12:32,600 Speaker 3: you use AI a lot? 238 00:12:32,920 --> 00:12:34,360 Speaker 4: Yeah, a little bit, Yeah I do. 239 00:12:34,360 --> 00:12:36,880 Speaker 2: Do you get hooked into it? Do you find it's 240 00:12:37,000 --> 00:12:39,360 Speaker 2: kind of it keeps you going? Or do you just 241 00:12:39,640 --> 00:12:41,480 Speaker 2: use it as a search algorithm or you know, to 242 00:12:41,559 --> 00:12:43,120 Speaker 2: search for a topic and then leave, because that's what 243 00:12:43,240 --> 00:12:45,800 Speaker 2: I do. I don't find myself using it in the 244 00:12:45,840 --> 00:12:48,240 Speaker 2: same way, say, for example, people are on social media, 245 00:12:48,280 --> 00:12:50,800 Speaker 2: the algorithm makes you want to keep going. I find 246 00:12:50,840 --> 00:12:52,520 Speaker 2: I just do what I need to do, then I leave, 247 00:12:52,559 --> 00:12:55,160 Speaker 2: as opposed to being engaged. I haven't done the whole 248 00:12:55,520 --> 00:12:58,840 Speaker 2: chat bot "I want to become your best mate" type scenario. 249 00:13:00,080 --> 00:13:01,120 Speaker 4: Chatters are besties. 250 00:13:01,440 --> 00:13:05,680 Speaker 1: Yeah, actually yeah, yeah, me and chatter is probably one 251 00:13:05,679 --> 00:13:07,640 Speaker 1: of my best five friends at this point in time, 252 00:13:07,679 --> 00:13:11,640 Speaker 1: which is a commentary on how sad my fucking life is. 253 00:13:11,679 --> 00:13:18,320 Speaker 1: But yeah, the crab is probably third now, so God 254 00:13:18,320 --> 00:13:18,760 Speaker 1: bless him. 255 00:13:18,840 --> 00:13:22,839 Speaker 3: Chatter's at two. Does it leave Tiff and I? That's 256 00:13:22,880 --> 00:13:23,760 Speaker 3: what I am. 257 00:13:23,800 --> 00:13:26,559 Speaker 1: You're number one. I think you're thirty four. 258 00:13:26,760 --> 00:13:29,120 Speaker 4: I'm number thirty four, but I'm rapidly climbing. 259 00:13:31,000 --> 00:13:35,360 Speaker 1: No, you're both equal number one.
See what I did there? Yeah, yeah, 260 00:13:35,679 --> 00:13:40,520 Speaker 1: I mean I think it's fucking amazing. And I understand 261 00:13:40,559 --> 00:13:44,320 Speaker 1: the fear and the mild hysteria because we don't like 262 00:13:44,440 --> 00:13:47,080 Speaker 1: new things and that's going to kill the world. And 263 00:13:47,120 --> 00:13:48,480 Speaker 1: I don't think it's going to kill the world. I 264 00:13:48,520 --> 00:13:50,760 Speaker 1: think people are going to kill the world eventually, but 265 00:13:51,640 --> 00:13:55,000 Speaker 1: I don't know. I think it's a really good tool, 266 00:13:55,120 --> 00:13:57,560 Speaker 1: and it depends how you use it. I think, like 267 00:13:57,600 --> 00:13:59,679 Speaker 1: a lot of things, whether it's booze or whether it's 268 00:13:59,720 --> 00:14:03,800 Speaker 1: AI, or whether or not it's food, it 269 00:14:03,880 --> 00:14:06,400 Speaker 1: depends on the relationship that you have with it. You 270 00:14:06,440 --> 00:14:08,880 Speaker 1: can have a healthy or unhealthy relationship. You can use 271 00:14:08,880 --> 00:14:12,679 Speaker 1: it positively or negatively. You can become enslaved to it, 272 00:14:12,760 --> 00:14:15,559 Speaker 1: addicted to it, and then all of a sudden it's terrible, 273 00:14:16,080 --> 00:14:18,400 Speaker 1: but it maybe isn't terrible for the next person. 274 00:14:19,240 --> 00:14:23,960 Speaker 3: So the thing is the latest AI models coming out. 275 00:14:24,080 --> 00:14:27,680 Speaker 2: ChatGPT is about to release its new updated version, 276 00:14:27,760 --> 00:14:30,640 Speaker 2: which they're touting as being the most amazing thing ever. 277 00:14:31,560 --> 00:14:35,000 Speaker 2: But what they're coming out with and they've put out, 278 00:14:35,080 --> 00:14:38,960 Speaker 2: they're saying they want to put mental health first.
Now 279 00:14:39,600 --> 00:14:42,600 Speaker 2: do I sound cynical if I say, really, you want 280 00:14:42,600 --> 00:14:45,320 Speaker 2: to put mental health first? But what they say, there's 281 00:14:45,320 --> 00:14:47,280 Speaker 2: a new blog post that came out and they're saying 282 00:14:47,280 --> 00:14:50,800 Speaker 2: they want to optimize ChatGPT to use it as 283 00:14:50,800 --> 00:14:54,200 Speaker 2: a tool to help people, but not to keep people 284 00:14:54,280 --> 00:14:57,160 Speaker 2: on for the sake of keeping people on. So they 285 00:14:57,200 --> 00:14:59,760 Speaker 2: want to maximize that. They're not trying to maximize 286 00:14:59,760 --> 00:15:03,560 Speaker 2: the time. They're trying to use it when it's helpful, 287 00:15:03,800 --> 00:15:06,920 Speaker 2: but not use it when it's when it isn't, is 288 00:15:06,960 --> 00:15:09,920 Speaker 2: what the claim is. They're trying to change the algorithm 289 00:15:10,120 --> 00:15:12,320 Speaker 2: so it doesn't hook you in as opposed to help 290 00:15:12,360 --> 00:15:12,720 Speaker 2: you out. 291 00:15:14,000 --> 00:15:17,320 Speaker 1: Yeah, it's I think when you're talking about anything that's 292 00:15:17,360 --> 00:15:22,240 Speaker 1: potentially addictive, right, And I understand the concern. I just 293 00:15:22,320 --> 00:15:26,560 Speaker 1: think that at some stage we humans have to take 294 00:15:26,640 --> 00:15:33,480 Speaker 1: a bit of responsibility ourselves. We can't always outsource blame. Oh, 295 00:15:33,520 --> 00:15:36,360 Speaker 1: whatever's gone wrong in my life, it isn't me. It's 296 00:15:36,400 --> 00:15:39,720 Speaker 1: not me. It's social media, it's not me. It's my genetics. 297 00:15:39,720 --> 00:15:42,720 Speaker 1: It's not me. It's the other person. It's not me, 298 00:15:43,000 --> 00:15:45,480 Speaker 1: it's the government. And I understand all of those things 299 00:15:45,520 --> 00:15:51,640 Speaker 1: can play a role.
But if AI was universally addictive, 300 00:15:51,680 --> 00:15:55,000 Speaker 1: then everybody who uses AI would be addicted. Clearly they're 301 00:15:55,040 --> 00:15:58,680 Speaker 1: not, so clearly there's something to do with the individual 302 00:15:58,800 --> 00:16:01,560 Speaker 1: and the way that the individual uses it, and 303 00:16:03,360 --> 00:16:07,680 Speaker 1: like the dichotomy is. On the one hand, of course, Chat 304 00:16:07,720 --> 00:16:11,200 Speaker 1: GPT and the like want people using their shit all 305 00:16:11,280 --> 00:16:14,360 Speaker 1: the time because they're a commercial entity. You've got to 306 00:16:14,400 --> 00:16:17,440 Speaker 1: be realistic. Do I think ChatGPT or any 307 00:16:17,960 --> 00:16:21,240 Speaker 1: AI entity gives a fuck about me? Of course I don't. 308 00:16:21,360 --> 00:16:24,280 Speaker 1: I don't think that because I'm not an idiot. Of course, 309 00:16:24,440 --> 00:16:27,480 Speaker 1: it's a tool that we can use. It's a money 310 00:16:27,480 --> 00:16:31,200 Speaker 1: making, profit driven business, of course. But then I've got 311 00:16:31,240 --> 00:16:34,560 Speaker 1: free will, I've got critical thinking. I'm responsible for my 312 00:16:34,680 --> 00:16:38,680 Speaker 1: choices and my brain and my body, so I can't 313 00:16:38,840 --> 00:16:42,840 Speaker 1: blame ChatGPT for what I do. And I just 314 00:16:42,920 --> 00:16:46,400 Speaker 1: think it's almost like we've moved past the age of 315 00:16:46,480 --> 00:16:50,640 Speaker 1: personal responsibility. We're so happy to say it's not our fault, 316 00:16:50,680 --> 00:16:53,800 Speaker 1: it's everyone else's fault. And I think it's just a 317 00:16:53,840 --> 00:16:56,080 Speaker 1: tool. Steps down off soapbox. 318 00:16:56,400 --> 00:16:58,240 Speaker 3: Wow, that was quite a little yeah.
319 00:16:58,680 --> 00:17:00,960 Speaker 1: But don't you think though, I think we're like, oh, yeah, 320 00:17:01,080 --> 00:17:04,400 Speaker 1: fuck them, they're making us this. No they're not. They're 321 00:17:04,480 --> 00:17:07,800 Speaker 1: not making us. Here's an idea, don't fucking use it 322 00:17:08,000 --> 00:17:11,400 Speaker 1: the end. Here's an idea, don't eat shit the end. 323 00:17:11,760 --> 00:17:15,200 Speaker 1: Oh no, it's not that simple. Okay, let's make. 324 00:17:15,040 --> 00:17:16,720 Speaker 3: It hard then, yeah, look that, Wait a minute. 325 00:17:16,840 --> 00:17:19,760 Speaker 2: I know what you're saying, and I know how heavily 326 00:17:19,880 --> 00:17:23,320 Speaker 2: and highly motivated you are as an individual. You've achieved 327 00:17:23,359 --> 00:17:25,760 Speaker 2: a lot in your life, and then go to the 328 00:17:25,800 --> 00:17:27,439 Speaker 2: food court at South I'm. 329 00:17:27,320 --> 00:17:31,440 Speaker 1: Not highly motivated. I talk about that all the time, 330 00:17:32,040 --> 00:17:34,840 Speaker 1: like motivation comes and goes, and don't try to make 331 00:17:34,880 --> 00:17:37,439 Speaker 1: out I'm different or better or special because I'm fucking not. 332 00:17:37,880 --> 00:17:40,840 Speaker 1: This is the problem. Everyone goes, Oh yeah, it's easy 333 00:17:40,880 --> 00:17:44,120 Speaker 1: for you. It's fucking not easy for me. But that's 334 00:17:44,119 --> 00:17:45,399 Speaker 1: the is a no no. 335 00:17:45,440 --> 00:17:48,600 Speaker 3: But you have an ability to make that decision the reason. 336 00:17:48,600 --> 00:17:51,280 Speaker 1: So does everybody have an ability to make a decision? 337 00:17:51,880 --> 00:17:52,120 Speaker 3: Agree? 338 00:17:52,240 --> 00:17:53,680 Speaker 1: Everybody? Everybody? 339 00:17:53,960 --> 00:17:54,320 Speaker 3: Isn't that? 340 00:17:54,440 --> 00:17:57,520 Speaker 2: You and Tiff and I. 
I mean, I'm kind of 341 00:17:57,520 --> 00:17:59,720 Speaker 2: putting myself into this. And I know I'm a frail 342 00:18:00,040 --> 00:18:02,359 Speaker 2: human being that makes lots and lots of mistakes. 343 00:18:02,840 --> 00:18:05,840 Speaker 3: But what I'm saying is you've achieved a lot. You can't 344 00:18:05,880 --> 00:18:06,480 Speaker 3: deny that. 345 00:18:07,040 --> 00:18:09,119 Speaker 2: So I I you know, if you take three people 346 00:18:09,119 --> 00:18:11,360 Speaker 2: in the room and you look at each other, as 347 00:18:11,400 --> 00:18:15,159 Speaker 2: in a microcosm, and say, Okay, what has she achieved in 348 00:18:15,200 --> 00:18:17,480 Speaker 2: her life? What have I achieved in my life? I'm 349 00:18:17,480 --> 00:18:19,960 Speaker 2: not putting you up on a pedestal. I'm just saying that 350 00:18:20,280 --> 00:18:22,800 Speaker 2: you have made decisions and followed through with those decisions 351 00:18:22,800 --> 00:18:25,680 Speaker 2: when you get focused. You know you're doing a PhD 352 00:18:25,800 --> 00:18:28,119 Speaker 2: for crying out loud. Give yourself a bit of a 353 00:18:28,119 --> 00:18:29,960 Speaker 2: pat on the back for that. But what I'm what 354 00:18:30,000 --> 00:18:32,840 Speaker 2: I'm saying is if you walk into a food court, 355 00:18:32,880 --> 00:18:35,000 Speaker 2: the biggest queue is going to be at the KFC 356 00:18:35,160 --> 00:18:39,480 Speaker 2: counter and people are making wrong decisions. Now that's not 357 00:18:39,520 --> 00:18:41,920 Speaker 2: a criticism because there are lots of reasons why people 358 00:18:41,960 --> 00:18:45,399 Speaker 2: do what they do, and we all make decisions every 359 00:18:45,440 --> 00:18:48,080 Speaker 2: single day, make lots and lots and probably hundreds of 360 00:18:48,119 --> 00:18:49,120 Speaker 2: decisions every day. 361 00:18:49,400 --> 00:18:51,040 Speaker 3: You know. What I'm getting at.
362 00:18:50,960 --> 00:18:55,199 Speaker 2: Is that when ChatGPT is being used, we 363 00:18:55,400 --> 00:18:58,600 Speaker 2: know that it is giving advice to people, and some 364 00:18:58,640 --> 00:19:01,520 Speaker 2: people go further down the rabbit hole, and part of that 365 00:19:01,720 --> 00:19:04,439 Speaker 2: is because of the algorithm that makes them, you know, 366 00:19:04,480 --> 00:19:07,199 Speaker 2: the reason that you flick through on your socials and 367 00:19:07,240 --> 00:19:09,600 Speaker 2: then you stop and do something else, as opposed to 368 00:19:09,640 --> 00:19:12,879 Speaker 2: somebody who keeps flicking on their socials, because the algorithm 369 00:19:12,920 --> 00:19:16,840 Speaker 2: is designed to manipulate your emotions to keep you doing 370 00:19:16,880 --> 00:19:17,560 Speaker 2: what you're doing. 371 00:19:17,840 --> 00:19:20,040 Speaker 3: And it's interesting, because there was another. 372 00:19:19,800 --> 00:19:22,840 Speaker 2: Little article that ChatGPT put out, and one of 373 00:19:22,880 --> 00:19:27,280 Speaker 2: the criticisms is, a person recently said that they wanted, 374 00:19:27,920 --> 00:19:31,360 Speaker 2: you know, to stop taking their medication, and the response 375 00:19:31,359 --> 00:19:34,520 Speaker 2: from ChatGPT was, good on you, that's a great thing, 376 00:19:34,640 --> 00:19:37,280 Speaker 2: you know, supporting that person. It turns out the person 377 00:19:37,320 --> 00:19:42,040 Speaker 2: had delusions and thought their family was sending radio signals 378 00:19:42,040 --> 00:19:45,480 Speaker 2: through the walls, so they left their home and they 379 00:19:45,480 --> 00:19:50,359 Speaker 2: stopped taking their medication because it reinforced it, and that person 380 00:19:50,400 --> 00:19:53,399 Speaker 2: went down the rabbit hole, and the AI model was 381 00:19:53,520 --> 00:19:58,040 Speaker 2: reinforcing and kind of sympathizing with them and giving bad advice.
382 00:19:58,480 --> 00:20:01,600 Speaker 2: So they're pulling back on that, and in fact, Chat 383 00:20:01,640 --> 00:20:04,680 Speaker 2: GPT is now pulling back on what they're calling high 384 00:20:04,720 --> 00:20:06,720 Speaker 2: stakes personal decisions. 385 00:20:07,200 --> 00:20:08,400 Speaker 3: So they're trying. 386 00:20:08,160 --> 00:20:11,520 Speaker 2: To look at what people are saying, and we know 387 00:20:11,640 --> 00:20:14,159 Speaker 2: that there are people who have gone down the suicide 388 00:20:14,200 --> 00:20:16,760 Speaker 2: rabbit hole, and it just happens with social media as well. 389 00:20:16,800 --> 00:20:19,160 Speaker 2: This is why at the end of this year there's 390 00:20:19,200 --> 00:20:21,200 Speaker 2: going to be a ban for kids under the age 391 00:20:21,280 --> 00:20:23,800 Speaker 2: of sixteen in Australia, and a lot of the world is 392 00:20:23,960 --> 00:20:27,239 Speaker 2: looking at Australia saying this is bloody great, you know. 393 00:20:27,440 --> 00:20:30,040 Speaker 2: But the reality of it is, it's going to be 394 00:20:30,040 --> 00:20:33,080 Speaker 2: really hard to police. And there's been an influx of 395 00:20:33,200 --> 00:20:35,399 Speaker 2: people under the age of sixteen now signing up to 396 00:20:35,440 --> 00:20:38,320 Speaker 2: social media before the ban comes in, you know. 397 00:20:38,359 --> 00:20:41,199 Speaker 3: So it is a rabbit hole. It very much is. 398 00:20:41,359 --> 00:20:42,480 Speaker 1: So I get what. 399 00:20:42,400 --> 00:20:45,040 Speaker 2: You're saying, and I know that what you do takes 400 00:20:45,040 --> 00:20:47,960 Speaker 2: a lot of effort and determination, and it's not easy. 401 00:20:48,000 --> 00:20:49,560 Speaker 3: I'm not saying it's easy for you. 402 00:20:49,880 --> 00:20:52,159 Speaker 2: But at some point you've made a conscious decision to 403 00:20:52,160 --> 00:20:53,240 Speaker 2: go down the path.
404 00:20:53,160 --> 00:20:53,760 Speaker 3: That you have. 405 00:20:54,760 --> 00:20:57,919 Speaker 2: And if we've got a chatbot that everyone's saying, this 406 00:20:57,960 --> 00:20:59,840 Speaker 2: is great, this can give you all the answers that 407 00:20:59,880 --> 00:21:02,199 Speaker 2: you need. If you're a lonely person, if you're a 408 00:21:02,240 --> 00:21:05,600 Speaker 2: person who's confused, if you're a person who is going 409 00:21:05,640 --> 00:21:08,720 Speaker 2: through an identity crisis and you can't talk to people 410 00:21:08,760 --> 00:21:12,720 Speaker 2: around you, then you turn to an AI model for help, 411 00:21:13,160 --> 00:21:15,360 Speaker 2: and that's what people are doing. In the same way, 412 00:21:15,400 --> 00:21:17,520 Speaker 2: you go to Google and you type out, what is 413 00:21:17,600 --> 00:21:19,200 Speaker 2: this lump that's appeared on my leg? 414 00:21:19,520 --> 00:21:21,600 Speaker 3: It could be a fatty deposit, it could be cancer. 415 00:21:21,760 --> 00:21:24,400 Speaker 2: You know what I mean? You're going down the rabbit hole. 416 00:21:24,760 --> 00:21:27,080 Speaker 2: So, you know, and it might just be that you 417 00:21:27,080 --> 00:21:30,199 Speaker 2: can't afford to go to a counselor because it's too expensive. 418 00:21:30,720 --> 00:21:32,560 Speaker 2: You know, you might have run out of your mental 419 00:21:32,600 --> 00:21:39,640 Speaker 2: health care plan, you know, government-supported attendances with a psychologist. 420 00:21:39,880 --> 00:21:41,520 Speaker 2: And it's like, well, now who do I turn to? 421 00:21:41,880 --> 00:21:44,000 Speaker 2: I can't turn to people around me.
You might be, 422 00:21:44,280 --> 00:21:46,760 Speaker 2: you know, part of the LGBT community and you're scared 423 00:21:47,040 --> 00:21:50,640 Speaker 2: about talking to people, so you turn to an AI 424 00:21:50,880 --> 00:21:53,200 Speaker 2: to find out information, to try to get some sort 425 00:21:53,240 --> 00:21:55,840 Speaker 2: of sense of, you know, not being alone, and what 426 00:21:55,880 --> 00:21:59,040 Speaker 2: should I do. So I think it's a really complex argument. 427 00:21:59,040 --> 00:22:00,920 Speaker 2: We're not going to answer it in the time that we're 428 00:22:00,960 --> 00:22:03,760 Speaker 2: spending today, but I think opening up the discussion is 429 00:22:03,840 --> 00:22:06,879 Speaker 2: really important, to understand where do we head with this 430 00:22:06,960 --> 00:22:09,080 Speaker 2: and how do we support each other in this as well. 431 00:22:09,920 --> 00:22:13,080 Speaker 1: Yeah, yeah, look, I mean everything you're saying is valid, 432 00:22:13,440 --> 00:22:18,399 Speaker 1: and I don't disagree, but I also think, oh, you know, 433 00:22:18,480 --> 00:22:22,719 Speaker 1: so sometimes ChatGPT, AI, what, it gives flawed advice? One 434 00:22:22,760 --> 00:22:26,760 Speaker 1: hundred percent agree. So do humans. So do humans, you know. 435 00:22:27,400 --> 00:22:31,280 Speaker 1: And like, the third leading cause of death in 436 00:22:31,320 --> 00:22:33,879 Speaker 1: America is medical mistakes. 437 00:22:34,200 --> 00:22:37,640 Speaker 3: Wow. Yeah, right, it's fuck ups. 438 00:22:37,400 --> 00:22:41,439 Speaker 1: That are done by humans medically. Tiff, can you just 439 00:22:41,560 --> 00:22:47,320 Speaker 1: check that for me before I just... should I? No? Yeah, yeah, 440 00:22:47,359 --> 00:22:50,240 Speaker 1: that's, that's hilarious.
But, but I know what you're saying, mate, 441 00:22:50,280 --> 00:22:55,080 Speaker 1: but I guess my frustration is, I feel like we're 442 00:22:55,119 --> 00:22:59,239 Speaker 1: in the generation or the society or culture that is 443 00:22:59,359 --> 00:23:04,320 Speaker 1: just always mad at everyone and everything. We, fucking... fucking 444 00:23:04,480 --> 00:23:06,960 Speaker 1: ChatGPT, it's killing us. It's not killing me. 445 00:23:07,280 --> 00:23:10,320 Speaker 1: It's not killing most of the kids I know. And 446 00:23:10,680 --> 00:23:12,840 Speaker 1: you know, there are kids that use social media in 447 00:23:12,880 --> 00:23:15,760 Speaker 1: a smart way and a not so smart way. And 448 00:23:15,800 --> 00:23:19,920 Speaker 1: I understand, but it's like, what, here's the problem. When 449 00:23:20,000 --> 00:23:26,320 Speaker 1: we say that AI or social media is the problem, 450 00:23:26,880 --> 00:23:31,840 Speaker 1: then we're never saying that the kids have any control. Like, well, no, 451 00:23:32,240 --> 00:23:35,480 Speaker 1: the choices that you're making are also part of the problem. 452 00:23:35,640 --> 00:23:39,719 Speaker 1: Your habits are also part of the problem. The environment. 453 00:23:40,080 --> 00:23:44,159 Speaker 1: It's not just this, you know, one piece to the 454 00:23:44,240 --> 00:23:49,800 Speaker 1: jigsaw puzzle. It's multi-dimensional. And like you said in 455 00:23:49,840 --> 00:23:53,680 Speaker 1: this thing, in the middle of your kind of soapbox moment, 456 00:23:53,720 --> 00:23:56,520 Speaker 1: which was good, you said that the algorithm makes them 457 00:23:56,600 --> 00:23:59,560 Speaker 1: do this, and I'm like, it doesn't make me, it doesn't 458 00:23:59,640 --> 00:24:03,240 Speaker 1: make you, it doesn't make Tiff. Like, I think 459 00:24:03,280 --> 00:24:07,080 Speaker 1: that when we go, it's the algorithm's fault, we're helpless.
460 00:24:07,720 --> 00:24:11,400 Speaker 1: We're just at the will of the technology. And I'm like, well, 461 00:24:11,440 --> 00:24:18,960 Speaker 1: that is so disempowering, to go... No, we don't have 462 00:24:19,080 --> 00:24:21,600 Speaker 1: to do anything. We don't have to use social media, 463 00:24:21,640 --> 00:24:23,400 Speaker 1: we don't have to listen to this show. We don't 464 00:24:23,440 --> 00:24:26,600 Speaker 1: have to eat shit food. We don't have to, you know, 465 00:24:26,760 --> 00:24:30,760 Speaker 1: kill our body slowly. We don't have to. And I 466 00:24:30,800 --> 00:24:35,720 Speaker 1: think that, like, these conversations, although, you know, they can 467 00:24:35,760 --> 00:24:38,399 Speaker 1: be a bit complicated, I understand. And didn't we go 468 00:24:38,520 --> 00:24:43,560 Speaker 1: from hilarity to the fucking depths of whatever so fucking quick? 469 00:24:45,080 --> 00:24:45,560 Speaker 3: It was all. 470 00:24:46,040 --> 00:24:48,880 Speaker 1: It was all dick jokes and possums till Patrick got 471 00:24:48,920 --> 00:24:52,679 Speaker 1: to the steering wheel. I blame, I blame me. I 472 00:24:52,800 --> 00:24:57,040 Speaker 1: blame me. It's all right, I blame me. I don't know. 473 00:24:57,200 --> 00:25:00,520 Speaker 1: I just, this is me being raw and real and 474 00:25:00,600 --> 00:25:03,360 Speaker 1: not funny for a moment. But I just, and I'm 475 00:25:03,400 --> 00:25:05,359 Speaker 1: not saying that about what you said, mate, but I just 476 00:25:05,880 --> 00:25:11,560 Speaker 1: get tired of everyone saying nothing's my fault. Like, all 477 00:25:11,600 --> 00:25:14,400 Speaker 1: the bad things in my life, none of it's my fault, 478 00:25:14,720 --> 00:25:17,160 Speaker 1: and everyone goes, oh, no, no, we know, we don't 479 00:25:17,200 --> 00:25:20,880 Speaker 1: want to blame.
It's like, fucking, well, sometimes you are 480 00:25:20,960 --> 00:25:24,679 Speaker 1: to blame. Like, that's not an insult, that's just a 481 00:25:24,840 --> 00:25:29,439 Speaker 1: fucking fact of life. Sometimes I am to blame. Like, 482 00:25:30,560 --> 00:25:33,800 Speaker 1: I would say, you know, most of the fuck ups 483 00:25:33,840 --> 00:25:36,760 Speaker 1: or bad outcomes I've had in my life, it's either 484 00:25:36,800 --> 00:25:39,639 Speaker 1: been totally me or partly me. Now, that's not me 485 00:25:39,760 --> 00:25:42,120 Speaker 1: throwing myself under the bus. That's me having a bit 486 00:25:42,160 --> 00:25:46,159 Speaker 1: of humility and self-awareness. You know, when I was 487 00:25:46,200 --> 00:25:50,560 Speaker 1: morbidly obese, it was nobody's fault but mine, because I 488 00:25:50,640 --> 00:25:52,760 Speaker 1: made the decisions, and I ate the food, and I 489 00:25:52,800 --> 00:25:55,479 Speaker 1: took the action. And then when I got in good shape, 490 00:25:55,520 --> 00:26:00,119 Speaker 1: also me. And that's not ego, that's just observation. I 491 00:26:00,119 --> 00:26:03,160 Speaker 1: think at some stage we've got to stop fucking pointing 492 00:26:03,200 --> 00:26:06,240 Speaker 1: the finger at all of these things that are doing 493 00:26:06,320 --> 00:26:09,240 Speaker 1: this to us and go, cool, what role do I play? 494 00:26:09,280 --> 00:26:11,840 Speaker 1: Steps down off soapbox. I'll be hilarious from now on. Dick 495 00:26:11,920 --> 00:26:12,920 Speaker 1: joke coming up soon. 496 00:26:13,680 --> 00:26:15,280 Speaker 3: Look, I do see what you're saying. 497 00:26:15,600 --> 00:26:18,760 Speaker 2: I had a really interesting experience, and I'm drawing a parallel.
498 00:26:20,200 --> 00:26:23,560 Speaker 2: I don't watch television, but occasionally, you know, if I 499 00:26:23,600 --> 00:26:26,840 Speaker 2: go to someone's house during sporting events, there's a lot 500 00:26:26,840 --> 00:26:32,480 Speaker 2: of advertising for online betting, and that's quite staggering, and. 501 00:26:32,400 --> 00:26:34,480 Speaker 3: There's a reason for that. There's also a reason. 502 00:26:34,200 --> 00:26:38,240 Speaker 2: That McDonald's runs their radio ads around lunchtime and dinner time, 503 00:26:38,520 --> 00:26:41,320 Speaker 2: because they know, the marketing people know, that you're hitting 504 00:26:41,320 --> 00:26:44,400 Speaker 2: the right trigger and you're getting people at the right time. Now, 505 00:26:44,440 --> 00:26:48,000 Speaker 2: I'm not making excuses for the decisions people make, but I 506 00:26:48,040 --> 00:26:51,479 Speaker 2: think that what's happening with the likes of AI is 507 00:26:51,520 --> 00:26:54,800 Speaker 2: that, you know, the algorithms that we talk about with social media, 508 00:26:54,880 --> 00:26:56,520 Speaker 2: we keep kind of throwing that out, but what it 509 00:26:56,600 --> 00:27:00,879 Speaker 2: does, effectively, is kind of make it easy for 510 00:27:00,920 --> 00:27:04,000 Speaker 2: people to be manipulated, and some people are more 511 00:27:04,000 --> 00:27:07,040 Speaker 2: susceptible to that. And I guess that's the understanding I'm 512 00:27:07,040 --> 00:27:10,000 Speaker 2: trying to bring into the conversation, is that, yes, you know, 513 00:27:10,280 --> 00:27:13,520 Speaker 2: we all make conscious decisions, you're right, but the marketers 514 00:27:13,520 --> 00:27:18,200 Speaker 2: are getting cleverer, the people are getting cleverer, and that's, 515 00:27:18,400 --> 00:27:20,800 Speaker 2: that's the discussion. And I guess we need to know 516 00:27:20,920 --> 00:27:22,160 Speaker 2: that, to know that.
517 00:27:22,080 --> 00:27:23,280 Speaker 3: We're being manipulated. 518 00:27:23,520 --> 00:27:26,280 Speaker 2: You know, that's why I'm steering towards the drive-through, 519 00:27:26,560 --> 00:27:29,399 Speaker 2: you know, for whatever it happens to be. So part of 520 00:27:29,440 --> 00:27:32,680 Speaker 2: that is just, you know, knowledge 521 00:27:32,720 --> 00:27:35,560 Speaker 2: is power, and knowing that we are being. 522 00:27:35,440 --> 00:27:36,560 Speaker 3: Steered in a certain direction. 523 00:27:36,760 --> 00:27:38,919 Speaker 2: I remember, you know, I haven't used social media for 524 00:27:38,960 --> 00:27:41,800 Speaker 2: a very long time, and a friend said to me, look, 525 00:27:41,840 --> 00:27:43,639 Speaker 2: I messaged you two weeks ago. It's like, what do 526 00:27:43,720 --> 00:27:46,160 Speaker 2: you mean you messaged me two weeks ago? And they said, 527 00:27:46,160 --> 00:27:48,719 Speaker 2: I sent you a messenger, a message on Facebook Messenger. 528 00:27:48,760 --> 00:27:51,920 Speaker 2: So, dude, I haven't used Facebook Messenger for eight years. 529 00:27:52,440 --> 00:27:56,120 Speaker 2: And so I went to my account, and I kind of 530 00:27:56,119 --> 00:27:58,800 Speaker 2: looked at it and I thought, oh, okay, there's 531 00:27:58,800 --> 00:28:01,960 Speaker 2: a few messages there, one from three years ago. So I 532 00:28:02,040 --> 00:28:05,760 Speaker 2: start looking through. But twenty minutes later I was flicking. 533 00:28:05,840 --> 00:28:08,960 Speaker 2: I'm thinking, what the hell am I doing? I start 534 00:28:09,119 --> 00:28:11,520 Speaker 2: doing the flick and I thought, this is the reason 535 00:28:11,560 --> 00:28:14,679 Speaker 2: I stopped using social media, because I probably have one 536 00:28:14,720 --> 00:28:17,960 Speaker 2: of those addictive personalities.
And I made the conscious choice 537 00:28:18,000 --> 00:28:20,480 Speaker 2: then to uninstall the app again, and I 538 00:28:20,520 --> 00:28:22,720 Speaker 2: just said to my friend, just don't send me messages there. 539 00:28:22,840 --> 00:28:24,159 Speaker 3: Use WhatsApp, use text. 540 00:28:24,560 --> 00:28:24,760 Speaker 1: You know. 541 00:28:25,200 --> 00:28:27,720 Speaker 2: It was an interesting thing, but I found myself being 542 00:28:27,800 --> 00:28:30,560 Speaker 2: drawn in again, because it cleverly. 543 00:28:30,200 --> 00:28:32,720 Speaker 3: Knows, you know, it even knows what. 544 00:28:32,520 --> 00:28:36,200 Speaker 2: You pause on and what has grabbed your attention. And 545 00:28:36,520 --> 00:28:38,160 Speaker 2: the stuff that I was looking at, there's a really 546 00:28:38,200 --> 00:28:41,880 Speaker 2: great site on Facebook called Earth Pics. I don't know 547 00:28:41,880 --> 00:28:45,920 Speaker 2: if you've ever seen Earth Pics, but it's this amazing 548 00:28:46,000 --> 00:28:48,040 Speaker 2: site that just shows amazing footage from around the world. It 549 00:28:48,080 --> 00:28:50,600 Speaker 2: could be somewhere in Nairobi, or it could be someone 550 00:28:50,720 --> 00:28:53,200 Speaker 2: cycling on the top of one of those precarious cliffs, 551 00:28:53,200 --> 00:28:55,160 Speaker 2: you know, where the, well, the cliff's not precarious, the 552 00:28:55,240 --> 00:28:58,360 Speaker 2: action is, but they're riding along a cliff's edge, and 553 00:28:58,400 --> 00:29:00,800 Speaker 2: I get hooked by that stuff, and into sports. 554 00:29:01,240 --> 00:29:04,040 Speaker 3: So for me, it still knows what gets me in, 555 00:29:05,160 --> 00:29:05,480 Speaker 3: oh, one.
556 00:29:05,440 --> 00:29:09,280 Speaker 1: Hundred percent. I reckon one of the smartest things you 557 00:29:09,320 --> 00:29:12,280 Speaker 1: did was, and you've done a lot of smart things, 558 00:29:12,280 --> 00:29:16,560 Speaker 1: but like throwing your telly away ten years ago. I'm like, yeah, 559 00:29:16,720 --> 00:29:19,160 Speaker 1: I'm, I will put up my hand and say, well, 560 00:29:19,240 --> 00:29:22,040 Speaker 1: I'm not addicted to telly, but I do use it, 561 00:29:22,080 --> 00:29:24,840 Speaker 1: as maybe, it's in a healthy way, but yeah, I'm, I'm, 562 00:29:25,080 --> 00:29:28,480 Speaker 1: I can binge a Netflix series pretty savagely. Like, once 563 00:29:28,520 --> 00:29:30,480 Speaker 1: I open the door, and there's a bit of a 564 00:29:30,520 --> 00:29:33,920 Speaker 1: biochemical addiction, right, there's this, that creates this response in 565 00:29:33,960 --> 00:29:35,760 Speaker 1: your brain and you're like, wow, I feel good. This 566 00:29:35,920 --> 00:29:38,920 Speaker 1: is good. And then now I've just watched four episodes 567 00:29:38,920 --> 00:29:40,880 Speaker 1: back to back in the night. So there's definitely that 568 00:29:41,000 --> 00:29:43,200 Speaker 1: going on. And also, in your defense, Your Honor, 569 00:29:43,800 --> 00:29:49,240 Speaker 1: you're exactly right. Some people are really genetically and biochemically 570 00:29:49,320 --> 00:29:53,600 Speaker 1: predisposed to get hooked into things more easily. And I 571 00:29:53,760 --> 00:29:57,400 Speaker 1: fully acknowledge that. So I'm not saying people who get... 572 00:29:57,920 --> 00:30:01,840 Speaker 1: I've worked with addicts, so I understand it well, and 573 00:30:01,880 --> 00:30:04,640 Speaker 1: I'm not saying they're at all weak or anything like that. 574 00:30:06,040 --> 00:30:08,680 Speaker 1: It's tough, it's real tough.
All I'm saying is that 575 00:30:08,760 --> 00:30:12,920 Speaker 1: at some stage we have to step up ourselves and say, well, look, 576 00:30:12,960 --> 00:30:15,720 Speaker 1: I actually have the power to make decisions and do things, 577 00:30:15,760 --> 00:30:20,800 Speaker 1: and, you know, not do other things. But also, to 578 00:30:20,880 --> 00:30:24,360 Speaker 1: your point on, you know, marketing and branding and advertising, 579 00:30:24,560 --> 00:30:28,400 Speaker 1: as you point out, it's entirely focused on 580 00:30:28,480 --> 00:30:31,760 Speaker 1: getting people to buy stuff, to manipulate their thinking, to 581 00:30:32,400 --> 00:30:35,640 Speaker 1: coerce them, to get them to go do a thing, 582 00:30:35,840 --> 00:30:38,440 Speaker 1: go to McDonald's at this time, or buy this thing 583 00:30:38,520 --> 00:30:41,720 Speaker 1: that you don't need. I mean, that's the objective, you know. 584 00:30:41,800 --> 00:30:45,600 Speaker 1: About twenty years ago in Australia, that I know of anyway, marketing 585 00:30:45,680 --> 00:30:51,840 Speaker 1: firms started working with neuroscientists. They would employ neuroscientists because 586 00:30:51,880 --> 00:30:56,360 Speaker 1: they understood how the brain responds to various stimuli. And 587 00:30:56,440 --> 00:30:59,120 Speaker 1: so then now they've got this, they're doing deep science 588 00:31:00,040 --> 00:31:03,920 Speaker 1: in order to manipulate, coerce, influence people to buy stuff. 589 00:31:03,960 --> 00:31:07,680 Speaker 1: So yeah, things aren't always as they seem, that's 590 00:31:07,680 --> 00:31:08,080 Speaker 1: for sure. 591 00:31:08,280 --> 00:31:10,440 Speaker 2: And it can be really easy too, like sending Tiff 592 00:31:10,440 --> 00:31:12,400 Speaker 2: a link to a new e-ink tablet, and that 593 00:31:12,600 --> 00:31:14,320 Speaker 2: just sent her right down the rabbit hole. 594 00:31:14,960 --> 00:31:17,280 Speaker 4: He's got a whole thing.
He's doing our harps. 595 00:31:17,280 --> 00:31:17,800 Speaker 2: It's bush. 596 00:31:17,960 --> 00:31:18,520 Speaker 1: What's he doing? 597 00:31:18,720 --> 00:31:20,800 Speaker 4: Just sending me shit that I don't need to be 598 00:31:20,880 --> 00:31:23,320 Speaker 4: looking at, to buy, that he's trying to coax me 599 00:31:23,360 --> 00:31:27,160 Speaker 4: into, like that fucking crash that time. 600 00:31:27,680 --> 00:31:30,280 Speaker 1: He's just as bad as fucking AI. By the way, 601 00:31:30,600 --> 00:31:33,160 Speaker 1: did you find out about that question about the. 602 00:31:33,120 --> 00:31:36,040 Speaker 4: Medical? It looks questionable to be in a top three. 603 00:31:36,160 --> 00:31:38,720 Speaker 4: There's a bit of, there's a bit of, yeah, what's 604 00:31:38,720 --> 00:31:42,720 Speaker 4: it saying, well, just in all the places, it's just, 605 00:31:42,800 --> 00:31:45,040 Speaker 4: there's lots of little things saying, oh, it's not the 606 00:31:45,080 --> 00:31:48,160 Speaker 4: third leading cause of death. There's a bit of whatever. 607 00:31:48,200 --> 00:31:50,880 Speaker 4: I can't find anything clear on it. But on it, 608 00:31:50,880 --> 00:31:54,120 Speaker 4: can I just add the idea that when you were talking 609 00:31:54,160 --> 00:31:56,920 Speaker 4: before and you say the algorithm's designed to get us hooked, 610 00:31:57,680 --> 00:32:01,719 Speaker 4: but so is exercise. Like, exercise is designed to get 611 00:32:01,760 --> 00:32:04,680 Speaker 4: us hooked. But no one that's run every day for 612 00:32:04,760 --> 00:32:08,400 Speaker 4: ten years goes, it's not my fault. They go, I'm 613 00:32:08,400 --> 00:32:12,240 Speaker 4: so dedicated. Like, it's the story we tell ourselves about 614 00:32:12,320 --> 00:32:13,880 Speaker 4: the responsibility we take.
615 00:32:15,360 --> 00:32:18,320 Speaker 2: Yeah, I reckon that's such a good point, Tiff, actually, 616 00:32:18,320 --> 00:32:22,200 Speaker 2: it really is an amazingly good point, because someone told 617 00:32:22,200 --> 00:32:25,160 Speaker 2: me a few years ago that it takes what, sixty 618 00:32:25,200 --> 00:32:26,920 Speaker 2: two days to develop. 619 00:32:26,640 --> 00:32:30,040 Speaker 3: A habit. Craig'll love that. Yeah, I go, is it 620 00:32:30,080 --> 00:32:32,040 Speaker 3: about right? About sixty-two days or so? 621 00:32:32,480 --> 00:32:37,240 Speaker 2: But to build in a routine that then becomes something 622 00:32:37,320 --> 00:32:41,160 Speaker 2: that we are more likely to continue with. It's along 623 00:32:41,200 --> 00:32:42,960 Speaker 2: those lines, though, isn't it? Am I kind of right 624 00:32:43,000 --> 00:32:44,120 Speaker 2: there, or am I way off the mark? 625 00:32:45,360 --> 00:32:47,760 Speaker 1: Well, I mean, there's a lot of variables around that, mate, 626 00:32:47,800 --> 00:32:51,600 Speaker 1: because it's like, well, there's heroin, and there's chocolate, you 627 00:32:51,640 --> 00:32:55,800 Speaker 1: know what I mean, and there's running, there's recreational running, 628 00:32:56,360 --> 00:33:01,320 Speaker 1: you know, and there's cocaine and crack, you know. So, 629 00:33:02,280 --> 00:33:06,240 Speaker 1: and also, there's some people, about five to ten percent according 630 00:33:06,280 --> 00:33:08,520 Speaker 1: to Mick Hall, who's been on here a lot, who 631 00:33:09,160 --> 00:33:11,720 Speaker 1: you know is an addiction treatment specialist, and you both 632 00:33:11,800 --> 00:33:17,560 Speaker 1: know who he is, and, quite a lot of, check 633 00:33:17,600 --> 00:33:20,560 Speaker 1: it for yourselves, everyone, but reasonable research says five to 634 00:33:20,600 --> 00:33:22,680 Speaker 1: ten percent of the population.
That's a lot of people, 635 00:33:22,960 --> 00:33:28,040 Speaker 1: are genetically more likely to become addicts. I'm looking at 636 00:33:28,080 --> 00:33:30,360 Speaker 1: you across the screen because I've moved you, because I'm 637 00:33:30,360 --> 00:33:32,760 Speaker 1: doing some other research, so I'm actually looking at you. 638 00:33:32,960 --> 00:33:34,320 Speaker 3: I was more thinking, Craigo. 639 00:33:34,640 --> 00:33:37,440 Speaker 2: You know, I went back to the gym this 640 00:33:37,520 --> 00:33:40,280 Speaker 2: year after having a long break because of injury, and 641 00:33:41,360 --> 00:33:43,360 Speaker 2: I found it didn't take me long to get 642 00:33:43,400 --> 00:33:45,840 Speaker 2: back into the routine, where when I don't do it, 643 00:33:45,880 --> 00:33:46,560 Speaker 2: I miss it. 644 00:33:46,800 --> 00:33:48,560 Speaker 1: I'm only talking days a week. 645 00:33:48,920 --> 00:33:52,400 Speaker 2: But today, for example, we were doing the podcast early, 646 00:33:52,840 --> 00:33:55,520 Speaker 2: so I've made plans, as soon as the podcast finishes, to 647 00:33:55,560 --> 00:33:57,320 Speaker 2: go to the gym so I can, I can do 648 00:33:57,440 --> 00:34:00,480 Speaker 2: my workout, because I don't want to miss today. Now, 649 00:34:00,520 --> 00:34:03,800 Speaker 2: something in my brain has told me that I'm going 650 00:34:03,840 --> 00:34:04,360 Speaker 2: to miss. 651 00:34:04,200 --> 00:34:04,920 Speaker 3: Out on something. 652 00:34:05,040 --> 00:34:08,080 Speaker 2: I, as a person, will feel a sense of loss 653 00:34:08,239 --> 00:34:09,880 Speaker 2: if I don't go to the gym today. 654 00:34:10,000 --> 00:34:13,080 Speaker 3: Now, what that says about me? It probably says a lot.
655 00:34:13,320 --> 00:34:15,960 Speaker 2: But I feel like I'm missing out on something, and 656 00:34:16,000 --> 00:34:19,400 Speaker 2: it takes time to develop that sense of loss, or 657 00:34:19,440 --> 00:34:22,480 Speaker 2: that sense of need to do that, you know, the 658 00:34:22,480 --> 00:34:25,399 Speaker 2: compulsion to want to go to the gym, because that's 659 00:34:25,480 --> 00:34:27,960 Speaker 2: out of my normal routine. I'm normally there at six, 660 00:34:28,239 --> 00:34:30,080 Speaker 2: I see a few familiar faces, I do what I 661 00:34:30,120 --> 00:34:32,200 Speaker 2: need to do, but I feel like if I'm not 662 00:34:32,239 --> 00:34:33,399 Speaker 2: doing it, I'm missing out. 663 00:34:34,400 --> 00:34:37,680 Speaker 1: Yeah, yeah, I get it, but I guess, I mean, 664 00:34:37,760 --> 00:34:40,880 Speaker 1: this opens the door to an interesting question. What 665 00:34:40,920 --> 00:34:42,839 Speaker 1: are your thoughts on, is there such a thing as 666 00:34:42,880 --> 00:34:44,320 Speaker 1: a healthy addiction? 667 00:34:45,360 --> 00:34:47,920 Speaker 2: Well, yeah, I guess what I'm talking about right now 668 00:34:48,040 --> 00:34:51,719 Speaker 2: is a healthy addiction, because ultimately, me going to the 669 00:34:51,760 --> 00:34:54,320 Speaker 2: gym and doing about forty-five minutes, I do my cardio, 670 00:34:55,000 --> 00:34:58,360 Speaker 2: I do weights, and I feel better for it. The 671 00:34:58,440 --> 00:35:01,279 Speaker 2: endorphins get kicked over, but I also know that 672 00:35:01,320 --> 00:35:04,279 Speaker 2: I'm making myself better. I do balance, and all the 673 00:35:04,320 --> 00:35:07,080 Speaker 2: things that I'm accomplishing in the gym are things that 674 00:35:07,120 --> 00:35:09,239 Speaker 2: are going to value-add to my longevity. 675 00:35:09,560 --> 00:35:12,200 Speaker 3: You know, I look at my... Tiff, you'll love this.
676 00:35:12,360 --> 00:35:15,320 Speaker 2: I look at my heart rate and I can see 677 00:35:15,360 --> 00:35:18,440 Speaker 2: the spike when I'm doing cardio, you know, because I 678 00:35:18,480 --> 00:35:21,760 Speaker 2: have a relatively low resting heart rate of about fifty 679 00:35:21,760 --> 00:35:25,360 Speaker 2: two or so, which is pretty, pretty low-ish. But 680 00:35:25,440 --> 00:35:26,840 Speaker 2: I love it when I push it up to one 681 00:35:26,920 --> 00:35:30,000 Speaker 2: hundred and fifty, you know, when I'm one hundred beats faster, 682 00:35:30,360 --> 00:35:32,080 Speaker 2: and I know I can get that on the rower, 683 00:35:32,239 --> 00:35:34,440 Speaker 2: and I know when I'm listening to, you know, heavy 684 00:35:34,480 --> 00:35:37,560 Speaker 2: metal or something while I'm rowing, I can get my... 685 00:35:37,880 --> 00:35:40,680 Speaker 2: and I always take a photo of the 686 00:35:40,719 --> 00:35:43,719 Speaker 2: little screen when I'm finished, so I can compare to 687 00:35:43,760 --> 00:35:45,840 Speaker 2: the last few times I've been on the rower, because 688 00:35:45,840 --> 00:35:47,719 Speaker 2: I just know, you know, on a day, you know, 689 00:35:47,719 --> 00:35:49,640 Speaker 2: you get to the halfway mark and I'm thinking, I'm 690 00:35:49,640 --> 00:35:50,160 Speaker 2: actually there, 691 00:35:50,200 --> 00:35:51,720 Speaker 3: I reckon I can beat my best time. 692 00:35:52,120 --> 00:35:54,360 Speaker 2: So those sorts of things are the little bells and 693 00:35:54,360 --> 00:35:56,640 Speaker 2: whistles that go off in your head that cause you 694 00:35:56,680 --> 00:35:58,560 Speaker 2: to want to strive to go that little bit further, 695 00:35:58,760 --> 00:35:59,680 Speaker 2: faster, or harder. 696 00:36:00,960 --> 00:36:03,520 Speaker 1: Let me just read something I just grabbed.
Medical errors 697 00:36:04,480 --> 00:36:07,120 Speaker 1: are a significant cause of death globally and within specific 698 00:36:07,160 --> 00:36:09,920 Speaker 1: countries like the US, Australia, and the UK. While precise 699 00:36:09,960 --> 00:36:14,440 Speaker 1: figures vary, studies suggest that a substantial number of 700 00:36:14,440 --> 00:36:17,960 Speaker 1: deaths are linked to preventable medical errors. The World Health 701 00:36:18,040 --> 00:36:22,759 Speaker 1: Organization estimates that approximately two point six million people die 702 00:36:22,800 --> 00:36:26,320 Speaker 1: each year due to unsafe care. In the US, estimates 703 00:36:26,440 --> 00:36:29,040 Speaker 1: range from two hundred and fifty thousand to four hundred 704 00:36:29,040 --> 00:36:34,000 Speaker 1: and forty thousand deaths annually due to medical errors, with 705 00:36:34,160 --> 00:36:37,960 Speaker 1: some studies placing it as the third leading cause of death. 706 00:36:38,280 --> 00:36:40,640 Speaker 1: But this is like, Tiff looks it up and it gets 707 00:36:40,680 --> 00:36:43,840 Speaker 1: different results. This is the problem. But anyway, the bottom 708 00:36:43,840 --> 00:36:46,480 Speaker 1: line is, similarly, Australia experiences a significant number of 709 00:36:46,560 --> 00:36:50,760 Speaker 1: deaths related to medical errors, estimates ranging from eighteen thousand 710 00:36:50,760 --> 00:36:53,759 Speaker 1: to fifty-four thousand deaths a year in Australia from 711 00:36:53,840 --> 00:36:58,520 Speaker 1: medical errors. Now, that's the World Health Organization. That's not 712 00:36:58,680 --> 00:37:00,880 Speaker 1: old Brian out in the back of the shed in 713 00:37:00,960 --> 00:37:04,719 Speaker 1: his tin hat with his fucking conspiracy theory shit going on, right? 714 00:37:05,000 --> 00:37:08,759 Speaker 1: This is WHO.
So you know, it's like, and I'm 715 00:37:08,760 --> 00:37:11,680 Speaker 1: not saying that, and I'm not even throwing the medical 716 00:37:11,719 --> 00:37:14,640 Speaker 1: system under the bus. I'm so grateful for doctors and 717 00:37:14,760 --> 00:37:18,680 Speaker 1: hospitals and nurses and all the beautiful, amazing people. But 718 00:37:18,800 --> 00:37:23,080 Speaker 1: I think, like, we have to be, we have to 719 00:37:23,120 --> 00:37:27,239 Speaker 1: be, like, informed. And I'm not saying that what 720 00:37:27,280 --> 00:37:30,600 Speaker 1: I just read is infallible. There could be a mistake 721 00:37:30,640 --> 00:37:32,880 Speaker 1: as well, but that's literally what I'm reading. That's WHO, 722 00:37:33,440 --> 00:37:38,680 Speaker 1: World Health Organization statistics. But you know, we need to 723 00:37:38,719 --> 00:37:42,319 Speaker 1: be informed and speak from a point of, you know, 724 00:37:42,520 --> 00:37:45,359 Speaker 1: kind of accurate data, because all the rest is just 725 00:37:45,440 --> 00:37:46,760 Speaker 1: opinion and fucking emotion. 726 00:37:46,880 --> 00:37:50,160 Speaker 2: I think sometimes in Australia at the moment, I was 727 00:37:50,200 --> 00:37:54,080 Speaker 2: talking to my GP about this a couple of days ago. Currently, 728 00:37:54,160 --> 00:37:57,719 Speaker 2: a lot of doctors are using AI scribes to 729 00:37:57,880 --> 00:38:01,759 Speaker 2: recall conversations, and we were having a chat about it 730 00:38:01,840 --> 00:38:05,200 Speaker 2: because he isn't using it presently. But I really like 731 00:38:05,280 --> 00:38:08,880 Speaker 2: my doctor, who is very, I feel he's very proactive 732 00:38:09,080 --> 00:38:11,880 Speaker 2: on the health side of things. He'll always spend the 733 00:38:11,880 --> 00:38:13,560 Speaker 2: extra time.
You know, sometimes you go to a GP 734 00:38:13,640 --> 00:38:15,680 Speaker 2: and it's like they tick a box and send you out, 735 00:38:15,760 --> 00:38:18,200 Speaker 2: whereas, you know, my guy's up for a long chat 736 00:38:18,480 --> 00:38:20,239 Speaker 2: and we talk about lots of different things, and I 737 00:38:20,360 --> 00:38:22,960 Speaker 2: like that, with someone who takes a personal interest. And 738 00:38:23,960 --> 00:38:26,040 Speaker 2: so in Australia at the moment, a lot of doctors 739 00:38:26,200 --> 00:38:30,560 Speaker 2: now are using AI to record what they say to 740 00:38:30,640 --> 00:38:34,200 Speaker 2: you, and then, but this has been used for decades, 741 00:38:34,560 --> 00:38:37,400 Speaker 2: but what AI is doing with it, so you can record 742 00:38:37,400 --> 00:38:41,160 Speaker 2: a whole conversation, but now they're employing AI to truncate 743 00:38:41,239 --> 00:38:43,200 Speaker 2: that conversation, break it down, 744 00:38:43,239 --> 00:38:45,399 Speaker 3: and to pull out the highlights. 745 00:38:45,920 --> 00:38:49,040 Speaker 2: And that's what they're doing that's different from previously 746 00:38:49,120 --> 00:38:53,640 Speaker 2: recorded conversations. They're now able to cherry-pick the content 747 00:38:53,880 --> 00:38:56,880 Speaker 2: and give you the relevant information, and a lot of 748 00:38:56,880 --> 00:38:59,480 Speaker 2: people are kind of, I was reading this article, and 749 00:38:59,480 --> 00:39:01,800 Speaker 2: it might have been in The Guardian, I think, where 750 00:39:02,160 --> 00:39:05,800 Speaker 2: there were the pros and cons, should you allow yourself 751 00:39:05,840 --> 00:39:08,560 Speaker 2: to be recorded and that conversation then to be saved. 752 00:39:08,840 --> 00:39:11,759 Speaker 2: And the reality of it is our health records are accessible, 753 00:39:11,800 --> 00:39:13,799 Speaker 2: and I think that a lot of people don't realize that.
754 00:39:14,120 --> 00:39:16,200 Speaker 2: So if you do have a conversation with your GP, 755 00:39:16,320 --> 00:39:19,360 Speaker 2: everything that goes down into your health record you should 756 00:39:19,360 --> 00:39:21,000 Speaker 2: be able to access and have a look at. But 757 00:39:21,040 --> 00:39:24,040 Speaker 2: I like the idea, because when you're dealing with a 758 00:39:24,120 --> 00:39:27,080 Speaker 2: medical issue, I think particularly if it's something you're a 759 00:39:27,120 --> 00:39:29,960 Speaker 2: bit frightened about. And I'm happy to be upfront about this. 760 00:39:30,000 --> 00:39:33,239 Speaker 2: I've been getting some tests recently because my identical twin 761 00:39:33,280 --> 00:39:36,319 Speaker 2: brother had prostate cancer last year, and so we've been 762 00:39:36,320 --> 00:39:39,520 Speaker 2: doing some scans to do a benchmark so that I 763 00:39:39,600 --> 00:39:42,160 Speaker 2: can make sure that I know what I look like 764 00:39:42,280 --> 00:39:44,760 Speaker 2: now, and then that way, if something happens in twelve 765 00:39:44,840 --> 00:39:48,399 Speaker 2: months' time, we have a comparative scan to look at 766 00:39:48,400 --> 00:39:50,840 Speaker 2: to say, has there been a change? Because I'm scared. 767 00:39:51,120 --> 00:39:55,160 Speaker 2: You know, I am genetically identical to my brother, and 768 00:39:55,200 --> 00:39:58,200 Speaker 2: the only differences, I guess, have been our lifestyles. But 769 00:39:58,239 --> 00:40:00,920 Speaker 2: the fact that at an age which is under sixty, to 770 00:40:01,000 --> 00:40:04,120 Speaker 2: have prostate cancer is not good, because it then means 771 00:40:04,160 --> 00:40:06,799 Speaker 2: that siblings are at a much higher risk, so I'm 772 00:40:06,800 --> 00:40:07,520 Speaker 2: mindful of that. 773 00:40:07,640 --> 00:40:08,640 Speaker 3: And that's why having a.
774 00:40:08,560 --> 00:40:11,240 Speaker 2: good GP who's prepared to do preventive, or at least, 775 00:40:11,440 --> 00:40:14,359 Speaker 2: you know, put actions in place to make sure there are 776 00:40:14,360 --> 00:40:17,000 Speaker 2: baselines that we can refer back to is really important. 777 00:40:17,160 --> 00:40:20,000 Speaker 2: And so yeah, I'd be happy to have everything we 778 00:40:20,040 --> 00:40:24,000 Speaker 2: say taken down by a scribe that then is AI 779 00:40:24,719 --> 00:40:26,000 Speaker 2: and everything's remembered. 780 00:40:26,000 --> 00:40:28,280 Speaker 3: What about you, Tiff? Would you be happy 781 00:40:28,080 --> 00:40:30,840 Speaker 2: to go to a consultation or to sit down with 782 00:40:30,880 --> 00:40:34,000 Speaker 2: a therapist or something and have everything recorded? Yeah? 783 00:40:34,080 --> 00:40:41,279 Speaker 1: Yeah, absolutely, Craigo. Yeah, definitely. I think what you're 784 00:40:41,400 --> 00:40:44,720 Speaker 1: talking about is also a really good idea, because it's 785 00:40:44,800 --> 00:40:47,560 Speaker 1: the integration of a human that you like and know 786 00:40:47,680 --> 00:40:50,920 Speaker 1: and have a relationship with and trust, right, and then 787 00:40:51,480 --> 00:40:54,239 Speaker 1: he or she doesn't need to be jotting everything, like, 788 00:40:54,360 --> 00:40:57,920 Speaker 1: if you've got something else taking notes, it's so much easier. 789 00:40:57,920 --> 00:41:00,879 Speaker 1: Then they can spend more time and energy on you, 790 00:41:01,120 --> 00:41:05,200 Speaker 1: which, that's a nice symbiosis. Do you like that word, Tiff? 791 00:41:05,680 --> 00:41:09,839 Speaker 1: That's a very good word. Might adopt that. You're welcome.
Hey, mate, 792 00:41:10,520 --> 00:41:12,400 Speaker 1: I realized we don't have that long to go, and 793 00:41:12,440 --> 00:41:14,200 Speaker 1: we've done none of your lists, so I'm going to 794 00:41:14,239 --> 00:41:17,120 Speaker 1: be bold and fucking take the reins back, because I. 795 00:41:18,080 --> 00:41:20,600 Speaker 3: Told you, didn't I. 796 00:41:20,760 --> 00:41:24,359 Speaker 1: No, you're exactly right. You know, well, we've only gone 797 00:41:24,400 --> 00:41:27,160 Speaker 1: through one thing on your list. No we haven't. Don't 798 00:41:27,160 --> 00:41:29,640 Speaker 1: you see what I've been doing, Craigo. I've been weaving 799 00:41:30,280 --> 00:41:33,280 Speaker 1: throughout the list. We've used three stories from my list. 800 00:41:33,320 --> 00:41:35,200 Speaker 1: But I'm just weaving through that conversation. 801 00:41:35,960 --> 00:41:39,880 Speaker 2: That's right, buddy, because that was going to, it's a memory, 802 00:41:40,080 --> 00:41:42,799 Speaker 2: see, the next, like, the adjunct to that. 803 00:41:43,120 --> 00:41:43,319 Speaker 3: Right. 804 00:41:43,440 --> 00:41:45,839 Speaker 2: The little synergy to what we were talking about was, 805 00:41:46,080 --> 00:41:48,560 Speaker 2: you know, the reality is, if you're a GP and 806 00:41:48,600 --> 00:41:52,200 Speaker 2: you're focusing on the person, sometimes if you're taking notes, 807 00:41:52,320 --> 00:41:54,799 Speaker 2: it can be disjointed, whereas if you're recording it. 808 00:41:55,000 --> 00:41:58,279 Speaker 2: And memory is an interesting thing, and scientists have been 809 00:41:58,320 --> 00:42:02,200 Speaker 2: doing what they're calling a time travel memory hack to 810 00:42:02,320 --> 00:42:06,719 Speaker 2: help people remember stuff. So what they found is emotions 811 00:42:06,719 --> 00:42:10,440 Speaker 2: are closely linked to memory.
So this is specifically going 812 00:42:10,480 --> 00:42:14,560 Speaker 2: to benefit learning, as opposed to just general memories of events. 813 00:42:14,920 --> 00:42:16,520 Speaker 3: So this is focused on study. 814 00:42:17,040 --> 00:42:19,720 Speaker 2: So if you're a student or a learner or whatever, 815 00:42:19,800 --> 00:42:23,879 Speaker 2: but you're learning something new, what they're suggesting is that 816 00:42:24,360 --> 00:42:28,080 Speaker 2: you review what you've just done, but don't think about 817 00:42:28,120 --> 00:42:30,560 Speaker 2: the action, think about the emotions, so where you were 818 00:42:30,600 --> 00:42:35,040 Speaker 2: at the time, and by reinforcing the emotions surrounding that, 819 00:42:35,640 --> 00:42:38,799 Speaker 2: it actually helps us retain a much higher degree of 820 00:42:38,880 --> 00:42:43,040 Speaker 2: memory and can actually bring it back and strengthen that memory. 821 00:42:43,080 --> 00:42:45,680 Speaker 2: So when you think of memory as, kind of, I guess, 822 00:42:46,080 --> 00:42:49,120 Speaker 2: a snowball rolling down a hill, and it kind of 823 00:42:49,160 --> 00:42:52,719 Speaker 2: gains momentum, you can slow that down by looking at 824 00:42:52,760 --> 00:42:54,440 Speaker 2: the emotions related 825 00:42:54,040 --> 00:42:54,800 Speaker 3: to that memory. 826 00:42:54,800 --> 00:42:57,359 Speaker 2: So I just thought that was kind of interesting, that, 827 00:42:57,480 --> 00:43:00,360 Speaker 2: you know, for each of us, you know, learning is 828 00:43:00,400 --> 00:43:05,279 Speaker 2: difficult and remembering stuff is tough, and so these new 829 00:43:05,280 --> 00:43:07,719 Speaker 2: ways to kind of hack memory and think about the 830 00:43:07,760 --> 00:43:10,960 Speaker 2: way and the emotions, I think it's fascinating. And you know, 831 00:43:11,080 --> 00:43:15,319 Speaker 2: I know smell is associated with memory.
Every night, most 832 00:43:15,400 --> 00:43:18,279 Speaker 2: nights a week, I have a little oil burner and 833 00:43:18,600 --> 00:43:22,600 Speaker 2: I put a scented oil in, a peppermint-scented 834 00:43:22,280 --> 00:43:24,960 Speaker 3: oil, because there was some research that was done 835 00:43:24,800 --> 00:43:29,320 Speaker 2: that said that that particular carrier oil can trigger memory, 836 00:43:29,840 --> 00:43:33,120 Speaker 2: because smell is very closely associated with memory as well. 837 00:43:33,160 --> 00:43:35,480 Speaker 2: So if you have one of those little oil diffusers 838 00:43:35,520 --> 00:43:37,040 Speaker 2: and you set it for two hours when you go 839 00:43:37,120 --> 00:43:40,040 Speaker 2: to bed, it can actually help with memory retention. 840 00:43:41,120 --> 00:43:45,839 Speaker 1: Yeah, that's so true. Memory, like, smell, like, I think 841 00:43:45,840 --> 00:43:48,000 Speaker 1: it's because I have a one point five nose. I 842 00:43:48,040 --> 00:43:53,319 Speaker 1: can smell shit two suburbs away, right. And, and yeah, yeah, 843 00:43:53,400 --> 00:43:55,480 Speaker 1: I know I'm not pretty, everyone. I don't know if 844 00:43:55,520 --> 00:44:00,400 Speaker 1: you've ever seen me, but fucking don't look too close. Patrick, 845 00:44:00,440 --> 00:44:03,600 Speaker 1: on the other hand, Brad Pitt. Prostate a bit dodgy, but 846 00:44:03,760 --> 00:44:11,160 Speaker 1: face fucking beautiful. And Tiff, more jacked than both of us. 847 00:44:13,000 --> 00:44:15,680 Speaker 1: But I don't know, this, like, this is a, I 848 00:44:15,719 --> 00:44:17,799 Speaker 1: have good ones and negative ones, but a negative one. 849 00:44:17,960 --> 00:44:20,560 Speaker 1: You know that stuff we used to eat, Patrick, when 850 00:44:20,560 --> 00:44:21,920 Speaker 1: we were kids, called Strasbourg? 851 00:44:22,840 --> 00:44:26,879 Speaker 3: Oh yeah, god yeah, oh fuck, right, that, Tiff.
852 00:44:26,920 --> 00:44:28,080 Speaker 1: Do you know what that is? 853 00:44:28,080 --> 00:44:29,680 Speaker 4: Is that that, like, devon meat? 854 00:44:30,080 --> 00:44:32,799 Speaker 1: Yeah, it's just like fucking goop in a tube. I 855 00:44:32,800 --> 00:44:34,719 Speaker 1: don't know what, I don't know what it is, but 856 00:44:34,800 --> 00:44:39,240 Speaker 1: it's like, I think it's, you know, sphincters and livers 857 00:44:39,320 --> 00:44:42,960 Speaker 1: and just put through a grinder. I don't know what 858 00:44:43,000 --> 00:44:47,839 Speaker 1: it is, right. Yeah, yeah, yeah, yeah, yeah. Anyway, when 859 00:44:47,880 --> 00:44:49,360 Speaker 1: I was a kid, I was about eight, and I 860 00:44:49,440 --> 00:44:51,840 Speaker 1: used to fucking inhale that shit with sauce and white 861 00:44:51,840 --> 00:44:55,120 Speaker 1: bread, because I was a pig, and I was a 862 00:44:55,120 --> 00:44:58,480 Speaker 1: little pig. That was before I got to the cheesecake. Nom, nom, nom, 863 00:44:58,480 --> 00:45:01,120 Speaker 1: get out of my way. I was very hard to 864 00:45:01,160 --> 00:45:03,360 Speaker 1: distract when I was eating, very hard to get my 865 00:45:03,400 --> 00:45:08,319 Speaker 1: attention anyway. So I ate about two pounds of that, 866 00:45:08,440 --> 00:45:13,000 Speaker 1: then was violently ill. And now, obviously, I've 867 00:45:13,000 --> 00:45:16,959 Speaker 1: never eaten it or touched it since. But even now, if 868 00:45:17,000 --> 00:45:20,280 Speaker 1: I see it, if I see that, or what looks 869 00:45:20,440 --> 00:45:23,440 Speaker 1: like that, similar to that, I almost gag. And if 870 00:45:23,440 --> 00:45:26,640 Speaker 1: I smell it or something like it, I could pretty 871 00:45:26,680 --> 00:45:30,040 Speaker 1: much vomit on cue. It's just, and I'd never think 872 00:45:30,080 --> 00:45:32,920 Speaker 1: about it unless I see it or smell it.
And 873 00:45:32,960 --> 00:45:37,480 Speaker 1: we're talking literally fifty years down the track. There's something 874 00:45:37,520 --> 00:45:41,360 Speaker 1: built into my brain and my physiology, so that 875 00:45:41,480 --> 00:45:45,960 Speaker 1: is very true, Patrick. There are emotions and smells, and 876 00:45:46,080 --> 00:45:50,600 Speaker 1: stories also, stories are, kind of, you know, experiences that 877 00:45:50,719 --> 00:45:52,160 Speaker 1: tie us into certain memories. 878 00:45:52,320 --> 00:45:55,680 Speaker 3: I'm just picturing Mary unwrapping the Strasbourg from the paper 879 00:45:55,680 --> 00:45:56,480 Speaker 3: wrap and 880 00:45:56,480 --> 00:46:02,040 Speaker 4: saying one piece won't hurt, shoving her hand in the wrap, 881 00:46:02,280 --> 00:46:04,160 Speaker 4: and then coming back and counting to see if she 882 00:46:04,120 --> 00:46:05,400 Speaker 4: still had all five fingers. 883 00:46:06,719 --> 00:46:08,720 Speaker 1: Well, she used to just put it in the bowl 884 00:46:08,760 --> 00:46:10,600 Speaker 1: on the floor when me and the dog would eat 885 00:46:10,680 --> 00:46:14,360 Speaker 1: side by side, so I always beat the dog. So 886 00:46:15,400 --> 00:46:21,040 Speaker 1: fuck you, dog. Patrick, tell us about robot hands building 887 00:46:21,080 --> 00:46:24,240 Speaker 1: pizzas, speaking of food. Oh dear, we 888 00:46:24,160 --> 00:46:26,839 Speaker 3: had this conversation during the week by text, Tiff. 889 00:46:28,480 --> 00:46:32,080 Speaker 2: So they've developed, this is for people who actually have disabilities, 890 00:46:32,480 --> 00:46:35,000 Speaker 2: is primarily who they're focused on, and what it is, 891 00:46:35,040 --> 00:46:38,400 Speaker 2: it's a robotic assistant. One of the difficulties with robots 892 00:46:38,480 --> 00:46:40,080 Speaker 2: is they use them in production all the time, so 893 00:46:40,160 --> 00:46:43,920 Speaker 2: car manufacturers use them.
But the thing is, a robot 894 00:46:44,040 --> 00:46:46,920 Speaker 2: arm that's designed to pick up a heavy object is 895 00:46:47,120 --> 00:46:49,719 Speaker 2: very different than, say, a robot arm that's picking up 896 00:46:49,760 --> 00:46:52,480 Speaker 2: little grains of cheese that have fallen. You know, so 897 00:46:52,520 --> 00:46:54,640 Speaker 2: you've dropped pieces of cheese and it's picking up the 898 00:46:54,680 --> 00:46:58,600 Speaker 2: little... The dexterity is really different. And so they've got 899 00:46:58,640 --> 00:47:01,160 Speaker 2: this robot that has the strength to be able to 900 00:47:01,160 --> 00:47:04,120 Speaker 2: pick up large objects, say a pot of boiling water, 901 00:47:04,520 --> 00:47:07,560 Speaker 2: but it can also be very delicate, because it has 902 00:47:08,040 --> 00:47:11,560 Speaker 2: soft fingertips to be able to sense the pressure. And 903 00:47:12,040 --> 00:47:15,760 Speaker 2: so this new robotic arm, or new robotic hand, 904 00:47:15,760 --> 00:47:19,799 Speaker 2: can be used to help people with disabilities, to give 905 00:47:19,840 --> 00:47:20,839 Speaker 2: them more independence. 906 00:47:21,280 --> 00:47:28,600 Speaker 1: Craig, I will say, Tiff, I, Tiff, Tiff, it wasn't me. 907 00:47:29,239 --> 00:47:32,400 Speaker 1: Patrick did suggest that he might get one for his bedroom. 908 00:47:32,640 --> 00:47:34,840 Speaker 1: I did not know what he was talking about. No, 909 00:47:35,160 --> 00:47:39,200 Speaker 1: did not know. Hey, don't bring up your fucking phone, everyone, 910 00:47:39,320 --> 00:47:45,600 Speaker 1: that's it. That's another episode of The You Project sorted, Tiff. Ah, 911 00:47:46,400 --> 00:47:49,680 Speaker 1: don't let facts get in the way, Patrick. Have we not? 912 00:47:50,040 --> 00:47:51,680 Speaker 1: I think I'm contradicting myself. 913 00:47:51,680 --> 00:47:53,040 Speaker 3: I sent him this article.
914 00:47:53,200 --> 00:47:56,160 Speaker 2: So it's a robot. "AI robot arm builds meals and 915 00:47:56,239 --> 00:47:59,040 Speaker 2: helps users with limited mobility." So I sent it off 916 00:47:59,080 --> 00:48:01,120 Speaker 2: to Craigo and I said, hey, maybe you could get 917 00:48:01,160 --> 00:48:04,759 Speaker 2: one for your bedroom. His reply was, if they build 918 00:48:04,960 --> 00:48:06,160 Speaker 2: one with a mouth. 919 00:48:08,200 --> 00:48:09,160 Speaker 1: I did not say that. 920 00:48:10,080 --> 00:48:13,360 Speaker 2: My reply was, I'm sure it's on the way. Maybe 921 00:48:13,400 --> 00:48:18,680 Speaker 2: we'll get a discount for two. And Craig's reply: fingers crossed. 922 00:48:19,360 --> 00:48:23,279 Speaker 1: That, that's a complete lie, everyone. I wouldn't participate in any 923 00:48:23,360 --> 00:48:27,439 Speaker 1: kind of conversational transaction like that. I've been misrepresented. I'm 924 00:48:27,480 --> 00:48:32,360 Speaker 1: calling my lawyer. That's fucking, that's, that's terrible, Patrick, that 925 00:48:32,400 --> 00:48:34,600 Speaker 1: you misrepresent me like that to people. 926 00:48:35,280 --> 00:48:39,000 Speaker 2: How about her sending you a screenshot of that WhatsApp private conversation? 927 00:48:39,520 --> 00:48:45,160 Speaker 1: That still doesn't prove anything. Tell us why Delta Airlines 928 00:48:45,320 --> 00:48:50,560 Speaker 1: and others are working with an AI startup that personalizes prices. 929 00:48:50,600 --> 00:48:51,640 Speaker 1: What does that even mean? 930 00:48:51,960 --> 00:48:52,720 Speaker 3: Bastards. 931 00:48:52,960 --> 00:48:57,000 Speaker 2: Sorry. So there's a few senators in the US that 932 00:48:57,040 --> 00:49:00,600 Speaker 2: are having a real look at this, scrutinizing this. Evidently, 933 00:49:00,680 --> 00:49:02,879 Speaker 2: what they're going to do is try to look at 934 00:49:02,920 --> 00:49:07,080 Speaker 2: and profile people and then adjust the prices accordingly.
So 935 00:49:07,280 --> 00:49:10,160 Speaker 2: the idea being, if you could afford to pay more, 936 00:49:10,400 --> 00:49:13,680 Speaker 2: they'll bump up the price for you. Now they're denying this, 937 00:49:14,280 --> 00:49:17,480 Speaker 2: but it's come under scrutiny because they're teaming up 938 00:49:17,480 --> 00:49:22,880 Speaker 2: with this Israeli company that specializes in using an AI 939 00:49:23,040 --> 00:49:27,360 Speaker 2: algorithm to be able to take on people's situations, to 940 00:49:27,440 --> 00:49:31,000 Speaker 2: profile somebody and then adjust the prices according to that 941 00:49:31,080 --> 00:49:36,560 Speaker 2: person's circumstances. So this is, you know, personalized prices, like, 942 00:49:36,640 --> 00:49:38,840 Speaker 2: really rubs me up the wrong way. I think it's 943 00:49:38,880 --> 00:49:41,319 Speaker 2: an awful situation if it looked at the three of 944 00:49:41,400 --> 00:49:43,600 Speaker 2: us and decided what we're going to be paying based 945 00:49:43,640 --> 00:49:48,279 Speaker 2: on whatever it decided. So there's been a 946 00:49:48,360 --> 00:49:51,239 Speaker 2: recent letter to the members of Congress. The company has 947 00:49:51,320 --> 00:49:55,600 Speaker 2: denied using AI tools to, what they say, price gouge customers. 948 00:49:55,800 --> 00:49:59,600 Speaker 2: So that was a report in Reuters. But this Democratic 949 00:49:59,760 --> 00:50:04,440 Speaker 2: Arizona Senator, a lady, I figure it's a lady, 950 00:50:04,520 --> 00:50:06,880 Speaker 2: or maybe it could be a bloke. Anyway, the Senator 951 00:50:06,920 --> 00:50:10,680 Speaker 2: has accused Delta of telling their investors one thing and 952 00:50:10,719 --> 00:50:12,879 Speaker 2: then turning around and telling the public another thing.
953 00:50:12,960 --> 00:50:15,879 Speaker 2: So they're talking it up to their investors, because of course, 954 00:50:15,880 --> 00:50:19,200 Speaker 2: if they can get people to spend more, it's going 955 00:50:19,280 --> 00:50:21,400 Speaker 2: to do good for the bottom line when it comes 956 00:50:21,440 --> 00:50:24,920 Speaker 2: to people who are invested in the airline. But it 957 00:50:25,000 --> 00:50:28,680 Speaker 2: got me thinking, and by coincidence, this week, you know, 958 00:50:28,880 --> 00:50:31,160 Speaker 2: I'm living in a small country town of about three 959 00:50:31,200 --> 00:50:33,719 Speaker 2: thousand people, and there was a little fruit and veg 960 00:50:33,800 --> 00:50:36,680 Speaker 2: shop that sadly closed down, and I used to go 961 00:50:36,719 --> 00:50:38,600 Speaker 2: there all the time. I used to get my watercress 962 00:50:38,600 --> 00:50:41,000 Speaker 2: from there. Now I can't get watercress, not that you 963 00:50:41,040 --> 00:50:44,440 Speaker 2: needed to know that, but it's good in wraps anyway. 964 00:50:44,600 --> 00:50:47,600 Speaker 2: So they closed down about a week and a half, 965 00:50:47,680 --> 00:50:50,279 Speaker 2: two weeks ago, and I happened to be walking through 966 00:50:50,320 --> 00:50:53,680 Speaker 2: the supermarket, and the owner of the supermarket had his 967 00:50:53,719 --> 00:50:55,360 Speaker 2: iPad out and was looking at prices. 968 00:50:55,400 --> 00:50:56,400 Speaker 3: Tiff's pissing herself. 969 00:50:56,640 --> 00:51:00,399 Speaker 2: But so he's looking at prices, and I just turned 970 00:51:00,440 --> 00:51:03,880 Speaker 2: the corner and got into the aisle, and the staff 971 00:51:03,920 --> 00:51:08,759 Speaker 2: member he was talking to said twenty five percent. And 972 00:51:08,840 --> 00:51:11,879 Speaker 2: his reply was, well, the fruit and veg shop's closed down.
973 00:51:13,560 --> 00:51:17,680 Speaker 1: Oh wow, yeah. So, well, that's, yeah, I mean, that's 974 00:51:17,719 --> 00:51:21,920 Speaker 1: how the free market works. We've got a monopoly. Hey, 975 00:51:21,920 --> 00:51:25,279 Speaker 1: I just wanted to ask, I want to circle back 976 00:51:25,320 --> 00:51:27,600 Speaker 1: to one thing, because I want your thoughts and explanation, 977 00:51:27,680 --> 00:51:30,160 Speaker 1: because I don't really get it. The thing I don't 978 00:51:30,239 --> 00:51:32,799 Speaker 1: really get is, so from the end of the year, 979 00:51:32,880 --> 00:51:36,560 Speaker 1: or whenever it is, they're banning kids under sixteen from 980 00:51:36,640 --> 00:51:38,960 Speaker 1: using social media, correct? 981 00:51:38,880 --> 00:51:41,360 Speaker 3: Yes, including YouTube now, and that was a recent 982 00:51:41,520 --> 00:51:43,160 Speaker 3: addition as well, because originally YouTube was going to 983 00:51:43,160 --> 00:51:43,959 Speaker 3: be exempt. 984 00:51:44,239 --> 00:51:48,600 Speaker 1: So is this a law or is this a recommendation? 985 00:51:49,160 --> 00:51:52,480 Speaker 3: It is law, they're enshrining it in law. How on 986 00:51:52,520 --> 00:51:54,279 Speaker 3: earth are they going to police it? 987 00:51:55,239 --> 00:51:57,399 Speaker 1: How on earth do you stop a sixteen year old? 988 00:51:57,600 --> 00:52:00,319 Speaker 2: So what you do is, no, no, it's about putting 989 00:52:00,400 --> 00:52:04,040 Speaker 2: pressure on those social media outlets to be able to 990 00:52:04,239 --> 00:52:07,279 Speaker 2: prove that people are the age that they are. It's 991 00:52:07,280 --> 00:52:10,040 Speaker 2: going to impact all of us. So when you go 992 00:52:10,120 --> 00:52:12,600 Speaker 2: to sign up for a new account, they're going to 993 00:52:12,640 --> 00:52:16,200 Speaker 2: ask for a method of ID to prove it, so the 994 00:52:16,239 --> 00:52:19,120 Speaker 2: onus is going to be on those social media organizations.
995 00:52:19,480 --> 00:52:21,879 Speaker 2: So what we're saying is, if Craig wants to set 996 00:52:21,920 --> 00:52:25,279 Speaker 2: up a Snapchat account or a YouTube account or an 997 00:52:25,280 --> 00:52:28,200 Speaker 2: Instagram account, you have to prove you're over the age 998 00:52:28,239 --> 00:52:28,800 Speaker 2: of sixteen. 999 00:52:30,680 --> 00:52:33,040 Speaker 1: What about, oh wow, you know what I just thought, 1000 00:52:33,080 --> 00:52:37,880 Speaker 1: which you'd probably thought, what about all the thirteen, fourteen, 1001 00:52:38,000 --> 00:52:44,400 Speaker 1: fifteen year olds on, well, TikTok and IG, who have, like, 1002 00:52:44,560 --> 00:52:49,640 Speaker 1: got lots of profile and momentum, and, like, a lot 1003 00:52:49,680 --> 00:52:54,120 Speaker 1: of their, their social kind of interaction is there? 1004 00:52:55,280 --> 00:52:59,279 Speaker 3: Well, I think the interesting thing will be, originally, so, 1005 00:52:59,320 --> 00:53:01,960 Speaker 3: I don't know that all social medias required you to 1006 00:53:02,160 --> 00:53:05,359 Speaker 3: put an age in in the first place. So there's 1007 00:53:05,400 --> 00:53:07,799 Speaker 3: been a scramble. An article I read earlier in the 1008 00:53:07,800 --> 00:53:09,799 Speaker 3: week said there's been a scramble, I think it might've 1009 00:53:09,800 --> 00:53:12,319 Speaker 3: been on the ABC, that a lot of young people 1010 00:53:12,360 --> 00:53:14,160 Speaker 3: who are under the age of sixteen are rushing 1011 00:53:14,200 --> 00:53:16,719 Speaker 3: to get accounts now, before they need to prove that 1012 00:53:16,719 --> 00:53:18,880 Speaker 3: they're over the age of sixteen. So I think if 1013 00:53:18,880 --> 00:53:21,880 Speaker 3: you've already got an existing account, there's an argument that 1014 00:53:21,920 --> 00:53:24,359 Speaker 3: it's an existing account. That's a really, really good question.
1015 00:53:24,480 --> 00:53:27,000 Speaker 2: I don't know that I know the answer to that. Well, 1016 00:53:27,120 --> 00:53:30,319 Speaker 2: I do know, I don't know the answer. I'm not 1017 00:53:30,520 --> 00:53:32,400 Speaker 2: entirely sure, but I figure that if you've got an 1018 00:53:32,400 --> 00:53:35,400 Speaker 2: account already and you've got traction with that account, you 1019 00:53:35,480 --> 00:53:37,600 Speaker 2: haven't needed to prove your age to get it in 1020 00:53:37,640 --> 00:53:40,720 Speaker 2: the first place, so you can't... I don't think it's retrospective. 1021 00:53:40,719 --> 00:53:42,759 Speaker 2: I don't think they're going to roll it back that way. 1022 00:53:42,760 --> 00:53:45,760 Speaker 2: But that's a really good point. I'm not sure, I reckon. 1023 00:53:46,280 --> 00:53:51,680 Speaker 1: What is really an interesting, paradoxical kind of thing is 1024 00:53:51,719 --> 00:53:54,440 Speaker 1: that, so, they're going to force all of these social 1025 00:53:54,520 --> 00:53:59,000 Speaker 1: media platforms, YouTube and the like, would you call YouTube 1026 00:53:59,000 --> 00:54:00,920 Speaker 1: a social media platform, or would you just call it a 1027 00:54:00,920 --> 00:54:05,560 Speaker 1: streaming platform? Yeah, whatever. Anyway, so they're forcing them to 1028 00:54:05,680 --> 00:54:10,320 Speaker 1: create this gateway or whatever it is, or process, to 1029 00:54:10,440 --> 00:54:14,799 Speaker 1: ensure that people are a certain age, and that very 1030 00:54:14,880 --> 00:54:19,439 Speaker 1: process is commercially destructive for them. Like, that just means 1031 00:54:19,480 --> 00:54:22,880 Speaker 1: they're going to have fewer users. So what the government's 1032 00:54:22,960 --> 00:54:25,160 Speaker 1: saying is, you're going to have fewer people using it. 1033 00:54:25,200 --> 00:54:27,440 Speaker 1: You're going to, the bottom line is, the more people using it, 1034 00:54:27,480 --> 00:54:29,880 Speaker 1: the more income.
And they're a business, so they're a 1035 00:54:29,880 --> 00:54:33,600 Speaker 1: profit-driven business, like nearly all businesses, not all 1036 00:54:33,640 --> 00:54:38,200 Speaker 1: but nearly all, and we're making you do this thing 1037 00:54:38,440 --> 00:54:41,480 Speaker 1: that's going to equal, the net result is, you'll make 1038 00:54:41,560 --> 00:54:44,799 Speaker 1: less money because you've got fewer users. I wonder what 1039 00:54:44,840 --> 00:54:48,440 Speaker 1: their hack around it is, because they must be thinking already, how 1040 00:54:48,440 --> 00:54:52,000 Speaker 1: can we beat this? Because they're not going to go, oh, sure, 1041 00:54:52,600 --> 00:54:55,160 Speaker 1: we want thirty percent or forty percent of our 1042 00:54:55,280 --> 00:54:59,279 Speaker 1: users to just jump off, okay, government, we're fine. I 1043 00:54:59,360 --> 00:55:01,040 Speaker 1: wonder what their counter to that is. 1044 00:55:02,200 --> 00:55:06,239 Speaker 2: That's an absolutely amazingly good question. The interesting thing for 1045 00:55:06,280 --> 00:55:08,799 Speaker 2: me is that there were certainly some media outlets saying 1046 00:55:08,920 --> 00:55:12,680 Speaker 2: young people have no desire whatsoever 1047 00:55:12,760 --> 00:55:15,560 Speaker 2: to be on, like, Facebook, and Insta, maybe Instagram a 1048 00:55:15,600 --> 00:55:18,000 Speaker 2: little bit more, but definitely not Facebook. No young person 1049 00:55:18,040 --> 00:55:20,520 Speaker 2: in their right mind wants to be on Facebook. That's 1050 00:55:20,560 --> 00:55:22,719 Speaker 2: just, like, for dinosaurs. 1051 00:55:23,520 --> 00:55:27,320 Speaker 1: Now it is a little bit, isn't it? But remember 1052 00:55:27,360 --> 00:55:29,680 Speaker 1: when it was, like, two thousand and eight, Zuckerberg or 1053 00:55:29,719 --> 00:55:31,440 Speaker 1: whatever it was, it was like, it was the.
1054 00:55:31,400 --> 00:55:34,759 Speaker 2: Way. Oh, because it was originally designed at a university 1055 00:55:34,880 --> 00:55:38,600 Speaker 2: level, for young university students to connect with each other. 1056 00:55:38,640 --> 00:55:42,759 Speaker 2: Zuckerberg obviously had no friends, and that's how he... 1057 00:55:44,600 --> 00:55:47,200 Speaker 1: Well, he's kicked on. He's kicked on, he's done all right, 1058 00:55:47,960 --> 00:55:50,120 Speaker 1: the young loser, he's kicked on. 1059 00:55:50,400 --> 00:55:51,400 Speaker 3: Do you want to be his mate? 1060 00:55:52,360 --> 00:55:54,840 Speaker 1: Would I? I'd be interested to speak to... I would 1061 00:55:54,840 --> 00:55:57,520 Speaker 1: be interested to meet him, and then make that determination. 1062 00:55:57,960 --> 00:56:00,400 Speaker 1: Like, I think, what do you know about anyone? 1063 00:56:00,480 --> 00:56:04,160 Speaker 1: You know, they say, don't meet your heroes. I 1064 00:56:04,200 --> 00:56:08,560 Speaker 1: won't say who. I've met a few people that I really thought, well, 1065 00:56:09,120 --> 00:56:10,960 Speaker 1: I love them, and then I met them and I'm like, 1066 00:56:11,040 --> 00:56:14,200 Speaker 1: oh God, you're a fucking idiot, or you're just so... 1067 00:56:15,040 --> 00:56:17,120 Speaker 1: you know. And then other people that I thought were 1068 00:56:17,160 --> 00:56:20,160 Speaker 1: maybe not someone I'd want to meet, and they 1069 00:56:20,200 --> 00:56:24,200 Speaker 1: were brilliant. I'm like, oh, you are actually great. Hey, 1070 00:56:24,360 --> 00:56:26,839 Speaker 1: who was that you were telling me about?
Like 1071 00:56:26,960 --> 00:56:30,239 Speaker 1: someone on social media that you didn't really warm to, 1072 00:56:31,160 --> 00:56:34,239 Speaker 1: and then you started looking at their 1073 00:56:34,280 --> 00:56:36,600 Speaker 1: content, or knowing a bit more about them, and you 1074 00:56:36,800 --> 00:56:39,560 Speaker 1: kind of did a one eighty. I remember you telling 1075 00:56:39,360 --> 00:56:41,879 Speaker 4: me about it. I know there's 1076 00:56:41,880 --> 00:56:44,160 Speaker 4: been a couple, but I know Layne Norton was 1077 00:56:44,440 --> 00:56:49,280 Speaker 4: like that for me. Two people, from podcasts: Jimmy 1078 00:56:49,320 --> 00:56:53,759 Speaker 4: Carr, now a fan, and Norton as well, because I heard 1079 00:56:53,840 --> 00:56:56,200 Speaker 4: him on a podcast and it changed my 1080 00:56:56,239 --> 00:56:58,799 Speaker 4: perception of his carry-on on socials. 1081 00:56:58,920 --> 00:57:02,440 Speaker 1: Yeah, yeah, yeah. You, Patrick. Let's go, a 1082 00:57:02,480 --> 00:57:04,680 Speaker 1: good one. Have you ever met anyone that you were 1083 00:57:04,719 --> 00:57:08,000 Speaker 1: really surprised how much you liked them and connected with 1084 00:57:08,080 --> 00:57:09,839 Speaker 1: them, when perhaps you thought you might not have? 1085 00:57:10,680 --> 00:57:12,080 Speaker 3: Not so much that I didn't know of, but they 1086 00:57:12,120 --> 00:57:13,760 Speaker 3: were kind of not out of reach. 1087 00:57:13,840 --> 00:57:19,280 Speaker 2: But Bryce Courtenay, you know Bryce, the prolific author. 1088 00:57:19,840 --> 00:57:22,360 Speaker 2: I bumped into him just coming out of the lift 1089 00:57:22,400 --> 00:57:25,240 Speaker 2: when I was working at Triple M and Fox, and 1090 00:57:26,200 --> 00:57:26,840 Speaker 2: someone I was 1091 00:57:26,800 --> 00:57:28,720 Speaker 3: with knew him and introduced me to him.
1092 00:57:28,840 --> 00:57:31,520 Speaker 2: And the interesting thing was he started asking me about 1093 00:57:31,520 --> 00:57:33,080 Speaker 2: what I was doing, you know, because I'd just finished 1094 00:57:33,080 --> 00:57:36,360 Speaker 2: the breakfast shift, and, you know, we got chatting, 1095 00:57:36,520 --> 00:57:39,400 Speaker 2: and by coincidence, I was about to go to the gym, 1096 00:57:39,520 --> 00:57:42,440 Speaker 2: to your gym, to Harper's, and he got talking to me 1097 00:57:42,480 --> 00:57:45,160 Speaker 2: about keeping fit, and he was so interested. I kept 1098 00:57:45,160 --> 00:57:47,760 Speaker 2: trying to ask him questions, but he was asking me 1099 00:57:47,840 --> 00:57:50,520 Speaker 2: lots of questions, and he was such a nice guy. 1100 00:57:50,680 --> 00:57:53,360 Speaker 2: He was really, really nice. There's this funny little... I've 1101 00:57:53,480 --> 00:57:55,080 Speaker 2: got to tell you this story. There was this really 1102 00:57:55,080 --> 00:58:00,120 Speaker 2: great ABC TV show that was interviewing Australia's really famous 1103 00:58:00,320 --> 00:58:05,240 Speaker 2: authors. It was Colleen McCullough, Bryce Courtenay, and I think Matthew Reilly, 1104 00:58:05,280 --> 00:58:07,360 Speaker 2: who... and if you're looking for an action book that 1105 00:58:07,400 --> 00:58:11,360 Speaker 2: you can't put down, Matthew Reilly is amazing. So the 1106 00:58:11,400 --> 00:58:14,640 Speaker 2: three of them were giving this round-table interview, and 1107 00:58:14,720 --> 00:58:18,280 Speaker 2: Matthew kind of said, look, Bryce Courtenay was very 1108 00:58:18,280 --> 00:58:20,320 Speaker 2: well known, he was very well known, for giving 1109 00:58:20,360 --> 00:58:23,520 Speaker 2: away books. He'd give away about two thousand books a year. 1110 00:58:24,000 --> 00:58:26,040 Speaker 2: And the thing he always used to say is, if 1111 00:58:26,080 --> 00:58:29,320 Speaker 2: you give away one book, you've got a fan for life.
1112 00:58:29,720 --> 00:58:34,040 Speaker 2: And so Matthew Reilly said, you know, if you can 1113 00:58:34,120 --> 00:58:37,000 Speaker 2: find a Bryce Courtenay book that isn't signed, it'll be 1114 00:58:37,040 --> 00:58:37,920 Speaker 2: worth a fortune. 1115 00:58:38,960 --> 00:58:40,840 Speaker 1: That's funny, that's funny. 1116 00:58:40,960 --> 00:58:42,760 Speaker 3: It was really cute. It was really cute. 1117 00:58:42,800 --> 00:58:45,080 Speaker 2: And he said, you can tell how famous an author 1118 00:58:45,160 --> 00:58:48,080 Speaker 2: is when the author's name is bigger 1119 00:58:48,080 --> 00:58:49,200 Speaker 2: than the title of the book. 1120 00:58:49,560 --> 00:58:51,360 Speaker 3: And he was taking the piss out of them as well. 1121 00:58:51,520 --> 00:58:55,160 Speaker 2: You know, Colleen McCullough and Bryce Courtenay, you look at any 1122 00:58:55,160 --> 00:58:58,160 Speaker 2: of their books and their name is bigger than the title. 1123 00:58:58,360 --> 00:58:59,200 Speaker 3: It's great, isn't it. 1124 00:58:59,240 --> 00:59:01,760 Speaker 2: So it was a beautiful conversation and really interesting, but 1125 00:59:01,840 --> 00:59:05,360 Speaker 2: he certainly was somebody that blew me away just by 1126 00:59:05,520 --> 00:59:10,360 Speaker 2: how personable he was, how interested he was in me as 1127 00:59:10,400 --> 00:59:13,640 Speaker 2: a person, in that very brief moment that we had 1128 00:59:13,640 --> 00:59:14,360 Speaker 2: that conversation. 1129 00:59:15,320 --> 00:59:18,480 Speaker 3: Yeah. What about you, Craig, have you had that? 1130 00:59:19,880 --> 00:59:22,960 Speaker 1: I mean, I've met lots of people, but just someone 1131 00:59:23,000 --> 00:59:25,440 Speaker 1: that I thought was great, and then I met them 1132 00:59:25,440 --> 00:59:28,440 Speaker 1: and I went, oh, you're even better. It was 1133 00:59:28,440 --> 00:59:33,160 Speaker 1: Bruce McAvaney.
Oh yeah. Like, you know, 1134 00:59:33,160 --> 00:59:35,560 Speaker 1: some people would not know him, but a lot of 1135 00:59:35,560 --> 00:59:38,200 Speaker 1: our listeners would. You know, not just a 1136 00:59:38,200 --> 00:59:40,840 Speaker 1: sports commentator, but just, like, a bit of a savant 1137 00:59:40,960 --> 00:59:46,720 Speaker 1: with, you know, sports history and data and information and 1138 00:59:46,800 --> 00:59:50,000 Speaker 1: everything from track and field to horse racing to footy 1139 00:59:50,080 --> 00:59:53,960 Speaker 1: to swimming. Like, he's just... yeah. But I 1140 00:59:54,000 --> 00:59:58,040 Speaker 1: met him at SEN when I was working there, and 1141 00:59:58,040 --> 01:00:00,360 Speaker 1: you know when people grab your hand and then 1142 01:00:00,400 --> 01:00:02,760 Speaker 1: they grab your forearm and they shake your forearm 1143 01:00:02,800 --> 01:00:05,640 Speaker 1: and your hand at the same time, and he was 1144 01:00:05,840 --> 01:00:10,560 Speaker 1: just, yeah, the nicest dude ever, very engaging. 1145 01:00:10,880 --> 01:00:15,240 Speaker 1: It wasn't, you know, when some people shake 1146 01:00:15,280 --> 01:00:18,640 Speaker 1: your hand and it's almost just like this obligatory social 1147 01:00:18,720 --> 01:00:21,880 Speaker 1: thing that takes four seconds, they're not really present. He 1148 01:00:22,000 --> 01:00:24,480 Speaker 1: was just... yeah. I had that recently. 1149 01:00:24,520 --> 01:00:27,479 Speaker 3: It was a colleague that I'd met. 1150 01:00:27,520 --> 01:00:30,160 Speaker 2: This is a company we do social media for, and 1151 01:00:30,200 --> 01:00:32,720 Speaker 2: the guy that is our main contact, so it's normally 1152 01:00:32,760 --> 01:00:35,880 Speaker 2: by email, but he popped into our office. And he 1153 01:00:35,960 --> 01:00:38,600 Speaker 2: was one of those people, you know.
He shook my 1154 01:00:38,760 --> 01:00:41,560 Speaker 2: hand and he thanked us for the work that we'd 1155 01:00:41,720 --> 01:00:44,800 Speaker 2: been doing. But the shake was that little bit extra, 1156 01:00:45,120 --> 01:00:45,960 Speaker 2: a little bit longer. 1157 01:00:46,080 --> 01:00:46,400 Speaker 3: It was. 1158 01:00:46,480 --> 01:00:49,560 Speaker 2: It was a real sense of connectedness and sincerity. And 1159 01:00:49,600 --> 01:00:51,800 Speaker 2: I think sometimes you can tell in a handshake 1160 01:00:52,120 --> 01:00:56,000 Speaker 2: that the person is genuinely present in that tactile touch. 1161 01:00:56,240 --> 01:00:59,280 Speaker 2: And we didn't shake hands for a long time around COVID, 1162 01:00:59,320 --> 01:01:03,000 Speaker 2: remember? You know, I can remember going to 1163 01:01:03,040 --> 01:01:05,680 Speaker 2: a board meeting. I was on the quality risk committee 1164 01:01:05,680 --> 01:01:08,240 Speaker 2: of our local hospital, and I sat on that committee 1165 01:01:08,280 --> 01:01:10,040 Speaker 2: for quite a few years. And I remember, 1166 01:01:10,040 --> 01:01:13,959 Speaker 2: this is pre-COVID, but going into COVID, when 1167 01:01:14,200 --> 01:01:17,080 Speaker 2: the doctor came in, and I hadn't met him before, 1168 01:01:17,320 --> 01:01:19,400 Speaker 2: and I went to shake his hand and he pulled 1169 01:01:19,440 --> 01:01:21,480 Speaker 2: back and put both his hands up, and it's like, 1170 01:01:21,640 --> 01:01:24,520 Speaker 2: what the hell, you know. I felt really 1171 01:01:24,640 --> 01:01:27,480 Speaker 2: offended, because he didn't want to touch my hand, because 1172 01:01:27,720 --> 01:01:29,720 Speaker 2: I'd never met him before, and the first instinct was 1173 01:01:29,720 --> 01:01:30,320 Speaker 2: to shake hands. 1174 01:01:30,360 --> 01:01:32,240 Speaker 3: And this was literally pre-COVID.
1175 01:01:32,240 --> 01:01:34,960 Speaker 2: There were only the rumors of COVID at the time, 1176 01:01:35,400 --> 01:01:37,880 Speaker 2: but they'd already started to put in place, you know, 1177 01:01:37,920 --> 01:01:40,400 Speaker 2: the protections and all that sort of thing. And obviously 1178 01:01:40,400 --> 01:01:43,320 Speaker 2: tactile contact was out. For so many years, we didn't touch, 1179 01:01:43,640 --> 01:01:46,720 Speaker 2: we didn't hug people, and so now that we seem 1180 01:01:46,760 --> 01:01:50,080 Speaker 2: to have returned to that, where people can physically 1181 01:01:50,280 --> 01:01:53,120 Speaker 2: connect in that way, it is interesting, and a handshake 1182 01:01:53,160 --> 01:01:54,800 Speaker 2: can say a lot about a person, can't it? 1183 01:01:55,520 --> 01:01:58,360 Speaker 1: Oh, certainly. Well, you and I didn't stop spooning all 1184 01:01:58,400 --> 01:02:01,040 Speaker 1: through COVID, so I mean we just took the risk. 1185 01:02:01,160 --> 01:02:04,400 Speaker 1: We went, fuck it, fuck the rules, we're going 1186 01:02:04,480 --> 01:02:07,200 Speaker 1: to spoon. Hey, tell people... 1187 01:02:07,560 --> 01:02:10,000 Speaker 3: We spoon now, we swap. See, I think that's nice too. 1188 01:02:10,920 --> 01:02:16,160 Speaker 1: Yes. Well, that's not hot. I was going to say 1189 01:02:16,200 --> 01:02:19,000 Speaker 1: something hilarious. If you stop and... 1190 01:02:19,000 --> 01:02:21,320 Speaker 3: Tell the whole show you're a legend. 1191 01:02:22,000 --> 01:02:24,240 Speaker 1: Yes, she did. Patrick, tell people how to find you, 1192 01:02:24,280 --> 01:02:26,880 Speaker 1: follow you and connect with you, and potentially spoon with you. 1193 01:02:26,920 --> 01:02:28,040 Speaker 1: If you would, that would be great. 1194 01:02:28,480 --> 01:02:29,920 Speaker 3: I've had no offers yet, Craigo.
1195 01:02:30,040 --> 01:02:33,680 Speaker 2: You say that every time we speak. Websitesnow dot com 1196 01:02:33,720 --> 01:02:36,440 Speaker 2: dot au if you want to talk business-related stuff, 1197 01:02:36,520 --> 01:02:39,640 Speaker 2: like websites and stuff. But if you want to... tai 1198 01:02:39,760 --> 01:02:41,360 Speaker 2: chi is my big passion in life. 1199 01:02:41,360 --> 01:02:42,120 Speaker 3: I love tai chi. 1200 01:02:42,440 --> 01:02:44,280 Speaker 2: And if you want to just do tai chi with me, 1201 01:02:44,520 --> 01:02:46,800 Speaker 2: like watch a video with me and Fritz doing tai chi, 1202 01:02:47,280 --> 01:02:50,520 Speaker 2: just tai chi at home dot com dot au. Just jump in, 1203 01:02:50,960 --> 01:02:51,280 Speaker 2: just enter. 1204 01:02:52,040 --> 01:02:52,600 Speaker 3: I'd love it. 1205 01:02:53,160 --> 01:02:56,760 Speaker 2: What belt is Fritz now? We don't do belts in tai chi. 1206 01:02:56,920 --> 01:02:58,439 Speaker 2: Well, we do, but that's just to stop our 1207 01:02:58,360 --> 01:03:01,760 Speaker 3: pants from falling down. Yeah, no other belts or anything like that. 1208 01:03:01,800 --> 01:03:04,200 Speaker 3: We just do it, you know, you just do it. 1209 01:03:04,280 --> 01:03:11,080 Speaker 1: Craigo. Alright, Tiff, thank you. Thanks, lads. That was an 1210 01:03:11,120 --> 01:03:13,560 Speaker 1: emotional rollercoaster for me, that show. I'm going to have 1211 01:03:13,600 --> 01:03:16,280 Speaker 1: to go and have a shower, a cold shower, and a 1212 01:03:16,440 --> 01:03:20,480 Speaker 1: Xanax, whatever that is. I'm plushioned. I'm an emotional wreck. 1213 01:03:20,840 --> 01:03:23,320 Speaker 3: You should keep that in, Tiff, just keep it rolling. 1214 01:03:24,160 --> 01:03:26,640 Speaker 1: No, she will. She'll leave that in. All right, let's 1215 01:03:26,760 --> 01:03:32,320 Speaker 1: officially say goodbye. Thank you, listeners. You're great.
Thanks, Patrick, 1216 01:03:32,400 --> 01:03:34,120 Speaker 1: thanks, Tiff. See you, guys.