1 00:00:00,840 --> 00:00:04,040 Speaker 1: I got you, Bloody Champs, Craig Anthey Arpatifany and good Patrick, 2 00:00:04,120 --> 00:00:07,320 Speaker 1: James Manela. It's the You Project, it's the tech show. 3 00:00:07,440 --> 00:00:12,520 Speaker 1: It's a once-a-fortnight bloody extravaganza of, let's be honest, 4 00:00:13,240 --> 00:00:17,520 Speaker 1: mainly bullshit with a bit of technology thrown in. Speaking 5 00:00:17,560 --> 00:00:20,119 Speaker 1: of bullshit, hi Patrick. 6 00:00:20,079 --> 00:00:22,720 Speaker 2: Oh wow, that's just epic. 7 00:00:22,800 --> 00:00:27,120 Speaker 1: I love you, I love you. You're loved. How are you? 8 00:00:27,840 --> 00:00:30,960 Speaker 2: Well, you know, you're saying I'm loved, but I'm not 9 00:00:31,000 --> 00:00:32,199 Speaker 2: feeling that. 10 00:00:33,840 --> 00:00:37,520 Speaker 1: Well, it's the, I don't know, it's the dysfunctional Australian 11 00:00:37,560 --> 00:00:38,200 Speaker 1: male way. 12 00:00:38,720 --> 00:00:38,920 Speaker 3: You know. 13 00:00:39,000 --> 00:00:40,639 Speaker 1: If I love you, I'm not going to tell you, 14 00:00:40,720 --> 00:00:43,199 Speaker 1: or if I'm going to tell you, I'm going to 15 00:00:43,240 --> 00:00:45,440 Speaker 1: put a "ya fuckwit" on the end of it, you know. 16 00:00:45,400 --> 00:00:46,479 Speaker 2: And punch me. 17 00:00:46,880 --> 00:00:50,080 Speaker 1: Yeah, and punch you. Yeah, exactly. 18 00:00:50,400 --> 00:00:50,600 Speaker 2: Yeah. 19 00:00:50,840 --> 00:00:53,159 Speaker 1: I can't just look at you and go, Patrick, I 20 00:00:53,200 --> 00:00:56,160 Speaker 1: love you, because that is way too weird, probably 21 00:00:56,200 --> 00:00:58,680 Speaker 1: even for you, who's a little bit more evolved than me. 22 00:00:59,200 --> 00:01:02,920 Speaker 1: Okay, a lot. That's probably even weird for you. But 23 00:01:03,320 --> 00:01:05,520 Speaker 1: you know, I love you, mate. That's all right, isn't it?
24 00:01:05,640 --> 00:01:08,240 Speaker 2: I love you too, Craigo. I love you too, 25 00:01:08,680 --> 00:01:09,920 Speaker 2: and please don't punch me. 26 00:01:10,800 --> 00:01:12,720 Speaker 4: I was just about to say the same. My love language is 27 00:01:12,760 --> 00:01:13,280 Speaker 4: a left 28 00:01:13,080 --> 00:01:16,959 Speaker 2: hook. Exactly, that's right. 29 00:01:19,160 --> 00:01:21,960 Speaker 1: That is funny. What is your love language? And let's 30 00:01:22,000 --> 00:01:23,800 Speaker 1: just get this out of the way, so if you, 31 00:01:24,360 --> 00:01:28,440 Speaker 1: the four of you, don't know... is it not Gary Gray? 32 00:01:28,560 --> 00:01:28,800 Speaker 2: Is it? 33 00:01:28,880 --> 00:01:30,680 Speaker 1: Yeah? John Gray? 34 00:01:31,040 --> 00:01:33,559 Speaker 4: John Gray? Gary? 35 00:01:33,600 --> 00:01:37,199 Speaker 1: Sorry, I've just gone blank. Anyway, the Five Love Languages. 36 00:01:37,800 --> 00:01:40,360 Speaker 2: Hang on, can I just clarify something you just said: the 37 00:01:40,400 --> 00:01:43,759 Speaker 2: four of you? Who's the other person? Aren't there three 38 00:01:43,800 --> 00:01:44,120 Speaker 2: of us? 39 00:01:45,640 --> 00:01:48,400 Speaker 1: No, no, no, I was going to say, the four 40 00:01:48,440 --> 00:01:50,800 Speaker 1: of you who haven't heard of that book, The Five 41 00:01:50,880 --> 00:01:51,640 Speaker 1: Love Languages. 42 00:01:51,800 --> 00:01:53,440 Speaker 2: Okay, sorry, because... 43 00:01:53,160 --> 00:01:55,280 Speaker 1: I've spoken about it quite a bit. But anyway, the 44 00:01:55,280 --> 00:02:00,000 Speaker 1: five love languages: physical touch, words of affirmation, quality time, 45 00:02:00,040 --> 00:02:06,840 Speaker 1: and... what is it? What is it? Gifts, and acts 46 00:02:06,840 --> 00:02:12,240 Speaker 1: of service. Thank you, Patrick. And yeah, what's yours, mate? 47 00:02:12,280 --> 00:02:12,440 Speaker 5: Well...
48 00:02:12,480 --> 00:02:15,919 Speaker 1: Like, what, when you are expressing love or you want 49 00:02:15,919 --> 00:02:19,160 Speaker 1: to show someone that you love them, what do you...? 50 00:02:19,320 --> 00:02:21,760 Speaker 1: And I guess it depends whether or not it's, you know, 51 00:02:22,000 --> 00:02:25,079 Speaker 1: like, family or friend or someone or something a bit 52 00:02:25,120 --> 00:02:28,280 Speaker 1: more intimate. But what's your, what's your kind of... how 53 00:02:28,320 --> 00:02:30,280 Speaker 1: do you show people that you love them or care 54 00:02:30,320 --> 00:02:30,799 Speaker 1: about them? 55 00:02:31,120 --> 00:02:35,800 Speaker 2: I'm big into hugs. Yeah, you both have been recipients. 56 00:02:35,840 --> 00:02:39,040 Speaker 1: I'm sure. I've been to see a movie with you 57 00:02:39,120 --> 00:02:42,120 Speaker 1: and pretty much we held hands the whole time. I mean, 58 00:02:42,200 --> 00:02:44,240 Speaker 1: you know, there were moments where we didn't, you know, 59 00:02:44,360 --> 00:02:46,440 Speaker 1: that time where you were holding my cocky when 60 00:02:46,480 --> 00:02:49,720 Speaker 1: it got scary. But you can't say that. Can't take 61 00:02:49,760 --> 00:02:53,400 Speaker 1: that out, Tiff, take it out, take it out. Hand 62 00:02:53,560 --> 00:02:57,000 Speaker 1: job in the cinema, hashtag coming soon. What? I know, 63 00:02:58,520 --> 00:03:03,480 Speaker 1: and by coming I mean movie. Stop it. Wow, straight in. 64 00:03:03,639 --> 00:03:09,680 Speaker 4: We've just... Pat has been banned from Hoyts. 65 00:03:09,680 --> 00:03:14,639 Speaker 1: He calls it "Hoits". Oh god. So it took us 66 00:03:14,680 --> 00:03:18,040 Speaker 1: thirteen seconds to get here. I apologize. No wonder no 67 00:03:18,120 --> 00:03:22,800 Speaker 1: one listens to this episode. There's three sick fucks, and they 68 00:03:23,040 --> 00:03:26,320 Speaker 1: go, this, this whole technology thing? It's a ruse.
69 00:03:27,280 --> 00:03:31,639 Speaker 1: There's nothing, there's nothing going on here, nothing to see here. 70 00:03:32,200 --> 00:03:34,840 Speaker 2: Fans of the podcast realm, that's it. 71 00:03:35,120 --> 00:03:40,760 Speaker 1: That's it. And Tiff, what about... so Patrick's physical touch. 72 00:03:41,240 --> 00:03:44,680 Speaker 1: And by the way, everyone, that doesn't, despite all the bullshit, 73 00:03:44,720 --> 00:03:48,400 Speaker 1: that doesn't have to mean anything sexual. It's just often 74 00:03:48,400 --> 00:03:52,040 Speaker 1: a hug or, or, you know, a tap on 75 00:03:52,120 --> 00:03:54,840 Speaker 1: the back or a little bit of a quick sideways 76 00:03:54,880 --> 00:03:56,560 Speaker 1: squeeze or something. What about you? 77 00:03:59,040 --> 00:04:05,480 Speaker 2: Sorry? Why does that sound creepy? What, from 78 00:04:05,480 --> 00:04:08,520 Speaker 2: a hug to a bit of a sideways grope? 79 00:04:08,120 --> 00:04:17,200 Speaker 1: I said squeeze, I did not say grope. Sorry. Carry on, Tiff. 80 00:04:18,160 --> 00:04:21,640 Speaker 4: I think I'm a bit words of affirmation. You know what? 81 00:04:21,880 --> 00:04:24,680 Speaker 4: You know what's interesting? I look at how I behave 82 00:04:24,760 --> 00:04:26,920 Speaker 4: with my animals. I smother them. 83 00:04:28,080 --> 00:04:32,040 Speaker 1: But I think that's because, I reckon, I reckon your 84 00:04:32,080 --> 00:04:35,839 Speaker 1: love language is context-dependent, like, because, yeah, you're... 85 00:04:37,000 --> 00:04:40,480 Speaker 1: you don't open the door that wide with humans, but 86 00:04:40,800 --> 00:04:42,800 Speaker 1: animals, you're like, it's a free-for-all. 87 00:04:43,200 --> 00:04:43,440 Speaker 2: Yeah. 88 00:04:43,720 --> 00:04:48,320 Speaker 4: Like, sometimes I think if I was someone's girlfriend, the 89 00:04:48,320 --> 00:04:51,200 Speaker 4: way I behave with my dogs, I'd have a restraining 90 00:04:51,279 --> 00:04:52,039 Speaker 4: order put on me.
91 00:04:53,760 --> 00:04:56,680 Speaker 1: But the thing is, and I know this is, like, cliché, 92 00:04:56,839 --> 00:05:02,880 Speaker 1: but, like, you know that, like, even cats... like, animals 93 00:05:02,960 --> 00:05:05,880 Speaker 1: don't lie. Like, they might hate you or love you, 94 00:05:05,960 --> 00:05:09,279 Speaker 1: but you're going to know. They might want affection or not, 95 00:05:09,640 --> 00:05:13,120 Speaker 1: but they're not pretending, you know. And even if they 96 00:05:13,160 --> 00:05:17,360 Speaker 1: want food, they're, you know, they're very clear about what 97 00:05:17,400 --> 00:05:19,880 Speaker 1: they want. You know, there's... you don't have to read 98 00:05:19,960 --> 00:05:22,719 Speaker 1: between the lines like you do with humans. There's no 99 00:05:22,839 --> 00:05:24,200 Speaker 1: persona, no pretence. 100 00:05:24,720 --> 00:05:25,040 Speaker 4: You know. 101 00:05:26,440 --> 00:05:28,560 Speaker 2: I'm having a bit of a chuckle to myself because 102 00:05:28,640 --> 00:05:31,680 Speaker 2: whilst we've been talking, my dog has walked into the 103 00:05:31,720 --> 00:05:34,880 Speaker 2: little studio and pressed his head up against my hand, 104 00:05:34,880 --> 00:05:38,160 Speaker 2: and now I'm just patting him, Fritz. While we're talking, 105 00:05:38,320 --> 00:05:40,400 Speaker 2: he knows, which is why I'm leaning to one side. 106 00:05:40,400 --> 00:05:43,640 Speaker 2: I'm not having a stroke or anything, but, like, he's 107 00:05:43,680 --> 00:05:45,800 Speaker 2: having... anyway, leave it there. 108 00:05:48,360 --> 00:05:51,320 Speaker 1: Gary Chapman, his name is. Fucking hell, why did I 109 00:05:51,360 --> 00:05:52,040 Speaker 1: blank on that? 110 00:05:53,400 --> 00:05:53,679 Speaker 2: Now? 111 00:05:53,839 --> 00:05:56,800 Speaker 1: Speaking of dogs and animals, Patrick, I see on your 112 00:05:56,839 --> 00:05:59,479 Speaker 1: little chat list that we want to open...
I don't 113 00:05:59,520 --> 00:06:02,520 Speaker 1: know how much this is tech, but it is psychology, 114 00:06:02,640 --> 00:06:05,160 Speaker 1: so I do love it, and it does involve dogs, 115 00:06:05,720 --> 00:06:09,960 Speaker 1: so we all love it, I think. You know, I 116 00:06:09,960 --> 00:06:12,280 Speaker 1: have to tell you, I don't think this is new research, 117 00:06:12,320 --> 00:06:13,640 Speaker 1: but I love the research. 118 00:06:14,040 --> 00:06:14,440 Speaker 2: It's good. 119 00:06:15,120 --> 00:06:19,200 Speaker 1: Yeah. So tell us about why having a dog could 120 00:06:19,200 --> 00:06:20,119 Speaker 1: be good for your health. 121 00:06:20,800 --> 00:06:24,280 Speaker 2: Well, look, they did... the research specifically looked at 122 00:06:24,720 --> 00:06:27,120 Speaker 2: how we interact with dogs but also cats, and what 123 00:06:27,160 --> 00:06:29,880 Speaker 2: they tried to do was typify a dog owner and 124 00:06:29,920 --> 00:06:32,520 Speaker 2: a cat owner. Now you fall smack bang in the middle, Tiff. 125 00:06:32,600 --> 00:06:35,760 Speaker 2: But interestingly, before Craig jumped into the call, you 126 00:06:35,880 --> 00:06:38,760 Speaker 2: were actually, you were bitching about your stand-up desk. 127 00:06:38,839 --> 00:06:39,640 Speaker 2: What did your cat do? 128 00:06:41,240 --> 00:06:42,039 Speaker 4: What did the cat do? 129 00:06:42,440 --> 00:06:44,680 Speaker 2: Yeah, why doesn't your desk work anymore? 130 00:06:44,800 --> 00:06:47,560 Speaker 4: Oh, yeah, oh yeah. She chewed the fucking cord that makes 131 00:06:47,600 --> 00:06:48,799 Speaker 4: it go up and down. 132 00:06:49,360 --> 00:06:54,080 Speaker 2: So cat owners have a higher tendency towards anxiety and 133 00:06:54,120 --> 00:06:58,000 Speaker 2: they're more neurotic than dog owners. I don't care.
134 00:06:58,440 --> 00:07:02,479 Speaker 1: Wow. How... I wonder, is that just, is that a 135 00:07:02,560 --> 00:07:05,400 Speaker 1: byproduct of owning a cat? Or is it just that 136 00:07:06,040 --> 00:07:11,160 Speaker 1: neurotic, anxious people choose cats, because cats are neurotic and anxious? 137 00:07:11,560 --> 00:07:13,880 Speaker 2: That's a really good point. Well, look, this was only 138 00:07:13,920 --> 00:07:15,920 Speaker 2: a survey of three hundred and twenty-one people and 139 00:07:15,960 --> 00:07:20,120 Speaker 2: it was done at James Cook University, right here 140 00:07:20,120 --> 00:07:24,200 Speaker 2: in Australia. But the reason for it was, 141 00:07:25,520 --> 00:07:27,640 Speaker 2: a lot of people got a lot of consolation during 142 00:07:27,680 --> 00:07:32,040 Speaker 2: COVID lockdown by having pets, and it seemed that people 143 00:07:32,080 --> 00:07:35,520 Speaker 2: who had dogs got through it and were more resilient, 144 00:07:35,600 --> 00:07:39,120 Speaker 2: particularly if they were confined, and so that helped a lot. 145 00:07:39,200 --> 00:07:41,920 Speaker 2: Oh, that's showing. What did your cat... oh, did she 146 00:07:42,240 --> 00:07:43,480 Speaker 2: knock over the mug and break it? 147 00:07:43,800 --> 00:07:46,680 Speaker 4: Yeah, that's the last mug that she's broken. 148 00:07:47,520 --> 00:07:49,880 Speaker 4: She's gotten rid of thirty-odd indoor plants in the 149 00:07:49,920 --> 00:07:52,680 Speaker 4: two years I've had her, and all of the ornaments 150 00:07:52,840 --> 00:07:55,720 Speaker 4: have been broken. And about three mugs. 151 00:07:56,120 --> 00:07:58,480 Speaker 2: Are you really anxious right now that your cat could 152 00:07:58,480 --> 00:08:00,760 Speaker 2: be doing something while you're stuck here on the podcast? 153 00:08:00,960 --> 00:08:03,200 Speaker 4: I'm feeling like I'm not sure what my love language 154 00:08:03,240 --> 00:08:03,760 Speaker 4: is anymore.
155 00:08:04,440 --> 00:08:06,720 Speaker 1: Why do you keep buying indoor plants if you know 156 00:08:06,800 --> 00:08:08,160 Speaker 1: that their time is nigh? 157 00:08:08,400 --> 00:08:14,640 Speaker 4: No, no, now I've got... now I have artificial ones. Wow, yeah, 158 00:08:15,560 --> 00:08:17,200 Speaker 4: which she still causes chaos with. 159 00:08:17,640 --> 00:08:20,480 Speaker 1: I don't know... what's the value? Is it just 160 00:08:20,520 --> 00:08:21,480 Speaker 1: an aesthetic thing? 161 00:08:21,720 --> 00:08:23,840 Speaker 4: Yeah, just an aesthetic thing now. 162 00:08:24,320 --> 00:08:24,560 Speaker 2: Right? 163 00:08:24,600 --> 00:08:26,600 Speaker 1: And so all those ones I can see over your 164 00:08:26,680 --> 00:08:30,040 Speaker 1: left shoulder, there's about five. Are they fake? 165 00:08:30,480 --> 00:08:32,960 Speaker 4: Two of them are real. There's two that she hasn't 166 00:08:33,520 --> 00:08:38,760 Speaker 4: messed with. The other twenty-eight-odd, they made their 167 00:08:38,760 --> 00:08:41,840 Speaker 4: way outside and eventually passed away. 168 00:08:42,640 --> 00:08:45,960 Speaker 1: Patrick, I don't want to throw an emotional spanner in 169 00:08:46,000 --> 00:08:48,880 Speaker 1: the works, but, like, we've been through this with Tiff, 170 00:08:48,920 --> 00:08:52,440 Speaker 1: where, you know, one of the puppies died way, way, 171 00:08:52,520 --> 00:08:57,079 Speaker 1: way before she should have. How old is Fritz now? 172 00:08:58,080 --> 00:09:00,640 Speaker 2: He turns nine on the first of... so we'll 173 00:09:00,679 --> 00:09:01,120 Speaker 2: know he's... 174 00:09:00,960 --> 00:09:03,040 Speaker 1: going to live to twenty-five, so let's not worry. 175 00:09:03,160 --> 00:09:05,440 Speaker 1: But when he does turn twenty-five and goes to 176 00:09:05,480 --> 00:09:08,160 Speaker 1: that big fucking kennel in the sky, are you going 177 00:09:08,200 --> 00:09:08,960 Speaker 1: to get another one?
178 00:09:09,840 --> 00:09:12,240 Speaker 2: I will be an emotional mess for quite a while. 179 00:09:13,440 --> 00:09:15,320 Speaker 2: My previous... to give you a bit of a snapshot, I've 180 00:09:15,360 --> 00:09:18,040 Speaker 2: only ever had two dogs, and my previous dog died 181 00:09:18,160 --> 00:09:22,200 Speaker 2: suddenly at the pet groomer's, so it was quite a shock. 182 00:09:22,400 --> 00:09:22,480 Speaker 1: What? 183 00:09:23,120 --> 00:09:27,360 Speaker 2: Yeah, yeah, I was totally bereft. I was a blubbering idiot. 184 00:09:28,360 --> 00:09:31,920 Speaker 2: So yeah, I really struggled when I lost my first dog. 185 00:09:32,000 --> 00:09:34,679 Speaker 2: And to give you an idea of how bad it was: 186 00:09:35,120 --> 00:09:37,400 Speaker 2: for the next month after that, I'd go walking in 187 00:09:37,440 --> 00:09:39,720 Speaker 2: the morning with his ashes in my backpack because I 188 00:09:39,800 --> 00:09:41,320 Speaker 2: just felt I still wanted to walk him. 189 00:09:41,800 --> 00:09:45,360 Speaker 1: Oh my god, that's the saddest thing ever. 190 00:09:46,640 --> 00:09:50,320 Speaker 2: Well, I just... they are. And that was the interesting thing. 191 00:09:50,360 --> 00:09:53,760 Speaker 2: People with dogs during COVID were less lonely, you know. 192 00:09:54,240 --> 00:09:56,560 Speaker 2: Having a dog and that connection, you know. Like I said, 193 00:09:56,559 --> 00:09:58,839 Speaker 2: I'm sitting, I'm still sitting here right now patting him. 194 00:09:59,040 --> 00:10:01,600 Speaker 2: I think for people, particularly older people or people who 195 00:10:01,720 --> 00:10:05,400 Speaker 2: live alone, it's great to have that company, to have 196 00:10:05,440 --> 00:10:07,600 Speaker 2: that animal next to you to talk to, to play with, 197 00:10:08,440 --> 00:10:11,840 Speaker 2: you know, and also the sense of reliance. I kind 198 00:10:11,840 --> 00:10:15,640 Speaker 2: of like that, you know, caring. And talking about love languages...
199 00:10:15,840 --> 00:10:18,200 Speaker 2: I mean, you look after an animal. I've always felt, 200 00:10:18,240 --> 00:10:20,160 Speaker 2: and I've always said, that I'm not a pet owner. 201 00:10:20,760 --> 00:10:24,640 Speaker 2: I feel that I'm entrusted to look after the animal 202 00:10:24,840 --> 00:10:27,079 Speaker 2: as opposed to owning the animal. 203 00:10:27,160 --> 00:10:30,160 Speaker 1: Yeah, I'm with you. I don't like "owner", like it 204 00:10:31,080 --> 00:10:34,040 Speaker 1: kind of implies that the animal is a commodity or 205 00:10:34,080 --> 00:10:38,400 Speaker 1: a product, you know, an item to be owned. Yeah, 206 00:10:38,840 --> 00:10:42,439 Speaker 1: I don't like that language either, because I don't think, 207 00:10:43,720 --> 00:10:48,640 Speaker 1: you know, I don't think we can own animals. Yeah, 208 00:10:49,080 --> 00:10:51,800 Speaker 1: like, I understand the language, but to me, it's the 209 00:10:51,840 --> 00:10:52,559 Speaker 1: wrong language. 210 00:10:52,880 --> 00:10:56,240 Speaker 2: Yeah, and that's exactly why I never say I own Fritz. 211 00:10:56,640 --> 00:10:59,640 Speaker 2: I'm entrusted with him. And it's great. I think that 212 00:10:59,640 --> 00:11:01,640 Speaker 2: if you haven't had a dog and you 213 00:11:01,640 --> 00:11:05,120 Speaker 2: have an opportunity to babysit one, it's a lot of fun. 214 00:11:05,240 --> 00:11:07,600 Speaker 2: And when you connect... and you're the same, I 215 00:11:07,679 --> 00:11:09,960 Speaker 2: know how much you love your dog and your cat, 216 00:11:10,200 --> 00:11:12,959 Speaker 2: on some level. Hey, if you had to choose, which one 217 00:11:13,000 --> 00:11:14,480 Speaker 2: would you choose, the dog or the cat? 218 00:11:15,600 --> 00:11:18,120 Speaker 4: By choose... there's no choosing. 219 00:11:18,600 --> 00:11:19,319 Speaker 1: You can't ask it. 220 00:11:19,520 --> 00:11:20,200 Speaker 2: You can't ask it.
221 00:11:20,320 --> 00:11:23,720 Speaker 1: You can't ask... I don't want to open the pain 222 00:11:23,840 --> 00:11:26,840 Speaker 1: door any wider than it's been opened, but fuck it, 223 00:11:26,880 --> 00:11:30,880 Speaker 1: I'm doing it on behalf of the curious listeners. You're welcome, listeners. Patrick, 224 00:11:31,400 --> 00:11:34,320 Speaker 1: how does somebody send their dog to the groomer's and 225 00:11:34,480 --> 00:11:37,559 Speaker 1: the dog not come back? As much or as little 226 00:11:37,559 --> 00:11:39,760 Speaker 1: as you want to share, and if that is zero, 227 00:11:39,880 --> 00:11:42,880 Speaker 1: that's fine. But I'm just curious. 228 00:11:42,440 --> 00:11:45,320 Speaker 2: What happened was... look, we think it may have been 229 00:11:45,360 --> 00:11:49,960 Speaker 2: a heart condition, you know, and the dog got stressed when 230 00:11:50,280 --> 00:11:52,880 Speaker 2: they were being groomed. So, but yeah, it was just 231 00:11:52,920 --> 00:11:55,880 Speaker 2: pretty awful. And look, the groomer was bereft as well. 232 00:11:56,040 --> 00:11:58,160 Speaker 2: You know, it wasn't just me. I mean, obviously they 233 00:11:58,160 --> 00:12:01,560 Speaker 2: didn't plan for that to happen. Says he who got 234 00:12:01,600 --> 00:12:05,679 Speaker 2: there and started giving his dog mouth-to-mouth resuscitation. You know, 235 00:12:06,000 --> 00:12:08,640 Speaker 2: I know how to give CPR to a dog and 236 00:12:08,679 --> 00:12:09,240 Speaker 2: not a human. 237 00:12:09,720 --> 00:12:15,000 Speaker 1: Yeah, that's not the worst skill to have. It's pretty 238 00:12:15,000 --> 00:12:18,120 Speaker 1: transferable too. So just think of the next person that's 239 00:12:18,200 --> 00:12:20,840 Speaker 1: lifeless as a big dog, and you'll be fine to... 240 00:12:20,720 --> 00:12:23,040 Speaker 2: Put your hand over their mouth and breathe into their nose. 241 00:12:23,080 --> 00:12:25,600 Speaker 1: Okay, you could even do that. You could do that.
242 00:12:26,559 --> 00:12:28,640 Speaker 1: You know, it's not the preferred 243 00:12:28,120 --> 00:12:33,719 Speaker 5: model or operating system. But, right, can we, can 244 00:12:33,760 --> 00:12:37,200 Speaker 5: we talk about something more uplifting? Because everybody's reaching for 245 00:12:37,240 --> 00:12:40,720 Speaker 5: the fucking tissues and some antidepressants at the minute. 246 00:12:40,760 --> 00:12:43,360 Speaker 2: Can we? Yeah, okay, here's a good one. Do 247 00:12:43,400 --> 00:12:47,040 Speaker 2: you want to feel like you're living a fuller, longer, 248 00:12:47,679 --> 00:12:48,480 Speaker 2: enriched life? 249 00:12:49,040 --> 00:12:51,800 Speaker 1: We all do, right? Yes, I would think nobody's going 250 00:12:51,880 --> 00:12:52,520 Speaker 1: to say no to 251 00:12:52,520 --> 00:12:55,439 Speaker 2: that. Isn't it interesting? Do you remember when you were 252 00:12:55,440 --> 00:12:59,559 Speaker 2: in primary school, how school holidays went for a million years? 253 00:13:00,040 --> 00:13:03,200 Speaker 2: And when you're younger, everything takes longer. Everything felt like 254 00:13:03,200 --> 00:13:05,480 Speaker 2: it went for an eternity. School went for an eternity, 255 00:13:05,679 --> 00:13:07,400 Speaker 2: and then you went home, and then it was an eternity, 256 00:13:07,440 --> 00:13:08,920 Speaker 2: and then you went back to school again, and then home. 257 00:13:09,440 --> 00:13:16,000 Speaker 2: So it's an interesting, I guess, psychological aspect to our 258 00:13:16,840 --> 00:13:19,400 Speaker 2: perception of who we are and what we do, 259 00:13:19,440 --> 00:13:24,600 Speaker 2: that the way we perceive the movement of time is so constant, 260 00:13:24,720 --> 00:13:27,520 Speaker 2: and we experience it in the same way: every minute 261 00:13:27,760 --> 00:13:30,880 Speaker 2: lasts the same for me and for you and for Tiff.
However, 262 00:13:31,120 --> 00:13:34,400 Speaker 2: depending on what we're doing, that can change dramatically. And 263 00:13:34,440 --> 00:13:37,360 Speaker 2: a good example is if you go to the same place, 264 00:13:38,040 --> 00:13:40,160 Speaker 2: the same location: the first time you go there, 265 00:13:40,280 --> 00:13:42,480 Speaker 2: it seems like it takes a long time. Then as 266 00:13:42,520 --> 00:13:44,920 Speaker 2: you become more familiar with the route that you're taking, 267 00:13:45,280 --> 00:13:48,880 Speaker 2: it seems to take less time. With more familiarity, we're 268 00:13:48,880 --> 00:13:51,960 Speaker 2: not taking in new information. So if you want to, 269 00:13:52,120 --> 00:13:54,160 Speaker 2: as you get older... you know, we say it all 270 00:13:54,200 --> 00:13:57,040 Speaker 2: the time: gosh, you know, time flies. You know, suddenly 271 00:13:57,280 --> 00:14:00,640 Speaker 2: you know you're sixty-seven years old and every day 272 00:14:00,800 --> 00:14:05,160 Speaker 2: is a blink. But what it's thought is, the way 273 00:14:05,280 --> 00:14:08,800 Speaker 2: to enhance your life, if you want to feel like 274 00:14:08,840 --> 00:14:12,600 Speaker 2: you're living a longer life: start doing newer things, learn 275 00:14:12,679 --> 00:14:17,560 Speaker 2: newer things. And it doesn't have to be anything drastic. You know, 276 00:14:17,640 --> 00:14:22,120 Speaker 2: you could join my tai chi class. No? But if you 277 00:14:22,160 --> 00:14:25,800 Speaker 2: think about it, anything new that you do takes more 278 00:14:25,840 --> 00:14:30,480 Speaker 2: mental focus. As a consequence, from a perception perspective, 279 00:14:30,560 --> 00:14:34,480 Speaker 2: it feels like it's taking longer. And so, holding on 280 00:14:34,560 --> 00:14:37,160 Speaker 2: to a thought...
You may be in meditation, you might 281 00:14:37,200 --> 00:14:39,520 Speaker 2: be praying, or you could be out learning a new skill. 282 00:14:39,560 --> 00:14:42,600 Speaker 2: You could be doing pottery, whatever it happens to be. 283 00:14:42,920 --> 00:14:48,280 Speaker 2: But doing new things can be really beneficial, and it 284 00:14:48,360 --> 00:14:50,720 Speaker 2: might be going a different way for your walk in 285 00:14:50,760 --> 00:14:53,800 Speaker 2: the morning, and suddenly you're seeing new things, taking notice 286 00:14:53,840 --> 00:14:56,600 Speaker 2: of new things: new flowers, new hills, you know, whatever 287 00:14:56,640 --> 00:14:59,880 Speaker 2: it happens to be. So it's just an interesting little 288 00:15:00,000 --> 00:15:02,520 Speaker 2: exercise that we can all do, and it's not expensive 289 00:15:02,800 --> 00:15:05,200 Speaker 2: and it doesn't take a lot of energy. It just 290 00:15:05,640 --> 00:15:07,720 Speaker 2: is about shifting the way that we think about what 291 00:15:07,760 --> 00:15:11,520 Speaker 2: we're doing, to try to make our life, according to neuroscience, 292 00:15:11,920 --> 00:15:12,600 Speaker 2: fuller. 293 00:15:13,240 --> 00:15:17,320 Speaker 1: I love it.
And also, there's the... that actually makes 294 00:15:17,320 --> 00:15:19,760 Speaker 1: sense, and I haven't really thought of that, but I've 295 00:15:19,760 --> 00:15:23,920 Speaker 1: always been fascinated with the constant that is time versus 296 00:15:24,120 --> 00:15:29,160 Speaker 1: our perception and our individual experience of that constant. Because 297 00:15:29,160 --> 00:15:31,160 Speaker 1: you're right: like, if you're in the middle of something 298 00:15:31,240 --> 00:15:35,920 Speaker 1: fucking horrible, five minutes seems like five hours, and if 299 00:15:35,960 --> 00:15:40,600 Speaker 1: you're in a complete state of flow, and maybe euphoria, 300 00:15:40,920 --> 00:15:44,240 Speaker 1: an hour can seem like three or four minutes, you know, 301 00:15:44,320 --> 00:15:47,120 Speaker 1: because that's the way that we process it. But I 302 00:15:47,120 --> 00:15:49,920 Speaker 1: think also, when you're doing new things, apart from that 303 00:15:50,920 --> 00:15:56,240 Speaker 1: kind of different perception of time, you know, you're having 304 00:15:56,240 --> 00:15:59,840 Speaker 1: a new experience. Like, you're learning something, and you may 305 00:15:59,840 --> 00:16:03,400 Speaker 1: be opening the door to, you know, a different set 306 00:16:03,440 --> 00:16:07,280 Speaker 1: of, you know, a different lot of biochemistry happening 307 00:16:07,320 --> 00:16:10,600 Speaker 1: in the brain, because you're, you're tapping into new things. 308 00:16:10,640 --> 00:16:13,200 Speaker 1: And yeah, I like it. It makes sense too. But 309 00:16:13,280 --> 00:16:17,960 Speaker 1: also, think about this.
When you're four, like, twelve months 310 00:16:18,000 --> 00:16:22,080 Speaker 1: is twenty-five percent of your life, but when you're sixty, 311 00:16:23,120 --> 00:16:27,080 Speaker 1: like me, it's not even two percent, you know. So 312 00:16:27,280 --> 00:16:31,080 Speaker 1: there is that relative to the age of the 313 00:16:31,160 --> 00:16:33,280 Speaker 1: individual kind of factor as well. 314 00:16:34,080 --> 00:16:36,080 Speaker 2: Well, one of the things: if you're 315 00:16:36,080 --> 00:16:38,640 Speaker 2: brushing your teeth and you have to use your non-dominant 316 00:16:38,720 --> 00:16:43,760 Speaker 2: hand, because you're focusing so much harder, it seems 317 00:16:43,800 --> 00:16:47,640 Speaker 2: to take a lot longer, doesn't it? Yes, yes. You 318 00:16:47,680 --> 00:16:52,240 Speaker 2: know, those neural pathways of focus and attention. It's actually 319 00:16:52,240 --> 00:16:54,880 Speaker 2: an interesting thing to do. I always tell 320 00:16:54,960 --> 00:16:58,120 Speaker 2: my tai chi students to brush their teeth standing on 321 00:16:58,120 --> 00:17:01,400 Speaker 2: one foot. That way it's a good thing for 322 00:17:01,480 --> 00:17:04,000 Speaker 2: core stability, and also, because they're moving their hands up 323 00:17:04,040 --> 00:17:06,640 Speaker 2: and down, you're getting those micro-movements and you're having 324 00:17:06,640 --> 00:17:10,000 Speaker 2: to readjust your balance constantly. It's just so simple. It 325 00:17:10,000 --> 00:17:12,680 Speaker 2: doesn't take much at all, but you turn something really 326 00:17:12,800 --> 00:17:15,680 Speaker 2: mundane into something you've got to focus on a lot more. 327 00:17:15,880 --> 00:17:17,720 Speaker 1: Now, I do the same thing. I used to tell 328 00:17:17,720 --> 00:17:20,760 Speaker 1: people to dry the dishes on one foot, so, and 329 00:17:20,800 --> 00:17:23,640 Speaker 1: that's the same thing.
You've got to reach over, grab the thing, 330 00:17:23,760 --> 00:17:26,280 Speaker 1: grab the bowl, grab your tea towel, dry it, put it 331 00:17:26,320 --> 00:17:30,359 Speaker 1: on the bench, reach... yeah. Yeah, doing old things in 332 00:17:30,440 --> 00:17:34,280 Speaker 1: new ways is a great way to keep young and 333 00:17:34,320 --> 00:17:37,439 Speaker 1: to keep your body and your brain kind of evolving, 334 00:17:37,560 --> 00:17:41,200 Speaker 1: because you're not... you're not stuck in Groundhog Day 335 00:17:41,240 --> 00:17:45,399 Speaker 1: in terms of mindset or behavior or action-taking. 336 00:17:45,760 --> 00:17:48,480 Speaker 2: Yeah. The place across the road from your place that you go 337 00:17:48,560 --> 00:17:50,720 Speaker 2: to a lot? Well, instead of walking across the road, 338 00:17:51,040 --> 00:17:52,640 Speaker 2: walk around the block the opposite way. 339 00:17:52,800 --> 00:17:53,240 Speaker 4: Hm hmm. 340 00:17:54,240 --> 00:17:57,000 Speaker 1: Yeah, it's funny you say that. I typically have been 341 00:17:57,359 --> 00:18:00,280 Speaker 1: running two or three k's three times a week, and 342 00:18:00,320 --> 00:18:03,040 Speaker 1: I do the same course, and I started doing a 343 00:18:03,040 --> 00:18:05,800 Speaker 1: different course. I enjoy it so much more. Like, every 344 00:18:05,840 --> 00:18:09,280 Speaker 1: time I just go somewhere different now, which is to 345 00:18:09,359 --> 00:18:11,520 Speaker 1: your point. And I didn't even think about it. I 346 00:18:11,640 --> 00:18:13,680 Speaker 1: just thought, I'm just going to... I'm sick of this 347 00:18:14,359 --> 00:18:17,280 Speaker 1: particular route, because I know exactly how long it is, 348 00:18:17,680 --> 00:18:19,760 Speaker 1: so when I get to certain points, I know what 349 00:18:19,920 --> 00:18:22,480 Speaker 1: time I should be at, because I'm not competitive at all.
350 00:18:23,160 --> 00:18:27,000 Speaker 1: And then... but now, just going in different spaces and 351 00:18:27,040 --> 00:18:30,680 Speaker 1: places and directions every time, I actually am having more fun. 352 00:18:31,320 --> 00:18:33,960 Speaker 2: Did the stalking charges help make you change your route 353 00:18:33,960 --> 00:18:34,359 Speaker 2: as well? 354 00:18:35,040 --> 00:18:38,680 Speaker 1: Yeah? Well, I mean, sure, I mean, there's that, and 355 00:18:38,720 --> 00:18:41,120 Speaker 1: also the divvy van that follows me as I run, 356 00:18:41,160 --> 00:18:46,160 Speaker 1: which, I mean, that's a little disconcerting. But don't say that. 357 00:18:46,200 --> 00:18:50,280 Speaker 1: People will think you're serious. All right, tell us about wherever 358 00:18:50,320 --> 00:18:52,080 Speaker 1: you want to go. Why not the Tetris effect? 359 00:18:52,200 --> 00:18:55,040 Speaker 1: What is the Tetris effect? I love Tetris, by the 360 00:18:55,040 --> 00:18:56,960 Speaker 2: way. Oh, me too. Tetris is one of the best 361 00:18:57,000 --> 00:19:01,080 Speaker 2: games ever, because it is so mentally challenging. The Tetris effect: 362 00:19:02,080 --> 00:19:06,399 Speaker 2: it was discovered that if you play a game of 363 00:19:06,440 --> 00:19:09,439 Speaker 2: Tetris for quite a lengthy amount of time and then 364 00:19:09,480 --> 00:19:12,119 Speaker 2: you go to bed, it was found that some people 365 00:19:12,119 --> 00:19:17,239 Speaker 2: who are gamers were subconsciously visualizing the Tetris effect, so 366 00:19:17,520 --> 00:19:21,359 Speaker 2: the Tetris blocks turning around. Do people know what Tetris is? 367 00:19:21,400 --> 00:19:22,920 Speaker 2: I'm assuming people know Tetris well. 368 00:19:22,960 --> 00:19:24,840 Speaker 1: I would think half do and half don't.
369 00:19:24,920 --> 00:19:28,399 Speaker 2: So all it is is basically a screen 370 00:19:28,520 --> 00:19:31,040 Speaker 2: where you've got colored blocks that are falling down and 371 00:19:31,080 --> 00:19:33,359 Speaker 2: you have to rearrange the blocks so that they stack 372 00:19:33,640 --> 00:19:36,400 Speaker 2: along a row, and then if you complete a row, 373 00:19:36,720 --> 00:19:39,480 Speaker 2: the row then disappears, so you can prolong the game 374 00:19:39,480 --> 00:19:41,080 Speaker 2: by doing that. Does that kind of make sense? 375 00:19:41,440 --> 00:19:44,520 Speaker 1: Yeah, yeah. But 376 00:19:44,560 --> 00:19:47,120 Speaker 1: I think anyone who's listening to this who's twenty five 377 00:19:47,240 --> 00:19:50,600 Speaker 1: or thirty even, or younger, they're definitely not saying Tetris 378 00:19:50,640 --> 00:19:54,040 Speaker 1: is awesome. They're going, you are such a fucking granddad, 379 00:19:54,080 --> 00:19:57,240 Speaker 1: Tetris is awesome. It's like going, oh, have you seen 380 00:19:57,320 --> 00:20:00,159 Speaker 1: table tennis? Fucking hell, it's the next big thing. 381 00:20:02,680 --> 00:20:09,240 Speaker 2: Okay, right, oh yeah. 382 00:20:07,920 --> 00:20:10,200 Speaker 1: Weren't we playing it in the eighties? Which is forty 383 00:20:10,280 --> 00:20:14,480 Speaker 1: years ago, Grandpa. It was a long time ago. Bloody groundbreaking. 384 00:20:15,960 --> 00:20:18,560 Speaker 1: What about Tetris though? That's big. No, it's not. 385 00:20:21,440 --> 00:20:24,800 Speaker 2: So this researcher, I've forgotten... ah, Robert Stickgold. 386 00:20:25,400 --> 00:20:29,720 Speaker 2: He... that's not a real name. Not a 387 00:20:29,640 --> 00:20:34,520 Speaker 1: real name, that's a pseudonym. His real name's Brian McGillacuddy.
388 00:20:35,760 --> 00:20:39,800 Speaker 2: So what he realized was... he named the phenomenon after 389 00:20:39,920 --> 00:20:43,679 Speaker 2: the puzzle, because this usually happens when you're doing repetitive 390 00:20:43,880 --> 00:20:47,320 Speaker 2: tasks just prior to going to sleep. But there actually 391 00:20:47,359 --> 00:20:49,399 Speaker 2: is a positive flow-on effect from this, because what 392 00:20:49,400 --> 00:20:53,840 Speaker 2: they're talking about doing is using this Tetris effect for people 393 00:20:53,880 --> 00:20:58,040 Speaker 2: with PTSD. So for people who have trouble sleeping, what they want 394 00:20:58,080 --> 00:21:01,320 Speaker 2: to do is they try to get them to play 395 00:21:01,359 --> 00:21:05,520 Speaker 2: the game when they're recalling whatever the trauma happens to be, 396 00:21:05,720 --> 00:21:09,720 Speaker 2: and it reworks the memory and, what is it, rewrites 397 00:21:09,960 --> 00:21:12,680 Speaker 2: it to the mental hard drive. And they're saying it can 398 00:21:12,760 --> 00:21:16,359 Speaker 2: help lessen the impact of the memory when used in 399 00:21:16,440 --> 00:21:18,880 Speaker 2: conjunction with playing a video game like this. 400 00:21:19,760 --> 00:21:22,359 Speaker 1: That is really interesting. We call that... you know, 401 00:21:22,400 --> 00:21:25,679 Speaker 1: when there's a stimulus, like, you think about that 402 00:21:25,800 --> 00:21:31,280 Speaker 1: neuro association: that thought has a neurological kind of consequence 403 00:21:31,480 --> 00:21:34,840 Speaker 1: in the body, and where that trauma... yeah, that's 404 00:21:35,000 --> 00:21:38,880 Speaker 1: clever. That's clever. Like, I have this neuro association. 405 00:21:39,359 --> 00:21:42,959 Speaker 1: Did you ever eat that meat? What was it 406 00:21:43,040 --> 00:21:48,000 Speaker 1: called, that lunch meat? It was called... no, not Spam.
407 00:21:48,040 --> 00:21:50,199 Speaker 1: It was, you know, in those big tubes that your 408 00:21:50,240 --> 00:21:53,920 Speaker 1: mum used to slice. Oh my god... not salami. No, 409 00:21:53,960 --> 00:21:55,399 Speaker 1: not salami. No salamis. 410 00:21:55,880 --> 00:21:56,600 Speaker 2: It was like... 411 00:21:58,080 --> 00:21:58,560 Speaker 1: I forget. 412 00:21:58,680 --> 00:22:01,920 Speaker 2: It was one of those meat products, for crying out loud. 413 00:22:02,000 --> 00:22:04,000 Speaker 1: Yeah, no, but you haven't always been a vegan. You 414 00:22:04,080 --> 00:22:07,520 Speaker 1: used to eat fucking charred animal flesh. Don't pretend 415 00:22:08,040 --> 00:22:11,240 Speaker 1: you grew up eating carrots. Anyway, I ate some of 416 00:22:11,280 --> 00:22:15,679 Speaker 1: this sandwich meat. Essentially it is just shit. And I 417 00:22:15,720 --> 00:22:18,320 Speaker 1: got violently ill. I got food poisoning. I used 418 00:22:18,359 --> 00:22:21,199 Speaker 1: to love it, and then I got food poisoning. And 419 00:22:21,240 --> 00:22:25,840 Speaker 1: then even now, if you gave me... if you 420 00:22:25,960 --> 00:22:28,080 Speaker 1: just put that in front of me, I start to 421 00:22:28,560 --> 00:22:32,040 Speaker 1: dry retch. And that was... I was so ill. 422 00:22:32,760 --> 00:22:33,280 Speaker 2: And it was... 423 00:22:33,240 --> 00:22:35,560 Speaker 1: Because of that stuff. And every time I would spew, 424 00:22:35,680 --> 00:22:39,240 Speaker 1: I could taste it, and oh my god. Yeah, but 425 00:22:39,280 --> 00:22:41,680 Speaker 1: that's forty years later and I still can't even think 426 00:22:41,720 --> 00:22:42,160 Speaker 1: about it. 427 00:22:42,400 --> 00:22:45,000 Speaker 2: Well, talking about processed meat, I did actually try a 428 00:22:45,040 --> 00:22:49,000 Speaker 2: thing called Zungenwurst once. What do you reckon 429 00:22:49,119 --> 00:22:50,000 Speaker 2: Zungenwurst is?
430 00:22:50,640 --> 00:22:54,400 Speaker 1: I think it is a German product involving tongue. 431 00:22:54,200 --> 00:22:59,360 Speaker 2: Correct. And it's basically sliced tongue, and the slices 432 00:22:59,400 --> 00:23:02,119 Speaker 2: show the shape of the tongue. 433 00:23:02,840 --> 00:23:06,359 Speaker 1: Yeah, yeah, my old man. My old man's like that. My old man, 434 00:23:06,760 --> 00:23:09,320 Speaker 1: he'll eat fucking everything that you don't want, like 435 00:23:09,520 --> 00:23:13,840 Speaker 1: tongue of lamb, like liver. I know people are going 436 00:23:13,920 --> 00:23:16,320 Speaker 1: to go, liver's amazing. Liver's fucking horrible. 437 00:23:18,119 --> 00:23:20,120 Speaker 2: The reason? Liver sounds, like, awful. 438 00:23:20,720 --> 00:23:24,399 Speaker 1: Oh, it is. It is. All right, so tell me 439 00:23:24,440 --> 00:23:27,520 Speaker 1: about screen fatigue, because I feel like this might be 440 00:23:27,600 --> 00:23:31,480 Speaker 1: something to which I'm susceptible. And everyone, really. 441 00:23:31,440 --> 00:23:33,920 Speaker 2: Well, after the show today I've got an eye appointment 442 00:23:33,960 --> 00:23:36,240 Speaker 2: to get an eye check up. So I haven't got 443 00:23:36,240 --> 00:23:38,399 Speaker 2: glasses yet, and every time I go for one of 444 00:23:38,520 --> 00:23:41,719 Speaker 2: my annual checkups, I think, it's now, isn't it? 445 00:23:41,720 --> 00:23:47,159 Speaker 2: It's got to be glasses now. Anyway, so screen fatigue: 446 00:23:47,240 --> 00:23:52,320 Speaker 2: it's something that's becoming more prevalent. And so there are 447 00:23:52,440 --> 00:23:55,240 Speaker 2: a number of different dangers associated with screen fatigue, and 448 00:23:55,280 --> 00:23:58,080 Speaker 2: I say dangers with an emphasis on... well, maybe not 449 00:23:58,480 --> 00:24:01,040 Speaker 2: quite so dangerous.
But there are some problems that you 450 00:24:01,080 --> 00:24:04,400 Speaker 2: can, you know, have. And one 451 00:24:04,440 --> 00:24:08,399 Speaker 2: of the little rules that you can try to remember... 452 00:24:08,480 --> 00:24:12,760 Speaker 2: this is from the American Optometric Association. Optometric? Optometric, I guess, 453 00:24:12,760 --> 00:24:15,760 Speaker 2: so let's go optometric. I think it sounds good. Anyway, 454 00:24:15,760 --> 00:24:19,200 Speaker 2: they advise what they call the twenty twenty twenty rule. Okay, 455 00:24:19,240 --> 00:24:22,359 Speaker 2: so every twenty minutes, look at something twenty feet away 456 00:24:22,640 --> 00:24:23,880 Speaker 2: for at least twenty seconds. 457 00:24:24,080 --> 00:24:29,520 Speaker 1: Yeah, yeah, yep. Yeah, that's good advice. It is 458 00:24:29,800 --> 00:24:32,040 Speaker 1: like, look at something in the distance. It's like, look 459 00:24:32,040 --> 00:24:34,320 Speaker 1: at the horizon. That's another one they go with, 460 00:24:34,560 --> 00:24:36,680 Speaker 1: you know, however far that is for you. 461 00:24:37,240 --> 00:24:39,800 Speaker 2: Well, I've set up my desk specifically so that the 462 00:24:39,840 --> 00:24:41,919 Speaker 2: window's on the side and I can look down the 463 00:24:42,000 --> 00:24:44,840 Speaker 2: driveway, where I can see people coming up and 464 00:24:44,840 --> 00:24:46,960 Speaker 2: down the driveway. It means I do have a 465 00:24:47,000 --> 00:24:50,879 Speaker 2: long-distance viewable angle. Because if you're looking at a 466 00:24:50,880 --> 00:24:53,199 Speaker 2: screen for a long time, what it can do is 467 00:24:53,240 --> 00:24:56,800 Speaker 2: it causes dry eyes and irritation. So that's the prolonged time. 468 00:24:56,840 --> 00:24:59,480 Speaker 2: It means that we don't actually blink as much when 469 00:24:59,520 --> 00:25:03,320 Speaker 2: we look at a screen.
Yeah, usually we're blinking 470 00:25:03,560 --> 00:25:07,320 Speaker 2: like a million miles an hour. So that then causes dry 471 00:25:07,359 --> 00:25:09,679 Speaker 2: eyes and irritation. It's interesting, 472 00:25:09,720 --> 00:25:12,560 Speaker 2: isn't it, that we are subconsciously not blinking as 473 00:25:12,640 --> 00:25:14,880 Speaker 2: much when we look at a screen. It's so, 474 00:25:14,880 --> 00:25:15,560 Speaker 2: so weird. 475 00:25:15,600 --> 00:25:17,800 Speaker 1: I've got a trick, a hack. What if we had 476 00:25:17,800 --> 00:25:20,240 Speaker 1: a huge screen, but we put it twenty feet away, 477 00:25:21,600 --> 00:25:27,080 Speaker 1: so we're always looking at something twenty feet away? Yeah, yeah. No, 478 00:25:28,000 --> 00:25:32,080 Speaker 1: I... what seemed like the advice is, 479 00:25:32,440 --> 00:25:34,760 Speaker 1: you know, looking at something twenty feet away is good 480 00:25:34,760 --> 00:25:37,680 Speaker 1: for our eyes. What if our screen was twenty feet... well, 481 00:25:37,680 --> 00:25:40,439 Speaker 1: we have, like, a hundred-inch TV and that became your 482 00:25:40,280 --> 00:25:42,919 Speaker 4: screen. In a house that big? 483 00:25:44,720 --> 00:25:49,560 Speaker 2: Patrick, do the opposite: every twenty minutes, look at something close. 484 00:25:50,480 --> 00:25:52,640 Speaker 1: Can I hit the pause button for a moment? Now, 485 00:25:52,720 --> 00:25:55,879 Speaker 1: you sent... I think you sent Tiff as well? You 486 00:25:55,960 --> 00:25:57,960 Speaker 1: sent me and Tiff... well, you sent me. Did he 487 00:25:58,000 --> 00:25:59,440 Speaker 1: send them to you, those photos, Tiff? 488 00:25:59,560 --> 00:25:59,720 Speaker 4: Yes. 489 00:25:59,800 --> 00:26:03,480 Speaker 1: Yeah, so some were a bunch of flowers and outdoor 490 00:26:03,520 --> 00:26:07,800 Speaker 1: beautiful things, but one was a big, empty-looking room.
491 00:26:07,600 --> 00:26:10,639 Speaker 1: I'm assuming that's the new tai chi studio. 492 00:26:11,040 --> 00:26:14,159 Speaker 2: Yeah, yeah. So... finished. Yeah, that's it. 493 00:26:14,800 --> 00:26:17,919 Speaker 1: For those who haven't been listening, you're turning your garage, 494 00:26:17,960 --> 00:26:21,680 Speaker 1: which is a large garage because you're rich, into a 495 00:26:21,880 --> 00:26:26,320 Speaker 1: tai chi studio. Yes, I can say that ten 496 00:26:26,359 --> 00:26:30,560 Speaker 1: times fast. And it looks good. It looks good. What's 497 00:26:30,560 --> 00:26:31,080 Speaker 1: the update? 498 00:26:31,160 --> 00:26:31,200 Speaker 5: Like? 499 00:26:31,240 --> 00:26:32,400 Speaker 1: Where are you at? Oh? 500 00:26:32,480 --> 00:26:34,480 Speaker 2: This has been an exciting thing for me. I haven't 501 00:26:34,520 --> 00:26:37,280 Speaker 2: really done much in the way of renos before, but 502 00:26:37,280 --> 00:26:40,320 Speaker 2: I've got a really great builder. And so it's a 503 00:26:40,359 --> 00:26:44,520 Speaker 2: massive garage. It's like nine meters by five meters. It's enormous. 504 00:26:44,560 --> 00:26:47,520 Speaker 2: You could fit three, four cars in it easily. More, 505 00:26:47,560 --> 00:26:51,439 Speaker 2: of course. But the idea being that, because I 506 00:26:51,480 --> 00:26:53,480 Speaker 2: do teach tai chi, if I want to do 507 00:26:53,520 --> 00:26:57,160 Speaker 2: one-on-one or small sessions, it's kind of expensive 508 00:26:57,200 --> 00:26:59,480 Speaker 2: to hire out a location to do it at, a hall 509 00:26:59,560 --> 00:27:02,159 Speaker 2: or something like that. And you know, I was one 510 00:27:02,200 --> 00:27:04,920 Speaker 2: of those people that just stored crap in my garage. 511 00:27:05,000 --> 00:27:08,520 Speaker 2: It's very cathartic throwing things away.
I had two skips 512 00:27:08,560 --> 00:27:11,760 Speaker 2: so far, and I've got another skip coming soon to 513 00:27:11,840 --> 00:27:13,680 Speaker 2: throw more stuff away. 514 00:27:14,040 --> 00:27:15,680 Speaker 1: Do you not have any shit that you can give 515 00:27:15,720 --> 00:27:16,920 Speaker 1: to people? Like, good shit? 516 00:27:17,200 --> 00:27:19,600 Speaker 2: You know, I've given away some stuff, yes. No, I certainly... 517 00:27:19,680 --> 00:27:22,359 Speaker 2: I've been trying to give away a beautiful art deco wardrobe, 518 00:27:22,960 --> 00:27:24,720 Speaker 2: but people don't want wardrobes anymore. 519 00:27:25,000 --> 00:27:27,040 Speaker 1: No, I think it might be that wardrobe. 520 00:27:27,359 --> 00:27:31,600 Speaker 2: Yeah. Hey, so anyway, in answer to your question, the paint... 521 00:27:31,480 --> 00:27:33,879 Speaker 1: I don't know that art deco and beautiful go 522 00:27:33,960 --> 00:27:35,280 Speaker 1: in the same sentence, do they? 523 00:27:35,560 --> 00:27:37,880 Speaker 2: My whole house is art deco decorated. 524 00:27:37,960 --> 00:27:39,639 Speaker 1: Well, there you go. I rest my case. 525 00:27:39,840 --> 00:27:44,160 Speaker 2: Oh wow. Yeah, I'm going to rescind my invitation. 526 00:27:43,800 --> 00:27:47,000 Speaker 1: Ha ha ha. Oh no, I love your house 527 00:27:47,000 --> 00:27:50,760 Speaker 1: and you... I've never even seen it. But he was answering 528 00:27:50,800 --> 00:27:56,440 Speaker 1: your question. Sorry. Oh man, jeez, I'm an only child. 529 00:27:56,640 --> 00:27:59,119 Speaker 1: I have a predisposition for interruption. 530 00:27:59,400 --> 00:28:03,040 Speaker 2: You do.
So the painting has been done, very excited, 531 00:28:03,119 --> 00:28:05,800 Speaker 2: and then the plumber is due next week, and the electrician, 532 00:28:05,840 --> 00:28:07,440 Speaker 2: so I wanted to get the painting done before that, 533 00:28:07,680 --> 00:28:08,840 Speaker 2: and then it'll be all good to go. 534 00:28:09,320 --> 00:28:11,000 Speaker 1: What's on the floor? What's it going to be? 535 00:28:11,640 --> 00:28:13,359 Speaker 2: I've just put green tongue on the floor at the 536 00:28:13,359 --> 00:28:15,280 Speaker 2: moment, because I've kind of run out of money. So 537 00:28:15,560 --> 00:28:17,720 Speaker 2: if you're not familiar with green tongue, it's kind of 538 00:28:17,720 --> 00:28:21,879 Speaker 2: a... it's like chipboard, a 539 00:28:21,920 --> 00:28:24,640 Speaker 2: thick chipboard that links together. But the green tongue, as 540 00:28:24,640 --> 00:28:28,320 Speaker 2: opposed to the, I think, orange tongue, is 541 00:28:28,359 --> 00:28:31,040 Speaker 2: more water resistant. So I've just locked that down on 542 00:28:31,040 --> 00:28:31,640 Speaker 2: the concrete. 543 00:28:31,680 --> 00:28:33,080 Speaker 1: And is it BYO mats? 544 00:28:33,600 --> 00:28:35,280 Speaker 2: No, no, no. You don't use mats for 545 00:28:35,320 --> 00:28:37,320 Speaker 2: tai chi. You just do it in soft shoes, like 546 00:28:37,400 --> 00:28:38,440 Speaker 2: runners and that sort of stuff. 547 00:28:38,600 --> 00:28:41,360 Speaker 1: Really? Don't you ever get on your back or anything? 548 00:28:42,440 --> 00:28:45,520 Speaker 2: No. Tai chi is a slow-moving martial art form, 549 00:28:45,680 --> 00:28:47,360 Speaker 2: so I guess only if you fall down. 550 00:28:48,480 --> 00:28:51,000 Speaker 1: Well, in my version of tai chi, we do it 551 00:28:51,040 --> 00:28:56,520 Speaker 1: on our back. So you clearly haven't evolved.
So if 552 00:28:56,560 --> 00:29:00,480 Speaker 1: you want to do what I like to call 553 00:29:00,720 --> 00:29:05,840 Speaker 1: supine tai chi, just hit me up. 554 00:29:06,800 --> 00:29:09,080 Speaker 2: There are different forms of tai chi for people who 555 00:29:09,120 --> 00:29:11,640 Speaker 2: are incapacitated. So if they're in a wheelchair or whatever, 556 00:29:11,680 --> 00:29:13,920 Speaker 2: they can do the upper body parts of the tai 557 00:29:13,960 --> 00:29:18,120 Speaker 2: chi movement. Anyway. Now, by the way, increased 558 00:29:18,200 --> 00:29:22,360 Speaker 2: stress and anxiety is also something that's associated with looking 559 00:29:22,360 --> 00:29:24,160 Speaker 2: at screens and talking to Craig Harper. 560 00:29:26,040 --> 00:29:28,400 Speaker 1: And imagine if you're talking to Craig Harper on a screen. 561 00:29:28,560 --> 00:29:31,880 Speaker 1: It's like the fucking double banger. Absolutely, double whammy. 562 00:29:33,680 --> 00:29:37,760 Speaker 2: That's funny. All right, now remember twenty twenty twenty, Craigo. 563 00:29:38,040 --> 00:29:41,040 Speaker 1: Okay, twenty twenty twenty, yes, exactly. 564 00:29:41,120 --> 00:29:43,800 Speaker 2: Hey, have you heard of a new term called raw dogging? 565 00:29:46,120 --> 00:29:48,680 Speaker 1: I don't think that's a new term, Patrick. I think 566 00:29:48,720 --> 00:29:52,000 Speaker 1: that's an old term, and I think it refers to 567 00:29:53,440 --> 00:29:58,360 Speaker 1: sexual relations without appropriate protection. That is exactly what it's 568 00:29:58,400 --> 00:29:59,000 Speaker 1: referring to. 569 00:29:59,520 --> 00:30:03,760 Speaker 2: Not now. Not by the funky, cool, trendy YouTubers. 570 00:30:05,080 --> 00:30:07,280 Speaker 1: See, I don't know... for those of us who are older 571 00:30:07,320 --> 00:30:11,440 Speaker 1: than twenty, raw dogging is... my mum was against it.
572 00:30:12,960 --> 00:30:15,440 Speaker 1: Choose to say to me, if it's not on, it's 573 00:30:15,480 --> 00:30:19,400 Speaker 1: not on, and I'd say, yes, mum, but get out 574 00:30:19,440 --> 00:30:28,840 Speaker 1: of the room though, and and what are you doing 575 00:30:28,840 --> 00:30:43,040 Speaker 1: with the handicam go away? Ah? Fuck, I see this 576 00:30:43,080 --> 00:30:43,720 Speaker 1: is a problem. 577 00:30:43,960 --> 00:30:45,320 Speaker 4: If it's not on, it's not on. 578 00:30:45,800 --> 00:30:47,480 Speaker 1: Yeah, it's not on, it's not on. This is what 579 00:30:47,560 --> 00:30:52,360 Speaker 1: happens when you do sixteen hundred podcasts and you're in 580 00:30:52,400 --> 00:30:54,520 Speaker 1: a little bit of pain. Which we won't open the 581 00:30:54,560 --> 00:30:55,960 Speaker 1: door on that, but you need to know what I'm 582 00:30:55,960 --> 00:30:59,200 Speaker 1: talking about. I've had some drugs this morning, I'm in pain, 583 00:30:59,600 --> 00:31:02,040 Speaker 1: so I'm not responsible for what comes out of my mouth. 584 00:31:02,320 --> 00:31:04,200 Speaker 2: It just occurred to me, Tiff, that if you come 585 00:31:04,280 --> 00:31:07,720 Speaker 2: from a one child family, whereas I'm from a four 586 00:31:07,920 --> 00:31:11,720 Speaker 2: child family, I've got a twenty five percent less chance 587 00:31:11,800 --> 00:31:12,920 Speaker 2: of someone walking in on me. 588 00:31:14,280 --> 00:31:20,280 Speaker 1: Yeah, statistically, that was That was the big fear of 589 00:31:19,840 --> 00:31:21,360 Speaker 1: my teenageers. 590 00:31:22,160 --> 00:31:26,040 Speaker 2: Your parents. Yeah, No, you. 591 00:31:26,080 --> 00:31:28,680 Speaker 1: Don't want that, do you. That's the no. I don't 592 00:31:28,720 --> 00:31:31,320 Speaker 1: even want to think about. Isn't it funny how everyone's 593 00:31:31,360 --> 00:31:33,880 Speaker 1: creeped out by that thought. 
I don't even want to 594 00:32:33,920 --> 00:32:35,080 Speaker 1: say the thought, but you 595 00:32:35,040 --> 00:32:35,640 Speaker 2: know. You can. 596 00:32:35,680 --> 00:32:38,120 Speaker 1: You know, Mum and Dad... you don't want to think 597 00:32:38,120 --> 00:32:39,320 Speaker 1: about that, do you? 598 00:32:39,320 --> 00:32:41,920 Speaker 2: No, no. Sorry to all the parents. 599 00:32:42,560 --> 00:32:45,880 Speaker 1: Sorry. What did you just say before Tiff distracted us? 600 00:32:46,120 --> 00:32:48,160 Speaker 2: What, raw dogging? Oh. 601 00:32:48,200 --> 00:32:51,880 Speaker 1: Yeah. What is the contemporary equivalent of 602 00:32:52,000 --> 00:32:52,560 Speaker 1: raw dogging? 603 00:32:53,440 --> 00:32:55,560 Speaker 2: Well, I can't look at the screen because I'm tearing 604 00:32:55,600 --> 00:32:58,760 Speaker 2: up, tearing up so much. By the end 605 00:32:58,800 --> 00:33:02,040 Speaker 2: of this... raw dogging is now a term. 606 00:33:02,080 --> 00:33:03,640 Speaker 1: Hey, stop saying raw dogging. 607 00:33:06,680 --> 00:33:08,160 Speaker 4: I'm going to raw dog into India. 608 00:33:08,440 --> 00:33:14,960 Speaker 1: Ah, God. Next month. I'll tell you what, I don't think... 609 00:33:15,120 --> 00:33:17,040 Speaker 1: I don't think you want to raw dog in general. 610 00:33:17,080 --> 00:33:22,320 Speaker 1: But no, it doesn't matter. Dirty on Harps. Fucking get 611 00:33:22,360 --> 00:33:27,880 Speaker 1: your filter and click it back on, Patrick, and go. 612 00:33:28,640 --> 00:33:30,200 Speaker 2: I feel like I need to apologise. 613 00:33:30,400 --> 00:33:35,000 Speaker 1: Can you edit out everything up until now, please? Hi, everyone, 614 00:33:35,080 --> 00:33:38,240 Speaker 1: welcome to the show. Patrick, great to see you. 615 00:33:38,480 --> 00:33:40,200 Speaker 2: I don't think you should edit any of this out. 616 00:33:40,320 --> 00:33:42,720 Speaker 2: Just make them suffer through it. No, okay.
So raw 617 00:33:42,800 --> 00:33:45,440 Speaker 2: dogging is a new term where someone gets onto a 618 00:33:45,480 --> 00:33:50,200 Speaker 2: plane and they go into an almost trance 619 00:33:50,480 --> 00:33:53,600 Speaker 2: like state and they do nothing. They don't eat, they 620 00:33:53,600 --> 00:33:56,840 Speaker 2: don't drink, they don't read anything, they don't listen to anything, 621 00:33:56,840 --> 00:33:59,800 Speaker 2: they don't watch anything. It's become this trendy thing. A 622 00:33:59,800 --> 00:34:03,280 Speaker 2: guy by the name of Damien Bailey, who's a 623 00:34:03,280 --> 00:34:07,400 Speaker 2: big Insta person, reckons he achieved his personal best: a 624 00:34:07,440 --> 00:34:10,480 Speaker 2: thirteen and a half hour flight between Shanghai and Dallas 625 00:34:10,840 --> 00:34:15,640 Speaker 2: without any in-flight entertainment, films, books, or music. So it's 626 00:34:15,880 --> 00:34:18,360 Speaker 2: generally blokes. Of course it's going to be blokes who 627 00:34:18,400 --> 00:34:21,280 Speaker 2: do this, and they're the alpha male types. So the 628 00:34:21,400 --> 00:34:22,400 Speaker 2: three of us, it's going to 629 00:34:22,320 --> 00:34:26,480 Speaker 1: be you. I reckon Tiff's more of an alpha male 630 00:34:26,560 --> 00:34:29,800 Speaker 1: than both of us. But also, like, there's all these 631 00:34:29,880 --> 00:34:33,160 Speaker 1: monks that have been living in caves, just living on 632 00:34:33,400 --> 00:34:37,200 Speaker 1: fucking oxygen for thirty years, who are like, ah, so 633 00:34:37,320 --> 00:34:40,880 Speaker 2: what? They're not on a plane at a high altitude 634 00:34:41,080 --> 00:34:44,520 Speaker 2: with potential DVT risk. 635 00:33:44,960 --> 00:33:50,440 Speaker 1: So wow. So what is... what's the outcome, 636 00:33:50,520 --> 00:33:54,160 Speaker 1: or what's the intended outcome?
Like, just that they're going 637 00:33:54,240 --> 00:34:01,200 Speaker 1: through this, like, this momentary kind of scarcity of technology 638 00:34:01,240 --> 00:34:04,240 Speaker 1: or food or water or stimulation or something? 639 00:34:04,960 --> 00:34:07,920 Speaker 2: It's a challenge. I guess they're challenging themselves to be 640 00:34:08,080 --> 00:34:12,960 Speaker 2: in the moment, to focus for, you know, that particular duration. 641 00:34:13,400 --> 00:34:17,160 Speaker 2: And one guy did a fifteen hour flight to Melbourne. 642 00:34:17,280 --> 00:34:20,800 Speaker 2: I mean, that's crazy. An Australian music producer by the 643 00:34:20,880 --> 00:34:23,560 Speaker 2: name of Torren Foot. Yes, that is a name. I 644 00:34:23,640 --> 00:34:26,359 Speaker 1: have heard of this. I didn't think it was called 645 00:34:26,440 --> 00:34:30,080 Speaker 1: raw dogging. But also, hashtag don't pick the middle seat, 646 00:34:30,440 --> 00:34:32,799 Speaker 1: or there's going to be no fucking raw dogging. You're 647 00:34:32,840 --> 00:34:36,000 Speaker 1: going to be constantly shuffling your knees sideways while 648 00:34:36,000 --> 00:34:38,480 Speaker 1: Brian with the tiny bladder is stepping over the top 649 00:34:38,520 --> 00:34:38,680 Speaker 1: of you. 650 00:34:39,320 --> 00:34:42,000 Speaker 2: Yeah. Well, what was interesting, though, is people have been 651 00:34:42,000 --> 00:34:47,280 Speaker 2: posting responses to it, and one medical expert said, they're idiots. 652 00:34:47,640 --> 00:34:50,759 Speaker 2: That was the quote: they're idiots. Well, because, you know, 653 00:34:50,840 --> 00:34:53,040 Speaker 2: it's not a good thing. Some people are going without 654 00:34:53,040 --> 00:34:55,080 Speaker 2: food and water as well, and of course, when you're 655 00:34:55,080 --> 00:34:57,440 Speaker 2: on a long haul flight, they say hydration is important.
656 00:34:57,840 --> 00:34:59,919 Speaker 2: And if you're not hydrated and you don't move around, 657 00:35:00,120 --> 00:35:03,160 Speaker 2: you do increase some really serious health risks. 658 00:35:03,400 --> 00:35:06,320 Speaker 3: You know... well, if you're getting dehydrated, your blood's getting 659 00:35:06,360 --> 00:35:10,400 Speaker 3: thicker because you're sweating, and obviously blah blah blah. So 660 00:35:10,600 --> 00:35:12,600 Speaker 3: I would think... well, I would think, I 661 00:35:12,600 --> 00:35:17,239 Speaker 3: could almost guarantee, the risk of deep vein thrombosis, as 662 00:35:17,280 --> 00:35:20,240 Speaker 3: you alluded to, Patrick, or stroke or heart attack, because 663 00:35:20,560 --> 00:35:24,120 Speaker 3: the higher the level of viscosity of your blood, or 664 00:35:24,160 --> 00:35:25,960 Speaker 3: in other words, the thicker it is and the less 665 00:35:26,040 --> 00:35:28,040 Speaker 3: volume you have, the more 666 00:35:28,040 --> 00:35:30,239 Speaker 1: at risk you are. So I would think, at the very least, 667 00:35:30,320 --> 00:35:32,600 Speaker 1: drink some water. Do everything else, but drink some water. 668 00:35:33,640 --> 00:35:37,160 Speaker 2: It's interesting that a business psychologist says, oh, it's probably 669 00:35:37,160 --> 00:35:40,480 Speaker 2: a good thing, you know, people can be quietly reflecting. 670 00:35:40,840 --> 00:35:43,080 Speaker 2: But it's the medical people I think I would be 671 00:35:43,120 --> 00:35:45,000 Speaker 2: listening to, not the business psychologists. 672 00:35:45,040 --> 00:35:48,080 Speaker 1: I also think... I don't mean... that's not poo-pooing your 673 00:35:48,120 --> 00:35:50,719 Speaker 1: story, because it's quirky and it's interesting, which is what 674 00:35:50,800 --> 00:35:53,600 Speaker 1: we like. But how the fuck is this news? Like, 675 00:35:53,680 --> 00:35:56,040 Speaker 1: how's somebody sitting on a plane not drinking?
676 00:35:56,160 --> 00:35:56,239 Speaker 5: Like? 677 00:35:56,320 --> 00:35:58,400 Speaker 1: How is that news, with all the shit that's going on 678 00:35:58,480 --> 00:36:00,480 Speaker 1: in the world? Well, guess what. 679 00:36:01,120 --> 00:36:05,080 Speaker 2: These are famous TikTokers and famous people, all stars, and 680 00:36:05,200 --> 00:36:08,760 Speaker 2: young people will copy them. That's the problem, because people 681 00:36:08,840 --> 00:36:11,439 Speaker 2: copy them. It's like planking. Remember when planking was really 682 00:36:11,480 --> 00:36:15,960 Speaker 2: big and people fell off buildings and stuff? It happens. So, yeah, it 683 00:36:16,080 --> 00:36:18,400 Speaker 2: influences people to do stupid things. 684 00:36:19,160 --> 00:36:19,480 Speaker 3: I know. 685 00:36:20,320 --> 00:36:25,160 Speaker 2: I know in Russia they have signs on train crossings 686 00:36:25,520 --> 00:36:28,640 Speaker 2: telling people not to take selfies on the train tracks, 687 00:36:28,680 --> 00:36:32,360 Speaker 2: because people get run over by trains taking selfies. 688 00:36:33,520 --> 00:36:35,759 Speaker 1: Yeah, you would think if you were that young, your 689 00:36:35,840 --> 00:36:39,160 Speaker 1: ears would be better than that. You would think that 690 00:36:39,200 --> 00:36:41,000 Speaker 1: you should hear that shit coming. 691 00:36:41,520 --> 00:36:45,640 Speaker 2: You would. They'd probably have, you know, headphones on 692 00:36:45,719 --> 00:36:51,759 Speaker 2: or something. All right, plow on, plow on. I saw 693 00:36:51,800 --> 00:36:54,960 Speaker 2: a really interesting article in The Verge. I 694 00:36:55,080 --> 00:36:58,160 Speaker 2: listen to The Verge, watch The Verge, read The Verge. Anyway, 695 00:36:58,400 --> 00:37:01,640 Speaker 2: Allison Johnson, I quite like her stuff. She was talking 696 00:37:01,640 --> 00:37:04,720 Speaker 2: about a new feature on the Google Pixel nine phone.
697 00:37:04,719 --> 00:37:07,440 Speaker 2: It's just about to come out. So there has been 698 00:37:07,480 --> 00:37:09,279 Speaker 2: a feature. And this is why I sent you some 699 00:37:09,280 --> 00:37:12,399 Speaker 2: photos this morning. You're probably wondering why I sent them. 700 00:37:12,400 --> 00:37:15,200 Speaker 1: Oh yeah, yeah, yeah. I did see, like, nature photos, which were quite pretty. 701 00:37:15,520 --> 00:37:17,800 Speaker 2: Yeah. I took them yesterday, in the last twenty-four hours. 702 00:37:17,840 --> 00:37:19,840 Speaker 2: I just took a selection of photos whilst I was 703 00:37:19,880 --> 00:37:23,600 Speaker 2: out walking. At one point I drove to Bacchus Marsh and 704 00:37:23,800 --> 00:37:27,520 Speaker 2: I stopped outside a canola field and took some really 705 00:37:27,560 --> 00:37:30,040 Speaker 2: gorgeous photos of the canola. We all know those 706 00:37:30,040 --> 00:37:33,440 Speaker 2: big fields of canola with that glowy, yellow 707 00:37:33,880 --> 00:37:36,040 Speaker 2: kind of glow to them when the sun's setting. But 708 00:37:37,400 --> 00:37:39,560 Speaker 2: Google's Pixel phones and a lot of the new phones 709 00:37:39,600 --> 00:37:43,440 Speaker 2: now have things called magic eraser tools. But the next 710 00:37:43,520 --> 00:37:47,799 Speaker 2: level up from that is now you can place objects 711 00:37:47,840 --> 00:37:49,839 Speaker 2: into the scene. So not only can you take people 712 00:37:49,880 --> 00:37:52,520 Speaker 2: out of the scene, but one of the photos I 713 00:37:52,560 --> 00:37:55,279 Speaker 2: sent you was the canola shot with a shadow of me, 714 00:37:55,560 --> 00:37:57,920 Speaker 2: and then a photo of the same shot with the 715 00:37:57,960 --> 00:38:01,440 Speaker 2: shadow removed. So you can... you know, I 716 00:38:01,440 --> 00:38:03,640 Speaker 2: didn't like the shadow, but in some ways the shadow 717 00:38:03,719 --> 00:38:04,200 Speaker 2: is not bad.
718 00:38:04,400 --> 00:38:06,680 Speaker 1: That's the concept. Yeah, yeah, yeah, yeah, yeah. 719 00:38:06,719 --> 00:38:08,680 Speaker 2: I just used the magic eraser to get rid of 720 00:38:08,680 --> 00:38:10,920 Speaker 2: the shadow. It took me maybe three or four seconds. 721 00:38:11,880 --> 00:38:14,319 Speaker 1: That is amazing, isn't it? So if you've got, like, 722 00:38:14,400 --> 00:38:16,520 Speaker 1: an old photo with a couple of mates, but one 723 00:38:16,560 --> 00:38:18,959 Speaker 1: of them's a fuckwit and you want him out, yep, 724 00:38:19,320 --> 00:38:21,000 Speaker 1: you can get rid of the fuckwit and just 725 00:38:21,040 --> 00:38:23,040 Speaker 1: have you and your mates and go, this was us 726 00:38:23,080 --> 00:38:25,560 Speaker 1: in two thousand and six at the races. 727 00:38:25,800 --> 00:38:28,480 Speaker 2: But you're changing history, getting rid of Jason, eh? 728 00:38:28,840 --> 00:38:31,680 Speaker 1: History remains the same, it's just the record of it 729 00:38:31,680 --> 00:38:32,400 Speaker 1: that changes. 730 00:38:32,680 --> 00:38:36,440 Speaker 2: Well, the thing is, the concern is, when we're introducing 731 00:38:36,480 --> 00:38:41,200 Speaker 2: these tools, what does it mean for the art of photography? 732 00:38:41,440 --> 00:38:43,640 Speaker 2: If I said to you... if I, if I said 733 00:38:43,800 --> 00:38:48,800 Speaker 2: Tiananmen Square, think of an image related to conflict 734 00:38:48,840 --> 00:38:50,239 Speaker 2: and Tiananmen Square. 735 00:38:50,000 --> 00:38:52,680 Speaker 1: Yeah, well, we all think of that young... that poor 736 00:38:52,719 --> 00:38:55,680 Speaker 1: young girl running in front of the tank, which was 737 00:38:55,840 --> 00:38:58,520 Speaker 1: fucking one of the most tragic... and it was a guy. 738 00:38:58,560 --> 00:39:01,279 Speaker 2: But yeah, that's... was it a boy? A guy? Yep, 739 00:39:01,440 --> 00:39:03,319 Speaker 2: a student standing in front of the tank.
740 00:39:03,440 --> 00:39:06,239 Speaker 1: Resisting? No, no, no, no, there's another photo of a 741 00:39:06,280 --> 00:39:07,960 Speaker 1: young naked child 742 00:39:07,840 --> 00:39:10,640 Speaker 2: running. And yes, that's the napalm girl. That's from Vietnam. 743 00:39:10,760 --> 00:39:12,520 Speaker 1: Yes, oh, that's Vietnam, is it? 744 00:39:12,560 --> 00:39:15,680 Speaker 2: But again, there's two iconic images. One is the student 745 00:39:15,760 --> 00:39:17,400 Speaker 2: standing in front of the tank in Tiananmen 746 00:39:17,440 --> 00:39:20,759 Speaker 2: Square, one's the naked girl running away from napalm as 747 00:39:20,800 --> 00:39:23,560 Speaker 2: an explosion goes off in the background. You know, the 748 00:39:23,560 --> 00:39:27,560 Speaker 2: American flag being lifted, you know, the soldiers holding the flag. 749 00:39:27,880 --> 00:39:31,400 Speaker 2: So when we think of these snapshots of history and 750 00:39:31,440 --> 00:39:34,080 Speaker 2: then we can so easily change them, what does it 751 00:39:34,160 --> 00:39:36,960 Speaker 2: do to our perception of what the past is, when 752 00:39:37,000 --> 00:39:39,080 Speaker 2: you can change it so readily and make it look 753 00:39:39,200 --> 00:39:43,719 Speaker 2: so real? And that's the concern now, that it's so 754 00:39:43,880 --> 00:39:47,839 Speaker 2: easy using these AI tools to change what we would 755 00:39:47,880 --> 00:39:50,799 Speaker 2: think of as a snapshot in time, as reality. And 756 00:39:50,840 --> 00:39:53,200 Speaker 2: so people are concerned about this, and I was just 757 00:39:53,280 --> 00:39:56,319 Speaker 2: interested in this article. It was, you know, because you 758 00:39:56,360 --> 00:40:00,239 Speaker 2: can now take a standard streetscape and you could put 759 00:40:00,239 --> 00:40:03,520 Speaker 2: a smoking wreck of a burnt-out car there.
You know, 760 00:40:04,520 --> 00:40:06,960 Speaker 2: that's the concern, that you could then post it as 761 00:40:06,960 --> 00:40:10,719 Speaker 2: fake news. There was an article recently, I think it 762 00:40:10,840 --> 00:40:13,479 Speaker 2: was during the American campaign that's on at the moment, 763 00:40:13,480 --> 00:40:15,680 Speaker 2: it might have been related to Biden. I think that 764 00:40:16,360 --> 00:40:21,400 Speaker 2: an American telco was fined a million dollars for running 765 00:40:21,440 --> 00:40:25,480 Speaker 2: an ad where they had a depiction of Joe Biden, 766 00:40:25,719 --> 00:40:28,080 Speaker 2: and it was not actually him and he wasn't endorsing 767 00:40:28,080 --> 00:40:30,919 Speaker 2: the product. So it's so easy to do that now. 768 00:40:31,200 --> 00:40:33,480 Speaker 1: But this is everywhere now. I mean, this is every... 769 00:40:33,640 --> 00:40:37,880 Speaker 1: I mean, Joe Rogan, his voice, because there's so 770 00:40:38,080 --> 00:40:40,719 Speaker 1: much of his face and his image, his body, 771 00:40:41,480 --> 00:40:45,919 Speaker 1: you know, video, audio, he's been used, like, constantly by 772 00:40:46,000 --> 00:40:53,520 Speaker 1: companies making ads. Whoa, Tiff's just made something... okay, 773 00:40:54,120 --> 00:40:55,320 Speaker 1: that's funny, Tiff. 774 00:40:56,960 --> 00:40:58,960 Speaker 2: She just did a photo of you and I on 775 00:40:59,160 --> 00:41:01,640 Speaker 2: screen with the photograph I took yesterday. 776 00:41:02,080 --> 00:41:05,400 Speaker 4: So I just put the profile picture from both of 777 00:41:05,440 --> 00:41:08,440 Speaker 4: your Facebooks onto the photo you were describing to the 778 00:41:08,480 --> 00:41:11,520 Speaker 4: audience, like you were there together at sunset. 779 00:41:11,160 --> 00:41:13,080 Speaker 1: Can you send that to me right now?
Which is 780 00:41:13,120 --> 00:41:16,160 Speaker 1: not good for the audience, or for me. But yeah, I'm 781 00:41:16,200 --> 00:41:18,839 Speaker 1: with you, Patrick. But this is the door that we've opened, right? 782 00:41:19,000 --> 00:41:22,040 Speaker 1: I mean, now... It doesn't change history, 783 00:41:22,080 --> 00:41:25,840 Speaker 1: because history is history, but it changes what we think 784 00:41:26,080 --> 00:41:29,560 Speaker 1: perhaps history was, you know. And I guess this is 785 00:41:29,600 --> 00:41:35,880 Speaker 1: about, you know... it's like, what was it? 786 00:41:35,920 --> 00:41:38,400 Speaker 1: If you go into ChatGPT and you say, what 787 00:41:38,640 --> 00:41:42,160 Speaker 1: happened on, whatever the day Donald Trump got shot... we've 788 00:41:42,160 --> 00:41:44,640 Speaker 1: spoken about this once, but what happened on this date 789 00:41:44,719 --> 00:41:51,080 Speaker 1: in Pennsylvania, or wherever it was? And it says there 790 00:41:51,160 --> 00:41:55,279 Speaker 1: was an event, a former... there was a shooting, a 791 00:41:55,320 --> 00:41:59,440 Speaker 1: former president was injured, but no one else was hurt. 792 00:42:00,000 --> 00:42:03,880 Speaker 1: But it doesn't say that, you know, a 793 00:42:03,920 --> 00:42:08,080 Speaker 1: fireman was killed, other people were injured, and Donald Trump 794 00:42:08,120 --> 00:42:11,560 Speaker 1: got shot. It doesn't, because, for whatever reason, it's 795 00:42:11,600 --> 00:42:15,840 Speaker 1: been programmed to be, you know, pro certain things and 796 00:42:15,880 --> 00:42:19,800 Speaker 1: anti other things.
So getting absolutely objective data, whether or 797 00:42:19,840 --> 00:42:23,000 Speaker 1: not we like that or don't like that or agree 798 00:42:23,040 --> 00:42:27,000 Speaker 1: with that... this is becoming a slipperier and slipperier landscape, 799 00:42:27,040 --> 00:42:31,000 Speaker 1: because we don't know what is real, because fiction is 800 00:42:31,040 --> 00:42:35,560 Speaker 1: being presented as fact from a range of kind of directions. 801 00:42:35,640 --> 00:42:39,400 Speaker 1: And yeah, it's hard. It's hard when you don't know 802 00:42:39,560 --> 00:42:42,359 Speaker 1: the truth and then you look something up. Like, I'm 803 00:42:42,560 --> 00:42:46,359 Speaker 1: constantly sent things by people, like thirty times a day. 804 00:42:47,160 --> 00:42:49,359 Speaker 1: Ten of them are usually crap, by the way. 805 00:42:50,120 --> 00:42:54,239 Speaker 1: But people sending me research, in inverted commas, and some 806 00:42:54,320 --> 00:42:56,400 Speaker 1: of them, I'm like, fuck, that's amazing. Then I go and 807 00:42:56,440 --> 00:42:59,560 Speaker 1: I look for the paper and the paper doesn't exist. 808 00:42:59,600 --> 00:43:03,120 Speaker 1: That is not real research, you know. Like, 809 00:43:03,160 --> 00:43:06,280 Speaker 1: this paper, this alleged paper, it's not real. It's a fraud. 810 00:43:06,320 --> 00:43:09,279 Speaker 1: It doesn't actually exist. But you can make things look like 811 00:43:09,440 --> 00:43:13,680 Speaker 1: science really easily now, and you can make things look 812 00:43:13,840 --> 00:43:17,040 Speaker 1: like historical events, as you're alluding to. What do you 813 00:43:17,080 --> 00:43:20,759 Speaker 1: think... what do you think that's going to look 814 00:43:20,920 --> 00:43:23,080 Speaker 1: like in five or ten years? Do you think we're 815 00:43:23,160 --> 00:43:25,120 Speaker 1: just not going to know what is real and what 816 00:43:25,239 --> 00:43:25,480 Speaker 1: is not?
817 00:43:26,600 --> 00:43:28,920 Speaker 2: It's a frightening thought. I think that's a really good 818 00:43:29,000 --> 00:43:31,040 Speaker 2: question that we all should be asking at the moment. 819 00:43:31,320 --> 00:43:33,920 Speaker 2: You know, how do we spot a fake? How do 820 00:43:33,960 --> 00:43:36,640 Speaker 2: we spot a deepfake? How do we spot fake news? 821 00:43:37,120 --> 00:43:38,759 Speaker 2: I don't know that there's an answer at the moment, 822 00:43:38,800 --> 00:43:43,279 Speaker 2: because technology is leapfrogging legislation. You know, technology is 823 00:43:43,320 --> 00:43:46,160 Speaker 2: out there. Google's putting out their new phone, and it's 824 00:43:46,320 --> 00:43:49,800 Speaker 2: turned its magic eraser button into "reimagine", 825 00:43:50,160 --> 00:43:53,920 Speaker 2: so you reimagine the scene, so that you can place objects 826 00:43:54,000 --> 00:43:57,239 Speaker 2: in there. So suddenly, you know, someone meditating now 827 00:43:57,280 --> 00:44:00,400 Speaker 2: has a bomb next to them. You know... yeah, hilarious, 828 00:44:01,040 --> 00:44:01,600 Speaker 2: you know what I mean. 829 00:44:01,800 --> 00:44:04,560 Speaker 1: You could, yeah, yeah, yeah, yeah. 830 00:44:04,160 --> 00:44:06,760 Speaker 2: But it's seamless. You know, what would have taken somebody 831 00:44:06,800 --> 00:44:09,640 Speaker 2: who was really adept at Photoshop... Well, that 832 00:44:09,680 --> 00:44:12,440 Speaker 2: photograph that I sent you of the canola field, there 833 00:44:12,520 --> 00:44:14,160 Speaker 2: was grass on the side of the road. It was 834 00:44:14,280 --> 00:44:17,680 Speaker 2: dried yellow grass.
But if I was to have Photoshopped that, 835 00:44:18,000 --> 00:44:22,120 Speaker 2: I would have meticulously made sure the grass, the movement 836 00:44:22,120 --> 00:44:24,200 Speaker 2: of the grass, was matched in the area that I was 837 00:44:24,239 --> 00:44:27,359 Speaker 2: trying to remove the shadow from. But with that AI tool, 838 00:44:27,400 --> 00:44:30,080 Speaker 2: it did it in seconds. Yes, it did, in a 839 00:44:30,120 --> 00:44:33,840 Speaker 2: lot of ways. But, you know, and then again, the 840 00:44:33,880 --> 00:44:37,360 Speaker 2: other thing is, I prefer the photo with my shadow 841 00:44:37,400 --> 00:44:40,360 Speaker 2: in it, because it shows a shadow of me holding 842 00:44:40,400 --> 00:44:43,480 Speaker 2: a camera over the canola field. So the shadow 843 00:44:43,520 --> 00:44:45,839 Speaker 2: says a lot about what time of day it was, 844 00:44:46,200 --> 00:44:48,160 Speaker 2: where I was. I was by the side of the 845 00:44:48,239 --> 00:44:51,520 Speaker 2: road taking a photo over a fence at a really 846 00:44:51,560 --> 00:44:55,080 Speaker 2: beautiful scene. So I've lost something in that image: by 847 00:44:55,560 --> 00:44:58,400 Speaker 2: removing the shadow of me, I've removed me from the image. 848 00:44:58,600 --> 00:45:01,440 Speaker 1: Now, I watched an episode the other night of Insight. 849 00:45:02,400 --> 00:45:04,280 Speaker 1: Do you watch that? I feel like you'd be interested 850 00:45:04,280 --> 00:45:07,720 Speaker 1: in that.
Anyway, they were talking about the education system, 851 00:45:07,719 --> 00:45:10,239 Speaker 1: and they had students and they had professors and they 852 00:45:10,280 --> 00:45:12,520 Speaker 1: had lecturers, and they were just talking about the good and 853 00:45:12,560 --> 00:45:19,680 Speaker 1: bad. And they had, like, current undergrad students and postgrad students, 854 00:45:19,760 --> 00:45:24,560 Speaker 1: like, young people, and they asked about people cheating using 855 00:45:24,680 --> 00:45:29,080 Speaker 1: ChatGPT, and they all said yes. And one girl, 856 00:45:29,120 --> 00:45:32,200 Speaker 1: who wasn't identified, as in they didn't show her face, 857 00:45:32,239 --> 00:45:35,320 Speaker 1: you could just see her from the back, she's like, essentially, 858 00:45:35,360 --> 00:45:39,040 Speaker 1: I'm doing my degree using ChatGPT. We're talking about 859 00:45:39,120 --> 00:45:44,440 Speaker 1: undergrad, but still. It's like... I think that 860 00:45:45,080 --> 00:45:47,319 Speaker 1: one of the things that's going to happen, unless we 861 00:45:47,960 --> 00:45:50,440 Speaker 1: figure out a solution or an antidote, is that the 862 00:45:50,560 --> 00:45:57,400 Speaker 1: value of some tertiary qualifications is going to be severely downgraded, 863 00:45:58,200 --> 00:46:03,800 Speaker 1: because, you know, people are getting qualifications that they didn't earn. 864 00:46:04,800 --> 00:46:08,600 Speaker 1: And that's, you know... as the technology gets better, 865 00:46:08,640 --> 00:46:12,319 Speaker 1: it's harder and harder to figure out who wrote or 866 00:46:12,400 --> 00:46:18,440 Speaker 1: didn't write what, you know. Yeah, it's a very slippery 867 00:46:18,480 --> 00:46:21,080 Speaker 1: slope that we're on, and I don't know that...
You know, 868 00:46:21,160 --> 00:46:23,360 Speaker 1: there are pros and cons, but I reckon the cons 869 00:46:23,360 --> 00:46:27,560 Speaker 1: are, you know, as significant and numerous as 870 00:46:27,880 --> 00:46:28,680 Speaker 1: the pros. 871 00:46:29,520 --> 00:46:32,880 Speaker 2: It is a challenge for educators, and also, I guess, 872 00:46:32,920 --> 00:46:38,839 Speaker 2: for employers, if someone's presenting a résumé with documentation 873 00:46:39,000 --> 00:46:41,160 Speaker 2: to say that they've passed a degree or whatever, or 874 00:46:41,200 --> 00:46:45,960 Speaker 2: if someone's standing above a table with a scalpel in their hand. Yeah. 875 00:46:45,960 --> 00:46:48,920 Speaker 1: Think about that. Fuck that. Oh yeah, but there's a 876 00:46:48,920 --> 00:46:52,320 Speaker 1: lot of... yeah, you would. Yeah. Like, a few people 877 00:46:52,360 --> 00:46:54,440 Speaker 1: have said, do I use ChatGPT? And I 878 00:46:54,480 --> 00:46:56,880 Speaker 1: really don't. There's a few things where I could maybe go, 879 00:46:57,000 --> 00:46:59,719 Speaker 1: here's my intro, could you rewrite it? But because 880 00:46:59,880 --> 00:47:03,560 Speaker 1: my research is original, it has to be, you know. 881 00:47:03,680 --> 00:47:07,160 Speaker 1: But I think there are certain things. But I think also, 882 00:47:07,320 --> 00:47:09,879 Speaker 1: you know, somebody who's writing an application for a job, 883 00:47:10,360 --> 00:47:12,120 Speaker 1: you can write it and then throw it into 884 00:47:12,160 --> 00:47:15,400 Speaker 1: ChatGPT and it'll churn out a slightly more polished version. 885 00:47:15,880 --> 00:47:19,680 Speaker 1: I think that's a great application, you know.
And... and 886 00:47:19,719 --> 00:47:24,520 Speaker 1: the sad thing is that some jobs, like copywriting, are 887 00:47:24,560 --> 00:47:29,000 Speaker 1: probably, to an extent, going to become redundant, because 888 00:47:29,280 --> 00:47:33,839 Speaker 1: ChatGPT can do as good or better writing than 889 00:47:34,200 --> 00:47:35,080 Speaker 1: the average punter. 890 00:47:35,840 --> 00:47:37,760 Speaker 2: What about... have you ever used Grammarly? 891 00:47:40,040 --> 00:47:41,800 Speaker 1: No, but I know, I know what it is. 892 00:47:41,880 --> 00:47:44,160 Speaker 2: Yeah, yeah, yeah. So what it does... it's not just 893 00:47:44,200 --> 00:47:47,040 Speaker 2: a spell checker, it's more of a phraseology checker. So 894 00:47:47,080 --> 00:47:49,279 Speaker 2: if you write a sentence, it will say, well, this 895 00:47:49,400 --> 00:47:51,640 Speaker 2: is not the best grammar, you know, why don't you 896 00:47:51,719 --> 00:47:55,160 Speaker 2: rewrite it a certain way? And I mean, in a 897 00:47:55,160 --> 00:47:58,520 Speaker 2: lot of ways, then, does that then stop or cease 898 00:47:58,680 --> 00:48:03,279 Speaker 2: being your original work? Because making mistakes is part of 899 00:48:03,320 --> 00:48:04,480 Speaker 2: the learning process. 900 00:48:04,960 --> 00:48:08,080 Speaker 1: Yeah, yes and no. I mean, with something like Grammarly, 901 00:48:08,080 --> 00:48:12,319 Speaker 1: where it's just making something more grammatically correct or 902 00:48:12,360 --> 00:48:16,080 Speaker 1: making something perhaps flow a little better, but it's essentially your 903 00:48:16,120 --> 00:48:19,560 Speaker 1: words and ideas, I think that's a valuable tool. But 904 00:48:19,680 --> 00:48:23,640 Speaker 1: you know what, in twenty years, like, who knows? Who knows 905 00:48:23,680 --> 00:48:27,520 Speaker 1: if people... like, I doubt we're going to be writing 906 00:48:27,600 --> 00:48:30,000 Speaker 1: on paper with pens in twenty years.
907 00:48:29,760 --> 00:48:32,359 Speaker 2: I doubt that too. Hey, I will be. I use 908 00:48:32,400 --> 00:48:33,080 Speaker 2: a fountain pen. 909 00:48:33,480 --> 00:48:36,640 Speaker 1: I hope so, but I don't think the majority will. In fact, 910 00:48:36,719 --> 00:48:38,759 Speaker 1: I don't think we're going to be typing, because 911 00:48:38,760 --> 00:48:41,440 Speaker 1: it's all going to be audio. What 912 00:48:41,440 --> 00:48:43,279 Speaker 1: do you call it, where you just talk and it 913 00:48:43,320 --> 00:48:46,560 Speaker 1: types as you go? Dictation? Like those... no, you know, 914 00:48:46,600 --> 00:48:48,840 Speaker 1: where you can talk into your phone and it just types. 915 00:48:48,960 --> 00:48:51,680 Speaker 1: Is that... isn't there a more fancy name than 916 00:48:52,480 --> 00:48:55,920 Speaker 1: that? Yeah, whatever. You know where... and that's becoming so 917 00:48:56,120 --> 00:48:58,279 Speaker 1: good that you think, why the fuck would I be 918 00:48:58,320 --> 00:49:00,600 Speaker 1: punching all these keys when I can talk and it 919 00:49:00,640 --> 00:49:01,360 Speaker 1: types for me? 920 00:49:01,760 --> 00:49:04,680 Speaker 2: I want thought-to-text. I don't even want to talk. 921 00:49:07,160 --> 00:49:09,799 Speaker 1: Well, you should hook up with Elon Musk, I think. 922 00:49:09,880 --> 00:49:13,040 Speaker 1: Get yourself a Neuralink. We've spoken about that. 923 00:49:13,120 --> 00:49:16,440 Speaker 1: All right, let's do one more, champion. Let's bring home 924 00:49:16,520 --> 00:49:19,680 Speaker 1: this fucking train wreck of an episode with a little 925 00:49:19,680 --> 00:49:20,560 Speaker 1: bit of gusto. 926 00:49:21,160 --> 00:49:24,080 Speaker 2: Oh really? I was going to give some... what are we talking 927 00:49:23,840 --> 00:49:25,839 Speaker 1: about? You know, whatever you want, whatever you want.
928 00:49:25,920 --> 00:49:29,080 Speaker 2: I was just thinking of some positive things about AI, 929 00:49:29,400 --> 00:49:34,080 Speaker 2: and a feature that's built into Google Maps. Particularly in 930 00:49:34,080 --> 00:49:36,880 Speaker 2: the States, people are getting worried that those Google Street 931 00:49:36,960 --> 00:49:39,920 Speaker 2: View photos are how criminals can case out your house, 932 00:49:40,360 --> 00:49:43,120 Speaker 2: to see if you've got cameras, or to see whether you're 933 00:49:43,160 --> 00:49:45,359 Speaker 2: living in a nice suburb, and they can look at your 934 00:49:45,400 --> 00:49:46,960 Speaker 2: backyard and all that sort of stuff. But you can 935 00:49:47,000 --> 00:49:51,880 Speaker 2: actually request now to have your house blurred in Google Maps. 936 00:49:51,960 --> 00:49:55,600 Speaker 2: There is one caveat: once it's blurred, it's blurred for good. 937 00:49:55,800 --> 00:49:57,560 Speaker 2: So what you do is you just basically go to 938 00:49:57,600 --> 00:50:00,680 Speaker 2: your property. You've got to prove that it's yours. You 939 00:50:00,719 --> 00:50:03,840 Speaker 2: know, I can't arbitrarily choose your place and suddenly 940 00:50:03,840 --> 00:50:07,360 Speaker 2: blur it out. But basically you go to Google Maps and 941 00:50:07,440 --> 00:50:10,640 Speaker 2: you select "Report a problem". It's in the upper left 942 00:50:10,840 --> 00:50:12,520 Speaker 2: corner of the screen, and then you've got to answer 943 00:50:12,520 --> 00:50:15,239 Speaker 2: a whole lot of questions and submit the request, and 944 00:50:15,280 --> 00:50:17,399 Speaker 2: then what it will do is it will blur the 945 00:50:17,400 --> 00:50:20,799 Speaker 2: photo of your property.
So if people are concerned about that... 946 00:50:20,880 --> 00:50:23,400 Speaker 2: So that's another really cool feature. Because you've probably noticed in 947 00:50:23,520 --> 00:50:26,319 Speaker 2: Google Maps too that all the number plates are blurred out. 948 00:50:26,440 --> 00:50:28,600 Speaker 2: You know, that's not done by, you know, some person 949 00:50:28,800 --> 00:50:31,839 Speaker 2: sitting there using a magic eraser tool; that's done by 950 00:50:31,920 --> 00:50:34,560 Speaker 2: AI, and it also blurs out faces. You 951 00:50:34,600 --> 00:50:37,560 Speaker 2: don't normally get close-up shots of people's faces in 952 00:50:38,000 --> 00:50:39,600 Speaker 2: that Google mapping software either. 953 00:50:39,840 --> 00:50:42,160 Speaker 1: Well, maybe the guy who lives... I'm not even going 954 00:50:42,239 --> 00:50:44,080 Speaker 1: to say where you live. But the guy who lives 955 00:50:44,120 --> 00:50:47,920 Speaker 1: in rural Victoria and leaves his doors unlocked when he 956 00:50:48,000 --> 00:50:52,640 Speaker 1: goes out, maybe he should rethink that. Yeah, maybe, maybe. 957 00:50:52,840 --> 00:50:55,440 Speaker 1: I don't know. Are there many, are there many crooks 958 00:50:55,440 --> 00:50:58,640 Speaker 1: where you live? I would think it's all tie-dye, 959 00:50:59,000 --> 00:51:01,080 Speaker 1: love and hugs and mung beans out there, isn't it? 960 00:51:01,360 --> 00:51:03,600 Speaker 2: What happens is... we've had some instances of a 961 00:51:03,600 --> 00:51:06,320 Speaker 2: few break-ins recently. To be honest, I probably shouldn't 962 00:51:06,320 --> 00:51:08,680 Speaker 2: be telling people that I leave my door open all 963 00:51:08,719 --> 00:51:09,160 Speaker 2: the time. 964 00:51:09,960 --> 00:51:15,480 Speaker 1: No, no, you'll definitely... you'll definitely start locking it. 965 00:51:15,600 --> 00:51:17,359 Speaker 2: No, I lock it when I... look,
I do lock 966 00:51:17,400 --> 00:51:19,040 Speaker 2: it if I'm going away for any length of time. 967 00:51:19,080 --> 00:51:20,360 Speaker 2: But if I'm just going down the street to the 968 00:51:20,360 --> 00:51:22,279 Speaker 2: post office, I leave the door open. 969 00:51:22,640 --> 00:51:28,000 Speaker 6: Oh wow. Patrick, how do people... what is your home address, 970 00:51:28,040 --> 00:51:30,960 Speaker 6: by the way? And failing that, how can people follow 971 00:51:31,000 --> 00:51:33,279 Speaker 6: you online or connect with you online? 972 00:51:33,520 --> 00:51:36,320 Speaker 2: If you go to the website websitesnow.com.au, 973 00:51:36,400 --> 00:51:38,080 Speaker 2: you can kind of stalk me there. You can see 974 00:51:38,120 --> 00:51:40,200 Speaker 2: what I look like and then decide that you might 975 00:51:40,200 --> 00:51:44,600 Speaker 2: want to call me and have a chat. Wow, wow. Well, actually, 976 00:51:44,640 --> 00:51:46,600 Speaker 2: you know, it is one of the things I really 977 00:51:46,640 --> 00:51:50,280 Speaker 2: love about what I do, working with smaller business people, 978 00:51:50,880 --> 00:51:53,000 Speaker 2: you know, because... 979 00:51:53,000 --> 00:51:56,560 Speaker 1: What about big business people? Like, you know, people over six foot. 980 00:51:56,719 --> 00:52:01,440 Speaker 2: Yeah, they'll be raw dogging, because that's what they do, 981 00:52:01,520 --> 00:52:03,719 Speaker 2: the alpha males like you. 982 00:52:03,719 --> 00:52:08,279 Speaker 1: No, I told you, Tiff's more alpha male than me. 983 00:52:08,400 --> 00:52:09,000 Speaker 1: But yeah, go on.
984 00:52:09,800 --> 00:52:12,759 Speaker 2: I'm just going to say, do you find, though, because 985 00:52:12,800 --> 00:52:14,200 Speaker 2: I know you get a lot of joy... and Tiff, 986 00:52:14,239 --> 00:52:16,600 Speaker 2: you're the same with your PT stuff, because it's one 987 00:52:16,600 --> 00:52:19,600 Speaker 2: on one, as opposed to doing a presentation to a 988 00:52:19,680 --> 00:52:22,600 Speaker 2: room of fifty or one hundred people. There's something really... 989 00:52:23,200 --> 00:52:27,839 Speaker 2: not intimate, but kind of uplifting about being able to 990 00:52:27,880 --> 00:52:31,040 Speaker 2: work with someone one to one. So I think... yeah, anyway, 991 00:52:31,080 --> 00:52:32,480 Speaker 2: but I know I enjoy that. 992 00:52:33,000 --> 00:52:35,680 Speaker 1: I enjoy that. This is a terrible segue, but it's 993 00:52:35,800 --> 00:52:38,799 Speaker 1: kind of in line with that. I just launched a 994 00:52:38,840 --> 00:52:43,160 Speaker 1: new mentorship, which I've never done, for fifteen people, ten weeks. 995 00:52:44,080 --> 00:52:46,040 Speaker 1: And it's just like... because I do a lot of 996 00:52:46,040 --> 00:52:49,400 Speaker 1: stuff with big audiences, and I'm quite excited about working 997 00:52:49,440 --> 00:52:54,440 Speaker 1: with only fifteen people, which, you know, for me, is 998 00:52:54,480 --> 00:52:56,640 Speaker 1: a small group, and it's two and a half months, 999 00:52:56,640 --> 00:52:59,560 Speaker 1: so it's an extended period of time. And I'm quite 1000 00:52:59,560 --> 00:53:02,560 Speaker 1: interested to see how that's going to go, you know, 1001 00:53:02,680 --> 00:53:06,640 Speaker 1: because it gives everyone an opportunity every session to talk 1002 00:53:06,680 --> 00:53:10,439 Speaker 1: and ask questions. You're not just one block of nine 1003 00:53:10,520 --> 00:53:12,960 Speaker 1: hundred blocks on a range of Zoom screens.
1004 00:53:13,640 --> 00:53:16,560 Speaker 2: You know. So, well, they need a five-minute tai 1005 00:53:16,680 --> 00:53:19,040 Speaker 2: chi kind of, you know, workout session. 1006 00:53:19,320 --> 00:53:22,480 Speaker 1: I reckon we should... no, I really do think we 1007 00:53:22,520 --> 00:53:26,520 Speaker 1: should do something. We should do a You Project 1008 00:53:27,239 --> 00:53:29,120 Speaker 1: and Patrick collab. 1009 00:53:31,760 --> 00:53:34,719 Speaker 2: On Zoom? Sounds good. Yeah, you can count me in on that. 1010 00:53:35,320 --> 00:53:39,239 Speaker 1: Hm, hm, hmm. I wonder how... that would have to 1011 00:53:39,280 --> 00:53:42,040 Speaker 1: be live. I guess we could do it live with 1012 00:53:42,080 --> 00:53:44,760 Speaker 1: whoever wants to do it live, record it. We probably 1013 00:53:44,760 --> 00:53:46,920 Speaker 1: should talk about this off the show a bit. Fuck it, 1014 00:53:47,280 --> 00:53:50,239 Speaker 1: let's have a staff meeting. Look at Tiff. What is 1015 00:53:50,280 --> 00:53:52,440 Speaker 1: Tiff looking at? What are you looking at down there? 1016 00:53:54,040 --> 00:53:56,400 Speaker 4: Got cat hair stuck under the bottom of my desk. 1017 00:53:57,040 --> 00:54:00,160 Speaker 1: She's spent ninety percent of the show looking down. I'm 1018 00:54:00,200 --> 00:54:02,000 Speaker 1: like, what are you doing down there that's got 1019 00:54:02,000 --> 00:54:02,560 Speaker 1: your attention? 1020 00:54:04,320 --> 00:54:08,600 Speaker 2: Oh god. Oh wow. It's like the belly button fluff. 1021 00:54:09,520 --> 00:54:14,000 Speaker 1: It's a terrible way to end a podcast. Patrick, just 1022 00:54:14,040 --> 00:54:15,600 Speaker 1: tell people your web address again. 1023 00:54:15,640 --> 00:54:19,719 Speaker 2: It's websitesnow.com.au, because no one 1024 00:54:19,719 --> 00:54:21,239 Speaker 2: can spell Genesis Effects.
1025 00:54:21,600 --> 00:54:23,840 Speaker 1: Yeah. This will be our last episode of this, everyone, 1026 00:54:23,880 --> 00:54:26,760 Speaker 1: so I hope you enjoyed them all. It's been great. 1027 00:54:27,480 --> 00:54:30,080 Speaker 2: Thanks, team. Thanks to the two people who kept listening. 1028 00:54:31,880 --> 00:54:35,840 Speaker 1: Oh god, it's just descending, isn't it? It is what 1029 00:54:36,000 --> 00:54:40,480 Speaker 1: we do. It's devolving, not evolving. It's the opposite of AI. 1030 00:54:41,040 --> 00:54:45,800 Speaker 2: Yeah, it's the antithesis of what is happening.