Speaker 1: G'day, team. Welcome to another installment of the show. It's Patrick James Bonello, Tiffany Anne Cook and Craig Anthony Harper. Let's start with the genius in the bush.

Speaker 2: Hi, Patrick. Oh sorry, I was waiting for you to say Tiff. That would have been the genius in the ring, wouldn't it?

Speaker 1: You're a genius. Thank you. Yeah, you're a genius of sorts. How are you?

Speaker 2: I'm actually doing pretty good. I'm sitting here at the moment with Fritz, and scientifically, Fritz is actually calming me down. He's lowering my blood pressure and he's causing me to be more relaxed.

Speaker 1: Do you know that dogs... well, obviously if you love them. Dogs, cats, animals, of course, yeah, they increase not only lifespan but healthspan as well, which is nice, isn't it?

Speaker 2: It's amazing, fantastic. I feel I should be doing more.

Speaker 1: Although I did see a video online the other day of this crazy woman with a cat in a studio. I can't remember her name, someone Cook, and the cat that she had, it might have been Bear the cat. I'm not sure. I can't remember all the details.

Speaker 2: But a common name for a cat, yeah, yeah.

Speaker 1: Yeah, but this lady was swearing like a motherfucker at this poor cat. It was feline abuse.

Speaker 3: Bear's very thoughtful, and she realizes that, with the cost of living, having such a long life from all these pets in my world is a very big expense. So she's just balancing that out with some high spikes of cortisol every now and then.

Speaker 1: So tell everyone who's thinking, what is he talking about? Tell everybody what happened.

Speaker 3: So, I've got banners behind me in the studio that just sit there twenty-four seven. The studio doubles as her bedroom. She's got a little cat tower here. It's quite the playground.
Speaker 3: But when she knows I'm setting up, obviously, when I put my headphones around my ears, she goes into assault mode and pushes the banner. So I was sitting there, I had to record a little video by myself. So I turn on the Zoom, I put the headphones on my head, and she just runs in, pushes the banner over behind me, and runs straight out the door. And I abuse her, and I think to myself, my neighbors would hear that abuse all the time and think I'm in a volatile relationship.

Speaker 1: It's hilarious. And, or, you're abusing your animals.

Speaker 3: She does it deliberately though. Like, they're so cunning.

Speaker 1: It's deliberate and it's thought through.

Speaker 3: And that banner is in here twenty-four seven. She doesn't touch it unless she wants to irritate me.

Speaker 2: Well, do you reckon...

Speaker 1: She's like me in cat form. Do you reckon she realizes that you're about to stop focusing on her? That you're about to channel your attention into something else? Yes, that's exactly what happens. Can you two talk amongst yourselves for a moment, Patrick, because all your notes that you sent me have disappeared. Oh no, I found them, found them. I'm back. I'm sorry for that. Yeah. Yeah. My Monash University account, which is clearly sick of me, just blocked me from my own thing. But I'm back in. How have you been, Patrick? What's going on? What's news up there?

Speaker 2: Look, it's got cold really quickly. It was four degrees this morning, so I had to stoke up the fireplace, which is actually one of the very cathartic things that I love to do. I've not got central heating, but I've got, I was going to say I've got wood, but I've got, you know, a lot of stuff for the fireplace to be able to keep the house warm. So yeah, that was one of the things I did first thing this morning, because I thought we were recording at seven thirty, so I got up extra early to warm the house and get the fireplace going, get my hot chocolate, and stand by the fire.
I 73 00:04:11,480 --> 00:04:14,080 Speaker 2: was going through the notes, and then I sat down 74 00:04:14,080 --> 00:04:16,920 Speaker 2: and went to the studio, turned the heater on and 75 00:04:16,960 --> 00:04:19,400 Speaker 2: sat here at seven thirty and no one was there. 76 00:04:19,960 --> 00:04:22,599 Speaker 1: And I got a message at seven thirty two, going 77 00:04:22,680 --> 00:04:24,560 Speaker 1: are we doing a podcast? And I was sitting at 78 00:04:24,600 --> 00:04:28,800 Speaker 1: the cafe just knees deep. I was going to say 79 00:04:28,800 --> 00:04:33,039 Speaker 1: something else, knees deep in caffeine. What suits you? It is? 80 00:04:33,120 --> 00:04:35,119 Speaker 1: Seven thirty suit you better? Or nine? 81 00:04:35,320 --> 00:04:39,000 Speaker 2: We probably should have this conversation, but not during the podcast. 82 00:04:39,160 --> 00:04:42,320 Speaker 1: Yeah, let's do it. Let's have a little staff meeting. Now, 83 00:04:42,839 --> 00:04:45,159 Speaker 1: what are you or nine? 84 00:04:45,520 --> 00:04:47,279 Speaker 3: I like nine because I like to be able to 85 00:04:47,320 --> 00:04:50,039 Speaker 3: go and work out first. But you know, I'm just 86 00:04:50,120 --> 00:04:52,440 Speaker 3: here at press buttons. So it's when you two are 87 00:04:52,480 --> 00:04:53,480 Speaker 3: most on fire. 88 00:04:53,920 --> 00:04:55,719 Speaker 2: Yeah, no, that works for me. I can walk for 89 00:04:55,720 --> 00:04:56,440 Speaker 2: its beforehand. 90 00:04:56,520 --> 00:05:02,120 Speaker 1: Then you are the button presser Tiff metaphorically and literally. 91 00:05:02,400 --> 00:05:04,559 Speaker 2: Press push her buttons. 92 00:05:04,640 --> 00:05:11,120 Speaker 1: Yea, even when she's not in the studio. Yeah, I 93 00:05:11,160 --> 00:05:13,560 Speaker 1: love a good open fire. But anyway, all right, let's 94 00:05:13,640 --> 00:05:14,600 Speaker 1: onwards and upwards. 95 00:05:14,600 --> 00:05:17,320 Speaker 2: I think i've spoken to since we went to jail together. 96 00:05:20,520 --> 00:05:23,479 Speaker 1: I saw photos of you guys in jail. Was it fun? Patrick? 97 00:05:23,760 --> 00:05:25,880 Speaker 2: It was awesome. I'd never thought because I didn't know 98 00:05:25,920 --> 00:05:29,800 Speaker 2: they had cowed jails for one thing. It was great fun. 99 00:05:30,279 --> 00:05:32,880 Speaker 2: Took me to jail and I wore an orange jumpsuit. 100 00:05:33,000 --> 00:05:33,400 Speaker 2: Was great. 101 00:05:33,880 --> 00:05:35,200 Speaker 1: Did you drink any booze? 102 00:05:35,600 --> 00:05:38,240 Speaker 2: Oh no, I had mocktails. Oh no, it's not true. 103 00:05:38,240 --> 00:05:41,280 Speaker 2: I did have a shot, but just one, but most 104 00:05:41,279 --> 00:05:42,240 Speaker 2: of it was mocktails. 105 00:05:42,680 --> 00:05:45,600 Speaker 1: How long since you've been or have you ever been drunk? 106 00:05:46,200 --> 00:05:51,080 Speaker 2: Have I ever been drunk? Yes, I have, but decades, 107 00:05:51,279 --> 00:05:52,960 Speaker 2: it's been decades and decades. 108 00:05:53,880 --> 00:05:57,159 Speaker 1: Yeah to how long since you've been drunk, give or take. 109 00:05:58,240 --> 00:05:59,680 Speaker 1: Did you get drunk on your birthday? 110 00:06:00,000 --> 00:06:03,080 Speaker 3: I definitely didn't. I haven't been drunk for a long time. 111 00:06:03,200 --> 00:06:07,000 Speaker 3: But yeah, nothing good happens when you're drunk. We call 112 00:06:07,120 --> 00:06:09,839 Speaker 3: her Francesca and we put her to bed years ago. 
Speaker 1: Who's that, drunk you?

Speaker 3: Yeah, she's a whole different beast. Really, she has no business in any part of my life.

Speaker 1: Tell us about Francesca.

Speaker 3: She's just wild, you know. She comes out of the bamboo. She's everybody's best friend. She's excited. She's like a two-year-old that someone's given a packet of red jelly beans to and then just let loose.

Speaker 2: She is. If I've met you, you don't need alcohol.

Speaker 1: Exactly, exactly. Yeah, I feel a little bit the same. I'm like, imagine me and booze, or me and drugs. Like, I'm a nightmare. Anyway.

Speaker 2: I had somebody take me aside one day and make me promise that I would never take speed. I wouldn't anyway, but they made me promise that, not for any other reason but that they couldn't imagine me being more hyper than I am now, and they felt that it would probably be scary.

Speaker 1: Do you not drink coffee? You don't drink coffee, do you, mate?

Speaker 2: I do. I might have maybe two or three cups a week. I don't drink it every day.

Speaker 1: No? Well, that's a little bit of a stimulant. All right, what are we talking about today? Technology. Let's open that door. Just take us into the wonderland that is your safe space.

Speaker 2: Well, I was just going to say, we started off chatting about how cathartic... I was sitting here patting my dog, and I just figured, I mean, we've all known that having animals around can be cathartic. Well, unless it's a cat that kind of attacks you. But aside from that, there have been some new studies. They got one hundred and twenty-two undergraduates, aged eighteen and older, who were experiencing what they were saying was moderate to high stress. The researchers then collected samples, and we're talking full on, so saliva, fecal matter, you know. This was whole biological everything testing.
Speaker 2: And then what they did was they got these people to interact with dogs, and we're talking a discernible change in their stress levels: even seeing the dog come towards them, finding out the dog's name, and then of course petting the dog, interacting, playing with the dog, throwing a ball, giving the dog a treat. And we're talking massive amounts of change biologically during that interaction. So even just looking at or smiling at the dog, or calling the dog's name, stress levels dropped, they decreased. And again, this was proven and discernible: thirty-three point five percent.

Speaker 1: Wow.

Speaker 2: So that's amazing. So if you've got a dog, or if you've got a friend with one. So, Craigo, you could just go to Tiff's place and give Luna a bit of a pat, maybe.

Speaker 1: I'm very excited about finishing my study, because every third day I'm like, that's it, I'm getting a German shepherd. No, I'm getting a spoodle. I'm getting a whippet. Somebody told me a short-haired something, a vizsla, this morning. Murray, who I see every morning, is like, no, you've got to get a girl, the female vizsla, and they're like a ruddy brown, beautiful. They are so beautiful. So, you know, that's my thing today, but tomorrow I'll be getting a fucking, you know, a wolf or something. I don't know, but yeah, I love dogs. But I think the interesting thing, Patrick, about all this stuff is that not everyone loves dogs. So you bring a dog into a room... there are some cultures that don't embrace dogs as pets as much as other cultures do, if you know what I mean.

Speaker 2: Well, observably, and I have Indian friends, I've noticed that when you get a courier come to the door... and I guess couriers and posties are always apprehensive about dogs. But I've found that certainly Indian people tend to embrace dogs less when they don't know them.
Speaker 2: I guess it's good for everybody to be wary of any dog they don't know, on its home turf particularly. I mean, Fritz might lick you to death, but as a rule, dogs can be territorial and they're guarding the pack, so you've got to be careful. I guess posties and couriers see the worst in dogs too. Mind you, we had a local postie who was friends with every single dog, and he always had a good thing to say, so the dogs loved him; the dogs would be wagging their tails waiting for the postie to come around. So there's, you know, one little stigma that actually isn't, really. It's great to see posties who actually love the dogs. But I'm trying to visualize the dog that I would see with Craigo. And you know, I wouldn't be surprised if you got a chihuahua, because then you could put it in your pocket, and you could ride your motorbike and it could be sitting in a little knapsack around you. Or your bum bag: it would fit in your bum bag that you always have. What do you reckon, Tiff? I reckon a Jack Russell. Oh, a Jack Russell?

Speaker 1: Yeah. Do you know what? I know you're trying to bait me. That's okay, I'm open. I might get a chihuahua or a Jack Russell. I love all dogs. I could put a chihuahua in the pocket of my cargoes and his little head could just pop out the side. That would be... do you know what I want to tell you? I don't know if you know this, and I'm ninety-eight percent sure about this research, so don't exactly quote me, but dogs are, I think, the only animal that produces oxytocin, the love hormone, like humans do, as in when they're around their humans. That could be total bullshit, but I know that they definitely do produce it; whether or not other animals do, I guess... I feel like elephants would. Fuck, I'd love an elephant.

Speaker 2: Dolphins would. What about dolphins?
Speaker 2: You know, dolphins, they probably love you too, wouldn't they?

Speaker 1: No, I don't know. I don't know. But elephants, fuck, elephants are smart. Have you ever seen... I saw this footage, and we'll get onto tech in a moment. This was recently at a zoo. There was an earthquake, might have been California or somewhere, San Diego maybe, and they've got this big elephant enclosure. Now, all these elephants are obviously in captivity. There was an earthquake, and all the elephants got around the little elephants. The big ones all stand in a circle facing outwards, with all the little elephants in the middle, to protect the little elephants, because they don't know what's coming. But who taught them? Who trained them in this, right? And so there's this circle of elephants, like, with their bums in the middle and their faces and trunks out, and all the little ones being protected by all the big ones. And the earthquake started and they all assembled and got into this position within fifteen, twenty seconds. How fucking clever. We think we're smart. Fucking hell, animals are so smart.

Speaker 2: Yeah, I'd be worried though if there was an earthquake and the first thing you did was turn around and shove your bum in my face.

Speaker 1: Well, that would be to protect you, my little elephant. So as long as you keep your trunk to yourself while we're in that position, things will be good.

Speaker 3: Uh...

Speaker 2: Do you remember what the elephant said to the bloke at the nudist colony? Cute, but can you pick up peanuts with it? It's a classic, isn't it?

Speaker 1: No, it's not. Oh, Tiff, that's terrible. No, don't... leave it then. Let him suffer. All right, come on, tell us about technology and science and stuff in the future.

Speaker 2: On the health side of it. Let's stick with the health stuff.
Speaker 2: This is really fascinating, because obviously when you have surgery it's very invasive, and scientists now think they're able to print 3D tissue. They're able to inject a solution into your body and then directly print into your body, with no surgery, using ultrasound waves to form an object. So say you were doing a repair: you don't actually open the body up; you inject this solution in and then use sound waves, at different levels, to 3D print, and you can create a three-dimensional object inside the body. That blew my mind when I saw it. That would be amazing, wouldn't it, to be able to do that? So say you're replacing a joint: you kind of clean out the cartilage, and you could potentially inject into the knee joint and then rebuild sections of it using this 3D printing technology inside the actual joint...

Speaker 1: ...without even making an incision. Yeah. Like, where is stuff going? I saw this thing yesterday. This fourteen-year-old kid, right, I think he was an Indian kid, invented this app that goes onto your phone and monitors your heart, and it can diagnose, I think it's forty cardiac medical conditions, or forty-one or forty-two, with ninety-six percent accuracy, right. And what they were saying is that the average accuracy of medical professionals looking at the same thing is fifty-six percent, and this is like ninety-odd percent. So it's incredibly more accurate, more reliable. And he's just a kid, a fourteen-year-old kid, and I've got this footage of him doing basically a presentation to all these people about how it works. It's incredible. I mean, I don't think we can fully understand or grasp what's coming, and I think it's exponential. I think AI and the myriad of, you know, things that are going to be possible... like, even now, it's ridiculous.
Speaker 1: A friend of mine who's in Singapore messaged me. They're staying at a hotel in Singapore and they had to get some clothes washed. So they rang room service and said, such and such needs some shit washed, and they went, sure. I forget the name, it'll come to me in a moment. And then a moment later her phone rang, or a text came in, saying such and such is at your door. She goes to the door: it's a robot, right. So she opens this thing, puts her stuff in the thing, and the robot takes off, and then the robot comes back like four hours later. I mean... and that's not a trial; that's just what's happening.

Speaker 2: Yeah. Yeah. Incidentally, just staying on this one: this is from the California Institute of Technology, and it's called deep tissue in vivo sound printing, so DISP is what it is. They haven't done it on any human being yet, but they've done it in a rabbit's stomach and a mouse's bladder, and they were able to fabricate without any surgical intervention at all. They were able to fabricate using sound technology and 3D print internally. That's great stuff, isn't it?

Speaker 1: Ah, well, I mean, the main problem with surgery, or one of the main problems, is the trauma that's caused by cutting through all the tissue. So we eliminate that. Wow.

Speaker 2: Yeah, cool. Hey, are you guys readers? I mean, obviously, for what you do, Craigo, you've got to do a lot of reading. I listen to audiobooks, and I do obviously pick up books every now and again as well. But it's interesting what goes on in our brain when we read, and there's more research going into that.
Speaker 2: And it's funny, because, you know, I read every day, and what we're now finding is that with letters, words, sentences and texts, whether you're reading aloud or just silently reading, your brain is doing some amazing stuff when you put it all together, and being lost in a book can just change your brain chemistry. It's amazing. Science really doesn't fully know what goes on in your brain when you transform those letters on a page into words, or when you visualize something. So there are more studies going on now into what actually does happen when we pick up a book and you're transported to another location. Think about reading: say you're really deep into a personal motivation type book and you start to feel that almost euphoric sense of being motivated; you want to get out there and exercise, or you want to get out there and start doing all the things it's suggesting. Or, for me, as someone who loves fantasy and sci-fi, being transported and visualizing that place, wherever it happens to be, that you're reading about in that book, particularly in sci-fi and fantasy, because the terms of reference are slightly different. You know, you're creating a world in your mind from what the author is thinking, or even getting into the head of an author. That blows me away: you read a book that was written two hundred years ago, and the concepts and the thoughts and the words that someone put on a page are brought back to life in your brain. So it makes sense that they're researching what's going on, whether it's short form or long form. And I see this as a frightening thought: a lot of young people now tend to not have that focus to read anymore.
Speaker 2: They're stimulated by TikTok videos that go for fifteen seconds, you know, as opposed to sitting down and really diving into a book. I was walking with my friend yesterday, who's a big listener to the show, and she said she's reading three books at once. I can only read one book at a time. What about you guys? Can you read more than one book at a time?

Speaker 1: I'm a listener. I was an avid reader, and reading, for you, is, like, physically holding a book, obviously, because we didn't have audiobooks. And people are going to think, how on earth do you have time for this? But in my walking hours I listen to books, right. So this morning I was up at quarter to six, walking, listening to a book, and I've listened to ten books in the last twelve weeks. Yeah. So I consume lots of content. And I've never, and I've said this once before on the show, but I've never really consumed fiction. Not interested. It's all got to be, you know, data, information. And then I said to myself, you're a boring fuck. Just listen to something for the sake of listening; not everything's got to be a fucking learning moment, right? And so I listened to this book that was recommended, called The Gray Man, which is about the world's best assassin, who's got a moral compass. Of course it is, and of course he has, and of course I'm interested, right. And now I'm currently listening to book ten. It's great. I love it. I love the audiobooks.

Speaker 2: Yeah, I'm at book eight of a series I only started about two weeks ago, because I just plow through them. You know what I'm doing? I'm walking Fritz. But the interesting thing with this study was that reading activates the left hemisphere; that's the language center of the brain.
Speaker 2: Letter and text reading engage the visual and motor areas of the brain, and word and sentence reading activate what they call classic language clusters, also in the left hemisphere. And they didn't know this, but the cerebellum had never really been thought to be engaged when we're reading. In fact, the right cerebellum is active in all types of reading, including out-loud reading, so with speech production, and the left side of the cerebellum is more active during word reading: you know, when we're trying to get concepts, trying to get meaning out of whatever it is that we're reading, when we're really concentrating, focusing closely on whatever it is we're trying to make sense of.

Speaker 1: Mm. It's interesting, because you and I can read the same thing, but it won't produce the same neurological response in the brain, or biochemical response, or emotional response, because the variable with everything is how you interpret things. Like with food, it's how your biology interacts with that food or that drug. And with what we would call, you know, content, or cognitive input, or data, or information, it's how you respond to and interact with that as well.

Speaker 2: Through my business we've been involved recently with an amazing organization, a horse welfare organization called Project Hope, and they rescue horses. Their members take on horses that have been neglected or abused or whatever, and I have the real privilege of working with them to do their annual report; I've just done their third one. But there was one... and I don't know whether it was a vulnerability at the time, but my colleague Robert was in the office and it was early in the morning.
Speaker 2: We'd just started work, and I picked up the document and I was pulling images in. The clients had, you know, put in a whole lot of images of horses they'd rescued, and there was one picture of a horse, and I'm not even going to describe what had happened to it, but I actually walked away from the computer and started crying. It may have just been a vulnerable moment for me, seeing an animal in distress, or seeing anything in distress, but I was so taken aback from seeing that. You don't just see the physical, you know, what that horse is going through physically in terms of what you're seeing on the page; your brain takes you to the scenario as well, so the pain that might be associated with it, you know, the neglect that may have been associated with it. So our brains work in a really amazing way; we're not just looking at a picture and feeling distressed. I went to a totally different place, where I started seeing more than just that. And we talked about it. I talked about it with the clients, because, you know, they're celebrating fifty years of rescuing horses, and the number of horses that have been rescued through this organization over the years is staggering. I've always had a challenge with putting those sorts of confronting images into their annual report, but the reality of it is you've got to show both. It's great to see all these horses that have been rescued, but when you see the neglect as a contrast, like the before and after, it's very, very powerful. So the written word, or the visuals, can take our brain down a real rabbit hole and can really be emotional. I've sat on trams before reading a book, laughing out loud. So, you know, those sorts of... as you said before, Craigo, I can read something and be moved by it.
Speaker 2: I don't know if we've spoken about this on the show: "It was the best of times, it was the worst of times." I'm sure we've quoted that from A Tale of Two Cities. Some of the most profound written dialogue ever, you know. It's such an amazingly moving opening paragraph, when you think about an opening sentence to a book.

Speaker 1: Yeah, you raise an interesting point, and this is kind of off topic for tech, but it's philosophical, and I think it's, like, an ethical and a moral issue. So there's a podcast, and I'm happy to promote good podcasts, by a guy called Shawn Ryan, who's an ex-Navy SEAL, blah blah blah, but quite a deep-thinking, to me, really good human. And he had a guy on called Tim Tebow yesterday or the day before, who's an ex-Heisman Trophy winner, which just means he was regarded as the best college football player, maybe ever. He went on and played in the NFL, didn't really crush it, but he's just a good human being, and he's been renowned as just this good dude. And he started this thing highlighting all of this stuff around child exploitation and all the stuff that I don't even really want to mention, but you can imagine, right. And I just saw these two on a clip yesterday. It came up and they were talking, and, like, it makes me so sad for the children. It makes me so fucking angry at the people that do this shit, right. And I was nearly going to turn it off, because I almost can't cope, right. I don't want to know. And then Tim Tebow said that when he starts to talk, a lot of people go, no, no, no, I don't want to know. I don't want to know. I've got children, don't tell me, right. And I'm like... because that's it. I don't even have children, and fucking... there are so many emotions that that brings up in me.
Speaker 1: And I think, to your point about the horses, of course, and then all of the other myriad of horrible things, domestic violence against women, you know, all of the fucking horrible things: as painful and as uncomfortable as it is, we need to fucking throw the door open. I hate looking at it, but then you go, okay, well, now I know this. What can I do? Can I do anything? I know I can't turn it around, but can I do anything? And I think one of the, sorry, one of the opportunities that social media presents, if used in the right way, is that we can shine a light on some of this dark shit that happens and maybe do some good that way.

Speaker 2: One of my favorite, favorite quotes, attributed to Edmund Burke, says the only thing necessary for the triumph of evil is for good men to do nothing.

Speaker 1: Yeah. Yeah, I quote that all the time.

Speaker 2: It's so amazing, isn't it?

Speaker 1: Yeah, yeah. All that's necessary for evil to triumph is for good men to do nothing. It's so true. And it's, like, almost, not my problem, you know? Not my problem. It's like, well, stop being a selfish prick, because some people can't look after themselves. Some people just need other people to fucking help, you know.

Speaker 2: Yeah.

Speaker 1: Anyway, so that took...

Speaker 2: ...a philosophical turn. I know. I didn't kind of know where to go with that.

Speaker 1: I think that's okay. Not every moment on a podcast needs to be comfortable or cheesy or fun, you know. I just think, like, the three of us all have a small voice in a fucking ocean of voices, and sometimes you've got to go, yeah, okay, well, this is where we are. But what do you think, you know?

Speaker 2: So, yeah, definitely no problem there.
Speaker 2: Because you both are in the fitness world, I often think about, you know, real age, biological age: what does it mean as we get older, and how do we know whether we're doing well, whether we're on track, whether we're doing all the right things for our body? I remember reading a study, and I tell my tai chi students this all the time, that when you get over the age of fifty, it's fairly well documented that doing twenty minutes of something like meditation, tai chi or yoga can be just as beneficial as doing twenty minutes of really vigorous exercise. So over a certain age we don't have to, you know, knock ourselves out and go super hard. Not that that's a bad thing, but doing some form of exercise is really important. But there's an interesting little article that I read about how to measure your fitness level, and one of the things they talked about was grip strength. Have you ever heard that? How grip strength actually says a whole lot about how the rest of your body is functioning and how on track you are? Have you heard that one, Craigo?

Speaker 1: Yeah, that's a standard, mate. Yep, that's fairly well known. Yeah. Also stride length. You know, there's a bunch of things, but grip strength is an indicator, for sure.

Speaker 2: Yeah. So the interesting thing is that you don't have to... because, obviously, what's the measurement? What's the device you use for measuring grip? It's a dynamometer, yeah. And it's pounds, like pounds of force, I think. Yep. Yeah. So this little article I was reading recently was talking about just grabbing a tennis ball and seeing how long you can compress and hold the tennis ball.
Speaker 2: And then the other thing was, if you go to a pull-up bar and just hang from it: how long you can hang from the pull-up bar is another measure as well. And I think for women it's over thirty seconds, or thirty to forty seconds, and for men it's over sixty seconds. So if you can hang for over sixty seconds as a bloke, or thirty to forty seconds as a woman, then that says that, you know, you're doing pretty well. So that's just a little simple test you can do at home. But that fascinates me. I mean, I don't know that I'm obsessed by it, but obviously I'm in a room of people, when I talk about a room, a virtual room of people, who are pretty focused on staying healthy and looking good as they age gracefully, and doing all the right things, or trying to do the right things. So it is something, I guess... you know, the people who listen to The You Project tend to be, from my understanding and interaction, when I see the Facebook group and the chats that people have, people who get motivated and like to be motivated, to maybe have a better thought process or do better with their lives, who look for ways to be motivated, you know.

Speaker 2: It's like people who pick up books on motivation. Why do we pick them up in the first place? Because we want to do better. We want to be better; we want our lives to be better. And for me, in a lot of ways, technology does that. I love that we can use tech to make our lives better. And I know that you use ChatGPT, and some of the prompts in ChatGPT... and actually, that kind of opens the little topic I wanted to chat about, which is, for people who do use ChatGPT, the prompts you use. And I have this funny story.
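The dead-hang test described above boils down to a simple threshold check. A minimal sketch in Python, assuming the figures quoted in the conversation (over sixty seconds for men, thirty to forty for women, taking thirty as the lower bound); the function name and result strings are illustrative only:

```python
def dead_hang_result(seconds: float, sex: str) -> str:
    """Compare a dead-hang time against the thresholds quoted in the episode:
    over 60 s for men, 30-40 s (lower bound 30 s) for women."""
    threshold = 60 if sex == "male" else 30
    return "doing pretty well" if seconds > threshold else "keep working on it"

print(dead_hang_result(75, "male"))    # doing pretty well
print(dead_hang_result(25, "female"))  # keep working on it
```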
And 604 00:33:56,840 --> 00:33:59,160 Speaker 2: I have this funny story. So this week I went 605 00:33:59,200 --> 00:34:03,320 Speaker 2: to the local supermarket and, you know, 606 00:34:03,560 --> 00:34:06,240 Speaker 2: I took my stuff to the register and they started 607 00:34:06,280 --> 00:34:08,239 Speaker 2: scanning through the stuff, and I 608 00:34:08,280 --> 00:34:11,840 Speaker 2: think I had like a carrot and some zucchini and 609 00:34:11,880 --> 00:34:13,840 Speaker 2: some other bits and pieces. But when they went to 610 00:34:14,040 --> 00:34:17,879 Speaker 2: weigh them, there was something wrong with the scales, and 611 00:34:18,320 --> 00:34:20,319 Speaker 2: they could weigh it, but it wasn't 612 00:34:20,360 --> 00:34:23,960 Speaker 2: translating to their POS system, the point of 613 00:34:24,040 --> 00:34:26,800 Speaker 2: sale system. And so the woman was standing there. We 614 00:34:26,840 --> 00:34:29,000 Speaker 2: went to a different register. She put the stuff on 615 00:34:29,040 --> 00:34:33,000 Speaker 2: the scales, it still wouldn't interact. And then the senior 616 00:34:33,080 --> 00:34:35,279 Speaker 2: lady there said, we'll work out what it is per 617 00:34:35,360 --> 00:34:38,680 Speaker 2: kilo and, you know, try to work it out. 618 00:34:38,760 --> 00:34:41,319 Speaker 2: So if a carrot is three dollars sixty a kilo and 619 00:34:41,360 --> 00:34:44,640 Speaker 2: you've got a one hundred and sixty gram carrot, try 620 00:34:44,680 --> 00:34:46,520 Speaker 2: to work it out. And of course the first thing 621 00:34:46,640 --> 00:34:48,440 Speaker 2: I did was I just jumped on ChatGPT 622 00:34:48,600 --> 00:34:52,520 Speaker 2: on my phone, and all the girls were 623 00:34:52,560 --> 00:34:54,759 Speaker 2: just looking dumbfounded, like, how do we work that out? 624 00:34:54,960 --> 00:34:57,000 Speaker 2: And I don't think, off the top of my head, 625 00:34:57,000 --> 00:34:58,840 Speaker 2: I could have worked it out without ChatGPT. 626 00:34:58,920 --> 00:35:02,640 Speaker 2: I'm going to be honest, full transparency here: ChatGPT did 627 00:35:02,640 --> 00:35:04,239 Speaker 2: it for me. And so there I am, standing at 628 00:35:04,239 --> 00:35:07,240 Speaker 2: the cash register, going through, and we were working out, 629 00:35:07,360 --> 00:35:09,200 Speaker 2: you know, if I have a carrot and it's one 630 00:35:09,280 --> 00:35:11,960 Speaker 2: hundred and sixty grams and it costs, you know, three 631 00:35:12,000 --> 00:35:15,640 Speaker 2: dollars sixty a kilo, what is my carrot worth? You know, 632 00:35:15,680 --> 00:35:18,319 Speaker 2: it was kind of fun that we solved our 633 00:35:18,360 --> 00:35:23,520 Speaker 2: little problem using ChatGPT in a practical sense. But 634 00:35:23,719 --> 00:35:26,000 Speaker 2: some of the prompts you put into ChatGPT can 635 00:35:26,040 --> 00:35:28,400 Speaker 2: be really interesting. So if you've got a... 636 00:35:28,600 --> 00:35:31,759 Speaker 1: Can I tell you, can I tell you, for future use? Yeah, 637 00:35:31,840 --> 00:35:34,759 Speaker 1: go for it. Go three sixty times point one six. 638 00:35:35,160 --> 00:35:37,360 Speaker 2: Yeah, thanks for that. It was easier with ChatGPT. 639 00:35:37,560 --> 00:35:41,359 Speaker 1: Nah, not really, not really. 640 00:35:42,000 --> 00:35:44,279 Speaker 2: No, actually, it was interesting. ChatGPT not only gave 641 00:35:44,320 --> 00:35:46,359 Speaker 2: me the answer, but it showed me how to do it.
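[Editor's note: the checkout arithmetic worked through. Price per kilogram times weight in kilograms gives the item price; the $3.60 per kilo and 160 gram figures are the ones quoted in the story.]

```python
# Unit-price arithmetic from the supermarket story:
# a 160 g carrot at $3.60 per kilogram.
price_per_kg = 3.60         # dollars per kilogram, as quoted
weight_kg = 160 / 1000      # 160 grams expressed in kilograms

price = price_per_kg * weight_kg     # 3.60 * 0.16 = 0.576
print(f"carrot costs ${price:.2f}")  # carrot costs $0.58
```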
642 00:35:46,440 --> 00:35:52,240 Speaker 2: So yeah. But creating a better to-do list? 643 00:35:52,719 --> 00:35:54,960 Speaker 2: So, you know, this is an article that 644 00:35:55,000 --> 00:35:56,880 Speaker 2: I was reading, and it was just kind of giving 645 00:35:56,880 --> 00:35:59,480 Speaker 2: you some prompts that you can use for AI to 646 00:35:59,600 --> 00:36:03,000 Speaker 2: try to, you know, streamline what you do. So obviously 647 00:36:03,000 --> 00:36:06,359 Speaker 2: summarizing documents is fantastic. If you have a document, throw 648 00:36:06,400 --> 00:36:07,960 Speaker 2: it in there and just say, can you just give 649 00:36:08,000 --> 00:36:11,799 Speaker 2: me a one paragraph overview of what this document is? 650 00:36:12,080 --> 00:36:14,440 Speaker 2: And it doesn't necessarily mean you're getting it to do 651 00:36:14,520 --> 00:36:17,000 Speaker 2: all the work for you. But it could be things 652 00:36:17,200 --> 00:36:20,080 Speaker 2: as simple as, you know, one of my staff recently 653 00:36:20,360 --> 00:36:22,240 Speaker 2: sent out an email that had a lot of errors 654 00:36:22,239 --> 00:36:24,560 Speaker 2: in it, and, you know, it had already gone 655 00:36:24,600 --> 00:36:26,600 Speaker 2: out to the client. He'd copied me in, and it 656 00:36:26,680 --> 00:36:29,800 Speaker 2: felt a little bit uncomfortable to have the conversation. You know, 657 00:36:29,840 --> 00:36:33,120 Speaker 2: I'm a grown person, a grown man. But I said to him, look, 658 00:36:33,280 --> 00:36:36,520 Speaker 2: just run it through ChatGPT, get ChatGPT to 659 00:36:36,600 --> 00:36:41,160 Speaker 2: check for errors and look at formatting. And that's not cheating. 660 00:36:41,360 --> 00:36:43,279 Speaker 2: You've already done all the hard yards. You've got all 661 00:36:43,320 --> 00:36:46,080 Speaker 2: the data in there and all the information. But it 662 00:36:46,080 --> 00:36:48,640 Speaker 2: can be really good. But creating a smarter to-do 663 00:36:48,719 --> 00:36:52,400 Speaker 2: list, I love that idea, where it will prioritize. 664 00:36:52,440 --> 00:36:54,440 Speaker 2: So you put your list of things in, and then 665 00:36:54,480 --> 00:36:57,920 Speaker 2: it helps you think about how to prioritize, and sometimes 666 00:36:57,920 --> 00:36:59,480 Speaker 2: that can just take a little bit of that emotional 667 00:36:59,480 --> 00:37:01,719 Speaker 2: burden off. I don't know about you, but when I've got 668 00:37:01,719 --> 00:37:03,120 Speaker 2: my list of things that I need to do for 669 00:37:03,160 --> 00:37:05,600 Speaker 2: the day, it can be a bit overwhelming. So tackling 670 00:37:05,640 --> 00:37:08,080 Speaker 2: the right thing at the right time, I reckon that's 671 00:37:08,120 --> 00:37:12,320 Speaker 2: a great idea for using something like, you know, ChatGPT. 672 00:37:12,520 --> 00:37:16,560 Speaker 2: Or even just troubleshooting a problem, you know, saying, well, 673 00:37:16,640 --> 00:37:18,880 Speaker 2: how can I tackle this? What's the best way around it? 674 00:37:19,160 --> 00:37:22,799 Speaker 2: I had to write a bereavement note recently, and I 675 00:37:22,840 --> 00:37:26,000 Speaker 2: actually wrote down, you know, I started penning my thoughts, 676 00:37:26,040 --> 00:37:29,080 Speaker 2: and then I actually asked ChatGPT about the best 677 00:37:29,120 --> 00:37:33,600 Speaker 2: way to send this bereavement message, and it actually helped.
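[Editor's note: a minimal sketch of the "smarter to-do list" prompt idea, assuming the OpenAI Python SDK. The model name, the example tasks, and the prompt wording are illustrative assumptions, not anything quoted on the show.]

```python
# Illustrative only: send a to-do list to a chat model and ask it to
# prioritize. Assumes `pip install openai` and an OPENAI_API_KEY set
# in the environment; swap in whichever model you have access to.
from openai import OpenAI

client = OpenAI()

tasks = [
    "follow up the client email that went out with errors",
    "prepare tonight's tai chi class",
    "summarize a long report into one paragraph",
    "book the annual blood test",
]

prompt = (
    "Here is today's to-do list:\n"
    + "\n".join(f"- {t}" for t in tasks)
    + "\n\nReorder it by priority, with one line of reasoning per item."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```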
678 00:37:34,520 --> 00:37:36,480 Speaker 2: You know, it draws from what they call a large 679 00:37:36,600 --> 00:37:41,279 Speaker 2: language model, so the AI has been trained on a 680 00:37:41,320 --> 00:37:44,080 Speaker 2: whole lot of information, so it can take all that 681 00:37:44,120 --> 00:37:46,640 Speaker 2: info and put it all together. And it really helped, because 682 00:37:46,640 --> 00:37:50,279 Speaker 2: I find that conversation, that dialogue, for somebody who's just 683 00:37:50,360 --> 00:37:54,080 Speaker 2: lost someone, can be really difficult. And it's happened in 684 00:37:54,080 --> 00:37:56,920 Speaker 2: two instances: one where someone quite young, in their thirties, 685 00:37:56,960 --> 00:37:59,680 Speaker 2: died suddenly, and the other was a close friend of 686 00:37:59,719 --> 00:38:02,520 Speaker 2: mine who had a partner, a previous partner they'd broken 687 00:38:02,640 --> 00:38:05,600 Speaker 2: up with, but he committed suicide. And so, you know, 688 00:38:05,680 --> 00:38:10,440 Speaker 2: how do you message somebody and not cause further grief? 689 00:38:10,760 --> 00:38:12,879 Speaker 2: You know, you want to support the person, and I 690 00:38:12,920 --> 00:38:15,160 Speaker 2: find that really difficult. I think I'm pretty good, I'm 691 00:38:15,160 --> 00:38:17,560 Speaker 2: a good wordsmith and I'm good at writing stuff. But 692 00:38:17,840 --> 00:38:21,000 Speaker 2: just having a little bit of helpful oversight from a 693 00:38:21,120 --> 00:38:24,920 Speaker 2: large language model like ChatGPT, I find really, really useful. 694 00:38:26,840 --> 00:38:29,800 Speaker 1: Yeah, I think we're just starting to understand the scope 695 00:38:29,960 --> 00:38:34,840 Speaker 1: for it. And there's a really common concern at the moment 696 00:38:34,880 --> 00:38:38,160 Speaker 1: that we're going to become dumber, that it's going to hijack our thinking. 697 00:38:38,320 --> 00:38:41,560 Speaker 1: I don't think that at all. I think, like, I 698 00:38:41,719 --> 00:38:46,840 Speaker 1: use Claude, which is a more academically focused LLM, 699 00:38:47,080 --> 00:38:51,799 Speaker 1: and ChatGPT as well, for different things. Obviously not to 700 00:38:51,880 --> 00:38:56,360 Speaker 1: do my work in terms of my PhD, but where 701 00:38:56,520 --> 00:39:00,640 Speaker 1: I can, I can plug in something and say, you know, 702 00:39:00,719 --> 00:39:03,399 Speaker 1: I can plug in someone else's article and say, give 703 00:39:03,440 --> 00:39:06,640 Speaker 1: me the three key takeaways from this. Or I can say, 704 00:39:06,719 --> 00:39:10,800 Speaker 1: explain this article to me as though I'm a fourteen 705 00:39:10,880 --> 00:39:14,200 Speaker 1: year old who doesn't understand science, and it'll do it instantly. 706 00:39:14,360 --> 00:39:18,480 Speaker 1: Like, it really comes down to, like, you go, all right, 707 00:39:18,480 --> 00:39:21,600 Speaker 1: a kitchen knife is a tool, a pushbike is a tool, 708 00:39:21,760 --> 00:39:25,040 Speaker 1: you know, an LLM, a large language model, or 709 00:39:25,080 --> 00:39:28,560 Speaker 1: AI, is a tool. It really comes down to how 710 00:39:28,680 --> 00:39:31,839 Speaker 1: we use it. And people who are terrified of it? One, 711 00:39:32,000 --> 00:39:35,719 Speaker 1: I understand that, because new things can be terrifying. But 712 00:39:35,840 --> 00:39:38,680 Speaker 1: to figure out... I was talking to my friend Christian 713 00:39:38,760 --> 00:39:41,919 Speaker 1: last night at the gym. Tiff,
you know Christian, tall, 714 00:39:41,960 --> 00:39:44,520 Speaker 1: goofy Christian, and God bless him, he's a great man. 715 00:39:44,560 --> 00:39:46,359 Speaker 1: You know him as well. He used to work at Harper's. 716 00:39:46,600 --> 00:39:48,839 Speaker 2: Yeah, I do, yeah, yeah. 717 00:39:48,920 --> 00:39:53,879 Speaker 1: And Christian's dyslexic, and that's not a, you know, that's 718 00:39:54,040 --> 00:39:56,799 Speaker 1: just, you know, it's like he's six foot four and 719 00:39:56,800 --> 00:39:59,440 Speaker 1: he's also dyslexic, right? It's just part of who he is. 720 00:40:00,200 --> 00:40:03,760 Speaker 1: And I was explaining to him, you know, for somebody 721 00:40:03,880 --> 00:40:07,359 Speaker 1: like him, to just be able to talk rather than 722 00:40:07,400 --> 00:40:11,720 Speaker 1: write or even... and he doesn't use it. And 723 00:40:11,840 --> 00:40:13,839 Speaker 1: so I got it out in the gym and I said, 724 00:40:13,840 --> 00:40:15,960 Speaker 1: look at this. So I just pressed it and I said, 725 00:40:17,400 --> 00:40:19,840 Speaker 1: you know, explain to me the difference between spirituality 726 00:40:19,880 --> 00:40:22,840 Speaker 1: and religion, because he's quite spiritual. He's going down this 727 00:40:22,960 --> 00:40:26,200 Speaker 1: big spiritual rabbit hole. And I go, ask 728 00:40:26,280 --> 00:40:28,720 Speaker 1: it anything you want. And he asked it a question, 729 00:40:28,840 --> 00:40:31,839 Speaker 1: and I go, by the way, this is Christian. And it goes, oh, 730 00:40:31,920 --> 00:40:35,799 Speaker 1: hi Christian, great question, and then it answers. And I'm like, dude, 731 00:40:36,120 --> 00:40:40,279 Speaker 1: just talk to it. It will answer anything you want. 732 00:40:40,360 --> 00:40:42,800 Speaker 1: You don't have to read, you don't have to type. 733 00:40:42,880 --> 00:40:44,800 Speaker 1: All you've got to do is be able to talk. 734 00:40:45,520 --> 00:40:47,400 Speaker 1: It will talk back to you. And by the way, 735 00:40:47,880 --> 00:40:51,160 Speaker 1: whatever it says to you, there's also a written version 736 00:40:51,200 --> 00:40:53,320 Speaker 1: if you do happen to want to cut and paste 737 00:40:53,320 --> 00:40:56,319 Speaker 1: that into a document or something. And so, 738 00:40:57,520 --> 00:41:01,200 Speaker 1: like, I said to it the other day, over the 739 00:41:01,239 --> 00:41:04,960 Speaker 1: next two years, I want to develop, you know, this, this, this, this, 740 00:41:05,040 --> 00:41:08,640 Speaker 1: and this. Like, there's academic stuff, there's podcast stuff, 741 00:41:08,680 --> 00:41:11,759 Speaker 1: there's business, there's corporate, there's, you know, all these 742 00:41:11,800 --> 00:41:14,160 Speaker 1: different components to what I want to do and create 743 00:41:14,200 --> 00:41:16,520 Speaker 1: over the next two years. And it's like, can you 744 00:41:16,560 --> 00:41:19,680 Speaker 1: help me plan this? And it's like, absolutely. And then 745 00:41:19,719 --> 00:41:22,600 Speaker 1: it just comes up with all these brilliant ideas, like you're 746 00:41:22,680 --> 00:41:27,520 Speaker 1: sitting with a high-end business coach. And then it says, 747 00:41:27,520 --> 00:41:29,759 Speaker 1: would you like me to unpack any of these? I go, yeah, 748 00:41:29,800 --> 00:41:32,120 Speaker 1: this bit here, what do you mean by that? Can 749 00:41:32,160 --> 00:41:35,359 Speaker 1: you unpack that? And then it goes, absolutely.
I mean, all 750 00:41:35,400 --> 00:41:38,080 Speaker 1: you need to know is how to 751 00:41:38,120 --> 00:41:42,040 Speaker 1: best interact with it. I mean, people 752 00:41:42,080 --> 00:41:44,240 Speaker 1: are scared of it; I think people should be excited. 753 00:41:44,320 --> 00:41:48,440 Speaker 1: I think AI taking over down the track, when it 754 00:41:48,480 --> 00:41:52,680 Speaker 1: becomes sentient, which will be about next Thursday, that's a 755 00:41:52,680 --> 00:41:53,320 Speaker 1: different matter. 756 00:41:57,160 --> 00:41:59,600 Speaker 2: One of the things I'm always mindful of is what 757 00:41:59,640 --> 00:42:05,400 Speaker 2: I'm feeding into AI, because obviously there's privacy concerns. But 758 00:42:05,840 --> 00:42:08,920 Speaker 2: if you're using the free version of ChatGPT and you 759 00:42:09,000 --> 00:42:12,440 Speaker 2: don't log into it... so, for example, 760 00:42:12,560 --> 00:42:14,880 Speaker 2: you might have a medical report that you wanted to 761 00:42:14,920 --> 00:42:18,320 Speaker 2: look at; you could paste it all in, you know. I recently 762 00:42:18,320 --> 00:42:20,319 Speaker 2: went to my GP, I got my annual blood test 763 00:42:20,360 --> 00:42:23,239 Speaker 2: done, being vegan, and I'm always mindful that 764 00:42:23,280 --> 00:42:26,080 Speaker 2: I've got the right nutrients and that I'm tracking okay. 765 00:42:26,600 --> 00:42:28,640 Speaker 2: And, you know, the doctor said to me, this is great, 766 00:42:28,680 --> 00:42:31,239 Speaker 2: your figures are really, really good. But I 767 00:42:31,280 --> 00:42:35,600 Speaker 2: could potentially put all of those in, but use it 768 00:42:35,640 --> 00:42:39,520 Speaker 2: without logging in, and don't put in all your personal details. 769 00:42:39,640 --> 00:42:42,520 Speaker 2: Just put the results, so it's not going to attribute 770 00:42:42,560 --> 00:42:44,839 Speaker 2: that to you, if you're concerned [a sketch of this redact-before-pasting idea appears in the editor's note below]. I've got to tell 771 00:42:44,840 --> 00:42:47,719 Speaker 2: you one very quick quote, we were talking quotes before. All 772 00:42:47,760 --> 00:42:49,840 Speaker 2: you have to decide is what to do with the 773 00:42:49,920 --> 00:42:52,239 Speaker 2: time that is given to you. Who said that? 774 00:42:53,840 --> 00:42:55,839 Speaker 1: I do not know, but I'm not mad at it. 775 00:42:55,960 --> 00:42:57,640 Speaker 1: I quite like it. 776 00:42:57,680 --> 00:42:59,360 Speaker 2: I actually have that printed in my office. It was 777 00:42:59,400 --> 00:43:01,440 Speaker 2: Gandalf from The Lord of the Rings. 778 00:43:03,520 --> 00:43:06,799 Speaker 1: I thought you were going to say Gandhi, but okay. 779 00:43:07,800 --> 00:43:09,480 Speaker 2: It's a great quote though, isn't it? 780 00:43:10,160 --> 00:43:13,520 Speaker 1: You know what Gandhi said? One of 781 00:43:13,520 --> 00:43:16,320 Speaker 1: his most famous quotes: be the change you want to 782 00:43:16,360 --> 00:43:20,880 Speaker 1: see in the world. Yeah, a little better than Gandalf, 783 00:43:20,880 --> 00:43:22,920 Speaker 1: but yeah. 784 00:43:23,600 --> 00:43:26,520 Speaker 2: One of the most inspiring people I ever 785 00:43:26,640 --> 00:43:31,240 Speaker 2: saw was attending a talk being given by the Dalai Lama. 786 00:43:31,840 --> 00:43:35,359 Speaker 2: Now there's a bloke, there's a bloke who has 787 00:43:35,440 --> 00:43:38,920 Speaker 2: the magical qualities of Gandalf.
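[Editor's note: a minimal sketch of Patrick's redact-before-pasting suggestion. The field names, regex patterns, and the sample report are illustrative assumptions only; this is not a complete de-identification tool.]

```python
# Hypothetical pre-paste scrubber: strip obvious identifiers from a
# pathology report before handing the text to a chatbot.
# The patterns below are illustrative; real de-identification needs more care.
import re

def scrub(report: str) -> str:
    report = re.sub(r"(?im)^name:\s*.+$", "Name: [REDACTED]", report)
    report = re.sub(r"(?im)^dob:\s*.+$", "DOB: [REDACTED]", report)
    # Medicare-style 10-digit numbers (4-5-1 grouping), illustrative only
    report = re.sub(r"\b\d{4}\s?\d{5}\s?\d\b", "[NUMBER REDACTED]", report)
    return report

sample = "Name: J. Citizen\nDOB: 01/01/1970\nB12: 410 pmol/L\nFerritin: 95 ug/L"
print(scrub(sample))  # identifiers replaced, test results left intact
```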
He was pretty 788 00:43:38,960 --> 00:43:39,680 Speaker 2: amazing to see. 789 00:43:39,719 --> 00:43:41,560 Speaker 1: Did you and I go to that together, or no? 790 00:43:41,680 --> 00:43:43,080 Speaker 1: Because I saw him as well. 791 00:43:42,960 --> 00:43:45,720 Speaker 2: I wonder if we did. He has an amazing 792 00:43:45,760 --> 00:43:48,080 Speaker 2: giggle as well. And you know the other cool thing 793 00:43:48,200 --> 00:43:51,239 Speaker 2: is that we share the same birthday; not 794 00:43:51,280 --> 00:43:54,120 Speaker 2: the same birth year, but the same birth day. 795 00:43:54,440 --> 00:44:00,359 Speaker 1: Yes, well, shout out to the Dalai Lama, if he's listening. 796 00:44:00,440 --> 00:44:03,320 Speaker 1: I know he's a big fan. He doesn't catch every episode. 797 00:44:03,440 --> 00:44:05,080 Speaker 2: No, no, that's right. 798 00:44:05,000 --> 00:44:07,600 Speaker 1: I think mainly yours. Let's do one or two more, mate. 799 00:44:07,680 --> 00:44:09,919 Speaker 2: I'll tell you about another AI story, because this kind 800 00:44:09,920 --> 00:44:12,920 Speaker 2: of freaked me out, and I'm not entirely sure 801 00:44:13,239 --> 00:44:17,040 Speaker 2: that I think it's a good idea. But we've heard 802 00:44:17,040 --> 00:44:20,360 Speaker 2: about digital resurrections, where you take a whole lot of 803 00:44:20,360 --> 00:44:23,640 Speaker 2: footage of somebody who's died and then you get AI 804 00:44:23,800 --> 00:44:26,719 Speaker 2: to generate an artificial version of that person. It could 805 00:44:26,760 --> 00:44:30,360 Speaker 2: be them saying something. This is really full on. So 806 00:44:31,160 --> 00:44:36,200 Speaker 2: a murder victim has addressed his killer in court, 807 00:44:37,120 --> 00:44:42,000 Speaker 2: thanks to AI resurrection. Okay, so the sister of 808 00:44:42,040 --> 00:44:46,640 Speaker 2: this guy who was murdered, Stacey Wales, used AI 809 00:44:46,760 --> 00:44:51,120 Speaker 2: to generate a video of her brother Christopher, and she 810 00:44:51,239 --> 00:44:56,800 Speaker 2: was able to play this artificial version 811 00:44:56,840 --> 00:45:01,480 Speaker 2: of him to the courtroom at the final sentencing, because 812 00:45:01,520 --> 00:45:03,680 Speaker 2: he was killed in a road rage incident. This is 813 00:45:03,680 --> 00:45:09,160 Speaker 2: in Arizona, and this was reported on NPR. And she said, look, 814 00:45:09,280 --> 00:45:11,520 Speaker 2: the thing she struggled with, this court case 815 00:45:11,560 --> 00:45:14,680 Speaker 2: went over two years, and what she struggled 816 00:45:14,719 --> 00:45:16,680 Speaker 2: with the most was that 817 00:45:16,960 --> 00:45:20,680 Speaker 2: he didn't get a chance to speak. So it motivated her. 818 00:45:20,960 --> 00:45:23,880 Speaker 2: She just woke up one day and thought, I've got footage 819 00:45:23,920 --> 00:45:26,680 Speaker 2: of him, I've got some real video of him. I 820 00:45:26,800 --> 00:45:28,759 Speaker 2: want him to address the court, and I want people to know 821 00:45:28,800 --> 00:45:33,400 Speaker 2: who he was. The interesting 822 00:45:33,440 --> 00:45:37,280 Speaker 2: thing about this artificial version of him, this AI-generated version, 823 00:45:37,680 --> 00:45:40,160 Speaker 2: was that it wasn't about having a go at the person 824 00:45:40,200 --> 00:45:42,080 Speaker 2: who killed him. In fact, it was the opposite.
It 825 00:45:42,120 --> 00:45:46,920 Speaker 2: was that he would have tried to show some sort of sympathy. 826 00:45:47,840 --> 00:45:48,040 Speaker 1: You know. 827 00:45:48,120 --> 00:45:51,680 Speaker 2: It was actually quite an amazing generated video, you know. I 828 00:45:51,760 --> 00:45:53,600 Speaker 2: watched the little video clip, and it was 829 00:45:53,680 --> 00:45:56,440 Speaker 2: pretty full on to watch this AI video clip. But 830 00:45:56,840 --> 00:45:59,400 Speaker 2: it concerns me that you could do that in a 831 00:45:59,440 --> 00:46:02,480 Speaker 2: court case. And I don't know what the legal system's 832 00:46:02,640 --> 00:46:05,800 Speaker 2: like in the US. You know, because they have 833 00:46:05,880 --> 00:46:08,600 Speaker 2: what they call victim impact statements, and so it 834 00:46:09,080 --> 00:46:12,600 Speaker 2: was entered in as the victim impact statement from the 835 00:46:12,680 --> 00:46:15,680 Speaker 2: victim themselves. I think it opens up a big can 836 00:46:15,719 --> 00:46:19,759 Speaker 2: of worms, because you're giving voice to something. And as 837 00:46:19,760 --> 00:46:22,719 Speaker 2: I said, this wasn't an accusatory thing. This wasn't evidence. 838 00:46:23,120 --> 00:46:26,200 Speaker 2: This was just, this is who I am, and these 839 00:46:26,280 --> 00:46:30,080 Speaker 2: were my ideals; that's what they did with this video clip. 840 00:46:30,120 --> 00:46:32,080 Speaker 2: But what a can of worms that is. 841 00:46:32,880 --> 00:46:38,720 Speaker 1: Yeah, that term, digital resurrection, that's creepy. That's a creepy 842 00:46:38,880 --> 00:46:44,759 Speaker 1: fucking term, digital resurrection. It's like, I understand the motivation on 843 00:46:44,800 --> 00:46:50,400 Speaker 1: her part, but whoever scripted the 844 00:46:50,440 --> 00:46:53,920 Speaker 1: words using his voice and his image, he didn't do 845 00:46:54,000 --> 00:46:57,680 Speaker 1: it like that. You know, so to go, this 846 00:46:57,760 --> 00:47:00,200 Speaker 1: is what he wants to tell you, no, this is 847 00:47:00,239 --> 00:47:04,480 Speaker 1: what you want to tell them. I understand it, though, but I 848 00:47:04,560 --> 00:47:07,560 Speaker 1: don't know that legally or practically that's going 849 00:47:07,640 --> 00:47:11,560 Speaker 1: to have any impact. I mean, emotionally, I get it. 850 00:47:12,080 --> 00:47:16,160 Speaker 2: What if I took every single podcast you've ever recorded, 851 00:47:16,680 --> 00:47:20,200 Speaker 2: put it into a large language model, and asked it, 852 00:47:20,400 --> 00:47:22,960 Speaker 2: what would Craig say? You know? 853 00:47:23,120 --> 00:47:23,279 Speaker 1: I know. 854 00:47:23,320 --> 00:47:26,120 Speaker 2: What would be amazing would be to isolate you in 855 00:47:26,160 --> 00:47:29,480 Speaker 2: one location and then build a large language model of 856 00:47:29,680 --> 00:47:33,719 Speaker 2: everything you've ever said publicly, all the podcasts and information, 857 00:47:34,200 --> 00:47:37,520 Speaker 2: and then ask a series of ten questions and see 858 00:47:37,560 --> 00:47:41,919 Speaker 2: how closely the AI version of you corresponds to the 859 00:47:41,960 --> 00:47:44,399 Speaker 2: real you. Wouldn't that blow your mind? How interesting would 860 00:47:44,400 --> 00:47:46,680 Speaker 2: that be? 861 00:47:45,840 --> 00:47:49,560 Speaker 1: Interesting to you? Yeah.
It would, but it's still not 862 00:47:49,760 --> 00:47:53,279 Speaker 1: the actual person in real life, in real time. But yeah, look, 863 00:47:53,400 --> 00:47:56,440 Speaker 1: I mean, where it's going, who knows. 864 00:47:57,600 --> 00:48:01,680 Speaker 1: It's getting pretty close to where it can, 865 00:48:01,800 --> 00:48:04,640 Speaker 1: I guess, with a high level of predictability, know what 866 00:48:04,680 --> 00:48:07,560 Speaker 1: you're going to say, because, I mean, 867 00:48:07,600 --> 00:48:10,080 Speaker 1: we're all quite predictable. We like to think we're not, 868 00:48:10,239 --> 00:48:14,040 Speaker 1: but I'm sure I'm very predictable. You know, I'd 869 00:48:14,040 --> 00:48:15,520 Speaker 1: like to think I'm special, but I'm not. 870 00:48:16,280 --> 00:48:18,800 Speaker 2: Well, I think quite a 871 00:48:18,800 --> 00:48:20,479 Speaker 2: few episodes ago, it might have even been a couple 872 00:48:20,520 --> 00:48:24,160 Speaker 2: of years ago, there was a guy who'd lost 873 00:48:24,200 --> 00:48:27,439 Speaker 2: his father and was so devastated he decided to try 874 00:48:27,480 --> 00:48:30,440 Speaker 2: to build a digital version of his father using, you know, 875 00:48:30,560 --> 00:48:33,080 Speaker 2: video and cassette recordings and all that sort of stuff. 876 00:48:33,120 --> 00:48:36,360 Speaker 2: And, you know, there is that morbid 877 00:48:36,480 --> 00:48:39,640 Speaker 2: kind of connection. And I guess part of the grieving 878 00:48:39,719 --> 00:48:44,279 Speaker 2: process is letting go and having memories. But having a 879 00:48:44,320 --> 00:48:48,960 Speaker 2: digital representation of that person and interacting with them on 880 00:48:49,000 --> 00:48:51,480 Speaker 2: a daily basis, like he was talking to his dead 881 00:48:51,520 --> 00:48:54,400 Speaker 2: father every day, he was interacting with him, so he 882 00:48:54,520 --> 00:48:57,880 Speaker 2: was never letting go of that person, because in 883 00:48:57,960 --> 00:49:02,279 Speaker 2: his mind he'd recreated him; it was a digital resurrection of 884 00:49:02,320 --> 00:49:04,040 Speaker 2: that person via AI. 885 00:49:05,000 --> 00:49:09,960 Speaker 1: So yeah. I'm very careful about judging how people grieve. 886 00:49:10,160 --> 00:49:13,200 Speaker 1: I reckon, whatever works for you. You know, it's like, 887 00:49:13,600 --> 00:49:16,839 Speaker 1: how many people talk to their dead partner? How many 888 00:49:16,880 --> 00:49:19,200 Speaker 1: people go to a grave and talk as though they're 889 00:49:19,200 --> 00:49:22,680 Speaker 1: there and they're listening? And how many people, you know, 890 00:49:22,880 --> 00:49:25,600 Speaker 1: you just don't know. I think it's like, when you 891 00:49:25,680 --> 00:49:28,000 Speaker 1: talk to that person that's no longer here, but you 892 00:49:28,160 --> 00:49:31,000 Speaker 1: sense they're here, or their energy's here, how does that 893 00:49:31,120 --> 00:49:34,880 Speaker 1: make you feel? I feel great, I feel connected, I 894 00:49:34,960 --> 00:49:37,560 Speaker 1: know they're listening, I know this, I know that. And 895 00:49:37,600 --> 00:49:39,560 Speaker 1: then who are we to go, yeah, well, that's not 896 00:49:39,560 --> 00:49:44,239 Speaker 1: true, though, they're dead? So, I think, like, I don't know. 897 00:49:44,440 --> 00:49:48,040 Speaker 1: Like, we don't know. Fuck, I don't know what happens.
898 00:49:48,160 --> 00:49:52,920 Speaker 2: Like, one of my dearest friends, she lost her husband, 899 00:49:52,960 --> 00:49:57,480 Speaker 2: who was relatively young when he died. Gosh, it 900 00:49:57,520 --> 00:50:00,520 Speaker 2: was twenty thirteen, so quite a while ago. And I 901 00:50:00,560 --> 00:50:03,920 Speaker 2: remember calling her a few days after he died, because 902 00:50:03,920 --> 00:50:05,880 Speaker 2: I spent a lot of time with him when he 903 00:50:05,960 --> 00:50:08,400 Speaker 2: was going through cancer, and I'd catch up with him 904 00:50:08,440 --> 00:50:11,120 Speaker 2: once a week, and we went from going for coffee 905 00:50:11,120 --> 00:50:14,719 Speaker 2: together to me bringing coffee to him, you know, when 906 00:50:14,800 --> 00:50:17,360 Speaker 2: he wasn't able to leave his home. And then a 907 00:50:17,360 --> 00:50:20,640 Speaker 2: few days after he died, I called my friend and 908 00:50:21,680 --> 00:50:23,680 Speaker 2: she wasn't able to take the call, and it was 909 00:50:23,760 --> 00:50:26,239 Speaker 2: his voice on her message service. And I was so 910 00:50:26,440 --> 00:50:29,239 Speaker 2: confronted, because I could hear the voice of this guy 911 00:50:29,320 --> 00:50:31,960 Speaker 2: I had so much respect for, and I love her dearly, 912 00:50:32,360 --> 00:50:36,520 Speaker 2: and I thought, oh gosh, it's him on the message. 913 00:50:36,560 --> 00:50:39,120 Speaker 2: And it still is, you know, all that time later, 914 00:50:39,360 --> 00:50:43,920 Speaker 2: whenever I call. I'm not as challenged now, but I actually would 915 00:50:43,960 --> 00:50:46,800 Speaker 2: hang up the phone before the message came on, because 916 00:50:46,800 --> 00:50:49,720 Speaker 2: it was freaking me out a bit when that first happened. 917 00:50:50,000 --> 00:50:52,120 Speaker 2: So I can't even begin to imagine what it would 918 00:50:52,160 --> 00:50:55,359 Speaker 2: be like to have a digital recreation and interact with 919 00:50:55,440 --> 00:51:00,239 Speaker 2: someone who had passed away. Yeah, I'm challenged by the concept, 920 00:51:00,360 --> 00:51:03,480 Speaker 2: but that grieving process, we need to move on. 921 00:51:03,600 --> 00:51:03,799 Speaker 3: You know. 922 00:51:03,880 --> 00:51:06,480 Speaker 2: Another friend of mine lost her husband three years 923 00:51:06,520 --> 00:51:09,080 Speaker 2: ago, and now she's met somebody new. And it's not 924 00:51:09,160 --> 00:51:11,799 Speaker 2: like she loves her husband any less, you know; she 925 00:51:11,880 --> 00:51:15,160 Speaker 2: had this amazing journey and he passed away, but she 926 00:51:15,360 --> 00:51:18,319 Speaker 2: now has a new relationship. So it doesn't lessen how 927 00:51:18,360 --> 00:51:21,000 Speaker 2: much we love that person. I think, again, it's 928 00:51:21,040 --> 00:51:23,040 Speaker 2: part of the grieving process, but we need to move 929 00:51:23,080 --> 00:51:26,840 Speaker 2: on from that and form those new relationships. You know, 930 00:51:26,840 --> 00:51:29,160 Speaker 2: if a close friend passes away or we lose someone 931 00:51:29,160 --> 00:51:31,719 Speaker 2: in our life, then maybe it just means that it 932 00:51:31,840 --> 00:51:34,640 Speaker 2: makes way for somebody else to take that space, or 933 00:51:34,880 --> 00:51:37,360 Speaker 2: a little part of the space that we had for 934 00:51:37,400 --> 00:51:37,880 Speaker 2: that person. 935 00:51:38,680 --> 00:51:43,520 Speaker 1: Yeah.
Yeah, it's good chatting with you, buddy. Tell people 936 00:51:43,520 --> 00:51:46,920 Speaker 1: how to connect with you and reach out if they 937 00:51:47,000 --> 00:51:47,359 Speaker 1: want to. 938 00:51:48,080 --> 00:51:52,000 Speaker 2: Websites: now dot com dot au, or tai chi at 939 00:51:52,000 --> 00:51:55,680 Speaker 2: home dot com dot au. So there's the yin and yang 940 00:51:55,840 --> 00:51:59,120 Speaker 2: of my lifestyle. One is my work, 941 00:51:59,160 --> 00:52:01,239 Speaker 2: and one is my passion, and it's nice to have 942 00:52:01,320 --> 00:52:02,040 Speaker 2: them overlap. 943 00:52:03,800 --> 00:52:06,399 Speaker 1: And Tiff, try not to lose your shit at your 944 00:52:06,480 --> 00:52:10,239 Speaker 1: cat in the upcoming days, if you would. Yes, sir. 945 00:52:12,239 --> 00:52:15,480 Speaker 1: We'll say goodbye off air. But for now, thanks, kids. Thanks, everyone.