1 00:00:00,920 --> 00:00:03,680 Speaker 1: Hello, team. It's Craig Anthony Harper, Tiffanee Cook, Patrick 2 00:00:03,840 --> 00:00:06,480 Speaker 1: James Bonello. It is The You Project on a Friday. As 3 00:00:06,519 --> 00:00:09,360 Speaker 1: we record, it's five minutes past nine here in the 4 00:00:09,440 --> 00:00:12,200 Speaker 1: thriving metropolis of Melbourne. Where Tiffanee Cook is, it's 5 00:00:12,240 --> 00:00:15,600 Speaker 1: nineteen eighty-four, and it's about nineteen ninety-eight where 6 00:00:15,640 --> 00:00:16,280 Speaker 1: Patrick is. 7 00:00:16,640 --> 00:00:20,560 Speaker 2: Hi everyone. Hi, hi. Gee, that was two in one go. 8 00:00:20,880 --> 00:00:21,360 Speaker 3: That's two. 9 00:00:22,880 --> 00:00:25,479 Speaker 1: I've just come in off the long run and I've 10 00:00:25,520 --> 00:00:29,000 Speaker 1: just released a fastball. Tiff, how's Tassie? 11 00:00:29,600 --> 00:00:32,720 Speaker 4: I've literally just come in off a long run. See... 12 00:00:32,720 --> 00:00:33,800 Speaker 4: it's red, my face. 13 00:00:34,240 --> 00:00:37,000 Speaker 1: Yeah, I was looking at your face. I'm thinking you're 14 00:00:37,040 --> 00:00:41,000 Speaker 1: either really embarrassed or you've been doing something where your 15 00:00:41,000 --> 00:00:42,360 Speaker 1: heart rate's over one hundred. 16 00:00:42,800 --> 00:00:44,320 Speaker 3: It definitely is. Like, running. 17 00:00:45,640 --> 00:00:46,279 Speaker 4: How good is it, 18 00:00:46,320 --> 00:00:46,560 Speaker 3: though? 19 00:00:47,560 --> 00:00:48,279 Speaker 1: How far'd you run? 20 00:00:48,320 --> 00:00:48,559 Speaker 3: It is. 21 00:00:48,560 --> 00:00:49,480 Speaker 1: What's your heart rate? How far'd you run? 22 00:00:50,040 --> 00:00:54,640 Speaker 3: I ran ten Ks. And what's your heart rate? Well... 23 00:00:54,720 --> 00:00:56,560 Speaker 4: I got a little message on my Garmin, and my 24 00:00:56,720 --> 00:01:00,080 Speaker 4: max heart rate just got increased to two oh one. 
25 00:01:00,160 --> 00:01:00,880 Speaker 3: So what is it now? 26 00:01:01,240 --> 00:01:03,080 Speaker 2: We should track it in five minutes, Craig, go, and 27 00:01:03,120 --> 00:01:04,919 Speaker 2: then incrementally see what it does. 28 00:01:05,040 --> 00:01:07,520 Speaker 1: It depends how long ago she stopped. Like, if she 29 00:01:07,600 --> 00:01:09,840 Speaker 1: stopped three minutes ago, that would be interesting, but if 30 00:01:09,880 --> 00:01:12,800 Speaker 1: she stopped ten minutes ago, it wouldn't be. How long ago 31 00:01:12,880 --> 00:01:13,160 Speaker 1: did you... 32 00:01:13,120 --> 00:01:15,959 Speaker 4: Finish? Fifteen minutes ago. 33 00:01:16,720 --> 00:01:17,200 Speaker 3: There you go. 34 00:01:17,840 --> 00:01:20,240 Speaker 1: So what's your heart rate right now? Can I guess? 35 00:01:20,520 --> 00:01:20,800 Speaker 4: Yeah? 36 00:01:21,840 --> 00:01:22,480 Speaker 1: Sixty-one? 37 00:01:23,360 --> 00:01:23,440 Speaker 3: No. 38 00:01:24,400 --> 00:01:27,759 Speaker 4: What is it? Ninety? I actually have a high working 39 00:01:27,760 --> 00:01:30,480 Speaker 4: heart rate. My resting heart rate's fifty right now, like, 40 00:01:30,840 --> 00:01:35,800 Speaker 4: this week, right now. But when I move and exercise, 41 00:01:35,880 --> 00:01:36,520 Speaker 4: it goes up. 42 00:01:37,480 --> 00:01:42,399 Speaker 1: Yeah, that's not necessarily... What's yours, Patrick? I see you 43 00:01:42,440 --> 00:01:44,560 Speaker 1: looking at your bloody data there on your wrist. 44 00:01:44,920 --> 00:01:48,080 Speaker 2: Well, my heart rate at the moment is varying between 45 00:01:48,120 --> 00:01:50,480 Speaker 2: about sixty. It's actually going up now. It was 46 00:01:50,520 --> 00:01:53,240 Speaker 2: sixty-two, it's now gone up to sixty-four. But 47 00:01:53,280 --> 00:01:56,080 Speaker 2: my average is fifty-three. So what is it about? 
48 00:01:56,120 --> 00:01:58,040 Speaker 3: Sitting in the room with you two. I'll tell you 49 00:01:58,040 --> 00:01:59,280 Speaker 3: what, I'm going to take... 50 00:02:00,000 --> 00:02:01,440 Speaker 1: Don't you look at it. I'm going to take my 51 00:02:01,520 --> 00:02:03,440 Speaker 1: shirt off and then let's see what happens. 52 00:02:03,760 --> 00:02:05,440 Speaker 3: And then can Tiff take her shirt off? 53 00:02:05,920 --> 00:02:06,120 Speaker 1: Yeah? 54 00:02:06,280 --> 00:02:09,079 Speaker 3: See, I nearly didn't put it on. I was that hot. 55 00:02:09,160 --> 00:02:10,760 Speaker 4: That's why I was a little bit late; 56 00:02:10,760 --> 00:02:12,000 Speaker 4: I thought I'd better put my shirt on. 57 00:02:13,320 --> 00:02:15,600 Speaker 1: I think... thanks for that. We're a family show. 58 00:02:15,880 --> 00:02:18,359 Speaker 3: It's got to seventy-four. What the hell? You 59 00:02:18,600 --> 00:02:21,600 Speaker 3: haven't even got it off yet. Was it just the thought of it? 60 00:02:22,120 --> 00:02:24,320 Speaker 1: That's the thought of my pecs being on the screen. 61 00:02:24,400 --> 00:02:26,480 Speaker 3: Yours is going back up now, it's going higher. 62 00:02:27,160 --> 00:02:31,360 Speaker 1: Yeah, you couldn't fucking cope, bro. Patrick, how's life in 63 00:02:31,400 --> 00:02:32,720 Speaker 1: the bush? You know? 64 00:02:33,240 --> 00:02:35,919 Speaker 2: As I wandered down to my little studio, I picked 65 00:02:35,919 --> 00:02:38,639 Speaker 2: an apple from the tree as I sat down. 66 00:02:38,880 --> 00:02:40,320 Speaker 3: I mean, isn't life great? 67 00:02:42,120 --> 00:02:44,720 Speaker 1: I know. I'll tell you what, there's no one in... 68 00:02:45,120 --> 00:02:48,400 Speaker 1: not too many people in suburbia picking an apple on 69 00:02:48,440 --> 00:02:51,520 Speaker 1: the way to their podcast studio, that is, or their 70 00:02:51,560 --> 00:02:54,600 Speaker 1: recording studio. 
I don't think I've ever heard that sentence, 71 00:02:54,639 --> 00:02:57,399 Speaker 1: and probably never will again. Tiff, just before we talk 72 00:02:57,400 --> 00:03:00,960 Speaker 1: about actual things, have you seen your granddad yet? 73 00:03:01,520 --> 00:03:02,280 Speaker 3: No, not yet. 74 00:03:02,320 --> 00:03:04,360 Speaker 4: He's out for a country drive today with some 75 00:03:04,400 --> 00:03:07,119 Speaker 4: friends before they go back to wherever they came from. 76 00:03:08,400 --> 00:03:11,280 Speaker 1: And how's one hundred and one looking on him? 77 00:03:12,080 --> 00:03:13,360 Speaker 4: Mate, how exciting. 78 00:03:13,720 --> 00:03:14,560 Speaker 1: Really good? 79 00:03:14,760 --> 00:03:15,320 Speaker 4: Yeah. 80 00:03:15,400 --> 00:03:16,480 Speaker 3: Oh, that's amazing. 81 00:03:16,800 --> 00:03:21,040 Speaker 4: Yeah, he's kicking it. Wow. Mate, I am good. 82 00:03:21,120 --> 00:03:22,880 Speaker 3: Genes, that helps, doesn't it? 83 00:03:22,960 --> 00:03:27,160 Speaker 2: Yeah, look at that for a lineage, pedigree. 84 00:03:27,440 --> 00:03:29,919 Speaker 3: Where's Luna? What have you done with Luna while you're away? 85 00:03:30,240 --> 00:03:35,360 Speaker 4: She's here with me. Yeah, she came on the boat. 86 00:03:38,320 --> 00:03:40,560 Speaker 2: Yeah, it's funny, isn't it, because they have two new 87 00:03:40,600 --> 00:03:42,360 Speaker 2: ships... because you had to put Luna in a little 88 00:03:42,400 --> 00:03:45,760 Speaker 2: pound, though, didn't you, during the voyage? Because you know 89 00:03:45,800 --> 00:03:48,520 Speaker 2: the new ships that they've built, that are currently, I 90 00:03:48,560 --> 00:03:52,320 Speaker 2: think, one's in Norway, one's in Scotland, they're now like 91 00:03:52,320 --> 00:03:55,000 Speaker 2: two or three years away. But the thing is, 92 00:03:55,000 --> 00:03:57,560 Speaker 2: in the new ships, 
you're going 93 00:03:57,640 --> 00:04:00,520 Speaker 2: to be able to go from Victoria to Tasmania and back 94 00:04:00,840 --> 00:04:02,760 Speaker 2: and have your dog in the cabin with you. 95 00:04:03,280 --> 00:04:05,760 Speaker 3: Mm hmm, how good's that? I can't wait. 96 00:04:06,000 --> 00:04:08,120 Speaker 1: You know the obvious question. I'm not even going 97 00:04:08,200 --> 00:04:10,840 Speaker 1: to ask it, but can you answer the obvious question? 98 00:04:11,600 --> 00:04:11,880 Speaker 3: What? 99 00:04:12,600 --> 00:04:13,480 Speaker 1: Oh, come on. 100 00:04:13,800 --> 00:04:16,320 Speaker 3: I don't know. What's the obvious question? Well... 101 00:04:16,360 --> 00:04:17,560 Speaker 1: The ship. The dog shit? 102 00:04:18,480 --> 00:04:22,839 Speaker 2: Well, your dog's trained. Fritz will go for nine hours 103 00:04:22,880 --> 00:04:23,760 Speaker 2: without having to go. 104 00:04:24,320 --> 00:04:27,000 Speaker 4: There's an allocated area on the ship where they can 105 00:04:27,040 --> 00:04:28,119 Speaker 4: take them for wee-wees. 106 00:04:28,760 --> 00:04:32,360 Speaker 1: Yes. Do they call that the ship shitting area? 107 00:04:32,839 --> 00:04:34,240 Speaker 3: Holy ship! Yeah. 108 00:04:34,800 --> 00:04:39,360 Speaker 1: Holy ship. All right, let's talk about technology, because 109 00:04:39,360 --> 00:04:45,159 Speaker 1: that's our... that's our mission. That's allegedly our mission every time. 110 00:04:47,320 --> 00:04:51,640 Speaker 2: Mate, look, I just... you know, I was reading... social 111 00:04:51,680 --> 00:04:54,560 Speaker 2: media gets me a little bit riled, okay? So I 112 00:04:55,000 --> 00:04:57,560 Speaker 2: stopped social media, I think, about eight years ago. I 113 00:04:57,560 --> 00:05:00,560 Speaker 2: stopped using it personally. And it is easy to get 114 00:05:00,600 --> 00:05:04,760 Speaker 2: caught up in the momentum of comments. Because once upon 115 00:05:04,800 --> 00:05:08,840 Speaker 2: a time, journalism was filtered. 
You had an editor, and 116 00:05:08,960 --> 00:05:12,479 Speaker 2: anything that was published was vetted before it went out. Now, 117 00:05:12,520 --> 00:05:15,760 Speaker 2: with social media, it's like, you know, it's electronic vomit. 118 00:05:16,040 --> 00:05:19,400 Speaker 2: Anybody can say anything at any time, and it sparks debate. 119 00:05:19,440 --> 00:05:23,200 Speaker 2: And I was reading an article yesterday while flicking, doomscrolling, 120 00:05:23,600 --> 00:05:26,640 Speaker 2: and it was an article in reaction to a radio 121 00:05:26,760 --> 00:05:31,039 Speaker 2: segment where people were getting really riled up about taking 122 00:05:31,040 --> 00:05:34,800 Speaker 2: your dogs to Bunnings. So for about eight years, the 123 00:05:34,839 --> 00:05:38,240 Speaker 2: Bunnings hardware store has allowed people to take their dogs. 124 00:05:38,480 --> 00:05:40,080 Speaker 3: So Fritzy comes with me all the time. 125 00:05:40,120 --> 00:05:42,400 Speaker 2: I love taking Fritzy to Bunnings, and I put him 126 00:05:42,400 --> 00:05:45,040 Speaker 2: into the trolley and everyone loves Fritzy, so people come 127 00:05:45,080 --> 00:05:47,240 Speaker 2: over and pat him, and kids, you know, love him. 128 00:05:47,240 --> 00:05:49,159 Speaker 2: He's a great, cute little dog and he's well behaved. 129 00:05:49,160 --> 00:05:53,000 Speaker 2: But it sparked all this controversy. But the article that 130 00:05:53,200 --> 00:05:57,279 Speaker 2: was published was just in reaction to this radio segment, 131 00:05:57,320 --> 00:05:59,360 Speaker 2: so really, the journalist hadn't done any work. 132 00:05:59,480 --> 00:06:00,919 Speaker 3: It was just quoting segments. 
133 00:06:01,320 --> 00:06:06,120 Speaker 2: And what fascinated me was there were probably about seventy comments, 134 00:06:06,120 --> 00:06:09,680 Speaker 2: and I started flicking through the comments, and what really 135 00:06:09,680 --> 00:06:14,000 Speaker 2: got me was the anger, the absolute anger and angst, 136 00:06:14,400 --> 00:06:17,479 Speaker 2: with people firing off at each other. So someone said, oh, 137 00:06:17,640 --> 00:06:19,760 Speaker 2: I can bring the dog, but, you know, you shouldn't 138 00:06:19,800 --> 00:06:22,280 Speaker 2: let kids in, because kids are worse than pets. And other 139 00:06:22,320 --> 00:06:24,800 Speaker 2: people were saying, I've seen dogs pee on 140 00:06:24,839 --> 00:06:28,200 Speaker 2: the, you know, on the products. And Bunnings has this 141 00:06:28,279 --> 00:06:31,120 Speaker 2: philosophy that basically most dogs are actually well behaved. If 142 00:06:31,120 --> 00:06:34,440 Speaker 2: you're going to bring them into a large place like Bunnings, 143 00:06:34,640 --> 00:06:36,640 Speaker 2: most people do the right thing, and it's very rare. 144 00:06:36,720 --> 00:06:38,919 Speaker 2: And if they do have an accident, well, they have 145 00:06:39,600 --> 00:06:41,480 Speaker 2: clean-up crews, and you can get a dog 146 00:06:41,520 --> 00:06:44,159 Speaker 2: bag and clean up after them. So they just 147 00:06:44,200 --> 00:06:47,279 Speaker 2: take that approach. But it was so vehement. And 148 00:06:47,320 --> 00:06:49,400 Speaker 2: it's not so much the topic of whether you should 149 00:06:49,440 --> 00:06:52,880 Speaker 2: or shouldn't take dogs into places like Bunnings, because if 150 00:06:52,880 --> 00:06:55,279 Speaker 2: you go to Europe, you can take your dog anywhere. 151 00:06:55,320 --> 00:06:57,080 Speaker 2: You can go to a restaurant, and if your dog's 152 00:06:57,120 --> 00:06:59,000 Speaker 2: well behaved, it sits on the ground. 
We went to a really 153 00:06:59,080 --> 00:07:01,839 Speaker 2: nice restaurant in Vienna. It was amazing, a really 154 00:07:01,920 --> 00:07:04,280 Speaker 2: high-class restaurant, and the people we were meeting had 155 00:07:04,320 --> 00:07:07,600 Speaker 2: a dog, and it was fantastic. And because I like 156 00:07:07,680 --> 00:07:09,320 Speaker 2: to take my dog with me... then there's the idea 157 00:07:09,360 --> 00:07:12,880 Speaker 2: of companion dogs or support dogs. But what got me, 158 00:07:13,240 --> 00:07:15,920 Speaker 2: more so, it wasn't so much the debate but the 159 00:07:15,960 --> 00:07:21,000 Speaker 2: anger and just the fiery tension between the respondents. You know, 160 00:07:21,040 --> 00:07:23,520 Speaker 2: you've got the yes side, the no side, but the 161 00:07:23,600 --> 00:07:26,360 Speaker 2: personal digs people were making at each other. 162 00:07:26,720 --> 00:07:30,160 Speaker 3: So what causes that frenzy? And it does... 163 00:07:30,480 --> 00:07:32,760 Speaker 2: I mean, you've probably seen those sorts of threads or 164 00:07:32,800 --> 00:07:36,560 Speaker 2: those comments or reactions on socials where people just get 165 00:07:36,760 --> 00:07:40,520 Speaker 2: so irate. And then it made me think, well, I 166 00:07:40,560 --> 00:07:42,280 Speaker 2: was going to respond, because, as you know, I'm a 167 00:07:42,320 --> 00:07:44,600 Speaker 2: passionate dog owner. Fritzy's sitting right next to me in 168 00:07:44,640 --> 00:07:47,560 Speaker 2: my chair. But the reality of it is, I started 169 00:07:47,600 --> 00:07:49,360 Speaker 2: to type and I thought, what am I going to 170 00:07:49,400 --> 00:07:52,640 Speaker 2: achieve by having a go at this person for saying 171 00:07:52,720 --> 00:07:56,040 Speaker 2: what they did, or, you know, trying to exercise my opinion? 172 00:07:56,640 --> 00:07:57,760 Speaker 3: And I just pulled back from it. 
173 00:07:57,760 --> 00:07:59,880 Speaker 2: I deleted the message and I thought, I just don't 174 00:08:00,200 --> 00:08:03,080 Speaker 2: want to engage with people. 175 00:08:03,640 --> 00:08:07,920 Speaker 1: I mean, yeah... maybe because you don't do social media. 176 00:08:08,120 --> 00:08:10,720 Speaker 1: Tiff and I are going, yeah, of course, Patrick, this 177 00:08:10,880 --> 00:08:13,560 Speaker 1: is what social media is like. This is... I don't 178 00:08:13,560 --> 00:08:16,040 Speaker 1: mean to be rude, this is no revelation at all. 179 00:08:16,240 --> 00:08:18,920 Speaker 1: Like, this is all day, every day on social media. 180 00:08:20,160 --> 00:08:24,080 Speaker 1: It's interesting. But look, here's what I think. I mean, 181 00:08:24,120 --> 00:08:26,640 Speaker 1: it's a revelation and it's interesting for you, and it's 182 00:08:26,680 --> 00:08:29,160 Speaker 1: good for me to see that, because I'm like... I'm 183 00:08:29,320 --> 00:08:32,319 Speaker 1: so used to that and desensitized to that, and 184 00:08:33,040 --> 00:08:36,440 Speaker 1: I'm like, well, of course. But here's what you did 185 00:08:36,520 --> 00:08:40,320 Speaker 1: that's great: you consciously went, I'm not giving this 186 00:08:40,440 --> 00:08:44,679 Speaker 1: my attention or energy. And I think that's, like, 187 00:08:44,720 --> 00:08:47,600 Speaker 1: the good and bad of social media. It's like, I 188 00:08:47,640 --> 00:08:50,880 Speaker 1: have over one hundred thousand followers on social media and 189 00:08:51,040 --> 00:08:54,440 Speaker 1: I only put good things into the world, and ninety- 190 00:08:54,559 --> 00:08:58,199 Speaker 1: nine percent of the feedback that I get is positive. 191 00:08:58,320 --> 00:09:02,160 Speaker 1: Ninety-nine percent. So I think it depends. Like, to 192 00:09:02,240 --> 00:09:05,599 Speaker 1: me, ultimately... can you hear that ambulance, you guys? No? 193 00:09:08,400 --> 00:09:08,960 Speaker 3: Right, mate. 
194 00:09:09,280 --> 00:09:10,400 Speaker 1: Yeah, no, pretty good. 195 00:09:10,440 --> 00:09:11,280 Speaker 3: It wasn't for you? 196 00:09:11,679 --> 00:09:13,959 Speaker 1: Yeah, no, thanks. Yeah... hang on, am I having 197 00:09:13,960 --> 00:09:17,960 Speaker 1: a heart attack now? But I think, like, ultimately, what is 198 00:09:18,000 --> 00:09:22,640 Speaker 1: social media? It's a technology that we can use. It's 199 00:09:22,679 --> 00:09:25,840 Speaker 1: a resource, and if you want to... you know, it's 200 00:09:25,880 --> 00:09:28,079 Speaker 1: like you can kill someone with a knife or make 201 00:09:28,120 --> 00:09:31,559 Speaker 1: your dinner with a knife. Like, ultimately, I 202 00:09:31,640 --> 00:09:34,319 Speaker 1: think most of the things that we have... or kill 203 00:09:34,400 --> 00:09:36,720 Speaker 1: someone and then make them your dinner. Yeah, sure, there's that, 204 00:09:37,160 --> 00:09:40,720 Speaker 1: but I wouldn't recommend it. But it's like, all of 205 00:09:40,760 --> 00:09:43,600 Speaker 1: these things are just tools and resources. So I don't 206 00:09:43,679 --> 00:09:48,280 Speaker 1: know that social media is arbitrarily bad or that it's 207 00:09:48,400 --> 00:09:51,360 Speaker 1: all vomit. I don't think the stuff I put into 208 00:09:51,400 --> 00:09:54,439 Speaker 1: the world via social media... I'm using the same platforms 209 00:09:54,480 --> 00:09:56,840 Speaker 1: that all the hate is on, but I choose to 210 00:09:56,840 --> 00:10:01,440 Speaker 1: put positive stuff. So I think it's personal responsibility. I 211 00:10:01,559 --> 00:10:06,520 Speaker 1: don't follow any of the horrible, ugly, you know, bullshitty ones, 212 00:10:06,559 --> 00:10:09,319 Speaker 1: and if something comes up, I just scroll past 213 00:10:09,440 --> 00:10:13,080 Speaker 1: or I delete it. I think it's about using it consciously. 
214 00:10:13,160 --> 00:10:15,360 Speaker 1: But you do raise a good point. Like, so much 215 00:10:15,400 --> 00:10:18,520 Speaker 1: of what is on there is fucking horrible, but you've 216 00:10:18,520 --> 00:10:19,680 Speaker 1: got to self-regulate. 217 00:10:20,160 --> 00:10:22,520 Speaker 2: I was more thinking also in terms of whether you 218 00:10:22,600 --> 00:10:25,559 Speaker 2: engage or not, and I almost did, and then I 219 00:10:25,600 --> 00:10:26,720 Speaker 2: really stepped back from it. 220 00:10:26,800 --> 00:10:28,560 Speaker 3: I thought it through. It's funny. 221 00:10:28,640 --> 00:10:31,360 Speaker 2: A friend of mine... you know when 222 00:10:31,400 --> 00:10:34,040 Speaker 2: someone sends an angry letter? They called it 223 00:10:34,040 --> 00:10:34,679 Speaker 2: a shit-a-gram. 224 00:10:35,960 --> 00:10:39,440 Speaker 3: A shit-a-gram. That's funny. Yeah, 225 00:10:39,559 --> 00:10:40,720 Speaker 3: I thought that was quite funny. 226 00:10:40,960 --> 00:10:46,040 Speaker 2: But the problem with putting anything in writing is that 227 00:10:46,160 --> 00:10:49,000 Speaker 2: if you get angry in writing... you can... I can 228 00:10:49,040 --> 00:10:51,400 Speaker 2: get angry with you right now, and then we'll shake 229 00:10:51,640 --> 00:10:53,400 Speaker 2: hands about it and, you know, buy each other a 230 00:10:53,400 --> 00:10:55,840 Speaker 2: coffee, and we'll be fine. But if you put it 231 00:10:55,880 --> 00:10:58,360 Speaker 2: in writing, then you go back to it two days 232 00:10:58,440 --> 00:11:00,720 Speaker 2: later and you reread it, and you're just as angry as 233 00:11:00,760 --> 00:11:03,200 Speaker 2: you were the first time you read it. And then 234 00:11:03,360 --> 00:11:05,679 Speaker 2: again, and you get more angry, and you read it again, 235 00:11:05,880 --> 00:11:09,040 Speaker 2: and it keeps firing you up. 
And we have a 236 00:11:09,120 --> 00:11:13,199 Speaker 2: little internal policy in my office that if you feel 237 00:11:13,280 --> 00:11:16,319 Speaker 2: riled and you're going to send a challenging email, you 238 00:11:16,360 --> 00:11:19,600 Speaker 2: never put in the email address of the person you're 239 00:11:19,640 --> 00:11:22,480 Speaker 2: responding to. So if you hit reply, take their 240 00:11:22,520 --> 00:11:24,920 Speaker 2: email address out. Get it out on paper if you 241 00:11:24,960 --> 00:11:26,600 Speaker 2: feel you need to, or type it up or whatever. 242 00:11:27,000 --> 00:11:31,200 Speaker 2: But generally, if you walk away, spend five or ten minutes 243 00:11:31,240 --> 00:11:34,080 Speaker 2: and then come back to it, actually, you've calmed down. 244 00:11:34,240 --> 00:11:36,360 Speaker 2: I mean, my heart rate dropped down to fifty-two 245 00:11:36,440 --> 00:11:39,000 Speaker 2: just a moment ago. So whatever it was that 246 00:11:39,080 --> 00:11:41,000 Speaker 2: riled me up... not, you know, the thought of 247 00:11:41,040 --> 00:11:42,240 Speaker 2: your shirt coming off or something. 248 00:11:42,280 --> 00:11:44,080 Speaker 3: I don't know, but you 249 00:11:44,120 --> 00:11:48,559 Speaker 1: know... I mean, yeah. Like... this is getting off topic. 250 00:11:48,600 --> 00:11:50,960 Speaker 1: We'll get onto tech, but this is my little wheelhouse, 251 00:11:51,000 --> 00:11:55,280 Speaker 1: you know. It's like, everyone thinks they're right. Yeah. Like, 252 00:11:55,320 --> 00:11:59,520 Speaker 1: this is the thing: whatever side you're on, religious, political, 253 00:12:00,280 --> 00:12:04,560 Speaker 1: whatever the issue, whatever the topic, nobody who's writing that thing, 254 00:12:04,880 --> 00:12:10,440 Speaker 1: whatever angle or approach or perspective they have, nobody thinks 255 00:12:10,480 --> 00:12:14,040 Speaker 1: they're wrong. 
And this is the problem, you know. Like, 256 00:12:14,240 --> 00:12:18,480 Speaker 1: I now, after getting things wrong... sometimes wrong and sometimes right, 257 00:12:18,520 --> 00:12:22,200 Speaker 1: for sixty years... I pretty much assume that anything that 258 00:12:22,280 --> 00:12:25,560 Speaker 1: I say, there's the potential that I'm wrong about it, 259 00:12:26,080 --> 00:12:28,640 Speaker 1: you know. Or anything for which I don't have absolute 260 00:12:28,640 --> 00:12:31,559 Speaker 1: evidence or proof. Like, if it's an opinion or an 261 00:12:31,600 --> 00:12:34,839 Speaker 1: idea or a theory, I'm well aware that I could 262 00:12:34,840 --> 00:12:37,520 Speaker 1: be wrong, and so I try not to wrap too 263 00:12:37,600 --> 00:12:41,080 Speaker 1: much emotion around it, and go, look, this is what 264 00:12:41,200 --> 00:12:43,600 Speaker 1: I think, and this is why I think it, but 265 00:12:43,679 --> 00:12:46,320 Speaker 1: I could be wrong. You know, I think, like, a 266 00:12:46,400 --> 00:12:49,400 Speaker 1: world where people don't have to be right all the time... 267 00:12:50,280 --> 00:12:52,840 Speaker 1: we're never going to live in that world, but fuck, 268 00:12:52,880 --> 00:12:54,840 Speaker 1: that'd be a nicer place to inhabit, wouldn't it? 269 00:12:55,360 --> 00:12:56,400 Speaker 3: But isn't it interesting, 270 00:12:56,440 --> 00:12:58,960 Speaker 2: though? We were talking about your thesis, and the whole 271 00:12:59,040 --> 00:13:03,520 Speaker 2: premise of science is that you challenge, you always put 272 00:13:03,600 --> 00:13:07,880 Speaker 2: up and propose new ideas, and nothing is ever definitively, 273 00:13:08,400 --> 00:13:11,000 Speaker 2: infinitely written in stone as such, is it? 274 00:13:11,320 --> 00:13:14,560 Speaker 1: Yeah, yeah, that's true. 
I mean, science is always evolving, 275 00:13:14,600 --> 00:13:17,120 Speaker 1: and knowledge is always evolving, and the way that we 276 00:13:18,080 --> 00:13:24,640 Speaker 1: design studies and run studies and interpret data. And I think, look, 277 00:13:24,800 --> 00:13:26,720 Speaker 1: it depends on the science. You know, when you talk 278 00:13:26,760 --> 00:13:30,840 Speaker 1: about biology and physiology and engineering and all of those things, 279 00:13:30,880 --> 00:13:35,560 Speaker 1: they're, like, hard sciences, whereas psychology is... ah, and I 280 00:13:35,600 --> 00:13:40,839 Speaker 1: love psychology, but it's very... it's hard to measure the 281 00:13:40,920 --> 00:13:45,160 Speaker 1: mind, because there is no such thing, ontologically, 282 00:13:45,760 --> 00:13:49,640 Speaker 1: which means there's no actual evidence for a mind. But 283 00:13:49,760 --> 00:13:52,360 Speaker 1: we're studying the mind and the workings of the mind, 284 00:13:52,920 --> 00:13:55,640 Speaker 1: because all of the kind of processing actually happens in 285 00:13:55,679 --> 00:13:58,360 Speaker 1: the brain. So when you do it... even if... I've 286 00:13:58,360 --> 00:14:00,280 Speaker 1: said this, yes, on a podcast, but if you do 287 00:14:00,320 --> 00:14:02,960 Speaker 1: a search, if you go to any AI thing and 288 00:14:03,000 --> 00:14:07,880 Speaker 1: say, is there evidence, is there scientific, hard evidence, that 289 00:14:08,440 --> 00:14:13,600 Speaker 1: the mind exists, it'll say no, because all thinking and 290 00:14:13,720 --> 00:14:17,640 Speaker 1: all data processing and all interpretation actually happens in the brain. 291 00:14:18,040 --> 00:14:22,640 Speaker 1: So the mind, as we understand it, it's a concept 292 00:14:22,760 --> 00:14:26,760 Speaker 1: that helps us decode the workings of the brain. And 293 00:14:26,800 --> 00:14:31,760 Speaker 1: it's not a bad thing. But yeah, anyway... 294 00:14:31,600 --> 00:14:33,160 Speaker 3: Come on. All right. 
295 00:14:33,960 --> 00:14:37,360 Speaker 2: I heard a new term the other day, digital shoplifting, 296 00:14:37,680 --> 00:14:39,440 Speaker 2: and it kind of blew my mind a little bit. 297 00:14:39,480 --> 00:14:41,000 Speaker 3: I hadn't ever thought about this. 298 00:14:41,560 --> 00:14:43,840 Speaker 2: So there was a bit of a research study that 299 00:14:44,000 --> 00:14:47,320 Speaker 2: was done, and it was about a lot of wealthy 300 00:14:47,640 --> 00:14:49,280 Speaker 2: Gen Zs and millennials. 301 00:14:49,360 --> 00:14:52,080 Speaker 3: This is in the US, and evidently... 302 00:14:51,720 --> 00:14:54,200 Speaker 2: half of them, who were earning over one hundred thousand 303 00:14:54,240 --> 00:14:59,600 Speaker 2: dollars US, have admitted to digital shoplifting. 304 00:15:00,160 --> 00:15:03,640 Speaker 3: What's digital shoplifting, you ask? Yes, yeah. 305 00:15:03,440 --> 00:15:07,640 Speaker 1: Yes... what... Patrick, hey, I have a question. What's digital shoplifting? 306 00:15:07,840 --> 00:15:09,080 Speaker 3: I'm glad you asked, Craig. 307 00:15:09,400 --> 00:15:12,320 Speaker 2: So what happens is, if you go online and you 308 00:15:12,440 --> 00:15:15,760 Speaker 2: make a purchase and that item arrives, you then go 309 00:15:15,880 --> 00:15:19,320 Speaker 2: back to the seller and say the item didn't arrive, the item 310 00:15:19,400 --> 00:15:22,640 Speaker 2: was broken, the item got lost. And then, in a lot 311 00:15:22,640 --> 00:15:26,600 Speaker 2: of cases, the online resellers will basically give you a 312 00:15:26,680 --> 00:15:29,720 Speaker 2: refund or send you another item. So it's people being 313 00:15:29,800 --> 00:15:32,800 Speaker 2: deceptive, and there's not a lot of recourse. This is 314 00:15:32,880 --> 00:15:35,760 Speaker 2: really interesting, because if I walked into a shop... 
315 00:15:35,680 --> 00:15:39,360 Speaker 1: And I feel like we shouldn't be sharing this information, 316 00:15:39,880 --> 00:15:44,160 Speaker 1: I feel like you're encouraging bad behavior because. 317 00:15:43,920 --> 00:15:46,280 Speaker 3: We've got the best listeners. Ever, no one's going to 318 00:15:46,320 --> 00:15:46,600 Speaker 3: do that. 319 00:15:47,160 --> 00:15:50,240 Speaker 1: There's a few dodgy motherfuckers that listen. Don't worry about that. 320 00:15:50,400 --> 00:15:51,480 Speaker 1: You know who you are. 321 00:15:54,720 --> 00:15:56,880 Speaker 2: So anyway, so Craig is going to go out and 322 00:15:56,880 --> 00:16:03,320 Speaker 2: buy a new motorbike tomorrow exactly. So it would never 323 00:16:03,360 --> 00:16:06,480 Speaker 2: have occurred to me, Am I just like naive or something. 324 00:16:06,520 --> 00:16:10,000 Speaker 2: But they say half half of these people will earn 325 00:16:10,080 --> 00:16:13,600 Speaker 2: over one hundred thousand US. That's probably nearly two hundred 326 00:16:13,640 --> 00:16:17,840 Speaker 2: thousand Australian. And in fact, I think the Australian dollars dropped again, so. 327 00:16:18,440 --> 00:16:21,960 Speaker 1: I think it's a million Australian, it's ten million Australian. 328 00:16:22,280 --> 00:16:25,640 Speaker 2: Yeah, but isn't that interesting? Do you think it's a mindset? 329 00:16:26,400 --> 00:16:28,760 Speaker 2: I don't know how you could kind of, you know, 330 00:16:29,160 --> 00:16:31,960 Speaker 2: kind of define this. And also how you could justify 331 00:16:32,000 --> 00:16:34,320 Speaker 2: it because I think one of the problems when you 332 00:16:34,440 --> 00:16:37,880 Speaker 2: order online, you're not seeing the shopkeeper. 
You know, if 333 00:16:37,920 --> 00:16:40,080 Speaker 2: I walk into someone's store and the owner of the 334 00:16:40,120 --> 00:16:42,200 Speaker 2: store is standing right there, it's going to be really hard 335 00:16:42,560 --> 00:16:45,000 Speaker 2: to steal from somebody who's a person, who's someone you 336 00:16:45,040 --> 00:16:48,600 Speaker 2: can relate to, who's working hard, who's packing boxes, and, 337 00:16:48,640 --> 00:16:50,640 Speaker 2: you know, and they're obviously making 338 00:16:50,480 --> 00:16:51,840 Speaker 3: a living out of what they're doing. 339 00:16:52,280 --> 00:16:55,520 Speaker 2: But that whole firewall, that break, you know, the 340 00:16:55,600 --> 00:16:58,800 Speaker 2: digital divide between you and the other person, whether it's 341 00:16:58,800 --> 00:17:01,240 Speaker 2: social media and making a comment to another person's 342 00:17:01,320 --> 00:17:04,080 Speaker 2: name and not actually seeing them in person. Because when 343 00:17:04,280 --> 00:17:06,399 Speaker 2: we were talking about social media a moment ago, a 344 00:17:06,400 --> 00:17:08,879 Speaker 2: lot of what people say they probably wouldn't say to 345 00:17:09,000 --> 00:17:11,119 Speaker 2: their face if you were standing in a room together. 346 00:17:11,359 --> 00:17:14,480 Speaker 2: Or certainly the, you know, the dialogue wouldn't be as 347 00:17:14,520 --> 00:17:17,240 Speaker 2: heated, because you have an opportunity to be able to 348 00:17:17,240 --> 00:17:19,800 Speaker 2: interact with the person, and you're introducing the 349 00:17:19,840 --> 00:17:22,840 Speaker 2: personality and the humanity of the person. 350 00:17:23,080 --> 00:17:25,720 Speaker 3: And I guess, in the same way, because... 
351 00:17:25,400 --> 00:17:28,720 Speaker 2: People, you know, and I know we're kind of targeting 352 00:17:28,760 --> 00:17:31,600 Speaker 2: Gen Zs and millennials, but the reality of it is 353 00:17:31,640 --> 00:17:34,080 Speaker 2: anybody can do this online, and I think there is 354 00:17:34,119 --> 00:17:36,920 Speaker 2: a separation. If it's on Amazon, it's like this big company. 355 00:17:37,119 --> 00:17:38,440 Speaker 2: I can justify it to myself. 356 00:17:38,720 --> 00:17:42,800 Speaker 1: So, yeah, you make a very good point. And I 357 00:17:42,840 --> 00:17:48,439 Speaker 1: think that, yeah, because it's like there's nothing really personal, 358 00:17:48,920 --> 00:17:51,719 Speaker 1: there's no emotional side, you're not seeing another human, you're not 359 00:17:51,840 --> 00:17:59,760 Speaker 1: interacting with another human, typically. Yeah, look, I think. 360 00:18:00,080 --> 00:18:03,679 Speaker 1: I mean, I've owned lots of businesses and, you know, 361 00:18:04,400 --> 00:18:06,840 Speaker 1: I've had people steal from me a lot, and people 362 00:18:06,960 --> 00:18:10,720 Speaker 1: you wouldn't think. Yeah, and it 363 00:18:10,840 --> 00:18:13,479 Speaker 1: sounds cynical and negative. I'm not trying to be. But 364 00:18:13,600 --> 00:18:17,280 Speaker 1: my experience is that people, even people close to you, 365 00:18:17,320 --> 00:18:20,240 Speaker 1: will steal if they think they can steal and get 366 00:18:20,280 --> 00:18:22,600 Speaker 1: away with it. Which people are going to say, I 367 00:18:22,640 --> 00:18:24,280 Speaker 1: can't believe you would say that, it's very negative. 368 00:18:24,280 --> 00:18:24,560 Speaker 3: It's not. 369 00:18:24,640 --> 00:18:28,240 Speaker 1: No, I'm literally just reporting what has happened.
I have 370 00:18:28,280 --> 00:18:31,919 Speaker 1: a quick story, right, and it's about who would steal 371 00:18:32,000 --> 00:18:34,960 Speaker 1: and, like, trust and who you'd think. So I went to 372 00:18:36,640 --> 00:18:39,040 Speaker 1: I went away to do a gig and I was 373 00:18:39,200 --> 00:18:44,760 Speaker 1: staying in New Zealand, and I trained 374 00:18:44,760 --> 00:18:48,399 Speaker 1: at this big, amazing gym and I had a bunch 375 00:18:48,400 --> 00:18:49,360 Speaker 1: of stuff. I think. 376 00:18:49,520 --> 00:18:51,159 Speaker 2: Can I just say something before you go on? We 377 00:18:51,200 --> 00:18:53,400 Speaker 2: actually have spoken about this, the one with the cameras 378 00:18:53,480 --> 00:18:57,719 Speaker 2: and the gym. I apologize, sorry listeners. Yeah, go back 379 00:18:57,720 --> 00:18:59,359 Speaker 2: to a previous episode, because it's actually a 380 00:18:59,400 --> 00:19:00,120 Speaker 2: really good story. 381 00:19:00,119 --> 00:19:01,920 Speaker 3: But the short of it is, there was a new camera. 382 00:19:01,800 --> 00:19:05,280 Speaker 1: Anyway, sorry. They thought they could, anyway, he thought 383 00:19:05,280 --> 00:19:07,080 Speaker 1: one or two of his staff were stealing, and the 384 00:19:07,119 --> 00:19:11,840 Speaker 1: bottom line was they were nearly all stealing, all of them. Yeah, 385 00:19:11,880 --> 00:19:14,919 Speaker 1: you know, even, like, friends and shit. So given the 386 00:19:14,960 --> 00:19:20,119 Speaker 1: opportunity to steal, I think sometimes, and I think also 387 00:19:20,240 --> 00:19:23,720 Speaker 1: with stuff like you're talking about, maybe they don't. I 388 00:19:23,760 --> 00:19:27,040 Speaker 1: think they don't even think of that as stealing. Yeah, 389 00:19:27,240 --> 00:19:30,480 Speaker 1: it's like they would never think I'm a thief, but 390 00:19:30,560 --> 00:19:31,879 Speaker 1: it's literally stealing. 391 00:19:32,400 --> 00:19:35,399 Speaker 3: Yeah.
I think the only thing you can steal, Craigo, 392 00:19:35,960 --> 00:19:36,520 Speaker 3: is your heart. 393 00:19:37,080 --> 00:19:45,800 Speaker 1: Oh god, no, no, you can definitely steal money. I'm 394 00:19:46,240 --> 00:19:50,639 Speaker 1: interested in cars, of course. As you know, Tesla is 395 00:19:50,680 --> 00:19:52,840 Speaker 1: not going great in Australia at the moment. 396 00:19:53,280 --> 00:19:55,639 Speaker 2: Oh look, there's always a slump at the start of 397 00:19:55,680 --> 00:19:58,320 Speaker 2: the year, so there's always a bit of a glitch 398 00:19:58,359 --> 00:20:01,760 Speaker 2: in the car market, particularly with Tesla. So there's always 399 00:20:01,760 --> 00:20:04,359 Speaker 2: been this traditional drop, but it's been bigger than usual. 400 00:20:04,720 --> 00:20:07,440 Speaker 2: And look, a lot of people are putting it down 401 00:20:07,440 --> 00:20:10,760 Speaker 2: to the whole tech bro thing with the election in 402 00:20:10,800 --> 00:20:14,600 Speaker 2: the US, and there's been a real shift away from 403 00:20:14,640 --> 00:20:19,080 Speaker 2: Elon Musk. And he's forever connected with the Tesla company. 404 00:20:19,400 --> 00:20:21,760 Speaker 2: So when you buy a Tesla, I guess there's a 405 00:20:21,760 --> 00:20:24,280 Speaker 2: real sense that there's that close connection to Elon Musk. 406 00:20:24,320 --> 00:20:27,760 Speaker 3: It's his company, and you know, there's been. 407 00:20:28,160 --> 00:20:31,480 Speaker 2: I guess, a real kickback against the way that Elon 408 00:20:31,600 --> 00:20:36,480 Speaker 2: Musk has presented himself, his alignment with Donald Trump. And 409 00:20:36,520 --> 00:20:38,680 Speaker 2: they're really saying now that, like, we're talking 410 00:20:38,720 --> 00:20:41,520 Speaker 2: a drop of sixty percent below the same time last year. 411 00:20:41,880 --> 00:20:43,520 Speaker 3: It's a massive slump in the market.
412 00:20:43,600 --> 00:20:46,399 Speaker 2: I think they only sold about five hundred cars this 413 00:20:46,560 --> 00:20:49,439 Speaker 2: month or something last month, so it's been a big drop. 414 00:20:49,960 --> 00:20:53,080 Speaker 2: And I did a bit of a search, because I 415 00:20:53,119 --> 00:20:55,040 Speaker 2: don't know if you've seen the bumper stickers on some 416 00:20:55,160 --> 00:20:58,000 Speaker 2: Teslas now. People are putting bumper stickers on their car 417 00:20:58,119 --> 00:21:00,880 Speaker 2: saying I bought this before we knew he was crazy. 418 00:21:01,400 --> 00:21:05,119 Speaker 2: It's like a disclaimer. And I saw one 419 00:21:05,119 --> 00:21:08,320 Speaker 2: on a car in Melbourne where someone was driving 420 00:21:08,320 --> 00:21:09,120 Speaker 2: a Tesla and. 421 00:21:09,080 --> 00:21:10,119 Speaker 3: There was a story somewhere. 422 00:21:10,119 --> 00:21:11,919 Speaker 2: It might have been in the US where a 423 00:21:11,960 --> 00:21:14,119 Speaker 2: guy had bought a Tesla and then he returned it 424 00:21:14,160 --> 00:21:16,480 Speaker 2: and said I don't want it anymore. I mean, it's 425 00:21:16,480 --> 00:21:20,960 Speaker 2: pretty extreme, but I thought that was really interesting that, 426 00:21:21,000 --> 00:21:23,159 Speaker 2: you know, you can have such a negative impact just 427 00:21:23,200 --> 00:21:25,959 Speaker 2: by the way you present yourself, and that could, you know, 428 00:21:26,560 --> 00:21:28,479 Speaker 2: have a real impact on the car. 429 00:21:28,680 --> 00:21:31,159 Speaker 1: I don't know. I don't know, because I don't know 430 00:21:31,200 --> 00:21:35,600 Speaker 1: that correlation is causation.
Like also, what's happened in the 431 00:21:35,640 --> 00:21:39,760 Speaker 1: last year or two in Australia is a massive influx 432 00:21:40,000 --> 00:21:44,879 Speaker 1: of other electric cars, like, yeah, yeah, like BYD, who 433 00:21:45,160 --> 00:21:47,640 Speaker 1: basically have a car which is as good or better 434 00:21:47,680 --> 00:21:52,199 Speaker 1: than a Tesla Model 3 for about twenty grand cheaper. So 435 00:21:53,119 --> 00:21:55,400 Speaker 1: you could be right, but I don't know. The fact 436 00:21:55,480 --> 00:21:58,919 Speaker 1: that some people think that Elon's crazy and the numbers 437 00:21:58,920 --> 00:22:02,080 Speaker 1: are down, I don't know that, like, that that is 438 00:22:02,119 --> 00:22:05,200 Speaker 1: the cause. It could be, but I think it's more 439 00:22:05,560 --> 00:22:09,320 Speaker 1: nuanced than that. I mean, you know, 440 00:22:09,400 --> 00:22:13,000 Speaker 1: when Tesla was standalone, they had no competition, and 441 00:22:13,040 --> 00:22:15,679 Speaker 1: now in Australia and all around the world, there's vast 442 00:22:15,720 --> 00:22:20,760 Speaker 1: competition from lots of other brands who are producing more 443 00:22:20,840 --> 00:22:27,119 Speaker 1: and more incredible cars for significantly less, and some of 444 00:22:27,160 --> 00:22:29,760 Speaker 1: them with up to ten year warranties. 445 00:22:30,160 --> 00:22:33,719 Speaker 3: But the article did take that into consideration. 446 00:22:33,880 --> 00:22:36,439 Speaker 2: They did show, and they talked about the Australian market 447 00:22:36,480 --> 00:22:39,520 Speaker 2: having a slump in terms of Tesla sales now that 448 00:22:39,560 --> 00:22:43,399 Speaker 2: we're getting flooded with Chinese car makers. But that was 449 00:22:43,480 --> 00:22:46,320 Speaker 2: taken into account.
So even when you look at the 450 00:22:46,359 --> 00:22:50,560 Speaker 2: trajectory of the EV market starting to get real 451 00:22:50,640 --> 00:22:54,439 Speaker 2: momentum with Chinese cars, that still didn't account for the 452 00:22:54,440 --> 00:22:58,280 Speaker 2: bigger-than-usual drop in January of those sales. So 453 00:22:58,400 --> 00:22:59,639 Speaker 2: I think it plays a little bit of a part. 454 00:22:59,680 --> 00:23:03,280 Speaker 2: But you're absolutely right, it's amazing, the influx of 455 00:23:03,359 --> 00:23:06,560 Speaker 2: Chinese cars on the market at reasonable prices. Because previously, 456 00:23:06,720 --> 00:23:08,880 Speaker 2: you know, three years ago, I bought my hybrid car, 457 00:23:09,400 --> 00:23:12,680 Speaker 2: but I probably would have now. 458 00:23:12,600 --> 00:23:14,440 Speaker 3: In hindsight, I probably would have bought an electric. 459 00:23:14,800 --> 00:23:17,840 Speaker 2: And this is an interesting one too, because a lot 460 00:23:17,880 --> 00:23:20,400 Speaker 2: of people are resistant to the idea of getting electric, 461 00:23:20,440 --> 00:23:23,080 Speaker 2: because there's this notion that they don't last as long. 462 00:23:23,440 --> 00:23:25,800 Speaker 2: And a little bit of research that came out recently 463 00:23:25,880 --> 00:23:28,600 Speaker 2: was saying that the average electric car will have an 464 00:23:28,640 --> 00:23:31,199 Speaker 2: eighteen year life span, which is not dissimilar to 465 00:23:31,200 --> 00:23:35,119 Speaker 2: most petrol cars as well. So in terms of longevity 466 00:23:35,119 --> 00:23:37,840 Speaker 2: of the vehicle, it's not that different. The only real 467 00:23:37,880 --> 00:23:39,760 Speaker 2: hindrance, I think, for a lot of people is the 468 00:23:39,840 --> 00:23:43,200 Speaker 2: charging time. It's getting better, so there are more cars, 469 00:23:43,440 --> 00:23:45,320 Speaker 2: and you're talking about the Chinese cars.
There are some 470 00:23:45,480 --> 00:23:48,679 Speaker 2: that can almost go to full charge in twenty minutes. 471 00:23:48,920 --> 00:23:50,480 Speaker 2: But then again, I'm not going to stand at a 472 00:23:50,480 --> 00:23:52,240 Speaker 2: petrol bowser for twenty minutes. 473 00:23:51,920 --> 00:23:58,200 Speaker 1: Are you? No, no, I'm definitely not. No, I've joined 474 00:23:58,200 --> 00:24:03,280 Speaker 1: the hybrid clan like you. It's fucking ridiculous. My car 475 00:24:03,359 --> 00:24:08,000 Speaker 1: gets twelve hundred kilometers on a tank. Yeah, 476 00:24:08,359 --> 00:24:13,080 Speaker 1: that's so ridiculous. Like, it goes so far. Like, you 477 00:24:13,080 --> 00:24:17,080 Speaker 1: could drive from Melbourne to Sydney and halfway back. Yeah, 478 00:24:17,320 --> 00:24:18,120 Speaker 1: super efficient. 479 00:24:18,560 --> 00:24:21,119 Speaker 2: Yeah, now that's pretty amazing stuff. My car certainly 480 00:24:21,119 --> 00:24:22,880 Speaker 2: hasn't got that sort of range, but it's pretty good. 481 00:24:23,040 --> 00:24:26,960 Speaker 2: Seven hundred kilometers is more than most electric cars out 482 00:24:27,000 --> 00:24:29,480 Speaker 2: on the road. It's probably, you know, in the ninety 483 00:24:29,480 --> 00:24:32,160 Speaker 2: fifth percentile. There's no chance that an electric car at 484 00:24:32,160 --> 00:24:34,080 Speaker 2: this stage is going to have that 485 00:24:34,200 --> 00:24:36,840 Speaker 2: sort of range. And as you say, if I wanted 486 00:24:36,840 --> 00:24:39,960 Speaker 2: to drive to Adelaide, no problem, and if I want 487 00:24:40,000 --> 00:24:42,160 Speaker 2: to stop, you know, I can fill up the tank. 488 00:24:42,200 --> 00:24:43,280 Speaker 2: And how long does it take to fill up 489 00:24:43,320 --> 00:24:43,640 Speaker 2: the tank?
490 00:24:43,880 --> 00:24:45,480 Speaker 3: You know, I don't know, a few minutes, and you're 491 00:24:45,520 --> 00:24:46,840 Speaker 3: off and running again, so to speak. 492 00:24:47,280 --> 00:24:50,200 Speaker 1: I reckon right now, and this is just an opinion, I reckon 493 00:24:50,280 --> 00:24:53,520 Speaker 1: hybrid is the, I think that's the best option at 494 00:24:53,520 --> 00:24:56,240 Speaker 1: the moment. I think moving forward that will probably change. 495 00:24:56,280 --> 00:25:03,760 Speaker 1: But speaking of cars, tell me about the self-fixing 496 00:25:04,000 --> 00:25:07,920 Speaker 1: pothole mechanism. Or, I can't see it, but I read 497 00:25:07,960 --> 00:25:08,439 Speaker 1: it before. 498 00:25:09,600 --> 00:25:10,440 Speaker 3: How cool is this? 499 00:25:10,440 --> 00:25:13,320 Speaker 1: This is something recent. There we go, self-healing roads. 500 00:25:13,800 --> 00:25:16,800 Speaker 2: I know, I started reading into this and it kind 501 00:25:16,800 --> 00:25:21,160 Speaker 2: of is almost creepy at the same time. So in Swansea, 502 00:25:21,280 --> 00:25:24,360 Speaker 2: right, at the university there, what they're trying to do 503 00:25:24,720 --> 00:25:29,359 Speaker 2: is embed tiny plant spores mixed into the bitumen, 504 00:25:29,800 --> 00:25:32,720 Speaker 2: and they say it'll extend the lifespan of bitumen 505 00:25:32,760 --> 00:25:33,399 Speaker 2: by thirty percent. 506 00:25:33,400 --> 00:25:35,800 Speaker 3: Because what happens is you get cracks in the. 507 00:25:35,800 --> 00:25:38,000 Speaker 2: Road and then they fill with water, and when the 508 00:25:38,040 --> 00:25:41,120 Speaker 2: water gets into the cracks, that causes pockets of water. 509 00:25:41,160 --> 00:25:42,520 Speaker 3: And that's what breaks down the bitumen.
510 00:25:42,800 --> 00:25:46,160 Speaker 2: But what they're saying is, with the plant spores, once 511 00:25:46,200 --> 00:25:50,280 Speaker 2: a crack appears, the spores then enlarge and fill up 512 00:25:50,359 --> 00:25:53,280 Speaker 2: the hole, so you don't get water into the potholes. 513 00:25:53,560 --> 00:25:56,439 Speaker 2: So it means that you'll get fewer potholes, so in 514 00:25:56,480 --> 00:25:57,760 Speaker 2: a way, it is self-healing. 515 00:25:57,880 --> 00:25:59,920 Speaker 3: It's really cool tech, isn't it? 516 00:26:00,160 --> 00:26:01,959 Speaker 1: And smart? 517 00:26:02,760 --> 00:26:06,400 Speaker 3: Yeah, isn't that great? And I thought, how interesting is that? 518 00:26:06,800 --> 00:26:10,080 Speaker 2: So, you know, using that biotech. And the other thing too, 519 00:26:10,119 --> 00:26:12,719 Speaker 2: of course, because bitumen is a petrochemical, isn't it? 520 00:26:13,400 --> 00:26:13,600 Speaker 1: Yeah? 521 00:26:13,920 --> 00:26:16,840 Speaker 2: Yeah, whereas, you know, using plant spores. I mean, I 522 00:26:16,880 --> 00:26:19,840 Speaker 2: know it's not going to replace bitumen, but certainly I 523 00:26:19,840 --> 00:26:21,320 Speaker 2: thought it was an amazing concept. 524 00:26:21,480 --> 00:26:22,280 Speaker 3: I love how. 525 00:26:22,160 --> 00:26:24,560 Speaker 2: People come up with these ideas. Like, in 526 00:26:25,280 --> 00:26:29,080 Speaker 2: your wildest dreams, would you even have conceived of that, 527 00:26:29,280 --> 00:26:32,080 Speaker 2: of a way to extend the life of bitumen by 528 00:26:32,119 --> 00:26:33,480 Speaker 2: putting spores in it? 529 00:26:33,720 --> 00:26:36,320 Speaker 1: I can barely tie my fucking shoes, so don't, I mean, 530 00:26:36,400 --> 00:26:41,960 Speaker 1: don't look to me for anything innovative.
So the thing 531 00:26:42,080 --> 00:26:44,480 Speaker 1: I was going to say, the term on everybody's lips, 532 00:26:44,560 --> 00:26:46,679 Speaker 1: but that's not true. But the thing that's taken a 533 00:26:46,680 --> 00:26:50,680 Speaker 1: lot of people's attention or curiosity over the last few 534 00:26:50,720 --> 00:26:54,119 Speaker 1: weeks is a thing called DeepSeek. I know what 535 00:26:54,200 --> 00:26:56,679 Speaker 1: it is. Maybe some of our listeners don't know what 536 00:26:56,800 --> 00:26:59,640 Speaker 1: it is, but what's the story around that? Firstly, tell 537 00:26:59,640 --> 00:27:01,679 Speaker 1: our listeners what it is and what's going on 538 00:27:01,720 --> 00:27:01,960 Speaker 1: with that. 539 00:27:02,720 --> 00:27:06,280 Speaker 2: Wow, this is so controversial. So DeepSeek, I 540 00:27:06,359 --> 00:27:08,800 Speaker 2: think it came out in about November last year, and 541 00:27:09,040 --> 00:27:10,760 Speaker 2: it kind of sat around 542 00:27:10,840 --> 00:27:14,720 Speaker 2: for a little bit. It's a Chinese AI, the equivalent 543 00:27:14,800 --> 00:27:18,600 Speaker 2: of ChatGPT. Now, what's caused the biggest stir is 544 00:27:19,119 --> 00:27:23,320 Speaker 2: the amount of money they claim it took to make DeepSeek 545 00:27:23,760 --> 00:27:25,960 Speaker 2: was actually a lot less. I mean, we're talking a 546 00:27:26,040 --> 00:27:29,320 Speaker 2: fraction of a fraction of the cost, because at the moment, 547 00:27:29,760 --> 00:27:33,560 Speaker 2: to make an AI, you need to use Nvidia chips. Now, 548 00:27:33,560 --> 00:27:37,360 Speaker 2: Nvidia is the company that manufactures the best quality 549 00:27:37,760 --> 00:27:41,600 Speaker 2: microchips in the world, and they are the most 550 00:27:41,760 --> 00:27:44,359 Speaker 2: valuable company in the world in terms of how much 551 00:27:44,640 --> 00:27:48,240 Speaker 2: the company is worth.
So then what 552 00:27:48,400 --> 00:27:51,679 Speaker 2: the American government was doing was restricting the sale of 553 00:27:51,720 --> 00:27:55,480 Speaker 2: Nvidia chips to China, to prevent 554 00:27:55,560 --> 00:28:00,239 Speaker 2: Chinese companies from developing their own AI. And this 555 00:28:00,280 --> 00:28:02,320 Speaker 2: is kind of cool in a little way. The guy 556 00:28:02,320 --> 00:28:06,719 Speaker 2: who's behind this new AI, the Chinese AI, actually studied 557 00:28:06,720 --> 00:28:09,280 Speaker 2: in Australia, so there's a little kind of local link 558 00:28:09,320 --> 00:28:11,520 Speaker 2: for us as well. I think he studied in Melbourne. Yeah, 559 00:28:11,920 --> 00:28:18,000 Speaker 2: so effectively, this amazing DeepSeek comes out. It's cheaper, 560 00:28:18,240 --> 00:28:21,480 Speaker 2: and they've used old chips, so they haven't gone for 561 00:28:21,560 --> 00:28:23,879 Speaker 2: the new stuff. They haven't been able to purchase it 562 00:28:23,960 --> 00:28:26,400 Speaker 2: or get access to it. So they've managed to do 563 00:28:26,880 --> 00:28:30,439 Speaker 2: what no one else could possibly have envisaged they could do. 564 00:28:30,560 --> 00:28:32,960 Speaker 2: So everyone said, you can't do it, you can't build 565 00:28:32,960 --> 00:28:37,120 Speaker 2: an AI platform that's powerful using old chips, and they 566 00:28:37,160 --> 00:28:39,239 Speaker 2: did it. And this is what they're claiming that they 567 00:28:39,240 --> 00:28:41,880 Speaker 2: were able to do.
And so they've released DeepSeek, 568 00:28:42,040 --> 00:28:44,760 Speaker 2: and I think what made the momentum, and suddenly it 569 00:28:44,840 --> 00:28:48,640 Speaker 2: hit the media, is it peaked in the downloads for phones, 570 00:28:48,680 --> 00:28:52,400 Speaker 2: so in the app stores, suddenly. Because it was 571 00:28:52,560 --> 00:28:54,920 Speaker 2: November last year or so that it got launched, and 572 00:28:54,960 --> 00:28:57,200 Speaker 2: no one thought about it, no one said anything, and 573 00:28:57,200 --> 00:28:59,120 Speaker 2: then all of a sudden, it's just peaking with all 574 00:28:59,120 --> 00:29:02,640 Speaker 2: these downloads. But on the end of this, the company 575 00:29:02,640 --> 00:29:08,080 Speaker 2: Nvidia, their stock plummeted once this came out, because really, oh, 576 00:29:08,200 --> 00:29:11,080 Speaker 2: I mean, it wiped billions of dollars off their market value. 577 00:29:11,160 --> 00:29:14,560 Speaker 2: I mean, it's since climbed its way back. And now 578 00:29:14,640 --> 00:29:18,440 Speaker 2: there's kind of these protectionist methods. The Australian government 579 00:29:18,480 --> 00:29:21,520 Speaker 2: came out this week and said no one working in 580 00:29:21,600 --> 00:29:24,680 Speaker 2: government is allowed to have DeepSeek on their phones, 581 00:29:24,720 --> 00:29:28,160 Speaker 2: on government phones, on government computers. Other countries are doing 582 00:29:28,200 --> 00:29:31,200 Speaker 2: the same thing, because what they're suggesting is, because it's 583 00:29:31,200 --> 00:29:34,959 Speaker 2: a Chinese company, it's also a backdoor for all the 584 00:29:35,000 --> 00:29:38,800 Speaker 2: information on your device to go back to Beijing. There's 585 00:29:38,840 --> 00:29:42,160 Speaker 2: this real paranoia, and look, I don't know, I can't 586 00:29:42,160 --> 00:29:44,200 Speaker 2: say yes or no.
I'm not a tech person in 587 00:29:44,280 --> 00:29:47,400 Speaker 2: terms of the back end technology, but there's a real 588 00:29:47,560 --> 00:29:52,280 Speaker 2: fear with the Chinese tech and Chinese AI that some 589 00:29:52,360 --> 00:29:55,200 Speaker 2: of this is actually going to go back. So the 590 00:29:55,200 --> 00:29:57,560 Speaker 2: information you're putting in there, the device that you're using, 591 00:29:57,600 --> 00:30:01,640 Speaker 2: potentially may have, and this isn't proven, but it may 592 00:30:01,800 --> 00:30:05,760 Speaker 2: have a backdoor that's sending your information to Beijing. Because 593 00:30:06,120 --> 00:30:10,520 Speaker 2: we know that the regime in China has quite rigid 594 00:30:10,600 --> 00:30:16,120 Speaker 2: constraints on tech companies, because the oversight is very different 595 00:30:16,280 --> 00:30:19,960 Speaker 2: in China, the landscape there, the monitoring, the requirements, so 596 00:30:20,320 --> 00:30:23,080 Speaker 2: we really don't know what goes on. China is a 597 00:30:23,280 --> 00:30:27,720 Speaker 2: very fascinating country. You know, there's so much interest 598 00:30:27,800 --> 00:30:30,920 Speaker 2: around China, and as you know with my Tai Chi, 599 00:30:31,000 --> 00:30:33,680 Speaker 2: I love the culture and I love that side of it.
600 00:30:34,080 --> 00:30:36,360 Speaker 2: But there's still, you know, when you think about the 601 00:30:36,360 --> 00:30:40,360 Speaker 2: regime that's running the country and the oversight that's there, 602 00:30:40,840 --> 00:30:43,600 Speaker 2: it raises all the questions, and it begs the question: 603 00:30:44,000 --> 00:30:47,040 Speaker 2: what happens with these products that are launched by supposedly 604 00:30:47,120 --> 00:30:51,120 Speaker 2: independent companies, and how much oversight is there by Beijing, 605 00:30:51,160 --> 00:30:53,880 Speaker 2: and how many requirements are there for that data to 606 00:30:53,920 --> 00:30:55,320 Speaker 2: be given back to Beijing? 607 00:30:56,120 --> 00:30:58,760 Speaker 1: Well, I'm not nervous at all now. Thanks for that. 608 00:31:00,520 --> 00:31:02,680 Speaker 1: I'll tell you, there's quite a bit of AI stuff. 609 00:31:02,760 --> 00:31:05,200 Speaker 2: Can I say one thing now before you 610 00:31:05,240 --> 00:31:09,280 Speaker 2: do, about DeepSeek? This is the creepiest and 611 00:31:09,360 --> 00:31:11,760 Speaker 2: weirdest little article I read a couple of weeks ago, 612 00:31:11,800 --> 00:31:13,640 Speaker 2: so I'm just kind of rejigging it in my head. But 613 00:31:14,400 --> 00:31:17,600 Speaker 2: what they found, someone was doing some searches and they 614 00:31:17,720 --> 00:31:21,920 Speaker 2: found that what DeepSeek was doing was merging English 615 00:31:22,000 --> 00:31:25,440 Speaker 2: and Chinese. So I don't know if you've known people 616 00:31:25,480 --> 00:31:29,240 Speaker 2: who are multilingual, right.
So in fact, 617 00:31:29,240 --> 00:31:31,680 Speaker 2: actually, you met some friends of mine the other day 618 00:31:31,680 --> 00:31:34,560 Speaker 2: at a cafe, and there's Chris, who appeared on the 619 00:31:34,600 --> 00:31:37,280 Speaker 2: show, who, when he was on work experience, I made 620 00:31:37,360 --> 00:31:39,200 Speaker 2: him do all the work and the research and we 621 00:31:39,280 --> 00:31:44,560 Speaker 2: realized I was a fraud. And yeah, anyway, so an exploiter 622 00:31:44,800 --> 00:31:49,840 Speaker 2: of, you know, young labor, not unlike China, 623 00:31:49,960 --> 00:31:51,840 Speaker 2: let's say. Well. 624 00:31:51,680 --> 00:31:54,720 Speaker 1: That's your bloody, that's your oriental influence. 625 00:31:55,960 --> 00:31:57,040 Speaker 3: Oriental influence. 626 00:31:57,080 --> 00:32:00,040 Speaker 1: Okay, the Tai Chi influence, I 627 00:32:00,040 --> 00:32:00,520 Speaker 1: should say. 628 00:32:02,800 --> 00:32:05,600 Speaker 2: Anyway, so when you meet people. What I loved is, 629 00:32:05,600 --> 00:32:08,080 Speaker 2: when Chris was growing up, his mother's German and 630 00:32:08,120 --> 00:32:11,080 Speaker 2: his father's Australian. And what they did when they 631 00:32:11,080 --> 00:32:13,520 Speaker 2: brought up the kids was they got them to learn 632 00:32:13,560 --> 00:32:14,240 Speaker 2: both languages. 633 00:32:14,280 --> 00:32:16,520 Speaker 3: So people who are multilingual. 634 00:32:16,120 --> 00:32:19,400 Speaker 2: Sometimes, when you know, when we think, we think in English. 635 00:32:19,480 --> 00:32:22,479 Speaker 2: And what's always fascinated me is when people who are 636 00:32:22,520 --> 00:32:25,400 Speaker 2: able to really speak two languages, or more than one language, 637 00:32:25,480 --> 00:32:27,960 Speaker 2: are able to switch the way that they think 638 00:32:28,240 --> 00:32:31,000 Speaker 2: and be able to think in a different language.
But 639 00:32:31,120 --> 00:32:34,640 Speaker 2: what I find even more fascinating is I've had conversations, 640 00:32:34,720 --> 00:32:37,160 Speaker 2: or I've seen Chris talking, where he jumps from one 641 00:32:37,240 --> 00:32:41,280 Speaker 2: language to another. So some languages are more descriptive than others, 642 00:32:41,560 --> 00:32:44,080 Speaker 2: so sometimes you can answer. So, you know, the German 643 00:32:44,120 --> 00:32:48,160 Speaker 2: word Schadenfreude, which I love because it basically means happiness at 644 00:32:48,160 --> 00:32:51,400 Speaker 2: the misfortune of others. It's a very funny word. But 645 00:32:52,440 --> 00:32:55,520 Speaker 2: the thing is, there's no equivalent in English. So if 646 00:32:55,560 --> 00:32:57,960 Speaker 2: you have a Schadenfreude moment, 647 00:32:58,040 --> 00:33:00,600 Speaker 2: a moment where someone you don't like 648 00:33:00,680 --> 00:33:02,640 Speaker 2: trips over in the gutter or something like that, and 649 00:33:02,680 --> 00:33:04,080 Speaker 2: you look over and you have a bit of a chuckle. 650 00:33:04,280 --> 00:33:06,480 Speaker 2: You don't cause them to fall over, but you laugh 651 00:33:06,480 --> 00:33:08,560 Speaker 2: at them because you don't really like them that much, 652 00:33:08,560 --> 00:33:11,440 Speaker 2: and something happening to them was bad karma, right? So 653 00:33:11,520 --> 00:33:14,240 Speaker 2: that's Schadenfreude. So there's no English word. But, and 654 00:33:14,400 --> 00:33:17,760 Speaker 2: I know I'm telling a very long story here, apologies. 655 00:33:18,080 --> 00:33:21,440 Speaker 2: So what happened was DeepSeek. You shut 656 00:33:21,480 --> 00:33:24,640 Speaker 2: down my New Zealand story. I didn't shut it down. 657 00:33:24,680 --> 00:33:30,480 Speaker 2: I just put it on pause. So DeepSeek started 658 00:33:30,520 --> 00:33:33,760 Speaker 2: thinking in Chinese and English.
And there's a real fear 659 00:33:33,840 --> 00:33:37,120 Speaker 2: now, with people who look into this, that maybe AI 660 00:33:37,880 --> 00:33:42,320 Speaker 2: can develop its own language. If AI develops its own language, 661 00:33:42,360 --> 00:33:45,720 Speaker 2: which means it could think more quickly, then we won't 662 00:33:45,800 --> 00:33:49,880 Speaker 2: understand it. So when artificial intelligence is able to create 663 00:33:49,920 --> 00:33:52,400 Speaker 2: its own language, and there's a potential there, 664 00:33:52,400 --> 00:33:55,280 Speaker 2: and that's what they saw with DeepSeek, then potentially, 665 00:33:55,400 --> 00:33:58,360 Speaker 2: if it's creating its own language, we're locked out of 666 00:33:58,400 --> 00:34:01,680 Speaker 2: the equation. We no longer can understand the way it's thinking. 667 00:34:02,600 --> 00:34:06,600 Speaker 1: Of course. Fuck yes, one hundred percent. Like, 668 00:34:07,360 --> 00:34:10,840 Speaker 1: that shit is about twelve months away from being sentient, 669 00:34:11,280 --> 00:34:16,200 Speaker 1: from being conscious, from having, like, people think I'm being funny.
No, 670 00:34:16,960 --> 00:34:21,160 Speaker 1: like, AI will have its own version of consciousness and awareness, 671 00:34:21,920 --> 00:34:25,319 Speaker 1: and it will independently think, in inverted commas, and make 672 00:34:25,400 --> 00:34:31,080 Speaker 1: decisions. Because, I mean, everyone who knows anything about AI, 673 00:34:31,200 --> 00:34:33,279 Speaker 1: as in the people that are the gurus, they will 674 00:34:33,280 --> 00:34:36,360 Speaker 1: tell you that right now it's as smart as humans 675 00:34:36,440 --> 00:34:39,440 Speaker 1: or smarter. Obviously it's got a lot to draw on, 676 00:34:39,880 --> 00:34:42,760 Speaker 1: but in the not too distant future, 677 00:34:43,000 --> 00:34:46,799 Speaker 1: like, we will be relegated to the second smartest, in 678 00:34:46,800 --> 00:34:49,800 Speaker 1: inverted commas, species. And I know it's not a species, 679 00:34:49,840 --> 00:34:52,920 Speaker 1: but it is its own form of intelligence. And this 680 00:34:53,000 --> 00:34:59,640 Speaker 1: presents a whole raft of interesting ethical, moral, like, conversations 681 00:34:59,680 --> 00:35:03,480 Speaker 1: around what that impact will be. Because I think 682 00:35:03,600 --> 00:35:07,360 Speaker 1: most of us think that the technology for the most 683 00:35:07,400 --> 00:35:10,719 Speaker 1: part is a positive. I don't know about that. I 684 00:35:10,719 --> 00:35:13,239 Speaker 1: think definitely there are positives, but I think there's shit 685 00:35:13,320 --> 00:35:16,960 Speaker 1: that's going to happen that maybe one day mankind, 686 00:35:17,560 --> 00:35:22,240 Speaker 1: sorry, humankind, will go, what the fuck? What the fuck 687 00:35:22,280 --> 00:35:23,040 Speaker 1: were we thinking?
688 00:35:23,480 --> 00:35:26,080 Speaker 2: You know, the thing I can see the application of 689 00:35:26,080 --> 00:35:29,319 Speaker 2: AI is amazing and the ability to be able to 690 00:35:29,400 --> 00:35:31,680 Speaker 2: do so many great things. And you said this earlier 691 00:35:31,680 --> 00:35:34,160 Speaker 2: in the show. You know, if you know you can 692 00:35:34,239 --> 00:35:36,520 Speaker 2: use a knife to cook with and then you can 693 00:35:36,600 --> 00:35:39,399 Speaker 2: use a knife to cause harm and exactly the same way. 694 00:35:39,400 --> 00:35:42,640 Speaker 3: That's how technology is being employed. You know, when it makes. 695 00:35:42,400 --> 00:35:45,120 Speaker 2: Our roads safer, when we can kind of detect if 696 00:35:45,160 --> 00:35:47,360 Speaker 2: a driver is about to fall asleep at the wheel, 697 00:35:47,520 --> 00:35:49,920 Speaker 2: you know, that could be done using an AI algorithm 698 00:35:50,080 --> 00:35:53,160 Speaker 2: with a camera pointing. And this is available right now. 699 00:35:53,520 --> 00:35:54,880 Speaker 3: So I guess that's the concern. 700 00:35:55,080 --> 00:35:57,080 Speaker 2: You know, I don't know about you, but you know 701 00:35:57,160 --> 00:35:59,120 Speaker 2: you were you were referring to the concept of a 702 00:35:59,239 --> 00:36:04,920 Speaker 2: mind and if AI becomes sentient, then how do you 703 00:36:04,960 --> 00:36:08,160 Speaker 2: turn it off? Because is there an ethical aspect to it? 704 00:36:08,160 --> 00:36:11,120 Speaker 2: If you've got an entity that then is self aware, 705 00:36:11,719 --> 00:36:14,640 Speaker 2: then there's an obligation on us not to turn it off, 706 00:36:14,680 --> 00:36:18,279 Speaker 2: isn't there? Because you're effectively killing an AI or killing 707 00:36:18,280 --> 00:36:18,840 Speaker 2: an entity? 708 00:36:18,960 --> 00:36:19,399 Speaker 3: I don't know. 709 00:36:21,440 --> 00:36:25,160 Speaker 1: I mean, yeah, there are so many.
I don't think 710 00:36:25,200 --> 00:36:28,239 Speaker 1: there's an easy answer to that, but it's fucking fascinating conversation. 711 00:36:28,560 --> 00:36:34,920 Speaker 1: Is it alive? Well, does it have a nervous system? 712 00:36:35,000 --> 00:36:38,920 Speaker 1: Can it feel physical pain? No, you know, can it 713 00:36:38,960 --> 00:36:43,080 Speaker 1: feel emotional pain? Well, you wouldn't think it has emotions. 714 00:36:43,120 --> 00:36:46,960 Speaker 1: Could it simulate emotions, yes, I wouldn't think it can 715 00:36:47,040 --> 00:36:51,200 Speaker 1: have emotions. It definitely can't feel physical pain. It doesn't 716 00:36:51,239 --> 00:36:54,400 Speaker 1: have you know, spiritual people would say, well it doesn't 717 00:36:54,480 --> 00:36:57,040 Speaker 1: have a soul whatever that might or might not mean. 718 00:36:58,080 --> 00:37:03,640 Speaker 1: And I just think, you know, sometimes sometimes we've got 719 00:37:03,680 --> 00:37:06,360 Speaker 1: to be a little bit selfish and go fuck the 720 00:37:06,440 --> 00:37:10,920 Speaker 1: feelings of AI. Like we're talking about eight billion people 721 00:37:10,960 --> 00:37:14,120 Speaker 1: on a planet that we need to you know, you 722 00:37:14,120 --> 00:37:17,560 Speaker 1: don't want to wake up one day, like humanity 723 00:37:17,600 --> 00:37:19,840 Speaker 1: doesn't want to wake up one day and the fucking 724 00:37:20,400 --> 00:37:23,000 Speaker 1: tech overlords are running the show, you know. 725 00:37:23,239 --> 00:37:25,360 Speaker 3: Terminator? Yeah, yeah. 726 00:37:25,239 --> 00:37:27,120 Speaker 1: I mean yeah, Hey. 727 00:37:27,520 --> 00:37:29,680 Speaker 2: What was the new Zealand story. What were we going 728 00:37:29,719 --> 00:37:32,440 Speaker 2: to say about New Zealand before? Then when I don 729 00:37:32,560 --> 00:37:35,279 Speaker 2: even starts, what do you mean I told you?
I 730 00:37:35,360 --> 00:37:37,360 Speaker 2: told you, you said, you've told it before. 731 00:37:37,560 --> 00:37:39,880 Speaker 3: Oh, that one. You're going to say something else and 732 00:37:39,960 --> 00:37:40,320 Speaker 3: I caught you. 733 00:37:40,400 --> 00:37:43,600 Speaker 1: Oh, I definitely wasn't. I saw something that I wanted 734 00:37:43,600 --> 00:37:48,560 Speaker 1: to ask you about cultivated meat before you tell us 735 00:37:48,560 --> 00:37:52,319 Speaker 1: about that. Would you, the vegan? Would you eat cultivated 736 00:37:52,360 --> 00:37:54,840 Speaker 1: meat, because nothing's suffering and there's no pain? 737 00:37:55,480 --> 00:37:58,479 Speaker 2: No, because I think in the time. So I went 738 00:37:58,880 --> 00:38:03,160 Speaker 2: vegetarian initially for ten years and then became vegan for 739 00:38:03,200 --> 00:38:06,520 Speaker 2: the last four years. And what motivated me was obviously 740 00:38:06,600 --> 00:38:11,359 Speaker 2: animal cruelty initially, but I think as an extension of that, 741 00:38:11,880 --> 00:38:14,279 Speaker 2: I've now felt a little bit of a revulsion to 742 00:38:14,400 --> 00:38:18,200 Speaker 2: eating meat because there's a thought process about it coming 743 00:38:18,200 --> 00:38:21,840 Speaker 2: from an animal. So the concept of eating an animal product, 744 00:38:21,880 --> 00:38:24,400 Speaker 2: actually I find there's a bit of a barrier to that. 745 00:38:24,600 --> 00:38:26,759 Speaker 2: I'm doing a vegan cooking class tonight. By the way, Craig, 746 00:38:26,760 --> 00:38:27,920 Speaker 2: I thought you might be interested if you want to 747 00:38:27,920 --> 00:38:28,480 Speaker 2: come down. 748 00:38:28,800 --> 00:38:30,600 Speaker 1: As in you're taking it, or your partner? 749 00:38:30,719 --> 00:38:33,960 Speaker 2: No, no, I'm terrible. I'm an awful vegan. I can't 750 00:38:34,080 --> 00:38:36,799 Speaker 2: cook. I'm a terrible cook.
Now I'm going to a class to 751 00:38:36,880 --> 00:38:38,120 Speaker 2: learn how to cook better. 752 00:38:38,800 --> 00:38:40,520 Speaker 1: Look, I'll do my best to be there, then. 753 00:38:41,120 --> 00:38:42,120 Speaker 3: Yeah, that's the thing. 754 00:38:42,600 --> 00:38:44,640 Speaker 1: I think my belly button might need a tending to. 755 00:38:44,719 --> 00:38:47,440 Speaker 1: But I'll get back to you. 756 00:38:47,440 --> 00:38:50,160 Speaker 2: You wouldn't entertain the idea of reducing your meat intake. 757 00:38:50,239 --> 00:38:51,960 Speaker 2: What's your take on meat intake? 758 00:38:52,239 --> 00:38:54,520 Speaker 1: You keep talking about it, I'm going to fucking increase it. 759 00:38:54,640 --> 00:38:59,320 Speaker 1: So steady on, tell us about, tell us about video 760 00:38:59,440 --> 00:39:02,360 Speaker 1: doorbells. What's there? I think I need one of 761 00:39:02,440 --> 00:39:04,400 Speaker 1: those video doorbells. 762 00:39:04,680 --> 00:39:06,040 Speaker 3: Yeah, I've got a Ring doorbell. 763 00:39:06,080 --> 00:39:08,080 Speaker 2: What I like about it is that now that I've 764 00:39:08,080 --> 00:39:10,120 Speaker 2: got my studio in the garage out the back of 765 00:39:10,160 --> 00:39:12,600 Speaker 2: the property, because I'm on a quarter acre block, which 766 00:39:12,600 --> 00:39:14,879 Speaker 2: doesn't sound like a lot, but it's a fairly large block, 767 00:39:15,080 --> 00:39:15,359 Speaker 2: and it. 768 00:39:15,360 --> 00:39:18,480 Speaker 1: In suburbia, that's a veritable farm. 769 00:39:19,360 --> 00:39:20,279 Speaker 3: Yeah, it's pretty. 770 00:39:20,320 --> 00:39:22,040 Speaker 2: And I was picking apples on the way here, by 771 00:39:22,080 --> 00:39:24,280 Speaker 2: the way, because my stone fruit aren't ready. 772 00:39:24,760 --> 00:39:28,840 Speaker 1: So is that a humble brag? 773 00:39:29,520 --> 00:39:30,160 Speaker 3: I think it is.
774 00:39:30,200 --> 00:39:32,200 Speaker 2: It's nice to have stuff that you can pick out 775 00:39:32,200 --> 00:39:34,120 Speaker 2: of your garden. I've got what else have I got 776 00:39:34,160 --> 00:39:36,680 Speaker 2: at the moment, I've got oregano. I've got lots of herbs. 777 00:39:36,719 --> 00:39:37,440 Speaker 2: That's always good. 778 00:39:37,480 --> 00:39:38,400 Speaker 3: Sorry, I'm digressing. 779 00:39:39,440 --> 00:39:44,560 Speaker 1: Isn't that that paper folding thing? That's origami, I'm just checking. 780 00:39:47,080 --> 00:39:50,160 Speaker 3: It's what you put in your vegan bolognese. 781 00:39:54,520 --> 00:39:55,600 Speaker 4: All right, as if you could cook? 782 00:39:55,800 --> 00:40:00,359 Speaker 3: Can I cook? Well, I have all these herbs. And I'm 783 00:40:00,360 --> 00:40:03,000 Speaker 3: a rudimentary cook. A nice way of putting it. 784 00:40:03,760 --> 00:40:06,960 Speaker 2: Look, we're all three single people. Where would you rate 785 00:40:07,000 --> 00:40:08,240 Speaker 2: yourself on the cooking scale. 786 00:40:08,280 --> 00:40:13,000 Speaker 1: Perhaps I can actually cook quite well, but I'm lazy. 787 00:40:13,560 --> 00:40:16,120 Speaker 1: I can cook quite well because I've lived by myself 788 00:40:16,120 --> 00:40:20,640 Speaker 1: for two hundred years. But the whole thing of getting 789 00:40:20,640 --> 00:40:22,640 Speaker 1: a whole lot of shit and chopping and preparing and 790 00:40:22,680 --> 00:40:28,239 Speaker 1: sauteing and above for me. Also, I'm fortunate that I 791 00:40:28,400 --> 00:40:33,399 Speaker 1: live, literally, within one kilometer of forty restaurants and. 792 00:40:33,960 --> 00:40:36,160 Speaker 2: From your place to the cafe across the road, there's 793 00:40:36,160 --> 00:40:39,000 Speaker 2: actually been a channel woven into the bitumen from you 794 00:40:39,120 --> 00:40:39,839 Speaker 2: just walking back? 795 00:40:40,040 --> 00:40:42,160 Speaker 1: Are you you haven't seen it?
I've got a flying 796 00:40:42,280 --> 00:40:46,560 Speaker 1: fox from the office. I just fucking zip line across 797 00:40:46,560 --> 00:40:47,960 Speaker 1: there like a fucking ninja. 798 00:40:48,840 --> 00:40:50,200 Speaker 3: That would not surprise me. 799 00:40:52,520 --> 00:40:55,160 Speaker 1: Tell me about it. Tell me about these video doorbells. 800 00:40:55,719 --> 00:40:58,560 Speaker 2: Okay, we finished on the we started talking about the meat. 801 00:40:59,080 --> 00:41:00,560 Speaker 2: We're talking about the meat. The doorbells. 802 00:41:00,600 --> 00:41:03,399 Speaker 1: We can do bar all right, we'll finish on them up. 803 00:41:03,600 --> 00:41:06,160 Speaker 1: So tell me what's going on with this new labeling 804 00:41:06,239 --> 00:41:10,200 Speaker 1: for cultivated meat. Sorry, sorry, it's once you started talking 805 00:41:10,200 --> 00:41:13,800 Speaker 1: about vegans myself in the face. 806 00:41:15,640 --> 00:41:16,680 Speaker 3: So the Swiss. 807 00:41:16,400 --> 00:41:19,960 Speaker 2: Organization, right, they have a thing called a V label. 808 00:41:20,440 --> 00:41:23,840 Speaker 2: It's a little leaf logo and basically it certifies whether 809 00:41:24,000 --> 00:41:27,600 Speaker 2: a product is vegan and vegetarian. But now they've 810 00:41:27,680 --> 00:41:31,160 Speaker 2: come up with the SA label, excuse me, for cultivated meat. 811 00:41:31,320 --> 00:41:33,840 Speaker 2: So the idea is you get you get donor cells 812 00:41:33,880 --> 00:41:37,359 Speaker 2: from a cow and you grow them, and you end 813 00:41:37,440 --> 00:41:41,959 Speaker 2: up with a steak that's been grown rather than cut 814 00:41:42,000 --> 00:41:44,040 Speaker 2: from a cow and you know, slaughtered the cow. You've 815 00:41:44,080 --> 00:41:49,120 Speaker 2: just grown it from these cultured these cultured cells.
The 816 00:41:49,200 --> 00:41:52,120 Speaker 2: thing is, it's even though it sounds like it's highly processed, 817 00:41:52,120 --> 00:41:54,759 Speaker 2: it isn't. You know, you're just doing exactly what the 818 00:41:54,800 --> 00:41:57,520 Speaker 2: body does. You're you're using donor cells that can then 819 00:41:57,560 --> 00:42:00,239 Speaker 2: replicate and then they grow into whatever it is, you know, 820 00:42:01,120 --> 00:42:04,400 Speaker 2: in this case a steak, So it's not overly processed 821 00:42:04,760 --> 00:42:08,240 Speaker 2: and the amount of water needed, the amount of input 822 00:42:08,320 --> 00:42:11,640 Speaker 2: needed to be able to grow the steak down the track. 823 00:42:11,960 --> 00:42:13,600 Speaker 2: They think it's going to be much better for the 824 00:42:13,719 --> 00:42:17,399 Speaker 2: environment using cultivated meat. Would you eat a cultivated meat 825 00:42:17,440 --> 00:42:19,680 Speaker 2: steak if it was available to you side by side 826 00:42:19,840 --> 00:42:20,800 Speaker 2: with another sirloin? 827 00:42:21,600 --> 00:42:24,560 Speaker 1: No, why, I don't want to. 828 00:42:24,920 --> 00:42:27,960 Speaker 2: But but if you knew it was better for the planet, 829 00:42:28,000 --> 00:42:31,080 Speaker 2: if you knew that an animal hadn't been slaughtered and 830 00:42:31,160 --> 00:42:34,200 Speaker 2: b it used a lot less water, a lot less resources, 831 00:42:34,360 --> 00:42:36,600 Speaker 2: it wasn't belching out cock. 832 00:42:36,680 --> 00:42:40,080 Speaker 1: I'm sorry, I'm not giving you the answer you want. 833 00:42:40,680 --> 00:42:42,880 Speaker 1: How dare I not agree with you? 834 00:42:42,920 --> 00:42:47,440 Speaker 3: No, Tiff? Would you? Would you try one? At least? 835 00:42:48,120 --> 00:42:48,200 Speaker 1: So? 836 00:42:48,360 --> 00:42:50,960 Speaker 3: Just it weirds me out that it's been grown.
837 00:42:51,480 --> 00:42:55,400 Speaker 4: Yeah, I just think you're interfering with everything. Something something 838 00:42:56,200 --> 00:42:57,240 Speaker 4: I reckon. 839 00:42:57,600 --> 00:43:02,279 Speaker 1: God or whomever built the cows for a reason. It 840 00:43:02,320 --> 00:43:04,759 Speaker 1: wasn't but it wasn't for them just to fucking die 841 00:43:04,800 --> 00:43:05,759 Speaker 1: of old age. 842 00:43:09,080 --> 00:43:11,080 Speaker 2: Oh man, Yeah, let's get onto it. What was it 843 00:43:11,120 --> 00:43:13,239 Speaker 2: at the topic of the let's go to doorbells? 844 00:43:13,560 --> 00:43:14,799 Speaker 3: Someone, I've got a Ring. 845 00:43:15,160 --> 00:43:17,880 Speaker 1: I'm going to build this thing that tastes fucking delicious, 846 00:43:18,239 --> 00:43:20,480 Speaker 1: but I don't want anyone to eat it. It's just 847 00:43:20,520 --> 00:43:23,640 Speaker 1: going to walk around eating grass for twenty years then drop dead. 848 00:43:24,239 --> 00:43:25,080 Speaker 3: No answers. 849 00:43:26,200 --> 00:43:28,239 Speaker 1: Come on, bro, you're not smarter than God. 850 00:43:28,360 --> 00:43:31,040 Speaker 2: Come on, a cauliflower steak is pretty good with a 851 00:43:31,120 --> 00:43:31,640 Speaker 2: nice sauce. 852 00:43:32,320 --> 00:43:36,440 Speaker 1: Ok Now, come on, you're fighting a losing battle. 853 00:43:36,480 --> 00:43:39,080 Speaker 2: Get off the doorbells and you're crumbing. It's pretty good 854 00:43:39,120 --> 00:43:40,640 Speaker 2: too if you put crumbs on as well. 855 00:43:41,040 --> 00:43:43,080 Speaker 1: Sorry, crumb you? I'll crumb you. 856 00:43:43,160 --> 00:43:45,920 Speaker 2: Hey, yeah, pretty the doorbells. Look you were saying you're 857 00:43:45,960 --> 00:43:48,759 Speaker 2: thinking of getting one.
And one of the challenges with 858 00:43:48,920 --> 00:43:52,080 Speaker 2: any sort of surveillance technology is a lot of the 859 00:43:52,120 --> 00:43:54,880 Speaker 2: companies that sell them also require. 860 00:43:54,480 --> 00:43:57,600 Speaker 3: You to take out a subscription, but you don't have to. 861 00:43:57,960 --> 00:44:00,680 Speaker 2: So I've got the Ring doorbells, And for me, what 862 00:44:00,719 --> 00:44:02,640 Speaker 2: it means is if someone comes to my front door 863 00:44:02,680 --> 00:44:05,400 Speaker 2: and I'm sitting in the studio space, I get a 864 00:44:05,440 --> 00:44:07,720 Speaker 2: pop up on my phone. I can activate the camera, 865 00:44:07,920 --> 00:44:09,759 Speaker 2: but I can talk to the person as well, so 866 00:44:09,800 --> 00:44:12,200 Speaker 2: you can have a conversation with them. So Tiff, if someone 867 00:44:12,280 --> 00:44:14,880 Speaker 2: came to your house right now to drop off a delivery, 868 00:44:15,120 --> 00:44:17,400 Speaker 2: there's a lot of doorbells now that not only have 869 00:44:17,440 --> 00:44:19,520 Speaker 2: a front facing camera, but they have a camera that 870 00:44:19,560 --> 00:44:21,920 Speaker 2: points down so you can you can see the package 871 00:44:21,920 --> 00:44:24,239 Speaker 2: that's been dropped off, which is kind of good. Yeah, 872 00:44:24,280 --> 00:44:26,000 Speaker 2: so there's lots of new ones out on the market. 873 00:44:26,080 --> 00:44:29,120 Speaker 2: You don't necessarily have to buy a subscription to use them. 874 00:44:29,320 --> 00:44:31,520 Speaker 2: The other thing that my cameras do, which I really love, 875 00:44:31,680 --> 00:44:33,520 Speaker 2: is they have a light in them where you can 876 00:44:33,560 --> 00:44:36,080 Speaker 2: turn the light on manually, so it goes on for 877 00:44:36,120 --> 00:44:39,160 Speaker 2: thirty seconds. But it also has a siren.
So if 878 00:44:39,200 --> 00:44:42,359 Speaker 2: you're in Tazzy, some dodgy person comes to the door, 879 00:44:42,400 --> 00:44:44,319 Speaker 2: the camera detects them and you see it. You can 880 00:44:44,360 --> 00:44:46,600 Speaker 2: flick the siren on and scare the bejesus out 881 00:44:46,640 --> 00:44:47,439 Speaker 2: of them. 882 00:44:47,880 --> 00:44:51,360 Speaker 1: That's good. Yeah, I wouldn't mind. I would like just 883 00:44:51,400 --> 00:44:55,319 Speaker 1: to scare my friends. I mean, so forget the security. 884 00:44:55,400 --> 00:44:58,160 Speaker 1: How much fun would it be to bloody just terrify? 885 00:44:58,480 --> 00:45:00,799 Speaker 1: Can we jump back? Just because I actually want to 886 00:45:00,840 --> 00:45:02,600 Speaker 1: know what GhostGPT is. 887 00:45:03,520 --> 00:45:06,319 Speaker 2: Oh okay, so you're talking about all the AI that's 888 00:45:06,320 --> 00:45:09,680 Speaker 2: out at the moment. GhostGPT is for criminals. So 889 00:45:09,960 --> 00:45:14,839 Speaker 2: guess what, it's out there. It's been released, and 890 00:45:14,880 --> 00:45:17,680 Speaker 2: the problem is that there are no constraints. See if 891 00:45:17,719 --> 00:45:19,880 Speaker 2: you go to ChatGPT and you try to do 892 00:45:19,920 --> 00:45:25,319 Speaker 2: something naughty, write some code to hack Craig's computer, it's 893 00:45:25,360 --> 00:45:28,000 Speaker 2: not going to do it though. There are constraints there. 894 00:45:28,040 --> 00:45:30,439 Speaker 2: But with this GhostGPT, it's kind of a dark 895 00:45:30,480 --> 00:45:33,200 Speaker 2: web thing where it doesn't have to. The brakes have 896 00:45:33,280 --> 00:45:36,200 Speaker 2: been turned off. You know, you can you can fly 897 00:45:36,280 --> 00:45:37,880 Speaker 2: down that road as fast as you want. You're not 898 00:45:37,920 --> 00:45:40,040 Speaker 2: putting on the brakes.
It's it's just going, you know, 899 00:45:40,080 --> 00:45:43,640 Speaker 2: full speed ahead. So you know, cyber crime is out there, 900 00:45:43,680 --> 00:45:46,719 Speaker 2: and so yeah, so the criminals are now using it, 901 00:45:46,800 --> 00:45:48,800 Speaker 2: and the hackers are now using it. And the problem 902 00:45:48,840 --> 00:45:52,319 Speaker 2: with this is in the past, when you wanted to 903 00:45:52,360 --> 00:45:54,520 Speaker 2: get something hacked, you'd have to have quite a bit 904 00:45:54,560 --> 00:45:56,360 Speaker 2: of knowledge to be able to hack into a system. 905 00:45:56,600 --> 00:45:59,560 Speaker 2: But using AI to do programming because I use Chat 906 00:45:59,560 --> 00:46:01,640 Speaker 2: GPT occasionally to alter code. 907 00:46:01,760 --> 00:46:02,680 Speaker 3: I'm not a coder. 908 00:46:02,920 --> 00:46:05,040 Speaker 2: I can read a little bit of HTML code and 909 00:46:05,560 --> 00:46:08,680 Speaker 2: that sort of thing, but I can paste the code 910 00:46:08,760 --> 00:46:11,319 Speaker 2: into ChatGPT and say now I want it to do this, 911 00:46:11,680 --> 00:46:14,840 Speaker 2: and it will alter the code accordingly, which is really handy. 912 00:46:14,920 --> 00:46:18,120 Speaker 2: So if you're not a real heavy intensive coder, you 913 00:46:18,120 --> 00:46:21,120 Speaker 2: can actually get help from AI. But the problem is 914 00:46:21,480 --> 00:46:24,239 Speaker 2: if you've got the all the constraints turned off, and 915 00:46:24,280 --> 00:46:28,919 Speaker 2: this is what this GhostGPT is, you know, it's yeah, 916 00:46:29,040 --> 00:46:32,800 Speaker 2: it's not a good thing. But again it's the knife analogy. 917 00:46:32,920 --> 00:46:35,319 Speaker 2: I use it to make something, use it to do 918 00:46:36,239 --> 00:46:36,960 Speaker 2: you know, it's more. 919 00:46:37,040 --> 00:46:40,120 Speaker 1: good, good or evil.
Hey, last one interests me because 920 00:46:40,320 --> 00:46:43,800 Speaker 1: I've just started because Tiff told me because she's bossy, 921 00:46:43,840 --> 00:46:46,880 Speaker 1: and also Melissa, who's more bossy, that I need to 922 00:46:46,880 --> 00:46:51,000 Speaker 1: put up videos or that's you know, real south and 923 00:46:51,040 --> 00:46:53,719 Speaker 1: putting up few reels. But I can't edit for shit. 924 00:46:54,400 --> 00:46:56,200 Speaker 1: So even the other day I wanted something, I had 925 00:46:56,200 --> 00:46:58,160 Speaker 1: to ring Tiff and go can you turn this into 926 00:46:58,200 --> 00:47:01,480 Speaker 1: a thing? Which I hate doing. But so Instagram's 927 00:47:01,520 --> 00:47:05,520 Speaker 1: got a new video editing app. Is that within the 928 00:47:05,600 --> 00:47:09,839 Speaker 1: actual what is that? So it's built into it. So look, 929 00:47:09,880 --> 00:47:12,360 Speaker 1: this has been a response to, I guess the big 930 00:47:12,440 --> 00:47:15,960 Speaker 1: ban on TikTok in the US and that by extension 931 00:47:15,960 --> 00:47:18,839 Speaker 1: has happened to a few Australians as well. So when 932 00:47:19,200 --> 00:47:22,400 Speaker 1: Trump came in, he announced this ban on TikTok because 933 00:47:22,400 --> 00:47:25,760 Speaker 1: TikTok is owned by China and again the whole China debate, 934 00:47:25,840 --> 00:47:29,520 Speaker 1: what's TikTok doing with the information? So there was a 935 00:47:29,560 --> 00:47:32,840 Speaker 1: reprieve for a month, so it was banned for like 936 00:47:33,239 --> 00:47:36,480 Speaker 1: nine hours or twelve hours and then they reactivated it 937 00:47:36,520 --> 00:47:38,239 Speaker 1: when Trump said, Okay, we're going to give you a 938 00:47:38,280 --> 00:47:40,640 Speaker 1: month reprieve, but then we're going to shut it down again.
939 00:47:40,960 --> 00:47:44,600 Speaker 1: And some Australians who had American based accounts also got 940 00:47:44,640 --> 00:47:47,160 Speaker 1: locked out of TikTok. And one of the things that 941 00:47:47,200 --> 00:47:49,600 Speaker 1: TikTok is able to do is it allows video editing 942 00:47:49,600 --> 00:47:52,359 Speaker 1: as well, built into the software. So Instagram's jumped onto 943 00:47:52,360 --> 00:47:54,719 Speaker 1: the bandwagon, and I think this was a reaction to that, 944 00:47:55,320 --> 00:47:58,840 Speaker 1: and they're now offering new video editing features within the 945 00:47:58,840 --> 00:48:01,720 Speaker 1: app itself means that you can edit, you can trim, 946 00:48:01,800 --> 00:48:04,440 Speaker 1: you can add effects in the app without having to 947 00:48:04,440 --> 00:48:06,520 Speaker 1: go to an external program that you may not know 948 00:48:06,560 --> 00:48:07,120 Speaker 1: how to use. 949 00:48:07,160 --> 00:48:09,440 Speaker 2: So Crago you can now go in and make some 950 00:48:09,560 --> 00:48:11,120 Speaker 2: changes to your video reels. 951 00:48:11,239 --> 00:48:14,960 Speaker 1: I'm definitely going to have to jump into YouTube to 952 00:48:15,040 --> 00:48:16,239 Speaker 1: figure out how to do that. 953 00:48:16,160 --> 00:48:20,160 Speaker 3: On I tiff, No Instagram, mate, we're talking about Instagram. 954 00:48:20,160 --> 00:48:23,600 Speaker 1: Not no no, no, I know that, but no, no, no, 955 00:48:23,719 --> 00:48:26,360 Speaker 1: you missed you did you misunderstood me. I'm going to 956 00:48:26,480 --> 00:48:30,680 Speaker 1: have to jump into YouTube to watch a video, an 957 00:48:30,719 --> 00:48:33,440 Speaker 1: instructional video on how to do it. 
958 00:48:34,040 --> 00:48:36,560 Speaker 2: You know what I love about young people and technology 959 00:48:36,760 --> 00:48:40,360 Speaker 2: is that there's no hurdles, there's no fear, and they'll 960 00:48:40,400 --> 00:48:43,200 Speaker 2: just jump into a program and learn it without having 961 00:48:43,239 --> 00:48:48,200 Speaker 2: to get instruction. I just I feel that people our age, Sorry, Tiff, 962 00:48:48,200 --> 00:48:50,880 Speaker 2: I'm not talking about you like Craig and not my age. 963 00:48:51,640 --> 00:48:53,719 Speaker 2: There is a there is a you know, it's not 964 00:48:53,760 --> 00:48:57,680 Speaker 2: innately intuitive to us. I mean for some of us 965 00:48:57,760 --> 00:49:01,799 Speaker 2: who are nerds it is, But the intuitive nature of technology, 966 00:49:01,920 --> 00:49:04,399 Speaker 2: the younger the people are, the younger people are, they 967 00:49:04,440 --> 00:49:06,680 Speaker 2: just seem to pick it up and run with it 968 00:49:06,719 --> 00:49:08,400 Speaker 2: because it's intuitive for them. 969 00:49:08,760 --> 00:49:12,239 Speaker 1: Well, I think it's interesting you say that one of 970 00:49:12,239 --> 00:49:15,560 Speaker 1: my friends has got young kids, and yeah, all of 971 00:49:15,560 --> 00:49:19,560 Speaker 1: those kids speak fluent English and tech like, they grow 972 00:49:19,640 --> 00:49:23,520 Speaker 1: up with two languages. So there was a there was 973 00:49:23,560 --> 00:49:27,160 Speaker 1: never a part of their life where technology and because 974 00:49:27,160 --> 00:49:31,000 Speaker 1: they're young, like seven eight years old, it was already 975 00:49:31,120 --> 00:49:34,920 Speaker 1: quite advanced and so by the time they're two or three. 976 00:49:35,239 --> 00:49:37,560 Speaker 1: But there's this dude, James Gwilt, who used to play 977 00:49:37,600 --> 00:49:40,520 Speaker 1: for St Kilda. Shout out to Jimmy.
He comes in 978 00:49:40,600 --> 00:49:42,759 Speaker 1: to the Hamptons and I've put up a picture of 979 00:49:42,800 --> 00:49:45,279 Speaker 1: his daughter, Rosie, who's gorgeous. I don't know how old 980 00:49:45,360 --> 00:49:48,319 Speaker 1: she is, but she's not two. But she comes over 981 00:49:48,360 --> 00:49:50,600 Speaker 1: and sits on my knee while he goes and buys 982 00:49:50,640 --> 00:49:53,120 Speaker 1: his coffee, and we fuck around on my phone. She 983 00:49:53,320 --> 00:49:56,160 Speaker 1: pulls the phone out of my hand and is like, 984 00:49:56,320 --> 00:49:59,920 Speaker 1: fuck off, Grandpa, give me the phone and starts 985 00:50:00,000 --> 00:50:03,800 Speaker 1: looking for Peppa Pig on fucking YouTube or and she knows. 986 00:50:04,040 --> 00:50:06,920 Speaker 1: I mean, she doesn't type in Peppa Pig, but I'll. 987 00:50:06,640 --> 00:50:07,120 Speaker 3: bring up. 988 00:50:08,960 --> 00:50:13,560 Speaker 1: She still, she starts all Bluey. She starts scrolling to 989 00:50:13,680 --> 00:50:17,440 Speaker 1: find the cartoon or the video that she wants, and 990 00:50:17,480 --> 00:50:19,959 Speaker 1: then she'll play. And I mean, she is not even 991 00:50:20,040 --> 00:50:24,200 Speaker 1: two years old, and she's already got a better understanding 992 00:50:24,400 --> 00:50:28,320 Speaker 1: of that phone than my mum, who's had an iPhone 993 00:50:28,360 --> 00:50:29,040 Speaker 1: for ten years. 994 00:50:29,239 --> 00:50:31,359 Speaker 3: Is that a responsible adult thing to do to let 995 00:50:31,360 --> 00:50:34,960 Speaker 3: a two year old scroll through your feed? 996 00:50:34,360 --> 00:50:36,560 Speaker 1: I just want I just want her to love me, 997 00:50:36,680 --> 00:50:38,880 Speaker 1: so I just fucking let her do. I'm like the 998 00:50:38,960 --> 00:50:42,239 Speaker 1: bad grandfather that lets them eat the lollies and fuck 999 00:50:42,280 --> 00:50:43,600 Speaker 1: around and stay up late.
1000 00:50:44,200 --> 00:50:47,760 Speaker 2: Kids are awesome, and it is amazing to watch the development, 1001 00:50:47,760 --> 00:50:50,440 Speaker 2: particularly at that age. I've got a colleague of mine 1002 00:50:50,480 --> 00:50:53,279 Speaker 2: who has just gone through this whole process, and his 1003 00:50:53,400 --> 00:50:56,799 Speaker 2: little girl's about to turn two in March, and it's 1004 00:50:56,960 --> 00:50:59,600 Speaker 2: like every time I see her, she's doing something new. 1005 00:51:00,200 --> 00:51:03,520 Speaker 3: It's phenomenal. It's so interesting. 1006 00:51:03,200 --> 00:51:05,920 Speaker 1: Think about how they're evolving, like you and I are 1007 00:51:05,960 --> 00:51:09,600 Speaker 1: fucking plants just blowing in the wind, Like they're like 1008 00:51:09,719 --> 00:51:14,720 Speaker 1: little sponges that just fucking never ever stop. At that age, 1009 00:51:15,080 --> 00:51:18,840 Speaker 1: the development is exponential. When you go risk. 1010 00:51:18,680 --> 00:51:19,640 Speaker 3: Plan, I reckon Craig. 1011 00:51:20,200 --> 00:51:24,680 Speaker 2: If you're very planned, you're a carnivorous But if that. 1012 00:51:24,719 --> 00:51:27,439 Speaker 1: will, say goodbye to Tiff last Patrick. Tell people how 1013 00:51:27,440 --> 00:51:29,160 Speaker 1: to find you and connect with you, please. 1014 00:51:29,480 --> 00:51:32,200 Speaker 2: I can just go to websites now, dot com, dot au. 1015 00:51:32,440 --> 00:51:35,000 Speaker 2: That's the main kind of website that I throw out there. 1016 00:51:35,040 --> 00:51:37,080 Speaker 2: But you know, you can tai chi at home as well. 1017 00:51:37,120 --> 00:51:38,960 Speaker 2: If you think you'd like to get into the zen 1018 00:51:39,440 --> 00:51:41,399 Speaker 2: feel and you want to take a bit of a chill, 1019 00:51:41,520 --> 00:51:42,480 Speaker 2: do some chen bar chan. 1020 00:51:42,680 --> 00:51:43,760 Speaker 3: That's my favorite exercise.
1021 00:51:43,760 --> 00:51:44,759 Speaker 2: By the way, if you want to go to the 1022 00:51:44,760 --> 00:51:47,320 Speaker 2: website chen bar chan, I do it with my staff 1023 00:51:47,360 --> 00:51:49,880 Speaker 2: twice a day. It's a good way to just do 1024 00:51:49,920 --> 00:51:53,000 Speaker 2: a really quick two minute exercise. Get off your computer, 1025 00:51:53,120 --> 00:51:55,000 Speaker 2: get off what you're doing, go and have a stretch. 1026 00:51:55,160 --> 00:51:55,720 Speaker 3: It's free. 1027 00:51:56,160 --> 00:51:58,000 Speaker 2: Go to tai chi at home dot com dot au 1028 00:51:58,080 --> 00:52:00,440 Speaker 2: and do some chen bar chan today on this No. 1029 00:52:00,480 --> 00:52:01,480 Speaker 3: It's great, I love it. 1030 00:52:01,680 --> 00:52:04,440 Speaker 2: I set up this website during COVID for my students 1031 00:52:04,480 --> 00:52:06,880 Speaker 2: because we couldn't do classes, so it hasn't even been 1032 00:52:06,960 --> 00:52:09,600 Speaker 2: updated since COVID. But it does have some good exercises 1033 00:52:09,640 --> 00:52:11,319 Speaker 2: and Fritz is also there as well. 1034 00:52:11,520 --> 00:52:13,480 Speaker 3: So there you go, there's a freebie for you today. 1035 00:52:13,520 --> 00:52:16,840 Speaker 2: Do yourself something, be mindful, do some tai chi stretching 1036 00:52:16,880 --> 00:52:17,120 Speaker 2: with me. 1037 00:52:18,200 --> 00:52:19,839 Speaker 1: And if you want to go to Balan and pick 1038 00:52:19,840 --> 00:52:21,920 Speaker 1: an apple off his tree, just send him an email. 1039 00:52:22,000 --> 00:52:24,720 Speaker 1: He will gladly accommodate you. Tiffy, when are you seeing 1040 00:52:25,200 --> 00:52:26,120 Speaker 1: Granddad, today? 1041 00:52:26,960 --> 00:52:33,000 Speaker 4: Tomorrow? Oh tomorrow, right the countryside today? 1042 00:52:33,640 --> 00:52:35,120 Speaker 1: When's the shindig? Sorry? 1043 00:52:35,440 --> 00:52:39,480 Speaker 4: Is on Sunday.
We're having a party on Sunday and 1044 00:52:39,680 --> 00:52:43,319 Speaker 4: on Saturday, the one weekend I'm in Tazzy and Mark 1045 00:52:43,360 --> 00:52:46,080 Speaker 4: Seymour is playing the Boxer Encore Tour, so I'm taking 1046 00:52:46,120 --> 00:52:48,480 Speaker 4: mom and dad to see The Boxer for the first time. 1047 00:52:48,800 --> 00:52:51,840 Speaker 2: How awesome is it that will place us if you're around, 1048 00:52:51,880 --> 00:52:52,959 Speaker 2: Tiff. Of. 1049 00:52:52,880 --> 00:52:55,120 Speaker 4: course, hopefully we'll be able to nab a t-shirt and 1050 00:52:56,160 --> 00:52:57,000 Speaker 4: that is awesome. 1051 00:52:57,000 --> 00:52:59,120 Speaker 3: I still want to get a signed album, not from 1052 00:52:59,160 --> 00:53:00,239 Speaker 3: Mark Seymour, from you. 1053 00:53:01,520 --> 00:53:03,239 Speaker 2: I want, I want for one all and I want 1054 00:53:03,280 --> 00:53:05,040 Speaker 2: you to sign it for me. I've got to get one. 1055 00:53:05,600 --> 00:53:10,480 Speaker 1: Patrick, you should get Tiff to sign your breasts like 1056 00:53:10,600 --> 00:53:16,680 Speaker 1: all the young fans do. All right, it's been great. 1057 00:53:17,160 --> 00:53:17,960 Speaker 1: Thanks everyone,