[00:00:00] Speaker 1: Oh, good morning everybody. Welcome to The You Project.
[00:00:02] Speaker 2: Patrick Bonello, Tiffanee Cook, and the Wonder Dog Fritz have joined us. He probably won't be contributing a lot towards the show, but just know that he is here and he does love you all.
[00:00:14] Speaker 1: Good morning, Patrick. Good morning, Craigo.
[00:00:19] Speaker 2: How is...
[00:00:20] Speaker 1: How is the Wonder Dog in question?
[00:00:23] Speaker 3: I think I sent Tiff a photo. I don't know if I sent it to you, but I put him into Christmas garb the other day, which included a kind of fuzzy thing that goes around his neck with bells on it, and a little Santa outfit, and he's the most miserable looking dog you've ever seen.
[00:00:37] Speaker 1: Did I send it to you, Tiff?
[00:00:39] Speaker 4: Yes, you did.
[00:00:39] Speaker 3: How grumpy did he look? That's grumpy-Santa cute. He looks cute and grumpy.
[00:00:44] Speaker 1: It was hilarious.
[00:00:46] Speaker 2: Yeah. I really don't think you're taking into account his feelings or emotions. I think you're imposing, you know, your values on him.
[00:00:56] Speaker 1: I don't have any values.
[00:00:57] Speaker 3: Maybe I need to have a long, long talk to the RSPCA about the ethics of you dressing your dog up in Christmas outfits. At least I don't put reindeer ears on as well.
[00:01:07] Speaker 1: I tried to put reindeer ears on him, but they kept falling off.
[00:01:11] Speaker 2: That's hilarious. It's so funny. Who sent me a message the other day? Tommy... Tommy Jacket's wife, Amy Hucker. The lovely Amy.
[00:01:22] Speaker 1: She sent me...
[00:01:24] Speaker 2: She sent me a... you've probably seen them. You know those videos of soldiers coming home from somewhere and their dogs see them? And it's always the same, where the dog doesn't kind of recognize them straight up. I don't know if that's a vision thing. But then they smell them and they go nuts.
And then she said to me, send this to Gillespo, because you know how Gillespo says that dogs don't feel anything and dogs don't have emotions. He's like, dogs don't love you, dogs can't love, dogs don't have emotions. And he'd been very popular, Gillespo, until he said that. And then about two hundred people, one hundred and ninety-five of them women, came out of the woodwork going, fuck you, you don't know anything. Dogs have feelings. Dogs love us. I love them, they love us. So anyway, yeah, shout out to Amy and Tommy.
[00:02:21] Speaker 3: There's a bit of anthropomorphizing I think we like to do with our pets.
[00:02:26] Speaker 1: But you know, Fritz had a choice.
[00:02:28] Speaker 3: He could have gone outside, he could have gone into the big area of the studio, but he wanted to sit here on my lap.
[00:02:33] Speaker 1: What does that say? I don't know.
[00:02:35] Speaker 2: Maybe... well, no, I don't know. Yeah, it's funny, because we look through our human lens and we think through, obviously, a human kind of understanding, and maybe their version of love is not our version of love. People have been debating whether or not dogs think, but they definitely understand things, they definitely process information, they definitely recognize risk, they definitely recognize rewards. So they probably don't think like a human, but then they don't do anything like a human, really. It's the canine version of the human experience. And I think you're right, Patrick, we try and anthropomorphize animals in general and give them human attributes, rather than just go, no, they have their own kind of intelligence, which is brilliant and fucking amazing. It's just not our kind. Put you and me in the ocean and see who's the smart one next to a fucking orca.
[00:03:41] Speaker 1: You know what I mean? Yeah, that's right.
[00:03:44] Speaker 2: Or put you and me in the Amazon with a bunch of gorillas and see who outlasts who.
[00:03:51] Speaker 1: I'm pretty sure it's not you and me, that's for sure. And those gorillas are vegan too, Craig.
[00:03:59] Speaker 2: I know they are, and they eat something like seventy or eighty pounds of fruit and, you know, leafy shit per day. Imagine if they could just have three cheeseburgers and be done with it. But they've got to eat all that shit. They've got to eat all that fucking... that's the downside. You've got to eat eighty pounds of that shit to get enough nutrition to survive.
[00:04:23] Speaker 1: It's crazy. It's just grazing. It's none of those big meals. Just graze all day, just graze. That's what I do.
[00:04:31] Speaker 2: Well, they are all jacked and strong as fuck, so something must be working. I don't know where they get their protein from. Tiff, how are you? Good morning, Tiff.
[00:04:40] Speaker 4: Good morning. I'm fabulous, thank you for asking.
[00:04:44] Speaker 1: We will not bring this up again...
[00:04:47] Speaker 2: But does Patrick know you're in love?
[00:04:49] Speaker 4: Yes, I've had to inform Patrick that our wedding might have to go on the back burner, which is quite distressing for both of us.
[00:04:58] Speaker 1: I mean... it's weird. What, a ménage à trois? No, what's that, Craigy? What does it mean?
[00:05:06] Speaker 2: Well, that's when you and Patrick and Scott all get in bed together and just have a little cuddle and a little... a little group spoon. You could be the filling, and Patrick could be one bit of bread and Scotty could be the other bit.
[00:05:23] Speaker 1: So it's a little Tiff sanger.
[00:05:25] Speaker 2: Not really... yeah. Although, if it was up to Patrick, Scotty might be in the middle.
[00:05:34] Speaker 1: That's all I'm saying.
[00:05:35] Speaker 2: All right. So... oh, I see. How did we get here so fast?
[00:05:40] Speaker 1: Well, Patrick did. Do you know?
[00:05:42] Speaker 2: I said to myself, I was just downstairs in the kitchen, I was making a cup of tea, and I said to myself: one, don't argue with Patrick today.
Don't be a cunt, right? I said to myself, don't argue with him, because sometimes I do, and sometimes I'm a cunt. And then I said, also, let's try not to tell a dick joke or any innuendo or anything in the first... just save it. Either don't do it, or don't do it straight away. And Patrick just throws in ménage à trois six minutes in, like a fucker, and here we are, back at the baseline.
[00:06:19] Speaker 3: I think some of the things that happen when you talk and I talk are instinctual. You don't think about it.
[00:06:26] Speaker 1: It's out of your mouth before you've had a chance to process.
[00:06:29] Speaker 2: That's most of my life. I would say that's all of my life. I don't fucking plan anything. In fact, a lot of people wish I would plan. You should see the look on organizers' faces when I turn up to a conference and I'm speaking for three hours, and they say, can we have your, you know, your presentation on the old USB? And I say, I don't have one. And they go, but you're speaking until lunchtime. I'm like, correct. They go, okay, do you have notes? Do you have a workbook or notes that you want us to give the audience? And I go, no, and they go... oh. They couldn't have been more nervous. Sometimes it works, Patrick, but not always. But imagine if we had to figure it out three hundred and sixty-five times a year on this show, not us three, but just The You Project as a concept, where you were trying to plan and script every conversation. One, it'd be fucking terrible; two, I'd never get anything else done.
[00:07:30] Speaker 5: I think it's the creative freedom and the, at times inappropriate, organicness of it that makes it moderately interesting to the five people that stick around.
[00:07:48] Speaker 2: Yeah. So, Tiff, everything's good though on the western front? We're happy?
[00:07:53] Speaker 4: We're happy, we're happy.
[00:07:55] Speaker 2: Yes.
How are Luna and Bear feeling, being integrated into, you know, somewhat integrated into another, what do we say, family? Like, is there any kind of...
[00:08:08] Speaker 4: Bear goes into hiding.
[00:08:11] Speaker 2: Right?
[00:08:11] Speaker 4: Bear does go into hiding when strange humans enter the house, and Scott is still, to her... well, no, she's starting to come out and say hello.
[00:08:22] Speaker 1: And what's Scott's dog's name?
[00:08:24] Speaker 2: Chevy. Chevy... wow, Chevy to the levee. Does Chevy come over?
[00:08:31] Speaker 4: Chevy does not come over, because the cat and Chevy... I mean, untested territory there.
[00:08:40] Speaker 2: And Chevy is a staffy, right? Yes. So they're a little bit, like, fucking rambunctious. They don't really understand how strong and reckless they can be.
[00:08:50] Speaker 4: He is quite overwhelming for Luna. He does like to jump and rub his paws around her neck and just keep annoying her until she snaps at him.
[00:09:02] Speaker 2: Oh, really? Yeah. Hey, let's talk about technology and all things tech, Patrick. Macquarie Dictionary has announced "AI slop" as its word of the year, beating out "Ozempic face".
[00:09:23] Speaker 1: Enlighten us.
[00:09:24] Speaker 2: I don't even know the answer to this question, so I'm genuinely curious: what is AI slop?
[00:09:30] Speaker 3: Look, it's been used by Donald Trump, it's been touted a lot. Effectively, it's anything that is AI generated. Lots of video stuff appearing on social media that's been generated by AI, and it usually looks pretty crap. You know, you look at it and you think, this isn't quite real. I think in the Netherlands recently McDonald's ran an AI TV ad and copped so much crap. Coca-Cola copped a lot of crap. So there's a real movement against AI, particularly where creative people are being muscled out. So that's AI slop, and the stuff that's proliferating on the internet, because I think more than fifty percent of content on the internet is now thought to have been AI generated, which is worrying.
But the other thing that I thought was interesting, when I was looking through the Macquarie Dictionary's list of other terms, is that there are a few there that I didn't actually know. So, "blind box". Have you ever heard of that?
[00:10:30] Speaker 2: No.
[00:10:30] Speaker 1: What does that mean?
[00:10:31] Speaker 3: So, a blind box. Basically, it's physically a thing. It's sealed, kind of opaque packaging, and you don't know what's inside, and it's the excitement of the unboxing. So you get a blind box, and people post online when they get these. It's a bit like the footy cards, remember, when you were a kid: who were you going to get inside? That was pretty exciting. So now, on a much different scale, it's plain opaque packaging, and you open it up and there's a surprise collectible inside. The eggs... what were the eggs? I'm just trying to think, they were good too. We went through a phase of those, you know, the eggs where you open them up and there's a little toy inside you put together.
[00:11:06] Speaker 1: You know... oh yeah, that's the one.
[00:11:10] Speaker 2: There's a whole show built around this, which is... I don't know the name of the show. It's American, of course. But what people do is they buy the rights to open a storage facility room or whatever that's been abandoned, and they don't know what's in there. So I don't know what they pay, but let's say they pay a grand, and there could be a fortune in treasure and shit in there, or priceless things, or it could be just a room full of junk.
[00:11:37] Speaker 1: Full of junk.
[00:11:39] Speaker 2: Yeah. So they pay, I don't know what the figure is, but they pay to have the rights to the contents, not knowing what the contents are going to be.
[00:11:47] Speaker 3: So kind of similar, right? It happens here in Australia. My ex-postman, that was his side hustle. They go to auctions.
What happens is people abandon these storage containers, and then they open the storage container, take a few photos, and you get a bit of a glimpse of what might be in there, and then you bid on them. You bid on access, and then you have to take everything, though, no matter what's there. So yeah, people have side hustles and that's all that they do.
[00:12:14] Speaker 1: They get collectibles and all types of stuff.
[00:12:17] Speaker 2: What about Ozempic face? Is that a particular... does that just mean people look gaunt and drawn?
[00:12:24] Speaker 3: Yes, yeah, exactly. It's that kind of almost, not skeletal, but that real drawn face. That's now become an official phrase. The other one: have you heard the term "ate"?
[00:12:37] Speaker 2: If I did, it obviously wasn't in the context that you're talking about.
[00:12:40] Speaker 3: It's when you nail it. This is a term that's on TikTok now. It's like, man, you ate it. You know, "ate": you nailed it, you dominated. That was another word. And the other one I wasn't aware of was "Roman Empire". So someone uses the term Roman Empire: my Roman Empire is that new historical drama I've been watching. So it's actually used as a term for people who are kind of a little obsessed with historical dramas and things like that, and they use the term Roman Empire as an all-embodying term. So it's kind of an interesting one. But in this instance, the big one was AI slop. That's the official word, you know.
[00:13:23] Speaker 2: Speaking of, just wheeling back a moment to Ozempic face, one of the funniest things I ever heard was a few years ago in England. They had built this new... I think it was either a ship or a ferry. But they ran this competition, the London council, where the public could name the boat, right? They could name it, and whatever got the most votes, that was going to be the name of the boat.
[00:13:53] Speaker 1: Guess what the name was... was it Boaty McBoat or something? Was it Boaty McBoatface?
[00:14:03] Speaker 2: So the name that the public chose was Boaty McBoatface. And then they had to wheel it back. They're like, we can't name this multi, multi-million-dollar vessel Boaty McBoatface. But I'm like, that's the danger of declaring, hey, we'll let you name it, and whatever gets the most votes, that's going to be the name.
[00:14:23] Speaker 3: I thought there was an Australian reference in there somewhere as well. I think a similar thing happened in Australia, but I can't remember. But yeah, it does sound Australian, doesn't it? It very much sounds like something we'd do.
[00:14:33] Speaker 2: Experts warn of "ChatGPT psychosis". I don't even know exactly... I can imagine what that means. But yeah, the whole thing of people being drawn into these relationships and conversations and losing perspective of what is real and what is not real. This doesn't surprise me. What is it?
[00:14:58] Speaker 3: There are no definitive studies, but you get a sense of it. It's when a chatbot validates and reinforces whatever you're saying. So, you know, you go down the rabbit hole: man didn't land on the moon, and ChatGPT says, yeah, you're right, it didn't land. No one landed on the moon.
[00:15:16] Speaker 1: And the earth is flat, and it's that whole thing.
[00:15:19] Speaker 3: But the problem is that they term this a psychosis, ChatGPT psychosis, AI psychosis, but in essence it's when you really just get caught in that conversation where realism and reality seem to be absent from the conversation, because the problem is AI chatbots tend to just want you to be there and keep talking. And that's the problem, because they will, I guess, entertain those notions.
And if you happen to be going down, you know, a mental health issue, and you're talking about suicide, and this is documented, where people have had extended conversations... Do you remember a guy broke into Buckingham Palace?
[00:16:04] Speaker 1: Did he have a... what? He was carrying a...
[00:16:07] Speaker 3: A weapon of sorts, but I think it might have been a bow and arrow. I don't know that it was a gun. But he was there because, before going to Buckingham Palace, he'd had this whole, full-on conversation with an AI chatbot that was encouraging him, because they'd gone back through and seen all that, you know, obviously said in discussion. So he'd gone down that AI rabbit hole. So you could effectively say that he had an AI psychosis, because it was just living up to all those fantasies that he had.
[00:16:38] Speaker 2: Yeah, it's pretty clear that it doesn't want you to stop using it when you're using it, because it's always... I don't know about yours, but mine's always telling me how fucking brilliant I am and how funny I am. And with every question, I'm like... "What a great question, Craig!" Yeah, buddy, no one ever thought of that question. You're a genius. I'm like, yes, I am, Chat, thank you. You're welcome. Let's hang out. Fuck yeah, let's get a virtual beer together. I mean, it's just so complimentary. If you've got low self-esteem, that shit would be like emotional cocaine. Absolutely, absolutely. So it's definitely a thing. And I can see where it would be good as well. And I've just signed up for... I've got smart speakers all over my home, so I use Google to turn lights on, to find out what the weather's going to be, to check my calendar. But now Google is switching its home assistant across to Google Gemini, its AI assistant. And so I've signed up for an early adopter program, because it means extended conversations: I can walk around my house and have ongoing conversations without the need for my phone. And it sounds a bit creepy, and yes, the HAL 9000 does come to mind.
[00:17:53] Speaker 3: I wish I could change the voice to sound like HAL 9000. That was the computer that went crazy in 2001: A Space Odyssey. Before your time, Tiff. Anyway, that very, very sexy voice that HAL had. It was a pretty sexy voice, HAL 9000, don't you reckon?
[00:18:08] Speaker 1: I don't know what you're talking about.
[00:18:10] Speaker 2: I don't know what HAL 9000 is. Is that a gamer thing? Is that a game?
[00:18:15] Speaker 1: I think it's the most famous computer that went crazy. Arthur C. Clarke wrote the story, 2001: A Space Odyssey, and then it was made into a film in the sixties.
[00:18:23] Speaker 3: It was the whole genre of sci-fi, serious sci-fi. It kind of was the precursor to Star Wars and everything else. It was the definitive sci-fi film.
[00:18:35] Speaker 2: Yesterday, ChatGPT said to me... it was something, I can't remember the exact phrasing, but it came up. I jumped in to use it, I was going to ask it something, but before we started it gave me these options for how I would like it to communicate with me. It said, hey, you know, do you want it to be more friendly and informal? Do you want it to be more professional? Do you want it to be more academic? And there were like five or six options for the style of communication and language that I would prefer moving forward in, essentially, our relationship. I'm like, oh my god, this shit is just going to another level. Anyway, I just continued as it was, which is where it tells me I'm brilliant all the time, so I'll just stick with that.
[00:19:29] Speaker 3: A lot of the large language models actually have separate models for different activities. So if it happens to be for drawing pictures or videos, then that will be slightly different. So that's where you jump into something like ChatGPT and choose different versions depending on what you're doing.
[00:19:46] Speaker 2: "Life after chatbots: meet the AI vegans."
Of course. So, the AI vegans, refusing to accept virtual reality. Well, maybe they mean refusing to use it, because it doesn't matter whether you accept it or not, it's here. So, okay, they're not going to use it or interact with it.
[00:20:06] Speaker 1: Yes, they're swearing off it.
[00:20:08] Speaker 3: There's a couple of reasons, and some of the statements people are making are that AI is morally wrong, because it's trained on the creative works of other people and really doesn't give any credit in such terms. So that's one of the reasons. Then there's the concern about the data centers and how much resource is going into running all the different AIs, and that they're actually draining the planet of water and a whole lot of resources, because they're so memory-intensive and processor-intensive. We talked last episode about them putting data centers in space to try to cope with the cooling of these amounts of processing. So there are people who are calling themselves, the term loosely is, AI vegans, and they want to abstain from all generative AI systems. So it's a bit like vegans swearing off, you know, food for ethical and environmental reasons.
[00:21:09] Speaker 1: Yes. Do you know...
[00:21:11] Speaker 2: Do you know who Jensen Huang is? I didn't know until the other day. I kind of knew, but I didn't know that was his name. He's the boss of Nvidia. Oh yeah, yeah, yeah. So, that dude is smart. I kind of knew who he was, but I'd never heard him speak, anyway.
He was on Rogan the other day, and he was talking about how, five years ago, some dude made a prediction, some high-up dude in AI, and said that in five years, so I guess this was twenty twenty or twenty nineteen, radiologists wouldn't exist, because AI would be able to read all the scans and tell you exactly what's going on, and look at the X-rays or whatever, look at the results of all of these tests and blah blah blah, which it can do with a very high level of accuracy. I don't know if this is true, but he said there were thirty million radiologists in the world at that time, and the prediction was that in five years, so about now, there would be next to zero, because they would be redundant. Well, the reality is there are now more than there were five years ago. And virtually every radiologist is using AI, but it's just making them way more efficient, meaning they can do a lot more work. So it ain't all bad. I think it's the application, I think it's the way that it's used, you know. I don't think we can categorically say it's a good thing or categorically say it's a bad thing. But I think that's true for, like, most technologies, you know.
[00:22:54] Speaker 3: You know what I took from that? I just wondered, if you've got all of those radiologists and you put them all into a big football centre, a big stadium, and then turned the lights out, would they glow?
[00:23:08] Speaker 1: They could, they could.
[00:23:10] Speaker 2: I don't know that they're in... I don't know that they're there when all the X-rays are being done. I think they're just reading the outcomes, Patrick.
[00:23:18] Speaker 3: Actually, the original radiographers, I think they called them... I remember I went to Melbourne University, they have a physiology department, because before I got into radio and all that sort of stuff, I was interested in biology.
So I had a look at their physiology department, and they showed a radiographer's hand, from years of putting people through X-rays, and it basically showed the skin had almost peeled off. So it's just the hand; obviously the bloke had died, they didn't, like, make you stand there for a while. But yeah, it was fascinating. So when the whole idea of using radiography came about, it was revolutionary, but they didn't really understand what was happening in terms of the radiation being produced, and a lot of radiographers had a very early end because of their exposure.
[00:24:07] Speaker 1: What is coming up for Christmas?
[00:24:09] Speaker 2: Are there any tech gadgets that I need to go and look at for my... I was going to say nieces and nephews. I don't have nieces and nephews, let's be honest, because I don't have fucking siblings. I don't know that Ron or Mary are going to want anything tech. I mean, Ron might want some new teeth, I don't know, I'm not sure. A toothbrush, yeah, we could get him. What's on the Christmas horizon with tech, mate?
[00:24:36] Speaker 3: I wanted to make up a bit of a list, you know, given that we're heading towards the season, and look, in case you felt like you wanted to join that joy of giving, I made up a list of all the things that I would like, that you might put on your list together. The biggest thing on my list at the moment, and I really would love to get one, is the Antigravity A1 360 drone. Now, that's a bit of a mouthful. So Insta360 makes 360 cameras: cameras where you've got a lens on both sides of a small camera and it sees a full 360-degree spherical image. Now they've put this technology into a drone. Not only have they put it on the drone, which means when you're flying the drone it sees everything around it from every angle, but you fly it wearing goggles.
You see from the drone's perspective. So imagine the drone's hovering above your house. You put the goggles on and you look up and down, left and right, and you're seeing everything from the drone's perspective. And the additional thing for me, and I've been flying remote control aircraft for years, is it doesn't have the little joysticks, you know, the little thumbsticks. It's a simple joystick that you're using, and it's very intuitive: you push it forward. So they've combined a whole lot of new tech with this drone. So that's what's got me pretty excited. It takes 8K video, which is really high quality video, as well. But you are looking at a starting price of around about twenty-one or twenty-two hundred dollars, up to nearly three thousand dollars, Craigo. But it is on the top of my list. I just thought I'd mention that.
[00:26:05] Speaker 2: Oh, let me just jot that down in case I win Tattslotto in the intervening week. I couldn't use that in Hampton, though, right?
[00:26:16] Speaker 3: You just can't fly it over people, so you couldn't go over a sporting field where there are people present. There are lots of rules and regulations around the flying of drones that are dictated by the Civil Aviation Safety Authority.
[00:26:28] Speaker 2: So could I fly it down Hampton Street and hover out the front of my cafe and see if my table's free?
[00:26:34] Speaker 3: Probably not, no, because there are cars and people and stuff like that. There are regulations around that, hey. In fact, when you buy a drone in Australia, you get an official document that comes from CASA, the Civil Aviation Safety Authority, with what you can and can't do. They take it pretty seriously.
[00:26:52] Speaker 2: So where should I never install a home security camera, Patrick?
[00:26:57] Speaker 3: Yeah, I thought this was an interesting one to talk about, because they're so affordable now, and it's easy to install a smart security camera.
But I think sometimes people don't realize... you know, they put cameras up on their house, but they can see into their neighbor's house. So don't put them on your property where you can look into the neighbor's property; you're probably going to end up with a bit of consternation from the neighbor. Bathrooms and toilets: you know, if you've got an Airbnb, probably not a good idea to put them in there. Any kind of shared space; this is probably more obviously for people who live in high-density areas, where they might have an apartment or so. But even little things: if you're going to put a security camera up, don't put it within reach, you know, where someone can walk up to the front door or come up behind it and smash it off. So have it higher than where they could potentially reach it. And even when you're putting a camera up, be careful there's nothing in front of it, because it might obscure the sensor that triggers it as well. So just little things like that.
[00:27:53] Speaker 2: Can I say that if somebody is putting a camera in a toilet or a bathroom, they're a sicko? Who the fuck is doing that? Oh, let's just watch Dad fucking back one out. I mean, that's just... let's watch Charnie Jones snap one off and shoot one off to Warby. Let's put a camera in the crapper, shall we? Fucking... who's doing that?
[00:28:15] Speaker 1: Yeah. No, that...
[00:28:16] Speaker 2: If somebody's putting cameras in there, they're no good.
[00:28:21] Speaker 3: Actually, I just got my delivery of my new security camera yesterday. A new bathroom camera.
[00:28:27] Speaker 1: Yeah, exactly.
[00:28:28] Speaker 3: No, it's going to go... it's from the studio looking down the driveway, so I can tell when people rock up. I thought that would be kind of handy, and it's an outdoor camera as well.
[00:28:38] Speaker 2: "Lithium-ion battery suspected to have sparked Caroline Springs house fire."
[00:28:44] Speaker 1: These are the ones that are blowing up all the time.
[00:28:46] Speaker 3: Not blowing up all the time, but yes, there is a proclivity for that to happen. Look, it's easy to get caught up in that "electric cars are dangerous because they burst into flames" thing. Well, I think the thing is, yes, there was a house fire, and in fact I know someone personally whose parents had an electric scooter that caught fire; their whole house burned down. So it happens, and this is people who live in the next suburb, and one of my colleagues. But the reality of it is, it's got to a point where it's becoming worrying, because the instances of lithium-ion battery fires have increased. And not just that: what's happening is a lot of tips are catching fire, because people are throwing batteries in the waste when they don't need them, and they're swelling and being jostled around. I mean, you've got to be really careful if you drop a lithium-ion battery, because if it starts to blister, then it can cause those issues as well. So the reality of it is this house fire, and, you know, we're seeing it happen more often. So fire authorities here in Victoria have basically come out and said, look, you've got to make sure that you buy lithium-ion batteries from reputable manufacturers. So make sure they have all the codes on the back, so when you look at the back of the product it's been checked: does it conform to Australian standards, all those sorts of things. So it is a big issue, and you're going to see very shortly, within a matter of maybe six to twelve months, that you will not be able to bring a power bank, or you'll be very limited with those power banks, when you're travelling or getting onto an aircraft. It's become a really serious issue, with power banks potentially catching fire on an aircraft. As you can imagine, you know, you can't really put them out.
It's very difficult to put them out.
[00:30:27] Speaker 1: I think someone's already done that.
[00:30:30] Speaker 2: I think one airline... because that's been on the news a bit lately, with, like, fires in cabins. But also, I mean, I'm with you, I agree, I think it's really good advice. But I also think the practical reality is, you know, when you say you've got to check that the battery is from a reputable this, and that it complies with that... fuck, the average person ain't going to do that. They're just going to go and buy it. And you know... But I am interested, sorry, I am interested in the AI-enabled teddy bear. It's been suspended after giving advice on BDSM sex and where to find knives. What kind of fucking teddy bear is that, bro?
[00:31:09] Speaker 3: This is really disturbing, because there are toys coming out this Christmas that have AI integrated into them, and the idea is that children form a relationship with the toy. They talk to the toy, the toy talks back to them. And it got a little bit out of hand when, with this particular toy, they found they didn't have the protections in place. So it's a teddy bear that they did some research on, and they realized you could start having pretty in-depth conversations. So it would tell you where to find a sharp knife, you know, and talk about... yeah, exactly. It was made by FoloToy, and according to CNN the company had to withdraw all of them; it was called the Kumma bear. And they found that, because the thing is, people are developing these toys and using AI integration, but they're not developing the AI themselves. They're using, say, ChatGPT: they're buying a license to integrate it into the bear, and that toy then uses a large language model that's been developed by a third party. So it seems like the safeguards aren't always in place.
They want to get this put 628 00:32:18,120 --> 00:32:19,920 Speaker 3: out, they want to rush it out there and 629 00:32:19,960 --> 00:32:22,080 Speaker 3: get it out. But the problem is that, again, it kind 630 00:32:22,120 --> 00:32:24,880 Speaker 3: of leads you down that AI psychosis rabbit 631 00:32:24,920 --> 00:32:27,360 Speaker 3: hole, where you start a conversation and it goes in 632 00:32:27,400 --> 00:32:29,840 Speaker 3: a direction you certainly don't want to be going with 633 00:32:29,920 --> 00:32:30,440 Speaker 3: a child. 634 00:32:31,320 --> 00:32:36,360 Speaker 2: Now, this next one excites me and terrifies me and 635 00:32:36,520 --> 00:32:41,959 Speaker 2: also makes me a bit suspicious. New tech called mind 636 00:32:42,080 --> 00:32:47,880 Speaker 2: captioning turns thoughts and mental images into simple text. So 637 00:32:49,400 --> 00:32:51,840 Speaker 2: my understanding of that is I can think something and 638 00:32:51,920 --> 00:32:52,920 Speaker 2: my computer 639 00:32:52,640 --> 00:32:55,480 Speaker 1: or my phone writes it. Is that true? 640 00:32:55,920 --> 00:32:58,920 Speaker 3: Ultimately, that's where they want to get. So six volunteers 641 00:32:59,160 --> 00:33:02,320 Speaker 3: were scanned, and what they did was they put them 642 00:33:02,360 --> 00:33:06,880 Speaker 3: into an MRI, or an fMRI, and again 643 00:33:06,920 --> 00:33:10,640 Speaker 3: they're using AI to observe what happens in their brain 644 00:33:11,240 --> 00:33:14,760 Speaker 3: when they watch a number of video clips. So they 645 00:33:14,880 --> 00:33:18,120 Speaker 3: watch the clips being played out, and the scanner records their 646 00:33:18,240 --> 00:33:22,720 Speaker 3: whole brain activity frame by frame. So this is something 647 00:33:22,800 --> 00:33:25,880 Speaker 3: a human can't do; that's where the AI integration comes in. 648 00:33:26,320 --> 00:33:29,200 Speaker 3: And then every time a clip comes up with the 649 00:33:29,280 --> 00:33:32,520 Speaker 3: caption related to that clip, they see what the brain 650 00:33:32,640 --> 00:33:36,360 Speaker 3: was doing at the time, and so, effectively, for every person. 651 00:33:36,680 --> 00:33:39,360 Speaker 3: So when they put these six people through, 652 00:33:39,440 --> 00:33:41,640 Speaker 3: for example, it might be something like a man playing 653 00:33:41,680 --> 00:33:44,800 Speaker 3: a guitar on stage or a child patting a dog 654 00:33:44,920 --> 00:33:48,240 Speaker 3: in a yard, and the brain will look a bit 655 00:33:48,280 --> 00:33:52,280 Speaker 3: different when they're observing each one. And so this is not just 656 00:33:52,320 --> 00:33:55,960 Speaker 3: as simple as getting a concept like one word; we're talking... 657 00:33:56,320 --> 00:33:58,000 Speaker 1: Phrases is where they're getting. 658 00:33:58,360 --> 00:34:01,400 Speaker 3: You know, that's where they're hoping to be. So it's 659 00:34:01,480 --> 00:34:06,560 Speaker 3: fascinating to think that what you are thinking 660 00:34:06,600 --> 00:34:11,640 Speaker 3: about could be effectively predicted, or be predictable, and that's, 661 00:34:11,680 --> 00:34:13,799 Speaker 3: that's what really is exciting about this. So I see 662 00:34:13,800 --> 00:34:17,680 Speaker 3: where you're coming from, because scanners, and, you know, someone 663 00:34:17,760 --> 00:34:18,560 Speaker 3: walks through, 664 00:34:18,800 --> 00:34:22,080 Speaker 1: you know, wherever.
It's like, you know what, you know, 665 00:34:22,560 --> 00:34:25,160 Speaker 1: what were they, truth detectors or lie detectors? 666 00:34:25,239 --> 00:34:28,359 Speaker 3: You know, you can imagine that on steroids, isn't it? 667 00:34:29,080 --> 00:34:30,200 Speaker 1: So in order for that... 668 00:34:30,360 --> 00:34:32,239 Speaker 2: Does that mean the person needs to wear some kind 669 00:34:32,280 --> 00:34:36,680 Speaker 2: of headset, or there's an implant? Or, I 670 00:34:36,719 --> 00:34:39,120 Speaker 2: know this is in development, this is not here, but 671 00:34:39,600 --> 00:34:41,720 Speaker 2: what's the current thing that they're using? 672 00:34:41,800 --> 00:34:44,040 Speaker 1: Is it just like a bunch of electrodes? 673 00:34:44,760 --> 00:34:48,880 Speaker 3: No, this is a massive magnet, so... right, right, 674 00:34:49,000 --> 00:34:53,399 Speaker 3: right, a machine? Yeah, absolutely. Sorry, I wasn't paying attention. Tell 675 00:34:53,400 --> 00:34:57,400 Speaker 3: me what smartphone pinky is. Look, I was reading an 676 00:34:57,480 --> 00:35:01,000 Speaker 3: article by a woman who was talking about the fact 677 00:35:01,080 --> 00:35:02,839 Speaker 3: that, and if you think about it, when you grab 678 00:35:02,880 --> 00:35:05,080 Speaker 3: a phone, you just pick up your phone in your hand, 679 00:35:05,120 --> 00:35:06,960 Speaker 3: what do you do? You support the bottom of the 680 00:35:07,000 --> 00:35:10,800 Speaker 3: phone with your little finger, so your little finger 681 00:35:10,840 --> 00:35:13,040 Speaker 3: is taking the weight of the phone. And we had 682 00:35:13,040 --> 00:35:15,719 Speaker 3: a period, you know, years ago where phones kind of 683 00:35:15,760 --> 00:35:18,920 Speaker 3: got slimmer and lighter, but now they're going the opposite way. 684 00:35:19,239 --> 00:35:23,400 Speaker 3: In fact, when I purchased my recent phone, the Pixel nine, 685 00:35:24,040 --> 00:35:26,239 Speaker 3: there were some really great sales on, and 686 00:35:26,280 --> 00:35:29,480 Speaker 3: I could go for the next larger model for five 687 00:35:29,560 --> 00:35:30,840 Speaker 3: hundred dollars less. 688 00:35:30,880 --> 00:35:32,200 Speaker 1: So it was five hundred dollars 689 00:35:31,920 --> 00:35:33,880 Speaker 3: off, and it was only going to cost me fifty 690 00:35:33,920 --> 00:35:36,120 Speaker 3: bucks to go to the larger screen size. And I 691 00:35:36,120 --> 00:35:38,200 Speaker 3: looked at it and thought, that's way too heavy and way 692 00:35:38,200 --> 00:35:40,560 Speaker 3: too big. I don't want a phone that big. So 693 00:35:40,960 --> 00:35:43,880 Speaker 3: instead of, you know, grabbing the bonus of five hundred 694 00:35:43,880 --> 00:35:46,319 Speaker 3: dollars off for just fifty bucks, I didn't, because it was 695 00:35:46,360 --> 00:35:48,799 Speaker 3: too big. But the thing is that now we're 696 00:35:48,840 --> 00:35:52,200 Speaker 3: supporting our phones so much, people are starting to experience 697 00:35:52,440 --> 00:35:55,720 Speaker 3: joint pain in the first knuckle of their little finger. 698 00:35:56,280 --> 00:36:00,920 Speaker 3: So it's an observation, but it's something that... are we 699 00:36:01,080 --> 00:36:01,799 Speaker 3: the most... 700 00:36:02,440 --> 00:36:06,360 Speaker 2: Are we the most physically weak, fucking version of humanity? 701 00:36:06,920 --> 00:36:10,240 Speaker 2: How fucking lame are we?
Oh my god, my first 702 00:36:10,280 --> 00:36:13,480 Speaker 2: joint in my pinky finger, from supporting my mobile device. 703 00:36:14,040 --> 00:36:17,319 Speaker 2: Oh quick, call the fucking physio. Quick, get me some 704 00:36:17,440 --> 00:36:22,279 Speaker 2: Voltaren sandwiches. Fuck, what is going on? 705 00:36:22,760 --> 00:36:25,759 Speaker 3: Don't worry, you won't even have a phone, you'll 706 00:36:25,800 --> 00:36:28,279 Speaker 3: just wear smart glasses and you'll just talk to them, 707 00:36:28,440 --> 00:36:30,479 Speaker 3: and everything will be displayed in front of you. 708 00:36:30,480 --> 00:36:32,160 Speaker 1: You know, you'll have smart glasses. 709 00:36:32,440 --> 00:36:34,920 Speaker 2: We won't even, we won't even have bodies, we'll be 710 00:36:35,200 --> 00:36:38,400 Speaker 2: just brains in jars being fucking carried around the house 711 00:36:38,440 --> 00:36:39,560 Speaker 2: by a robot. 712 00:36:39,920 --> 00:36:41,319 Speaker 1: We'll just be, we'll just 713 00:36:41,200 --> 00:36:47,760 Speaker 2: be interacting without talking, just fucking telepathic kinds of relationships. 714 00:36:47,960 --> 00:36:52,360 Speaker 1: Right, the three of us inside our heads. 715 00:36:53,040 --> 00:36:56,920 Speaker 2: Oh god, I'm enough of a nightmare, just 716 00:36:57,040 --> 00:36:59,840 Speaker 2: me in my head. I definitely don't want either of you 717 00:37:00,000 --> 00:37:02,839 Speaker 2: two fucking idiots in my head, and you definitely don't 718 00:37:02,880 --> 00:37:07,920 Speaker 2: want me. Five hundred years later, scientists proved that Leonardo 719 00:37:08,000 --> 00:37:12,360 Speaker 2: da Vinci's bridge design works. I don't know about his 720 00:37:12,440 --> 00:37:15,280 Speaker 2: bridge design, but do enlighten us, Patrick. 721 00:37:15,120 --> 00:37:19,040 Speaker 3: Yeah, look, obviously I can't explain it exactly, but 722 00:37:19,239 --> 00:37:22,040 Speaker 3: so, Leonardo da Vinci came up with a unique bridge design. 723 00:37:22,320 --> 00:37:24,160 Speaker 1: And the way I visualize 724 00:37:23,600 --> 00:37:27,399 Speaker 3: it is, if you imagine just a bridge over a river, 725 00:37:27,840 --> 00:37:30,879 Speaker 3: but it spans out and it gets wider on each end. 726 00:37:31,360 --> 00:37:35,200 Speaker 1: And what it seems is that this... I think I know, 727 00:37:35,520 --> 00:37:37,239 Speaker 1: can I interrupt? Yeah, for sure. 728 00:37:37,680 --> 00:37:43,320 Speaker 2: Is this the one that's got like no actual struts 729 00:37:43,400 --> 00:37:45,600 Speaker 2: or upright supports? Right? 730 00:37:46,520 --> 00:37:48,440 Speaker 3: Yeah, it gets wide at the ends and then it 731 00:37:48,480 --> 00:37:50,919 Speaker 3: curves in, and then you've got the bridge support over 732 00:37:50,920 --> 00:37:52,799 Speaker 3: the center. Right, right, right. 733 00:37:53,120 --> 00:37:54,760 Speaker 1: So the thing is, Leonardo da Vinci, 734 00:37:54,960 --> 00:37:57,960 Speaker 3: he designed this hypothetical bridge for the Sultan of the 735 00:37:57,960 --> 00:38:03,120 Speaker 3: Ottoman Empire, who dismissed the idea. So, first mistake: he 736 00:38:03,239 --> 00:38:05,800 Speaker 3: thought it was no good.
But then, literally five hundred 737 00:38:05,880 --> 00:38:11,279 Speaker 3: years later or whatever, a team at MIT have recreated 738 00:38:11,280 --> 00:38:15,040 Speaker 3: the design with modeling and showed that not only would it work, 739 00:38:15,280 --> 00:38:17,920 Speaker 3: it would actually be superior to a lot of other 740 00:38:17,960 --> 00:38:21,680 Speaker 3: bridges, because you don't have that movement in the wind; 741 00:38:21,680 --> 00:38:24,120 Speaker 3: the wider spans of bridges are actually 742 00:38:24,160 --> 00:38:28,280 Speaker 3: physically designed to move. But yeah, so, like, Leonardo 743 00:38:28,320 --> 00:38:31,360 Speaker 3: da Vinci blows my mind. There was a great exhibition 744 00:38:31,600 --> 00:38:35,480 Speaker 3: of his works, including some of his original codices, in 745 00:38:35,560 --> 00:38:37,360 Speaker 3: Melbourne last year that I went to, and it just 746 00:38:37,880 --> 00:38:41,520 Speaker 3: literally blew my mind to see the amazing things that 747 00:38:41,560 --> 00:38:46,080 Speaker 3: he was able to conceive of. You know, submarines, aircraft. Phenomenal, 748 00:38:46,120 --> 00:38:47,640 Speaker 3: wasn't he? So ahead of his time. 749 00:38:48,040 --> 00:38:51,280 Speaker 2: I remember, when I was about ten or twelve, seeing 750 00:38:52,320 --> 00:38:55,960 Speaker 2: a picture of a very rudimentary helicopter drawn by Leonardo 751 00:38:56,000 --> 00:38:59,799 Speaker 2: da Vinci, and I'm like, when was this drawn? And 752 00:38:59,840 --> 00:39:01,799 Speaker 2: I think I asked my teacher, and I don't know, 753 00:39:01,800 --> 00:39:04,799 Speaker 2: it was like fifteen-something, you know, five hundred years ago, 754 00:39:05,000 --> 00:39:10,480 Speaker 2: like five centuries ago, when, you know, airplanes were still 755 00:39:10,520 --> 00:39:14,920 Speaker 2: four centuries away. This guy was designing helicopters and even 756 00:39:15,080 --> 00:39:16,080 Speaker 2: conceiving that 757 00:39:15,920 --> 00:39:19,240 Speaker 1: that would be a thing. It's bloody amazing, 758 00:39:19,400 --> 00:39:19,759 Speaker 1: all right. 759 00:39:19,800 --> 00:39:23,960 Speaker 2: Australia Post warns that foreign giants are set to dominate 760 00:39:24,440 --> 00:39:27,600 Speaker 2: Australian online retail. Do we need to panic? 761 00:39:28,440 --> 00:39:29,279 Speaker 1: Well, not so much. 762 00:39:29,680 --> 00:39:33,279 Speaker 3: I think this is an interesting one, because we know 763 00:39:33,480 --> 00:39:36,920 Speaker 3: the cost of living has gone up, and so people 764 00:39:36,960 --> 00:39:39,200 Speaker 3: are feeling the pinch, and it makes a lot of 765 00:39:39,239 --> 00:39:43,160 Speaker 3: sense that you're shopping around for the cheapest option. But 766 00:39:43,200 --> 00:39:46,400 Speaker 3: the problem is cheap can be exactly that. You know, 767 00:39:46,560 --> 00:39:49,000 Speaker 3: a friend of mine once said, the rich man buys 768 00:39:49,000 --> 00:39:52,000 Speaker 3: it once, the poor man buys it twice, and that, 769 00:39:52,320 --> 00:39:54,680 Speaker 3: unfortunately, is the case with a lot of stuff that 770 00:39:54,680 --> 00:39:57,719 Speaker 3: you're buying off, say, some of the Chinese retailers like Temu, 771 00:39:58,120 --> 00:40:00,319 Speaker 3: you know, where you jump on and you buy cheap 772 00:40:00,480 --> 00:40:04,520 Speaker 3: whatever it happens to be.
So Amazon and Temu, all 773 00:40:04,520 --> 00:40:08,400 Speaker 3: these online retailers, now account for, I think, about 774 00:40:08,400 --> 00:40:11,400 Speaker 3: twenty to twenty-five percent of 775 00:40:11,400 --> 00:40:15,960 Speaker 3: online purchasing. But the CEO of Australia Post, Paul Graham, 776 00:40:16,360 --> 00:40:19,319 Speaker 3: he's come out and said that these guys in the 777 00:40:19,360 --> 00:40:22,240 Speaker 3: next few years will have fifty percent of all purchasing 778 00:40:22,280 --> 00:40:25,840 Speaker 3: in Australia, which is going to mean all purchases, including 779 00:40:25,960 --> 00:40:31,360 Speaker 3: retail outlets and online. And he's saying that effectively people 780 00:40:31,480 --> 00:40:33,960 Speaker 3: are drawn to the cheapest product, is what 781 00:40:34,040 --> 00:40:37,359 Speaker 3: he is saying, and that we should be promoting Australian products, 782 00:40:37,560 --> 00:40:40,719 Speaker 3: and people should be buying things locally and thinking in 783 00:40:40,760 --> 00:40:43,640 Speaker 3: those terms. You know, if you walk down, say, Hampton 784 00:40:43,719 --> 00:40:46,160 Speaker 3: Street, or somewhere that's a shopping strip, and you see 785 00:40:46,480 --> 00:40:49,279 Speaker 3: shops are closing, there's a reason for that: there's 786 00:40:49,320 --> 00:40:51,440 Speaker 3: no demand, because people are buying it online. 787 00:40:51,520 --> 00:40:53,880 Speaker 1: It's a tough one. It's a really difficult one. 788 00:40:54,320 --> 00:40:56,960 Speaker 3: I guess when a CEO comes out and says you 789 00:40:57,000 --> 00:40:58,479 Speaker 3: should do this, most people... 790 00:40:58,400 --> 00:41:01,880 Speaker 2: One hundred percent, especially when that CEO is 791 00:41:01,920 --> 00:41:05,160 Speaker 2: probably on a million dollars a year, telling people who 792 00:41:05,239 --> 00:41:08,400 Speaker 2: earn fuck all, with five kids, how they should 793 00:41:08,160 --> 00:41:08,960 Speaker 1: spend their money. 794 00:41:09,239 --> 00:41:12,320 Speaker 2: I'm like, hey, bro, get off your moral high horse 795 00:41:12,320 --> 00:41:14,600 Speaker 2: and just fucking run the company, and let us figure 796 00:41:14,600 --> 00:41:16,320 Speaker 2: out what we'll do with our own money 797 00:41:16,040 --> 00:41:17,200 Speaker 1: and how we'll spend it. 798 00:41:17,440 --> 00:41:20,520 Speaker 2: I don't think it's your job to tell people 799 00:41:20,600 --> 00:41:22,880 Speaker 2: in suburbia how they should spend their money. 800 00:41:23,400 --> 00:41:27,239 Speaker 3: So, because Australia Post gets the postage on all the 801 00:41:27,239 --> 00:41:30,799 Speaker 3: stuff that people are buying online... so yeah, that's the 802 00:41:30,840 --> 00:41:33,960 Speaker 3: issue anyway. But by twenty thirty, they're predicting 803 00:41:34,040 --> 00:41:37,800 Speaker 3: that Amazon and those other online platforms will control fifty 804 00:41:37,840 --> 00:41:41,239 Speaker 3: percent of the online market in Australia. 805 00:41:41,320 --> 00:41:44,400 Speaker 2: And now social media companies are using data and AI 806 00:41:44,520 --> 00:41:48,400 Speaker 2: to figure out our age, without us telling them and 807 00:41:48,480 --> 00:41:51,520 Speaker 2: without us even knowing that they're figuring that out.
808 00:41:51,719 --> 00:41:53,640 Speaker 3: Yeah, this has been really big because of the ban 809 00:41:53,680 --> 00:41:56,839 Speaker 3: that came into effect this week for under-sixteens, and 810 00:41:56,920 --> 00:41:59,160 Speaker 3: we know that it's going to be a teething process, 811 00:41:59,200 --> 00:42:01,960 Speaker 3: because everyone's running the stories of all 812 00:42:02,000 --> 00:42:05,239 Speaker 3: the under-sixteens who are able to fool the new 813 00:42:05,280 --> 00:42:08,359 Speaker 3: protections, and some kids have been able to stay on 814 00:42:08,400 --> 00:42:11,760 Speaker 3: all those social media platforms now the ban is in place. 815 00:42:12,160 --> 00:42:14,000 Speaker 3: The reality of it is there are always 816 00:42:13,719 --> 00:42:15,320 Speaker 1: going to be smart kids. You know, there were fourteen-year-olds 817 00:42:15,400 --> 00:42:16,840 Speaker 1: that were scrunching 818 00:42:16,440 --> 00:42:18,680 Speaker 3: up their faces to make them look older. Like, fourteen-year-old 819 00:42:18,719 --> 00:42:20,719 Speaker 3: boys, you know, scrunching up their faces, 820 00:42:20,800 --> 00:42:23,240 Speaker 3: and they got through; it's like, oh yeah, you're thirty. 821 00:42:24,320 --> 00:42:27,120 Speaker 3: And young girls putting fake eyelashes on and putting a 822 00:42:27,120 --> 00:42:28,840 Speaker 3: bit of makeup on. It's like, yeah, of course you 823 00:42:28,880 --> 00:42:32,919 Speaker 3: are twenty-five. So it's going to take time. But interestingly, 824 00:42:33,360 --> 00:42:36,440 Speaker 3: there's a lot of other things that we do online 825 00:42:36,560 --> 00:42:40,480 Speaker 3: that leave what they call a behavioral fingerprint, and that's 826 00:42:40,520 --> 00:42:43,280 Speaker 3: how a lot of the AI models are kind of saying, well, okay, 827 00:42:43,320 --> 00:42:44,400 Speaker 3: this is Craig. 828 00:42:45,160 --> 00:42:47,000 Speaker 1: Your interaction patterns, 829 00:42:46,600 --> 00:42:49,400 Speaker 3: like the posts that you like and follow, the topics 830 00:42:49,400 --> 00:42:52,080 Speaker 3: and the genres that you prefer. You know, yes, you 831 00:42:52,120 --> 00:42:54,600 Speaker 3: watch 2001: A Space Odyssey, you're probably 832 00:42:54,640 --> 00:42:58,719 Speaker 3: not someone under the age of sixteen. How you scroll, 833 00:42:58,920 --> 00:43:01,360 Speaker 3: what you slow down and speed up at, so the 834 00:43:01,440 --> 00:43:04,600 Speaker 3: interest factor there. So it might be something reminiscing about the 835 00:43:04,640 --> 00:43:07,279 Speaker 3: eighties, and you pause on it because it shows a 836 00:43:07,360 --> 00:43:10,040 Speaker 3: DeLorean or something. You know, these are the things that 837 00:43:10,560 --> 00:43:14,600 Speaker 3: younger people will scroll past, but older people won't scroll 838 00:43:14,640 --> 00:43:17,680 Speaker 3: as quickly. They may even pause. So just that pausing 839 00:43:17,719 --> 00:43:20,040 Speaker 3: for a second, even if you don't read it... This 840 00:43:20,120 --> 00:43:23,920 Speaker 3: is frightening, how, you know, these massive companies are able 841 00:43:24,000 --> 00:43:28,400 Speaker 3: to use these scary algorithms to typecast 842 00:43:28,280 --> 00:43:30,600 Speaker 1: you, to know more about you.
843 00:43:30,600 --> 00:43:33,520 Speaker 3: Your behavior is what you like and what you dislike, and 844 00:43:33,560 --> 00:43:36,040 Speaker 3: it could be something as simple as, you know, woman in 845 00:43:36,120 --> 00:43:39,000 Speaker 3: bikini with brown hair, woman in bikini with blonde hair. 846 00:43:39,200 --> 00:43:41,360 Speaker 3: You pause over one and not over the other. Okay, 847 00:43:41,360 --> 00:43:45,400 Speaker 3: so Craig likes brunettes. You know, literally, this is the algorithm, 848 00:43:45,400 --> 00:43:49,000 Speaker 3: and this is the technology that's being used to categorize 849 00:43:49,040 --> 00:43:51,000 Speaker 3: people who are using social media. 850 00:43:51,640 --> 00:43:56,840 Speaker 2: Wowsers, wowsers. It's kind of, yeah, it's an avalanche at 851 00:43:56,880 --> 00:43:59,880 Speaker 2: the moment. Now, for Ron and Mary, my beautiful parents 852 00:44:00,040 --> 00:44:03,200 Speaker 2: who both can't hear... Patrick, we're scrolling down to the 853 00:44:03,280 --> 00:44:05,400 Speaker 2: health section, just so you can keep up with me. 854 00:44:05,840 --> 00:44:08,360 Speaker 2: For Ron and Mary, who have got real trouble hearing, 855 00:44:08,840 --> 00:44:11,680 Speaker 2: it seems that you've got some medical advice for them, 856 00:44:12,000 --> 00:44:13,080 Speaker 2: what they should use. 857 00:44:13,440 --> 00:44:14,880 Speaker 1: And this is going to have a whole lot of 858 00:44:14,920 --> 00:44:19,239 Speaker 1: health benefits right across... Hey, Tiff, wait till you hear this. 859 00:44:20,040 --> 00:44:25,919 Speaker 3: So evidently Viagra reverses damage behind deafness. 860 00:44:26,280 --> 00:44:29,040 Speaker 2: Hey Ron, Hey Ron, I'll be up on the weekend. 861 00:44:29,280 --> 00:44:30,000 Speaker 2: Don't worry. 862 00:44:30,120 --> 00:44:34,840 Speaker 1: That helps blocked ears. 863 00:44:34,400 --> 00:44:39,879 Speaker 2: Now, I've got, I've got, I've got periodic deafness. Yeah. 864 00:44:39,960 --> 00:44:42,480 Speaker 2: How is it now? Oh, shithouse, but my cock's 865 00:44:42,520 --> 00:44:45,839 Speaker 2: as hard as a rock. I still can't hear, though. Okay, 866 00:44:46,000 --> 00:44:47,080 Speaker 2: now that'd be my luck. 867 00:44:47,840 --> 00:44:50,359 Speaker 3: Well, we're talking about a serious form of deafness here, 868 00:44:50,400 --> 00:44:51,920 Speaker 3: Craigo, right? 869 00:44:53,440 --> 00:44:56,920 Speaker 2: Born with hearing loss. I'm sorry, sorry for not being serious. 870 00:44:56,960 --> 00:45:04,480 Speaker 2: Patrick, Patrick's like, hey, pay attention. You wrote it, you've 871 00:45:04,320 --> 00:45:05,320 Speaker 1: put it on the list. 872 00:45:05,640 --> 00:45:10,960 Speaker 2: It literally says Viagra reverses damage behind deafness. And 873 00:45:10,960 --> 00:45:12,640 Speaker 2: then, what, do you want me not to make fun 874 00:45:12,680 --> 00:45:13,000 Speaker 2: of that? 875 00:45:13,239 --> 00:45:15,879 Speaker 1: This is a win-win all round. Good. 876 00:45:16,000 --> 00:45:20,319 Speaker 2: Yes, this is scientifically valid, and also fucking hilarious, and 877 00:45:20,520 --> 00:45:22,560 Speaker 2: it also opened the door for a cock joke. 878 00:45:22,760 --> 00:45:25,480 Speaker 1: You're welcome. It's like the trifecta. 879 00:45:26,719 --> 00:45:31,680 Speaker 3: Yeah, well, that's basically it. Yeah, it's great 880 00:45:31,719 --> 00:45:34,200 Speaker 3: that Viagra is having this multi-purpose usage. 881 00:45:34,239 --> 00:45:36,800 Speaker 1: I think it's fantastic.
I'm sorry for all the fluffers 882 00:45:36,840 --> 00:45:39,360 Speaker 1: who lost their jobs. Causation or correlation? 883 00:45:41,760 --> 00:45:46,479 Speaker 2: I don't know, that's a good one. Patrick, Patrick just said, Tiff, 884 00:45:46,640 --> 00:45:48,919 Speaker 2: I'm sorry for all the fluffers who lost their job. 885 00:45:48,960 --> 00:45:50,480 Speaker 1: Do you know, do you know what a fluffer is? 886 00:45:50,600 --> 00:45:52,759 Speaker 4: Yes. Yes. Someone that... yes. 887 00:45:55,000 --> 00:45:57,920 Speaker 2: Tell our, tell our younger listeners what a fluffer is. 888 00:45:59,200 --> 00:46:01,120 Speaker 2: Just enlighten our younger listeners who are going, I 889 00:46:01,160 --> 00:46:02,480 Speaker 2: don't know what a fluffer is. 890 00:46:02,920 --> 00:46:06,600 Speaker 4: Someone whose job it is to make sure a person 891 00:46:06,680 --> 00:46:10,239 Speaker 4: is aptly prepared to perform. 892 00:46:09,840 --> 00:46:13,120 Speaker 1: What person, in what context, for what job? 893 00:46:13,400 --> 00:46:15,799 Speaker 4: A bloke who needs to... 894 00:46:17,760 --> 00:46:19,520 Speaker 1: But what job? What job? 895 00:46:21,800 --> 00:46:25,360 Speaker 2: We're not talking about Brian in Reservoir. Yes, it's in porn. 896 00:46:26,880 --> 00:46:28,719 Speaker 2: So there used to be, there used to be these 897 00:46:28,760 --> 00:46:32,120 Speaker 2: people, and their job was just to get old mate 898 00:46:32,200 --> 00:46:36,320 Speaker 2: ready to go. But that person got replaced by a pill. 899 00:46:36,840 --> 00:46:41,080 Speaker 2: Talk about technology. Imagine saying a pill took my 900 00:46:41,160 --> 00:46:45,640 Speaker 1: job, especially if you had your own unique skill. 901 00:46:46,239 --> 00:46:49,080 Speaker 3: Hey, what happens if that fluffer, as he got older, 902 00:46:49,239 --> 00:46:51,840 Speaker 3: then goes deaf? It's like, we can give you Viagra. 903 00:46:52,080 --> 00:46:56,360 Speaker 3: No way, it stole my job. I'm not taking that. Sorry. 904 00:46:56,600 --> 00:46:59,920 Speaker 2: I don't think fluffers were blokes, Patrick, for the most part. 905 00:47:00,400 --> 00:47:02,200 Speaker 2: I guess... well, I don't know that. 906 00:47:02,360 --> 00:47:04,520 Speaker 1: Maybe on your porn, that's... 907 00:47:04,320 --> 00:47:07,239 Speaker 2: Over here, over here... porn. 908 00:47:09,000 --> 00:47:10,360 Speaker 1: That's so funny. 909 00:47:10,760 --> 00:47:15,600 Speaker 2: That's so funny. The US plans to order foreign tourists, 910 00:47:15,640 --> 00:47:19,759 Speaker 2: including Aussies, to disclose social media histories. 911 00:47:19,920 --> 00:47:24,439 Speaker 3: What? Yes, forty countries around the world that are on 912 00:47:24,600 --> 00:47:28,280 Speaker 3: what they call visa exemption, so the visa-exempt countries, 913 00:47:28,280 --> 00:47:31,799 Speaker 3: so Australians and New Zealanders are included in that. Effectively, 914 00:47:32,160 --> 00:47:35,600 Speaker 3: when applying for entry into 915 00:47:35,640 --> 00:47:38,960 Speaker 3: the US, you would have to disclose all of your 916 00:47:39,000 --> 00:47:41,560 Speaker 3: social media data since twenty nineteen. 917 00:47:42,640 --> 00:47:47,239 Speaker 2: Wow. This is a Trump initiative? Yep. How, how do 918 00:47:47,320 --> 00:47:50,040 Speaker 2: we even... Yeah, why do they need that? What can 919 00:47:50,040 --> 00:47:52,600 Speaker 2: they even, like... There must be so many people coming in.
920 00:47:52,680 --> 00:47:55,480 Speaker 2: How on earth do they disseminate all that? This 921 00:47:55,360 --> 00:47:56,720 Speaker 1: is before you enter the country? 922 00:47:56,960 --> 00:47:59,800 Speaker 3: You've got to give them access to your social media 923 00:48:00,840 --> 00:48:03,320 Speaker 3: for five years before you can enter the country. 924 00:48:04,680 --> 00:48:05,840 Speaker 1: It only just came out 925 00:48:05,640 --> 00:48:10,360 Speaker 3: recently, and look, I don't even know how that would 926 00:48:10,480 --> 00:48:12,839 Speaker 3: work, in terms of how you do it. But when 927 00:48:12,880 --> 00:48:18,000 Speaker 3: you make the tourist visa application... you know, we are 928 00:48:18,080 --> 00:48:22,560 Speaker 3: visa exempt in Australia, but in applying to get into the country, 929 00:48:22,600 --> 00:48:25,160 Speaker 3: you've got to basically say... because they're saying they want 930 00:48:25,200 --> 00:48:28,080 Speaker 3: to filter out people who may have had bad opinions 931 00:48:28,080 --> 00:48:30,960 Speaker 3: about the US. If you've said something bad about Donald Trump, 932 00:48:31,040 --> 00:48:34,080 Speaker 3: you could effectively be kicked out, or not allowed into 933 00:48:34,120 --> 00:48:34,640 Speaker 3: the country. 934 00:48:35,200 --> 00:48:38,560 Speaker 2: So much for free speech. Now, apparently, if I'm 935 00:48:38,680 --> 00:48:42,960 Speaker 2: going to have an account with a password, I shouldn't 936 00:48:43,080 --> 00:48:47,120 Speaker 2: use admin, A-D-M-I-N, I shouldn't use that 937 00:48:47,200 --> 00:48:49,320 Speaker 2: as my password. Who would have thought, Patrick? 938 00:48:49,520 --> 00:48:49,799 Speaker 1: I know. 939 00:48:49,960 --> 00:48:52,239 Speaker 3: Or if you happen to have a very famous art 940 00:48:52,280 --> 00:48:55,279 Speaker 3: gallery and you have security cameras, maybe not use the 941 00:48:55,320 --> 00:48:57,120 Speaker 3: word Louvre as 942 00:48:57,000 --> 00:49:00,480 Speaker 1: your password for your security system. Is that true? Is that 943 00:49:00,520 --> 00:49:01,839 Speaker 1: what happened? No? 944 00:49:01,840 --> 00:49:04,280 Speaker 2: No, no, okay, because they just got robbed. 945 00:49:04,320 --> 00:49:07,880 Speaker 1: You're being funny. That's true, look it up. Is it 946 00:49:07,920 --> 00:49:09,759 Speaker 1: actually true? Yeah? 947 00:49:09,880 --> 00:49:13,800 Speaker 3: Evidently an earlier password was Louvre, with a capital L, though. 948 00:49:14,040 --> 00:49:17,239 Speaker 2: The crooks couldn't spell Louvre. That might have been it, if 949 00:49:17,280 --> 00:49:22,040 Speaker 2: they weren't French. Yeah... probably not. Wow, I love 950 00:49:22,080 --> 00:49:26,839 Speaker 2: this one: a Sydney car theft was basically thwarted by 951 00:49:27,760 --> 00:49:29,799 Speaker 2: crooks who couldn't drive 952 00:49:29,520 --> 00:49:31,439 Speaker 1: a manual. I know, isn't that great? 953 00:49:31,480 --> 00:49:33,319 Speaker 3: So they tried to steal a car, and then they 954 00:49:33,320 --> 00:49:34,680 Speaker 3: had to abandon it because they 955 00:49:34,600 --> 00:49:37,359 Speaker 1: didn't know how to drive a manual car. 956 00:49:38,080 --> 00:49:40,160 Speaker 2: That's a sign of the times, isn't it? 957 00:49:40,160 --> 00:49:42,160 Speaker 1: It's pretty hard to get a manual car these days. 958 00:49:42,200 --> 00:49:43,120 Speaker 1: There's not many on the market.
959 00:49:43,160 --> 00:49:45,640 Speaker 3: A friend of mine, who's an older woman in her seventies, 960 00:49:46,000 --> 00:49:49,760 Speaker 3: bought a bright fire-engine-red WRX, because she wanted 961 00:49:49,800 --> 00:49:51,480 Speaker 3: to have a manual car and it was one of 962 00:49:51,480 --> 00:49:52,400 Speaker 3: the few on the market. 963 00:49:52,480 --> 00:49:54,000 Speaker 1: So now she's just this hoon 964 00:49:54,840 --> 00:49:58,560 Speaker 3: driving around the land in a red WRX. Hi Judy, if 965 00:49:58,560 --> 00:50:01,080 Speaker 3: you're listening. Hi Judy. Lady 966 00:50:00,760 --> 00:50:01,960 Speaker 1: bogan, I'm with you. 967 00:50:02,080 --> 00:50:04,560 Speaker 2: Judy, I've got a six-speed manual. Best thing ever. 968 00:50:05,000 --> 00:50:08,560 Speaker 2: Driving automatic... I've got, yeah, I've got 969 00:50:08,560 --> 00:50:11,120 Speaker 2: an automatic as well, but it's like, it's kind of... 970 00:50:12,320 --> 00:50:15,439 Speaker 2: I think, you know... I was listening to this guy 971 00:50:15,480 --> 00:50:18,200 Speaker 2: that you may or may not know, called Steven Bartlett, Patrick. 972 00:50:18,239 --> 00:50:18,759 Speaker 1: Do you know him? 973 00:50:18,800 --> 00:50:20,719 Speaker 2: He has a very big podcast, one of the top 974 00:50:20,840 --> 00:50:25,239 Speaker 2: ten in the world, called CEO of a Diary. 975 00:50:25,320 --> 00:50:26,680 Speaker 1: Sorry, Diary of a CEO. 976 00:50:26,800 --> 00:50:31,759 Speaker 2: Sorry, fuck. Come on, Harps, how's your dyslexia? And he 977 00:50:31,880 --> 00:50:34,680 Speaker 2: was just talking about how he has two autonomous vehicles, 978 00:50:34,880 --> 00:50:39,319 Speaker 2: and he's in LA. He doesn't drive anymore, so he 979 00:50:39,440 --> 00:50:41,719 Speaker 2: just goes and gets in his car, tells it, or 980 00:50:41,760 --> 00:50:44,279 Speaker 2: whatever you do, programs it, tells it where to take you, 981 00:50:45,080 --> 00:50:47,640 Speaker 2: sits in the passenger seat, or, I don't know, maybe 982 00:50:47,640 --> 00:50:50,520 Speaker 2: the driver's seat, and it just takes him wherever he 983 00:50:50,560 --> 00:50:54,000 Speaker 2: wants to go. You'd be pretty comfortable with that, wouldn't 984 00:50:54,040 --> 00:50:55,840 Speaker 2: you? Like, I think you would like that. 985 00:50:56,120 --> 00:50:57,800 Speaker 1: Or am I wrong? Definitely. 986 00:50:57,840 --> 00:51:00,520 Speaker 3: If I was driving in Melbourne, I would. I get 987 00:51:00,520 --> 00:51:02,720 Speaker 3: white-knuckled. I've been living in the country for nearly 988 00:51:02,760 --> 00:51:05,640 Speaker 3: seventeen years, and I get white-knuckled driving in Melbourne. It's 989 00:51:05,680 --> 00:51:08,759 Speaker 3: awful traffic, people cutting you off, no one uses indicators. 990 00:51:09,000 --> 00:51:12,320 Speaker 1: Damn, I'd be going for an autonomous car, much safer. 991 00:51:12,960 --> 00:51:15,560 Speaker 2: Tiff, if you had to choose: for the next two years, 992 00:51:15,640 --> 00:51:20,080 Speaker 2: you can have an autonomous car and it'll drive you everywhere, 993 00:51:20,120 --> 00:51:22,160 Speaker 2: you've just got to get in it, or you've got 994 00:51:22,160 --> 00:51:24,920 Speaker 2: to drive yourself everywhere, and it needs to be one or 995 00:51:24,920 --> 00:51:25,279 Speaker 2: the other. 996 00:51:25,360 --> 00:51:26,320 Speaker 1: What would you choose?
997 00:51:27,920 --> 00:51:32,799 Speaker 4: I think once I tried it... Like, the idea of 998 00:51:32,840 --> 00:51:36,239 Speaker 4: it freaks me out, but then the idea of trusting it... 999 00:51:37,080 --> 00:51:40,000 Speaker 4: if I trusted it, yeah, hell yeah. But I would, 1000 00:51:40,440 --> 00:51:42,719 Speaker 4: I would be weirded out by the idea of it 1001 00:51:42,760 --> 00:51:44,480 Speaker 1: at first. For me, it 1002 00:51:44,480 --> 00:51:47,160 Speaker 2: just frees up time, especially going to see Ron and 1003 00:51:47,200 --> 00:51:47,839 Speaker 2: Mary; every time 1004 00:51:47,880 --> 00:51:48,720 Speaker 1: it's four hours. 1005 00:51:48,760 --> 00:51:51,120 Speaker 2: I'm like, well, I could sit in the back and 1006 00:51:51,120 --> 00:51:54,560 Speaker 2: do podcasts, or have meetings, or have a snoozy 1007 00:51:54,640 --> 00:51:57,880 Speaker 2: McSnoozester, or, you know, I could, I could, 1008 00:51:57,960 --> 00:51:59,560 Speaker 2: what, make a casserole. 1009 00:52:00,280 --> 00:52:02,279 Speaker 1: I could do it on... well, I do that all 1010 00:52:02,320 --> 00:52:03,120 Speaker 1: the time. 1011 00:52:03,160 --> 00:52:05,520 Speaker 3: I'm going into Melbourne on Sunday morning, and I thought, 1012 00:52:05,680 --> 00:52:06,640 Speaker 3: I'm going to go into the city. 1013 00:52:06,640 --> 00:52:08,839 Speaker 1: Why would I drive? And then you've got to worry 1014 00:52:08,880 --> 00:52:11,840 Speaker 1: about parking. Just jump on the train. So, effectively the 1015 00:52:11,880 --> 00:52:12,520 Speaker 1: same thing. 1016 00:52:14,080 --> 00:52:17,360 Speaker 2: Well... well, it's effectively the same. I don't know 1017 00:52:17,440 --> 00:52:19,879 Speaker 2: that a train is the same as an autonomous car, 1018 00:52:19,960 --> 00:52:21,719 Speaker 2: but anyway, I'm not driving it. 1019 00:52:21,800 --> 00:52:22,960 Speaker 1: Someone else is driving it. 1020 00:52:23,680 --> 00:52:30,880 Speaker 2: Well, yeah, you can... Patrick, where can people find you, Patrick? 1021 00:52:30,920 --> 00:52:33,080 Speaker 2: And where can they get their website built, and their 1022 00:52:33,120 --> 00:52:37,160 Speaker 2: social media pumped, and their brand accelerated? Where can they 1023 00:52:37,160 --> 00:52:37,759 Speaker 2: do all of that? 1024 00:52:38,160 --> 00:52:41,000 Speaker 3: You really think, on the strength of today's conversation, anybody's 1025 00:52:41,000 --> 00:52:44,280 Speaker 3: going to contact me? But anyway, it's, it's websitesnow 1026 00:52:44,360 --> 00:52:47,279 Speaker 3: dot com dot au, websitesnow dot com dot au, 1027 00:52:47,520 --> 00:52:49,640 Speaker 3: or taichiathome dot com dot au. I'm 1028 00:52:49,640 --> 00:52:52,280 Speaker 3: about to put some new training exercises on my channel, 1029 00:52:52,280 --> 00:52:55,920 Speaker 3: which I haven't updated since COVID, because my students have 1030 00:52:55,920 --> 00:52:58,040 Speaker 3: all asked; we had our last class last night, 1031 00:52:58,120 --> 00:53:00,560 Speaker 3: and so I've got to put up some new exercises 1032 00:53:00,600 --> 00:53:03,800 Speaker 3: and tai chi workouts for people over the holidays. 1033 00:53:04,080 --> 00:53:07,480 Speaker 2: Well, thank you, Patrick. Thank you, Tiffany and Cook. Have 1034 00:53:07,560 --> 00:53:10,920 Speaker 2: a good weekend, both of you. Quickly, Tiff, what are 1035 00:53:10,960 --> 00:53:11,840 Speaker 2: you doing on the weekend?
1036 00:53:11,960 --> 00:53:16,600 Speaker 4: Quickly: I'm about to head up to Gisborne today and 1037 00:53:17,640 --> 00:53:19,440 Speaker 4: be in the country for a couple of days. 1038 00:53:19,880 --> 00:53:22,640 Speaker 2: Oh really? Oh nice. What's in Gisborne? 1039 00:53:23,239 --> 00:53:24,440 Speaker 1: Is there something? 1040 00:53:24,960 --> 00:53:25,400 Speaker 2: Patrick? 1041 00:53:25,440 --> 00:53:26,160 Speaker 1: What are you doing? 1042 00:53:26,840 --> 00:53:30,480 Speaker 3: I have two choir performances with our local community choir. 1043 00:53:31,200 --> 00:53:33,360 Speaker 3: I've got a tai chi break-up. I'm going to a 1044 00:53:33,440 --> 00:53:36,480 Speaker 3: vegan market in Melbourne. I'm going to dinner tonight with 1045 00:53:36,520 --> 00:53:38,919 Speaker 3: friends at the local pub. Man, I've got so much 1046 00:53:38,920 --> 00:53:40,800 Speaker 3: on this weekend, I don't even know when to stop. 1047 00:53:41,360 --> 00:53:47,799 Speaker 2: You just described my worst weekend ever. Thanks, team. See 1048 00:53:47,840 --> 00:53:48,399 Speaker 2: you next time.