1 00:00:01,680 --> 00:00:07,360 Speaker 1: G'day team, g'day Patrick, g'day Tiff. Hey, 2 00:00:08,760 --> 00:00:10,760 Speaker 1: Tiff and I were just talking while you're off having 3 00:00:10,840 --> 00:00:14,000 Speaker 1: a wee, Patrick, or whatever you were doing, just backing 4 00:00:14,040 --> 00:00:18,440 Speaker 1: one out, and really. 5 00:00:18,560 --> 00:00:21,160 Speaker 2: Thirty seconds... not even thirty seconds in, and it's just 6 00:00:21,160 --> 00:00:22,599 Speaker 2: too crass. Well. 7 00:00:22,600 --> 00:00:24,800 Speaker 1: It wasn't yours, was it? That was back 8 00:00:24,840 --> 00:00:25,639 Speaker 1: then, so don't. 9 00:00:25,440 --> 00:00:27,960 Speaker 2: Blame us. Writing something down with a fountain pen... there 10 00:00:28,040 --> 00:00:30,280 Speaker 2: you go, they say that. Do you use a pen? 11 00:00:31,440 --> 00:00:36,680 Speaker 1: So Tiff and I were talking about, like, high pitched noises. 12 00:00:36,800 --> 00:00:38,160 Speaker 1: I don't know how it came up. 13 00:00:38,400 --> 00:00:40,360 Speaker 3: Because I had to go turn the exhaust fan off, 14 00:00:40,360 --> 00:00:42,639 Speaker 3: because I just heard it, and then I couldn't unhear it. 15 00:00:43,320 --> 00:00:46,960 Speaker 1: That's right. So in my office where I'm at now, 16 00:00:47,320 --> 00:00:52,920 Speaker 1: if Melissa's over here doing work junk and whatever, there's 17 00:00:53,000 --> 00:00:55,800 Speaker 1: apparently a sound in my office that she can hear, 18 00:00:55,960 --> 00:00:58,639 Speaker 1: and she's like, oh my god, doesn't that annoy you? 19 00:00:58,800 --> 00:01:01,520 Speaker 1: I can't even hear it. Like, it's not that I 20 00:01:01,520 --> 00:01:05,399 Speaker 1: don't find it annoying, I can't actually hear it. And 21 00:01:05,440 --> 00:01:08,320 Speaker 1: then Tiff was telling me... will you tell the story?
22 00:01:08,520 --> 00:01:10,959 Speaker 3: I used to work at a restaurant in Tassie, and 23 00:01:11,000 --> 00:01:14,839 Speaker 3: they had this electric kind of frequency thing plugged into 24 00:01:15,120 --> 00:01:18,520 Speaker 3: the power point that kept ants away, that ants could hear 25 00:01:18,520 --> 00:01:20,920 Speaker 3: it and wouldn't come. And I could hear it. I 26 00:01:20,920 --> 00:01:26,440 Speaker 3: could hear the noise. And I asked what it was 27 00:01:26,440 --> 00:01:30,360 Speaker 3: for: annoying little critters. And yeah, I could hear it. 28 00:01:30,560 --> 00:01:33,960 Speaker 2: So if you can hear a repetitive whining sound when 29 00:01:33,959 --> 00:01:35,880 Speaker 2: you're with Craig, it's Craig. 30 00:01:37,760 --> 00:01:38,400 Speaker 1: Very hurtful. 31 00:01:38,480 --> 00:01:41,720 Speaker 3: And here I am unplugging everything all the time, wondering 32 00:01:41,720 --> 00:01:42,680 Speaker 3: why it doesn't go away. 33 00:01:43,040 --> 00:01:43,920 Speaker 4: Now, what's interesting? 34 00:01:44,000 --> 00:01:46,039 Speaker 2: I can remember... I've got a young guy who works 35 00:01:46,040 --> 00:01:49,360 Speaker 2: for me after school. And there's a level or a 36 00:01:49,520 --> 00:01:52,240 Speaker 2: tone at which, as we get older, we can't hear it. 37 00:01:52,480 --> 00:01:53,720 Speaker 4: And there was a while 38 00:01:53,520 --> 00:01:57,680 Speaker 2: there where teenagers were playing music in school classrooms, or 39 00:01:57,720 --> 00:02:01,360 Speaker 2: playing noises, that the teachers couldn't hear because they were too old. 40 00:02:01,840 --> 00:02:02,480 Speaker 4: So there's a 41 00:02:02,480 --> 00:02:06,080 Speaker 2: frequency range that only young people can actually hear. 42 00:02:06,160 --> 00:02:07,560 Speaker 4: I thought that was really interesting. 43 00:02:08,760 --> 00:02:10,639 Speaker 1: You know, you two are being very hurtful to me.
44 00:02:11,840 --> 00:02:13,960 Speaker 5: We're sorry, Harps, we love you. 45 00:02:16,720 --> 00:02:21,320 Speaker 1: Very insecure. Yeah, I think that's a thing. I think 46 00:02:21,360 --> 00:02:26,160 Speaker 1: that... not being able to... It's the same with dogs. 47 00:02:26,280 --> 00:02:30,000 Speaker 1: Don't dogs hear a certain pitch that humans can't hear? 48 00:02:30,760 --> 00:02:32,919 Speaker 4: Absolutely, you two. 49 00:02:32,760 --> 00:02:38,520 Speaker 1: We should test that. Now, Patrick, I sent you a... I 50 00:02:38,560 --> 00:02:40,160 Speaker 1: mean, you are the tech guy, you know, the guy 51 00:02:40,200 --> 00:02:42,200 Speaker 1: that comes up with the topics, but I sent... I 52 00:02:42,240 --> 00:02:45,440 Speaker 1: saw something this week in the tech realm that I thought, 53 00:02:45,480 --> 00:02:48,880 Speaker 1: oh my god, my friend would... my friend would love... 54 00:02:49,000 --> 00:02:50,840 Speaker 1: not only would he love one of these, but I 55 00:02:50,880 --> 00:02:54,000 Speaker 1: think he would fucking love this job. Tell Tiff 56 00:02:54,200 --> 00:02:55,919 Speaker 1: and our listeners what I sent you. 57 00:02:56,760 --> 00:02:59,480 Speaker 4: Look, in the simplest terms, it's a jet pack. 58 00:03:00,880 --> 00:03:03,160 Speaker 2: It's a lot more sophisticated than the ones that I 59 00:03:03,200 --> 00:03:05,360 Speaker 2: saw when I was fourteen at the Royal Melbourne Show, 60 00:03:05,400 --> 00:03:08,440 Speaker 2: that I've been drooling over since I was fourteen. But 61 00:03:08,800 --> 00:03:14,120 Speaker 2: a British company called Gravity now allows 62 00:03:14,160 --> 00:03:15,880 Speaker 2: people to do test flights 63 00:03:15,560 --> 00:03:17,560 Speaker 4: of this jet pack, and it is amazing. 64 00:03:17,600 --> 00:03:20,880 Speaker 2: It's kind of a cross between a backpack, but it 65 00:03:20,919 --> 00:03:24,680 Speaker 2: also looks a bit like Iron Man from the Marvel movies.
66 00:03:24,880 --> 00:03:26,840 Speaker 4: And so Craigo sent it through to me. 67 00:03:26,880 --> 00:03:28,880 Speaker 2: But I did a bit more research, Craigo, because when 68 00:03:28,919 --> 00:03:31,919 Speaker 2: we go to the UK, we can do these test flights. 69 00:03:31,919 --> 00:03:33,360 Speaker 4: So there's a special offer 70 00:03:33,360 --> 00:03:37,360 Speaker 2: at the moment: it's fifteen hundred pounds. But then 71 00:03:37,440 --> 00:03:39,440 Speaker 2: if you do the full flight experience, 72 00:03:39,840 --> 00:03:41,320 Speaker 4: it kind of jumps a little bit 73 00:03:41,200 --> 00:03:44,680 Speaker 2: more, but I figure you could shout it: it's two thousand 74 00:03:44,720 --> 00:03:46,960 Speaker 2: pounds. And then to do a whole day of flight 75 00:03:47,040 --> 00:03:50,040 Speaker 2: training, it jumps. This is just the training: six 76 00:03:50,600 --> 00:03:53,240 Speaker 2: thousand six hundred, so probably about ten grand to do a 77 00:03:53,320 --> 00:03:54,440 Speaker 2: day's training. 78 00:03:54,680 --> 00:03:58,360 Speaker 1: The idea of learning to fly a jetpack sounds fraught 79 00:03:58,400 --> 00:03:59,000 Speaker 1: with danger. 80 00:04:00,200 --> 00:04:03,200 Speaker 2: Now, they have a gantry, right, with ropes and supports 81 00:04:03,200 --> 00:04:05,360 Speaker 2: and all that. See, that's the thing. You go there 82 00:04:05,480 --> 00:04:08,520 Speaker 2: and they tether you to this gantry so that you're safe. 83 00:04:08,880 --> 00:04:10,600 Speaker 2: But I want to be out in the forest and 84 00:04:10,640 --> 00:04:12,920 Speaker 2: just fly over lakes and all that sort of stuff. 85 00:04:13,000 --> 00:04:16,279 Speaker 2: But you know, there are some places you can go to, 86 00:04:16,279 --> 00:04:19,000 Speaker 2: I think in Queensland, and they have these water based 87 00:04:19,120 --> 00:04:21,920 Speaker 2: jet packs.
So it's like you've got a 88 00:04:22,000 --> 00:04:25,799 Speaker 2: jet ski underneath you, and as you start to move, 89 00:04:25,960 --> 00:04:27,919 Speaker 2: it just uses water to propel you. 90 00:04:27,960 --> 00:04:28,839 Speaker 4: Have you seen that, Tiff? 91 00:04:29,120 --> 00:04:30,120 Speaker 5: Yeah, I've seen those. 92 00:04:30,279 --> 00:04:31,640 Speaker 4: Yeah, I really want to do that. Maybe we could 93 00:04:31,680 --> 00:04:31,880 Speaker 4: do that. 94 00:04:31,920 --> 00:04:34,039 Speaker 2: Craigo, go on holiday in Queensland and just do the 95 00:04:34,080 --> 00:04:35,320 Speaker 2: water jet pack experience. 96 00:04:35,800 --> 00:04:39,360 Speaker 1: Yeah, well, it sounds super gay. But you're gay, but 97 00:04:39,440 --> 00:04:42,839 Speaker 1: I'm not. But that's okay. It's okay, it's okay that 98 00:04:42,880 --> 00:04:43,400 Speaker 1: it's going to... 99 00:04:43,360 --> 00:04:49,920 Speaker 2: Come with us. You and I go to the water park. 100 00:04:50,640 --> 00:04:53,359 Speaker 1: Really? Can we not go to the UFC as well? 101 00:04:54,200 --> 00:04:55,320 Speaker 1: Can we not do something? 102 00:04:57,400 --> 00:05:00,840 Speaker 4: What about the trucks? What's... what's the trucks? 103 00:05:01,200 --> 00:05:05,560 Speaker 1: Can we just offset the water parks with fucking monster trucks? 104 00:05:05,760 --> 00:05:08,280 Speaker 2: Are you telling me that putting on a water jet 105 00:05:08,320 --> 00:05:11,320 Speaker 2: pack is gay and monster trucks isn't? 106 00:05:11,440 --> 00:05:12,159 Speaker 4: Like, what the hell? 107 00:05:12,760 --> 00:05:14,760 Speaker 1: What? I don't know, I don't know. It's just... no, 108 00:05:15,000 --> 00:05:17,039 Speaker 1: just you and me going to the water parks in 109 00:05:17,120 --> 00:05:21,400 Speaker 1: Queensland on a holiday, you know. But that's all right. 110 00:05:21,520 --> 00:05:27,600 Speaker 1: Sorry, I'm calm. Before we go on, I
I 111 00:05:27,680 --> 00:05:32,960 Speaker 1: just remembered we had heaps of messages this week with... 112 00:05:33,160 --> 00:05:35,480 Speaker 1: I don't know why. I don't know why we don't 113 00:05:35,480 --> 00:05:37,479 Speaker 1: have a photo of you up much, but there was 114 00:05:37,880 --> 00:05:39,720 Speaker 1: a photo of you and Tiff. But I thought we'd 115 00:05:39,839 --> 00:05:43,200 Speaker 1: have your scone on, maybe, because it was on general 116 00:05:43,240 --> 00:05:48,279 Speaker 1: distribution and not on our page. But how many people said, oh, 117 00:05:48,600 --> 00:05:52,640 Speaker 1: Patrick's nothing like what I thought? Like, I don't know, 118 00:05:52,720 --> 00:05:54,760 Speaker 1: that's either a compliment or an insult. 119 00:05:55,000 --> 00:05:56,720 Speaker 4: We thought tall, dark and handsome. 120 00:05:57,480 --> 00:06:01,000 Speaker 1: Either way, it's an insult, because, like, either they think 121 00:06:01,040 --> 00:06:04,600 Speaker 1: you've got this fucking movie star voice and then they 122 00:06:04,640 --> 00:06:07,440 Speaker 1: see your head and they're like, oh, that's disappointing. Or 123 00:06:07,440 --> 00:06:10,680 Speaker 1: the other way... or the other way around: they think 124 00:06:10,760 --> 00:06:13,279 Speaker 1: you've got movie star looks, but you've got a shit voice. 125 00:06:13,640 --> 00:06:16,960 Speaker 1: Do you get that? Have you had that feedback before? 126 00:06:17,839 --> 00:06:19,920 Speaker 4: No, but I'm going to go cry for a bit. 127 00:06:21,640 --> 00:06:23,680 Speaker 1: Well, you do have the dulcet... No, I think you're 128 00:06:23,760 --> 00:06:25,560 Speaker 1: quite a good looking man, but you also have a 129 00:06:25,600 --> 00:06:28,640 Speaker 1: good... Let's go to the water parks in Queensland. No, 130 00:06:28,800 --> 00:06:33,360 Speaker 1: but you do. Sorry, I'm just... fucking hell. Well, clearly 131 00:06:33,400 --> 00:06:36,080 Speaker 1: I'm no good at women.
So let's fucking... let's try it. 132 00:06:36,160 --> 00:06:39,120 Speaker 1: Let's see what happens. Ah, speaking of relationships... I 133 00:06:39,160 --> 00:06:43,880 Speaker 1: know I'm digressing, but you two are pretty much on 134 00:06:43,960 --> 00:06:47,919 Speaker 1: the verge of a life together. Can you explain to 135 00:06:48,880 --> 00:06:51,520 Speaker 1: Tiff what we... I know this is inappropriate, but 136 00:06:51,600 --> 00:06:52,600 Speaker 1: fuck it, our listeners... 137 00:06:52,920 --> 00:06:53,240 Speaker 4: Do you know? 138 00:06:53,360 --> 00:06:55,239 Speaker 1: Our listeners love this bit the best. 139 00:06:55,800 --> 00:06:59,080 Speaker 3: They like... I'm not opposed... I'm not opposed to 140 00:06:59,120 --> 00:07:01,000 Speaker 3: a situationship with Patrick. 141 00:07:01,080 --> 00:07:04,800 Speaker 2: Actually, look, I love that. And, you know, why, when 142 00:07:04,880 --> 00:07:07,240 Speaker 2: we have a private conversation, do you then bring it 143 00:07:07,320 --> 00:07:08,880 Speaker 2: up on the podcast? 144 00:07:10,960 --> 00:07:15,400 Speaker 4: Can I tell my... okay, I said it. 145 00:07:14,800 --> 00:07:20,440 Speaker 2: I was saying I really enjoyed hanging out with Tiff 146 00:07:20,560 --> 00:07:23,840 Speaker 2: last weekend, and I could be in a relationship with her. 147 00:07:24,320 --> 00:07:28,800 Speaker 2: I said it'd be everything except sex. We could probably spoon, but... 148 00:07:30,840 --> 00:07:33,560 Speaker 4: She's awesome. Look, seriously, Tiff 149 00:07:33,400 --> 00:07:37,560 Speaker 2: is just the ideal, perfect person, everything. I'd tick all 150 00:07:37,600 --> 00:07:38,720 Speaker 2: the boxes there, Tiff. 151 00:07:40,000 --> 00:07:42,640 Speaker 3: Now, Harps did mention it last night, and I've booked 152 00:07:42,640 --> 00:07:44,560 Speaker 3: out some time to go dress shopping. 153 00:07:44,720 --> 00:07:49,000 Speaker 4: So awesome. So I'd be up for that.
Not for you... 154 00:07:49,160 --> 00:07:52,520 Speaker 4: a dress for her, you idiot. To keep her company. 155 00:07:54,600 --> 00:07:59,360 Speaker 1: See, I reckon... Now, Patrick's proposal to me was not 156 00:07:59,440 --> 00:08:04,280 Speaker 1: that kind of proposal. His hypothetical scenario with you was 157 00:08:04,360 --> 00:08:06,520 Speaker 1: he would marry you. He said he would marry you 158 00:08:07,520 --> 00:08:10,880 Speaker 1: if you would allow him a semi regular, you know, 159 00:08:11,480 --> 00:08:17,560 Speaker 1: Saturday night off, a dalliance with somebody with a cock. Well, 160 00:08:17,960 --> 00:08:22,560 Speaker 1: I mean, all right, okay. A boy. All right, well, whatever. 161 00:08:22,840 --> 00:08:23,119 Speaker 3: Fuck. 162 00:08:23,160 --> 00:08:23,960 Speaker 1: I don't know these days. 163 00:08:23,880 --> 00:08:25,600 Speaker 4: It goes from bad to worse. 164 00:08:26,640 --> 00:08:29,800 Speaker 1: No, well, I mean, obviously he likes boys, not girls, 165 00:08:29,840 --> 00:08:31,000 Speaker 1: but he could marry you. 166 00:08:32,200 --> 00:08:34,080 Speaker 5: And what did you say, Harps, when he 167 00:08:34,120 --> 00:08:38,960 Speaker 1: said that? I said, well, you're about forty, dude, so 168 00:08:39,559 --> 00:08:47,680 Speaker 1: that's not... I mean, you said that yourself, so it's 169 00:08:47,800 --> 00:08:49,360 Speaker 1: not like the worst combo. 170 00:08:53,200 --> 00:08:54,319 Speaker 5: Oh, well, when we... when? 171 00:08:54,480 --> 00:08:54,720 Speaker 4: When? 172 00:08:55,559 --> 00:08:57,000 Speaker 5: When are we all going to put it together? 173 00:08:57,760 --> 00:09:01,440 Speaker 2: This went from the wedding to project dating, so this could 174 00:09:01,480 --> 00:09:05,120 Speaker 2: be the first TYP marriage. 175 00:09:05,520 --> 00:09:08,680 Speaker 1: I mean, like, anything goes in twenty twenty four, so 176 00:09:08,760 --> 00:09:11,559 Speaker 1: why not?
I mean, you guys, apart from the sex bit, 177 00:09:12,160 --> 00:09:15,520 Speaker 1: I think you guys would be a stellar couple. And 178 00:09:15,640 --> 00:09:17,800 Speaker 1: if you could go live in the country, and then 179 00:09:18,320 --> 00:09:25,439 Speaker 1: your dogs could be best friends, they'd become brothers... sisters... 180 00:09:25,880 --> 00:09:31,000 Speaker 1: well, siblings. Canine siblings. As long as Fritz didn't eat 181 00:09:31,160 --> 00:09:40,120 Speaker 1: the fucking cat, Bears. 182 00:09:36,800 --> 00:09:40,440 Speaker 2: Yeah, I think... I think, yeah, cats would get it 183 00:09:40,440 --> 00:09:43,559 Speaker 2: over a dog anytime, because cats have sharp claws. I 184 00:09:43,600 --> 00:09:46,160 Speaker 2: think I wouldn't let Fritz near an angry cat. 185 00:09:48,160 --> 00:09:52,240 Speaker 1: All right, can you two stop distracting me? Let's talk. 186 00:09:53,840 --> 00:09:56,160 Speaker 1: Isn't it funny when I'm the problem on my own 187 00:09:56,200 --> 00:09:58,520 Speaker 1: show? Without a doubt. 188 00:10:00,080 --> 00:10:03,040 Speaker 2: A couple of really nice segues right throughout this conversation, 189 00:10:03,280 --> 00:10:05,280 Speaker 2: so I'm going to jump in and not let you start. 190 00:10:07,280 --> 00:10:09,760 Speaker 2: So I've got to say... now, I missed out on it, 191 00:10:09,800 --> 00:10:11,960 Speaker 2: but you were there. We caught up in Castlemaine, 192 00:10:12,000 --> 00:10:14,360 Speaker 2: because we went to... well, you went to a 193 00:10:14,400 --> 00:10:17,040 Speaker 2: concert, and that's where you featured, didn't you? 194 00:10:17,600 --> 00:10:20,800 Speaker 5: A little bit, a little bit. Just singing in a 195 00:10:20,960 --> 00:10:23,840 Speaker 2: song video clip? I feel like I'm in the presence 196 00:10:23,880 --> 00:10:24,800 Speaker 2: of someone famous. 197 00:10:24,800 --> 00:10:25,120 Speaker 4: Craig.
198 00:10:26,040 --> 00:10:27,959 Speaker 1: Oh, this is why you want to marry her, because 199 00:10:28,000 --> 00:10:28,679 Speaker 1: you're a bloody... 200 00:10:29,280 --> 00:10:32,960 Speaker 5: You're a wannabe, riding on my coattails. 201 00:10:33,480 --> 00:10:37,280 Speaker 1: That's... you're a groupie. Like, just because Tiff's a big deal 202 00:10:37,320 --> 00:10:40,760 Speaker 2: now, you don't have to marry her. Dragging along by 203 00:10:40,800 --> 00:10:43,480 Speaker 2: the straps on your... your boxing gloves. 204 00:10:43,720 --> 00:10:44,120 Speaker 1: Hey. Now. 205 00:10:44,600 --> 00:10:47,920 Speaker 2: The reason I thought about this, though, was because there's 206 00:10:47,960 --> 00:10:52,240 Speaker 2: some study that's come out into shared music experiences. So 207 00:10:53,000 --> 00:10:57,199 Speaker 2: what it revealed is that, obviously, when we listen to music, 208 00:10:57,240 --> 00:10:59,880 Speaker 2: it can evoke a reaction. So, you know, Ode 209 00:10:59,880 --> 00:11:02,480 Speaker 2: to Joy, for example, is probably one of 210 00:11:02,480 --> 00:11:05,520 Speaker 2: the most beautiful pieces of music ever written, by Beethoven. 211 00:11:06,040 --> 00:11:11,480 Speaker 2: And by coincidence, every year ABC Classic FM does a 212 00:11:11,520 --> 00:11:14,760 Speaker 2: top one hundred countdown, and their topic this year was the 213 00:11:14,800 --> 00:11:18,720 Speaker 2: top one hundred feel good tunes. And Ode to Joy... we 214 00:11:18,760 --> 00:11:21,280 Speaker 2: all know Ode to Joy. Yeah, if you google it, you'll 215 00:11:21,320 --> 00:11:27,720 Speaker 2: know it. So shared music experiences accentuate the sensation. So 216 00:11:28,120 --> 00:11:30,120 Speaker 2: listening to a really good piece of music can 217 00:11:30,160 --> 00:11:32,920 Speaker 2: evoke a reaction.
But what the study has found is if 218 00:11:32,960 --> 00:11:36,760 Speaker 2: you share it with somebody else, it enhances it even more, 219 00:11:36,840 --> 00:11:39,400 Speaker 2: and the connectedness between you and the person you're sharing 220 00:11:39,440 --> 00:11:41,240 Speaker 2: it with. And when I think back to some of 221 00:11:41,280 --> 00:11:43,880 Speaker 2: the best concerts I've ever been in, or been to, 222 00:11:44,720 --> 00:11:47,000 Speaker 2: what's enhanced that even more is the person I was 223 00:11:47,040 --> 00:11:50,440 Speaker 2: there with at the time, enjoying it with. So I 224 00:11:50,520 --> 00:11:52,760 Speaker 2: went to Imagine Dragons a few years ago with my 225 00:11:52,800 --> 00:11:56,360 Speaker 2: nephew and his girlfriend, and, you know, it was embarrassing 226 00:11:56,400 --> 00:11:58,600 Speaker 2: for them, because I sing along with all the songs. 227 00:11:58,880 --> 00:11:59,720 Speaker 4: Do you do that, Tiff? 228 00:12:00,000 --> 00:12:00,719 Speaker 3: Do I? No. 229 00:12:00,800 --> 00:12:02,480 Speaker 5: I definitely don't. I lip sync. 230 00:12:04,120 --> 00:12:06,240 Speaker 4: Do you sing along, Craigo? 231 00:12:06,520 --> 00:12:08,080 Speaker 5: He's got a great singing voice. 232 00:12:08,360 --> 00:12:09,680 Speaker 1: I do not. I do not. 233 00:12:10,040 --> 00:12:11,880 Speaker 5: You do have a great singing voice, though. 234 00:12:12,280 --> 00:12:15,280 Speaker 1: I can sing vaguely in tune. But if I'm at 235 00:12:15,320 --> 00:12:18,040 Speaker 1: a concert, no one needs me fucking warbling next to 236 00:12:18,080 --> 00:12:20,760 Speaker 1: them while the actual artists are on stage. 237 00:12:21,400 --> 00:12:25,440 Speaker 3: I save birthday songs from Harps on my voicemail and 238 00:12:25,480 --> 00:12:26,560 Speaker 3: listen to them regularly. 239 00:12:26,840 --> 00:12:29,520 Speaker 4: Oh wow, that's really great. I didn't know that about you, Craigo.
240 00:12:30,120 --> 00:12:34,559 Speaker 5: Mm, it's true. She's denying it, but it's actually true. 241 00:12:35,520 --> 00:12:38,160 Speaker 4: True. So that's... that's really impressive. 242 00:12:38,440 --> 00:12:40,440 Speaker 2: I think, if you can hold a tone... 243 00:12:40,720 --> 00:12:44,440 Speaker 2: even depending on your vocal range, just being able to 244 00:12:44,520 --> 00:12:47,079 Speaker 2: keep your voice in tune, I think, makes the biggest 245 00:12:47,080 --> 00:12:50,800 Speaker 2: difference as to whether a listening experience is pleasurable or not. But... 246 00:12:51,080 --> 00:12:55,680 Speaker 2: so this study showed that sharing music, even virtually, can 247 00:12:55,720 --> 00:13:00,559 Speaker 2: really enhance the pleasure. And we're talking about triggering endorphins. 248 00:13:00,800 --> 00:13:03,880 Speaker 2: It was a study published in iScience, and they 249 00:13:03,920 --> 00:13:07,479 Speaker 2: looked across the effects with a number of different experiments, 250 00:13:07,920 --> 00:13:10,720 Speaker 2: online, with participants in the United States and France. So 251 00:13:11,240 --> 00:13:14,480 Speaker 2: even listening with somebody... So remember the old mixtapes, and 252 00:13:15,760 --> 00:13:18,320 Speaker 2: you'd put a headphone in one ear and you'd share 253 00:13:18,360 --> 00:13:22,120 Speaker 2: the earpiece with another person? And then you wonder... 254 00:13:21,920 --> 00:13:25,800 Speaker 1: God, you're... you're so fucking old. But I remember that. 255 00:13:25,880 --> 00:13:28,400 Speaker 1: I remember, I remember. This is how old I am, 256 00:13:28,440 --> 00:13:33,120 Speaker 1: bro: I remember making mixtapes, pressing play and record on 257 00:13:33,160 --> 00:13:35,720 Speaker 1: the cassette player, off the radio. 258 00:13:35,960 --> 00:13:38,840 Speaker 4: Yeah, me too, and then you just
259 00:13:38,800 --> 00:13:41,040 Speaker 1: missed the start of the song, because the fucking DJ 260 00:13:41,280 --> 00:13:44,960 Speaker 1: was introing it, right, and then they kind of just 261 00:13:45,120 --> 00:13:46,280 Speaker 1: wind it up a little bit. 262 00:13:47,160 --> 00:13:49,160 Speaker 4: Yeah, that was really annoying, when they talked over the 263 00:13:49,160 --> 00:13:50,520 Speaker 4: intro to the song. Oh... 264 00:13:50,679 --> 00:13:54,720 Speaker 1: Yeah, yeah, yeah. Those were the days, though. That was... 265 00:13:55,040 --> 00:13:58,880 Speaker 1: I mean, there's lots of research out on the relationship, 266 00:13:59,040 --> 00:14:02,880 Speaker 1: or the impact, of music on people's nervous system. So if, 267 00:14:02,880 --> 00:14:06,080 Speaker 1: for example, you hear something like Ode to Joy... Like, 268 00:14:06,200 --> 00:14:09,679 Speaker 1: the interesting thing, too, is... it's like, you might hear 269 00:14:09,720 --> 00:14:14,199 Speaker 1: a piece of music... like, somebody loves classical music, and 270 00:14:14,320 --> 00:14:19,880 Speaker 1: that switches on their parasympathetic... you know, the calm parasympathetic 271 00:14:19,920 --> 00:14:22,880 Speaker 1: nervous system. It lowers their heart rate, blood pressure, slows 272 00:14:22,920 --> 00:14:27,920 Speaker 1: their breathing, right? And that's because they love it. Somebody 273 00:14:28,000 --> 00:14:31,800 Speaker 1: else fucking hates it, right, and it does the opposite. 274 00:14:31,960 --> 00:14:36,119 Speaker 1: So it's not like this music produces a universal response. 275 00:14:36,920 --> 00:14:39,880 Speaker 1: Like, some people love country and western; it's all they 276 00:14:39,960 --> 00:14:43,120 Speaker 1: listen to. For other people, the same song is, like, 277 00:14:43,520 --> 00:14:47,000 Speaker 1: you know, fingernails on a blackboard.
So... but it is 278 00:14:47,040 --> 00:14:52,960 Speaker 1: interesting, the way that your physiology responds differently to 279 00:14:53,040 --> 00:14:54,800 Speaker 1: the same stimulus than someone else's does. 280 00:14:55,560 --> 00:14:57,520 Speaker 4: Well, it's amazing, as you said. 281 00:14:58,200 --> 00:15:00,680 Speaker 2: It's... it's interesting. If you look at a survey... and 282 00:15:00,720 --> 00:15:03,240 Speaker 2: that's why I like the top one hundred countdown... but 283 00:15:03,360 --> 00:15:07,920 Speaker 2: to have so many people vote, and that number one 284 00:15:08,040 --> 00:15:11,119 Speaker 2: song being Ode to Joy out of all their listenership, 285 00:15:11,160 --> 00:15:13,560 Speaker 2: and we're talking tens of thousands of people, hundreds of 286 00:15:13,560 --> 00:15:15,720 Speaker 2: thousands of people, who vote on this... 287 00:15:15,840 --> 00:15:16,400 Speaker 4: I just thought it was 288 00:15:16,440 --> 00:15:19,000 Speaker 2: quite interesting, because it's one that I would have put 289 00:15:19,040 --> 00:15:22,240 Speaker 2: really high on the list for me. And it's 290 00:15:22,240 --> 00:15:24,280 Speaker 2: funny when people say, I don't like classical music, 291 00:15:24,520 --> 00:15:27,320 Speaker 2: and then you say, oh, did you like that movie? 292 00:15:27,360 --> 00:15:30,000 Speaker 2: Turn the music down, or turn the audio down, when 293 00:15:30,040 --> 00:15:33,640 Speaker 2: you're watching a scary movie or an uplifting rom com, 294 00:15:34,120 --> 00:15:38,240 Speaker 2: and the music just helps with the sense of the emotion. 295 00:15:38,600 --> 00:15:41,800 Speaker 2: If you watch something, you know, like a Stephen King 296 00:15:42,040 --> 00:15:45,440 Speaker 2: film, and you turn the music off, it's not scary anymore. 297 00:15:46,200 --> 00:15:51,360 Speaker 1: Yeah, that's true. I think also, with research... I hate 298 00:15:51,440 --> 00:15:54,800 Speaker 1: to be a fucking killjoy.
But if you asked 299 00:15:54,960 --> 00:15:59,040 Speaker 1: that same question of a thousand people under thirty, you're 300 00:15:59,040 --> 00:16:01,040 Speaker 1: not getting Ode to Joy as the answer. 301 00:16:01,760 --> 00:16:04,600 Speaker 2: But if they... if they hear it, with the music 302 00:16:04,640 --> 00:16:07,080 Speaker 2: and the way that it's composed, would it still have that same effect? 303 00:16:07,640 --> 00:16:07,800 Speaker 4: Yeah. 304 00:16:07,920 --> 00:16:11,040 Speaker 1: Maybe. I mean, it depends. Like, music is 305 00:16:11,080 --> 00:16:13,680 Speaker 1: a really powerful thing, isn't it? And it's like you... 306 00:16:13,720 --> 00:16:16,480 Speaker 2: I'll report back. Next time I've got my sixteen year old in, 307 00:16:16,640 --> 00:16:18,320 Speaker 2: I'm going to put a set of headphones on him, 308 00:16:18,320 --> 00:16:20,480 Speaker 2: and I'm going to play three pieces of music. You 309 00:16:20,520 --> 00:16:24,360 Speaker 2: guys choose one piece each. I'll choose Ode to Joy. If 310 00:16:24,400 --> 00:16:27,400 Speaker 2: you guys want to choose one each, then I'll play 311 00:16:27,480 --> 00:16:29,560 Speaker 2: all three and we'll get him to do a rating 312 00:16:29,640 --> 00:16:32,000 Speaker 2: of one to ten on each piece of music, and 313 00:16:32,040 --> 00:16:33,920 Speaker 2: we'll see if Ode to Joy comes out on top. 314 00:16:34,440 --> 00:16:41,280 Speaker 5: I'd choose The Boxer. What? Of course you do. 315 00:16:41,520 --> 00:16:43,560 Speaker 4: Come on, Craig, you've got to choose a piece of music. 316 00:16:43,760 --> 00:16:47,880 Speaker 1: Just that one sixteen year old kid constitutes a study? 317 00:16:48,320 --> 00:16:49,200 Speaker 5: An n of one? 318 00:16:49,600 --> 00:16:51,200 Speaker 4: Okay, it's a mini study. 319 00:16:51,400 --> 00:16:55,920 Speaker 2: Come on, choose a piece of music on the spot. 320 00:16:56,680 --> 00:16:57,320 Speaker 1: Hallelujah?
321 00:16:58,400 --> 00:17:01,080 Speaker 4: Oh, that's a good one. 322 00:17:01,360 --> 00:17:06,800 Speaker 1: Yeah. All right, I will play all three, but 323 00:17:06,880 --> 00:17:08,560 Speaker 1: I'm going to say... I'm going to tell you which 324 00:17:08,720 --> 00:17:09,760 Speaker 1: version of Hallelujah. 325 00:17:09,920 --> 00:17:12,680 Speaker 4: Okay, we'll go. And then... well, I've 326 00:17:12,480 --> 00:17:15,880 Speaker 1: got to find... I've just got to find the right artist. 327 00:17:16,000 --> 00:17:18,879 Speaker 1: Not k.d. lang. She does a good version, but there 328 00:17:18,880 --> 00:17:19,720 Speaker 1: are better versions. 329 00:17:19,960 --> 00:17:20,600 Speaker 4: All right, done. 330 00:17:20,920 --> 00:17:23,080 Speaker 2: Can I give a shout out to Dale Patterson, 331 00:17:23,119 --> 00:17:24,520 Speaker 2: who sent me a link? 332 00:17:24,640 --> 00:17:26,760 Speaker 4: He's one of our listeners from Geelong, and he sent 333 00:17:26,800 --> 00:17:27,480 Speaker 4: me a link 334 00:17:27,280 --> 00:17:31,160 Speaker 2: to a really cool tech website, which was really interesting, 335 00:17:31,320 --> 00:17:31,760 Speaker 2: so I... 336 00:17:31,640 --> 00:17:33,760 Speaker 1: I love how you say "can I" and then you do it. 337 00:17:33,840 --> 00:17:43,679 Speaker 1: No, you can't. You can't, fiancé. No, no, you can't shout 338 00:17:43,720 --> 00:17:46,720 Speaker 1: out to Dale. What... what was Dale emailing you about? 339 00:17:47,240 --> 00:17:51,320 Speaker 4: A tech website. Just lots of gadgets and stuff. 340 00:17:51,400 --> 00:17:54,639 Speaker 2: Of course. A fellow geek, is he? I don't know. 341 00:17:55,200 --> 00:17:57,240 Speaker 2: Wouldn't want to wrap him up in that bundle. I 342 00:17:57,240 --> 00:17:59,200 Speaker 2: don't think that's very nice. I don't know him. He 343 00:17:59,280 --> 00:18:01,359 Speaker 2: could be a... a nice, interesting person. 344 00:18:01,560 --> 00:18:04,280 Speaker 1: And I love geeks. We love geeks.
We love you. Hey, 345 00:18:04,280 --> 00:18:06,439 Speaker 1: one of the things that I've been interested in, and 346 00:18:06,440 --> 00:18:09,760 Speaker 1: we've spoken about, I think, once or twice maybe... or 347 00:18:09,800 --> 00:18:12,159 Speaker 1: maybe we haven't, but I feel like we have... is 348 00:18:12,320 --> 00:18:17,160 Speaker 1: animals and language, and how elephants have their own language, 349 00:18:17,320 --> 00:18:22,040 Speaker 1: dolphins have their own language. And I think you have 350 00:18:22,119 --> 00:18:23,280 Speaker 1: a story in this space. 351 00:18:23,840 --> 00:18:26,400 Speaker 2: Well, there's a nice little segue to that, too. 352 00:18:26,400 --> 00:18:29,200 Speaker 2: Do you know, elephants... it's now thought they have names. 353 00:18:29,520 --> 00:18:32,040 Speaker 2: They name each other. So they rock up in the 354 00:18:32,040 --> 00:18:34,239 Speaker 2: middle of the jungle and say, hey, Bob, how are 355 00:18:34,280 --> 00:18:35,080 Speaker 2: you? Not too bad, 356 00:18:35,119 --> 00:18:37,479 Speaker 4: Betty. Now, it's interesting, isn't it? 357 00:18:37,520 --> 00:18:41,280 Speaker 2: But I do have a story about this. Because, look, 358 00:18:42,000 --> 00:18:45,520 Speaker 2: for me, I reckon I've got a good handle on 359 00:18:46,320 --> 00:18:49,679 Speaker 2: the kind of mood that Fritz is in. And you 360 00:18:49,720 --> 00:18:52,240 Speaker 2: know when a dog wags its tail and has all 361 00:18:52,240 --> 00:18:55,800 Speaker 2: these different cues. As a dog owner, as a pet owner, Tiff, 362 00:18:55,960 --> 00:18:58,040 Speaker 2: you would have a fairly good indication as to whether 363 00:18:58,080 --> 00:19:00,520 Speaker 2: your pet's happy or sad or excited or irritable, and all 364 00:19:00,560 --> 00:19:03,240 Speaker 2: that sort of stuff.
But it's now thought that AI 365 00:19:03,800 --> 00:19:10,040 Speaker 2: potentially could actually translate what a dog is saying when 366 00:19:10,040 --> 00:19:10,640 Speaker 2: it barks. 367 00:19:10,920 --> 00:19:14,000 Speaker 4: So imagine that if with every bark. 368 00:19:13,840 --> 00:19:17,720 Speaker 2: A whine, or a growl, you could even know what 369 00:19:17,840 --> 00:19:20,760 Speaker 2: the dog is effectively trying to communicate to you, and 370 00:19:20,800 --> 00:19:23,280 Speaker 2: then in the future they could even. 371 00:19:23,040 --> 00:19:24,720 Speaker 4: Translate what Craigs say. 372 00:19:25,280 --> 00:19:28,480 Speaker 2: No, But when you think about it, using AI to 373 00:19:28,600 --> 00:19:33,120 Speaker 2: analyze the tonal vocalizations of dogs, they say they can 374 00:19:33,640 --> 00:19:40,760 Speaker 2: distinguish between playful barks, aggressive growls, identify characteristics and even 375 00:19:40,800 --> 00:19:43,080 Speaker 2: the age and the breed of the dog by the 376 00:19:43,119 --> 00:19:44,400 Speaker 2: sound that it makes. 377 00:19:44,840 --> 00:19:48,160 Speaker 1: Wow, here's what dogs are saying. Four things, right. 378 00:19:48,200 --> 00:19:52,720 Speaker 1: Dogs are saying I need a shit. Yeah, they're saying 379 00:19:52,840 --> 00:19:55,160 Speaker 1: I want to go for a walk, give me some 380 00:19:55,200 --> 00:19:58,720 Speaker 1: fucking food, and I love you. That's the four things 381 00:19:58,760 --> 00:20:04,119 Speaker 1: dogs are saying. So you're welcome. I just broke the code. 382 00:20:04,520 --> 00:20:08,119 Speaker 4: But you know, when a dog wags its tail, most 383 00:20:08,160 --> 00:20:11,639 Speaker 4: people associate that with being excited. So you meet a 384 00:20:11,640 --> 00:20:14,679 Speaker 4: new dog and the tail starts wagging.
But what people 385 00:20:14,720 --> 00:20:18,359 Speaker 4: don't realize is depending on which way the tail is 386 00:20:18,440 --> 00:20:21,040 Speaker 4: leaning to, so it will predominantly go left or right, 387 00:20:21,480 --> 00:20:26,240 Speaker 4: can actually mean apprehensive and potentially be not so much 388 00:20:26,240 --> 00:20:28,560 Speaker 4: a show of aggression but a show of apprehension. 389 00:20:28,760 --> 00:20:31,959 Speaker 2: So when two dogs meet each other, the tail can 390 00:20:32,000 --> 00:20:35,000 Speaker 2: be wagging, but it doesn't necessarily mean they're happy. It 391 00:20:35,040 --> 00:20:37,920 Speaker 2: could mean that they're apprehensive about meeting the other dog 392 00:20:38,040 --> 00:20:40,280 Speaker 2: or the situation that they're in, just by which way 393 00:20:40,320 --> 00:20:41,040 Speaker 2: the tail goes. 394 00:20:41,680 --> 00:20:42,960 Speaker 4: So yes and no. 395 00:20:43,680 --> 00:20:46,320 Speaker 1: Well, when I introduce you to new people, I always 396 00:20:46,320 --> 00:20:49,479 Speaker 1: look at the way that your tail wags, and I 397 00:20:49,480 --> 00:20:52,080 Speaker 1: can tell if you're a little bit kind of like 398 00:20:52,760 --> 00:20:57,840 Speaker 1: fearful or joyful. That's why I always walk around the back. 399 00:21:01,000 --> 00:21:03,800 Speaker 2: You know, there's a thing that people get dressed up 400 00:21:03,840 --> 00:21:06,760 Speaker 2: as, what do they call that, pup play? 401 00:21:06,920 --> 00:21:10,840 Speaker 1: Is that it? Well, there are people that let's not open. 402 00:21:10,880 --> 00:21:11,879 Speaker 1: That let's not open? 403 00:21:13,200 --> 00:21:15,320 Speaker 4: Can we can we continue on the AI theme.
404 00:21:15,840 --> 00:21:19,120 Speaker 2: I thought this was really awesome because I think probably 405 00:21:19,119 --> 00:21:22,320 Speaker 2: about a year ago, there was a photographer who entered 406 00:21:22,400 --> 00:21:27,960 Speaker 2: a photo competition with an AI image and won it, 407 00:21:28,720 --> 00:21:31,639 Speaker 2: and of course then it was revealed that the photo 408 00:21:31,760 --> 00:21:34,280 Speaker 2: wasn't a real photograph, it had been generated by AI, 409 00:21:34,359 --> 00:21:36,159 Speaker 2: and then he was disqualified and it made all the 410 00:21:36,160 --> 00:21:40,160 Speaker 2: headlines and there's all this controversy. Well, a different photographer, 411 00:21:40,400 --> 00:21:42,400 Speaker 2: a guy by the name of Miles Astray, has done 412 00:21:42,400 --> 00:21:46,440 Speaker 2: the exact opposite. He's entered a photo competition where there 413 00:21:46,520 --> 00:21:51,240 Speaker 2: was a category for AI generated artwork or AI generated photography, 414 00:21:51,480 --> 00:21:54,560 Speaker 2: but he put a real image in and he won the competition, 415 00:21:54,880 --> 00:21:58,600 Speaker 2: and it was this really amazing photograph of a pink flamingo, 416 00:21:59,160 --> 00:22:03,000 Speaker 2: and the judges and everybody thought, this is fantastic. 417 00:22:03,440 --> 00:22:04,719 Speaker 4: Isn't AI amazing? 418 00:22:05,000 --> 00:22:08,359 Speaker 2: And the photographer said he actually it was a real photograph, 419 00:22:08,440 --> 00:22:11,760 Speaker 2: and so he then got disqualified for submitting a real 420 00:22:11,800 --> 00:22:13,719 Speaker 2: photograph in an AI competition. 421 00:22:13,760 --> 00:22:15,600 Speaker 4: So I thought, it's kind of interesting that. 422 00:22:16,119 --> 00:22:18,960 Speaker 2: And we're not talking just run of the mill people 423 00:22:19,000 --> 00:22:23,280 Speaker 2: off the street, you and me.
We're talking about some 424 00:22:23,400 --> 00:22:27,400 Speaker 2: pretty smart people from Getty Images, the New York Times, 425 00:22:28,200 --> 00:22:31,440 Speaker 2: you know, Christie's. So the judges knew what they were 426 00:22:31,480 --> 00:22:34,160 Speaker 2: talking about, and they kind of had a real sense of. 427 00:22:34,080 --> 00:22:35,679 Speaker 4: What a good photograph is. 428 00:22:36,320 --> 00:22:38,080 Speaker 2: But I just thought it was kind of great to 429 00:22:38,600 --> 00:22:40,960 Speaker 2: flip it around and that this photographer came up with 430 00:22:41,000 --> 00:22:43,440 Speaker 2: the idea. So this guy Miles Astray and. 431 00:22:43,720 --> 00:22:44,479 Speaker 4: Yeah, good on him. 432 00:22:44,560 --> 00:22:49,440 Speaker 2: It was the eighteen thirty nine Color Photography Awards, and 433 00:22:49,680 --> 00:22:51,560 Speaker 2: so he actually won two categories. 434 00:22:51,720 --> 00:22:53,240 Speaker 4: One is he came third in. 435 00:22:53,240 --> 00:22:57,000 Speaker 2: The I think it was the Judges' Award, and then 436 00:22:57,040 --> 00:22:59,880 Speaker 2: he came first in the People's Vote Award. 437 00:23:00,359 --> 00:23:02,040 Speaker 4: So everyone was fooled.
She crushes so hard. You know Apple 446 00:23:33,040 --> 00:23:36,760 Speaker 1: is gay as well, right, Well, if Apple brought out 447 00:23:37,160 --> 00:23:39,760 Speaker 1: a toilet, she'd get an Apple toilet, like she would 448 00:23:39,800 --> 00:23:45,680 Speaker 1: get any Apple product. But this week, didn't Apple launch something. 449 00:23:46,800 --> 00:23:50,520 Speaker 1: Didn't they wheel out something like their version of chat 450 00:23:50,600 --> 00:23:52,320 Speaker 1: GPT or some version of it. 451 00:23:52,440 --> 00:23:53,880 Speaker 4: Yeah, This is really controversial. 452 00:23:53,920 --> 00:23:55,320 Speaker 2: In fact, it was something that I was going to 453 00:23:55,320 --> 00:24:00,400 Speaker 2: talk about because what's happened is they've basically Apple has 454 00:24:00,440 --> 00:24:02,440 Speaker 2: this big event and it talks about all their new 455 00:24:02,520 --> 00:24:05,000 Speaker 2: up and coming stuff, and one of the things they're 456 00:24:05,040 --> 00:24:09,080 Speaker 2: talking about is merging AI into the Siri operating system, 457 00:24:09,720 --> 00:24:12,960 Speaker 2: so they're taking the next so it's a big makeover 458 00:24:13,080 --> 00:24:13,560 Speaker 2: for Siri. 459 00:24:14,400 --> 00:24:16,040 Speaker 4: So effectively, what. 460 00:24:15,960 --> 00:24:18,280 Speaker 2: It's going to do is going to make Siri more natural, 461 00:24:18,560 --> 00:24:20,920 Speaker 2: more relevant, they're saying, more personal. 462 00:24:21,320 --> 00:24:22,320 Speaker 4: It's got a new look. 463 00:24:22,920 --> 00:24:26,399 Speaker 2: The icon's been made to look different and it glows better. 464 00:24:26,680 --> 00:24:28,760 Speaker 2: But what they're saying now is that it can also 465 00:24:28,840 --> 00:24:31,400 Speaker 2: handle things like stumbles in speech, so if someone has 466 00:24:31,400 --> 00:24:36,000 Speaker 2: a stutter, it'll better understand context.
And you can also 467 00:24:36,200 --> 00:24:38,320 Speaker 2: type to Siri, and it can answer questions about how 468 00:24:38,320 --> 00:24:41,640 Speaker 2: you use your iPhone and your iMac and your iPad 469 00:24:41,640 --> 00:24:41,960 Speaker 2: and all. 470 00:24:41,840 --> 00:24:42,440 Speaker 4: That sort of thing. 471 00:24:42,640 --> 00:24:44,080 Speaker 2: The other thing it'll also be able to do is 472 00:24:44,119 --> 00:24:48,160 Speaker 2: interact with the other apps on your phone, so that's 473 00:24:48,160 --> 00:24:49,920 Speaker 2: one of the things that Siri hasn't been able to 474 00:24:49,960 --> 00:24:51,600 Speaker 2: do in the past. So you can get it to 475 00:24:51,680 --> 00:24:54,439 Speaker 2: go and interact with other apps, and it's going to 476 00:24:54,480 --> 00:24:56,760 Speaker 2: make the user experience a lot better. 477 00:24:57,200 --> 00:24:59,760 Speaker 4: However, there is a little bit of controversy. 478 00:24:59,800 --> 00:25:02,679 Speaker 2: Now we know that Elon Musk loves to be the 479 00:25:02,720 --> 00:25:04,000 Speaker 2: controversial person, but. 480 00:25:03,960 --> 00:25:05,040 Speaker 4: He's come out. 481 00:25:05,080 --> 00:25:08,639 Speaker 2: Elon Musk has this love hate relationship with AI, so 482 00:25:09,160 --> 00:25:12,960 Speaker 2: effectively, within hours of this being announced by Apple, 483 00:25:13,359 --> 00:25:17,560 Speaker 2: he's tweeted about it, saying that he's going to cancel iPhones, 484 00:25:17,960 --> 00:25:20,040 Speaker 2: that none of his staff will be allowed to bring 485 00:25:20,119 --> 00:25:23,080 Speaker 2: iPhones into the building and if they do, they'll be 486 00:25:23,160 --> 00:25:24,440 Speaker 2: told to take out. 487 00:25:24,240 --> 00:25:26,600 Speaker 4: Their iPhones and their iPhones will be put in a 488 00:25:26,640 --> 00:25:30,840 Speaker 4: Faraday cage. Do you know what a Faraday cage is? Yeah?
489 00:25:31,680 --> 00:25:35,280 Speaker 1: Yeah, yeah. So, yeah, like, but nobody will be 490 00:25:35,320 --> 00:25:40,160 Speaker 1: allowed to enter any of his buildings, like customers, nobody 491 00:25:40,240 --> 00:25:41,640 Speaker 1: with an Apple product. 492 00:25:42,119 --> 00:25:46,560 Speaker 2: Yeah yeah, I mean mind you he's I think he's 493 00:25:47,080 --> 00:25:51,720 Speaker 2: suing OpenAI or there's some sort of case going 494 00:25:51,760 --> 00:25:52,359 Speaker 2: on at the moment. 495 00:25:52,359 --> 00:25:54,080 Speaker 4: So there's there's a bit of bad blood there. 496 00:25:54,119 --> 00:25:57,160 Speaker 2: But the reason he says that he doesn't want this 497 00:25:57,400 --> 00:26:00,399 Speaker 2: integration or he thinks it's a bad thing, is that 498 00:26:00,480 --> 00:26:03,280 Speaker 2: he feels that effectively, what it's going to do is 499 00:26:03,320 --> 00:26:06,600 Speaker 2: it's just going to make them spying devices. He says 500 00:26:06,720 --> 00:26:10,640 Speaker 2: that you know effectively it's just going to kind 501 00:26:10,680 --> 00:26:12,879 Speaker 2: of run amok and that people will be able 502 00:26:12,960 --> 00:26:15,320 Speaker 2: to spy on him using their iPhones. 503 00:26:15,960 --> 00:26:19,600 Speaker 1: Do you know what I've noticed recently using chat GPT 504 00:26:19,880 --> 00:26:27,000 Speaker 1: four point zero. Patrick James is that old mate who 505 00:26:27,040 --> 00:26:30,000 Speaker 1: sounds like Tim Ferriss. That's the dude that talks to me, 506 00:26:30,160 --> 00:26:34,240 Speaker 1: sounds exactly like Tim Ferriss. He doesn't always get my 507 00:26:34,320 --> 00:26:39,720 Speaker 1: Australian accent. So sometimes I just need to Americanize the 508 00:26:39,880 --> 00:26:44,679 Speaker 1: question because I don't type.
I usually just talk like 509 00:26:44,760 --> 00:26:46,639 Speaker 1: you know how you can just ask it a question 510 00:26:46,760 --> 00:26:50,080 Speaker 1: and then it types and gives audio for the answer. 511 00:26:50,960 --> 00:26:56,239 Speaker 1: So sometimes it says, oh, I think you're asking you know? 512 00:26:56,359 --> 00:26:59,639 Speaker 1: And then then I say it again with a less 513 00:27:00,160 --> 00:27:03,560 Speaker 1: Boganny voice, and it understands what I'm saying. So I 514 00:27:03,640 --> 00:27:08,960 Speaker 1: wonder if the new Apple product can understand Bogans. Maybe 515 00:27:09,600 --> 00:27:14,800 Speaker 1: probably a Bogan friendly, Bogan friendly AI. I bet they 516 00:27:14,840 --> 00:27:16,480 Speaker 1: don't even know what that is over there? 517 00:27:16,720 --> 00:27:18,440 Speaker 4: Not a chance. If you were going to say something, 518 00:27:18,480 --> 00:27:19,040 Speaker 4: because I. 519 00:27:18,960 --> 00:27:21,080 Speaker 3: Was going to say, to let everyone else know, because 520 00:27:21,040 --> 00:27:23,480 Speaker 3: I had to google it, anyone that mightn't know what 521 00:27:23,520 --> 00:27:24,760 Speaker 3: a Faraday cage is? 522 00:27:26,800 --> 00:27:27,000 Speaker 4: Yeah? 523 00:27:27,600 --> 00:27:29,919 Speaker 2: Yeah, okay, So a Faraday cage, if you can imagine just 524 00:27:29,960 --> 00:27:32,600 Speaker 2: a wire mesh around you. 525 00:27:32,640 --> 00:27:34,720 Speaker 4: So they usually use them during experiments. 526 00:27:34,920 --> 00:27:37,520 Speaker 2: So, you know how you can have 527 00:27:37,680 --> 00:27:40,720 Speaker 2: like a, basically, if you had lightning come down, 528 00:27:41,040 --> 00:27:43,520 Speaker 2: you could protect yourself inside a Faraday cage. So it 529 00:27:43,640 --> 00:27:46,480 Speaker 2: insulates you from an electrical discharge. 530 00:27:46,560 --> 00:27:49,480 Speaker 4: Is that the right answer, Tiff?
531 00:27:49,720 --> 00:27:50,399 Speaker 5: Yeah, yeah it is. 532 00:27:50,480 --> 00:27:54,400 Speaker 3: Yeah, it protects from electromagnetic radiation and RF interference. 533 00:27:54,720 --> 00:27:56,520 Speaker 3: I was thinking it might just be a lock box, 534 00:27:56,560 --> 00:27:58,800 Speaker 3: and I thought I might get one for chocolate around here, But. 535 00:27:59,640 --> 00:28:01,720 Speaker 4: Have you eaten your chocolate from last weekend already? 536 00:28:02,000 --> 00:28:05,040 Speaker 2: I ate it on the way home, so did I 537 00:28:05,240 --> 00:28:07,359 Speaker 2: didn't get out of the car, so we bought chocolate 538 00:28:07,400 --> 00:28:10,359 Speaker 2: at the chocolate shop. Mine didn't make it back to 539 00:28:10,400 --> 00:28:13,080 Speaker 2: the land. It was only about forty five minutes away. 540 00:28:13,400 --> 00:28:14,280 Speaker 5: Mine didn't even. 541 00:28:14,119 --> 00:28:17,439 Speaker 3: Make it out of Castle Maye or Castle Maine if 542 00:28:17,440 --> 00:28:17,959 Speaker 3: you're fancy. 543 00:28:18,080 --> 00:28:20,080 Speaker 4: Oh yeah, is it Castle made or Castle Maine? 544 00:28:20,080 --> 00:28:22,720 Speaker 2: Maybe someone from Castle Maine or Castle Magan tell us, 545 00:28:23,200 --> 00:28:26,080 Speaker 2: I never know, Craig, I said. 546 00:28:25,920 --> 00:28:26,480 Speaker 1: Was it? 547 00:28:26,480 --> 00:28:26,840 Speaker 4: Well? 548 00:28:27,000 --> 00:28:30,160 Speaker 1: It? I think people from up there call it castle. 549 00:28:31,320 --> 00:28:34,360 Speaker 1: I'm a bogan, so I call it castle. But I 550 00:28:34,400 --> 00:28:37,200 Speaker 1: think interestingly. And this has got nothing to do with tech. 551 00:28:37,280 --> 00:28:39,880 Speaker 1: But remember that dude we had on, I think 552 00:28:39,920 --> 00:28:44,680 Speaker 1: his name is Dean Morby, the powerlifting dude.
Yeah, yeah, yeah, 553 00:28:44,720 --> 00:28:47,800 Speaker 1: he's from Castlemaine and all these homies up there, 554 00:28:47,840 --> 00:28:50,880 Speaker 1: so shout out to I mean, this is just we'll 555 00:28:51,000 --> 00:28:54,000 Speaker 1: come back in thirty seconds, Patrick. But this guy trains 556 00:28:54,040 --> 00:28:58,040 Speaker 1: a whole lot of older citizens, a lot of fifty plus, 557 00:28:58,120 --> 00:29:01,880 Speaker 1: sixty seventy plus, and he's basically well last time I 558 00:29:01,920 --> 00:29:03,960 Speaker 1: spoke to him. Anyway, I assume he's still doing it. 559 00:29:04,040 --> 00:29:09,280 Speaker 1: He had essentially a powerlifting gym in Castlemaine, yeah, 560 00:29:09,320 --> 00:29:15,520 Speaker 1: for all these old people, and yeah, what's that? What 561 00:29:15,560 --> 00:29:16,000 Speaker 1: are you doing? 562 00:29:16,160 --> 00:29:16,560 Speaker 5: Castle? 563 00:29:17,960 --> 00:29:22,400 Speaker 2: I looked up how to pronounce Castlemaine and that's what 564 00:29:22,400 --> 00:29:23,640 Speaker 2: it came up as. 565 00:29:23,840 --> 00:29:25,960 Speaker 1: Castle Maine. It's British. 566 00:29:26,280 --> 00:29:28,360 Speaker 4: How can that work now? And it didn't work earlier? 567 00:29:28,520 --> 00:29:31,360 Speaker 2: So I was only playing it so I could listen 568 00:29:31,360 --> 00:29:33,240 Speaker 2: to it myself, and I didn't realize you guys could 569 00:29:33,240 --> 00:29:34,760 Speaker 2: hear it. But just for the sake of those people 570 00:29:34,760 --> 00:29:37,200 Speaker 2: who didn't hear it the first time, Castle. 571 00:29:36,840 --> 00:29:38,640 Speaker 5: Maine, I. 572 00:29:40,440 --> 00:29:44,360 Speaker 1: Think everyone, all right, that's enough now, Patrick, I think 573 00:29:44,480 --> 00:29:49,800 Speaker 1: everyone in the world other than Aussies pronounces it parcel Maine.
574 00:29:51,040 --> 00:29:58,800 Speaker 1: I think all right, back on Bogan, Bogan Bergan, Like, 575 00:29:58,880 --> 00:30:02,040 Speaker 1: what's hilarious is when a posh person says bogan, so it 576 00:30:02,080 --> 00:30:04,720 Speaker 1: doesn't sound crass anymore? 577 00:30:05,600 --> 00:30:08,520 Speaker 5: And can you make it pronounce the C word? That trick? 578 00:30:09,320 --> 00:30:13,400 Speaker 1: No, no, Tiff? When did you become me? 579 00:30:14,000 --> 00:30:18,320 Speaker 5: Sounds so polite? I thought it might sound really polite. 580 00:30:18,440 --> 00:30:22,040 Speaker 4: But genuinely speaking, you don't want to say the C word, do you, Craig? 581 00:30:24,240 --> 00:30:24,400 Speaker 3: Oh? 582 00:30:24,440 --> 00:30:26,280 Speaker 4: Sorry, you said the C thing? I thought you meant Craig. 583 00:30:28,840 --> 00:30:34,600 Speaker 1: God, I'm fucking copping it today. What's a collective noun 584 00:30:34,720 --> 00:30:40,920 Speaker 1: for bogans? A herd? Bogues? No, not the plural, 585 00:30:41,080 --> 00:30:45,360 Speaker 1: the collective noun, I don't know. Like a flock, a flotilla, 586 00:30:45,440 --> 00:30:52,000 Speaker 1: a murder, a gaggle of Craigs, a gaggle of cra Yeah, 587 00:30:52,280 --> 00:30:55,920 Speaker 1: for sure it would be what about all right? Come on? 588 00:30:55,960 --> 00:30:57,920 Speaker 1: Can we get back on this terrible? 589 00:30:58,120 --> 00:31:01,000 Speaker 4: Are we scared about the Siri AI integration? 590 00:31:01,160 --> 00:31:03,520 Speaker 2: Do you want your phone knowing everything and being able 591 00:31:03,560 --> 00:31:06,440 Speaker 2: to do stuff without you even needing to ask it 592 00:31:06,480 --> 00:31:07,040 Speaker 2: to, Tiff? 593 00:31:07,600 --> 00:31:08,040 Speaker 1: Ah? 594 00:31:08,160 --> 00:31:11,320 Speaker 5: Yeah, it's a bit weird, isn't it. I don't know what. 595 00:31:11,280 --> 00:31:14,560 Speaker 1: Does that mean though?
Being able to do stuff without 596 00:31:14,600 --> 00:31:15,120 Speaker 1: asking it. 597 00:31:15,440 --> 00:31:18,440 Speaker 2: Well, okay, so we spoke about this a couple of 598 00:31:18,440 --> 00:31:22,320 Speaker 2: weeks ago. When I recently had an update to Android 599 00:31:22,360 --> 00:31:25,560 Speaker 2: Auto in my car using my Android phone, I had 600 00:31:25,600 --> 00:31:27,600 Speaker 2: a whole lot of text messages come through from a 601 00:31:27,640 --> 00:31:30,920 Speaker 2: single person. Because young people don't send one text, they 602 00:31:31,080 --> 00:31:34,160 Speaker 2: don't do a paragraph, they do line by line by line. 603 00:31:34,520 --> 00:31:38,480 Speaker 2: So a friend of mine sent me five text messages. 604 00:31:39,320 --> 00:31:39,520 Speaker 4: Right. 605 00:31:39,600 --> 00:31:43,640 Speaker 2: So my car then responded and said, you've just received 606 00:31:43,640 --> 00:31:45,520 Speaker 2: a whole lot of text messages from this person. Would 607 00:31:45,560 --> 00:31:48,760 Speaker 2: you like me to summarize what the conversation was about? 608 00:31:48,960 --> 00:31:52,000 Speaker 2: And I said, yeah, that's great, Google, go for it. 609 00:31:52,160 --> 00:31:55,120 Speaker 2: And it did. It summarized the whole thing, so rather 610 00:31:55,160 --> 00:31:58,640 Speaker 2: than bombarding me with five different text messages, it just 611 00:31:58,640 --> 00:32:00,680 Speaker 2: gave me the gist of what the conversation was about. 612 00:32:00,760 --> 00:32:03,880 Speaker 2: So that's where AI is being employed, and it's great 613 00:32:03,920 --> 00:32:06,520 Speaker 2: in the car, you don't want to be disturbed by 614 00:32:06,640 --> 00:32:09,600 Speaker 2: a stack of text messages. So I mean, that's just 615 00:32:09,640 --> 00:32:12,400 Speaker 2: one really simple example of potentially what. 616 00:32:12,400 --> 00:32:12,840 Speaker 4: It could do.
617 00:32:13,000 --> 00:32:18,240 Speaker 2: But I like the idea of a bogan translator or 618 00:32:18,360 --> 00:32:21,320 Speaker 2: you know what about if someone stutters and has an accent, 619 00:32:21,360 --> 00:32:24,000 Speaker 2: But those things actually do have 620 00:32:24,040 --> 00:32:26,560 Speaker 2: a lot of merit because it's going to make life 621 00:32:26,600 --> 00:32:29,280 Speaker 2: a lot easier for people and more accessible. So it 622 00:32:29,320 --> 00:32:32,680 Speaker 2: will make that text Siri or, you know, whatever 623 00:32:32,720 --> 00:32:35,840 Speaker 2: the Android Auto version is. So I kind 624 00:32:35,840 --> 00:32:39,360 Speaker 2: of like the idea ish. I think Elon Musk is 625 00:32:39,400 --> 00:32:42,560 Speaker 2: just doing the publicity stunt thing. 626 00:32:42,320 --> 00:32:44,960 Speaker 1: What if there was a Siri or there was AI 627 00:32:45,120 --> 00:32:48,080 Speaker 1: that kind of had its own personality and some days 628 00:32:48,080 --> 00:32:50,880 Speaker 1: it was just like having a bad day. It's like 629 00:32:51,160 --> 00:32:53,680 Speaker 1: and you you know, you open it up and it goes, 630 00:32:53,720 --> 00:32:56,000 Speaker 1: what fucking what now? 631 00:32:56,400 --> 00:32:57,960 Speaker 2: But it'd be good if you could choose the mood 632 00:32:57,960 --> 00:33:00,600 Speaker 2: for the day to match your mood, so you know, 633 00:33:00,680 --> 00:33:04,240 Speaker 2: the shitty mood, over the top nice mood. 634 00:33:05,320 --> 00:33:07,920 Speaker 4: You could do the Tiff, the Patrick and the Craig mood. 635 00:33:08,520 --> 00:33:10,479 Speaker 1: What if you ask it a question and it's like, 636 00:33:10,560 --> 00:33:15,400 Speaker 1: that's a fucking stupid question. Try harder, Like seriously, you're 637 00:33:15,440 --> 00:33:19,000 Speaker 1: not asking me that, Like AI with attitude?
638 00:33:19,320 --> 00:33:20,400 Speaker 4: Can I make an admission? 639 00:33:20,680 --> 00:33:25,200 Speaker 2: So my identical twin brother, genetically identical to me, he 640 00:33:25,320 --> 00:33:27,000 Speaker 2: rang me up and he was telling me about a 641 00:33:27,040 --> 00:33:28,920 Speaker 2: tech problem that he had, and I got to the 642 00:33:28,960 --> 00:33:31,800 Speaker 2: point where I thought, how could you be so stupid and. 643 00:33:31,760 --> 00:33:34,320 Speaker 4: Not understand this because we're genetically identical. 644 00:33:34,760 --> 00:33:38,720 Speaker 2: It's like, you don't understand it. 645 00:33:39,920 --> 00:33:42,560 Speaker 1: Well, that's because you've exposed yourself to a lot of 646 00:33:42,680 --> 00:33:44,760 Speaker 1: training and education experience. 647 00:33:45,000 --> 00:33:47,080 Speaker 4: I should have listened to that whole sentence, shouldn't I have? 648 00:33:48,040 --> 00:33:51,400 Speaker 1: Well, you've you've got knowledge he hasn't. That's not about genetics. 649 00:33:51,400 --> 00:33:54,160 Speaker 1: That's just about what you've learned versus what he's learned. 650 00:33:54,200 --> 00:33:57,400 Speaker 2: It was pretty basic, copy and paste or something, I don't know, 651 00:33:57,440 --> 00:34:01,400 Speaker 2: something simple, and he rolled his eyes and said, yeah, 652 00:34:01,400 --> 00:34:03,440 Speaker 2: I don't know how he doesn't do it anyway. 653 00:34:04,800 --> 00:34:08,880 Speaker 4: All right, cybersecurity, let's talk about that. 654 00:34:09,320 --> 00:34:14,319 Speaker 2: Okay, Well, you have a business, a small business, a 655 00:34:14,400 --> 00:34:15,280 Speaker 2: business of sorts. 656 00:34:15,840 --> 00:34:17,520 Speaker 4: How cyber secure are you? 657 00:34:17,600 --> 00:34:17,800 Speaker 2: Do you? 658 00:34:17,880 --> 00:34:18,200 Speaker 4: Reckon? 659 00:34:18,960 --> 00:34:22,080 Speaker 1: I'm not the person to ask.
I'm about as cyber secure 660 00:34:22,120 --> 00:34:26,439 Speaker 1: as a fucking box of Kleenex. That's how tough. 661 00:34:27,120 --> 00:34:28,319 Speaker 4: Do you think it's relevant though? 662 00:34:28,360 --> 00:34:31,600 Speaker 2: I mean, I'm probably asking a really obvious question because 663 00:34:31,680 --> 00:34:34,239 Speaker 2: I know you've been through this whole debacle and 664 00:34:34,280 --> 00:34:37,160 Speaker 2: the stress of having to go through this yourself. But 665 00:34:37,239 --> 00:34:39,959 Speaker 2: I think that well, there's a lot of warnings coming 666 00:34:39,960 --> 00:34:42,520 Speaker 2: out now and we're talking this is like across the board. 667 00:34:42,640 --> 00:34:45,960 Speaker 2: So this is from a government level, both in Australia 668 00:34:46,160 --> 00:34:49,799 Speaker 2: and also overseas as well. The United States has warned 669 00:34:49,840 --> 00:34:52,839 Speaker 2: these cyber attacks are state driven. So what I'm saying 670 00:34:52,920 --> 00:34:56,239 Speaker 2: is it's not just independent. You know, some Russian bloke 671 00:34:56,320 --> 00:34:59,320 Speaker 2: sitting in his back shed trying to hack into your computer, 672 00:34:59,600 --> 00:35:04,319 Speaker 2: but these are state sanctioned, where certain states, well, and 673 00:35:04,360 --> 00:35:06,480 Speaker 2: it's kind of commonly known that China has a very 674 00:35:06,480 --> 00:35:11,120 Speaker 2: active operation in hacking as well, so state sanctioned hacking. 675 00:35:11,560 --> 00:35:16,000 Speaker 2: And I think more than sixty percent of small business owners 676 00:35:16,000 --> 00:35:19,800 Speaker 2: say that hackers are their biggest fear. This was according 677 00:35:19,800 --> 00:35:23,440 Speaker 2: to the US Chamber of Commerce. But sixty percent is 678 00:35:23,480 --> 00:35:27,200 Speaker 2: a massive amount. And you heard about the hack recently 679 00:35:27,440 --> 00:35:30,560 Speaker 2: at the NHS in the UK.
This only happened in 680 00:35:30,600 --> 00:35:36,160 Speaker 2: the last fortnight. Yeah? Anyway, the National 681 00:35:36,160 --> 00:35:39,719 Speaker 2: Health Service NHS was hacked into, but it was so 682 00:35:39,920 --> 00:35:43,360 Speaker 2: bad that they ended up resorting to using paper for 683 00:35:43,400 --> 00:35:47,000 Speaker 2: communications and people who were on waiting lists for surgery 684 00:35:47,040 --> 00:35:50,200 Speaker 2: had to be put off. It impacted and potentially could 685 00:35:50,200 --> 00:35:54,520 Speaker 2: have caused major problems for people who needed urgent surgery. 686 00:35:54,840 --> 00:35:57,480 Speaker 2: And it makes you realize how vulnerable potentially we could 687 00:35:57,560 --> 00:36:01,960 Speaker 2: be whether it's public services, could be like our electricity supply, water, 688 00:36:02,040 --> 00:36:04,359 Speaker 2: that sort of stuff. So it kind of opens it up. 689 00:36:04,640 --> 00:36:07,960 Speaker 2: And when you run a small business, because you tend 690 00:36:08,080 --> 00:36:11,000 Speaker 2: not to have the infrastructure, you don't have somebody who's 691 00:36:11,080 --> 00:36:14,320 Speaker 2: running your IT network, that kind of tends. 692 00:36:14,040 --> 00:36:16,240 Speaker 4: To leave you more vulnerable as well. 693 00:36:16,440 --> 00:36:18,960 Speaker 2: So, I mean, it's something certainly that I've thought about, 694 00:36:18,960 --> 00:36:21,160 Speaker 2: but because I guess I'm a little bit more tech savvy, 695 00:36:21,440 --> 00:36:25,280 Speaker 2: I think about the security implications of what it means 696 00:36:25,320 --> 00:36:28,399 Speaker 2: for the data that I store. But when you think 697 00:36:28,400 --> 00:36:31,880 Speaker 2: about all the data that you've got, Craigo, you know, 698 00:36:31,920 --> 00:36:35,760 Speaker 2: your research at the moment, how are you protecting that research?
699 00:36:37,320 --> 00:36:41,560 Speaker 1: Yeah, so it's it is stored. I don't know. I 700 00:36:42,120 --> 00:36:44,839 Speaker 1: don't do that bit, but I don't know. I don't 701 00:36:44,880 --> 00:36:48,000 Speaker 1: know it's in a cupboard. I don't know a cyber cupboard. 702 00:36:48,200 --> 00:36:51,040 Speaker 1: I don't know, Patrick, I can don't ask me. I 703 00:36:51,120 --> 00:36:53,239 Speaker 1: know how to get into it and access it. It's 704 00:36:53,239 --> 00:36:56,840 Speaker 1: not my job. I don't do security. But it's funny 705 00:36:56,880 --> 00:36:59,239 Speaker 1: you asked, right, because I'm old. But then I think 706 00:36:59,280 --> 00:37:04,759 Speaker 1: about with the rapid evolution of computing, you know, quantum computers, 707 00:37:05,120 --> 00:37:10,160 Speaker 1: arriving next Wednesday, right, and the world is that like 708 00:37:10,239 --> 00:37:13,680 Speaker 1: the way that we buy and sell and bank and 709 00:37:14,480 --> 00:37:17,759 Speaker 1: you know, medicine and all of that. Like my dear 710 00:37:17,800 --> 00:37:22,160 Speaker 1: old mum and dad. I look at them, and twenty 711 00:37:22,360 --> 00:37:25,480 Speaker 1: twenty four is bewildering for my mum and dad. Like 712 00:37:25,520 --> 00:37:29,040 Speaker 1: it just that and they're not dumb people, but it's 713 00:37:29,320 --> 00:37:32,279 Speaker 1: just it's just I don't know. I don't know what 714 00:37:32,320 --> 00:37:36,400 Speaker 1: the answer is because we can't stop technological advancement. But 715 00:37:36,520 --> 00:37:40,920 Speaker 1: I feel so sorry. Like I went out two weekends 716 00:37:40,960 --> 00:37:42,879 Speaker 1: ago with mum and dad and their friends. 
So there 717 00:37:42,960 --> 00:37:45,800 Speaker 1: was like eight old people, I mean fucking old, fucking 718 00:37:46,000 --> 00:37:49,560 Speaker 1: old as fuck, like Noah's friends, right, Jesus's fucking 719 00:37:49,760 --> 00:37:54,400 Speaker 1: next-door neighbors, right. And I was looking 720 00:37:54,440 --> 00:37:56,839 Speaker 1: around the table and I was trying to explain to 721 00:37:56,880 --> 00:38:01,560 Speaker 1: them what a podcast was, just something that simple. And 722 00:38:02,400 --> 00:38:04,759 Speaker 1: I'm not at all being rude, because they're fucking, you know, 723 00:38:04,840 --> 00:38:07,120 Speaker 1: in some ways they're way smarter than all of us, right, 724 00:38:07,200 --> 00:38:11,920 Speaker 1: but yeah, like, fuck, and this is essentially, you know, 725 00:38:11,960 --> 00:38:13,880 Speaker 1: the best way you can explain it to them is 726 00:38:13,960 --> 00:38:15,960 Speaker 1: it's kind of like a radio show that you can 727 00:38:16,040 --> 00:38:19,680 Speaker 1: just listen to whenever you want, you know. But then 728 00:38:19,760 --> 00:38:21,799 Speaker 1: more: how do you, how do you, where is it 729 00:38:21,840 --> 00:38:23,799 Speaker 1: on your phone? And then trying to go, well, so 730 00:38:23,920 --> 00:38:25,279 Speaker 1: there's an app, and then that's like... 731 00:38:25,360 --> 00:38:27,600 Speaker 4: Fuck. Do they understand streaming? 732 00:38:28,680 --> 00:38:32,400 Speaker 1: Not... kind of, but you know, like none of 733 00:38:32,440 --> 00:38:35,319 Speaker 1: them have Netflix, none of them have any of those platforms, 734 00:38:35,360 --> 00:38:38,840 Speaker 1: and so I guess theoretically. But even when you explain 735 00:38:38,920 --> 00:38:41,440 Speaker 1: to them something that, for us and all of our listeners, 736 00:38:41,520 --> 00:38:45,560 Speaker 1: it's like, well, how could somebody not understand that?
It's 737 00:38:45,560 --> 00:38:48,160 Speaker 1: very different when you're trying to understand 738 00:38:48,160 --> 00:38:54,120 Speaker 1: an eighty-five-year-old person's worldview, thinking through, like, Tiff's, 739 00:38:54,320 --> 00:38:58,600 Speaker 1: you know, looking through Tiff's window, and she's probably more 740 00:38:58,640 --> 00:39:01,759 Speaker 1: compassionate and understanding, but you know what I mean, it's like, well, 741 00:39:01,800 --> 00:39:04,600 Speaker 1: of course this makes sense, but to them it makes 742 00:39:04,719 --> 00:39:07,200 Speaker 1: no sense, you know. And I just, I don't know, 743 00:39:07,280 --> 00:39:09,960 Speaker 1: I think for me, that's something that I worry about 744 00:39:09,960 --> 00:39:14,000 Speaker 1: for really old people. Like Mum is terrified of paying 745 00:39:14,040 --> 00:39:17,720 Speaker 1: things electronically because she doesn't know what they're talking about, 746 00:39:18,200 --> 00:39:20,760 Speaker 1: and the people at the bank or whatever, or Medicare, 747 00:39:20,880 --> 00:39:23,719 Speaker 1: they just talk to her like she's an idiot, because, no, 748 00:39:23,880 --> 00:39:26,760 Speaker 1: you just do this. And they talk to her quickly 749 00:39:26,920 --> 00:39:30,960 Speaker 1: and dismissively, and she doesn't actually understand what it all means. 750 00:39:31,600 --> 00:39:33,400 Speaker 1: Can I just say, that worries me? 751 00:39:33,960 --> 00:39:37,040 Speaker 4: Yeah. Can I just say too, though, you do sometimes 752 00:39:37,080 --> 00:39:37,880 Speaker 4: have a choice. 753 00:39:37,960 --> 00:39:40,279 Speaker 2: And I'm not going to particularly name any of the 754 00:39:40,560 --> 00:39:43,239 Speaker 2: different banks, but you've got the Big Four, and then 755 00:39:43,280 --> 00:39:45,880 Speaker 2: there are other banks as well, and there are community banks, 756 00:39:45,920 --> 00:39:48,720 Speaker 2: and sometimes it's about voting with your feet.
757 00:39:49,239 --> 00:39:52,080 Speaker 4: I recently had... I live in a small town. 758 00:39:52,120 --> 00:39:54,359 Speaker 2: We only had two banks in our town, and the 759 00:39:54,400 --> 00:39:55,879 Speaker 2: big bank pulled out. 760 00:39:55,880 --> 00:39:58,040 Speaker 4: One of the Big Four pulled out of the township. 761 00:39:58,320 --> 00:40:00,080 Speaker 2: So I pulled all of my money out of it 762 00:40:00,280 --> 00:40:02,000 Speaker 2: and put it into the local community bank. 763 00:40:02,040 --> 00:40:04,360 Speaker 4: I thought, you know what, bug you. And I can walk 764 00:40:04,200 --> 00:40:06,279 Speaker 2: in there any time of day, and I see them 765 00:40:06,320 --> 00:40:11,520 Speaker 2: explaining things delicately, with understanding; they will talk things through 766 00:40:11,719 --> 00:40:14,160 Speaker 2: for older people, because generally it's older people who want 767 00:40:14,160 --> 00:40:16,200 Speaker 2: to have that face to face. And I think that 768 00:40:16,239 --> 00:40:20,120 Speaker 2: there are still ways to shop around, to ask around, and 769 00:40:20,200 --> 00:40:23,840 Speaker 2: not to get lumped in and be treated like that. 770 00:40:23,920 --> 00:40:26,120 Speaker 4: So if you go to a telco provider, there's a 771 00:40:26,120 --> 00:40:27,840 Speaker 4: lot of third party telco providers 772 00:40:27,840 --> 00:40:29,839 Speaker 2: now. If you walk in and you're not satisfied, because 773 00:40:29,840 --> 00:40:32,279 Speaker 2: you think you're being treated like a dummy, go somewhere else. 774 00:40:32,360 --> 00:40:33,120 Speaker 2: Vote with your feet. 775 00:40:34,400 --> 00:40:36,560 Speaker 1: Everything you said makes sense to me. But if you 776 00:40:36,640 --> 00:40:38,560 Speaker 1: said to my mum and dad, go to a third 777 00:40:38,600 --> 00:40:41,239 Speaker 1: party telco provider and vote with your feet, they'd be like, I 778 00:40:41,239 --> 00:40:43,120 Speaker 1: don't know, what the fuck do you mean?
I mean, 779 00:40:43,120 --> 00:40:45,640 Speaker 1: this is the point. We all get that my mum 780 00:40:45,680 --> 00:40:48,960 Speaker 1: and dad do not understand anything you just said. So 781 00:40:49,000 --> 00:40:50,360 Speaker 1: that's the problem. 782 00:40:50,560 --> 00:40:53,759 Speaker 2: Yeah. Can I... You know, I was so excited by this. 783 00:40:54,239 --> 00:40:59,120 Speaker 2: The FBI recently hacked some hackers. And there are different 784 00:40:59,160 --> 00:41:02,440 Speaker 2: ways that hackers can get into your system. So, you know, 785 00:41:02,520 --> 00:41:04,560 Speaker 2: you click on a link and then they skim the 786 00:41:04,600 --> 00:41:06,799 Speaker 2: information they want to get your ID so they can 787 00:41:06,840 --> 00:41:09,279 Speaker 2: pretend that they're you. But the other thing they can 788 00:41:09,280 --> 00:41:11,799 Speaker 2: do is they use ransomware, where they take all of 789 00:41:11,840 --> 00:41:14,360 Speaker 2: your data, they lock it up, they encrypt it, and 790 00:41:14,400 --> 00:41:17,919 Speaker 2: then they exploit that and say, well, if you want 791 00:41:17,920 --> 00:41:19,480 Speaker 2: your data back, you've got to pay 792 00:41:19,320 --> 00:41:22,240 Speaker 4: us X amount of dollars. Okay, so that's ransomware. 793 00:41:22,680 --> 00:41:26,000 Speaker 2: But now the FBI have managed to get seven thousand 794 00:41:26,160 --> 00:41:29,719 Speaker 2: keys to ransomware, and they're just giving them away 795 00:41:29,719 --> 00:41:33,320 Speaker 2: for free. So if someone, a business, an individual, whatever, 796 00:41:34,239 --> 00:41:38,719 Speaker 2: gets hacked and they have all their data locked up.
Now, 797 00:41:38,760 --> 00:41:43,000 Speaker 2: the FBI has access to seven thousand ransomware keys, which I 798 00:41:43,000 --> 00:41:46,600 Speaker 2: thought was really good, so you can basically get access 799 00:41:46,640 --> 00:41:50,799 Speaker 2: to your data without having this whole ransom situation over 800 00:41:50,840 --> 00:41:52,600 Speaker 2: the top of you. So it's good to see that 801 00:41:52,680 --> 00:41:56,120 Speaker 2: the good guys, the white hat hackers, 802 00:41:55,920 --> 00:42:00,840 Speaker 4: are getting their way around some of the naughty people 803 00:42:00,840 --> 00:42:01,600 Speaker 4: out there, Crago. 804 00:42:02,120 --> 00:42:04,560 Speaker 1: I love that, and I like it that there are 805 00:42:04,600 --> 00:42:05,640 Speaker 1: white hat hackers. 806 00:42:06,120 --> 00:42:06,440 Speaker 4: Love it. 807 00:42:06,480 --> 00:42:09,280 Speaker 1: I love that idea. I did see on the news, 808 00:42:09,400 --> 00:42:11,400 Speaker 1: I don't know what channel it was, and 809 00:42:11,440 --> 00:42:14,680 Speaker 1: it looked a bit weird to me, this week, a 810 00:42:14,800 --> 00:42:19,080 Speaker 1: robotic thumb that people were wearing. Yes, so instead of having 811 00:42:19,880 --> 00:42:23,120 Speaker 1: four fingers and one thumb, they had four fingers and 812 00:42:23,200 --> 00:42:26,880 Speaker 1: two thumbs on one hand. So they've now essentially got six 813 00:42:26,960 --> 00:42:30,640 Speaker 1: digits on one hand. I'm not sure that's necessary, but 814 00:42:30,719 --> 00:42:32,040 Speaker 1: tell us about it. Yeah.
815 00:42:32,080 --> 00:42:34,520 Speaker 2: Well, so this study showed that ninety eight percent of 816 00:42:34,560 --> 00:42:36,799 Speaker 2: the people who took part in it were able to 817 00:42:37,239 --> 00:42:43,479 Speaker 2: more successfully manipulate objects with a third thumb, and only 818 00:42:43,520 --> 00:42:46,719 Speaker 2: thirteen people were unable to perform the task within the 819 00:42:46,719 --> 00:42:49,680 Speaker 2: first minute. So people put on this glove, it gives 820 00:42:49,719 --> 00:42:53,319 Speaker 2: them an extra thumb, and within a minute, ninety eight 821 00:42:53,360 --> 00:42:57,200 Speaker 2: percent were able to use it more effectively than their 822 00:42:57,239 --> 00:43:01,800 Speaker 2: normal hand. So the dexterity was improved by having this augmentation, 823 00:43:02,480 --> 00:43:05,080 Speaker 2: so it could be used in lots of areas. And 824 00:43:05,120 --> 00:43:07,520 Speaker 2: they're saying that it's kind of like, you know, your 825 00:43:07,600 --> 00:43:11,879 Speaker 2: grasping ability, so the dexterity was increased as well. It's 826 00:43:12,000 --> 00:43:14,960 Speaker 2: kind of really good. It's a robotic thumb. That'd 827 00:43:15,080 --> 00:43:15,840 Speaker 2: be great, wouldn't it? 828 00:43:16,400 --> 00:43:19,040 Speaker 1: Ah, I tell you who would love that: Mary, for 829 00:43:19,160 --> 00:43:20,040 Speaker 1: opening jars. 830 00:43:20,560 --> 00:43:23,200 Speaker 4: Yes, exactly, so you get the double thumbs up. How 831 00:43:23,239 --> 00:43:23,640 Speaker 4: good's that?
832 00:43:25,440 --> 00:43:28,240 Speaker 1: I'm not even lying. When I was up there last Saturday, 833 00:43:28,320 --> 00:43:31,400 Speaker 1: because I go every Saturday to take fucking Hercules to 834 00:43:31,440 --> 00:43:37,440 Speaker 1: the gym, Ronnie McGronstar. And yeah, Mary asked me to 835 00:43:37,480 --> 00:43:40,040 Speaker 1: open multiple things because she was cooking, and you know, 836 00:43:40,120 --> 00:43:42,360 Speaker 1: she takes the jars of whatever out of the 837 00:43:42,400 --> 00:43:45,040 Speaker 1: fridge or the cupboard. I may as well just go 838 00:43:45,160 --> 00:43:47,919 Speaker 1: up there to open jars every Saturday and then drive home. 839 00:43:48,000 --> 00:43:51,800 Speaker 1: But if she had the robotic fucking thumb, she wouldn't 840 00:43:51,800 --> 00:43:54,400 Speaker 1: need me. I'd become redundant. 841 00:43:55,239 --> 00:43:58,359 Speaker 4: That's it, just a robotic thumb, and that's all your work. 842 00:44:00,880 --> 00:44:04,000 Speaker 1: What role does Craig play? He's essentially a robotic thumb. 843 00:44:04,960 --> 00:44:06,160 Speaker 1: It comes at no cost. 844 00:44:06,920 --> 00:44:09,560 Speaker 2: It must be pretty easy to use if people only 845 00:44:09,600 --> 00:44:13,239 Speaker 2: took sixty seconds to be able to feel comfortable using it, 846 00:44:13,280 --> 00:44:15,400 Speaker 2: because I would think that if you had a glove 847 00:44:15,480 --> 00:44:17,920 Speaker 2: with an extra thumb pointing out, that would be weird. 848 00:44:18,400 --> 00:44:19,759 Speaker 4: But evidently not. 849 00:44:20,360 --> 00:44:24,000 Speaker 2: That says a lot about the user experience, if it's 850 00:44:24,000 --> 00:44:26,000 Speaker 2: that easy to pick up and start working with, don't 851 00:44:26,000 --> 00:44:26,360 Speaker 2: you reckon?
852 00:44:26,400 --> 00:44:30,440 Speaker 1: Well, yeah, I guess. I mean, more, I don't know, 853 00:44:30,520 --> 00:44:35,719 Speaker 1: maybe more importantly, like for people, like amputees, who 854 00:44:35,760 --> 00:44:38,000 Speaker 1: have had accidents or whatever, you think, what are the 855 00:44:38,040 --> 00:44:43,920 Speaker 1: potential applications? There's a well-known... fuck, I think his 856 00:44:44,080 --> 00:44:48,719 Speaker 1: name is Paul Gelder, it might be. He is a 857 00:44:48,800 --> 00:44:54,319 Speaker 1: guy who had one of his arms bitten off, 858 00:44:54,920 --> 00:44:57,440 Speaker 1: or like his forearm and hand, and one of his 859 00:44:57,560 --> 00:45:01,760 Speaker 1: legs bitten off. Paul de Gelder. Paul de Gelder, thanks, 860 00:45:01,800 --> 00:45:07,919 Speaker 1: Tiff. In, believe it or not, Patrick, a Sydney Harbour shark 861 00:45:07,960 --> 00:45:13,080 Speaker 1: attack. Like, imagine being fucking attacked by a shark. 862 00:45:13,120 --> 00:45:16,920 Speaker 1: And he was a navy diver, but he's got this 863 00:45:17,239 --> 00:45:22,520 Speaker 1: motherfucker of a bionic hand, and he's jacked, like he's 864 00:45:22,560 --> 00:45:26,400 Speaker 1: this big, strong, jacked, athletic dude. And he's got a 865 00:45:26,440 --> 00:45:30,800 Speaker 1: prosthetic bionic arm and hand, and also a leg, I think, 866 00:45:31,160 --> 00:45:34,319 Speaker 1: or just a prosthetic leg. But yeah, I think that's 867 00:45:34,360 --> 00:45:38,080 Speaker 1: one of the... I think pretty soon we're going... Look, 868 00:45:39,120 --> 00:45:41,359 Speaker 1: this is just what I think.
I could be wildly wrong, 869 00:45:41,960 --> 00:45:45,760 Speaker 1: but I think we're going to see, in the next 870 00:45:45,800 --> 00:45:50,440 Speaker 1: decade, we're going to see quadriplegics walking, you know, 871 00:45:50,520 --> 00:45:53,520 Speaker 1: with the stuff that they can now do with not 872 00:45:53,640 --> 00:45:56,920 Speaker 1: only the spine and the nervous system, but also robotics. 873 00:45:56,960 --> 00:46:01,520 Speaker 1: And, you know, imagine, imagine people like our friend, 874 00:46:01,560 --> 00:46:05,759 Speaker 1: Tiff, Joel Sarti. Imagine, fuck, how good would that be 875 00:46:05,880 --> 00:46:08,160 Speaker 1: for him, to be able to get to that point 876 00:46:08,200 --> 00:46:11,120 Speaker 1: in time? And yeah, I think it's exciting. I think 877 00:46:11,160 --> 00:46:13,799 Speaker 1: for me, this is... well, one of the... there's many, 878 00:46:13,840 --> 00:46:17,360 Speaker 1: but one of the upsides, Patrick, of like this kind 879 00:46:17,400 --> 00:46:21,279 Speaker 1: of genius that is coming online is that we 880 00:46:21,360 --> 00:46:24,239 Speaker 1: can do stuff like help people walk that would never 881 00:46:24,320 --> 00:46:27,760 Speaker 1: have been able to walk. You know, for me, 882 00:46:27,360 --> 00:46:30,000 Speaker 1: all the negative is worth it if 883 00:46:30,000 --> 00:46:31,200 Speaker 1: we can do things like that. 884 00:46:31,600 --> 00:46:33,600 Speaker 4: Yeah, and there's lots of different trains of thought too. 885 00:46:33,640 --> 00:46:38,040 Speaker 2: There's the exoskeleton, where they basically have an exoskeleton around 886 00:46:38,120 --> 00:46:42,959 Speaker 2: the non-functioning limbs, the legs. Of course, you could 887 00:46:42,960 --> 00:46:46,759 Speaker 2: then, you know, match that up with, say, the brain implant, 888 00:46:47,080 --> 00:46:48,719 Speaker 2: so you do it that way.
But I think 889 00:46:48,760 --> 00:46:51,880 Speaker 2: we've spoken about this before as well: bypassing the injury 890 00:46:51,920 --> 00:46:55,240 Speaker 2: to the spine, where you, yeah, have an ability to actually 891 00:46:55,280 --> 00:46:59,360 Speaker 2: have the nerve tissue connected and bypass the broken connection 892 00:46:59,480 --> 00:47:00,879 Speaker 2: there, and so potentially that 893 00:47:00,800 --> 00:47:03,560 Speaker 4: could be as well. So it's super exciting. 894 00:47:03,719 --> 00:47:06,800 Speaker 1: Yeah, and you think about the muscles. Like, for example, 895 00:47:06,840 --> 00:47:10,800 Speaker 1: if somebody gets an injury, say, between the shoulder blades, 896 00:47:10,840 --> 00:47:13,719 Speaker 1: so the thoracic spine, at T five or six, right, 897 00:47:14,480 --> 00:47:18,400 Speaker 1: then their arms work, but their legs don't work. 898 00:47:19,520 --> 00:47:23,279 Speaker 1: But their legs actually work. It's just that there's 899 00:47:24,160 --> 00:47:27,719 Speaker 1: basically no juice getting to the legs, you know. So as 900 00:47:27,760 --> 00:47:31,759 Speaker 1: you said, like the actual muscles, there's nothing wrong with 901 00:47:31,840 --> 00:47:34,960 Speaker 1: the muscles; they're just not getting any neural connection from 902 00:47:35,040 --> 00:47:38,359 Speaker 1: the brain to make the legs work. So yeah, if 903 00:47:38,360 --> 00:47:40,920 Speaker 1: you could build a little kind of a neural bridge 904 00:47:41,360 --> 00:47:46,439 Speaker 1: over that break so that that signal gets through, I mean, 905 00:47:46,600 --> 00:47:48,520 Speaker 1: I reckon you and I could probably figure that out 906 00:47:48,520 --> 00:47:50,640 Speaker 1: on a weekend, if we really put our minds 907 00:47:50,400 --> 00:47:54,160 Speaker 5: to it. Can get the Craig to build it. 908 00:47:55,560 --> 00:48:00,879 Speaker 1: Exactly. I could conceptualize it.
You know, I do work 909 00:48:00,880 --> 00:48:03,200 Speaker 1: at Brain Park, or I do research at Brain Park, 910 00:48:03,280 --> 00:48:05,600 Speaker 1: so fuck, how hard can it be? I'm kidding, everyone. 911 00:48:05,640 --> 00:48:07,280 Speaker 1: Please don't send me hate emails. 912 00:48:07,600 --> 00:48:11,200 Speaker 4: But how did you get through the door? Did they not filter 913 00:48:11,880 --> 00:48:13,400 Speaker 4: people, like? 914 00:48:14,120 --> 00:48:18,160 Speaker 1: I'm like the stupid mascot. You know how AFL clubs 915 00:48:18,200 --> 00:48:21,000 Speaker 1: have a mascot? I'm the Brain Park mascot. I just 916 00:48:21,000 --> 00:48:24,000 Speaker 1: fucking walk around bumping into shit, and people pat 917 00:48:23,760 --> 00:48:25,520 Speaker 5: me. The Brain Park Labrador. 918 00:48:26,000 --> 00:48:28,200 Speaker 4: Yeah, yeah, the one who's pushing at the door that 919 00:48:28,280 --> 00:48:28,840 Speaker 4: says Pull. 920 00:48:29,480 --> 00:48:29,640 Speaker 5: Yeah. 921 00:48:29,719 --> 00:48:31,919 Speaker 1: Yeah. Like, I just push my head against the door 922 00:48:32,000 --> 00:48:34,359 Speaker 1: until someone hears me banging and then they let me in. 923 00:48:35,560 --> 00:48:39,280 Speaker 1: It's sad, but like, I love the pats and the food. 924 00:48:39,680 --> 00:48:39,919 Speaker 4: Yeah. 925 00:48:39,960 --> 00:48:43,759 Speaker 2: So, talking about uplifting technology, this was, I thought, 926 00:48:43,800 --> 00:48:47,399 Speaker 2: a nice little segue. So the Chinese drone manufacturer 927 00:48:47,440 --> 00:48:53,000 Speaker 2: DJI has, for the first time ever, flown a drone 928 00:48:53,080 --> 00:48:55,520 Speaker 2: to the top of Everest, up to one of 929 00:48:55,560 --> 00:48:58,560 Speaker 2: the base stations at Everest, because the air is quite 930 00:48:58,640 --> 00:49:02,839 Speaker 2: thin and it's hard to get the lift capacity.
And 931 00:49:02,960 --> 00:49:06,000 Speaker 2: what they're talking about is using the FlyCart thirty 932 00:49:06,600 --> 00:49:09,600 Speaker 2: to be able to fly up there. So we're talking 933 00:49:09,680 --> 00:49:14,920 Speaker 2: six thousand meters, six kilometers, up, and it was able 934 00:49:16,719 --> 00:49:18,399 Speaker 4: to deliver 935 00:49:20,040 --> 00:49:22,320 Speaker 2: resources up to people at the top of Everest, or 936 00:49:22,440 --> 00:49:25,360 Speaker 2: up that high. And the big problem 937 00:49:25,400 --> 00:49:28,560 Speaker 2: there is the high winds, the sub-zero temperatures, 938 00:49:28,560 --> 00:49:31,320 Speaker 2: a whole lot of factors that have meant that drones 939 00:49:31,320 --> 00:49:32,919 Speaker 2: in the past haven't been able to get up there. 940 00:49:33,040 --> 00:49:36,239 Speaker 2: But it also means that they can clean up all 941 00:49:36,280 --> 00:49:38,839 Speaker 2: the crap that's on top of Everest as well. There's 942 00:49:38,880 --> 00:49:40,400 Speaker 2: a whole lot of junk up there, a lot of, 943 00:49:40,480 --> 00:49:43,680 Speaker 2: well, lots of stuff up there, so they can... a 944 00:49:43,680 --> 00:49:46,560 Speaker 2: whole lot of dead people. And dead people, yeah, yep. 945 00:49:46,719 --> 00:49:49,000 Speaker 2: So the good thing is that they were able to 946 00:49:49,040 --> 00:49:51,239 Speaker 2: do this. They did a round trip and they still 947 00:49:51,239 --> 00:49:53,840 Speaker 2: had forty three percent battery power by the end of it, 948 00:49:53,840 --> 00:49:57,120 Speaker 2: which is kind of cool.
And what it will do, though, 949 00:49:57,239 --> 00:49:58,839 Speaker 2: is it's going to make it a lot safer too, 950 00:49:58,920 --> 00:50:02,239 Speaker 2: because they do clean up parts of Everest. And what 951 00:50:02,280 --> 00:50:04,200 Speaker 2: it's going to mean is that the sherpas and 952 00:50:04,239 --> 00:50:08,319 Speaker 2: people who are trying to preserve the natural area, they 953 00:50:08,360 --> 00:50:10,319 Speaker 2: can stand and pilot these drones. I thought that was 954 00:50:10,360 --> 00:50:12,800 Speaker 2: really kind of cool, that they're doing that with the drones. 955 00:50:12,920 --> 00:50:15,120 Speaker 2: I had a personal example last weekend. I think I 956 00:50:15,120 --> 00:50:18,240 Speaker 2: told Tiff about this. I do a bit of drone 957 00:50:18,239 --> 00:50:20,400 Speaker 2: photography, and one of my clients wanted me to go 958 00:50:20,480 --> 00:50:24,359 Speaker 2: out and do some drone work, just around some earth 959 00:50:24,440 --> 00:50:27,120 Speaker 2: moving that they were doing. And I left the land 960 00:50:27,400 --> 00:50:29,600 Speaker 2: and I was only driving forty five minutes, and it 961 00:50:29,719 --> 00:50:32,880 Speaker 2: was nice and still, kind of a nice day. And I 962 00:50:32,960 --> 00:50:36,280 Speaker 2: get to the location and there are gusting winds of forty 963 00:50:36,280 --> 00:50:39,880 Speaker 2: five kilometers an hour. And at one point, the drone 964 00:50:39,920 --> 00:50:42,280 Speaker 2: that I've got can, at top speed, go about 965 00:50:42,280 --> 00:50:44,759 Speaker 2: seventy clicks, so it's fast. If you put it into 966 00:50:44,800 --> 00:50:47,200 Speaker 2: sport mode, it's pretty fast. So I had it in 967 00:50:47,280 --> 00:50:52,080 Speaker 2: sport mode and it was still going backwards. Yeah, it 968 00:50:52,120 --> 00:50:53,560 Speaker 2: was pretty full on.
So I had to kind of 969 00:50:53,560 --> 00:50:55,480 Speaker 2: fly closer to the ground, and then move it to 970 00:50:55,560 --> 00:50:57,759 Speaker 2: one location and then let the wind kind of push 971 00:50:57,800 --> 00:50:59,040 Speaker 2: it at maximum speed. 972 00:50:59,160 --> 00:51:01,320 Speaker 4: But that was kind of interesting. 973 00:51:01,360 --> 00:51:03,640 Speaker 2: So to fly all the way up, you know, 974 00:51:03,719 --> 00:51:07,200 Speaker 2: to that part of Everest, to, you know, 975 00:51:07,440 --> 00:51:11,399 Speaker 2: nineteen thousand six hundred and eighty five feet, sounds more 976 00:51:11,400 --> 00:51:13,880 Speaker 2: impressive than six thousand meters, doesn't it? 977 00:51:13,920 --> 00:51:17,799 Speaker 1: Well, six kilometers. The top of Everest is nearly nine kilometers, 978 00:51:17,840 --> 00:51:19,800 Speaker 1: so that must be up to one of the base 979 00:51:19,880 --> 00:51:25,200 Speaker 1: camps or one of the camp twos. But still, six 980 00:51:25,280 --> 00:51:28,560 Speaker 1: kilometers up in the air, dude. Yeah. Like, you think 981 00:51:28,600 --> 00:51:32,720 Speaker 1: of a hundred-story building, that's one thousand feet, usually, 982 00:51:32,760 --> 00:51:37,319 Speaker 1: because there's like ten foot a floor. So that 983 00:51:37,320 --> 00:51:40,120 Speaker 1: would be the equivalent of... what did you say, how 984 00:51:40,120 --> 00:51:41,680 Speaker 1: many thousand feet? 985 00:51:42,239 --> 00:51:43,360 Speaker 4: Nineteen thousand feet. 986 00:51:44,640 --> 00:51:49,480 Speaker 1: That'd be the equivalent of a nineteen-hundred-story building. 987 00:51:51,080 --> 00:51:54,400 Speaker 1: That's fucking... Hey, we need to jam. How can people find 988 00:51:54,400 --> 00:51:57,319 Speaker 1: you, my friend, and come and, you know, play with 989 00:51:57,400 --> 00:52:00,960 Speaker 1: you and your drone?
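[Editor's note: the back-of-the-envelope conversion above checks out. A minimal sketch in Python, assuming the hosts' rough rule of thumb of ten feet per story:]

```python
# Check the on-air altitude conversion: 6,000 meters to feet,
# then to "building stories" at roughly 10 feet per story.
METERS_TO_FEET = 3.28084   # standard meters-to-feet conversion factor
FEET_PER_STORY = 10        # the hosts' rule of thumb

altitude_m = 6000
altitude_ft = altitude_m * METERS_TO_FEET   # ~19,685 feet, as quoted
stories = altitude_ft / FEET_PER_STORY      # ~1,968 stories

print(round(altitude_ft))  # 19685
print(int(stories))        # 1968
```

So the "nineteen-hundred-story building" figure is the right order of magnitude, just slightly rounded down from about 1,968.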
And also, perhaps you need to 990 00:52:01,040 --> 00:52:05,040 Speaker 1: organize the wedding arrangements, just so all of us 991 00:52:05,120 --> 00:52:06,640 Speaker 1: can plan. 992 00:52:07,320 --> 00:52:09,480 Speaker 2: Well, if someone wants to just chat about nerd stuff 993 00:52:09,520 --> 00:52:11,160 Speaker 2: with me, I'm always happy to do that. 994 00:52:11,200 --> 00:52:12,960 Speaker 4: And thank you, Dale from Geelong. 995 00:52:13,680 --> 00:52:15,839 Speaker 1: You get married, I'll pay for it. 996 00:52:16,400 --> 00:52:20,000 Speaker 2: Ah, we want to go to Vegas, Tiff. Let's go 997 00:52:20,040 --> 00:52:20,800 Speaker 2: to Vegas again. 998 00:52:21,000 --> 00:52:25,759 Speaker 1: I'm not paying for your honeymoon. If you want to 999 00:52:25,760 --> 00:52:27,239 Speaker 1: get married, I'll 1000 00:52:27,400 --> 00:52:30,040 Speaker 4: pay for it. And we want to get married in Vegas. 1001 00:52:30,320 --> 00:52:32,680 Speaker 1: I will pay for your wedding up to a value 1002 00:52:32,680 --> 00:52:36,840 Speaker 1: of five thousand dollars. Has to be a legal wedding. 1003 00:52:37,160 --> 00:52:39,160 Speaker 4: You know this is being recorded, right? You can't... 1004 00:52:39,360 --> 00:52:42,880 Speaker 1: I don't care. I'm happy. If you actually really get married, 1005 00:52:42,920 --> 00:52:43,640 Speaker 1: I'll pay for it. 1006 00:52:44,160 --> 00:52:46,200 Speaker 4: Yeah, we will go to Vegas and get married. 1007 00:52:46,239 --> 00:52:48,200 Speaker 1: And then... you're not getting married in Vegas. You've got 1008 00:52:48,200 --> 00:52:50,840 Speaker 1: to get married here. We need a real wedding, and 1009 00:52:50,920 --> 00:52:53,680 Speaker 1: it needs to be legal, and me and the TYP 1010 00:52:53,920 --> 00:52:55,160 Speaker 1: folks are going to come along. 1011 00:52:55,680 --> 00:52:57,240 Speaker 5: My mum's going to be so happy. 1012 00:52:58,520 --> 00:52:59,919 Speaker 4: Only if you're the bridesmaid.
1013 00:53:01,440 --> 00:53:04,719 Speaker 1: I'll be the flower girl. I'll do security. I'll fucking sing. 1014 00:53:04,880 --> 00:53:05,520 Speaker 1: Don't worry. 1015 00:53:06,040 --> 00:53:09,520 Speaker 2: Yeah, people can contact me by going to websitesnow.com.au, 1016 00:53:09,520 --> 00:53:11,560 Speaker 2: because Genesis Effects is too 1017 00:53:11,480 --> 00:53:16,160 Speaker 4: hard to spell. So, you know, I created my 1018 00:53:16,160 --> 00:53:19,680 Speaker 2: business name before the internet was a thing, and so 1019 00:53:20,239 --> 00:53:22,440 Speaker 2: no one knows how to spell Genesis Effects, so I 1020 00:53:22,520 --> 00:53:25,120 Speaker 2: just kind of created websitesnow.com.au, which 1021 00:53:25,360 --> 00:53:28,239 Speaker 2: is the alternative. I mean, we do a lot of websites, 1022 00:53:28,280 --> 00:53:30,960 Speaker 2: so I guess that's a good reason to have that. 1023 00:53:31,000 --> 00:53:33,920 Speaker 2: But it's just the easiest way to spell what we do. 1024 00:53:35,080 --> 00:53:38,040 Speaker 1: I think sometimes we try and get too tricky with names. 1025 00:53:38,239 --> 00:53:40,800 Speaker 1: I think your new name works best, Patrick. 1026 00:53:40,920 --> 00:53:44,640 Speaker 1: Thank you, Pa, thank you, thank you. I think this 1027 00:53:44,680 --> 00:53:46,640 Speaker 1: episode is going to be called Patrick and Tiff Are 1028 00:53:46,680 --> 00:53:47,759 Speaker 1: Getting Married, for sure.