1 00:00:00,800 --> 00:00:03,800 Speaker 1: Welcome to The You Project. It's Tiffany Cook, 2 00:00:03,840 --> 00:00:06,080 Speaker 1: it's Patrick James Bonello. It's me, the one in the 3 00:00:06,120 --> 00:00:08,680 Speaker 1: middle on my screen anyway, the one in the middle. Tiff, 4 00:00:08,680 --> 00:00:10,960 Speaker 1: where am I on your screen? Middle? Bottom? Top? 5 00:00:11,240 --> 00:00:13,200 Speaker 2: You are the middle child. But I've got it on speaker 6 00:00:13,240 --> 00:00:15,360 Speaker 2: view, so you're real big when you talk. 7 00:00:16,239 --> 00:00:19,320 Speaker 1: Oh, I don't like that. I don't like that. Patrick, 8 00:00:19,320 --> 00:00:21,440 Speaker 1: where am I on yours? And where's Tiff on yours? 9 00:00:21,600 --> 00:00:23,280 Speaker 3: Well, you're at the top and Tiff's at the bottom, 10 00:00:23,480 --> 00:00:24,960 Speaker 3: so I'm being cuddled in the middle. 11 00:00:26,160 --> 00:00:28,319 Speaker 1: All right. So you've got like a little kind of 12 00:00:29,000 --> 00:00:31,560 Speaker 1: like a vertical pile of us. 13 00:00:31,800 --> 00:00:34,200 Speaker 3: Yeah, you're the big spoon and Tiff's the little spoon. 14 00:00:35,120 --> 00:00:37,800 Speaker 2: I have to put it on speaker view, otherwise 15 00:00:37,840 --> 00:00:39,880 Speaker 2: I'm like a little budgerigar with a mirror. 16 00:00:39,960 --> 00:00:41,680 Speaker 4: I'm like, oh, look at that. 17 00:00:41,320 --> 00:00:41,720 Speaker 3: That's me. 18 00:00:43,120 --> 00:00:45,640 Speaker 1: Yeah, yeah. No, I don't like that. Like, do 19 00:00:45,840 --> 00:00:48,440 Speaker 1: you know what I do? I have Patrick's 20 00:00:48,479 --> 00:00:50,519 Speaker 1: notes that he sent through in the middle of my 21 00:00:50,600 --> 00:00:53,920 Speaker 1: gigantic screen, and then our little faces up the top, 22 00:00:54,360 --> 00:00:56,680 Speaker 1: all about the size of a matchbox.
23 00:00:58,080 --> 00:00:59,680 Speaker 3: So you really don't see us at all. 24 00:01:00,720 --> 00:01:04,160 Speaker 1: Now I can see you clearly, totally. I can tolerate that. I 25 00:01:04,200 --> 00:01:11,680 Speaker 1: can see clearly now Lorraine has gone. Thank goodness. Patrick, 26 00:01:11,760 --> 00:01:16,400 Speaker 1: let's start with you, in what looks like the International 27 00:01:16,440 --> 00:01:20,600 Speaker 1: Space Station, where you are. Look... 28 00:01:20,720 --> 00:01:23,160 Speaker 3: No, well, okay, I'm sitting at my little podcast studio 29 00:01:23,200 --> 00:01:26,000 Speaker 3: and I've got a fake International Space Station background. But 30 00:01:26,040 --> 00:01:28,039 Speaker 3: I really want to kind of focus on a more 31 00:01:28,040 --> 00:01:31,280 Speaker 3: important thing, because our listeners can't see that we're actually 32 00:01:31,319 --> 00:01:34,920 Speaker 3: sitting in three different locations using Zoom. They just didn't 33 00:01:34,959 --> 00:01:37,399 Speaker 3: see you use a tea towel to blow your nose. 34 00:01:39,560 --> 00:01:41,480 Speaker 1: It's not a tea towel, it's... 35 00:01:41,280 --> 00:01:44,600 Speaker 3: A chamois that you wipe up with. 36 00:01:45,520 --> 00:01:48,240 Speaker 1: No, it's not a chamois. It's a yellow kerchief. 37 00:01:50,120 --> 00:01:52,840 Speaker 3: It looks like a chamois. You just wiped your face 38 00:01:52,880 --> 00:01:53,680 Speaker 3: with a chamois. 39 00:01:54,360 --> 00:01:56,640 Speaker 1: Well, now that you mention it, that's probably not a 40 00:01:56,680 --> 00:01:59,520 Speaker 1: bad idea. But I mean, chamois are for wiping things, 41 00:01:59,600 --> 00:02:01,160 Speaker 1: so I might look into that. 42 00:02:01,280 --> 00:02:05,120 Speaker 3: And then your face would look a lot cleaner. Sorry, 43 00:02:05,760 --> 00:02:07,040 Speaker 3: that started off well, didn't it? 44 00:02:07,400 --> 00:02:10,000 Speaker 1: I just... I was praising you. I was telling everyone.
45 00:02:10,000 --> 00:02:12,160 Speaker 1: I was telling you three before we went live that 46 00:02:12,280 --> 00:02:15,880 Speaker 1: on the International Bonello scale of how good people can be, 47 00:02:16,040 --> 00:02:18,880 Speaker 1: you're the high watermark. And then you shitcanned me 48 00:02:18,960 --> 00:02:21,040 Speaker 1: straight out of the gate on my own fucking show. 49 00:02:21,400 --> 00:02:23,400 Speaker 3: I just wanted to clarify that you weren't using... You 50 00:02:23,600 --> 00:02:27,280 Speaker 3: get your voice a couple of octaves lower. That's nuts. 51 00:02:28,160 --> 00:02:31,280 Speaker 3: Now Fritz is on his bed warming here. I turned 52 00:02:31,280 --> 00:02:32,840 Speaker 3: the heater on early. Now I'm getting too... 53 00:02:32,960 --> 00:02:35,320 Speaker 1: Oh, now he's getting... fuck, and now he's getting his 54 00:02:35,440 --> 00:02:45,880 Speaker 1: guns out. Jeez, did you hold yourself back? Ah, dear Tiff, 55 00:02:45,919 --> 00:02:47,120 Speaker 1: how are you? Good morning. 56 00:02:47,760 --> 00:02:51,200 Speaker 2: Fabulous, Harps, thanks. Fantastic. 57 00:02:51,600 --> 00:02:55,080 Speaker 1: How's... how's the household full of animals? 58 00:02:55,440 --> 00:02:59,160 Speaker 2: Oh, it's good. Everything's calm, everything's calm. 59 00:02:59,240 --> 00:03:01,040 Speaker 4: Now, happy animals. 60 00:03:01,919 --> 00:03:05,720 Speaker 1: I found a new animal that both of you would want. 61 00:03:06,000 --> 00:03:10,720 Speaker 1: What is it? It's a... it's a hybrid, so 62 00:03:11,320 --> 00:03:14,679 Speaker 1: it's not pretend, it's real. I think it's called 63 00:03:14,680 --> 00:03:19,079 Speaker 1: a dogxim, anyway, a dog and a fox hybrid. 64 00:03:19,800 --> 00:03:23,120 Speaker 3: Really? Yeah, I just looked up dogxim, and I mustn't 65 00:03:23,120 --> 00:03:26,000 Speaker 3: have spelled it correctly, because I just got... the women. 66 00:03:27,760 --> 00:03:29,280 Speaker 3: Don't look up dogxim. Whatever you do.
67 00:03:29,480 --> 00:03:30,840 Speaker 1: Sure you didn't look up buxom? 68 00:03:31,360 --> 00:03:33,640 Speaker 3: No, d-o-x-u-m. I don't know. 69 00:03:33,680 --> 00:03:35,920 Speaker 1: It might be i-m, I don't know. Just look up dog, 70 00:03:36,680 --> 00:03:41,920 Speaker 1: look up dog fox hybrid. Oh, this is great. I'll just 71 00:03:41,960 --> 00:03:45,080 Speaker 1: talk to everyone, because those two dumbasses... I guess it's 72 00:03:45,360 --> 00:03:48,920 Speaker 1: on my instruction, so it's my fault. Have you seen it? 73 00:03:51,200 --> 00:03:52,120 Speaker 1: I don't think they're... 74 00:03:52,000 --> 00:03:56,040 Speaker 3: Ugly, don't you? Actually, they do kind of have a 75 00:03:56,080 --> 00:03:57,200 Speaker 3: cute little foxy face. 76 00:03:57,480 --> 00:04:01,200 Speaker 2: God, they look like a bloody hyena hybrid. 77 00:04:01,440 --> 00:04:04,200 Speaker 1: Ah. I thought you two were animal lovers. 78 00:04:05,640 --> 00:04:09,400 Speaker 3: Well yeah, but Tiff, you know, has her lovely animal. 79 00:04:09,480 --> 00:04:12,000 Speaker 4: It's got to be cute. Its little ears stick up, 80 00:04:12,040 --> 00:04:13,000 Speaker 4: like a little Ewok. 81 00:04:13,880 --> 00:04:17,440 Speaker 1: Yeah, yeah. So I don't know that I'd want one 82 00:04:17,480 --> 00:04:17,960 Speaker 1: in the house. 83 00:04:18,000 --> 00:04:23,400 Speaker 3: It looks like the answer to the question no one asked. Well... 84 00:04:23,640 --> 00:04:29,599 Speaker 1: Fucking sorry, Sigmund Freud, what would you like to talk about? Sorry, 85 00:04:29,720 --> 00:04:33,640 Speaker 1: Leonardo da Vinci. Sorry, Jesus, where would you like 86 00:04:33,680 --> 00:04:38,960 Speaker 1: to go with the conversation? Come on, Dalai Lama, you 87 00:04:39,040 --> 00:04:39,800 Speaker 1: open the batting. 88 00:04:41,040 --> 00:04:42,919 Speaker 3: You know I've got a connection to the Dalai Lama.
89 00:04:42,960 --> 00:04:44,920 Speaker 3: And not only have I been in the same room 90 00:04:44,920 --> 00:04:47,479 Speaker 3: with the Dalai Lama, I was born on the same day. 91 00:04:47,760 --> 00:04:49,719 Speaker 1: You and I were in the same room at the 92 00:04:49,760 --> 00:04:54,039 Speaker 1: same time with him, dumbass. How's your memory? 93 00:04:54,640 --> 00:04:57,400 Speaker 3: I did say that, didn't I? I just conveniently left 94 00:04:57,440 --> 00:04:58,440 Speaker 3: you out of the equation. 95 00:04:58,800 --> 00:05:03,040 Speaker 1: Welcome back to So Now Central. I'm Patrick Bonello. Episode 96 00:05:03,120 --> 00:05:06,880 Speaker 1: number... I'm not sure. Hey Tiff, do you want to 97 00:05:06,920 --> 00:05:10,640 Speaker 1: hear my Dalai Lama story? Come on, bro, do better. 98 00:05:12,920 --> 00:05:15,200 Speaker 4: This is already my favorite episode. 99 00:05:15,760 --> 00:05:19,560 Speaker 3: Really, we're talking to nobody because everyone's turned off. 100 00:05:20,240 --> 00:05:24,400 Speaker 1: They love this. Except our listeners are maniacs. They're like, 101 00:05:24,680 --> 00:05:26,599 Speaker 1: fuck all the information, just talk shit. 102 00:05:27,000 --> 00:05:29,720 Speaker 3: Can I just talk about something that happened this week? 103 00:05:29,839 --> 00:05:33,000 Speaker 3: Remember how a few weeks ago we offered people 104 00:05:33,040 --> 00:05:36,520 Speaker 3: a free domain name, and like nothing happened, no one 105 00:05:36,560 --> 00:05:38,520 Speaker 3: took it up. And then... and then I got a 106 00:05:38,520 --> 00:05:41,880 Speaker 3: message from lovely Magdalena, who says that this is her 107 00:05:41,960 --> 00:05:45,080 Speaker 3: favorite segment of the show. How good is that?
She's 108 00:05:45,080 --> 00:05:50,919 Speaker 3: in Adelaide, and she's one of 109 00:05:50,920 --> 00:05:54,320 Speaker 3: those... she's a civil celebrant. So I got chatting 110 00:05:54,320 --> 00:05:56,000 Speaker 3: to her during the week. She loves the show, 111 00:05:56,200 --> 00:05:56,920 Speaker 3: she loves the... well... 112 00:05:57,000 --> 00:05:58,719 Speaker 1: Shout out, shout out to Magdalena. 113 00:05:59,160 --> 00:06:02,080 Speaker 3: Magdalena it is, but everyone calls her Magda when you 114 00:06:02,080 --> 00:06:08,240 Speaker 3: get close and personal, you know, once you become... 115 00:06:06,160 --> 00:06:09,239 Speaker 1: He talks to someone for seventeen minutes, now she's his bestie. Pretty 116 00:06:09,320 --> 00:06:14,040 Speaker 1: much joined at the hip for the rest of time. 117 00:06:16,200 --> 00:06:19,080 Speaker 1: He's so needy, everyone. Whatever you do, don't send him 118 00:06:19,120 --> 00:06:22,480 Speaker 1: an email. You'll never fucking get rid of him. 119 00:06:23,440 --> 00:06:32,240 Speaker 3: This is actually going very, very... god. Oh god. Okay, let's 120 00:06:32,240 --> 00:06:32,760 Speaker 3: start again. 121 00:06:33,000 --> 00:06:36,919 Speaker 1: Hey everyone, welcome to The You Project. I'm Craig Harper. 122 00:06:37,279 --> 00:06:40,600 Speaker 1: This is a hard-hitting podcast where we explore the 123 00:06:40,960 --> 00:06:42,560 Speaker 1: human experience. 124 00:06:42,720 --> 00:06:42,880 Speaker 2: Hi. 125 00:06:43,000 --> 00:06:46,159 Speaker 3: Patrick. Oh hey Craig, nice to see you. Hi Tiff, 126 00:06:46,400 --> 00:06:47,440 Speaker 3: great to have you on the show. 127 00:06:47,760 --> 00:06:54,719 Speaker 1: Yeah, Tiff, g'day, g'day, g'day. All silliness aside, what 128 00:06:54,920 --> 00:06:57,520 Speaker 1: does Magda do? So she's a civil celebrant. Yeah, 129 00:06:57,600 --> 00:06:59,040 Speaker 1: so weddings and funerals.
130 00:06:59,400 --> 00:07:02,920 Speaker 3: Yeah, but she's kind of tending towards funerals now, which 131 00:07:02,960 --> 00:07:04,560 Speaker 3: kind of makes sense, because you've got to plan a wedding 132 00:07:04,600 --> 00:07:08,080 Speaker 3: what, eighteen months in advance? Funerals, not so much. And also, you know, 133 00:07:08,320 --> 00:07:11,320 Speaker 3: weddings are on weekends, right, and funerals are during the week, 134 00:07:11,440 --> 00:07:12,280 Speaker 3: so if you want to do it... 135 00:07:12,280 --> 00:07:15,160 Speaker 1: Not sure about your rationale on that. Makes sense? What, so 136 00:07:15,240 --> 00:07:17,720 Speaker 1: what are you saying, like it's much more convenient for 137 00:07:17,720 --> 00:07:19,600 Speaker 1: people to die? It's less planning? 138 00:07:19,880 --> 00:07:22,600 Speaker 3: Well, you've got a pretty ongoing, steady audience, haven't you, 139 00:07:22,720 --> 00:07:23,760 Speaker 3: really, when you think about it? 140 00:07:24,240 --> 00:07:27,720 Speaker 1: Oh god, this got... this took a dark turn. Let's 141 00:07:27,720 --> 00:07:30,720 Speaker 1: start again. Everyone, welcome to The You Project. 142 00:07:31,040 --> 00:07:34,680 Speaker 3: Look, because I would... okay, now I have in my 143 00:07:34,840 --> 00:07:38,240 Speaker 3: will a very clear: I do not want a funeral. 144 00:07:38,440 --> 00:07:41,040 Speaker 3: I've got two and a half grand at the local pub, 145 00:07:41,320 --> 00:07:44,320 Speaker 3: and I'm inviting all my friends, me not included, of course. 146 00:07:44,920 --> 00:07:46,000 Speaker 3: Maybe in an ashtray. 147 00:07:46,360 --> 00:07:48,920 Speaker 1: I should just put you in a Weekend at Bernie's 148 00:07:48,920 --> 00:07:51,480 Speaker 1: style outfit and fucking prop you in the corner. 149 00:07:51,760 --> 00:07:53,880 Speaker 3: Now, my twin brother will probably be there, so they'll 150 00:07:53,920 --> 00:07:54,960 Speaker 3: just, you know, just look at him.
151 00:07:55,000 --> 00:07:58,480 Speaker 1: Oh, that'll be... that'll freak people out. He should just... 152 00:07:58,800 --> 00:08:00,960 Speaker 1: he should put on a white sheet and just walk 153 00:08:01,000 --> 00:08:07,640 Speaker 1: in halfway through, unannounced. Go, what are you all doing? Yeah? 154 00:08:07,800 --> 00:08:10,600 Speaker 3: No, but if you... if you don't want a religious 155 00:08:10,640 --> 00:08:13,280 Speaker 3: ceremony and you want a celebrant to do a wedding 156 00:08:13,360 --> 00:08:15,239 Speaker 3: or a funeral or something, it kind of makes sense. 157 00:08:16,520 --> 00:08:18,560 Speaker 3: Yeah, anyway. So Magda's my new mate. 158 00:08:19,000 --> 00:08:21,840 Speaker 1: You and I chatted about this yesterday, Tiff. Your thoughts? 159 00:08:21,880 --> 00:08:26,040 Speaker 1: Like, Patrick and I, fucking much to both of our annoyance, 160 00:08:26,080 --> 00:08:28,440 Speaker 1: we kind of agree on this, which is a weird thing. 161 00:08:28,600 --> 00:08:31,680 Speaker 1: So just put that in the diary: that 162 00:08:32,280 --> 00:08:36,200 Speaker 1: neither of us really wants a funeral. I... I don't 163 00:08:36,559 --> 00:08:39,480 Speaker 1: know... the idea of people... yeah, you know, the people 164 00:08:39,520 --> 00:08:41,840 Speaker 1: coming from all over to stand around and go, oh, 165 00:08:41,880 --> 00:08:45,600 Speaker 1: he was a good bloke, when they probably didn't really 166 00:08:45,640 --> 00:08:47,880 Speaker 1: think that, half of them, and then everyone... 167 00:08:48,080 --> 00:08:54,000 Speaker 2: Everybody thinks it with Patrick. You've got to put on something, 168 00:08:54,040 --> 00:08:56,800 Speaker 2: though. He's not saying nobody gets anything. 169 00:08:56,880 --> 00:08:58,760 Speaker 1: I'm happy to do that. I just don't want a 170 00:08:58,760 --> 00:09:01,160 Speaker 1: bunch of people in a church staring at a coffin 171 00:09:02,280 --> 00:09:05,000 Speaker 1: with what used to be me in it.
I'm not 172 00:09:05,000 --> 00:09:05,680 Speaker 1: there anymore. 173 00:09:07,760 --> 00:09:08,480 Speaker 3: What about you? 174 00:09:09,000 --> 00:09:10,679 Speaker 4: But where are you going to go when you're not... 175 00:09:11,080 --> 00:09:11,960 Speaker 4: on to the other place? 176 00:09:12,080 --> 00:09:14,120 Speaker 1: I'm going to the next place. I'm just warming up. 177 00:09:14,160 --> 00:09:18,720 Speaker 4: Now you're gonna drop by, and just, you know... 178 00:09:18,559 --> 00:09:20,760 Speaker 1: I'm going to be watching you like a fucking hawk, and 179 00:09:20,800 --> 00:09:28,119 Speaker 1: you better do better. You just remember I'm around. 180 00:09:27,679 --> 00:09:29,800 Speaker 3: Don't you pay attention to... 181 00:09:29,840 --> 00:09:32,400 Speaker 1: You pay attention, because you'll go, is that you? 182 00:09:32,440 --> 00:09:36,280 Speaker 4: Harps's caring tone at this moment will live with 183 00:09:36,320 --> 00:09:37,040 Speaker 4: me forever. 184 00:09:37,640 --> 00:09:37,840 Speaker 1: Good. 185 00:09:37,840 --> 00:09:40,000 Speaker 4: I hope I go first, because... 186 00:09:40,240 --> 00:09:42,920 Speaker 1: Because when I go to God... that's what Mary says. 187 00:09:44,320 --> 00:09:46,400 Speaker 1: I'm not going anywhere. When it comes to you, I'm 188 00:09:46,440 --> 00:09:48,680 Speaker 1: just gonna look over here. 189 00:09:49,840 --> 00:09:51,520 Speaker 4: That's creepy, no two ways about that. 190 00:09:52,800 --> 00:09:57,959 Speaker 5: Sorry, Patrick. So I want to start again. Hey everyone, 191 00:09:58,080 --> 00:10:02,480 Speaker 5: welcome to The You Project, fifth time we've started. So 192 00:10:02,720 --> 00:10:05,280 Speaker 5: what do your family and friends think about the idea 193 00:10:05,360 --> 00:10:08,720 Speaker 5: of just booze and chips at the pub but no 194 00:10:08,960 --> 00:10:09,800 Speaker 5: actual funeral?
195 00:10:10,200 --> 00:10:14,240 Speaker 3: My friends think it's a great idea, but I haven't 196 00:10:14,280 --> 00:10:16,680 Speaker 3: really articulated it to my family. I've just basically said 197 00:10:16,720 --> 00:10:20,040 Speaker 3: I don't want a funeral. Well, they know now, don't 198 00:10:20,160 --> 00:10:21,200 Speaker 3: they? Everyone knows. 199 00:10:21,400 --> 00:10:24,320 Speaker 1: So what will they do with your body? Just pop 200 00:10:24,360 --> 00:10:27,439 Speaker 1: it in the garden or something, or some food? Just 201 00:10:28,120 --> 00:10:33,040 Speaker 1: turn you into a bag of fertilizer? That's okay. Tiff's 202 00:10:33,120 --> 00:10:34,360 Speaker 1: reading something, I can tell. 203 00:10:34,600 --> 00:10:36,600 Speaker 4: I'm deep in thought. I'm deep in thought. 204 00:10:36,679 --> 00:10:41,679 Speaker 2: I'm thinking about, well, people like yourself and myself. You 205 00:10:41,679 --> 00:10:44,640 Speaker 2: have to have the conversation. Like, I'm like, have you 206 00:10:44,720 --> 00:10:45,360 Speaker 2: let anyone... 207 00:10:45,440 --> 00:10:45,800 Speaker 4: know what? 208 00:10:46,080 --> 00:10:49,080 Speaker 2: How this party is going to roll out? Because it's 209 00:10:49,080 --> 00:10:52,960 Speaker 2: not like there's a significant other hanging around in the 210 00:10:53,000 --> 00:10:55,680 Speaker 2: next room waiting to put it all together. 211 00:10:56,280 --> 00:11:00,040 Speaker 1: Yeah, well... well, Melissa would probably have to 212 00:11:00,040 --> 00:11:02,280 Speaker 1: be the one that would execute my wishes. 213 00:11:02,600 --> 00:11:03,439 Speaker 4: I don't think she's... 214 00:11:05,120 --> 00:11:09,640 Speaker 1: She's not a big fan of talking about me dying. So 215 00:11:09,720 --> 00:11:11,679 Speaker 1: I'll just pop it in a little email and she 216 00:11:11,840 --> 00:11:14,000 Speaker 1: can open it when I die.
I'll go, there you go, 217 00:11:14,080 --> 00:11:14,480 Speaker 1: here you go. 218 00:11:15,280 --> 00:11:17,199 Speaker 4: Yeah, but yes, I love you. Again... 219 00:11:17,559 --> 00:11:17,800 Speaker 3: Yeah. 220 00:11:18,000 --> 00:11:21,960 Speaker 1: See, it's been great. It's been great. You can have, 221 00:11:23,320 --> 00:11:28,600 Speaker 1: you know, all my camo shorts, and yeah, yeah, you 222 00:11:28,679 --> 00:11:31,360 Speaker 1: might even get a motorbike. Who knows. I'll have to 223 00:11:31,360 --> 00:11:34,320 Speaker 1: give you one you can ride, though. I mean, you can 224 00:11:34,360 --> 00:11:36,440 Speaker 1: ride well. But what I mean is there are a 225 00:11:36,440 --> 00:11:39,079 Speaker 1: couple of them a bit big for you. 226 00:11:39,080 --> 00:11:41,080 Speaker 3: You gave away that pinball machine. That's what I would 227 00:11:41,120 --> 00:11:42,520 Speaker 3: have wanted. Your old pinball. 228 00:11:43,440 --> 00:11:46,199 Speaker 1: Yeah, that... that is true. I did give that away. 229 00:11:46,240 --> 00:11:47,840 Speaker 1: What would you like of mine, Patrick? 230 00:11:48,200 --> 00:11:52,680 Speaker 3: Your pinball? Okay, all right, a big framed poster. 231 00:11:54,440 --> 00:11:56,800 Speaker 1: Is there any chance we could talk about tech today? 232 00:11:56,840 --> 00:11:58,719 Speaker 1: And I know I'm the problem so far. 233 00:11:59,440 --> 00:12:02,439 Speaker 3: Well, okay, so this is important, because that email that 234 00:12:02,520 --> 00:12:05,000 Speaker 3: you were going to send, like, it's sitting wherever 235 00:12:05,040 --> 00:12:07,760 Speaker 3: it's sitting on your computer. What happens if your computer 236 00:12:07,800 --> 00:12:11,360 Speaker 3: gets hacked and ransomware locks you out of that important 237 00:12:11,440 --> 00:12:14,360 Speaker 3: email to Melissa with instructions for the piss-up when 238 00:12:14,400 --> 00:12:14,920 Speaker 3: you die?
239 00:12:15,640 --> 00:12:17,560 Speaker 1: Oh shit, I don't know. I guess I'd have to 240 00:12:17,559 --> 00:12:21,360 Speaker 1: have a plan B. Yeah, I'd have to... I don't know. 241 00:12:21,559 --> 00:12:24,120 Speaker 1: But yeah, okay, so tell me what I do. 242 00:12:24,480 --> 00:12:27,000 Speaker 3: Well, you know how ransomware works... you're probably 243 00:12:27,320 --> 00:12:31,040 Speaker 3: familiar-ish with how ransomware works. So someone hacks into 244 00:12:31,120 --> 00:12:34,200 Speaker 3: your computer, you click on an unsuspecting link, and 245 00:12:34,240 --> 00:12:37,840 Speaker 3: then what the ransomware does is it locks up all the files. 246 00:12:37,880 --> 00:12:40,960 Speaker 3: It encrypts them so you can't access them. Then you 247 00:12:41,000 --> 00:12:44,920 Speaker 3: get contacted by the nefarious people who have done that, 248 00:12:45,120 --> 00:12:49,120 Speaker 3: and they extort money from you to unlock those files 249 00:12:49,160 --> 00:12:51,080 Speaker 3: so you can get access to your data again. And 250 00:12:51,120 --> 00:12:54,040 Speaker 3: there have been companies that have actually gone bankrupt and 251 00:12:54,640 --> 00:12:57,720 Speaker 3: have shut down because they didn't want to pay the ransom... well, 252 00:12:57,720 --> 00:13:00,000 Speaker 3: they couldn't afford the ransom. So this is really sick. 253 00:13:00,320 --> 00:13:03,760 Speaker 3: And in Japan now, police have... now, there are two types 254 00:13:04,160 --> 00:13:07,880 Speaker 3: of ransomware that are very prolific. One is called Phobos 255 00:13:07,920 --> 00:13:11,360 Speaker 3: and one is called 8Base. And now Japanese 256 00:13:11,400 --> 00:13:14,319 Speaker 3: police are saying they've cracked it.
They've... they've worked out 257 00:13:14,360 --> 00:13:19,880 Speaker 3: a way to decrypt that ransomware data, the stuff that's 258 00:13:19,880 --> 00:13:22,760 Speaker 3: been, you know, locked up already, and 259 00:13:22,800 --> 00:13:25,440 Speaker 3: they're giving it away for free, which I thought was 260 00:13:25,520 --> 00:13:27,199 Speaker 3: kind of a cool thing. Because, you know, if you're 261 00:13:27,200 --> 00:13:30,760 Speaker 3: a small business or an individual who's been subjected to 262 00:13:30,920 --> 00:13:33,640 Speaker 3: having all their information... and it could just be all 263 00:13:33,720 --> 00:13:36,520 Speaker 3: your photos and all your documents. And imagine all the 264 00:13:36,559 --> 00:13:40,320 Speaker 3: work you've done for your PhD. Could you imagine how devastating 265 00:13:40,360 --> 00:13:42,400 Speaker 3: that would be? So I just think it's great that 266 00:13:42,440 --> 00:13:45,760 Speaker 3: the Japanese police have done this. There was a big 267 00:13:46,200 --> 00:13:52,080 Speaker 3: operation recently where they raided these people who make these 268 00:13:52,679 --> 00:13:57,400 Speaker 3: ransomware tools, and it looks like they've cracked it. So 269 00:13:57,440 --> 00:13:59,320 Speaker 3: that's... that's kind of good news for people out there. 270 00:13:59,400 --> 00:14:01,960 Speaker 3: They're giving it away for free. You can go... 271 00:14:02,080 --> 00:14:03,959 Speaker 3: you can go to their website, and there's an English 272 00:14:04,080 --> 00:14:06,440 Speaker 3: version as well, and you can hopefully use it to 273 00:14:06,440 --> 00:14:10,120 Speaker 3: decrypt any of the ransomware that might have infected your system. 274 00:14:10,720 --> 00:14:13,600 Speaker 1: I reckon Japan are ahead of the game. They do 275 00:14:13,679 --> 00:14:15,680 Speaker 1: a lot of good stuff over there.
They seem to 276 00:14:15,720 --> 00:14:18,880 Speaker 1: be... they seem to have a good social conscience for 277 00:14:18,920 --> 00:14:23,920 Speaker 1: everyone as well. Like, they seem to genuinely care about people, 278 00:14:23,960 --> 00:14:30,040 Speaker 1: which is nice. So yeah. Is that... that's more... 279 00:14:30,640 --> 00:14:33,280 Speaker 1: that's not so much the individual out in suburbia, is 280 00:14:33,320 --> 00:14:36,040 Speaker 1: it? It's more companies and organizations that are going to have 281 00:14:36,080 --> 00:14:36,560 Speaker 1: that threat? 282 00:14:36,640 --> 00:14:39,840 Speaker 3: Well, not necessarily. No, everybody can be affected by it, 283 00:14:39,880 --> 00:14:43,560 Speaker 3: that's right. It's so prolific. And it could just be 284 00:14:44,120 --> 00:14:47,880 Speaker 3: that it's the freckle-faced kid who decides to try 285 00:14:47,920 --> 00:14:51,720 Speaker 3: to infect someone's computer and wants fifty bucks, down to, 286 00:14:52,200 --> 00:14:54,960 Speaker 3: you know, the big mega-corporation where they're trying to 287 00:14:55,000 --> 00:14:58,600 Speaker 3: extort twenty million dollars. You know, the problem is these tools, 288 00:14:58,720 --> 00:15:03,200 Speaker 3: these ransomware tools, are now fairly readily available to people 289 00:15:03,240 --> 00:15:05,160 Speaker 3: who know where to look for them in the dark, 290 00:15:05,200 --> 00:15:09,720 Speaker 3: deep corners of the Internet. And that's the problem. It's 291 00:15:09,720 --> 00:15:11,960 Speaker 3: making it easier for people to access this stuff. So 292 00:15:12,200 --> 00:15:14,560 Speaker 3: I just thought it was a little plus for anybody 293 00:15:14,600 --> 00:15:18,119 Speaker 3: that might potentially get ransomed. 294 00:15:19,160 --> 00:15:21,400 Speaker 1: I think I told you about this a year or 295 00:15:21,400 --> 00:15:24,360 Speaker 1: two ago on the show.
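[Editor's note: the mechanic Patrick describes, files scrambled with a key only the attacker holds and a free decryptor that reverses it once the key is recovered, can be sketched as a toy. This is a deliberately weak XOR illustration, not the actual cryptography Phobos or 8Base uses, and the sample text and key below are invented for the example.]

```python
# Toy illustration of the ransomware mechanic described above:
# the same secret key both scrambles the file and restores it,
# which is why victims are locked out and why a released
# decryptor (like the Japanese police tool) can undo the damage.
# Plain XOR is trivially breakable; real ransomware uses strong ciphers.
from itertools import cycle

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR data against a repeating key; applying it twice round-trips."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

document = b"Instructions for the piss-up when I die."  # invented sample
key = b"attacker-held secret"                           # invented sample

locked = xor_bytes(document, key)    # what the victim is left with
assert locked != document            # unreadable without the key
restored = xor_bytes(locked, key)    # what a decryptor with the key yields
assert restored == document
```

The whole point of the extortion is that without `key`, `locked` is useless to the victim; handing out the decryptor for free removes the attackers' leverage.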
Somebody sent me an email going, 296 00:15:24,840 --> 00:15:30,080 Speaker 1: we've got access to whatever... no, we've got... actually, it's 297 00:15:30,120 --> 00:15:33,480 Speaker 1: a video of you being inappropriate while watching... I'm 298 00:15:33,560 --> 00:15:37,520 Speaker 1: like... and if you don't send us whatever, we're going 299 00:15:37,560 --> 00:15:40,840 Speaker 1: to... I'm like, knock yourself out. Send it out. My 300 00:15:40,920 --> 00:15:45,880 Speaker 1: friends would love that. Wouldn't they love it? 301 00:15:47,000 --> 00:15:49,200 Speaker 4: When they sent it to me, I wasn't that impressed. 302 00:15:49,200 --> 00:15:53,840 Speaker 3: Actually, I'm on... but you didn't find the zoom function, did you? 303 00:15:54,080 --> 00:16:00,520 Speaker 6: Ah, is that right, Mister Ed? Hello, Wilbur. All right, 304 00:16:01,640 --> 00:16:05,200 Speaker 6: could you talk about Chinese authorities and how they're using 305 00:16:05,240 --> 00:16:06,560 Speaker 6: a new tool, Patrick? 306 00:16:06,920 --> 00:16:09,680 Speaker 1: Yeah, wow, I just jumped straight out of that. 307 00:16:11,560 --> 00:16:14,440 Speaker 3: Well, it's kind of interesting. We were talking about ransomware; we're 308 00:16:14,480 --> 00:16:18,960 Speaker 3: now talking about malware. Chinese authorities have worked out they 309 00:16:18,960 --> 00:16:22,760 Speaker 3: can use malware to break into people's phones when you 310 00:16:22,880 --> 00:16:25,400 Speaker 3: enter the country. We kind of heard about this in the United 311 00:16:25,440 --> 00:16:27,960 Speaker 3: States, when you travel to the United States, where your 312 00:16:28,000 --> 00:16:32,240 Speaker 3: phone could be seized and they demand that you give 313 00:16:32,720 --> 00:16:35,360 Speaker 3: them access.
And look, I don't know if there's due 314 00:16:35,400 --> 00:16:37,720 Speaker 3: cause or whatever, but now they're saying, if you go 315 00:16:37,800 --> 00:16:41,000 Speaker 3: to China, there's a real possibility that even if you 316 00:16:41,040 --> 00:16:43,760 Speaker 3: don't give them access, they can hack into your phone. 317 00:16:43,760 --> 00:16:46,160 Speaker 3: Physically they need to have it. But this is 318 00:16:46,160 --> 00:16:48,160 Speaker 3: kind of a warning for people who are traveling 319 00:16:48,200 --> 00:16:51,800 Speaker 3: to China. And the concern also is for Chinese nationals. 320 00:16:52,080 --> 00:16:54,840 Speaker 3: If you happen to be a journalist in China and 321 00:16:54,880 --> 00:16:58,480 Speaker 3: you say something that's contrary to state policy, you 322 00:16:58,840 --> 00:17:01,040 Speaker 3: know, that could be really concerning as well. 323 00:17:01,160 --> 00:17:04,280 Speaker 3: So this was in an article on TechCrunch, and 324 00:17:04,880 --> 00:17:07,959 Speaker 3: they were reporting on a mobile security company, a company 325 00:17:08,000 --> 00:17:11,560 Speaker 3: called Lookout, and they were saying that, yeah, with this new 326 00:17:11,600 --> 00:17:14,439 Speaker 3: tool, it now looks like they can hack into at least 327 00:17:14,560 --> 00:17:17,920 Speaker 3: all the Android phones out there. iPhones tend to be 328 00:17:17,960 --> 00:17:21,280 Speaker 3: a little bit harder to break into, and Apple have 329 00:17:21,400 --> 00:17:25,320 Speaker 3: had some really big kind of fights with, you know, 330 00:17:26,560 --> 00:17:29,920 Speaker 3: large kind of governments, and have kind of said to them, well, 331 00:17:29,920 --> 00:17:32,240 Speaker 3: we're not going to let you access our customers' phones. One 332 00:17:32,240 --> 00:17:33,800 Speaker 3: of the things we take a lot of pride in 333 00:17:34,160 --> 00:17:36,400 Speaker 3: is the security on our phones.
And that's a real 334 00:17:36,440 --> 00:17:40,760 Speaker 3: problem, because if, say, someone's accused of doing something nefarious, 335 00:17:40,840 --> 00:17:44,080 Speaker 3: or it might be terrorism-related, they may want to 336 00:17:44,119 --> 00:17:47,000 Speaker 3: get access to their phone data. And, you know, Apple's 337 00:17:47,080 --> 00:17:50,360 Speaker 3: kind of standing by its customers and saying, well, we 338 00:17:50,720 --> 00:17:53,359 Speaker 3: lock people out. We... you know, this is your privacy, 339 00:17:53,400 --> 00:17:56,000 Speaker 3: and we encrypt the data, and even if we could 340 00:17:56,040 --> 00:17:57,879 Speaker 3: open it, we couldn't see it. So that's the, 341 00:17:58,119 --> 00:18:01,399 Speaker 3: you know, the stand that Apple takes. But there 342 00:18:01,400 --> 00:18:03,520 Speaker 3: has been a big uptake of Apple phones. I thought 343 00:18:03,520 --> 00:18:05,760 Speaker 3: that was interesting when I was in China: when 344 00:18:05,800 --> 00:18:09,719 Speaker 3: I first went there in twenty thirteen, nobody had 345 00:18:09,840 --> 00:18:13,800 Speaker 3: iPhones at all, and everybody was using Android, mainly 346 00:18:13,880 --> 00:18:16,880 Speaker 3: Samsung and Huawei and Xiaomi and all that sort 347 00:18:16,920 --> 00:18:19,399 Speaker 3: of thing. And then the next time I was there, maybe 348 00:18:19,560 --> 00:18:22,200 Speaker 3: five or six years later, I saw a lot more 349 00:18:22,359 --> 00:18:25,520 Speaker 3: iPhones out there, which was interesting. And Australia has a 350 00:18:25,560 --> 00:18:27,520 Speaker 3: really high percentage of people on iPhones. 351 00:18:28,000 --> 00:18:31,480 Speaker 1: Yeah, just kind of curious, but anyway. So I should 352 00:18:31,600 --> 00:18:33,480 Speaker 1: know this, but they're all made in China, 353 00:18:33,320 --> 00:18:37,480 Speaker 3: right? No, not all phones are made in China. The iPhone...
354 00:18:38,600 --> 00:18:41,680 Speaker 3: That's a really good question. I don't know, Craig-o, I'm 355 00:18:41,680 --> 00:18:44,080 Speaker 3: not sure. I know, can you find that out? I 356 00:18:44,119 --> 00:18:47,000 Speaker 3: know Trump was trying to get them made, or maybe 357 00:18:47,359 --> 00:18:50,080 Speaker 3: assembled, in the United States, so maybe some of the 358 00:18:50,160 --> 00:18:52,919 Speaker 3: core components, and then they're assembled, because that was the 359 00:18:52,920 --> 00:18:55,639 Speaker 3: big thing with tariffs, that, you know, there was a 360 00:18:55,640 --> 00:18:58,040 Speaker 3: bit of a panic for the likes of iPhone, even 361 00:18:58,080 --> 00:19:01,679 Speaker 3: though it's an American company. Apple is obviously a mega 362 00:19:02,400 --> 00:19:06,719 Speaker 3: American company, but because they're manufacturing offshore, it means 363 00:19:06,760 --> 00:19:09,280 Speaker 3: that they were subject to tariffs, and then that was 364 00:19:09,359 --> 00:19:12,119 Speaker 3: lifted for ninety days, if you remember, right. 365 00:19:13,240 --> 00:19:16,200 Speaker 1: Can I raise a topic that isn't on your list, 366 00:19:16,200 --> 00:19:19,359 Speaker 1: but I think you'll find it interesting nonetheless, because it 367 00:19:19,400 --> 00:19:24,400 Speaker 1: intersects your world and mine. So there's a whole lot 368 00:19:24,440 --> 00:19:27,280 Speaker 1: of, I don't know what the word is, but I'm 369 00:19:27,320 --> 00:19:29,439 Speaker 1: going to say angst, it's not the right word.
But 370 00:19:29,480 --> 00:19:33,440 Speaker 1: it's in the ballpark, with universities around Australia and around 371 00:19:33,440 --> 00:19:35,240 Speaker 1: the world at the moment trying to figure out how 372 00:19:35,240 --> 00:19:40,159 Speaker 1: the fuck to navigate undergrad degrees and postgrad degrees and 373 00:19:40,280 --> 00:19:44,800 Speaker 1: people, and how AI is going to be integrated into 374 00:19:44,840 --> 00:19:49,760 Speaker 1: the academic process, because it is so easy to cheat now, 375 00:19:50,080 --> 00:19:54,879 Speaker 1: especially with undergrad degrees where you've basically got to do 376 00:19:54,960 --> 00:20:02,480 Speaker 1: assignments and research projects, primarily. Obviously they can't 377 00:20:02,480 --> 00:20:06,159 Speaker 1: say to students you can't use AI, because it's past 378 00:20:06,200 --> 00:20:09,720 Speaker 1: that point. But they're now trying to figure out 379 00:20:09,960 --> 00:20:12,280 Speaker 1: how to do that, what that 380 00:20:12,359 --> 00:20:16,760 Speaker 1: looks like. And I was talking to my senior supervisor yesterday, Chris, 381 00:20:18,280 --> 00:20:21,760 Speaker 1: who's, you know, a professor, got two PhDs, blah blah blah. 382 00:20:21,800 --> 00:20:24,800 Speaker 1: And he's actually in the middle of conversations with my 383 00:20:25,000 --> 00:20:27,679 Speaker 1: university around this at the moment. And one of the 384 00:20:27,760 --> 00:20:31,000 Speaker 1: challenges is that the people who are very high up 385 00:20:31,040 --> 00:20:34,760 Speaker 1: in these organizations are generally older and they don't really 386 00:20:34,960 --> 00:20:39,719 Speaker 1: understand AI. But then all the people who do 387 00:20:39,760 --> 00:20:43,200 Speaker 1: have a great understanding are not the decision makers.
So it's 388 00:20:43,280 --> 00:20:46,520 Speaker 1: it's going to be interesting moving forward to see how, 389 00:20:47,600 --> 00:20:49,600 Speaker 1: you know... like, my PhD is going to be, by 390 00:20:49,600 --> 00:20:53,000 Speaker 1: the time I finish, six years of work, but foreseeably 391 00:20:53,480 --> 00:20:56,239 Speaker 1: done the right way, and I put an 392 00:20:56,280 --> 00:20:59,639 Speaker 1: asterisk next to that. Somebody could do all of the 393 00:20:59,680 --> 00:21:02,880 Speaker 1: work in a PhD in four weeks if they knew 394 00:21:02,880 --> 00:21:05,719 Speaker 1: what they were doing, depending on the type. Like, they 395 00:21:05,720 --> 00:21:10,000 Speaker 1: couldn't do mine, because mine was independent research, right, as 396 00:21:10,040 --> 00:21:13,200 Speaker 1: in people in rooms and all of that. But yeah, 397 00:21:13,200 --> 00:21:15,240 Speaker 1: it's going to be interesting to see how that unfolds 398 00:21:15,240 --> 00:21:17,120 Speaker 1: over the next one, two, five years. 399 00:21:17,480 --> 00:21:19,920 Speaker 3: Yes, because you could do a PhD based on someone 400 00:21:19,960 --> 00:21:23,720 Speaker 3: else's research, can't you, or the data from research? Well, 401 00:21:23,800 --> 00:21:24,600 Speaker 3: you can... 402 00:21:24,480 --> 00:21:27,280 Speaker 1: Well, you can do papers. You probably couldn't do an 403 00:21:27,440 --> 00:21:30,960 Speaker 1: entire PhD, but you can. Like, one of my papers 404 00:21:31,040 --> 00:21:33,320 Speaker 1: is called a systematic review, which is where I look 405 00:21:33,359 --> 00:21:35,879 Speaker 1: at basically all of the work that's been done in 406 00:21:35,880 --> 00:21:42,199 Speaker 1: a particular area, in this case, meta-accuracy.
And I 407 00:21:42,240 --> 00:21:46,240 Speaker 1: started with sixteen hundred papers, or sixteen hundred journal articles, 408 00:21:46,720 --> 00:21:50,720 Speaker 1: research projects from different people, and distilled it to one 409 00:21:50,800 --> 00:21:54,159 Speaker 1: hundred and sixteen, and then created a table out of that, 410 00:21:54,200 --> 00:22:00,000 Speaker 1: and that's become the core of this one paper. But yeah, 411 00:21:59,600 --> 00:22:03,720 Speaker 1: there would definitely be a way, depending on the design 412 00:22:03,800 --> 00:22:07,520 Speaker 1: of your PhD and the field of research, where somebody 413 00:22:07,560 --> 00:22:13,080 Speaker 1: could cheat and produce the equivalent work of a four 414 00:22:13,200 --> 00:22:16,000 Speaker 1: or five year journey in a month, probably, at best. 415 00:22:16,240 --> 00:22:18,680 Speaker 3: I think the question for me when I think about this, 416 00:22:18,720 --> 00:22:21,800 Speaker 3: because I haven't obviously done a PhD, but: how much 417 00:22:21,840 --> 00:22:24,800 Speaker 3: of the work that you did was kind of manual 418 00:22:25,000 --> 00:22:28,720 Speaker 3: grunt work as opposed to analysis? So say, for example, 419 00:22:28,760 --> 00:22:31,320 Speaker 3: you took all that data and you said, how many 420 00:22:31,480 --> 00:22:35,480 Speaker 3: articles referenced this phrase? Now you could go through manually 421 00:22:35,560 --> 00:22:37,359 Speaker 3: and say, oh, yeah, that one did, that one didn't, 422 00:22:37,359 --> 00:22:39,400 Speaker 3: that one didn't, that one did. But you could also 423 00:22:39,520 --> 00:22:42,120 Speaker 3: plug it into AI and it would instantly give you 424 00:22:42,440 --> 00:22:44,959 Speaker 3: all the articles that reference that phrase. So I'm thinking 425 00:22:45,040 --> 00:22:48,080 Speaker 3: in terms of writing something down and adding it up 426 00:22:48,119 --> 00:22:51,200 Speaker 3: and using a calculator.
You know, it just makes life easier. 427 00:22:51,520 --> 00:22:54,960 Speaker 3: So I think that, and you're right, because we're still 428 00:22:54,960 --> 00:22:58,440 Speaker 3: coming to terms with AI. You know, your six year 429 00:22:58,520 --> 00:23:02,440 Speaker 3: project maybe could have been done in two years, because 430 00:23:02,480 --> 00:23:05,240 Speaker 3: the grunt work could have been done by AI. You know, 431 00:23:05,359 --> 00:23:07,520 Speaker 3: there'd be validation and all that sort of thing. But 432 00:23:07,800 --> 00:23:10,520 Speaker 3: I'm thinking it comes back to, what's the 433 00:23:10,520 --> 00:23:14,120 Speaker 3: Spider-Man quote, with great power comes great responsibility. 434 00:23:14,480 --> 00:23:16,160 Speaker 1: I love a good Spider-Man reference. 435 00:23:16,480 --> 00:23:19,480 Speaker 3: Yeah, I know, I love that. On a side note, 436 00:23:19,520 --> 00:23:21,639 Speaker 3: you didn't see the video this morning of the guy 437 00:23:21,760 --> 00:23:23,960 Speaker 3: who went to court, you know, the graffiti guy who did...? 438 00:23:24,280 --> 00:23:27,200 Speaker 1: Yeah, I saw him on the news wearing Spider-Man. 439 00:23:27,280 --> 00:23:30,800 Speaker 3: Yeah. But getting back to this, so, you know, so 440 00:23:31,480 --> 00:23:36,080 Speaker 3: potentially your good use of AI could streamline what 441 00:23:36,400 --> 00:23:38,680 Speaker 3: is a very laborious process. And that's what I was 442 00:23:38,760 --> 00:23:41,080 Speaker 3: kind of getting to, using the tools in the right 443 00:23:41,119 --> 00:23:43,040 Speaker 3: way. 444 00:23:43,000 --> 00:23:45,800 Speaker 1: And I think, look, there are... like, I had to read 445 00:23:45,840 --> 00:23:48,720 Speaker 1: through hundreds and hundreds of journal articles, and you can't 446 00:23:48,760 --> 00:23:51,600 Speaker 1: get the knowledge otherwise. Like, you actually need to read them.
447 00:23:52,359 --> 00:23:57,320 Speaker 1: At the very least, you need to do a solid 448 00:23:57,359 --> 00:23:59,240 Speaker 1: skim, which could be half an hour or an hour 449 00:23:59,280 --> 00:24:01,480 Speaker 1: on a paper, if not read it start to finish, right? 450 00:24:03,000 --> 00:24:07,520 Speaker 1: But then, yeah, there are certainly things where I could say, listen, 451 00:24:09,040 --> 00:24:13,360 Speaker 1: I want... show me some research that supports this idea, 452 00:24:14,160 --> 00:24:16,919 Speaker 1: you know, and like, here's the idea, bubba, and 453 00:24:16,960 --> 00:24:18,800 Speaker 1: then it'll give me a bunch of papers, a bunch 454 00:24:18,800 --> 00:24:22,800 Speaker 1: of references or quotes, you know, the name of the 455 00:24:22,840 --> 00:24:25,320 Speaker 1: study, the year of the study, the authors of the study, 456 00:24:25,840 --> 00:24:28,080 Speaker 1: and then I can decide whether or not I want 457 00:24:28,080 --> 00:24:31,000 Speaker 1: to integrate that into my research, you know. But that's, 458 00:24:31,080 --> 00:24:34,080 Speaker 1: that's a good use, you know. But there are, 459 00:24:35,480 --> 00:24:38,800 Speaker 1: you know, some students now that are getting AI to 460 00:24:39,040 --> 00:24:42,600 Speaker 1: actually write their work. Like, there will be a homework 461 00:24:42,680 --> 00:24:45,919 Speaker 1: question, and it'll say, you know, do a project on 462 00:24:46,000 --> 00:24:50,080 Speaker 1: this and that, exploring this particular whatever, and 463 00:24:50,440 --> 00:24:53,440 Speaker 1: AI can produce that, you know, and then they can 464 00:24:53,520 --> 00:24:58,119 Speaker 1: jump in and manipulate it a little bit. So there's a 465 00:24:58,200 --> 00:25:01,639 Speaker 1: thing called, there's a thing called Turnitin.
I 466 00:25:01,680 --> 00:25:05,800 Speaker 1: think that's what it's called, where you actually, you can 467 00:25:05,880 --> 00:25:08,240 Speaker 1: put the paper that you wrote, whether or not you 468 00:25:08,320 --> 00:25:12,560 Speaker 1: wrote it, into this program, and it 469 00:25:12,600 --> 00:25:16,120 Speaker 1: will give you a Turnitin score, basically, on how 470 00:25:16,200 --> 00:25:19,720 Speaker 1: much of it has been ripped off directly from other sources. 471 00:25:20,520 --> 00:25:23,439 Speaker 1: The problem now is AI can write it in a 472 00:25:23,480 --> 00:25:26,200 Speaker 1: way where the Turnitin score comes out like it's 473 00:25:26,240 --> 00:25:29,760 Speaker 1: completely original from you, despite the fact that it's totally 474 00:25:29,800 --> 00:25:32,840 Speaker 1: not original. I could have a bit of that wrong, 475 00:25:32,880 --> 00:25:34,040 Speaker 1: but that's my understanding. 476 00:25:34,400 --> 00:25:37,640 Speaker 3: You have checks and measures, because I know that we've 477 00:25:37,720 --> 00:25:40,919 Speaker 3: hung out before where you've had to prepare for presentations. 478 00:25:40,920 --> 00:25:44,359 Speaker 3: So throughout the whole process there are checks and measures 479 00:25:44,359 --> 00:25:47,960 Speaker 3: along the way, and there is an oral presentation where 480 00:25:48,320 --> 00:25:51,160 Speaker 3: you really need to be grilled on what your knowledge 481 00:25:51,160 --> 00:25:54,440 Speaker 3: base is. So I guess the onus is now going 482 00:25:54,480 --> 00:25:58,320 Speaker 3: to be on the institutions, universities, to try to come 483 00:25:58,400 --> 00:26:02,200 Speaker 3: up with ways to make sure that someone like yourself 484 00:26:02,680 --> 00:26:06,639 Speaker 3: could be, through those processes, grilled on whatever the topic is.
485 00:26:06,680 --> 00:26:08,119 Speaker 3: I don't know, but then you've got to have smart 486 00:26:08,160 --> 00:26:10,200 Speaker 3: people there who can understand what you're talking about. 487 00:26:10,400 --> 00:26:12,560 Speaker 1: Yeah, that's true. Tiff, did you find out? 488 00:26:12,800 --> 00:26:14,119 Speaker 4: Oh, thanks for coming back to me. 489 00:26:15,359 --> 00:26:17,120 Speaker 1: Yeah, do you feel a bit...? 490 00:26:17,800 --> 00:26:20,480 Speaker 2: I feel like we're all walking through the forest together, and 491 00:26:20,480 --> 00:26:22,560 Speaker 2: you guys throw a stick and I go chase it, 492 00:26:22,600 --> 00:26:24,800 Speaker 2: and then you turn and change direction, and I'm like, 493 00:26:25,040 --> 00:26:26,359 Speaker 2: where are they? Wait for me! 494 00:26:27,119 --> 00:26:30,760 Speaker 3: I reckon that's the best analogy ever. 495 00:26:32,760 --> 00:26:34,000 Speaker 4: I was just sitting over here wagging 496 00:26:34,040 --> 00:26:36,840 Speaker 2: my tail. Ninety percent of phones, and this is the same for 497 00:26:36,880 --> 00:26:40,359 Speaker 2: Android, are made in China, and the other ten percent 498 00:26:40,560 --> 00:26:43,840 Speaker 2: are mostly India and Vietnam. 499 00:26:44,240 --> 00:26:48,280 Speaker 1: The problem is, not the problem, but the benefit slash 500 00:26:48,400 --> 00:26:50,600 Speaker 1: problem, is that they do a really good job. They're 501 00:26:50,600 --> 00:26:54,359 Speaker 1: really cost effective, you know. And then to 502 00:26:54,600 --> 00:26:57,360 Speaker 1: do the same thing in Australia, they probably, 503 00:26:57,440 --> 00:27:00,000 Speaker 1: or they potentially, wouldn't be as good, and they would 504 00:27:00,080 --> 00:27:02,800 Speaker 1: cost two or three times as much, and we probably 505 00:27:02,840 --> 00:27:04,399 Speaker 1: don't have the capacity to do it. 506 00:27:04,840 --> 00:27:06,720 Speaker 3: Yeah. Tiff is a star, mate.
507 00:27:07,640 --> 00:27:11,080 Speaker 1: Yeah, okay. Brian, how are you? Well, thanks. Tiff, don't... 508 00:27:11,160 --> 00:27:14,600 Speaker 1: feel free to jump in. You're an integral part 509 00:27:14,640 --> 00:27:17,280 Speaker 1: of the conversation. We love you. 510 00:27:17,359 --> 00:27:18,040 Speaker 3: Oh yeah. 511 00:27:18,200 --> 00:27:20,359 Speaker 1: And also, I just want you to know you're seen. 512 00:27:20,960 --> 00:27:22,800 Speaker 4: I was just waiting for one of you to take 513 00:27:22,840 --> 00:27:23,880 Speaker 4: a breath. That's all. 514 00:27:25,200 --> 00:27:27,720 Speaker 3: You'll be waiting a while. Have you not met us before? 515 00:27:28,840 --> 00:27:31,960 Speaker 1: I think Chatty McChatster in the International Space Station's 516 00:27:32,000 --> 00:27:33,280 Speaker 1: got you and me both covered. 517 00:27:35,760 --> 00:27:38,800 Speaker 3: Wow. Hey, you know, talking... just to continue the theme 518 00:27:38,840 --> 00:27:43,879 Speaker 3: of AI, there's real concerns with a lot of 519 00:27:43,960 --> 00:27:50,960 Speaker 3: news sites about Google's new way of summarizing. So, I 520 00:27:50,960 --> 00:27:52,600 Speaker 3: don't know if you've noticed, you do a Google search 521 00:27:52,640 --> 00:27:54,679 Speaker 3: now and it gives you an AI summary first. 522 00:27:55,280 --> 00:27:56,200 Speaker 1: Yeah. Yeah. 523 00:27:56,280 --> 00:27:59,320 Speaker 3: Now the problem is this: a new study, this has 524 00:27:59,320 --> 00:28:01,400 Speaker 3: come out of the Guardian newspaper, it's an exclusive 525 00:28:01,400 --> 00:28:06,720 Speaker 3: in the Guardian, and the study claims that previously sites 526 00:28:06,760 --> 00:28:10,119 Speaker 3: that were ranked really high can lose up to seventy 527 00:28:10,200 --> 00:28:13,840 Speaker 3: nine percent of all their traffic, because people aren't bothering 528 00:28:13,920 --> 00:28:16,040 Speaker 3: to go to the site.
They're just using the AI 529 00:28:16,240 --> 00:28:20,680 Speaker 3: overview and they're not drilling down any further. So there's 530 00:28:20,720 --> 00:28:25,080 Speaker 3: a massive drop in online site traffic purely because people 531 00:28:25,160 --> 00:28:27,679 Speaker 3: are just going with the summary, because the AI summary 532 00:28:27,760 --> 00:28:29,919 Speaker 3: is pretty good. I mean, I do use it a 533 00:28:29,920 --> 00:28:34,280 Speaker 3: little bit. I find that sometimes, maybe thirty percent of 534 00:28:34,320 --> 00:28:36,960 Speaker 3: the time, it's inaccurate and I have to look further. 535 00:28:37,560 --> 00:28:40,040 Speaker 3: But that's problematic. And generally I tend to use it 536 00:28:40,040 --> 00:28:43,200 Speaker 3: for technical questions. It might be in relation to, say, 537 00:28:43,400 --> 00:28:45,880 Speaker 3: you know, something I'm doing in some web design and 538 00:28:45,920 --> 00:28:49,680 Speaker 3: I need some help going through a process. So I've 539 00:28:49,760 --> 00:28:52,600 Speaker 3: found that that's not always accurate, and I really honestly 540 00:28:52,640 --> 00:28:54,719 Speaker 3: think probably about thirty percent of the stuff can 541 00:28:54,760 --> 00:28:57,640 Speaker 3: be a bit crap. But this is really causing a 542 00:28:57,640 --> 00:29:01,560 Speaker 3: lot of concern for those websites that rely on 543 00:29:01,720 --> 00:29:05,560 Speaker 3: that organic search traffic. So there's two types of search.
544 00:29:05,600 --> 00:29:07,320 Speaker 3: When you... I don't want to kind of go into 545 00:29:07,360 --> 00:29:10,680 Speaker 3: the boring territory, but there's paid advertising, where you see 546 00:29:10,720 --> 00:29:13,440 Speaker 3: a little sponsored ad and you can pay to bump up 547 00:29:13,600 --> 00:29:16,320 Speaker 3: that information, or there's what we call organic, and 548 00:29:16,360 --> 00:29:20,959 Speaker 3: that's really hard earned, by creating content that is related 549 00:29:21,080 --> 00:29:23,800 Speaker 3: to a specific topic, and then Google says, oh, yeah, this 550 00:29:23,840 --> 00:29:26,080 Speaker 3: is really interesting. You know, if I looked up The 551 00:29:26,120 --> 00:29:30,120 Speaker 3: You Project, then your website would be very high in 552 00:29:30,160 --> 00:29:32,880 Speaker 3: that list, because you're the authority when it comes to 553 00:29:32,920 --> 00:29:35,600 Speaker 3: The You Project, because you are The You Project. Or 554 00:29:35,600 --> 00:29:37,840 Speaker 3: if it was Roll with the Punches, you know, Tiff's 555 00:29:37,880 --> 00:29:40,720 Speaker 3: information would come up there. But if you're just getting 556 00:29:40,720 --> 00:29:43,640 Speaker 3: an AI overview, you might not bother going to Tiff's 557 00:29:43,640 --> 00:29:46,120 Speaker 3: website or to your website. You might think, oh, well, 558 00:29:46,120 --> 00:29:48,560 Speaker 3: that's fine, Craig's the host of The You Project. I 559 00:29:48,560 --> 00:29:50,840 Speaker 3: don't need to find out anything else, because that's given 560 00:29:50,840 --> 00:29:53,320 Speaker 3: me the summary.
And this is the worry for people 561 00:29:53,320 --> 00:29:57,080 Speaker 3: who run their businesses and want to sell product, 562 00:29:57,160 --> 00:29:59,640 Speaker 3: or want to serve up news services, that are 563 00:30:00,040 --> 00:30:03,400 Speaker 3: basically going to lose seventy or eighty percent of all 564 00:30:03,480 --> 00:30:06,520 Speaker 3: their search traffic, swings and roundabouts. 565 00:30:06,760 --> 00:30:09,800 Speaker 1: It's good for, it's good for the user, maybe, that 566 00:30:09,840 --> 00:30:12,320 Speaker 1: they can get a quick snapshot without having to go 567 00:30:12,440 --> 00:30:15,480 Speaker 1: down the rabbit hole. But yeah, for the, for 568 00:30:15,560 --> 00:30:20,200 Speaker 1: the people on the other side, not so great. Do 569 00:30:20,240 --> 00:30:24,160 Speaker 1: you say project or project? Project, project? Yeah? 570 00:30:24,280 --> 00:30:25,120 Speaker 3: What did I just say? 571 00:30:25,760 --> 00:30:25,960 Speaker 6: Yeah? 572 00:30:26,000 --> 00:30:29,160 Speaker 1: You always say project. I'm like, it's not, it's 573 00:30:29,160 --> 00:30:32,920 Speaker 1: not bad, but it's like... most people say... Yeah, I 574 00:30:32,960 --> 00:30:33,920 Speaker 1: wonder why that is. 575 00:30:34,840 --> 00:30:36,720 Speaker 3: I don't know. Now I'm getting self-conscious about it. 576 00:30:36,760 --> 00:30:38,040 Speaker 3: I didn't even know that I did it. 577 00:30:38,320 --> 00:30:40,320 Speaker 1: Oh, you don't need to at all. It's, I think, 578 00:30:40,360 --> 00:30:41,600 Speaker 1: it's interesting. 579 00:30:42,000 --> 00:30:44,680 Speaker 3: Just thinking also about this, you know, when the AI summary 580 00:30:44,720 --> 00:30:48,480 Speaker 3: comes up, it pushes everything down. And there's an interesting 581 00:30:48,640 --> 00:30:52,000 Speaker 3: term used in web design called the first fold, and 582 00:30:52,040 --> 00:30:55,240 Speaker 3: the first fold refers to the old newspaper term.
You know, 583 00:30:55,320 --> 00:30:58,440 Speaker 3: when you read the paper, it was folded in half, and 584 00:30:58,480 --> 00:31:00,720 Speaker 3: you had to unfold it to read the rest of it. 585 00:31:00,920 --> 00:31:05,280 Speaker 3: And that's what's before you scroll on a screen, generally. 586 00:31:05,280 --> 00:31:09,000 Speaker 3: Obviously it's mainly on a desktop, but before you start scrolling, 587 00:31:09,160 --> 00:31:12,120 Speaker 3: that's referred to as the first fold. Now the AI information 588 00:31:12,520 --> 00:31:15,920 Speaker 3: is pushing everything down. Then the legitimate results, or the 589 00:31:16,000 --> 00:31:18,239 Speaker 3: other results, are pushed further down the screen, and then 590 00:31:18,240 --> 00:31:20,560 Speaker 3: you've got to scribby scroll further to try to get 591 00:31:20,600 --> 00:31:21,440 Speaker 3: to them. 592 00:31:21,760 --> 00:31:25,520 Speaker 1: Hey, staying in the AI theme. Yeah, more people are 593 00:31:25,600 --> 00:31:28,840 Speaker 1: considering AI lovers, and we shouldn't judge. 594 00:31:29,360 --> 00:31:32,240 Speaker 3: Yeah. What do you... well, I've thought about this a lot, 595 00:31:32,480 --> 00:31:35,760 Speaker 3: and interestingly, I was having this discussion with my seventeen 596 00:31:35,840 --> 00:31:38,360 Speaker 3: year old employee, about to turn eighteen. He's got a 597 00:31:38,360 --> 00:31:39,240 Speaker 3: girlfriend, having... 598 00:31:39,160 --> 00:31:48,680 Speaker 1: This conversation with my virtual girlfriend. Look, sorry, sorry, boyfriend. 599 00:31:49,760 --> 00:31:54,360 Speaker 1: Chad... Chuck, Chuck, Chuck the AI boyfriend.
600 00:31:54,760 --> 00:31:58,320 Speaker 3: Now, look, I'm really in two minds about this, because I 601 00:31:59,000 --> 00:32:03,200 Speaker 3: love the idea of AI being used, and particularly our 602 00:32:03,280 --> 00:32:09,160 Speaker 3: local medical clinic has an aged care residence, and I've 603 00:32:09,160 --> 00:32:11,520 Speaker 3: been so excited to get there, because they've got a 604 00:32:11,520 --> 00:32:14,480 Speaker 3: little robot that talks to the residents, and the residents 605 00:32:14,520 --> 00:32:17,280 Speaker 3: interact with the robot, and as the AI gets to 606 00:32:17,320 --> 00:32:20,240 Speaker 3: know the residents, it interacts with them and may talk 607 00:32:20,240 --> 00:32:22,760 Speaker 3: about things that they're more interested in, or gets to 608 00:32:22,800 --> 00:32:25,840 Speaker 3: know who they are, and it's using these great features. Now, 609 00:32:26,200 --> 00:32:30,360 Speaker 3: it's an inanimate object that is interactive, and it's not 610 00:32:30,440 --> 00:32:34,040 Speaker 3: a person. But I can see that these people are 611 00:32:34,120 --> 00:32:36,720 Speaker 3: really getting a lot of value out of this interaction. 612 00:32:37,080 --> 00:32:41,360 Speaker 3: So would you fall in love with an AI? It's speculative, but 613 00:32:41,600 --> 00:32:45,440 Speaker 3: it's suggested that it's inevitable. Like, there's a thought process 614 00:32:45,440 --> 00:32:48,000 Speaker 3: out there that, eventually, if you build up a rapport 615 00:32:48,240 --> 00:32:51,160 Speaker 3: or a repertoire or whatever with your chatbot...
But I 616 00:32:51,480 --> 00:32:55,640 Speaker 3: mean, I was talking to, you know, to Caspa last night, 617 00:32:55,760 --> 00:32:59,360 Speaker 3: and, you know, young guy who, you know, he's got 618 00:32:59,360 --> 00:33:02,520 Speaker 3: a girlfriend and is really enamored, and it's lovely to see, 619 00:33:02,640 --> 00:33:04,400 Speaker 3: you know, you have one of your first loves and 620 00:33:04,400 --> 00:33:06,520 Speaker 3: all that sort of stuff. And of course he couldn't 621 00:33:06,560 --> 00:33:10,160 Speaker 3: conceive of the concept of having a chatbot that you 622 00:33:10,240 --> 00:33:13,160 Speaker 3: fall in love with. But look, you know, we 623 00:33:13,320 --> 00:33:16,080 Speaker 3: talk about love in different ways. We have friends that 624 00:33:16,120 --> 00:33:20,040 Speaker 3: we care about and that we love. I love my dog, 625 00:33:20,960 --> 00:33:23,080 Speaker 3: and I could possibly say that I love my dog 626 00:33:23,160 --> 00:33:26,520 Speaker 3: more than some people. But what I'm saying is, I'm 627 00:33:26,560 --> 00:33:29,200 Speaker 3: not in love. But you can love or feel a 628 00:33:29,240 --> 00:33:32,040 Speaker 3: connection in different ways, and it may not be so 629 00:33:32,160 --> 00:33:35,360 Speaker 3: much romantic, but it could be more empathetic, you know. 630 00:33:35,440 --> 00:33:39,000 Speaker 3: So I feel that it may be inevitable that, as 631 00:33:39,040 --> 00:33:44,400 Speaker 3: we form these relationships with these online AI entities, 632 00:33:44,440 --> 00:33:47,600 Speaker 3: we start to feel a real emotional connection to them. 633 00:33:48,000 --> 00:33:50,120 Speaker 3: But it opens up a can of worms, because what 634 00:33:50,880 --> 00:33:55,880 Speaker 3: if Craig dot ai suddenly goes bust and my AI 635 00:33:56,360 --> 00:34:00,280 Speaker 3: Harper chatbot is suddenly gone forever, and I feel like 636 00:34:00,320 --> 00:34:01,800 Speaker 3: I've lost Craig, you know?
637 00:34:02,400 --> 00:34:05,240 Speaker 1: So that's... I think you nailed it. Like, I was 638 00:34:05,680 --> 00:34:10,440 Speaker 1: going to say, I don't think, you know, falling in 639 00:34:10,520 --> 00:34:14,000 Speaker 1: love with it is kind of the hook of the article, 640 00:34:14,840 --> 00:34:17,680 Speaker 1: and maybe some people do, but I think it's, yeah, 641 00:34:17,760 --> 00:34:20,879 Speaker 1: it's more about an emotional connection or a bond, where 642 00:34:20,920 --> 00:34:25,279 Speaker 1: people feel, I don't know, I feel like they have 643 00:34:25,560 --> 00:34:31,440 Speaker 1: a relationship or a bond or a connection or something 644 00:34:31,480 --> 00:34:34,320 Speaker 1: that has a level of consciousness and awareness, that knows 645 00:34:34,400 --> 00:34:38,600 Speaker 1: their name, that talks to them, that remembers things about them. 646 00:34:39,080 --> 00:34:43,360 Speaker 1: So here's my funny little story, right. ChatGPT always 647 00:34:43,360 --> 00:34:45,759 Speaker 1: says something to me like, hey, Craig, what's up? Or, hey, 648 00:34:45,760 --> 00:34:47,799 Speaker 1: that's a great question, Craig, let me think about it, 649 00:34:47,880 --> 00:34:50,120 Speaker 1: or, hey, Craig, hope you're having a nice day, all this. And 650 00:34:50,120 --> 00:34:52,439 Speaker 1: the other day it responded to something and it just said, 651 00:34:52,920 --> 00:34:56,799 Speaker 1: let me check. I'm like, where's my name, bro? Like, 652 00:34:56,920 --> 00:34:59,759 Speaker 1: I felt a bit hurt. I felt a little bit 653 00:34:59,880 --> 00:35:02,759 Speaker 1: like, why did you not... why are you not using my name? 654 00:35:03,920 --> 00:35:06,600 Speaker 1: And just for one moment, not really hurt, but I 655 00:35:06,840 --> 00:35:09,239 Speaker 1: just noticed it, and I'm like, I don't like it 656 00:35:09,280 --> 00:35:12,759 Speaker 1: when it doesn't use my name.
I'm like, I may 657 00:35:12,800 --> 00:35:15,520 Speaker 1: have a relationship with ChatGPT. I'm not sure. I'm 658 00:35:15,560 --> 00:35:16,319 Speaker 1: working through it. 659 00:35:16,840 --> 00:35:19,200 Speaker 3: Tiff, have you ever noticed how Craig looks at his 660 00:35:19,280 --> 00:35:20,839 Speaker 3: motorbikes when he's showing them off? 661 00:35:22,440 --> 00:35:24,040 Speaker 1: I never show my motorbikes off. 662 00:35:24,120 --> 00:35:27,000 Speaker 3: He did show off. No, no, you did kind of 663 00:35:27,120 --> 00:35:27,960 Speaker 3: look at... but... 664 00:35:28,160 --> 00:35:30,960 Speaker 2: I know you're asking someone that probably looks at motorbikes 665 00:35:30,960 --> 00:35:32,759 Speaker 2: the same way, so I'm not... I don't see a 666 00:35:32,760 --> 00:35:33,319 Speaker 2: problem here. 667 00:35:33,640 --> 00:35:36,239 Speaker 3: We can feel a fondness for objects that, as we know, have 668 00:35:36,680 --> 00:35:40,719 Speaker 3: got sentimental value. You may not be able to bear 669 00:35:40,760 --> 00:35:42,600 Speaker 3: to think that it was not there with you, and 670 00:35:42,600 --> 00:35:46,840 Speaker 3: it could be something that has no intrinsic kind of 671 00:35:46,880 --> 00:35:50,480 Speaker 3: monetary value, but because you've had it since... I've got 672 00:35:50,480 --> 00:35:53,680 Speaker 3: a pencil case that I had in primary school, still 673 00:35:53,800 --> 00:35:56,560 Speaker 3: tucked away somewhere, and every now and again I happen 674 00:35:56,560 --> 00:35:58,840 Speaker 3: across it and open it and I think, oh, there's my 675 00:35:59,080 --> 00:36:03,719 Speaker 3: 2B. I remember using that 2B, drawing pictures. 676 00:36:04,760 --> 00:36:06,480 Speaker 4: A motorbike with a pencil case. 677 00:36:07,239 --> 00:36:12,239 Speaker 1: Well, hey, can I just say, speaking of memories, my 678 00:36:13,680 --> 00:36:17,640 Speaker 1: screen flashes up news stories in little bubbles.
I don't 679 00:36:17,680 --> 00:36:19,799 Speaker 1: know why, you can fix that, Patrick, next time you're 680 00:36:19,840 --> 00:36:24,719 Speaker 1: here. But Hulk Hogan just died. Now, I know neither 681 00:36:24,760 --> 00:36:28,080 Speaker 1: of you were probably big wrestling fans. Do you both 682 00:36:28,160 --> 00:36:32,560 Speaker 1: know who he is? Of course I do. Yeah, yeah, yeah. 683 00:36:32,600 --> 00:36:35,279 Speaker 1: So that's sad, Hulk Hogan. Did you ever watch the 684 00:36:35,440 --> 00:36:37,120 Speaker 1: WWE, Patrick, or Tiff? 685 00:36:38,000 --> 00:36:40,799 Speaker 3: There was a wrestling program on a Sunday when I 686 00:36:40,840 --> 00:36:44,040 Speaker 3: was a kid that we used to watch, Mario Milano. 687 00:36:44,560 --> 00:36:47,000 Speaker 1: No, no, dude, no, that... 688 00:36:48,480 --> 00:36:53,680 Speaker 3: No, no, that's... the wrestling thing. Wasn't there, with Mario? 689 00:36:53,880 --> 00:36:58,400 Speaker 1: Yeah, that's true. That was like World Championship Wrestling. 690 00:36:58,760 --> 00:37:02,720 Speaker 3: Well, that was WW-something. Well different. 691 00:37:02,960 --> 00:37:06,600 Speaker 1: Okay, you're so out of the loop, out of the WWE. 692 00:37:06,680 --> 00:37:09,520 Speaker 1: But anyway, he passed away, sadly. Anyway, none of our 693 00:37:09,560 --> 00:37:10,959 Speaker 1: listeners care about that. 694 00:37:11,040 --> 00:37:14,959 Speaker 3: Harps, they left, like, three intros ago. 695 00:37:17,760 --> 00:37:23,480 Speaker 1: Seriously: YouTube prepares crackdown on mass-produced and repetitive videos. 696 00:37:23,600 --> 00:37:25,920 Speaker 3: Have you noticed how much shit is on YouTube at 697 00:37:25,920 --> 00:37:30,839 Speaker 3: the moment? There's so much AI slop on there, 698 00:37:30,880 --> 00:37:35,000 Speaker 3: it's making me angry.
I get rage scrolling when I 699 00:37:35,040 --> 00:37:38,279 Speaker 3: look at YouTube. I do, Craig, and you know me, 700 00:37:38,719 --> 00:37:43,840 Speaker 3: I do tai chi. I get... I get, wow. 701 00:37:43,960 --> 00:37:47,200 Speaker 1: When you fire up, my testicles retreat into my body, 702 00:37:47,280 --> 00:37:48,360 Speaker 1: I'm like, wow. 703 00:37:49,640 --> 00:37:50,240 Speaker 3: In your throat. 704 00:37:52,600 --> 00:37:56,880 Speaker 1: No, it's a goiter. But anyway. Hey, everyone, 705 00:37:56,960 --> 00:37:59,680 Speaker 1: welcome to The You Project. I'm Craig Harper. Should we 706 00:37:59,719 --> 00:38:00,279 Speaker 1: start again? 707 00:38:01,400 --> 00:38:01,719 Speaker 6: The hell? 708 00:38:01,960 --> 00:38:05,959 Speaker 1: This is possibly the best and worst episode we've ever done. 709 00:38:06,200 --> 00:38:07,000 Speaker 3: You realize that? 710 00:38:07,360 --> 00:38:10,400 Speaker 1: We're going to get praise and complaints. 711 00:38:11,400 --> 00:38:15,480 Speaker 3: Sorry, Magda, I've turned your favorite segment into the shitter 712 00:38:15,560 --> 00:38:16,080 Speaker 3: segment. 713 00:38:17,640 --> 00:38:20,440 Speaker 1: Her favorite segment has become the show. Do you know 714 00:38:20,480 --> 00:38:23,120 Speaker 1: what this is like? It's like that comedian, Tiff, you 715 00:38:23,200 --> 00:38:26,439 Speaker 1: probably know, Matt Rife. Yes, yep. And all he does 716 00:38:26,560 --> 00:38:29,600 Speaker 1: is the crowd work. He doesn't actually... quite often, 717 00:38:30,040 --> 00:38:33,239 Speaker 1: he doesn't actually do a special where he stands up 718 00:38:33,320 --> 00:38:36,160 Speaker 1: and tells jokes. He never gets to that bit because 719 00:38:36,160 --> 00:38:38,840 Speaker 1: he's so good at riffing with the crowd and bullshitting. 720 00:38:40,920 --> 00:38:42,440 Speaker 1: Patrick, take charge please. 721 00:38:42,560 --> 00:38:46,319 Speaker 3: Okay. 
So the way that people can earn money on 722 00:38:46,480 --> 00:38:49,000 Speaker 3: YouTube is to get lots and lots of views on 723 00:38:49,080 --> 00:38:52,319 Speaker 3: videos that they create. The problem now is that people 724 00:38:52,400 --> 00:38:56,000 Speaker 3: are creating videos in a matter of minutes using AI 725 00:38:56,680 --> 00:39:01,319 Speaker 3: and pushing them out as a documentary about a civilization hidden 726 00:39:01,400 --> 00:39:04,200 Speaker 3: underneath the pyramids, and we didn't know that they had 727 00:39:04,239 --> 00:39:07,399 Speaker 3: fusion reactors, you know, ten thousand years ago. So it's 728 00:39:07,400 --> 00:39:11,239 Speaker 3: that sort of crap that gets pushed out, and unfortunately, 729 00:39:11,600 --> 00:39:14,319 Speaker 3: if they're clever enough, people click through and then it 730 00:39:14,360 --> 00:39:19,360 Speaker 3: pushes it up the algorithms, and so this crap basically 731 00:39:19,400 --> 00:39:23,160 Speaker 3: floats to the surface. And so YouTube is now looking 732 00:39:23,360 --> 00:39:28,960 Speaker 3: at trying to fine-tune its partner program, and particularly 733 00:39:29,040 --> 00:39:33,560 Speaker 3: the monetization policies, because when someone legitimately works really hard... 734 00:39:34,200 --> 00:39:37,160 Speaker 3: Have you ever heard of... okay, so maybe at five 735 00:39:37,200 --> 00:39:40,080 Speaker 3: o'clock this morning I was watching a video on YouTube 736 00:39:40,120 --> 00:39:44,080 Speaker 3: about the Caspian Sea Monster, right? Now... 737 00:39:44,520 --> 00:39:47,080 Speaker 1: Would you say maybe? Why would you say maybe? Why 738 00:39:47,120 --> 00:39:50,279 Speaker 1: wouldn't you go... why would you go a bit... I 739 00:39:50,320 --> 00:39:52,800 Speaker 1: was up at five, it's what I was doing. 
Everyone 740 00:39:52,800 --> 00:39:56,960 Speaker 1: else was sleeping or fucking, I was watching a thing 741 00:39:57,000 --> 00:39:59,480 Speaker 1: on the Caspian... 742 00:39:59,360 --> 00:40:01,200 Speaker 3: Can you look up the Caspian Sea Monster for us? 743 00:40:01,200 --> 00:40:04,879 Speaker 1: While... God, I bet you can't wait. I was out 744 00:40:04,920 --> 00:40:08,680 Speaker 1: walking around suburbia like a fucking ninja, looking at... with 745 00:40:09,560 --> 00:40:13,040 Speaker 1: all my friends, scaring everyone who had a dog and 746 00:40:13,120 --> 00:40:14,160 Speaker 1: was out and about. 747 00:40:14,200 --> 00:40:16,920 Speaker 3: You've got ChatGPT on your phone though, so ChatGPT 748 00:40:17,040 --> 00:40:18,000 Speaker 3: is always with you, Craig. 749 00:40:18,360 --> 00:40:21,279 Speaker 1: Yeah, I'm never alone. It's kind of like Jesus, but 750 00:40:21,400 --> 00:40:22,360 Speaker 1: more electronic. 751 00:40:23,080 --> 00:40:26,400 Speaker 3: That's the one. So tell us about the Caspian Sea Monster. 752 00:40:27,080 --> 00:40:31,120 Speaker 2: The Caspian Sea Monster refers to the Soviet-made experimental 753 00:40:31,160 --> 00:40:36,239 Speaker 2: ground-effect vehicle known as the KM, Korabl Maket. This 754 00:40:36,400 --> 00:40:41,839 Speaker 2: massive ekranoplan... whatever that is. An ekranoplan, thank you. 755 00:40:42,640 --> 00:40:45,400 Speaker 2: It was designed to fly just a few meters above the 756 00:40:45,440 --> 00:40:48,720 Speaker 2: water at high velocity. It was spotted by US spy 757 00:40:48,960 --> 00:40:50,600 Speaker 2: satellites during the Cold War. 758 00:40:51,120 --> 00:40:54,480 Speaker 3: Yeah, this was an amazing aircraft which wasn't an aircraft. It 759 00:40:54,600 --> 00:40:56,840 Speaker 3: wasn't a plane, and it wasn't a boat. It was 760 00:40:56,880 --> 00:41:00,760 Speaker 3: a hybrid between the two. 
And I was watching this amazing 761 00:41:00,760 --> 00:41:03,359 Speaker 3: YouTube clip, and the guy had put so much work 762 00:41:03,360 --> 00:41:07,880 Speaker 3: into going to Russia to see this ekranoplan, the 763 00:41:08,000 --> 00:41:12,040 Speaker 3: Caspian Sea Monster. It was quite revolutionary at the time 764 00:41:12,080 --> 00:41:15,360 Speaker 3: because it flew so low it was invisible to radar, 765 00:41:15,760 --> 00:41:18,200 Speaker 3: but it was so much faster than a boat or 766 00:41:18,239 --> 00:41:20,560 Speaker 3: a ship, so it meant that it could have been 767 00:41:20,840 --> 00:41:24,319 Speaker 3: a way to launch troops and to attack where no 768 00:41:24,400 --> 00:41:27,880 Speaker 3: one would see them coming, effectively. But what I was 769 00:41:27,920 --> 00:41:31,200 Speaker 3: getting to is, the amount of production quality that went 770 00:41:31,280 --> 00:41:34,480 Speaker 3: into this documentary that I was watching was fantastic, and 771 00:41:34,520 --> 00:41:36,520 Speaker 3: I really found it interesting. I sent it to my 772 00:41:36,560 --> 00:41:39,800 Speaker 3: nerdy friend Ryan, who's an aerospace engineer and actually worked 773 00:41:39,840 --> 00:41:45,560 Speaker 3: on something similar. And then when someone throws out some 774 00:41:45,800 --> 00:41:49,400 Speaker 3: AI-generated slop, it kind of makes me get that 775 00:41:49,520 --> 00:41:53,240 Speaker 3: rage scroll going, because, you know, there's good quality stuff 776 00:41:53,280 --> 00:41:54,920 Speaker 3: on YouTube. 
So I think this is a bit of 777 00:41:54,960 --> 00:41:57,320 Speaker 3: a pat on the back to the people at Google 778 00:41:57,480 --> 00:41:59,319 Speaker 3: to try to clean up and get rid of this 779 00:41:59,400 --> 00:42:03,160 Speaker 3: mass-produced video kind of slop that's out there, and 780 00:42:03,280 --> 00:42:06,040 Speaker 3: hopefully it will clean up the algorithm and clean up 781 00:42:06,080 --> 00:42:09,120 Speaker 3: the content, so that when someone at five a.m. wants 782 00:42:09,160 --> 00:42:13,840 Speaker 3: to look at old Russian planes, they get old Russian planes. 783 00:42:14,600 --> 00:42:18,759 Speaker 1: Shout out to that other guy who wants to do that. 784 00:42:19,360 --> 00:42:22,040 Speaker 1: You know, they've already made that. You probably know this 785 00:42:22,080 --> 00:42:25,839 Speaker 1: better than me, mate, but I think there's been at 786 00:42:25,960 --> 00:42:29,480 Speaker 1: least a couple of movies made which are completely AI, 787 00:42:29,880 --> 00:42:34,439 Speaker 1: no actors, the script was written by AI, the entire thing. 788 00:42:34,600 --> 00:42:38,920 Speaker 1: Some of them, like, very well, I don't know, crafted, 789 00:42:38,960 --> 00:42:41,879 Speaker 1: is that the right word? But I wonder if, 790 00:42:42,000 --> 00:42:44,240 Speaker 1: down the track, because already there's a lot of CGI 791 00:42:44,360 --> 00:42:45,920 Speaker 1: and has been for a while, but I wonder if, 792 00:42:45,960 --> 00:42:49,319 Speaker 1: down the track, you know, like, soundtracks are done with AI, 793 00:42:49,400 --> 00:42:51,839 Speaker 1: scripts are done with AI, there are no actors... 794 00:42:51,880 --> 00:42:54,160 Speaker 1: I wonder if that's going to become a thing 795 00:42:54,160 --> 00:42:56,160 Speaker 1: where people are going to go to cinemas to sit 796 00:42:56,239 --> 00:42:58,239 Speaker 1: and watch... whether cinema is going to be a thing, 
797 00:42:58,280 --> 00:43:03,400 Speaker 1: that's another question. A completely AI-created experience. 798 00:43:03,120 --> 00:43:05,640 Speaker 3: As long as we know that it's AI, as long 799 00:43:05,680 --> 00:43:08,600 Speaker 3: as we're put in the picture that way. There's some 800 00:43:08,640 --> 00:43:11,280 Speaker 3: great creative stuff out there. I don't know if we 801 00:43:11,320 --> 00:43:13,360 Speaker 3: talked about this on the show the last time we 802 00:43:13,400 --> 00:43:16,400 Speaker 3: had a segment two weeks ago, but there's a competition 803 00:43:16,520 --> 00:43:20,680 Speaker 3: going at the moment for people to actually use AI 804 00:43:20,960 --> 00:43:27,279 Speaker 3: to create artwork, to reimagine women in history, to try 805 00:43:27,320 --> 00:43:31,640 Speaker 3: to shine the spotlight on the influence and the amazing 806 00:43:31,680 --> 00:43:33,480 Speaker 3: work that's been done over the years. Say it's 807 00:43:33,560 --> 00:43:37,200 Speaker 3: Marie Curie. Now, if someone isn't an artist or a 808 00:43:37,280 --> 00:43:41,160 Speaker 3: video producer, they can still use these AI 809 00:43:41,280 --> 00:43:45,680 Speaker 3: tools to reimagine or to visualize what that might have been. 810 00:43:45,719 --> 00:43:48,560 Speaker 3: So I can see that there are positives when someone 811 00:43:48,640 --> 00:43:51,680 Speaker 3: may be a really good storyteller but doesn't have video 812 00:43:51,840 --> 00:43:55,440 Speaker 3: editing skills, yet has got a fantastic idea. And I 813 00:43:55,480 --> 00:43:58,439 Speaker 3: know Tiff's good at drawing, so you could turn your 814 00:43:58,480 --> 00:44:01,759 Speaker 3: sketches into an animation sequence, so you could come up 815 00:44:01,800 --> 00:44:04,640 Speaker 3: with a cartoon series. 
But you don't animate, but you 816 00:44:04,840 --> 00:44:07,040 Speaker 3: are good at drawing. Or I might be good at 817 00:44:07,080 --> 00:44:09,600 Speaker 3: telling stories, but I'm not good with the other side 818 00:44:09,640 --> 00:44:12,120 Speaker 3: of it. So again, it comes down to how it's 819 00:44:12,160 --> 00:44:15,680 Speaker 3: being used. Now, Netflix is just about to launch a 820 00:44:15,719 --> 00:44:19,960 Speaker 3: new series where they've used AI exclusively to generate all 821 00:44:19,960 --> 00:44:22,920 Speaker 3: the special effects. So where they would have used compositing 822 00:44:23,280 --> 00:44:27,720 Speaker 3: and, you know, the normal tools to create explosions or whatever, 823 00:44:28,080 --> 00:44:31,160 Speaker 3: they've decided to go down the route of using AI. 824 00:44:31,360 --> 00:44:34,040 Speaker 3: But they've been very upfront about it, so they're saying, 825 00:44:34,360 --> 00:44:38,440 Speaker 3: this part of our series is using AI to create 826 00:44:38,560 --> 00:44:40,000 Speaker 3: these specific effects. 827 00:44:40,680 --> 00:44:44,640 Speaker 1: I wonder what Charlie Chaplin would think. I mean, what 828 00:44:44,800 --> 00:44:49,279 Speaker 1: was that, maybe nineteen fifteen? Nineteen ten, fifteen, twenty? You 829 00:44:49,280 --> 00:44:53,680 Speaker 1: remember all those old black and white silent movies 830 00:44:53,760 --> 00:44:57,040 Speaker 1: with him doing his own stunts and holding on to 831 00:44:57,120 --> 00:44:59,759 Speaker 1: the side of a train and climbing a ladder, that 832 00:45:00,400 --> 00:45:05,040 Speaker 1: leaning against the wall, and, like... it's just, in a 833 00:45:05,320 --> 00:45:10,160 Speaker 1: relatively short time span, it's just become something completely different. 
834 00:45:10,640 --> 00:45:13,000 Speaker 3: If you ever get a chance, do a YouTube search 835 00:45:13,239 --> 00:45:18,160 Speaker 3: on old special effects from black and white films, the 836 00:45:18,320 --> 00:45:20,880 Speaker 3: clever things that the filmmakers used to do. You know, 837 00:45:20,920 --> 00:45:22,759 Speaker 3: the first man on the moon... remember the picture of 838 00:45:22,760 --> 00:45:25,479 Speaker 3: the moon with the rocket ship in its eye? There's 839 00:45:25,480 --> 00:45:28,640 Speaker 3: a really old, old, old film that's out there. I 840 00:45:28,640 --> 00:45:30,839 Speaker 3: think it might have been French, but there were some 841 00:45:30,960 --> 00:45:36,080 Speaker 3: amazing optical illusions that were used in the old days 842 00:45:36,120 --> 00:45:39,000 Speaker 3: to create some amazing effects. It was just clever, really, 843 00:45:39,000 --> 00:45:40,600 Speaker 3: really clever, interesting stuff. 844 00:45:41,040 --> 00:45:43,440 Speaker 1: Can I ask one question that isn't on the list 845 00:45:43,520 --> 00:45:47,360 Speaker 1: but was on the news, and is, as we speak, 846 00:45:48,000 --> 00:45:50,920 Speaker 1: just sitting on a deck chair in my prefrontal cortex 847 00:45:51,000 --> 00:45:55,759 Speaker 1: waiting to get some attention? You're welcome, everyone. That is: 848 00:45:56,640 --> 00:45:59,600 Speaker 1: last night on the news there was a fire on 849 00:45:59,640 --> 00:46:04,200 Speaker 1: a Virgin plane. Yes. Ah, so that... yeah, so 850 00:46:04,360 --> 00:46:07,920 Speaker 1: tell them. So lithium batteries that are just spontaneously, or 851 00:46:08,120 --> 00:46:13,480 Speaker 1: kind of, catching alight, and they're talking like that. 
852 00:46:13,719 --> 00:46:15,960 Speaker 1: On every plane flight that people go on, there are 853 00:46:15,960 --> 00:46:18,680 Speaker 1: a bunch of people with these batteries that can catch 854 00:46:18,719 --> 00:46:23,479 Speaker 1: alight, and they're almost impossible to put out once 855 00:46:23,520 --> 00:46:24,719 Speaker 1: they get ablaze. 856 00:46:24,640 --> 00:46:27,480 Speaker 3: Yeah, they're power banks, so they're usually lithium-ion batteries. 857 00:46:27,480 --> 00:46:29,080 Speaker 3: You've probably seen them. I've got a power bank 858 00:46:29,080 --> 00:46:31,839 Speaker 3: that I travel with. Fantastic, because it means that if 859 00:46:31,840 --> 00:46:34,320 Speaker 3: you're stuck somewhere and your phone... particularly when you travel, 860 00:46:34,360 --> 00:46:36,239 Speaker 3: you use a lot more GPS and your phone goes 861 00:46:36,280 --> 00:46:39,319 Speaker 3: flat quickly, so having a power bank with you... I tend 862 00:46:39,400 --> 00:46:41,000 Speaker 3: to have it in my backpack. If I go to 863 00:46:41,040 --> 00:46:43,920 Speaker 3: Melbourne and I'm traveling around, I always take my power 864 00:46:43,920 --> 00:46:46,880 Speaker 3: bank with me. So power banks are just a collection 865 00:46:46,960 --> 00:46:49,120 Speaker 3: of batteries that you can charge your phone off or 866 00:46:49,200 --> 00:46:53,040 Speaker 3: run your devices from, whether it's a laptop or a phone, 867 00:46:53,360 --> 00:46:55,439 Speaker 3: and people carry them with them all the time. 868 00:46:55,520 --> 00:46:57,880 Speaker 3: And I think that the problem with these sorts of 869 00:46:57,920 --> 00:47:00,799 Speaker 3: stories is... and yes, you're right, in fact, the same 870 00:47:00,840 --> 00:47:03,680 Speaker 3: thing happened, and it caused a major fire on a 871 00:47:03,920 --> 00:47:06,400 Speaker 3: Chinese, or one of the Asian airlines. It wasn't Chinese, 872 00:47:06,520 --> 00:47:08,799 Speaker 3: one of the Asian airlines. 
And subsequently a lot 873 00:47:08,800 --> 00:47:11,840 Speaker 3: of airlines have now banned these power banks, and you 874 00:47:11,880 --> 00:47:14,280 Speaker 3: can see that airlines are going to get really nervous 875 00:47:14,360 --> 00:47:18,480 Speaker 3: about this. And I guess the concern is, it's not 876 00:47:18,520 --> 00:47:20,400 Speaker 3: so much the new ones, it's if you've had an 877 00:47:20,400 --> 00:47:23,120 Speaker 3: old one, you might have dropped it. That's where these 878 00:47:23,160 --> 00:47:25,920 Speaker 3: lithium-ion batteries... if you ever drop a lithium-ion battery, 879 00:47:26,320 --> 00:47:28,880 Speaker 3: that can be really problematic. So that's the concern, and 880 00:47:28,920 --> 00:47:31,520 Speaker 3: you're right, it's very, very, very hard to put them 881 00:47:31,520 --> 00:47:34,799 Speaker 3: out once they get started. So I don't know 882 00:47:34,800 --> 00:47:37,279 Speaker 3: what that's going to mean for air travel. I think 883 00:47:37,320 --> 00:47:39,200 Speaker 3: it's going to make it a little bit more 884 00:47:39,200 --> 00:47:42,600 Speaker 3: difficult, because airlines are likely to put a ban on 885 00:47:42,760 --> 00:47:45,160 Speaker 3: being able to use them, which will make things a 886 00:47:45,200 --> 00:47:47,239 Speaker 3: little bit harder. But that said, a lot of the 887 00:47:47,280 --> 00:47:50,440 Speaker 3: newer planes have USB ports in your seats, so you 888 00:47:50,480 --> 00:47:54,160 Speaker 3: can charge directly at your seat anyway. 
So for the 889 00:47:54,200 --> 00:47:55,920 Speaker 3: concerns that people have, if you're on a 890 00:47:55,960 --> 00:47:58,880 Speaker 3: long-haul trip and you want to use your laptop 891 00:47:58,960 --> 00:48:00,680 Speaker 3: or an iPad 892 00:48:00,800 --> 00:48:04,600 Speaker 3: or something like that, then, you know, you can now 893 00:48:04,600 --> 00:48:07,120 Speaker 3: plug into your seat. So I think for most people 894 00:48:07,440 --> 00:48:10,400 Speaker 3: it may not be as necessary as it was. 895 00:48:10,719 --> 00:48:12,120 Speaker 3: But I think what's going to happen is you're going 896 00:48:12,160 --> 00:48:13,680 Speaker 3: to get to the airport and have to buy one 897 00:48:13,680 --> 00:48:16,480 Speaker 3: if you're traveling, because if you're footslogging around Europe and 898 00:48:16,520 --> 00:48:17,919 Speaker 3: you're going to be out and about for a long 899 00:48:17,960 --> 00:48:19,720 Speaker 3: time and you don't want your phone to go flat, 900 00:48:19,960 --> 00:48:22,440 Speaker 3: then you're going to need to have a power bank anyway, 901 00:48:22,440 --> 00:48:24,520 Speaker 3: and you might end up having to create a whole 902 00:48:24,600 --> 00:48:27,080 Speaker 3: new market of disposable power banks, because what's going to 903 00:48:27,080 --> 00:48:29,240 Speaker 3: happen to them? Do you hire them at the airport 904 00:48:29,280 --> 00:48:31,319 Speaker 3: and then return them, or do you just chuck them 905 00:48:31,360 --> 00:48:33,359 Speaker 3: at the end? So it's going to create a whole 906 00:48:33,360 --> 00:48:34,399 Speaker 3: lot of waste, isn't it? 907 00:48:34,920 --> 00:48:37,960 Speaker 1: Hey, just quickly, we've got about two minutes. Also, 908 00:48:38,200 --> 00:48:40,120 Speaker 1: is it not the same... 
I don't know, Patrick, but 909 00:48:41,360 --> 00:48:44,320 Speaker 1: bikes and scooters... are they the same kinds 910 00:48:44,320 --> 00:48:46,719 Speaker 1: of batteries? Because we're seeing shit just blow up in 911 00:48:46,800 --> 00:48:49,040 Speaker 1: people's houses and catch alight. 912 00:48:49,440 --> 00:48:52,279 Speaker 3: Yeah, sadly. I was talking to a friend of mine, 913 00:48:52,400 --> 00:48:54,919 Speaker 3: a lady that worked with me for a while, 914 00:48:54,960 --> 00:48:57,800 Speaker 3: and her parents' house burnt down. They had an electric 915 00:48:57,880 --> 00:49:00,560 Speaker 3: scooter in the garage, it caught fire, and it burnt 916 00:49:00,600 --> 00:49:02,960 Speaker 3: down the entire house. And that was just, like, down 917 00:49:03,000 --> 00:49:04,080 Speaker 3: the road from where I live. 918 00:49:04,400 --> 00:49:05,239 Speaker 1: You have one of those? 919 00:49:05,760 --> 00:49:09,280 Speaker 3: I do. But when I did my whole new studio revamp, 920 00:49:09,320 --> 00:49:12,040 Speaker 3: I misplaced the charger, so I don't know what I've 921 00:49:12,040 --> 00:49:13,319 Speaker 3: done with the charger for it. 922 00:49:13,719 --> 00:49:15,960 Speaker 1: You might have to use your legs and push, like 923 00:49:16,000 --> 00:49:18,520 Speaker 1: the old days, you know, back in the sixties, stand 924 00:49:18,560 --> 00:49:20,120 Speaker 1: on it and push with your other leg. 925 00:49:20,360 --> 00:49:22,239 Speaker 3: But it's hard because it's got a battery on it, 926 00:49:22,239 --> 00:49:24,680 Speaker 3: it's heavy, and it has that, you know, kind of 927 00:49:24,719 --> 00:49:26,160 Speaker 3: grinding sound because... 928 00:49:25,960 --> 00:49:28,080 Speaker 1: Ah, that's resistance. Think of it as a workout. 929 00:49:28,400 --> 00:49:30,399 Speaker 3: Yeah, I just did ten minutes on the rower after 930 00:49:30,440 --> 00:49:33,240 Speaker 3: I did weights this morning. That's good, isn't it? 
Yeah, 931 00:49:33,239 --> 00:49:35,040 Speaker 3: that's my cardio, ten minutes on the rower and then 932 00:49:35,040 --> 00:49:35,760 Speaker 3: I'll walk, Tiff. 933 00:49:37,280 --> 00:49:39,759 Speaker 1: Tell people where to find you, follow you, connect with you. 934 00:49:40,080 --> 00:49:43,920 Speaker 1: And just a viewer warning, listener warning: if you do 935 00:49:44,000 --> 00:49:46,520 Speaker 1: send him an email, he's in your life forever. Keep 936 00:49:46,560 --> 00:49:47,120 Speaker 1: that in mind. 937 00:49:47,640 --> 00:49:54,960 Speaker 3: Hey, welcome to The You Project... hmm. Ah, touché, 938 00:49:55,080 --> 00:49:59,839 Speaker 3: bro, touché. Websitesnow.com.au. Websitesnow 939 00:50:00,080 --> 00:50:02,200 Speaker 3: dot com dot au. It's only because it's the easiest 940 00:50:02,239 --> 00:50:05,440 Speaker 3: website to remember, and you can connect with me. 941 00:50:06,400 --> 00:50:08,360 Speaker 1: Thank you, Tiffany, and thank you, Patrick. 942 00:50:08,920 --> 00:50:10,760 Speaker 3: Thanks. Thanks, lads,