1 00:00:00,360 --> 00:00:02,800 Speaker 1: Oh, good afternoon, listeners of the bloody You Project. It's bloody Patrick, 2 00:00:02,840 --> 00:00:06,360 Speaker 1: bloody Tiff, bloody Harps. It's Friday afternoon, Thursday afternoon, I 3 00:00:06,360 --> 00:00:10,240 Speaker 1: should say. This is something of a recording anomaly. We're 4 00:00:10,280 --> 00:00:15,400 Speaker 1: normally Friday a.m. for this little episode, so I don't 5 00:00:15,440 --> 00:00:18,400 Speaker 1: know how our brains are working at five pm-ish on 6 00:00:18,400 --> 00:00:22,759 Speaker 1: a Thursday. Tiff just swanned in from somewhere, we'll find 7 00:00:22,800 --> 00:00:26,400 Speaker 1: out. Patrick's drinking either kam-bucha or kom-bucha. He's 8 00:00:26,440 --> 00:0030,280 Speaker 1: already been corrected by Tiff. Like, first sentence out of 9 00:00:30,280 --> 00:00:33,040 Speaker 1: her mouth was a correction, not hello, it was just 10 00:00:33,080 --> 00:00:37,519 Speaker 1: correcting him on his pronunciation. How's your kam-bucha, Patrick? 11 00:00:37,960 --> 00:00:38,839 Speaker 1: Like kom-bucha? 12 00:00:38,880 --> 00:00:42,200 Speaker 2: Because I'm scared, more scared of Tiff than you. Apple 13 00:00:42,320 --> 00:00:46,560 Speaker 2: and lime kombucha is bloody great. I can still 14 00:00:47,000 --> 00:00:50,320 Speaker 2: feel the microbiomes doing what they do with my tummy. 15 00:00:50,560 --> 00:00:54,760 Speaker 1: Are you, do you take much notice of the whole 16 00:00:54,800 --> 00:00:57,520 Speaker 1: gut biome thing? And are you proactively trying to 17 00:00:57,560 --> 00:01:00,360 Speaker 1: make yours healthier? Or do you just know the word? 18 00:01:01,040 --> 00:01:04,880 Speaker 1: I just know the word. You just know the word? 19 00:01:05,920 --> 00:01:07,640 Speaker 2: The other reason I bought it? Can I, can I 20 00:01:07,680 --> 00:01:11,520 Speaker 2: admit why I bought it?
So normally I go 21 00:01:11,840 --> 00:01:14,800 Speaker 2: shopping on a Wednesday after I've finished my tai chi class. 22 00:01:14,840 --> 00:01:16,800 Speaker 2: I teach tai chi, but I deliberately do an early 23 00:01:16,880 --> 00:01:18,960 Speaker 2: class and then I've got an hour to shop. I 24 00:01:19,040 --> 00:01:22,040 Speaker 2: go to Aldi, I go to Coles. But because we're 25 00:01:22,240 --> 00:01:24,479 Speaker 2: in the middle of the holidays... 26 00:01:24,319 --> 00:01:27,520 Speaker 1: Can I just say, Tiff's laughing hysterically with her face in 27 00:01:27,520 --> 00:01:30,720 Speaker 1: her hands. I don't know, what are you laughing at? 28 00:01:30,760 --> 00:01:34,000 Speaker 3: Classic Pat. Who else would have a whole story 29 00:01:34,040 --> 00:01:36,560 Speaker 3: around the reason for drinking a kombucha? 30 00:01:37,400 --> 00:01:41,039 Speaker 1: Well, if anyone can turn a thirty second anecdote into 31 00:01:41,080 --> 00:01:44,640 Speaker 1: a five minute fucking monologue, you know it's not you 32 00:01:44,840 --> 00:01:45,039 Speaker 1: or me. 33 00:01:47,720 --> 00:01:50,360 Speaker 3: I'm on the edge of my seat though, please continue, Pat. 34 00:01:50,800 --> 00:01:53,400 Speaker 1: Yeah, I'll tell you what. The listeners are definitely not 35 00:01:53,520 --> 00:01:56,960 Speaker 1: turning this off, because this is not boring at all. 36 00:01:58,400 --> 00:02:00,920 Speaker 2: So I'll finish off the story, because I just 37 00:02:00,920 --> 00:02:05,080 Speaker 2: have to now. When I shop online, I hit two buttons. 38 00:02:05,320 --> 00:02:10,480 Speaker 2: I hit vegan, and then I hit on special, and 39 00:02:10,600 --> 00:02:14,640 Speaker 2: that's the filter that I use first.
40 00:02:14,760 --> 00:02:17,840 Speaker 1: And then I look for other stuff, and then up comes 41 00:02:17,919 --> 00:02:20,839 Speaker 1: mung beans and a banana, and he goes, oh, 42 00:02:20,919 --> 00:02:23,320 Speaker 1: I think I'll go the banana this week. 43 00:02:23,120 --> 00:02:27,440 Speaker 2: And apple and lime kombucha for two for one. 44 00:02:27,720 --> 00:02:29,760 Speaker 2: That's why I bought it. I don't normally, and 45 00:02:29,800 --> 00:02:32,520 Speaker 2: to answer your question, Craigo, I don't do anything for my 46 00:02:32,639 --> 00:02:35,760 Speaker 2: gut biome, except if it happens to be on special 47 00:02:36,080 --> 00:02:37,400 Speaker 2: once every term. 48 00:02:37,800 --> 00:02:39,679 Speaker 1: Well, I think you have a pretty good diet, which 49 00:02:39,720 --> 00:02:41,680 Speaker 1: is doing something for your gut biome. Tiff, what about you? 50 00:02:41,840 --> 00:02:44,200 Speaker 1: I mean, I've never been into it, but I am. We 51 00:02:44,280 --> 00:02:46,239 Speaker 1: all know I am now, because I've just got myself, 52 00:02:46,240 --> 00:02:49,800 Speaker 1: there right in front of me, triple strength probiotics with 53 00:02:50,000 --> 00:02:54,800 Speaker 1: ninety six billion CFU per capsule, whatever that means. And 54 00:02:54,880 --> 00:02:58,120 Speaker 1: I've got to say, I'm going to say my poos 55 00:02:58,160 --> 00:03:01,760 Speaker 1: are world class, world class, since I've got on the 56 00:03:01,760 --> 00:03:08,519 Speaker 1: probiotic train. Do you, do you think about that a bit? Well, now, 57 00:03:09,160 --> 00:03:10,880 Speaker 1: I mean, I've cleared that up for everyone. 58 00:03:12,080 --> 00:03:16,640 Speaker 2: Does this come up on anybody else's podcast segments? Because it always 59 00:03:16,680 --> 00:03:17,560 Speaker 2: comes up on ours.
60 00:03:19,280 --> 00:03:21,920 Speaker 1: I don't know why everyone's, I don't know why everyone's 61 00:03:22,000 --> 00:03:25,720 Speaker 1: so fucking offended by something that everyone on the planet does, 62 00:03:25,760 --> 00:03:26,400 Speaker 1: which is shit. 63 00:03:26,880 --> 00:03:29,280 Speaker 2: No, I'm not offended by it, but surely we can 64 00:03:29,320 --> 00:03:32,560 Speaker 2: think of something else to talk about. 65 00:03:33,400 --> 00:03:36,440 Speaker 1: Yeah, kombucha bottles on special, and there's two buttons, 66 00:03:36,480 --> 00:03:39,760 Speaker 1: I hit the vegan and the on special. I reckon 67 00:03:39,840 --> 00:03:44,840 Speaker 1: my poo story's got that covered. So, just saying. I 68 00:03:44,880 --> 00:03:48,200 Speaker 1: wonder if our one remaining listener wants to hear what's next. Tiff? 69 00:03:49,160 --> 00:03:50,840 Speaker 1: How are you, Tiff? What have you been doing? 70 00:03:51,120 --> 00:03:53,880 Speaker 3: I'm good. I just had some clients this arvo and 71 00:03:54,000 --> 00:03:56,240 Speaker 3: scurried home in a hurry so I could make this, 72 00:03:56,320 --> 00:03:58,080 Speaker 3: because I didn't want to miss out on the trio. 73 00:03:59,280 --> 00:04:02,600 Speaker 1: How many clients do you like to do now? If 74 00:04:02,640 --> 00:04:04,960 Speaker 1: you're wondering what Tiff does, she's, among other things, she 75 00:04:05,040 --> 00:04:08,360 Speaker 1: trains a few people in the gym. What's your ideal 76 00:04:08,480 --> 00:04:11,120 Speaker 1: number? Not too many, not too few? And I 77 00:04:11,120 --> 00:04:12,880 Speaker 1: know you don't do them every day, but, like, three 78 00:04:12,920 --> 00:04:14,600 Speaker 1: a day, two a day, four a day?
79 00:04:14,880 --> 00:04:17,920 Speaker 3: I've actually recently structured it so that I've got two 80 00:04:18,000 --> 00:04:21,719 Speaker 3: big days, and then there's a couple 81 00:04:21,720 --> 00:04:23,760 Speaker 3: of days where I just do a couple. I don't 82 00:04:23,800 --> 00:04:26,479 Speaker 3: often do afternoons. I've got two groups that I do 83 00:04:26,600 --> 00:04:30,080 Speaker 3: on an afternoon together two times a week, and the 84 00:04:30,120 --> 00:04:31,360 Speaker 3: rest are morning sessions. 85 00:04:31,560 --> 00:04:32,560 Speaker 2: Sunday is my big day. 86 00:04:32,680 --> 00:04:36,200 Speaker 3: Sometimes I can have up to six or seven on 87 00:04:36,240 --> 00:04:38,440 Speaker 3: a Sunday, which is a big day, all back to back, 88 00:04:39,480 --> 00:04:42,880 Speaker 3: because now I've managed my energy and I've managed a 89 00:04:42,880 --> 00:04:46,640 Speaker 3: bit of downtime a bit better with that, whereas it 90 00:04:46,720 --> 00:04:50,560 Speaker 3: used to be kind of four people and I'm out 91 00:04:50,560 --> 00:04:50,839 Speaker 3: of there. 92 00:04:52,400 --> 00:04:54,719 Speaker 1: Yeah. You know what's good about having your own business, 93 00:04:54,880 --> 00:04:59,000 Speaker 1: like you and me and Patrick for that matter, is 94 00:04:59,040 --> 00:05:02,599 Speaker 1: that you don't do the same thing all the time. 95 00:05:02,800 --> 00:05:04,920 Speaker 1: It's like if I came to you and I went, hey, 96 00:05:05,680 --> 00:05:09,160 Speaker 1: I've got fifteen school kids that I want trained three 97 00:05:09,200 --> 00:05:14,279 Speaker 1: times a week at the St Kilda foreshore, could you do that? 98 00:05:14,640 --> 00:05:18,200 Speaker 1: And you could go, sure, here's how it would work, 99 00:05:18,240 --> 00:05:20,520 Speaker 1: here's how much it would cost. It's not like you 100 00:05:20,600 --> 00:05:23,800 Speaker 1: only do the same thing the same way.
So when 101 00:05:23,880 --> 00:05:26,479 Speaker 1: companies come to me and they go, what do you do? 102 00:05:26,600 --> 00:05:29,160 Speaker 1: I go, I can do anything from a thirty minute 103 00:05:29,240 --> 00:05:32,680 Speaker 1: keynote to a three day live-in program, and we can 104 00:05:32,760 --> 00:05:36,600 Speaker 1: literally talk about anything from leadership and culture and high 105 00:05:36,640 --> 00:05:41,440 Speaker 1: performance to, you know, metacognition and theory of mind and why 106 00:05:41,480 --> 00:05:44,000 Speaker 1: the boss is a prick. You know, it's like, it's just, 107 00:05:44,960 --> 00:05:46,520 Speaker 1: you know, you don't have to go and hop in 108 00:05:46,560 --> 00:05:49,719 Speaker 1: this cookie cutter kind of, you know, thing where you 109 00:05:49,760 --> 00:05:52,159 Speaker 1: do the same thing. I love that kind of freedom 110 00:05:52,200 --> 00:05:55,799 Speaker 1: and that creativity, to be able to, to a point, 111 00:05:55,920 --> 00:05:58,680 Speaker 1: make stuff up, make your job up. What do you 112 00:05:58,720 --> 00:06:01,400 Speaker 1: want me to talk on? Like, I had a talk, 113 00:06:01,680 --> 00:06:03,280 Speaker 1: not today, I was going to say today, yesterday, with 114 00:06:03,320 --> 00:06:06,240 Speaker 1: a company, and, you know, like with a lot of companies, 115 00:06:06,279 --> 00:06:09,279 Speaker 1: they don't exactly know what they want. Like, they want 116 00:06:09,279 --> 00:06:12,200 Speaker 1: you to talk at their conference, or to do a workshop, 117 00:06:12,640 --> 00:06:14,839 Speaker 1: or to come in and do some coaching or consulting. 118 00:06:16,160 --> 00:06:18,000 Speaker 1: But then you get on a call and you're like, okay, 119 00:06:18,040 --> 00:06:20,640 Speaker 1: exactly what do you want me to do and talk about? 120 00:06:20,800 --> 00:06:23,359 Speaker 1: They're like, ah, not exactly sure, but these are the 121 00:06:23,360 --> 00:06:26,320 Speaker 1: problems we've got.
And then you can go, oh, what 122 00:06:26,440 --> 00:06:29,520 Speaker 1: about this? And they go, yeah, let's do that. What 123 00:06:29,560 --> 00:06:31,440 Speaker 1: about that? They're like, oh, that would be good too. 124 00:06:32,160 --> 00:06:34,640 Speaker 1: Like, there's a lot of... I think that's the beauty 125 00:06:34,680 --> 00:06:36,880 Speaker 1: of it, and same with you, Patrick, in your work, 126 00:06:36,960 --> 00:06:38,479 Speaker 1: where you kind of... it's pretty diverse. 127 00:06:38,960 --> 00:06:41,320 Speaker 2: Well, I just got asked this week to be the 128 00:06:41,400 --> 00:06:48,239 Speaker 2: compère for a yeehaw hoedown fundraiser for our farmers. Wow. 129 00:06:48,640 --> 00:06:51,880 Speaker 2: I know nothing about bootscooting. I'm going to have 130 00:06:51,960 --> 00:06:53,799 Speaker 2: to do some serious research. 131 00:06:53,920 --> 00:06:55,600 Speaker 3: I used to do a bit of line dancing when 132 00:06:55,600 --> 00:06:58,080 Speaker 3: we were kids, with my mum. I had the greatest boots. 133 00:06:58,480 --> 00:07:02,279 Speaker 2: Yeah, yeah, a good thing. Yeah, Drongo and the Crow, 134 00:07:02,480 --> 00:07:07,480 Speaker 2: the Darling's Family Trust, Renegade Bootscooters Duo Deluxe. That's who's headlining. 135 00:07:08,080 --> 00:07:11,440 Speaker 1: That's fucking great. I'm coming. I'm definitely coming to that. 136 00:07:11,920 --> 00:07:16,560 Speaker 1: And I want to see you in cowboy boots and 137 00:07:16,560 --> 00:07:19,760 Speaker 1: a hat, trying not to be gay up the front, 138 00:07:19,880 --> 00:07:22,440 Speaker 1: trying to look like a cowboy. 139 00:07:23,440 --> 00:07:25,840 Speaker 2: And the other gig that I landed today was a 140 00:07:25,840 --> 00:07:27,960 Speaker 2: website for a male escort. 141 00:07:29,680 --> 00:07:32,840 Speaker 1: Hang on. Oh, for a male escort?
142 00:07:33,080 --> 00:07:35,960 Speaker 2: Yeah, to do a website for a male companion, so 143 00:07:36,120 --> 00:07:36,520 Speaker 2: you can... 144 00:07:36,520 --> 00:07:38,360 Speaker 1: Do it. Obviously doing a contra with him? 145 00:07:39,000 --> 00:07:41,200 Speaker 2: No, no, it's a male-for-female one. 146 00:07:41,520 --> 00:07:43,040 Speaker 1: Did you just say the same thing? 147 00:07:44,640 --> 00:07:45,760 Speaker 2: She did. Thanks, guys. 148 00:07:47,000 --> 00:07:51,320 Speaker 1: Wow. Did you go, yeah, that'll be cash, check, or 149 00:07:51,480 --> 00:07:56,600 Speaker 1: cash, cheque, or Nature's credit card? Ka-ching. Yeah. 150 00:07:56,640 --> 00:07:58,800 Speaker 2: I think we might end up doing a photo shoot 151 00:07:58,920 --> 00:08:00,160 Speaker 2: for him as well. 152 00:08:00,760 --> 00:08:03,320 Speaker 1: Yeah, of course you will. He's like, I don't have 153 00:08:03,360 --> 00:08:08,200 Speaker 1: the money for that. That's all right, it's free, come over. Wow. 154 00:08:08,440 --> 00:08:11,040 Speaker 1: And then he's like, but why are you also nude? 155 00:08:11,360 --> 00:08:14,200 Speaker 1: Oh, that's just, it's just how... I don't know, it's 156 00:08:14,200 --> 00:08:16,840 Speaker 1: a thing. Hey, by the way, did I tell you 157 00:08:16,880 --> 00:08:17,960 Speaker 1: I teach tai chi? 158 00:08:20,640 --> 00:08:24,960 Speaker 2: It's G-rated. We only host G-rated websites on 159 00:08:25,000 --> 00:08:25,520 Speaker 2: our server. 160 00:08:26,080 --> 00:08:30,080 Speaker 1: Yeah, do you though? We know. How, how did this 161 00:08:30,200 --> 00:08:36,880 Speaker 1: particular escort... I was reaching for the right word. How 162 00:08:36,880 --> 00:08:39,040 Speaker 1: did he, how did he find you? 163 00:08:39,720 --> 00:08:39,920 Speaker 3: Oh? 164 00:08:40,120 --> 00:08:43,080 Speaker 2: You know how he found me? His accountant listens to 165 00:08:43,120 --> 00:08:44,000 Speaker 2: the podcast.
166 00:08:44,960 --> 00:08:45,880 Speaker 1: Oh, that is so good. 167 00:08:46,240 --> 00:08:48,440 Speaker 2: Yep, that's how he found me. 168 00:08:49,200 --> 00:08:53,000 Speaker 1: Wow. Wow, I'm glad that we're building your career. Here 169 00:08:53,040 --> 00:08:57,640 Speaker 1: you go. Has that... I don't... oh, I've got so 170 00:08:57,720 --> 00:09:01,120 Speaker 1: many questions about male escorts that are probably not appropriate 171 00:09:01,160 --> 00:09:06,320 Speaker 1: for now. Not mail escorts, male escorts. Firstly, 172 00:09:06,520 --> 00:09:07,240 Speaker 1: is it legal? 173 00:09:07,640 --> 00:09:07,880 Speaker 2: Yes. 174 00:09:08,000 --> 00:09:10,440 Speaker 1: Is it legal? Right. Well, I guess you can't have 175 00:09:10,480 --> 00:09:12,040 Speaker 1: a website if it's illegal, can you? 176 00:09:12,320 --> 00:09:17,520 Speaker 2: Yeah, it's a companionship service. Okay, it's companionship, is it? Yeah. 177 00:09:17,720 --> 00:09:22,400 Speaker 2: Is it you just go out to dinner? It could be. 178 00:09:23,080 --> 00:09:25,800 Speaker 2: It doesn't have to be a physically romantic encounter. It 179 00:09:25,840 --> 00:09:29,319 Speaker 2: could just be a, you know, above the shoulders, high 180 00:09:29,360 --> 00:09:32,400 Speaker 2: brow interaction with someone you want 181 00:09:32,200 --> 00:09:34,240 Speaker 1: to. Yeah, I can imagine it'd be like a Mensa 182 00:09:34,360 --> 00:09:39,440 Speaker 1: meeting with a salad and a hand job. What? Sorry? 183 00:09:39,640 --> 00:09:40,320 Speaker 1: Cut that out. 184 00:09:40,400 --> 00:09:40,600 Speaker 2: What? 185 00:09:40,840 --> 00:09:45,000 Speaker 1: No? Talk up? Hello? What did you say? I said salad. 186 00:09:48,080 --> 00:09:51,560 Speaker 1: Salad. Solid, Craigo. We're off to a flying start. No different 187 00:09:51,600 --> 00:09:54,720 Speaker 1: to every fucking week.
You know most people listen to 188 00:09:54,760 --> 00:09:56,880 Speaker 1: this, this bit, and then jump out when the tech 189 00:09:56,920 --> 00:10:01,760 Speaker 1: starts, like, fuck that boring tech talk. 190 00:10:01,960 --> 00:10:04,240 Speaker 2: Oh, I know. You know how you said we're probably 191 00:10:04,240 --> 00:10:06,560 Speaker 2: down to one listener? I can tell you who it is. 192 00:10:07,280 --> 00:10:08,040 Speaker 1: Who is it? 193 00:10:08,040 --> 00:10:11,120 Speaker 2: It's Jane. So I've got a message from Jane after 194 00:10:11,200 --> 00:10:14,520 Speaker 2: our last podcast, who is a big fan of TYP, 195 00:10:15,120 --> 00:10:17,160 Speaker 2: and she had a request for something that we could 196 00:10:17,240 --> 00:10:17,760 Speaker 2: talk about. 197 00:10:20,240 --> 00:10:22,520 Speaker 1: And what is that? Sorry, I was swigging on what 198 00:10:22,600 --> 00:10:25,080 Speaker 1: I have. Firstly, shout out to Jane. Hi, Jane, thanks 199 00:10:25,120 --> 00:10:27,640 Speaker 1: for being our one listener. What does Jane want you 200 00:10:27,720 --> 00:10:28,439 Speaker 1: to talk about? 201 00:10:28,679 --> 00:10:31,600 Speaker 2: She wants us to talk about deepfakes, and we 202 00:10:31,720 --> 00:10:34,360 Speaker 2: kind of have touched on this a few times, and 203 00:10:34,440 --> 00:10:36,840 Speaker 2: it is kind of a really important one, I guess, 204 00:10:36,880 --> 00:10:39,640 Speaker 2: to get our heads around. Jane also happens to listen 205 00:10:39,679 --> 00:10:43,599 Speaker 2: to ABC podcasts, and I listened to the same podcast, 206 00:10:43,640 --> 00:10:46,360 Speaker 2: called Ladies, We Need to Talk, and I kind of 207 00:10:46,360 --> 00:10:49,120 Speaker 2: felt like I was an impostor a little bit, 208 00:10:49,320 --> 00:10:52,160 Speaker 2: because if it's a podcast by women, for women, I 209 00:10:52,160 --> 00:10:54,200 Speaker 2: kind of felt, should I be listening to this?
210 00:10:54,280 --> 00:10:56,600 Speaker 2: But it was an amazingly good podcast, and I know 211 00:10:56,640 --> 00:10:58,240 Speaker 2: I shouldn't be promoting other podcasts. 212 00:10:58,240 --> 00:11:01,680 Speaker 1: Well, you can promote whatever. So is that an actual 213 00:11:01,720 --> 00:11:04,240 Speaker 1: podcast, or is that an episode? 214 00:11:04,280 --> 00:11:07,679 Speaker 2: That's Ladies... 215 00:11:06,200 --> 00:11:08,199 Speaker 1: Ladies, We Need to Talk. 216 00:11:08,559 --> 00:11:11,320 Speaker 2: It's really good, actually. But so the question was, and 217 00:11:11,360 --> 00:11:13,600 Speaker 2: I thought it was a really great question, and that 218 00:11:13,760 --> 00:11:17,040 Speaker 2: is, she wanted to hear our take on the prevalence 219 00:11:17,200 --> 00:11:20,920 Speaker 2: of deepfakes, and, you know, is it just for 220 00:11:20,960 --> 00:11:24,320 Speaker 2: the porn industry, and how disturbing do we think that 221 00:11:24,440 --> 00:11:27,559 Speaker 2: it is? And look, one of the stats that came 222 00:11:27,600 --> 00:11:31,160 Speaker 2: out that just blew my mind was that ninety six 223 00:11:31,200 --> 00:11:35,240 Speaker 2: percent of deepfakes are pornographic. So, keeping in mind 224 00:11:35,280 --> 00:11:38,360 Speaker 2: that deepfakes can be used for political reasons, to 225 00:11:38,360 --> 00:11:42,160 Speaker 2: try to sway people, to mislead people, the high 226 00:11:42,160 --> 00:11:45,000 Speaker 2: percentage, ninety six percent, of deepfakes are pornographic. 227 00:11:45,120 --> 00:11:48,200 Speaker 2: And this is the thing that is so disturbing: 228 00:11:48,559 --> 00:11:52,360 Speaker 2: ninety nine percent of deepfake videos and photos are 229 00:11:52,360 --> 00:11:56,640 Speaker 2: of women.
Yeah, and the interesting 230 00:11:56,679 --> 00:11:59,360 Speaker 2: thing that came out of that was that a lot 231 00:11:59,400 --> 00:12:02,920 Speaker 2: of the methods that people use, so the AI tools, 232 00:12:03,360 --> 00:12:07,080 Speaker 2: whether it's putting a person's face onto an existing pornographic video, 233 00:12:07,440 --> 00:12:09,880 Speaker 2: the majority of all of that is geared up 234 00:12:10,360 --> 00:12:16,239 Speaker 2: for female stock imagery, not male, because the industry 235 00:12:16,320 --> 00:12:20,000 Speaker 2: is geared up that way, and predominantly it is men wanting to, 236 00:12:21,120 --> 00:12:26,000 Speaker 2: you know, superimpose a woman's face onto whatever they're using 237 00:12:26,040 --> 00:12:28,680 Speaker 2: it for. There's a whole lot of websites out there 238 00:12:29,160 --> 00:12:32,960 Speaker 2: that obviously promote this sort of thing. They have celebrities, 239 00:12:34,080 --> 00:12:36,720 Speaker 2: but some of it can be quite malicious. And I 240 00:12:36,840 --> 00:12:39,640 Speaker 2: was looking at some recent court cases. The biggest court case 241 00:12:39,679 --> 00:12:42,880 Speaker 2: in Australian history 242 00:12:42,920 --> 00:12:45,400 Speaker 2: so far is recent, and a guy by the name 243 00:12:45,400 --> 00:12:51,440 Speaker 2: of Antonio Rotondo basically used deepfakes of prominent women 244 00:12:51,800 --> 00:12:56,839 Speaker 2: to create pornographic material. He was 245 00:12:56,960 --> 00:13:02,280 Speaker 2: fined three hundred and forty three thousand dollars by the 246 00:13:02,320 --> 00:13:07,040 Speaker 2: Federal Court after being caught uploading and not removing non 247 00:13:07,080 --> 00:13:12,320 Speaker 2: consensual deepfake sexual images. And these are Australian women.
248 00:13:12,640 --> 00:13:17,839 Speaker 2: So he breached, fourteen times, the 249 00:13:17,840 --> 00:13:22,280 Speaker 2: Online Safety Act, fourteen instances under the Online Safety Act. 250 00:13:22,800 --> 00:13:26,400 Speaker 2: So it's in the reach of average, everyday people. And 251 00:13:26,440 --> 00:13:29,520 Speaker 2: that's the thing, you know. He'd already previously been fined 252 00:13:29,559 --> 00:13:32,520 Speaker 2: twenty five thousand dollars. That didn't stop him. Twenty 253 00:13:32,600 --> 00:13:35,720 Speaker 2: five thousand dollars didn't stop him from doing it. So 254 00:13:36,160 --> 00:13:38,280 Speaker 2: he may have been getting money out of this. He 255 00:13:38,360 --> 00:13:41,080 Speaker 2: may have been publishing, and people subscribed to the site. 256 00:13:41,200 --> 00:13:44,160 Speaker 2: So I'm not entirely sure, I didn't follow the case 257 00:13:44,240 --> 00:13:48,240 Speaker 2: per se, but, you know, it's a pretty staggering one. 258 00:13:48,280 --> 00:13:50,400 Speaker 2: And do you remember, and this hit close to 259 00:13:50,440 --> 00:13:53,520 Speaker 2: home, because last year there were some boys at a 260 00:13:53,520 --> 00:13:56,800 Speaker 2: local grammar school in Bacchus Marsh that were caught, 261 00:13:56,920 --> 00:13:59,679 Speaker 2: and what they'd been doing is creating AI generated nude 262 00:14:00,000 --> 00:14:04,080 Speaker 2: images of about fifty female students at their school, 263 00:14:04,440 --> 00:14:07,199 Speaker 2: and then they were sharing them amongst themselves, and one 264 00:14:07,240 --> 00:14:08,800 Speaker 2: of the boys was actually charged. 265 00:14:08,960 --> 00:14:12,600 Speaker 1: But can we just clear up, they weren't nude images of 266 00:14:12,640 --> 00:14:17,360 Speaker 1: the girls, were they? They were the girls' heads imposed on existing nude 267 00:14:17,160 --> 00:14:19,000 Speaker 2: bodies. Exactly, that's exactly right.
268 00:14:19,080 --> 00:14:21,880 Speaker 1: So it wasn't the actual girls' bodies, but it was, 269 00:14:22,360 --> 00:14:25,400 Speaker 1: they were also fakes where they'd put the girls' heads on. 270 00:14:26,400 --> 00:14:30,320 Speaker 1: And I think, just explain that to people, because, like, 271 00:14:30,480 --> 00:14:32,880 Speaker 1: so deep... correct me if I'm wrong, but a deepfake 272 00:14:32,920 --> 00:14:37,200 Speaker 1: is essentially using an image, putting someone's real face, 273 00:14:37,320 --> 00:14:41,320 Speaker 1: like my face, on some other dude's nude body or 274 00:14:41,320 --> 00:14:44,440 Speaker 1: something like that, and then representing that 275 00:14:44,520 --> 00:14:46,920 Speaker 1: as though it's real: this is really Craig Harper 276 00:14:47,000 --> 00:14:48,280 Speaker 1: wearing nothing, or whatever it is. 277 00:14:48,400 --> 00:14:52,520 Speaker 2: Right. It could be face, it could be voice, remember. 278 00:14:52,640 --> 00:14:55,360 Speaker 2: You know, that's classified as a deepfake as well, 279 00:14:55,400 --> 00:14:58,320 Speaker 2: putting a person's voice in and using that. And you've heard, 280 00:14:58,800 --> 00:15:03,440 Speaker 2: you know, the scams now where people call and misrepresent. 281 00:15:03,880 --> 00:15:08,240 Speaker 2: So, you know, you might have a grandchild who's backpacking 282 00:15:08,480 --> 00:15:12,800 Speaker 2: through the UK, and they're posting on social media, and 283 00:15:12,800 --> 00:15:16,040 Speaker 2: they've got a video that has them talking. Well, potentially 284 00:15:16,120 --> 00:15:19,480 Speaker 2: a scammer could sample that and then call their family 285 00:15:19,520 --> 00:15:21,760 Speaker 2: and say, look, I've run out of money, something's happened, 286 00:15:21,760 --> 00:15:24,480 Speaker 2: I've lost all my luggage. And these are things that 287 00:15:24,480 --> 00:15:27,360 Speaker 2: are happening right now.
You know, there was a CEO 288 00:15:27,600 --> 00:15:31,800 Speaker 2: or a CFO of a company where, you know, 289 00:15:31,960 --> 00:15:34,640 Speaker 2: people used deepfakes to scam him, to try 290 00:15:34,640 --> 00:15:36,360 Speaker 2: to get him to change some bank 291 00:15:36,400 --> 00:15:39,040 Speaker 2: accounts over, you know, to transfer money 292 00:15:39,040 --> 00:15:41,560 Speaker 2: into another bank account. So there's lots of instances of 293 00:15:42,160 --> 00:15:44,520 Speaker 2: this happening, even here in Australia as well. 294 00:15:45,200 --> 00:15:47,440 Speaker 1: I got an email a few years ago from some, 295 00:15:48,360 --> 00:15:51,840 Speaker 1: I don't know, whatever, going, we've got video footage of 296 00:15:51,920 --> 00:15:56,960 Speaker 1: you watching porn and, ahem, pleasuring yourself, we're going 297 00:15:57,040 --> 00:16:00,880 Speaker 1: to share it. I'm like, share away. How good's that? 298 00:16:01,200 --> 00:16:06,040 Speaker 1: And they're like, ah. I'm like, it's like... it's going... 299 00:16:06,760 --> 00:16:09,240 Speaker 2: My answer to that would have been, can we monetize this, 300 00:16:09,400 --> 00:16:11,160 Speaker 2: and let's go fifty-fifty. 301 00:16:11,320 --> 00:16:14,840 Speaker 1: Yeah. And they're like, if you send us this... and 302 00:16:14,880 --> 00:16:18,960 Speaker 1: I'm like, really? But I think that this, 303 00:16:19,720 --> 00:16:22,920 Speaker 1: I mean, this is just one, you know, version of, 304 00:16:23,080 --> 00:16:28,080 Speaker 1: or one of the myriad of potential consequences and variables of, 305 00:16:28,440 --> 00:16:30,920 Speaker 1: you know, artificial intelligence and all the stuff that we've 306 00:16:30,920 --> 00:16:33,760 Speaker 1: got to deal with.
And I was listening to Rogan 307 00:16:33,840 --> 00:16:36,600 Speaker 1: yesterday, who was talking about... if you get a 308 00:16:36,680 --> 00:16:40,920 Speaker 1: chance to see it, it's, it's ridiculous. So Fifty Cent, 309 00:16:41,120 --> 00:16:44,240 Speaker 1: of course, for our older listeners, Fifty Cent is a rapper. 310 00:16:44,720 --> 00:16:47,960 Speaker 1: He's an artist, and they took one of his songs 311 00:16:48,720 --> 00:16:53,200 Speaker 1: and they turned it into this nineteen fifties, sixties bluesy 312 00:16:53,360 --> 00:16:58,200 Speaker 1: kind of song, where this guy was singing, this cool 313 00:16:58,280 --> 00:17:03,440 Speaker 1: dude, all AI generated, and it was unbelievably good, and 314 00:17:03,480 --> 00:17:06,360 Speaker 1: it was just using these rap lyrics turned into this 315 00:17:07,359 --> 00:17:11,760 Speaker 1: kind of very, very cool fifties style jazz kind of 316 00:17:11,800 --> 00:17:15,439 Speaker 1: blues kind of song and video clip, and it 317 00:17:15,640 --> 00:17:19,360 Speaker 1: looked absolutely real. And the problem is, not only 318 00:17:19,400 --> 00:17:23,000 Speaker 1: did it look real, it was great. Even Rogan's like, 319 00:17:23,280 --> 00:17:25,760 Speaker 1: I love this, and it's fake. And this is going 320 00:17:25,800 --> 00:17:30,600 Speaker 1: to be, I mean, moving forward, with creatives and artists, 321 00:17:30,640 --> 00:17:35,360 Speaker 1: whether that's writers or singers or musicians or, you know, whatever, 322 00:17:36,080 --> 00:17:39,639 Speaker 1: there's going to be so much stuff that's produced that, 323 00:17:40,400 --> 00:17:43,600 Speaker 1: like it or not, is going to be good, like, 324 00:17:43,680 --> 00:17:45,919 Speaker 1: as in, that's going to be doing people out of, 325 00:17:47,640 --> 00:17:52,359 Speaker 1: I guess, their income or their job in a way. 326 00:17:52,680 --> 00:17:54,960 Speaker 1: But what do you do about that, though?
If they're 327 00:17:55,000 --> 00:18:00,159 Speaker 1: producing music with AI that's phenomenal to listen to, what do 328 00:18:00,200 --> 00:18:00,399 Speaker 1: you do? 329 00:18:01,119 --> 00:18:03,600 Speaker 2: I think as long as there's a disclaimer on there 330 00:18:03,640 --> 00:18:06,000 Speaker 2: and they're not trying to be... was it Milli Vanilli? 331 00:18:06,080 --> 00:18:07,080 Speaker 2: Remember Milli Vanilli? 332 00:18:07,359 --> 00:18:07,520 Speaker 3: Yeah? 333 00:18:07,560 --> 00:18:11,639 Speaker 1: Yeah, yeah, too young for that. Milli Vanilli was this 334 00:18:11,880 --> 00:18:15,320 Speaker 1: two person group who rose to fame, I'm going to 335 00:18:15,359 --> 00:18:19,040 Speaker 1: say in the nineties, Tiff, and they were super cool. 336 00:18:19,160 --> 00:18:23,240 Speaker 1: Everyone loved them, and they just, like, for about, I 337 00:18:23,280 --> 00:18:25,040 Speaker 1: was going to say five minutes, but probably a year 338 00:18:25,160 --> 00:18:28,560 Speaker 1: or two, they were massive. And then everyone, then, then 339 00:18:28,600 --> 00:18:31,960 Speaker 1: we found out, I don't know, just not long after, 340 00:18:32,480 --> 00:18:36,480 Speaker 1: they lip synced everything and they didn't sing anything. Yeah, 341 00:18:36,640 --> 00:18:42,040 Speaker 1: it was just these two super cool dudes who couldn't sing. Yeah. 342 00:18:42,040 --> 00:18:44,119 Speaker 2: I think they tried to do a live concert and 343 00:18:44,160 --> 00:18:46,200 Speaker 2: it backfired on them. So they were trying to lip 344 00:18:46,200 --> 00:18:49,840 Speaker 2: sync during a live gig, yeah, I think is what happened. Yeah. 345 00:18:50,000 --> 00:18:52,760 Speaker 2: But, you know, the other thing that's worth keeping in 346 00:18:52,800 --> 00:18:57,160 Speaker 2: mind in terms of deepfakes is it's been used 347 00:18:57,160 --> 00:19:00,919 Speaker 2: in films, to positive effect.
You know, Harrison Ford 348 00:19:01,640 --> 00:19:04,160 Speaker 2: was deep faked so that in one of the last 349 00:19:04,200 --> 00:19:07,240 Speaker 2: Indiana Jones movies they could have a flashback to when 350 00:19:07,240 --> 00:19:09,399 Speaker 2: he was younger. So there's lots of, you know... and 351 00:19:09,720 --> 00:19:13,000 Speaker 2: I'm trying to think, who is the co-star in Top 352 00:19:13,040 --> 00:19:18,080 Speaker 2: Gun? Val Kilmer. So Val Kilmer had a terrible illness 353 00:19:18,119 --> 00:19:20,600 Speaker 2: where he lost his voice, but he acted in a 354 00:19:20,680 --> 00:19:23,399 Speaker 2: movie and they deep faked his voice so that he 355 00:19:23,440 --> 00:19:26,560 Speaker 2: could still continue to act. So there's lots of instances 356 00:19:26,600 --> 00:19:28,560 Speaker 2: where you can use deep fakes. And I know a 357 00:19:28,600 --> 00:19:32,679 Speaker 2: couple of episodes ago... I think it's a Scandinavian film 358 00:19:33,280 --> 00:19:39,560 Speaker 2: about an alien abduction where the cast performed in 359 00:19:39,600 --> 00:19:43,800 Speaker 2: their native tongue, and then they re-voiced all of 360 00:19:43,840 --> 00:19:47,520 Speaker 2: the parts in English, but rather than re-acting it, they 361 00:19:47,560 --> 00:19:50,200 Speaker 2: just did the voices, and then the lips were made 362 00:19:50,520 --> 00:19:54,440 Speaker 2: to move with the English words by using deep fake. 363 00:19:55,800 --> 00:19:59,520 Speaker 2: I should watch it, because I've been... I just rewatched the 364 00:19:59,640 --> 00:20:02,760 Speaker 2: series that I really liked on Netflix called Young Royals, 365 00:20:03,400 --> 00:20:07,399 Speaker 2: and the cast re-recorded the whole thing in English, 366 00:20:07,400 --> 00:20:09,560 Speaker 2: which is so much better than getting just a random 367 00:20:09,600 --> 00:20:13,240 Speaker 2: actor to voice the character. But they didn't dub.
Obviously, 368 00:20:13,320 --> 00:20:16,760 Speaker 2: the lips are still speaking, you know, in the foreign tongue, 369 00:20:17,160 --> 00:20:19,000 Speaker 2: and it would have been great if they could lip 370 00:20:19,000 --> 00:20:21,920 Speaker 2: sync it, because it would have looked even better. But those sorts 371 00:20:21,920 --> 00:20:24,120 Speaker 2: of things make a big difference. And that's where it's 372 00:20:24,200 --> 00:20:28,399 Speaker 2: kind of being used properly. Someone else recently, I 373 00:20:28,440 --> 00:20:32,280 Speaker 2: can't remember, someone famous, gave a talk and then they 374 00:20:32,440 --> 00:20:36,280 Speaker 2: used deep fake to translate it into about eight or nine 375 00:20:36,320 --> 00:20:39,359 Speaker 2: different languages. So they didn't speak the other languages, but 376 00:20:39,440 --> 00:20:41,320 Speaker 2: they were able to use it to be able to 377 00:20:42,320 --> 00:20:45,560 Speaker 2: bring that message, or whatever it was, in multiple languages, 378 00:20:45,640 --> 00:20:46,840 Speaker 2: using deep fake as well. 379 00:20:47,160 --> 00:20:51,920 Speaker 1: Isn't that just AI? It's not all deep... like, deep 380 00:20:51,960 --> 00:20:54,800 Speaker 1: fake seems something dodgy, but it's like, isn't it just 381 00:20:54,960 --> 00:20:59,919 Speaker 1: AI-generated whatever? Isn't it like CGI, computer-generated imagery? You know, 382 00:21:00,040 --> 00:21:03,200 Speaker 1: when you're putting in scenes that weren't in there, isn't 383 00:21:03,240 --> 00:21:04,359 Speaker 1: that the same now? 384 00:21:04,480 --> 00:21:07,840 Speaker 2: AI, artificial intelligence, is the method for doing it. So 385 00:21:07,920 --> 00:21:11,159 Speaker 2: you're using AI tools, and that's where it comes in.
So 386 00:21:11,160 --> 00:21:13,960 Speaker 2: whether you're using AI tools as a scammer or a fraudster, 387 00:21:14,480 --> 00:21:17,679 Speaker 2: or if it's a politician trying to sway public opinion, 388 00:21:17,720 --> 00:21:19,240 Speaker 2: it's just the tools that they're using. 389 00:21:20,040 --> 00:21:23,000 Speaker 1: But what I'm saying is, if we're... say, for example, 390 00:21:23,080 --> 00:21:26,560 Speaker 1: someone like, I think, Paul Walker. You know that actor 391 00:21:26,600 --> 00:21:30,800 Speaker 1: Paul Walker. He passed away partway through filming one of 392 00:21:30,840 --> 00:21:33,120 Speaker 1: The Fast and the Furious. I could have this wrong, 393 00:21:33,160 --> 00:21:35,120 Speaker 1: but I feel like they did a couple of scenes, 394 00:21:35,840 --> 00:21:40,479 Speaker 1: you know, which was CGI, because he'd passed away. So 395 00:21:41,520 --> 00:21:43,760 Speaker 1: I wouldn't call that deep fake, because we all know 396 00:21:43,800 --> 00:21:45,960 Speaker 1: what's going on. Do you know what I mean? Anyway... 397 00:21:46,040 --> 00:21:50,040 Speaker 2: Semantics. Yeah, it's the semantics of the term deep fake. Yeah, I 398 00:21:50,040 --> 00:21:51,560 Speaker 2: get what you're saying. 399 00:21:51,680 --> 00:21:55,720 Speaker 1: To me, that's associated with deception, whereas this is just, like, just, 400 00:21:57,600 --> 00:22:01,280 Speaker 1: you know, computer-generated stuff that makes something better. 401 00:22:01,440 --> 00:22:04,760 Speaker 2: I guess if you de-age an actor, it is 402 00:22:04,800 --> 00:22:07,439 Speaker 2: deep faking the actor. It's not the real person. 403 00:22:07,480 --> 00:22:11,240 Speaker 2: It's using a technique, whether it's AI or advanced CGI, 404 00:22:11,520 --> 00:22:14,520 Speaker 2: probably using some sort of AI. But it is effectively 405 00:22:14,520 --> 00:22:16,359 Speaker 2: a deep fake. The term is still deep fake.
It's 406 00:22:16,440 --> 00:22:18,919 Speaker 2: kind of the umbrella term for when you alter an 407 00:22:18,960 --> 00:22:21,800 Speaker 2: image to be different from what it was by superimposing, 408 00:22:21,880 --> 00:22:24,719 Speaker 2: or making them younger, or putting another face on another person. 409 00:22:24,800 --> 00:22:26,920 Speaker 2: So effectively, that is deep fake. 410 00:22:27,720 --> 00:22:30,040 Speaker 1: So, Tiff, you and I are deep faking right now, 411 00:22:30,040 --> 00:22:31,960 Speaker 1: because we've got filters on our things. I don't know, 412 00:22:32,000 --> 00:22:33,639 Speaker 1: do I have a filter? I don't think I do. 413 00:22:34,560 --> 00:22:37,440 Speaker 1: Everybody who uses a filter on Zoom's deep faking. 414 00:22:37,960 --> 00:22:40,560 Speaker 3: Yeah, this is not a real mustache. 415 00:22:41,359 --> 00:22:44,119 Speaker 2: You look good with that, though. The top hat was fantastic. 416 00:22:44,200 --> 00:22:48,160 Speaker 1: Yeah, oh yeah, yeah. I wish you'd shave your hairy 417 00:22:48,240 --> 00:22:57,560 Speaker 1: shoulders, though. Yeah. Patrick, tell us... I don't know... 418 00:22:57,560 --> 00:23:01,359 Speaker 1: tell us about reusable water bottles that are a breeding 419 00:23:01,440 --> 00:23:03,120 Speaker 1: ground for horrible things. 420 00:23:03,640 --> 00:23:05,200 Speaker 2: Is that what you're drinking out of now? 421 00:23:07,359 --> 00:23:10,320 Speaker 1: Yes? Yeah, but I'm doing that. I'm having lots of 422 00:23:10,359 --> 00:23:14,520 Speaker 1: breeding ground activity for my gut biome. My Chinese 423 00:23:14,560 --> 00:23:19,680 Speaker 1: doctor told me to. Well, my Chinese medicine doctor, I 424 00:23:19,720 --> 00:23:22,600 Speaker 1: should say, who happens to be Chinese. 425 00:23:22,880 --> 00:23:25,919 Speaker 2: Well, the market, the global market for reusable water bottles, 426 00:23:26,000 --> 00:23:30,440 Speaker 2: is valued at about ten billion dollars.
Okay, so it's very, very, 427 00:23:30,560 --> 00:23:33,080 Speaker 2: very big. But the question is, is that every year? 428 00:23:33,119 --> 00:23:35,560 Speaker 2: Because if you've got a reusable bottle, you shouldn't be 429 00:23:35,600 --> 00:23:37,919 Speaker 2: buying one every year, should you? Anyway, the short of it 430 00:23:38,000 --> 00:23:41,080 Speaker 2: is, the problem that you have with that is some 431 00:23:41,680 --> 00:23:44,439 Speaker 2: drink bottles have those pop-up straws, or they have 432 00:23:44,480 --> 00:23:48,960 Speaker 2: a narrow opening, and it's plastic and can be a breeding ground. 433 00:23:49,119 --> 00:23:52,040 Speaker 2: And I've read reports where they say you should clean 434 00:23:52,720 --> 00:23:56,480 Speaker 2: your water bottle every day, every single day. You should 435 00:23:56,560 --> 00:23:59,520 Speaker 2: wash it out and clean it, sterilize it, like, hot water, 436 00:23:59,840 --> 00:24:04,440 Speaker 2: and clean it properly. But yeah, so basically there have been 437 00:24:04,520 --> 00:24:08,600 Speaker 2: studies done specifically looking at people who are using water 438 00:24:08,600 --> 00:24:11,359 Speaker 2: bottles on a daily basis, and how often... so 439 00:24:11,400 --> 00:24:13,880 Speaker 2: the researchers were saying, well, how clean are the bottles 440 00:24:14,240 --> 00:24:16,320 Speaker 2: that people are using on a daily basis? We know 441 00:24:16,359 --> 00:24:18,800 Speaker 2: we're trying to do the right thing, not wasting plastics, 442 00:24:19,040 --> 00:24:22,719 Speaker 2: and so we're trying to do the right thing. But 443 00:24:22,920 --> 00:24:25,600 Speaker 2: is that a good thing from our perspective?
Because, you know, 444 00:24:25,640 --> 00:24:28,359 Speaker 2: most people... well, not most people, but there's a chance 445 00:24:28,400 --> 00:24:31,280 Speaker 2: that you might have fecal bacteria on your hands when 446 00:24:31,359 --> 00:24:34,919 Speaker 2: you flip the lid on it, and certainly you've passed 447 00:24:34,920 --> 00:24:38,560 Speaker 2: that on to the straw. Sorry, Tiff. Tiff's about to start having 448 00:24:38,640 --> 00:24:39,320 Speaker 2: dinner tonight. 449 00:24:40,560 --> 00:24:43,200 Speaker 4: Sorry, but I mean, the funny thing is, though, right, 450 00:24:44,040 --> 00:24:46,760 Speaker 4: we all... I mean, we would all 451 00:24:46,760 --> 00:24:50,879 Speaker 4: be surprised how many people... like, we go, "fecal bacteria!" 452 00:24:51,040 --> 00:24:53,639 Speaker 1: Okay, so probably everyone has a bit of that on 453 00:24:53,680 --> 00:24:56,880 Speaker 1: them somewhere. But people think, oh no... and 454 00:24:57,520 --> 00:25:00,480 Speaker 1: it's like, well, no. We were talking to Doctor Bill, 455 00:25:00,640 --> 00:25:02,760 Speaker 1: Doctor Bill Sullivan, who you probably don't know, Patrick, but 456 00:25:02,800 --> 00:25:05,159 Speaker 1: he comes on the show from Indiana State University, he's 457 00:25:05,240 --> 00:25:08,600 Speaker 1: like a professor of microbiology or something, and he was 458 00:25:08,640 --> 00:25:11,040 Speaker 1: talking about how one of the reasons people get so sick 459 00:25:11,200 --> 00:25:14,640 Speaker 1: is because our environment is so sanitized. He's like, we 460 00:25:14,760 --> 00:25:17,200 Speaker 1: need dirt, we need germs. 461 00:25:17,160 --> 00:25:20,040 Speaker 3: Like we need on your water bottle, Patrick. 462 00:25:19,720 --> 00:25:22,119 Speaker 1: Yeah, we need bacteria. Exactly. Have a little bit of 463 00:25:22,160 --> 00:25:25,480 Speaker 1: kaka in your cam Butcher.
It's probably kaka in your 464 00:25:25,520 --> 00:25:29,480 Speaker 1: cam Butcher anyway, Patrick. It's probably twelve percent kaka, which 465 00:25:29,520 --> 00:25:31,920 Speaker 1: is why it's so good for your gut. You don't 466 00:25:31,960 --> 00:25:36,160 Speaker 1: need to do the old poop pills up the old date. 467 00:25:36,440 --> 00:25:40,320 Speaker 1: You can just do that. I mean, that is literally 468 00:25:40,359 --> 00:25:44,439 Speaker 1: a medical thing. Yeah, but I just think that we, you know, 469 00:25:44,480 --> 00:25:48,400 Speaker 1: in twenty twenty five, we just worry, you know, for 470 00:25:48,520 --> 00:25:52,040 Speaker 1: us living in first-world luxury. Oh, for God's sake, 471 00:25:52,080 --> 00:25:55,840 Speaker 1: stop being so, no pun intended, anal. It's like, 472 00:25:56,040 --> 00:25:59,720 Speaker 1: oh my god, just, you know, we don't have to 473 00:25:59,760 --> 00:26:04,720 Speaker 1: be wiping and sanitizing everything. And not only that, 474 00:26:04,720 --> 00:26:07,119 Speaker 1: we fuck up our immune system because we have no 475 00:26:07,240 --> 00:26:10,200 Speaker 1: exposure to some of the bacteria we should. But then 476 00:26:10,240 --> 00:26:12,960 Speaker 1: on top of that you layer the anxiety that people 477 00:26:13,000 --> 00:26:16,040 Speaker 1: have about this shit, which makes them even worse. So 478 00:26:16,119 --> 00:26:21,119 Speaker 1: now we've got mental health issues around everything being fucking sanitized, 479 00:26:21,520 --> 00:26:24,840 Speaker 1: and that makes people sick too.
You know what, for 480 00:26:24,920 --> 00:26:28,400 Speaker 1: most of the timeline of humanity, we have not done all 481 00:26:28,440 --> 00:26:31,560 Speaker 1: the things that we do around this shit. Anyway. 482 00:26:31,840 --> 00:26:35,040 Speaker 1: Owning a dog is probably the best way to get 483 00:26:35,040 --> 00:26:38,240 Speaker 1: a mix of bacteria, because, I don't know about you, Tiff, 484 00:26:38,400 --> 00:26:39,959 Speaker 1: does Luna ever slip the tongue in? 485 00:26:40,119 --> 00:26:42,000 Speaker 2: You know, me too. 486 00:26:42,560 --> 00:26:46,200 Speaker 1: Fritz does the same to me. You get a bit close after 487 00:26:46,280 --> 00:26:49,479 Speaker 1: licking his balls, then he's licking your lips, so fucking 488 00:26:49,560 --> 00:26:55,560 Speaker 1: forget all your hygiene. Hang on though, I thought 489 00:26:56,000 --> 00:26:57,200 Speaker 1: Fritz doesn't have balls. 490 00:26:57,680 --> 00:26:59,800 Speaker 2: No, well, his previous owners took him to the vet 491 00:26:59,800 --> 00:27:00,760 Speaker 2: when he was little. 492 00:27:00,760 --> 00:27:06,240 Speaker 1: Wow, like father, like son? What? What? Sorry? Sorry? What? Sorry? 493 00:27:06,240 --> 00:27:12,719 Speaker 1: I just... it just broke up then. Come on, Patrick, 494 00:27:12,800 --> 00:27:15,760 Speaker 1: jump back on your list. 495 00:27:15,920 --> 00:27:22,080 Speaker 2: Let's get off this topic fast. What about... do you, before you 496 00:27:22,160 --> 00:27:22,720 Speaker 2: go to bed... 497 00:27:23,680 --> 00:27:25,960 Speaker 1: I love it when you get tongue-tied. It's so 498 00:27:26,000 --> 00:27:29,520 Speaker 1: good, because you never get tongue-tied. Yeah, before I 499 00:27:29,560 --> 00:27:30,879 Speaker 1: go to bed? Yeah, keep going. 500 00:27:31,280 --> 00:27:33,240 Speaker 2: How long before you go to bed do you look 501 00:27:33,240 --> 00:27:34,520 Speaker 2: at a screen? TV?
502 00:27:36,280 --> 00:27:38,680 Speaker 1: If I'm being honest... if I'm being honest, I'm in 503 00:27:38,720 --> 00:27:41,920 Speaker 1: bed watching TV. I'm terrible at this. Yeah, 504 00:27:42,080 --> 00:27:44,800 Speaker 1: I'm the worst example of what to do. I tell 505 00:27:44,840 --> 00:27:47,920 Speaker 1: people what to do, but I... I sleep like a 506 00:27:48,040 --> 00:27:49,560 Speaker 1: rock, though, so I don't know. 507 00:27:49,600 --> 00:27:51,240 Speaker 2: Wow. 508 00:27:52,280 --> 00:27:54,760 Speaker 1: You drink... I'm drinking coffee right now, see. 509 00:27:55,800 --> 00:27:57,119 Speaker 2: Wow, you're drinking... 510 00:27:56,960 --> 00:27:58,760 Speaker 1: It's five-thirty. I'm drinking coffee, and I'll have a 511 00:27:58,760 --> 00:28:00,680 Speaker 1: cup of tea. I often take a cup of tea 512 00:28:00,720 --> 00:28:05,200 Speaker 1: to bed and watch telly, so my brain has all 513 00:28:05,240 --> 00:28:06,320 Speaker 1: the stimulants. 514 00:28:06,760 --> 00:28:12,480 Speaker 2: Wow. A Norwegian study, Craigo, involving more than forty-five 515 00:28:12,680 --> 00:28:17,800 Speaker 2: thousand students, right, found that when they used screens in 516 00:28:17,840 --> 00:28:21,800 Speaker 2: bed for one hour after lights out, it increased their 517 00:28:21,880 --> 00:28:27,040 Speaker 2: risk of insomnia by fifty-nine percent and shortened their 518 00:28:27,080 --> 00:28:29,520 Speaker 2: total sleep by twenty-five minutes per night. 519 00:28:30,600 --> 00:28:33,200 Speaker 1: I believe it, and you believe it, but... 520 00:28:33,240 --> 00:28:36,200 Speaker 1: it's not for everyone, but it's generally true, though. Yeah, 521 00:28:36,240 --> 00:28:38,600 Speaker 1: doing what I do, I say to people, don't do what 522 00:28:38,680 --> 00:28:39,760 Speaker 1: I do. It's terrible.
523 00:28:40,480 --> 00:28:43,080 Speaker 2: I try to not look at my phone, and listen 524 00:28:43,120 --> 00:28:45,400 Speaker 2: to an audiobook or a podcast before I go 525 00:28:45,520 --> 00:28:48,640 Speaker 2: to bed, and black out the room. I have a 526 00:28:48,720 --> 00:28:52,840 Speaker 2: totally blacked-out room. Even in mine, I've got a 527 00:28:53,000 --> 00:28:56,080 Speaker 2: digital clock that projects onto the ceiling. It's a little 528 00:28:56,200 --> 00:28:58,800 Speaker 2: LED display. And then what I do is, I've got 529 00:28:58,800 --> 00:29:02,000 Speaker 2: two pillows, and I push the pillows across so that 530 00:29:02,160 --> 00:29:05,440 Speaker 2: covers the laser, and that way I don't even get 531 00:29:05,480 --> 00:29:09,400 Speaker 2: the glow of the red light that's reflected. It's 532 00:29:09,720 --> 00:29:10,720 Speaker 2: got to be totally black. 533 00:29:12,240 --> 00:29:17,120 Speaker 1: Yeah, yeah, Cook. Do you have a particular sleep protocol? 534 00:29:17,520 --> 00:29:21,320 Speaker 3: Yeah. I now try and get off the phone an 535 00:29:21,400 --> 00:29:27,120 Speaker 3: hour before bed. And the phone lives in the 536 00:29:27,200 --> 00:29:29,640 Speaker 3: kitchen of a night, so I can't look at it. 537 00:29:30,000 --> 00:29:31,960 Speaker 3: No, it's there because we know that keeps the 538 00:29:32,000 --> 00:29:36,200 Speaker 3: brain active, or it does for me, and I have 539 00:29:36,240 --> 00:29:39,720 Speaker 3: to read a real book. Now, I've got a protocol 540 00:29:39,800 --> 00:29:42,280 Speaker 3: that is working, and my sleep's gone up about an 541 00:29:42,280 --> 00:29:45,280 Speaker 3: average of half an hour a night, which is phenomenal. I'm having 542 00:29:45,320 --> 00:29:48,040 Speaker 3: some good... having some good times with that new... with 543 00:29:48,120 --> 00:29:48,840 Speaker 3: that protocol. 544 00:29:50,600 --> 00:29:50,840 Speaker 2: Yeah.
545 00:29:52,160 --> 00:29:54,360 Speaker 1: The only thing that I do is I listen to... 546 00:29:54,880 --> 00:29:58,640 Speaker 1: I've got this on repeat. Basically it sounds like surf, like 547 00:29:58,760 --> 00:30:03,320 Speaker 1: crashing waves, which I play pretty much every night, and 548 00:30:03,400 --> 00:30:05,320 Speaker 1: I don't know how many waves you hear crash, but 549 00:30:05,360 --> 00:30:10,120 Speaker 1: it's not that many. I've got forty minutes of crashing 550 00:30:10,200 --> 00:30:13,680 Speaker 1: waves and surf. But when I play, 551 00:30:13,760 --> 00:30:16,960 Speaker 1: like, even gentle music, it distracts me because of the 552 00:30:17,040 --> 00:30:21,840 Speaker 1: different... the different instruments. But if it's essentially like white 553 00:30:21,920 --> 00:30:24,400 Speaker 1: noise or brown noise or green noise or any of those, 554 00:30:24,440 --> 00:30:31,000 Speaker 1: where it's a constant, yeah, I fall asleep straight up. 555 00:30:31,200 --> 00:30:33,600 Speaker 2: It's interesting. My digital clock, you know, the fancy 556 00:30:33,600 --> 00:30:36,760 Speaker 2: one with the laser that points the time onto the roof, 557 00:30:37,080 --> 00:30:41,240 Speaker 2: it also has pre-recorded wave sounds. What else does 558 00:30:41,280 --> 00:30:47,000 Speaker 2: it have? You know, like a creek, a babbling brook, like wind 559 00:30:47,120 --> 00:30:49,560 Speaker 2: through the trees. I've never wanted to turn that 560 00:30:49,640 --> 00:30:52,560 Speaker 2: on and have a listen. Isn't that funny? I should. 561 00:30:52,800 --> 00:30:55,880 Speaker 1: You should. It could be... it could 562 00:30:55,880 --> 00:30:57,600 Speaker 1: be the thing that gets you over the line. Tell 563 00:30:57,680 --> 00:31:03,160 Speaker 1: us why we wouldn't charge our phone when we're renting 564 00:31:03,200 --> 00:31:03,520 Speaker 1: a car. 565 00:31:04,200 --> 00:31:06,800 Speaker 2: Oh, see, you know, it's funny.
I did this little 566 00:31:06,840 --> 00:31:10,120 Speaker 2: story, or I put it aside in our notes, a previous... 567 00:31:10,520 --> 00:31:12,440 Speaker 2: previous podcast, and we didn't get a chance to talk 568 00:31:12,480 --> 00:31:14,600 Speaker 2: about it. And it was that very same day I 569 00:31:14,600 --> 00:31:17,640 Speaker 2: was getting my car serviced, do you remember? Yeah. And 570 00:31:17,720 --> 00:31:20,160 Speaker 2: so I'd done that very thing. I plugged my car... 571 00:31:20,840 --> 00:31:22,600 Speaker 2: my phone into the car so I could use my 572 00:31:22,680 --> 00:31:26,000 Speaker 2: GPS and Android Auto, and it hadn't even occurred to me. 573 00:31:26,560 --> 00:31:29,080 Speaker 2: And the reality of it is, when you are in 574 00:31:29,120 --> 00:31:32,560 Speaker 2: a courtesy car and you plug your phone in, potentially 575 00:31:32,600 --> 00:31:36,360 Speaker 2: it can be synchronizing with your phone, taking your contacts off, 576 00:31:36,440 --> 00:31:40,040 Speaker 2: sharing your contacts with the vehicle. And if someone knew 577 00:31:40,040 --> 00:31:42,040 Speaker 2: what they were doing, they could go back. And I've 578 00:31:42,040 --> 00:31:44,160 Speaker 2: found this before. If you've ever got into a courtesy 579 00:31:44,200 --> 00:31:46,720 Speaker 2: car and you've hooked up by Wi-Fi or hooked 580 00:31:46,760 --> 00:31:49,440 Speaker 2: up by cable, you can see other people, other 581 00:31:49,560 --> 00:31:51,560 Speaker 2: users, who've used the car. Have you ever seen that? 582 00:31:52,240 --> 00:31:55,680 Speaker 1: I have not, but I believe that. Yeah, it 583 00:31:55,680 --> 00:31:56,600 Speaker 1: doesn't surprise me. 584 00:31:57,200 --> 00:31:59,800 Speaker 2: So that's the reality. When you plug in a cable, 585 00:32:00,080 --> 00:32:02,360 Speaker 2: and it may even just be to charge your phone, 586 00:32:02,480 --> 00:32:04,360 Speaker 2: you think, oh, I'd better charge the phone.
Well, 587 00:32:04,400 --> 00:32:08,960 Speaker 2: potentially you could be transferring information between your phone and 588 00:32:09,080 --> 00:32:13,080 Speaker 2: the car. And most car companies, or rental companies, have 589 00:32:13,360 --> 00:32:16,640 Speaker 2: zero policy, because it wouldn't occur to them to have 590 00:32:16,680 --> 00:32:20,640 Speaker 2: a policy about the privacy of your data if it 591 00:32:20,680 --> 00:32:23,040 Speaker 2: gets sucked up by the car that you're using 592 00:32:23,080 --> 00:32:24,960 Speaker 2: as a hire car. 593 00:32:25,000 --> 00:32:28,200 Speaker 1: Do you reckon we already live in a time where, in Australia anyway, 594 00:32:28,320 --> 00:32:35,680 Speaker 1: it's pretty much impossible to have a technology-free life? Yeah, 595 00:32:35,720 --> 00:32:38,880 Speaker 1: that's a tough one. That's a tough... like, it's pretty impossible, 596 00:32:38,920 --> 00:32:41,520 Speaker 1: isn't it? I mean, when you think of banking and 597 00:32:41,560 --> 00:32:44,880 Speaker 1: payments and bills... and, like, unless you live off the 598 00:32:44,960 --> 00:32:47,560 Speaker 1: grid in the middle of nowhere and grow your own 599 00:32:47,600 --> 00:32:50,520 Speaker 1: food and have your own water and generate your own 600 00:32:50,560 --> 00:32:54,360 Speaker 1: electricity and have your own medicine... like, think about all 601 00:32:54,440 --> 00:32:59,760 Speaker 1: of the things that you need to not just subsist 602 00:33:00,440 --> 00:33:04,760 Speaker 1: but exist and potentially thrive. It'd be very hard to have 603 00:33:04,920 --> 00:33:08,640 Speaker 1: no... or to not be connected. I don't mean no technology. 604 00:33:08,680 --> 00:33:12,160 Speaker 1: Maybe you can have a generator or whatever, if that's technology, 605 00:33:12,200 --> 00:33:16,360 Speaker 1: but I mean technology technology. I think we're almost past 606 00:33:16,440 --> 00:33:17,400 Speaker 1: that time, aren't we?
607 00:33:18,440 --> 00:33:21,360 Speaker 2: I think we would... well, for most of 608 00:33:21,440 --> 00:33:25,400 Speaker 2: us, the way we engage... And you talked about things 609 00:33:25,480 --> 00:33:29,160 Speaker 2: like banking; it's getting increasingly difficult to do any sort 610 00:33:29,200 --> 00:33:31,880 Speaker 2: of banking unless you're doing it online. I went to 611 00:33:31,920 --> 00:33:34,760 Speaker 2: our local community bank recently to withdraw some money. I 612 00:33:34,840 --> 00:33:37,760 Speaker 2: don't normally use cash. In fact, I don't think I've 613 00:33:37,840 --> 00:33:40,920 Speaker 2: used cash since before COVID, so it's been, what, five 614 00:33:41,000 --> 00:33:44,920 Speaker 2: years or something. But the interesting thing is, I got 615 00:33:44,960 --> 00:33:46,920 Speaker 2: my cards out. So I'm standing there with my wallet, 616 00:33:46,960 --> 00:33:49,320 Speaker 2: which just sits in my glovebox, because again, I don't 617 00:33:49,400 --> 00:33:52,160 Speaker 2: use it, and I'm putting in all the different cards, 618 00:33:52,160 --> 00:33:54,320 Speaker 2: about five cards, and I put them into the cash 619 00:33:54,520 --> 00:33:56,960 Speaker 2: machine and they're all popping out, and it's like, none 620 00:33:57,000 --> 00:33:59,640 Speaker 2: of these work, like, I'm not getting any money out. 621 00:34:00,000 --> 00:34:02,040 Speaker 2: The manager came out: what the hell are you doing? 622 00:34:02,200 --> 00:34:04,600 Speaker 2: Because I know the manager. And she said, no, no, 623 00:34:04,600 --> 00:34:07,880 Speaker 2: none of your cards are enabled for ATMs.
I was like, 624 00:34:08,120 --> 00:34:10,239 Speaker 2: what? She said, no, no, none of the cards are 625 00:34:10,320 --> 00:34:12,799 Speaker 2: enabled for ATMs, because you don't get any interest if 626 00:34:12,840 --> 00:34:15,600 Speaker 2: you've got an account that has ATM access. But if 627 00:34:16,160 --> 00:34:18,840 Speaker 2: you have no ATM access, then you earn interest. And 628 00:34:18,840 --> 00:34:20,840 Speaker 2: it's like, well, how do I get money out? You 629 00:34:20,920 --> 00:34:22,560 Speaker 2: just come in to the bank and we'll, you know, 630 00:34:22,760 --> 00:34:25,440 Speaker 2: do it across the counter. I had no idea. 631 00:34:25,520 --> 00:34:27,560 Speaker 2: It had been five years since I'd used an ATM, 632 00:34:27,600 --> 00:34:29,840 Speaker 2: and I realized that none of my cards work in 633 00:34:29,880 --> 00:34:30,840 Speaker 2: an ATM. 634 00:34:31,280 --> 00:34:33,160 Speaker 1: So what did you need cash for? Were you going 635 00:34:33,200 --> 00:34:35,399 Speaker 1: to see your mate that you're building a website for? 636 00:34:39,680 --> 00:34:42,760 Speaker 2: What did I need cash for? I don't know... cash... 637 00:34:43,840 --> 00:34:46,759 Speaker 2: I needed cash for... one of my colleagues 638 00:34:47,040 --> 00:34:49,040 Speaker 2: had a birthday and I wanted to put cash in. 639 00:34:49,239 --> 00:34:52,160 Speaker 2: Actually, both of them, two colleagues, had birthdays. One 640 00:34:52,160 --> 00:34:54,239 Speaker 2: of them just turned eighteen, and so I wanted to 641 00:34:54,239 --> 00:34:55,920 Speaker 2: put some cash in a card for him, and I 642 00:34:56,000 --> 00:34:57,960 Speaker 2: ended up getting one of those debit cards, because he 643 00:34:58,040 --> 00:35:00,600 Speaker 2: wants to travel.
I thought it'd be easier just 644 00:35:00,600 --> 00:35:02,879 Speaker 2: getting the cash out and giving that. And my other 645 00:35:02,960 --> 00:35:05,520 Speaker 2: colleague, as a surprise, I put three hundred bucks into 646 00:35:05,560 --> 00:35:05,919 Speaker 2: his card. 647 00:35:07,000 --> 00:35:07,880 Speaker 1: That was very nice. 648 00:35:08,120 --> 00:35:08,800 Speaker 2: He didn't say... 649 00:35:11,880 --> 00:35:15,000 Speaker 1: You don't give to expect something in return, you just 650 00:35:15,080 --> 00:35:18,719 Speaker 1: be nice. Come on. And now you've spoken about it publicly, 651 00:35:18,800 --> 00:35:20,799 Speaker 1: you're really, really reaching, aren't you? 652 00:35:21,160 --> 00:35:22,520 Speaker 2: He's not going to listen to the podcast. 653 00:35:23,440 --> 00:35:27,640 Speaker 1: Scientists tell us that working from home makes us happier. 654 00:35:27,680 --> 00:35:30,080 Speaker 1: I reckon that's a bit... I hate it when they 655 00:35:30,120 --> 00:35:35,320 Speaker 1: say generalized things. Yeah, it's like, well, does it? Everyone? 656 00:35:35,560 --> 00:35:38,879 Speaker 1: All the people? I don't know. I hate it when 657 00:35:38,880 --> 00:35:43,480 Speaker 1: they make unequivocal statements. No: we did some research, and 658 00:35:43,520 --> 00:35:46,560 Speaker 1: with the people that we looked at, in general, the 659 00:35:46,600 --> 00:35:50,399 Speaker 1: people that we looked at... I don't know. I feel 660 00:35:50,440 --> 00:35:52,959 Speaker 1: like there's a lot of people that don't come under 661 00:35:52,960 --> 00:35:55,440 Speaker 1: that banner. But it's an interesting conversation starter. 662 00:35:55,560 --> 00:36:00,680 Speaker 2: Anyway, a four-year Australian study, this is, and this came...
663 00:36:00,800 --> 00:36:03,480 Speaker 2: This has all come to the fore post-COVID, or 664 00:36:03,560 --> 00:36:07,040 Speaker 2: during COVID, I should say. You know, I've been running 665 00:36:07,040 --> 00:36:10,200 Speaker 2: a business from home for, what, twenty-six years? And 666 00:36:10,680 --> 00:36:13,520 Speaker 2: you know, at one point you'd almost embarrassingly say that 667 00:36:13,560 --> 00:36:16,960 Speaker 2: you're working from home. But now it's become the done thing, 668 00:36:17,600 --> 00:36:20,320 Speaker 2: you know, because a lot of people were forced 669 00:36:20,560 --> 00:36:24,759 Speaker 2: to do that, and there's legislation being enacted in Victoria 670 00:36:25,400 --> 00:36:28,640 Speaker 2: to make it compulsory for employers to allow staff to 671 00:36:28,680 --> 00:36:32,480 Speaker 2: work at least two days a week from home if 672 00:36:32,480 --> 00:36:34,320 Speaker 2: it's practicable. I mean, if you work in a factory, 673 00:36:34,320 --> 00:36:37,600 Speaker 2: that's not going to happen, but it makes sense. So 674 00:36:37,640 --> 00:36:42,000 Speaker 2: this started... this study began just before the pandemic, 675 00:36:42,320 --> 00:36:46,359 Speaker 2: and it was the University of South Australia, and they 676 00:36:46,360 --> 00:36:49,640 Speaker 2: looked at teleworkers, you know, and the impact on 677 00:36:50,120 --> 00:36:53,240 Speaker 2: their lives, and they said that... 678 00:36:53,560 --> 00:36:56,560 Speaker 1: Wait, what's a teleworker? You mean someone who's on the 679 00:36:56,560 --> 00:36:57,480 Speaker 1: phone and...? 680 00:36:57,920 --> 00:37:02,120 Speaker 2: Just working remotely.
So, tele... so basically anybody who works 681 00:37:02,560 --> 00:37:08,359 Speaker 2: from home and is remote, yeah. So, I mean, yeah. 682 00:37:08,440 --> 00:37:11,839 Speaker 2: So the pandemic accelerated the ability to work, you know, 683 00:37:12,520 --> 00:37:14,719 Speaker 2: basically from home, for anybody, you know, who was an 684 00:37:14,760 --> 00:37:18,960 Speaker 2: office worker. But it seems that the immediate effect 685 00:37:19,000 --> 00:37:22,400 Speaker 2: of working from home was better sleep. So that was 686 00:37:22,440 --> 00:37:25,479 Speaker 2: one of the things, because on average, remote workers gained 687 00:37:25,480 --> 00:37:27,879 Speaker 2: an extra thirty minutes of rest overnight. And that makes 688 00:37:27,880 --> 00:37:31,400 Speaker 2: sense, because if you're not commuting to work, then you 689 00:37:31,560 --> 00:37:34,600 Speaker 2: have more time after work knocks off to do all 690 00:37:34,640 --> 00:37:37,160 Speaker 2: the things that we have to do to live our lives, 691 00:37:37,480 --> 00:37:40,280 Speaker 2: which means we then are able to get to bed earlier, 692 00:37:40,400 --> 00:37:43,600 Speaker 2: potentially, and be able to get that extra sleep that 693 00:37:43,640 --> 00:37:45,160 Speaker 2: we all absolutely need. 694 00:37:45,440 --> 00:37:46,799 Speaker 1: So does that mean I'm going to have to go 695 00:37:46,840 --> 00:37:52,280 Speaker 1: and do my corporate gigs in people's homes? G'day, Brian, 696 00:37:52,320 --> 00:37:54,440 Speaker 1: I'm here for a workshop. It's just you and me. 697 00:37:58,640 --> 00:38:02,279 Speaker 1: Imagine if people stop going in. The corporate... I'm fucked. Yeah, 698 00:38:02,360 --> 00:38:05,400 Speaker 1: what am I going to do? Yeah? I don't know. 699 00:38:05,640 --> 00:38:08,920 Speaker 2: I think we'll snuggle at your place.
700 00:38:09,000 --> 00:38:10,600 Speaker 1: Of course, we could do it in the front yard, 701 00:38:12,080 --> 00:38:16,000 Speaker 1: I think though, I understand it, but it seems like 702 00:38:16,160 --> 00:38:19,600 Speaker 1: everything's geared towards the employee, not the employer, like the 703 00:38:19,600 --> 00:38:24,080 Speaker 1: people who are actually paying them. It's like, can you go, okay, 704 00:38:24,120 --> 00:38:27,319 Speaker 1: we actually want you to be—when you work, we 705 00:38:27,480 --> 00:38:30,000 Speaker 1: want you to be in the workplace. And it's like 706 00:38:30,120 --> 00:38:33,160 Speaker 1: if you say that, like you're a bad person. I'm like, 707 00:38:33,280 --> 00:38:35,680 Speaker 1: I feel like there are a lot of jobs, like 708 00:38:35,760 --> 00:38:38,319 Speaker 1: if you're a mechanic or a builder, or any kind 709 00:38:38,400 --> 00:38:42,080 Speaker 1: of tradie or a school teacher or—and I know 710 00:38:42,120 --> 00:38:44,239 Speaker 1: you can do some of that. I just think that 711 00:38:45,320 --> 00:38:48,640 Speaker 1: I don't know where. And I totally understand it, and 712 00:38:48,640 --> 00:38:53,279 Speaker 1: I'm all for better conditions for employees. But when does 713 00:38:53,320 --> 00:38:56,120 Speaker 1: the tail stop wagging the dog?
Like when does the 714 00:38:56,120 --> 00:38:59,560 Speaker 1: person who's paying the person have some right to say, 715 00:38:59,840 --> 00:39:01,719 Speaker 1: I actually want you here because I want to be 716 00:39:01,760 --> 00:39:04,200 Speaker 1: able to just talk to you frequently, and I want 717 00:39:04,239 --> 00:39:06,840 Speaker 1: to be able to build an environment where we've got a 718 00:39:06,840 --> 00:39:09,440 Speaker 1: team of creatives in the same room, or I don't know, 719 00:39:09,960 --> 00:39:14,640 Speaker 1: like it feels for me, who's employed people, to then 720 00:39:14,760 --> 00:39:19,440 Speaker 1: be told, oh, you can't—you you have no control 721 00:39:19,560 --> 00:39:21,839 Speaker 1: over any of this. You've just got to do what 722 00:39:21,880 --> 00:39:24,239 Speaker 1: you're told and pay everyone and shut the fuck up. 723 00:39:24,280 --> 00:39:26,680 Speaker 1: I'm like, oh, I don't know that that would work 724 00:39:26,719 --> 00:39:27,799 Speaker 1: in a lot of businesses. 725 00:39:28,080 --> 00:39:30,520 Speaker 2: Look, I agree with you. I've got to say, when 726 00:39:30,600 --> 00:39:35,320 Speaker 2: my employee works from home, I'm less likely to engage. 727 00:39:35,680 --> 00:39:37,879 Speaker 2: We have phone calls and we chat, but I always 728 00:39:37,920 --> 00:39:39,719 Speaker 2: feel like I'm imposing on him. So if I have 729 00:39:39,760 --> 00:39:41,799 Speaker 2: a phone call and I hang up and then I 730 00:39:41,800 --> 00:39:44,239 Speaker 2: think of something thirty seconds later that I'd meant to mention, 731 00:39:44,440 --> 00:39:47,560 Speaker 2: another call will seem an imposition.
Whereas when he's 732 00:39:47,600 --> 00:39:50,480 Speaker 2: in the office with me, oh, can you do a quick 733 00:39:50,520 --> 00:39:52,360 Speaker 2: read of this email before I send it, you know, 734 00:39:52,600 --> 00:39:54,319 Speaker 2: I go over to him and say, how's this job 735 00:39:54,360 --> 00:39:56,520 Speaker 2: you're working on going? And I'll have a look and 736 00:39:56,560 --> 00:39:58,239 Speaker 2: we'll see what he's working on. It's like I'll make 737 00:39:58,239 --> 00:40:01,919 Speaker 2: a couple of suggestions, but that doesn't work when he's 738 00:40:01,960 --> 00:40:04,839 Speaker 2: working remotely. I absolutely know that he's very good at 739 00:40:04,840 --> 00:40:08,120 Speaker 2: working remotely, and I get that, and it's good for 740 00:40:08,239 --> 00:40:10,200 Speaker 2: him because he can take his daughter to day care. 741 00:40:10,640 --> 00:40:13,040 Speaker 2: So two days a week he works from home, you know, 742 00:40:13,160 --> 00:40:15,799 Speaker 2: for three days he's in the office. But I feel 743 00:40:15,840 --> 00:40:18,440 Speaker 2: we're more productive when we're in the same room together, 744 00:40:18,520 --> 00:40:22,040 Speaker 2: and I feel I'm supported because I like to bounce 745 00:40:22,120 --> 00:40:24,360 Speaker 2: ideas off and you know me, I hate to talk. 746 00:40:25,719 --> 00:40:29,960 Speaker 1: Yeah, we know, I notice, But I also I don't know. 747 00:40:30,160 --> 00:40:32,000 Speaker 1: I guess there's no right wrong with this. But I 748 00:40:32,040 --> 00:40:35,000 Speaker 1: also think depending on the workplace and the culture. But 749 00:40:36,160 --> 00:40:38,120 Speaker 1: I know lots of people who actually really like their 750 00:40:38,200 --> 00:40:40,319 Speaker 1: job and they like where they work. And I know 751 00:40:40,360 --> 00:40:43,759 Speaker 1: people who also hate their job.
But going to a 752 00:40:43,800 --> 00:40:46,480 Speaker 1: place where you've got friends and colleagues that you do 753 00:40:46,600 --> 00:40:49,799 Speaker 1: cool stuff with together and it is a good environment. Now, 754 00:40:49,840 --> 00:40:53,320 Speaker 1: I know this is not every workplace, but I also 755 00:40:53,400 --> 00:40:55,640 Speaker 1: think there's something to be said for getting out of 756 00:40:55,640 --> 00:40:58,440 Speaker 1: the house, going somewhere. Yeah. 757 00:40:58,520 --> 00:41:01,799 Speaker 2: Yeah, and I do that because I teach, and that's 758 00:41:01,840 --> 00:41:04,000 Speaker 2: a great way to get out and do that. I mean, 759 00:41:04,160 --> 00:41:06,080 Speaker 2: I'm at the gym in the morning and then walking Fritz, 760 00:41:06,360 --> 00:41:08,239 Speaker 2: so I spend at least two hours out of the 761 00:41:08,280 --> 00:41:10,840 Speaker 2: house even before I start work. But I love it 762 00:41:10,880 --> 00:41:13,319 Speaker 2: when I have an excuse to leave and if I 763 00:41:13,320 --> 00:41:15,759 Speaker 2: can have them. The other day, I think I told 764 00:41:15,800 --> 00:41:17,920 Speaker 2: you I got very excited because I'm a nerd and 765 00:41:17,960 --> 00:41:21,960 Speaker 2: we just opened a new library in Bland. It's very exciting. 766 00:41:22,800 --> 00:41:26,080 Speaker 2: But I I had a meeting with a client and 767 00:41:26,120 --> 00:41:27,719 Speaker 2: I just said, let's go to the library and meet 768 00:41:27,760 --> 00:41:29,920 Speaker 2: there because they've got meeting rooms and stuff, and it 769 00:41:30,000 --> 00:41:31,560 Speaker 2: was awesome just to be able to get out of 770 00:41:31,560 --> 00:41:33,040 Speaker 2: the house and meet somewhere different. 771 00:41:34,680 --> 00:41:37,319 Speaker 1: You know it's not nineteen eighty two, don't you? We 772 00:41:37,440 --> 00:41:41,840 Speaker 1: got a library? What, Tiff? Can I ask you a question?
773 00:41:42,000 --> 00:41:47,120 Speaker 1: We've got a library. God, wait till the internet gets 774 00:41:47,160 --> 00:41:50,200 Speaker 1: to Bland. You guys are going to go nuts. Hey, 775 00:41:50,680 --> 00:41:56,200 Speaker 1: Tiff, what are your thoughts on self driving cars? 776 00:41:56,760 --> 00:42:00,359 Speaker 1: Would you get in a self driving car? Would you go, hey, 777 00:42:00,800 --> 00:42:03,040 Speaker 1: take me to the gym, and just sit in the 778 00:42:03,080 --> 00:42:05,000 Speaker 1: back seat and let yourself be driven? 779 00:42:05,600 --> 00:42:07,520 Speaker 3: I don't know, I don't even—do you know what 780 00:42:07,800 --> 00:42:11,279 Speaker 3: I find weird? I seem to have an issue with 781 00:42:11,360 --> 00:42:14,000 Speaker 3: the sound of electric cars when they drive past me. 782 00:42:14,080 --> 00:42:16,879 Speaker 3: They freak me out. I feel like I'm in 783 00:42:16,920 --> 00:42:21,919 Speaker 3: some weird—I don't know, it's weird. So I don't 784 00:42:21,960 --> 00:42:23,160 Speaker 3: know how you feel about them. 785 00:42:23,760 --> 00:42:26,799 Speaker 2: Well, that noise that you're hearing, that's a 786 00:42:26,840 --> 00:42:27,560 Speaker 2: sound that they put in. 787 00:42:27,800 --> 00:42:30,839 Speaker 3: They've got this weird—there's this weird noise that goes 788 00:42:30,880 --> 00:42:33,680 Speaker 3: past. I feel like I'm in some sort of sci-fi. I'm 789 00:42:33,680 --> 00:42:37,080 Speaker 3: not repeating it. You're not bullying me.
There's already taxis and a bunch of 793 00:42:49,520 --> 00:42:52,360 Speaker 1: other things, which, yeah, so you order a cab, it 794 00:42:52,480 --> 00:42:56,359 Speaker 1: comes to your house driverless, you get in and it 795 00:42:56,400 --> 00:43:00,000 Speaker 1: takes you where you're going, and you're the only person 796 00:43:00,080 --> 00:43:03,400 Speaker 1: in the car. That's that's got to be interesting for 797 00:43:03,440 --> 00:43:04,640 Speaker 1: your first time doing that. 798 00:43:05,239 --> 00:43:08,280 Speaker 2: Well, at the moment in Australia, Tesla has been allowed 799 00:43:08,280 --> 00:43:11,640 Speaker 2: to enable what they call full self driving, but there's 800 00:43:11,640 --> 00:43:16,319 Speaker 2: a caveat: it's called supervised self driving, so you have 801 00:43:16,400 --> 00:43:19,719 Speaker 2: to be sitting at the steering wheel ready to take over. 802 00:43:19,920 --> 00:43:21,879 Speaker 2: So you you know, you can't be sitting around having 803 00:43:21,920 --> 00:43:24,360 Speaker 2: a nap. You can't be sitting in the passenger seat. 804 00:43:25,400 --> 00:43:28,279 Speaker 2: Tiff's given me this blank look. But what the 805 00:43:28,280 --> 00:43:29,600 Speaker 2: hell is the point? 806 00:43:30,920 --> 00:43:33,000 Speaker 1: I saw a video of a dude coming in a 807 00:43:33,080 --> 00:43:36,439 Speaker 1: Tesla from Sydney to Melbourne and he's got to sit 808 00:43:36,560 --> 00:43:39,880 Speaker 1: there like, I think it's more stressful than just driving, 809 00:43:40,200 --> 00:43:42,319 Speaker 1: because he's got to be fucking sitting there with his 810 00:43:42,400 --> 00:43:44,960 Speaker 1: hands at the ten to two but not on the wheel, 811 00:43:45,400 --> 00:43:48,359 Speaker 1: but just ready at any moment for a fucking ten 812 00:43:48,440 --> 00:43:51,239 Speaker 1: hour drive to take—what's the point? I'm with you.
813 00:43:51,760 --> 00:43:53,120 Speaker 1: I want to sit in the back and have a 814 00:43:53,120 --> 00:43:57,399 Speaker 1: snoozy mcsnoozeter. Well, watch your video looking to happen. 815 00:43:58,719 --> 00:44:03,240 Speaker 2: I got to say that I use like assisted cruise 816 00:44:03,280 --> 00:44:06,560 Speaker 2: control in my car and lane assist, so my car 817 00:44:06,680 --> 00:44:08,879 Speaker 2: will use adaptive cruise control. If the car in front 818 00:44:08,920 --> 00:44:11,279 Speaker 2: of me starts to slow down, the cruise control will 819 00:44:11,320 --> 00:44:15,279 Speaker 2: slow down accordingly and maintain the distance. It's safer if 820 00:44:15,320 --> 00:44:18,040 Speaker 2: someone jams on their brakes, the car potentially—well, I've 821 00:44:18,040 --> 00:44:20,880 Speaker 2: actually had this one. I've reversed into my driveway because 822 00:44:20,920 --> 00:44:25,120 Speaker 2: I hadn't trimmed the hedge well enough. I started reversing. 823 00:44:25,480 --> 00:44:26,240 Speaker 1: Such a metaphor. 824 00:44:28,280 --> 00:44:30,719 Speaker 2: I'm pretty good usually at trimming the hedge. Yeah, I 825 00:44:30,920 --> 00:44:34,760 Speaker 2: like to keep it tidy. As I reversed into the drive, 826 00:44:35,200 --> 00:44:38,000 Speaker 2: the car slammed on the brakes, so it's like, what 827 00:44:38,080 --> 00:44:40,520 Speaker 2: the hell just happened? Did I hit something? And it 828 00:44:40,600 --> 00:44:43,560 Speaker 2: was just that the bush was poking out a little bit, 829 00:44:43,960 --> 00:44:47,760 Speaker 2: and so I subsequently got out my hedge trimmer 830 00:44:47,800 --> 00:44:49,600 Speaker 2: and trimmed the hedge. 831 00:44:51,040 --> 00:44:53,240 Speaker 1: Notice how quiet I am here? Yeah? 832 00:44:53,320 --> 00:44:55,680 Speaker 2: Yeah, no, But those things are handy.
So if you're 833 00:44:55,680 --> 00:44:57,879 Speaker 2: reversing and a child ran out in front of you 834 00:44:57,960 --> 00:45:00,640 Speaker 2: and you didn't see them, or a dog, a cat, 835 00:45:01,200 --> 00:45:03,200 Speaker 2: then your car potentially. 836 00:45:02,719 --> 00:45:06,600 Speaker 1: We get that. That's the old technology. Patrick, twenty nineteen. 837 00:45:06,680 --> 00:45:09,840 Speaker 1: Suzuki's got all that shit. Let's talk about—No, stop 838 00:45:09,880 --> 00:45:12,880 Speaker 1: distracting us with your bullshit, Mazda. Let's go back to 839 00:45:12,960 --> 00:45:17,040 Speaker 1: fucking—Oh, guess what my car's got: adaptive cruise control. 840 00:45:17,239 --> 00:45:22,040 Speaker 1: Fucking hell, that's a news flash. Let's talk about Teslas that 841 00:45:22,200 --> 00:45:24,720 Speaker 1: can drive you around while you're sitting in the back seat. 842 00:45:25,400 --> 00:45:27,040 Speaker 1: When is that actually happening? 843 00:45:27,480 --> 00:45:30,000 Speaker 2: Well, the thing is, not in the back seat, because 844 00:45:30,000 --> 00:45:35,319 Speaker 2: it's a legislative issue, not technology. So legislatively, no government in 845 00:45:35,360 --> 00:45:37,919 Speaker 2: Australia is going to let a car on the road 846 00:45:38,000 --> 00:45:41,440 Speaker 2: that's full self driving. They won't ever, like ever? At 847 00:45:41,440 --> 00:45:45,000 Speaker 2: the moment, at the moment, until they prove themselves.
So the 848 00:45:45,560 --> 00:45:48,319 Speaker 2: argument from Tesla is that cameras don't blink, they don't 849 00:45:48,320 --> 00:45:52,760 Speaker 2: get tired, they don't get distracted, and so potentially having 850 00:45:52,840 --> 00:45:55,600 Speaker 2: that car in that autonomous mode, that full self driving 851 00:45:55,640 --> 00:45:58,120 Speaker 2: mode, is going to be safer even if you're sitting 852 00:45:58,120 --> 00:46:01,120 Speaker 2: there at ten to two ready to take over control, 853 00:46:01,480 --> 00:46:04,960 Speaker 2: because the car will react better than—what they're claiming is, 854 00:46:04,960 --> 00:46:07,160 Speaker 2: the car will react better than what a human driver will. 855 00:46:07,719 --> 00:46:11,160 Speaker 1: I think you're right, and I think also I think 856 00:46:11,200 --> 00:46:13,400 Speaker 1: they're right. I think we don't want to think about that. 857 00:46:13,600 --> 00:46:16,439 Speaker 1: But you know, every now and then you go, oh, 858 00:46:16,560 --> 00:46:19,200 Speaker 1: a Tesla ran over four people, or it did this 859 00:46:19,320 --> 00:46:22,520 Speaker 1: or that. But you think about all the human error 860 00:46:22,560 --> 00:46:28,440 Speaker 1: accidents that happen every day all around the world. I 861 00:46:28,560 --> 00:46:30,960 Speaker 1: you know, I would say they are far—there's going 862 00:46:31,040 --> 00:46:34,040 Speaker 1: to be, even when we do have full automated self 863 00:46:34,080 --> 00:46:36,920 Speaker 1: driving cars, I would think there will be accidents, but 864 00:46:36,960 --> 00:46:39,359 Speaker 1: I think it will be—I could be wrong, but 865 00:46:39,440 --> 00:46:43,919 Speaker 1: I feel like it'll be overwhelmingly safer than humans doing 866 00:46:43,920 --> 00:46:47,359 Speaker 1: the job. For the reason that you said. Yes, Tiff, yes, 867 00:46:47,400 --> 00:46:50,600 Speaker 1: you with your hand up in row four.
I love 868 00:46:50,600 --> 00:46:53,360 Speaker 1: it when you put up your hand like you're at school. Everyone. 869 00:46:54,840 --> 00:46:57,760 Speaker 1: Can I just say, your camera has changed. It keeps zooming 870 00:46:57,800 --> 00:46:58,480 Speaker 1: in and out on you. What's that about? 871 00:46:58,560 --> 00:47:01,359 Speaker 3: Zoom asked me if I wanted it to 872 00:47:01,400 --> 00:47:03,680 Speaker 3: do that? And I said, that sounds great, Zoom, let's 873 00:47:03,719 --> 00:47:04,040 Speaker 3: do it. 874 00:47:04,120 --> 00:47:06,000 Speaker 1: And every now, every time you put your head in 875 00:47:06,040 --> 00:47:08,719 Speaker 1: your hands, which is every two minutes, we see the 876 00:47:08,719 --> 00:47:11,920 Speaker 1: top of a scone. You got a little dandruff, bro, 877 00:47:12,000 --> 00:47:14,160 Speaker 1: I don't you know. I'm just just letting you know. 878 00:47:14,560 --> 00:47:15,319 Speaker 1: Maybe just some. 879 00:47:15,480 --> 00:47:18,799 Speaker 3: Found quite a few silver hairs sprouting through, by the way, 880 00:47:19,560 --> 00:47:21,359 Speaker 3: But that's not what my hand was raised for. It 881 00:47:21,480 --> 00:47:25,200 Speaker 3: was to say, my biggest concern is, as if people 882 00:47:25,239 --> 00:47:27,640 Speaker 3: aren't going to hack them—they hack everything else—people are 883 00:47:27,640 --> 00:47:30,400 Speaker 3: going to hack that shit and crazy stuff will happen. 884 00:47:31,200 --> 00:47:35,160 Speaker 1: Yeah. I'm with you on that. Yeah, that's like I mean, 885 00:47:35,200 --> 00:47:38,000 Speaker 1: it's essentially a computer on wheels, as if they're not 886 00:47:38,080 --> 00:47:38,840 Speaker 1: going to be hacked. 887 00:47:38,960 --> 00:47:39,720 Speaker 3: Yeah. 888 00:47:39,800 --> 00:47:41,480 Speaker 1: Oh look, and now I'm doing one hundred and eighty 889 00:47:41,560 --> 00:47:45,040 Speaker 1: kilometers an hour towards that tree.
Oh, and I'm a 890 00:47:45,080 --> 00:47:48,719 Speaker 1: politician that people—or whatever it is, like, Yeah, that 891 00:47:49,080 --> 00:47:53,200 Speaker 1: that scares me. Yeah, Patrick, fix that, will you. 892 00:47:53,640 --> 00:47:56,200 Speaker 1: If you could, Patrick, fix that and get back. 893 00:47:56,000 --> 00:47:59,719 Speaker 2: To us, I will. All right, to-do list, Craigo. 894 00:48:00,280 --> 00:48:02,080 Speaker 1: Yeah, if you could. I know you've got not much 895 00:48:02,160 --> 00:48:07,399 Speaker 1: on, can you. I want to jump ship a little bit. 896 00:48:07,520 --> 00:48:13,400 Speaker 1: Dutch designers innovate tiny home. I'm a bit obsessed with 897 00:48:13,480 --> 00:48:16,880 Speaker 1: tiny homes everyone. I think I'd like to live in 898 00:48:16,880 --> 00:48:20,960 Speaker 1: one for a year or two. Tiny home inside recycled 899 00:48:21,040 --> 00:48:26,279 Speaker 1: wind turbine. It's either that or those bloody shipping containers. 900 00:48:26,840 --> 00:48:28,920 Speaker 1: I don't mind a good shipping container. 901 00:48:28,520 --> 00:48:31,359 Speaker 2: That's been frocked up a bit, especially when you get 902 00:48:31,360 --> 00:48:34,360 Speaker 2: them on top of each other where they're offset. 903 00:48:34,080 --> 00:48:37,160 Speaker 1: And you can yeah, like a big Jenga kind of thing. 904 00:48:37,560 --> 00:48:40,360 Speaker 2: Yep, yep, Tetris, like in the Tetris. 905 00:48:40,960 --> 00:48:43,320 Speaker 1: Yeah, this is because one of the I mean, it's. 906 00:48:43,200 --> 00:48:44,920 Speaker 2: Great that there are a lot of wind turbines 907 00:48:44,960 --> 00:48:46,600 Speaker 2: around now.
I don't know if you've ever been up 908 00:48:46,640 --> 00:48:51,400 Speaker 2: close and personal next to a wind turbine, but they're gigantic, like, 909 00:48:51,440 --> 00:48:56,400 Speaker 2: they're massively big, and the problem is the environmental sustainability 910 00:48:56,400 --> 00:49:00,319 Speaker 2: of these once they become decommissioned. And so I had 911 00:49:00,320 --> 00:49:01,719 Speaker 2: a bit of a chuckle when I saw the name 912 00:49:01,760 --> 00:49:06,000 Speaker 2: of this eco friendly innovation. It's called Nestle, which could 913 00:49:06,000 --> 00:49:08,960 Speaker 2: be pronounced nestle or nestlay. So it's either I'm going 914 00:49:09,000 --> 00:49:11,080 Speaker 2: to eat it, or I'm going to live in it, 915 00:49:11,560 --> 00:49:14,000 Speaker 2: because it could be a block of chocolate, mate. 916 00:49:13,960 --> 00:49:15,919 Speaker 1: Or I'm going to snuggle up next to it. Yeah. 917 00:49:16,360 --> 00:49:19,640 Speaker 2: Yeah. So each of the units measures about three hundred 918 00:49:19,680 --> 00:49:23,040 Speaker 2: and seventy six square feet. So what's that about, like 919 00:49:23,120 --> 00:49:26,520 Speaker 2: a square, like one hundred square meters or something? No, 920 00:49:26,800 --> 00:49:29,440 Speaker 2: how much? Three hundred and seventy what? Three hundred and 921 00:49:29,440 --> 00:49:30,600 Speaker 2: seventy six square feet? 922 00:49:31,280 --> 00:49:34,919 Speaker 1: Well, that's like twenty feet by twenty feet, just under. 923 00:49:35,120 --> 00:49:40,560 Speaker 1: So it's like five meters by four meters, which is like, no, no, no, no, 924 00:49:40,600 --> 00:49:41,240 Speaker 1: it's tiny. 925 00:49:41,719 --> 00:49:43,600 Speaker 2: What's that in square meters, though? Because I don't know 926 00:49:43,640 --> 00:49:45,799 Speaker 2: square feet, I should have probably done the calculations.
927 00:49:46,080 --> 00:49:50,759 Speaker 1: Okay, well, it's like—five by four is twenty 928 00:49:50,800 --> 00:49:51,719 Speaker 1: square meters? 929 00:49:51,920 --> 00:49:54,799 Speaker 2: Ah, twenty square meters? Okay, that sounds good. So they 930 00:49:55,040 --> 00:49:59,399 Speaker 2: are making these really great little stylish, you know, they're 931 00:49:59,480 --> 00:50:03,560 Speaker 2: using European—Well, they're getting people to kind of move 932 00:50:03,560 --> 00:50:07,400 Speaker 2: into them. And it sounds like the perfect way to 933 00:50:07,480 --> 00:50:12,279 Speaker 2: recycle and repurpose these turbine parts. And you know, I 934 00:50:12,360 --> 00:50:15,880 Speaker 2: lived in a caravan for about three years. When I 935 00:50:15,920 --> 00:50:18,440 Speaker 2: was about sixteen, I bought a caravan and moved out 936 00:50:18,440 --> 00:50:21,239 Speaker 2: into the backyard and it was—I had my own 937 00:50:21,280 --> 00:50:23,319 Speaker 2: little—what? Why are you laughing at me? 938 00:50:23,400 --> 00:50:25,440 Speaker 1: I bought a caravan and I moved out. Where'd you go? 939 00:50:25,480 --> 00:50:25,640 Speaker 2: Oh? 940 00:50:25,680 --> 00:50:30,440 Speaker 1: The backyard? What? Yeah? Seven feet from mum so she 941 00:50:30,480 --> 00:50:32,560 Speaker 1: could wash my clothes and make my dinner. 942 00:50:32,800 --> 00:50:34,440 Speaker 2: I had an intercom too, so I could just 943 00:50:34,520 --> 00:50:36,600 Speaker 2: press it to see when it was ready. 944 00:50:36,680 --> 00:50:39,240 Speaker 1: Yeah, Mum's dinner ready. And also, what about my jocks? 945 00:50:39,280 --> 00:50:42,840 Speaker 1: What's the status on those? What's the status on my jocks? 946 00:50:42,920 --> 00:50:48,160 Speaker 1: Over. Hello, Patrick, can you impersonate your mum for me? Please? No? 947 00:50:48,400 --> 00:50:51,520 Speaker 2: I would not—I would not go that far, my 948 00:50:52,040 --> 00:50:54,480 Speaker 2: lovely mother.
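For anyone checking the maths along at home, here's a quick sketch of the conversion the hosts were guessing at (the 376-square-foot figure is the one quoted on air; one foot is exactly 0.3048 metres):

```python
# Convert the tiny home's quoted floor area from square feet to square metres.
# 1 ft = 0.3048 m exactly, so 1 sq ft = 0.3048 ** 2 sq m.
SQFT_TO_SQM = 0.3048 ** 2

area_sqft = 376  # floor area quoted on air
area_sqm = area_sqft * SQFT_TO_SQM

# Side length of an equivalent square floor plan, in feet.
side_ft = area_sqft ** 0.5

print(round(area_sqm, 1))  # 34.9 — about 35 square metres
print(round(side_ft, 1))   # 19.4 — "just under twenty feet by twenty feet"
```

So the "just under twenty feet by twenty feet" estimate holds up, but 376 square feet works out closer to 35 square metres than the 20 guessed on air.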
I wouldn't do that. Actually I couldn't. 949 00:50:54,520 --> 00:50:54,640 Speaker 4: You know. 950 00:50:54,640 --> 00:50:58,120 Speaker 2: It's funny. I do a really good Maltese accent, but 951 00:50:58,200 --> 00:51:01,120 Speaker 2: when I heard my parents talk, I couldn't hear their accent. 952 00:51:02,280 --> 00:51:04,919 Speaker 1: Wow? Well let's hear it. Let's hear your really good 953 00:51:04,960 --> 00:51:07,640 Speaker 1: Maltese accent. Well, can we say to everyone you are 954 00:51:07,719 --> 00:51:10,040 Speaker 1: Maltese by heritage. 955 00:51:09,680 --> 00:51:13,040 Speaker 2: By heritage. The Maltese accent sounds like this. If I talk 956 00:51:13,080 --> 00:51:16,680 Speaker 2: about the Dutch engineers who innovate the tiny hole wind, 957 00:51:16,760 --> 00:51:18,720 Speaker 2: say his say cold wind turbine. 958 00:51:19,440 --> 00:51:23,720 Speaker 1: Oh oh that's pretty good, dude, that's pretty good. Wow, 959 00:51:23,920 --> 00:51:29,600 Speaker 1: that's better than my Irish accent, which I'm not doing. 960 00:51:30,360 --> 00:51:32,880 Speaker 2: Mind you, I can do an Irish accent till the cows 961 00:51:32,920 --> 00:51:33,399 Speaker 2: come home. 962 00:51:34,239 --> 00:51:37,040 Speaker 1: Well, you're pretty good at that. That's the voiceover man 963 00:51:37,080 --> 00:51:38,920 Speaker 1: in you coming out. Tiff, would you like to live 964 00:51:38,960 --> 00:51:40,799 Speaker 1: in a tiny home for a year or two? I'd 965 00:51:40,880 --> 00:51:43,759 Speaker 1: love to. I reckon I'd be all right with a pet 966 00:51:43,800 --> 00:51:46,399 Speaker 1: horse if I was.
Imagine if you were stinking rich 967 00:51:46,440 --> 00:51:49,520 Speaker 1: and you just build an estate of tiny homes, had 968 00:51:49,640 --> 00:51:53,080 Speaker 1: like a thousand tiny homes, I wonder, and you could 969 00:51:53,080 --> 00:51:56,719 Speaker 1: sell them for like one hundred and twenty grand, so 970 00:51:56,800 --> 00:51:58,800 Speaker 1: everyone could—Well, not everyone could, but you know what 971 00:51:58,880 --> 00:52:01,640 Speaker 1: I mean. What's what's the median price of a house 972 00:52:01,640 --> 00:52:05,840 Speaker 1: in Melbourne? A trillion dollars? What do you get for 973 00:52:05,840 --> 00:52:09,000 Speaker 1: one hundred grand? You get a letterbox and a yucca 974 00:52:09,200 --> 00:52:09,719 Speaker 1: next to it. 975 00:52:12,160 --> 00:52:16,680 Speaker 2: Look, well, countries are looking into the affordability and trying 976 00:52:16,680 --> 00:52:19,560 Speaker 2: to get people into the housing market by producing eco 977 00:52:19,600 --> 00:52:23,960 Speaker 2: sustainable housing like the shipping containers and now this idea 978 00:52:24,000 --> 00:52:27,640 Speaker 2: of having a recycled wind turbine. But it would be 979 00:52:27,719 --> 00:52:30,919 Speaker 2: so good for young people. I look at the young 980 00:52:30,960 --> 00:52:33,719 Speaker 2: people that I know, my friend's kids, and you think, 981 00:52:33,760 --> 00:52:35,800 Speaker 2: how the hell—you know, if you live in Hampton, 982 00:52:36,200 --> 00:52:38,360 Speaker 2: how the hell are you ever going to buy in Hampton? 983 00:52:38,600 --> 00:52:40,680 Speaker 2: And this is happening in a lot of places where 984 00:52:40,960 --> 00:52:44,479 Speaker 2: young people are being displaced from the suburbs they grew 985 00:52:44,560 --> 00:52:47,800 Speaker 2: up in because they can't afford to buy in that suburb.
986 00:52:48,080 --> 00:52:52,000 Speaker 2: So it's happening in a lot of towns where you've 987 00:52:52,040 --> 00:52:55,760 Speaker 2: got resort towns, so places like Torquay and Ocean Grove. 988 00:52:56,200 --> 00:52:58,480 Speaker 2: So if you grew up there surfing as a kid, 989 00:52:58,920 --> 00:53:01,600 Speaker 2: once you get to the point where you may want 990 00:53:01,600 --> 00:53:04,080 Speaker 2: to move out of home and buy a place, it's 991 00:53:04,200 --> 00:53:06,880 Speaker 2: too out of the spectrum because it's too expensive. They 992 00:53:06,880 --> 00:53:10,040 Speaker 2: can't—they can't buy in the town that they grew 993 00:53:10,120 --> 00:53:10,440 Speaker 2: up in. 994 00:53:11,120 --> 00:53:13,920 Speaker 1: That's why, everyone, head to the land. That's where it's at. 995 00:53:14,320 --> 00:53:17,239 Speaker 1: That's it—it's Patrick starting a cult pretty soon. 996 00:53:17,719 --> 00:53:19,840 Speaker 2: That'd be good. A leader, do you reckon? I could be 997 00:53:19,880 --> 00:53:21,800 Speaker 2: a cult leader? Maybe a late career change. 998 00:53:23,840 --> 00:53:26,520 Speaker 1: You could definitely be involved in a cult. I feel 999 00:53:26,560 --> 00:53:28,880 Speaker 1: more like you'd be the wife of the cult leader. 1000 00:53:29,200 --> 00:53:34,000 Speaker 1: You—so I'm your missus? Pretty much, pretty much, you 1001 00:53:34,040 --> 00:53:36,960 Speaker 1: could be my wife. I don't want to be a 1002 00:53:37,000 --> 00:53:40,759 Speaker 1: cult leader. No, I don't know. You're more—you're not 1003 00:53:40,840 --> 00:53:45,440 Speaker 1: that bossy. I mean you're definitely, you're definitely smart enough 1004 00:53:45,480 --> 00:53:48,240 Speaker 1: if you wanted. But I don't think you want a cult. 1005 00:53:48,400 --> 00:53:50,160 Speaker 1: I don't want a cult? Who wants a cult?
1006 00:53:50,400 --> 00:53:53,200 Speaker 2: In this conversation, these three windows, who's the cult 1007 00:53:53,280 --> 00:53:55,040 Speaker 2: leader out of the three? 1008 00:53:56,000 --> 00:54:00,799 Speaker 1: Well, that's—no one. It might be Tiff, it might— 1009 00:54:00,880 --> 00:54:03,120 Speaker 1: you might be, too, I see, and I might just 1010 00:54:03,200 --> 00:54:07,200 Speaker 1: do security at the front gate, keeping out the Feds. 1011 00:54:07,480 --> 00:54:11,520 Speaker 1: He carries about she could be a cult leader. Well, 1012 00:54:11,560 --> 00:54:13,759 Speaker 1: she kind of is. Hey, can you do one more 1013 00:54:13,800 --> 00:54:15,759 Speaker 1: story because I need to go and eat food because 1014 00:54:15,760 --> 00:54:19,319 Speaker 1: I'm getting, like Tiff, I'm getting angry. You're a bit 1015 00:54:19,360 --> 00:54:22,680 Speaker 1: angry at the moment. Yep, yeah, ye. So can you 1016 00:54:22,719 --> 00:54:24,279 Speaker 1: can you shove it along a little bit? 1017 00:54:25,120 --> 00:54:27,640 Speaker 2: Cheers. I can't find the story. I'm just going to 1018 00:54:27,640 --> 00:54:31,279 Speaker 2: talk about it anyway. So recently we talked about a 1019 00:54:31,280 --> 00:54:33,560 Speaker 2: few Australian innovations and I love it when Australians come 1020 00:54:33,600 --> 00:54:37,160 Speaker 2: up with innovations. But some scientists in Australia have come 1021 00:54:37,239 --> 00:54:40,720 Speaker 2: up with a way to—and you'll love this, because 1022 00:54:40,760 --> 00:54:43,839 Speaker 2: I know you like to lurk around your suburb at 1023 00:54:43,920 --> 00:54:49,680 Speaker 2: night to give people—I I resemble that comment, Patrick, 1024 00:54:51,320 --> 00:54:53,640 Speaker 2: because I know my friends say that you walk past 1025 00:54:53,719 --> 00:54:54,240 Speaker 2: their house. 1026 00:54:54,600 --> 00:54:57,759 Speaker 1: I do.
That's—I think I must be the Hampton 1027 00:54:57,880 --> 00:55:00,600 Speaker 1: fucking weirdo because I'm out in the dark like 1028 00:55:00,640 --> 00:55:04,280 Speaker 1: a ninja. People just go, what is wrong with that guy? 1029 00:55:04,640 --> 00:55:04,960 Speaker 3: I know? 1030 00:55:05,120 --> 00:55:07,960 Speaker 1: I apologize, I'm friendly. You can say hello. 1031 00:55:09,160 --> 00:55:12,839 Speaker 2: Now they've got an ultra-thin film that can turn 1032 00:55:13,000 --> 00:55:16,680 Speaker 2: ordinary glasses into night vision goggles if you wanted to 1033 00:55:16,680 --> 00:55:18,680 Speaker 2: buy night vision goggles. We've seen them in the movies, 1034 00:55:18,719 --> 00:55:20,600 Speaker 2: you know, the special ops people, they wear those big 1035 00:55:20,680 --> 00:55:25,520 Speaker 2: chunky rectangles. This gets rid of all of that. So Australians 1036 00:55:25,880 --> 00:55:28,520 Speaker 2: have come up with a new way that could potentially transform 1037 00:55:28,719 --> 00:55:31,239 Speaker 2: how we walk around at night. And that could just 1038 00:55:31,280 --> 00:55:34,960 Speaker 2: be as simple as you not tripping over something when 1039 00:55:34,960 --> 00:55:38,600 Speaker 2: you're out on your evening walk. You know, I'm not suggesting it's 1040 00:55:38,600 --> 00:55:42,160 Speaker 2: for nefarious reasons that Craig may want to have night 1041 00:55:42,239 --> 00:55:45,800 Speaker 2: vision when he goes out for his walk. But there's an organization. 1042 00:55:45,920 --> 00:55:51,200 Speaker 2: It's the ARC Centre of Excellence for Transformative Meta-Optical Systems. 1043 00:55:51,880 --> 00:55:53,720 Speaker 1: Of course it is. Do you know what else would 1044 00:55:53,719 --> 00:55:57,279 Speaker 1: be good? If they could have heat detecting in it 1045 00:55:57,320 --> 00:55:59,399 Speaker 1: as well, so I can see, you know, where live 1046 00:55:59,480 --> 00:56:02,400 Speaker 1: things are as I move across the urban landscape.
1047 00:56:02,719 --> 00:56:03,960 Speaker 2: Just turned it back to creepy. 1048 00:56:05,880 --> 00:56:11,600 Speaker 1: That's just for personal security. It is, it is, I reckon. 1049 00:56:11,760 --> 00:56:15,279 Speaker 1: It's... yes, I was going to talk about something else. No, 1050 00:56:15,440 --> 00:56:17,000 Speaker 1: I would like a pair of those. Would you like 1051 00:56:17,040 --> 00:56:17,640 Speaker 1: a pair of those, 1052 00:56:17,640 --> 00:56:17,879 Speaker 3: though? 1053 00:56:18,440 --> 00:56:22,000 Speaker 2: Yeah, absolutely. It would make night vision much 1054 00:56:22,000 --> 00:56:24,960 Speaker 2: more accessible as well, because at the moment 1055 00:56:24,960 --> 00:56:29,200 Speaker 2: it's really just soldiers and people working, professional bird watchers 1056 00:56:29,280 --> 00:56:32,040 Speaker 2: or something. But no, it could be really good 1057 00:56:32,120 --> 00:56:34,239 Speaker 2: to have that for safety and for lots of other 1058 00:56:34,280 --> 00:56:37,319 Speaker 2: reasons as well. Well, even driving at night. You could 1059 00:56:37,320 --> 00:56:39,960 Speaker 2: potentially be driving and not have your headlights on. 1060 00:56:40,520 --> 00:56:43,440 Speaker 1: Oh yeah, gee, let's try that, shall we? Hey, everyone, 1061 00:56:43,520 --> 00:56:47,840 Speaker 1: let's just do a little social experiment, automotive experiment. Let's... no, 1062 00:56:48,520 --> 00:56:49,719 Speaker 1: let's not do that. For 1063 00:56:49,680 --> 00:56:52,400 Speaker 3: a night-time wee, and not having to turn the 1064 00:56:52,520 --> 00:56:55,239 Speaker 3: lights on and wake yourself up with all the blue light. 1065 00:56:55,800 --> 00:56:57,799 Speaker 1: Oh, that's such an issue, isn't it, when you get 1066 00:56:57,840 --> 00:56:59,440 Speaker 1: up and you need a wee and you turn that 1067 00:56:59,480 --> 00:57:02,399 Speaker 1: shit on and it's like, fucks up your sleep.
Yeah, 1068 00:57:02,480 --> 00:57:04,360 Speaker 1: that's why I just go, I think this is the 1069 00:57:04,400 --> 00:57:06,279 Speaker 1: toilet in front of me, I'll do my best and 1070 00:57:06,320 --> 00:57:10,319 Speaker 1: I'll deal with it at six a.m. What's all that 1071 00:57:10,480 --> 00:57:18,000 Speaker 1: splashing on my calves? What's that on my shins? Hold it, 1072 00:57:18,120 --> 00:57:18,520 Speaker 2: Craigo. 1073 00:57:19,400 --> 00:57:22,720 Speaker 1: Yeah. Or you can always, you know, you can always 1074 00:57:22,760 --> 00:57:24,800 Speaker 1: sit down, but that's not... you don't want to do 1075 00:57:24,880 --> 00:57:27,560 Speaker 1: that when you're a boy, do you? Do 1076 00:57:27,520 --> 00:57:30,280 Speaker 2: you remember we had a trainer at Harper's whose wife 1077 00:57:30,440 --> 00:57:31,320 Speaker 2: made him sit down? 1078 00:57:31,360 --> 00:57:35,800 Speaker 1: Do you remember that? A trainer at Harper's whose... Do 1079 00:57:35,920 --> 00:57:38,000 Speaker 1: not tell us the name on air, but definitely tell 1080 00:57:38,080 --> 00:57:40,320 Speaker 1: us. What do you mean, he's... oh, so that he 1081 00:57:40,320 --> 00:57:41,440 Speaker 1: wouldn't piss everywhere? 1082 00:57:41,880 --> 00:57:45,000 Speaker 2: Yep, he's under strict instructions. He has to sit down, 1083 00:57:45,560 --> 00:57:46,520 Speaker 2: not allowed to stand up. 1084 00:57:47,080 --> 00:57:51,160 Speaker 1: Yeah, well that's... wow. I need to disown whoever that 1085 00:57:51,320 --> 00:57:54,240 Speaker 1: was before you even tell me who that was. Patrick, where can 1086 00:57:54,360 --> 00:58:00,400 Speaker 1: people learn about you and what you do, and join 1087 00:58:00,440 --> 00:58:02,840 Speaker 1: your cult and come and stay at your house
1088 00:58:02,880 --> 00:58:07,240 Speaker 2: overnight? Websitesnow dot com dot au is the main 1089 00:58:07,360 --> 00:58:10,200 Speaker 2: website, or you could go to tai chi at home dot 1090 00:58:10,200 --> 00:58:11,919 Speaker 2: com dot au if you feel like you just want 1091 00:58:11,920 --> 00:58:13,920 Speaker 2: to hang out and do tai chi with me. 1092 00:58:15,240 --> 00:58:16,640 Speaker 1: Perfect, both ways. 1093 00:58:17,040 --> 00:58:20,560 Speaker 2: And also, thank you so much, we finally got someone 1094 00:58:20,600 --> 00:58:22,560 Speaker 2: who kind of had some feedback. That was great that 1095 00:58:22,720 --> 00:58:25,520 Speaker 2: Jane took the trouble to ask us to talk about something. 1096 00:58:25,600 --> 00:58:28,360 Speaker 2: So if you want to talk to us about something, just 1097 00:58:28,400 --> 00:58:30,760 Speaker 2: go to websitesnow dot com dot au. 1098 00:58:30,800 --> 00:58:34,880 Speaker 1: You just made this podcast sound so tragic when 1099 00:58:34,960 --> 00:58:39,000 Speaker 1: you just highlighted that a person sent a question, a 1100 00:58:39,160 --> 00:58:41,080 Speaker 1: person engaged with us. 1101 00:58:41,600 --> 00:58:44,040 Speaker 2: They care. Look, you know what, people enjoy listening. They 1102 00:58:44,040 --> 00:58:46,000 Speaker 2: don't feel they have to contribute, but if they want to, 1103 00:58:46,360 --> 00:58:48,200 Speaker 2: they can, and they sure can. 1104 00:58:48,960 --> 00:58:51,440 Speaker 1: Can I say how funny you look holding what appears 1105 00:58:51,480 --> 00:58:53,280 Speaker 1: to be a bottle of beer in your hands, but 1106 00:58:53,360 --> 00:58:56,640 Speaker 1: it's kom butcher. Tiff, we know you're angry, we're going 1107 00:58:56,680 --> 00:58:57,760 Speaker 1: to let you go. But what are you going to 1108 00:58:57,800 --> 00:59:00,040 Speaker 1: shove in your face? What's the go-to when you're hangry?
1109 00:59:01,160 --> 00:59:03,800 Speaker 3: The place next door to where I took my clients 1110 00:59:03,880 --> 00:59:07,200 Speaker 3: is Fair Feed, and they have the best home-cooked, 1111 00:59:07,280 --> 00:59:10,080 Speaker 3: awesome... chef-prepared, not home-cooked, chef-prepared meals. So 1112 00:59:10,080 --> 00:59:12,040 Speaker 3: I got... I can't remember what I bought, but I bought 1113 00:59:12,080 --> 00:59:14,600 Speaker 3: some slow-cooked meats. I'm going to whip that up. 1114 00:59:16,080 --> 00:59:17,760 Speaker 1: Is that the place where you're going and you get 1115 00:59:17,800 --> 00:59:20,800 Speaker 1: food and then you just pay whatever you think it's worth? No, 1116 00:59:20,880 --> 00:59:24,600 Speaker 3: this place started up around COVID time, when we were 1117 00:59:24,760 --> 00:59:29,919 Speaker 3: in lockdown, and it's just like a sustainable little establishment, 1118 00:59:30,160 --> 00:59:32,680 Speaker 3: and they've got chefs there and they cook all that. 1119 00:59:32,920 --> 00:59:35,880 Speaker 3: It's really good food, it's really tasty, and it's really 1120 00:59:36,160 --> 00:59:37,040 Speaker 3: awesomely priced. 1121 00:59:37,080 --> 00:59:40,960 Speaker 1: So is it not-for-profit, is my question? Or 1122 00:59:41,000 --> 00:59:41,800 Speaker 1: is it a business? 1123 00:59:42,240 --> 00:59:47,880 Speaker 3: It's a social enterprise, right, and they're lovely too. So 1124 00:59:48,000 --> 00:59:51,360 Speaker 3: I drop in there once a week when I'm training people. 1125 00:59:51,200 --> 00:59:53,080 Speaker 1: Give a shout out. What's their name again? What's the 1126 00:59:53,160 --> 00:59:56,360 Speaker 1: business name? Or the social enterprise? What, Fair Feed? 1127 00:59:56,520 --> 00:59:59,840 Speaker 3: Fair Feed. Go get around it. It's good, good food and 1128 01:00:00,120 --> 01:00:05,360 Speaker 3: soups that make me feel like my nan cooked them.
1129 01:00:04,400 --> 01:00:07,360 Speaker 2: Yeah, can I give you some quick cooking advice? Because 1130 01:00:07,400 --> 01:00:09,400 Speaker 2: I had the most exciting lunch today. 1131 01:00:09,520 --> 01:00:13,480 Speaker 1: Which part of 'we're finishing the podcast' are you not understanding? 1132 01:00:13,040 --> 01:00:15,439 Speaker 3: Everybody wants cooking advice from the vegan. 1133 01:00:16,840 --> 01:00:20,200 Speaker 1: And also, yeah, we're just about... we're just winding up, 1134 01:00:20,200 --> 01:00:22,240 Speaker 1: and he's like, hey, can I tell you... no, you 1135 01:00:22,400 --> 01:00:23,160 Speaker 1: fucking can't. 1136 01:00:23,320 --> 01:00:26,720 Speaker 3: I'm about to go and eat my slow-cooked... So. 1137 01:00:27,680 --> 01:00:32,200 Speaker 1: Hey, go and tell your story to Fritz. He's probably 1138 01:00:32,240 --> 01:00:34,840 Speaker 1: so sick of your stories. Yeah, see you, everyone. 1139 01:00:35,400 --> 01:00:35,880 Speaker 2: Bye,