Speaker 1: Okay, here we go, it's Harps and Patrick. We are sans Tiff, so you and I are just going to have to see how we go. Just us two dummies, mate. Yeah, we'll be fine.

Speaker 2: You know, testosterone's pumping.

Speaker 1: Did you go get some? I'm glad you got some. Where'd you get it? Did you get it on the black market, or did you see the doc?

Speaker 2: Really?

Speaker 1: Do we have to go there already? No, I reckon... I actually think your testosterone levels would probably be above average for your age, because of your healthy lifestyle, et cetera.

Speaker 2: Does that make a big difference, does it?

Speaker 1: Oh, of course, yeah. So if you beat your body up, you know, drink lots of booze, eat shit, don't exercise... whereas literally lifting weights, or in your case doing tai chi or other kinds of strength-based stuff, increases testosterone production. So being sedentary lowers it, quite apart from the fact that it declines as men age, and in women as well, because women have testosterone in small amounts. But yes, one hundred percent, lifting weights increases test production and all of that. There's a lot going on in that kind of endocrine space around male and female hormones and therapy. And for a very long time, and I've spoken about this too much on this show, anything linking testosterone and men has essentially been associated with the drug cheating scandals of the eighties and nineties. People don't understand that testosterone does not equal steroids, you know what I mean. There are clinical applications, we're talking about health here, just like there are for estrogen and progesterone in women. But enough of that boring conversation. Do you still have morning wood, Patrick? That's my question. Do you still periodically have that? This is a consult now, everyone. You're welcome.
Speaker 2: I feel like I'm damned if I do and damned if I don't by answering that question.

Speaker 1: Well, do you know what's funny? Literally, a doctor will ask you whether or not... listen, speaking of tech, have you heard of a guy called Bryan Johnson? No, this is true. So there's this dude who spends two million dollars a...

Speaker 2: A year on reversing his ageing. Sorry, yes, of course I have.

Speaker 1: Yeah. So one of the things that he talks about, and it's not haha, giggle giggle, schoolboy smut, is that with men, and this sounds funny, one of the indicators of health is literally how many times, and for how long, a night men have an erection. In young, healthy males that happens somewhere around two to three hours a night, and it's not because they're sexually aroused; it's just what's happening in their body. And yes, it sounds funny, but it's an indicator: it's got to do with blood flow and testosterone production and cell health and all of these things. So guys who are fifty or sixty who can still do that, who can still get an erection without too much trouble, sorry everyone, it's clinical, that's a real indicator of health. And obviously every second dude in Australia past a certain age, if not more, has erectile dysfunction, which is all haha, but you know what, it actually correlates really strongly with overall wellbeing. Just like when women's estrogen and progesterone levels drop, it equates to, among other things, perimenopause and menopause. So there are these physiological consequences on both sides of the endocrine scale. But because we're so fucking stupid in our society at times, we just snigger at it or we think it's controversial. If your hormones aren't working, including thyroid function and a bunch of other things, you're fucked.
Speaker 1: So that's why women go on HRT, and that's why we try to get the endocrine system working well, because if that's working well, you're literally winding back the clock, biologically at least.

Speaker 2: Now, you eloquently pointed out, or asked, whether I had morning wood, which was a really lovely way of putting it. Thank you very much for that.

Speaker 1: You're welcome.

Speaker 2: What I think I might do for the next two weeks, before our next episode, is keep a diary for you, Craig. Yep, yep, I'll take notes. I can't say that I do.

Speaker 1: You know what, for the seven male listeners that we have, because, I don't know why, but women love you. So many women say, "I love Patrick", and I'm like, you know, I love him too.

Speaker 2: I like you too, but just not in that way.

Speaker 1: That's probably why they like you. They feel safe.

Speaker 2: Yeah, absolutely.

Speaker 1: Yes, go on.

Speaker 2: I want to keep on this topic, because I had a very interesting little...

Speaker 1: Careful... yeah, I'm worried now. "I want to keep on the topic of erections, because I had an interesting..." Yeah, go on.

Speaker 2: Okay. So over the age of fifty, males should get PSA tests done, which is prostate-specific antigen. As you know, my twin brother had prostate cancer and is now going through that journey. It's a big concern for me as well, obviously, because it increases the risk: if your twin or sibling is diagnosed under the age of sixty, siblings have a higher risk of prostate cancer. So I've been tracking my PSAs, and I did blood tests just recently, because I like to keep on top of my blood tests, particularly with my eating patterns and all that sort of stuff. And what I've been researching, and this doesn't get talked about very often, and again I'm trying to put this delicately, is what happens if you act on that morning wood.

Speaker 1: Do you hang a towel on it or something and then stroll through the house?
Speaker 2: That's right, yeah. That can influence your PSA levels: within six hours, but for up to twenty-four to forty-eight hours.

Speaker 1: Why are you doing this? He's holding his hands out like, you know, when you've caught a fish. Yeah, go on, say that again about the six hours. I was obsessed with your hands.

Speaker 2: No, no, just PSA. I'm going to sit on my hands now so I don't move them.

Speaker 1: That looks worse. Yeah, go on.

Speaker 2: Yeah. Evidently, in reading and doing research into PSA levels, because I'm obviously monitoring mine very closely, what can influence your PSAs is if you've, yeah, you know... jacked off, thank you... ejaculated, right, within six hours, but for up to twenty-four to forty-eight hours beforehand. So I went monk for the last week before I had my blood test, and my PSA dropped by about ten percent.

Speaker 1: It dropped. So you're saying jacking off increases your PSA?

Speaker 2: Well, if it's just before you go for a blood test, yeah, evidently. Because I asked a GP friend of mine yesterday, we walk every week, I've got a few friends like that.

Speaker 1: Hang on. So what's the message? And by the way, remember everyone, this is not advice, this is the opposite of advice. So don't take any notice of Patrick's jacking-off regime, really, or any kind of medical conclusions he draws from it. Now you've hit up the computer. Well, how... I don't trust you.

Speaker 2: "The effect of ejaculation on serum prostate-specific antigen levels." This is an article from twenty thirteen, from the National Institutes of Health. So evidently it's known that that can impact your PSA levels, but...

Speaker 1: You haven't read out the rest. What's it say? The impact?

Speaker 2: Okay: "Ejaculation temporarily increases serum prostate-specific antigen (PSA) levels, typically by five to ten percent, with the elevation lasting up to forty-eight hours."
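A quick back-of-envelope sketch of the arithmetic in that quote, in Python. This is illustrative only, not a clinical tool: the function name and the example reading are made up here, and the five-to-ten percent figure is simply the one from the study Patrick read out.

```python
# Rough sketch of the PSA figures quoted above. Assumes a measured PSA
# can carry a temporary 5-10% elevation if the blood draw falls within
# ~48 hours of ejaculation, per the 2013 article quoted. Illustrative
# only, not medical advice.

def baseline_psa_range(measured_psa: float,
                       min_bump: float = 0.05,
                       max_bump: float = 0.10) -> tuple[float, float]:
    """Back out where a 'true' baseline PSA could sit if the measured
    value were inflated by 5-10%."""
    return measured_psa / (1 + max_bump), measured_psa / (1 + min_bump)

measured = 2.2  # ng/mL, a hypothetical reading
low, high = baseline_psa_range(measured)
print(f"Measured {measured} ng/mL -> baseline roughly {low:.2f}-{high:.2f} ng/mL")
# A ~10% drop after a week of abstinence, as Patrick describes, sits at
# the top end of the quoted 5-10% elevation.
```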
Speaker 1: Right, well, there you go. So, boys, that's a tricky one, isn't it? You do a cost-benefit analysis: how long is it elevated for, and what are we talking about here? Hey, well, you keep on keeping on. Thanks for keeping a fourteen-day wood diary. We'll call it the EMW: the Early Morning Wood Diary.

Speaker 2: So, Jake the Peg. What about you?

Speaker 1: Hey, I've had an erection since the eighties.

Speaker 2: I don't... all right, now you're embarrassed.

Speaker 1: Is it hot in here? Is it hot in here? Stop it. Yeah, well, I'll also keep a diary and we'll compare. No, we won't, everyone, that's disgusting. And for the one listener that remains: what is wrong with you? Why are you still here? Oh, God. Do you know, we do this every time, where we talk about shit that's so unrelated, nothing to do with what we're meant to be talking about, and then there's this awkward fucking segue. Let's talk about technology.

Speaker 2: Wait a minute, let's talk about porn, because there is something that I want to talk about that relates to porn. Because if you happen to be...

Speaker 1: I feel much better with that segue.

Speaker 2: Well, because if you wake up... okay, anyway.

Speaker 1: All right, all right, let's talk about porn. Do we need to?

Speaker 2: All right, well, no, it's just in relation to government oversight and the fact that the Australian government is now limiting access to social media for children under the age of sixteen. They're also rolling out systems to verify age with a lot of the major porn sites as well. That raises concerns about the oversight of how they're going to verify that you are over a certain age. So, you know, for adults wanting to access porn in Australia now, you will have to prove your...

Speaker 1: Age. Right, yeah, okay. And how do you do that?
Speaker 2: Well, there are a number of different ways they can do it. One will be literally using the camera on your phone to take a photograph of yourself, and then it will use an algorithm to gauge whether or not you're old enough.

Speaker 1: It seems like then you're giving an image of yourself to the fucking internet. Not that, you know... I don't know how porn works. Allegedly. Yeah, let's stop now.

Speaker 2: Yeah, that was the only segue I could think of in relation to our conversation, Craig. Can we talk about something else?

Speaker 1: Yeah, fuck, please. Please. Let's talk about another bloody app from Patrick: Nearby Glasses. What does Nearby Glasses do? By the way, everyone, Patrick sends me a list of talking points; we generally don't get halfway through it. But tell me about your bloody app.

Speaker 2: Well, there are lots of concerns at the moment with all these smart glasses that are out. Meta, the company that owns Facebook, has teamed up with a few glasses manufacturers, and what you've got are AI glasses that have cameras in them. You can record, you can take photos, but the glasses can also see the world around them. We were chatting earlier about how I went to a concert in Melbourne this week, and I was with a blind friend of mine who has some smart glasses, and we were just talking about how useful they are for him, to be able to describe what is visually in front of him. His wife bought them for him; he's really good with a cane and he's got a seeing eye dog, but he's been playing around with the new smart glasses. But there are privacy concerns about somebody walking around with cameras that are constantly recording or taking in the scenery around them. So this guy came out with an app called Nearby Glasses, and I tested it; I downloaded it onto my phone.
Speaker 2: It's really cool, because when people use smart glasses, the glasses are basically connecting to your phone; that's how they're getting their information and how they're able to do all the amazing things that they do. And evidently there's a certain frequency range, and without going into the technicals, effectively this app will let you know if someone in your vicinity is potentially sporting smart glasses. I thought it was really good.

Speaker 1: Now, that is a good idea. I think that's great. And also, I don't know why this jumped to mind... yes, I do. I was talking to one of the ladies who works at the gym I go to, and they were talking about, you know, creepster dudes who come into the gym, and, you know what I mean, it's clear that twenty percent of their reason to be there is for training and eighty percent is for other things. The idea of girls filming me, I don't really care so much; I don't want it, but it's not an issue. But dudes being able to film women and stuff just by putting on glasses? That capacity should not be legal. Imagine someone wearing those in the gym, just filming everyone in the gym, and then they go home and download it all. That's creepy as fuck.

Speaker 2: Yeah, look, it's a really interesting one. As I said, my blind friend has been using them, and you can imagine, for someone who's vision impaired, if you've got cameras in smart glasses that are constantly looking at the world around you, you could effectively go to the gym and check what weights you've set, you could have the area described to you. You could enable people a lot more by using this technology.
Speaker 2: But of course, like anything, it means we're constantly being surveilled, and the question is: where is that data going, and who's monitoring it? Another concern is that Meta, the company that makes a lot of the software powering these smart glasses, uses humans for oversight. So AI is looking at what you're seeing, but they take snippets of the recordings and use humans to cross-check them against the AI, to help make the AI more accurate. And these outsourced workers are now saying that they're capturing people going to the toilet, having sex, looking at passwords on their computers while they're typing. So the problem is, if you start wearing these and they're part of your life all the time, everything you see, potentially they see. And that's why I thought this app is really good. It's a free app, and what I liked about it as well is that it's by a guy who's kind of a Swiss psycho-sociologist and a hobbyist coder, so he's a bit of a nerd, and he invented the app. What's really interesting is, any time someone's wearing these smart glasses, the app listens for a Bluetooth signal, and the Bluetooth signal has these data packets that fly back and forth to tell other devices what they are and what they're doing. Your phone knows to connect to the smart glasses because of the signal between them, and that's what this app is looking for. I've got a VR headset, and my VR headset uses the same sort of data packets, so as soon as I turned the VR headset on, my phone pinged to tell me that there was a recordable device, a smart device, being used within my vicinity. So I could see people using it in the gym, you know. It's a free app. It's great.
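For the technically curious, here is a minimal sketch of the detection idea Patrick describes: passively scanning for Bluetooth LE advertisements and flagging devices that look like wearable cameras. It assumes the bleak Python library (pip install bleak); the manufacturer ID and the name heuristic below are placeholders, not identifiers any real glasses broadcast, and the actual Nearby Glasses app's matching rules aren't public in this conversation.

```python
# Sketch: listen for BLE advertisement packets and flag devices whose
# manufacturer ID or advertised name suggests smart glasses. The IDs
# below are placeholders (0xFFFF is the Bluetooth SIG test ID), not
# Meta's real company identifiers.

import asyncio
from bleak import BleakScanner

WATCHED_IDS = {0xFFFF}        # hypothetical manufacturer IDs to flag
WATCHED_NAMES = ("glasses",)  # crude name-based heuristic

async def scan_once(timeout: float = 5.0) -> None:
    # return_adv=True yields (device, advertisement_data) pairs
    found = await BleakScanner.discover(timeout=timeout, return_adv=True)
    for device, adv in found.values():
        name = (adv.local_name or "").lower()
        company_ids = set(adv.manufacturer_data)  # keys: 16-bit company IDs
        if company_ids & WATCHED_IDS or any(w in name for w in WATCHED_NAMES):
            print(f"Possible smart glasses nearby: {device.address} ({name or 'unnamed'})")

if __name__ == "__main__":
    asyncio.run(scan_once())
```

The design point, per Patrick's description, is that the app never connects to anything: advertisement packets are broadcast in the clear precisely so phones know what to pair with, which is why his VR headset triggered the same ping.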
Speaker 1: I thought it was fantastic. I wonder what you'd do with that, though. Let's say you've got it on, you're in the gym, and it pings. What do you do then? Do you go up to the person... because there might be ten people in the gym with glasses on. Do you go up to the person you suspect and go, hey? Or do you just avoid them? Like, I think it's a great idea, and my question is: and then what?

Speaker 2: Well, then you're more conscious of what you say and do and how you present yourself, because the thing is, if you're in a public place, for example...

Speaker 1: No, hang on, hang on. Why the fuck do I need to be more conscious of what I say and do and how I present myself because some fucking idiot is wearing perv glasses? I shouldn't have to adapt. They should fuck off. No, no, hang on, hang on, I know you're a chatty Kathy, let me have a go. Right: if I have to change because some bloke's wearing glasses, that's not right. They shouldn't control the dynamics of that space. They shouldn't be allowed to wear those. If it has the capacity to record people who haven't consented to that, I don't think that should be legal. And that is what's happening: they're recording people who have not consented to it. How is that okay?

Speaker 2: It's not okay, and that's my point. Gyms need to have hard policies. Basically, what you should be doing is going to the person who owns the gym and saying, maybe our policy should be that anybody accessing the gym is not allowed to use any sort of recording device. I think it's incumbent on the venue to enforce that, because the thing is, you don't want it to become the Spanish Inquisition either. Imagine if you suddenly, you know, muscled up to some bloke and found out that actually he's blind and he needs them, or they're just everyday glasses.

Speaker 1: I think that's going to be evident. I don't think anyone's that fucking stupid. You think?
Speaker 1: Can I just say, my friend Tony Doherty, of Doherty's Gyms, the world-famous Doherty's Gyms, he doesn't let people do it. Because, and it's probably not just him, but in a lot of gyms in Melbourne people are constantly filming, constantly videoing, and many of them are even bringing in tripods. They're setting up, like, a fucking studio: setting up tripods and filming themselves, but obviously capturing everyone behind them. And he just fucks them off. He either kicks them out or tells them they can't do it. I'm pretty sure there's no videoing in his gyms at all, so he won't be putting up with these glasses, big time.

Speaker 2: Good on you, Tone. The one thing that is the reality, unfortunately... not unfortunately, but: in Australia, if you are in a public space, if you're walking down the street, if you're at a local festival, it's legal. You can take photographs of total strangers. It's creepy, and I'm not saying it's a good thing, but there is no law to say you can't be filming or taking photos or recording video in a public space. But when you're in a venue where there's a membership, you can control that, and that's again where I'd be pushing the owners of the gyms. That's what I'd be doing: I'd be approaching the gym attendant and saying, I'm not feeling comfortable, there's someone around here who's wearing smart glasses. I wouldn't challenge someone directly; I'd get one of the attendants or the owners to challenge them.

Speaker 1: Yeah, good luck with that. Have you been in a gym lately? I want to do two more. So, the next one I want to do... we've spoken about the potential impact of AI on a lot of industries, and I don't want to go into AI much today other than this one, but I find it interesting.
Speaker 1: So, the third one on your list: an AI-generated film at Tropfest twenty twenty six, which is a kind of film festival, sparks filmmaker backlash. I mean, the inevitability of AI films with no actual actors is a foregone conclusion. I don't know what the timeline is... actually, I think it has already happened. Tell us about it.

Speaker 2: So, Tropfest is pretty exciting, because it's an Australian festival that's launched some amazing careers; it's been around for a long time, but COVID kind of pulled the pin on it, so it hasn't run for six years. They're having Tropfest this year, and one of the finalists was a film called Sid Confidential. I watched it before the show; I figured I'd have a bit of a watch to see what I thought of it. It was one of those detective-noir, semi-cartoon, 3D-cartoon things. "It was a dark and stormy night", and all that. Look, I watched it.

Speaker 1: Can I just say, that's why you and I are not actors?

Speaker 2: Yeah, that was very terrible. It was moderately entertaining. Visually, I kept seeing AI glitches: at one point the guy goes to jump on a tram, and it says "Sydney" on the front of the tram, and then the frame flips and it says this garbled junk or whatever. So there are little mistakes, and it's fun to watch even just looking for them. It was entertaining. But I think the big argument comes from all the filmmakers who spend hundreds and hundreds of hours mapping out stories and then filming and editing and doing all the work they do, because Tropfest is a very high standard, and I think that's where things have come unstuck. There should be a separate category for AI, and that's the way you get around it.

Speaker 1: A separate category. Can I ask, so it looks more cartoonish than real, right?
Speaker 2: Well, in this instance, because that was the style they were going for. You can go super ultra-realistic, and we've seen examples all over the internet with the likes of famous actors, and the quality is unbelievably good. But in this case they've gone for a kind of three-dimensional cartoon representation. It was kind of fun; it was a bit of an homage to Tropfest itself. If you watch it, it only goes for about fifteen minutes or so. It was an interesting watch, but I certainly wouldn't have shortlisted it, because of all the seven hundred entries, why would you choose an AI film over ones that humans made and sweated over? That's not to say the person who produced it didn't put some effort in, but as the majority of it was AI, you've got to ask yourself, what's the standard? We want to make sure we keep that separate from everyday, average people who want to try to launch a career, or who are passionate about their hobby.

Speaker 1: Do you have Netflix? You probably don't. Or do you?

Speaker 2: Yeah, I just don't watch it. I keep thinking I should just cancel my membership.

Speaker 1: Before you do: you know those short films like the one you're just talking about? This thing came up the other day. I went to bed, and I normally go, I'm going to watch whatever, and then seven minutes later I'm asleep. And this came up... what's that one... and Robots? No, this one's called The Singers. Okay, The Singers. It goes for, I think, eighteen minutes, and it's now on Netflix, and I'd like you to watch it and then message me, and any of our listeners too. I watched it, and I kind of loved it and hated it. It's very different. Don't look it up now.
Speaker 2: No, no, I was actually looking up Love, Death and Robots, which is something I think you should watch. So I'll watch The Singers and you watch that. Okay?

Speaker 1: How long does Love, Death and Robots go for?

Speaker 2: They're all little mini video clips, different things. You get that on Netflix as well.

Speaker 1: Okay, all right. Now I want to jump ship and talk about something that's more in my lane: cars. Well, not really, but you know what's interesting? BYD, which I'm sure most of our listeners know by now is a Chinese brand. It's so funny how the psychology, the feeling, the attitude people have about Chinese cars is shifting, because there was a point in time when the Chinese cars in Australia were very agricultural and very low quality. That is definitely not the case now. And I'm not endorsing or supporting, I'm just saying: a few of my friends have got Chinese cars, and some of them are fucking amazing. But anyway, BYD has just produced the world's longest-range EV, with a monster six hundred and forty-four miles, which is, give or take, a thousand Ks between charges.

Speaker 2: A thousand?

Speaker 1: That's a game changer.

Speaker 2: That's massive, because range anxiety has been a big factor in Australia. It's a big thing for a lot of people, particularly if you live in a rural area. I know when I bought my car I did the hybrid version, because the battery-powered EV version was only two hundred kilometres, and to me, living an hour from Melbourne, two hundred Ks is not a lot. So a thousand kilometres? Wow, that's amazing.

Speaker 1: That's definitely a game changer. So it'll be interesting to see how that goes.
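The conversion behind "give or take a thousand Ks", as a one-liner sketch:

```python
# Miles to kilometres for the quoted range figure.
MILES_TO_KM = 1.609344

range_miles = 644
print(f"{range_miles} miles ≈ {range_miles * MILES_TO_KM:.0f} km")  # ≈ 1036 km
```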
Speaker 2: It's called the Z9 GT, if anybody's interested. I didn't realise this, but BYD has a luxury brand as well, called Denza, so it's under that brand. It's like Lexus and Toyota, isn't it?

Speaker 1: Yes, exactly. And Hyundai have one... what's it called? Genesis. Yeah, they have Genesis. Which is... where's... now I've lost all your notes. There we are, now I'm back. But will your next car be an EV?

Speaker 2: It probably will be, I would think. I have solar power, and I like the idea that I could use my solar to charge the car during the day, all that sort of stuff. I haven't got batteries for my solar system yet, and I know there's a government initiative at the moment. But there are a small number of EVs that can be used in reverse, to power the house if the power goes out. I think the Nissan Leaf is one of those. I can't remember the technical term, but basically you plug it in, and if the power drops, it acts as a battery, because the batteries in cars are a lot more powerful than the batteries people get to power their houses, for the most part.

Speaker 1: Yeah, well, there's a huge demand on those batteries, propelling a two-tonne vehicle at a hundred kilometres an hour for extended periods of time on a highway. Now: "fake content of occupation is not educational". What does that mean? I don't even know what that means. Fake content of occupation? Yeah, what is that?

Speaker 2: So, yeah, the occupation of... are you familiar with the little island of Guernsey? Yes? Yeah, okay. So it's kind of... I think it's closer to France than it is to the UK, but it's the furthest-most island of that group of British islands. And it was occupied by German troops in nineteen forty or so.
Speaker 2: And what's been happening is a lot of people have been uploading photos of the occupation, but they're AI-generated, right? A historian has come out and said, you know, people are trying to get clicks, so they're looking to make sensationalist photos, or they're animating photos: you've got a still image of troops, and then you animate it, and there are the soldiers walking down the street. But it's not historically accurate. And this is a real concern, because there's this professor, and it's actually an interesting title, a Professor of Conflict Archaeology and Holocaust Heritage at the University of Cambridge, Dr Gilly Carr, who says this has become really troubling. The occupation in particular is the example highlighted: there are lots of AI-generated images, and they're being shared on social media all the time, because there's been a real resurgence of interest in the World War and what happened there. And the problem is, none of it is actually accurate. It's not historically accurate. This is going to be worrisome for us moving forward, whether you're flicking through your reels on YouTube or Instagram: what's real and what isn't real?

Speaker 1: That's already happening. Because a lot of my stuff is research-based, a lot of my sphere is nutrition and training and science, and then it's like, "oh, groundbreaking research reveals", and it's "from science" blah blah, and it looks one hundred percent legit, and it's complete fiction. It's a total fiction created to get clicks and numbers. And then... I saw Jordan Peterson, we all know who he is; anyway, there was some quote from him, a picture of him in a chair on a stage making some profound fucking statement.
Speaker 1: I'm like, what? And I chucked it into Claude or ChatGPT and asked, is this legit? And they're like, nah, he never said that. That's not true. That's a complete fabrication. And that's the thing: you don't know. Like, I put up a post on Insta before, and you know when people go, "the best way to get in shape is..."? I put that as a standalone sentence in quotation marks, and I wrote after it: whatever comes next is bullshit. Because there is no best way for all these different bodies, for people with different backgrounds and genetics and ages and injuries and medical challenges. There's no best program. But people say it all the time: this is the best way to do this. No, it's not, because you go: the best way for who? The best way for Patrick, or Craig, or my mum, or the thirty-year-old elite athlete? So I think it's getting worse and worse, with some stuff anyway: that Instagrammy, Facebooky stuff getting pumped out, which is allegedly some information source. I think most of it isn't. I think most of it is now somewhere between somewhat accurate and complete bullshit.

Speaker 2: Yeah. I think more than fifty-one percent of all content now being generated on the internet, video, stills, the whole lot, is AI-generated, which is a frightening thought. I mean, it's not that it isn't a useful tool, but it also means I worry about election time. That's election time, by the way. Not going back to our first topic.

Speaker 1: When we have a state... that's with an L, not an R. Yeah, when we have...

Speaker 2: A state or federal election, how do we know that the video we're seeing isn't propaganda that's been generated by AI?
Speaker 2: We know that previous elections in other countries have been heavily influenced. And as you say, you can fake it so well now. If you're in one of those blinkered visions, where your algorithm is pulling up far left, far right, or middle, and someone's going for clicks and pushing propaganda, it's getting really difficult to know what's right and what's not right. We're all being challenged to try to work out whether we jump one way or the other. And I don't know, is there going to be a pushback? Certainly with a lot of younger people: I see people wearing headphones with cables on them now; they're not wearing, you know, wireless earbuds.

Speaker 1: I bought some the other day, because, yeah, I'm like, I'm not sure. And I also think another thing, mate, is the gap between what is happening, well, let's stay in this space, in AI and social media, and the stuff that's being pushed out, because everything on social media has one objective, or a couple, but the immediate objective is to get your attention and keep it for as long as possible. And then there's the gap between people's actual understanding of what's going on and what's actually going on. Every time I look at something, unless it's literally, you know, something you put up on social media, where I go, well, that's him, and that's Brits, and that's his morning, I know... most things, I'm like, I don't know if this is true, even if it looks true. I spoke to Melissa earlier, and I said, when I've finished my current study, even though I'm not really excited about it, I feel like I need to do a little bit of a Craig version of a deep dive into AI, just so that I can start to speak the language.
614 00:34:37,400 --> 00:34:39,200 Speaker 1: You know, there ain't too many sixty year olds that 615 00:34:39,239 --> 00:34:42,200 Speaker 1: are all over AI, and understandably, or fifty or forty 616 00:34:42,280 --> 00:34:44,040 Speaker 1: year olds for that matter. But I don't want to 617 00:34:44,080 --> 00:34:45,719 Speaker 1: get to the point where I don't understand what the 618 00:34:45,719 --> 00:34:48,840 Speaker 1: fuck's going on. And so I want to become, not 619 00:34:48,920 --> 00:34:51,120 Speaker 1: because I want to become a coder or write this 620 00:34:51,239 --> 00:34:53,880 Speaker 1: or that or, you know, but I just want to 621 00:34:54,000 --> 00:34:57,520 Speaker 1: understand what is possible, what is happening, what is real, 622 00:34:57,600 --> 00:35:01,920 Speaker 1: what is bullshit. Of course, how do I take advantage, 623 00:35:01,960 --> 00:35:06,839 Speaker 1: in an intelligent way, how do I take advantage 624 00:35:06,880 --> 00:35:10,960 Speaker 1: of what's possible and available for my business and for 625 00:35:11,080 --> 00:35:14,919 Speaker 1: my career? And, like, you can go the super long way, 626 00:35:15,000 --> 00:35:17,640 Speaker 1: or you can get the same outcome. Like, I can, 627 00:35:18,400 --> 00:35:21,880 Speaker 1: I can literally get AI to check my entire paper 628 00:35:21,920 --> 00:35:23,800 Speaker 1: that I wrote, that took a year and a half. 629 00:35:23,960 --> 00:35:26,200 Speaker 1: It can check it. I'm not talking about rewriting it 630 00:35:26,320 --> 00:35:29,560 Speaker 1: or changing it, just finding any mistakes, in three 631 00:35:29,600 --> 00:35:32,239 Speaker 1: minutes, that would take me half a day of sitting 632 00:35:32,320 --> 00:35:35,720 Speaker 1: and reading. Why wouldn't you do that? Like, find the mistakes, 633 00:35:35,760 --> 00:35:38,440 Speaker 1: find the grammatical errors. Is there too much repetition? Does 634 00:35:38,480 --> 00:35:42,880 Speaker 1: it flow? You know? And that's not using AI to, 635 00:35:43,640 --> 00:35:47,160 Speaker 1: you know, obviously all my work is self generated, because I 636 00:35:47,239 --> 00:35:49,920 Speaker 1: ran independent studies and collected all the data. But for 637 00:35:50,000 --> 00:35:52,799 Speaker 1: things like that, and even, I mean, I know this 638 00:35:52,880 --> 00:35:56,560 Speaker 1: is a very low level use based on what's actually possible now, 639 00:35:56,600 --> 00:36:00,560 Speaker 1: but the idea that I'm not going to inter, or 640 00:36:00,600 --> 00:36:05,439 Speaker 1: interact with AI, as some people go, like, fuck AI. I'm like, yeah, 641 00:36:05,480 --> 00:36:07,480 Speaker 1: good luck with that. You're not. It's going to be 642 00:36:07,520 --> 00:36:09,560 Speaker 1: in your life whether or not you want it. So 643 00:36:09,600 --> 00:36:11,480 Speaker 1: you might need to get a little bit, kind of, 644 00:36:12,960 --> 00:36:17,560 Speaker 1: you know, aware, and, what's the word, able to 645 00:36:18,080 --> 00:36:19,480 Speaker 1: speak the language somewhat. 646 00:36:19,840 --> 00:36:23,319 Speaker 2: Yeah, I think understanding the lingo is really important. But 647 00:36:23,719 --> 00:36:25,839 Speaker 2: at what point do we deep dive to the point 648 00:36:25,880 --> 00:36:29,280 Speaker 2: where, you know, I don't know exactly how my phone works, 649 00:36:29,920 --> 00:36:33,080 Speaker 2: but I use it every day, and, you know, there's 650 00:36:33,080 --> 00:36:34,640 Speaker 2: only so many things you can do.
But I think 651 00:36:34,640 --> 00:36:37,080 Speaker 2: from a safety perspective, or at least from a, you know, 652 00:36:37,480 --> 00:36:40,720 Speaker 2: keeping yourself informed, knowing more about AI is really important. 653 00:36:40,880 --> 00:36:43,600 Speaker 2: I did a little interesting exercise the other day with 654 00:36:44,000 --> 00:36:46,440 Speaker 2: ChatGPT. I just, I threw a lot of my 655 00:36:46,520 --> 00:36:50,600 Speaker 2: financial information in there, about how much money I had, 656 00:36:50,640 --> 00:36:53,600 Speaker 2: what my superannuation was currently at, you know, what my 657 00:36:53,680 --> 00:36:56,719 Speaker 2: cash flow was, and I had a really interesting conversation, 658 00:36:56,920 --> 00:37:00,000 Speaker 2: you know, theoretical, of when could I retire, how much 659 00:37:00,080 --> 00:37:03,680 Speaker 2: would I need. But it was pulling in data from 660 00:37:04,320 --> 00:37:07,759 Speaker 2: the Australian government website, looking at pensions, what you could 661 00:37:07,760 --> 00:37:10,600 Speaker 2: do to supplement that if you were a self funded retiree. 662 00:37:10,680 --> 00:37:13,759 Speaker 2: It was really interesting, because it was drawing all the 663 00:37:13,800 --> 00:37:16,560 Speaker 2: sources from the internet that were government related, so the 664 00:37:16,800 --> 00:37:20,080 Speaker 2: information really seemed quite accurate. You know, you could do 665 00:37:20,120 --> 00:37:22,319 Speaker 2: this, and then you could supplement your pension by doing this, 666 00:37:22,400 --> 00:37:25,080 Speaker 2: and if you have this much. It was phenomenal. You know, 667 00:37:25,200 --> 00:37:28,920 Speaker 2: you'd have conversation after conversation with a financial advisor. 668 00:37:28,960 --> 00:37:31,920 Speaker 2: I'm not suggesting that you exclusively go to AI as 669 00:37:31,920 --> 00:37:36,080 Speaker 2: a financial advisor, but as a starting point, it's great. 670 00:37:36,120 --> 00:37:37,360 Speaker 2: It's an amazing tool. 671 00:37:38,640 --> 00:37:42,239 Speaker 1: I, yeah, I think that too. I think, I think 672 00:37:42,920 --> 00:37:46,080 Speaker 1: I was talking to somebody the other day about just 673 00:37:46,080 --> 00:37:48,480 Speaker 1: a little thing. They asked me something, and it's like, we 674 00:37:48,480 --> 00:37:52,360 Speaker 1: were talking about how, when they get emotional, they can't 675 00:37:52,400 --> 00:37:56,280 Speaker 1: think clearly, they don't make great decisions, you know, they 676 00:37:56,360 --> 00:37:59,839 Speaker 1: don't respond well, and all of that. And anyway, I 677 00:37:59,880 --> 00:38:03,360 Speaker 1: was trying to explain on a pretty basic level. So 678 00:38:03,440 --> 00:38:06,480 Speaker 1: bits of your brain do different things. You've got this 679 00:38:06,560 --> 00:38:09,480 Speaker 1: thing called an amygdala, which is like the emotional epicenter 680 00:38:09,520 --> 00:38:12,080 Speaker 1: of your brain. You've got a thing called a prefrontal 681 00:38:12,120 --> 00:38:14,279 Speaker 1: cortex, which makes decisions, all of that. And I go, 682 00:38:14,400 --> 00:38:19,160 Speaker 1: sometimes that bit overtakes that bit. And, like, how do 683 00:38:19,200 --> 00:38:21,319 Speaker 1: you, how do you know this? I go, well, I study it. 684 00:38:21,360 --> 00:38:23,520 Speaker 1: But I go, do you know what?
You could literally 685 00:38:23,560 --> 00:38:28,000 Speaker 1: go into ChatGPT and go, break down the bits 686 00:38:28,040 --> 00:38:30,279 Speaker 1: of the brain, like, what are they called and what 687 00:38:30,320 --> 00:38:33,960 Speaker 1: do they do? You know? And it came back 688 00:38:34,000 --> 00:38:35,440 Speaker 1: and I thought, how good is that? 689 00:38:35,719 --> 00:38:35,799 Speaker 2: Like? 690 00:38:36,440 --> 00:38:38,439 Speaker 1: Was that a good idea? And I said, literally, tell 691 00:38:38,480 --> 00:38:41,719 Speaker 1: me all the main bits, because there's a fucking lot, 692 00:38:41,760 --> 00:38:44,239 Speaker 1: but the main bits of the brain and what those 693 00:38:44,239 --> 00:38:46,840 Speaker 1: bits of the brain do for us. And it's really, 694 00:38:47,440 --> 00:38:50,600 Speaker 1: like, people could learn a basic understanding of how their 695 00:38:50,600 --> 00:38:52,799 Speaker 1: brain works, or at least what the bits of their 696 00:38:52,840 --> 00:38:56,200 Speaker 1: brain do, in an hour if they want. And, you know, 697 00:38:56,320 --> 00:38:59,919 Speaker 1: it's like, it is so incredible what's available now 698 00:39:00,120 --> 00:39:03,600 Speaker 1: and how helpful. Like, that's just a real practical use. 699 00:39:03,719 --> 00:39:06,160 Speaker 1: It's not steering you in a direction, it's giving you, 700 00:39:06,520 --> 00:39:09,000 Speaker 1: and the stuff that it spewed out, and I wrote, 701 00:39:09,520 --> 00:39:12,560 Speaker 1: you know, basically, word it so that it's user 702 00:39:12,600 --> 00:39:14,800 Speaker 1: friendly for a lay person, and it was brilliant. 703 00:39:15,360 --> 00:39:18,160 Speaker 2: Yeah, yeah, treat me like I'm, you know, a year 704 00:39:18,200 --> 00:39:23,240 Speaker 2: seven student, or treat me like, yeah. I had another 705 00:39:23,280 --> 00:39:26,279 Speaker 2: interesting little use. I had an awkward situation with a 706 00:39:26,280 --> 00:39:30,880 Speaker 2: colleague slash friend, and I had to send an email, 707 00:39:31,360 --> 00:39:35,000 Speaker 2: but I had to word it delicately, because I didn't 708 00:39:35,080 --> 00:39:37,880 Speaker 2: want it to be one of those email trails where 709 00:39:37,920 --> 00:39:39,799 Speaker 2: I said this, then you said that, and then I 710 00:39:40,400 --> 00:39:43,080 Speaker 2: said that. So I asked ChatGPT, I want to 711 00:39:43,120 --> 00:39:45,040 Speaker 2: send this, I don't want to offend the person, but 712 00:39:45,120 --> 00:39:47,120 Speaker 2: I also want to shut down the conversation so it 713 00:39:47,120 --> 00:39:50,480 Speaker 2: doesn't go any further. And it was great advice. 714 00:39:50,880 --> 00:39:52,560 Speaker 1: Really, this is really good. 715 00:39:52,680 --> 00:39:55,360 Speaker 2: Yeah, because I, you know, I don't want to offend people, 716 00:39:55,400 --> 00:39:58,400 Speaker 2: and I find that, you know, not that I'm overly 717 00:39:58,440 --> 00:40:00,440 Speaker 2: a people pleaser, but I think that you and I 718 00:40:00,480 --> 00:40:02,920 Speaker 2: are very similar in that way. You know, we like 719 00:40:03,040 --> 00:40:06,680 Speaker 2: to be liked. And so I really wanted to handle 720 00:40:06,719 --> 00:40:09,200 Speaker 2: this delicately, but I just didn't. I kept rewriting the 721 00:40:09,239 --> 00:40:12,080 Speaker 2: same email. I rewrote it, and rewrote it, and rewrote it.
722 00:40:12,120 --> 00:40:14,920 Speaker 2: And look, a quick tip for all you email 723 00:40:15,320 --> 00:40:17,239 Speaker 2: people out there who want to send an email and 724 00:40:17,280 --> 00:40:20,080 Speaker 2: they're a bit angry or they're a bit emotional: don't 725 00:40:20,120 --> 00:40:24,560 Speaker 2: put the person's name in the To field yet. Write the 726 00:40:24,600 --> 00:40:27,160 Speaker 2: email first and then put the name in. That way, 727 00:40:27,200 --> 00:40:30,360 Speaker 2: you can't accidentally send it, or write it in something else first. 728 00:40:30,440 --> 00:40:33,600 Speaker 1: Yeah, did you ever fuck it up? 729 00:40:33,880 --> 00:40:35,719 Speaker 2: No, no, no, I didn't, I didn't. I make sure, 730 00:40:35,880 --> 00:40:38,640 Speaker 2: so even if I'm replying, when I hit the reply, 731 00:40:38,840 --> 00:40:42,200 Speaker 2: I take their email address out so I don't accidentally 732 00:40:42,400 --> 00:40:42,960 Speaker 2: send it. 733 00:40:43,640 --> 00:40:46,759 Speaker 1: But that is a really interesting thing, because you think 734 00:40:46,800 --> 00:40:49,560 Speaker 1: about, and this is heading a bit more into psych 735 00:40:49,640 --> 00:40:52,280 Speaker 1: and sociology, but still related to what we're talking about, 736 00:40:52,880 --> 00:40:56,279 Speaker 1: you know those, those times when you just know you're 737 00:40:56,280 --> 00:40:59,839 Speaker 1: having a pointless conversation, or you feel like, with some 738 00:41:00,080 --> 00:41:02,920 Speaker 1: people, and I'm not saying I'm never the problem, I 739 00:41:03,560 --> 00:41:07,120 Speaker 1: definitely am at times, but you're like, fuck this person, 740 00:41:07,239 --> 00:41:12,080 Speaker 1: I have had this conversation seventy three times, and you 741 00:41:12,120 --> 00:41:15,359 Speaker 1: know, it's like, going, or when you're in the middle 742 00:41:15,400 --> 00:41:17,040 Speaker 1: of it, you're like, God, I just want this to 743 00:41:17,160 --> 00:41:20,000 Speaker 1: end, because you know it's not fixing or changing or 744 00:41:20,040 --> 00:41:22,319 Speaker 1: improving anything. What are you laughing at? 745 00:41:22,480 --> 00:41:25,360 Speaker 2: I'm laughing because I'm thinking about our podcasts. 746 00:41:27,160 --> 00:41:30,520 Speaker 1: This is, this is fucking groundbreaking. People love this shit, 747 00:41:31,280 --> 00:41:36,040 Speaker 1: both of them. Both of our listeners love this. Shout 748 00:41:36,080 --> 00:41:40,799 Speaker 1: out to Arnie Margets. All right, let's, do you pick 749 00:41:40,840 --> 00:41:45,160 Speaker 1: another couple, pick which, my point is, well, I just, 750 00:41:45,200 --> 00:41:47,839 Speaker 1: fucking, said to you, fucking stop picking on me. You're 751 00:41:47,880 --> 00:41:51,480 Speaker 1: a bully. You're a bully from Bland. My point is 752 00:41:51,920 --> 00:41:55,279 Speaker 1: that sometimes, like, there are these pointless, Groundhog Day 753 00:41:55,360 --> 00:41:59,520 Speaker 1: conversations that we keep kind of involving ourselves in, rather 754 00:41:59,560 --> 00:42:02,680 Speaker 1: than going, the last time I spoke to this person 755 00:42:02,719 --> 00:42:06,840 Speaker 1: about this thing, it didn't work, that was not fruitful 756 00:42:06,920 --> 00:42:10,640 Speaker 1: or productive, so we either don't talk about it or 757 00:42:10,760 --> 00:42:13,720 Speaker 1: talk about it in a different way, you know. Because 758 00:42:13,760 --> 00:42:17,040 Speaker 1: it's like, I have wasted so much.
I'll give you 759 00:42:17,120 --> 00:42:22,560 Speaker 1: an example. So, and I've had this at least thirty, 760 00:42:22,600 --> 00:42:24,040 Speaker 1: I was going to say a hundred, not one hundred, 761 00:42:24,040 --> 00:42:26,959 Speaker 1: but at least thirty times with friends of mine, where 762 00:42:26,960 --> 00:42:29,600 Speaker 1: they go, oh, I just want to pick your brain 763 00:42:29,640 --> 00:42:31,960 Speaker 1: for a minute. I just want to talk to you 764 00:42:31,960 --> 00:42:38,319 Speaker 1: about, you know, food. And I go, nah, nah. I go, 765 00:42:38,440 --> 00:42:40,680 Speaker 1: you and I have had this chat thirty five times 766 00:42:40,719 --> 00:42:43,279 Speaker 1: and you never do anything with the information that I 767 00:42:43,360 --> 00:42:46,760 Speaker 1: give you. And twenty years down the track, you're still 768 00:42:46,800 --> 00:42:48,920 Speaker 1: looking for a quick fix, a magic pill or a 769 00:42:48,920 --> 00:42:51,959 Speaker 1: fucking shortcut. And you want to know if I think 770 00:42:52,000 --> 00:42:55,040 Speaker 1: Ozempic is a great idea for you? Not having that chat. 771 00:42:55,440 --> 00:42:58,000 Speaker 1: How about this: eat less shit, move more, and just 772 00:42:58,040 --> 00:43:01,040 Speaker 1: fucking stop asking me. Oh, you know, but that's not 773 00:43:01,080 --> 00:43:04,160 Speaker 1: what they want to hear. So some of my friends, 774 00:43:04,160 --> 00:43:07,960 Speaker 1: I go, do not talk to me about your body, fitness, health, nutrition, 775 00:43:08,320 --> 00:43:13,200 Speaker 1: exercise program until you're actually in the middle of doing it. 776 00:43:13,520 --> 00:43:16,680 Speaker 2: That's actually a healthy thing for conversations and friends. I've 777 00:43:16,680 --> 00:43:20,200 Speaker 2: got one of my dearest friends, she is quite religious, 778 00:43:20,719 --> 00:43:23,120 Speaker 2: very, very religious, and whenever we get into, or over 779 00:43:23,160 --> 00:43:28,360 Speaker 2: the years previously we'd get into, quite intense debates. And 780 00:43:28,440 --> 00:43:30,920 Speaker 2: at one point we were talking about Bible references and 781 00:43:30,960 --> 00:43:35,040 Speaker 2: she called me ignorant, and I took quite a bit of offense to that. 782 00:43:35,160 --> 00:43:38,480 Speaker 2: So we decided that we just will never discuss religion, 783 00:43:38,640 --> 00:43:41,520 Speaker 2: ever, because she has very strong opinions and I have 784 00:43:41,719 --> 00:43:45,920 Speaker 2: very strong opposing opinions, and it was actually damaging to 785 00:43:46,000 --> 00:43:49,040 Speaker 2: our relationship, because we're both headstrong, 786 00:43:49,080 --> 00:43:51,960 Speaker 2: we both like to argue and debate points. But we 787 00:43:52,080 --> 00:43:54,120 Speaker 2: just found that there's so many other things that we 788 00:43:54,160 --> 00:43:56,200 Speaker 2: can talk about, and we care about and love each 789 00:43:56,200 --> 00:43:59,200 Speaker 2: other a lot, that we decided that that's just off 790 00:43:59,200 --> 00:44:02,240 Speaker 2: the table, never ever ever talk about religion. 791 00:44:02,400 --> 00:44:05,040 Speaker 1: But see, yeah, that's not a bad idea, but I 792 00:44:05,080 --> 00:44:09,560 Speaker 1: think the problem with that is you truly think that 793 00:44:09,600 --> 00:44:13,319 Speaker 1: you're right and there's no chance that you're wrong, as 794 00:44:13,360 --> 00:44:17,640 Speaker 1: does she. Yeah, exactly. That's a level of arrogance.
And 795 00:44:18,280 --> 00:44:21,880 Speaker 1: it's like, I've realized over the years, and I'm not 796 00:44:22,000 --> 00:44:25,279 Speaker 1: fucking Mother Teresa or whatever, but, yeah, like, all the 797 00:44:25,320 --> 00:44:29,799 Speaker 1: stuff, you remember how emphatic I was, and how, like, now, what 798 00:44:29,880 --> 00:44:32,319 Speaker 1: do I think? I think, I actually don't know. I 799 00:44:32,360 --> 00:44:34,919 Speaker 1: have certain beliefs, but I don't know. I could be wrong. 800 00:44:35,000 --> 00:44:36,719 Speaker 1: You don't know. You don't know if the Bible is 801 00:44:36,760 --> 00:44:39,319 Speaker 1: real or bullshit, or, you know, you don't know. You 802 00:44:39,400 --> 00:44:41,880 Speaker 1: think you know. I don't know. None of us really know, 803 00:44:42,160 --> 00:44:45,719 Speaker 1: like, with many, many things. And I think the irony 804 00:44:45,960 --> 00:44:50,200 Speaker 1: is that we're talking about, you know, spiritual, religious, god 805 00:44:50,440 --> 00:44:58,920 Speaker 1: things, or non things. And I find that, like, twice 806 00:44:58,960 --> 00:45:01,239 Speaker 1: in the last two days I've read, somebody put a 807 00:45:01,280 --> 00:45:04,720 Speaker 1: thing on Facebook, and it was this big proclamation about 808 00:45:04,719 --> 00:45:07,680 Speaker 1: shit, and at the end they said, if you disagree 809 00:45:07,760 --> 00:45:11,640 Speaker 1: with me, unfriend me. And I'm like, that is so hilarious. 810 00:45:11,680 --> 00:45:15,359 Speaker 1: You're literally saying, I only want to surround myself with 811 00:45:15,400 --> 00:45:18,759 Speaker 1: people who agree with me. I want your endorsement. If 812 00:45:18,760 --> 00:45:22,360 Speaker 1: you don't endorse or agree, fuck off. I want to 813 00:45:22,400 --> 00:45:27,480 Speaker 1: live in an echo chamber. Whereas the healthiest way is, no, 814 00:45:27,600 --> 00:45:30,080 Speaker 1: because if you and I think something different on different things, 815 00:45:30,120 --> 00:45:32,799 Speaker 1: that's fine. Doesn't affect our friendship or how much I 816 00:45:32,840 --> 00:45:35,040 Speaker 1: love you or want you to be fucking great and healthy. 817 00:45:35,120 --> 00:45:38,160 Speaker 1: Right? If you think A and I think B, nothing 818 00:45:38,200 --> 00:45:42,800 Speaker 1: in me wants to convince you of B. I'm like, cool, 819 00:45:43,560 --> 00:45:45,719 Speaker 1: you know. And there are, even, and I think the 820 00:45:45,800 --> 00:45:49,040 Speaker 1: thing is, when you so emphatically believe that you're right 821 00:45:49,120 --> 00:45:53,520 Speaker 1: about something which you actually can't prove, because there's 822 00:45:53,600 --> 00:45:57,800 Speaker 1: no data on this. And even, like, what people call data, 823 00:45:57,920 --> 00:46:00,400 Speaker 1: you go back and you go, well, you know, there, 824 00:46:00,400 --> 00:46:03,400 Speaker 1: even in science, there's so much science that isn't 825 00:46:03,400 --> 00:46:08,000 Speaker 1: really science. It's, it's somebody's interpretation of 826 00:46:08,040 --> 00:46:11,400 Speaker 1: a thing. So then you go, well, who designed the study? 827 00:46:11,680 --> 00:46:15,880 Speaker 1: A person. Who interpreted the study? A person. Like, and 828 00:46:15,960 --> 00:46:19,239 Speaker 1: when you think about, I'm digressing here, but, you know, 829 00:46:19,320 --> 00:46:23,560 Speaker 1: because people are so self righteous and adamant about, this 830 00:46:23,719 --> 00:46:25,480 Speaker 1: is science.
And then you go, so, do you know 831 00:46:25,520 --> 00:46:29,759 Speaker 1: how much of science is funded by private organizations who 832 00:46:29,760 --> 00:46:32,680 Speaker 1: have a vested interest in the results of the science? 833 00:46:33,239 --> 00:46:37,200 Speaker 1: It's about eighty percent. It's seventy five percent, I think, 834 00:46:37,280 --> 00:46:40,520 Speaker 1: you know, and the rest is generally government, or often government. 835 00:46:40,560 --> 00:46:46,280 Speaker 1: And so when somebody has, essentially, responsibility to the person 836 00:46:46,480 --> 00:46:51,200 Speaker 1: funding their research, or they're at least somewhat encouraged to 837 00:46:51,239 --> 00:46:54,719 Speaker 1: produce certain data which is of value to them, then it's 838 00:46:54,760 --> 00:46:58,120 Speaker 1: not honest. It's not real, it's not independent. You know, 839 00:46:58,200 --> 00:47:02,759 Speaker 1: most real science is finding out that what you hypothesized is 840 00:47:02,840 --> 00:47:06,080 Speaker 1: fucking wrong, and going, hey everyone, I got it wrong. 841 00:47:06,440 --> 00:47:10,880 Speaker 1: Here's the information. That's science, you know. But it's like 842 00:47:11,040 --> 00:47:13,520 Speaker 1: we are so emotional that we don't want to be, 843 00:47:13,880 --> 00:47:16,520 Speaker 1: we don't want to be wrong. And it's like, well, 844 00:47:16,600 --> 00:47:22,839 Speaker 1: that's arrogance. Like, you're going to get shit wrong constantly. Yeah, 845 00:47:22,880 --> 00:47:25,120 Speaker 1: that's like, even when we go, we're talking about stuff, 846 00:47:25,120 --> 00:47:27,359 Speaker 1: I go, hey, everyone, do not take any of this 847 00:47:27,400 --> 00:47:29,640 Speaker 1: as advice, because this is the two fucking dickheads 848 00:47:29,640 --> 00:47:30,720 Speaker 1: on a Friday chatting. 849 00:47:32,440 --> 00:47:36,200 Speaker 2: Yeah, I'm very passionate about veganism, but I will go 850 00:47:36,280 --> 00:47:38,080 Speaker 2: to dinner with people and they can have a steak. 851 00:47:38,120 --> 00:47:40,800 Speaker 2: I'm not going to get offended by that. I'm not 852 00:47:40,840 --> 00:47:44,440 Speaker 2: going to change someone's opinions overnight. And, you know, ultimately, 853 00:47:44,920 --> 00:47:48,120 Speaker 2: whether it's faith, belief or whatever the belief system 854 00:47:48,280 --> 00:47:52,040 Speaker 2: you have, you can't really control that. You know, if you, 855 00:47:52,600 --> 00:47:55,600 Speaker 2: in your heart, believe that there is a God and 856 00:47:55,680 --> 00:47:57,320 Speaker 2: that when you die you're going to go to heaven, 857 00:47:57,560 --> 00:48:00,520 Speaker 2: that's lovely, that you feel good and comfortable in that sense. 858 00:48:01,000 --> 00:48:02,640 Speaker 1: Patronizing: that's lovely. 859 00:48:03,080 --> 00:48:06,240 Speaker 2: Well, it is great, because I've seen, look, a really 860 00:48:06,320 --> 00:48:09,600 Speaker 2: good friend of mine, her husband had motor neurone disease, 861 00:48:09,840 --> 00:48:13,720 Speaker 2: and, you know, he chose when he wanted to pass, 862 00:48:13,880 --> 00:48:16,280 Speaker 2: and it was, and I spent, you know, a whole 863 00:48:16,320 --> 00:48:19,399 Speaker 2: afternoon with him three days before he died, reading from 864 00:48:19,480 --> 00:48:22,520 Speaker 2: his favorite book, and it was the most emotional thing of 865 00:48:22,520 --> 00:48:24,920 Speaker 2: my life.
And what I drew a lot of comfort 866 00:48:24,960 --> 00:48:27,799 Speaker 2: from was, I knew how spiritual and how religious he was, 867 00:48:27,840 --> 00:48:30,440 Speaker 2: and that he really didn't fear death, and he got a 868 00:48:30,440 --> 00:48:32,799 Speaker 2: lot from that. That was great for me, to see 869 00:48:32,840 --> 00:48:35,360 Speaker 2: that in him, knowing that he chose the time he 870 00:48:35,440 --> 00:48:38,200 Speaker 2: was going to die. That was phenomenal, to be a 871 00:48:38,239 --> 00:48:40,840 Speaker 2: part of that. And I feel really, really lucky to 872 00:48:40,960 --> 00:48:43,799 Speaker 2: have spent that time with him, you know, holding his hand, 873 00:48:43,880 --> 00:48:48,160 Speaker 2: reading a book, and that was an experience that is profound. 874 00:48:48,840 --> 00:48:51,880 Speaker 2: And so, ultimately, if you have a core belief, no 875 00:48:51,880 --> 00:48:54,239 Speaker 2: one's going to change that. And all you can do 876 00:48:54,360 --> 00:48:56,400 Speaker 2: is basically say, well, this is me, this is you, 877 00:48:56,520 --> 00:48:58,239 Speaker 2: this is why I believe what I believe. And you 878 00:48:58,280 --> 00:48:59,960 Speaker 2: can take that on board or not take it on board. 879 00:49:02,520 --> 00:49:05,360 Speaker 1: Look at us getting deep and philosophical on the home straight. 880 00:49:06,560 --> 00:49:08,239 Speaker 1: Have you got a two minute one? Because I've got 881 00:49:08,280 --> 00:49:09,360 Speaker 1: to be out in two minutes. 882 00:49:10,440 --> 00:49:12,759 Speaker 2: There is one that I was really surprised about, and 883 00:49:12,760 --> 00:49:17,120 Speaker 2: I was surprised and not so surprised, that women are 884 00:49:17,160 --> 00:49:22,759 Speaker 2: often an afterthought in terms of product design and the implementation 885 00:49:22,880 --> 00:49:26,520 Speaker 2: of public spaces. Okay, so, in the nineteen seventies, this 886 00:49:26,560 --> 00:49:29,640 Speaker 2: is a great article I was reading, there was a 887 00:49:29,719 --> 00:49:33,760 Speaker 2: thing called Reference Man. Effectively, Reference Man in the seventies 888 00:49:33,840 --> 00:49:37,720 Speaker 2: was a white male, twenty five to thirty five, who weighed 889 00:49:37,760 --> 00:49:40,880 Speaker 2: about seventy kilos and was about one hundred and seventy 890 00:49:40,960 --> 00:49:44,719 Speaker 2: centimeters tall. Which, ironically, I'm just under seventy kilos and 891 00:49:44,719 --> 00:49:47,080 Speaker 2: I'm one hundred and sixty nine centimeters tall. 892 00:49:47,880 --> 00:49:50,480 Speaker 1: Average person for the seventies, you're the prototype. 893 00:49:50,719 --> 00:49:53,080 Speaker 2: Well, yes and no, because I did a bit of 894 00:49:53,080 --> 00:49:56,520 Speaker 2: googling on this one, and evidently the average height 895 00:49:56,640 --> 00:50:02,040 Speaker 2: has increased in Australia by about six or seven centimeters, 896 00:50:02,320 --> 00:50:05,279 Speaker 2: so that's now wrong anyway. But what it related to 897 00:50:05,960 --> 00:50:09,440 Speaker 2: is the fact that this reference, now, there's an interesting 898 00:50:09,680 --> 00:50:14,240 Speaker 2: professor at Monash Uni called Nicole Kalms, and she was talking, 899 00:50:14,480 --> 00:50:17,440 Speaker 2: she runs a lab called the Monash XYX Lab. I 900 00:50:17,480 --> 00:50:19,200 Speaker 2: don't know if you've heard of that one, Craigo. 901 00:50:19,239 --> 00:50:19,839 Speaker 1: I have not.
902 00:50:20,320 --> 00:50:22,400 Speaker 2: But anyway, so what she's saying is, one of the 903 00:50:22,400 --> 00:50:26,279 Speaker 2: problems is that everything's gender specific. So if 904 00:50:26,320 --> 00:50:28,840 Speaker 2: you think about standing on a tram and reaching up 905 00:50:28,880 --> 00:50:31,799 Speaker 2: and grabbing the handle, well, someone has decided that the 906 00:50:31,840 --> 00:50:35,360 Speaker 2: handle should be at a certain height. And so everything, 907 00:50:35,440 --> 00:50:37,759 Speaker 2: a lot of the research, a lot 908 00:50:37,800 --> 00:50:42,640 Speaker 2: of the R&D in the development of products and spaces, 909 00:50:42,719 --> 00:50:45,359 Speaker 2: whether it's sitting on a bench or something like that, 910 00:50:45,640 --> 00:50:50,080 Speaker 2: is still based on Reference Man, the white male from the seventies. Wow, 911 00:50:51,040 --> 00:50:52,040 Speaker 2: isn't that fascinating? 912 00:50:52,520 --> 00:50:55,399 Speaker 1: That is fascinating. It doesn't surprise me. But I will 913 00:50:55,440 --> 00:50:59,600 Speaker 1: say, I totally agree with, well, it's nothing to agree 914 00:50:59,640 --> 00:51:02,239 Speaker 1: or disagree with, I find that interesting and believe it. But 915 00:51:02,280 --> 00:51:06,240 Speaker 1: I think there are a huge amount of products now 916 00:51:06,840 --> 00:51:10,880 Speaker 1: that women are at the forefront of, because women spend 917 00:51:10,960 --> 00:51:13,960 Speaker 1: so much, like, and I mean that in a nice way, 918 00:51:13,960 --> 00:51:17,400 Speaker 1: it's like, especially around, I guess, like, fashion and all 919 00:51:17,440 --> 00:51:20,000 Speaker 1: of those things. I think, generally speaking, women are more 920 00:51:20,480 --> 00:51:24,480 Speaker 1: interested in, you know, well, maybe I'm not, well, I'm 921 00:51:24,480 --> 00:51:27,320 Speaker 1: definitely not typical, but I think it goes both ways. 922 00:51:27,840 --> 00:51:30,600 Speaker 1: It's like, there are probably certain, you know, whether, and 923 00:51:30,640 --> 00:51:34,359 Speaker 1: some women absolutely love cars, some don't, they're just like, oh, 924 00:51:34,440 --> 00:51:36,600 Speaker 1: it gets me from A to B, and it's, you know, 925 00:51:36,680 --> 00:51:39,320 Speaker 1: and some are all over it. And then some blokes 926 00:51:39,360 --> 00:51:41,799 Speaker 1: are not, you know, and some blokes are, you know. But I 927 00:51:41,840 --> 00:51:44,080 Speaker 1: think there's, I think you'd find with a lot of 928 00:51:44,120 --> 00:51:47,799 Speaker 1: products that there's a real leaning one way or the other, 929 00:51:48,080 --> 00:51:53,920 Speaker 1: male or female, around who's more, you know, predisposed to 930 00:51:54,000 --> 00:51:56,040 Speaker 1: kind of be interested in that. But I do find 931 00:51:56,080 --> 00:52:00,879 Speaker 1: that interesting. Look, I think back then, when did you say, 932 00:52:00,880 --> 00:52:04,400 Speaker 1: the fifties or sixties and seventies? Yeah, well, I 933 00:52:04,440 --> 00:52:07,680 Speaker 1: mean, that's fifty years ago now, and it's like, oh, everything, 934 00:52:07,719 --> 00:52:11,879 Speaker 1: I agree, the world was so fucking geared towards men 935 00:52:12,000 --> 00:52:14,360 Speaker 1: and what men want and men need, and, yeah, it 936 00:52:14,800 --> 00:52:19,400 Speaker 1: had to change, and thankfully it's changing.
But you and 937 00:52:19,440 --> 00:52:22,799 Speaker 1: Tiff both kind of straddle that man woman kind of, 938 00:52:23,239 --> 00:52:25,799 Speaker 1: you know, like, you like to blur the lines a little bit. 939 00:52:26,719 --> 00:52:28,960 Speaker 2: Well, the thing that she did bring up, though, is 940 00:52:28,960 --> 00:52:32,600 Speaker 2: that women quite often aren't in the data. So, you know, 941 00:52:32,719 --> 00:52:35,200 Speaker 2: when you think about it, it's half the population, but 942 00:52:35,440 --> 00:52:38,880 Speaker 2: historically they've been excluded from that data set, because it 943 00:52:38,960 --> 00:52:41,000 Speaker 2: is based on, it's so male centric. 944 00:52:41,719 --> 00:52:44,600 Speaker 1: Yeah, it's ridiculous. Patrick, where can people connect with you, 945 00:52:44,680 --> 00:52:48,399 Speaker 1: find you, do yoga with you, meet Fritz, and come 946 00:52:48,440 --> 00:52:51,840 Speaker 1: and stay a thousand dollars a night in Milan for the weekend? 947 00:52:52,120 --> 00:52:56,080 Speaker 2: Oh, that'd be good, wouldn't it? Website's noow dot com 948 00:52:56,200 --> 00:52:58,320 Speaker 2: dot au if you want to chat about, like, I 949 00:52:58,320 --> 00:53:01,719 Speaker 2: don't know, websites, figure. But if you want to check 950 00:53:01,760 --> 00:53:03,480 Speaker 2: out and do some tai chi with Fritz and I, 951 00:53:03,760 --> 00:53:06,960 Speaker 2: tai chi at home dot com dot au. But send 952 00:53:07,040 --> 00:53:09,040 Speaker 2: us a message, or if you want to chat about 953 00:53:09,080 --> 00:53:11,319 Speaker 2: something and not talk about male ejaculation. 954 00:53:12,120 --> 00:53:16,080 Speaker 1: Pat, see, you brought it right back up. It was 955 00:53:16,120 --> 00:53:20,120 Speaker 1: the furthest thing from anyone's mind. Patrick sent me 956 00:53:20,160 --> 00:53:22,760 Speaker 1: a photo just before the show, everyone, of Fritz 957 00:53:22,960 --> 00:53:27,680 Speaker 1: wearing headphones and a little talking mouthpiece, like an 958 00:53:27,719 --> 00:53:32,000 Speaker 1: Amway lady on the ads, and he said, we're ready. 959 00:53:33,000 --> 00:53:37,160 Speaker 1: Sadly, Fritz hasn't made an appearance. Thank you, mate. I 960 00:53:37,200 --> 00:53:41,160 Speaker 1: will say goodbye off air, but for the minute, have a 961 00:53:41,200 --> 00:53:42,840 Speaker 1: good Friday chatting.