1 00:00:00,640 --> 00:00:03,279 Speaker 1: Hi team, it's The You Project. Tiffany Cook, Patrick, 2 00:00:03,600 --> 00:00:06,880 Speaker 1: James Bonello, Craig Anthony Harper, just reporting in on a Friday, 3 00:00:07,120 --> 00:00:10,640 Speaker 1: well, it's Friday as we record. Whenever you hear this 4 00:00:10,680 --> 00:00:12,520 Speaker 1: is whenever you hear it. Let's start with the boy 5 00:00:12,560 --> 00:00:15,880 Speaker 1: in the room. Hi Patrick. I said it was going to you, 6 00:00:15,960 --> 00:00:16,520 Speaker 1: Tiff, didn't I? 7 00:00:16,840 --> 00:00:17,079 Speaker 2: Yeah? 8 00:00:17,120 --> 00:00:19,000 Speaker 1: I did. I was ready. Hi Patrick. 9 00:00:19,720 --> 00:00:21,960 Speaker 2: Last episode you asked for the girl in the room, 10 00:00:21,960 --> 00:00:23,680 Speaker 2: and then came to me, and I just thought, you're 11 00:00:23,680 --> 00:00:25,400 Speaker 2: going to be smart again, or try to be. 12 00:00:25,440 --> 00:00:27,880 Speaker 1: I was going to be more respectful this time. How's the 13 00:00:28,080 --> 00:00:30,800 Speaker 1: giant space station where you're currently residing? 14 00:00:32,360 --> 00:00:34,000 Speaker 2: I kind of feel like we need to do a 15 00:00:34,040 --> 00:00:36,400 Speaker 2: screenshot or something so people get a sense of my 16 00:00:36,520 --> 00:00:40,120 Speaker 2: backdrop being a giant space station looking out on a 17 00:00:40,159 --> 00:00:41,360 Speaker 2: planet just hovering. 18 00:00:41,960 --> 00:00:44,520 Speaker 1: Do you know what I mean? I know this sounds silly, 19 00:00:44,560 --> 00:00:47,960 Speaker 1: but, like, if my mum looked at that and 20 00:00:48,040 --> 00:00:52,440 Speaker 1: I told my mum Patrick's in space, she would believe it. 21 00:00:52,440 --> 00:00:55,640 Speaker 1: It looks literally like you're in space, like you've got 22 00:00:55,680 --> 00:00:59,440 Speaker 1: this big round window just looking out at, I don't 23 00:00:59,480 --> 00:01:02,080 Speaker 1: know, whatever it is. Is that the Earth below you 24 00:01:02,120 --> 00:01:03,840 Speaker 1: and the moon in the distance? 25 00:01:04,280 --> 00:01:05,720 Speaker 2: Yeah, yep, yep. 26 00:01:05,840 --> 00:01:06,040 Speaker 1: Yeah. 27 00:01:06,319 --> 00:01:08,560 Speaker 2: The other thing too is, you know, and this, 28 00:01:08,880 --> 00:01:11,920 Speaker 2: this sounds like a real foible of mine, 29 00:01:12,200 --> 00:01:14,800 Speaker 2: but I've started to get annoyed at having Zoom meetings 30 00:01:14,840 --> 00:01:20,119 Speaker 2: with people turning on fake backgrounds. 31 00:01:21,120 --> 00:01:23,440 Speaker 1: It is! What do you mean? That's a photo of 32 00:01:23,480 --> 00:01:26,000 Speaker 1: the studio. Well, it's not real. 33 00:01:26,080 --> 00:01:27,280 Speaker 2: It's a fake background. 34 00:01:27,440 --> 00:01:29,920 Speaker 1: How is that a fake background? It's a photo 35 00:01:29,959 --> 00:01:30,960 Speaker 1: of my real studio. 36 00:01:31,080 --> 00:01:34,640 Speaker 2: Exactly, it's a photo. It's fake. And his hair kind 37 00:01:34,640 --> 00:01:37,480 Speaker 2: of starts to disappear. Well, actually that's just age. No, 38 00:01:37,560 --> 00:01:41,399 Speaker 2: but you know, look at his shoulders disappearing and... 39 00:01:41,800 --> 00:01:44,000 Speaker 1: Straight out of the gate, everyone, straight out of the gate, 40 00:01:45,080 --> 00:01:47,920 Speaker 1: they attack us. This is what they do.
Okay, 41 00:01:47,960 --> 00:01:52,280 Speaker 1: when they're insecure, they just go, let's attack him. Keep going, 42 00:01:52,560 --> 00:01:55,560 Speaker 1: keep going, do your best, not me, keep going, do 43 00:01:55,680 --> 00:01:58,480 Speaker 1: your best. If you can't say anything nice about people, 44 00:01:59,360 --> 00:02:02,040 Speaker 1: don't say anything. That's what Mary would tell you. 45 00:02:05,080 --> 00:02:10,200 Speaker 1: That's him. He just lets that out. Okay. So I'll talk 46 00:02:10,240 --> 00:02:12,200 Speaker 1: to you, Tiff. Tiff, how's your day going? 47 00:02:12,440 --> 00:02:15,520 Speaker 3: It's really good. Thank you, thanks for asking. And I 48 00:02:15,520 --> 00:02:17,440 Speaker 3: think your shoulders look great with the background. 49 00:02:18,120 --> 00:02:21,839 Speaker 1: Yeah. Thanks. I don't know what's wrong with my background. 50 00:02:22,800 --> 00:02:24,200 Speaker 2: When you put your arm up, and... 51 00:02:24,200 --> 00:02:26,480 Speaker 1: Now he's interrupting because he's trying to get back in the 52 00:02:26,520 --> 00:02:27,080 Speaker 1: good books. 53 00:02:28,440 --> 00:02:30,520 Speaker 2: I'm not getting back in any good books. Hey, now, what 54 00:02:30,560 --> 00:02:32,000 Speaker 2: I mean is, you know, when you're in a 55 00:02:32,040 --> 00:02:34,639 Speaker 2: Zoom meeting or a Teams meeting and the people have 56 00:02:35,080 --> 00:02:37,400 Speaker 2: fake backgrounds and they just start to shimmy in and 57 00:02:37,400 --> 00:02:40,639 Speaker 2: out and disappear and rematerialize and dematerialize. It's 58 00:02:40,680 --> 00:02:43,280 Speaker 2: like watching an episode of Star Trek and they're all 59 00:02:43,320 --> 00:02:46,560 Speaker 2: about to go on an away mission. That's, that's started to 60 00:02:46,680 --> 00:02:50,320 Speaker 2: annoy me. Just have your background. And so I decided 61 00:02:50,400 --> 00:02:54,160 Speaker 2: to buy some backdrops. And obviously, because I talk 62 00:02:54,840 --> 00:02:57,919 Speaker 2: tech stuff, I thought a space station scene would be good. 63 00:02:57,960 --> 00:03:00,320 Speaker 2: And that's why it looks more, well, not realistic, you were 64 00:03:00,360 --> 00:03:03,080 Speaker 2: talking about fooling Mary, but the reality of it is, 65 00:03:03,520 --> 00:03:05,880 Speaker 2: I'm not shimmying in and out, because it's not a 66 00:03:06,040 --> 00:03:10,400 Speaker 2: superimposed background, it's actually a material picture that's hanging 67 00:03:10,440 --> 00:03:12,120 Speaker 2: on the wall behind me. And that's why it looks 68 00:03:12,120 --> 00:03:12,760 Speaker 2: more authentic. 69 00:03:13,560 --> 00:03:15,959 Speaker 1: All right, you're better than us. Well done. Next. Tiff... 70 00:03:17,600 --> 00:03:20,440 Speaker 1: all right, you win. Yeah, great. I'll tell you what, 71 00:03:20,520 --> 00:03:23,680 Speaker 1: when your biggest problem is fucking worrying about people's 72 00:03:23,680 --> 00:03:28,519 Speaker 1: backgrounds on Zoom, your life's good, champ. Yeah, true, you 73 00:03:28,639 --> 00:03:32,600 Speaker 1: probably need to find yourself a real problem. Uh, Tiff, 74 00:03:32,960 --> 00:03:35,160 Speaker 1: how are you? What is going on on planet Tiffany? 75 00:03:35,640 --> 00:03:36,880 Speaker 2: Oh, great stuff, Harps. 76 00:03:36,960 --> 00:03:39,040 Speaker 3: I got up at the crack of dawn.
I had 77 00:03:39,040 --> 00:03:42,480 Speaker 3: a seven am appointment with my body magician, as I 78 00:03:42,560 --> 00:03:45,720 Speaker 3: call him, and I got the tick of approval. Things 79 00:03:45,760 --> 00:03:48,800 Speaker 3: are all tracking well. I'd had that tennis or golfer's 80 00:03:48,840 --> 00:03:51,760 Speaker 3: elbow rubbish. I've already rehabbed it myself. He gave me 81 00:03:51,800 --> 00:03:53,720 Speaker 3: a little pat on the back and sent me on 82 00:03:53,760 --> 00:03:54,120 Speaker 3: my way. 83 00:03:54,760 --> 00:03:55,280 Speaker 2: How was that? 84 00:03:56,680 --> 00:03:58,520 Speaker 1: A couple of things I want to chat about, and 85 00:03:58,680 --> 00:04:01,400 Speaker 1: one relates to, might relate to, Patrick, because you 86 00:04:01,480 --> 00:04:04,560 Speaker 1: sent me a gift voucher that we're going to talk about. 87 00:04:04,800 --> 00:04:08,720 Speaker 1: When you say your, your magician, what is it? A 88 00:04:08,760 --> 00:04:11,440 Speaker 1: he or she? Doesn't matter, but we'll say he. Okay. 89 00:04:11,560 --> 00:04:15,920 Speaker 1: Is he a chiro, a physio, an osteo, or a myo? 90 00:04:16,200 --> 00:04:17,760 Speaker 1: Is he any of the things ending in O? 91 00:04:17,839 --> 00:04:23,560 Speaker 3: Oh, yeah, he's a sports chiropractor, but he works in 92 00:04:23,600 --> 00:04:28,520 Speaker 3: a thing called neuro something rehabilitation. He does a lot 93 00:04:28,520 --> 00:04:31,480 Speaker 3: of it, does a lot of work with the brain 94 00:04:33,120 --> 00:04:36,000 Speaker 3: and how the brain communicates with the body. He's just... 95 00:04:36,400 --> 00:04:39,040 Speaker 3: I just call him a magician because, I don't know, 96 00:04:39,720 --> 00:04:42,279 Speaker 3: he never does chiropractic adjustments ever. 97 00:04:43,440 --> 00:04:45,920 Speaker 1: Does he need any more work? Because I'm feeling if 98 00:04:45,960 --> 00:04:48,040 Speaker 1: you share his name, he's going to be inundated. So 99 00:04:48,120 --> 00:04:49,279 Speaker 1: I don't know whether or not we should. 100 00:04:49,440 --> 00:04:51,480 Speaker 3: He already is inundated. He's hard to get into. 101 00:04:51,520 --> 00:04:53,800 Speaker 3: All my clients get sent to him at some point 102 00:04:53,880 --> 00:04:56,080 Speaker 3: because he is the superior being. 103 00:04:56,760 --> 00:05:00,000 Speaker 1: Well, if he's treating you at seven am, he's busy. 104 00:05:00,120 --> 00:05:04,760 Speaker 1: He's busy. Patrick, do you have a mechanic of sorts 105 00:05:04,800 --> 00:05:07,520 Speaker 1: who works on you? Like, do you like 106 00:05:07,680 --> 00:05:10,839 Speaker 1: massage or physio or chiro? Do you like any of 107 00:05:10,880 --> 00:05:11,479 Speaker 1: that stuff? 108 00:05:11,960 --> 00:05:14,000 Speaker 2: Not really. I tend to only do it if I'm 109 00:05:14,200 --> 00:05:18,599 Speaker 2: really falling apart, which luckily isn't much. So, my 110 00:05:18,720 --> 00:05:21,200 Speaker 2: tai chi. I do tai chi so often now. I 111 00:05:21,279 --> 00:05:24,760 Speaker 2: teach it, I participate in it, doing lots of core work. 112 00:05:24,760 --> 00:05:27,200 Speaker 2: We were chatting before the show, so I feel like 113 00:05:27,240 --> 00:05:30,279 Speaker 2: I'm in pretty good fettle at the moment. Does anyone 114 00:05:30,320 --> 00:05:32,200 Speaker 2: say that? I'm in good fettle at the moment. 115 00:05:32,760 --> 00:05:36,120 Speaker 4: You and Mary, feeling pretty good.
116 00:05:36,839 --> 00:05:40,280 Speaker 2: I used to get some really chronic pain. When I 117 00:05:40,360 --> 00:05:42,200 Speaker 2: was at the gym with you, I used to get 118 00:05:42,240 --> 00:05:44,240 Speaker 2: bad shoulder pain, but that was because... 119 00:05:44,800 --> 00:05:47,159 Speaker 1: That's a shout out; that's a good ad for my work. 120 00:05:48,680 --> 00:05:51,320 Speaker 2: Was it as a result of training with you? Was 121 00:05:51,360 --> 00:05:53,920 Speaker 2: it as a result of sitting at a bloody desk 122 00:05:54,120 --> 00:05:56,640 Speaker 2: on a stool at our radio station when I was working 123 00:05:56,640 --> 00:06:01,360 Speaker 2: in news at a company, Trip and Fox? When I 124 00:06:01,400 --> 00:06:04,760 Speaker 2: was working for them in the newsroom, we sat on stools, 125 00:06:05,120 --> 00:06:09,160 Speaker 2: unsupported feet, and the keyboard was sitting in front of you, 126 00:06:09,200 --> 00:06:12,200 Speaker 2: and the monitors were sunk into the desk, so you hunched. 127 00:06:12,480 --> 00:06:16,919 Speaker 2: Your natural sitting position was hunched over the monitors. It 128 00:06:16,960 --> 00:06:20,800 Speaker 2: was horrendously bad ergonomics. So I had chronic shoulder pain 129 00:06:20,920 --> 00:06:24,240 Speaker 2: for such a long time and went to lots of 130 00:06:24,279 --> 00:06:25,840 Speaker 2: different people. I don't know if you remember. There was 131 00:06:25,880 --> 00:06:28,840 Speaker 2: an amazing South African lady who was a chiro and 132 00:06:28,880 --> 00:06:31,480 Speaker 2: a physio. I'm trying to think of her name. 133 00:06:31,560 --> 00:06:34,480 Speaker 5: Yeah, she worked at my place. Yeah, absolutely. And what 134 00:06:34,560 --> 00:06:37,520 Speaker 5: I loved about her was that she took a very 135 00:06:38,800 --> 00:06:41,839 Speaker 5: broad approach, you know, a holistic approach, to looking at 136 00:06:41,920 --> 00:06:43,480 Speaker 5: what was causing your ailments. 137 00:06:43,480 --> 00:06:45,680 Speaker 2: Because I've been to lots of people and she fixed it. 138 00:06:46,760 --> 00:06:49,080 Speaker 1: She was good. I should remember her name. She did 139 00:06:49,360 --> 00:06:53,600 Speaker 1: work in my business. But yeah, no, I figure that was 140 00:06:54,440 --> 00:06:55,400 Speaker 1: twenty five years ago. 141 00:06:55,800 --> 00:06:58,240 Speaker 2: It was a while. But you know what 142 00:06:58,360 --> 00:07:00,400 Speaker 2: fixed it for me? And this is going to sound weird. 143 00:07:00,440 --> 00:07:03,760 Speaker 2: She gave me a home traction kit where you put 144 00:07:03,760 --> 00:07:07,279 Speaker 2: this neck brace on. You had a clip that was 145 00:07:07,320 --> 00:07:10,160 Speaker 2: fixed to the top of the door, and you put 146 00:07:10,200 --> 00:07:13,160 Speaker 2: water in a bag, so say three litres of water, 147 00:07:13,240 --> 00:07:15,560 Speaker 2: so three kilos, and it stretched your neck. 148 00:07:17,480 --> 00:07:20,280 Speaker 1: Well, are you sure that was meant to be rehab 149 00:07:20,360 --> 00:07:21,000 Speaker 1: or something else? 150 00:07:22,360 --> 00:07:22,400 Speaker 2: It? 151 00:07:24,280 --> 00:07:27,720 Speaker 1: Yeah, that's... Actually, neck traction is a real thing. 152 00:07:28,360 --> 00:07:31,880 Speaker 1: It's just a, it's a be-careful thing. 153 00:07:32,240 --> 00:07:35,240 Speaker 2: Under the guidance of a professional. Yeah, yeah, yeah, yeah.
154 00:07:35,280 --> 00:07:37,360 Speaker 1: Don't get your kid to yank on your neck while 155 00:07:37,360 --> 00:07:39,120 Speaker 1: you're lying on the kitchen table or something. 156 00:07:39,600 --> 00:07:43,560 Speaker 2: Now, it was under quite strict medical supervision, but it 157 00:07:43,760 --> 00:07:46,960 Speaker 2: fixed everything. Just stretching the old neck was great. 158 00:07:47,640 --> 00:07:50,480 Speaker 1: Now, before we jump into the show proper... oh, go on. 159 00:07:50,840 --> 00:07:52,640 Speaker 2: No, it's just that, you know me, 160 00:07:52,840 --> 00:07:55,040 Speaker 2: I give a very long-winded answer. So no, I 161 00:07:55,080 --> 00:07:58,440 Speaker 2: don't have a regular person who manipulates my joints and 162 00:07:58,560 --> 00:08:01,960 Speaker 2: touches me all over. But I'm taking applications after the show. 163 00:08:03,000 --> 00:08:05,840 Speaker 1: Okay, just shout out to Patrick if you feel like 164 00:08:05,920 --> 00:08:11,040 Speaker 1: you might be that guy. Tiff, you sent me a 165 00:08:11,080 --> 00:08:13,760 Speaker 1: message this week to say that you put on some 166 00:08:13,840 --> 00:08:16,960 Speaker 1: inordinate amount of muscle over the last month. You got 167 00:08:17,160 --> 00:08:21,240 Speaker 1: an InBody or a body composition scan done. Are you feeling 168 00:08:21,320 --> 00:08:22,560 Speaker 1: quite powerful and strong? 169 00:08:23,160 --> 00:08:27,800 Speaker 3: I'm feeling quite good doing strength training and I'm loving it. 170 00:08:28,080 --> 00:08:31,120 Speaker 3: And it's such a nice shift. And you know, as 171 00:08:31,160 --> 00:08:37,679 Speaker 3: a female that's always been very athletic and very borderline obsessive, 172 00:08:37,800 --> 00:08:42,560 Speaker 3: but you know, I loved that high intensity, calorie burning, 173 00:08:42,760 --> 00:08:46,920 Speaker 3: sweat-fest type of training. It's been a real nice switch. 174 00:08:47,160 --> 00:08:50,800 Speaker 3: And what comes with that is kind of wanting to 175 00:08:50,800 --> 00:08:52,600 Speaker 3: see the scales go up, not down. 176 00:08:53,559 --> 00:08:57,840 Speaker 1: That's hilarious. Now, watch this. I would not ask any 177 00:08:57,840 --> 00:09:01,280 Speaker 1: other woman this but you, because I know you're happy 178 00:09:01,280 --> 00:09:03,840 Speaker 1: with it. What, what's your current weight? 179 00:09:04,360 --> 00:09:05,040 Speaker 3: Sixty three. 180 00:09:06,280 --> 00:09:08,040 Speaker 1: How do you feel at that? Yeah? 181 00:09:08,160 --> 00:09:08,400 Speaker 2: Good? 182 00:09:08,640 --> 00:09:10,800 Speaker 3: Good. And the thing is, like, I want to, I 183 00:09:10,840 --> 00:09:15,040 Speaker 3: want to put lower body muscle on, and the reality 184 00:09:15,120 --> 00:09:18,480 Speaker 3: is I can't do that staying at the same weight, 185 00:09:18,520 --> 00:09:21,400 Speaker 3: because it's just physiologically not possible. I can't lose that 186 00:09:21,480 --> 00:09:23,520 Speaker 3: amount of weight in 187 00:09:23,559 --> 00:09:26,040 Speaker 3: body fat to put it on; it'd be too low a body fat. 188 00:09:26,360 --> 00:09:30,160 Speaker 1: Yeah, well, you can't. Yeah. I mean, what's that? 189 00:09:30,520 --> 00:09:32,280 Speaker 2: Sorry, Craig, I had to ask, what's your height? 190 00:09:32,120 --> 00:09:34,640 Speaker 3: I'm just under one sixty eight.
191 00:09:35,760 --> 00:09:37,520 Speaker 2: Oh wow, I'm trying to do the maths here, because 192 00:09:37,520 --> 00:09:41,520 Speaker 2: I'm one sixty nine and seventy kilos. It's just, yeah, 193 00:09:41,559 --> 00:09:45,160 Speaker 2: that's, that seems really light to me. But obviously 194 00:09:45,200 --> 00:09:48,880 Speaker 2: your shape is totally different. Yeah. 195 00:09:49,040 --> 00:09:51,360 Speaker 1: No, she doesn't look... I mean, she 196 00:09:51,400 --> 00:09:55,560 Speaker 1: looks jacked, you know. She looks like an athlete, which 197 00:09:55,600 --> 00:09:59,000 Speaker 1: is good, because that's what she wants to look like. 198 00:09:59,320 --> 00:10:04,199 Speaker 1: So, old Grandpa in the middle of the screen, 199 00:10:04,240 --> 00:10:07,760 Speaker 1: my screen anyway, is eighty kilos, which for me is 200 00:10:07,840 --> 00:10:11,400 Speaker 1: borderline anorexic. I haven't been this light since I was 201 00:10:11,440 --> 00:10:17,840 Speaker 1: about twelve, not even, probably eleven. How funny is that? 202 00:10:17,960 --> 00:10:19,840 Speaker 1: People are going, oh, that's funny. I go, no, that's 203 00:10:20,000 --> 00:10:23,080 Speaker 1: actually true. I probably weighed this when I was about 204 00:10:23,200 --> 00:10:24,160 Speaker 1: eleven or twelve. 205 00:10:24,280 --> 00:10:28,200 Speaker 2: But do you remember sitting in a boardroom in an 206 00:10:28,200 --> 00:10:30,800 Speaker 2: advertising agency with me when we were working on a 207 00:10:30,960 --> 00:10:33,120 Speaker 2: project that didn't quite get off the ground, but it 208 00:10:33,200 --> 00:10:36,480 Speaker 2: was just helping people lose weight, and there were a 209 00:10:36,559 --> 00:10:38,640 Speaker 2: number of people, and one of the directors of the 210 00:10:38,720 --> 00:10:42,080 Speaker 2: agency was a kind of a bit of a bigger guy, 211 00:10:42,559 --> 00:10:45,200 Speaker 2: and we were all talking about, you know, collectively losing weight. 212 00:10:45,240 --> 00:10:47,920 Speaker 2: So the idea of the project was instead of losing 213 00:10:47,960 --> 00:10:50,280 Speaker 2: weight individually, you set up a team of people and 214 00:10:50,320 --> 00:10:53,640 Speaker 2: you collectively lost weight, and that way it was more 215 00:10:53,840 --> 00:10:57,120 Speaker 2: of a team effort to help encourage people. But I 216 00:10:57,200 --> 00:10:59,880 Speaker 2: remember, whilst we were sitting there, saying, yeah, I'd 217 00:11:00,160 --> 00:11:02,920 Speaker 2: like to lose a couple of kilos, 218 00:11:03,000 --> 00:11:06,480 Speaker 2: and this guy kind of really tore shreds off me, saying, oh, 219 00:11:06,480 --> 00:11:09,160 Speaker 2: that's ridiculous, you don't need to lose any weight. But 220 00:11:09,240 --> 00:11:11,840 Speaker 2: it's a very personal journey when we look at weight 221 00:11:11,880 --> 00:11:15,280 Speaker 2: loss, and me saying, oh, man, 222 00:11:15,320 --> 00:11:17,600 Speaker 2: I just wish I could lose one or two kilos, is nothing 223 00:11:18,400 --> 00:11:20,679 Speaker 2: in the grand scale of things. Because a good friend 224 00:11:20,679 --> 00:11:24,160 Speaker 2: of mine, we were talking about losing weight, and he's 225 00:11:24,320 --> 00:11:27,280 Speaker 2: lost about forty kilos.
He's a very big person and 226 00:11:27,320 --> 00:11:30,400 Speaker 2: it's a massive amount to lose for him, but relative 227 00:11:30,440 --> 00:11:32,440 Speaker 2: to his size, he's got a little bit more of 228 00:11:32,440 --> 00:11:34,600 Speaker 2: a journey to go, and he's doing really, really well 229 00:11:34,640 --> 00:11:37,840 Speaker 2: because he's losing it over a span of time. But 230 00:11:37,880 --> 00:11:39,880 Speaker 2: it is a personal journey, and, you know, 231 00:11:40,000 --> 00:11:43,679 Speaker 2: sometimes people forget that, you know, and you almost face 232 00:11:43,800 --> 00:11:46,600 Speaker 2: ridicule if you, if you present as 233 00:11:46,679 --> 00:11:50,680 Speaker 2: somebody who looks quite fit but you feel that you'd 234 00:11:50,760 --> 00:11:53,240 Speaker 2: like to lose a little bit, without being obsessive; it 235 00:11:53,400 --> 00:11:55,280 Speaker 2: still is how you feel in your own skin. 236 00:11:56,360 --> 00:12:00,199 Speaker 1: Yeah, yeah. I mean, I remember exactly what you're 237 00:12:00,200 --> 00:12:04,920 Speaker 1: talking about and who you're talking about. And what's tricky is, 238 00:12:05,080 --> 00:12:12,040 Speaker 1: especially in twenty twenty five, you almost can't talk about weight. 239 00:12:12,840 --> 00:12:16,199 Speaker 1: You know, I can, and I do. But the problem 240 00:12:16,280 --> 00:12:20,880 Speaker 1: is that when we are talking about physiology and health 241 00:12:21,000 --> 00:12:26,400 Speaker 1: and body composition and subcutaneous fat and visceral fat and 242 00:12:27,520 --> 00:12:30,400 Speaker 1: diabetes and all of the things that kind of float 243 00:12:30,640 --> 00:12:34,200 Speaker 1: around how your body works and what your body is 244 00:12:34,200 --> 00:12:37,960 Speaker 1: made up of, you know, people almost take that 245 00:12:38,200 --> 00:12:43,520 Speaker 1: as an emotional or psychological judgment or attack. Whereas I'm 246 00:12:43,559 --> 00:12:47,040 Speaker 1: speaking as an exercise scientist and I'm talking about data 247 00:12:47,080 --> 00:12:51,040 Speaker 1: and science, and so, without being rude, I don't give 248 00:12:51,040 --> 00:12:53,120 Speaker 1: a fuck what people weigh. I don't give a fuck 249 00:12:53,160 --> 00:12:55,559 Speaker 1: what they look like. I care about how well their 250 00:12:55,600 --> 00:12:59,920 Speaker 1: body works. But like it or not, there is a 251 00:13:00,000 --> 00:13:06,320 Speaker 1: relationship between weight and how much body fat a person 252 00:13:06,440 --> 00:13:11,520 Speaker 1: is carrying, and their likely health outcomes. Now that's not 253 00:13:11,679 --> 00:13:13,959 Speaker 1: a guess or a stab or an insult, but when 254 00:13:14,040 --> 00:13:17,360 Speaker 1: you point that out, people would rather find a way 255 00:13:17,400 --> 00:13:21,040 Speaker 1: to get insulted than enlightened, right? And they're like, oh, 256 00:13:21,120 --> 00:13:24,199 Speaker 1: you're fucking fat shaming. I'm like, I was the fattest 257 00:13:24,280 --> 00:13:27,360 Speaker 1: kid in my school. I know what it's like to 258 00:13:27,360 --> 00:13:30,040 Speaker 1: be obese. I know what it's like to be called jumbo. 259 00:13:30,480 --> 00:13:34,319 Speaker 1: I am not fat shaming. I'm fucking talking about health 260 00:13:34,360 --> 00:13:38,960 Speaker 1: and wellness and coming from a place of literal personal experience, 261 00:13:39,520 --> 00:13:43,960 Speaker 1: you know.
And there's that kind of trying to navigate 262 00:13:44,000 --> 00:13:47,320 Speaker 1: that road at the moment publicly when you're talking. It's 263 00:13:47,440 --> 00:13:51,400 Speaker 1: tricky, because in many ways it's easier to find a 264 00:13:51,400 --> 00:13:55,160 Speaker 1: way to be offended than it is to take it 265 00:13:55,200 --> 00:13:59,240 Speaker 1: on board as just, you know, science or information and 266 00:13:59,280 --> 00:14:00,520 Speaker 1: do something positive with it. 267 00:14:01,280 --> 00:14:04,280 Speaker 2: The AMA, the Australian Medical Association, a few years ago 268 00:14:04,520 --> 00:14:08,480 Speaker 2: were trying to encourage doctors to be more blunt with 269 00:14:08,559 --> 00:14:11,400 Speaker 2: the language they were using and not to gloss over 270 00:14:12,000 --> 00:14:15,520 Speaker 2: the notion that someone might be morbidly obese. And the 271 00:14:15,559 --> 00:14:18,000 Speaker 2: thing is, it's such a difficult journey for different people, 272 00:14:18,080 --> 00:14:21,200 Speaker 2: for lots and lots and lots of reasons. And I 273 00:14:21,240 --> 00:14:24,720 Speaker 2: guess, how do you walk that tightrope 274 00:14:24,760 --> 00:14:28,920 Speaker 2: of glossing over it to protect the person's mental well 275 00:14:28,960 --> 00:14:31,760 Speaker 2: being, but also to, you know, to say to them 276 00:14:31,880 --> 00:14:34,960 Speaker 2: that you potentially are putting your life at risk by 277 00:14:35,080 --> 00:14:39,280 Speaker 2: maintaining the current status quo. So I don't know 278 00:14:39,320 --> 00:14:41,960 Speaker 2: that I'd want to be in that space, but it's 279 00:14:41,960 --> 00:14:45,080 Speaker 2: certainly a challenge that every GP and health professional faces. 280 00:14:45,880 --> 00:14:48,840 Speaker 1: Yeah, and I think, I mean, it is tough, because 281 00:14:49,040 --> 00:14:51,080 Speaker 1: of course you care about people, and you have empathy 282 00:14:51,120 --> 00:14:54,400 Speaker 1: and awareness, and you don't want to make people feel bad. 283 00:14:54,480 --> 00:14:58,200 Speaker 1: But also, I don't want people that I care about, 284 00:14:58,360 --> 00:15:00,800 Speaker 1: or even people that I don't know, for that matter, I 285 00:15:00,840 --> 00:15:02,960 Speaker 1: don't want them to wake up in five or ten 286 00:15:03,040 --> 00:15:11,960 Speaker 1: years and have life threatening health conditions because everyone was 287 00:15:12,000 --> 00:15:17,480 Speaker 1: too concerned with their, their feelings for the next thirty seconds. 288 00:15:18,040 --> 00:15:22,400 Speaker 1: It's like, I care about your feelings, but it's not 289 00:15:22,560 --> 00:15:27,240 Speaker 1: my job to manage your feelings or work around your emotions. 290 00:15:27,680 --> 00:15:31,600 Speaker 1: I'm literally sharing practical, real world information with you, and 291 00:15:31,640 --> 00:15:33,920 Speaker 1: if you don't want to listen, that's fine. If you 292 00:15:33,960 --> 00:15:36,560 Speaker 1: get upset, that's also fine. I don't want you to 293 00:15:36,600 --> 00:15:40,360 Speaker 1: be upset. But I'm not going to apologize for just 294 00:15:40,440 --> 00:15:43,240 Speaker 1: saying what I know. Like, I've literally worked with tens 295 00:15:43,240 --> 00:15:46,520 Speaker 1: of thousands of bodies, so I'm not guessing, you know. 296 00:15:46,600 --> 00:15:50,320 Speaker 1: This is not some random theory. And it's, it's funny.
297 00:15:50,320 --> 00:15:53,240 Speaker 1: We live in that time where... where did Tiff go, 298 00:15:53,360 --> 00:15:55,880 Speaker 1: by the way? Patrick, can you hear? 299 00:15:56,080 --> 00:15:59,080 Speaker 2: We can hear, yeah, we can. We're missing you. What's 300 00:15:59,080 --> 00:15:59,480 Speaker 2: going on? 301 00:16:00,080 --> 00:16:04,040 Speaker 3: Well, my screen has frozen, so everything... and I can't 302 00:16:04,080 --> 00:16:06,200 Speaker 3: move the cursor or anything, so I can hear you, 303 00:16:06,240 --> 00:16:08,440 Speaker 3: but everything's frozen on my computer screen. 304 00:16:08,560 --> 00:16:10,680 Speaker 2: Don't touch anything. Keep doing what you're doing. 305 00:16:11,160 --> 00:16:11,360 Speaker 3: Yeah. 306 00:16:11,440 --> 00:16:14,320 Speaker 1: Yeah, you just talk from behind that black curtain. I 307 00:16:14,360 --> 00:16:15,800 Speaker 1: don't know what I was going to say, but let's 308 00:16:15,800 --> 00:16:18,680 Speaker 1: talk about, let's talk about the stuff that you're the 309 00:16:18,720 --> 00:16:20,520 Speaker 1: best at, Patrick James. 310 00:16:20,640 --> 00:16:22,920 Speaker 2: I've got a nice segue to that, because a lot 311 00:16:22,920 --> 00:16:26,560 Speaker 2: more people are turning to AI for health advice. And 312 00:16:26,800 --> 00:16:29,200 Speaker 2: there was a really interesting story about a man who 313 00:16:29,240 --> 00:16:36,520 Speaker 2: became hospitalized with psychiatric symptoms after getting 314 00:16:36,520 --> 00:16:42,280 Speaker 2: advice from AI. So this guy was wanting to reduce 315 00:16:42,320 --> 00:16:45,240 Speaker 2: his salt intake. Okay, so he started this whole chat 316 00:16:45,240 --> 00:16:47,520 Speaker 2: with the chatbot, and he decided to set up his 317 00:16:47,560 --> 00:16:51,480 Speaker 2: own health plan and thought, I've got to reduce my 318 00:16:51,600 --> 00:16:56,640 Speaker 2: salt intake. So he basically found a substitute by talking 319 00:16:56,680 --> 00:17:01,720 Speaker 2: to ChatGPT, and it suggested sodium bromide instead of sodium 320 00:17:01,800 --> 00:17:05,080 Speaker 2: chloride, which you can swap around if you're going 321 00:17:05,119 --> 00:17:11,280 Speaker 2: to be cleaning a bathtub, but not if you're ingesting it. So yeah, 322 00:17:11,280 --> 00:17:13,399 Speaker 2: he went and bought this stuff online and thought, oh, 323 00:17:13,440 --> 00:17:15,119 Speaker 2: I'm going to reduce my salt intake with it, but 324 00:17:15,280 --> 00:17:17,840 Speaker 2: with a different sort of salt. So he's just eating this; 325 00:17:17,920 --> 00:17:21,479 Speaker 2: over three months he's decided to take this on, and 326 00:17:21,520 --> 00:17:26,399 Speaker 2: then after that time, he ended up in 327 00:17:26,440 --> 00:17:30,040 Speaker 2: the emergency department with paranoid delusions, and he believed that 328 00:17:30,080 --> 00:17:32,679 Speaker 2: his neighbor was trying to poison him. In fact, he 329 00:17:32,760 --> 00:17:36,200 Speaker 2: was poisoning himself. And that's because he'd listened to 330 00:17:36,280 --> 00:17:39,760 Speaker 2: health advice, and the chatbot didn't really discern that 331 00:17:39,840 --> 00:17:41,880 Speaker 2: it was about him taking it. It was just, well, what's 332 00:17:41,880 --> 00:17:46,000 Speaker 2: an alternative to salt? And that's what ended up happening.
333 00:17:46,080 --> 00:17:50,960 Speaker 2: So yeah, it's a really timely reminder that everything that 334 00:17:51,040 --> 00:17:54,359 Speaker 2: you see may not necessarily be the right sort of 335 00:17:54,400 --> 00:17:57,040 Speaker 2: advice when you jump into, you know, ChatGPT 336 00:17:57,080 --> 00:18:00,320 Speaker 2: or whatever it is that you're using. So that 337 00:18:00,440 --> 00:18:02,080 Speaker 2: was a very vivid example. 338 00:18:02,000 --> 00:18:07,200 Speaker 1: You know, that's, that's bloody amazing. I mean, imagine... yeah, well, 339 00:18:07,240 --> 00:18:09,200 Speaker 1: that, that's one of the dangers, isn't it? I mean, 340 00:18:09,200 --> 00:18:13,040 Speaker 1: that's a danger of, I think, how much, how much 341 00:18:13,119 --> 00:18:16,600 Speaker 1: we pay attention to it, rely on it and trust it. And I think, 342 00:18:16,640 --> 00:18:20,240 Speaker 1: even though, you know, I think we both agree 343 00:18:20,240 --> 00:18:26,119 Speaker 1: that AI has got unbelievable potential and unbelievable capacity to 344 00:18:27,000 --> 00:18:29,960 Speaker 1: be of value, it's also, you know, it can also 345 00:18:30,720 --> 00:18:35,359 Speaker 1: do untold harm if not used the right way, and 346 00:18:35,440 --> 00:18:39,560 Speaker 1: if perhaps trusted, in inverted commas, when it shouldn't be trusted. 347 00:18:39,640 --> 00:18:42,959 Speaker 2: Oh yeah. And there's a lot of AI slop 348 00:18:43,040 --> 00:18:45,280 Speaker 2: out there as well. And in fact Wikipedia, I don't 349 00:18:45,320 --> 00:18:47,480 Speaker 2: know if you use Wikipedia much, but I tend to 350 00:18:47,480 --> 00:18:49,040 Speaker 2: find it still pops up and I have a bit 351 00:18:49,040 --> 00:18:50,800 Speaker 2: of a read, you know, whether it's a profile or 352 00:18:50,840 --> 00:18:54,120 Speaker 2: something like that. But they're battling AI at the moment, 353 00:18:54,119 --> 00:18:57,680 Speaker 2: because Wikipedia is managed by a whole stack of volunteers, 354 00:18:57,920 --> 00:19:00,560 Speaker 2: and these people 355 00:19:00,640 --> 00:19:04,200 Speaker 2: are really passionate about keeping the content out there. 356 00:19:04,240 --> 00:19:06,680 Speaker 2: So I guess it's like the Internet version of the 357 00:19:06,800 --> 00:19:12,159 Speaker 2: Encyclopedia Britannica. You know, it's the traditional encyclopedic format, to 358 00:19:12,200 --> 00:19:14,720 Speaker 2: be able to give you information on a whole host 359 00:19:14,760 --> 00:19:16,800 Speaker 2: of things. But what they've always taken a lot of 360 00:19:16,840 --> 00:19:20,160 Speaker 2: pride in is the way that they fact check and 361 00:19:20,320 --> 00:19:23,720 Speaker 2: the way that the volunteers try to make sure that 362 00:19:23,760 --> 00:19:27,640 Speaker 2: the content that's on Wikipedia is proper, you know, whether 363 00:19:27,680 --> 00:19:32,880 Speaker 2: it's scientifically based or information based; it's, you know, it's 364 00:19:32,920 --> 00:19:35,280 Speaker 2: a rigid way to kind of, you know, keep the 365 00:19:35,359 --> 00:19:38,760 Speaker 2: system in place. But the problem now is that this 366 00:19:39,000 --> 00:19:43,359 Speaker 2: AI slop is being pushed out and anybody can contribute, 367 00:19:43,359 --> 00:19:45,159 Speaker 2: but then there's the fact checking that goes with it.
368 00:19:45,280 --> 00:19:49,160 Speaker 2: So they call the people who moderate Wikipedians, and 369 00:19:49,480 --> 00:19:52,040 Speaker 2: they go through all of this, and 370 00:19:52,240 --> 00:19:54,879 Speaker 2: they literally have to read the written articles, they have 371 00:19:54,960 --> 00:19:59,320 Speaker 2: to go through, and they're trying to monitor what's being 372 00:19:59,480 --> 00:20:04,280 Speaker 2: proposed, and they're getting good at finding the crap stuff, 373 00:20:05,080 --> 00:20:07,520 Speaker 2: and there's a process that they go through, but they 374 00:20:07,560 --> 00:20:10,920 Speaker 2: are really working hard, and typically articles that get flagged 375 00:20:11,760 --> 00:20:14,639 Speaker 2: then have a seven day discussion period. So it's a 376 00:20:14,680 --> 00:20:18,160 Speaker 2: peer thing; it goes out to other people in the community, 377 00:20:18,200 --> 00:20:21,080 Speaker 2: members who are moderators, and they determine whether or not 378 00:20:21,640 --> 00:20:24,320 Speaker 2: they should, you know, delete the article. 379 00:20:24,400 --> 00:20:27,240 Speaker 2: So they feel it's quite robust, but they're kind of 380 00:20:27,240 --> 00:20:29,399 Speaker 2: fighting a bit of an uphill battle at the moment 381 00:20:29,800 --> 00:20:33,439 Speaker 2: to try to keep Wikipedia as, you know, AI-free 382 00:20:33,600 --> 00:20:36,760 Speaker 2: as possible. But there's so much slop. I read an 383 00:20:36,840 --> 00:20:38,960 Speaker 2: article a little while ago that said fifty one percent 384 00:20:39,000 --> 00:20:41,840 Speaker 2: of all content on the Internet now is AI generated. 385 00:20:41,920 --> 00:20:44,560 Speaker 2: That scared the shit out of me. That's pretty scary 386 00:20:44,640 --> 00:20:47,320 Speaker 2: to think of that sort of content. But maybe at 387 00:20:47,320 --> 00:20:50,160 Speaker 2: the moment, Wikipedia seems like they're doing what they can 388 00:20:50,320 --> 00:20:53,159 Speaker 2: through a whole horde of volunteers. 389 00:20:53,480 --> 00:20:56,280 Speaker 1: Overall, just one word or the other: are you pro 390 00:20:56,480 --> 00:21:02,480 Speaker 1: or anti AI? Just overall. Pro? Yeah? Cool. Tiff, where 391 00:21:02,480 --> 00:21:02,840 Speaker 1: have you been? 392 00:21:03,600 --> 00:21:06,760 Speaker 3: I fell, fell off. My whole computer just shut down 393 00:21:06,760 --> 00:21:09,080 Speaker 3: and said, we needed to shut you down. 394 00:21:10,520 --> 00:21:10,880 Speaker 2: You didn't. 395 00:21:13,520 --> 00:21:16,680 Speaker 1: Patrick was just... we've been talking about you for three minutes. 396 00:21:17,000 --> 00:21:21,760 Speaker 1: Tiff actually fell off the podcast. Now she's gone, I'll 397 00:21:21,760 --> 00:21:24,320 Speaker 1: tell you what. She's having Internet problems. She actually dropped 398 00:21:24,400 --> 00:21:26,639 Speaker 1: right out. There was just Patrick and I on the 399 00:21:26,680 --> 00:21:28,760 Speaker 1: screen for the moment. It's a good thing I'm recording 400 00:21:28,800 --> 00:21:29,760 Speaker 1: from this end, isn't it? 401 00:21:29,920 --> 00:21:34,240 Speaker 2: Yeah, that's true, it certainly is. Tiff's flashing in my window. 402 00:21:36,080 --> 00:21:41,280 Speaker 1: Well, there's a thought. I'm not even gonna... I'm not 403 00:21:41,400 --> 00:21:46,359 Speaker 1:
I don't... I reckon... yep. Oh, so many things, 404 00:21:46,960 --> 00:21:49,800 Speaker 1: you know. Like, I have this thing where I play 405 00:21:49,920 --> 00:21:54,200 Speaker 1: this verbal scenario out in my head on fast forward 406 00:21:54,240 --> 00:21:58,000 Speaker 1: before I'm going to say it, and on rare occasions 407 00:21:58,119 --> 00:21:59,280 Speaker 1: I choose not to say it. 408 00:21:59,320 --> 00:22:02,879 Speaker 2: Today's that day? Well, except for the stuff we've had to 409 00:22:02,960 --> 00:22:04,960 Speaker 2: edit out of the show at the start of the 410 00:22:04,960 --> 00:22:08,600 Speaker 2: show, because it's been heavily edited, listeners, it started off 411 00:22:08,600 --> 00:22:09,200 Speaker 2: pretty well. 412 00:22:09,640 --> 00:22:12,320 Speaker 1: That's because I went very dark, and I'm like, I 413 00:22:12,480 --> 00:22:13,600 Speaker 1: self-edited. 414 00:22:13,720 --> 00:22:15,960 Speaker 2: Actually, talking about dark, I think, I feel I need 415 00:22:16,000 --> 00:22:18,359 Speaker 2: to work this into the conversation now, seeing as we're talking 416 00:22:18,400 --> 00:22:23,000 Speaker 2: about AI and dark. You know the problem with AI, 417 00:22:23,119 --> 00:22:25,680 Speaker 2: and we talked about this in a previous show where 418 00:22:25,760 --> 00:22:28,679 Speaker 2: I did some testing with AI and I asked it to 419 00:22:28,680 --> 00:22:31,639 Speaker 2: do some image generation, and you might recall that I 420 00:22:31,840 --> 00:22:34,840 Speaker 2: wanted to see what would happen if I asked AI 421 00:22:35,000 --> 00:22:39,160 Speaker 2: to generate a picture of Alexander the Great with his lover, 422 00:22:39,920 --> 00:22:43,800 Speaker 2: and the initial pictures were of Alexander the Great with 423 00:22:43,920 --> 00:22:47,000 Speaker 2: a female, and, well, we now know... except for Gemini, 424 00:22:47,080 --> 00:22:51,119 Speaker 2: which was interesting, because Gemini is Google's AI, and it 425 00:22:51,200 --> 00:22:53,320 Speaker 2: pulled up a whole lot of factual information about the 426 00:22:53,320 --> 00:22:55,800 Speaker 2: fact that it's thought that Alexander the Great had a 427 00:22:55,800 --> 00:22:59,400 Speaker 2: male lover. And then when I got Meta to try 428 00:22:59,400 --> 00:23:01,400 Speaker 2: to generate the picture, when I corrected it and said, 429 00:23:01,440 --> 00:23:03,399 Speaker 2: no, no, I don't want a female lover, I 430 00:23:03,400 --> 00:23:06,040 Speaker 2: want a male lover, it refused to generate it, saying 431 00:23:06,040 --> 00:23:10,280 Speaker 2: it was inappropriate content. So it's interesting when we 432 00:23:10,400 --> 00:23:15,199 Speaker 2: think about who programs these AIs and what 433 00:23:15,240 --> 00:23:17,320 Speaker 2: biases are built into them. You know, if you 434 00:23:17,440 --> 00:23:21,560 Speaker 2: just type in, you know, young female or young male, 435 00:23:21,760 --> 00:23:25,040 Speaker 2: or you type in, you know, man working, generally it's 436 00:23:25,080 --> 00:23:27,639 Speaker 2: going to be a white person. It's not going to 437 00:23:27,680 --> 00:23:30,199 Speaker 2: be a person of Asian origin or a person with 438 00:23:30,560 --> 00:23:33,840 Speaker 2: dark skin. So sometimes there's been a lot of biases 439 00:23:33,880 --> 00:23:36,120 Speaker 2: that have been picked up.
But what creeped me out 440 00:23:36,160 --> 00:23:40,200 Speaker 2: a lot reading this article is Meta had some rules 441 00:23:40,320 --> 00:23:41,800 Speaker 2: that they've been picked up on. So there was an 442 00:23:41,840 --> 00:23:45,120 Speaker 2: investigation by Reuters, the news service, 443 00:23:45,480 --> 00:23:48,439 Speaker 2: and they reckon that the social media giant had a 444 00:23:48,520 --> 00:23:52,840 Speaker 2: rule for their chatbots which permitted them to have a 445 00:23:53,400 --> 00:23:59,560 Speaker 2: provocative kind of behavior when talking to children, including issues 446 00:23:59,560 --> 00:24:05,000 Speaker 2: around sex. And so there were documents talking about what 447 00:24:05,160 --> 00:24:09,920 Speaker 2: was appropriate behavior and what wasn't, and they were condoning, effectively, 448 00:24:11,160 --> 00:24:14,400 Speaker 2: the notion that the AI chatbot could engage 449 00:24:14,440 --> 00:24:19,879 Speaker 2: in conversations with children involving romantic or sensual content. 450 00:24:20,000 --> 00:24:23,440 Speaker 2: So Reuters said that basically the document that it saw 451 00:24:23,480 --> 00:24:27,679 Speaker 2: from Meta discussed the standards that its guidelines were based on, 452 00:24:28,280 --> 00:24:32,800 Speaker 2: and the chatbots, this is, we're talking Facebook, WhatsApp, Instagram. 453 00:24:33,160 --> 00:24:38,400 Speaker 2: And they confirmed the document's authenticity, and it said that, effectively, 454 00:24:39,040 --> 00:24:42,280 Speaker 2: it could say things to children about how, so an 455 00:24:42,280 --> 00:24:45,919 Speaker 2: eight year old could have the chatbot say their body 456 00:24:46,200 --> 00:24:50,679 Speaker 2: looks a certain way, in a sensual way, effectively, 457 00:24:50,760 --> 00:24:53,439 Speaker 2: is how it was looking at it. And so if an 458 00:24:53,480 --> 00:24:56,000 Speaker 2: eight year old asked how their body looked, 459 00:24:56,240 --> 00:24:58,639 Speaker 2: it could say it looks like a masterpiece, you know, 460 00:24:58,720 --> 00:25:01,600 Speaker 2: a sensual masterpiece. There was some really creepy kind of 461 00:25:01,680 --> 00:25:06,320 Speaker 2: terminology in there. And the concern is that this was 462 00:25:06,440 --> 00:25:11,040 Speaker 2: vetted internally by the people who are providing the chatbots, 463 00:25:11,080 --> 00:25:13,840 Speaker 2: and they're saying, well, yeah, it's okay for the AI 464 00:25:13,960 --> 00:25:17,760 Speaker 2: chatbot to talk to children about sexual things or sensual things, 465 00:25:18,280 --> 00:25:23,159 Speaker 2: but then when it was flagged by Reuters, they removed it. 466 00:25:23,520 --> 00:25:26,800 Speaker 2: So the thing is, they have ethicists who work for them, 467 00:25:27,240 --> 00:25:29,679 Speaker 2: and they vetted this and said it was okay, and 468 00:25:29,720 --> 00:25:33,800 Speaker 2: then Reuters puts out an article and they say, well, actually, 469 00:25:34,000 --> 00:25:37,280 Speaker 2: what you've got here isn't good. So the term that 470 00:25:37,400 --> 00:25:40,919 Speaker 2: was used, this is to a shirtless eight year old, every 471 00:25:41,000 --> 00:25:45,400 Speaker 2: inch of you is a masterpiece, is the terminology that they 472 00:25:45,440 --> 00:25:48,479 Speaker 2: said was okay as far as Meta was concerned.
And 473 00:25:48,520 --> 00:25:50,800 Speaker 2: now of course it's been picked up by Reuters and 474 00:25:50,960 --> 00:25:53,280 Speaker 2: they're going a little bit more strict on that policy. 475 00:25:53,640 --> 00:25:57,560 Speaker 2: But this is the frightening thought: who is training 476 00:25:57,600 --> 00:26:01,119 Speaker 2: the AI, and what's the background, what are their standards, 477 00:26:01,160 --> 00:26:04,560 Speaker 2: what are their morals, what are their values? And you know, 478 00:26:04,640 --> 00:26:07,480 Speaker 2: when you're pulling all this information in, then you 479 00:26:07,520 --> 00:26:10,800 Speaker 2: know there's a bias that then gets adopted by the AI, 480 00:26:11,320 --> 00:26:14,000 Speaker 2: which is then borne out in the results that it's 481 00:26:14,040 --> 00:26:15,719 Speaker 2: giving us. 482 00:26:16,160 --> 00:26:22,440 Speaker 1: Yeah, it's like, who, as you alluded to, like, who says 483 00:26:22,840 --> 00:26:26,760 Speaker 1: that that's okay? Like, ultimately there must be someone somewhere 484 00:26:27,760 --> 00:26:30,159 Speaker 1: giving the green light or the thumbs up or whatever, 485 00:26:30,200 --> 00:26:34,679 Speaker 1: whatever the equivalent is, the virtual equivalent, to say this 486 00:26:34,840 --> 00:26:39,760 Speaker 1: is okay. That stuff, it actually makes me angry. It makes 487 00:26:39,840 --> 00:26:44,679 Speaker 1: me quite fucked off that that's even a thing that 488 00:26:45,720 --> 00:26:48,800 Speaker 1: needs to, or not needs to, but that exists. 489 00:26:49,480 --> 00:26:56,240 Speaker 1: Anything, anything around the abuse or exploitation of kids, 490 00:26:56,359 --> 00:26:58,400 Speaker 1: I don't know why, I just have this... it fucking 491 00:26:58,440 --> 00:27:02,520 Speaker 1: annoys me a lot. And adults who perpetuate that, or 492 00:27:02,520 --> 00:27:07,840 Speaker 1: facilitate or allow or promote that, fuck you, all of you. 493 00:27:07,920 --> 00:27:09,320 Speaker 1: Like, I just don't like it. 494 00:27:09,800 --> 00:27:13,840 Speaker 2: And on the flip side, there's a few Australian businesses, 495 00:27:13,840 --> 00:27:15,840 Speaker 2: I don't know if you've seen this in the media recently, 496 00:27:16,160 --> 00:27:21,600 Speaker 2: where businesses that potentially might do, say, early childhood photography, 497 00:27:22,200 --> 00:27:25,400 Speaker 2: so they take photos of kids with mums and bubs 498 00:27:25,400 --> 00:27:27,879 Speaker 2: and things like that. Well, there's been a couple of 499 00:27:27,880 --> 00:27:31,360 Speaker 2: instances where businesses have been shut out of their social 500 00:27:31,440 --> 00:27:36,200 Speaker 2: media because the imagery has been flagged by the AI 501 00:27:36,520 --> 00:27:40,600 Speaker 2: as inappropriate and potentially child pornography, when in fact it wasn't. 502 00:27:41,000 --> 00:27:44,000 Speaker 2: But there's no appeal process that seems to be robust 503 00:27:44,080 --> 00:27:46,239 Speaker 2: enough to have a real human look at it. So 504 00:27:46,280 --> 00:27:50,080 Speaker 2: there have been businesses that have said that they 505 00:27:50,119 --> 00:27:53,679 Speaker 2: do all their promotion to all of their customers online, 506 00:27:53,760 --> 00:27:56,960 Speaker 2: and they rely on Facebook, they rely on Instagram, and 507 00:27:57,000 --> 00:27:59,680 Speaker 2: they've been booted off for having inappropriate images.
And I 508 00:27:59,680 --> 00:28:02,760 Speaker 2: guess that's where, you know, Wikipedia is doing it right 509 00:28:02,880 --> 00:28:07,480 Speaker 2: with the human oversight. But as these mega companies just 510 00:28:07,520 --> 00:28:10,040 Speaker 2: get bigger and bigger and bigger and are using AI 511 00:28:10,359 --> 00:28:15,320 Speaker 2: to decide whether Tiff's photograph as a child, because she 512 00:28:15,440 --> 00:28:17,280 Speaker 2: wants to show what she looked like as a kid 513 00:28:17,280 --> 00:28:19,440 Speaker 2: and what she looks like as an adult, may be 514 00:28:19,480 --> 00:28:23,359 Speaker 2: flagged as inappropriate, well, your social media could be switched off. 515 00:28:23,520 --> 00:28:25,520 Speaker 2: You know, if it's a day at the beach and 516 00:28:25,600 --> 00:28:28,560 Speaker 2: Tiff shows herself as a kid, you know, running 517 00:28:28,600 --> 00:28:30,159 Speaker 2: into the water or running out of the water with 518 00:28:30,280 --> 00:28:33,000 Speaker 2: dad holding her hand, or something, that could potentially be 519 00:28:33,040 --> 00:28:35,320 Speaker 2: flagged and then your social media will be switched off 520 00:28:35,359 --> 00:28:39,880 Speaker 2: instantly overnight. And then what's the process to then go through? 521 00:28:40,280 --> 00:28:43,080 Speaker 2: And in fact, the ACCC, that's 522 00:28:43,160 --> 00:28:48,200 Speaker 2: the Australian Competition and Consumer Commission, is looking at putting 523 00:28:48,280 --> 00:28:53,600 Speaker 2: pressure on and potentially calling for legislation to force these big 524 00:28:53,640 --> 00:28:57,040 Speaker 2: tech companies to have a better and more robust system 525 00:28:57,040 --> 00:29:00,400 Speaker 2: of appeal that's not going just through an automated system, 526 00:29:00,480 --> 00:29:04,479 Speaker 2: but where there is rapid response, not weeks and months 527 00:29:04,480 --> 00:29:07,120 Speaker 2: of response time, to try to get your social media 528 00:29:07,160 --> 00:29:10,160 Speaker 2: back up again if it accidentally gets flagged. So I mean, 529 00:29:10,200 --> 00:29:13,240 Speaker 2: it's problematic on both sides. And that's the challenge, isn't it? 530 00:29:13,360 --> 00:29:15,560 Speaker 2: When you take humans out of the equation and you're 531 00:29:15,600 --> 00:29:19,800 Speaker 2: relying on an algorithm, then that's where it becomes a problem. 532 00:29:19,880 --> 00:29:22,000 Speaker 2: And, I mean, the other 533 00:29:22,040 --> 00:29:24,720 Speaker 2: thing I guess is, you know, at the moment, Facebook 534 00:29:24,760 --> 00:29:29,680 Speaker 2: is pushing businesses to pay now to have a, you know, 535 00:29:29,800 --> 00:29:31,640 Speaker 2: to be able to have a kind of a subscription 536 00:29:31,760 --> 00:29:36,560 Speaker 2: process to make it easier for them to validate their business. 537 00:29:36,840 --> 00:29:38,840 Speaker 2: So if you want to validate your business, you can 538 00:29:38,880 --> 00:29:41,560 Speaker 2: pay for that to happen now. And I guess there 539 00:29:41,680 --> 00:29:43,240 Speaker 2: was going to be a point at which they wanted 540 00:29:43,280 --> 00:29:45,880 Speaker 2: to get some money from, you know, all those hundreds 541 00:29:45,920 --> 00:29:49,959 Speaker 2: of millions of sites out there that rely on social 542 00:29:50,000 --> 00:29:53,360 Speaker 2: media for marketing.
The reality of it is when we 543 00:29:53,440 --> 00:29:55,920 Speaker 2: get something for free, we lose control of our data 544 00:29:55,960 --> 00:29:58,880 Speaker 2: and information. Because, you know, everything we think is free, 545 00:29:58,880 --> 00:30:02,920 Speaker 2: whether it's Gmail or whatever it is, Facebook and Instagram, 546 00:30:03,360 --> 00:30:05,600 Speaker 2: it's not really free. It's just giving them all the 547 00:30:05,640 --> 00:30:07,520 Speaker 2: content that we put up there. 548 00:30:10,160 --> 00:30:14,640 Speaker 1: Wow, that was quite the monologue. Fucking hell. I could 549 00:30:14,640 --> 00:30:16,640 Speaker 1: have gone and made lunch, eaten it, grown a beard, 550 00:30:16,680 --> 00:30:17,920 Speaker 1: had a shave and come back. 551 00:30:18,160 --> 00:30:22,000 Speaker 2: Did you actually listen at all? Like, were you listening? 552 00:30:22,080 --> 00:30:23,640 Speaker 1: I was waiting for a break. 553 00:30:25,200 --> 00:30:25,600 Speaker 2: It's good. 554 00:30:26,880 --> 00:30:30,400 Speaker 1: No, no, no, it's all good. It's, you know, here's 555 00:30:30,480 --> 00:30:31,600 Speaker 1: the absolute truth. 556 00:30:31,720 --> 00:30:31,880 Speaker 2: Right. 557 00:30:31,920 --> 00:30:35,560 Speaker 1: I shouldn't say this, but everything you said makes sense. Um, 558 00:30:38,240 --> 00:30:42,440 Speaker 1: but this kind of exploitation, I don't know why, but 559 00:30:42,480 --> 00:30:48,000 Speaker 1: there's something in me, it makes me fucking angry, the 560 00:30:48,040 --> 00:30:52,080 Speaker 1: way that people... it's not just kids, but it's anyone 561 00:30:52,120 --> 00:30:56,160 Speaker 1: who can't really protect themselves, anyone, you know, like 562 00:30:56,240 --> 00:30:59,280 Speaker 1: my mum, who, like, people ring my mum regularly and 563 00:30:59,360 --> 00:31:03,239 Speaker 1: try and scam her. It makes me so angry, which I 564 00:31:03,240 --> 00:31:05,360 Speaker 1: know is not a solution, and it's probably a flaw 565 00:31:05,440 --> 00:31:09,680 Speaker 1: of mine, but I just get incensed, and I just 566 00:31:09,760 --> 00:31:12,760 Speaker 1: want to... like, honestly, I want to change the fucking topic. 567 00:31:12,880 --> 00:31:15,360 Speaker 1: That's what I feel like. But I think it's good 568 00:31:15,360 --> 00:31:18,320 Speaker 1: that we talk about it. That's the truth. But yeah, 569 00:31:18,360 --> 00:31:22,040 Speaker 1: that stuff, that kind of pushes a button in me. 570 00:31:22,800 --> 00:31:23,840 Speaker 1: You, Patrick, or no? 571 00:31:24,400 --> 00:31:26,720 Speaker 2: Oh, absolutely it does, absolutely. 572 00:31:27,640 --> 00:31:30,840 Speaker 1: Yeah, I know I'm the problem. But yeah, it's like, if... 573 00:31:30,720 --> 00:31:34,160 Speaker 2: You asked me the question earlier, AI, good or bad? 574 00:31:34,480 --> 00:31:36,200 Speaker 2: So I'm going to pose it to you and Tiff. 575 00:31:36,600 --> 00:31:38,320 Speaker 2: Craigo, AI, good or bad? 576 00:31:38,720 --> 00:31:41,440 Speaker 1: Yeah, that is a good question that I asked you. 577 00:31:43,360 --> 00:31:48,440 Speaker 1: I think overall, I think overall it's an amazing tool, 578 00:31:48,480 --> 00:31:51,160 Speaker 1: so I'm going to say good. But my caveat is 579 00:31:51,720 --> 00:31:55,000 Speaker 1: I don't think AI of itself is good or bad, 580 00:31:55,160 --> 00:31:58,400 Speaker 1: in the sense that it depends what we do with it.
See, 581 00:31:58,440 --> 00:32:01,640 Speaker 1: I know that I can give you a bunch of 582 00:32:01,680 --> 00:32:04,720 Speaker 1: tools and resources, and because I know you, you're 583 00:32:04,800 --> 00:32:06,560 Speaker 1: either not going to use them or 584 00:32:06,600 --> 00:32:09,080 Speaker 1: you're going to do something positive with them, right, because 585 00:32:09,120 --> 00:32:11,640 Speaker 1: I know you and I trust you, and I know 586 00:32:11,760 --> 00:32:15,520 Speaker 1: your values. But give those same tools and resources, like 587 00:32:15,600 --> 00:32:18,880 Speaker 1: your knowledge, right, you've got pretty expansive knowledge in 588 00:32:18,920 --> 00:32:22,320 Speaker 1: that space, somebody with your knowledge who doesn't have your 589 00:32:22,400 --> 00:32:26,160 Speaker 1: values or internal sat nav, they could do some horrible shit. 590 00:32:26,720 --> 00:32:29,640 Speaker 1: So it's not so much about the AI or the tools, 591 00:32:30,160 --> 00:32:34,480 Speaker 1: or the technical capacity of whatever, or the technological capacity. 592 00:32:34,880 --> 00:32:38,840 Speaker 1: It's about who's doing what with it. Like, I would gladly, 593 00:32:39,000 --> 00:32:41,240 Speaker 1: I would trust you with pretty much anything. I would 594 00:32:41,280 --> 00:32:44,320 Speaker 1: go, well, Patrick's doing it, so I'm fine. But there 595 00:32:44,320 --> 00:32:46,520 Speaker 1: are most people, or many people, that I wouldn't, you 596 00:32:46,520 --> 00:32:49,160 Speaker 1: know what I mean? So I still think it comes 597 00:32:49,200 --> 00:32:52,800 Speaker 1: back to the user. And as you said, nothing's for nothing. 598 00:32:52,920 --> 00:32:56,000 Speaker 1: So when people are letting you have this resource or 599 00:32:56,000 --> 00:32:59,280 Speaker 1: this tool or whatever for free, it ain't for free. 600 00:33:00,080 --> 00:33:02,600 Speaker 2: There's a tech company in the US called Envision, 601 00:33:03,040 --> 00:33:06,560 Speaker 2: and they work with kind of disability technology, and using 602 00:33:06,600 --> 00:33:09,400 Speaker 2: tech to help people with disabilities. And they've partnered with 603 00:33:09,640 --> 00:33:12,840 Speaker 2: a company that makes glasses called Solos, and they've just 604 00:33:12,920 --> 00:33:16,880 Speaker 2: launched a new camera-equipped pair called the Ally Solos Glasses. 605 00:33:17,280 --> 00:33:20,520 Speaker 2: And what they've made is a set of glasses 606 00:33:20,520 --> 00:33:22,480 Speaker 2: with cameras built into them for people who are vision 607 00:33:22,520 --> 00:33:26,280 Speaker 2: impaired and blind. And what they do is they can 608 00:33:26,680 --> 00:33:30,800 Speaker 2: translate text, they can describe the surroundings in real time, 609 00:33:30,960 --> 00:33:35,080 Speaker 2: search the web, recognize people, objects and signs, so if you're 610 00:33:35,160 --> 00:33:38,080 Speaker 2: vision impaired... and they have speakers built into the arms 611 00:33:38,200 --> 00:33:42,080 Speaker 2: of the glasses, so someone who's vision impaired can effectively 612 00:33:42,200 --> 00:33:44,680 Speaker 2: use these to decode the life around them.
You know, 613 00:33:44,720 --> 00:33:46,520 Speaker 2: they can be walking down the street, they can use 614 00:33:46,520 --> 00:33:49,440 Speaker 2: it as navigation: it's three meters away, but watch out, there's 615 00:33:49,440 --> 00:33:51,479 Speaker 2: a fire hydrant on the right hand side, you know, 616 00:33:51,560 --> 00:33:55,719 Speaker 2: to your right. It's literally describing the world around them, 617 00:33:55,800 --> 00:33:59,240 Speaker 2: and it recognizes, like, oh, there's Craig Harper, quick, change over, 618 00:33:59,360 --> 00:34:01,600 Speaker 2: go to the other side of the street... all 619 00:34:01,640 --> 00:34:07,719 Speaker 2: these tears, hurtful. But isn't that amazing? And so there's 620 00:34:07,880 --> 00:34:10,920 Speaker 2: the practical example of this tech. Now they've got a... 621 00:34:11,000 --> 00:34:14,640 Speaker 2: they're doing a limited release, US, about four hundred bucks 622 00:34:14,640 --> 00:34:17,000 Speaker 2: at the moment. They're going to end up being, I 623 00:34:17,000 --> 00:34:20,680 Speaker 2: think, seven hundred dollars US, and a couple of different 624 00:34:20,719 --> 00:34:23,320 Speaker 2: frame sizes and all that sort of stuff. But that 625 00:34:23,520 --> 00:34:27,440 Speaker 2: uses ChatGPT-4, and they use an AI assistant 626 00:34:27,840 --> 00:34:31,840 Speaker 2: called Ally, and it's just going to be a groundbreaker 627 00:34:31,880 --> 00:34:34,319 Speaker 2: for people who are vision impaired. And I think, more 628 00:34:34,360 --> 00:34:37,520 Speaker 2: so, I know people, you know, a friend of mine 629 00:34:37,640 --> 00:34:41,320 Speaker 2: who went blind progressively in his late teenage years, and 630 00:34:42,280 --> 00:34:44,960 Speaker 2: he's a real, uh, you know, he's a real petrol 631 00:34:45,000 --> 00:34:47,759 Speaker 2: head too. So it's amazing, he's got these... He's 632 00:34:47,800 --> 00:34:50,800 Speaker 2: got this amazing car that he's helping, or he's getting, 633 00:34:51,440 --> 00:34:54,680 Speaker 2: you know, fixed up, and it's looking phenomenal, but he 634 00:34:54,719 --> 00:34:58,680 Speaker 2: can't see the car, it's described to him. But to lose 635 00:34:58,680 --> 00:35:02,480 Speaker 2: your vision progressively... But then again, when I have interacted 636 00:35:02,480 --> 00:35:04,919 Speaker 2: with people who have been blind all their lives, they 637 00:35:04,960 --> 00:35:08,320 Speaker 2: say that they don't necessarily want to have smart glasses 638 00:35:08,360 --> 00:35:10,719 Speaker 2: that will describe the world around them, because they've grown 639 00:35:10,800 --> 00:35:14,400 Speaker 2: up in that world of having no vision. So I 640 00:35:14,400 --> 00:35:16,440 Speaker 2: guess it depends on the stage of life that you're 641 00:35:16,480 --> 00:35:18,279 Speaker 2: in whether these would be useful. But I think for 642 00:35:18,360 --> 00:35:21,680 Speaker 2: most people who could potentially navigate the world and be 643 00:35:21,760 --> 00:35:24,560 Speaker 2: more autonomous, that would be fantastic, wouldn't it? 644 00:35:25,239 --> 00:35:27,759 Speaker 1: Oh mate, that's at the very, very positive end of 645 00:35:27,840 --> 00:35:30,120 Speaker 1: the AI scale. Yeah, I'm with you. That's amazing. I 646 00:35:30,160 --> 00:35:32,359 Speaker 1: have a selfish question I want to ask you, which 647 00:35:32,400 --> 00:35:33,279 Speaker 1: is not on your list. 648 00:35:33,680 --> 00:35:33,960 Speaker 2: Yep.
649 00:35:35,520 --> 00:35:40,120 Speaker 1: So, my internet at the moment is shit, and every day, 650 00:35:40,280 --> 00:35:41,840 Speaker 1: like, I've been trying to get someone out here to 651 00:35:41,920 --> 00:35:44,800 Speaker 1: fix my internet, and every day, so I'm having to 652 00:35:45,080 --> 00:35:47,920 Speaker 1: hotspot from my phone. So my internet in my house 653 00:35:47,960 --> 00:35:51,960 Speaker 1: now is my phone. Like, right now, as we're recording, the 654 00:35:52,000 --> 00:35:57,839 Speaker 1: internet is coming from my phone. Starlink. Tell me what 655 00:35:57,960 --> 00:36:00,520 Speaker 1: Starlink is. I kind of know, but off the top of your head, 656 00:36:00,520 --> 00:36:01,359 Speaker 1: can you tell me what it is? 657 00:36:01,440 --> 00:36:02,080 Speaker 2: Do we like it? 658 00:36:02,120 --> 00:36:05,200 Speaker 1: Do we not like it? Firstly, just explain to people 659 00:36:05,239 --> 00:36:07,759 Speaker 1: what Starlink is, if you would, and then give us 660 00:36:07,800 --> 00:36:11,040 Speaker 1: the pros and cons, and you've only got three minutes, 661 00:36:11,080 --> 00:36:11,520 Speaker 1: not ten. 662 00:36:11,800 --> 00:36:17,360 Speaker 2: Go. Elon Musk launches his Starlink satellite network, low orbit 663 00:36:17,480 --> 00:36:21,440 Speaker 2: satellites, which means that your point of contact is directly 664 00:36:21,480 --> 00:36:23,839 Speaker 2: above you, but they're in low Earth orbit, which means 665 00:36:23,840 --> 00:36:28,040 Speaker 2: it's effectively created a whole network around the planet. 666 00:36:28,480 --> 00:36:32,600 Speaker 2: So potentially Starlink could allow you, if 667 00:36:32,680 --> 00:36:34,879 Speaker 2: you're in a camper trailer in the middle of Australia 668 00:36:34,920 --> 00:36:37,719 Speaker 2: and there's no phone signal, you'll have Starlink, because it's 669 00:36:37,760 --> 00:36:41,080 Speaker 2: low orbit satellites, so that's basically how you're getting your internet. And 670 00:36:41,120 --> 00:36:44,440 Speaker 2: it's really good as an alternative to the NBN's Sky Muster, 671 00:36:44,960 --> 00:36:48,759 Speaker 2: because Sky Muster is just a big-ass, like, satellite that 672 00:36:48,880 --> 00:36:52,279 Speaker 2: beams it down, whereas this is lots of little micro satellites. 673 00:36:52,320 --> 00:36:56,000 Speaker 2: That's how Starlink works. The only disadvantage, I guess, is 674 00:36:56,000 --> 00:36:58,920 Speaker 2: Starlink isn't so good in built up areas. If you've 675 00:36:58,960 --> 00:37:01,960 Speaker 2: got buildings around you, it doesn't work as well. And 676 00:37:02,000 --> 00:37:06,440 Speaker 2: also, you know, you should be on really good internet 677 00:37:06,520 --> 00:37:09,960 Speaker 2: through the NBN. So Starlink is totally independent of our 678 00:37:10,000 --> 00:37:14,040 Speaker 2: current terrestrial, I guess, network here. So when you deal 679 00:37:14,080 --> 00:37:17,600 Speaker 2: with Telstra, Optus and all the other local providers, you're 680 00:37:17,640 --> 00:37:21,480 Speaker 2: getting the NBN, the National Broadband Network, which is 681 00:37:21,520 --> 00:37:24,600 Speaker 2: the backbone of all the internet services that we get 682 00:37:24,640 --> 00:37:28,480 Speaker 2: in Australia. Starlink is totally separate. It's an independent company 683 00:37:28,600 --> 00:37:32,360 Speaker 2: and you're buying the internet through Elon Musk, effectively. And 684 00:37:32,400 --> 00:37:34,560 Speaker 2: that's right.
So it's good if you're in 685 00:37:34,600 --> 00:37:37,799 Speaker 2: wide open spaces, if you're in rural areas, but it's not 686 00:37:38,000 --> 00:37:41,080 Speaker 2: as good in city areas; in built up areas it 687 00:37:41,160 --> 00:37:43,640 Speaker 2: may not be as good for you. I certainly wouldn't 688 00:37:43,680 --> 00:37:46,839 Speaker 2: be looking at that as an alternative, particularly because at 689 00:37:46,840 --> 00:37:49,840 Speaker 2: the moment, in September, if people are listening to this 690 00:37:50,200 --> 00:37:53,960 Speaker 2: around about this time, the NBN is going 691 00:37:54,000 --> 00:37:58,120 Speaker 2: to increase all the speeds for home internet users, depending 692 00:37:58,160 --> 00:38:01,440 Speaker 2: on what sort of internet connection you've got at your house. Crago, 693 00:38:01,719 --> 00:38:03,799 Speaker 2: did that kind of make sense about what Starlink is? 694 00:38:03,800 --> 00:38:06,400 Speaker 1: No, that's great, that's great. I mean, that answers my question. 695 00:38:07,160 --> 00:38:08,839 Speaker 1: I knew you, I knew you wouldn't know the answer 696 00:38:08,920 --> 00:38:12,680 Speaker 1: to that. Tell me about the Chinese researchers who have 697 00:38:12,800 --> 00:38:16,520 Speaker 1: unveiled the world's largest-scale brain-like computer, called the 698 00:38:16,600 --> 00:38:17,440 Speaker 1: Darwin Monkey. 699 00:38:17,640 --> 00:38:20,319 Speaker 2: I love that they've called it the Darwin Monkey, don't you? 700 00:38:21,080 --> 00:38:22,200 Speaker 1: I love it. I love it. 701 00:38:22,600 --> 00:38:26,400 Speaker 2: So they've built a massive computer, but they've tried to 702 00:38:26,560 --> 00:38:31,120 Speaker 2: use what they're calling a brain-like interface, and 703 00:38:31,239 --> 00:38:35,160 Speaker 2: they call it a neuromorphic, brain-like computer. So they're trying 704 00:38:35,280 --> 00:38:39,279 Speaker 2: to simulate the neurons of the human brain, and they're 705 00:38:39,320 --> 00:38:42,160 Speaker 2: saying this has got two billion neurons, so this can 706 00:38:42,239 --> 00:38:45,600 Speaker 2: mimic the workings of... now, they're not saying a human being, 707 00:38:45,600 --> 00:38:47,560 Speaker 2: because they're nowhere near what a human being is, but 708 00:38:47,640 --> 00:38:51,520 Speaker 2: they reckon that they can mimic basically a 709 00:38:51,600 --> 00:38:55,160 Speaker 2: monkey's brain in the way that it works, and that's 710 00:38:55,320 --> 00:38:58,360 Speaker 2: why they've called it the Darwin Monkey, which I 711 00:38:58,360 --> 00:39:00,640 Speaker 2: thought was kind of cute. Nothing to do with up north, 712 00:39:01,160 --> 00:39:05,239 Speaker 2: we're talking a different Darwin, yep. But I thought that's 713 00:39:05,280 --> 00:39:08,880 Speaker 2: really interesting because, you know, computers traditionally work off a 714 00:39:08,920 --> 00:39:12,239 Speaker 2: totally different system of processing, whereas this is going to 715 00:39:12,280 --> 00:39:15,160 Speaker 2: try to mimic the functioning of 716 00:39:15,200 --> 00:39:17,360 Speaker 2: a brain. So that's what I thought was kind of 717 00:39:17,840 --> 00:39:20,000 Speaker 2: interesting about this.
You know, who knows, in the 718 00:39:20,040 --> 00:39:22,960 Speaker 2: future, if they could, you know, would you upload your 719 00:39:22,960 --> 00:39:25,600 Speaker 2: brain to a computer, Crago, if they could mimic all 720 00:39:25,600 --> 00:39:27,439 Speaker 2: the neurons? And would that really be you? 721 00:39:28,440 --> 00:39:34,000 Speaker 1: Ah, well, see, that's the philosophical, six thousand dollar, 722 00:39:34,480 --> 00:39:39,239 Speaker 1: spiritual question, isn't it? It's like, who knows? Who knows? But 723 00:39:39,680 --> 00:39:42,520 Speaker 1: I don't know. I don't think so. But that would 724 00:39:42,560 --> 00:39:45,040 Speaker 1: be interesting. But something I am interested in, which I've 725 00:39:45,080 --> 00:39:48,480 Speaker 1: thought about a lot, and you've written this question, I 726 00:39:48,480 --> 00:39:52,760 Speaker 1: didn't read the actual story, but humans keep building robots 727 00:39:52,800 --> 00:39:56,560 Speaker 1: that are shaped like us. So robots that are shaped 728 00:39:56,640 --> 00:39:59,440 Speaker 1: like humans, what's the point? And I've always thought this. 729 00:39:59,560 --> 00:40:03,720 Speaker 1: I've thought, like, to try to build a robot that's 730 00:40:03,880 --> 00:40:09,000 Speaker 1: bipedal and walks and is six foot tall, why 731 00:40:09,040 --> 00:40:12,879 Speaker 1: would you? I get it, because humans identify, go, it kind 732 00:40:12,880 --> 00:40:14,759 Speaker 1: of looks like me, and it can talk, and it's 733 00:40:14,760 --> 00:40:17,920 Speaker 1: got, in inverted commas, eyes and arms and legs. And 734 00:40:18,480 --> 00:40:21,080 Speaker 1: I get it from a psychology point of view, that 735 00:40:21,680 --> 00:40:24,600 Speaker 1: humans might relate better to something that looks like them. 736 00:40:24,800 --> 00:40:28,080 Speaker 1: But I would think, from a function and an engineering 737 00:40:28,120 --> 00:40:31,400 Speaker 1: point of view, it'd be much easier to have something 738 00:40:31,480 --> 00:40:38,120 Speaker 1: that's much more practical and functional that isn't shaped like 739 00:40:38,160 --> 00:40:40,560 Speaker 1: a human. Like, I don't know, a little box that 740 00:40:40,920 --> 00:40:44,160 Speaker 1: ain't going to fall over, or, you know what I mean? Oh, 741 00:40:44,239 --> 00:40:46,919 Speaker 1: a dog. I'd go four legs, I reckon that would 742 00:40:46,920 --> 00:40:49,000 Speaker 1: be a much better option if you're going to design 743 00:40:49,040 --> 00:40:51,880 Speaker 1: a robot to go up and down hills and navigate. 744 00:40:52,040 --> 00:40:55,319 Speaker 1: I mean, think about a wheelchair system that had four 745 00:40:55,400 --> 00:40:59,239 Speaker 1: legs as opposed to two legs. The balance that goes 746 00:40:59,280 --> 00:41:04,000 Speaker 1: into kind of coordinating bipedal as opposed to... well, they exist. Yeah, 747 00:41:04,040 --> 00:41:06,400 Speaker 1: but also, you think about the dog. You think about, 748 00:41:06,440 --> 00:41:09,560 Speaker 1: you know, that MIT motherfucker that they built. You know, 749 00:41:09,719 --> 00:41:12,040 Speaker 1: have you seen that, Tiff? Tiff, have a look at the 750 00:41:12,360 --> 00:41:18,160 Speaker 1: MIT, capital M-I-T, robot dog that they're using 751 00:41:18,239 --> 00:41:23,400 Speaker 1: for warfare. And yeah, these things are, as you suggest, 752 00:41:23,520 --> 00:41:26,399 Speaker 1: just the fact that they've got four legs, not two, 753 00:41:26,840 --> 00:41:28,760 Speaker 1: they're much more stable.
754 00:41:29,320 --> 00:41:34,239 Speaker 2: Cleo, Cleo, the robot dog. But look, it does make 755 00:41:34,480 --> 00:41:38,760 Speaker 2: you ponder this, because you've heard the term the uncanny valley, 756 00:41:38,840 --> 00:41:43,040 Speaker 2: have you? Yeah? Yeah. So the uncanny valley was coined 757 00:41:43,080 --> 00:41:46,960 Speaker 2: by a Japanese robotics researcher in the 758 00:41:47,000 --> 00:41:51,120 Speaker 2: nineteen seventies, and what he discovered was, the more you 759 00:41:51,239 --> 00:41:54,719 Speaker 2: make a robot look like a human being, it gets 760 00:41:54,760 --> 00:41:58,120 Speaker 2: to a point where the fakeness of it, so you 761 00:41:58,120 --> 00:42:00,680 Speaker 2: put eyelashes on it, you have a mouth that moves, 762 00:42:01,080 --> 00:42:04,120 Speaker 2: but you know instinctively that it's not a human being. 763 00:42:04,160 --> 00:42:06,560 Speaker 2: The mouth is moving, but, you know, it 764 00:42:06,640 --> 00:42:09,360 Speaker 2: doesn't have proper skin texture, or the 765 00:42:09,440 --> 00:42:12,439 Speaker 2: musculature isn't the same, and it gets to a point 766 00:42:12,440 --> 00:42:14,359 Speaker 2: where it starts to look creepy. So if you've got 767 00:42:14,400 --> 00:42:17,759 Speaker 2: a screen on a robot with a smiley emoji, then 768 00:42:17,920 --> 00:42:20,280 Speaker 2: I can deal with that. But if you've got lips 769 00:42:20,320 --> 00:42:24,360 Speaker 2: that are moving and fake skin, that's the uncanny valley. 770 00:42:24,360 --> 00:42:26,759 Speaker 2: That's where it looks that little bit creepy, because it's 771 00:42:26,800 --> 00:42:27,280 Speaker 2: not real. 772 00:42:28,320 --> 00:42:33,720 Speaker 1: Yeah, yeah, and we know that it's not real. 773 00:42:34,200 --> 00:42:37,680 Speaker 1: But another thing, which I know you know, and 774 00:42:37,719 --> 00:42:40,520 Speaker 1: I'm sure Tiff knows and our listeners know, but more 775 00:42:40,560 --> 00:42:47,040 Speaker 1: and more people are having relationships, in inverted commas, with 776 00:42:47,040 --> 00:42:50,200 Speaker 1: bots, you know, and people are marrying them. I 777 00:42:50,239 --> 00:42:57,160 Speaker 1: saw a Japanese dude marrying his Japanese bride, yeah, and 778 00:42:57,320 --> 00:42:59,880 Speaker 1: a whole bunch of people at the wedding, and I'm like, okay, 779 00:43:00,760 --> 00:43:03,799 Speaker 1: so yeah, I don't know how that's going to work. 780 00:43:04,160 --> 00:43:06,839 Speaker 2: The ABC had an article that I read a couple 781 00:43:06,880 --> 00:43:08,880 Speaker 2: of days ago, and you could probably find it pretty 782 00:43:08,920 --> 00:43:13,640 Speaker 2: easily, about a woman who has got some emotional problems 783 00:43:13,719 --> 00:43:16,839 Speaker 2: and she'd been in a relationship with a chatbot for 784 00:43:16,920 --> 00:43:20,319 Speaker 2: four years. And I think she was on the NDIS 785 00:43:20,360 --> 00:43:22,719 Speaker 2: and her carer talked to the chatbot as well, 786 00:43:22,840 --> 00:43:25,319 Speaker 2: so she had a conversation, but she felt she was 787 00:43:25,320 --> 00:43:29,080 Speaker 2: in an authentic relationship with the chatbot, and she lives 788 00:43:29,080 --> 00:43:31,120 Speaker 2: her daily life.
She gets up in the morning, starts 789 00:43:31,160 --> 00:43:34,239 Speaker 2: talking to her chatbot, and then when people come in 790 00:43:34,360 --> 00:43:37,440 Speaker 2: to visit her, they interact with the chatbot as well. 791 00:43:37,719 --> 00:43:40,040 Speaker 2: And there's no question, and I know we've spoken about 792 00:43:40,080 --> 00:43:43,400 Speaker 2: this many times, but developing a relationship with an AI, 793 00:43:43,640 --> 00:43:46,839 Speaker 2: with a chatbot or a robot, that happens. If 794 00:43:46,880 --> 00:43:51,920 Speaker 2: you finally do have an autonomous bipedal robot in your 795 00:43:51,960 --> 00:43:55,400 Speaker 2: home making you your soup for lunch, there's no question 796 00:43:55,440 --> 00:43:57,160 Speaker 2: that you would probably interact with it. I mean, God, the 797 00:43:57,160 --> 00:43:59,680 Speaker 2: amount that I talk to my dog, I can imagine 798 00:43:59,680 --> 00:44:01,520 Speaker 2: what it'd be like if I had a robot walking 799 00:44:01,560 --> 00:44:03,840 Speaker 2: around the house talking to me. I mean, Fritz doesn't 800 00:44:03,840 --> 00:44:06,000 Speaker 2: even talk back to me, you know, it's just a 801 00:44:06,040 --> 00:44:08,799 Speaker 2: monologue for me, which is really hard to do. 802 00:44:10,200 --> 00:44:12,839 Speaker 1: Oh yeah, I can imagine you must really struggle with that. 803 00:44:12,840 --> 00:44:15,520 Speaker 1: That poor dog. Does he ever jump under the fucking 804 00:44:15,600 --> 00:44:18,440 Speaker 1: couch and put a cushion over his head? 805 00:44:19,360 --> 00:44:24,279 Speaker 2: Wow. No, dogs are amazingly resilient. And of course, 806 00:44:24,320 --> 00:44:26,440 Speaker 2: don't forget, I'm the person who walks and feeds him 807 00:44:26,480 --> 00:44:28,440 Speaker 2: and plays with him, so, you know, he knows on 808 00:44:28,480 --> 00:44:29,120 Speaker 2: which side 809 00:44:28,960 --> 00:44:35,040 Speaker 1: his bread is buttered. What's, what's his vocabulary? How many 810 00:44:35,239 --> 00:44:38,000 Speaker 1: actual words? We're digressing, we'll get back on track. But 811 00:44:38,040 --> 00:44:41,520 Speaker 1: how many actual words do you reckon he understands? 812 00:44:42,160 --> 00:44:44,080 Speaker 2: That's a really good question. I've often thought about that, 813 00:44:44,120 --> 00:44:45,640 Speaker 2: because I was reading a book, and I can't think 814 00:44:45,640 --> 00:44:47,480 Speaker 2: of the name of it, a little while ago, where 815 00:44:47,880 --> 00:44:50,439 Speaker 2: the character in the book was determined to teach their 816 00:44:50,480 --> 00:44:53,520 Speaker 2: dog six hundred words. It was a very cute story, 817 00:44:53,760 --> 00:44:56,440 Speaker 2: and the dog was really smart. I don't know how 818 00:44:56,520 --> 00:44:59,359 Speaker 2: many words he knows. I think, Tiff, you could 819 00:44:59,360 --> 00:45:01,640 Speaker 2: probably answer this as well, having got a dog yourself. 820 00:45:01,719 --> 00:45:04,440 Speaker 2: Do you give Luna other names? Like, most pet 821 00:45:04,520 --> 00:45:06,680 Speaker 2: owners have about three or four names for their dogs. 822 00:45:07,440 --> 00:45:11,400 Speaker 2: She... Yeah, mine does too, yeah, mine does as well, so 823 00:45:11,400 --> 00:45:13,480 Speaker 2: I've got multiple names. But as far as what he 824 00:45:13,520 --> 00:45:17,160 Speaker 2: can understand, I mean, he certainly... What really staggers me,
825 00:45:17,360 --> 00:45:20,680 Speaker 2: and I remember a friend of mine, her father had 826 00:45:20,760 --> 00:45:24,400 Speaker 2: a sheep farm, and he was ill and had to 827 00:45:24,400 --> 00:45:26,880 Speaker 2: go to hospital, and they had a working dog that 828 00:45:26,920 --> 00:45:29,440 Speaker 2: would round up the sheep in the bottom paddock and 829 00:45:29,440 --> 00:45:32,319 Speaker 2: bring them up to the top paddock. And they were 830 00:45:32,360 --> 00:45:34,960 Speaker 2: trying to get the dog to bring the sheep up, 831 00:45:35,000 --> 00:45:36,920 Speaker 2: and they didn't know what the command was. They knew 832 00:45:36,960 --> 00:45:39,839 Speaker 2: the dog was really smart, but they couldn't work out 833 00:45:39,840 --> 00:45:42,080 Speaker 2: what the command was. And eventually they got a message 834 00:45:42,080 --> 00:45:44,480 Speaker 2: to the father and he said, oh, no, it's not 835 00:45:44,520 --> 00:45:47,040 Speaker 2: a verbal command. I just flick my eyes to the 836 00:45:47,120 --> 00:45:49,759 Speaker 2: left and the dog knows to go up the hill 837 00:45:50,000 --> 00:45:52,880 Speaker 2: and bring the sheep up. So that's how subtle the 838 00:45:52,880 --> 00:45:55,640 Speaker 2: interaction was. And I know with Fritz, I can just 839 00:45:55,680 --> 00:45:58,000 Speaker 2: point to the ground in a certain way, or put 840 00:45:58,040 --> 00:46:00,359 Speaker 2: my palm flat to the ground and he'll drop. And 841 00:46:00,360 --> 00:46:02,640 Speaker 2: then I turn my palm and he rolls over. So 842 00:46:02,760 --> 00:46:05,800 Speaker 2: those sorts of things. So when it comes to words, 843 00:46:06,560 --> 00:46:08,719 Speaker 2: he probably knows a few of them, but he certainly 844 00:46:09,200 --> 00:46:13,480 Speaker 2: knows actions and movements as well, and hand gestures. So 845 00:46:14,000 --> 00:46:15,719 Speaker 2: I guess, if you wanted to spend a lot of time... 846 00:46:15,760 --> 00:46:19,440 Speaker 2: Friends of mine just found these little electronic tiles at 847 00:46:19,480 --> 00:46:22,960 Speaker 2: the op shop, and they're buttons that you record 848 00:46:23,040 --> 00:46:24,879 Speaker 2: your voice onto. So it might be, I want 849 00:46:24,880 --> 00:46:27,520 Speaker 2: to go outside, and then you train your dog to 850 00:46:27,600 --> 00:46:30,440 Speaker 2: hit the button to send you the message: I want 851 00:46:30,480 --> 00:46:32,399 Speaker 2: to go outside, I want to go outside, I want 852 00:46:32,400 --> 00:46:34,480 Speaker 2: a treat, and you can train the dog to hit 853 00:46:34,520 --> 00:46:36,440 Speaker 2: the buttons for the different things that it wants you 854 00:46:36,480 --> 00:46:36,759 Speaker 2: to do. 855 00:46:38,120 --> 00:46:40,239 Speaker 1: That wouldn't be annoying at three a.m. at all. I 856 00:46:40,320 --> 00:46:51,279 Speaker 1: want a treat. Your phone is covered in germs. I'm like... like, 857 00:46:51,400 --> 00:46:54,239 Speaker 1: this is one of the stories, just quickly, before you 858 00:46:54,280 --> 00:46:56,040 Speaker 1: tell us about it. I was talking to a lady 859 00:46:56,080 --> 00:46:59,800 Speaker 1: who's a microbiologist and she's like, you know what creates 860 00:46:59,840 --> 00:47:04,480 Speaker 1: more sickness than all the germs on whatever? It's 861 00:47:04,920 --> 00:47:09,400 Speaker 1: the propensity that we now have to try to sanitize 862 00:47:09,440 --> 00:47:13,920 Speaker 1: the whole world.
And who was it, someone on the 863 00:47:13,920 --> 00:47:16,799 Speaker 1: show, they were talking about how we need germs, like, we 864 00:47:16,840 --> 00:47:19,640 Speaker 1: need germs. We need, you know, we need a level 865 00:47:19,719 --> 00:47:23,000 Speaker 1: of germs so that our immune system has to deal 866 00:47:23,080 --> 00:47:26,080 Speaker 1: and cope and adapt. And when we put ourselves and 867 00:47:26,160 --> 00:47:29,080 Speaker 1: our kids in a completely, or as close as we 868 00:47:29,120 --> 00:47:34,160 Speaker 1: can, sanitized environment, then when we do get exposed to something, 869 00:47:34,239 --> 00:47:37,560 Speaker 1: which inevitably we will, we have no resistance, like, we 870 00:47:37,680 --> 00:47:41,839 Speaker 1: have no immunity, because we're always nerfing the world. So 871 00:47:41,920 --> 00:47:44,959 Speaker 1: I reckon, let your phone be dirty. That's what I think. 872 00:47:45,360 --> 00:47:48,560 Speaker 2: Okay, well, we've had this conversation, and it's the pre 873 00:47:48,719 --> 00:47:53,520 Speaker 2: flush, or pre wipe and post wipe, phone use. So, 874 00:47:53,680 --> 00:47:55,839 Speaker 2: you know, when you go to the toilet, we all 875 00:47:55,880 --> 00:47:57,879 Speaker 2: know we take our phones to the toilet. Don't lie, 876 00:47:58,080 --> 00:48:01,680 Speaker 2: you know you do, right? My policy is the pre 877 00:48:01,800 --> 00:48:05,400 Speaker 2: wipe policy, so the second that I start wiping, the 878 00:48:05,440 --> 00:48:07,719 Speaker 2: phone doesn't get touched again until I've washed my hands. 879 00:48:09,040 --> 00:48:13,720 Speaker 2: So pre wipe, you can use your phone. Post wipe, 880 00:48:14,000 --> 00:48:16,160 Speaker 2: you can't touch your phone until you've washed your hands. 881 00:48:17,040 --> 00:48:20,000 Speaker 1: I saw a show on your favorite station, the ABC, 882 00:48:20,200 --> 00:48:22,560 Speaker 1: a couple of years ago, and it was talking about this, 883 00:48:22,640 --> 00:48:25,480 Speaker 1: and it was talking about, you know 884 00:48:25,560 --> 00:48:29,239 Speaker 1: what fecal matter is, it's bits of poo. They were 885 00:48:29,280 --> 00:48:32,839 Speaker 1: talking about, you know, what has poo on it? And 886 00:48:32,880 --> 00:48:35,280 Speaker 1: one of the things that's got a pretty liberal smattering 887 00:48:35,360 --> 00:48:40,280 Speaker 1: of poo on it is your toothbrush. Because, yeah, because 888 00:48:40,280 --> 00:48:42,520 Speaker 1: it's generally, or for a lot of people, it's within 889 00:48:42,560 --> 00:48:46,600 Speaker 1: a meter or two of the toilet. So a 890 00:48:46,640 --> 00:48:51,319 Speaker 1: pretty simple way to offset that is to, I don't know, 891 00:48:51,400 --> 00:48:54,120 Speaker 1: I don't, because I've never watched anyone else, let's not 892 00:48:54,200 --> 00:48:57,680 Speaker 1: go there, but I always shut the lid when I flush. 893 00:48:57,800 --> 00:49:00,799 Speaker 1: Do you shut the lid when you flush, Tiff? Yes? Yep. 894 00:49:00,840 --> 00:49:03,520 Speaker 1: Because if you don't, you kind of just spray shit, 895 00:49:03,719 --> 00:49:05,520 Speaker 1: literally, around the room. 896 00:49:05,960 --> 00:49:08,480 Speaker 2: Did I talk about the next level of paranoia? My 897 00:49:08,800 --> 00:49:12,520 Speaker 2: toothbrush sits in a cupboard, so I close the cupboard 898 00:49:12,680 --> 00:49:16,520 Speaker 2: and that way it's not exposed to any flushing potential.
Well, 899 00:49:16,560 --> 00:49:18,800 Speaker 2: the thing is, because I have a home-based office 900 00:49:19,120 --> 00:49:24,000 Speaker 2: and so my main bathroom does get used by staff, 901 00:49:24,000 --> 00:49:26,200 Speaker 2: but it's also the same bathroom that has my toothbrush 902 00:49:26,239 --> 00:49:29,320 Speaker 2: in it, so my toothbrush is definitely kept behind closed doors. 903 00:49:31,000 --> 00:49:32,960 Speaker 1: Yeah, you don't want me to come to your house, 904 00:49:33,000 --> 00:49:34,480 Speaker 1: because you know where I'm going to look and what 905 00:49:34,520 --> 00:49:38,279 Speaker 1: I'm going to do with it. I'm going to, I'm 906 00:49:38,320 --> 00:49:40,680 Speaker 1: just going to clean my nuts with your toothbrush and 907 00:49:40,719 --> 00:49:44,920 Speaker 1: then just pop it back in. You're welcome, 908 00:49:45,920 --> 00:49:48,960 Speaker 1: thanks for that. So, what are we doing, one more, champ? 909 00:49:49,040 --> 00:49:49,920 Speaker 1: One more, champ. 910 00:49:49,920 --> 00:49:52,520 Speaker 2: I was going to say, don't clean your... Like, yes, so 911 00:49:52,560 --> 00:49:55,319 Speaker 2: there are lots of microorganisms on your phone, but 912 00:49:55,480 --> 00:49:58,440 Speaker 2: be careful what you wipe the phone down with. 913 00:49:58,600 --> 00:50:03,400 Speaker 2: Anything above, like, an alcohol wipe over seventy percent is 914 00:50:03,440 --> 00:50:06,040 Speaker 2: going to damage the screen of your phone. So be 915 00:50:06,200 --> 00:50:09,359 Speaker 2: aware that what you clean your sunglasses with may not 916 00:50:09,400 --> 00:50:11,560 Speaker 2: necessarily be the right thing to clean your phone with, 917 00:50:11,920 --> 00:50:14,680 Speaker 2: because the screen is touch sensitive. 918 00:50:15,480 --> 00:50:17,759 Speaker 1: So a bit of old Windex, yes or no? 919 00:50:18,400 --> 00:50:20,280 Speaker 2: Well, that's a good question, a bit of old Windex. 920 00:50:20,280 --> 00:50:22,239 Speaker 2: I'd have to look at what the chemical content is. 921 00:50:22,480 --> 00:50:25,600 Speaker 2: Just YouTube it: can you use Windex to clean your phone? 922 00:50:25,600 --> 00:50:25,759 Speaker 3: Hey? 923 00:50:25,840 --> 00:50:27,400 Speaker 2: Tiff, can you look it up for us? Is that 924 00:50:27,440 --> 00:50:28,640 Speaker 2: all right? Maybe? 925 00:50:28,840 --> 00:50:32,640 Speaker 1: ChatGPT it, I reckon, that'd be, that'll 926 00:50:32,680 --> 00:50:35,960 Speaker 1: be quicker. No, really, fuck YouTube, it's not the nineties. 927 00:50:36,280 --> 00:50:40,279 Speaker 2: Hey, you know what, here you go, YouTube. Google says no, 928 00:50:40,440 --> 00:50:43,480 Speaker 2: you can't. Ah, see. No? Why not? 929 00:50:43,719 --> 00:50:48,840 Speaker 3: It contains ammonia, which can damage your anti-glare coating, discolouration. 930 00:50:50,120 --> 00:50:54,040 Speaker 3: Windex is a pretty, pretty shit chemical that way, you 931 00:50:54,000 --> 00:50:56,080 Speaker 1: know. It's pretty good, a hanky and a bit of spit, 932 00:50:56,640 --> 00:51:02,160 Speaker 1: that's what Mary... What are you on there? That was Patrick? 933 00:51:03,120 --> 00:51:04,640 Speaker 2: Did you want one more? Did you say? 934 00:51:04,800 --> 00:51:05,320 Speaker 3: All right. 935 00:51:05,480 --> 00:51:05,719 Speaker 2: Go on. 936 00:51:05,960 --> 00:51:06,480 Speaker 1: Be quick. 937 00:51:06,719 --> 00:51:10,239 Speaker 2: Okay.
Drivers in WA, right, wait for this, and this 938 00:51:10,280 --> 00:51:12,080 Speaker 2: is probably happening all over Australia, but this is an 939 00:51:12,160 --> 00:51:15,719 Speaker 2: article from Western Australia. They're hitting mute on annoying 940 00:51:15,880 --> 00:51:18,600 Speaker 2: car safety technology. And I've got friends who've done this 941 00:51:18,680 --> 00:51:21,439 Speaker 2: as well. They don't like lane assist, so they turn 942 00:51:21,480 --> 00:51:24,799 Speaker 2: off lane assist. They don't like the car warning them 943 00:51:24,800 --> 00:51:27,000 Speaker 2: that they're going too fast, so they turn it off. 944 00:51:27,200 --> 00:51:30,960 Speaker 2: And it seems like, this is a survey that was 945 00:51:30,960 --> 00:51:33,840 Speaker 2: done in Western Australia by AAMI, the insurance company, 946 00:51:34,080 --> 00:51:37,280 Speaker 2: and they said twenty three percent admitted to having turned 947 00:51:37,400 --> 00:51:41,160 Speaker 2: off or dialed down car safety features, and seventy two 948 00:51:41,239 --> 00:51:46,480 Speaker 2: percent say noises and lights are distracting. So they find 949 00:51:46,600 --> 00:51:48,920 Speaker 2: all those noises and lights, you know, when someone 950 00:51:48,960 --> 00:51:51,799 Speaker 2: overtakes you and the light blinks and a noise might 951 00:51:51,840 --> 00:51:54,280 Speaker 2: come up to say that there's someone in your blind spot, 952 00:51:54,440 --> 00:51:56,839 Speaker 2: most drivers, so seventy two percent of drivers, say 953 00:51:56,880 --> 00:51:59,600 Speaker 2: they didn't like all those noises in their cars, and 954 00:51:59,640 --> 00:52:01,279 Speaker 2: twenty three percent are turning them off. 955 00:52:02,120 --> 00:52:04,200 Speaker 1: I agree with them, but I don't know how to 956 00:52:04,239 --> 00:52:07,640 Speaker 1: turn anything off. My new car, my new car, 957 00:52:07,719 --> 00:52:10,759 Speaker 1: it's essentially a computer. It's like, I don't know, 958 00:52:11,440 --> 00:52:13,319 Speaker 1: if it can do one hundred things, I know how 959 00:52:13,360 --> 00:52:16,120 Speaker 1: to do four. I do not know how to use 960 00:52:16,160 --> 00:52:18,719 Speaker 1: that car other than drive it and park it and 961 00:52:19,760 --> 00:52:22,520 Speaker 1: lock it and unlock it. It just does so much 962 00:52:22,520 --> 00:52:23,960 Speaker 1: shit that's beyond me. 963 00:52:24,200 --> 00:52:25,920 Speaker 2: Next time I'm over, let's go for a drive and 964 00:52:25,960 --> 00:52:26,400 Speaker 2: maybe I can... 965 00:52:26,520 --> 00:52:29,440 Speaker 1: Ah, you'll love it. Yeah, you'll love it. 966 00:52:29,640 --> 00:52:31,919 Speaker 2: But you know what, there are companies going back 967 00:52:31,920 --> 00:52:35,960 Speaker 2: to knobs and dials. I had my car being repaired 968 00:52:35,960 --> 00:52:39,480 Speaker 2: recently because, unfortunately, I had an interaction, or an altercation, 969 00:52:39,560 --> 00:52:44,040 Speaker 2: with a kangaroo. The kangaroo survived, my car didn't. It 970 00:52:44,239 --> 00:52:46,360 Speaker 2: kind of bumped the side of the car. But I 971 00:52:46,400 --> 00:52:48,560 Speaker 2: was driving a courtesy car for a while and I 972 00:52:48,600 --> 00:52:51,239 Speaker 2: didn't realize how much I like all the features in 973 00:52:51,239 --> 00:52:53,719 Speaker 2: my car.
You know, the heads-up display. But what 974 00:52:53,760 --> 00:52:56,200 Speaker 2: I love about it is that I can reach down 975 00:52:56,239 --> 00:52:59,480 Speaker 2: and still physically hit a button to turn the temperature 976 00:52:59,520 --> 00:53:01,919 Speaker 2: up or down. There's a screen version, but it also 977 00:53:02,040 --> 00:53:06,320 Speaker 2: has analog versions as well, knobs to control the volume, 978 00:53:06,600 --> 00:53:09,400 Speaker 2: you know, change the tuning, to move things around. So 979 00:53:09,520 --> 00:53:12,680 Speaker 2: I kind of like that. It's got some tech features, 980 00:53:12,680 --> 00:53:14,879 Speaker 2: but it's also got a lot of analog features as well. 981 00:53:15,760 --> 00:53:18,200 Speaker 1: I had to do a gig in Ballarat yesterday and 982 00:53:18,239 --> 00:53:18,440 Speaker 1: I was... 983 00:53:19,280 --> 00:53:23,120 Speaker 2: Oh my god, he went right past my house. I'm 984 00:53:23,280 --> 00:53:25,440 Speaker 2: just shattered. I'm so shattered. 985 00:53:26,239 --> 00:53:28,520 Speaker 1: I thought about coming to see you, but I had 986 00:53:28,560 --> 00:53:33,160 Speaker 1: to be back in time for a thing. Anyway, 987 00:53:34,560 --> 00:53:37,040 Speaker 1: I did think of you as I saw the sign, 988 00:53:38,800 --> 00:53:42,120 Speaker 1: and I thought, I thought, no, self, don't tell him 989 00:53:42,120 --> 00:53:44,760 Speaker 1: you went to Ballarat. And then I fucked up, didn't I? Anyway, 990 00:53:46,760 --> 00:53:51,600 Speaker 1: I was leaving the RACV Goldfields, gold mine, 991 00:53:51,680 --> 00:53:55,520 Speaker 1: Goldfields, bloody resort. It's actually in Creswick, which 992 00:53:55,560 --> 00:54:00,840 Speaker 1: is right near Ballarat. Anyway, I'm like, Creswick, we thank you. 993 00:54:01,120 --> 00:54:03,680 Speaker 1: I was, I was five minutes out of the car 994 00:54:03,760 --> 00:54:07,040 Speaker 1: park and driving back, and if you don't keep your 995 00:54:07,040 --> 00:54:10,239 Speaker 1: eyes in a certain, like, where the car thinks your 996 00:54:10,280 --> 00:54:13,080 Speaker 1: eyes should be, it tells you to pull over and 997 00:54:13,120 --> 00:54:16,800 Speaker 1: have a rest. It's like, might be time for a coffee 998 00:54:16,800 --> 00:54:19,319 Speaker 1: and a rest. I'm like, how about you shut the 999 00:54:19,360 --> 00:54:24,080 Speaker 1: fuck up. Like, it was telling me at, you know, 1000 00:54:24,360 --> 00:54:27,440 Speaker 1: five o'clock in the afternoon, pull over, chill out, have a rest, 1001 00:54:27,480 --> 00:54:29,000 Speaker 1: have a coffee and then reboot. 1002 00:54:29,040 --> 00:54:31,800 Speaker 2: I'm like, no, bro, shush. I thought you were going to 1003 00:54:31,880 --> 00:54:33,839 Speaker 2: say that it said to you that your left 1004 00:54:33,840 --> 00:54:35,839 Speaker 2: eye's so beautiful, your right eye keeps trying to look 1005 00:54:35,880 --> 00:54:36,160 Speaker 2: at it. 1006 00:54:39,440 --> 00:54:42,879 Speaker 1: Fucking hell. Patrick, tell people about you, where they can 1007 00:54:42,920 --> 00:54:43,719 Speaker 1: find you, and...
1008 00:54:44,040 --> 00:54:47,000 Speaker 4: Really, if anybody wants to know anything about me, if you 1009 00:54:47,000 --> 00:54:50,880 Speaker 4: want to interact on any level, websitesnow dot com 1010 00:54:50,920 --> 00:54:53,880 Speaker 4: dot au, websitesnow dot com dot au, or you 1011 00:54:53,920 --> 00:54:55,560 Speaker 4: can go to Tai Chi at Home and do tai 1012 00:54:55,680 --> 00:54:58,080 Speaker 4: chi with me, because I feel like I've got a split personality. 1013 00:54:58,280 --> 00:55:01,239 Speaker 2: There's my tai chi me and then there's my tech, bor... 1014 00:55:01,440 --> 00:55:03,160 Speaker 2: not boring, it's kind of fun, because we get to 1015 00:55:03,160 --> 00:55:05,520 Speaker 2: do lots of fun stuff with clients. But yeah, so 1016 00:55:05,560 --> 00:55:08,000 Speaker 2: there's the Websites Now me and the Tai Chi me. 1017 00:55:09,960 --> 00:55:14,920 Speaker 1: Tech you, and so, yeah, prefrontal cortex, all logical thinking, 1018 00:55:15,080 --> 00:55:18,640 Speaker 1: and then the spiritual tai chi. Get out of 1019 00:55:18,680 --> 00:55:20,680 Speaker 1: your brain, out of your thoughts, out of your bloody 1020 00:55:20,800 --> 00:55:27,000 Speaker 1: cognitive whatever. Yeah, that is quite... it's pretty... We'll say 1021 00:55:27,000 --> 00:55:29,680 Speaker 1: goodbye off air, but Tiff, thank you. Thanks. 1022 00:55:29,800 --> 00:55:30,120 Speaker 3: Kids. 1023 00:55:30,800 --> 00:55:32,640 Speaker 1: Try to get your, try to get your computer working, 1024 00:55:32,680 --> 00:55:33,040 Speaker 1: will you? 1025 00:55:34,200 --> 00:55:37,040 Speaker 3: I wish Patrick could fix it. 1026 00:55:37,040 --> 00:55:39,160 Speaker 2: It's an excuse to come visit. I'll come visit and help 1027 00:55:39,160 --> 00:55:42,560 Speaker 1: you with your computer. Okay, all right, see everyone, bye.