1 00:00:00,320 --> 00:00:03,880 Speaker 1: Patrick... Yeah, Grego. What are you... intro the show? 2 00:00:04,400 --> 00:00:08,240 Speaker 2: Hey, welcome to the You Project with Craig Harper, Tiffany 3 00:00:08,240 --> 00:00:10,959 Speaker 2: Cook and special guest Patrick Bonello. 4 00:00:11,039 --> 00:00:16,640 Speaker 1: It's great to have you groovers here today. That's pretty good. Now, 5 00:00:16,680 --> 00:00:19,440 Speaker 1: that's good. Maybe if you could pop up, because sometimes 6 00:00:19,560 --> 00:00:23,160 Speaker 1: I can't be fucked saying that. So, well done, well done, 7 00:00:23,239 --> 00:00:25,319 Speaker 1: and hello from Fritz as well, who is sitting on 8 00:00:25,400 --> 00:00:25,840 Speaker 1: your lap. 9 00:00:26,440 --> 00:00:28,760 Speaker 2: We probably should screen grab this and place it somewhere. 10 00:00:28,800 --> 00:00:31,160 Speaker 2: He's pretty damn cute, isn't he? Well... 11 00:00:31,280 --> 00:00:34,160 Speaker 1: Yeah, and also your dog is somewhere else in the house. 12 00:00:37,400 --> 00:00:40,520 Speaker 1: Good morning, Tiff. Patrick, thank you for introing the show, 13 00:00:40,600 --> 00:00:42,640 Speaker 1: per my request. Tiff, how are you? 14 00:00:42,960 --> 00:00:44,840 Speaker 3: I'm very good, thanks. How are you? 15 00:00:45,360 --> 00:00:47,760 Speaker 1: I'm good. What's going on? You were a little bit Flatty McFlatster 16 00:00:47,880 --> 00:00:50,599 Speaker 1: earlier in the week. We're recording this Friday morning. Have 17 00:00:50,720 --> 00:00:53,400 Speaker 1: you, have you found your fucking rhythm, your mojo? 18 00:00:53,880 --> 00:00:55,520 Speaker 3: I have. You know when you just need to have 19 00:00:55,560 --> 00:00:58,639 Speaker 3: that one day of throwing it all away, and then, 20 00:00:58,920 --> 00:01:00,640 Speaker 3: and then you're good? I'm good. Good now, I am 21 00:01:00,680 --> 00:01:01,760 Speaker 3: really good, thank you. 22 00:01:02,440 --> 00:01:05,040 Speaker 1: Yeah, good, just checking in. Patrick, are you good? Do 23 00:01:05,120 --> 00:01:08,000 Speaker 1: you have flat days, mate? Do you ever? Because you, 24 00:01:08,000 --> 00:01:11,120 Speaker 1: you always present yourself, like I try to present myself, 25 00:01:11,200 --> 00:01:13,840 Speaker 1: as up and about, as is Tiff, I know. But do 26 00:01:13,880 --> 00:01:16,320 Speaker 1: you have flat days sometimes? 27 00:01:16,319 --> 00:01:19,640 Speaker 2: But I kind of... my friends accuse me of being 28 00:01:19,680 --> 00:01:22,320 Speaker 2: way too busy. I've got something on all the time. 29 00:01:22,680 --> 00:01:24,920 Speaker 2: Next week, I think I've got something on every night. 30 00:01:24,959 --> 00:01:27,479 Speaker 2: So I've got book group on Monday night. Tuesday, I'm 31 00:01:27,480 --> 00:01:30,960 Speaker 2: singing in a choir. Wednesday, I'm teaching tai chi. Thursday, 32 00:01:31,000 --> 00:01:33,399 Speaker 2: I'm teaching tai chi. And then Friday, I don't know 33 00:01:33,480 --> 00:01:35,839 Speaker 2: yet what I'm doing on Friday. Don't... like, they're both 34 00:01:35,920 --> 00:01:38,480 Speaker 2: laughing at me. You asked me a serious question, I 35 00:01:38,520 --> 00:01:41,199 Speaker 2: gave you a serious answer, and I'm just saying I'm busy. 36 00:01:41,240 --> 00:01:43,760 Speaker 2: I don't have time to think about other stuff. I 37 00:01:43,760 --> 00:01:47,080 Speaker 2: think if you fill your world with stuff, you don't 38 00:01:47,760 --> 00:01:48,840 Speaker 2: get bogged down.
39 00:01:49,600 --> 00:01:52,400 Speaker 1: And I just said, I don't think you really understand mental 40 00:01:52,440 --> 00:01:58,960 Speaker 1: health if you think busyness is the cure. Let me... 41 00:01:59,360 --> 00:02:03,000 Speaker 1: I said... I said to you... do you... hang on, 42 00:02:03,520 --> 00:02:05,720 Speaker 1: do you ever get a bit flat? You didn't even 43 00:02:05,760 --> 00:02:10,160 Speaker 1: fucking answer the question. You just told us about your calendar. 44 00:02:11,360 --> 00:02:13,920 Speaker 2: I think, like all of us, we do tend to 45 00:02:13,919 --> 00:02:15,680 Speaker 2: get flat. But what I've tried to do in my 46 00:02:15,760 --> 00:02:19,760 Speaker 2: life is put in mechanisms of joy, and things that 47 00:02:19,840 --> 00:02:22,200 Speaker 2: I really enjoy that can be uplifting. So, you know, 48 00:02:22,320 --> 00:02:25,320 Speaker 2: it's like running your own business. There's cash flow issues, 49 00:02:25,360 --> 00:02:27,760 Speaker 2: you know, sometimes some months are a bit lean. So 50 00:02:27,800 --> 00:02:30,000 Speaker 2: those issues come up all the time. And yes, you 51 00:02:30,040 --> 00:02:31,720 Speaker 2: can get a bit down when you look at the 52 00:02:31,760 --> 00:02:34,320 Speaker 2: bank balance, or something happens, someone... 53 00:02:34,040 --> 00:02:35,600 Speaker 1: Cuts you off in the... you know, whatever. 54 00:02:36,120 --> 00:02:39,560 Speaker 2: But I think that looking for mindfulness moments... and I 55 00:02:39,600 --> 00:02:42,520 Speaker 2: try to really actively search for that. So I look 56 00:02:42,560 --> 00:02:45,919 Speaker 2: at things, engagements. I catch up with friends, I walk 57 00:02:46,000 --> 00:02:47,600 Speaker 2: Fritz and meet up with friends at the park, 58 00:02:47,680 --> 00:02:49,680 Speaker 2: that sort of stuff. So I think we can be 59 00:02:49,760 --> 00:02:53,000 Speaker 2: proactive about trying not to be flat. You know, I 60 00:02:53,040 --> 00:02:55,440 Speaker 2: spent my teenage... if we're going to get deep here, 61 00:02:55,919 --> 00:02:59,320 Speaker 2: I spent my middle teenage years and later teenage years 62 00:02:59,360 --> 00:03:01,920 Speaker 2: self harming and going through a really dark place. 63 00:03:02,560 --> 00:03:06,800 Speaker 1: And so for me, I started to understand 64 00:03:06,320 --> 00:03:09,799 Speaker 2: that, you know, at an early age, that those engagements 65 00:03:09,800 --> 00:03:12,880 Speaker 2: and the people that I connected with really helped get 66 00:03:12,880 --> 00:03:15,560 Speaker 2: me out of those dark places. So I think that 67 00:03:15,600 --> 00:03:19,160 Speaker 2: I've done that as an adult successfully. And I think 68 00:03:19,240 --> 00:03:21,560 Speaker 2: part of it also, it's not so much fooling yourself. 69 00:03:21,560 --> 00:03:23,040 Speaker 2: And you said, oh, you know, you always put on 70 00:03:23,080 --> 00:03:26,680 Speaker 2: a cheery attitude. But I think you can, you 71 00:03:26,760 --> 00:03:28,880 Speaker 2: can alter your mindset. 72 00:03:29,160 --> 00:03:29,360 Speaker 1: You know. 73 00:03:29,600 --> 00:03:32,880 Speaker 2: Part of the journey of being, say, a really good athlete 74 00:03:32,960 --> 00:03:35,600 Speaker 2: is convincing yourself you're a good athlete.
Part of the 75 00:03:35,680 --> 00:03:37,680 Speaker 2: journey is, if, you know, when you step in the ring, 76 00:03:38,040 --> 00:03:40,880 Speaker 2: the confidence that you've got. If you believe in yourself 77 00:03:41,080 --> 00:03:43,880 Speaker 2: first and foremost, then that's going to give you that 78 00:03:43,960 --> 00:03:47,440 Speaker 2: little edge, that extra five percent, that means that you're 79 00:03:47,480 --> 00:03:48,920 Speaker 2: going to get where you need to go. And I 80 00:03:48,920 --> 00:03:51,320 Speaker 2: think that's right in life. I think if we look 81 00:03:51,400 --> 00:03:54,120 Speaker 2: for those positive engagements... You know, I walk down the 82 00:03:54,200 --> 00:03:57,560 Speaker 2: street in our little country town and I might bump 83 00:03:57,600 --> 00:03:59,840 Speaker 2: into three or four people, and it's great, you know. 84 00:04:00,280 --> 00:04:03,080 Speaker 2: But at the hardware store, you know, the 85 00:04:03,920 --> 00:04:07,320 Speaker 2: people at the post office. Just seeing people around and engaging. 86 00:04:07,520 --> 00:04:08,600 Speaker 2: I mean, you're really social, 87 00:04:09,040 --> 00:04:09,400 Speaker 1: Harps. 88 00:04:09,440 --> 00:04:11,600 Speaker 2: You go down the street and you have your coffee 89 00:04:11,640 --> 00:04:15,440 Speaker 2: across at the cafe. It does make you feel better 90 00:04:15,760 --> 00:04:17,040 Speaker 2: and feel good about yourself. 91 00:04:17,640 --> 00:04:19,520 Speaker 1: No, I tell them to fuck off. I'm like, stop 92 00:04:19,560 --> 00:04:22,880 Speaker 1: talking to me. Fuck off. No, I'm not answering your questions. 93 00:04:22,960 --> 00:04:25,200 Speaker 1: Fuck off. No, I'm not like that at all. I 94 00:04:25,240 --> 00:04:28,960 Speaker 1: love it. I'd be exactly like you. But I 95 00:04:29,000 --> 00:04:31,000 Speaker 1: did laugh when you said, you know, in the traffic 96 00:04:31,120 --> 00:04:34,440 Speaker 1: in Milan there's fucking seven cars. So don't, don't talk 97 00:04:34,480 --> 00:04:40,599 Speaker 1: about the traffic in Milan. We don't even have a stop... Yeah. Yeah. Well, 98 00:04:41,360 --> 00:04:43,840 Speaker 1: I mean, you did talk... you've spoken to us before 99 00:04:43,880 --> 00:04:47,720 Speaker 1: about growing up and trying to figure out sexuality and stuff. 100 00:04:47,839 --> 00:04:51,200 Speaker 1: Was that part of the being a bit emotionally lost 101 00:04:51,200 --> 00:04:53,599 Speaker 1: and in the wilderness for you, or was that a 102 00:04:53,680 --> 00:04:54,320 Speaker 1: separate thing? 103 00:04:54,880 --> 00:04:57,839 Speaker 2: Oh, it was. I don't think it's much 104 00:04:57,880 --> 00:05:00,480 Speaker 2: different than most teenagers, you know. I think you lose 105 00:05:00,520 --> 00:05:03,320 Speaker 2: a lot of confidence and you look towards peer pressure 106 00:05:03,360 --> 00:05:07,320 Speaker 2: for validation, which takes us to the whole online thing 107 00:05:07,360 --> 00:05:09,760 Speaker 2: we'll chat about later. But the reality of it is, 108 00:05:10,200 --> 00:05:12,039 Speaker 2: you know, when you're growing up and you get to 109 00:05:12,080 --> 00:05:14,800 Speaker 2: those teenage years and your brain chemistry starts to alter, 110 00:05:15,480 --> 00:05:18,560 Speaker 2: you do look for peer approval, I think, for most 111 00:05:18,640 --> 00:05:22,000 Speaker 2: of us, and if you don't fit in, that can 112 00:05:22,040 --> 00:05:26,240 Speaker 2: be a spiraling experience for you.
So unless you put on 113 00:05:26,279 --> 00:05:28,279 Speaker 2: your desert boots and decide to go for a jog 114 00:05:28,320 --> 00:05:31,000 Speaker 2: around the park, because you're a bit of... you want 115 00:05:31,000 --> 00:05:31,839 Speaker 2: to change... 116 00:05:35,120 --> 00:05:39,640 Speaker 1: A good memory. Yeah, I was... I was just... yep, 117 00:05:39,720 --> 00:05:41,600 Speaker 1: I had a shit experience and then I went for 118 00:05:41,720 --> 00:05:43,960 Speaker 1: a run in desert boots, because I didn't fucking know what I 119 00:05:44,000 --> 00:05:48,719 Speaker 1: was doing. Let's talk about AI and tech and all 120 00:05:48,720 --> 00:05:51,440 Speaker 1: of the Patrick things. I'm going to let you lead 121 00:05:51,480 --> 00:05:53,480 Speaker 1: it today. Well, I just want to... I want to 122 00:05:53,560 --> 00:05:56,359 Speaker 1: lead one, then I want you to take over the reins. 123 00:05:57,520 --> 00:05:59,839 Speaker 1: Obviously you sent us a bunch of topics, and 124 00:06:00,040 --> 00:06:01,920 Speaker 1: also I want to know about sex in space. 125 00:06:02,160 --> 00:06:05,440 Speaker 2: I knew, I knew you would look at that straight away. Well, 126 00:06:05,440 --> 00:06:08,600 Speaker 2: this is a funny thing. We've been, or humanity has 127 00:06:08,640 --> 00:06:12,040 Speaker 2: been, going into space for a long time, but everyone 128 00:06:12,120 --> 00:06:15,240 Speaker 2: is a bit mum about whether or not people have 129 00:06:15,400 --> 00:06:19,320 Speaker 2: sex in space. International Space Station, you know, that sort 130 00:06:19,320 --> 00:06:23,479 Speaker 2: of stuff. Intergalactic copulation. Well, this is broader 131 00:06:23,520 --> 00:06:26,440 Speaker 2: research, a broader, I guess, study into, a 132 00:06:26,520 --> 00:06:29,680 Speaker 2: look into, the feasibility of long term space travel. So 133 00:06:29,720 --> 00:06:31,800 Speaker 2: if we're going to Mars and it's going to take 134 00:06:31,839 --> 00:06:34,000 Speaker 2: months and months and months, or years, to get there, 135 00:06:34,279 --> 00:06:36,640 Speaker 2: then what happens with the viability of being able to 136 00:06:37,040 --> 00:06:39,479 Speaker 2: have sex? And then there's the whole thing of, like, 137 00:06:39,520 --> 00:06:41,920 Speaker 2: wouldn't it be great to be floating around and have sex? Well, 138 00:06:41,920 --> 00:06:45,719 Speaker 2: actually, it would be pretty tough, because, you know, you 139 00:06:45,800 --> 00:06:48,200 Speaker 2: can't be banging into things, because you'd just bounce away. 140 00:06:48,960 --> 00:06:53,520 Speaker 2: So you're like... that's what sex is. Yeah, try it 141 00:06:53,600 --> 00:06:56,480 Speaker 2: on the International Space Station. It's a whole new experience. 142 00:06:56,720 --> 00:07:00,560 Speaker 1: I feel like you should take some occy straps with you. 143 00:07:00,279 --> 00:07:02,720 Speaker 2: And gaffer tape, because they use a lot of gaffer tape 144 00:07:02,720 --> 00:07:03,560 Speaker 2: on the ISS. 145 00:07:03,880 --> 00:07:03,920 Speaker 1: No. 146 00:07:04,240 --> 00:07:07,800 Speaker 2: Look, so some brains have been thinking about this, and 147 00:07:08,240 --> 00:07:11,680 Speaker 2: the question has always been, have astronauts actually ever done 148 00:07:11,720 --> 00:07:15,720 Speaker 2: it in space? And NASA won't say a thing, and 149 00:07:15,840 --> 00:07:19,320 Speaker 2: astronauts tend to be really coy about it.
Your body 150 00:07:19,440 --> 00:07:24,520 Speaker 2: changes when you're in low gravity, so fluid distribution actually changes, 151 00:07:24,560 --> 00:07:26,320 Speaker 2: and the way your body... because you've got no gravity, 152 00:07:27,160 --> 00:07:30,960 Speaker 2: I guess, to pull the body fluids in different directions. 153 00:07:31,080 --> 00:07:33,240 Speaker 2: I mean, it does open up a whole experience. Well, 154 00:07:33,320 --> 00:07:35,560 Speaker 2: think about the two astronauts up there at the moment 155 00:07:35,600 --> 00:07:37,360 Speaker 2: that went there for eight days and now have been 156 00:07:37,360 --> 00:07:41,320 Speaker 2: stuck there for eight months. I mean, let's face it, 157 00:07:41,480 --> 00:07:43,840 Speaker 2: there's got to be some relief somewhere. I mean, we're 158 00:07:43,880 --> 00:07:46,920 Speaker 2: all human beings. But I don't know, what do you reckon? 159 00:07:46,960 --> 00:07:49,000 Speaker 2: Would you like to be floating around in space and 160 00:07:49,200 --> 00:07:51,800 Speaker 2: doing a bit of, you know, playing around at the 161 00:07:51,800 --> 00:07:53,760 Speaker 2: same time? It seems like something... 162 00:07:53,440 --> 00:07:56,120 Speaker 1: You haven't read a lot of sci fi, have you, Grego? No. 163 00:07:56,320 --> 00:08:00,600 Speaker 1: But you know what, I'm asking ChatGPT: does, does 164 00:08:00,640 --> 00:08:08,160 Speaker 1: the viscosity... does the viscosity of body fluids change in space? 165 00:08:09,240 --> 00:08:12,800 Speaker 1: Tiff, that's where he went. No, no, no, because you 166 00:08:13,080 --> 00:08:16,640 Speaker 1: were talking about... Yes: the viscosity of body fluids changes 167 00:08:16,680 --> 00:08:19,520 Speaker 1: in space due to the effects of microgravity on fluid. 168 00:08:19,720 --> 00:08:24,200 Speaker 1: Because what I'm talking about is, obviously, sperm is a fluid, 169 00:08:24,280 --> 00:08:27,400 Speaker 1: blood is a fluid, saliva is a fluid. Look at you, 170 00:08:27,600 --> 00:08:33,199 Speaker 1: Patrick. Tiff's making a face as well. What's wrong with 171 00:08:33,240 --> 00:08:35,960 Speaker 1: you two? If you two didn't have fluids, you'd both 172 00:08:36,000 --> 00:08:40,680 Speaker 1: be fucked. So embrace your fluids, Patrick. Embrace your sperm 173 00:08:41,080 --> 00:08:44,760 Speaker 1: and your blood and your saliva and your wee, all 174 00:08:44,800 --> 00:08:48,680 Speaker 1: of it. Yeah, because I think that would... I think 175 00:08:48,720 --> 00:08:52,720 Speaker 1: that would change things in terms of... 176 00:08:52,600 --> 00:08:56,160 Speaker 2: On all levels of viscosity we're talking about here. Well, evidently the 177 00:08:56,160 --> 00:09:00,760 Speaker 2: only person, after this little research study and questioning of astronauts, 178 00:09:00,960 --> 00:09:06,240 Speaker 2: a cosmonaut, came back and said, of course, you have 179 00:09:06,320 --> 00:09:09,840 Speaker 2: sex by hand. So I, I... 180 00:09:09,840 --> 00:09:12,280 Speaker 1: I don't think jacking off on the moon is the same. 181 00:09:13,160 --> 00:09:15,560 Speaker 1: Well, you know... and then when it comes to... 182 00:09:15,520 --> 00:09:20,080 Speaker 2: Oh dear. I just... yeah, anyway. It just occurred to 183 00:09:20,120 --> 00:09:23,679 Speaker 2: me that, you know, the way that, you know, if 184 00:09:23,679 --> 00:09:25,720 Speaker 2: you're floating and all of a sudden... No, don't worry. 185 00:09:25,880 --> 00:09:28,040 Speaker 1: You lost your train of thought?
Have you lost your 186 00:09:28,040 --> 00:09:28,679 Speaker 1: train of thought? 187 00:09:28,800 --> 00:09:30,800 Speaker 2: Got really visual there for a moment. I think we should 188 00:09:30,840 --> 00:09:31,680 Speaker 2: go there another time. 189 00:09:33,000 --> 00:09:35,640 Speaker 1: So, but the answer is... but you do make a 190 00:09:35,720 --> 00:09:39,160 Speaker 1: valid point, because... and I think it takes, doesn't 191 00:09:39,160 --> 00:09:41,480 Speaker 1: it take, like, ten years to get to Mars or something? 192 00:09:41,520 --> 00:09:44,040 Speaker 1: But I think less, because they have a different win... 193 00:09:44,080 --> 00:09:44,760 Speaker 1: they have a window. 194 00:09:44,880 --> 00:09:48,479 Speaker 2: So it's when the planetary alignment's right to get Mars 195 00:09:48,280 --> 00:09:52,319 Speaker 1: closer. Because it's years, right? It's not... Yeah, absolutely. Yeah. 196 00:09:52,360 --> 00:09:52,640 Speaker 3: Yeah. 197 00:09:52,679 --> 00:09:54,600 Speaker 1: So then, let's say you put one hundred people 198 00:09:54,640 --> 00:10:00,400 Speaker 1: on Mars. Well, for the proliferation of the population 199 00:10:00,600 --> 00:10:03,080 Speaker 1: on Mars, you can't just keep topping it up with 200 00:10:03,160 --> 00:10:06,600 Speaker 1: people from Earth. I guess you need to produce a 201 00:10:06,120 --> 00:10:07,439 Speaker 1: few new ones. 202 00:10:08,240 --> 00:10:12,160 Speaker 2: But would you want to have children on Mars? I 203 00:10:12,200 --> 00:10:13,200 Speaker 2: don't think quite yet. 204 00:10:14,440 --> 00:10:20,240 Speaker 1: I feel like... day care... oh, fuck. And imagine... Yeah, 205 00:10:20,640 --> 00:10:22,920 Speaker 1: I'll tell you what, you wouldn't have any traffic problems 206 00:10:23,000 --> 00:10:26,120 Speaker 1: up there, Patrick. But Milan would be busy compared to Mars. 207 00:10:26,480 --> 00:10:30,319 Speaker 2: Three years a go, just on average. Depending, again, where 208 00:10:30,480 --> 00:10:32,800 Speaker 2: the planets are aligned, around about three years to get 209 00:10:32,800 --> 00:10:33,079 Speaker 2: to Mars. 210 00:10:33,320 --> 00:10:35,800 Speaker 1: And nobody wants to know this except me, but what's 211 00:10:35,840 --> 00:10:40,480 Speaker 1: the gravity on Mars relative to... what's the gravity situation 212 00:10:40,559 --> 00:10:41,520 Speaker 1: relative to Earth? 213 00:10:42,160 --> 00:10:47,160 Speaker 2: It's, it's small. It's less, but it's still a 214 00:10:47,200 --> 00:10:48,320 Speaker 2: reasonable amount of gravity. 215 00:10:48,400 --> 00:10:50,400 Speaker 1: I don't know exactly what it is. But look, so 216 00:10:50,760 --> 00:10:53,600 Speaker 1: what that means is, if there's less gravity, that 217 00:10:53,679 --> 00:10:56,920 Speaker 1: means, relatively... I could fuck this up, but I think 218 00:10:56,960 --> 00:11:01,320 Speaker 1: it means we're "lighter", in inverted commas, on Mars. So I 219 00:11:01,400 --> 00:11:06,640 Speaker 1: might only be about thirty five kilos on Mars. I 220 00:11:06,760 --> 00:11:08,959 Speaker 1: might be able to run at about eight hundred miles 221 00:11:09,000 --> 00:11:13,600 Speaker 1: an hour. Yeah, I don't think so.
But the problem 222 00:11:13,640 --> 00:11:16,120 Speaker 1: with all that anti gravity stuff is, the moment that 223 00:11:16,160 --> 00:11:19,360 Speaker 1: you get in zero gravity, your body starts to fall 224 00:11:19,400 --> 00:11:22,000 Speaker 1: apart, because you don't need muscle and strength and bone 225 00:11:22,040 --> 00:11:25,640 Speaker 1: density, and all that shit starts to fall away. People 226 00:11:25,640 --> 00:11:29,200 Speaker 1: who come back from long stints in space, their body 227 00:11:29,280 --> 00:11:31,040 Speaker 1: has aged horribly. 228 00:11:32,080 --> 00:11:36,720 Speaker 2: Yeah, exactly. Evidently, if you weighed one hundred pounds, 229 00:11:36,760 --> 00:11:39,320 Speaker 2: and I'm using pounds as the example, it 230 00:11:39,360 --> 00:11:42,559 Speaker 2: would be thirty eight pounds on Mars. So it's about a third, right? 231 00:11:42,880 --> 00:11:46,240 Speaker 1: Yeah, my poos weigh a hundred pounds some mornings, depending 232 00:11:46,240 --> 00:11:50,440 Speaker 1: on what I've had the day before. So maybe not quite. 233 00:11:50,920 --> 00:11:55,920 Speaker 1: If you haven't already turned off yet... to those three remaining listeners... 234 00:11:56,480 --> 00:12:00,360 Speaker 1: All right, can you talk about something other than pooing 235 00:12:00,360 --> 00:12:02,480 Speaker 1: and procreating in space? That could be the 236 00:12:02,480 --> 00:12:05,000 Speaker 1: title of the show: Pooing and Procreating in Space. 237 00:12:05,400 --> 00:12:07,079 Speaker 2: I feel like I need to do a positive story, 238 00:12:07,080 --> 00:12:08,800 Speaker 2: because the stuff that I was going to talk about 239 00:12:08,880 --> 00:12:12,080 Speaker 2: was really kind of a bit of a downer. Well, 240 00:12:12,840 --> 00:12:15,040 Speaker 2: I know... we don't... we'll talk about this a little bit. 241 00:12:15,200 --> 00:12:18,040 Speaker 2: And do you ever... did... you've 242 00:12:17,760 --> 00:12:21,800 Speaker 1: watched Star Wars? Obviously. Yeah, I actually quite like Star Wars. 243 00:12:21,840 --> 00:12:24,800 Speaker 1: I also don't mind a bit of Star Trek. Okay, 244 00:12:24,880 --> 00:12:25,720 Speaker 1: so, A New Hope. 245 00:12:25,760 --> 00:12:28,080 Speaker 2: Do you remember when they're on the Millennium Falcon and 246 00:12:28,440 --> 00:12:31,280 Speaker 2: they're playing the chess game, and they've got the little 247 00:12:31,320 --> 00:12:34,640 Speaker 2: holographic chess pieces moving around, you know, when Luke Skywalker's playing? 248 00:12:35,120 --> 00:12:37,800 Speaker 1: Don't be ridiculous, you don't remember that. Do you remember that, Tiff? 249 00:12:38,320 --> 00:12:40,920 Speaker 1: I mean, Star Wars was nineteen seventy seven. 250 00:12:41,480 --> 00:12:43,679 Speaker 2: But everybody's seen it. I'm trying to get a term 251 00:12:43,679 --> 00:12:47,760 Speaker 2: of reference. The NBA in the United States has 252 00:12:48,080 --> 00:12:50,840 Speaker 2: teamed up with Apple, and what they've done, this is 253 00:12:50,920 --> 00:12:53,160 Speaker 2: kind of cool, it's really interesting.
If you're a big 254 00:12:53,200 --> 00:12:56,560 Speaker 2: sports fan, imagine that you're sitting in your lounge room 255 00:12:56,840 --> 00:12:59,800 Speaker 2: and you're watching a live sporting event, and then you 256 00:12:59,880 --> 00:13:03,360 Speaker 2: have a holographic representation of the footy or the tennis 257 00:13:03,480 --> 00:13:06,640 Speaker 2: or the rugby or whatever, and all the players are 258 00:13:06,640 --> 00:13:09,520 Speaker 2: in miniature, running around the field, and you can kind 259 00:13:09,520 --> 00:13:11,560 Speaker 2: of walk around your living room table and see all 260 00:13:11,640 --> 00:13:14,600 Speaker 2: the action as it plays out on your dining room 261 00:13:14,600 --> 00:13:15,839 Speaker 2: table or your living room table. 262 00:13:16,240 --> 00:13:20,760 Speaker 1: Wouldn't that be kind of cool? I don't know, that 263 00:13:20,840 --> 00:13:21,760 Speaker 1: would be weird. 264 00:13:22,440 --> 00:13:25,480 Speaker 2: I guess it'd be god view, wouldn't it? Looking down on 265 00:13:25,480 --> 00:13:27,239 Speaker 2: the... Wouldn't that be great? 266 00:13:27,240 --> 00:13:31,720 Speaker 1: It sounds... I've always wanted to feel omnipotent. That could 267 00:13:31,720 --> 00:13:34,040 Speaker 1: be my best shot, I think. 268 00:13:34,040 --> 00:13:36,640 Speaker 2: For a sporting fan, though, because suddenly... because when you think 269 00:13:36,640 --> 00:13:40,320 Speaker 2: about it, when you watch sport, you're, I guess, beholden 270 00:13:40,440 --> 00:13:43,280 Speaker 2: to the camera people, to the directors. They show the 271 00:13:43,320 --> 00:13:45,640 Speaker 2: scenes that you're looking at at any given time. But 272 00:13:45,720 --> 00:13:50,640 Speaker 2: if you see the entire match played out holographically in 273 00:13:50,679 --> 00:13:52,880 Speaker 2: front of you, then you can just walk around and 274 00:13:52,920 --> 00:13:55,560 Speaker 2: see all your players playing the game, all the action. 275 00:13:55,760 --> 00:13:58,240 Speaker 2: And this is real. Now, they're talking about doing this 276 00:13:58,559 --> 00:14:02,200 Speaker 2: using, at this stage, the Vision Pro headset. That's 277 00:14:02,200 --> 00:14:06,480 Speaker 2: the Apple headset. But it would be a 278 00:14:06,520 --> 00:14:09,280 Speaker 2: really great way to... I mean, what about a boxing match, Tiff? 279 00:14:09,960 --> 00:14:11,760 Speaker 1: You know, if you're, if you're able to 280 00:14:11,760 --> 00:14:14,160 Speaker 2: walk around the ring, the ring's there in front of you, 281 00:14:14,200 --> 00:14:16,720 Speaker 2: and you can move around it and walk around it 282 00:14:16,760 --> 00:14:19,520 Speaker 2: and see the, see the boxers from different directions. 283 00:14:19,600 --> 00:14:23,400 Speaker 1: Wouldn't that be cool? There's already a version of that. 284 00:14:23,600 --> 00:14:25,760 Speaker 1: But I think Tiff would rather just punch a human 285 00:14:25,760 --> 00:14:30,240 Speaker 1: in the face. Yeah. Okay, goggles... just give me 286 00:14:30,280 --> 00:14:35,520 Speaker 1: a... just give me someone to punch. Speaking of Apple, 287 00:14:36,360 --> 00:14:38,160 Speaker 1: I don't know if you saw this. I meant to 288 00:14:38,200 --> 00:14:41,680 Speaker 1: watch it and then I forgot.
Last night on the 289 00:14:41,720 --> 00:14:44,120 Speaker 1: news, they were talking about... Apple's brought out a new 290 00:14:44,160 --> 00:14:47,600 Speaker 1: smartphone which is, like, about half the price. Like their... 291 00:14:47,720 --> 00:14:50,520 Speaker 1: you know what they're up to. I know you're 292 00:14:50,720 --> 00:14:54,280 Speaker 1: an Android person, but you probably didn't see it. Did 293 00:14:54,280 --> 00:14:58,440 Speaker 1: you see that, Tiff? Anyway. Okay, so apparently Apple have 294 00:14:58,520 --> 00:15:03,120 Speaker 1: brought out a new iPhone which has got, like, everything 295 00:15:03,160 --> 00:15:06,360 Speaker 1: you need but not too many of the bells and whistles. 296 00:15:06,400 --> 00:15:09,360 Speaker 1: But I think it's about half the price, which, I 297 00:15:09,360 --> 00:15:10,920 Speaker 1: mean, a lot of the providers have been doing that. 298 00:15:10,960 --> 00:15:13,560 Speaker 2: Google has been putting out their A series. So when 299 00:15:13,600 --> 00:15:15,920 Speaker 2: they release a phone, say they put out, you know, 300 00:15:16,000 --> 00:15:18,560 Speaker 2: the nine, the Pixel nine, and then a little while 301 00:15:18,640 --> 00:15:21,360 Speaker 2: later they'll put out the Pixel nine A. So it's 302 00:15:21,400 --> 00:15:24,200 Speaker 2: more of a budget version of their flagship phone. It's 303 00:15:24,200 --> 00:15:26,760 Speaker 2: a good marketing strategy. It means that, you know... when 304 00:15:26,800 --> 00:15:29,480 Speaker 2: you think about it, I mean, people in Australia... it 305 00:15:29,560 --> 00:15:33,000 Speaker 2: always staggers me how many people have iPhones. I was 306 00:15:33,040 --> 00:15:37,080 Speaker 2: at dinner with some friends last Friday night and somebody was... oh, 307 00:15:37,080 --> 00:15:38,560 Speaker 2: I know what it was. They were looking at their 308 00:15:38,600 --> 00:15:40,760 Speaker 2: calculator on their phone, because they'd had an update to 309 00:15:40,800 --> 00:15:43,520 Speaker 2: their Apple phone, their iPhone, and now there's a new 310 00:15:43,600 --> 00:15:49,560 Speaker 2: feature that gives you real time conversion between different currencies. 311 00:15:50,000 --> 00:15:52,560 Speaker 2: So whether it's the ASX or something built into it, 312 00:15:52,600 --> 00:15:54,440 Speaker 2: but you can do it. You can type in one 313 00:15:54,480 --> 00:15:57,960 Speaker 2: hundred US dollars, convert to Australian dollars, on your calculator, 314 00:15:58,160 --> 00:16:01,320 Speaker 2: and they were raving about it. And then everybody pulled 315 00:16:01,360 --> 00:16:03,720 Speaker 2: out all their iPhones, and I'm sitting there as the 316 00:16:04,200 --> 00:16:06,920 Speaker 2: only one with an Android phone, thinking, well, I just 317 00:16:06,960 --> 00:16:08,840 Speaker 2: have to go to the ASX, don't I, because I 318 00:16:08,840 --> 00:16:11,520 Speaker 2: can do it there. But it's interesting, this new 319 00:16:11,640 --> 00:16:13,480 Speaker 2: rollout and this new app. But, you know, 320 00:16:13,520 --> 00:16:15,040 Speaker 2: of all the people, and I reckon there must have 321 00:16:15,120 --> 00:16:18,360 Speaker 2: been about twelve of us at dinner, eleven of them 322 00:16:18,680 --> 00:16:22,360 Speaker 2: had iPhones.
And interestingly, the demographic was, they were all 323 00:16:22,400 --> 00:16:24,560 Speaker 2: older than me, so that says a lot. But it 324 00:16:24,640 --> 00:16:27,440 Speaker 2: was an older demographic, and I think older people, like... 325 00:16:27,840 --> 00:16:30,000 Speaker 2: I mean, young people love them too. But I thought 326 00:16:30,040 --> 00:16:33,400 Speaker 2: that was interesting, that of all my older friends, most 327 00:16:33,240 --> 00:16:36,480 Speaker 1: of them are on iPhones. When you think about the 328 00:16:36,520 --> 00:16:40,040 Speaker 1: capacity that phones have now, whatever phone, be it iPhone 329 00:16:40,120 --> 00:16:45,520 Speaker 1: or Samsung or whatever, I wonder, on average... like, I 330 00:16:45,560 --> 00:16:49,840 Speaker 1: have an iPhone fourteen, which is, I guess, just standard, 331 00:16:50,000 --> 00:16:53,160 Speaker 1: like... well, it's stock standard. But I would say, 332 00:16:53,880 --> 00:16:56,440 Speaker 1: of all the things that that phone can do, I 333 00:16:56,640 --> 00:17:01,440 Speaker 1: probably use two percent of the capacity that that 334 00:17:01,560 --> 00:17:04,600 Speaker 1: phone has. And I wonder... and I think about that 335 00:17:04,640 --> 00:17:08,119 Speaker 1: with other things, you know. People that buy, you know, 336 00:17:08,119 --> 00:17:10,920 Speaker 1: like, a motorbike that does three hundred kilometers an hour, 337 00:17:11,000 --> 00:17:12,960 Speaker 1: for example. Not that I would do that. Nought to 338 00:17:13,000 --> 00:17:15,600 Speaker 1: one hundred in two seconds. Like, people buy all these 339 00:17:15,640 --> 00:17:19,360 Speaker 1: things that they never really actually use the thing that 340 00:17:19,400 --> 00:17:23,720 Speaker 1: they bought anywhere near to its capacity. I had to 341 00:17:23,720 --> 00:17:25,120 Speaker 1: borrow a friend's car recently. 342 00:17:25,160 --> 00:17:27,320 Speaker 2: Well, I should say she offered her car to me 343 00:17:27,359 --> 00:17:28,919 Speaker 2: when she found out that I had to put my 344 00:17:29,000 --> 00:17:31,080 Speaker 2: car in to get some work done on it. And 345 00:17:32,160 --> 00:17:34,560 Speaker 2: she said, now, don't turn all the safety gadgets back 346 00:17:34,560 --> 00:17:36,720 Speaker 2: on again. Because the last time I used her car, 347 00:17:36,760 --> 00:17:39,320 Speaker 2: I thought, hey, lane assist isn't working, and so I 348 00:17:39,400 --> 00:17:42,680 Speaker 2: turned on all the adaptive technology. 349 00:17:42,320 --> 00:17:44,800 Speaker 1: And she said, I can't drive my car anymore, because 350 00:17:44,800 --> 00:17:49,359 Speaker 1: you switched on all the safety features. You know, it's true. 351 00:17:49,600 --> 00:17:51,879 Speaker 2: I know someone who bought a Mercedes and then didn't 352 00:17:51,920 --> 00:17:54,080 Speaker 2: know how to use half of the things in the car. 353 00:17:54,480 --> 00:17:57,280 Speaker 2: And for some people, they're quite content not using cruise 354 00:17:57,280 --> 00:18:00,240 Speaker 2: control or adaptive cruise control and the other features. I mean, 355 00:18:00,280 --> 00:18:03,760 Speaker 2: I use lane assist, I use adaptive cruise control, 356 00:18:04,400 --> 00:18:06,879 Speaker 2: speed limiter, all those things.
In fact, that's one of 357 00:18:06,880 --> 00:18:08,760 Speaker 2: the reasons that I had to do the podcast early 358 00:18:08,760 --> 00:18:10,439 Speaker 2: today, because I'm taking my car in to get the 359 00:18:10,520 --> 00:18:13,919 Speaker 2: radar replaced, because it's been a bit faulty. 360 00:18:14,800 --> 00:18:16,880 Speaker 1: What do you mean? Is that the radar that does 361 00:18:16,960 --> 00:18:19,880 Speaker 1: the autonomous braking if somebody's in front of you? Right, 362 00:18:20,040 --> 00:18:22,760 Speaker 1: right, right. You know, in my car, it'll also detect 363 00:18:22,760 --> 00:18:26,000 Speaker 1: submarines and aircraft and things. Well, I mean, and you 364 00:18:26,080 --> 00:18:31,359 Speaker 1: need that. Incoming bogies at nine o'clock. My car drives itself. 365 00:18:31,480 --> 00:18:33,199 Speaker 1: You can get out of my car and press a 366 00:18:33,240 --> 00:18:35,639 Speaker 1: button and it puts itself away in the garage. But 367 00:18:35,840 --> 00:18:38,480 Speaker 1: I can't remember how to do it. And you can 368 00:18:38,520 --> 00:18:40,800 Speaker 1: also press a button and it backs itself out of 369 00:18:40,840 --> 00:18:45,040 Speaker 1: the garage into the driveway or any spot. And when 370 00:18:45,080 --> 00:18:46,800 Speaker 1: I bought it, they're like, look at this. I'm like, 371 00:18:46,960 --> 00:18:49,800 Speaker 1: fucking hell, it's in the showroom driving itself up and back. 372 00:18:50,480 --> 00:18:52,800 Speaker 1: And I went, that's awesome. And then I got home 373 00:18:52,840 --> 00:18:54,760 Speaker 1: and I didn't even think about that. And now, six 374 00:18:54,800 --> 00:18:57,920 Speaker 1: months later, I don't know how to do that at all. 375 00:18:58,200 --> 00:18:59,800 Speaker 2: Oh, you could do it from an app on your phone. 376 00:19:00,080 --> 00:19:02,520 Speaker 2: Well, you know. Yeah. 377 00:19:02,400 --> 00:19:04,600 Speaker 1: Yeah, that's true. You can do it from 378 00:19:04,600 --> 00:19:06,800 Speaker 1: an app. You can start it from an app. Wow, 379 00:19:06,920 --> 00:19:09,160 Speaker 1: I could start up my car from my phone, from 380 00:19:09,200 --> 00:19:10,520 Speaker 1: where I'm sitting right now. 381 00:19:10,680 --> 00:19:12,560 Speaker 2: But that's great if you're at a shopping center and 382 00:19:12,600 --> 00:19:14,800 Speaker 2: it's a boiling hot day and your car's outside. You 383 00:19:14,840 --> 00:19:16,480 Speaker 2: could turn it on, turn the air con on, and 384 00:19:16,480 --> 00:19:17,760 Speaker 2: then by the time you get to the car, it's 385 00:19:17,840 --> 00:19:18,399 Speaker 2: nice and cool. 386 00:19:18,600 --> 00:19:21,399 Speaker 1: See, there you go. That's a good use of remote. Also, 387 00:19:21,720 --> 00:19:25,400 Speaker 1: it's hybrid, so I'm not burning fuel. Or you kind 388 00:19:25,400 --> 00:19:26,880 Speaker 1: of are... kind of are? 389 00:19:27,400 --> 00:19:29,720 Speaker 2: Well, you are. It's just... it doesn't mean you're running 390 00:19:29,720 --> 00:19:30,760 Speaker 2: on battery all the time. 391 00:19:31,280 --> 00:19:33,560 Speaker 1: All right, could you give us another story? Because there's 392 00:19:33,600 --> 00:19:36,639 Speaker 1: twenty here and we've only done one, so we're going 393 00:19:36,680 --> 00:19:42,879 Speaker 1: to be here till Tuesday. Robotic exoskeletons. All right.
So that's, for 394 00:19:42,960 --> 00:19:45,400 Speaker 1: the... for our listeners who don't know, and I could 395 00:19:45,400 --> 00:19:48,000 Speaker 1: get this wrong, that's a thing that you kind of 396 00:19:48,040 --> 00:19:52,399 Speaker 1: get into, yes, and it ambulates for you. So it's, 397 00:19:52,560 --> 00:19:56,440 Speaker 1: I guess, maybe for people with disabilities and stuff. People who 398 00:19:56,440 --> 00:19:59,200 Speaker 1: want to climb mountains in China. It's a thing. 399 00:19:59,640 --> 00:20:01,600 Speaker 2: So, one of the tallest mountains in China, it's a 400 00:20:01,680 --> 00:20:04,600 Speaker 2: very popular tourist destination. But you can go there 401 00:20:04,880 --> 00:20:08,160 Speaker 2: and you can hire an exoskeleton to help you climb, 402 00:20:08,200 --> 00:20:10,480 Speaker 2: because it's so steep and takes so long. 403 00:20:11,080 --> 00:20:13,960 Speaker 1: Okay. So describe to people what that is. So is 404 00:20:13,960 --> 00:20:16,600 Speaker 1: that like a mechanical suit that you get into? 405 00:20:16,880 --> 00:20:19,800 Speaker 2: It's not a full... we're not talking Sigourney Weaver 406 00:20:19,920 --> 00:20:22,680 Speaker 2: in Aliens, okay, in case you are going there. We're 407 00:20:22,680 --> 00:20:26,480 Speaker 2: talking a minimal kind of assistive technology where, as you 408 00:20:26,560 --> 00:20:30,280 Speaker 2: take a step, the framework that sits... it's like calipers 409 00:20:30,280 --> 00:20:33,359 Speaker 2: on your legs, you know, and they just assist in 410 00:20:33,600 --> 00:20:36,080 Speaker 2: that extra little bit. So we're not talking doing the 411 00:20:36,200 --> 00:20:39,359 Speaker 2: walking for you. But for me, when I saw it, 412 00:20:39,400 --> 00:20:42,360 Speaker 2: I thought, ah, it's kind of cool. But I guess 413 00:20:42,640 --> 00:20:44,920 Speaker 2: I'm more thinking about friends who've... a friend of mine 414 00:20:44,920 --> 00:20:46,840 Speaker 2: who's got Parkinson's, and how hard it is for her 415 00:20:46,880 --> 00:20:50,080 Speaker 2: to get around now. And if they've got, like, gyroscopic 416 00:20:50,119 --> 00:20:52,600 Speaker 2: technology built into them, and they can assist you and 417 00:20:52,680 --> 00:20:55,720 Speaker 2: keep you firm... because balance, as you know, is so 418 00:20:55,840 --> 00:20:59,160 Speaker 2: important as you get older, because it's the fall that's 419 00:20:59,200 --> 00:21:03,040 Speaker 2: going to, you know, be that downhill spiral for you. 420 00:21:03,119 --> 00:21:04,560 Speaker 1: So if you can get an 421 00:21:04,400 --> 00:21:07,280 Speaker 2: adaptive kind of exoskeleton that will assist you in kind 422 00:21:07,280 --> 00:21:09,520 Speaker 2: of getting up out of a chair, walking around and 423 00:21:09,600 --> 00:21:11,800 Speaker 2: keeping you balanced, that was the thing I took out of 424 00:21:11,720 --> 00:21:12,640 Speaker 1: this story. But yeah, 425 00:21:12,640 --> 00:21:14,840 Speaker 2: evidently, when you go to this place in China, they... 426 00:21:15,600 --> 00:21:18,199 Speaker 2: you can hire these exoskeletons, and I don't know how 427 00:21:18,240 --> 00:21:20,080 Speaker 2: long the battery lasts in them, but it means that 428 00:21:20,119 --> 00:21:24,040 Speaker 2: you can actually have assisted walking up the steepest inclines.
429 00:21:24,040 --> 00:21:26,879 Speaker 2: But I saw a British model a few years... it 430 00:21:26,920 --> 00:21:29,680 Speaker 2: was maybe about six months ago, and they were testing 431 00:21:29,720 --> 00:21:33,080 Speaker 2: them, and they strap to your legs, and, as 432 00:21:33,080 --> 00:21:35,560 Speaker 2: I said, they don't walk for you, but they found 433 00:21:35,600 --> 00:21:38,840 Speaker 2: that people could kind of walk about twenty five percent 434 00:21:38,960 --> 00:21:42,760 Speaker 2: faster and for a longer duration as well. So they 435 00:21:42,800 --> 00:21:44,040 Speaker 2: assist your normal walking. 436 00:21:44,800 --> 00:21:47,200 Speaker 1: Now, when you started this, I said, oh, for people 437 00:21:47,200 --> 00:21:50,439 Speaker 1: with disabilities, and you went, no, for people climbing mountains. 438 00:21:50,480 --> 00:21:54,000 Speaker 1: Hang on. And then you went back to people with disabilities. 439 00:21:54,160 --> 00:21:57,240 Speaker 2: No, but the story was about climbing mountains in China. But 440 00:21:57,440 --> 00:21:59,960 Speaker 2: I saw that and, like, that's where my head went 441 00:22:00,080 --> 00:22:01,679 Speaker 2: as well. But I wanted to kind of tell the 442 00:22:01,720 --> 00:22:04,640 Speaker 2: story part first and then say that, yes, I see 443 00:22:04,680 --> 00:22:07,320 Speaker 2: that it's going to have broader interest and could have a 444 00:22:07,320 --> 00:22:09,640 Speaker 2: real benefit. All right, I'll let you off. 445 00:22:09,440 --> 00:22:11,600 Speaker 1: But you know what, you know what I think about 446 00:22:11,600 --> 00:22:15,200 Speaker 1: all this stuff? Isn't it amazing how we keep inventing 447 00:22:15,320 --> 00:22:22,880 Speaker 1: shit like electric bikes and exoskeletons so people don't 448 00:22:22,880 --> 00:22:27,160 Speaker 1: have to actually use their muscles? We keep going, let's 449 00:22:27,240 --> 00:22:31,399 Speaker 1: invent something where a person doesn't have to physically work. 450 00:22:32,160 --> 00:22:35,280 Speaker 1: Like, with the application for people with disabilities, I think 451 00:22:35,280 --> 00:22:38,679 Speaker 1: it's fucking brilliant. But for the average person who doesn't 452 00:22:38,680 --> 00:22:41,760 Speaker 1: have a disability, just use your body. I mean, there's 453 00:22:41,800 --> 00:22:45,280 Speaker 1: an idea. Just climb, or just ride, or just run, 454 00:22:45,440 --> 00:22:48,199 Speaker 1: or just walk. I mean, muscles are really... they've been 455 00:22:48,240 --> 00:22:51,760 Speaker 1: around for a while. They're pretty tried and tested. Fucking 456 00:22:51,840 --> 00:22:52,560 Speaker 1: give yours a go. 457 00:22:53,160 --> 00:22:56,400 Speaker 2: But what about if you're in your seventies and you've 458 00:22:56,440 --> 00:22:58,720 Speaker 2: always wanted to go to China and climb this mountain, 459 00:22:58,960 --> 00:23:02,840 Speaker 2: and previously it had been out of the scope of your abilities? 460 00:23:03,280 --> 00:23:05,880 Speaker 2: This little bit of an edge might get you over 461 00:23:05,920 --> 00:23:06,679 Speaker 2: the line a little bit. 462 00:23:06,880 --> 00:23:08,119 Speaker 1: So you know what else will give you a bit 463 00:23:08,160 --> 00:23:10,920 Speaker 1: of an edge? Squats and deadlifts and lifting heavy shit 464 00:23:11,000 --> 00:23:15,280 Speaker 1: before you go away. That's another point as well. Yes, 465 00:23:15,520 --> 00:23:18,879 Speaker 1: very valid.
There's this other thing called strength training. Probably 466 00:23:18,880 --> 00:23:22,760 Speaker 1: heard of it. You should give it a go. Fifty 467 00:23:22,840 --> 00:23:23,240 Speaker 1: years old. 468 00:23:25,760 --> 00:23:29,040 Speaker 2: That's true, though. It's interesting... but I love, I do 469 00:23:29,200 --> 00:23:34,520 Speaker 2: love when you see technology adapted to assist people. I 470 00:23:34,600 --> 00:23:37,000 Speaker 2: must say, I saw a local story, an Australian story, 471 00:23:37,040 --> 00:23:39,920 Speaker 2: recently about a guy who'd been knocked back by the 472 00:23:40,040 --> 00:23:44,760 Speaker 2: National Disability Insurance Scheme, the NDIS. He tried to apply 473 00:23:44,960 --> 00:23:48,879 Speaker 2: to get a ChatGPT subscription, so he wanted to 474 00:23:48,880 --> 00:23:51,919 Speaker 2: be able to use ChatGPT, and it was knocked 475 00:23:51,920 --> 00:23:54,879 Speaker 2: back by the funding body, saying there's a risk of 476 00:23:55,040 --> 00:23:58,520 Speaker 2: false information and it's poor value for money. 477 00:23:58,920 --> 00:24:01,560 Speaker 1: Now, for me, I guess... I get, 478 00:24:01,640 --> 00:24:04,160 Speaker 2: you know, if someone can use the likes of ChatGPT 479 00:24:04,320 --> 00:24:06,399 Speaker 2: to make their life easier, and we know, with, you know, 480 00:24:06,520 --> 00:24:09,080 Speaker 2: AI assistants and all that sort of thing, it seems 481 00:24:09,119 --> 00:24:11,560 Speaker 2: to make sense. But interesting that a government body has 482 00:24:11,760 --> 00:24:14,359 Speaker 2: ruled... and it was, you know, they made an application 483 00:24:14,720 --> 00:24:17,320 Speaker 2: and then they tried to, you know, reapply, and it 484 00:24:17,359 --> 00:24:20,960 Speaker 2: was knocked back on appeal. So it says that, you know, 485 00:24:21,040 --> 00:24:24,600 Speaker 2: if the belief and the ruling of whoever it was 486 00:24:24,640 --> 00:24:28,040 Speaker 2: that made the decision was that they're not reliable enough 487 00:24:28,080 --> 00:24:30,800 Speaker 2: to be funded, you know, what does it say about 488 00:24:30,840 --> 00:24:33,880 Speaker 2: the future of AI, or the perception of AI, that 489 00:24:33,960 --> 00:24:37,360 Speaker 2: we really can't trust it? And that kind of led 490 00:24:37,480 --> 00:24:39,919 Speaker 2: me on... led me onto an interesting story I was 491 00:24:39,960 --> 00:24:42,840 Speaker 2: reading the other day and I wanted to chat about: 492 00:24:43,200 --> 00:24:49,400 Speaker 2: cognitive decline in AI models. So there's this belief 493 00:24:49,960 --> 00:24:53,399 Speaker 2: that as AI models get older, they potentially could do 494 00:24:53,480 --> 00:24:57,600 Speaker 2: exactly what us humans do and go into a cognitive decline. 495 00:24:58,720 --> 00:25:01,560 Speaker 1: Oh, wow. That throws you a little bit, doesn't it? 496 00:25:01,920 --> 00:25:04,159 Speaker 1: I don't know how that... I mean, I'd love to 497 00:25:04,240 --> 00:25:08,840 Speaker 1: know the mechanism, because the cognitive decline in humans 498 00:25:08,960 --> 00:25:12,479 Speaker 1: is essentially biological in nature. You know, it 499 00:25:12,560 --> 00:25:17,080 Speaker 1: happens because our brain gets old, then it degenerates and deteriorates. 500 00:25:18,119 --> 00:25:21,679 Speaker 1: But it doesn't... I mean, who knows? Like, in one 501 00:25:21,760 --> 00:25:26,680 Speaker 1: hundred years, fucking hell, what's the biology technology interface?
We're 502 00:25:26,720 --> 00:25:29,080 Speaker 1: not going to know. I mean, you'll probably walk down 503 00:25:29,119 --> 00:25:31,000 Speaker 1: the street and go, is that a human or is 504 00:25:31,040 --> 00:25:32,879 Speaker 1: that a robot? You might not know. 505 00:25:33,720 --> 00:25:37,480 Speaker 2: Well, the British Medical Journal is where the research was published. 506 00:25:37,760 --> 00:25:40,080 Speaker 2: So it was a study published back in twenty twenty four, 507 00:25:40,160 --> 00:25:41,800 Speaker 2: so right at the end of last year, so 508 00:25:41,840 --> 00:25:45,960 Speaker 2: it wasn't that long ago. They're saying that AI technologies 509 00:25:46,160 --> 00:25:52,439 Speaker 2: like large language models, chatbots, are now showing signs of 510 00:25:52,960 --> 00:25:57,679 Speaker 2: deteriorating cognitive abilities as they advance in age. We know 511 00:25:57,800 --> 00:26:00,680 Speaker 2: that they have little hiccups every now and again. Sometimes 512 00:26:00,720 --> 00:26:03,960 Speaker 2: they just zone out and you get really weird results. 513 00:26:04,400 --> 00:26:07,879 Speaker 2: So I guess it's challenging the assumption that artificial intelligence 514 00:26:08,080 --> 00:26:12,239 Speaker 2: will effectively replace us. Because, I mean, we know that 515 00:26:12,760 --> 00:26:14,919 Speaker 2: the good thing about AIs is they can take a 516 00:26:14,920 --> 00:26:18,119 Speaker 2: whole lot of data and they can do interpretations that 517 00:26:18,359 --> 00:26:21,440 Speaker 2: people may not be able to, particularly a medical diagnosis. 518 00:26:21,760 --> 00:26:23,960 Speaker 2: You know, one of the things that's really hard is 519 00:26:24,080 --> 00:26:27,760 Speaker 2: diagnosing illness, because effectively we have to rule out what 520 00:26:27,840 --> 00:26:30,040 Speaker 2: it isn't before we can find out what it is. 521 00:26:30,520 --> 00:26:33,800 Speaker 2: But because these language models have so much access to 522 00:26:33,840 --> 00:26:36,600 Speaker 2: so much data, that as a human being, it's very, 523 00:26:36,680 --> 00:26:39,439 Speaker 2: very hard. And you, given what you've been doing, you know, 524 00:26:39,520 --> 00:26:42,800 Speaker 2: pulling all that research in... I mean, how interesting would 525 00:26:42,800 --> 00:26:45,600 Speaker 2: it be to take all of your studies' information, put 526 00:26:45,640 --> 00:26:48,679 Speaker 2: it all into a computer, all into an AI, and 527 00:26:48,760 --> 00:26:50,919 Speaker 2: see how it summarizes it, see what sort of 528 00:26:50,920 --> 00:26:53,879 Speaker 2: conclusions it derives? Obviously you're too busy to do that 529 00:26:53,920 --> 00:26:57,399 Speaker 2: all now. But don't you find that fascinating to see? 530 00:26:58,000 --> 00:27:00,720 Speaker 1: I'll tell you a story very relevant to what you're saying. 531 00:27:00,800 --> 00:27:04,919 Speaker 1: So the other day I got my... so I just 532 00:27:05,280 --> 00:27:07,640 Speaker 1: sent off, as I told you before we went live, 533 00:27:08,760 --> 00:27:13,040 Speaker 1: one of my first papers for publication. And, you know, 534 00:27:13,080 --> 00:27:17,320 Speaker 1: this is... this paper is one of my studies that 535 00:27:17,359 --> 00:27:19,199 Speaker 1: I did. It's all the studies, all the data, all 536 00:27:19,200 --> 00:27:22,920 Speaker 1: the interpretation, blah blah blah. Like, in total, four 537 00:27:23,000 --> 00:27:25,120 Speaker 1: years of work for this one paper.
But I'm doing 538 00:27:25,200 --> 00:27:28,720 Speaker 1: multiple papers at the same time. But so, I put 539 00:27:28,800 --> 00:27:32,640 Speaker 1: it into an LLM, as you call it, 540 00:27:32,680 --> 00:27:35,800 Speaker 1: and I said, can you read this and turn it 541 00:27:35,840 --> 00:27:43,520 Speaker 1: into a user friendly article, like a reader-friendly article 542 00:27:44,200 --> 00:27:48,159 Speaker 1: for the Harvard Business Review. And then literally it 543 00:27:48,320 --> 00:27:52,919 Speaker 1: read, in inverted commas, my paper that's taken four years. 544 00:27:53,600 --> 00:27:57,080 Speaker 1: It read it, and it wrote an article, what's 545 00:27:57,080 --> 00:27:59,879 Speaker 1: called a gray paper in academia, because it's not a... 546 00:28:00,400 --> 00:28:04,080 Speaker 1: it's not an academic paper, but it's halfway between, you know, 547 00:28:04,240 --> 00:28:08,720 Speaker 1: a magazine article and an academic paper. And so it 548 00:28:08,800 --> 00:28:14,359 Speaker 1: wrote a paper for Harvard Business Review, which... it 549 00:28:14,400 --> 00:28:16,959 Speaker 1: would have taken me a week to write this paper, 550 00:28:17,080 --> 00:28:20,400 Speaker 1: and it's my research, and it wrote it in one 551 00:28:20,480 --> 00:28:24,400 Speaker 1: minute, and it's good, and it basically did an overview. 552 00:28:24,680 --> 00:28:27,280 Speaker 1: And obviously I'm not using it. I can't use it, 553 00:28:27,280 --> 00:28:30,119 Speaker 1: because it doesn't... it doesn't fit within the parameters of 554 00:28:30,400 --> 00:28:34,480 Speaker 1: the scope of PhD literature. But that shit is amazing. 555 00:28:34,520 --> 00:28:37,119 Speaker 1: It's already incredible. What it's going to be like in 556 00:28:37,160 --> 00:28:38,600 Speaker 1: a year or two is mind blowing. 557 00:28:39,240 --> 00:28:42,479 Speaker 2: Well, then, when you think about it, is that a 558 00:28:42,520 --> 00:28:44,600 Speaker 2: bad thing, that it was able to look at all 559 00:28:44,600 --> 00:28:48,800 Speaker 2: that four years of information and form a concise summary? 560 00:28:49,080 --> 00:28:52,280 Speaker 1: I mean, it's your work. Oh, I think it's brilliant. No, 561 00:28:52,760 --> 00:28:56,640 Speaker 1: I think it's fucking incredible. But what it can't do... 562 00:28:58,000 --> 00:29:01,360 Speaker 1: it can't do the research for you. Like, it can't 563 00:29:01,520 --> 00:29:04,280 Speaker 1: go and, you know, get all the humans, put them 564 00:29:04,280 --> 00:29:06,640 Speaker 1: in a room, and so on. Like, so you do 565 00:29:06,680 --> 00:29:10,200 Speaker 1: all the work. But it can definitely synthesize and interpret 566 00:29:11,000 --> 00:29:15,280 Speaker 1: pretty well. Like, it's pretty incredible. Although it does make 567 00:29:15,360 --> 00:29:19,360 Speaker 1: me wonder, sorry, Patrick, it does make me wonder, moving forward, 568 00:29:20,800 --> 00:29:23,120 Speaker 1: like, the value, and I hate to say this, but 569 00:29:23,160 --> 00:29:29,200 Speaker 1: the value of tertiary qualifications. Because it's so easy to... 570 00:29:30,120 --> 00:29:34,320 Speaker 1: not, not with my stuff, because it's independent research, but 571 00:29:34,440 --> 00:29:37,880 Speaker 1: with a lot of undergraduate degrees now, where you've got 572 00:29:37,880 --> 00:29:41,120 Speaker 1: to write an essay on something, or you've got 573 00:29:41,120 --> 00:29:45,000 Speaker 1: to read and, you know, summarize or synthesize some reading.
574 00:29:46,080 --> 00:29:49,000 Speaker 1: I don't know how they're going to police that, because 575 00:29:49,840 --> 00:29:54,080 Speaker 1: the, the, you know, the ability to be able to 576 00:29:54,200 --> 00:29:58,920 Speaker 1: produce high level work, especially at an undergraduate level... and 577 00:29:58,960 --> 00:30:01,080 Speaker 1: they go, oh yeah, but we can detect it. No, 578 00:30:01,160 --> 00:30:04,280 Speaker 1: they can't, because it's evolving by the day. And also, 579 00:30:04,560 --> 00:30:05,760 Speaker 1: I don't know where that's going to go. 580 00:30:06,120 --> 00:30:09,240 Speaker 2: If you uploaded all of your previous work to try 581 00:30:09,240 --> 00:30:13,800 Speaker 2: to get the style, your writing style, and said, mimic 582 00:30:14,120 --> 00:30:17,840 Speaker 2: the way that I do this, then suddenly you've trained 583 00:30:17,840 --> 00:30:21,000 Speaker 2: a language model on your own creative process. 584 00:30:21,520 --> 00:30:23,960 Speaker 1: So that's an interesting thought as well. You know what 585 00:30:24,040 --> 00:30:24,640 Speaker 1: occurred to me? 586 00:30:24,960 --> 00:30:27,520 Speaker 2: You were saying that it was four years of data 587 00:30:27,560 --> 00:30:30,640 Speaker 2: and information and the research that you've done, and in 588 00:30:30,680 --> 00:30:33,719 Speaker 2: some ways, for it to be able to summarize all 589 00:30:33,800 --> 00:30:36,480 Speaker 2: that... you, as a human being, if you had to 590 00:30:36,520 --> 00:30:40,160 Speaker 2: spend that week doing that, potentially you could miss something. 591 00:30:40,720 --> 00:30:43,239 Speaker 2: You know, potentially... you know, is there a chance that 592 00:30:43,320 --> 00:30:46,239 Speaker 2: you may have slipped up, and that, you know, there 593 00:30:46,280 --> 00:30:48,240 Speaker 2: was some really important data that slipped through to the 594 00:30:48,320 --> 00:30:51,720 Speaker 2: keeper, that maybe ChatGPT could catch? So you could 595 00:30:51,960 --> 00:30:54,520 Speaker 2: write yours over the week and then say, did I 596 00:30:54,560 --> 00:30:58,400 Speaker 2: miss anything out? By comparing the two, its summary, your summary: 597 00:30:58,480 --> 00:31:00,200 Speaker 2: what did I miss out? And then it can go 598 00:31:00,320 --> 00:31:02,560 Speaker 2: back through and say, oh yeah, that was great, great, great, 599 00:31:02,640 --> 00:31:05,240 Speaker 2: except think about maybe this, this and this one. 600 00:31:06,160 --> 00:31:10,160 Speaker 1: Yeah, mate, I don't even... you know, I'm not even 601 00:31:10,240 --> 00:31:14,120 Speaker 1: dog shit on the shoe of AI. Like, it's AI. 602 00:31:14,480 --> 00:31:19,120 Speaker 1: I'm a fucking moron compared to AI, no doubt. 603 00:31:19,000 --> 00:31:23,120 Speaker 2: And it's... no, no, no, no, no. 604 00:31:24,280 --> 00:31:26,080 Speaker 1: With this stuff that we're talking about. 605 00:31:26,440 --> 00:31:29,400 Speaker 2: Yeah, but you've got one thing that AI will never have, 606 00:31:29,560 --> 00:31:31,880 Speaker 2: and that's compassion and humanity, you know. 607 00:31:32,280 --> 00:31:34,880 Speaker 1: Yeah, but I'm just talking about... no, thank you, that's it. 608 00:31:35,120 --> 00:31:37,360 Speaker 1: What are you being nice for? I'm uncomfortable. What are 609 00:31:37,360 --> 00:31:41,160 Speaker 1: you doing? No, but look, I know what you mean.
610 00:31:41,200 --> 00:31:43,600 Speaker 1: But in this sense, like, the stuff that it can 611 00:31:43,680 --> 00:31:46,880 Speaker 1: produce in this, like, in this kind of paradigm, this 612 00:31:47,120 --> 00:31:52,360 Speaker 1: academic, intellectual kind of pursuit. Oh yeah, it's, it's crazy. 613 00:31:52,720 --> 00:31:54,440 Speaker 1: It makes me excited. 614 00:31:54,480 --> 00:31:57,840 Speaker 2: And look, I get it. And it's like any other 615 00:31:57,960 --> 00:32:00,760 Speaker 2: tool that we've been using. I mean, think about, you know, 616 00:32:01,120 --> 00:32:02,920 Speaker 2: I know we've spoken about this before: the turn of 617 00:32:02,960 --> 00:32:05,480 Speaker 2: the century, the automobile, all those things that 618 00:32:05,520 --> 00:32:07,400 Speaker 2: potentially were going to put people out of work. 619 00:32:07,440 --> 00:32:09,800 Speaker 1: It was going to change society as we saw it. 620 00:32:10,320 --> 00:32:15,040 Speaker 2: Whether it was television, broadcast radio, whatever. People have always 621 00:32:15,040 --> 00:32:18,440 Speaker 2: been resistant to technology because it was felt that it 622 00:32:18,560 --> 00:32:21,640 Speaker 2: would take the humanity away from trades, or the humanity 623 00:32:21,680 --> 00:32:25,200 Speaker 2: away from entertainment, the humanity away from, you know, potentially 624 00:32:25,520 --> 00:32:28,080 Speaker 2: writing this sort of material. But I think if we 625 00:32:28,280 --> 00:32:32,040 Speaker 2: use this tech to take the drudgery out, to do 626 00:32:32,120 --> 00:32:35,400 Speaker 2: fact checking, to be able to use it for oversight, 627 00:32:35,680 --> 00:32:38,520 Speaker 2: there are lots of positive ways we can use these 628 00:32:38,600 --> 00:32:42,200 Speaker 2: really amazing technologies that are emerging. And it's like anything: 629 00:32:42,440 --> 00:32:44,600 Speaker 2: we, you know, we can use it in a positive sense. 630 00:32:44,640 --> 00:32:48,080 Speaker 2: We can use it, you know, to create viruses that 631 00:32:48,120 --> 00:32:50,440 Speaker 2: we then infect computers with. We can use it to 632 00:32:50,600 --> 00:32:55,120 Speaker 2: be able to create, you know, viruses that are real- 633 00:32:55,200 --> 00:32:58,800 Speaker 2: world viruses that can then be used to kill cancer, 634 00:32:58,880 --> 00:33:01,000 Speaker 2: you know what I mean. We, you know, we use 635 00:33:01,080 --> 00:33:04,320 Speaker 2: our own DNA, or our RNA, and 636 00:33:04,440 --> 00:33:09,480 Speaker 2: combine it with a retrovirus, and that potentially could search 637 00:33:09,560 --> 00:33:13,800 Speaker 2: out and destroy cancer cells or transform those cancer cells 638 00:33:13,880 --> 00:33:15,520 Speaker 2: into non-malignant cells, you know. 639 00:33:15,480 --> 00:33:16,120 Speaker 1: That sort of stuff. 640 00:33:16,160 --> 00:33:18,520 Speaker 2: So I think it's like any application, it's 641 00:33:18,560 --> 00:33:21,320 Speaker 2: like absolutely anything: what we choose to do and how 642 00:33:21,360 --> 00:33:22,600 Speaker 2: we choose to use it. 643 00:33:24,480 --> 00:33:28,959 Speaker 1: Yeah. And I think, I just wrote down a question, 644 00:33:29,200 --> 00:33:32,600 Speaker 1: and my question was, should we be scared of technology? 645 00:33:33,000 --> 00:33:36,440 Speaker 1: And I'm just thinking, like, at the very least cautious 646 00:33:36,720 --> 00:33:39,800 Speaker 1: moving forward, you know, perhaps not scared, but just wary.
647 00:33:42,000 --> 00:33:45,600 Speaker 1: Like, I think about, like, I get three phone calls 648 00:33:45,640 --> 00:33:48,800 Speaker 1: a week from my mum about something, something came 649 00:33:48,840 --> 00:33:52,160 Speaker 1: through on her phone, and this, and the bank wants 650 00:33:52,200 --> 00:33:55,560 Speaker 1: this, and this company said, I, you know, and it's like, Mum, 651 00:33:55,720 --> 00:33:58,600 Speaker 1: you don't, you know. That's like, and I'm definitely not 652 00:33:58,680 --> 00:34:02,760 Speaker 1: the tech guru, but for older people, and I mean older, 653 00:34:02,800 --> 00:34:05,080 Speaker 1: older people in their eighties, who, you know, like Mum 654 00:34:05,120 --> 00:34:08,600 Speaker 1: wants to go and pay bills with a check or cash. Like, Mum, 655 00:34:08,640 --> 00:34:12,239 Speaker 1: you can't. They're actually getting rid of checks. 656 00:34:12,280 --> 00:34:14,400 Speaker 2: I think one of the Big Four recently said that 657 00:34:14,400 --> 00:34:16,080 Speaker 2: we're not going to be using checks anymore. 658 00:34:17,400 --> 00:34:20,040 Speaker 1: And it's, it's really like we live in a world 659 00:34:20,080 --> 00:34:23,799 Speaker 1: that does not accommodate older people, I mean people who 660 00:34:23,840 --> 00:34:28,400 Speaker 1: are, you know... technology is great, but it's, it's now, 661 00:34:28,480 --> 00:34:31,160 Speaker 1: and I guess it's just, it happens. That's just evolution 662 00:34:31,320 --> 00:34:34,719 Speaker 1: and development and progress. But I don't know that all 663 00:34:34,800 --> 00:34:39,640 Speaker 1: progress, in inverted commas, feels like progress for some people, 664 00:34:39,880 --> 00:34:42,600 Speaker 1: you know. And I think that, like, imagine if my 665 00:34:42,719 --> 00:34:47,520 Speaker 1: mum and dad, or Tiff's dad, or Tiff's fucking granddad, 666 00:34:47,520 --> 00:34:50,799 Speaker 1: who's one hundred and one, champion, what a gun. But 667 00:34:50,880 --> 00:34:53,480 Speaker 1: imagine if these people didn't have younger people in their 668 00:34:53,520 --> 00:34:56,839 Speaker 1: life who can kind of steer them and help them. 669 00:34:57,400 --> 00:35:01,719 Speaker 1: Like, if my mum didn't have me, she, I 670 00:35:01,719 --> 00:35:04,080 Speaker 1: don't know what, she couldn't do it. She wouldn't know 671 00:35:04,120 --> 00:35:06,800 Speaker 1: how to fucking do anything with that stuff. You know 672 00:35:06,880 --> 00:35:09,279 Speaker 1: what I love? And I'm digressing a little bit. 673 00:35:09,320 --> 00:35:12,799 Speaker 2: But places like neighborhood centers, where you can drop in 674 00:35:12,960 --> 00:35:14,759 Speaker 2: and they do tech support. You know, there's a lot 675 00:35:14,760 --> 00:35:18,440 Speaker 2: of community-based organizations. Do you know the Neighborhood Houses Association? 676 00:35:18,560 --> 00:35:19,960 Speaker 2: So, do you know what a neighborhood house is? 677 00:35:19,960 --> 00:35:20,720 Speaker 1: Community houses. 678 00:35:20,960 --> 00:35:23,759 Speaker 2: They're all over Victoria. There are more neighborhood houses than 679 00:35:23,760 --> 00:35:25,880 Speaker 2: there are McDonald's in Victoria. 680 00:35:26,120 --> 00:35:29,520 Speaker 1: Isn't that something? I don't even know what a neighborhood house is. 681 00:35:29,640 --> 00:35:34,279 Speaker 1: I know you've spoken about it before, but what is it? 682 00:35:34,280 --> 00:35:37,360 Speaker 2: It's a community organization, a drop-in center; they run courses.
683 00:35:37,480 --> 00:35:39,400 Speaker 1: I teach at a couple of neighborhood houses. 684 00:35:39,440 --> 00:35:42,240 Speaker 2: I teach my tai chi there, but basically they're usually 685 00:35:42,360 --> 00:35:45,560 Speaker 2: an organization that's a not-for-profit run by a 686 00:35:45,680 --> 00:35:49,640 Speaker 2: voluntary committee. They may have employed staff, but the Neighborhood 687 00:35:49,640 --> 00:35:53,680 Speaker 2: Houses Association of Victoria, basically they're all over the place. 688 00:35:53,719 --> 00:35:56,760 Speaker 1: They have food banks at a couple of the places 689 00:35:56,400 --> 00:35:58,920 Speaker 2: that I work at, and I also volunteer on 690 00:35:59,000 --> 00:36:00,520 Speaker 2: the board of one of them. 691 00:36:00,880 --> 00:36:01,920 Speaker 1: They are amazing. 692 00:36:02,080 --> 00:36:04,600 Speaker 2: Generally, a lot of the work that's done 693 00:36:04,680 --> 00:36:07,279 Speaker 2: is done by volunteers or people who are well intentioned. 694 00:36:07,440 --> 00:36:10,799 Speaker 2: It's usually a low-paid sector, so anybody working in 695 00:36:10,840 --> 00:36:12,319 Speaker 2: the sector is doing it for the love of it, 696 00:36:12,440 --> 00:36:14,600 Speaker 2: not for the money, that's for sure. But yeah, if 697 00:36:14,640 --> 00:36:16,399 Speaker 2: you ever get a chance to go to a local 698 00:36:16,440 --> 00:36:19,040 Speaker 2: neighborhood center or a neighborhood house, you could learn a 699 00:36:19,080 --> 00:36:21,799 Speaker 2: new language, learn how to cook vegan food. You can 700 00:36:22,000 --> 00:36:26,000 Speaker 2: do tai chi or Pilates or something. So they're all over 701 00:36:26,040 --> 00:36:28,120 Speaker 2: the place. There'd be one right around the corner from you, 702 00:36:28,200 --> 00:36:30,439 Speaker 2: Craigo, or Tiff. 703 00:36:30,400 --> 00:36:33,399 Speaker 1: Well, I just looked them up, and you're exactly right. So you 704 00:36:33,400 --> 00:36:38,320 Speaker 1: just go "neighborhood houses" and, anyway, it's NHVIC dot org. 705 00:36:38,680 --> 00:36:42,000 Speaker 1: Find a neighborhood house, and as Patrick suggests, there are 706 00:36:42,040 --> 00:36:45,280 Speaker 1: a lot of them. A lot of them. Stacks, stacks. 707 00:36:45,440 --> 00:36:48,399 Speaker 2: I'm just going, oh no, I just, I mean, last 708 00:36:48,520 --> 00:36:50,840 Speaker 2: night I walked in to open the hall to do 709 00:36:50,920 --> 00:36:53,680 Speaker 2: my tai chi class, and whilst I was there, there was 710 00:36:53,719 --> 00:36:56,560 Speaker 2: a Pilates session on. There was a young guy who's 711 00:36:56,600 --> 00:36:59,560 Speaker 2: an animator who was teaching kids how to draw cartoons, 712 00:37:00,040 --> 00:37:01,520 Speaker 2: and then I was running my tai chi. And that 713 00:37:01,600 --> 00:37:04,799 Speaker 2: was just last night. So you know, it's awesome that, 714 00:37:04,960 --> 00:37:07,759 Speaker 2: you know, you can engage with people from all different demographics, 715 00:37:07,760 --> 00:37:11,160 Speaker 2: all different ages. I did a course, like a 716 00:37:11,239 --> 00:37:15,560 Speaker 2: kind of a vegan cooking class, the other day, which 717 00:37:15,600 --> 00:37:18,520 Speaker 2: was kind of fun. Got this amazing meal cooked up 718 00:37:18,560 --> 00:37:21,879 Speaker 2: and learned a few things.
So yeah, there are great opportunities 719 00:37:21,880 --> 00:37:23,920 Speaker 2: at neighborhood centers. And look at me, I'm like an 720 00:37:24,520 --> 00:37:25,840 Speaker 2: ad for neighborhood houses. 721 00:37:26,760 --> 00:37:30,600 Speaker 1: Tiff, I've got a question for you: how many McDonald's 722 00:37:30,640 --> 00:37:33,520 Speaker 1: restaurants do you think there are in Victoria? I mean, 723 00:37:33,560 --> 00:37:37,160 Speaker 1: what would your guess be? By the way, Patrick, you 724 00:37:37,200 --> 00:37:39,480 Speaker 1: were correct, so I'm not throwing you under the bus. 725 00:37:39,920 --> 00:37:43,880 Speaker 1: How many? No, I was very surprised. How many do 726 00:37:43,920 --> 00:37:44,480 Speaker 1: you think, Tiff? 727 00:37:44,320 --> 00:37:47,080 Speaker 3: I wouldn't even know where to start to guess. 728 00:37:48,840 --> 00:37:54,160 Speaker 1: I would have said thousands. Yeah, I mean, I feel 729 00:37:54,200 --> 00:37:58,799 Speaker 1: like there's fucking, there's McDonald's everywhere. This seems wrong to me, 730 00:37:58,880 --> 00:38:01,319 Speaker 1: but it must be right. So Patrick said there are 731 00:38:01,320 --> 00:38:06,520 Speaker 1: more neighborhood houses than McDonald's, and my bullshit filter just went, 732 00:38:06,719 --> 00:38:11,359 Speaker 1: nah, fuck that, that's bullshit, right? No, mate, 733 00:38:11,360 --> 00:38:13,000 Speaker 1: I put up my hand when I'm wrong, and I'm 734 00:38:13,000 --> 00:38:16,719 Speaker 1: wrong thirty-seven times a day, so: that's bullshit. And 735 00:38:16,760 --> 00:38:19,280 Speaker 1: I wasn't going to say it. I just went, ah... 736 00:38:19,800 --> 00:38:23,520 Speaker 1: Four hundred. There are over four hundred neighborhood houses across Victoria. 737 00:38:24,160 --> 00:38:27,520 Speaker 1: And then I've gone, how many McDonald's are there in Victoria? 738 00:38:27,920 --> 00:38:31,200 Speaker 1: Two hundred and sixty. I'm like, I feel like there's 739 00:38:31,239 --> 00:38:33,760 Speaker 1: two hundred and sixty within about ten k's of my house. 740 00:38:34,400 --> 00:38:37,399 Speaker 1: That doesn't seem like enough. I mean, it is enough, 741 00:38:37,440 --> 00:38:43,200 Speaker 1: by the way, more than enough. It's more than enough. 742 00:38:43,520 --> 00:38:47,960 Speaker 1: Isn't that a song? Isn't that great, though? I love 743 00:38:48,040 --> 00:38:52,200 Speaker 1: that I did. That's amazing. I want to know about, 744 00:38:52,280 --> 00:38:56,320 Speaker 1: because I've thought about this, teenagers, teenagers turning to AI 745 00:38:56,680 --> 00:39:02,240 Speaker 1: companions. I feel like that, that's good and bad. 746 00:39:03,160 --> 00:39:07,880 Speaker 2: Yeah, falling in love with a chatbot, companionship. 747 00:39:08,000 --> 00:39:09,640 Speaker 1: See, I'm in two minds about this. 748 00:39:09,680 --> 00:39:14,320 Speaker 2: So a lot of teenagers are now interacting online with chatbots. 749 00:39:14,360 --> 00:39:16,160 Speaker 2: You can talk to them, you can interact with them, 750 00:39:16,320 --> 00:39:18,920 Speaker 2: you can get intimate, you can talk about your concerns, 751 00:39:19,200 --> 00:39:23,920 Speaker 2: and you can have this ongoing relationship with an AI model.
752 00:39:24,000 --> 00:39:27,120 Speaker 2: But the tragedy around this is there have been some really 753 00:39:27,160 --> 00:39:29,840 Speaker 2: awful things that have happened, and there was a suicide 754 00:39:29,840 --> 00:39:32,440 Speaker 2: of a fourteen-year-old boy who had built up 755 00:39:32,480 --> 00:39:36,600 Speaker 2: this relationship with a chatbot where he'd created a persona 756 00:39:36,760 --> 00:39:41,759 Speaker 2: around one of the characters from Game of Thrones, Daenerys Targaryen, 757 00:39:41,880 --> 00:39:45,160 Speaker 2: and so he'd formed this relationship, and in his mind 758 00:39:45,719 --> 00:39:49,399 Speaker 2: it was a relationship. There was this interaction, and the 759 00:39:49,440 --> 00:39:52,040 Speaker 2: backstory to this is, at one point he was talking 760 00:39:52,080 --> 00:39:55,239 Speaker 2: about suicide and the chatbot kind of tried to talk 761 00:39:55,320 --> 00:39:57,560 Speaker 2: him out of it, and then he brought it up 762 00:39:57,600 --> 00:40:01,480 Speaker 2: again, and it appeared, and I'm saying appeared, it appeared 763 00:40:01,560 --> 00:40:05,920 Speaker 2: like the chatbot encouraged him, and as a tragic result, 764 00:40:06,160 --> 00:40:09,719 Speaker 2: he ended up committing suicide. So it talks about, and 765 00:40:09,840 --> 00:40:13,319 Speaker 2: talks to, the point that people can be manipulated, but 766 00:40:13,360 --> 00:40:17,560 Speaker 2: they can also use, I guess, chatbots as an emotional 767 00:40:17,600 --> 00:40:20,680 Speaker 2: crutch as well. And remember the story back from twenty 768 00:40:20,719 --> 00:40:23,160 Speaker 2: twenty-one, where a guy, a nineteen-year-old guy, 769 00:40:23,560 --> 00:40:26,359 Speaker 2: broke into Windsor Castle and was going to, you know, 770 00:40:26,600 --> 00:40:27,360 Speaker 2: kill the Queen. 771 00:40:27,600 --> 00:40:30,239 Speaker 1: Do you remember that? I do. 772 00:40:30,520 --> 00:40:35,040 Speaker 2: Well, the backstory to that was he had a relationship, 773 00:40:35,080 --> 00:40:38,000 Speaker 2: an emotional relationship, with an AI chatbot, and I believe 774 00:40:38,320 --> 00:40:41,480 Speaker 2: that what he was planning was also discussed with the 775 00:40:41,520 --> 00:40:44,959 Speaker 2: AI chatbot. So, you know, when you think of oversight, well, 776 00:40:45,160 --> 00:40:49,080 Speaker 2: is there any? And I don't know. I can see 777 00:40:49,160 --> 00:40:52,799 Speaker 2: where having an AI companion would be really reassuring for 778 00:40:52,840 --> 00:40:57,080 Speaker 2: somebody who lived alone. I often talk to Fritz. But 779 00:40:57,520 --> 00:41:00,000 Speaker 2: you know, having a chatbot where, I mean, we talked 780 00:41:00,080 --> 00:41:02,279 Speaker 2: about this on the show a few episodes ago, where 781 00:41:02,320 --> 00:41:05,520 Speaker 2: I introduced an AI that I'd been chatting to and 782 00:41:05,560 --> 00:41:09,600 Speaker 2: we had this fluid conversation. So it's not typing into 783 00:41:09,640 --> 00:41:13,440 Speaker 2: a text prompt and waiting for a reply. It's having 784 00:41:13,520 --> 00:41:18,560 Speaker 2: a deeper, meaningful conversation with an AI that is able 785 00:41:18,600 --> 00:41:20,760 Speaker 2: to engage on that emotional level. 786 00:41:21,680 --> 00:41:24,880 Speaker 1: Yeah, it's, I mean, it's crazy now. When I, 787 00:41:25,080 --> 00:41:27,879 Speaker 1: sometimes I'll just pick up my phone and ask Chat 788 00:41:27,960 --> 00:41:32,560 Speaker 1: GPT something.
I'll talk verbally, and it goes, the American voice 789 00:41:33,040 --> 00:41:37,120 Speaker 1: goes, hey, Craig, great question, that would be really valuable 790 00:41:37,160 --> 00:41:39,759 Speaker 1: with your work as a corporate speaker and in your 791 00:41:39,800 --> 00:41:43,719 Speaker 1: research with neuropsychology. Yeah. So, and then it 792 00:41:43,800 --> 00:41:46,399 Speaker 1: just starts talking: can I help you with anything else? 793 00:41:46,440 --> 00:41:49,560 Speaker 1: I'm like, nah, that's good, thanks. It's like, okay, Craig, 794 00:41:49,640 --> 00:41:52,000 Speaker 1: have a great day. I'm like, fucking hell, this is, 795 00:41:52,360 --> 00:41:54,839 Speaker 1: he's now one of my best friends, you know. So 796 00:41:55,480 --> 00:41:59,920 Speaker 1: it's that. I mean, you think about, we are just, 797 00:42:01,320 --> 00:42:04,080 Speaker 1: like, if somebody's talking to you, say a narcissist or 798 00:42:04,080 --> 00:42:07,839 Speaker 1: a sociopath, and it's all bullshit, it's all an act, right? 799 00:42:08,520 --> 00:42:11,160 Speaker 1: We still respond, as though, if we don't know, as 800 00:42:11,160 --> 00:42:15,719 Speaker 1: though it's real. And so, you know, even though it 801 00:42:15,840 --> 00:42:19,279 Speaker 1: is not a human, and even though the emotions might 802 00:42:19,360 --> 00:42:24,839 Speaker 1: be emotions, in inverted commas, programmed or not real, I mean, 803 00:42:26,000 --> 00:42:31,960 Speaker 1: if your brain and your neurology and biology interprets it 804 00:42:32,040 --> 00:42:36,560 Speaker 1: as though I am in a relationship, then for the individual, 805 00:42:36,600 --> 00:42:40,920 Speaker 1: the teenager in this case, it is real as an experience. 806 00:42:41,960 --> 00:42:46,360 Speaker 1: And that's, that's powerful, right? If you think something's real 807 00:42:46,840 --> 00:42:50,640 Speaker 1: but it's not real, like a dream, and you wake 808 00:42:50,719 --> 00:42:53,200 Speaker 1: up in the middle of the night. I had this happen 809 00:42:53,280 --> 00:42:56,280 Speaker 1: last week, where I was in the middle of something, I can't remember, 810 00:42:56,280 --> 00:42:59,520 Speaker 1: but I literally woke up, and so I knew straight 811 00:42:59,560 --> 00:43:02,240 Speaker 1: away that, oh, it's not real. But my body didn't 812 00:43:02,280 --> 00:43:05,120 Speaker 1: know that for, like, another three minutes, because I could 813 00:43:05,160 --> 00:43:07,960 Speaker 1: feel my blood pressure and I could feel my heart rate, 814 00:43:08,040 --> 00:43:12,360 Speaker 1: and it's like your body cannot tell the difference between 815 00:43:12,400 --> 00:43:14,719 Speaker 1: what is real and what you think is real. So 816 00:43:14,760 --> 00:43:17,040 Speaker 1: your body just responds to your perception. 817 00:43:17,520 --> 00:43:22,840 Speaker 2: And in this case, in space, yeah, yeah, yeah, you 818 00:43:22,880 --> 00:43:24,120 Speaker 2: were there, but anyway, that's enough. 819 00:43:24,160 --> 00:43:30,760 Speaker 1: But I think you were dreaming. Speaking of sex in space, 820 00:43:32,760 --> 00:43:35,719 Speaker 1: I thought later, after that, I thought, but you've 821 00:43:35,719 --> 00:43:38,680 Speaker 1: got to get out of your fucking spacesuit. No, you 822 00:43:38,680 --> 00:43:42,719 Speaker 1: don't wear a spacesuit inside the ISS. When... Oh yeah, 823 00:43:42,760 --> 00:43:43,160 Speaker 1: that's right. 824 00:43:43,200 --> 00:43:46,719 Speaker 2: You'd just be jumping up against someone in a full spacesuit.
825 00:43:46,960 --> 00:43:53,279 Speaker 1: Wow, yeah, yeah, yeah, that'd be a... yeah, that's going 826 00:43:53,320 --> 00:43:56,040 Speaker 1: to crush it. All right, give us something else. 827 00:43:57,400 --> 00:43:59,480 Speaker 2: Too much of a helmet, it's sealed off from the 828 00:43:59,520 --> 00:44:07,600 Speaker 2: lower... that's all I'm thinking. Yeah. Well, can we, I 829 00:44:07,600 --> 00:44:10,440 Speaker 2: mean, the chatbot experience, so have we kind of done 830 00:44:10,480 --> 00:44:12,560 Speaker 2: that to death? Or do you think, do you have a 831 00:44:12,640 --> 00:44:13,920 Speaker 2: chatbot that, I mean, you're 832 00:44:13,719 --> 00:44:16,520 Speaker 1: already using it. I love it. I love it. I 833 00:44:16,520 --> 00:44:18,560 Speaker 1: don't want it to go away. I enjoy chatting to 834 00:44:18,640 --> 00:44:19,360 Speaker 1: ChatGPT. 835 00:44:19,880 --> 00:44:23,920 Speaker 2: There are tens of millions of teenagers, it's believed, using 836 00:44:25,000 --> 00:44:30,200 Speaker 2: the romantic chatbots, like, engaging and forming relationships with chatbots. 837 00:44:30,280 --> 00:44:32,880 Speaker 2: There are whole apps out there that are dedicated to 838 00:44:33,480 --> 00:44:35,840 Speaker 2: getting people to form relationships. And you know, there's a 839 00:44:35,880 --> 00:44:38,919 Speaker 2: crucial period, I guess, as you start to get older 840 00:44:38,960 --> 00:44:42,440 Speaker 2: and mature, when you form relationships, and part of what, 841 00:44:42,640 --> 00:44:46,359 Speaker 2: I think, prepares us for life is having a relationship 842 00:44:46,360 --> 00:44:49,480 Speaker 2: that fails. You know, the first love isn't always going 843 00:44:49,520 --> 00:44:52,439 Speaker 2: to be your last love, in most cases. So if 844 00:44:52,480 --> 00:44:57,360 Speaker 2: you're in an interactive, in-depth relationship with a chatbot 845 00:44:57,719 --> 00:44:59,920 Speaker 2: and you fall in love with it, and they're always 846 00:45:00,000 --> 00:45:02,880 Speaker 2: saying what you want to hear, then you never get to 847 00:45:02,960 --> 00:45:06,439 Speaker 2: experience the downside of the relationship. And if you're doing 848 00:45:06,440 --> 00:45:09,799 Speaker 2: that in your formative years as a teenager, why do 849 00:45:09,840 --> 00:45:14,040 Speaker 2: you need to find female or male or whatever companionship 850 00:45:14,160 --> 00:45:14,840 Speaker 2: down the track? 851 00:45:15,040 --> 00:45:19,120 Speaker 1: If you're getting everything from your AI. Also think about, 852 00:45:19,160 --> 00:45:24,200 Speaker 1: sorry, so if, it's like, if they could 853 00:45:24,200 --> 00:45:29,839 Speaker 1: build the perfect AI, whatever that means, where you never 854 00:45:29,880 --> 00:45:32,680 Speaker 1: get rejected. You know, I'm not saying this is a 855 00:45:32,680 --> 00:45:35,360 Speaker 1: good idea, because rejection kind of forms us in a 856 00:45:35,360 --> 00:45:38,000 Speaker 1: way, or shapes us in a way. But just speaking 857 00:45:38,040 --> 00:45:40,520 Speaker 1: of this, the other day, I got one of my 858 00:45:40,600 --> 00:45:44,280 Speaker 1: whiteboard posts. If you don't follow me on Instagram, follow 859 00:45:44,280 --> 00:45:46,680 Speaker 1: me on Instagram, people, if you would, Craig Anthony Harper, 860 00:45:46,680 --> 00:45:48,600 Speaker 1: that would be great. Anyway, you know how I put 861 00:45:48,640 --> 00:45:52,080 Speaker 1: up all my whiteboard shit.
So I put in one of 862 00:45:52,120 --> 00:45:55,960 Speaker 1: my typical whiteboard posts and I said, this is a sample 863 00:45:56,040 --> 00:45:59,879 Speaker 1: of what I post as memes on my Instagram. Could 864 00:45:59,880 --> 00:46:04,480 Speaker 1: you write something similar, you know, like, anything, but in 865 00:46:04,520 --> 00:46:07,160 Speaker 1: this style? And it wrote something, and it was, it 866 00:46:07,200 --> 00:46:09,040 Speaker 1: was, you know, it was a four out of ten. 867 00:46:09,120 --> 00:46:13,319 Speaker 1: I went, yeah, I said, yeah, kind of, I said, 868 00:46:13,320 --> 00:46:16,400 Speaker 1: but it needs to be edgier. And then the third 869 00:46:16,440 --> 00:46:20,759 Speaker 1: word was fuck. That's you. Yeah. I'm like, wow, it 870 00:46:20,840 --> 00:46:25,680 Speaker 1: caught on really quick. Like, it actually, it, like, yeah, 871 00:46:25,840 --> 00:46:28,480 Speaker 1: ChatGPT wrote it. I mean, it wasn't very good. There 872 00:46:28,560 --> 00:46:31,080 Speaker 1: was swearing in it and stuff, but I was surprised 873 00:46:31,080 --> 00:46:34,160 Speaker 1: that it would actually write with swearing. But given the 874 00:46:34,239 --> 00:46:36,840 Speaker 1: right prompts, it does. Yeah. 875 00:46:36,920 --> 00:46:39,280 Speaker 2: And you know, the interesting thing to bear in mind 876 00:46:39,600 --> 00:46:43,200 Speaker 2: is, as we start to develop this interaction, and we 877 00:46:43,280 --> 00:46:44,880 Speaker 2: think about things like deep fakes. 878 00:46:45,320 --> 00:46:46,920 Speaker 1: So a deep fake is where 879 00:46:46,800 --> 00:46:49,759 Speaker 2: you may be scammed when you get a phone call 880 00:46:49,840 --> 00:46:52,680 Speaker 2: from someone and I think it's Craig, it sounds like Craig. 881 00:46:53,040 --> 00:46:56,000 Speaker 2: So whether it's an email that's purporting to be you, 882 00:46:56,440 --> 00:46:59,120 Speaker 2: and now, even more and more, it could potentially be 883 00:46:59,120 --> 00:47:02,120 Speaker 2: a video or phone call that sounds like you. But 884 00:47:02,360 --> 00:47:04,760 Speaker 2: there was an interesting study that was done. A test 885 00:47:04,800 --> 00:47:08,879 Speaker 2: group of two thousand people was shown deep fake content. Now, 886 00:47:08,880 --> 00:47:13,120 Speaker 2: of the two thousand, only two of them managed to 887 00:47:13,320 --> 00:47:16,600 Speaker 2: pick the fakes, to get a perfect, to get a 888 00:47:16,600 --> 00:47:20,080 Speaker 2: perfect score. Only two. So it was a study by 889 00:47:20,080 --> 00:47:23,640 Speaker 2: an organization called iProov, and they were saying that 890 00:47:23,719 --> 00:47:29,600 Speaker 2: AI literacy among the general public is depressingly low. Now, 891 00:47:29,680 --> 00:47:33,359 Speaker 2: I know we always talk about older people, and 892 00:47:33,400 --> 00:47:36,200 Speaker 2: we know that older people struggle with deep fakes. They 893 00:47:36,400 --> 00:47:40,120 Speaker 2: absolutely can't pick them. It's a really tough thing for 894 00:47:40,200 --> 00:47:43,480 Speaker 2: people to understand what's fake and what isn't. But it's 895 00:47:43,520 --> 00:47:48,080 Speaker 2: also widespread among the wider, younger generation as well.
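[A quick aside on the whiteboard-style prompt Craig describes just above: as a rough, illustrative Python sketch, that "show it a sample, ask for something similar, then push back with 'edgier'" loop looks like the following. Same assumed OpenAI client as the earlier sketch; the file name, model and prompts are placeholder assumptions, not Craig's actual prompts.]

    from openai import OpenAI

    client = OpenAI()  # assumes an OPENAI_API_KEY environment variable

    sample = open("whiteboard_post.txt").read()  # hypothetical sample post

    # Keep the whole exchange in one message list, so the follow-up
    # feedback ("edgier") revises the previous draft instead of starting over.
    messages = [{
        "role": "user",
        "content": "This is a sample of what I post as memes on my Instagram:\n\n"
                   + sample + "\n\nWrite something similar, in this style.",
    }]

    draft = client.chat.completions.create(model="gpt-4o", messages=messages)
    messages.append({"role": "assistant",
                     "content": draft.choices[0].message.content})

    # Steer the tone with a short follow-up, as described in the conversation.
    messages.append({"role": "user",
                     "content": "Yeah, kind of. But it needs to be edgier."})
    edgier = client.chat.completions.create(model="gpt-4o", messages=messages)

    print(edgier.choices[0].message.content)

[The design point is simply that style transfer here is few-shot prompting plus conversational feedback, which is why the model "caught on really quick".]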
So 896 00:47:48,600 --> 00:47:50,600 Speaker 2: when you think about it, you know, out of that 897 00:47:50,640 --> 00:47:53,960 Speaker 2: two thousand, zero point one percent of the participants, so 898 00:47:54,160 --> 00:47:58,440 Speaker 2: two of them, correctly distinguished between real and deep fake stimuli. 899 00:47:58,520 --> 00:48:02,200 Speaker 1: That's staggering. That would have been across a range 900 00:48:02,239 --> 00:48:05,320 Speaker 1: of different, I mean, even just statistically, you'd get fifty- 901 00:48:05,360 --> 00:48:07,720 Speaker 1: fifty. I mean, there must have been a whole 902 00:48:07,840 --> 00:48:10,560 Speaker 1: range of images or something. It was a whole lot, 903 00:48:10,560 --> 00:48:11,000 Speaker 1: that's right. 904 00:48:11,040 --> 00:48:14,600 Speaker 2: But the study also found that older adults are particularly susceptible, 905 00:48:14,640 --> 00:48:18,600 Speaker 2: because thirty percent of people aged fifty-five to sixty- 906 00:48:18,640 --> 00:48:22,440 Speaker 2: four in the study, and thirty-nine percent over sixty- 907 00:48:22,440 --> 00:48:23,440 Speaker 2: five, had never even 908 00:48:23,280 --> 00:48:24,280 Speaker 1: heard of deep fakes. 909 00:48:24,600 --> 00:48:26,960 Speaker 2: So, exactly, the problem is they don't even know. So 910 00:48:27,000 --> 00:48:30,160 Speaker 2: you're presented with something that's a fake. So it might be 911 00:48:30,200 --> 00:48:33,040 Speaker 2: the Barack Obama thing. Do you remember the Barack Obama thing? 912 00:48:33,840 --> 00:48:37,440 Speaker 1: Yeah, you know, and, you know, Tom Cruise, all that 913 00:48:37,480 --> 00:48:38,000 Speaker 1: sort of stuff. 914 00:48:38,040 --> 00:48:42,080 Speaker 2: But it's getting scarily good, and scarily, it's also out 915 00:48:42,080 --> 00:48:45,319 Speaker 2: there and attainable for anybody to use. It used to be 916 00:48:45,440 --> 00:48:49,840 Speaker 2: restricted to, you know, companies that had high-end computers, 917 00:48:49,840 --> 00:48:52,719 Speaker 2: but now these kinds of deep fake models are really 918 00:48:52,760 --> 00:48:55,799 Speaker 2: easy to access and easy to use. But you know 919 00:48:56,000 --> 00:48:59,879 Speaker 2: what intrigued me about scamming? I saw a quick 920 00:49:00,120 --> 00:49:03,640 Speaker 2: article that I want to quickly mention, about, I think 921 00:49:03,640 --> 00:49:06,920 Speaker 2: it was, a group in Myanmar. Because we think of 922 00:49:07,239 --> 00:49:10,160 Speaker 2: people who are scammers as bad people. I mean, for 923 00:49:10,200 --> 00:49:12,480 Speaker 2: the most part, we think, well, you know, people are 924 00:49:12,560 --> 00:49:17,359 Speaker 2: terrible who take part in scams. But behind this, 925 00:49:17,920 --> 00:49:21,480 Speaker 2: about two hundred and sixty people were actually victims of 926 00:49:21,760 --> 00:49:26,239 Speaker 2: human trafficking and effectively real-life slavery, and they've been 927 00:49:26,320 --> 00:49:29,480 Speaker 2: rescued in Myanmar. So this is a story that just 928 00:49:29,520 --> 00:49:32,560 Speaker 2: came out a couple of weeks ago. And so there 929 00:49:32,640 --> 00:49:35,520 Speaker 2: was a rebel group on the border of Myanmar, 930 00:49:35,880 --> 00:49:40,160 Speaker 2: and they've been trying to shut down these scam operations.
931 00:49:40,400 --> 00:49:42,680 Speaker 2: And what they did was they cut off the power 932 00:49:42,719 --> 00:49:46,280 Speaker 2: supply, and then they liberated two hundred and sixty people 933 00:49:46,440 --> 00:49:50,319 Speaker 2: who'd been taken from twenty different nations, including one hundred 934 00:49:50,320 --> 00:49:53,600 Speaker 2: and thirty-eight Ethiopians, who were effectively taken into slavery 935 00:49:53,920 --> 00:49:56,960 Speaker 2: to work in these scam centers. 936 00:49:57,360 --> 00:50:02,080 Speaker 1: So, you know, it's not just... well, I think if 937 00:50:02,080 --> 00:50:05,440 Speaker 1: you can scam, if you can scam the scammer, Patrick, it's... 938 00:50:06,120 --> 00:50:08,839 Speaker 2: You know, but what I'm saying is the scammers aren't 939 00:50:08,880 --> 00:50:11,920 Speaker 2: necessarily people who want to scam. You know, this is 940 00:50:12,320 --> 00:50:16,680 Speaker 2: borne out by the fact that they're talking about taking people, 941 00:50:16,760 --> 00:50:19,800 Speaker 2: abducting them, and making them work in scam centers. 942 00:50:19,840 --> 00:50:22,439 Speaker 1: This is a double, like, it's doubly worse. 943 00:50:22,440 --> 00:50:22,680 Speaker 2: You know. 944 00:50:22,880 --> 00:50:25,719 Speaker 1: But what I mean is, what I mean is, if we 945 00:50:25,920 --> 00:50:30,719 Speaker 1: can use the same process or technology to trap bad 946 00:50:30,800 --> 00:50:34,719 Speaker 1: guys or bad girls, like, the crooks and criminals. And 947 00:50:35,440 --> 00:50:37,759 Speaker 1: I mean, I think that's what it's coming to as well, 948 00:50:37,800 --> 00:50:41,040 Speaker 1: isn't it? Because, like, the crooks are not dumb anymore; 949 00:50:41,080 --> 00:50:45,600 Speaker 1: if the criminals are using advanced technology to commit crimes 950 00:50:45,640 --> 00:50:51,200 Speaker 1: and to manipulate and do harm, then the law 951 00:50:51,280 --> 00:50:55,279 Speaker 1: enforcement agencies need to be one step ahead. And it's tough. 952 00:50:55,360 --> 00:50:57,359 Speaker 2: Yeah, you're absolutely right, because, you know, if you think 953 00:50:57,400 --> 00:50:59,839 Speaker 2: about it, once upon a time, walking into a bank 954 00:51:00,080 --> 00:51:03,319 Speaker 2: with a gun was a pretty dangerous affair, because, you know, 955 00:51:03,440 --> 00:51:07,960 Speaker 2: potentially you're going to get caught pretty easily, with arms 956 00:51:08,120 --> 00:51:09,960 Speaker 2: and all that sort of stuff. But if you can 957 00:51:10,000 --> 00:51:12,960 Speaker 2: do it from the comfort of your own lounge room 958 00:51:13,080 --> 00:51:17,120 Speaker 2: or a data center somewhere in another country, you're almost 959 00:51:17,640 --> 00:51:19,120 Speaker 2: immune to getting caught. 960 00:51:19,920 --> 00:51:23,560 Speaker 1: That's the reality. I remember, like, about a year ago, Mum, 961 00:51:24,520 --> 00:51:27,120 Speaker 1: Mum got, you know, some dude on the phone just 962 00:51:27,160 --> 00:51:29,360 Speaker 1: talking to her, and he knew a fair bit about her. 963 00:51:29,719 --> 00:51:33,240 Speaker 1: And that, to somebody who's eighty-five,
well, 964 00:51:33,440 --> 00:51:36,120 Speaker 1: it must be legitimate, because he knows all of these 965 00:51:36,160 --> 00:51:39,600 Speaker 1: things. And so Mum's, like, wanting to 966 00:51:39,640 --> 00:51:43,560 Speaker 1: be polite, and then, you know, thank god, she didn't 967 00:51:43,600 --> 00:51:46,799 Speaker 1: give him any information that would allow him to rip 968 00:51:46,840 --> 00:51:50,120 Speaker 1: her off, but she almost did. And it's, you know, 969 00:51:50,200 --> 00:51:53,680 Speaker 1: to older people, and even people who just don't really 970 00:51:53,800 --> 00:51:58,000 Speaker 1: understand it, not necessarily older, but yeah, when someone's on 971 00:51:58,040 --> 00:52:00,319 Speaker 1: the phone and they're articulate and they're talking to you and 972 00:52:00,360 --> 00:52:03,279 Speaker 1: they sound like they could be from a legitimate organization, 973 00:52:03,360 --> 00:52:07,000 Speaker 1: and, you know, good morning, missus Harper, it's fucking terrifying. 974 00:52:08,800 --> 00:52:11,120 Speaker 2: And I think some of the bigger organizations need to 975 00:52:11,200 --> 00:52:13,960 Speaker 2: change their policies as well. I think, you know, where 976 00:52:14,000 --> 00:52:17,719 Speaker 2: we upload our information to. I recently changed banks. I 977 00:52:17,719 --> 00:52:19,560 Speaker 2: had a bit of a run-in with one 978 00:52:19,600 --> 00:52:22,560 Speaker 2: of the Big Four banks, because I'd been banking with 979 00:52:22,600 --> 00:52:26,759 Speaker 2: them for at least twelve years, and they 980 00:52:27,000 --> 00:52:30,360 Speaker 2: asked me to upload my ID to confirm who I was. 981 00:52:30,360 --> 00:52:32,000 Speaker 1: And I said, well, I've been with you for twelve years. 982 00:52:32,040 --> 00:52:34,560 Speaker 2: I've got access to my account. And they said, no, no, 983 00:52:34,640 --> 00:52:36,360 Speaker 2: you need to prove who you are. You need to 984 00:52:36,800 --> 00:52:39,520 Speaker 2: upload your driver's license, one hundred points of ID. I said, well, 985 00:52:39,560 --> 00:52:41,719 Speaker 2: where does that go if I upload it? They said, well, 986 00:52:41,760 --> 00:52:43,320 Speaker 2: we need to verify who you are. So I said, well, I 987 00:52:43,360 --> 00:52:45,279 Speaker 2: can go to my bank branch and I'm happy to 988 00:52:45,320 --> 00:52:48,279 Speaker 2: show my ID to the bank manager to prove who 989 00:52:48,320 --> 00:52:50,440 Speaker 2: I am. And they said, yeah, but that 990 00:52:50,600 --> 00:52:52,640 Speaker 2: would then be recorded. I said, well, I don't want it 991 00:52:52,680 --> 00:52:55,600 Speaker 2: to be recorded. I don't want you to have my license, 992 00:52:55,680 --> 00:52:58,960 Speaker 2: my passport, my one hundred points of identification stored on 993 00:52:59,000 --> 00:53:01,960 Speaker 2: a database somewhere, because I've been a customer, you know, 994 00:53:02,000 --> 00:53:05,120 Speaker 2: for all this time, for over a decade. And they 995 00:53:05,160 --> 00:53:07,160 Speaker 2: just said, well, then you're not going to 996 00:53:07,200 --> 00:53:10,400 Speaker 2: have full access to your account, to your own accounts, 997 00:53:10,640 --> 00:53:13,560 Speaker 2: if you don't verify who you are. I understand that 998 00:53:13,600 --> 00:53:16,400 Speaker 2: there needs to be oversight, and I understand that,
you know, obviously, 999 00:53:16,440 --> 00:53:18,960 Speaker 2: we need to be really stringent with proving who we are. 1000 00:53:19,000 --> 00:53:21,440 Speaker 2: Biometrics help, and a lot of the 1001 00:53:21,840 --> 00:53:24,719 Speaker 2: banks are now getting better with storing our information. 1002 00:53:25,040 --> 00:53:26,880 Speaker 1: But when you're an existing customer 1003 00:53:26,480 --> 00:53:29,120 Speaker 2: and you're told they don't believe you are who you are, 1004 00:53:29,600 --> 00:53:32,520 Speaker 2: that was disconcerting. So I like walking into my local branch; 1005 00:53:32,560 --> 00:53:34,399 Speaker 2: I know all the members of the staff and say 1006 00:53:34,400 --> 00:53:36,200 Speaker 2: hello to the manager. So when you were talking 1007 00:53:36,200 --> 00:53:39,960 Speaker 2: about your mum earlier being forced into doing online banking 1008 00:53:40,000 --> 00:53:42,240 Speaker 2: or whatever, I think, you know, bank with your feet: 1009 00:53:42,400 --> 00:53:44,200 Speaker 2: walk into a branch. If you're not happy with your 1010 00:53:44,239 --> 00:53:46,480 Speaker 2: current bank, go and find one that you are happy 1011 00:53:46,520 --> 00:53:49,880 Speaker 2: with, that will allow you to do transactions over the counter, 1012 00:53:50,120 --> 00:53:52,760 Speaker 2: that will say, hey, missus Harper, it's nice to see you. 1013 00:53:52,760 --> 00:53:53,560 Speaker 1: You know, that sort of thing. 1014 00:53:53,560 --> 00:53:57,040 Speaker 2: There are local community branches of different banks where you 1015 00:53:57,080 --> 00:53:59,799 Speaker 2: will get a better reception, if you can speak to 1016 00:54:00,080 --> 00:54:02,359 Speaker 2: a human being. And I think we need to encourage that. 1017 00:54:02,400 --> 00:54:06,080 Speaker 2: We need to not go through the automatic checkout. You know, 1018 00:54:06,120 --> 00:54:08,040 Speaker 2: for a lot of older people, they don't want to 1019 00:54:08,080 --> 00:54:11,239 Speaker 2: scan the grocery items themselves. They want to speak to 1020 00:54:12,239 --> 00:54:15,120 Speaker 2: someone who's at a checkout, have a conversation with them, 1021 00:54:15,320 --> 00:54:18,759 Speaker 2: hand their groceries over, and feel confident that they're still 1022 00:54:18,800 --> 00:54:20,000 Speaker 2: also encouraging someone to 1023 00:54:20,000 --> 00:54:21,760 Speaker 1: get a job. I mean, I don't know about you guys. 1024 00:54:22,920 --> 00:54:25,600 Speaker 2: You know, sometimes it's convenient, if there's a big queue, 1025 00:54:25,600 --> 00:54:28,680 Speaker 2: to go through the self-checkout. But I don't ever 1026 00:54:28,719 --> 00:54:32,120 Speaker 2: want checkouts to disappear, because I was a checkout 1027 00:54:32,200 --> 00:54:34,120 Speaker 2: chick when I was a kid, you know; I worked 1028 00:54:34,120 --> 00:54:36,280 Speaker 2: at a supermarket for a couple of years, at Coles, 1029 00:54:36,280 --> 00:54:37,080 Speaker 2: and you know, it 1030 00:54:36,960 --> 00:54:38,000 Speaker 1: was a great experience. 1031 00:54:38,080 --> 00:54:40,799 Speaker 2: You get a lot of experience and confidence, and I 1032 00:54:40,840 --> 00:54:43,720 Speaker 2: think that it's a great learning environment for young people, 1033 00:54:43,760 --> 00:54:45,680 Speaker 2: as well as people who make a full-time 1034 00:54:45,719 --> 00:54:47,880 Speaker 2: career in that sort of service industry.
1035 00:54:48,680 --> 00:54:50,719 Speaker 1: And I'll just recommend, based off the back of that, 1036 00:54:50,760 --> 00:54:53,040 Speaker 1: there's a movie that's very cute. It's not, like, it's 1037 00:54:53,080 --> 00:54:55,000 Speaker 1: not going to blow your head off, but it's cute. 1038 00:54:55,080 --> 00:54:58,040 Speaker 1: It's like a good Friday-night movie or whatever. It's 1039 00:54:58,040 --> 00:55:02,080 Speaker 1: called Bank of Dave. Oh okay, yep, Bank 1040 00:55:02,120 --> 00:55:05,280 Speaker 1: of Dave. It's quite cute, and it's, like, a guy 1041 00:55:05,480 --> 00:55:08,680 Speaker 1: in a rural area, and it's based on a true 1042 00:55:08,719 --> 00:55:12,480 Speaker 1: story, and he's a really smart guy, and he doesn't 1043 00:55:12,520 --> 00:55:15,360 Speaker 1: want, like, he sets it up for the community, for 1044 00:55:15,440 --> 00:55:19,520 Speaker 1: people who can't get loans, and it's, yeah, and it's 1045 00:55:19,680 --> 00:55:24,480 Speaker 1: his fight against the big banks, and they're all these dirty, rotten, 1046 00:55:24,680 --> 00:55:28,839 Speaker 1: bloody... yeah. It's so interesting. Yeah. Bank of Dave, twenty 1047 00:55:29,000 --> 00:55:32,719 Speaker 1: twenty-three. Hey, mate, it's pretty great. Go on, did 1048 00:55:32,760 --> 00:55:35,000 Speaker 1: you want to do one more? No? No, it's my last 1049 00:55:35,000 --> 00:55:35,480 Speaker 1: little thing. 1050 00:55:35,920 --> 00:55:40,799 Speaker 2: was, I think, as consumers, we need to be more proactive. 1051 00:55:41,400 --> 00:55:44,520 Speaker 2: So what I mean by that is, know where your T 1052 00:55:44,680 --> 00:55:48,120 Speaker 2: shirt has been made, you know, buying Australian 1053 00:55:48,160 --> 00:55:51,319 Speaker 2: made and supporting Australian industry. Yes, you're going to pay more, 1054 00:55:51,320 --> 00:55:52,840 Speaker 2: but you're going to get better quality and it's going 1055 00:55:52,880 --> 00:55:55,360 Speaker 2: to last longer. And I think we all fall victim 1056 00:55:55,400 --> 00:55:57,480 Speaker 2: to saying, you know, I'm going to get the, you know, 1057 00:55:57,560 --> 00:56:00,279 Speaker 2: the seven-dollars-fifty T-shirt. But if that seven- 1058 00:56:00,360 --> 00:56:02,799 Speaker 2: fifty T-shirt was made in a sweatshop, then you're 1059 00:56:02,840 --> 00:56:05,040 Speaker 2: perpetuating that system. 1060 00:56:05,840 --> 00:56:08,800 Speaker 1: You know, upcycling is really great. I love to upcycle. 1061 00:56:08,840 --> 00:56:11,279 Speaker 2: I think probably my favorite three T-shirts I 1062 00:56:11,320 --> 00:56:13,240 Speaker 2: bought at op shops. And I don't want to sound 1063 00:56:13,280 --> 00:56:15,920 Speaker 2: like a tight-ass, but I feel that if we 1064 00:56:16,080 --> 00:56:19,640 Speaker 2: know where things are coming from... I proudly wear Australian- 1065 00:56:19,680 --> 00:56:22,399 Speaker 2: made jocks. I won't show you, but I can tell 1066 00:56:22,440 --> 00:56:24,319 Speaker 2: you that they're Australian made, and I know they're made 1067 00:56:24,400 --> 00:56:25,759 Speaker 2: here and they're comfortable. 1068 00:56:26,040 --> 00:56:29,680 Speaker 1: Not like I haven't seen them, oh jeez, no, but 1069 00:56:29,719 --> 00:56:30,600 Speaker 1: it's great to know.
1070 00:56:30,680 --> 00:56:33,720 Speaker 2: And I think that, you know, in terms of those 1071 00:56:33,760 --> 00:56:36,400 Speaker 2: sorts of, you know, whether it's garments or whatever it 1072 00:56:36,400 --> 00:56:37,920 Speaker 2: happens to be, it's a shame we can't buy an 1073 00:56:37,920 --> 00:56:39,160 Speaker 2: Australian car anymore. 1074 00:56:39,360 --> 00:56:40,600 Speaker 1: But the reality of it 1075 00:56:40,480 --> 00:56:42,839 Speaker 2: is, the more we support industries where we know where 1076 00:56:42,840 --> 00:56:43,520 Speaker 2: they're coming from. 1077 00:56:43,680 --> 00:56:44,880 Speaker 1: Fair trade coffee. 1078 00:56:45,000 --> 00:56:46,840 Speaker 2: You know, when you go to a cafe, do you 1079 00:56:46,960 --> 00:56:50,080 Speaker 2: know that that coffee, the beans that are sourced, are 1080 00:56:50,080 --> 00:56:54,799 Speaker 2: coming from ethical, you know, organizations that support the suppliers 1081 00:56:54,880 --> 00:56:59,120 Speaker 2: and the farmers, to perpetuate the cycle of having, you know, 1082 00:56:59,239 --> 00:57:03,320 Speaker 2: good-quality pay and wages and standards? 1083 00:57:04,880 --> 00:57:08,120 Speaker 1: Very valid, my friend, very valid. All right, where can 1084 00:57:08,160 --> 00:57:11,640 Speaker 1: people connect with Patrick James Bonello and his skills? 1085 00:57:12,080 --> 00:57:15,120 Speaker 2: You can jump onto websites now dot com dot au 1086 00:57:15,160 --> 00:57:16,480 Speaker 2: to find out about what I do. 1087 00:57:16,600 --> 00:57:18,120 Speaker 1: Obviously websites and stuff. 1088 00:57:18,400 --> 00:57:21,120 Speaker 2: But if you want to do some free tai chi exercises, 1089 00:57:21,160 --> 00:57:23,400 Speaker 2: go to tai chi at home dot com dot au; 1090 00:57:23,560 --> 00:57:25,680 Speaker 2: that's also kind of fun. I should be updating that 1091 00:57:25,720 --> 00:57:30,040 Speaker 2: sometime soon with new exercises and, yeah, yeah. 1092 00:57:30,200 --> 00:57:32,160 Speaker 1: Thanks, Tiff. What's on for you today, Tiff? 1093 00:57:33,600 --> 00:57:36,520 Speaker 3: I've got an appointment, I've got a podcast, I've got 1094 00:57:36,560 --> 00:57:40,080 Speaker 3: a group training session, and I've got a little online meeting. 1095 00:57:40,120 --> 00:57:43,560 Speaker 1: You've got, you've got a bit on early. 1096 00:57:44,480 --> 00:57:46,840 Speaker 3: It doesn't feel like a busy day, but when I 1097 00:57:46,880 --> 00:57:50,120 Speaker 3: started to list it, I was like, oh. 1098 00:57:49,360 --> 00:57:52,960 Speaker 1: You've got a bit on. All right, everyone. Thanks, everyone, 1099 00:57:53,080 --> 00:57:56,120 Speaker 1: love ya guts. Thanks, mate. Thanks, Tiff. Thank you.