1 00:00:01,200 --> 00:00:04,560 Speaker 1: I'll get a team. It's The You Project, it's Patrick. It's 2 00:00:04,600 --> 00:00:08,280 Speaker 1: my favorite. He's back and it's my other favorite, Tiff. 3 00:00:08,360 --> 00:00:11,160 Speaker 1: They're both back. It's like the fucking wall of favorites. 4 00:00:11,200 --> 00:00:13,240 Speaker 1: As I look at my, I was going to say, 5 00:00:13,320 --> 00:00:16,840 Speaker 1: TV screen. That's how old I am. It's like a TV screen, 6 00:00:16,920 --> 00:00:20,759 Speaker 1: isn't it? It's like something from the seventies. They've really evolved, 7 00:00:20,840 --> 00:00:25,400 Speaker 1: haven't they? Look at them, what with the wireless and everything. Hello, Patrick, 8 00:00:25,440 --> 00:00:25,960 Speaker 1: how are you? 9 00:00:25,840 --> 00:00:29,080 Speaker 2: Let me dial my rotary dial phone and we'll 10 00:00:29,080 --> 00:00:30,440 Speaker 2: connect on the phone. 11 00:00:31,160 --> 00:00:34,040 Speaker 1: Yes, let's do that. I'll send my carrier pigeon over 12 00:00:34,120 --> 00:00:36,239 Speaker 1: to you, just to inform you of what's going on, 13 00:00:36,360 --> 00:00:37,600 Speaker 1: with a note. And it's God. 14 00:00:39,520 --> 00:00:42,720 Speaker 2: I was talking to someone about writing letters by hand. I 15 00:00:42,760 --> 00:00:46,840 Speaker 2: love writing cards to people, and it's one of those 16 00:00:46,880 --> 00:00:49,440 Speaker 2: things, to receive a written note these days. 17 00:00:49,159 --> 00:00:51,960 Speaker 3: It's so unique. I'm good, by the way, thank you 18 00:00:52,040 --> 00:00:53,440 Speaker 3: for asking, how are you? 19 00:00:54,040 --> 00:00:56,280 Speaker 1: I'm good, thank you. Tiff, what do you think about the 20 00:00:56,440 --> 00:00:59,120 Speaker 1: writing, like, with a pen? As I'm holding up 21 00:00:59,160 --> 00:01:01,520 Speaker 1: a pen, which Tiff wouldn't know unless I fucking 22 00:01:01,560 --> 00:01:04,720 Speaker 1: held one.
I don't know why I'm doing this visual prompt, 23 00:01:04,760 --> 00:01:08,039 Speaker 1: but nonetheless I'm still doing it. That's. 24 00:01:07,880 --> 00:01:10,479 Speaker 2: Getting close to your eye, mate, be careful, don't poke 25 00:01:10,520 --> 00:01:11,400 Speaker 2: yourself in the eye with that. 26 00:01:11,600 --> 00:01:14,800 Speaker 1: Come on, yeah, thanks, Mum. Tiff, what do you think 27 00:01:14,840 --> 00:01:18,200 Speaker 1: the chances of the younger generation being able to use 28 00:01:18,240 --> 00:01:21,800 Speaker 1: one of these in ten years is? Like, do you 29 00:01:21,840 --> 00:01:25,959 Speaker 1: think we'll still have children in schools writing on paper 30 00:01:26,000 --> 00:01:28,319 Speaker 1: with pens, or will that be obsolete? 31 00:01:28,640 --> 00:01:32,560 Speaker 4: It's interesting, isn't it. Maybe it'll make itself obsolete, but 32 00:01:32,600 --> 00:01:35,120 Speaker 4: then won't it be a beautiful craft, like people that 33 00:01:35,400 --> 00:01:40,399 Speaker 4: have the ability to do that? Like if in. 34 00:01:40,319 --> 00:01:43,000 Speaker 1: The, you know, I know Patrick's going to say, oh, 35 00:01:43,080 --> 00:01:45,240 Speaker 1: we need to hold on to it, and I get it, 36 00:01:45,280 --> 00:01:47,600 Speaker 1: and it's romantic and it's beautiful. But in the real 37 00:01:47,640 --> 00:01:50,680 Speaker 1: world beyond that, I don't think people are going to 38 00:01:50,760 --> 00:01:54,640 Speaker 1: be writing with pens on paper. I think kids from 39 00:01:54,960 --> 00:01:57,000 Speaker 1: a young age are going to be using computers and 40 00:01:57,040 --> 00:01:59,520 Speaker 1: everything's going to be done in that modality. What are 41 00:01:59,560 --> 00:02:00,400 Speaker 1: your thoughts, Patrick? 42 00:02:01,080 --> 00:02:04,120 Speaker 2: My thoughts are that you are completely and totally, utterly wrong, 43 00:02:04,960 --> 00:02:07,240 Speaker 2: and new studies are now actually encouraging.
44 00:02:07,280 --> 00:02:10,000 Speaker 1: In primary school. You always say new studies. Reference the 45 00:02:10,040 --> 00:02:10,920 Speaker 1: study right now. 46 00:02:11,880 --> 00:02:13,040 Speaker 3: It's a study right now. 47 00:02:13,040 --> 00:02:16,720 Speaker 1: Don't say studies. Tell us, tell me the name 48 00:02:16,760 --> 00:02:20,760 Speaker 1: of the study, or I don't believe you. Look, this 49 00:02:21,520 --> 00:02:24,239 Speaker 1: is your go-to, this is your go-to, studies. 50 00:02:24,280 --> 00:02:26,839 Speaker 1: Tell us there was this. Don't fucking say that. Tell 51 00:02:26,880 --> 00:02:29,440 Speaker 1: me the name of the study, and don't start looking 52 00:02:29,440 --> 00:02:34,640 Speaker 1: it up now. I am looking it up. Are you 53 00:02:34,760 --> 00:02:37,760 Speaker 1: telling me? Are you telling me that you think in 54 00:02:37,880 --> 00:02:40,520 Speaker 1: ten years that kids will be writing on paper with 55 00:02:40,639 --> 00:02:42,440 Speaker 1: pens more than they are now? 56 00:02:42,960 --> 00:02:45,799 Speaker 3: Yes, I'm telling you now that they are. 57 00:02:46,000 --> 00:02:49,919 Speaker 1: No, no, no, you're giving me your opinion now. Now, 58 00:02:50,160 --> 00:02:53,000 Speaker 1: this is your opinion, right? Don't get over the 59 00:02:53,040 --> 00:02:58,320 Speaker 1: top with it. You're not fucking Nostradamus. An article I read recently, 60 00:02:58,919 --> 00:03:01,520 Speaker 1: because articles never get it wrong, that. 61 00:03:03,160 --> 00:03:11,480 Speaker 2: Schools are now reintroducing pens, because the mechanism for writing 62 00:03:11,639 --> 00:03:16,400 Speaker 2: words means that children are able to articulate and learn 63 00:03:16,680 --> 00:03:21,320 Speaker 2: at a much more rapid pace when they take notes 64 00:03:21,440 --> 00:03:24,400 Speaker 2: and they write it down.
Because the act of actually 65 00:03:24,440 --> 00:03:28,680 Speaker 2: writing the words, forming the letters, combining the words helps 66 00:03:29,200 --> 00:03:32,480 Speaker 2: greatly with learning and is part of the learning process. So at 67 00:03:32,560 --> 00:03:36,400 Speaker 2: least in Victoria, there is a move to really strongly 68 00:03:36,480 --> 00:03:40,920 Speaker 2: encourage kids away from tablets and back to handwriting. 69 00:03:42,000 --> 00:03:45,360 Speaker 1: I think theoretically that's brilliant, and I hope that is true, 70 00:03:45,360 --> 00:03:49,120 Speaker 1: and I hope that. I mean, just them doing 71 00:03:49,320 --> 00:03:52,760 Speaker 1: that would be great. But I don't know that 72 00:03:52,760 --> 00:03:55,960 Speaker 1: in a decade the standard is going to 73 00:03:55,960 --> 00:03:58,160 Speaker 1: be kids writing on paper. I think it's going to 74 00:03:58,200 --> 00:04:01,640 Speaker 1: be the exception, not the rule. Yeah. Like, I'm with you, 75 00:04:01,760 --> 00:04:05,240 Speaker 1: like I think for left brain, right brain kind of 76 00:04:05,320 --> 00:04:11,080 Speaker 1: creativity and that tactile thing of actually doing something that's 77 00:04:11,240 --> 00:04:15,200 Speaker 1: fluid with a pen on a page. But then 78 00:04:15,240 --> 00:04:17,000 Speaker 1: all the greenies are going to go, no, we need 79 00:04:17,040 --> 00:04:19,400 Speaker 1: to stop using all the paper and we're killing all 80 00:04:19,440 --> 00:04:20,039 Speaker 1: the trees. 81 00:04:20,520 --> 00:04:26,080 Speaker 2: There's the Raising Children dot net dot au, an 82 00:04:26,080 --> 00:04:30,080 Speaker 2: Australian parenting website, which states handwriting is an important part of 83 00:04:30,160 --> 00:04:32,880 Speaker 2: literacy and an essential skill for life. 84 00:04:33,480 --> 00:04:33,800 Speaker 4: There you go.
85 00:04:34,640 --> 00:04:37,720 Speaker 1: Yeah. Still, that's just a website saying something. It's 86 00:04:37,760 --> 00:04:43,680 Speaker 1: not research, is it? Let's be clear. Let's 87 00:04:43,680 --> 00:04:44,200 Speaker 1: be clear. 88 00:04:45,120 --> 00:04:47,160 Speaker 2: You've come out of the ring just punching 89 00:04:47,320 --> 00:04:49,599 Speaker 2: like shit, that's Craig this morning. 90 00:04:49,680 --> 00:04:49,840 Speaker 1: Right. 91 00:04:50,080 --> 00:04:50,120 Speaker 2: No. 92 00:04:50,279 --> 00:04:53,800 Speaker 1: No, I'm just saying, like, I don't have to agree 93 00:04:53,839 --> 00:04:56,159 Speaker 1: with everything you say just because you're saying it. I 94 00:04:56,200 --> 00:04:58,640 Speaker 1: can say, cool, you say research, where is it? And 95 00:04:58,640 --> 00:04:59,359 Speaker 1: you go, I don't know. 96 00:05:00,080 --> 00:05:04,799 Speaker 2: Queensland Curriculum and Assessment Authority, supporting your child's development. 97 00:05:05,440 --> 00:05:09,920 Speaker 1: Yeah, still not research. Still not research. Just a fucking 98 00:05:09,960 --> 00:05:16,479 Speaker 1: website saying something, an education website. So what? So what, 99 00:05:17,080 --> 00:05:19,799 Speaker 1: you know? Science used to tell us that the world 100 00:05:19,880 --> 00:05:21,960 Speaker 1: was flat. Science used to tell us that the food 101 00:05:21,960 --> 00:05:24,640 Speaker 1: pyramid was good. Science used to tell us that pregnant 102 00:05:24,680 --> 00:05:27,840 Speaker 1: women should have thalidomide. There's a lot of things that 103 00:05:27,920 --> 00:05:29,279 Speaker 1: turn out not to be true. 104 00:05:30,400 --> 00:05:34,160 Speaker 3: I'll send you a link to the National Library, the 105 00:05:34,200 --> 00:05:36,240 Speaker 3: neuroscience behind writing. 106 00:05:38,760 --> 00:05:42,480 Speaker 1: But I believe you with that. Like, I truly agree.
107 00:05:42,600 --> 00:05:45,200 Speaker 1: I think it's great. That's not the question. The question 108 00:05:45,320 --> 00:05:47,560 Speaker 1: is, will it still be around? So you and I 109 00:05:47,640 --> 00:05:50,359 Speaker 1: are on the same page. I wish that 110 00:05:50,400 --> 00:05:53,119 Speaker 1: we were all still writing, and I wish we were using 111 00:05:53,200 --> 00:05:55,960 Speaker 1: both hands. Like, if you're listening to this and you're 112 00:05:56,000 --> 00:05:59,320 Speaker 1: a parent and you want your kid's brain to develop, great, 113 00:05:59,440 --> 00:06:01,760 Speaker 1: get them to do things with both hands. Get them 114 00:06:01,800 --> 00:06:04,359 Speaker 1: to write and draw with both hands. Get them to 115 00:06:04,440 --> 00:06:07,160 Speaker 1: play games with both hands. Hold a tennis racket, a 116 00:06:07,200 --> 00:06:12,080 Speaker 1: cricket bat, you know, whatever, play with dolls with both hands, 117 00:06:12,360 --> 00:06:15,080 Speaker 1: like, try to. What about when you're getting dressed and 118 00:06:15,120 --> 00:06:18,360 Speaker 1: you try to button up your shirt with your 119 00:06:18,440 --> 00:06:23,240 Speaker 1: non-dominant hand? It's like the fucking, I was going 120 00:06:23,279 --> 00:06:26,719 Speaker 1: to say something very inappropriate. Have you ever, I know 121 00:06:26,839 --> 00:06:27,239 Speaker 1: I can't. 122 00:06:27,480 --> 00:06:30,400 Speaker 3: You can't say it? 123 00:06:30,839 --> 00:06:33,760 Speaker 1: No, I really can't. I would really actually get in trouble. 124 00:06:35,120 --> 00:06:35,400 Speaker 4: I know. 125 00:06:35,480 --> 00:06:37,320 Speaker 1: For me to pull back is, yeah, wow. 126 00:06:39,800 --> 00:06:41,120 Speaker 3: You know what I was just thinking? 127 00:06:41,120 --> 00:06:44,040 Speaker 2: You were talking about handwriting in schools and whether that's 128 00:06:44,080 --> 00:06:46,680 Speaker 2: going to be present in ten years' time.
Let's step 129 00:06:46,760 --> 00:06:52,000 Speaker 2: back to even earlier learning. What about kids drawing with crayons? 130 00:06:52,240 --> 00:06:56,640 Speaker 2: And you know, there's a five year old child next 131 00:06:56,640 --> 00:06:59,719 Speaker 2: door who's adorable. And when I take Fritz out to 132 00:06:59,760 --> 00:07:02,400 Speaker 2: the front yard when I'm weeding, the kids all swarm 133 00:07:02,400 --> 00:07:03,440 Speaker 2: because they love my dog. 134 00:07:03,920 --> 00:07:05,080 Speaker 3: Right, it's nothing to do with me. 135 00:07:05,640 --> 00:07:09,640 Speaker 2: But the little girl next door, she turned five, so 136 00:07:09,680 --> 00:07:12,200 Speaker 2: I got her a little gift, and she wrote, she 137 00:07:12,320 --> 00:07:15,280 Speaker 2: drew me this beautiful picture, and she spelt my name, 138 00:07:15,320 --> 00:07:17,160 Speaker 2: but she spelt the P backwards, which was 139 00:07:17,200 --> 00:07:19,960 Speaker 3: kind of cute. It was just like thank you. 140 00:07:20,960 --> 00:07:23,720 Speaker 2: And I just thought, I've hung it up in my office, 141 00:07:23,840 --> 00:07:26,120 Speaker 2: right behind my desk, because it's 142 00:07:26,080 --> 00:07:28,040 Speaker 3: so cute that this five year old drew a picture 143 00:07:28,040 --> 00:07:28,280 Speaker 3: for me. 144 00:07:28,320 --> 00:07:33,560 Speaker 2: But the act of drawing, of creating something on paper 145 00:07:33,600 --> 00:07:37,040 Speaker 2: that kids do from quite an early age, and visualizing 146 00:07:37,080 --> 00:07:40,240 Speaker 2: the world and then articulating it in such a way.
147 00:07:40,520 --> 00:07:43,680 Speaker 2: You know, kids draw even before they speak sometimes, but 148 00:07:43,760 --> 00:07:46,640 Speaker 2: to draw mummy and daddy, or to draw a picture 149 00:07:46,680 --> 00:07:49,520 Speaker 2: of a sky with the sun in it, those things, 150 00:07:49,600 --> 00:07:54,320 Speaker 2: they're just going to be around, you know, that tactile connectedness. 151 00:07:54,600 --> 00:07:57,600 Speaker 2: So I think that, you know, is that the precursor 152 00:07:57,600 --> 00:08:03,320 Speaker 2: to writing further? Maybe, hopefully. I like to think so, yeah. All 153 00:08:03,280 --> 00:08:05,480 Speaker 1: right, let's move on, let's look at some of it. 154 00:08:05,640 --> 00:08:09,400 Speaker 1: Let's look at what's on your list. I don't disagree, 155 00:08:09,400 --> 00:08:11,600 Speaker 1: by the way, in terms of your premise that it's 156 00:08:11,600 --> 00:08:14,600 Speaker 1: great for the brain and great for development. We're on 157 00:08:14,720 --> 00:08:18,760 Speaker 1: the same page with that. Okay. Biology: scientists are excited 158 00:08:18,840 --> 00:08:22,040 Speaker 1: by a gel that's going to repair tooth enamel. 159 00:08:22,360 --> 00:08:23,440 Speaker 3: This is pretty big. 160 00:08:23,760 --> 00:08:27,920 Speaker 2: This is just really, really big, because we can't grow 161 00:08:28,040 --> 00:08:30,720 Speaker 2: new enamel on our teeth, you know, as we 162 00:08:30,880 --> 00:08:31,720 Speaker 2: get older. 163 00:08:31,840 --> 00:08:32,760 Speaker 3: It's a real problem. 164 00:08:32,840 --> 00:08:35,480 Speaker 2: And this is a concern right around the world, because 165 00:08:35,880 --> 00:08:39,079 Speaker 2: the World Health Organization says about three point seven billion 166 00:08:39,160 --> 00:08:43,920 Speaker 2: people suffer from oral disease with enamel degradation.
So 167 00:08:44,000 --> 00:08:47,480 Speaker 2: when the enamel breaks down, it leaves your teeth susceptible, 168 00:08:47,679 --> 00:08:51,600 Speaker 2: so that's one of the big causes, and decay can 169 00:08:51,720 --> 00:08:59,040 Speaker 2: lead to things like diabetes and vascular disease, so bad oral 170 00:08:59,080 --> 00:09:03,280 Speaker 2: health is a real problem. And so the scientists have 171 00:09:03,360 --> 00:09:05,560 Speaker 2: developed a possible treatment. 172 00:09:05,679 --> 00:09:08,400 Speaker 3: So this is out of the University of Nottingham. 173 00:09:08,400 --> 00:09:12,120 Speaker 2: And they believe that they've come up with, I guess, 174 00:09:12,120 --> 00:09:15,520 Speaker 2: the chemical and environmental engineering department has been working on this 175 00:09:15,600 --> 00:09:20,840 Speaker 2: research, a protein based substance that mimics the 176 00:09:20,920 --> 00:09:23,760 Speaker 2: key features that are used in enamel development. So 177 00:09:23,800 --> 00:09:27,280 Speaker 2: this is what babies have when they're growing their teeth. 178 00:09:27,960 --> 00:09:31,960 Speaker 2: Infants use this kind of process to develop the 179 00:09:32,040 --> 00:09:34,760 Speaker 2: enamel around the teeth. It's kind of like a scaffolding 180 00:09:34,880 --> 00:09:37,480 Speaker 2: that goes on a tooth. And now they're saying they 181 00:09:37,520 --> 00:09:40,800 Speaker 2: may be able to use the calcium and phosphate ions 182 00:09:41,400 --> 00:09:44,840 Speaker 2: that are in saliva and mimic that so 183 00:09:44,920 --> 00:09:47,959 Speaker 2: that we can regrow the enamel. We're talking 184 00:09:48,000 --> 00:09:51,280 Speaker 2: this may be as soon as, like, twelve to eighteen 185 00:09:51,320 --> 00:09:55,240 Speaker 2: months away, and that could be revolutionary for people, but 186 00:09:55,280 --> 00:09:58,160 Speaker 2: particularly for those people who, you know.
I often think 187 00:09:58,200 --> 00:10:01,240 Speaker 2: about the safety net we have with Medicare here 188 00:10:01,240 --> 00:10:03,800 Speaker 2: in Australia, and there's so many safety nets for people 189 00:10:03,840 --> 00:10:06,720 Speaker 2: who get sick, but the one area where there isn't 190 00:10:06,760 --> 00:10:09,800 Speaker 2: a lot of care is for people who, you know, 191 00:10:09,840 --> 00:10:12,640 Speaker 2: don't have a lot of money and can't afford dental work. 192 00:10:13,080 --> 00:10:17,000 Speaker 2: And the ramifications of people having bad teeth can be, 193 00:10:17,240 --> 00:10:19,880 Speaker 2: I mean, aside from the cosmetic side of it and 194 00:10:19,920 --> 00:10:24,080 Speaker 2: losing confidence, the other aspect is the heart conditions and 195 00:10:24,160 --> 00:10:28,760 Speaker 2: potential, you know, ongoing effects of having dental disease. And 196 00:10:28,800 --> 00:10:30,600 Speaker 2: we don't have a big safety net in there for 197 00:10:30,640 --> 00:10:33,080 Speaker 2: adults who lose their teeth. I mean, I know, I 198 00:10:33,080 --> 00:10:36,080 Speaker 2: think kids can get free dental treatment, but adults can't, 199 00:10:36,520 --> 00:10:38,680 Speaker 2: and so this could be a life changer for so 200 00:10:38,760 --> 00:10:41,480 Speaker 2: many people who just can't afford to go to the dentist. 201 00:10:43,040 --> 00:10:48,000 Speaker 1: Yeah, it's true. If you've got kind of mouth issues, 202 00:10:48,080 --> 00:10:50,960 Speaker 1: or teeth issues, or gum issues, or kind of infection 203 00:10:51,080 --> 00:10:53,280 Speaker 1: issues in the mouth, which is a lot of people, 204 00:10:54,679 --> 00:11:00,480 Speaker 1: there's a kind of a downstream physiological clinical consequence of that, which, 205 00:11:00,480 --> 00:11:04,240 Speaker 1: as you said, is often heart issues, but moreover just 206 00:11:04,480 --> 00:11:08,559 Speaker 1: like people.
Yeah, there's just when I had my infection, 207 00:11:08,640 --> 00:11:13,160 Speaker 1: which is not mouth but closely located in my ears, 208 00:11:13,200 --> 00:11:16,079 Speaker 1: I had like an infection in my head, as in 209 00:11:16,640 --> 00:11:22,000 Speaker 1: my Eustachian tubes, and I felt fucking terrible for six months, 210 00:11:22,040 --> 00:11:24,800 Speaker 1: but I thought it was something else, so I didn't 211 00:11:24,840 --> 00:11:27,760 Speaker 1: even get a CT scan on my head, which I 212 00:11:27,840 --> 00:11:30,800 Speaker 1: ended up having to get, because I thought it was 213 00:11:31,120 --> 00:11:33,199 Speaker 1: I don't know. I thought I had diabetes. I thought 214 00:11:33,240 --> 00:11:36,760 Speaker 1: I had chronic fatigue. I've got a thing called pernicious anemia, 215 00:11:36,760 --> 00:11:39,960 Speaker 1: which makes you tired. I thought I had prostate issues. 216 00:11:40,000 --> 00:11:42,640 Speaker 1: I thought all of these things, and what I had 217 00:11:42,720 --> 00:11:45,680 Speaker 1: was an infection in my head. And so I had 218 00:11:45,760 --> 00:11:50,600 Speaker 1: six months of all of these pretty shitty symptoms, and 219 00:11:50,640 --> 00:11:54,520 Speaker 1: I took one course of antibiotics and everything went away. 220 00:11:54,760 --> 00:11:58,240 Speaker 1: I'm like, oh, oh, and that's just from that one 221 00:11:58,320 --> 00:12:01,960 Speaker 1: thing in my head. That infection basically fucked up, like 222 00:12:02,120 --> 00:12:05,400 Speaker 1: even down to brain function. I feel like my brain 223 00:12:05,520 --> 00:12:07,640 Speaker 1: was a six out of ten for like six months, 224 00:12:08,200 --> 00:12:10,599 Speaker 1: and I thought I was just I was scared. I 225 00:12:10,640 --> 00:12:11,440 Speaker 1: was a little bit scared. 
226 00:12:12,320 --> 00:12:15,240 Speaker 2: I want to treat this with the seriousness it deserves, 227 00:12:15,320 --> 00:12:18,520 Speaker 2: because I think medical conditions should be something that we 228 00:12:18,679 --> 00:12:21,240 Speaker 2: think about a lot and we protect ourselves from. 229 00:12:21,559 --> 00:12:23,240 Speaker 3: But I want to take 230 00:12:23,080 --> 00:12:28,199 Speaker 2: that leap from a head infection to your prostate. 231 00:12:29,040 --> 00:12:35,080 Speaker 1: Yeah, well, because, yeah, well, the symptoms were, I was 232 00:12:35,080 --> 00:12:38,959 Speaker 1: weeing a lot, right? Okay, I was weeing a lot, 233 00:12:40,000 --> 00:12:44,160 Speaker 1: I was lethargic, my cognitive function was down, so they 234 00:12:44,200 --> 00:12:49,640 Speaker 1: were just symptoms which often would be associated with that. 235 00:12:49,840 --> 00:12:52,760 Speaker 1: So I literally thought I was diabetic. I thought I 236 00:12:52,800 --> 00:12:56,560 Speaker 1: had prostate issues. I thought I had, and I wrote 237 00:12:56,559 --> 00:12:58,800 Speaker 1: out a list, like I did all this research, like 238 00:12:58,880 --> 00:13:01,240 Speaker 1: I fucking knew what I was doing, pretending, oh look, I'm 239 00:13:01,240 --> 00:13:05,400 Speaker 1: almost a doctor. So then I went into my actual 240 00:13:05,480 --> 00:13:07,960 Speaker 1: doctor and I said, this is what I think's going on, 241 00:13:08,000 --> 00:13:10,160 Speaker 1: and he went, fuck, what are your symptoms? 242 00:13:10,679 --> 00:13:13,079 Speaker 1: So I told him and he went, well, you could 243 00:13:13,160 --> 00:13:15,800 Speaker 1: be on the money. Then I had all this, I 244 00:13:15,840 --> 00:13:18,320 Speaker 1: had urine tests, I had blood tests, and then he 245 00:13:18,400 --> 00:13:20,679 Speaker 1: calls me in and he goes, well, you got none 246 00:13:20,679 --> 00:13:22,920 Speaker 1: of that, but you've got a fucking infection in your head.
247 00:13:23,000 --> 00:13:27,600 Speaker 1: I went, ah, months of this. Holy Moly. Oh dude, dude, 248 00:13:27,600 --> 00:13:31,679 Speaker 1: I was. And also I'm a big baby, so that 249 00:13:31,720 --> 00:13:32,320 Speaker 1: doesn't help. 250 00:13:32,559 --> 00:13:35,200 Speaker 3: The moral of the story is, blokes, go to the 251 00:13:35,240 --> 00:13:36,040 Speaker 3: doctor more often. 252 00:13:37,480 --> 00:13:40,920 Speaker 1: That is probably it. Probably don't go to Doctor Google as much, 253 00:13:41,000 --> 00:13:46,560 Speaker 1: or Doctor ChatGPT. Patrick, tell us why Google is 254 00:13:46,600 --> 00:13:49,679 Speaker 1: putting data centers in space. If you would tell us 255 00:13:49,720 --> 00:13:50,920 Speaker 1: that, that'd be nice. 256 00:13:50,960 --> 00:13:53,800 Speaker 2: Well, we're talking as early as twenty 257 00:13:53,840 --> 00:13:57,080 Speaker 2: twenty seven that we could. This is a problem with AI, 258 00:13:58,040 --> 00:14:00,800 Speaker 2: the need for data centers and the resources they use, 259 00:14:00,880 --> 00:14:07,200 Speaker 2: so water usage has just ballooned because they need water 260 00:14:07,240 --> 00:14:08,559 Speaker 2: to cool the data centers. 261 00:14:09,480 --> 00:14:11,880 Speaker 3: The theory is that if you put a 262 00:14:12,040 --> 00:14:15,400 Speaker 2: data center in orbit four hundred miles above the Earth, 263 00:14:15,960 --> 00:14:18,880 Speaker 2: then you don't have the cooling problems, because space in 264 00:14:19,000 --> 00:14:23,560 Speaker 2: essence is pretty damn cold. But you could orient the 265 00:14:23,640 --> 00:14:27,800 Speaker 2: solar panels and be constantly generating the power that you 266 00:14:27,840 --> 00:14:30,760 Speaker 2: would need.
So we've, in a very short amount of time, 267 00:14:31,040 --> 00:14:33,800 Speaker 2: and in fact in a couple of years, there are 268 00:14:33,880 --> 00:14:38,440 Speaker 2: more ships, there are more rockets launching satellites, so getting 269 00:14:38,520 --> 00:14:40,640 Speaker 2: stuff into space is a lot cheaper than it ever 270 00:14:40,760 --> 00:14:43,720 Speaker 2: has been. So there's more stuff going up, and the 271 00:14:43,800 --> 00:14:46,920 Speaker 2: running costs by twenty thirty could be, you know, a 272 00:14:47,000 --> 00:14:51,080 Speaker 2: space based data center could be comparable to one on Earth. 273 00:14:51,720 --> 00:14:54,560 Speaker 3: And, oh, obviously data transfer is a big thing. 274 00:14:54,600 --> 00:14:56,960 Speaker 1: Can I just ask two questions? One, Tiff, is 275 00:14:57,000 --> 00:14:59,880 Speaker 1: for you. Could you just look up what's the 276 00:15:00,080 --> 00:15:03,080 Speaker 1: temperature of space? And I know space is fucking big, but 277 00:15:03,120 --> 00:15:05,920 Speaker 1: could you just find out like a range? And Patrick, two, 278 00:15:07,040 --> 00:15:10,800 Speaker 1: what is a data center? Is that just like a 279 00:15:11,040 --> 00:15:16,520 Speaker 1: physical space with a bazillion, like, I'm picturing like a 280 00:15:16,560 --> 00:15:18,880 Speaker 1: fucking basketball stadium full of computers? 281 00:15:19,120 --> 00:15:21,200 Speaker 2: Pretty well, yes. They're not always as 282 00:15:21,240 --> 00:15:23,400 Speaker 2: big as a basketball court, but that's exactly it. You're right, 283 00:15:23,400 --> 00:15:26,720 Speaker 2: you're right. A data center is a place where they have 284 00:15:26,840 --> 00:15:29,240 Speaker 2: racks and racks of machines.
So the computers are all 285 00:15:29,320 --> 00:15:33,440 Speaker 2: arranged in racks, all the, you know, the 286 00:15:33,480 --> 00:15:37,000 Speaker 2: big, I guess, processing brains that we use for the 287 00:15:37,040 --> 00:15:39,640 Speaker 2: Internet and for AI and all the things that we 288 00:15:39,760 --> 00:15:43,480 Speaker 2: store and save and process on a constant basis. Because 289 00:15:43,680 --> 00:15:46,400 Speaker 2: when we think about a lot of what we do nowadays, 290 00:15:46,440 --> 00:15:49,120 Speaker 2: a lot of the processing is going on in the cloud. 291 00:15:49,440 --> 00:15:51,520 Speaker 2: When we talk about that, we talk about a data 292 00:15:51,520 --> 00:15:53,360 Speaker 2: center that's remote from our computer. 293 00:15:53,720 --> 00:15:55,160 Speaker 3: So our computers don't 294 00:15:54,960 --> 00:15:58,120 Speaker 2: have to be as powerful, because the processing is going 295 00:15:58,160 --> 00:16:00,440 Speaker 2: on at the data center end. When you go to 296 00:16:00,520 --> 00:16:04,480 Speaker 2: ChatGPT, all the smarts that go into answering your 297 00:16:04,560 --> 00:16:08,240 Speaker 2: question aren't happening on your computer. Your computer's linked up 298 00:16:08,280 --> 00:16:09,120 Speaker 2: to the data center. 299 00:16:10,480 --> 00:16:14,200 Speaker 1: Yeah, Tiff, any answer for us? Does it say cold 300 00:16:14,200 --> 00:16:14,680 Speaker 1: as fuck? 301 00:16:15,240 --> 00:16:18,280 Speaker 4: Well, it says a whole bunch of stuff, 302 00:16:18,440 --> 00:16:21,440 Speaker 4: because that's how much Chatters talks, and then it says 303 00:16:22,200 --> 00:16:26,720 Speaker 4: minus two hundred and seventy degrees Celsius. In other words, 304 00:16:26,760 --> 00:16:30,360 Speaker 4: colder than your ex's heart after a messy breakup. Thanks, Chat. 305 00:16:30,440 --> 00:16:33,680 Speaker 1: Oh, look at ChatGPT being funny. 306 00:16:34,040 --> 00:16:34,640 Speaker 4: Wow.
307 00:16:35,440 --> 00:16:40,680 Speaker 1: Wow. It also knows you. Patrick's coughing his little testicles 308 00:16:40,680 --> 00:16:43,760 Speaker 1: out over there, muting himself. Fucking hell, did 309 00:16:43,760 --> 00:16:45,960 Speaker 1: you just nearly, as my mum would say, did you 310 00:16:46,120 --> 00:16:49,800 Speaker 1: nearly have a fucking apoplexy? Whatever that is. 311 00:16:51,480 --> 00:16:55,560 Speaker 3: What an apoplexy is? Tell us. I'll look up apoplexy, please. 312 00:16:55,960 --> 00:16:57,960 Speaker 1: I don't think it's a real word. But I do 313 00:16:58,040 --> 00:17:01,280 Speaker 1: have another question for you, which is not on your list. 314 00:17:01,640 --> 00:17:05,560 Speaker 1: My question is, how do you think? In fact, both 315 00:17:05,600 --> 00:17:07,199 Speaker 1: of you, I want your opinion, but we'll start with 316 00:17:07,200 --> 00:17:11,440 Speaker 1: you, Patrick. So the ban on social media for under sixteen 317 00:17:11,480 --> 00:17:13,440 Speaker 1: year olds comes in in the next week or two, 318 00:17:13,480 --> 00:17:15,840 Speaker 1: depending on what state you're in. How do you reckon 319 00:17:15,880 --> 00:17:17,720 Speaker 1: that's going to go, mate, and why? 320 00:17:18,520 --> 00:17:20,719 Speaker 3: Yeah, smart kids will get around it. 321 00:17:21,119 --> 00:17:24,000 Speaker 2: VPNs, you know, using a virtual private network to 322 00:17:24,040 --> 00:17:27,960 Speaker 2: pretend that you're in another country. I mean, I'm in two minds 323 00:17:28,000 --> 00:17:33,000 Speaker 2: about this. I really feel that social media can be 324 00:17:33,080 --> 00:17:36,920 Speaker 2: quite damaging for young people if it's used, you know, 325 00:17:37,440 --> 00:17:39,560 Speaker 2: in the wrong ways. And we know that the algorithms 326 00:17:39,560 --> 00:17:43,320 Speaker 2: of a lot of social media aren't necessarily pro health 327 00:17:43,400 --> 00:17:48,639 Speaker 2: and well-being and mental health.
Now I worry for 328 00:17:48,800 --> 00:17:52,680 Speaker 2: kids in rural areas who are disenfranchised, who are disconnected. 329 00:17:53,040 --> 00:17:55,600 Speaker 2: You know, there was a recent article I read about 330 00:17:55,680 --> 00:17:58,159 Speaker 2: kids who go to boarding school. You know, they are 331 00:17:58,200 --> 00:18:01,080 Speaker 2: going to come home and be separated from all their friends. 332 00:18:01,280 --> 00:18:04,760 Speaker 2: How do they contact each other? Most young people these 333 00:18:04,840 --> 00:18:07,679 Speaker 2: days use Snapchat, which is a messaging service, but it 334 00:18:07,680 --> 00:18:09,840 Speaker 2: also delivers video, and that's the problem. That's why it's 335 00:18:09,880 --> 00:18:12,440 Speaker 2: included in the ban. So they're not trying to stop 336 00:18:12,520 --> 00:18:15,679 Speaker 2: kids from talking to each other. But if the social 337 00:18:15,720 --> 00:18:20,199 Speaker 2: media platform that they're using to communicate also happens to 338 00:18:20,200 --> 00:18:23,240 Speaker 2: be a video delivery platform, then 339 00:18:23,119 --> 00:18:25,320 Speaker 3: it's part of the ban. And that's where, you know, 340 00:18:25,400 --> 00:18:27,560 Speaker 2: I had one of my mates come across from Adelaide 341 00:18:27,560 --> 00:18:30,679 Speaker 2: recently with his girlfriend and his 342 00:18:30,800 --> 00:18:34,120 Speaker 2: daughter and granddaughter, and I asked the thirteen year old 343 00:18:34,480 --> 00:18:37,560 Speaker 2: how she felt about it, because she's just on Snapchat 344 00:18:37,600 --> 00:18:39,200 Speaker 2: the whole time talking to her boyfriend. 345 00:18:39,280 --> 00:18:40,160 Speaker 3: A boyfriend, at thirteen.
346 00:18:40,280 --> 00:18:43,480 Speaker 2: Wow, that was a bit of an eye opener. They're 347 00:18:43,560 --> 00:18:47,000 Speaker 2: constantly chatting to each other, video chatting, all that 348 00:18:47,040 --> 00:18:49,400 Speaker 2: sort of stuff. But the platform that they were on 349 00:18:49,560 --> 00:18:51,040 Speaker 2: is a platform that's going to get banned. 350 00:18:52,480 --> 00:18:54,360 Speaker 3: And so her mother was looking 351 00:18:54,200 --> 00:18:56,920 Speaker 2: at how are we going to keep the kids connected? 352 00:18:57,600 --> 00:18:59,800 Speaker 2: And I think that that's the concern that I have. 353 00:19:00,520 --> 00:19:06,399 Speaker 2: I, you know, and school parents giving kids permission to 354 00:19:06,480 --> 00:19:10,359 Speaker 2: use social media, giving them phones. You know, we're in 355 00:19:10,400 --> 00:19:13,320 Speaker 2: a nanny state where everything that happens has 356 00:19:13,359 --> 00:19:16,000 Speaker 2: to be dictated by a government and legislation. You know, 357 00:19:16,080 --> 00:19:19,639 Speaker 2: whatever happened to parenting, where we take responsibility for the 358 00:19:19,680 --> 00:19:23,480 Speaker 2: devices we're giving to kids, you know, and we trust them. 359 00:19:23,520 --> 00:19:25,560 Speaker 3: We trust that they're going to do the right thing. 360 00:19:26,000 --> 00:19:29,200 Speaker 2: We talk to them, we have adult conversations and we say, look, 361 00:19:29,400 --> 00:19:31,200 Speaker 2: this is what you need to look out for when 362 00:19:31,240 --> 00:19:33,320 Speaker 2: you're online. You know, you're not allowed to have your 363 00:19:33,320 --> 00:19:36,119 Speaker 2: phone in your room after a certain time. Friends of 364 00:19:36,160 --> 00:19:39,000 Speaker 2: mine do that, all the kids, all the family, including 365 00:19:39,000 --> 00:19:42,160 Speaker 2: the parents, put their phones in the kitchen to charge overnight.
366 00:19:42,200 --> 00:19:43,719 Speaker 3: They don't have their phones in their rooms. 367 00:19:44,040 --> 00:19:47,720 Speaker 2: So I get that the ban is important in some ways, 368 00:19:48,160 --> 00:19:51,360 Speaker 2: but I'm always mindful of being told to do something 369 00:19:51,680 --> 00:19:54,679 Speaker 2: that effectively takes that control away from parents and kids 370 00:19:54,960 --> 00:19:59,760 Speaker 2: and disenfranchises children. You know, if you're a trans kid 371 00:20:00,119 --> 00:20:03,000 Speaker 2: living in the middle of the bush, you may lose 372 00:20:03,080 --> 00:20:07,240 Speaker 2: contact with your community online and it could be really 373 00:20:07,280 --> 00:20:11,520 Speaker 2: hard for them to connect with people, their peers, because 374 00:20:11,640 --> 00:20:13,480 Speaker 2: they don't have that in their own community. 375 00:20:14,440 --> 00:20:16,960 Speaker 1: I hate it when you and I agree, it's fucking annoying, 376 00:20:18,160 --> 00:20:20,960 Speaker 1: very annoying. What do you think? What do you think? 377 00:20:21,200 --> 00:20:23,200 Speaker 4: You know what I think? I wonder what the effect 378 00:20:23,280 --> 00:20:26,680 Speaker 4: is of how it undermines real laws. So now something that's 379 00:20:26,880 --> 00:20:33,040 Speaker 4: seemingly stupid becomes outlawed, so kids get desensitized to breaking 380 00:20:33,080 --> 00:20:37,399 Speaker 4: the law. Right, it's no big deal. I'll just do this 381 00:20:37,520 --> 00:20:41,120 Speaker 4: and you're not legally allowed to? Well, I'll do it anyway. 382 00:20:42,119 --> 00:20:45,560 Speaker 1: I wonder, this is a good question. I mean, maybe 383 00:20:45,560 --> 00:20:48,080 Speaker 1: I'm stating the obvious. Is it a law? 384 00:20:48,240 --> 00:20:48,639 Speaker 3: Or is it? 385 00:20:49,560 --> 00:20:52,320 Speaker 1: Like, is it a criminal act for a 386 00:20:52,359 --> 00:20:55,600 Speaker 1: fifteen year old to be on fucking Instagram? Like?
Is 387 00:20:55,600 --> 00:20:56,359 Speaker 1: that a crime? 388 00:20:57,200 --> 00:20:57,720 Speaker 3: That is a question. 389 00:20:58,160 --> 00:21:03,000 Speaker 1: Yeah. I would... Yeah, there's just a lot going on, 390 00:21:03,119 --> 00:21:06,080 Speaker 1: and I'm with you. I just, look, we need, we 391 00:21:06,119 --> 00:21:08,360 Speaker 1: need governance, and we need police and we need all 392 00:21:08,359 --> 00:21:10,159 Speaker 1: of that. Of course we do, and it's great and 393 00:21:10,160 --> 00:21:14,080 Speaker 1: we're so grateful. But also, like, when you've got the 394 00:21:14,119 --> 00:21:18,119 Speaker 1: government saying your child can't do this, and it's not 395 00:21:18,200 --> 00:21:21,399 Speaker 1: like we're talking about running around with a machete, like 396 00:21:21,480 --> 00:21:24,360 Speaker 1: we're talking about using an app on a phone. As 397 00:21:24,400 --> 00:21:31,480 Speaker 1: you said, Patrick, in some cases, those kids having access 398 00:21:31,119 --> 00:21:34,200 Speaker 1: to other people via this medium is going to be 399 00:21:34,280 --> 00:21:36,600 Speaker 1: good for their health, good for their social life, good 400 00:21:36,640 --> 00:21:40,280 Speaker 1: for their state of mind. And I think to say 401 00:21:40,320 --> 00:21:43,880 Speaker 1: it's universally bad, so we need to... well, it isn't. 402 00:21:44,000 --> 00:21:49,120 Speaker 1: It isn't universally bad. Just like, you know, it's like, well, 403 00:21:49,119 --> 00:21:51,800 Speaker 1: we can't have kids congregating in the fucking school yard 404 00:21:51,840 --> 00:21:56,040 Speaker 1: at lunchtime because bullying happens. Yeah, but ninety eight percent 405 00:21:56,040 --> 00:21:58,600 Speaker 1: of the time it's not happening, or ninety nine percent 406 00:21:58,640 --> 00:22:02,440 Speaker 1: of the time. It's like, well, bullying happens in the playground, 407 00:22:02,600 --> 00:22:06,879 Speaker 1: so we can't go to the playground.
Ever, it doesn't work. 408 00:22:07,160 --> 00:22:10,320 Speaker 1: I just don't know that that is a solution. 409 00:22:10,520 --> 00:22:13,719 Speaker 2: But you raise a really valuable point in terms of, 410 00:22:13,960 --> 00:22:16,280 Speaker 2: you know, is a kid going to get fined for 411 00:22:16,359 --> 00:22:19,399 Speaker 2: being fifteen and having a Snapchat account using a VPN? 412 00:22:19,760 --> 00:22:23,800 Speaker 3: The laws in Australia are targeting the big 413 00:22:23,680 --> 00:22:26,960 Speaker 2: data, the big companies. So, you know, they're targeting Meta, 414 00:22:27,080 --> 00:22:30,520 Speaker 2: which owns Facebook and Instagram. They're targeting all of those 415 00:22:30,960 --> 00:22:35,080 Speaker 2: platforms and saying to them they have to put mechanisms 416 00:22:35,200 --> 00:22:38,919 Speaker 2: in place that don't allow young people to be on 417 00:22:39,000 --> 00:22:41,439 Speaker 2: if they're under the age of sixteen. So there's a 418 00:22:41,480 --> 00:22:45,040 Speaker 2: fifty million dollar fine if they aren't able to enforce that. 419 00:22:45,119 --> 00:22:47,800 Speaker 2: So they're throwing it on the tech companies to do 420 00:22:47,880 --> 00:22:51,040 Speaker 2: the right thing to enforce this ban. What does that 421 00:22:51,080 --> 00:22:53,560 Speaker 2: mean for an individual? I don't even know. That's such 422 00:22:53,600 --> 00:22:56,879 Speaker 2: a great question. But what I know is that fifty 423 00:22:56,920 --> 00:23:00,879 Speaker 2: million dollar fine is going to go out to any 424 00:23:00,960 --> 00:23:04,160 Speaker 2: breaches that occur that get caught out. But the reality 425 00:23:04,200 --> 00:23:06,360 Speaker 2: of it is they're already talking about ways to get 426 00:23:06,400 --> 00:23:09,000 Speaker 2: around that.
You know, if the tech company is using 427 00:23:09,040 --> 00:23:12,160 Speaker 2: a face scanner and you just get your older brother 428 00:23:12,400 --> 00:23:15,040 Speaker 2: to come on and say that they're you, and they 429 00:23:15,119 --> 00:23:17,560 Speaker 2: scan their face and suddenly you've got an account. 430 00:23:17,760 --> 00:23:20,320 Speaker 3: A VPN is where you get a virtual private 431 00:23:20,040 --> 00:23:23,320 Speaker 2: network and the computer that you're using or the device 432 00:23:23,359 --> 00:23:26,880 Speaker 2: you're using appears to be in another country, so 433 00:23:26,920 --> 00:23:28,640 Speaker 2: you can sign up in a country where they 434 00:23:28,560 --> 00:23:30,760 Speaker 3: don't have this ban. The rest of the world is 435 00:23:30,800 --> 00:23:31,280 Speaker 3: watching this. 436 00:23:31,600 --> 00:23:34,719 Speaker 2: You know, Australia has made the world stage in terms 437 00:23:34,760 --> 00:23:36,960 Speaker 2: of the progressive nature 438 00:23:36,640 --> 00:23:39,119 Speaker 3: of what's felt by a lot of the, you know, 439 00:23:39,560 --> 00:23:40,959 Speaker 3: just child welfare groups. 440 00:23:41,600 --> 00:23:44,879 Speaker 2: So other countries are looking to do something similar and 441 00:23:44,880 --> 00:23:48,719 Speaker 2: they're watching really closely what's happening in Australia because this 442 00:23:48,800 --> 00:23:52,560 Speaker 2: is a big thing for us. But how it impacts 443 00:23:52,600 --> 00:23:55,800 Speaker 2: and what ends up happening is out there. We're not 444 00:23:55,840 --> 00:23:57,639 Speaker 2: going to know until December ten. You might be 445 00:23:57,680 --> 00:24:00,520 Speaker 2: hearing this before or after. Tell us what happened. 446 00:24:01,520 --> 00:24:04,800 Speaker 1: Yeah, yeah, get back to us. Yeah. I don't know.
447 00:24:05,320 --> 00:24:08,320 Speaker 1: I just think there's a lot of, a lot of 448 00:24:08,320 --> 00:24:13,520 Speaker 1: stuff that's kind of going into the space of people's 449 00:24:14,280 --> 00:24:18,119 Speaker 1: private lives and, you know, I don't know. I don't know. 450 00:24:18,160 --> 00:24:23,000 Speaker 1: And also, are there people over sixteen who are using 451 00:24:23,000 --> 00:24:27,280 Speaker 1: it in unhealthy and toxic and destructive ways? Yes. Is 452 00:24:27,720 --> 00:24:31,480 Speaker 1: it doing potential damage to people over sixteen? I guess 453 00:24:31,520 --> 00:24:35,000 Speaker 1: it is. I just think it's like there needs to 454 00:24:35,040 --> 00:24:39,880 Speaker 1: be some kind of self regulation, like we can't have 455 00:24:39,960 --> 00:24:43,960 Speaker 1: the government forever telling individuals what they can and can't 456 00:24:43,960 --> 00:24:46,400 Speaker 1: do with their fucking phone or in their personal life. 457 00:24:46,440 --> 00:24:48,960 Speaker 1: I mean unless it's of a criminal nature. 458 00:24:49,040 --> 00:24:52,880 Speaker 2: But yeah. Can I just say, there are two 459 00:24:52,920 --> 00:24:57,280 Speaker 2: teenagers currently who have launched a High Court challenge to 460 00:24:57,440 --> 00:25:01,320 Speaker 2: this new law. By the way, what happened is that 461 00:25:03,160 --> 00:25:06,880 Speaker 2: fifteen year olds Noah Jones and Macy Neeland. They've been 462 00:25:06,920 --> 00:25:09,840 Speaker 2: backed by a welfare group, like a rights group, and 463 00:25:09,840 --> 00:25:13,240 Speaker 2: they're arguing the ban completely disregards the rights of children 464 00:25:13,680 --> 00:25:16,960 Speaker 2: and they say, look, we shouldn't be so surveilled. It's 465 00:25:17,080 --> 00:25:21,680 Speaker 2: like the George Orwell book Nineteen Eighty-Four.
There 466 00:25:21,680 --> 00:25:23,679 Speaker 2: is another side to this, and that's the opinion of 467 00:25:23,720 --> 00:25:27,679 Speaker 2: young people as well. So the Digital Freedom Project basically 468 00:25:27,800 --> 00:25:31,400 Speaker 2: has spearheaded this case and they filed in the High 469 00:25:31,400 --> 00:25:35,760 Speaker 2: Court just recently and they're saying that teenagers rely on 470 00:25:35,760 --> 00:25:41,000 Speaker 2: social media for information, association, and this ban could actually 471 00:25:41,680 --> 00:25:45,920 Speaker 2: hurt the nation's most vulnerable children, so people with a disability, 472 00:25:46,000 --> 00:25:51,119 Speaker 2: First Nations kids, rural and remote kids, LGBTQ teenagers. So 473 00:25:51,760 --> 00:25:54,440 Speaker 2: these are the children and the young people who stand 474 00:25:54,480 --> 00:26:00,560 Speaker 2: to be disenfranchised by being disconnected from their social connectedness online. Look, 475 00:26:00,600 --> 00:26:03,399 Speaker 2: whether this goes through, I'm not sure. There's a New 476 00:26:03,440 --> 00:26:08,000 Speaker 2: South Wales parliamentarian, a guy named John Ruddick. He, you know, 477 00:26:08,040 --> 00:26:10,320 Speaker 2: he's kind of weighed in on this as well. So 478 00:26:10,840 --> 00:26:13,800 Speaker 2: it's, I don't know how much luck they'll have in 479 00:26:13,840 --> 00:26:16,000 Speaker 2: the case, but it's interesting that they're fighting back. 480 00:26:17,600 --> 00:26:20,679 Speaker 1: Let's keep our AI chat to just one topic or 481 00:26:20,680 --> 00:26:23,920 Speaker 1: one story this week, but I'm interested in this one. 482 00:26:24,480 --> 00:26:29,320 Speaker 1: A backlash against AI imagery being used in ads may 483 00:26:29,359 --> 00:26:34,520 Speaker 1: have begun as brands promote human made. Well, there's a bit 484 00:26:34,400 --> 00:26:35,600 Speaker 3: of criticism recently.
485 00:26:36,359 --> 00:26:39,600 Speaker 2: H and M and Guess got a really big backlash 486 00:26:39,640 --> 00:26:44,240 Speaker 2: for using AI brand ambassadors where they've used virtual models 487 00:26:44,760 --> 00:26:47,800 Speaker 2: instead of humans, and so this is where it's come from. 488 00:26:47,880 --> 00:26:51,520 Speaker 2: So now companies are doing the exact opposite. So, you know, 489 00:26:51,960 --> 00:26:54,760 Speaker 2: there's now this backlash where advertisers are 490 00:26:54,800 --> 00:26:57,280 Speaker 2: saying this was made using real people. 491 00:26:57,440 --> 00:26:59,920 Speaker 3: We didn't use AI to create 492 00:26:59,680 --> 00:27:03,159 Speaker 1: this. How novel. Hey look, this is a human. Her 493 00:27:03,280 --> 00:27:04,640 Speaker 1: name's Jane. 494 00:27:04,520 --> 00:27:11,159 Speaker 2: Yeah, so, so Cadbury, Polaroid, Heineken. They're hating on AI 495 00:27:11,800 --> 00:27:15,760 Speaker 2: and they're celebrating their work as being human made work, 496 00:27:16,280 --> 00:27:19,120 Speaker 2: which is interesting, you know. And when you take 497 00:27:19,200 --> 00:27:22,000 Speaker 2: it to the extreme, there was this guy, I can't 498 00:27:22,040 --> 00:27:24,080 Speaker 2: remember where, he might have been an American guy. He 499 00:27:24,160 --> 00:27:28,239 Speaker 2: developed an AI model of an African American woman and 500 00:27:28,280 --> 00:27:31,439 Speaker 2: it looked stunning. And the thing is, 501 00:27:31,520 --> 00:27:35,760 Speaker 2: he was a white man who made an African American 502 00:27:35,840 --> 00:27:38,840 Speaker 2: model and was trying to get money from that using 503 00:27:38,960 --> 00:27:44,199 Speaker 2: that AI person that didn't exist, you know. So, you know, 504 00:27:44,240 --> 00:27:47,440 Speaker 2: when you think of it in those terms, either misrepresentation 505 00:27:48,040 --> 00:27:51,919 Speaker 2: or what does it actually mean?
Coca Cola did their 506 00:27:52,000 --> 00:27:56,200 Speaker 2: Christmas campaign. They did a Christmas AI ad campaign and, 507 00:27:56,280 --> 00:27:58,960 Speaker 2: you know, it looks magical and it's 508 00:27:58,960 --> 00:28:02,480 Speaker 2: got polar bears doing great things and trains and Christmassy stuff. 509 00:28:02,520 --> 00:28:04,199 Speaker 2: But you know, at the end of the day, you 510 00:28:04,240 --> 00:28:06,240 Speaker 2: look at it and you think, oh, isn't it sad 511 00:28:06,480 --> 00:28:10,760 Speaker 2: that it wasn't made by people in a sweatshop somewhere. 512 00:28:10,880 --> 00:28:11,680 Speaker 3: No, that's not true. 513 00:28:12,720 --> 00:28:16,679 Speaker 1: People! Animators! Spoken and authorized by Patrick Bonello. His thoughts 514 00:28:16,680 --> 00:28:21,720 Speaker 1: don't reflect all the thoughts of the project management. Now. 515 00:28:21,800 --> 00:28:24,080 Speaker 2: But when you think about people who work in the 516 00:28:24,080 --> 00:28:25,359 Speaker 2: film industry and. 517 00:28:25,600 --> 00:28:28,560 Speaker 1: We'll definitely edit that out too. No, just leave that in. 518 00:28:29,520 --> 00:28:34,280 Speaker 1: And I'm joking. What's your lawyer's name again? 519 00:28:35,720 --> 00:28:38,440 Speaker 2: Well, they just resigned after hearing this segment. So I 520 00:28:38,480 --> 00:28:40,320 Speaker 2: don't have one, if you're a lawyer and you want 521 00:28:40,360 --> 00:28:42,520 Speaker 2: to represent me, because I've put my foot in my mouth. 522 00:28:45,960 --> 00:28:48,720 Speaker 2: But I guess the other thing is in the film 523 00:28:48,760 --> 00:28:52,520 Speaker 2: industry at the moment, using AI, it can also mean 524 00:28:52,680 --> 00:28:56,080 Speaker 2: that we can just use these AI tools to help 525 00:28:56,160 --> 00:28:59,480 Speaker 2: with, you know, improving the quality of the product.
So 526 00:28:59,760 --> 00:29:03,200 Speaker 2: at what point do we recognize that AI is just 527 00:29:03,240 --> 00:29:05,600 Speaker 2: a useful tool, exactly? 528 00:29:05,680 --> 00:29:08,280 Speaker 1: And it depends, I think, like many things, like social 529 00:29:08,320 --> 00:29:11,680 Speaker 1: media apps, like CGI or AI as we're talking about, 530 00:29:12,440 --> 00:29:14,719 Speaker 1: it depends on the use. It depends on the application. 531 00:29:15,080 --> 00:29:18,040 Speaker 1: Like, it's not like all AI is doing bad things 532 00:29:18,200 --> 00:29:21,880 Speaker 1: or, you know, when, when they do, when they produce 533 00:29:21,960 --> 00:29:24,480 Speaker 1: AI that can do an operation on me with ninety 534 00:29:24,520 --> 00:29:29,840 Speaker 1: nine point nine nine nine percent accuracy or success versus 535 00:29:29,840 --> 00:29:33,480 Speaker 1: a much lower figure for a human, I'm choosing the AI. Yeah, 536 00:29:33,520 --> 00:29:35,760 Speaker 1: you know what I'm saying, like if it's life and death, 537 00:29:37,520 --> 00:29:39,200 Speaker 1: you know, I think over time we're going to find 538 00:29:39,240 --> 00:29:41,160 Speaker 1: more and more. And we've spoken about this a little bit, 539 00:29:41,160 --> 00:29:45,680 Speaker 1: but autonomous vehicles, and yes, can autonomous vehicles crash? Yes. 540 00:29:46,200 --> 00:29:49,840 Speaker 1: Can they... all these things? Yes. Can human driven vehicles crash? 541 00:29:50,080 --> 00:29:52,880 Speaker 1: Well, just fucking look out the window, you know. So 542 00:29:53,720 --> 00:29:56,560 Speaker 1: I think there's going to be, over time, there has 543 00:29:56,640 --> 00:30:00,000 Speaker 1: to be just more and more acceptance and less resistance.
544 00:30:00,240 --> 00:30:04,160 Speaker 1: And it's, you know, by the time Patrick, Craig and Tiff 545 00:30:04,200 --> 00:30:06,520 Speaker 1: two point zero are sitting on a podcast, not that 546 00:30:06,560 --> 00:30:09,800 Speaker 1: there'll be any in thirty years, they're going to go, 547 00:30:09,880 --> 00:30:13,520 Speaker 1: remember those old dinosaurs who used to fucking drive themselves around, 548 00:30:14,320 --> 00:30:18,680 Speaker 1: remember when actors in movies were biological, not technological. They'll 549 00:30:18,720 --> 00:30:21,560 Speaker 1: be like, what was that about? What a weird phase 550 00:30:21,600 --> 00:30:25,360 Speaker 1: of the human kind of evolution was that? Like, 551 00:30:25,480 --> 00:30:28,040 Speaker 1: all the shit that we're jumping up and down about now, 552 00:30:29,000 --> 00:30:32,440 Speaker 1: we've done that for centuries, eons, for fucking millennia, where 553 00:30:32,600 --> 00:30:35,960 Speaker 1: something new comes along and everyone protests and says it's 554 00:30:35,960 --> 00:30:37,520 Speaker 1: going to be the end of the world. Rock and 555 00:30:37,600 --> 00:30:40,880 Speaker 1: roll music, it's the devil's music. It's going to destroy that. 556 00:30:41,040 --> 00:30:44,680 Speaker 1: You know, it doesn't. It's, you know, not that there 557 00:30:44,680 --> 00:30:47,880 Speaker 1: needs to be no policing or no regulation, but I'm 558 00:30:47,920 --> 00:30:50,479 Speaker 1: pretty sure it's going to become more and more ingrained 559 00:30:50,520 --> 00:30:55,240 Speaker 1: in who we are and how we operate. I've been 560 00:30:55,320 --> 00:30:57,880 Speaker 1: fortunate that over the years, I've gone to a few 561 00:30:57,960 --> 00:31:03,320 Speaker 1: lost arts expos where you find artisans and people who 562 00:31:03,360 --> 00:31:08,640 Speaker 1: have retained these lost arts, like blacksmiths, you know.
And 563 00:31:08,880 --> 00:31:12,600 Speaker 1: I bought a fireplace poker, which was amazing. It was 564 00:31:12,640 --> 00:31:13,720 Speaker 1: this wrought iron 565 00:31:13,480 --> 00:31:16,920 Speaker 2: poker with a wizard's head formed out of the metal 566 00:31:16,960 --> 00:31:19,960 Speaker 2: at the top. And it was just stunning and it 567 00:31:20,080 --> 00:31:22,080 Speaker 2: was great to know that a craftsman had 568 00:31:22,200 --> 00:31:22,720 Speaker 2: made that. 569 00:31:23,160 --> 00:31:25,360 Speaker 3: And what are you laughing at? Tiff's 570 00:31:25,440 --> 00:31:26,080 Speaker 3: laughing at me? 571 00:31:26,360 --> 00:31:28,080 Speaker 1: But a wizard's head? 572 00:31:28,360 --> 00:31:30,160 Speaker 3: Wow. It had a wizard's head on one end. 573 00:31:30,680 --> 00:31:33,680 Speaker 1: Yeah, do you actually have a fireplace? 574 00:31:33,880 --> 00:31:35,440 Speaker 3: Of course I have a fireplace. Why would I buy 575 00:31:35,480 --> 00:31:37,520 Speaker 3: a fireplace poker without a fireplace? 576 00:31:37,680 --> 00:31:41,280 Speaker 1: Fucking what? You would buy a fucking spaceship. 577 00:31:41,840 --> 00:31:44,840 Speaker 3: I would buy a spaceship if I could, exactly. 578 00:31:45,680 --> 00:31:48,320 Speaker 1: I don't think... Knowing you, I wouldn't expect you would 579 00:31:48,480 --> 00:31:53,000 Speaker 1: need to have a fire to buy yourself a fireplace poker. 580 00:31:54,880 --> 00:31:59,760 Speaker 1: All right, let's talk tech. Soft robotic elbow cuts muscle 581 00:31:59,800 --> 00:32:02,640 Speaker 1: activity. What does that even mean? Soft robotic elbow 582 00:32:02,720 --> 00:32:07,760 Speaker 1: cuts muscle activity by... Oh, okay, people, it does some 583 00:32:07,840 --> 00:32:10,280 Speaker 1: of the work of lifting, does it? 584 00:32:10,600 --> 00:32:10,840 Speaker 3: Yes?
585 00:32:10,960 --> 00:32:14,000 Speaker 2: So it's made of silicone, it's an exoskeleton, and it's 586 00:32:14,120 --> 00:32:17,240 Speaker 2: very light. And what it does is if you 587 00:32:17,360 --> 00:32:21,120 Speaker 2: do any work, say in factory work potentially, 588 00:32:21,200 --> 00:32:24,720 Speaker 2: where you've got this repetitive movement, it means that there's 589 00:32:24,800 --> 00:32:27,680 Speaker 2: less strain. So twenty two percent of 590 00:32:27,400 --> 00:32:31,280 Speaker 2: the force can be absorbed by this, and this could 591 00:32:31,320 --> 00:32:34,680 Speaker 2: be revolutionary for factory workers around the world. It means 592 00:32:34,760 --> 00:32:38,600 Speaker 2: that, you know, less pain, RSI, all that sort of stuff. 593 00:32:38,960 --> 00:32:44,400 Speaker 2: So the University of Texas at Arlington, they developed this robotic exoskeleton. 594 00:32:45,560 --> 00:32:48,440 Speaker 2: So it's not just about supporting the joint, 595 00:32:48,720 --> 00:32:51,360 Speaker 2: it also literally lifts the load. So whatever you're 596 00:32:51,360 --> 00:32:54,040 Speaker 2: carrying is twenty two percent lighter, or it would feel 597 00:32:54,040 --> 00:32:55,960 Speaker 2: twenty two percent lighter. That'd be great in the gym, 598 00:32:55,960 --> 00:32:58,600 Speaker 2: wouldn't it, Tiff? And you can put those extra weights 599 00:32:58,600 --> 00:32:59,280 Speaker 2: on and cheat. 600 00:33:01,160 --> 00:33:04,640 Speaker 1: Nah, that would be, from a training point of view... 601 00:33:04,640 --> 00:33:06,600 Speaker 1: but from the work point of view, and looking 602 00:33:06,680 --> 00:33:10,200 Speaker 1: after yourself, yeah, I mean, that's not the 603 00:33:10,240 --> 00:33:14,120 Speaker 1: worst thing at all. Tell me why pre owned gadgets 604 00:33:14,200 --> 00:33:17,760 Speaker 1: are becoming a thing?
Why is there more and more 605 00:33:17,760 --> 00:33:22,480 Speaker 1: prevalence of people, is it, buying pre owned stuff? Yeah, yeah. Well. 606 00:33:22,360 --> 00:33:26,080 Speaker 2: There's a bigger market now in refurbished, so you know, 607 00:33:26,120 --> 00:33:28,640 Speaker 2: when you look for new phones or phones. 608 00:33:28,440 --> 00:33:30,560 Speaker 3: There's a lot of refurbished stuff out there. 609 00:33:30,560 --> 00:33:33,520 Speaker 2: And I think it's gone from the days of your 610 00:33:33,600 --> 00:33:37,760 Speaker 2: eBay purchase of a dodgy device to just something now 611 00:33:37,880 --> 00:33:41,680 Speaker 2: where they're refurbished, they come with warranties, and as people 612 00:33:41,680 --> 00:33:44,520 Speaker 2: are more budget conscious, the reality 613 00:33:44,120 --> 00:33:47,040 Speaker 3: of it is, you know, the majority 614 00:33:47,480 --> 00:33:50,640 Speaker 2: of people are just going to be as happy with 615 00:33:50,800 --> 00:33:54,040 Speaker 2: an iPhone thirteen as they would with a fourteen or fifteen, 616 00:33:54,040 --> 00:33:55,600 Speaker 2: you know what I mean. You don't have to have 617 00:33:55,680 --> 00:33:57,600 Speaker 2: a two and a half thousand dollars phone if you 618 00:33:57,600 --> 00:33:59,520 Speaker 2: can get one for six hundred bucks and it's going 619 00:33:59,520 --> 00:34:02,080 Speaker 2: to do just as much. And there's always going to 620 00:34:02,120 --> 00:34:04,120 Speaker 2: be a percentage of people who will queue up to get 621 00:34:04,160 --> 00:34:05,880 Speaker 2: the next phone, and then they're going to 622 00:34:05,800 --> 00:34:06,880 Speaker 3: offload their old phone. 623 00:34:06,960 --> 00:34:10,920 Speaker 2: So if you're flipping your phone literally every twelve months, 624 00:34:11,200 --> 00:34:14,319 Speaker 2: then there's really good quality stuff out there. I think 625 00:34:14,320 --> 00:34:18,279 Speaker 2: I've talked about this before.
I haven't purchased a television 626 00:34:18,400 --> 00:34:21,360 Speaker 2: for probably twenty years, and I know I sound like 627 00:34:21,400 --> 00:34:24,479 Speaker 2: the tight ass, but I just know stacks of people 628 00:34:24,560 --> 00:34:26,840 Speaker 2: who get rid of televisions regularly because they want to 629 00:34:26,840 --> 00:34:29,560 Speaker 2: go to the latest and greatest model, and so I 630 00:34:29,760 --> 00:34:32,319 Speaker 2: just buy their old TVs, or in many cases get 631 00:34:32,360 --> 00:34:35,640 Speaker 2: given them because they just want to offload them, you know, for free. I mean, 632 00:34:35,920 --> 00:34:38,080 Speaker 2: I got given a TV by my brother which was 633 00:34:38,120 --> 00:34:39,120 Speaker 2: seventy five inches. 634 00:34:39,320 --> 00:34:42,719 Speaker 3: It's enormous, a big screen TV. But he went to an 635 00:34:42,719 --> 00:34:43,480 Speaker 3: eighty two inch. 636 00:34:44,400 --> 00:34:48,040 Speaker 1: Wow, I'll take it. What? There's a whole lot of 637 00:34:48,080 --> 00:34:50,400 Speaker 1: conversations we could have off the back of that. You know, 638 00:34:50,440 --> 00:34:53,920 Speaker 1: one of my favorite old school like gadgets, I don't 639 00:34:53,920 --> 00:34:55,600 Speaker 1: know if I've told you this, I might have, but 640 00:34:56,400 --> 00:34:58,480 Speaker 1: I've got a Game Boy from, I think 641 00:34:58,480 --> 00:35:03,160 Speaker 1: it's ninety seven, so you know, that classic size, 642 00:35:03,560 --> 00:35:05,799 Speaker 1: and it's got Tetris. It's, you know how you put 643 00:35:05,800 --> 00:35:08,520 Speaker 1: in the discs. Yep, so it's got the old one, 644 00:35:08,560 --> 00:35:12,440 Speaker 1: you put that in the back of the game, literally. What's that? 645 00:35:12,719 --> 00:35:15,800 Speaker 3: It's a cartridge, not a disc. Thank you, cartridge.
646 00:35:16,239 --> 00:35:18,480 Speaker 1: I found that. It must have been in my cupboard 647 00:35:18,560 --> 00:35:22,319 Speaker 1: for, oh, twenty years, and I pulled it out and 648 00:35:22,320 --> 00:35:24,920 Speaker 1: I went, oh, I'll put a battery in. It's fucking 649 00:35:24,960 --> 00:35:28,799 Speaker 1: brand new, brand new, awesome. Yeah, it's the best, and 650 00:35:28,840 --> 00:35:32,800 Speaker 1: it's lime green. It's disgusting lime green, but it's also 651 00:35:32,880 --> 00:35:36,600 Speaker 1: the best color ever. So and then I usually lose 652 00:35:36,640 --> 00:35:39,120 Speaker 1: myself down the Tetris rabbit hole for three days and 653 00:35:39,120 --> 00:35:41,200 Speaker 1: then put it back in the cupboard for another two years. 654 00:35:41,680 --> 00:35:44,120 Speaker 3: Me having an identical twin brother. 655 00:35:44,480 --> 00:35:48,840 Speaker 2: When we were kids and the first electronic games came out, 656 00:35:49,120 --> 00:35:50,760 Speaker 2: we both got given games. 657 00:35:50,920 --> 00:35:52,480 Speaker 3: We were usually given gifts to share. 658 00:35:52,800 --> 00:35:55,080 Speaker 2: Who gives a gift to two kids to share just 659 00:35:55,120 --> 00:35:56,719 Speaker 2: because we share the same birthday? 660 00:35:57,000 --> 00:35:58,320 Speaker 3: That's not even a gift. 661 00:35:58,480 --> 00:36:01,840 Speaker 2: How many of those days ended in fights and tears? 662 00:36:01,880 --> 00:36:03,080 Speaker 3: Do you reckon? So? 663 00:36:04,080 --> 00:36:08,640 Speaker 2: Anyway, it's such tight asses, your relatives, out in the 664 00:36:08,680 --> 00:36:09,920 Speaker 2: working class suburbs. 665 00:36:09,920 --> 00:36:15,520 Speaker 1: Shut up relatives, fucking do better. Two presents, two kids. 666 00:36:14,760 --> 00:36:18,440 Speaker 3: My favorite auntie. I love her to death. She's awesome. 667 00:36:18,480 --> 00:36:21,319 Speaker 2: She's my godmother and she's Irish as well.
So she 668 00:36:21,480 --> 00:36:23,360 Speaker 2: just did this whole Patrick thing. She loved me because 669 00:36:23,400 --> 00:36:25,200 Speaker 2: my name is Patrick. I think that was the 670 00:36:25,239 --> 00:36:28,440 Speaker 2: other reason. But she always used to make a big, 671 00:36:28,440 --> 00:36:29,680 Speaker 2: big thing about it. And she used to give us 672 00:36:29,680 --> 00:36:31,880 Speaker 2: the Guinness Book of World Records. Of course it was Guinness. 673 00:36:31,880 --> 00:36:34,279 Speaker 2: Why am I thinking that was funny? So she used 674 00:36:34,320 --> 00:36:35,920 Speaker 2: to give us the book of world records 675 00:36:35,960 --> 00:36:40,319 Speaker 2: to share, and we fought over it every year. 676 00:36:40,719 --> 00:36:43,200 Speaker 2: But anyway, getting back to computer games. So I've got 677 00:36:43,200 --> 00:36:50,280 Speaker 2: this two person robot fighting game where one person has 678 00:36:50,520 --> 00:36:52,600 Speaker 2: like little buttons on one side, another person on 679 00:36:52,640 --> 00:36:55,640 Speaker 2: the other side, and two robots basically fire weapons 680 00:36:55,360 --> 00:36:57,160 Speaker 3: at each other. And then the other one was, remember 681 00:36:57,200 --> 00:36:59,960 Speaker 3: Galaxian? The game Galaxian. 682 00:37:00,160 --> 00:37:05,080 Speaker 1: Two people. Anybody? I mean, I think I know the name, 683 00:37:05,080 --> 00:37:06,040 Speaker 1: but I've never played it. 684 00:37:07,920 --> 00:37:10,520 Speaker 2: So we again had a two person game where one 685 00:37:10,520 --> 00:37:12,560 Speaker 2: person stood on the other side and they had their 686 00:37:12,600 --> 00:37:14,840 Speaker 2: controls and they were looking down at the screen and 687 00:37:14,880 --> 00:37:18,400 Speaker 2: they played the aliens and you played the person defending. 688 00:37:18,560 --> 00:37:21,200 Speaker 2: So they were both two people games.
And so years later, 689 00:37:21,280 --> 00:37:23,000 Speaker 2: and I think I'd moved up to Blant at this point, 690 00:37:23,000 --> 00:37:25,880 Speaker 2: but we're talking maybe ten years ago. My brother said, jeez, 691 00:37:25,880 --> 00:37:28,240 Speaker 2: you know, I wonder whatever happened to all those games 692 00:37:28,239 --> 00:37:30,520 Speaker 2: that we got, and I reckon Dad threw them all away, 693 00:37:30,960 --> 00:37:33,720 Speaker 2: and it's like, no, when I moved out of home, 694 00:37:33,640 --> 00:37:35,000 Speaker 3: I took them. Still. 695 00:37:36,400 --> 00:37:38,719 Speaker 1: So your brother didn't know you did that? 696 00:37:41,400 --> 00:37:43,680 Speaker 2: I've admitted it now. I admitted it, and I said, 697 00:37:44,040 --> 00:37:48,680 Speaker 2: actually, I've got them. He was so outraged that I'd 698 00:37:48,760 --> 00:37:51,480 Speaker 2: taken them. It's like, well, you know, I was repatriating them. 699 00:37:51,560 --> 00:37:52,600 Speaker 2: You know they were mine too. 700 00:37:54,680 --> 00:37:58,600 Speaker 1: Now you've got a hybrid. I've got a hybrid. Yeah, yeah, 701 00:37:58,640 --> 00:38:02,800 Speaker 1: it does not. You are a hybrid. You are thirty 702 00:38:02,800 --> 00:38:07,720 Speaker 1: percent female, seventy percent male. We've jumped; now the numbers 703 00:38:07,760 --> 00:38:14,120 Speaker 1: have changed, seventy thirty the other way. China have produced 704 00:38:14,200 --> 00:38:18,880 Speaker 1: a hybrid EV that drove one thousand, four hundred 705 00:38:18,880 --> 00:38:22,399 Speaker 1: and forty four miles on, well, it's not really a tank, 706 00:38:22,480 --> 00:38:26,600 Speaker 1: but let's call it one tank, without refueling.
Now, off 707 00:38:26,640 --> 00:38:28,880 Speaker 1: the top of my head, I feel like fourteen forty 708 00:38:28,920 --> 00:38:33,879 Speaker 1: five miles is significantly over two thousand kilometers on 709 00:38:34,200 --> 00:38:36,880 Speaker 1: basically one fueling. That's pretty incredible. 710 00:38:37,120 --> 00:38:40,600 Speaker 2: Yep, two thousand, three hundred and twenty five kilometers 711 00:38:40,719 --> 00:38:42,000 Speaker 2: is exactly how far it went. 712 00:38:42,160 --> 00:38:42,399 Speaker 3: Yeah. 713 00:38:42,600 --> 00:38:45,759 Speaker 2: So it's a Chinese car company called FAW, and 714 00:38:46,040 --> 00:38:47,440 Speaker 2: this is a new milestone. 715 00:38:47,520 --> 00:38:53,239 Speaker 3: So they had a Hongqi HS6. The Hongqi 716 00:38:53,360 --> 00:38:54,840 Speaker 3: HS6. 717 00:38:54,640 --> 00:38:56,480 Speaker 1: Sounds pretty sick, that old chestnut. 718 00:38:58,120 --> 00:39:02,080 Speaker 2: It's a plug in hybrid SUV, and it's set the 719 00:39:02,080 --> 00:39:05,440 Speaker 2: Guinness World Record for the longest distance traveled on a 720 00:39:05,480 --> 00:39:09,560 Speaker 2: single full charge and tank of fuel, so without the 721 00:39:09,640 --> 00:39:11,560 Speaker 2: need to refuel. 722 00:39:11,400 --> 00:39:13,880 Speaker 3: And that's a staggering amount of distance. 723 00:39:14,800 --> 00:39:20,640 Speaker 1: What's the biggest hurdle for EVs? I 724 00:39:20,640 --> 00:39:23,440 Speaker 1: know there's a few, but is it just charging time, 725 00:39:24,040 --> 00:39:28,000 Speaker 1: for making them really, really commercially viable and popular? You know, 726 00:39:28,040 --> 00:39:31,080 Speaker 1: because I can go fill my little Suzuki Swift in 727 00:39:31,160 --> 00:39:33,560 Speaker 1: three minutes with enough gas to get me around for 728 00:39:33,600 --> 00:39:36,680 Speaker 1: the next week or two.
But if I'm going to 729 00:39:36,719 --> 00:39:39,520 Speaker 1: do the same with my, I don't know, Tesla or whatever, 730 00:39:39,600 --> 00:39:42,200 Speaker 1: depending on which one I've got, or which Chinese EV, 731 00:39:42,920 --> 00:39:45,200 Speaker 1: it could be eight to ten hours, depending on how 732 00:39:45,200 --> 00:39:47,759 Speaker 1: I do it. Is that their challenge moving forward, to 733 00:39:47,800 --> 00:39:48,880 Speaker 1: get that time down? 734 00:39:49,400 --> 00:39:51,239 Speaker 3: No. Yes or no. 735 00:39:52,320 --> 00:39:55,479 Speaker 2: I do agree that that is problematic when we're living 736 00:39:55,480 --> 00:39:59,040 Speaker 2: in a country where range anxiety is an issue. But 737 00:39:59,120 --> 00:40:02,040 Speaker 2: the reality of it is the average Australian isn't going 738 00:40:02,080 --> 00:40:05,840 Speaker 2: to drive more than three or four hundred kilometers in 739 00:40:05,840 --> 00:40:08,879 Speaker 2: a day. They're going to commute to work, and if 740 00:40:08,880 --> 00:40:12,160 Speaker 2: you're charging, you're going to charge overnight. And going to 741 00:40:12,200 --> 00:40:14,760 Speaker 2: a charging station, you can go somewhere to a charging 742 00:40:14,840 --> 00:40:18,080 Speaker 2: point and charge for twenty minutes and get to eighty 743 00:40:18,080 --> 00:40:19,719 Speaker 2: percent of range, so you don't have to 744 00:40:19,680 --> 00:40:22,000 Speaker 3: do that for eight hours, or whatever it is, to get 745 00:40:22,040 --> 00:40:22,760 Speaker 3: to a full charge. 746 00:40:22,800 --> 00:40:24,640 Speaker 2: And in fact, the argument is that you should never 747 00:40:24,920 --> 00:40:27,080 Speaker 2: charge your car to full charge. It should sit between 748 00:40:27,080 --> 00:40:30,040 Speaker 2: the eighty and twenty mark, not get below twenty, but not 749 00:40:29,960 --> 00:40:30,920 Speaker 3: get above eighty.
750 00:40:31,640 --> 00:40:34,640 Speaker 2: So I think, really, when we think about any hurdles 751 00:40:34,640 --> 00:40:38,720 Speaker 2: to EVs, I don't think range anxiety is the issue, 752 00:40:39,000 --> 00:40:40,400 Speaker 2: because the reality of it is, 753 00:40:40,600 --> 00:40:42,560 Speaker 3: I mean, think about it. How often do you drive? 754 00:40:42,840 --> 00:40:45,400 Speaker 2: I mean, if the average range of an EV is 755 00:40:45,440 --> 00:40:48,040 Speaker 2: four hundred and fifty kilometers, how often do you drive 756 00:40:48,080 --> 00:40:49,920 Speaker 2: four hundred and fifty kilometers? 757 00:40:50,920 --> 00:40:52,840 Speaker 1: I do regularly, because I go up and back to 758 00:40:52,960 --> 00:40:56,960 Speaker 1: Mum's in a day. Mum and Dad's. Yeah, but four fifty, 759 00:40:57,040 --> 00:40:59,239 Speaker 1: if it actually did that, well, that would cover me. 760 00:40:59,440 --> 00:41:02,200 Speaker 1: That would be, because I think it's probably only 761 00:41:02,239 --> 00:41:05,600 Speaker 1: about three hundred, three fifty maybe, depending on if I 762 00:41:05,600 --> 00:41:08,480 Speaker 1: do a little bit of, yeah, a little bit of extra. 763 00:41:08,560 --> 00:41:11,799 Speaker 1: But you know, one thing that's probably a bit more 764 00:41:11,800 --> 00:41:14,160 Speaker 1: in my wheelhouse than yours, but one thing that's 765 00:41:14,200 --> 00:41:17,240 Speaker 1: not really taken off is electric motorbikes. 766 00:41:17,760 --> 00:41:18,000 Speaker 4: Yeah. 767 00:41:18,040 --> 00:41:18,839 Speaker 3: Interesting, isn't it? 768 00:41:18,840 --> 00:41:23,080 Speaker 2: Because there was a partnership between Suzuki and a few 769 00:41:23,080 --> 00:41:25,840 Speaker 2: of the other big players, and they were talking about 770 00:41:25,840 --> 00:41:29,680 Speaker 2: swappable batteries.
The idea was that you drove into a servo, 771 00:41:30,040 --> 00:41:32,279 Speaker 2: dropped the battery, put the new battery in, drive off, 772 00:41:32,400 --> 00:41:36,279 Speaker 2: a bit like, you know, exchanging your gas cylinder with barbecues. 773 00:41:36,520 --> 00:41:38,359 Speaker 2: And it made so much sense. And in fact, they 774 00:41:38,440 --> 00:41:41,719 Speaker 2: teed up, like, three of the biggest manufacturers said, let's 775 00:41:41,760 --> 00:41:44,840 Speaker 2: standardize the battery, let's be consistent, so we can have 776 00:41:44,880 --> 00:41:48,000 Speaker 2: this swap and go mentality. And it sounded like such 777 00:41:48,000 --> 00:41:52,560 Speaker 2: a good idea. And we've seen some amazing looking electric bikes. 778 00:41:53,200 --> 00:41:55,040 Speaker 2: But I don't know whether it's a power to weight 779 00:41:55,160 --> 00:41:57,800 Speaker 2: ratio, in terms of whether or not you can cram it in. 780 00:41:57,840 --> 00:41:59,880 Speaker 2: I mean, people are riding scooters all the time, so 781 00:42:00,280 --> 00:42:03,520 Speaker 2: I don't know why it wouldn't be a thing. But yeah, definitely, 782 00:42:03,560 --> 00:42:07,480 Speaker 2: scooters are slowly catching on. But I think you'll find 783 00:42:08,800 --> 00:42:11,360 Speaker 2: quite a few friends of the show ride. Scotty Douglas 784 00:42:11,400 --> 00:42:13,840 Speaker 2: rides a motorbike, and Tiff rides a motorbike, and 785 00:42:13,920 --> 00:42:14,960 Speaker 2: she's got a loud bike. 786 00:42:15,320 --> 00:42:21,759 Speaker 1: I've got a stupid loud bike. Scott's got most. I think 787 00:42:22,000 --> 00:42:25,360 Speaker 1: bikes are more emotional than practical. I mean, they're practical, 788 00:42:25,880 --> 00:42:28,479 Speaker 1: but it's like, I don't want a bike that makes 789 00:42:28,520 --> 00:42:32,480 Speaker 1: no noise, for two reasons. One, safety. If you've got 790 00:42:32,480 --> 00:42:34,560 Speaker 1: a loud bike, people know you're there.
They're less likely 791 00:42:34,600 --> 00:42:36,799 Speaker 1: to pull in front of you, cut you off, you know, 792 00:42:36,960 --> 00:42:39,640 Speaker 1: open their door on you. You know, you don't want 793 00:42:39,640 --> 00:42:42,680 Speaker 1: it to be overwhelmingly thunderous, but if you are on 794 00:42:42,719 --> 00:42:44,560 Speaker 1: a motorbike, you would like it to be a little 795 00:42:44,560 --> 00:42:48,440 Speaker 1: bit louder than the average car. And then two, as 796 00:42:48,480 --> 00:42:53,239 Speaker 1: an experience, riding a motorbike that sounds amazing just 797 00:42:53,280 --> 00:42:56,160 Speaker 1: makes it more enjoyable. And people who don't ride motorbikes, 798 00:42:56,239 --> 00:42:59,000 Speaker 1: I totally understand, they would go, well, that's fucking ridiculous. 799 00:42:59,719 --> 00:43:04,799 Speaker 1: But you know, there's something, there's something that happens when 800 00:43:04,800 --> 00:43:07,440 Speaker 1: you ride a motorbike that sounds amazing. Am I right, Tiff? 801 00:43:07,880 --> 00:43:08,520 Speaker 1: That is correct. 802 00:43:08,560 --> 00:43:11,080 Speaker 4: But I would like to add that Craig's bike is 803 00:43:11,239 --> 00:43:19,000 Speaker 4: overwhelmingly thunderous. Overwhelmingly so. He's, well, Mary and Ron can 804 00:43:19,040 --> 00:43:22,840 Speaker 4: hear it start up way down there in till Drove Valley. 805 00:43:24,160 --> 00:43:27,600 Speaker 1: Yeah, that bitch is loud. That motherfucker is. Let's 806 00:43:27,719 --> 00:43:32,839 Speaker 1: just say that. Let's just say that I might need 807 00:43:32,880 --> 00:43:34,960 Speaker 1: to, just when I ride it up Hampton Street, I 808 00:43:35,120 --> 00:43:40,880 Speaker 1: just, I don't even, I just gently roll it. I 809 00:43:41,000 --> 00:43:44,160 Speaker 1: just ease that throttle on, change gears quiet, 810 00:43:44,160 --> 00:43:48,000 Speaker 1: I try and be low key.
It's like, please 811 00:43:48,440 --> 00:43:51,320 Speaker 1: don't notice that I'm riding a volcano up Hampton Street. 812 00:43:52,280 --> 00:43:56,040 Speaker 4: But yeah, electric car, fine, but because of that, 813 00:43:56,120 --> 00:43:57,960 Speaker 4: I would not have an electric 814 00:43:58,000 --> 00:44:01,680 Speaker 4: bike. Since getting that, since riding with a 815 00:44:01,880 --> 00:44:06,400 Speaker 4: stock exhaust and then changing the exhaust, it completely transformed the 816 00:44:06,440 --> 00:44:07,440 Speaker 4: riding experience. 817 00:44:07,680 --> 00:44:13,000 Speaker 2: And yeah, sorry, it makes you... Sorry, did you say 818 00:44:13,120 --> 00:44:14,040 Speaker 2: you've made it louder? 819 00:44:14,520 --> 00:44:18,880 Speaker 4: Yeah. Oh okay, not louder than Craig's. 820 00:44:19,160 --> 00:44:24,880 Speaker 1: No, no. So I was thinking, yeah, all the pseudo 821 00:44:24,960 --> 00:44:27,439 Speaker 1: psychologists, and, oh, of course he's got it loud because 822 00:44:27,440 --> 00:44:30,439 Speaker 1: he's fucking insecure, and that exhaust is just saying, look 823 00:44:30,440 --> 00:44:34,080 Speaker 1: at me, look at me, only child. And might I say, 824 00:44:34,080 --> 00:44:38,319 Speaker 1: you're probably all correct, so carry on. You carry on. 825 00:44:39,080 --> 00:44:41,239 Speaker 2: I was just thinking the opposite while you were talking, because I've 826 00:44:41,280 --> 00:44:43,719 Speaker 2: done a lot of paragliding and I always thought that 827 00:44:43,760 --> 00:44:45,919 Speaker 2: I would love to be able to get a paramotor. 828 00:44:46,040 --> 00:44:49,799 Speaker 3: So a paraglider is a fabric wing. You
829 00:44:49,800 --> 00:44:52,480 Speaker 2: run off a cliff or, you know, on the coast 830 00:44:52,600 --> 00:44:55,280 Speaker 2: or down a mountain or whatever, and you use thermals 831 00:44:55,280 --> 00:44:59,160 Speaker 2: for lift, or whatever updraft, but you can get what 832 00:44:59,200 --> 00:45:02,680 Speaker 2: they call a paramotor, which is a propeller that 833 00:45:02,640 --> 00:45:05,480 Speaker 3: you put onto the back of your backpack. The problem is, 834 00:45:05,800 --> 00:45:07,360 Speaker 3: for me, it's so loud. 835 00:45:07,520 --> 00:45:10,680 Speaker 2: I've loved paragliding, because being up there and not having 836 00:45:10,719 --> 00:45:13,080 Speaker 2: a motor is amazing. But I was just thinking, while 837 00:45:13,120 --> 00:45:15,600 Speaker 2: you guys were chatting about how you wouldn't go for 838 00:45:15,640 --> 00:45:18,520 Speaker 2: an electric bike, that I would go for an electric 839 00:45:18,680 --> 00:45:22,200 Speaker 2: paramotor, because that would be phenomenal, because you haven't got 840 00:45:22,239 --> 00:45:26,400 Speaker 2: the sound, all the extra noise. So an electric paramotor, 841 00:45:26,440 --> 00:45:28,279 Speaker 2: I think that would be amazing. And then you don't 842 00:45:28,320 --> 00:45:30,160 Speaker 2: have to climb anything. You can just kind of take 843 00:45:30,200 --> 00:45:31,919 Speaker 2: off from the ground, which would be pretty cool too. 844 00:45:32,320 --> 00:45:34,840 Speaker 1: You could just reincarnate as an eagle. That'd be cool. 845 00:45:35,440 --> 00:45:35,640 Speaker 1: You know, 846 00:45:35,680 --> 00:45:37,799 Speaker 2: one of the most exciting, in fact, almost one of 847 00:45:37,800 --> 00:45:39,520 Speaker 2: the last flights I did in my paraglider... 848 00:45:39,520 --> 00:45:41,960 Speaker 1: Notice how he just didn't even acknowledge 849 00:45:41,960 --> 00:45:44,719 Speaker 1: that I spoke then, Tiff. Did you notice that?
850 00:45:44,480 --> 00:45:46,840 Speaker 3: I was telling a story that related exactly to what 851 00:45:46,880 --> 00:45:48,800 Speaker 3: you're saying. You didn't give me a chance to finish. 852 00:45:49,000 --> 00:45:50,480 Speaker 1: Ah, we'll try harder. 853 00:45:51,800 --> 00:45:56,839 Speaker 2: So I launched at the launch site at Bright, and 854 00:45:56,960 --> 00:45:59,560 Speaker 2: as I launched over the treetops and got a 855 00:45:59,600 --> 00:46:01,759 Speaker 2: little bit of distance away, I looked down and there 856 00:46:01,800 --> 00:46:04,200 Speaker 2: was an eagle flying underneath me. 857 00:46:04,880 --> 00:46:06,080 Speaker 3: Wow, spread wings. 858 00:46:06,080 --> 00:46:09,000 Speaker 2: And I thought to myself at that moment, how many 859 00:46:09,040 --> 00:46:12,760 Speaker 2: people have ever been in a situation where they're flying 860 00:46:13,040 --> 00:46:16,640 Speaker 2: above an eagle? You know, the little winglets were just 861 00:46:16,719 --> 00:46:18,960 Speaker 2: kind of gently moving in the breeze. You know how 862 00:46:19,000 --> 00:46:21,520 Speaker 2: they adjust the little ailerons on their wings. And I 863 00:46:21,560 --> 00:46:22,719 Speaker 2: saw that from the top. 864 00:46:23,400 --> 00:46:26,120 Speaker 1: See, that is... that's pretty cool. I'll give you that. 865 00:46:26,480 --> 00:46:28,399 Speaker 1: One more thing before we go. I want to ask 866 00:46:28,440 --> 00:46:31,880 Speaker 1: you about, and I know I'm a dinosaur, and I 867 00:46:31,920 --> 00:46:33,600 Speaker 1: know I've been using it for a while, but not 868 00:46:33,640 --> 00:46:37,320 Speaker 1: in a significant way.
But lately I've been using voice 869 00:46:37,360 --> 00:46:40,879 Speaker 1: to text, just in my phone, just using notes, which 870 00:46:40,920 --> 00:46:43,200 Speaker 1: I know there's a million, probably a million better ways 871 00:46:43,239 --> 00:46:45,920 Speaker 1: to do it, but so often when I'm going for 872 00:46:46,040 --> 00:46:50,319 Speaker 1: my myriad of walks through the day, I'll have 873 00:46:50,360 --> 00:46:52,840 Speaker 1: an idea, and so I just talk into my phone, 874 00:46:52,840 --> 00:46:54,640 Speaker 1: and by the time I get home, I've got three 875 00:46:54,719 --> 00:46:59,319 Speaker 1: or four or five pages of notes. One, do you 876 00:46:59,440 --> 00:47:03,000 Speaker 1: use voice to text like that, Patrick? And two, is 877 00:47:03,040 --> 00:47:06,200 Speaker 1: there a more efficient and effective way than what I'm doing, 878 00:47:06,200 --> 00:47:09,719 Speaker 1: which is basically just pressing the audio or the microphone 879 00:47:09,719 --> 00:47:11,120 Speaker 1: button and talking into notes. 880 00:47:11,840 --> 00:47:15,000 Speaker 3: It's a good question. Look, I don't use it, I've 881 00:47:15,000 --> 00:47:15,560 Speaker 3: got to say. 882 00:47:15,680 --> 00:47:19,440 Speaker 2: I tend to, like, write, but I use voice to 883 00:47:19,520 --> 00:47:22,640 Speaker 2: text, or voice to add things to my calendar, when 884 00:47:22,640 --> 00:47:24,600 Speaker 2: I'm in bed and it's late at night and I 885 00:47:24,600 --> 00:47:26,239 Speaker 2: suddenly think of something, so I don't have to turn 886 00:47:26,239 --> 00:47:28,360 Speaker 2: the light on. I just ask to add it to 887 00:47:28,400 --> 00:47:30,520 Speaker 2: my calendar and then I don't think about it anymore. 888 00:47:31,200 --> 00:47:34,040 Speaker 2: Is there something more efficient? I guess it depends on 889 00:47:34,160 --> 00:47:35,920 Speaker 2: what you wanted to do.
If you wanted to get 890 00:47:35,960 --> 00:47:38,680 Speaker 2: a summary of your notes, if you felt, you know, 891 00:47:38,760 --> 00:47:41,400 Speaker 2: that's where you'd employ an AI agent. But what 892 00:47:41,400 --> 00:47:44,160 Speaker 2: you're doing now, just recording into it, if it's accurate, 893 00:47:44,239 --> 00:47:47,040 Speaker 2: and you find the accuracy level is high, then I 894 00:47:47,080 --> 00:47:49,239 Speaker 2: think you're doing exactly the right thing. It's a really 895 00:47:49,280 --> 00:47:52,440 Speaker 2: efficient way of doing it. And you know, I'd keep 896 00:47:52,480 --> 00:47:53,160 Speaker 2: doing what you're doing. 897 00:47:53,760 --> 00:47:56,800 Speaker 1: I was walking yesterday and I was thinking, I'm always 898 00:47:56,800 --> 00:47:59,280 Speaker 1: thinking about, like, if I've got to do a solo podcast, 899 00:47:59,320 --> 00:48:04,240 Speaker 1: what's something that, you know, I haven't 900 00:48:04,280 --> 00:48:07,480 Speaker 1: overspoken about, or maybe is somewhat fresh, or a fresh 901 00:48:07,520 --> 00:48:10,200 Speaker 1: angle on an old topic. But I was thinking about 902 00:48:10,239 --> 00:48:14,399 Speaker 1: the idea of, you know, how everyone tells everyone how 903 00:48:14,440 --> 00:48:17,480 Speaker 1: amazing they are and spectacular and incredible, and the truth 904 00:48:17,560 --> 00:48:20,399 Speaker 1: is that we're all not. Some people are, but I'm 905 00:48:20,440 --> 00:48:22,839 Speaker 1: wildly mediocre in most things. And so I was 906 00:48:22,880 --> 00:48:25,520 Speaker 1: coming up with an idea of, like, okay, so maybe 907 00:48:25,520 --> 00:48:28,359 Speaker 1: I don't have incredible genetics or the highest IQ.
Maybe 908 00:48:28,400 --> 00:48:31,719 Speaker 1: I'm not a creative genius, and maybe I'll never cure 909 00:48:31,800 --> 00:48:34,800 Speaker 1: cancer or run fast or jump high or solve the 910 00:48:35,600 --> 00:48:40,359 Speaker 1: bloody quantum theory problems or whatever. But 911 00:48:40,680 --> 00:48:42,200 Speaker 1: how do I make the most out of what I've 912 00:48:42,200 --> 00:48:47,760 Speaker 1: got to work with? And so this was my basis 913 00:48:47,800 --> 00:48:50,400 Speaker 1: as I was walking around. I came home with pretty 914 00:48:50,480 --> 00:48:53,359 Speaker 1: much a whole episode planned, and then I just cut 915 00:48:53,360 --> 00:48:56,120 Speaker 1: and pasted it into a doc, and then it's on 916 00:48:56,160 --> 00:48:58,160 Speaker 1: my computer and I tidy it up a little bit. 917 00:48:58,840 --> 00:49:02,200 Speaker 1: But literally, I'm doing two things at once, which is 918 00:49:02,200 --> 00:49:07,880 Speaker 1: pretty efficient. And for me, I tend to think, I 919 00:49:07,960 --> 00:49:10,440 Speaker 1: don't know, I feel like I'm more creative, and I 920 00:49:10,480 --> 00:49:13,600 Speaker 1: feel like my brain works better, maybe better sometimes when 921 00:49:13,600 --> 00:49:16,520 Speaker 1: I'm walking than when I'm sitting. If I'm trying to 922 00:49:16,520 --> 00:49:19,800 Speaker 1: conceptualize and create something, I think when I'm not sitting 923 00:49:19,840 --> 00:49:23,000 Speaker 1: in this chair looking at this screen, but when I'm 924 00:49:23,080 --> 00:49:25,759 Speaker 1: out and I'm just walking in nature or suburbia or 925 00:49:25,800 --> 00:49:27,879 Speaker 1: a bit of both, yeah, I feel like my brain 926 00:49:27,880 --> 00:49:29,279 Speaker 1: works better. I think there's a lot to 927 00:49:29,200 --> 00:49:31,640 Speaker 2: be said for the subconscious brain as well, what your 928 00:49:31,680 --> 00:49:33,680 Speaker 2: brain is doing.
You know, if you've got a puzzle 929 00:49:33,719 --> 00:49:38,280 Speaker 2: to solve, a challenge to face, then when you stop 930 00:49:38,360 --> 00:49:40,840 Speaker 2: thinking about it, it doesn't necessarily mean that your brain 931 00:49:40,920 --> 00:49:43,319 Speaker 2: isn't still processing it. I think I've told you the 932 00:49:43,400 --> 00:49:46,040 Speaker 2: story about getting a phone call from someone we both know, 933 00:49:46,280 --> 00:49:50,320 Speaker 2: Andrew Jobbers, and he was struggling because he was publishing 934 00:49:50,320 --> 00:49:53,239 Speaker 2: his eighth book, and the publisher had 935 00:49:53,280 --> 00:49:56,000 Speaker 2: a designer, and he wasn't happy with the cover, and 936 00:49:56,040 --> 00:49:57,880 Speaker 2: he was really stressing because it had gone back to 937 00:49:57,920 --> 00:50:01,640 Speaker 2: the designer multiple times and he couldn't get what he wanted. 938 00:50:01,920 --> 00:50:03,840 Speaker 2: So he sent me a rough draft and said, I 939 00:50:03,880 --> 00:50:06,319 Speaker 2: want to incorporate this, this, and this, and I said, oh, look, 940 00:50:06,360 --> 00:50:09,000 Speaker 2: I can have a look at it tomorrow. Went to sleep, 941 00:50:09,440 --> 00:50:12,280 Speaker 2: woke up about four am, the finished cover 942 00:50:12,440 --> 00:50:16,000 Speaker 3: was in my mind, all done, and I just got to 943 00:50:15,960 --> 00:50:19,960 Speaker 2: the computer, got in there, produced it, and that was 944 00:50:20,000 --> 00:50:20,759 Speaker 2: basically the cover. 945 00:50:21,320 --> 00:50:24,279 Speaker 3: I felt like I did zero work. I really did 946 00:50:24,360 --> 00:50:24,959 Speaker 3: zero work.
947 00:50:25,200 --> 00:50:28,560 Speaker 2: I had a concept, went to bed, subconscious brain has 948 00:50:28,640 --> 00:50:31,080 Speaker 2: worked on it overnight, doing what it was doing, and 949 00:50:31,080 --> 00:50:33,000 Speaker 2: the finished cover was there in the morning. 950 00:50:33,200 --> 00:50:35,280 Speaker 3: Just great. Wish life was like that all the time. 951 00:50:37,600 --> 00:50:40,920 Speaker 1: Well, I mean, your brain is always thinking, in inverted commas. 952 00:50:41,080 --> 00:50:44,680 Speaker 1: So there's thinking that happens because of us and despite us, 953 00:50:44,920 --> 00:50:47,520 Speaker 1: you know what I mean? So there's, like, conscious and 954 00:50:47,640 --> 00:50:51,000 Speaker 1: unconscious. It's like, this is prefrontal cortex critical thinking, 955 00:50:51,360 --> 00:50:55,160 Speaker 1: I'm focused on it there. And then there's the deeper stuff 956 00:50:55,200 --> 00:50:56,960 Speaker 1: which is just going on while you're trying to bang 957 00:50:56,960 --> 00:51:00,000 Speaker 1: out a few edits. The brain's fucking amazing. 958 00:51:00,000 --> 00:51:02,440 Speaker 2: Can I tell you a funny story about a viral 959 00:51:02,520 --> 00:51:05,359 Speaker 2: video I saw recently, because I wake in the night a bit? 960 00:51:05,480 --> 00:51:07,360 Speaker 1: Can it be less than two minutes? Because Tiff and 961 00:51:07,360 --> 00:51:09,480 Speaker 1: I have got to go. Okay, really funny story. 962 00:51:09,560 --> 00:51:12,040 Speaker 2: So the viral video: how to get back to sleep 963 00:51:12,080 --> 00:51:13,520 Speaker 2: when you wake up in the middle of the night. 964 00:51:13,560 --> 00:51:16,160 Speaker 2: I always struggle with this. You get an ice pack 965 00:51:16,239 --> 00:51:18,359 Speaker 2: wrapped in a tea towel, and you lay it 966 00:51:18,280 --> 00:51:23,400 Speaker 3: on your forehead, because it cools your brain. I tried it. 967 00:51:23,480 --> 00:51:24,759 Speaker 3: Works, doesn't it, Tiff?
968 00:51:25,760 --> 00:51:28,640 Speaker 4: I have not tried it, but I did see an 969 00:51:28,760 --> 00:51:31,600 Speaker 4: article recently. It must have gone crazy, because I feel 970 00:51:31,600 --> 00:51:32,480 Speaker 4: like I saw it a lot. 971 00:51:32,600 --> 00:51:35,239 Speaker 1: Tiff, you were given some high level medical advice by 972 00:51:35,280 --> 00:51:39,480 Speaker 1: the Crabb yesterday about how to not wake up with 973 00:51:39,520 --> 00:51:43,040 Speaker 1: a blood sugar level of two. Did you try it? 974 00:51:43,080 --> 00:51:46,520 Speaker 1: Tell people what it was, just again, in under 975 00:51:46,560 --> 00:51:47,680 Speaker 1: one minute. 976 00:51:47,640 --> 00:51:50,680 Speaker 4: Tablespoon of almond butter, and I'm down with that. I 977 00:51:50,719 --> 00:51:53,560 Speaker 4: love almond butter, so I have been spooning that bad 978 00:51:53,600 --> 00:51:54,560 Speaker 4: boy before bed. 979 00:51:54,880 --> 00:51:55,319 Speaker 3: It's great. 980 00:51:55,400 --> 00:51:56,720 Speaker 1: And... did it work? 981 00:51:57,920 --> 00:51:58,120 Speaker 3: Well? 982 00:51:58,280 --> 00:52:01,160 Speaker 4: Yeah, it has been. It has been, but I don't 983 00:52:01,200 --> 00:52:04,480 Speaker 4: know how accurate the old... We're back on, everyone. We're 984 00:52:04,520 --> 00:52:08,920 Speaker 4: back on the glucose monitor, which has been a wild ride again. 985 00:52:09,080 --> 00:52:11,360 Speaker 4: Maybe the first one wasn't faulty after all. 986 00:52:12,440 --> 00:52:15,120 Speaker 1: But tell people what problem you were trying to solve. 987 00:52:15,719 --> 00:52:19,440 Speaker 4: So the first calibrated night, my blood sugar dropped, 988 00:52:20,719 --> 00:52:24,200 Speaker 4: supposedly, to two point two on at least three occasions. 989 00:52:24,320 --> 00:52:27,759 Speaker 4: That's very low, and what's happening is it tends to do 990 00:52:27,880 --> 00:52:30,080 Speaker 4: so about four am.
I keep waking up at four am, 991 00:52:30,120 --> 00:52:32,680 Speaker 4: and I think it drops, cortisol spikes, and that 992 00:52:32,800 --> 00:52:33,560 Speaker 4: wakes me up. 993 00:52:34,840 --> 00:52:38,400 Speaker 1: So with the Crabb's prescription, that didn't happen? 994 00:52:39,239 --> 00:52:43,359 Speaker 4: Well, no, no, it didn't the last two nights, so stay tuned. 995 00:52:43,600 --> 00:52:47,640 Speaker 1: Doctor Crabb, at least a qualified doctor in Australia. Patrick, how 996 00:52:47,640 --> 00:52:50,080 Speaker 1: can people find you and follow you and connect with you? 997 00:52:50,920 --> 00:52:53,359 Speaker 2: You can go to websites now dot com dot au, that's 998 00:52:53,360 --> 00:52:56,399 Speaker 2: my business website, or tai chi at home dot com 999 00:52:56,400 --> 00:52:59,200 Speaker 2: dot au, which is my tai chi website. 1000 00:52:58,600 --> 00:53:04,160 Speaker 1: Perfect. Great work, you. Great work, Tiff. Thanks, everyone. 1001 00:53:04,239 --> 00:53:07,239 Speaker 1: We'll say goodbye off air, but thanks, Patty. Thanks, Tiffy.