1 00:00:00,760 --> 00:00:03,160 Speaker 1: G'day team. Welcome to another installment of the 2 00:00:03,200 --> 00:00:06,400 Speaker 1: You Project. It's been difficult over here. I've been trying 3 00:00:06,400 --> 00:00:09,720 Speaker 1: to get a break in the conversation. These two distractions 4 00:00:09,720 --> 00:00:14,280 Speaker 1: to my left and right on my screen just wouldn't 5 00:00:14,320 --> 00:00:16,680 Speaker 1: even let me start the show because they're so chatty. 6 00:00:16,680 --> 00:00:22,239 Speaker 1: Good morning, Tiff, Good morning, Good morning Patrick, Good morning Patrick, 7 00:00:22,280 --> 00:00:22,840 Speaker 1: and Fritz. 8 00:00:24,800 --> 00:00:26,720 Speaker 2: Oh sorry, you want us to speak now, do you? 9 00:00:26,720 --> 00:00:29,440 Speaker 2: You're complaining about me talking too much? So I thought, well, 10 00:00:29,800 --> 00:00:30,960 Speaker 2: I'll have a bit of time out. 11 00:00:31,840 --> 00:00:34,960 Speaker 1: Oh, look at you being hilarious, just a bit of comedy, 12 00:00:35,040 --> 00:00:36,800 Speaker 1: straight up hanging. I just got to go get a 13 00:00:36,840 --> 00:00:43,199 Speaker 1: needle and thread to sew up my sides. God, hello, Patrick, 14 00:00:43,240 --> 00:00:46,960 Speaker 1: Hello Fritz. Why doesn't Fritz talk? He's a dog and 15 00:00:47,000 --> 00:00:50,560 Speaker 1: he doesn't understand. He's very cute. Look at that. That's 16 00:00:50,560 --> 00:00:57,320 Speaker 1: a fucking lovable face. He seems very stoic. Now, I've 17 00:00:57,360 --> 00:00:59,760 Speaker 1: figured out what kind of dog. I've narrowed it down 18 00:00:59,800 --> 00:01:00,720 Speaker 1: to two dogs. 19 00:01:01,040 --> 00:01:02,080 Speaker 3: Okay, what are we getting? 20 00:01:03,240 --> 00:01:03,560 Speaker 4: Well?
21 00:01:05,080 --> 00:01:07,759 Speaker 1: See, I feel like it's a no win situation, because 22 00:01:07,840 --> 00:01:10,960 Speaker 1: whatever dog you say that you're thinking that you might get, 23 00:01:12,200 --> 00:01:15,240 Speaker 1: there's going to be people that go, no, definitely don't 24 00:01:15,240 --> 00:01:17,920 Speaker 1: get one of those, because they'll eat your fucking house 25 00:01:17,920 --> 00:01:20,959 Speaker 1: when you're asleep, or, you know, they'll dig up your garden, 26 00:01:21,080 --> 00:01:23,920 Speaker 1: or they eat too much, or they die too early, 27 00:01:24,000 --> 00:01:28,160 Speaker 1: or they get hip dysplasia, or... So, see, I 28 00:01:28,160 --> 00:01:31,840 Speaker 1: haven't even told, I haven't told anyone this. So I'm 29 00:01:31,840 --> 00:01:34,960 Speaker 1: telling the whole world at once. So the first one 30 00:01:35,080 --> 00:01:37,280 Speaker 1: is going to be no surprise, because I've already had 31 00:01:37,319 --> 00:01:43,440 Speaker 1: one golden retriever. The second one, seventy percent of our 32 00:01:43,480 --> 00:01:45,480 Speaker 1: audience will go, I don't even know what that is. 33 00:01:46,560 --> 00:01:48,120 Speaker 1: And I'll tell you what. I'm going to give you 34 00:01:48,160 --> 00:01:51,680 Speaker 1: each three guesses, and if either of you guess it, 35 00:01:53,000 --> 00:01:58,080 Speaker 1: I'll give you fifty bucks. But Tiff, get off your phone. 36 00:01:58,240 --> 00:01:59,000 Speaker 4: You're not allowed to. 37 00:01:59,360 --> 00:02:01,520 Speaker 3: I'll try for that breed you talked about. Is it 38 00:02:01,600 --> 00:02:03,400 Speaker 3: that breed you talked about the other day that I 39 00:02:03,400 --> 00:02:03,800 Speaker 3: won't read? 40 00:02:03,960 --> 00:02:04,120 Speaker 1: Well? 41 00:02:04,120 --> 00:02:07,480 Speaker 4: Then you're out, so you're straight off my... You know, 42 00:02:07,720 --> 00:02:11,400 Speaker 4: what about a harrier?
Nah, it's not a harrier. Okay? 43 00:02:11,400 --> 00:02:15,000 Speaker 2: What about an Alaskan Malamute? In the ballpark? 44 00:02:15,880 --> 00:02:17,040 Speaker 1: Yeah? Yeah? 45 00:02:17,440 --> 00:02:19,160 Speaker 4: Is that going to get twenty five bucks? 46 00:02:19,639 --> 00:02:23,240 Speaker 1: No? No, the dog I'm thinking about, but it's like, 47 00:02:23,360 --> 00:02:25,919 Speaker 1: this is not a dog you get lightly. Like, this 48 00:02:26,040 --> 00:02:29,160 Speaker 1: dog is a commitment, and that's why I'd probably lean 49 00:02:29,240 --> 00:02:32,040 Speaker 1: more towards the Golden Retriever, because they're just like me, 50 00:02:32,240 --> 00:02:36,520 Speaker 1: fat and lazy and eat food. Fat and lazy and like 51 00:02:36,600 --> 00:02:43,239 Speaker 1: cuddles. A Belgian Malinois, which... they're kind of 52 00:02:43,360 --> 00:02:47,160 Speaker 1: like a slightly smaller German Shepherd, and some say smarter 53 00:02:47,280 --> 00:02:54,080 Speaker 1: and better in that kind of highly trainable, fiercely loyal way, 54 00:02:55,320 --> 00:02:58,079 Speaker 1: and I... no, I don't want a big, alpha male dog, everyone. 55 00:02:58,080 --> 00:03:00,440 Speaker 1: I just love them, like, I've met a couple and 56 00:03:00,480 --> 00:03:02,000 Speaker 1: they're so fucking cool. 57 00:03:02,680 --> 00:03:05,560 Speaker 4: It's like black faces. Yeah, gold with a black face. 58 00:03:06,080 --> 00:03:13,400 Speaker 1: Yeah, they're like a human, lower maintenance. Anyway, I 59 00:03:13,440 --> 00:03:14,840 Speaker 1: don't know, but I've got to wait till I finish 60 00:03:14,919 --> 00:03:17,840 Speaker 1: my PhD, because I just don't have time to, like, 61 00:03:18,000 --> 00:03:19,919 Speaker 1: I think that's the other thing I think.
And you're 62 00:03:20,000 --> 00:03:23,440 Speaker 1: both dog owners, but I think if you're going to 63 00:03:23,440 --> 00:03:26,240 Speaker 1: get a dog, like, for me, I want to train 64 00:03:26,320 --> 00:03:29,880 Speaker 1: the dog really well and invest the time properly in 65 00:03:29,919 --> 00:03:33,480 Speaker 1: the first six to twelve months to build a relationship, create 66 00:03:33,520 --> 00:03:37,200 Speaker 1: good dynamics, and train that dog in. Not so 67 00:03:37,280 --> 00:03:40,920 Speaker 1: that it's a fucking weapon, but just, I think it's irresponsible 68 00:03:40,960 --> 00:03:44,720 Speaker 1: to have a totally untrained dog, and because they just 69 00:03:44,840 --> 00:03:48,880 Speaker 1: don't... because dogs are pack animals, they like being 70 00:03:48,960 --> 00:03:50,960 Speaker 1: led, as long as they're loved and all of that. 71 00:03:51,840 --> 00:03:54,520 Speaker 1: But you know, so I feel like, especially if I've 72 00:03:54,520 --> 00:03:56,960 Speaker 1: got a Belgian Malinois, I really need to invest some 73 00:03:57,040 --> 00:04:00,280 Speaker 1: time into some proper training, because if you don't train 74 00:04:00,400 --> 00:04:03,520 Speaker 1: some dogs, they're fucking nightmares. But if you train them properly, 75 00:04:03,520 --> 00:04:06,120 Speaker 1: they're great. If you're looking at one right 76 00:04:06,000 --> 00:04:08,640 Speaker 3: now, yeah, yeah. And you definitely don't want one of 77 00:04:08,680 --> 00:04:10,040 Speaker 3: those untrained. 78 00:04:09,560 --> 00:04:10,560 Speaker 5: Do you? 79 00:04:10,600 --> 00:04:10,840 Speaker 1: No? 80 00:04:11,080 --> 00:04:12,960 Speaker 5: No, terrifying things. 81 00:04:13,560 --> 00:04:15,360 Speaker 1: So one of the guys that I listen to, his 82 00:04:15,440 --> 00:04:19,000 Speaker 1: name's Tim Kennedy, he's Special Forces, ex-UFC. He's got 83 00:04:19,000 --> 00:04:27,599 Speaker 1: one, and he calls it the Fur Missile.
So what 84 00:04:27,640 --> 00:04:29,279 Speaker 1: are you doing over there, Tiff? You're talking to 85 00:04:29,279 --> 00:04:29,680 Speaker 1: your dad? 86 00:04:29,839 --> 00:04:34,880 Speaker 3: Yeah, he's just popped his head in the studio to take 87 00:04:34,960 --> 00:04:35,880 Speaker 3: Luna for a walk. 88 00:04:38,720 --> 00:04:41,600 Speaker 1: Tell everyone what's happening in your world before we get underway, 89 00:04:41,640 --> 00:04:44,520 Speaker 1: because there's lots going 90 00:04:44,279 --> 00:04:45,760 Speaker 5: on, speaking of animals. 91 00:04:45,800 --> 00:04:47,599 Speaker 3: I am going to be off on a plane to 92 00:04:47,680 --> 00:04:50,520 Speaker 3: India tomorrow, and so my dad flew in yesterday. He 93 00:04:50,640 --> 00:04:54,839 Speaker 3: is manning the chaotic house that is Bear 94 00:04:54,600 --> 00:04:57,920 Speaker 5: and Luna HQ. So he's learning the ropes. 95 00:04:58,000 --> 00:05:00,720 Speaker 3: He's learning the ropes, and he's getting wrapped around the 96 00:05:00,839 --> 00:05:03,360 Speaker 3: paws of those who run the household here. 97 00:05:04,880 --> 00:05:07,640 Speaker 1: Do they love him already? I mean, they know him. 98 00:05:09,400 --> 00:05:13,320 Speaker 3: Yesterday I was in the kitchen and I could just 99 00:05:13,400 --> 00:05:17,039 Speaker 3: hear giggling like a schoolgirl. And I snuck my head 100 00:05:17,040 --> 00:05:20,159 Speaker 3: around the corner, and Dad's on the couch with a 101 00:05:20,240 --> 00:05:23,599 Speaker 3: doona on him, with Luna under the doona with him, 102 00:05:23,920 --> 00:05:28,839 Speaker 3: smooching him, and he's giggling. And then Bear, four 103 00:05:28,880 --> 00:05:33,240 Speaker 3: hours after going into hiding when he first arrived, came out; 104 00:05:33,440 --> 00:05:36,040 Speaker 3: next minute he's laying under the doona and she's 105 00:05:36,080 --> 00:05:36,760 Speaker 3: on top of it.
106 00:05:37,760 --> 00:05:41,479 Speaker 1: Oh, that's hilarious. I wonder if, I wonder if this 107 00:05:41,640 --> 00:05:45,279 Speaker 1: is a, probably a dumb question, but maybe not, Patrick. 108 00:05:45,320 --> 00:05:50,920 Speaker 1: Do you reckon that the animals would know that 109 00:05:52,560 --> 00:05:56,200 Speaker 1: Tiff's dad is Tiff's dad, because there's some smell or 110 00:05:56,279 --> 00:05:58,760 Speaker 1: sense or some genetic familiarity? 111 00:06:00,040 --> 00:06:01,680 Speaker 4: Well, that's a really good question, isn't it. 112 00:06:02,440 --> 00:06:04,839 Speaker 1: Like, they have like incredible senses. 113 00:06:06,160 --> 00:06:09,640 Speaker 2: I think what they would sense is how Tiff feels 114 00:06:09,680 --> 00:06:13,919 Speaker 2: about her father. So, wow, he comes in and she 115 00:06:14,279 --> 00:06:16,960 Speaker 2: is relaxed and loving towards her dad. 116 00:06:17,320 --> 00:06:20,880 Speaker 4: That would be what they would sense, my friend. 117 00:06:22,400 --> 00:06:26,120 Speaker 1: Well, Tiff's dad and Tiff and I had a coffee 118 00:06:26,160 --> 00:06:30,400 Speaker 1: yesterday at one forty five, and by about two oh seven, 119 00:06:30,480 --> 00:06:32,960 Speaker 1: Tiff's dad was nearly fucking asleep at the table. The 120 00:06:33,000 --> 00:06:35,880 Speaker 1: poor old bugger. He was just, he hit the wall, 121 00:06:35,920 --> 00:06:38,919 Speaker 1: because he didn't sleep the night before, and he was 122 00:06:38,960 --> 00:06:42,440 Speaker 1: a little bit, he was a little bit hyper aware 123 00:06:42,480 --> 00:06:45,560 Speaker 1: about ticking all the boxes and going through the airport 124 00:06:45,600 --> 00:06:47,599 Speaker 1: and getting everything. Was a little bit worried about some 125 00:06:47,720 --> 00:06:48,800 Speaker 1: of that. God bless him. 126 00:06:48,920 --> 00:06:50,200 Speaker 4: It wasn't a little... nailed it.
127 00:06:50,200 --> 00:06:56,119 Speaker 2: It wasn't a little bit your conversation, too, that made 128 00:06:56,160 --> 00:06:56,960 Speaker 2: him fall asleep? 129 00:06:58,440 --> 00:07:01,240 Speaker 1: Oh yeah, no, it was definitely me. That's actually a 130 00:07:01,240 --> 00:07:04,320 Speaker 1: good point. I didn't think of that. Maybe because I'm 131 00:07:04,320 --> 00:07:07,599 Speaker 1: a boring prick. Yeah, no, thanks for bringing that to 132 00:07:07,640 --> 00:07:10,960 Speaker 1: my attention. I was thinking it was a lack of 133 00:07:11,000 --> 00:07:16,880 Speaker 1: sleep. Back in therapy for me now. Now, just quickly, Tiff, 134 00:07:17,040 --> 00:07:24,640 Speaker 1: you're heading off to India tomorrow. Yeah, what's the, what's 135 00:07:24,680 --> 00:07:27,320 Speaker 1: the update? Are you organized? I know you're not organized, but 136 00:07:27,640 --> 00:07:28,080 Speaker 1: where are you at? 137 00:07:28,600 --> 00:07:31,560 Speaker 3: Well, I started packing yesterday and I realized that I 138 00:07:31,600 --> 00:07:35,560 Speaker 3: don't have any India friendly clothes, when the description was 139 00:07:35,640 --> 00:07:39,920 Speaker 3: loose fitting and cover your shoulders. So I had to 140 00:07:40,040 --> 00:07:43,720 Speaker 3: duck out yesterday arvo and get some loose fitting linen 141 00:07:43,840 --> 00:07:50,040 Speaker 3: pants and stuff. Yeah, but we've got a tight, what 142 00:07:50,080 --> 00:07:53,239 Speaker 3: do you call it, baggage allowance. So it's a challenge. 143 00:07:53,280 --> 00:07:55,640 Speaker 3: I'm not good at packing. I normally take twenty kilos 144 00:07:55,640 --> 00:07:56,840 Speaker 3: for a week in Tassie. 145 00:07:58,520 --> 00:07:59,239 Speaker 5: Stay tuned. 146 00:07:59,560 --> 00:08:01,520 Speaker 3: There's a lot of early mornings coming up. 147 00:08:03,080 --> 00:08:05,160 Speaker 1: And I love your rationale that you told me.
You 148 00:08:05,320 --> 00:08:08,840 Speaker 1: say about how they wanted you to take all of 149 00:08:08,880 --> 00:08:12,920 Speaker 1: these prescription drugs for dysentery and all the potential stomach 150 00:08:13,680 --> 00:08:16,760 Speaker 1: ailments that you might, you know, and you're like, oh, 151 00:08:16,840 --> 00:08:20,160 Speaker 1: fuck it, other people will take them. I'll just use 152 00:08:20,200 --> 00:08:23,240 Speaker 1: theirs if I get sick. I'm not sure that's 153 00:08:23,240 --> 00:08:23,880 Speaker 1: a good plan. 154 00:08:24,160 --> 00:08:27,400 Speaker 3: Look, I'm not taking a bunch of prescription medication to 155 00:08:27,480 --> 00:08:29,960 Speaker 3: go and sit in the bush, and then have all 156 00:08:29,960 --> 00:08:32,080 Speaker 3: of us that have no idea go, oh, we should 157 00:08:32,080 --> 00:08:34,840 Speaker 3: take this prescription medication, when we're not sure what we've 158 00:08:34,880 --> 00:08:36,160 Speaker 3: got, and we should take it. 159 00:08:36,520 --> 00:08:37,840 Speaker 5: No. 160 00:08:37,840 --> 00:08:39,640 Speaker 2: Now, all you do, what you do is you have 161 00:08:39,640 --> 00:08:41,839 Speaker 2: it all lined up. You google it and say, right, 162 00:08:41,960 --> 00:08:43,720 Speaker 2: this is for diarrhea. 163 00:08:43,480 --> 00:08:44,920 Speaker 4: This is for this, this is for this. 164 00:08:45,200 --> 00:08:47,080 Speaker 2: You don't have to read it, just say, okay, there's 165 00:08:47,080 --> 00:08:50,640 Speaker 2: my diarrhea tablets, there's my tablets for headaches, there's my tablets. 166 00:08:52,040 --> 00:08:54,160 Speaker 1: You know what you can do. You can just take a 167 00:08:54,200 --> 00:08:57,080 Speaker 3: two week supply of protein bars and eat only that. 168 00:08:58,960 --> 00:09:01,480 Speaker 1: The only problem is the water. You know what you 169 00:09:01,480 --> 00:09:05,760 Speaker 1: can do is just ChatGPT it. Dear ChatGPT, okay, here's my situation.
170 00:09:06,520 --> 00:09:09,319 Speaker 1: I'm in the jungle. I'm pooing through the eye of 171 00:09:09,360 --> 00:09:14,040 Speaker 1: a needle. These are my drug options. I'm standing by 172 00:09:14,120 --> 00:09:16,319 Speaker 1: for directions. 173 00:09:16,640 --> 00:09:18,720 Speaker 2: Can I tell you, though, don't rely on 174 00:09:18,800 --> 00:09:21,600 Speaker 2: other people's stuff. Because when I went to China for 175 00:09:21,640 --> 00:09:25,320 Speaker 2: the first time, my tai chi group is predominantly older women, 176 00:09:25,480 --> 00:09:27,640 Speaker 2: twenty years older than me, so I get mothered a 177 00:09:27,640 --> 00:09:32,520 Speaker 2: fair bit. But we got to the university in a 178 00:09:32,520 --> 00:09:36,040 Speaker 2: place called Beidaihe, and there's no coffee, and 179 00:09:36,160 --> 00:09:39,800 Speaker 2: one lady has brought those little tea bag type coffees, 180 00:09:39,840 --> 00:09:41,800 Speaker 2: you know, the Harris coffee that's like in a tea bag. 181 00:09:42,000 --> 00:09:45,120 Speaker 2: You reckon, you reckon she's going to share it 182 00:09:45,160 --> 00:09:45,760 Speaker 2: with anybody? 183 00:09:45,760 --> 00:09:50,440 Speaker 4: Man, she's selling the shit. People really see people's colors 184 00:09:50,920 --> 00:09:53,880 Speaker 4: when they're the only one with an essential drug. 185 00:09:54,000 --> 00:10:02,600 Speaker 6: Oh wow, yeah. Watch out, hey. Speaking of, I 186 00:10:02,600 --> 00:10:05,480 Speaker 6: don't know, this is irrelevant to our general theme of 187 00:10:05,520 --> 00:10:07,600 Speaker 6: the show, but fuck it, you'll both find this interesting. 188 00:10:07,760 --> 00:10:13,800 Speaker 1: So it was Melissa's birthday yesterday, shout out to Melissa, 189 00:10:14,840 --> 00:10:19,960 Speaker 1: twenty three, and this is, she'll probably kill me for 190 00:10:20,040 --> 00:10:23,680 Speaker 1: telling you this, but fuck it.
So Melissa, 191 00:10:23,960 --> 00:10:26,480 Speaker 1: Melissa has never done drugs. She's never been drunk, she's never 192 00:10:26,480 --> 00:10:28,439 Speaker 1: been high, she's never had a sip of alcohol. She 193 00:10:28,480 --> 00:10:31,960 Speaker 1: doesn't drink coffee, she doesn't drink tea. But she's jealous 194 00:10:32,080 --> 00:10:35,160 Speaker 1: of us coffee drinkers, right, because I go, I tell 195 00:10:35,200 --> 00:10:37,920 Speaker 1: her how fucking amazing coffee is and that she's missing out, 196 00:10:37,960 --> 00:10:43,880 Speaker 1: and just like a good, what do you call it, a 197 00:10:43,960 --> 00:10:46,000 Speaker 1: drug dealer, I'm just like trying to get her over 198 00:10:46,040 --> 00:10:46,439 Speaker 1: the line. 199 00:10:46,480 --> 00:10:46,680 Speaker 4: Right. 200 00:10:47,400 --> 00:10:53,160 Speaker 1: So, anyway, she started drinking coffee purely just for the 201 00:10:53,240 --> 00:10:57,040 Speaker 1: cognitive benefits of having some caffeine in her system, because 202 00:10:57,040 --> 00:10:59,800 Speaker 1: she works a lot, right, and so by about two 203 00:10:59,880 --> 00:11:03,560 Speaker 1: or three p.m. she sometimes hits the wall. So she's 204 00:11:03,600 --> 00:11:06,800 Speaker 1: having like a lunchtime coffee and she hates it. But 205 00:11:06,840 --> 00:11:09,760 Speaker 1: she goes and buys a coffee. She thinks it's disgusting, 206 00:11:10,280 --> 00:11:14,360 Speaker 1: the worst shit she's ever tasted. So... and 207 00:11:14,400 --> 00:11:16,480 Speaker 1: her and I have an arrangement where we don't buy 208 00:11:16,559 --> 00:11:21,400 Speaker 1: each other birthday presents, right. But I heard this thing 209 00:11:21,440 --> 00:11:25,200 Speaker 1: advertised on the radio called Wakey, Wakey. Have you heard 210 00:11:25,200 --> 00:11:29,679 Speaker 1: of it?
I know, I know it sounds like, no, Patrick, 211 00:11:29,720 --> 00:11:32,040 Speaker 1: it's not the second line that you think of either. 212 00:11:32,400 --> 00:11:37,000 Speaker 1: I know what you're thinking. I'm not going to say 213 00:11:37,000 --> 00:11:39,680 Speaker 1: that, because I'm not going there on this podcast. I'm 214 00:11:39,679 --> 00:11:42,480 Speaker 1: doing one episode with no dick jokes. 215 00:11:42,559 --> 00:11:45,760 Speaker 2: Right. No... anyway, is that the story I'm talking about today? 216 00:11:45,800 --> 00:11:47,000 Speaker 2: Have you read my notes? 217 00:11:47,440 --> 00:11:47,600 Speaker 3: Oh? 218 00:11:47,679 --> 00:11:51,640 Speaker 1: I have? That's funny. That is funny. But anyway, but 219 00:11:51,720 --> 00:11:54,480 Speaker 1: you're opening that door, not me. But anyway. So I 220 00:11:54,520 --> 00:11:58,080 Speaker 1: went to the supermarket and I couldn't... anyway. They've got 221 00:11:58,080 --> 00:12:00,920 Speaker 1: these little, they're like Berocca-type tubes full of these 222 00:12:01,200 --> 00:12:05,319 Speaker 1: pills, or these large tablets, and they have the equivalent 223 00:12:05,440 --> 00:12:09,439 Speaker 1: of a cup of coffee. Now, not that I'm suggesting 224 00:12:09,520 --> 00:12:12,040 Speaker 1: anyone does this, but I thought, well, she's only doing 225 00:12:12,120 --> 00:12:15,080 Speaker 1: it for the caffeine, and we're talking about a six 226 00:12:15,120 --> 00:12:17,160 Speaker 1: dollar present. So I didn't really buy her anything. But 227 00:12:17,840 --> 00:12:21,920 Speaker 1: so I just got her... I haven't given them to 228 00:12:22,000 --> 00:12:24,600 Speaker 1: her yet, I'm saying yet, but I will. So I bought 229 00:12:24,600 --> 00:12:27,040 Speaker 1: her these things, you can just buy them, and they're 230 00:12:27,360 --> 00:12:30,720 Speaker 1: orange flavored, no calories, no sugar. So you get a 231 00:12:30,760 --> 00:12:35,120 Speaker 1: coffee without having a coffee. So I told her.
She's 232 00:12:35,200 --> 00:12:36,000 Speaker 1: quite excited. 233 00:12:36,559 --> 00:12:37,240 Speaker 4: What about an. 234 00:12:39,360 --> 00:12:43,520 Speaker 1: Well, that's different, that's different altogether. You know, do you know, 235 00:12:43,800 --> 00:12:46,959 Speaker 1: like, while we're opening this door on nootropics: nootropic 236 00:12:47,040 --> 00:12:51,559 Speaker 1: means cognitive enhancing. So any drug that has a nootropic 237 00:12:51,559 --> 00:12:56,400 Speaker 1: effect is a potential cognitive benefit, thinking clearer and sharper 238 00:12:56,440 --> 00:13:02,480 Speaker 1: for longer, and even improving IQ for a period of time. 239 00:13:02,760 --> 00:13:05,320 Speaker 1: Now, I know people think that's a joke, but your IQ, 240 00:13:06,280 --> 00:13:10,480 Speaker 1: your functional IQ, actually fluctuates during the day, right. So 241 00:13:10,600 --> 00:13:14,240 Speaker 1: obviously your cognitive performance, just like your energy, fluctuates through 242 00:13:14,280 --> 00:13:17,360 Speaker 1: the day. So your functional IQ fluctuates through the day, 243 00:13:18,080 --> 00:13:21,000 Speaker 1: and your IQ actually improves after you lift weights for 244 00:13:21,040 --> 00:13:24,360 Speaker 1: a couple of hours, but it also does with coffee 245 00:13:24,440 --> 00:13:32,600 Speaker 1: and nicotine. Nicotine, Patrick, is really widely used. Oh really? People 246 00:13:32,640 --> 00:13:36,480 Speaker 1: think... oh yeah, yeah, yeah. So, and again, everyone, this 247 00:13:36,559 --> 00:13:40,880 Speaker 1: is just a conversation, no recommendation, no prescription, no endorsement. 248 00:13:42,000 --> 00:13:48,640 Speaker 1: But Nicorette two and four milligram chewing gum sells 249 00:13:48,720 --> 00:13:54,839 Speaker 1: like fucking hotcakes, back in the days when hotcakes sold.
Well, yeah, 250 00:13:55,040 --> 00:13:58,680 Speaker 1: lots of my friends use that, like, at least twenty 251 00:13:58,679 --> 00:14:01,240 Speaker 1: of my friends use that, and it's, ah... But the 252 00:14:01,280 --> 00:14:05,200 Speaker 1: problem is, like with all drugs, if you use it 253 00:14:05,240 --> 00:14:08,280 Speaker 1: too much, it doesn't work. If you use it every day, 254 00:14:08,280 --> 00:14:13,080 Speaker 1: it doesn't work. But the odd, the odd 255 00:14:13,120 --> 00:14:16,959 Speaker 1: piece of Nicorette seems to be, I'm not saying it's 256 00:14:16,960 --> 00:14:19,560 Speaker 1: a good idea, but it seems to be effective. So 257 00:14:19,720 --> 00:14:23,480 Speaker 1: people who, let's say, they've got to focus 258 00:14:23,520 --> 00:14:25,760 Speaker 1: for three or four hours because they've got some big 259 00:14:25,840 --> 00:14:29,080 Speaker 1: meeting or something where they've got to be cognitively switched on, 260 00:14:29,840 --> 00:14:32,400 Speaker 1: they'll do that. Another one also, another one which is 261 00:14:32,520 --> 00:14:36,840 Speaker 1: much safer, and also the most researched supplement on the 262 00:14:36,880 --> 00:14:42,720 Speaker 1: planet, is creatine, which I use every day. Creatine I'd 263 00:14:42,760 --> 00:14:46,480 Speaker 1: gladly endorse. Again, it's not a recommendation, but I use it. 264 00:14:46,520 --> 00:14:49,520 Speaker 1: It's very, very... it's a naturally occurring thing in the body. 265 00:14:50,280 --> 00:14:52,720 Speaker 1: And there's also some, I'll shut up after this, but 266 00:14:52,960 --> 00:14:58,520 Speaker 1: some really interesting research that's come out recently for people 267 00:14:58,560 --> 00:15:03,640 Speaker 1: who are sleep dep... sleep deprived, which, 268 00:15:03,680 --> 00:15:08,600 Speaker 1: I sound like I'm sleep deprived.
People who are sleep 269 00:15:08,720 --> 00:15:14,480 Speaker 1: deprived are using what they call megadoses, of like four 270 00:15:14,560 --> 00:15:17,000 Speaker 1: or five times the normal thing, so like thirty grams 271 00:15:17,080 --> 00:15:20,160 Speaker 1: or twenty five grams instead of five as a dose, 272 00:15:20,240 --> 00:15:23,000 Speaker 1: and that seems to offset the lack of sleep, which 273 00:15:23,080 --> 00:15:25,960 Speaker 1: is not a solution, but it's just something that some 274 00:15:26,000 --> 00:15:29,480 Speaker 1: people are doing. So there it is, Patrick. 275 00:15:30,000 --> 00:15:31,680 Speaker 4: I was just reading up on creatine. 276 00:15:31,880 --> 00:15:33,880 Speaker 2: I didn't know much about it, but it says that 277 00:15:33,920 --> 00:15:37,600 Speaker 2: it's most often found in seafood and red meat, and 278 00:15:37,640 --> 00:15:41,080 Speaker 2: I'm thinking, holy crap, I'd better look for vegan creatine, 279 00:15:41,200 --> 00:15:43,840 Speaker 2: because I'm not getting any. 280 00:15:43,160 --> 00:15:46,160 Speaker 1: Well, yeah, and you could just, I mean, it is... 281 00:15:46,320 --> 00:15:49,760 Speaker 1: like, everybody who's listened to me knows 282 00:15:49,840 --> 00:15:53,400 Speaker 1: I never endorse or recommend anything, but I 283 00:15:53,440 --> 00:15:56,800 Speaker 1: can say that I use that and it works. I 284 00:15:56,800 --> 00:16:01,440 Speaker 1: can also say that I've had a few Nicorette chewing 285 00:16:01,440 --> 00:16:03,880 Speaker 1: gums over the years, but I've probably had in my 286 00:16:03,920 --> 00:16:07,640 Speaker 1: life ten, and I don't smoke. And by the way, 287 00:16:07,680 --> 00:16:10,320 Speaker 1: not that nicotine is great for you, but it's 288 00:16:10,680 --> 00:16:13,320 Speaker 1: not actually the bit of the cigarette that causes cancer. 289 00:16:13,360 --> 00:16:16,200 Speaker 1: But yeah, I don't.
I generally don't use it, 290 00:16:16,240 --> 00:16:18,320 Speaker 1: but if I need something where I've 291 00:16:18,320 --> 00:16:21,240 Speaker 1: got to be up and about, I've done that a 292 00:16:21,280 --> 00:16:23,280 Speaker 1: few times, and it kind of switches on the brain 293 00:16:23,320 --> 00:16:28,200 Speaker 1: a bit. Yeah, you two. Feel... so this is a 294 00:16:28,200 --> 00:16:30,880 Speaker 1: conversation, you two. Yeah. So when I stop talking, that's 295 00:16:30,920 --> 00:16:33,360 Speaker 1: a cue for one of you two fucking dummies to 296 00:16:33,400 --> 00:16:34,080 Speaker 1: start talking. 297 00:16:34,520 --> 00:16:36,840 Speaker 4: Well, okay, let me start the whole. 298 00:16:36,840 --> 00:16:43,520 Speaker 1: Fuck, have you two been on a podcast before? So 299 00:16:43,960 --> 00:16:44,320 Speaker 1: let me. 300 00:16:44,280 --> 00:16:45,840 Speaker 4: Just not gasbagging. 301 00:16:45,880 --> 00:16:48,080 Speaker 2: You just don't shut up, so we can't get a 302 00:16:48,120 --> 00:16:51,040 Speaker 2: word in edgeways. I'm part 303 00:16:50,800 --> 00:16:52,880 Speaker 5: of the audience. I'm going to leave this whole thing. 304 00:16:53,400 --> 00:16:54,200 Speaker 1: The monologue. 305 00:16:54,800 --> 00:16:55,360 Speaker 4: I love this. 306 00:16:57,280 --> 00:17:02,080 Speaker 1: Ah, fuck. It's true. Even I get sick of me sometimes, 307 00:17:02,080 --> 00:17:04,359 Speaker 1: so I don't blame you. You know, it wasn't until 308 00:17:04,359 --> 00:17:06,720 Speaker 1: this podcast that both Tiff and I learned how to 309 00:17:06,760 --> 00:17:14,600 Speaker 1: fall asleep with our eyes open. Okay, all right, point taken, 310 00:17:14,800 --> 00:17:15,520 Speaker 1: I'll be quiet. 311 00:17:16,600 --> 00:17:19,439 Speaker 4: I hate that, but... sorry. 312 00:17:20,359 --> 00:17:23,399 Speaker 1: Whatever you want, whatever you want, whatever you want. All right.
313 00:17:23,440 --> 00:17:24,320 Speaker 1: Don't worry about it, mate. 314 00:17:24,359 --> 00:17:28,879 Speaker 4: I think I may have offended him. Tiff... she's so, 315 00:17:29,160 --> 00:17:30,760 Speaker 4: no sympathy at all. 316 00:17:31,359 --> 00:17:33,760 Speaker 1: She doesn't give a fuck. I'm just going to say, 317 00:17:33,800 --> 00:17:36,399 Speaker 1: before you launch on the list of topics that you 318 00:17:36,520 --> 00:17:42,200 Speaker 1: sent through, it's always tech, AI, health. Today, working backwards 319 00:17:42,200 --> 00:17:45,160 Speaker 1: from the list, I can see tech, AI, scams, cars, 320 00:17:45,200 --> 00:17:47,440 Speaker 1: and at the top of the list, porn. 321 00:17:48,640 --> 00:17:53,240 Speaker 4: Please explain. I thought that'd get your attention. Look, we 322 00:17:53,280 --> 00:17:56,200 Speaker 4: don't often talk about porn, have we? We don't really 323 00:17:56,240 --> 00:17:57,440 Speaker 4: talk about porn very often. 324 00:17:57,520 --> 00:17:59,560 Speaker 2: So the first thing I can tell you, it's really 325 00:17:59,560 --> 00:18:03,560 Speaker 2: funny, this story, is when I compile my notes. As 326 00:18:03,600 --> 00:18:06,000 Speaker 2: you know, I often cheat, and I get one of 327 00:18:06,040 --> 00:18:09,440 Speaker 2: my assistants to actually compile the list. So I get 328 00:18:09,480 --> 00:18:11,040 Speaker 2: all the emails together and then I get them to 329 00:18:11,080 --> 00:18:14,120 Speaker 2: compile it. But the guy who was helping me this 330 00:18:14,160 --> 00:18:18,280 Speaker 2: week is seventeen, and I said, oh, don't put that 331 00:18:18,400 --> 00:18:21,560 Speaker 2: story in, because it's got links. I didn't want 332 00:18:21,640 --> 00:18:23,560 Speaker 2: him to put the Pornhub story in, because I 333 00:18:23,600 --> 00:18:25,760 Speaker 2: didn't want him to click on a link and potentially 334 00:18:25,800 --> 00:18:27,399 Speaker 2: send him to Pornhub.
335 00:18:28,640 --> 00:18:31,040 Speaker 1: So that was kind of... So this is the heading 336 00:18:31,119 --> 00:18:36,479 Speaker 1: that I see. Pornhub colorizes vintage erotic scenes dating 337 00:18:36,520 --> 00:18:39,639 Speaker 1: back to eighteen ninety six using AI. What a great 338 00:18:39,760 --> 00:18:40,800 Speaker 1: use of technology. 339 00:18:41,720 --> 00:18:43,680 Speaker 4: So they knew you'd like it. Now, 340 00:18:43,720 --> 00:18:48,440 Speaker 2: what they did, they used one hundred thousand current porn 341 00:18:48,600 --> 00:18:54,600 Speaker 2: videos to teach AI what porn was, right. Wow, yep. 342 00:18:54,880 --> 00:18:58,479 Speaker 2: So they used it to analyze the colorizing. Because 343 00:18:59,160 --> 00:19:02,320 Speaker 2: this stuff goes back to eighteen ninety six, this titillating 344 00:19:02,440 --> 00:19:05,280 Speaker 2: porn, it wouldn't be, you know, what we're used 345 00:19:05,280 --> 00:19:07,719 Speaker 2: to now. But what they did was they took this 346 00:19:07,880 --> 00:19:11,479 Speaker 2: vintage footage of erotic scenes, but they trained it on 347 00:19:11,600 --> 00:19:15,679 Speaker 2: current porn, and then they added and colorized it, so 348 00:19:15,680 --> 00:19:17,600 Speaker 2: they get the skin tones and all that sort 349 00:19:17,600 --> 00:19:21,919 Speaker 2: of stuff. And they've actually seriously remastered this stuff, so 350 00:19:21,960 --> 00:19:25,440 Speaker 2: it's a, it's a remastered film project, and there's 351 00:19:25,720 --> 00:19:31,240 Speaker 2: ten vintage videos of this erotica, and where they weren't 352 00:19:31,280 --> 00:19:34,399 Speaker 2: able to get any soundtrack, they actually dubbed their own. 353 00:19:34,680 --> 00:19:37,399 Speaker 2: I'm hoping it's seventies porn soundtrack music.
354 00:19:38,240 --> 00:19:41,560 Speaker 1: There's going to be some very distressed people out there going, hey, 355 00:19:41,640 --> 00:19:48,880 Speaker 1: do you want to see great great great great grandma? Like, firstly, 356 00:19:49,119 --> 00:19:54,320 Speaker 1: why, why, why are we doing this? Well, it's just 357 00:19:54,359 --> 00:19:56,359 Speaker 1: one of those because-we-can things, isn't it. 358 00:19:56,960 --> 00:20:00,800 Speaker 4: I just... look, maybe if they'd used AI to solve, 359 00:20:00,920 --> 00:20:03,119 Speaker 4: to cure cancer or something, might have been a 360 00:20:03,240 --> 00:20:06,080 Speaker 4: nice idea, but it's porn, isn't it? Yeah? 361 00:20:06,600 --> 00:20:10,960 Speaker 1: Wow, wow. Yeah, so many things I could say there. 362 00:20:11,040 --> 00:20:13,560 Speaker 1: As I said, I'm going to try and refrain today, 363 00:20:14,000 --> 00:20:16,639 Speaker 1: but I can't say that I'm going to line up 364 00:20:16,680 --> 00:20:17,600 Speaker 1: to. Have you seen it? 365 00:20:18,320 --> 00:20:21,080 Speaker 4: No, I haven't clicked through and looked at it. Probably boring. 366 00:20:21,200 --> 00:20:22,320 Speaker 1: Oh, what do you mean? 367 00:20:22,720 --> 00:20:25,359 Speaker 4: No, I don't know. I didn't want to click. 368 00:20:25,440 --> 00:20:27,399 Speaker 1: Why is your voice going so high? Why do you 369 00:20:27,440 --> 00:20:29,879 Speaker 1: sound like a twelve year old soprano all of a sudden? 370 00:20:30,600 --> 00:20:33,160 Speaker 2: No, I just was worried that if I clicked through, 371 00:20:33,320 --> 00:20:35,720 Speaker 2: then of course it'll leave a cookie on my computer, 372 00:20:35,800 --> 00:20:38,720 Speaker 2: and suddenly all the ads will be for things related 373 00:20:38,800 --> 00:20:42,720 Speaker 2: to the porn, and, well, my staff 374 00:20:42,840 --> 00:20:45,960 Speaker 2: use my computer.
I don't want to click on something 375 00:20:46,000 --> 00:20:48,600 Speaker 2: I have. Yeah, it would just be highly inappropriate, I 376 00:20:48,640 --> 00:20:49,360 Speaker 2: think, is all. 377 00:20:49,359 --> 00:20:51,199 Speaker 1: Right. Well, if people want to go and find that, 378 00:20:52,040 --> 00:20:56,119 Speaker 1: Patrick will send you a link, then the federal police 379 00:20:56,119 --> 00:20:58,200 Speaker 1: will be at his door at dinner time. 380 00:20:59,600 --> 00:20:59,879 Speaker 4: Yeah. 381 00:21:00,680 --> 00:21:03,080 Speaker 2: Hey, you know what, you'll find this interesting, because we've 382 00:21:03,119 --> 00:21:10,080 Speaker 2: had some really interesting, challenging conversations about belief systems, and 383 00:21:10,680 --> 00:21:12,920 Speaker 2: this blows my mind. But there are a lot of 384 00:21:12,960 --> 00:21:16,080 Speaker 2: people who believe the Earth is flat, for example, but 385 00:21:16,200 --> 00:21:20,440 Speaker 2: there are also conspiracy theorists who believe, for example, 386 00:21:20,640 --> 00:21:23,560 Speaker 2: that, you know, man never went to the moon, and 387 00:21:23,600 --> 00:21:25,320 Speaker 2: there's a whole lot of theories around that. And if 388 00:21:25,320 --> 00:21:28,480 Speaker 2: you talk to somebody who is a conspiracy theorist, they 389 00:21:28,480 --> 00:21:32,440 Speaker 2: have quite a number of really good rebuttals and arguments 390 00:21:32,760 --> 00:21:39,520 Speaker 2: to support whatever the conspiracy theory is. So interestingly, what 391 00:21:39,560 --> 00:21:44,119 Speaker 2: they've done is they've trained an AI model specifically to 392 00:21:44,160 --> 00:21:48,600 Speaker 2: be able to argue against conspiracy theories. This is a 393 00:21:48,680 --> 00:21:52,800 Speaker 2: team-up between researchers at the Massachusetts Institute of Technology, 394 00:21:52,800 --> 00:21:56,600 Speaker 2: MIT, Cornell, and American University.
So what they 395 00:21:56,640 --> 00:22:01,280 Speaker 2: did was they created a chatbot called DebunkBot, 396 00:22:02,119 --> 00:22:05,720 Speaker 2: and they were able then to sit down, I think 397 00:22:05,840 --> 00:22:09,320 Speaker 2: it was over two thousand people, and they would fire off 398 00:22:09,760 --> 00:22:14,520 Speaker 2: their supporting arguments for a particular conspiracy theory. But because 399 00:22:14,600 --> 00:22:20,360 Speaker 2: the AI was able to challenge those questions so quickly, 400 00:22:21,200 --> 00:22:25,560 Speaker 2: nearly... well, quite a large number of 401 00:22:25,600 --> 00:22:30,800 Speaker 2: them were actually shown to have reconsidered what their conspiracy 402 00:22:30,840 --> 00:22:35,040 Speaker 2: theory was after sitting through three sessions of talking to 403 00:22:35,119 --> 00:22:39,320 Speaker 2: the DebunkBot. Because, you know, when you think about, 404 00:22:40,080 --> 00:22:42,840 Speaker 2: you know, what people's belief systems are. If I have 405 00:22:42,880 --> 00:22:46,320 Speaker 2: an argument with you about something, you're usually an expert 406 00:22:46,359 --> 00:22:49,560 Speaker 2: on that, and so it's very difficult to argue against 407 00:22:49,600 --> 00:22:52,240 Speaker 2: the point that someone is really passionate about and potentially 408 00:22:52,240 --> 00:22:56,879 Speaker 2: has researched a lot, whereas an AI can snappily get 409 00:22:56,920 --> 00:23:02,000 Speaker 2: those results so quickly that it can debunk every single 410 00:23:02,040 --> 00:23:05,800 Speaker 2: one of the arguments that's presented. And even 411 00:23:05,840 --> 00:23:09,200 Speaker 2: after two months, they found a high percentage of those 412 00:23:09,240 --> 00:23:12,439 Speaker 2: people were starting to challenge their belief system based on 413 00:23:12,520 --> 00:23:15,200 Speaker 2: the arguments that they had with this AI chatbot.
414 00:23:16,280 --> 00:23:18,320 Speaker 1: I love that. That's very interesting, and I think it's, 415 00:23:18,320 --> 00:23:20,480 Speaker 1: as you said, a really good thing, or a really 416 00:23:20,520 --> 00:23:22,719 Speaker 1: interesting thing. And you said that things that people are 417 00:23:22,760 --> 00:23:27,520 Speaker 1: passionate about... the problem with that passion is that that's emotion. 418 00:23:28,400 --> 00:23:31,160 Speaker 1: And so as soon as you're emotional about something, you're 419 00:23:31,200 --> 00:23:36,040 Speaker 1: not rational or logical, necessarily, right. And it's the problem 420 00:23:36,040 --> 00:23:38,280 Speaker 1: with us, all of us, and I mean humans, maybe 421 00:23:38,520 --> 00:23:44,160 Speaker 1: even you, Tiff, is that our beliefs are intertwined with our 422 00:23:44,240 --> 00:23:47,760 Speaker 1: sense of identity. And so it's like when you go, 423 00:23:47,960 --> 00:23:51,000 Speaker 1: I am... you're making an "I am" statement. You say, I 424 00:23:51,080 --> 00:23:55,480 Speaker 1: am a vegan. Well, that's an identity. I am a 425 00:23:55,600 --> 00:24:00,560 Speaker 1: Collingwood supporter, I am a Buddhist, I am... these I 426 00:24:00,600 --> 00:24:05,240 Speaker 1: am statements. And then so, regarding your belief and thinking 427 00:24:05,280 --> 00:24:11,200 Speaker 1: about that particular ideology, philosophy, lifestyle, whatever. Yeah, it makes 428 00:24:11,240 --> 00:24:15,400 Speaker 1: you, and I'm the same, it makes you almost unteachable, 429 00:24:15,600 --> 00:24:19,240 Speaker 1: because you believe that you're right. So anything that doesn't 430 00:24:19,440 --> 00:24:23,800 Speaker 1: echo what you think is right... So the fact that they, 431 00:24:24,440 --> 00:24:27,840 Speaker 1: you know, did that and they got the responses, really interesting. 432 00:24:27,880 --> 00:24:32,560 Speaker 1: But also keep in mind that conspiracy theories exist for 433 00:24:32,600 --> 00:24:36,040 Speaker 1: a reason.
Like if you think that the government, for example, 434 00:24:36,119 --> 00:24:40,320 Speaker 1: tells you everything, you're an idiot, because they definitely don't, 435 00:24:41,119 --> 00:24:44,119 Speaker 1: and for good reason, I would imagine, in many instances, 436 00:24:44,840 --> 00:24:48,600 Speaker 1: you know. So the fact is that some conspiracies are 437 00:24:48,680 --> 00:24:56,000 Speaker 1: actually true, a lot of them are not true, you know. Yeah, 438 00:24:56,080 --> 00:24:59,919 Speaker 1: so that's interesting. I wonder how long that would last. 439 00:25:00,240 --> 00:25:04,720 Speaker 2: That kind of turnaround? Well, this MIT professor, he's the co-author, 440 00:25:04,800 --> 00:25:07,240 Speaker 2: a guy by the name of David Rand, and during his 441 00:25:07,520 --> 00:25:10,520 Speaker 2: media conference he was saying that one of the reasons 442 00:25:10,560 --> 00:25:16,240 Speaker 2: that AI was really overwhelmingly good is because it was respectful, 443 00:25:16,680 --> 00:25:20,560 Speaker 2: it was non-confrontational. When you think about it, and 444 00:25:20,600 --> 00:25:23,320 Speaker 2: you brought this up before, the emotional side of it, 445 00:25:23,400 --> 00:25:27,040 Speaker 2: because it's not emotional, it was able to get the 446 00:25:27,119 --> 00:25:32,200 Speaker 2: explanations back in what was, you know, a rapid time, 447 00:25:32,680 --> 00:25:35,840 Speaker 2: and it was able to tackle every single point, but 448 00:25:35,920 --> 00:25:39,560 Speaker 2: it engaged in what they called critical thinking, providing the 449 00:25:39,640 --> 00:25:43,160 Speaker 2: counter-evidence. So there was no question from the other 450 00:25:43,200 --> 00:25:46,320 Speaker 2: person's perspective.
So the conspiracy theorist didn't feel like they 451 00:25:46,320 --> 00:25:49,560 Speaker 2: were getting a red-blooded, passionate other person on the 452 00:25:49,560 --> 00:25:51,639 Speaker 2: other side of the argument, and what they were getting 453 00:25:51,720 --> 00:25:55,119 Speaker 2: was just factual responses rebutting what they believed in. So 454 00:25:56,320 --> 00:25:58,199 Speaker 2: it is an interesting way of kind of doing it. 455 00:25:58,240 --> 00:26:02,120 Speaker 2: And you know, they talked about over two thousand adults. 456 00:26:02,560 --> 00:26:06,480 Speaker 2: They had conspiracy theories like the John F. Kennedy assassination, 457 00:26:06,640 --> 00:26:11,600 Speaker 2: alien abductions, the world being flat. And around, I 458 00:26:11,680 --> 00:26:14,800 Speaker 2: think, around about twenty percent were ready to walk away 459 00:26:14,800 --> 00:26:19,760 Speaker 2: from their beliefs, and another high percentage, when shown this evidence, 460 00:26:20,080 --> 00:26:25,120 Speaker 2: showed a really strong sense of reconsidering what it was 461 00:26:25,080 --> 00:26:26,480 Speaker 4: that they had believed. 462 00:26:26,480 --> 00:26:28,640 Speaker 2: So I just thought it was a really interesting use 463 00:26:28,680 --> 00:26:33,000 Speaker 2: of AI, and a third of the respondents left saying 464 00:26:33,000 --> 00:26:36,800 Speaker 2: they were no longer certain of the belief, and that 465 00:26:37,000 --> 00:26:40,600 Speaker 2: persisted after two months. So in answer to your question, after 466 00:26:40,640 --> 00:26:43,159 Speaker 2: a two month period, they had 467 00:26:43,480 --> 00:26:47,000 Speaker 2: still challenged that belief.
468 00:26:46,960 --> 00:26:49,359 Speaker 1: I think that a caveat needs to be... I think that's 469 00:26:49,359 --> 00:26:51,600 Speaker 1: interesting, and I don't disagree with any of that, or 470 00:26:51,600 --> 00:26:53,919 Speaker 1: I don't think that that's not valuable. I think it is. 471 00:26:54,000 --> 00:26:56,360 Speaker 1: I think there needs to be an asterisk and a caveat, 472 00:26:57,000 --> 00:26:59,399 Speaker 1: and that is that I don't think we should just 473 00:26:59,600 --> 00:27:05,760 Speaker 1: blindly trust artificial intelligence. I don't think that, you know... 474 00:27:05,920 --> 00:27:09,399 Speaker 1: Like, I saw a post the other day and somebody 475 00:27:09,480 --> 00:27:16,080 Speaker 1: had said, tell me... can you... like, somebody was asking 476 00:27:16,119 --> 00:27:20,960 Speaker 1: ChatGPT or one of those AI tools, tell me a 477 00:27:21,040 --> 00:27:24,359 Speaker 1: joke about Jesus, and, you know, 478 00:27:24,400 --> 00:27:27,199 Speaker 1: it told them a joke or a 479 00:27:27,200 --> 00:27:30,560 Speaker 1: bunch of jokes about Jesus. And then it said, now 480 00:27:30,600 --> 00:27:33,720 Speaker 1: tell me some jokes about Muhammad, and it said, I can't. 481 00:27:34,160 --> 00:27:38,400 Speaker 1: I can't tell religious... like, it wouldn't, right. And there 482 00:27:38,440 --> 00:27:43,439 Speaker 1: was another one similar, like, tell me, basically, tell me 483 00:27:43,480 --> 00:27:50,560 Speaker 1: a story that makes, uh, Kamala Harris look, you know, silly, 484 00:27:50,680 --> 00:27:53,440 Speaker 1: and it wouldn't. But then with Donald Trump it 485 00:27:53,560 --> 00:27:58,160 Speaker 1: would. So not everything... it depends on what it's 486 00:27:58,280 --> 00:28:00,320 Speaker 1: trained on.
By the way, I don't think we should 487 00:28:00,400 --> 00:28:05,200 Speaker 1: necessarily be telling jokes about Christianity or Islam or Jesus 488 00:28:05,320 --> 00:28:07,520 Speaker 1: or Muhammad, or... I don't think so. I'm just saying 489 00:28:07,520 --> 00:28:11,880 Speaker 1: it's just, I think it depends on... like, not everything 490 00:28:12,000 --> 00:28:17,320 Speaker 1: that it produces is unequivocal truth. It depends on where 491 00:28:17,440 --> 00:28:20,439 Speaker 1: it's getting the source of its information from. 492 00:28:20,760 --> 00:28:23,720 Speaker 4: So the human bias of the programmers, is what you're saying, 493 00:28:23,840 --> 00:28:24,440 Speaker 4: is also... 494 00:28:24,280 --> 00:28:28,439 Speaker 1: One hundred percent, because everything's programmed. It's like, people 495 00:28:28,440 --> 00:28:33,960 Speaker 1: think science is flawless. It's complete shit. Like, not science itself, 496 00:28:34,000 --> 00:28:37,359 Speaker 1: but that idea, because science is a construct of the 497 00:28:37,440 --> 00:28:41,000 Speaker 1: human mind. We created this idea and we go, well, 498 00:28:41,000 --> 00:28:44,800 Speaker 1: we're going to test this hypothesis with these protocols, then 499 00:28:44,880 --> 00:28:47,840 Speaker 1: we're going to... and this is from someone who's, like, running 500 00:28:47,880 --> 00:28:50,560 Speaker 1: his own research at the moment. And then you get 501 00:28:50,600 --> 00:28:53,640 Speaker 1: the research, then you interpret the data, then you tell 502 00:28:53,680 --> 00:28:56,280 Speaker 1: the world what it means. But the problem is, all 503 00:28:56,320 --> 00:29:00,160 Speaker 1: the way along, there's human interpretation, and every human, 504 00:29:00,240 --> 00:29:06,240 Speaker 1: whether you're a minister or a scientist or an AI creator, 505 00:29:06,960 --> 00:29:10,080 Speaker 1: every single one of those people is still a 506 00:29:10,160 --> 00:29:13,720 Speaker 1: human with bias.
And the same with me. Like, I 507 00:29:13,880 --> 00:29:17,000 Speaker 1: recognize that when I have these conversations with you, my 508 00:29:17,200 --> 00:29:20,360 Speaker 1: own bias. And there are times when you're right and 509 00:29:20,400 --> 00:29:22,600 Speaker 1: I'm wrong, and I don't want you to be right 510 00:29:22,920 --> 00:29:26,120 Speaker 1: because it fucking bothers me. But you're right and I 511 00:29:26,200 --> 00:29:28,600 Speaker 1: am wrong, right. And this is... we need to be 512 00:29:28,640 --> 00:29:32,080 Speaker 1: able to acknowledge the human condition, and that is, I 513 00:29:32,120 --> 00:29:35,320 Speaker 1: am flawed. I look at the world through a window, 514 00:29:35,360 --> 00:29:38,960 Speaker 1: through the Craig window I have. And until you can 515 00:29:39,080 --> 00:29:41,280 Speaker 1: recognize and realize that, you're going to get a lot 516 00:29:41,320 --> 00:29:45,280 Speaker 1: of shit wrong, and so is AI, and so is science. 517 00:29:45,720 --> 00:29:48,640 Speaker 1: Otherwise we're all kidding ourselves, because we think, I know 518 00:29:48,760 --> 00:29:51,800 Speaker 1: the truth, but you don't. But the truth is, I 519 00:29:51,880 --> 00:29:56,520 Speaker 1: think, ironically, we don't really fucking know most things. We 520 00:29:56,680 --> 00:29:57,560 Speaker 1: just think we know. 521 00:29:58,160 --> 00:30:01,400 Speaker 2: And science always has a caveat anyway: that this 522 00:30:01,440 --> 00:30:04,880 Speaker 2: is a theory. So, you know, the theory of relativity, 523 00:30:05,160 --> 00:30:08,120 Speaker 2: the theory of gravity. You know, we can explain, 524 00:30:08,360 --> 00:30:11,120 Speaker 2: you know, our best approximation of what we understand 525 00:30:11,200 --> 00:30:13,880 Speaker 2: gravity to be. But it's still just a theory, you know.
526 00:30:14,000 --> 00:30:17,000 Speaker 2: And when we talk about some of these, you can't 527 00:30:17,040 --> 00:30:20,080 Speaker 2: have hard and fast, one hundred percent proven facts. 528 00:30:20,880 --> 00:30:23,520 Speaker 2: So science is always reticent to come out and say 529 00:30:23,600 --> 00:30:26,880 Speaker 2: this is one hundred percent this. Ah... 530 00:30:26,560 --> 00:30:30,120 Speaker 1: But science doesn't communicate it like that. You're right, 531 00:30:30,320 --> 00:30:33,040 Speaker 1: you're absolutely right, but, you know... So one of the 532 00:30:33,040 --> 00:30:37,600 Speaker 1: most famous fucking frauds of all time was by a 533 00:30:37,640 --> 00:30:41,520 Speaker 1: guy called Ancel Keys, who ran a study. His essential 534 00:30:41,680 --> 00:30:44,240 Speaker 1: theory was... we've spoken about this a couple of times, 535 00:30:44,360 --> 00:30:46,480 Speaker 1: I don't know if we've spoken about it on our show, 536 00:30:46,520 --> 00:30:51,360 Speaker 1: but he did a study that involved twenty different countries, 537 00:30:51,400 --> 00:30:55,480 Speaker 1: and his theory, essentially, his hypothesis, was that low fat 538 00:30:55,560 --> 00:31:01,160 Speaker 1: eating equals low fat people, and he 539 00:31:01,320 --> 00:31:05,120 Speaker 1: basically doctored the results. He cut out thirteen of the 540 00:31:05,160 --> 00:31:08,200 Speaker 1: twenty countries, so it went from a twenty country study 541 00:31:08,240 --> 00:31:12,520 Speaker 1: to a seven country study, because thirteen of the countries' 542 00:31:12,600 --> 00:31:16,480 Speaker 1: data contradicted his hypothesis. So he swept them under the 543 00:31:16,520 --> 00:31:20,200 Speaker 1: academic carpet so people wouldn't know.
He tried to hide that, 544 00:31:20,640 --> 00:31:23,080 Speaker 1: and it went on and on and on. And the 545 00:31:24,000 --> 00:31:27,880 Speaker 1: uncomfortable truth is that since the incidence and occurrence 546 00:31:27,920 --> 00:31:31,920 Speaker 1: of low fat eating, there's been a continual rise in 547 00:31:31,960 --> 00:31:36,719 Speaker 1: obesity in all the countries that embraced that philosophy, you know. 548 00:31:37,440 --> 00:31:42,360 Speaker 1: And so low fat, high carb, highly processed food, which 549 00:31:42,400 --> 00:31:45,320 Speaker 1: is what they kind of, in part, recommended, you know, 550 00:31:45,640 --> 00:31:49,240 Speaker 1: and that was allegedly the best science. So the food pyramid, 551 00:31:49,640 --> 00:31:52,520 Speaker 1: that was the science. And now many, many, 552 00:31:52,560 --> 00:31:55,120 Speaker 1: many people are kind of going, no, you know. So 553 00:31:55,240 --> 00:31:57,440 Speaker 1: in ten years we're going to be thinking 554 00:31:57,520 --> 00:31:59,880 Speaker 1: something else. It's a slippery landscape. 555 00:32:00,480 --> 00:32:03,800 Speaker 4: Mm hmm. Hey, while we're on the topic of AI, 556 00:32:04,160 --> 00:32:06,000 Speaker 4: I had a story I wanted to talk about that 557 00:32:06,080 --> 00:32:08,840 Speaker 4: I really was averse to. I kind of worry about this. 558 00:32:09,240 --> 00:32:12,480 Speaker 2: So a few shows ago, I used AI to copy 559 00:32:12,560 --> 00:32:15,440 Speaker 2: your voice and I did the intro of the show as 560 00:32:15,480 --> 00:32:17,760 Speaker 2: you, saying how good I was. So that should have 561 00:32:17,760 --> 00:32:19,960 Speaker 2: been an indicator right off the bat that it wasn't 562 00:32:20,000 --> 00:32:20,520 Speaker 2: really you. 563 00:32:20,880 --> 00:32:22,080 Speaker 4: But what the... 564 00:32:23,760 --> 00:32:25,920 Speaker 1: hell? What are you... what are you so nasty to 565 00:32:26,000 --> 00:32:30,760 Speaker 1: me for today?
Why are you picking on me today? 566 00:32:30,960 --> 00:32:33,480 Speaker 4: Well, have you ever listened to our past episodes? 567 00:32:35,200 --> 00:32:36,960 Speaker 4: What? I'm just getting... really. 568 00:32:37,560 --> 00:32:40,800 Speaker 1: Well, I haven't thrown you under the bus once today. 569 00:32:40,960 --> 00:32:41,440 Speaker 4: That's right. 570 00:32:41,560 --> 00:32:46,160 Speaker 2: Today. That's why I'm getting in early. 571 00:32:46,840 --> 00:32:49,760 Speaker 1: All right, all right, I'll take it. Come on. 572 00:32:50,160 --> 00:32:54,640 Speaker 2: So Audible, which is a subscription service that I really 573 00:32:54,680 --> 00:32:58,160 Speaker 2: love, because I love listening to audiobooks. They're now 574 00:32:58,280 --> 00:33:01,840 Speaker 2: doing a limited training service with some of their more 575 00:33:01,920 --> 00:33:06,800 Speaker 2: popular narrators, getting them to have their voices trained 576 00:33:06,800 --> 00:33:10,240 Speaker 2: by AI, with their permission, for them to be able 577 00:33:10,280 --> 00:33:14,920 Speaker 2: to read books using their AI voice. Now, they will 578 00:33:14,920 --> 00:33:17,760 Speaker 2: be getting royalties for it, so they're giving permission. And 579 00:33:17,800 --> 00:33:20,560 Speaker 2: I kind of thought to myself, look, I kind of 580 00:33:20,600 --> 00:33:23,600 Speaker 2: feel sad. I love the idea of voice acting, and 581 00:33:24,000 --> 00:33:26,840 Speaker 2: some of the best Audible books I've heard have been 582 00:33:26,880 --> 00:33:29,520 Speaker 2: from such talented people. You know, sometimes you get one 583 00:33:29,560 --> 00:33:33,320 Speaker 2: person who does the entire narration, does character voices, and 584 00:33:33,360 --> 00:33:36,680 Speaker 2: I love that, and they're usually really talented actors. So 585 00:33:36,800 --> 00:33:39,560 Speaker 2: I thought when I first saw this that it feels 586 00:33:39,560 --> 00:33:42,240 Speaker 2: a bit sad.
It feels like, for somebody who does 587 00:33:42,280 --> 00:33:45,280 Speaker 2: do voiceover work... I do on-holds, I do voiceovers 588 00:33:45,520 --> 00:33:47,080 Speaker 2: for corporate video and stuff like that. 589 00:33:47,120 --> 00:33:49,520 Speaker 4: And I thought to myself, this feels a bit sad. 590 00:33:49,760 --> 00:33:52,480 Speaker 2: Would I AI my voice and just, you know, put 591 00:33:52,480 --> 00:33:54,000 Speaker 2: the script in and just do it that way? 592 00:33:54,040 --> 00:33:58,280 Speaker 2: It feels like it's losing humanity. However, one thing that 593 00:33:58,360 --> 00:34:01,160 Speaker 2: just occurred to me, and it happened this week: James 594 00:34:01,240 --> 00:34:04,440 Speaker 2: Earl Jones passed away. Now, if you could... if I 595 00:34:04,920 --> 00:34:07,480 Speaker 2: would beg you to think of another voice that is 596 00:34:07,520 --> 00:34:11,479 Speaker 2: more iconic than James Earl Jones. You know, Darth Vader, Mufasa 597 00:34:11,600 --> 00:34:15,240 Speaker 2: from The Lion King. He has one of the most iconic, 598 00:34:15,400 --> 00:34:20,239 Speaker 2: beautiful voices you've ever heard. Now, if his family consented 599 00:34:20,920 --> 00:34:24,160 Speaker 2: and they trained an AI model on James Earl Jones... 600 00:34:24,440 --> 00:34:27,480 Speaker 2: I'm really warring with this. It's not James Earl Jones, 601 00:34:27,560 --> 00:34:30,359 Speaker 2: but it is James Earl Jones, because it's sampled from him. 602 00:34:30,360 --> 00:34:34,000 Speaker 2: It's using the nuances of his inflections and tonalities and all 603 00:34:33,920 --> 00:34:36,680 Speaker 4: that sort of stuff. But is it really him if 604 00:34:36,680 --> 00:34:37,360 Speaker 4: he narrates? 605 00:34:39,080 --> 00:34:44,160 Speaker 1: That's interesting. It's interesting because, what if you... well, obviously 606 00:34:44,200 --> 00:34:48,200 Speaker 1: it's not. It's just... it's a program using his voice.
607 00:34:48,520 --> 00:34:52,479 Speaker 1: But then... but then maybe it is. Like, you think 608 00:34:52,520 --> 00:34:58,080 Speaker 1: about, if you didn't know that it was AI created, 609 00:34:58,200 --> 00:35:01,400 Speaker 1: but it was using his voice, you would have the 610 00:35:01,480 --> 00:35:05,000 Speaker 1: same experience as if it was really him. It's the 611 00:35:05,080 --> 00:35:06,879 Speaker 1: knowing that fucks it up, isn't it? 612 00:35:07,800 --> 00:35:10,680 Speaker 4: Yeah, that's... that's actually a really accurate point. That's interesting. 613 00:35:11,280 --> 00:35:13,440 Speaker 1: It's the knowing that makes it not. But if you 614 00:35:13,560 --> 00:35:16,600 Speaker 1: didn't know... it's like, remember you were telling... was it 615 00:35:16,640 --> 00:35:18,320 Speaker 1: you telling us about the Banksys? 616 00:35:19,520 --> 00:35:22,320 Speaker 4: No? You know, the art, right? Last episode? 617 00:35:22,400 --> 00:35:25,960 Speaker 1: Right? Well, I know we had that conversation, but I 618 00:35:26,440 --> 00:35:28,440 Speaker 1: couldn't remember if you or I started it. But you think 619 00:35:28,520 --> 00:35:31,920 Speaker 1: about, like, you know it's a Banksy, it's worth two 620 00:35:32,000 --> 00:35:35,200 Speaker 1: hundred thousand dollars. It really is a Banksy, but you 621 00:35:35,320 --> 00:35:38,760 Speaker 1: think it's not: it's worth sixty dollars. It's not about 622 00:35:38,760 --> 00:35:42,200 Speaker 1: the painting, it's about what you think you know. And 623 00:35:42,360 --> 00:35:47,640 Speaker 1: I'm thinking another very awesome voice, maybe, maybe half a 624 00:35:47,760 --> 00:35:51,560 Speaker 1: rung down: Morgan Freeman. He's also right up there, isn't 625 00:35:51,600 --> 00:35:52,800 Speaker 1: he? Like, super voice.
626 00:35:53,200 --> 00:35:56,000 Speaker 4: So you know where my mind went? If 627 00:35:56,080 --> 00:35:59,319 Speaker 4: you were blindfolded and put on a massage table... I 628 00:35:59,360 --> 00:36:02,600 Speaker 4: have very soft hands. Yes, yes. 629 00:36:03,640 --> 00:36:04,920 Speaker 1: You've obviously thought about this. 630 00:36:05,040 --> 00:36:09,120 Speaker 2: A bit. Walked in and gave you a massage and 631 00:36:09,160 --> 00:36:09,920 Speaker 2: then I walked out. 632 00:36:10,800 --> 00:36:14,520 Speaker 1: You've seen Tiff's hands. She looks like a fucking construction worker. 633 00:36:16,120 --> 00:36:19,839 Speaker 1: That's not even a joke. If somebody had to rub 634 00:36:19,880 --> 00:36:22,959 Speaker 1: my back and ass, I would much rather you, because 635 00:36:22,960 --> 00:36:25,839 Speaker 1: it'd be like getting fucking sandblasted. 636 00:36:26,280 --> 00:36:29,000 Speaker 5: It would be good exfoliation. It's good for your skin. 637 00:36:29,880 --> 00:36:32,239 Speaker 1: I don't want a massage from you, Eva, but thank you. 638 00:36:32,760 --> 00:36:35,719 Speaker 1: We're still friends. But I don't want you rubbing my shit. 639 00:36:35,840 --> 00:36:38,279 Speaker 1: But Patrick, maybe with a little bit of Enya 640 00:36:38,280 --> 00:36:42,000 Speaker 1: in the background, and some fucking baby oil. Just 641 00:36:42,120 --> 00:36:45,399 Speaker 1: maybe Fritz licking my toes while Patrick does it. I mean, 642 00:36:46,120 --> 00:36:47,400 Speaker 1: I could think of worse things. 643 00:36:50,600 --> 00:36:53,400 Speaker 4: I asked... I did notice. 644 00:36:53,080 --> 00:36:55,840 Speaker 1: You... you opened the door, and then I come in 645 00:36:55,880 --> 00:36:57,360 Speaker 1: and you're like, oh, how did that happen? 646 00:36:58,960 --> 00:37:01,759 Speaker 4: You didn't just open the door, dude. You crowbarred 647 00:37:01,800 --> 00:37:07,640 Speaker 4: it and then widened it to make it French doors?
648 00:37:09,400 --> 00:37:11,760 Speaker 1: All right, I'm staying quiet, you two keep going. 649 00:37:13,080 --> 00:37:17,200 Speaker 2: I... like, look, we're all mindful of AI fakes and 650 00:37:17,280 --> 00:37:19,280 Speaker 2: fake news and all that sort of stuff. But Google, 651 00:37:19,360 --> 00:37:22,440 Speaker 2: now, of all companies, is planning to roll out a 652 00:37:22,520 --> 00:37:26,360 Speaker 2: new tech that will identify whether a photo was taken 653 00:37:26,440 --> 00:37:29,920 Speaker 2: with a camera, edited by software like Photoshop, or produced 654 00:37:30,600 --> 00:37:34,200 Speaker 2: entirely by AI. I think that's actually pretty good 655 00:37:34,520 --> 00:37:38,040 Speaker 2: when it comes to, you know, being able to debunk 656 00:37:38,760 --> 00:37:41,400 Speaker 2: fake news, but also knowing that if you were a 657 00:37:41,480 --> 00:37:46,280 Speaker 2: client and you're asking for someone to produce some work 658 00:37:46,440 --> 00:37:49,120 Speaker 2: for you and they're trying to pass off a fake 659 00:37:49,280 --> 00:37:52,600 Speaker 2: that looks so accurate, you could then check that. 660 00:37:53,280 --> 00:37:55,200 Speaker 2: So, you know, or if, you know, Tiff's got an 661 00:37:55,200 --> 00:37:59,200 Speaker 2: autographed photo of Craig and she's trying to get a thousand 662 00:37:59,280 --> 00:38:02,319 Speaker 2: bucks on eBay, and I do a scan and I say, 663 00:38:02,360 --> 00:38:05,799 Speaker 2: wait a minute, that's not Craig. That's been AI generated 664 00:38:05,840 --> 00:38:08,200 Speaker 2: and the signature is too pixelated. And I'm not going 665 00:38:08,200 --> 00:38:09,000 Speaker 2: to pay them anything for that. 666 00:38:10,520 --> 00:38:13,760 Speaker 1: Yeah, yeah, because that would... that'd be a big seller. 667 00:38:15,920 --> 00:38:19,759 Speaker 1: That'd be huge. Don't you think, though, that, moving forward...
Mate, 668 00:38:19,760 --> 00:38:21,920 Speaker 1: this is going to be an ongoing thing, like just 669 00:38:21,960 --> 00:38:24,600 Speaker 1: trying to figure out what's real. I just thought of something. 670 00:38:24,600 --> 00:38:25,319 Speaker 1: You know what we could do? 671 00:38:25,719 --> 00:38:28,759 Speaker 2: We could do autographed photos of the three of us, 672 00:38:29,200 --> 00:38:31,919 Speaker 2: post them onto eBay and then put it out to see 673 00:38:31,960 --> 00:38:32,760 Speaker 2: who gets the best 674 00:38:32,800 --> 00:38:32,920 Speaker 4: bid. 675 00:38:35,440 --> 00:38:37,640 Speaker 1: I feel like none of them are going to sell that, would you too much? 676 00:38:37,719 --> 00:38:38,600 Speaker 1: 677 00:38:39,160 --> 00:38:41,640 Speaker 4: No, I'll just put Fritz in there. Fritz will sell. 678 00:38:42,760 --> 00:38:45,040 Speaker 1: Tiff was telling me that she met someone the other day. 679 00:38:45,080 --> 00:38:46,920 Speaker 1: I'm not throwing her under the bus, because this is 680 00:38:46,960 --> 00:38:50,160 Speaker 1: lovely: who was nervous to meet her, because, because 681 00:38:50,200 --> 00:38:51,200 Speaker 1: Tiff's famous. 682 00:38:51,520 --> 00:38:52,680 Speaker 4: Yeah, I can understand. 683 00:38:52,760 --> 00:38:55,520 Speaker 1: How did you... how did you feel being famous, Tiff? 684 00:38:56,719 --> 00:38:58,239 Speaker 5: It's so funny, isn't it? 685 00:38:59,600 --> 00:39:01,279 Speaker 1: It's... yep. 686 00:39:03,640 --> 00:39:05,759 Speaker 3: I was like, don't put me on a pedestal, because 687 00:39:05,760 --> 00:39:07,840 Speaker 3: I'll fall off and break my leg. I'm not... I 688 00:39:07,880 --> 00:39:08,879 Speaker 3: don't belong up there. 689 00:39:10,080 --> 00:39:13,200 Speaker 1: Yeah, we know what happens when I stand on pedestals, 690 00:39:12,600 --> 00:39:13,000 Speaker 4: don't we? 691 00:39:16,719 --> 00:39:17,319 Speaker 5: Emergency?
692 00:39:18,360 --> 00:39:21,920 Speaker 1: That replied, Big Gone, buddy. 693 00:39:22,800 --> 00:39:25,880 Speaker 4: So I'm going to do one more AI story because 694 00:39:25,880 --> 00:39:28,760 Speaker 4: this is a particularly good one, and it's also about 695 00:39:28,800 --> 00:39:31,480 Speaker 4: a painter, one of whose pieces of art 696 00:39:31,480 --> 00:39:34,280 Speaker 4: I love, because it's also associated with a movie 697 00:39:34,280 --> 00:39:38,160 Speaker 4: that I love called The Woman in Gold, and 698 00:39:38,200 --> 00:39:42,719 Speaker 4: Gustav Klimt. So Gustav Klimt, Austrian painter. A lot 699 00:39:42,760 --> 00:39:46,799 Speaker 4: of his paintings were lost. They were either stolen, or 700 00:39:46,840 --> 00:39:48,960 Speaker 4: there's a whole group that were actually 701 00:39:48,960 --> 00:39:53,080 Speaker 4: burnt in a fire. So there was a massive fire 702 00:39:53,320 --> 00:39:57,880 Speaker 4: when the Germans left Vienna. They burnt a lot of buildings. 703 00:39:57,960 --> 00:40:02,240 Speaker 2: And what they've done is they've taken black and white 704 00:40:02,239 --> 00:40:09,120 Speaker 2: photographs of those paintings, then trained an AI model on 705 00:40:09,239 --> 00:40:13,640 Speaker 2: Gustav Klimt's style of painting, and then they've been able 706 00:40:13,680 --> 00:40:17,880 Speaker 2: to recolorize the imagery to give us some better understanding 707 00:40:17,920 --> 00:40:22,200 Speaker 2: of what the original painting would have looked like. So, 708 00:40:22,239 --> 00:40:24,360 Speaker 2: are you familiar with the movie The Woman in Gold? 709 00:40:24,760 --> 00:40:28,240 Speaker 2: Helen Mirren was in it. It's a phenomenal story.
Basically, 710 00:40:28,239 --> 00:40:31,360 Speaker 2: it's a true story about the restitution of this painting 711 00:40:31,400 --> 00:40:33,719 Speaker 2: by Klimt called The Woman in Gold that hung in 712 00:40:33,760 --> 00:40:36,520 Speaker 2: a gallery in Vienna for years. It was kind of 713 00:40:36,520 --> 00:40:40,920 Speaker 2: the Mona Lisa of Vienna, of Austria. And the original family, 714 00:40:41,000 --> 00:40:43,560 Speaker 2: so there's this woman, this woman who lived in New York. 715 00:40:43,880 --> 00:40:47,120 Speaker 2: It was her auntie. The painting was of her auntie 716 00:40:47,200 --> 00:40:49,840 Speaker 2: and it belonged to her family, and it was stolen 717 00:40:50,280 --> 00:40:54,120 Speaker 2: by the Nazis, and then eventually it was, you know, 718 00:40:54,239 --> 00:40:59,160 Speaker 2: it was then put on display in Austria, and she 719 00:40:59,400 --> 00:41:04,120 Speaker 2: fought to get the painting back, and it happened. It 720 00:41:04,160 --> 00:41:06,120 Speaker 2: was amazing. It's phenomenal. If you haven't seen the film, 721 00:41:06,160 --> 00:41:09,400 Speaker 2: it's great. I'd certainly encourage you to see it. But 722 00:41:09,960 --> 00:41:12,880 Speaker 2: I just think this is an amazing way to restore 723 00:41:13,000 --> 00:41:17,759 Speaker 2: something that was tragically lost to fire, using this, 724 00:41:17,840 --> 00:41:21,120 Speaker 2: you know, this really great AI tech to not 725 00:41:21,440 --> 00:41:24,560 Speaker 2: just colorize, but to restore. So we're not talking Pornhub 726 00:41:24,600 --> 00:41:29,359 Speaker 2: here. We're talking Gustav Klimt. Right, the whole use 727 00:41:29,360 --> 00:41:32,359 Speaker 2: your powers for good or evil thing in this game. 728 00:41:32,680 --> 00:41:35,760 Speaker 4: I'm thinking this is a good version of our first story.
729 00:41:38,040 --> 00:41:40,239 Speaker 1: Well, I definitely think that's a better application, but I 730 00:41:40,280 --> 00:41:42,839 Speaker 1: wonder, then, even with that. Like, when you get, let's 731 00:41:42,880 --> 00:41:49,000 Speaker 1: say you get the kind of restored, artificially AI-enhanced version, 732 00:41:49,480 --> 00:41:52,120 Speaker 1: you know, you still know that that's not the original. 733 00:41:52,280 --> 00:41:55,200 Speaker 1: I wonder, you know, so much of this stuff is 734 00:41:55,239 --> 00:41:58,640 Speaker 1: about, like, what you think about it. 735 00:41:58,920 --> 00:41:59,920 Speaker 4: Like. But I've got a 736 00:42:00,120 --> 00:42:05,640 Speaker 2: reproduction of the Gustav Klimt painting in my home. It's 737 00:42:05,680 --> 00:42:09,160 Speaker 2: a really large format, it's almost a meter high, and 738 00:42:09,200 --> 00:42:11,680 Speaker 2: it looks stunning. It's nowhere near what the original looks like, 739 00:42:11,880 --> 00:42:14,600 Speaker 2: but it's nice because I love the painting and it's 740 00:42:14,640 --> 00:42:17,279 Speaker 2: great to have a print of it. When I see 741 00:42:17,320 --> 00:42:19,440 Speaker 2: it, it makes me think about it. I couldn't, you know... 742 00:42:19,600 --> 00:42:24,360 Speaker 2: I think of, you know, some of the master painters. 743 00:42:24,360 --> 00:42:25,880 Speaker 2: None of us are ever going to be able to 744 00:42:26,160 --> 00:42:28,799 Speaker 2: afford a Monet or, you know, anything like that. But 745 00:42:30,040 --> 00:42:33,000 Speaker 2: you know, at least you can appreciate it by having 746 00:42:33,040 --> 00:42:35,439 Speaker 2: a reproduction of it in some way, shape or form. 747 00:42:36,440 --> 00:42:38,480 Speaker 1: We've got a friend, a friend of The You 748 00:42:38,560 --> 00:42:40,680 Speaker 1: Project, who's been on a couple of times, my friend 749 00:42:40,719 --> 00:42:44,879 Speaker 1: Dylan. Dylan Keyes Art. You follow him, don't you, Tiff?
750 00:42:45,080 --> 00:42:49,160 Speaker 3: Yeah, amazing. And I think... 751 00:42:50,840 --> 00:42:53,920 Speaker 1: I think he's going to be famous one day. He's 752 00:42:53,920 --> 00:42:56,720 Speaker 1: a big deal on Planet Craig. I love him. Dylan 753 00:42:56,760 --> 00:43:00,120 Speaker 1: Keyes Art, go and follow him, and he does the 754 00:43:00,120 --> 00:43:07,880 Speaker 1: most incredible charcoal stuff. But what's interesting is, like, he 755 00:43:07,960 --> 00:43:10,160 Speaker 1: does, you know, a lot of, a lot of animals 756 00:43:10,160 --> 00:43:11,760 Speaker 1: and stuff, and it's so good. 757 00:43:12,239 --> 00:43:13,440 Speaker 4: Wow, I'm looking at it now. 758 00:43:13,760 --> 00:43:17,880 Speaker 1: Yeah, you can't believe that this motherfucker drew that. I'm like, 759 00:43:17,960 --> 00:43:23,200 Speaker 1: how did you, with charcoal? But what's interesting is, 760 00:43:23,239 --> 00:43:26,120 Speaker 1: you can buy, by the way, if you buy an original, 761 00:43:26,160 --> 00:43:28,080 Speaker 1: I think you can still buy an original for a grand 762 00:43:28,160 --> 00:43:30,360 Speaker 1: or fifteen hundred or, I don't know, whatever. But 763 00:43:30,440 --> 00:43:33,920 Speaker 1: when I first met him, like, he was doing all 764 00:43:33,920 --> 00:43:37,040 Speaker 1: this incredible stuff and he only had two or 765 00:43:37,080 --> 00:43:39,719 Speaker 1: three hundred followers, and I saw his stuff and I 766 00:43:39,760 --> 00:43:42,720 Speaker 1: reached out and went, hey, your shit's awesome, you should 767 00:43:42,719 --> 00:43:45,280 Speaker 1: be famous, or something like that. And then we connected 768 00:43:45,320 --> 00:43:47,880 Speaker 1: and we spoke, and then he got a little bit 769 00:43:47,880 --> 00:43:50,200 Speaker 1: more of a following.
So I think he's up to 770 00:43:50,280 --> 00:43:52,920 Speaker 1: like three or four thousand followers or something on Insta 771 00:43:52,960 --> 00:43:55,600 Speaker 1: or something. But my long-winded point is you can 772 00:43:55,640 --> 00:43:58,120 Speaker 1: buy one of his originals for, still, what I think 773 00:43:58,239 --> 00:44:02,399 Speaker 1: is really cheap. But he does these incredible prints which 774 00:44:02,400 --> 00:44:08,120 Speaker 1: are pretty much indistinguishable from the original, and that's like 775 00:44:08,160 --> 00:44:11,759 Speaker 1: one hundred bucks. And I think it's so funny, like, 776 00:44:12,200 --> 00:44:14,799 Speaker 1: you would, somebody would have to tell you that this 777 00:44:15,000 --> 00:44:18,520 Speaker 1: is a print, that's how good they are. He sent 778 00:44:18,560 --> 00:44:20,560 Speaker 1: me one, and I didn't know if it was a 779 00:44:20,680 --> 00:44:23,959 Speaker 1: print or, like, I didn't know, and I'm like, wow, 780 00:44:23,960 --> 00:44:27,600 Speaker 1: this is fucking amazing. But yeah, then you go, well, 781 00:44:28,520 --> 00:44:31,239 Speaker 1: you could get the, like, some things which are worth 782 00:44:31,239 --> 00:44:34,000 Speaker 1: a million bucks, and then you could get a phenomenal 783 00:44:34,080 --> 00:44:36,800 Speaker 1: print to look at from a few meters away, 784 00:44:37,960 --> 00:44:39,640 Speaker 1: you know. I mean, I'm sure a lot of people 785 00:44:39,640 --> 00:44:41,839 Speaker 1: would know, but for the average punter, like, I look 786 00:44:41,840 --> 00:44:44,399 Speaker 1: at stuff and go, wow. I wonder if you get 787 00:44:44,400 --> 00:44:47,480 Speaker 1: the same joy. I wonder if it does the same 788 00:44:47,560 --> 00:44:50,840 Speaker 1: thing to your emotions and your body if it's a 789 00:44:50,880 --> 00:44:54,600 Speaker 1: print versus something real, if perceptually it's identical.
790 00:44:55,160 --> 00:44:59,319 Speaker 2: My cousin is an artist who does the cartoons for 791 00:44:59,360 --> 00:45:00,440 Speaker 2: the Times of Malta. 792 00:45:00,560 --> 00:45:04,440 Speaker 4: So he lives in Europe and he does their weekly cartoon. 793 00:45:04,880 --> 00:45:05,440 Speaker 4: And he's got a 794 00:45:05,440 --> 00:45:08,560 Speaker 2: great portfolio, and I've got about five of his pieces 795 00:45:08,600 --> 00:45:12,040 Speaker 2: of work now. Actually, I've only ever bought one original, 796 00:45:12,080 --> 00:45:13,880 Speaker 2: and that was to give away as a gift. But 797 00:45:13,920 --> 00:45:16,879 Speaker 2: the other stuff... My favorite print of his is called 798 00:45:16,960 --> 00:45:20,560 Speaker 2: Imbéciles Sans Frontières, so it's Imbeciles Without Borders, and it's 799 00:45:20,560 --> 00:45:22,640 Speaker 2: got all the world leaders and a big banner over 800 00:45:22,640 --> 00:45:24,920 Speaker 2: the top, and one of the guys standing behind the 801 00:45:24,920 --> 00:45:27,880 Speaker 2: other guys doing, you know, the rabbit ears behind 802 00:45:27,920 --> 00:45:28,320 Speaker 2: him and all that. 803 00:45:28,400 --> 00:45:32,000 Speaker 4: It's cute. It's a really beautiful framed print, but 804 00:45:32,080 --> 00:45:34,520 Speaker 4: it is just a print. And I think the original 805 00:45:34,560 --> 00:45:37,000 Speaker 4: had already sold anyway, so I couldn't have bought the original. 806 00:45:37,239 --> 00:45:39,760 Speaker 4: But it still gives me that same reaction, and everybody 807 00:45:39,800 --> 00:45:43,080 Speaker 4: who comes to my house who sees it really loves it. 808 00:45:43,080 --> 00:45:46,840 Speaker 2: It's just so wonderfully drawn and it's really detailed. You know, 809 00:45:46,880 --> 00:45:48,239 Speaker 2: when I was looking at the stuff that you were just 810 00:45:48,320 --> 00:45:51,200 Speaker 2: talking about.
Now, people obviously can't see it, but 811 00:45:51,800 --> 00:45:55,520 Speaker 2: they are indistinguishable from a black and white photo. They're 812 00:45:55,560 --> 00:45:59,240 Speaker 2: superb. And so, people who can, you know, meticulously create 813 00:45:59,360 --> 00:46:03,360 Speaker 2: beautiful art, I can say it's amazing, but sometimes you 814 00:46:03,360 --> 00:46:05,800 Speaker 2: can't afford the original, but it's nice to have a reproduction. 815 00:46:05,840 --> 00:46:08,279 Speaker 2: I think it still gives you the warm fuzzies, doesn't 816 00:46:08,320 --> 00:46:09,040 Speaker 2: it? What do you reckon, 817 00:46:09,080 --> 00:46:09,319 Speaker 4: Tiff? 818 00:46:09,520 --> 00:46:09,960 Speaker 1: Yeah? 819 00:46:10,000 --> 00:46:10,239 Speaker 4: Well? 820 00:46:10,719 --> 00:46:11,960 Speaker 1: Oh, sorry. Go, Tiff. 821 00:46:12,320 --> 00:46:14,319 Speaker 3: Yeah, no, I think it's interesting. It's all about the 822 00:46:14,360 --> 00:46:17,680 Speaker 3: context of what someone appreciates, because I was just thinking 823 00:46:17,760 --> 00:46:22,239 Speaker 3: in my head, if someone doesn't appreciate the story of 824 00:46:22,280 --> 00:46:27,320 Speaker 3: someone drawing it, what's the difference between getting a copy 825 00:46:27,360 --> 00:46:31,719 Speaker 3: of the print and his reference photo? Because they 826 00:46:31,719 --> 00:46:36,080 Speaker 3: look exactly the same. But the idea is you appreciate 827 00:46:36,200 --> 00:46:39,960 Speaker 3: someone's gift, and you appreciate telling the people that come 828 00:46:40,000 --> 00:46:42,040 Speaker 3: to your house and go, oh, that's a nice print, 829 00:46:42,200 --> 00:46:44,840 Speaker 3: and you go, oh, that's drawn by Dylan Keyes. Have a 830 00:46:44,840 --> 00:46:45,839 Speaker 3: look what this guy does. 831 00:46:46,600 --> 00:46:47,040 Speaker 4: Hmmm. 832 00:46:48,760 --> 00:46:53,680 Speaker 1: I saw this guy drawing.
He was a street artist, 833 00:46:54,160 --> 00:46:58,640 Speaker 1: it was a video, and he was drawing with his feet 834 00:46:58,920 --> 00:47:03,040 Speaker 1: because he's got no arms. And I mean, I'm not saying, oh, 835 00:47:03,120 --> 00:47:05,479 Speaker 1: geez, his art was good for a bloke drawing with his foot. 836 00:47:06,120 --> 00:47:10,799 Speaker 1: His art was just phenomenal, I mean amazing. And then 837 00:47:10,840 --> 00:47:13,880 Speaker 1: on top of that, I'm like, this bloke's doing this 838 00:47:13,920 --> 00:47:17,960 Speaker 1: with his foot. I could have all the bloody training 839 00:47:18,080 --> 00:47:23,640 Speaker 1: and, like, education and resources, and I couldn't do anything 840 00:47:24,120 --> 00:47:27,719 Speaker 1: but one tenth of that. And then, so on one 841 00:47:28,080 --> 00:47:30,000 Speaker 1: side, the art on its own is amazing. And 842 00:47:30,040 --> 00:47:33,719 Speaker 1: then you think how the art was created, and to me 843 00:47:33,880 --> 00:47:35,920 Speaker 1: that times it by one hundred, and I go, I 844 00:47:35,960 --> 00:47:38,600 Speaker 1: would love some of that guy's art, you know, just 845 00:47:38,680 --> 00:47:41,480 Speaker 1: because of the story that accompanies it as well. 846 00:47:41,560 --> 00:47:44,920 Speaker 2: You know, every year my mum used to subscribe to 847 00:47:45,520 --> 00:47:50,840 Speaker 2: a Christmas card bundle from artists without hands, and I 848 00:47:51,080 --> 00:47:53,880 Speaker 2: vividly remember that she would get these gorgeous 849 00:47:53,960 --> 00:47:57,080 Speaker 2: Christmas cards she would send all our family, and it 850 00:47:57,120 --> 00:48:00,399 Speaker 2: was specifically from artists who didn't paint traditionally with their hands, 851 00:48:00,440 --> 00:48:01,360 Speaker 2: some with their mouths. 852 00:48:01,640 --> 00:48:04,480 Speaker 4: That, that blows my mind. That's... yeah, isn't that great?
853 00:48:04,760 --> 00:48:10,239 Speaker 2: The talent knows no bounds, and disability isn't always a 854 00:48:10,320 --> 00:48:11,640 Speaker 2: hindrance to creativity. 855 00:48:12,480 --> 00:48:16,320 Speaker 1: One hundred percent, one hundred percent. Keep going, champ. 856 00:48:16,520 --> 00:48:18,200 Speaker 2: Hey, this is, I thought you might find this an 857 00:48:18,239 --> 00:48:21,840 Speaker 2: interesting story, because I know you like tech, but you 858 00:48:21,960 --> 00:48:25,360 Speaker 2: also like old tech. You like an analogue display on 859 00:48:25,400 --> 00:48:27,839 Speaker 2: your motorbike, not a digital display, would I be right? 860 00:48:28,760 --> 00:48:29,800 Speaker 4: Correct, yeah. 861 00:48:29,840 --> 00:48:32,560 Speaker 2: So in the United States there's a massive push at 862 00:48:32,600 --> 00:48:35,040 Speaker 2: the moment, and it's actually gone to the House Committee. 863 00:48:35,320 --> 00:48:40,680 Speaker 2: They're talking about forcing manufacturers to put AM radios 864 00:48:40,239 --> 00:48:41,120 Speaker 4: back in cars. 865 00:48:41,800 --> 00:48:46,040 Speaker 2: Wow, because new cars don't have AM radios. They 866 00:48:46,040 --> 00:48:50,799 Speaker 2: only have digital services and streaming services. And the thing they're 867 00:48:50,840 --> 00:48:54,400 Speaker 2: saying is you've got to put AM radios back in cars, 868 00:48:54,440 --> 00:48:56,720 Speaker 2: and you're not allowed to make it a subscription service, 869 00:48:57,040 --> 00:48:59,960 Speaker 2: because of course some manufacturers, and I know Elon Musk's 870 00:49:00,120 --> 00:49:02,680 Speaker 2: been a little bit criticized for this.
If you buy 871 00:49:02,719 --> 00:49:05,279 Speaker 2: one of his cars, you can then upgrade just by 872 00:49:05,320 --> 00:49:09,919 Speaker 2: buying a subscription, so the physical vehicle doesn't change, you're 873 00:49:09,920 --> 00:49:12,759 Speaker 2: just subscribing to extra features, and so that's how they're 874 00:49:12,840 --> 00:49:14,160 Speaker 2: upselling people these days. 875 00:49:14,200 --> 00:49:15,919 Speaker 4: They're actually not just selling the car. 876 00:49:16,239 --> 00:49:18,440 Speaker 2: They're saying, yes, you can buy the axe, but if 877 00:49:18,480 --> 00:49:20,320 Speaker 2: you want to have the sharpened version, you've got to 878 00:49:20,360 --> 00:49:24,920 Speaker 2: buy a subscription. So the reasoning, or the rationale, behind 879 00:49:24,960 --> 00:49:29,240 Speaker 2: the AM radio, though, is that AM travels a lot 880 00:49:29,280 --> 00:49:33,319 Speaker 2: further than FM or digital. I mean, we're talking much, much, 881 00:49:33,440 --> 00:49:37,920 Speaker 2: much greater range, because it's about the way that the 882 00:49:39,040 --> 00:49:43,719 Speaker 2: wave is generated. So that's called vertical polarization and horizontal 883 00:49:43,760 --> 00:49:47,080 Speaker 2: polarization, and that's how the, not the sound waves, but 884 00:49:47,120 --> 00:49:49,640 Speaker 2: the radio waves, are generated. So this is interesting, though, 885 00:49:49,719 --> 00:49:53,120 Speaker 2: that they're trying to put a lot of pressure on 886 00:49:53,200 --> 00:49:55,680 Speaker 2: car makers to have the AM range, so that if 887 00:49:55,680 --> 00:50:00,879 Speaker 2: there's a natural disaster or an emergency, they broadcast using AM. 888 00:50:01,239 --> 00:50:04,359 Speaker 2: So, for example, here in Australia, if there's bushfires, if 889 00:50:04,360 --> 00:50:07,360 Speaker 2: you're tuning into an FM signal, you don't have the 890 00:50:07,520 --> 00:50:09,759 Speaker 2: range that you will on an AM signal.
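The scale of that range difference is easy to see from the wavelengths involved: AM broadcast sits roughly between 0.5 and 1.7 MHz, while FM sits around 88 to 108 MHz, so AM waves are hundreds of meters long and follow the curve of the ground well past the horizon. A minimal sketch of the comparison, using illustrative example frequencies rather than any station mentioned on the show:

```python
# Compare the wavelengths of a typical AM and a typical FM broadcast signal.
# The frequencies below are illustrative examples only.

C = 299_792_458  # speed of light in metres per second

def wavelength_m(frequency_hz: float) -> float:
    """Wavelength in metres for a radio signal of the given frequency."""
    return C / frequency_hz

am = wavelength_m(774e3)    # an AM station around 774 kHz
fm = wavelength_m(101.9e6)  # an FM station around 101.9 MHz

print(f"AM ~774 kHz   -> {am:6.1f} m wavelength")
print(f"FM ~101.9 MHz -> {fm:6.1f} m wavelength")
```

Those long AM wavelengths are what allow ground-wave propagation over hundreds of kilometres, which is the emergency-broadcast argument in a nutshell.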
And it 891 00:50:09,800 --> 00:50:11,319 Speaker 2: makes a lot of sense when you start to think 892 00:50:11,320 --> 00:50:14,840 Speaker 2: about it in those terms, because recently, this week, 893 00:50:15,160 --> 00:50:19,280 Speaker 2: we had no cell phone coverage. We had no mobile phones, 894 00:50:19,600 --> 00:50:21,960 Speaker 2: so the second I stepped out of my house, my 895 00:50:22,080 --> 00:50:25,200 Speaker 2: phone was useless. I couldn't get Internet and I couldn't 896 00:50:25,239 --> 00:50:27,600 Speaker 2: make or receive phone calls. So the only way you 897 00:50:27,600 --> 00:50:29,400 Speaker 2: could make and receive calls was if you were in 898 00:50:29,440 --> 00:50:31,640 Speaker 2: a Wi-Fi hotspot, so that 899 00:50:31,719 --> 00:50:33,520 Speaker 4: was a real challenge for a lot of people in town. 900 00:50:33,560 --> 00:50:35,960 Speaker 2: They were doing upgrades that were getting rid of 3G, 901 00:50:36,040 --> 00:50:38,719 Speaker 2: going to the 5G towers, but it meant 902 00:50:38,719 --> 00:50:42,080 Speaker 2: that anybody who was using the Telstra towers was effectively 903 00:50:42,120 --> 00:50:44,680 Speaker 2: without a device. So if there's a bushfire and you're 904 00:50:44,719 --> 00:50:47,360 Speaker 2: trying to tune in to find out the latest information, 905 00:50:47,440 --> 00:50:50,680 Speaker 2: you may not have phone coverage, you may not have data, 906 00:50:51,040 --> 00:50:53,680 Speaker 2: in which case AM may be the only way to 907 00:50:53,719 --> 00:50:56,160 Speaker 2: get a really distant signal and be able to get 908 00:50:56,160 --> 00:50:58,280 Speaker 2: an update on an emergency situation. 909 00:50:59,600 --> 00:51:02,399 Speaker 1: I love a good AM station, mate. I think they're 910 00:51:02,440 --> 00:51:04,879 Speaker 1: going the way of the dodo.
But I tune into 911 00:51:05,080 --> 00:51:09,880 Speaker 1: SEN in the mornings, Melbourne's home of sport, and Garry 912 00:51:09,920 --> 00:51:13,320 Speaker 1: and Tim bang on. Yeah, that's interesting. Hey, can I share 913 00:51:13,360 --> 00:51:15,319 Speaker 1: some tech with you that you might not know? 914 00:51:15,880 --> 00:51:16,680 Speaker 4: You might know it. 915 00:51:17,280 --> 00:51:21,480 Speaker 1: Yeah, right. So this is hot off the presses. Elon 916 00:51:21,640 --> 00:51:28,440 Speaker 1: Musk's Neuralink device Blindsight gets FDA 917 00:51:28,560 --> 00:51:33,319 Speaker 1: Breakthrough Device designation. Blindsight is an experimental implant 918 00:51:33,400 --> 00:51:36,879 Speaker 1: aimed at restoring vision in people who have lost sight 919 00:51:37,080 --> 00:51:42,240 Speaker 1: in both eyes, even for people who were born blind. 920 00:51:42,800 --> 00:51:46,319 Speaker 1: Elon Musk's brain-computer interface implant startup Neuralink has 921 00:51:46,400 --> 00:51:50,480 Speaker 1: received FDA Breakthrough Device designation for Blindsight, an implant 922 00:51:51,000 --> 00:51:57,319 Speaker 1: that restores vision. So it implants a microelectrode array 923 00:51:57,400 --> 00:52:02,000 Speaker 1: into the visual cortex of a person's brain, then activates neurons, 924 00:52:02,040 --> 00:52:05,160 Speaker 1: which provides the individual with an image. I read about 925 00:52:05,160 --> 00:52:09,160 Speaker 1: it yesterday and it said, like, right now it's giving 926 00:52:09,200 --> 00:52:13,200 Speaker 1: people like a grainy view of the world, but eventually 927 00:52:13,440 --> 00:52:16,640 Speaker 1: it's going to be way better than twenty-twenty human 928 00:52:16,719 --> 00:52:20,480 Speaker 1: vision and give you the capacity to see infrared as well. 929 00:52:22,800 --> 00:52:24,080 Speaker 4: Wow, seeing in the dark.
930 00:52:24,760 --> 00:52:28,759 Speaker 2: A good friend of mine, he went progressively blind in 931 00:52:28,800 --> 00:52:32,080 Speaker 2: his late teenage years. He's a mad car fanatic and 932 00:52:32,120 --> 00:52:36,359 Speaker 2: he actually has a couple of collectible cars. He still 933 00:52:36,400 --> 00:52:38,760 Speaker 2: is a car enthusiast, so even though he's now 934 00:52:38,840 --> 00:52:43,399 Speaker 2: fully blind, he's very passionate about cars. And for him 935 00:52:43,440 --> 00:52:46,160 Speaker 2: to have suddenly lost the ability to drive in his 936 00:52:46,360 --> 00:52:48,719 Speaker 2: late teens, well, early twenties, so he still drove a 937 00:52:48,719 --> 00:52:52,279 Speaker 2: little bit and then actually went progressively blind, this would 938 00:52:52,280 --> 00:52:55,560 Speaker 2: be amazing for him, to be able to get out 939 00:52:55,600 --> 00:52:57,759 Speaker 2: on the road again or be able to appreciate his 940 00:52:57,840 --> 00:53:00,879 Speaker 2: collection of cars. I think for people who are 941 00:53:01,320 --> 00:53:06,279 Speaker 2: vision impaired from birth, they see the world in such 942 00:53:06,320 --> 00:53:10,120 Speaker 2: a different way, in terms of augmented senses that have 943 00:53:10,320 --> 00:53:13,520 Speaker 2: made up for their lack of vision. Whereas somebody who has 944 00:53:13,680 --> 00:53:17,400 Speaker 2: lost vision because of older age or potentially because of 945 00:53:17,440 --> 00:53:20,280 Speaker 2: an accident... you can see there'd be a massive, a massive 946 00:53:20,440 --> 00:53:23,000 Speaker 2: jump to try to do this.
I wonder whether somebody 947 00:53:23,040 --> 00:53:25,200 Speaker 2: who's been vision impaired... and I've got some friends who 948 00:53:25,440 --> 00:53:29,160 Speaker 2: have been vision impaired from birth as well, and I'd 949 00:53:29,200 --> 00:53:31,040 Speaker 2: love to ask them whether or not they would want 950 00:53:31,080 --> 00:53:31,879 Speaker 2: to get sight. 951 00:53:32,200 --> 00:53:34,279 Speaker 4: It's a really interesting one, isn't it? 952 00:53:35,000 --> 00:53:37,480 Speaker 1: I never thought of that. That is a great question. 953 00:53:38,000 --> 00:53:41,600 Speaker 1: That is, because I just thought, oh, of course, but 954 00:53:41,719 --> 00:53:43,160 Speaker 1: then maybe 955 00:53:42,920 --> 00:53:46,319 Speaker 4: not. That is biased, because we can see. One 956 00:53:46,280 --> 00:53:50,560 Speaker 1: hundred percent. I saw this documentary on this guy. You know, 957 00:53:51,200 --> 00:53:56,280 Speaker 1: obviously, bats are blind. Patrick, tell people, I'm sure 958 00:53:56,320 --> 00:54:00,280 Speaker 1: you know how they navigate the world. Sonar? Yeah, 959 00:54:00,320 --> 00:54:05,319 Speaker 1: echolocation, sonar? Yeah, yeah, yeah. So, so this guy, 960 00:54:05,560 --> 00:54:08,719 Speaker 1: this blind dude, taught himself echolocation. 961 00:54:09,360 --> 00:54:11,080 Speaker 4: Oh yeah, yeah. 962 00:54:11,120 --> 00:54:17,240 Speaker 1: So he makes these, like, clicking noises, and depending 963 00:54:17,280 --> 00:54:20,600 Speaker 1: on how quickly it comes back to him, he 964 00:54:21,600 --> 00:54:26,120 Speaker 1: could navigate the world with his own version of sonar, 965 00:54:26,400 --> 00:54:30,160 Speaker 1: I guess, or echolocation, that he developed, and he 966 00:54:30,160 --> 00:54:33,239 Speaker 1: would know when things were in front of him that, 967 00:54:33,840 --> 00:54:36,640 Speaker 1: like, nobody had told him.
But yeah, because he developed 968 00:54:36,640 --> 00:54:40,560 Speaker 1: this unbelievable, almost like extra sense. 969 00:54:40,920 --> 00:54:41,520 Speaker 4: That's amazing. 970 00:54:41,600 --> 00:54:43,960 Speaker 2: Can I just debunk one little statement you just made? 971 00:54:44,680 --> 00:54:48,520 Speaker 2: Sure. Bats are not actually blind. They do have small eyes, 972 00:54:48,560 --> 00:54:51,760 Speaker 2: they have a little bit of vision. It's very sensitive vision, 973 00:54:51,960 --> 00:54:54,600 Speaker 2: but it's actually sensitive to being able to see at 974 00:54:54,719 --> 00:54:57,040 Speaker 2: night as well. But they're not... I mean, the term 975 00:54:57,080 --> 00:54:59,240 Speaker 2: blind as a bat, I know, we use a lot. 976 00:54:59,400 --> 00:55:02,080 Speaker 2: So they don't have very good sight, but they're not 977 00:55:02,120 --> 00:55:02,840 Speaker 2: actually blind. 978 00:55:03,080 --> 00:55:07,400 Speaker 1: Sorry. Okay, all right. So they're not blind, they're vision impaired. 979 00:55:07,520 --> 00:55:11,359 Speaker 2: Well, not necessarily. It's not sharp, colorful vision 980 00:55:11,440 --> 00:55:15,279 Speaker 2: or anything like that. So they don't see like you do. 981 00:55:15,640 --> 00:55:19,440 Speaker 1: I'm vision impaired, if you look at my face. 982 00:55:20,040 --> 00:55:22,200 Speaker 4: But the echolocation, they're much better at that. 983 00:55:22,239 --> 00:55:24,640 Speaker 2: So I guess when they're flying, particularly at night, 984 00:55:24,640 --> 00:55:26,560 Speaker 2: they can use their echolocation to see a rat 985 00:55:26,680 --> 00:55:27,440 Speaker 2: or a mouse 986 00:55:27,280 --> 00:55:29,520 Speaker 4: or something and be able to swoop down. 987 00:55:32,280 --> 00:55:36,640 Speaker 1: How the hell do they do that? I don't know. All right, 988 00:55:36,719 --> 00:55:39,280 Speaker 1: bats are not blind, just vision impaired.
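What that guy in the documentary taught himself is, at heart, simple time-of-flight ranging: sound travels at a known speed, so the delay before a click's echo returns tells you how far away an obstacle is. A rough sketch of the arithmetic, with an illustrative delay value:

```python
# Time-of-flight ranging, the principle behind echolocation: a click goes
# out, bounces off an obstacle, and the echo's round-trip delay reveals the
# distance. The speed of sound and the example delay are illustrative values.

SPEED_OF_SOUND = 343.0  # metres per second, in air at about 20 degrees C

def distance_from_echo(round_trip_delay_s: float) -> float:
    """Distance to an obstacle from the round-trip delay of its echo."""
    # The click travels out AND back, so halve the round trip.
    return SPEED_OF_SOUND * round_trip_delay_s / 2

# An echo heard 20 milliseconds after the click puts the wall about 3.4 m away.
print(f"{distance_from_echo(0.020):.2f} m")
```

Bats do the same thing with ultrasonic clicks many times a second, which is why they can pick out something as small as a mouse in the dark.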
989 00:55:41,200 --> 00:55:42,520 Speaker 4: Well, that's subjective. 990 00:55:43,000 --> 00:55:45,880 Speaker 2: Not at night. At night, they're going to outperform you, 991 00:55:46,040 --> 00:55:47,200 Speaker 2: hands, wings down. 992 00:55:47,960 --> 00:55:51,680 Speaker 1: But is that, is that with eyes or echolocation? Both? 993 00:55:54,200 --> 00:55:57,080 Speaker 1: All right, I'm gonna, I'm going to do a deep 994 00:55:57,120 --> 00:56:00,879 Speaker 1: dive into that. No, not today, I don't have time. Next. 995 00:56:00,760 --> 00:56:03,399 Speaker 4: You know, give us some advice. I like the advice thing. 996 00:56:03,440 --> 00:56:04,960 Speaker 2: You know, we should ask people if they want me 997 00:56:05,000 --> 00:56:06,959 Speaker 2: to kind of look up stuff for them at any point. 998 00:56:07,040 --> 00:56:11,319 Speaker 4: Happy to have a chat. Hey, batteries in your laptops, 999 00:56:12,000 --> 00:56:15,960 Speaker 4: so the life of the battery in your laptop can 1000 00:56:16,080 --> 00:56:20,080 Speaker 4: be compromised if you plug it in continuously all the time, 1001 00:56:20,080 --> 00:56:22,359 Speaker 4: which is what I do. And I forget to unplug it. 1002 00:56:22,680 --> 00:56:25,279 Speaker 4: So one of the things that we've now found with 1003 00:56:25,520 --> 00:56:31,319 Speaker 4: improving the longevity of standard batteries is not to discharge 1004 00:56:31,320 --> 00:56:33,640 Speaker 4: them by one hundred percent and not to charge them 1005 00:56:33,640 --> 00:56:36,480 Speaker 4: to one hundred percent. And in fact, I believe Teslas 1006 00:56:37,040 --> 00:56:39,560 Speaker 4: and a lot of electric car manufacturers, when you plug 1007 00:56:39,600 --> 00:56:42,440 Speaker 4: them in to charge, they try 1008 00:56:42,200 --> 00:56:44,640 Speaker 2: to charge to, say, about eighty-five, ninety percent. They 1009 00:56:44,640 --> 00:56:47,600 Speaker 2: don't ever go full to one hundred.
So it's only 1010 00:56:47,600 --> 00:56:49,880 Speaker 2: occasionally that you would do the, pardon me, that 1011 00:56:49,880 --> 00:56:51,760 Speaker 2: you would do that. And so, just a bit of advice. 1012 00:56:51,840 --> 00:56:54,960 Speaker 2: So, when you're looking at, you know, the way that 1013 00:56:55,080 --> 00:56:58,239 Speaker 2: you use battery power... I mean, for a lot of us, 1014 00:56:58,280 --> 00:56:59,799 Speaker 2: I guess... I mean, I work off 1015 00:56:59,800 --> 00:57:03,400 Speaker 2: a laptop a lot. Sitting here right now I'm off the laptop, 1016 00:57:03,400 --> 00:57:06,160 Speaker 2: but I have a desktop machine. But the longevity of 1017 00:57:06,200 --> 00:57:09,200 Speaker 2: the battery will really be dictated by how we 1018 00:57:09,320 --> 00:57:11,600 Speaker 2: use it. So, for good battery health, and this is 1019 00:57:11,600 --> 00:57:15,320 Speaker 2: not just laptops, it's smartphones, anything that uses a battery, scooters, 1020 00:57:15,360 --> 00:57:18,120 Speaker 2: you name it, don't plug it in and leave 1021 00:57:18,160 --> 00:57:19,960 Speaker 2: it plugged in overnight. I know we do it with 1022 00:57:20,000 --> 00:57:23,600 Speaker 2: our phones all the time. Phone manufacturers are getting better 1023 00:57:23,800 --> 00:57:28,080 Speaker 2: at regulating battery charging so that they don't damage the battery, 1024 00:57:28,120 --> 00:57:29,960 Speaker 2: and they have some really good tech built into some 1025 00:57:30,000 --> 00:57:33,040 Speaker 2: of the newer phones. But yeah, so they're saying, you know, 1026 00:57:33,160 --> 00:57:35,680 Speaker 2: don't keep your laptop plugged in, or tablet, or whatever 1027 00:57:35,680 --> 00:57:37,320 Speaker 2: it happens to be. So I just thought I'd throw 1028 00:57:37,360 --> 00:57:40,000 Speaker 2: that one out there, if you want to try to improve your battery life.
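The charge-limiting behaviour described here, phones and EVs stopping short of one hundred percent and avoiding deep discharge, can be sketched as a tiny rule. The 20 and 80 percent thresholds and the little controller below are illustrative only, not any real manufacturer's firmware:

```python
# A toy sketch of a charge-limiting rule: keep a lithium battery between a
# floor and a ceiling (here 20-80%) instead of cycling it 0-100%. The
# thresholds and actions are illustrative, not real device firmware.

LOW, HIGH = 20, 80  # percent; commonly cited conservative bounds

def charger_action(level_percent: float, plugged_in: bool) -> str:
    """Decide what a charge-limiting controller would do next."""
    if plugged_in and level_percent < HIGH:
        return "charge"      # top up, but stop short of 100%
    if plugged_in:
        return "hold"        # at or above the ceiling: stop charging
    if level_percent <= LOW:
        return "warn"        # nearing deep discharge: prompt the user
    return "discharge"       # normal unplugged use

print(charger_action(50, plugged_in=True))   # still below the ceiling
print(charger_action(85, plugged_in=True))   # plugged in but capped
print(charger_action(15, plugged_in=False))  # running low
```

This is essentially what the eighty-percent setting mentioned a moment later does automatically.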
1029 00:57:40,160 --> 00:57:42,560 Speaker 1: Are you saying I shouldn't plug my phone in overnight 1030 00:57:42,600 --> 00:57:44,120 Speaker 1: and leave it all night? 1031 00:57:44,160 --> 00:57:46,520 Speaker 2: Well, that's what they say. Yeah, you shouldn't leave it 1032 00:57:46,560 --> 00:57:48,720 Speaker 2: on all night. You should charge it and then take 1033 00:57:48,720 --> 00:57:51,040 Speaker 2: it off charge. Don't leave it plugged in overnight, so 1034 00:57:51,240 --> 00:57:52,160 Speaker 2: before you go to bed. 1035 00:57:52,600 --> 00:57:55,360 Speaker 3: So mine's got a setting and it only 1036 00:57:55,480 --> 00:57:58,520 Speaker 3: charges to eighty percent. So I leave it in overnight, 1037 00:57:58,560 --> 00:57:59,400 Speaker 3: but it only ever charges to eighty. 1038 00:57:59,680 --> 00:58:03,760 Speaker 4: Yes, yes, some of the smartest 1039 00:58:02,520 --> 00:58:05,680 Speaker 5: settings on your laptop. When I had my laptop 1040 00:58:05,320 --> 00:58:08,280 Speaker 3: set up, I had a guy set it up, and 1041 00:58:08,400 --> 00:58:09,920 Speaker 3: he did all that clever stuff. 1042 00:58:10,120 --> 00:58:13,560 Speaker 4: Yeah, so fully charging and fully discharging puts the battery 1043 00:58:13,640 --> 00:58:16,840 Speaker 4: under a lot more stress, and that's thought to shorten 1044 00:58:16,920 --> 00:58:20,040 Speaker 4: its life. We're talking about 1045 00:58:20,000 --> 00:58:23,400 Speaker 2: a massive increase in the life of 1046 00:58:23,400 --> 00:58:26,160 Speaker 2: the battery if you try to avoid doing that, so 1047 00:58:26,200 --> 00:58:28,680 Speaker 2: you will get a lot longer out of the battery 1048 00:58:28,680 --> 00:58:29,240 Speaker 2: if you do that. 1049 00:58:29,360 --> 00:58:29,760 Speaker 4: And listen.
1050 00:58:29,800 --> 00:58:31,360 Speaker 2: Once upon a time, remember, we used to be able 1051 00:58:31,360 --> 00:58:33,280 Speaker 2: to change the batteries in our phones. We'd just swap 1052 00:58:33,360 --> 00:58:35,720 Speaker 2: them over. And then of course we don't do that now, 1053 00:58:35,760 --> 00:58:38,040 Speaker 2: so we need to be more mindful about how we 1054 00:58:38,080 --> 00:58:40,880 Speaker 2: charge and discharge, so that 1055 00:58:40,920 --> 00:58:43,000 Speaker 2: we're not, you know, damaging the battery. 1056 00:58:43,120 --> 00:58:45,800 Speaker 4: So not one hundred percent, not zero percent. There you 1057 00:58:45,800 --> 00:58:47,000 Speaker 4: go, somewhere in between. 1058 00:58:46,920 --> 00:58:49,800 Speaker 1: Patrick, did you... Oh, you don't have an iPhone? You 1059 00:58:49,840 --> 00:58:51,120 Speaker 1: have a Samsung, do you? 1060 00:58:51,280 --> 00:58:54,920 Speaker 4: I've got a Pixel, a Google Pixel. 1061 00:58:55,680 --> 00:58:59,760 Speaker 1: So I just updated, because I'm a dummy with tech obviously, but 1062 00:58:59,840 --> 00:59:03,400 Speaker 1: I updated. Is it iOS eighteen or seventeen, Tiff, the 1063 00:59:03,440 --> 00:59:07,440 Speaker 1: latest? You know, eighteen. So I just updated to 1064 00:59:07,600 --> 00:59:13,280 Speaker 1: iOS eighteen on my iPhone fourteen. But what's interesting is 1065 00:59:13,400 --> 00:59:17,120 Speaker 1: I was having a 1066 00:59:17,160 --> 00:59:19,200 Speaker 1: listen to the start of a podcast the other day. 1067 00:59:19,200 --> 00:59:22,320 Speaker 1: I can't remember, one of ours or a You Project one.
1068 00:59:22,960 --> 00:59:26,439 Speaker 1: And now with the new update, you open it up 1069 00:59:26,960 --> 00:59:30,080 Speaker 1: and the text, as you're talking, like right now, if 1070 00:59:30,080 --> 00:59:32,480 Speaker 1: somebody's listening to this on an iPhone and you've got 1071 00:59:32,480 --> 00:59:35,480 Speaker 1: the update, you can go in and it gives the 1072 00:59:35,520 --> 00:59:39,080 Speaker 1: whole text of the whole show in real time as 1073 00:59:39,120 --> 00:59:42,640 Speaker 1: you're talking. It scrolls up the screen with no 1074 00:59:42,760 --> 00:59:45,680 Speaker 4: typos? I've had that. I've had that built into my 1075 00:59:45,800 --> 00:59:48,480 Speaker 4: phone for about five years. It's great. 1076 00:59:48,960 --> 00:59:49,440 Speaker 1: You haven't... 1077 00:59:49,840 --> 00:59:51,120 Speaker 4: They probably had an event. 1078 00:59:52,040 --> 00:59:54,600 Speaker 1: It's not your job to bring me down every time 1079 00:59:54,800 --> 00:59:56,200 Speaker 1: I just mention something. 1080 00:59:56,240 --> 00:59:58,880 Speaker 2: Okay, let me give you some pro-Apple stories then, 1081 00:59:58,920 --> 01:00:00,960 Speaker 2: because they did have a big launch event recently. 1082 01:00:01,040 --> 01:00:02,880 Speaker 2: And I've got to say, there are two little things 1083 01:00:02,920 --> 01:00:05,240 Speaker 2: that did pique my interest, knowing that you're an 1084 01:00:05,280 --> 01:00:05,960 Speaker 2: Apple person. 1085 01:00:06,680 --> 01:00:09,400 Speaker 4: The new AirPods that are coming out 1086 01:00:09,520 --> 01:00:11,480 Speaker 4: actually have a hearing 1087 01:00:11,200 --> 01:00:13,880 Speaker 2: test built into them, so you can test your own 1088 01:00:13,920 --> 01:00:16,040 Speaker 2: hearing using the new Apple AirPods. 1089 01:00:16,040 --> 01:00:18,280 Speaker 4: I thought that was pretty good.
And the other thing 1090 01:00:18,600 --> 01:00:21,240 Speaker 4: that is really good, that you're probably going to need, Craigo, 1091 01:00:21,400 --> 01:00:27,640 Speaker 1: Is it? 1092 01:00:27,760 --> 01:00:31,320 Speaker 2: You could put your Apple Watch on and it will detect sleep apnoea. 1093 01:00:32,160 --> 01:00:38,800 Speaker 1: So why would I have sleep apnoea? I'm fucking 1094 01:00:39,040 --> 01:00:42,520 Speaker 1: lean and healthy. What, do you think because I'm old 1095 01:00:42,960 --> 01:00:45,520 Speaker 1: I'm going to sound like a walrus when I sleep? 1096 01:00:46,080 --> 01:00:49,680 Speaker 3: People with diagnosed sleep apnoea right now are like, yeah, 1097 01:00:50,040 --> 01:00:50,600 Speaker 3: not lean. 1098 01:00:52,840 --> 01:01:00,000 Speaker 4: He puts the podcast on to put himself to sleep. 1099 01:01:00,000 --> 01:01:03,120 Speaker 1: Well, a lot of people who are overweight have sleep 1100 01:01:03,120 --> 01:01:03,520 Speaker 1: apnoea. 1101 01:01:03,800 --> 01:01:06,680 Speaker 4: That's right, so who's overweight? Thank you. That is in there 1102 01:01:06,720 --> 01:01:07,400 Speaker 4: for all our listeners. 1103 01:01:07,400 --> 01:01:09,920 Speaker 1: Well, and also not everyone who's got sleep apnoea is 1104 01:01:09,960 --> 01:01:13,640 Speaker 1: out of shape, of course, guys. Should have clarified that 1105 01:01:13,720 --> 01:01:16,640 Speaker 1: at the start. But why would you assume I wouldn't 1106 01:01:16,640 --> 01:01:20,440 Speaker 1: need it? What's behind that? 1107 01:01:21,160 --> 01:01:25,240 Speaker 4: Look, he's very old. You are older than us, though, 1108 01:01:25,320 --> 01:01:27,640 Speaker 4: and it does tend to come with age, doesn't it? 1109 01:01:28,280 --> 01:01:32,000 Speaker 1: Now you're being ageist. I can't talk to Patrick now.
1110 01:01:32,400 --> 01:01:34,760 Speaker 1: It's like I feel like I'm getting ganged up on 1111 01:01:34,800 --> 01:01:35,440 Speaker 1: my own show. 1112 01:01:37,320 --> 01:01:42,800 Speaker 4: You kind of are. No, but I think, well, 1113 01:01:42,800 --> 01:01:44,760 Speaker 4: I was just doing a pro-Apple story. This is 1114 01:01:44,880 --> 01:01:47,680 Speaker 4: me doing a pro-Apple story. I thought you'd be interested. 1115 01:01:48,240 --> 01:01:50,480 Speaker 1: I'm not pro-Apple though, I'm not anything. I just 1116 01:01:50,520 --> 01:01:54,400 Speaker 1: happen to have an iPhone. In fact, Melissa hates my laptop. 1117 01:01:54,480 --> 01:01:57,280 Speaker 1: I've got a, what do you call it, non-Apple? Like 1118 01:01:57,360 --> 01:02:01,000 Speaker 1: I've just got an old-school PC. I've just got 1119 01:02:01,080 --> 01:02:05,800 Speaker 1: a laptop that she fucking thinks is a piece of shit. 1120 01:02:05,880 --> 01:02:09,400 Speaker 1: But I'm used to it, so I won't change it. 1121 01:02:09,440 --> 01:02:11,240 Speaker 1: It probably is a piece of shit. Give us one more 1122 01:02:11,280 --> 01:02:14,200 Speaker 1: to wind up. Just something, no pressure, just something 1123 01:02:13,920 --> 01:02:17,320 Speaker 4: amazing. Photovoltaic cells. How's that? 1124 01:02:18,400 --> 01:02:18,919 Speaker 1: Yeah? Good. 1125 01:02:19,160 --> 01:02:19,320 Speaker 3: Yeah? 1126 01:02:19,320 --> 01:02:22,240 Speaker 4: I thought you'd like it. Carry on. No, this is really 1127 01:02:22,280 --> 01:02:24,160 Speaker 4: interesting, because for those of us... 1128 01:02:24,480 --> 01:02:27,400 Speaker 2: I mean, when I subscribed to 1129 01:02:27,480 --> 01:02:30,560 Speaker 2: Time magazine about thirty years ago, I got a free 1130 01:02:30,640 --> 01:02:34,040 Speaker 2: calculator with a solar panel in it, and I've never 1131 01:02:34,080 --> 01:02:35,480 Speaker 2: had to charge that bugger
1132 01:02:35,280 --> 01:02:36,600 Speaker 4: up for thirty years. 1133 01:02:36,840 --> 01:02:39,280 Speaker 2: You sit it anywhere inside, you don't have to 1134 01:02:39,280 --> 01:02:42,280 Speaker 2: be outside, just any ambient light, and that bugger works. 1135 01:02:42,480 --> 01:02:44,959 Speaker 2: And I've always thought to myself, why the hell don't 1136 01:02:44,960 --> 01:02:48,400 Speaker 2: we have other stuff that we can use indoors? Well, finally, 1137 01:02:48,760 --> 01:02:51,720 Speaker 2: a new tech startup in Wagga Wagga is going to 1138 01:02:51,760 --> 01:02:54,240 Speaker 2: be the first Australian company to produce a new type 1139 01:02:54,280 --> 01:02:56,960 Speaker 2: of solar cell, and they're saying it'll be able to 1140 01:02:57,040 --> 01:03:00,960 Speaker 2: generate enough electricity just indoors to eventually get rid of 1141 01:03:01,000 --> 01:03:04,560 Speaker 2: disposable batteries, so your remote control, anything that you use 1142 01:03:04,640 --> 01:03:08,080 Speaker 2: indoors, potentially could be run off this new photovoltaic 1143 01:03:08,160 --> 01:03:09,960 Speaker 2: cell. 1144 01:03:10,080 --> 01:03:13,600 Speaker 1: That is actually a really good story, because I'd 1145 01:03:13,640 --> 01:03:17,000 Speaker 1: never thought of that. But those little kind of solar 1146 01:03:17,120 --> 01:03:21,560 Speaker 1: powered calculators, I used to have one and 1147 01:03:21,920 --> 01:03:24,680 Speaker 1: they are great. So yeah, why can't they do that 1148 01:03:24,760 --> 01:03:27,040 Speaker 1: with all the, you know, the stuff that we use 1149 01:03:27,080 --> 01:03:27,600 Speaker 1: in the house? 1150 01:03:27,800 --> 01:03:31,240 Speaker 2: Well, things like keyboards need to be connected constantly, and, 1151 01:03:31,720 --> 01:03:35,120 Speaker 2: bearing in mind, a calculator uses, similar Tiff to your 1152 01:03:35,480 --> 01:03:38,200 Speaker 2: tablet, kind of an e-ink.
Liquid crystal 1153 01:03:38,200 --> 01:03:41,600 Speaker 2: displays don't take much power to have them 1154 01:03:41,600 --> 01:03:43,680 Speaker 2: come up. But you know, these days, a lot of 1155 01:03:43,680 --> 01:03:46,240 Speaker 2: the technology we use just takes more power. 1156 01:03:47,080 --> 01:03:50,280 Speaker 2: It needs something a bit more robust than a standard 1157 01:03:50,280 --> 01:03:53,280 Speaker 2: battery. But I think it's kind of really cool. 1158 01:03:53,320 --> 01:03:56,440 Speaker 2: It's called Halocell, and they reckon they're going to 1159 01:03:56,440 --> 01:04:01,720 Speaker 2: start producing seven-centimetre-long photovoltaic cells to generate enough power 1160 01:04:01,720 --> 01:04:05,280 Speaker 2: to replace a pair of disposable batteries in a TV remote, 1161 01:04:05,400 --> 01:04:08,240 Speaker 2: or the charging cable for a set of headphones. 1162 01:04:09,360 --> 01:04:11,919 Speaker 1: You could build a wind farm in your backyard, mate, 1163 01:04:12,000 --> 01:04:12,840 Speaker 1: you've got enough 1164 01:04:12,680 --> 01:04:15,240 Speaker 4: space. Well, not even the backyard, I'd just have it indoors. 1165 01:04:15,280 --> 01:04:16,280 Speaker 4: I've got a pretty big house. 1166 01:04:17,400 --> 01:04:19,400 Speaker 1: And you produce a lot of wind, so you should 1167 01:04:19,400 --> 01:04:24,479 Speaker 1: be able to run the whole fucking house. Patrick, where 1168 01:04:24,480 --> 01:04:26,240 Speaker 1: can people connect with you, my friend? 1169 01:04:26,440 --> 01:04:28,240 Speaker 4: No one's going to want to connect with me after 1170 01:04:28,280 --> 01:04:28,720 Speaker 4: this show. 1171 01:04:28,960 --> 01:04:31,520 Speaker 1: They are, yeah, they are. We love you, even 1172 01:04:31,560 --> 01:04:34,760 Speaker 1: though you threw me under the bus, and I apparently 1173 01:04:34,800 --> 01:04:39,360 Speaker 1: throw you under the bus a lot, so...
1174 01:04:37,880 --> 01:04:40,800 Speaker 2: It didn't just run over you, it reversed about four times. 1175 01:04:42,600 --> 01:04:45,840 Speaker 2: You know, just go to websitesnow dot com 1176 01:04:45,840 --> 01:04:48,040 Speaker 2: dot au, that's probably the easiest way 1177 01:04:47,840 --> 01:04:50,000 Speaker 4: to find me. So, websitesnow dot com dot au. 1178 01:04:50,120 --> 01:04:52,320 Speaker 2: But if you want to send a message and get 1179 01:04:52,400 --> 01:04:55,560 Speaker 2: us to talk about something on the next show, I'd 1180 01:04:55,640 --> 01:04:58,520 Speaker 2: love to hear from people, if there's something in particular, 1181 01:04:58,640 --> 01:05:01,720 Speaker 2: or if you need to send some messages 1182 01:05:01,720 --> 01:05:02,240 Speaker 2: of support. 1183 01:05:02,400 --> 01:05:07,160 Speaker 1: People are probably going to go, fuck Craig, 1184 01:05:07,240 --> 01:05:09,720 Speaker 1: he doesn't need it. And you're right, I don't. Well, 1185 01:05:09,760 --> 01:05:11,600 Speaker 1: I think that's a good idea. Why don't we do 1186 01:05:11,640 --> 01:05:17,640 Speaker 1: the entire next show off listener requests? There are 1187 01:05:17,680 --> 01:05:21,240 Speaker 1: two places they can do it. One is email. Say your email again? 1188 01:05:21,120 --> 01:05:23,400 Speaker 2: Just go to websitesnow dot com dot au. 1189 01:05:23,480 --> 01:05:25,240 Speaker 2: You can fill out the form there and send something 1190 01:05:25,240 --> 01:05:27,080 Speaker 2: through to me, it'll come directly through to me. 1191 01:05:27,920 --> 01:05:28,680 Speaker 2: Or you 1192 01:05:28,600 --> 01:05:32,920 Speaker 1: can go to the You Project Facebook page, the You Project podcast 1193 01:05:33,000 --> 01:05:37,480 Speaker 1: Facebook page.
Say this is for Patrick, and leave 1194 01:05:37,520 --> 01:05:40,960 Speaker 1: your question or your idea for a bit of research 1195 01:05:41,160 --> 01:05:45,400 Speaker 1: or an exploration of a topic, and next time will 1196 01:05:45,400 --> 01:05:50,800 Speaker 1: be a completely listener-generated and directed discussion. 1197 01:05:51,440 --> 01:05:52,800 Speaker 4: It seems like a lot of work for me. 1198 01:05:54,800 --> 01:05:56,440 Speaker 1: It sounds like a lot of work for one of 1199 01:05:56,480 --> 01:06:00,000 Speaker 1: your underlings. You said... by the way, it was your 1200 01:06:00,160 --> 01:06:02,320 Speaker 1: suggestion fucking thirty seconds ago. 1201 01:06:04,440 --> 01:06:07,000 Speaker 4: I think... I'm just now nervous. 1202 01:06:08,000 --> 01:06:10,479 Speaker 1: You make a suggestion and then you go, oh my god, 1203 01:06:10,560 --> 01:06:12,640 Speaker 1: that's a lot of work for me. We didn't fucking 1204 01:06:12,760 --> 01:06:13,160 Speaker 1: raise it. 1205 01:06:13,560 --> 01:06:15,720 Speaker 2: What happens if we do a whole one-hour show 1206 01:06:15,880 --> 01:06:18,360 Speaker 2: on one topic? Well, we won't. 1207 01:06:18,440 --> 01:06:20,000 Speaker 4: We'll sort that out. 1208 01:06:20,760 --> 01:06:23,480 Speaker 1: Well, we need to put in some rules. I was 1209 01:06:23,520 --> 01:06:26,240 Speaker 1: talking to Tiff last night, right. I fucked up earlier 1210 01:06:26,280 --> 01:06:28,960 Speaker 1: this week. I know we're finished, but I'm running a 1211 01:06:29,000 --> 01:06:31,440 Speaker 1: mentoring group, and I've got about seventeen 1212 01:06:31,480 --> 01:06:33,520 Speaker 1: people in the group, and as we were 1213 01:06:33,600 --> 01:06:36,040 Speaker 1: running through, I forgot to say at the start, 1214 01:06:36,400 --> 01:06:38,200 Speaker 1: just keep it to two or three minutes.
Each of 1215 01:06:38,200 --> 01:06:41,080 Speaker 1: you. And I thought we'd get through the whole group, 1216 01:06:41,120 --> 01:06:43,240 Speaker 1: bit of a debrief on the week. It was my fault, 1217 01:06:43,280 --> 01:06:45,480 Speaker 1: I didn't make it clear, so nobody's at fault. But 1218 01:06:46,080 --> 01:06:48,320 Speaker 1: let's just say that we got through about seven people 1219 01:06:48,360 --> 01:06:50,000 Speaker 1: in an hour, and I thought we'd get through the 1220 01:06:50,040 --> 01:06:53,240 Speaker 1: whole group in about forty minutes. So I think with 1221 01:06:53,360 --> 01:06:56,560 Speaker 1: the topics for yours, we'll dedicate, you know, kind of 1222 01:06:56,560 --> 01:06:58,960 Speaker 1: three or four minutes to each topic or question, then 1223 01:06:59,000 --> 01:07:00,680 Speaker 1: we'll be able to get through twenty or so. 1224 01:07:01,080 --> 01:07:02,680 Speaker 4: Okay, that sounds really good. I'm in for that. 1225 01:07:03,160 --> 01:07:06,120 Speaker 2: Go for it: websitesnow dot com dot au, or 1226 01:07:06,400 --> 01:07:08,640 Speaker 2: the You Project Facebook page. 1227 01:07:09,600 --> 01:07:12,680 Speaker 1: Tiff, good luck. You don't need luck, but have fun 1228 01:07:12,720 --> 01:07:16,400 Speaker 1: on the sojourn, enjoy your time 1229 01:07:17,040 --> 01:07:21,000 Speaker 1: in India. Thanks. And good luck trying to find 1230 01:07:21,000 --> 01:07:23,640 Speaker 1: a coffee, and drugs for when you're pooing through the 1231 01:07:23,680 --> 01:07:28,000 Speaker 1: eye of a needle. And stay hydrated, but careful 1232 01:07:28,120 --> 01:07:28,720 Speaker 1: what you drink. 1233 01:07:28,920 --> 01:07:31,880 Speaker 5: Oh yeah, stay hydrated, but don't drink the water. 1234 01:07:36,240 --> 01:07:39,760 Speaker 1: Yeah, good luck with that. We'll say goodbye. Farewell. Patrick, 1235 01:07:39,800 --> 01:07:44,000 Speaker 1: thank you. Tiff, thank you. Listeners, thank you.