1 00:00:00,600 --> 00:00:02,400 Speaker 1: G'day, team. It's the You Project, of course 2 00:00:02,400 --> 00:00:06,000 Speaker 1: it is. Patrick's here, Tiff's here, I'm here. I hope 3 00:00:06,040 --> 00:00:08,720 Speaker 1: you're well. Hope you're enjoying the You Project. And if 4 00:00:08,720 --> 00:00:11,520 Speaker 1: you're not part of our little online group-a-rooney, 5 00:00:12,400 --> 00:00:15,640 Speaker 1: the You Project Facebook group is available for you. 6 00:00:15,760 --> 00:00:18,239 Speaker 2: Is that what it's called, the You Project podcast? Or 7 00:00:18,600 --> 00:00:22,360 Speaker 2: I don't know, it's something like that group. I should 8 00:00:23,320 --> 00:00:24,239 Speaker 2: I should probably know. 9 00:00:24,920 --> 00:00:27,440 Speaker 1: But there's about four thousand of us that jump on 10 00:00:27,560 --> 00:00:30,720 Speaker 1: there and talk, and young Danny this week had something 11 00:00:30,720 --> 00:00:32,000 Speaker 1: to celebrate. 12 00:00:31,520 --> 00:00:33,279 Speaker 2: In the group. We all did that, like a. 13 00:00:33,200 --> 00:00:37,640 Speaker 1: Little online community where people just support each other in 14 00:00:37,720 --> 00:00:39,440 Speaker 1: all the good things. So if you're not part of that, 15 00:00:39,479 --> 00:00:41,960 Speaker 1: there's no hooks, catches or agendas. You just jump in 16 00:00:42,000 --> 00:00:45,879 Speaker 1: and chat or jump out, whatever the case. Hi, Patrick. Hey. Hey, 17 00:00:46,000 --> 00:00:50,519 Speaker 1: how you doing? Are you part of the group? Kind of? Yes and no? 18 00:00:51,200 --> 00:00:52,080 Speaker 3: Yes, I joined in. 19 00:00:53,240 --> 00:00:54,640 Speaker 2: Are you a member of the group? 20 00:00:54,800 --> 00:00:58,240 Speaker 3: I think so. Yeah. But I don't monitor social media anymore. Remember, 21 00:00:58,440 --> 00:01:01,200 Speaker 3: I just don't jump on. That's true. 22 00:01:01,680 --> 00:01:03,880 Speaker 2: Yeah, that's true. That's probably for the best. 23 00:01:04,080 --> 00:01:06,640 Speaker 1: But after talking to David Gillespie the other day, I 24 00:01:06,680 --> 00:01:08,959 Speaker 1: was a bit depressed but also a little bit enlightened. 25 00:01:10,080 --> 00:01:17,839 Speaker 1: So I'm right now curating my relationship with social media 26 00:01:17,920 --> 00:01:22,559 Speaker 1: and all of my attention regarding all things electronic, high 27 00:01:22,600 --> 00:01:24,080 Speaker 1: tech, high harms. 28 00:01:24,920 --> 00:01:27,639 Speaker 4: How are you? Fabulous Friday morning. 29 00:01:28,480 --> 00:01:31,760 Speaker 1: Now, Patrick and I saw that some male was in 30 00:01:31,800 --> 00:01:33,120 Speaker 1: the background before. 31 00:01:32,840 --> 00:01:35,920 Speaker 2: We went live. I'm not sure that either he or 32 00:01:35,959 --> 00:01:37,959 Speaker 2: I have given any kind of approval for that. 33 00:01:40,360 --> 00:01:41,920 Speaker 4: Earlier in the morning to walk the dog. 34 00:01:42,040 --> 00:01:44,920 Speaker 3: Yes, is that what he does? Oh sorry, I 35 00:01:44,920 --> 00:01:47,800 Speaker 3: thought it was a letter from Australia Post. 36 00:01:48,080 --> 00:01:50,440 Speaker 2: Yeah, so he's like a dog walker man, is he? 37 00:01:50,680 --> 00:01:51,440 Speaker 4: Yes? Yes? 38 00:01:52,080 --> 00:01:55,120 Speaker 1: Why was he in pajamas? And by pajamas, I mean jocks. 39 00:01:58,080 --> 00:02:01,240 Speaker 1: Why was the dog walker man nude in the background? 40 00:02:01,280 --> 00:02:06,160 Speaker 2: What was that about? And you were giggling.
Giggling like a 41 00:02:06,200 --> 00:02:10,000 Speaker 2: fucking guilty school girl? And Patrick's like, I wish I 42 00:02:10,040 --> 00:02:11,520 Speaker 2: had one of them in my background. 43 00:02:12,400 --> 00:02:15,320 Speaker 3: I was just going to say that I couldn't say 44 00:02:15,520 --> 00:02:17,560 Speaker 3: you must have X ray glasses because he looked pretty 45 00:02:17,560 --> 00:02:18,160 Speaker 3: close to me. 46 00:02:20,840 --> 00:02:24,000 Speaker 1: We're just fucking around for the show. Patrick, you're familiar 47 00:02:24,080 --> 00:02:27,720 Speaker 1: with entertainment, No, you're familiar with that. No, but thanks 48 00:02:27,720 --> 00:02:29,840 Speaker 1: for that. You really added to what was going on. 49 00:02:30,560 --> 00:02:32,919 Speaker 4: Truth get in the way of a good story, Patrick. 50 00:02:32,560 --> 00:02:33,600 Speaker 2: Yeah, just let that. 51 00:02:34,400 --> 00:02:38,600 Speaker 1: Just let that roll now, Patrick, You've been the victim 52 00:02:38,639 --> 00:02:41,400 Speaker 1: of a vicious crime up there in the place where 53 00:02:41,520 --> 00:02:46,040 Speaker 1: you once bragged about how safe and beautiful it was. 54 00:02:46,080 --> 00:02:48,560 Speaker 1: You don't lock your car, you don't lock your house. 55 00:02:48,639 --> 00:02:51,280 Speaker 1: But you do, now, don't you. 56 00:02:51,600 --> 00:02:51,760 Speaker 5: Well? 57 00:02:51,760 --> 00:02:54,760 Speaker 3: I do lock my car, but it was parked across 58 00:02:54,760 --> 00:02:57,239 Speaker 3: the road on one of those really hot days under 59 00:02:57,280 --> 00:03:01,160 Speaker 3: a tree because there was no shade, and I left 60 00:03:01,200 --> 00:03:03,680 Speaker 3: it overnight and someone broke into it. Yeah, smashed the windows. 61 00:03:03,880 --> 00:03:06,680 Speaker 3: Wasn't fun. But I had my backpack in it, which 62 00:03:06,680 --> 00:03:08,280 Speaker 3: is what I take to tai Chi with all my 63 00:03:08,400 --> 00:03:12,360 Speaker 3: notes and fans and a little bit of equipment and yes, 64 00:03:12,880 --> 00:03:17,480 Speaker 3: my wallet. I know everyone's been telling me how dumb 65 00:03:17,560 --> 00:03:20,800 Speaker 3: I am, and one friend in particular, Hi Sylvia, it 66 00:03:21,160 --> 00:03:23,960 Speaker 3: constantly tells me how dumb I am, So thanks for that. 67 00:03:24,360 --> 00:03:27,800 Speaker 3: But yeah, the police were really lovely. The forensic guys 68 00:03:27,800 --> 00:03:30,880 Speaker 3: were really good. But the crims or krim, I don't 69 00:03:30,880 --> 00:03:32,799 Speaker 3: know if it was more than one person were wearing 70 00:03:32,800 --> 00:03:37,400 Speaker 3: gloves so they couldn't take any prints. But but digital tracking, 71 00:03:37,520 --> 00:03:40,400 Speaker 3: thank you. They used my credit card at nine different 72 00:03:40,400 --> 00:03:45,160 Speaker 3: locations as they drove back to Melbourne. So wow, Melbourne. Yeah, 73 00:03:45,200 --> 00:03:48,560 Speaker 3: and the good thing is police have CCTV footage, so 74 00:03:48,640 --> 00:03:52,440 Speaker 3: I'm hoping that my stupidity may actually lead to a conviction. 75 00:03:52,680 --> 00:03:55,120 Speaker 3: So in some ways it's a good thing. 76 00:03:56,840 --> 00:04:00,680 Speaker 2: Will you be locking it moving forward all the time 77 00:04:00,800 --> 00:04:01,640 Speaker 2: or just a look it? 
78 00:04:01,680 --> 00:04:03,440 Speaker 3: And I always lock it, but I just didn't have 79 00:04:03,480 --> 00:04:05,280 Speaker 3: it in the driveway where I normally have it nice 80 00:04:05,280 --> 00:04:07,480 Speaker 3: and safe where the camera is, and it was out 81 00:04:07,520 --> 00:04:08,680 Speaker 3: of the range of the camera. 82 00:04:08,880 --> 00:04:12,120 Speaker 1: So it's not a good feeling, is it, when you 83 00:04:12,280 --> 00:04:14,240 Speaker 1: walk up to your car, or like, I've had a 84 00:04:14,240 --> 00:04:17,000 Speaker 1: few things happen over the years where you go, yeah, 85 00:04:17,040 --> 00:04:19,279 Speaker 1: you come to a door at your house and like, 86 00:04:19,320 --> 00:04:22,040 Speaker 1: I didn't leave that open, or whatever. Makes you feel 87 00:04:22,080 --> 00:04:22,839 Speaker 1: sick in the guts. 88 00:04:23,400 --> 00:04:25,359 Speaker 3: I don't know if you remember where I lived. I was 89 00:04:25,360 --> 00:04:28,279 Speaker 3: renting a place in Toorak, which again is supposedly a 90 00:04:28,360 --> 00:04:32,320 Speaker 3: nice area, and I had my camera in the glovebox, 91 00:04:32,400 --> 00:04:34,600 Speaker 3: but this is before they had screens on the back. 92 00:04:34,640 --> 00:04:36,359 Speaker 3: It was a Kodak digital camera, one of the 93 00:04:36,440 --> 00:04:40,440 Speaker 3: very first ones, and it got stolen. And anyway, I 94 00:04:40,480 --> 00:04:42,560 Speaker 3: thought maybe I'll just check out a few of the 95 00:04:42,600 --> 00:04:46,680 Speaker 3: pawn shops to see, a few weeks later. Anyway, when 96 00:04:46,680 --> 00:04:49,240 Speaker 3: I walked out of Club X. Sorry, I mean the 97 00:04:49,240 --> 00:04:54,400 Speaker 3: places that you can hock your wares, not wear your hocks. 98 00:04:58,600 --> 00:05:01,840 Speaker 3: You know, there are a lot along Chapel Street, all 99 00:05:01,880 --> 00:05:02,640 Speaker 3: those places. 100 00:05:02,839 --> 00:05:05,280 Speaker 2: I'm not helping you out. You dug this fucking hole. 101 00:05:05,600 --> 00:05:10,280 Speaker 2: I saw it coming from two acres away. 102 00:05:10,720 --> 00:05:12,400 Speaker 3: Anyway, I went to one to watch. 103 00:05:12,600 --> 00:05:14,960 Speaker 2: Fortunately, Tiff is the best crowd in 104 00:05:14,880 --> 00:05:18,760 Speaker 3: the world, and you are lucky. When I do stand up, Tiff, 105 00:05:18,960 --> 00:05:21,120 Speaker 3: you are still going to be there right at the 106 00:05:21,200 --> 00:05:26,320 Speaker 3: front of everybody. Thank you. Anyway, so I walk in 107 00:05:26,440 --> 00:05:29,320 Speaker 3: and I see this camera that's identical to my camera 108 00:05:29,600 --> 00:05:32,720 Speaker 3: on sale, and I asked the guy, you know, 109 00:05:33,080 --> 00:05:34,560 Speaker 3: can I check out the camera? I said, have you 110 00:05:34,560 --> 00:05:36,400 Speaker 3: got the box and cables for it? And he said, mate, 111 00:05:36,520 --> 00:05:39,520 Speaker 3: just as is. So I looked at it and the 112 00:05:39,560 --> 00:05:41,839 Speaker 3: memory card was full, and I knew that the memory 113 00:05:41,839 --> 00:05:43,800 Speaker 3: card in my camera had been full. And then I 114 00:05:44,000 --> 00:05:47,040 Speaker 3: opened up the back of it and they had Panasonic batteries.
115 00:05:47,680 --> 00:05:51,159 Speaker 3: And Panasonic batteries aren't that common in a lot of devices, 116 00:05:51,200 --> 00:05:53,400 Speaker 3: you know, you tend to get Duracell and Eveready, 117 00:05:53,680 --> 00:05:55,919 Speaker 3: so I thought, I reckon this is my camera. So 118 00:05:55,920 --> 00:05:58,640 Speaker 3: I went to the local police station and the desk 119 00:05:58,760 --> 00:06:01,200 Speaker 3: sergeant, the person at the desk, said, nah, we can't 120 00:06:01,200 --> 00:06:02,800 Speaker 3: do anything. You know, there's not much we can do. 121 00:06:02,920 --> 00:06:06,480 Speaker 3: But a CIB detective happened to be walking past and 122 00:06:06,520 --> 00:06:08,000 Speaker 3: he turned to me and said, oh, what happened to 123 00:06:08,040 --> 00:06:10,240 Speaker 3: your camera? And I told him the whole story. He said, right, 124 00:06:10,400 --> 00:06:12,040 Speaker 3: let's jump in the car and go and have a look. 125 00:06:12,400 --> 00:06:15,080 Speaker 3: He was a lovely guy. And as we're driving there he says, yeah, 126 00:06:15,160 --> 00:06:19,400 Speaker 3: these bloody licensed thieves. So anyway, the short story is 127 00:06:19,880 --> 00:06:21,919 Speaker 3: I took the camera. We got the camera, took it 128 00:06:21,960 --> 00:06:24,120 Speaker 3: back to my place, and I had the box and 129 00:06:24,160 --> 00:06:27,800 Speaker 3: packaging for it. Serial codes matched, so I knew it 130 00:06:27,839 --> 00:06:30,400 Speaker 3: was my camera. I had to buy it back. 131 00:06:31,440 --> 00:06:32,159 Speaker 2: Oh my god. 132 00:06:32,200 --> 00:06:34,560 Speaker 3: I had to buy it back for the price that 133 00:06:34,839 --> 00:06:38,880 Speaker 3: the pawn shop had purchased it for, because even though 134 00:06:38,880 --> 00:06:41,560 Speaker 3: I could prove it was my camera, they had ID, 135 00:06:41,839 --> 00:06:45,280 Speaker 3: stolen ID, that had been used to verify the 136 00:06:45,279 --> 00:06:47,600 Speaker 3: person's sale. But anyway, so it's just, I know it's 137 00:06:47,600 --> 00:06:49,640 Speaker 3: a bit of a long winded story. Apologies for that, 138 00:06:49,720 --> 00:06:53,960 Speaker 3: but you're right, Craigo, you do feel somewhat violated when 139 00:06:54,080 --> 00:06:57,159 Speaker 3: something gets stolen that belongs to you. 140 00:06:57,920 --> 00:07:00,000 Speaker 1: It's not a competition, but I'll share one with you 141 00:07:00,200 --> 00:07:03,680 Speaker 1: that was a good one. I bought it when I had 142 00:07:03,720 --> 00:07:05,839 Speaker 1: Harper's on the Highway, that she used to train at, 143 00:07:05,880 --> 00:07:09,320 Speaker 1: when Tiff was just, you know, doing year twelve, fucking 144 00:07:09,400 --> 00:07:11,480 Speaker 1: squeezing zits and trying to figure out 145 00:07:11,480 --> 00:07:13,880 Speaker 1: what she wanted to do in the next, you know... 146 00:07:13,760 --> 00:07:15,280 Speaker 2: That decade or two.
147 00:07:16,320 --> 00:07:19,680 Speaker 1: So I bought a new motorbike patrick which I was 148 00:07:19,720 --> 00:07:22,440 Speaker 1: called a Suzuki eleven hundred hour, which was one of 149 00:07:22,480 --> 00:07:25,760 Speaker 1: the first of the super bikes in that kind of 150 00:07:26,160 --> 00:07:29,160 Speaker 1: it looked like something you would ride on the track 151 00:07:29,200 --> 00:07:31,960 Speaker 1: but on the street, right early days of those superbikes, 152 00:07:32,560 --> 00:07:36,320 Speaker 1: and back then it's probably twenty five years ago or something. 153 00:07:36,480 --> 00:07:37,040 Speaker 2: Cost me. 154 00:07:38,520 --> 00:07:40,840 Speaker 1: Like twenty three or four grand, which was a fortune, 155 00:07:40,840 --> 00:07:42,240 Speaker 1: probably like fifty grand now. 156 00:07:43,000 --> 00:07:46,119 Speaker 2: Anyway, blue and white, beautiful, amazing. Went out. 157 00:07:46,320 --> 00:07:49,120 Speaker 1: I won't say where I bought it, but anyway, went 158 00:07:49,160 --> 00:07:51,320 Speaker 1: out to this place which is not near my house. 159 00:07:51,720 --> 00:07:53,920 Speaker 1: It's the only place in Victoria I could get one. 160 00:07:53,960 --> 00:07:58,000 Speaker 1: They'd just been released. Bought it, and while I was there, 161 00:07:58,040 --> 00:08:00,640 Speaker 1: I bought this like fucking I don't know what it was, 162 00:08:00,840 --> 00:08:04,440 Speaker 1: titanium chain or something that you can't cut through, and 163 00:08:04,600 --> 00:08:07,040 Speaker 1: just so I could lock. Apart from the steering lock 164 00:08:07,040 --> 00:08:10,040 Speaker 1: on the bike, I could also lock it so nobody 165 00:08:10,040 --> 00:08:15,320 Speaker 1: could pinch it. Blah blah blah, and I was I 166 00:08:15,360 --> 00:08:17,120 Speaker 1: had to go back the next day to pick it up. 167 00:08:17,680 --> 00:08:20,680 Speaker 1: So anyway, I went back the next day, because that's 168 00:08:20,760 --> 00:08:22,640 Speaker 1: pre deliberate all that. I picked it up. Friend of 169 00:08:22,680 --> 00:08:25,080 Speaker 1: mine dropped me off road at home Bibby Bobby Boo. 170 00:08:25,240 --> 00:08:28,480 Speaker 1: Still hadn't seen one on the road, so I was 171 00:08:28,520 --> 00:08:30,680 Speaker 1: super proud of myself. Next day, I was going to 172 00:08:30,720 --> 00:08:37,959 Speaker 1: work driving and I saw opposite me at an intersection 173 00:08:38,120 --> 00:08:42,400 Speaker 1: in fact, Nepeane Highway South Road, opposite me, going the 174 00:08:42,400 --> 00:08:45,800 Speaker 1: opposite direction to me. Of course, I go, oh my god, 175 00:08:45,880 --> 00:08:49,000 Speaker 1: there's my bike. Little did I know it was actually 176 00:08:49,040 --> 00:08:53,120 Speaker 1: my fucking bike, right, And I'm like, look at that, dude, 177 00:08:53,160 --> 00:08:55,400 Speaker 1: and it's the same and it's you know, obviously I 178 00:08:55,400 --> 00:08:57,760 Speaker 1: hadn't memorized the number plate, but I didn't even think. 179 00:08:58,480 --> 00:09:00,840 Speaker 2: And then later that day I got home and clearly 180 00:09:00,920 --> 00:09:01,800 Speaker 2: mine was gone. 181 00:09:02,000 --> 00:09:06,520 Speaker 1: So I actually saw the guy flogging my riding away 182 00:09:06,520 --> 00:09:09,440 Speaker 1: on my bike, and I'm thinking what a gun is 183 00:09:09,440 --> 00:09:13,600 Speaker 1: because he's got a bike like me. 
And then but 184 00:09:13,640 --> 00:09:16,240 Speaker 1: the dumb thing that I did was, so you buy 185 00:09:16,240 --> 00:09:18,920 Speaker 1: it from, and of course most dealers are not like this, 186 00:09:19,040 --> 00:09:21,839 Speaker 1: but you buy it from the dealer and you pick 187 00:09:21,880 --> 00:09:24,120 Speaker 1: it up tomorrow. So they can cut a key to 188 00:09:24,160 --> 00:09:26,439 Speaker 1: the bike. They can cut a key to the padlock. 189 00:09:26,559 --> 00:09:29,880 Speaker 1: All the bits and pieces I bought for security, they 190 00:09:29,920 --> 00:09:32,720 Speaker 1: can replicate in the day between when you sign on 191 00:09:32,760 --> 00:09:35,880 Speaker 1: the dotted line and when you get it. So the guy 192 00:09:35,920 --> 00:09:39,760 Speaker 1: who stole it, at the very least worked for the dealership, 193 00:09:40,800 --> 00:09:43,400 Speaker 1: or worked for the dealership, or knew someone who did. 194 00:09:43,760 --> 00:09:45,800 Speaker 1: So he just came to my joint, which he also 195 00:09:45,920 --> 00:09:49,560 Speaker 1: had the address for because I signed all the paperwork, drove 196 00:09:49,640 --> 00:09:51,880 Speaker 1: to my joint, or got dropped off, had all the 197 00:09:51,960 --> 00:09:57,920 Speaker 1: keys, unlocked everything, just rode it away. And you're like, ah, 198 00:09:57,960 --> 00:10:01,840 Speaker 1: and then, which is understandable, but then I got interrogated 199 00:10:01,920 --> 00:10:05,679 Speaker 1: for about four hours by the fucking insurance company like 200 00:10:05,840 --> 00:10:07,679 Speaker 1: I was a criminal and I had done this. 201 00:10:08,760 --> 00:10:11,280 Speaker 2: So yeah, I sympathize with you. 202 00:10:12,480 --> 00:10:15,640 Speaker 3: Wow. Yeah, that's, that's pretty full on, isn't it. 203 00:10:15,840 --> 00:10:20,160 Speaker 2: Oh yeah, it's ridiculous. It's ridiculous. What are we, what 204 00:10:20,200 --> 00:10:23,439 Speaker 2: are we teching today? Is there anything that takes your 205 00:10:23,520 --> 00:10:24,360 Speaker 2: fancy, or do you... 206 00:10:24,400 --> 00:10:29,360 Speaker 3: Just want me to throw in a creepy story? And 207 00:10:29,920 --> 00:10:33,040 Speaker 3: this is one of the most bizarre things. It doesn't 208 00:10:33,080 --> 00:10:39,000 Speaker 3: surprise me in the slightest. Meta basically had patented, it's 209 00:10:39,040 --> 00:10:43,679 Speaker 3: come out that they patented an AI that would continue 210 00:10:43,760 --> 00:10:47,960 Speaker 3: to post for you when you die. 211 00:10:48,480 --> 00:10:50,680 Speaker 2: Whoa. Yeah, who wants that? 212 00:10:51,120 --> 00:10:54,960 Speaker 3: Yeah. So now, this happened in December, and they weren't 213 00:10:55,000 --> 00:10:57,640 Speaker 3: the first to do it. Microsoft actually had the same 214 00:10:57,679 --> 00:11:01,240 Speaker 3: idea but then shelved it, saying, yeah, probably a bit 215 00:11:01,360 --> 00:11:03,400 Speaker 3: creepy if we have a chatbot model. This was back 216 00:11:03,440 --> 00:11:07,719 Speaker 3: in twenty twenty one. But they're saying now, no, no, no, 217 00:11:07,840 --> 00:11:11,000 Speaker 3: we're not really going to use it. The idea had 218 00:11:11,040 --> 00:11:14,880 Speaker 3: come out of, so say, for example, Craigo on the 219 00:11:14,880 --> 00:11:19,640 Speaker 3: TYP channel on Facebook decides to go away on an 220 00:11:19,640 --> 00:11:23,319 Speaker 3: extended holiday.
You could effectively use this to post for 221 00:11:23,360 --> 00:11:26,160 Speaker 3: you while you're on a break, and I think part 222 00:11:26,200 --> 00:11:28,000 Speaker 3: of the idea. And then someone thought, okay, but what 223 00:11:28,040 --> 00:11:31,280 Speaker 3: if someone dies, it could keep posting for them, and 224 00:11:31,360 --> 00:11:34,480 Speaker 3: so, you know, that if you can, if you 225 00:11:34,480 --> 00:11:37,720 Speaker 3: could put all of your collected works into an AI, 226 00:11:38,360 --> 00:11:41,360 Speaker 3: it would really be able to mimic your voice. So 227 00:11:41,440 --> 00:11:44,040 Speaker 3: in a little while it could effectively, and who knows, 228 00:11:44,040 --> 00:11:47,280 Speaker 3: down the track with podcasting, it could, you know, it could 229 00:11:47,400 --> 00:11:48,679 Speaker 3: just be Tiff and I and an AI. 230 00:11:50,440 --> 00:11:51,720 Speaker 2: Well, very well could be. 231 00:11:51,920 --> 00:11:55,280 Speaker 1: I mean the quality of the kind of AI podcasts 232 00:11:55,320 --> 00:12:00,040 Speaker 1: that are using, you know, celebrities' voices, apart some 233 00:12:00,080 --> 00:12:04,520 Speaker 1: times from the cadence or the timing. But it's pretty indistinguishable. 234 00:12:04,559 --> 00:12:06,840 Speaker 1: I mean it is, but it's getting very close to 235 00:12:06,920 --> 00:12:11,160 Speaker 1: being, you wouldn't know. So I also see that some 236 00:12:11,360 --> 00:12:14,200 Speaker 1: artists and writers, and I thought this would be the case, 237 00:12:15,679 --> 00:12:20,400 Speaker 1: are quite hesitant to disclose when they're using AI or collaborating 238 00:12:20,440 --> 00:12:24,040 Speaker 1: with AI. What do you think the reason is for that? 239 00:12:24,360 --> 00:12:28,559 Speaker 1: Just embarrassment, or fear of losing legitimacy, or legitimacy? 240 00:12:28,800 --> 00:12:31,160 Speaker 3: There's a perception out there, if you're an artist and 241 00:12:31,200 --> 00:12:34,120 Speaker 3: you say, oh, yeah, by the way, I used AI 242 00:12:34,320 --> 00:12:37,079 Speaker 3: to help get some themes for my new book, then 243 00:12:37,720 --> 00:12:41,240 Speaker 3: the authenticity and the sense of creativity is diluted. 244 00:12:41,280 --> 00:12:42,920 Speaker 2: I think yes. 245 00:12:43,600 --> 00:12:46,720 Speaker 3: But interestingly, there was a little bit of a research study, 246 00:12:47,040 --> 00:12:48,880 Speaker 3: and what they did was they took two and a 247 00:12:48,920 --> 00:12:53,920 Speaker 3: half thousand creative professionals across four continents, and this 248 00:12:54,040 --> 00:12:56,160 Speaker 3: is a study done in twenty twenty four, and they 249 00:12:56,200 --> 00:12:59,679 Speaker 3: found that eighty five percent of them were using AI 250 00:12:59,720 --> 00:13:03,560 Speaker 3: in work. Now, if you use AI to help you 251 00:13:03,600 --> 00:13:07,040 Speaker 3: get organized in the day, to help manage your emails, 252 00:13:07,480 --> 00:13:10,920 Speaker 3: and then you jump into the creative process. Even then, 253 00:13:11,480 --> 00:13:13,800 Speaker 3: the fact that AI is helping you was still a 254 00:13:13,840 --> 00:13:17,480 Speaker 3: bit of a stigma for most people. The perception 255 00:13:17,800 --> 00:13:20,959 Speaker 3: was that it was almost like you're cheating by using AI.
256 00:13:21,400 --> 00:13:24,440 Speaker 3: I mean, I think I use it for concept work 257 00:13:24,559 --> 00:13:28,040 Speaker 3: sometimes if I'm stuck on an idea and then I think, well, yeah, 258 00:13:28,040 --> 00:13:31,320 Speaker 3: maybe I'll just jump into AI. Sometimes it's awful, it's terrible. 259 00:13:31,600 --> 00:13:34,040 Speaker 3: Other times you think, oh, yeah, that's not a bad idea. 260 00:13:34,440 --> 00:13:36,560 Speaker 3: I think that at the end of the day, it's 261 00:13:36,640 --> 00:13:39,800 Speaker 3: just like any tool. I'm sure you know people were 262 00:13:39,840 --> 00:13:43,880 Speaker 3: saying this when computers came out, and you know the 263 00:13:43,920 --> 00:13:47,000 Speaker 3: fact that we were using these devices or adding machines, 264 00:13:47,040 --> 00:13:52,320 Speaker 3: you know, calculators rather than an abacus. Yeah, and I 265 00:13:52,320 --> 00:13:54,800 Speaker 3: think it's just about global acceptance that, you know, as 266 00:13:54,800 --> 00:13:57,720 Speaker 3: it gets more and more integrated. I know that when 267 00:13:57,720 --> 00:14:01,000 Speaker 3: I take a photograph and I send people photographs, now I 268 00:14:01,120 --> 00:14:04,360 Speaker 3: have an option to press a button, and it's Nano 269 00:14:04,480 --> 00:14:08,640 Speaker 3: Banana on my phone, and wow, I click the Nano Banana button, 270 00:14:08,960 --> 00:14:12,440 Speaker 3: I can change that photo before I send it. So 271 00:14:12,480 --> 00:14:14,680 Speaker 3: I was sitting talking to some friends. We were in 272 00:14:14,720 --> 00:14:17,920 Speaker 3: a walking group. We go walking together every Thursday morning. 273 00:14:17,960 --> 00:14:20,040 Speaker 3: We've been doing it for about fifteen or sixteen years, 274 00:14:20,360 --> 00:14:22,240 Speaker 3: and then we have a coffee afterwards. And I was 275 00:14:22,280 --> 00:14:24,400 Speaker 3: explaining this to a friend of mine. She was talking 276 00:14:24,480 --> 00:14:27,440 Speaker 3: about using AI, and so I took a photo of 277 00:14:27,480 --> 00:14:29,840 Speaker 3: the group of people sitting at the table, and I 278 00:14:30,000 --> 00:14:32,760 Speaker 3: just, in the prompt, I said, turn this into a 279 00:14:32,840 --> 00:14:37,440 Speaker 3: Christmas scene. Suddenly there was a Christmas tree in the background, 280 00:14:37,880 --> 00:14:41,680 Speaker 3: baubles and decorations on the table, and you know, the 281 00:14:41,720 --> 00:14:44,320 Speaker 3: same people were there. One person was wearing a little Christmas 282 00:14:44,320 --> 00:14:48,400 Speaker 3: hat, and it literally changed the, you know, what 283 00:14:48,640 --> 00:14:51,920 Speaker 3: was there. But it makes me wonder too. The reality 284 00:14:52,000 --> 00:14:56,200 Speaker 3: was so clear, it looked so real. I almost turned 285 00:14:56,200 --> 00:14:58,240 Speaker 3: it around to the other people and said, hey, remember 286 00:14:58,360 --> 00:15:00,360 Speaker 3: this? It's funny, we were sitting in the same positions 287 00:15:00,400 --> 00:15:03,800 Speaker 3: this time last year in December, you know, just to 288 00:15:03,840 --> 00:15:06,240 Speaker 3: see if I could fool anybody, because not everybody was 289 00:15:06,360 --> 00:15:09,000 Speaker 3: listening to the conversation. There were people in their own conversation, 290 00:15:09,040 --> 00:15:10,600 Speaker 3: you know, when you sit around with a group of people.
291 00:15:11,320 --> 00:15:15,040 Speaker 3: But the reality, well, the reality is there is no reality. 292 00:15:15,960 --> 00:15:19,160 Speaker 3: You know, are we being deceived? Well, yeah, maybe we are. 293 00:15:19,320 --> 00:15:21,800 Speaker 3: And credibility, where does that sit? I mean, you use 294 00:15:22,000 --> 00:15:23,160 Speaker 3: AI all the time. 295 00:15:23,000 --> 00:15:25,560 Speaker 2: Craigo. I use it for different things. 296 00:15:25,560 --> 00:15:29,160 Speaker 1: But it's interesting you say this because, like, all of 297 00:15:29,280 --> 00:15:31,640 Speaker 1: the stuff that I write on Instagram, if you don't 298 00:15:31,680 --> 00:15:34,080 Speaker 1: follow me on Instagram, folks, that's fine. But if you 299 00:15:34,120 --> 00:15:37,360 Speaker 1: want to, it's just Craig Anthony Harper. But like, I'll put 300 00:15:37,440 --> 00:15:40,240 Speaker 1: up some things, like I'm just looking at one post 301 00:15:40,320 --> 00:15:40,640 Speaker 1: right now. 302 00:15:40,680 --> 00:15:42,360 Speaker 2: I wanted to read you this for a reason. It 303 00:15:42,440 --> 00:15:45,400 Speaker 2: just says, it's handwritten on a whiteboard by me, one 304 00:15:45,480 --> 00:15:47,640 Speaker 2: hundred percent original. It's not brilliant. 305 00:15:47,680 --> 00:15:49,880 Speaker 1: But of course, life's a shit show and then it's awesome, 306 00:15:49,920 --> 00:15:52,440 Speaker 1: and then you're sad and then happy, and then it's lunchtime. 307 00:15:52,800 --> 00:15:54,960 Speaker 1: Your challenge is not to create a perfect life, but 308 00:15:55,080 --> 00:15:58,760 Speaker 1: rather to thrive in the inevitable imperfection of it all. Now, 309 00:15:58,960 --> 00:16:03,080 Speaker 1: I've posted that post probably fifteen times over the last 310 00:16:03,120 --> 00:16:07,320 Speaker 1: few years, and the worst it ever gets is a 311 00:16:07,360 --> 00:16:09,680 Speaker 1: million views. One time it got two and a half 312 00:16:09,760 --> 00:16:13,840 Speaker 1: million views. Right, so I thought, I'm going to take that. 313 00:16:14,040 --> 00:16:16,720 Speaker 1: I took that and a few other higher rating ones, 314 00:16:16,720 --> 00:16:19,200 Speaker 1: and I wrote, so these are posts that I put 315 00:16:19,200 --> 00:16:22,760 Speaker 1: on Instagram, one hundred percent original, written with my hand 316 00:16:22,840 --> 00:16:25,720 Speaker 1: on a whiteboard, taken a photo of, tidied it up 317 00:16:25,760 --> 00:16:27,720 Speaker 1: a bit, you know, made it a bit clearer and cleaner. 318 00:16:29,160 --> 00:16:31,920 Speaker 2: I want you to come up with a post. 319 00:16:31,480 --> 00:16:34,560 Speaker 1: Similar to this, or similar to these, written in my style, 320 00:16:34,560 --> 00:16:39,160 Speaker 1: a bit of swearing, whatever, but original. They were fucking terrible, 321 00:16:40,560 --> 00:16:44,680 Speaker 1: like terrible, like, not like, oh my god, you couldn't, 322 00:16:45,040 --> 00:16:47,240 Speaker 1: you know. But if you want to go, hey, tell 323 00:16:47,280 --> 00:16:50,400 Speaker 1: me a funny story about how the pyramids were built, well, 324 00:16:50,440 --> 00:16:53,960 Speaker 1: it can crush that, right? Or you want to give 325 00:16:54,000 --> 00:16:57,680 Speaker 1: me your thoughts on, like yesterday, I'll talk for a 326 00:16:57,720 --> 00:16:59,160 Speaker 1: minute or two and then I'll shut up. But this 327 00:16:59,280 --> 00:17:03,280 Speaker 1: is very relevant to you.
328 00:17:03,360 --> 00:17:08,639 Speaker 2: Have you opened the door at all on AI agents, 329 00:17:08,680 --> 00:17:09,879 Speaker 2: like agentic AI? 330 00:17:10,520 --> 00:17:12,200 Speaker 3: Yeah, a little bit. Just looked into it. 331 00:17:12,080 --> 00:17:14,399 Speaker 1: Okay, so I just want to read this because this 332 00:17:14,560 --> 00:17:18,800 Speaker 1: blew me away. So agentic AI refers to artificial intelligence 333 00:17:18,880 --> 00:17:23,240 Speaker 1: systems that don't just respond to prompts. They take initiative, 334 00:17:23,640 --> 00:17:28,800 Speaker 1: set goals, make decisions, and act autonomously to achieve an objective. 335 00:17:29,440 --> 00:17:35,840 Speaker 1: In simple terms, traditional AI answers questions, agentic AI pursues outcomes. 336 00:17:35,920 --> 00:17:40,159 Speaker 1: And this is happening now. The core idea. Most current 337 00:17:40,200 --> 00:17:44,280 Speaker 1: AI tools, including chat assistants, are reactive. You ask, it answers. 338 00:17:44,680 --> 00:17:47,920 Speaker 1: Agentic AI is goal oriented and self directed. 339 00:17:48,320 --> 00:17:51,400 Speaker 2: You give it an outcome that you want. 340 00:17:51,359 --> 00:17:54,159 Speaker 1: It figures out the steps, it executes them, and it 341 00:17:54,200 --> 00:17:59,359 Speaker 1: adapts if and when needed. Key features of agentic AI: 342 00:17:59,520 --> 00:18:03,800 Speaker 1: goal oriented, works towards defined objectives. Autonomous 343 00:18:03,800 --> 00:18:09,240 Speaker 1: decision making, chooses actions without human input. Planning ability, breaks 344 00:18:09,320 --> 00:18:11,960 Speaker 1: large goals into smaller, blah blah, uh. I'm like, this is 345 00:18:12,040 --> 00:18:14,679 Speaker 1: just a person. This is a person. And I was 346 00:18:14,760 --> 00:18:18,840 Speaker 1: listening to a guy yesterday who's the boss of Uber. 347 00:18:19,280 --> 00:18:25,440 Speaker 1: His name is Dara Khosrowshahi or something. But anyway, when 348 00:18:25,520 --> 00:18:29,080 Speaker 1: he took over Uber, they were losing three billion a 349 00:18:29,160 --> 00:18:33,760 Speaker 1: year in twenty seventeen. They are now making ten billion 350 00:18:33,800 --> 00:18:38,120 Speaker 1: a year. And he's, they're implementing all of these kind 351 00:18:38,119 --> 00:18:44,400 Speaker 1: of, you know, these AI kind of agents to take 352 00:18:44,440 --> 00:18:47,119 Speaker 1: positions, and they're employing other people. They're still one of 353 00:18:47,160 --> 00:18:50,200 Speaker 1: the biggest employers in the world, but it's the shit 354 00:18:50,280 --> 00:18:55,800 Speaker 1: that's happening. It's almost like you can't keep up. And yeah, 355 00:18:55,880 --> 00:18:58,840 Speaker 1: because Melissa started telling me about, she wants to build 356 00:18:58,880 --> 00:19:01,760 Speaker 1: this team of agents, and I'm like, hang on, are you 357 00:19:01,760 --> 00:19:02,440 Speaker 1: going to sack Tiff? 358 00:19:02,480 --> 00:19:03,520 Speaker 2: And I'm like, what's going on? 359 00:19:04,359 --> 00:19:10,080 Speaker 3: So there's a really lovely sense of authenticity to you 360 00:19:10,160 --> 00:19:13,159 Speaker 3: writing your posts on a whiteboard and taking a photo 361 00:19:13,400 --> 00:19:17,760 Speaker 3: and putting it up there.
Last year I did a little, 362 00:19:18,480 --> 00:19:22,040 Speaker 3: a little kind of project that I started for myself, 363 00:19:22,040 --> 00:19:24,800 Speaker 3: and I had a whiteboard marker in my bathroom and 364 00:19:24,800 --> 00:19:27,120 Speaker 3: on the mirror. Every night before I went to bed, 365 00:19:27,160 --> 00:19:29,720 Speaker 3: after I brushed my teeth, I'd write a haiku. 366 00:19:30,320 --> 00:19:33,440 Speaker 3: So that's a Japanese poem. It's five seven five 367 00:19:33,520 --> 00:19:36,720 Speaker 3: syllables, so one line of five syllables, one line of seven, 368 00:19:36,800 --> 00:19:39,679 Speaker 3: one line of five. And it was really interesting pushing 369 00:19:39,720 --> 00:19:41,960 Speaker 3: yourself to be creative. You know, at the end of 370 00:19:41,960 --> 00:19:44,359 Speaker 3: the day, I'm always a bit tired, but I really 371 00:19:44,520 --> 00:19:47,920 Speaker 3: set myself a goal to write a haiku every single day, 372 00:19:48,359 --> 00:19:50,919 Speaker 3: and it was such a fun challenge. And I started 373 00:19:50,920 --> 00:19:53,520 Speaker 3: taking photos of them, which was kind of fun. You know, 374 00:19:53,560 --> 00:19:54,919 Speaker 3: I'd have to go back and look for them, and 375 00:19:54,960 --> 00:19:56,560 Speaker 3: I don't think they were earth shattering, and I don't 376 00:19:56,560 --> 00:19:58,960 Speaker 3: think I'm going to publish a book of haiku. But 377 00:20:00,040 --> 00:20:03,480 Speaker 3: there was a study recently, psychologists now saying that 378 00:20:03,520 --> 00:20:07,520 Speaker 3: people who write shopping lists, physically writing a shopping list 379 00:20:07,640 --> 00:20:10,399 Speaker 3: rather than typing it into your phone, instead of that, 380 00:20:11,560 --> 00:20:15,720 Speaker 3: they're actually using cognitive processing, and that actually strengthens 381 00:20:15,760 --> 00:20:21,200 Speaker 3: your memory. It's also intention and presence, and that doesn't 382 00:20:21,200 --> 00:20:24,520 Speaker 3: happen when you type it into a phone. So physically writing, 383 00:20:24,840 --> 00:20:29,199 Speaker 3: forming the words, helps with the intention and the intent 384 00:20:29,280 --> 00:20:31,840 Speaker 3: and memorizing what it is on that list. So I 385 00:20:31,840 --> 00:20:35,119 Speaker 3: thought that's interesting, and the fact that doing that, and 386 00:20:35,160 --> 00:20:37,119 Speaker 3: I know you've laughed at me many times about the 387 00:20:37,160 --> 00:20:40,080 Speaker 3: fact that I like to use a fountain pen. But 388 00:20:40,640 --> 00:20:43,280 Speaker 3: I love writing still and I adore writing, and I 389 00:20:43,280 --> 00:20:46,960 Speaker 3: still like using technology. But it's interesting, isn't it. 390 00:20:47,119 --> 00:20:48,560 Speaker 2: I'm with you. I think it is. 391 00:20:48,800 --> 00:20:50,920 Speaker 1: Like, there's a lot of melodrama at the moment 392 00:20:50,960 --> 00:20:54,119 Speaker 1: around, around apps and around the use of AI, and 393 00:20:54,119 --> 00:20:58,400 Speaker 1: around the use of social media broadly, but I think 394 00:20:58,400 --> 00:21:01,160 Speaker 1: that every individual needs to be... I had this chat 395 00:21:01,200 --> 00:21:08,080 Speaker 1: with Gillespie and Tiff about, you know, just like, really, 396 00:21:08,119 --> 00:21:11,280 Speaker 1: what is happening with these brains that are getting hijacked 397 00:21:11,320 --> 00:21:17,800 Speaker 1: by this medium?
And so literally, you know, it takes your attention. 398 00:21:17,920 --> 00:21:20,840 Speaker 1: You're not choosing what to focus on, it's choosing for you. 399 00:21:21,240 --> 00:21:23,639 Speaker 1: And so now you are just a passenger. You're not 400 00:21:23,720 --> 00:21:27,800 Speaker 1: driving the bus, like it has your attention, not the 401 00:21:27,840 --> 00:21:32,640 Speaker 1: other way around. So it's like, I thought about it, 402 00:21:32,760 --> 00:21:36,080 Speaker 1: and definitely I've got an issue, definitely, and I have 403 00:21:36,160 --> 00:21:39,600 Speaker 1: to change it, because my brain is my most 404 00:21:39,600 --> 00:21:42,520 Speaker 1: important tool, and anything that's going to kind of 405 00:21:43,040 --> 00:21:46,960 Speaker 1: impact my ability to think clearly, focus, create, resolve problems, 406 00:21:46,960 --> 00:21:49,520 Speaker 1: you know, like use my brain well and keep my 407 00:21:49,600 --> 00:21:52,959 Speaker 1: brain healthy. To me, it's almost like I'm doing all 408 00:21:53,000 --> 00:21:55,880 Speaker 1: this training then coming home and eating doughnuts and pizza. 409 00:21:56,680 --> 00:21:59,000 Speaker 1: That's the, that's the analogy for me. Like, I do 410 00:21:59,080 --> 00:22:01,639 Speaker 1: all of this work with my brain and then I 411 00:22:01,920 --> 00:22:06,600 Speaker 1: just fucking unplug my brain and let this thing hijack 412 00:22:06,760 --> 00:22:10,680 Speaker 1: my prefrontal cortex. So I'm offline and it's online, 413 00:22:10,800 --> 00:22:12,159 Speaker 1: and I fucking hate it. 414 00:22:12,200 --> 00:22:13,000 Speaker 2: I've got to stop it. 415 00:22:13,680 --> 00:22:15,719 Speaker 3: I got a really interesting email during the week from 416 00:22:15,760 --> 00:22:19,200 Speaker 3: one of our listeners. Dale is his name, Dale Patterson, 417 00:22:19,240 --> 00:22:21,800 Speaker 3: and it was a blog article that he sent through 418 00:22:21,840 --> 00:22:25,679 Speaker 3: to me about AI slop. And it's a guy by 419 00:22:25,720 --> 00:22:29,000 Speaker 3: the name of Gurwinder, whose handle is Gurwinder, and it's 420 00:22:29,119 --> 00:22:32,240 Speaker 3: a blog called The Prism. And what he did was 421 00:22:32,280 --> 00:22:35,040 Speaker 3: he took a whole lot of phrases, and he just 422 00:22:35,240 --> 00:22:37,679 Speaker 3: kind of paraphrased each of the key terms. So one 423 00:22:37,720 --> 00:22:41,440 Speaker 3: of them was the one percent rule. So in online communities, 424 00:22:42,200 --> 00:22:47,199 Speaker 3: one percent of users produce almost all of the content. Yeah, 425 00:22:47,280 --> 00:22:50,520 Speaker 3: the ninety nine percent consume, the one percent generates the content. 426 00:22:50,640 --> 00:22:54,280 Speaker 3: And then he talked about slopaganda. More online articles are 427 00:22:54,280 --> 00:22:57,960 Speaker 3: now written by AI than humans, so if you look 428 00:22:57,960 --> 00:23:00,679 Speaker 3: at an article, there's a good chance that it's going 429 00:23:00,760 --> 00:23:05,200 Speaker 3: to be an AI written article, not a human one. Moloch's bargain. 430 00:23:05,680 --> 00:23:10,119 Speaker 3: This is when large language models, so AIs, compete for 431 00:23:10,240 --> 00:23:14,040 Speaker 3: votes on social media, so they push lies and rage 432 00:23:14,119 --> 00:23:17,160 Speaker 3: bait to actually win.
So it's not about it's trying 433 00:23:17,200 --> 00:23:19,880 Speaker 3: to get that engagement and to get the most posts 434 00:23:19,920 --> 00:23:23,280 Speaker 3: possible or the most likes possible, So that doesn't matter. 435 00:23:23,320 --> 00:23:26,520 Speaker 3: There's no filter to say this is actually really crap. 436 00:23:26,800 --> 00:23:30,760 Speaker 3: It's just about getting that algorithm right. But this is 437 00:23:30,800 --> 00:23:32,159 Speaker 3: the one that made me think of what you were 438 00:23:32,200 --> 00:23:34,960 Speaker 3: just talking about before and switching off. I love the 439 00:23:35,000 --> 00:23:40,400 Speaker 3: paradox of boredom. We walk around with the most powerful 440 00:23:40,840 --> 00:23:45,159 Speaker 3: supercomputers that you can imagine in our pocket, and the 441 00:23:45,200 --> 00:23:48,119 Speaker 3: reality of that means the second that you get bored, 442 00:23:48,119 --> 00:23:51,000 Speaker 3: and I do this all the time. I swipe, I 443 00:23:51,119 --> 00:23:52,840 Speaker 3: jump on, I look at it, and for me it's 444 00:23:52,880 --> 00:23:55,359 Speaker 3: reading articles or whatever it happens to be looking at 445 00:23:55,440 --> 00:23:58,600 Speaker 3: YouTube reels. But the reality of it is, we don't 446 00:23:58,600 --> 00:24:01,680 Speaker 3: get bored anymore because we've always got something there to 447 00:24:01,760 --> 00:24:06,760 Speaker 3: keep us entertained. And boredom is what fuels creativity, not 448 00:24:06,880 --> 00:24:09,960 Speaker 3: having technology. So because our brains are constantly looking for 449 00:24:10,000 --> 00:24:13,119 Speaker 3: that excitement, it's there all the time. But if we 450 00:24:13,160 --> 00:24:16,320 Speaker 3: take that away, if we say to ourselves, Okay, I'm 451 00:24:16,320 --> 00:24:18,600 Speaker 3: going to go cold turkey on this, I'm going to 452 00:24:18,600 --> 00:24:21,120 Speaker 3: go Could you imagine going away for a weekend and 453 00:24:21,160 --> 00:24:26,720 Speaker 3: not taking your phone? Yeah, I've beach or going like 454 00:24:26,880 --> 00:24:27,359 Speaker 3: or a walk. 455 00:24:27,480 --> 00:24:31,840 Speaker 1: Yeah, but that, if I'm being honest, would that scare 456 00:24:31,880 --> 00:24:34,480 Speaker 1: me a little bit? It would, which is probably why 457 00:24:34,520 --> 00:24:38,280 Speaker 1: the fuck I should do it. I think we bullshit ourselves. 458 00:24:38,320 --> 00:24:41,680 Speaker 1: I think most people that I know, I don't think 459 00:24:41,720 --> 00:24:44,240 Speaker 1: you're one. I'm not Tiff would have to decide for myself, 460 00:24:44,280 --> 00:24:48,520 Speaker 1: but most people that I know, including me, actually have 461 00:24:48,600 --> 00:24:51,520 Speaker 1: a little bit of a problem. I'm not saying it's 462 00:24:51,600 --> 00:24:54,840 Speaker 1: life destroying, but I'm like, I definitely have something that 463 00:24:54,880 --> 00:24:57,640 Speaker 1: I need to put on my big boy pants and go, Yeah, 464 00:24:57,760 --> 00:25:00,520 Speaker 1: I waste time and I do the thing that I 465 00:25:00,600 --> 00:25:03,960 Speaker 1: tell people not to do, and I don't even I 466 00:25:04,080 --> 00:25:04,959 Speaker 1: kind of choose it. 
467 00:25:05,520 --> 00:25:07,760 Speaker 2: But you know, the story is always I'm just going 468 00:25:07,800 --> 00:25:11,000 Speaker 2: to see what's happening with this, And then forty minutes later, 469 00:25:11,040 --> 00:25:13,520 Speaker 2: I'm like, I don't even know what started me. But 470 00:25:13,680 --> 00:25:16,720 Speaker 2: here I am forty minutes later, and I've just wasted 471 00:25:16,760 --> 00:25:20,760 Speaker 2: forty minutes of energy and fucking time productivity. You know. 472 00:25:20,880 --> 00:25:23,639 Speaker 3: So I don't know what I should bring up this 473 00:25:23,720 --> 00:25:25,960 Speaker 3: next topic because I know that you're going to ridicule me. 474 00:25:26,200 --> 00:25:30,359 Speaker 3: And I was watching a documentary recently on a new 475 00:25:30,480 --> 00:25:34,800 Speaker 3: term in the community online called gooning. Do you know 476 00:25:34,840 --> 00:25:37,159 Speaker 3: what gooning is? Do either of you know what gooning is? 477 00:25:39,960 --> 00:25:41,760 Speaker 2: I do not carry on. 478 00:25:42,160 --> 00:25:44,359 Speaker 3: I feel like the person, why why do you do. 479 00:25:44,320 --> 00:25:46,520 Speaker 1: This if you know it's going to not produce a 480 00:25:46,520 --> 00:25:50,959 Speaker 1: good response. What's your agenda? What's your reason? 481 00:25:51,080 --> 00:25:53,920 Speaker 3: I have no agenda. I'm trying to educate the masses 482 00:25:54,000 --> 00:25:55,840 Speaker 3: to masses. 483 00:25:56,520 --> 00:25:59,840 Speaker 2: Can look at you, Jesus, trying to educate the masses. 484 00:26:00,160 --> 00:26:04,080 Speaker 3: Two masses debating? No, okay, because okay, so gooning is 485 00:26:04,119 --> 00:26:10,320 Speaker 3: a term that's used to prolong the sexual build up. 486 00:26:11,280 --> 00:26:14,760 Speaker 1: Wow, right, so well there's only one of us, well 487 00:26:14,880 --> 00:26:16,800 Speaker 1: one two of us on this goal. 488 00:26:16,920 --> 00:26:20,520 Speaker 3: For whom that is relevant, well yeah, anyway, it could 489 00:26:20,520 --> 00:26:20,840 Speaker 3: be one. 490 00:26:21,200 --> 00:26:23,240 Speaker 1: But I did the same thing with food. I just 491 00:26:23,320 --> 00:26:25,720 Speaker 1: walk around it for half an hour and think about it. 492 00:26:26,040 --> 00:26:29,359 Speaker 3: But the frightening thing about this documentary that I saw 493 00:26:29,600 --> 00:26:34,000 Speaker 3: is that people can engage in this by just looking 494 00:26:34,000 --> 00:26:37,120 Speaker 3: at pornography. They have multiple screens, projectors, and all they 495 00:26:37,119 --> 00:26:39,880 Speaker 3: do is they spend a whole lot of time. We're 496 00:26:39,880 --> 00:26:43,840 Speaker 3: talking hours, eight hours, nine hours, ten hours just looking 497 00:26:43,880 --> 00:26:47,479 Speaker 3: at pornography. And there's a whole online communities that are 498 00:26:47,520 --> 00:26:49,720 Speaker 3: now dedicated to this. So once upon a time, it 499 00:26:49,800 --> 00:26:52,320 Speaker 3: was kind of embarrassing, Oh well you know, I kind 500 00:26:52,320 --> 00:26:55,159 Speaker 3: of that person did whatever. But now it's become a 501 00:26:55,200 --> 00:26:59,439 Speaker 3: whole thing. But talk about throwing your life away. I 502 00:26:59,480 --> 00:27:04,480 Speaker 3: mean I couldn't imagine spending a whole day like yeah anyway. 
503 00:27:04,600 --> 00:27:08,840 Speaker 1: But I find that sad because that one hundred percent 504 00:27:08,960 --> 00:27:12,640 Speaker 1: nobody's doing that who's not an addict. Yeah, nobody's doing 505 00:27:12,640 --> 00:27:15,960 Speaker 1: that for any logical, healthy reason. Yeah, I mean all 506 00:27:16,000 --> 00:27:18,560 Speaker 1: of those people are addicted. Could be anything else. But 507 00:27:18,640 --> 00:27:21,879 Speaker 1: it's like, yeah, I don't know that's stuff. 508 00:27:21,920 --> 00:27:23,680 Speaker 2: It's like yeah, and. 509 00:27:23,640 --> 00:27:26,720 Speaker 3: Then well, artificial intimacy. And I'm kind of jumping off 510 00:27:26,720 --> 00:27:28,879 Speaker 3: that topic. But on that same list that I was 511 00:27:28,880 --> 00:27:32,960 Speaker 3: talking about the blog, you know, I had a young 512 00:27:33,000 --> 00:27:36,159 Speaker 3: guy who was sharing a long story anyway, lived me 513 00:27:36,200 --> 00:27:39,000 Speaker 3: be for three years, got kicked out of home. He 514 00:27:39,040 --> 00:27:41,399 Speaker 3: and his girlfriend were sitting on the couch once and 515 00:27:41,440 --> 00:27:44,600 Speaker 3: they were they were comparing how many friends they had 516 00:27:44,920 --> 00:27:48,480 Speaker 3: on their Facebook, on their facebooks, and I thought, they're 517 00:27:48,520 --> 00:27:53,679 Speaker 3: not your friends. You know, this artificial intimacy. You know, 518 00:27:54,600 --> 00:27:57,560 Speaker 3: no one on your Facebook list, if you've got a 519 00:27:57,600 --> 00:28:01,320 Speaker 3: thousand Facebook friends, No, they're not your friends. 520 00:28:01,520 --> 00:28:04,800 Speaker 2: But do you know what is interesting? One, you're correct. 521 00:28:05,000 --> 00:28:09,680 Speaker 1: Two from a psychological, sociological, and experiential. 522 00:28:09,000 --> 00:28:11,240 Speaker 2: Perspective, this is what I think. 523 00:28:11,320 --> 00:28:15,520 Speaker 1: Let's say somebody's got an AI friend and the net 524 00:28:15,560 --> 00:28:19,840 Speaker 1: result of their AI friend is positive. I'm not saying 525 00:28:19,880 --> 00:28:22,840 Speaker 1: that's always a case, of course it isn't. But somebody 526 00:28:22,880 --> 00:28:26,840 Speaker 1: who lives in rural wherever they live ten miles from 527 00:28:26,880 --> 00:28:29,439 Speaker 1: the next house, they're a bit disconnected. Maybe they've got 528 00:28:29,480 --> 00:28:32,520 Speaker 1: a few personal challenges, but every day they get up 529 00:28:32,560 --> 00:28:37,000 Speaker 1: and they talk to this friend and for them they 530 00:28:37,000 --> 00:28:41,200 Speaker 1: feel better, they feel you know, and obviously it's not 531 00:28:41,240 --> 00:28:45,440 Speaker 1: a real human and obviously the emotions. 532 00:28:45,600 --> 00:28:47,920 Speaker 2: The other way anyway are not real. But for the. 533 00:28:47,880 --> 00:28:51,280 Speaker 1: Person, those emotions and feelings might be real, which means 534 00:28:51,320 --> 00:28:55,560 Speaker 1: the net result the experience is good, which is not 535 00:28:55,640 --> 00:28:57,680 Speaker 1: saying I think everyone should be doing that. But I 536 00:28:57,800 --> 00:29:04,440 Speaker 1: think this area of the impact of relationships with different 537 00:29:04,560 --> 00:29:09,520 Speaker 1: forms of technology, I think that's going to become more 538 00:29:09,560 --> 00:29:11,160 Speaker 1: and more of a thing because I think. 
539 00:29:11,000 --> 00:29:12,680 Speaker 2: People are going to, more and more people are 540 00:29:12,560 --> 00:29:14,600 Speaker 1: going to have, for want of a more accurate term, 541 00:29:15,080 --> 00:29:17,880 Speaker 1: relationships with virtual entities. 542 00:29:18,640 --> 00:29:22,360 Speaker 3: I want one. I can't wait. I cannot wait. And 543 00:29:22,440 --> 00:29:24,360 Speaker 3: I think we talked about it on the last podcast. 544 00:29:24,400 --> 00:29:27,160 Speaker 3: I want an AI assistant. You know, in the Iron 545 00:29:27,200 --> 00:29:29,080 Speaker 3: Man films. That's different. 546 00:29:29,120 --> 00:29:32,520 Speaker 1: An AI assistant is not an emotional relationship. 547 00:29:32,720 --> 00:29:35,360 Speaker 2: But if we're talking about intimacy, but. 548 00:29:35,400 --> 00:29:37,720 Speaker 3: No, I'm thinking, like in your case, you've got Melissa, 549 00:29:37,880 --> 00:29:40,760 Speaker 3: I'm thinking that, where you've got someone you can talk to, 550 00:29:41,120 --> 00:29:43,400 Speaker 3: you can kind of say, well, what's happening today, what's up? 551 00:29:43,440 --> 00:29:46,520 Speaker 3: You know, have a full on conversation. I know what 552 00:29:46,520 --> 00:29:49,239 Speaker 3: you're saying. But I think that eventually, the more you 553 00:29:49,320 --> 00:29:52,760 Speaker 3: rely on that. I mean, in the, I'm talking about 554 00:29:52,800 --> 00:29:55,240 Speaker 3: a movie here, but we all felt kind of, you know, 555 00:29:55,560 --> 00:29:59,920 Speaker 3: warm about Jarvis, the AI that Tony Stark uses. I thought, 556 00:30:00,080 --> 00:30:03,320 Speaker 3: I want him to be my friend too. So I think, 557 00:30:03,560 --> 00:30:05,400 Speaker 3: even though we know it's an AI, even though we 558 00:30:05,440 --> 00:30:07,680 Speaker 3: know it's a chat bot or whatever, that it would 559 00:30:07,720 --> 00:30:10,600 Speaker 3: be impossible not to form some sort of connection, some 560 00:30:10,640 --> 00:30:14,400 Speaker 3: sort of fondness for that entity that is with you 561 00:30:14,560 --> 00:30:17,560 Speaker 3: every day, if you wake up chatting, and what am 562 00:30:17,560 --> 00:30:19,920 Speaker 3: I doing today? How can I better do this? Because 563 00:30:19,960 --> 00:30:23,239 Speaker 3: we start to rely on it, and that crutch that 564 00:30:23,280 --> 00:30:26,800 Speaker 3: we start to use, it can be beneficial. It'll be fantastic. 565 00:30:26,840 --> 00:30:28,760 Speaker 3: Don't forget to do this. Don't forget to do that. 566 00:30:28,920 --> 00:30:31,920 Speaker 3: You know, you know we've got this coming up. I 567 00:30:32,000 --> 00:30:35,600 Speaker 3: think that we would eventually form a relationship of sorts 568 00:30:35,640 --> 00:30:37,560 Speaker 3: with whatever AI chatbot we're using. 569 00:30:38,440 --> 00:30:40,200 Speaker 1: And put up your hand if you've ever been in 570 00:30:40,240 --> 00:30:44,760 Speaker 1: an unhealthy human relationship. Everyone in the fucking world, you know, yes, 571 00:30:45,640 --> 00:30:46,280 Speaker 1: Tiff's ready. 572 00:30:46,520 --> 00:30:46,720 Speaker 3: Yeah.
573 00:30:46,800 --> 00:30:50,080 Speaker 5: I keep thinking about the impact, the long term impact, of 574 00:30:50,560 --> 00:30:53,560 Speaker 5: just another area in our life where we are essentially 575 00:30:53,600 --> 00:30:57,400 Speaker 5: avoiding discomfort, because I think about the conversations I've had 576 00:30:57,480 --> 00:31:02,320 Speaker 5: on friendships and relationships, where rupture and repair is what 577 00:31:02,440 --> 00:31:08,600 Speaker 5: creates depth and intimacy. And yet then somebody chooses this 578 00:31:09,160 --> 00:31:11,960 Speaker 5: AI version of friendship which gives them all the feel goods. 579 00:31:11,960 --> 00:31:13,680 Speaker 4: But it's another area that, 580 00:31:13,640 --> 00:31:16,440 Speaker 5: like every other area of life where we 581 00:31:16,920 --> 00:31:19,239 Speaker 5: don't have to get uncomfortable, in future then we 582 00:31:19,280 --> 00:31:22,600 Speaker 5: avoid it more and more, and what little connection we 583 00:31:22,840 --> 00:31:25,400 Speaker 5: already had with real people, we'll have less of because 584 00:31:25,440 --> 00:31:28,160 Speaker 5: it's too hard. I'm not willing to get uncomfortable. I'm 585 00:31:28,160 --> 00:31:30,920 Speaker 5: not willing to express needs or show up for the 586 00:31:30,960 --> 00:31:34,680 Speaker 5: other person, because in this relationship with AI, it's all 587 00:31:34,720 --> 00:31:35,120 Speaker 5: about me. 588 00:31:35,560 --> 00:31:38,320 Speaker 3: It's one sided. Yeah, that's so interesting, Tiff, I actually 589 00:31:38,320 --> 00:31:40,920 Speaker 3: really like that point. I've often talked, you know, I've 590 00:31:40,960 --> 00:31:45,640 Speaker 3: got really close friends that have very differing political views, 591 00:31:45,720 --> 00:31:49,560 Speaker 3: I mean extremist political views. You know, they think veganism 592 00:31:49,680 --> 00:31:53,440 Speaker 3: is a wank. They like Donald Trump, you know they... Yeah, 593 00:31:54,080 --> 00:31:58,920 Speaker 3: it's quite funny, and, and, but I like them a lot. 594 00:31:59,520 --> 00:32:04,480 Speaker 3: They're lovely people. They're great friends. And if I 595 00:32:04,560 --> 00:32:06,760 Speaker 3: had an AI chatbot, it would just agree with everything 596 00:32:06,760 --> 00:32:08,600 Speaker 3: that I said. I love what you just said, Tiff, 597 00:32:08,920 --> 00:32:11,280 Speaker 3: it's so true, and I love the challenges. I love 598 00:32:11,320 --> 00:32:13,440 Speaker 3: the diversity of people in my life. I like people 599 00:32:13,480 --> 00:32:16,520 Speaker 3: to argue with me, tell me I'm wrong, you know, 600 00:32:17,240 --> 00:32:21,400 Speaker 3: nag me for leaving my wallet in the car. But 601 00:32:21,480 --> 00:32:23,320 Speaker 3: the reality of it is, who wants a whole 602 00:32:23,280 --> 00:32:25,320 Speaker 3: lot of yes men around you, just kind of 603 00:32:25,360 --> 00:32:27,440 Speaker 3: telling you that you're great all the time? 604 00:32:27,920 --> 00:32:29,640 Speaker 5: How good is it when you've got a friend and 605 00:32:29,680 --> 00:32:31,640 Speaker 5: they're having a hard time and you get to show 606 00:32:31,760 --> 00:32:32,200 Speaker 5: up for them? 607 00:32:32,360 --> 00:32:33,760 Speaker 4: Well, AI is not going to do that. 608 00:32:34,200 --> 00:32:39,640 Speaker 2: Yeah, yeah, absolutely, you make valid points, I think. 609 00:32:41,320 --> 00:32:42,960 Speaker 1: I hear what you're both saying.
I think it's a 610 00:32:42,960 --> 00:32:46,720 Speaker 1: little bit situation- and context-dependent, you know. I think 611 00:32:46,760 --> 00:32:48,720 Speaker 1: a lot of people don't have anyone that gives a 612 00:32:48,760 --> 00:32:51,440 Speaker 1: fuck about them, like really, really, really loves them and 613 00:32:51,480 --> 00:32:53,840 Speaker 1: really cares. And there are a lot of people who 614 00:32:53,840 --> 00:32:58,080 Speaker 1: are deeply, deeply sad and disconnected and alone. And I 615 00:32:58,200 --> 00:33:01,440 Speaker 1: don't for any minute think that AI is the answer 616 00:33:01,560 --> 00:33:04,800 Speaker 1: to that, just like I don't think one person is 617 00:33:04,840 --> 00:33:06,440 Speaker 1: the answer. 618 00:33:06,200 --> 00:33:07,600 Speaker 2: For any... do you know what I mean? I think 619 00:33:07,640 --> 00:33:09,760 Speaker 2: the answer is us trying to figure out ourselves and 620 00:33:10,200 --> 00:33:11,280 Speaker 2: to understand our. 621 00:33:11,200 --> 00:33:13,680 Speaker 1: Own mind and our own emotions and our own behaviors 622 00:33:13,720 --> 00:33:16,920 Speaker 1: and our own subjective experience of life, and to work 623 00:33:16,960 --> 00:33:20,360 Speaker 1: through that. But I think it's like, for some 624 00:33:20,480 --> 00:33:24,520 Speaker 1: people it's going to be a positive addition to their life, 625 00:33:24,560 --> 00:33:28,600 Speaker 1: and for some people it's going to lead to some unhealthy, toxic, devastating, 626 00:33:29,000 --> 00:33:33,600 Speaker 1: bloody relationship or pseudo-relationship. But just like, you know, 627 00:33:33,640 --> 00:33:35,480 Speaker 1: you've got someone in your life now, and from the 628 00:33:35,480 --> 00:33:41,160 Speaker 1: outside looking in, it seems to me overwhelmingly positive, 629 00:33:42,160 --> 00:33:45,200 Speaker 1: you know, but then you've also had other humans in 630 00:33:45,240 --> 00:33:48,040 Speaker 1: your life where that was the opposite, and the same 631 00:33:48,080 --> 00:33:53,880 Speaker 1: with me. It's like, I don't think... because, you know, 632 00:33:54,200 --> 00:33:56,720 Speaker 1: we build this connection, rapport and trust and love, which 633 00:33:56,760 --> 00:34:00,320 Speaker 1: is great, but as we know, that doesn't always 634 00:34:00,360 --> 00:34:03,600 Speaker 1: work out over the long term. But you're right 635 00:34:03,640 --> 00:34:07,480 Speaker 1: in that, you know, being able to have hard relationships 636 00:34:07,720 --> 00:34:09,799 Speaker 1: and to be reflective, and to be able to kind 637 00:34:09,840 --> 00:34:12,279 Speaker 1: of figure out what's going on with me, really, and 638 00:34:12,280 --> 00:34:14,960 Speaker 1: what's going on with me in relation to other people. 639 00:34:15,760 --> 00:34:19,359 Speaker 1: But I think nonetheless that, moving forward, there will be, 640 00:34:20,239 --> 00:34:24,040 Speaker 1: you know, in a decade, the prevalence of these kinds 641 00:34:24,080 --> 00:34:29,160 Speaker 1: of relationships I think is going to be high. 642 00:34:29,280 --> 00:34:29,720 Speaker 2: Patrick. 643 00:34:29,840 --> 00:34:33,000 Speaker 1: Let's digress a little bit, because I saw this on 644 00:34:33,360 --> 00:34:37,239 Speaker 1: telly or somewhere, I saw this about this dude who 645 00:34:37,320 --> 00:34:43,759 Speaker 1: was trying to hack his robot.
He's a programmer, and 646 00:34:43,800 --> 00:34:47,080 Speaker 1: he hacked his bloody vacuum, his robot vacuum, because he 647 00:34:47,120 --> 00:34:50,400 Speaker 1: wanted to use it with his game console. And he 648 00:34:50,480 --> 00:34:56,319 Speaker 1: ended up hacking six or seven hundred vacuums around America, and 649 00:34:56,400 --> 00:34:59,320 Speaker 1: he could see into all these, like, he literally 650 00:34:59,400 --> 00:35:02,600 Speaker 1: had video into all of these homes, in real time, of. 651 00:35:02,600 --> 00:35:07,360 Speaker 2: What people were doing. That's... wasn't it? Yeah, that's fucking terrifying. 652 00:35:07,760 --> 00:35:12,320 Speaker 3: So DJI is the company that makes the amazing drones 653 00:35:12,480 --> 00:35:15,200 Speaker 3: that are very popular around the world. I think I've 654 00:35:15,320 --> 00:35:19,680 Speaker 3: exclusively had DJI drones. I still fly them. But they 655 00:35:19,719 --> 00:35:25,600 Speaker 3: released the DJI Romo robot vacuum cleaner recently. And so 656 00:35:25,719 --> 00:35:27,800 Speaker 3: this guy, who was a bit of a security expert, 657 00:35:27,880 --> 00:35:30,839 Speaker 3: decided he wanted to use his joystick controllers and, 658 00:35:30,960 --> 00:35:32,720 Speaker 3: you know, his game controller, I think it was the Xbox 659 00:35:32,760 --> 00:35:35,840 Speaker 3: controller, to control the vacuum. He thought it would be a 660 00:35:35,840 --> 00:35:39,160 Speaker 3: bit of fun, and then realized, getting into the base code, 661 00:35:39,239 --> 00:35:42,359 Speaker 3: that because they talk back to the base station and 662 00:35:42,600 --> 00:35:45,440 Speaker 3: they learn, because the idea of these robot vacuums is 663 00:35:45,480 --> 00:35:47,520 Speaker 3: they look at the world around them. They map the 664 00:35:47,600 --> 00:35:49,920 Speaker 3: area and then they use that to be able to 665 00:35:50,080 --> 00:35:54,120 Speaker 3: effectively clean better. But yeah, this is amazing. So he 666 00:35:54,120 --> 00:35:56,560 Speaker 3: didn't need a security pin or anything because he already 667 00:35:56,560 --> 00:35:59,680 Speaker 3: had an account. He was able to log in and 668 00:35:59,680 --> 00:36:02,680 Speaker 3: get access to other vacuum cleaners. And the scary thing 669 00:36:02,800 --> 00:36:05,160 Speaker 3: is that they have light sensors that can sense the 670 00:36:05,200 --> 00:36:08,520 Speaker 3: objects around them. They make maps of the house, and 671 00:36:08,560 --> 00:36:11,880 Speaker 3: they also have cameras. It's a frightening thought that someone 672 00:36:11,920 --> 00:36:15,040 Speaker 3: could then literally use your own device against you, to 673 00:36:15,400 --> 00:36:17,879 Speaker 3: spy on you, to map out your house, to look 674 00:36:17,880 --> 00:36:18,640 Speaker 3: at what you've got. 675 00:36:18,680 --> 00:36:21,680 Speaker 2: And it's crazy, as I understand it, correct me if 676 00:36:21,680 --> 00:36:23,840 Speaker 2: I'm wrong, but yeah, that one is fucking terrifying. 677 00:36:23,880 --> 00:36:26,440 Speaker 1: But the dude who did that obviously did it accidentally.
678 00:36:26,520 --> 00:36:28,600 Speaker 1: Then he went to whoever he needed to go to 679 00:36:28,640 --> 00:36:34,200 Speaker 1: and went, hey, guess what, I did this accidentally, and 680 00:36:34,320 --> 00:36:36,360 Speaker 1: kind of put up his hands so that he wouldn't 681 00:36:36,360 --> 00:36:40,080 Speaker 1: be charged with any kind of... I'm sure that's 682 00:36:40,120 --> 00:36:44,360 Speaker 1: a crime if he was doing it intentionally. But so yeah, 683 00:36:43,320 --> 00:36:48,080 Speaker 1: I think he just demonstrated to them that their system 684 00:36:48,200 --> 00:36:48,680 Speaker 1: or their. 685 00:36:48,520 --> 00:36:50,840 Speaker 2: Tech was relatively hackable and flawed. 686 00:36:51,360 --> 00:36:53,799 Speaker 3: Yeah, there are a lot of, what they call, 687 00:36:53,800 --> 00:36:56,800 Speaker 3: white hat hackers out there that actually support the industry. 688 00:36:57,000 --> 00:37:00,239 Speaker 3: They look for vulnerabilities. Google has a competition that it 689 00:37:00,320 --> 00:37:03,520 Speaker 3: runs to try to get people to find vulnerabilities in 690 00:37:03,600 --> 00:37:05,919 Speaker 3: its systems, and then it rewards them. 691 00:37:06,000 --> 00:37:08,160 Speaker 3: So yeah, there are a lot of really good programmers 692 00:37:08,200 --> 00:37:11,200 Speaker 3: out there who are altruistic and in some cases get 693 00:37:11,280 --> 00:37:14,160 Speaker 3: rewarded for that. But I thought this was interesting. And 694 00:37:14,200 --> 00:37:16,200 Speaker 3: the other thing as well, I didn't realize that robot 695 00:37:16,280 --> 00:37:19,520 Speaker 3: vacuum cleaners have microphones in them as well, the ultimate 696 00:37:19,600 --> 00:37:21,439 Speaker 3: spy device, in a way. 697 00:37:21,840 --> 00:37:22,880 Speaker 2: Why would they need that? 698 00:37:23,440 --> 00:37:26,200 Speaker 3: I don't know. I'm not sure. Yeah, I don't know. 699 00:37:26,360 --> 00:37:28,480 Speaker 3: Maybe you can talk to them, Craigo. It's like, hey, 700 00:37:28,520 --> 00:37:32,960 Speaker 3: little fella, come over here. It's pretty lonely at my joint. 701 00:37:35,520 --> 00:37:41,200 Speaker 2: So they're launching a timber, yeah, a wooden satellite? Surely 702 00:37:41,680 --> 00:37:45,560 Speaker 2: that seems inconsistent with going into space, like, shit burning up. 703 00:37:46,000 --> 00:37:48,560 Speaker 3: No, no, quite the opposite. In fact... and shit 704 00:37:48,560 --> 00:37:51,000 Speaker 3: burning up is exactly right, so you're kind of right 705 00:37:51,040 --> 00:37:54,279 Speaker 3: and wrong. There was a bit of work done on 706 00:37:54,320 --> 00:37:56,919 Speaker 3: the International Space Station, and I think we chatted about 707 00:37:56,920 --> 00:38:00,360 Speaker 3: this maybe two years ago, where they were testing different 708 00:38:00,440 --> 00:38:03,759 Speaker 3: sorts of wood to see what their resilience was like 709 00:38:03,960 --> 00:38:08,160 Speaker 3: in space. Space debris and the chemicals that are released 710 00:38:08,160 --> 00:38:10,920 Speaker 3: into the atmosphere when a lot of satellites burn up 711 00:38:10,960 --> 00:38:13,520 Speaker 3: in the atmosphere, aluminium, that sort of stuff, are actually 712 00:38:13,600 --> 00:38:17,799 Speaker 3: quite toxic chemicals.
And so a Japanese company has been 713 00:38:17,840 --> 00:38:21,040 Speaker 3: doing a lot of research into the concept of encasing 714 00:38:21,280 --> 00:38:25,680 Speaker 3: the satellite in wood, and it looks like the 715 00:38:26,239 --> 00:38:29,719 Speaker 3: LignoSat is the first wooden satellite that's going to 716 00:38:29,719 --> 00:38:33,319 Speaker 3: be sent into space. And the reason that it's going 717 00:38:33,400 --> 00:38:36,279 Speaker 3: to be great is because when it eventually expires, past its 718 00:38:36,400 --> 00:38:39,200 Speaker 3: use-by date, when it burns up in the atmosphere, 719 00:38:39,280 --> 00:38:41,839 Speaker 3: it's just wood, and it's not going to release... I mean, 720 00:38:41,840 --> 00:38:44,759 Speaker 3: it's still going to have, obviously, electronics inside it, you know, 721 00:38:44,760 --> 00:38:48,480 Speaker 3: they're not carving transistors out of wood, like hand-carving transistors. But 722 00:38:48,719 --> 00:38:51,520 Speaker 3: the bulk of the satellite, the exterior part, and it's 723 00:38:51,560 --> 00:38:53,839 Speaker 3: quite durable. The interesting thing that came out of this 724 00:38:53,920 --> 00:38:57,719 Speaker 3: research on the International Space Station is when you take 725 00:38:57,840 --> 00:39:00,799 Speaker 3: wood into space, there's no bacteria and there's no 726 00:39:00,920 --> 00:39:04,880 Speaker 3: moisture, the things that normally will cause wood to rot, 727 00:39:05,120 --> 00:39:07,399 Speaker 3: so, you know, when you think 728 00:39:07,440 --> 00:39:11,239 Speaker 3: about it, it's the perfect material to go out into 729 00:39:11,239 --> 00:39:14,279 Speaker 3: space, because it's going to be fairly resilient. And 730 00:39:14,360 --> 00:39:17,279 Speaker 3: that's what came up during this research study. And now 731 00:39:17,560 --> 00:39:19,560 Speaker 3: the practical example is it looks like they're going to 732 00:39:19,600 --> 00:39:22,359 Speaker 3: fire off this, not fire off, they're going to put 733 00:39:22,360 --> 00:39:25,040 Speaker 3: it on board a launch vehicle, a launch rocket, and 734 00:39:26,000 --> 00:39:27,799 Speaker 3: wooden satellites would be great. 735 00:39:28,080 --> 00:39:31,040 Speaker 2: Based on your no bacteria and no... what was the 736 00:39:31,040 --> 00:39:33,080 Speaker 2: other thing you said, like. 737 00:39:33,120 --> 00:39:35,719 Speaker 3: Because it's, it's cold, so the wood 738 00:39:35,760 --> 00:39:37,600 Speaker 3: won't be rotting, because bacteria is in water. 739 00:39:37,640 --> 00:39:39,760 Speaker 1: I guess. No, no, you said in space, 740 00:39:39,800 --> 00:39:43,439 Speaker 1: there's, like, what causes the wood to rot normally 741 00:39:43,440 --> 00:39:44,080 Speaker 1: isn't up there. 742 00:39:44,520 --> 00:39:46,920 Speaker 3: Yeah, that's right, so bacteria and water. 743 00:39:48,040 --> 00:39:50,480 Speaker 1: So I'm wondering, if old mate was in space at 744 00:39:50,480 --> 00:39:53,040 Speaker 1: the space station and someone died and you just pushed 745 00:39:53,080 --> 00:39:55,880 Speaker 1: him out the door, would he just be held in 746 00:39:56,000 --> 00:39:57,760 Speaker 1: state perpetually, kind of. 747 00:39:57,640 --> 00:40:01,800 Speaker 2: In-space embalmment, and then you could. 748 00:40:01,640 --> 00:40:04,400 Speaker 1: Go grab him a couple of decades later, when we 749 00:40:04,520 --> 00:40:06,920 Speaker 1: figured out how to kind of restart the bit.
750 00:40:06,840 --> 00:40:09,640 Speaker 2: That broke, and just pull him back on board, and 751 00:40:09,640 --> 00:40:12,160 Speaker 2: he's like, fuck, that was a good sleep. Yeah, you've 752 00:40:12,160 --> 00:40:15,000 Speaker 2: been sleeping for twenty five years, bro. 753 00:40:15,160 --> 00:40:17,200 Speaker 3: The ultimate cryogenics. Yeah, that's right. 754 00:40:18,239 --> 00:40:21,959 Speaker 1: Hey, one battery swap, we're talking about cars. I don't 755 00:40:21,960 --> 00:40:23,560 Speaker 1: even know how you would do that? It must be 756 00:40:23,600 --> 00:40:28,200 Speaker 1: done by a machine. A battery slap, I mean swap. Obviously, 757 00:40:28,200 --> 00:40:32,000 Speaker 1: we're talking about EVs, in less than zero point five seconds. 758 00:40:32,239 --> 00:40:37,920 Speaker 1: Nio sets new record for four consecutive days. So that 759 00:40:38,080 --> 00:40:42,799 Speaker 1: is battery life in EVs, and the technology around that 760 00:40:43,040 --> 00:40:44,560 Speaker 1: is changing pretty quickly. 761 00:40:45,640 --> 00:40:48,440 Speaker 3: So yeah, there's two aspects to that. So obviously, the 762 00:40:48,800 --> 00:40:52,120 Speaker 3: duration it takes to charge a battery has been a 763 00:40:52,160 --> 00:40:54,320 Speaker 3: bit of a pushback in the market, because people don't 764 00:40:54,400 --> 00:40:57,600 Speaker 3: want to wait all that time. So in China, Nio 765 00:40:58,440 --> 00:41:01,040 Speaker 3: has, I think they've done over one 766 00:41:01,120 --> 00:41:03,799 Speaker 3: hundred and seventy five thousand battery swaps since they were 767 00:41:03,840 --> 00:41:07,799 Speaker 3: set up, and that's an amazing amount. So they basically 768 00:41:07,880 --> 00:41:10,480 Speaker 3: have a concept whereby if you buy this car, there 769 00:41:10,480 --> 00:41:12,719 Speaker 3: are all these service stations around China that you 770 00:41:12,719 --> 00:41:14,719 Speaker 3: can go to where you do a swap and go, 771 00:41:15,040 --> 00:41:17,240 Speaker 3: so you don't charge the battery. They have a stock 772 00:41:17,320 --> 00:41:20,319 Speaker 3: supply of ready-charged batteries and they swap and go, 773 00:41:20,960 --> 00:41:26,520 Speaker 3: and the swap stations are now getting more batteries, the 774 00:41:26,520 --> 00:41:28,880 Speaker 3: technology is getting faster, because it used to take a while. 775 00:41:28,960 --> 00:41:31,880 Speaker 3: It used to take five minutes, four to five minutes, 776 00:41:31,920 --> 00:41:34,160 Speaker 3: for the battery to be taken out and then swapped, 777 00:41:34,360 --> 00:41:36,040 Speaker 3: but now it's a lot more efficient. They've got a 778 00:41:36,080 --> 00:41:37,839 Speaker 3: lot better with it. And I know we spoke about 779 00:41:37,880 --> 00:41:41,399 Speaker 3: this with the motorcycle manufacturers, the top, I think top 780 00:41:41,440 --> 00:41:44,919 Speaker 3: three motorcycle manufacturers, but electric motorcycles just haven't taken 781 00:41:44,960 --> 00:41:49,440 Speaker 3: off very much. The e-bikes have, and they're ridiculously fast, 782 00:41:49,480 --> 00:41:52,800 Speaker 3: and legislation's trying to catch up with that.
The latest 783 00:41:52,800 --> 00:41:54,680 Speaker 3: on that is, and I know I'm digressing, but the 784 00:41:54,760 --> 00:41:57,239 Speaker 3: latest on the e-bikes thing is that they want 785 00:41:57,280 --> 00:42:00,239 Speaker 3: to actually classify them as motorbikes and make people 786 00:42:00,280 --> 00:42:02,960 Speaker 3: have motorbike licenses to be. 787 00:42:02,920 --> 00:42:05,640 Speaker 1: Able to ride them? Well, they are, essentially, a lot of them. 788 00:42:05,840 --> 00:42:11,040 Speaker 1: But I think the thing that non-motorbike people don't 789 00:42:11,080 --> 00:42:15,080 Speaker 1: really get about motorbikes, and Tiff can speak to this, 790 00:42:15,280 --> 00:42:18,600 Speaker 1: is who the fuck wants to ride something that's silent? 791 00:42:19,440 --> 00:42:23,680 Speaker 1: Like, for me, it's about, it's about the visceral, the vibration, 792 00:42:23,880 --> 00:42:27,839 Speaker 1: it's about the noise, it's about the throttle response, it's 793 00:42:27,840 --> 00:42:31,399 Speaker 1: about, like... and also one of the reasons people love 794 00:42:31,440 --> 00:42:35,839 Speaker 1: EVs is because they're much faster than fuel cars, or 795 00:42:36,480 --> 00:42:40,319 Speaker 1: internal combustion engines, right, but that's not the case with motorbikes. 796 00:42:40,560 --> 00:42:46,040 Speaker 1: Like, my motorbikes are faster than you need. So yeah, 797 00:42:46,080 --> 00:42:48,560 Speaker 1: but nonetheless, I think they will take off, and they 798 00:42:48,600 --> 00:42:50,279 Speaker 1: are building, but I think it's going to be a 799 00:42:50,320 --> 00:42:56,920 Speaker 1: slower uptake, because most motorcyclists ride them because they love motorbikes, 800 00:42:56,920 --> 00:42:59,000 Speaker 1: whereas a lot of people drive because they've got to 801 00:42:59,040 --> 00:42:59,719 Speaker 1: get from A to B. 802 00:43:01,680 --> 00:43:04,960 Speaker 3: Yeah, that's a good point. I was fascinated to see 803 00:43:05,080 --> 00:43:09,160 Speaker 3: that earlier this month, that company Nio with the swapover batteries, 804 00:43:09,360 --> 00:43:13,640 Speaker 3: they did the one hundred millionth battery swap, one hundred 805 00:43:13,880 --> 00:43:17,960 Speaker 3: million battery swaps. That's crazy. So is it? 806 00:43:18,200 --> 00:43:20,360 Speaker 2: I don't know if you know. But so, like, you 807 00:43:20,400 --> 00:43:22,080 Speaker 2: go to a servo, you put in your petrol, you 808 00:43:22,080 --> 00:43:24,560 Speaker 2: pay your eighty bucks or whatever you pay. So with 809 00:43:24,640 --> 00:43:27,080 Speaker 2: this, you go and you get your battery swapped over, 810 00:43:27,160 --> 00:43:31,080 Speaker 2: and there's a set, like, obviously that's not for zero, 811 00:43:31,360 --> 00:43:33,520 Speaker 2: so they must have to go and pay a set 812 00:43:33,520 --> 00:43:35,560 Speaker 2: fee every time that happens. 813 00:43:35,719 --> 00:43:38,480 Speaker 3: Yeah, I don't know. I'm not entirely sure how 814 00:43:38,520 --> 00:43:40,719 Speaker 3: it works, what the charge would be. Yeah, 815 00:43:40,719 --> 00:43:43,239 Speaker 3: but for the convenience, that'd be awesome, wouldn't it? 816 00:43:43,760 --> 00:43:44,000 Speaker 2: Yeah? 817 00:43:44,200 --> 00:43:46,520 Speaker 1: Tiff, would you drive an EV, as in a car? 818 00:43:46,680 --> 00:43:49,160 Speaker 1: Have you thought about, I mean, your cars?
819 00:43:49,719 --> 00:43:51,520 Speaker 2: It's not that old at all, but, what is your 820 00:43:51,520 --> 00:43:54,320 Speaker 2: car, four or five years? 821 00:43:53,840 --> 00:43:59,600 Speaker 5: One's a twenty eighteen, so. 822 00:43:58,040 --> 00:44:00,840 Speaker 2: That's eight-ish years old. Your next car? What 823 00:44:00,920 --> 00:44:04,120 Speaker 2: are the, what are the percentage chances of your next 824 00:44:04,120 --> 00:44:04,959 Speaker 2: car being an EV? 825 00:44:05,640 --> 00:44:07,279 Speaker 4: I mean, I'm not that far... 826 00:44:07,360 --> 00:44:09,319 Speaker 5: I'm not a real car person, so I don't like 827 00:44:09,360 --> 00:44:12,200 Speaker 5: the driving experience. I just find them... when they 828 00:44:12,760 --> 00:44:14,400 Speaker 5: take off in front of me or I hear them, 829 00:44:14,440 --> 00:44:17,080 Speaker 5: they still freak me out. I find them weird, especially 830 00:44:17,719 --> 00:44:21,360 Speaker 5: EV buses. Like, yeah. 831 00:44:22,080 --> 00:44:25,880 Speaker 3: Buses, there's not that churning, polluted diesel smell. 832 00:44:26,280 --> 00:44:28,280 Speaker 3: You know, if you're sitting behind one on a bike, that'd 833 00:44:28,280 --> 00:44:31,200 Speaker 3: be awesome, sitting behind an electric one. 834 00:44:31,640 --> 00:44:33,840 Speaker 2: Very valid, very valid point, Patrick. 835 00:44:33,920 --> 00:44:38,000 Speaker 1: I have nearly died of carbon fucking monoxide poisoning a 836 00:44:38,080 --> 00:44:42,319 Speaker 1: thousand times, sitting behind fucking buses and trucks that are 837 00:44:42,360 --> 00:44:47,560 Speaker 1: just spewing out toxic shit into my face. That's not 838 00:44:47,600 --> 00:44:51,000 Speaker 1: the attractive part of it, being on the open road, 839 00:44:51,360 --> 00:44:52,000 Speaker 1: the fresh air. 840 00:44:53,120 --> 00:44:55,880 Speaker 3: The interesting point to that is that I think with 841 00:44:56,040 --> 00:44:59,760 Speaker 3: the EV market, my wish for that would be totally 842 00:45:00,000 --> 00:45:03,440 Speaker 3: autonomous EVs. So I'm not talking about going for the 843 00:45:03,480 --> 00:45:07,640 Speaker 3: weekend drive. I'm thinking the commute, because if 844 00:45:07,680 --> 00:45:11,799 Speaker 3: you put AI in charge of EVs on the road 845 00:45:11,920 --> 00:45:15,600 Speaker 3: during the commute, during the peak hour, I guarantee you 846 00:45:15,640 --> 00:45:17,880 Speaker 3: peak hour won't be as bad as what it is now. 847 00:45:18,320 --> 00:45:21,880 Speaker 3: Take the huge kind of human error out of it, 848 00:45:22,360 --> 00:45:24,080 Speaker 3: and that's where I see it would be really good, 849 00:45:24,120 --> 00:45:27,000 Speaker 3: for the daily commute. So people could be working in 850 00:45:27,040 --> 00:45:29,400 Speaker 3: their cars, listening to an audiobook, not having to worry 851 00:45:29,400 --> 00:45:32,520 Speaker 3: about that, no accidents on the roads, or less accidents 852 00:45:32,560 --> 00:45:34,840 Speaker 3: on the road. So I think that's where the combination 853 00:45:34,960 --> 00:45:38,160 Speaker 3: of EVs and AI together would be great. But 854 00:45:38,480 --> 00:45:41,880 Speaker 3: when I had my car windows smashed last week, I 855 00:45:41,960 --> 00:45:45,160 Speaker 3: got a courtesy car through my insurance company, and it 856 00:45:45,239 --> 00:45:49,080 Speaker 3: was the Toyota Corolla Hybrid.
And I've got to tell 857 00:46:49,120 --> 00:46:52,799 Speaker 3: you, four point eight liters per one hundred kilometers, so 858 00:46:53,080 --> 00:46:55,640 Speaker 3: it ran off the smell of an oily rag. But 859 00:46:56,000 --> 00:46:59,200 Speaker 3: because, well, Toyota have been doing hybrids 860 00:46:59,239 --> 00:47:03,640 Speaker 3: for a long time, there is a battery button where 861 00:47:03,680 --> 00:47:06,359 Speaker 3: if you stay below forty kilometers an hour, I could 862 00:47:06,440 --> 00:47:10,479 Speaker 3: drive to the supermarket Nana style and not use any 863 00:47:10,520 --> 00:47:13,319 Speaker 3: petrol at all. And I was playing with the battery a 864 00:47:13,320 --> 00:47:17,280 Speaker 3: lot when I was driving the car, sticking under forty. 865 00:47:18,480 --> 00:47:22,759 Speaker 1: Well, they were pioneers with the Prius, which was, I 866 00:47:22,800 --> 00:47:25,080 Speaker 1: don't know, thirty five years ago, and that was one 867 00:47:25,160 --> 00:47:27,719 Speaker 1: hundred percent EV back then, and we're all like, that's 868 00:47:27,840 --> 00:47:30,040 Speaker 1: fucking, that's never going to take off. 869 00:47:31,960 --> 00:47:34,799 Speaker 2: Yeah, yeah, yeah, I was going to say to you, 870 00:47:35,000 --> 00:47:37,680 Speaker 2: it's something on the tip of my tongue. Oh, that's right. 871 00:47:37,719 --> 00:47:42,120 Speaker 1: That guy Dara from Uber, they were talking about. 872 00:47:42,600 --> 00:47:45,800 Speaker 2: This was on Diary of a CEO, Stephen Bartlett. 873 00:47:46,640 --> 00:47:52,880 Speaker 1: They were talking about the reality that autonomous vehicles, AVs, 874 00:47:52,920 --> 00:47:56,160 Speaker 1: are going to overtake, or going to play a very 875 00:47:56,160 --> 00:47:58,680 Speaker 1: big role in, the Uber space and the. 876 00:47:58,760 --> 00:47:59,960 Speaker 2: Like, moving forward. 877 00:48:01,320 --> 00:48:04,080 Speaker 1: And he said, like, they're talking about what are the 878 00:48:04,120 --> 00:48:07,480 Speaker 1: potential hurdles with people getting in a car with no driver, 879 00:48:07,680 --> 00:48:12,200 Speaker 1: and it's all really psychological and emotional, and I would 880 00:48:12,280 --> 00:48:14,480 Speaker 1: have to look at the research, but what he said 881 00:48:15,320 --> 00:48:20,160 Speaker 1: is that it's about ten times safer. So autonomous vehicles 882 00:48:20,200 --> 00:48:23,640 Speaker 1: are about ten times safer than human-driven vehicles, 883 00:48:23,640 --> 00:48:25,680 Speaker 1: and they've got lots and lots and lots of data. 884 00:48:25,880 --> 00:48:29,200 Speaker 1: It's just that the idea still freaks people out. So 885 00:48:29,800 --> 00:48:33,239 Speaker 1: I would gladly get in an autonomous vehicle. I would gladly. 886 00:48:34,480 --> 00:48:37,120 Speaker 2: I could think of nothing better than just sitting back, 887 00:48:37,320 --> 00:48:41,560 Speaker 2: just doing whatever, or having a snooze. You love driving, though, 888 00:48:41,560 --> 00:48:41,880 Speaker 2: don't you? 889 00:48:42,800 --> 00:48:44,719 Speaker 3: I enjoy driving, and I've got to tell you, I 890 00:48:44,800 --> 00:48:47,080 Speaker 3: was devastated when the car got broken into, because I 891 00:48:47,080 --> 00:48:50,880 Speaker 3: love my car 'cause I'd never had a new car before.
892 00:47:50,880 --> 00:47:53,960 Speaker 3: I'd always bought secondhand cars, and the previous car I 893 00:47:54,000 --> 00:47:56,960 Speaker 3: had was probably one of the ugliest cars on the 894 00:47:57,000 --> 00:48:00,799 Speaker 3: market, the Renault Scenic Megane, I know. 895 00:48:00,800 --> 00:48:02,960 Speaker 2: You've had a couple of ugly cars. That fucking thing, 896 00:48:03,200 --> 00:48:06,960 Speaker 2: that burgundy one, that was an ugly motherfucker. 897 00:48:07,520 --> 00:48:11,080 Speaker 3: Oh my god, you and my mother, Jesus. Like I said, 898 00:48:11,440 --> 00:48:12,600 Speaker 3: I said, motherfucker. 899 00:48:12,680 --> 00:48:13,440 Speaker 2: Let's be clear. 900 00:48:14,080 --> 00:48:17,120 Speaker 3: That's a good looking car, then. The Nissan NXR Coupe is 901 00:48:17,160 --> 00:48:19,840 Speaker 3: a bloody good looking car, I've got to tell you. Anyway, 902 00:48:20,160 --> 00:48:22,439 Speaker 3: aside from the insides, because it even had a targa roof. 903 00:48:22,480 --> 00:48:24,720 Speaker 3: You could take the roof section out. Patrick, can we stop 904 00:48:24,760 --> 00:48:26,799 Speaker 3: talking about your car? I think we spoke about it 905 00:48:26,800 --> 00:48:30,600 Speaker 3: five times this podcast. Could you get onto some tech news? Okay, 906 00:48:31,040 --> 00:48:34,680 Speaker 3: my latest app obsession. Can I tell you about my 907 00:48:34,760 --> 00:48:35,440 Speaker 3: latest app? 908 00:48:35,560 --> 00:48:38,080 Speaker 1: Yeah, for God's sake, just don't talk about your fucking 909 00:48:38,160 --> 00:48:39,160 Speaker 1: broken window again. 910 00:48:39,360 --> 00:48:42,239 Speaker 3: I never used it in my car. Okay, you're not 911 00:48:42,280 --> 00:48:47,200 Speaker 3: allowed to use your apps on your phone. So 912 00:48:47,400 --> 00:48:50,360 Speaker 3: I found, I discovered this app recently called Yuka, Y 913 00:48:50,520 --> 00:48:55,440 Speaker 3: U K A. And what it is, it's a free app, 914 00:48:55,600 --> 00:48:58,480 Speaker 3: doesn't have ads, but you do have a subscription tier. 915 00:48:58,800 --> 00:49:01,120 Speaker 3: But you can go to the supermarket and scan 916 00:49:01,360 --> 00:49:04,880 Speaker 3: the barcode of all the products in the supermarket, and 917 00:49:04,960 --> 00:49:06,880 Speaker 3: it tells you whether it's good, and it gives you 918 00:49:06,920 --> 00:49:09,200 Speaker 3: a breakdown, and it says, okay, this is good for 919 00:49:09,280 --> 00:49:12,279 Speaker 3: this reason, this reason, this reason, but it's bad for 920 00:49:12,320 --> 00:49:14,920 Speaker 3: that reason. It might be too caloric, so 921 00:49:14,960 --> 00:49:17,000 Speaker 3: it might have too many calories, or it might have 922 00:49:17,800 --> 00:49:20,880 Speaker 3: preservatives in it that are known to be harmful. It 923 00:49:20,920 --> 00:49:23,920 Speaker 3: has changed the way I shop for stuff. You know, 924 00:49:24,200 --> 00:49:26,880 Speaker 3: I use coconut cream in a lot of my pasta 925 00:49:26,960 --> 00:49:30,640 Speaker 3: sauces because I don't use real cream. But there were 926 00:49:30,760 --> 00:49:36,000 Speaker 3: nine different creams and coconut milks, and only one of 927 00:49:36,040 --> 00:49:38,640 Speaker 3: them rated good. The rest of them were rated bad. 928 00:49:39,120 --> 00:49:41,799 Speaker 3: So the app's fantastic.
And the other thing that I 929 00:49:41,800 --> 00:49:43,920 Speaker 3: can do is, you can set filters. So 930 00:49:43,960 --> 00:49:47,279 Speaker 3: if you were, for example, gluten intolerant, you could 931 00:49:47,280 --> 00:49:50,680 Speaker 3: set a gluten intolerant feature, and then when you scan stuff, 932 00:49:50,680 --> 00:49:53,880 Speaker 3: without having to read the barcode or read the contents, 933 00:49:54,080 --> 00:49:57,080 Speaker 3: it'll instantly tell you whether it's safe for you to have, 934 00:49:57,239 --> 00:49:59,200 Speaker 3: or if you've got a nut allergy, or in my case, 935 00:49:59,239 --> 00:50:01,200 Speaker 3: I set it for vegan, because I can't read bloody 936 00:50:01,280 --> 00:50:04,640 Speaker 3: labels anymore. I need a magnifying glass, or I take 937 00:50:04,640 --> 00:50:07,480 Speaker 3: a photo of the label and zoom in, because it 938 00:50:07,560 --> 00:50:10,839 Speaker 3: takes forever to work out what all those numbers mean 939 00:50:10,920 --> 00:50:13,400 Speaker 3: and whether they contain an animal product, or if it's 940 00:50:13,480 --> 00:50:16,680 Speaker 3: nuts or whatever. So it's, yeah, it's my absolute obsession. 941 00:50:16,680 --> 00:50:18,640 Speaker 3: And the more I tell people about it, the more 942 00:50:18,640 --> 00:50:20,520 Speaker 3: people say, oh yeah, I got it as well. It 943 00:50:20,640 --> 00:50:22,920 Speaker 3: sounds like it's got... 944 00:50:23,440 --> 00:50:25,759 Speaker 2: What's the problem with that, do you think? What's the 945 00:50:25,760 --> 00:50:28,560 Speaker 2: potential problem with that app? Because I think you and 946 00:50:28,600 --> 00:50:30,120 Speaker 2: I might be thinking the same thing. 947 00:50:30,239 --> 00:50:33,279 Speaker 5: I just opened Chatters, and: when is the Yuka app 948 00:50:33,280 --> 00:50:34,759 Speaker 5: good, and what's its philosophy? 949 00:50:34,800 --> 00:50:35,680 Speaker 4: With what is good? 950 00:50:36,080 --> 00:50:36,200 Speaker 5: In? 951 00:50:37,080 --> 00:50:39,360 Speaker 4: So I think you have to understand what. 952 00:50:39,280 --> 00:50:43,759 Speaker 5: You know about food and additives, and like a lot 953 00:50:43,760 --> 00:50:46,360 Speaker 5: of the stuff David Gillespie talks about, about seed oils 954 00:50:46,440 --> 00:50:50,080 Speaker 5: and everything, and yeah, what's its philosophy, and what's it calling good? 955 00:50:50,120 --> 00:50:51,960 Speaker 4: Is it like the star rating on foods? 956 00:50:52,400 --> 00:50:56,000 Speaker 2: Yeah, and did you think... Also, sorry, Patrick, I think 957 00:50:56,040 --> 00:50:57,600 Speaker 2: you just... and I'm not saying it's bad. 958 00:50:57,640 --> 00:51:00,879 Speaker 1: It could be fucking amazing. So there's no judgment, just curiosity. 959 00:51:01,560 --> 00:51:05,960 Speaker 1: But like, for example, we have had so many talks 960 00:51:06,000 --> 00:51:08,600 Speaker 1: with David Gillespie about seed oils, but there is some 961 00:51:08,760 --> 00:51:12,800 Speaker 1: astounding science to support what he says. He never makes 962 00:51:12,800 --> 00:51:15,640 Speaker 1: a claim and doesn't back it up with proper research. 963 00:51:16,320 --> 00:51:18,359 Speaker 1: And then last night on the news on Channel Seven, 964 00:51:18,400 --> 00:51:21,040 Speaker 1: they're like, ah, all the noise about seed oils, it's 965 00:51:21,080 --> 00:51:21,920 Speaker 1: not true. 966 00:51:22,120 --> 00:51:23,680 Speaker 2: They're all good. You know.
967 00:51:23,800 --> 00:51:26,239 Speaker 1: It's like they were just saying the exact opposite, but 968 00:51:26,400 --> 00:51:30,640 Speaker 1: nobody was referencing anything. Like, the thing was, nah, it's 969 00:51:30,680 --> 00:51:35,000 Speaker 1: not true. I'm a dietitian. It's good. Therefore you should 970 00:51:35,000 --> 00:51:37,400 Speaker 1: believe me because I'm... no, fuck that. That's cool, you 971 00:51:37,520 --> 00:51:39,480 Speaker 1: might be true, you might be right, but give me 972 00:51:39,520 --> 00:51:42,600 Speaker 1: some science, give me some research, show me evidence that 973 00:51:42,640 --> 00:51:45,359 Speaker 1: what you're saying is true, because these things have been 974 00:51:45,400 --> 00:51:48,520 Speaker 1: happening for millennia, where we think something and then we 975 00:51:48,600 --> 00:51:51,920 Speaker 1: find out later, like so many drugs, like thalidomide, that 976 00:51:52,000 --> 00:51:53,920 Speaker 1: was given to pregnant women for this and that, that 977 00:51:54,080 --> 00:51:57,160 Speaker 1: caused all these deformities. Well, at one stage this was 978 00:51:57,200 --> 00:51:58,920 Speaker 1: the right thing, you should have it, and it was 979 00:51:59,120 --> 00:52:03,600 Speaker 1: devastatingly terrible on women, right, but at one stage that 980 00:52:03,680 --> 00:52:06,799 Speaker 1: was good science. So I think we're so gullible that 981 00:52:06,840 --> 00:52:11,880 Speaker 1: we just go... My question would be, with Yuka, 982 00:52:12,000 --> 00:52:15,680 Speaker 1: as you call it, like, whose science are you using? 983 00:52:15,840 --> 00:52:18,359 Speaker 1: Like, what are the criteria that you are saying this 984 00:52:18,440 --> 00:52:20,480 Speaker 1: is good or bad on? And how do we know that 985 00:52:20,760 --> 00:52:23,359 Speaker 1: you are right and the other people who say other 986 00:52:23,440 --> 00:52:27,080 Speaker 1: things are wrong? That's... and by the way, everyone, I 987 00:52:27,120 --> 00:52:29,360 Speaker 1: have no opinion on it. It could be bloody brilliant, 988 00:52:29,400 --> 00:52:30,640 Speaker 1: because I haven't even seen it. 989 00:52:30,640 --> 00:52:31,400 Speaker 2: But that's just what. 990 00:52:32,239 --> 00:52:34,480 Speaker 1: Even when I talk about science, or even when I 991 00:52:34,520 --> 00:52:38,520 Speaker 1: talk about potential behavior or choices around health and wellness, 992 00:52:39,040 --> 00:52:41,680 Speaker 1: I go, this is what I think. I don't go, 993 00:52:41,960 --> 00:52:44,560 Speaker 1: you should all do this because I've read a paper 994 00:52:45,040 --> 00:52:48,000 Speaker 1: or I've trained lots of people, or... yeah, I just 995 00:52:48,080 --> 00:52:52,320 Speaker 1: think, I think the concept is a good idea, 996 00:52:52,400 --> 00:52:53,919 Speaker 1: but I would just like to dig in. 997 00:52:53,840 --> 00:52:56,239 Speaker 2: And go, what's the science behind it? 998 00:52:56,280 --> 00:52:58,760 Speaker 1: And I know that sounds fucking worrying and I'm raining 999 00:52:58,760 --> 00:53:01,920 Speaker 1: on your parade, but I just think, eh, you know, 1000 00:53:02,000 --> 00:53:04,920 Speaker 1: it's like the Health Star rating system. Gillespie just wrote 1001 00:53:05,280 --> 00:53:09,080 Speaker 1: an article on that. It's fucking incredible.
So you get 1002 00:53:09,120 --> 00:53:12,360 Speaker 1: something like, with the current Health Star rating system, 1003 00:53:13,160 --> 00:53:17,439 Speaker 1: you can have a cereal that's full of processed shit 1004 00:53:17,600 --> 00:53:18,680 Speaker 1: and sugar. 1005 00:53:19,960 --> 00:53:22,040 Speaker 2: Gets four and a half stars. In fact, he had 1006 00:53:22,040 --> 00:53:24,040 Speaker 2: a photo of one. And then he had a free 1007 00:53:24,160 --> 00:53:26,960 Speaker 2: range organic, and I know you're not going to want this, Patrick, 1008 00:53:27,239 --> 00:53:30,320 Speaker 2: but steak, so the only ingredient was beef. 1009 00:53:30,520 --> 00:53:35,120 Speaker 2: It got zero point five stars compared to processed, sugary, 1010 00:53:35,239 --> 00:53:41,359 Speaker 2: fucking... Now, it's like, this is fucking ridiculous, and it's 1011 00:53:41,480 --> 00:53:47,680 Speaker 2: based on all of these absolutely flawed criteria. But the 1012 00:53:47,719 --> 00:53:52,200 Speaker 2: one before that, which was the... there's another star rating system. 1013 00:53:52,360 --> 00:53:54,800 Speaker 1: The way that people got the stars for their product 1014 00:53:54,840 --> 00:53:58,920 Speaker 1: was they bought them. There was no actual analysis, like, 1015 00:53:58,960 --> 00:54:03,560 Speaker 1: there was no nutrition, there was no science behind this rating. 1016 00:54:04,040 --> 00:54:06,799 Speaker 2: It was like, ah, we have this, we want to 1017 00:54:06,960 --> 00:54:09,520 Speaker 2: get your endorsement, and they're like, wow, it's thirty grand 1018 00:54:09,560 --> 00:54:13,000 Speaker 2: for that. Cool, here's the thirty, or whatever. You know, like, 1019 00:54:13,040 --> 00:54:16,560 Speaker 2: it's just, this stuff is... don't believe everything that you 1020 00:54:16,680 --> 00:54:19,120 Speaker 2: hear or see. It's like, you've got to learn a bit. Yes, Tiff, 1021 00:54:19,160 --> 00:54:21,560 Speaker 2: you've got your fingers twitching, I can tell. Well, I just 1022 00:54:21,719 --> 00:54:22,160 Speaker 2: was reading. 1023 00:54:22,280 --> 00:54:24,920 Speaker 5: Like, one of the points that Chatters makes is that 1024 00:54:25,000 --> 00:54:28,760 Speaker 5: it judges foods in isolation. For example, olive oil loses 1025 00:54:28,800 --> 00:54:32,799 Speaker 5: points for calories, cheese looks worse than diet yogurt, and 1026 00:54:32,840 --> 00:54:34,400 Speaker 5: honey may not even get a score. 1027 00:54:35,040 --> 00:54:36,840 Speaker 4: And then, yeah, we're relying on. 1028 00:54:37,040 --> 00:54:39,280 Speaker 5: Being told what to eat, not how to think about 1029 00:54:39,280 --> 00:54:40,200 Speaker 5: the food we eat. 1030 00:54:40,440 --> 00:54:43,680 Speaker 2: I think the problem is that calories have been demonized. 1031 00:54:43,880 --> 00:54:46,719 Speaker 2: It's like, well, dude, you fucking need calories every day. 1032 00:54:46,760 --> 00:54:49,440 Speaker 2: It's, calories aren't good or bad. It's how many you 1033 00:54:49,600 --> 00:54:53,600 Speaker 2: have based on how many you expend. But there are 1034 00:54:53,640 --> 00:54:59,720 Speaker 2: some things, certain chemicals and certain flavourings and fucking preservatives, 1035 00:54:59,719 --> 00:55:05,359 Speaker 2: that are unequivocally bad. So no amount is good. Sorry, Patrick. And. 1036 00:55:05,400 --> 00:55:08,800 Speaker 3: It's been really good for educating myself about ultra-processed stuff, 1037 00:55:09,680 --> 00:55:12,080 Speaker 3: high sodium, that sort of thing.
But I think the 1038 00:55:12,120 --> 00:55:15,879 Speaker 3: other thing is the filtering for vegan, because I can't 1039 00:55:15,960 --> 00:55:18,000 Speaker 3: read the labels as well as I used to be 1040 00:55:18,040 --> 00:55:20,759 Speaker 3: able to, so being able to quickly do that and 1041 00:55:20,800 --> 00:55:23,399 Speaker 3: walk past. But you know, what I found was interesting too 1042 00:55:23,560 --> 00:55:26,600 Speaker 3: is that sometimes the cheapest stuff on the market is the worst in 1043 00:55:26,680 --> 00:55:32,080 Speaker 3: terms of the amount of additives, because, really, shelf life 1044 00:55:32,200 --> 00:55:35,480 Speaker 3: is one of these things, producers want to be 1045 00:55:35,520 --> 00:55:37,160 Speaker 3: able to have a product that stays on the shelf 1046 00:55:37,160 --> 00:55:40,320 Speaker 3: as long as possible. You know, if you open up wraps, 1047 00:55:40,640 --> 00:55:42,440 Speaker 3: or, you know, if you buy wraps at the supermarket, 1048 00:55:42,480 --> 00:55:44,840 Speaker 3: the majority of them have a shelf life of, 1049 00:55:44,920 --> 00:55:47,719 Speaker 3: like, six months. I don't know any bread product that 1050 00:55:47,760 --> 00:55:50,320 Speaker 3: should last six months. And even when you open it, 1051 00:55:50,320 --> 00:55:51,920 Speaker 3: it can sit in the fridge for two weeks and 1052 00:55:51,920 --> 00:55:54,319 Speaker 3: you think, it hasn't even got any mold on it. You 1053 00:55:54,360 --> 00:55:57,640 Speaker 3: know, you've got to ask the questions, you know, 1054 00:55:57,880 --> 00:56:00,799 Speaker 3: are those things actually safe? I'm making my own homemade wraps now, 1055 00:56:00,800 --> 00:56:03,560 Speaker 3: I'm pretty happy with myself. So I'm just making it 1056 00:56:03,560 --> 00:56:06,680 Speaker 3: from oat flour and linseed that I mill myself, 1057 00:56:07,120 --> 00:56:07,600 Speaker 3: and I've just. 1058 00:56:07,680 --> 00:56:09,000 Speaker 2: I think that's a wise decision. 1059 00:56:09,040 --> 00:56:10,759 Speaker 1: Also, I just wanted to jump back to what you 1060 00:56:10,760 --> 00:56:16,040 Speaker 1: were talking about, the nutritional labels, right, you can't read them. 1061 00:56:16,480 --> 00:56:19,279 Speaker 2: And the reason you can't read them is because it's 1062 00:56:19,320 --> 00:56:22,600 Speaker 2: fucking size-four font on the back of the box, 1063 00:56:22,960 --> 00:56:25,960 Speaker 2: down the bottom left or right hand corner, because they 1064 00:56:26,040 --> 00:56:29,560 Speaker 2: don't want it to get any attention. If what was in 1065 00:56:29,600 --> 00:56:33,040 Speaker 2: that box was good news for them, it'd be all 1066 00:56:33,040 --> 00:56:36,799 Speaker 2: over the front in big letters. So the front is 1067 00:56:36,840 --> 00:56:38,320 Speaker 2: an ad. It's not information. 1068 00:56:38,640 --> 00:56:40,840 Speaker 1: The shit in that box that they don't want you 1069 00:56:40,920 --> 00:56:44,040 Speaker 1: to read, and the ingredients list that they don't want 1070 00:56:44,080 --> 00:56:47,919 Speaker 1: you to read, with all those preservatives, additives, numbers, that's 1071 00:56:47,960 --> 00:56:50,319 Speaker 1: what you want to read. Fuck the front of the box. It's an ad. 1072 00:56:50,400 --> 00:56:54,160 Speaker 1: It's a trick, it's a fucking, it's a delusion.
Like 1073 00:56:54,280 --> 00:56:56,719 Speaker 1: go to the back and actually try to understand what 1074 00:56:56,719 --> 00:56:58,520 Speaker 1: you're putting in your gob. We've got to go. Give 1075 00:56:58,600 --> 00:57:01,000 Speaker 1: us one more before we go, champ, your choice. 1076 00:57:01,360 --> 00:57:05,520 Speaker 3: I'm so excited. Next week we've got a lunar eclipse, 1077 00:57:05,800 --> 00:57:08,760 Speaker 3: and it made me think about the new Artemis mission. 1078 00:57:08,800 --> 00:57:11,880 Speaker 3: So the Artemis two mission is, the astronauts, they're going 1079 00:57:11,920 --> 00:57:14,680 Speaker 3: to be heading off soon, this year. In fact, it's 1080 00:57:14,680 --> 00:57:17,720 Speaker 3: only a short time away, and the people in the 1081 00:57:17,800 --> 00:57:20,280 Speaker 3: Artemis spacecraft, they're going to go back to the Moon. 1082 00:57:20,320 --> 00:57:22,320 Speaker 3: They're not landing on the Moon, but they're going to 1083 00:57:22,360 --> 00:57:25,080 Speaker 3: go around the Moon and run all these studies. But 1084 00:57:25,600 --> 00:57:28,280 Speaker 3: because of the placement of the Moon, they're going to 1085 00:57:28,320 --> 00:57:32,320 Speaker 3: be the furthest away from Earth of any other human 1086 00:57:32,360 --> 00:57:35,560 Speaker 3: being in history. So cool. 1087 00:57:35,680 --> 00:57:39,200 Speaker 1: Huh, you would love to go to space, when that's 1088 00:57:39,360 --> 00:57:42,120 Speaker 1: like... that'd be... you'd probably chop off a knacker for 1089 00:57:42,160 --> 00:57:42,680 Speaker 1: that one, wouldn't you? 1090 00:57:42,960 --> 00:57:43,080 Speaker 5: Well? 1091 00:57:43,080 --> 00:57:46,080 Speaker 3: I think last month, last episode, I talked about a 1092 00:57:46,120 --> 00:57:50,000 Speaker 3: company that's sending ashes into space. I want to get 1093 00:57:50,000 --> 00:57:53,120 Speaker 3: my ashes sent into space. That's where I'm at. But 1094 00:57:53,160 --> 00:57:55,120 Speaker 3: the other thing that I found interesting about the Artemis 1095 00:57:55,240 --> 00:57:59,840 Speaker 3: mission is they're equipping the astronauts with ten-year-old 1096 00:58:00,120 --> 00:58:04,560 Speaker 3: DSLR cameras, Nikon cameras, because there was a Nikon 1097 00:58:04,640 --> 00:58:07,320 Speaker 3: camera that came out, which is kind of fascinating. The 1098 00:58:07,360 --> 00:58:13,680 Speaker 3: camera is a twenty sixteen camera, the Nikon D5 camera, 1099 00:58:14,160 --> 00:58:18,600 Speaker 3: and it's got, it's more resilient to radiation, and 1100 00:58:18,640 --> 00:58:22,840 Speaker 3: it has a much higher ISO rating. So ISO is 1101 00:58:22,880 --> 00:58:27,760 Speaker 3: the sensitivity to light, and it had this ridiculously good 1102 00:58:27,960 --> 00:58:31,520 Speaker 3: sensitivity built into it that's more than the, than 1103 00:58:31,560 --> 00:58:35,160 Speaker 3: the current Nikons, which is fascinating, and it means better 1104 00:58:35,280 --> 00:58:38,760 Speaker 3: contrast between the darkness of space and any object they're 1105 00:58:38,760 --> 00:58:41,720 Speaker 3: taking photographs of. So that's what they're taking with them. 1106 00:58:41,760 --> 00:58:44,680 Speaker 3: Ten-year-old cameras are going to be flying around 1107 00:58:44,800 --> 00:58:47,640 Speaker 3: with human beings, and going to be the furthest away 1108 00:58:47,760 --> 00:58:50,080 Speaker 3: that any human's ever been in history.
1109 00:58:51,360 --> 00:58:54,000 Speaker 1: Look at you, just bringing the bloody facts and the 1110 00:58:54,080 --> 00:58:57,120 Speaker 1: amusement and the entertainment. Now, I heard that you had 1111 00:58:57,120 --> 00:59:03,760 Speaker 1: your car broken into. Ah, Patrick, of course, can be. 1112 00:59:04,040 --> 00:59:06,120 Speaker 1: Somebody wrote in the group, I hate it when you 1113 00:59:06,160 --> 00:59:09,720 Speaker 1: ask Patrick where people can find you. Oh, because they 1114 00:59:09,720 --> 00:59:12,280 Speaker 1: don't like... So now I'm, now I'm scared. 1115 00:59:12,360 --> 00:59:15,000 Speaker 2: But oh well, that's okay. I go. 1116 00:59:15,520 --> 00:59:17,640 Speaker 3: If anybody wants to contact me, they can just go 1117 00:59:17,680 --> 00:59:20,520 Speaker 3: to websites now dot com dot a u and fill 1118 00:59:20,560 --> 00:59:23,040 Speaker 3: out the form and just say hi, let us know 1119 00:59:23,080 --> 00:59:23,680 Speaker 3: what you want us to do. 1120 00:59:23,800 --> 00:59:26,360 Speaker 2: Oh okay, I was unaware. Thank you for bringing that up. 1121 00:59:27,120 --> 00:59:32,680 Speaker 3: Tiff, did you know that? I know. And also tai chi 1122 00:59:32,760 --> 00:59:35,200 Speaker 3: at home dot com dot au. I have all these 1123 00:59:35,240 --> 00:59:38,640 Speaker 3: free exercises and you can meet Fritz, my Wonder dog, 1124 00:59:38,800 --> 00:59:40,160 Speaker 3: and we can do tai chi together. 1125 00:59:41,320 --> 00:59:44,000 Speaker 1: Well yes, and also, everyone, he doesn't get paid, so 1126 00:59:44,120 --> 00:59:44,880 Speaker 1: give him a break. 1127 00:59:44,920 --> 00:59:48,320 Speaker 2: He's allowed to promote, we give him fuck all. You know, 1128 00:59:48,640 --> 00:59:51,640 Speaker 2: he's only got, he's got a devalued car. 1129 00:59:51,760 --> 00:59:54,600 Speaker 1: Now he lives in fucking Upper... oh, I can't say 1130 00:59:54,600 --> 00:59:55,600 Speaker 1: what I was going to say, but it. 1131 00:59:55,600 --> 00:59:59,680 Speaker 3: Would have been hilarious, what you were going to say. Yes, exactly. 1132 00:59:59,240 --> 01:00:01,360 Speaker 2: But you know who would have sent us an email? 1133 01:00:01,480 --> 01:00:05,720 Speaker 2: Someone. Tiff, thank you. Have a good Friday, both of you. 1134 01:00:05,840 --> 01:00:08,480 Speaker 1: We'll say goodbye off air, but thanks, both of you, and 1135 01:00:08,480 --> 01:00:09,200 Speaker 1: thanks, listeners.