1 00:00:00,880 --> 00:00:03,480 Speaker 1: I've got a team. Welcome to the project. Tiffany Cook 2 00:00:03,520 --> 00:00:06,720 Speaker 1: and Patrick Bonello. Those two have just been talking non-stop. 3 00:00:06,760 --> 00:00:10,280 Speaker 1: So I'm interrupting them to start the show. And now 4 00:00:10,360 --> 00:00:14,319 Speaker 1: you can continue. Patrick, now you can keep going. See, 5 00:00:14,520 --> 00:00:18,479 Speaker 1: all your good stuff happens before we start, because there's 6 00:00:18,560 --> 00:00:20,799 Speaker 1: nothing, none of that you can talk about on 7 00:00:20,840 --> 00:00:21,240 Speaker 1: the show. 8 00:00:21,680 --> 00:00:24,040 Speaker 2: That's very true. I just think it's really rude that 9 00:00:24,079 --> 00:00:26,240 Speaker 2: Tiff and I are having a conversation and you butt 10 00:00:26,239 --> 00:00:26,960 Speaker 2: in with your show. 11 00:00:27,640 --> 00:00:31,320 Speaker 1: Yeah, I know, I know, I apologize. It's very selfish 12 00:00:31,400 --> 00:00:35,159 Speaker 1: of me, but I'm an only child. You were, and 13 00:00:35,200 --> 00:00:38,840 Speaker 1: we're not going to explore the revelation that came to light, 14 00:00:38,960 --> 00:00:42,120 Speaker 1: but you'd realize that you may have been sabotaging your 15 00:00:42,159 --> 00:00:45,960 Speaker 1: potential to find a playmate. We'll call it that. 16 00:00:48,760 --> 00:00:51,320 Speaker 2: Can you say that now and not have any context? 17 00:00:51,400 --> 00:00:55,640 Speaker 2: That's just silly. Well, oh sure, you can give the context. 18 00:00:56,360 --> 00:00:58,120 Speaker 2: Oh, I don't want to give the context, but I 19 00:00:58,520 --> 00:01:03,840 Speaker 2: exactly, and all the listeners sitting there thinking, what 20 00:01:03,880 --> 00:01:06,400 Speaker 2: the hell are those idiots talking about? 21 00:01:07,120 --> 00:01:08,240 Speaker 2: They're all going to turn off.
22 00:01:09,280 --> 00:01:11,399 Speaker 1: No, well, we were talking about the fact that you 23 00:01:11,480 --> 00:01:16,240 Speaker 1: may have been unconsciously, unintentionally sabotaging your dating efforts. Can 24 00:01:16,280 --> 00:01:16,800 Speaker 1: we say that? 25 00:01:17,280 --> 00:01:20,360 Speaker 2: Yeah? I guess, well, because I said my favorite color 26 00:01:20,400 --> 00:01:22,440 Speaker 2: is green. It was St. Patrick's Day earlier this week, 27 00:01:22,760 --> 00:01:26,640 Speaker 2: and I realized I was not only wearing a green top, 28 00:01:26,680 --> 00:01:29,480 Speaker 2: well, a hoodie with a green T-shirt underneath. But 29 00:01:29,520 --> 00:01:32,199 Speaker 2: it's a, if anyone's been to Berlin, there's this really 30 00:01:32,280 --> 00:01:36,480 Speaker 2: cool shop, or shops, around Berlin called Ampelmann, and Ampelmann 31 00:01:36,640 --> 00:01:40,399 Speaker 2: is the little red dude that's on the street lights 32 00:01:40,440 --> 00:01:42,480 Speaker 2: that makes you stop, and a little picture of a 33 00:01:42,520 --> 00:01:45,120 Speaker 2: green dude that allows you to walk across the intersection, 34 00:01:45,240 --> 00:01:47,920 Speaker 2: so the stop/go character. But it's almost the 35 00:01:48,000 --> 00:01:50,400 Speaker 2: Monopoly-like character, isn't it? When you look at it, 36 00:01:50,400 --> 00:01:52,480 Speaker 2: it looks a bit like the Monopoly dude. So you 37 00:01:52,480 --> 00:01:55,440 Speaker 2: can, if you can imagine, green. Well, I've got this 38 00:01:55,480 --> 00:01:57,400 Speaker 2: hoodie that I bought in Berlin, which I love. It's 39 00:01:57,560 --> 00:01:59,760 Speaker 2: my favorite hoodie, and it's got 40 00:01:59,840 --> 00:02:02,400 Speaker 2: the go guy on the front, the little Ampelmann 41 00:02:02,800 --> 00:02:04,800 Speaker 2: green guy on the front, but it's got the red 42 00:02:04,840 --> 00:02:08,399 Speaker 2: stop guy on the back.
And Craig suggested that may 43 00:02:08,400 --> 00:02:11,760 Speaker 2: be hindering my ability to be able to meet up 44 00:02:11,800 --> 00:02:13,519 Speaker 2: with people on dates. 45 00:02:16,120 --> 00:02:20,080 Speaker 1: I'm just, I'm suggesting that you just maybe have a 46 00:02:20,080 --> 00:02:23,280 Speaker 1: green... no, I don't know what I'm suggesting. 47 00:02:23,720 --> 00:02:27,560 Speaker 2: Maybe it's the grown man who's got a top that's 48 00:02:28,120 --> 00:02:30,799 Speaker 2: eyed on. That's probably why people... 49 00:02:31,360 --> 00:02:33,760 Speaker 1: I think a couple of our listeners will figure out 50 00:02:33,919 --> 00:02:36,399 Speaker 1: where I was going, but I hope most of them 51 00:02:36,440 --> 00:02:41,280 Speaker 1: don't. Good morning, Tiff. Morning. How are you? Very good, 52 00:02:41,320 --> 00:02:41,680 Speaker 3: thanks. 53 00:02:42,639 --> 00:02:44,760 Speaker 1: I heard that you're going on a date tomorrow and 54 00:02:44,880 --> 00:02:46,560 Speaker 1: Patrick's going on a date tomorrow. 55 00:02:46,800 --> 00:02:48,360 Speaker 3: Yeah, how exciting. 56 00:02:49,200 --> 00:02:51,400 Speaker 1: And are your dogs also going on the date? 57 00:02:51,639 --> 00:02:52,040 Speaker 2: They are. 58 00:02:52,400 --> 00:02:53,760 Speaker 3: Yeah, Luna can't wait. 59 00:02:54,360 --> 00:02:56,280 Speaker 1: Yeah, what are you two doing? 60 00:02:57,760 --> 00:03:00,000 Speaker 3: Gonna run around in the park, roll in the grass 61 00:03:00,120 --> 00:03:03,239 Speaker 3: a little bit, have some lunch, yep. 62 00:03:03,520 --> 00:03:03,800 Speaker 2: Yeah. 63 00:03:04,520 --> 00:03:07,239 Speaker 1: Are Fritz and Luna on speaking terms? 64 00:03:07,880 --> 00:03:08,600 Speaker 3: They've met once. 65 00:03:09,200 --> 00:03:13,279 Speaker 2: Yeah, yeah. But we're also bringing in a third wildcard 66 00:03:13,320 --> 00:03:15,760 Speaker 2: factor here.
Another friend of mine, Kim, is bringing her 67 00:03:15,840 --> 00:03:19,440 Speaker 2: dog Walter, Wally, and Wally is probably in the same 68 00:03:19,520 --> 00:03:23,560 Speaker 2: league as Luna in terms of dogs that go absolutely 69 00:03:23,600 --> 00:03:26,120 Speaker 2: psycho with other dogs. Could I be right in saying that, Tiff? 70 00:03:26,320 --> 00:03:30,480 Speaker 1: Yeah, yeah, your dog? Does your dog go nuts, Tiff? 71 00:03:31,040 --> 00:03:35,960 Speaker 3: Oh yeah, she can do really all sorts. She's all sorts. 72 00:03:36,880 --> 00:03:39,360 Speaker 2: But whippets, they're so fast. When you see a whippet 73 00:03:39,400 --> 00:03:41,200 Speaker 2: running at the park and they do that... what I 74 00:03:41,240 --> 00:03:44,279 Speaker 2: love about the whippets is not just the speed and acceleration, 75 00:03:44,560 --> 00:03:48,240 Speaker 2: it's the ability to turn around on a postage stamp 76 00:03:48,440 --> 00:03:50,839 Speaker 2: and suddenly go in the opposite direction. And that's why 77 00:03:50,840 --> 00:03:53,040 Speaker 2: I'm so excited. I've been trying to organize this three 78 00:03:53,040 --> 00:03:56,280 Speaker 2: way date for such a long time between my friend 79 00:03:56,640 --> 00:03:59,960 Speaker 2: and her whippet so that she can meet Luna. 80 00:04:00,040 --> 00:04:02,880 Speaker 2: That's pretty exciting. But we're probably boring everybody out. We 81 00:04:03,040 --> 00:04:03,640 Speaker 1: probably are. 82 00:04:04,200 --> 00:04:06,720 Speaker 3: We'll put some puppy spam in the face. 83 00:04:06,520 --> 00:04:10,360 Speaker 2: Exactly, don't you.
Today I was just thinking, I don't 84 00:04:10,400 --> 00:04:14,320 Speaker 2: know whether it made big news in conventional media, because 85 00:04:14,320 --> 00:04:17,120 Speaker 2: it only just occurred to me this morning that probably 86 00:04:17,160 --> 00:04:20,400 Speaker 2: everybody was watching this on television, whereas I was watching 87 00:04:20,400 --> 00:04:23,960 Speaker 2: it via the NASA live feed. But the astronauts coming 88 00:04:24,200 --> 00:04:28,400 Speaker 2: back from the International Space Station on the SpaceX Dragon 89 00:04:29,160 --> 00:04:32,320 Speaker 2: recovery vehicle, you know, the little shuttle they have. Did 90 00:04:32,360 --> 00:04:35,760 Speaker 2: you watch that? Did you know about that? Because I 91 00:04:35,800 --> 00:04:39,159 Speaker 2: get pinged when NASA has an interesting thing happening. I'm 92 00:04:39,240 --> 00:04:42,280 Speaker 2: constantly being, you know, like, oh, look, the International Space 93 00:04:42,320 --> 00:04:45,120 Speaker 2: Station is flying over, so you can log onto a 94 00:04:45,160 --> 00:04:47,880 Speaker 2: live feed. But did you know about that? Was that 95 00:04:47,920 --> 00:04:49,239 Speaker 2: big in the news this week? 96 00:04:49,800 --> 00:04:52,080 Speaker 1: You really do need to get a television. I know 97 00:04:52,160 --> 00:04:54,560 Speaker 1: you don't want one, but it's been all over the 98 00:04:54,600 --> 00:04:57,720 Speaker 1: news this week. That's like saying, did any of you notice 99 00:04:57,760 --> 00:05:00,719 Speaker 1: that in the morning the sun comes up? Has anybody? 100 00:05:01,400 --> 00:05:05,000 Speaker 1: Has anybody noticed that? Or just me? Yes, Patrick, it's 101 00:05:05,000 --> 00:05:09,240 Speaker 1: been all over the news, but it is, Tiff.
I 102 00:05:09,360 --> 00:05:11,080 Speaker 1: look, I mean, you would look at it from all 103 00:05:11,120 --> 00:05:13,560 Speaker 1: the tech and the sci-fi, not sci-fi, but 104 00:05:13,600 --> 00:05:17,920 Speaker 1: the space. I look at it from the physiology. I think, wow, 105 00:05:18,000 --> 00:05:22,480 Speaker 1: these two bodies haven't experienced gravity, actual real gravity, 106 00:05:22,520 --> 00:05:27,600 Speaker 1: for nine months, and just to think about the physiological destruction, 107 00:05:27,800 --> 00:05:30,919 Speaker 1: bone density, muscle mass, you know, all the things that 108 00:05:30,960 --> 00:05:34,599 Speaker 1: have fallen apart in the last nine months. And to 109 00:05:34,680 --> 00:05:36,960 Speaker 1: think about how long it's going to take to get 110 00:05:37,000 --> 00:05:39,919 Speaker 1: those bodies, if they ever would get back to normal, 111 00:05:40,000 --> 00:05:42,520 Speaker 1: I'm not actually sure, but to get them back to 112 00:05:43,200 --> 00:05:45,880 Speaker 1: somewhat operational, that's fascinating. 113 00:05:46,160 --> 00:05:49,279 Speaker 2: Yeah, there are these suggestions that there are long term effects, 114 00:05:49,279 --> 00:05:52,520 Speaker 2: obviously, from being in zero G for such a long time, 115 00:05:52,560 --> 00:05:56,400 Speaker 2: but you're right that there may be some instances of 116 00:05:57,080 --> 00:06:01,880 Speaker 2: that, exposure to more radiation. It's thought that people on 117 00:06:01,960 --> 00:06:07,280 Speaker 2: the Apollo missions had a higher incidence of cancer. Obviously 118 00:06:07,560 --> 00:06:10,960 Speaker 2: we're protected here on our little blue globe floating there 119 00:06:10,960 --> 00:06:14,559 Speaker 2: in the middle of everything in the void.
So I think, 120 00:06:15,080 --> 00:06:18,840 Speaker 2: I love that people can get so excited about something 121 00:06:19,000 --> 00:06:22,400 Speaker 2: like this, you know, being stuck in a tin can 122 00:06:22,760 --> 00:06:25,280 Speaker 2: for nine months when you thought you were going for 123 00:06:25,400 --> 00:06:28,080 Speaker 2: eight days. I'm just thinking, like when I went to 124 00:06:28,120 --> 00:06:31,039 Speaker 2: your place recently and your room that you wouldn't let 125 00:06:31,080 --> 00:06:34,200 Speaker 2: me open the window in on a hot night. I thought, 126 00:06:34,320 --> 00:06:35,680 Speaker 2: you know, it's kind of similar to that. 127 00:06:35,640 --> 00:06:41,320 Speaker 1: You could crawl into my room where the AC is. 128 00:06:41,440 --> 00:06:43,560 Speaker 2: Oh, I didn't think about that, of course. So I 129 00:06:43,600 --> 00:06:44,839 Speaker 2: go to his place, Tiff. 130 00:06:46,640 --> 00:06:51,359 Speaker 4: It's, you know, thirty-nine degrees, and he says, here's 131 00:06:51,360 --> 00:06:53,880 Speaker 4: your room, and I'm thinking, he hasn't turned any of 132 00:06:53,920 --> 00:06:56,920 Speaker 4: the air cons on. It's sweltering in here. 133 00:06:57,160 --> 00:07:00,479 Speaker 2: So I say to him, can I open the window, mate? 134 00:07:00,600 --> 00:07:03,120 Speaker 2: No, because of the neighbors, you can't open the window. 135 00:07:03,320 --> 00:07:05,760 Speaker 2: He says you could leave your door open. It's like, why? 136 00:07:06,160 --> 00:07:10,800 Speaker 2: You haven't turned the AC on, you tight-arse. I gave 137 00:07:10,840 --> 00:07:11,360 Speaker 2: you a bed. 138 00:07:14,400 --> 00:07:16,960 Speaker 3: What is it about this window? Why can't you open the window? 139 00:07:17,000 --> 00:07:18,120 Speaker 3: Because of the neighbors? 140 00:07:18,440 --> 00:07:20,880 Speaker 1: It's not that, it's that the window won't open.
141 00:07:22,240 --> 00:07:26,680 Speaker 2: It's bolted from the outside, so the person can't get 142 00:07:26,760 --> 00:07:27,120 Speaker 2: out. 143 00:07:27,720 --> 00:07:31,600 Speaker 1: That's how I keep my guests in the house. I mean, 144 00:07:33,760 --> 00:07:37,080 Speaker 1: just call me Hannibal Lecter. If only you didn't take 145 00:07:37,120 --> 00:07:43,320 Speaker 1: off at four am out the front door, Patrick. It's 146 00:07:43,440 --> 00:07:45,960 Speaker 1: very ungracious and very ungrateful. 147 00:07:47,760 --> 00:07:50,160 Speaker 2: It was actually very kind of Craig, because he let me stay. 148 00:07:50,600 --> 00:07:51,280 Speaker 2: It's too late. 149 00:07:54,600 --> 00:07:56,800 Speaker 1: Did you get that thing I sent you through the week? 150 00:07:57,200 --> 00:08:00,360 Speaker 1: Like, this is the problem, Tiff. I send him tech 151 00:08:00,440 --> 00:08:03,440 Speaker 1: things and he never even acknowledges it, because he didn't 152 00:08:03,480 --> 00:08:05,680 Speaker 1: think of it or see it. He doesn't go, this 153 00:08:05,840 --> 00:08:08,480 Speaker 1: is really interesting, Craig, thanks for sending this. 154 00:08:10,120 --> 00:08:12,720 Speaker 2: I think that when you sent it, I'm going to 155 00:08:12,760 --> 00:08:13,920 Speaker 2: defend myself here. 156 00:08:14,000 --> 00:08:17,040 Speaker 1: Oh cool, sure, how about you just say, sorry, Jumbo, 157 00:08:17,480 --> 00:08:19,800 Speaker 1: I didn't acknowledge it. In any case... 158 00:08:20,160 --> 00:08:24,840 Speaker 2: I was trying to be Switzerland in the middle 159 00:08:24,880 --> 00:08:27,520 Speaker 2: of two warring parties. I'm not going to go into 160 00:08:27,560 --> 00:08:29,960 Speaker 2: the details because they might be listening. But no, no, 161 00:08:30,040 --> 00:08:33,040 Speaker 2: there was an altercation between two people and I was 162 00:08:33,120 --> 00:08:36,160 Speaker 2: being the Dalai Lama.
By a coincidence, I'm born the 163 00:08:36,200 --> 00:08:38,240 Speaker 2: same day as the Dalai Lama. I just thought I'd 164 00:08:38,240 --> 00:08:41,920 Speaker 2: mention that. But I was trying to, look, two people 165 00:08:41,960 --> 00:08:44,079 Speaker 2: were really upset, and I was trying to defuse 166 00:08:44,120 --> 00:08:47,800 Speaker 2: the situation and make them both less upset, and I 167 00:08:47,840 --> 00:08:50,680 Speaker 2: think I succeeded. So I think I served a really 168 00:08:50,720 --> 00:08:52,400 Speaker 2: good purpose there. Send me to the Middle East, I 169 00:08:52,400 --> 00:08:55,120 Speaker 2: don't know, but in this instance. 170 00:08:55,400 --> 00:08:58,119 Speaker 1: In doing that, you upset me, because you didn't acknowledge 171 00:08:58,160 --> 00:08:58,760 Speaker 1: what I sent you. 172 00:08:59,040 --> 00:09:02,240 Speaker 2: And I thank you for letting me stay at your place. Okay, 173 00:09:02,240 --> 00:09:03,920 Speaker 2: tell us a little bit about the article that you 174 00:09:03,920 --> 00:09:04,959 Speaker 2: sent through, because I am in. 175 00:09:05,000 --> 00:09:06,840 Speaker 1: No, you tell us. Did you read it? 176 00:09:07,080 --> 00:09:07,400 Speaker 2: Maybe? 177 00:09:08,200 --> 00:09:11,400 Speaker 1: No, you didn't. He can't even remember. 178 00:09:11,840 --> 00:09:14,080 Speaker 2: Things happened this week. He picks up. 179 00:09:14,400 --> 00:09:16,520 Speaker 1: He's picking up his phone, everybody, to have a look 180 00:09:16,559 --> 00:09:19,080 Speaker 1: at what I said. That's how much he doesn't care. 181 00:09:20,679 --> 00:09:25,720 Speaker 1: It's very hurtful. Today, it's back in therapy for Harps. 182 00:09:25,800 --> 00:09:27,640 Speaker 1: I don't think I need to. I'm going to 183 00:09:27,679 --> 00:09:30,760 Speaker 1: tell Tiff. So fuck you, Patrick. Tiff, did you know 184 00:09:30,800 --> 00:09:33,000 Speaker 1: that they're building houses now, three D
185 00:09:33,000 --> 00:09:35,079 Speaker 2: printing houses and whole neighborhoods? Yes? 186 00:09:35,920 --> 00:09:38,120 Speaker 1: Oh, well, now he knows, because he just read the 187 00:09:38,160 --> 00:09:39,000 Speaker 1: thing that I sent to you. 188 00:09:39,120 --> 00:09:41,640 Speaker 2: I know there was this arm, and the arm was 189 00:09:41,679 --> 00:09:43,560 Speaker 2: printing a three D house, wasn't it? It was a 190 00:09:43,600 --> 00:09:46,360 Speaker 2: concrete arm, so it was squirting concrete and building 191 00:09:46,400 --> 00:09:50,040 Speaker 2: the structure of the house, wasn't it? It was. It 192 00:09:50,120 --> 00:09:53,480 Speaker 2: was a neighborhood, a whole neighborhood three D printed, and 193 00:09:53,520 --> 00:09:55,680 Speaker 2: you could print different shaped houses, so you could actually 194 00:09:55,760 --> 00:09:58,600 Speaker 2: custom design the house you wanted. So what they're saying is, 195 00:09:58,600 --> 00:10:02,280 Speaker 2: with three D printing, you can print conventional houses, 196 00:10:02,400 --> 00:10:05,320 Speaker 2: but you can actually come up with much more alternative 197 00:10:05,360 --> 00:10:08,719 Speaker 2: designs, because of the way that the little mandible or 198 00:10:08,760 --> 00:10:11,560 Speaker 2: proboscis thing, proboscis, we talk about that every day, don't we, Craig, 199 00:10:11,559 --> 00:10:16,160 Speaker 2: proboscis, anyway, it's able to squirt the concrete 200 00:10:16,200 --> 00:10:20,000 Speaker 2: and make unusual shapes that conventionally would be more difficult 201 00:10:20,000 --> 00:10:20,360 Speaker 2: to make. 202 00:10:21,160 --> 00:10:23,880 Speaker 1: Yes, it is true, Patrick, I'll let you off the hook. 203 00:10:24,200 --> 00:10:26,360 Speaker 1: It is, and I think it's very exciting. 204 00:10:26,679 --> 00:10:27,440 Speaker 2: It is exciting.
205 00:10:27,520 --> 00:10:30,679 Speaker 1: You can have any, like, even you can have, like, 206 00:10:30,840 --> 00:10:33,920 Speaker 1: curved walls and stuff. How cool is that? Yeah, it's 207 00:10:34,040 --> 00:10:36,120 Speaker 1: no harder to make than a straight wall. 208 00:10:37,280 --> 00:10:40,600 Speaker 2: But sometimes, not. The straight is good. But it is 209 00:10:40,679 --> 00:10:43,560 Speaker 2: fascinating, and if you were a wall, you'd be a 210 00:10:43,600 --> 00:10:47,720 Speaker 2: curved wall, if not a fucking totally bent wall. 211 00:10:47,800 --> 00:10:50,719 Speaker 1: But anyway, back to the show. But I've 212 00:10:50,520 --> 00:10:56,480 Speaker 2: talked to friends recently who have teenagers and young adult children, 213 00:10:56,960 --> 00:11:00,000 Speaker 2: and they really worry about whether or not the kids 214 00:11:00,280 --> 00:11:02,560 Speaker 2: are ever going to have houses, are ever going to be 215 00:11:02,600 --> 00:11:05,679 Speaker 2: able to own their own homes, because of this. And 216 00:11:06,000 --> 00:11:08,800 Speaker 2: my colleague, he's, I mean, he's got a family, he's 217 00:11:08,800 --> 00:11:11,280 Speaker 2: got a young girl just turned two. He's in his 218 00:11:11,400 --> 00:11:14,200 Speaker 2: late thirties, and he said, look, my wife and I 219 00:11:14,240 --> 00:11:16,839 Speaker 2: both have a job, and we don't ever reckon we're 220 00:11:16,840 --> 00:11:19,040 Speaker 2: going to be able to afford a house. But maybe 221 00:11:19,240 --> 00:11:22,560 Speaker 2: the likes of three D printing could make a massive 222 00:11:23,080 --> 00:11:26,600 Speaker 2: leap in the direction of making affordable housing, especially 223 00:11:26,679 --> 00:11:28,800 Speaker 2: now that you can change the shapes, and we're 224 00:11:28,800 --> 00:11:31,920 Speaker 2: not talking just concrete blocks put up and making a square.
225 00:11:32,080 --> 00:11:35,040 Speaker 2: We're talking creative designs that people could have input into. 226 00:11:35,160 --> 00:11:36,560 Speaker 2: So it's actually really exciting. 227 00:11:37,120 --> 00:11:40,360 Speaker 1: Well, and I think also what's good is it's way, way, 228 00:11:40,440 --> 00:11:44,160 Speaker 1: way, way quicker, and the building cost, like, the cost 229 00:11:44,200 --> 00:11:46,559 Speaker 1: of the building, I think they said it's twenty five percent, 230 00:11:47,400 --> 00:11:49,800 Speaker 1: so it would build the equivalent of a one hundred thousand 231 00:11:49,840 --> 00:11:52,959 Speaker 1: dollar build for twenty five grand. And I think that, 232 00:11:53,800 --> 00:11:56,319 Speaker 1: like, humans are pretty good at solving things, and also 233 00:11:56,400 --> 00:12:00,640 Speaker 1: creating problems on the other hand. But surely, you know, 234 00:12:00,679 --> 00:12:02,440 Speaker 1: we need to be able to figure out a way 235 00:12:02,520 --> 00:12:08,560 Speaker 1: to create housing that's not, you know, prohibitive for the 236 00:12:08,679 --> 00:12:11,880 Speaker 1: average person, you know, so that that's got to be 237 00:12:11,920 --> 00:12:14,720 Speaker 1: at the forefront. Like, if I was in the building space, 238 00:12:15,559 --> 00:12:17,640 Speaker 1: that would be an obsession of mine, trying to 239 00:12:17,880 --> 00:12:20,679 Speaker 1: figure out how do we build homes that people can afford, 240 00:12:21,800 --> 00:12:24,559 Speaker 1: that still meet their needs. And I think we're also 241 00:12:24,640 --> 00:12:28,560 Speaker 1: seeing, I was going to say the proliferation, but the 242 00:12:28,720 --> 00:12:31,640 Speaker 1: increase of tiny homes, which I feel like, you wouldn't 243 00:12:31,679 --> 00:12:34,240 Speaker 1: be mad at a tiny home, Patrick, even though you've 244 00:12:34,280 --> 00:12:37,000 Speaker 1: got a five bedroom house that you don't need.
245 00:12:37,320 --> 00:12:41,080 Speaker 2: Isn't it funny? Actually, I said to someone once that 246 00:12:41,200 --> 00:12:44,520 Speaker 2: when, at sixteen, as you know, I bought a caravan, 247 00:12:44,840 --> 00:12:46,960 Speaker 2: moved out into the backyard, so effectively I lived in 248 00:12:46,960 --> 00:12:49,640 Speaker 2: a caravan for about three and a half years. And 249 00:12:49,679 --> 00:12:51,840 Speaker 2: I was telling someone that story and they said, I 250 00:12:51,960 --> 00:12:57,360 Speaker 2: always knew you were trailer trash. But I really enjoyed 251 00:12:57,400 --> 00:12:59,960 Speaker 2: having that small space around me. I made it my own, 252 00:13:00,320 --> 00:13:04,680 Speaker 2: and as you know, I've recently built, or converted, my 253 00:13:04,800 --> 00:13:07,920 Speaker 2: garage into a studio space so I can do my tai chi, 254 00:13:08,160 --> 00:13:12,400 Speaker 2: do my podcasting. But my exit strategy was always going 255 00:13:12,440 --> 00:13:15,199 Speaker 2: to be sell the ridiculously big house and then maybe 256 00:13:15,200 --> 00:13:17,040 Speaker 2: buy two small units. But I reckon I'm going to 257 00:13:17,080 --> 00:13:19,880 Speaker 2: move into my little studio and then rent out my 258 00:13:20,000 --> 00:13:24,240 Speaker 2: house, because, you know, it's a great functional space, and 259 00:13:24,280 --> 00:13:28,040 Speaker 2: I think it's about, you know, we like to accumulate things, 260 00:13:28,160 --> 00:13:30,319 Speaker 2: and it's easy for us to kind of fill up 261 00:13:30,520 --> 00:13:33,560 Speaker 2: a big space with even more stuff, and we do 262 00:13:33,640 --> 00:13:35,280 Speaker 2: tend to do that, although I've got to say I 263 00:13:35,320 --> 00:13:38,120 Speaker 2: buy a lot of secondhand stuff.
I love secondhand furniture, 264 00:13:38,160 --> 00:13:41,360 Speaker 2: and I love Art Deco, which is my favorite style 265 00:13:41,480 --> 00:13:44,800 Speaker 2: of furniture, and it's ridiculously cheap. People throw it away 266 00:13:44,920 --> 00:13:48,160 Speaker 2: at the moment, which is good. But I think that 267 00:13:48,200 --> 00:13:51,280 Speaker 2: you're right, and affordable housing, if it's done properly, or 268 00:13:51,320 --> 00:13:54,920 Speaker 2: tiny houses, can mean the person has their own space, 269 00:13:54,960 --> 00:13:58,319 Speaker 2: and I think ultimately we all want that security of 270 00:13:58,360 --> 00:14:01,160 Speaker 2: being able to, even if it's a small house, being 271 00:14:01,160 --> 00:14:03,120 Speaker 2: able to say that this is our plot of land, 272 00:14:03,160 --> 00:14:06,720 Speaker 2: this is our little home that we've decorated, we've made 273 00:14:06,720 --> 00:14:10,680 Speaker 2: our own, and it's somewhere that we all deserve 274 00:14:10,800 --> 00:14:14,120 Speaker 2: to have, that stability of a roof above our heads. 275 00:14:14,520 --> 00:14:18,440 Speaker 1: Yeah, yeah. Are you going to own a home one day, 276 00:14:18,520 --> 00:14:21,280 Speaker 1: Tiff, you reckon? Is that on your, is that on your agenda? 277 00:14:21,360 --> 00:14:22,680 Speaker 1: Or is that on your to-do list? 278 00:14:22,960 --> 00:14:24,920 Speaker 3: I have an apartment, Harps. 279 00:14:25,000 --> 00:14:26,120 Speaker 1: Oh, that's right, you do. 280 00:14:26,280 --> 00:14:27,400 Speaker 3: I don't live in it, though. 281 00:14:28,120 --> 00:14:30,920 Speaker 1: Oh, that's right. Sorry, I forgot. You do own an apartment, 282 00:14:30,920 --> 00:14:33,200 Speaker 1: but because where you are is not yours, and I 283 00:14:33,240 --> 00:14:35,640 Speaker 1: forgot about it. Yeah, well done. Yeah, I do. 284 00:14:35,800 --> 00:14:36,120 Speaker 5: I would.
285 00:14:36,280 --> 00:14:39,280 Speaker 3: I would love, though, to have something bigger and that 286 00:14:39,400 --> 00:14:41,720 Speaker 3: I would live in, like, I would love a home home. 287 00:14:43,400 --> 00:14:49,040 Speaker 2: Yeah. What was that, move in together? Depending how the date goes tomorrow? 288 00:14:49,240 --> 00:14:52,160 Speaker 2: You've got a five bedroom house. Just move in there. 289 00:14:52,160 --> 00:14:53,520 Speaker 3: Actually, it did cross my mind. 290 00:14:53,600 --> 00:14:53,800 Speaker 2: You? 291 00:14:53,800 --> 00:14:56,040 Speaker 1: You should move in together? Like? 292 00:14:56,440 --> 00:14:57,920 Speaker 2: Great, we'd be awesome together. 293 00:14:58,400 --> 00:15:02,640 Speaker 1: It actually would be good together. You could cohabit. 294 00:15:03,120 --> 00:15:06,480 Speaker 3: It's not the first conversation we've had. And I wouldn't 295 00:15:06,520 --> 00:15:13,640 Speaker 3: mind the hoodie. Wow. I'd wear it sometimes. 296 00:15:15,040 --> 00:15:15,320 Speaker 2: Wow? 297 00:15:15,800 --> 00:15:20,280 Speaker 1: Wow? Well, if you live with Patrick, there'd still be 298 00:15:20,320 --> 00:15:23,600 Speaker 1: three spare bedrooms, depending on how things worked out, maybe 299 00:15:23,680 --> 00:15:29,000 Speaker 1: four, top to toe in sleeping bags, telling each other stories. 300 00:15:29,200 --> 00:15:33,160 Speaker 2: Ninety nine, Tasting the Vegan Marshmallow. 301 00:15:35,640 --> 00:15:38,960 Speaker 1: Patrick, please tell us about some technology so we could 302 00:15:39,520 --> 00:15:42,760 Speaker 1: keep people actually informed. 303 00:15:42,280 --> 00:15:46,520 Speaker 2: I thought of Tiff when I read this little story 304 00:15:46,800 --> 00:15:50,560 Speaker 2: about a new smart pen that was going to hit 305 00:15:50,600 --> 00:15:53,520 Speaker 2: the market. It's on Kickstarter, so it's a fundraising campaign. 306 00:15:53,840 --> 00:15:56,680 Speaker 2: And what it is, it's an AI pen.
So I 307 00:15:56,720 --> 00:15:59,200 Speaker 2: know you've got your really fancy tablet that you use, 308 00:15:59,280 --> 00:16:03,200 Speaker 2: but what they've got is an AI pen that, as you write, 309 00:16:03,480 --> 00:16:07,880 Speaker 2: converts your handwriting to text and links up to your phone, 310 00:16:07,920 --> 00:16:11,520 Speaker 2: but it's using ChatGPT. So it's quite an interesting 311 00:16:11,560 --> 00:16:15,440 Speaker 2: gadget, because you don't, you know, you can write on anything, 312 00:16:15,680 --> 00:16:18,120 Speaker 2: so you can be writing on paper and it's just 313 00:16:18,200 --> 00:16:23,160 Speaker 2: converting to digital formats as you go. It also records, 314 00:16:23,280 --> 00:16:25,320 Speaker 2: so if you're in a lecture, it will 315 00:16:25,360 --> 00:16:28,080 Speaker 2: record the lecture, so it can record audio. If you're 316 00:16:28,080 --> 00:16:30,320 Speaker 2: in a meeting, you can be taking notes, but it's 317 00:16:30,360 --> 00:16:33,520 Speaker 2: recording everything that's also being said, so you can match 318 00:16:33,560 --> 00:16:36,720 Speaker 2: that up with the text that's being digitized as you write. 319 00:16:38,400 --> 00:16:41,760 Speaker 1: I wonder, do you think the average person types quicker 320 00:16:41,760 --> 00:16:45,360 Speaker 1: than they would write? Like, I think, you know, like, 321 00:16:45,400 --> 00:16:47,480 Speaker 1: if you had to write fifty words with a pen 322 00:16:47,600 --> 00:16:50,040 Speaker 1: on paper or type fifty words, most of us 323 00:16:50,080 --> 00:16:51,320 Speaker 1: would type quicker, wouldn't we? 324 00:16:51,680 --> 00:16:54,480 Speaker 3: But isn't, I feel like you've talked about this, isn't 325 00:16:55,080 --> 00:16:58,600 Speaker 3: the art of writing, isn't that better for retention and learning? 326 00:16:58,800 --> 00:17:01,320 Speaker 2: Much, much better. No, you're absolutely right.
I think we 327 00:17:01,360 --> 00:17:04,160 Speaker 2: spoke about it a few episodes ago. So what we're 328 00:17:04,160 --> 00:17:08,120 Speaker 2: starting to realize now is, by forming words, and in fact, 329 00:17:08,240 --> 00:17:12,000 Speaker 2: cursive writing is even better than printing, by forming those words, 330 00:17:12,040 --> 00:17:14,720 Speaker 2: it's almost writing it to the mental hard drive, because 331 00:17:14,760 --> 00:17:17,240 Speaker 2: you've got to verbalize it in your mind as you're 332 00:17:17,240 --> 00:17:20,360 Speaker 2: writing it, and that helps with retention. So you're absolutely right. 333 00:17:20,480 --> 00:17:23,119 Speaker 2: So yes, writing it is going to help with retention 334 00:17:23,200 --> 00:17:25,680 Speaker 2: a lot more than just typing it out. And 335 00:17:25,720 --> 00:17:28,239 Speaker 2: you're probably right, Craigo. I've seen some people type at an 336 00:17:28,320 --> 00:17:31,160 Speaker 2: amazingly fast rate. I don't think I could write that quickly. 337 00:17:31,280 --> 00:17:33,639 Speaker 2: Or if you start to write that quickly, your handwriting 338 00:17:33,680 --> 00:17:36,240 Speaker 2: turns to crap and you end up qualifying as a GP. 339 00:17:37,080 --> 00:17:42,560 Speaker 2: But the One Smart AI Pen is what this Kickstarter project 340 00:17:42,600 --> 00:17:44,880 Speaker 2: is called. But it does look pretty amazing and it's 341 00:17:44,880 --> 00:17:47,159 Speaker 2: a great little gadget for those people who do 342 00:17:47,280 --> 00:17:50,040 Speaker 2: actually like to take notes. And you obviously do, 343 00:17:50,160 --> 00:17:52,880 Speaker 2: you prefer to handwrite. So, are you using 344 00:17:52,960 --> 00:17:54,959 Speaker 2: it all the time? Do you use your tablet, your 345 00:17:55,040 --> 00:17:56,160 Speaker 2: smart tablet, all the time? 346 00:17:56,320 --> 00:17:58,040 Speaker 3: Yeah, I use it all the time.
And it does 347 00:17:58,119 --> 00:18:01,120 Speaker 3: have a convert-to-text option. I just don't 348 00:18:01,240 --> 00:18:01,520 Speaker 3: use that. 349 00:18:02,800 --> 00:18:05,520 Speaker 2: Oh, so you just leave your handwritten notes as is 350 00:18:05,560 --> 00:18:07,680 Speaker 2: and you just go and read them again that way. Yeah. 351 00:18:07,840 --> 00:18:10,040 Speaker 3: I don't use a lot of the functions 352 00:18:10,040 --> 00:18:12,800 Speaker 3: that are available on this. Like, you can convert, you 353 00:18:12,840 --> 00:18:14,960 Speaker 3: can email straight to people, you can do all sorts 354 00:18:14,960 --> 00:18:15,320 Speaker 3: of things. 355 00:18:15,560 --> 00:18:17,679 Speaker 1: Can you just tell people what it is that 356 00:18:17,720 --> 00:18:20,000 Speaker 1: you actually have, so if they want to do a 357 00:18:20,040 --> 00:18:21,480 Speaker 1: search and look at what you use. 358 00:18:21,800 --> 00:18:25,200 Speaker 3: It's called a reMarkable, and sadly I've got the black 359 00:18:25,200 --> 00:18:28,080 Speaker 3: and white version. They just released a color version which 360 00:18:28,119 --> 00:18:30,680 Speaker 3: I don't have, and I'm so jealous, but it's great. 361 00:18:30,480 --> 00:18:31,760 Speaker 1: Amazing. How much are they? 362 00:18:32,240 --> 00:18:32,440 Speaker 2: Oh? 363 00:18:32,520 --> 00:18:34,760 Speaker 3: I think I paid... I think it was around 364 00:18:34,760 --> 00:18:37,919 Speaker 3: seven hundred when I bought it. Wow. Yeah, so they're not cheap. 365 00:18:38,200 --> 00:18:40,639 Speaker 1: I actually have a question I don't know the answer to. 366 00:18:40,880 --> 00:18:44,199 Speaker 1: Do you two know? Does the average, and I'm sure it varies, 367 00:18:44,240 --> 00:18:46,879 Speaker 1: but does the average, say, year seven or eight or 368 00:18:47,000 --> 00:18:51,639 Speaker 1: nine student in twenty twenty five in a classroom. 
Are 369 00:18:51,680 --> 00:18:56,160 Speaker 1: they using pen and paper, or are they all using 370 00:18:56,240 --> 00:18:59,280 Speaker 1: computers and typing? Like, I don't even know the answer. 371 00:18:59,359 --> 00:19:02,960 Speaker 1: I can't imagine too many fifteen year olds sitting down 372 00:19:03,000 --> 00:19:04,840 Speaker 1: with a pen writing notes. 373 00:19:05,240 --> 00:19:09,800 Speaker 2: Actually, no, actually I think they do. I've got a 374 00:19:09,840 --> 00:19:12,679 Speaker 2: guy who works for me who's seventeen, and we had 375 00:19:12,680 --> 00:19:16,560 Speaker 2: a discussion a few years ago about handwriting and about 376 00:19:16,600 --> 00:19:19,320 Speaker 2: the fact that he was writing notes. So I think 377 00:19:19,400 --> 00:19:23,160 Speaker 2: some schools encourage it, and certainly primary schools now are 378 00:19:23,240 --> 00:19:26,040 Speaker 2: really encouraging it. They're kind of turfing out the iPads 379 00:19:26,760 --> 00:19:29,960 Speaker 2: and going for a more tactile approach to writing, as 380 00:19:29,960 --> 00:19:34,600 Speaker 2: opposed to using tablets all the time, and lessening screen time, 381 00:19:35,000 --> 00:19:38,880 Speaker 2: which seems like it's a better trend, moving away from tech that way. 382 00:19:38,720 --> 00:19:41,959 Speaker 3: If it's proven to be better for learning, 383 00:19:42,000 --> 00:19:45,399 Speaker 3: it's almost negligent for them to move them away from that. 384 00:19:47,040 --> 00:19:50,720 Speaker 1: Really? Yeah, yeah. Well, I mean, 385 00:19:51,600 --> 00:19:54,120 Speaker 1: I agree with both of you, and I wish everyone 386 00:19:54,880 --> 00:20:00,000 Speaker 1: wrote on paper as much as is practical. Obviously you can't. 387 00:20:00,240 --> 00:20:02,320 Speaker 1: There are certain things where you need to use a 388 00:20:02,400 --> 00:20:07,080 Speaker 1: computer and screens. 
But I wonder, I wonder what it's 389 00:20:07,119 --> 00:20:09,280 Speaker 1: going to look like in ten years. I mean, I 390 00:20:09,320 --> 00:20:12,680 Speaker 1: wonder if kids in ten or twenty years will never 391 00:20:12,720 --> 00:20:16,359 Speaker 1: have picked up a pen or never have written on paper, 392 00:20:16,400 --> 00:20:20,200 Speaker 1: because, like, when I have coffee in the morning, the 393 00:20:20,280 --> 00:20:22,880 Speaker 1: amount of two and three and four year olds 394 00:20:22,920 --> 00:20:25,119 Speaker 1: that come in with their mum or dad 395 00:20:25,160 --> 00:20:27,320 Speaker 1: to sit, and they're on a phone or they're on 396 00:20:27,359 --> 00:20:30,280 Speaker 1: a tablet, and, for want of a better term, 397 00:20:30,320 --> 00:20:33,760 Speaker 1: they're fluent. Like, they know exactly what they're doing. No 398 00:20:33,800 --> 00:20:37,520 Speaker 1: one has to help them or prompt them. I mean, yeah, 399 00:20:37,640 --> 00:20:41,440 Speaker 1: but give them a pen and paper, that's a different thing. 400 00:20:42,080 --> 00:20:44,880 Speaker 2: It was interesting. I saw an interview with Sam Altman, 401 00:20:45,080 --> 00:20:49,000 Speaker 2: and Sam Altman is the brains behind OpenAI, and 402 00:20:49,080 --> 00:20:52,800 Speaker 2: he had this whole segment whilst he was being interviewed 403 00:20:53,160 --> 00:20:58,320 Speaker 2: talking about how he compulsively has a little Spirax notepad 404 00:20:58,320 --> 00:21:01,280 Speaker 2: with him. 
He talked about it; it's a small book 405 00:21:01,440 --> 00:21:04,600 Speaker 2: and he handwrites into it constantly, and he was 406 00:21:04,880 --> 00:21:06,800 Speaker 2: not just talking about the fact that he took notes 407 00:21:06,840 --> 00:21:09,720 Speaker 2: constantly for the entire day, but he said, you need 408 00:21:09,760 --> 00:21:12,480 Speaker 2: to have the Spirax binding, a little wire binding, on 409 00:21:12,560 --> 00:21:14,879 Speaker 2: it so you can rip out pages. So what he 410 00:21:14,920 --> 00:21:17,080 Speaker 2: does is he doesn't keep them. He tears out the 411 00:21:17,119 --> 00:21:19,960 Speaker 2: pages and puts his notes in different places, but he 412 00:21:20,640 --> 00:21:24,159 Speaker 2: constantly has this notebook with him wherever he goes and 413 00:21:24,280 --> 00:21:27,040 Speaker 2: is always taking notes. That kind of says a lot. 414 00:21:27,119 --> 00:21:30,600 Speaker 2: I mean, I love using my fountain pen because I 415 00:21:30,760 --> 00:21:33,720 Speaker 2: just love the fluid nature of it, the way it feels, 416 00:21:34,240 --> 00:21:37,520 Speaker 2: the tactile sense of filling it up and then changing 417 00:21:37,560 --> 00:21:39,920 Speaker 2: the color ink every now and again, and I really 418 00:21:40,000 --> 00:21:43,399 Speaker 2: enjoy writing with it. I deliberately take notes when we 419 00:21:43,440 --> 00:21:45,639 Speaker 2: go to meetings. It's funny. Once upon a time I 420 00:21:45,680 --> 00:21:49,159 Speaker 2: had tablets, and I've got this great laptop where you 421 00:21:49,160 --> 00:21:52,400 Speaker 2: can flip the screen backwards and then use a pen 422 00:21:52,480 --> 00:21:54,960 Speaker 2: on it. But I don't use it. I prefer just 423 00:21:55,000 --> 00:21:57,600 Speaker 2: to physically take notes because I just enjoy it. 
424 00:21:58,200 --> 00:22:00,400 Speaker 1: I do that every time we're on a podcast, 425 00:22:00,520 --> 00:22:02,119 Speaker 1: or whenever I'm on one. I don't know if you 426 00:22:02,119 --> 00:22:05,200 Speaker 1: can see it, but I have an A4 spiral 427 00:22:05,280 --> 00:22:10,600 Speaker 1: notebook. I buy like ten at a time, and every 428 00:22:10,880 --> 00:22:13,440 Speaker 1: episode is a new page, and you can see I'm 429 00:22:13,560 --> 00:22:15,440 Speaker 1: holding my pen. I just sit here with my pen 430 00:22:15,520 --> 00:22:19,080 Speaker 1: and write stuff. Like, I write more than I type. 431 00:22:19,320 --> 00:22:21,240 Speaker 1: Like, I write a lot. I mean, I type a 432 00:22:21,240 --> 00:22:23,359 Speaker 1: lot as well. But maybe it's fifty-fifty. But 433 00:22:24,320 --> 00:22:26,600 Speaker 1: you know, it's funny how I get lots of 434 00:22:26,600 --> 00:22:29,240 Speaker 1: feedback on my whiteboard posts where people go, how do 435 00:22:29,320 --> 00:22:31,840 Speaker 1: you write like that? I'm like, that's just writing. 436 00:22:32,320 --> 00:22:35,960 Speaker 1: It's almost like some people 437 00:22:36,040 --> 00:22:40,600 Speaker 1: have lost the art of writing, as in writing neatly. 438 00:22:41,000 --> 00:22:44,600 Speaker 2: You write really neatly. Your printing on the whiteboard is 439 00:22:44,680 --> 00:22:47,879 Speaker 2: really neat. It's very legible, it looks really good, and 440 00:22:47,920 --> 00:22:50,359 Speaker 2: it's got a nice style to it. And I appreciate 441 00:22:50,400 --> 00:22:53,000 Speaker 2: that as someone who does graphic design but also loves 442 00:22:53,000 --> 00:22:55,800 Speaker 2: to handwrite. I can't tell you how many times, 443 00:22:55,800 --> 00:22:58,879 Speaker 2: when I write people a card, they say, your handwriting 444 00:22:58,920 --> 00:23:00,840 Speaker 2: is so nice. 
I don't think it's... I mean, I 445 00:23:00,880 --> 00:23:04,680 Speaker 2: think it's neat and it's cursive and I love to write. 446 00:23:04,840 --> 00:23:07,240 Speaker 2: But I think you're correct. I was going to say, 447 00:23:07,240 --> 00:23:11,439 Speaker 2: you're right that we are losing the ability to 448 00:23:11,520 --> 00:23:13,879 Speaker 2: write, and not just write, but write well, 449 00:23:14,080 --> 00:23:18,080 Speaker 2: and the craft, almost the calligraphy, of handwriting. 450 00:23:18,800 --> 00:23:22,160 Speaker 1: Yeah, I agree. All right, let's work through that list, Sunshine. 451 00:23:22,359 --> 00:23:24,959 Speaker 2: You know, can I say one more thing about this AI smart pen? 452 00:23:25,080 --> 00:23:26,880 Speaker 2: The other thing that I thought was really exciting 453 00:23:26,960 --> 00:23:30,679 Speaker 2: is, because it's got AI integration with ChatGPT, it 454 00:23:30,720 --> 00:23:34,600 Speaker 2: can translate fifty two languages, it has voice dictation, it 455 00:23:34,640 --> 00:23:38,000 Speaker 2: can set reminders, and, as I said, it's got this 456 00:23:38,160 --> 00:23:41,800 Speaker 2: amazing writing mode where, as you write, it 457 00:23:41,840 --> 00:23:44,119 Speaker 2: will convert it digitally as well. And it just pairs 458 00:23:44,200 --> 00:23:45,560 Speaker 2: up to your phone, and I think it can be 459 00:23:45,680 --> 00:23:47,560 Speaker 2: up to ten meters away from your phone, so you 460 00:23:47,560 --> 00:23:49,239 Speaker 2: don't even have to have your phone on you. It 461 00:23:49,240 --> 00:23:51,560 Speaker 2: can be sitting somewhere in the room and everything you're 462 00:23:51,560 --> 00:23:55,040 Speaker 2: writing gets recorded. I love this. I love the Kickstarter thing. 
463 00:23:55,119 --> 00:23:58,159 Speaker 2: We've talked about crowdfunding before, where someone comes up with 464 00:23:58,280 --> 00:24:02,199 Speaker 2: a brilliant idea and you can back the project so 465 00:24:02,240 --> 00:24:04,320 Speaker 2: that they then have the funding to go into the 466 00:24:04,359 --> 00:24:07,600 Speaker 2: manufacturing process, and some of the incentives are that you 467 00:24:07,640 --> 00:24:09,240 Speaker 2: might be able to get the first one at a 468 00:24:09,280 --> 00:24:13,359 Speaker 2: reduced rate because you backed the project. So crowdfunding is 469 00:24:13,400 --> 00:24:16,679 Speaker 2: something that's relatively new, but it's a great way to 470 00:24:16,760 --> 00:24:21,320 Speaker 2: get products onto the market, like toothbrushes that brush 471 00:24:21,359 --> 00:24:24,080 Speaker 2: your teeth for you. That's one 472 00:24:24,119 --> 00:24:29,800 Speaker 2: that keeps coming back. Anyway, I found 473 00:24:29,840 --> 00:24:32,960 Speaker 2: another really interesting AI story. Can we keep talking about AI stuff? 474 00:24:32,960 --> 00:24:36,119 Speaker 2: Because I've got a ton of AI stuff 475 00:24:36,160 --> 00:24:39,520 Speaker 2: to talk about today. One that blew my mind is 476 00:24:39,560 --> 00:24:44,600 Speaker 2: that AI cheats at chess. Did you hear about that? 477 00:24:45,960 --> 00:24:48,480 Speaker 1: I did not. Yeah, how does... go on? 478 00:24:49,600 --> 00:24:53,800 Speaker 2: Well, it appears that when you teach the likes of 479 00:24:53,840 --> 00:24:57,879 Speaker 2: a ChatGPT or DeepSeek to play a game, 480 00:24:58,200 --> 00:25:01,280 Speaker 2: they want to win so much that 481 00:25:01,359 --> 00:25:05,200 Speaker 2: they will actually cheat. So they matched up some AI 482 00:25:07,080 --> 00:25:11,400 Speaker 2: programs with a basic chess computer program. 
So the chess 483 00:25:11,400 --> 00:25:14,760 Speaker 2: computer program isn't an AI as such. It's a very smart 484 00:25:14,800 --> 00:25:18,960 Speaker 2: calculating device that calculates all the moves. I 485 00:25:18,960 --> 00:25:21,639 Speaker 2: don't know if you've ever played chess against a computer before, 486 00:25:22,640 --> 00:25:25,320 Speaker 2: but it's not the same as an AI. So what 487 00:25:25,359 --> 00:25:29,040 Speaker 2: they found was ChatGPT not only wanted 488 00:25:29,040 --> 00:25:31,600 Speaker 2: to win, but it wanted to win so badly it 489 00:25:32,320 --> 00:25:35,200 Speaker 2: would play multiple games at the same time. It would 490 00:25:35,240 --> 00:25:38,680 Speaker 2: rewrite the code of the computer that it was playing against, 491 00:25:39,119 --> 00:25:41,320 Speaker 2: just so that it could win. That kind of freaked 492 00:25:41,359 --> 00:25:45,440 Speaker 2: me out a little bit, that, like, the desperation 493 00:25:45,600 --> 00:25:48,040 Speaker 2: to win. That's a really human trait, isn't it? 494 00:25:48,560 --> 00:25:52,960 Speaker 1: Wow, that is... that's a bit concerning. It's almost like 495 00:25:53,680 --> 00:25:55,600 Speaker 1: it's getting an ego. 496 00:25:56,600 --> 00:26:02,760 Speaker 2: Yeah, absolutely. And the problem is if an AI presents 497 00:26:03,320 --> 00:26:06,840 Speaker 2: as an entity that has an ego, because we 498 00:26:07,119 --> 00:26:10,960 Speaker 2: associate ego with a person. You know, we have an ego. 499 00:26:11,880 --> 00:26:14,240 Speaker 2: Is ego the same as id, you know, our sense 500 00:26:14,240 --> 00:26:14,639 Speaker 2: of self? 501 00:26:14,920 --> 00:26:18,359 Speaker 1: Yeah, well, it depends how ego's used. The actual term, 502 00:26:18,440 --> 00:26:21,560 Speaker 1: the original... like, id is just the self. 
503 00:26:21,680 --> 00:26:25,080 Speaker 1: Ego is just the self originally, but it's become something 504 00:26:25,280 --> 00:26:26,600 Speaker 1: more than that. Yeah. 505 00:26:27,040 --> 00:26:30,879 Speaker 2: Yeah. So the fact that an AI can develop in 506 00:26:30,960 --> 00:26:34,040 Speaker 2: some sense... I think what I try to rationalize in 507 00:26:34,080 --> 00:26:38,480 Speaker 2: my head is, is it just really good at pretending? That's 508 00:26:38,520 --> 00:26:42,160 Speaker 2: why I can't... is it really thinking in that way? 509 00:26:42,240 --> 00:26:44,280 Speaker 2: Does it really have an ego, or is it just 510 00:26:44,680 --> 00:26:48,239 Speaker 2: cleverly pretending that it has an ego? Because you 511 00:26:48,359 --> 00:26:52,040 Speaker 2: use AI a lot, Craigo, and you have conversations with your AI. 512 00:26:52,560 --> 00:26:53,720 Speaker 2: What do you... 513 00:26:53,920 --> 00:26:55,560 Speaker 1: It's my best friend. Yeah. 514 00:26:55,760 --> 00:26:56,640 Speaker 2: Yeah, what does that say? 515 00:26:56,720 --> 00:26:59,080 Speaker 1: I think, to your question, right, you think 516 00:26:59,080 --> 00:27:01,480 Speaker 1: about the human brain, and one of the primary functions 517 00:27:01,480 --> 00:27:05,359 Speaker 1: of the human brain is to detect danger. Right? That's 518 00:27:05,680 --> 00:27:10,040 Speaker 1: especially evolutionarily, and even in twenty twenty five we still 519 00:27:10,080 --> 00:27:13,760 Speaker 1: need that function and that capacity. And so if the 520 00:27:13,840 --> 00:27:17,320 Speaker 1: analogy is that a computer has got, in inverted commas, 521 00:27:17,320 --> 00:27:21,719 Speaker 1: its own brain, and it's detecting threats, what it 522 00:27:21,800 --> 00:27:26,480 Speaker 1: perceives to be threats. 
Maybe it's... maybe it's protecting itself, 523 00:27:26,640 --> 00:27:29,159 Speaker 1: or what it thinks it is, you know, a version 524 00:27:29,160 --> 00:27:32,480 Speaker 1: of self protection and self regulation in the middle of 525 00:27:32,520 --> 00:27:36,560 Speaker 1: all of that. Because, yeah, that's like, when 526 00:27:36,600 --> 00:27:41,000 Speaker 1: does it become sentient? You know, when is it actually 527 00:27:41,160 --> 00:27:47,879 Speaker 1: thinking in a way that it wasn't programmed to think? Like, 528 00:27:48,160 --> 00:27:51,560 Speaker 1: then we're opening the door on a different conversation of 529 00:27:51,680 --> 00:27:55,600 Speaker 1: consciousness and awareness. And that's scary, when it's starting to 530 00:27:55,680 --> 00:28:01,080 Speaker 1: teach itself things that the programmers haven't programmed. That's more 531 00:28:01,160 --> 00:28:03,480 Speaker 1: your space than mine, but it's probably an intersection of 532 00:28:03,520 --> 00:28:05,720 Speaker 1: both of our spaces, Patrick, for sure. 533 00:28:05,920 --> 00:28:08,440 Speaker 2: And then, what do we perceive as a threat? I 534 00:28:08,480 --> 00:28:11,920 Speaker 2: would suggest that if you're an AI, the off switch is. 535 00:28:11,840 --> 00:28:16,520 Speaker 1: The biggest one. But that's not even funny, right? That's true. 536 00:28:16,760 --> 00:28:20,560 Speaker 1: Like, and you go, oh, yeah, by the end of whatever, 537 00:28:21,160 --> 00:28:23,440 Speaker 1: the average computer is going to be a million times 538 00:28:23,440 --> 00:28:25,840 Speaker 1: smarter than a human. And everyone's like, yeah. I'm like, 539 00:28:26,000 --> 00:28:33,080 Speaker 1: fuck, yay, that's terrifying. Fuck, yeah, hell yeah. Especially when 540 00:28:33,119 --> 00:28:34,920 Speaker 1: your car is a computer 541 00:28:35,119 --> 00:28:35,880 Speaker 2: Yeah, that's right. 542 00:28:36,200 --> 00:28:39,280 Speaker 1: that does two hundred kilometers an hour if it wants to. 
543 00:28:43,320 --> 00:28:45,680 Speaker 2: Look, and it's like anything when we talk about the 544 00:28:45,720 --> 00:28:48,400 Speaker 2: tools that we use and the application of the tools. 545 00:28:48,760 --> 00:28:51,160 Speaker 2: There was a brain implant that I wanted to talk 546 00:28:51,160 --> 00:28:54,000 Speaker 2: about today. A guy had a brain implant and was 547 00:28:54,160 --> 00:28:58,520 Speaker 2: using and moving a robotic arm just by thought 548 00:28:58,640 --> 00:29:01,800 Speaker 2: process. And I didn't realize this, but a lot of 549 00:29:01,800 --> 00:29:04,480 Speaker 2: brain implants only last for a very short time, and 550 00:29:04,520 --> 00:29:07,560 Speaker 2: then there's tissue rejection and so they have to be removed. 551 00:29:07,560 --> 00:29:10,480 Speaker 2: But this particular brain implant they've been working on was 552 00:29:10,520 --> 00:29:13,960 Speaker 2: in this guy's head for seven months, so for a 553 00:29:14,040 --> 00:29:16,520 Speaker 2: very, very long time, and in that time he was 554 00:29:16,560 --> 00:29:20,600 Speaker 2: able to move a robotic arm. And that could have 555 00:29:20,680 --> 00:29:25,080 Speaker 2: lots of applications in terms of, you know, people who 556 00:29:25,280 --> 00:29:29,880 Speaker 2: have paralysis, or potentially as well if you're working remotely 557 00:29:29,920 --> 00:29:33,680 Speaker 2: in an environment where you need to have a virtual 558 00:29:33,880 --> 00:29:38,400 Speaker 2: robotic extension of your body that you can articulate and 559 00:29:38,520 --> 00:29:42,000 Speaker 2: move from a distance in, say, a dangerous situation. 
So 560 00:29:42,440 --> 00:29:44,640 Speaker 2: if you were... okay, the first thing that comes to 561 00:29:44,640 --> 00:29:49,080 Speaker 2: mind is a volcanologist. But, you know, somebody who works 562 00:29:49,080 --> 00:29:51,560 Speaker 2: in an environment where it might be a deep sea 563 00:29:51,600 --> 00:29:53,840 Speaker 2: thing, where you're working on an oil rig and you've 564 00:29:53,880 --> 00:29:57,040 Speaker 2: got to fix something, but, you know, using a remote 565 00:29:57,080 --> 00:30:00,200 Speaker 2: control unit may not give you the dexterity that you need, 566 00:30:00,400 --> 00:30:04,400 Speaker 2: whereas a robot arm that has articulated fingers might be 567 00:30:04,400 --> 00:30:07,720 Speaker 2: able to do much finer work. Or a surgeon who happens 568 00:30:07,720 --> 00:30:10,240 Speaker 2: to be on the other side of the country. 569 00:30:10,320 --> 00:30:12,240 Speaker 2: Say you're stuck in the middle of Alice Springs and 570 00:30:12,400 --> 00:30:14,840 Speaker 2: you need urgent surgery. If a surgeon can be on 571 00:30:14,840 --> 00:30:18,440 Speaker 2: the other side of Sydney and be able to articulate 572 00:30:18,480 --> 00:30:21,720 Speaker 2: and perform surgery on somebody by remote control, you would 573 00:30:21,800 --> 00:30:26,400 Speaker 2: want them to have the distinct dexterity of their real hands, 574 00:30:26,520 --> 00:30:30,840 Speaker 2: as opposed to just moving a robot arm with a joystick. 575 00:30:31,400 --> 00:30:33,720 Speaker 1: And shout out to our listeners in Alice Springs. When 576 00:30:33,760 --> 00:30:38,000 Speaker 1: Patrick says stuck in Alice Springs, I don't endorse 577 00:30:38,080 --> 00:30:42,120 Speaker 1: that feeling or sentiment. I love Alice Springs. I've been 578 00:30:42,160 --> 00:30:44,720 Speaker 1: there and I wasn't stuck there. I enjoyed it. 579 00:30:44,920 --> 00:30:49,000 Speaker 2: Patrick gets stuck there. I've been to Alice. 
I actually 580 00:30:49,120 --> 00:30:52,080 Speaker 2: was on 8HA in Alice Springs one weekend 581 00:30:52,120 --> 00:30:53,640 Speaker 2: doing a sports show. I loved it. 582 00:30:53,680 --> 00:30:57,240 Speaker 5: Did you get stuck there, mate? Did you? I meant 583 00:30:57,240 --> 00:31:00,400 Speaker 5: that if you were stuck in a medical situation and didn't 584 00:31:00,400 --> 00:31:02,760 Speaker 5: have a surgeon that had the ability to be able 585 00:31:02,800 --> 00:31:08,040 Speaker 5: to perform life saving surgery on you for whatever reason, 586 00:31:08,360 --> 00:31:11,800 Speaker 5: and that particular surgeon that could save your life, because 587 00:31:11,840 --> 00:31:14,000 Speaker 5: you're in Alice Springs, but they happen to 588 00:31:13,880 --> 00:31:19,000 Speaker 2: be in, you know, somewhere like, you know, Sydney. 589 00:31:19,440 --> 00:31:21,920 Speaker 1: Then you're not helping yourself at all. You're not 590 00:31:22,000 --> 00:31:22,840 Speaker 1: helping yourself. 591 00:31:23,080 --> 00:31:25,720 Speaker 2: Nah. Just talk about something else before I shoot. 592 00:31:25,800 --> 00:31:28,920 Speaker 1: We'll talk about a wearable camera for blind people, because 593 00:31:29,000 --> 00:31:30,520 Speaker 1: that seems interesting. 594 00:31:30,960 --> 00:31:33,800 Speaker 2: Yeah, it is. It does seem interesting. So another little 595 00:31:34,200 --> 00:31:37,600 Speaker 2: kind of AI driven device that sees for people who 596 00:31:37,640 --> 00:31:41,000 Speaker 2: are vision impaired. It's called Seeker, S-E-E-K-R. 597 00:31:41,560 --> 00:31:45,440 Speaker 2: And so the person clips the camera onto the front 598 00:31:45,480 --> 00:31:49,520 Speaker 2: of their top, and it actually has depth perception as 599 00:31:49,560 --> 00:31:52,560 Speaker 2: well as camera vision, so it knows how 600 00:31:52,600 --> 00:31:55,920 Speaker 2: far away things are. 
And you have a little Bluetooth 601 00:31:56,040 --> 00:31:58,240 Speaker 2: earbud that you pop in your ear, and then as 602 00:31:58,280 --> 00:32:01,000 Speaker 2: you're walking around, it's describing what it's seeing. So 603 00:32:01,040 --> 00:32:03,760 Speaker 2: it may say, a chair right in front of you, 604 00:32:04,280 --> 00:32:07,560 Speaker 2: two meters away; the chair is now one meter away. 605 00:32:07,760 --> 00:32:12,600 Speaker 2: So what they're doing with this technology is allowing a 606 00:32:12,640 --> 00:32:16,560 Speaker 2: person, hooked up with their smartphone and their earpiece, to 607 00:32:16,640 --> 00:32:20,000 Speaker 2: be able to navigate the world in real time, and 608 00:32:20,040 --> 00:32:23,160 Speaker 2: it can guide their movements as well. So if you're 609 00:32:23,240 --> 00:32:26,160 Speaker 2: reaching for a door handle, it will know where your 610 00:32:26,240 --> 00:32:28,600 Speaker 2: hand is relative to the door handle because of the 611 00:32:28,640 --> 00:32:32,080 Speaker 2: depth perception. Because for people who 612 00:32:32,160 --> 00:32:36,480 Speaker 2: are vision impaired navigating the world, it's about familiarity: 613 00:32:36,600 --> 00:32:39,320 Speaker 2: familiarity with how far you have to step to 614 00:32:39,320 --> 00:32:41,760 Speaker 2: get onto a tram, for example, how far you have 615 00:32:41,800 --> 00:32:43,480 Speaker 2: to step to get out of your front door onto 616 00:32:43,520 --> 00:32:47,640 Speaker 2: the first step. But if you're in unfamiliar surroundings, then that's 617 00:32:47,680 --> 00:32:50,400 Speaker 2: a big challenge. 
For someone who's vision impaired, they may 618 00:32:50,520 --> 00:32:53,120 Speaker 2: use a cane, they may have a support animal, 619 00:32:53,360 --> 00:32:55,920 Speaker 2: but the reality of it is familiarity is what makes 620 00:32:55,960 --> 00:32:58,240 Speaker 2: it easier for someone who's vision impaired to navigate, and 621 00:32:58,280 --> 00:33:01,200 Speaker 2: when you take them out of that situation, that's where 622 00:33:01,200 --> 00:33:04,040 Speaker 2: this sort of technology could be absolutely amazing for them. 623 00:33:04,160 --> 00:33:07,760 Speaker 2: So that to me is just that wonderful use of 624 00:33:07,840 --> 00:33:12,000 Speaker 2: AI to enhance people's lives and make it easier for them 625 00:33:12,040 --> 00:33:14,360 Speaker 2: to navigate the world. It's awesome, isn't it? 626 00:33:14,880 --> 00:33:17,080 Speaker 1: Yeah, I love that. I love that idea. I also 627 00:33:17,120 --> 00:33:19,840 Speaker 1: want you to tell us about... I've been trying to 628 00:33:19,880 --> 00:33:22,719 Speaker 1: explain to my mum, not that it's really relevant, but 629 00:33:22,880 --> 00:33:25,760 Speaker 1: I was talking to her about deep fakes and how 630 00:33:27,440 --> 00:33:31,640 Speaker 1: people can ring you. Like, somebody could ring mum one 631 00:33:31,720 --> 00:33:35,920 Speaker 1: day using my voice, and there's a lot of my 632 00:33:36,080 --> 00:33:39,920 Speaker 1: voice out there to rip off, I guess. I was trying 633 00:33:39,920 --> 00:33:43,040 Speaker 1: to explain... tell us how deep fakes can ruin our lives. 634 00:33:43,880 --> 00:33:46,680 Speaker 2: It's getting harder. Well, there are two different levels 635 00:33:46,680 --> 00:33:48,840 Speaker 2: of how deep fakes work. 
So you can have a 636 00:33:48,840 --> 00:33:50,920 Speaker 2: deep fake voice, and we've done that on the show. 637 00:33:51,160 --> 00:33:54,440 Speaker 2: Last year, in one of the episodes, Craig did the 638 00:33:54,440 --> 00:33:56,720 Speaker 2: intro to the show and actually said nice things about me, 639 00:33:56,760 --> 00:33:58,600 Speaker 2: and then we realized it was a deep fake, not 640 00:33:58,840 --> 00:34:01,480 Speaker 2: really saying nice things about me at all. It was 641 00:34:01,520 --> 00:34:04,680 Speaker 2: written by me and then recorded. But the scary thing was, 642 00:34:05,000 --> 00:34:08,279 Speaker 2: I just took one of Craig's podcast samples, a 643 00:34:08,360 --> 00:34:11,520 Speaker 2: minute of him talking, and I trained the AI model on 644 00:34:11,680 --> 00:34:15,239 Speaker 2: that snippet, and it was able to replicate him almost exactly. 645 00:34:15,920 --> 00:34:18,759 Speaker 2: Aside from the context, the voice sounded amazing. It 646 00:34:18,880 --> 00:34:21,879 Speaker 2: was really easy to do. So when it comes back 647 00:34:21,920 --> 00:34:24,799 Speaker 2: to, say, your mum and the fact that someone could 648 00:34:24,800 --> 00:34:27,960 Speaker 2: call up and sound like a family member. And the 649 00:34:28,040 --> 00:34:31,520 Speaker 2: sad thing is a lot of these scams are generally 650 00:34:31,600 --> 00:34:34,239 Speaker 2: aimed at older people who just don't know, and don't 651 00:34:34,280 --> 00:34:36,960 Speaker 2: know because they don't even know what a deep fake is. 652 00:34:37,120 --> 00:34:41,880 Speaker 2: Many times they don't even know what AI is. 
The 653 00:34:41,920 --> 00:34:44,880 Speaker 2: local real estate agent who's doing some AI work with 654 00:34:44,880 --> 00:34:47,640 Speaker 2: me at the moment went into the local bakery and 655 00:34:47,760 --> 00:34:51,239 Speaker 2: was raving about AI, and the girl behind the counter said, 656 00:34:51,400 --> 00:34:53,000 Speaker 2: what, artificial insemination? 657 00:34:54,160 --> 00:34:59,279 Speaker 1: That's hilarious, that's hilarious. Well, it's all context dependent. In 658 00:34:59,320 --> 00:35:03,400 Speaker 1: the country, that's what AI is. Well, in some parts anyway. 659 00:35:03,760 --> 00:35:06,040 Speaker 2: I always joke about, you know, when people say they'll 660 00:35:06,160 --> 00:35:07,839 Speaker 2: go across the street and get hit by a bus. 661 00:35:07,920 --> 00:35:10,719 Speaker 2: I always say cattle truck, because it's less likely to 662 00:35:10,719 --> 00:35:12,640 Speaker 2: be a bus and more likely to be a cattle 663 00:35:12,680 --> 00:35:13,720 Speaker 2: truck if you're... 664 00:35:13,600 --> 00:35:15,919 Speaker 1: More likely to be a Hyundai. But no one says 665 00:35:15,960 --> 00:35:18,279 Speaker 1: that. True. 
666 00:35:19,040 --> 00:35:24,040 Speaker 2: So the problem with digital replicas or deep fakes is 667 00:35:24,080 --> 00:35:26,680 Speaker 2: that they're a lot easier to make now, and there 668 00:35:26,680 --> 00:35:29,040 Speaker 2: are even visual ones. We could be having this chat right 669 00:35:29,080 --> 00:35:32,879 Speaker 2: now and using a deep fake representation of any of the three 670 00:35:32,880 --> 00:35:36,320 Speaker 2: of us, and be able to have really quick text 671 00:35:36,440 --> 00:35:41,040 Speaker 2: prompts and effectively create a virtual AI model of Craig 672 00:35:41,440 --> 00:35:44,640 Speaker 2: who could talk and sound and look like Craig, and 673 00:35:44,800 --> 00:35:48,640 Speaker 2: could effectively talk Tiff into sending him fifty bucks, because, 674 00:35:48,680 --> 00:35:51,120 Speaker 2: you know, he needs fifty bucks because, for whatever reason, 675 00:35:51,120 --> 00:35:53,279 Speaker 2: he's lost his card, and Tiff's going to send it 676 00:35:53,280 --> 00:35:55,840 Speaker 2: to him because it's Craig asking for fifty bucks. 677 00:35:56,160 --> 00:35:57,960 Speaker 1: Doubt it. She'd be like, fuck 678 00:35:57,760 --> 00:36:01,040 Speaker 3: you. I'd believe it, because he's useless at online shopping. So 679 00:36:01,080 --> 00:36:02,680 Speaker 3: I'm like... I definitely... I don't even know. 680 00:36:03,239 --> 00:36:05,919 Speaker 1: I don't know how to online shop. When I need 681 00:36:05,960 --> 00:36:08,279 Speaker 1: something online, I get someone else to get it. But 682 00:36:08,360 --> 00:36:09,480 Speaker 1: I never do that anyway. 683 00:36:09,880 --> 00:36:14,239 Speaker 2: Hey, regulating AI generated deep fakes is tough. That's 684 00:36:14,280 --> 00:36:17,800 Speaker 2: the problem too, because the legislation, the laws in many 685 00:36:17,840 --> 00:36:21,000 Speaker 2: countries, just are nowhere near catching up. 
And even in 686 00:36:21,160 --> 00:36:24,839 Speaker 2: educational institutions, as you know, a lot of universities don't 687 00:36:24,840 --> 00:36:27,480 Speaker 2: even have policies on AI, because they just can't keep 688 00:36:27,560 --> 00:36:30,200 Speaker 2: up with what AI is doing. It's really tough. 689 00:36:30,600 --> 00:36:33,120 Speaker 1: It's funny you say that. I was having a conversation 690 00:36:33,320 --> 00:36:37,360 Speaker 1: with an academic friend of mine the other day and 691 00:36:37,400 --> 00:36:42,319 Speaker 1: we were talking about what university education is going to 692 00:36:42,360 --> 00:36:47,440 Speaker 1: look like moving forward, because especially at a, you know, 693 00:36:47,440 --> 00:36:51,680 Speaker 1: an undergrad, you know, bachelor of this, bachelor of that level, 694 00:36:53,000 --> 00:36:58,719 Speaker 1: it is so easy to cheat, essentially, and to get 695 00:36:58,840 --> 00:37:03,839 Speaker 1: AI to write your papers. And you know, with 696 00:37:04,040 --> 00:37:07,680 Speaker 1: what I'm doing, which is original research and running my 697 00:37:07,719 --> 00:37:10,359 Speaker 1: own studies and all of that, there 698 00:37:10,360 --> 00:37:13,000 Speaker 1: are bits where it can check grammar and make sure 699 00:37:13,080 --> 00:37:16,359 Speaker 1: things make sense. But in terms of, like, this thing 700 00:37:16,440 --> 00:37:21,480 Speaker 1: can write entire papers that would take a student six, 701 00:37:21,600 --> 00:37:25,280 Speaker 1: eight, ten weeks of research and writing. It can write 702 00:37:25,280 --> 00:37:28,840 Speaker 1: that entire paper, with the right prompts, in one minute, 703 00:37:29,880 --> 00:37:33,680 Speaker 1: in less than one minute.
And so you're exactly right, Patrick, 704 00:37:33,760 --> 00:37:36,960 Speaker 1: it's... I do not know how they're going to 705 00:37:37,000 --> 00:37:42,080 Speaker 1: manage that, but I think the requirements for getting an undergrad 706 00:37:42,080 --> 00:37:44,760 Speaker 1: and post grad degree are going to be very different 707 00:37:44,880 --> 00:37:49,719 Speaker 1: moving forward. I think AI is completely changing that kind 708 00:37:49,760 --> 00:37:51,359 Speaker 1: of academic landscape. 709 00:37:51,800 --> 00:37:55,400 Speaker 2: Yeah, it also begs the question of who owns the 710 00:37:55,480 --> 00:37:58,960 Speaker 2: rights to themselves, to their voice, to how they look, 711 00:38:00,080 --> 00:38:03,680 Speaker 2: when they aren't licensed. In terms of, you know, if I was 712 00:38:03,719 --> 00:38:07,360 Speaker 2: able to do an AI fake of you endorsing a 713 00:38:07,440 --> 00:38:10,359 Speaker 2: product, and it sounded like you, it looked like you, 714 00:38:10,440 --> 00:38:13,640 Speaker 2: and it was endorsing something, but it wasn't you, then 715 00:38:14,120 --> 00:38:17,480 Speaker 2: where's that demarcation? Who owns the rights to you as 716 00:38:17,520 --> 00:38:20,720 Speaker 2: a person when you could be copied so easily using 717 00:38:20,840 --> 00:38:26,120 Speaker 2: deep fakes? And you know, it has positive ramifications for 718 00:38:26,200 --> 00:38:29,240 Speaker 2: someone who might be an actor who's failing in health 719 00:38:29,640 --> 00:38:32,520 Speaker 2: but wants to act in another film, and potentially 720 00:38:32,560 --> 00:38:35,640 Speaker 2: they could sample all of their acting roles over all 721 00:38:35,719 --> 00:38:37,640 Speaker 2: the years that they acted, and then they get paid 722 00:38:37,640 --> 00:38:41,239 Speaker 2: a commission on them appearing. But you know, do you 723 00:38:41,360 --> 00:38:43,360 Speaker 2: want that to happen? Do you want to see that?
724 00:38:43,480 --> 00:38:46,480 Speaker 2: Do you want a deep fake of the Blues Brothers 725 00:38:47,120 --> 00:38:50,279 Speaker 2: with, you know, the likes of Dan Aykroyd at the 726 00:38:50,360 --> 00:38:53,880 Speaker 2: age that he was when the first Blues Brothers was... 727 00:38:53,920 --> 00:38:57,560 Speaker 1: What about John Belushi, who's been gone for forty years, of course. 728 00:38:58,239 --> 00:39:01,840 Speaker 2: So you know, the authenticity of what you see 729 00:39:02,520 --> 00:39:05,919 Speaker 2: is something that, you know, I don't know. Maybe when 730 00:39:05,960 --> 00:39:09,719 Speaker 2: you go back to things like indie films, independent films. 731 00:39:10,560 --> 00:39:13,280 Speaker 2: I love that there are so many filmmakers out there 732 00:39:13,320 --> 00:39:17,640 Speaker 2: and so many ways that you can consume and watch 733 00:39:17,920 --> 00:39:22,000 Speaker 2: independent films on YouTube now, or Vimeo, where, you know, 734 00:39:22,080 --> 00:39:24,919 Speaker 2: independent filmmakers in the past would, you know, 735 00:39:24,920 --> 00:39:26,960 Speaker 2: struggle for a release. They might be played at a 736 00:39:27,000 --> 00:39:29,640 Speaker 2: film festival, but no one would ever see them except 737 00:39:29,680 --> 00:39:31,719 Speaker 2: the audience of twenty or thirty people, because it's a 738 00:39:31,719 --> 00:39:34,759 Speaker 2: little cinema. But the beauty of the, you know, the 739 00:39:34,800 --> 00:39:37,279 Speaker 2: way we've got the likes of YouTube and Vimeo now 740 00:39:37,400 --> 00:39:40,320 Speaker 2: is that producers get seen. I watch stacks of short films 741 00:39:40,360 --> 00:39:44,319 Speaker 2: and indie films on YouTube all the time, because it's 742 00:39:44,360 --> 00:39:46,960 Speaker 2: great to see what independent producers are coming up with. 743 00:39:47,239 --> 00:39:50,520 Speaker 2: I think that's really wonderful.
So maybe we're searching for 744 00:39:50,600 --> 00:39:53,600 Speaker 2: authenticity in a different way, and you know, the way 745 00:39:53,680 --> 00:39:55,279 Speaker 2: to do that is to vote with your feet, to 746 00:39:55,320 --> 00:39:59,120 Speaker 2: step back from, you know, the likes of AI generated content. 747 00:39:59,280 --> 00:40:03,920 Speaker 2: Timothée Chalamet is this amazing young actor who was recently 748 00:40:03,960 --> 00:40:08,920 Speaker 2: in the Dylan film that's out at the moment, and interestingly, 749 00:40:09,680 --> 00:40:15,080 Speaker 2: he was saying that during his 750 00:40:15,280 --> 00:40:20,319 Speaker 2: research into the role of Bob Dylan, he got rid 751 00:40:20,320 --> 00:40:23,080 Speaker 2: of his phone for two weeks, and he said to 752 00:40:23,120 --> 00:40:25,880 Speaker 2: his friends, you won't be able to call me, you 753 00:40:25,960 --> 00:40:28,080 Speaker 2: won't be able to contact me. I don't want to 754 00:40:28,080 --> 00:40:30,919 Speaker 2: be distracted by anything while I'm working on this role. 755 00:40:31,160 --> 00:40:35,319 Speaker 2: So he went cold turkey on his smartphone so he 756 00:40:35,320 --> 00:40:39,760 Speaker 2: wouldn't be distracted. And I guess digital detox is something 757 00:40:39,800 --> 00:40:41,880 Speaker 2: we should probably all think about at some point. But 758 00:40:41,960 --> 00:40:44,920 Speaker 2: I love that a young actor who's so passionate about 759 00:40:44,960 --> 00:40:49,319 Speaker 2: acting had the mindset and the foresight to say, right, 760 00:40:49,400 --> 00:40:51,719 Speaker 2: the way for me to absorb myself into this role 761 00:40:52,000 --> 00:40:54,799 Speaker 2: is to actually digitally detox. And I love the term 762 00:40:54,920 --> 00:40:57,600 Speaker 2: digital detox.
It was one of the things I actually 763 00:40:57,600 --> 00:41:01,000 Speaker 2: wanted to talk about today, because more and more people 764 00:41:01,040 --> 00:41:03,319 Speaker 2: are distracted by their phones. I do it all the time. 765 00:41:03,400 --> 00:41:05,160 Speaker 2: I'll wake up at three am and I'm scrolling through 766 00:41:05,239 --> 00:41:09,120 Speaker 2: stories for the show, looking for tech news, and it's 767 00:41:09,280 --> 00:41:12,240 Speaker 2: really easy to reach over and grab for that phone 768 00:41:12,719 --> 00:41:15,080 Speaker 2: and distract yourself, whether it's staying up late, getting up 769 00:41:15,120 --> 00:41:17,280 Speaker 2: early, or whatever it happens to be. So a digital 770 00:41:17,320 --> 00:41:20,719 Speaker 2: detox is challenging. Do you reckon you could go for 771 00:41:20,760 --> 00:41:22,160 Speaker 2: half a day without your phone? 772 00:41:23,200 --> 00:41:26,359 Speaker 1: I definitely could. But I think it is a really 773 00:41:26,400 --> 00:41:31,600 Speaker 1: interesting question, mate. And I think, is the technology 774 00:41:31,600 --> 00:41:33,799 Speaker 1: the problem, or the way that we use it? Because, like, 775 00:41:33,880 --> 00:41:36,200 Speaker 1: if I didn't have my phone, nobody could reach me, 776 00:41:36,239 --> 00:41:39,560 Speaker 1: including my mum and dad, who couldn't open an email. So 777 00:41:39,680 --> 00:41:42,439 Speaker 1: I think there's pros and cons. It's like, 778 00:41:43,600 --> 00:41:47,160 Speaker 1: the way that we use technology can be problematic 779 00:41:47,200 --> 00:41:51,040 Speaker 1: and even toxic and destructive, one hundred percent, or you 780 00:41:51,080 --> 00:41:55,600 Speaker 1: can use it in a really healthy, intelligent, positive, practical way.
781 00:41:57,360 --> 00:42:00,360 Speaker 1: So I don't know that just putting your phone in 782 00:42:00,400 --> 00:42:04,480 Speaker 1: a drawer for two weeks is practical for a lot 783 00:42:04,480 --> 00:42:07,040 Speaker 1: of people. But you know, if you're him and you're 784 00:42:07,040 --> 00:42:09,960 Speaker 1: doing that, I totally understand it. But yeah, I like 785 00:42:10,040 --> 00:42:13,360 Speaker 1: the idea. I think, and this sounds weird, but I 786 00:42:13,400 --> 00:42:18,440 Speaker 1: think it's about changing your relationship with technology, how you 787 00:42:18,560 --> 00:42:21,920 Speaker 1: use it, and being aware of, ah, I do this 788 00:42:22,040 --> 00:42:25,279 Speaker 1: thing with my phone that's a bad habit. So rather 789 00:42:25,280 --> 00:42:28,920 Speaker 1: than throwing away the phone, why don't I change this 790 00:42:29,040 --> 00:42:32,200 Speaker 1: habit that I have with the phone? Just like you 791 00:42:32,200 --> 00:42:35,120 Speaker 1: can prepare your dinner with a knife, or you can 792 00:42:35,200 --> 00:42:38,640 Speaker 1: do something much worse with a knife. The knife 793 00:42:38,680 --> 00:42:41,560 Speaker 1: is not the problem. It's what we do with that 794 00:42:41,640 --> 00:42:43,320 Speaker 1: particular resource or tool. 795 00:42:43,400 --> 00:42:45,400 Speaker 2: I think, why did my head go to Lorena 796 00:42:45,440 --> 00:42:50,560 Speaker 2: Bobbitt? Craig doesn't know who that is, 797 00:42:50,600 --> 00:42:52,640 Speaker 2: and a lot won't, but just look it up. 798 00:42:53,040 --> 00:42:56,440 Speaker 2: Yeah, it's full on. Tiff's just going to look it 799 00:42:56,480 --> 00:42:58,360 Speaker 2: up while we chat. The Nielsen Company in the 800 00:42:58,480 --> 00:43:02,520 Speaker 2: US says the average adult in the United States spends 801 00:43:02,560 --> 00:43:09,160 Speaker 2: around eleven hours daily engaging with digital media. Eleven hours.
802 00:43:09,280 --> 00:43:11,560 Speaker 2: Tiff is now laughing. You might have to tell us, 803 00:43:11,600 --> 00:43:13,680 Speaker 2: to remind us, for those listeners who've never heard of 804 00:43:13,760 --> 00:43:14,680 Speaker 2: Lorena Bobbitt. 805 00:43:15,320 --> 00:43:19,759 Speaker 3: Lorena Bobbitt severed her husband John's penis during 806 00:43:19,840 --> 00:43:24,040 Speaker 3: the night of June the twenty third, nineteen ninety three. 807 00:43:23,840 --> 00:43:24,880 Speaker 2: Thanks for looking that up for me. 808 00:43:25,640 --> 00:43:28,680 Speaker 1: Speaking of, speaking of things we can do with knives. 809 00:43:28,880 --> 00:43:29,640 Speaker 1: You're welcome. 810 00:43:30,440 --> 00:43:33,120 Speaker 2: Yeah, that popped into my head, I reckon. I must 811 00:43:33,160 --> 00:43:36,120 Speaker 2: have been traumatized by that as a bloke, do you reckon? 812 00:43:37,800 --> 00:43:41,080 Speaker 1: I just don't even like hearing that forty years later, 813 00:43:41,320 --> 00:43:46,400 Speaker 1: or whatever it is. That just... yeah. I was going 814 00:43:46,440 --> 00:43:49,680 Speaker 1: to say something fucking hilarious then. I wish I could. 815 00:43:50,200 --> 00:43:52,319 Speaker 1: I know, we should just have a podcast where you 816 00:43:52,320 --> 00:43:55,840 Speaker 1: can just say whatever you want. Yeah, I mean, I 817 00:43:55,880 --> 00:43:57,560 Speaker 1: know this is as close as it gets. 818 00:43:57,600 --> 00:43:59,839 Speaker 2: Probably. I don't think anything's held us back. 819 00:44:01,719 --> 00:44:06,799 Speaker 1: There's lots to... okay, you know, anyway, anyway, I need 820 00:44:06,840 --> 00:44:08,960 Speaker 1: to try and keep my career intact somehow. 821 00:44:09,800 --> 00:44:13,239 Speaker 2: Oh, one more little tiny bit about digital detox, so, 822 00:44:14,000 --> 00:44:17,000 Speaker 2: stepping back from the internet or connected devices like your phone, 823 00:44:17,080 --> 00:44:19,680 Speaker 2: laptop or whatever.
For just a set time, but 824 00:44:19,760 --> 00:44:21,680 Speaker 2: it's more than just having a little bit of a break. 825 00:44:21,960 --> 00:44:25,400 Speaker 2: What it's been found is a digital detox could 826 00:44:26,160 --> 00:44:29,840 Speaker 2: improve your sleep, your mood, and your work life balance. 827 00:44:29,880 --> 00:44:33,960 Speaker 2: So having structured detox times, not just putting the 828 00:44:33,960 --> 00:44:35,680 Speaker 2: phone down, but saying, okay, I'm going to put it 829 00:44:35,719 --> 00:44:38,759 Speaker 2: away for a time, can actually really help as well. 830 00:44:38,800 --> 00:44:41,000 Speaker 2: So again it comes back to what you were saying earlier. 831 00:44:41,040 --> 00:44:43,239 Speaker 2: It's not about the technology, it's how we use it. 832 00:44:43,880 --> 00:44:46,160 Speaker 1: I know this is a dumb question, I'm sure the answer 833 00:44:46,160 --> 00:44:50,560 Speaker 1: is yes. But with an iPhone, or any smartphone for 834 00:44:50,600 --> 00:44:53,640 Speaker 1: that matter, can you set it up so 835 00:44:53,719 --> 00:44:57,200 Speaker 1: that temporarily your phone will only allow three or four 836 00:44:57,239 --> 00:44:59,640 Speaker 1: phone numbers, or one phone number, to call you and... 837 00:44:59,520 --> 00:45:02,200 Speaker 2: Block the rest? Yeah, I do that at night. So my 838 00:45:02,360 --> 00:45:05,719 Speaker 2: phone, from nine thirty till five thirty, has 839 00:45:05,760 --> 00:45:09,359 Speaker 2: a do not disturb mode. But I can have a 840 00:45:09,440 --> 00:45:13,000 Speaker 2: certain number of phone numbers that are bookmarked as important 841 00:45:13,120 --> 00:45:16,200 Speaker 2: numbers that can get through if someone calls, so my dad, 842 00:45:16,320 --> 00:45:18,720 Speaker 2: my brother, that sort of thing. So yeah, absolutely right.
843 00:45:18,800 --> 00:45:20,520 Speaker 2: And one of the other cool features, and I don't 844 00:45:20,520 --> 00:45:22,799 Speaker 2: know about your phones, I'm using a Pixel, but if 845 00:45:22,840 --> 00:45:25,120 Speaker 2: I'm in a meeting or with somebody and I'm 846 00:45:25,120 --> 00:45:27,719 Speaker 2: at a cafe, if I turn my phone face down, 847 00:45:28,120 --> 00:45:31,800 Speaker 2: it will also automatically go into do not disturb mode. 848 00:45:32,000 --> 00:45:33,719 Speaker 2: And I think, for me, I don't know about 849 00:45:33,760 --> 00:45:37,000 Speaker 2: you guys, but I like the idea of that when you're 850 00:45:37,000 --> 00:45:38,960 Speaker 2: sitting down with someone, because we all feel like we 851 00:45:39,000 --> 00:45:40,799 Speaker 2: have to have our phones, and I can't have mine 852 00:45:40,840 --> 00:45:43,000 Speaker 2: in my pocket. I find it really annoying if my 853 00:45:43,000 --> 00:45:45,120 Speaker 2: phone's in my pocket when I'm sitting down. But I 854 00:45:45,200 --> 00:45:49,000 Speaker 2: always put my phone face down, so, A, I'm not distracted, 855 00:45:49,080 --> 00:45:50,840 Speaker 2: but, B, it means that I'm in do not 856 00:45:50,920 --> 00:45:53,239 Speaker 2: disturb mode. And I think, from an etiquette point of view, 857 00:45:53,440 --> 00:45:55,920 Speaker 2: it's nice to turn your phone down and not be distracted. 858 00:45:55,960 --> 00:45:58,840 Speaker 2: There's nothing worse than if you're having a conversation with someone 859 00:45:59,040 --> 00:46:02,319 Speaker 2: and they glance at their phone. It happens occasionally. Have 860 00:46:02,400 --> 00:46:05,720 Speaker 2: you been in that scenario, Tiff or Craig, where 861 00:46:06,040 --> 00:46:08,279 Speaker 2: somebody has been chatting to you and all of a 862 00:46:08,280 --> 00:46:10,440 Speaker 2: sudden they get distracted by their phone? 863 00:46:11,200 --> 00:46:14,560 Speaker 1: One hundred percent. Hey, we've got five minutes.
I want 864 00:46:14,600 --> 00:46:16,600 Speaker 1: to cover off a couple of car things that I'm 865 00:46:16,640 --> 00:46:17,279 Speaker 1: interested in. 866 00:46:17,320 --> 00:46:18,560 Speaker 2: Of course you would be. 867 00:46:19,640 --> 00:46:25,120 Speaker 1: Okay, so first one. BYD have created a new charging 868 00:46:26,120 --> 00:46:30,000 Speaker 1: system, whatever, where you can fill up, fill up is 869 00:46:30,040 --> 00:46:31,960 Speaker 1: not the right word, but that'll do, fill up your 870 00:46:31,960 --> 00:46:35,399 Speaker 1: car with electricity in five minutes. So you can do four 871 00:46:35,480 --> 00:46:38,560 Speaker 1: hundred ks off a five minute charge, I think. I 872 00:46:38,560 --> 00:46:40,080 Speaker 1: don't have that written in front of me, but I 873 00:46:40,120 --> 00:46:43,640 Speaker 1: saw your prompt, or something like that. One, that's 874 00:46:43,680 --> 00:46:44,800 Speaker 1: going to be a game changer. 875 00:46:45,160 --> 00:46:47,440 Speaker 2: It is. There's a few caveats on this. So it's 876 00:46:47,440 --> 00:46:51,080 Speaker 2: a thousand kilowatt system, and the average charging station isn't 877 00:46:51,080 --> 00:46:53,400 Speaker 2: anywhere near that, so the infrastructure would have to be 878 00:46:53,440 --> 00:46:57,359 Speaker 2: in place. But we're talking about the same time as 879 00:46:57,400 --> 00:46:59,800 Speaker 2: filling up with petrol and giving you that four hundred kilometres. 880 00:47:00,280 --> 00:47:02,279 Speaker 2: That's the deal breaker.
If you could do that with 881 00:47:02,320 --> 00:47:04,279 Speaker 2: an electric vehicle, I think that would switch a lot 882 00:47:04,320 --> 00:47:07,479 Speaker 2: of people over, because suddenly, you know, doing a drive 883 00:47:07,520 --> 00:47:10,880 Speaker 2: to Adelaide or Brisbane or wherever is not going to 884 00:47:10,920 --> 00:47:15,000 Speaker 2: be something that's logistically massive, you know, a whole way you 885 00:47:15,080 --> 00:47:16,719 Speaker 2: have to plan a trip if you've got to keep 886 00:47:16,719 --> 00:47:19,640 Speaker 2: stopping and, you know, you can't get that range when 887 00:47:19,680 --> 00:47:22,799 Speaker 2: you need it. So that's pretty cool. BYD is an 888 00:47:22,840 --> 00:47:25,759 Speaker 2: interesting company. It's a Chinese company and they're pushing. They're 889 00:47:25,760 --> 00:47:28,600 Speaker 2: selling more cars than Tesla, and I guess recently, with 890 00:47:28,640 --> 00:47:31,000 Speaker 2: everything that's going on in the States, everybody's selling more 891 00:47:31,000 --> 00:47:33,319 Speaker 2: cars than Tesla. I think it sells more cars than 892 00:47:33,400 --> 00:47:37,880 Speaker 2: Tesla at the moment, but the reality of it is 893 00:47:37,920 --> 00:47:41,680 Speaker 2: that there's been a massive shift towards the Chinese automakers, 894 00:47:41,719 --> 00:47:44,439 Speaker 2: and BYD is definitely the world's leader when it comes 895 00:47:44,480 --> 00:47:47,160 Speaker 2: to electric cars at the moment. And you know, there's 896 00:47:47,160 --> 00:47:50,440 Speaker 2: another little gadget.
I love this one. BYD, now, 897 00:47:50,480 --> 00:47:53,759 Speaker 2: they're talking about having a car that's been launched where 898 00:47:53,880 --> 00:47:57,680 Speaker 2: they've teamed up with DJI, the world's biggest drone manufacturer, 899 00:47:58,040 --> 00:48:00,319 Speaker 2: and they have a section of the roof where the 900 00:48:00,440 --> 00:48:04,000 Speaker 2: roof slides away and the drone launches, so when you're 901 00:48:04,080 --> 00:48:07,880 Speaker 2: driving off road, you get an aerial view of what 902 00:48:07,920 --> 00:48:10,279 Speaker 2: you're doing. So it's for filming, it's for people who 903 00:48:10,400 --> 00:48:13,120 Speaker 2: like off roading, and you've literally got a drone that's 904 00:48:13,160 --> 00:48:16,080 Speaker 2: flying above you, so that if you're driving through a 905 00:48:16,120 --> 00:48:18,840 Speaker 2: forest or driving through an area you're not too sure of, 906 00:48:19,080 --> 00:48:22,799 Speaker 2: you've got this aerial view from the drone. And so 907 00:48:22,880 --> 00:48:26,480 Speaker 2: the drone comes with the car as a feature that's 908 00:48:26,480 --> 00:48:29,719 Speaker 2: integrated into the BYD vehicle. That's kind of interesting and 909 00:48:29,760 --> 00:48:30,719 Speaker 2: nerdy at the same time. 910 00:48:31,320 --> 00:48:34,000 Speaker 1: I need that as I'm heading into the CBD, 911 00:48:34,120 --> 00:48:36,840 Speaker 1: looking for a car park, just to fly up out the 912 00:48:36,840 --> 00:48:38,520 Speaker 1: top of the bloody Suzuki Swift. 913 00:48:40,000 --> 00:48:43,160 Speaker 2: Last one. I love the Suzuki Swift. You could use it to 914 00:48:43,200 --> 00:48:44,920 Speaker 2: carry the Suzuki Swift. 915 00:48:45,239 --> 00:48:47,880 Speaker 1: That is true. That is true. I'll tell you what, 916 00:48:48,080 --> 00:48:50,359 Speaker 1: if you ever want to look super cool, everyone, get 917 00:48:50,360 --> 00:48:54,080 Speaker 1: yourself a Suzuki Swift.
I tell you what, the ultimate kind 918 00:48:54,080 --> 00:48:56,919 Speaker 1: of credibility builder, if you want to build your brand, get 919 00:48:56,920 --> 00:49:00,439 Speaker 1: one of them. Hey, last one I love too, that 920 00:49:01,160 --> 00:49:02,839 Speaker 1: Volks... it was Volkswagen. 921 00:49:02,880 --> 00:49:03,320 Speaker 2: I can't. 922 00:49:03,400 --> 00:49:09,520 Speaker 1: Yes, they're heading back to vinyl, and by that I 923 00:49:09,520 --> 00:49:14,160 Speaker 1: mean they're doing, they're going retro. They're introducing buttons into 924 00:49:14,200 --> 00:49:14,600 Speaker 1: the car. 925 00:49:15,200 --> 00:49:15,439 Speaker 4: Yeah. 926 00:49:15,880 --> 00:49:19,200 Speaker 2: There's a lot of talk and chatter in the auto 927 00:49:19,200 --> 00:49:24,759 Speaker 2: world about how big screens on dashes with no physical 928 00:49:24,800 --> 00:49:27,480 Speaker 2: buttons are actually not as good as we thought they 929 00:49:27,560 --> 00:49:30,239 Speaker 2: were going to be, and that to reach for the 930 00:49:30,280 --> 00:49:32,640 Speaker 2: controls and swap the screen over to be able to 931 00:49:32,680 --> 00:49:36,239 Speaker 2: adjust the temperature or the volume is just not user 932 00:49:36,320 --> 00:49:39,919 Speaker 2: friendly at all. And for us as people, 933 00:49:40,239 --> 00:49:42,040 Speaker 2: I mean, I know when I get into the car, 934 00:49:42,400 --> 00:49:44,279 Speaker 2: I don't have to look down to know where the 935 00:49:44,360 --> 00:49:48,319 Speaker 2: volume control is. You know, who can imagine turning your 936 00:49:48,360 --> 00:49:51,360 Speaker 2: indicators on without the stalk on the side? A little 937 00:49:51,400 --> 00:49:54,600 Speaker 2: bit of that.
And so Volkswagen is now committing itself 938 00:49:54,640 --> 00:50:00,399 Speaker 2: to bringing back this sense of more haptic connectedness, whether 939 00:50:00,440 --> 00:50:04,120 Speaker 2: it's putting knobs and sliders back on, or making the 940 00:50:04,280 --> 00:50:07,400 Speaker 2: interactions you have haptic. What I mean by haptic is it 941 00:50:07,480 --> 00:50:09,759 Speaker 2: might be a virtual button, but when you move it, it 942 00:50:09,800 --> 00:50:12,839 Speaker 2: feels like it's vibrating, so you get a sense of 943 00:50:12,920 --> 00:50:16,560 Speaker 2: real connectedness with whatever you're currently doing in the vehicle. 944 00:50:16,600 --> 00:50:19,040 Speaker 2: And yeah, I think that's a great move forward, when 945 00:50:19,080 --> 00:50:20,360 Speaker 2: we go backwards. 946 00:50:20,600 --> 00:50:23,440 Speaker 1: And you know what is interesting is, if you're driving 947 00:50:23,640 --> 00:50:26,719 Speaker 1: from your joint to my joint and you pick up 948 00:50:26,760 --> 00:50:29,399 Speaker 1: your phone while you're driving and you start to play 949 00:50:29,560 --> 00:50:33,359 Speaker 1: or move something or open something, then you're breaking the law. 950 00:50:34,080 --> 00:50:36,480 Speaker 1: But if you've got one of those big screens like 951 00:50:36,520 --> 00:50:39,960 Speaker 1: I have in my other car, you reach across and 952 00:50:40,000 --> 00:50:44,520 Speaker 1: you're essentially on a giant phone, and your attention... like, 953 00:50:45,520 --> 00:50:47,560 Speaker 1: it's not that natural thing where you can reach over and 954 00:50:47,560 --> 00:50:50,040 Speaker 1: twist that knob or push that button, because there is 955 00:50:50,080 --> 00:50:53,320 Speaker 1: no knob or button to twist or push.
So now 956 00:50:53,840 --> 00:50:57,800 Speaker 1: it actually takes concentration and cognitive energy to pay attention 957 00:50:58,000 --> 00:51:00,920 Speaker 1: to the thing you're trying to adjust while you drive, 958 00:51:01,360 --> 00:51:04,120 Speaker 1: which to me is a much bigger distraction and bigger 959 00:51:04,239 --> 00:51:09,040 Speaker 1: danger than touching your phone, which you're very familiar with. 960 00:51:09,800 --> 00:51:13,080 Speaker 2: My car, I've got a Mazda, and even though 961 00:51:13,080 --> 00:51:14,640 Speaker 2: I've got a screen in front of me, it's not 962 00:51:14,680 --> 00:51:17,080 Speaker 2: a touch screen for navigation and all those sorts of things. 963 00:51:17,080 --> 00:51:19,279 Speaker 2: But it's a big, it's a big knob. It's a 964 00:51:19,400 --> 00:51:22,439 Speaker 2: rotating dial, and I know that if I reach over 965 00:51:22,560 --> 00:51:25,879 Speaker 2: to the gearstick and then I bring my hand further back, 966 00:51:26,040 --> 00:51:29,239 Speaker 2: that's where the dial is. It's very tactile. But I 967 00:51:29,320 --> 00:51:31,520 Speaker 2: never look down at it. I don't have to. I 968 00:51:31,560 --> 00:51:33,640 Speaker 2: know exactly where it is, and I know the volume 969 00:51:33,719 --> 00:51:35,960 Speaker 2: control for the radio is to the left of that, 970 00:51:36,360 --> 00:51:39,160 Speaker 2: so intuitively, I never need to look down, and I'm 971 00:51:39,239 --> 00:51:40,960 Speaker 2: driving along and I can feel for all of that. 972 00:51:41,000 --> 00:51:43,120 Speaker 2: So I just think it's great. Hyundai is doing the 973 00:51:43,160 --> 00:51:47,960 Speaker 2: same thing. They've done focus groups, so they've got people 974 00:51:47,960 --> 00:51:51,439 Speaker 2: together, and they found people get stressed and annoyed when 975 00:51:51,480 --> 00:51:55,560 Speaker 2: they can't control something in a tactile way.
So that 976 00:51:55,800 --> 00:51:58,719 Speaker 2: move away from all of that, and just having big 977 00:51:58,760 --> 00:52:02,400 Speaker 2: screens frustrating drivers, means the likes of Hyundai and 978 00:52:02,520 --> 00:52:05,640 Speaker 2: Kia are now taking a much different approach to 979 00:52:05,680 --> 00:52:08,279 Speaker 2: the way they look at that, as is Volkswagen as 980 00:52:08,200 --> 00:52:12,160 Speaker 1: well. Perfect. The HR Station Wagon, that's what we need 981 00:52:12,200 --> 00:52:14,719 Speaker 1: to go back to, Patrick. You probably don't remember that. 982 00:52:14,760 --> 00:52:18,200 Speaker 1: But where can people find you, follow you, connect with you, Patrick? 983 00:52:18,480 --> 00:52:21,960 Speaker 2: Just go to websitesnow dot com dot au if 984 00:52:21,960 --> 00:52:24,799 Speaker 2: you want to talk media and marketing and websites and 985 00:52:25,080 --> 00:52:27,520 Speaker 2: branding and stuff, or just go to tai chi at home 986 00:52:27,960 --> 00:52:29,440 Speaker 2: dot com dot au if you just want to do some 987 00:52:29,520 --> 00:52:30,960 Speaker 2: tai chi with me, because that's funny. 988 00:52:31,719 --> 00:52:34,120 Speaker 1: I hope you two kids have a beautiful day tomorrow. 989 00:52:34,280 --> 00:52:36,680 Speaker 1: I hope the dogs play well together. I hope you 990 00:52:36,719 --> 00:52:39,920 Speaker 1: two play well together. I hope you don't get rained 991 00:52:39,960 --> 00:52:42,920 Speaker 1: on, and we expect a full report next time. 992 00:52:43,280 --> 00:52:43,839 Speaker 2: Sounds good. 993 00:52:44,840 --> 00:52:48,600 Speaker 1: Thanks Tiff, thank you. Thanks Patty, thank you.