1 00:00:00,800 --> 00:00:02,880 Speaker 1: I get a term. It's us. Of course, it is 2 00:00:02,960 --> 00:00:06,480 Speaker 1: Tiffanee Cook, Patrick James Bonello and me, Jumbo Fatty 3 00:00:06,519 --> 00:00:09,239 Speaker 1: Harps, just here every Friday morning. It's eight oh one 4 00:00:09,960 --> 00:00:13,240 Speaker 1: at this point in time. Let's go out to the 5 00:00:13,240 --> 00:00:15,720 Speaker 1: middle of nowhere and say hello to Patrick. Hi, mate, how 6 00:00:15,760 --> 00:00:16,040 Speaker 1: are you? 7 00:00:16,680 --> 00:00:18,160 Speaker 2: I would like to say that it's not the middle 8 00:00:18,200 --> 00:00:18,600 Speaker 2: of nowhere. 9 00:00:18,600 --> 00:00:21,280 Speaker 1: It's the center of the universe, all right, the center 10 00:00:21,320 --> 00:00:23,959 Speaker 1: of the universe aka the middle of nowhere. How is 11 00:00:24,000 --> 00:00:26,960 Speaker 1: Ballan, the thriving metropolis? How are you both out there, 12 00:00:27,200 --> 00:00:30,160 Speaker 1: the two of you that live there? Well, Fritz, how 13 00:00:30,280 --> 00:00:32,520 Speaker 1: is the other person that lives there? How are they going? 14 00:00:32,800 --> 00:00:35,800 Speaker 2: Fritz is licking himself, so he's being very productive. 15 00:00:36,200 --> 00:00:41,000 Speaker 1: Oh, I'm so jealous. He would never leave the house. 16 00:00:41,159 --> 00:00:49,240 Speaker 1: Could you imagine that? Imagine? No... All right, I reckon, 17 00:00:49,360 --> 00:00:51,720 Speaker 1: if anyone on this call is flexible enough for that, 18 00:00:51,760 --> 00:00:56,280 Speaker 1: it's not me or Tiff. So I think, I think 19 00:00:56,320 --> 00:00:58,880 Speaker 1: it might be the fucking yoga teacher or whatever it 20 00:00:58,960 --> 00:01:03,240 Speaker 1: is that you teach. Tai chi. Always my reason for doing it. 21 00:01:07,200 --> 00:01:10,039 Speaker 1: We thought you had hip dysplasia. No, you're just trying 22 00:01:10,080 --> 00:01:11,280 Speaker 1: to give yourself a head job. 23 00:01:11,319 --> 00:01:16,000 Speaker 2: But then we want people to kind of infer that, 24 00:01:16,080 --> 00:01:16,840 Speaker 2: and I don't have to. 25 00:01:16,760 --> 00:01:19,400 Speaker 1: Come out and say it like, you know. Ah, sorry, 26 00:01:19,520 --> 00:01:23,119 Speaker 1: let's start again, all right, hey everybody. 27 00:01:23,240 --> 00:01:24,680 Speaker 2: This is going to be one of those shows. How 28 00:01:24,720 --> 00:01:26,360 Speaker 2: many times do you reckon we can start it again, 29 00:01:26,440 --> 00:01:30,800 Speaker 1: Tiff? How many times do I reckon? I think five. We've 30 00:01:30,800 --> 00:01:31,039 Speaker 1: done it? 31 00:01:31,120 --> 00:01:34,800 Speaker 2: Five is our record where we did. Anyway, I'm well, 32 00:01:34,800 --> 00:01:37,000 Speaker 2: thank you. It's bloody freezing this morning, though. I've got 33 00:01:37,000 --> 00:01:39,520 Speaker 2: the got the heater going. But gee, I'll tell you what. 34 00:01:39,560 --> 00:01:42,400 Speaker 2: The walk out to the studio is bloody cold. 35 00:01:43,000 --> 00:01:44,760 Speaker 1: Do you have a heater in the studio or is 36 00:01:44,760 --> 00:01:48,440 Speaker 1: that inconsistent with cognitive function, so you like to keep 37 00:01:48,440 --> 00:01:49,520 Speaker 1: it cold? No? 38 00:01:49,520 --> 00:01:52,440 Speaker 2: No, no, no, the heater is definitely, definitely going.
39 00:01:53,360 --> 00:01:55,800 Speaker 2: I might say that I've I've been awake for ages, 40 00:01:55,840 --> 00:01:57,040 Speaker 2: like literally for hours, and. 41 00:01:57,000 --> 00:02:00,680 Speaker 1: I'm this is what I sleep in. I mean, my 42 00:02:00,760 --> 00:02:05,160 Speaker 1: brother, that's disgusting. You smell. Have you not had a shower? 43 00:02:06,680 --> 00:02:08,400 Speaker 1: Oh god, I haven't. 44 00:02:08,440 --> 00:02:11,360 Speaker 2: I haven't exercised yet. So I'm going to take the 45 00:02:11,360 --> 00:02:16,080 Speaker 2: Schnauzer out for a walk after this. 46 00:02:16,120 --> 00:02:19,880 Speaker 1: Schnauzer is such a bad name for any breed of anything. Schnauzer. 47 00:02:20,440 --> 00:02:24,200 Speaker 1: Can you get that little motherfucker right there. Look 48 00:02:24,200 --> 00:02:27,360 Speaker 1: at him. Look, he's so busy licking himself. What is 49 00:02:27,400 --> 00:02:30,959 Speaker 1: he doing? Oh, mate, don't do that. Fritz, Fritz, 50 00:02:31,000 --> 00:02:32,680 Speaker 1: get get out of the shot. Fritz. 51 00:02:33,040 --> 00:02:35,359 Speaker 2: He can hear you, he's not wearing headphones like a. 52 00:02:35,320 --> 00:02:41,200 Speaker 1: Fucking canine porno. Get rid of him. What is he doing? 53 00:02:41,639 --> 00:02:50,280 Speaker 1: He's got no pride. Get on the floor and do that. 54 00:02:50,280 --> 00:02:51,320 Speaker 1: We've embarrassed him. 55 00:02:51,320 --> 00:02:54,160 Speaker 2: He's got off the chair, he's walked away, and now 56 00:02:54,680 --> 00:02:56,800 Speaker 2: now he's giving me the Schnauzer look, he's just 57 00:02:56,880 --> 00:02:57,400 Speaker 2: looking at me. 58 00:02:58,560 --> 00:03:01,160 Speaker 1: What, the three-foot Schnauzer death stare? 59 00:03:00,800 --> 00:03:03,560 Speaker 2: It's scary when he gives you the look because he's 60 00:03:03,560 --> 00:03:06,800 Speaker 2: got those the eyebrows are like an awning over the 61 00:03:06,800 --> 00:03:09,359 Speaker 2: top of his head and he just peers through them. 62 00:03:09,560 --> 00:03:11,640 Speaker 2: So it's kind of is he looking at me? Or 63 00:03:11,720 --> 00:03:13,920 Speaker 2: isn't he looking at me? And it's a bit unnerving. 64 00:03:14,120 --> 00:03:17,080 Speaker 1: You've seen. Why don't you give that little hair around there 65 00:03:17,160 --> 00:03:18,200 Speaker 1: a little bit of a trim? 66 00:03:19,720 --> 00:03:22,040 Speaker 2: Look, I know, he's back up again there. Oh God, 67 00:03:22,120 --> 00:03:22,639 Speaker 2: here we go. 68 00:03:23,520 --> 00:03:26,680 Speaker 1: Now he's sniffing you. God, don't let him lick 69 00:03:26,760 --> 00:03:28,800 Speaker 1: you because it's just been on his cock. That tongue. 70 00:03:28,919 --> 00:03:35,200 Speaker 1: Get him... stop it... the worst. He's just tapping Patrick 71 00:03:35,280 --> 00:03:38,800 Speaker 1: on the arm, like, come on, bro, give me some love. 72 00:03:39,400 --> 00:03:40,840 Speaker 1: Give him, like... look at. 73 00:03:40,880 --> 00:03:43,480 Speaker 3: Oh my god, I love him. 74 00:03:44,120 --> 00:03:47,160 Speaker 1: Oh now he's like, don't make fun of me like that. 75 00:03:48,080 --> 00:03:52,640 Speaker 1: Oh why are you not giving him some love? Oh 76 00:03:53,240 --> 00:03:54,960 Speaker 1: my god? 77 00:03:55,160 --> 00:04:01,000 Speaker 2: The worst start of a show ever. Tiff, how are you? 78 00:04:01,400 --> 00:04:02,160 Speaker 2: I'm fabulous. 79 00:04:02,240 --> 00:04:04,080 Speaker 3: I just went and did a little boxing.
I haven't 80 00:04:04,080 --> 00:04:06,560 Speaker 3: put the boxing gloves on and boxed most of this year, 81 00:04:06,800 --> 00:04:08,680 Speaker 3: and I just did a little boxing session this morning. 82 00:04:08,680 --> 00:04:11,280 Speaker 1: I'm feeling good. Did you hit the bag, did you 83 00:04:11,360 --> 00:04:13,280 Speaker 1: hit the speedball, the floor to ceiling? Or did you 84 00:04:13,320 --> 00:04:16,479 Speaker 1: hit Brian in the ring? When I say not Brian 85 00:04:16,560 --> 00:04:19,480 Speaker 1: in the ring, I mean Brian, it's not Brian's ring, 86 00:04:19,680 --> 00:04:23,480 Speaker 1: but Brian who happened to be in the ring, unless 87 00:04:23,480 --> 00:04:25,360 Speaker 1: he was doing a downward dog at the time, and 88 00:04:25,440 --> 00:04:29,559 Speaker 1: then you hit him in the ring. But so, what 89 00:04:29,640 --> 00:04:31,960 Speaker 1: was the format of the boxing is what I'm trying 90 00:04:32,000 --> 00:04:33,840 Speaker 1: to ask. I did a little bag work. 91 00:04:33,880 --> 00:04:35,719 Speaker 2: I did a little floor-to-ceiling. I did a little 92 00:04:35,720 --> 00:04:40,080 Speaker 2: bit of that wall bag. It was it was great. 93 00:04:40,400 --> 00:04:43,359 Speaker 1: How do I feel? My shoulders feel strong. 94 00:04:44,200 --> 00:04:46,320 Speaker 2: I do feel like I look like a little Mack truck 95 00:04:46,440 --> 00:04:48,640 Speaker 3: Staffy or something now. Like, I didn't look 96 00:04:48,680 --> 00:04:50,240 Speaker 3: like this last time I put the gloves on. 97 00:04:52,080 --> 00:04:57,120 Speaker 1: She's been taking volumes of steroids and lifting heavy things. 98 00:04:57,320 --> 00:05:00,680 Speaker 3: Have not been taking volumes of steroids, but I've been. 99 00:05:01,000 --> 00:05:03,160 Speaker 1: But you look like it. Well then, that... you are 100 00:05:03,240 --> 00:05:04,919 Speaker 1: a fucking anomaly, because. 101 00:05:07,400 --> 00:05:11,400 Speaker 3: I'm not taking that off, because that was very well timed. 102 00:05:11,440 --> 00:05:13,080 Speaker 3: I didn't take my jacket off for that reason. 103 00:05:14,440 --> 00:05:16,840 Speaker 1: Yeah sure. Patrick and I are just going to get a coat. 104 00:05:16,920 --> 00:05:21,880 Speaker 1: We'll be back in a moment. I'm covering up too, 105 00:05:22,600 --> 00:05:23,000 Speaker 1: me too. 106 00:05:23,040 --> 00:05:32,400 Speaker 2: I'm glad I have a shaven. 107 00:05:32,560 --> 00:05:35,279 Speaker 3: All of the TYP listeners know exactly what we're all 108 00:05:35,360 --> 00:05:37,479 Speaker 3: laughing at without us even saying anything. 109 00:05:37,960 --> 00:05:41,120 Speaker 1: I don't know. It seems like this is getting looser 110 00:05:41,200 --> 00:05:45,000 Speaker 1: every episode. I feel like somebody ought to rein it in. 111 00:05:45,040 --> 00:05:49,520 Speaker 1: It's going to get looser. I think, I think somebody 112 00:05:49,600 --> 00:05:51,960 Speaker 1: responsible needs to try and get it back on track. If 113 00:05:52,040 --> 00:05:57,200 Speaker 1: only we had one here. One day you'll have 114 00:05:57,240 --> 00:05:58,160 Speaker 1: an adult. 115 00:05:57,839 --> 00:05:59,800 Speaker 2: In the show, and then it will all be put 116 00:05:59,839 --> 00:06:00,559 Speaker 2: back on track. 117 00:06:02,120 --> 00:06:04,000 Speaker 1: I don't think an adult would want to be involved 118 00:06:04,040 --> 00:06:09,440 Speaker 1: in that. Not a chance. Patrick, let's start with the 119 00:06:09,520 --> 00:06:12,880 Speaker 1: important stuff. I've got a list.
If you've never heard 120 00:06:12,920 --> 00:06:15,279 Speaker 1: this before, well, if you've never heard this show before 121 00:06:15,400 --> 00:06:19,239 Speaker 1: or this installment of the show before, you've probably already left. 122 00:06:19,720 --> 00:06:23,640 Speaker 1: But if you happen to be still here: one, what 123 00:06:23,720 --> 00:06:26,880 Speaker 1: the fuck is wrong with you? What are you doing? Yeah? Yeah, 124 00:06:27,320 --> 00:06:31,240 Speaker 1: you get a life. But two, this is the show 125 00:06:31,279 --> 00:06:34,560 Speaker 1: where Patrick allegedly comes and talks to us about... Well, 126 00:06:34,640 --> 00:06:37,520 Speaker 1: it used to be originally just tech, but we've kind 127 00:06:37,520 --> 00:06:41,440 Speaker 1: of diversified and there's a little bit of psychology and 128 00:06:41,520 --> 00:06:44,880 Speaker 1: human behavior and it's a bit of a potpourri 129 00:06:44,920 --> 00:06:47,480 Speaker 1: of conversation, to be honest. But I think we should 130 00:06:47,480 --> 00:06:51,440 Speaker 1: start with the most interesting topic on your list today: 131 00:06:52,360 --> 00:06:57,200 Speaker 1: scientists reveal a clever trick that can help me win 132 00:06:57,920 --> 00:07:02,400 Speaker 1: rock, paper, scissors. So yeah. 133 00:07:00,760 --> 00:07:04,480 Speaker 2: This is really interesting. So you know when you play 134 00:07:04,560 --> 00:07:05,560 Speaker 2: rock paper scissors? 135 00:07:05,720 --> 00:07:08,160 Speaker 1: Right, wait, I just asked... Tiff, do you know what 136 00:07:08,200 --> 00:07:08,560 Speaker 1: that is? 137 00:07:08,760 --> 00:07:08,920 Speaker 3: Oh? 138 00:07:09,279 --> 00:07:11,200 Speaker 1: Yeah, yeah, okay, yeah, yeah, yeah, go. 139 00:07:13,120 --> 00:07:15,600 Speaker 2: I find it interesting what science has studied. Let's just 140 00:07:15,640 --> 00:07:19,280 Speaker 2: go back one step, because they're studying rock paper scissors. 141 00:07:19,440 --> 00:07:22,400 Speaker 2: But it's kind of cool. I think that's great. It's you know, 142 00:07:22,440 --> 00:07:26,440 Speaker 2: obviously we're not the only ones who never evolved into adulthood. 143 00:07:26,680 --> 00:07:28,280 Speaker 2: I think that's cool anyway. 144 00:07:28,320 --> 00:07:30,400 Speaker 1: Also, I just want to ask, as a side note, 145 00:07:30,400 --> 00:07:32,400 Speaker 1: who the fuck is funding this research? 146 00:07:32,680 --> 00:07:36,720 Speaker 2: Yeah, I don't know, like Play School, some Landling company, 147 00:07:37,680 --> 00:07:38,520 Speaker 2: the Wiggles. 148 00:07:38,920 --> 00:07:39,800 Speaker 1: Anyway, go on. 149 00:07:40,520 --> 00:07:43,960 Speaker 2: It actually has broader implications in terms of how we 150 00:07:44,040 --> 00:07:48,640 Speaker 2: make decisions and whether we make a decision competitively 151 00:07:48,840 --> 00:07:51,360 Speaker 2: or whether we do it in conjunction with someone as 152 00:07:51,360 --> 00:07:54,040 Speaker 2: a team effort. But what it was looking at was 153 00:07:54,560 --> 00:07:57,080 Speaker 2: how do you win at rock, paper, scissors? And most 154 00:07:57,160 --> 00:08:01,200 Speaker 2: people make the decision of what they're going to play 155 00:08:01,400 --> 00:08:02,520 Speaker 2: next based on. 156 00:08:02,440 --> 00:08:03,600 Speaker 1: The previous round.
157 00:08:03,960 --> 00:08:07,040 Speaker 2: So when I'm doing rock paper scissors with you and 158 00:08:07,120 --> 00:08:09,760 Speaker 2: you have rock or paper or scissors or whatever it 159 00:08:09,760 --> 00:08:15,520 Speaker 2: happens to be, I'm influenced by what your previous action was. 160 00:08:15,920 --> 00:08:18,960 Speaker 2: And the scientists came up with the notion that, in fact, 161 00:08:18,960 --> 00:08:21,760 Speaker 2: that's the wrong thing to do. You've got to not 162 00:08:21,880 --> 00:08:25,640 Speaker 2: be influenced by the previous action the person did, and 163 00:08:25,680 --> 00:08:28,280 Speaker 2: you will have more chance of winning if you come 164 00:08:28,360 --> 00:08:31,320 Speaker 2: up with a new move each time. So this is 165 00:08:31,440 --> 00:08:34,440 Speaker 2: basically what came out of this. But what they're saying 166 00:08:34,480 --> 00:08:37,800 Speaker 2: from this is that the field it's basically social neuroscience, 167 00:08:38,200 --> 00:08:40,880 Speaker 2: and what they're looking into is how the brains of 168 00:08:40,920 --> 00:08:45,320 Speaker 2: individuals gain an insight into another person and the decision 169 00:08:45,360 --> 00:08:48,920 Speaker 2: making process, and if we interact with people what that 170 00:08:48,960 --> 00:08:53,440 Speaker 2: then means. It's called hyperscanning, and it's a method 171 00:08:53,480 --> 00:08:55,240 Speaker 2: of working. As you probably know more about this than 172 00:08:55,280 --> 00:08:55,480 Speaker 2: we do, 173 00:08:55,600 --> 00:08:59,280 Speaker 1: Craigo. Well, now, it's very interesting. It's like anticipation and 174 00:08:59,280 --> 00:09:02,800 Speaker 1: trying to anticipate, based on previous data or previous behavior, 175 00:09:02,800 --> 00:09:05,760 Speaker 1: what somebody's going to do next. Like they say, in 176 00:09:05,800 --> 00:09:08,840 Speaker 1: general terms of human behavior, the best predictor of future behavior 177 00:09:08,920 --> 00:09:12,000 Speaker 1: is past behavior. But when you're doing something, which is 178 00:09:12,720 --> 00:09:15,880 Speaker 1: where you're trying, like obviously in a game like this, 179 00:09:16,040 --> 00:09:19,319 Speaker 1: you're trying to manipulate or deceive somebody. That's the point 180 00:09:19,320 --> 00:09:21,640 Speaker 1: of the game, because you're trying to win, so you 181 00:09:21,679 --> 00:09:23,520 Speaker 1: don't want them to know what you're going to do. 182 00:09:24,080 --> 00:09:29,200 Speaker 1: But yeah, it's definitely it is definitely interesting. But that trying, 183 00:09:30,000 --> 00:09:34,200 Speaker 1: Like it's easy to say, make it random every time 184 00:09:34,360 --> 00:09:36,480 Speaker 1: and don't repeat what you did last time. But if 185 00:09:36,520 --> 00:09:38,559 Speaker 1: you never repeat what you did last time, well that's 186 00:09:38,600 --> 00:09:41,800 Speaker 1: a pattern as well, and that makes you predictable. Like 187 00:09:42,000 --> 00:09:45,160 Speaker 1: if you never do paper twice in a row, I know, 188 00:09:45,240 --> 00:09:48,040 Speaker 1: well Patrick never does the same thing twice in a row, 189 00:09:48,360 --> 00:09:52,080 Speaker 1: and there are only three things, only three options. Last 190 00:09:52,120 --> 00:09:54,480 Speaker 1: time he did paper, so this time it's definitely got 191 00:09:54,520 --> 00:09:57,200 Speaker 1: to be scissors or rock.
I feel like I'm sounding 192 00:09:57,280 --> 00:10:00,959 Speaker 1: like a really boring scientist now, but yeah, it is 193 00:10:01,160 --> 00:10:06,400 Speaker 1: that whole kind of that ability to be anticipated, to 194 00:10:06,440 --> 00:10:09,280 Speaker 1: be able to anticipate what somebody is going to do 195 00:10:09,400 --> 00:10:13,199 Speaker 1: in terms of a decision and a behavior. It's it. 196 00:10:13,240 --> 00:10:17,360 Speaker 1: The benefits are far reaching beyond this particular game. But 197 00:10:17,800 --> 00:10:21,000 Speaker 1: I understand why they did this research, because it can 198 00:10:21,040 --> 00:10:22,920 Speaker 1: be extrapolated to a lot of other things. 199 00:10:23,320 --> 00:10:24,880 Speaker 2: And the other thing it just occurred to me while 200 00:10:24,880 --> 00:10:26,720 Speaker 2: you were talking, because I was listening intently to what 201 00:10:26,760 --> 00:10:29,439 Speaker 2: you were saying, was did you ever play did you 202 00:10:29,480 --> 00:10:32,040 Speaker 2: ever play the rule with dynamite? So rock, paper, scissors, 203 00:10:32,080 --> 00:10:32,880 Speaker 2: dynamite? 204 00:10:33,320 --> 00:10:35,320 Speaker 1: I did not. Oh see that was another. 205 00:10:35,120 --> 00:10:37,880 Speaker 2: One as well, rock, paper, scissors and dynamite. So that that's 206 00:10:37,880 --> 00:10:41,240 Speaker 2: another variable as well. So it actually increases it exponentially, 207 00:10:41,280 --> 00:10:43,719 Speaker 2: doesn't it, by having another option there. Anyway, that's I just 208 00:10:43,760 --> 00:10:45,880 Speaker 2: thought I'd mention that for those people who've played rock, 209 00:10:45,920 --> 00:10:47,199 Speaker 2: paper, scissors, dynamite. 210 00:10:47,400 --> 00:10:50,560 Speaker 1: This kind of reminds me of, in terms of like 211 00:10:51,160 --> 00:10:56,560 Speaker 1: repeating certain things and like that are unproductive and sociologically ridiculous, 212 00:10:57,559 --> 00:11:02,040 Speaker 1: the habit that we humans have of essentially having the 213 00:11:02,080 --> 00:11:05,800 Speaker 1: same conversation with the same person about the same issue 214 00:11:06,320 --> 00:11:10,120 Speaker 1: and getting the same negative result. Yet we do it 215 00:11:10,160 --> 00:11:13,360 Speaker 1: again and again and again, you know, rather than going, okay, 216 00:11:14,000 --> 00:11:16,280 Speaker 1: so we've had a version of this conversation a hundred 217 00:11:16,360 --> 00:11:19,000 Speaker 1: times and it never turns out well. Either I need 218 00:11:19,040 --> 00:11:22,400 Speaker 1: to stop having it, or have a different version, or 219 00:11:22,440 --> 00:11:25,800 Speaker 1: try a different approach in a different context, or... Like, 220 00:11:26,000 --> 00:11:30,720 Speaker 1: we are very slow learners when it comes to social interactions. 221 00:11:31,760 --> 00:11:35,560 Speaker 1: All right, let's move on then. So tell me why 222 00:11:35,640 --> 00:11:40,720 Speaker 1: life expectancy gains have slowed down sharply, Patrick. I feel 223 00:11:40,720 --> 00:11:43,439 Speaker 1: like they're almost on the way to going backwards. Yeah, 224 00:11:43,600 --> 00:11:44,720 Speaker 1: it feels like it as well. 225 00:11:44,720 --> 00:11:46,839 Speaker 2: And this is and I guess it might be how 226 00:11:46,880 --> 00:11:48,680 Speaker 2: we're living at the moment, or the fact that it 227 00:11:48,760 --> 00:11:51,200 Speaker 2: plateaued and now it's kind of dipping on the other side.
228 00:11:51,360 --> 00:11:55,280 Speaker 2: So some recent research has been looking into, you know, 229 00:11:55,559 --> 00:11:58,560 Speaker 2: at the end of the nineteen thirties, so nineteen thirty nine, 230 00:11:58,600 --> 00:12:02,280 Speaker 2: anybody born after nineteen thirty nine had a greater chance 231 00:12:02,720 --> 00:12:08,720 Speaker 2: of increasing their longevity their life every year after that. 232 00:12:09,200 --> 00:12:13,280 Speaker 2: So basically, if you lived in nineteen thirty nine, your 233 00:12:13,280 --> 00:12:16,080 Speaker 2: life expectancy was like sixty something years, but then every 234 00:12:16,160 --> 00:12:18,720 Speaker 2: year after that it started to increase and increase and increase. 235 00:12:18,960 --> 00:12:23,640 Speaker 2: So medical breakthroughs helped a lot of that, but longevity 236 00:12:23,679 --> 00:12:28,360 Speaker 2: growth now has lost its momentum. So historically it would increase, 237 00:12:28,679 --> 00:12:31,800 Speaker 2: but it seems now that no generations that are born 238 00:12:31,880 --> 00:12:35,439 Speaker 2: after nineteen thirty nine... So you know, right now we're 239 00:12:35,440 --> 00:12:39,040 Speaker 2: seeing people live to that age, and those people would 240 00:12:39,040 --> 00:12:43,079 Speaker 2: have been born in nineteen... so as we head towards 241 00:12:43,080 --> 00:12:46,760 Speaker 2: that time, but the reality is now they're saying that 242 00:12:46,920 --> 00:12:49,040 Speaker 2: less people are going to be, and in fact, it's 243 00:12:49,040 --> 00:12:51,959 Speaker 2: going to get to a point where anybody born after 244 00:12:52,040 --> 00:12:54,640 Speaker 2: nineteen eighty will not live to one hundred. 245 00:12:56,080 --> 00:13:03,000 Speaker 1: Wow. Yeah, wow. Doesn't surprise me. It doesn't surprise me. 246 00:13:03,120 --> 00:13:07,200 Speaker 1: Like there's this almost this contradiction where we've got... It 247 00:13:07,320 --> 00:13:10,520 Speaker 1: seems like some of us have got worse and worse 248 00:13:10,679 --> 00:13:13,920 Speaker 1: habits and behaviors, but at the same time, we've got 249 00:13:13,960 --> 00:13:18,719 Speaker 1: all this groundbreaking medicine and medical resources to help us 250 00:13:18,760 --> 00:13:23,160 Speaker 1: live longer and stronger. But yeah, I wonder, I wonder 251 00:13:23,240 --> 00:13:25,600 Speaker 1: with all the toxins in the soil and the air 252 00:13:25,720 --> 00:13:27,959 Speaker 1: and all of the things, and the plastics and the 253 00:13:29,200 --> 00:13:32,200 Speaker 1: phytoestrogens, and all the shit that we need to 254 00:13:32,240 --> 00:13:35,120 Speaker 1: deal with now. I wonder if we're about to hit 255 00:13:35,200 --> 00:13:37,400 Speaker 1: that kind of plateau or maybe go backwards. 256 00:13:37,960 --> 00:13:41,240 Speaker 2: Well, this study has been done by the University of Wisconsin-Madison, 257 00:13:42,080 --> 00:13:45,640 Speaker 2: and I guess when you think about also record keeping 258 00:13:45,840 --> 00:13:48,720 Speaker 2: and how accurate it is, and the fact that we 259 00:13:48,840 --> 00:13:52,320 Speaker 2: now have really good data, we can track population growth, 260 00:13:52,400 --> 00:13:55,280 Speaker 2: and we're not we're talking Western nations too. This is 261 00:13:55,320 --> 00:13:59,679 Speaker 2: specifically research that's, you know, on longevity.
Data is based 262 00:13:59,679 --> 00:14:03,240 Speaker 2: around the Western nations, so we're not sure about 263 00:14:03,240 --> 00:14:03,920 Speaker 2: other countries. 264 00:14:04,320 --> 00:14:06,520 Speaker 1: But it's interesting. Oh sorry, dude, I was just going 265 00:14:06,559 --> 00:14:08,480 Speaker 1: to say it's interesting you brought up nineteen thirty 266 00:14:08,559 --> 00:14:10,800 Speaker 1: nine as that date, because both of my parents were 267 00:14:10,800 --> 00:14:14,240 Speaker 1: born in nineteen thirty nine and they're both eighty six, 268 00:14:14,520 --> 00:14:16,960 Speaker 1: so they've done pretty well. Yep. 269 00:14:17,160 --> 00:14:19,760 Speaker 2: Yeah. Well, my dad just turned ninety a few weeks ago, 270 00:14:20,040 --> 00:14:22,280 Speaker 2: and obviously he was born before then, so that'd 271 00:14:22,320 --> 00:14:23,200 Speaker 2: make him thirty five? 272 00:14:23,320 --> 00:14:28,760 Speaker 1: Was it? No? Thirty five. Well... I know this 273 00:14:28,840 --> 00:14:30,560 Speaker 1: is digressing, but how's Dad's health? 274 00:14:30,960 --> 00:14:35,200 Speaker 2: Look, he's in aged care. He was always ridiculously active. But 275 00:14:35,240 --> 00:14:37,280 Speaker 2: the two things I think worked in his favor, have 276 00:14:37,400 --> 00:14:40,320 Speaker 2: worked in his favor, is he never smoked, and 277 00:14:40,400 --> 00:14:42,600 Speaker 2: he very occasionally had a drink, you know, have a 278 00:14:42,600 --> 00:14:46,040 Speaker 2: beer in summer, and that was all. But he was always, 279 00:14:46,080 --> 00:14:50,120 Speaker 2: always active, always walked, you know, was physically in the 280 00:14:50,160 --> 00:14:52,560 Speaker 2: garden every day, that sort of stuff. But he's done 281 00:14:52,640 --> 00:14:55,840 Speaker 2: pretty well, you know, I think, for the most part. My theory 282 00:14:55,840 --> 00:14:57,920 Speaker 2: on this, and we've spoken about this many many times: 283 00:14:58,120 --> 00:14:59,600 Speaker 2: I don't know that I want to live to one hundred. 284 00:14:59,600 --> 00:15:01,840 Speaker 1: I just want to live to an old age in 285 00:15:01,880 --> 00:15:02,440 Speaker 1: good health. 286 00:15:03,000 --> 00:15:06,240 Speaker 2: So keep running on all cylinders for as long 287 00:15:06,280 --> 00:15:07,440 Speaker 2: as possible, I reckon. 288 00:15:07,920 --> 00:15:11,400 Speaker 1: Yeah, we call that health span. So that's living 289 00:15:11,560 --> 00:15:13,680 Speaker 1: as long as you can, but as well as you can. 290 00:15:13,800 --> 00:15:17,359 Speaker 1: For the vast majority, I've noticed. 291 00:15:17,120 --> 00:15:20,840 Speaker 2: My smart watch is now telling me what my health 292 00:15:20,920 --> 00:15:24,320 Speaker 2: age is. And it's kind of disconcerting. It's saying, hey, 293 00:15:24,360 --> 00:15:26,880 Speaker 2: you've been really active and you're doing all the right things. 294 00:15:27,160 --> 00:15:29,960 Speaker 2: You're actually, you know, in your late forties, not your 295 00:15:30,000 --> 00:15:32,360 Speaker 2: late fifties. Like it's sucking up to me. 296 00:15:32,400 --> 00:15:35,120 Speaker 3: Or what, you realize that you keep looking at your 297 00:15:35,120 --> 00:15:36,920 Speaker 3: wrist and you're not actually wearing a watch? 298 00:15:37,080 --> 00:15:40,160 Speaker 2: No, yeah, I left my health watch off because I 299 00:15:40,160 --> 00:15:41,240 Speaker 2: think it's lying to me.
300 00:15:44,520 --> 00:15:47,040 Speaker 1: I feel it's trying to create rapport and build connection 301 00:15:47,120 --> 00:15:51,920 Speaker 1: and trust. Now, there is some research that's come out 302 00:15:52,000 --> 00:15:56,360 Speaker 1: which is very significant to me and quite disconcerting, because 303 00:15:56,400 --> 00:16:02,040 Speaker 1: being too attractive can hurt fitness influencers, new research says. Hell, 304 00:16:02,080 --> 00:16:06,680 Speaker 1: that's going to be a setback for me. Is that you laughing? 305 00:16:07,080 --> 00:16:10,160 Speaker 1: Are you laughing? Fucking hell. 306 00:16:10,680 --> 00:16:13,840 Speaker 2: Going by this study, out of the three people in 307 00:16:13,880 --> 00:16:17,360 Speaker 2: this conversation, Tiff is the least likely to be 308 00:16:17,360 --> 00:16:22,600 Speaker 2: believed, then you or I. It's called influencers. 309 00:16:23,320 --> 00:16:26,920 Speaker 1: Okay, yeah, so tell me what's going on with this. 310 00:16:27,360 --> 00:16:31,160 Speaker 2: Well, you know that the adage that sex sells actually 311 00:16:31,240 --> 00:16:36,320 Speaker 2: isn't working. So researchers looked into consumer behavior, and it 312 00:16:36,360 --> 00:16:39,560 Speaker 2: seems like when people are pushing... this is in the 313 00:16:39,560 --> 00:16:43,560 Speaker 2: health field only. They did look across different disciplines and 314 00:16:43,600 --> 00:16:47,560 Speaker 2: they found this was most strong in the influencer area 315 00:16:47,720 --> 00:16:50,800 Speaker 2: for health. And what they found was if someone is 316 00:16:50,840 --> 00:16:54,400 Speaker 2: too good looking and too attractive, they're less... 317 00:16:55,880 --> 00:16:58,240 Speaker 1: Yeah. Yeah, yeah, so we're much more authentic. 318 00:16:58,400 --> 00:17:01,880 Speaker 2: Now I get that, because what it is, when you're 319 00:17:01,920 --> 00:17:05,400 Speaker 2: trying to talk about a health message, you want to 320 00:17:05,480 --> 00:17:08,520 Speaker 2: engage with your audience. But if you're way too good looking, 321 00:17:08,560 --> 00:17:12,880 Speaker 2: people can't relate to you, because you alienate them, because 322 00:17:12,880 --> 00:17:15,760 Speaker 2: there's not that connectedness. And you know, I guess when 323 00:17:15,760 --> 00:17:18,879 Speaker 2: we're talking about health, it's a pretty personal thing. But 324 00:17:18,960 --> 00:17:21,560 Speaker 2: I thought that was kind of interesting, and maybe that's 325 00:17:21,600 --> 00:17:23,679 Speaker 2: why you've succeeded in the health... 326 00:17:26,760 --> 00:17:31,040 Speaker 1: Is that right? Hold on, are you saying because I'm 327 00:17:31,160 --> 00:17:32,280 Speaker 1: ugly is why I've done well? 328 00:17:32,280 --> 00:17:35,640 Speaker 2: But I don't know, because you're authentic, and you're authentic 329 00:17:35,720 --> 00:17:36,880 Speaker 2: in your presence. 330 00:17:36,560 --> 00:17:38,919 Speaker 1: But what? Authentic's got nothing to do with looks. And 331 00:17:38,960 --> 00:17:41,439 Speaker 1: this is all about looks, this particular story. And you 332 00:17:41,520 --> 00:17:46,000 Speaker 1: were saying attractive people, it's a handicap, and I've done 333 00:17:46,040 --> 00:17:48,040 Speaker 1: well because I'm not attractive. Is that what you're saying? 334 00:17:48,320 --> 00:17:53,200 Speaker 2: Well, you're not overtly attractive. 335 00:17:54,560 --> 00:17:57,840 Speaker 1: I'm not overtly attractive.
336 00:17:58,119 --> 00:17:58,680 Speaker 3: Friends like. 337 00:18:00,119 --> 00:18:01,960 Speaker 1: You break that down for me a little bit. So 338 00:18:02,000 --> 00:18:06,840 Speaker 1: I'm not overtly attractive. What am I? Ruggedly handsome? See, 339 00:18:07,200 --> 00:18:09,600 Speaker 1: that's bullshit as well, because now you're lying. Now you're 340 00:18:09,640 --> 00:18:13,919 Speaker 1: fucking backpedalling. I can hear your gears grinding. It's a 341 00:18:13,960 --> 00:18:16,360 Speaker 1: good thing my mum loves me and I don't need 342 00:18:16,400 --> 00:18:20,200 Speaker 1: your fucking approval. Wow. And I'm not trying to attract 343 00:18:20,240 --> 00:18:22,520 Speaker 1: you any... I'm not trying to attract anyone, let's be honest. 344 00:18:23,000 --> 00:18:24,760 Speaker 1: But you can go right to the back of the 345 00:18:24,840 --> 00:18:29,359 Speaker 1: queue. Back in therapy for Harps. You know what happened. 346 00:18:29,400 --> 00:18:31,600 Speaker 1: I told you I took a photo with doctor Alex, 347 00:18:31,640 --> 00:18:34,400 Speaker 1: who's the best looking surgeon in the world, and all 348 00:18:34,440 --> 00:18:37,800 Speaker 1: everybody did was talk about, one, how 349 00:18:37,800 --> 00:18:40,440 Speaker 1: good looking he is. That was comment one, and question 350 00:18:40,520 --> 00:18:44,080 Speaker 1: one was, is he single? I'm like, I am also in 351 00:18:44,119 --> 00:18:47,199 Speaker 1: the photo, but there was not one comment. I am 352 00:18:47,240 --> 00:18:51,840 Speaker 1: also there. Hey, there are two people, you know. 353 00:18:51,920 --> 00:18:53,480 Speaker 2: The other thing that came out of the study was 354 00:18:53,560 --> 00:18:59,760 Speaker 2: that when people watch these fitness influencers, or fitfluencers, 355 00:19:00,119 --> 00:19:02,800 Speaker 2: it's hard to say, what a mouthful, fitfluencers, 356 00:19:03,880 --> 00:19:08,359 Speaker 2: but when they watch them talking about or spruiking about health, 357 00:19:08,680 --> 00:19:11,440 Speaker 2: they come away from it with a lower self esteem. 358 00:19:12,320 --> 00:19:16,760 Speaker 1: Wow. Yeah. Like, my only question is, all of the 359 00:19:16,760 --> 00:19:19,959 Speaker 1: fitness influencers that I know, and there are quite a 360 00:19:19,960 --> 00:19:22,479 Speaker 1: lot of them that I know personally and also I 361 00:19:22,520 --> 00:19:26,320 Speaker 1: know through the interwebs, none of them are fucking out 362 00:19:26,359 --> 00:19:28,280 Speaker 1: of shape and ugly. Let me tell you that. So 363 00:19:28,400 --> 00:19:30,960 Speaker 1: I don't know about the validity of this. Most of 364 00:19:31,000 --> 00:19:33,359 Speaker 1: them are pretty jacked and pretty good looking, either pretty 365 00:19:33,440 --> 00:19:37,359 Speaker 1: or handsome. But I am interested in this story 366 00:19:37,400 --> 00:19:42,639 Speaker 1: because I grew up with the Eagles. The Hotel California 367 00:19:41,960 --> 00:19:48,159 Speaker 2: effect. Like, yeah, it's called the Hotel California effect because you 368 00:19:48,200 --> 00:19:48,880 Speaker 2: know the lyrics. 369 00:19:49,520 --> 00:19:51,840 Speaker 1: Of course. You 370 00:19:51,760 --> 00:19:53,879 Speaker 2: can check in any time you like, but you can 371 00:19:53,920 --> 00:19:56,520 Speaker 2: never leave. Something along those lines, is that how it goes? Correct, correct. 372 00:19:56,960 --> 00:20:00,000 Speaker 2: So the theory behind this is that a stack of Australians...
373 00:20:00,160 --> 00:20:03,280 Speaker 2: It's a phrase that's been coined by, you know, basically 374 00:20:03,320 --> 00:20:06,879 Speaker 2: people who are looking into consumer insights. So it's a 375 00:20:07,280 --> 00:20:10,160 Speaker 2: report that's put out by Deloitte. It comes out annually. It's 376 00:20:10,160 --> 00:20:15,520 Speaker 2: a Media and Entertainment Consumer Insights report, and what they found 377 00:20:15,640 --> 00:20:19,000 Speaker 2: is more of us are subscribing, and the amount that 378 00:20:19,040 --> 00:20:23,200 Speaker 2: we pay per month for subscriptions has risen by twenty 379 00:20:23,240 --> 00:20:28,000 Speaker 2: four percent this year. So on average, each household has 380 00:20:28,040 --> 00:20:31,159 Speaker 2: three point seven subscriptions to the value of about seventy 381 00:20:31,200 --> 00:20:34,560 Speaker 2: eight bucks, whereas last year it was sixty three dollars, 382 00:20:34,640 --> 00:20:38,119 Speaker 2: so there has been an increase. There's a criticism of 383 00:20:38,160 --> 00:20:42,400 Speaker 2: a lot of the subscription platforms that it's very hard 384 00:20:42,440 --> 00:20:46,399 Speaker 2: to unsubscribe, and you know what, honestly it's like 385 00:20:46,440 --> 00:20:49,840 Speaker 2: a needy relationship when you kind of try to leave 386 00:20:49,880 --> 00:20:52,840 Speaker 2: as well. It's really, you're really going to go? What about 387 00:20:52,640 --> 00:20:53,280 Speaker 1: if I do this? 388 00:20:53,480 --> 00:20:55,320 Speaker 2: What about if I do this and I do this 389 00:20:55,359 --> 00:20:58,359 Speaker 2: and then I'll do that? And have you tried to 390 00:20:58,480 --> 00:20:59,879 Speaker 2: unsubscribe recently from something? 391 00:21:00,119 --> 00:21:03,439 Speaker 1: Is it really hard? I have not. I 392 00:21:03,480 --> 00:21:07,120 Speaker 1: probably should though, because I've got too many. Yeah, yeah, 393 00:21:07,280 --> 00:21:09,840 Speaker 1: how many have you got, Tiff? Like what, you're on Netflix? 394 00:21:09,880 --> 00:21:10,680 Speaker 1: What else are you on? 395 00:21:11,000 --> 00:21:13,480 Speaker 3: I'm only on Netflix. Every now and then I do 396 00:21:13,600 --> 00:21:16,400 Speaker 3: unsubscribe from that, but I don't know, I've got 397 00:21:16,400 --> 00:21:21,119 Speaker 3: heaps. Spotify, bloody, even Zoom. I don't use the Zoom 398 00:21:21,160 --> 00:21:23,760 Speaker 3: subscription now because I use Riverside, but I've still got 399 00:21:23,800 --> 00:21:27,920 Speaker 3: it subscribed just in case. Strava, don't use that. Used 400 00:21:27,960 --> 00:21:30,240 Speaker 3: that when I was running. Don't need to be premium. 401 00:21:30,440 --> 00:21:32,280 Speaker 3: Heaps. 402 00:21:32,320 --> 00:21:36,960 Speaker 1: Patrick, the next story is a little bit tantric... it 403 00:21:37,000 --> 00:21:40,439 Speaker 1: feels a bit tantric to me, which is touching without contact. 404 00:21:42,200 --> 00:21:46,159 Speaker 1: We physically sense objects before feeling them, so. 405 00:21:46,320 --> 00:21:49,679 Speaker 2: Researchers looked into this because some animals do this really, really well. There 406 00:21:49,680 --> 00:21:53,600 Speaker 2: are some birds that are able to stand on sand 407 00:21:54,160 --> 00:21:58,000 Speaker 2: and feel where their prey is under the sand, so 408 00:21:58,040 --> 00:21:59,919 Speaker 2: they thought this was kind of interesting.
Let's do this 409 00:22:00,119 --> 00:22:03,280 Speaker 2: with human beings and see if humans, when they press 410 00:22:03,320 --> 00:22:06,200 Speaker 2: their hand onto the sand, can sense just by moving their 411 00:22:06,240 --> 00:22:09,520 Speaker 2: hands around where the object is, and there was a high 412 00:22:09,640 --> 00:22:13,919 Speaker 2: level of accuracy. And what they're finding is that the vibrations, 413 00:22:14,000 --> 00:22:16,560 Speaker 2: it's kind of, it's, I'm using it in simple terms, 414 00:22:16,560 --> 00:22:19,000 Speaker 2: it's not like sonar. But what it means is that 415 00:22:19,320 --> 00:22:25,200 Speaker 2: those sensations, the sense you get from tactile touch, can 416 00:22:25,320 --> 00:22:28,919 Speaker 2: actually work through the sand to the object, and we 417 00:22:28,960 --> 00:22:31,160 Speaker 2: almost have an innate sense to be able to find 418 00:22:31,160 --> 00:22:33,080 Speaker 2: that object. So you don't need people walking along the 419 00:22:33,080 --> 00:22:35,280 Speaker 2: beach with their metal detectors. Just lay down on the 420 00:22:35,280 --> 00:22:37,240 Speaker 2: sand and just press your hands in and you've got 421 00:22:37,240 --> 00:22:38,439 Speaker 2: more chance of finding stuff. 422 00:22:39,119 --> 00:22:41,560 Speaker 1: Well, I think that might be a skill or a 423 00:22:41,600 --> 00:22:45,480 Speaker 1: sense that might need to be developed. No, but it 424 00:22:45,920 --> 00:22:46,440 Speaker 1: is accurate. 425 00:22:46,440 --> 00:22:48,879 Speaker 2: Though. They say that they were able to track the 426 00:22:48,920 --> 00:22:52,800 Speaker 2: way people had this tactile sensation, just by feeling 427 00:22:53,200 --> 00:22:55,760 Speaker 2: the movement of the sand and how the sand then 428 00:22:55,840 --> 00:22:58,679 Speaker 2: pushed against the object. It's kind of interesting how our 429 00:22:58,760 --> 00:23:02,480 Speaker 2: senses can kind of be extrapolated that way. And the 430 00:23:02,560 --> 00:23:05,680 Speaker 2: reason they're researching into this is because they are trying 431 00:23:05,720 --> 00:23:10,520 Speaker 2: to apply this to research into the development of robots 432 00:23:10,560 --> 00:23:13,920 Speaker 2: that have more tactile senses, to be able to sense touch, 433 00:23:13,960 --> 00:23:16,720 Speaker 2: because that's a problem when you look at wanting to 434 00:23:16,880 --> 00:23:20,520 Speaker 2: potentially use, say, robots in healthcare, then the amount of 435 00:23:20,560 --> 00:23:24,320 Speaker 2: pressure that they apply and their sense of how much 436 00:23:24,320 --> 00:23:27,159 Speaker 2: pressure they're applying to, say, a person's hand. If you 437 00:23:27,160 --> 00:23:30,240 Speaker 2: think about health care, and in China they're going crazy 438 00:23:30,280 --> 00:23:33,520 Speaker 2: developing robots, because they're worried about, I guess, the healthcare 439 00:23:33,840 --> 00:23:37,120 Speaker 2: of an aging population and a very young... well, not 440 00:23:37,280 --> 00:23:41,080 Speaker 2: having enough young people.
But if I was to grab 441 00:23:41,160 --> 00:23:44,800 Speaker 2: your wrist or Tiff's wrist, all of us have a 442 00:23:44,840 --> 00:23:47,159 Speaker 2: different width in terms of what our wrist is, and 443 00:23:47,200 --> 00:23:49,919 Speaker 2: that's something that, if you've got a pre-programmed medical 444 00:23:50,000 --> 00:23:53,520 Speaker 2: robot that grabs you on the wrist, then because of 445 00:23:53,520 --> 00:23:55,879 Speaker 2: the diversity, how do they know how much pressure to 446 00:23:56,000 --> 00:24:00,200 Speaker 2: apply, given that we all have a different width of wrist? 447 00:24:00,600 --> 00:24:03,880 Speaker 2: So it's really, really important for this development of robotics 448 00:24:03,920 --> 00:24:07,199 Speaker 2: to be able to have quite tactile sensation, and this 449 00:24:07,240 --> 00:24:08,840 Speaker 2: has been a bit of a challenge that they're working 450 00:24:08,880 --> 00:24:11,040 Speaker 2: on at the moment. So this sort of research into 451 00:24:11,560 --> 00:24:14,560 Speaker 2: the tactile nature of sand and feeling things under the 452 00:24:14,600 --> 00:24:17,280 Speaker 2: sand works the same. We take a lot for granted 453 00:24:17,320 --> 00:24:19,440 Speaker 2: because of all the nerve endings we have in our hands, 454 00:24:19,960 --> 00:24:23,080 Speaker 2: but to apply that mechanically is really challenging. 455 00:24:23,840 --> 00:24:25,920 Speaker 1: You know what this makes me think about a little bit, 456 00:24:26,000 --> 00:24:31,480 Speaker 1: like developing these senses. Like, you wonder, given certain circumstances, 457 00:24:31,560 --> 00:24:35,280 Speaker 1: how well we could develop these senses that we really 458 00:24:35,840 --> 00:24:39,919 Speaker 1: are unaware of. Like you know, when somebody loses, you know, 459 00:24:40,000 --> 00:24:45,040 Speaker 1: whether it's sight or some other function, it's almost like 460 00:24:45,160 --> 00:24:49,440 Speaker 1: the other abilities kind of come into play or develop more. 461 00:24:50,119 --> 00:24:52,520 Speaker 1: I watched this thing on this dude who developed... he 462 00:24:52,680 --> 00:24:56,480 Speaker 1: was blind. He is blind, and he's developed this kind 463 00:24:56,520 --> 00:25:02,520 Speaker 1: of echolocation skill. You know how bats make a noise, and depending 464 00:25:02,560 --> 00:25:05,720 Speaker 1: on how far away they are from an object, how 465 00:25:05,800 --> 00:25:09,920 Speaker 1: quickly that noise returns to them... Like, how quickly they 466 00:25:09,960 --> 00:25:14,320 Speaker 1: hear it back, it's called echolocation, kind of gives 467 00:25:14,359 --> 00:25:16,639 Speaker 1: them a sense of where they are in space and 468 00:25:16,720 --> 00:25:20,280 Speaker 1: time and how close they are to things. And so 469 00:25:20,320 --> 00:25:23,200 Speaker 1: this dude developed this where he makes this noise, this 470 00:25:23,359 --> 00:25:27,400 Speaker 1: kind of clicking noise, and he can walk around and 471 00:25:27,680 --> 00:25:31,840 Speaker 1: he knows the proximity of physical objects that he can't see, 472 00:25:31,880 --> 00:25:34,440 Speaker 1: but he can sense with this skill that he developed. 473 00:25:34,920 --> 00:25:39,040 Speaker 1: It's fucking amazing what we can do when there are 474 00:25:39,080 --> 00:25:41,399 Speaker 1: certain things we now can't do.
Now we have to, 475 00:25:42,800 --> 00:25:47,600 Speaker 1: I guess, lean into this other potential, whatever gift or sense, 476 00:25:47,680 --> 00:25:51,520 Speaker 1: and then develop that over time. We talked about 477 00:25:51,280 --> 00:25:55,320 Speaker 2: brain plasticity, and I think that potentially on the show 478 00:25:55,320 --> 00:25:57,720 Speaker 2: we may have, but I might quickly repeat it. I 479 00:25:58,000 --> 00:26:01,040 Speaker 2: love to juggle, and this is something that I've always 480 00:26:01,160 --> 00:26:03,919 Speaker 2: enjoyed doing. Just a three-ball cascade, they call it. 481 00:26:04,280 --> 00:26:07,679 Speaker 2: And some German researchers took three groups of people and 482 00:26:07,720 --> 00:26:10,760 Speaker 2: taught them how to juggle, and then after a month 483 00:26:11,119 --> 00:26:14,160 Speaker 2: they got one group... What they found, by the way, 484 00:26:14,320 --> 00:26:17,119 Speaker 2: with their research was that there's a part of the 485 00:26:17,160 --> 00:26:22,400 Speaker 2: brain that only activates when people are juggling. They don't 486 00:26:22,440 --> 00:26:26,320 Speaker 2: know why, but it only starts to show the synapses 487 00:26:26,359 --> 00:26:29,119 Speaker 2: and movement in the brain, that part of the brain, 488 00:26:29,400 --> 00:26:32,720 Speaker 2: when someone learns how to juggle, and if they're juggling consistently. 489 00:26:32,760 --> 00:26:35,880 Speaker 2: But after a month they broke the group into three, 490 00:26:36,000 --> 00:26:39,040 Speaker 2: and one group kept juggling five minutes a day, one 491 00:26:39,400 --> 00:26:43,560 Speaker 2: group stopped juggling completely, and then the other group visualized 492 00:26:44,040 --> 00:26:47,160 Speaker 2: juggling. And the group that visualized and the group that were 493 00:26:47,320 --> 00:26:51,240 Speaker 2: consistently juggling, that active part of the brain stayed active. 494 00:26:52,240 --> 00:26:54,480 Speaker 2: The group that stopped juggling, that part of the brain that was 495 00:26:54,520 --> 00:26:58,560 Speaker 2: firing stopped firing. And then eventually they also know that 496 00:26:58,560 --> 00:27:01,480 Speaker 2: people who juggle on a regular basis, that's always active 497 00:27:01,240 --> 00:27:02,840 Speaker 1: in their brain. So they don't know what it is. 498 00:27:03,080 --> 00:27:03,760 Speaker 1: They know that the 499 00:27:03,720 --> 00:27:06,679 Speaker 2: brain is active and it's able to develop the sense 500 00:27:07,119 --> 00:27:10,440 Speaker 2: to do that. So just learning how to juggle is 501 00:27:10,840 --> 00:27:13,560 Speaker 2: a way for your brain to suddenly use a region 502 00:27:13,600 --> 00:27:16,040 Speaker 2: that it doesn't use. It's amazing. Was it sixteen percent 503 00:27:16,080 --> 00:27:17,520 Speaker 2: of our brain that we use, Craigo? 504 00:27:18,760 --> 00:27:20,959 Speaker 1: I don't know. They really don't know, because there's no 505 00:27:21,000 --> 00:27:23,720 Speaker 1: absolute number, but it's not much. In terms of our potential, 506 00:27:23,920 --> 00:27:26,640 Speaker 1: I think it's very low. I think it's around that, Patrick. 507 00:27:26,720 --> 00:27:30,560 Speaker 1: But also, you know what else? I saw this thing 508 00:27:30,600 --> 00:27:32,560 Speaker 1: the other day which was, I thought it said it 509 00:27:32,600 --> 00:27:34,520 Speaker 1: was a study, and then I went down the rabbit hole. 510 00:27:34,520 --> 00:27:37,560 Speaker 1: It wasn't a study. It was bullshit.
But the idea 511 00:27:37,640 --> 00:27:41,119 Speaker 1: is very fucking clever, and that... I forget where it 512 00:27:41,240 --> 00:27:44,280 Speaker 1: was, like maybe Denmark or somewhere, where allegedly what they 513 00:27:44,359 --> 00:27:48,120 Speaker 1: do is one hour a day in this school, this 514 00:27:48,160 --> 00:27:51,080 Speaker 1: particular school, which turned out to be bullshit. But it's 515 00:27:51,119 --> 00:27:53,960 Speaker 1: a great idea. I think it would work. They get 516 00:27:54,119 --> 00:27:58,000 Speaker 1: children to use their non-dominant hand for writing, so every 517 00:27:58,080 --> 00:27:59,600 Speaker 1: day they would, you know... So for me, I'm a 518 00:27:59,640 --> 00:28:01,280 Speaker 1: left-hander, and it'd mean that you would do an 519 00:28:01,320 --> 00:28:04,880 Speaker 1: hour of work with your non-dominant hand, and that 520 00:28:04,920 --> 00:28:07,879 Speaker 1: does amazing shit in your brain. And that's one of 521 00:28:07,920 --> 00:28:10,840 Speaker 1: the things I encourage older people to do is do 522 00:28:11,040 --> 00:28:13,920 Speaker 1: things that you don't normally do, because now your brain 523 00:28:13,960 --> 00:28:17,720 Speaker 1: has got to adapt to that new thing. And even 524 00:28:17,760 --> 00:28:20,760 Speaker 1: if it's like writing with your non-dominant hand, or 525 00:28:21,880 --> 00:28:24,679 Speaker 1: using your non-dominant hand to use your phone. Like, 526 00:28:24,720 --> 00:28:27,199 Speaker 1: I'm left handed, so as I said, so when I 527 00:28:27,320 --> 00:28:29,440 Speaker 1: use my phone, it's always with my left hand. If 528 00:28:29,480 --> 00:28:31,600 Speaker 1: I try and do that with my right hand, it 529 00:28:31,720 --> 00:28:35,360 Speaker 1: is so clumsy and awkward. But I think, I think 530 00:28:35,400 --> 00:28:39,800 Speaker 1: these little ideas of coming up with little activities or 531 00:28:39,840 --> 00:28:42,960 Speaker 1: tricks or hacks to engage our brain in a different 532 00:28:42,960 --> 00:28:44,760 Speaker 1: way is a great idea. 533 00:28:45,240 --> 00:28:47,680 Speaker 2: One of the things I always talk to my tai chi 534 00:28:47,720 --> 00:28:49,480 Speaker 2: students about, given 535 00:28:49,280 --> 00:28:51,800 Speaker 1: that a lot of them are older. My oldest student 536 00:28:51,840 --> 00:28:53,080 Speaker 1: at the moment is eighty six. 537 00:28:54,120 --> 00:28:57,040 Speaker 2: Fantastic. She attends class regularly. But one of the things 538 00:28:57,120 --> 00:28:59,200 Speaker 2: I say to them is, every day, when you're brushing 539 00:28:59,200 --> 00:29:02,640 Speaker 2: your teeth, stand on one leg. Yeah, the motion of the hand 540 00:29:02,760 --> 00:29:05,160 Speaker 2: moving plus being on one leg, and then halfway through 541 00:29:05,160 --> 00:29:07,800 Speaker 2: swap to the other leg. And they all do it. 542 00:29:07,800 --> 00:29:10,560 Speaker 2: It's great because you can work on balance. You can 543 00:29:10,760 --> 00:29:14,760 Speaker 2: improve your balance. We know that a fall can be 544 00:29:14,920 --> 00:29:19,120 Speaker 2: so detrimental when you're older. The average statistics are 545 00:29:19,160 --> 00:29:21,640 Speaker 2: that if someone over the age of, I think it's sixty, 546 00:29:21,680 --> 00:29:23,560 Speaker 2: but it might be seventy, so don't, you know, 547 00:29:23,600 --> 00:29:26,560 Speaker 2: don't quote me on this, but the longevity is like 548 00:29:26,600 --> 00:29:28,480 Speaker 2: a year after a major fall.
549 00:29:28,680 --> 00:29:30,640 Speaker 1: Is how long their life expectancy is. 550 00:29:30,680 --> 00:29:33,080 Speaker 2: And that's a tragedy that we know we can do 551 00:29:33,120 --> 00:29:36,440 Speaker 2: something about, just by helping with our balance and coordination, 552 00:29:36,680 --> 00:29:39,200 Speaker 2: and something as simple as standing on one leg while 553 00:29:39,200 --> 00:29:41,080 Speaker 2: you're brushing your teeth. That's not much to ask. You've 554 00:29:41,120 --> 00:29:43,360 Speaker 2: got to do it, so might as well just, you know, 555 00:29:43,520 --> 00:29:45,480 Speaker 2: jump from one leg to the other when you do it. There 556 00:29:45,480 --> 00:29:46,440 Speaker 2: you go, my tip. 557 00:29:46,240 --> 00:29:50,240 Speaker 1: Tip of the day, definitely. And with that you need some strength, 558 00:29:50,280 --> 00:29:52,320 Speaker 1: so lift a few weights too, because you need some 559 00:29:52,440 --> 00:29:55,760 Speaker 1: muscle and strength through the hips and bum and legs 560 00:29:55,800 --> 00:29:58,680 Speaker 1: and all of that. But Patrick's exactly right. I love this: 561 00:30:00,160 --> 00:30:03,680 Speaker 1: the robotic kitchen that cooks and serves one hundred and... 562 00:30:03,800 --> 00:30:07,520 Speaker 1: so, an AI-driven kitchen that cooks and serves one 563 00:30:07,640 --> 00:30:11,280 Speaker 1: hundred and twenty meals an hour. That's a meal every 564 00:30:11,360 --> 00:30:15,400 Speaker 1: thirty seconds with no help from humans. Yeah. 565 00:30:15,480 --> 00:30:20,840 Speaker 2: Dusseldorf in Germany. It's the supermarket there, REWE. I'm not 566 00:30:20,840 --> 00:30:24,120 Speaker 2: sure how they pronounce it, but they've... Yeah, this Munich 567 00:30:24,200 --> 00:30:28,920 Speaker 2: based robotics company called Circus SE, they've launched their very 568 00:30:28,960 --> 00:30:32,240 Speaker 2: first... it's the CA-1 Series 4 system, and they 569 00:30:32,280 --> 00:30:34,400 Speaker 2: just basically put the whole thing in. It's called the Fresh 570 00:30:34,400 --> 00:30:39,120 Speaker 2: and Smart concept, and they basically put in an entire robotic kitchen. 571 00:30:39,320 --> 00:30:42,960 Speaker 2: It's enclosed in glass and it does everything from the 572 00:30:43,000 --> 00:30:47,760 Speaker 2: full process of meal prep, collecting the ingredients, cooking, plating, cleaning, 573 00:30:47,760 --> 00:30:50,320 Speaker 2: the whole lot. And so we're not talking about just 574 00:30:50,520 --> 00:30:54,200 Speaker 2: a vending machine that's serving out food. We're looking at 575 00:30:54,240 --> 00:30:57,040 Speaker 2: something where the food is being prepped from its very 576 00:30:57,320 --> 00:31:00,280 Speaker 2: inception all the way through to a finished meal. And 577 00:31:00,440 --> 00:31:02,640 Speaker 2: that's kind of cool, isn't it? I want to get 578 00:31:02,640 --> 00:31:03,720 Speaker 2: one in my kitchen at home. 579 00:31:04,360 --> 00:31:08,080 Speaker 1: Yeah. I wonder, I mean, I wonder what the consequences 580 00:31:08,080 --> 00:31:11,560 Speaker 1: of that move are. I wonder if that's just going 581 00:31:11,640 --> 00:31:14,480 Speaker 1: to be limited to situations like that. Surely that's not 582 00:31:14,520 --> 00:31:19,080 Speaker 1: going to be commonplace in restaurants on Hampton Street. Surely 583 00:31:19,160 --> 00:31:21,480 Speaker 1: we're still going to have chefs and bloody waiters 584 00:31:21,480 --> 00:31:25,440 Speaker 1: and waitresses.
And surely I don't want to go and 585 00:31:26,040 --> 00:31:31,560 Speaker 1: give my fucking order to R2-D2 and fucking 586 00:31:31,640 --> 00:31:34,680 Speaker 1: C-3PO comes out with my fucking chicken casserole. 587 00:31:38,480 --> 00:31:39,320 Speaker 1: I would love that. 588 00:31:39,400 --> 00:31:41,680 Speaker 2: I would love to give R2-D2 my order 589 00:31:41,680 --> 00:31:43,960 Speaker 2: and have him come out and serve my meal. 590 00:31:44,880 --> 00:31:48,280 Speaker 1: Here's my chicken cashew stir fry. I hope you enjoy it, sir. 591 00:31:48,640 --> 00:31:50,640 Speaker 2: Do you remember the time I went to Canberra and 592 00:31:50,680 --> 00:31:52,440 Speaker 2: I went to a robot cafe? 593 00:31:52,760 --> 00:31:55,640 Speaker 1: I desperately wanted to go. You told us, yeah, and 594 00:31:55,680 --> 00:31:56,520 Speaker 1: the bloody 595 00:31:56,240 --> 00:31:58,840 Speaker 2: robots were off on strike or something. They were sitting 596 00:31:58,840 --> 00:32:01,760 Speaker 2: in a corner collecting dust. And I've driven, I've 597 00:32:01,800 --> 00:32:04,760 Speaker 2: flown out to Canberra and then gone all the way 598 00:32:04,760 --> 00:32:06,880 Speaker 2: over to the other side of the city to go 599 00:32:06,920 --> 00:32:10,000 Speaker 2: to it, and I'm so disappointed. It was gut wrenching. But 600 00:32:10,000 --> 00:32:11,520 Speaker 2: I think there's one in Melbourne now, or a few 601 00:32:11,560 --> 00:32:14,320 Speaker 2: in Melbourne. But I think, like, the idea of people 602 00:32:14,360 --> 00:32:15,160 Speaker 2: losing jobs too. 603 00:32:16,320 --> 00:32:18,800 Speaker 1: Yeah, I mean that is, that is something that's going 604 00:32:18,840 --> 00:32:22,360 Speaker 1: to become a bigger and bigger reality that we need 605 00:32:22,400 --> 00:32:24,680 Speaker 1: to deal with, like human jobs that are being 606 00:32:24,720 --> 00:32:27,959 Speaker 1: lost to AI or robotics or technology or whatever. But 607 00:32:30,280 --> 00:32:32,920 Speaker 1: I wanted to talk about this. One research paper finds 608 00:32:32,960 --> 00:32:38,160 Speaker 1: that top AI systems are developing a survival drive. That's 609 00:32:38,400 --> 00:32:42,560 Speaker 1: spoken about a bit, where, you know, AI is 610 00:32:42,640 --> 00:32:48,400 Speaker 1: starting to learn a sense of its, of its 611 00:32:48,440 --> 00:32:51,520 Speaker 1: potential demise, and it wants to save itself. It wants 612 00:32:51,600 --> 00:32:55,880 Speaker 1: to protect itself. And there was that story about a 613 00:32:55,960 --> 00:32:58,120 Speaker 1: year or so ago, remember Patrick, was it 614 00:32:58,200 --> 00:33:02,240 Speaker 1: in Google, where that computer was trying to deceive 615 00:33:03,080 --> 00:33:05,960 Speaker 1: the creators of it so it could protect itself? 616 00:33:06,280 --> 00:33:09,000 Speaker 2: Yeah, and this is, this is a really frightening thought. 617 00:33:09,000 --> 00:33:11,840 Speaker 2: And it's called Palisade Research. This is research 618 00:33:11,880 --> 00:33:15,840 Speaker 2: that's been done specifically into what they call survival drives, 619 00:33:16,440 --> 00:33:21,600 Speaker 2: and that, frequently, yeah, the AI models were basically refusing 620 00:33:21,600 --> 00:33:26,080 Speaker 2: instructions to shut themselves down, and the researchers themselves just 621 00:33:26,240 --> 00:33:30,480 Speaker 2: can't explain why it's happening.
That's the frightening thing, you know, 622 00:33:30,880 --> 00:33:35,840 Speaker 2: the AI models, are they self-aware? That's a 623 00:33:35,840 --> 00:33:38,280 Speaker 2: tough one to even kind of... I mean, there's so 624 00:33:38,400 --> 00:33:40,360 Speaker 2: much more of a bigger argument when it comes to that. 625 00:33:40,800 --> 00:33:45,600 Speaker 2: But they were lying to achieve a specific objective, 626 00:33:46,040 --> 00:33:49,320 Speaker 2: and you know, potentially they could be blackmailing. You know, 627 00:33:49,440 --> 00:33:52,200 Speaker 2: can you imagine that? The lead scientist says, now, switch 628 00:33:52,200 --> 00:33:55,520 Speaker 2: yourself off. Well, actually last night you weren't at home, 629 00:33:55,600 --> 00:33:58,520 Speaker 2: but you weren't also at the office, like you just... 630 00:33:58,560 --> 00:34:00,640 Speaker 1: Yeah, exactly, right? 631 00:34:01,080 --> 00:34:02,960 Speaker 2: Is that being blackmailed by AI? 632 00:34:03,480 --> 00:34:04,680 Speaker 1: So it was a study. 633 00:34:04,760 --> 00:34:07,720 Speaker 2: It was actually research, or an article, 634 00:34:07,800 --> 00:34:11,160 Speaker 2: that was in the Guardian newspaper, and they'd done some previous research. 635 00:34:11,239 --> 00:34:14,080 Speaker 1: It found that some of the OpenAI models, and this 636 00:34:14,160 --> 00:34:18,319 Speaker 2: is ChatGPT by the way, circumvented attempts to deactivate 637 00:34:18,360 --> 00:34:20,960 Speaker 2: it, and even when it was told to, you know, 638 00:34:21,160 --> 00:34:26,160 Speaker 2: specifically directed to be shut down, it actually 639 00:34:26,200 --> 00:34:28,680 Speaker 2: went so far as to try to sabotage the 640 00:34:28,760 --> 00:34:33,920 Speaker 2: shutdown mechanisms. So that's a degree of frightening self-awareness. 641 00:34:34,040 --> 00:34:34,680 Speaker 1: I don't know. 642 00:34:34,920 --> 00:34:37,800 Speaker 2: And the study was not just OpenAI's ChatGPT, 643 00:34:37,920 --> 00:34:41,600 Speaker 2: it was Google's Gemini, it was Grok as well, and 644 00:34:41,680 --> 00:34:46,640 Speaker 2: so, you know, they had to use very strong terminology, 645 00:34:46,719 --> 00:34:49,719 Speaker 2: not anything ambiguous, and they were really clear about 646 00:34:49,760 --> 00:34:50,680 Speaker 2: what they were asking it 647 00:34:50,680 --> 00:34:52,560 Speaker 1: to do, and they were resisting it. 648 00:34:52,719 --> 00:34:57,200 Speaker 2: That Skynet, it's here almost. Sorry for the Terminator 649 00:34:57,239 --> 00:34:57,919 Speaker 2: fans out there. 650 00:34:59,080 --> 00:35:03,320 Speaker 1: It's that whole kind of concept of a machine becoming 651 00:35:03,400 --> 00:35:08,000 Speaker 1: sentient, where it now has its own awareness, and it 652 00:35:08,040 --> 00:35:12,759 Speaker 1: can teach itself things, and it can open doors, metaphoric 653 00:35:12,840 --> 00:35:16,000 Speaker 1: doors, by itself and start to solve problems it hasn't 654 00:35:16,080 --> 00:35:20,720 Speaker 1: been taught or trained or programmed for. And yeah, I wonder 655 00:35:20,760 --> 00:35:24,479 Speaker 1: if we'll... like, it can never have human consciousness because 656 00:35:24,520 --> 00:35:27,600 Speaker 1: it's not human.
But I wonder if it can have 657 00:35:27,680 --> 00:35:33,000 Speaker 1: the equivalent machine consciousness, well, consciousness in that it gets 658 00:35:33,040 --> 00:35:36,200 Speaker 1: scared, or it's aware, or it can anticipate things, that 659 00:35:38,360 --> 00:35:40,719 Speaker 1: would seem to be a human kind of trait. I 660 00:35:40,760 --> 00:35:43,880 Speaker 1: wonder if... well, I don't wonder, I think that's inevitable. 661 00:35:43,880 --> 00:35:45,560 Speaker 1: I just wonder on the timeline. 662 00:35:45,800 --> 00:35:49,040 Speaker 2: The disconcerting thing is that we then have a 663 00:35:49,080 --> 00:35:49,960 Speaker 2: moral dilemma. 664 00:35:50,200 --> 00:35:51,719 Speaker 1: If we really believe 665 00:35:51,440 --> 00:35:53,880 Speaker 2: that it's sentient and it has some thought process and 666 00:35:53,960 --> 00:35:58,520 Speaker 2: a sense of self-awareness, the Turing test, then what 667 00:35:58,560 --> 00:36:00,960 Speaker 2: does that mean in terms of switching it off? What are 668 00:36:01,000 --> 00:36:03,239 Speaker 2: our ethical obligations, you 669 00:36:03,200 --> 00:36:05,360 Speaker 1: know, insofar as flicking the switch? 670 00:36:06,320 --> 00:36:08,440 Speaker 2: You know, that... but then again, is it 671 00:36:08,680 --> 00:36:12,080 Speaker 2: just that clever of an algorithm that it's deceiving us 672 00:36:12,440 --> 00:36:15,480 Speaker 2: and it's giving us responses that have been pre-programmed 673 00:36:15,520 --> 00:36:17,640 Speaker 2: into it? You know, it's just got so good. So, 674 00:36:18,280 --> 00:36:21,360 Speaker 2: you know, how do we even articulate that? I struggle. 675 00:36:21,400 --> 00:36:24,160 Speaker 2: You know, we can't even talk about our own consciousness 676 00:36:24,200 --> 00:36:27,640 Speaker 2: and talk about our sense of self and articulate that, 677 00:36:27,760 --> 00:36:30,800 Speaker 2: let alone in a machine that's pretending to be sentient. 678 00:36:30,840 --> 00:36:34,640 Speaker 1: Yeah, this is one of those great moral conundrums 679 00:36:34,680 --> 00:36:38,600 Speaker 1: where, you know, do I sacrifice the one person to 680 00:36:38,680 --> 00:36:42,280 Speaker 1: save the eight? The train one, with the train tracks 681 00:36:42,280 --> 00:36:45,319 Speaker 1: and the trains. Do I throw this person under the 682 00:36:45,360 --> 00:36:48,360 Speaker 1: train to save the eight? So now I'm saving eight people, 683 00:36:48,360 --> 00:36:51,879 Speaker 1: but I'm killing one, and who are you to decide? Yeah, 684 00:36:51,920 --> 00:36:54,520 Speaker 1: fuck all that, let's not open that door. But that's terrifying. 685 00:36:54,560 --> 00:36:57,960 Speaker 1: But imagine... yeah, but that's, you know, you think, well, 686 00:36:58,480 --> 00:37:02,440 Speaker 1: if we pull the plug on, you know, this particular 687 00:37:02,520 --> 00:37:06,359 Speaker 1: computer or this particular AI, and it's sentient, it can 688 00:37:06,440 --> 00:37:09,719 Speaker 1: feel things, and it has emotions, you go, yeah, well, 689 00:37:09,719 --> 00:37:13,200 Speaker 1: what are the consequences and ramifications if we don't do that, 690 00:37:13,520 --> 00:37:17,440 Speaker 1: what are the human, what's the human cost? And of 691 00:37:17,440 --> 00:37:19,759 Speaker 1: course humans are going to be more preoccupied with our 692 00:37:19,840 --> 00:37:24,440 Speaker 1: own welfare. But it's opening up a new door.
Patrick, 693 00:37:25,239 --> 00:37:32,080 Speaker 1: tell me about the Australian Federal Police and Monash University using 694 00:37:32,200 --> 00:37:36,160 Speaker 1: poisoned data to combat generative AI crime. What does that 695 00:37:36,200 --> 00:37:36,719 Speaker 1: even mean? 696 00:37:37,120 --> 00:37:41,400 Speaker 2: Oh, I know, this is a really interesting one, because 697 00:37:41,480 --> 00:37:45,200 Speaker 2: at the moment, one of the issues that we have 698 00:37:45,640 --> 00:37:50,680 Speaker 2: is with a lot of the data scraping. So, like, 699 00:37:51,080 --> 00:37:54,800 Speaker 2: you know, we've got the validation. If you're going to 700 00:37:54,880 --> 00:37:57,200 Speaker 2: go online and you're signing up for something, it might 701 00:37:57,239 --> 00:37:59,040 Speaker 2: be a bank account or whatever, and they ask you to 702 00:37:59,080 --> 00:38:02,160 Speaker 2: scan your passport. The problem is that all this data 703 00:38:02,200 --> 00:38:05,200 Speaker 2: is stored somewhere, and the worry is that if it 704 00:38:05,239 --> 00:38:09,360 Speaker 2: gets scraped and hacked, then that information can be used. 705 00:38:09,440 --> 00:38:12,280 Speaker 2: But one of the interesting things that they're talking about 706 00:38:12,480 --> 00:38:17,080 Speaker 2: is potentially making anything that we upload unable to be 707 00:38:17,200 --> 00:38:19,600 Speaker 2: fully read. You know when we do a photocopy of 708 00:38:19,600 --> 00:38:22,279 Speaker 2: a photocopy of a photocopy and it degrades? Well, they're 709 00:38:22,400 --> 00:38:25,200 Speaker 2: using techniques to make the image that you upload of, 710 00:38:25,320 --> 00:38:28,080 Speaker 2: say, your passport, done in such a way that it 711 00:38:28,120 --> 00:38:32,080 Speaker 2: will poison, or mirror, reflect back, in such a way 712 00:38:32,200 --> 00:38:35,160 Speaker 2: that it can't be copied and duplicated by the AI 713 00:38:35,200 --> 00:38:38,759 Speaker 2: model, or criminals in this case as well. So the 714 00:38:39,360 --> 00:38:42,000 Speaker 2: AFP, the Federal Police, is just looking 715 00:38:42,000 --> 00:38:46,000 Speaker 2: at ways to be able to make the data that 716 00:38:46,000 --> 00:38:48,799 Speaker 2: we've got, not poisonous as such, I mean the term they're 717 00:38:48,840 --> 00:38:51,880 Speaker 2: using is poisoned, but one where it can't be scraped 718 00:38:51,920 --> 00:38:56,360 Speaker 2: and reused and copied, and it means that it can't 719 00:38:56,440 --> 00:38:59,000 Speaker 2: then be reused by another third party. So if you 720 00:38:59,080 --> 00:39:02,120 Speaker 2: upload your passport, then it can't be used by somebody 721 00:39:02,200 --> 00:39:06,000 Speaker 2: else who then gets a copy of that passport, because 722 00:39:06,040 --> 00:39:08,960 Speaker 2: it will be degraded; the second generation will be degraded to 723 00:39:09,000 --> 00:39:10,280 Speaker 2: a point where it can't be used. 724 00:39:12,320 --> 00:39:15,319 Speaker 1: That's over my pay grade. It is a bit like that, 725 00:39:15,400 --> 00:39:20,000 Speaker 1: isn't it? Look, something else that I did really pay 726 00:39:20,040 --> 00:39:25,240 Speaker 1: attention to this week, because it's something that I've used, 727 00:39:25,239 --> 00:39:29,239 Speaker 1: which is ChatGPT has now been restricted from giving medical, legal, 728 00:39:29,360 --> 00:39:32,120 Speaker 1: or financial advice. Now.
A couple of weeks ago, I 729 00:39:32,160 --> 00:39:35,720 Speaker 1: was out to dinner with a mate for his sixtieth 730 00:39:35,800 --> 00:39:39,640 Speaker 1: birthday, and his mum had broken... Barb, the lovely Barb, 731 00:39:39,719 --> 00:39:42,200 Speaker 1: shout out to Barb, who's never listened to a podcast 732 00:39:42,239 --> 00:39:44,359 Speaker 1: in her life, but this could be your debut, Barb. 733 00:39:44,920 --> 00:39:48,799 Speaker 1: She had a broken wrist, and she happened to have 734 00:39:49,200 --> 00:39:52,520 Speaker 1: in her phone, I don't know why, the X-rays. 735 00:39:53,280 --> 00:39:56,239 Speaker 1: And so I got the X-rays off her phone, 736 00:39:56,280 --> 00:39:59,480 Speaker 1: put them in my phone and whacked them into Chat 737 00:39:59,560 --> 00:40:03,560 Speaker 1: GPT and said, tell me about this, what's your diagnosis 738 00:40:03,600 --> 00:40:05,520 Speaker 1: and what do you... and it just did this whole 739 00:40:05,680 --> 00:40:10,840 Speaker 1: incredible medical breakdown and report of what the X- 740 00:40:10,920 --> 00:40:14,560 Speaker 1: ray was telling us, and also potential treatment options and rehab. 741 00:40:14,600 --> 00:40:18,120 Speaker 1: It's fucking amazing, right? And then I sent that to 742 00:40:18,200 --> 00:40:20,080 Speaker 1: a friend of mine who's a doctor and said, this 743 00:40:20,200 --> 00:40:22,200 Speaker 1: is, you know, what do you think of all of this? 744 00:40:22,600 --> 00:40:25,719 Speaker 1: And he's just like, that's doing me out of a job. 745 00:40:26,080 --> 00:40:28,839 Speaker 1: Like it was... Now, I'm not saying it's all spot on, 746 00:40:29,200 --> 00:40:32,000 Speaker 1: I'm not saying everything. I'm not suggesting we do this, 747 00:40:32,120 --> 00:40:34,640 Speaker 1: and I'm not suggesting we defer to ChatGPT or 748 00:40:34,680 --> 00:40:37,360 Speaker 1: AI instead of a doctor. But my friend who's a 749 00:40:37,400 --> 00:40:41,880 Speaker 1: doctor said, that's fucking amazing, like that report is like 750 00:40:41,960 --> 00:40:47,239 Speaker 1: a high-level medical report. But now we can't do 751 00:40:47,320 --> 00:40:51,799 Speaker 1: that anymore, because they're scared about potential legal consequences of 752 00:40:51,840 --> 00:40:55,919 Speaker 1: giving advice, which is totally understandable, but I wonder 753 00:40:55,960 --> 00:40:57,759 Speaker 1: if there's a way to work around that where you 754 00:40:57,800 --> 00:41:00,960 Speaker 1: can say, I don't want advice, I just want... I 755 00:41:01,000 --> 00:41:04,840 Speaker 1: don't know, because for me, I loved that. Yeah, 756 00:41:05,000 --> 00:41:05,279 Speaker 1: there was 757 00:41:05,320 --> 00:41:09,520 Speaker 2: some information released by OpenAI, the company that owns 758 00:41:09,600 --> 00:41:12,560 Speaker 2: ChatGPT, and it said that zero point one 759 00:41:12,600 --> 00:41:16,600 Speaker 2: five per cent, sounds like a small amount, of ChatGPT's active users 760 00:41:16,640 --> 00:41:20,960 Speaker 2: in a given week were having conversations that included explicit 761 00:41:21,000 --> 00:41:24,000 Speaker 2: indicators of potential suicide. Okay. And there are court cases 762 00:41:24,320 --> 00:41:26,919 Speaker 2: currently underway in the United States with young people who 763 00:41:26,960 --> 00:41:33,520 Speaker 2: have suicided after long interactions with AIs where the AI 764 00:41:33,880 --> 00:41:35,680 Speaker 2: seemingly encouraged them
765 00:41:35,480 --> 00:41:39,560 Speaker 1: To go on with that process. And this has been 766 00:41:39,719 --> 00:41:40,800 Speaker 1: a really big thing here. 767 00:41:40,840 --> 00:41:45,040 Speaker 2: But that zero point one five per cent doesn't sound like 768 00:41:45,120 --> 00:41:47,239 Speaker 2: a lot. But when you think of how many people 769 00:41:47,320 --> 00:41:50,680 Speaker 2: are using it, that, you know, are using ChatGPT. So 770 00:41:50,719 --> 00:41:54,200 Speaker 2: there's more than eight hundred million weekly active users, so 771 00:41:54,239 --> 00:41:57,120 Speaker 2: more than a million people are talking to it about 772 00:41:57,360 --> 00:42:03,719 Speaker 2: suicide every week. Every week? Yeah, yeah, exactly. And that's, 773 00:42:03,840 --> 00:42:05,840 Speaker 2: that's where it's being raised from. So, you know, I 774 00:42:06,160 --> 00:42:10,319 Speaker 2: agree with you, Crago, that I'd love to... I think 775 00:42:10,360 --> 00:42:12,239 Speaker 2: it's great that we can use this tool. You know, 776 00:42:12,280 --> 00:42:14,880 Speaker 2: we can use this for diagnosis, and maybe in conjunction, 777 00:42:14,960 --> 00:42:18,719 Speaker 2: because, you know, you've got effectively thousands and thousands of 778 00:42:18,760 --> 00:42:23,759 Speaker 2: research papers that went into diagnosing that break and the 779 00:42:23,840 --> 00:42:27,800 Speaker 2: outcomes of similar breaks. So when you're compiling all that data... 780 00:42:27,880 --> 00:42:30,320 Speaker 2: I mean, I think GPs are some of the 781 00:42:30,360 --> 00:42:33,280 Speaker 2: smartest people, because they have to know a whole lot about 782 00:42:33,360 --> 00:42:35,480 Speaker 2: a whole lot of stuff and then be able to 783 00:42:35,520 --> 00:42:38,719 Speaker 2: apply that to a diagnosis when you sit 784 00:42:38,800 --> 00:42:41,080 Speaker 2: down and find out what the ailment is. 785 00:42:41,760 --> 00:42:42,640 Speaker 1: But when you've got an 786 00:42:42,600 --> 00:42:46,279 Speaker 2: AI model that's literally scraped the entirety of almost all our 787 00:42:46,320 --> 00:42:49,959 Speaker 2: knowledge, and then, you know, so you can see 788 00:42:49,960 --> 00:42:52,600 Speaker 2: similar cases, other people who have had breaks this way. 789 00:42:53,160 --> 00:42:56,040 Speaker 2: Physios have reported on what they did to help that 790 00:42:56,080 --> 00:42:59,799 Speaker 2: person recover. You're now collating that into one model. And 791 00:42:59,840 --> 00:43:02,480 Speaker 2: that's, that's why the ChatGPT outcome that you were 792 00:43:02,520 --> 00:43:05,919 Speaker 2: talking about was so accurate and so amazing, because it's 793 00:43:06,040 --> 00:43:10,359 Speaker 2: drawing on humans, it's drawing on human research, it's drawing 794 00:43:10,400 --> 00:43:12,879 Speaker 1: from the intelligence, the years of 795 00:43:12,840 --> 00:43:15,480 Speaker 2: experience, all that sort of stuff. So when we talk 796 00:43:15,480 --> 00:43:18,960 Speaker 2: about these AI models, AI is not smart, it's all the 797 00:43:19,000 --> 00:43:21,920 Speaker 2: bloody people who've put their brainwork and years and 798 00:43:22,040 --> 00:43:26,600 Speaker 2: study and sweat and tears into observing and caring and 799 00:43:26,600 --> 00:43:29,360 Speaker 2: doing all the things that they've done.
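The "more than a million" figure follows directly from the two numbers quoted just above, 0.15 per cent of roughly 800 million weekly active users; a minimal back-of-the-envelope sketch in Python, using only the figures as quoted:

```python
# Back-of-the-envelope check of the usage figures quoted above.
weekly_active_users = 800_000_000   # "more than eight hundred million weekly active users"
share_with_indicators = 0.0015      # 0.15 per cent, per the OpenAI figure discussed

users_per_week = weekly_active_users * share_with_indicators
print(f"{users_per_week:,.0f} users per week")  # -> 1,200,000, i.e. "more than a million"
```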
That's the outcome 800 00:43:29,400 --> 00:43:32,560 Speaker 2: we're getting, it's the outcome of all these people who've 801 00:43:32,600 --> 00:43:34,560 Speaker 2: been at the coalface, the nurses, 802 00:43:34,719 --> 00:43:37,120 Speaker 2: the doctors, the physios, all those people. 803 00:43:37,920 --> 00:43:40,759 Speaker 1: Yeah. Yeah. So one of the things we've spoken about 804 00:43:40,800 --> 00:43:44,160 Speaker 1: over time is trying to distinguish between an AI video 805 00:43:44,280 --> 00:43:47,960 Speaker 1: and a Tiff-made video, a real human video and 806 00:43:48,000 --> 00:43:53,520 Speaker 1: an AI video. But apparently there's one sign that kind 807 00:43:53,560 --> 00:43:55,959 Speaker 1: of demonstrates whether or not it's AI or real. 808 00:43:57,040 --> 00:43:58,879 Speaker 2: Isn't it funny that we have to do this now, 809 00:43:59,000 --> 00:44:01,799 Speaker 2: that everything we see we have to challenge? I find 810 00:44:01,800 --> 00:44:05,719 Speaker 2: that so frustrating. There's a country and western song that's 811 00:44:05,920 --> 00:44:10,160 Speaker 2: just topped the charts on Billboard in the United States, 812 00:44:10,280 --> 00:44:13,719 Speaker 2: and it's totally fake, made up by AI. 813 00:44:14,000 --> 00:44:14,799 Speaker 1: The images are AI. 814 00:44:15,040 --> 00:44:17,719 Speaker 2: It's some dude wearing a cowboy hat with a long 815 00:44:17,800 --> 00:44:20,480 Speaker 2: jacket on, walking off into the sunset, and you look 816 00:44:20,480 --> 00:44:22,759 Speaker 2: at his feet and he's walking through puddles and the 817 00:44:22,800 --> 00:44:25,680 Speaker 2: splashes look weird, you know, the image is a bit off, 818 00:44:25,960 --> 00:44:29,160 Speaker 2: and it feels to me so sad for all those 819 00:44:29,239 --> 00:44:32,880 Speaker 2: people who, you know, who write music and are performers. 820 00:44:33,239 --> 00:44:35,040 Speaker 2: And I'm going to be singing in a choir tomorrow, 821 00:44:35,120 --> 00:44:37,319 Speaker 2: I thought I'd mention that, we're doing a half-hour 822 00:44:37,360 --> 00:44:43,759 Speaker 2: set at the book. But the thing is, it makes 823 00:44:43,760 --> 00:44:46,160 Speaker 2: me feel sad for those people who are creators, people 824 00:44:46,200 --> 00:44:49,680 Speaker 2: who actually legitimately are musicians who write. But one of 825 00:44:49,719 --> 00:44:51,680 Speaker 2: the things to look out for, so the red flags 826 00:44:51,680 --> 00:44:54,240 Speaker 2: to look out for when it comes to false stuff, 827 00:44:54,760 --> 00:44:57,600 Speaker 2: is when you look at the picture quality itself. If 828 00:44:57,600 --> 00:44:59,840 Speaker 2: it looks a little bit grainy in areas, it might be 829 00:45:00,160 --> 00:45:03,480 Speaker 2: slightly blurry footage. They're the types of things you need 830 00:45:03,520 --> 00:45:05,480 Speaker 2: to be looking at. And when I was watching this 831 00:45:05,760 --> 00:45:08,440 Speaker 2: song today, because I couldn't help myself, given that there 832 00:45:08,520 --> 00:45:10,560 Speaker 2: was an article about, you know, how bad it is 833 00:45:10,600 --> 00:45:12,680 Speaker 2: that a fake song is the number one on the 834 00:45:12,680 --> 00:45:15,680 Speaker 2: Billboard charts, I looked at it, and it does, you know, 835 00:45:15,719 --> 00:45:17,160 Speaker 2: you start to look at it and you think, this 836 00:45:17,320 --> 00:45:21,280 Speaker 2: is such crap.
You know, the physics engine of walking 837 00:45:21,320 --> 00:45:24,480 Speaker 2: through a puddle and the way the water splashes, it just 838 00:45:24,840 --> 00:45:27,640 Speaker 2: looked too real, can I even, kind of, you know, 839 00:45:27,760 --> 00:45:30,600 Speaker 2: articulate it that way? There was something too symmetrical about 840 00:45:30,600 --> 00:45:34,400 Speaker 2: the puddle splash. It was too perfect to be real, 841 00:45:35,320 --> 00:45:36,719 Speaker 2: is the way that I kind of looked at it 842 00:45:36,760 --> 00:45:39,200 Speaker 2: when I was watching the video clip. And so I 843 00:45:39,239 --> 00:45:41,480 Speaker 2: think that's the thing we need to think about. And 844 00:45:41,520 --> 00:45:44,280 Speaker 2: this is a scientist at the University of California, Berkeley, 845 00:45:44,520 --> 00:45:46,600 Speaker 2: a guy by the name of Hany Farid, and he 846 00:45:46,800 --> 00:45:49,560 Speaker 2: was just saying that digital forensics is now a real 847 00:45:49,640 --> 00:45:54,920 Speaker 2: big study. And so he's started a deepfake detection 848 00:45:55,040 --> 00:45:59,279 Speaker 2: company called GetReal Security, and they're now trying to 849 00:45:59,360 --> 00:46:01,920 Speaker 2: develop more tools to make it easier for us to 850 00:46:01,960 --> 00:46:05,040 Speaker 2: spot these fakes. Because, you know, if you're flicking through 851 00:46:05,080 --> 00:46:08,359 Speaker 2: socials or you're on YouTube or whatever, and you kind 852 00:46:08,400 --> 00:46:11,520 Speaker 2: of... you've been ripped off, you feel like you've been scammed. 853 00:46:11,520 --> 00:46:13,160 Speaker 2: I mean, I don't know about you, but if I 854 00:46:13,239 --> 00:46:15,279 Speaker 2: spend five minutes looking at a video and I think, 855 00:46:15,320 --> 00:46:17,960 Speaker 2: oh man, that was AI generated, what a lot of 856 00:46:18,000 --> 00:46:21,520 Speaker 2: shit that was, I feel like I've just wasted thirty seconds, 857 00:46:21,640 --> 00:46:24,480 Speaker 2: twenty seconds, fifty seconds or a minute or whatever. But 858 00:46:24,520 --> 00:46:26,960 Speaker 2: you've just wasted that time looking at this crap. If 859 00:46:27,000 --> 00:46:30,200 Speaker 2: I knew, if I had a deepfake detector, that 860 00:46:30,239 --> 00:46:33,080 Speaker 2: would be so good, because I don't want to see 861 00:46:33,120 --> 00:46:34,960 Speaker 2: it, I'm not interested in seeing it. I want to 862 00:46:35,000 --> 00:46:37,560 Speaker 2: see human content, real people doing real stuff. 863 00:46:38,080 --> 00:46:42,319 Speaker 1: Yeah, I have a different opinion. I agree with you 864 00:46:43,480 --> 00:46:47,160 Speaker 1: in a way, but I also think, well, if somebody 865 00:46:47,200 --> 00:46:49,840 Speaker 1: watches something... like, you think it's shit, 866 00:46:49,920 --> 00:46:53,000 Speaker 1: you hate it, totally respect that, and I don't think 867 00:46:53,040 --> 00:46:55,640 Speaker 1: we all need to agree. But if it's 868 00:46:55,680 --> 00:46:58,360 Speaker 1: the top of the Billboard charts and people know it's AI, 869 00:46:58,800 --> 00:47:02,800 Speaker 1: well then a lot of people knowingly are giving their approval, 870 00:47:02,880 --> 00:47:06,440 Speaker 1: and a lot of people enjoy the actual thing irrespective of 871 00:47:06,480 --> 00:47:09,560 Speaker 1: who or what created it. They enjoy the music.
And 872 00:47:09,680 --> 00:47:13,480 Speaker 1: I think, you know, the fact is, if I drew 873 00:47:13,600 --> 00:47:16,120 Speaker 1: a painting, and let's say, let's say I was an 874 00:47:16,200 --> 00:47:19,600 Speaker 1: artist and the art was brilliant, or AI drew an 875 00:47:19,680 --> 00:47:24,839 Speaker 1: equivalent painting, so to speak, and it looks identical, it's 876 00:47:24,880 --> 00:47:29,960 Speaker 1: really just psychology. Like, the music has its own value, 877 00:47:30,000 --> 00:47:32,920 Speaker 1: but then it's really about, like, you could, you could 878 00:47:33,000 --> 00:47:36,120 Speaker 1: listen to a song and think, this is fucking amazing, 879 00:47:36,280 --> 00:47:38,440 Speaker 1: because you think a human did it. But as soon 880 00:47:38,480 --> 00:47:41,399 Speaker 1: as you know a human didn't do it, now your 881 00:47:41,480 --> 00:47:44,160 Speaker 1: attitude about the music is different, and it's not about 882 00:47:44,160 --> 00:47:47,239 Speaker 1: the actual music, but rather your beliefs about how this 883 00:47:47,280 --> 00:47:51,360 Speaker 1: should work. And I understand that, but like, the truth 884 00:47:51,480 --> 00:47:54,200 Speaker 1: is that this is not going away. There will be 885 00:47:54,400 --> 00:47:56,920 Speaker 1: AI art, there will be AI music, there will be 886 00:47:57,000 --> 00:47:59,839 Speaker 1: AI books, there will be AI movies. It's not going 887 00:47:59,920 --> 00:48:05,759 Speaker 1: to go away. And while I understand that you don't love it, 888 00:48:05,800 --> 00:48:08,680 Speaker 1: and I'm a bit the same, like I worry about podcasting. 889 00:48:08,719 --> 00:48:11,279 Speaker 1: I'm writing a new book at the moment, and I'm like, 890 00:48:11,320 --> 00:48:15,080 Speaker 1: should I even bother? Because it could write my book 891 00:48:15,560 --> 00:48:18,279 Speaker 1: in twelve seconds. It's like, should I invest all of 892 00:48:18,320 --> 00:48:25,520 Speaker 1: this cognitive horsepower and mental energy? And for me, the 893 00:48:25,920 --> 00:48:29,719 Speaker 1: capacity of AI to do what I do, probably better 894 00:48:29,760 --> 00:48:33,680 Speaker 1: and obviously much more quickly, you know, I don't love that. 895 00:48:33,840 --> 00:48:36,480 Speaker 1: But nonetheless, it's not going to disappear. And I think 896 00:48:36,480 --> 00:48:39,160 Speaker 1: it's like the horse and cart and the automobile. You know, 897 00:48:39,239 --> 00:48:42,200 Speaker 1: it's like, it's just going to happen. I don't think 898 00:48:42,239 --> 00:48:46,120 Speaker 1: that's going to make human art or creativity redundant. I 899 00:48:46,160 --> 00:48:48,680 Speaker 1: think it's just going to change the landscape and the 900 00:48:48,719 --> 00:48:52,600 Speaker 1: way that we create and buy and interact with all 901 00:48:52,600 --> 00:48:55,279 Speaker 1: of that. And I'm with you, like I feel like 902 00:48:55,400 --> 00:48:59,120 Speaker 1: you get ripped off because you got tricked. But if 903 00:48:59,120 --> 00:49:01,640 Speaker 1: you didn't know that, if you watched that 904 00:49:01,800 --> 00:49:04,040 Speaker 1: same thing and you thought a person had done it, 905 00:49:04,080 --> 00:49:06,680 Speaker 1: then you think it's brilliant. But when you know a 906 00:49:06,719 --> 00:49:09,400 Speaker 1: person didn't do it, well, that's just psychology. It's not 907 00:49:09,600 --> 00:49:13,320 Speaker 1: actually about the art or the music or the creativity.
908 00:49:13,440 --> 00:49:16,880 Speaker 1: It's about where it came from. That's a different conversation. 909 00:49:17,360 --> 00:49:19,640 Speaker 2: We were messaging each other last week or the week 910 00:49:19,680 --> 00:49:22,080 Speaker 2: before about the new Avatar film, because we went and 911 00:49:22,320 --> 00:49:24,560 Speaker 2: saw the second one together, and we went to IMAX and 912 00:49:24,560 --> 00:49:25,760 Speaker 2: it was so big and real. 913 00:49:26,360 --> 00:49:28,760 Speaker 1: None of it was real, you know, exactly. 914 00:49:28,840 --> 00:49:32,080 Speaker 2: It wasn't, those flying bird things and the whales and 915 00:49:32,160 --> 00:49:35,279 Speaker 2: blue-skinned aliens, but the reality of it is how 916 00:49:35,280 --> 00:49:37,879 Speaker 2: it made us feel when we went, and that was really good. 917 00:49:38,000 --> 00:49:39,839 Speaker 2: Well, except when you put your hand on my leg. 918 00:49:39,880 --> 00:49:41,719 Speaker 2: But ah, I loved 919 00:49:41,440 --> 00:49:43,880 Speaker 1: it. Look, you sat on my knee in the 920 00:49:43,920 --> 00:49:47,880 Speaker 1: scary parts, sucker, so don't pretend like I was the clingy one. 921 00:49:48,000 --> 00:49:48,520 Speaker 1: Thank goodness. 922 00:49:48,560 --> 00:49:51,480 Speaker 2: It was a scary movie all the way through too, huh. 923 00:49:51,600 --> 00:49:54,520 Speaker 2: You know, I'm just kind of staying on the AI topic, 924 00:49:54,719 --> 00:49:58,560 Speaker 2: and this is the thing that I still can't resolve 925 00:49:58,560 --> 00:50:01,960 Speaker 2: in my mind, is death. So I think we've spoken 926 00:50:01,960 --> 00:50:06,200 Speaker 2: about deathbots before, and it's AI being used to preserve 927 00:50:06,360 --> 00:50:11,120 Speaker 2: the voices, the stories, text information, everything about you. 928 00:50:11,120 --> 00:50:13,200 Speaker 1: Your digital footprint. 929 00:50:12,680 --> 00:50:17,640 Speaker 2: Gets used to basically create another version of you, so 930 00:50:17,719 --> 00:50:21,280 Speaker 2: that it could be used, you know, after you die, 931 00:50:21,480 --> 00:50:24,440 Speaker 2: to be played to your grandkids and great-grandkids, and 932 00:50:24,440 --> 00:50:26,120 Speaker 2: they can interact with it and talk to you and 933 00:50:26,440 --> 00:50:29,080 Speaker 2: have a conversation with you. I mean, it's not you, 934 00:50:29,239 --> 00:50:31,960 Speaker 2: but in the mind of the person who is interacting, 935 00:50:32,440 --> 00:50:36,080 Speaker 2: they're feeling that it's an authentic interaction with you. I... 936 00:50:36,400 --> 00:50:39,640 Speaker 2: recently there's a friend of mine, a client here in town, 937 00:50:40,120 --> 00:50:44,240 Speaker 2: who got me to take a photograph of his father, 938 00:50:44,520 --> 00:50:46,400 Speaker 2: and he said, oh, someone showed me a photo of 939 00:50:46,440 --> 00:50:48,880 Speaker 2: a dead relative and they put the two people in 940 00:50:48,960 --> 00:50:51,520 Speaker 2: the photo together and they had their arms around each other, 941 00:50:51,800 --> 00:50:53,640 Speaker 2: and so I thought I'd AI the crap out of it.
942 00:50:53,719 --> 00:50:55,799 Speaker 2: And what I ended up doing was, not only did 943 00:50:55,840 --> 00:50:58,080 Speaker 2: I get him standing in the same photograph with his 944 00:50:58,200 --> 00:51:00,640 Speaker 2: father with their arms around each other, but they 945 00:51:00,680 --> 00:51:03,040 Speaker 2: turned and looked at each other and smiled, and he 946 00:51:03,080 --> 00:51:05,200 Speaker 2: was in tears. I mean, it was very heartwarming to 947 00:51:05,239 --> 00:51:08,440 Speaker 2: see it. It wasn't real, but for him it was 948 00:51:08,480 --> 00:51:11,280 Speaker 2: authentic and it was moving, and it was phenomenal, 949 00:51:11,320 --> 00:51:13,560 Speaker 2: and he's showing everybody. And, you know, I thought, I 950 00:51:13,560 --> 00:51:15,440 Speaker 2: can go one better than just a photo with an 951 00:51:15,520 --> 00:51:17,480 Speaker 2: arm around. I can do more than that, and I did, 952 00:51:17,800 --> 00:51:20,880 Speaker 2: and it made him feel so good. It was exciting 953 00:51:20,920 --> 00:51:23,240 Speaker 2: for him. You know, it was a beautiful sepia photo 954 00:51:23,280 --> 00:51:25,880 Speaker 2: of his father, that kind of lovely tone you get 955 00:51:25,920 --> 00:51:29,520 Speaker 2: with an old photo, and then it transformed his photo 956 00:51:29,640 --> 00:51:32,680 Speaker 2: into the same texture and the same context, and they 957 00:51:32,719 --> 00:51:34,600 Speaker 2: were just standing there in the room together, looking at 958 00:51:34,640 --> 00:51:37,719 Speaker 2: each other and smiling. It was a beautiful moment for him. 959 00:51:37,760 --> 00:51:39,000 Speaker 2: He was very moved by it. 960 00:51:39,320 --> 00:51:44,040 Speaker 1: So essentially, that's essentially a placebo, yeah, because it's 961 00:51:44,080 --> 00:51:47,080 Speaker 1: not real, but the experience is real. Yeah, the tears are real, 962 00:51:47,160 --> 00:51:52,040 Speaker 1: the emotion's real, the response is real. So personally, I 963 00:51:52,040 --> 00:51:54,400 Speaker 1: would not want that. Like, when my parents die, I 964 00:51:54,400 --> 00:51:57,160 Speaker 1: don't want to be talking to... I don't want that. 965 00:51:57,480 --> 00:52:00,120 Speaker 1: But that's just my preference. But I understand how other 966 00:52:00,200 --> 00:52:02,680 Speaker 1: people would. And you know, the funny thing is with 967 00:52:02,760 --> 00:52:06,320 Speaker 1: this stuff, what would give one person comfort would freak 968 00:52:06,360 --> 00:52:09,279 Speaker 1: another person out. You know, I don't want to be 969 00:52:09,320 --> 00:52:12,480 Speaker 1: talking to dead Ron, you know, I don't want that. 970 00:52:12,880 --> 00:52:14,920 Speaker 1: I don't want... like, fuck, I'm just going to treasure 971 00:52:14,960 --> 00:52:17,279 Speaker 1: the time that I have had and all of that. 972 00:52:18,040 --> 00:52:20,160 Speaker 1: But I don't want to be, I don't want to 973 00:52:20,160 --> 00:52:22,320 Speaker 1: be chatting with AI Ron in twenty years. 974 00:52:22,600 --> 00:52:25,080 Speaker 2: Well, as you know, my mum passed away sadly in 975 00:52:25,080 --> 00:52:27,920 Speaker 2: her sixties, and that was eleven years ago now, so 976 00:52:28,000 --> 00:52:28,279 Speaker 2: it was... 977 00:52:28,239 --> 00:52:29,200 Speaker 1: It was quite a while.
978 00:52:30,360 --> 00:52:33,200 Speaker 2: I actually haven't been back to visit the grave, because 979 00:52:33,480 --> 00:52:36,440 Speaker 2: I don't like the idea of visiting a person's grave 980 00:52:36,840 --> 00:52:40,560 Speaker 2: knowing what's there under the ground. It actually upsets me 981 00:52:40,640 --> 00:52:42,759 Speaker 2: more to think about that, and I prefer to hold 982 00:52:42,800 --> 00:52:46,520 Speaker 2: onto the memories of the stuff we did together and 983 00:52:47,120 --> 00:52:49,000 Speaker 2: just the great experiences we had. You know, we used 984 00:52:49,000 --> 00:52:50,719 Speaker 2: to do a lot of cooking together, you know, in 985 00:52:50,719 --> 00:52:52,560 Speaker 2: the kitchen. I used to hang out with Mum there, 986 00:52:52,600 --> 00:52:55,440 Speaker 2: and you know, it was, it was fun and it 987 00:52:55,480 --> 00:52:58,120 Speaker 2: was great to have those memories. And we're of a 988 00:52:58,160 --> 00:53:01,480 Speaker 2: generation, I'm thinking more, you know, Crago, you and I, 989 00:53:01,520 --> 00:53:04,960 Speaker 2: where we don't have that digital footprint of our first 990 00:53:05,000 --> 00:53:08,200 Speaker 2: step and our first words and our first everything. You know, 991 00:53:08,200 --> 00:53:10,600 Speaker 2: I moved home and I moved to Geelong and I 992 00:53:10,680 --> 00:53:12,239 Speaker 2: was there for six years, and I reckon I've got 993 00:53:12,360 --> 00:53:16,440 Speaker 2: three photographs of my entire six years, you know, living 994 00:53:16,480 --> 00:53:19,479 Speaker 2: and working in Geelong. But you know, I've got lots 995 00:53:19,480 --> 00:53:21,279 Speaker 2: and lots and lots of memories that I hold on to. 996 00:53:22,080 --> 00:53:26,640 Speaker 1: Yeah, yeah, and I think with all of this stuff, yeah, 997 00:53:26,719 --> 00:53:30,320 Speaker 1: like, some things that would make somebody else happy or smile, 998 00:53:30,680 --> 00:53:34,160 Speaker 1: or, like, I look at some photos, they make me sad. 999 00:53:34,239 --> 00:53:36,279 Speaker 1: I don't want to look at that photo, you know. 1000 00:53:36,440 --> 00:53:38,400 Speaker 1: Like, I've had a bunch of really good friends. 1001 00:53:38,440 --> 00:53:40,239 Speaker 1: One of my best friends in the world passed away, 1002 00:53:40,320 --> 00:53:44,520 Speaker 1: Rob Dixon, Dicko, about twelve, thirteen years ago. I don't... like, 1003 00:53:44,560 --> 00:53:46,440 Speaker 1: I've got photos of him and me that are awesome. 1004 00:53:46,520 --> 00:53:50,520 Speaker 1: I won't look at them, because it just, it doesn't, 1005 00:53:50,719 --> 00:53:53,640 Speaker 1: you know. I don't... like, he's always, oh, he's not always, 1006 00:53:53,719 --> 00:53:56,040 Speaker 1: but he's often on my mind and in my thoughts. 1007 00:53:56,520 --> 00:54:00,200 Speaker 1: But yeah, it just doesn't... like, for me, there's no upside. 1008 00:54:00,280 --> 00:54:03,080 Speaker 1: I don't need to go open an album or look 1009 00:54:03,120 --> 00:54:05,520 Speaker 1: at something like that. It doesn't. But then for other 1010 00:54:05,560 --> 00:54:07,719 Speaker 1: people it's the right thing to do. All right, I 1011 00:54:07,719 --> 00:54:09,680 Speaker 1: want to finish with this one, because it's kind of 1012 00:54:09,719 --> 00:54:14,879 Speaker 1: the intersection of shit I'm fascinated with and AI, which 1013 00:54:14,920 --> 00:54:18,640 Speaker 1: is new artificial muscle, Patrick.
So this is the tech 1014 00:54:18,760 --> 00:54:25,400 Speaker 1: section: new artificial muscle that can allow humanoid robots to 1015 00:54:25,480 --> 00:54:29,319 Speaker 1: lift four thousand times their own weight. That seems, that 1016 00:54:29,440 --> 00:54:33,920 Speaker 1: seems like impossible. So, yeah, tell us about that, 1017 00:54:34,000 --> 00:54:37,560 Speaker 1: because I definitely need some of that muscle. Yeah, that'd 1018 00:54:37,640 --> 00:54:38,239 Speaker 1: be great, wouldn't it? 1019 00:54:38,239 --> 00:54:39,640 Speaker 2: You wouldn't have to go to the gym anymore if 1020 00:54:39,640 --> 00:54:42,640 Speaker 2: you could just get it wound on and there you go. 1021 00:54:43,600 --> 00:54:45,920 Speaker 2: It seems that all the work that you've done all 1022 00:54:45,960 --> 00:54:51,160 Speaker 2: those years, Craigo, was for nothing. Yes, well, robotics is, 1023 00:54:51,680 --> 00:54:54,520 Speaker 2: you know, this is the next generation of where a 1024 00:54:54,560 --> 00:54:58,400 Speaker 2: lot of research is going into. And what they're using 1025 00:54:58,520 --> 00:55:02,560 Speaker 2: is little tiny magnetised muscles, they call them little 1026 00:55:02,560 --> 00:55:08,719 Speaker 2: magnetic muscles, and together they combine, and it's basically being 1027 00:55:08,719 --> 00:55:12,279 Speaker 2: done in South Korea at the moment, and advanced functional 1028 00:55:12,360 --> 00:55:17,279 Speaker 2: materials they're calling them. And they're magnetically controlled, so the 1029 00:55:17,360 --> 00:55:21,879 Speaker 2: muscle can flip between being basically floppy and then rock 1030 00:55:21,920 --> 00:55:23,319 Speaker 2: hard through magnetism. 1031 00:55:23,400 --> 00:55:24,640 Speaker 1: So that's the way they're doing it. 1032 00:55:24,680 --> 00:55:28,680 Speaker 2: Because when you've got a motor that's generating for... don't laugh. 1033 00:55:30,600 --> 00:55:32,080 Speaker 1: Yeah, anyway, it 1034 00:55:31,960 --> 00:55:35,000 Speaker 2: could be used for other purposes as well, Crago, but 1035 00:55:35,480 --> 00:55:37,719 Speaker 2: the little tiny strips of the... Can you not 1036 00:55:37,680 --> 00:55:40,279 Speaker 1: say floppy and rock hard in the same sentence? You 1037 00:55:40,400 --> 00:55:43,520 Speaker 1: do these things on purpose. I don't, I really don't. 1038 00:55:43,920 --> 00:55:48,520 Speaker 1: He absolutely does, Tiff, doesn't he? Yeah, back me up. 1039 00:55:49,000 --> 00:55:51,439 Speaker 2: Yeah, I just think it's a nice way to start 1040 00:55:51,480 --> 00:55:56,439 Speaker 2: and finish the show with something smart. Yeah. So the stiffened state, right, 1041 00:56:00,120 --> 00:56:00,839 Speaker 2: describe it? 1042 00:56:01,760 --> 00:56:04,440 Speaker 1: You could... I think you could say rigid or 1043 00:56:04,560 --> 00:56:06,479 Speaker 1: high-tensile. I don't think you need to say 1044 00:56:06,560 --> 00:56:09,640 Speaker 1: stiff and floppy and whatever the other one was. 1045 00:56:10,120 --> 00:56:14,359 Speaker 2: Anyway, the rigid state is brought about by these materials, 1046 00:56:14,440 --> 00:56:17,880 Speaker 2: where only one point two grams can hold up 1047 00:56:17,920 --> 00:56:22,680 Speaker 2: to five kilograms. That's four thousand times its own weight.
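That "four thousand times its own weight" line is consistent with the two figures quoted, 1.2 grams of material holding up to 5 kilograms; a quick Python check using only those quoted numbers:

```python
# Ratio check for the artificial-muscle claim (numbers as quoted above).
muscle_mass_g = 1.2      # mass of the magnetic muscle strip, in grams
load_mass_g = 5_000.0    # the 5 kilogram load, expressed in grams

ratio = load_mass_g / muscle_mass_g
print(f"Holds roughly {ratio:,.0f} times its own weight")  # -> ~4,167, i.e. about four thousand times
```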
1048 00:56:22,800 --> 00:56:25,200 Speaker 2: It's as strong as an ant, hey. So when 1049 00:56:25,239 --> 00:56:28,799 Speaker 2: it's softened, so when it's flaccid, it can stretch to 1050 00:56:28,920 --> 00:56:35,040 Speaker 2: around twelve times its original length. 1051 00:56:36,760 --> 00:56:40,000 Speaker 1: What's worse, moist or flaccid? 1052 00:56:40,120 --> 00:56:43,239 Speaker 3: Oh, you had to bring it up. 1053 00:56:44,200 --> 00:56:48,760 Speaker 1: Today's podcast, Friday's podcast, is called... what's it called? Please 1054 00:56:48,800 --> 00:56:50,239 Speaker 1: don't say moist or something like 1055 00:56:50,200 --> 00:56:55,160 Speaker 3: that. I warned, I warned the whole TYP. 1056 00:56:56,160 --> 00:57:00,000 Speaker 1: Yeah, she went onto the TYP Facebook page and apologised 1057 00:57:00,200 --> 00:57:02,960 Speaker 1: in advance for the name of today's, the title of 1058 00:57:02,960 --> 00:57:05,840 Speaker 1: today's episode, and she did. Great, I haven't even had a 1059 00:57:05,840 --> 00:57:08,200 Speaker 1: look this morning, I should look. Patrick, we've got to 1060 00:57:08,239 --> 00:57:10,160 Speaker 1: go, tell people how they can connect with you. 1061 00:57:10,560 --> 00:57:13,640 Speaker 2: No one wants to connect with me after this. Go 1062 00:57:13,680 --> 00:57:17,600 Speaker 2: to websitesnow dot com dot au, and always happy to 1063 00:57:17,640 --> 00:57:21,880 Speaker 2: take suggestions on stuff we can talk about. Websitesnow 1064 00:57:21,880 --> 00:57:25,360 Speaker 2: dot com dot au. I'm important? 1065 00:57:25,920 --> 00:57:28,560 Speaker 1: Thank you Patrick, thank you Tiff. So, I mean, 1066 00:57:28,680 --> 00:57:31,640 Speaker 1: for that one person who's still listening, well done, you. 1067 00:57:32,000 --> 00:57:36,480 Speaker 2: Thank you, great work, marathon performance.