1 00:00:00,680 --> 00:00:02,720 Speaker 1: Alrighty team. Welcome to another installment of your 2 00:00:02,759 --> 00:00:06,439 Speaker 1: favorite show, with your favorite people: David John Ken Patrick Gillespie, 3 00:00:06,440 --> 00:00:11,440 Speaker 1: Tiffany and Margaret, Woman of Veronica. Two dogs, Cook cloak Cookie? 4 00:00:11,960 --> 00:00:12,879 Speaker 2: Can they two dogs? 5 00:00:13,400 --> 00:00:14,040 Speaker 1: Are you all right? 6 00:00:14,560 --> 00:00:15,080 Speaker 3: I'm great. 7 00:00:15,120 --> 00:00:20,720 Speaker 1: That's great. Great. Dog's good, dog good, cat good, motorbike good. 8 00:00:21,000 --> 00:00:21,880 Speaker 2: Everything's good. 9 00:00:22,600 --> 00:00:24,680 Speaker 1: If you had to get rid of the cat, the motorbike or the dog, 10 00:00:24,760 --> 00:00:25,600 Speaker 1: what would go first? 11 00:00:25,600 --> 00:00:28,920 Speaker 2: One of them? None of them. We'd all live homeless 12 00:00:28,960 --> 00:00:31,320 Speaker 2: on food stamps if we had to: me, the cat, 13 00:00:31,400 --> 00:00:32,600 Speaker 2: the dog and the motorbike. 14 00:00:32,840 --> 00:00:34,240 Speaker 1: Are food stamps still a thing? 15 00:00:34,520 --> 00:00:37,680 Speaker 3: I don't know. They never were in Australia, but I think 16 00:00:37,680 --> 00:00:38,880 Speaker 3: they are in the US. Yeah. 17 00:00:38,920 --> 00:00:43,880 Speaker 1: I think you've been watching too many Disney movies, Tiff. Hello, 18 00:00:44,520 --> 00:00:46,440 Speaker 1: Professor Gillespie, how are you? 19 00:00:46,560 --> 00:00:46,720 Speaker 2: Yeah? 20 00:00:46,880 --> 00:00:47,080 Speaker 1: Good? 21 00:00:47,120 --> 00:00:48,960 Speaker 3: Good to be here. I've got to ask. So, last 22 00:00:48,960 --> 00:00:51,519 Speaker 3: time we talked, Tiff had bought an accessory for her 23 00:00:51,640 --> 00:00:56,520 Speaker 3: bike to make it really, really noisy. Is that now fitted, Tiff? 24 00:00:56,320 --> 00:00:59,680 Speaker 2: It's been fitted. It's so good.
I am so obsessed. 25 00:00:59,720 --> 00:01:02,680 Speaker 2: Now, you've got no idea, David. We'll talk about it later. 26 00:01:03,160 --> 00:01:05,640 Speaker 1: No, talk about it, talk about it now, because I 27 00:01:05,680 --> 00:01:07,520 Speaker 1: want people to know what I have to contend with. 28 00:01:08,560 --> 00:01:12,080 Speaker 3: So you were frustrated at riding around on the equivalent 29 00:01:12,120 --> 00:01:13,960 Speaker 3: of a Singer sewing machine, and now it's a Singer 30 00:01:14,000 --> 00:01:16,600 Speaker 3: sewing machine that sounds like a Harley, is it? Is it 31 00:01:16,680 --> 00:01:17,240 Speaker 3: really? 32 00:01:17,360 --> 00:01:21,320 Speaker 2: And it's like, I can't even describe it. It has transformed 33 00:01:21,360 --> 00:01:24,240 Speaker 2: my relationship with the bike. I cannot ride it enough. 34 00:01:25,640 --> 00:01:29,440 Speaker 1: There aren't too many women that I know that are... 35 00:01:30,319 --> 00:01:32,679 Speaker 1: I mean, in fact, you might be the only one 36 00:01:33,000 --> 00:01:38,400 Speaker 1: that's totally enamored with a motorbike that's loud and obnoxious. 37 00:01:39,200 --> 00:01:42,320 Speaker 1: So how good is it, though? It is good. It 38 00:01:42,400 --> 00:01:45,800 Speaker 1: is good. It does. So, I mean, but I'm a bogan. 39 00:01:45,959 --> 00:01:48,880 Speaker 3: So you're going to have to get a sound clip 40 00:01:49,040 --> 00:01:51,120 Speaker 3: and edit it in here, Tiff, so that people know 41 00:01:51,200 --> 00:01:52,000 Speaker 3: what we're talking about. 42 00:01:52,280 --> 00:01:54,440 Speaker 2: I'll go give it a few revs. Maybe next week 43 00:01:54,640 --> 00:01:56,279 Speaker 2: I'll get out on it, give it a few revs, 44 00:01:56,320 --> 00:01:57,640 Speaker 2: and pop it in for the listeners. 45 00:01:57,880 --> 00:01:59,840 Speaker 1: You could just go out and record it on your phone. 46 00:02:00,040 --> 00:02:02,400 Speaker 1: Could you do that?
Could you literally record it on 47 00:02:02,440 --> 00:02:05,160 Speaker 1: your phone and insert it into this episode? 48 00:02:05,360 --> 00:02:07,480 Speaker 3: I could do that. I could do it, Tiff. 49 00:02:07,840 --> 00:02:22,720 Speaker 1: Let's hear your motorbike now. Boom. And that was Tiff's motorbike, everyone. 50 00:02:22,800 --> 00:02:28,600 Speaker 1: I've just created some more... Every woman listening to the 51 00:02:28,639 --> 00:02:30,680 Speaker 1: show will be like... not every woman, but a lot 52 00:02:30,720 --> 00:02:34,639 Speaker 1: will be like, I don't get it. I didn't get it. 53 00:02:35,600 --> 00:02:36,120 Speaker 3: Get what? 54 00:02:36,480 --> 00:02:39,280 Speaker 1: What didn't you get? What do you get now? Sorry, David, 55 00:02:39,320 --> 00:02:42,680 Speaker 1: this is really... this is very inappropriate. But what do 56 00:02:42,720 --> 00:02:44,320 Speaker 1: you get now that you didn't get? 57 00:02:44,840 --> 00:02:47,239 Speaker 2: Well, I didn't get how much it was going to 58 00:02:47,480 --> 00:02:52,760 Speaker 2: change the entire riding experience, not just the, ah, everyone's 59 00:02:52,800 --> 00:02:55,520 Speaker 2: bike's louder than mine, and I'm on a Janome 60 00:02:55,840 --> 00:03:01,640 Speaker 2: sewing machine. So I go the long way home. I 61 00:03:01,840 --> 00:03:04,560 Speaker 2: couldn't be bothered getting it out, getting it unpacked, to bring 62 00:03:04,600 --> 00:03:07,920 Speaker 2: it fifteen minutes to the gym before. Now... oh, 63 00:03:08,560 --> 00:03:12,080 Speaker 2: I don't know, something, something happens. It's very visceral. It's 64 00:03:12,400 --> 00:03:14,680 Speaker 2: the sound, it's the feeling.
And do you know what, 65 00:03:14,680 --> 00:03:18,160 Speaker 2: I've got a comms unit, the communications for other riders, 66 00:03:18,200 --> 00:03:20,240 Speaker 2: and I don't like it, because it takes me away 67 00:03:20,240 --> 00:03:24,400 Speaker 2: from the presence, the sensation, of the whole bike experience. 68 00:03:25,280 --> 00:03:27,080 Speaker 3: Yeah, I know what, I know what you're talking about. 69 00:03:27,240 --> 00:03:33,839 Speaker 3: You know, my first car was a Gemini, yeah, which 70 00:03:34,320 --> 00:03:36,400 Speaker 3: was, you know, a Singer sewing machine with four 71 00:03:36,440 --> 00:03:41,800 Speaker 3: wheels. And it became a lot faster and better 72 00:03:42,240 --> 00:03:44,360 Speaker 3: when I went and bought some speed stripes and 73 00:03:44,400 --> 00:03:46,640 Speaker 3: stuck them down the side of it. 74 00:03:46,640 --> 00:03:52,280 Speaker 1: It's... it's an indictment on you that you owned 75 00:03:52,280 --> 00:03:55,720 Speaker 1: a Gemini, but that you are brave enough 76 00:03:55,760 --> 00:03:56,640 Speaker 1: to say that you... 77 00:03:56,560 --> 00:03:58,520 Speaker 3: I owned the Gemini a long time ago. 78 00:03:59,280 --> 00:04:02,640 Speaker 1: Although back in the day they were, they were kind 79 00:04:02,640 --> 00:04:04,040 Speaker 1: of okay. 80 00:04:04,080 --> 00:04:06,119 Speaker 3: It wasn't that kind of Gemini. It was a blue 81 00:04:06,240 --> 00:04:09,840 Speaker 3: nurse's car. So it was a lovely pale blue. 82 00:04:10,520 --> 00:04:14,440 Speaker 1: Ah, kind of baby-powder blue. All right. Now, we're 83 00:04:14,480 --> 00:04:17,600 Speaker 1: not here to talk about bloody motorbikes or forty-year-old 84 00:04:17,640 --> 00:04:21,120 Speaker 1: Geminis.
We're here to talk about the things that 85 00:04:21,160 --> 00:04:24,159 Speaker 1: you've been sticking your nose into, because you fucking stick 86 00:04:24,200 --> 00:04:25,680 Speaker 1: your nose in a lot of places. 87 00:04:25,760 --> 00:04:26,200 Speaker 3: You do. 88 00:04:26,880 --> 00:04:30,360 Speaker 1: You do open doors that nobody asked you to open. 89 00:04:30,520 --> 00:04:33,880 Speaker 1: Let's just say that. But you do it with gay 90 00:04:33,880 --> 00:04:38,080 Speaker 1: abandon. You just... you just throw it open and just 91 00:04:38,320 --> 00:04:42,480 Speaker 1: cartwheel in there. And so, one of... by the way, everyone, 92 00:04:43,000 --> 00:04:46,039 Speaker 1: David, who... please help him get some more followers on 93 00:04:46,120 --> 00:04:49,679 Speaker 1: bloody LinkedIn. He's up to two thousand, but my nana's 94 00:04:49,680 --> 00:04:51,760 Speaker 1: on two and a half thousand and she's dead. So 95 00:04:52,240 --> 00:04:53,960 Speaker 1: if you could give him, if you could give him 96 00:04:53,960 --> 00:04:56,800 Speaker 1: a chop out, follow him on LinkedIn. And also, if 97 00:04:56,800 --> 00:04:58,840 Speaker 1: you're not a member, or if you're not following, 98 00:04:59,120 --> 00:05:04,119 Speaker 1: The You Project podcast Facebook page, where he posts every 99 00:05:04,160 --> 00:05:06,600 Speaker 1: second or third one of his articles, jump in there. 100 00:05:07,240 --> 00:05:09,920 Speaker 1: Also follow on there. But you put up something this 101 00:05:10,040 --> 00:05:14,680 Speaker 1: week around an AI, kind of, let's call it a 102 00:05:15,680 --> 00:05:18,359 Speaker 1: resource, that was developed. Do you want to... I won't... 103 00:05:18,440 --> 00:05:21,800 Speaker 1: I won't spoil the... I won't spoil the show. Tell 104 00:05:21,880 --> 00:05:23,840 Speaker 1: us what was developed and what it's for. 105 00:05:24,560 --> 00:05:29,479 Speaker 3: Okay.
So O2, which is a British telecommunications company, 106 00:05:30,040 --> 00:05:34,839 Speaker 3: decided that, rather than just blocking scammer numbers as they're 107 00:05:34,880 --> 00:05:38,800 Speaker 3: coming through their servers, they would prefer to waste the 108 00:05:38,839 --> 00:05:43,440 Speaker 3: scammers' time, figuring that that's actually a more powerful 109 00:05:43,560 --> 00:05:47,080 Speaker 3: disincentive to phone scammers, because just blocking their numbers just 110 00:05:47,120 --> 00:05:49,440 Speaker 3: means that the computer moves on to the next number, 111 00:05:49,480 --> 00:05:53,800 Speaker 3: and, you know, they haven't really harmed the scammer at all. 112 00:05:54,240 --> 00:05:57,400 Speaker 3: So what they developed was an AI that imitates your 113 00:05:57,440 --> 00:06:02,920 Speaker 3: gran, and this AI will have a really lengthy conversation 114 00:06:03,320 --> 00:06:10,360 Speaker 3: with anyone that rings it. And it's really interesting. I 115 00:06:10,400 --> 00:06:13,360 Speaker 3: think you've got some audio here, Tiff, so whack 116 00:06:13,400 --> 00:06:16,080 Speaker 3: it in, because you're now 117 00:06:16,120 --> 00:06:19,479 Speaker 3: going to listen to the actual AI having a real 118 00:06:19,560 --> 00:06:22,080 Speaker 3: conversation with real phone scammers. 119 00:06:23,160 --> 00:06:27,400 Speaker 4: So, W... is that a dot? W... and then D... 120 00:06:27,720 --> 00:06:31,239 Speaker 4: I think you're purposely bothering people. Right, I'm just trying 121 00:06:31,600 --> 00:06:38,840 Speaker 4: to have a little chat. Gosh, how time flies... 122 00:06:40,720 --> 00:06:43,640 Speaker 4: it's showing me a picture of my cat Fluffy. 123 00:06:44,120 --> 00:06:46,479 Speaker 3: It's showing you a picture of your cat Fluffy. 124 00:06:46,720 --> 00:06:49,719 Speaker 1: It's not stupid, is it.
125 00:06:49,800 --> 00:06:53,960 Speaker 4: dear. Because while they're busy talking to me, they can't 126 00:06:53,960 --> 00:06:57,800 Speaker 4: be scamming you. And let's face it, dear, I've got 127 00:06:57,839 --> 00:06:59,279 Speaker 4: all the time in the world. 128 00:07:00,400 --> 00:07:02,000 Speaker 3: So, you know, she's got a bit of a sense 129 00:07:02,040 --> 00:07:03,800 Speaker 3: of humor as well. You know, that one, the fellow's 130 00:07:03,839 --> 00:07:06,360 Speaker 3: saying it's been nearly an hour, and she's saying, oh, 131 00:07:06,400 --> 00:07:12,120 Speaker 3: how time flies. So it's an interesting approach to the 132 00:07:12,200 --> 00:07:15,880 Speaker 3: problem, in that they figured, well, we haven't just got 133 00:07:15,920 --> 00:07:18,160 Speaker 3: one granny we can put on the phone. You know, 134 00:07:18,480 --> 00:07:20,600 Speaker 3: you haven't got one AI granny, you've got a billion of them, 135 00:07:21,520 --> 00:07:25,040 Speaker 3: so, you know, run them all simultaneously. And so that's 136 00:07:25,080 --> 00:07:28,200 Speaker 3: their plan: to just waste the scammers' time and 137 00:07:28,240 --> 00:07:33,080 Speaker 3: clog up their calling networks with conversations with not-very-real 138 00:07:33,120 --> 00:07:34,600 Speaker 3: grannies. 139 00:07:34,880 --> 00:07:39,120 Speaker 1: Is it... it must be bloody great AI if 140 00:07:39,240 --> 00:07:42,280 Speaker 1: they can keep someone on the phone for an hour 141 00:07:43,360 --> 00:07:46,840 Speaker 1: thinking that they're actually having a real conversation with a 142 00:07:46,880 --> 00:07:48,360 Speaker 1: real person. 143 00:07:48,280 --> 00:07:50,960 Speaker 3: Well, they've got a strong motivation to stay on the phone.
144 00:07:51,000 --> 00:07:55,640 Speaker 3: So you've got to remember that, you know, it's a fairly unrewarding job, 145 00:07:55,680 --> 00:07:58,400 Speaker 3: I guess, in the sense that, you know, ninety-nine 146 00:07:58,400 --> 00:08:00,120 Speaker 3: percent of people they speak to hang up within 147 00:08:00,160 --> 00:08:04,400 Speaker 3: the first second. So they want to stay on the phone. 148 00:08:04,440 --> 00:08:06,480 Speaker 3: They want to think that they're making progress. They want 149 00:08:06,520 --> 00:08:08,160 Speaker 3: to talk to someone and see if they can get 150 00:08:08,160 --> 00:08:11,040 Speaker 3: them interested in what they're doing. And the interesting 151 00:08:11,040 --> 00:08:13,120 Speaker 3: thing about an AI is that it can adapt. It 152 00:08:13,160 --> 00:08:15,800 Speaker 3: can have a genuine conversation with you. It can talk 153 00:08:15,800 --> 00:08:17,640 Speaker 3: to you about what you've said, it can listen to 154 00:08:17,680 --> 00:08:20,000 Speaker 3: what you've said, it can engage with it. And this 155 00:08:20,040 --> 00:08:25,200 Speaker 3: one has been told, programmed, to keep the conversation going, 156 00:08:25,440 --> 00:08:29,360 Speaker 3: whatever it takes. And as you heard from those examples, 157 00:08:29,360 --> 00:08:33,920 Speaker 3: it was managing to do it for an hour. Which... now, look, 158 00:08:34,080 --> 00:08:36,439 Speaker 3: I suspect this is a fairly short-term solution, and 159 00:08:36,520 --> 00:08:39,240 Speaker 3: that, you know, pretty soon these scammers will be AIs 160 00:08:39,320 --> 00:08:41,599 Speaker 3: themselves, and we'll see AIs talking to AIs, and 161 00:08:42,280 --> 00:08:44,160 Speaker 3: no one will waste anyone's time. But I guess a 162 00:08:44,200 --> 00:08:48,120 Speaker 3: few billion megacycles of computing time might go down the drain.
163 00:08:48,240 --> 00:08:52,560 Speaker 3: But it's an interesting use of AI at the moment, 164 00:08:52,640 --> 00:08:56,200 Speaker 3: and it shows how far it's gone that you could 165 00:08:56,280 --> 00:08:59,400 Speaker 3: be talking to what you think is a person in 166 00:08:59,480 --> 00:09:01,880 Speaker 3: real time and in fact you're not. 167 00:09:02,480 --> 00:09:08,480 Speaker 1: It is getting... it's getting harder and harder to identify 168 00:09:08,520 --> 00:09:10,520 Speaker 1: what is real and what is not. Like, I got 169 00:09:10,559 --> 00:09:13,480 Speaker 1: sent something today through one of the speaking agencies that 170 00:09:13,559 --> 00:09:17,880 Speaker 1: I do some work for, and I didn't open it, 171 00:09:18,120 --> 00:09:21,600 Speaker 1: but it looked legit. It was from them, it 172 00:09:22,120 --> 00:09:24,760 Speaker 1: had everything that looks like the other emails I get 173 00:09:24,800 --> 00:09:28,560 Speaker 1: from them. But it had a file to open, which 174 00:09:28,840 --> 00:09:32,080 Speaker 1: didn't make sense, so I didn't open it. And I'm 175 00:09:32,200 --> 00:09:34,120 Speaker 1: ninety-nine point nine percent sure it was 176 00:09:34,920 --> 00:09:39,520 Speaker 1: a virus or something. But I just nearly automatically opened 177 00:09:39,520 --> 00:09:42,760 Speaker 1: it, because I've opened things from them many times, and 178 00:09:42,800 --> 00:09:44,760 Speaker 1: it just came in and it's like, oh, there's a 179 00:09:44,760 --> 00:09:47,240 Speaker 1: file here, blah blah blah, about that. And then I 180 00:09:47,280 --> 00:09:49,679 Speaker 1: read what it was about, and I'm like, what is... 181 00:09:50,160 --> 00:09:53,440 Speaker 1: what is that? But I think if you're not 182 00:09:53,600 --> 00:09:56,880 Speaker 1: ever vigilant, you know, we're going to get in trouble, 183 00:09:56,880 --> 00:10:00,760 Speaker 1: because the quality...
I mean, there's also some ridiculously terrible 184 00:10:01,520 --> 00:10:04,360 Speaker 1: kind of, you know, scams, but some of them are 185 00:10:04,360 --> 00:10:05,640 Speaker 1: pretty high-level, aren't they? 186 00:10:06,480 --> 00:10:09,680 Speaker 3: Yeah. But I guess on the flip side of this, 187 00:10:09,880 --> 00:10:12,880 Speaker 3: it's worth thinking about where it could be useful. 188 00:10:13,240 --> 00:10:16,640 Speaker 3: If you've got AI capable of holding a reasonable conversation 189 00:10:16,760 --> 00:10:19,640 Speaker 3: with you, but also with access to a significant amount 190 00:10:19,640 --> 00:10:24,040 Speaker 3: of data about the organization it works for, then wouldn't 191 00:10:24,080 --> 00:10:27,559 Speaker 3: you rather be talking to an AI like that than 192 00:10:27,640 --> 00:10:30,480 Speaker 3: sitting on hold for two days waiting for your bank 193 00:10:30,520 --> 00:10:32,800 Speaker 3: to talk to you, or for your telco to talk 194 00:10:32,840 --> 00:10:35,679 Speaker 3: to you, or for Centrelink to talk to you? 195 00:10:36,440 --> 00:10:38,600 Speaker 3: Which is the alternative at the moment, which is press 196 00:10:38,720 --> 00:10:40,839 Speaker 3: this button and then go on hold for three days. 197 00:10:41,679 --> 00:10:45,000 Speaker 3: Wouldn't you rather be answered every single time, even if 198 00:10:45,000 --> 00:10:47,200 Speaker 3: it's by an AI, one that you can actually have a 199 00:10:47,240 --> 00:10:50,400 Speaker 3: conversation with and that probably has a pretty good chance 200 00:10:50,400 --> 00:10:52,040 Speaker 3: of giving you the information you're after? 201 00:10:52,559 --> 00:10:55,520 Speaker 1: And also, you're not talking to anything that gets pissed 202 00:10:55,559 --> 00:11:00,400 Speaker 1: off or impatient or frustrated, or has emotions, or is 203 00:11:00,440 --> 00:11:02,679 Speaker 1: going to judge you, or, you know...
204 00:11:03,160 --> 00:11:06,320 Speaker 3: So in the example we just played, the granny, they 205 00:11:06,320 --> 00:11:10,320 Speaker 3: were swearing at her. She's perfectly calm, just keeps the 206 00:11:10,320 --> 00:11:11,520 Speaker 3: conversation going. 207 00:11:11,800 --> 00:11:13,760 Speaker 1: Yeah, yeah, that is, that is... 208 00:11:14,679 --> 00:11:14,839 Speaker 3: Yeah. 209 00:11:15,000 --> 00:11:18,480 Speaker 1: There are a range of applications. Even with... I don't 210 00:11:18,559 --> 00:11:23,480 Speaker 1: use ChatGPT for any of my writing or, obviously, 211 00:11:23,520 --> 00:11:27,320 Speaker 1: for my research, because it's irrelevant. But sometimes I'll look 212 00:11:27,400 --> 00:11:31,959 Speaker 1: up something just to see what it says about something 213 00:11:32,160 --> 00:11:34,760 Speaker 1: I already understand, to see if what it brings up 214 00:11:34,880 --> 00:11:37,800 Speaker 1: is similar to what I would write or what I 215 00:11:37,840 --> 00:11:41,720 Speaker 1: would say. And I was thinking the other day about 216 00:11:43,720 --> 00:11:47,480 Speaker 1: what I call interpersonal intelligence, like the ability 217 00:11:47,559 --> 00:11:50,160 Speaker 1: to be able to, you know, understand others, communicate, 218 00:11:50,160 --> 00:11:54,400 Speaker 1: solve problems, work alongside other people, that kind of social intelligence. 219 00:11:54,520 --> 00:11:59,120 Speaker 1: And I thought, I've never heard the term interpersonal intelligence. 220 00:11:59,200 --> 00:12:02,320 Speaker 1: I wonder if it's actually a term and an area 221 00:12:02,320 --> 00:12:06,000 Speaker 1: of research. And I pulled it up and I said, 222 00:12:06,559 --> 00:12:10,920 Speaker 1: what is interpersonal intelligence? on ChatGPT.
And it 223 00:12:10,920 --> 00:12:12,920 Speaker 1: brought it up, and a bloke wrote a book on it 224 00:12:12,960 --> 00:12:16,280 Speaker 1: forty years ago, a whole thesis, 225 00:12:16,679 --> 00:12:18,960 Speaker 1: a guy at Harvard, a psychologist and a professor, 226 00:12:19,000 --> 00:12:22,600 Speaker 1: and yeah, it's one of eight kinds of intelligence he describes. 227 00:12:23,040 --> 00:12:25,120 Speaker 1: But anyway, so it finished, and then at the end, 228 00:12:25,840 --> 00:12:29,280 Speaker 1: unprompted by me, ChatGPT goes, I hope you find 229 00:12:29,320 --> 00:12:32,800 Speaker 1: that valuable, Craig, because it's interesting, because that intersects with 230 00:12:32,840 --> 00:12:36,040 Speaker 1: your research and is probably something you could talk about in 231 00:12:36,080 --> 00:12:39,600 Speaker 1: your workshops and on your podcast. Like, it gave me 232 00:12:39,679 --> 00:12:43,280 Speaker 1: a whole paragraph of just general advice around how I 233 00:12:43,320 --> 00:12:47,479 Speaker 1: could potentially use this information in different capacities and contexts, 234 00:12:47,840 --> 00:12:50,080 Speaker 1: and I didn't ask it any of that. I'm like, 235 00:12:50,600 --> 00:12:54,640 Speaker 1: this motherfucker has been studying me. Like, what? It's like 236 00:12:54,760 --> 00:12:55,480 Speaker 1: it knows me. 237 00:12:56,240 --> 00:12:58,600 Speaker 3: Well, that's a nice segue to the next bit that 238 00:12:58,600 --> 00:13:01,960 Speaker 3: we want to talk about, Craig, which is, you would 239 00:13:02,000 --> 00:13:06,559 Speaker 3: have seen the things about Bunnings getting a slap on 240 00:13:06,600 --> 00:13:09,400 Speaker 3: the wrist with a damp piece of lettuce by the 241 00:13:09,440 --> 00:13:12,480 Speaker 3: Privacy Commissioner saying, oh, you're naughty folks, you shouldn't have 242 00:13:12,520 --> 00:13:15,640 Speaker 3: done that.
And what Bunnings were doing was, for three 243 00:13:15,720 --> 00:13:22,520 Speaker 3: years they ran profiling software in their stores without telling 244 00:13:22,520 --> 00:13:27,119 Speaker 3: anyone they were doing it. So this was facial recognition, 245 00:13:27,480 --> 00:13:30,200 Speaker 3: and the reason that Bunnings said they were doing it 246 00:13:30,360 --> 00:13:37,040 Speaker 3: was because they wanted to identify criminals, known shoplifters, 247 00:13:37,120 --> 00:13:41,160 Speaker 3: potential troublemakers, etc., in their stores. They say seventy 248 00:13:41,160 --> 00:13:44,760 Speaker 3: percent of problems in their stores come from a very, 249 00:13:44,880 --> 00:13:47,880 Speaker 3: very small group of people that they would use this 250 00:13:48,000 --> 00:13:53,760 Speaker 3: software to identify, and the Privacy Commissioner said, whoa, you 251 00:13:53,880 --> 00:13:56,080 Speaker 3: can't do that. They didn't have a problem with them 252 00:13:56,360 --> 00:13:59,200 Speaker 3: using the software; they had a problem with them doing 253 00:13:59,200 --> 00:14:05,200 Speaker 3: it without telling anyone. And this sort of thing 254 00:14:05,320 --> 00:14:09,160 Speaker 3: is increasingly becoming part of retail. I don't know if 255 00:14:09,200 --> 00:14:14,000 Speaker 3: you've ever paid close attention to the self-checkout kiosks 256 00:14:14,000 --> 00:14:15,800 Speaker 3: in the supermarket, but if you have, you would have 257 00:14:15,880 --> 00:14:19,440 Speaker 3: noticed a real-time video of yourself operating the machine. 258 00:14:21,040 --> 00:14:25,720 Speaker 3: And there's no guarantees around any of this stuff, you know. 259 00:14:26,000 --> 00:14:27,920 Speaker 3: Bunnings said, look, it was deleted as soon as it 260 00:14:27,960 --> 00:14:30,680 Speaker 3: wasn't a match, like, you know, in less than a second 261 00:14:31,200 --> 00:14:34,240 Speaker 3: your photo was off the system.
No one was keeping anything, 262 00:14:34,960 --> 00:14:39,440 Speaker 3: and maybe that's true, it probably is. But the fact that 263 00:14:39,480 --> 00:14:42,880 Speaker 3: it's being recorded at all and is being used for identification, 264 00:14:44,960 --> 00:14:48,800 Speaker 3: I think, gives pause for thought. Because profiling software like that, 265 00:14:48,960 --> 00:14:52,160 Speaker 3: well, facial recognition software, all of us use all 266 00:14:52,200 --> 00:14:55,240 Speaker 3: the time. You probably use it every morning to unlock your phone, 267 00:14:55,320 --> 00:14:57,320 Speaker 3: every five minutes to unlock your phone, and that's using 268 00:14:57,680 --> 00:15:01,240 Speaker 3: facial recognition software. It's used at the airports to, you know, 269 00:15:01,840 --> 00:15:05,320 Speaker 3: get you through customs, all that sort of stuff. The 270 00:15:05,400 --> 00:15:08,920 Speaker 3: question then becomes, is it okay for it to just be 271 00:15:08,920 --> 00:15:11,200 Speaker 3: being used everywhere, all the time, for us to be 272 00:15:11,280 --> 00:15:15,520 Speaker 3: able to be identified, particularly in places where it can 273 00:15:15,560 --> 00:15:19,560 Speaker 3: be matched to other information about us, like our credit 274 00:15:19,600 --> 00:15:23,760 Speaker 3: card or our FlyBuys number, or a rewards number, which 275 00:15:23,800 --> 00:15:26,960 Speaker 3: records every purchase we've ever made, and our preferences, you know, 276 00:15:27,000 --> 00:15:28,880 Speaker 3: whether we happen to like the full-fat milk or 277 00:15:28,880 --> 00:15:32,360 Speaker 3: the trim. All that kind of stuff starts to 278 00:15:32,520 --> 00:15:34,960 Speaker 3: then be associated with stuff that can be picked out 279 00:15:35,000 --> 00:15:39,480 Speaker 3: immediately by software that's automatically identifying you.
Because it's as if you 280 00:15:39,560 --> 00:15:42,960 Speaker 3: all had a number plate stapled to your forehead. So, 281 00:15:44,160 --> 00:15:47,320 Speaker 3: how comfortable are we with that? So, taking the example 282 00:15:47,360 --> 00:15:50,920 Speaker 3: that you just gave with ChatGPT, what if, you 283 00:15:50,920 --> 00:15:54,080 Speaker 3: know, you were browsing the women's lingerie section 284 00:15:54,200 --> 00:15:58,760 Speaker 3: at Target, you know, as you do, and, you know, 285 00:15:58,800 --> 00:16:02,320 Speaker 3: the facial recognition software says, oh, Frank's got an 286 00:16:02,360 --> 00:16:10,120 Speaker 3: interest in women's underwear, and pops onto Instagram, does a 287 00:16:10,160 --> 00:16:13,560 Speaker 3: complete search of Instagram, you know, a reverse image search 288 00:16:13,600 --> 00:16:17,320 Speaker 3: of Instagram with your face, identifies your Instagram account, and 289 00:16:17,360 --> 00:16:20,000 Speaker 3: then starts popping ads for women's underwear into your Instagram. 290 00:16:20,640 --> 00:16:23,600 Speaker 3: So that's the sort of thing that can be used 291 00:16:23,600 --> 00:16:29,040 Speaker 3: commercially and is all doable now with existing technology. And 292 00:16:29,080 --> 00:16:31,680 Speaker 3: who's to say it isn't? I mean, how did ChatGPT 293 00:16:31,720 --> 00:16:34,160 Speaker 3: know so much about you? is a good 294 00:16:34,200 --> 00:16:39,120 Speaker 3: question to ask. But the question is, how much of 295 00:16:39,120 --> 00:16:42,360 Speaker 3: this are we prepared to allow to happen?
It's all 296 00:16:42,360 --> 00:16:45,480 Speaker 3: a bit amusing when it's about your lingerie shopping habits, 297 00:16:45,480 --> 00:16:49,120 Speaker 3: but it's less amusing when you get a fine in 298 00:16:49,160 --> 00:16:52,600 Speaker 3: the post for jaywalking at two a.m. in Mount Isa, 299 00:16:52,800 --> 00:16:56,400 Speaker 3: because, you know, a camera picked you up, identified your face, 300 00:16:57,040 --> 00:16:59,480 Speaker 3: reported you to the authorities and sent you a fine. 301 00:17:00,440 --> 00:17:03,080 Speaker 3: Or when your car, as we've talked about before, you know, 302 00:17:03,360 --> 00:17:06,320 Speaker 3: shoots a letter off to the 303 00:17:06,320 --> 00:17:09,920 Speaker 3: coppers and says, oh, by the way, Harps was doing 304 00:17:09,960 --> 00:17:12,720 Speaker 3: five k over the limit, and here's photographic proof of it. 305 00:17:15,160 --> 00:17:18,240 Speaker 5: You know, even what you were saying with Woolworths, 306 00:17:19,119 --> 00:17:23,520 Speaker 5: I was thinking that, because, well, where I shop anyway, 307 00:17:23,720 --> 00:17:26,040 Speaker 5: I think there's two cameras. So there's one that 308 00:17:27,000 --> 00:17:28,480 Speaker 5: is straight on, in my face. 309 00:17:28,560 --> 00:17:31,600 Speaker 1: But then there's one above me where it can see 310 00:17:31,640 --> 00:17:35,440 Speaker 1: what you're doing. It can see you scanning things. So 311 00:17:35,480 --> 00:17:39,040 Speaker 1: it's above my head, looking at me scanning things, looking 312 00:17:39,040 --> 00:17:41,840 Speaker 1: at me taking things out of the basket, scanning them, 313 00:17:41,920 --> 00:17:44,920 Speaker 1: then putting them into my bag.
So it's this kind 314 00:17:45,000 --> 00:17:51,000 Speaker 1: of overhead view, and I think... I mean, honestly, I 315 00:17:51,000 --> 00:17:53,320 Speaker 1: guess it doesn't bother me, because I've never thought about 316 00:17:53,320 --> 00:17:57,200 Speaker 1: it too much. But no, no, I mean, maybe it's there, 317 00:17:57,240 --> 00:17:59,680 Speaker 1: but I've never seen it anywhere where it says, by 318 00:17:59,720 --> 00:18:01,560 Speaker 1: the way, you're being filmed right now. 319 00:18:02,680 --> 00:18:05,560 Speaker 3: Yeah. And that was the problem that the Privacy Commissioner 320 00:18:05,560 --> 00:18:07,159 Speaker 3: had with what Bunnings were doing. Like I said, they 321 00:18:07,200 --> 00:18:09,119 Speaker 3: didn't have a problem with the fact they were doing it. 322 00:18:09,880 --> 00:18:12,240 Speaker 3: The problem was they didn't tell people they were doing 323 00:18:12,320 --> 00:18:15,640 Speaker 3: it and give people the opportunity not to consent. 324 00:18:16,280 --> 00:18:19,560 Speaker 1: So what would it be then, if they told everyone? 325 00:18:19,760 --> 00:18:23,280 Speaker 1: I mean, what would you do, Tiff, if they went, hey, 326 00:18:23,640 --> 00:18:25,880 Speaker 1: if you shop here, we're going to film you? Would 327 00:18:25,920 --> 00:18:31,040 Speaker 1: you still shop there? I mean, probably, I guess. 328 00:18:31,080 --> 00:18:33,679 Speaker 1: Like, I know I'm being filmed, but then do 329 00:18:33,760 --> 00:18:36,679 Speaker 1: we know what they'd do with that footage, David? Or, 330 00:18:37,080 --> 00:18:37,439 Speaker 1: like, a... 331 00:18:37,480 --> 00:18:40,959 Speaker 3: Big problem, isn't it? No, we don't. We know what 332 00:18:41,040 --> 00:18:43,440 Speaker 3: they say they do with it when the Privacy Commissioner 333 00:18:43,440 --> 00:18:46,480 Speaker 3: hauls them up before them.
But do we really know 334 00:18:46,560 --> 00:18:49,160 Speaker 3: what they'd do with it, and what guarantees we've got? 335 00:18:49,160 --> 00:18:52,119 Speaker 3: And last time, when we talked about this with the cars, 336 00:18:52,200 --> 00:18:56,000 Speaker 3: this data was being sent off to Chinese and 337 00:18:56,040 --> 00:18:58,760 Speaker 3: South Korean companies, and there was no explanation as to 338 00:18:58,800 --> 00:19:00,800 Speaker 3: what was being done with it, other than it was 339 00:19:00,840 --> 00:19:05,639 Speaker 3: being used to train AIs. Well, to do what? You know, 340 00:19:05,720 --> 00:19:08,159 Speaker 3: there's just no real detail about this stuff, and I 341 00:19:08,200 --> 00:19:11,120 Speaker 3: don't think anyone can with confidence say they know what's 342 00:19:11,160 --> 00:19:14,000 Speaker 3: being done. I guess that, coming back to the question 343 00:19:14,080 --> 00:19:17,080 Speaker 3: you asked about whether Tiff would still go into Bunnings: well, 344 00:19:17,160 --> 00:19:19,320 Speaker 3: she might, but I reckon there'd probably be a roaring 345 00:19:19,400 --> 00:19:21,600 Speaker 3: trade in those face masks, things you can buy that 346 00:19:22,080 --> 00:19:28,200 Speaker 3: make it really, really difficult for this detection software to work. 347 00:19:28,240 --> 00:19:30,639 Speaker 3: So there's these ones that sort of blur your features. 348 00:19:30,680 --> 00:19:34,280 Speaker 3: They're still basically see-through, but they blur your features 349 00:19:34,440 --> 00:19:37,560 Speaker 3: and the software struggles with it. They were very, very 350 00:19:37,560 --> 00:19:40,920 Speaker 3: popular in Hong Kong, by the way, during the democracy 351 00:19:41,000 --> 00:19:43,960 Speaker 3: riots, because they knew that this technology was being used there, 352 00:19:45,359 --> 00:19:48,479 Speaker 3: and so they sold a lot of those masks.
You 353 00:19:48,560 --> 00:19:50,480 Speaker 3: might see a lot of that sort of thing happening 354 00:19:50,880 --> 00:19:55,439 Speaker 3: if people are being told what's going on and deciding, actually, 355 00:19:55,480 --> 00:19:58,480 Speaker 3: I'd really rather not be identified. You know, when I 356 00:19:58,560 --> 00:20:00,560 Speaker 3: go in and buy a lawnmower. 357 00:20:00,560 --> 00:20:04,360 Speaker 1: I wonder if in some not too distant dystopian future 358 00:20:05,359 --> 00:20:09,760 Speaker 1: they are selling so they're filming us and then selling 359 00:20:09,800 --> 00:20:14,359 Speaker 1: that too. I mean, unless there was some commercial upside 360 00:20:14,480 --> 00:20:18,160 Speaker 1: for them, there'd be no reason for them to record 361 00:20:18,200 --> 00:20:19,600 Speaker 1: it unless they were going. 362 00:20:19,520 --> 00:20:23,320 Speaker 3: To there's a lot of commercial upside. So just the 363 00:20:23,400 --> 00:20:26,520 Speaker 3: example I gave you before, where you know, if it 364 00:20:26,720 --> 00:20:29,760 Speaker 3: noticed that you're hanging around in the lingerie section, then 365 00:20:29,800 --> 00:20:30,560 Speaker 3: that day not. 366 00:20:30,680 --> 00:20:33,400 Speaker 1: Just stop saying that. Please, could you find another example 367 00:20:33,440 --> 00:20:37,360 Speaker 1: or another example? Could it not be the fresh produce 368 00:20:37,440 --> 00:20:38,320 Speaker 1: section or something? 369 00:20:38,720 --> 00:20:42,040 Speaker 3: To be really really clear, listeners, I have no evidence 370 00:20:43,200 --> 00:20:46,399 Speaker 3: to hang around in the lingerie section. It is just 371 00:20:46,520 --> 00:20:47,400 Speaker 3: a hypothetical. 372 00:20:47,720 --> 00:20:49,679 Speaker 1: But Target do though that.
373 00:20:51,280 --> 00:20:55,159 Speaker 3: But but I guess there's an instant commercial upside to that, 374 00:20:55,640 --> 00:20:58,800 Speaker 3: which is that information is valuable to an advertiser of 375 00:20:58,880 --> 00:21:04,399 Speaker 3: lingerie products, and there's other commercial applications as well, because 376 00:21:04,400 --> 00:21:07,040 Speaker 3: then they've got that information linked to the other things 377 00:21:07,040 --> 00:21:10,919 Speaker 3: that you purchase. So you may never purchase lingerie, but 378 00:21:11,600 --> 00:21:14,240 Speaker 3: it now knows something about you that it didn't know before, 379 00:21:14,520 --> 00:21:16,960 Speaker 3: which it would have only gotten from your purchase history, 380 00:21:17,480 --> 00:21:20,399 Speaker 3: and so it can cater the advertising to you, it 381 00:21:20,440 --> 00:21:23,480 Speaker 3: can target things to you, it can sell that targeting information. 382 00:21:23,760 --> 00:21:27,600 Speaker 3: I'm not suggesting anyone is doing this, it's just possible. 383 00:21:28,359 --> 00:21:31,200 Speaker 3: And whenever something is possible, there's likely to be someone 384 00:21:31,240 --> 00:21:32,200 Speaker 3: pushing the envelope. 385 00:21:32,800 --> 00:21:35,520 Speaker 1: Tiff, you look like you wanted to jump in, well. 386 00:21:35,400 --> 00:21:37,600 Speaker 2: I feel like if this, why is this a thing 387 00:21:37,680 --> 00:21:42,080 Speaker 2: around these sorts of things and not people that are 388 00:21:42,119 --> 00:21:45,600 Speaker 2: doing child trafficking and pornography? And like, if this sort 389 00:21:45,640 --> 00:21:49,320 Speaker 2: of stuff exists, aren't there better uses than trying 390 00:21:49,320 --> 00:21:53,320 Speaker 2: to catch someone fucking scanning a roma tomato or something else.
391 00:21:56,200 --> 00:21:59,840 Speaker 3: It's probably, Yeah, it's probably because law enforcement often lags 392 00:21:59,840 --> 00:22:03,000 Speaker 3: behind in terms of its access to technology. And then 393 00:22:03,040 --> 00:22:05,879 Speaker 3: the other problem is giving these sorts of capabilities to 394 00:22:05,960 --> 00:22:08,960 Speaker 3: law enforcement is a whole new level as well, in 395 00:22:09,000 --> 00:22:12,560 Speaker 3: the sense that it's people are almost okay with it 396 00:22:12,600 --> 00:22:14,119 Speaker 3: if all that's going to happen is they're going to 397 00:22:14,119 --> 00:22:16,639 Speaker 3: sell you a new pair of shoes with it, but 398 00:22:17,040 --> 00:22:19,360 Speaker 3: they're less okay with it if someone who can throw 399 00:22:19,400 --> 00:22:22,000 Speaker 3: you in the slammer has access to this information. 400 00:22:23,240 --> 00:22:25,840 Speaker 1: Oh, there is so much stuff to sort through with this. 401 00:22:25,920 --> 00:22:28,960 Speaker 1: All right, let's do one more before we close the 402 00:22:29,000 --> 00:22:34,080 Speaker 1: Gillespie door. So you wrote an article the other day. Well, 403 00:22:34,080 --> 00:22:35,920 Speaker 1: there was an article that you commented on from the 404 00:22:35,960 --> 00:22:38,800 Speaker 1: New York Times called Is being busy good for people 405 00:22:38,840 --> 00:22:40,280 Speaker 1: with ADHD? 406 00:22:40,880 --> 00:22:41,440 Speaker 3: Yeah? 407 00:22:41,520 --> 00:22:44,560 Speaker 1: What tell us about that? Because I'm I don't know 408 00:22:44,600 --> 00:22:46,359 Speaker 1: the answer. I'm suspecting it might be.
409 00:22:47,080 --> 00:22:51,000 Speaker 3: It is such an interesting study, big study done in the 410 00:22:51,080 --> 00:22:56,359 Speaker 3: United States tracking people over sixteen years to see whether 411 00:22:56,400 --> 00:22:59,920 Speaker 3: their ADHD symptoms, you know, stayed the same, got worse, 412 00:23:00,080 --> 00:23:03,080 Speaker 3: got better, whatever. And what they discovered along the way 413 00:23:03,560 --> 00:23:07,000 Speaker 3: was that it went in and out with adults, and 414 00:23:08,840 --> 00:23:13,679 Speaker 3: that the thing that most accurately predicted the ADHD symptoms 415 00:23:13,760 --> 00:23:16,639 Speaker 3: or the severity of them. And remembering what ADHD is, 416 00:23:16,680 --> 00:23:18,960 Speaker 3: and that the primary symptom is lack of focus, you know, 417 00:23:19,040 --> 00:23:22,560 Speaker 3: lack of an ability to focus, particularly in an adult. 418 00:23:22,560 --> 00:23:25,040 Speaker 3: That can have severe consequences because you can't get anything done. 419 00:23:25,359 --> 00:23:28,560 Speaker 3: You're a bit stuck at work and in general. And 420 00:23:30,440 --> 00:23:34,119 Speaker 3: so they were tracking this and they found it really 421 00:23:34,880 --> 00:23:38,680 Speaker 3: was affected by how busy you were. So the busier 422 00:23:38,760 --> 00:23:41,560 Speaker 3: you were, the more you had on, the more stress 423 00:23:41,600 --> 00:23:44,679 Speaker 3: you were under, which is a little bit paradoxical, the 424 00:23:44,760 --> 00:23:47,600 Speaker 3: more stress you were under, the less likely you were 425 00:23:47,640 --> 00:23:51,680 Speaker 3: to have ADHD symptoms. So and in fact, for many people, 426 00:23:51,720 --> 00:23:53,600 Speaker 3: the more stressed they became, the more likely they were 427 00:23:53,640 --> 00:23:58,080 Speaker 3: to be completely in remission from those symptoms.
So that's 428 00:23:58,119 --> 00:24:00,159 Speaker 3: an interesting finding and it backs up some of the 429 00:24:00,200 --> 00:24:03,119 Speaker 3: things that we've talked about sort of over many episodes, 430 00:24:03,160 --> 00:24:07,840 Speaker 3: which is find a way to generate dopamine if what 431 00:24:07,920 --> 00:24:11,639 Speaker 3: you want to do is address ADHD symptoms. ADHD is 432 00:24:11,680 --> 00:24:14,640 Speaker 3: a lack of dopamine. You do not have enough dopamine 433 00:24:14,720 --> 00:24:17,800 Speaker 3: to keep you focused. And the definition of enough is 434 00:24:17,840 --> 00:24:20,399 Speaker 3: what your brain says is enough, and the level of 435 00:24:20,440 --> 00:24:23,920 Speaker 3: that is dependent on how addicted or stressed you are. 436 00:24:24,080 --> 00:24:27,080 Speaker 3: The higher your addiction or stress levels, the higher it 437 00:24:27,119 --> 00:24:29,679 Speaker 3: says you need. And so even a normal level in 438 00:24:29,760 --> 00:24:32,439 Speaker 3: someone else is not enough to keep you focused. So 439 00:24:33,080 --> 00:24:35,720 Speaker 3: the way ADHD medication works is to give you a 440 00:24:35,720 --> 00:24:39,320 Speaker 3: dopamine spike. They're stimulants, so not that people would be 441 00:24:39,320 --> 00:24:41,560 Speaker 3: prescribing it, but you could give someone cocaine. It will 442 00:24:41,600 --> 00:24:44,280 Speaker 3: give them a dopamine spike, and it would fix their dopamine, 443 00:24:44,400 --> 00:24:49,600 Speaker 3: their ADHD symptoms. So the interesting thing is this 444 00:24:49,640 --> 00:24:53,800 Speaker 3: is entirely predictable, which is make someone busy, make them 445 00:24:53,880 --> 00:24:57,119 Speaker 3: really really have to focus, have to stay on task 446 00:24:57,200 --> 00:24:59,760 Speaker 3: all the time. They're having to generate their own dopamine 447 00:25:00,040 --> 00:25:03,480 Speaker 3: because they're so busy, put them under stress.
Stress generates 448 00:25:03,520 --> 00:25:07,600 Speaker 3: dopamine just as well. So make someone busy, time poor, 449 00:25:07,720 --> 00:25:12,000 Speaker 3: under stress, they are likely to not have ADHD symptoms, 450 00:25:12,040 --> 00:25:15,240 Speaker 3: is what this study is saying. And the reason is 451 00:25:15,280 --> 00:25:17,679 Speaker 3: because of that, because they're generating their own dopamine. And 452 00:25:17,680 --> 00:25:19,480 Speaker 3: this lines up with the stuff we've talked about in 453 00:25:19,480 --> 00:25:22,399 Speaker 3: the past about ice baths, for example, how do you 454 00:25:22,440 --> 00:25:24,920 Speaker 3: address something like this jump in an ice bath because 455 00:25:24,920 --> 00:25:28,960 Speaker 3: an ice bath is a massive dose of stress and 456 00:25:29,040 --> 00:25:32,040 Speaker 3: it gives you a huge dopamine hit. So that's what 457 00:25:32,080 --> 00:25:34,520 Speaker 3: this found. The interesting thing is also backed up with 458 00:25:34,560 --> 00:25:39,040 Speaker 3: some earlier studies on students which found, okay, but what 459 00:25:39,359 --> 00:25:43,040 Speaker 3: they're busy doing matters. So you have to be interested 460 00:25:43,280 --> 00:25:46,119 Speaker 3: in the thing you are doing or that's keeping you busy, 461 00:25:46,760 --> 00:25:48,800 Speaker 3: or you won't get any dopamine from it. So being 462 00:25:48,840 --> 00:25:52,359 Speaker 3: busy doing housework, for example, did not have any effect 463 00:25:52,400 --> 00:25:57,880 Speaker 3: on students' ability to focus, but being busy playing sport did. 464 00:25:58,880 --> 00:26:01,960 Speaker 1: Oh right. It's not just any busy, it's a certain 465 00:26:02,080 --> 00:26:02,720 Speaker 1: kind of busy. 
466 00:26:02,840 --> 00:26:05,159 Speaker 3: It has to be something you want to do, like 467 00:26:05,200 --> 00:26:07,639 Speaker 3: so it's got to be something that you enjoy or 468 00:26:07,680 --> 00:26:09,840 Speaker 3: else you're not going to keep up motivation to stay 469 00:26:09,880 --> 00:26:13,120 Speaker 3: busy with it anyway. But also it's not actually giving 470 00:26:13,160 --> 00:26:14,480 Speaker 3: you a dopamine hit because you don't want to 471 00:26:14,480 --> 00:26:15,320 Speaker 3: do it in the first place. 472 00:26:16,000 --> 00:26:19,760 Speaker 1: That is very interesting. I wonder if I wonder if 473 00:26:19,840 --> 00:26:23,240 Speaker 1: knowing that in the education system, they could do something 474 00:26:23,320 --> 00:26:24,800 Speaker 1: with that insight. 475 00:26:25,480 --> 00:26:28,080 Speaker 3: I think a lot of teachers know this, sort of, 476 00:26:29,200 --> 00:26:33,040 Speaker 3: without any official science behind it. They know if I 477 00:26:33,119 --> 00:26:35,199 Speaker 3: want these boys in my class that are bouncing off 478 00:26:35,240 --> 00:26:38,280 Speaker 3: the walls to focus, I'll let them go outside and 479 00:26:38,280 --> 00:26:40,320 Speaker 3: bounce around for a bit and then I'll get more 480 00:26:40,359 --> 00:26:43,360 Speaker 3: out of them. 481 00:26:43,400 --> 00:26:45,479 Speaker 1: Well, Tiff, it looks like you've got some editing 482 00:26:45,560 --> 00:26:48,119 Speaker 1: to do, because you've got to insert you may have 483 00:26:48,240 --> 00:26:51,040 Speaker 1: to reinsert that lady's audio because I don't know how 484 00:26:51,080 --> 00:26:54,359 Speaker 1: good that was, and you have to it was bloody awful, 485 00:26:55,320 --> 00:26:57,280 Speaker 1: and because yeah, that was a bit dodgy. So you 486 00:26:57,359 --> 00:26:59,320 Speaker 1: might have to figure that out, but you're a whiz.
487 00:26:59,320 --> 00:27:03,359 Speaker 1: And then you've got to insert the MT-07 with 488 00:27:03,520 --> 00:27:08,359 Speaker 1: the Akrapovič system on it, which that means nothing to 489 00:27:08,440 --> 00:27:09,800 Speaker 1: our listeners, but just. 490 00:27:09,760 --> 00:27:12,440 Speaker 2: trust me, it means something to some of them. 491 00:27:12,440 --> 00:27:15,879 Speaker 1: It means something too much to the twelve Bogans that 492 00:27:15,960 --> 00:27:18,320 Speaker 1: listen to the show. Shout out to the Bogans and 493 00:27:18,359 --> 00:27:22,159 Speaker 1: Glespo's like, what's an Akrapovič full system? It just makes it 494 00:27:22,200 --> 00:27:23,800 Speaker 1: sound good, mate, with the baffle out. 495 00:27:24,280 --> 00:27:26,879 Speaker 3: Yeah, I'm going to have to do something I've never 496 00:27:26,920 --> 00:27:28,200 Speaker 3: done before. I'm going to have to go and listen 497 00:27:28,200 --> 00:27:30,040 Speaker 3: to one of these episodes so I can hear it. 498 00:27:30,520 --> 00:27:32,679 Speaker 1: Yeah, you'll hear it. So it'll be out, not tomorrow, 499 00:27:32,760 --> 00:27:35,240 Speaker 1: the next day. Time to say goodbye, I'm afraid, but 500 00:27:35,320 --> 00:27:38,320 Speaker 1: as always, we appreciate you. Thank you, sir. Righto, 501 00:27:38,400 --> 00:27:38,760 Speaker 1: see ya