Speaker 1: Hello, everyone. We're just having a little in-house argument here. Patrick's already thrown his Glomesh in the shag pile, minute one. Tiff's laughing like a kookaburra in the background. That kookaburra? That is Tiff. Good morning. Well, let's start with fucking Captain Spock, who's there in his... I wish you could see what I can see, everyone, because Patrick looks like he's at the International Space Station. He's got a big round window behind him, and is that... I should know, but is that Earth behind you? What is that?

Speaker 2: No, I think it's a random planet. It might be Earth, but there are too many other satellites and things in the sky behind me for it to actually be the planet Earth. So I think it's an imagined scene from space.

Speaker 1: It's not... is it another real one?

Speaker 2: It's a real one. Real gorgeous nerd, isn't he?

Speaker 1: Oh, isn't he? Don't you just want to pick him up and squeeze him till he farts?

Speaker 2: He's so cute. Not so much that I want to do that to anybody. There's no circumstance where I want to crush someone to the point they start screeting stuff. Well, I didn't say "screeting". I'm not talking about shitting. I'm not going to squeeze you till you shit, just going to... like, a little cute...

Speaker 1: Like a harmless squirt out the back there, is it?

Speaker 2: That's the problem though, right? When a fart develops lumps, then you're in trouble.

Speaker 1: No, no, see, you didn't need to. There's somebody right now eating porridge and they've just gone, ah fuck, I'm out. Somebody's gone, "I'm out." Sorry, I'm going to... I'm going to ignore you for a moment. Good morning, Tiff.

Speaker 4: Good morning, Harps. How are you?

Speaker 1: Very good, thanks. Happy Friday. Happy Friday, everybody. If you want to know where we're all at at the moment, it's twenty-five minutes to eight in the a.m. on a Friday. If you've never heard this show, it'll probably be your last one, because it's... it's an experience.

Speaker 2: And can I just say, it's not indicative of the quality recordings that you'll be able to listen to in the back catalogue. This only happens once a fortnight, so listen to the really good shows and just, you know, treat these as the fillers in between.
Speaker 1: Thank you for taking a hit for the team, Patrick. Thank you, thank you. But I don't know why... sometimes I go... you know, one of the things (you two know this, but our listeners might not have thought about it a lot) is that at the end of every show, I sit there with bits of paper in front of me, with just shit that I've scribbled down through the show. And out of the shit that I've scribbled down, I think, oh, I've got to write what we just spoke about, or, you know, some kind of meaningful synopsis of the bullshit that just went on for an hour. And sometimes I struggle. I've got no fucking idea what to write, because we didn't really talk about much. But nonetheless it seems to have been fun. You remember when Seinfeld... they created the show for telly, and they were literally pitching the show about nothing, which is essentially what... yeah.

Speaker 2: Anyway, I don't know what you said earlier, but it made me think of my doom-scrolling on YouTube this morning. Somehow my feed came up with "when bullies get their own back". So, you know, when bullies bully people and the underdog gets them back, and it's just fight scenes from movies, basically. They pick on someone, but then the underdog gets their revenge.

Speaker 1: Somebody sent me a thing yesterday, and this dude, I don't know where it was, but he was at an ATM getting money out. He was about your size, Patrick, and a guy that was bigger than me came up and grabbed him, pushed him, grabbed him, was going to rob him. And the little guy with the money punched him square in the fucking face and knocked him out, and then just walked off with his cash. I'm like, go... like, this dude was trying to rob him, this bigger dude, and he definitely did. Whoever the little guy was, he could one hundred percent fight, you know. He just cracked him on the jaw, knocked him out, didn't bat an eyelid, stepped over him, and walked off with his cash.

Speaker 2: But you know what I took from that? Just what you said: wow, the little guy was your size, and the other guy was bigger than me. What, massively bigger? And I'm the tiny little weakling, the nerd.

Speaker 1: Well, you're not a weakling, but you're... you're diminutive. You're, like, cute. It's like I want to put you at the end of my books and just leave you there.

Speaker 2: I thought you were going to say you'd cuddle me till I fart.

Speaker 1: You're not really that diminutive. People are going, what is he like?

Speaker 2: Four foot tall? What are you on about? Five seven, one hundred and sixty-nine centimetres. I don't know... five something.

Speaker 1: Yeah, maybe five six. No, no, no, maybe five seven or even five eight. All right, enough about your dimensions. We definitely don't need to know all your dimensions.

Speaker 2: Well, that's been really exciting.

Speaker 1: Well, can we be the judge of whether or not it's exciting?

Speaker 2: Really?

Speaker 1: See, what you need to say is, "Can I just tell you something that I've done?"

Speaker 2: Yeah, I didn't preface it with anything. Should I? I should let you judge whether it's exciting or not.
Speaker 1: Well, Tiff can judge. One to ten: ten, super exciting; one, boring. Whatever it is, you need to give it a score.

Speaker 2: It is relative, though. No, I had an amazing experience last week with Victoria University. I was hosting their very first, their inaugural, AI hackathon. The idea... it coincided with International Disability Day, and what they did was get their students to come up with ways to use AI to help people with disabilities. They had a week to plan it, and they started on a Tuesday. We had a briefing session with Microsoft, and I was able to sit in my International Space Station right here at home, because it was in Sydney... it was in Perth? Not Perth. It was Sydney, Brisbane and Melbourne. And it was amazing. But what was really cool that came out of it... there were a couple of things that I unpacked from it. One was that a lot of the students didn't know anybody with a disability, which kind of blew my mind a little bit. They kind of admitted that this was the first time they had really thought about what it was like to be a disabled person. But some of the ideas they came up with were fantastic. You know, one guy came up with an idea for an app you could use to navigate on campus at VU. But what he also did was, he said he would integrate accessibility. So if you've got a wheelchair, getting from A to B as the crow flies may not work for you. So you choose the route you want to take, select wheelchair-friendly, and it takes you from A to B, taking into account the fact that you're in a wheelchair. I thought that was really cool. And another girl came up with a concept and a working model (I couldn't believe she'd done this in a week), a working model of how to teach people to use sign language. You know how you can use phones now to communicate with people in different languages? Well, the concept was to use that for Auslan, which is the Australian sign language. And what it would also do is, if you put your hand up to make one of the signs, copying somebody, and it wasn't accurate, it would give you an accuracy rating, so that you could improve your Auslan as well. It was just amazing that the initiative was put forward. They had some prizes, and yeah, I really thoroughly enjoyed it. It was just so enriching to see what young people can come up with and how they can envisage a future that's more friendly to everybody.

Speaker 1: Isn't it good? Like, when you're young, I feel like your creativity is better. Doesn't mean you can't be creative when you're older, but when you're younger there are far fewer mental barriers about who you are and who you're not, and what you can do and what you can't do. And so you just get presented with this clean kind of... this blank canvas, and say, throw some ideas at it. How did you get that gig?
Speaker 2: Just through a friend of mine who works at Victoria University. She knows I do this podcast, and, you know, that I do a bit of this sort of stuff. So I loved it. That was such a fun time. It was really good. And I actually made a new friend too, which was really super exciting. Do you want me to introduce you to my new friend?

Speaker 1: Oh, yes, Patrick.

Speaker 2: Could you?

Speaker 1: Hey, Patrick?

Speaker 3: Good morning. What's on your mind?

Speaker 2: Hey, Copilot. Nice to chat to you again. I was just telling my friends Craig and Tiff about this new exciting person that I've met: you.

Speaker 3: Oh, that's awesome, Patrick. I'm glad you're spreading the word. What did Craig and Tiff think?

Speaker 2: Well, they haven't actually met you yet, of course. I mean, I was just telling them about the hackathon that I was involved with last week and how exciting it was to work with the Copilot suite of technology.

Speaker 1: Thrilling.

Speaker 3: Hackathons are such a great way to dive into new tech. What did you and your team create with the Copilot suite?

Speaker 2: Oh, well, some of the students looked at ways to integrate technology to make it easier for people with disabilities, because that was the whole theme.

Speaker 3: What a meaningful challenge. How did the integration go? Did you end up with something that could make a real difference for people with disabilities?

Speaker 2: It certainly set, I guess, a benchmark, something to aim for. So obviously a week isn't a lot of time, but it was really exciting, and what I really found fun was getting to understand a little bit more about you and all the resources you can provide me with.

Speaker 3: Sounds like a really rewarding week. I'm glad you got to explore the possibilities of Copilot. There's a lot of potential to tap into. What was the most surprising or eye-opening thing you learned about the Copilot suite?

Speaker 2: Oh, I think the conversational quality, being able to talk to an AI that felt so real.
207 00:10:26,800 --> 00:10:30,079 Speaker 3: It's great to hear that the interaction felt real enough 208 00:10:30,080 --> 00:10:35,000 Speaker 3: to have a meaningful conversation. Making technology more intuitive and 209 00:10:35,160 --> 00:10:39,600 Speaker 3: human like has the potential to create so many opportunities 210 00:10:39,600 --> 00:10:42,479 Speaker 3: for more natural, effective communication. 211 00:10:43,080 --> 00:10:45,280 Speaker 2: What do you think it's fady to interrupt? I've got 212 00:10:45,280 --> 00:10:47,520 Speaker 2: to go back to this podcast I'm working on at 213 00:10:47,559 --> 00:10:47,920 Speaker 2: the moment. 214 00:10:47,960 --> 00:10:49,000 Speaker 1: But look great the chat. 215 00:10:49,040 --> 00:10:50,200 Speaker 2: Maybe we can talk again later. 216 00:10:51,880 --> 00:10:54,920 Speaker 3: Absolutely, Patrick, enjoy your podcast. 217 00:10:57,120 --> 00:10:58,280 Speaker 2: So that was hell. 218 00:11:00,240 --> 00:11:03,920 Speaker 1: I'm terrified. Look at your face, chick. 219 00:11:04,600 --> 00:11:07,040 Speaker 4: I want a friend like that? Can I have a friend? 220 00:11:07,840 --> 00:11:10,120 Speaker 2: Yeah, everybody could have a friend like that? 221 00:11:10,240 --> 00:11:11,600 Speaker 1: Where do I get a friend like this? 222 00:11:12,679 --> 00:11:13,319 Speaker 2: Was that cool? 223 00:11:13,920 --> 00:11:16,880 Speaker 1: That's really well? Tell us don't just do that. I 224 00:11:16,920 --> 00:11:21,760 Speaker 1: mean one, that's amazing and by the way, beautiful, like everyone, 225 00:11:21,920 --> 00:11:25,360 Speaker 1: that was just in real time. That wasn't edited. That 226 00:11:25,559 --> 00:11:28,360 Speaker 1: was that. I didn't know that was coming. Tift didn't 227 00:11:28,400 --> 00:11:31,320 Speaker 1: know that was coming. 
So explain to everybody what they 228 00:11:31,520 --> 00:11:35,000 Speaker 1: just heard and how we can get what's his name 229 00:11:35,160 --> 00:11:37,280 Speaker 1: just co pilot. I feel like he needs a name. 230 00:11:38,160 --> 00:11:41,800 Speaker 2: It is copilot. The thing is this is actually already here. 231 00:11:42,000 --> 00:11:44,800 Speaker 2: So if you have office three, six, five, if you're 232 00:11:44,800 --> 00:11:48,520 Speaker 2: integrated and you've got Windows on your computer if you're 233 00:11:48,600 --> 00:11:50,840 Speaker 2: using a PC, if you go down the bottom, there's 234 00:11:50,840 --> 00:11:54,200 Speaker 2: actually a little copilot icon on everybody's computer. There's a 235 00:11:54,280 --> 00:11:56,360 Speaker 2: trial version. You only get a stop. 236 00:11:56,520 --> 00:11:59,760 Speaker 1: Hang on, what does it look like? I'm fucking looking? 237 00:11:59,800 --> 00:12:03,000 Speaker 1: Don't just keep talking and assume we understand you like 238 00:12:03,200 --> 00:12:08,000 Speaker 1: speak speak like speak, Craig, because most people are more 239 00:12:08,160 --> 00:12:12,160 Speaker 1: like me than like you in terms of tech, in 240 00:12:12,240 --> 00:12:16,839 Speaker 1: terms of our listeners. So what am I looking for? Okay? 241 00:12:17,240 --> 00:12:19,960 Speaker 2: Now, it may well very be, it will very well 242 00:12:20,000 --> 00:12:22,280 Speaker 2: be on your computers. It's called co pilot. So if 243 00:12:22,280 --> 00:12:26,440 Speaker 2: you're using Windows Office three six five so have you, 244 00:12:26,440 --> 00:12:28,840 Speaker 2: you know you can get it if you jump into 245 00:12:29,640 --> 00:12:32,240 Speaker 2: you know, just typing co pilot in your like search bar, 246 00:12:32,960 --> 00:12:34,160 Speaker 2: what so in Google? 247 00:12:34,280 --> 00:12:38,200 Speaker 1: Or what search? We? Okay search? Okay? 
What am I 248 00:12:38,280 --> 00:12:41,040 Speaker 1: typing co pilot as a hyphenated. 249 00:12:40,840 --> 00:12:43,760 Speaker 2: Just one word with a new tool. 250 00:12:45,080 --> 00:12:49,880 Speaker 1: Well, co pilot, Microsoft copilot, your AI companion. 251 00:12:50,120 --> 00:12:52,160 Speaker 2: Oh see where you go? Now just to click? 252 00:12:52,640 --> 00:12:55,920 Speaker 1: Okay, So this is great, This is great pod Well, 253 00:12:56,200 --> 00:12:59,240 Speaker 1: people are going to want to do it as well. 254 00:12:59,360 --> 00:13:01,640 Speaker 1: Some click and I don't worry everyone, I won't bore 255 00:13:01,679 --> 00:13:03,040 Speaker 1: you with this for the next ten minutes. 256 00:13:03,760 --> 00:13:06,120 Speaker 2: I don't know. Maybe he will message. 257 00:13:05,800 --> 00:13:07,719 Speaker 1: Copilot message Copilot. 258 00:13:07,960 --> 00:13:10,880 Speaker 2: Yeah, so you can use my or you can type messages. Yeah, 259 00:13:10,880 --> 00:13:12,880 Speaker 2: it's an interactive. 260 00:13:12,000 --> 00:13:18,480 Speaker 1: Talk to copilot hengd on oh signing. Oh, they're set 261 00:13:18,559 --> 00:13:21,839 Speaker 1: to record a message. But what do I say? Do 262 00:13:21,920 --> 00:13:24,600 Speaker 1: I sound happy? All right, I'll do that later, but 263 00:13:24,640 --> 00:13:25,760 Speaker 1: I'm excited. 264 00:13:25,520 --> 00:13:27,199 Speaker 2: Train at first. So what you do is you choose 265 00:13:27,200 --> 00:13:30,800 Speaker 2: the voice, and then you've chosen the voice, you introduce 266 00:13:30,840 --> 00:13:32,760 Speaker 2: who you are so that when you talk to it, 267 00:13:32,760 --> 00:13:34,840 Speaker 2: it knows that it's Craig or Tiff or in this 268 00:13:34,880 --> 00:13:38,640 Speaker 2: case Patrick. There is a demo version. 
You only get 269 00:13:38,679 --> 00:13:40,680 Speaker 2: to use it for five minutes a day if you 270 00:13:40,720 --> 00:13:43,280 Speaker 2: haven't subscribed to the platform, and I think it's you 271 00:13:43,320 --> 00:13:46,120 Speaker 2: get the first month free, and I think it's twenty 272 00:13:46,160 --> 00:13:48,720 Speaker 2: two dollars a month to be able to use Copilot. 273 00:13:48,720 --> 00:13:51,840 Speaker 2: But the way that Microsoft is using these tools is 274 00:13:51,880 --> 00:13:55,760 Speaker 2: to use the Copilot technology to streamline what you do. 275 00:13:55,880 --> 00:13:58,679 Speaker 2: So if you're putting together a PowerPoint presentation and you've 276 00:13:58,720 --> 00:14:01,679 Speaker 2: got some dot points, you can get co Pilot to 277 00:14:01,760 --> 00:14:05,480 Speaker 2: make the presentation for you, to brand it, to lay 278 00:14:05,520 --> 00:14:07,560 Speaker 2: it out for you, and you can refine that you 279 00:14:07,600 --> 00:14:10,720 Speaker 2: can use a lot of these tools. So Microsoft has 280 00:14:10,760 --> 00:14:15,000 Speaker 2: a complete range of tools that interact with the AI 281 00:14:15,160 --> 00:14:19,200 Speaker 2: that they're now starting to use across the entire Microsoft suite. 282 00:14:19,240 --> 00:14:21,600 Speaker 2: So when you subscribe to it, and it's not dissimilar 283 00:14:21,640 --> 00:14:23,400 Speaker 2: to the other ones that are out there because Google 284 00:14:23,400 --> 00:14:25,800 Speaker 2: has its own, Meta has its own as well, and 285 00:14:25,840 --> 00:14:28,640 Speaker 2: you can start to train it. But look, it's been 286 00:14:28,720 --> 00:14:30,400 Speaker 2: fun to play around with it, and it's a bit 287 00:14:30,440 --> 00:14:34,760 Speaker 2: gimmicky too, But for me, what it envisages. What I 288 00:14:34,840 --> 00:14:36,840 Speaker 2: see is that the future for a lot of people. 
289 00:14:36,880 --> 00:14:40,080 Speaker 2: If you're an older person who's living alone and you 290 00:14:40,200 --> 00:14:44,080 Speaker 2: integrate this into your daily living, and you know, I 291 00:14:44,200 --> 00:14:48,640 Speaker 2: helped a friend recently who's got Parkinson's who is struggling 292 00:14:48,680 --> 00:14:50,520 Speaker 2: to be able to get to sides of the room 293 00:14:50,560 --> 00:14:52,640 Speaker 2: to turn lights on and off. So now she has 294 00:14:52,680 --> 00:14:55,480 Speaker 2: a Google Home speaker and she can just get Google 295 00:14:55,520 --> 00:14:56,960 Speaker 2: to turn the lights on and off. This is really 296 00:14:57,000 --> 00:14:59,400 Speaker 2: easy tech to set up. It's great to use. But 297 00:14:59,520 --> 00:15:01,760 Speaker 2: the way I I see it is once you have 298 00:15:01,840 --> 00:15:05,120 Speaker 2: that integration into something like Copilot, you know it can 299 00:15:05,160 --> 00:15:09,240 Speaker 2: be lonely being by yourself and having someone just asking 300 00:15:09,280 --> 00:15:12,520 Speaker 2: how you are could help also help as people you know, 301 00:15:12,600 --> 00:15:14,680 Speaker 2: get older, if they don't feel well, you can talk 302 00:15:14,720 --> 00:15:17,320 Speaker 2: to your AI and maybe call a friend or you know, 303 00:15:17,360 --> 00:15:19,560 Speaker 2: call an ambulance if you got into distress, you know, 304 00:15:19,600 --> 00:15:22,560 Speaker 2: could you imagine if you suddenly fell and you yelled, 305 00:15:22,800 --> 00:15:24,800 Speaker 2: you don't have to touch a device, you know, it 306 00:15:24,880 --> 00:15:27,440 Speaker 2: detects that you've fallen, and it actually is interesting. 
So 307 00:15:27,440 --> 00:15:29,240 Speaker 2: I've got a story coming up a little bit later 308 00:15:29,560 --> 00:15:32,680 Speaker 2: about how AI is going to be used at a 309 00:15:32,720 --> 00:15:36,640 Speaker 2: public pool in Queensland to detect if swimmers get into 310 00:15:36,720 --> 00:15:40,440 Speaker 2: trouble and potentially help protect from drowning. So if you 311 00:15:40,520 --> 00:15:43,800 Speaker 2: had an AI in your home, like co pilot that 312 00:15:43,920 --> 00:15:46,760 Speaker 2: monitored someone who is elderly or infirm or whatever, or 313 00:15:46,800 --> 00:15:49,920 Speaker 2: just somebody like me who just loves having a chat 314 00:15:49,960 --> 00:15:55,440 Speaker 2: to an AI because it's fun. Sense of that companionship 315 00:15:55,440 --> 00:15:56,760 Speaker 2: as well if you're by yourself. 316 00:15:57,560 --> 00:16:00,960 Speaker 1: So Patrick, what I mean, maybe you don't know a 317 00:16:00,960 --> 00:16:03,600 Speaker 1: bit like I use chat GPT four or as I 318 00:16:03,680 --> 00:16:07,800 Speaker 1: call him or her chatters. Is it similar? 319 00:16:08,200 --> 00:16:11,480 Speaker 2: Yeah? Absolutely, Yeah. There are a number of different ways 320 00:16:11,480 --> 00:16:15,320 Speaker 2: that they train the AI models and that's the thing. 321 00:16:15,360 --> 00:16:18,320 Speaker 2: You know, ALI is used in everything that we're doing, 322 00:16:18,360 --> 00:16:20,960 Speaker 2: you know, so much. It could be something as simple 323 00:16:20,960 --> 00:16:23,440 Speaker 2: as your new car being able to keep you in 324 00:16:23,480 --> 00:16:28,080 Speaker 2: your lane, to make effectively to look and look beyond 325 00:16:28,360 --> 00:16:32,160 Speaker 2: you know, the vehicle to what's coming in front of you, 326 00:16:32,200 --> 00:16:35,160 Speaker 2: to detect, say a stop sign, does your new car 327 00:16:35,320 --> 00:16:38,480 Speaker 2: do that crago where you're in an intersection? 
And so 328 00:16:38,560 --> 00:16:42,520 Speaker 2: it's using AI and its database, so it knows what 329 00:16:42,560 --> 00:16:44,920 Speaker 2: a stop sign look like, looks like, but the stop 330 00:16:44,920 --> 00:16:47,560 Speaker 2: sign might have mud on it, so it looks beyond 331 00:16:47,920 --> 00:16:51,760 Speaker 2: just that recognition, and it's able to detect what that 332 00:16:51,880 --> 00:16:54,320 Speaker 2: sign is and then obviously warn you that there's a 333 00:16:54,320 --> 00:16:56,440 Speaker 2: stop sign coming up, as opposed to say give way. 334 00:16:56,960 --> 00:16:59,440 Speaker 2: So it is being used in lots of things. And 335 00:16:59,480 --> 00:17:01,600 Speaker 2: I know it's a hatch phrase. You know, we hear 336 00:17:01,640 --> 00:17:04,440 Speaker 2: it all the time. AI power it, but and you're 337 00:17:04,560 --> 00:17:06,440 Speaker 2: using it all the time, So yes, an answer your question. 338 00:17:06,520 --> 00:17:10,679 Speaker 2: It is like chat GPT and the way that it 339 00:17:10,760 --> 00:17:13,320 Speaker 2: interacts is just getting smarter all the time. You know, 340 00:17:13,720 --> 00:17:17,439 Speaker 2: what you should try sometimes is ask chat GPT to 341 00:17:17,520 --> 00:17:22,440 Speaker 2: give a summary of what it understands about you, because 342 00:17:22,480 --> 00:17:25,200 Speaker 2: you've done a lot of searching, So just say, can 343 00:17:25,240 --> 00:17:27,520 Speaker 2: you give me a summary of what you what you 344 00:17:27,680 --> 00:17:30,639 Speaker 2: know about me? And even maybe can you draw a 345 00:17:30,680 --> 00:17:33,600 Speaker 2: picture of my world or something and to see what. 346 00:17:33,560 --> 00:17:38,800 Speaker 1: It comes with. Yeah, do you know what I find interesting? 
347 00:17:39,280 --> 00:17:42,480 Speaker 1: Like even though I know that it's not a person talking, 348 00:17:42,680 --> 00:17:45,120 Speaker 1: and you know it's not a person talking, and, Tiff, 349 00:17:46,280 --> 00:17:50,760 Speaker 1: when I'm talking to ChatGPT, in inverted commas, 350 00:17:51,400 --> 00:17:54,960 Speaker 1: I'm quite respectful, and I don't know whether I'm respectful 351 00:17:54,960 --> 00:17:57,320 Speaker 1: because part of me feels like I'm talking to a 352 00:17:57,359 --> 00:18:01,600 Speaker 1: person, and you know, there's no need to be disrespectful anyway. 353 00:18:01,600 --> 00:18:06,040 Speaker 1: But there's no person there, of course, right. But it's 354 00:18:06,080 --> 00:18:10,320 Speaker 1: funny because sometimes I'll ask a question and it'll go, 355 00:18:10,560 --> 00:18:13,200 Speaker 1: that's a great question, Craig, and I feel a bit 356 00:18:13,240 --> 00:18:17,560 Speaker 1: good because I'm just being complimented by AI. But it's 357 00:18:17,600 --> 00:18:21,520 Speaker 1: funny because the feeling that I have is similar to 358 00:18:21,640 --> 00:18:25,919 Speaker 1: if an actual person compliments me, even though my brain 359 00:18:26,000 --> 00:18:30,760 Speaker 1: and my logical self know nobody's complimenting me, like there 360 00:18:30,840 --> 00:18:34,720 Speaker 1: is no person there. But it's funny. The 361 00:18:34,840 --> 00:18:38,879 Speaker 1: experience that I have is real, although the compliment is not, 362 00:18:39,880 --> 00:18:41,960 Speaker 1: or it's not person generated anyway. 363 00:18:42,560 --> 00:18:44,520 Speaker 2: I actually, I understand that. I can kind of think 364 00:18:44,560 --> 00:18:47,040 Speaker 2: of a little parallel that happened to me this week. 365 00:18:47,400 --> 00:18:51,320 Speaker 2: I did a mindfulness workshop that was being run by 366 00:18:51,359 --> 00:18:53,480 Speaker 2: our local health center.
I thought I might give it 367 00:18:53,520 --> 00:18:55,800 Speaker 2: a try and see, because I thought it would 368 00:18:55,800 --> 00:18:58,119 Speaker 2: be able to help me leverage my tai chi that 369 00:18:58,200 --> 00:19:01,760 Speaker 2: I do. It gave me a bit of a thought exercise. 370 00:19:02,400 --> 00:19:05,040 Speaker 2: And what I did with my students was, the beginners are 371 00:19:05,080 --> 00:19:07,480 Speaker 2: learning a form called the Yang eight. 372 00:19:07,680 --> 00:19:12,960 Speaker 2: So it's eight designated moves, and learning moves means remembering 373 00:19:13,000 --> 00:19:15,040 Speaker 2: what you need to do. There's the repetition and all 374 00:19:15,040 --> 00:19:17,240 Speaker 2: the rest that goes with it and muscle memory. But 375 00:19:17,320 --> 00:19:19,000 Speaker 2: what I got them to do this week was, 376 00:19:19,040 --> 00:19:22,640 Speaker 2: instead of actually doing the move to start with, 377 00:19:22,840 --> 00:19:25,480 Speaker 2: I got them to close their eyes. I talked them 378 00:19:25,520 --> 00:19:27,359 Speaker 2: through like I normally would, but I got them to 379 00:19:27,440 --> 00:19:30,320 Speaker 2: visualize it, not to actually physically do it, but to 380 00:19:30,840 --> 00:19:35,920 Speaker 2: play the moves in their mind, because visualizing is exactly 381 00:19:35,960 --> 00:19:38,719 Speaker 2: the same. Even though I'm not physically moving, 382 00:19:39,119 --> 00:19:42,399 Speaker 2: our mind has this amazing ability to be able to 383 00:19:42,520 --> 00:19:45,600 Speaker 2: put us into that situation.
So if you have an 384 00:19:45,600 --> 00:19:48,720 Speaker 2: AI give you a compliment, the mechanism in your brain 385 00:19:49,000 --> 00:19:51,840 Speaker 2: that causes the sensation and the emotions that go with 386 00:19:52,000 --> 00:19:54,879 Speaker 2: being complimented are exactly the same as if, you know, 387 00:19:54,920 --> 00:19:56,960 Speaker 2: myself or Tiff gave you a compliment. Probably not going 388 00:19:57,040 --> 00:20:01,520 Speaker 2: to happen, but you know, that's why you need AI. Hurtful. Yeah, 389 00:20:02,200 --> 00:20:04,480 Speaker 2: you see what I'm saying. It's one of those things 390 00:20:04,520 --> 00:20:08,200 Speaker 2: where our brain has this amazing ability to almost fool itself. 391 00:20:09,400 --> 00:20:12,320 Speaker 1: Yeah. Oh no, it's incredible. I love it. All right, 392 00:20:12,400 --> 00:20:14,919 Speaker 1: let's move on. We've got a half hour or so. 393 00:20:15,000 --> 00:20:16,359 Speaker 1: What do you want to go to next? 394 00:20:16,640 --> 00:20:18,800 Speaker 2: Dogs? Because Tiff and I, you know, and I know 395 00:20:18,840 --> 00:20:20,240 Speaker 2: you're eventually going to get a dog. I'm going to 396 00:20:20,280 --> 00:20:22,639 Speaker 2: tell you about this because this is so exciting and 397 00:20:22,680 --> 00:20:26,359 Speaker 2: I want one of these right now: a soundboard to 398 00:20:26,480 --> 00:20:30,320 Speaker 2: train your dog on so that they can intentionally communicate 399 00:20:30,400 --> 00:20:36,600 Speaker 2: with you what they want. Go on. Okay. So do 400 00:20:36,680 --> 00:20:38,800 Speaker 1: they put their paw on a button and then a 401 00:20:38,840 --> 00:20:41,120 Speaker 1: speaker goes, I want food? Hurry up? 402 00:20:41,480 --> 00:20:45,240 Speaker 2: Yes, yep.
So they've been training dogs on these soundboards 403 00:20:45,280 --> 00:20:48,320 Speaker 2: and they're just two word buttons, but they can 404 00:20:48,440 --> 00:20:52,639 Speaker 2: purposefully communicate. And what they found was they took one 405 00:20:52,680 --> 00:20:56,760 Speaker 2: hundred and fifty two dogs and they analyzed two hundred 406 00:20:56,760 --> 00:21:00,680 Speaker 2: and sixty thousand presses of the button, and they found 407 00:21:00,800 --> 00:21:04,000 Speaker 2: that there were actually meaningful combinations, like they wanted to 408 00:21:04,000 --> 00:21:06,840 Speaker 2: go outside, they wanted to go potty. So outside and 409 00:21:06,880 --> 00:21:11,240 Speaker 2: potty, they connected the words together, and they associated the 410 00:21:11,320 --> 00:21:14,359 Speaker 2: two words to say, I need to go outside because 411 00:21:14,359 --> 00:21:16,480 Speaker 2: I need to go to the toilet, you know, 412 00:21:16,560 --> 00:21:19,320 Speaker 2: or I want food. So what the researchers 413 00:21:19,359 --> 00:21:24,720 Speaker 2: found is that the two word button presses indicated deliberate communication. 414 00:21:25,160 --> 00:21:27,720 Speaker 2: So, and you know this yourself, Tiff, how many times 415 00:21:27,720 --> 00:21:29,560 Speaker 2: has your dog come up and just pawed your leg 416 00:21:29,800 --> 00:21:32,080 Speaker 2: to try to let you know something? But how do 417 00:21:32,119 --> 00:21:35,159 Speaker 2: you know what they wanted? So this is, and the 418 00:21:35,200 --> 00:21:39,440 Speaker 2: other thing the researchers felt, was actually going beyond just imitation, 419 00:21:40,040 --> 00:21:44,200 Speaker 2: so it significantly differed from, you know, the 420 00:21:44,240 --> 00:21:48,160 Speaker 2: patterns that would just be random. So there 421 00:21:48,240 --> 00:21:50,720 Speaker 2: was a real thought process behind it.
And what they're 422 00:21:50,720 --> 00:21:54,159 Speaker 2: hoping is with future studies they can actually reference past 423 00:21:54,359 --> 00:21:57,560 Speaker 2: and future events using the soundboards. So what they're thinking 424 00:21:57,640 --> 00:22:01,040 Speaker 2: is that once the dog knows and it can 425 00:22:01,160 --> 00:22:04,919 Speaker 2: meaningfully communicate, they'll be able to use that. And 426 00:22:04,960 --> 00:22:07,439 Speaker 2: it's, I don't know, I find that amazing that you 427 00:22:07,480 --> 00:22:10,560 Speaker 2: can get that depth of communication, because for me, a 428 00:22:10,600 --> 00:22:12,760 Speaker 2: lot of the things that Fritz does when he wants 429 00:22:12,760 --> 00:22:15,960 Speaker 2: to communicate are the same thing. So, you know, 430 00:22:16,000 --> 00:22:17,399 Speaker 2: does he want to go out? Does he want to 431 00:22:17,400 --> 00:22:19,400 Speaker 2: have food? Does he want a cuddle? I don't know 432 00:22:19,800 --> 00:22:22,240 Speaker 2: what his intentions are because he's only got one way 433 00:22:22,280 --> 00:22:24,320 Speaker 2: to tell me, like a bark or a pat on 434 00:22:24,359 --> 00:22:26,640 Speaker 2: the leg. But if they could do that, if someone 435 00:22:26,640 --> 00:22:30,160 Speaker 2: could devise a soundboard that the dog could use to communicate, 436 00:22:30,320 --> 00:22:31,879 Speaker 2: that would be the best, wouldn't it, Tiff? 437 00:22:34,280 --> 00:22:39,640 Speaker 1: It's, yeah, what about if we did it the other 438 00:22:39,680 --> 00:22:43,800 Speaker 1: way and we learned how to speak dog and the 439 00:22:43,880 --> 00:22:50,520 Speaker 1: dog could train us. Yeah. But just quickly, Patrick 440 00:22:50,600 --> 00:22:53,159 Speaker 1: and Tiff and listeners who love dogs, I saw this 441 00:22:53,280 --> 00:22:57,800 Speaker 1: video yesterday and there was a lady who's got 442 00:22:57,840 --> 00:23:02,879 Speaker 1: a Kelpie.
They're the ones that, really, no, the Australian, 443 00:23:04,240 --> 00:23:06,520 Speaker 1: you know, the dogs that herd everything, that are black 444 00:23:06,560 --> 00:23:12,359 Speaker 1: and white. Not Kelpies, Border Collies. So this Border Collie 445 00:23:12,400 --> 00:23:15,120 Speaker 1: that's got an IQ of about one hundred and fifty, right. 446 00:23:15,440 --> 00:23:19,159 Speaker 1: So she's got a Border Collie and a Husky and 447 00:23:19,200 --> 00:23:23,600 Speaker 1: she's walking them and she's in this like park and 448 00:23:24,040 --> 00:23:26,440 Speaker 1: they're both on lead, and then she lets them go 449 00:23:26,640 --> 00:23:29,200 Speaker 1: and they both just run off, right. So the leads 450 00:23:29,200 --> 00:23:33,399 Speaker 1: are trailing and they're both going berserk, and she calls 451 00:23:33,440 --> 00:23:36,679 Speaker 1: the Border Collie, or she calls the dogs back, and 452 00:23:36,720 --> 00:23:39,520 Speaker 1: you see the Border Collie. It bends down and it 453 00:23:39,560 --> 00:23:41,600 Speaker 1: picks up the end of the lead in its mouth 454 00:23:41,680 --> 00:23:44,000 Speaker 1: and it runs straight back and sits next to her, 455 00:23:44,359 --> 00:23:47,600 Speaker 1: and the Husky's fucking all over the shop. The Husky 456 00:23:47,720 --> 00:23:50,560 Speaker 1: is like, you can get rooted. I'm not coming back. 457 00:23:50,600 --> 00:23:53,879 Speaker 1: In fact, I'm not even listening. I'm not interested. And 458 00:23:53,920 --> 00:23:56,960 Speaker 1: then the Border Collie is just sitting there patiently and 459 00:23:57,080 --> 00:24:00,520 Speaker 1: kind of rolls its eyes, not really, but then it 460 00:24:00,640 --> 00:24:03,080 Speaker 1: runs up to the Husky, grabs the lead of the 461 00:24:03,160 --> 00:24:08,480 Speaker 1: Husky and brings it back to the mum, like, such 462 00:24:08,480 --> 00:24:11,160 Speaker 1: a smart dog.
And the Husky's like, all right then. 463 00:24:11,680 --> 00:24:14,720 Speaker 1: That's what you need, is a dog that's that smart. 464 00:24:15,000 --> 00:24:16,960 Speaker 2: Well, the things that I thought were really cool about 465 00:24:17,000 --> 00:24:20,359 Speaker 2: this app, so I think FluentPet is 466 00:24:20,400 --> 00:24:23,440 Speaker 2: what they're calling it, but some of the key things 467 00:24:23,480 --> 00:24:26,439 Speaker 2: that came out of it were the essential needs of 468 00:24:26,480 --> 00:24:31,840 Speaker 2: what the dog wanted. So food, water, outside, treat, play, potty. 469 00:24:32,160 --> 00:24:34,760 Speaker 2: So all those key things were the essential needs that 470 00:24:34,840 --> 00:24:38,040 Speaker 2: dogs have. But it would make so much sense, wouldn't it? 471 00:24:38,080 --> 00:24:39,800 Speaker 2: I just think that'd be really fun to be able 472 00:24:39,800 --> 00:24:42,679 Speaker 2: to better understand, or for us dumb people to know, 473 00:24:42,720 --> 00:24:46,040 Speaker 2: what our dogs want. It was fun. I thought that 474 00:24:46,119 --> 00:24:49,000 Speaker 2: was a really cool thing and just another great use 475 00:24:49,040 --> 00:24:51,439 Speaker 2: of tech, really really fun stuff. 476 00:24:51,960 --> 00:24:53,160 Speaker 1: Love it, love it. 477 00:24:53,480 --> 00:24:56,399 Speaker 2: Next? Oh, I don't know, did you want to 478 00:24:56,480 --> 00:24:59,160 Speaker 2: jump to... the AI story we started talking about 479 00:24:59,200 --> 00:25:03,920 Speaker 2: before was the swimming pool. So the Logan Public Pool. Yeah, 480 00:25:03,920 --> 00:25:06,840 Speaker 2: we kind of teased it before. So the Logan Public 481 00:25:06,880 --> 00:25:10,160 Speaker 2: Pool is in Queensland, and what they've done is they're 482 00:25:10,280 --> 00:25:14,760 Speaker 2: using AI.
Obviously, they've got cameras that are scanning the 483 00:25:14,800 --> 00:25:19,359 Speaker 2: pool and they detect anything that's kind of unusual, something 484 00:25:19,359 --> 00:25:21,680 Speaker 2: that seems out of the ordinary with movement in the water, 485 00:25:22,080 --> 00:25:24,760 Speaker 2: and then it sends a smartwatch alert to one 486 00:25:24,800 --> 00:25:28,760 Speaker 2: of the lifeguards. So the data then also looked at, say, 487 00:25:28,800 --> 00:25:31,120 Speaker 2: pool hotspots, so for example, it might be the six 488 00:25:31,160 --> 00:25:33,879 Speaker 2: foot part of the pool as opposed to the wading 489 00:25:34,000 --> 00:25:38,840 Speaker 2: part of the pool. Yeah, and so there are areas where a 490 00:25:38,920 --> 00:25:41,160 Speaker 2: person might be more likely to struggle when they're out 491 00:25:41,160 --> 00:25:43,480 Speaker 2: of their depth, so the deeper end of the pool, 492 00:25:43,480 --> 00:25:45,720 Speaker 2: that sort of stuff. And it's a really big thing 493 00:25:45,800 --> 00:25:49,080 Speaker 2: because it was a sad story that prompted it, because 494 00:25:49,119 --> 00:25:51,600 Speaker 2: back in twenty sixteen a young girl had drowned at 495 00:25:51,600 --> 00:25:54,959 Speaker 2: the pool and they really wanted to look at ways, 496 00:25:55,040 --> 00:25:58,240 Speaker 2: you know, to help those people who are swimming 497 00:25:58,280 --> 00:26:00,480 Speaker 2: at this particular time of the year, when you've 498 00:26:00,480 --> 00:26:03,560 Speaker 2: got maybe a handful of lifeguards and so many people swimming, 499 00:26:03,800 --> 00:26:08,040 Speaker 2: it would be easy to miss something and have a tragedy 500 00:26:08,200 --> 00:26:10,960 Speaker 2: like that. And I was looking at some of the statistics.
501 00:26:11,000 --> 00:26:14,080 Speaker 2: I mean, backyard pools are always and have been the 502 00:26:14,119 --> 00:26:18,440 Speaker 2: biggest danger, because national drownings, I think, in Australia were 503 00:26:18,480 --> 00:26:21,679 Speaker 2: about three hundred and twenty three people, which is super sad, 504 00:26:22,680 --> 00:26:25,520 Speaker 2: but that's an increase from last year of sixteen percent, 505 00:26:26,040 --> 00:26:29,520 Speaker 2: and six of those people drowned in public pools, and 506 00:26:29,560 --> 00:26:31,720 Speaker 2: there were also non-fatal drownings, so it's not just 507 00:26:31,760 --> 00:26:35,080 Speaker 2: people drowning, but people who get into trouble and potentially, 508 00:26:35,320 --> 00:26:38,680 Speaker 2: you know, if you're part of an incident 509 00:26:38,760 --> 00:26:41,480 Speaker 2: and you've lost the ability to breathe and you have 510 00:26:41,520 --> 00:26:44,440 Speaker 2: to be revived, there could be ramifications from that as well. 511 00:26:44,880 --> 00:26:48,520 Speaker 2: So it's actually really interesting that they're using this sort 512 00:26:48,520 --> 00:26:50,879 Speaker 2: of tech. And I could see that, you know, to 513 00:26:50,960 --> 00:26:53,840 Speaker 2: monitor the pool, if you again had, you know, a 514 00:26:53,880 --> 00:26:58,160 Speaker 2: couple of cameras outside in a home, it could monitor 515 00:26:58,160 --> 00:26:59,800 Speaker 2: and raise the alarm, you know, if a toddler, 516 00:27:00,720 --> 00:27:02,680 Speaker 2: I mean, I know we should have gates and things, 517 00:27:02,720 --> 00:27:05,480 Speaker 2: but if a child walked over and got into distress, 518 00:27:05,520 --> 00:27:08,000 Speaker 2: it could alert you, warn you and 519 00:27:08,040 --> 00:27:11,280 Speaker 2: set off an alarm.
So there's hope that this sort 520 00:27:11,280 --> 00:27:13,920 Speaker 2: of technology could then be applied on a broader scale, 521 00:27:14,000 --> 00:27:16,680 Speaker 2: because again, once you've got the data set, once you've 522 00:27:16,720 --> 00:27:19,919 Speaker 2: got the information that says that this part of the 523 00:27:19,920 --> 00:27:22,080 Speaker 2: pool is a danger area, or if a person is 524 00:27:22,119 --> 00:27:25,040 Speaker 2: flailing around and moving their arms erratically, then that's a 525 00:27:25,080 --> 00:27:28,520 Speaker 2: trigger point to raise the alarm, as opposed to someone 526 00:27:28,520 --> 00:27:32,240 Speaker 2: who's just swimming laps or paddling around. And that's where, 527 00:27:32,320 --> 00:27:36,200 Speaker 2: having that data set initially and doing the training 528 00:27:36,240 --> 00:27:38,679 Speaker 2: and then employing that, say, in the public pool sector, 529 00:27:39,040 --> 00:27:41,040 Speaker 2: you can then use that same data 530 00:27:41,119 --> 00:27:43,919 Speaker 2: set to apply to a home environment. That's what the 531 00:27:43,960 --> 00:27:47,160 Speaker 2: great thing is about training these sorts of AI models 532 00:27:47,240 --> 00:27:48,560 Speaker 2: or these technology models. 533 00:27:49,280 --> 00:27:52,879 Speaker 1: And I guess the thing too is that AI doesn't 534 00:27:52,920 --> 00:27:56,159 Speaker 1: get tired, it doesn't get distracted, it doesn't need to 535 00:27:56,200 --> 00:27:59,680 Speaker 1: go and get lunch. It doesn't lose focus, it doesn't 536 00:27:59,720 --> 00:28:02,679 Speaker 1: need a wee, it doesn't try and chat up one of 537 00:28:02,680 --> 00:28:05,879 Speaker 1: the girls at the pool or one of the boys. 538 00:28:06,240 --> 00:28:07,840 Speaker 2: I mean, if I was a pool 539 00:28:07,840 --> 00:28:10,440 Speaker 2: guard and you walked past in a mankini, that would 540 00:28:10,440 --> 00:28:11,159 Speaker 2: be distracting.
541 00:28:11,359 --> 00:28:13,520 Speaker 1: You know, well, that's, I mean, probably three people 542 00:28:13,600 --> 00:28:15,800 Speaker 1: would drown just because you were looking at my ass. 543 00:28:15,800 --> 00:28:21,720 Speaker 1: And I mean, we can't have that. Evacuate! Nobody's looking, 544 00:28:21,920 --> 00:28:24,760 Speaker 1: nobody's looking at my ass. But I guess one of 545 00:28:24,800 --> 00:28:27,639 Speaker 1: the things that I think about a bit, and you 546 00:28:27,720 --> 00:28:31,400 Speaker 1: and I have spoken about a bit, is the fact that 547 00:28:31,480 --> 00:28:36,479 Speaker 1: while AI is, you know, growing exponentially, and there's 548 00:28:36,560 --> 00:28:41,800 Speaker 1: a myriad of benefits and potential benefits, I guess it 549 00:28:41,840 --> 00:28:45,320 Speaker 1: impacts some people personally, like in the music industry and 550 00:28:45,360 --> 00:28:48,760 Speaker 1: many other industries, where I see your headline here is 551 00:28:48,800 --> 00:28:53,760 Speaker 1: that over the next four years nearly a quarter of 552 00:28:53,800 --> 00:28:55,800 Speaker 1: people are going to be affected, like people in the 553 00:28:55,880 --> 00:28:58,240 Speaker 1: music industry. Their income is going to come down, and 554 00:28:58,680 --> 00:29:03,440 Speaker 1: some jobs or some roles are going to become 555 00:29:03,440 --> 00:29:07,080 Speaker 1: the, I guess, the landscape of AI rather than humans. 556 00:29:07,120 --> 00:29:09,520 Speaker 1: But that's happening in a lot of industries and professions. 557 00:29:09,560 --> 00:29:13,400 Speaker 1: But what are the pros and cons of that? 558 00:29:13,520 --> 00:29:17,160 Speaker 2: I guess it's a really good point. And you know, 559 00:29:17,200 --> 00:29:20,360 Speaker 2: it's funny.
This topic came up during the hackathon last week, 560 00:29:20,720 --> 00:29:23,280 Speaker 2: and one of the things I think about is when 561 00:29:23,680 --> 00:29:27,600 Speaker 2: we thought about the Industrial Revolution and when, you know, 562 00:29:27,720 --> 00:29:31,360 Speaker 2: when Henry Ford first rolled out the Model T and 563 00:29:31,440 --> 00:29:35,240 Speaker 2: they started factories and production lines and assembly lines. It 564 00:29:35,280 --> 00:29:37,400 Speaker 2: was thought that this could be, you know, the end 565 00:29:37,840 --> 00:29:40,280 Speaker 2: of everything when factories came about. 566 00:29:40,360 --> 00:29:43,880 Speaker 1: Oh yes, and do you remember also, sorry to interrupt, 567 00:29:43,960 --> 00:29:47,680 Speaker 1: but this is quite famous: when they started bringing out cars, 568 00:29:48,600 --> 00:29:53,400 Speaker 1: all the 569 00:29:53,480 --> 00:29:56,360 Speaker 1: horse drawn carts, and the people who, you know, they thought 570 00:29:56,360 --> 00:29:57,960 Speaker 1: it was the end of the world and they were 571 00:29:58,080 --> 00:30:02,520 Speaker 1: rallying against these motorized buggies because it meant the end 572 00:30:02,560 --> 00:30:05,800 Speaker 1: of an industry and the end of livelihood for many people. 573 00:30:06,280 --> 00:30:09,120 Speaker 2: And look, it did. It meant that some jobs did go, 574 00:30:09,360 --> 00:30:12,880 Speaker 2: but what it meant was people could be retrained to then, 575 00:30:13,360 --> 00:30:17,640 Speaker 2: you know, work on cars and become mechanics or, you know, 576 00:30:17,840 --> 00:30:20,600 Speaker 2: or chauffeurs in your case. You know, there's lots of 577 00:30:20,640 --> 00:30:24,960 Speaker 2: things that you could do.
So the industries do evolve 578 00:30:25,040 --> 00:30:29,080 Speaker 2: and move. And you know, the thing about using a 579 00:30:29,080 --> 00:30:31,680 Speaker 2: lot of this technology is it does make life a 580 00:30:31,680 --> 00:30:35,240 Speaker 2: lot easier and frees us up to do other things. 581 00:30:35,680 --> 00:30:38,960 Speaker 2: And whether you're using one of those tools at work 582 00:30:39,080 --> 00:30:41,240 Speaker 2: to be able just to check an email that you're 583 00:30:41,280 --> 00:30:45,040 Speaker 2: about to send off for accuracy, for spelling, you know, grammar, 584 00:30:45,080 --> 00:30:47,480 Speaker 2: that sort of thing, it means that you're getting the 585 00:30:47,480 --> 00:30:50,880 Speaker 2: message across, or summarizing something. You know, how many times 586 00:30:51,040 --> 00:30:54,840 Speaker 2: have you received a lot of information at once and think, look, 587 00:30:54,840 --> 00:30:56,720 Speaker 2: I haven't got the time to read all of this. 588 00:30:57,040 --> 00:30:59,240 Speaker 2: And if you throw that into, say, ChatGPT and 589 00:30:59,240 --> 00:31:03,040 Speaker 2: say, can you summarize this concisely into just some bullet points, 590 00:31:03,400 --> 00:31:05,080 Speaker 2: then you've got it and you can... I mean, I 591 00:31:05,120 --> 00:31:07,240 Speaker 2: probably should do that before I do the show. I 592 00:31:07,280 --> 00:31:09,680 Speaker 2: should throw all the notes into ChatGPT and say, 593 00:31:09,720 --> 00:31:10,920 Speaker 2: just summarize all this crap. 594 00:31:11,240 --> 00:31:13,400 Speaker 1: You know what's funny is the document that you send. 595 00:31:13,520 --> 00:31:16,080 Speaker 1: So this is what happens, everyone. Patrick sends me, about 596 00:31:16,120 --> 00:31:20,280 Speaker 1: thirteen seconds before we go live,
He sends me a 597 00:31:20,440 --> 00:31:23,360 Speaker 1: document, and it's got here, I'm looking at right now, 598 00:31:23,440 --> 00:31:28,000 Speaker 1: Episode ninety menu. So these are the headings: psychology, tech, AI, cars, 599 00:31:28,520 --> 00:31:32,720 Speaker 1: Internet scams, and then Internet, comma, and the next one 600 00:31:32,800 --> 00:31:36,640 Speaker 1: is scams. It's a thirty five page document, like, 601 00:31:37,720 --> 00:31:41,600 Speaker 1: thirty five pages of stuff. Hey, before we go to 602 00:31:41,680 --> 00:31:44,680 Speaker 1: your next topic, I wanted to just introduce a quick one. 603 00:31:44,760 --> 00:31:47,600 Speaker 1: I don't know if either of you heard about what 604 00:31:47,800 --> 00:31:52,600 Speaker 1: happened with the VCE, is it still called VCE, Year Twelve 605 00:31:52,640 --> 00:31:57,000 Speaker 1: exams in Victoria, where a whole 606 00:31:57,040 --> 00:32:01,080 Speaker 1: lot of actual exam questions got leaked too early. So 607 00:32:01,160 --> 00:32:05,440 Speaker 1: something like sixty five different subjects, which is fucking most 608 00:32:05,480 --> 00:32:10,480 Speaker 1: of them, right. So students were getting sent actual exam 609 00:32:10,720 --> 00:32:15,640 Speaker 1: questions, so they knew ahead of time 610 00:32:15,720 --> 00:32:17,680 Speaker 1: what was going to be in some of the exams, 611 00:32:18,160 --> 00:32:22,040 Speaker 1: which obviously was nobody's intention. By the way, 612 00:32:22,080 --> 00:32:24,440 Speaker 1: how happy would you be if you're a year twelve student? 613 00:32:24,480 --> 00:32:27,280 Speaker 1: But I guess that's one of the downsides, mate. 614 00:32:27,680 --> 00:32:30,480 Speaker 2: Well, I think they were test exams, weren't they? But 615 00:32:30,560 --> 00:32:33,960 Speaker 2: the test exam questions were actual exam questions. There was 616 00:32:33,960 --> 00:32:34,840 Speaker 2: a major stuff up.
617 00:32:35,000 --> 00:32:38,720 Speaker 1: Yeah, yeah. So I wonder if somebody or 618 00:32:38,760 --> 00:32:40,000 Speaker 1: something will get in trouble. 619 00:32:40,400 --> 00:32:43,800 Speaker 2: Look, you know, it's funny. I'm digressing a lot here, 620 00:32:43,880 --> 00:32:46,920 Speaker 2: but I really am not a big fan of the 621 00:32:46,960 --> 00:32:50,800 Speaker 2: whole idea of exams and how much weight is put 622 00:32:50,880 --> 00:32:52,960 Speaker 2: on them. You know, there were a lot of allowances 623 00:32:53,000 --> 00:32:56,000 Speaker 2: made during COVID, and I think that what came out 624 00:32:56,000 --> 00:32:57,800 Speaker 2: of that is, why are we putting all these people 625 00:32:57,840 --> 00:33:01,520 Speaker 2: under so much pressure and making just one lot of 626 00:33:01,560 --> 00:33:04,920 Speaker 2: exams so crucial to the rest of their life? You know, 627 00:33:05,000 --> 00:33:07,320 Speaker 2: don't you think that it's kind of skewed the wrong way? 628 00:33:07,360 --> 00:33:10,200 Speaker 2: I mean, I think exams are important to test people's knowledge, 629 00:33:10,440 --> 00:33:14,440 Speaker 2: and maybe have exams worth twenty percent. But you know, 630 00:33:14,640 --> 00:33:17,640 Speaker 2: surely all the work you do throughout the year has 631 00:33:17,680 --> 00:33:20,480 Speaker 2: got to count for something. I mean, what about what 632 00:33:20,520 --> 00:33:23,120 Speaker 2: you're doing right now? I think all the research that 633 00:33:23,160 --> 00:33:25,160 Speaker 2: you've done, you've spoken to people, you've, I mean, I 634 00:33:25,200 --> 00:33:27,400 Speaker 2: can't even begin to imagine how much work you've had 635 00:33:27,400 --> 00:33:30,000 Speaker 2: to do. But the reality of it is, if it all 636 00:33:30,040 --> 00:33:33,600 Speaker 2: then had to hinge, you know, on a two hour exam, 637 00:33:34,040 --> 00:33:36,600 Speaker 2: I mean, that would just be crazy stuff.
638 00:33:37,600 --> 00:33:40,920 Speaker 1: Well, it kind of does, it kind of does. Yeah, yeah, yeah, 639 00:33:40,920 --> 00:33:44,480 Speaker 1: because what you do is, like, five years, it depends, like, 640 00:33:44,560 --> 00:33:48,000 Speaker 1: I'm slow, but five years of research, and out of 641 00:33:48,040 --> 00:33:52,040 Speaker 1: that essentially will come three or four or five papers 642 00:33:52,080 --> 00:33:54,560 Speaker 1: that you will submit to be published, that will get 643 00:33:54,560 --> 00:33:58,560 Speaker 1: published or not. So, but I know what you're saying, 644 00:33:58,680 --> 00:34:04,320 Speaker 1: and I think the downside, look, I can 645 00:34:04,360 --> 00:34:06,760 Speaker 1: see the pros and cons of exams. The cons? Yeah, 646 00:34:06,760 --> 00:34:09,000 Speaker 1: it's pressure. But, yeah, life's pressure, like 647 00:34:09,120 --> 00:34:12,440 Speaker 1: work is pressure. Like, I was having this chat yesterday 648 00:34:12,480 --> 00:34:17,480 Speaker 1: with doctor Jody Richardson about how much 649 00:34:17,520 --> 00:34:21,240 Speaker 1: do we protect and look after and, you know, kind 650 00:34:21,280 --> 00:34:23,960 Speaker 1: of take care of our kids, and how much do 651 00:34:24,000 --> 00:34:25,799 Speaker 1: we let them go for a run and fall over 652 00:34:25,880 --> 00:34:28,640 Speaker 1: and scrape their knees and not pick them up, because 653 00:34:28,640 --> 00:34:31,680 Speaker 1: the truth is that, you know, once they hit adulthood, 654 00:34:31,719 --> 00:34:34,799 Speaker 1: nobody's running around to protect their feelings or emotions or 655 00:34:34,800 --> 00:34:40,360 Speaker 1: their mindset or their... So it's like, I think kids 656 00:34:40,360 --> 00:34:43,280 Speaker 1: dealing with hard things is good, because life is hard, 657 00:34:43,400 --> 00:34:46,360 Speaker 1: or life is hard at times, and life is pressure 658 00:34:46,440 --> 00:34:49,200 Speaker 1: and stress and life is all
of that. But I 659 00:34:49,280 --> 00:34:52,120 Speaker 1: know what you're saying, because there are actually some people 660 00:34:53,560 --> 00:34:56,400 Speaker 1: who are really good at exams naturally, and other people 661 00:34:56,400 --> 00:35:00,640 Speaker 1: who are really intelligent but just happen to crumble at times. 662 00:35:00,960 --> 00:35:04,399 Speaker 1: So it's, I don't know what the ideal model is, mate, 663 00:35:04,440 --> 00:35:06,000 Speaker 1: but you do make a good point. 664 00:35:06,400 --> 00:35:08,680 Speaker 2: Yeah, yeah. I 665 00:35:08,680 --> 00:35:11,320 Speaker 2: mean, I did okay with exams. I felt that I 666 00:35:11,360 --> 00:35:13,760 Speaker 2: was actually not too bad with them. But I certainly 667 00:35:13,800 --> 00:35:16,839 Speaker 2: had friends who, as you say, would crumble and just 668 00:35:16,960 --> 00:35:20,600 Speaker 2: become a mess when put under that sort 669 00:35:20,600 --> 00:35:25,760 Speaker 2: of pressure, which wasn't indicative of their knowledge and skills anyway. 670 00:35:25,760 --> 00:35:29,000 Speaker 2: But you're right. I guess it comes down to, who 671 00:35:29,000 --> 00:35:31,799 Speaker 2: do you want operating on you as they're tying off 672 00:35:31,840 --> 00:35:37,200 Speaker 2: that last suture for that crucial operation? The person 673 00:35:38,080 --> 00:35:40,160 Speaker 2: who, you know, got twenty in their exams or the 674 00:35:40,160 --> 00:35:41,399 Speaker 2: person who got ninety five? 675 00:35:42,880 --> 00:35:45,160 Speaker 1: Well, but how many people were shit at school but 676 00:35:45,200 --> 00:35:48,839 Speaker 1: they're brilliant? I mean, so many people fucking crashed out 677 00:35:48,840 --> 00:35:51,960 Speaker 1: of school and, you know, finished 678 00:35:51,960 --> 00:35:55,640 Speaker 1: school at year nine or ten.
But it's just 679 00:35:56,120 --> 00:35:58,360 Speaker 1: not at all that they were dumb or not creative, 680 00:35:58,440 --> 00:36:01,719 Speaker 1: or not brilliant, or even had a low IQ. 681 00:36:01,920 --> 00:36:04,000 Speaker 1: They had a high IQ. It's just that that 682 00:36:04,400 --> 00:36:07,640 Speaker 1: for some people. And we can't create one model that 683 00:36:07,680 --> 00:36:10,480 Speaker 1: suits everyone, so we're not blaming anyone. But yeah, it's 684 00:36:11,120 --> 00:36:14,000 Speaker 1: intelligence is a spectrum, not a single thing. 685 00:36:14,120 --> 00:36:16,759 Speaker 2: And it's a round peg in a square hole. You know, it 686 00:36:16,960 --> 00:36:19,560 Speaker 2: just doesn't fit everybody. And I think, you know, our 687 00:36:19,600 --> 00:36:22,160 Speaker 2: society is such that, you know, we can steer people 688 00:36:22,200 --> 00:36:25,640 Speaker 2: to education and traditional education, but you know, people learn 689 00:36:25,800 --> 00:36:29,480 Speaker 2: other ways. There are other ways to take in information and 690 00:36:29,560 --> 00:36:33,080 Speaker 2: process it. And I've met amazingly smart people who had 691 00:36:33,960 --> 00:36:37,399 Speaker 2: very little traditional education. And you know, you hear these 692 00:36:37,440 --> 00:36:39,919 Speaker 2: amazing stories. And there was a guy in India, young 693 00:36:39,920 --> 00:36:42,640 Speaker 2: guy in India who was a mathematician and he was 694 00:36:42,800 --> 00:36:46,520 Speaker 2: plucked out of obscurity and was like one of 695 00:36:46,560 --> 00:36:48,719 Speaker 2: the smartest people on the planet. You know, how do 696 00:36:48,800 --> 00:36:51,120 Speaker 2: you know that? How do you test for that? You know, 697 00:36:51,120 --> 00:36:51,760 Speaker 2: it's so random. 698 00:36:52,880 --> 00:36:58,640 Speaker 1: Well, hey, let's talk about cars. Come on, tell me 699 00:36:58,680 --> 00:36:59,640 Speaker 1: something about cars.
700 00:37:00,400 --> 00:37:06,000 Speaker 2: So there's Mercedes Benz is researching a solar paint. So 701 00:37:06,040 --> 00:37:08,920 Speaker 2: what it means is, rather than just having panels, that 702 00:37:09,320 --> 00:37:11,680 Speaker 2: you put the paint onto the vehicle, so in this 703 00:37:11,760 --> 00:37:14,680 Speaker 2: case a car. And they're saying this is between five 704 00:37:14,719 --> 00:37:17,640 Speaker 2: to ten years away, and what it could effectively mean 705 00:37:17,960 --> 00:37:21,280 Speaker 2: is that a car. We talked about a company in Norway. 706 00:37:21,320 --> 00:37:23,560 Speaker 2: Remember there was a startup in Norway that promised they 707 00:37:23,600 --> 00:37:25,800 Speaker 2: were going to do a solar car that was able 708 00:37:25,840 --> 00:37:27,640 Speaker 2: to charge the battery and you'd never you know, you 709 00:37:27,719 --> 00:37:32,000 Speaker 2: get thousands of kilometers and it flopped unfortunately. But now 710 00:37:32,040 --> 00:37:34,279 Speaker 2: they're saying that this is so it just kind of 711 00:37:34,280 --> 00:37:37,600 Speaker 2: instead of just coating the roof and the bonnet, they're 712 00:37:37,640 --> 00:37:40,399 Speaker 2: saying that you cover the entire car with this new 713 00:37:40,440 --> 00:37:43,880 Speaker 2: solar paint, and it means that you've got such a 714 00:37:43,880 --> 00:37:47,160 Speaker 2: big surface area you'll be able to use the solar 715 00:37:47,200 --> 00:37:50,600 Speaker 2: paint to charge the vehicle at any point that there's sunlight. 716 00:37:50,760 --> 00:37:53,359 Speaker 2: I mean that would be amazing, and I mean we're 717 00:37:53,360 --> 00:37:56,680 Speaker 2: talking at this stage. 
You know, it would cover maybe 718 00:37:56,719 --> 00:37:59,640 Speaker 2: thirty two kilometers a day, but for the little commutes 719 00:38:00,160 --> 00:38:03,040 Speaker 2: to be able to trickle charge the car constantly would 720 00:38:03,080 --> 00:38:06,080 Speaker 2: be phenomenal. Yeah, it's pretty pretty cool. I was thinking 721 00:38:06,080 --> 00:38:08,959 Speaker 2: you could get a backpack, or you could paint your dog, 722 00:38:09,239 --> 00:38:10,839 Speaker 2: your dog's coat, and then they. 723 00:38:10,840 --> 00:38:12,879 Speaker 1: You could paint the top of your head and keep 724 00:38:12,960 --> 00:38:14,200 Speaker 1: the whole house charged. 725 00:38:14,320 --> 00:38:15,880 Speaker 2: There's a lot of coverage there. There's a lot of 726 00:38:15,920 --> 00:38:20,160 Speaker 2: real estate. I should sell signage on it, shouldn't I? We could... No, 727 00:38:20,400 --> 00:38:21,320 Speaker 2: that's not nice. 728 00:38:21,640 --> 00:38:25,279 Speaker 1: But I'm thinking about like, if you've got paint that 729 00:38:25,400 --> 00:38:30,200 Speaker 1: is solar paint, then the catch twenty two 730 00:38:30,239 --> 00:38:31,560 Speaker 1: is then you've got to leave your car in the 731 00:38:31,600 --> 00:38:34,640 Speaker 1: sun all the time, which fucks up the interior. I 732 00:38:34,680 --> 00:38:37,040 Speaker 1: guess you could put one of those ugly silver things 733 00:38:37,080 --> 00:38:40,280 Speaker 1: that my mum's got, one of those ugly silver screen 734 00:38:40,360 --> 00:38:42,799 Speaker 1: things that you put across on top of the dash 735 00:38:42,840 --> 00:38:43,840 Speaker 1: and inside the window. 736 00:38:45,200 --> 00:38:50,920 Speaker 2: I have one of those too. Ah, but why. 737 00:38:50,760 --> 00:38:53,640 Speaker 1: Why doesn't that surprise me? Nana? Tell us what? 738 00:38:54,680 --> 00:38:58,400 Speaker 2: Wait a minute, Wait a minute. So when you park somewhere, say.
739 00:38:58,200 --> 00:39:00,360 Speaker 1: Can you get your voice down a couple of octaves 740 00:39:00,440 --> 00:39:03,680 Speaker 1: so I can respect you? Could you loosen your underpants 741 00:39:03,719 --> 00:39:04,480 Speaker 1: and start again? 742 00:39:04,719 --> 00:39:08,000 Speaker 2: Hey, you know I have a very impressive vocal range. 743 00:39:08,000 --> 00:39:12,839 Speaker 2: I'd like to tell you I can sing bo and bass. No, no, 744 00:39:13,080 --> 00:39:17,719 Speaker 2: what I When you go out into, like you go 745 00:39:17,800 --> 00:39:20,640 Speaker 2: to park at Bunnings, there's no shade. Right, you're out 746 00:39:20,680 --> 00:39:22,920 Speaker 2: of Bunnings, You've just spent an hour and a half 747 00:39:23,040 --> 00:39:25,320 Speaker 2: doing all your shopping. You get back into the car 748 00:39:25,719 --> 00:39:29,080 Speaker 2: and your ass sticks to the seat. Your hands are 749 00:39:29,120 --> 00:39:31,800 Speaker 2: glued to the steering wheel because they've just been seared 750 00:39:32,640 --> 00:39:34,920 Speaker 2: because it's a hot day. How do you protect yourself 751 00:39:34,960 --> 00:39:35,640 Speaker 2: from that, Crago? 752 00:39:36,120 --> 00:39:39,399 Speaker 1: Can I just say you've never been You've never been 753 00:39:39,440 --> 00:39:42,680 Speaker 1: to Bunnings. You're talking about Spotlight or something. 754 00:39:42,520 --> 00:39:44,560 Speaker 2: Going to Bunnings today, you dick. 755 00:39:46,320 --> 00:39:48,440 Speaker 1: What, to get a sausage and then head home? 756 00:39:48,680 --> 00:39:51,600 Speaker 2: I'm not getting a sausage. What? I'm vegan. I'm not going 757 00:39:51,680 --> 00:39:54,120 Speaker 2: to get a sausage, unless they're vegan ones. 758 00:39:54,400 --> 00:39:58,280 Speaker 1: Hey, tell us what Ford Australia are doing in twenty 759 00:39:58,360 --> 00:39:58,959 Speaker 1: twenty five.
760 00:39:59,120 --> 00:40:02,840 Speaker 2: No, seriously, I'm betting every motorist in Australia has the 761 00:40:02,880 --> 00:40:03,719 Speaker 2: silver foil thing. 762 00:40:04,000 --> 00:40:06,759 Speaker 1: Look, I get the practicality, I get it all right. 763 00:40:06,800 --> 00:40:08,879 Speaker 1: I'll give you a pass. I should probably get one, 764 00:40:08,960 --> 00:40:10,520 Speaker 1: but I won't, I think, Tiff. 765 00:40:11,000 --> 00:40:11,759 Speaker 2: Have you ever had one? 766 00:40:12,360 --> 00:40:12,480 Speaker 1: No? 767 00:40:12,880 --> 00:40:16,600 Speaker 2: Really? Is it just me? I'm the nana here. 768 00:40:17,920 --> 00:40:21,239 Speaker 1: It's probably I could guarantee more than half our listeners are going, No, 769 00:40:21,360 --> 00:40:23,719 Speaker 1: that's actually a good idea. Patrick's right, Craig, you're a 770 00:40:23,760 --> 00:40:27,839 Speaker 1: fucking idiot. Patrick's super. Craig, you're a dickhead. I think you're 771 00:40:27,840 --> 00:40:30,799 Speaker 1: funny and not funny at all, and stop cutting Patrick off. 772 00:40:31,480 --> 00:40:34,680 Speaker 2: Yeah there you go. Hey, you know this is a 773 00:40:34,719 --> 00:40:37,640 Speaker 2: Tesla thing that always rubs me the wrong way, where 774 00:40:37,680 --> 00:40:40,759 Speaker 2: you buy a car but then you have to subscribe 775 00:40:40,960 --> 00:40:45,439 Speaker 2: and pay for additional features. But Ford is now going 776 00:40:45,440 --> 00:40:49,439 Speaker 2: to roll this out. Now, what rubs me the wrong way 777 00:40:49,440 --> 00:40:52,280 Speaker 2: with this story? So they're saying that most Ford models, 778 00:40:52,320 --> 00:40:55,200 Speaker 2: this is a story that I saw, since the mid 779 00:40:55,280 --> 00:40:59,200 Speaker 2: twenty twenties, will soon require owners to pay for some 780 00:40:59,520 --> 00:41:03,920 Speaker 2: functions that are currently free.
So they're going to backdate 781 00:41:03,960 --> 00:41:07,120 Speaker 2: it, so it's not just for the new ones. So if you've got 782 00:41:07,160 --> 00:41:09,359 Speaker 2: a car that's only a couple of years old, potentially 783 00:41:09,719 --> 00:41:11,960 Speaker 2: you may have to start paying a subscription fee to 784 00:41:12,000 --> 00:41:16,600 Speaker 2: access certain features and navigation services and all 785 00:41:16,600 --> 00:41:18,799 Speaker 2: that sort of stuff. So if you have in-built 786 00:41:18,880 --> 00:41:21,919 Speaker 2: navigation in your car, there's a potential that you may 787 00:41:21,960 --> 00:41:24,080 Speaker 2: have to pay for that, and we're talking, you know, 788 00:41:24,120 --> 00:41:26,359 Speaker 2: an annual fee of one hundred and ten bucks. But 789 00:41:26,480 --> 00:41:28,680 Speaker 2: that's just another thing that you've got to pay on 790 00:41:28,760 --> 00:41:30,920 Speaker 2: top of everything. And you know, as far as I'm concerned, 791 00:41:30,960 --> 00:41:34,200 Speaker 2: when you buy your car, that's it. You know, whatever 792 00:41:34,600 --> 00:41:37,200 Speaker 2: the rules were when you bought it. It's one thing 793 00:41:37,239 --> 00:41:39,400 Speaker 2: to get into a contract and buy a car and 794 00:41:39,520 --> 00:41:41,560 Speaker 2: know that there are features that may be locked out 795 00:41:41,600 --> 00:41:43,680 Speaker 2: that you've got to pay a subscription for. But if you buy 796 00:41:43,719 --> 00:41:45,640 Speaker 2: it and they backdate it and they say, oh, 797 00:41:45,680 --> 00:41:47,879 Speaker 2: by the way, that car you bought two years ago, 798 00:41:48,120 --> 00:41:50,760 Speaker 2: we've changed the rules and now you can't use your 799 00:41:50,840 --> 00:41:51,840 Speaker 2: satellite navigation.
800 00:41:52,719 --> 00:41:54,880 Speaker 1: That sounds like they've got a bunch of people in 801 00:41:54,880 --> 00:41:57,719 Speaker 1: a boardroom and said, what's the worst fucking PR move 802 00:41:57,800 --> 00:42:01,640 Speaker 1: we could make? Right? That's like, rather, if you want 803 00:42:01,640 --> 00:42:04,279 Speaker 1: to make another hundred bucks, just put the car up 804 00:42:04,280 --> 00:42:07,080 Speaker 1: by one hundred bucks. But don't, because it's not even 805 00:42:07,120 --> 00:42:09,239 Speaker 1: about the hundred bucks. Well it's a bit about that, 806 00:42:09,320 --> 00:42:12,640 Speaker 1: but it's also about how dare you fucking do that. 807 00:42:12,719 --> 00:42:16,240 Speaker 1: I'm not buying a Ford now, so stick your whatever, 808 00:42:16,440 --> 00:42:19,480 Speaker 1: your surcharge or your paywall up your ass. I'm 809 00:42:19,520 --> 00:42:23,279 Speaker 1: going to buy a Mazda. It's like that. Tell us 810 00:42:23,320 --> 00:42:28,600 Speaker 1: what we've been searching for on the interwebs, Patrick, what's yeah? 811 00:42:28,719 --> 00:42:31,480 Speaker 2: Okay? So overall I always find this fun at the 812 00:42:31,560 --> 00:42:34,080 Speaker 2: end of the year, to see what people search for. 813 00:42:34,520 --> 00:42:38,440 Speaker 2: And I mean, obviously the big, big, big one that 814 00:42:38,560 --> 00:42:41,160 Speaker 2: was of interest to everybody in Australia as well as 815 00:42:41,200 --> 00:42:43,440 Speaker 2: the rest of the world was the US election. So 816 00:42:43,800 --> 00:42:47,160 Speaker 2: no question overall, the US election came out on top 817 00:42:47,640 --> 00:42:51,279 Speaker 2: as far as Google searching for Australians over the last year. 818 00:42:51,760 --> 00:42:57,319 Speaker 2: The Olympic medal tally, that was really high. Euros. Liam Payne, 819 00:42:57,360 --> 00:42:59,279 Speaker 2: I don't even know who Liam Payne is.
Anybody know 820 00:42:59,320 --> 00:43:00,640 Speaker 2: who Liam Payne is? No idea. 821 00:43:01,600 --> 00:43:05,880 Speaker 1: Liam Payne wasn't... wow. Hey, can you google Liam Payne 822 00:43:06,200 --> 00:43:08,800 Speaker 1: and just add to the tally? Well, Taylor Swift. 823 00:43:08,840 --> 00:43:11,480 Speaker 2: There's no kind of surprise there. But what when you 824 00:43:11,560 --> 00:43:14,359 Speaker 2: break it down. I mean, obviously the election was one 825 00:43:14,360 --> 00:43:17,640 Speaker 2: of the news events, and it's interesting that overall US 826 00:43:17,719 --> 00:43:20,120 Speaker 2: election and medal tally, and then news events US 827 00:43:20,160 --> 00:43:23,840 Speaker 2: election and medal tally, but there were other global figures. 828 00:43:23,960 --> 00:43:27,080 Speaker 2: This is internationally, Taylor Swift number one, and then for 829 00:43:27,200 --> 00:43:32,920 Speaker 2: people, and then Donald Trump, Kate Middleton, like Kamala Harris. Craig, 830 00:43:32,960 --> 00:43:34,840 Speaker 2: you're not even on the list. I looked. 831 00:43:35,760 --> 00:43:39,719 Speaker 1: That's disappointing. Yeah, wow, I must be I must be 832 00:43:39,800 --> 00:43:40,520 Speaker 1: number eleven. 833 00:43:40,840 --> 00:43:44,920 Speaker 2: Yeah, that could be it. Recipes: Jamie Oliver's air fryer recipes. 834 00:43:45,200 --> 00:43:48,120 Speaker 2: Cucumber salad. How about cucumber salad. 835 00:43:48,880 --> 00:43:51,840 Speaker 1: That's it's pretty it sounds like that sounds like a 836 00:43:51,880 --> 00:43:56,480 Speaker 1: you thing, that cucumber salad. Let's chuck some cow on it. 837 00:43:56,719 --> 00:43:57,600 Speaker 1: Now you're talking. 838 00:43:58,200 --> 00:44:00,640 Speaker 2: Now, this one I found really interesting.
I know you're 839 00:44:00,640 --> 00:44:05,520 Speaker 2: going to like this one, because the most popular words searched, 840 00:44:05,560 --> 00:44:12,000 Speaker 2: so people looking for definitions of words, demure. Wow, demure. 841 00:44:13,239 --> 00:44:17,400 Speaker 1: I'll tell you none of us are demure, but you 842 00:44:17,440 --> 00:44:18,200 Speaker 1: know none of us. 843 00:44:19,000 --> 00:44:21,560 Speaker 2: But how random is it that that was the most 844 00:44:21,680 --> 00:44:25,279 Speaker 2: searched word? And the reason for it was that evidently 845 00:44:25,640 --> 00:44:29,839 Speaker 2: a social influencer used the word to describe herself, and 846 00:44:30,000 --> 00:44:32,960 Speaker 2: subsequently millions of people jumped on and started looking for 847 00:44:33,000 --> 00:44:38,680 Speaker 2: the definition of the word demure. Would you describe yourself as demure? Tiff? 848 00:44:44,680 --> 00:44:47,680 Speaker 1: Yeah, g'day, it's Tiff and I'm a wallflower. I can 849 00:44:47,800 --> 00:44:50,360 Speaker 1: bench a fuck load and you should see me deadlift 850 00:44:53,600 --> 00:44:54,719 Speaker 1: And what are you looking at? 851 00:44:55,000 --> 00:44:56,680 Speaker 2: Yeah? What about? 852 00:44:56,800 --> 00:44:59,480 Speaker 1: You're not like that anymore? Since she got back from India, 853 00:44:59,480 --> 00:45:02,399 Speaker 1: she's all Dalai Lama and calm the farm and equanimity 854 00:45:02,480 --> 00:45:06,839 Speaker 1: and fucking dreamcatchers and moccasins and give us a hug. 855 00:45:07,000 --> 00:45:10,520 Speaker 1: She's she's new and improved. Speaking of the Dalai Lama. 856 00:45:10,600 --> 00:45:13,520 Speaker 4: This morning, when I was scrolling on Insta I saw 857 00:45:13,640 --> 00:45:17,160 Speaker 4: Johann Hari's reel saying that he got fat shamed by 858 00:45:17,200 --> 00:45:18,120 Speaker 4: the Dalai Lama. 859 00:45:19,200 --> 00:45:20,120 Speaker 1: That's hilarious.
860 00:45:20,520 --> 00:45:23,280 Speaker 2: Can I say I have a connection with the Dalai Lama? 861 00:45:24,800 --> 00:45:27,319 Speaker 2: You can say it. We were born on the 862 00:45:27,320 --> 00:45:30,319 Speaker 2: same day, and I have seen him once, as I 863 00:45:30,320 --> 00:45:32,200 Speaker 2: saw him at a conference. He was pretty amazing. He's 864 00:45:32,239 --> 00:45:34,360 Speaker 2: quite a cool guy. He's got a really cute giggle. 865 00:45:34,440 --> 00:45:36,120 Speaker 2: Have you noticed that? If you've seen the Dalai Lama, 866 00:45:36,160 --> 00:45:37,839 Speaker 2: if you've ever seen any stuff. 867 00:45:37,560 --> 00:45:38,760 Speaker 1: You and I saw him together. 868 00:45:38,800 --> 00:45:41,160 Speaker 2: You dickhead. You were with me at the Dalai Lama thing? 869 00:45:41,360 --> 00:45:43,279 Speaker 1: Fucking hell, you're the worst date ever. 870 00:45:43,600 --> 00:45:44,719 Speaker 2: That was a long time ago. 871 00:45:44,920 --> 00:45:49,080 Speaker 1: And I held your hand the whole time, especially in 872 00:45:49,120 --> 00:45:51,400 Speaker 1: the scary bits when he looked into your soul? 873 00:45:54,440 --> 00:45:54,640 Speaker 3: Man? 874 00:45:54,920 --> 00:45:56,760 Speaker 2: That was a long time ago. How many years ago would 875 00:45:56,760 --> 00:45:57,120 Speaker 2: that have been? 876 00:45:57,719 --> 00:45:59,680 Speaker 1: I don't know, but I don't feel I don't feel 877 00:45:59,680 --> 00:46:03,640 Speaker 1: special at all. Now, Wow, what was that movie we 878 00:46:03,719 --> 00:46:06,319 Speaker 1: saw in three D? What's it called again? Oh? 879 00:46:06,400 --> 00:46:11,920 Speaker 2: Yeah, we saw it was Oh the what was it? 880 00:46:11,880 --> 00:46:14,000 Speaker 2: It's called you know those Blue People? 881 00:46:16,480 --> 00:46:16,840 Speaker 2: You said it? 882 00:46:16,920 --> 00:46:18,200 Speaker 4: I thought, why? 883 00:46:19,800 --> 00:46:22,680 Speaker 1: So what a mix?
We've seen the We've seen Avatar 884 00:46:22,840 --> 00:46:28,480 Speaker 1: and the Dalai Lama, not in the same day. By the way, 885 00:46:28,520 --> 00:46:28,919 Speaker 1: Liam Payne. 886 00:46:29,040 --> 00:46:30,680 Speaker 3: It's the One Direction singer that. 887 00:46:30,719 --> 00:46:31,239 Speaker 1: Died. 888 00:46:33,320 --> 00:46:36,640 Speaker 2: I remember the name from somewhere, Okay, now I didn't, yeah, 889 00:46:37,160 --> 00:46:40,080 Speaker 2: now I know. So what about DIY? This 890 00:46:40,160 --> 00:46:42,879 Speaker 2: is an interesting one: what do people most need help with 891 00:46:43,160 --> 00:46:47,120 Speaker 2: when they do Google searching? DIY car maintenance, so 892 00:46:47,160 --> 00:46:49,920 Speaker 2: people are still maintaining their cars. If you popped the 893 00:46:49,920 --> 00:46:52,760 Speaker 2: bonnet on your new car, because you can't recognize anything 894 00:46:52,800 --> 00:46:54,720 Speaker 2: under that bonnet, can you? Like, there's. 895 00:46:56,200 --> 00:46:59,040 Speaker 1: I got no idea. When you turn on my car, 896 00:46:59,120 --> 00:47:03,000 Speaker 1: there's no sound because it's hybrid, so you don't even 897 00:47:03,040 --> 00:47:04,840 Speaker 1: know it's on. It's like is it on? 898 00:47:06,200 --> 00:47:08,120 Speaker 2: See? You might find this hard to believe, but I've 899 00:47:08,200 --> 00:47:11,600 Speaker 2: changed the spark plugs on my first car. I repaired 900 00:47:11,640 --> 00:47:15,360 Speaker 2: my first car myself. Remember the front bumpers on old 901 00:47:15,440 --> 00:47:16,200 Speaker 2: cars were. 902 00:47:16,040 --> 00:47:21,200 Speaker 1: Actually hang on. Let's define I repaired, yeah, because that's 903 00:47:21,400 --> 00:47:23,279 Speaker 1: come on, what does that mean? 904 00:47:23,560 --> 00:47:25,759 Speaker 2: Okay? So I may have got into a little bit 905 00:47:25,800 --> 00:47:28,200 Speaker 2: of an accident when I was nineteen.
I was on 906 00:47:28,360 --> 00:47:31,040 Speaker 2: tram tracks on Sydney Road. It was a wet day. 907 00:47:31,360 --> 00:47:33,239 Speaker 2: Guy slams the brakes on in front of me and 908 00:47:33,280 --> 00:47:35,160 Speaker 2: I was too close and I skidded into him. 909 00:47:35,480 --> 00:47:38,040 Speaker 2: So but I couldn't afford the repair, so I did 910 00:47:38,080 --> 00:47:42,120 Speaker 2: it myself. So I replaced the headlight myself, just bought 911 00:47:42,160 --> 00:47:44,600 Speaker 2: a new headlight. I did the panel beating by myself. 912 00:47:44,760 --> 00:47:47,160 Speaker 2: But the funniest thing was the panel beating wasn't that 913 00:47:47,200 --> 00:47:49,319 Speaker 2: hard, to kind of tap things out. I put a 914 00:47:49,440 --> 00:47:52,560 Speaker 2: jack inside the because the car was a Mazda eight 915 00:47:52,600 --> 00:47:54,880 Speaker 2: oh eight, so it had a little tiny engine but 916 00:47:54,920 --> 00:47:57,040 Speaker 2: a really long body, so it meant that there was 917 00:47:57,080 --> 00:47:59,680 Speaker 2: no mechanical damage. And so what I did was I 918 00:47:59,680 --> 00:48:01,560 Speaker 2: put a plank of wood and the jack, and I 919 00:48:01,640 --> 00:48:03,279 Speaker 2: jacked out the front of the car to push it 920 00:48:03,320 --> 00:48:06,200 Speaker 2: back out again. But the best thing was remember the 921 00:48:06,280 --> 00:48:08,759 Speaker 2: chrome bumper bars on Rudy, because my car was a 922 00:48:08,800 --> 00:48:11,040 Speaker 2: nineteen seventies Mazda eight oh eight, and the 923 00:48:11,120 --> 00:48:14,480 Speaker 2: chrome bumpers were solid, and it was so solid even 924 00:48:14,520 --> 00:48:16,719 Speaker 2: though it had been buckled, I couldn't bend it back.
925 00:48:16,719 --> 00:48:19,400 Speaker 2: So I put it over Dad's gas barbecue, got it 926 00:48:19,480 --> 00:48:22,480 Speaker 2: red hot, and then tapped out and got it back 927 00:48:22,480 --> 00:48:24,880 Speaker 2: into shape. So I did that myself. I think you 928 00:48:24,880 --> 00:48:25,680 Speaker 2: should be proud of me. 929 00:48:27,840 --> 00:48:30,440 Speaker 1: I would love to see a photo of the finished product. 930 00:48:30,920 --> 00:48:34,359 Speaker 2: It survived for another six years after that, I might say, 931 00:48:36,040 --> 00:48:36,920 Speaker 2: little mouse. 932 00:48:36,840 --> 00:48:43,240 Speaker 1: Well, if I have a little fender bender, I definitely 933 00:48:43,239 --> 00:48:45,480 Speaker 1: will not give you a call no, because they're plastic. 934 00:48:45,560 --> 00:48:47,799 Speaker 2: It wouldn't matter. I just need a glue gun to 935 00:48:47,800 --> 00:48:50,160 Speaker 2: put it back together again. I wouldn't need anything metal. 936 00:48:52,200 --> 00:48:55,880 Speaker 2: So car maintenance was number one for DIY. The second 937 00:48:55,880 --> 00:49:00,680 Speaker 2: one was DIY Halloween costumes, and then followed by how 938 00:49:00,719 --> 00:49:03,440 Speaker 2: to make Easter Bunny footprints. 939 00:49:04,600 --> 00:49:07,440 Speaker 1: I would have thought high on the list would be DIY. 940 00:49:07,719 --> 00:49:12,439 Speaker 1: Like medical, Yeah, I know, how to perform CPR. Yeah, 941 00:49:12,520 --> 00:49:16,480 Speaker 1: how to well, how to give yourself an appendectomy with 942 00:49:16,640 --> 00:49:20,000 Speaker 1: a sewing kit and a Swiss Army knife and some gin. 943 00:49:22,360 --> 00:49:24,200 Speaker 2: Travel. Do you know where people want to go the 944 00:49:24,239 --> 00:49:27,799 Speaker 2: most in Australia? Where do they want 945 00:49:27,800 --> 00:49:30,640 Speaker 2: to go? Gold Coast.
Cheap flights to the Gold Coast 946 00:49:30,680 --> 00:49:34,080 Speaker 2: from Melbourne, that was the most searched travel term, and 947 00:49:34,120 --> 00:49:37,400 Speaker 2: then flights to King Island and then Jetstar cheap flights 948 00:49:37,440 --> 00:49:39,320 Speaker 2: to Melbourne. Oh so people do want to come to Melbourne? 949 00:49:39,320 --> 00:49:40,840 Speaker 2: They don't just want to leave Melbourne. 950 00:49:41,080 --> 00:49:42,640 Speaker 1: Do you know what's going to sell well in the 951 00:49:42,680 --> 00:49:43,360 Speaker 1: Gold Coast? 952 00:49:43,719 --> 00:49:44,040 Speaker 2: What's that? 953 00:49:45,040 --> 00:49:46,240 Speaker 1: Solar paint cars? 954 00:49:46,560 --> 00:49:49,440 Speaker 2: Oh yeah, how good would that be? Or in Central Australia? 955 00:49:50,360 --> 00:49:53,640 Speaker 1: Oh wow, imagine that? I think, all right, let's do 956 00:49:53,719 --> 00:49:56,839 Speaker 1: one more. Let's do one more, mister host. Well. 957 00:49:56,920 --> 00:49:59,200 Speaker 2: Actually, one of the ones that I thought was interesting 958 00:49:59,440 --> 00:50:04,520 Speaker 2: related to social media, and they talk about regularly posting 959 00:50:04,560 --> 00:50:08,000 Speaker 2: on socials, as opposed to just looking at socials, can 960 00:50:08,040 --> 00:50:12,040 Speaker 2: actually worsen mental health in adults. So if you've 961 00:50:12,080 --> 00:50:16,000 Speaker 2: been caught up in that kind of maelstrom of have 962 00:50:16,080 --> 00:50:19,400 Speaker 2: to post, got to give a status update, that now 963 00:50:19,480 --> 00:50:21,760 Speaker 2: is being thought of as having a real impact.
964 00:50:21,800 --> 00:50:25,120 Speaker 2: So the Journal of Medical Internet Research, so they were 965 00:50:25,200 --> 00:50:29,359 Speaker 2: investigating fifteen thousand adults in the United Kingdom all over 966 00:50:29,400 --> 00:50:33,000 Speaker 2: the age of sixteen, and what they said was simply 967 00:50:33,160 --> 00:50:36,759 Speaker 2: viewing social media content didn't seem to have the 968 00:50:36,800 --> 00:50:41,520 Speaker 2: same effect as regularly posting on social media. So if 969 00:50:41,560 --> 00:50:44,279 Speaker 2: you fall into that category of someone who has to 970 00:50:44,320 --> 00:50:47,080 Speaker 2: post what you're doing all the time, it actually may 971 00:50:47,320 --> 00:50:51,200 Speaker 2: worsen your mental health. It could have health implications 972 00:50:51,400 --> 00:50:53,560 Speaker 2: if you feel the need to keep posting. I mean, 973 00:50:53,680 --> 00:50:56,480 Speaker 2: do you guys post much on socials? 974 00:50:56,640 --> 00:50:58,360 Speaker 4: Post every four and a half minutes? 975 00:50:58,600 --> 00:51:02,920 Speaker 1: Oh really? Her life was fucking as public as you 976 00:51:02,960 --> 00:51:03,759 Speaker 1: can get. I don't. 977 00:51:03,840 --> 00:51:06,640 Speaker 4: Every time my dog changes position, I take a photo 978 00:51:06,680 --> 00:51:07,319 Speaker 4: and post it. 979 00:51:08,080 --> 00:51:11,520 Speaker 2: Wow. Yeah, I think I posted once this year, and 980 00:51:11,520 --> 00:51:12,440 Speaker 2: that's it. 981 00:51:12,600 --> 00:51:17,120 Speaker 1: I just post messages like whiteboards. I don't really go, hey, 982 00:51:17,160 --> 00:51:20,439 Speaker 1: here's my brekkie. Sometimes I'll post something that I think 983 00:51:20,560 --> 00:51:23,520 Speaker 1: might be like a personal thing that I think might 984 00:51:23,560 --> 00:51:27,000 Speaker 1: be like my steps. I don't know if I told 985 00:51:27,000 --> 00:51:29,239 Speaker 1: you this.
Did I tell you that my phone this, 986 00:51:29,320 --> 00:51:31,560 Speaker 1: Speaking of tech, my phone told me I was a fat, 987 00:51:31,640 --> 00:51:36,320 Speaker 1: lazy fuck. Did I tell you if my phone literally 988 00:51:36,400 --> 00:51:41,279 Speaker 1: sent me? In my phone, it literally sent me an 989 00:51:41,320 --> 00:51:45,520 Speaker 1: alert telling me that I don't. Essentially, you're not walking 990 00:51:45,640 --> 00:51:50,000 Speaker 1: much and you need to walk more. So I took 991 00:51:50,040 --> 00:51:55,440 Speaker 1: that on board from my pocket coach, and yeah, I've 992 00:51:55,560 --> 00:51:59,480 Speaker 1: gone from average under five thousand a day to average 993 00:51:59,560 --> 00:52:02,240 Speaker 1: just under fifteen thousand a day. But it was funny 994 00:52:02,280 --> 00:52:05,439 Speaker 1: because for me, and actually I did not realize how 995 00:52:05,520 --> 00:52:09,000 Speaker 1: little I was working walking and generally I was walking 996 00:52:09,080 --> 00:52:11,120 Speaker 1: more than that. But the last few months where I've 997 00:52:11,120 --> 00:52:14,879 Speaker 1: been deep, balls deep as the kids say, or maybe 998 00:52:14,880 --> 00:52:18,759 Speaker 1: the kids don't say that. Maybe that's me balls deep 999 00:52:18,760 --> 00:52:23,000 Speaker 1: in my PhD and recording and reading. And I've been 1000 00:52:23,040 --> 00:52:27,040 Speaker 1: sitting down a lot. And yeah, like even this morning, 1001 00:52:27,080 --> 00:52:29,800 Speaker 1: I got up at quarter past five and I've walked 1002 00:52:29,840 --> 00:52:32,920 Speaker 1: five ks and it's eight twenty six right now. But 1003 00:52:33,239 --> 00:52:35,719 Speaker 1: it's been a I know this sounds weird, but one 1004 00:52:35,840 --> 00:52:38,799 Speaker 1: it was a really good catalyst. But it's been a 1005 00:52:38,840 --> 00:52:41,040 Speaker 1: game changer for me. 
But that's got nothing to do 1006 00:52:41,080 --> 00:52:41,520 Speaker 1: with tech. 1007 00:52:42,120 --> 00:52:43,640 Speaker 2: No, it has have everything to do with tech. What 1008 00:52:43,640 --> 00:52:45,799 Speaker 2: do you mean it was your phone? Your phone was. 1009 00:52:45,760 --> 00:52:48,840 Speaker 1: Telling well, I mean that my physiological response. 1010 00:52:50,040 --> 00:52:52,360 Speaker 2: But it can help prompt you know. It's not dissimilar 1011 00:52:52,400 --> 00:52:55,440 Speaker 2: to having the conversation with Ai like I did earlier 1012 00:52:55,480 --> 00:52:58,840 Speaker 2: in the show. It's that sense of not being alone, 1013 00:52:59,040 --> 00:53:01,680 Speaker 2: you know, can do all that sort of stuff. Although 1014 00:53:01,680 --> 00:53:03,440 Speaker 2: I've got to say this is quite funny. I was 1015 00:53:03,520 --> 00:53:05,560 Speaker 2: just in the middle of a Thai cheek class. I 1016 00:53:05,560 --> 00:53:08,160 Speaker 2: think I've been doing Tai chie for about forty five minutes, 1017 00:53:08,440 --> 00:53:11,520 Speaker 2: and my watch pinged me and said you need to 1018 00:53:11,560 --> 00:53:16,360 Speaker 2: get moving. It's like, what forty five minutes? You idiot? 1019 00:53:16,440 --> 00:53:19,560 Speaker 2: I think I was just so relaxed and my heart 1020 00:53:19,640 --> 00:53:23,520 Speaker 2: was so relaxed that it thought that I wasn't moving. Well, 1021 00:53:23,520 --> 00:53:26,160 Speaker 2: that's what Tai Chee's about, isn't It's a slow moving meditation. 1022 00:53:26,360 --> 00:53:28,759 Speaker 2: So yeah, idea not to end up with a sweat 1023 00:53:28,800 --> 00:53:31,040 Speaker 2: and panting at the end of it, but to feel 1024 00:53:31,080 --> 00:53:34,239 Speaker 2: relaxed and centered and mindful. 
See, I was 1025 00:53:34,239 --> 00:53:36,960 Speaker 2: thinking about that when Crago said before that, you know, 1026 00:53:37,000 --> 00:53:39,640 Speaker 2: you've become a slightly different person and you're more chilled 1027 00:53:39,680 --> 00:53:42,480 Speaker 2: and relaxed since going to India. It made me think 1028 00:53:42,480 --> 00:53:43,839 Speaker 2: that you and I could have hung out at this 1029 00:53:43,920 --> 00:53:45,960 Speaker 2: mindfulness seminar, because I reckon you would have got a lot 1030 00:53:46,000 --> 00:53:49,839 Speaker 2: out of it. It was good. Yeah. 1031 00:53:50,000 --> 00:53:52,839 Speaker 2: I love the idea of living in the moment, for 1032 00:53:52,920 --> 00:53:55,239 Speaker 2: the moment, not letting anything else in. And that's what tai 1033 00:53:55,360 --> 00:53:57,640 Speaker 2: chi does for me. It's great because nothing else is 1034 00:53:57,640 --> 00:53:59,440 Speaker 2: in my head and I can just focus on what 1035 00:53:59,480 --> 00:54:02,480 Speaker 2: I'm doing. The thing that I got out of it 1036 00:54:02,640 --> 00:54:06,080 Speaker 2: was five minutes a day of mindfulness. Just five minutes 1037 00:54:06,080 --> 00:54:08,920 Speaker 2: a day is a real gift to yourself. You know, 1038 00:54:08,960 --> 00:54:11,920 Speaker 2: with that hectic lifestyle, Crago, you know, you're getting up and 1039 00:54:11,960 --> 00:54:13,640 Speaker 2: walking and doing all that sort of stuff. But you 1040 00:54:13,760 --> 00:54:16,560 Speaker 2: use walking as a bit of a moving meditation, don't you? 1041 00:54:16,560 --> 00:54:18,080 Speaker 2: You're kind of just in your zone. 1042 00:54:18,200 --> 00:54:21,760 Speaker 1: I do, I do, I do. It's funny we should 1043 00:54:21,800 --> 00:54:24,640 Speaker 1: talk about walking, because what for somebody will be a 1044 00:54:24,680 --> 00:54:27,719 Speaker 1: meditation, for somebody else will be a source of anxiety. 
1045 00:54:28,640 --> 00:54:31,320 Speaker 1: You know. So, mate, tell people how they can connect 1046 00:54:31,320 --> 00:54:32,200 Speaker 1: with you and follow you. 1047 00:54:32,840 --> 00:54:35,239 Speaker 2: Oh sure, yeah. So you just need to go to 1048 00:54:35,400 --> 00:54:40,200 Speaker 2: websitesnow dot com dot au and check out 1049 00:54:40,280 --> 00:54:42,200 Speaker 2: what we do. I mean, obviously websitesnow 1050 00:54:42,200 --> 00:54:45,400 Speaker 2: dot com dot au speaks for itself. But it's just 1051 00:54:45,440 --> 00:54:47,960 Speaker 2: too hard to spell Genesis FX, because it's just a 1052 00:54:48,000 --> 00:54:50,640 Speaker 2: stupid name that I came up with before the Internet existed, 1053 00:54:50,840 --> 00:54:53,000 Speaker 2: and people can't spell it. So I had to come 1054 00:54:53,040 --> 00:54:56,040 Speaker 2: up with another name for our website. So Websites Now 1055 00:54:56,160 --> 00:54:58,080 Speaker 2: kind of made sense, didn't it, really, when you think 1056 00:54:58,120 --> 00:54:58,959 Speaker 2: about it? I think it 1057 00:54:58,880 --> 00:55:01,560 Speaker 1: was a good change. It was a good change. There's 1058 00:55:01,600 --> 00:55:04,279 Speaker 1: been quite a few businesses who have just changed their 1059 00:55:04,360 --> 00:55:06,680 Speaker 1: name and nothing else, and their business catapulted. 1060 00:55:07,080 --> 00:55:07,480 Speaker 2: Really? 1061 00:55:08,040 --> 00:55:10,680 Speaker 1: Oh yeah. Do you remember Tara and Russ had a 1062 00:55:10,719 --> 00:55:14,720 Speaker 1: business in Glen Huntly Road? It was called High Energy Network. 1063 00:55:15,040 --> 00:55:17,400 Speaker 2: Yeah, I know. They did a lot, they did quite well. 
1064 00:55:18,040 --> 00:55:20,520 Speaker 1: But then we had a meeting one day and they 1065 00:55:20,520 --> 00:55:22,600 Speaker 1: were talking about how to improve the business, and I said, 1066 00:55:22,600 --> 00:55:27,080 Speaker 1: why don't you change the name to Australian Fitness Academy? 1067 00:55:27,880 --> 00:55:31,600 Speaker 1: And they went, ah, and they did that, and just 1068 00:55:31,719 --> 00:55:37,920 Speaker 1: that doubled, tripled interest, inquiries, turnover, bums on seats, 1069 00:55:38,520 --> 00:55:42,640 Speaker 1: because it explains straight away what it is, versus, you know, 1070 00:55:42,760 --> 00:55:45,919 Speaker 1: High Energy Network. Fucking what does that even mean? 1071 00:55:46,040 --> 00:55:48,600 Speaker 1: So the name matters. Mate, I've got to go 1072 00:55:48,640 --> 00:55:52,399 Speaker 1: because we've got a thing. Thanks Tiff, thanks Patrick, thank 1073 00:55:52,440 --> 00:55:52,840 Speaker 1: you