1 00:00:00,560 --> 00:00:04,080 Speaker 1: We've got a team: me, Craig Anthony Harper, and Patrick James Bonello. 2 00:00:04,120 --> 00:00:06,280 Speaker 1: The name of the show is the You Project. Every 3 00:00:07,280 --> 00:00:09,840 Speaker 1: second week, young Patrick and I just saddle up. Usually 4 00:00:09,920 --> 00:00:13,880 Speaker 1: young Tiff is Tiffin. Who's Tiffin? Sounds like some kind 5 00:00:13,880 --> 00:00:17,160 Speaker 1: of stage show. Usually Tiff is here, she's not today, 6 00:00:17,280 --> 00:00:20,840 Speaker 1: so it's just us two alpha males just fucking holding 7 00:00:20,880 --> 00:00:23,680 Speaker 1: down the fort. But who reckons... who's more alpha male? 8 00:00:24,000 --> 00:00:29,040 Speaker 2: You or me? Ah, neither of us. Fritz, that's true. 9 00:00:29,360 --> 00:00:34,760 Speaker 3: I think it's since he's had his nuts cut off? Can I 10 00:00:34,840 --> 00:00:37,440 Speaker 3: just ask a question? How come I never do the opener? 11 00:00:37,520 --> 00:00:38,960 Speaker 3: I know it's your show, and... 12 00:00:40,520 --> 00:00:41,960 Speaker 4: Let's start again. Let's start again? 13 00:00:42,600 --> 00:00:47,479 Speaker 1: Okay, yep. Hey groovers, welcome to the You Project. I'm 14 00:00:47,560 --> 00:00:49,920 Speaker 1: joined by one of the people I most respect in 15 00:00:49,960 --> 00:00:52,919 Speaker 1: all the world. In fact, I am humbled in his 16 00:00:53,000 --> 00:00:55,160 Speaker 1: presence and in a lot of ways I wish I 17 00:00:55,320 --> 00:00:58,600 Speaker 1: was him. Not only is he smarter, better looking, and 18 00:00:58,680 --> 00:01:02,840 Speaker 1: more talented, he shares the same birthday with the Dalai Lama, 19 00:01:03,120 --> 00:01:06,160 Speaker 1: which goes a long way to explaining who he is. So, 20 00:01:06,319 --> 00:01:10,400 Speaker 1: without further ado, welcome back to the You Project. Patrick Bonello! 21 00:01:11,360 --> 00:01:11,960 Speaker 1: Hey, how good?
22 00:01:12,040 --> 00:01:13,400 Speaker 2: What the fuck? 23 00:01:13,480 --> 00:01:13,520 Speaker 4: And? 24 00:01:13,680 --> 00:01:13,920 Speaker 1: Hell? 25 00:01:14,520 --> 00:01:19,560 Speaker 4: Now I know what you did. I'm quite terrified. Tell 26 00:01:19,600 --> 00:01:22,479 Speaker 4: people, I did not know that was coming. Everybody, that 27 00:01:22,640 --> 00:01:26,480 Speaker 4: wasn't me saying that. Wow. When did you do that? 28 00:01:26,680 --> 00:01:29,440 Speaker 4: And how? Explain to people what that was. 29 00:01:30,160 --> 00:01:34,520 Speaker 3: So I subscribed to a service called ElevenLabs. It's 30 00:01:34,520 --> 00:01:39,720 Speaker 3: an AI voice generator. I sampled just one minute of 31 00:01:39,760 --> 00:01:43,280 Speaker 3: your voice from one of your podcasts. I uploaded it 32 00:01:43,360 --> 00:01:46,360 Speaker 3: and then I profiled you. So I put middle aged 33 00:01:46,440 --> 00:01:51,240 Speaker 3: Australian male, just a very brief description, you know, deep voice, 34 00:01:51,280 --> 00:01:54,920 Speaker 3: that sort of stuff, and it generated that. So effectively, 35 00:01:56,000 --> 00:01:57,480 Speaker 2: you could give me a 36 00:01:57,440 --> 00:02:00,000 Speaker 3: sentence now and I could get you to say it 37 00:02:00,120 --> 00:02:00,919 Speaker 3: back in AI. 38 00:02:02,040 --> 00:02:02,800 Speaker 2: Something simple. 39 00:02:03,440 --> 00:02:08,960 Speaker 1: That's all right. I've got a big crush on Patrick. 40 00:02:10,120 --> 00:02:15,600 Speaker 1: Right, my type. I don't know that this is very 41 00:02:15,600 --> 00:02:18,080 Speaker 1: good live, but anyway, here we go, here we go. Yep: 42 00:02:18,880 --> 00:02:20,520 Speaker 1: I've got a big crush on Patrick. 43 00:02:21,080 --> 00:02:23,800 Speaker 2: Oh no, not as good? That is it? 44 00:02:23,800 --> 00:02:24,560 Speaker 4: It's not bad? 45 00:02:24,760 --> 00:02:25,920 Speaker 2: Drop it back, let me try again.
46 00:02:26,919 --> 00:02:30,040 Speaker 1: I've got a big crush on Patrick. It's kind of... 47 00:02:31,040 --> 00:02:32,679 Speaker 1: it's good until it gets to your name. 48 00:02:33,080 --> 00:02:33,600 Speaker 2: Yeah. 49 00:02:34,040 --> 00:02:36,359 Speaker 3: So what you can do is you can keep uploading 50 00:02:36,440 --> 00:02:40,920 Speaker 3: additional audio, and then as it profiles your voice, 51 00:02:41,360 --> 00:02:43,800 Speaker 3: it'll be able to get it a lot better. Like, 52 00:02:43,840 --> 00:02:46,520 Speaker 3: for example, I don't think you remember saying this earlier, 53 00:02:46,520 --> 00:02:48,000 Speaker 3: but I'm sure you said this earlier, 54 00:02:48,040 --> 00:02:48,320 Speaker 2: Craig. 55 00:02:49,000 --> 00:02:50,919 Speaker 1: It just occurred to me that if all of our 56 00:02:50,960 --> 00:02:55,000 Speaker 1: listeners sent Patrick five dollars each, I will match that 57 00:02:55,120 --> 00:02:56,520 Speaker 1: with whatever we come up with. 58 00:02:59,440 --> 00:03:02,000 Speaker 4: Fuck you, fuck you and your AI skills. 59 00:03:02,560 --> 00:03:05,440 Speaker 3: So I might throw a few of these just randomly 60 00:03:05,480 --> 00:03:07,880 Speaker 3: through the show, just when, you know, there's a pause, 61 00:03:07,919 --> 00:03:08,880 Speaker 3: I'll just throw something in. 62 00:03:10,200 --> 00:03:13,639 Speaker 1: That is very funny and a little bit terrifying. I 63 00:03:13,639 --> 00:03:16,800 Speaker 1: never thought... I've seen that done to 64 00:03:16,840 --> 00:03:19,160 Speaker 1: other people. I did not expect anyone to be doing 65 00:03:19,200 --> 00:03:22,320 Speaker 1: that to me anytime soon. But thanks for introducing the 66 00:03:22,360 --> 00:03:26,640 Speaker 1: world to that. Thanks for giving people that idea. Wow, 67 00:03:27,840 --> 00:03:31,160 Speaker 1: how long did that take?
I mean you're very tech savvy, 68 00:03:31,240 --> 00:03:34,040 Speaker 1: but how long did that take for you to figure out? Like, 69 00:03:34,080 --> 00:03:37,360 Speaker 1: you could literally do a whole podcast of me when 70 00:03:37,360 --> 00:03:39,040 Speaker 1: I'm not even present. Well, you 71 00:03:39,000 --> 00:03:42,160 Speaker 3: could do both of us and then just create it 72 00:03:42,200 --> 00:03:43,760 Speaker 3: and put it together. It would take a little bit 73 00:03:43,760 --> 00:03:47,560 Speaker 3: of work, but they are getting so much better. And 74 00:03:47,720 --> 00:03:51,520 Speaker 3: to answer your question, probably, having never used the software, 75 00:03:51,600 --> 00:03:54,000 Speaker 3: never done anything with it before, I would say maybe 76 00:03:54,040 --> 00:03:54,920 Speaker 3: three or four minutes. 77 00:03:55,600 --> 00:03:57,240 Speaker 2: Once I found the clip and 78 00:03:57,240 --> 00:04:01,000 Speaker 3: uploaded it, it profiled it within a short amount of time. 79 00:04:01,040 --> 00:04:03,040 Speaker 3: I took the phrase, typed it up for the intro 80 00:04:03,240 --> 00:04:05,760 Speaker 3: to the show, and then I played it through about 81 00:04:05,800 --> 00:04:08,400 Speaker 3: four times, and you can alter it, and then I 82 00:04:08,480 --> 00:04:10,080 Speaker 3: chose the one I thought sounded the best. 83 00:04:10,920 --> 00:04:15,200 Speaker 1: Wow. So you could get some famous speech and get 84 00:04:15,200 --> 00:04:18,200 Speaker 1: me to say that famous speech. So could you just 85 00:04:18,440 --> 00:04:24,120 Speaker 1: upload the dialogue, and then... is that how 86 00:04:24,160 --> 00:04:24,640 Speaker 1: it works? 87 00:04:24,800 --> 00:04:25,040 Speaker 2: Yeah?
88 00:04:25,120 --> 00:04:27,479 Speaker 3: Yeah, I can type anything, so I can either type 89 00:04:27,480 --> 00:04:31,920 Speaker 3: it in or I can just copy and paste, hit 90 00:04:31,960 --> 00:04:34,120 Speaker 3: the generate speech button, and off it goes. 91 00:04:34,520 --> 00:04:38,719 Speaker 1: Now, there's some famous Star Wars quote, like 92 00:04:38,800 --> 00:04:43,520 Speaker 1: in a galaxy far, far... whatever. This is probably not 93 00:04:43,680 --> 00:04:46,560 Speaker 1: something we should do in the middle of recording a podcast, everyone, 94 00:04:46,600 --> 00:04:50,039 Speaker 1: but fuck it. Stand by, let's see what happens. Is 95 00:04:50,080 --> 00:04:55,159 Speaker 1: there some... wrong in this one? The force is strong 96 00:04:55,200 --> 00:04:55,600 Speaker 1: in this one. 97 00:04:57,680 --> 00:05:00,240 Speaker 2: There you go. Well, it kind of sounds funny, you 98 00:05:00,279 --> 00:05:01,279 Speaker 2: saying that, because I know you. 99 00:05:01,520 --> 00:05:05,000 Speaker 1: It doesn't give it any inflection, does it? Like, there's 100 00:05:05,040 --> 00:05:05,679 Speaker 1: no emotion. 101 00:05:05,839 --> 00:05:10,240 Speaker 4: It's very... it's very flat. 102 00:05:11,000 --> 00:05:12,480 Speaker 1: The force is strong in this one. 103 00:05:13,360 --> 00:05:14,360 Speaker 2: That's a little bit better. 104 00:05:14,680 --> 00:05:17,320 Speaker 4: Yeah, that's... what did you do then? Like, what? 105 00:05:17,880 --> 00:05:20,480 Speaker 3: Yep, there's more variable, so you can either make it 106 00:05:20,480 --> 00:05:23,800 Speaker 3: more stable or more variable. So there's three controls: 107 00:05:23,800 --> 00:05:31,279 Speaker 3: similarity, stability, and then style exaggeration. Yeah, yeah. Wow, wow. 108 00:05:32,440 --> 00:05:36,640 Speaker 3: And then you can use other languages as well. 109 00:05:36,920 --> 00:05:38,080 Speaker 1: Force is strong in this one.
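[Editor's note] The workflow Patrick describes — clone a voice from a short sample, type a phrase, hit generate, then re-roll with the three controls he names (similarity, stability, style exaggeration) — maps roughly onto ElevenLabs' public REST API. The sketch below is a hedged illustration, not a definitive implementation: the endpoint path, header, and field names reflect my understanding of that API, and `VOICE_ID` is a placeholder for the ID returned when the voice sample is uploaded.

```python
# Hedged sketch of an ElevenLabs-style text-to-speech call: build the JSON
# payload with the three voice controls, then POST it to get audio back.
import json
import os
import urllib.request

API_KEY = os.environ.get("ELEVENLABS_API_KEY", "")  # never hard-code keys
VOICE_ID = "your-cloned-voice-id"  # placeholder: assigned when the sample is uploaded


def build_tts_request(text, stability=0.5, similarity=0.75, style=0.0):
    """Build the request payload; lower stability = more variable delivery."""
    return {
        "text": text,
        "model_id": "eleven_multilingual_v2",
        "voice_settings": {
            "stability": stability,          # stable vs. variable delivery
            "similarity_boost": similarity,  # closeness to the sampled voice
            "style": style,                  # style exaggeration
        },
    }


def speak(text):
    """POST the payload and return the generated audio (MP3 bytes)."""
    req = urllib.request.Request(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        data=json.dumps(build_tts_request(text)).encode(),
        headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


if __name__ == "__main__" and API_KEY:
    audio = speak("The force is strong in this one.")
    with open("fake_craig.mp3", "wb") as f:
        f.write(audio)
```

Re-generating with a lower `stability` is what made the second "force is strong" take sound less flat.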
110 00:05:38,360 --> 00:05:39,920 Speaker 2: It doesn't sound too bad, does it? 111 00:05:40,440 --> 00:05:44,600 Speaker 4: That's not too bad. It's not too bad. Interesting, you 112 00:05:44,640 --> 00:05:49,400 Speaker 4: are. Interesting, you are. All right? 113 00:05:49,440 --> 00:05:51,440 Speaker 1: Well, I'm sure we'll jump in and out of fake 114 00:05:51,560 --> 00:05:54,240 Speaker 1: me through the show. I can't wait to see how 115 00:05:54,279 --> 00:05:57,320 Speaker 1: you're going to fucking, you know, destroy me as we 116 00:05:57,360 --> 00:05:57,719 Speaker 1: move on. 117 00:05:58,200 --> 00:05:59,880 Speaker 4: What are we talking about in today's episode? 118 00:06:00,040 --> 00:06:06,679 Speaker 1: If Patrick does type... one of my favorite porn films. 119 00:06:06,120 --> 00:06:09,440 Speaker 3: My next big gadget, I reckon. And I've been looking 120 00:06:09,480 --> 00:06:14,080 Speaker 3: at these quite closely. Ironically, they're glasses, so I guess 121 00:06:14,080 --> 00:06:16,960 Speaker 3: you would be... but smart glasses. Now, there are two 122 00:06:17,000 --> 00:06:19,680 Speaker 3: different types of smart glasses when we think about them. 123 00:06:19,960 --> 00:06:22,279 Speaker 3: One is, as we've talked about, the 124 00:06:22,320 --> 00:06:25,440 Speaker 3: augmented reality ones, where it superimposes a picture. They're 125 00:06:25,480 --> 00:06:29,840 Speaker 3: not quite there yet, but Huawei has released a new 126 00:06:29,960 --> 00:06:32,560 Speaker 3: version of what they call their Eyewear 2, and what 127 00:06:33,480 --> 00:06:36,640 Speaker 3: I like about it is you can wear the glasses 128 00:06:36,880 --> 00:06:39,480 Speaker 3: and you can use them just every day. They last 129 00:06:39,520 --> 00:06:43,560 Speaker 3: about eleven hours on a charge, which is a pretty long time.
130 00:06:44,000 --> 00:06:46,760 Speaker 3: You don't need to have earphones plugged in, so you 131 00:06:46,839 --> 00:06:50,920 Speaker 3: can use them as your virtual assistant, so you could, 132 00:06:51,040 --> 00:06:53,559 Speaker 3: you know, call somebody, use them as a hands free 133 00:06:53,600 --> 00:06:57,600 Speaker 3: device. But what I really like about the Huawei 134 00:06:57,760 --> 00:07:04,000 Speaker 3: ones is they've come up with a posture kind of assistant. 135 00:07:04,440 --> 00:07:08,760 Speaker 3: And what it does is it will actually look at your 136 00:07:08,800 --> 00:07:13,920 Speaker 3: posture, work out whether you're slouching, and then remind you. 137 00:07:14,520 --> 00:07:17,040 Speaker 3: So it will monitor... so it actually is able to 138 00:07:17,760 --> 00:07:20,040 Speaker 3: check on your posture. And it's got a lot of 139 00:07:20,040 --> 00:07:23,120 Speaker 3: features built into it, so, you know, they're lightweight, 140 00:07:23,160 --> 00:07:27,080 Speaker 3: they're comfortable, and you can tap it to play music, 141 00:07:27,320 --> 00:07:28,440 Speaker 3: you can make calls. 142 00:07:28,960 --> 00:07:31,800 Speaker 1: And you missed out a chunk of info, like you 143 00:07:31,880 --> 00:07:35,480 Speaker 1: just said glasses. So it's not just glasses? Yeah, there's glasses, 144 00:07:35,520 --> 00:07:36,760 Speaker 1: but is it a phone? 145 00:07:36,920 --> 00:07:37,760 Speaker 4: What is it? 146 00:07:37,760 --> 00:07:40,200 Speaker 3: It pairs to your phone. So if you imagine the 147 00:07:40,240 --> 00:07:42,720 Speaker 3: glasses you've got on now, you just touch the arm 148 00:07:42,760 --> 00:07:45,560 Speaker 3: of the glasses to activate it, and you could, say, 149 00:07:45,600 --> 00:07:48,200 Speaker 3: be walking and it would be giving you directions: turn 150 00:07:48,320 --> 00:07:49,320 Speaker 3: left at the next street. 151 00:07:49,400 --> 00:07:50,200 Speaker 2: That sort of stuff.
152 00:07:50,440 --> 00:07:53,360 Speaker 3: So anything that your phone can do, you can 153 00:07:53,400 --> 00:07:56,160 Speaker 3: connect your glasses to as your smart assistant. 154 00:07:56,200 --> 00:07:58,600 Speaker 3: So take phone calls, make phone calls, all that sort 155 00:07:58,600 --> 00:08:01,120 Speaker 3: of stuff, and just listen to music. One of 156 00:08:01,120 --> 00:08:04,200 Speaker 3: the things I like about this is, you know, quite 157 00:08:04,200 --> 00:08:07,880 Speaker 3: often you see people with earbuds and you wonder how 158 00:08:08,000 --> 00:08:11,200 Speaker 3: dangerous that is, because they can't hear traffic and it's 159 00:08:11,240 --> 00:08:13,800 Speaker 3: easy to kind of be lost in your own world. 160 00:08:14,120 --> 00:08:16,280 Speaker 3: The good thing is that, because they're not plugged 161 00:08:16,320 --> 00:08:19,720 Speaker 3: into your ear and they conduct through the top of 162 00:08:19,760 --> 00:08:22,360 Speaker 3: your ear, it means that if the doorbell goes, or 163 00:08:22,360 --> 00:08:24,480 Speaker 3: if you're walking down the street, you can still hear 164 00:08:24,480 --> 00:08:25,200 Speaker 3: things around you. 165 00:08:25,280 --> 00:08:27,239 Speaker 2: So I kind of like that feature as well. 166 00:08:28,560 --> 00:08:32,200 Speaker 1: Isn't Huawei the brand that got booted out of Australia 167 00:08:32,280 --> 00:08:33,720 Speaker 1: and the US? Their phones? 168 00:08:34,040 --> 00:08:36,120 Speaker 3: Yeah, well, not the phones. You can 169 00:08:36,160 --> 00:08:39,240 Speaker 3: still buy Huawei phones. It was 170 00:08:39,640 --> 00:08:42,720 Speaker 3: more the high end stuff they were using for telecommunications. 171 00:08:43,320 --> 00:08:45,480 Speaker 3: So they had a big contract in Australia to provide 172 00:08:45,520 --> 00:08:49,920 Speaker 3: telecommunication technology, both in Australia and in China.
The big 173 00:08:49,960 --> 00:08:52,400 Speaker 3: concern is not so much... well, I mean, there was 174 00:08:52,480 --> 00:08:56,400 Speaker 3: some controversy around Huawei, but it was more that you 175 00:08:56,480 --> 00:09:01,679 Speaker 3: can't guarantee in China that a large company isn't getting 176 00:09:02,080 --> 00:09:04,880 Speaker 3: nudged and looked at by Beijing. That's the problem with 177 00:09:04,880 --> 00:09:08,719 Speaker 3: the communist government, and that's the big concern. So companies 178 00:09:08,800 --> 00:09:12,920 Speaker 3: like DJI, the drone manufacturer, they've pulled all drones 179 00:09:13,160 --> 00:09:16,920 Speaker 3: out of the US high end drone market, where they were 180 00:09:17,000 --> 00:09:20,080 Speaker 3: using them in, say, law enforcement or fires and things 181 00:09:20,120 --> 00:09:22,680 Speaker 3: like that, and they're doing the same in Australia as well. 182 00:09:22,760 --> 00:09:25,920 Speaker 3: So they're saying they don't want the potential for China 183 00:09:26,000 --> 00:09:28,200 Speaker 3: to be monitoring what these drones are doing. 184 00:09:28,240 --> 00:09:29,560 Speaker 2: Effectively, it's surveillance. 185 00:09:30,120 --> 00:09:33,840 Speaker 3: There's no suggestion... and DJI, I kind of feel for 186 00:09:34,200 --> 00:09:36,439 Speaker 3: those companies, because if they're doing the right thing and 187 00:09:36,480 --> 00:09:39,520 Speaker 3: they're saying, well, we're not actively monitoring you, but there's 188 00:09:39,559 --> 00:09:40,160 Speaker 3: no way to 189 00:09:40,120 --> 00:09:40,840 Speaker 2: really believe it.
190 00:09:40,920 --> 00:09:44,200 Speaker 3: And the suggestion is that, well, if Beijing walks up 191 00:09:44,240 --> 00:09:47,120 Speaker 3: to the front door, and I mean that in general terms, 192 00:09:47,120 --> 00:09:50,480 Speaker 3: someone representing the government knocks on the door and says, 193 00:09:50,640 --> 00:09:53,960 Speaker 3: we want all your records, well, you know, do they 194 00:09:54,080 --> 00:09:56,160 Speaker 3: have any ability to be able to say no? 195 00:09:56,400 --> 00:09:59,599 Speaker 2: And that's the question, right? Yeah. 196 00:09:59,559 --> 00:10:03,040 Speaker 1: I just googled it because I didn't believe you, but 197 00:10:03,160 --> 00:10:08,520 Speaker 1: you are right. Australia has banned Huawei Technologies Co. from 198 00:10:08,760 --> 00:10:12,839 Speaker 1: supplying equipment for Australia's five G mobile... why would I 199 00:10:12,920 --> 00:10:15,920 Speaker 1: ever fucking doubt you? ...from Australia's five 200 00:10:16,000 --> 00:10:19,320 Speaker 1: G mobile network, citing national security risks. 201 00:10:19,320 --> 00:10:21,760 Speaker 4: But they have not banned the phones. 202 00:10:21,520 --> 00:10:24,360 Speaker 2: Correct, yep. Okay, okay, yeah. 203 00:10:24,120 --> 00:10:30,640 Speaker 4: Wow. Well, there you go. But apparently their phones are great. 204 00:10:31,320 --> 00:10:34,199 Speaker 4: Their high end phones are meant to be great. 205 00:10:34,480 --> 00:10:35,040 Speaker 2: They're really good. 206 00:10:35,120 --> 00:10:35,280 Speaker 1: Yeah. 207 00:10:35,280 --> 00:10:37,520 Speaker 3: I've got a friend... we were just comparing photos. 208 00:10:37,760 --> 00:10:39,160 Speaker 3: I was going to chat about this on the show 209 00:10:39,200 --> 00:10:41,679 Speaker 3: at some point. I've got some gorgeous photos and I 210 00:10:41,679 --> 00:10:44,600 Speaker 3: think I sent you a couple of the aurora, the
211 00:10:46,160 --> 00:10:48,160 Speaker 2: Southern Aurora that we had last weekend. 212 00:10:48,400 --> 00:10:49,880 Speaker 3: I was just lucky to be at the right place 213 00:10:49,920 --> 00:10:52,600 Speaker 3: at the right time and got some gorgeous photographs, and 214 00:10:53,080 --> 00:10:56,160 Speaker 3: just that beautiful red and green glow. 215 00:10:56,520 --> 00:10:58,440 Speaker 2: Fritzie and I went out there. He jumped in the 216 00:10:58,480 --> 00:10:58,960 Speaker 2: car with me. 217 00:10:59,080 --> 00:11:01,640 Speaker 3: We drove out about, I don't know, five minutes away, 218 00:11:01,679 --> 00:11:04,920 Speaker 3: we were in total darkness, and it was gorgeous. 219 00:11:04,960 --> 00:11:06,760 Speaker 2: It was great. Got some really nice photos. 220 00:11:06,840 --> 00:11:09,160 Speaker 3: But a lot of people may have been 221 00:11:09,240 --> 00:11:13,839 Speaker 3: out there and wondered why they weren't seeing in real 222 00:11:13,920 --> 00:11:18,160 Speaker 3: life what our phones were actually showing when we took photos. 223 00:11:18,720 --> 00:11:21,280 Speaker 3: So the photos I sent you, and for all of 224 00:11:21,320 --> 00:11:23,360 Speaker 3: us who have been on social media and had a look 225 00:11:23,400 --> 00:11:24,360 Speaker 3: at all those photos, 226 00:11:25,080 --> 00:11:26,480 Speaker 2: you may say, well, 227 00:11:26,360 --> 00:11:28,679 Speaker 3: they didn't look anything like that. But when I took 228 00:11:28,720 --> 00:11:31,720 Speaker 3: the photo, it looked amazing. And that was the case 229 00:11:31,760 --> 00:11:34,079 Speaker 3: for me as well, and it was interesting. I did 230 00:11:34,080 --> 00:11:36,120 Speaker 3: a bit of research and there are a number of 231 00:11:36,200 --> 00:11:38,959 Speaker 3: reasons why, and one of them, of course, is that 232 00:11:39,240 --> 00:11:42,160 Speaker 3: they take an extended exposure.
In my case, my phone 233 00:11:42,360 --> 00:11:46,960 Speaker 3: took a six second exposure to get the image. But 234 00:11:47,000 --> 00:11:50,680 Speaker 3: the other thing is that the cones and rods inside our eye, 235 00:11:50,720 --> 00:11:54,320 Speaker 3: the light receptors in our eye: when it gets really dark, 236 00:11:54,640 --> 00:11:58,360 Speaker 3: they effectively switch over from one sensor to another sensor, 237 00:11:58,679 --> 00:12:01,079 Speaker 3: and one of them is all about getting as much 238 00:12:01,480 --> 00:12:04,640 Speaker 3: information and pulling in as much light, so it 239 00:12:04,720 --> 00:12:07,200 Speaker 3: kind of drops the use of color and goes to 240 00:12:07,280 --> 00:12:10,800 Speaker 3: almost black and white. So if you're stumbling around at 241 00:12:10,920 --> 00:12:13,600 Speaker 3: night and you've only got a faint glow in your room, 242 00:12:14,000 --> 00:12:17,360 Speaker 3: more than likely you will see in much 243 00:12:17,400 --> 00:12:21,440 Speaker 3: more muted tones. So if you're standing out looking 244 00:12:21,480 --> 00:12:25,040 Speaker 3: at what we saw as the aurora, effectively what you're 245 00:12:25,080 --> 00:12:28,680 Speaker 3: looking at is something that's much more muted, because you're 246 00:12:28,679 --> 00:12:31,040 Speaker 3: just not getting enough light to the receptors in the 247 00:12:31,040 --> 00:12:33,079 Speaker 3: back of your eye as well. So that's the reason 248 00:12:33,120 --> 00:12:37,520 Speaker 3: why most people's phones showed some really awesome pictures, but 249 00:12:38,240 --> 00:12:41,439 Speaker 3: you weren't seeing the aurora look anywhere near the 250 00:12:41,559 --> 00:12:43,440 Speaker 3: quality that your phone was taking pictures of. 251 00:12:44,000 --> 00:12:46,320 Speaker 4: I'm looking at that picture that you sent me right now.
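[Editor's note] The exposure side of Patrick's explanation can be put in rough numbers. Photon capture scales roughly linearly with exposure time, so a six-second exposure collects on the order of sixty times more light than the eye does in a single "look". The ~0.1 s figure for how long the eye integrates light is an assumed, order-of-magnitude number for illustration, not something from the show.

```python
# Rough illustration of why a six-second phone exposure shows aurora colour
# the naked eye misses: the camera simply collects far more light per image.
# EYE_INTEGRATION_MS is an assumed, order-of-magnitude figure.

EYE_INTEGRATION_MS = 100   # assumed human visual integration time (~0.1 s)
PHONE_EXPOSURE_MS = 6000   # the six-second exposure mentioned in the show


def light_gathering_ratio(exposure_ms, eye_ms=EYE_INTEGRATION_MS):
    """Photons collected scale roughly linearly with exposure time."""
    return exposure_ms / eye_ms


print(light_gathering_ratio(PHONE_EXPOSURE_MS))  # 60.0 -> ~60x more light
```

That factor, on top of the eye's switch to colour-blind low-light receptors, is why the phone shows vivid reds and greens where you saw a faint grey glow.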
252 00:12:47,320 --> 00:12:49,679 Speaker 1: I'm wondering... I don't know if I'll 253 00:12:49,720 --> 00:12:51,520 Speaker 1: be able to use that in the promo for the show. 254 00:12:51,559 --> 00:12:55,680 Speaker 1: But it does look... it literally looks like 255 00:12:55,880 --> 00:13:00,280 Speaker 1: something made up. It looks like something sci fi, because 256 00:13:00,280 --> 00:13:04,640 Speaker 1: there's just this blend that starts at the top 257 00:13:04,720 --> 00:13:09,520 Speaker 1: being almost black, through to this burgundy indigo color, goes 258 00:13:09,559 --> 00:13:14,240 Speaker 1: into this light green, dark green, and then down 259 00:13:14,320 --> 00:13:17,600 Speaker 1: through the middle of that, it's almost like there's shards 260 00:13:17,640 --> 00:13:20,600 Speaker 1: of light, and it looks like a computer 261 00:13:20,720 --> 00:13:23,800 Speaker 1: generated thing. Doesn't look like a photo at all. I mean, 262 00:13:24,040 --> 00:13:26,600 Speaker 1: I'm feeling dumb at the moment. Can you explain that 263 00:13:26,679 --> 00:13:31,360 Speaker 1: to me like I was a three year old? I'm sorry, 264 00:13:31,520 --> 00:13:34,800 Speaker 1: could you stop playing fake me? People are not going 265 00:13:34,880 --> 00:13:37,120 Speaker 1: to go, what's real? No, what's real and what's fake me? 266 00:13:37,960 --> 00:13:41,040 Speaker 2: Yeah, although what you've said, it seems kind of 267 00:13:41,080 --> 00:13:42,120 Speaker 2: on par, don't you reckon? 268 00:13:42,920 --> 00:13:44,560 Speaker 4: Well, I know what it is. 269 00:13:45,000 --> 00:13:49,240 Speaker 1: Yeah, I am dumb though, but thanks, thanks for making me. 270 00:13:49,360 --> 00:13:51,400 Speaker 1: Of course you were right. I'm always wrong and you 271 00:13:51,440 --> 00:13:53,400 Speaker 1: are one hundred percent right all the time. 272 00:13:54,480 --> 00:13:57,480 Speaker 4: Huh. Oh god. How many of those do you have?
273 00:13:57,920 --> 00:13:58,040 Speaker 1: Ah? 274 00:13:58,080 --> 00:13:58,920 Speaker 2: There's a couple more. 275 00:14:01,080 --> 00:14:02,319 Speaker 4: All right. Now, 276 00:14:02,720 --> 00:14:05,920 Speaker 1: before we go on with your little... kind of little... 277 00:14:06,040 --> 00:14:08,400 Speaker 1: that, isn't it? Don't you hate it when people say 278 00:14:08,440 --> 00:14:12,120 Speaker 1: your little thing? Last week you and I 279 00:14:12,040 --> 00:14:15,840 Speaker 3: had... No one's ever said that to me, Craig. Yeah, you're 280 00:14:16,160 --> 00:14:19,560 Speaker 3: right... John Holmes. 281 00:14:20,360 --> 00:14:24,360 Speaker 1: Everyone's going to be googling that now. A couple of 282 00:14:24,360 --> 00:14:26,720 Speaker 1: weeks ago, last time we were together, we had a bet. 283 00:14:26,760 --> 00:14:28,680 Speaker 1: I can't even remember what the bet was about, but 284 00:14:29,400 --> 00:14:31,800 Speaker 1: it was something to do with something that Tiff would know. 285 00:14:32,280 --> 00:14:34,640 Speaker 3: Yes, it was a picture of Richard Nixon. I sent 286 00:14:34,680 --> 00:14:37,400 Speaker 3: you a picture of Richard Nixon. You forwarded that picture 287 00:14:37,480 --> 00:14:40,480 Speaker 3: to Tiff and I said, of course Tiff will know 288 00:14:40,520 --> 00:14:42,600 Speaker 3: who that is, and you said, not a chance. It 289 00:14:42,680 --> 00:14:45,280 Speaker 3: started at a ten dollar bet, and then you doubled it 290 00:14:45,320 --> 00:14:48,960 Speaker 3: to twenty dollars, and I crashed and burned. 291 00:14:49,680 --> 00:14:51,760 Speaker 1: Yeah, you did crash and burn. And then this week 292 00:14:51,920 --> 00:14:57,080 Speaker 1: I went outside and just under my gate was an 293 00:14:57,120 --> 00:15:00,760 Speaker 1: Australia Post parcel, and it was, like, weird.
It was 294 00:15:01,040 --> 00:15:05,600 Speaker 1: shaped like a thick, fat, long sausage roll, and I 295 00:15:05,680 --> 00:15:08,840 Speaker 1: picked it up and it felt like a very thick, 296 00:15:08,960 --> 00:15:11,520 Speaker 1: fat sausage roll, and it kind 297 00:15:11,320 --> 00:15:13,200 Speaker 2: of felt like a John Holmes. 298 00:15:14,080 --> 00:15:14,760 Speaker 1: Yeah, it was. 299 00:15:14,840 --> 00:15:19,280 Speaker 4: Limp and heavy, speaking of John Holmes. 300 00:15:19,640 --> 00:15:25,280 Speaker 1: And I opened it, and it said, from Bland, from 301 00:15:25,280 --> 00:15:30,480 Speaker 1: the Tooth Fairy. And I'd forgotten all about our bet. 302 00:15:30,600 --> 00:15:35,360 Speaker 1: In fact, I didn't know until today, when Melissa enlightened 303 00:15:35,440 --> 00:15:38,800 Speaker 1: me, because she'd listened to the show. And I said, 304 00:15:38,880 --> 00:15:43,360 Speaker 1: Patrick sent me, like, a fucking million five cent pieces. 305 00:15:42,760 --> 00:15:46,040 Speaker 4: And I got, I got this... well, let's see, 306 00:15:46,080 --> 00:15:48,840 Speaker 4: you sent me apparently twenty dollars worth, and there's twenty 307 00:15:48,880 --> 00:15:51,200 Speaker 4: in a dollar, so twenty times twenty, so four hundred 308 00:15:51,200 --> 00:15:52,120 Speaker 4: five cent pieces. 309 00:15:52,200 --> 00:15:54,120 Speaker 2: I guess. Was it actually all there? 310 00:15:54,640 --> 00:15:55,920 Speaker 4: Was it actually twenty bucks? 311 00:15:56,160 --> 00:15:58,480 Speaker 2: Because it was twenty bucks, and it cost me... 312 00:15:58,520 --> 00:16:01,200 Speaker 3: What you'll really enjoy is it cost me eleven bucks 313 00:16:01,200 --> 00:16:01,920 Speaker 3: to send it to you. 314 00:16:02,120 --> 00:16:04,520 Speaker 1: That's what I said to her, it must have cost 315 00:16:04,600 --> 00:16:06,520 Speaker 1: nearly that much to send it, because 316 00:16:06,240 --> 00:16:07,080 Speaker 4: it weighed a lot.
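[Editor's note] The payout maths above checks out, and it explains the weight too. A quick sketch; the 2.83 g mass per Australian five-cent coin is my figure from memory of the Royal Australian Mint's specifications, so treat it as an assumption:

```python
# Sanity-checking the bet payout: how many five-cent coins make $20, and
# roughly what the parcel weighed. COIN_MASS_G is an assumed figure
# (Royal Australian Mint spec, from memory), not a number from the show.

COIN_VALUE_CENTS = 5
COIN_MASS_G = 2.83  # assumed grams per Australian five-cent coin


def coins_for(amount_cents):
    """Number of five-cent coins needed to pay this many cents."""
    return amount_cents // COIN_VALUE_CENTS


n = coins_for(2000)     # $20.00 expressed in cents
print(n)                # 400 coins
print(n * COIN_MASS_G)  # about 1130 g, i.e. over a kilo of coins
```

So "twenty times twenty" is right: twenty dollars, at twenty coins per dollar, is four hundred coins, and well over a kilogram in the post.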
317 00:16:07,480 --> 00:16:08,480 Speaker 2: It did, one point... 318 00:16:09,480 --> 00:16:13,000 Speaker 1: And also it's been brought to my attention by you 319 00:16:14,040 --> 00:16:16,320 Speaker 1: that you're actually breaking the law in doing that. 320 00:16:16,960 --> 00:16:19,400 Speaker 3: So I went to the bank, called the bank first, 321 00:16:19,400 --> 00:16:22,200 Speaker 3: our local little community branch, and said, have you got 322 00:16:22,280 --> 00:16:25,520 Speaker 3: twenty bucks in five cent pieces? And they kind of 323 00:16:25,600 --> 00:16:27,400 Speaker 3: ummed and ahhed for a little bit and said, yeah, sure, 324 00:16:27,440 --> 00:16:29,280 Speaker 3: come in and grab them. So I went in there 325 00:16:29,320 --> 00:16:31,160 Speaker 3: and they gave them to me in all these cute 326 00:16:31,200 --> 00:16:33,200 Speaker 3: little neat bags, and I've got my post pack and 327 00:16:33,240 --> 00:16:34,200 Speaker 3: I'm just opening up 328 00:16:34,160 --> 00:16:36,280 Speaker 2: the bags, tipping them in, open up the bag, tipping it in. 329 00:16:36,640 --> 00:16:38,400 Speaker 3: And then I went to the post office, and the 330 00:16:38,440 --> 00:16:41,360 Speaker 3: girls at the post office were pissing themselves laughing when 331 00:16:41,360 --> 00:16:44,640 Speaker 3: I told them what it was for. And then after 332 00:16:44,680 --> 00:16:46,560 Speaker 3: I'd sent it to you, of course, I started telling 333 00:16:46,600 --> 00:16:48,680 Speaker 3: lots of people what I'd done, because, you know, I'm 334 00:16:48,680 --> 00:16:49,720 Speaker 3: trying to get a bit of a pat on the 335 00:16:49,760 --> 00:16:53,320 Speaker 3: back for myself. And someone said to me, you know 336 00:16:53,400 --> 00:16:57,240 Speaker 3: that's illegal. I said, what? Because the post office accepted it. 337 00:16:57,280 --> 00:16:58,080 Speaker 3: And so I did 338 00:16:57,920 --> 00:16:59,840 Speaker 2: a Google search and yeah, it's illegal
339 00:16:59,480 --> 00:17:03,640 Speaker 3: to send any sort of cash via a mail delivery service. 340 00:17:04,119 --> 00:17:06,439 Speaker 4: Who would have thought. I wonder why? No idea. 341 00:17:06,600 --> 00:17:09,359 Speaker 1: I guess because they don't want that to be a 342 00:17:09,359 --> 00:17:12,439 Speaker 1: common practice, because if the bad guys think that 343 00:17:12,480 --> 00:17:15,240 Speaker 1: there's money floating around in the mail, then... yeah, 344 00:17:15,320 --> 00:17:17,360 Speaker 1: it kind of makes sense, but I don't see why 345 00:17:17,520 --> 00:17:20,680 Speaker 1: you should get penalized. If you're the one 346 00:17:20,720 --> 00:17:22,800 Speaker 1: pinching money, sure, but if you're the one posting money, 347 00:17:22,840 --> 00:17:26,240 Speaker 1: I don't think you should get penalized. 348 00:17:26,280 --> 00:17:28,480 Speaker 4: Where are we going next, Obi-Wan Kenobi? 349 00:17:29,080 --> 00:17:29,439 Speaker 2: Well, 350 00:17:29,600 --> 00:17:34,240 Speaker 3: look, I guess I was really interested in a couple 351 00:17:34,240 --> 00:17:39,520 Speaker 3: of health related stories talking about AI, because Google has 352 00:17:39,560 --> 00:17:43,879 Speaker 3: got its AI interface called Gemini, and what they're saying 353 00:17:43,920 --> 00:17:46,200 Speaker 3: is it's going to really help blind people, so people 354 00:17:46,200 --> 00:17:47,280 Speaker 3: who are vision impaired. 355 00:17:47,640 --> 00:17:49,879 Speaker 2: And there's two little things that I thought were really cool. 356 00:17:50,000 --> 00:17:54,160 Speaker 3: It's called TalkBack and it's using generative AI, and what 357 00:17:54,160 --> 00:18:00,639 Speaker 3: it means is that it will be able to run offline.
Okay, 358 00:18:00,720 --> 00:18:02,760 Speaker 3: So one of the biggest issues with AI at the 359 00:18:02,800 --> 00:18:06,440 Speaker 3: moment is that it's a giant service somewhere, and there's 360 00:18:06,480 --> 00:18:09,520 Speaker 3: a lot of controversy over how much, you know, that 361 00:18:09,720 --> 00:18:14,479 Speaker 3: data generation costs in terms of power and whether there's 362 00:18:14,520 --> 00:18:18,200 Speaker 3: a real impact. But this language model is actually designed 363 00:18:18,240 --> 00:18:22,480 Speaker 3: for people with low vision, and it creates descriptions of 364 00:18:22,560 --> 00:18:26,199 Speaker 3: objects for blind users, but it runs totally off the 365 00:18:26,200 --> 00:18:29,600 Speaker 3: cloud and on the device itself, so they would be 366 00:18:29,640 --> 00:18:34,240 Speaker 3: able to refer back to articles of clothing. They can say, well, 367 00:18:34,280 --> 00:18:37,880 Speaker 3: this is a close up of a T shirt and 368 00:18:37,960 --> 00:18:38,440 Speaker 3: so it 369 00:18:38,400 --> 00:18:42,680 Speaker 1: will— But people are blind. What do you mean they 370 00:18:42,680 --> 00:18:44,080 Speaker 1: can say this is a close up? 371 00:18:44,400 --> 00:18:47,000 Speaker 4: Well, because you would use— so it's describing. 372 00:18:47,160 --> 00:18:52,240 Speaker 3: Yeah, so it describes, it describes what it's seen, so right, yeah, yeah, 373 00:18:52,520 --> 00:18:57,239 Speaker 3: And it can get really, really quite specific. It's not 374 00:18:57,359 --> 00:19:00,399 Speaker 3: just this is a dress, but this is a floral dress 375 00:19:00,440 --> 00:19:03,119 Speaker 3: and it's green and red and it has fringes and 376 00:19:03,160 --> 00:19:05,960 Speaker 3: that sort of stuff. And they're doing the training of 377 00:19:06,000 --> 00:19:08,440 Speaker 3: the AI with quite a fair 378 00:19:08,200 --> 00:19:08,879 Speaker 2: bit of minutiae.
379 00:19:08,960 --> 00:19:11,480 Speaker 3: So if you're going shopping and you're vision impaired, this 380 00:19:11,560 --> 00:19:14,919 Speaker 3: could be really, really useful, even just matching clothing colors 381 00:19:14,960 --> 00:19:17,280 Speaker 3: and that sort of thing. And I guess the other 382 00:19:17,280 --> 00:19:20,080 Speaker 3: thing I love about it is it makes people a 383 00:19:20,160 --> 00:19:23,399 Speaker 3: lot more independent, so there are fewer inhibitions, because it 384 00:19:23,440 --> 00:19:26,199 Speaker 3: is harder for people to shop, and we take it 385 00:19:26,240 --> 00:19:29,280 Speaker 3: for granted. I mean, you work so hard to look 386 00:19:29,320 --> 00:19:36,080 Speaker 3: so uncoordinated, says he who's wearing a rainbow colored and 387 00:19:36,119 --> 00:19:36,640 Speaker 3: a black tee. 388 00:19:36,640 --> 00:19:42,000 Speaker 1: Sir! Hey, fuck you, Brad Pitt, just fuck... Now, talk 389 00:19:42,040 --> 00:19:46,320 Speaker 1: about the pot calling the kettle black. Mister cardigan-wearing, 390 00:19:46,560 --> 00:19:47,879 Speaker 1: fucking beret wearer. 391 00:19:48,080 --> 00:19:50,960 Speaker 2: I'm not wearing a beret or a cardigan today. 392 00:19:51,040 --> 00:19:53,840 Speaker 1: Yeah, that's just today. You're wearing an ABC logo on 393 00:19:53,920 --> 00:19:54,760 Speaker 1: a T shirt. 394 00:19:54,960 --> 00:19:57,639 Speaker 3: That's because I started doing— Actually no, it's a Pride 395 00:19:57,720 --> 00:20:00,679 Speaker 3: logo, because today's IDAHOBIT Day. But also today I 396 00:20:00,720 --> 00:20:04,760 Speaker 3: started doing a regular segment on ABC Regional Radio. 397 00:20:05,800 --> 00:20:09,560 Speaker 1: Can I tell you that I have a fan, I 398 00:20:09,600 --> 00:20:12,600 Speaker 1: won't say his name, but I have a fan on 399 00:20:12,640 --> 00:20:15,640 Speaker 1: my IG who's very, very keen to 400 00:20:15,600 --> 00:20:18,520 Speaker 4: catch up with me, sending me nice messages.
401 00:20:19,320 --> 00:20:20,000 Speaker 2: That's nice. 402 00:20:20,400 --> 00:20:24,719 Speaker 1: He's a bit of a sweetheart too, he's nice. 403 00:20:25,359 --> 00:20:26,400 Speaker 4: I'm pretty sure I know. 404 00:20:27,080 --> 00:20:30,639 Speaker 3: No, friendship. I'm thinking friendship, that it doesn't have to go— 405 00:20:30,680 --> 00:20:32,280 Speaker 3: I mean, he knows it doesn't. 406 00:20:33,000 --> 00:20:35,480 Speaker 4: Is requesting a bit more than friendship, but that's all right. 407 00:20:36,280 --> 00:20:37,400 Speaker 2: Personal training. 408 00:20:39,000 --> 00:20:43,320 Speaker 4: Something like that. That's— yeah, yeah. I might 409 00:20:43,680 --> 00:20:47,680 Speaker 4: pour it into you. Thanks for that. Friends though, and 410 00:20:47,800 --> 00:20:48,000 Speaker 4: go on. 411 00:20:48,560 --> 00:20:51,199 Speaker 3: No. So I was just saying how exciting this is, 412 00:20:51,400 --> 00:20:54,439 Speaker 3: using things like AI offline to be able to 413 00:20:54,440 --> 00:20:57,120 Speaker 3: assist people who are vision impaired. And the other thing 414 00:20:57,160 --> 00:21:00,080 Speaker 3: that relates to this, there's a Seattle startup at the 415 00:21:00,119 --> 00:21:05,280 Speaker 3: moment and they've got this really, really, really cool gadget 416 00:21:05,320 --> 00:21:08,920 Speaker 3: called OneCourt. So if you imagine you're sitting down 417 00:21:09,160 --> 00:21:12,720 Speaker 3: at a sporting game, or you're watching a sporting game — tennis, 418 00:21:13,320 --> 00:21:15,440 Speaker 3: you know, gridiron or whatever it happens to be — 419 00:21:15,880 --> 00:21:19,960 Speaker 3: and you lay your hands on the screen if you're 420 00:21:20,040 --> 00:21:23,879 Speaker 3: vision impaired, and it gives you haptic feedback of what's 421 00:21:23,880 --> 00:21:26,560 Speaker 3: actually happening on the court.
So if you're playing tennis, 422 00:21:26,640 --> 00:21:29,600 Speaker 3: you could feel it, and the vibrations match 423 00:21:30,080 --> 00:21:31,960 Speaker 3: what's being played out on the court. 424 00:21:32,240 --> 00:21:36,320 Speaker 4: Oh wow, wow, yeah, yeah, you're watching— good. What if 425 00:21:36,320 --> 00:21:38,359 Speaker 4: you're watching a porno? Does it do the same? 426 00:21:38,560 --> 00:21:40,280 Speaker 2: You had to go there, didn't you? No? 427 00:21:40,400 --> 00:21:43,400 Speaker 4: I didn't. I chose to. You know, by the way, 428 00:21:43,440 --> 00:21:46,439 Speaker 4: everyone, Patrick, that was he? That was fake me? That 429 00:21:46,520 --> 00:21:46,960 Speaker 4: wasn't real? 430 00:21:47,000 --> 00:21:52,360 Speaker 2: Made me say that. Yeah, that was Patrick. 431 00:21:52,960 --> 00:21:55,560 Speaker 4: So now I can get you in trouble. Just quickly 432 00:21:55,600 --> 00:21:56,359 Speaker 4: before we move on. 433 00:21:56,840 --> 00:21:59,480 Speaker 1: You know what I like about those glasses that 434 00:21:59,600 --> 00:22:04,440 Speaker 1: essentially describe what's happening in front of people? Think about 435 00:22:04,480 --> 00:22:08,080 Speaker 1: the potential applications for safety, you know, crossing the road, 436 00:22:08,359 --> 00:22:12,280 Speaker 1: holes in the footpath, you know, obstacles. Like, that's a 437 00:22:12,320 --> 00:22:13,920 Speaker 1: pretty cool application as well. 438 00:22:14,400 --> 00:22:17,320 Speaker 3: Yeah, that's a nice little segue because there's a really 439 00:22:17,400 --> 00:22:21,160 Speaker 3: great story about a guy in the United States who 440 00:22:21,080 --> 00:22:23,159 Speaker 1: I would just point out to everyone how much you 441 00:22:23,320 --> 00:22:24,280 Speaker 1: just fobbed me off. 442 00:22:24,320 --> 00:22:27,320 Speaker 4: Then did I? You went— 443 00:22:27,600 --> 00:22:32,600 Speaker 3: You went... she went... Yeah, it was my topic.
444 00:22:32,680 --> 00:22:34,280 Speaker 2: I was telling you how exciting it is. 445 00:22:34,600 --> 00:22:40,520 Speaker 1: You were talking about choosing fucking clothes. No, like, oh, 446 00:22:40,640 --> 00:22:43,159 Speaker 1: here's a close up of a T shirt? What about 447 00:22:43,160 --> 00:22:46,680 Speaker 1: saving someone's life, you dickhead? 448 00:22:46,320 --> 00:22:49,760 Speaker 2: But you gave me the perfect segue to the next segment. 449 00:22:49,840 --> 00:22:51,320 Speaker 2: It's a really— keep 450 00:22:51,119 --> 00:22:53,440 Speaker 4: going, keep going, segue boy. 451 00:22:53,800 --> 00:22:56,679 Speaker 3: So this, this guy in the United States got a 452 00:22:56,680 --> 00:22:59,679 Speaker 3: black T shirt with just a stop sign on the 453 00:22:59,720 --> 00:23:01,960 Speaker 3: front of it and then just stood by the side 454 00:23:02,000 --> 00:23:05,240 Speaker 3: of the road, and autonomous cars were stopping because they 455 00:23:05,240 --> 00:23:06,440 Speaker 3: thought he was a stop sign. 456 00:23:07,160 --> 00:23:08,600 Speaker 4: That's great, isn't it? 457 00:23:08,760 --> 00:23:10,200 Speaker 2: I nearly liked it. 458 00:23:10,440 --> 00:23:13,440 Speaker 4: That's— but also, I mean, I wonder if he could 459 00:23:13,480 --> 00:23:15,520 Speaker 4: get charged with anything. 460 00:23:16,000 --> 00:23:19,280 Speaker 3: Right? Yeah, I don't know. So this guy — his name's 461 00:23:19,359 --> 00:23:21,040 Speaker 3: Jason Carr — is from Arizona. 462 00:23:21,600 --> 00:23:24,080 Speaker 2: And look, I think he was doing it to prove 463 00:23:24,119 --> 00:23:24,600 Speaker 2: a point. 464 00:23:24,680 --> 00:23:29,120 Speaker 3: So Waymo is the company that has robotaxis and they're 465 00:23:29,200 --> 00:23:32,639 Speaker 3: running tests at the moment.
So initially he just stood 466 00:23:32,680 --> 00:23:35,360 Speaker 3: at the edge of the road, and yeah, the car 467 00:23:35,400 --> 00:23:38,800 Speaker 3: came to a complete stop, and then it eventually resumed 468 00:23:38,880 --> 00:23:40,600 Speaker 3: driving once he walked away. 469 00:23:40,440 --> 00:23:44,280 Speaker 2: Or hid the T shirt. But look, he said that 470 00:23:44,840 --> 00:23:46,160 Speaker 2: it was something that he was doing. 471 00:23:46,200 --> 00:23:48,040 Speaker 3: It was a bit of a publicity stunt, I guess, 472 00:23:48,119 --> 00:23:50,440 Speaker 3: or, you know, to kind of be able to 473 00:23:50,480 --> 00:23:54,440 Speaker 3: emphasize a point that these self driving vehicles still have 474 00:23:54,520 --> 00:23:57,399 Speaker 3: a long way to go to differentiate between a sign 475 00:23:57,440 --> 00:23:59,919 Speaker 3: on a stick by the side of the road at an intersection 476 00:24:00,520 --> 00:24:02,920 Speaker 3: and just someone standing in a T shirt. But yeah, 477 00:24:03,080 --> 00:24:05,199 Speaker 3: I mean, my first reaction was to have a bit 478 00:24:05,200 --> 00:24:08,679 Speaker 3: of a chuckle as well. But it does show, you know, 479 00:24:08,720 --> 00:24:12,639 Speaker 3: the risks of an autonomous system like that in 480 00:24:12,680 --> 00:24:16,320 Speaker 3: a vehicle — and a vehicle is a big, heavy machine, you know, particularly 481 00:24:16,800 --> 00:24:20,240 Speaker 3: if it's an electric powered vehicle. They're even heavier, aren't they?
482 00:24:20,560 --> 00:24:22,520 Speaker 3: You can fool them. And there was all 483 00:24:22,520 --> 00:24:25,720 Speaker 3: the controversy a few years ago, well, not that long ago, actually, 484 00:24:25,840 --> 00:24:29,240 Speaker 3: over whether or not the fully autonomous driving of the 485 00:24:29,240 --> 00:24:32,520 Speaker 3: Teslas was able to pick up and differentiate 486 00:24:32,600 --> 00:24:35,360 Speaker 3: a child walking across the road, and so there 487 00:24:35,400 --> 00:24:37,679 Speaker 3: was again a little bit of controversy around that. At 488 00:24:37,680 --> 00:24:39,320 Speaker 3: the end of the day, we want to make sure 489 00:24:39,320 --> 00:24:42,880 Speaker 3: we've got driver safety. But the current driver safety that's 490 00:24:42,880 --> 00:24:45,520 Speaker 3: built into a lot of vehicles is phenomenal. I know 491 00:24:45,560 --> 00:24:47,240 Speaker 3: I'm always crying on about this, but it is good. 492 00:24:47,320 --> 00:24:48,040 Speaker 2: It's getting better. 493 00:24:48,280 --> 00:24:50,399 Speaker 3: It means that people are less likely to fall asleep 494 00:24:50,440 --> 00:24:52,399 Speaker 3: at the wheel, because their car will notify them if 495 00:24:52,400 --> 00:24:54,960 Speaker 3: they nod off or if they drift out of their lane, 496 00:24:55,000 --> 00:24:56,000 Speaker 3: and that's got to be good. 497 00:24:56,840 --> 00:24:57,639 Speaker 4: Yeah. 498 00:24:57,680 --> 00:25:02,680 Speaker 1: One of the challenges, I think, though, is that some 499 00:25:02,720 --> 00:25:04,400 Speaker 1: of the cars are so high tech 500 00:25:04,480 --> 00:25:06,120 Speaker 4: now. I love cars, I love motorbikes. 501 00:25:06,240 --> 00:25:08,920 Speaker 1: I watch reviews all the time, as you know, which 502 00:25:08,920 --> 00:25:12,240 Speaker 1: is ironic because I drive a fucking old Suzuki Swift.
503 00:25:12,320 --> 00:25:14,960 Speaker 1: But you know, like I haven't had a nice car 504 00:25:15,080 --> 00:25:18,200 Speaker 1: for years because I kind of went through a phase. 505 00:25:18,240 --> 00:25:20,840 Speaker 1: But I still love watching reviews, and I've been watching 506 00:25:21,160 --> 00:25:26,120 Speaker 1: lots of reviews lately on new fancy cars, electric cars, 507 00:25:26,600 --> 00:25:30,639 Speaker 1: and I watched this guy testing a car called the 508 00:25:30,680 --> 00:25:33,320 Speaker 1: Tank five hundred, which is essentially the size of a 509 00:25:33,400 --> 00:25:34,120 Speaker 1: LandCruiser. 510 00:25:34,880 --> 00:25:35,800 Speaker 4: It's Chinese. 511 00:25:36,480 --> 00:25:38,520 Speaker 1: It looks like a two hundred thousand dollar car at 512 00:25:38,560 --> 00:25:41,600 Speaker 1: seventy grand, give or take, except for one of the problems. 513 00:25:41,640 --> 00:25:43,520 Speaker 1: And he jumps in and he's talking, and he takes 514 00:25:43,560 --> 00:25:46,479 Speaker 1: off and he's been driving for one hundred meters and 515 00:25:46,520 --> 00:25:50,399 Speaker 1: the car tells him, you're tired, pull over and have 516 00:25:50,520 --> 00:25:50,920 Speaker 1: a rest. 517 00:25:51,840 --> 00:25:52,040 Speaker 4: Right. 518 00:25:52,160 --> 00:25:54,640 Speaker 1: It keeps telling him, because it's got all of these 519 00:25:54,680 --> 00:26:00,239 Speaker 1: sensors that can allegedly sense fatigue or tiredness or 520 00:26:00,280 --> 00:26:03,720 Speaker 1: however it does it. And so every two hundred meters it 521 00:26:03,800 --> 00:26:05,919 Speaker 1: was telling him to pull over because he was tired, 522 00:26:06,320 --> 00:26:09,119 Speaker 1: and he just literally jumped in and started driving. So 523 00:26:10,080 --> 00:26:13,560 Speaker 1: they're sorting out the kinks.
But you know, I wonder, 524 00:26:13,840 --> 00:26:15,720 Speaker 1: like, I kind of love the fact that I have 525 00:26:15,800 --> 00:26:18,560 Speaker 1: a five year old car that's kind of basic. 526 00:26:19,560 --> 00:26:19,760 Speaker 4: You know. 527 00:26:19,840 --> 00:26:22,880 Speaker 1: It's like, I don't need to be a bloody 528 00:26:23,119 --> 00:26:25,040 Speaker 1: tech genius to be able to get in it and 529 00:26:25,080 --> 00:26:25,680 Speaker 1: operate it. 530 00:26:27,359 --> 00:26:27,600 Speaker 3: Yeah. 531 00:26:27,760 --> 00:26:28,359 Speaker 2: Look, I 532 00:26:30,160 --> 00:26:32,840 Speaker 3: love the more safety you have in a car, but 533 00:26:32,880 --> 00:26:35,520 Speaker 3: I still love to drive a manual every now and again. 534 00:26:35,600 --> 00:26:39,040 Speaker 3: I love, you know, stick shift. It's nice to go 535 00:26:39,119 --> 00:26:40,560 Speaker 3: through the gears and feel the 536 00:26:40,560 --> 00:26:42,080 Speaker 2: car and know when to change. 537 00:26:42,160 --> 00:26:46,760 Speaker 3: And as you know — you've driven sticks many times — on 538 00:26:46,800 --> 00:26:51,160 Speaker 3: the power of the vehicle, you remember the term powerband. Yeah, 539 00:26:51,200 --> 00:26:54,280 Speaker 3: and I love that. I love that people of our 540 00:26:54,359 --> 00:26:57,240 Speaker 3: generation understand what that means. And the other thing is, 541 00:26:57,240 --> 00:26:59,159 Speaker 3: I've taught a lot of people to drive in my 542 00:26:59,320 --> 00:27:02,920 Speaker 3: Nissan NX-R coupe.
And the good thing about 543 00:27:02,960 --> 00:27:05,040 Speaker 3: teaching a young person how to drive in a manual 544 00:27:05,160 --> 00:27:06,919 Speaker 3: is they have a lot more respect for how an 545 00:27:07,000 --> 00:27:10,960 Speaker 3: engine works, because you know that when you go from 546 00:27:11,000 --> 00:27:13,200 Speaker 3: first gear to second gear to third gear, you can 547 00:27:13,280 --> 00:27:17,920 Speaker 3: feel and hear how your engine is being taxed and 548 00:27:17,960 --> 00:27:21,760 Speaker 3: how much effort the engine and the car are going through. 549 00:27:22,240 --> 00:27:24,800 Speaker 3: I like that. And I'm not a petrol head by 550 00:27:24,960 --> 00:27:27,200 Speaker 3: a long shot, but I still love the feeling. And 551 00:27:27,520 --> 00:27:29,919 Speaker 3: a friend of mine, she had to look at so 552 00:27:29,960 --> 00:27:32,679 Speaker 3: many different models before she was even able to purchase 553 00:27:32,720 --> 00:27:35,159 Speaker 3: a manual car. I think the WRX is one of 554 00:27:35,200 --> 00:27:37,600 Speaker 3: the few cars on the road that's still available as a manual. 555 00:27:38,160 --> 00:27:39,639 Speaker 2: It's very, very hard to buy a manual. 556 00:27:39,720 --> 00:27:42,880 Speaker 1: True. Yeah, mine's manual. Mine's five speed. It's a little 557 00:27:42,920 --> 00:27:45,480 Speaker 1: one point four turbo, little Suzuki Sport. 558 00:27:45,520 --> 00:27:46,600 Speaker 4: It's great, it's great. 559 00:27:47,400 --> 00:27:51,040 Speaker 1: Here's something that caught my eye. I don't know why: 560 00:27:51,840 --> 00:27:55,879 Speaker 1: researchers build AI-driven sarcasm detector. 561 00:27:56,119 --> 00:27:56,520 Speaker 2: Oh I. 562 00:27:59,240 --> 00:28:00,840 Speaker 4: Was that built for me? 563 00:28:02,160 --> 00:28:06,400 Speaker 3: It feels like it, doesn't it? It really, really, really 564 00:28:06,480 --> 00:28:10,879 Speaker 3: does that.
There is a seriousness behind this, because being 565 00:28:11,000 --> 00:28:13,640 Speaker 3: able to detect sarcasm— 566 00:28:13,960 --> 00:28:14,720 Speaker 2: I know we call it 567 00:28:14,640 --> 00:28:17,280 Speaker 3: the lowest form of wit, but I believe also it 568 00:28:17,400 --> 00:28:25,160 Speaker 3: shows intelligence. Potentially, maybe. But evidently knowing sarcasm means your 569 00:28:25,200 --> 00:28:30,359 Speaker 3: interaction with people is a lot more natural. So what 570 00:28:30,440 --> 00:28:35,080 Speaker 3: it means— So this is some researchers who have basically 571 00:28:35,600 --> 00:28:40,160 Speaker 3: looked at sarcasm as a way to make the AI 572 00:28:40,320 --> 00:28:47,160 Speaker 3: model understand more about how language works. And sarcasm can 573 00:28:47,200 --> 00:28:50,320 Speaker 3: be hard to detect even if you're just an average person. 574 00:28:50,640 --> 00:28:53,640 Speaker 3: But if you're trying to train an AI model to 575 00:28:53,760 --> 00:28:58,680 Speaker 3: be more clever, to understand how people work, the sarcasm 576 00:28:58,720 --> 00:29:03,120 Speaker 3: detector certainly has a lot of benefits. 577 00:29:03,160 --> 00:29:06,400 Speaker 2: So I kind of like it. It's an interesting 578 00:29:06,160 --> 00:29:08,440 Speaker 3: idea, and then when you start to dig down a 579 00:29:08,480 --> 00:29:11,320 Speaker 3: little bit deeper, it makes a lot of sense. Because 580 00:29:11,640 --> 00:29:14,719 Speaker 3: we have a lot of nuances in communication and in 581 00:29:14,760 --> 00:29:17,840 Speaker 3: our conversation. I mean, look at my shows when we 582 00:29:17,880 --> 00:29:20,200 Speaker 3: get— let's face it. 583 00:29:20,680 --> 00:29:22,640 Speaker 4: I mean, that's why we have people who love it 584 00:29:22,680 --> 00:29:24,920 Speaker 4: and people who hate it. Speaking of sarcasm.
585 00:29:25,160 --> 00:29:29,800 Speaker 1: This afternoon, I had a meeting with my senior academic supervisor, Chris, 586 00:29:29,960 --> 00:29:35,680 Speaker 1: and my other supervisor, Lucy, and another guy called Campbell 587 00:29:35,680 --> 00:29:39,000 Speaker 1: who helps with analytics and data, and shout out to Campbell, 588 00:29:39,040 --> 00:29:42,800 Speaker 1: and anyway, we're all on this call and we're 589 00:29:42,800 --> 00:29:45,480 Speaker 1: trying to figure out — we're talking about this, you know, 590 00:29:45,560 --> 00:29:48,440 Speaker 1: looking at — I won't bore everyone, but we're looking at 591 00:29:48,480 --> 00:29:54,400 Speaker 1: different variables that we could correlate with what, too— sorry, anyway, 592 00:29:54,760 --> 00:29:57,240 Speaker 1: we're thinking, we're talking about different things that we could 593 00:29:57,320 --> 00:30:01,920 Speaker 1: correlate with, or explore there being a relationship between meta-accuracy 594 00:30:02,160 --> 00:30:05,640 Speaker 1: and— so meta-accuracy and age, meta-accuracy and income, 595 00:30:05,680 --> 00:30:09,560 Speaker 1: meta-accuracy and all these different things. And I said, like, 596 00:30:09,760 --> 00:30:12,280 Speaker 1: you know, sarcastically, as a bit of a joke, I go, 597 00:30:12,360 --> 00:30:16,760 Speaker 1: what about meta-accuracy and star signs? And like, 598 00:30:16,840 --> 00:30:19,440 Speaker 1: maybe Virgos are really fucking good at it, right, 599 00:30:20,600 --> 00:30:23,480 Speaker 1: or Capricorns, maybe they can just, you know, they're great 600 00:30:23,520 --> 00:30:26,000 Speaker 1: at this.
And all three of them just looked at 601 00:30:26,000 --> 00:30:29,480 Speaker 1: me like I was a complete fuckwit, and Chris, 602 00:30:29,560 --> 00:30:32,720 Speaker 1: my senior supervisor, didn't even respond, and he just kept— 603 00:30:33,720 --> 00:30:37,800 Speaker 1: he just kept talking. Oh, and ah, there it is, 604 00:30:37,840 --> 00:30:41,280 Speaker 1: and that's probably why I shouldn't be in this environment. Anyway, 605 00:30:41,560 --> 00:30:43,480 Speaker 1: keep going, I'm really getting bored. 606 00:30:43,520 --> 00:30:46,280 Speaker 3: No— oh, I didn't pick up on that. I said 607 00:30:46,360 --> 00:30:50,000 Speaker 3: no instead of nah. Oh, there you go, keep going. 608 00:30:50,680 --> 00:30:52,440 Speaker 2: Yeah, that was it. That was just the sarcasm thing. 609 00:30:52,480 --> 00:30:53,160 Speaker 2: I thought it was great. 610 00:30:53,880 --> 00:30:57,440 Speaker 3: Do you know, another little AI story that I had to 611 00:30:57,520 --> 00:30:59,480 Speaker 3: kind of throw in here is how people are using 612 00:30:59,480 --> 00:31:01,360 Speaker 3: AI now for insurance fraud. 613 00:31:01,400 --> 00:31:03,560 Speaker 2: Did you see that? It came up a little while ago. 614 00:31:03,960 --> 00:31:04,840 Speaker 2: So what they're 615 00:31:04,640 --> 00:31:08,680 Speaker 3: doing— yeah, because when you put in an accident claim, 616 00:31:09,040 --> 00:31:11,480 Speaker 3: you show your vehicle and the damage. 617 00:31:11,520 --> 00:31:12,840 Speaker 2: And thankfully I 618 00:31:12,840 --> 00:31:14,680 Speaker 3: haven't had too many accidents, but I did hit a 619 00:31:14,760 --> 00:31:17,200 Speaker 3: kangaroo in my new car after I'd had it three 620 00:31:17,240 --> 00:31:20,680 Speaker 3: months, and wow, yeah, it was a really messy hit.
621 00:31:20,760 --> 00:31:23,240 Speaker 3: And look, it was already dead on the road, 622 00:31:23,320 --> 00:31:25,400 Speaker 3: because it would have made me even more distressed if 623 00:31:25,400 --> 00:31:27,960 Speaker 3: it had been alive and I was responsible for killing it. 624 00:31:28,040 --> 00:31:29,120 Speaker 2: But that's it. 625 00:31:29,440 --> 00:31:31,040 Speaker 3: Yeah, it took out a big chunk of the front of 626 00:31:31,080 --> 00:31:35,480 Speaker 3: my car. But what people are doing is they're actually 627 00:31:35,920 --> 00:31:40,560 Speaker 3: faking claims by putting a picture of their car into 628 00:31:40,600 --> 00:31:43,400 Speaker 3: AI and saying, put a dent in the front right 629 00:31:43,440 --> 00:31:44,800 Speaker 3: fender, and 630 00:31:44,720 --> 00:31:45,719 Speaker 2: then submitting it. 631 00:31:46,600 --> 00:31:49,720 Speaker 3: Really. So insurance companies are now — and this was a 632 00:31:49,760 --> 00:31:53,280 Speaker 3: study in the UK done by Allianz — and they're saying 633 00:31:54,160 --> 00:31:56,440 Speaker 3: that what they're having to do is now really look 634 00:31:56,560 --> 00:32:00,240 Speaker 3: closely at accidents, because they may be what they call 635 00:32:00,320 --> 00:32:03,400 Speaker 3: shallow fakes. So we've heard of deep fakes. This is 636 00:32:03,400 --> 00:32:08,040 Speaker 3: a shallow fake. So it's, yeah, it's not always convincing. 637 00:32:08,280 --> 00:32:12,479 Speaker 3: And look, some of them can be close. But 638 00:32:12,920 --> 00:32:14,960 Speaker 3: it wouldn't even have occurred to me to do something 639 00:32:15,040 --> 00:32:17,920 Speaker 3: like this, you know. It's funny, some people just think 640 00:32:17,960 --> 00:32:21,120 Speaker 3: that way. So yeah, that was an interesting one. 641 00:32:21,440 --> 00:32:23,320 Speaker 1: Well, I think that's going to happen.
Stuff like that's 642 00:32:23,360 --> 00:32:24,960 Speaker 1: going to happen more and more, isn't it? How do 643 00:32:25,000 --> 00:32:28,600 Speaker 1: we use this technology to shaft people, to embezzle, to, 644 00:32:29,840 --> 00:32:34,760 Speaker 1: you know, create some kind of unfair advantage, you know? Yeah, 645 00:32:34,840 --> 00:32:36,320 Speaker 1: that's gonna— it's not going away. 646 00:32:37,000 --> 00:32:38,360 Speaker 2: Yeah, so that's that's a thing now, 647 00:32:38,520 --> 00:32:42,720 Speaker 3: Craigo, yep. And look, there's been a 648 00:32:42,800 --> 00:32:47,760 Speaker 3: twenty nine percent increase in fraud reported, you know, according 649 00:32:47,800 --> 00:32:51,560 Speaker 3: to this particular report by Allianz, and so it's 650 00:32:51,600 --> 00:32:54,080 Speaker 3: happening a lot more. And they talked about a van, 651 00:32:54,160 --> 00:32:57,520 Speaker 3: and they showed a picture of this van pre-AI 652 00:32:57,760 --> 00:33:01,360 Speaker 3: and post-AI, and the person had created a dent 653 00:33:01,800 --> 00:33:03,200 Speaker 3: in the side of the car. 654 00:33:03,360 --> 00:33:06,080 Speaker 2: So yeah, it's a thing. Are we giving people 655 00:33:06,160 --> 00:33:07,640 Speaker 2: bad ideas, though? 656 00:33:08,200 --> 00:33:09,920 Speaker 4: I think the ideas were already out there. 657 00:33:10,000 --> 00:33:10,560 Speaker 2: Okay, good. 658 00:33:11,680 --> 00:33:15,840 Speaker 1: Yeah, I want to know why Pizza Hut got fined 659 00:33:15,880 --> 00:33:17,400 Speaker 1: two and a half million bucks. 660 00:33:17,880 --> 00:33:22,760 Speaker 3: I know. I kind of feel that this is 661 00:33:22,800 --> 00:33:26,560 Speaker 3: a very important message that, you know, the 662 00:33:26,600 --> 00:33:29,680 Speaker 3: authorities are dishing out to companies like this. Oh, dishing 663 00:33:29,680 --> 00:33:30,600 Speaker 3: out, like a pizza.
664 00:33:31,120 --> 00:33:33,760 Speaker 4: No pun intended. When you said that, I thought of that. 665 00:33:34,120 --> 00:33:37,880 Speaker 2: Well, the thing is, can you get spam on a pizza? 666 00:33:39,040 --> 00:33:40,120 Speaker 4: Oh, tish-boom. 667 00:33:40,840 --> 00:33:42,040 Speaker 2: It's getting, isn't it. 668 00:33:42,160 --> 00:33:46,080 Speaker 1: I don't think— you know, like, our older folk, 669 00:33:46,520 --> 00:33:49,320 Speaker 1: our older listeners know what spam is, but probably our 670 00:33:49,920 --> 00:33:52,520 Speaker 1: forty and under brigade, they wouldn't know what spam is, 671 00:33:52,600 --> 00:33:55,200 Speaker 3: Patrick. Yeah, so spam, meat in a can. That was just— 672 00:33:55,320 --> 00:33:56,600 Speaker 3: I don't know what the meat was. Well, I don't think 673 00:33:56,600 --> 00:33:58,720 Speaker 3: anybody knows what's inside spam. They put it in a 674 00:33:58,760 --> 00:34:00,440 Speaker 3: can and you can make a boot out of it. 675 00:34:00,520 --> 00:34:03,520 Speaker 4: I think let's just say it was ham. 676 00:34:03,680 --> 00:34:05,920 Speaker 2: Ish. Yeah, in part. 677 00:34:05,880 --> 00:34:08,520 Speaker 4: In that it was in the neighborhood of ham. 678 00:34:08,680 --> 00:34:11,080 Speaker 2: I think snouts and assholes is probably— 679 00:34:10,800 --> 00:34:14,000 Speaker 1: You know, but I think— I think I'll tell you 680 00:34:14,040 --> 00:34:17,440 Speaker 1: what it was. It was like a canful of salt. 681 00:34:17,960 --> 00:34:21,080 Speaker 1: It was the saltiest shit you've ever tasted. They probably 682 00:34:21,120 --> 00:34:23,200 Speaker 1: put all that salt in there to kill 683 00:34:22,960 --> 00:34:26,319 Speaker 3: the taste, yeah, and also to keep it from going off. Didn't they 684 00:34:26,440 --> 00:34:28,560 Speaker 3: use it during— wasn't it developed for World War II 685 00:34:28,719 --> 00:34:29,120 Speaker 3: or something? 686 00:34:29,200 --> 00:34:31,080 Speaker 2: I thought— I think it was for troops.
687 00:34:31,239 --> 00:34:33,680 Speaker 4: I think, like, fair bit of fat, fair bit of salt. 688 00:34:33,880 --> 00:34:37,720 Speaker 1: Probably. If you're in the middle of nowhere and trying 689 00:34:37,760 --> 00:34:38,839 Speaker 1: to stay alive. 690 00:34:38,600 --> 00:34:40,879 Speaker 4: Who knows. I mean, anything would taste pretty good. 691 00:34:41,200 --> 00:34:41,680 Speaker 2: Yeah. 692 00:34:41,760 --> 00:34:45,719 Speaker 3: So in Australia, the Communications and Media Authority, the ACMA, 693 00:34:46,560 --> 00:34:52,719 Speaker 3: they investigate complaints when people receive unsolicited emails or texts 694 00:34:53,120 --> 00:34:55,480 Speaker 3: and you don't have an ability to be able to 695 00:34:55,520 --> 00:34:58,080 Speaker 3: opt out. So it's law in Australia: if you get 696 00:34:58,120 --> 00:34:59,920 Speaker 3: a bulk email sent to you or a bulk text 697 00:35:00,560 --> 00:35:03,359 Speaker 3: and you try to opt out and you can't, then 698 00:35:03,400 --> 00:35:04,680 Speaker 3: the company doing 699 00:35:04,400 --> 00:35:05,520 Speaker 2: that is at fault. 700 00:35:05,960 --> 00:35:09,600 Speaker 3: And evidently in this particular instance, it was reported that 701 00:35:09,800 --> 00:35:14,120 Speaker 3: ten million spam marketing texts and emails were sent out 702 00:35:14,160 --> 00:35:19,160 Speaker 3: in four months' time, and people were trying to unsubscribe 703 00:35:19,440 --> 00:35:22,160 Speaker 3: to stop receiving them and they couldn't. So, and then 704 00:35:22,200 --> 00:35:25,560 Speaker 3: another four point three million marketing messages were sent out 705 00:35:25,960 --> 00:35:29,960 Speaker 3: without the option to unsubscribe. So it sends a really 706 00:35:30,080 --> 00:35:36,160 Speaker 3: clear message that, you know, companies know these laws. 707 00:35:36,200 --> 00:35:38,040 Speaker 2: These laws have been around for a long time now.
708 00:35:38,080 --> 00:35:39,680 Speaker 3: So I just thought it would be an interesting one 709 00:35:39,719 --> 00:35:43,280 Speaker 3: to mention that, you know, if you're receiving unsolicited emails 710 00:35:43,280 --> 00:35:45,480 Speaker 3: and you want to dub somebody in for doing it, gee, 711 00:35:45,480 --> 00:35:46,040 Speaker 3: I'll tell you what. 712 00:35:46,080 --> 00:35:48,040 Speaker 2: The fine's pretty big. 713 00:35:47,920 --> 00:35:52,440 Speaker 1: Can I tell you something in the same ballpark? So I'm 714 00:35:52,480 --> 00:35:54,319 Speaker 1: not going to say what company. But I had this 715 00:35:55,800 --> 00:35:58,600 Speaker 1: kind of — what do you call the computer protectiony stuff? 716 00:35:58,840 --> 00:35:59,480 Speaker 4: What do you call that? 717 00:36:00,120 --> 00:36:00,720 Speaker 2: Antivirus? 718 00:36:01,280 --> 00:36:02,640 Speaker 4: Antivirus! Thank you, Patrick. 719 00:36:03,320 --> 00:36:06,400 Speaker 1: So I had that installed, I won't say what company, 720 00:36:06,520 --> 00:36:11,799 Speaker 1: and, like, every eight minutes it would pop up 721 00:36:11,840 --> 00:36:16,080 Speaker 1: with whatever, right? And then it ran out and I 722 00:36:16,200 --> 00:36:19,000 Speaker 1: decided to go with something else, and this one that 723 00:36:19,080 --> 00:36:22,200 Speaker 1: ran out — trying to get that out of my 724 00:36:22,360 --> 00:36:26,920 Speaker 1: fucking computer, it was like it was a virus, like 725 00:36:27,040 --> 00:36:30,600 Speaker 1: it would not go away. I tried to uninstall it, 726 00:36:30,719 --> 00:36:33,400 Speaker 1: or I did what I thought was uninstalling it. I followed 727 00:36:33,440 --> 00:36:36,640 Speaker 1: the YouTube video, I uninstalled it, and then it was 728 00:36:36,719 --> 00:36:40,319 Speaker 1: back and it kept going, you're expired, you need to 729 00:36:40,360 --> 00:36:41,520 Speaker 1: do this, you need to do that.
730 00:36:41,600 --> 00:36:43,480 Speaker 4: If you don't do this, like, the world's going to 731 00:36:43,520 --> 00:36:44,719 Speaker 4: blow up and we're going to, you know... 732 00:36:44,800 --> 00:36:48,000 Speaker 1: It's like, in the end, it was more of a 733 00:36:48,080 --> 00:36:50,840 Speaker 1: problem than the fucking viruses I was trying to protect 734 00:36:50,880 --> 00:36:51,560 Speaker 1: myself from. 735 00:36:52,239 --> 00:36:55,640 Speaker 3: Yeah, so a lot of laptop and computer companies do 736 00:36:55,800 --> 00:36:59,239 Speaker 3: deals with these antivirus companies, starting with Mc... 737 00:36:59,480 --> 00:37:01,920 Speaker 2: I think probably the one you're thinking of is 738 00:37:02,000 --> 00:37:05,480 Speaker 2: McAfee. Sorry, I just had something caught in my throat. 739 00:37:06,080 --> 00:37:06,520 Speaker 1: Yeah, so. 740 00:37:08,120 --> 00:37:11,359 Speaker 3: It's not unusual to have the antivirus installed, but 741 00:37:11,440 --> 00:37:14,280 Speaker 3: it's only a limited trial for, say, thirty days, 742 00:37:14,560 --> 00:37:16,799 Speaker 3: and it is really painful trying to get rid of it. 743 00:37:16,840 --> 00:37:18,840 Speaker 3: And I'm really impressed that you did a 744 00:37:18,920 --> 00:37:21,360 Speaker 3: YouTube, jumped on there and tried 745 00:37:21,160 --> 00:37:23,000 Speaker 2: to get rid of it yourself, Craigy. I thought that 746 00:37:23,040 --> 00:37:23,720 Speaker 2: was very impressive. 747 00:37:23,800 --> 00:37:25,800 Speaker 4: Yeah, I thought I'd done it, and then about twenty 748 00:37:25,800 --> 00:37:27,480 Speaker 4: minutes later it was back, and then I had to 749 00:37:27,480 --> 00:37:30,359 Speaker 4: get the actual smart person in the organization to come 750 00:37:30,400 --> 00:37:31,800 Speaker 4: over and do it. Liza.
751 00:37:31,960 --> 00:37:34,720 Speaker 1: Yeah, she came over and did it in thirteen seconds 752 00:37:34,719 --> 00:37:36,400 Speaker 1: and looked at me quizzically. 753 00:37:36,480 --> 00:37:38,400 Speaker 2: And she uses a Mac. Oh, are you on a 754 00:37:38,440 --> 00:37:39,000 Speaker 2: Mac as well? 755 00:37:39,480 --> 00:37:42,120 Speaker 4: I'm not on a Mac, so it wasn't that brand 756 00:37:42,160 --> 00:37:43,920 Speaker 4: you said. But I don't want to say it on 757 00:37:43,960 --> 00:37:46,120 Speaker 4: air because I feel like they might hijack me. 758 00:37:46,480 --> 00:37:47,320 Speaker 2: Yeah, fair enough. 759 00:37:47,880 --> 00:37:51,760 Speaker 3: It's interesting, while you're talking about that: at the moment, 760 00:37:51,880 --> 00:37:54,080 Speaker 3: Dell has put out a warning just recently, in the 761 00:37:54,120 --> 00:37:57,279 Speaker 3: last few days, because they had a data breach, and 762 00:37:57,520 --> 00:38:01,560 Speaker 3: they believe forty-nine million customers may have had information taken, 763 00:38:02,000 --> 00:38:05,879 Speaker 3: and that's now gone onto the dark web, and 764 00:38:06,320 --> 00:38:09,840 Speaker 3: that's potentially... yeah, this could be serious for a lot 765 00:38:09,880 --> 00:38:10,160 Speaker 3: of people. 766 00:38:10,440 --> 00:38:15,919 Speaker 4: Forty-nine million. I mean, how do... 767 00:38:14,360 --> 00:38:17,400 Speaker 1: how do they ever compensate for that? Because it's like, 768 00:38:17,480 --> 00:38:21,400 Speaker 1: it's almost unknowable, like, the potential... and I'm sure the 769 00:38:21,480 --> 00:38:24,920 Speaker 1: vast majority of people would be fine. But I mean, 770 00:38:24,960 --> 00:38:28,280 Speaker 1: even if it was one percent of forty-nine million, 771 00:38:28,320 --> 00:38:31,840 Speaker 1: that's four hundred and ninety thousand people. That's half a 772 00:38:31,880 --> 00:38:36,000 Speaker 1: million people, give or take.
I mean, yeah, that's 773 00:38:36,080 --> 00:38:40,160 Speaker 1: the scary thing now: when people that don't 774 00:38:40,160 --> 00:38:43,200 Speaker 1: have your best interests at heart have got access to 775 00:38:43,239 --> 00:38:45,080 Speaker 1: all your shit, that makes me nervous. 776 00:38:45,360 --> 00:38:48,759 Speaker 3: Yeah, and look, one of the things they're saying is 777 00:38:48,800 --> 00:38:51,400 Speaker 3: that it could be as much as the name of 778 00:38:51,440 --> 00:38:54,600 Speaker 3: the buyer, the company name, the address. So there's quite 779 00:38:54,600 --> 00:38:58,920 Speaker 3: a few markers that are very specific to the individual customer, 780 00:38:58,920 --> 00:39:01,440 Speaker 3: and that's one of the problems. And, you know, things 781 00:39:01,520 --> 00:39:03,600 Speaker 3: like your name and your physical address, that sort of 782 00:39:03,600 --> 00:39:06,520 Speaker 3: stuff is, you know, quite problematic. But 783 00:39:06,600 --> 00:39:09,520 Speaker 3: one of the things that you should do is go 784 00:39:09,640 --> 00:39:12,360 Speaker 3: to the website of the vendor, if it happens to 785 00:39:12,360 --> 00:39:16,359 Speaker 3: be, in this case, Dell, and follow their instructions and suggestions. 786 00:39:16,719 --> 00:39:18,799 Speaker 3: But the first thing you need to do is go 787 00:39:18,880 --> 00:39:22,319 Speaker 3: and change your password. Absolute first thing you do. Now, 788 00:39:22,360 --> 00:39:24,760 Speaker 3: I haven't had a Dell computer for quite a few years, 789 00:39:25,200 --> 00:39:27,040 Speaker 3: but that was the first thing I did: jump 790 00:39:27,080 --> 00:39:29,160 Speaker 3: in and see if I still had a Dell account 791 00:39:29,440 --> 00:39:31,840 Speaker 3: and if there was a password, and I generated a 792 00:39:31,880 --> 00:39:35,239 Speaker 3: new password just in case.
That was one of the 793 00:39:35,280 --> 00:39:38,239 Speaker 3: first things. And also two-factor authentication. You might have 794 00:39:38,280 --> 00:39:40,839 Speaker 3: seen it written down as 2FA. That's another thing 795 00:39:40,880 --> 00:39:42,640 Speaker 3: that you can do: make sure you turn it on. Two-factor 796 00:39:42,719 --> 00:39:46,839 Speaker 3: authentication means that before you can sign in, you have 797 00:39:46,960 --> 00:39:51,560 Speaker 3: to have a second factor: either a text 798 00:39:51,600 --> 00:39:53,640 Speaker 3: message or an email gets sent to you with a 799 00:39:53,719 --> 00:39:56,319 Speaker 3: code, and then you put the code in. So it's 800 00:39:56,320 --> 00:39:58,400 Speaker 3: not enough just to have the password; you need that 801 00:39:58,440 --> 00:40:00,719 Speaker 3: second code to get in there as well. It's 802 00:40:00,719 --> 00:40:03,840 Speaker 3: a pain, but it's a level of security as well. 803 00:40:04,560 --> 00:40:07,560 Speaker 1: I saw this reel today, mate, and a guy was... 804 00:40:08,280 --> 00:40:09,920 Speaker 1: I don't know if it was... it was like a 805 00:40:09,960 --> 00:40:13,200 Speaker 1: little, almost like a road test of a product. This 806 00:40:13,239 --> 00:40:16,680 Speaker 1: guy was sitting in what looked to be a cafe. 807 00:40:16,760 --> 00:40:19,920 Speaker 1: There was lots of people around. He was talking, and 808 00:40:20,520 --> 00:40:22,920 Speaker 1: we could hear what he was hearing. He had like 809 00:40:23,000 --> 00:40:26,319 Speaker 1: this thing in his ear, and then he said... 810 00:40:26,760 --> 00:40:27,120 Speaker 4: I don't know. 811 00:40:27,160 --> 00:40:30,160 Speaker 1: He gave some instruction to cut out the background noise 812 00:40:30,280 --> 00:40:33,640 Speaker 1: and just to focus on one person across the table.
813 00:40:34,560 --> 00:40:37,920 Speaker 1: Took out all the noise except this bloke's voice, and 814 00:40:37,960 --> 00:40:41,760 Speaker 1: then you could clearly hear what he was saying, except 815 00:40:41,840 --> 00:40:44,920 Speaker 1: he was speaking in Spanish. And then he said, now 816 00:40:45,000 --> 00:40:48,200 Speaker 1: convert it into English. So all of a sudden, all the 817 00:40:48,239 --> 00:40:51,000 Speaker 1: noise in the room has gone, or, you know, it's 818 00:40:51,000 --> 00:40:53,840 Speaker 1: gone from a ten to a one, and now we 819 00:40:53,880 --> 00:40:57,160 Speaker 1: can clearly hear this guy, and then 820 00:40:57,200 --> 00:41:02,279 Speaker 1: it gets translated, in pretty much real time, into English, like... 821 00:41:02,600 --> 00:41:04,279 Speaker 3: Yeah, it makes you feel sad that we don't have 822 00:41:04,320 --> 00:41:06,600 Speaker 3: a Cold War and spies anymore, because that 823 00:41:06,600 --> 00:41:08,040 Speaker 2: would have been really handy, wouldn't it? 824 00:41:08,360 --> 00:41:13,359 Speaker 4: Ah, yeah, I'm pretty sure we do have spies. I'm 825 00:41:13,360 --> 00:41:14,200 Speaker 4: pretty sure there 826 00:41:14,040 --> 00:41:16,320 Speaker 3: are still. Shout out to everybody at ASIO. 827 00:41:17,239 --> 00:41:20,680 Speaker 1: I'm going to show my age here, because I read 828 00:41:20,719 --> 00:41:24,399 Speaker 1: one of the titles of one of your kind 829 00:41:24,400 --> 00:41:27,719 Speaker 1: of articles, or one of your discussion points under social media, 830 00:41:27,760 --> 00:41:31,760 Speaker 1: and it says young people share their experience of being doxxed. 831 00:41:32,520 --> 00:41:34,000 Speaker 1: I don't know what doxxed means. 832 00:41:34,560 --> 00:41:37,319 Speaker 3: This is actually a bit insidious. It's 833 00:41:37,360 --> 00:41:40,960 Speaker 3: a terrible thing to do to somebody. It's where, online...
Okay, 834 00:41:40,960 --> 00:41:43,600 Speaker 3: we'll give an example. You're online gaming and you have 835 00:41:43,640 --> 00:41:45,839 Speaker 3: a bit of an altercation with someone, because quite often 836 00:41:45,880 --> 00:41:49,200 Speaker 3: online gamers can talk to each other, and someone takes 837 00:41:49,200 --> 00:41:52,520 Speaker 3: a dislike to you. But then they get your personal details, 838 00:41:52,560 --> 00:41:55,840 Speaker 3: your address, and then they start sending pizza delivery people 839 00:41:55,840 --> 00:41:58,319 Speaker 3: to your house. And we're not talking one person, we're 840 00:41:58,320 --> 00:42:01,759 Speaker 3: talking fifty people. So suddenly you're getting packages delivered to 841 00:42:01,800 --> 00:42:04,440 Speaker 3: your house, you're getting pizzas delivered to your house, and 842 00:42:05,160 --> 00:42:08,359 Speaker 3: it can be amazingly distressing. This happened to a guy 843 00:42:08,400 --> 00:42:12,320 Speaker 3: who was only seventeen, and his whole family was thrown 844 00:42:12,400 --> 00:42:15,480 Speaker 3: a curveball here. It meant that, you know, 845 00:42:15,480 --> 00:42:17,920 Speaker 3: he got into an argument online playing a game called 846 00:42:18,320 --> 00:42:20,879 Speaker 3: Call of Duty, and then for about three or four 847 00:42:20,960 --> 00:42:24,359 Speaker 3: weeks he kept getting, you know, Uber Eats and pizzas, 848 00:42:24,640 --> 00:42:28,279 Speaker 3: they say sometimes fifty times a day. And that's 849 00:42:28,280 --> 00:42:31,239 Speaker 3: what it is. It's a really malicious thing people can do.
850 00:42:31,440 --> 00:42:34,719 Speaker 3: I mean, I always get distressed by the keyboard warriors, 851 00:42:34,760 --> 00:42:38,360 Speaker 3: you know, people who are so tough behind the keyboard 852 00:42:38,719 --> 00:42:40,759 Speaker 3: and they'll say things they would never say in 853 00:42:40,800 --> 00:42:43,359 Speaker 3: real life, if you think about it. And look, 854 00:42:43,400 --> 00:42:45,120 Speaker 3: there's always going to be people out there who'll do 855 00:42:45,160 --> 00:42:47,239 Speaker 3: this sort of stuff. But yeah, so that's what doxxing is. 856 00:42:48,080 --> 00:42:50,120 Speaker 3: But as you can imagine, the whole family would have 857 00:42:50,120 --> 00:42:52,120 Speaker 3: been pretty distressed about this sort of thing happening. 858 00:42:52,320 --> 00:42:55,640 Speaker 1: I wonder what the solution to this is for the 859 00:42:56,280 --> 00:42:59,720 Speaker 1: food companies or the restaurants, because, you think about it, 860 00:42:59,840 --> 00:43:02,400 Speaker 1: you could ring up... I mean, I live on 861 00:43:02,440 --> 00:43:05,759 Speaker 1: a street with fifty restaurants, and that's not even counting cafes. I could 862 00:43:05,800 --> 00:43:08,799 Speaker 1: ring up the pizza joint that's one hundred and eighty 863 00:43:08,840 --> 00:43:11,680 Speaker 1: meters away and order ten pizzas. As long as I 864 00:43:11,719 --> 00:43:14,520 Speaker 1: give them an address and a phone number, they're going 865 00:43:14,560 --> 00:43:15,799 Speaker 1: to make the ten pizzas. 866 00:43:16,239 --> 00:43:18,319 Speaker 2: Look, I think the only way to get around that 867 00:43:18,480 --> 00:43:21,440 Speaker 2: is for people to prepay. And to say... I mean, 868 00:43:21,440 --> 00:43:22,799 Speaker 2: at the end of the day, if someone sends me 869 00:43:22,800 --> 00:43:25,320 Speaker 2: a pizza and it's been paid for, I'm not too concerned.
870 00:43:25,480 --> 00:43:30,120 Speaker 3: Right. Yeah, I mean, look, you know, society 871 00:43:30,200 --> 00:43:32,480 Speaker 3: is breaking down when people start sending a kilo and 872 00:43:32,520 --> 00:43:34,000 Speaker 3: a half of coins in the mail. 873 00:43:34,080 --> 00:43:35,000 Speaker 2: I mean, I'm just kidding. 874 00:43:37,960 --> 00:43:41,280 Speaker 4: Yeah. By the way, Patrick's home address, for the Federal 875 00:43:41,320 --> 00:43:44,040 Speaker 4: Police, if you're listening: he's in Milan. I don't know 876 00:43:44,080 --> 00:43:44,840 Speaker 4: the exact address. 877 00:43:44,880 --> 00:43:46,799 Speaker 1: It just occurred to me that if all of our 878 00:43:46,840 --> 00:43:50,920 Speaker 1: listeners sent Patrick five dollars each, I will match that 879 00:43:51,040 --> 00:43:52,319 Speaker 1: with whatever we come up with. 880 00:43:53,920 --> 00:43:55,360 Speaker 2: Craig, that's a great idea. 881 00:43:55,840 --> 00:43:59,200 Speaker 1: You're an idiot. Stop. That was fake me. That wasn't 882 00:43:59,239 --> 00:44:03,800 Speaker 1: really me, everyone. By the way, fake me is way too 883 00:44:03,640 --> 00:44:04,960 Speaker 2: polite. You reckon? 884 00:44:05,640 --> 00:44:07,880 Speaker 4: Yeah, yeah, fake 885 00:44:07,719 --> 00:44:13,040 Speaker 1: me is too articulate and nice. I really need to apologize. 886 00:44:13,080 --> 00:44:15,400 Speaker 1: I was going to do a tai chi class with Patrick, 887 00:44:15,480 --> 00:44:20,279 Speaker 1: but I weaseled my way out of it. There's no apology, 888 00:44:20,520 --> 00:44:21,520 Speaker 1: there's just relief. 889 00:44:22,000 --> 00:44:24,240 Speaker 2: You're right, actually. Yeah, it's just relief. 890 00:44:25,080 --> 00:44:28,000 Speaker 4: Like, fake me... fake me doesn't know real me, because 891 00:44:28,040 --> 00:44:31,160 Speaker 4: fake me... he would know... fake me should know that
892 00:44:31,200 --> 00:44:32,920 Speaker 4: real me would never apologize for that. 893 00:44:33,960 --> 00:44:36,200 Speaker 1: Of course you were. I'm always wrong and you are 894 00:44:36,239 --> 00:44:41,560 Speaker 1: one hundred percent right all the time. You wish. All right, 895 00:44:41,719 --> 00:44:45,359 Speaker 1: take us home with something. Just round this bad boy 896 00:44:45,480 --> 00:44:46,200 Speaker 1: up, if you could. 897 00:44:46,480 --> 00:44:47,880 Speaker 2: It sounds like you want to get rid of me. 898 00:44:48,680 --> 00:44:49,520 Speaker 4: No, you can do it. 899 00:44:49,480 --> 00:44:51,960 Speaker 1: I just realized, you know why 900 00:44:52,000 --> 00:44:54,680 Speaker 1: we've gotten through a lot in a short time? I 901 00:44:54,719 --> 00:44:58,200 Speaker 1: don't want to say it's Tiff, but it's Tiff. 902 00:44:59,000 --> 00:45:02,200 Speaker 1: There's a big conversational hole. 903 00:45:04,960 --> 00:45:05,200 Speaker 2: You know... 904 00:45:05,920 --> 00:45:10,880 Speaker 3: Some researchers here in Australia have been looking at 905 00:45:11,120 --> 00:45:15,800 Speaker 3: how children are being educated, and they're saying that children 906 00:45:15,880 --> 00:45:21,440 Speaker 3: produce better pieces of writing by hand, so their ability 907 00:45:21,480 --> 00:45:25,320 Speaker 3: to learn is better at a certain age. So we're talking, 908 00:45:25,360 --> 00:45:28,440 Speaker 3: say, grade one and grade two, if they are still writing. 909 00:45:28,480 --> 00:45:31,439 Speaker 3: And this is where the education system in Australia is being 910 00:45:31,520 --> 00:45:36,600 Speaker 3: praised, because when you're writing, you're singularly focused on what 911 00:45:36,640 --> 00:45:39,520 Speaker 3: you're doing, spelling every single letter.
You're writing out 912 00:45:39,520 --> 00:45:43,319 Speaker 3: every single letter, and it's a much better way 913 00:45:43,360 --> 00:45:46,080 Speaker 3: to focus, and to be able to get kids 914 00:45:45,960 --> 00:45:46,279 Speaker 2: to do that. 915 00:45:46,280 --> 00:45:49,399 Speaker 3: And I thought, isn't that great, that handwriting is such 916 00:45:49,400 --> 00:45:52,239 Speaker 3: a good thing for young people, and that we're still doing it. 917 00:45:52,800 --> 00:45:57,080 Speaker 3: Obviously keyboard skills are imperative for kids as they're getting older, 918 00:45:57,480 --> 00:46:01,279 Speaker 3: but certainly at a younger age, they're saying, handwriting 919 00:46:01,360 --> 00:46:04,440 Speaker 3: is really, really good. I've got friends of mine who 920 00:46:04,520 --> 00:46:06,600 Speaker 3: live around the corner from you. I just gave their 921 00:46:06,680 --> 00:46:09,480 Speaker 3: fourteen-year-old a fountain pen because he was telling me he 922 00:46:09,960 --> 00:46:13,680 Speaker 3: likes to write, and so it's great. It was funny, 923 00:46:13,680 --> 00:46:15,960 Speaker 3: because I went over to their place a few weeks 924 00:46:15,960 --> 00:46:19,520 Speaker 3: ago and his fingers were blue. The pen leaked. 925 00:46:19,719 --> 00:46:22,919 Speaker 2: That's so dodgy. Was that an expensive one to start 926 00:46:22,960 --> 00:46:23,239 Speaker 2: off with? 927 00:46:24,640 --> 00:46:27,319 Speaker 4: You know... I mean, I don't know about you... 928 00:46:27,440 --> 00:46:28,520 Speaker 4: a couple of things 929 00:46:28,280 --> 00:46:32,239 Speaker 1: here, but we had to use fountain pens, like, 930 00:46:32,360 --> 00:46:34,880 Speaker 1: we had... in the old days, we 931 00:46:34,920 --> 00:46:37,239 Speaker 1: didn't dip it in a bloody thing of ink, but 932 00:46:37,520 --> 00:46:40,480 Speaker 1: they had these ink cartridges, and we 933 00:46:40,600 --> 00:46:43,920 Speaker 4: had to write.
Is it called cursive writing, where all 934 00:46:43,920 --> 00:46:45,080 Speaker 4: the letters are connected? 935 00:46:45,480 --> 00:46:50,480 Speaker 2: Seriously? Yeah, of course I still write cursive. Do you? Yeah? 936 00:46:50,520 --> 00:46:51,120 Speaker 2: With the fountain pen? 937 00:46:52,000 --> 00:46:53,759 Speaker 4: Are you... are you my mum? Are you an 938 00:46:53,800 --> 00:46:54,600 Speaker 4: eighty-five-year- 939 00:46:54,520 --> 00:46:56,160 Speaker 2: old writing with a fountain pen? 940 00:46:56,320 --> 00:47:00,120 Speaker 3: I actually have a beautiful Montblanc fountain pen that 941 00:47:00,160 --> 00:47:01,439 Speaker 3: I write with every day. 942 00:47:02,440 --> 00:47:04,160 Speaker 4: Yeah, that's... clearly. 943 00:47:04,520 --> 00:47:06,080 Speaker 2: I feel like I need to write you a letter. 944 00:47:06,760 --> 00:47:09,880 Speaker 4: Fuck your letter. Don't send me a letter. 945 00:47:10,719 --> 00:47:14,799 Speaker 4: Fuck. You know, you're so weird. It's so weird. Hey, 946 00:47:14,960 --> 00:47:17,239 Speaker 4: but I never thought about the fact that 947 00:47:19,640 --> 00:47:24,040 Speaker 1: kids might soon stop writing. Like, when you see, like, 948 00:47:24,160 --> 00:47:29,080 Speaker 1: literally two-year-olds, three-year-olds using computers, tablets, phones, 949 00:47:29,160 --> 00:47:35,680 Speaker 1: whatever, and typing and navigating a keyboard, then I wonder 950 00:47:35,760 --> 00:47:37,840 Speaker 1: if... I wonder if there's going to be a time 951 00:47:38,360 --> 00:47:41,839 Speaker 1: in the near future... You know what? He just took 952 00:47:41,880 --> 00:47:44,200 Speaker 1: off and took his headphone off. So are you going 953 00:47:44,280 --> 00:47:46,439 Speaker 1: to pretend that you know what I'm talking about now?
954 00:47:47,360 --> 00:47:50,400 Speaker 3: Fritz was trying to get out of the office, 955 00:47:50,440 --> 00:47:52,279 Speaker 3: and I thought he might need to go 956 00:47:52,320 --> 00:47:53,440 Speaker 3: to the toilet or something. 957 00:47:53,520 --> 00:47:55,680 Speaker 1: So I hope he did a big dump on your 958 00:47:55,760 --> 00:48:00,800 Speaker 1: fucking sofa. Hey, I was just saying, before you rudely took off, 959 00:48:01,920 --> 00:48:08,840 Speaker 1: I wish I had some fake Patrick... don't look at... 960 00:48:08,920 --> 00:48:11,880 Speaker 1: what? Yeah, you're trying to find some more fake me. 961 00:48:12,360 --> 00:48:14,440 Speaker 1: I was just saying, I wonder if we're going to 962 00:48:14,440 --> 00:48:16,359 Speaker 1: get to a time in the near future where kids 963 00:48:16,400 --> 00:48:19,520 Speaker 1: don't actually learn to write with pen and pencil anymore, 964 00:48:20,000 --> 00:48:21,879 Speaker 1: because we don't need it. 965 00:48:22,480 --> 00:48:23,800 Speaker 4: The world doesn't need 966 00:48:23,640 --> 00:48:24,719 Speaker 2: that. Not in Australia. 967 00:48:24,800 --> 00:48:27,319 Speaker 3: Evidently that's still a core part of our teaching, which 968 00:48:27,320 --> 00:48:27,680 Speaker 3: is great. 969 00:48:27,760 --> 00:48:29,520 Speaker 1: I know it is now, but I wonder if it'll 970 00:48:29,560 --> 00:48:33,480 Speaker 1: be in ten years. Like, we're trying to eradicate paper. 971 00:48:34,320 --> 00:48:37,560 Speaker 1: We're already almost a cashless society. 972 00:48:38,320 --> 00:48:40,800 Speaker 3: But hasn't Tiff been using one of those tablets and 973 00:48:40,840 --> 00:48:43,000 Speaker 3: a pen? Because that's the other thing you can do. 974 00:48:43,160 --> 00:48:45,120 Speaker 3: You can just have a tablet that you can write on.
975 00:48:45,200 --> 00:48:48,040 Speaker 3: My laptop flips over and I have a little stylus, 976 00:48:48,400 --> 00:48:50,759 Speaker 3: so I can take notes and write them that way. 977 00:48:51,239 --> 00:48:53,040 Speaker 3: You know, I'd played around with it for a little while. 978 00:48:53,080 --> 00:48:56,400 Speaker 3: In fact, your last conference that you did, do you 979 00:48:56,400 --> 00:48:57,920 Speaker 3: remember? I went along to that and you had some 980 00:48:57,960 --> 00:49:00,680 Speaker 3: great guest speakers. Well, I took that in, making notes, 981 00:49:00,920 --> 00:49:03,600 Speaker 3: and I flipped the laptop over and I was writing 982 00:49:03,600 --> 00:49:04,719 Speaker 3: my notes on the back of it. 983 00:49:05,000 --> 00:49:07,319 Speaker 2: But it's not the same as having ink on a 984 00:49:07,320 --> 00:49:08,040 Speaker 2: piece of paper. 985 00:49:08,560 --> 00:49:13,560 Speaker 1: Hmmm. Well, think about the trees, bro. Think about the trees. Ah, 986 00:49:15,160 --> 00:49:18,160 Speaker 1: tell people how they can come and break into your 987 00:49:19,440 --> 00:49:23,520 Speaker 1: very insecure house in Bland and do yoga with you 988 00:49:23,560 --> 00:49:29,239 Speaker 1: and get you to teach them all tai chi, yoga, Pilates... 989 00:49:29,280 --> 00:49:32,000 Speaker 1: it's all the same. I really need to apologize. I 990 00:49:32,040 --> 00:49:34,200 Speaker 1: was going to do a tai chi class with Patrick, 991 00:49:34,320 --> 00:49:37,799 Speaker 1: but I weaseled my way out of it. Stop it, 992 00:49:37,880 --> 00:49:41,920 Speaker 1: stop playing fake me. All right, I'm taking the piss, everyone. 993 00:49:42,080 --> 00:49:45,520 Speaker 1: They're definitely not the same thing. They're all boring. 994 00:49:48,719 --> 00:49:50,440 Speaker 4: Oh, he's telling me I need to apologize.
995 00:49:50,440 --> 00:49:52,760 Speaker 1: I was going to do a tai chi class with Patrick, 996 00:49:52,880 --> 00:49:54,359 Speaker 1: but I weaseled my way out of it. 997 00:49:56,520 --> 00:49:59,000 Speaker 4: There we go. Yep, there we go. 998 00:50:00,000 --> 00:50:04,279 Speaker 1: We say, dot dot com be what smelkod doubt... that 999 00:50:04,400 --> 00:50:05,879 Speaker 1: was supposed to be websites 1000 00:50:06,040 --> 00:50:08,719 Speaker 3: now dot com dot au. Maybe I have to put 1001 00:50:08,760 --> 00:50:09,399 Speaker 3: gaps in it. 1002 00:50:10,640 --> 00:50:13,040 Speaker 4: Go and type it in now, properly. Go and type 1003 00:50:13,080 --> 00:50:15,000 Speaker 4: it in. I'll talk while you're doing that, and see 1004 00:50:15,040 --> 00:50:16,480 Speaker 4: if you can get it right this time. 1005 00:50:17,600 --> 00:50:22,080 Speaker 1: You really... you've lowered your colors today. 1006 00:50:21,040 --> 00:50:22,400 Speaker 2: Isn't it? Well, you know, the other thing is, we 1007 00:50:22,480 --> 00:50:23,839 Speaker 2: normally do them earlier in the day. 1008 00:50:25,640 --> 00:50:27,759 Speaker 4: I don't mind... I don't mind five o'clock. It's five 1009 00:50:27,920 --> 00:50:30,800 Speaker 4: fifty and it's dark as fucking hell. 1010 00:50:32,040 --> 00:50:34,440 Speaker 2: It's... well, it's pretty dark everywhere, I guess, at 1011 00:50:34,480 --> 00:50:34,920 Speaker 2: the moment. 1012 00:50:37,200 --> 00:50:38,880 Speaker 1: Websites now dot com dot a. 1013 00:50:40,719 --> 00:50:43,800 Speaker 2: Not really... not really. Spell it phonetically. 1014 00:50:46,680 --> 00:50:52,080 Speaker 1: What if you just write a dash u? 1015 00:50:54,160 --> 00:50:57,000 Speaker 3: That's so weird. That is so strange. I'm gonna have 1016 00:50:57,000 --> 00:50:58,839 Speaker 3: to play around with it. Look, we'll try one more time.
1017 00:50:58,880 --> 00:51:00,840 Speaker 3: I'm going to move a few of the sliders and 1018 00:51:00,920 --> 00:51:02,120 Speaker 3: see what happens. 1019 00:51:02,239 --> 00:51:06,760 Speaker 1: Sorry... now dot com dot a you. That's... 1020 00:51:06,600 --> 00:51:08,280 Speaker 2: Terrible. Yeah. Website? 1021 00:51:09,000 --> 00:51:09,200 Speaker 3: Are you? 1022 00:51:09,800 --> 00:51:10,000 Speaker 4: Yeah? 1023 00:51:10,040 --> 00:51:13,719 Speaker 1: Can you perfect that before the next one, Patrick? This 1024 00:51:13,800 --> 00:51:16,800 Speaker 1: has been your worst effort. You're on your first warning 1025 00:51:17,719 --> 00:51:19,760 Speaker 1: and you're on thin ice. 1026 00:51:20,120 --> 00:51:23,960 Speaker 3: I'm not even going to... you're gonna have... 1027 00:51:25,480 --> 00:51:28,080 Speaker 1: No, you know, here's the beauty of your pay: I 1028 00:51:28,080 --> 00:51:29,920 Speaker 1: can halve it or double it and you're still getting the 1029 00:51:29,960 --> 00:51:32,040 Speaker 1: same. You're welcome. 1030 00:51:32,520 --> 00:51:34,960 Speaker 2: Yeah, thanks for that, mate. Thanks, mate.