Speaker 1: We've got a team. It's a new project.
Speaker 2: It's Tiffanee Cook, Patrick James Bonello, Craig Anthony Harper.
Speaker 1: Once a week we get together and we do this. We're still not sure why.
Speaker 2: We seem to be a rudderless conversational ship, but nonetheless, we just gather and we do it.
Speaker 1: Hi, Patrick. We'll start with you.
Speaker 3: I don't want to come out punching, but you actually had two errors in the previous statement.
Speaker 1: Okay, that's fine, go ahead.
Speaker 3: Well, we don't meet weekly, it's fortnightly, which I love, and I look forward to it. It's the highlight of my fortnight.
Speaker 2: Okay, noted. Thank you for the correction.
Speaker 4: And you stumbled on the name, and how long have we known each other?
Speaker 3: What is my name?
Speaker 2: I actually had a momentary lapse about your middle name, and that was because I was completely inappropriately and unprofessionally looking at something on my phone. I would yell at you two for doing that, so I'm going to yell at myself. I was just... I know there's no excuse.
Speaker 3: You can't.
Speaker 2: Okay, I'm bad. Look, I accept both bits of feedback. You are correct, I am wrong.
Speaker 1: There we go. That's how you do it, everyone.
Speaker 2: If somebody points out a mistake that you made, or a fault, just go, fuck, thanks for that, Patrick. Noted, taken on board, won't do it again.
Speaker 1: You're welcome.
Speaker 3: I feel like we're in the moment now and I feel a lot better and more recognized.
Speaker 1: I feel seen, especially when you swear at me. Tiff, how are you? How is it all?
Speaker 5: I'm fabulous, some fabulous things happening.
Speaker 2: I just had a coffee with Scotty Douglas, who is a big fan of you and also a big fan of Patrick.
Speaker 2: You wouldn't know who he is, mate. He's been on the show a couple of times, loves you, and he's like, I never used to listen to the Patrick episodes, because I didn't listen to any, he said, because I don't really give a fuck about tech, right. And then he listened to one and he loved it, and he goes, oh god, now I've got to go back through them all because I love Patrick.
Speaker 5: Oh.
Speaker 3: Also, Scotty, you're now my best friend. Thank you, you've just made my day, after Craig abuses me at the start of the show, right.
Speaker 2: Well, you know that if I abuse you, especially if I call you the C word, then you're in...
Speaker 1: My top five, because I don't do that for anyone. You're welcome.
Speaker 2: But you know what is funny is this show has got almost... I mean, it's loosely about tech, but people don't listen to it for that. I think the majority of people don't listen to it for that reason. So let's not get too distracted with technology, shall we?
Speaker 3: Yeah? Good idea, mate, that's great.
Speaker 2: Before we do talk to Patrick in any detail, Tiffanee Cook, what is new with you? What is going on? You've got your big speaking event coming up in the Sunshine State. You've been prepping like a fucking champion. You've got a bit of nerves, you've got a bit of anxiety.
Speaker 1: Have you picked out your frock? What are you wearing?
Speaker 2: I'm sure you're wearing a good sensible frock, as Mary would say, or a tracksuit.
Speaker 5: And I'm sure as shit not wearing a frock. I've got...
Speaker 2: Are you wearing overalls with, like, a spanner in your pocket and a bit of grease on your chin?
Speaker 3: Is that a spanner in your pocket...
Speaker 4: Tiff?
Speaker 2: I'm just happy to do my presentation.
Speaker 5: There we go, no frock there. I love it, because all the girls are getting hair and makeup.
Speaker 5: I'm not. Fuck that. Yeah, not at all. But I'm excited. Yeah, plenty of nerves, plenty, because it is out of my comfort zone, the way that the whole thing's done. But it's coming together. I've pulled my head in, I've taken a bit of my own medicine and leaned into the discomfort and doing it someone else's way. And I'm just having a crack.
Speaker 3: Did you take Blake's advice, when you say you're not wearing a frock, that you're going to do it naked? You know how most people get up on stage and they imagine the crowd's naked? Doing the reverse.
Speaker 5: Definitely don't do that.
Speaker 3: Just checking. I thought maybe distraction might be a good strategy. For sure not. I want to know what you're gonna wear. What are you gonna wear? Because you look pretty amazing, and it doesn't matter what you wear, you look great. But what do you think you'll wear?
Speaker 5: Probably, probably that ten-dollar sparkly top I wore once before, and my Freddy jeans.
Speaker 3: So remember when we were locked up together? You're gonna wear the same outfit, are you?
Speaker 5: Well, that was, what do they call it, an orange jail jumpsuit, and that wasn't mine. So no, I won't be wearing that.
Speaker 4: But so, jeans and a sparkly top. Can you send me a photo later?
Speaker 3: Oh?
Speaker 1: Absolutely. Well, pretty much sounds like you, Patrick.
Speaker 2: I mean, you two could just bloody interchange wardrobes.
Speaker 1: Can we just clear up?
Speaker 2: So you've been doing part of this speakers' program, and some of the people who did the course, correct me if I fuck any of this up, got picked to speak at an event, and you're one of those people. Congratulations. But the challenge for you is that you, who like me, are very freestyle, which kind of means lazy and unplanned.
Speaker 2: You've had to script it, and you've had to... it's like, really, in a way, it's a performance. I mean, it's still you and they're still your words. But am I right in that it's got to be very scripted and very pre-planned?
Speaker 5: Yeah, very, yeah. So it's very tightly planned.
Speaker 2: And yeah, how is that for you?
Speaker 1: Because that ain't you.
Speaker 5: I know. It's been... it's been a hell of a thing. I've been very loud about it. I've been not the best student, I can tell you that much.
Speaker 2: But have you been quite oppositional?
Speaker 5: Oh yeah, plenty of that. And in the final hour, I thought, well, what is all this stuff you teach? Oh, that's right, shut the fuck up and do it.
Speaker 3: Then?
Speaker 5: So is there...
Speaker 3: What is there?
Speaker 2: And I'm not saying it's good or bad. I'm not being critical, I'm being curious. Like, what is the... what is her... like, your teacher person, we won't mention what it is or who it is, but shout out to her. Tiff does say good things behind your back. What, like, what's her philosophy, then? That it should be as well choreographed, prepared and scripted as possible so you can't fuck it up, or no?
Speaker 5: Well, when they say scripted, the end result is you're not going word for word off a script. You just know your content so tightly that you know where you're going, and you know the important parts to script.
Speaker 3: I do.
Speaker 5: I am open to believing that the potential of it being a much better way to put it together might unfold. I don't know if following this procedure exactly the way I have is the best way for me, but I think it'll iron out, and yeah, I think maybe the end result is that it will be more powerful, because I will know exactly what I want to be taken away and I won't leave anything out, and it'll be a bit more... yeah.
Speaker 2: I think the good thing about public speaking, professional or whatever, is that there is no single way to do it. Like, I remember doing a gig with Steven Bradbury, I think three times, you know, the accidental gold medallist. Not that he didn't deserve it, he fucking deserved it. But I think every time it was pretty much verbatim, word for word, and it had, you know, video bits in it and it had stories, and it was very formulaic, but every time it crushed, right? Every time it did great. And so here's me, mister non-formula, mister freestyle, loosey-goosey. And part of me was thinking, why the fuck isn't that boring?
Speaker 1: Just... and then...
Speaker 2: But part of me is like, hey, shut up, dickhead. He's getting paid shitloads and he's crushing three times in a row, so this is actually working for him very well. That probably wouldn't work for me. But then what I do wouldn't work for him. So I think this is the beauty of, you know, whether or not it's podcasting, or whether or not it's, you know, building any other kind of business or doing any other kind of performative thing.
Speaker 1: It's not like...
Speaker 2: ...there's a set way. But maybe what will happen is you'll do this kind of model, and then you might end up in your own space moving forward, somewhere between your loosey-goosey style and this way more prepared style. You might find a fit in the middle.
Speaker 5: And I also love that all the work that goes into this has given me something. Let's call it a signature keynote, so I can have that if people want it, or I can have whatever version I throw together to suit other events.
Speaker 3: Cool.
Speaker 2: I think you would have been a very good... or you would be, if you wanted to be, you'd be a very good presenter, because you've got a great voice and you're a great communicator.
Speaker 2: Have you ever thought about doing stuff like that, and what would you talk about?
Speaker 3: Yeah, that's a really interesting one. I think I would love to do that, because I actually love talking and communicating, and that's been my whole career, and I think it's something that I've just always thrived on. And I have thought about it a little bit, you know, I've kind of moved in different spaces. I did a few lectures when I was a journalist, at Federation University, talking to, you know, young journalists or people studying to be journalists about what it was actually like to work in a breakfast newsroom, and it was a lot of fun. It was just a thing that I did once a year for the journalism students. I don't know, it's a tough call. I think there's lots of things that I can talk about. It's funny, I went to... and this kind of has a close connection, but I thought of the two of you when I went to my very first improv night. The local council here, through our neighborhood center, ran a four-week course with a professional guy who does improv who's moved to the area. And I knew a lot of people who were doing the course, and so it was fun to go along and see them do this improv, and I'd never experienced it before. And I thought, improv, that'd be so easy.
Speaker 4: We do that every fortnight, how hard can it be?
Speaker 3: But they got the audience to throw out a reference word, and they had to improv around that. And listening to what Tiff was saying, I reckon I would struggle to follow a script. You know, a good thing about what you've done is it's kind of pushing you out of your comfort zone, and then it probably is going to help with structuring things a little bit more, like running to not so much a script, but a structure. And I think that's where you're probably going to take a little bit of this and a little bit of that, and it's going to make you an overall better speaker.
Speaker 3: But I would love to talk to people about... I mean, I don't often talk about this very much, but I've had a challenging childhood. It's something that, you know, in some areas it was kind of challenging. I had a bit of childhood trauma and bullying when I was a kid, and I've often thought that that could help someone, to know about adversity and coming from adversity. In fact, Tiff and I were talking about partnering up and maybe doing something together and running a workshop to talk about resilience and that sort of thing. And I think that's kind of fueled my love of tai chi and breaking away from those things that can really hinder us in our daily lives. So I think that would be the topic that I'd like to most focus on, something that I feel would really bring value to people's lives. For me, that's a big thing.
Speaker 2: You should maybe put some thought and work into that and see what comes out the other side. I think you'd be good at that, and people like you, and yeah, you're authentic, and I think that might resonate.
Speaker 1: Let's talk about tech.
Speaker 4: Shall we just give a shout-out to Scotty then, mate?
Speaker 2: Okay. It doesn't seem like this is the time to do that, but...
Speaker 1: Sure, go on.
Speaker 3: You know, wasn't that your friend that you had breakfast with this morning? Was that Scotty?
Speaker 1: Oh, I had no idea. Yeah, sure you can, yeah.
Speaker 4: I just thought it'd be nice to say hi, if he's listening to me.
Speaker 1: Patrick's not desperate at all. We'll send you his number later. Patrick...
Speaker 2: It appears to me, and I heard this last night on the news, and it's on the top of your to-chat-about list, OpenAI is allowing mature content now. Am I interpreting that as porn on ChatGPT for adult-verified users, starting in December?
Speaker 1: What does that actually mean?
Speaker 3: Well, Sam Altman, the bloke who runs, you know, the big AI mega, giant, enormous company, what he's saying is it means that you could potentially interact with your AI chatbot in a more adult-like way and not be hindered in that conversation. So I don't think it's about producing porn. I think it's having...
Speaker 1: Dirty, dirty conversations with AI.
Speaker 3: Yeah, maybe, I don't know. He hasn't really specified. Basically, what he's saying is they want to roll out, you know, that age-restriction thing and treat adult users like adults. And if adult users want to have a more saucy conversation with AI, should we stand in the way of that? Don't laugh. It's just...
Speaker 2: Like, yeah, there's so many hilarious things that are fucking bolting through my mind at the moment. What's the first thing? Even for me, somewhat inappropriate. Well, let's see what happens with that.
Speaker 1: Would you use it, Patrick?
Speaker 3: No, probably not. Tiff?
Speaker 2: Nah. All right, well, we'll see what happens with that, Patrick. Stay tuned, everybody, Patrick might be getting a new boyfriend. It sounds like his name is OpenAI. What, you asked us if we... I know.
Speaker 3: Would you? No, I just wanted to ask the question, I think.
Speaker 2: And I'm not just saying that. I know that would give me the opposite of the effect that some people would want, know what I'm saying? Like, that would... no. My brain and my body do not work like that at all. Google Search is introducing AI Mode in Australia. What does that even mean, Google Search?
Speaker 3: Well, it means that you can do more complicated searches with a much broader text string, and you can do it via input with text, images or voice. So I might say, Tiff and I want to do a chocolate tour of Melbourne in the CBD that takes three hours, and we want to do it on a Saturday, starting at five o'clock. So what do we do?
Speaker 1: You've thought about that prompt, haven't you? You've definitely thought about that.
Speaker 5: I'm in, Patrick, if that's a real thing.
Speaker 3: Sure, I'm into it. So you can do more complicated text strings of information and it will be able to handle that, so it's not just answer one question, then you do another query, then it answers another question, then another query. It will make the conversation a lot more human-like in terms of the way that we communicate, and it will make those searches a lot better. You know, you might say, I want to go on holiday, I'm thinking of flying to Queensland, what's the best time of the year to do that? I don't want the weather to be hotter than thirty degrees. And those sorts of conversations will allow the integrated AI to do a more comprehensive search, drawing on lots and lots of different resources, rather than just going to one resource or website. It would then be able to look at a whole lot of things before it responded.
Speaker 1: So it's, I mean, kind of just another AI, really.
Speaker 3: Right. Yeah, it is, but it's an AI in a specific sense, because it's drawing on real-time information. So the large language models we think about are already resourced. So if you're getting a picture drawn for you of an elephant that's wearing a top hat and riding a unicycle, well, then it's going to reference stuff that it's learnt previously, whereas this is using real-time information, because it might need flight data, it might be looking at current weather, that sort of stuff. So it is more real-time than what the conventional AI models are, although ChatGPT and others are incorporating real-time information, so it is a lot better.
Speaker 1: Well, the next one doesn't seem like a good idea.
Speaker 2: Creators are using AI to prank loved ones with fake homeless intruders?
Speaker 3: I saw this, and it was so disturbing to see it.
Speaker 3: There's so many stupid searches and stupid things this is being used for. And what it was, was this girl, or young woman, pranked her father by taking photos in her house. So initially she has this bearded homeless guy standing at the front door, and she takes a photo.
Speaker 1: So she takes a...
Speaker 3: Photo, and the AI superimposes this what appears to be a homeless man standing at the front door. She texts her father and says, oh, one of your friends just rocked up, I've let him in. And then she shows a picture of her kitchen and there's the same bloke having a coffee, then the bloke's sitting on the couch. And so she's making up this whole scenario and sending it to her father, and of course he's freaking out. He's saying, I don't know who this is, you know, what does he want, why did you let him inside the house? And of course, I mean, the problem with a lot of this stuff is, has it been sensationalized, and is it all set up, or is it actually a real thing? I mean, the technology certainly is there. You can superimpose someone into a shot now, and in fact, I didn't put it into our articles, but I find it really disturbing that you can now, in real time, use AI, I think Google's rolling it out on some of the Pixel phones and Android phones, where you can change the vision or the scene that you're looking at and incorporate things that are not even there into the photo you're about to take. That freaks me out as well, as someone who loves capturing the moment in real time. You know, if I'm walking or hiking and I see a spider web that's got dew on it and the sunlight's hitting it in the right way, I'll take a photo, or a beautiful sunrise or sunset, and those moments are fleeting. You know, the cloud in the sky that you look at now you'll never see again in that shape, in that design.
Speaker 3: You know, those moments are beautifully captured if you hit it at the right time. But suddenly, if you're using AI... and in this case, it's a really terrible way to prank someone. So it's just a really weird TikTok trend at the moment, and it does my head in. Sometimes I overthink it. What's your thoughts, Craig?
Speaker 2: And yeah, I just think that we need to be careful, like, with all of this, that people are... You know, there's so much now that comes across even my social media feed where the AI is so good, as in it looks so...
Speaker 1: Real.
Speaker 2: That you spend half your life going, oh, is that actually a dog doing this, or is this somebody who created an AI that looks like a dog doing that, or whatever the thing is. And I think that's... I don't know how that's going to unfold moving forward.
Speaker 1: But also, you know, I mean, what that girl was...
Speaker 2: Doing with her dad and the, in inverted commas, homeless guy just seems to me like a silly idea to start with. But when you think of more kind of, you know, subversive, kind of unethical applications of this, you know, people moving forward are just trying to figure out, who do I trust and what do I trust online? Because I think that's one of the biggest issues. Now you don't know what is real, you don't know what is bullshit, you don't know what's fake, and it's like... it's difficult, I think. Also, you know, like, I'm sixty as fuck. I don't know, maybe the twelve-year-olds do, maybe Tiff does, maybe you do, but there's a huge percentage of people that aren't particularly tech savvy, and, you know, then at the far end of the spectrum, my mum and dad, who get phone calls from people who want them to do this and touch, you know, whatever. Yeah. So it's a slippery slope, and I hope it gets easier, not harder.
Speaker 3: One of the big concerns, and this is happening right now, is state operators using AI.
Speaker 3: So we're talking about, you know, government-backed hackers and people using AI to try to influence elections. We've seen it happen in the United States, we've seen it happen overseas, and even misrepresentations of what's going on in Ukraine. And, you know, the problem is it's getting so accurate. As you pointed out, it's hard to tell what's real and what isn't real. And that's where it's almost like, gosh, I wish there was some sort of AI filter to disallow that. I think we've descended into a rabbit hole that's exceptionally frightening if it's getting used in that way. It's one thing to prank your dad with a homeless bloke sitting on your couch; it's another thing entirely to try to swing public opinion or cause hate because of something that's being politically manipulated to try to, you know, misrepresent the opposition or misrepresent the sitting government. Those sorts of insurgencies can be exceptionally frightening, and, you know, it's really worrying because these tools are now so freely available. That's the other thing. You know, Sam Altman's talking about, great, now you can talk sex stuff with your AI chatbot, and that's going to work for some people but not for everybody. But where does that take us, with these tools getting better and better and better? You know, the genie is well and truly out of the bottle.
Speaker 2: And you think about... like, part of me is like, okay, so somebody who's living on their own, who doesn't have a partner, who doesn't have intimacy, who doesn't feel that emotional connection, who doesn't feel loved or seen or valued, and then they have this resource where maybe they can get some of that, or feel like they get some of that. So I have compassion and empathy there. So from that perspective, the mind side, I get it; the emotion side, I get it; or the human behavior side, I get that. And I think maybe that even serves a positive purpose for some.
Speaker 2: But then on the other side, like devil's advocate, I go, okay, so now, you know, people are potentially building these more and more intimate relationships with this thing that's not real. It's not a human, it doesn't actually have any emotions, it doesn't actually care about you. It's a program. It's a program that's pushing your buttons to make you feel a certain way. And then what happens if, somehow or for some reason, that technology disappears? You know, now we've got people addicted to technologies and addicted to these technological-biological interface relationships.
Speaker 1: Do you like that term, Patrick?
Speaker 2: And I just see probably more problems than solutions. But I'm just an old guy trying to fucking figure it out in real time.
Speaker 3: Okay, if we went back in time six hundred years and I took you to Sparta, right, and I...
Speaker 2: Of course you picked Sparta. Yeah, the fucking... I know why you picked Sparta. Just a whole lot of men in not many clothes. Yeah, but go on. We could have picked any other time in history.
Speaker 3: No, no, I have a specific reason.
Speaker 2: I don't think it was six hundred years. I think it was like two thousand, whatever. You know, it wasn't the fifteen hundreds.
Speaker 3: Bro, come on, it's a good story. All right, come on, let's hear it.
Speaker 4: I think of the movie, that's where six hundred got into my head. Three hundred, whatever, it was just twice as many things.
Speaker 2: What do you mean, whatever? If you're going to say something, fucking make sense.
Speaker 3: I watched it twice, so that made it six hundred, didn't it? Sorry. Anyway...
Speaker 2: You know it's not just us two listening. You know, you've got to try and make sense to the audience.
Speaker 3: Scotty? Scotty... wait a minute, Scotty jumped off five minutes ago. Well, I know he's worried that I'll hound him on social media now. So, no, no, what I'm getting at is, right...
Speaker 3: We take you back in time to Sparta, right, and, like you did at the start of the show, you rip off your shirt and show your ripped abs. And he did, so he did. He lifted his shirt and showed off his abs, which were very impressive, I must say.
Speaker 1: But I did not rip off my shirt.
Speaker 3: Well, it wasn't quite ripping off your shirt, but did you expose yourself?
Speaker 4: Come on, didn't you?
Speaker 1: Fucking hell, now this is getting worse.
Speaker 5: I'm staying out of it.
Speaker 3: Thanks, Tiff. Anyway, can I please finish? My god.
Speaker 1: Could you get interesting?
Speaker 2: Because I'm nodding off with everyone else.
Speaker 3: It's a good idea, everyone.
Speaker 1: I'm sorry about today. We'll do better next time.
Speaker 2: Patrick's not back in a week as previously advertised, but he'll be back in two weeks.
Speaker 3: Thank god, everyone is saying. So anyway, this whole concept sounded good in my head when I first started; now, ten minutes later, it's not as good, but I'm going to say it anyway.
Speaker 4: Well, yeah, come on, we take you back to Sparta.
Speaker 3: You rip off your top, and then mister Spartan bloke rips off his top, or probably hasn't got a top on to start with. And he looks at you and he says, how many people have you killed to be the physical man that you are? And you say, man, I use a Smith machine and dumbbells. That's not authentic at all. It's like, well, who's the real man here? Who's the real warrior?
Speaker 4: And I'm saying... so, I know it's a...
Speaker 3: Really bad analogy.
Speaker 2: Hell, this is the worst analogy of all time. Do you want to rethink this? Because everybody's just gone to get a cup of tea, and fucking... wow. Can I just say, no elephant stamp for you today, like, nothing.
Speaker 3: The poor woman who I recently read an article about, who had very severe facial disfigurement, had formed a relationship with an AI chatbot, and to her it was real. So, unlike the Spartan man who had been out there, and the real person Craig, who's been in a gym and using fake equipment to get that body...
Speaker 4: So what I'm saying is, does it really matter?
Speaker 2: You're saying the only way to get in shape is to kill people, and that's better than a Smith machine and dumbbells? Okay, wow, shout out to the sociopaths and psychopaths. Patrick's starting a new program: How to Kill for Lean Mass.
Speaker 3: I think you're reading into that a little bit too deeply now. But what I'm saying is, using an AI and forming an AI relationship may, for some people, be a real comfort. Because when you first started that story about the poor lonely person who lives alone, I'm thinking, what the fuck, you're talking about all of us. We all live alone, for crying out loud, exactly.
Speaker 2: But I also said it could be good, it could be a positive.
Speaker 1: I'm saying that...
Speaker 2: I'm saying that there can be an upside, especially for people who really feel disconnected and so on. But I'm also saying it ain't all necessarily good.
Speaker 1: I think you've got to look at...
Speaker 2: I mean, rather than taking a side, I think you want to try and look objectively at the pros and cons of everything. Otherwise you're just sitting on one side of the fence and you're not open. Do I think it could be good?
Speaker 1: Yes. Do I think it could be problematic? Yes.
Speaker 2: I think it depends on context, person and situation. But, you know, there are so many variables around, and I'm not trying to be too serious, but it's like, are drugs bad? Well, it depends what drugs, what you're taking, how much, how often. But for a lot of people, drugs will save their life. It's not black and white with these things.
Speaker 2: It depends on the application and the situation. And so for a lot of people, technology is just a tool they use to make money and do good shit. For other people it's an addiction that's fucking up their life and they can't leave their bedroom because they're...
Speaker 1: Playing games eighteen hours a day.
Speaker 2: So I just think it's not that arbitrary.
Speaker 3: Yeah, no, I agree, but I think that there's a lot of hope there for people to be able to try to use technology to fill a void they might have in their lives, if that can be done in a supportive way. But I think the problem is oversight, like anything. You know, we were talking about Google using its more advanced chat feature when you're doing Google searches now. A lot of news websites are reeling at the moment, because when you do a search in Google now, you get an AI overview of the information. So, for example, if I said, find out about the TYP podcast, rather than sending you to the TYP website, it would give you a bit of an overview. So: TYP podcast, hosted by Craig Harper, you know, supported by Tiff Cook and occasional guest Patrick Bonello. Done, okay, so it gives you an overview. It's like, I don't need to know anything else, that's the TYP podcast. Now, news sites have lost as much as sixty percent of their traffic since this has been introduced. The problem is that Google is taking the information from those sites without their permission, but not sending the traffic to the sites. They're basically saying, this is the story, this is the information, I'm going to give it to you in a very summarized form, and then people are not needing to go to the sites. So these news websites are saying, well, you're using our sites to serve up that information. But because they haven't got the traffic, their ad revenue is going to drop, because people aren't going there. And if they've lost sixty percent of their traffic, they've also lost sixty percent of their revenue as well.
Speaker 3: And that's where there's a disconnect in terms of what's being searched and what's being delivered.
Speaker 2: But that website's public domain, though, right? Anyone can go on there and look at it, and then Google goes and looks at it and then tells you what's on there. So yeah, I understand what you're saying, but I mean, the rate of evolution and development and change in that space means somebody is going to be disadvantaged somewhere, and somebody is going to be advantaged, you know. Like, there's going to be pros and cons for everyone.
Speaker 1: Yeah.
Speaker 2: I don't know where that starts and stops. And how the fuck do we regulate...
Speaker 3: This stuff, mate?
Speaker 2: I mean, it's like... because it's such a slippery slope. Anyway, tell me about Tesla owners that want to sue Tesla. Like, there's a joint class action over...
Speaker 1: Full self-driving? What is that about?
Speaker 3: Yep. So, Tesla has this FSD package for people who have bought recent Tesla model cars; FSD is basically full self-driving. And we talked about it a little bit in the last podcast, about how you can use this feature. But you still need to be sitting in the car, you still need to be sitting in the front seat, you still need to have your hands ready to grab the steering wheel if anything was to happen, and it does make mistakes. But the problem is that a lot of people were sold on this idea, and we're going back a few years, when these new model cars were coming out, where they were promised the ability to have full self-driving, and it hasn't been realized. So the issue is that there's a lot of pressure, where people are saying, well, we paid for this advanced driver-assistance feature, you know, this autopilot, the full self-driving, and it hasn't come through, particularly in Australia.
So in Australia there's this class action 632 00:32:13,240 --> 00:32:15,320 Speaker 3: where people are saying, well, we've paid for something we 633 00:32:15,360 --> 00:32:18,040 Speaker 3: didn't get, and that's the problem. You know, if you 634 00:32:18,080 --> 00:32:20,880 Speaker 3: were sold on this idea, on a feature that you 635 00:32:20,920 --> 00:32:24,800 Speaker 3: were promised, that just hasn't been delivered. And Elon Musk 636 00:32:24,880 --> 00:32:26,760 Speaker 3: was kind of saying, well, we can roll it out 637 00:32:26,840 --> 00:32:30,560 Speaker 3: using software, but now they've admitted that there are some cars, 638 00:32:30,600 --> 00:32:32,840 Speaker 3: some of the early Tesla models, that just don't have that 639 00:32:32,880 --> 00:32:36,000 Speaker 3: capability at all. So if you've paid for something you 640 00:32:36,040 --> 00:32:37,800 Speaker 3: think you're, you know, going to get, and you 641 00:32:37,840 --> 00:32:40,800 Speaker 3: haven't got it, well, I think there's really good grounds 642 00:32:40,840 --> 00:32:43,560 Speaker 3: to be able to say, well, I'm sorry, I want 643 00:32:43,560 --> 00:32:46,880 Speaker 3: a refund or a partial refund. And, you know, I get 644 00:32:46,960 --> 00:32:49,239 Speaker 3: that people are pretty peeved off about it. 645 00:32:49,920 --> 00:32:53,400 Speaker 2: I reckon the idea is a little bit redundant in 646 00:32:53,440 --> 00:32:57,920 Speaker 2: Australia, in that, like, even if the three of 647 00:32:58,000 --> 00:33:00,520 Speaker 2: us had full self-driving cars, cars which have the 648 00:33:00,560 --> 00:33:04,520 Speaker 2: capacity where you can essentially, theoretically, sit in the passenger 649 00:33:04,560 --> 00:33:06,360 Speaker 2: seat, program the car, and it gets you where 650 00:33:06,360 --> 00:33:09,320 Speaker 2: you want to go. Because I can't see the Australian 651 00:33:09,360 --> 00:33:12,600 Speaker 2: government giving this a green light anytime in the next decade, 652 00:33:12,680 --> 00:33:13,000 Speaker 2: can you? 653 00:33:14,680 --> 00:33:17,160 Speaker 3: I think maybe in the next decade, but at least 654 00:33:17,160 --> 00:33:20,560 Speaker 3: a decade away. It's getting close. I mean, it's getting really, 655 00:33:20,600 --> 00:33:23,440 Speaker 3: really, really close. There's some really great technology, and when 656 00:33:23,440 --> 00:33:25,520 Speaker 3: you think of it, I mean, people tend to highlight 657 00:33:25,680 --> 00:33:27,080 Speaker 3: the accidents and the little 658 00:33:26,840 --> 00:33:29,200 Speaker 4: things that go wrong, but if you think of it 659 00:33:29,240 --> 00:33:30,320 Speaker 4: in terms 660 00:33:29,960 --> 00:33:33,280 Speaker 3: of a whole lot of self-driving cars on the road, 661 00:33:33,600 --> 00:33:35,760 Speaker 3: their reaction time is always going to be a lot 662 00:33:35,760 --> 00:33:37,480 Speaker 3: better than human reaction time. 663 00:33:37,800 --> 00:33:39,720 Speaker 1: Oh, one hundred percent. 664 00:33:39,840 --> 00:33:42,560 Speaker 2: It's not often we agree, but say you had 665 00:33:42,600 --> 00:33:46,959 Speaker 2: a thousand self-driving cars and a thousand Craig-driven 666 00:33:47,080 --> 00:33:50,479 Speaker 2: cars over an extended period of time, right? And I'm 667 00:33:50,520 --> 00:33:52,880 Speaker 2: a pretty good driver, I think. I've never actually had 668 00:33:52,920 --> 00:33:56,560 Speaker 2: an accident.
But like, a bunch of humans in control 669 00:33:56,760 --> 00:34:01,600 Speaker 2: or AI in control, I think statistically there are way fewer accidents. 670 00:34:01,640 --> 00:34:04,400 Speaker 2: That doesn't mean there can't be accidents, but I think 671 00:34:04,440 --> 00:34:08,160 Speaker 2: statistically you'd find the humans have more error than the computers. 672 00:34:08,440 --> 00:34:11,920 Speaker 3: Yeah, and also because we're subject to getting tired. You know, 673 00:34:12,239 --> 00:34:15,440 Speaker 3: I don't like to drive late at night, purely because 674 00:34:15,440 --> 00:34:18,520 Speaker 3: my circadian rhythm is that I'm an early morning person 675 00:34:18,680 --> 00:34:21,360 Speaker 3: and I get tired at night. So I know that 676 00:34:21,520 --> 00:34:24,040 Speaker 3: I am a less effective driver at night than I 677 00:34:24,040 --> 00:34:26,520 Speaker 3: would be at five thirty in the morning, when 678 00:34:26,560 --> 00:34:29,440 Speaker 3: I feel like I'm at my most invigorated and wide awake. 679 00:34:29,960 --> 00:34:33,480 Speaker 3: And that impacts people. You know, emotional issues. If you've 680 00:34:33,520 --> 00:34:36,400 Speaker 3: just had an argument with someone and you're feeling angry, 681 00:34:36,440 --> 00:34:38,680 Speaker 3: well, that's going to be reflected in what you're doing 682 00:34:38,719 --> 00:34:41,080 Speaker 3: behind the wheel. How much aggro do you see on 683 00:34:41,120 --> 00:34:43,799 Speaker 3: the roads? I mean, that's another thing as well. You know, 684 00:34:43,880 --> 00:34:46,200 Speaker 3: at this stage, I haven't seen a self-driving car 685 00:34:46,880 --> 00:34:48,360 Speaker 3: experience road rage. 686 00:34:48,960 --> 00:34:51,720 Speaker 2: Well, if you ever see Tiffany road raging, 687 00:34:51,880 --> 00:34:52,920 Speaker 2: just give her a cookie. 688 00:34:53,000 --> 00:34:53,759 Speaker 1: It'll all be good 689 00:34:53,760 --> 00:34:57,719 Speaker 2: in about thirty seconds. Patrick, tell us about how drones 690 00:34:57,760 --> 00:35:00,880 Speaker 2: are being used to combat retail theft. 691 00:35:01,440 --> 00:35:04,759 Speaker 3: Yeah, isn't this interesting? So, surveillance drones, and this is 692 00:35:04,760 --> 00:35:07,800 Speaker 3: a big problem. So it hasn't been rolled out quite yet, 693 00:35:08,239 --> 00:35:12,880 Speaker 3: but the idea is that a lot of retailers are 694 00:35:12,880 --> 00:35:16,719 Speaker 3: struggling with an ever increasing amount of theft. And I 695 00:35:16,800 --> 00:35:19,719 Speaker 3: read a really interesting article a few days ago that 696 00:35:19,960 --> 00:35:24,239 Speaker 3: said theft actually is generational, and some younger people feel it's 697 00:35:24,280 --> 00:35:28,239 Speaker 3: okay to steal. They don't have any sort 698 00:35:28,239 --> 00:35:31,520 Speaker 3: of guilt, because they feel that with a retailer that has 699 00:35:31,600 --> 00:35:35,359 Speaker 3: lots of money, if you flog something, it doesn't really matter, 700 00:35:35,480 --> 00:35:41,160 Speaker 3: whereas Baby Boomers don't. Fucking terrible parenting, seriously. If 701 00:35:41,200 --> 00:35:43,879 Speaker 3: you think stealing's cool, your parents are shit. 702 00:35:44,400 --> 00:35:46,680 Speaker 2: Well, that's what came out, a lot of young... 703 00:35:46,800 --> 00:35:48,760 Speaker 2: I don't know. I shouldn't, so I'll probably get in trouble. 704 00:35:48,760 --> 00:35:52,520 Speaker 2: But I'm like, oh my god, what child thinks stealing's fine?
705 00:35:52,680 --> 00:35:54,759 Speaker 2: Feels generational? There's a look. 706 00:35:54,800 --> 00:35:59,120 Speaker 3: I can't quite say it. I don't know, that's what 707 00:35:59,160 --> 00:36:01,239 Speaker 3: the article said. I'm not saying yay or nay, but 708 00:36:01,520 --> 00:36:04,880 Speaker 3: in this instance, right, so what do you think of it? 709 00:36:04,920 --> 00:36:08,279 Speaker 3: I mean, I guess if you are able to have 710 00:36:08,440 --> 00:36:11,799 Speaker 3: these drones that could follow a car or follow a 711 00:36:11,800 --> 00:36:16,279 Speaker 3: person once they've stolen from a retailer and track them down, 712 00:36:16,360 --> 00:36:20,000 Speaker 3: notify police and say, yeah, we're tracking this person because 713 00:36:20,080 --> 00:36:22,719 Speaker 3: they're suspected of theft. Because of course it's suspected 714 00:36:22,800 --> 00:36:26,359 Speaker 3: theft until proven. I don't know, what do you reckon, Tiff? 715 00:36:26,400 --> 00:36:28,040 Speaker 3: What do you reckon about a drone? You walk out 716 00:36:28,040 --> 00:36:30,320 Speaker 3: of a department store, you've just walked out of Myer, 717 00:36:30,600 --> 00:36:32,759 Speaker 3: and suddenly there's a buzzing above you because you look 718 00:36:32,800 --> 00:36:35,160 Speaker 3: a bit sus. Probably fair? 719 00:36:35,000 --> 00:36:39,120 Speaker 5: Cool. Yeah, cool, you're under surveillance when you're in there. 720 00:36:40,120 --> 00:36:41,960 Speaker 5: If you've done something wrong, so be it. 721 00:36:42,560 --> 00:36:44,600 Speaker 1: Are you going to... have you ever stolen anything, Patrick? 722 00:36:45,239 --> 00:36:47,480 Speaker 3: Have I ever stolen anything? No, I haven't stolen anything, 723 00:36:47,520 --> 00:36:51,640 Speaker 3: but I did... I did knowingly... I knew that 724 00:36:51,719 --> 00:36:54,319 Speaker 3: an item was priced at a lower price than what 725 00:36:54,360 --> 00:36:56,719 Speaker 3: it should have been. So there was a toy truck, and I 726 00:36:56,800 --> 00:36:59,719 Speaker 3: might have been twelve or thirteen, and all of the... 727 00:37:00,120 --> 00:37:04,920 Speaker 3: all the other trucks were, I will say, 728 00:37:04,920 --> 00:37:07,319 Speaker 3: five bucks, but this one had like two 729 00:37:07,360 --> 00:37:09,839 Speaker 3: dollars on it or something, and I knowingly took it 730 00:37:09,840 --> 00:37:12,000 Speaker 3: to the register and bought it, knowing that the price 731 00:37:12,120 --> 00:37:16,000 Speaker 3: was wrong. Is that stealing? And I didn't change the ticket. 732 00:37:16,080 --> 00:37:17,800 Speaker 3: It wasn't me. I didn't take the sticker off and 733 00:37:17,840 --> 00:37:19,040 Speaker 3: put a cheaper sticker on it. 734 00:37:19,040 --> 00:37:23,359 Speaker 2: It's technically not, I guess, but it's... but that's like... well, 735 00:37:23,480 --> 00:37:26,839 Speaker 2: can I tell you, I almost stole... you still? 736 00:37:29,040 --> 00:37:29,360 Speaker 3: Oh? 737 00:37:29,440 --> 00:37:29,640 Speaker 5: Yeah? 738 00:37:29,800 --> 00:37:30,000 Speaker 1: No.
739 00:37:30,719 --> 00:37:32,799 Speaker 2: I used to have a friend who used to pinch 740 00:37:32,800 --> 00:37:34,640 Speaker 2: stuff when I was a kid, and I'm like, I 741 00:37:34,680 --> 00:37:38,319 Speaker 2: didn't get it, and he's like, just take it. We went, 742 00:37:38,440 --> 00:37:40,359 Speaker 2: I don't know where we were, and I was going 743 00:37:40,440 --> 00:37:43,840 Speaker 2: to steal a packet of Juicy Fruit, right. This is 744 00:37:43,880 --> 00:37:47,920 Speaker 2: my big debut into fucking theft and crime. And he's like, 745 00:37:48,040 --> 00:37:50,560 Speaker 2: just put it in your pocket, like, da-da-da. And 746 00:37:50,600 --> 00:37:53,200 Speaker 2: I'm like, I really didn't want to, but I wanted 747 00:37:53,280 --> 00:37:55,080 Speaker 2: him to like me because I thought he was cool. 748 00:37:55,120 --> 00:37:56,920 Speaker 2: I don't know what the fuck. So anyway, I get 749 00:37:56,920 --> 00:37:59,120 Speaker 2: this packet of Juicy Fruit, I put it in my pocket. 750 00:37:59,160 --> 00:38:03,399 Speaker 2: This is my big entry into the world of criminality. And 751 00:38:04,280 --> 00:38:06,960 Speaker 2: I get like two metres from the... and I can't. 752 00:38:07,040 --> 00:38:09,080 Speaker 2: I just walk back and I'm like, nah, I can't, 753 00:38:09,120 --> 00:38:11,440 Speaker 2: and I just put it back on the thing, and 754 00:38:11,480 --> 00:38:15,520 Speaker 2: I felt fucking terrible for like two days, even at 755 00:38:15,520 --> 00:38:17,680 Speaker 2: the thought that I was going to steal something and 756 00:38:17,719 --> 00:38:20,200 Speaker 2: I put it in my pocket, even though I took it back. 757 00:38:20,280 --> 00:38:25,040 Speaker 2: I was like ten, right, or eleven. Mary Harper 758 00:38:25,160 --> 00:38:28,399 Speaker 2: will have a fucking... she'll be praying for me all 759 00:38:28,480 --> 00:38:29,520 Speaker 2: day today 760 00:38:29,360 --> 00:38:33,120 Speaker 1: if she hears this. But yeah, so that was my... yeah, 761 00:38:33,120 --> 00:38:33,839 Speaker 1: I can't do it. 762 00:38:34,040 --> 00:38:36,799 Speaker 4: It's just guilt, mate. Catholic guilt, that upbringing. That's what 763 00:38:36,880 --> 00:38:37,160 Speaker 4: did it, 764 00:38:37,239 --> 00:38:37,680 Speaker 3: I reckon. 765 00:38:37,760 --> 00:38:40,120 Speaker 1: Well, maybe it's just an internal moral compass. 766 00:38:40,480 --> 00:38:42,920 Speaker 3: Maybe. Can I just say that Tiff had a spanner 767 00:38:43,000 --> 00:38:44,720 Speaker 3: in her pocket and you had Juicy Fruit. 768 00:38:47,160 --> 00:38:50,040 Speaker 2: I don't know what... that's the... Can you tell us 769 00:38:50,040 --> 00:38:56,120 Speaker 2: about why the school phone bans from two years ago are 770 00:38:56,200 --> 00:38:58,920 Speaker 2: either working or not working? So two years ago phones 771 00:38:58,960 --> 00:39:01,120 Speaker 2: were banned in schools in Australia. 772 00:39:01,239 --> 00:39:03,440 Speaker 1: Did that happen? Is it working or not? 773 00:39:04,080 --> 00:39:08,120 Speaker 3: It started off in Victoria, which was great, and according 774 00:39:08,160 --> 00:39:12,520 Speaker 3: to principals and teachers, I mean, a lot of politicians 775 00:39:12,520 --> 00:39:15,399 Speaker 3: and parents believed this was the way to go.
And 776 00:39:15,480 --> 00:39:20,120 Speaker 3: now, after two years, there are some really clear statistics that 777 00:39:20,160 --> 00:39:23,000 Speaker 3: have come out, and teachers are saying that since 778 00:39:23,040 --> 00:39:27,600 Speaker 3: the ban they've seen stronger lesson starts, fewer interruptions, and 779 00:39:27,640 --> 00:39:31,520 Speaker 3: better flow in teaching. And the other thing is device 780 00:39:31,600 --> 00:39:34,520 Speaker 3: driven conflict. Evidently there was... I don't know whether it 781 00:39:34,560 --> 00:39:36,919 Speaker 3: was my iPhone's better than your Pixel phone, I'm not sure 782 00:39:36,920 --> 00:39:40,560 Speaker 3: about that device-driven conflict, that's maybe just social media taunting, 783 00:39:40,600 --> 00:39:45,120 Speaker 3: who knows, but that's fallen as well across recesses and lunches, 784 00:39:45,200 --> 00:39:48,400 Speaker 3: and children are engaging more. So what it means is 785 00:39:48,440 --> 00:39:51,239 Speaker 3: that because they're not sitting in the schoolyard looking at 786 00:39:51,239 --> 00:39:54,560 Speaker 3: a phone, they're forced to actually interact with other students. 787 00:39:54,840 --> 00:39:58,200 Speaker 3: So, resoundingly, and this was a survey of a thousand 788 00:39:58,239 --> 00:40:01,240 Speaker 3: public school principals, this is from the New South Wales 789 00:40:01,280 --> 00:40:05,319 Speaker 3: Department of Education, they basically said that ninety-five percent 790 00:40:05,360 --> 00:40:09,640 Speaker 3: of principals still supported the ban, and that eighty-one 791 00:40:09,680 --> 00:40:13,520 Speaker 3: percent said the ban had improved student learning, and eighty 792 00:40:13,600 --> 00:40:17,880 Speaker 3: six percent said it had improved socialization among students. So I reckon 793 00:40:17,880 --> 00:40:21,160 Speaker 3: that's a big tick of approval, and you almost wonder 794 00:40:21,360 --> 00:40:24,120 Speaker 3: whether that should happen in the workplace. You know, I 795 00:40:24,160 --> 00:40:27,839 Speaker 3: had a new staff member begin with me yesterday, and 796 00:40:28,280 --> 00:40:30,640 Speaker 3: I know she was trying really hard on her first day, 797 00:40:30,719 --> 00:40:33,520 Speaker 3: but her phone was pinging a few times, and you 798 00:40:33,719 --> 00:40:39,120 Speaker 3: just saw that sudden attention totally disappear. And then even 799 00:40:39,160 --> 00:40:41,120 Speaker 3: though she wasn't reaching for her phone, it was the 800 00:40:41,239 --> 00:40:44,200 Speaker 3: ding on the phone. And that's the, I guess, the... 801 00:40:44,960 --> 00:40:47,719 Speaker 3: You know, there are notifications for a reason, because they're 802 00:40:47,920 --> 00:40:50,240 Speaker 3: drawing your attention to the fact that there's something popping 803 00:40:50,320 --> 00:40:51,160 Speaker 3: up on your phone. 804 00:40:51,320 --> 00:40:53,160 Speaker 4: But it is so distracting, isn't it? 805 00:40:53,360 --> 00:40:56,440 Speaker 3: When the phone is on silent all the time, it 806 00:40:56,480 --> 00:40:58,759 Speaker 3: never pings, never makes a sound. And in fact, when 807 00:40:58,760 --> 00:41:00,400 Speaker 3: I go to a meeting, there's a really cool feature 808 00:41:00,400 --> 00:41:01,960 Speaker 3: on a phone where if you put it face down, 809 00:41:02,239 --> 00:41:04,440 Speaker 3: it goes into do not disturb mode. So it's a 810 00:41:04,480 --> 00:41:06,840 Speaker 3: really great way when you're sitting down with someone.
I 811 00:41:06,880 --> 00:41:09,040 Speaker 3: find it's an etiquette thing for me. I place my 812 00:41:09,080 --> 00:41:11,000 Speaker 3: phone face down and it says to the other person, 813 00:41:11,760 --> 00:41:14,239 Speaker 3: you've got my attention, my phone is not going to 814 00:41:14,239 --> 00:41:16,239 Speaker 3: disturb us. And I like doing that when I go 815 00:41:16,280 --> 00:41:18,839 Speaker 3: into a meeting or catch up with someone. 816 00:41:20,600 --> 00:41:22,399 Speaker 1: I've got a little bit of news for you. This 817 00:41:22,480 --> 00:41:24,200 Speaker 1: is kind of pseudo news. 818 00:41:24,239 --> 00:41:28,520 Speaker 2: I don't have all the data, but interestingly, well, I 819 00:41:28,560 --> 00:41:30,680 Speaker 2: mean this is in my space, in your space. So 820 00:41:30,800 --> 00:41:34,239 Speaker 2: one of the things that's been an 821 00:41:34,239 --> 00:41:36,600 Speaker 2: issue over the last year or two, and especially more 822 00:41:36,680 --> 00:41:40,600 Speaker 2: right now than ever, is the use of AI at universities, 823 00:41:40,800 --> 00:41:46,560 Speaker 2: right, because students are using AI to cheat, essentially. I 824 00:41:46,600 --> 00:41:50,480 Speaker 2: mean, we can't, and the universities a little bit have 825 00:41:51,320 --> 00:41:54,440 Speaker 2: put their hands in the air and gone, we're fucked, like, 826 00:41:54,600 --> 00:41:58,040 Speaker 2: because you can't combat it, because the AI is so 827 00:41:58,120 --> 00:42:01,040 Speaker 2: good that it can trick the... you know, there are programs 828 00:42:01,080 --> 00:42:05,200 Speaker 2: that kind of... oh, what's it called? I think it's 829 00:42:05,200 --> 00:42:09,920 Speaker 2: called Turnitin, where they can run your paper 830 00:42:10,000 --> 00:42:12,359 Speaker 2: through this program to see how much of it's been 831 00:42:12,400 --> 00:42:14,920 Speaker 2: plagiarized or cut and pasted or cheated or blah blah 832 00:42:14,920 --> 00:42:18,160 Speaker 2: blah blah blah. But now, what a lot of the 833 00:42:18,239 --> 00:42:20,880 Speaker 2: universities are doing, they're going, all right, well, AI is 834 00:42:21,000 --> 00:42:23,680 Speaker 2: not going away, ChatGPT and the like are not 835 00:42:23,800 --> 00:42:27,440 Speaker 2: going away, how do we integrate it into the programs 836 00:42:27,480 --> 00:42:31,520 Speaker 2: and the courses so that it's something of a level 837 00:42:31,640 --> 00:42:35,120 Speaker 2: playing field, but at the same time the students are 838 00:42:35,160 --> 00:42:41,319 Speaker 2: still actually thinking and solving problems and doing work and 839 00:42:41,400 --> 00:42:45,200 Speaker 2: learning? Because the whole idea that, oh, we're just not 840 00:42:45,280 --> 00:42:49,000 Speaker 2: going to let them use AI is just not an 841 00:42:49,000 --> 00:42:52,000 Speaker 2: option anymore, because it's gone beyond that. So it'll be 842 00:42:52,040 --> 00:42:57,959 Speaker 2: interesting to see in academia over the next two, five, 843 00:42:58,120 --> 00:43:01,640 Speaker 2: ten years what happens.
Because the truth is now that 844 00:43:01,719 --> 00:43:05,960 Speaker 2: people can produce an academic paper, or a paper that 845 00:43:06,040 --> 00:43:10,000 Speaker 2: looks like an academic paper, in two hours that would 846 00:43:10,040 --> 00:43:13,799 Speaker 2: have taken someone six months, you know, because it can 847 00:43:13,880 --> 00:43:16,759 Speaker 2: research for you, it can do citations for you, it 848 00:43:16,800 --> 00:43:19,560 Speaker 2: can reference for you, it can lay it all out 849 00:43:19,600 --> 00:43:24,760 Speaker 2: for you. It's bloody... it's quite disconcerting for somebody who's 850 00:43:24,880 --> 00:43:28,920 Speaker 2: just spent six years doing their PhD and has nearly finished. 851 00:43:29,800 --> 00:43:31,120 Speaker 3: I don't know. Do you lock people up in a 852 00:43:31,160 --> 00:43:33,400 Speaker 3: room with a fountain pen and make them write stuff? 853 00:43:34,120 --> 00:43:35,799 Speaker 3: It's a hard one. I don't think there's a really 854 00:43:35,800 --> 00:43:38,040 Speaker 3: good answer. I think that the logical way is what 855 00:43:38,080 --> 00:43:40,719 Speaker 3: they're doing: trying to integrate it and say these 856 00:43:40,760 --> 00:43:42,880 Speaker 3: are the AI tools that you're allowed to use. And 857 00:43:44,920 --> 00:43:46,440 Speaker 3: as you pointed out, and I think you said it 858 00:43:46,440 --> 00:43:48,960 Speaker 3: earlier in the show too, that it's getting harder and 859 00:43:49,000 --> 00:43:51,920 Speaker 3: harder and harder to pick what is AI-generated, and 860 00:43:51,960 --> 00:43:56,279 Speaker 3: that's just making life... where you're second-guessing everything you're seeing. 861 00:43:56,520 --> 00:43:58,759 Speaker 3: I don't use social media, but I was on there 862 00:43:58,840 --> 00:44:02,960 Speaker 3: recently setting up something for a client, and this clip 863 00:44:03,040 --> 00:44:06,560 Speaker 3: popped up of a woman in a chicken shop, like it 864 00:44:06,640 --> 00:44:08,879 Speaker 3: was like a KFC store or something. She was quite 865 00:44:08,920 --> 00:44:11,360 Speaker 3: a large looking lady, and she went to the counter 866 00:44:11,440 --> 00:44:14,560 Speaker 3: to order the food, and then the floor collapsed underneath 867 00:44:14,560 --> 00:44:17,120 Speaker 3: her because she was so large, and it was AI. 868 00:44:17,360 --> 00:44:21,520 Speaker 3: But before that happened, it looked so real. It looked 869 00:44:21,520 --> 00:44:23,359 Speaker 3: so real. I'm thinking, what is this picture? Why am 870 00:44:23,360 --> 00:44:25,840 Speaker 3: I getting this video served up? And it was just 871 00:44:25,960 --> 00:44:30,000 Speaker 3: this piece of AI slop. But the problem is that 872 00:44:30,040 --> 00:44:32,680 Speaker 3: it was so realistic. I didn't pick it until, of 873 00:44:32,719 --> 00:44:34,719 Speaker 3: course, the floor collapsed and she fell into it, and 874 00:44:34,760 --> 00:44:37,040 Speaker 3: even then it looked amazingly real. 875 00:44:37,400 --> 00:44:39,920 Speaker 4: But you knew it then, it was so unfeasible. 876 00:44:39,960 --> 00:44:42,880 Speaker 3: It was just ridiculous. But up to that point you didn't 877 00:44:42,920 --> 00:44:47,360 Speaker 3: realize it was something that was just AI, totally AI-created. 878 00:44:47,920 --> 00:44:50,400 Speaker 2: Well, I'm a little bit disappointed in you, because I 879 00:44:50,400 --> 00:44:51,880 Speaker 2: didn't want to tell you this till the end of 880 00:44:51,880 --> 00:44:54,880 Speaker 2: the show.
But I'm actually in the city this morning 881 00:44:54,960 --> 00:44:59,000 Speaker 2: doing a presentation, so you've been chatting with AI Craig 882 00:44:59,160 --> 00:45:01,719 Speaker 2: all morning and you didn't even pick it. 883 00:45:01,719 --> 00:45:05,960 Speaker 3: It's so stupid. It's a dumb thing. It is so accurate, 884 00:45:05,960 --> 00:45:06,839 Speaker 3: isn't it, Tiff? What 885 00:45:06,800 --> 00:45:08,640 Speaker 1: are your... yeah? 886 00:45:08,760 --> 00:45:12,440 Speaker 2: Yeah, and how real does the huge nose look? Yeah, well, 887 00:45:12,440 --> 00:45:13,759 Speaker 2: that's hurtful, both of you. 888 00:45:14,560 --> 00:45:16,759 Speaker 4: I'm going into the city today too, Craigo. 889 00:45:16,520 --> 00:45:20,839 Speaker 2: I'm actually in there right now doing a gig. Tell 890 00:45:20,880 --> 00:45:24,640 Speaker 2: me what the Pope has just pooh-poohed. I feel 891 00:45:24,719 --> 00:45:27,640 Speaker 2: like... I'm so glad you could weave something Catholic into 892 00:45:27,719 --> 00:45:29,160 Speaker 2: our technology discussion. 893 00:45:29,440 --> 00:45:32,640 Speaker 3: Look, I think that's important for you as a... well, actually, 894 00:45:32,640 --> 00:45:34,160 Speaker 3: and me, now that I think of it, because I 895 00:45:34,239 --> 00:45:37,200 Speaker 3: tried to get excommunicated from the church. I actually wrote 896 00:45:37,280 --> 00:45:41,360 Speaker 3: a letter in pen, in fountain pen, to the Archbishop 897 00:45:41,360 --> 00:45:44,120 Speaker 3: of Melbourne, but he ignored me. I was very disappointed. 898 00:45:44,280 --> 00:45:46,759 Speaker 3: I'm still reeling from that. Time to send another letter. 899 00:45:46,760 --> 00:45:48,319 Speaker 2: I don't think you need to, you know. I think you can just 900 00:45:48,360 --> 00:45:49,840 Speaker 2: go, I'm not on the team anymore, 901 00:45:49,840 --> 00:45:50,399 Speaker 1: see you later. 902 00:45:50,760 --> 00:45:51,959 Speaker 4: Yeah, but now you've got 903 00:45:51,880 --> 00:45:53,560 Speaker 3: to get the letter. You've got to get the official 904 00:45:53,640 --> 00:45:55,840 Speaker 3: letter of excommunication, which a lot of Catholics out there 905 00:45:55,880 --> 00:45:58,080 Speaker 3: are reeling about, because it's a real bad thing to 906 00:45:58,120 --> 00:46:01,120 Speaker 3: get excommunicated from the church. But I did actually actively 907 00:46:01,200 --> 00:46:04,840 Speaker 3: try to, because I feel that when they do a census, 908 00:46:04,920 --> 00:46:08,200 Speaker 3: I want all my parish records to be expunged, so 909 00:46:08,239 --> 00:46:10,239 Speaker 3: they can't count me as a Catholic, because they tend 910 00:46:10,239 --> 00:46:11,960 Speaker 3: to like to do that and say we've got this 911 00:46:12,000 --> 00:46:14,360 Speaker 3: many Catholics in Australia. It's like, yeah, but I didn't 912 00:46:14,560 --> 00:46:17,200 Speaker 3: vote to be here, so I'd like not to be 913 00:46:17,280 --> 00:46:20,400 Speaker 3: in the group. I love my friends who are Catholics, 914 00:46:20,440 --> 00:46:22,759 Speaker 3: but I just don't want to be one. Anyway, sorry, 915 00:46:22,920 --> 00:46:24,480 Speaker 3: we were talking about the Pope, weren't we? That took 916 00:46:24,520 --> 00:46:26,719 Speaker 3: me down... 917 00:46:26,680 --> 00:46:32,840 Speaker 2: Talk about deer in the headlights, talk about a dog with 918 00:46:32,960 --> 00:46:33,640 Speaker 2: three dicks. 919 00:46:33,920 --> 00:46:36,360 Speaker 1: What? Concentrate. What, ADHD?
920 00:46:38,400 --> 00:46:41,040 Speaker 3: Oh dear. Anyway, so the Pope. So yeah, the Pope's 921 00:46:41,040 --> 00:46:44,920 Speaker 3: come out and said... he's condemned, this is very specific, 922 00:46:44,960 --> 00:46:51,600 Speaker 3: he's condemned clickbait, right. Like, the Pope, right, the head 923 00:46:51,600 --> 00:46:54,799 Speaker 3: of the Catholic Church, you know, el Papa, Pope Leo. 924 00:46:55,200 --> 00:46:57,759 Speaker 3: That's Pope Leo the fourteenth, is it? I think it's 925 00:46:57,760 --> 00:47:00,560 Speaker 3: the fourteenth. I'm doubting myself now. I'm pretty sure he's the fourteenth. 926 00:47:00,600 --> 00:47:03,600 Speaker 3: Someone's going to correct me. So he's condemned clickbait, saying 927 00:47:03,640 --> 00:47:07,200 Speaker 3: that it's a degrading part of journalism. So 928 00:47:07,239 --> 00:47:10,080 Speaker 3: he had journalists there. He was giving a talk to 929 00:47:10,160 --> 00:47:13,239 Speaker 3: journalists, including some Australian journalists, and he's patting them on 930 00:47:13,280 --> 00:47:14,719 Speaker 3: the back and then he gives them a whack over 931 00:47:14,719 --> 00:47:17,760 Speaker 3: the back of their head, and this is, well, metaphorically, because 932 00:47:17,800 --> 00:47:22,160 Speaker 3: he says that clickbait is degrading. He said that, you know, 933 00:47:22,200 --> 00:47:25,960 Speaker 3: he's a proactive supporter of journalism, but he feels that 934 00:47:26,000 --> 00:47:29,799 Speaker 3: communication needs to be freed from the misguided thinking that 935 00:47:29,960 --> 00:47:33,640 Speaker 3: corrupts it, from unfair competition and the degrading practice of 936 00:47:33,760 --> 00:47:39,359 Speaker 3: so-called clickbait. So he says clickbait is a sensationalist, 937 00:47:39,440 --> 00:47:43,400 Speaker 3: hyperbolic headline that entices online readers to click into a 938 00:47:43,480 --> 00:47:48,160 Speaker 3: story by omitting key information. That's it. That's the Pope. 939 00:47:48,760 --> 00:47:49,280 Speaker 1: Exactly. 940 00:47:49,560 --> 00:47:52,400 Speaker 2: He's exactly right, and it's not going away, Pope. So, 941 00:47:52,840 --> 00:47:56,239 Speaker 2: I mean, it's exactly as you just described, and that's not going 942 00:47:56,320 --> 00:47:59,880 Speaker 2: to stop. I mean, also, why have we had fourteen 943 00:48:00,040 --> 00:48:02,719 Speaker 2: Pope Leos and not one Pope Darren or Scott or 944 00:48:02,760 --> 00:48:07,800 Speaker 2: Brian, or one Pope Patrick? I wonder... Patrick 945 00:48:07,840 --> 00:48:10,040 Speaker 2: does almost sound papal, doesn't it? 946 00:48:10,040 --> 00:48:11,200 Speaker 3: It's kind of wonderful. 947 00:48:11,600 --> 00:48:14,720 Speaker 2: It sounds... it's quite Catholic. Let's do, let's do another 948 00:48:14,760 --> 00:48:20,040 Speaker 2: one or two. Let's talk about Aussies having their data 949 00:48:20,120 --> 00:48:24,200 Speaker 2: stolen and, like, how that's happening. 950 00:48:24,880 --> 00:48:28,719 Speaker 3: Yeah, it is a real problem. So in Australia 951 00:48:28,760 --> 00:48:32,400 Speaker 3: we have a thing called the Australian Signals Directorate. Okay, 952 00:48:32,680 --> 00:48:36,319 Speaker 3: it sounds pretty, kind of, pretty full on, doesn't it, really? 953 00:48:36,400 --> 00:48:39,080 Speaker 3: I think it's a great name anyway.
So last year 954 00:48:39,200 --> 00:48:43,560 Speaker 3: individual victims of cybercrime lost on average thirty-three thousand dollars, 955 00:48:43,719 --> 00:48:47,000 Speaker 3: a big increase, an eight percent increase. And the problem 956 00:48:47,120 --> 00:48:51,200 Speaker 3: is that this kind of body that looks into this stuff, 957 00:48:51,400 --> 00:48:54,560 Speaker 3: the Signals Directorate, is saying what we need to 958 00:48:54,600 --> 00:48:58,880 Speaker 3: do is make sure people aren't just using usernames and passwords, 959 00:48:58,880 --> 00:49:01,920 Speaker 3: that it's become redundant, that this is one of the 960 00:49:01,920 --> 00:49:07,719 Speaker 3: big problems. Abigail Bradshaw is the ASD Director-General, 961 00:49:07,800 --> 00:49:10,759 Speaker 3: and she was speaking on SBS, 962 00:49:11,280 --> 00:49:14,560 Speaker 3: and she basically said, we've got to move away from passwords. 963 00:49:14,600 --> 00:49:18,359 Speaker 3: We need multi-factor authentication, because that's how a lot 964 00:49:18,360 --> 00:49:22,040 Speaker 3: of people are being scammed. So the thing is, 965 00:49:22,560 --> 00:49:26,600 Speaker 3: it's now not hackers that are hacking in. It's really 966 00:49:26,680 --> 00:49:31,080 Speaker 3: clever people fooling people into giving up their usernames and passwords, 967 00:49:31,440 --> 00:49:35,360 Speaker 3: so they're not hacking into corporations. They're calling, and I 968 00:49:35,440 --> 00:49:38,160 Speaker 3: think of the Qantas data leak recently, because of course this 969 00:49:38,200 --> 00:49:41,640 Speaker 3: is massive, eight million Qantas customers, all of them, you know, 970 00:49:41,680 --> 00:49:45,279 Speaker 3: because this played out about a week ago, where there 971 00:49:45,320 --> 00:49:49,120 Speaker 3: was a hack into a system, I think Salesforce 972 00:49:49,160 --> 00:49:51,360 Speaker 3: was the program that had all the data in it, 973 00:49:51,680 --> 00:49:55,719 Speaker 3: and someone called a call center overseas and convinced them 974 00:49:55,760 --> 00:49:58,960 Speaker 3: that they were a technician and needed access, and that's 975 00:49:59,000 --> 00:50:00,920 Speaker 3: how the hackers got in. They didn't actually hack in, 976 00:50:01,200 --> 00:50:04,120 Speaker 3: they sweet-talked their way in, and they managed to 977 00:50:04,120 --> 00:50:06,560 Speaker 3: then get access to all this data. They tried to 978 00:50:07,840 --> 00:50:10,560 Speaker 3: extort money from Qantas, saying, if you don't give 979 00:50:10,640 --> 00:50:13,359 Speaker 3: us X amount of dollars, we're going to release this 980 00:50:13,560 --> 00:50:17,040 Speaker 3: information on the dark web. Qantas, like, did the right thing, 981 00:50:17,160 --> 00:50:19,960 Speaker 3: and that's the problem, because governments are saying, 982 00:50:20,280 --> 00:50:24,080 Speaker 3: don't give money to hackers, don't give 983 00:50:24,080 --> 00:50:27,120 Speaker 3: the money to people who are extorting money. Don't 984 00:50:27,120 --> 00:50:29,600 Speaker 3: give it to them because it just encourages them.
So then 985 00:50:29,880 --> 00:50:32,840 Speaker 3: they released this information on the dark web, and potentially 986 00:50:32,920 --> 00:50:36,520 Speaker 3: now, if you're a Qantas... you know, if you're part 987 00:50:36,560 --> 00:50:39,560 Speaker 3: of the Qantas database, then potentially your username, your 988 00:50:39,560 --> 00:50:43,759 Speaker 3: password, your address, information related to you for identity 989 00:50:43,840 --> 00:50:46,080 Speaker 3: theft, is now out there, and this 990 00:50:46,160 --> 00:50:48,799 Speaker 3: is the big concern for the Signals Directorate, that this 991 00:50:48,960 --> 00:50:51,279 Speaker 3: is now one of the most common ways. When I 992 00:50:51,320 --> 00:50:54,440 Speaker 3: log into my bank account, I have a secret code 993 00:50:54,440 --> 00:50:57,320 Speaker 3: that gets sent to my phone, and that's multi-factor 994 00:50:57,360 --> 00:51:02,120 Speaker 3: authentication. And so really... and even 995 00:51:02,160 --> 00:51:04,520 Speaker 3: our websites, when we do a website for a client, 996 00:51:04,960 --> 00:51:08,480 Speaker 3: we give them multi-factor authentication to log into their sites. 997 00:51:08,960 --> 00:51:13,000 Speaker 3: So I guess as consumers we need to be insisting, 998 00:51:13,440 --> 00:51:16,080 Speaker 3: with whatever service we use, whether we're logging into a 999 00:51:16,120 --> 00:51:20,080 Speaker 3: website and putting in our credit card information because we want 1000 00:51:20,080 --> 00:51:23,400 Speaker 3: to buy something, we need to push that and say, 1001 00:51:23,560 --> 00:51:26,239 Speaker 3: well, wait a minute, if you don't have multi-factor authentication, 1002 00:51:27,080 --> 00:51:29,080 Speaker 3: I'm not going to give you my details. So there 1003 00:51:29,120 --> 00:51:31,880 Speaker 3: should be more pressure in that sense, and we should 1004 00:51:31,960 --> 00:51:35,920 Speaker 3: be more discerning about what we do online based on 1005 00:51:35,960 --> 00:51:38,000 Speaker 3: the level of security. And if it's just a 1006 00:51:38,080 --> 00:51:40,480 Speaker 3: username and password, then maybe that's just not good 1007 00:51:40,560 --> 00:51:41,760 Speaker 3: enough in this day and age. 1008 00:51:42,960 --> 00:51:46,960 Speaker 2: Love it. Last one, last one. So Elon Musk, as people 1009 00:51:47,080 --> 00:51:50,400 Speaker 2: know, created a thing quite a while ago now called Neuralink, 1010 00:51:50,440 --> 00:51:53,399 Speaker 2: which is essentially an implant. It's a brain chip, goes 1011 00:51:53,440 --> 00:51:57,080 Speaker 2: inside your skull, sits just on the surface of 1012 00:51:57,120 --> 00:52:01,200 Speaker 2: your brain, which is my understanding. Essentially it gives your brain 1013 00:52:01,400 --> 00:52:04,719 Speaker 2: access to the Internet, which is... so you can, you 1014 00:52:04,760 --> 00:52:10,680 Speaker 2: can google shit inside your head, Patrick, Tiff. Well, and 1015 00:52:10,719 --> 00:52:13,480 Speaker 2: the story is that ten thousand people have signed up 1016 00:52:13,480 --> 00:52:17,160 Speaker 2: for this, which is mind-blowing, because who the fuck 1017 00:52:17,239 --> 00:52:21,280 Speaker 2: knows what's going to happen. I'm just... putting something foreign, 1018 00:52:21,840 --> 00:52:26,160 Speaker 2: putting something that's made of hardware, in your brain, unless 1019 00:52:26,360 --> 00:52:29,359 Speaker 2: you really need it, to me, is fraught with danger.
1020 00:52:29,440 --> 00:52:32,520 Speaker 2: But let's say there was no danger and it was 1021 00:52:32,560 --> 00:52:33,600 Speaker 2: one hundred percent safe. 1022 00:52:33,640 --> 00:52:34,279 Speaker 1: Would you do it? 1023 00:52:34,320 --> 00:52:36,680 Speaker 3: Patrick, I'm going to roll it back one. I'm going 1024 00:52:36,760 --> 00:52:39,840 Speaker 3: to say, right now, as it is, including the danger: 1025 00:52:40,040 --> 00:52:43,360 Speaker 3: if I had motor neurone disease, and I was 1026 00:52:43,400 --> 00:52:45,520 Speaker 3: trapped in my own body and not able to move, 1027 00:52:45,680 --> 00:52:48,279 Speaker 3: having experienced that with a very dear friend of mine 1028 00:52:48,280 --> 00:52:51,239 Speaker 3: who passed away a few years back, three years ago, 1029 00:52:51,320 --> 00:52:55,280 Speaker 3: from motor neurone disease, where you basically lose all control 1030 00:52:55,320 --> 00:52:59,280 Speaker 3: of your body, speech, movement, all that sort of stuff, 1031 00:52:59,320 --> 00:53:02,319 Speaker 3: I would jump at anything. Take off the back 1032 00:53:02,360 --> 00:53:04,840 Speaker 3: of my head and plug in a computer. So I 1033 00:53:04,840 --> 00:53:07,680 Speaker 3: think it's a circumstantial thing. If I knew I only 1034 00:53:07,719 --> 00:53:09,920 Speaker 3: had six months left to live and I was trapped 1035 00:53:09,960 --> 00:53:12,200 Speaker 3: in my own body, I would be plugging as 1036 00:53:12,200 --> 00:53:15,040 Speaker 3: many things into my head as possible, if that meant 1037 00:53:15,040 --> 00:53:17,239 Speaker 3: I could at least have some way to interact with 1038 00:53:17,280 --> 00:53:19,080 Speaker 3: the world. So I think it's circumstantial. 1039 00:53:20,960 --> 00:53:23,319 Speaker 1: And back to my question, Patrick, would you do it? 1040 00:53:23,560 --> 00:53:26,160 Speaker 2: And you don't have motor neurone. You have 1041 00:53:26,200 --> 00:53:29,439 Speaker 2: an ability to not answer my fucking question and tell 1042 00:53:29,480 --> 00:53:32,440 Speaker 2: a story instead. Just yes or no. 1043 00:53:32,440 --> 00:53:32,600 Speaker 5: No. 1044 00:53:32,680 --> 00:53:34,040 Speaker 4: It's not a yes or no answer. 1045 00:53:34,200 --> 00:53:35,080 Speaker 1: I think it's a yes. 1046 00:53:35,080 --> 00:53:39,160 Speaker 2: It is. Like, right now, as you are now, as 1047 00:53:39,239 --> 00:53:42,000 Speaker 2: you are now, if it was safe, 1048 00:53:41,680 --> 00:53:42,520 Speaker 1: would you do it? 1049 00:53:42,960 --> 00:53:47,640 Speaker 3: If it was safe, one hundred percent tested and rigorously tested? 1050 00:53:48,080 --> 00:53:50,840 Speaker 2: Yes. So, fuck, you could have just said that in 1051 00:53:50,880 --> 00:53:55,080 Speaker 2: the first place instead of telling us about motor neurone disease, 1052 00:53:55,120 --> 00:53:59,480 Speaker 2: which we do support, MND and Neale Daniher and the 1053 00:53:59,520 --> 00:53:59,960 Speaker 2: work he does. 1054 00:54:00,120 --> 00:54:01,399 Speaker 1: Tiff, would you do it? 1055 00:54:02,160 --> 00:54:02,560 Speaker 5: I don't know. 1056 00:54:02,640 --> 00:54:04,760 Speaker 1: If it was one hundred percent safe? 1057 00:54:04,840 --> 00:54:08,320 Speaker 5: I don't know. I hope I don't 1058 00:54:08,400 --> 00:54:11,319 Speaker 5: ever have to make the decision. I hope it 1059 00:54:11,320 --> 00:54:12,839 Speaker 5: doesn't happen that fast.
1060 00:54:13,280 --> 00:54:17,240 Speaker 2: You will. If it was one hundred percent? No, no, 1061 00:54:17,400 --> 00:54:19,680 Speaker 2: I would just rather access the information in the old 1062 00:54:19,680 --> 00:54:23,560 Speaker 2: fashioned way. But then maybe that's just because I'm a 1063 00:54:23,600 --> 00:54:26,160 Speaker 2: scaredy cat, and maybe, Patrick, you're right. 1064 00:54:26,239 --> 00:54:26,640 Speaker 1: I don't know. 1065 00:54:26,719 --> 00:54:28,600 Speaker 2: I don't think there's a right or wrong, but just 1066 00:54:28,719 --> 00:54:31,200 Speaker 2: my gut, my intuition, says no. 1067 00:54:31,520 --> 00:54:34,279 Speaker 3: Can I just talk about how absolutely lazy I 1068 00:54:34,400 --> 00:54:37,680 Speaker 3: sometimes can be? So if I wake up in the morning... 1069 00:54:38,040 --> 00:54:40,480 Speaker 3: I've now got... I talk to my Google Home speaker 1070 00:54:40,520 --> 00:54:43,439 Speaker 3: to add calendar entries into my diary, which is good, 1071 00:54:43,480 --> 00:54:45,279 Speaker 3: because if you wake up at two am and you've 1072 00:54:45,280 --> 00:54:46,719 Speaker 3: got all these things going on but you don't want 1073 00:54:46,719 --> 00:54:48,680 Speaker 3: to turn the lights on, I can talk to Google 1074 00:54:48,719 --> 00:54:50,080 Speaker 3: and get it to add that information. 1075 00:54:50,440 --> 00:54:51,920 Speaker 4: But sometimes I'm... 1076 00:54:51,760 --> 00:54:54,000 Speaker 1: So Google's the name of your boyfriend. 1077 00:54:53,800 --> 00:55:02,200 Speaker 3: Yes, and Scott... maybe I say Scotty. Oh my god, they're interchangeable, maybe. 1078 00:55:02,640 --> 00:55:03,759 Speaker 4: So the thing is... 1079 00:55:03,960 --> 00:55:07,399 Speaker 2: How many funny, hilarious, entertaining things could I say right now? 1080 00:55:07,400 --> 00:55:10,319 Speaker 2: But I can't, or the three people that send me 1081 00:55:10,360 --> 00:55:11,320 Speaker 2: emails all the time... 1082 00:55:11,480 --> 00:55:14,880 Speaker 3: You go on. But sometimes I'm still in that, you know, 1083 00:55:14,920 --> 00:55:20,760 Speaker 3: when you're in that drowsy, half-awake twilight zone time. Yeah, yeah, 1084 00:55:20,880 --> 00:55:24,040 Speaker 3: and you're too lazy to even speak to Google, or 1085 00:55:24,080 --> 00:55:26,839 Speaker 3: too zonked out to speak to Google. If I had 1086 00:55:26,840 --> 00:55:29,520 Speaker 3: that Neuralink chip, I could just think it and 1087 00:55:29,560 --> 00:55:31,799 Speaker 3: not have to articulate it, so I don't even have 1088 00:55:31,880 --> 00:55:34,560 Speaker 3: to speak to add that calendar note. See, that would 1089 00:55:34,560 --> 00:55:34,879 Speaker 3: be good. 1090 00:55:35,320 --> 00:55:37,480 Speaker 2: I'm glad you delayed the end of the show for that, 1091 00:55:37,520 --> 00:55:38,840 Speaker 2: because that was fucking great. 1092 00:55:42,280 --> 00:55:48,000 Speaker 1: In your head, what goes on? Oh, we love you. 1093 00:55:48,440 --> 00:55:50,200 Speaker 1: I've just taken the piss. 1094 00:55:50,560 --> 00:55:52,919 Speaker 3: But if I had Neuralink, I could plug my head 1095 00:55:52,920 --> 00:55:54,640 Speaker 3: into your head and then you would know what was 1096 00:55:54,680 --> 00:55:55,520 Speaker 3: going on in my head. 1097 00:55:55,560 --> 00:55:58,240 Speaker 2: I could. And if you had Neuralink, you could download 1098 00:55:58,320 --> 00:56:01,279 Speaker 2: your... download your dreams and dissect them later in the day.
1099 00:56:01,440 --> 00:56:03,600 Speaker 3: It'd be awesome. I'd love to. I have so many good, 1100 00:56:03,680 --> 00:56:05,479 Speaker 3: vivid dreams. I dream so much. 1101 00:56:06,560 --> 00:56:10,720 Speaker 2: Patrick, speaking of dreams, people who are dreaming to connect 1102 00:56:10,719 --> 00:56:16,480 Speaker 2: with you and get you to help them develop their website, etc., 1103 00:56:17,080 --> 00:56:18,960 Speaker 2: and business and branding and marketing, 1104 00:56:19,040 --> 00:56:19,840 Speaker 1: how can they do that? 1105 00:56:20,480 --> 00:56:24,560 Speaker 3: Just go to websitesnow dot com today. You're really the worst. 1106 00:56:25,040 --> 00:56:27,360 Speaker 1: And that was the worst fucking segue of all time. 1107 00:56:28,280 --> 00:56:31,799 Speaker 3: Me? I'm not as bad in the real world. It's... 1108 00:56:31,840 --> 00:56:34,239 Speaker 3: Craig just brings out the worst in me. So if 1109 00:56:34,239 --> 00:56:37,560 Speaker 3: you... if you want to talk to me about important things... 1110 00:56:38,440 --> 00:56:41,319 Speaker 1: All right, we're going to go. Thank you, Patrick, thank 1111 00:56:41,360 --> 00:56:41,719 Speaker 1: you, Tiff. 1112 00:56:42,160 --> 00:56:43,000 Speaker 3: See you guys.