Speaker 1: Hi gang. Patrick, Tiffanee Cook, you asked for it: the You Project fortnightly get-together with the two smartest people, Patrick and Tiff, and the ex fucking bogan from the Latrobe Valley, just trying to hang onto the conversational brilliance that is these two.

Speaker 2: Hi, Patrick. Hey, how are you doing?

Speaker 1: I'm good, I'm good. Good morning, Tiff, who was me up until thirty seconds ago.

Speaker 2: Me. Just trying, just trying it on, just seeing how it feels to be great.

Speaker 1: For some reason, Tiff's little window on this Zoom call said my name, and then she said to me, "Did you do that?" Like I have the skills to do that.

Speaker 2: That's the funny thing. You were touting us as the tech geniuses, and after our last show, where I impersonated Tiff, I jumped into a team meeting with a client and realised that I was still Tiff. So it all kind of goes downhill from here, doesn't it?

Speaker 1: Yeah. Well, you know, there's the claim, there's the brand, and then there's the reality behind the claim. How are you, Patrick?

Speaker 2: I'm good. And I was just thinking that, impersonating you, you could probably sue me.

Speaker 1: What? Go on. I feel like that was a segue, I feel.

Speaker 2: Mate, it's just that Meta is about to have to fork out one point four billion dollars for using facial recognition on the public without checking with them first. So that's a big thing that's come through. I think that's one point four billion US, by the way, so it's even more, of course, converted to our currency.

Speaker 1: How have they been using that? In what capacity?

Speaker 2: So they were scanning social media accounts and using facial recognition. So what they would do is they'd say, oh, this is Craig Harper's Facebook account, and we'll now match his face and then look at where his face appears in other people's postings and on the internet, all that sort of stuff.
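For the technically curious, the matching Patrick describes is usually done with face embeddings: a model turns each cropped face into a vector, and nearby vectors mean "probably the same person". Here's a minimal, self-contained sketch of that idea; the embed_face below is a deterministic stand-in for a real trained model, not anything Meta actually ships.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_face(face_pixels: np.ndarray) -> np.ndarray:
    # Stand-in for a real face-embedding model (e.g. a FaceNet-style CNN):
    # maps a cropped face to a vector so the same person's faces land close.
    flat = face_pixels.ravel().astype(float)
    return flat / (np.linalg.norm(flat) + 1e-9)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# One "profile photo" face, plus faces found in other people's photos.
profile = rng.random((16, 8))
candidates = {
    "friends_bbq.jpg": profile + rng.normal(0, 0.01, profile.shape),  # same person
    "stranger.jpg": rng.random((16, 8)),                              # someone else
}

reference = embed_face(profile)
for filename, face in candidates.items():
    score = cosine_similarity(reference, embed_face(face))
    verdict = "match" if score > 0.95 else "no match"
    print(f"{filename}: similarity={score:.3f} -> {verdict}")
```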
Speaker 1: So, oh, so then to what end do they do that? Sorry. Like, what's in it for them to do that? Obviously money at some stage.

Speaker 2: Yeah, look, I think part of the thinking behind it is to try to develop, I guess, an algorithm that's going to keep Craig using Facebook. And if you can refer to Craig and say, oh look, we saw you in this post, it was a group shot at Nova, or, you know, Tiff had a function... I think it's about that connectivity, about trying to link you in and to be able to know where you're going, what you're doing and how you're doing it, but then matching that up with... because, you know, you've caught up with Tiff and you might be doing a training session, so you might need new boxing gloves or something, because Tiff's going to punch the shit out of you. Go to the dentist.

Speaker 1: Now, I think I saw somewhere on your list, or I saw it on the news the other night, don't know if it's the same thing, but, like, a wearable, I don't know, electronic friend, like...

Speaker 2: I guess... imagine if you had a wearable electronic Craig. That would be hilarious.

Speaker 1: People have one. It's called the You Project.

Speaker 2: This is interesting. Okay, so, the idea of incorporating AI into a wearable device that interacts with you. This is so creepy. It's a ninety-nine dollar product that's being launched into the marketplace, it's only on Apple at the moment, and it's called the Friend. And it can even bully you as well, which is interesting. The promotional video for it has basically attracted a lot of interest, ninety thousand views, and it's kind of growing all the time. There's actually a scene where this AI model mocks the wearer, like, it bullies them. But what it really does is monitor what you're doing.
So effectively it's a medallion that you wear, right? It's attached to a lanyard, and as you walk around and experience your day, it monitors, and it just throws in comments about what you're talking about, what you're doing. You can talk to it as well. It's kind of creepy, but sounds fun.

Speaker 1: I don't know about the... I wonder about the potential lawsuits when, you know, this wearable thing is bullying people that are already anxious and depressed.

Speaker 2: Well, it appears to listen to all your conversations, and it's kind of attached by Bluetooth to your phone, and then, using AI, it begins to form its own internal thoughts so it can give a commentary. It can kind of take in what's happening and understand the context of what you're doing, what you're saying. I just think it would be kind of interesting, but I think I'd just end up with a grumpy old man sitting around my neck constantly.
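Under the hood, a device like this presumably pairs always-on speech-to-text with a language model that keeps a running context and occasionally decides to pipe up. A toy sketch of that loop, with both models stubbed out; none of this is the Friend's actual software:

```python
import random
import time

def transcribe_chunk() -> str:
    # Stub for an always-on speech-to-text model fed by the microphone.
    return random.choice([
        "I'm thinking of skipping the gym today",
        "let's order pizza again",
        "",  # silence
    ])

def form_comment(context: list[str]) -> str | None:
    # Stub for an LLM that holds a rolling "internal monologue" about what
    # it hears and occasionally decides to interject (or mock you).
    if context and random.random() < 0.5:
        return f"Noticed you said: '{context[-1]}'. Bold choice."
    return None

heard: list[str] = []      # rolling conversational context
for _ in range(5):         # stand-in for the all-day listening loop
    if text := transcribe_chunk():
        heard.append(text)
    if comment := form_comment(heard):
        print("FRIEND:", comment)
    time.sleep(0.1)
```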
Speaker 1: For me, there's just alarm bells all over this. Like, a thing that stays attached to my body that records and tracks and listens to everything that I say? I definitely don't want that, because where's all that data going? Talk about fucking invasion of privacy. Where's that going?

Speaker 2: Yeah. Look, I guess it could be used in other ways. Like, if you go to the fridge and reach for the Oreo milkshake instead of the carrot juice, it could kind of have a go at you, couldn't it?

Speaker 1: Well, I could do that. See, I'm okay with that. You know: "You fat fuck, put the cheesecake down, you big fat fuck." I mean for me personally, not anyone else.

Speaker 2: It has to be your call. It has to be.

Speaker 1: Here's what I'm worried about: "Hey, we recorded you saying this. Send us ten grand or we're going to release it." Like... no. That's a big fucking fat no. Stick a friend... stick your electronic friend up your ass. You can keep it. Fuck off. There's already way too much shit. We don't need more fucking electronic fucking friends. We need humans.

Speaker 2: We're wondering what Craig could possibly say outside of the podcast that he hasn't already said on air, live, to all of the people.

Speaker 1: And the terrifying, Patrick, the terrifying thing is that this is filtered me. This is, this is PG me. If I had the R-rated podcast, it'd either be the biggest podcast in the world or I'd be in prison. I'm not sure we can do Zoom from prison.

Speaker 2: That's okay, you and Bubba.

Speaker 1: When we're not making number plates and eating fucking rice out of plastic containers. All right, keep going.

Speaker 2: More AI. Yeah, so this is another one that I thought was interesting. A friend of mine's an author. You've met our friend Tour, who I did the podcast with. Now, she's an author, she's just doing her second book, and I'm actually doing a bit of a pre-read, which is really exciting. But there's been a stack of books hitting Amazon recently that have all been written by AI. So there are some sneaky people out there who are publishing and putting out cookbooks particularly, and they're kind of generating all this content and churning out five books a week. And I know you've written, and you've, you know, you've got books that have been published, and there's a lot of work that goes into putting together whatever the book is. If you're a chef and you do a cookbook, you've spent decades fine-tuning your recipes and, you know, life experience and all that sort of stuff. So it's interesting: this journalist went out and decided to just take a look at some of the recipes, and there were things like bratwurst ice cream and crock-pot mojito. So AI has absolutely no bloody idea what food tastes like, looks like, or how it should be combined. There was one recipe where, I think it was, like, two gallons of butter was one of the things you had to add. Yes.
So obviously they've got it wrong in a very, very, very big way. So it's an interesting one, in that it's becoming challenging to work out what's real and what isn't real. And I guess the onus really should be on the likes of the publishing community to make sure that... You know, when you look at self-publishing, which is great, there are a lot of authors who self-publish, and it's fantastic that they can. In the past, you know, a lot of publishing companies had the lion's share, and it was really hard to get a book published. The beauty of technology now is that a lot of people can get out there, and some people may only publish a hundred books that they sell to their friends, you know. And that's great that you can do that. But when you're dishing out, literally dishing out, crap cookbooks, it's not the best thing, you know. Shepherd's pie sushi. I thought that one sounded interesting.

Speaker 1: I think, I think you mean print a hundred books, not publish a hundred books. I know what you mean, because a hundred books would be a hundred different books published.

Speaker 2: Only one of each.

Speaker 1: You could do that. That would be a very inefficient model, but you committed. But do you know what else worries me, mate, in the same conversation? Because obviously AI is becoming better and better, and ChatGPT and the like are becoming better and better, I'm hearing lots of stories of people who are basically sailing through undergraduate degrees doing almost no work. Because, you know, like, for the stuff that I'm doing, people go, "Ah, what about your PhD?" But it's really of no value to me, because it's all based on the research that I'm doing. You can certainly write a paragraph and then run it through ChatGPT, but academic writing at that level is different.
But at an undergraduate level, where you might have to do, you know, let's say a three-thousand-word assignment on, you know, the metabolism of carbohydrate, or energy systems, or whatever, in an exercise science degree, or, you know, talk about the major muscle groups of the shoulder and how they work, whatever it is. Yeah, I mean, it can churn that out, and with a little bit of tweaking it will pass through the... what's it called? I forget. There's a program, it'll come to me after we finish... Turnitin, it's called. So if you write something, you can put it through this program and it will give you a Turnitin score, and it'll tell you whether or not it's going to pass the, you know, the plagiarism test. And people are cranking these out, and it's passing that test quite easily. So we're seeing students at all levels who are producing what would take weeks of work in minutes, and they're getting them passed, which is, moving forward, going to be a big challenge. And then you think about something like copywriting for real estate agents, where it's: here's the details of the house, write us an ad. Well, that's a good application, and no one's really cheating, you're just using it. But when it comes to academic stuff, where people are getting degrees for things they haven't done the work for, or don't understand, it's terrifying, especially when you think about things like medicine.
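A rough intuition for why freshly generated text slips through: similarity checkers of the Turnitin kind broadly compare overlapping word sequences against a corpus of existing sources, and AI output rarely repeats long word-for-word runs from any one source. A toy sketch of that idea, not Turnitin's actual algorithm:

```python
def ngrams(text: str, n: int = 5) -> set[tuple[str, ...]]:
    # Overlapping n-word "shingles"; real checkers also normalise and hash these.
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission: str, source: str, n: int = 5) -> float:
    # Fraction of the submission's shingles that also appear in the source.
    sub = ngrams(submission, n)
    return len(sub & ngrams(source, n)) / max(len(sub), 1)

source = "the rotator cuff comprises four muscles that stabilise the shoulder joint"
copied = "the rotator cuff comprises four muscles that stabilise the shoulder"
rephrased = "four muscles make up the rotator cuff and they keep the shoulder stable"

print(overlap_score(copied, source))     # 1.0 -> flagged
print(overlap_score(rephrased, source))  # 0.0 -> sails through
```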
Speaker 2: Well, look, I do agree with what you're saying, and we do use AI, sparingly. We use it visually probably the most, and I think I might have mentioned this in one of our shows: sometimes when you're designing a website, you want an image to span the entire width. We call it a banner. And if you've got a photograph that doesn't have the right aspect ratio, and say you've got, you know, some people in one area and then some cars and some trees or whatever it happens to be, you can use AI to stretch the photo out, which would take hours and hours using normal techniques in Photoshop. You can extend hills and scenes, and we do it for our clients all the time. You know, we've got a plumber that we do a lot of social media for, and they send us portrait photos that just need to be stretched out a little bit. There was one of their apprentices, actually; they had him working on a toilet, it was really narrow, and so we just widened the shot and it was great. So we used the original photo; it was just a bit at the sides that we added to.
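The technique being described is usually called outpainting: pad the photo out to the wider aspect ratio, mask the new empty margins, and let a generative inpainting model fill them in. A minimal sketch using the open-source diffusers library; the file names, prompt and model checkpoint are illustrative, not what any particular studio uses:

```python
# pip install diffusers transformers torch pillow
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

# Centre the narrow portrait shot on a wider canvas.
photo = Image.open("apprentice_toilet.jpg").convert("RGB").resize((256, 512))
canvas = Image.new("RGB", (512, 512))
canvas.paste(photo, (128, 0))

# Mask: white = "generate here" (the new margins), black = "keep the photo".
mask = Image.new("L", (512, 512), 255)
mask.paste(Image.new("L", (256, 512), 0), (128, 0))

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting"  # example inpainting checkpoint
)
banner = pipe(
    prompt="bathroom interior, plumber at work, natural light",
    image=canvas,
    mask_image=mask,
).images[0]
banner.save("apprentice_banner.jpg")
```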
Speaker 2: But there was an article in The Guardian only a couple of weeks ago, and it basically said that academics are now saying that AI has put the integrity of Australian universities at risk. There is so much cheating. And you alluded to this as well: it's the quality and the standard. When people are leaving university with degrees, what does that actually even mean?

Speaker 1: Well, that's exactly right. And I think, you know, what you talk about, how you would use it in your work, I mean, to me, that's a legitimate, intelligent, positive use, right? But that's not the stuff we're worried about. The stuff we're worried about is people taking credit, and getting credit, for shit they haven't done. I mean, that's basically lying and deception. And we know it's becoming harder and harder to monitor.

Speaker 2: We were approached by a client... When we build a website, one of the things we do is sit down and do quite a rigorous kind of Q and A to try to work out what they want to achieve, how they're going to get the content together, how they're going to put it together, imagery, all that sort of stuff. But we had a client say, well, can't you look at other sites in the same industry and just use AI to rewrite their content?

Speaker 1: Right.

Speaker 2: I mean, we said no, but, you know, that's the mindset that's out there. Why do the hard yards if you can do it the easy way?

Speaker 1: Well, I mean... like, if there's no downside... I mean, you know, with the evolution of everything, right, when something new comes in, something old goes out. There are so many jobs now that are becoming redundant, and there are so many new jobs that are becoming part of the commercial landscape. And, you know, as we've spoken about before, you've shown us images of AI art versus real art, and it's those creative spaces... where it might take somebody, you know, like Dylan Keyes, who I follow, shout out to Dylan, who's an awesome artist, he's been on the show, follow him on Instagram. Like, his work is fucking amazing. It takes him fifty hours of fucking, like, super-focused work to produce one image that's brilliant. And you can produce something that looks just like that in thirty seconds with ChatGPT, and it's pretty much indistinguishable. And you know that one person did it and the other one was synthetic, so to speak. But yeah, if I was a creative, I would be worried.

Speaker 2: I totally agree. Look, I haven't authored any books, but I've done a lot of writing over the years as a journalist, and creative pieces; I've written some short stories. The idea of someone just taking that material... Even when I've written for clients and we've written copy for websites, you've still spent the hours doing it. You know, you've thought about it, you've spoken to the client. We had one particular client recently, a lovely, lovely couple, who, you know, were a fencing company.
Then they started making steel fence posts, and they've revolutionised a lot of the production process of standard domestic wooden fences, because they've got metal posts that they have in between all the spacing. So the thing is, they don't write, they... you know. So I spent probably about two and a half hours interviewing them, getting information from them, so that I could write the copy for them. Now, if someone else produces a steel fence post website and they just trawl through our site with all that original copy that I wrote, well, you know, that's not fair, all that hard work that went into it. And, you know, whether it's something that you've come up with yourself, or whether someone goes through all the You Project's podcasts and says, well, I can kind of copy this and create my own podcast... I don't know.

Speaker 1: Do you know how many of my whiteboard lessons have been ripped off? Really. Like... no, hundreds, thousands. Like, yesterday there was one. Anyway, I get sent them all; pretty much every day someone sends me one of my whiteboards that I wrote. If you don't know what they are, go to my Instagram page, where it's just me writing on a whiteboard. I take a photo of it, I kind of brighten it up a bit, but it's my handwriting, and I put it up. Thousands of times. In fact, you know one I wrote called Twelve Fucking Rules for Success? Well, that's been done in different languages. There are decals of it, there are mugs with it on them, there are posters... you know, it's been turned into product that other people are selling. Like, my intellectual property. And I've, you know, I've opened the door on, how do I...? There's really not a lot you can do, unless you want to spend hundreds of thousands of dollars, that you may or may not get back, you know, in terms of legal kind of pursuit.
But I just... I don't like it, but I just go, all right, it's going to happen, you know. Anyway, let's move on to something other than this. I want to know about, like, Google have gotten... ah, go on.

Speaker 2: There's one more thing on this. It's a new phrase, and you're going to love it. It's called AI cannibalism.

Speaker 1: All right, what is it?

Speaker 2: So, what's happening is, there's so much AI-generated content on the internet, there are now concerns that, with new AI models being trained on the content of the internet, potentially they're now scraping AI-generated content. So what they're doing is learning from other AI, and as they scrape through, there's so much AI crap out there, and they're learning from the AI crap, and it's diluting the quality of the product. The best way to describe it is: if you say, AI, draw an elephant, and then an AI learns off that picture of an elephant that's been drawn by AI, and there are subtle, weird, creepy, different things that just don't feel right, the AI is using that. So I just loved the term AI cannibalism. It's such a great way to describe what is effectively, you know, I guess, the demise of AI models, potentially the demise of AI models, if they're training them on information that's been generated by AI. Because now AI doesn't even know what AI is.
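Researchers call this feedback loop model collapse: each generation fits itself to the previous generation's outputs, and the rare, interesting tails of the data quietly vanish. A toy illustration with a simple Gaussian in place of a language model, assuming (as collapse analyses typically do) that each generation slightly underweights its rarest samples:

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(0.0, 1.0, size=5_000)   # generation 0: real, varied data

for gen in range(1, 7):
    mu, sigma = data.mean(), data.std()    # "train" on what's out there
    samples = rng.normal(mu, sigma, 5_000) # flood the internet with output
    # Generative models underweight rare/tail examples; mimic that by keeping
    # only the most "typical" 90% of each generation's output.
    keep = np.abs(samples - mu) <= np.quantile(np.abs(samples - mu), 0.9)
    data = samples[keep]
    print(f"generation {gen}: spread (std) = {data.std():.3f}")  # shrinks each time
```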
Speaker 1: Well, and also AI doesn't know... because if AI is being programmed by programmers who have a particular bias, which is everyone in the world, then, you know... Like, somebody put up a thing the other day where they asked ChatGPT: what happened on, whatever the date was that Donald Trump got shot, what happened on that date in Pennsylvania? And ChatGPT said the next president had something thrown at him, but no one was hurt. Which is objectively untrue, yeah, you know. And there was another one six months ago, which you would have remembered. I think they said, you know, show us a picture of the founding fathers of America, you know, like all the old presidents and shit. And what it brought up, what it showed, was some pictures of Native, Indigenous American women, right? Because it didn't want to show white men. I'm like, okay. So now there's another conundrum, and the conundrum is that we need to be able to filter through what is objective truth, like fact, data, information, and what is something that is being, you know, politically correctified, or where there's a particular bias, where, oh, now this is not information, this is opinion, or this is an agenda. And when that agenda or that opinion, on either side, on all sides, right, when that's being presented as objective information, and then over time that starts to become the norm, then, fucking hell, we lose a grip on what is real, and, you know, then it just becomes a slippery slope.

Speaker 2: Do you know what they're doing in Malaysia? They want an internet kill switch.

Speaker 1: What is that? I don't even know what it is, but I think we need one.

Speaker 2: Yeah, I don't think they even know yet what it is, but they want one. So their parliament at the moment is looking at new legislation to create, so they haven't even created it yet, they want to create, an internet kill switch, because they're saying social media is out of control. If a specific platform has more than eight million users in their country, it now needs to get a licence to be able to run in that country. But what they're saying is they want to be able to turn it off. They just want to flick a switch. I like the idea of a big red button, because I think big red buttons that glow are the best way to do it. And it sits in, you know, someone's department there, the president's department.
Speaker 1: You know we're not in an episode of Get Smart, right? Yeah, you know, you know that it's not a nineteen-sixties sitcom, right? I just wanted... this is irrelevant, but it's a little bit relevant to what we're talking about, creativity and the danger for creatives when, you know, AI takes over. But I saw this quote from Einstein this week that, I don't know why, I've never heard before, and it's one of the greatest quotes I've ever heard. And the quote is: creativity is intelligence having fun.

Speaker 2: Isn't that beautiful? That is so good.

Speaker 1: That is... love that.

Speaker 2: Yeah, I love that one too.

Speaker 1: I think I saw that on Lisa Stevenson's platform, so shout out to her.

Speaker 2: What a great quote. Isn't it funny... one of the smartest guys that I know, who I've interviewed on my own podcast, a guy who is a rocket scientist, my running joke with him was that the reason I hung out with him was because he brings my IQ average up so high, because his IQ is so high. And his response to that was: but I hang around with you because it brings up my emotional intelligence.

Speaker 1: Now, that's probably true. I mean, yeah... well, I think you... I hate to say it, because I like hanging shit on you, but I think you go okay. I think, IQ and EQ, I reckon you're doing okay. Tell me about... because I feel like I need a robotic massage table in my house. This is like the two-point-oh of the chair that you sit in.

Speaker 2: I know, this is so much better. See, there are two AI-robot-driven machines that I want to talk about in the show today. One is a massage table. So, I guess, when you think about it, there are so many different skill sets that are needed to be able to massage, right? And you've had... I mean...

Speaker 1: I'm a massage... I'm a massage slut. This is like the ultimate present for me.

Speaker 2: There we go.
So, therapeutic massage has been around for thousands of years. We know that it works, like, absolutely works. So there's this product, I think it's called Aescape, and massage itself is a five-thousand-year-old practice. And what they're trying to do is effectively use an AI-driven machine, effectively a robotic massage table, and it's going to be able to mimic some of the techniques of massage. So, I mean, what was the massage that you gave me for a birthday present, like, a couple of years ago?

Speaker 1: It was called a hand job. It was... oh god. So technically not a massage, but a Thai massage.

Speaker 2: Idiot.

Speaker 1: By the way, happy birthday, big boy.

Speaker 2: Sorry, Tiff's just choked, having a drink of water.

Speaker 1: I nearly died then.

Speaker 2: You call it a massage, a Thai massage; I call it a happy birthday. They must know that you go there regularly, because it's got a really big sign: "We don't give happy..."

Speaker 1: I do. I go to the... I don't go as much; I used to go a lot. But it's got to be good, though, like, Thai massage. The problem with massage therapists, whatever kind of style, is, it's like everything else: some are fucking amazing, Tiff, you know this, some have got amazing hands, and some are terrible. Like... within three minutes, I know. Within thirty seconds, I know if this is going to be a good or a bad hour. And I feel ripped off when I lie down and someone's got clunky hands, like they've got no feel for it.

Speaker 2: So you're going to love this device, because what it does, as you lie down on the table, on the platform, is it uses infrared sensors to scan your body, and then, being powered by AI, it delivers a personalised experience. So it knows, you know, I guess, potentially, where there might be... No, why are you laughing at me?

Speaker 1: I'm thinking about the title of the show and how I can weave "hand job" in there.

Speaker 2: God, I'm trying to be serious.
Yeah, so anyway, it will scan your body beforehand, so potentially that could be those mystical, wonderful hands that you've been dreaming of.

Speaker 1: Great. Well, firstly, is it available in Australia? And I'm imagining this is not a cheap product.

Speaker 2: Look, it won't be. I don't know; I didn't actually see a price with it. I don't think it's a proof of concept, I think it's actually won a design award, so I'm thinking that it's out there. I'll have to do a bit of research for you, so I'll look into it and see what the cost is going to be. But the other thing that scared the Jesus out of me, and I wouldn't do this in a million years, it's not in my notes, it might have been something I read earlier: the world's first fully automated robot dentist has done surgery on a human.

Speaker 1: Seems like a... fuck that. Tiff's shaking her head: no way.

Speaker 2: So, sci-fi movies you've watched, where they've got robots with needles and stuff pointing at you, that torture device... look, it's done, it's happened. A person actually had a procedure done. One of the benefits of this, it says, is that they can do crowns and things in fifteen minutes instead of an hour and speed up the process. Scares the crap out of me.

Speaker 1: Yeah, I'm not after speed, I'm after fucking human hands. But speaking of such medical procedures, I did see during the week, Patrick James Kevin Fernando, that a surgeon, an operator, did an operation, a remote operation, on a patient who was in a different location. So he, I can't remember how they do it, but I think he has his hands, or she, I think it was a he, in basically these gloves that have got all of these sensors, and he performs the operation remotely, and it's carried out, or it's executed, through a robot on site. So, again, terrifying.
But in some situations, I guess, over time... Like, honestly, think about it: in fifty years, maybe ninety-eight per cent of procedures are going to be done by robots, because they don't get tired, they don't get cranky, you know, they don't lose focus and attention, and they don't have emotions. And I guess, you know, I think for us, well, me anyway, it's just a mental and emotional hurdle. But maybe it's going to mean much better outcomes.

Speaker 2: Well, the remote surgery that you're talking about, when you think of situations... okay, here's a great one: the International Space Station. Now, what if someone had a ruptured spleen or something? What do you do? You can't just pack them into a capsule and fire them back down to Earth; they may not even survive the re-entry. And the reality of it is the same if you're working on an oil rig out in the middle of the North Sea. So I'm thinking that, potentially, you know, you could suddenly turn a remote clinic into a full surgical, you know, procedural venue, a clinic. Because when you think about it, in Australia particularly, to fly or to travel from one location to another... if you've got to drive for twelve, fourteen hours and whatever you've got that's wrong with you is mission critical, then potentially, you know, this is life-saving surgery. If you had a device with a robotic doctor, I guess I'd be putting my hand up. It's either that or die in transit.

Speaker 1: We had a guy, Tiff, do you remember his name, who was from Antarctica? I think you had him on your show as well. David... David Knoff. Yeah, and he was talking to us about that. I mean, they're based in Antarctica, Patrick, so they have to have a whole team of people. And I can't remember if it was him telling me or someone else, but they have a doctor, which is a really good, you know, doctor who can do pretty much everything, including surgery... until the doctor gets appendicitis.
And so she had to do an appendectomy on herself. So she had to operate on herself, I think it was appendicitis, obviously while she was awake. So, in those kinds of situations... yeah. I mean, to be able to... and we've spoken before about people who live in remote areas of Australia. You know, they have access to everything now, you know, education and, you know, lots of other kinds of medical consults and mental health consults. So it's kind of... it's quite a leap, but maybe it's part of the next steps.

Speaker 2: Yeah, sounds pretty cool. How's your car going?

Speaker 1: You know... you know how I know when Patrick's already onto the next topic? It's when I finish and he goes, "Yeah, good. Yeah, good." Like, I know he's bowed out of it, and I can see his eyes fucking glazing over, because he's not even listening to what I'm saying, and he just goes, "Yeah, good." What was your question? How is my what going?

Speaker 2: How's your car going? I was thinking about you and your car, all the fancy gadgets. So, you've got everything.

Speaker 1: No, I don't. I haven't even driven it, because I'm just, yeah, I'm just at home, and when I get around suburbia I ride a motorbike. So the answer is, I think I've driven it three times, and once was on the way home from the dealership. So I'm not wearing that motherfucker out. If anyone wants really low mileage... no, no, I'm not selling it.

Speaker 2: It's a good car. "Only driven to church on a Sunday." That's the ad. It's a nice car.

Speaker 1: It's just... it's a car. Let's not... do you know what? It's not a car. It's a computer that doubles as a car. It's a computer with four wheels and seats. You know how, like, the analog... not the analog, but the old, basically the dumb phone, is making a little bit of a comeback in some circles? I reckon,
eventually, we're going to do the full circle, and we're going to offer cars that are essentially, like, almost an analog car. Where, how do you turn on the heater? You push a button. How do you make it warmer? You twist a knob, right? And how do you lock it? You put the key in and you lock it, or you press the button and then you hold the handle. Like, imagine if they made a car with absolutely zero unnecessary tech, only the tech that a car needs to get around. I wonder if they could sell that for ten grand.

Speaker 2: You would have to buy mine, like... ah, I'm selling my old Nissan NX Coupe. Remember my Nissan with the little targa roof?

Speaker 1: Yeah, I'll give you a hundred bucks for it.

Speaker 2: One hundred bucks? It's in pristine condition. You can put another four zeros on the end of it.

Speaker 1: Fuck your pristine condition. Nobody wanted them when they were new, especially now they're thirty years old. You should take that to the Nissan museum, and they'll go, "No thanks, here's a voucher to the tip." Wow. No. What is it... an NX, isn't it? What is it?

Speaker 2: It's an NX Coupe, actually. You should look it up; they're actually quite a cool little car. The targa roof, the roof comes off and you can zip around. It's got a two-litre engine, it's the Pulsar SSS engine, so the engine's a goer.

Speaker 1: You know, you know this is not a forum for you to fucking sell your car, right? So back off. That's why you brought up your car.

Speaker 2: It was a segue.

Speaker 1: Yeah, that's right. He's like, he wanted to tell people that he's selling a car, and he thinks, I can't be too obvious... I know, I'll ask Craig about his car.

Speaker 2: Oh, while we're on cars...

Speaker 1: Guess what? I'm selling my forty-year-old piece of shit. Anyone interested?

Speaker 2: A thirty-one-year-old car, by the way. And the segue actually was for Samsung,
they've 609 00:34:12,560 --> 00:34:15,680 Speaker 2: now been able to produce a solid state battery for 610 00:34:15,760 --> 00:34:21,120 Speaker 2: EVs with a six hundred mile range, and it's a 611 00:34:21,320 --> 00:34:24,360 Speaker 2: nine minute charging cycle to be able to get it 612 00:34:24,400 --> 00:34:28,200 Speaker 2: up to speed, and it will last twenty years. 613 00:34:29,480 --> 00:34:30,160 Speaker 1: The battery? 614 00:34:30,400 --> 00:34:31,719 Speaker 2: The battery will last twenty years. 615 00:34:31,920 --> 00:34:33,919 Speaker 1: Well, that's a game changer. I mean, put 616 00:34:33,960 --> 00:34:37,000 Speaker 1: it this way. Okay, so if that is real, if 617 00:34:37,000 --> 00:34:40,040 Speaker 1: you can charge it in nine minutes, six hundred miles 618 00:34:40,080 --> 00:34:43,719 Speaker 1: is about a thousand kilometers, which is forever, and if it 619 00:34:43,800 --> 00:34:47,919 Speaker 1: lasts twenty years, if all of those things are real, then 620 00:34:48,160 --> 00:34:50,799 Speaker 1: that's a game changer for EVs. Then I would be, 621 00:34:51,120 --> 00:34:54,319 Speaker 1: if they could do that, well, not that I'd 622 00:34:54,400 --> 00:34:56,920 Speaker 1: be a convert, but I'd be more supportive of that, 623 00:34:57,000 --> 00:34:59,799 Speaker 1: because that makes sense. There's still the small conundrum of 624 00:34:59,800 --> 00:35:01,800 Speaker 1: what we do with all those fucking batteries. 625 00:35:02,719 --> 00:35:06,360 Speaker 2: Well, twenty twenty seven is when they're talking about launching 626 00:35:06,400 --> 00:35:10,719 Speaker 2: it. Toyota and Samsung have teamed up 627 00:35:10,719 --> 00:35:12,680 Speaker 2: to try to develop this new battery, and they reckon twenty 628 00:35:12,719 --> 00:35:16,960 Speaker 2: twenty seven. The solid state battery, it's an oxide solid 629 00:35:16,960 --> 00:35:18,719 Speaker 2: state battery, and they're saying they're going to be able 630 00:35:18,719 --> 00:35:22,440 Speaker 2: to deliver all of this. That's a massive boon. Wouldn't 631 00:35:22,440 --> 00:35:23,799 Speaker 2: it be amazing to be able to have that? 632 00:35:24,840 --> 00:35:27,719 Speaker 1: And I tell you, speaking of electric vehicles, what is 633 00:35:27,800 --> 00:35:36,040 Speaker 1: really struggling on the transport landscape is electric motorcycles. 634 00:35:36,200 --> 00:35:39,640 Speaker 1: Blokes, well, and women who ride bikes. More 635 00:35:39,800 --> 00:35:41,520 Speaker 1: men than women ride motorbikes, but there are lots of ladies that 636 00:35:41,640 --> 00:35:44,200 Speaker 1: ride bikes. So I'm going to backtrack on that and 637 00:35:44,280 --> 00:35:47,840 Speaker 1: say people who ride bikes are generally not warming to 638 00:35:47,960 --> 00:35:49,320 Speaker 1: electric motorbikes. 639 00:35:49,719 --> 00:35:52,840 Speaker 2: You know why I was excited about electric motorbikes? Because 640 00:35:52,920 --> 00:35:57,120 Speaker 2: the three top motorcycle manufacturers got together and they said, 641 00:35:57,160 --> 00:35:59,920 Speaker 2: you know what, why don't we produce a standard battery, 642 00:36:00,360 --> 00:36:02,359 Speaker 2: and that way you do a swap and go when 643 00:36:02,400 --> 00:36:05,640 Speaker 2: you go to your service station. Instead of charging, you 644 00:36:05,760 --> 00:36:07,880 Speaker 2: just swap the battery out, put the new battery in, 645 00:36:08,000 --> 00:36:10,399 Speaker 2: drive off. How good is that?
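A quick back-of-the-envelope on those battery claims before the banter resumes. The range conversion is straight arithmetic; the pack capacity below is an assumption (no kWh figure was quoted in the episode), so the charging-power number is illustrative only. A minimal sketch in Python:

    # Sanity-checking the solid-state battery claims above.
    MILES_TO_KM = 1.60934

    range_km = 600 * MILES_TO_KM  # ~966 km -- the "thousand kilometers" quoted on air

    # ASSUMPTION: a ~100 kWh pack; Samsung's actual figure isn't given in the episode.
    assumed_pack_kwh = 100
    charge_minutes = 9

    # Average power needed to fill that pack in nine minutes
    avg_charge_power_kw = assumed_pack_kwh / (charge_minutes / 60)  # ~667 kW

    print(f"range ~{range_km:.0f} km, average charge power ~{avg_charge_power_kw:.0f} kW")

On that assumed pack size, a nine-minute fill implies roughly 670 kW of sustained charging power, about double the ~350 kW of today's fastest public chargers, which is why the nine-minute figure, if real, is the headline.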
646 00:36:12,280 --> 00:36:14,880 Speaker 1: Yeah? There's one problem, though: it makes no sound. 647 00:36:17,239 --> 00:36:19,920 Speaker 2: What, me screaming like a girl as I'm trying 648 00:36:19,920 --> 00:36:21,359 Speaker 2: to ride? So. 649 00:36:21,400 --> 00:36:25,280 Speaker 1: You don't understand, it's motorbikes. It's about the visceral experience. 650 00:36:25,360 --> 00:36:28,880 Speaker 1: It's about that fucking throbbing between your legs. It's about 651 00:36:28,920 --> 00:36:32,720 Speaker 1: the noise. It's about the, what, I'm not talking about 652 00:36:32,719 --> 00:36:35,759 Speaker 1: anything sensual. I'm talking about, or it could be a 653 00:36:35,800 --> 00:36:38,720 Speaker 1: bit sensual, the motorbike, but it's that, that fucking sitting 654 00:36:38,760 --> 00:36:41,080 Speaker 1: on something while you're at the lights and your whole 655 00:36:41,080 --> 00:36:43,880 Speaker 1: body is vibrating, because you're sitting on a two and 656 00:36:43,920 --> 00:36:45,120 Speaker 1: a half liter engine. 657 00:36:45,200 --> 00:36:48,960 Speaker 2: That's what it's about. 658 00:36:48,239 --> 00:36:48,279 Speaker 1: You. 659 00:36:49,880 --> 00:36:51,400 Speaker 2: I'll tell you what, a vibrator. 660 00:36:52,680 --> 00:36:58,040 Speaker 1: That's right, it's a six hundred pound vibrator, of sorts. 661 00:36:58,680 --> 00:37:00,360 Speaker 1: Can we get onto something less sexual? 662 00:37:00,800 --> 00:37:03,160 Speaker 2: Because I didn't enjoy my ride on the bike with 663 00:37:03,200 --> 00:37:05,759 Speaker 2: you at all. I was too busy having the shit 664 00:37:05,840 --> 00:37:08,560 Speaker 2: scared out of me when you were weaving through traffic. 665 00:37:09,120 --> 00:37:11,240 Speaker 1: It's because you're a big baby. If you'd just fucking 666 00:37:11,360 --> 00:37:16,040 Speaker 1: let it go. And you just, you just sat really close. 667 00:37:16,760 --> 00:37:19,560 Speaker 1: You wanted to just sit on the back without going, 668 00:37:20,160 --> 00:37:23,719 Speaker 1: because you just loved being in that position. You were 669 00:37:23,840 --> 00:37:28,640 Speaker 1: enjoying it till we headed off. All right, now, 670 00:37:28,680 --> 00:37:31,600 Speaker 1: can you please tell me why Google has to 671 00:37:31,600 --> 00:37:35,120 Speaker 1: pay back squillions of dollars and why they've gotten in trouble 672 00:37:35,239 --> 00:37:39,359 Speaker 1: for something recently. Not that I'm too heartbroken for them. 673 00:37:39,480 --> 00:37:42,560 Speaker 2: It's an antitrust thing in the United States. So one 674 00:37:42,600 --> 00:37:47,880 Speaker 2: of the problems is that Google has such a massive 675 00:37:48,360 --> 00:37:52,640 Speaker 2: market share. You know, it's even a verb: 676 00:37:53,160 --> 00:37:56,520 Speaker 2: when you search for something online, you google it. 677 00:37:56,920 --> 00:37:59,719 Speaker 2: So there's a real concern now with a lot of lawmakers. 678 00:37:59,719 --> 00:38:01,439 Speaker 2: This is kind of happening in the States at the moment, 679 00:38:01,480 --> 00:38:03,319 Speaker 2: but I'm sure, you know, a lot 680 00:38:03,360 --> 00:38:07,080 Speaker 2: of people are thinking this through, whether you're an advertiser 681 00:38:07,160 --> 00:38:09,120 Speaker 2: or whether you're somebody who just happens to be looking 682 00:38:09,160 --> 00:38:12,360 Speaker 2: for something on search.
The monopoly is totally with Google, 683 00:38:12,440 --> 00:38:14,800 Speaker 2: and I think on phones it's about ninety five percent, 684 00:38:15,200 --> 00:38:18,759 Speaker 2: and on desktop searches it's about ninety percent. But the 685 00:38:18,840 --> 00:38:22,200 Speaker 2: concern is that Google has done a deal with a 686 00:38:22,280 --> 00:38:27,520 Speaker 2: number of phone manufacturers to make their search feature, or 687 00:38:27,560 --> 00:38:30,839 Speaker 2: make Google Search, resident on the device. So when 688 00:38:30,880 --> 00:38:33,959 Speaker 2: you unpack the device, charge it up, and start using 689 00:38:34,000 --> 00:38:36,560 Speaker 2: it, and you want to do an Internet search, Google 690 00:38:36,640 --> 00:38:39,080 Speaker 2: is the default search engine. And this 691 00:38:39,160 --> 00:38:42,399 Speaker 2: is the thing that's concerning some of the lawmakers. They're 692 00:38:42,440 --> 00:38:45,759 Speaker 2: saying that they've got a massive, massive market share, and 693 00:38:45,800 --> 00:38:49,040 Speaker 2: that's the real concern at the moment. So, look, 694 00:38:49,040 --> 00:38:52,279 Speaker 2: it's a hard one, because I guess we use it. 695 00:38:52,360 --> 00:38:55,240 Speaker 2: We use it every day. And what are the alternatives? 696 00:38:55,280 --> 00:38:56,960 Speaker 2: I mean, you use DuckDuckGo, don't you? 697 00:38:57,680 --> 00:39:00,279 Speaker 1: Oh, I use DuckDuckGo sometimes. But do you 698 00:39:00,280 --> 00:39:02,400 Speaker 1: know what? Like, I'm not a pro or an anti 699 00:39:02,400 --> 00:39:05,439 Speaker 1: Google person. I use Google, like nearly everyone I know, right? 700 00:39:05,520 --> 00:39:09,880 Speaker 1: But I think, well, they built that, they 701 00:39:10,000 --> 00:39:13,800 Speaker 1: got that market share, they did. I mean, 702 00:39:13,880 --> 00:39:17,359 Speaker 1: is it moral? Is it ethical? Who gives a... 703 00:39:17,440 --> 00:39:17,680 Speaker 2: Well? 704 00:39:17,760 --> 00:39:20,759 Speaker 1: I mean, I care. But did they do anything illegal? 705 00:39:20,960 --> 00:39:26,000 Speaker 1: Like, do they have a monopoly? Yes. Did they, did 706 00:39:26,000 --> 00:39:30,120 Speaker 1: they create that? Yes. And it's like, why, I mean, 707 00:39:31,239 --> 00:39:33,440 Speaker 1: and I don't care about the welfare of Google, but 708 00:39:33,440 --> 00:39:36,480 Speaker 1: I'm thinking, wouldn't we go, hey, well done, that's just 709 00:39:36,680 --> 00:39:40,440 Speaker 1: really good business? Like, you've done some great business to 710 00:39:40,520 --> 00:39:44,480 Speaker 1: get this foothold and this leverage, and you're hands down 711 00:39:44,520 --> 00:39:48,400 Speaker 1: the market leader in that space. And do they have 712 00:39:48,480 --> 00:39:49,120 Speaker 1: a monopoly? 713 00:39:49,239 --> 00:39:49,799 Speaker 2: Yeah. 714 00:39:50,440 --> 00:39:53,480 Speaker 1: Has anyone got a gun to anyone's head to use it? 715 00:39:53,600 --> 00:39:57,120 Speaker 2: No. Well, this is actually a legal case in the US. 716 00:39:57,560 --> 00:40:01,759 Speaker 2: So the judge has ruled that Google actually broke the 717 00:40:01,880 --> 00:40:05,200 Speaker 2: law with its monopoly. Okay, that's where, this is the 718 00:40:05,280 --> 00:40:09,400 Speaker 2: sticking point. This has effectively been a case against Google 719 00:40:09,440 --> 00:40:12,560 Speaker 2: for the last twenty years.
So what they're saying is, 720 00:40:12,560 --> 00:40:15,440 Speaker 2: it's the way that it serves up its search related ads. 721 00:40:16,120 --> 00:40:20,000 Speaker 2: And so a federal court judge in the US has 722 00:40:20,120 --> 00:40:25,160 Speaker 2: effectively put through this ruling to say that their share, 723 00:40:25,680 --> 00:40:27,880 Speaker 2: so this is the Department of Justice, they're saying that 724 00:40:27,920 --> 00:40:32,960 Speaker 2: the Google share is massive, and that Google had paid, 725 00:40:33,320 --> 00:40:38,160 Speaker 2: in Australian dollars, had paid over forty one billion dollars 726 00:40:38,200 --> 00:40:40,960 Speaker 2: in twenty twenty one to make sure that its search 727 00:40:41,000 --> 00:40:46,799 Speaker 2: engine was the default on smartphones, in apps, and on browsers, to 728 00:40:46,880 --> 00:40:50,239 Speaker 2: keep it right there. So that's the sticking point: that it 729 00:40:50,400 --> 00:40:54,200 Speaker 2: was paying off the phone manufacturers to make its search 730 00:40:54,239 --> 00:40:56,200 Speaker 2: engine the default on their devices. 731 00:40:56,719 --> 00:40:58,799 Speaker 1: But why do we use the term paying off? Why 732 00:40:58,800 --> 00:41:01,640 Speaker 1: don't we just say they had a commercial arrangement? Like, 733 00:41:01,680 --> 00:41:05,120 Speaker 1: to me, that seems like, hey, make us your, your 734 00:41:05,280 --> 00:41:09,080 Speaker 1: preferred search engine provider, and we'll give you all 735 00:41:09,120 --> 00:41:11,640 Speaker 1: this money. To me, that just seems like business. 736 00:41:12,520 --> 00:41:15,920 Speaker 2: Well, the term that the judge used was monopolist, and 737 00:41:15,960 --> 00:41:20,400 Speaker 2: it acted to maintain a monopoly, so it forced out 738 00:41:20,880 --> 00:41:22,160 Speaker 2: any other competition. 739 00:41:22,719 --> 00:41:27,160 Speaker 1: So they're just doing better. I mean, I don't know. 740 00:41:27,560 --> 00:41:31,040 Speaker 1: To me, that's the point of business: we're the best. Yeah, yeah, 741 00:41:31,080 --> 00:41:33,480 Speaker 1: I know. I don't know. To me, it reminds me 742 00:41:33,560 --> 00:41:36,840 Speaker 1: of Al Capone. You remember, do you know what happened 743 00:41:36,840 --> 00:41:39,720 Speaker 1: with Al Capone? Like, obviously he was a complete gangster, 744 00:41:40,320 --> 00:41:42,160 Speaker 1: you know, he did all this stuff, right? And they 745 00:41:42,200 --> 00:41:47,040 Speaker 1: couldn't, they couldn't convict him of any crime, any of 746 00:41:47,080 --> 00:41:50,680 Speaker 1: the fucking hundreds of murders and all the underworld stuff 747 00:41:50,719 --> 00:41:55,520 Speaker 1: and all the stealing and all the embezzlement and all 748 00:41:55,560 --> 00:41:58,040 Speaker 1: the corruption. And you know what they ended up getting 749 00:41:58,080 --> 00:41:59,640 Speaker 1: him on and putting him in prison for? 750 00:41:59,360 --> 00:42:02,719 Speaker 2: No, I can't remember, but I know this. Yeah, go on. 751 00:42:02,600 --> 00:42:03,719 Speaker 1: Yeah, tax evasion. 752 00:42:03,840 --> 00:42:04,879 Speaker 2: Oh yes, that's right. 753 00:42:05,640 --> 00:42:09,560 Speaker 1: It's like, you know, such a white collar, 754 00:42:09,680 --> 00:42:13,240 Speaker 1: boring crime. But anyway, you know. 755 00:42:13,080 --> 00:42:15,600 Speaker 2: So, how many searches do you reckon Google does in a day, 756 00:42:15,840 --> 00:42:18,200 Speaker 2: as in people using the Google search engine?
How many 757 00:42:18,200 --> 00:42:20,160 Speaker 2: searches do you reckon are done in a single day, 758 00:42:21,200 --> 00:42:23,960 Speaker 2: all over the world? Just a wild guess. 759 00:42:25,400 --> 00:42:27,480 Speaker 1: So I'm going to go, well, there's eight billion people. 760 00:42:27,760 --> 00:42:31,080 Speaker 1: Let's say half the people use it, so 761 00:42:31,160 --> 00:42:35,400 Speaker 1: that's four billion, and I'd say, I'd say we 762 00:42:35,480 --> 00:42:39,640 Speaker 1: all do on average five searches a day. Twenty billion 763 00:42:39,719 --> 00:42:41,919 Speaker 1: is my guess. Twenty billion searches a day. 764 00:42:42,239 --> 00:42:44,279 Speaker 2: Oh, that's a lot of searches. Well, you actually came 765 00:42:44,320 --> 00:42:47,000 Speaker 2: pretty close at the start, where you were just estimating the usage. 766 00:42:47,080 --> 00:42:49,759 Speaker 2: It's actually eight point five billion searches. 767 00:42:50,320 --> 00:42:54,080 Speaker 1: Okay, yeah, so that's two, that's two and a bit 768 00:42:54,160 --> 00:42:57,560 Speaker 1: per person, yeah, if there's four billion. Or if we 769 00:42:57,600 --> 00:42:59,919 Speaker 1: averaged it out, that's about once a day 770 00:43:00,080 --> 00:43:01,160 Speaker 1: for everyone in the world. 771 00:43:02,160 --> 00:43:06,239 Speaker 2: Yeah, yeah, okay. Well, don't forget, though, that, 772 00:43:06,440 --> 00:43:09,040 Speaker 2: like, China has a massive population, and you can't use Google 773 00:43:09,040 --> 00:43:11,800 Speaker 2: in China, because they've got the Great Firewall of China. 774 00:43:12,239 --> 00:43:15,759 Speaker 1: That's right. That's, do you, I mean, I don't know 775 00:43:15,760 --> 00:43:17,520 Speaker 1: if you know the, I don't know if you know 776 00:43:17,600 --> 00:43:19,920 Speaker 1: the answer to this, but you are a tai chi teacher, 777 00:43:20,000 --> 00:43:26,839 Speaker 1: so you should. Do they, do they? Well, I mean, 778 00:43:26,920 --> 00:43:28,480 Speaker 1: you've, you've been to China, haven't you? 779 00:43:28,840 --> 00:43:30,440 Speaker 2: Yeah, China four times. 780 00:43:30,719 --> 00:43:32,800 Speaker 1: Yeah. So, I've never been there, so my point is 781 00:43:32,840 --> 00:43:34,279 Speaker 1: you're going to know better than I'm going to know. 782 00:43:35,440 --> 00:43:38,239 Speaker 1: Do the Chinese have much of an awareness of what 783 00:43:38,400 --> 00:43:42,000 Speaker 1: happens outside? Like, do they, I mean, do they know 784 00:43:42,400 --> 00:43:46,040 Speaker 1: about, you know, the rest of the world in terms 785 00:43:46,040 --> 00:43:51,160 Speaker 1: of Google? Or are they, are they somewhat manipulated 786 00:43:51,160 --> 00:43:53,520 Speaker 1: and brainwashed, and, you know? 787 00:43:54,040 --> 00:43:57,919 Speaker 2: Yeah, that's a really, really good question. I think that, 788 00:43:58,120 --> 00:44:00,000 Speaker 2: I mean, if I look at my tai chi family, 789 00:44:00,120 --> 00:44:02,319 Speaker 2: we call it that, that's a lovely term that I really like, 790 00:44:02,320 --> 00:44:04,840 Speaker 2: I've got my tai chi brothers in China and sisters 791 00:44:04,840 --> 00:44:09,480 Speaker 2: in China. But because we have an ongoing dialogue, they 792 00:44:09,520 --> 00:44:13,000 Speaker 2: do know what's going on around the world. So it's 793 00:44:13,040 --> 00:44:15,560 Speaker 2: not like they're locked in, and we talk about the 794 00:44:15,560 --> 00:44:19,480 Speaker 2: Great Firewall of China.
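Craig's guess is a classic Fermi estimate, and it's worth writing out, since the pair then work backwards from the real figure. A minimal sketch in Python, using only the round numbers quoted in the conversation (the half-the-world and five-searches-a-day inputs are his guesses, not measured data):

    # The on-air estimate, written out.
    world_population = 8_000_000_000
    assumed_users = world_population // 2           # guess: half the world uses Google
    searches_per_user = 5                           # guess: five searches a day each
    estimate = assumed_users * searches_per_user    # 20 billion -- Craig's answer

    # Patrick's figure, and what it implies per person.
    actual = 8_500_000_000
    print(estimate)                                 # 20000000000
    print(actual / assumed_users)                   # ~2.1 a day per assumed user
    print(actual / world_population)                # ~1.06 -- about once a day for everyone on Earth

The output matches the arithmetic they do out loud: two and a bit searches per assumed user, or roughly one a day averaged over everyone in the world.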
But, you know, VPNs are still a thing, 795 00:44:19,520 --> 00:44:21,839 Speaker 2: so they can actually get out of China, 796 00:44:21,840 --> 00:44:22,239 Speaker 2: more or less. 797 00:44:22,480 --> 00:44:25,319 Speaker 1: Tell people what that is. Tell people what a VPN is. 798 00:44:26,000 --> 00:44:29,800 Speaker 2: So what you can do is you can use a 799 00:44:29,920 --> 00:44:32,600 Speaker 2: service that can geolocate you anywhere in the world. So, 800 00:44:32,640 --> 00:44:36,240 Speaker 2: say you want to sign up to Netflix, you can use... 801 00:44:36,200 --> 00:44:40,240 Speaker 1: Make it easier. People are like, what the fuck is geolocate? 802 00:44:40,600 --> 00:44:42,319 Speaker 1: Just speak our language. 803 00:44:43,200 --> 00:44:46,239 Speaker 2: It hides where you are. So, the Internet knows where 804 00:44:46,280 --> 00:44:48,320 Speaker 2: you are when you log on. If you use a VPN, 805 00:44:48,400 --> 00:44:50,880 Speaker 2: you can hide where you are and pretend to be 806 00:44:50,920 --> 00:44:51,520 Speaker 2: somewhere else. 807 00:44:51,600 --> 00:44:56,279 Speaker 1: Thank you, thank you, professor. Fucking hell. 808 00:44:56,680 --> 00:44:59,840 Speaker 1: Oh, can you explain? Yes, so, geolocate. And everyone's 809 00:44:59,880 --> 00:45:05,000 Speaker 1: just switched off. It's like, all right, tell me about space. 810 00:45:05,080 --> 00:45:07,800 Speaker 1: What's going on in the bloody intergalactic weirdness? 811 00:45:07,840 --> 00:45:12,160 Speaker 2: That is us, intergalactic weirdness. It's not intergalactic or weird. 812 00:45:12,600 --> 00:45:15,279 Speaker 2: This is like, you've heard of the vault, that there's 813 00:45:15,320 --> 00:45:18,640 Speaker 2: a vault somewhere in Scandinavia that has all these seeds. 814 00:45:18,640 --> 00:45:19,399 Speaker 2: It's a seed vault. 815 00:45:19,880 --> 00:45:22,360 Speaker 1: I've seen a doco on it. It's under a mountain. 816 00:45:22,560 --> 00:45:28,200 Speaker 1: It's fucking amazing. Have you seen that, Tiff? So there's 817 00:45:28,239 --> 00:45:32,439 Speaker 1: this vault where they keep all of these seeds from 818 00:45:32,560 --> 00:45:35,120 Speaker 1: all of these plants around the world, so that if 819 00:45:35,840 --> 00:45:38,360 Speaker 1: there's an apocalypse and the world falls apart and 820 00:45:38,480 --> 00:45:42,759 Speaker 1: goes to shit, and then, you know, one hundred people live, 821 00:45:42,880 --> 00:45:45,399 Speaker 1: there's this vault where people can go and get all 822 00:45:45,440 --> 00:45:50,759 Speaker 1: of these seeds and replant and regrow everything. So, 823 00:45:51,080 --> 00:45:54,759 Speaker 1: but to get to it, sorry Patrick, you, you literally 824 00:45:55,120 --> 00:45:59,600 Speaker 1: enter this massive door that goes into a mountain, so 825 00:45:59,719 --> 00:46:04,840 Speaker 1: it's like hundreds and hundreds of feet underground, so that 826 00:46:04,960 --> 00:46:09,400 Speaker 1: it would pretty much survive, well, you know, we don't know, 827 00:46:09,480 --> 00:46:15,600 Speaker 1: but most cataclysmic events, to basically restart the world. Anyway, 828 00:46:15,680 --> 00:46:17,360 Speaker 1: sorry Patrick. And it's called the vault. 829 00:46:17,920 --> 00:46:21,880 Speaker 2: Well, the doomsday vault. And evidently scientists are now saying 830 00:46:22,080 --> 00:46:24,280 Speaker 2: they want to start up another one. But guess where. 831 00:46:26,040 --> 00:46:26,600 Speaker 1: Australia.
832 00:46:27,160 --> 00:46:29,319 Speaker 2: No, I just said space. Wasn't that a hint? 833 00:46:29,840 --> 00:46:31,760 Speaker 1: Oh. 834 00:46:32,160 --> 00:46:35,120 Speaker 2: The Moon. The Moon. They want to put it on 835 00:46:35,160 --> 00:46:37,719 Speaker 2: the Moon, because the Moon's cold enough 836 00:46:37,760 --> 00:46:41,280 Speaker 2: to preserve the samples without needing to power it. Right? 837 00:46:41,360 --> 00:46:44,480 Speaker 1: So, yes, here's the problem, though. If the Earth blows up, 838 00:46:44,560 --> 00:46:46,719 Speaker 1: we've got no fucking rockets to get to the Moon 839 00:46:46,760 --> 00:46:53,400 Speaker 1: to retrieve our samples. Fucking, that's great. So we can't 840 00:46:53,440 --> 00:46:53,879 Speaker 1: get there. 841 00:46:55,239 --> 00:46:58,799 Speaker 2: So the seeds are cryopreserved, right? And then you 842 00:46:58,840 --> 00:47:01,759 Speaker 2: don't have to worry about it. So, you know, if 843 00:47:01,760 --> 00:47:04,239 Speaker 2: the air conditioning breaks down, well, there is no air 844 00:47:04,239 --> 00:47:06,600 Speaker 2: conditioning to worry about, because it's so cold on the Moon. 845 00:47:07,120 --> 00:47:09,440 Speaker 1: I think you're missing the point, though. If we can't 846 00:47:09,560 --> 00:47:13,680 Speaker 1: get to the Moon, because we're all dead from starvation, 847 00:47:13,880 --> 00:47:16,680 Speaker 1: because the seeds, the seeds are in... 848 00:47:19,880 --> 00:47:22,840 Speaker 2: I think it's not a bad contingency. We're not talking 849 00:47:22,920 --> 00:47:26,000 Speaker 2: the whole Earth blowing up here, right? 850 00:47:25,840 --> 00:47:30,600 Speaker 1: Aren't we? No? Did you hear what happened 851 00:47:30,600 --> 00:47:35,120 Speaker 1: to the dinosaurs? Did you hear about that? You're familiar with 852 00:47:35,160 --> 00:47:36,280 Speaker 1: that story? 853 00:47:36,160 --> 00:47:38,600 Speaker 2: What, the Earth's more than six thousand years old? 854 00:47:39,239 --> 00:47:44,640 Speaker 1: Well, well, no, it's not, because the Bible's true. But 855 00:47:45,640 --> 00:47:51,160 Speaker 1: in some alternate evolutionary timeline, perhaps, yeah, in some fictitious 856 00:47:51,239 --> 00:47:57,120 Speaker 1: timeline of dinosaurs, maybe the Earth blew up once 857 00:47:57,239 --> 00:47:58,160 Speaker 1: or twice. 858 00:47:57,960 --> 00:48:01,520 Speaker 2: It didn't actually blow up. Okay, so an asteroid collides with Earth. 859 00:48:01,840 --> 00:48:06,000 Speaker 2: A giant plume of dust and dirt and debris enters the atmosphere, 860 00:48:06,200 --> 00:48:09,799 Speaker 2: blocks out the sun, Earth goes into an ice age. Dinosaurs die, 861 00:48:09,960 --> 00:48:14,440 Speaker 2: but some animals, like the wily little animals that we 862 00:48:14,600 --> 00:48:17,680 Speaker 2: ended up becoming, they make it through, and then 863 00:48:17,880 --> 00:48:20,200 Speaker 2: there's a big sign that says, build a rocket, fly 864 00:48:20,320 --> 00:48:22,040 Speaker 2: to the Moon, you can get all your seeds. Done. 865 00:48:22,640 --> 00:48:25,440 Speaker 1: Now, before you start stating all this stuff as a 866 00:48:25,440 --> 00:48:27,720 Speaker 1: matter of fact, like you were there as an observer, 867 00:48:28,239 --> 00:48:33,799 Speaker 1: could you start all of that with allegedly?
868 00:48:30,760 --> 00:48:39,480 Speaker 2: Oh, sorry. We, allegedly... the Earth, allegedly... there was 869 00:48:39,520 --> 00:48:42,560 Speaker 2: a plume of dust, and allegedly the dinosaurs die. 870 00:48:42,600 --> 00:48:44,680 Speaker 2: That's all the fossils and all the stuff that we 871 00:48:44,760 --> 00:48:46,880 Speaker 2: know have been there for, you know, all those hundreds 872 00:48:46,880 --> 00:48:52,200 Speaker 2: of millions of years. Or thousands, as of last week. Was 873 00:48:52,239 --> 00:48:53,480 Speaker 2: it last week? Yeah. 874 00:48:53,600 --> 00:48:57,000 Speaker 1: Can you tell me, speaking of nothing to do with crime, 875 00:48:57,040 --> 00:49:01,240 Speaker 1: I'm interested in this story: record breaking seventy five million 876 00:49:01,320 --> 00:49:06,960 Speaker 1: dollar ransom paid to the Dark Angels gang. I feel 877 00:49:07,000 --> 00:49:10,880 Speaker 1: like these kind of... was that, like, an online 878 00:49:10,920 --> 00:49:11,600 Speaker 1: thing, mate? 879 00:49:11,960 --> 00:49:15,000 Speaker 2: Yeah, this is an interesting one, because this is thought 880 00:49:15,080 --> 00:49:19,520 Speaker 2: to have been the largest ransomware payment ever made. 881 00:49:20,280 --> 00:49:26,080 Speaker 2: Because, look, what happens is that ransomware, like, activists, 882 00:49:26,160 --> 00:49:28,640 Speaker 2: or people, threat actors, go out and 883 00:49:28,680 --> 00:49:33,160 Speaker 2: they infect a system and lock out the entire manufacturing 884 00:49:33,239 --> 00:49:36,200 Speaker 2: process or whatever it happens to be. That freezes 885 00:49:36,239 --> 00:49:39,200 Speaker 2: the company's information, and they say to them, if you 886 00:49:39,239 --> 00:49:41,840 Speaker 2: don't pay a certain amount of money within X amount 887 00:49:41,880 --> 00:49:43,960 Speaker 2: of time, we're going to delete all of your data. 888 00:49:44,800 --> 00:49:47,480 Speaker 2: So that's effectively what ransomware is. It happens on a 889 00:49:47,480 --> 00:49:50,440 Speaker 2: small scale, it happens to everyday people, but it can 890 00:49:50,480 --> 00:49:53,759 Speaker 2: also happen to large corporations and companies. And look, 891 00:49:53,800 --> 00:49:56,799 Speaker 2: off the record, I do know that in Australia 892 00:49:56,880 --> 00:49:59,040 Speaker 2: there was a company that made a decision to pay 893 00:49:59,040 --> 00:50:02,399 Speaker 2: their ransom, because they were losing more money every day 894 00:50:02,440 --> 00:50:06,000 Speaker 2: in production than it cost to pay the people who 895 00:50:06,960 --> 00:50:10,200 Speaker 2: were behind the ransomware attack. But they're saying, this is, 896 00:50:11,640 --> 00:50:15,880 Speaker 2: like, Threat Labs is a group that 897 00:50:16,200 --> 00:50:19,440 Speaker 2: looks into this sort of stuff, and what they're saying 898 00:50:19,560 --> 00:50:22,400 Speaker 2: is that this is pretty deplorable. Because the thing is, 899 00:50:22,719 --> 00:50:27,160 Speaker 2: so many legal authorities, so many jurisdictions say don't pay ransomware, 900 00:50:27,160 --> 00:50:30,360 Speaker 2: don't pay ransomware, it only encourages people to keep the 901 00:50:30,440 --> 00:50:34,200 Speaker 2: system going.
And so now this group called 902 00:50:34,239 --> 00:50:38,280 Speaker 2: the Dark Angels evidently got a massive payment of seventy 903 00:50:38,280 --> 00:50:41,320 Speaker 2: five million dollars paid to them by an undisclosed victim 904 00:50:41,320 --> 00:50:45,240 Speaker 2: earlier this year, and that massively beat the previous record 905 00:50:45,239 --> 00:50:48,719 Speaker 2: of forty million. So, this comes from 906 00:50:48,719 --> 00:50:53,560 Speaker 2: a ransomware report, and it's on the increase, 907 00:50:53,800 --> 00:50:55,719 Speaker 2: and the problem is, you know, do we pay it 908 00:50:55,800 --> 00:50:59,040 Speaker 2: or don't we pay it? The Australian government says don't 909 00:50:59,040 --> 00:51:02,440 Speaker 2: pay it. The American government says don't pay it. But 910 00:51:02,480 --> 00:51:05,399 Speaker 2: it's pretty hard if you're in an organization where you're 911 00:51:05,440 --> 00:51:08,680 Speaker 2: relying on, say, the manufacturing process, and hello, 912 00:51:09,280 --> 00:51:11,200 Speaker 2: everything grinds to a halt. What do you do? 913 00:51:12,040 --> 00:51:14,920 Speaker 1: And especially, it's all well and good for the government, 914 00:51:14,920 --> 00:51:18,560 Speaker 1: and I understand the intention behind that, and in principle 915 00:51:18,640 --> 00:51:21,600 Speaker 1: I agree with that. But let's say you're the owner 916 00:51:21,640 --> 00:51:26,960 Speaker 1: of a fifty billion dollar company and you've got to 917 00:51:26,960 --> 00:51:30,719 Speaker 1: pay out fifty million to get the wheels turning again, 918 00:51:30,760 --> 00:51:35,160 Speaker 1: and you're losing ten, twenty, thirty million a day, and 919 00:51:35,200 --> 00:51:36,920 Speaker 1: you go, well, it's all well and good, but if 920 00:51:36,960 --> 00:51:40,880 Speaker 1: we don't pay it, my business is fucked and I'm fucked, 921 00:51:40,880 --> 00:51:44,239 Speaker 1: and we go broke, and my five thousand employees don't 922 00:51:44,320 --> 00:51:47,239 Speaker 1: have a job. And the easy way to fix that, 923 00:51:47,280 --> 00:51:50,279 Speaker 1: in inverted commas, is to do this. It is, it 924 00:51:50,320 --> 00:51:53,719 Speaker 1: is a real conundrum and a, you know, financial 925 00:51:53,840 --> 00:51:58,759 Speaker 1: and moral and ethical and legal dilemma. And while I 926 00:51:58,800 --> 00:52:02,280 Speaker 1: agree with the intention and the sentiment behind the don't 927 00:52:02,280 --> 00:52:06,319 Speaker 1: give in to criminals, I can understand it in certain cases. I 928 00:52:06,360 --> 00:52:08,520 Speaker 1: think everything is context dependent. 929 00:52:08,960 --> 00:52:09,200 Speaker 2: You know. 930 00:52:09,280 --> 00:52:13,600 Speaker 1: It's like, I would never hurt anyone, but if someone 931 00:52:13,680 --> 00:52:16,560 Speaker 1: was hurting my mum, I would definitely hurt someone. Like, 932 00:52:16,840 --> 00:52:20,520 Speaker 1: it depends on what's going on. You know, would you 933 00:52:20,600 --> 00:52:23,040 Speaker 1: ever want to kill anyone?
Of course I would never 934 00:52:23,080 --> 00:52:26,200 Speaker 1: want to kill anyone, but you put any person in 935 00:52:26,239 --> 00:52:32,719 Speaker 1: some situations, in extreme situations, and you go, well, 936 00:52:32,760 --> 00:52:35,920 Speaker 1: it's like, you know, show me a dad or mum 937 00:52:35,960 --> 00:52:39,520 Speaker 1: that won't die to protect their child, or kill someone 938 00:52:39,560 --> 00:52:42,200 Speaker 1: who's trying to hurt their child. You know, I think, 939 00:52:42,239 --> 00:52:48,520 Speaker 1: and that's an extreme stretch, but I don't 940 00:52:48,560 --> 00:52:53,080 Speaker 1: think that all of these rules or laws are... yes, 941 00:52:53,120 --> 00:52:55,160 Speaker 1: they're black and white, but in certain things like this, 942 00:52:55,200 --> 00:52:57,400 Speaker 1: where you think, well, someone's going to lose everything, and 943 00:52:57,480 --> 00:53:00,960 Speaker 1: so is everyone that works in that organization, or they can pay these crooks, 944 00:53:01,000 --> 00:53:03,759 Speaker 1: and you go, yeah, it doesn't help, because now we're 945 00:53:03,920 --> 00:53:07,400 Speaker 1: kind of encouraging the crooks. But yeah, that's one of 946 00:53:07,400 --> 00:53:10,600 Speaker 1: those where there is no right and wrong, but it's understandable 947 00:53:10,600 --> 00:53:11,080 Speaker 1: either way. 948 00:53:11,600 --> 00:53:13,960 Speaker 2: I think the key finding out of this one is 949 00:53:14,239 --> 00:53:18,000 Speaker 2: we need to pour more resources into cybersecurity, and into, 950 00:53:18,520 --> 00:53:21,239 Speaker 2: I guess, universities churning out people 951 00:53:21,239 --> 00:53:25,680 Speaker 2: who are using AI to learn cybersecurity. Now, I'm thinking that 952 00:53:26,080 --> 00:53:29,080 Speaker 2: if the right investment is made to try to encourage 953 00:53:29,360 --> 00:53:32,600 Speaker 2: businesses, and not just big business but smaller businesses, to 954 00:53:33,440 --> 00:53:36,799 Speaker 2: just do what they can to increase their security, and 955 00:53:37,000 --> 00:53:38,719 Speaker 2: if the government can do that, then we're going to 956 00:53:38,719 --> 00:53:41,000 Speaker 2: prevent it in the first place. It's that whole prevention 957 00:53:41,120 --> 00:53:42,320 Speaker 2: is better than cure. 958 00:53:43,400 --> 00:53:45,600 Speaker 1: I just realized I've got another appointment in four and 959 00:53:45,600 --> 00:53:48,440 Speaker 1: a half minutes. So, Patrick, you're the best. How do 960 00:53:48,520 --> 00:53:50,239 Speaker 1: people find you and connect with you? 961 00:53:50,160 --> 00:53:53,680 Speaker 2: You can go to websitesnow.com.au. 962 00:53:53,760 --> 00:53:56,279 Speaker 2: There's all the stuff about what we do. So feel 963 00:53:56,280 --> 00:53:57,799 Speaker 2: free to reach out, say g'day. 964 00:53:58,840 --> 00:54:02,560 Speaker 1: Thank you for that, thank you, and thanks, kids. I 965 00:54:02,560 --> 00:54:05,840 Speaker 1: feel like Tiff's been really distracted today. Have you noticed that, Patrick? 966 00:54:06,360 --> 00:54:07,840 Speaker 2: Oh, I know, she was just taking it all in, 967 00:54:07,880 --> 00:54:10,759 Speaker 2: weren't you, Tiff? I understand she loaded DuckDuckGo and 968 00:54:10,840 --> 00:54:13,239 Speaker 2: installed it. 969 00:54:13,239 --> 00:54:16,560 Speaker 1: To be honest, I knew she was doing something else, because she definitely 970 00:54:16,560 --> 00:54:17,280 Speaker 1: wasn't paying attention.
971 00:54:19,080 --> 00:54:20,320 Speaker 2: It's every time you talk about 972 00:54:20,120 --> 00:54:21,759 Speaker 1: it, I'm like, I've got to do that, and then 973 00:54:21,800 --> 00:54:23,120 Speaker 1: I forget. And I'm like, well, I'm doing it now, 974 00:54:23,160 --> 00:54:23,840 Speaker 1: because I'll forget. 975 00:54:24,160 --> 00:54:26,440 Speaker 2: See, that's good, that's great. She actually listens to what 976 00:54:26,480 --> 00:54:27,520 Speaker 2: you're saying, Craigo. 977 00:54:28,239 --> 00:54:31,920 Speaker 1: Yeah, you're only getting paid for half of this episode. All right, 978 00:54:32,000 --> 00:54:33,600 Speaker 1: thanks everyone. Thanks, mate.