1 00:00:00,440 --> 00:00:02,800 Speaker 1: Hey, g'day team. It's Harps and Patrick, it's The You Project. 2 00:00:02,880 --> 00:00:06,560 Speaker 1: For the first time in twenty twenty six, we're back. 3 00:00:07,520 --> 00:00:10,360 Speaker 1: And to those of you who are thinking to yourself, self, 4 00:00:11,200 --> 00:00:14,280 Speaker 1: where is the answer to my question that Harps asked 5 00:00:14,320 --> 00:00:18,200 Speaker 1: me to write in the group, it's coming. I apologize. 6 00:00:18,880 --> 00:00:21,320 Speaker 1: I said, I'm going to do a Q and A 7 00:00:21,360 --> 00:00:25,279 Speaker 1: pod or two, which I'm doing. I've started. I just 8 00:00:25,320 --> 00:00:27,680 Speaker 1: got a bit caught up in things, as is my way, 9 00:00:27,720 --> 00:00:30,880 Speaker 1: a little bit of Ron and Mary business, and so 10 00:00:31,000 --> 00:00:35,519 Speaker 1: the time that I put aside to record that episode 11 00:00:35,560 --> 00:00:39,160 Speaker 1: or two selfishly got taken up by Ron and Mary. 12 00:00:39,440 --> 00:00:42,760 Speaker 1: So blame them, get mad at them. It could 13 00:00:42,840 --> 00:00:46,239 Speaker 1: never be me. Patrick James Bonello joins me, as he 14 00:00:46,280 --> 00:00:48,680 Speaker 1: does on a regular basis. Happy New Year, Champion. 15 00:00:49,000 --> 00:00:50,959 Speaker 2: Hey, it's great. This is exciting for me. I don't 16 00:00:50,960 --> 00:00:53,280 Speaker 2: think I've done the very first episode of a new 17 00:00:53,360 --> 00:00:55,880 Speaker 2: year before, because we've been doing this about three years. 18 00:00:56,200 --> 00:00:57,840 Speaker 1: We've been doing it for a while, and you are 19 00:00:57,960 --> 00:01:01,480 Speaker 1: very much part of the furniture here at TYP studio, the 20 00:01:01,640 --> 00:01:05,760 Speaker 1: human furniture that is you. So we're recording this on 21 00:01:05,800 --> 00:01:09,160 Speaker 1: the third, Saturday, and it will be out on Monday 22 00:01:09,200 --> 00:01:11,360 Speaker 1: the fifth. Yeah, what have you been doing? 23 00:01:11,880 --> 00:01:13,679 Speaker 2: Well, right this second? Being licked. 24 00:01:14,760 --> 00:01:18,960 Speaker 1: Wow. Yeah. And to that young man in front of you, 25 00:01:19,080 --> 00:01:25,479 Speaker 1: if you could go and make Patrick some breakfast? Oh really? Well, 26 00:01:25,640 --> 00:01:27,759 Speaker 1: you opened the door, you started it. 27 00:01:27,360 --> 00:01:29,520 Speaker 2: But no, you asked me how I've been, what 28 00:01:29,560 --> 00:01:32,600 Speaker 2: I'm doing, and I'm being licked, literally being licked by 29 00:01:32,640 --> 00:01:34,600 Speaker 2: my dog. So look, it was a bit of a 30 00:01:34,600 --> 00:01:39,440 Speaker 2: traumatic day yesterday because Fritzy, apparently when we went out 31 00:01:39,520 --> 00:01:43,039 Speaker 2: running at the park, he ran into something and it 32 00:01:43,200 --> 00:01:46,600 Speaker 2: sliced a gash into his chest, but I didn't notice. 33 00:01:46,600 --> 00:01:49,080 Speaker 2: He was fine. And then when he was at home 34 00:01:49,160 --> 00:01:52,600 Speaker 2: he started licking incessantly and I rolled him over and 35 00:01:52,640 --> 00:01:56,720 Speaker 2: he whimpered and there was a visible hole in his chest, 36 00:01:56,800 --> 00:01:59,840 Speaker 2: and I panicked and rushed him to the vet. So 37 00:02:00,080 --> 00:02:02,320 Speaker 2: we had a very difficult night trying to get him 38 00:02:02,320 --> 00:02:05,600 Speaker 2: to stop licking at himself.
You know, ordinarily I would 39 00:02:05,680 --> 00:02:09,560 Speaker 2: encourage that, because... I'm jealous, but in this instance, so. 40 00:02:10,040 --> 00:02:12,440 Speaker 1: That's twice in one minute. You don't need to go for me. 41 00:02:12,560 --> 00:02:14,639 Speaker 2: One for you. That's one each. Now we can 42 00:02:14,720 --> 00:02:18,480 Speaker 2: get off that topic. So he's here next to me 43 00:02:18,680 --> 00:02:21,480 Speaker 2: in my little broadcast area, and I'm trying to get 44 00:02:21,520 --> 00:02:25,160 Speaker 2: him not to lick his wound, his little wound. He's 45 00:02:25,160 --> 00:02:28,400 Speaker 2: got medication and I've been bathing it in saline solution, 46 00:02:28,680 --> 00:02:29,560 Speaker 2: doing all the right things. 47 00:02:30,240 --> 00:02:34,120 Speaker 1: Nurse Patrick. Well, let's send our healing vibes to little 48 00:02:34,160 --> 00:02:37,119 Speaker 1: Fritzi and hope that he gets better quick. And Fritz, 49 00:02:37,160 --> 00:02:41,000 Speaker 1: stop licking your fucking wounds. Uncle Harps here, stop licking 50 00:02:41,000 --> 00:02:44,600 Speaker 1: yourself in that area, and when the wound's better, then 51 00:02:44,600 --> 00:02:46,799 Speaker 1: you can return to business as usual with the other 52 00:02:46,880 --> 00:02:51,440 Speaker 1: licking area, as trained by Dad. But yeah, what did 53 00:02:51,480 --> 00:02:53,960 Speaker 1: you, what did you do for Christmas? You don't drink? 54 00:02:54,040 --> 00:02:54,200 Speaker 2: Right? 55 00:02:54,280 --> 00:02:56,200 Speaker 1: Do you ever drink booze? 56 00:02:56,320 --> 00:02:59,280 Speaker 2: Very, very occasionally. I might have one little drink, you know, 57 00:02:59,360 --> 00:03:01,400 Speaker 2: have a Guinness at the Renown again, it's a little pub that 58 00:03:01,440 --> 00:03:03,760 Speaker 2: I go to, and Fritz and I both go to actually, 59 00:03:04,000 --> 00:03:06,040 Speaker 2: and we sit in the front bar and I have 60 00:03:06,080 --> 00:03:08,600 Speaker 2: a Guinness. But that's maybe once a month, I think, 61 00:03:08,600 --> 00:03:09,360 Speaker 2: I would have a drink. 62 00:03:09,840 --> 00:03:12,440 Speaker 1: I should know this, well maybe I shouldn't, but I 63 00:03:12,480 --> 00:03:15,360 Speaker 1: know beer is made from hops and shit. What is 64 00:03:15,400 --> 00:03:16,600 Speaker 1: Guinness made from? Do you know? 65 00:03:16,880 --> 00:03:20,040 Speaker 2: Probably the same thing, I reckon. It's supposed to have 66 00:03:20,080 --> 00:03:22,680 Speaker 2: lots of vitamin B in it, you know, it's that black, 67 00:03:22,760 --> 00:03:25,040 Speaker 2: dark color. And in Ireland they used to give it 68 00:03:25,040 --> 00:03:28,480 Speaker 2: to pregnant women after they had their babies because it 69 00:03:28,520 --> 00:03:30,840 Speaker 2: was restorative to their health and well being. 70 00:03:31,360 --> 00:03:36,680 Speaker 1: Let's ask, please... let's ask ChatGPT. 71 00:03:37,840 --> 00:03:40,400 Speaker 2: All right. So in the meantime, this is not... 72 00:03:40,800 --> 00:03:42,960 Speaker 1: No, no, no. We're going to, we're going to, we're going 73 00:03:43,000 --> 00:03:47,120 Speaker 1: to chat to it. Ah, hey ChatGPT, what 74 00:03:47,280 --> 00:03:49,800 Speaker 1: is Guinness made from? 75 00:03:50,040 --> 00:03:53,600 Speaker 3: Of course. Guinness is made from four key ingredients.
Water, 76 00:03:53,920 --> 00:03:56,160 Speaker 3: malted barley, some of which is roasted to give it 77 00:03:56,160 --> 00:03:58,960 Speaker 3: that dark color, hops for bitterness and flavor, and yeast 78 00:03:59,080 --> 00:04:01,920 Speaker 3: used for fermentation. It's the roasted barley that gives it 79 00:04:01,960 --> 00:04:04,080 Speaker 3: that signature deep flavor and dark color. 80 00:04:04,640 --> 00:04:06,440 Speaker 1: How does it differ from beer? 81 00:04:08,960 --> 00:04:11,920 Speaker 3: A great question. So Guinness is indeed a type of beer, 82 00:04:12,040 --> 00:04:14,560 Speaker 3: but it's a stout, which is a darker, richer and 83 00:04:14,680 --> 00:04:17,919 Speaker 3: often creamier style compared to a lighter lager or ale. 84 00:04:18,040 --> 00:04:20,679 Speaker 3: It has roasted barley, giving it more of a coffee 85 00:04:20,760 --> 00:04:23,599 Speaker 3: or chocolate note, and that creamy head comes from the 86 00:04:23,680 --> 00:04:26,920 Speaker 3: nitrogen used in the pour. So it's just a specific, delicious 87 00:04:26,960 --> 00:04:28,400 Speaker 3: category within the big world of beer. 88 00:04:29,560 --> 00:04:30,760 Speaker 2: Well, what the hell am I here for? 89 00:04:31,800 --> 00:04:34,599 Speaker 1: Yeah. You know what I love about ChatGPT is 90 00:04:34,760 --> 00:04:37,440 Speaker 1: it can answer your questions but also make you feel 91 00:04:37,440 --> 00:04:40,240 Speaker 1: good by telling you what a great question you asked. 92 00:04:40,320 --> 00:04:43,560 Speaker 1: It can also, it can build your self-esteem while 93 00:04:43,680 --> 00:04:44,279 Speaker 1: educating you. 94 00:04:44,760 --> 00:04:47,560 Speaker 2: Yeah, and you know what? Make you dumber. 95 00:04:48,040 --> 00:04:50,080 Speaker 1: Why? I just learned something. 96 00:04:50,160 --> 00:04:54,280 Speaker 2: No, no, it's making you dumber. Okay, explain how? An 97 00:04:54,320 --> 00:04:56,599 Speaker 2: article in the BBC I was reading just recently. 98 00:04:57,520 --> 00:05:00,160 Speaker 1: So this is not your thought. This is someone 99 00:05:00,160 --> 00:05:00,760 Speaker 1: else's thought. 100 00:05:01,080 --> 00:05:03,160 Speaker 2: Yeah, because this is research that's been done by the 101 00:05:03,160 --> 00:05:05,000 Speaker 2: Massachusetts... 102 00:05:04,640 --> 00:05:08,359 Speaker 1: So you're doing exactly what I just did. You're outsourcing your knowledge. 103 00:05:08,600 --> 00:05:11,200 Speaker 2: No, you asked. ChatGPT did all the work for you. 104 00:05:11,279 --> 00:05:13,480 Speaker 2: I had to research this, go through and read an 105 00:05:13,600 --> 00:05:16,320 Speaker 2: entire article. You just asked the stupid AI chatbot. 106 00:05:16,800 --> 00:05:20,120 Speaker 1: Well, my way is just way more efficient. You're still, 107 00:05:20,200 --> 00:05:23,359 Speaker 1: you're still going somewhere to get the answer. I went somewhere 108 00:05:23,360 --> 00:05:24,239 Speaker 1: that's much quicker. 109 00:05:24,640 --> 00:05:29,440 Speaker 2: Yeah. Well, MIT published a study showing people who used ChatGPT 110 00:05:29,600 --> 00:05:33,000 Speaker 2: to write essays showed less activity in the brain networks 111 00:05:33,279 --> 00:05:39,400 Speaker 2: associated with cognitive processing while undertaking that particular exercise. So 112 00:05:39,480 --> 00:05:41,320 Speaker 2: the more that we rely on AI, the less... 113 00:05:41,440 --> 00:05:45,200 Speaker 1: No, no, stop stop stop. Stop.
You just said people 114 00:05:45,200 --> 00:05:49,640 Speaker 1: who write essays. That's a very specific thing, which is 115 00:05:49,680 --> 00:05:52,880 Speaker 1: not what I am doing. So you can't extrapolate and 116 00:05:53,040 --> 00:05:56,440 Speaker 1: generalize on one thing. So what you do is you 117 00:05:56,600 --> 00:05:58,880 Speaker 1: take, you take a grain of sand and go, let 118 00:05:58,960 --> 00:06:02,719 Speaker 1: me tell you about the beach. Right? As opposed to the 119 00:06:02,720 --> 00:06:05,200 Speaker 1: way that we do it, which is, hey, we don't know something, 120 00:06:05,600 --> 00:06:08,320 Speaker 1: tell us what this is, now we know. How is 121 00:06:08,360 --> 00:06:08,839 Speaker 1: that bad? 122 00:06:09,440 --> 00:06:13,960 Speaker 2: Well, one of the things is that by reading... do 123 00:06:14,080 --> 00:06:15,440 Speaker 2: you ever get anything wrong? 124 00:06:15,560 --> 00:06:18,240 Speaker 1: Come on, do you ever go, are you ever going 125 00:06:18,320 --> 00:06:20,839 Speaker 1: to go, on this show, actually, Craig, that's a good point? 126 00:06:20,960 --> 00:06:22,640 Speaker 2: Can I give you a quote from the BBC? 127 00:06:22,960 --> 00:06:26,680 Speaker 1: And so, not your quote, someone else's quote. 128 00:06:26,800 --> 00:06:31,560 Speaker 2: AI makes it too easy to find answers. Okay? So 129 00:06:31,720 --> 00:06:35,600 Speaker 2: Carnegie Mellon University and Microsoft, which operates Copilot, 130 00:06:35,600 --> 00:06:40,279 Speaker 2: found people's problem-solving skills could diminish if they become 131 00:06:40,360 --> 00:06:42,920 Speaker 2: too reliant on AI. 132 00:06:43,400 --> 00:06:48,000 Speaker 1: According to them, according to the research. Why, when you 133 00:06:48,080 --> 00:06:50,040 Speaker 1: read something, do you go, oh, well, now that's what 134 00:06:50,080 --> 00:06:50,560 Speaker 1: I think? 135 00:06:50,880 --> 00:06:53,000 Speaker 2: No, I think that when you're reading an article, you're 136 00:06:53,040 --> 00:06:55,839 Speaker 2: spending more time on it and taking more in. You're 137 00:06:55,960 --> 00:06:59,159 Speaker 2: thinking through the process, reading the paragraphs, taking that information in. 138 00:06:59,760 --> 00:07:03,000 Speaker 2: Whereas when you're asking an AI, it's just doing the 139 00:07:03,040 --> 00:07:05,920 Speaker 2: summary for you. Everything's been done for you, at least. 140 00:07:06,160 --> 00:07:08,479 Speaker 1: Isn't the point of both of them to find 141 00:07:08,480 --> 00:07:09,680 Speaker 1: out the answer to a question? 142 00:07:09,760 --> 00:07:12,120 Speaker 2: Sure, yeah it is. But the other thing is the... 143 00:07:12,480 --> 00:07:14,360 Speaker 2: and this is what we kind of, I mean, I guess, 144 00:07:14,520 --> 00:07:16,560 Speaker 2: AI summaries is one of the big things that came 145 00:07:16,560 --> 00:07:20,880 Speaker 2: out of twenty twenty five, where, for the first time ever, Google, 146 00:07:21,040 --> 00:07:24,280 Speaker 2: and look, most of us do use Google for our 147 00:07:24,400 --> 00:07:28,040 Speaker 2: online searching, now when you do a search, it 148 00:07:28,160 --> 00:07:31,600 Speaker 2: means that you get this AI overview.
And this is 149 00:07:31,640 --> 00:07:35,720 Speaker 2: a concern for websites, people who are writing news articles, 150 00:07:35,760 --> 00:07:39,040 Speaker 2: because what we found is that between sixty and seventy 151 00:07:39,040 --> 00:07:41,360 Speaker 2: percent of people are now no longer going to the 152 00:07:41,400 --> 00:07:44,480 Speaker 2: actual website where the content's from. They're just relying on 153 00:07:44,520 --> 00:07:48,120 Speaker 2: the AI overview information. And the big concern is that 154 00:07:48,200 --> 00:07:52,320 Speaker 2: sometimes that AI overview is actually wrong. And this is 155 00:07:52,720 --> 00:07:55,640 Speaker 2: the big concern, because people are getting their news from 156 00:07:55,640 --> 00:07:59,360 Speaker 2: AI and they're reading the summary, but they're not 157 00:07:59,400 --> 00:08:02,000 Speaker 2: delving any deeper into it, and they're relying on that, 158 00:08:02,360 --> 00:08:05,640 Speaker 2: and AI is wrong quite often, it can be wrong, 159 00:08:06,000 --> 00:08:08,920 Speaker 2: and that could mean critical things for people if they're 160 00:08:08,960 --> 00:08:13,000 Speaker 2: researching things like, you know, with my cancer treatment, should 161 00:08:13,040 --> 00:08:15,680 Speaker 2: I be doing X, Y, Z. And these are specific 162 00:08:15,680 --> 00:08:18,680 Speaker 2: examples I was reading about where the concerns are that 163 00:08:18,720 --> 00:08:21,720 Speaker 2: people are relying on it too heavily, and particularly medical 164 00:08:21,760 --> 00:08:24,280 Speaker 2: advice is something that, you know, you want to make 165 00:08:24,280 --> 00:08:27,680 Speaker 2: sure you're getting that from a legitimate site, which is 166 00:08:27,720 --> 00:08:31,040 Speaker 2: a medical authority or government health authority, rather than 167 00:08:31,360 --> 00:08:35,240 Speaker 2: just relying on an overview. So it is concerning. And look, 168 00:08:35,640 --> 00:08:37,800 Speaker 2: I love the fact that AI is here and can 169 00:08:37,800 --> 00:08:40,720 Speaker 2: do some great things. I've got to go to a 170 00:08:40,840 --> 00:08:44,839 Speaker 2: funeral service next week, and you know, I'm putting together 171 00:08:44,840 --> 00:08:48,120 Speaker 2: the booklet for the service, and there was a photo, 172 00:08:48,200 --> 00:08:51,640 Speaker 2: a family photo that my cousin gave to me, and 173 00:08:52,080 --> 00:08:54,000 Speaker 2: there were two people in the family that weren't in 174 00:08:54,040 --> 00:08:57,280 Speaker 2: the photo, and so she said, look, can you put 175 00:08:57,320 --> 00:08:59,920 Speaker 2: those people into the photograph? And I was able to 176 00:09:00,040 --> 00:09:02,640 Speaker 2: put it together. And the reality of it is, you know, 177 00:09:02,880 --> 00:09:05,400 Speaker 2: traditionally I would have had to cut out all the 178 00:09:05,400 --> 00:09:09,559 Speaker 2: people from the photo using traditional Photoshop tools, which is 179 00:09:09,640 --> 00:09:11,960 Speaker 2: quite precise work. And one of the difficult things, not so 180 00:09:12,040 --> 00:09:13,680 Speaker 2: much for you and I, because, we, you know, in 181 00:09:13,720 --> 00:09:16,040 Speaker 2: your case, you've got cropped hair. In my case, I 182 00:09:16,080 --> 00:09:19,160 Speaker 2: have none, so cutting us out of a photograph is 183 00:09:19,160 --> 00:09:22,600 Speaker 2: pretty easy.
But for somebody who's got luscious long lengths 184 00:09:22,600 --> 00:09:24,920 Speaker 2: of hair, it actually is really difficult to cut the 185 00:09:24,960 --> 00:09:28,880 Speaker 2: hair out. But now AI allows you to instantly remove 186 00:09:28,960 --> 00:09:31,680 Speaker 2: backgrounds and then you can layer that, put new people in, 187 00:09:32,040 --> 00:09:34,880 Speaker 2: arrange them around, take the backgrounds, put the background back 188 00:09:34,880 --> 00:09:38,760 Speaker 2: in again, and suddenly you've got this family photo. And 189 00:09:38,800 --> 00:09:40,679 Speaker 2: it was great to be able to use those tools, 190 00:09:40,720 --> 00:09:43,880 Speaker 2: which normally would have taken me hours. So yeah, I 191 00:09:44,280 --> 00:09:47,880 Speaker 2: use stacks of AI and it's great in some applications. 192 00:09:49,800 --> 00:09:52,280 Speaker 1: Yeah, I think there are pros and cons for everything. 193 00:09:52,520 --> 00:09:57,080 Speaker 1: But I think, back just quickly to the AI summaries, yeah, well, 194 00:09:57,320 --> 00:09:59,600 Speaker 1: I think everybody knows we don't get our health advice 195 00:09:59,600 --> 00:10:01,600 Speaker 1: from the internet, period, do we? I mean, 196 00:10:01,679 --> 00:10:04,400 Speaker 1: who's going, I'm not going to a doctor? But hang on, 197 00:10:04,480 --> 00:10:10,760 Speaker 1: hang on, like, you can get misinformation, and you do, everywhere. 198 00:10:11,040 --> 00:10:14,800 Speaker 1: It's not just AI summaries, it's YouTube, it's Instagram, it's Facebook, 199 00:10:15,080 --> 00:10:19,000 Speaker 1: it's every fucking post that every person puts up. So 200 00:10:19,120 --> 00:10:22,360 Speaker 1: I think to go, you know, like, you can go 201 00:10:22,400 --> 00:10:25,880 Speaker 1: to actual websites and get bullshit, because they sell bullshit 202 00:10:25,920 --> 00:10:28,240 Speaker 1: and promote bullshit, and they've got a vested interest in 203 00:10:28,320 --> 00:10:32,160 Speaker 1: you buying what they're selling. So I think to assume that, 204 00:10:33,040 --> 00:10:35,280 Speaker 1: you know, like, I agree with you, but I think, 205 00:10:35,320 --> 00:10:39,320 Speaker 1: but that's just, that's, you know, everywhere. Like, to be 206 00:10:39,360 --> 00:10:46,559 Speaker 1: able to... if you're going to any website that sells anything, 207 00:10:46,800 --> 00:10:49,720 Speaker 1: then you're not getting objectivity. You're getting a version of 208 00:10:49,720 --> 00:10:54,439 Speaker 1: a sales pitch. And most, most websites that we go 209 00:10:54,559 --> 00:10:59,400 Speaker 1: to are commercial, you know, like, it's not like it's 210 00:10:59,520 --> 00:11:03,560 Speaker 1: just this amazing resource that nobody's making money from. 211 00:11:04,000 --> 00:11:06,400 Speaker 2: I guess the one thing that I was saying before 212 00:11:06,520 --> 00:11:10,920 Speaker 2: is that, like, for specific things like medical advice, legal advice, 213 00:11:11,160 --> 00:11:13,400 Speaker 2: there certainly are legitimate sites that you can go to. 214 00:11:13,520 --> 00:11:16,880 Speaker 2: I would go to a government one, and even if you 215 00:11:16,880 --> 00:11:19,040 Speaker 2: look at the extension of a website, if 216 00:11:19,080 --> 00:11:21,920 Speaker 2: it's a dot gov dot au, then you know that 217 00:11:21,960 --> 00:11:24,160 Speaker 2: it's a government site.
You know, if you want 218 00:11:24,160 --> 00:11:27,800 Speaker 2: specific weather-related information, you go to the Weather Bureau, 219 00:11:28,120 --> 00:11:30,720 Speaker 2: you know, for information there, or for health, to a 220 00:11:30,760 --> 00:11:34,000 Speaker 2: medical site. So I think it's discerning where you are 221 00:11:34,920 --> 00:11:37,040 Speaker 2: in relation to the type of information that you're getting. 222 00:11:37,040 --> 00:11:39,199 Speaker 2: But you're right, that is a difficult thing to kind 223 00:11:39,200 --> 00:11:41,520 Speaker 2: of discern, because there's so much slop out there. That 224 00:11:41,640 --> 00:11:44,480 Speaker 2: was the word of the year. Remember, AI slop was 225 00:11:44,520 --> 00:11:47,280 Speaker 2: one of the dictionaries', the big dictionaries', word of the year. 226 00:11:47,400 --> 00:11:51,480 Speaker 2: And if you create a new YouTube account right now, 227 00:11:51,640 --> 00:11:54,120 Speaker 2: so you've never used YouTube, so you jump on and 228 00:11:54,160 --> 00:11:56,920 Speaker 2: you create a new account to start using YouTube from 229 00:11:57,360 --> 00:12:00,800 Speaker 2: absolute scratch, twenty percent, more than twenty 230 00:12:00,840 --> 00:12:03,360 Speaker 2: percent, of the videos that you will be shown, you know, 231 00:12:03,360 --> 00:12:06,360 Speaker 2: when you start scrolling through your feed, well, twenty percent, 232 00:12:06,720 --> 00:12:10,440 Speaker 2: more than twenty, will be AI slop. That's guaranteed, so 233 00:12:10,920 --> 00:12:13,720 Speaker 2: you know already one in five will be, you know, 234 00:12:13,840 --> 00:12:17,840 Speaker 2: this generated AI stuff. Do you, when you flick through, 235 00:12:18,080 --> 00:12:21,120 Speaker 2: and whether it's X or any of those, do you 236 00:12:21,160 --> 00:12:22,920 Speaker 2: struggle with that? Do you look at that and think 237 00:12:22,920 --> 00:12:25,400 Speaker 2: to yourself, am I looking at something that's real? Is 238 00:12:25,440 --> 00:12:27,720 Speaker 2: this fake? Is that something that's a concern for you? 239 00:12:28,520 --> 00:12:30,800 Speaker 1: Well, yeah, but I don't worry about it. I don't 240 00:12:30,840 --> 00:12:33,079 Speaker 1: believe... it's like, even when you go to the government, do 241 00:12:33,120 --> 00:12:36,160 Speaker 1: you think the government doesn't get shit wrong? Do you 242 00:12:36,200 --> 00:12:39,640 Speaker 1: think that everything on government websites is... like, I just 243 00:12:39,720 --> 00:12:44,120 Speaker 1: think, it's not about being negative, and I'm not anti 244 00:12:44,160 --> 00:12:46,640 Speaker 1: government at all. Thank God for the government, or we'd 245 00:12:46,679 --> 00:12:49,720 Speaker 1: be living in fucking whatever, you know. But the government 246 00:12:49,760 --> 00:12:52,160 Speaker 1: gets stuff wrong, the government gets stuff right. I think 247 00:12:52,240 --> 00:12:57,360 Speaker 1: the reality, like, putting aside whatever the medium is or 248 00:12:57,400 --> 00:13:00,200 Speaker 1: the delivery system, be it Facebook, be it X, be 249 00:13:00,280 --> 00:13:04,840 Speaker 1: it Instagram, be it a government website, the human experience 250 00:13:04,920 --> 00:13:08,280 Speaker 1: is we get things wrong. Have I ever 251 00:13:08,320 --> 00:13:11,720 Speaker 1: gotten things wrong? Every fucking day of my life.
Have 252 00:13:11,840 --> 00:13:14,880 Speaker 1: I ever said things that aren't true? Every day of 253 00:13:14,920 --> 00:13:19,880 Speaker 1: my life. Have I ever misled people unintentionally? A million times. 254 00:13:20,160 --> 00:13:22,760 Speaker 1: So I think that we need to go, you know, like, 255 00:13:23,320 --> 00:13:26,240 Speaker 1: if you've got the twenty-one hundred episodes of this show, 256 00:13:27,000 --> 00:13:30,080 Speaker 1: I would say on every episode we've said something that 257 00:13:30,160 --> 00:13:33,359 Speaker 1: was bullshit, if not once, ten times on each episode. 258 00:13:34,000 --> 00:13:36,120 Speaker 1: So this kind of thing that we have of, oh, 259 00:13:36,240 --> 00:13:38,800 Speaker 1: that's good over there and this is bad over here, 260 00:13:38,960 --> 00:13:41,040 Speaker 1: I think you've got to turn down the volume on 261 00:13:41,080 --> 00:13:44,840 Speaker 1: that. Because, to say... oh, and this always gets me, 262 00:13:45,600 --> 00:13:48,760 Speaker 1: and I am literally a fucking researcher at a PhD level. 263 00:13:48,760 --> 00:13:51,319 Speaker 1: I don't like to play that card. But when people 264 00:13:51,360 --> 00:13:55,200 Speaker 1: go, sixty percent of this. Really? It's not fifty seven, 265 00:13:55,280 --> 00:13:57,800 Speaker 1: it's not sixty two? I'm not talking about you, by 266 00:13:57,840 --> 00:14:00,760 Speaker 1: the way, but there are all of these very convenient 267 00:14:03,040 --> 00:14:07,600 Speaker 1: research kind of numbers that come out. I was talking 268 00:14:07,640 --> 00:14:12,600 Speaker 1: with somebody the other day about a particular thing in 269 00:14:12,640 --> 00:14:16,680 Speaker 1: Big Pharma, this particular drug and the efficacy of this drug. 270 00:14:16,720 --> 00:14:19,520 Speaker 1: And you know, when you go through it... I was 271 00:14:19,560 --> 00:14:26,240 Speaker 1: talking with David Gillespie about treatment for cholesterol, right. And 272 00:14:26,280 --> 00:14:28,720 Speaker 1: then you start digging and you go, yeah, this is 273 00:14:28,800 --> 00:14:30,840 Speaker 1: science, and it is science, and you can read the 274 00:14:30,880 --> 00:14:33,120 Speaker 1: report and you can read the paper and you go, fuck, 275 00:14:33,200 --> 00:14:36,640 Speaker 1: but this is in this journal. This is a tier one, 276 00:14:36,800 --> 00:14:39,720 Speaker 1: very credible Q1 journal. And then you dig and 277 00:14:39,720 --> 00:14:42,400 Speaker 1: you dig and you dig and you go, oh, this 278 00:14:42,480 --> 00:14:45,800 Speaker 1: is financed by Big Pharma, which it was, and then 279 00:14:45,840 --> 00:14:48,040 Speaker 1: you go, oh, but it's science. You go, no, no. 280 00:14:48,800 --> 00:14:53,960 Speaker 1: Like, people get too hysterical about the validity of science. 281 00:14:54,280 --> 00:14:57,080 Speaker 1: It's like, you've got to remember this: who designed the study? 282 00:14:57,160 --> 00:15:01,840 Speaker 1: A human. Who's financing the study? Big Pharma. Well, as 283 00:15:01,840 --> 00:15:04,360 Speaker 1: if they want you to produce outcomes that make them 284 00:15:04,400 --> 00:15:07,720 Speaker 1: look bad when they are literally paying your fucking wage, right? 285 00:15:08,320 --> 00:15:12,040 Speaker 1: So you have... and who interpreted the data? Oh, and 286 00:15:12,120 --> 00:15:15,680 Speaker 1: for this one study we spoke about, they wouldn't reveal 287 00:15:15,760 --> 00:15:21,040 Speaker 1: the data, because the data disproved their theory, their hypothesis.
288 00:15:21,320 --> 00:15:24,600 Speaker 1: So what they did was they just revealed the interpretation 289 00:15:24,760 --> 00:15:27,800 Speaker 1: of the scientists of the data, right. But for the 290 00:15:27,800 --> 00:15:33,040 Speaker 1: average punter, we go, oh, Craig said this. Craig's smart. 291 00:15:33,400 --> 00:15:35,920 Speaker 1: So this is how it is. No, Craig's not smart. 292 00:15:35,960 --> 00:15:38,520 Speaker 1: Craig's a fucking human who gets things wrong all the time. 293 00:15:38,920 --> 00:15:42,480 Speaker 1: Scientists get things wrong all the time. Over the years, 294 00:15:42,520 --> 00:15:46,560 Speaker 1: we've constantly been... you know, like, low-fat eating in 295 00:15:46,600 --> 00:15:50,280 Speaker 1: the early seventies was meant to revolutionize the health state 296 00:15:50,320 --> 00:15:53,160 Speaker 1: of all first-world countries, and in fact, what it's 297 00:15:53,160 --> 00:15:56,800 Speaker 1: done since the fifties, or since the seventies, there's been 298 00:15:56,800 --> 00:16:00,440 Speaker 1: an increase in obesity and obesity-related diseases since the 299 00:16:00,480 --> 00:16:04,480 Speaker 1: introduction of low-fat food, because low-fat food equaled 300 00:16:04,560 --> 00:16:07,480 Speaker 1: high-sugar food, right. And so, which is not to 301 00:16:07,520 --> 00:16:10,800 Speaker 1: say don't eat carbs or don't eat sugar. It's just to say 302 00:16:10,840 --> 00:16:15,240 Speaker 1: that science is constantly getting it wrong. When the pandemic struck, 303 00:16:15,640 --> 00:16:19,080 Speaker 1: it was estimated one hundred and fifty thousand Australians would die. 304 00:16:19,680 --> 00:16:22,000 Speaker 1: It was on the front of the paper, it was 305 00:16:22,040 --> 00:16:24,240 Speaker 1: on the Herald Sun and it was on The Age, 306 00:16:24,280 --> 00:16:27,560 Speaker 1: and this was, this was the estimate by the best 307 00:16:27,600 --> 00:16:31,800 Speaker 1: scientists, virologists, except they got it wildly fucking wrong. Nobody 308 00:16:31,840 --> 00:16:34,000 Speaker 1: comes out and goes, oh, you know that thing we said, 309 00:16:34,080 --> 00:16:36,800 Speaker 1: we fucked up, right? This is the world that we 310 00:16:36,880 --> 00:16:39,440 Speaker 1: live in. Politicians: tell me a politician who comes out 311 00:16:39,480 --> 00:16:42,080 Speaker 1: and goes, that thing that we did, we got it wrong. 312 00:16:42,160 --> 00:16:44,480 Speaker 1: That thing that we said, we got it wrong. So 313 00:16:44,560 --> 00:16:48,000 Speaker 1: I think, like, for me, I'm, I'm not... I read 314 00:16:48,080 --> 00:16:50,680 Speaker 1: everything I can, and 315 00:16:50,720 --> 00:16:54,160 Speaker 1: I pay attention, but I'm more about, let me see 316 00:16:54,160 --> 00:16:58,040 Speaker 1: the results over time. So rather than going, oh, I 317 00:16:58,160 --> 00:17:00,960 Speaker 1: read this here and it said that, therefore that's true, 318 00:17:01,040 --> 00:17:04,359 Speaker 1: I'm like, no. Or, I heard this, therefore that's true. 319 00:17:04,400 --> 00:17:08,960 Speaker 1: I'm like, maybe, maybe. And I just think that 320 00:17:09,920 --> 00:17:16,680 Speaker 1: assigning a certain value or worth or respect to something 321 00:17:16,880 --> 00:17:20,399 Speaker 1: just because someone said it to me, I think, is 322 00:17:20,520 --> 00:17:25,399 Speaker 1: actually the opposite of critical thinking.
Like, I think, like, 323 00:17:25,480 --> 00:17:29,320 Speaker 1: you're super smart, and I'm not saying that disingenuously. You're 324 00:17:29,359 --> 00:17:32,480 Speaker 1: super smart, and I think that all of us have 325 00:17:32,560 --> 00:17:34,920 Speaker 1: the ability to go, all right, this is what they said, 326 00:17:34,960 --> 00:17:37,800 Speaker 1: but let me... And by the way, research is not 327 00:17:37,960 --> 00:17:41,159 Speaker 1: reading a paper that someone wrote, that's not research, and 328 00:17:41,240 --> 00:17:45,600 Speaker 1: watching a video, that's not research. Research is actually research, 329 00:17:46,280 --> 00:17:48,960 Speaker 1: you know. Steps down off soapbox. But that's all, I think. 330 00:17:49,080 --> 00:17:52,920 Speaker 1: I just get, I get frustrated when people go, oh, 331 00:17:53,000 --> 00:17:56,840 Speaker 1: you're wrong, because I read this here. I'm like, well, fuck, okay, 332 00:17:57,040 --> 00:17:58,080 Speaker 1: let's see over time. 333 00:17:58,640 --> 00:18:00,639 Speaker 2: Look, one of the interesting things is, I trained to 334 00:18:00,680 --> 00:18:04,359 Speaker 2: be a journalist. We were told never to believe one source: 335 00:18:04,440 --> 00:18:07,880 Speaker 2: that you check multiple sources, you interview more than one 336 00:18:07,920 --> 00:18:10,800 Speaker 2: person on the topic, you get both sides of the story, 337 00:18:10,880 --> 00:18:13,560 Speaker 2: try to get as much information in so that you 338 00:18:13,600 --> 00:18:16,439 Speaker 2: could give both sides, you know. And that's where I 339 00:18:16,480 --> 00:18:19,480 Speaker 2: think a lot of the mainstream media has diverged a lot. 340 00:18:19,560 --> 00:18:22,440 Speaker 2: If you look at Sky Television, for example, and look 341 00:18:22,440 --> 00:18:26,080 Speaker 2: at some of the extreme, more extreme, you know, reporting, 342 00:18:26,720 --> 00:18:28,639 Speaker 2: and there's lots of reasons for that. I think that 343 00:18:28,800 --> 00:18:33,280 Speaker 2: one of the challenges for journalism now is that, you know, 344 00:18:33,680 --> 00:18:36,680 Speaker 2: there's less resources going into journalism, because people are now 345 00:18:36,800 --> 00:18:42,240 Speaker 2: using Facebook to disseminate information, people using X to disseminate information. 346 00:18:42,359 --> 00:18:45,800 Speaker 2: So the checks and measures aren't there anymore. There's no editors, 347 00:18:45,800 --> 00:18:49,920 Speaker 2: there's no sub-editors that are deciding; this person 348 00:18:50,240 --> 00:18:53,640 Speaker 2: who's on X isn't being vetted in any way. And 349 00:18:53,680 --> 00:18:56,439 Speaker 2: that's problematic, because, you know, if you go down that 350 00:18:56,520 --> 00:18:59,800 Speaker 2: rabbit hole, X will continue to serve up the same stuff. 351 00:18:59,840 --> 00:19:01,679 Speaker 2: That is, you know, they just want to keep you 352 00:19:01,720 --> 00:19:03,760 Speaker 2: there as long as they can, whether it's Facebook or X. 353 00:19:04,240 --> 00:19:07,800 Speaker 2: And that way, the information you're being given, it may 354 00:19:07,880 --> 00:19:10,280 Speaker 2: not be credible, but it will keep serving it up 355 00:19:10,320 --> 00:19:12,800 Speaker 2: if you keep clicking on it, and it will try 356 00:19:12,920 --> 00:19:15,639 Speaker 2: to keep you.
So, if your rabbit hole is the 357 00:19:15,680 --> 00:19:20,000 Speaker 2: Earth is actually, legitimately, flat... and I know somebody 358 00:19:20,160 --> 00:19:24,480 Speaker 2: who has a sibling who dead set believes the world 359 00:19:24,560 --> 00:19:29,200 Speaker 2: is flat, absolutely black and blue believes the world is flat. 360 00:19:29,600 --> 00:19:31,879 Speaker 1: And I have a friend who believes the world is flat. 361 00:19:31,960 --> 00:19:34,720 Speaker 1: And you know this friend. I won't say this person's name, 362 00:19:34,760 --> 00:19:37,600 Speaker 1: but I'll tell you after. I have a good friend 363 00:19:37,760 --> 00:19:41,560 Speaker 1: who, they are going further and further. They're not one 364 00:19:41,600 --> 00:19:44,840 Speaker 1: hundred percent, but they're about ninety percent sure the Earth's flat. 365 00:19:44,920 --> 00:19:51,399 Speaker 1: I'm like, oh, okay, okay, like, all right, and all that. 366 00:19:51,680 --> 00:19:54,200 Speaker 1: You know. Yeah, go on, though. 367 00:19:53,840 --> 00:19:56,679 Speaker 2: But if you, if you then mix with those Facebook groups, if 368 00:19:56,720 --> 00:19:59,120 Speaker 2: you, if you're a flat earther and then you talk 369 00:19:59,160 --> 00:20:02,919 Speaker 2: to other flat earthers, you're going to have that reinforced constantly. 370 00:20:03,200 --> 00:20:05,560 Speaker 2: But with social media... so, I would talk to you, 371 00:20:05,560 --> 00:20:09,760 Speaker 2: you reinforce my mindset. But what happens on social media 372 00:20:09,840 --> 00:20:12,080 Speaker 2: is it's got an algorithm that's built into it to 373 00:20:12,240 --> 00:20:14,800 Speaker 2: keep serving that up. So it's not just me catching 374 00:20:14,880 --> 00:20:16,920 Speaker 2: up with Craig over a beer and you say, yeah, 375 00:20:16,920 --> 00:20:19,520 Speaker 2: the world is definitely flat, and I say, well, Craig 376 00:20:19,560 --> 00:20:22,800 Speaker 2: reckons it is, so there you go. Now you've got 377 00:20:22,840 --> 00:20:25,879 Speaker 2: all these voices in the wilderness, and then your algorithm 378 00:20:26,160 --> 00:20:28,520 Speaker 2: just serves you up that. So if you're far left 379 00:20:28,840 --> 00:20:31,520 Speaker 2: or far right, then you're going to get more far 380 00:20:31,600 --> 00:20:35,080 Speaker 2: left or far right information via your social media. Because 381 00:20:35,440 --> 00:20:38,480 Speaker 2: the big thing to bear in mind with socials is 382 00:20:38,520 --> 00:20:42,280 Speaker 2: that it's not government vetted, it's not government run. It's 383 00:20:42,359 --> 00:20:46,280 Speaker 2: run by corporations that just want to make advertising revenue. 384 00:20:46,480 --> 00:20:49,520 Speaker 2: That's all they're about. The AI slop we were talking 385 00:20:49,520 --> 00:20:52,480 Speaker 2: about before on YouTube, so you sign up to get 386 00:20:52,480 --> 00:20:55,399 Speaker 2: onto YouTube for the first time, twenty percent of videos 387 00:20:55,400 --> 00:20:59,040 Speaker 2: are going to be AI slop. But last year AI 388 00:20:59,280 --> 00:21:03,560 Speaker 2: slop generated one hundred and seventeen million dollars US 389 00:21:03,640 --> 00:21:07,360 Speaker 2: a year in ad revenue. That's why people are generating 390 00:21:07,359 --> 00:21:09,640 Speaker 2: the crap, because they've got money out of it.
391 00:21:09,560 --> 00:21:12,760 Speaker 1: Of course. But also, and I think you're exactly right, but 392 00:21:12,800 --> 00:21:16,520 Speaker 1: also, Patrick James Kevin, Patrick James Bonello, think about this. 393 00:21:17,400 --> 00:21:20,880 Speaker 1: So you go, oh, when I open, I start watching, 394 00:21:21,240 --> 00:21:25,880 Speaker 1: fucking, you know, videos on how I'm going to heal 395 00:21:25,960 --> 00:21:29,359 Speaker 1: my eczema, and I get all these weird products from 396 00:21:29,400 --> 00:21:31,720 Speaker 1: Amazon, and then I keep getting more and more, 397 00:21:31,720 --> 00:21:34,480 Speaker 1: and then this one will heal my fucking whatever, right, 398 00:21:34,520 --> 00:21:38,760 Speaker 1: and it's all similar stuff. So it's like, now 399 00:21:38,800 --> 00:21:41,840 Speaker 1: you're in this echo chamber. But walk away from the 400 00:21:41,920 --> 00:21:44,560 Speaker 1: virtual world and the Internet, and then you go into 401 00:21:44,600 --> 00:21:48,600 Speaker 1: a church. What do you hear? Everybody preaching the same message, 402 00:21:48,640 --> 00:21:51,960 Speaker 1: telling you the same thing, and if anybody questions that message, 403 00:21:52,000 --> 00:21:54,240 Speaker 1: you're bad, you're evil, you're going to hell. You go 404 00:21:54,359 --> 00:21:58,040 Speaker 1: to a synagogue, same, except without the Jesus bit, right. 405 00:21:58,480 --> 00:22:02,120 Speaker 1: You go to a Buddhist temple where, well, they're, you know, 406 00:22:02,480 --> 00:22:05,800 Speaker 1: they don't really have a god per se, they 407 00:22:05,800 --> 00:22:09,200 Speaker 1: don't have a deity. You go to the Carlton football club: well, 408 00:22:09,240 --> 00:22:12,959 Speaker 1: Collingwood fans are fucking idiots, right. It's like, the world 409 00:22:13,080 --> 00:22:17,000 Speaker 1: is full of these echo chambers where you essentially belong 410 00:22:17,160 --> 00:22:19,280 Speaker 1: to a group, whether or not you chose it. You 411 00:22:19,400 --> 00:22:24,080 Speaker 1: belong to a group who keeps reinforcing the same message, 412 00:22:24,400 --> 00:22:29,160 Speaker 1: and if you question the message, then you're somehow penalized. 413 00:22:29,200 --> 00:22:32,399 Speaker 1: This is how people control and manipulate and coerce and 414 00:22:32,440 --> 00:22:35,400 Speaker 1: get the outcomes they want. And I grew up very 415 00:22:35,680 --> 00:22:39,639 Speaker 1: much in an echo chamber of theology and philosophy, where 416 00:22:39,720 --> 00:22:46,520 Speaker 1: if I questioned anything, I was basically criticized and humiliated 417 00:22:46,840 --> 00:22:50,160 Speaker 1: and all of these things, and so was everyone, by 418 00:22:50,200 --> 00:22:53,440 Speaker 1: the way. And that's how you control people. And that's 419 00:22:53,480 --> 00:22:56,920 Speaker 1: how you manipulate people: you go, no, there's only one. 420 00:22:57,080 --> 00:23:00,520 Speaker 1: It's like when people go to me, what's the best way? Okay, 421 00:23:00,560 --> 00:23:02,400 Speaker 1: so if I am an expert in anything, it's probably 422 00:23:02,480 --> 00:23:05,800 Speaker 1: exercise science: what's the best way to achieve this outcome 423 00:23:05,840 --> 00:23:07,800 Speaker 1: with my body? And I go, well, there actually isn't 424 00:23:07,840 --> 00:23:10,879 Speaker 1: a best way.
There's probably one hundred and sixty 425 00:23:10,920 --> 00:23:13,640 Speaker 1: seven different best ways, depending on your body, your genetics, 426 00:23:13,680 --> 00:23:17,480 Speaker 1: your background, your medical state, your chronological, your biological age. 427 00:23:17,560 --> 00:23:20,400 Speaker 1: So the idea that there's a best way, we need 428 00:23:20,440 --> 00:23:24,160 Speaker 1: to dispense with that. It's like you and the way 429 00:23:24,200 --> 00:23:26,280 Speaker 1: that you live. So you're a single bloke, you live 430 00:23:26,320 --> 00:23:31,640 Speaker 1: in the country. You like blokes, right? That's not good 431 00:23:31,720 --> 00:23:35,000 Speaker 1: or bad. That's how you live, right? For you, living 432 00:23:35,040 --> 00:23:39,199 Speaker 1: with your dog, having a great social network, having life 433 00:23:39,880 --> 00:23:42,800 Speaker 1: and work that for you is meaningful and purposeful... I 434 00:23:42,880 --> 00:23:44,960 Speaker 1: know you're not driven by money. You need to make 435 00:23:45,000 --> 00:23:46,719 Speaker 1: money because you've got to pay the bills, but I 436 00:23:46,760 --> 00:23:51,040 Speaker 1: know that, you know, for you, you're more about emotional and, 437 00:23:51,200 --> 00:23:54,639 Speaker 1: I guess, spiritual and social connection, and that's kind of 438 00:23:54,720 --> 00:23:57,920 Speaker 1: your bank account right now. That's perfect for you. It's 439 00:23:57,960 --> 00:24:01,600 Speaker 1: going to be the worst model ever for some other bloke. Yeah, right? 440 00:24:01,840 --> 00:24:05,199 Speaker 1: So we're all trying to find, like, what is, in 441 00:24:05,240 --> 00:24:08,040 Speaker 1: the middle of the myriad of ways that we can 442 00:24:08,119 --> 00:24:11,240 Speaker 1: live and think and do and be, not what's the 443 00:24:11,320 --> 00:24:13,840 Speaker 1: right way, but what's the best way for me, based 444 00:24:13,880 --> 00:24:17,400 Speaker 1: on who I am. And so it's like, I haven't 445 00:24:17,400 --> 00:24:19,520 Speaker 1: had a job since I was twenty four. Would I 446 00:24:19,560 --> 00:24:24,359 Speaker 1: recommend that? Definitely not. If you're a clone of me, yes, 447 00:24:25,200 --> 00:24:28,000 Speaker 1: but if you're not, which nobody is, well then figure 448 00:24:28,000 --> 00:24:30,520 Speaker 1: it out for yourself. But what I'll do is I 449 00:24:30,560 --> 00:24:32,239 Speaker 1: will tell you what I did and how I did 450 00:24:32,280 --> 00:24:34,439 Speaker 1: it and why I did it, and I'll tell you 451 00:24:34,480 --> 00:24:37,840 Speaker 1: what my outcomes were. But then, because you're not me, 452 00:24:37,960 --> 00:24:40,480 Speaker 1: you're going to think for yourself and you're going to 453 00:24:40,600 --> 00:24:42,800 Speaker 1: maybe try a couple of things and see what the 454 00:24:42,840 --> 00:24:45,480 Speaker 1: outcome is, and then, based on your own data and experience, 455 00:24:45,960 --> 00:24:49,120 Speaker 1: you'll figure out the best model for you. Craig Harper 456 00:24:49,200 --> 00:24:52,040 Speaker 1: cannot tell you what is the path for you, because 457 00:24:52,119 --> 00:24:54,159 Speaker 1: I'm not you and you're not me, you know. And 458 00:24:54,200 --> 00:24:57,800 Speaker 1: I think that's true for everything, from how should I eat, 459 00:24:58,119 --> 00:25:00,760 Speaker 1: to what's the best way for me to learn, to what's 460 00:25:00,800 --> 00:25:04,920 Speaker 1: the best relationship kind of paradigm for me?
Is living 461 00:25:04,960 --> 00:25:07,000 Speaker 1: in the country good for me? I know people have 462 00:25:07,040 --> 00:25:08,760 Speaker 1: moved to the country to have a kind of a, 463 00:25:09,680 --> 00:25:12,040 Speaker 1: what do they call it, a tree change, and they're 464 00:25:12,040 --> 00:25:14,760 Speaker 1: back in fucking twelve months, they go, fuck the country. 465 00:25:15,160 --> 00:25:17,200 Speaker 1: And then you go up there, you're like, this is 466 00:25:17,240 --> 00:25:18,040 Speaker 1: the best shit ever. 467 00:25:18,560 --> 00:25:20,800 Speaker 2: Yeah, seventeen years. 468 00:25:21,320 --> 00:25:24,440 Speaker 1: That's it, mate. It's just everyone trying to figure out 469 00:25:24,680 --> 00:25:28,000 Speaker 1: how they best fit and how they best work, in 470 00:25:28,040 --> 00:25:31,600 Speaker 1: the middle of all of this overwhelm of, you should 471 00:25:31,600 --> 00:25:34,200 Speaker 1: do this, you should do that, you shouldn't do those things. 472 00:25:35,280 --> 00:25:38,479 Speaker 2: Yeah, it's interesting. I know people who bought in the 473 00:25:38,480 --> 00:25:41,159 Speaker 2: town that I live in during COVID, they did the 474 00:25:41,160 --> 00:25:43,479 Speaker 2: tree change, they got out of the Ring of Steel 475 00:25:43,640 --> 00:25:45,760 Speaker 2: and they did the move, and then they moved back 476 00:25:45,800 --> 00:25:48,919 Speaker 2: to Melbourne. Yeah, they just don't, you know... the lifestyle 477 00:25:49,040 --> 00:25:51,760 Speaker 2: is just not to their liking. They don't have an 478 00:25:51,920 --> 00:25:54,879 Speaker 2: array of cafes to go down to and have their 479 00:25:55,480 --> 00:25:56,600 Speaker 2: avocado on toast. 480 00:25:57,880 --> 00:26:00,560 Speaker 1: Listen to you, listen to you. But I mean, but 481 00:26:00,640 --> 00:26:04,880 Speaker 1: that's okay. Yeah, like, not everyone... It's like I say 482 00:26:04,880 --> 00:26:07,200 Speaker 1: to people all the time, don't fucking do what I do, 483 00:26:07,880 --> 00:26:10,520 Speaker 1: like, you would hate it. Like, it works for me, 484 00:26:10,760 --> 00:26:13,679 Speaker 1: and I don't think, oh, Craig's got it right. I 485 00:26:13,760 --> 00:26:15,760 Speaker 1: look at people who go out and get shit-faced 486 00:26:15,760 --> 00:26:19,240 Speaker 1: and talk all night for seven hours and fucking party 487 00:26:19,280 --> 00:26:20,960 Speaker 1: on and have the best time, and part of me 488 00:26:21,040 --> 00:26:23,000 Speaker 1: is a bit jealous. I definitely don't want to do 489 00:26:23,080 --> 00:26:25,919 Speaker 1: that, because I'm not wired that way. But I definitely 490 00:26:26,000 --> 00:26:27,720 Speaker 1: look at people who do that and have the best 491 00:26:27,720 --> 00:26:30,160 Speaker 1: time ever and go, fuck, good for them. That's amazing. 492 00:26:30,480 --> 00:26:32,080 Speaker 2: That's their idea of a good time. I went to 493 00:26:32,080 --> 00:26:34,840 Speaker 2: bed at ten o'clock on New Year's Eve. I had 494 00:26:34,840 --> 00:26:37,280 Speaker 2: some friends over. We played some board games, had a 495 00:26:37,280 --> 00:26:40,600 Speaker 2: bit of a laugh, listened to some vinyl records.
I 496 00:26:40,680 --> 00:26:43,880 Speaker 2: managed to score some vinyl records from a neighbor recently, 497 00:26:44,000 --> 00:26:49,280 Speaker 2: including Elton John's Yellow Brick Road, an eight-record box 498 00:26:49,320 --> 00:26:53,000 Speaker 2: set of the Beatles, and the record we were listening 499 00:26:53,000 --> 00:26:57,880 Speaker 2: to was seventies disco music. It was great. So, and 500 00:26:57,920 --> 00:27:00,359 Speaker 2: then I went to bed like just after ten o'clock. 501 00:27:00,560 --> 00:27:03,000 Speaker 2: You know, I'm not a New Year's Eve person. Well, 502 00:27:03,160 --> 00:27:04,960 Speaker 2: I guess you had an early night too, I'm assuming, 503 00:27:05,119 --> 00:27:05,280 Speaker 2: you know. 504 00:27:05,560 --> 00:27:07,920 Speaker 1: Yeah, yeah, I did fuck all. I did fuck all. 505 00:27:07,960 --> 00:27:10,520 Speaker 1: I was like, to me, it was Wednesday or Thursday 506 00:27:10,560 --> 00:27:13,879 Speaker 1: or whatever it was. But that's okay. It's like, I 507 00:27:13,920 --> 00:27:16,800 Speaker 1: was talking with somebody about this, Max, this morning. In fact, 508 00:27:16,840 --> 00:27:18,920 Speaker 1: I had a coffee with Maxie, shout out to Maxie, 509 00:27:18,920 --> 00:27:22,440 Speaker 1: and yeah, like you, mate, you go... I'm well aware 510 00:27:22,520 --> 00:27:24,439 Speaker 1: that the way that I do things is not the 511 00:27:24,480 --> 00:27:27,560 Speaker 1: way most people should. And I don't think I'm better 512 00:27:27,640 --> 00:27:32,240 Speaker 1: or worse or special or... I just know that. You know, firstly, 513 00:27:33,640 --> 00:27:36,280 Speaker 1: I'm an old bloke who lives by himself. Don't fucking 514 00:27:36,320 --> 00:27:39,639 Speaker 1: do what I do. It's like, most people that listen 515 00:27:39,680 --> 00:27:42,119 Speaker 1: to this show are not a version of me. So 516 00:27:42,200 --> 00:27:45,480 Speaker 1: that's why, you know, I'm so fascinated with theory of mind, 517 00:27:45,520 --> 00:27:50,640 Speaker 1: which is understanding other people's minds, understanding how other people think. Like, 518 00:27:50,840 --> 00:27:53,680 Speaker 1: I'm not even that interested in the Craig experience. I'm 519 00:27:53,720 --> 00:27:55,760 Speaker 1: more interested in, why does Patrick think the way that 520 00:27:55,800 --> 00:27:57,399 Speaker 1: he does? Why does he do this and that? Not 521 00:27:57,440 --> 00:28:01,479 Speaker 1: good or bad, no judgment, just curiosity, just awareness. Like, 522 00:28:01,880 --> 00:28:04,280 Speaker 1: what is it like for our listeners when you and 523 00:28:04,320 --> 00:28:06,920 Speaker 1: I argue? And I love when we argue, by the way. 524 00:28:07,080 --> 00:28:09,560 Speaker 1: I'm glad that we don't fucking agree on everything, right? 525 00:28:10,240 --> 00:28:10,359 Speaker 2: Like? 526 00:28:10,440 --> 00:28:12,679 Speaker 1: What's that like for them? I think, even when you 527 00:28:12,720 --> 00:28:15,199 Speaker 1: and I are going at it, I think, uh, is this 528 00:28:15,280 --> 00:28:18,119 Speaker 1: good to listen to? Is this shit to listen to? 529 00:28:18,880 --> 00:28:25,040 Speaker 1: You know? That other awareness, that kind of, that kind 530 00:28:25,040 --> 00:28:28,920 Speaker 1: of external self-awareness it's called, as well as metaperception, 531 00:28:29,040 --> 00:28:32,639 Speaker 1: of trying to have an understanding of what something is 532 00:28:32,800 --> 00:28:36,640 Speaker 1: like for somebody else.
You know, for me that's, that's 533 00:28:36,680 --> 00:28:40,280 Speaker 1: a fascination, but for other people, understandably, they're like, shut 534 00:28:40,280 --> 00:28:41,800 Speaker 1: the fuck up, pass me a beer. 535 00:28:43,320 --> 00:28:46,600 Speaker 2: You know, yeah, it's... I was actually just thinking in 536 00:28:46,640 --> 00:28:50,480 Speaker 2: relation to, you know, where we're heading in the future, 537 00:28:50,920 --> 00:28:54,360 Speaker 2: and how some of us are attracted to the likes 538 00:28:54,360 --> 00:28:57,560 Speaker 2: of, say, technology, and other people are resistant to it. 539 00:28:57,760 --> 00:28:59,960 Speaker 2: And sometimes you can put it into an age bracket. 540 00:29:00,160 --> 00:29:02,240 Speaker 2: You can say, well, young people have grown up with tech, 541 00:29:02,560 --> 00:29:05,600 Speaker 2: but I'm not a young person. You know, there weren't 542 00:29:05,600 --> 00:29:08,280 Speaker 2: computers around when I first went to school, and yet 543 00:29:08,400 --> 00:29:11,960 Speaker 2: I love tech, you know. And I have people, my peers, 544 00:29:12,360 --> 00:29:15,720 Speaker 2: who say, oh, I'm technically illiterate. That's your choice to 545 00:29:15,760 --> 00:29:18,960 Speaker 2: be technically illiterate. But I get it, because you're not 546 00:29:19,080 --> 00:29:21,440 Speaker 2: comfortable with it. You don't like it. But for me, 547 00:29:22,000 --> 00:29:24,360 Speaker 2: you know, I think I've told this story before, and 548 00:29:24,400 --> 00:29:25,960 Speaker 2: I know I'm getting off topic a little bit, but 549 00:29:26,320 --> 00:29:29,360 Speaker 2: when I was a kid, we got our first VHS 550 00:29:29,520 --> 00:29:32,239 Speaker 2: video recorder. It was an Akai and it cost a 551 00:29:32,280 --> 00:29:34,680 Speaker 2: real lot of money. Do you remember the story? 552 00:29:34,880 --> 00:29:38,360 Speaker 1: Yeah, no, I remember, though. My old man used to, anyway, 553 00:29:38,720 --> 00:29:42,240 Speaker 1: run a store where we sold those Akai VHSs. 554 00:29:42,320 --> 00:29:44,840 Speaker 2: Yeah, they were expensive. They were over one thousand dollars 555 00:29:45,000 --> 00:29:48,200 Speaker 2: at that time, so we're talking in, what, the early eighties? 556 00:29:48,560 --> 00:29:50,720 Speaker 2: You were paying over one thousand dollars. That's a lot 557 00:29:50,720 --> 00:29:53,840 Speaker 2: of money for a working-class family in Coburg. So 558 00:29:53,960 --> 00:29:57,000 Speaker 2: Dad buys this Akai, and it also had things on screen, 559 00:29:57,120 --> 00:29:59,400 Speaker 2: so the counter was on screen, whether we're going forward 560 00:29:59,480 --> 00:30:01,600 Speaker 2: or backward. It was very high-tech at the time. 561 00:30:02,000 --> 00:30:04,160 Speaker 2: So the first week we had it, Mum and Dad 562 00:30:04,160 --> 00:30:06,200 Speaker 2: get in the car to go shopping on a Saturday morning. 563 00:30:06,320 --> 00:30:10,920 Speaker 2: Patrick gets the screwdriver out, because I couldn't cope not 564 00:30:11,120 --> 00:30:14,160 Speaker 2: knowing what was inside and how the mechanism worked. So 565 00:30:14,200 --> 00:30:16,760 Speaker 2: I'm into the poor sucker and I've got all the 566 00:30:16,800 --> 00:30:19,360 Speaker 2: screws out, I've got a handful of screws, I've got 567 00:30:19,360 --> 00:30:21,640 Speaker 2: the cover off, and then Mum and Dad pulled back 568 00:30:21,680 --> 00:30:24,200 Speaker 2: into the driveway.
It's like, fuck, what am I going 569 00:30:24,280 --> 00:30:26,360 Speaker 2: to do? So I throw the cover back on, put 570 00:30:26,400 --> 00:30:28,320 Speaker 2: all the screws in the pocket and look guilty as 571 00:30:28,320 --> 00:30:30,120 Speaker 2: shit when they walk back in, because Mum had left 572 00:30:30,120 --> 00:30:33,160 Speaker 2: her purse, and that's the reason they'd come back so quickly. 573 00:30:33,480 --> 00:30:35,560 Speaker 2: And of course there's no way they knew what I 574 00:30:35,640 --> 00:30:39,000 Speaker 2: was doing. I mean, I'd hidden the screwdriver, and the bolts, 575 00:30:39,040 --> 00:30:41,720 Speaker 2: the screws, are all in the pocket. But Jesus, I shit myself. 576 00:30:41,760 --> 00:30:45,680 Speaker 2: And then, anyway... How old were you? Maybe thirty one? 577 00:30:47,960 --> 00:30:50,600 Speaker 2: Probably about thirteen or fourteen, I guess, at that time. 578 00:30:51,720 --> 00:30:54,640 Speaker 1: Did you fuck it up? Or did you successfully reassemble it? 579 00:30:54,680 --> 00:30:56,280 Speaker 2: Okay, I can tell you that I put it all 580 00:30:56,320 --> 00:30:58,680 Speaker 2: back together again, but there was one screw left over, 581 00:30:58,720 --> 00:31:03,400 Speaker 2: but that happens with everything I've ever taken apart. And not only 582 00:31:03,400 --> 00:31:06,280 Speaker 2: did it keep working, when I finally moved out of home, 583 00:31:06,640 --> 00:31:09,240 Speaker 2: like a decade later, they gave it to me to 584 00:31:09,280 --> 00:31:10,960 Speaker 2: take with me, because it still worked. 585 00:31:11,520 --> 00:31:16,040 Speaker 1: It's hilarious. Hey, tell me about, on my little 586 00:31:16,040 --> 00:31:19,520 Speaker 1: list of things that we haven't covered of yours, hiring 587 00:31:19,560 --> 00:31:20,160 Speaker 1: a robot. 588 00:31:20,880 --> 00:31:24,640 Speaker 2: Yes, so in China, if you're having a meeting or 589 00:31:24,680 --> 00:31:28,360 Speaker 2: a wedding, you can now hire a robot. So there's 590 00:31:28,360 --> 00:31:33,960 Speaker 2: a Chinese firm called AgiBot and they do humanoid robot rentals. 591 00:31:34,400 --> 00:31:38,000 Speaker 2: So the price of the rental is dependent on what 592 00:31:38,080 --> 00:31:40,640 Speaker 2: the robot model is and what you want it to do. 593 00:31:41,280 --> 00:31:44,000 Speaker 2: But they're actually entering the economy in a big way. 594 00:31:44,480 --> 00:31:48,040 Speaker 2: They're not cheap. But basically this has only just happened, 595 00:31:48,080 --> 00:31:49,400 Speaker 2: so this is going to be new for a 596 00:31:49,400 --> 00:31:53,040 Speaker 2: lot of people. There's sixteen different event duties that they 597 00:31:53,120 --> 00:31:58,560 Speaker 2: can be deployed for, so weddings, business meetings, concerts, trade shows. 598 00:31:59,760 --> 00:32:04,640 Speaker 2: The Chinese rental market is fascinating. So rather than buying 599 00:32:04,680 --> 00:32:06,880 Speaker 2: a robot, you pay about one hundred and thirty eight 600 00:32:06,920 --> 00:32:11,000 Speaker 2: dollars a day US, and you can, like, if it's 601 00:32:11,040 --> 00:32:13,320 Speaker 2: a robot dog, it's going to cost you one hundred 602 00:32:13,320 --> 00:32:15,360 Speaker 2: and thirty eight bucks. But if you wanted one 603 00:32:15,360 --> 00:32:17,480 Speaker 2: of the... have you seen the Unitree 604 00:32:17,640 --> 00:32:21,040 Speaker 2: dancing robots? They're awesome, they look fantastic.
That's going to 605 00:32:21,080 --> 00:32:24,280 Speaker 2: cost you about six hundred and ninety dollars per day. Okay, 606 00:32:24,360 --> 00:32:27,080 Speaker 2: so wow. This really depends on what you want the 607 00:32:27,240 --> 00:32:29,880 Speaker 2: robot dog to do, or the robot dancer to do. 608 00:32:30,360 --> 00:32:33,240 Speaker 2: But how interesting is that? I reckon I would do that, definitely. 609 00:32:33,280 --> 00:32:35,240 Speaker 2: I would hire it for a day. I'd try before 610 00:32:35,280 --> 00:32:38,760 Speaker 2: I buy. That would be a me thing, definitely. But 611 00:32:38,880 --> 00:32:40,720 Speaker 2: isn't that interesting, that it's now become a thing in 612 00:32:40,880 --> 00:32:42,920 Speaker 2: China that you go and you rent a robot for 613 00:32:42,960 --> 00:32:43,280 Speaker 2: the day. 614 00:32:43,840 --> 00:32:45,240 Speaker 1: I would love that. And you know what would be 615 00:32:45,280 --> 00:32:47,080 Speaker 1: even cooler is, once they've got a lot of tech 616 00:32:47,200 --> 00:32:50,680 Speaker 1: right and they can move around and do stuff and 617 00:32:50,840 --> 00:32:52,960 Speaker 1: whatever you need them to do, clean or vacuum or 618 00:32:53,080 --> 00:32:55,360 Speaker 1: chat to you or change the channel or whatever the 619 00:32:55,400 --> 00:32:58,400 Speaker 1: fuck it is, then they start to build retro models 620 00:32:58,440 --> 00:33:01,480 Speaker 1: that look like R2-D2 and C-3PO. Yeah, 621 00:33:01,640 --> 00:33:01,800 Speaker 1: you know. 622 00:33:02,240 --> 00:33:03,240 Speaker 2: So, like. 623 00:33:04,880 --> 00:33:07,880 Speaker 1: I was listening to another podcast and they were talking 624 00:33:07,880 --> 00:33:12,440 Speaker 1: about this, Patrick, you know, humanoid robots, and the guy 625 00:33:12,800 --> 00:33:15,160 Speaker 1: who is the, I forget his name, the boss of 626 00:33:15,400 --> 00:33:20,360 Speaker 1: Nvidia. He was talking about all the potential, he 627 00:33:20,440 --> 00:33:22,040 Speaker 1: was talking about all the jobs that are going to 628 00:33:22,080 --> 00:33:24,440 Speaker 1: go away, but all the jobs that will come. And 629 00:33:24,560 --> 00:33:27,160 Speaker 1: he's like, well, with all the robots. He's like, 630 00:33:27,240 --> 00:33:29,800 Speaker 1: even think about this. People are going to have personal robots. 631 00:33:31,280 --> 00:33:33,480 Speaker 1: And he goes, you know what's going to become a market? 632 00:33:33,600 --> 00:33:38,600 Speaker 1: He goes, clothing for robots. What people dress their robots in, right? 633 00:33:39,120 --> 00:33:44,480 Speaker 1: There'll be whole lines of clothing for you to... I'm like, 634 00:33:44,680 --> 00:33:46,320 Speaker 1: I never thought of that, and he's like, I know, 635 00:33:46,440 --> 00:33:49,640 Speaker 1: that seems crazy, it's absolutely going to happen. Have you 636 00:33:49,720 --> 00:33:51,800 Speaker 1: seen people dress their dogs? I'm like, well, I've seen 637 00:33:51,840 --> 00:33:56,840 Speaker 1: Patrick dress his dog. So if people are dressing fucking animals, 638 00:33:57,360 --> 00:34:01,320 Speaker 1: then people... Yeah. So it's like that. That just fascinates me, 639 00:34:01,920 --> 00:34:09,480 Speaker 1: the development and the accelerated kind of rate at 640 00:34:10,239 --> 00:34:13,160 Speaker 1: which everything is happening. I'll tell you one quick thing, 641 00:34:13,200 --> 00:34:16,719 Speaker 1: which you'll probably be all over. But so Melissa's gone down.
642 00:34:16,800 --> 00:34:19,080 Speaker 1: Melissa, for those three of you who don't know who 643 00:34:19,160 --> 00:34:22,479 Speaker 1: Melissa is, runs my life, is my business partner. She's 644 00:34:23,200 --> 00:34:27,560 Speaker 1: one hundred years younger than me, smarter than me. I'm 645 00:34:27,640 --> 00:34:29,480 Speaker 1: not even being nice to her and mean to me. 646 00:34:29,600 --> 00:34:30,120 Speaker 2: She just is. 647 00:34:30,480 --> 00:34:33,080 Speaker 1: But she's down the AI rabbit hole at the moment, 648 00:34:33,160 --> 00:34:40,359 Speaker 1: Patrick, building AI agents for our business, and she's trying 649 00:34:40,400 --> 00:34:44,799 Speaker 1: to explain to me, right, which is like me trying 650 00:34:44,840 --> 00:34:48,239 Speaker 1: to explain to my mum the functional anatomy of a 651 00:34:49,280 --> 00:34:53,960 Speaker 1: deadlift or a squat or something. But anyway, so one 652 00:34:54,040 --> 00:34:58,080 Speaker 1: of the things that she's building is an agent that 653 00:34:58,800 --> 00:35:01,920 Speaker 1: basically runs The You Project from a behind-the-scenes 654 00:35:01,960 --> 00:35:06,400 Speaker 1: point of view. So it's currently being trained on the 655 00:35:06,560 --> 00:35:09,799 Speaker 1: kind of guests that we want, the language to use. 656 00:35:10,200 --> 00:35:13,040 Speaker 1: You know, there's a template to send out, but it 657 00:35:13,200 --> 00:35:16,840 Speaker 1: personalizes it somewhat, and, like, all of these variables that 658 00:35:17,040 --> 00:35:22,120 Speaker 1: will basically save us from having an employee doing that thing. 659 00:35:22,480 --> 00:35:26,000 Speaker 1: And as we've said before, so then you've got this 660 00:35:26,239 --> 00:35:30,960 Speaker 1: agent, or this 'employee' in inverted commas, that doesn't get tired, 661 00:35:31,000 --> 00:35:34,560 Speaker 1: that doesn't get sick, that doesn't need holiday pay, that 662 00:35:34,760 --> 00:35:39,760 Speaker 1: doesn't, you know, have emotional and psychological and relationship issues. 663 00:35:40,560 --> 00:35:43,399 Speaker 1: So, the stuff that's being developed right now, that's 664 00:35:43,440 --> 00:35:46,839 Speaker 1: going to become commonplace. I don't think we can even... 665 00:35:47,280 --> 00:35:49,080 Speaker 1: Even now compared to five years ago. 666 00:35:49,120 --> 00:35:51,400 Speaker 2: It's crazy, it is, yeah, and look, there are some 667 00:35:51,520 --> 00:35:54,279 Speaker 2: great useful tools out there. But it also means that 668 00:35:54,600 --> 00:35:56,040 Speaker 2: there's going to be a lot of people out of 669 00:35:56,120 --> 00:35:59,000 Speaker 2: work as well. I think, did we talk about this 670 00:35:59,120 --> 00:36:02,960 Speaker 2: earlier, that there were fifty thousand jobs lost to AI 671 00:36:04,239 --> 00:36:06,359 Speaker 2: in the US? I think it's over that, it might 672 00:36:06,400 --> 00:36:09,439 Speaker 2: have been fifty five thousand jobs last year, so in twenty 673 00:36:09,520 --> 00:36:13,200 Speaker 2: twenty five, lost to AI, is what, you know... 674 00:36:13,960 --> 00:36:18,920 Speaker 2: An employment research company put it out, basically 675 00:36:19,080 --> 00:36:22,200 Speaker 2: fifty five thousand jobs. But you know, I was 676 00:36:22,239 --> 00:36:24,320 Speaker 2: thinking about clothing for robots. So I'm going back a 677 00:36:24,360 --> 00:36:27,920 Speaker 2: little bit, because I do... Do you have your New Year's resolution?
678 00:36:28,040 --> 00:36:30,920 Speaker 2: I know you spend a lot of time planning the future, 679 00:36:30,960 --> 00:36:33,879 Speaker 2: and you usually use New Year's Day and you really 680 00:36:33,920 --> 00:36:36,320 Speaker 2: think through all of that, which I don't normally do. 681 00:36:37,080 --> 00:36:40,920 Speaker 1: Yeah, I'm less kind of New Year's these days, but yeah, 682 00:36:41,239 --> 00:36:43,440 Speaker 1: I'm always kind of thinking about it, whether or not 683 00:36:43,520 --> 00:36:46,839 Speaker 1: it's January one or December twenty three. But yeah, 684 00:36:46,960 --> 00:36:50,839 Speaker 1: I'm always kind of... I have weekly goals, monthly goals, 685 00:36:50,960 --> 00:36:55,080 Speaker 1: yearly goals. So yeah... Because hopefully my PhD is 686 00:36:55,080 --> 00:36:57,560 Speaker 1: finished in the next month or so, that's the plan anyway, 687 00:36:57,640 --> 00:36:59,160 Speaker 1: and beyond that there are goals. 688 00:36:59,239 --> 00:37:02,719 Speaker 2: Yes, well, no, in about November last year, even 689 00:37:02,760 --> 00:37:05,000 Speaker 2: before November last year, I was reading a few articles 690 00:37:05,080 --> 00:37:08,480 Speaker 2: on fast fashion and I thought to myself, I've got 691 00:37:08,600 --> 00:37:11,040 Speaker 2: so many clothes, I don't need to buy any new clothes. 692 00:37:11,080 --> 00:37:13,560 Speaker 2: So for twenty twenty six, I've decided I'm not going 693 00:37:13,640 --> 00:37:17,359 Speaker 2: to buy a new item of clothing for the next 694 00:37:17,440 --> 00:37:20,640 Speaker 2: twelve months. And so I scouted out my local op 695 00:37:20,719 --> 00:37:23,000 Speaker 2: shop to see if there's anything there that I might 696 00:37:23,120 --> 00:37:25,560 Speaker 2: want to wear, in case I do need a new top. 697 00:37:26,040 --> 00:37:28,640 Speaker 2: But that's my little pledge to myself, just to stop 698 00:37:28,760 --> 00:37:31,680 Speaker 2: fast fashion and kind of keep using my current clothing. 699 00:37:31,840 --> 00:37:35,080 Speaker 2: So yeah, that's my little promise to myself, not 700 00:37:35,320 --> 00:37:38,200 Speaker 2: to buy any clothing, except for footwear, because of course, 701 00:37:38,360 --> 00:37:41,480 Speaker 2: you know, particularly if you've got pronated feet, you need 702 00:37:41,600 --> 00:37:43,480 Speaker 2: to make sure you've got the right footwear. But yeah, 703 00:37:43,600 --> 00:37:44,960 Speaker 2: that's my little pledge to myself. 704 00:37:45,040 --> 00:37:46,920 Speaker 1: I hope you've got a lot of jocks in the cupboard. 705 00:37:47,600 --> 00:37:49,399 Speaker 2: Actually I have a lot of jocks, and I get 706 00:37:49,480 --> 00:37:52,480 Speaker 2: my jocks from an Australian company that manufactures locally. I 707 00:37:52,520 --> 00:37:53,719 Speaker 2: think that's really important too. 708 00:37:54,280 --> 00:37:56,359 Speaker 1: So wow, what are they called? Give them a shout out? 709 00:37:56,760 --> 00:38:00,839 Speaker 2: aussieBum. aussieBum? It was started by a couple of gay guys. 710 00:38:00,880 --> 00:38:02,439 Speaker 2: Do you know what, can I tell you this story? 711 00:38:02,440 --> 00:38:04,800 Speaker 2: It's getting a bit personal here.
One of the reasons I 712 00:38:04,840 --> 00:38:08,120 Speaker 2: started wearing them is, well, you remember the old advertising 713 00:38:08,200 --> 00:38:11,120 Speaker 2: for Cross Your Heart bras, and women who have support 714 00:38:11,200 --> 00:38:13,759 Speaker 2: bras for when they run, to keep everything, you know, 715 00:38:14,080 --> 00:38:14,560 Speaker 2: in place. 716 00:38:15,120 --> 00:38:17,160 Speaker 1: I don't think you needed to explain the second bit. 717 00:38:17,280 --> 00:38:19,600 Speaker 1: I think we could have just said support bras. Yep, 718 00:38:19,680 --> 00:38:22,520 Speaker 1: we're with you, just trying to spell it out, create a picture 719 00:38:22,680 --> 00:38:25,160 Speaker 1: for everyone. Pretty sure we don't need that. 720 00:38:26,280 --> 00:38:29,279 Speaker 2: Well, these jocks, there's a particular range of these 721 00:38:29,800 --> 00:38:32,600 Speaker 2: underwear that have a supporting cup in them as well, 722 00:38:33,080 --> 00:38:33,440 Speaker 2: to keep it... 723 00:38:33,520 --> 00:38:36,720 Speaker 1: I love how, to our listeners, Patrick's holding his hand 724 00:38:36,840 --> 00:38:40,759 Speaker 1: out, like, with his palm up, like he's supporting a 725 00:38:40,920 --> 00:38:45,520 Speaker 1: cock and some testicles. Yep. Thanks. Fucking hell. Can you 726 00:38:45,680 --> 00:38:49,520 Speaker 1: cut out the visuals today? I haven't had brecky yet, but... 727 00:38:49,600 --> 00:38:52,480 Speaker 2: I still do a lot of paragliding and rock climbing, and 728 00:38:52,680 --> 00:38:54,839 Speaker 2: keeping the boys in the right position when you're wearing 729 00:38:54,880 --> 00:38:56,719 Speaker 2: a harness is pretty damn important. 730 00:38:57,640 --> 00:38:59,960 Speaker 1: Can I just say, nobody in the history of probably 731 00:39:00,200 --> 00:39:03,160 Speaker 1: any show has ever started a sentence with, but I 732 00:39:03,360 --> 00:39:05,760 Speaker 1: used to do a lot of paragliding and hang gliding, 733 00:39:06,000 --> 00:39:07,520 Speaker 1: rock climbing. No, rock climbing. 734 00:39:07,360 --> 00:39:10,240 Speaker 2: Rock climbing, gliding and rock climbing, and those are sports 735 00:39:10,280 --> 00:39:14,000 Speaker 2: where you need to have very good undergarments to make 736 00:39:14,080 --> 00:39:14,759 Speaker 2: sure you don't... 737 00:39:14,960 --> 00:39:16,879 Speaker 1: So what do you, what are you saying, that you're 738 00:39:17,320 --> 00:39:17,959 Speaker 1: well endowed? 739 00:39:18,360 --> 00:39:20,840 Speaker 2: Is that the... I'm not saying that 740 00:39:20,920 --> 00:39:22,600 Speaker 2: at all. I'm just saying that you've still got to 741 00:39:22,719 --> 00:39:23,600 Speaker 2: keep the boys in place. 742 00:39:24,800 --> 00:39:25,719 Speaker 1: Well, what's wrong? 743 00:39:26,080 --> 00:39:28,320 Speaker 2: Have you been climbing before? Have you worn a climbing harness? 744 00:39:30,080 --> 00:39:30,120 Speaker 1: No? 745 00:39:30,600 --> 00:39:32,160 Speaker 2: You do not want to slip out of your climbing 746 00:39:32,200 --> 00:39:36,160 Speaker 2: harness and get something caught in that strap of the harness. 747 00:39:36,160 --> 00:39:36,839 Speaker 2: You've got to be pretty... 748 00:39:36,920 --> 00:39:39,680 Speaker 1: I'm just a free range climber. I'm like an orangutan. 749 00:39:40,760 --> 00:39:41,880 Speaker 1: You fucking scale up. 750 00:39:41,960 --> 00:39:44,560 Speaker 2: Shit, how do we even get there?
So anyway, I 751 00:39:44,640 --> 00:39:47,319 Speaker 2: buy local and I try to buy Australian made 752 00:39:47,400 --> 00:39:49,520 Speaker 2: where possible. But this year, no, I have got enough 753 00:39:49,600 --> 00:39:52,000 Speaker 2: jocks to keep me going through the year, I reckon, Craigo. 754 00:39:52,680 --> 00:39:57,480 Speaker 1: Talk to us about AI prompts. Yeah, I'm 755 00:39:57,520 --> 00:40:00,960 Speaker 1: fucking taking a left turn. AI prompts. So, when we... 756 00:40:02,000 --> 00:40:04,840 Speaker 1: I think being able to write a good prompt makes 757 00:40:04,960 --> 00:40:08,239 Speaker 1: all the difference. Because, and a prompt, everybody, we're 758 00:40:08,280 --> 00:40:10,640 Speaker 1: just talking about when we're asking ChatGPT a question, 759 00:40:10,800 --> 00:40:13,040 Speaker 1: or we want it to do something, or give it 760 00:40:13,160 --> 00:40:17,360 Speaker 1: an instruction, or any AI for that matter. Yeah. And 761 00:40:17,520 --> 00:40:21,160 Speaker 1: I've found that the quality of what I tell it 762 00:40:21,280 --> 00:40:24,520 Speaker 1: to do or ask it to do makes a huge 763 00:40:24,560 --> 00:40:26,879 Speaker 1: difference to the output it gives. 764 00:40:27,040 --> 00:40:29,520 Speaker 2: AI prompting is really interesting, because I tend to use 765 00:40:29,760 --> 00:40:33,360 Speaker 2: AI for the most part visually, so if I'm trying 766 00:40:33,400 --> 00:40:36,560 Speaker 2: to edit an image, you know, or trying to create 767 00:40:36,640 --> 00:40:40,400 Speaker 2: something that involves a visual aspect. But AI prompting, there's 768 00:40:40,480 --> 00:40:42,520 Speaker 2: a skill to it, isn't there, getting the prompt 769 00:40:42,640 --> 00:40:47,040 Speaker 2: right and getting it... And I don't, look, I don't 770 00:40:47,040 --> 00:40:49,920 Speaker 2: know if there's any real secret to it. I think... 771 00:40:50,160 --> 00:40:52,839 Speaker 2: you don't want to be too long-winded. An 772 00:40:52,880 --> 00:40:55,440 Speaker 2: article I was reading had someone saying one 773 00:40:55,440 --> 00:40:58,080 Speaker 2: of the mistakes you can make is being too detailed 774 00:40:58,520 --> 00:41:00,719 Speaker 2: when doing an AI prompt. If you do that, you 775 00:41:00,840 --> 00:41:04,399 Speaker 2: can kind of overcomplicate things by trying to say too 776 00:41:04,520 --> 00:41:08,280 Speaker 2: much. Maybe start off with very simple, basic prompts, 777 00:41:08,280 --> 00:41:10,160 Speaker 2: and then, if you need to, you know, kind 778 00:41:10,200 --> 00:41:12,520 Speaker 2: of get more detailed, do it that way. But sometimes 779 00:41:12,560 --> 00:41:16,719 Speaker 2: you can go too long and be concerned about, yeah, 780 00:41:16,960 --> 00:41:19,240 Speaker 2: the detail. Don't get bogged down in the detail. 781 00:41:20,280 --> 00:41:23,600 Speaker 1: Yeah. And also, sometimes yes is the answer. And also, 782 00:41:24,560 --> 00:41:31,120 Speaker 1: sometimes I'm writing something, and I'm writing, let's say, 783 00:41:31,160 --> 00:41:33,880 Speaker 1: a piece about decision making or something, and I'm using 784 00:41:35,239 --> 00:41:39,120 Speaker 1: metaphors and analogies, and then I'm trying to figure out 785 00:41:39,280 --> 00:41:42,719 Speaker 1: one, and I might write nine sentences and I need 786 00:41:42,800 --> 00:41:45,719 Speaker 1: a banger last sentence, so I just go, I put 787 00:41:45,760 --> 00:41:49,000 Speaker 1: it in, see what it says.
And so quite often 788 00:41:49,080 --> 00:41:52,120 Speaker 1: it will give a last sentence that's totally out of 789 00:41:53,080 --> 00:41:56,960 Speaker 1: kilter with the nine that I've done. Like, I'm writing 790 00:41:57,280 --> 00:42:01,840 Speaker 1: tongue in cheek. It's like you have to figure 791 00:42:01,960 --> 00:42:05,520 Speaker 1: it out, it's not spelled out, and then it writes the clear, obvious thing. 792 00:42:06,480 --> 00:42:10,400 Speaker 1: So it takes a different style altogether. And while the 793 00:42:10,520 --> 00:42:15,440 Speaker 1: sentence kind of makes sense as a reading experience, it 794 00:42:15,640 --> 00:42:16,759 Speaker 1: totally doesn't fit. 795 00:42:17,360 --> 00:42:17,760 Speaker 2: Yeah. 796 00:42:18,000 --> 00:42:21,600 Speaker 1: So you've really got to go, okay, I want you 797 00:42:21,760 --> 00:42:24,840 Speaker 1: to write a bit about this, but it's got to 798 00:42:24,920 --> 00:42:27,440 Speaker 1: be, it's got to be accurate. 799 00:42:27,520 --> 00:42:29,640 Speaker 1: But also it's got to be fun, it's got to be, 800 00:42:30,320 --> 00:42:32,279 Speaker 1: you know, tongue in cheek, it's a bit cheeky, da 801 00:42:32,360 --> 00:42:35,440 Speaker 1: da da. And even then it's like, yeah, with a lot 802 00:42:35,520 --> 00:42:38,640 Speaker 1: of creative things, it's not very good. 803 00:42:39,760 --> 00:42:44,680 Speaker 2: Interestingly, Sam Altman, who is the genius, I guess, behind 804 00:42:45,760 --> 00:42:50,680 Speaker 2: one of the largest AI models, he's saying his vision 805 00:42:50,840 --> 00:42:55,080 Speaker 2: for AI in the future is where it remembers everything 806 00:42:55,680 --> 00:42:58,960 Speaker 2: and you basically put your whole life into it. Everything 807 00:42:59,000 --> 00:43:01,560 Speaker 2: you've ever written, everything you've ever said. So we'd put 808 00:43:01,600 --> 00:43:04,120 Speaker 2: all of your podcasts in there, all of the emails 809 00:43:04,160 --> 00:43:06,920 Speaker 2: you've ever sent, and it would remember that. Because at 810 00:43:06,960 --> 00:43:08,759 Speaker 2: the moment, it kind of doesn't really do that, and, 811 00:43:09,239 --> 00:43:10,960 Speaker 2: I don't know, that fills me with a little bit 812 00:43:10,960 --> 00:43:13,879 Speaker 2: of dread. Would you want every single thing you've ever 813 00:43:14,040 --> 00:43:17,120 Speaker 2: emailed out there, everything you've ever done? Because really, if someone 814 00:43:17,239 --> 00:43:18,800 Speaker 2: was going to mimic you, or someone was going to 815 00:43:18,840 --> 00:43:21,880 Speaker 2: assist you, if you had the ultimate AI assistant, they 816 00:43:21,920 --> 00:43:24,759 Speaker 2: would need the full picture, not a sanitized version of you. 817 00:43:25,640 --> 00:43:28,600 Speaker 1: Yeah. Look, I get what you're saying. I'm not too 818 00:43:28,640 --> 00:43:31,000 Speaker 1: worried about that. I guess some people would be real worried 819 00:43:31,040 --> 00:43:34,480 Speaker 1: about that. I don't think there's too much that's fucking 820 00:43:34,560 --> 00:43:41,480 Speaker 1: incriminating, or... But what I think about, you know, 821 00:43:41,680 --> 00:43:43,960 Speaker 1: is how we're kind of handing out a lot of information 822 00:43:44,160 --> 00:43:47,640 Speaker 1: to AI, or we're kind of... It seems like 823 00:43:47,719 --> 00:43:50,799 Speaker 1: we're outsourcing thinking, right? Oh, I don't want to think.
824 00:43:50,840 --> 00:43:53,200 Speaker 1: I'll let it think for me. Which, I understand that, 825 00:43:54,280 --> 00:43:56,799 Speaker 1: and I do actually think, depending on how it's used, 826 00:43:57,000 --> 00:44:00,279 Speaker 1: it can be a negative. And it can, as you 827 00:44:00,480 --> 00:44:06,120 Speaker 1: suggested earlier, there's definitely that. Not that it makes people dumber, 828 00:44:07,120 --> 00:44:10,879 Speaker 1: because IQ is IQ, but I think that it can 829 00:44:11,760 --> 00:44:15,480 Speaker 1: have a negative impact on cognitive function, even the structure 830 00:44:15,520 --> 00:44:19,000 Speaker 1: of the brain. Like, somebody who uses their brain a lot, 831 00:44:20,080 --> 00:44:22,680 Speaker 1: their brain looks different to someone who doesn't use their 832 00:44:22,719 --> 00:44:24,520 Speaker 1: brain a lot. Like, there's just a 833 00:44:24,600 --> 00:44:28,160 Speaker 1: lot more neural pathways, a lot of different stuff physiologically 834 00:44:28,200 --> 00:44:32,920 Speaker 1: and anatomically going on. But I think, like, for me, 835 00:44:34,520 --> 00:44:37,759 Speaker 1: using AI, my brain's working more than ever, because 836 00:44:37,920 --> 00:44:40,960 Speaker 1: not only am I... it gives me more 837 00:44:41,040 --> 00:44:44,280 Speaker 1: time to do other stuff where I am using my brain, 838 00:44:45,200 --> 00:44:47,960 Speaker 1: whereas it can give me a shortcut and it 839 00:44:48,080 --> 00:44:50,600 Speaker 1: can give me an answer to something quite quickly, which 840 00:44:50,680 --> 00:44:54,120 Speaker 1: then I have to verify, of course. But also I'm 841 00:44:54,280 --> 00:44:58,360 Speaker 1: constantly thinking about how do I use this tool optimally. 842 00:44:59,160 --> 00:45:01,719 Speaker 1: Like, it's not a replacement for thinking or for 843 00:45:01,880 --> 00:45:05,520 Speaker 1: my brain or for my knowledge. For me, it's just 844 00:45:05,640 --> 00:45:10,040 Speaker 1: a tool that I can use to do more shit 845 00:45:10,360 --> 00:45:14,759 Speaker 1: and create better outcomes. And it's like, well, sure, you 846 00:45:14,880 --> 00:45:17,439 Speaker 1: can cut those bits of wood with that tenon saw, 847 00:45:17,560 --> 00:45:20,520 Speaker 1: it's going to take you five hours, or you can 848 00:45:20,640 --> 00:45:23,360 Speaker 1: use this circular saw. It's going to take you five minutes. 849 00:45:23,960 --> 00:45:27,520 Speaker 1: Oh, oh yeah, but now someone's losing a job because 850 00:45:27,560 --> 00:45:30,120 Speaker 1: they're not... No, now we're just all going to use 851 00:45:30,200 --> 00:45:34,600 Speaker 1: these circular saws now, it's much more efficient. So yeah, 852 00:45:34,680 --> 00:45:38,360 Speaker 1: I think with everything. You know, 853 00:45:38,440 --> 00:45:40,759 Speaker 1: when rock and roll music came in, it's like, it's 854 00:45:40,800 --> 00:45:41,560 Speaker 1: the end of the world. 855 00:45:42,320 --> 00:45:42,480 Speaker 2: You know. 856 00:45:42,560 --> 00:45:44,840 Speaker 1: When computers came in, people are like, oh, that's not 857 00:45:44,960 --> 00:45:47,120 Speaker 1: going to take off. And the internet, that was never going 858 00:45:47,200 --> 00:45:49,920 Speaker 1: to work. And I don't know, I just think everybody 859 00:45:50,600 --> 00:45:53,600 Speaker 1: has a negative bent when new things kind of emerge.
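For anyone who wants to try the "start simple, then add detail" prompting approach Craig and Patrick describe a few exchanges back, here is a minimal sketch in Python. It assumes the official openai package (v1 or later) and an OPENAI_API_KEY in your environment; the model name, the helper function, and the example prompts are illustrative assumptions, not anything specified in the episode.

```python
# A minimal sketch of the "start simple, then add detail" prompting
# approach discussed above. Assumes the official `openai` Python
# package (v1+) and an OPENAI_API_KEY environment variable; the model
# name and prompts are illustrative choices, not from the episode.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment


def ask(prompt: str) -> str:
    """Send one user prompt and return the model's text reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


draft = "..."  # the nine sentences you've already written

# Pass 1: the short, basic prompt Patrick suggests starting with.
first_try = ask(f"Write a final sentence for this piece:\n\n{draft}")

# Pass 2: only if pass 1 misses, layer in the constraints Craig
# describes: accurate, tongue in cheek, consistent with the metaphors.
second_try = ask(
    "Write ONE closing sentence for the piece below. It must be "
    "factually accurate, tongue in cheek, and continue the piece's "
    "existing metaphors rather than stating the obvious.\n\n" + draft
)

print(first_try, second_try, sep="\n")
```

The second call just layers the constraints from the conversation (accuracy, tone, fit with the existing metaphors) onto the same draft, rather than trying to anticipate everything in one over-long prompt.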
860 00:45:54,000 --> 00:45:58,239 Speaker 2: And I had an employee who left last year, who 861 00:45:58,920 --> 00:46:01,520 Speaker 2: was with me for about four years, and he 862 00:46:02,400 --> 00:46:05,759 Speaker 2: obviously used a calculator a lot, and we would do 863 00:46:05,800 --> 00:46:07,560 Speaker 2: an end-of-month summary of the work that he 864 00:46:07,640 --> 00:46:10,120 Speaker 2: did and come up with totals and all. We were doing 865 00:46:10,239 --> 00:46:13,320 Speaker 2: proposals and we'd have to work out what, you know, 866 00:46:13,360 --> 00:46:16,880 Speaker 2: we were going to quote for a job. Interestingly, I always 867 00:46:17,120 --> 00:46:19,799 Speaker 2: liked the challenge of adding up sums in my head, 868 00:46:20,080 --> 00:46:24,200 Speaker 2: just occasionally. But his total reliance on the calculator, and 869 00:46:24,239 --> 00:46:26,920 Speaker 2: I mean a physical, literal calculator, either the one on screen 870 00:46:27,320 --> 00:46:29,840 Speaker 2: or the calculator he had on his desk. And this 871 00:46:30,040 --> 00:46:33,400 Speaker 2: is old tech we're talking about now. But the reality 872 00:46:33,680 --> 00:46:37,279 Speaker 2: was that even simple sums he would do on the calculator, 873 00:46:37,680 --> 00:46:41,520 Speaker 2: so that part of his processing brain that did sums, 874 00:46:42,200 --> 00:46:44,400 Speaker 2: he just didn't want to use. He just used an 875 00:46:44,440 --> 00:46:47,239 Speaker 2: electronic device. Now, calculators have been out forever, you know, 876 00:46:49,080 --> 00:46:50,799 Speaker 2: they've been out. You know, we grew up with them 877 00:46:50,800 --> 00:46:53,239 Speaker 2: at school. But what does that actually say about the 878 00:46:53,320 --> 00:46:56,040 Speaker 2: brain's ability to do that, you know, to process and 879 00:46:56,160 --> 00:46:59,759 Speaker 2: to add up figures? You know, at a local supermarket a little 880 00:46:59,760 --> 00:47:05,440 Speaker 2: while ago, the checkout girl was trying to work out 881 00:47:05,480 --> 00:47:07,480 Speaker 2: the price of something that wasn't in the system, 882 00:47:07,640 --> 00:47:09,759 Speaker 2: and there were a number of them, and she was 883 00:47:09,880 --> 00:47:12,399 Speaker 2: sitting there like she couldn't do anything, she couldn't add 884 00:47:12,400 --> 00:47:14,920 Speaker 2: them up. And I said, well, it's this. It was 885 00:47:14,960 --> 00:47:18,000 Speaker 2: a relatively simple one, I mean, obviously. I 886 00:47:18,040 --> 00:47:20,640 Speaker 2: think it was a percentage of something, or a 887 00:47:20,719 --> 00:47:23,560 Speaker 2: half price, and then add something else. I can't even 888 00:47:23,560 --> 00:47:25,760 Speaker 2: remember what the sum was, but it was a pretty 889 00:47:25,800 --> 00:47:27,839 Speaker 2: easy one to work out. But she just stood there 890 00:47:27,920 --> 00:47:30,440 Speaker 2: dumbfounded, because (a) when I gave her the answer, 891 00:47:30,480 --> 00:47:33,239 Speaker 2: she couldn't use it, because she couldn't verify it, and 892 00:47:33,440 --> 00:47:35,480 Speaker 2: (b) she ended up having to get someone over to reset 893 00:47:35,520 --> 00:47:38,120 Speaker 2: her computer so she could put it in and recalculate it, 894 00:47:39,280 --> 00:47:39,440 Speaker 2: you know.
895 00:47:39,920 --> 00:47:43,600 Speaker 1: But it was... Yeah, I'm with you. But then I think, okay, 896 00:47:43,719 --> 00:47:46,879 Speaker 1: so the person who can't do maths, you give them 897 00:47:47,040 --> 00:47:49,840 Speaker 1: a canvas and some paint, and they paint something fucking 898 00:47:49,960 --> 00:47:53,560 Speaker 1: phenomenal that I couldn't even begin to. Like, I'm really good 899 00:47:53,600 --> 00:47:56,800 Speaker 1: at maths, but that's my default setting. It's not because 900 00:47:56,800 --> 00:47:59,520 Speaker 1: I try to be, it's just the way my 901 00:47:59,640 --> 00:48:03,080 Speaker 1: brain works. But some people's... Like, one of my friends 902 00:48:03,160 --> 00:48:06,640 Speaker 1: is dyslexic. He's a fifty year old grown-ass man 903 00:48:07,480 --> 00:48:10,520 Speaker 1: and he struggles to read a paragraph. But that's the 904 00:48:10,600 --> 00:48:14,359 Speaker 1: way his brain works. But you give him some shit 905 00:48:14,440 --> 00:48:17,640 Speaker 1: to build, he'll build a fucking house in an afternoon, 906 00:48:17,840 --> 00:48:19,640 Speaker 1: and I'll still be trying to figure out how the 907 00:48:19,680 --> 00:48:23,560 Speaker 1: hammer works. So intelligence is on a spectrum. And 908 00:48:23,600 --> 00:48:27,160 Speaker 1: I think for some people, using a calculator is the 909 00:48:27,239 --> 00:48:30,720 Speaker 1: right thing. I know what you're saying, 910 00:48:31,800 --> 00:48:35,319 Speaker 1: but you're probably naturally good at maths. I'm naturally good 911 00:48:35,360 --> 00:48:37,120 Speaker 1: at maths, but then there are things that I'm not 912 00:48:37,320 --> 00:48:39,439 Speaker 1: naturally good at, which I avoid. 913 00:48:40,120 --> 00:48:43,160 Speaker 2: And I agree, if you don't use it, you'll lose it. Yeah, I 914 00:48:43,239 --> 00:48:44,800 Speaker 2: was thinking of the use it or lose it kind of 915 00:48:44,880 --> 00:48:48,040 Speaker 2: mentality, in terms of that side of it. But you're right. 916 00:48:48,800 --> 00:48:51,560 Speaker 2: In fact, the person that I'm thinking of 917 00:48:51,719 --> 00:48:53,480 Speaker 2: is amazingly talented and creative. 918 00:48:53,880 --> 00:48:59,719 Speaker 1: So there are so many creative geniuses who are academically 919 00:48:59,840 --> 00:49:00,480 Speaker 1: not brilliant. 920 00:49:00,960 --> 00:49:03,319 Speaker 2: Yeah, yeah, for sure. Hey, can we just get onto 921 00:49:03,360 --> 00:49:06,640 Speaker 2: a different topic? I've got these really interesting tech gadgets, 922 00:49:06,640 --> 00:49:08,640 Speaker 2: and I thought... there's a couple of little gadgets 923 00:49:08,680 --> 00:49:11,160 Speaker 2: I'm really excited about. But this seems to make so 924 00:49:11,280 --> 00:49:13,360 Speaker 2: much sense to me. I've never even thought about it before. 925 00:49:13,719 --> 00:49:17,920 Speaker 2: You can buy a smart jar lid. Okay, so think: 926 00:49:18,120 --> 00:49:22,279 Speaker 2: a smart jar lid that uses UV light to keep 927 00:49:22,360 --> 00:49:24,840 Speaker 2: your food fresher longer. So if you've got stuff that 928 00:49:24,880 --> 00:49:27,680 Speaker 2: you're putting in the fridge and you put this lid on, 929 00:49:27,960 --> 00:49:31,520 Speaker 2: it emits ultraviolet light to stop bacteria growing.
So obviously 930 00:49:31,560 --> 00:49:33,759 Speaker 2: putting things in the fridge will prolong how long it 931 00:49:33,800 --> 00:49:37,000 Speaker 2: will last, because it slows down any movement by bacteria 932 00:49:37,040 --> 00:49:39,759 Speaker 2: or any growth by bacteria. But the addition of a 933 00:49:39,960 --> 00:49:44,120 Speaker 2: smart lid that just shines UV light onto the top 934 00:49:44,160 --> 00:49:45,680 Speaker 2: of whatever it is... Because if you've got a jar 935 00:49:45,800 --> 00:49:49,520 Speaker 2: of jam, you know, once you've opened it, 936 00:49:50,080 --> 00:49:52,440 Speaker 2: the bacteria will form at the top, because the bottom 937 00:49:52,560 --> 00:49:54,520 Speaker 2: hasn't got any air exposed to it. It's just the 938 00:49:54,600 --> 00:49:57,120 Speaker 2: top part. So how good is that idea? Having a 939 00:49:57,239 --> 00:49:59,920 Speaker 2: UV light just in the lid of your jar. 940 00:50:00,719 --> 00:50:03,560 Speaker 1: Wow, imagine opening the fridge. There's all these little, kind 941 00:50:03,600 --> 00:50:06,560 Speaker 1: of, looks like science experiments in your fridge. There's just 942 00:50:06,640 --> 00:50:10,359 Speaker 1: all these little illuminated fucking jars of shit. So it's 943 00:50:10,400 --> 00:50:11,239 Speaker 1: not at all... 944 00:50:11,960 --> 00:50:14,960 Speaker 2: It looks like the YouTube podcast backdrop that you've got 945 00:50:15,200 --> 00:50:18,719 Speaker 2: on your screen. So that's glowing blue. There's an ultra 946 00:50:18,800 --> 00:50:19,880 Speaker 2: violet to keep you fresher. 947 00:50:20,600 --> 00:50:24,200 Speaker 1: That is... Yes, I'm actually two hundred and thirteen years old. 948 00:50:25,080 --> 00:50:27,600 Speaker 1: Someone just comes every day and sprays me with 949 00:50:27,760 --> 00:50:30,520 Speaker 1: something, I'm not sure what it is. Yeah, it turns 950 00:50:30,600 --> 00:50:33,160 Speaker 1: on the UV just to get the bacteria off me. 951 00:50:33,480 --> 00:50:36,840 Speaker 2: So Craig is using a fake background on his Zoom. 952 00:50:37,800 --> 00:50:39,000 Speaker 2: Could we not say fake? 953 00:50:39,360 --> 00:50:42,840 Speaker 1: Can we just say it's a virtual, it's a virtual studio? 954 00:50:44,040 --> 00:50:48,080 Speaker 2: Fake, it's not real. It's fake. It's just a projected image. 955 00:50:48,280 --> 00:50:50,440 Speaker 2: But it's very blue and it looks like UV. And 956 00:50:50,560 --> 00:50:53,239 Speaker 2: now it's suddenly occurred to me that he's probably using 957 00:50:53,320 --> 00:50:55,120 Speaker 2: it to just keep himself fresh. 958 00:50:55,680 --> 00:50:56,880 Speaker 1: It's a real image. 959 00:50:57,800 --> 00:51:01,239 Speaker 2: Okay, hey, one other thing. So I know we're running out of time, 960 00:51:01,239 --> 00:51:03,080 Speaker 2: but I'm going to tell you. You know, China is 961 00:51:03,280 --> 00:51:08,480 Speaker 2: now officially saying that any cars, so predominantly electric car 962 00:51:08,600 --> 00:51:12,920 Speaker 2: companies, that have recessed door handles... So in a lot 963 00:51:12,920 --> 00:51:17,040 Speaker 2: of cars, including Tesla, there are recessed handles, you kind 964 00:51:17,080 --> 00:51:18,959 Speaker 2: of tap and then they come out of the side 965 00:51:18,960 --> 00:51:22,440 Speaker 2: of the car, so it's for streamlining. The problem is, 966 00:51:22,760 --> 00:51:25,440 Speaker 2: if something happens to the battery and the power, then 967 00:51:25,480 --> 00:51:29,000 Speaker 2: the handle won't come out.
And so authorities in China 968 00:51:29,040 --> 00:51:31,120 Speaker 2: are saying, this is still a couple of years away, 969 00:51:31,120 --> 00:51:33,920 Speaker 2: they're saying from twenty twenty seven, any car sold in 970 00:51:34,080 --> 00:51:38,360 Speaker 2: China has to have a manual door release, after electronic 971 00:51:38,440 --> 00:51:41,799 Speaker 2: handles were failing during crashes. So that's a win. 972 00:51:41,840 --> 00:51:43,480 Speaker 2: I think that's a really good decision. 973 00:51:44,200 --> 00:51:46,319 Speaker 1: I've heard about that. I've heard about people who had 974 00:51:46,400 --> 00:51:49,880 Speaker 1: accidents and couldn't get out of the fucking car because 975 00:51:50,239 --> 00:51:54,520 Speaker 1: it was all locked electronically and all the electronics were down. 976 00:51:54,680 --> 00:52:00,000 Speaker 1: So, just quickly, on cars and China and electronics. 977 00:52:00,760 --> 00:52:02,400 Speaker 1: I don't know if you saw the news last night, 978 00:52:02,440 --> 00:52:08,240 Speaker 1: I'm doubting it. So BYD, the biggest seller of electric 979 00:52:08,360 --> 00:52:10,120 Speaker 1: cars in the world. 980 00:52:10,320 --> 00:52:14,080 Speaker 2: Now? Oh yeah, that doesn't surprise me. So it was...? 981 00:52:14,200 --> 00:52:19,800 Speaker 1: Tesla, I think. Probably in Australia as well. Yeah, so 982 00:52:19,960 --> 00:52:24,080 Speaker 1: they, yeah, they're killing it at the moment. 983 00:52:24,320 --> 00:52:26,239 Speaker 1: Some of the Chinese stuff that's coming out, I know 984 00:52:26,360 --> 00:52:29,880 Speaker 1: people are like, oh, it's junk. It's like, dude, some 985 00:52:30,080 --> 00:52:33,520 Speaker 1: of the stuff is so fucking amazing. Like, if you 986 00:52:33,719 --> 00:52:37,680 Speaker 1: just look objectively... You know, cars are an interesting thing. I'll 987 00:52:38,160 --> 00:52:40,279 Speaker 1: talk for thirty seconds then back to you. But I've 988 00:52:40,320 --> 00:52:43,400 Speaker 1: always thought cars are fascinating in how much of 989 00:52:43,440 --> 00:52:46,800 Speaker 1: an emotional attachment, not all, not you, not me probably, 990 00:52:47,680 --> 00:52:51,239 Speaker 1: but, like, people have to them. Oh, I've got a Merc, 991 00:52:51,480 --> 00:52:54,320 Speaker 1: or I've got a BM, or... Traditionally those European cars 992 00:52:54,360 --> 00:52:57,680 Speaker 1: were very aspirational. Oh look, he's got this, or 993 00:52:57,800 --> 00:53:02,080 Speaker 1: she's got that, right? But the actual car wasn't that 994 00:53:02,200 --> 00:53:05,560 Speaker 1: fucking amazing. You know, there were Mazdas that you could 995 00:53:05,600 --> 00:53:08,880 Speaker 1: buy for thirty grand less. Like, you've got a Mazda, 996 00:53:08,880 --> 00:53:11,880 Speaker 1: which is a fucking amazing car. Yeah, I've got a 997 00:53:11,960 --> 00:53:16,840 Speaker 1: Hyundai, a Santa Fe, which is better than some 998 00:53:17,040 --> 00:53:20,000 Speaker 1: of my friends' fucking two hundred thousand dollar SUVs. They 999 00:53:20,080 --> 00:53:22,759 Speaker 1: get in my car, which cost less than half of 1000 00:53:22,840 --> 00:53:24,840 Speaker 1: what theirs did, and they're like, oh my god, this 1001 00:53:25,000 --> 00:53:28,280 Speaker 1: is fucking space age. I'm like, yeah, made in Korea. 1002 00:53:28,360 --> 00:53:28,480 Speaker 2: Bro.
1003 00:53:29,239 --> 00:53:33,399 Speaker 1: You know, it's like, it's just that psychology of where 1004 00:53:33,520 --> 00:53:36,560 Speaker 1: is this made, and therefore it must be, you know... 1005 00:53:36,880 --> 00:53:39,160 Speaker 1: Especially with things like cars, where there are no Aussie 1006 00:53:39,239 --> 00:53:42,759 Speaker 1: cars anymore, so fucking buy whatever you're going to buy. 1007 00:53:43,440 --> 00:53:45,200 Speaker 2: My first car was amazing. You know, it's funny you 1008 00:53:45,280 --> 00:53:48,440 Speaker 2: say that, because I bought my very first brand new 1009 00:53:48,560 --> 00:53:50,560 Speaker 2: car, because I'd never bought a brand new car before, 1010 00:53:50,680 --> 00:53:53,320 Speaker 2: until just after COVID, when it was very hard to 1011 00:53:53,400 --> 00:53:56,359 Speaker 2: buy a brand new car. But also the second hand 1012 00:53:56,440 --> 00:53:58,120 Speaker 2: car market had gone through the roof. So for me, 1013 00:53:58,200 --> 00:53:59,799 Speaker 2: I thought, I'm not gonna waste money on a second 1014 00:53:59,800 --> 00:54:02,760 Speaker 2: hand car, sick of secondhand problems, and I found a demo 1015 00:54:02,880 --> 00:54:05,680 Speaker 2: model of a particular Mazda, the MX-30. But 1016 00:54:05,760 --> 00:54:08,200 Speaker 2: what I love about it is it's got, they call 1017 00:54:08,239 --> 00:54:11,120 Speaker 2: them freedom doors, but everybody in the world calls them 1018 00:54:11,160 --> 00:54:14,200 Speaker 2: suicide doors. You know, the doors, the doors open up 1019 00:54:14,239 --> 00:54:16,839 Speaker 2: the opposite way, and I think that you know why 1020 00:54:16,960 --> 00:54:18,480 Speaker 2: they called them suicide doors? 1021 00:54:19,600 --> 00:54:24,760 Speaker 1: I'm not sure. Because, well, so, just to reiterate what Patrick 1022 00:54:24,800 --> 00:54:28,440 Speaker 1: said, to everyone: it's like, the door opens the 1023 00:54:28,600 --> 00:54:31,400 Speaker 1: opposite way. So instead of, you know, like, if you 1024 00:54:31,440 --> 00:54:33,440 Speaker 1: went up, you know, you'd open the handle, if you 1025 00:54:33,480 --> 00:54:35,360 Speaker 1: went up to the other end of the door, it 1026 00:54:35,560 --> 00:54:38,720 Speaker 1: opens like you're opening the door out to the world. 1027 00:54:39,000 --> 00:54:43,840 Speaker 1: Yeah, so I guess it's because you were, in 1028 00:54:44,040 --> 00:54:48,040 Speaker 1: some instances, basically committing suicide to step out into the traffic. 1029 00:54:48,280 --> 00:54:51,520 Speaker 2: No, well, yeah, kind of. So in the olden days, 1030 00:54:51,719 --> 00:54:54,000 Speaker 2: the front doors would open the normal way a front 1031 00:54:54,040 --> 00:54:56,239 Speaker 2: door would, but the back doors would be hinged at 1032 00:54:56,320 --> 00:54:59,920 Speaker 2: the rear and flip open. But because people didn't have seatbelts, 1033 00:55:00,200 --> 00:55:03,080 Speaker 2: occasionally the mechanism would fail, and while they were driving, 1034 00:55:03,440 --> 00:55:05,960 Speaker 2: the rear door, the suicide door, would flick open, and 1035 00:55:06,040 --> 00:55:09,000 Speaker 2: they would reach to grab it to stop it flicking open, 1036 00:55:09,080 --> 00:55:11,120 Speaker 2: and it would drag them out of the car. So 1037 00:55:11,280 --> 00:55:14,759 Speaker 2: people died grabbing for the rear door that flicked open 1038 00:55:14,840 --> 00:55:18,520 Speaker 2: the wrong way.
But what I was getting at is 1039 00:55:19,080 --> 00:55:21,680 Speaker 2: I've never gone for brands. My first car was a Mazda, 1040 00:55:21,719 --> 00:55:23,160 Speaker 2: and then I had a run of cars. I've only 1041 00:55:23,200 --> 00:55:25,600 Speaker 2: had four cars, Craigo. I know, you can't believe 1042 00:55:25,680 --> 00:55:27,920 Speaker 2: that I've only ever owned four cars in my whole life. 1043 00:55:29,200 --> 00:55:32,320 Speaker 2: First three were secondhand and I had them for ages. 1044 00:55:32,640 --> 00:55:34,839 Speaker 2: In fact, I've still got my Nissan, that I think 1045 00:55:34,880 --> 00:55:37,760 Speaker 2: I've had for thirty two years, sitting in my driveway. Anyway. 1046 00:55:38,120 --> 00:55:41,080 Speaker 2: But the Mazda is my most recent car, and it's 1047 00:55:41,280 --> 00:55:44,319 Speaker 2: those things that I like about it. It's got analogue still, 1048 00:55:44,680 --> 00:55:48,319 Speaker 2: so I've got knobs to control the volume, to turn 1049 00:55:48,360 --> 00:55:50,560 Speaker 2: the temperature up and down. I like that it's still 1050 00:55:50,560 --> 00:55:52,600 Speaker 2: a hybrid between, and it is literally a hybrid, but 1051 00:55:52,680 --> 00:55:55,440 Speaker 2: it's got a lot of analogue as well as digital. 1052 00:55:55,520 --> 00:55:57,279 Speaker 2: So it's got the heads-up display and all the 1053 00:55:57,320 --> 00:55:59,879 Speaker 2: fun things that I love. But it's still not quite 1054 00:56:00,160 --> 00:56:03,000 Speaker 2: a screen like an iPad sitting on a dash, which 1055 00:56:03,040 --> 00:56:05,680 Speaker 2: is what the Tesla is. So I think I like 1056 00:56:05,880 --> 00:56:10,080 Speaker 2: that combination of tech but analogue as well. That kind 1057 00:56:10,080 --> 00:56:10,759 Speaker 2: of does it for me. 1058 00:56:11,640 --> 00:56:14,680 Speaker 1: Yeah, I love that visceral experience of twisting a knob 1059 00:56:14,880 --> 00:56:18,319 Speaker 1: or sliding something across, something that's physical, that I can... 1060 00:56:19,239 --> 00:56:20,520 Speaker 1: Give us one more, then I've got to go. 1061 00:56:20,960 --> 00:56:25,400 Speaker 2: Oh, this is great. Move over, Bluetooth: wired headphones are 1062 00:56:25,520 --> 00:56:28,439 Speaker 2: back and they're super cool. So I watched the last 1063 00:56:28,520 --> 00:56:30,440 Speaker 2: episode of Stranger Things last night. 1064 00:56:31,480 --> 00:56:34,880 Speaker 1: Can I just stop you? Did you say wired? W 1065 00:56:35,200 --> 00:56:37,920 Speaker 1: I R E D, or weird? No? 1066 00:56:38,160 --> 00:56:41,520 Speaker 2: Wired. So it's become... a lot of Hollywood celebrities are 1067 00:56:41,600 --> 00:56:44,719 Speaker 2: now going back to headphones with a cord on them. 1068 00:56:45,360 --> 00:56:48,600 Speaker 2: So it is. Well, it's just the cool, trendy, hip thing 1069 00:56:48,640 --> 00:56:50,680 Speaker 2: to do, man, if you're one of the latest trendy 1070 00:56:50,800 --> 00:56:54,160 Speaker 2: young kids out there. We've got CDs, we've got cassettes, 1071 00:56:54,160 --> 00:56:55,640 Speaker 2: and we've got wired headphones. 1072 00:56:56,800 --> 00:56:59,640 Speaker 1: I'm pretty sure 'it's just the trendy thing to do, man' 1073 00:56:59,840 --> 00:57:04,319 Speaker 1: is not a trendy sentence, Patrick. Where 1074 00:57:04,360 --> 00:57:06,160 Speaker 1: can people find you and connect with you?
1075 00:57:06,719 --> 00:57:09,680 Speaker 2: Well, they can go to websitesnow.com.au, 1076 00:57:09,840 --> 00:57:12,960 Speaker 2: if they want to kind of just chat. Give 1077 00:57:13,040 --> 00:57:14,840 Speaker 2: us some topics to talk about. If you want us 1078 00:57:14,840 --> 00:57:17,880 Speaker 2: to talk about something in particular that's tech related, then 1079 00:57:17,920 --> 00:57:20,000 Speaker 2: I'll research it, I will, and I'll read more 1080 00:57:20,040 --> 00:57:22,040 Speaker 2: than one paper about it too, Craigo. 1081 00:57:22,240 --> 00:57:24,520 Speaker 1: That'd be great. Or if there's something you think Patrick 1082 00:57:24,600 --> 00:57:31,480 Speaker 1: and I should argue about, just say. That's beautiful. I 1083 00:57:31,600 --> 00:57:38,080 Speaker 1: love it. All right, mate, fast recovery to Fritzie as well. 1084 00:57:38,160 --> 00:57:43,240 Speaker 2: Fritzie went for a lick then, just as you were talking. 1085 00:57:43,600 --> 00:57:46,600 Speaker 1: Yeah, stop licking that shit. All right, mate. I'll see 1086 00:57:46,640 --> 00:57:47,680 Speaker 1: you soon. Bye, mate.