Speaker 1 [00:00:00]: G'day, Team, Craig Anthony Harper, it's the You Project. It's Gillespo. It's Cook, who's very excited, because one of Melbourne's most excitable bogans has just bought herself what's called a new exhaust system for her motorbike, because she thinks her motorbike is not loud enough. So she's just gone from a three-star bogan to a five-star bogan, and now the cops will be chasing her with her new illegal muffler.

Speaker 2 [00:00:26]: Congratulations. Thanks, Harps, I'm so excited.

Speaker 1 [00:00:33]: What precipitated you spending all of that money to literally just make your motorbike more obvious to the cops?

Speaker 2 [00:00:42]: Well, I don't do anything illegal on the bike, so that doesn't bother me. But once you recognise that you feel like you're scooting around the street on a sewing machine, that awareness never fades, and it was going to bother me for the rest of eternity.

Speaker 1 [00:00:58]: Well, that's only because you've got a friend. And I say "friend" in inverted commas, everyone, and wink wink, nudge nudge. A special friend who rides a... what's he ride? An Indian? And he's got a loud exhaust, has he? So you had exhaust envy?
Speaker 2 [00:01:17]: Yeah, everyone's got a louder exhaust than me. And don't you start.

Speaker 1 [00:01:21]: I've got no idea what you're on about.

Speaker 2 [00:01:23]: Once, I heard you start your bike.

Speaker 1 [00:01:26]: Mine's all very street legal, Gillespo. Have you ever ridden a motorbike? Yeah? I didn't think so.

Speaker 3 [00:01:34]: No desire whatsoever to ride a motorbike. And I thought the way you made them louder was to go out with a screwdriver and punch holes in the muffler.

Speaker 1 [00:01:41]: Well, until the seventies, that was the practice. But we've kicked on a little bit since then.

Speaker 3 [00:01:51]: And then you spend a grand and put an attachment on instead?

Speaker 1 [00:01:55]: Well, that's a cheap one too, that's all right. That is very cheap, because she bought it off Gumtree, whatever that is. I feel like... I have no idea. But is Gumtree dodgy, or is that legit?

Speaker 2 [00:02:07]: No, it's legit. It's just a secondhand platform, a platform where you can sell things, a marketplace. But there's plenty of dodgy people on there.

Speaker 1 [00:02:15]: Of course there are. David James Kevin Patrick Gillespie!
It's five thirty-seven as we record in Melbourne, on the sixth of November, and it's quite an historic day in the world, because they are stumbling towards the finish line, or striding brilliantly towards the finish line, of the American election. Any thoughts, any predictions? Do you give a shit? Do you pay attention? You are a lawyer; I would think that you do have a fleeting interest.

Speaker 3 [00:02:50]: Oh no, I have a lot of interest in it. I think America is about to make a very, very bad decision. I think they're probably going to elect Trump. At this stage, that's what the numbers look like. And I think that's very, very bad for America, and then, as a consequence, bad for the rest of us eventually.

Speaker 1 [00:03:14]: Well, we could do a whole episode on that. Just yes or no: sociopath, yes or no?

Speaker 3 [00:03:21]: Oh, definitely. But then again, that's not a hard question when you're talking about politicians. I mean, your ninety-nine percent answer is yes, definitely. It's, you know, "now convince me I'm wrong."
Really. Start from the proposition that you're talking about a psychopath when you're talking about a politician, and then find evidence that perhaps they're not.

Speaker 1 [00:03:46]: Well, this time tomorrow... I think... well, no, let's not have any political debates. That's not what we do here. I've consciously avoided doing that. We've never...

Speaker 3 [00:03:58]: Not once talked about politics. And my opinion is not really about politics. I'm not a Republican or a Democrat. I'm just looking at the things he says he's going to do, and the likelihood that he will do it, and how that's going to mean some pretty bad outcomes for the world.

Speaker 1 [00:04:13]: Yeah. Yeah. Well, time will tell, and I just hope there's... I hope there's not a civil war. I hope there's...

Speaker 3 [00:04:25]: I think it's less likely if he wins, because the people who would lose are, (a), likely to accept the result, because they believe in democracy, and, (b), aren't the armed crazy people, or have less of them.

Speaker 1 [00:04:43]: Wow. Wow. All right. So, if you don't know, everyone, we've got a page for the podcast, which is brilliantly named the You Project Podcast Facebook page.
Tiff and I worked on that tirelessly. We had a team of branding and marketing people. We paid thousands. It took about three weeks, but we finally came up with "the You Project Podcast" as the name of the page.

Speaker 3 [00:05:12]: So I want to know about... I'm going to interrupt you here, because it brings to mind a story back in the tech days, when I was in the tech industry and heavily involved in a partnership with Intel through a startup that I did in the United States. We were working with them closely on a chip they were about to bring out, which was ultimately released as the fifth version of the Pentium chip. And they had a huge marketing team with a massive marketing budget aimed at trying to figure out what to call this thing. And after about six weeks of deliberations and user surveys and focus groups and all that sort of crap, they came up with... Pentium 5. Which was the successor to Pentium 4. Wow. That was worth the money.

Speaker 1 [00:06:03]: Wow. Did you... you would have recoiled with bloody amazement and admiration at the time.
Speaker 3 [00:06:11]: We were all in shock. We were in shock. We would never have predicted that that would be where they would go, because they'd gone from Pentium 3 to Pentium 4. So, you know, they could have gone anywhere with that.

Speaker 1 [00:06:21]: Who would have seen that? Yeah, that's probably one of the great creative masterstrokes of all time. And was there a subsequent Pentium 6?

Speaker 3 [00:06:32]: Oh yeah, I think so. Yeah, yeah, no doubt.

Speaker 1 [00:06:35]: After much deliberation. Now, before Gillespo bloody rudely interrupted me, everyone... yeah, I was saying, we've got this page. Anyway, on this page you are more than welcome to... there's no hooks, catches or agendas, but become a member of the... just jump on and click... I don't know how the fuck you do it, actually. I've never done it. You just click something. Tiff's shaking her head. I don't know. Speaking of branding and...
Speaker 4 [00:07:02]: ...marketing. And I'm shit at selling my own stuff, but it costs nothing, and a bunch of people get in there and talk about the podcast, and/or perhaps who they want to have on coming up, or suggestions or ideas for topics that we might explore. And also, of late...

Speaker 1 [00:07:17]: A young mister Gillespie is sharing. And I don't like complimenting him, because, you know, he doesn't need it, but he does write brilliant fucking articles, and so most of his articles are going up, or the ones that he sees fit to put up, on our page. So not only can you listen to him once a fortnight, you can read his stuff. And so, speaking of that, let's start with... I didn't talk to you about this one before we went live, but "The Trust Apocalypse: Why Honesty Is Our Only Hope". And the kind of hook, your preface, was: is trust becoming a luxury we can no longer afford? Feeling a little dystopian lately? Supermarkets overcharging you, politicians lying to your face, constantly being spied on. You're not alone. My latest article is about the alarming erosion of trust in our society and the serious consequences it brings.
And there's more, but that will kick you off. Open that door, if you would, Gillespo.

Speaker 3 [00:08:19]: Yeah. So, it's one of the fundamental things. We've talked a little bit about this before, but we're an unlikely apex predator, because we really have nothing to recommend ourselves as winners of the evolutionary fight. You know, we're sort of meat on feet, really. No fangs, no venom, no armour, no nothing. And yet we own this place. So what's our evolutionary advantage? It isn't our good looks. It isn't how loud our motorbikes are. It's... maybe it's that we work together well with others. We're the only species on the planet that can cooperate with strangers. No other species does this. Now, there's other species that work well in family groups. You know, elephants, for example, chimpanzees, things like that. They'll work together in large family groups of, you know, uncles, nieces, nephews, et cetera. But they don't work with strangers.
They don't get together with strangers and arrange themselves in large groups of hundreds or thousands, or, in humans' case, millions of strangers who are all cooperating towards a goal. We're the only ones who do that. And people might say, oh yeah, but what about ants? Well, ants are essentially one organism with lots of moving parts. They're not individuals working together; they're an organism. But with humans, we're separate organisms cooperating with each other at really large scale, which means that when we're going to take down the woolly mammoth or the sabre-tooth tiger, we've got twenty mates who'll help us, and we can cooperate in real time and trust that none of them will stab us in the back. And that last bit is the important bit. What allows us to work well with strangers is that we trust strangers. Our number one, first go-to, when we think about whether or not we're going to work with someone, is we default to trust. We assume you are not going to double-cross us. Now, sometimes we're wrong.
If we're dealing with a psychopath, we will probably be wrong, because they will see that we trust them and exploit it against us. But for the vast majority of us, the other ninety-five percent of the community, we trust each other by default, and that trust is what allows us to work together. It allows you to sit in a room with people sitting behind you and not be anxious about the fact that one of them might stab you in the back. It allows you to focus your energy on building an outcome that is greater than the sum of the parts, because as a community of one hundred people working together, you can always do more than an individual can do, just because you don't have enough time, even if people weren't out to get you. So, talking about your page before, you've got, what, three and a half thousand people on that page? As a community of three and a half thousand people, they can work to solve problems and share information, because they all trust each other, and that's what gives it the power.
They can do much more there than you can do on your own. And so the piece is about the fact that that's eroding now in modern society. We're increasingly ending up in a position where it's every man for himself, so the community asset is draining away. We're not able to trust that the community will look after us. We're not able to believe that those in power have our best interests in mind. And people often say, well, yeah, that's the reality. I understand that. That's why you've got no one volunteering to do the tuck shop any more. It's why no one volunteers for anything, because it's every man for himself. When it's every man for himself, most people lose. And a society that operates on every man for himself will always lose against a society that doesn't, because the society that doesn't has the power of the community working together. And the only way to bring that back into a society is with honesty.
Demand honesty and transparency from those in power, punish them when they're not honest, and you will have people behave in the best interests of the community, whether they're psychopaths or not, because the only way to get ahead is to have the community support you, and the only way to have that happen is to be honest with them, so that they believe you. Older listeners in your audience may well remember a time when politicians were punished for lying. Where a politician lied to the population, no matter how little it was, no matter what it was about, they'd be unlikely to ever win an election again, and they'd probably have to leave office as soon as it was discovered. Now that doesn't happen. Now politicians lie continuously to us and are still in the job. And that's because we let them do it. It's not like we're falling for it. We know they're lying to us, but we let them do it.

Speaker 1 [00:13:35]: It's... yeah, I was thinking... we'll come back to the politics. But in terms of, like, trust, and this is a funny thing, but it relates.
I've thought many times, because I've been riding motorbikes on the road since I was eighteen, right? And you trust, when you're on, just say, a single-lane road, so one lane each way. Like, I'm riding one way, a truck or even a ute is coming the other way. They just need to deviate a little bit, and I'm dead and they've got a dent, you know what I mean? It's like you're trusting that everyone is going to do what they are meant to do, legally, on the road, because when you're a motorcyclist, of course, you're very, very vulnerable, and you are relying on other people's... I guess not honesty, but other people's intention to not hurt you, and to abide by the law and do what's required. And I even think that sometimes, because I live on a busy main road, and every time I want to go get a coffee, I've got to get across that busy road. And sometimes, when a car's going by me, it's like the difference between being fully alive and fully dead is about two feet.
You know, if that car... it's like, there's a car going past me at sixty k's. If that car hit me full on, I'd probably be dead, if not, you know, next to dead. And that's, like, whizzing by me. All they've got to do is not concentrate, or, worse still, be a fucking psychopath, you know, and I'm dead. But you just... I stand on the side of the road, and I'm literally inches from where they're whizzing by at sixty k's, trusting that they're going to stay on the trajectory they're on when I get to the edge of the road.

Speaker 3 [00:15:37]: And, I mean, there's lots of mechanisms to enforce trust, one of them being that they would be punished if they did run into you. So policing is an effective way of enforcing it, of making people trust and cooperate. But you can't police everything all the time, and it's really trust. Someone famous, whose name I can't remember, once said trust is like air: when it's there, you don't notice it; when it's not, you notice it very, very quickly. And that's what it's like with trust.
So we've had a recent example of the air being sucked out of our society, in the form of the trust being sucked out, and I mentioned it in that piece. Last week we had the report into Australia's COVID response, and what it found was that the response of government, in failing to properly explain what they were doing, failing to have evidence behind what they did, and the overreaction in terms of the enforcement of lockdowns and so on, had a massive impact on the trust that society has in public health authorities. And it's translating through into other public health initiatives now. Now people are less likely to get vaccinated, are less likely to comply with public health directives, because they don't trust the authorities. Whilst they complied at the time, because they didn't have a choice, what's happening now is the blowback from that. And yes, that's the thing about trust.
When you suck it out, which is what happened there, when you burn that credit, when you erode the trust, then when you need it, it won't be there. People won't trust you, and they won't do what you want, what you need them to do. And that's happening. That's one example, but it's happening across all of our institutions in society. More and more, we are becoming aware that we're being lied to, and the result is we don't trust those in authority, which means we won't do what they want us to do, because we're thinking about every man for himself. And whilst that might be an immediate thing that protects us, it's a bad thing for our society.

Speaker 1 [00:17:57]: Mmm. Yeah. It was such an interesting time. It was, like, such a weird intersection of, like, biological and sociological and commercial and medical kinds of interest.
And I remember Brett Sutton, or whatever his name is, standing up... he's the... just the Victorian one, Tiff, I think he's the... yeah. And, like, emphatically going... well, you know, I think it was him that said it, or it might have been a lady, I forget, but basically, you know, "once you're vaccinated, that's it, you can't get COVID, or you can't infect anyone else". It's like, well, neither of those things were true. And "we unequivocally know that this works". It's like, well, no, lots of people got vaccinated and then fucking had COVID, you know. And I think there was quite a bit of evidence to say that once you've had COVID, that's a much better "vaccine", in inverted commas, than the vaccine itself. You know, like, your immunity is much greater post-COVID than post-vaccination, but nobody wanted to talk about that.

Speaker 3 [00:19:12]: Yeah. The trouble with that is... and I think I heard someone say it at the time, I can't remember who it was.
I'd love to give them the credit for it, but some expert got on the radio or something and said, one of the risks we take with this is overstating the efficacy of these vaccines. Because they knew they were doing that, just as they know they're doing it with the flu vaccine, right? I mean, the highest point the flu vaccine's ever reached in terms of efficacy is something under fifty percent, which basically means it's a matter of chance, and that's the highest point. So, you know, they knew they were doing that with the COVID vaccines as well, but they felt that the public health outcomes were more important. The risk that they run by doing that is that they make people believe that applies to all vaccines, and it doesn't. There are vaccines with extremely high efficacy. But if you lie to people about a vaccine, people translate that across to every vaccine, and that's a very bad thing for society, because then you have people not trusting something that they should trust.
339 00:20:21,440 --> 00:20:25,560 Speaker 1: What do you think would have happened if, you know, 340 00:20:25,640 --> 00:20:29,359 Speaker 1: the leading medical people that were shoved in front of 341 00:20:29,440 --> 00:20:35,320 Speaker 1: microphones at that time had said, look, we don't know, 342 00:20:36,400 --> 00:20:39,640 Speaker 1: which is the truth, we don't actually know, 343 00:20:39,840 --> 00:20:42,920 Speaker 1: like, we think this, we believe this, but we don't 344 00:20:43,000 --> 00:20:45,359 Speaker 1: unequivocally know, because this is new. 345 00:20:46,200 --> 00:20:48,920 Speaker 3: The good thing is that there's actually a lived example 346 00:20:48,920 --> 00:20:52,280 Speaker 3: of a country that did exactly that, and that's Sweden. 347 00:20:52,920 --> 00:20:57,680 Speaker 3: So unlike every other country in Northern Europe, Sweden went 348 00:20:57,720 --> 00:21:00,800 Speaker 3: down a path of not doing lockdowns and not doing 349 00:21:00,920 --> 00:21:03,639 Speaker 3: vaccine mandates and all of that sort of thing, and 350 00:21:03,840 --> 00:21:06,920 Speaker 3: essentially, you know, when it all boiled down at 351 00:21:06,920 --> 00:21:09,600 Speaker 3: the end of the day, years later, it worked out 352 00:21:09,600 --> 00:21:13,640 Speaker 3: that they had the same outcomes as everybody else. So 353 00:21:15,080 --> 00:21:17,440 Speaker 3: they didn't go through all the economic pain of lockdowns 354 00:21:17,440 --> 00:21:19,919 Speaker 3: and so on, and they didn't go through all of 355 00:21:19,960 --> 00:21:24,280 Speaker 3: those mandates, and they arrived in the same place. So 356 00:21:25,240 --> 00:21:27,720 Speaker 3: it's great to look at these things with twenty-twenty hindsight. 357 00:21:27,920 --> 00:21:30,800 Speaker 3: It's different when you're in the heat of it, with the World 358 00:21:30,800 --> 00:21:34,520 Speaker 3: Health Organization running around saying it's a deadly pandemic, et cetera, et cetera.
359 00:21:35,240 --> 00:21:38,360 Speaker 3: But one of the things that came out of that 360 00:21:38,680 --> 00:21:44,240 Speaker 3: review was that, in making pronouncements as if they had evidence 361 00:21:44,840 --> 00:21:49,280 Speaker 3: when they knew they didn't, people had a sense of 362 00:21:49,320 --> 00:21:54,479 Speaker 3: that and went along with it because of fear and 363 00:21:54,600 --> 00:21:57,200 Speaker 3: enforcement and so on, or whatever it was. But they 364 00:21:57,240 --> 00:21:59,879 Speaker 3: still had an abiding sense that they were being lied to. 365 00:22:00,440 --> 00:22:03,440 Speaker 3: And we have, as humans, a good sense of when 366 00:22:03,440 --> 00:22:05,840 Speaker 3: we're being lied to, even if we do nothing about it. 367 00:22:06,520 --> 00:22:08,879 Speaker 3: We know when we're being lied to by a politician 368 00:22:08,960 --> 00:22:12,240 Speaker 3: or a leader or a business or whatever, which 369 00:22:12,280 --> 00:22:14,040 Speaker 3: is why it's so important. I mean, one of the 370 00:22:14,040 --> 00:22:16,119 Speaker 3: things I looked at in that article was the recent 371 00:22:16,160 --> 00:22:18,800 Speaker 3: thing with Woolworths, where they said the reason our profits 372 00:22:18,800 --> 00:22:21,800 Speaker 3: have dropped is that customers no longer trust us. They 373 00:22:21,920 --> 00:22:25,239 Speaker 3: understand the importance of that. They understand how important it 374 00:22:25,280 --> 00:22:28,000 Speaker 3: is, so much so that they actively measure it. They have 375 00:22:28,160 --> 00:22:33,520 Speaker 3: internal metrics that tell them whether they are trusted, and 376 00:22:33,560 --> 00:22:36,440 Speaker 3: it's vitally important to them because it's vitally important to 377 00:22:36,480 --> 00:22:41,240 Speaker 3: their bottom line. They need us to trust them. And 378 00:22:41,320 --> 00:22:45,480 Speaker 3: so the interesting thing is, okay.
When it comes 379 00:22:45,480 --> 00:22:48,600 Speaker 3: down to dollars and cents, people can be really nakedly 380 00:22:48,680 --> 00:22:51,159 Speaker 3: obvious about this and say, we've got to have trust; 381 00:22:51,240 --> 00:22:53,399 Speaker 3: it's as important a commodity in our business 382 00:22:53,440 --> 00:22:55,720 Speaker 3: as anything else. We've got to measure it, we've got 383 00:22:55,720 --> 00:22:59,240 Speaker 3: to make sure it's okay. And yet we have other 384 00:22:59,400 --> 00:23:02,680 Speaker 3: areas of our society where it's freely being burned 385 00:23:03,720 --> 00:23:06,520 Speaker 3: and no one is either measuring it or caring about it. 386 00:23:06,720 --> 00:23:12,520 Speaker 3: But the consequences are starting to be felt, where 387 00:23:12,720 --> 00:23:15,520 Speaker 3: most people, if you ask them their political opinion 388 00:23:15,560 --> 00:23:18,400 Speaker 3: about something, would say, I don't trust any of them. 389 00:23:18,720 --> 00:23:20,760 Speaker 3: I don't like any of them, I don't trust any 390 00:23:20,760 --> 00:23:22,720 Speaker 3: of them. I wish I had someone else to vote for. 391 00:23:24,840 --> 00:23:27,560 Speaker 3: And that is a symptom of what's going on. And 392 00:23:27,600 --> 00:23:29,800 Speaker 3: then the end result of that is what's going on 393 00:23:29,840 --> 00:23:32,840 Speaker 3: in the US at the moment, which is, it's about: 394 00:23:32,880 --> 00:23:35,720 Speaker 3: I hate them both, but this guy at least seems 395 00:23:35,760 --> 00:23:37,840 Speaker 3: to think that he can solve the problem and is 396 00:23:37,840 --> 00:23:42,720 Speaker 3: not promising more of the same. Yes, yes. And I 397 00:23:42,720 --> 00:23:44,520 Speaker 3: don't trust him any more than I trust the other guy, but 398 00:23:44,520 --> 00:23:46,080 Speaker 3: at least he's saying the right things.
399 00:23:46,359 --> 00:23:49,480 Speaker 1: Yeah, it's almost down to, if you're going to 400 00:23:49,560 --> 00:23:52,679 Speaker 1: vote, which over there, of course, is optional, it's like, 401 00:23:52,760 --> 00:23:56,840 Speaker 1: who do you hate the least? Or, yeah, who are 402 00:23:56,840 --> 00:23:58,760 Speaker 1: you the least skeptical of? 403 00:23:58,960 --> 00:24:00,920 Speaker 3: Look, we're trending to that here too. 404 00:24:01,680 --> 00:24:08,560 Speaker 1: Yeah, yeah. Let's do one more topic before Tiff heads 405 00:24:08,600 --> 00:24:11,480 Speaker 1: off to get her new bloody pipe put on her motorbike. 406 00:24:12,560 --> 00:24:15,399 Speaker 1: If I hear a loud noise outside my house tonight, 407 00:24:15,480 --> 00:24:19,159 Speaker 1: echoing up and down Hampton Street at eleven pm... 408 00:24:19,320 --> 00:24:22,280 Speaker 2: I was actually thinking about driving that way home, just 409 00:24:22,760 --> 00:24:23,640 Speaker 2: a little rev. 410 00:24:23,520 --> 00:24:25,840 Speaker 3: A little... yeah, no, don't. But it'd be at three, 411 00:24:25,880 --> 00:24:27,520 Speaker 3: wouldn't it? Three in the morning, it wouldn't be eleven. 412 00:24:28,000 --> 00:24:31,760 Speaker 2: I'm not up at three, silly. Eleven. 413 00:24:33,880 --> 00:24:39,000 Speaker 1: So you wrote, you wrote an article called Plunging into Recovery: 414 00:24:39,080 --> 00:24:45,720 Speaker 1: Can Ice Baths Rewire the Addicted Brain? Before you, before 415 00:24:45,760 --> 00:24:48,440 Speaker 1: you open that door wide, have you ever had an 416 00:24:48,440 --> 00:24:49,040 Speaker 1: ice bath? 417 00:24:49,200 --> 00:24:55,760 Speaker 3: Oh, I haven't. Not intentionally, not intentionally. I have occasionally 418 00:24:55,840 --> 00:24:57,800 Speaker 3: jumped into water that was a lot colder than I 419 00:24:57,880 --> 00:25:02,280 Speaker 3: thought it was.
But one of the times I was 420 00:25:02,359 --> 00:25:05,040 Speaker 3: in Germany, and you know, the Germans I was staying 421 00:25:05,080 --> 00:25:08,080 Speaker 3: with were all leaping into this alpine lake, you know, 422 00:25:08,160 --> 00:25:09,840 Speaker 3: as if it was like a summer day at the beach. 423 00:25:09,880 --> 00:25:12,119 Speaker 3: And so I joined them, and I swear that thing was 424 00:25:12,160 --> 00:25:15,560 Speaker 3: just above freezing, but they acted like 425 00:25:15,560 --> 00:25:15,800 Speaker 3: it was... 426 00:25:15,840 --> 00:25:21,200 Speaker 1: Alpine, and lake. Yes, alpine and lake. I should have thought. 427 00:25:23,440 --> 00:25:26,920 Speaker 3: I have done unintentional ice bath-like experiences. 428 00:25:27,000 --> 00:25:31,199 Speaker 1: Yes, yeah, we'll call that ice bath-ish. All right, 429 00:25:31,240 --> 00:25:33,920 Speaker 1: Plunging into Recovery. And by the way, both of 430 00:25:33,960 --> 00:25:36,560 Speaker 1: these articles are in the group. If you want to read 431 00:25:36,560 --> 00:25:42,480 Speaker 1: the actual full articles, the You Project podcast Facebook page. Tell 432 00:25:42,560 --> 00:25:44,879 Speaker 1: us about, tell us about this one, mate. 433 00:25:45,680 --> 00:25:49,200 Speaker 3: Okay. So this came up because I was doing a 434 00:25:49,240 --> 00:25:55,480 Speaker 3: talk to a group that's about helping people through addictions, 435 00:25:55,520 --> 00:25:58,119 Speaker 3: so a rehab-type group, who were quite interested in 436 00:25:58,160 --> 00:26:01,320 Speaker 3: some of the things I've written, particularly in Brain Reset, 437 00:26:01,400 --> 00:26:04,679 Speaker 3: about how do you break addictions? And one of the 438 00:26:04,720 --> 00:26:08,880 Speaker 3: group said, what do you think about ice baths?
And honestly, 439 00:26:09,240 --> 00:26:11,399 Speaker 3: I'd never even thought of it as a way of 440 00:26:11,440 --> 00:26:13,600 Speaker 3: breaking an addiction, which is kind of ignorant of me. 441 00:26:14,840 --> 00:26:17,719 Speaker 3: So I immediately went away and started doing some fairly 442 00:26:18,200 --> 00:26:21,640 Speaker 3: detailed research into it, and came to the conclusion that, look, 443 00:26:21,680 --> 00:26:26,160 Speaker 3: there is a strong chance that the shock produced by 444 00:26:26,720 --> 00:26:30,200 Speaker 3: an ice bath... because when you do that, it's a 445 00:26:30,320 --> 00:26:34,960 Speaker 3: terrifying experience for your body. There's a significant shock experience 446 00:26:35,000 --> 00:26:39,840 Speaker 3: associated with plunging yourself into an ice bath, and that 447 00:26:39,920 --> 00:26:43,960 Speaker 3: shock experience produces a significant hit of dopamine. And some 448 00:26:44,000 --> 00:26:46,960 Speaker 3: of the studies have said, look, we're talking two, three 449 00:26:47,080 --> 00:26:50,879 Speaker 3: times as much dopamine in a single hit as you 450 00:26:50,920 --> 00:26:54,800 Speaker 3: would get from cocaine. This is a big hit of 451 00:26:54,840 --> 00:27:00,479 Speaker 3: dopamine, naturally produced by the body. Quite a lot 452 00:27:00,480 --> 00:27:03,040 Speaker 3: of studies have now looked at it; you know, there are lots 453 00:27:03,080 --> 00:27:05,040 Speaker 3: of purported benefits of ice baths, and I go 454 00:27:05,040 --> 00:27:06,200 Speaker 3: into all of that, but I did look at this 455 00:27:06,600 --> 00:27:10,480 Speaker 3: dopamine aspect, because one of the best ways to naturally 456 00:27:10,480 --> 00:27:13,040 Speaker 3: break an addiction is to substitute. When you get the 457 00:27:13,080 --> 00:27:17,439 Speaker 3: craving for whatever it is you're addicted to, substitute.
Instead 458 00:27:17,440 --> 00:27:20,120 Speaker 3: of doing that, do something that produces a natural dopamine hit. 459 00:27:20,240 --> 00:27:22,800 Speaker 3: Now, we've talked about this before. That could be, 460 00:27:23,160 --> 00:27:25,639 Speaker 3: you know, running a marathon. It could be having a 461 00:27:25,680 --> 00:27:28,359 Speaker 3: hobby that thoroughly engrosses you. It could be playing a musical 462 00:27:28,359 --> 00:27:30,920 Speaker 3: instrument, as long as you can play it well: 463 00:27:30,920 --> 00:27:34,239 Speaker 3: something that requires extreme focus and produces dopamine. Well, this 464 00:27:34,359 --> 00:27:36,480 Speaker 3: might be a shortcut to that, in that what you might 465 00:27:36,560 --> 00:27:39,680 Speaker 3: be able to do is just go take a cold shower, which, 466 00:27:39,720 --> 00:27:43,360 Speaker 3: by the way... cold showers, while not producing as big 467 00:27:43,359 --> 00:27:46,960 Speaker 3: a hit as an actual ice bath, do produce a 468 00:27:46,960 --> 00:27:52,000 Speaker 3: dopamine hit too. And so an interesting piece of advice 469 00:27:52,119 --> 00:27:55,840 Speaker 3: might well be, when you feel the cravings, go take 470 00:27:55,880 --> 00:28:00,720 Speaker 3: a cold shower. And there's some good science behind this now, 471 00:28:00,760 --> 00:28:03,600 Speaker 3: some really good science that suggests that this might 472 00:28:03,720 --> 00:28:08,159 Speaker 3: be a reasonable path to take when you're trying to 473 00:28:08,160 --> 00:28:09,320 Speaker 3: rehab people. 474 00:28:10,600 --> 00:28:13,080 Speaker 1: Or, Tiff, the other one would be, get an 475 00:28:13,119 --> 00:28:16,480 Speaker 1: aftermarket exhaust on your motorbike. 476 00:28:18,080 --> 00:28:21,159 Speaker 2: Ten out of ten, can recommend. 477 00:28:21,320 --> 00:28:24,240 Speaker 3: And that's not as silly as it sounds.
Just look 478 00:28:24,240 --> 00:28:27,600 Speaker 3: at the way Tiff's reacting to this thing, and she's 479 00:28:27,760 --> 00:28:29,879 Speaker 3: desperately excited to do it, and when it's on, 480 00:28:29,920 --> 00:28:31,920 Speaker 3: when she hits that thing and it makes that really, 481 00:28:31,960 --> 00:28:34,600 Speaker 3: really loud noise, it is going to be like having 482 00:28:34,600 --> 00:28:39,600 Speaker 3: a hit of cocaine for her. And so, yeah, find 483 00:28:39,640 --> 00:28:41,560 Speaker 3: that thing that does that for you. But if you're 484 00:28:41,600 --> 00:28:43,760 Speaker 3: not sure what that is, if you're not like Tiff 485 00:28:43,760 --> 00:28:48,480 Speaker 3: and you're not excited by loud motorbikes, a cold shower 486 00:28:48,520 --> 00:28:49,000 Speaker 3: might do it. 487 00:28:49,920 --> 00:28:53,280 Speaker 1: Wow. Tiff, you could write a book called Motorbikes, 488 00:28:53,320 --> 00:28:59,320 Speaker 1: Puppies and Punching People, and Other Addictions... or, Other Addiction 489 00:28:59,560 --> 00:29:05,320 Speaker 1: Treatments and Other Dopamine Producers, by Tiff. You just... 490 00:29:05,240 --> 00:29:07,120 Speaker 3: Or just call it The Motorbike Cure. 491 00:29:08,280 --> 00:29:10,760 Speaker 1: Yeah. Do you know what is funny? I know we're 492 00:29:10,760 --> 00:29:13,600 Speaker 1: going to wind up, but what's funny is I've had 493 00:29:14,480 --> 00:29:16,440 Speaker 1: bikes since I was a kid, and over the years I've 494 00:29:16,480 --> 00:29:19,080 Speaker 1: taken a lot of people on the back, 495 00:29:19,720 --> 00:29:22,680 Speaker 1: and depending on how they feel about going on the back, 496 00:29:23,280 --> 00:29:26,080 Speaker 1: they're either producing a shitload of dopamine or a shitload 497 00:29:26,120 --> 00:29:26,760 Speaker 1: of cortisol.
498 00:29:27,880 --> 00:29:30,880 Speaker 3: Same thing. No, no, same thing: terror, being terrified, 499 00:29:31,280 --> 00:29:33,960 Speaker 3: which is kind of what the ice bath 500 00:29:34,040 --> 00:29:38,600 Speaker 3: is doing to you. Being terrified produces a significant dopamine hit. 501 00:29:39,000 --> 00:29:43,360 Speaker 3: So if you want to scare yourself, then you can 502 00:29:43,360 --> 00:29:44,960 Speaker 3: do it with an ice bath. Or if you're not 503 00:29:45,000 --> 00:29:47,160 Speaker 3: into bikes, you know, I guess you could ride pillion 504 00:29:47,240 --> 00:29:48,120 Speaker 3: with Tiff. 505 00:29:48,480 --> 00:29:55,000 Speaker 1: Yeah, well, yeah, definitely don't do that. Or watch... yeah. 506 00:29:55,640 --> 00:29:59,280 Speaker 1: Well, you delivered again, mate. And next time we chat 507 00:29:59,320 --> 00:30:02,280 Speaker 1: there'll be, there will be a new boss in the 508 00:30:02,280 --> 00:30:06,840 Speaker 1: White House. And yes. So if you had to 509 00:30:06,880 --> 00:30:10,640 Speaker 1: make a prediction, at where we are right now, 510 00:30:10,680 --> 00:30:12,880 Speaker 1: on Tuesday or Wednesday... 511 00:30:12,760 --> 00:30:14,600 Speaker 3: At where we are right now. And by the way, 512 00:30:14,640 --> 00:30:17,160 Speaker 3: I'm notoriously bad at predicting things like this, but at 513 00:30:17,160 --> 00:30:19,080 Speaker 3: where we are right now, it feels like it's a 514 00:30:19,120 --> 00:30:19,520 Speaker 3: Trump win. 515 00:30:20,240 --> 00:30:20,480 Speaker 2: Wow. 516 00:30:20,880 --> 00:30:24,080 Speaker 1: Well, we'll wait with bated breath. Tiff, you want to 517 00:30:24,120 --> 00:30:26,320 Speaker 1: say something? You look like you're sticking your finger up. 518 00:30:26,680 --> 00:30:31,240 Speaker 2: Can we acknowledge Glespo's strapping new profile? 519 00:30:32,360 --> 00:30:34,520 Speaker 3: Is that incredible?
520 00:30:37,840 --> 00:30:39,440 Speaker 2: Are you going to pop that on your website? Because 521 00:30:39,440 --> 00:30:41,760 Speaker 2: the first thing I did was go to freeschool dot org, 522 00:30:41,880 --> 00:30:43,240 Speaker 2: and have you... 523 00:30:43,240 --> 00:30:44,000 Speaker 1: And updated it? 524 00:30:44,560 --> 00:30:47,880 Speaker 3: I will have to do that. Yeah, yeah, this 525 00:30:47,920 --> 00:30:51,800 Speaker 3: AI thing is incredible. You give it, you 526 00:30:51,880 --> 00:30:54,880 Speaker 3: give it like ten real photos of yourself, and obviously 527 00:30:54,960 --> 00:31:01,400 Speaker 3: I gave it younger ones of me, and it 528 00:31:01,520 --> 00:31:04,720 Speaker 3: produces like a hundred versions. You can tell it how 529 00:31:04,760 --> 00:31:07,280 Speaker 3: you want to be dressed, whatever, and off you go. 530 00:31:08,280 --> 00:31:11,080 Speaker 1: Which, which program is that? Or which app? 531 00:31:10,920 --> 00:31:13,080 Speaker 3: Stuffed if I know, I'd have to 532 00:31:13,120 --> 00:31:14,840 Speaker 3: send it to you, all right. 533 00:31:14,840 --> 00:31:17,200 Speaker 1: Send it to me, all right. We'll say goodbye here, 534 00:31:17,200 --> 00:31:20,240 Speaker 1: but as always, brilliant. Thank you, mate. We appreciate you. 535 00:31:20,800 --> 00:31:22,200 Speaker 3: No worries, see you later, guys.