Speaker 1: G'day team. It's the bloody project. That's you, Tiffany Cook over there in the fancy ritzy area of Melbourne that she lives in, and Gillespie up there in the bloody jungle in Queensland. We'll start with the smart one. G'day, Tiff.

Speaker 2: G'day.

Speaker 1: We'll go to the beautiful one next. How are you? How's your day been?

Speaker 2: I've been fabulous, thank you.

Speaker 1: Well, you should thank me, because I just finished training you and your gorilla boyfriend twenty minutes ago. Your Neanderthal boyfriend, who can cook, like, world class apparently. Yeah, oh well, good for you. Doctor Gillespie, welcome to the show. Fourteen days later, here we are. How are you?

Speaker 3: Yeah, good, good. Yeah, it's been, yeah, flat out. Fourteen days for me actually feels like several months.

Speaker 1: Actually, I thought you tried to avoid work as much as possible. How dare you? Yeah?

Speaker 3: Well, the new book's coming out twenty-eighth of April, and so there's a bit of stuff that has to happen ahead of that, including producing the actual book.
Speaker 1: So, yeah. Well, because you're the world's worst at fucking self-promotion. If I managed you, if I was your agent or anything, I'd be like, dude, do better. So give us, give us, without doing any sales, you just tell us what you can about the book.

Speaker 3: Uh, yeah, I'd rather not at the moment.

Speaker 1: That's a terrible start. I rest my case.

Speaker 3: It's called The Attention Recovery Plan, right. And there's a lot of stuff written, including by me, about the damage that devices and addictions are doing to us. So, you know, the social media, gaming, all that kind of thing. And I've written a lot about that, and made it really, really clear that the biochemistry tells us that this is no different to substance abuse, and it's having the same consequences, which is severe addiction, anxiety, depression, psychosis, schizophrenia. And, you know, we're going to hell in a handbasket as a society, because the rates of those things are increasing incredibly quickly, because we're addicting everybody. And so I've written about that before, we've spoken about it before, I've written books about it.
The most recent time was a book I wrote called Brain Reset a few years ago. This one is about: okay, that's brilliant, thanks for scaring me to death, now what do you do about it? And the answer can't be, well, chuck your phone in the bin, because if it were that easy, it wouldn't be an addiction. So that's not what this book's about. This book's about how do you intentionally recover your attention? How do you intentionally recover your focus? How do you, I guess, for want of a better term, go through a focus workout to make sure you can still do it, and you can still apply your mind? And how do you combat the inevitable disease wave that comes with addiction: the anxiety, the depression, the ADHD, all of those things? So this is an intensely practical book about what do you do. There's a little bit of science in there, some of it you'll recognize from things that we've discussed over the years, but it's mostly about what do you do about it?
And there's a lot of step-by-step guides, a lot of planning exactly what to do: how to organize your workspace, how to organize your personal life, all of those sorts of things, to ensure that you maintain focus fitness.

Speaker 1: Wow, that's good. And the fact that it's outlining the problem but really talking about and presenting a solution, I love. And it is interesting when you think that, whether it's booze, or whether it's gambling, or whether it's drugs, or whether it's porn, or whether it's food, I mean, the addiction is kind of the same in a way. It's just a different pathway to a similar addiction, right?

Speaker 3: It's exactly the same. From a biochemical perspective, your brain can't tell the difference between a gambling addiction, a gaming addiction, a social media addiction, or an addiction to heroin. It can't tell the difference. From a biochemical perspective, they're all exactly the same.

Speaker 1: And without giving away anything you can't, so feel free to not answer it: did you, where did you, did you talk to people who had overcome this addiction?
Did you talk to addiction specialists? Did you read a million papers, as you do? From where did you kind of create or formulate your suggestions, or your kind of recovery models?

Speaker 3: It's a combination of a few of those things. The book has a lot of stories, anonymised obviously, but a lot of real-world stories from people who've struggled with these things. You know, things like dating app addictions, gaming addictions, gambling addictions and so on, and real-life stories of that. And the reason for putting those in is, rather than me just banging on about how all this is bad, I want people to be able to read the story of a person and identify with them, see themselves in the story, see something that they're doing. Because one of the difficulties with addiction is it usually comes with an enormous moral judgment. There's this general sense that if you're an addict, it's because you're weak or you're in some other way faulty, and you just need to get your act together. And we can't help it; it's just ingrained in the way we operate as humans.
You know, if you see someone addicted to pornography, it's because they've got something wrong with them; or you see someone addicted to gambling, they're weak in some way. But that's not what's going on. What's going on is a very intentional perversion of some biochemistry in the human brain by people who are doing it to make money. And we need to not lose sight of that: this is a business model we're talking about here, and this book is about fighting that business model and keeping your own brain together.

Speaker 1: And I guess the reality is, as much as we don't want to hear this, if we put aside morality and all of that, just for this hypothetical: it's in the interest of a lot of these companies and apps and all of that to get people addicted, because it's financially rewarding.

Speaker 3: Well, we've said this before. You know, when the product is free, you are the product. And that's what's going on when people are selling addiction.
The point here is, whether it's playing games or gambling or on social media: it costs billions of dollars to develop, maintain and run something like Facebook or Instagram, and what does it cost you to use it? Nothing. Not a cent. That just doesn't make sense as a business model; that's totally insane. And that's because that's not the business model. The business model is capturing seconds of attention, yes, or engagement as it's often called, and then reselling that to the highest bidder. And the best way to do that is to addict people.

Speaker 1: We talk about it a fair bit, but it feels to me like... well, some of what you've said so far will be revelatory to some; it'll be like, oh wow, I didn't really think about it. But I think most of us kind of know that there's a problem around social media and the like.
But I also feel like, while we kind of get intellectually that it's a potential threat for some people in some ways, if not all of us, we don't hold it in the same regard as booze or heroin or gambling, or some of the, I guess, the addictions that we would call life-destroying or destructive. It seems like, which it isn't, but it can almost seem like, oh yeah, it's kind of an addiction, but it's not one of them, the danger ones.

Speaker 3: There's two aspects to addiction which people often mix up. Because when you talk about, let's say, a substance addiction, you know, things like heroin and cocaine, et cetera, people talk about the symptoms of that and the badness associated with that, and what they're actually talking about is the side effects. So in order to produce the dopamine spike and create the addiction, those things come with side effects, because they're drugs, and a lot of those side effects are what people associate with addiction. But that isn't addiction.
Addiction is the dopamine spike that's produced in the first place, and the damage done by that is not from the side effects, or it's not the primary damage from the side effects. If you think about something like gambling, there's no real side effect from gambling in the same sense as substance abuse, but no one would fight with you over the notion that it's very definitely an addiction. What's going on there is that a dopamine spike is being produced by a behavior, which is gambling, and that dopamine spike is having the consequence that you become anxious and depressed, that you create chaos in your life by running out of money, and that in turn makes it even worse. So that's a pure form of what happens in addiction. And when people wave away things like Instagram or dating apps or gaming and say, well, that's not the same thing, they're mixing the two up. Yes, there are none of the instant side effects that come with substance consumption, but the effect on the brain is identical, which is why we're seeing anxiety rates in teenagers go through the roof.
It's why we're seeing depression go through the roof. It's why schizophrenia is at epidemic proportions. It's why suicide is going crazy. It's because these are the flow-on effects of addiction. So people waving away something because it doesn't have a substance abuse side effect doesn't make it any less an addiction.

Speaker 1: It's so funny, even with adults. And maybe for me it's almost like, if we had an impact scale of these things that went from zero to ten, and we go, well, ten is complete, it's hijacked your life and your brain and you really need help, and then right down to, almost like, disordered eating through to an eating disorder. So I think, you know, the level of impact kind of grows without us even realizing. But it still comes down to our awareness around it, our willingness to acknowledge it. But I was having a chat yesterday with a mate of mine who's fifty-five or something. He's got grandkids, and his son took the iPhone off his, well, I don't know, nine or ten year old for a week, and the kid lost his shit.
I mean, he happened to be there, and I don't know what the kid did, but the consequence of what he did was no phone for a week. Well, you would think, he said to me, it's like they fucking chopped his leg off.

Speaker 3: Oh, it's like any other addict when you take away the thing that they're addicted to. That's what's going on here. And people will often say to me when I talk like this, they'll say, yeah, but you know, I don't really have an addiction. Sure, I don't mind a bit of gaming, or a bit of gambling, or, you know, getting on the social media or whatever, but I'm not addicted. You know, there's no problems there. And often I'll say to them, well, sure, there's an easy way to prove that: just delete the apps. And they go, sure, easy, do it tomorrow. And they will, knowing in the back of their mind that they can reinstall them anytime they want. But if I say to them, well, just delete the accounts too, then they're not so easy to convince.
No one wants to delete the accounts, because that really is stopping. And they, in their own minds, are behaving the way that kid is behaving: they're rebelling against the notion of stopping this, and that's a sure sign of addiction. But even if you're not prepared to acknowledge that, there are other signs. The early metabolic signs of addiction are insomnia and focus difficulties, or what people are now currently describing as ADHD. The massive tidal wave of adult diagnoses of ADHD is in fact diagnosis of the functional effects of addiction, which is the shot focus that comes from being an addict. It's been measured: forty-seven seconds, I think, is our average ability to stay on task at the moment, and that has been a massive decline. And what's happening on a society-wide basis is we're fracturing everybody's focus. That's the first thing to go. Focus is the first thing to go; it's the first sign that there's a problem. Then you start noticing things like insomnia, you start noticing stress growing to anxiety, growing to depression, and so on.
But ask yourself: am I as focused as I remember being ten years ago?

Speaker 1: Yeah, yeah, yeah. Well, I mean, everyone who is addicted thinks they're not addicted. Or most people, if they do...

Speaker 3: Or if they do think they're addicted, or know they're addicted, or at least suspect it, most of them don't like the fact that they are, and will deny it.

Speaker 1: Yeah, exactly, exactly. But in all of my meetings and stuff, even if I just had my phone near me and it would chime, or there would be something, I wouldn't pick up my phone in the middle of the thing, but my attention would shift a minute to the fucking phone. And it's not like I went, oh, that dinged, I think I should... I just looked before I'd thought. Like, it's Pavlov's dog.

Speaker 3: You know, it's exactly like Pavlov's dog.

Speaker 1: It's a trained response.
Speaker 3: Yeah. And just, you know, if anyone's not across those experiments: what Pavlov did was get some dogs to salivate when they heard a bell ring, and the reason was he'd ring the bell and then give them something to eat. And then the experiment progressed on, and he'd ring the bell and not give them something to eat, and they'd salivate, because their brains had been trained: when that bell rings, I'm going to be rewarded. And that's what's going on with notifications. That's why there are notifications, by the way. It's not by accident that every app is sending you a million notifications a minute. It's that they know that this is part of the addictive mechanism, and it's a very important part of it. In fact, so important that there's a whole chapter in the book dedicated to notifications, and why they're an important part of addiction, and why they have to be turned off.

Speaker 1: Bro, you should be doing workshops on this in every fucking school in Australia.
No, you don't have the time or the energy. But I mean, it is so important. And I think everybody... like, even I'm thinking about it now, and what do I... and I know we're going to have another conversation, probably; we'll see how we go. We didn't intend to have one, but this is how we work, isn't it? The door's open and we've kind of... But I just think, and I know this is often discussed, but talking about real practical stuff: what's really happening physiologically, psychologically, and strategically from the companies, and then what's happening in your body and your nervous system and your brain chemistry. Just, you know, that "seek first to understand": try to understand what this is actually doing to you, not via some meme on social media, but actually try to understand a little bit about your own body and your own brain and the way that things work, and be able to recognize that. And the other thing too is, because, like, there's a diminishing return on the same dose.
So what used to fucking light you up doesn't light you up anymore. What it took five of to light you up, now you need ten. Three months later, you need two hundred to get the same response that you used to get from five. So you're training yourself up. Oh god, you know.

Speaker 3: And even that's pure biochemistry. We've got a switch in our brain that every time we give it a dose, it moves up a notch. It's a ratchet. So every time we ratchet it up one, it means we need a bigger dose next time. And that's by design. That's the way our brain works, it's the way our physiology works, and the companies know that and exploit that. Now, that has really, really bad effects on the way our brain works, but they don't care about that, as long as we remain addicted to their product. It's interesting: one of the things I suggest people do is introduce the idea of friction. Intentionally introduce friction. One of the things that an addictive anything tries to do is remove friction; it tries to make it easier.
And I give the example of, you know, if you're a gambler, and in order to gamble you had to get dressed, walk five miles to the casino and then gamble, you'd be much less likely to do it than if you could just pull your phone out of your pocket and have a bet right there and then. So that get dressed, leave the home, walk to the casino is friction. And everywhere you can introduce friction between you and the addictive behavior is...

Speaker 1: A good thing. And you may or may not have any insight into this specifically, but I'm sure you'll have an opinion: how is the under-sixteen ban working in Australia? Is that doing anything, or is that window dressing?

Speaker 3: I think it's important, but not for the reasons that it's often promoted as being important. People get lost in the weeds about this and say it won't work: you know, kids will get around this, they're smarter than we are, they'll figure it out. And they will, just the same as they have around the alcohol ban.
Did you know that you're not supposed to drink if you're under eighteen in Australia?

Speaker 1: I've heard of that.

Speaker 3: You wouldn't know it from observing the average under-eighteen, but it is the law. And you're not supposed to smoke if you're under eighteen either. The fact that it is the law does, however, introduce friction. It does, however, mean that a parent can say to a kid, no, you are not having a beer, because you are seventeen. It gives a parent the ability to fall back and say, I have the support of my society; the law is that this is not possible. Sorry, I don't know if you can hear the dog barking in the background.

Speaker 1: That's great.

Speaker 3: Saying, I have the backing of society when I tell you you're not doing this. So in that way it's a good idea, in that it allows parents and others to say, no, you are not having access to this thing, because it is against the law. Will kids still break the law? Yes, they will. Does it make it harder? Yes, it does.
347 00:21:25,640 --> 00:21:28,840 Speaker 1: There's a guy who comes into the cafe I 348 00:21:28,880 --> 00:21:32,000 Speaker 1: have a coffee at every morning. I'm going to give 349 00:21:32,000 --> 00:21:33,399 Speaker 1: him a shout out. He used to play footy in 350 00:21:33,400 --> 00:21:37,440 Speaker 1: the AFL for St Kilda. James Gwilt. And 351 00:21:37,600 --> 00:21:43,159 Speaker 1: he's got the gorgeousest little girl in the world, called Rosie, 352 00:21:43,200 --> 00:21:45,160 Speaker 1: and she's two and a half and she's got blonde, 353 00:21:45,240 --> 00:21:47,119 Speaker 1: curly hair, and she looks like something out of a 354 00:21:47,119 --> 00:21:50,240 Speaker 1: fucking nursery rhyme. And every time she comes in she 355 00:21:50,320 --> 00:21:53,120 Speaker 1: comes straight to me, because she sits on my lap 356 00:21:53,160 --> 00:21:56,239 Speaker 1: and watches Peppa Pig. That sounds creepy. I don't mean 357 00:21:56,280 --> 00:21:58,639 Speaker 1: it to. Her dad's there. It's all good, right? But 358 00:21:58,760 --> 00:22:04,560 Speaker 1: she goes from mayhem to complete focus. By 359 00:22:04,560 --> 00:22:07,000 Speaker 1: the way, we only watch two or three minutes of 360 00:22:07,040 --> 00:22:11,400 Speaker 1: Peppa Pig, but talk about... this tiny little bundle 361 00:22:11,440 --> 00:22:15,240 Speaker 1: of energy goes to this fully focused cyborg in a 362 00:22:15,280 --> 00:22:19,399 Speaker 1: matter of seconds, just watching Peppa Pig. And she only 363 00:22:19,440 --> 00:22:22,119 Speaker 1: comes to me... no wonder, she's not interested in me. 364 00:22:22,240 --> 00:22:26,480 Speaker 1: She's like, I'm the pathway to Peppa Pig. I'm the 365 00:22:26,640 --> 00:22:31,439 Speaker 1: conduit to the thing that she wants. A dealer. Yeah, that's it.
366 00:22:31,520 --> 00:22:33,640 Speaker 1: I'm just handing out a little bit of cartoon over 367 00:22:33,680 --> 00:22:36,280 Speaker 1: here in the corner. And I said to her dad, 368 00:22:36,320 --> 00:22:38,480 Speaker 1: how much of this does she get? Yeah, no, we don't 369 00:22:38,560 --> 00:22:40,960 Speaker 1: let her at home. So that's why she fucking loves you, 370 00:22:41,320 --> 00:22:44,720 Speaker 1: because she gets, whatever, her three-minute hit every second 371 00:22:44,800 --> 00:22:47,320 Speaker 1: day or whatever it is. But now I'm feeling like 372 00:22:47,440 --> 00:22:51,760 Speaker 1: I'm a dealer. Now I'm feeling like I'm enabling some 373 00:22:52,000 --> 00:22:55,199 Speaker 1: two-and-a-half-year-old into a lifetime of 374 00:22:55,240 --> 00:22:58,840 Speaker 1: bloody potential addiction. But it even at that age 375 00:23:00,119 --> 00:23:03,080 Speaker 1: can have a massive effect. It's not just fourteen, fifteen, sixteen 376 00:23:03,160 --> 00:23:06,280 Speaker 1: year olds, you know, it's toddlers. 377 00:23:06,640 --> 00:23:08,520 Speaker 3: Oh absolutely. I mean you only have to walk into 378 00:23:08,560 --> 00:23:10,960 Speaker 3: a supermarket to see that. I mean, you know, all 379 00:23:11,000 --> 00:23:14,280 Speaker 3: the kids in the prams are sitting there 380 00:23:14,320 --> 00:23:17,959 Speaker 3: stunned with an iPad in front of their faces. 381 00:23:18,240 --> 00:23:22,040 Speaker 3: And that sounds like I'm having a go, and 382 00:23:22,080 --> 00:23:27,400 Speaker 3: I'm not. That is a direct consequence of the business 383 00:23:27,400 --> 00:23:31,080 Speaker 3: model that is being used. The younger you can get 384 00:23:31,119 --> 00:23:33,760 Speaker 3: people on these things, the better as far as that 385 00:23:33,880 --> 00:23:40,360 Speaker 3: business model is concerned. And it's not a judgment about 386 00:23:40,600 --> 00:23:45,119 Speaker 3: what's going on.
It's just an observation, and I want 387 00:23:45,160 --> 00:23:48,800 Speaker 3: people to stand back from it and say, wow, I'm 388 00:23:48,880 --> 00:23:52,440 Speaker 3: not sure I want that to happen. Would I happily, 389 00:23:54,240 --> 00:23:56,399 Speaker 3: you know, give these kids a shot of vodka to 390 00:23:56,440 --> 00:23:58,720 Speaker 3: calm them down? So am I happy to do it with 391 00:23:58,840 --> 00:24:04,080 Speaker 3: digital vodka? Hmm. I saw you laughing there. She's thinking, yeah, 392 00:24:04,080 --> 00:24:05,440 Speaker 3: I would actually, yeah. 393 00:24:05,720 --> 00:24:07,080 Speaker 2: I like the term digital vodka. 394 00:24:07,640 --> 00:24:10,240 Speaker 1: No, that's good. I'm going to write that down. 395 00:24:10,280 --> 00:24:16,080 Speaker 1: That could be today's podcast title: Digital Vodka. So, I mean, 396 00:24:16,119 --> 00:24:20,879 Speaker 1: I know you're not Nostradamus, but you're pretty Nostradamus-y. So 397 00:24:21,760 --> 00:24:24,639 Speaker 1: obviously you're trying to make a dent, which is great. 398 00:24:25,400 --> 00:24:27,840 Speaker 1: And we've got more awareness, which is great. I think 399 00:24:27,840 --> 00:24:30,800 Speaker 1: we've got a few more rules and regs or laws 400 00:24:30,840 --> 00:24:35,280 Speaker 1: in place, which is good. But just humor me, 401 00:24:35,400 --> 00:24:39,119 Speaker 1: go, like, left unchecked, like we just let this be 402 00:24:39,240 --> 00:24:42,639 Speaker 1: a fucking free-for-all with no awareness or no 403 00:24:42,840 --> 00:24:45,720 Speaker 1: kind of speed bumps put in place for these people 404 00:24:45,800 --> 00:24:52,399 Speaker 1: who are addicted, or if not addicted, significantly influenced 405 00:24:52,400 --> 00:24:55,080 Speaker 1: in their behavior, like, what's the end of the line? 406 00:24:55,280 --> 00:24:56,840 Speaker 1: Just a total...
407 00:24:56,600 --> 00:24:59,880 Speaker 3: You just have to look at a graph 408 00:25:00,119 --> 00:25:04,760 Speaker 3: of the rates of anxiety, depression and ADHD diagnoses in Australia, 409 00:25:05,920 --> 00:25:08,320 Speaker 3: and these things are off the charts, right? These are 410 00:25:08,840 --> 00:25:11,919 Speaker 3: growing by multiples of four or five times in just 411 00:25:12,000 --> 00:25:15,960 Speaker 3: the last five years. So these things are in steep 412 00:25:16,080 --> 00:25:20,800 Speaker 3: upward climbs, and you know, we see it occasionally surface 413 00:25:21,040 --> 00:25:25,840 Speaker 3: in crime statistics, you know, and people sort of vaguely say, oh, 414 00:25:26,160 --> 00:25:28,160 Speaker 3: you know, assaults are up because of mental health. 415 00:25:28,240 --> 00:25:30,639 Speaker 3: What they mean there is that there's 416 00:25:30,800 --> 00:25:35,400 Speaker 3: an epidemic of schizophrenia in the community. Now schizophrenia, 417 00:25:35,440 --> 00:25:38,199 Speaker 3: for those of your listeners who aren't quite sure what 418 00:25:38,320 --> 00:25:44,200 Speaker 3: that is, that's people having hallucinations, believing that people 419 00:25:44,280 --> 00:25:47,119 Speaker 3: are out to get them, and acting on those beliefs. 420 00:25:47,920 --> 00:25:53,119 Speaker 3: So it's an extreme form of anxiety about the intentions 421 00:25:53,119 --> 00:25:55,720 Speaker 3: of others, or an extreme form of paranoia about the 422 00:25:55,720 --> 00:25:59,040 Speaker 3: intentions of others. And once again there's a tendency to 423 00:25:59,040 --> 00:26:02,159 Speaker 3: be judgmental about that and think that these are all 424 00:26:02,200 --> 00:26:06,119 Speaker 3: poor, weak people. But it's not the case. This is 425 00:26:06,160 --> 00:26:10,240 Speaker 3: an inevitable consequence of doing this to the human brain.
426 00:26:10,760 --> 00:26:16,040 Speaker 3: You keep pushing that dopamine requirement higher and higher and higher, 427 00:26:16,080 --> 00:26:20,639 Speaker 3: and you eventually get into the territory of schizophrenia. You 428 00:26:20,720 --> 00:26:23,080 Speaker 3: don't want to live in a society where there are 429 00:26:23,280 --> 00:26:29,840 Speaker 3: a lot of people who are suffering hallucinations and suffering 430 00:26:29,880 --> 00:26:33,680 Speaker 3: extreme paranoia and are prepared to act on the belief 431 00:26:34,080 --> 00:26:39,280 Speaker 3: that you are there to harm them. And you know, 432 00:26:40,200 --> 00:26:42,800 Speaker 3: a more obvious and earlier sign of this is the 433 00:26:43,000 --> 00:26:47,400 Speaker 3: massive increase in ADHD diagnoses. It's what I'd call functional ADHD, 434 00:26:47,480 --> 00:26:55,520 Speaker 3: which is: the symptoms look the same as traditional ADHD, 435 00:26:55,840 --> 00:26:59,240 Speaker 3: the inability to focus at all on anything at any 436 00:26:59,280 --> 00:27:05,840 Speaker 3: time without medication. And that is on an absolute tear. 437 00:27:06,840 --> 00:27:10,760 Speaker 3: You know, we already have teachers saying they have classrooms 438 00:27:10,760 --> 00:27:13,560 Speaker 3: where, you know, twenty years ago they might 439 00:27:13,600 --> 00:27:17,960 Speaker 3: have had one kid, if that, suffering from something like that, 440 00:27:18,840 --> 00:27:21,080 Speaker 3: but now, you know, five, six, seven, eight of the 441 00:27:21,160 --> 00:27:24,959 Speaker 3: kids in your average twenty- or twenty-two-kid class 442 00:27:25,040 --> 00:27:27,919 Speaker 3: have those problems and/or need to be medicated to 443 00:27:28,040 --> 00:27:30,320 Speaker 3: control them. 444 00:27:31,040 --> 00:27:34,000 Speaker 1: Teaching has got to be the hardest job.
Like, at 445 00:27:34,000 --> 00:27:36,080 Speaker 1: the moment, I think, and there's a lot of hard jobs, 446 00:27:36,080 --> 00:27:37,920 Speaker 1: but it's got to be in the top four. Nursing's 447 00:27:38,000 --> 00:27:40,960 Speaker 1: up there, and a lot of, you know, people who 448 00:27:40,960 --> 00:27:41,800 Speaker 1: work in that space. 449 00:27:41,840 --> 00:27:44,679 Speaker 3: They're down the front, they're the front line. Both of 450 00:27:44,720 --> 00:27:48,720 Speaker 3: those professions are the front line. They're seeing this day 451 00:27:48,760 --> 00:27:52,640 Speaker 3: by day. They're seeing the increase in mental illness, they're 452 00:27:52,680 --> 00:27:55,320 Speaker 3: seeing the increase in ADHD in kids. 453 00:27:56,920 --> 00:27:59,680 Speaker 1: Yeah, yeah. This is going to sound like a weird question, 454 00:27:59,720 --> 00:28:03,480 Speaker 1: not a normal question I would ask you; it's more philosophical. 455 00:28:03,560 --> 00:28:09,400 Speaker 1: But what drives you to write a book like this? 456 00:28:09,680 --> 00:28:12,680 Speaker 1: I know it's not for money, because you make about 457 00:28:12,720 --> 00:28:15,320 Speaker 1: thirteen cents every time you write a book. We 458 00:28:15,440 --> 00:28:19,119 Speaker 1: know how profitable books are in Australia. I also know... 459 00:28:20,760 --> 00:28:22,720 Speaker 1: what is it? Is it because you just see that there's a 460 00:28:22,760 --> 00:28:25,359 Speaker 1: need and you want to highlight that and potentially talk 461 00:28:25,359 --> 00:28:29,639 Speaker 1: about some, you know, strategies or solutions? Or is it 462 00:28:29,680 --> 00:28:33,119 Speaker 1: because you have a personal interest, or this has 463 00:28:33,200 --> 00:28:37,160 Speaker 1: been... something's happened to you or around you that's triggered this? 464 00:28:37,320 --> 00:28:38,760 Speaker 1: Like, where does that come from?
465 00:28:39,040 --> 00:28:42,360 Speaker 3: All of the above. So it is 466 00:28:42,480 --> 00:28:46,840 Speaker 3: a combination of... I'm just naturally a person who 467 00:28:46,880 --> 00:28:51,960 Speaker 3: needs to understand how things work, and I rarely buy 468 00:28:52,840 --> 00:28:54,880 Speaker 3: "this is a complex problem that we don't know how 469 00:28:55,360 --> 00:28:57,760 Speaker 3: it works." I think the science is good enough now 470 00:28:57,760 --> 00:29:00,560 Speaker 3: that we know how most things work, if we're prepared 471 00:29:00,600 --> 00:29:03,240 Speaker 3: to have an open mind in looking at what the 472 00:29:03,320 --> 00:29:07,240 Speaker 3: science is actually telling us. There's a lot of people 473 00:29:07,240 --> 00:29:11,920 Speaker 3: out there in your future profession who still favor the 474 00:29:11,960 --> 00:29:18,280 Speaker 3: psychobabble and single-narrative explanations for all of this. But 475 00:29:18,400 --> 00:29:22,600 Speaker 3: it isn't that. The narrative explanations are useless. This is neurochemistry 476 00:29:22,640 --> 00:29:26,640 Speaker 3: we're talking about, and if you push the right buttons, the 477 00:29:26,680 --> 00:29:29,520 Speaker 3: same things will happen every single time in every single human. 478 00:29:30,000 --> 00:29:33,000 Speaker 3: And we know enough now. The science is good enough 479 00:29:33,000 --> 00:29:35,760 Speaker 3: for us to be able to put those blocks together 480 00:29:35,840 --> 00:29:38,400 Speaker 3: and say: this is what is going on. I want 481 00:29:38,480 --> 00:29:43,080 Speaker 3: it to stop. I want the society that we're living 482 00:29:43,120 --> 00:29:46,200 Speaker 3: in to return to a state where not every second 483 00:29:46,240 --> 00:29:50,640 Speaker 3: person is potentially suffering from mental illness.
I want that 484 00:29:50,760 --> 00:29:51,960 Speaker 3: to stop, and I want it to go on to 485 00:29:52,040 --> 00:29:54,440 Speaker 3: reverse, because I think we're a better society if we're 486 00:29:54,480 --> 00:30:00,080 Speaker 3: not all sick. And at a personal level, yes, 487 00:30:00,240 --> 00:30:03,640 Speaker 3: I have seen things like this, both in myself and 488 00:30:03,680 --> 00:30:09,479 Speaker 3: in others, that I think, if you apply the 489 00:30:09,520 --> 00:30:14,160 Speaker 3: things that I set out, can be corrected. And I'm 490 00:30:14,200 --> 00:30:16,680 Speaker 3: not suggesting it's easy, but I think it's a lot 491 00:30:16,720 --> 00:30:17,920 Speaker 3: easier if you've got a plan. 492 00:30:20,360 --> 00:30:23,160 Speaker 1: Tiff, to you, on the one to ten, ten being 493 00:30:23,400 --> 00:30:27,400 Speaker 1: it's hijacked your life and brain, do you think you 494 00:30:27,480 --> 00:30:31,760 Speaker 1: have an issue? Not an addiction necessarily, but are you 495 00:30:31,960 --> 00:30:34,200 Speaker 1: challenged in this space? Do you feel like you need 496 00:30:34,240 --> 00:30:37,360 Speaker 1: to be on less or use it less? That's 497 00:30:37,400 --> 00:30:39,400 Speaker 1: not a loaded question, that's curiosity. 498 00:30:39,720 --> 00:30:42,960 Speaker 2: Yeah, yeah, definitely. 499 00:30:42,880 --> 00:30:45,320 Speaker 1: Like, what would be the thing that you would want to reduce? 500 00:30:45,000 --> 00:30:49,920 Speaker 2: Or if you're... definitely, that's... 501 00:30:51,560 --> 00:30:52,240 Speaker 1: What did he say? 502 00:30:52,760 --> 00:30:53,320 Speaker 3: Less porn? 503 00:30:53,840 --> 00:30:56,480 Speaker 2: Yeah, David Gillespie, like, comment. 504 00:30:56,560 --> 00:31:00,840 Speaker 1: That is... that's him getting edgy, that's him just, 505 00:31:01,000 --> 00:31:07,840 Speaker 1: like, letting his hair down, an opportunity for a joke, about time.
506 00:31:08,040 --> 00:31:10,280 Speaker 1: Fucking welcome to the club. Do it, will you? 507 00:31:11,080 --> 00:31:15,280 Speaker 3: Sorry for interrupting. Please, tell us. 508 00:31:15,240 --> 00:31:16,920 Speaker 1: I am aware of, Lou. 509 00:31:17,040 --> 00:31:21,600 Speaker 2: I notice myself pick up my phone and open 510 00:31:21,640 --> 00:31:24,240 Speaker 2: apps mindlessly, without realizing it's happening. Like, I do that. I 511 00:31:24,320 --> 00:31:27,680 Speaker 2: do that a lot. And yeah, so I think I 512 00:31:27,840 --> 00:31:30,200 Speaker 2: use social media and things way too much. 513 00:31:30,800 --> 00:31:34,160 Speaker 1: Mmm, I'm a little bit the same. I'm not as 514 00:31:34,160 --> 00:31:36,040 Speaker 1: bad as I was, but I'm still not there, so 515 00:31:37,200 --> 00:31:37,719 Speaker 1: that's one of them. 516 00:31:37,880 --> 00:31:42,239 Speaker 2: Yeah, the apps. And when I'm going to 517 00:31:42,480 --> 00:31:48,360 Speaker 2: do something and forget, and then open three social media 518 00:31:48,360 --> 00:31:49,840 Speaker 2: apps, and then go, oh, why did I pick up 519 00:31:49,840 --> 00:31:50,480 Speaker 2: my phone? 520 00:31:50,200 --> 00:31:51,840 Speaker 1: Or, why have I opened that app? 521 00:31:52,160 --> 00:31:54,400 Speaker 2: Because the finger just goes straight to that app, you know. 522 00:31:54,480 --> 00:31:54,960 Speaker 2: It's silly. 523 00:31:56,840 --> 00:32:00,240 Speaker 1: Yeah, I have this default where if I'm just 524 00:32:00,280 --> 00:32:03,080 Speaker 1: doing nothing, I'll go, I wonder... let's see who liked 525 00:32:03,120 --> 00:32:06,320 Speaker 1: that post I put up. Like, while I'm not thinking, 526 00:32:06,320 --> 00:32:09,120 Speaker 1: oh geez, I'll ring Mum, I'll go, I wonder how 527 00:32:09,160 --> 00:32:12,720 Speaker 1: that post is traveling. Do you know what 528 00:32:12,760 --> 00:32:13,120 Speaker 1: I mean?
529 00:32:13,560 --> 00:32:15,640 Speaker 3: But what your brain is saying is: just give me 530 00:32:15,640 --> 00:32:18,920 Speaker 3: another dopamine hit. Because what your brain wants when you 531 00:32:19,000 --> 00:32:21,320 Speaker 3: do that and check how the post is going is 532 00:32:21,360 --> 00:32:23,720 Speaker 3: to see the number of likes go up. 533 00:32:24,200 --> 00:32:28,000 Speaker 3: Because every single like is an oxytocin hit, 534 00:32:28,040 --> 00:32:32,840 Speaker 3: which produces a dopamine hit. And that's what's going on 535 00:32:32,880 --> 00:32:34,560 Speaker 3: in your brain. And you might say, oh, it's harmless, 536 00:32:34,600 --> 00:32:38,120 Speaker 3: I'm just looking at stuff. Well, sure, but it's rewiring 537 00:32:38,160 --> 00:32:40,800 Speaker 3: your brain just as effectively as anything else that produces 538 00:32:40,840 --> 00:32:45,760 Speaker 3: a dopamine hit. And then there's also the time cost. 539 00:32:46,760 --> 00:32:49,720 Speaker 3: I don't know if you've noticed, but you only get 540 00:32:49,760 --> 00:32:54,320 Speaker 3: one life, and the minute that you spent doing that 541 00:32:54,960 --> 00:32:56,400 Speaker 3: is a minute you never get back. 542 00:32:57,080 --> 00:33:01,560 Speaker 1: Mmm, yeah, it is. It's like a cost-benefit analysis. 543 00:33:01,600 --> 00:33:05,600 Speaker 1: What is this? What am I investing? Time, energy, money, 544 00:33:06,200 --> 00:33:11,200 Speaker 1: maybe focus, attention. What am I getting back? A fucking addiction? 545 00:33:11,720 --> 00:33:13,560 Speaker 3: You're also getting some ads.
546 00:33:14,400 --> 00:33:16,760 Speaker 1: Yeah, well, there's not a great ROI if we're talking 547 00:33:16,800 --> 00:33:22,000 Speaker 1: about digital investment in terms of our attention. And 548 00:33:22,080 --> 00:33:25,240 Speaker 1: I know there's no three-step plan, but surely... like, 549 00:33:25,320 --> 00:33:27,680 Speaker 1: as you're talking, I'm like, yeah, that's not me, 550 00:33:27,920 --> 00:33:31,240 Speaker 1: that is me, that's a bit me. Tiff is probably 551 00:33:31,280 --> 00:33:35,120 Speaker 1: the same. I know there's no single entry point or 552 00:33:35,360 --> 00:33:36,880 Speaker 1: stepping-off point, but if it... 553 00:33:37,720 --> 00:33:42,440 Speaker 3: This is not meant to be like Alcoholics Anonymous 554 00:33:42,520 --> 00:33:45,920 Speaker 3: or something. This is meant to be for everyone 555 00:33:46,000 --> 00:33:49,160 Speaker 3: who sees a little bit of themselves in all 556 00:33:49,200 --> 00:33:51,080 Speaker 3: of the stories that I put in this book, who 557 00:33:51,120 --> 00:33:54,960 Speaker 3: sees some of the things there that they are doing too. 558 00:33:55,200 --> 00:33:59,480 Speaker 3: And the plans I put forward for what to do 559 00:33:59,560 --> 00:34:02,840 Speaker 3: about it are relatively easy to implement, and they're not 560 00:34:03,160 --> 00:34:06,720 Speaker 3: things like "throw your phone in the bin." They are 561 00:34:06,760 --> 00:34:10,799 Speaker 3: things like: introduce friction by doing this instead of that; 562 00:34:12,239 --> 00:34:15,000 Speaker 3: you know, introduce delay by doing it this way rather 563 00:34:15,080 --> 00:34:18,759 Speaker 3: than that way. And I walk people through that in 564 00:34:18,800 --> 00:34:23,120 Speaker 3: this book and then describe why it matters and what 565 00:34:23,160 --> 00:34:26,840 Speaker 3: effects doing that will have on your brain biochemistry. 566 00:34:27,600 --> 00:34:30,160 Speaker 1: Do you touch on...
And we'll wind up now, because 567 00:34:30,880 --> 00:34:33,000 Speaker 1: I know you've got to get back to your computer game. 568 00:34:33,120 --> 00:34:40,120 Speaker 1: Do you touch on... do you touch on AI? I 569 00:34:40,320 --> 00:34:43,919 Speaker 1: feel like, you know, so many of my friends don't 570 00:34:43,920 --> 00:34:46,560 Speaker 1: actually write whatever it is they're writing. They don't actually 571 00:34:46,560 --> 00:34:48,400 Speaker 1: write it anymore. They just put in a prompt and 572 00:34:48,440 --> 00:34:51,279 Speaker 1: AI writes it for them and that becomes the letter. 573 00:34:53,040 --> 00:34:55,840 Speaker 3: It's a slightly different thing, in that AI is not addictive, 574 00:34:56,200 --> 00:34:58,799 Speaker 3: but it is something else and it's related. And I 575 00:34:58,840 --> 00:35:00,719 Speaker 3: was actually having this thought today, and I thought we 576 00:35:00,800 --> 00:35:02,799 Speaker 3: might talk about this today, but we've got a little 577 00:35:02,840 --> 00:35:06,879 Speaker 3: bit sidetracked. I was thinking, if AI is doing 578 00:35:06,920 --> 00:35:11,040 Speaker 3: our thinking for us, and increasingly it is, isn't this 579 00:35:11,320 --> 00:35:15,839 Speaker 3: just like the Industrial Revolution, when machines started doing our 580 00:35:15,920 --> 00:35:19,840 Speaker 3: lifting for us? So the idea of there being a 581 00:35:19,920 --> 00:35:24,120 Speaker 3: gym, or of doing a workout, in the early eighteen 582 00:35:24,160 --> 00:35:28,839 Speaker 3: hundreds would have been frankly laughable, because people were just 583 00:35:28,960 --> 00:35:35,200 Speaker 3: doing that every day anyway. It was the job.
And 584 00:35:35,280 --> 00:35:39,799 Speaker 3: for that to have become something that had to be created 585 00:35:40,920 --> 00:35:45,080 Speaker 3: is really, really interesting. You know, you probably 586 00:35:45,120 --> 00:35:48,839 Speaker 3: know about Eugen Sandow, sort of the father of 587 00:35:49,280 --> 00:35:50,160 Speaker 3: modern bodybuilding. 588 00:35:50,440 --> 00:35:53,920 Speaker 1: Yeah, and the actual Mr. Olympia, which is the biggest 589 00:35:53,960 --> 00:35:58,400 Speaker 1: comp in the world. The trophy is called the Sandow. Yeah. 590 00:35:58,680 --> 00:36:01,120 Speaker 3: And I mean he was, I guess, the original 591 00:36:01,480 --> 00:36:05,239 Speaker 3: fitness influencer. If he had an Instagram account, it would 592 00:36:05,280 --> 00:36:09,720 Speaker 3: have been mad. But he was a product of something 593 00:36:09,800 --> 00:36:14,960 Speaker 3: called the physical culture movement, which really spread throughout Europe 594 00:36:15,000 --> 00:36:19,160 Speaker 3: and the United States as a result of the Industrial Revolution, 595 00:36:19,600 --> 00:36:23,160 Speaker 3: and it was based in this notion of: you are 596 00:36:23,280 --> 00:36:26,080 Speaker 3: no longer using your body for what it was designed 597 00:36:26,120 --> 00:36:29,080 Speaker 3: to do, what it had been doing for the rest 598 00:36:29,160 --> 00:36:31,680 Speaker 3: of time up until now, now that we've got all these 599 00:36:32,000 --> 00:36:34,080 Speaker 3: machines that do it for us. 600 00:36:34,520 --> 00:36:37,400 Speaker 3: So you need to keep training it, you need to 601 00:36:37,520 --> 00:36:42,080 Speaker 3: keep using it, or it will disappear.
And I wonder, 602 00:36:42,800 --> 00:36:47,000 Speaker 3: is there something like that necessary with AI? Because what 603 00:36:47,120 --> 00:36:49,840 Speaker 3: a lot of people are doing now with AI is 604 00:36:49,960 --> 00:36:54,120 Speaker 3: using it to do their thinking. And if we use 605 00:36:54,239 --> 00:36:58,320 Speaker 3: a machine to do our thinking, do we need something 606 00:36:58,520 --> 00:37:02,520 Speaker 3: like the physical culture movement, where we actually design programs 607 00:37:02,800 --> 00:37:05,799 Speaker 3: where people can do a mental workout every day? 608 00:37:05,840 --> 00:37:08,320 Speaker 3: And I'm not talking a sudoku or something like that. 609 00:37:08,320 --> 00:37:11,920 Speaker 3: That's the equivalent of lifting a tin of beans and 610 00:37:11,960 --> 00:37:15,759 Speaker 3: calling it a workout. I am saying programs that are 611 00:37:15,800 --> 00:37:21,360 Speaker 3: actually designed to make sure your brain still functions. 612 00:37:21,880 --> 00:37:27,480 Speaker 1: Yeah. Yeah, it's funny when you think about Eugen Sandow onwards, right? 613 00:37:27,840 --> 00:37:30,880 Speaker 1: Imagine, you know, for however long, it depends, they keep changing 614 00:37:30,920 --> 00:37:34,239 Speaker 1: the timeline.
But three hundred thousand years where men and 615 00:37:34,360 --> 00:37:38,200 Speaker 1: women were active, most men and women active all the time, 616 00:37:38,280 --> 00:37:43,480 Speaker 1: lifting shit, like, fixing shit or whatever, just moving their 617 00:37:43,520 --> 00:37:47,520 Speaker 1: body a lot, expending physical energy a lot, naturally functional 618 00:37:47,560 --> 00:37:51,319 Speaker 1: and strong. And them projecting forward going, oh, in two 619 00:37:51,400 --> 00:37:53,640 Speaker 1: hundred years, nobody will do any of this, but they'll 620 00:37:53,640 --> 00:37:57,279 Speaker 1: go to this room and then they'll pretend to... yeah, 621 00:37:57,280 --> 00:37:59,600 Speaker 1: they'll pick something up and put it down for an hour, 622 00:38:00,120 --> 00:38:02,919 Speaker 1: and then they'll go back and sit in their lounge chair. 623 00:38:03,040 --> 00:38:07,120 Speaker 1: It's... yeah, well, there was no need really, no, 624 00:38:07,160 --> 00:38:07,840 Speaker 1: there was no... 625 00:38:07,680 --> 00:38:11,960 Speaker 3: Need, recently. And to me, I'm wondering... there was 626 00:38:12,000 --> 00:38:18,640 Speaker 3: no need for similar things for brains. And I 627 00:38:18,680 --> 00:38:21,520 Speaker 3: think we're going to need it, and I think there's 628 00:38:21,560 --> 00:38:25,600 Speaker 3: a new Eugen Sandow, maybe it's you, a new Eugen 629 00:38:25,600 --> 00:38:29,399 Speaker 3: Sandow waiting in the wings to do the same thing 630 00:38:29,920 --> 00:38:32,480 Speaker 3: for brains, because of what AI is going to do 631 00:38:32,520 --> 00:38:34,000 Speaker 3: to us, or is doing to us. 632 00:38:34,360 --> 00:38:37,720 Speaker 1: Yeah, I do actually talk about the importance of training 633 00:38:37,719 --> 00:38:43,239 Speaker 1: our brain, like the importance of maintaining, if not optimizing, 634 00:38:43,719 --> 00:38:48,719 Speaker 1: cognitive function, and solving problems and being creative and doing all...
635 00:38:49,480 --> 00:38:54,120 Speaker 3: It's not just that, it's the physical brain interaction. It's 636 00:38:54,600 --> 00:38:58,760 Speaker 3: highly kinetic sports and so on that have high rates 637 00:38:59,200 --> 00:39:03,920 Speaker 3: of thinking and action associated. So, you know, not surprisingly, 638 00:39:03,920 --> 00:39:06,840 Speaker 3: I would use handball, given my association with handball. 639 00:39:06,880 --> 00:39:10,560 Speaker 3: That's an extremely high-speed, high-cardio sport 640 00:39:10,719 --> 00:39:13,880 Speaker 3: where things are changing second by second. It's the interaction 641 00:39:13,960 --> 00:39:17,120 Speaker 3: of the brain and the body in that motion that 642 00:39:17,239 --> 00:39:19,920 Speaker 3: I think will become more and more important. 643 00:39:20,840 --> 00:39:24,600 Speaker 1: Yeah, there's another one, which I'd mention just because I 644 00:39:24,640 --> 00:39:27,000 Speaker 1: have been a tiny bit involved in it, but I've trained 645 00:39:27,080 --> 00:39:30,480 Speaker 1: quite a lot of athletes who do Brazilian jiu-jitsu, 646 00:39:30,760 --> 00:39:34,960 Speaker 1: Tiff knows one of them, and, like, it's such a 647 00:39:35,160 --> 00:39:38,480 Speaker 1: cerebral sport when they're on the mat, which, if you 648 00:39:38,560 --> 00:39:41,040 Speaker 1: haven't seen it, you wouldn't think that. You'd think it's 649 00:39:41,160 --> 00:39:44,759 Speaker 1: just two boofheads squeezing each other and strangling 650 00:39:44,800 --> 00:39:45,279 Speaker 1: each other. 651 00:39:45,920 --> 00:39:50,880 Speaker 3: All highly kinetic sports: fencing, jiu-jitsu, all of the 652 00:39:51,000 --> 00:39:56,040 Speaker 3: martial arts, where your brain has to be functioning at 653 00:39:56,120 --> 00:39:59,800 Speaker 3: light speed and incredibly focused, or you are going to 654 00:39:59,800 --> 00:40:00,839 Speaker 3: get messed up.
655 00:40:01,960 --> 00:40:05,320 Speaker 1: Yeah, yeah, yeah. Oh, I'm excited. What's the ETA on 656 00:40:05,360 --> 00:40:05,959 Speaker 1: the book again? 657 00:40:06,320 --> 00:40:10,040 Speaker 3: As I said, twenty-eighth of April, just after transit. 658 00:40:10,760 --> 00:40:13,399 Speaker 1: All right, well, we'll do a promo when it comes out. 659 00:40:13,440 --> 00:40:17,840 Speaker 1: Yeah, that was good, mate, that was so interesting. 660 00:40:18,400 --> 00:40:23,399 Speaker 1: But just also, a shout out, everyone: David's Substack page. 661 00:40:23,440 --> 00:40:25,719 Speaker 1: You wrote an article today, which I actually read in 662 00:40:25,760 --> 00:40:30,080 Speaker 1: preparation for this, which, God, I know... I've actually 663 00:40:30,080 --> 00:40:34,040 Speaker 1: got it up under your faces, well, I can't see 664 00:40:34,040 --> 00:40:36,000 Speaker 1: your face. You're always just a black box, like a 665 00:40:36,040 --> 00:40:41,040 Speaker 1: fucking mystery thing on a plane. But anyway... yeah, yeah, 666 00:40:41,080 --> 00:40:46,280 Speaker 1: that's right. The article is called The Purchased Health Halo. 667 00:40:46,920 --> 00:40:49,719 Speaker 1: We might talk about it next time, but it's on Substack. 668 00:40:49,920 --> 00:40:52,360 Speaker 1: Just search David Gillespie Substack, it will come up on 669 00:40:53,200 --> 00:40:56,279 Speaker 1: Google or whatever. It's fucking great. So have a read 670 00:40:56,320 --> 00:40:58,919 Speaker 1: of that and maybe we'll explore it next time. But, mate, 671 00:40:58,920 --> 00:41:02,720 Speaker 1: appreciate you as always and thanks for the chat. And Tiff, 672 00:41:02,760 --> 00:41:06,080 Speaker 1: thanks for hanging out. Thanks, lads. Thanks, Jo.