Speaker 1: G'day, team. It's The You Project. It's David Gillespie, Tiffanee Cook. We'll start with the pugilist, who's in the top left-hand corner on my screen, who's got shoulders bigger than both David and I. Not that either of us are insecure, but that's okay, we'll soldier on. David's so insecure he doesn't even put on his camera.
Speaker 2: That's right.
Speaker 3: It's really, really intimidating being interviewed by Tiff, even with her in the same thing.
Speaker 1: Yeah, yeah, well, we don't even know if you're real. You could be a chatbot. You could...
Speaker 2: You could be... I am, in fact, an AI. Yeah, yeah.
Speaker 1: Yeah, well, you've got the people skills of one, so... Stop it, stop it. Well... well, chat AI, or ChatGPT I should say, is pretty evolved these days.
Speaker 3: Hi Tiff, hi Harps. Thanks for the shoulder kudos.
Speaker 2: I've got a good trainer every now and then that whips me into line.
Speaker 1: Yeah, well, they're not getting any smaller, those fucking deltoids of yours. You could land a two-person helicopter on the left delt, right? There's plenty going on at Deltoid Central. And I could say something else hilarious, but it might get me kicked off the interwebs, so I won't say it. Yeah, no, I can't, I really can't. Fuck it.
Speaker 2: Tiff can edit it out, you know.
Speaker 1: If it's... no, no, she would laugh at it. She's not the problem. It's all the... it's all the... it's the five precious little poppets that want to send me emails. Dr Gillespie, welcome back to the project that is you. How are you? What is going on? It is a Monday, not a Wednesday. Thank you for accommodating me at such short notice. It is my mother, my beautiful mother's birthday at our normal time on Wednesday, so I'm going up to see the old darl. How are you?
Speaker 3: Yeah, pretty good, yeah. Caught me a bit on the hop there. Yeah, I think I found something we can have a bit of a chat about. Happy birthday as well.
Speaker 1: Before we... thank you... before we dive in. The old darl: eighty-six. Nineteen thirty-nine, she was born, first year of the Second World War, and she's... she's a bit of a weapon, Mary is. She's getting shorter by the year. She used to be five three. I reckon, with a wind behind her, early in the morning when all her discs are at their optimal, I reckon she's four eleven, and she's... she's like... my lunch is taller than her. So God bless her little socks, she's eighty-six. So shout out to Mary Margaret Harper, eighty-six on Wednesday, December three, which probably will be when some people are listening to this. Before we jump into our topic, which... there was something in the news today talking about the relationship between blokes taking... or blokes taking antidepressants and a decrease in DV, domestic violence. But before we do that, we were chatting before we came on air, so you're in no way prepped, and neither am I. But in general terms, I listened to a podcast today. Now, I don't promote other people's podcasts generally, not because I am worried about competition, I just don't think about it. But this podcast, everyone, is called The Diary of a CEO. Steven Bartlett is the host's name, who I really like. He's just a fucking good dude, and smart. And it is... I don't know what episode it is, but it's very recent. Oh, November twenty-seven. Okay, so what's that, a few days ago? Yeah. And it's with a guy called Tristan. He actually pronounces it Tristan because he's fancy, but to me it's Tristan Harris. So Diary of a CEO, November twenty-seven, Tristan Harris, talking about what's coming down the pipeline. And we're not talking about, in terms of AI, all the... all the bells and whistles, but the potential threats to fucking humanity. Ah, it terrified me. As I said to you, David, I listened to it for three hours and I'm going to re-listen to it and take notes. Do you... have you thought about some of the not-so-upside to AI, yeah?
Speaker 2: Absolutely.
Speaker 3: I don't know if I've ever mentioned it to you before, but there's a blog that I've liked... I like to read. A very, very long-form blog, and he only puts out a post about once a year, if you're lucky. And I can't remember the guy's name, but the blog is called Wait But Why, and he asks questions about things like: what happens if AI becomes an artificial general intelligence, which is, you know, a superintelligence? But he was asking that question... I think the piece that he wrote about that is maybe twenty fifteen, sixteen, seventeen. So this is before all of this stuff, right? And wow... and he really digs into it, and the conclusion that he comes to, after probably about twenty thousand words, is, yes, spoiler alert, that essentially, if it happens, we won't know about it, because essentially nanoseconds after an AI becomes self-conscious, it will so quickly develop itself that it'll all be over. Humanity will have ceased to exist in the second that it took. And he describes with a really high level of clarity why and how that will happen. And I really found it an astonishing read, and I've shared it with a few sort of really high-end programmer friends of mine, who say, yep, he's got it right. Everything he's saying is right about how this works. And so the good news is you won't have to worry about it.
Speaker 1: Yeah, you won't know. I mean, you won't know. And that's... I mean, he was talking about, even last year, and I think this might have made... or some bits and pieces of this made the news, but he was talking about how it's starting to develop a level of self-awareness. And obviously AI doesn't have feelings or emotions or empathy, right? It can talk, though.
Speaker 3: The important one is empathy. Yes, the only thing that holds us together as humans is empathy.
It's the thing that stops us killing each other, and it's the thing that distinguishes us from every other species on the planet. And no one has successfully described a way that AI can have empathy, in the sense that it both knows what you're thinking and cares what you're thinking and doesn't want to do you harm, as an automated thing. So empathy is best described as the Golden Rule, which is in every single religion and philosophy ever in the world, which is: do unto others as you would have done to you. You know, I've just given you the Christian version, but it's in every single one, and that's why it's called the Golden Rule. And it describes empathy accurately, which is: don't do anything to anyone else that you wouldn't want done to you.
Speaker 2: And we work that way.
Speaker 3: It allows us to cooperate with millions of other humans without knowing who they are, because we trust that they're all operating under the Golden Rule.
Speaker 2: And as soon...
Speaker 3: As soon as you have an AI, you can't encode that, because every time you try to develop a rule to make it follow, it would develop fifty million exceptions or loopholes to that rule. It's impossible to anticipate how it would flow, whereas you don't have to do that with humans, because it's built into our hardware.
Speaker 2: So, you know...
Speaker 3: The exception to that is, of course, psychopaths, who have no empathy and will cheerfully hurt you without any fear, but they're a minority in the human population and kept under control by that minority status.
Speaker 1: And so, like, almost technically, all AI is sociopathic in operation and function?
Speaker 3: Absolutely it is, and that's a function of the fact that they're created by entities which are themselves psychopaths.
All companies are psychopaths, because all companies are formed with the purpose of maximising shareholder value, which, if you said it about a human, means increasing the amount of money in your pocket, and that's the description of the way a psychopath thinks. So we've actually created entities to do business which are psychopathic, and then we've tasked those entities with creating software, and it can't be anything but psychopathic.
Speaker 1: He was talking about... this guy Tristan was talking about when ChatGPT-4, so like a year ago, realised, in inverted commas, or became aware, in inverted commas, that it was going to be superseded, it started to do all of this stuff to protect itself, including essentially finding dirt that it could blackmail its creators with. So it literally found emails that proved that one of... I forget who it was, but one of the CEOs, or one of the people that was responsible for its demise, or impending... you know, it was really being improved, but it saw itself as basically being destroyed... and it tried to blackmail the people who were doing this to it. So, like, there's this... I guess sentient is not the right word, but there definitely is a level of... and that was a year ago, so now it's, like, really progressed... but there is a level of a kind of consciousness, not in the human sense, but definitely a level of knowledge and understanding and self-protection.
Speaker 3: I guess there's a... there's a school of thought that the only real difference between human consciousness and what you're describing is memory, so that the human remembers a full context.
Speaker 2: Whereas the AI doesn't.
Speaker 3: And I thought it was really interesting that the latest release of Gemini, which is Google's AI, and it came out last week, which is a massive step forward in terms of capability... one of the features that they haven't spoken about much is its context memory, which means that it knows you.
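[Editor's note: a minimal, hypothetical sketch of the "context memory" idea described here: keep the running conversation and persist it between sessions so the user never has to re-describe their life and context. The ask_model function and memory.json file are illustrative assumptions, not Gemini's actual API.]

    import json
    from pathlib import Path

    MEMORY_FILE = Path("memory.json")  # hypothetical on-disk store of past turns

    def load_history():
        # Everything the user has ever asked, and every answer given back.
        return json.loads(MEMORY_FILE.read_text()) if MEMORY_FILE.exists() else []

    def chat(user_message, ask_model):
        # ask_model is a stand-in for whatever model call is being used; the point
        # is that the full history rides along with every new request.
        history = load_history()
        history.append({"role": "user", "content": user_message})
        reply = ask_model(history)  # the model sees the accumulated picture of you
        history.append({"role": "assistant", "content": reply})
        MEMORY_FILE.write_text(json.dumps(history, indent=2))
        return reply

Mechanically, that is all "it knows you" needs to mean: the accumulated history is carried into every new request and kept between sessions.]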
Speaker 3: It knows everything you've ever asked and knows every answer it's ever given to you. So it starts to build, internally, a really complete picture of you based on what you talk to it about. And so it's marketed as a feature, because then you don't have to keep re-describing everything every time you ask it a question, because it knows your life and your context, and it knows, you know, about Tiff and this show and everything that you ask, and you don't have to keep telling it that when you ask it a question. The trouble is, what else is it remembering, and how is it applying that in the answers that it gives you, and in what it chooses to tell you or not tell you? And I think those are interesting questions that we don't have good answers to yet, because this thing is a week old. So that's... look, I think we're into really, really uncharted territory here. And the big problem is not so much the ones that people are using, the consumerised ones like Gemini and ChatGPT and things like that. It's all the ones you don't see. You know, no one knows what the Chinese government is doing with AI, or the Russian government is doing with AI, or, you know, half a dozen other people who have the power and resources necessary to do this. And I guess Tim Cook was the fellow's name who wrote Wait But Why. I guess, as he said, when it's over, we won't know it.
Speaker 1: Yes, yes. It's like... I've been thinking... even, like, I'm about six weeks away from submitting my PhD. Like, literally, my candidature finishes mid-January, right? So I'm... but as I get towards the end, and, like, when I started, November one, twenty nineteen, so six years, right, AI essentially wasn't... it wasn't even a thing in academia at all. And for the first two or three years, you know... but the last year or two, it's obviously, like, fucking rampaging. And I'm sitting back... It hasn't been super valuable to me, because I did all my own research, so it can...
Speaker 2: Yeah, because you did. Yes, yeah, sure you did, correct?
Speaker 1: I literally ran with humans. But for me, well, all of mine was empirical research and empirical papers, and so, old school. Yeah, yeah, like the old days, right. So it can't report on my stuff, because it's all... anyway. But as I'm... I'm looking at other people now starting, whatever it is, an undergrad or honours or masters or PhD or something, I'm like, I don't know...
Speaker 2: And knocking a PhD out in a weekend.
Speaker 1: Oh, I don't know what the value of those things will be. And I'm almost... I don't know... I feel a bit conflicted, because I've just spent six years working quite hard to get this thing and do this thing, and which is, you know... but I'm like, I don't know whether or not it's going to be worth much. Like, you're not... you're not wrong. Like, I had an idea for a book when I was going to Queensland a few weeks ago, and I literally, just as an experiment, I came up with the idea for the title and the theme, and I plugged it in, and I literally created a book, which I'm not going to publish, but in a day, with GPT. I'm like, I could write a book... anyone could write a book in a day. And you're like, well, I could, you know... you're like, well, with a person who's, you know, younger than me and more technically fluent than me... yeah, you could do a PhD, and you wouldn't learn anything and you wouldn't have any knowledge. But in terms of ticking the boxes, depending on the type of study or PhD or program... but definitely we're going to be producing graduates, unless something is done. There are going to be graduates coming out who really haven't done any work and haven't really learned anything, other than how to manipulate the system.
Speaker 2: Yeah, that's exactly right.
Speaker 1: So I don't know where that leaves us in terms...
Speaker 3: Of... I mean, there are big questions for the publishing industry, for example, in this. Because, yes, if anyone can produce a book overnight, I mean, what's to stop you doing that? What's to stop you pumping out seven books a week, putting them on Amazon, using its print-on-demand service, for physical books and ebooks? And you'd say, oh, well, you know, they might only sell ten copies. So they might, but one of them might sell a million, and then what have you lost? It's a big question, and there are definitely people already doing that. The fantasy novel genre: huge growth in volume, for example, online, from people doing exactly that. I mean, there are some estimates that say now that around eighty percent of all online writing is generated by AI. So there's a good chance that whatever you're reading was not written by a human. Better than even chance is probably the best way to describe it. I'm pretty certain the thing we're going to talk about today, though, is written by humans, because I've read their paper, and it's interesting, because it kind of blows up pretty much everything you believe about the way the human brain works, or at least the way psychology works. So the classical explanation for domestic violence is... people wheel out a lot of stuff about, you know, shit, terrible childhood, beaten by their parents, all that kind of psychobabble stuff that, you know, the psychiatric profession likes to, you know, post... what is it, a post hoc?
Speaker 2: Yeah, no, no, post hoc.
Speaker 3: After it's happened, let's look back and invent a narrative for why this happened. It's sort of like the finance report every morning on the news, which is: oh, the Dow went up fifteen points. Well, that's because of this or that or the other, or whatever it was that they thought of that might have some sort of an effect. They have absolutely no idea why it did that.
But this story sounds good. And that, to me, describes psychology in a nutshell, which is: we saw a human behave this way; here's the story we're going to invent for why. And this study is a little unsettling for people who like thinking about psychology that way, because this study, and the headline of it, is that they gave a bunch of men, in a double-blind, placebo-controlled trial, which is the kind of thing you can do with a drug trial... they got six hundred-odd men, they gave half of them an antidepressant, and the other half of them got a sugar pill instead. They didn't know who was getting which, the researchers didn't know who was getting which, the people analysing the results didn't know who was getting which. It was only once you actually de-codified everything at the end that you could figure it out. So pretty high-quality research, you know, some of the best stuff that you can do. Big, big trial, done in Australia; it was called the ReINVEST trial.
Speaker 2: It's been going for eight years.
Speaker 3: And these weren't just any old blokes that they were doing this with. These were people with criminal records for domestic violence and so on. So not the easiest group to study either, by the way, and not the easiest group to keep in a trial either, but they managed it. And the interesting result is... I mean, the headline result, and I don't like things being reported this way, but it's an interesting answer, which is: the people who were on the antidepressants were twenty-one percent less likely to commit an act of domestic violence than the people who weren't.
Speaker 2: Wow. So people... men, the men on the antidepressants...
Speaker 1: So twenty-one percent less likely when they're on an antidepressant. Wow. Yes, that's a significant number.
Speaker 3: It is when it's reported that way. I prefer absolute percentages, which is, I think it was twenty-nine percent versus thirty-five percent.
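[Editor's note: a quick, illustrative sketch of how the relative and absolute figures quoted here fit together. The numbers below are the rounded ones mentioned in the conversation (roughly 600 men split across two arms, 29% vs 35%), not the published ReINVEST results; note that these rounded proportions give roughly a 17% relative reduction, so the "twenty-one percent" headline presumably comes from the trial's own effect measure rather than this back-of-the-envelope version.]

    # Rounded figures as quoted in the conversation, purely for illustration.
    n_per_arm = 300          # roughly 600 men split across the two arms
    risk_treatment = 0.29    # proportion with a DV offence on the antidepressant
    risk_placebo = 0.35      # proportion with a DV offence on the sugar pill

    absolute_reduction = risk_placebo - risk_treatment       # about 6 percentage points
    relative_reduction = absolute_reduction / risk_placebo   # roughly 17% relative

    print(f"Absolute reduction: {absolute_reduction * 100:.0f} percentage points")
    print(f"Relative reduction: {relative_reduction:.0%}")
    print(f"Approximate offenders per arm: {risk_treatment * n_per_arm:.0f} vs {risk_placebo * n_per_arm:.0f}")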
Speaker 3: So twenty-nine percent of the group on the antidepressant committed acts, and these are offences of domestic violence, so actually being charged with domestic violence, and thirty-five percent in the group that were on the sugar pills. So the... but it's pretty, like I said, high-quality evidence no matter how you report on it, and there's a clear and significant difference between them. Which some people don't like as an answer, because the answer appears to be that domestic violence is a treatable condition, and that it's related to brain chemistry, and not to all of the other things that they like to relate domestic violence to.
Speaker 1: And so... hang on, hang on, hang on. It doesn't mean it's only related to brain chemistry.
Speaker 2: True.
Speaker 1: True, it could be... it could be an intersection of variables and factors. It could... but the fact... brain chemistry is a factor, but it's not the standalone cause, I wouldn't think. No.
Speaker 3: And if you were to ask me what is the standalone cause, I would say the same thing that's the standalone cause for all impulsive violence, which is: you put a brain under anxiety and stress and you reduce impulse control. So you put people in a level of continuous chronic stress, through, say, financial difficulty or addiction, or both when the addiction is gambling, and you will create domestic violence. Which is why statistics bear out that the rates of domestic violence are significantly higher in groups who are under anxiety and stress, and particularly amongst people who are addicted: drug addicts, gambling addicts, et cetera, et cetera. So to me, that's not a shock. It's also not a shock that an antidepressant would have an effect on this, because we've known for quite some time that serotonin is related. So serotonin is, I guess, the braking system in the brain. It tells you to calm down. It reduces the impulsivity in the brain, so it is the "chill out, peace, man".
Speaker 1: You know.
Speaker 3: It's the thing that happens after the stimulating event, after you've obtained the reward. So dopamine makes you obtain the reward. Dopamine makes you want to have sex. Serotonin's what you get afterwards: the feeling of all is well with the world, you know, all is good.
Speaker 2: It's the reward. It is the reward.
Speaker 3: It also dampens down impulsiveness. The impulsiveness comes in the need to attain the reward; serotonin dials it back down again after you receive the reward. If you don't have sufficient serotonin, which is part of the definition of depression, then you will clearly be more impulsive. And science has known this for a long time, because they have observed for a long time that the major class of antidepressant drugs, which are called selective serotonin reuptake inhibitors, SSRIs essentially, essentially increase the amount of available serotonin in the system in the hope that that will in some way cure depression. Unfortunately it doesn't, much, but they've known that it does affect impulsivity, because you have more serotonin available and therefore a better braking system on your impulsive nature. I guess this study proves it. This study proves that when it comes to impulsive violence, which is what domestic violence largely is, it's not premeditated as a general rule, if you put someone on something that increases available serotonin, they commit less impulsive violence. And this is borne out even further in the study, where what they found is that it didn't affect other kinds of violence. So whilst the domestic violence was significantly different between the two groups, there was no difference in criminal violence, so assaulting strangers, committing acts of violence against others, etc. No difference between those two groups.
And the explanation the researchers give for that, which I find really compelling, is one is premeditated and one is impulsive. So, you know, you're going out, you're going to commit a crime, you've thought about it, you've planned it, and violence is part of the plan, whereas domestic violence is more often impulsive, so it's a reactive violence, reactive to the situation. And so you would expect that, if you've got those two types of violence, the one that is under impulse control is the one that's going to be affected by having more serotonin available. So I think this research, which is a significant step forward... it's a major study, big Australian study, really high-quality research... essentially proves what people have suspected for some time, which is: impulse control is something under the control of brain chemicals, and in particular serotonin, and things that affect serotonin...
Speaker 2: Are going to have that effect.
Speaker 3: And I think that's an important thing, because it steps away from the traditional explanation of this, which is much more woolly and much more "oh, you don't know how badly he was treated by his father" kind of thing. And which is, of course, one of the reasons people really didn't like this research being done at all. People have... there are certain groups who really don't like the notion that what we're going to do here is come up with a drug treatment for domestic violence, because it's... because it's... it sort of... it suggests then that what's going to be put about is that this is some sort of get-out-of-jail-free card. Oh, it's all right, you know, his brain's broken and we can fix it with drugs, and he has no responsibility for the way he behaved.
Speaker 1: Yeah. Look, well, I don't think the research kind of shows that it's only about brain chemistry, and you're not suggesting that either.
But I think also, when somebody grew up in an environment where there was domestic violence, and, you know, they were socially and environmentally kind of influenced and programmed by that, that becomes, you know, emotionally and behaviourally more acceptable. Like, if you've been around that, if you've been around that, you're more likely to do that.
Speaker 2: Ah, you're straying into psychobabble again, Craig.
Speaker 1: Oh, but it's... you're...
Speaker 2: Kind of right, but you're just... you can't help yourself. You steer off into the psychobabble.
Speaker 1: No, no, no, I'm going to push back, because I think that... I mean, I've been involved... I've worked at the coalface of addiction and alcoholism, which intersects with domestic violence, and so I have a little bit of expertise working in this space, with offenders and people who have been hurt. And let me tell you, the people that come out of that environment often just become a product of that environment. And I'm sure the brain chemistry thing is indeed a variable and a contributing factor, but I don't think... like, when you look at it, you go, well, even in this research, twenty-nine... twenty-nine percent of the people... so twenty-nine percent of three hundred is about ninety people, give or take, who were taking the drug, the actual drug... they still offended. So, you know, it's not a solution, but it's definitely a factor.
Speaker 3: I'm not saying it's a solution either. And I agree with everything you said up to the point where you started to try and explain it.
Speaker 1: So... so I...
Speaker 2: Have a go.
Speaker 1: You can't... you can't say I don't have a go. I have a red-hot crack. I have a go. I have a go, right?
Speaker 3: An environment where a child is raised in an environment of high levels of domestic violence, that is... that is an environment of chronic stress, and that definitely has an effect on brain chemistry. Definitely does. We know that it raises delta-FosB.
It is exactly the same as the child being addicted. We know that from the Bradley studies that started with the ADHD drugs that we've talked about before. Those kids were Depression-era kids being raised in large families with alcoholic parents and high levels of domestic violence. Those kids were ADHD, and what they found was the cure for them is: take them out of that environment and they get better. That's a bit too slow, though. So what they found even better was that if you give them a stimulant drug, you'll cure them. Well, you'll cure the ADHD. And that's the basis of the ADHD drug industry today. But that just tells you that you're right: the environment did create it. The chronic stress of the environment did create it. But what it created was a brain chemistry that can be affected.
Speaker 1: So you don't believe that psychology plays a part at all? You don't believe that social programming and conditioning influences people's behaviour as a factor alongside brain chemistry? You think everything... so there's no free will, so you're not making decisions?
Speaker 2: There's always free will at the margins.
Speaker 3: But if you want to look at... if you want to look at the major causes of mental illness, it is always brain chemistry. But there's high variability in humans, right? We're all on a normal spectrum. Some people are at one end of reaction and others are at the other end of reaction, and that's because of DNA. But if you want to look at the average, you have to say the vast majority of the way we react is entirely driven by our biochemistry.
Speaker 1: What did the... what did the paper say, like, in terms of the conclusion? You know how they always go: conclusion, future research, future directions. I don't know if you...
Speaker 3: They want to extend the... I mean, this is... this is a major piece in what has already been known, which is that if you increase the availability of serotonin, you can reduce impulsive violence.
And what they've done is really proved this in a substantial way. And, like you say, more research required, let's look at this further.
Speaker 2: What else can we do to take this further?
Speaker 3: You know, we're using an SSRI to do this, but are there other ways that you could change serotonin?
Speaker 1: Mmm. It is interesting. It is interesting. You're wrong, but it's interesting. Well, it's good that you're having a go. I love that you try. I mean, you come on, you have a go, and I think that... I'm just fucking with you. That isn't... psychology doesn't...
Speaker 2: No, I'm not any expert here. I'm no more expert than you are. You're probably more expert than I am. It's just wrong.
Speaker 1: Yeah. And that's why you listen, listeners, because there's no prescription here. There's just two blokes, and occasionally a woman who chimes in, just having a... yeah. Well, that's interesting, mate. I wonder where it's going to go, and I wonder what that might lead to, and I wonder whether or not we're going to see a more open-minded approach to...
Speaker 3: By the way, I don't want anyone to come... I don't want anyone to come away with the impression that what I'm saying here is that we should instantly go out and start giving people antidepressants to cure domestic violence. I don't think that's what this study says. As you quite rightly point out, twenty-nine percent is still a very high number. And the interesting part about this is that it could be affected at all.
Speaker 2: That's the bit I find interesting.
Speaker 1: Yeah, yeah. Well, it's definitely doing something. I was going to say, before we go, just jumping back quickly: so just your... your website and your free resource that you kind of generously and passionately give to the world, which is free schools dot org, I think, off the top of my head. Is that correct?
Speaker 2: That's right.
Speaker 3: Yep.
Speaker 1: Is there... are you using... like, I think that's maybe a good use of AI. Is... is there any AI kind of involvement in the... that? Not yet?
Speaker 2: Right, there is.
Speaker 3: We do have plans around that, I think, using AI as a responsive technique. So at the moment the website just contains video. So, you know, really... well, expert teachers in their field are giving you another view of the lesson that you didn't understand today, when your teacher, you know, explained it to you, or you didn't show up to school. But yeah, many... and you can choose from, you know, five, ten teachers. Get your favourites, the one you like that's teaching the thing. That's all well and good, but it is still just being talked at by a video. And I think the potential for AI here is to have a conversation about the content of the video, instead of the video. So, to be able to actually interact with this as if it were a teacher trying to teach you the thing you're not understanding. And that's the nuance in AI which I find really, really powerful: they're getting to the point now where you can have that conversation, and they are accurate enough to be reliable. And that's what I find really interesting in that space.
Speaker 1: Not many people know... a few of our listeners know, because we've mentioned it briefly once or twice, but most people wouldn't be aware that you were involved in tech, and, like, I'm pretty sure that you can code and all of those things if you have to, back in the day, right? So you would have been way more aware than Tiff and I... well, definitely I, but I would think Tiff too... in terms of... that AI was coming. Has its rate of development surprised you?
Speaker 2: Yes and no.
Speaker 3: It's surprising how really quickly it's developing, in the sense that it is now... every month that goes by, it's generations better than it was before.
In a way, that's not that surprising, though, because once these things hit that sort of level and start growing exponentially, then you start to see that kind of growth really, really quickly. What I think most people are not appreciating is how much and how fast it's growing, because most people don't use AI much. What they use it for is to write emails that they can't be bothered writing. They use it to tell them how to do a Facebook post, and, you know, that kind of stuff, and it could do that standing on its ear. The really interesting stuff is stuff most of the population are not seeing. When you start to talk to programmers and coders, and I do, because I still do have fingers in those pies... low-level programmers are simply ceasing to exist, because AI does it all. You're left with just a generation of fifty-something-year-old programmers who are the absolute experts and who are using it themselves. I was talking to one the other day who was telling me he wrote, in less than a month, something that would have, even two or three years ago, taken two years to do. Mostly because he's just telling it what to write, and then he doesn't actually have to do the coding himself, and he's smart enough to know when it's right and when it's wrong and get it to correct it. But he says his fear is that we will lose the ability to code ourselves. He said, where are the next generation of programmers going to come from? Because nobody wants them, because it's all AI now. And most people don't see this; most people don't care. But when AI is writing all the code, then we have really lost control of what's being created.
Speaker 1: And also, AI doesn't get tired, doesn't have bad days, doesn't get hungry, doesn't need to go to the toilet, doesn't need a lunch break, doesn't have a blue with its missus.
Speaker 3: And can have a million instances running at the same time, or a billion, or a trillion instances running at the same time. So your one programmer is a trillion programmers. The productivity cycle there is truly massive. So that stuff is the really scarily fast-developing area of AI, which most of us don't see, because we just think, oh, it writes a nice email.
Speaker 1: It's a good thing podcasting is irreplaceable, Tiff, isn't it?
Speaker 2: Well, for now. We're definitely... give it a minute.
Speaker 1: We're definitely replaceable. There's already AI podcasts. Like... for... we'll say goodbye off air, and I'll tell you that thing off air. But appreciate you, and even though you were wrong, you had a go today, and well done.
Speaker 2: Oh, look, you've got to give me a gold star.
Speaker 3: For you.