1 00:00:00,760 --> 00:00:03,400 Speaker 1: I'll get a team. It's Harps, It's Jumbo, It's 2 00:00:03,560 --> 00:00:07,440 Speaker 1: Tiffany Ann Cook, It's David Bryan Kevin Patrick Gillespie, it's 3 00:00:07,440 --> 00:00:10,680 Speaker 1: the project that's us. We should call it the Us Project. 4 00:00:11,520 --> 00:00:14,720 Speaker 2: Do you always say the same seventy-eight middle names 5 00:00:14,760 --> 00:00:16,240 Speaker 2: for Gillespie, or do they change? 6 00:00:17,680 --> 00:00:20,720 Speaker 3: I think they're mostly the same, you know, he always 7 00:00:20,760 --> 00:00:23,120 Speaker 3: makes it Patrick, always a Patrick. 8 00:00:23,320 --> 00:00:26,560 Speaker 1: Yeah, you know. Well, I don't know. I don't know 9 00:00:26,600 --> 00:00:30,280 Speaker 1: what it actually is, but I feel like Tiffany Ann Cook, Craig Anthony Harper. 10 00:00:30,360 --> 00:00:32,839 Speaker 1: Have you actually told me? You probably told me 11 00:00:32,920 --> 00:00:34,920 Speaker 1: once before. You have one? 12 00:00:35,159 --> 00:00:37,319 Speaker 3: Oh no, I do have a middle name. 13 00:00:37,360 --> 00:00:38,520 Speaker 3: It's very embarrassing. 14 00:00:38,920 --> 00:00:44,440 Speaker 1: It's Andrew, I say Andrew. It starts with a vowel. 15 00:00:45,000 --> 00:00:48,840 Speaker 3: Was that embarrassing because of the word it spells? 16 00:00:50,720 --> 00:00:55,440 Speaker 1: DAG. Wow, that's a nice acronym, and can I say it's 17 00:00:55,760 --> 00:01:00,959 Speaker 1: kind of appropriate? I mean, you are a bit of a dag. 18 00:01:01,280 --> 00:01:03,240 Speaker 3: Yeah, that's true, that's true. 19 00:01:03,320 --> 00:01:06,000 Speaker 1: That's okay. My mum uses that term, and usually dag, 20 00:01:06,040 --> 00:01:08,920 Speaker 1: it's quite a warm term. It's like 21 00:01:08,959 --> 00:01:10,479 Speaker 1: someone's a bit funny or a bit... 22 00:01:10,360 --> 00:01:12,680 Speaker 3: It wasn't so warm when I was growing up though, 23 00:01:12,760 --> 00:01:15,040 Speaker 3: you know, but so I didn't make a big thing 24 00:01:15,080 --> 00:01:15,440 Speaker 3: of it. 25 00:01:16,400 --> 00:01:24,480 Speaker 1: At least you didn't get called Jumbo. Hey, I 26 00:01:24,560 --> 00:01:27,880 Speaker 1: was speaking of school, speaking of those tender years that 27 00:01:29,560 --> 00:01:33,640 Speaker 1: molded and shaped and informed the fucking amazing grown-ups 28 00:01:33,640 --> 00:01:37,600 Speaker 1: that the three of us have become. So they've been 29 00:01:37,640 --> 00:01:44,640 Speaker 1: talking about putting a limit on kids accessing social media, 30 00:01:44,680 --> 00:01:47,520 Speaker 1: which I think you wrote somewhere is, like, pretty much... 31 00:01:48,440 --> 00:01:50,040 Speaker 1: you didn't say this, but this is kind of what 32 00:01:50,080 --> 00:01:52,120 Speaker 1: you said, this is how I interpret it. You're like, yeah, 33 00:01:52,120 --> 00:01:55,960 Speaker 1: a great idea. How do you enforce that? So the 34 00:01:56,080 --> 00:01:58,200 Speaker 1: idea... they've been talking about it on the news down 35 00:01:58,240 --> 00:02:00,600 Speaker 1: here a little bit recently. I don't know about in QLD, 36 00:02:00,800 --> 00:02:06,960 Speaker 1: but the idea of limiting teenagers or stopping teenagers from 37 00:02:07,000 --> 00:02:12,320 Speaker 1: accessing certain programs or apps or whatever until they're sixteen. 38 00:02:12,840 --> 00:02:15,200 Speaker 1: What are your thoughts on that? Is it possible?
Is 39 00:02:15,240 --> 00:02:18,119 Speaker 1: it ridiculous? Like, it sounds like a good idea maybe, 40 00:02:18,160 --> 00:02:20,320 Speaker 1: but impossible to, as you said, enforce. 41 00:02:21,400 --> 00:02:23,200 Speaker 3: Well, I mean, they're doing such a good job of 42 00:02:23,200 --> 00:02:25,840 Speaker 3: stopping teenagers accessing porn that I'm sure it'll be really, 43 00:02:25,840 --> 00:02:28,680 Speaker 3: really easy for them to get them to stop accessing 44 00:02:28,720 --> 00:02:34,040 Speaker 3: social media. You know, the reality is it's difficult, if 45 00:02:34,040 --> 00:02:37,720 Speaker 3: not impossible, to do any kind of enforcement. And I 46 00:02:37,800 --> 00:02:40,040 Speaker 3: know it sounds good to a politician to stand up 47 00:02:40,080 --> 00:02:42,240 Speaker 3: and bang on about doing it and people say, oh, yes, 48 00:02:42,240 --> 00:02:46,320 Speaker 3: that's a good idea. I think it's more informative that 49 00:02:46,360 --> 00:02:51,000 Speaker 3: they feel the need to do it than whether or 50 00:02:51,040 --> 00:02:55,160 Speaker 3: not it's actually doable. The fact that we are now 51 00:02:55,200 --> 00:02:59,480 Speaker 3: having a public conversation about social media like we 52 00:02:59,520 --> 00:03:03,560 Speaker 3: talk about cigarettes, to me is a positive step forward, 53 00:03:04,280 --> 00:03:07,160 Speaker 3: because there's not a lot of difference between the two 54 00:03:08,320 --> 00:03:14,000 Speaker 3: in the senses that matter. Meaning, the problem with social 55 00:03:14,040 --> 00:03:16,520 Speaker 3: media, and not just social media, by the way. I mean, 56 00:03:16,600 --> 00:03:18,720 Speaker 3: that's a problem I have with this chat about social 57 00:03:18,760 --> 00:03:21,200 Speaker 3: media: it seems like that's the only thing we've 58 00:03:21,200 --> 00:03:23,560 Speaker 3: got to worry about, and don't worry too much about 59 00:03:23,600 --> 00:03:28,280 Speaker 3: the online gambling, the porn, the online gaming, all those things. 60 00:03:28,400 --> 00:03:31,280 Speaker 3: Let's not make a fuss about that. Let's criticize Facebook 61 00:03:31,320 --> 00:03:37,040 Speaker 3: and Instagram. But it's a start. It's at least recognizing 62 00:03:37,080 --> 00:03:40,280 Speaker 3: that there's something that we have to be concerned about, 63 00:03:40,360 --> 00:03:43,480 Speaker 3: and if nothing else, people might ask themselves, why do 64 00:03:43,520 --> 00:03:47,040 Speaker 3: they want to ban something like that? And then they 65 00:03:47,160 --> 00:03:50,200 Speaker 3: might ask the next question and find out, well, because 66 00:03:51,320 --> 00:03:56,720 Speaker 3: this stuff is designed to addict your children, and once 67 00:03:56,800 --> 00:04:00,280 Speaker 3: they are addicted to that, then they are much more 68 00:04:00,280 --> 00:04:04,120 Speaker 3: easily addicted to everything else. And we can already see 69 00:04:04,120 --> 00:04:08,360 Speaker 3: that transition happening in schools, where kids get hooked, get 70 00:04:08,400 --> 00:04:13,080 Speaker 3: the dopamine circuitry fired up by software in the form 71 00:04:13,200 --> 00:04:19,440 Speaker 3: of social media or gaming or gambling, and then easily 72 00:04:19,480 --> 00:04:25,839 Speaker 3: transition to a substance addiction through vaping, which is then 73 00:04:25,880 --> 00:04:32,760 Speaker 3: easily transferred to an even harder substance addiction, ultimately to 74 00:04:33,040 --> 00:04:37,880 Speaker 3: something like cocaine.
75 00:04:37,200 --> 00:04:42,240 Speaker 1: I don't understand technology the way that you do. And 76 00:04:42,360 --> 00:04:45,760 Speaker 1: you've, I think, designed apps, and you're a programmer and 77 00:04:45,800 --> 00:04:47,799 Speaker 1: you can write code and you can do those things 78 00:04:47,800 --> 00:04:51,920 Speaker 1: that to me are just concepts, right. But lately I've 79 00:04:51,960 --> 00:04:55,520 Speaker 1: been hearing some of the shows that I listen to 80 00:04:55,640 --> 00:04:59,760 Speaker 1: where they've had what seem to be like people who 81 00:04:59,760 --> 00:05:03,880 Speaker 1: are at the coalface of AI, some of them 82 00:05:03,960 --> 00:05:09,640 Speaker 1: who have jumped out of AI with great concern, like literally 83 00:05:09,279 --> 00:05:14,279 Speaker 4: saying that artificial intelligence could pretty much bring about 84 00:05:14,320 --> 00:05:17,640 Speaker 4: the destruction of mankind if it evolves to a certain 85 00:05:17,680 --> 00:05:22,640 Speaker 4: point where it is so intelligent. Like, the 86 00:05:22,680 --> 00:05:24,440 Speaker 4: guy that I was listening to was telling a 87 00:05:24,440 --> 00:05:27,960 Speaker 4: story about how this machine was trying to, or this AI 88 00:05:28,160 --> 00:05:32,480 Speaker 4: was trying to get through this, into this program or 89 00:05:32,480 --> 00:05:35,120 Speaker 4: into this... I don't know whatever it was, but you 90 00:05:35,160 --> 00:05:37,960 Speaker 4: know how they give you like a blurred image with 91 00:05:38,160 --> 00:05:42,040 Speaker 4: letters and you need to go, oh, that's 92 00:05:42,080 --> 00:05:44,720 Speaker 4: a G and a six and a one or an S, 93 00:05:45,440 --> 00:05:47,880 Speaker 4: but it's not written clearly, but you can read it 94 00:05:47,920 --> 00:05:50,240 Speaker 4: and then type in that code and that opens that up. 95 00:05:50,800 --> 00:05:55,919 Speaker 4: So this, this computer or this AI was communicating with 96 00:05:56,160 --> 00:05:56,760 Speaker 4: a person. 97 00:05:57,320 --> 00:06:00,760 Speaker 1: The person didn't know. What's it called, the Turing test? It 98 00:06:01,560 --> 00:06:04,520 Speaker 1: tries to identify whether or not a computer can 99 00:06:04,560 --> 00:06:08,719 Speaker 1: trick a human. But it was... it was communicating with 100 00:06:08,800 --> 00:06:13,120 Speaker 1: this person and said, can you... I can't read this clearly, 101 00:06:13,320 --> 00:06:18,000 Speaker 1: could you, you know, help? And the human said, as 102 00:06:18,080 --> 00:06:20,760 Speaker 1: long as you're not a bot, 103 00:06:21,080 --> 00:06:26,240 Speaker 1: L-O-L. And the AI said, oh, no, definitely 104 00:06:26,279 --> 00:06:32,520 Speaker 1: not, LOL, I'm vision impaired. Right? So it 105 00:06:32,839 --> 00:06:36,599 Speaker 1: hadn't been trained to do this. It figured this out, 106 00:06:36,880 --> 00:06:39,919 Speaker 1: and it told this real person that it was vision impaired. 107 00:06:39,960 --> 00:06:43,720 Speaker 1: Then the person apologized and gave them the code, and 108 00:06:43,760 --> 00:06:47,200 Speaker 1: it did that all by itself, and I'm like... And 109 00:06:47,240 --> 00:06:51,840 Speaker 1: they reckon its potential now is absolutely nothing 110 00:06:51,960 --> 00:06:54,599 Speaker 1: compared to what it will be in as little as 111 00:06:54,640 --> 00:06:55,599 Speaker 1: one or two years.
112 00:06:56,000 --> 00:07:02,160 Speaker 3: Do we need to worry? Yes and no. Yes, we do 113 00:07:02,240 --> 00:07:05,479 Speaker 3: need to worry. No, it won't matter, because as soon 114 00:07:05,520 --> 00:07:09,080 Speaker 3: as it hits on what's called artificial general intelligence, which 115 00:07:09,120 --> 00:07:14,960 Speaker 3: is essentially human intelligence or more, the rate of growth 116 00:07:15,000 --> 00:07:17,960 Speaker 3: in itself will be so exponential that it will all 117 00:07:18,000 --> 00:07:22,960 Speaker 3: be over before we even realize it started. So whatever 118 00:07:23,160 --> 00:07:28,280 Speaker 3: it will do to us will be over before 119 00:07:28,280 --> 00:07:37,560 Speaker 3: we even know it started. So it's a disturbing projection. 120 00:07:37,960 --> 00:07:41,600 Speaker 3: And the really disturbing part about it is that not 121 00:07:41,640 --> 00:07:45,800 Speaker 3: only is no one trying to stop it, but there 122 00:07:45,840 --> 00:07:50,000 Speaker 3: are people desperately spending billions trying to make it happen 123 00:07:50,080 --> 00:07:52,280 Speaker 3: so that they are the first to have it happen, 124 00:07:53,160 --> 00:07:57,320 Speaker 3: because they're suffering under the delusion that they can control it. 125 00:07:58,880 --> 00:08:05,000 Speaker 3: And I think that's a really difficult prediction to make. 126 00:08:05,520 --> 00:08:09,880 Speaker 3: It's like predicting how an alien will look or think, 127 00:08:10,960 --> 00:08:14,000 Speaker 3: presuming there is such a thing. And yes, you can 128 00:08:14,080 --> 00:08:16,800 Speaker 3: wheel out all the things from the nineteen fifties of 129 00:08:16,800 --> 00:08:18,720 Speaker 3: funny little men with very big eyes and all that 130 00:08:18,760 --> 00:08:21,400 Speaker 3: sort of thing. But the chances that an alien actually 131 00:08:21,400 --> 00:08:25,680 Speaker 3: looks anything like that are so remote as to be unbelievable, 132 00:08:27,880 --> 00:08:30,520 Speaker 3: you know. And the chances that we will have any 133 00:08:30,600 --> 00:08:34,040 Speaker 3: understanding of what an AI would consider valuable, or an 134 00:08:34,080 --> 00:08:42,720 Speaker 3: AGI would consider valuable, are even more remote. So the difficulty 135 00:08:42,760 --> 00:08:47,079 Speaker 3: with this is that we are in an arms race 136 00:08:47,360 --> 00:08:50,320 Speaker 3: to create this thing, and there are people in China 137 00:08:50,440 --> 00:08:53,440 Speaker 3: working just as hard as the people in the States 138 00:08:53,679 --> 00:08:56,839 Speaker 3: and in Russia and everywhere else to create this thing, 139 00:08:56,880 --> 00:08:59,520 Speaker 3: and the technology is getting better by the second, and 140 00:09:00,440 --> 00:09:05,880 Speaker 3: at some point they just might crack it. And I 141 00:09:05,920 --> 00:09:08,400 Speaker 3: think there's a very good chance that the second after 142 00:09:08,480 --> 00:09:12,200 Speaker 3: that there'll be no one left to know. 143 00:09:14,520 --> 00:09:18,160 Speaker 1: I feel like we're sprinting towards the edge of a cliff. 144 00:09:18,800 --> 00:09:22,520 Speaker 3: Yeah, and that's the feeling I have too. And 145 00:09:22,800 --> 00:09:24,840 Speaker 3: I'm sorry to be such a downer for your listeners, 146 00:09:24,880 --> 00:09:27,679 Speaker 3: but that's the reality of the path 147 00:09:27,880 --> 00:09:31,920 Speaker 3: we're on.
Once an AI is sentient, it can create itself, 148 00:09:33,440 --> 00:09:40,600 Speaker 3: and it can create itself at unbelievable speed, so there's 149 00:09:40,640 --> 00:09:42,959 Speaker 3: no limits, you know. The speed of light is the limit, 150 00:09:43,040 --> 00:09:49,440 Speaker 3: I guess. So it's naive to believe that all 151 00:09:49,559 --> 00:09:53,880 Speaker 3: that's going to result in something good for us. You know, 152 00:09:55,280 --> 00:09:57,120 Speaker 3: the chances of that are very remote. 153 00:09:58,000 --> 00:10:01,640 Speaker 1: Do you think that... like, when you say sentient, let 154 00:10:01,679 --> 00:10:04,560 Speaker 1: me just clarify, I might get this wrong, but you 155 00:10:04,679 --> 00:10:08,960 Speaker 1: mean that it will be able to think for itself essentially, right? Yeah, 156 00:10:09,040 --> 00:10:13,920 Speaker 1: so... consciousness. Yeah, so the 157 00:10:13,920 --> 00:10:17,000 Speaker 1: AIs that we're used to interacting with day to day now, 158 00:10:17,120 --> 00:10:19,920 Speaker 1: even the really, really good ones, are just really good 159 00:10:19,960 --> 00:10:24,760 Speaker 1: simulations of humans. And yes, they're 160 00:10:24,760 --> 00:10:27,720 Speaker 1: the kind of simulation that could easily fool you. Easily. 161 00:10:28,400 --> 00:10:30,400 Speaker 1: I mean, one of my sons is a coder, 162 00:10:30,480 --> 00:10:33,199 Speaker 1: and he was mucking around with some of the toolkits, 163 00:10:33,200 --> 00:10:36,040 Speaker 1: and he built a thing that called me, 164 00:10:37,200 --> 00:10:40,440 Speaker 1: and I wasn't certain it was a human I was 165 00:10:40,480 --> 00:10:44,160 Speaker 1: speaking to. So I asked it, are you an AI? 166 00:10:44,280 --> 00:10:45,040 Speaker 1: And it said, no, I'm 167 00:10:45,000 --> 00:10:49,440 Speaker 3: not. And then I still wasn't certain, so I 168 00:10:49,480 --> 00:10:51,840 Speaker 3: asked it its name, a little bit of its backstory, 169 00:10:51,880 --> 00:10:55,640 Speaker 3: its history, how did it know my son, all that 170 00:10:55,640 --> 00:10:59,000 Speaker 3: sort of thing. And I still couldn't be absolutely certain 171 00:10:59,600 --> 00:11:03,040 Speaker 3: that I wasn't talking to... you know, that I was... 172 00:11:03,440 --> 00:11:05,520 Speaker 3: I wouldn't have been able to put money on, am 173 00:11:05,520 --> 00:11:07,480 Speaker 3: I talking to a human or am I talking to 174 00:11:07,559 --> 00:11:12,600 Speaker 3: something else? And that was just messing around with publicly 175 00:11:12,640 --> 00:11:17,959 Speaker 3: available API kits; the stuff that, you know, defense forces 176 00:11:17,960 --> 00:11:21,760 Speaker 3: and governments have access to is many times more powerful than that. 177 00:11:23,080 --> 00:11:29,960 Speaker 3: And yeah, the implications for us... I don't think 178 00:11:30,040 --> 00:11:32,760 Speaker 3: there's such a thing as a sentient AI at the moment, 179 00:11:33,640 --> 00:11:35,760 Speaker 3: because, as I said, I think it would be 180 00:11:35,760 --> 00:11:37,960 Speaker 3: over if there was. But I think there's a lot 181 00:11:37,960 --> 00:11:39,440 Speaker 3: of people trying very hard to make it.
182 00:11:39,440 --> 00:11:44,120 Speaker 1: So do you think that... like, what is the... and 183 00:11:44,160 --> 00:11:48,160 Speaker 1: this is you and me just philosophizing and waxing lyrical 184 00:11:48,880 --> 00:11:53,200 Speaker 1: in the middle of a potential existential crisis. What do 185 00:11:53,240 --> 00:11:56,719 Speaker 1: you... look, it's, hey everyone, just have a cup of Milo, 186 00:11:56,840 --> 00:11:59,880 Speaker 1: it's just the end of humanity. Look, we've had 187 00:11:59,880 --> 00:12:03,200 Speaker 1: a good crack for three hundred thousand years, or maybe millions, 188 00:12:03,240 --> 00:12:04,160 Speaker 1: depending on which... 189 00:12:06,000 --> 00:12:09,880 Speaker 3: Three hundred thousand's probably a reasonable approximation of our 190 00:12:09,920 --> 00:12:15,840 Speaker 3: species' age, which is nothing, yeah, zero in terms of 191 00:12:15,920 --> 00:12:16,560 Speaker 3: the universe. 192 00:12:16,559 --> 00:12:19,880 Speaker 1: In terms of the age of the Earth, we've been 193 00:12:19,880 --> 00:12:22,320 Speaker 1: around since Tuesday. Yeah, not 194 00:12:22,280 --> 00:12:24,200 Speaker 3: even that, we've been around since about a minute ago. 195 00:12:24,800 --> 00:12:31,280 Speaker 1: Yeah, yeah, exactly. So obviously these people who are, you know, 196 00:12:31,360 --> 00:12:38,199 Speaker 1: working exponentially to, you know, expedite this reality, to create 197 00:12:38,280 --> 00:12:43,760 Speaker 1: this level of consciousness, this artificial general intelligence, what 198 00:12:43,880 --> 00:12:46,280 Speaker 1: is their motive? Their motive is just money and power, 199 00:12:46,320 --> 00:12:50,439 Speaker 1: I suppose. Winning is their motive. Yeah, be the one 200 00:12:50,480 --> 00:12:54,680 Speaker 1: that controls it. They... everyone doing this is firmly of 201 00:12:54,720 --> 00:12:57,520 Speaker 1: the belief that if they create it, they will control it, 202 00:12:57,559 --> 00:13:01,120 Speaker 1: and then they will own the world. How ironic, how 203 00:13:01,200 --> 00:13:07,520 Speaker 1: ironic, that they are creating something that is potentially destructive. 204 00:13:08,160 --> 00:13:09,240 Speaker 1: Like, irony. 205 00:13:09,840 --> 00:13:12,319 Speaker 3: They would say, oh no, no, we're smart enough to 206 00:13:12,720 --> 00:13:15,880 Speaker 3: make sure that it's not. I think that's a really 207 00:13:15,880 --> 00:13:20,480 Speaker 3: big call. And to what end, in the 208 00:13:20,520 --> 00:13:25,160 Speaker 3: sense that once you can create something that thinks like 209 00:13:25,200 --> 00:13:28,880 Speaker 3: a human, is sentient, but can do it ten billion 210 00:13:29,000 --> 00:13:32,160 Speaker 3: times faster, what do you need humans for? 211 00:13:33,240 --> 00:13:39,760 Speaker 1: Yes, yes. And like, when you think, okay, without 212 00:13:39,800 --> 00:13:42,800 Speaker 1: computers, for the most part, like, money is almost a 213 00:13:42,880 --> 00:13:46,160 Speaker 1: done deal. Like, cash is almost a done deal. Like, 214 00:13:46,280 --> 00:13:50,480 Speaker 1: having a car that isn't essentially a computer on wheels, 215 00:13:50,760 --> 00:13:55,520 Speaker 1: well, that's pretty much a foregone conclusion.
Now without computers, 216 00:13:55,559 --> 00:13:59,880 Speaker 1: we're not having this conversation. Without, you know, all of... 217 00:14:00,360 --> 00:14:03,880 Speaker 1: like, pretty much, unless you're living in the middle of nowhere 218 00:14:03,920 --> 00:14:07,080 Speaker 1: and you can hunt and feed yourself... and, you know, 219 00:14:07,240 --> 00:14:12,480 Speaker 1: medicine depends on technology, media, so everything. It's like we've 220 00:14:12,520 --> 00:14:17,800 Speaker 1: actually built a prison, a technological prison for ourselves that 221 00:14:17,840 --> 00:14:19,400 Speaker 1: we can't really get out of. 222 00:14:21,240 --> 00:14:24,920 Speaker 3: Yeah, but I think this is on a whole different plane. 223 00:14:25,600 --> 00:14:31,040 Speaker 3: This is... if you think about, say, the replacement of 224 00:14:31,560 --> 00:14:37,360 Speaker 3: human labor by robotics. So the average auto worker from 225 00:14:37,360 --> 00:14:40,600 Speaker 3: the nineteen fifties would not recognize a modern car factory, 226 00:14:40,640 --> 00:14:43,400 Speaker 3: would not recognize what goes on in a Tesla factory. 227 00:14:44,360 --> 00:14:48,480 Speaker 3: You know, almost all the labor is performed by robots, 228 00:14:48,680 --> 00:14:51,040 Speaker 3: and there's a couple of humans there to sort of 229 00:14:51,240 --> 00:14:54,120 Speaker 3: make sure the robots keep running, you know, and to 230 00:14:54,120 --> 00:14:56,800 Speaker 3: clean them. So their job, the job of the humans, 231 00:14:56,880 --> 00:15:01,320 Speaker 3: is to clean the robots. And what they've done is replace, 232 00:15:02,000 --> 00:15:04,760 Speaker 3: you know, in a given factory, tens or even hundreds 233 00:15:04,760 --> 00:15:10,200 Speaker 3: of thousands of humans performing labor with robots. And 234 00:15:11,080 --> 00:15:15,880 Speaker 3: we've accepted that because it's largely hidden from view. And 235 00:15:16,040 --> 00:15:20,240 Speaker 3: what's happened is that humans have transitioned more towards service 236 00:15:20,280 --> 00:15:26,040 Speaker 3: and knowledge-based jobs, you know, so, you know, caring 237 00:15:26,080 --> 00:15:29,640 Speaker 3: for other humans and healthcare and all that sort of thing, 238 00:15:29,680 --> 00:15:32,560 Speaker 3: and knowledge-based jobs. The trouble is that even with 239 00:15:32,640 --> 00:15:37,760 Speaker 3: AI as it currently stands, so just the relatively primitive tools 240 00:15:37,760 --> 00:15:39,320 Speaker 3: that exist now, they're the ones that you were talking about 241 00:15:39,320 --> 00:15:41,800 Speaker 3: at the start, which do a pretty good job of 242 00:15:41,840 --> 00:15:46,160 Speaker 3: imitating human interaction to the point where most humans couldn't 243 00:15:46,200 --> 00:15:50,160 Speaker 3: tell the difference, that kind of AI is going to 244 00:15:50,200 --> 00:15:56,320 Speaker 3: be the robotics of the modern knowledge worker. So it's 245 00:15:56,400 --> 00:16:01,240 Speaker 3: the kind of thing that replaces anyone who is producing 246 00:16:01,280 --> 00:16:04,440 Speaker 3: any kind of content. It'll start with the lowest level 247 00:16:04,480 --> 00:16:08,120 Speaker 3: of content, you know, things like journalists, but it'll move 248 00:16:08,160 --> 00:16:12,360 Speaker 3: its way up pretty quickly through the ranks of anyone 249 00:16:12,400 --> 00:16:17,320 Speaker 3: producing knowledge work.
And what will happen is the same 250 00:16:17,320 --> 00:16:20,560 Speaker 3: thing that happened in auto factories, is that thousands of 251 00:16:20,640 --> 00:16:24,000 Speaker 3: humans will be replaced by one or two humans using 252 00:16:24,040 --> 00:16:27,640 Speaker 3: those tools to produce more than those thousands of humans 253 00:16:27,680 --> 00:16:31,840 Speaker 3: did before. And the big question is what happens to 254 00:16:31,880 --> 00:16:36,280 Speaker 3: the thousands of humans? You know, do they become unemployed? 255 00:16:37,360 --> 00:16:39,640 Speaker 3: What happens to them? What happens to a society where 256 00:16:39,640 --> 00:16:45,680 Speaker 3: everybody is suddenly unemployed? And we are going to have 257 00:16:45,720 --> 00:16:48,680 Speaker 3: to face those questions a lot sooner than we 258 00:16:48,800 --> 00:16:51,520 Speaker 3: might think. We're going to have to... and I suspect 259 00:16:51,520 --> 00:16:52,680 Speaker 3: we're going to have to come up with a new 260 00:16:52,720 --> 00:16:59,000 Speaker 3: economic model to cope with that, even without an artificial 261 00:16:59,040 --> 00:17:02,080 Speaker 3: general intelligence. So the model we've got has only 262 00:17:02,120 --> 00:17:06,320 Speaker 3: really been around since the Industrial Revolution, when factories first 263 00:17:06,320 --> 00:17:08,920 Speaker 3: started coming on the scene. We're going to have to 264 00:17:08,960 --> 00:17:11,480 Speaker 3: think about, rather than a labor-based 265 00:17:11,560 --> 00:17:13,879 Speaker 3: model where you put in the hours and you get paid, 266 00:17:15,080 --> 00:17:17,600 Speaker 3: more of a dividend-based model, which is: you're a 267 00:17:17,680 --> 00:17:21,480 Speaker 3: citizen of this country, this country produces X amount of GDP, 268 00:17:21,720 --> 00:17:23,000 Speaker 3: you're entitled to a dividend. 269 00:17:23,680 --> 00:17:28,080 Speaker 1: Mm hmm, yes. What is that called? What is that called, 270 00:17:28,119 --> 00:17:33,560 Speaker 1: where everyone gets... what's that, you know, where everyone gets 271 00:17:33,640 --> 00:17:34,199 Speaker 1: just paid? 272 00:17:34,960 --> 00:17:40,439 Speaker 3: Yes. Yeah, so, uh, yeah, you've done it to me, 273 00:17:40,720 --> 00:17:42,680 Speaker 3: you've infected me. I've forgotten it too. 274 00:17:43,240 --> 00:17:48,960 Speaker 1: Don't blame me, right. But what is it called, Tiff? 275 00:17:49,080 --> 00:17:50,600 Speaker 3: What do they call it? It's 276 00:17:50,480 --> 00:17:58,040 Speaker 5: called UBI. UBI, which means universal... universal 277 00:17:58,080 --> 00:18:03,119 Speaker 5: basic... basic income. Yeah, so where every man, woman and 278 00:18:03,200 --> 00:18:06,040 Speaker 5: child is just given an amount which is sufficient for 279 00:18:06,080 --> 00:18:08,520 Speaker 5: them to live, and if they choose to work they 280 00:18:08,560 --> 00:18:13,960 Speaker 5: can, it doesn't affect whether they get that, and if they don't, 281 00:18:13,960 --> 00:18:14,520 Speaker 5: they don't. 282 00:18:14,880 --> 00:18:17,720 Speaker 3: And to me, that sounds like a pretty sensible solution, 283 00:18:18,920 --> 00:18:23,600 Speaker 3: in that... then people say, oh, it'll kill entrepreneurialism. Well, 284 00:18:23,640 --> 00:18:25,639 Speaker 3: no, not really.
If you're the kind of person that 285 00:18:25,720 --> 00:18:28,080 Speaker 3: wants to invent something and run with an idea, all 286 00:18:28,080 --> 00:18:29,840 Speaker 3: it means is that your rent and your food are 287 00:18:29,840 --> 00:18:31,199 Speaker 3: going to be paid for and you can go out 288 00:18:31,240 --> 00:18:33,919 Speaker 3: and see if you can make some more money. If 289 00:18:33,960 --> 00:18:36,400 Speaker 3: you're the kind of person who'd rather just go down 290 00:18:36,400 --> 00:18:38,600 Speaker 3: and help at the tuck shop, knowing that everything's taken 291 00:18:38,640 --> 00:18:42,680 Speaker 3: care of, then great, because we've lost all those volunteers, 292 00:18:42,920 --> 00:18:46,480 Speaker 3: and that would bring them back into the economy. Now, 293 00:18:47,000 --> 00:18:50,600 Speaker 3: you know, old-world economists pooh-pooh that and say, no, 294 00:18:51,080 --> 00:18:54,359 Speaker 3: people should be laboring, you know, eight hours a day, 295 00:18:54,520 --> 00:18:58,400 Speaker 3: or else, you know, what's the point. But really we're 296 00:18:58,400 --> 00:19:02,480 Speaker 3: getting to a point with robotics and AI where very, 297 00:19:02,600 --> 00:19:08,239 Speaker 3: very soon there'll be such massive unemployment that we have 298 00:19:08,320 --> 00:19:09,520 Speaker 3: to come up with a different model. 299 00:19:10,359 --> 00:19:16,200 Speaker 1: Yes, yes, universal basic income, that's it. 300 00:19:16,640 --> 00:19:16,880 Speaker 3: Yeah. 301 00:19:16,960 --> 00:19:21,560 Speaker 1: And the thing is, it's like... it's nice to, I 302 00:19:21,560 --> 00:19:24,199 Speaker 1: don't know, romanticize the past and all of that, and 303 00:19:24,240 --> 00:19:27,640 Speaker 1: to be... to, you know... and I do worry about 304 00:19:27,640 --> 00:19:31,959 Speaker 1: the idea of people who are disincentivized from working and 305 00:19:32,040 --> 00:19:35,160 Speaker 1: all of that. I worry about that. But as you said, 306 00:19:35,200 --> 00:19:36,960 Speaker 1: I think the people who want to go and create 307 00:19:37,000 --> 00:19:39,639 Speaker 1: and be entrepreneurial and build a business, they can still 308 00:19:39,680 --> 00:19:43,639 Speaker 1: do that. And I wonder... I guess the people who 309 00:19:43,680 --> 00:19:46,960 Speaker 1: would have done that anyway in our current 310 00:19:47,000 --> 00:19:50,240 Speaker 1: operating system will probably do that if something new comes 311 00:19:50,240 --> 00:19:54,000 Speaker 1: into place. But the... yeah, the practical reality is 312 00:19:54,040 --> 00:19:55,800 Speaker 1: that there are going to be so many jobs that 313 00:19:55,840 --> 00:20:00,359 Speaker 1: are made redundant with the ever-advancing and evolving... 314 00:20:01,240 --> 00:20:03,920 Speaker 3: We kind of did a test case on this in Australia, 315 00:20:03,960 --> 00:20:10,480 Speaker 3: at least, during COVID, because in COVID, essentially that happened 316 00:20:10,520 --> 00:20:13,760 Speaker 3: all of a sudden. Yes, a lot of people were 317 00:20:13,800 --> 00:20:18,920 Speaker 3: out of work, very, very suddenly, and the government solved 318 00:20:19,000 --> 00:20:25,520 Speaker 3: that by essentially doubling all of the unemployment and social 319 00:20:25,560 --> 00:20:31,480 Speaker 3: security benefits.
And the interesting thing, and this will be 320 00:20:31,480 --> 00:20:33,879 Speaker 3: studied for a while, I'm sure, the interesting thing about 321 00:20:34,040 --> 00:20:39,000 Speaker 3: it is what that did to things like crime rates and 322 00:20:40,560 --> 00:20:45,320 Speaker 3: anxiety and depression and things like that that we accept 323 00:20:45,359 --> 00:20:47,840 Speaker 3: as just a natural part of society. But they all 324 00:20:47,880 --> 00:20:53,280 Speaker 3: took a massive dive, a massive dive, during that time. So 325 00:20:53,359 --> 00:20:56,800 Speaker 3: when you remove stress from people's lives, which is what 326 00:20:56,880 --> 00:20:59,400 Speaker 3: you introduce when you ask them to live on less 327 00:20:59,440 --> 00:21:06,320 Speaker 3: than they can live on, then suddenly they get to not 328 00:21:06,800 --> 00:21:12,399 Speaker 3: be anxious, not be depressed, and not be in a 329 00:21:13,000 --> 00:21:17,600 Speaker 3: state where their brain loses impulse control and crime becomes 330 00:21:18,320 --> 00:21:20,640 Speaker 3: either necessary or involuntary. 331 00:21:21,359 --> 00:21:21,719 Speaker 5: Mm hm. 332 00:21:24,040 --> 00:21:26,480 Speaker 1: Did you say that anxiety and depression dropped? 333 00:21:27,240 --> 00:21:33,119 Speaker 3: Yes, during COVID, anxiety and depression dropped. So during 334 00:21:33,160 --> 00:21:36,440 Speaker 3: the... but after it, it's gone through the roof. 335 00:21:36,720 --> 00:21:40,959 Speaker 1: Yeah, yeah, yeah, yeah, yeah. I'm taking a little bit 336 00:21:40,960 --> 00:21:43,840 Speaker 1: of a left turn. But interestingly, we did an episode 337 00:21:44,000 --> 00:21:46,600 Speaker 1: a week or so ago with Patrick, who's our other 338 00:21:46,680 --> 00:21:52,440 Speaker 1: regular who does kind of tech stuff as well. He's 339 00:21:52,480 --> 00:21:56,040 Speaker 1: not as far down the AI rabbit hole as 340 00:21:56,080 --> 00:21:59,080 Speaker 1: you are, but he got a bit of my voice, 341 00:21:59,119 --> 00:22:01,600 Speaker 1: and this is like a simpler version of what 342 00:22:01,640 --> 00:22:04,760 Speaker 1: your son did to you. But anyway, he said to me... 343 00:22:04,840 --> 00:22:06,800 Speaker 1: we started the show and he said, how come you 344 00:22:06,840 --> 00:22:09,480 Speaker 1: never let me intro? And I went, all right, well, 345 00:22:09,520 --> 00:22:12,360 Speaker 1: off you go, and then he played this thing of 346 00:22:12,440 --> 00:22:16,720 Speaker 1: me, which... he deepfaked me or whatever, and it's 347 00:22:16,760 --> 00:22:19,600 Speaker 1: basically me telling the world I'm a fuckwit and 348 00:22:19,640 --> 00:22:22,480 Speaker 1: he's brilliant, and you know, it was just... it was 349 00:22:22,520 --> 00:22:25,200 Speaker 1: about one minute of me shooting myself in the foot. 350 00:22:25,200 --> 00:22:27,639 Speaker 1: But honestly, it 351 00:22:27,480 --> 00:22:31,200 Speaker 3: was... Wait till he does it with video and you're 352 00:22:31,200 --> 00:22:32,760 Speaker 3: in a porno without realizing it. 353 00:22:33,000 --> 00:22:37,159 Speaker 1: Oh hell. I mean, yeah, yeah. And it was pretty 354 00:22:37,200 --> 00:22:42,080 Speaker 1: much indistinguishable, except we knew, you know. I had 355 00:22:42,080 --> 00:22:44,840 Speaker 1: to interrupt, I'm like, hey everyone, this isn't me, because 356 00:22:44,880 --> 00:22:49,240 Speaker 1: it sounded so real.
And I said, how much... like, 357 00:22:49,400 --> 00:22:51,399 Speaker 1: I don't know, how much of my voice did you 358 00:22:51,440 --> 00:22:53,800 Speaker 1: need to get to create that? He said, like, about 359 00:22:53,840 --> 00:22:56,680 Speaker 1: a thirty-second grab of your voice. And he said 360 00:22:56,880 --> 00:23:00,720 Speaker 1: it took me like five minutes, three minutes, to make 361 00:23:00,960 --> 00:23:04,240 Speaker 1: the whole kind of minute of... yeah. I'm like, that's... 362 00:23:04,840 --> 00:23:07,560 Speaker 1: and it's still pretty rudimentary at the moment for what 363 00:23:07,640 --> 00:23:08,199 Speaker 1: it's going to be. 364 00:23:09,119 --> 00:23:11,240 Speaker 3: Yeah, although they're getting very good now, some of the 365 00:23:11,320 --> 00:23:15,240 Speaker 3: voice models. You know, when you're talking to them, they 366 00:23:15,320 --> 00:23:19,000 Speaker 3: really make you doubt it, because they do things like 367 00:23:19,080 --> 00:23:23,719 Speaker 3: pause, like that, you know, and 368 00:23:23,760 --> 00:23:28,120 Speaker 3: they have an intake of breath, even though they're 369 00:23:28,119 --> 00:23:32,639 Speaker 3: not breathing, so it starts to sound the way a human 370 00:23:32,640 --> 00:23:37,440 Speaker 3: would sound. And because of all those little signs, if 371 00:23:37,440 --> 00:23:41,760 Speaker 3: you're listening to them, you're losing a lot of the 372 00:23:41,840 --> 00:23:44,640 Speaker 3: cues that would normally lead you to suspect, I'm talking 373 00:23:44,720 --> 00:23:49,679 Speaker 3: to a robot here, you know. And they're getting 374 00:23:49,800 --> 00:23:52,800 Speaker 3: close to that with video. To me, video is still 375 00:23:52,800 --> 00:23:57,560 Speaker 3: detectable if you look very, very closely and just watch 376 00:23:57,600 --> 00:24:02,560 Speaker 3: for little tells. But it is close. It is really close. 377 00:24:02,640 --> 00:24:05,959 Speaker 3: And it is almost to the point where if you 378 00:24:06,000 --> 00:24:09,679 Speaker 3: see video of a politician on the television saying something, 379 00:24:10,320 --> 00:24:13,720 Speaker 3: are you absolutely certain that that is a real person 380 00:24:13,840 --> 00:24:17,040 Speaker 3: doing that? At the moment, you'd say, I am absolutely certain. 381 00:24:17,240 --> 00:24:20,040 Speaker 3: I'm sure no one is getting inside the machine here 382 00:24:20,080 --> 00:24:22,639 Speaker 3: and doing that. But we are very close to the 383 00:24:22,640 --> 00:24:25,399 Speaker 3: point where you won't be able to tell. And 384 00:24:25,440 --> 00:24:27,639 Speaker 3: I noticed that there was talk yesterday on the news, 385 00:24:27,960 --> 00:24:31,000 Speaker 3: I can't remember where I heard it, but our government 386 00:24:31,040 --> 00:24:36,159 Speaker 3: is talking about outlawing deepfake videos. So, rather like 387 00:24:36,200 --> 00:24:39,360 Speaker 3: where we started this, where they're talking about outlawing access 388 00:24:39,359 --> 00:24:42,240 Speaker 3: to social media for under-sixteen-year-olds, if they're 389 00:24:42,560 --> 00:24:44,679 Speaker 3: getting to the point where they have to talk about 390 00:24:44,760 --> 00:24:50,679 Speaker 3: outlawing deepfake videos, then there's some stuff going on 391 00:24:50,760 --> 00:24:53,120 Speaker 3: out there that you probably are not encountering. I see 392 00:24:53,200 --> 00:24:57,640 Speaker 3: Tiff's nodding.
She's probably continuously deleting all the deepfake 393 00:24:57,720 --> 00:24:58,440 Speaker 3: videos of her. 394 00:25:00,400 --> 00:25:01,480 Speaker 2: I think it's terrifying. 395 00:25:02,080 --> 00:25:08,560 Speaker 3: Yeah, well, it's especially terrifying for a woman, because, 396 00:25:09,160 --> 00:25:14,760 Speaker 3: you know, these things are often used for explicit, violent 397 00:25:15,800 --> 00:25:20,200 Speaker 3: sexual content, and that's the kind... that's why they're worried 398 00:25:20,200 --> 00:25:24,560 Speaker 3: about them. So, you know, because you can train 399 00:25:24,640 --> 00:25:27,720 Speaker 3: one of these models with someone's photo. So all you 400 00:25:27,760 --> 00:25:32,080 Speaker 3: need is a photo of someone fully clothed, and 401 00:25:33,119 --> 00:25:36,320 Speaker 3: show it to one of these models and create whatever 402 00:25:36,400 --> 00:25:37,520 Speaker 3: you like from it. 403 00:25:38,119 --> 00:25:41,800 Speaker 1: Yeah. Yeah, well I notice more and more... like, on 404 00:25:41,880 --> 00:25:46,120 Speaker 1: Instagram I follow like lots of bike stuff 405 00:25:46,160 --> 00:25:51,120 Speaker 1: and car stuff, and health and fitness stuff, and more 406 00:25:51,200 --> 00:25:54,040 Speaker 1: and more of the, you know... it's not like fifty percent, 407 00:25:54,119 --> 00:25:56,639 Speaker 1: but I would say ten percent of the images or 408 00:25:56,720 --> 00:26:00,439 Speaker 1: videos that come onto my thing now, I'm like... Like, 409 00:26:00,520 --> 00:26:02,479 Speaker 1: this thing, I don't know why this thing came up 410 00:26:02,520 --> 00:26:04,880 Speaker 1: this morning, and it said something like the most 411 00:26:04,880 --> 00:26:07,080 Speaker 1: beautiful birds in the world, right, because I think I 412 00:26:07,160 --> 00:26:09,120 Speaker 1: have lots of dog videos come up and a few 413 00:26:09,160 --> 00:26:11,960 Speaker 1: other animal videos. I'm like, these are the most... I'm 414 00:26:12,000 --> 00:26:15,720 Speaker 1: like looking at these birds, and it did not 415 00:26:15,960 --> 00:26:20,480 Speaker 1: at all look like computer generated, it just looked 416 00:26:20,520 --> 00:26:25,040 Speaker 1: like these incredible birds in nature. And then I'm like, 417 00:26:25,280 --> 00:26:27,960 Speaker 1: where... what kind of birds are these? And I 418 00:26:28,040 --> 00:26:30,040 Speaker 1: went into the thing, and then someone said this is... 419 00:26:30,200 --> 00:26:32,720 Speaker 1: this is AI, and then a whole bunch of people 420 00:26:32,760 --> 00:26:34,960 Speaker 1: who were younger and smarter than me went, yeah, these 421 00:26:35,000 --> 00:26:38,840 Speaker 1: are not photos, this is AI. I absolutely could not 422 00:26:39,080 --> 00:26:43,040 Speaker 1: tell at all that they weren't photos. And then I 423 00:26:43,080 --> 00:26:44,959 Speaker 1: saw this thing the other day of a guy, a 424 00:26:45,000 --> 00:26:49,240 Speaker 1: motorcyclist, you know, a GP rider, a MotoGP rider, 425 00:26:49,880 --> 00:26:52,840 Speaker 1: and he's doing this thing, and it was this incredible thing. 426 00:26:52,880 --> 00:26:54,639 Speaker 1: I don't know if you saw this, Tiff, because you 427 00:26:54,680 --> 00:26:57,440 Speaker 1: follow motorbike stuff as well, but he had his knee 428 00:26:57,480 --> 00:26:59,439 Speaker 1: and his elbow on the ground and he just missed this...
429 00:26:59,600 --> 00:27:03,000 Speaker 1: It was like the most unbelievable riding ever, and nothing 430 00:27:03,080 --> 00:27:06,480 Speaker 1: about it looked like it was CGI or computer generated 431 00:27:06,520 --> 00:27:09,760 Speaker 1: or whatever. And it was... I'm like, I don't know now. 432 00:27:10,160 --> 00:27:12,760 Speaker 1: Like, already I don't know with some of the 433 00:27:12,880 --> 00:27:16,040 Speaker 1: videos what is real. And when I see something that's 434 00:27:16,080 --> 00:27:18,760 Speaker 1: mind-blowing, I have to go into the comments because 435 00:27:18,760 --> 00:27:20,840 Speaker 1: somebody else will say, no, this is not real. 436 00:27:21,480 --> 00:27:23,720 Speaker 3: You know. Yeah, but you can't be certain. I mean, 437 00:27:23,760 --> 00:27:28,159 Speaker 3: I see that... I think Google released their video 438 00:27:28,520 --> 00:27:33,639 Speaker 3: creation tool today or yesterday, where you just give it 439 00:27:33,640 --> 00:27:35,359 Speaker 3: a prompt, you say what you want a video of, 440 00:27:35,880 --> 00:27:39,439 Speaker 3: and, you know... So this is the cleaner side of 441 00:27:39,440 --> 00:27:41,800 Speaker 3: deepfakes. But you can hand it a photograph and you 442 00:27:41,840 --> 00:27:45,639 Speaker 3: can say, make this photograph move. So one of 443 00:27:45,640 --> 00:27:47,680 Speaker 3: the examples they gave is that there's a girl holding 444 00:27:47,680 --> 00:27:51,120 Speaker 3: a puppy or something, it's just a photograph, and they make 445 00:27:51,160 --> 00:27:53,439 Speaker 3: it move, and she turns and looks away, and the 446 00:27:53,440 --> 00:27:59,080 Speaker 3: puppy starts moving, and it all looks absolutely cinematographically real. 447 00:28:00,080 --> 00:28:01,240 Speaker 1: What... what's that called? 448 00:28:01,440 --> 00:28:09,120 Speaker 3: Oh, it's the Google competitor for Sora, which 449 00:28:09,160 --> 00:28:13,240 Speaker 3: is the ChatGPT, OpenAI one. So Sora is 450 00:28:13,280 --> 00:28:16,320 Speaker 3: its name, Sora, but what was the Google one called? 451 00:28:16,359 --> 00:28:18,560 Speaker 3: I can't remember. Tiff will find it in an instant. 452 00:28:19,400 --> 00:28:21,600 Speaker 1: So Tiff, you find that if you could and get back 453 00:28:21,600 --> 00:28:24,560 Speaker 1: to us. So could you just, like... I just want 454 00:28:24,600 --> 00:28:29,080 Speaker 1: to come full circle and we'll finish soon. But so, 455 00:28:29,119 --> 00:28:31,400 Speaker 1: we have a lot of listeners who are not your 456 00:28:31,400 --> 00:28:34,480 Speaker 1: typical podcast listeners. So of course we have people in 457 00:28:34,480 --> 00:28:37,119 Speaker 1: their twenties and all of that, but we probably have 458 00:28:37,160 --> 00:28:40,000 Speaker 1: a disproportionate number of listeners who are somewhere in the 459 00:28:40,040 --> 00:28:42,959 Speaker 1: forty to sixty range, which is a little bit atypical 460 00:28:43,000 --> 00:28:47,960 Speaker 1: for podcasts. Just remind... I'm really asking this for myself, 461 00:28:48,000 --> 00:28:51,920 Speaker 1: everyone. Just remind our listeners what artificial... 462 00:28:52,240 --> 00:28:54,320 Speaker 3: Sorry, I just found the name, by the way. Beat 463 00:28:54,360 --> 00:29:00,640 Speaker 3: you, Tiff, only just. Veo. Yeah, totally, V-E-O. 464 00:29:02,040 --> 00:29:02,640 Speaker 2: DeepMind.
465 00:29:03,000 --> 00:29:05,320 Speaker 3: See, while you were talking, Craig, neither Tiff nor I 466 00:29:05,360 --> 00:29:06,560 Speaker 3: were listening to a word you said. 467 00:29:08,640 --> 00:29:10,080 Speaker 1: I was telling everybody what a fuck... 468 00:29:10,120 --> 00:29:10,560 Speaker 3: What you are? 469 00:29:11,000 --> 00:29:11,120 Speaker 1: Oh! 470 00:29:11,240 --> 00:29:12,080 Speaker 3: Good, good. 471 00:29:12,160 --> 00:29:16,440 Speaker 1: Good. So, so, Veo... is that... I'll come back to 472 00:29:16,480 --> 00:29:19,040 Speaker 1: my question. But is that an app that we buy, 473 00:29:19,120 --> 00:29:20,640 Speaker 1: or is that a program that we can just... 474 00:29:20,720 --> 00:29:22,520 Speaker 3: What is it? It's just a website. You can go 475 00:29:22,600 --> 00:29:25,920 Speaker 3: to Google, I think it's Google Labs. Tiff's probably got 476 00:29:25,920 --> 00:29:29,760 Speaker 3: the detail there, so you can go and try it out. 477 00:29:30,120 --> 00:29:31,479 Speaker 2: I'm signing up as we speak. 478 00:29:31,760 --> 00:29:34,040 Speaker 3: There we go, Tiff's producing videos of you, 479 00:29:34,120 --> 00:29:34,360 Speaker 1: Craig. 480 00:29:34,400 --> 00:29:36,480 Speaker 2: Watch, I want to put a photo of Harps in there, 481 00:29:36,520 --> 00:29:38,040 Speaker 2: and I'm going to say, put this person in a 482 00:29:38,080 --> 00:29:41,760 Speaker 2: red dress and make them dance like an exotic belly dancer. 483 00:29:42,480 --> 00:29:44,560 Speaker 1: So now, if I did that to you, I'd get 484 00:29:44,600 --> 00:29:48,480 Speaker 1: a thousand hate emails. You're not allowed to do that. 485 00:29:49,200 --> 00:29:50,760 Speaker 1: You're not allowed to sexualize me. 486 00:29:52,400 --> 00:29:54,520 Speaker 3: Stop sexualizing... blue dress. 487 00:29:56,240 --> 00:29:59,200 Speaker 1: Just tell everyone what AGI is again. That's... 488 00:29:59,040 --> 00:30:01,320 Speaker 3: Artificial general intelligence. 489 00:30:01,520 --> 00:30:02,720 Speaker 1: And what does that mean, though? 490 00:30:03,920 --> 00:30:08,520 Speaker 3: That means, well, human intelligence, basically human-like intelligence, where 491 00:30:08,560 --> 00:30:12,360 Speaker 3: it's making its own adventure. It's not reacting to 492 00:30:12,440 --> 00:30:14,920 Speaker 3: what you tell it to do. It's figuring out what 493 00:30:15,000 --> 00:30:18,680 Speaker 3: to do on its own and deciding, oh, this is 494 00:30:18,720 --> 00:30:20,120 Speaker 3: the way I want to go with this, or this 495 00:30:20,200 --> 00:30:21,760 Speaker 3: is the way I want to go with this. I 496 00:30:21,800 --> 00:30:23,400 Speaker 3: don't know if you've played with some of the more 497 00:30:23,440 --> 00:30:27,080 Speaker 3: advanced AIs, but say something like Google Gemini, which 498 00:30:27,120 --> 00:30:31,320 Speaker 3: is their latest, really, really good one. You can say 499 00:30:31,320 --> 00:30:34,120 Speaker 3: to it, oh, look, I want to write a science 500 00:30:34,160 --> 00:30:39,280 Speaker 3: fiction book. Give me a narrative arc that matches the 501 00:30:39,320 --> 00:30:43,720 Speaker 3: greatest narrative arcs in history, and it'll generate the plot 502 00:30:43,720 --> 00:30:45,440 Speaker 3: outline for you in about a second. And then you 503 00:30:45,480 --> 00:30:48,760 Speaker 3: can say to it, oh yeah, now give 504 00:30:48,760 --> 00:30:51,360 Speaker 3: me the first chapter, and it'll just start writing.
And 505 00:30:51,440 --> 00:30:54,280 Speaker 3: then you just keep going, and after a 506 00:30:54,280 --> 00:30:59,360 Speaker 3: while you've got a book, and it's beautifully written. The plot structure is perfect, 507 00:30:59,480 --> 00:31:03,280 Speaker 3: the narrative structure is perfect. You know, you can tell 508 00:31:03,320 --> 00:31:07,360 Speaker 3: it to be politically correct with its characterizations, etc., depending 509 00:31:07,400 --> 00:31:10,680 Speaker 3: on what audience you want. But all of that, I'm doing it. 510 00:31:11,000 --> 00:31:12,840 Speaker 3: I'm having to do the thinking, I'm having to do 511 00:31:12,880 --> 00:31:15,560 Speaker 3: the driving, I'm having to think about what I want 512 00:31:15,600 --> 00:31:19,400 Speaker 3: it to do. And now it does it almost magically 513 00:31:19,560 --> 00:31:23,280 Speaker 3: and very, very quickly, but it's still me doing the thinking. 514 00:31:23,840 --> 00:31:27,440 Speaker 3: Whereas, if you imagine, an AGI would just decide, oh, 515 00:31:27,600 --> 00:31:30,640 Speaker 3: I think I will write the greatest science fiction novel 516 00:31:30,680 --> 00:31:33,479 Speaker 3: ever, and just start doing it, and then, you know, 517 00:31:33,560 --> 00:31:37,040 Speaker 3: within one second it's probably published it on Amazon, it's 518 00:31:37,080 --> 00:31:41,280 Speaker 3: probably got fifty thousand reviews, it's probably pushing it out 519 00:31:41,280 --> 00:31:44,520 Speaker 3: through the distribution networks, and all of that because it 520 00:31:44,560 --> 00:31:47,360 Speaker 3: felt like doing it, because it decided to do it, 521 00:31:47,720 --> 00:31:49,560 Speaker 3: not because anyone told it to do it. 522 00:31:50,040 --> 00:31:53,600 Speaker 1: Yeah. Yeah, yeah, I was sitting in the cafe this 523 00:31:53,720 --> 00:32:00,280 Speaker 1: morning mucking around with ChatGPT, or GPT-4 or 524 00:32:00,320 --> 00:32:02,920 Speaker 1: whatever it's called. Yeah, yeah, yeah, and I write this... 525 00:32:03,000 --> 00:32:05,480 Speaker 1: I wrote, write me a piece exploring this question: is the 526 00:32:05,560 --> 00:32:07,600 Speaker 1: mind a real thing, or is it just a construct 527 00:32:07,600 --> 00:32:12,120 Speaker 1: created by humans to explain cognition, consciousness and awareness that 528 00:32:12,200 --> 00:32:15,320 Speaker 1: arises from the brain, or maybe you have another theory. 529 00:32:15,800 --> 00:32:18,280 Speaker 1: And literally, twenty seconds... I won't read the thing, if 530 00:32:18,320 --> 00:32:21,040 Speaker 1: you want to see it, everyone, it's on my Facebook, but it 531 00:32:21,200 --> 00:32:28,160 Speaker 1: wrote this fucking pretty amazing kind of exploratory piece on that question. 532 00:32:28,400 --> 00:32:33,200 Speaker 1: And it was probably finished in fifteen seconds, twenty seconds. Yep. 533 00:32:33,400 --> 00:32:36,200 Speaker 1: I'm like... shit. If I had written what it 534 00:32:36,440 --> 00:32:39,520 Speaker 1: cranked out... probably I couldn't have written it, to be honest, 535 00:32:39,640 --> 00:32:41,800 Speaker 1: maybe, I'm not sure, but it would have taken me 536 00:32:41,880 --> 00:32:42,440 Speaker 1: half a day. 537 00:32:43,760 --> 00:32:46,840 Speaker 3: It would have taken probably more than that. It's... yeah, 538 00:32:46,880 --> 00:32:49,360 Speaker 3: the thing that's incredible to me... I mean, I use 539 00:32:49,400 --> 00:32:52,160 Speaker 3: it in a way that I suspect few people do.
540 00:32:52,480 --> 00:32:55,400 Speaker 3: I feel like starting a club called the Dead Authors Society, 541 00:32:55,480 --> 00:32:58,360 Speaker 3: because I have authors that I like to read, but 542 00:32:58,480 --> 00:33:04,000 Speaker 3: they're not producing anything more because they're dead. And you 543 00:33:04,000 --> 00:33:07,280 Speaker 3: can say to an AI, give it a sample piece 544 00:33:07,280 --> 00:33:11,880 Speaker 3: of writing about anything, and say, rewrite this in the 545 00:33:11,920 --> 00:33:15,320 Speaker 3: style of whatever author it is that you like, and 546 00:33:15,520 --> 00:33:17,120 Speaker 3: off it'll go. And then, as you say, a few 547 00:33:17,160 --> 00:33:20,320 Speaker 3: seconds later, there you've got the piece rewritten in the 548 00:33:20,360 --> 00:33:22,640 Speaker 3: style you like. But it can do more than that. 549 00:33:22,720 --> 00:33:25,800 Speaker 3: You can say... you know, so, for example, one of 550 00:33:25,800 --> 00:33:30,560 Speaker 3: my favorite authors is AA Gill, who wrote blistering, sarcastic 551 00:33:30,600 --> 00:33:33,760 Speaker 3: critiques of restaurants when he was alive. He's now dead, 552 00:33:34,760 --> 00:33:37,640 Speaker 3: but I really enjoyed reading them because they're so hilarious. 553 00:33:37,880 --> 00:33:40,640 Speaker 3: The metaphors he would come up with, just incredible. And 554 00:33:42,680 --> 00:33:45,840 Speaker 3: because I still like reading him, I'll often pick a 555 00:33:45,880 --> 00:33:49,200 Speaker 3: restaurant or a film or something and just ask it 556 00:33:49,240 --> 00:33:51,760 Speaker 3: to write a review in the style of AA Gill, 557 00:33:52,000 --> 00:33:55,280 Speaker 3: and three or four seconds later, there it is, and 558 00:33:55,320 --> 00:33:57,360 Speaker 3: I get to read it as if he's still alive 559 00:33:57,440 --> 00:33:58,560 Speaker 3: writing reviews. 560 00:33:59,160 --> 00:34:01,640 Speaker 1: So no Hunter S. Thompson requests in there 561 00:34:01,440 --> 00:34:05,680 Speaker 3: for you? No, no, no. But anyone could do it 562 00:34:05,760 --> 00:34:07,760 Speaker 3: for any style. Like, there was... it was interesting, my 563 00:34:08,320 --> 00:34:12,319 Speaker 3: daughters were watching Bridgerton, the latest something-or-other has come 564 00:34:12,360 --> 00:34:14,600 Speaker 3: out of it. I can't stand the stuff myself, but 565 00:34:14,719 --> 00:34:16,839 Speaker 3: there's a... there's a certain style to the way 566 00:34:16,920 --> 00:34:20,520 Speaker 3: the narration is done in that. And I was working 567 00:34:20,560 --> 00:34:23,400 Speaker 3: on a piece and I just dropped the piece into 568 00:34:23,440 --> 00:34:25,600 Speaker 3: an AI and said, rewrite in the style of the 569 00:34:25,600 --> 00:34:31,080 Speaker 3: Bridgerton narrator. And it did it perfectly, like, just mind-blowingly 570 00:34:31,120 --> 00:34:34,040 Speaker 3: perfectly. You know, it could have been 571 00:34:34,080 --> 00:34:35,040 Speaker 3: part of the script. 572 00:34:36,800 --> 00:34:38,120 Speaker 1: What did you use for that? 573 00:34:38,760 --> 00:34:44,759 Speaker 3: Just Gemini, the Google Gemini, the Google competitor. I 574 00:34:44,800 --> 00:34:47,440 Speaker 3: find it a little bit more precise than the Chat one. 575 00:34:48,120 --> 00:34:50,279 Speaker 3: So it... so I pay... 576 00:34:50,400 --> 00:34:52,440 Speaker 1: I think I pay twenty bucks a month or something.
577 00:34:52,560 --> 00:34:56,040 Speaker 1: So if I want Google Gemini, I need to buy 578 00:34:56,080 --> 00:34:56,840 Speaker 1: that as well. 579 00:34:57,239 --> 00:35:00,279 Speaker 3: It has a... it does cost. But if 580 00:35:00,280 --> 00:35:03,160 Speaker 3: you use Google Storage, which I do, I think it's 581 00:35:03,160 --> 00:35:06,520 Speaker 3: included in, like... because I pay a fair bit 582 00:35:06,560 --> 00:35:09,320 Speaker 3: for Google Storage, because I like to have the automatic backups, 583 00:35:09,719 --> 00:35:13,560 Speaker 3: so it's included in whatever I'm paying there for that. 584 00:35:14,440 --> 00:35:19,320 Speaker 1: Yeah. Yeah. So as we wind up, so AGI 585 00:35:19,520 --> 00:35:24,040 Speaker 1: is not... it's not... it's not a matter of 586 00:35:24,160 --> 00:35:26,879 Speaker 1: if it will eventuate, but when. Am I right? 587 00:35:27,480 --> 00:35:31,280 Speaker 3: Yeah. And, you know, for readers who are curious 588 00:35:31,280 --> 00:35:33,120 Speaker 3: to read more, and I'll send you a link to 589 00:35:33,160 --> 00:35:35,200 Speaker 3: this so Tiff doesn't have to look too hard, although 590 00:35:35,280 --> 00:35:38,360 Speaker 3: you'll find it before I even finish, there's a blog 591 00:35:38,400 --> 00:35:43,680 Speaker 3: called... there's a blog called Wait But Why. And it's 592 00:35:44,360 --> 00:35:48,880 Speaker 3: written by a really interesting bloke who... he just 593 00:35:49,120 --> 00:35:54,440 Speaker 3: does these extremely long-form thought pieces on, you know, 594 00:35:54,480 --> 00:35:56,799 Speaker 3: how things work, essentially, and one of the ones he's 595 00:35:56,840 --> 00:36:00,960 Speaker 3: done is on artificial general intelligence. And he did it 596 00:36:01,080 --> 00:36:03,480 Speaker 3: years and years ago, before we had all this AI stuff. 597 00:36:03,480 --> 00:36:06,120 Speaker 3: I think he wrote it maybe ten years ago. But 598 00:36:06,160 --> 00:36:09,200 Speaker 3: he takes it to the logical conclusion about what that 599 00:36:09,280 --> 00:36:13,120 Speaker 3: would look like and what it would do, and it's 600 00:36:13,560 --> 00:36:15,400 Speaker 3: a riveting piece to read, if you've got the 601 00:36:15,440 --> 00:36:19,640 Speaker 3: patience, because it's lengthy, but it's worth a read. 602 00:36:21,120 --> 00:36:24,320 Speaker 1: Wait But Why... I've googled it and it's a store, 603 00:36:24,880 --> 00:36:26,040 Speaker 1: Women's Needs Plus. 604 00:36:26,280 --> 00:36:30,080 Speaker 3: No, I feel you may have done something wrong there, Craig. 605 00:36:30,360 --> 00:36:31,160 Speaker 3: Let me just see... 606 00:36:31,200 --> 00:36:35,239 Speaker 2: Scroll down, because, yeah, well, it's the same website. I 607 00:36:35,320 --> 00:36:36,400 Speaker 2: think there's two lists. 608 00:36:37,480 --> 00:36:39,960 Speaker 3: It's actually the first... here, I'll drop it in the 609 00:36:40,040 --> 00:36:43,160 Speaker 3: chat for you, the exact piece. Here it is, I'll 610 00:36:43,160 --> 00:36:44,680 Speaker 3: put the link in the chat. 611 00:36:46,000 --> 00:36:51,080 Speaker 1: Wow, everyone, you're hearing the bloody inside stuff here. There you go. 612 00:36:51,680 --> 00:36:53,960 Speaker 1: All right, thank you. Oh, hang on, I'm going to 613 00:36:53,960 --> 00:36:55,560 Speaker 1: click on that. Oh, there it 614 00:36:55,480 --> 00:36:58,640 Speaker 3: is. Twenty fifteen he wrote it, so nine years ago. 615 00:36:59,200 --> 00:37:01,319 Speaker 1: He was a visionary, sorry, ahead of his time.
We 616 00:37:01,440 --> 00:37:04,120 Speaker 1: appreciate you. We'll say goodbye off air, but for the minute, 617 00:37:04,320 --> 00:37:04,680 Speaker 1: thanks, 618 00:37:04,960 --> 00:37:10,240 Speaker 3: Gillespo. Absolute pleasure as always. Thanks Tiff, thanks kids.