Speaker 1 [00:00:00]: What's up, everybody. Welcome to Financial Heresy, where we talk about how money works so that you can make more, keep more, and give more. Today I've got a guest on the channel who has been here a couple of times, repeat guest Alex Svetsky, a good friend of mine, and we're having a great conversation today. I'm really excited for you, because we're talking about artificial intelligence. Specifically, number one, the main fears being pushed about AI and how they're completely unfounded. Number two, the real risk that's being distracted from, the "don't look at the man behind the curtain" risk with AI that people are being steered away from. And then finally, which I think is most important, Alex is building a solution, an alternative, so that people don't have to get sucked in with the crowd. So really excited, got a good episode for you today. Thanks, Alex, for joining. All right, Alex, thank you so much for joining me today. Really excited to have this conversation with you.

Speaker 2 [00:01:02]: Joe, good to see you again, man. It's been a while.

Speaker 1 [00:01:05]: It has been a while. We've spoken in the past about some things where you've always had a bit of a controversial, anti-consensus opinion. I remember our first conversation was when everybody was going all in on NFTs and decentralized social media networks, and you said, no, no, no, that's not the solution, that's a non-solution for the wrong problem. It turns out you were right; everything you were talking about has been crashing. And now your latest endeavor is attacking a much bigger dragon with artificial intelligence. So what is it that you've been building and looking into, and how are we getting this AI thing all wrong?

Speaker 2 [00:01:53]: Dude, thank you for that intro.
And yeah, let's try and get some context around the AI thing so that people can be on the same page. I actually did a podcast on my own show roughly two years ago now with a guy out of San Francisco called Rob Malco, and we were supposed to talk about AI. We spoke a little bit about GPT-2 back then, which was long before ChatGPT, and ChatGPT is orders of magnitude more effective and useful and powerful than that was. We kind of ended up glossing over it; nobody really gave a shit, and we ended up talking about his life story growing up with deaf parents, and went down the Nietzsche rabbit hole and all sorts of other stuff. It was one of those things, like when you first hear about Bitcoin in 2012 and you're like, oh yeah, cool, whatever, and you move on to other interesting stuff. So obviously November last year came around, ChatGPT landed, and in January, February, March in particular it blew up, and everyone's running around with a new toy, a new hysteria. And alongside that hysteria and exuberance you get all sorts of stuff coming out of it: everything from the Yuval Harari wet dream of becoming a brain in a vat and computers that are going to run everything and everything getting hacked, through to the preppers, we're-all-going-to-die, AI-is-going-to-kill-us-all, Terminator, blah blah blah, and everything in between, right? And for me it was interesting. When I saw ChatGPT come out, it reminded me of that conversation Rob and I had. I was like, oh fuck, you know, and it was so much like Bitcoin. I was like, damn, I remember Max Keiser jumping up and down on the couch in 2012, I ignored it, then in 2015, 2016 I saw it again and was like, oh shit, there's something here.
So I did the same thing with this, and I just went down the rabbit hole. Between November and February, March, I did my thousand hours digging into: what is intelligence? What does this mean? And the more I dug, like, I even started writing a little blog on it called Authentic Intelligence on Substack, I was just digging through this stuff. And one of the big things I found, I did some polls on Twitter, I was talking to people, was that most people weren't exactly afraid. They sort of know that AI is all over the place. When you order something on Uber, algorithms essentially decide which car is going to come pick you up. Algorithms decide what stuff you're fed on search engines, on Twitter, on Instagram and all this sort of stuff. So we're kind of living in the age of quote-unquote AI anyway. But this language model thing was a big shift for people, because I think it's the first time in history, other than like a parrot, that we've been able to talk to something and have it talk back, right? And it kind of made sense, like, tell me a previous time in history we've ever done that. We haven't, really. And I use a parrot kind of as a joke, but it's honestly the case: a parrot is the only thing that's ever talked back to us and kind of understood it. Or, you know, when you see those dogs they've trained up to say a few words, we're like, oh my god, it's so intelligent, right? So we've got these programs now which can string together language, which are essentially, we said this offline, like a sophisticated autocomplete. And people's imagination caught fire and they're like, holy shit, artificial general intelligence, which is the big holy grail of AI, right, and you see OpenAI and all these guys talking about it.
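To make the "sophisticated autocomplete" framing concrete, here is a minimal sketch in Python. Nothing below comes from the show or from any real model: the two-word contexts, the probabilities, and the function names are invented for illustration. Real language models learn these statistics over huge vocabularies and long contexts, but the basic move, predict a plausible next token and repeat, is the same.

```python
import random

# Toy "sophisticated autocomplete": for each two-word context, a probability
# distribution over plausible next words. The contexts and numbers below are
# made up for illustration; a real LLM learns billions of such statistics.
NEXT_TOKEN_PROBS = {
    ("the", "cat"): {"sat": 0.6, "ran": 0.3, "slept": 0.1},
    ("cat", "sat"): {"on": 0.8, "down": 0.2},
    ("sat", "on"): {"the": 0.9, "a": 0.1},
}

def complete(prompt_tokens, max_new_tokens=3):
    """Repeatedly sample a plausible next token given the last two tokens."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        context = tuple(tokens[-2:])
        dist = NEXT_TOKEN_PROBS.get(context)
        if dist is None:  # no statistics for this context: stop generating
            break
        words, weights = zip(*dist.items())
        tokens.append(random.choices(words, weights=weights)[0])
    return " ".join(tokens)

print(complete(["the", "cat"]))  # e.g. "the cat sat on the"
```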
That, AGI, is like the number one goal. But people started to think, and are still thinking, that this is the dawn of real sentient, intelligent machines that are somehow going to either take over the world or obsolete humanity. You know, name the threat, name the issue, name the problem. So I started going down the rabbit hole and asked, basically, the big question: okay, what is everyone actually afraid of? It's not really AI, it's actually AGI. It's sentient intelligence, like another, alien type of intelligence that is more powerful than us. Okay, so if that's the fear, all right, what does that actually mean? So I tried to define artificial general intelligence, and what I found difficult was, wait a minute, we don't even have a consensus on what the word intelligence means. That's first and foremost still nebulous. And more important than that: we might be able to get some sort of consensus that intelligence is some sort of pattern recognition and probabilities and this and that. But then, okay, if that's some rough estimation or conceptualization of what intelligence is, then how many intelligences are there? I've obviously got cognitive intelligence, but if you look at the human being, you've got emotional intelligence, you've got hormonal, endocrine intelligence, you've got intelligence in your muscles, intelligence in your gut, you've got neurons all throughout the body. You've got intelligence in the way your bones form. You've got intuitive intelligence, the metaphysical kind of intelligence that's a bit more spiritual, instinctual in nature. Like, how the fuck do you even count all of those and define all of those and work through all of those?
And what I kind of came to realize, and I wrote about this in those essays, was, man, cognitive intelligence itself is a super broad concept, and language models are just a sliver of it. All we've really discovered with these new language models is that language is actually more pattern recognition than it is intelligence, and that's something we didn't realize before. I went back and read some old AI books like Nick Bostrom's Superintelligence, and in there he was adamant that by the time computers work out how to speak, they will already be artificially generally intelligent, they will have hit that escape velocity, and by the time we realize we can talk to machines, they will run the world. Well, he was one hundred percent wrong on that, right? So the only thing we've realized out of this whole modern AI experiment with language models is that language is not intelligence. It's not as deep as we thought it was. It's relatively probabilistic, same as imagery and art, these diffusion models, DALL-E, DALL-E 2, Midjourney and everything like that. They're just putting pixels together in a probabilistic fashion that we recognize as human beings. So anyway, I went on this whole rant about, hey, AGI is actually a scam. It's probably not going to happen in our lifetimes, if ever, because we are so far from just cognitive, cerebral intelligence, let alone general intelligence, that I think there's a deeper problem here that people who know better are using AGI as a red herring for. So I'll stop there for a moment; maybe we can pull on a couple of threads there. But that's sort of trying to debunk it: the fear is AGI, but it's a nebulous fear. It's undefined, very similar to climate change, right? The climate's going to kill us all.
We don't know how, why, or what, but it's freaky enough, and people can't understand it enough, that it's worthwhile creating another regulatory body to manage it and protect us from it and get more taxes out of us for the purpose of saving the world. Again.

Speaker 1 [00:10:01]: So basically, to summarize, if I understand this first fear correctly, it's that people are afraid we're going to have a Skynet, Terminator type of event: at some point the computer is going to become smarter than us, and whatever objective we have given it, it could potentially figure out a way to use that objective against humanity. But your point is that intelligence is so little understood, and what we do understand about it, we already know is way, way, way bigger than anything artificial general intelligence, or anything we've built, has even come close to accomplishing so far. You talked about language being more pattern recognition. Then you also mentioned, and I think this is the reason why full self-driving with Tesla has been delayed, you know, like a decade now, that object recognition is not actually something that seems possible with computers; you really need tool recognition, which is something even deeper, fundamentally embedded into the human psyche, that we don't understand how to translate to computers. It also assumes that all the intelligence humans have is, uh, what's the word, biological. It assumes that there's nothing transcendent, and we don't need to get into any sort of spiritual argument here, but it does assume that there's nothing, there's no...

Speaker 2 [00:11:42]: Mind, there's no meta that's in charge.

Speaker 1 [00:11:44]: Of the brain, you know what I mean. Because if there is, then that completely throws the whole thing out, like how would you be able to build that with, you know, electrodes?
So basically this whole fear-mongering that AI is about to take over is not the thing people should really be concerned about.

Speaker 2 [00:12:06]: Exactly, exactly. Before we get into what people should actually be concerned about, because I think there is a genuine danger here, which is of a very different kind: I was listening to a podcast by, I think it was John Vervaeke or something, and he made a claim which I really agree with, articulated so well, something along the lines of, we humans might be the universe's peak form of general intelligence. Because what people are finding with these language models is the typical take, you know: oh, if we just string together all of these different models and get each model doing a different thing, and then you place a governing model that selects between the different models, and then you have them clustered, and then another governing model, basically building up an intelligence hierarchy. What they find is that compute resources go through the fucking roof, complexity goes through the roof, the actual whole system slows down, and you get these kinds of diseconomies of scale. So it seems like humans, in some way, have all of these embedded layers upon layers upon layers of intelligence, and we're just sort of the right amount of everything. Yeah, we can't beat a computer specifically on math, right, but hey, we're good enough at math, and we're good enough dexterity-wise that we can do something. So computers can beat us in narrow domains or narrow dimensions, but as soon as you try to make the ultimate computer that does everything better than all of us, you'll fail.
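A rough sketch of the "governing model that selects different models" setup being described, and why stacking layers costs so much. Everything here is a hypothetical stand-in (the specialist functions, the keyword routing); the point is just that each extra layer of routing adds more calls, more latency, and more ways to mis-route, which is the diseconomy of scale mentioned above.

```python
# Hypothetical specialist "models", each good at one narrow thing.
def math_model(task: str) -> str:
    return f"[math model] worked answer for: {task}"

def code_model(task: str) -> str:
    return f"[code model] suggested patch for: {task}"

def prose_model(task: str) -> str:
    return f"[prose model] draft text for: {task}"

SPECIALISTS = {"math": math_model, "code": code_model, "prose": prose_model}

def governing_model(task: str) -> str:
    """Crude router: pick a specialist by keyword.

    In the clustered setups described above this router is itself a model,
    and routers get stacked on routers, so every added layer multiplies the
    inference calls and the chances of routing the task to the wrong place.
    """
    lowered = task.lower()
    if any(word in lowered for word in ("integral", "equation", "sum")):
        key = "math"
    elif any(word in lowered for word in ("bug", "function", "compile")):
        key = "code"
    else:
        key = "prose"
    return SPECIALISTS[key](task)

print(governing_model("Fix the bug in this function"))
```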
Like, you know, maybe the end point is actually that we end up creating another human being, and like, okay, well, you know.

Speaker 1 [00:13:46]: So, you know, we should have just gone and had babies, right? Right, that makes sense. Okay, and that's a good point, because this brings up the other fear about this. You've talked about how we build one specific program that can do one thing way better than any human can. It can beat us at chess, it can do math, it can do whatever it is; we could build a program that's better than people at this one thing. First, I'd like to make the note that that's all technology is, and all it's ever been. I mean, fire can get nutrients better than humans can by foraging alone; a spear or a tractor can do work better than humans. That's what human progress is: finding tools that do things better than we do. But that's one of the main fears. Even without AGI, the holy grail, people are afraid: they're gonna take my jobs. I'm not gonna have my cashier job, I'm not gonna have my truck driver job, I'm not gonna have my lawyer job, I'm not gonna have my doctor job, whatever the job is. We fear that, hey, this tool is going to replace me. Is that a fear that is unfounded? Is it gonna happen, but it's good? Is it not gonna happen? What are your thoughts on jobs being replaced?

Speaker 2 [00:15:04]: Yeah, good one. So, I haven't finished this article yet, but I was writing an article called Midwit Obsolescence Technology, and I was like, hey, AI should be renamed to this, right? Because a lot of these modern language models, particularly the mainstream ones like your ChatGPTs, are very good at basically just regurgitating mainstream language.

Speaker 1 [00:15:28]: Right.
Speaker 2 [00:15:28]: So if you're like a Vice reporter, or you're working for CNBC and you're writing this basic, run-of-the-mill journo stuff, you're fucked; ChatGPT can do that for you. But if you're someone with an opinion outside of the accepted Overton window, if you're not a midwit, for example, you're fine. I tried to use ChatGPT to help me write some stuff and I'm like, fuck, this thing sucks. Now, you can prompt and cajole it and try to, you know, clean this up, act like Nietzsche or speak like Jordan Peterson, but the hours you spend trying to prompt your way into something useful, you could have just written it yourself, and you may as well practice that. You may as well use that muscle in your head, because if you don't use it, you lose it, right? So beyond these sorts of language tasks, because they were always saying, oh, AI will replace rote tasks before it replaces creative tasks; well, writing is somewhat creative, and maybe, as I said, this midwit type of creative stuff, the general mainstream shit, is probably going to get replaced by these probability machines. So those people should be afraid. But things like cashiers, truck drivers and stuff like that, sure, we'll have automation, and I would actually place AI underneath automation as opposed to automation under AI. Technology and automation have always been about trying to do more with less, and that will continue. The thing is, a lot of these shifts are largely cultural.
And as you know, and I think as anyone listening to this knows, human beings are incredibly good at finding something else to do with their time when they have some free time. And this really just comes back to the kind of human being you are and how intentional you are about that spare time. So there are two threads I want to pull on here. Number one: let's say there are some genuinely useful bits of AI, or tools, let's say some more based language models, kind of like what we're trying to build now, et cetera. And you might, as an intentional person, use that to cut your daily writing time down from three hours to two hours or one hour, or use something to help automate your emails, this and that. Let's say you cut your working day down and you can be as effective in four hours instead of eight. Well, what do you then do? Do you spend the rest of the time on social media? Do you go on Netflix, you know, do you watch porn? What are you doing? Or are you maybe feeding your mind another way? Or are you actually going out there and using your hands, using your body? Do you go and do some jiu-jitsu, go for a run, go to the gym? This will open up time for other things, and people who are intentional about it will use that. Most people, I actually think, if there are these tools which automate a bunch of stuff and people end up with more time, my guess is that's where things like the metaverse, virtual reality, Netflix, Uber Eats, blah blah blah come in. It's that kind of longhouse, right? They'll just distract themselves into oblivion, you know, the WALL-E movie, basically.
So I don't think it's, you know, a new thing: automation has always done this, and it's always created more free time, which we never actually end up with as free time. We just fill it with something, and it's the same old story: are you gonna fill it with something useful, productive, growth, blah blah blah, or is it going to be in the bucket of wasting yourself away, wasting your time, wasting your energy, wasting your vitality, wasting your seed, et cetera? People are just gonna be confronted with the exact same fucking question. I mean, just think about it. ChatGPT came out six, seven months ago, and as much as I bag it out sometimes, it is still a fundamentally profound thing. I'm typing to a fucking computer and the computer is talking back to me; as much as it sounds like a little midwit and it's apologetic and all this sort of shit, that's fucking cool. People are already bored with it, man. People have already found other things to fill their time with. That's what the human condition is: it gets used to a situation and it fills its time again. It's never gonna change.

Speaker 1 [00:20:01]: Yeah, yeah. And the fear is, well, then there's going to be a small minority of people who are going to be doing all of the production, therefore they're going to have all the wealth. And that's just a fundamental misunderstanding of the difference between wealth and money. The same thing happened with the discovery of petroleum and electricity: that drastically increased the wealth of everybody around the world. And today, if you wanted to live the lifestyle of a king in the seventeen hundreds, it takes like two hours of work a day at a minimum wage job, and you can live the lifestyle that a king lived in the seventeen hundreds.
It's like, okay: nothing, no petroleum by-products, no electricity, no running water. It's very, very easy today to achieve, from a monetary perspective, the level of wealth that was available then. The difference is that, to your point, we escalate, we increase our expectations, and therefore we drive ourselves to produce more so that we can have more. And so the amount of wealth that gets shared will be the result of things that people want and need becoming cheaper, and so more easily affordable. And then the other point you made about who's going to lose their jobs and who's in trouble: video editing is a big thing for me, it's a big expense for me, and I could sit literally any moron down and teach them how to use a video editor; it just takes training. I say, click this button, click this button, click this button, and after a couple of weeks literally anybody can do that job. But now there's a program that can do it for me, and instead of having to spend five grand a month to produce ten TikToks, today I can spend seventy dollars a month and have the computer do the exact same thing. Basically, in my opinion, it seems like if I can train a person to do it, eventually a program will be able to do it, because the program can get trained just like a person. But anything that can't be trained, you're not going to be able to train a computer to do, because you can't train a person to do it. To your point, the thinking, the creativity, the writing, the reasoning: something that takes an actual person venturing beyond what is trainable by a mid-level manager.
Speaker 2 [00:22:33]: Yeah, well, and this is exactly it: what a better world it is if what you're doing can be done from a place of creativity, interest, intrigue, and curiosity, right? Who the fuck wants to sit down and just push buttons all day? Like, well, maybe, you know, I want to do something interesting, and the more we can outsource these menial tasks the better. So in that sense, I think that's been, and always will be, in my mind, the proper AI pitch, so to speak. And I just don't see how that's different from what the AI pitch was twelve months ago, eighteen months ago, or three or five years ago, when they were talking about AI automating things away; it's just the flavor of it this time. I think people got caught up with ChatGPT's and Midjourney's seeming creativity. And this is the thing: none of that stuff is creativity. It's just the law of large numbers. It gives you the perception of creativity, but it really isn't. There's nothing new that comes out of any of these language models or Midjourney or anything like that; it's just a new combination of stuff. And someone might argue, well, all human creativity is a new combination of stuff, and that's true to a degree. But there's something more. I mean, you even see it when you get Midjourney to create a really cool image or something like that. I was doing it the other day for some Alexander the Great imagery, trying to get Alexander charging into battle with his sarissa up, and I just couldn't get the fucking sarissa right; it was just this random sword poking out of his head or pointing the wrong way or something like that.
It was 448 00:24:23,800 --> 00:24:27,119 Speaker 2: because it's like the machine doesn't know, so it's just 449 00:24:27,160 --> 00:24:30,919 Speaker 2: like slapping pixels together in such an order that tries 450 00:24:30,960 --> 00:24:33,919 Speaker 2: to approximate, you know, the words that I'm have in 451 00:24:33,920 --> 00:24:36,280 Speaker 2: a particular order, and you just got to keep playing 452 00:24:36,320 --> 00:24:38,040 Speaker 2: and fucking with the words until you kind of get 453 00:24:38,040 --> 00:24:39,439 Speaker 2: them in the right order. It's the right thing. But 454 00:24:40,160 --> 00:24:45,760 Speaker 2: you see that the image itself almost like lacks like 455 00:24:45,880 --> 00:24:49,159 Speaker 2: lacks essence or lacks intent, like when when an artist 456 00:24:49,200 --> 00:24:51,840 Speaker 2: actually paints something or creates something, there's an actual intent 457 00:24:52,240 --> 00:24:54,879 Speaker 2: around where it's where things are placed. It's not a 458 00:24:56,400 --> 00:24:58,800 Speaker 2: it's not a random like you know, monkeys typing on 459 00:24:58,840 --> 00:25:03,920 Speaker 2: a typewriter producing the Bible right like it's it's there's 460 00:25:04,040 --> 00:25:09,480 Speaker 2: there's a difference between sentience and probability, and I just 461 00:25:09,920 --> 00:25:15,439 Speaker 2: I'm not convinced that you know, sentence is just going 462 00:25:15,520 --> 00:25:18,080 Speaker 2: to magically emerge from the circuits, at least at least 463 00:25:18,080 --> 00:25:22,000 Speaker 2: not now. We're definitely like, if that is how sentience emerged, 464 00:25:23,000 --> 00:25:27,280 Speaker 2: I just don't see it being anywhere close to where 465 00:25:27,280 --> 00:25:28,040 Speaker 2: we are at the moment. 466 00:25:28,840 --> 00:25:32,480 Speaker 1: Yeah, yeah, it makes sense. So if those two things 467 00:25:33,600 --> 00:25:37,239 Speaker 1: are not the real fear? What what? What is the 468 00:25:37,280 --> 00:25:38,760 Speaker 1: real distraction? 469 00:25:38,840 --> 00:25:38,879 Speaker 2: Is? 470 00:25:39,000 --> 00:25:41,480 Speaker 1: What is the real thing that that we should be 471 00:25:41,520 --> 00:25:42,200 Speaker 1: afraid of here? 472 00:25:42,520 --> 00:25:47,280 Speaker 2: So I think, you know, maybe instead of being afraid 473 00:25:47,320 --> 00:25:50,080 Speaker 2: of it, I think we can be vigilant of and 474 00:25:50,119 --> 00:25:55,560 Speaker 2: do something about it, because I mean, at least the 475 00:25:55,600 --> 00:25:57,560 Speaker 2: people listening to this for example, or at least the 476 00:25:57,600 --> 00:25:59,800 Speaker 2: people who want to be a little bit more awake 477 00:26:00,119 --> 00:26:02,800 Speaker 2: and the agents of their own life instead of being 478 00:26:02,880 --> 00:26:06,280 Speaker 2: you know, the classic lemming who gets pushed around with 479 00:26:06,359 --> 00:26:09,760 Speaker 2: whichever way the wind blows. But when we think about 480 00:26:09,800 --> 00:26:12,119 Speaker 2: the internet, right, so when it first came out, when 481 00:26:12,160 --> 00:26:14,679 Speaker 2: the Internet emerged, you know what was it? It was 482 00:26:14,720 --> 00:26:18,480 Speaker 2: this kind of this universe of information that anybody could 483 00:26:18,560 --> 00:26:21,000 Speaker 2: you know, create something and we could go, we could 484 00:26:21,000 --> 00:26:23,199 Speaker 2: find it, we could search for it. 
But as the Internet grew, it became harder and harder to find stuff. So what did you have? You had the rise of the search engines, right? And obviously we know a little company called Google that figured out a way to index that stuff really well; they crawled the whole Internet and made the stuff you were looking for more discoverable and more relevant. And what happened over time? I still remember the early days of Google, I used to go to page two, three, four, five. I don't know if you remember those days, you used to go back and really search for stuff. I mean, tell me, who the fuck goes past the first page of Google now, or even the first couple of results? Nobody. Exactly, nobody. So what's essentially happened is that how we perceive the world is a function of what we see, like the glasses we wear. It reminds me of Plato's cave, the allegory with the shadows and shit: what you see is what you perceive the world as. So the way we get information on the Internet now is what Google tells us. That's truth, that's reality, that's what we know, that's knowledge. Same thing with social media: what the algorithms feed us is what we perceive as true, as knowledge, et cetera. So what I'm seeing, and I think this is definitely, well, maybe I shouldn't say definitely because that's too strong a word, what I see as very likely the step change, the zero-to-one moment for AI here, is the language user interface: this idea that you don't have to search for shit anymore, you just ask your language model, hey, what was this, or how does this work?
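A toy contrast between the two interfaces being described, searching versus asking a model. Both functions below are invented stand-ins (the tiny "index," the canned answer); the structural point is that search hands you a ranked list you still have to evaluate, while the chat interface hands you one synthesized string, so whatever sits upstream of that string decides what reads as truth.

```python
# Two interfaces to the same question. The "index" and the canned answer are
# invented stand-ins; no real search engine or model is being called here.

def search(query: str) -> list:
    """Old interface: a ranked list of sources. The reader still chooses
    which result to open and whether to believe it."""
    index = {
        "what should i eat for breakfast": [
            "cereal-board.example/balanced-breakfast",
            "ancestral-diet.example/skip-the-granola",
            "meta-analysis.example/breakfast-studies",
        ],
    }
    return index.get(query.lower(), [])

def ask_model(query: str) -> str:
    """New interface: one synthesized answer. Whatever was filtered or
    weighted upstream of this string is invisible to the user."""
    return "Start your day with a balanced breakfast such as granola and low-fat milk."

print(search("What should I eat for breakfast"))
print(ask_model("What should I eat for breakfast"))
```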
And I do that myself, instead of searching; my wife does it all the time. She'll ask me a question, I'm like, just ask ChatGPT, and she'll ask it and be like, yeah, it said this, and sometimes with the things she asks me about I'm like, yeah, that's a fucking scam. You know, things like eat a balanced diet, start with your granola in the morning, pasteurized milk or whatever, stuff we know is mainstream crap. People like us will spot that straight away, but for the average normie, that's the truth, right? So as we move forward, my sense is that the language model will become the new user interface. You just chat and ask, and what it tells you will be your definition of truth. If that's the case, I guess people can probably tell where the real problem is going to be: it's upstream of that, if you can control what the right language or the right output is. And we're already there; there's a huge amount of talk about AI safety. When they talk about AI safety, what are they talking about? They're talking about what the AI says. They talk about approved language or safe speech, and whenever you hear any sort of bureaucrat or regulatory agency say the word safe, just fucking run in the opposite direction, right? "It's for your safety," all right, bro, we know exactly what that means. And this is what triggered me: when I was going down the rabbit hole, I just started seeing all this stuff, like, we need regulatory bodies for the safe and responsible use of AI. As soon as I hear "safe and responsible," man, my skin crawls, my spidey senses go off straight away. And you dig a little bit deeper: what are they actually doing?
Well, now, with particular AI models, if you want to run them, you need to ensure they are run through toxicity filters. And these toxicity filters, you know who's behind them? Ah, surprise, surprise, it's Google and this and that. And they basically filter for particular words, particular language structures, particular sentence completions and all this sort of stuff, and they basically guardrail these models into delivering you an Overton window of acceptable discourse. Now, you can try and prompt your way in and around it and all this sort of stuff, but every time you do that, you actually help them harden their system a little bit more, because they learn from that and they, you know...

Speaker 1 [00:30:31]: They close off those weaknesses. Yeah, exactly.

Speaker 2 [00:30:34]: So what you end up having is this kind of age of approved knowledge. And if we end up in a world, for example, where we have one or two or three language models that are the approved language models everyone's allowed to use, well, then what do you think is going to happen? It's like you're literally playing a game of Inception. You can essentially make people think whatever you want them to think, because that's how they get their knowledge, and then that's what they regurgitate out into the world. So that's where I believe the real danger is. And I mean, the kind of people that have been kicking and yelling and screaming about AI safety, when you dig into the details, on the surface they say artificial general intelligence is the biggest threat to humanity, it could obsolete us, it could wipe us out; they tell you nothing about what general intelligence means, how that's even possible, any of that sort of stuff.
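A minimal sketch of the kind of post-hoc "toxicity filter" guardrail being described: a wrapper that sits between the raw model and the user and swaps flagged drafts for an approved refusal. The blocklist, refusal text, and function names below are invented for illustration; production filters are usually trained classifiers rather than keyword lists, but they occupy the same control point.

```python
# Hypothetical guardrail wrapper. The blocklist and refusal text are invented;
# real deployments use trained classifiers, but they sit in the same place:
# between the raw model output and the user.
BLOCKED_TOPICS = {"raw milk", "seed oils"}  # whoever writes this list draws the Overton window

REFUSAL = "I'm sorry, I can't help with that. Please consult official guidance."

def raw_model(prompt: str) -> str:
    """Stand-in for an unfiltered language model."""
    return f"Here is a candid take on {prompt}..."

def safe_model(prompt: str) -> str:
    """What the end user actually talks to."""
    draft = raw_model(prompt)
    combined = f"{prompt} {draft}".lower()
    if any(topic in combined for topic in BLOCKED_TOPICS):
        return REFUSAL  # the unapproved draft never reaches the user
    return draft

print(safe_model("the benefits of raw milk"))       # -> refusal
print(safe_model("the history of pasteurization"))  # -> candid draft
```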
And then when you look at the 590 00:31:30,480 --> 00:31:33,000 Speaker 2: action steps, the action steps are: we need to put 591 00:31:33,000 --> 00:31:37,680 Speaker 2: regulatory bodies in place to regulate speech, to regulate discourse, 592 00:31:37,720 --> 00:31:40,000 Speaker 2: to ensure that these language models are safe, and to 593 00:31:40,080 --> 00:31:44,120 Speaker 2: ensure that we have responsible development moving forward. Like, that's 594 00:31:44,400 --> 00:31:49,880 Speaker 2: straight-up controlled language. It's like 595 00:31:49,920 --> 00:31:54,960 Speaker 2: Orwell and Inception in one. So that's, to sum 596 00:31:55,040 --> 00:31:56,760 Speaker 2: up, where I think the biggest threat is. 597 00:31:57,360 --> 00:32:01,800 Speaker 1: Well, and that perfectly fits with, you know, 598 00:32:01,840 --> 00:32:05,280 Speaker 1: the things that some of these people are doing outside 599 00:32:05,320 --> 00:32:08,400 Speaker 1: of artificial intelligence as well, like Sam Altman, the OpenAI 600 00:32:08,440 --> 00:32:12,160 Speaker 1: guy. Number one, when he's in front of Congress, 601 00:32:12,280 --> 00:32:14,320 Speaker 1: Congress is saying, you know, you're making this dangerous thing, 602 00:32:14,360 --> 00:32:16,360 Speaker 1: and he's like, that's why I'm asking you guys to 603 00:32:16,440 --> 00:32:19,880 Speaker 1: regulate it. It's like, that's odd, number one. And 604 00:32:19,880 --> 00:32:23,719 Speaker 1: then number two, his Worldcoin, the new cryptocurrency. Number one, 605 00:32:23,720 --> 00:32:26,240 Speaker 1: it's named Worldcoin, and it's got orbs that scan 606 00:32:26,320 --> 00:32:29,560 Speaker 1: your eyeballs. It's like, you couldn't imagine anything more dystopian. 607 00:32:30,200 --> 00:32:34,640 Speaker 1: Like, it sounds fake. And when you dig into it, 608 00:32:34,640 --> 00:32:37,800 Speaker 1: it gathers all your personal data for your own protection, 609 00:32:38,640 --> 00:32:44,480 Speaker 1: and it promises privacy, but not from governments. It's like, literally, 610 00:32:44,600 --> 00:32:47,880 Speaker 1: we're gathering it so that governments can use this to 611 00:32:48,000 --> 00:32:50,840 Speaker 1: keep you safe, enable cross-border payments, 612 00:32:50,880 --> 00:32:54,080 Speaker 1: but also make sure there's no money laundering, aka shut 613 00:32:54,080 --> 00:32:56,240 Speaker 1: off payments for anybody when we don't agree with what 614 00:32:56,280 --> 00:32:59,360 Speaker 1: they're paying for. And it's just this dystopian nightmare, and 615 00:32:59,400 --> 00:33:04,360 Speaker 1: it fits perfectly with controlling thought through their 616 00:33:04,480 --> 00:33:08,040 Speaker 1: artificial intelligence model. It's just not far- 617 00:33:08,120 --> 00:33:11,440 Speaker 1: fetched at all. It's literally, when you read 618 00:33:11,440 --> 00:33:13,920 Speaker 1: between the lines, what these people are pushing 619 00:33:13,640 --> 00:33:17,960 Speaker 2: for. Totally, totally. So, you know, with that in mind, 620 00:33:18,040 --> 00:33:20,440 Speaker 2: the question is, and this sort 621 00:33:20,440 --> 00:33:22,680 Speaker 2: of ties into what I've been working on for 622 00:33:22,720 --> 00:33:25,520 Speaker 2: the last six, or, Jesus, it's almost been eight months now: 623 00:33:25,680 --> 00:33:30,080 Speaker 2: you fight this by building alternatives, right?
So, like, 624 00:33:30,160 --> 00:33:32,640 Speaker 2: if we think about what bitcoin is. You know, we've 625 00:33:32,640 --> 00:33:34,880 Speaker 2: spoken about bitcoin on a number of occasions. So bitcoin 626 00:33:34,960 --> 00:33:38,960 Speaker 2: is an alternative. It's an open source alternative that anybody 627 00:33:38,960 --> 00:33:42,000 Speaker 2: can use, anybody can get access to, anybody can run locally, 628 00:33:42,360 --> 00:33:48,640 Speaker 2: and it happens to be a network that supports the 629 00:33:48,640 --> 00:33:52,440 Speaker 2: most important technology humans need to subsist, which is money, right? 630 00:33:52,520 --> 00:33:55,920 Speaker 2: So that's what that is. So if we 631 00:33:55,960 --> 00:33:58,120 Speaker 2: look at the AI thing: well, what do we need 632 00:33:58,120 --> 00:34:01,280 Speaker 2: to combat this? And luckily enough, this is actually happening 633 00:34:01,280 --> 00:34:03,480 Speaker 2: in the AI space pretty broadly. It's like, you need 634 00:34:03,560 --> 00:34:07,720 Speaker 2: open source models, you need options, you need other models 635 00:34:07,720 --> 00:34:11,680 Speaker 2: that can compete with these primary models. Now, you know, 636 00:34:11,800 --> 00:34:16,239 Speaker 2: AI does fundamentally benefit from economies of scale, so 637 00:34:16,280 --> 00:34:20,080 Speaker 2: it is very difficult to compete with the Chat 638 00:34:20,120 --> 00:34:21,960 Speaker 2: GPTs and OpenAIs of the world. Like, I mean, 639 00:34:22,000 --> 00:34:24,560 Speaker 2: these guys are sitting on how many billions in the bank, 640 00:34:24,640 --> 00:34:28,200 Speaker 2: and, you know, they can afford to spend gazillions of 641 00:34:28,239 --> 00:34:30,680 Speaker 2: dollars on the best engineers, et cetera, et cetera, and 642 00:34:30,719 --> 00:34:32,960 Speaker 2: on data gathering and all this sort of stuff. But, 643 00:34:33,800 --> 00:34:35,919 Speaker 2: you know, they are fundamentally trying to build... I mean, 644 00:34:36,480 --> 00:34:40,480 Speaker 2: part of me actually hopes that they believe 645 00:34:40,520 --> 00:34:42,880 Speaker 2: their own bullshit when it comes to AGI, because at 646 00:34:42,920 --> 00:34:45,160 Speaker 2: least what they'll do is spend billions 647 00:34:45,160 --> 00:34:47,160 Speaker 2: of dollars trying to build, you know, a tin man 648 00:34:47,239 --> 00:34:50,360 Speaker 2: that is never going to actually work. So, you know, 649 00:34:50,360 --> 00:34:52,399 Speaker 2: that might be poetic justice in the end, 650 00:34:53,880 --> 00:34:55,160 Speaker 2: because, I mean, a lot of these guys, you look 651 00:34:55,160 --> 00:34:56,880 Speaker 2: at their model of the world, and they are truly 652 00:34:57,000 --> 00:35:01,000 Speaker 2: the Yuval Harari brain-in-a-vat believers, you know, like 653 00:35:01,040 --> 00:35:03,960 Speaker 2: the Bryan Johnson, I'm-going-to-live-to-five-hundred-years types, 654 00:35:04,000 --> 00:35:06,320 Speaker 2: and, like, the guy's 655 00:35:06,360 --> 00:35:09,120 Speaker 2: literally transforming into a fish before our very eyes, like, 656 00:35:09,600 --> 00:35:12,200 Speaker 2: you know, pale skin. It's just retarded. 657 00:35:12,239 --> 00:35:15,200 Speaker 1: He looks almost not real, dude. Yeah, exactly, he just... 658 00:35:16,040 --> 00:35:16,800 Speaker 1: it's so weird.
659 00:35:16,920 --> 00:35:21,440 Speaker 2: So, like, yeah, there's probably some poetic justice in 660 00:35:21,480 --> 00:35:23,239 Speaker 2: all of this, and a lot of, you know, 661 00:35:23,440 --> 00:35:27,480 Speaker 2: money just being spent on dumb shit. I think there's 662 00:35:27,560 --> 00:35:30,759 Speaker 2: also a mix of, you know, these 663 00:35:30,840 --> 00:35:34,200 Speaker 2: narratives being pushed. Like, you look at who's really pushing 664 00:35:34,239 --> 00:35:39,080 Speaker 2: the narratives, and what is it? It's Nvidia, Microsoft, Google, blah, 665 00:35:39,120 --> 00:35:42,279 Speaker 2: blah, blah, all these guys. And, you know, funnily enough, 666 00:35:42,920 --> 00:35:46,120 Speaker 2: just follow the money, classic. All these startups 667 00:35:46,120 --> 00:35:51,880 Speaker 2: are popping up which are innovating, and they're sucking talent from 668 00:35:51,960 --> 00:35:54,759 Speaker 2: all sides of the world, and they're raising money. 669 00:35:54,800 --> 00:35:58,800 Speaker 2: The money is coming from VCs, and the VCs' LPs 670 00:35:59,000 --> 00:36:02,840 Speaker 2: are Google ventures, you know, Nvidia ventures, all this 671 00:36:02,880 --> 00:36:05,280 Speaker 2: sort of stuff. And what are the startups spending 672 00:36:05,280 --> 00:36:09,040 Speaker 2: money on? They're spending money on graphics cards and compute 673 00:36:09,120 --> 00:36:11,640 Speaker 2: and all that. So the money is just flowing right back 674 00:36:11,680 --> 00:36:14,239 Speaker 2: up to the same dudes, and, you know, they're just 675 00:36:14,320 --> 00:36:18,160 Speaker 2: concentrating their positions, you know, just based on a new narrative. 676 00:36:18,280 --> 00:36:21,560 Speaker 2: So, you know, there's all sorts of weird 677 00:36:21,560 --> 00:36:24,640 Speaker 2: shenanigans going on, you know. And that's not to 678 00:36:24,680 --> 00:36:28,560 Speaker 2: say that, you know, these tools, these new technologies, are 679 00:36:28,600 --> 00:36:31,240 Speaker 2: not going to be useful. There's obviously some stuff here. 680 00:36:31,920 --> 00:36:34,560 Speaker 2: But anyway, I kind of went off on a tangent 681 00:36:34,600 --> 00:36:36,799 Speaker 2: there. To sort of tie it back: the way 682 00:36:36,840 --> 00:36:40,719 Speaker 2: to combat this stuff, the way to combat anything, is 683 00:36:40,760 --> 00:36:44,400 Speaker 2: to build alternatives and make the alternatives appealing enough so 684 00:36:44,480 --> 00:36:46,400 Speaker 2: that other people want to use them. So, what 685 00:36:46,480 --> 00:36:48,840 Speaker 2: we're doing now with the Spirit of Satoshi project 686 00:36:49,000 --> 00:36:54,439 Speaker 2: is we're collecting and curating all of the bitcoin data 687 00:36:54,440 --> 00:36:56,280 Speaker 2: in the world, which is a huge corpus of data. 688 00:36:56,440 --> 00:37:00,000 Speaker 2: Everything from Austrian economics, from classical literature, from conservative 689 00:37:00,239 --> 00:37:02,919 Speaker 2: types of philosophy. You know, your Thomas Carlyles, Edmund Burkes, 690 00:37:02,960 --> 00:37:04,200 Speaker 2: and all this sort of stuff. All of this 691 00:37:04,280 --> 00:37:06,720 Speaker 2: kind of stuff we're collating, and we're training a model 692 00:37:06,719 --> 00:37:10,600 Speaker 2: from scratch on all of that, and we want to 693 00:37:10,600 --> 00:37:12,800 Speaker 2: give people... it's going to be a far more narrow model.
694 00:37:13,160 --> 00:37:15,319 Speaker 2: It'll be much smaller than ChatGPT, won't be as 695 00:37:15,480 --> 00:37:17,560 Speaker 2: versatile as ChatGPT, like, it won't be able to write you 696 00:37:17,560 --> 00:37:21,120 Speaker 2: a poem and shit like that, but it'll be functional. 697 00:37:21,160 --> 00:37:23,000 Speaker 2: Like, let's say you want to get an idea of, okay, 698 00:37:23,000 --> 00:37:26,200 Speaker 2: you know, what's something that Thomas Carlyle 699 00:37:26,239 --> 00:37:29,840 Speaker 2: would have disagreed on with Nietzsche, for example. 700 00:37:30,000 --> 00:37:33,320 Speaker 2: You know, it'll give you something functional, something useful. 701 00:37:34,080 --> 00:37:36,040 Speaker 2: You know, what would Nietzsche have thought 702 00:37:36,040 --> 00:37:39,560 Speaker 2: about bitcoin? What would Alexander Solzhenitsyn 703 00:37:39,640 --> 00:37:42,640 Speaker 2: have thought of Ethereum? I don't know, whatever. 704 00:37:42,680 --> 00:37:46,040 Speaker 2: It'll give you some interesting stuff. And I think 705 00:37:46,120 --> 00:37:49,240 Speaker 2: somewhere in there there might be some utility for people 706 00:37:49,239 --> 00:37:51,080 Speaker 2: who want to think outside of the box and not 707 00:37:51,160 --> 00:37:54,200 Speaker 2: conform to the mainstream. And what this might sow the 708 00:37:54,239 --> 00:37:58,040 Speaker 2: seeds for is language models around health. Like, I'd love 709 00:37:58,040 --> 00:38:01,080 Speaker 2: to do, like, a Dr. Mercola-type language model, like, 710 00:38:01,160 --> 00:38:02,799 Speaker 2: something that you can ask, hey, you know, what 711 00:38:02,840 --> 00:38:06,000 Speaker 2: should I... You know, I'm twenty-five, I'm 712 00:38:06,320 --> 00:38:08,279 Speaker 2: looking to train for this, this is what I want, 713 00:38:08,320 --> 00:38:10,040 Speaker 2: this is my profile. What should I eat? When should 714 00:38:10,040 --> 00:38:11,560 Speaker 2: I eat it? How should I eat? What should I look out for, blah 715 00:38:11,560 --> 00:38:16,359 Speaker 2: blah blah. Just more alternatives that are more 716 00:38:16,480 --> 00:38:19,880 Speaker 2: narrow but better suited for people that don't want to 717 00:38:20,000 --> 00:38:22,720 Speaker 2: adopt the mainstream narrative. And I think that's always 718 00:38:22,719 --> 00:38:25,320 Speaker 2: been the case, and I think that's how we combat 719 00:38:25,360 --> 00:38:26,080 Speaker 2: this sort of stuff. 720 00:38:27,840 --> 00:38:32,719 Speaker 1: Is the bitcoin transaction data, like the blockchain, is 721 00:38:32,760 --> 00:38:36,799 Speaker 1: that information that this is pulling from as well? Or 722 00:38:37,000 --> 00:38:39,560 Speaker 1: would that not have any utility? 723 00:38:40,040 --> 00:38:43,360 Speaker 2: I mean, not really any utility now. Like, I 724 00:38:43,360 --> 00:38:45,480 Speaker 2: guess what you could do, for example, is you could 725 00:38:45,520 --> 00:38:49,520 Speaker 2: train a language model to query the bitcoin blockchain for 726 00:38:49,960 --> 00:38:53,520 Speaker 2: particular data or block heights or transactions, for example. And what 727 00:38:53,520 --> 00:38:56,680 Speaker 2: you could maybe do is build a 728 00:38:57,200 --> 00:39:03,760 Speaker 2: mempool or, you know, blockchain transaction assistant, for example.
729 00:39:03,800 --> 00:39:06,400 Speaker 2: You might say, hey, you know, I'm looking for 730 00:39:06,440 --> 00:39:09,560 Speaker 2: this transaction on this day, can you 731 00:39:09,560 --> 00:39:11,279 Speaker 2: pull it up for me? And, you know, maybe a 732 00:39:11,520 --> 00:39:15,640 Speaker 2: bitcoin-type model could go and find the precise transaction, 733 00:39:15,760 --> 00:39:17,040 Speaker 2: you know, give you a list: hey, is it one 734 00:39:17,040 --> 00:39:19,000 Speaker 2: of these ones? What are you looking for specifically? Blah 735 00:39:19,000 --> 00:39:22,040 Speaker 2: blah blah. So, you know, maybe there's utility there, 736 00:39:22,440 --> 00:39:26,440 Speaker 2: in kind of trying to make sense 737 00:39:26,480 --> 00:39:29,800 Speaker 2: of what is actually important data on the bitcoin network. 738 00:39:29,880 --> 00:39:31,719 Speaker 2: But, you know, would you do that as an individual? 739 00:39:31,760 --> 00:39:33,680 Speaker 2: Less likely. That might be something that's an 740 00:39:33,800 --> 00:39:36,280 Speaker 2: enterprise tool. For example, a company that wants 741 00:39:36,320 --> 00:39:39,680 Speaker 2: to understand money flows, movements, do chain analysis, 742 00:39:39,760 --> 00:39:42,000 Speaker 2: stuff like that, they might use something like that. 743 00:39:43,480 --> 00:39:48,960 Speaker 1: So essentially, well, number one, anybody 744 00:39:49,160 --> 00:39:51,800 Speaker 1: who knows how to do this can have a 745 00:39:51,880 --> 00:39:57,759 Speaker 1: language model trained on any data set. And so in 746 00:39:57,800 --> 00:40:06,600 Speaker 1: this circumstance, you're saying, okay, Austrian economics, classical philosophy or 747 00:40:06,640 --> 00:40:11,719 Speaker 1: classical literature, et cetera, and you're taking that so that 748 00:40:12,239 --> 00:40:15,920 Speaker 1: people can be creative with it and say, okay, 749 00:40:15,920 --> 00:40:18,759 Speaker 1: here's something useful that I can pull out of that. 750 00:40:22,200 --> 00:40:26,440 Speaker 2: Sorry, I just muted myself. Freaking dog's barking. Yes, basically. 751 00:40:26,560 --> 00:40:29,080 Speaker 2: So the... 752 00:40:28,800 --> 00:40:35,040 Speaker 3: The idea is that there's a big misconception about language 753 00:40:35,080 --> 00:40:38,000 Speaker 3: models, that people think, okay, these language models are trained 754 00:40:38,000 --> 00:40:40,320 Speaker 3: on all this data and they know the data. 755 00:40:41,320 --> 00:40:43,640 Speaker 2: That's actually not how it really works. Like, 756 00:40:43,640 --> 00:40:45,920 Speaker 2: a language model doesn't know shit. It doesn't know that 757 00:40:46,560 --> 00:40:49,239 Speaker 2: Jordan Peterson said X, or Nietzsche said Y, or so-and- 758 00:40:49,520 --> 00:40:54,280 Speaker 2: so said Z, right? What the training 759 00:40:54,280 --> 00:40:56,880 Speaker 2: does is, it's basically a game of guess the next word, 760 00:40:57,360 --> 00:40:59,480 Speaker 2: and it's very complex, but, you know, that's kind 761 00:40:59,480 --> 00:41:02,440 Speaker 2: of like an easy way to summarize it so people 762 00:41:02,440 --> 00:41:07,920 Speaker 2: can conceptualize it.
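Since the "guess the next word" framing carries a lot of the argument, a toy sketch may help. The tiny corpus below is a made-up stand-in for a curated training set, and bigram counts are a drastic simplification of how real models learn weights over billions of tokens, but the principle, predicting the next word from what has tended to follow it, is the same.

```python
import random
from collections import Counter, defaultdict

# Toy corpus standing in for a curated training set.
corpus = (
    "sound money is scarce money "
    "scarce money holds value "
    "sound money holds value over time"
).split()

# "Training": for every word, tally which words tend to follow it.
next_word_counts = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    next_word_counts[current][nxt] += 1

def guess_next(word: str) -> str:
    """Sample the next word in proportion to how often it followed `word`."""
    counts = next_word_counts[word]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

def generate(start: str, length: int = 6) -> str:
    """Play 'guess the next word' repeatedly to produce a sentence."""
    out = [start]
    for _ in range(length):
        if out[-1] not in next_word_counts:
            break
        out.append(guess_next(out[-1]))
    return " ".join(out)

if __name__ == "__main__":
    print(generate("sound"))  # e.g. "sound money holds value over time"
```

Run it a few times and whatever "insight" comes out is just the most probable continuation of the training text, which is also why the hallucination point he makes next follows naturally.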
And when you're feeding a model particular 763 00:41:08,320 --> 00:41:11,439 Speaker 2: kinds of data, there is a language style. Like, when 764 00:41:11,440 --> 00:41:13,799 Speaker 2: you read Austrian economics, you kind of know that it's 765 00:41:13,800 --> 00:41:16,239 Speaker 2: Austrian economics. You know, Mises and Rothbard might have 766 00:41:16,320 --> 00:41:18,399 Speaker 2: their own sort of linguistic style, but the points they're 767 00:41:18,400 --> 00:41:21,960 Speaker 2: making are generally similar, right? Whereas if you read the 768 00:41:22,280 --> 00:41:25,160 Speaker 2: Communist Manifesto or something from Marx or Engels or something 769 00:41:25,200 --> 00:41:27,240 Speaker 2: like that, the general points, the way they string 770 00:41:27,239 --> 00:41:29,640 Speaker 2: words together into sentences, everything, the essence of what 771 00:41:29,640 --> 00:41:33,720 Speaker 2: they're saying is different, right? So when you're training a model, 772 00:41:34,000 --> 00:41:37,040 Speaker 2: you know, you're training it around the probabilities of words 773 00:41:37,040 --> 00:41:39,600 Speaker 2: and the weights and the biases, around how words and 774 00:41:39,680 --> 00:41:42,080 Speaker 2: sentences and everything are structured. So the model doesn't actually 775 00:41:42,080 --> 00:41:45,319 Speaker 2: know anything, but when you ask it something, 776 00:41:46,400 --> 00:41:49,120 Speaker 2: it will create a sentence where the structure of the 777 00:41:49,120 --> 00:41:51,600 Speaker 2: words and the structure of the sentence is such that it 778 00:41:51,760 --> 00:41:56,000 Speaker 2: is going to sound like an Austrian economist or a 779 00:41:56,000 --> 00:42:00,120 Speaker 2: Bitcoiner or something like that. And, you know, what you 780 00:42:00,200 --> 00:42:03,400 Speaker 2: end up getting with these probability machines is 781 00:42:04,280 --> 00:42:06,799 Speaker 2: you might actually get a fucking random insight out of it. 782 00:42:06,880 --> 00:42:08,920 Speaker 2: Like, you say, hey, tell me something, you know, 783 00:42:08,920 --> 00:42:11,520 Speaker 2: as I said earlier, what Nietzsche and 784 00:42:12,080 --> 00:42:15,279 Speaker 2: Thomas Carlyle would have disagreed on, that sort of thing, 785 00:42:15,480 --> 00:42:18,960 Speaker 2: and it would string 786 00:42:19,000 --> 00:42:21,000 Speaker 2: this shit together in such a way, and you'll be like, fuck, 787 00:42:21,040 --> 00:42:23,600 Speaker 2: actually that's pretty good, or you'll be like, okay, no, 788 00:42:23,719 --> 00:42:26,399 Speaker 2: that's just dumb. And, you know, that other part 789 00:42:26,480 --> 00:42:28,799 Speaker 2: is called hallucination, for example. That's why, you know, 790 00:42:28,840 --> 00:42:31,839 Speaker 2: when people... I'm sure you've heard that Chat 791 00:42:31,920 --> 00:42:34,640 Speaker 2: GPT and stuff hallucinates, like, it makes up facts. Have 792 00:42:34,719 --> 00:42:37,160 Speaker 2: you heard of that? Yeah? Yep. So it's 793 00:42:37,280 --> 00:42:39,759 Speaker 2: because ChatGPT doesn't know whether it's a fact or not.
794 00:42:40,239 --> 00:42:46,520 Speaker 2: It's just strung together words that, probabilistically speaking, actually make, 795 00:42:47,000 --> 00:42:48,640 Speaker 2: you know, I hate to use the word sense, but 796 00:42:48,760 --> 00:42:52,440 Speaker 2: the highest probability of these words being in a 797 00:42:52,520 --> 00:42:57,279 Speaker 2: row comes out, and ChatGPT says it with what 798 00:42:57,440 --> 00:43:01,239 Speaker 2: sounds like certainty. But it doesn't know whether it's wrong 799 00:43:01,320 --> 00:43:03,520 Speaker 2: or right, and you, as a human, might know that 800 00:43:03,560 --> 00:43:05,839 Speaker 2: it's wrong and be like, well, that's not the fucking quote. 801 00:43:05,880 --> 00:43:09,120 Speaker 2: That's bullshit. But yeah, ChatGPT is not drawing from a 802 00:43:09,200 --> 00:43:11,680 Speaker 2: database and copy-pasting the quote. It's 803 00:43:11,719 --> 00:43:14,600 Speaker 2: just stringing the words together in such a way. So 804 00:43:14,800 --> 00:43:16,960 Speaker 2: anyway, at the risk of getting a 805 00:43:17,000 --> 00:43:20,520 Speaker 2: little bit technical there, the utility in an alternative 806 00:43:20,560 --> 00:43:26,840 Speaker 2: model is having, you know, a different 807 00:43:26,920 --> 00:43:30,799 Speaker 2: linguistic style. Now, I will mention one more thing here: 808 00:43:30,960 --> 00:43:34,799 Speaker 2: you can train a model to query a database 809 00:43:35,640 --> 00:43:40,520 Speaker 2: so that you can embed facts in the language, right? And 810 00:43:41,760 --> 00:43:47,120 Speaker 2: this is where you could get, for example, 811 00:43:47,160 --> 00:43:49,520 Speaker 2: the example I gave you earlier about these 812 00:43:49,600 --> 00:43:52,680 Speaker 2: language user interfaces, where you can speak to a model 813 00:43:52,800 --> 00:43:55,960 Speaker 2: and it tells you stuff and it's genuinely factual, but 814 00:43:56,040 --> 00:43:59,200 Speaker 2: it's also within, you know, the linguistic style that is approved. 815 00:44:00,080 --> 00:44:02,160 Speaker 2: And this could become the new way: we don't use 816 00:44:02,200 --> 00:44:06,200 Speaker 2: search engines anymore, we just use our knowledgeable assistant, right, 817 00:44:06,360 --> 00:44:10,600 Speaker 2: our intelligent friend. So we'll do the same thing with 818 00:44:10,640 --> 00:44:14,440 Speaker 2: Spirit of Satoshi, in that not only are we training the 819 00:44:14,520 --> 00:44:16,680 Speaker 2: model on all of the data that we're cataloging, but 820 00:44:16,760 --> 00:44:19,000 Speaker 2: all of the data that we're cataloging we are putting 821 00:44:19,080 --> 00:44:23,440 Speaker 2: into a massive repository that the model can then query later, 822 00:44:24,000 --> 00:44:28,319 Speaker 2: so that it is not only linguistically of a, you know, 823 00:44:28,920 --> 00:44:31,920 Speaker 2: nuanced style that is more like our model of the world, 824 00:44:32,000 --> 00:44:34,480 Speaker 2: but it can also query that stuff, so that when 825 00:44:34,480 --> 00:44:40,840 Speaker 2: it delivers something, there is fact mixed with style together, basically.
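A rough sketch of that "model that queries a repository" idea: the facts come from a curated database, and the model only supplies the wording. The repository entries, the keyword lookup, and the model_generate stub are illustrative placeholders, not the actual Spirit of Satoshi pipeline, which would presumably use embeddings and a real model rather than substring matching.

```python
# Minimal retrieval sketch: facts are pulled from a curated repository and
# passed into the prompt, so the model's job is style, not recall.

REPOSITORY = {
    "halving": "The bitcoin block subsidy halves roughly every 210,000 blocks.",
    "supply": "Bitcoin's supply is capped at 21 million coins.",
    "difficulty": "Mining difficulty adjusts about every 2,016 blocks.",
}

def retrieve(question: str) -> list[str]:
    """Crude keyword lookup; real systems use embeddings / vector search."""
    q = question.lower()
    return [fact for key, fact in REPOSITORY.items() if key in q]

def model_generate(prompt: str) -> str:
    # Stand-in for the language model call.
    return "(model answer written in the curated style, grounded in: " + prompt + ")"

def answer(question: str) -> str:
    facts = retrieve(question)
    prompt = "Using only these facts:\n- " + "\n- ".join(facts) + f"\nAnswer: {question}"
    return model_generate(prompt)

if __name__ == "__main__":
    print(answer("What happens at the bitcoin halving?"))
```

The design point is simply that anything stated as fact is looked up and injected into the prompt rather than being left to the model's probabilistic recall, which is what separates this from the hallucination behaviour described above.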
826 00:44:41,600 --> 00:44:45,760 Speaker 1: Yeah, yeah, that makes sense. Yeah, it's really interesting, because 827 00:44:46,160 --> 00:44:49,400 Speaker 1: these seem to me, and you can correct me 828 00:44:49,400 --> 00:44:52,880 Speaker 1: if I'm wrong about this, these seem like foundational tools 829 00:44:53,120 --> 00:45:00,960 Speaker 1: where it doesn't seem apparent yet how, 830 00:45:01,200 --> 00:45:05,120 Speaker 1: or how much, they'll be used. 831 00:45:05,520 --> 00:45:11,279 Speaker 1: It's almost like inventing... like we 832 00:45:11,360 --> 00:45:16,040 Speaker 1: have a tractor before we have farms, 833 00:45:18,320 --> 00:45:20,239 Speaker 1: and somebody's got to come along and they're gonna be like, hey, 834 00:45:20,280 --> 00:45:25,160 Speaker 1: we can actually use this to make a farm. 835 00:45:25,520 --> 00:45:27,960 Speaker 2: That's a very good analogy, actually, because, you know, 836 00:45:28,160 --> 00:45:29,680 Speaker 2: we raised a little bit of money for this, 837 00:45:29,800 --> 00:45:31,480 Speaker 2: and, you know, I told investors, I'm like, man, 838 00:45:31,520 --> 00:45:33,560 Speaker 2: this is a fucking experiment, like, we 839 00:45:34,040 --> 00:45:35,520 Speaker 2: don't know. Because they're like, what's the 840 00:45:35,520 --> 00:45:38,000 Speaker 2: commercial application? I said, well, we could have commercial applications. 841 00:45:38,040 --> 00:45:41,120 Speaker 2: We could have, like, a bitcoin onboarding assistant or a bitcoin 842 00:45:41,239 --> 00:45:44,719 Speaker 2: educational assistant. You know, maybe we could build, like, 843 00:45:44,960 --> 00:45:47,400 Speaker 2: an alternative to Coursera, which is about 844 00:45:47,440 --> 00:45:50,040 Speaker 2: training people to be more sovereign, and, you know, homeschooling 845 00:45:50,080 --> 00:45:52,200 Speaker 2: and health and all this sort of stuff. But I said, 846 00:45:52,200 --> 00:45:55,160 Speaker 2: you know, the caveat was, like, we don't actually fucking know. 847 00:45:55,520 --> 00:45:57,840 Speaker 2: Like, the first step is, can we build an alternative 848 00:45:57,960 --> 00:46:02,000 Speaker 2: language model that is not guardrailed, not toxicity-filtered 849 00:46:02,040 --> 00:46:03,759 Speaker 2: and all this other crap, and that's trained on a 850 00:46:03,800 --> 00:46:07,600 Speaker 2: specific corpus that we've selected and we've curated, and then 851 00:46:07,680 --> 00:46:11,200 Speaker 2: let's see what problems we can solve with that. And 852 00:46:12,000 --> 00:46:14,160 Speaker 2: that's kind of like the next step. So right now 853 00:46:14,200 --> 00:46:16,920 Speaker 2: we're still in the process of training. We're basically 854 00:46:16,960 --> 00:46:21,480 Speaker 2: in the process of curating, cleaning and creating the specific 855 00:46:21,600 --> 00:46:26,960 Speaker 2: data set and training simultaneously; we're doing it in 856 00:46:27,040 --> 00:46:31,600 Speaker 2: parallel. And we actually built, I might just mention 857 00:46:31,719 --> 00:46:34,400 Speaker 2: this here, we built a cool little app, 858 00:46:35,680 --> 00:46:38,319 Speaker 2: well, we haven't really given it a name, 859 00:46:38,360 --> 00:46:40,640 Speaker 2: but it's where you can help train Spirit of Satoshi. And 860 00:46:40,840 --> 00:46:43,319 Speaker 2: this is something we've integrated bitcoin into quite deeply, so 861 00:46:43,880 --> 00:46:45,200 Speaker 2: you can go create an account, you can do 862 00:46:45,239 --> 00:46:47,160 Speaker 2: it just by scanning with a Lightning wallet, or just 863 00:46:47,280 --> 00:46:50,480 Speaker 2: log in with Nostr or email.
And we're calling it proof of 864 00:46:50,560 --> 00:46:53,120 Speaker 2: knowledge, in the sense that if you've got some 865 00:46:53,360 --> 00:46:55,480 Speaker 2: knowledge about bitcoin, Austrian economics or anything like that, 866 00:46:55,520 --> 00:46:57,880 Speaker 2: you can go in there and you can answer questions 867 00:46:58,280 --> 00:47:02,879 Speaker 2: as if you were the model, and we use that 868 00:47:03,920 --> 00:47:07,000 Speaker 2: style of language and that information to then train the 869 00:47:07,040 --> 00:47:11,160 Speaker 2: model, so that people from around the world can actually 870 00:47:11,239 --> 00:47:14,880 Speaker 2: have their input. You can also help clean up 871 00:47:14,920 --> 00:47:17,879 Speaker 2: some data. So we've got all this programmatically cleaned data, 872 00:47:17,880 --> 00:47:20,959 Speaker 2: and, without getting into the weeds here, when 873 00:47:20,960 --> 00:47:23,080 Speaker 2: you train a model on a book, let's say 874 00:47:23,080 --> 00:47:25,480 Speaker 2: The Bitcoin Standard, you don't just put the whole 875 00:47:25,480 --> 00:47:27,000 Speaker 2: book in there. You have to break it up into 876 00:47:27,040 --> 00:47:29,240 Speaker 2: a specific format. You have to turn it into question 877 00:47:29,320 --> 00:47:31,400 Speaker 2: and answer pairs based on every single paragraph and all 878 00:47:31,400 --> 00:47:33,440 Speaker 2: that sort of stuff. So some of the stuff in 879 00:47:33,480 --> 00:47:35,759 Speaker 2: there might be irrelevant. Like, we went and transcribed a 880 00:47:35,800 --> 00:47:38,480 Speaker 2: bunch of podcasts, for example, and then we broke it 881 00:47:38,560 --> 00:47:41,000 Speaker 2: up into chunks, and before feeding it to 882 00:47:41,080 --> 00:47:43,759 Speaker 2: the model, we do a programmatic cleanup to try 883 00:47:43,800 --> 00:47:46,000 Speaker 2: and get rid of stuff that's irrelevant. Like, you know, 884 00:47:46,040 --> 00:47:47,720 Speaker 2: on a podcast, you say what you ate for breakfast, 885 00:47:47,800 --> 00:47:49,600 Speaker 2: what you did last week with your wife, whatever. 886 00:47:49,680 --> 00:47:53,879 Speaker 2: All that shit's irrelevant, but you can't programmatically get all 887 00:47:53,920 --> 00:47:56,680 Speaker 2: of that out of there, so you need some human 888 00:47:56,680 --> 00:47:59,560 Speaker 2: assistance. So, you know, people are in there helping 889 00:47:59,640 --> 00:48:01,640 Speaker 2: us clean that up, adjust questions, make it 890 00:48:01,719 --> 00:48:05,520 Speaker 2: more relevant, this and that, and yeah, earning 891 00:48:05,560 --> 00:48:08,960 Speaker 2: sats for it. We've got randoms from the Philippines, from El Salvador, 892 00:48:09,040 --> 00:48:11,480 Speaker 2: from Europe, from America, from Australia, all this sort 893 00:48:11,520 --> 00:48:14,000 Speaker 2: of stuff, and we're able to do that, which is 894 00:48:14,040 --> 00:48:17,800 Speaker 2: really interesting, because we can pay them sats from anywhere 895 00:48:17,800 --> 00:48:19,719 Speaker 2: around the world. They can be completely anonymous, they don't 896 00:48:19,760 --> 00:48:22,879 Speaker 2: have to, like, you know... it's beautiful. And this 897 00:48:22,920 --> 00:48:25,680 Speaker 2: is something that, for example, people like OpenAI can't 898 00:48:25,719 --> 00:48:27,839 Speaker 2: really do, because they're paying everyone in dollars and doing everything kind of the old-school way.
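For anyone curious what that programmatic cleanup and question-and-answer prep could look like, here is a minimal sketch under assumed details: the filler patterns, the chunk size, and the placeholder question are invented for illustration and are not taken from the actual Spirit of Satoshi pipeline.

```python
import re

# Illustrative filler patterns -- the kind of podcast chatter a first
# programmatic pass might flag before humans review whatever slips through.
FILLER_PATTERNS = [
    r"what i (ate|had) for breakfast",
    r"last week with my wife",
]

def split_into_chunks(transcript: str, max_words: int = 80) -> list[str]:
    """Break a transcript into roughly paragraph-sized chunks."""
    words = transcript.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def looks_irrelevant(chunk: str) -> bool:
    """Crude first pass; anything it misses still needs human cleanup."""
    return any(re.search(p, chunk, flags=re.IGNORECASE) for p in FILLER_PATTERNS)

def to_qa_pair(chunk: str) -> dict:
    """Wrap a surviving chunk as a question/answer training example."""
    return {"question": "Summarise the point being made in this passage.", "answer": chunk}

if __name__ == "__main__":
    transcript = (
        "Sound money matters because it cannot be debased at will. "
        "So anyway, what I ate for breakfast this morning was eggs and coffee."
    )
    for chunk in split_into_chunks(transcript, max_words=12):
        if looks_irrelevant(chunk):
            print("drop:", chunk)
        else:
            print("keep:", to_qa_pair(chunk))
```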
899 00:48:27,880 --> 00:48:31,919 Speaker 2: So yeah, 900 00:48:31,960 --> 00:48:34,360 Speaker 2: we've got some interesting advantages there. And, like, 901 00:48:35,880 --> 00:48:38,200 Speaker 2: you know, I will say, if anyone's interested in that 902 00:48:38,239 --> 00:48:39,640 Speaker 2: sort of stuff and they want to earn a little 903 00:48:39,680 --> 00:48:41,600 Speaker 2: side cash and want to be involved and actually 904 00:48:41,600 --> 00:48:43,640 Speaker 2: help be part of the solution, you know, they can 905 00:48:43,680 --> 00:48:47,279 Speaker 2: go check out spiritofsatoshi.ai, and there's 906 00:48:47,320 --> 00:48:50,759 Speaker 2: a little train-the-model button that they 907 00:48:50,760 --> 00:48:54,279 Speaker 2: can click and they can participate. But, like, yeah, 908 00:48:54,520 --> 00:49:00,120 Speaker 2: this is a big science experiment of a project, but I 909 00:49:00,200 --> 00:49:03,239 Speaker 2: think it will become a utility. As you said, it's 910 00:49:03,280 --> 00:49:05,840 Speaker 2: like we're building a tractor before there's really any farms. 911 00:49:06,480 --> 00:49:09,160 Speaker 2: But I have this sense, this instinct, that 912 00:49:10,239 --> 00:49:13,759 Speaker 2: there's something here. We don't really know what the application 913 00:49:13,840 --> 00:49:15,880 Speaker 2: is going to be, but I get the sense that, 914 00:49:17,239 --> 00:49:19,319 Speaker 2: you know, having an alternative to ChatGPT is going 915 00:49:19,400 --> 00:49:22,919 Speaker 2: to be immensely important in the coming two to five years. 916 00:49:23,440 --> 00:49:26,400 Speaker 1: Yeah, yeah. Well, I'm already starting to... you know, my 917 00:49:26,520 --> 00:49:29,920 Speaker 1: mind is racing with potential ways that this 918 00:49:29,960 --> 00:49:32,680 Speaker 1: could be applied. Like, number one, looking at markets, 919 00:49:32,719 --> 00:49:35,560 Speaker 1: it's like, all right, well, given what's going on 920 00:49:35,640 --> 00:49:38,399 Speaker 1: right now, I could say, you know, what's 921 00:49:38,440 --> 00:49:41,080 Speaker 1: going on with monetary policy and fiscal policy and things like that, 922 00:49:41,160 --> 00:49:45,560 Speaker 1: what would, you know, Rothbard and Mises 923 00:49:45,600 --> 00:49:48,680 Speaker 1: and Hayek say is likely to be 924 00:49:48,719 --> 00:49:52,560 Speaker 1: the result of this, economically speaking? Or even help 925 00:49:52,600 --> 00:49:55,640 Speaker 1: with, you know, trying to put together monetary policy. We're 926 00:49:55,640 --> 00:49:57,720 Speaker 1: seeing kind of a movement around the world, different countries 927 00:49:58,000 --> 00:50:01,439 Speaker 1: moving away from Keynesian economics, and it's like, okay, well, 928 00:50:01,600 --> 00:50:06,160 Speaker 1: we don't have a massive body of economists around the 929 00:50:06,160 --> 00:50:11,480 Speaker 1: world that are trained correctly, and, you know, a lot 930 00:50:11,560 --> 00:50:14,279 Speaker 1: of these guys are dead. So 931 00:50:14,680 --> 00:50:16,560 Speaker 1: how do we put together some, you know, 932 00:50:16,600 --> 00:50:19,919 Speaker 1: good monetary policy from a national perspective? That could 933 00:50:19,920 --> 00:50:21,080 Speaker 1: be another potential solution.
934 00:50:21,280 --> 00:50:25,360 Speaker 2: Totally, really cool, totally. Yeah, it's like, 935 00:50:26,360 --> 00:50:29,000 Speaker 2: this is the beauty of tools, right? It's like, 936 00:50:29,040 --> 00:50:32,560 Speaker 2: once you have a tool, you know, you can 937 00:50:32,640 --> 00:50:35,319 Speaker 2: kind of come up with ways to use it, and 938 00:50:35,360 --> 00:50:37,520 Speaker 2: then, you know, the market adapts. And this is 939 00:50:37,520 --> 00:50:39,600 Speaker 2: where stuff like open source is really important. So we're 940 00:50:39,600 --> 00:50:42,320 Speaker 2: going to open source the whole tool and then allow 941 00:50:42,360 --> 00:50:44,120 Speaker 2: people to figure it out, you know, if 942 00:50:44,120 --> 00:50:46,080 Speaker 2: they've got the compute power at home to run it locally. 943 00:50:46,120 --> 00:50:49,120 Speaker 2: If they don't, you know, they can just use 944 00:50:49,160 --> 00:50:52,160 Speaker 2: it through our portal online, and, you know, they can 945 00:50:52,160 --> 00:50:55,000 Speaker 2: pay some sats for it if they, you know, want 946 00:50:55,040 --> 00:50:56,840 Speaker 2: to use it as a power user or whatever. 947 00:50:56,880 --> 00:51:01,160 Speaker 2: But, like, yeah, we're embarking on something big, and 948 00:51:01,200 --> 00:51:02,719 Speaker 2: it's gonna be one of those things where we look 949 00:51:02,719 --> 00:51:04,560 Speaker 2: back in three to five years from now and be like... 950 00:51:04,760 --> 00:51:07,279 Speaker 2: you know, like, I get the sense we'll 951 00:51:07,320 --> 00:51:09,080 Speaker 2: be somewhere in three to five years that we had 952 00:51:10,040 --> 00:51:13,440 Speaker 2: zero ability to predict right now. Like, I have no 953 00:51:13,480 --> 00:51:16,759 Speaker 2: fucking idea where we're going to be, but I think 954 00:51:16,800 --> 00:51:19,520 Speaker 2: it's a journey worth taking. 955 00:51:20,920 --> 00:51:24,279 Speaker 1: And if anybody wants to come along and help be 956 00:51:24,360 --> 00:51:26,279 Speaker 1: part of the solution and get paid some sats, the 957 00:51:26,400 --> 00:51:28,960 Speaker 1: website is spiritofsatoshi.ai. 958 00:51:29,360 --> 00:51:31,319 Speaker 2: Yes, correct. 959 00:51:31,120 --> 00:51:34,120 Speaker 1: Okay. I'll have that link in the show 960 00:51:34,160 --> 00:51:38,600 Speaker 1: notes as well, for anybody who wants to copy and 961 00:51:38,640 --> 00:51:42,080 Speaker 1: paste it or click on the link. Well, I don't 962 00:51:42,080 --> 00:51:44,520 Speaker 1: want to take up too much of your time here today, 963 00:51:44,600 --> 00:51:46,840 Speaker 1: but I really appreciate you coming on 964 00:51:46,880 --> 00:51:49,200 Speaker 1: the show and talking about this. Really exciting. I think, 965 00:51:49,200 --> 00:51:51,680 Speaker 1: at the end of the day, anytime we see something 966 00:51:51,719 --> 00:51:56,400 Speaker 1: wrong with our world, it is always the 967 00:51:56,480 --> 00:52:01,120 Speaker 1: better option to choose to build a solution rather than 968 00:52:01,160 --> 00:52:03,320 Speaker 1: just complain about the way things are. And you're definitely 969 00:52:03,320 --> 00:52:05,600 Speaker 1: doing that here, so I appreciate it. Thank you for 970 00:52:05,640 --> 00:52:06,720 Speaker 1: doing that.
971 00:52:06,800 --> 00:52:08,200 Speaker 2: Totally, man. Thank you so much for having me on and 972 00:52:08,239 --> 00:52:09,840 Speaker 2: sort of helping get the word out with all of 973 00:52:09,880 --> 00:52:14,399 Speaker 2: these things. Every time, man, it means a lot. And yeah, 974 00:52:13,520 --> 00:52:18,000 Speaker 2: I think, as you said, you know, we either complain 975 00:52:18,120 --> 00:52:20,680 Speaker 2: or we build. That's it, like, yeah, one of the two, 976 00:52:20,800 --> 00:52:22,799 Speaker 2: and, you know, we either do something about it or 977 00:52:22,840 --> 00:52:26,319 Speaker 2: we, you know, wait for somebody else. And I try 978 00:52:26,360 --> 00:52:29,000 Speaker 2: to make that, you know, the theme 979 00:52:29,200 --> 00:52:30,680 Speaker 2: of my life. So we'll see. 980 00:52:32,400 --> 00:52:34,319 Speaker 1: Well, thanks so much. Looking forward to seeing where this 981 00:52:34,360 --> 00:52:36,600 Speaker 1: thing goes over the coming years, and it's spiritofsatoshi 982 00:52:36,640 --> 00:52:38,759 Speaker 1: dot ai. Thanks so much for joining. We'll talk 983 00:52:38,760 --> 00:52:39,319 Speaker 1: to you again soon. 984 00:52:39,520 --> 00:52:40,680 Speaker 2: Thank you, Joe. Take care, man.