1 00:00:01,760 --> 00:00:02,880 Speaker 1: Also media. 2 00:00:04,080 --> 00:00:08,360 Speaker 2: Oh my goodness, welcome back to Behind the Bastards, a 3 00:00:08,480 --> 00:00:12,640 Speaker 2: podcast that is, it'll be interesting to see how the 4 00:00:12,680 --> 00:00:15,600 Speaker 2: audience reacts to this one, talking about some of the 5 00:00:15,640 --> 00:00:22,040 Speaker 2: most obscure, frustrating Internet arcana that has ever occurred and 6 00:00:22,120 --> 00:00:27,400 Speaker 2: recently led to the deaths of like six people. My 7 00:00:27,560 --> 00:00:32,199 Speaker 2: guest today, as in last episode, David Gborie. David, hey, 8 00:00:32,200 --> 00:00:34,160 Speaker 2: you doing man, I'm doing great. 9 00:00:34,240 --> 00:00:37,120 Speaker 3: I really can't wait to see where this goes. 10 00:00:39,520 --> 00:00:41,760 Speaker 2: Yeah, I feel. 11 00:00:41,520 --> 00:00:43,320 Speaker 3: Like anything could happen at this point. 12 00:00:44,159 --> 00:00:48,480 Speaker 2: It is going to It is going to a lot 13 00:00:48,520 --> 00:00:57,280 Speaker 2: of frustrating things are going to happen. So we'd kind 14 00:00:57,280 --> 00:01:00,600 Speaker 2: of left off by setting up the rationalists where they 15 00:01:00,680 --> 00:01:03,720 Speaker 2: came from, some of the different strains of thought and 16 00:01:03,800 --> 00:01:07,080 Speaker 2: beliefs that come out of their weird thought experiments. And 17 00:01:07,160 --> 00:01:10,160 Speaker 2: now we are talking about a person who falls into 18 00:01:10,200 --> 00:01:13,120 Speaker 2: this movement fairly early on and is going to be 19 00:01:13,160 --> 00:01:16,679 Speaker 2: the leader of this quote unquote group, the Zizians, who 20 00:01:16,680 --> 00:01:20,319 Speaker 2: were responsible for these murders that just happened. Ziz LaSota 21 00:01:20,440 --> 00:01:23,640 Speaker 2: was born in nineteen ninety or nineteen ninety one. I 22 00:01:23,680 --> 00:01:26,440 Speaker 2: don't have an exact birth date. She's known to be 23 00:01:26,480 --> 00:01:28,600 Speaker 2: thirty four years old as of twenty twenty five, so 24 00:01:28,920 --> 00:01:33,240 Speaker 2: somewhere in that field. She was born in Fairbanks, Alaska, 25 00:01:33,319 --> 00:01:35,440 Speaker 2: and grew up there as her father worked for the 26 00:01:35,560 --> 00:01:39,479 Speaker 2: University of Alaska as an AI researcher. We know very 27 00:01:39,520 --> 00:01:42,840 Speaker 2: little of the specifics of her childhood or upbringing, but 28 00:01:42,920 --> 00:01:45,280 Speaker 2: in more than one hundred thousand words of blog posts, 29 00:01:45,319 --> 00:01:48,720 Speaker 2: she did make some references to her early years. She 30 00:01:48,840 --> 00:01:51,880 Speaker 2: claims to have been talented in engineering and computer science 31 00:01:51,920 --> 00:01:53,720 Speaker 2: from a young age, and there's no real reason to 32 00:01:53,760 --> 00:01:56,560 Speaker 2: doubt this. The best single article on all of this 33 00:01:56,680 --> 00:01:59,240 Speaker 2: is a piece in Wired by Evan Ratliff. He found 34 00:01:59,280 --> 00:02:01,440 Speaker 2: a twenty fourteen blog post by Ziz 35 00:02:01,480 --> 00:02:04,360 Speaker 2: where she wrote, My friends and family, even if they 36 00:02:04,360 --> 00:02:07,200 Speaker 2: think I'm weird, don't really seem to be bothered by 37 00:02:07,200 --> 00:02:09,240 Speaker 2: the fact that I'm weird.
But one thing I can 38 00:02:09,280 --> 00:02:11,760 Speaker 2: tell you is that I used to deemphasize my weirdness 39 00:02:11,760 --> 00:02:14,240 Speaker 2: around them, and then I stopped and found that being 40 00:02:14,280 --> 00:02:16,480 Speaker 2: unapologetically weird is a lot more fun. 41 00:02:17,360 --> 00:02:17,560 Speaker 3: Now. 42 00:02:17,680 --> 00:02:20,040 Speaker 2: It's important you know Ziz is not the name this 43 00:02:20,080 --> 00:02:22,959 Speaker 2: person was born under. She's a trans woman, and so 44 00:02:23,200 --> 00:02:25,920 Speaker 2: I'm like using the name that she adopts later, but 45 00:02:26,040 --> 00:02:29,040 Speaker 2: she is not transitioned at this point like this, This 46 00:02:29,080 --> 00:02:31,640 Speaker 2: is when she's a kid, right, and she's not going 47 00:02:31,639 --> 00:02:34,440 Speaker 2: to transition until fairly late in the story after coming 48 00:02:34,440 --> 00:02:37,239 Speaker 2: to San Francisco. So you just keep that in mind 49 00:02:37,280 --> 00:02:40,280 Speaker 2: as this is going on here. Hey, everyone, Robert here, 50 00:02:40,280 --> 00:02:42,880 Speaker 2: just a little additional context, as best as I think 51 00:02:42,960 --> 00:02:45,720 Speaker 2: anyone can tell. If you're curious about where the name 52 00:02:45,840 --> 00:02:49,960 Speaker 2: Ziz came from, there's another piece of serially released online 53 00:02:50,000 --> 00:02:53,320 Speaker 2: fiction that's not like a rationalist story, but it's very 54 00:02:53,360 --> 00:02:59,079 Speaker 2: popular with rationalists. It's called Worm. Ziz is a character 55 00:02:59,240 --> 00:03:04,639 Speaker 2: in that that's effectively like an angel-like being who 56 00:03:04,800 --> 00:03:08,760 Speaker 2: can like manipulate the future, usually in order to do 57 00:03:09,080 --> 00:03:13,359 Speaker 2: very bad things. Anyway, that's where the name comes from. 58 00:03:13,440 --> 00:03:17,359 Speaker 2: So smart kid, really good with computers, kind of weird, 59 00:03:18,120 --> 00:03:22,240 Speaker 2: and embraces being unapologetically weird at a certain point in 60 00:03:22,240 --> 00:03:26,000 Speaker 2: her childhood. Hey, everybody, Robert here. Did not have this 61 00:03:26,120 --> 00:03:29,679 Speaker 2: piece of information when I first put the episode together, 62 00:03:30,560 --> 00:03:33,680 Speaker 2: but I came across a quote in an article from 63 00:03:33,800 --> 00:03:39,240 Speaker 2: the Boston Globe that provides additional context on Ziz's childhood. Quote: 64 00:03:39,440 --> 00:03:41,560 Speaker 2: In middle school, the teen was among a group of 65 00:03:41,560 --> 00:03:45,200 Speaker 2: students who managed to infiltrate the school district's payroll system 66 00:03:45,440 --> 00:03:48,520 Speaker 2: and award huge paychecks to teachers they admired while slashing 67 00:03:48,520 --> 00:03:52,480 Speaker 2: the salaries of those they despised. According to one teacher, Ziz, 68 00:03:52,520 --> 00:03:56,680 Speaker 2: the teacher said, struggled to regulate strong emotions, often erupting 69 00:03:56,760 --> 00:04:00,160 Speaker 2: in tantrums. I wish I'd had this when David 70 00:04:00,320 --> 00:04:03,360 Speaker 2: was on, but it definitely sets up some of the things 71 00:04:03,400 --> 00:04:06,680 Speaker 2: that are coming.
She goes to the U of Alaska 72 00:04:06,680 --> 00:04:10,600 Speaker 2: for her undergraduate degree in computer engineering in February of 73 00:04:10,600 --> 00:04:13,560 Speaker 2: two thousand and nine, which is when Eliezer Yudkowsky started 74 00:04:13,640 --> 00:04:19,240 Speaker 2: Less Wrong. Ziz starts kind of getting drawn into some 75 00:04:19,360 --> 00:04:23,920 Speaker 2: of the people who are around this growing subculture, right, 76 00:04:24,400 --> 00:04:29,000 Speaker 2: and she's drawn in initially by veganism. So Ziz becomes 77 00:04:29,000 --> 00:04:31,600 Speaker 2: a vegan at a fairly young age. Her family are 78 00:04:31,720 --> 00:04:36,080 Speaker 2: not vegans, and she's obsessed with the concept of animal sentience, 79 00:04:36,480 --> 00:04:40,600 Speaker 2: right, of the fact that like animals are thinking and 80 00:04:40,600 --> 00:04:43,400 Speaker 2: feeling beings just like human beings. And a lot of 81 00:04:43,400 --> 00:04:47,400 Speaker 2: this is based in her interest in a kind of foundational rationalist. 82 00:04:48,680 --> 00:04:51,680 Speaker 2: A lot of this is based in her interest in 83 00:04:51,720 --> 00:04:55,480 Speaker 2: a foundational rationalist and EA figure, a guy named Brian 84 00:04:55,600 --> 00:04:59,600 Speaker 2: Tomasik. Brian is a writer and a software engineer as 85 00:04:59,600 --> 00:05:02,840 Speaker 2: well as an animal rights activist, and as a thinker. 86 00:05:03,160 --> 00:05:05,800 Speaker 2: He's what you'd call a long termist, right, which is, 87 00:05:06,279 --> 00:05:08,920 Speaker 2: you know, pretty tied to the EA guys. These are 88 00:05:08,960 --> 00:05:11,560 Speaker 2: all the same people using kind of different words to 89 00:05:11,600 --> 00:05:15,839 Speaker 2: describe the aspects of what they believe. His organization is 90 00:05:15,880 --> 00:05:18,600 Speaker 2: the Center on Long Term Risk, which is a think 91 00:05:18,680 --> 00:05:21,800 Speaker 2: tank he establishes that's at the ground floor of these 92 00:05:21,839 --> 00:05:25,800 Speaker 2: effective altruism discussions, and the goal for the Center on 93 00:05:25,880 --> 00:05:29,599 Speaker 2: Long Term Risk is to find ways to reduce suffering 94 00:05:29,800 --> 00:05:34,040 Speaker 2: on a long timeline. Tomasik is obsessed with the concept 95 00:05:34,080 --> 00:05:37,240 Speaker 2: of suffering, and specifically obsessed with the concept of suffering as a 96 00:05:37,360 --> 00:05:41,560 Speaker 2: mathematical concept. So when I say to you, I want 97 00:05:41,560 --> 00:05:44,560 Speaker 2: to end suffering, you probably think, like, oh, you want 98 00:05:44,560 --> 00:05:46,839 Speaker 2: to like, you know, go help people who don't have 99 00:05:47,040 --> 00:05:49,560 Speaker 2: access to clean water, or like who have like worms 100 00:05:49,560 --> 00:05:52,280 Speaker 2: and stuff that they're dealing with, have access to medicine. 101 00:05:52,360 --> 00:05:57,280 Speaker 2: That's what normal people think of, right, you know, maybe 102 00:05:57,360 --> 00:05:59,880 Speaker 2: try to improve access to medical care, that sort of stuff. 103 00:06:00,360 --> 00:06:04,200 Speaker 2: Tomasik thinks of suffering as like a mass, like 104 00:06:04,240 --> 00:06:07,560 Speaker 2: an aggregate mass that he wants to reduce in the 105 00:06:07,680 --> 00:06:12,520 Speaker 2: long term through actions, right.
It's a numbers game to him, 106 00:06:12,760 --> 00:06:16,320 Speaker 2: in other words, and his idea of ultimate good is 107 00:06:16,360 --> 00:06:20,800 Speaker 2: to reduce and end the suffering of sentient life. Critical 108 00:06:20,839 --> 00:06:23,120 Speaker 2: to his belief system and the one that Ziz starts 109 00:06:23,120 --> 00:06:26,240 Speaker 2: to develop, is the growing understanding that sentience is much 110 00:06:26,279 --> 00:06:30,159 Speaker 2: more common than many people had previously assumed. Part of 111 00:06:30,200 --> 00:06:32,960 Speaker 2: this comes from long standing debates with their origins in 112 00:06:33,040 --> 00:06:36,160 Speaker 2: Christian doctrine as to whether or not animals have souls 113 00:06:36,320 --> 00:06:40,880 Speaker 2: or are basically machines with meat, right, that don't feel anything. 114 00:06:41,040 --> 00:06:44,359 Speaker 2: There's still a lot of Christian evangelicals who feel that 115 00:06:44,400 --> 00:06:47,400 Speaker 2: way today about like at least the animals we eat, 116 00:06:47,560 --> 00:06:50,080 Speaker 2: you know, like, well, they don't really think, it's fine. 117 00:06:50,120 --> 00:06:52,000 Speaker 2: God gave them to us. We can do whatever we 118 00:06:52,040 --> 00:06:52,480 Speaker 2: want to them. 119 00:06:52,680 --> 00:06:53,839 Speaker 3: Here we eat and. 120 00:06:54,360 --> 00:06:57,640 Speaker 2: To be fair, this is an extremely common way 121 00:06:57,880 --> 00:07:00,599 Speaker 2: that people in Japan feel about, like, fish, even whales 122 00:07:00,680 --> 00:07:03,200 Speaker 2: and dolphins, like the much more intelligent, they're not fish, 123 00:07:03,240 --> 00:07:05,400 Speaker 2: but like the much more intelligent ocean going creatures, it's 124 00:07:05,440 --> 00:07:07,920 Speaker 2: like, they're fish, they don't think, you do whatever to them, 125 00:07:08,040 --> 00:07:11,400 Speaker 2: you know. This is a reason for a lot of 126 00:07:11,440 --> 00:07:13,960 Speaker 2: like the really fucked up stuff with like whaling fleets 127 00:07:14,000 --> 00:07:16,040 Speaker 2: in that part of the world. So this is a 128 00:07:16,080 --> 00:07:18,560 Speaker 2: thing all over the planet. People are very good at 129 00:07:18,560 --> 00:07:21,680 Speaker 2: deciding certain things we want to eat are machines 130 00:07:21,680 --> 00:07:25,000 Speaker 2: that don't feel anything, you know, it's just much more 131 00:07:25,120 --> 00:07:29,680 Speaker 2: comfortable that way. Now, this is obviously, like, you go 132 00:07:29,720 --> 00:07:31,680 Speaker 2: back to, like, paganism. The pagans would have been like, what 133 00:07:31,680 --> 00:07:34,800 Speaker 2: do you mean animals don't think or have souls? Animals, 134 00:07:34,840 --> 00:07:38,000 Speaker 2: like, animals think, you know, like they're they're they're like, 135 00:07:38,200 --> 00:07:41,160 Speaker 2: you're telling me, like my horse that I love, it 136 00:07:41,400 --> 00:07:46,920 Speaker 2: doesn't think, you know, that's nonsense. But it's this thing 137 00:07:46,960 --> 00:07:50,720 Speaker 2: that in like early modernity especially gets more common. But 138 00:07:50,760 --> 00:07:52,560 Speaker 2: there's also, this is when we start to have debates 139 00:07:52,560 --> 00:07:55,680 Speaker 2: about like what is sentience and what is thinking?
And 140 00:07:55,840 --> 00:07:57,760 Speaker 2: a lot of them are centered around trying to answer, 141 00:07:57,840 --> 00:08:03,240 Speaker 2: like, are animals sentient? And the initial definition of sentience 142 00:08:03,280 --> 00:08:06,720 Speaker 2: that most of these people are using is can it reason? 143 00:08:06,920 --> 00:08:10,160 Speaker 2: Can it speak? If we can't prove that like a 144 00:08:10,240 --> 00:08:14,360 Speaker 2: dog or a cow can reason, and if it can't 145 00:08:14,400 --> 00:08:17,440 Speaker 2: speak to us, right, then it's not sentient. That's how 146 00:08:17,480 --> 00:08:20,200 Speaker 2: a lot of people feel. It's an English philosopher named 147 00:08:20,240 --> 00:08:24,160 Speaker 2: Jeremy Bentham, who first argues, I think, that what matters 148 00:08:24,240 --> 00:08:26,640 Speaker 2: isn't can it reason or can it speak? But can 149 00:08:26,720 --> 00:08:31,000 Speaker 2: it suffer? Because a machine can't suffer. If these are 150 00:08:31,000 --> 00:08:35,000 Speaker 2: machines with meat, they can't suffer. If these can suffer, 151 00:08:35,200 --> 00:08:38,720 Speaker 2: they're not machines with meat, right, And this is the 152 00:08:38,800 --> 00:08:43,560 Speaker 2: kind of thing, how we define sentience is a moving thing. 153 00:08:43,840 --> 00:08:46,800 Speaker 2: Like you can find different definitions of it. But the 154 00:08:46,880 --> 00:08:50,319 Speaker 2: last couple of decades in particular of actually very good 155 00:08:50,400 --> 00:08:54,760 Speaker 2: data has made it clear, I think inarguably, that basically 156 00:08:54,840 --> 00:08:57,480 Speaker 2: every living thing on this planet has a degree of 157 00:08:57,480 --> 00:09:00,240 Speaker 2: what you would call sentience, if you describe being 158 00:09:00,280 --> 00:09:03,000 Speaker 2: sentient the way it generally is now, which is a 159 00:09:03,120 --> 00:09:08,079 Speaker 2: creature has the capacity for subjective experience with a positive 160 00:09:08,160 --> 00:09:12,679 Speaker 2: or negative valence, i.e., can feel pain or pleasure, 161 00:09:13,080 --> 00:09:16,640 Speaker 2: and also it can feel it as an individual, right, 162 00:09:18,080 --> 00:09:20,280 Speaker 2: it doesn't mean, you know, sometimes people use the term 163 00:09:20,320 --> 00:09:23,480 Speaker 2: affective sentience to refer to this, to differentiate it from 164 00:09:23,559 --> 00:09:27,400 Speaker 2: like being able to reason and make moral decisions. You know, 165 00:09:28,400 --> 00:09:32,680 Speaker 2: for example, ants I don't think can make moral decisions, 166 00:09:32,800 --> 00:09:34,959 Speaker 2: you know, in any way that we would recognize, 167 00:09:35,080 --> 00:09:38,840 Speaker 2: they certainly don't think about stuff that way. But in twenty 168 00:09:38,880 --> 00:09:43,199 Speaker 2: twenty five, research published by doctor Volker Nehring found evidence 169 00:09:43,240 --> 00:09:46,080 Speaker 2: that ants are capable of remembering for long periods of 170 00:09:46,120 --> 00:09:49,760 Speaker 2: time violent encounters they have with other individual ants and 171 00:09:49,880 --> 00:09:53,480 Speaker 2: holding grudges against those ants. Right, just like us. They're 172 00:09:53,600 --> 00:09:56,560 Speaker 2: just like us. And there's strong evidence that ants do 173 00:09:56,640 --> 00:09:59,120 Speaker 2: feel pain. Right, we're now pretty sure of that.
174 00:09:59,200 --> 00:10:01,480 Speaker 2: And in fact, again this is an argument that a 175 00:10:01,559 --> 00:10:04,240 Speaker 2: number of researchers in this space will make. Sentience, 176 00:10:04,280 --> 00:10:07,120 Speaker 2: or something like this kind of sentience, 177 00:10:07,160 --> 00:10:10,800 Speaker 2: the ability to have subjective positive and negative experiences, is probably 178 00:10:10,960 --> 00:10:13,360 Speaker 2: universal to living things, or very close to it. 179 00:10:13,520 --> 00:10:13,760 Speaker 3: Right. 180 00:10:15,600 --> 00:10:19,040 Speaker 2: It's an interesting body of research, but it's 181 00:10:19,080 --> 00:10:22,040 Speaker 2: fairly solid at this point. And again I say this 182 00:10:22,080 --> 00:10:25,520 Speaker 2: as somebody who like hunts and raises livestock. I don't 183 00:10:25,520 --> 00:10:28,160 Speaker 2: think there's any any solid reason to disagree with this. 184 00:10:28,880 --> 00:10:31,320 Speaker 2: So you can see there's a basis to a lot 185 00:10:31,360 --> 00:10:34,120 Speaker 2: of what Tomasik is saying, right, which is that 186 00:10:34,160 --> 00:10:38,440 Speaker 2: if what matters is reducing the overall amount 187 00:10:38,480 --> 00:10:42,280 Speaker 2: of suffering in the world, and if you're looking at 188 00:10:42,360 --> 00:10:44,520 Speaker 2: suffering as a mass, if you're just adding up all 189 00:10:44,559 --> 00:10:47,200 Speaker 2: of the bad things experienced by all of the living things, 190 00:10:47,400 --> 00:10:50,040 Speaker 2: animal suffering is a lot of the suffering. So if 191 00:10:50,040 --> 00:10:52,640 Speaker 2: our goal is to reduce suffering, animal welfare is hugely 192 00:10:52,640 --> 00:10:53,240 Speaker 2: important, right. 193 00:10:53,280 --> 00:10:54,400 Speaker 3: It's a great place to start. 194 00:10:54,559 --> 00:10:57,240 Speaker 2: Great, fair enough, you know, a little bit of 195 00:10:57,280 --> 00:10:58,600 Speaker 2: a weird way to phrase it, but fine. 196 00:10:58,760 --> 00:10:58,960 Speaker 1: Yeah. 197 00:11:00,320 --> 00:11:05,120 Speaker 2: So here's the problem, though. Tomasik, like all 198 00:11:05,160 --> 00:11:09,120 Speaker 2: these guys, spends too much time. None of them can 199 00:11:09,160 --> 00:11:12,800 Speaker 2: be like, hey, had a good thought, we're done, setting 200 00:11:12,840 --> 00:11:16,280 Speaker 2: that thought down, moving on. So he keeps thinking about 201 00:11:16,320 --> 00:11:18,559 Speaker 2: shit like this, and it leads him to some very 202 00:11:18,559 --> 00:11:23,000 Speaker 2: irrational takes. For example, in twenty fourteen, Tomasik starts 203 00:11:23,120 --> 00:11:25,640 Speaker 2: arguing that it might be immoral to kill characters 204 00:11:25,640 --> 00:11:28,280 Speaker 2: in video games, and I'm going to quote from an 205 00:11:28,320 --> 00:11:32,440 Speaker 2: article in Vox. He argues that while NPCs do not 206 00:11:32,520 --> 00:11:35,400 Speaker 2: have anywhere near the mental complexity of animals, the difference 207 00:11:35,440 --> 00:11:37,720 Speaker 2: is one of degree rather than kind, and we should 208 00:11:37,720 --> 00:11:40,440 Speaker 2: care at least a tiny amount about their suffering, especially 209 00:11:40,520 --> 00:11:46,760 Speaker 2: if they grow more complex.
And his argument is that like, yeah, 210 00:11:46,840 --> 00:11:50,320 Speaker 2: mostly it doesn't matter, like, individually killing a Goomba or 211 00:11:50,320 --> 00:11:52,880 Speaker 2: a bet or a guy in GTA five, but like, 212 00:11:53,440 --> 00:11:56,320 Speaker 2: because they're getting more complicated and able to try to 213 00:11:56,360 --> 00:11:59,560 Speaker 2: avoid injury and stuff, there's evidence that there's some sort 214 00:11:59,559 --> 00:12:02,280 Speaker 2: of suffering there, and thus the sheer mass of 215 00:12:02,400 --> 00:12:05,400 Speaker 2: NPCs being killed, that might be like enough that it's 216 00:12:05,440 --> 00:12:09,480 Speaker 2: ethically relevant to consider. And I think that's silly. Yeah, 217 00:12:09,640 --> 00:12:15,680 Speaker 2: I think that's ridiculous. I'm sorry, man, No, I'm sorry. 218 00:12:18,200 --> 00:12:19,600 Speaker 3: But that's a lot of the fun of the game. 219 00:12:20,120 --> 00:12:21,679 Speaker 3: Kill you. 220 00:12:22,320 --> 00:12:24,880 Speaker 2: If you're telling me, like we need to be deeply 221 00:12:24,920 --> 00:12:27,720 Speaker 2: concerned about the welfare of like cows that we lock 222 00:12:27,800 --> 00:12:31,880 Speaker 2: into factory farms, you got me, absolutely for sure. If 223 00:12:31,880 --> 00:12:34,680 Speaker 2: you're telling me I should feel bad about running down 224 00:12:34,679 --> 00:12:36,480 Speaker 2: a bunch of cops in Grand Theft Auto. 225 00:12:38,720 --> 00:12:40,240 Speaker 3: It's also one of those things where it's like you 226 00:12:40,280 --> 00:12:41,440 Speaker 3: got to think locally. Man. 227 00:12:41,480 --> 00:12:45,840 Speaker 2: There's yeah, there's there's like there's like this is this 228 00:12:46,040 --> 00:12:47,640 Speaker 2: is the I mean? And he does say like, I 229 00:12:47,640 --> 00:12:49,960 Speaker 2: don't consider this a main problem, but like the fact 230 00:12:50,000 --> 00:12:52,120 Speaker 2: that you think this is a problem, it means 231 00:12:52,160 --> 00:13:00,080 Speaker 2: that you believe silly things about consciousness. Yeah, anyway, so 232 00:13:00,360 --> 00:13:02,120 Speaker 2: this is, I think, the fact that he 233 00:13:02,200 --> 00:13:04,160 Speaker 2: leads himself here is kind of evidence of the sort 234 00:13:04,160 --> 00:13:07,000 Speaker 2: of logical fractures that are very common in this community. 235 00:13:07,040 --> 00:13:10,240 Speaker 2: But this is the guy that young Ziz is drawn to. 236 00:13:10,440 --> 00:13:13,120 Speaker 2: She loves this dude, right, He is kind of her 237 00:13:13,200 --> 00:13:18,240 Speaker 2: first intellectual heartthrob, and she writes, quote, My primary 238 00:13:18,280 --> 00:13:21,400 Speaker 2: concern upon learning about the singularity was how do I 239 00:13:21,480 --> 00:13:24,640 Speaker 2: make this benefit all sentient life, not just humans? So 240 00:13:24,679 --> 00:13:27,559 Speaker 2: she gets interested in this idea of the singularity, that it's 241 00:13:27,559 --> 00:13:31,080 Speaker 2: inevitable that an AI god is going to arise, and 242 00:13:31,520 --> 00:13:35,240 Speaker 2: she gets into the, you know, the rationalist thing of 243 00:13:35,320 --> 00:13:37,079 Speaker 2: we have to make sure that this is a nice 244 00:13:37,160 --> 00:13:39,640 Speaker 2: AI rather than a mean one.
But she has this 245 00:13:39,679 --> 00:13:42,600 Speaker 2: other thing to it, which is this AI has to 246 00:13:42,640 --> 00:13:45,680 Speaker 2: care as much as I do about animal life, right, 247 00:13:46,080 --> 00:13:48,880 Speaker 2: otherwise we're not really making the world better, you know. 248 00:13:50,760 --> 00:13:50,960 Speaker 3: Now. 249 00:13:51,120 --> 00:13:54,360 Speaker 2: Tomasik advises her to check out Less Wrong, which is 250 00:13:54,400 --> 00:13:58,280 Speaker 2: how Ziz starts reading Eliezer Yudkowsky's work. From there, in 251 00:13:58,320 --> 00:14:02,640 Speaker 2: twenty twelve, she starts reading up on effective altruism and existential risk, 252 00:14:03,559 --> 00:14:05,439 Speaker 2: which is a term that means the risk that a 253 00:14:05,480 --> 00:14:09,800 Speaker 2: super intelligent AI will kill us all. She starts believing 254 00:14:09,840 --> 00:14:14,360 Speaker 2: in all of this kind of stuff, and her 255 00:14:14,360 --> 00:14:18,000 Speaker 2: particular belief is that like the Singularity, when it happens, 256 00:14:18,120 --> 00:14:20,800 Speaker 2: is going to occur in a flash, kind of like 257 00:14:20,840 --> 00:14:23,880 Speaker 2: the rapture, and almost immediately lead to the creation of 258 00:14:23,920 --> 00:14:27,240 Speaker 2: either a hell or a heaven. Right, and this will 259 00:14:27,240 --> 00:14:29,800 Speaker 2: be done by, the term they use for this inevitable 260 00:14:29,840 --> 00:14:33,200 Speaker 2: AI is the Singleton, right. That's what they call the 261 00:14:33,240 --> 00:14:37,160 Speaker 2: AI god that's going to come about, right, And so 262 00:14:37,360 --> 00:14:39,960 Speaker 2: her obsession is that she has to find a way 263 00:14:40,000 --> 00:14:43,360 Speaker 2: to make this Singleton a nice AI that cares about 264 00:14:43,400 --> 00:14:45,720 Speaker 2: animals as much as it cares about people. Right, that's 265 00:14:45,720 --> 00:14:49,320 Speaker 2: her initial big motivation. So she starts emailing Tomasik with 266 00:14:49,360 --> 00:14:53,520 Speaker 2: her concerns because she's worried that the other rationalists aren't vegans, right, 267 00:14:53,600 --> 00:14:56,520 Speaker 2: and they don't feel like animal welfare is like the 268 00:14:56,560 --> 00:15:00,040 Speaker 2: top priority for making sure this AI is good, and 269 00:15:00,120 --> 00:15:04,000 Speaker 2: she really wants to convert this whole community to veganism 270 00:15:04,040 --> 00:15:07,360 Speaker 2: in order to ensure that the Singleton is as focused 271 00:15:07,360 --> 00:15:10,600 Speaker 2: on insect and animal welfare as human welfare. And Tomasik 272 00:15:10,640 --> 00:15:13,320 Speaker 2: does care about animal rights, but he disagrees with 273 00:15:13,320 --> 00:15:15,680 Speaker 2: her because he's like, no, what matters is maximizing the 274 00:15:15,720 --> 00:15:19,440 Speaker 2: reduction of suffering, and like, a good Singleton will solve 275 00:15:19,520 --> 00:15:22,360 Speaker 2: climate change and shit, which will be better for the animals. 276 00:15:22,640 --> 00:15:24,960 Speaker 2: And if we focus on trying to convert everybody in 277 00:15:25,000 --> 00:15:28,320 Speaker 2: the rationalist space to veganism, it's going to stop 278 00:15:28,400 --> 00:15:32,960 Speaker 2: us from accomplishing these bigger goals, right. This is shattering 279 00:15:33,280 --> 00:15:36,360 Speaker 2: to Ziz.
Right, she decides that Tomasik 280 00:15:36,440 --> 00:15:39,160 Speaker 2: doesn't care about good things, and she decides that she's 281 00:15:39,200 --> 00:15:43,120 Speaker 2: basically alone in her values. And so her first move. 282 00:15:43,040 --> 00:15:46,080 Speaker 3: Time to start a smaller subcult, shit like that. 283 00:15:46,200 --> 00:15:51,560 Speaker 2: Sounds like we're on our way. She first considers embracing 284 00:15:51,600 --> 00:15:55,440 Speaker 2: what she calls negative utilitarianism. And this is an example 285 00:15:55,440 --> 00:15:57,280 Speaker 2: of the fact that from the jump, this is a 286 00:15:58,840 --> 00:16:03,080 Speaker 2: young woman who's not well, right, because once her hero 287 00:16:03,280 --> 00:16:06,520 Speaker 2: is like, I don't know if veganism is necessarily 288 00:16:06,760 --> 00:16:11,000 Speaker 2: the priority we have to embrace right now, her immediate 289 00:16:11,040 --> 00:16:13,680 Speaker 2: goal is to jump to, well, maybe what I should do 290 00:16:14,440 --> 00:16:18,119 Speaker 2: is optimize myself to cause as much harm to humanity 291 00:16:18,240 --> 00:16:21,120 Speaker 2: and quote destroy the world, to prevent it from becoming 292 00:16:21,160 --> 00:16:24,120 Speaker 2: hell for mostly everyone. So that's a jump. 293 00:16:24,520 --> 00:16:24,760 Speaker 3: You know. 294 00:16:25,240 --> 00:16:29,040 Speaker 2: That's not somebody who's doing well, who you think is healthy. 295 00:16:29,680 --> 00:16:34,360 Speaker 3: No, she's uh, she's having a tough time out. Uh huh. 296 00:16:34,840 --> 00:16:37,920 Speaker 2: So Ziz does ultimately decide she should still work to 297 00:16:37,960 --> 00:16:41,680 Speaker 2: bring about a nice AI, even though that necessitates working 298 00:16:41,720 --> 00:16:44,760 Speaker 2: with people she describes as flesh eating monsters who had 299 00:16:44,760 --> 00:16:47,040 Speaker 2: created hell on Earth for far more people than those 300 00:16:47,040 --> 00:16:51,960 Speaker 2: they had helped. That's everybody who eats meat, Okay, yes, yes, 301 00:16:52,200 --> 00:16:55,760 Speaker 2: And it's ironic, large group. It's ironic because, like, 302 00:16:55,800 --> 00:16:57,920 Speaker 2: she really wants to be in the 303 00:16:57,960 --> 00:17:00,200 Speaker 2: tech industry. She's trying to get in, and all these 304 00:17:00,200 --> 00:17:02,160 Speaker 2: people are in the tech industry. That's a pretty good 305 00:17:02,160 --> 00:17:04,639 Speaker 2: description of a lot of the tech industry. They're in 306 00:17:04,760 --> 00:17:07,320 Speaker 2: fact monsters who have created hell on Earth for more 307 00:17:07,320 --> 00:17:09,919 Speaker 2: people than they've helped. But she means that for, like, 308 00:17:10,600 --> 00:17:12,840 Speaker 2: I don't know, your auntie who has a hamburger once 309 00:17:12,880 --> 00:17:16,280 Speaker 2: a week. And look, again, factory farming, evil. I just 310 00:17:16,320 --> 00:17:22,720 Speaker 2: don't think that's how morality works. I think you're going 311 00:17:22,760 --> 00:17:23,400 Speaker 2: a little far. 312 00:17:23,840 --> 00:17:25,439 Speaker 3: No, she's making big jumps. 313 00:17:25,600 --> 00:17:30,560 Speaker 2: Yeah, bold thinkers, bold thinker. Yeah. Now, what 314 00:17:30,600 --> 00:17:33,040 Speaker 2: you see here with this logic is that Ziz has 315 00:17:33,160 --> 00:17:36,600 Speaker 2: taken this. She has a massive case of main character syndrome.
316 00:17:36,720 --> 00:17:36,880 Speaker 3: Right. 317 00:17:37,440 --> 00:17:40,080 Speaker 2: All of this is based in her attitude that I 318 00:17:40,359 --> 00:17:45,480 Speaker 2: have to save the universe by creating, by helping to, 319 00:17:45,800 --> 00:17:49,240 Speaker 2: or figuring out how to create an AI that can 320 00:17:49,400 --> 00:17:52,840 Speaker 2: end the eternal holocaust of all animal life and also 321 00:17:52,960 --> 00:17:53,760 Speaker 2: save humanity. 322 00:17:53,960 --> 00:17:55,320 Speaker 3: Right to do? 323 00:17:55,600 --> 00:17:58,960 Speaker 2: That's me. And this is, this is a thing. 324 00:17:59,640 --> 00:17:59,920 Speaker 3: Again. 325 00:18:00,119 --> 00:18:04,080 Speaker 2: All of this comes out of both subcultural aspects and 326 00:18:04,119 --> 00:18:06,639 Speaker 2: aspects of American culture. One major problem that we have 327 00:18:06,720 --> 00:18:10,840 Speaker 2: in the society is Hollywood has trained us all on 328 00:18:10,880 --> 00:18:14,679 Speaker 2: a diet of movies with main characters that are the 329 00:18:14,720 --> 00:18:18,160 Speaker 2: special boy or the special girl with the special powers 330 00:18:18,440 --> 00:18:23,680 Speaker 2: who save the day, right, and real life doesn't work 331 00:18:23,720 --> 00:18:26,879 Speaker 2: that way very often. Right, the Nazis, there was no 332 00:18:27,000 --> 00:18:29,760 Speaker 2: special boy who stopped the Nazis. There were a lot 333 00:18:29,760 --> 00:18:32,040 Speaker 2: of farm boys who were just like, I guess I'll 334 00:18:32,040 --> 00:18:34,399 Speaker 2: go run at a machine gun nest until this is 335 00:18:34,480 --> 00:18:38,520 Speaker 2: done exactly. There were a lot of sixteen year old 336 00:18:38,600 --> 00:18:41,040 Speaker 2: Russians who were like, guess I'm gonna walk into a bullet, 337 00:18:41,200 --> 00:18:46,080 Speaker 2: you know, like that's that's how evil gets fought. Usually, unfortunately, 338 00:18:46,400 --> 00:18:50,600 Speaker 2: all were reluctant like that, yeah yeah, or a shitload of 339 00:18:50,600 --> 00:18:52,800 Speaker 2: guys in a lab figuring out how to make corn 340 00:18:52,880 --> 00:18:56,480 Speaker 2: that has higher yields so people don't starve. Right, These 341 00:18:56,480 --> 00:19:00,240 Speaker 2: are, these are really like how, like, huge 342 00:19:00,280 --> 00:19:02,359 Speaker 2: world problems get solved. 343 00:19:02,720 --> 00:19:04,119 Speaker 3: People who have been touched, you know. 344 00:19:04,320 --> 00:19:06,320 Speaker 2: Yeah, it's not people who have been touched, and it's 345 00:19:06,359 --> 00:19:09,240 Speaker 2: certainly not people who have entirely based their understanding of 346 00:19:09,280 --> 00:19:12,280 Speaker 2: the world on quotes from Star Wars and Harry Potter.
347 00:19:18,359 --> 00:19:21,919 Speaker 2: So some of this comes from just like, this is 348 00:19:21,960 --> 00:19:24,879 Speaker 2: a normal, deranged way of thinking that happens to a 349 00:19:24,880 --> 00:19:27,160 Speaker 2: lot of people in just Western culture. I think a lot 350 00:19:27,200 --> 00:19:32,440 Speaker 2: of this leads to, uh, why you get very comfortable 351 00:19:32,440 --> 00:19:35,720 Speaker 2: middle class people joining these very aggressive fascist movements in 352 00:19:35,760 --> 00:19:38,160 Speaker 2: the West, Like in Germany, it's like middle class, mostly 353 00:19:38,240 --> 00:19:40,720 Speaker 2: like middle class and upper middle class people in the US, 354 00:19:40,840 --> 00:19:43,959 Speaker 2: especially among like these street fighting, you know, Proud Boy types. 355 00:19:44,320 --> 00:19:48,000 Speaker 2: It's because, it's not because they're like suffering and desperate. 356 00:19:48,040 --> 00:19:52,280 Speaker 2: They're not starving in the streets. It's because they're bored 357 00:19:52,440 --> 00:19:55,159 Speaker 2: and they want to feel like they're fighting an epic 358 00:19:55,240 --> 00:19:56,200 Speaker 2: war against evil. 359 00:19:56,560 --> 00:19:58,639 Speaker 3: Yeah, I mean, you want to fill your time with importance, 360 00:19:58,720 --> 00:20:00,720 Speaker 3: right right, regardless of what you do, you want to 361 00:20:01,160 --> 00:20:02,800 Speaker 3: and you want to feel like you have a cause 362 00:20:02,840 --> 00:20:06,119 Speaker 3: worthy of fighting for. So in that, I guess I 363 00:20:06,160 --> 00:20:07,320 Speaker 3: see how you got here. 364 00:20:07,680 --> 00:20:09,520 Speaker 2: Yeah, So there's a piece, I mean, I think there's 365 00:20:09,520 --> 00:20:11,520 Speaker 2: a piece of this that originally it's just from, this 366 00:20:11,600 --> 00:20:13,720 Speaker 2: is something in our culture. But there's also a major, 367 00:20:13,880 --> 00:20:17,200 Speaker 2: a major chunk of this gets supercharged by the kind 368 00:20:17,240 --> 00:20:20,280 Speaker 2: of thinking that's common in EA and rationalist spaces, because 369 00:20:20,840 --> 00:20:25,120 Speaker 2: so rationalists and effective altruists are not ever thinking like, hey, 370 00:20:25,359 --> 00:20:28,480 Speaker 2: how do we as a species fix these major problems? 371 00:20:28,560 --> 00:20:28,760 Speaker 3: Right? 372 00:20:29,119 --> 00:20:34,680 Speaker 2: They're thinking, how do I make myself better, optimize myself 373 00:20:34,960 --> 00:20:41,200 Speaker 2: to be incredible, and how do I like fix the 374 00:20:41,200 --> 00:20:45,879 Speaker 2: major problems of the world alongside my mentally superpowered friends. 375 00:20:45,960 --> 00:20:46,120 Speaker 3: Right. 376 00:20:46,160 --> 00:20:51,840 Speaker 2: These are very individual focused philosophies and attitudes, right, And 377 00:20:51,920 --> 00:20:54,560 Speaker 2: so they do lend themselves to people who think that, like, 378 00:20:54,680 --> 00:20:57,919 Speaker 2: we are heroes who are uniquely empowered to save the 379 00:20:57,960 --> 00:21:02,760 Speaker 2: world. Ziz writes:
I did not trust most humans indifference 380 00:21:02,800 --> 00:21:05,640 Speaker 2: to build a net positive cosmos, even in the absence 381 00:21:05,640 --> 00:21:08,679 Speaker 2: of a technological convenience to prey on animals, So like, 382 00:21:09,240 --> 00:21:12,240 Speaker 2: I'm the only one who has the mental capability to 383 00:21:12,359 --> 00:21:16,160 Speaker 2: actually create the net positive cosmos that needs to come 384 00:21:16,200 --> 00:21:19,560 Speaker 2: into being. All of her discussion is talking in like 385 00:21:19,680 --> 00:21:22,679 Speaker 2: terms of I'm saving the universe, right, And a lot 386 00:21:22,720 --> 00:21:25,080 Speaker 2: of that does come out of the way many of 387 00:21:25,119 --> 00:21:28,520 Speaker 2: these people talk on the Internet about the stakes of AI, 388 00:21:28,640 --> 00:21:31,679 Speaker 2: and just like the importance of rationality. Again, this is 389 00:21:31,680 --> 00:21:35,480 Speaker 2: something scientology does. El Ron Hubbard always couched getting people 390 00:21:35,480 --> 00:21:37,600 Speaker 2: on dianetics in terms of we are going to save 391 00:21:37,640 --> 00:21:40,320 Speaker 2: the world and end war, right, like this is you know, 392 00:21:40,440 --> 00:21:45,439 Speaker 2: it's very normal for cold stuff. She starts reading around 393 00:21:45,440 --> 00:21:48,000 Speaker 2: this time when she's in college, Harry Potter and the 394 00:21:48,000 --> 00:21:51,840 Speaker 2: Methods of Rationality. This helps to solidify her feelings of 395 00:21:51,880 --> 00:21:55,280 Speaker 2: her own centrality as a hero figure. In a blog 396 00:21:55,320 --> 00:21:58,560 Speaker 2: post where she lays out her intellectual journey, she quotes 397 00:21:58,600 --> 00:22:02,760 Speaker 2: a line from that fan thick of yed Kowski's that is, 398 00:22:03,160 --> 00:22:07,280 Speaker 2: it's essentially about what Yudkowski calls the hero contract, right 399 00:22:07,400 --> 00:22:11,800 Speaker 2: or sorry, It's essentially about this concept called the hero contract. Right, 400 00:22:11,880 --> 00:22:17,080 Speaker 2: And there's this, there's this, This is a psychological concept 401 00:22:17,080 --> 00:22:20,640 Speaker 2: among academics right where. And it's about like, it's about 402 00:22:20,680 --> 00:22:24,200 Speaker 2: about analyzing how we as a how we should look 403 00:22:24,240 --> 00:22:30,440 Speaker 2: at the people who societies declare heroes and the communities 404 00:22:30,560 --> 00:22:34,440 Speaker 2: that declare them heroes and see them as in a dialogue, 405 00:22:34,960 --> 00:22:37,679 Speaker 2: right as in when when when you're in a country 406 00:22:37,680 --> 00:22:41,000 Speaker 2: decides this guy is a hero. He is through his 407 00:22:41,160 --> 00:22:43,840 Speaker 2: actions kind of conversing to them, and they are kind 408 00:22:43,840 --> 00:22:46,880 Speaker 2: of telling him what they expect from him. 409 00:22:47,000 --> 00:22:47,200 Speaker 3: Right. 410 00:22:47,840 --> 00:22:51,440 Speaker 2: But Yedkowski wrestles with this concept, right, and he comes 411 00:22:51,480 --> 00:22:54,560 Speaker 2: to some very weird conclusions about it. In one of 412 00:22:54,600 --> 00:22:58,159 Speaker 2: the worst articles that I've ever read, he frames it 413 00:22:58,200 --> 00:23:02,040 Speaker 2: as hero licensing. 
It refers to the fact that people 414 00:23:02,720 --> 00:23:04,639 Speaker 2: get angry at you if they don't think you have, 415 00:23:04,800 --> 00:23:07,360 Speaker 2: if you're trying to do something and they don't think 416 00:23:07,359 --> 00:23:09,600 Speaker 2: you have a hero license to do it. In other words, 417 00:23:09,840 --> 00:23:13,600 Speaker 2: if you're trying to do something that they don't 418 00:23:13,600 --> 00:23:16,760 Speaker 2: think you're qualified to do. He'll describe that as them 419 00:23:16,760 --> 00:23:19,040 Speaker 2: not thinking you have, like, a hero license. And he like 420 00:23:19,080 --> 00:23:22,120 Speaker 2: writes this annoying article that's like a conversation between him 421 00:23:22,160 --> 00:23:24,560 Speaker 2: and a person who's supposed to embody the community of 422 00:23:24,600 --> 00:23:27,679 Speaker 2: people who don't think he should write Harry Potter fan fiction. 423 00:23:29,160 --> 00:23:32,399 Speaker 2: It's all very silly, and again, all of this is ridiculous. But 424 00:23:32,840 --> 00:23:36,920 Speaker 2: Ziz is very interested in the idea of the hero contract, right, 425 00:23:37,280 --> 00:23:39,159 Speaker 2: but she comes up with her own spin on it, 426 00:23:39,200 --> 00:23:43,200 Speaker 2: which she calls the true hero contract, right. And instead 427 00:23:43,240 --> 00:23:47,520 Speaker 2: of, again, the academic term, the hero contract, means 428 00:23:47,560 --> 00:23:52,280 Speaker 2: societies and communities pick heroes, and those heroes and the 429 00:23:52,359 --> 00:23:54,800 Speaker 2: community that they're in are in a constant dialogue with 430 00:23:54,840 --> 00:23:58,440 Speaker 2: each other about what is heroic and what is expected, right, 431 00:23:58,760 --> 00:24:00,920 Speaker 2: what the hero needs from the community, and vice versa. 432 00:24:01,080 --> 00:24:05,440 Speaker 2: You know, that's all that that's saying. Ziz says, No, no, nah, 433 00:24:05,840 --> 00:24:10,720 Speaker 2: that's bullshit. The real hero contract is quote pour free 434 00:24:11,480 --> 00:24:13,679 Speaker 2: energy at my direction, and it will go into the 435 00:24:13,760 --> 00:24:15,280 Speaker 2: optimization for good. 436 00:24:16,000 --> 00:24:18,240 Speaker 3: In other words, classic. 437 00:24:17,840 --> 00:24:22,840 Speaker 2: Ziz. It's not a dialogue. If you're the hero, the 438 00:24:22,920 --> 00:24:27,040 Speaker 2: community has to give you their energy and time and power, 439 00:24:27,560 --> 00:24:30,520 Speaker 2: and you will use it to optimize them for good 440 00:24:30,560 --> 00:24:32,800 Speaker 2: because they don't know how to do it themselves, because 441 00:24:32,800 --> 00:24:34,080 Speaker 2: they're not really able to think. 442 00:24:34,240 --> 00:24:35,120 Speaker 3: You know, they're not the. 443 00:24:35,119 --> 00:24:38,080 Speaker 2: hero, because they're not the hero. Right, you are. You are. 444 00:24:38,359 --> 00:24:39,920 Speaker 3: You are the all powerful hero. 445 00:24:41,560 --> 00:24:44,080 Speaker 2: Now this is a fancy way of describing how cult 446 00:24:44,160 --> 00:24:49,199 Speaker 2: leaders think. Right, Yeah, everyone exists to pour energy into me, 447 00:24:49,280 --> 00:24:51,040 Speaker 2: and I'll use it to do what's right. You know.
448 00:24:51,800 --> 00:24:54,800 Speaker 2: So this is where her mind is in twenty twelve, 449 00:24:54,840 --> 00:24:57,640 Speaker 2: but again she's just a student posting on the Internet 450 00:24:57,680 --> 00:25:01,640 Speaker 2: and chatting with other members of the subculture. At some point that year, 451 00:25:01,680 --> 00:25:05,800 Speaker 2: she starts donating money to MIRI, the Machine Intelligence 452 00:25:05,920 --> 00:25:09,280 Speaker 2: Research Institute, which is a nonprofit devoted to studying how 453 00:25:09,280 --> 00:25:12,760 Speaker 2: to create friendly AI. Yudkowsky founded MIRI in 454 00:25:12,800 --> 00:25:15,720 Speaker 2: two thousand, right, so this is his like nonprofit think tank. 455 00:25:16,400 --> 00:25:19,959 Speaker 2: In twenty thirteen, she finished an internship at NASA. So 456 00:25:20,080 --> 00:25:21,960 Speaker 2: again she is a very smart young woman. 457 00:25:22,040 --> 00:25:22,159 Speaker 3: Right. 458 00:25:22,200 --> 00:25:24,280 Speaker 2: She gets an internship at NASA and she builds a 459 00:25:24,320 --> 00:25:27,800 Speaker 2: tool for space weather analysis, so this is a person with 460 00:25:27,840 --> 00:25:30,600 Speaker 2: a lot of potential, very, very smart. All of the 461 00:25:30,600 --> 00:25:33,800 Speaker 2: stuff she's writing is like the dumbest shit. But again, intelligence 462 00:25:33,840 --> 00:25:37,080 Speaker 2: isn't an absolute. People can be brilliant at coding and 463 00:25:37,119 --> 00:25:39,440 Speaker 2: have terrible ideas about everything else. 464 00:25:39,600 --> 00:25:44,080 Speaker 3: Yes, exactly. Yeah, I wonder if she's telling, you think 465 00:25:44,080 --> 00:25:45,320 Speaker 3: she's telling people at work? 466 00:25:47,920 --> 00:25:49,800 Speaker 2: I don't. I don't think at this point she is, 467 00:25:49,840 --> 00:25:54,960 Speaker 2: because she's super insular. Right, She's very uncomfortable talking to people. Right, 468 00:25:55,520 --> 00:25:57,879 Speaker 2: She's going to kind of break out of her shell 469 00:25:57,960 --> 00:26:00,560 Speaker 2: once she gets to San Francisco. I don't know. She 470 00:26:00,680 --> 00:26:02,560 Speaker 2: may have talked to some of them about this stuff, 471 00:26:02,600 --> 00:26:04,240 Speaker 2: but I really don't think she is at this point. 472 00:26:04,280 --> 00:26:09,800 Speaker 2: I don't think she's comfortable enough doing that. Yeah. So 473 00:26:10,160 --> 00:26:12,840 Speaker 2: she also does an internship at the software giant Oracle. 474 00:26:13,000 --> 00:26:15,200 Speaker 2: So at this point you've got this young lady who's 475 00:26:15,240 --> 00:26:16,480 Speaker 2: got a lot of potential, you. 476 00:26:16,440 --> 00:26:17,640 Speaker 3: know, a real career as well. 477 00:26:17,720 --> 00:26:21,120 Speaker 2: Yeah, the start of a very real career. That's a 478 00:26:21,160 --> 00:26:24,280 Speaker 2: great starting resume for like a twenty two year old. 479 00:26:25,600 --> 00:26:28,200 Speaker 2: Now at this point, she's torn, should she go get 480 00:26:28,240 --> 00:26:31,600 Speaker 2: a graduate degree, right, or should she jump right into 481 00:26:31,600 --> 00:26:35,080 Speaker 2: the tech industry, you know, and she worries that like, 482 00:26:35,119 --> 00:26:37,720 Speaker 2: if she waits to get a graduate degree, this will 483 00:26:37,760 --> 00:26:41,000 Speaker 2: delay her making a positive impact on the existential risk 484 00:26:41,160 --> 00:26:44,159 Speaker 2: caused by AI, and it'll be too late.
The Singularity 485 00:26:44,200 --> 00:26:48,040 Speaker 2: will have happened already, you know. At this point, she's still 486 00:26:48,040 --> 00:26:51,439 Speaker 2: a big fawning fan of Eliezer Yudkowsky, and the highest 487 00:26:51,520 --> 00:26:55,560 Speaker 2: ranking woman at Yudkowsky's organization, MIRI, is a lady named 488 00:26:55,600 --> 00:26:59,560 Speaker 2: Susan Salomon. Susan gives a public invitation to the online 489 00:26:59,600 --> 00:27:02,600 Speaker 2: community to pitch ideas for the best way to improve 490 00:27:02,720 --> 00:27:05,800 Speaker 2: the ultimate quality of the Singleton that these people believe 491 00:27:05,840 --> 00:27:09,240 Speaker 2: is inevitable. In other words, hey, give us your ideas 492 00:27:09,280 --> 00:27:12,040 Speaker 2: for how to make the inevitable AI god nice, right. 493 00:27:13,400 --> 00:27:15,919 Speaker 2: Here's what Ziz writes about her response to that. I 494 00:27:15,960 --> 00:27:18,159 Speaker 2: asked her whether I should try and alter course and 495 00:27:18,200 --> 00:27:20,920 Speaker 2: do research, or continue a fork of my pre existing 496 00:27:21,000 --> 00:27:24,520 Speaker 2: life plan, earn to give as a computer engineer, but 497 00:27:24,600 --> 00:27:27,720 Speaker 2: retrain and try to do research directly instead. At the time, 498 00:27:27,720 --> 00:27:29,399 Speaker 2: I was planning to go to grad school and I 499 00:27:29,440 --> 00:27:32,359 Speaker 2: had an irrational attachment to the idea. She sort of 500 00:27:32,400 --> 00:27:34,600 Speaker 2: compromised and said, I should go to grad school, find 501 00:27:34,680 --> 00:27:37,080 Speaker 2: a startup co founder, drop out, and earn to give 502 00:27:37,200 --> 00:27:43,880 Speaker 2: via startups instead. First off, bad advice. That's bad advice. 503 00:27:45,119 --> 00:27:50,600 Speaker 2: Being Steve Jobs worked for Steve Jobs, well, 504 00:27:50,600 --> 00:27:53,119 Speaker 2: and Bill Gates, I guess, to an extent. It doesn't 505 00:27:53,119 --> 00:27:54,000 Speaker 2: work for most people. 506 00:27:54,400 --> 00:27:58,560 Speaker 3: No, no, no, it seems like the general tech disruptor idea, 507 00:27:58,640 --> 00:27:58,880 Speaker 3: you know. 508 00:28:00,240 --> 00:28:03,240 Speaker 2: Yeah, and most people, these people aren't very original thinkers, 509 00:28:03,320 --> 00:28:05,080 Speaker 2: like yeah, she's just saying, like, yeah, go be a 510 00:28:05,119 --> 00:28:10,480 Speaker 2: Steve Jobs. So Ziz does go to grad school, and 511 00:28:10,560 --> 00:28:13,720 Speaker 2: somewhere around that time, in twenty fourteen, she attends a 512 00:28:13,800 --> 00:28:17,760 Speaker 2: lecture by Eliezer Yudkowsky on the subject of Inadequate Equilibria, 513 00:28:18,119 --> 00:28:20,399 Speaker 2: which is the title of a book that Yudkowsky had 514 00:28:20,400 --> 00:28:23,080 Speaker 2: written around that time, and the book is about where 515 00:28:23,119 --> 00:28:27,919 Speaker 2: and how civilizations get stuck. One reviewer, Bryan Caplan, who, 516 00:28:27,960 --> 00:28:30,680 Speaker 2: despite being a professor of economics, must have a brain 517 00:28:30,680 --> 00:28:33,760 Speaker 2: as smooth as a pearl, wrote this about it: Every 518 00:28:33,800 --> 00:28:36,600 Speaker 2: society is screwed up. Eliezer Yudkowsky is one of the 519 00:28:36,640 --> 00:28:38,760 Speaker 2: few thinkers on earth who are trying, at the most 520 00:28:38,800 --> 00:28:43,040 Speaker 2: general level, to understand why.
And this is like, wow, 521 00:28:43,280 --> 00:28:48,080 Speaker 2: that's it you, Pete. Please study the humanities a little bit, 522 00:28:48,160 --> 00:28:50,400 Speaker 2: A little bit, a little bit, I. 523 00:28:50,320 --> 00:28:50,960 Speaker 3: mean, fuck man. 524 00:28:51,160 --> 00:28:54,560 Speaker 2: The first and most, like, one of the first influential 525 00:28:54,720 --> 00:28:57,680 Speaker 2: works of modern historical scholarship is The Decline and 526 00:28:57,880 --> 00:29:01,120 Speaker 2: Fall of the Roman Empire. It's a whole book about 527 00:29:01,120 --> 00:29:05,160 Speaker 2: why a society fell apart, and like, motherfucker. More recently, 528 00:29:05,560 --> 00:29:13,080 Speaker 2: Mike Davis existed, like like Jesus Christ. 529 00:29:12,760 --> 00:29:14,920 Speaker 3: I believe this guy continues to get traction. 530 00:29:15,320 --> 00:29:17,840 Speaker 2: Nobody else is thinking about why society's screwed up, but 531 00:29:17,840 --> 00:29:19,960 Speaker 2: Eliezer Yudkowsky, this. 532 00:29:19,920 --> 00:29:24,320 Speaker 3: Man, this man, this guy, this marry. 533 00:29:25,400 --> 00:29:27,360 Speaker 2: Yeah, no, I was trying to find another. I read 534 00:29:27,360 --> 00:29:34,480 Speaker 2: through that Martin Luther King junior speech. Everything's good, oh, 535 00:29:34,600 --> 00:29:39,400 Speaker 2: oh my god, oh my god. Like motherfucker. So many 536 00:29:39,400 --> 00:29:41,760 Speaker 2: people do nothing but try to write about why our 537 00:29:41,800 --> 00:29:42,840 Speaker 2: society is sick. 538 00:29:43,400 --> 00:29:43,880 Speaker 1: They did. 539 00:29:46,320 --> 00:29:48,080 Speaker 3: On all levels, by the way, on. 540 00:29:49,680 --> 00:29:54,520 Speaker 2: Everybody's thinking about this. This is such a common subject 541 00:29:54,640 --> 00:29:57,040 Speaker 2: of scholarship and discussion. 542 00:30:00,040 --> 00:30:02,080 Speaker 3: What everyone's talking always. 543 00:30:02,360 --> 00:30:04,960 Speaker 2: It would be like if if I got really into 544 00:30:05,000 --> 00:30:07,560 Speaker 2: like reading medical textbooks and was like, you know what, 545 00:30:08,320 --> 00:30:11,320 Speaker 2: nobody's ever tried to figure out how to transplant a heart. 546 00:30:11,520 --> 00:30:14,280 Speaker 2: I'm going to write a book about how that might work. 547 00:30:15,520 --> 00:30:28,440 Speaker 2: I think I got you know, people, So yeah, speaking 548 00:30:28,480 --> 00:30:35,800 Speaker 2: of these fucking people have sex with Uh Nope, well. 549 00:30:35,640 --> 00:30:36,280 Speaker 3: That's not something. 550 00:30:36,280 --> 00:30:38,680 Speaker 2: No, I don't know. I don't know. Uh, don't, fuck, 551 00:30:39,560 --> 00:30:49,680 Speaker 2: listen to ads. We're back. So Ziz is at this 552 00:30:49,760 --> 00:30:52,880 Speaker 2: speech where Yudkowsky is shilling his book, and most 553 00:30:52,920 --> 00:30:54,800 Speaker 2: of what he seems to be talking about in this 554 00:30:55,560 --> 00:30:58,640 Speaker 2: speech about this book about why societies fall apart is 555 00:30:58,680 --> 00:31:02,000 Speaker 2: how to make a tech startup. She says, quote, He 556 00:31:02,080 --> 00:31:04,920 Speaker 2: gave a recipe for finding startup ideas. He said Paul 557 00:31:05,000 --> 00:31:09,120 Speaker 2: Graham's idea, only filter on people, ignore startup ideas, was 558 00:31:09,160 --> 00:31:12,360 Speaker 2: partial epistemic learned helplessness.
That means Paul Graham is saying, 559 00:31:12,640 --> 00:31:15,360 Speaker 2: focus on finding good people that you'd start a company with. 560 00:31:15,880 --> 00:31:19,320 Speaker 2: Having an idea for a company doesn't matter. Yudkowsky says, 561 00:31:19,400 --> 00:31:22,440 Speaker 2: of course, startup ideas mattered. You needed a good startup idea, 562 00:31:22,560 --> 00:31:24,760 Speaker 2: so look for a way the world is broken, 563 00:31:25,040 --> 00:31:27,920 Speaker 2: then compare against a checklist of things you couldn't fix, 564 00:31:28,480 --> 00:31:31,360 Speaker 2: you know, right, Like, that's that's what this speech is 565 00:31:31,440 --> 00:31:33,440 Speaker 2: largely about, is him being like, here's how to find 566 00:31:33,480 --> 00:31:37,080 Speaker 2: startup ideas. So she starts thinking. She starts thinking as 567 00:31:37,160 --> 00:31:41,320 Speaker 2: hard as she can, and you know, being a person 568 00:31:41,560 --> 00:31:45,720 Speaker 2: who is very much of the tech industry brain rot 569 00:31:45,760 --> 00:31:47,680 Speaker 2: at this point, she comes up with a brilliant idea. 570 00:31:48,040 --> 00:31:50,880 Speaker 2: It's a genius idea. Oh you're gonna you're gonna love 571 00:31:50,920 --> 00:31:55,920 Speaker 2: this idea, David. Uber for prostitutes. 572 00:31:58,520 --> 00:31:59,760 Speaker 3: Yeah, fucking with me? 573 00:32:00,240 --> 00:32:03,120 Speaker 2: No, No, it's. 574 00:32:04,080 --> 00:32:05,160 Speaker 3: That's where she landed. 575 00:32:06,400 --> 00:32:10,480 Speaker 2: She lands on the idea of, oh wow, sex work 576 00:32:10,560 --> 00:32:14,520 Speaker 2: is illegal, but porn isn't. So if we start an 577 00:32:14,600 --> 00:32:19,040 Speaker 2: Uber whereby a team with a camera and a porn 578 00:32:19,160 --> 00:32:22,320 Speaker 2: star come to your house and you fuck them and 579 00:32:22,520 --> 00:32:26,560 Speaker 2: record it, that's a legal loophole we just found. 580 00:32:26,680 --> 00:32:32,600 Speaker 2: It's the Bang Bus, but 581 00:32:32,680 --> 00:32:41,600 Speaker 2: the gig economy. It is really like a Don Draper moment, 582 00:32:41,760 --> 00:32:47,080 Speaker 2: what about Uber, but a pimp? It's so funny, these people. 583 00:32:51,320 --> 00:32:54,760 Speaker 3: You gotta love it, you got wow, it's wow. Wow, 584 00:32:54,960 --> 00:32:57,200 Speaker 3: what a place to end up. I would love to 585 00:32:57,240 --> 00:32:58,480 Speaker 3: see the other drafts. 586 00:32:58,720 --> 00:33:08,880 Speaker 2: Yeah, yeah, first because god, yeah, man, that's that's that 587 00:33:09,000 --> 00:33:10,320 Speaker 2: is the good stuff, isn't it. 588 00:33:10,800 --> 00:33:11,440 Speaker 3: Yeah? 589 00:33:11,480 --> 00:33:18,200 Speaker 2: Wow wow, we've got special minds at work here. Oh man. 590 00:33:18,480 --> 00:33:21,680 Speaker 3: Ultimately it all I have to make smart. 591 00:33:21,760 --> 00:33:24,160 Speaker 2: I have to make pimp Uber. 592 00:33:25,160 --> 00:33:26,520 Speaker 3: That's so wild. 593 00:33:27,000 --> 00:33:30,720 Speaker 2: Yes, yes, the Uber of pimping. What an idea. Now, 594 00:33:32,480 --> 00:33:35,320 Speaker 2: so Ziz devotes her brief time in grad school, she's 595 00:33:35,360 --> 00:33:38,560 Speaker 2: working on pimp Uber, to try and find a partner. 596 00:33:38,680 --> 00:33:38,840 Speaker 3: Right.
597 00:33:38,880 --> 00:33:41,080 Speaker 2: She wants to have a startup partner, someone who will 598 00:33:41,160 --> 00:33:42,720 Speaker 2: will embark on this journey with her. 599 00:33:42,840 --> 00:33:44,600 Speaker 3: I don't know if that's an investor you need to. 600 00:33:48,440 --> 00:33:51,320 Speaker 2: It doesn't work out, She drops out of grad school 601 00:33:51,360 --> 00:33:53,680 Speaker 2: because quote, I did not find someone who felt like 602 00:33:53,760 --> 00:33:57,840 Speaker 2: good startup co founder material. This may be because she's 603 00:33:58,040 --> 00:34:01,240 Speaker 2: very bad at talking to people and all, so probably 604 00:34:01,280 --> 00:34:03,760 Speaker 2: scares people off because the things that she talks about 605 00:34:03,760 --> 00:34:04,720 Speaker 2: are deeply off putting. 606 00:34:04,840 --> 00:34:07,080 Speaker 3: Yeah, I was gonna say. It's also a terrible idea. 607 00:34:08,440 --> 00:34:10,680 Speaker 2: And at this point she hasn't done anything bad, so 608 00:34:10,719 --> 00:34:13,000 Speaker 2: I feel bad for this is a person who's very lonely, 609 00:34:13,360 --> 00:34:15,520 Speaker 2: who is very confused. She has by this point realized 610 00:34:15,560 --> 00:34:18,520 Speaker 2: that she's trans but not transitioned. She's in like this 611 00:34:18,600 --> 00:34:21,279 Speaker 2: is this is like a tough place to be, right, 612 00:34:21,480 --> 00:34:23,239 Speaker 2: that's a hard times. 613 00:34:23,360 --> 00:34:24,120 Speaker 3: That's that's hard. 614 00:34:24,200 --> 00:34:27,520 Speaker 2: And nothing about her inherent personality makes it is going 615 00:34:27,560 --> 00:34:29,879 Speaker 2: to make this easier for her. Right who she is 616 00:34:30,120 --> 00:34:36,359 Speaker 2: makes all of this much harder because she also makes 617 00:34:36,360 --> 00:34:40,480 Speaker 2: some comments about dropping out because her her thesis advisor 618 00:34:40,600 --> 00:34:44,000 Speaker 2: was abusive. I don't fully know what this means. And 619 00:34:44,040 --> 00:34:48,680 Speaker 2: here's why Ziz and encounter some behavior I will describe 620 00:34:48,760 --> 00:34:52,120 Speaker 2: later that is abusive from other people, but also regularly 621 00:34:52,120 --> 00:34:55,359 Speaker 2: defines abuse as people who disagree with her about the 622 00:34:55,360 --> 00:34:57,799 Speaker 2: only thing that matters being creating an AI god to 623 00:34:57,840 --> 00:35:01,200 Speaker 2: protect the animals. So I don't know if her thesis 624 00:35:01,200 --> 00:35:03,919 Speaker 2: advisor was abusive or was just like, maybe drop the 625 00:35:04,000 --> 00:35:08,880 Speaker 2: alien god idea for a second. Yeah, yeah, but maybe 626 00:35:09,440 --> 00:35:13,240 Speaker 2: maybe focus on like finding a job, you know, making 627 00:35:13,320 --> 00:35:16,480 Speaker 2: some friends, a couple of dates, go on a couple 628 00:35:16,520 --> 00:35:20,879 Speaker 2: of dates, something like that. Maybe maybe like maybe make 629 00:35:20,960 --> 00:35:25,720 Speaker 2: God on the back burner here for a second. Whatever 630 00:35:25,840 --> 00:35:28,640 Speaker 2: happened here, She decides it's time to move to the Bay. 631 00:35:28,680 --> 00:35:30,880 Speaker 2: This is like twenty sixteen. She's going to find a 632 00:35:30,880 --> 00:35:33,240 Speaker 2: big tech job. She's going to make that big tech money. 
633 00:35:33,680 --> 00:35:36,319 Speaker 2: While she figures out a startup idea and finds a 634 00:35:36,360 --> 00:35:38,600 Speaker 2: co founder who will let her make enough money to 635 00:35:38,760 --> 00:35:42,719 Speaker 2: change and save the world, well, the whole universe. Her 636 00:35:42,760 --> 00:35:45,759 Speaker 2: first plan is to give the money to MIRI, Yudkowsky's 637 00:35:45,840 --> 00:35:49,520 Speaker 2: organization, so it can continue its important, important work, imagining 638 00:35:49,520 --> 00:35:53,040 Speaker 2: a nice AI. Her parents. She's got enough family money 639 00:35:53,040 --> 00:35:54,880 Speaker 2: that her parents are able to pay for like I 640 00:35:54,920 --> 00:35:57,239 Speaker 2: think like six months or more of rent in the Bay, 641 00:35:57,320 --> 00:36:00,680 Speaker 2: which is not nothing, not a cheap place to live. 642 00:36:02,400 --> 00:36:04,560 Speaker 2: I don't know exactly how long her parents are paying, 643 00:36:04,600 --> 00:36:08,040 Speaker 2: but like that, that implies a degree of financial comfort. Right, 644 00:36:10,160 --> 00:36:12,919 Speaker 2: So she gets hired by a startup very quickly, because 645 00:36:12,920 --> 00:36:14,640 Speaker 2: again, very gifted. 646 00:36:16,400 --> 00:36:16,520 Speaker 3: Right. 647 00:36:16,680 --> 00:36:20,200 Speaker 2: Yes, yes, it's some sort of gaming company. But at 648 00:36:20,200 --> 00:36:24,160 Speaker 2: this point she's made another change in her ethics system 649 00:36:24,200 --> 00:36:28,759 Speaker 2: based on Eliezer Yudkowsky's writings. One of Yudkowsky's writings 650 00:36:28,880 --> 00:36:32,880 Speaker 2: talks about the difference between consequentialism 651 00:36:33,440 --> 00:36:38,360 Speaker 2: and virtue ethics. Right. Consequentialists are people who focus entirely 652 00:36:38,400 --> 00:36:41,480 Speaker 2: on what will the outcome of my actions be? And 653 00:36:41,480 --> 00:36:43,480 Speaker 2: it kind of doesn't matter what I'm doing, or even 654 00:36:43,520 --> 00:36:46,440 Speaker 2: if it's sometimes a little fucked up, if the end 655 00:36:46,520 --> 00:36:51,360 Speaker 2: result is good. Virtue ethics folks have a code 656 00:36:51,520 --> 00:36:55,319 Speaker 2: and stick to it, right. And actually, I kind 657 00:36:55,320 --> 00:36:57,760 Speaker 2: of am surprised that he came to this. Yudkowsky's conclusion 658 00:36:57,800 --> 00:37:01,000 Speaker 2: is that like, while logically you're more likely to succeed, 659 00:37:01,200 --> 00:37:04,040 Speaker 2: like on paper, you're more likely to succeed as a consequentialist, 660 00:37:04,080 --> 00:37:06,880 Speaker 2: his opinion is that virtue ethics has the best outcome. 661 00:37:07,040 --> 00:37:08,920 Speaker 2: People tend to do well when they stick to a 662 00:37:08,960 --> 00:37:12,120 Speaker 2: code and they try to, rather than like, anything goes 663 00:37:12,160 --> 00:37:15,520 Speaker 2: as long as I succeed, right. And I think that's 664 00:37:15,520 --> 00:37:18,280 Speaker 2: actually a pretty decent way to live your life. 665 00:37:19,040 --> 00:37:21,959 Speaker 3: It's a pretty reasonable conclusion for him. 666 00:37:22,520 --> 00:37:24,600 Speaker 2: It's a reasonable conclusion for him, So I don't blame 667 00:37:24,680 --> 00:37:27,319 Speaker 2: him on this part. But here's the problem.
Zizz is 668 00:37:27,400 --> 00:37:30,640 Speaker 2: trying to break into and succeed in the tech industry, 669 00:37:31,239 --> 00:37:36,600 Speaker 2: and you can't. You are very unlikely to succeed at 670 00:37:36,640 --> 00:37:39,319 Speaker 2: a high level in the tech industry if you are 671 00:37:39,440 --> 00:37:43,080 Speaker 2: unwilling to do things and have things done to you 672 00:37:43,160 --> 00:37:46,080 Speaker 2: that are unethical and fucked up. I'm not saying this 673 00:37:46,120 --> 00:37:49,759 Speaker 2: is good. This is the reality of the entertainment industry too, right. 674 00:37:50,280 --> 00:37:52,680 Speaker 2: I experience when I started and I started with an 675 00:37:52,760 --> 00:37:57,359 Speaker 2: unpaid internship. Unpaid internships are bad, right, It's bad that 676 00:37:57,400 --> 00:38:00,360 Speaker 2: those exist. They inherently favor people who have money and 677 00:38:00,400 --> 00:38:03,319 Speaker 2: people who have family connections. You know, I had like 678 00:38:03,560 --> 00:38:06,160 Speaker 2: a small savings account for my job in special ed, 679 00:38:07,120 --> 00:38:08,799 Speaker 2: but that was the standard. It is, like, there were 680 00:38:08,800 --> 00:38:11,400 Speaker 2: a lot of unpaid internships. It got me my foot 681 00:38:11,400 --> 00:38:13,720 Speaker 2: in the door. It worked for me. I also worked 682 00:38:13,719 --> 00:38:16,319 Speaker 2: a lot of overtime that I didn't get paid for. 683 00:38:16,560 --> 00:38:18,200 Speaker 2: I did a lot of shit that wasn't a part 684 00:38:18,200 --> 00:38:21,280 Speaker 2: of my job to impress my bosses to make myself 685 00:38:21,320 --> 00:38:25,359 Speaker 2: indispensable so that they would decide, like, we have to 686 00:38:25,520 --> 00:38:27,440 Speaker 2: keep this guy on and pay him. And it worked 687 00:38:27,440 --> 00:38:29,719 Speaker 2: for me. And I just wanted to add because this 688 00:38:29,760 --> 00:38:31,400 Speaker 2: was not in the original thing. A big part of 689 00:38:31,400 --> 00:38:33,399 Speaker 2: why it worked for me is that I'm talking about 690 00:38:33,400 --> 00:38:36,279 Speaker 2: a few different companies here, but particularly at Cracked where 691 00:38:36,280 --> 00:38:40,640 Speaker 2: I had the internship. Like my bosses, you know, made 692 00:38:40,640 --> 00:38:44,120 Speaker 2: a choice tom intor me, and you know, to get me, 693 00:38:44,320 --> 00:38:47,239 Speaker 2: you know, to work overtime on their own behalf to 694 00:38:47,320 --> 00:38:49,960 Speaker 2: like make sure I got a paying job, which is 695 00:38:50,080 --> 00:38:52,640 Speaker 2: a big part of like the luck that I encountered 696 00:38:52,640 --> 00:38:56,200 Speaker 2: that a lot of people don't. So that's another major 697 00:38:56,480 --> 00:38:58,880 Speaker 2: part of like why things worked out for me is 698 00:38:58,920 --> 00:39:01,160 Speaker 2: that I just got incredible lucky with the people I 699 00:39:01,200 --> 00:39:05,960 Speaker 2: was working for and with That's bad. It's not good 700 00:39:06,080 --> 00:39:09,920 Speaker 2: that things work that way, right, It's not healthy. 701 00:39:09,640 --> 00:39:11,520 Speaker 3: Set up for you either, Like you know, you kind 702 00:39:11,520 --> 00:39:14,040 Speaker 3: of defied the odds. It's it's, like you said, the 703 00:39:14,120 --> 00:39:17,000 Speaker 3: rich people who get the job where exactly it's not. 
704 00:39:17,080 --> 00:39:22,080 Speaker 2: Even yes, that said, if I am giving someone, if 705 00:39:22,080 --> 00:39:25,600 Speaker 2: someone wants what is the most likely path to succeeding? 706 00:39:25,800 --> 00:39:28,560 Speaker 2: You know, I've I've just got this job working, you know, 707 00:39:28,640 --> 00:39:32,319 Speaker 2: on this production company or the music steer, I would 708 00:39:32,520 --> 00:39:34,640 Speaker 2: I would say, well, your best odds are to like 709 00:39:34,760 --> 00:39:40,880 Speaker 2: make yourself completely indispensable and become obsessively devoted to that task. 710 00:39:41,040 --> 00:39:41,200 Speaker 3: Right. 711 00:39:41,840 --> 00:39:45,640 Speaker 2: Uh, that's it. I don't tend to give that advice anymore. 712 00:39:45,760 --> 00:39:48,680 Speaker 2: I have and I have had several other friends succeed 713 00:39:48,840 --> 00:39:51,680 Speaker 2: as a result of it, and all of us also 714 00:39:51,800 --> 00:39:55,040 Speaker 2: burnt ourselves out and did huge amounts of damage to ourselves. 715 00:39:55,080 --> 00:39:58,359 Speaker 2: Like I am permanently broken as a result of of 716 00:39:59,760 --> 00:40:01,879 Speaker 2: you know, the ten years that I did eighty hour 717 00:40:01,960 --> 00:40:02,799 Speaker 2: weeks and shit, you. 718 00:40:02,760 --> 00:40:04,960 Speaker 3: Know, now you're sounding like somebody who works in the 719 00:40:05,080 --> 00:40:05,680 Speaker 3: entertainment and. 720 00:40:06,040 --> 00:40:09,600 Speaker 2: Yes, yes, and it worked for me, right, I got 721 00:40:10,320 --> 00:40:13,400 Speaker 2: a I succeeded, I got a great job, I got money. 722 00:40:14,719 --> 00:40:17,200 Speaker 2: Most people it doesn't. And it's bad that it works 723 00:40:17,200 --> 00:40:21,879 Speaker 2: this way. Ziz, unlike me, is not willing to do that, right. 724 00:40:22,360 --> 00:40:25,440 Speaker 2: She thinks it's wrong to be asked to work overtime 725 00:40:25,560 --> 00:40:27,480 Speaker 2: and not get paid for it, and so on her 726 00:40:27,520 --> 00:40:30,120 Speaker 2: first day at the job, she leaves after eight hours 727 00:40:30,480 --> 00:40:32,320 Speaker 2: and her boss is like, what the fuck are you doing? 728 00:40:32,360 --> 00:40:35,360 Speaker 2: And she's like, I'm I'm here to supposed to be 729 00:40:35,360 --> 00:40:37,439 Speaker 2: here eight hours. Eight hours is up, I'm going home. 730 00:40:38,000 --> 00:40:41,799 Speaker 2: And he calls her half an hour later and fires her. Right, 731 00:40:42,400 --> 00:40:44,960 Speaker 2: and this is because the tech industry is evil, you know, 732 00:40:45,480 --> 00:40:49,319 Speaker 2: like this is bad. She's not bad here she is. 733 00:40:49,800 --> 00:40:53,440 Speaker 2: It is like a thing where it's she's not doing 734 00:40:53,600 --> 00:40:56,040 Speaker 2: by her standards. What I would say is the rational thing, 735 00:40:56,400 --> 00:40:59,200 Speaker 2: which would be if all that matters is optimizing your 736 00:40:59,200 --> 00:41:02,080 Speaker 2: earning power, right, right, well, then you do this, then 737 00:41:02,120 --> 00:41:05,040 Speaker 2: you do do whatever it takes. Right. 
So it's kind 738 00:41:05,040 --> 00:41:08,040 Speaker 2: of interesting to me like that she is so devoted 739 00:41:08,080 --> 00:41:10,080 Speaker 2: to this like virtue ethics thing at this point that 740 00:41:10,120 --> 00:41:13,640 Speaker 2: she fucks over her career in the tech industry because 741 00:41:13,640 --> 00:41:15,399 Speaker 2: she's not willing to do the things that you kind 742 00:41:15,400 --> 00:41:18,400 Speaker 2: of need to do to succeed, you know, in the 743 00:41:18,440 --> 00:41:22,600 Speaker 2: place that she is. But it's interesting, I don't like 744 00:41:22,640 --> 00:41:24,839 Speaker 2: give her any shit for that. So she asked your 745 00:41:24,840 --> 00:41:27,200 Speaker 2: parents more for more runway to extend your time in 746 00:41:27,280 --> 00:41:29,719 Speaker 2: the bay. And then she finds work at another startup, 747 00:41:29,800 --> 00:41:33,040 Speaker 2: but the same problems persist. Quote, they kept demanding that 748 00:41:33,080 --> 00:41:36,080 Speaker 2: I work unpaid overtime, talking about how other employees just 749 00:41:36,120 --> 00:41:39,080 Speaker 2: put always put forty hours on their timesheet no matter what. 750 00:41:39,440 --> 00:41:42,040 Speaker 2: And this exemplary employee over there worked twelve hours a 751 00:41:42,120 --> 00:41:43,759 Speaker 2: day and he really went the extra mile and got 752 00:41:43,760 --> 00:41:45,920 Speaker 2: the job done. And they needed me to really go 753 00:41:46,000 --> 00:41:49,239 Speaker 2: the extra mile and get the job done. She's not 754 00:41:49,320 --> 00:41:51,680 Speaker 2: willing to do that. And again I hate that this 755 00:41:51,719 --> 00:41:53,799 Speaker 2: is part of what drives her to the madness that 756 00:41:53,880 --> 00:41:57,080 Speaker 2: leads to the cult to the killings, because it's like, oh, honey, 757 00:41:57,120 --> 00:41:59,160 Speaker 2: you're in the right. It's an eavil industry. 758 00:41:59,320 --> 00:42:01,719 Speaker 3: Yeah, you a flash of where it could have gone. 759 00:42:01,800 --> 00:42:04,959 Speaker 3: Well it really there were chances for this to work out. 760 00:42:05,280 --> 00:42:08,520 Speaker 2: No, you were one hundred percent right, Like this is 761 00:42:08,560 --> 00:42:11,480 Speaker 2: stuck to it. Yeah, you know what I mean, And 762 00:42:11,480 --> 00:42:14,440 Speaker 2: that's super hard. I really respect that part of you. 763 00:42:14,760 --> 00:42:20,239 Speaker 2: Oh yeah, yeah, yeah, I'm so sad that this is 764 00:42:20,239 --> 00:42:23,440 Speaker 2: part of what shatters your brain. Like that really bums 765 00:42:23,480 --> 00:42:28,880 Speaker 2: me out. So first off, she's kind of starts spiraling 766 00:42:28,920 --> 00:42:31,360 Speaker 2: and she concludes that she hates virtue ethics. This is 767 00:42:31,400 --> 00:42:34,560 Speaker 2: where she starts hating Yidkowski, right, This is she doesn't 768 00:42:34,560 --> 00:42:37,279 Speaker 2: come break entirely on him yet, but she gets really 769 00:42:37,320 --> 00:42:40,040 Speaker 2: angry at this point because she's like, well, obviously virtue 770 00:42:40,080 --> 00:42:41,040 Speaker 2: ethics don't work. 771 00:42:42,280 --> 00:42:44,719 Speaker 3: And she's been following this man at this point for. 772 00:42:44,760 --> 00:42:48,759 Speaker 2: Years exactly exactly, so this is a very like damaging 773 00:42:48,840 --> 00:42:53,440 Speaker 2: thing to her that this happens. 
And you know, and 774 00:42:53,600 --> 00:42:56,400 Speaker 2: again, as much as I blame Yudkowsky, the culture of 775 00:42:56,440 --> 00:42:59,120 Speaker 2: the Bay Area tech industry, that's a big part 776 00:42:59,160 --> 00:43:01,439 Speaker 2: of what drives this person, and you know, to where 777 00:43:01,480 --> 00:43:02,160 Speaker 2: she ends up. 778 00:43:02,360 --> 00:43:02,520 Speaker 3: Right. 779 00:43:03,120 --> 00:43:06,320 Speaker 2: So that said, some of her issues are also rooted 780 00:43:06,320 --> 00:43:09,320 Speaker 2: in a kind of rigid and unforgiving internal rule set. 781 00:43:10,320 --> 00:43:12,640 Speaker 2: At one point, she negotiates work with a professor and 782 00:43:12,640 --> 00:43:16,200 Speaker 2: their undergraduate helper. She doesn't want to take an hourly job, 783 00:43:16,239 --> 00:43:18,520 Speaker 2: and she tries to negotiate a flat rate of seven k. 784 00:43:19,440 --> 00:43:22,680 Speaker 2: And they're like, yeah, okay, that sounds fair, but the 785 00:43:22,719 --> 00:43:24,920 Speaker 2: school doesn't do stuff like that, so you will have 786 00:43:25,000 --> 00:43:27,239 Speaker 2: to fake some paperwork with me for me to be 787 00:43:27,280 --> 00:43:29,400 Speaker 2: able to get them to pay you seven thousand dollars. 788 00:43:29,880 --> 00:43:32,600 Speaker 2: And she isn't willing to do that. And that's the 789 00:43:32,600 --> 00:43:34,640 Speaker 2: thing where it's like, ah, no, I've had some shit 790 00:43:34,680 --> 00:43:37,200 Speaker 2: where this was the, like, there was a stupid rule, 791 00:43:37,560 --> 00:43:40,319 Speaker 2: and like, in order for me or other 792 00:43:40,360 --> 00:43:42,840 Speaker 2: people to get paid, we had to like tell something 793 00:43:42,880 --> 00:43:46,480 Speaker 2: else to the company. Like that's just, that's just 794 00:43:46,640 --> 00:43:49,480 Speaker 2: knowing how to get by. Yeah, that's, that's living. 795 00:43:49,239 --> 00:43:52,239 Speaker 3: In the world. You got Yeah, you did the hard part. Yeah, 796 00:43:52,280 --> 00:43:53,480 Speaker 3: they said that we were going to do it. 797 00:43:53,520 --> 00:43:54,600 Speaker 2: You said they did it. 798 00:43:54,880 --> 00:43:57,279 Speaker 3: Yeah, that's like they already said we don't do this. 799 00:43:57,360 --> 00:43:59,040 Speaker 3: That's where are you just? 800 00:43:59,080 --> 00:44:00,960 Speaker 2: You can't get by in America if you're not 801 00:44:01,080 --> 00:44:05,120 Speaker 2: willing to lie on certain kinds of paperwork. Right, That's, that's 802 00:44:05,160 --> 00:44:08,200 Speaker 2: the game our president does all the time. He's the 803 00:44:08,360 --> 00:44:14,600 Speaker 2: king of that shit. So at this point, Ziz is 804 00:44:14,640 --> 00:44:18,239 Speaker 2: stuck in what they consider a calamitous situation. The prophecy 805 00:44:18,320 --> 00:44:21,360 Speaker 2: of doom, as they call it, is ticking ever closer, 806 00:44:21,400 --> 00:44:24,520 Speaker 2: which means the bad AI that's going to create hell 807 00:44:24,719 --> 00:44:28,080 Speaker 2: for everybody. Her panic over this is elevated by the 808 00:44:28,120 --> 00:44:31,000 Speaker 2: fact that she starts to get obsessed with Roko's 809 00:44:31,040 --> 00:44:34,240 Speaker 2: basilisk at this time.
No I know, I know, worst 810 00:44:34,320 --> 00:44:40,000 Speaker 2: thing for her to read, come on, and info hazards, 811 00:44:40,120 --> 00:44:43,800 Speaker 2: right, the warnings, yep, and a lot of the smarter 812 00:44:43,960 --> 00:44:45,640 Speaker 2: rationalists are just annoyed by it. 813 00:44:45,680 --> 00:44:45,879 Speaker 3: Again. 814 00:44:45,960 --> 00:44:49,759 Speaker 2: Yudkowsky immediately, like, very quickly decides it's 815 00:44:49,760 --> 00:44:53,920 Speaker 2: bullshit and bans discussion of it. He argues there's no 816 00:44:54,000 --> 00:44:56,799 Speaker 2: incentive for a future agent to follow through with that 817 00:44:57,000 --> 00:45:00,160 Speaker 2: threat because, by doing so, it just expends resources with 818 00:45:00,160 --> 00:45:02,719 Speaker 2: no gain to itself, which is like, yeah, man, a 819 00:45:02,800 --> 00:45:06,200 Speaker 2: hyperlogical AI would not immediately jump to I must make 820 00:45:06,280 --> 00:45:10,000 Speaker 2: hell for everybody who didn't code me. Like, yeah, that's 821 00:45:10,040 --> 00:45:10,640 Speaker 2: just crazy. 822 00:45:10,760 --> 00:45:11,759 Speaker 3: There's step skew. 823 00:45:12,320 --> 00:45:15,720 Speaker 2: Yeah, only humans are like ill in that way. 824 00:45:16,080 --> 00:45:17,800 Speaker 3: That's the funny thing about it is it's such a 825 00:45:17,880 --> 00:45:18,880 Speaker 3: human response to it. 826 00:45:18,960 --> 00:45:23,239 Speaker 2: Yeah, right, right. Now, when she encounters the concept of 827 00:45:23,280 --> 00:45:26,759 Speaker 2: Roko's basilisk at first, Ziz thinks that it's silly, right. 828 00:45:26,760 --> 00:45:29,520 Speaker 2: She kind of rejects it and moves on. But once 829 00:45:29,560 --> 00:45:31,600 Speaker 2: she gets to the Bay, she starts going to in 830 00:45:31,760 --> 00:45:35,520 Speaker 2: person rationalist meetups and having long conversations with other believers 831 00:45:35,680 --> 00:45:39,799 Speaker 2: who are still talking about Roko's basilisk. She writes, I 832 00:45:39,840 --> 00:45:42,720 Speaker 2: started encountering people who were freaked out by it, freaked 833 00:45:42,719 --> 00:45:45,080 Speaker 2: out that they had discovered an improvement to the info 834 00:45:45,120 --> 00:45:48,520 Speaker 2: hazard that made it function, that got around Eliezer's objection. 835 00:45:49,280 --> 00:45:52,360 Speaker 2: Her ultimate conclusion is this: if I persisted in trying 836 00:45:52,400 --> 00:45:55,160 Speaker 2: to save the world, I would be tortured until the 837 00:45:55,320 --> 00:45:57,759 Speaker 2: end of the universe by a coalition of all 838 00:45:57,800 --> 00:46:01,080 Speaker 2: unfriendly AIs in order to increase the amount of measure 839 00:46:01,120 --> 00:46:04,239 Speaker 2: they got by demoralizing me. Even if my system two 840 00:46:04,320 --> 00:46:07,239 Speaker 2: had good decision theory, my system one did not, and 841 00:46:07,280 --> 00:46:11,160 Speaker 2: that would damage my effectiveness. And like, I can't explain 842 00:46:11,200 --> 00:46:13,000 Speaker 2: all of the terms in that without taking more time 843 00:46:13,040 --> 00:46:14,759 Speaker 2: than we need to. But like, you can hear, like, 844 00:46:14,800 --> 00:46:16,840 Speaker 2: that is not the writing of a person who is 845 00:46:16,840 --> 00:46:18,240 Speaker 2: thinking in logical terms. 846 00:46:18,440 --> 00:46:23,640 Speaker 3: No, it's, it's a, uh, it's so scary. 847 00:46:23,600 --> 00:46:26,040 Speaker 2: Yes, yes, it is very scary stuff.
848 00:46:26,080 --> 00:46:28,319 Speaker 3: It's so scary to be like, Oh, that's where she 849 00:46:28,440 --> 00:46:29,880 Speaker 3: was operating from, making those mistakes. 850 00:46:29,960 --> 00:46:31,919 Speaker 2: This is what she's dealing with. 851 00:46:32,160 --> 00:46:35,480 Speaker 3: Yes, that's, that's it, is. You know. 852 00:46:35,520 --> 00:46:37,279 Speaker 2: I've talked to my friends who were raised in 853 00:46:37,360 --> 00:46:42,640 Speaker 2: like very toxic chunks of the evangelical sub 854 00:46:42,920 --> 00:46:45,160 Speaker 2: culture and spent their whole childhood terrified of 855 00:46:45,160 --> 00:46:47,799 Speaker 2: hell, that like, everything: you know, I got angry at 856 00:46:47,800 --> 00:46:49,879 Speaker 2: my mom and I didn't say anything, but God knows 857 00:46:49,920 --> 00:46:51,920 Speaker 2: I'm angry at her, and he's going to send me 858 00:46:51,960 --> 00:46:53,560 Speaker 2: to hell because I didn't respect my mother. 859 00:46:53,920 --> 00:46:54,160 Speaker 3: Mother. 860 00:46:54,400 --> 00:46:56,320 Speaker 2: Like, that's what she's doing, right. 861 00:46:56,160 --> 00:46:58,879 Speaker 3: Exactly, like exactly, she can't win. There's no winning here. 862 00:46:58,960 --> 00:47:02,960 Speaker 2: Yes, yes, and again, I say this a lot. We 863 00:47:03,000 --> 00:47:06,080 Speaker 2: need to put lithium back in the drinking water. We, 864 00:47:05,480 --> 00:47:10,080 Speaker 2: we gotta put lithium back in the water. Maybe Xanax too, she. 865 00:47:10,120 --> 00:47:16,360 Speaker 3: Needed, she could have taken a combo. Yeah, before it 866 00:47:16,400 --> 00:47:18,640 Speaker 3: gets to where it gets. At this point, you really, 867 00:47:19,080 --> 00:47:22,319 Speaker 3: you really feel for her, like, just living in this, 868 00:47:23,080 --> 00:47:26,880 Speaker 3: living like that every day. She's so scared, and that, 869 00:47:26,880 --> 00:47:29,440 Speaker 3: that this is what she's doing. It's, it's this, this 870 00:47:29,560 --> 00:47:30,359 Speaker 3: is, she is. 871 00:47:30,320 --> 00:47:33,480 Speaker 2: The therapy needingest woman I have ever heard of. At this point, 872 00:47:33,520 --> 00:47:35,120 Speaker 2: Oh my god. 873 00:47:35,080 --> 00:47:38,200 Speaker 3: She just needs to talk to, she needs to talk, again. 874 00:47:38,680 --> 00:47:41,919 Speaker 2: You know, the cult, the thing that happens to cult 875 00:47:41,960 --> 00:47:45,120 Speaker 2: members has happened to her, where the whole 876 00:47:45,200 --> 00:47:48,880 Speaker 2: language she uses is incomprehensible to people. I had to 877 00:47:49,080 --> 00:47:51,360 Speaker 2: talk to you for an hour and fifteen minutes for 878 00:47:51,480 --> 00:47:55,280 Speaker 2: you to understand parts of what this lady says, right, Exactly, 879 00:47:55,400 --> 00:47:57,560 Speaker 2: she has to, because it's all nonsense if you don't 880 00:47:57,600 --> 00:47:59,000 Speaker 2: do that work, exactly. 881 00:47:59,080 --> 00:48:01,839 Speaker 3: She's so spun out at this point, it's like, how 882 00:48:01,840 --> 00:48:04,359 Speaker 3: do you even get back? Yeah, how do you even 883 00:48:04,400 --> 00:48:04,880 Speaker 3: get back? 884 00:48:05,120 --> 00:48:08,200 Speaker 2: Yeah? So she ultimately decides, even though she thinks 885 00:48:08,200 --> 00:48:11,480 Speaker 2: she's doomed to be tortured by unfriendly AIs: evil gods 886 00:48:11,560 --> 00:48:14,120 Speaker 2: must be fought. If this damns me, then so be it.
887 00:48:14,520 --> 00:48:18,879 Speaker 2: She's very heroic, or at least she sees herself that way, right. 888 00:48:19,200 --> 00:48:21,960 Speaker 3: Yeah, And even like just with her convictions and that 889 00:48:22,120 --> 00:48:25,480 Speaker 3: she, she does, she does, she does, she does, she 890 00:48:25,880 --> 00:48:26,520 Speaker 3: does it. 891 00:48:26,640 --> 00:48:29,319 Speaker 2: She's a woman of conviction. You really can't take that 892 00:48:29,400 --> 00:48:35,879 Speaker 2: away from her. Her convictions are nonsense, no, but they're there. 893 00:48:36,200 --> 00:48:39,160 Speaker 3: Yeah, they're based on elaborate Harry Potter fan fiction. 894 00:48:39,320 --> 00:48:42,360 Speaker 2: Yeah, it's like David Icke, the guy who believes in 895 00:48:42,400 --> 00:48:44,960 Speaker 2: like literal lizard people, and everyone thinks he's like talking 896 00:48:45,000 --> 00:48:46,759 Speaker 2: about the Jews, but like no, no, no, he 897 00:48:46,920 --> 00:48:48,240 Speaker 2: means just lizards. 898 00:48:48,440 --> 00:48:52,520 Speaker 3: It's exactly that where it's just like you want to draw, yeah, 899 00:48:52,600 --> 00:48:54,880 Speaker 3: you want to draw something, so it's not nonsense, and 900 00:48:54,920 --> 00:48:56,000 Speaker 3: then you realize. 901 00:48:55,640 --> 00:48:58,120 Speaker 2: No, that's no, no, no, no. And like, David Icke 902 00:48:58,160 --> 00:49:00,239 Speaker 2: went out, he made like a big rant about how 903 00:49:00,280 --> 00:49:02,920 Speaker 2: Elon Musk is like evil for all these people 904 00:49:02,920 --> 00:49:05,839 Speaker 2: he's hurt by firing the whole federal government, and people 905 00:49:05,840 --> 00:49:08,120 Speaker 2: were shocked. It's like no, no, no, David Icke believes 906 00:49:08,160 --> 00:49:14,840 Speaker 2: in a thing. It's just crazy. Those people do exist. 907 00:49:16,400 --> 00:49:18,640 Speaker 2: And here we are talking 908 00:49:18,719 --> 00:49:21,480 Speaker 2: about them. Some of them run the country. But actually 909 00:49:21,520 --> 00:49:23,320 Speaker 2: I don't know how much all of those people believe 910 00:49:23,320 --> 00:49:29,400 Speaker 2: in anything. But yeah, yeah, speaking of people who believe 911 00:49:29,400 --> 00:49:43,479 Speaker 2: in something, our sponsors believe in getting your money. We're back. 912 00:49:44,080 --> 00:49:48,080 Speaker 2: So, uh. She is at this point suffering from delusions 913 00:49:48,080 --> 00:49:50,560 Speaker 2: of grandeur, and those are going to rapidly lead her 914 00:49:50,640 --> 00:49:53,360 Speaker 2: to danger, but she concludes that since the fate of 915 00:49:53,400 --> 00:49:56,640 Speaker 2: the universe is at stake in her actions, she would 916 00:49:56,640 --> 00:50:00,000 Speaker 2: make a timeless choice to not believe in the basilisk, 917 00:50:00,520 --> 00:50:04,560 Speaker 2: right, and that that will protect her in the future, 918 00:50:04,760 --> 00:50:07,279 Speaker 2: because that's how these people talk about stuff like that.
919 00:50:07,880 --> 00:50:10,680 Speaker 2: So she gets over her fear of the basilisk for 920 00:50:10,719 --> 00:50:14,680 Speaker 2: a little while, but even it, like when she claims 921 00:50:14,680 --> 00:50:17,200 Speaker 2: to have rejected the theory, whenever she references it in 922 00:50:17,239 --> 00:50:19,920 Speaker 2: her blog, she like locks it away under a spoiler 923 00:50:19,960 --> 00:50:24,480 Speaker 2: with like an info hazard warning Roco's Basilisk family skippable, 924 00:50:24,680 --> 00:50:26,319 Speaker 2: so you don't like have to see it and have 925 00:50:26,400 --> 00:50:27,920 Speaker 2: it destroy your psyche. 926 00:50:28,760 --> 00:50:30,000 Speaker 3: That's the power of it. 927 00:50:30,400 --> 00:50:34,439 Speaker 2: Yeah yeah yeah. The concept does, however, keep coming back 928 00:50:34,480 --> 00:50:37,320 Speaker 2: to her like and continuing to drive her mad. Thoughts 929 00:50:37,320 --> 00:50:39,719 Speaker 2: of the basilisk return, and eventually she comes to an 930 00:50:39,719 --> 00:50:42,920 Speaker 2: extreme conclusion. If what I cared about was sentient life, 931 00:50:42,920 --> 00:50:44,479 Speaker 2: and I was willing to go to hell to save 932 00:50:44,520 --> 00:50:47,399 Speaker 2: everyone else, why not just send everyone else to hell? 933 00:50:47,480 --> 00:50:51,760 Speaker 2: If I didn't submit, Can I tell you I really 934 00:50:52,280 --> 00:50:54,799 Speaker 2: it felt like this is what was This is where 935 00:50:54,800 --> 00:51:00,799 Speaker 2: it had to go, right yeah yeah yes. So what 936 00:51:00,880 --> 00:51:03,160 Speaker 2: she means here is that she is now making the 937 00:51:03,200 --> 00:51:05,640 Speaker 2: timeless decision that when she is in a position of 938 00:51:05,760 --> 00:51:08,560 Speaker 2: ultimate influence and helps bring this all powerful vegan Ai 939 00:51:08,640 --> 00:51:12,560 Speaker 2: into existence, she's promising now ahead of time, to create 940 00:51:12,600 --> 00:51:16,919 Speaker 2: a perfect hell, a digital hell to like punish all 941 00:51:16,960 --> 00:51:21,439 Speaker 2: of the people who don't stop like eat meat. Ever, 942 00:51:22,600 --> 00:51:24,560 Speaker 2: she wants to make a hell for people who eat meat. 943 00:51:25,040 --> 00:51:28,000 Speaker 2: And that's the yeah, that's the conclusion that she makes. Right, 944 00:51:28,560 --> 00:51:32,160 Speaker 2: So this becomes an intrusive thought in her head, primarily 945 00:51:32,160 --> 00:51:36,480 Speaker 2: the idea that like everyone isn't going along with her, right, Like, 946 00:51:36,520 --> 00:51:38,919 Speaker 2: she doesn't want to create this hell. She just thinks 947 00:51:38,960 --> 00:51:41,600 Speaker 2: that she has to. So she's like very focused on 948 00:51:41,680 --> 00:51:45,160 Speaker 2: like trying to convince these other people in the rationalist 949 00:51:45,239 --> 00:51:49,000 Speaker 2: culture to become vegan. 
Anyway, she writes this quote, I 950 00:51:49,040 --> 00:51:51,920 Speaker 2: thought it had to be subconsciously influencing me, damaging 951 00:51:51,960 --> 00:51:54,640 Speaker 2: my effectiveness, that I had done more harm than 952 00:51:54,640 --> 00:51:57,480 Speaker 2: I can imagine by thinking these things, because I had 953 00:51:57,520 --> 00:52:00,400 Speaker 2: the hubris to think info hazards didn't exist, worse to 954 00:52:00,400 --> 00:52:03,040 Speaker 2: feel resigned, a grim sort of pride in my previous 955 00:52:03,160 --> 00:52:05,520 Speaker 2: choice to fight for sentient life, although it damned me, 956 00:52:05,800 --> 00:52:08,480 Speaker 2: and the gaps between. Do not think about that, you moron. 957 00:52:08,600 --> 00:52:11,319 Speaker 2: Do not think about that, you moron. Pride, which may 958 00:52:11,360 --> 00:52:14,200 Speaker 2: have led the intrusive thoughts to resurface, and 959 00:52:14,239 --> 00:52:17,640 Speaker 2: progress to resume. In other words, my ego had perhaps 960 00:52:17,760 --> 00:52:23,319 Speaker 2: damned the universe. So, man, I don't fully get all 961 00:52:23,360 --> 00:52:25,959 Speaker 2: of what she's saying here. But it's also because she's 962 00:52:26,000 --> 00:52:28,160 Speaker 2: like just spun out into madness at this point. 963 00:52:28,400 --> 00:52:32,879 Speaker 3: Yeah, she lives in it now. It's so, yeah, 964 00:52:32,920 --> 00:52:35,279 Speaker 3: we've been talking about it for however long, 965 00:52:35,400 --> 00:52:37,719 Speaker 3: she's, she's so far away from us even. 966 00:52:39,280 --> 00:52:41,239 Speaker 2: Yeah, and it is, it is deeply, I've read a 967 00:52:41,239 --> 00:52:44,719 Speaker 2: lot of her writing. It is deeply hard to understand 968 00:52:44,800 --> 00:52:47,160 Speaker 2: pieces of it here, man. 969 00:52:47,239 --> 00:52:48,720 Speaker 3: But she is at war with herself. 970 00:52:49,280 --> 00:52:51,200 Speaker 2: She is for sure at war with herself. 971 00:52:52,280 --> 00:52:52,600 Speaker 3: Now. 972 00:52:53,719 --> 00:52:56,960 Speaker 2: She is at this point attending rationalist events in the Bay, 973 00:52:57,080 --> 00:52:59,280 Speaker 2: and a lot of the people at those events are older, 974 00:52:59,600 --> 00:53:02,840 Speaker 2: more influential men, some of whom are influential in the 975 00:53:02,880 --> 00:53:06,040 Speaker 2: tech industry, all of whom have a lot more money 976 00:53:06,040 --> 00:53:09,160 Speaker 2: than her. And some of these people are members of 977 00:53:09,160 --> 00:53:12,520 Speaker 2: an organization called CFAR, the Center for Applied Rationality, 978 00:53:12,560 --> 00:53:16,040 Speaker 2: which is a nonprofit founded to help people get better 979 00:53:16,080 --> 00:53:18,920 Speaker 2: at pursuing their goals. It's a self help company, right, 980 00:53:18,960 --> 00:53:22,120 Speaker 2: that runs self help seminars. This is the same 981 00:53:22,160 --> 00:53:25,120 Speaker 2: as like a Tony Robbins thing. Right, We're all just 982 00:53:25,160 --> 00:53:26,960 Speaker 2: trying to get you to sign up and then get 983 00:53:27,000 --> 00:53:28,719 Speaker 2: you to sign up for the next workshop and the 984 00:53:28,760 --> 00:53:31,919 Speaker 2: next workshop and the next workshop, like all self help 985 00:53:31,960 --> 00:53:35,880 Speaker 2: people do. Yeah, there's no difference between this and Tony Robbins.
986 00:53:36,640 --> 00:53:39,359 Speaker 2: So Ziz goes to this event and she has a 987 00:53:39,400 --> 00:53:43,520 Speaker 2: long conversation with several members of CFAR, who I think 988 00:53:43,520 --> 00:53:47,080 Speaker 2: are clearly kind of, my interpretation of this is that 989 00:53:47,120 --> 00:53:50,279 Speaker 2: they're trying to groom her to get a new recruit, because 990 00:53:50,280 --> 00:53:53,360 Speaker 2: they think this chick's clearly brilliant, she'll find her way in 991 00:53:53,400 --> 00:53:55,759 Speaker 2: the industry, and we want her money, right, you know, 992 00:53:55,800 --> 00:53:57,879 Speaker 2: maybe we want her to do some free work for us too, 993 00:53:58,000 --> 00:54:01,799 Speaker 2: but like, let's, you know, uh, we got to reel 994 00:54:01,840 --> 00:54:04,960 Speaker 2: this fish in, right. So this is described as an 995 00:54:04,960 --> 00:54:08,560 Speaker 2: academic conference by people who are in the AI risk 996 00:54:08,640 --> 00:54:12,120 Speaker 2: field and rationalism, you know, thinking of ways to save 997 00:54:12,239 --> 00:54:15,600 Speaker 2: the universe, because only the true, the super geniuses can 998 00:54:15,640 --> 00:54:19,400 Speaker 2: do that. The actual, why I'm really glad that I 999 00:54:19,480 --> 00:54:23,040 Speaker 2: read Ziz's account here is, I've been reading about these 1000 00:54:23,080 --> 00:54:25,400 Speaker 2: people for a long time. I've been reading about their beliefs. 1001 00:54:25,680 --> 00:54:31,200 Speaker 2: I felt there's some cult stuff here. When Ziz laid 1002 00:54:31,239 --> 00:54:34,920 Speaker 2: out what happened at this seminar, this self help 1003 00:54:35,239 --> 00:54:37,799 Speaker 2: seminar put on by these people very close to Yudkowsky, 1004 00:54:39,320 --> 00:54:42,440 Speaker 2: this is, it's almost exactly the same as a Synanon meeting. 1005 00:54:43,040 --> 00:54:45,480 Speaker 2: Like, it's the same stuff, it's exact, and it's the 1006 00:54:45,520 --> 00:54:48,799 Speaker 2: same shit. It's the same as accounts of like big, 1007 00:54:49,080 --> 00:54:52,480 Speaker 2: like, self help movement things from like the seventies and 1008 00:54:52,520 --> 00:54:55,240 Speaker 2: stuff that I've read. That, that, that's when it really 1009 00:54:55,239 --> 00:54:55,759 Speaker 2: clicked to me. 1010 00:54:55,880 --> 00:54:56,080 Speaker 3: Right. 1011 00:54:56,480 --> 00:54:59,000 Speaker 2: Quote, here's a description of one of them, because they 1012 00:54:59,040 --> 00:55:01,320 Speaker 2: have, you know, speeches, and they break out into groups 1013 00:55:01,360 --> 00:55:06,279 Speaker 2: to do different exercises, right: There were Hamming circles. Per 1014 00:55:06,360 --> 00:55:09,400 Speaker 2: person, take turns having everyone else spend twenty minutes trying 1015 00:55:09,400 --> 00:55:12,000 Speaker 2: to solve the most important problem in your life, to you. 1016 00:55:12,520 --> 00:55:14,760 Speaker 2: I didn't pick the most important problem in my life 1017 00:55:14,800 --> 00:55:17,040 Speaker 2: because secrets. I think I used my turn on a 1018 00:55:17,080 --> 00:55:19,399 Speaker 2: problem I thought they might actually be able to help 1019 00:55:19,440 --> 00:55:21,919 Speaker 2: with, the fact that it did, although it didn't seem 1020 00:55:21,920 --> 00:55:24,719 Speaker 2: to affect my productivity or willpower at all, i.e. 1021 00:55:24,840 --> 00:55:27,880 Speaker 2: I was inhumanly determined
basically all the time, I 1022 00:55:27,920 --> 00:55:30,520 Speaker 2: still felt terrible all the time, that I was hurting 1023 00:55:30,560 --> 00:55:34,160 Speaker 2: from, to some degree, relinquishing my humanity. I was sort 1024 00:55:34,160 --> 00:55:36,200 Speaker 2: of vaguing about the pain of being trans and having 1025 00:55:36,200 --> 00:55:39,839 Speaker 2: decided not to transition. And so like, this is a 1026 00:55:39,880 --> 00:55:42,279 Speaker 2: part of the thing. You build a connection between other 1027 00:55:42,320 --> 00:55:44,240 Speaker 2: people in this group by getting people to like spill 1028 00:55:44,280 --> 00:55:47,040 Speaker 2: their secrets to each other. It's a thing Scientology does. 1029 00:55:47,080 --> 00:55:48,920 Speaker 2: It's a thing they did at Synanon: tell 1030 00:55:48,960 --> 00:55:49,960 Speaker 2: me your darkest secret. 1031 00:55:50,000 --> 00:55:50,200 Speaker 3: Right. 1032 00:55:51,960 --> 00:55:54,759 Speaker 2: And she's not fully willing to, because she doesn't want 1033 00:55:54,800 --> 00:55:58,480 Speaker 2: to come out to this group of people yet. And, 1034 00:55:59,120 --> 00:56:00,879 Speaker 2: you know, part of what, I forget. 1035 00:56:00,600 --> 00:56:03,200 Speaker 3: That she's also dealing with that entire, yes. 1036 00:56:03,880 --> 00:56:07,960 Speaker 2: Wow, yeah. And the Hamming circle doesn't sound so 1037 00:56:08,080 --> 00:56:10,560 Speaker 2: bad, if you'll recall. And as you mentioned, this 1038 00:56:10,640 --> 00:56:13,120 Speaker 2: was really in part one: Synanon would 1039 00:56:13,120 --> 00:56:15,120 Speaker 2: have people break into circles where they would insult and 1040 00:56:15,120 --> 00:56:17,840 Speaker 2: attack each other in order to create a traumatic experience 1041 00:56:17,840 --> 00:56:20,480 Speaker 2: that would bond them together and with the cult. These 1042 00:56:20,480 --> 00:56:23,759 Speaker 2: Hamming circles are weird, but they're not that. But there's 1043 00:56:23,800 --> 00:56:28,839 Speaker 2: another exercise they did next called doom circles. Quote. There 1044 00:56:28,880 --> 00:56:32,560 Speaker 2: were doom circles where each person, including themselves, took turns 1045 00:56:32,560 --> 00:56:36,279 Speaker 2: having everyone else bluntly but compassionately say why they were 1046 00:56:36,320 --> 00:56:40,840 Speaker 2: doomed using blindsight. Someone decided and set a precedent of 1047 00:56:40,880 --> 00:56:43,880 Speaker 2: starting these off with a sort of ritual incantation, we 1048 00:56:44,000 --> 00:56:46,520 Speaker 2: now invoke and bow to the doom gods, and waving 1049 00:56:46,560 --> 00:56:49,400 Speaker 2: their hands saying doom. I said I'd never bow to 1050 00:56:49,440 --> 00:56:51,319 Speaker 2: the doom gods. And while everyone else said that, I 1051 00:56:51,360 --> 00:56:53,600 Speaker 2: flipped the double bird to the heavens and said fuck 1052 00:56:53,640 --> 00:56:57,560 Speaker 2: you instead. Person A, that's this member of CFAR that 1053 00:56:57,640 --> 00:57:02,040 Speaker 2: she admires, found this and joined in. Some people 1054 00:57:02,080 --> 00:57:04,319 Speaker 2: brought up that they felt like they were only as 1055 00:57:04,400 --> 00:57:08,120 Speaker 2: morally valuable as half a person. This irked me.
I 1056 00:57:08,200 --> 00:57:11,120 Speaker 2: said they were whole persons and don't be stupid like that. 1057 00:57:11,640 --> 00:57:14,439 Speaker 2: Like, if they wanted to sacrifice themselves, they could weigh 1058 00:57:14,520 --> 00:57:17,640 Speaker 2: one versus seven billion. They didn't have to falsely denigrate 1059 00:57:17,720 --> 00:57:20,960 Speaker 2: themselves as less than one person. They didn't listen. When 1060 00:57:20,960 --> 00:57:23,480 Speaker 2: it was my turn concerning myself, I said my doom 1061 00:57:23,560 --> 00:57:25,760 Speaker 2: was that I could succeed at the things I tried, 1062 00:57:25,880 --> 00:57:28,560 Speaker 2: succeed exceptionally well. Like, I bet I could in ten 1063 00:57:28,640 --> 00:57:31,400 Speaker 2: years have earned to give like ten million dollars through startups, 1064 00:57:31,520 --> 00:57:33,520 Speaker 2: and it would still be too little, too late, like 1065 00:57:33,560 --> 00:57:35,800 Speaker 2: I came into this game too late. The world would 1066 00:57:35,840 --> 00:57:41,040 Speaker 2: still burn. And first off, like, this is, you know, 1067 00:57:41,080 --> 00:57:43,440 Speaker 2: it's a variant of the Synanon thing going on. 1068 00:57:43,560 --> 00:57:45,720 Speaker 2: You're telling people why they're doomed, right, like why they 1069 00:57:45,760 --> 00:57:49,200 Speaker 2: won't succeed in life, you know. But it's also, one 1070 00:57:49,240 --> 00:57:51,160 Speaker 2: of the things here, these people are saying they feel 1071 00:57:51,200 --> 00:57:54,080 Speaker 2: like less than a person. A major topic of discussion 1072 00:57:54,120 --> 00:57:56,880 Speaker 2: in the community at the time is, if you don't 1073 00:57:56,920 --> 00:58:01,200 Speaker 2: think you can succeed in business and make money, is 1074 00:58:01,280 --> 00:58:04,640 Speaker 2: the best thing with the highest net value you can 1075 00:58:04,720 --> 00:58:08,480 Speaker 2: do to take out an insurance policy on yourself and commit suicide, 1076 00:58:08,720 --> 00:58:11,440 Speaker 2: oh my god, and then have the money donated to 1077 00:58:11,480 --> 00:58:14,600 Speaker 2: a rationalist organization. That's a major topic of discussion that 1078 00:58:14,680 --> 00:58:16,960 Speaker 2: like Ziz grapples with, a lot of these people grapple 1079 00:58:17,000 --> 00:58:20,120 Speaker 2: with, right, because they're obsessed with the idea of like, 1080 00:58:20,720 --> 00:58:22,920 Speaker 2: oh my god, I might be net negative value. Right, 1081 00:58:23,000 --> 00:58:24,880 Speaker 2: if I can't do this or can't do this, I 1082 00:58:24,880 --> 00:58:27,480 Speaker 2: could be a net negative value individual. And that means 1083 00:58:27,520 --> 00:58:30,200 Speaker 2: like I'm not contributing to the solution. And there's nothing 1084 00:58:30,200 --> 00:58:31,840 Speaker 2: worse than not contributing to the solution. 1085 00:58:33,840 --> 00:58:34,960 Speaker 3: Were there people who did that? 1086 00:58:37,120 --> 00:58:42,760 Speaker 2: I am not aware. There are people who commit suicide 1087 00:58:42,960 --> 00:58:45,000 Speaker 2: in this community, I will say that. Like, there are 1088 00:58:45,040 --> 00:58:48,400 Speaker 2: a number of suicides tied to this community. I don't 1089 00:58:48,400 --> 00:58:52,880 Speaker 2: know if the actual insurance con thing happened, but it's 1090 00:58:52,920 --> 00:58:56,120 Speaker 2: like a seriously discussed thing.
And it's seriously discussed because 1091 00:58:57,160 --> 00:58:59,240 Speaker 2: all of these people talk about the value of 1092 00:58:59,280 --> 00:59:04,160 Speaker 2: their own lives in purely, like, mechanistic terms: how much 1093 00:59:04,520 --> 00:59:08,200 Speaker 2: money or expected value can I produce? Like, that is 1094 00:59:08,240 --> 00:59:11,880 Speaker 2: a person, and that's why a person matters, right. And 1095 00:59:11,960 --> 00:59:15,560 Speaker 2: the term they use is morally valuable, right, like, that's, 1096 00:59:15,640 --> 00:59:19,320 Speaker 2: that's what it means. You're a worthwhile human being if you're, 1097 00:59:19,360 --> 00:59:22,320 Speaker 2: morally, if you're creating a net positive benefit to the 1098 00:59:22,360 --> 00:59:24,560 Speaker 2: world in the way they define it. And so a 1099 00:59:24,640 --> 00:59:27,520 Speaker 2: lot of these people are, yes, there are people who 1100 00:59:27,560 --> 00:59:29,919 Speaker 2: are depressed, and there are people who kill themselves because 1101 00:59:29,960 --> 00:59:32,480 Speaker 2: they come to the conclusion that they're a net negative person, 1102 00:59:32,960 --> 00:59:35,240 Speaker 2: right. Like, that, that is a thing at the edge 1103 00:59:35,320 --> 00:59:39,040 Speaker 2: of all of this shit that's really fucked up. And 1104 00:59:39,080 --> 00:59:41,720 Speaker 2: that's, that's what this doom circle is about. It's everybody 1105 00:59:41,760 --> 00:59:45,480 Speaker 2: like flipping out and telling each other, 1106 00:59:45,520 --> 00:59:47,960 Speaker 2: I think you might only be as 1107 00:59:48,000 --> 00:59:50,280 Speaker 2: morally valuable as half a person, right. Like, that's, people 1108 00:59:50,320 --> 00:59:52,480 Speaker 2: are saying that, right. Like, that's what's going on here, 1109 00:59:52,560 --> 00:59:55,880 Speaker 2: you know. Like it's not the Synanon thing of, like, 1110 00:59:55,920 --> 00:59:59,280 Speaker 2: screaming, like, you're a, you know, using the f slur 1111 00:59:59,360 --> 01:00:02,600 Speaker 2: a million times or whatever. But it's very bad. 1112 01:00:03,000 --> 01:00:06,760 Speaker 3: No, this is, this is, this is awful for, like. 1113 01:00:06,720 --> 01:00:09,640 Speaker 2: One thing, I don't know, my feeling is you have 1114 01:00:09,680 --> 01:00:11,760 Speaker 2: an inherent value because you're a person. 1115 01:00:12,520 --> 01:00:16,240 Speaker 3: Yeah, that's a great place to start, you know, 1116 01:00:16,360 --> 01:00:19,600 Speaker 3: instead of leading people to destroy themselves. Like, it's, yeah. 1117 01:00:21,040 --> 01:00:24,200 Speaker 2: It's, it's so. It's such a bleak way of looking 1118 01:00:24,240 --> 01:00:24,800 Speaker 2: at things. 1119 01:00:25,560 --> 01:00:27,480 Speaker 3: It's so crazy too. Where were these held? I just, 1120 01:00:27,600 --> 01:00:29,080 Speaker 3: in my head, I'm like, this is just happening in 1121 01:00:29,160 --> 01:00:31,000 Speaker 3: like a ballroom at a Radisson. 1122 01:00:31,320 --> 01:00:34,760 Speaker 2: I think it is, or a convention center, you know, 1123 01:00:34,760 --> 01:00:37,960 Speaker 2: the different kind of public spaces.
I don't know, Like honestly, 1124 01:00:38,000 --> 01:00:40,040 Speaker 2: if you've been to like an anime convention or a 1125 01:00:40,040 --> 01:00:42,240 Speaker 2: Magic the Gathering convention somewhere in the Bay, you may 1126 01:00:42,240 --> 01:00:43,640 Speaker 2: have been in one of the rooms they did these, 1127 01:00:43,680 --> 01:00:47,120 Speaker 2: And I don't know exactly where they hold this. So 1128 01:00:48,120 --> 01:00:51,640 Speaker 2: the person A mentioned above, this like person who's like 1129 01:00:51,640 --> 01:00:54,720 Speaker 2: affiliated with the organization that I think is a recruiter 1130 01:00:55,120 --> 01:00:57,760 Speaker 2: h looking for young people who can be cultivated to 1131 01:00:57,800 --> 01:01:01,120 Speaker 2: pay for classes. Right, this person, it's very clear to 1132 01:01:01,160 --> 01:01:03,800 Speaker 2: them that Zizz is at the height of her vulnerability, 1133 01:01:04,560 --> 01:01:06,560 Speaker 2: and so he tries to take advantage of that. So 1134 01:01:06,600 --> 01:01:10,520 Speaker 2: he and another person from the organization engage Ziz during 1135 01:01:10,560 --> 01:01:14,960 Speaker 2: a break. Ziz, who's extremely insecure, asks them point blank, 1136 01:01:15,400 --> 01:01:18,440 Speaker 2: what do you think my net value ultimately will be 1137 01:01:18,480 --> 01:01:22,320 Speaker 2: in life? Right? And again there's like an element of this. 1138 01:01:22,320 --> 01:01:25,000 Speaker 2: It's almost like rationalist calvinism, where it's like it's actually 1139 01:01:25,000 --> 01:01:28,800 Speaker 2: decided ahead of time by your inherent, immutable characteristics, you know, 1140 01:01:29,240 --> 01:01:32,000 Speaker 2: if you are a person who can do good. Quote. 1141 01:01:32,160 --> 01:01:34,200 Speaker 2: I asked person A if they expected me to be 1142 01:01:34,280 --> 01:01:38,000 Speaker 2: net negative. They said yes. After a moment, they asked 1143 01:01:38,000 --> 01:01:39,880 Speaker 2: me what I was feeling or something like that. I 1144 01:01:39,880 --> 01:01:43,240 Speaker 2: said something like dazed and sad. They asked why sad. 1145 01:01:43,520 --> 01:01:45,560 Speaker 2: I said I might leave the field as a consequence 1146 01:01:45,560 --> 01:01:47,960 Speaker 2: and maybe something else. I said I needed time to 1147 01:01:48,000 --> 01:01:50,880 Speaker 2: process or think. And so she goes home after this 1148 01:01:50,920 --> 01:01:52,920 Speaker 2: guy is saying like, yeah, I think your life's probably 1149 01:01:53,000 --> 01:01:56,400 Speaker 2: net negative value, and sleeps the rest of the day, 1150 01:01:57,120 --> 01:02:00,919 Speaker 2: and she wakes up the next morning and comes back 1151 01:02:00,960 --> 01:02:05,880 Speaker 2: to the second day of this thing, and yeah, Ziz 1152 01:02:06,000 --> 01:02:09,040 Speaker 2: goes back and she tells this person, Okay, here's what 1153 01:02:09,080 --> 01:02:11,480 Speaker 2: I'm gonna do. I'm going to pick a group of 1154 01:02:11,520 --> 01:02:14,840 Speaker 2: three people at the event I respect, including you, and 1155 01:02:14,920 --> 01:02:17,040 Speaker 2: if two of them vote that they think I have 1156 01:02:17,120 --> 01:02:21,640 Speaker 2: a net negative value quote, I'll leave EA and Existential 1157 01:02:21,720 --> 01:02:24,840 Speaker 2: risk and the rationalist community and so on forever. I'd 1158 01:02:24,920 --> 01:02:27,920 Speaker 2: transition and move, probably to Seattle. 
I heard it was 1159 01:02:27,960 --> 01:02:30,600 Speaker 2: relatively nice for trans people, and there do what I 1160 01:02:30,600 --> 01:02:32,680 Speaker 2: could to be a normie, retool my mind as much 1161 01:02:32,720 --> 01:02:36,320 Speaker 2: as possible to be stable, unchanging, a normie, gradually abandon 1162 01:02:36,360 --> 01:02:38,960 Speaker 2: my Facebook account and email, use a name change as 1163 01:02:38,960 --> 01:02:42,920 Speaker 2: a story for that. And God, that would have been 1164 01:02:42,960 --> 01:02:45,280 Speaker 2: the best thing for her. That's what you see. 1165 01:02:45,520 --> 01:02:48,640 Speaker 3: But, sliver of hope, like, yeah, oh man. 1166 01:02:48,800 --> 01:02:52,720 Speaker 2: She sees this as a nightmare. Right, this is the 1167 01:02:52,760 --> 01:02:56,640 Speaker 2: worst case scenario for her. Right, because you're not part 1168 01:02:56,680 --> 01:02:59,240 Speaker 2: of, right, you're not part of the, you're not part 1169 01:02:59,240 --> 01:03:02,400 Speaker 2: of the cause. You know, you have no, you 1170 01:03:02,440 --> 01:03:05,280 Speaker 2: have no involvement in the great quest to save humanity. 1171 01:03:05,320 --> 01:03:07,880 Speaker 2: That's worse than death, almost, right. It's its own kind. 1172 01:03:07,720 --> 01:03:09,480 Speaker 3: Of hell, though, right? To think that you have this 1173 01:03:09,640 --> 01:03:14,640 Speaker 3: enlightenment and that you, that you weren't good enough to. 1174 01:03:14,880 --> 01:03:18,760 Speaker 2: And she talks a lot about how, I'd probably just kill myself, 1175 01:03:18,800 --> 01:03:22,840 Speaker 2: you know, that's the logical thing to do. It's so 1176 01:03:23,000 --> 01:03:26,080 Speaker 2: fucked up, it's so fucked up. But also, if she's 1177 01:03:26,080 --> 01:03:28,440 Speaker 2: trying to live a normal life as a normie, and, 1178 01:03:28,840 --> 01:03:31,280 Speaker 2: she, she refers to like being a normie as like 1179 01:03:31,680 --> 01:03:34,160 Speaker 2: just trying to be nice to people, because again, that's useless. 1180 01:03:35,200 --> 01:03:37,280 Speaker 2: So her fear here is that she would be 1181 01:03:37,320 --> 01:03:40,959 Speaker 2: a causal negative if she does this, right. And also, 1182 01:03:41,200 --> 01:03:44,880 Speaker 2: the robot god that comes about might put her in hell, right. 1183 01:03:45,440 --> 01:03:49,560 Speaker 3: Because that's also looming. Yeah, after every, for every decision, right. Yeah. 1184 01:03:49,560 --> 01:03:51,800 Speaker 2: And the thing here, she, she expressed, she tells these 1185 01:03:51,800 --> 01:03:54,320 Speaker 2: guys a story, and it really shows, both in this 1186 01:03:54,400 --> 01:03:57,800 Speaker 2: community and in her, how little value they actually have 1187 01:03:58,000 --> 01:04:00,919 Speaker 2: for, like, human life. I told a story about 1188 01:04:00,960 --> 01:04:03,160 Speaker 2: a time I had killed four ants in a bathtub 1189 01:04:03,200 --> 01:04:05,520 Speaker 2: where I wanted to take a shower before going to work. 1190 01:04:05,840 --> 01:04:08,680 Speaker 2: I'd considered, can I just not take a shower, and 1191 01:04:08,760 --> 01:04:11,280 Speaker 2: presumed me smelling bad at work would, because of big 1192 01:04:11,360 --> 01:04:13,560 Speaker 2: numbers in the fate of the world and stuff, make 1193 01:04:13,640 --> 01:04:16,760 Speaker 2: the world worse than the deaths of four basically causally 1194 01:04:16,840 --> 01:04:20,120 Speaker 2: isolated people.
I considered getting paper and a cup and 1195 01:04:20,120 --> 01:04:22,560 Speaker 2: taking them elsewhere, and I figured there were decent odds 1196 01:04:22,600 --> 01:04:24,080 Speaker 2: if I did, I'd be late to work and it 1197 01:04:24,120 --> 01:04:26,040 Speaker 2: would probably make the world worse in the long run. 1198 01:04:26,040 --> 01:04:29,919 Speaker 2: So again she considers ants identical to human beings, and 1199 01:04:30,480 --> 01:04:33,400 Speaker 2: she is also saying it was worth killing four of 1200 01:04:33,440 --> 01:04:36,120 Speaker 2: them because they're causally isolated so that I could get 1201 01:04:36,160 --> 01:04:38,000 Speaker 2: to work in time, because I'm working for the cause. 1202 01:04:40,840 --> 01:04:44,040 Speaker 2: She's also in a bad place here. Yeah. 1203 01:04:44,200 --> 01:04:47,840 Speaker 3: The crazy thing about her is like the amount 1204 01:04:47,920 --> 01:04:50,440 Speaker 3: of thinking just to like get in the shower to 1205 01:04:50,480 --> 01:04:53,520 Speaker 3: go to work, you know, you know what I mean, 1206 01:04:54,120 --> 01:04:59,200 Speaker 3: like that, oh, it just seems like it makes everything, yeah, 1207 01:04:59,680 --> 01:05:04,640 Speaker 3: every action is so loaded, yes, yes. 1208 01:05:04,600 --> 01:05:09,440 Speaker 2: It's, it's so wild to me, both this 1209 01:05:09,520 --> 01:05:12,480 Speaker 2: like mix of like fucking Jain Buddhist compassion, of like 1210 01:05:12,520 --> 01:05:15,040 Speaker 2: an ant is no less than I, or an ant 1211 01:05:15,120 --> 01:05:17,160 Speaker 2: is no less than a human being, right, we are 1212 01:05:17,200 --> 01:05:20,080 Speaker 2: all, these are all lives. And then but also it's 1213 01:05:20,080 --> 01:05:21,520 Speaker 2: fine for me to kill a bunch of them to 1214 01:05:21,560 --> 01:05:24,080 Speaker 2: go to work on time because like they're causally isolated, 1215 01:05:24,080 --> 01:05:29,280 Speaker 2: so they're basically not people. Like it's, it's so weird. 1216 01:05:29,720 --> 01:05:34,479 Speaker 2: Like and again it's getting a lot clearer here why 1217 01:05:35,320 --> 01:05:37,960 Speaker 2: this lady and her ideas end in a bunch of 1218 01:05:38,040 --> 01:05:39,240 Speaker 2: people getting shot. 1219 01:05:39,600 --> 01:05:40,640 Speaker 3: Yeah and stabbed. 1220 01:05:41,600 --> 01:05:47,080 Speaker 2: Okay, there's a samurai sword later in the story, my friend, that's. 1221 01:05:46,840 --> 01:05:48,280 Speaker 3: The one thing this has been missing. 1222 01:05:48,480 --> 01:05:53,200 Speaker 2: Yes, yes. So they continue, these guys, to have a 1223 01:05:53,320 --> 01:05:56,320 Speaker 2: very abusive conversation with this young person, and she clearly, 1224 01:05:56,360 --> 01:05:57,240 Speaker 2: she trusts them enough. 1225 01:05:57,120 --> 01:06:00,440 Speaker 3: To have a conversation where she asks for the two, yeah, okay. 1226 01:06:00,480 --> 01:06:04,120 Speaker 2: Yeah, and she tells them she's trans, right. And this 1227 01:06:04,160 --> 01:06:06,400 Speaker 2: gives you an idea of like how kind of predatory 1228 01:06:06,480 --> 01:06:08,320 Speaker 2: some of the stuff going on in this community is. 1229 01:06:08,920 --> 01:06:11,440 Speaker 2: They asked what I'd do with a female body.
They 1230 01:06:11,440 --> 01:06:13,360 Speaker 2: were trying to get me to admit what I actually 1231 01:06:13,360 --> 01:06:15,720 Speaker 2: wanted to do as the first thing in heaven, heaven 1232 01:06:15,760 --> 01:06:18,880 Speaker 2: being, there's this idea, especially amongst like some trans members 1233 01:06:18,920 --> 01:06:22,560 Speaker 2: of the rationalist community, like all of them basically 1234 01:06:22,560 --> 01:06:24,880 Speaker 2: believe a robot's going to make heaven, right. And obviously, 1235 01:06:25,320 --> 01:06:27,360 Speaker 2: like, a number of the folks who are in 1236 01:06:27,400 --> 01:06:29,000 Speaker 2: this who are trans are like, and in heaven 1237 01:06:29,080 --> 01:06:31,120 Speaker 2: you just kind of get the body you want immediately, right, 1238 01:06:33,120 --> 01:06:36,000 Speaker 2: so, these guys, they were trying to get me to 1239 01:06:36,040 --> 01:06:38,240 Speaker 2: admit that what I actually wanted to do as the 1240 01:06:38,240 --> 01:06:41,040 Speaker 2: first thing in heaven was masturbate in a female body, 1241 01:06:41,480 --> 01:06:44,560 Speaker 2: and they follow this up by sitting really close to her, 1242 01:06:44,600 --> 01:06:48,840 Speaker 2: close enough that she gets uncomfortable. And then a really, 1243 01:06:49,240 --> 01:06:53,520 Speaker 2: really rationalist conversation follows. They asked if I felt trapped. 1244 01:06:53,800 --> 01:06:57,560 Speaker 2: I may have clarified, physically; they may have said sure. Afterward, 1245 01:06:57,600 --> 01:06:59,960 Speaker 2: I answered no to that question, under the likely justified 1246 01:07:00,000 --> 01:07:02,080 Speaker 2: belief that it was framed that way. They asked 1247 01:07:02,080 --> 01:07:04,080 Speaker 2: why not. I said I was pretty sure I could 1248 01:07:04,080 --> 01:07:07,040 Speaker 2: take them in a fight. They prodded for details, why 1249 01:07:07,080 --> 01:07:08,800 Speaker 2: I thought so, and then how I thought a fight 1250 01:07:08,840 --> 01:07:11,080 Speaker 2: between us would go. I asked, what kind of fight, 1251 01:07:11,160 --> 01:07:13,440 Speaker 2: like a physical, unarmed fight to the death right now? 1252 01:07:13,800 --> 01:07:16,320 Speaker 2: And why? What were my payouts? This was over the 1253 01:07:16,320 --> 01:07:19,360 Speaker 2: fate of the multiverse. Triggering actions by other people, i.e. 1254 01:07:19,560 --> 01:07:22,760 Speaker 2: imprisonment or murder, was not relevant. So they decide, 1255 01:07:23,000 --> 01:07:25,160 Speaker 2: they make this into, again, these people are all addicted 1256 01:07:25,200 --> 01:07:27,840 Speaker 2: to dumb game theory stuff, right, Okay, so what is 1257 01:07:27,880 --> 01:07:29,920 Speaker 2: this fight? Is this fight over the fate of the multiverse? 1258 01:07:29,960 --> 01:07:32,160 Speaker 2: Are we in a, you know, an alternate reality where 1259 01:07:32,200 --> 01:07:33,840 Speaker 2: like no one will come and intervene and there's no 1260 01:07:33,920 --> 01:07:36,080 Speaker 2: cops, we're the only people in the world or whatever. 1261 01:07:37,200 --> 01:07:39,760 Speaker 2: So they tell her like, yeah, imagine there's no consequences 1262 01:07:39,840 --> 01:07:41,840 Speaker 2: legally to whatever you do, and we're fighting over the 1263 01:07:41,840 --> 01:07:44,280 Speaker 2: fate of the multiverse.
And so she proceeds to give 1264 01:07:44,280 --> 01:07:47,080 Speaker 2: an extremely elaborate discussion of how she'll gouge out their 1265 01:07:47,120 --> 01:07:49,560 Speaker 2: eyes and try to destroy their prefrontal lobes and then 1266 01:07:49,600 --> 01:07:52,560 Speaker 2: stomp on their skulls until they die. And it's both, 1267 01:07:52,600 --> 01:07:55,800 Speaker 2: it's like, it's nonsense. It's like how ten year olds 1268 01:07:55,800 --> 01:07:59,600 Speaker 2: think fights work. It's also based on this game 1269 01:07:59,680 --> 01:08:02,480 Speaker 2: theory attitude of fighting that they have, which is like, 1270 01:08:03,120 --> 01:08:05,280 Speaker 2: you have to make this kind of timeless decision 1271 01:08:05,320 --> 01:08:08,480 Speaker 2: that any fight is, you're, you're just going to, right. 1272 01:08:08,280 --> 01:08:10,960 Speaker 3: The hardest confrontation, right, Yes, I suppose you have to 1273 01:08:10,960 --> 01:08:11,960 Speaker 3: be the most violent. 1274 01:08:12,080 --> 01:08:14,360 Speaker 2: Yes, yes, because that will make other people not want 1275 01:08:14,360 --> 01:08:17,160 Speaker 2: to attack you, as opposed to like what normal people 1276 01:08:17,240 --> 01:08:19,800 Speaker 2: understand about like real fights, which is, if you have 1277 01:08:19,880 --> 01:08:23,280 Speaker 2: to do one, if you have to, you like try 1278 01:08:23,360 --> 01:08:25,360 Speaker 2: to just like hit them, hit them 1279 01:08:25,360 --> 01:08:27,840 Speaker 2: somewhere that's going to shock them, and then run like 1280 01:08:27,840 --> 01:08:31,479 Speaker 2: a motherfucker, right, you get them, get out of there. Like, 1281 01:08:32,200 --> 01:08:35,400 Speaker 2: if you have to, like ideally just run like a motherfucker. 1282 01:08:35,439 --> 01:08:37,040 Speaker 2: But if you have to strike somebody, you know, yeah, 1283 01:08:37,200 --> 01:08:38,800 Speaker 2: go for the eye and then run like a son 1284 01:08:38,800 --> 01:08:41,679 Speaker 2: of a bitch, you know. Like, but there's no run 1285 01:08:41,720 --> 01:08:43,840 Speaker 2: like a son of a bitch here, because the point 1286 01:08:43,920 --> 01:08:48,400 Speaker 2: in part is this like timeless decision to, anyway, this 1287 01:08:48,439 --> 01:08:51,439 Speaker 2: tells you a lot about the rationalist community. So 1288 01:08:51,479 --> 01:08:53,880 Speaker 2: she tells these people, she explains in detail how she 1289 01:08:53,920 --> 01:08:57,160 Speaker 2: would murder them if they had to fight. They're like 1290 01:08:57,840 --> 01:09:00,920 Speaker 2: sitting next to her, super close, having just asked her about masturbation. 1291 01:09:01,800 --> 01:09:04,719 Speaker 2: Here's their first question, quote. They asked if I'd rape 1292 01:09:04,720 --> 01:09:07,840 Speaker 2: their corpse. Part of me insisted this was not going 1293 01:09:07,880 --> 01:09:10,479 Speaker 2: as it was supposed to, but I decided 1294 01:09:10,560 --> 01:09:13,360 Speaker 2: inflicting discomfort in order to get reliable information was a 1295 01:09:13,400 --> 01:09:16,080 Speaker 2: valid tactic. In other words, them trying to make her 1296 01:09:16,400 --> 01:09:20,080 Speaker 2: uncomfortable to get info from her, she decides, is fine.
Also, 1297 01:09:20,360 --> 01:09:22,760 Speaker 2: the whole discussion about raping their corpses is like, well, 1298 01:09:22,880 --> 01:09:24,840 Speaker 2: if you rape, obviously, if you want to have the 1299 01:09:24,880 --> 01:09:28,000 Speaker 2: most extreme response possible, that would like make other people 1300 01:09:28,360 --> 01:09:31,439 Speaker 2: unlikely to fuck with you, knowing that you'll violate their corpse 1301 01:09:31,439 --> 01:09:35,519 Speaker 2: if you kill them, is clearly the logic, and like, that... Okay, sure, 1302 01:09:36,120 --> 01:09:37,559 Speaker 2: I love rational thought. 1303 01:09:38,479 --> 01:09:45,559 Speaker 3: Oh man, this is crazy. Sorry, this, yes, is so crazy, 1304 01:09:45,640 --> 01:09:46,759 Speaker 3: It's so nuts. 1305 01:09:47,960 --> 01:09:51,679 Speaker 2: So then they talk about psychopathy. One of these guys 1306 01:09:51,680 --> 01:09:55,400 Speaker 2: had earlier told Ziz that they thought she was a psychopath. 1307 01:09:55,720 --> 01:09:58,400 Speaker 3: But he told. 1308 01:09:57,400 --> 01:10:00,759 Speaker 2: Her that doesn't mean what it means to actual, 1309 01:10:00,840 --> 01:10:03,840 Speaker 2: like, clinicians, because psychopathy is a diagnosis there, or like what 1310 01:10:03,880 --> 01:10:06,960 Speaker 2: normal people mean. To rationalists, a lot of them think 1311 01:10:07,040 --> 01:10:11,280 Speaker 2: psychopathy is a state you can put yourself into in 1312 01:10:11,400 --> 01:10:15,400 Speaker 2: order to maximize your performance in certain situations. It's because, 1313 01:10:15,439 --> 01:10:18,360 Speaker 2: again, there's some like popular books that are 1314 01:10:18,400 --> 01:10:22,439 Speaker 2: about like the psychopath's way, the Dark Triad, and like, well, 1315 01:10:22,479 --> 01:10:24,439 Speaker 2: you know, these are the people who lead societies in 1316 01:10:24,479 --> 01:10:26,439 Speaker 2: the toughest times, and so like you need 1317 01:10:26,439 --> 01:10:29,080 Speaker 2: to optimize and engage in some of those behaviors if 1318 01:10:29,120 --> 01:10:33,120 Speaker 2: you want to win in these situations. Based on all 1319 01:10:33,160 --> 01:10:38,120 Speaker 2: of this, Ziz brings up what rationalists call the Gervais Principle. Now, 1320 01:10:38,280 --> 01:10:41,160 Speaker 2: this started as a tongue in cheek joke describing a 1321 01:10:41,240 --> 01:10:44,400 Speaker 2: rule of office dynamics based on the TV show The Office. 1322 01:10:46,240 --> 01:10:49,519 Speaker 2: Yes, it's Ricky Gervais, yes. And the idea is that 1323 01:10:49,680 --> 01:10:53,400 Speaker 2: in office environments, psychos always rise to the top. This 1324 01:10:53,479 --> 01:10:55,800 Speaker 2: is supposed to be like a negative observation, Like the 1325 01:10:55,840 --> 01:10:58,000 Speaker 2: person who wrote this initially is like, yeah, this is 1326 01:10:58,000 --> 01:10:59,920 Speaker 2: how offices work, and it's like why they're bad. 1327 01:11:00,160 --> 01:11:00,400 Speaker 3: You know. 1328 01:11:00,640 --> 01:11:04,680 Speaker 2: It's an extension of the Peter principle. And these psychopaths 1329 01:11:04,720 --> 01:11:07,960 Speaker 2: put like dumb and incompetent people in 1330 01:11:08,120 --> 01:11:12,400 Speaker 2: positions below them for a variety of reasons. It's trying to kind 1331 01:11:12,439 --> 01:11:15,800 Speaker 2: of work out why and in which ways offices are often dysfunctional.
Right, 1332 01:11:16,640 --> 01:11:19,160 Speaker 2: it's not like the original Gervais Principle thing is like 1333 01:11:19,280 --> 01:11:21,760 Speaker 2: not a bad piece of writing or whatever, but Ziz 1334 01:11:21,800 --> 01:11:24,519 Speaker 2: takes something insane out of it. I described how the 1335 01:11:24,640 --> 01:11:28,200 Speaker 2: Gervais Principle said sociopaths give up empathy, as in a 1336 01:11:27,840 --> 01:11:32,080 Speaker 2: certain chunk of social software, not literally all hardware 1337 01:11:32,280 --> 01:11:36,160 Speaker 2: accelerated modeling of people, not necessarily compassion, and with it 1338 01:11:36,240 --> 01:11:39,920 Speaker 2: happiness, destroying meaning to create power. Meaning, too, I did 1339 01:11:39,960 --> 01:11:42,200 Speaker 2: not care about. I wanted this world to live on. 1340 01:11:42,840 --> 01:11:45,200 Speaker 2: So she tells them she's come to the conclusion, I 1341 01:11:45,320 --> 01:11:49,120 Speaker 2: need to make myself into a psychopath in order to 1342 01:11:49,160 --> 01:11:51,880 Speaker 2: have the kind of mental power necessary to do the 1343 01:11:51,920 --> 01:11:55,040 Speaker 2: things that I want to do. And she largely justifies 1344 01:11:55,040 --> 01:11:58,040 Speaker 2: this by describing the beliefs of the Sith from Star Wars, 1345 01:11:59,360 --> 01:12:01,600 Speaker 2: because she thinks she needs to remake herself as a 1346 01:12:01,640 --> 01:12:08,920 Speaker 2: psychopathic evil warrior monk in order to save all of creation. Yeah, no, 1347 01:12:08,960 --> 01:12:12,200 Speaker 2: of course, yep. So this is her hitting her final form. 1348 01:12:12,240 --> 01:12:15,400 Speaker 2: And true to form, these guys are like, they don't 1349 01:12:15,400 --> 01:12:17,599 Speaker 2: say it's a good idea, but they're like, Okay, yeah, 1350 01:12:17,800 --> 01:12:19,479 Speaker 2: you know, that's not, that's not the worst thing you 1351 01:12:19,479 --> 01:12:23,040 Speaker 2: could do. Sure, you know, like I think the Sith 1352 01:12:23,080 --> 01:12:26,360 Speaker 2: stuff is kind of weird, but making yourself a psychopath makes sense. Sure, yeah, 1353 01:12:26,439 --> 01:12:28,200 Speaker 2: of course, I know a lot of guys who did that. 1354 01:12:29,240 --> 01:12:32,960 Speaker 2: That's literally what they say, right. And then they say 1355 01:12:33,000 --> 01:12:34,880 Speaker 2: that. Also, I don't even think that's what they really 1356 01:12:34,960 --> 01:12:37,280 Speaker 2: think; they say that, because the next thing they say, this 1357 01:12:37,320 --> 01:12:40,400 Speaker 2: guy person A is like, look, the best bet to later 1358 01:12:40,439 --> 01:12:43,280 Speaker 2: turn yourself from a net negative to a net positive value, 1359 01:12:43,320 --> 01:12:45,439 Speaker 2: I really believe you could do it. But to do it, 1360 01:12:45,479 --> 01:12:47,280 Speaker 2: you need to come to ten more of these seminars 1361 01:12:47,320 --> 01:12:51,960 Speaker 2: and keep taking classes here. Right, right, right. Here's a 1362 01:12:52,040 --> 01:12:55,960 Speaker 2: quote from them, er, from Ziz. She says, conditional on me 1363 01:12:56,080 --> 01:12:58,439 Speaker 2: going to a long course of circling, like these two 1364 01:12:58,800 --> 01:13:03,040 Speaker 2: organizations offered, particularly a ten weekend one, then I probably 1365 01:13:03,080 --> 01:13:09,439 Speaker 2: would not be net negative. So things are going good. 1366 01:13:09,560 --> 01:13:15,960 Speaker 2: This is, this is, you know... ah, yeah, great.
1367 01:13:17,840 --> 01:13:19,280 Speaker 3: How much does ten weekends cost? 1368 01:13:19,760 --> 01:13:22,200 Speaker 2: I don't actually know. I don't, I don't fully know 1369 01:13:22,280 --> 01:13:25,360 Speaker 2: with this. It's possible some of these, like some 1370 01:13:25,400 --> 01:13:28,439 Speaker 2: of the events, are free, but the classes cost money, 1371 01:13:28,520 --> 01:13:30,200 Speaker 2: or, but also a lot of it's like there's 1372 01:13:30,240 --> 01:13:34,519 Speaker 2: donations expected, or by doing this and being a member, 1373 01:13:34,640 --> 01:13:40,479 Speaker 2: it's expected you're going to tithe basically, like, your income, 1374 01:13:40,640 --> 01:13:42,320 Speaker 2: right, more than that. 1375 01:13:42,479 --> 01:13:45,080 Speaker 1: Ultimately, I don't know the format. Is she not going 1376 01:13:45,120 --> 01:13:48,280 Speaker 1: to be like super suspicious that people are like, you know, 1377 01:13:49,000 --> 01:13:51,639 Speaker 1: faking it or like going over the top? 1378 01:13:51,920 --> 01:13:56,120 Speaker 2: She, okay, she is, she gets actually really uncomfortable. 1379 01:13:56,160 --> 01:13:58,840 Speaker 2: They have an exercise where they're basically doing, you know, 1380 01:13:58,840 --> 01:14:01,360 Speaker 2: they're playing with love bombing, right, where everyone's like hugging 1381 01:14:01,640 --> 01:14:04,080 Speaker 2: and telling each other they love each other, and she's like, 1382 01:14:04,120 --> 01:14:05,960 Speaker 2: I don't really believe it. I just met these people. 1383 01:14:06,280 --> 01:14:08,200 Speaker 2: So she has started to, and she is going to, 1384 01:14:08,240 --> 01:14:12,680 Speaker 2: break away from these organizations pretty quickly. But this conversation 1385 01:14:12,840 --> 01:14:15,439 Speaker 2: she has with these guys is a critical part of 1386 01:14:15,520 --> 01:14:20,559 Speaker 2: like why she finally has this fracture, because number one, 1387 01:14:20,600 --> 01:14:23,160 Speaker 2: this dude keeps telling her you have a net negative 1388 01:14:23,280 --> 01:14:27,960 Speaker 2: value to the universe, right, and so she's obsessed with, 1389 01:14:28,000 --> 01:14:29,879 Speaker 2: like, how do I, and she comes to the conclusion, 1390 01:14:29,920 --> 01:14:33,160 Speaker 2: my best way of being net positive is to make 1391 01:14:33,200 --> 01:14:39,880 Speaker 2: myself into a sociopath and a Sith Lord to save 1392 01:14:39,960 --> 01:14:40,640 Speaker 2: the animals. 1393 01:14:40,680 --> 01:14:43,920 Speaker 3: Of course, it feels like the same thinking though, as 1394 01:14:43,960 --> 01:14:47,240 Speaker 3: like, the robot's gonna make... It seems to always come 1395 01:14:47,280 --> 01:14:49,680 Speaker 3: back to this idea of like, I think we just 1396 01:14:49,720 --> 01:14:50,559 Speaker 3: gotta be evil. 1397 01:14:51,840 --> 01:14:55,280 Speaker 1: It's like, yes, oh, yes, well I guess the only 1398 01:14:55,400 --> 01:14:59,320 Speaker 1: logical conclusion is doom, yep. 1399 01:15:01,560 --> 01:15:02,439 Speaker 3: Yeah, yeah. 1400 01:15:02,560 --> 01:15:05,679 Speaker 4: It's like, it feels like it's a, it's a theme here, 1401 01:15:06,120 --> 01:15:10,040 Speaker 4: mm, yep. Anyway, you want to plug anything at 1402 01:15:10,040 --> 01:15:13,240 Speaker 4: the end here? I have. 1403 01:15:13,240 --> 01:15:16,559 Speaker 3: A comedy special you can purchase on Patreon. It's called 1404 01:15:16,640 --> 01:15:19,400 Speaker 3: Birth of a Nation with a G.
You can get 1405 01:15:19,479 --> 01:15:22,120 Speaker 3: that at Patreon dot com slash. 1406 01:15:22,240 --> 01:15:28,960 Speaker 2: David Borie. Excellent, excellent. All right, folks, Well that is 1407 01:15:29,040 --> 01:15:31,360 Speaker 2: the end of the episode. David, thank you so much 1408 01:15:31,400 --> 01:15:34,080 Speaker 2: for coming on to our inaugural episodes and listening to 1409 01:15:35,120 --> 01:15:37,960 Speaker 2: some of the weirdest shit we've ever talked about on 1410 01:15:38,000 --> 01:15:38,519 Speaker 2: this show. 1411 01:15:40,160 --> 01:15:42,640 Speaker 3: Yeah, this is, uh, I don't really, I'm going to 1412 01:15:42,720 --> 01:15:44,360 Speaker 3: be thinking about this for weeks. 1413 01:15:44,600 --> 01:15:50,280 Speaker 1: I mean, yeah, yeah, fair, because your co host likes 1414 01:15:50,280 --> 01:15:53,720 Speaker 1: a curbent kbon for the Elders of Zion episodes. 1415 01:15:54,680 --> 01:15:58,439 Speaker 2: Yeah, yeah, okay, I wanted to. I was initially gonna 1416 01:15:58,520 --> 01:16:00,760 Speaker 2: kind of just focus on, all this would have been 1417 01:16:00,840 --> 01:16:02,840 Speaker 2: like half a page or so, you know, just kind 1418 01:16:02,840 --> 01:16:06,040 Speaker 2: of summing up, here's the gist of what this person believes, 1419 01:16:06,080 --> 01:16:08,280 Speaker 2: and then let's get to the actual cult stuff when, 1420 01:16:08,400 --> 01:16:11,160 Speaker 2: like, you know, Ziz starts bringing in followers and the 1421 01:16:11,200 --> 01:16:15,479 Speaker 2: crimes start happening. But that Rolling, or that Wired article, 1422 01:16:15,520 --> 01:16:18,840 Speaker 2: really covers all that very well, and that's the best piece. 1423 01:16:19,000 --> 01:16:22,160 Speaker 2: Most of the journalism I've read on these guys is 1424 01:16:22,200 --> 01:16:24,519 Speaker 2: not very well written. It's not very good. It does 1425 01:16:24,560 --> 01:16:26,960 Speaker 2: not really explain what they are or why 1426 01:16:27,000 --> 01:16:29,080 Speaker 2: they do it. So I decided, and I'm not... the 1427 01:16:29,080 --> 01:16:31,799 Speaker 2: Wired piece is great. I know the Wired guy knows 1428 01:16:31,840 --> 01:16:33,519 Speaker 2: all of the stuff that I brought up here. He 1429 01:16:33,640 --> 01:16:38,160 Speaker 2: just, it's an article. You have editors. He left out 1430 01:16:38,200 --> 01:16:40,280 Speaker 2: what he thought he needed to leave out. I don't 1431 01:16:40,280 --> 01:16:43,479 Speaker 2: have that problem, and I wanted to really, really deeply 1432 01:16:43,560 --> 01:16:48,200 Speaker 2: trace exactly where this lady's, how this lady's mind develops, 1433 01:16:48,240 --> 01:16:53,880 Speaker 2: and how that intersects with rationalism, because it's interesting and 1434 01:16:54,120 --> 01:16:56,080 Speaker 2: kind of important and bad. 1435 01:16:56,960 --> 01:16:59,920 Speaker 3: Yeah, Okay, he's so. 1436 01:17:01,160 --> 01:17:06,080 Speaker 2: Anyway, thanks for having a head fuck with me, all right. 1437 01:17:06,320 --> 01:17:10,120 Speaker 2: That's it, everybody, goodbye. 1438 01:17:11,320 --> 01:17:14,040 Speaker 1: Behind the Bastards is a production of cool Zone Media. 1439 01:17:14,400 --> 01:17:17,680 Speaker 1: For more from cool Zone Media, visit our website Coolzonemedia 1440 01:17:17,840 --> 01:17:21,040 Speaker 1: dot com, or check us out on the iHeartRadio app, 1441 01:17:21,120 --> 01:17:24,280 Speaker 1: Apple Podcasts, or wherever you get your podcasts.
Behind the 1442 01:17:24,280 --> 01:17:28,360 Speaker 1: Bastards is now available on YouTube, new episodes every Wednesday 1443 01:17:28,439 --> 01:17:32,120 Speaker 1: and Friday. Subscribe to our channel YouTube dot com slash 1444 01:17:32,320 --> 01:17:34,000 Speaker 1: at Behind the Bastards