00:00:01,840 Speaker 1: Cool Zone Media.
00:00:05,800 Speaker 2: Hello, hello, hello, and welcome to It Could Happen Here. I'm Andrew Sage, otherwise known as Andrewism on YouTube, and I'm here with James.
00:00:16,160 Speaker 1: Just James. I don't have a YouTube.
00:00:18,640 Speaker 2: More than just James. I mean, I love talking to you, so you're more than just James to me.
00:00:23,520 Speaker 1: Oh, thank you, Andrew. That's very sweet. I enjoy these too, it's fun for me.
00:00:27,720 Speaker 2: Yeah, so I'd really like to get into one of the hotter topics as of late. Not the heat, though that is a hot topic. But yeah, AI, artificial intelligence.
00:00:39,320 Speaker 1: Oh good. Yeah, my favorite thing.
00:00:41,720 Speaker 2: Yeah, and more specifically the ways in which AI has contributed to and accentuated alienation under capitalism and the state in the twenty-first century. So that's a mouthful, but it's obviously very important. In my opinion, alienation, with all its meanings, is really one of those words that you could use to describe the current zeitgeist: the experience of separation from yourself, from your work, from the products of your work, from your community. All these things, both philosophical and material, get wrapped up into this concept of alienation, because it's both an experience, something that people feel internally, it describes the way that they see their lives, and it's also just a fact of how people work in this society. You're dispossessed of the products of your labor, and you're disconnected from the process of your labor and the outcomes of your labor. And this is of course all thanks to the development of capitalism and industrialization and this development of a "mass society," quote unquote, with all the apathy and loss of agency and weakened social fabric that it generates.
00:01:56,400 Speaker 1: Yeah, I think alienation is something we don't talk about enough. It's like the thing that ties together that despair, the loneliness.
00:02:07,920 Speaker 1: Like, loneliness is maybe a way that capitalism has come to talk about alienation without acknowledging that capitalism is creating alienation. Every sort of developed state in the colonial core has acknowledged that loneliness is a problem, right. I saw Gavin Newsom was launching a loneliness campaign. But like, the system is the problem. The alienation is created by the way that things are, and we can't fix it without changing the way that things are.
00:02:37,600 Speaker 2: Exactly, exactly. It comes down to, I mean, in particular, I think we see alienation manifest the most in our relationships, of course, and in our work. And it's been an issue for some decades now. And what I'm really intrigued by is, you know, this has been an issue for a while, but how is AI interacting with these issues? How is AI impacting the alienation that we already experience under this system?
00:03:08,720 Speaker 1: Yeah, that's fascinating. I'm currently teaching a class at the community college, a class about pre-1600 history, and like, I teach a little bit every year, right, and every year I've seen more AI use. But this year, it has fully blackpilled me. Like, I don't quite know how to describe the feelings I'm experiencing, I guess, but in this class I assigned like David Graeber, I assigned Jim Scott, I assigned Charles Tilly on state making and war making, right. Like, very basic left libertarian kind of texts, right, which for many people will be the first time they encounter the concept of, like, what if no state? What if state bad? And I think they're all writing in a way that's very approachable to people who don't, you know... the dense academic writing is annoying and pretentious, and I don't like it. Every time I do this, it used to be the case that, like, thirty to forty percent of the students would be like, holy fuck. Whether they like it or not, it's a new concept and it's cool, and they engage with it in like a passionate way, a human way.
00:04:20,680 Speaker 1: Every year it's got worse, and now, like, I can think of two students out of a hundred who are engaging with it in any human way. And I'm sure most of them, I would imagine, had the AI summarize the text, or in many cases they certainly have used AI to just respond. I let my students respond in ways that they feel are appropriate, right, so they can do videos or different things if they want to. Like, if they wanted to make a video instead of doing an essay, that's fine with me, I don't care. I just want them to read the shit and think about it. But like, there's been no human reaction, and that's so sad to me. Like, the reason you teach is to get young people to see the world differently. It certainly isn't for the fucking money. And I just might be incapable of doing that now, or like, I can't get through that alienation. Like, I can't get people to engage and think about it. Obviously, I've got to work that shit out, right. Like, this generation of people went through high school when AI was a thing and detecting AI use in long-form writing was not very well developed, so they were able to use it instead of doing long-form writing and maybe even reading long form. And like, I have to work out how to get those people to engage, not to be so sort of alienated from the concept of reading and absorbing big ideas. But I haven't fucking worked it out yet.
00:05:41,520 Speaker 2: Yeah, it's a really big issue, and it's only growing, you know, as AI expands. I mean, it's not so much the focus of this episode, but it is something that I wanted to touch on. You know, people used to be doing fine without it, used to be able to function without AI three years ago, and now you talk to them and they can't live without it. They have to run everything through AI.
00:06:05,480 Speaker 2: You know, people have offloaded most of their cognitive processes to it. Yeah, yeah, yeah. And, you know, we talk about the environmental impact of that, the way the data centers are damaging the environment, taking fresh water and taking vast amounts of energy from the systems we all rely upon to live. And, you know, we could, as we touched on, talk about how schools and the education systems are pretty much falling apart. Yeah, I mean, I know you're one of those, you know, genuinely passionate professors. But what I've noticed is this whole farce now in many sections of the education system, where you have students having AI summarize the material, if they're even doing that, you know, submitting AI-generated essays or AI-generated material, and the professors just AI-grading it.
00:07:00,080 Speaker 1: Yeah, I've heard of this.
00:07:02,000 Speaker 2: So it's just one big puppet show, you know, one big facade.
00:07:08,760 Speaker 1: Yeah, yeah, exactly, one big charade, which, you know, to an extent, education has always just been that, right, one big farce. But there are things that are redeemable about it. And I'm just talking about teaching now, and I'll stop in a minute. There's very little demand for in-person classes compared to online classes anymore, so like, that makes it harder for us to break through that alienation, right. Like, there's something special about sitting in a room and talking, just being like, we're going to be here for ninety minutes, all of us in it. It's a dynamic, yeah, and it's an important dynamic. Like, the function of the university isn't to fucking turn out people with STEM degrees who can go on and make shitty apps we don't need. It's to prepare us to be citizens in the community, exactly, and we are failing at that. And yeah, instead I'm just grading ChatGPT all day now.
00:08:01,400 Speaker 2: Yeah. And that's a big piece of the puzzle that we end up missing, because the sort of dynamics and the connections that you would get from the university classroom, and beyond that just social connections in general, are lacking in an alienated world. And it's worsened by, you know, the introduction of AI. I managed to complete most of my education, most of my bachelor's degree, that is, prior to the pandemic, right. I was nearing the end of my third year when lockdown, you know, came into force, and then I did my entire fourth year online. And honestly, I'm so glad that I was able to do most of my classes in person, you know, and I'm so glad that I did my classes, you know, entirely on my own, in a time when, you know, AI was not a thing. You know, there were times when, you know, it probably felt like, oh my god, it's so stressful, but you just had to buckle down and figure out a way to get it done. And we could talk about the perverse incentives of grading systems in schools and how that sort of pushes some students, who, you know, may have learning difficulties or time management difficulties or whatever, to, instead of actually doing their stuff, end up going down the AI route. But yeah, I mean, even just looking back at my experience, because lockdown hit during the semester: I had a writing class that I was a part of, and every time we went into class, it was so dynamic, it was so lively, it was so engaging. All the ideas were just bouncing off each other. After the lockdown, that class completely fell apart. Everything that we were getting from it was just absent, because we were entirely online. And yeah, it's really a struggle. And I think, coming out of the education conversation, social life, community, and connection all end up lacking because of the alienating nature of the system, the way that things have been set up.
00:10:04,360 Speaker 2: But AI is playing a major role too. AI as a category, you know, you can have a whole discussion about that, quibble over definitions. But in a sense AI has already been playing a major role in how people socialize, even before these large language models came to be, because you have a sort of artificial intelligence in the algorithms that people interact with on social media. You know, people have the content they consume being curated by algorithms. They end up in these sort of echo chambers, these reinforcement loops and outrage bait and dopamine loops, and all those things have led to people spending more and more time online, because, you know, it's hitting that part of the brain. And everybody is hyperconnected and always online, and more and more of life takes place on the Internet, and that has left people feeling isolated. I think loneliness is obviously not entirely the result of social media and now AI, but the sort of irony is that loneliness has been a side effect of this digital hyperconnection. Yeah, when you look at some of the factors that are contributing to this already isolated nature of our world, right: you know, people don't have as much free time. You know, there isn't as much public space as there used to be. Some people have no public space available to them. Public spaces that do exist are not open at the times when people are available to go to them. Libraries are a famous example; a lot of them are, you know, not open for working people, pretty much. And then people who do want to go out and socialize and stuff, you know, you're dealing with the higher cost of living, so there's little in the way of resources that you can use to, you know, go and put yourself out there, because you have to spend money to go places. And then it also just drains your energy, because of, you know, the long work week, long work hours, just trying to make ends meet, the psychological toll of all of that.
00:12:10,400 Speaker 2: Yeah, and so part of what AI has been doing is pushing these AI companions on people. And, you know, I don't mean to fearmonger or anything, because I know there are a lot of people who reject AI and who stand against AI, and of course that could just be the bubble that I'm in. But yeah, I also know somebody in person, or rather I knew somebody in person, who spoke to ChatGPT like their partner and therapist. Yeah, they listen, like... yeah, it's, I mean, it's sad. Yeah, it's, as you said, almost kind of blackpilling, you know, because these chatbots, they listen in a simulated sense, they respond in a simulated sense, and they affirm whatever the person is dealing with, is going through, is venting about. They're almost like a hug box, because you don't really see chatbots disagreeing with the people they're speaking to. Chatbots are very much, like, you know, fawning. You know, they try their best to affirm everything that a person is telling them. So you have this kind of cuddle box for people's egos, which, in turn, makes it even more difficult for them to connect to real people, because, you know, real people are going to call you out. You know, they're going to disagree with you, you're going to have friction and conflict, but there's also a lot of joy that comes from interacting with real people. And unfortunately a lot of people, because they're not getting that, they've turned to this on-demand affection, this on-demand flirtation, this pseudo-therapy. And it's brutal, you know. Loneliness is a brutal experience, relationships are very hard, and therapy is extremely expensive for a lot of people. So I understand that, and, you know, you can only put so much blame on individuals, because the world is not really set up to support those kinds of lasting connections. Yeah, people live very spread out. They have fewer and fewer opportunities to interact with each other.
00:14:24,720 Speaker 2: In fact, a lot of times the last time a person had extended exposure to other people was in school or in college, and outside of that, you're just kind of on your own. Yeah, and places are increasingly not walkable, they're more car-centric. The sort of spontaneity and friction and interaction that would have made relationships blossom naturally, that would have made relationships possible, as messy and inconvenient as they can be sometimes, those things are lacking now. And unfortunately some fraction of people, and I don't know what the actual number would be, because I can imagine a lot of people will not admit that they've turned to a chatbot for companionship, but it is a frightening omen of what direction we're going in. And I also worry about the potential outcomes of, you know, egoic behavior that might result from that sort of continuous interaction with something that is affirming your every belief and thought and conclusion. What kind of people are we going to be there, you know?
00:15:33,640 Speaker 1: Yeah, it's the world that super rich people already live in. One of the reasons for the gulf between the rest of us and the super rich, like the really, you know, incredibly wealthy people, part of that is that no one says no to a lot of those people, and that's why they exclusively end up socializing with each other, right. Like, they're surrounded by nothing but affirmation.
00:16:00,360 Speaker 2: Right.
00:16:01,040 Speaker 1: One of the things we see with Trump, right, is that, like, if there is a reality that he doesn't like, he manifests his own reality. He just speaks things and expects them to be accepted as truths, right. Growing up, my dad worked for a lot of extremely wealthy people, and so I've interacted with them, and like, there's a lot of people who just aren't used to hearing no or why. Maybe not a lot, but there is a number of them. And like, I think when you see... I was just thinking about it.
00:16:29,840 Speaker 1: The behavior that, you know, Trump is now asserting with the Epstein thing, like, it is made up, right, and it's a hoax. When we were talking about AI, it sort of reminded me of that, right, that like constant affirmation, because what AI wants to do is please you so that you spend more time on it, I assume. And there's some way that it attempts to monetize that, I'm sure, and it just wants you to keep interacting with it so it can get more information to take into its model, I guess.
00:16:51,320 Speaker 2: Yeah. The data is the gold rush.
00:16:53,320 Speaker 1: Yeah, right. And people are doing the same with wealthy people, right. They just want to interact with them so they can siphon off some of the resources that those people have accumulated. Like, maybe it's not the same, I think that's still... humans interacting with wealthy people is distinct from an AI interacting with humans, but it sort of gives us a window into what the impact is of that being most of your human interaction over time.
00:17:16,920 Speaker 2: Indeed, indeed. And as we speak of wealthy people, I suppose we should look at the other way in which AI has intersected with alienation, right. Because, you know, the current narrative has been about, you know, AI is taking jobs, and before then it was about automation taking jobs. AI is, you know, a form of automation, and before that it was just innovations in general, just steps in some technological direction that would be eliminating jobs. But I've always marveled at, stepping back and looking at the whole conversation about this taking jobs, that taking jobs: at the root of it is this dependence on employment, on jobs, for people to have, you know, a life, to be able to have a quality of life. We have gotten more and more productive, and I mean, that productivity has helped people in some ways, and it's harmed the environment in a lot of ways.
00:18:16,160 Speaker 2: But we have a certain level of productivity now, and we've produced so much now that in some sectors we have more than enough for several decades to come. I think fashion is one of them, where we have like quite the excess of clothing for everybody. And of course you can talk about how that level of productivity has done damage to our creativity or craftsmanship, but it's all the worse when you think about how, even with all that productivity, the worker has hardly benefited. You know, more productivity doesn't necessarily mean more pay. And so even before AI came around, we were having issues with labor and alienation, right, people disconnected from their work. Whether it be a service job, a factory job, or a delivery job, whatever, any of these jobs that you look at, it's structured, at the end of the day, not around providing a product or providing a service, but around profit, around the power dynamic between the owner, the capitalist, and the worker. The worker, who is not in control, is alienated from their labor and from the products of their labor. And this is what Marx famously spoke about, but he wasn't the only one to speak about it: this sort of alienated labor that is compelled rather than creative, that has no control over the work, and where workers are treated as commodities on a labor market. Thankfully, I haven't had to look for a job in a while, but I've had to see my friends seeking jobs, and it's not a nice experience. They have to spend weeks, months sometimes, looking for a job that you will most likely hate but that you need to survive. You know, and a lot of these jobs you end up looking for, end up getting into, they're unnecessary jobs. There are a lot of bullshit jobs that don't contribute to a person's, you know, development and fulfillment or their humanity in any way.
00:19:12,680 Speaker 2: Yeah, and then a lot of the benefits that people have fought for, even for these jobs, have either been eroded, you know, rolled back over time, or they've been loopholed out. So, you know, for example, you don't even get enough hours to qualify for benefits when you work at certain places, or you are an independent contractor instead of an employee, so they can get away from, you know, giving you your due. And so then, in this environment, you have AI coming in now and taking certain roles, with varying levels of quality, in writing and in art and coding and administrative work. And I don't know, I think, for one, AI does a lot of these jobs very poorly. But then there's also cases like, I don't know, copywriting, which is something I used to do. Yeah, copywriting, and the sort of copy that I had to write back in the day, it's almost indistinguishable, in terms of it feels generic, quite lifeless, you know, slop. Like, you're just pumping this out to pollute the airwaves in a sense.
00:21:23,080 Speaker 1: Yeah, it has a very formulaic nature even when a human does it. It's funny when I think about copywriting, right. Like, you can see that people have identified the completely generic nature of it, because occasionally you'll have, like, brands who do it in a non-formulaic way and briefly see success from it, like, just by having some element of humanity in it.
00:21:46,560 Speaker 2: Yeah, like Wendy's when they did that for a little while, and yeah, then every brand copied that method and then it became stale.
00:21:53,960 Speaker 1: Yeah, yeah, someone will sometimes, like, puncture through for a minute, and then, like you say, everyone will run after it. Like Pit Viper sunglasses, this one, I guess they're very popular with, like, right-wing bigots. Every time, like, bigots are pictured in their sunglasses, they'll like donate money to LGBTQ-affirming causes or, like, gender-affirming care stuff or whatever, depends what the people are being bigoted about.
00:22:15,800 Speaker 1: And like, briefly I saw them have success with that just because, like, people are so accustomed to brands being apolitical rather than just being like, no, fuck you. So by doing the kind of basics of being a good person, it appears human and therefore not so generic, and people, you know, briefly fall in love with it or whatever.
00:22:34,960 Speaker 2: Yeah. But I mean, at the end of the day, corporations aren't persons; there are people behind corporations. Yeah. And I guess I sit and sort of wonder, with these kinds of jobs that are all being filled in, at least in part, by AI, what is the impact on a person's self-worth? Oh yeah, their skills can be just sort of swapped out for a machine. You know, a lot of people have already felt that their work is non-essential, and then you have a sense of being replaceable and unneeded. And in some cases the difference is negligible because, like I said, the work that was already being put out was the sort of generic stuff that sort of fills feeds, yeah, and fills screens. But then you also have the more necessary, the more creative work that is also just being sort of funneled out. You know, I'm seeing billboards all over the place that just have, like, these nasty, smooth-looking, like, AI-generated pictures. Yeah, just a lot of slop, you know, slop content, slop ads, slop emails. You know, even on YouTube now, like, I like to listen to these sort of music mixes when I work sometimes, and most of the channels being recommended for music mixes on YouTube nowadays, at least in the genres that I would listen to, it's just, like, AI-generated jazz chill. The thing is, they aren't titled that way. Yeah, you know, they title it something else, and they probably have some sort of AI-generated thumbnail and whatever.
00:24:15,880 Speaker 2: And then, you know, if you're unaware of the pattern of how those channels operate, you might click on one thinking, oh, it's just a music mix like every other music mix, and then you listen to it for a while, and listen to a few of them, and you realize, oh, a machine made this. It has no flavor, yeah, like, no soul. There's also a lot of articles just filling the Internet that are just slop, yeah, yeah, you know, just AI-generated articles that feed into the AI pool of references, and so the AI almost eats itself. Yeah, and it's sad, but I think, like, we were always going in this direction in a sense. Not to say it was entirely inevitable, but this was the trajectory that we were pointed at. It actually could have been changed, but now it hasn't been, so this is how we kind of got here. I don't know if it's just me, but I feel like there was a time when, well, it may still be true, that a surplus is not always a good thing. There's something to be said about the value that we impute to things when they are a bit rarer, you know, when you have to be more attentive and engaged with them. You know, I was actually thinking about it earlier today: when I was a child and I was watching TV, you know, if they didn't have anything on the TV that I wanted to watch, I had to go and do something else, right. Yeah. And nowadays TV is pretty much unlimited, because at any point in time you have access to anything that an algorithm can serve up that is perfectly curated to your interests, and it's autoplay and everything, it's just one hit after the next. In that excess, I just feel like we've lost the sort of attentive curation of your taste, the curation of and evaluation of things, of the effort and energy and craft that goes into making things. We just end up sort of taking things for granted.
00:26:14,920 Speaker 1: And like, I think we kind of lower the standard that we will accept, because there's just so much of it. There's so much volume of it. Yeah. Like, and you're not so attentive to it because it's always there, so, like, slop becomes okay. It just kind of fills the gaps in this non-stop stream of content.
00:26:33,160 Speaker 2: Yeah, just filling the noise. I have to catch myself sometimes, yeah, because, I mean, sometimes I just put something on because it was there, you know, just filling the noise, and sometimes I have to remind myself, you know, pause, just be with your thoughts for a bit, you know.
00:26:49,560 Speaker 1: Yeah.
00:26:49,960 Speaker 2: Yeah. And I try not to put too much blame on myself, even as I try to work on it, because all of this, once again, is probably, you know... these platforms and these algorithms have been set up so perfectly. They're perfectly honed in their ability to exploit the little shortcuts and weaknesses in the human mind to keep you engaged for as long as possible. Yeah. So even if you feel like, oh my gosh, I want to get off of social media, I want to quit this, that, or the other, it's hard, you know, even when you know in your mind that it's detrimental, that it's affecting you negatively. You still end up going back, because, again, it's hacked into your brain in a sense. So I'm just really frustrated by the way that AI has contributed to this sort of disconnect, because I also think it makes the whole breadth of human creativity a lot less valued, practiced, and supported. You know, instead of people actually respecting and, you know, supporting the craft and the effort that goes into things, it's just, oh, scroll to the next thing, scroll to the next thing. Or, for some people who seem to love it, it's just, oh yeah, you're obsolete now, you can be replaced by this, you know, junk.
00:28:17,880 Speaker 1: Just thinking about, like, art. Like, I see it so often, like, even in revolutionary spaces I'll see it, right. Like, I guess sometimes what it actually is is AI accounts that have no idea what a revolution is, they're incapable of that because they're not human, but they're just designed to monetize clicks. You know, you'll see there's a bunch of fucking "Israel stands with Kurdistan" ads which will just, like, AI-generate pictures of YPJ women, like the women who fight there, right. And like, these are not, again, people who are actually part of the revolution, right. These are people who just want to, in a sense, objectify the revolution and the women who fought in it and continue to fight in it, for financial benefit. But like, it's the antithesis of the beautiful life that people are trying to build there, right. Like, it is the opposite of everything that that revolution stands for.
00:29:14,240 Speaker 2: So you're seeing people, like, AI-generating these human fighters.
00:29:19,080 Speaker 1: Yeah, yes, exactly, and then using that either just straight up because you get paid per click on X now, right, or for some nefarious propaganda bullshit. And then, by contrast, right, my friends in Myanmar, there's a group called Art Strike Collective who do these cool drawings of various individuals who have fought in the revolution, and like, one is a beautiful thing that shows your respect for these people, many of whom have given their lives for this revolution, and another is just complete fucking slop that is actively harming the thing it's supposed to be supporting.
00:30:00,080 Speaker 2: Unfortunately. And it's a cliche at this point, but: many such cases. Yeah. Yeah, I saw this short lecture on YouTube by a professor, I think the professor's name was Jim. It was just a short clip from, I'm assuming, a longer lecture he gave. The title of the video was really what captured me.
00:30:22,040 Speaker 2: It was something along the lines of consumerism as the perfection of slavery. And it was really speaking about how we are able to be so perfectly locked into our role as workers, as cogs in this machine, to become, you know, so docile, because of just how good the consumerist system has gotten at keeping us chasing that next, you know, dopamine hit, that next purchase, that next thing to consume. You know, so we're still being exploited, we are still wage slaves in a sense, but we are either unaware of it or we accept that role just to chase after, you know, the next high of consumption.
00:31:07,240 Speaker 1: Mm hmm. Yeah. Like, when you think about Brave New World and Nineteen Eighty-Four, right, these two dystopian novels, roughly, I mean, Brave New World came out before Nineteen Eighty-Four, right. The difference is, one is like a boot stamping on a human face forever, which is Nineteen Eighty-Four, and Huxley's dystopia is based on people being essentially bought off through pleasure, right.
00:31:29,680 Speaker 2: Yeah, it's like unlimited cocaine for everyone.
00:31:32,640 Speaker 1: Yeah, yeah, yeah, they call it... it's called soma, I think, right. We're in the unlimited-cocaine-for-everyone world, right, like it's...
00:31:39,440 Speaker 2: I mean, I think we're in both. You know, it's simultaneously a Huxleyan and, yeah, an Orwellian dystopia, you know, the worst of both worlds.
00:31:47,600 Speaker 1: Yeah, you're right. I'm starting to read Jack London's dystopia, The Iron Heel, now. I have decided I want to work out who was best at calling the dystopia. But yeah, we have a little bit of both now. They'll get you at both ends, right. Like, they'll try and give you things to keep you placid and then also things to keep you afraid. Yeah.
00:32:08,840 Speaker 2: So, I mean, there's a lot of reasons to despair, you know. People just blindly embrace AI, and they don't see the problem with using AI and all these different things.
00:32:19,960 Speaker 2: There's also, as I'd like to end things on, reason to hope, right. There are people who are willing to boycott it, who are, you know, maintaining a stigma around it. You know, people are not taking it lying down. Artists are not taking it lying down, writers are not taking it lying down, designers are not taking it lying down. People are still craving the authenticity, connection, and craft that comes from human people. And although there's little any individual can do to resist the alienation of this society, whether it be at work or in relationships, by themselves, you know, it's very hard, there are things we can do together, in tandem, to make things a little bit easier as we sort of try and strive toward social revolution. You know, there's the classic, you know, touch grass, you know, log off and try and find where people are. There's also the individualist solution of reclaiming your agency by finding some version of digital minimalism that works for you, you know, taking a break now and then, limiting your screen time here and there. But really it's going to take system change. It's going to take collective action. It's gonna take us boycotting, both, you know, of course, the AI products, there's a boycott already taking place with those, but then also just, yeah, striking at the pressure points of the system and prefiguring a better world for everyone. Yeah, and you know, I hope that everybody is able to do what they can to take steps in that direction. And yeah, so please don't use AI.
00:34:03,800 Speaker 1: Yeah, yeah, yeah. I always like that Subcomandante Marcos quote where he says, like, it's not necessary to conquer the world, it's sufficient to build a new one. I like that approach to this AI stuff.
00:34:16,200 Speaker 1: The way we make it so people in our community don't turn to AI to talk about things they want to talk about is to be there for them to talk to, right, to build community, to build real human interactions with each other, so people don't have to have their human conversations with a computer.
00:34:32,440 Speaker 2: Absolutely, agreed. Yeah, and that's all we have for today. So, all power to all the people. This has been It Could Happen Here. I've been Andrew, this has been James, and yeah, thanks. It Could Happen Here is a production of Cool Zone Media. For more podcasts from Cool Zone Media, visit our website coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can now find sources for It Could Happen Here listed directly in episode descriptions.
00:35:07,840 Speaker 1: Thanks for listening.