1 00:00:00,240 --> 00:00:03,560 Speaker 1: Hello the Internet, and welcome to this episode of The 2 00:00:03,600 --> 00:00:08,520 Speaker 1: Weekly Zeitgeist. These are some of our favorite segments from 3 00:00:08,640 --> 00:00:19,400 Speaker 1: this week, all edited together into one non-stop infotainment laugh-stravaganza. Yeah. So, 4 00:00:19,880 --> 00:00:25,639 Speaker 1: without further ado, here is the Weekly Zeitgeist. What is 5 00:00:25,720 --> 00:00:29,120 Speaker 1: something from your search history that is revealing about who 6 00:00:29,160 --> 00:00:29,440 Speaker 1: you are? 7 00:00:30,000 --> 00:00:33,720 Speaker 2: I yesterday was looking up the public pool hours in 8 00:00:33,760 --> 00:00:36,919 Speaker 2: New York, because I'm a huge, huge, huge advocate of 9 00:00:37,200 --> 00:00:40,000 Speaker 2: going to the public pools in New York City, because 10 00:00:40,000 --> 00:00:42,880 Speaker 2: everybody's like, I mean, it's gross, the public pools, like 11 00:00:42,880 --> 00:00:44,680 Speaker 2: I'd rather go to a hotel, and I'm like, okay, 12 00:00:44,680 --> 00:00:46,479 Speaker 2: but the pools are run by the city and have 13 00:00:46,520 --> 00:00:49,360 Speaker 2: like really intense guidelines about how clean they have to be. 14 00:00:49,479 --> 00:00:52,479 Speaker 2: And you know who doesn't have those guidelines? Any hotel. 15 00:00:53,680 --> 00:00:55,880 Speaker 2: I don't know if any hotel pools are getting cleaned. 16 00:00:55,920 --> 00:00:57,680 Speaker 2: And the way that the public pools in New York 17 00:00:57,720 --> 00:01:01,560 Speaker 2: are so, I'm a huge fan, freezing cold, if you 18 00:01:01,560 --> 00:01:05,800 Speaker 2: live there. They're super. They're like, because they're like cooling centers, 19 00:01:05,880 --> 00:01:08,479 Speaker 2: like they're for people, especially if you don't have air 20 00:01:08,480 --> 00:01:11,440 Speaker 2: conditioning in the summer. 
They're also the location of a 21 00:01:11,480 --> 00:01:14,280 Speaker 2: lot of the free lunches that they do, you know, for students, 22 00:01:14,480 --> 00:01:16,840 Speaker 2: so in the summer, when you're not in school, that's where 23 00:01:16,840 --> 00:01:19,000 Speaker 2: they do it. I love, love, love the public pools, 24 00:01:19,000 --> 00:01:22,160 Speaker 2: but I was I've never gone early enough to know 25 00:01:22,200 --> 00:01:23,560 Speaker 2: what time they open, so I was like, I guess 26 00:01:23,560 --> 00:01:25,319 Speaker 2: I should find out. And it's eleven, which I think 27 00:01:25,400 --> 00:01:27,720 Speaker 2: is a little close for me, so I don't know 28 00:01:27,720 --> 00:01:30,480 Speaker 2: if I'll ever get there at opening, but noon feels. 29 00:01:30,240 --> 00:01:33,160 Speaker 3: Right right, right right. Did you grow up in New York? 30 00:01:33,520 --> 00:01:34,040 Speaker 4: In New York? 31 00:01:34,200 --> 00:01:36,680 Speaker 2: No, Yeah, I grew up in Maryland. But I've been 32 00:01:36,720 --> 00:01:40,920 Speaker 2: here for sixteen or seventeen years, which is crazy because 33 00:01:40,920 --> 00:01:43,680 Speaker 2: I'm twenty. 34 00:01:44,319 --> 00:01:46,039 Speaker 3: God so much just kidding. 35 00:01:46,080 --> 00:01:48,160 Speaker 4: I would never want to be twenty again, my god. 36 00:01:49,080 --> 00:01:51,280 Speaker 5: Oh yeah, some of the most chaotic times in my 37 00:01:51,320 --> 00:01:56,360 Speaker 5: life. Because like LA, the pools in LA are just, 38 00:01:56,400 --> 00:02:00,280 Speaker 5: because of the ambient heat, not always like super, but 39 00:02:00,320 --> 00:02:01,440 Speaker 5: you definitely cool off. 40 00:02:01,520 --> 00:02:04,639 Speaker 3: But then we also have like all those like splash parks. 
41 00:02:04,680 --> 00:02:07,960 Speaker 5: Now we're just basically equivalent like an open fire hydrant, yeah, 42 00:02:08,000 --> 00:02:11,519 Speaker 5: a like foam ground area to make it safe, yeah, 43 00:02:11,520 --> 00:02:12,000 Speaker 5: for kids. 44 00:02:12,080 --> 00:02:13,760 Speaker 4: I love the public pools in LA though, too. I 45 00:02:13,840 --> 00:02:15,240 Speaker 4: used to go swim in those when I lived there. 46 00:02:15,560 --> 00:02:16,160 Speaker 4: They're the best. 47 00:02:16,280 --> 00:02:17,399 Speaker 3: I love the pools. 48 00:02:17,880 --> 00:02:21,000 Speaker 4: I guess that's that's something about me. I love pools. 49 00:02:20,000 --> 00:02:22,840 Speaker 1: It's I've talked before on here about how my first 50 00:02:22,880 --> 00:02:25,200 Speaker 1: job out of college was as a pool boy at 51 00:02:25,240 --> 00:02:28,600 Speaker 1: the Soho House, the roof deck of the Soho House 52 00:02:28,639 --> 00:02:31,639 Speaker 1: when that had just opened, and I was in charge 53 00:02:31,680 --> 00:02:35,400 Speaker 1: of, just to your point about them not having regulations, 54 00:02:35,560 --> 00:02:38,000 Speaker 1: I was in charge of like the chlorine levels. 55 00:02:38,880 --> 00:02:39,200 Speaker 3: Also. 56 00:02:40,200 --> 00:02:43,720 Speaker 1: Yeah, they were like like there was Oh no. I 57 00:02:43,800 --> 00:02:46,040 Speaker 1: had somebody who like knew more about it, who checked 58 00:02:46,040 --> 00:02:47,799 Speaker 1: with me every once in a while, but I was 59 00:02:47,840 --> 00:02:51,000 Speaker 1: like doing the pH testing and yeah, and I was 60 00:02:51,080 --> 00:02:54,440 Speaker 1: in no way equipped to do that. I had to 61 00:02:54,840 --> 00:02:57,079 Speaker 1: like pull a drowning kid out of the pool one 62 00:02:57,120 --> 00:03:02,000 Speaker 1: time. 
I was there during the blackout of 63 00:03:02,919 --> 00:03:07,000 Speaker 1: whatever year that was, yeah, two thousand and three, and 64 00:03:07,480 --> 00:03:11,360 Speaker 1: everybody like came to that pool. It was a tiny pool. 65 00:03:11,400 --> 00:03:14,360 Speaker 1: It was like halfway between a pool and a bathtub. 66 00:03:14,520 --> 00:03:15,760 Speaker 4: Yeah, it's like the size of a couch. 67 00:03:16,120 --> 00:03:18,160 Speaker 1: But everyone thought it would be a cool place to 68 00:03:18,240 --> 00:03:22,400 Speaker 1: hang out. And it was like milky by the end 69 00:03:22,400 --> 00:03:25,240 Speaker 1: of it. Because I didn't know what I was doing. 70 00:03:26,440 --> 00:03:29,760 Speaker 1: They just like put whoever's up there in charge of 71 00:03:29,800 --> 00:03:31,280 Speaker 1: like the chlorine levels. 72 00:03:31,560 --> 00:03:34,080 Speaker 5: Yeah, because like on the other side, like in Vegas, 73 00:03:34,080 --> 00:03:36,600 Speaker 5: their pools, Like if you open your eyes under the water, 74 00:03:36,800 --> 00:03:40,640 Speaker 5: it will just strip like membranes from your like eyeballs, so. 75 00:03:41,120 --> 00:03:45,840 Speaker 1: Your eyes turn like five shades, Like, yeah, you just 76 00:03:45,880 --> 00:03:51,400 Speaker 1: have White Walker eyes if you open them underwater. What is what's 77 00:03:51,440 --> 00:03:53,160 Speaker 1: something that you think is overrated? 78 00:03:53,720 --> 00:03:57,520 Speaker 6: I think that Wes Anderson is overrated. 79 00:04:01,440 --> 00:04:06,720 Speaker 7: What the job for saying? What exactly about Wes Anderson? 80 00:04:08,600 --> 00:04:14,440 Speaker 6: I think, Look, I he just tells the same quirky 81 00:04:14,520 --> 00:04:17,240 Speaker 6: little story over and over again. Granted I have not 82 00:04:18,120 --> 00:04:22,960 Speaker 6: seen Asteroid City yet, so maybe this one will just 83 00:04:22,960 --> 00:04:26,800 Speaker 6: blow me away and I will be excited again. 
To 84 00:04:26,839 --> 00:04:29,120 Speaker 6: be clear, there was a time period where I did 85 00:04:29,560 --> 00:04:32,560 Speaker 6: like his work, but I don't know, Maybe I think 86 00:04:32,560 --> 00:04:37,400 Speaker 6: it's just like too repetitive. I don't know. And but like, yeah, 87 00:04:37,400 --> 00:04:39,240 Speaker 6: I get it, like he's an auteur, so he's 88 00:04:39,279 --> 00:04:42,480 Speaker 6: going to have a distinct style that he kind of 89 00:04:42,520 --> 00:04:47,640 Speaker 6: just keeps recycling over and over. But like, I don't know, 90 00:04:47,760 --> 00:04:51,680 Speaker 6: I just like find his quirky little quirkiness a bit 91 00:04:51,800 --> 00:04:53,000 Speaker 6: tiresome these days. 92 00:04:53,800 --> 00:04:54,680 Speaker 7: What era? 93 00:04:58,360 --> 00:05:02,160 Speaker 6: I would say, Well, the peak for me was Fantastic Mister Fox. 94 00:05:02,560 --> 00:05:09,440 Speaker 6: Hell yeah, that's probably my favorite of his, and leading 95 00:05:09,480 --> 00:05:11,320 Speaker 6: up to, but then like after that, I felt like 96 00:05:11,360 --> 00:05:15,400 Speaker 6: there was just a decline. Moonrise Kingdom, I think 97 00:05:15,800 --> 00:05:19,720 Speaker 6: that one was like, okay, fine, and then and then 98 00:05:20,080 --> 00:05:22,880 Speaker 6: again another controversial take here, but I thought that Isle 99 00:05:22,960 --> 00:05:27,159 Speaker 6: of Dogs was speaking of toilets. 100 00:05:31,920 --> 00:05:36,560 Speaker 7: Yeah, oh gosh, I think that's totally fair. I okay, 101 00:05:37,160 --> 00:05:41,480 Speaker 7: here's my experience watching almost every Wes Anderson film. I 102 00:05:41,640 --> 00:05:45,880 Speaker 7: was like, gosh, that's really pretty. Owen Wilson. He's always delightful. 103 00:05:47,000 --> 00:05:51,039 Speaker 7: Costumes should get credit. Wow, the fits are fitting. That 104 00:05:51,120 --> 00:05:53,640 Speaker 7: movie was that. 
I'll never watch it again and don't 105 00:05:53,680 --> 00:05:56,719 Speaker 7: revisit Wes Anderson films. They're like very beautiful. I feel 106 00:05:56,760 --> 00:05:59,600 Speaker 7: like they have just enough story to avoid being 107 00:05:59,720 --> 00:06:01,839 Speaker 7: museum pieces, you know what I mean. You've seen 108 00:06:01,880 --> 00:06:03,000 Speaker 7: a movie in a museum. 109 00:06:03,680 --> 00:06:07,479 Speaker 6: Yeah right, but it's like, tell me the plot of 110 00:06:07,520 --> 00:06:11,279 Speaker 6: any Wes Anderson movie. You can't like, I can't do it, Like, 111 00:06:11,320 --> 00:06:14,359 Speaker 6: I don't like, they just don't stick with me. I 112 00:06:14,400 --> 00:06:17,640 Speaker 6: feel like it's just like you see a frame and 113 00:06:17,640 --> 00:06:20,279 Speaker 6: you're like, wow, look at all that headroom. That's nice, 114 00:06:20,360 --> 00:06:23,279 Speaker 6: and look at this mise-en-scène, whoo, and then you're like, 115 00:06:23,320 --> 00:06:28,719 Speaker 6: but what was the plot, and also why did I 116 00:06:29,000 --> 00:06:29,480 Speaker 6: watch it? 117 00:06:30,640 --> 00:06:35,160 Speaker 1: Yeah, I feel like he is going to be the least 118 00:06:35,240 --> 00:06:37,720 Speaker 1: likely filmmaker for you to be like, Oh, he 119 00:06:37,800 --> 00:06:40,479 Speaker 1: really surprised me on this one. This is this is not. 120 00:06:41,920 --> 00:06:44,159 Speaker 1: I did not see this one coming. I think his, 121 00:06:44,680 --> 00:06:48,960 Speaker 1: like, French Dispatch was his attempt to do something different, 122 00:06:49,400 --> 00:06:51,840 Speaker 1: and I think it's like it's it's definitely my least 123 00:06:51,880 --> 00:06:53,200 Speaker 1: favorite of his movies. 124 00:06:53,440 --> 00:06:55,239 Speaker 6: I skipped it. I couldn't be bothered. 
125 00:06:55,680 --> 00:06:59,800 Speaker 1: Yeah, but it's it also seems like I don't see 126 00:06:59,800 --> 00:07:02,960 Speaker 1: that many people like riding for that one usually, Like 127 00:07:03,040 --> 00:07:07,400 Speaker 1: I feel like his movies are some people really like it, 128 00:07:07,480 --> 00:07:10,800 Speaker 1: some people really hate it, and you know, it's it's 129 00:07:10,880 --> 00:07:13,760 Speaker 1: kind of random, Like I really like Grand Budapest Hotel, 130 00:07:13,880 --> 00:07:15,840 Speaker 1: but I don't like a lot of the ones around that, 131 00:07:16,680 --> 00:07:20,160 Speaker 1: and that's kind of the only later era one that 132 00:07:20,360 --> 00:07:25,160 Speaker 1: I really enjoy. But a lot of people like really 133 00:07:25,200 --> 00:07:29,040 Speaker 1: love The Life Aquatic. I've never really connected with that 134 00:07:29,080 --> 00:07:33,960 Speaker 1: one either. But yeah, French Dispatch felt like people were like, huh, 135 00:07:34,000 --> 00:07:36,320 Speaker 1: we're we're not going with you on this one. There 136 00:07:36,360 --> 00:07:39,520 Speaker 1: are some really good performances in it. Yeah, I don't know, 137 00:07:39,600 --> 00:07:42,240 Speaker 1: I like part of me wants to see the Asteroid 138 00:07:42,240 --> 00:07:45,400 Speaker 1: City one, because I don't know, it just looks like 139 00:07:46,320 --> 00:07:49,280 Speaker 1: he's, instead of trying to do something different, he's like, 140 00:07:49,320 --> 00:07:51,800 Speaker 1: I'm going to do the same so hard. 141 00:07:53,360 --> 00:07:55,080 Speaker 6: Oh you didn't like it when I tried something new, 142 00:07:55,160 --> 00:08:00,000 Speaker 6: Well fine, I'll do the thing again. Yeah. 143 00:08:00,120 --> 00:08:02,520 Speaker 1: How about if Wes Anderson tried to make a Wes 144 00:08:02,560 --> 00:08:05,080 Speaker 1: Anderson movie. 
He like got jealous of all the people 145 00:08:05,120 --> 00:08:07,880 Speaker 1: being like on social media being like what if your 146 00:08:08,120 --> 00:08:09,880 Speaker 1: day was a Wes Anderson movie? 147 00:08:09,920 --> 00:08:12,960 Speaker 7: And like wild to me that he was upset about that, 148 00:08:13,680 --> 00:08:14,840 Speaker 7: Like I was really trying to. 149 00:08:15,040 --> 00:08:16,840 Speaker 1: Was he upset about that? Yeah? 150 00:08:17,040 --> 00:08:19,560 Speaker 7: Oh wow, Yeah he came out he was like I 151 00:08:19,560 --> 00:08:22,720 Speaker 7: would never look at those, and people realize it's like 152 00:08:22,720 --> 00:08:23,760 Speaker 7: people are just homaging. 153 00:08:23,800 --> 00:08:24,720 Speaker 3: You felt like that. 154 00:08:25,240 --> 00:08:27,240 Speaker 7: I think that the highest form of flattery is like 155 00:08:27,240 --> 00:08:29,880 Speaker 7: the entire Internet got together and was like, I think 156 00:08:29,960 --> 00:08:33,400 Speaker 7: your aesthetic is so utterly charming. I'm gonna place myself 157 00:08:33,400 --> 00:08:35,040 Speaker 7: inside of it. 158 00:08:35,040 --> 00:08:36,320 Speaker 4: I really liked the meme. 159 00:08:36,360 --> 00:08:39,600 Speaker 7: I thought it was cute. Disappointing. Court your fan base, please. 160 00:08:39,679 --> 00:08:42,679 Speaker 7: I think it's this is the best a parasocial 161 00:08:42,720 --> 00:08:44,480 Speaker 7: relationship can be. Is just to say hey, guys, I 162 00:08:44,480 --> 00:08:46,080 Speaker 7: see you and I liked what you did and thank 163 00:08:46,120 --> 00:08:47,160 Speaker 7: you and then just move on. 164 00:08:47,720 --> 00:08:48,280 Speaker 1: Yeah. 165 00:08:48,400 --> 00:08:48,840 Speaker 8: I don't know. 166 00:08:49,160 --> 00:08:51,080 Speaker 1: I wonder if it's also like you. It would just 167 00:08:51,120 --> 00:08:53,520 Speaker 1: be like he's worried. 
It would like fuck him up 168 00:08:53,720 --> 00:08:57,000 Speaker 1: a little bit to see everybody's like version of him, 169 00:08:57,200 --> 00:08:59,240 Speaker 1: the way that like seeing someone do an impression of 170 00:08:59,280 --> 00:09:02,120 Speaker 1: you can be a little unnerving for the first time. 171 00:09:02,200 --> 00:09:06,199 Speaker 7: You know, that is fair that Okay, maybe, or. 172 00:09:06,160 --> 00:09:09,920 Speaker 1: Maybe he's just an asshole. I couldn't possibly imagine a 173 00:09:10,040 --> 00:09:11,319 Speaker 1: universe where that's true. 174 00:09:12,000 --> 00:09:16,280 Speaker 7: That's from Wes Anderson. I've got his quote directly. So okay, 175 00:09:16,320 --> 00:09:20,040 Speaker 7: So Wes Anderson said about the memes, I'm very good 176 00:09:20,080 --> 00:09:23,160 Speaker 7: at protecting myself from seeing all that stuff. If somebody 177 00:09:23,160 --> 00:09:25,560 Speaker 7: sends me something like that, I'll immediately erase it and 178 00:09:25,600 --> 00:09:28,480 Speaker 7: say please, sorry, do not send me things of people 179 00:09:28,520 --> 00:09:30,280 Speaker 7: doing me, because I do not want to look at 180 00:09:30,320 --> 00:09:31,200 Speaker 7: it thinking. 181 00:09:31,120 --> 00:09:31,920 Speaker 6: Is that what I do? 182 00:09:32,160 --> 00:09:34,160 Speaker 7: Is that what I mean? I don't want to see 183 00:09:34,200 --> 00:09:36,600 Speaker 7: too much of someone else thinking about what I try 184 00:09:36,640 --> 00:09:39,480 Speaker 7: to be, because God knows, I could then start doing it. 185 00:09:39,559 --> 00:09:42,520 Speaker 7: So to your point, Jack, this is a form of 186 00:09:42,559 --> 00:09:45,840 Speaker 7: self-protection. And now I'm dismissed. Love his fans. 187 00:09:46,200 --> 00:09:53,920 Speaker 1: Yeah, okay, all right. 
Also he does betray in that quote, 188 00:09:54,360 --> 00:09:55,959 Speaker 1: like a little bit of like a thing I always 189 00:09:55,960 --> 00:09:59,400 Speaker 1: suspect about anybody who got famous before like a certain 190 00:09:59,440 --> 00:10:02,839 Speaker 1: point, like that they just like don't know how technology works, 191 00:10:02,920 --> 00:10:06,560 Speaker 1: because like, if someone sends me something, I erase it? Like, 192 00:10:07,679 --> 00:10:10,200 Speaker 1: what do you mean, you erase it? The link that 193 00:10:10,240 --> 00:10:12,000 Speaker 1: they sent to you? 194 00:10:12,400 --> 00:10:17,080 Speaker 6: They delete it from Internet dot com. 195 00:10:17,120 --> 00:10:21,559 Speaker 1: Oh man, what's something you think is underrated, Teresa? 196 00:10:21,720 --> 00:10:26,240 Speaker 9: Okay, there's this really cool taco spot right around where 197 00:10:26,280 --> 00:10:29,360 Speaker 9: I live. It's like my neighborhood. But it's just a 198 00:10:29,360 --> 00:10:31,520 Speaker 9: guy who sells tacos in front of his house, and 199 00:10:31,559 --> 00:10:36,800 Speaker 9: he's really cool, and it's underrated obviously because you guys 200 00:10:36,600 --> 00:10:37,240 Speaker 4: don't know about it. 201 00:10:37,280 --> 00:10:38,880 Speaker 9: But I want to kind of out it because I love 202 00:10:38,960 --> 00:10:43,360 Speaker 9: this so much. Literally, I feel like it's like I 203 00:10:43,679 --> 00:10:45,520 Speaker 9: don't want to say I manifested it, because obviously this 204 00:10:45,600 --> 00:10:47,480 Speaker 9: man is an individual person who has his own life 205 00:10:47,480 --> 00:10:51,720 Speaker 9: and needs. 
But I was thinking right before I moved, like, oh, 206 00:10:51,720 --> 00:10:54,120 Speaker 9: it would be really cool to just I think I 207 00:10:54,200 --> 00:10:58,400 Speaker 9: was just having existential dreams about, you can't tell, like, 208 00:10:58,440 --> 00:11:00,760 Speaker 9: what's my future and livelihood? 209 00:11:01,040 --> 00:11:02,120 Speaker 4: So it's like what if I just. 210 00:11:02,040 --> 00:11:04,320 Speaker 9: like opened a little restaurant in my garage. But that 211 00:11:04,440 --> 00:11:06,200 Speaker 9: was mostly like I don't actually want to do it, 212 00:11:06,240 --> 00:11:07,800 Speaker 9: but I was just thinking the idea of like, let's 213 00:11:07,840 --> 00:11:09,520 Speaker 9: just go back to It reminds me of like Taiwan, 214 00:11:09,559 --> 00:11:11,720 Speaker 9: where people just have these little eateries and you know 215 00:11:11,760 --> 00:11:15,640 Speaker 9: your neighborhood and you enjoy your community and it's nice, 216 00:11:15,679 --> 00:11:17,719 Speaker 9: and and it doesn't have to be, you know, a 217 00:11:17,800 --> 00:11:21,559 Speaker 9: Chipotle or bought by Facebook. But then, like the first 218 00:11:21,559 --> 00:11:24,920 Speaker 9: week I lived here, I saw a sign that was 219 00:11:24,960 --> 00:11:28,120 Speaker 9: just like handwritten, just like in Spanish, it said, 220 00:11:28,120 --> 00:11:32,200 Speaker 9: like, Tacos Viernes y Sábado, like Fridays and Saturdays. And 221 00:11:32,480 --> 00:11:34,360 Speaker 9: my first thought was like, oh, that must be good, 222 00:11:34,400 --> 00:11:36,679 Speaker 9: because if it's a handwritten sign, it's not advertised. 223 00:11:36,720 --> 00:11:38,240 Speaker 4: He's only doing it two days of the week. 224 00:11:38,559 --> 00:11:40,560 Speaker 1: Just for sheer love of the game. 225 00:11:41,080 --> 00:11:43,520 Speaker 9: Yeah, but Friday comes around, I'm like, oh, I can't wait. 226 00:11:43,960 --> 00:11:46,840 Speaker 9: First time I go. 
And he said it was the 227 00:11:46,880 --> 00:11:49,640 Speaker 9: first time he was doing it too. So, oh, our 228 00:11:49,679 --> 00:11:50,559 Speaker 9: timelines coincided. 229 00:11:50,880 --> 00:11:53,320 Speaker 5: Yeah, and so and did You're like, I made you 230 00:11:53,400 --> 00:11:56,040 Speaker 5: do this, Like, I've never heard of tacos before. 231 00:11:56,160 --> 00:12:01,880 Speaker 9: Yesterday he's like, I just spontaneously appeared. But no, 232 00:12:02,000 --> 00:12:04,760 Speaker 9: he like started this little business in front of his 233 00:12:04,840 --> 00:12:07,360 Speaker 9: house and he's so good. He just like starts cooking 234 00:12:07,480 --> 00:12:09,800 Speaker 9: in the afternoon on Fridays, and then around five he 235 00:12:09,800 --> 00:12:11,360 Speaker 9: starts selling it. Like, I was trying to buy it, 236 00:12:11,400 --> 00:12:12,720 Speaker 9: like at like two, and he was like, it's 237 00:12:12,640 --> 00:12:13,280 Speaker 3: not ready yet. 238 00:12:13,640 --> 00:12:19,280 Speaker 9: And then now, but they're so good and it's so cute, 239 00:12:19,280 --> 00:12:20,680 Speaker 9: and he's been doing it now for the last couple 240 00:12:20,679 --> 00:12:23,280 Speaker 9: of weeks every Friday, Saturday, and there's like a crowd, 241 00:12:23,440 --> 00:12:26,880 Speaker 9: and he started it. It's called Tacos Papa Jose. 242 00:12:27,520 --> 00:12:29,440 Speaker 9: And like every week I go there, there's like a 243 00:12:29,480 --> 00:12:32,040 Speaker 9: new sign now, like he's like adding more marketing. Like 244 00:12:32,080 --> 00:12:34,559 Speaker 9: now there's a sign down the street along like the, 245 00:12:34,880 --> 00:12:37,080 Speaker 9: you know, the main street, that has a little arrow 246 00:12:37,120 --> 00:12:39,920 Speaker 9: that says tacos. And then the other day there was 247 00:12:39,920 --> 00:12:42,200 Speaker 9: one where you could scan the QR code and follow him. 
Oh, 248 00:12:42,480 --> 00:12:43,680 Speaker 9: Tacos Papa Jose. 249 00:12:44,480 --> 00:12:47,040 Speaker 1: It's actually a brilliant marketing strategy. Could you actually give him 250 00:12:47,040 --> 00:12:48,679 Speaker 1: my number? I actually want to reach out to him 251 00:12:48,720 --> 00:12:53,720 Speaker 1: about getting his brand online. That's it. That's really like 252 00:12:53,840 --> 00:12:57,680 Speaker 1: a cool thing, because I feel like the lemonade stand model, 253 00:12:57,960 --> 00:13:00,880 Speaker 1: like you you never hear that applied to anything else. It's 254 00:13:00,960 --> 00:13:04,760 Speaker 1: just like kids trying their first taking their first whack 255 00:13:04,800 --> 00:13:05,679 Speaker 1: at capitalism. 256 00:13:06,000 --> 00:13:08,720 Speaker 4: It's like very and they don't know it's because they're cute. 257 00:13:08,760 --> 00:13:09,760 Speaker 9: That's why they're selling it. 258 00:13:10,000 --> 00:13:13,240 Speaker 1: Yeah, right, right. But like if you're good at making stuff, 259 00:13:13,280 --> 00:13:15,280 Speaker 1: why not do a little taco stand. 260 00:13:16,080 --> 00:13:19,040 Speaker 9: It's so good and it's like like it's homemade and 261 00:13:19,080 --> 00:13:21,320 Speaker 9: there's like, you know, people sitting around. There's usually like 262 00:13:21,320 --> 00:13:23,439 Speaker 9: a TV and his family's hanging out, so it's like 263 00:13:23,520 --> 00:13:26,440 Speaker 9: really nice. But also like the first weekend, I think 264 00:13:26,440 --> 00:13:28,280 Speaker 9: there was a graduation, so a lot of people were coming. 265 00:13:28,720 --> 00:13:30,480 Speaker 9: But also he just said he I asked him how 266 00:13:30,520 --> 00:13:33,280 Speaker 9: he advertised, and he just said he was posting on Facebook. 267 00:13:33,520 --> 00:13:37,000 Speaker 9: So I mean, whatever, fuck it, Zuckerberg. But like that's 268 00:13:37,080 --> 00:13:38,920 Speaker 9: kind of nice. 
I don't know, there was something really 269 00:13:38,960 --> 00:13:43,040 Speaker 9: like sweet about just like, it's good and he's just 270 00:13:43,160 --> 00:13:45,600 Speaker 9: earnestly trying to do it that way. But in my mind, 271 00:13:45,640 --> 00:13:46,920 Speaker 9: I'm like, oh my god, what if this is the 272 00:13:46,960 --> 00:13:48,920 Speaker 9: next big thing, because that's how you hear about 273 00:13:48,960 --> 00:13:51,319 Speaker 9: these like huge eateries. They always start from. 274 00:13:51,160 --> 00:13:53,040 Speaker 3: just okay, So then maybe you need to get in 275 00:13:53,080 --> 00:13:53,800 Speaker 3: as an investor. 276 00:13:53,920 --> 00:13:59,679 Speaker 1: Yeah, ground floor investor. This is how you get the Wilding's 277 00:13:59,760 --> 00:14:00,800 Speaker 1: name after you. 278 00:14:00,920 --> 00:14:03,720 Speaker 3: Yeah you do. You can stand up and it's gonna 279 00:14:03,520 --> 00:14:04,480 Speaker 9: be called Taco's 280 00:14:04,679 --> 00:14:07,200 Speaker 4: Mama Teresa. 281 00:14:07,640 --> 00:14:10,040 Speaker 3: I did a leveraged buyout. I got him chased 282 00:14:10,080 --> 00:14:11,920 Speaker 3: outside of town. It's all mine now. 283 00:14:13,640 --> 00:14:17,760 Speaker 1: This idiot made the mistake of showing me his process 284 00:14:17,800 --> 00:14:18,440 Speaker 1: for making. 285 00:14:18,240 --> 00:14:22,560 Speaker 3: Meary recipe true capitalists. 286 00:14:22,560 --> 00:14:24,080 Speaker 4: They're also only two dollars and very good. 287 00:14:24,120 --> 00:14:24,560 Speaker 3: That's the other thing. 288 00:14:24,600 --> 00:14:28,240 Speaker 9: I'm like, oh so so near Salazar, but I'm like 289 00:14:28,400 --> 00:14:30,480 Speaker 9: this place that's great. Yeah. 290 00:14:30,520 --> 00:14:33,520 Speaker 5: I love like I love local food, local eateries. 
It's 291 00:14:34,040 --> 00:14:36,080 Speaker 5: it's so funny when you meet people, like, I don't know, 292 00:14:36,080 --> 00:14:38,440 Speaker 5: maybe it's just being an Angeleno, like I've never grown 293 00:14:38,520 --> 00:14:40,720 Speaker 5: up with like the fear of street food. And also 294 00:14:40,760 --> 00:14:43,800 Speaker 5: maybe just like in Japan, too, in Asia, like they're 295 00:14:43,880 --> 00:14:46,640 Speaker 5: just like in Japan they're called yatais, where they're just 296 00:14:46,680 --> 00:14:49,120 Speaker 5: like, fucking, people just throw up a tent and shit 297 00:14:49,160 --> 00:14:49,600 Speaker 5: and a grill. 298 00:14:49,680 --> 00:14:53,360 Speaker 3: Yeah, like they're serving outside of a train. Yeah yeah. Yeah. 299 00:14:53,600 --> 00:14:54,880 Speaker 3: So like it's it's always funny to me. 300 00:14:54,920 --> 00:14:56,480 Speaker 5: People are like, I don't know about that. I'm like, 301 00:14:56,520 --> 00:14:59,520 Speaker 5: you have clearly not felt the love of street food. 302 00:14:59,680 --> 00:15:01,560 Speaker 4: That I will so you can't get stick of you. 303 00:15:01,840 --> 00:15:03,320 Speaker 9: Like when I went back to Taiwan, I hadn't 304 00:15:03,320 --> 00:15:05,160 Speaker 9: been for a while, and I just like, late, I 305 00:15:05,160 --> 00:15:08,680 Speaker 9: got something that probably had been out for hours. I 306 00:15:08,720 --> 00:15:11,360 Speaker 9: was like, I'm going to be like, yeah, you gotta 307 00:15:11,400 --> 00:15:13,120 Speaker 9: know how to be a little savvy about what time 308 00:15:13,160 --> 00:15:14,600 Speaker 9: and where, which stands are going to. 309 00:15:14,600 --> 00:15:16,960 Speaker 3: But I like to just say, yeah, it's because you've 310 00:15:16,960 --> 00:15:19,280 Speaker 3: been out of the country too long. Yeah it was. It 311 00:15:19,320 --> 00:15:20,120 Speaker 3: wasn't their fault. 
312 00:15:20,120 --> 00:15:22,240 Speaker 9: It was you, wasn't it? They were like, why would you 313 00:15:22,280 --> 00:15:25,520 Speaker 9: buy this meat thing? 314 00:15:26,400 --> 00:15:29,880 Speaker 3: It was clearly a shoe on a stick. I don't know. 315 00:15:30,240 --> 00:15:33,520 Speaker 5: I thought it was like a gag, like a fun shape. 316 00:15:34,800 --> 00:15:37,160 Speaker 1: All right, let's take a quick break and we'll come 317 00:15:37,200 --> 00:15:50,480 Speaker 1: back and talk about cocaine, and we're back and back 318 00:15:50,520 --> 00:15:53,800 Speaker 1: in our crack days. I worked on an article about 319 00:15:53,840 --> 00:15:59,800 Speaker 1: like cognitive biases that affect our ability to understand money, 320 00:16:00,160 --> 00:16:02,800 Speaker 1: and like the piece really like zoomed in on examples 321 00:16:02,800 --> 00:16:06,440 Speaker 1: of how traditional media interacts with a brain that was 322 00:16:06,560 --> 00:16:10,160 Speaker 1: designed millions of years ago for an animal that was 323 00:16:10,280 --> 00:16:13,640 Speaker 1: trying to survive the food chain, and like our brain 324 00:16:13,720 --> 00:16:17,160 Speaker 1: is designed to process like fairly simple visual stimuli in 325 00:16:17,200 --> 00:16:20,240 Speaker 1: a pretty straightforward way. That tree has fruit on it. 326 00:16:20,280 --> 00:16:23,800 Speaker 1: Remember that tree. And then you know, the point in 327 00:16:23,880 --> 00:16:27,160 Speaker 1: the article was like, if you only show the person 328 00:16:27,200 --> 00:16:30,160 Speaker 1: who won the lottery on TV and not the, you know, 329 00:16:30,240 --> 00:16:33,240 Speaker 1: billions of people who lose the lottery every day, it 330 00:16:33,360 --> 00:16:37,560 Speaker 1: creates an imprint of like, that's the fruit tree, that's 331 00:16:37,760 --> 00:16:42,920 Speaker 1: that's the place to go, and like that is so basic. 
332 00:16:43,800 --> 00:16:45,960 Speaker 1: That's such a basic thing that just feels like so 333 00:16:46,120 --> 00:16:50,880 Speaker 1: quaint and antiquated compared to the modern world that 334 00:16:51,160 --> 00:16:54,560 Speaker 1: we're existing in, where the increasing use of AI to 335 00:16:54,600 --> 00:16:57,920 Speaker 1: create intentionally false stories, use of like technology, really seems 336 00:16:58,000 --> 00:17:01,920 Speaker 1: to be accelerating things to in some cases like make 337 00:17:02,000 --> 00:17:05,639 Speaker 1: them worse, or at least there's a feeling that that 338 00:17:05,800 --> 00:17:09,040 Speaker 1: is the case in like day to day our day 339 00:17:09,080 --> 00:17:12,000 Speaker 1: to day lives. And so we wanted to talk about 340 00:17:12,800 --> 00:17:15,840 Speaker 1: kind of all the ways that the modern world is 341 00:17:16,160 --> 00:17:19,760 Speaker 1: sort of this fun house mirror that our brain's ability 342 00:17:19,800 --> 00:17:23,680 Speaker 1: to interact with it like just warps and stretches, and 343 00:17:24,080 --> 00:17:26,639 Speaker 1: it's it's not going well. It seems like, based on 344 00:17:26,680 --> 00:17:29,640 Speaker 1: some of these statistics that you pulled, Jason. 345 00:17:29,680 --> 00:17:31,879 Speaker 8: Well, I wanted to pull some stats, because when we 346 00:17:32,000 --> 00:17:35,440 Speaker 8: talk about like people being anxious or depressed or whatever 347 00:17:35,480 --> 00:17:38,720 Speaker 8: these days, it's not just talking about vibes, like you 348 00:17:39,119 --> 00:17:41,639 Speaker 8: can look at the statistics. So, suicide rates have been climbing 349 00:17:41,640 --> 00:17:44,400 Speaker 8: in the USA for the last twenty years. Around two 350 00:17:44,440 --> 00:17:48,920 Speaker 8: thousand is when most of these trends started to skew higher. 351 00:17:49,280 --> 00:17:52,800 Speaker 8: That is a distinctly American phenomenon. 
Most other countries' suicide 352 00:17:52,880 --> 00:17:55,880 Speaker 8: rates have been steadily falling. This is an American thing, 353 00:17:56,960 --> 00:17:59,479 Speaker 8: but there's a large number they call deaths of despair, 354 00:17:59,480 --> 00:18:04,639 Speaker 8: where they lump together suicides, alcohol deaths, drug related deaths, 355 00:18:05,560 --> 00:18:08,400 Speaker 8: and all of that has skewed up since two thousand, 356 00:18:08,920 --> 00:18:12,520 Speaker 8: and then since around twenty ten has started to, like, 357 00:18:13,000 --> 00:18:15,720 Speaker 8: most of them have kind of started to spike. Now 358 00:18:15,760 --> 00:18:17,720 Speaker 8: part of it, you mentioned the 359 00:18:17,800 --> 00:18:20,040 Speaker 8: drug overdose deaths. Some of that is separately just the 360 00:18:20,040 --> 00:18:24,200 Speaker 8: opioid epidemic and fentanyl, but also some of those overdoses 361 00:18:24,240 --> 00:18:27,719 Speaker 8: are intentional, but if there's no note left behind, they 362 00:18:27,720 --> 00:18:29,280 Speaker 8: just put it down as overdose, because of course, how 363 00:18:29,320 --> 00:18:32,400 Speaker 8: do you know. And then anxiety 364 00:18:32,400 --> 00:18:35,239 Speaker 8: and depression both have been rising, specifically among the 365 00:18:35,280 --> 00:18:39,480 Speaker 8: youth, again going back to around two thousand and nine, 366 00:18:39,480 --> 00:18:42,359 Speaker 8: twenty ten. Now there's two ways people interpret this that 367 00:18:42,400 --> 00:18:44,760 Speaker 8: are controversial. You can either say that these things have 368 00:18:44,840 --> 00:18:47,840 Speaker 8: been going up in the internet era and then have 369 00:18:48,000 --> 00:18:53,520 Speaker 8: accelerated in the smartphone era.
Or you can say, well, 370 00:18:53,520 --> 00:18:55,879 Speaker 8: these things have gone up since nine eleven and 371 00:18:56,000 --> 00:18:58,720 Speaker 8: accelerated since the financial crisis of two thousand and eight. 372 00:18:59,760 --> 00:19:02,879 Speaker 8: When we start talking about how the media and smartphones, 373 00:19:02,920 --> 00:19:06,760 Speaker 8: all those things, make people more anxious, this upsets people, 374 00:19:06,800 --> 00:19:10,240 Speaker 8: because the response is always, well, we're not anxious because 375 00:19:10,240 --> 00:19:13,119 Speaker 8: of phones. We're anxious because the world is on fire. 376 00:19:13,960 --> 00:19:16,240 Speaker 8: Our assertion is not that there are no problems in 377 00:19:16,240 --> 00:19:18,640 Speaker 8: the world. Our assertion, or at least mine, is that 378 00:19:19,200 --> 00:19:23,439 Speaker 8: the way media, the media environment, is one of those problems. 379 00:19:23,920 --> 00:19:26,679 Speaker 8: It's one of those things that make things worse. 380 00:19:27,160 --> 00:19:29,080 Speaker 1: Yeah, I think there's a lot of things, like all 381 00:19:29,160 --> 00:19:32,040 Speaker 1: the things that you were saying, like the world is 382 00:19:32,080 --> 00:19:35,400 Speaker 1: on fire, nine eleven, the economic collapse, and then 383 00:19:35,440 --> 00:19:39,120 Speaker 1: also having devices in our hands feeding us a steady 384 00:19:39,119 --> 00:19:42,679 Speaker 1: stream of media that's like specifically tailored to us and 385 00:19:42,840 --> 00:19:48,399 Speaker 1: specifically based on preferences of what makes us angriest or 386 00:19:48,520 --> 00:19:53,960 Speaker 1: most frightened. Like, those things can't be good. And yeah, 387 00:19:54,160 --> 00:19:57,520 Speaker 1: I think, Miles, you pulled that Al Jazeera story 388 00:19:57,560 --> 00:20:00,720 Speaker 1: about like outrage headlines increasing. Well, because.
389 00:20:00,520 --> 00:20:04,320 Speaker 5: I mean, yeah, again, like, there's so many facets 390 00:20:04,359 --> 00:20:07,600 Speaker 5: of how we end up with bad vibes. The Bad 391 00:20:07,680 --> 00:20:11,439 Speaker 5: Vibes Decade, uh, trademarked. But yeah, like, I think one 392 00:20:11,480 --> 00:20:13,320 Speaker 5: of them is, you know, like, this is some version 393 00:20:13,359 --> 00:20:15,920 Speaker 5: of like mean world syndrome, where a lot of your 394 00:20:16,040 --> 00:20:18,800 Speaker 5: media diet is just giving you sort of an over 395 00:20:18,840 --> 00:20:21,760 Speaker 5: emphasis on the terrible things that are happening, which can 396 00:20:21,840 --> 00:20:24,200 Speaker 5: just lead to like being more cynical or just being 397 00:20:24,240 --> 00:20:26,359 Speaker 5: like, what the fuck is going on? And yeah, like, 398 00:20:26,400 --> 00:20:29,879 Speaker 5: with Al Jazeera, they analyzed headlines, like, well, like 399 00:20:30,480 --> 00:20:33,280 Speaker 5: tens of thousands of headlines from two thousand to twenty nineteen, 400 00:20:33,880 --> 00:20:37,159 Speaker 5: just to see what the emotional charge was of 401 00:20:37,160 --> 00:20:40,320 Speaker 5: some of these headlines, and they've noticed that things have 402 00:20:40,680 --> 00:20:44,160 Speaker 5: just gotten more and more increasingly negative since the year 403 00:20:44,160 --> 00:20:47,440 Speaker 5: two thousand, like, you know, headlines that say, like, quote, 404 00:20:47,520 --> 00:20:51,560 Speaker 5: Brazil prison riot leaves nine dead, rather than things like 405 00:20:51,640 --> 00:20:54,320 Speaker 5: a new lens restores vision and brings relief. 406 00:20:54,720 --> 00:20:57,120 Speaker 3: And we're seeing just that sort of that.
407 00:20:57,160 --> 00:21:00,760 Speaker 5: Rise in those kinds of headlines be equally sort of 408 00:21:00,800 --> 00:21:03,840 Speaker 5: distributed between the right and left, but there 409 00:21:03,880 --> 00:21:06,520 Speaker 5: is an edge on the right, like with conservative media 410 00:21:06,600 --> 00:21:10,480 Speaker 5: definitely having, you know, a bigger emphasis on the 411 00:21:10,960 --> 00:21:14,320 Speaker 5: fear based kind of headline. And I feel the same 412 00:21:14,359 --> 00:21:17,359 Speaker 5: way too, like, because reading as much news as I do, 413 00:21:17,480 --> 00:21:20,560 Speaker 5: as we have to, making this show, like, that absolutely 414 00:21:20,600 --> 00:21:21,520 Speaker 5: has an effect on me. 415 00:21:21,680 --> 00:21:23,080 Speaker 3: But then you kind of have to. 416 00:21:23,080 --> 00:21:24,840 Speaker 5: Take a step back and look at what the statistics are 417 00:21:24,880 --> 00:21:26,919 Speaker 5: saying; you're actually getting something completely different. 418 00:21:27,359 --> 00:21:27,919 Speaker 1: Yeah? 419 00:21:28,359 --> 00:21:31,000 Speaker 8: Yeah, with the headlines, and again, note that there's the 420 00:21:31,080 --> 00:21:33,760 Speaker 8: year two thousand again. Going back to the origin 421 00:21:33,800 --> 00:21:37,000 Speaker 8: of that, what it was driven by was the death 422 00:21:37,080 --> 00:21:40,520 Speaker 8: of news being a print medium versus news being an 423 00:21:40,520 --> 00:21:44,680 Speaker 8: online medium, because again, with newspaper headlines, like, of course, 424 00:21:44,720 --> 00:21:47,200 Speaker 8: once upon a time they had to sell papers, but 425 00:21:47,400 --> 00:21:50,800 Speaker 8: now individual stories have to get clicked on. Yeah, and 426 00:21:51,040 --> 00:21:54,479 Speaker 8: very very quickly, just through A/B testing. Again, there's no 427 00:21:54,560 --> 00:21:58,720 Speaker 8: conspiracy here.
Just through tracking user behavior, they figured out 428 00:21:58,720 --> 00:22:01,480 Speaker 8: that the more emotionally charged, I mean, we saw this at Cracked, 429 00:22:01,520 --> 00:22:05,280 Speaker 8: the more emotionally charged headline gets clicked. So now every 430 00:22:05,400 --> 00:22:08,080 Speaker 8: individual headline, because, you know, once upon a time you'd 431 00:22:08,119 --> 00:22:10,440 Speaker 8: have a newspaper with a big headline across the top. 432 00:22:10,520 --> 00:22:13,959 Speaker 8: You know, Nixon goes to jail, whatever. I don't think 433 00:22:14,000 --> 00:22:16,159 Speaker 8: Nixon ever actually went to jail. Yeah, yeah, it 434 00:22:16,160 --> 00:22:16,640 Speaker 8: doesn't matter. 435 00:22:16,720 --> 00:22:18,399 Speaker 1: Would have sold papers and it would have been a 436 00:22:18,400 --> 00:22:19,600 Speaker 1: headline in the modern 437 00:22:19,280 --> 00:22:21,560 Speaker 8: era. And then a lot of the other headlines would 438 00:22:21,560 --> 00:22:26,000 Speaker 8: be very boring and straightforward: city council votes to do whatever. Well, now, 439 00:22:26,119 --> 00:22:28,240 Speaker 8: if you want people to click on that boring city 440 00:22:28,280 --> 00:22:31,399 Speaker 8: council story, if it's your job as a journalist to get 441 00:22:31,440 --> 00:22:33,280 Speaker 8: people to click, there's got to be an angle on 442 00:22:33,359 --> 00:22:37,080 Speaker 8: there that's going to get people mad. And so beyond 443 00:22:37,080 --> 00:22:41,119 Speaker 8: the examples you gave here, you have the news, you know, 444 00:22:41,160 --> 00:22:43,440 Speaker 8: the actual news organizations gathering news.
But then you have 445 00:22:43,560 --> 00:22:47,600 Speaker 8: the aggregators of the content, like Huffington Post, BuzzFeed, Vox, 446 00:22:47,640 --> 00:22:49,800 Speaker 8: all of these sites that basically would take the headline 447 00:22:49,840 --> 00:22:52,239 Speaker 8: and then do a little blog post about it, and 448 00:22:52,400 --> 00:22:56,320 Speaker 8: those titles would be things like, you know, if you're 449 00:22:56,320 --> 00:22:59,600 Speaker 8: not paying attention to this, you're not angry enough, or, 450 00:22:59,760 --> 00:23:02,119 Speaker 8: this clip is going to leave you outraged: when you 451 00:23:02,160 --> 00:23:06,320 Speaker 8: see how they treated this disabled man at McDonald's, you're 452 00:23:06,359 --> 00:23:08,919 Speaker 8: going to be furious. Like, literally putting the emotion in 453 00:23:09,000 --> 00:23:14,320 Speaker 8: the headline. If you're not mad, you're not a good person, right, 454 00:23:14,400 --> 00:23:16,880 Speaker 8: and making that part of the ethos of the time, 455 00:23:16,960 --> 00:23:18,760 Speaker 8: that it's like, if you want to be a good 456 00:23:18,840 --> 00:23:21,240 Speaker 8: person who cares about the world, you must be angry 457 00:23:21,600 --> 00:23:23,880 Speaker 8: all the time, because there's so much 458 00:23:23,920 --> 00:23:27,480 Speaker 8: injustice or whatever. Never mind that that doesn't help that 459 00:23:27,520 --> 00:23:30,040 Speaker 8: person in the video. Never mind that you're watching a 460 00:23:30,080 --> 00:23:33,879 Speaker 8: clip from three years ago and everyone has forgotten about it. 461 00:23:33,880 --> 00:23:35,720 Speaker 8: It doesn't matter. It has bubbled up on the Reddit 462 00:23:35,800 --> 00:23:39,200 Speaker 8: and now everybody's mad again to no effect. Like, it's 463 00:23:39,200 --> 00:23:41,399 Speaker 8: not motivating you to help this person. 464 00:23:42,080 --> 00:23:46,680 Speaker 1: Yeah.
One old, outdated technology I think we're underrating 465 00:23:46,760 --> 00:23:49,679 Speaker 1: is newsies. That's what these headlines are doing. You know, newsies just 466 00:23:49,760 --> 00:23:52,359 Speaker 1: used to yell at you to read the newspaper and 467 00:23:52,400 --> 00:23:55,520 Speaker 1: people would just listen because they're scared of the newsies. 468 00:23:55,960 --> 00:23:58,360 Speaker 1: But now they have to scare us with the actual 469 00:23:58,480 --> 00:24:02,040 Speaker 1: content of the headline. And then I think, you know, 470 00:24:02,080 --> 00:24:05,679 Speaker 1: we've also talked recently about how the number of 471 00:24:05,720 --> 00:24:10,280 Speaker 1: people who identify with a religion or are involved in a religion, 472 00:24:10,560 --> 00:24:14,520 Speaker 1: and the number of people who have access to or 473 00:24:14,600 --> 00:24:20,000 Speaker 1: regular contact with a community, like, both are just at 474 00:24:20,080 --> 00:24:23,800 Speaker 1: an all time low and going lower. And so I 475 00:24:23,840 --> 00:24:29,359 Speaker 1: think that there's a broad, kind of, you can call 476 00:24:29,400 --> 00:24:33,280 Speaker 1: it spiritual or like broadly psychological level, more than ever 477 00:24:33,359 --> 00:24:39,399 Speaker 1: before, where we don't have access to answering some of 478 00:24:39,440 --> 00:24:42,879 Speaker 1: the big questions.
And so, like, we speculated in the past, in 479 00:24:43,280 --> 00:24:46,879 Speaker 1: one of these like kind of broad episodes, about the 480 00:24:46,920 --> 00:24:51,160 Speaker 1: idea that like maybe stan culture is coming from 481 00:24:51,280 --> 00:24:55,240 Speaker 1: this need to sublimate ourselves to something higher, and like 482 00:24:55,320 --> 00:25:00,199 Speaker 1: some mythical like godhead figure or like leader person, like, 483 00:25:00,240 --> 00:25:03,760 Speaker 1: because we're not getting that anywhere else, and so, like, 484 00:25:03,840 --> 00:25:07,240 Speaker 1: we get stan culture, or the stories of the movies 485 00:25:07,240 --> 00:25:09,439 Speaker 1: that make an imprint on us when we're very young, 486 00:25:09,760 --> 00:25:13,160 Speaker 1: like Star Wars or, you know, Harry Potter, the Matrix. 487 00:25:13,200 --> 00:25:19,600 Speaker 1: Like, these are our mythical stories that we have chosen 488 00:25:19,640 --> 00:25:24,399 Speaker 1: to kind of put ourselves into, like, derive our meaning from. 489 00:25:24,960 --> 00:25:30,080 Speaker 1: And that actually made a lot of sense of how 490 00:25:30,119 --> 00:25:33,960 Speaker 1: people react, like how Star Wars fans or like fandoms 491 00:25:34,480 --> 00:25:38,680 Speaker 1: react: like there is a religious war when something happens 492 00:25:38,760 --> 00:25:42,359 Speaker 1: that doesn't cohere to the dogma that they were raised on. 493 00:25:42,720 --> 00:25:45,199 Speaker 1: But I think there's a lot of like kind of 494 00:25:45,600 --> 00:25:51,320 Speaker 1: grasping for meaning as technology has isolated us from one 495 00:25:51,359 --> 00:25:55,399 Speaker 1: another and from the communities that used to allow people 496 00:25:55,440 --> 00:26:00,040 Speaker 1: to sort of dissolve their sense of self into like 497 00:26:00,080 --> 00:26:04,600 Speaker 1: a broader, less kind of self centered way of viewing 498 00:26:04,600 --> 00:26:07,400 Speaker 1: the world.
And I mean, yeah, we were talking about 499 00:26:07,440 --> 00:26:12,320 Speaker 1: like consumer culture as like kind of another example of 500 00:26:12,359 --> 00:26:15,639 Speaker 1: where we see this. I mean, I guess 501 00:26:16,080 --> 00:26:20,040 Speaker 1: stan culture is one example of that, but consumerism, people 502 00:26:20,080 --> 00:26:23,600 Speaker 1: have become like very serious consumers, like, on 503 00:26:23,640 --> 00:26:26,840 Speaker 1: a deep, deeply personal level. 504 00:26:27,000 --> 00:26:29,600 Speaker 5: I feel like, yeah, because, I mean, like, to your point, 505 00:26:29,640 --> 00:26:32,400 Speaker 5: I mean, I don't, you know, the lack of religion, 506 00:26:32,520 --> 00:26:35,680 Speaker 5: or I think the sense of community more than like finding. 507 00:26:35,800 --> 00:26:37,879 Speaker 5: I mean, people are finding meaning in other things, but 508 00:26:37,920 --> 00:26:40,479 Speaker 5: I don't know how many people are explicitly like, I 509 00:26:40,560 --> 00:26:43,399 Speaker 5: just need at least something that kind of explains everything. And 510 00:26:43,480 --> 00:26:46,159 Speaker 5: I think the way we pivot to that is just 511 00:26:46,200 --> 00:26:48,439 Speaker 5: to find, like you say, like, because we feel so 512 00:26:48,520 --> 00:26:51,320 Speaker 5: isolated, we want to find community in these other 513 00:26:51,359 --> 00:26:53,800 Speaker 5: ways that, like, are interesting to us. Because I'm not 514 00:26:53,920 --> 00:26:57,639 Speaker 5: interested in religion, yeah, but I'm interested in Arsenal Football Club, 515 00:26:58,359 --> 00:27:02,240 Speaker 5: and that is the closest I get to devoting myself 516 00:27:02,320 --> 00:27:04,680 Speaker 5: to something, like, there's a group, there's 517 00:27:04,720 --> 00:27:07,240 Speaker 5: like an in group where I'm trying to 518 00:27:07,320 --> 00:27:11,399 Speaker 5: identify with these other supporters.
I go through the religious 519 00:27:11,800 --> 00:27:15,119 Speaker 5: ceremonies of, like, watching the matches and getting very emotional 520 00:27:15,160 --> 00:27:17,840 Speaker 5: as I watch them, and being very, like, emotionally moved 521 00:27:17,880 --> 00:27:20,119 Speaker 5: when things go up or down. And so, like, I 522 00:27:20,440 --> 00:27:23,720 Speaker 5: find that like most people have like a topic where 523 00:27:23,760 --> 00:27:26,760 Speaker 5: they will bring that sort of level of like devotion to, 524 00:27:27,200 --> 00:27:28,879 Speaker 5: you know, like what they're paying attention to and what 525 00:27:28,880 --> 00:27:31,800 Speaker 5: they're willing to debate people on, et cetera. Because, yeah, 526 00:27:31,840 --> 00:27:34,480 Speaker 5: like, we're all just trying to find something 527 00:27:34,520 --> 00:27:37,399 Speaker 5: that like feels good and helps us feel connected at 528 00:27:37,440 --> 00:27:38,879 Speaker 5: the end of the day. At least I 529 00:27:38,880 --> 00:27:41,160 Speaker 5: can speak for myself in that very narrow example. 530 00:27:42,200 --> 00:27:44,080 Speaker 8: Yes, this is something else you can track in 531 00:27:44,119 --> 00:27:47,280 Speaker 8: statistics: the average number of friends and close friends 532 00:27:47,280 --> 00:27:50,200 Speaker 8: a person has, again, has been dropping since the nineties, 533 00:27:51,119 --> 00:27:54,680 Speaker 8: and it is just that people used to meet their 534 00:27:54,720 --> 00:27:57,680 Speaker 8: friends at church, they used to meet their friends at 535 00:27:57,680 --> 00:27:59,560 Speaker 8: the office, and now a lot of people work at home, 536 00:28:00,119 --> 00:28:03,920 Speaker 8: you know.
And that's the thing where we can sit 537 00:28:03,920 --> 00:28:05,879 Speaker 8: here and talk about it as like a mystical thing, like 538 00:28:05,920 --> 00:28:08,719 Speaker 8: there's no sense of unity or community or whatever, but 539 00:28:09,240 --> 00:28:11,600 Speaker 8: from just a practical point of view, having a friend 540 00:28:11,600 --> 00:28:13,879 Speaker 8: who will give you a ride to the airport, or 541 00:28:13,880 --> 00:28:16,720 Speaker 8: who will help you move, or, you know, like, just 542 00:28:16,800 --> 00:28:19,199 Speaker 8: in a practical manner, that's part of what 543 00:28:19,760 --> 00:28:22,120 Speaker 8: people don't get. It's part of what a church provided. 544 00:28:22,280 --> 00:28:24,720 Speaker 8: Like, at the church, when one person got sick and couldn't work, 545 00:28:24,760 --> 00:28:26,600 Speaker 8: the other members of the church would bring food to 546 00:28:26,640 --> 00:28:29,040 Speaker 8: their house. They would come help them clean. Like, that 547 00:28:29,200 --> 00:28:31,160 Speaker 8: was, and you did it because it's like, hey, we 548 00:28:31,240 --> 00:28:34,760 Speaker 8: all are Baptists or whatever, and we are all on 549 00:28:34,800 --> 00:28:38,320 Speaker 8: the same team. And that's something 550 00:28:39,160 --> 00:28:42,440 Speaker 8: humans do everywhere that you find humans: we organize 551 00:28:42,440 --> 00:28:44,600 Speaker 8: and get together, but we usually have to have something 552 00:28:44,640 --> 00:28:49,480 Speaker 8: to rally around, a symbol or something, a tree, this 553 00:28:49,600 --> 00:28:53,040 Speaker 8: tree, we all, yeah, whatever it is, you know. 554 00:28:54,520 --> 00:28:57,280 Speaker 8: And unfortunately that also makes us go to war with 555 00:28:57,320 --> 00:28:59,200 Speaker 8: one another.
If you see somebody in a Red Sox 556 00:28:59,280 --> 00:29:01,840 Speaker 8: jersey and you're a Yankees fan, then it's like, all right, 557 00:29:02,240 --> 00:29:05,160 Speaker 8: let's go punch that guy. But when you talk about, 558 00:29:05,240 --> 00:29:08,840 Speaker 8: and there's a term that is for the modern situation, 559 00:29:08,960 --> 00:29:13,200 Speaker 8: which is atomization, where you've atomized people to where now, 560 00:29:13,320 --> 00:29:15,080 Speaker 8: if you need a ride to the airport, you're going 561 00:29:15,160 --> 00:29:18,800 Speaker 8: to, you know, pay for an Uber, and so many 562 00:29:18,800 --> 00:29:21,120 Speaker 8: of these things that used to be stuff that friends 563 00:29:21,120 --> 00:29:23,400 Speaker 8: would do, it's like, well, now it's a corporation doing 564 00:29:23,440 --> 00:29:27,880 Speaker 8: it for you. And I'm not saying that the friends 565 00:29:28,000 --> 00:29:32,040 Speaker 8: you know purely online aren't your real friends, but it's 566 00:29:32,080 --> 00:29:34,120 Speaker 8: a different type of friendship if it's somebody who you 567 00:29:34,240 --> 00:29:36,600 Speaker 8: can't call when you've broken your leg and need somebody 568 00:29:36,600 --> 00:29:40,040 Speaker 8: to go get groceries for you, or vice versa, if 569 00:29:40,080 --> 00:29:42,600 Speaker 8: you're not the person that, like, feels obligated to 570 00:29:42,640 --> 00:29:44,760 Speaker 8: do that because they're my friend, like, like, they depend 571 00:29:44,880 --> 00:29:48,080 Speaker 8: on me.
It means something to be needed, to be 572 00:29:48,160 --> 00:29:51,320 Speaker 8: depended on. Like, to get in a situation where nobody 573 00:29:51,320 --> 00:29:54,840 Speaker 8: depends on you, and like, I don't have children or whatever, 574 00:29:54,880 --> 00:29:57,120 Speaker 8: but like I do have friends where that friend's gonna 575 00:29:57,160 --> 00:29:58,920 Speaker 8: call me in the middle of the night and say, hey, I 576 00:29:58,920 --> 00:30:01,200 Speaker 8: need you to come bail me out. Like, that 577 00:30:01,240 --> 00:30:03,880 Speaker 8: sucks in the moment, but knowing that somebody needs you 578 00:30:04,640 --> 00:30:07,800 Speaker 8: is what keeps you going. When you're that isolated from 579 00:30:07,840 --> 00:30:11,880 Speaker 8: real life connections, it's too easy to just drift away. 580 00:30:12,320 --> 00:30:15,680 Speaker 8: Humans need to be needed. We do need that actual 581 00:30:15,760 --> 00:30:16,840 Speaker 8: in real life connection. 582 00:30:17,200 --> 00:30:20,040 Speaker 5: Which is interesting too, because I see this like on 583 00:30:20,120 --> 00:30:23,960 Speaker 5: TikTok more and more, people posting like these strategies 584 00:30:24,160 --> 00:30:27,680 Speaker 5: that come off as like the most manipulative, sort of 585 00:30:27,680 --> 00:30:30,240 Speaker 5: like sadistic things, where they talk about, it's like, you 586 00:30:30,360 --> 00:30:32,640 Speaker 5: got to be needed, and that's how you do it, that's 587 00:30:32,680 --> 00:30:35,200 Speaker 5: how you develop even deeper relationships.
I do this thing 588 00:30:35,240 --> 00:30:39,479 Speaker 5: with my wife where I unhook the chain from the 589 00:30:39,520 --> 00:30:44,280 Speaker 5: toilet handle so she'll need me, and when the toilet isn't working, 590 00:30:44,600 --> 00:30:47,080 Speaker 5: she will then associate me with someone who can come 591 00:30:47,160 --> 00:30:49,800 Speaker 5: and solve a problem, and then that helps create a 592 00:30:49,840 --> 00:30:50,520 Speaker 5: deeper love. 593 00:30:50,600 --> 00:30:53,840 Speaker 3: And you're like, oh my god, and people are like, really? 594 00:30:53,720 --> 00:30:57,120 Speaker 5: Like, I mean, that's a kind of fringe element of TikTok, 595 00:30:57,200 --> 00:30:58,880 Speaker 5: but more and more people are taking these sorts of 596 00:30:58,880 --> 00:31:03,040 Speaker 5: things and sort of finding ways to, like, manufacture these kinds 597 00:31:03,080 --> 00:31:06,840 Speaker 5: of connections, because they aren't happening like normally either, and 598 00:31:06,880 --> 00:31:09,479 Speaker 5: you're kind of like, this sounds sociopathic. But on 599 00:31:09,520 --> 00:31:11,720 Speaker 5: the other side of it, you can see people sort 600 00:31:11,720 --> 00:31:14,719 Speaker 5: of like yearning for, like, yeah, how do I cultivate 601 00:31:14,840 --> 00:31:15,680 Speaker 5: a deeper connection? 602 00:31:15,760 --> 00:31:17,680 Speaker 3: Do I need to sort of gaslight this person 603 00:31:17,680 --> 00:31:20,440 Speaker 5: into thinking that the toilet never works and I'm 604 00:31:20,040 --> 00:31:24,040 Speaker 5: like the magical fixer of that? But again, like, you 605 00:31:24,040 --> 00:31:27,719 Speaker 5: see it expressed in so many ways that there 606 00:31:27,840 --> 00:31:30,360 Speaker 5: is this deeper human thing lacking.
607 00:31:30,560 --> 00:31:33,040 Speaker 8: Yeah, I mean, that's a system from It's Always 608 00:31:33,080 --> 00:31:36,280 Speaker 8: Sunny in Philadelphia, right, right, if you don't know that 609 00:31:36,320 --> 00:31:37,120 Speaker 8: reference, right. 610 00:31:38,280 --> 00:31:43,360 Speaker 1: But it's strange, like, when one of your 611 00:31:43,360 --> 00:31:47,920 Speaker 1: most common human interactions is like an Uber driver, 612 00:31:48,120 --> 00:31:50,560 Speaker 1: like a person, a stranger, you can put on quiet 613 00:31:50,600 --> 00:31:53,840 Speaker 1: mode, like, when you're having a conversation with them. 614 00:31:53,920 --> 00:31:56,440 Speaker 1: Then it makes sense to me that that sort of 615 00:31:57,320 --> 00:31:59,560 Speaker 1: manipulation and viewing other people as a means to an 616 00:31:59,640 --> 00:32:04,320 Speaker 1: end bleeds into how you view other parts 617 00:32:04,320 --> 00:32:08,120 Speaker 1: of your life, right, if everything's just sort of 618 00:32:08,120 --> 00:32:14,440 Speaker 1: transactional. And I mean, they talk about this new sort 619 00:32:14,440 --> 00:32:18,560 Speaker 1: of information and also like just day to day economy 620 00:32:18,560 --> 00:32:24,400 Speaker 1: as being a way to reduce the friction that 621 00:32:24,800 --> 00:32:26,680 Speaker 1: got in the way of some of our consumer, 622 00:32:26,800 --> 00:32:31,320 Speaker 1: like, spending habits, and like friction in many cases seems 623 00:32:31,320 --> 00:32:34,720 Speaker 1: to be human interaction. So I think one thing we're 624 00:32:34,760 --> 00:32:39,080 Speaker 1: going to say overall: human interaction good. Find ways to 625 00:32:39,120 --> 00:32:42,239 Speaker 1: be part of a community, you know, preferably one that 626 00:32:42,520 --> 00:32:45,200 Speaker 1: needs you and gives you meaning.
627 00:32:45,360 --> 00:32:48,280 Speaker 8: Miles, the next time you go on TikTok, like, there's 628 00:32:48,320 --> 00:32:51,120 Speaker 8: a meta narrative in that video you saw, because that 629 00:32:51,200 --> 00:32:55,960 Speaker 8: algorithm floated up to you a video where the actual 630 00:32:56,640 --> 00:33:01,800 Speaker 8: message was: people are crazy out there, and relationships are 631 00:33:01,800 --> 00:33:04,440 Speaker 8: weird and toxic these days. You would be shocked at 632 00:33:04,480 --> 00:33:08,440 Speaker 8: what percentage of the feed is some subtle message, some 633 00:33:08,480 --> 00:33:11,360 Speaker 8: subtle version of: out there is dangerous. 634 00:33:12,000 --> 00:33:12,160 Speaker 6: Right. 635 00:33:12,320 --> 00:33:16,880 Speaker 8: Guys are sexist, women are crazy, their standards are super high. 636 00:33:17,480 --> 00:33:20,240 Speaker 8: You'll be accused of sexual harassment if you ever try 637 00:33:20,280 --> 00:33:23,160 Speaker 8: to talk to a woman in any setting. And there's 638 00:33:23,200 --> 00:33:27,120 Speaker 8: this meta message of: the only safe place is at 639 00:33:27,160 --> 00:33:28,920 Speaker 8: home looking at a screen. 640 00:33:29,320 --> 00:33:32,720 Speaker 1: Yeah, women out there, and you see people who aren't all there, 641 00:33:33,640 --> 00:33:36,280 Speaker 1: just as you mentioned, as a video, like it's a 642 00:33:36,280 --> 00:33:40,720 Speaker 1: person having a mental health episode. That was everywhere for 643 00:33:40,960 --> 00:33:44,920 Speaker 1: like a week, and I never would have seen it thirty, 644 00:33:45,280 --> 00:33:47,440 Speaker 1: you know, twenty, maybe like ten years ago. 645 00:33:47,720 --> 00:33:49,880 Speaker 8: You know, there was a time when that wouldn't have 646 00:33:49,920 --> 00:33:52,160 Speaker 8: even shown up in the police blotter in your 647 00:33:52,200 --> 00:33:54,520 Speaker 8: local paper. It was a non event.
Somebody threw a 648 00:33:54,560 --> 00:33:56,400 Speaker 8: fit, but it was captured on camera, 649 00:33:56,440 --> 00:33:59,480 Speaker 8: and the message is it's scary out there, there's crazy 650 00:33:59,520 --> 00:34:02,880 Speaker 8: people out there. And, like, if you 651 00:34:02,920 --> 00:34:05,920 Speaker 8: look at the comments on any conservative news outlet, 652 00:34:05,920 --> 00:34:09,520 Speaker 8: it's like, well, I won't even drive through this city, 653 00:34:09,640 --> 00:34:13,759 Speaker 8: like it's just Mad Max. It's like, no, it's not. I've 654 00:34:13,840 --> 00:34:16,600 Speaker 8: been there. It's people walking around and shopping and eating 655 00:34:16,640 --> 00:34:18,799 Speaker 8: in the restaurants. But it's like, no, from their 656 00:34:18,840 --> 00:34:22,640 Speaker 8: point of view, it's just cars on fire, smashed windows, 657 00:34:22,680 --> 00:34:25,480 Speaker 8: like, if you are a white person, the minorities will 658 00:34:25,480 --> 00:34:27,359 Speaker 8: just drag you off the street, because it's all 659 00:34:27,400 --> 00:34:31,160 Speaker 8: Antifa and the Black Lives Matter. It's like, you've got 660 00:34:31,160 --> 00:34:35,320 Speaker 8: such a weird view of the world, but it's all: 661 00:34:36,120 --> 00:34:38,920 Speaker 8: you're only safe at home.
Like, you have people living 662 00:34:38,960 --> 00:34:42,520 Speaker 8: in the middle of whatever, Oklahoma or Montana, someplace where 663 00:34:42,560 --> 00:34:45,759 Speaker 8: they haven't had a violent crime in, you know, like 664 00:34:45,800 --> 00:34:48,520 Speaker 8: six months, where they've got cameras all over the 665 00:34:48,520 --> 00:34:51,080 Speaker 8: outside of their house and they've got a shotgun under 666 00:34:51,120 --> 00:34:53,840 Speaker 8: their bed, because they're sure that at any moment a 667 00:34:53,920 --> 00:34:56,120 Speaker 8: gang of twenty five guys is going to come try 668 00:34:56,120 --> 00:34:58,160 Speaker 8: to take over their house, because it's something they saw 669 00:34:58,239 --> 00:35:00,480 Speaker 8: on the news. It's like, well, you know, like 670 00:35:00,520 --> 00:35:03,880 Speaker 8: in Portland at the Walgreens, they'll just have like twenty 671 00:35:03,920 --> 00:35:06,840 Speaker 8: five looters show up and steal everything; that could happen 672 00:35:06,880 --> 00:35:10,439 Speaker 8: to me at any time here in North Dakota. 673 00:35:10,680 --> 00:35:13,720 Speaker 8: Right? And when you see their fear 674 00:35:13,760 --> 00:35:18,360 Speaker 8: and how irrational it is from the outside, your fear 675 00:35:18,440 --> 00:35:21,120 Speaker 8: also looks like that to someone else. Like, we all 676 00:35:21,239 --> 00:35:23,680 Speaker 8: have been put into kind of a little box, 677 00:35:23,719 --> 00:35:24,720 Speaker 8: a little fear box. 678 00:35:25,040 --> 00:35:27,040 Speaker 5: Well, I think, like, to your point, like, especially about, 679 00:35:27,040 --> 00:35:30,480 Speaker 5: like, I think of things like Reddit, right, and the 680 00:35:30,560 --> 00:35:33,720 Speaker 5: subreddit Public Freakout.
There's like a subreddit called public 681 00:35:33,760 --> 00:35:36,799 Speaker 5: freakout that's really popular and it's mostly a lot of 682 00:35:36,800 --> 00:35:39,040 Speaker 5: people just having some kind of like mental health crisis 683 00:35:39,120 --> 00:35:41,040 Speaker 5: or something like that, or just some wild thing that's 684 00:35:41,080 --> 00:35:43,719 Speaker 5: going on. But when you think about like when you're 685 00:35:43,800 --> 00:35:46,279 Speaker 5: like a kid, I remember being like, oh, what's the 686 00:35:46,360 --> 00:35:48,520 Speaker 5: ocean like? Or I'm trying to like, you know, draw 687 00:35:48,560 --> 00:35:51,840 Speaker 5: on experiences I have. Most of them are like media-referenced 688 00:35:51,880 --> 00:35:54,719 Speaker 5: memories I have, like of a movie or 689 00:35:54,719 --> 00:35:57,279 Speaker 5: something I saw on TV that was forming even my 690 00:35:57,400 --> 00:36:00,480 Speaker 5: own concept of what something was. And now that we 691 00:36:00,560 --> 00:36:04,440 Speaker 5: have like video that's like IRL-type sort of video 692 00:36:04,560 --> 00:36:07,680 Speaker 5: from cameras that sort of that ups the stakes even 693 00:36:07,719 --> 00:36:10,239 Speaker 5: more where people begin to associate Well, I did see 694 00:36:10,239 --> 00:36:12,480 Speaker 5: this one clip of this thing happening in this place 695 00:36:12,560 --> 00:36:14,840 Speaker 5: now that is exactly what is going to happen to me, 696 00:36:14,880 --> 00:36:17,960 Speaker 5: and has this effect of just ramping up these fears 697 00:36:18,440 --> 00:36:20,440 Speaker 5: and like a good like you know, an example of 698 00:36:20,480 --> 00:36:23,200 Speaker 5: this is, like you're saying, Jason, like, especially on conservative news, 699 00:36:23,200 --> 00:36:26,440 Speaker 5: the way they portray certain cities and what quote unquote 700 00:36:26,520 --> 00:36:29,520 Speaker 5: crime waves are happening.
That has this effect on us 701 00:36:29,560 --> 00:36:32,399 Speaker 5: just as human beings, where the more we're put into 702 00:36:32,480 --> 00:36:36,360 Speaker 5: a fear state, the more malleable we become. And also 703 00:36:36,600 --> 00:36:39,600 Speaker 5: it we inch closer and closer to being in a 704 00:36:39,640 --> 00:36:43,799 Speaker 5: mindset or a mind state where solving things violently is 705 00:36:43,880 --> 00:36:48,080 Speaker 5: acceptable because we see just how fucked up and like 706 00:36:48,239 --> 00:36:51,319 Speaker 5: aggro and violent the world is, and just looking at 707 00:36:51,360 --> 00:36:53,120 Speaker 5: all of those things feeding on each other, it's like, 708 00:36:53,200 --> 00:36:55,799 Speaker 5: at some point for me, like I realized that the 709 00:36:55,840 --> 00:36:58,080 Speaker 5: way to even break that kind of cycle was first 710 00:36:58,120 --> 00:37:00,839 Speaker 5: just to have awareness because a lot of the time, 711 00:37:00,880 --> 00:37:02,520 Speaker 5: if I am getting caught up on what I read 712 00:37:02,560 --> 00:37:06,880 Speaker 5: in the headlines, it it feeds on like, you know, 713 00:37:06,920 --> 00:37:10,000 Speaker 5: I start ruminating on things that aren't actually necessarily helpful 714 00:37:10,000 --> 00:37:12,520 Speaker 5: to me because it's also not the reality. Because we 715 00:37:12,560 --> 00:37:14,800 Speaker 5: talk about the quote unquote crime waves that were that 716 00:37:14,840 --> 00:37:16,759 Speaker 5: everyone wanted to talk about in the last year that 717 00:37:16,880 --> 00:37:21,440 Speaker 5: just didn't exist, and it's it's much easier to be like, Okay, 718 00:37:21,480 --> 00:37:23,719 Speaker 5: I know I'm seeing a lot of like over emphasis 719 00:37:23,719 --> 00:37:27,719 Speaker 5: on these like visually really like you know, uh, provocative 720 00:37:27,719 --> 00:37:30,600 Speaker 5: images and videos.
But that's not actually, that's not that's 721 00:37:30,640 --> 00:37:33,680 Speaker 5: not the most accurate depiction of what is happening. 722 00:37:34,040 --> 00:37:39,360 Speaker 1: Yeah, And it's specifically like designed that way, specifically based 723 00:37:39,400 --> 00:37:43,719 Speaker 1: on like it's nothing is based on reality as much 724 00:37:43,719 --> 00:37:47,319 Speaker 1: as it's based on what is going to feed. Like 725 00:37:47,360 --> 00:37:49,680 Speaker 1: nobody was sitting back and being like, what, we're going 726 00:37:49,680 --> 00:37:53,120 Speaker 1: to make up a crime wave? It was that people's 727 00:37:53,680 --> 00:37:57,839 Speaker 1: like that that was the story that made people have 728 00:37:58,040 --> 00:38:02,399 Speaker 1: the most scared reaction, and therefore they clicked on it, 729 00:38:02,520 --> 00:38:05,560 Speaker 1: and therefore they got more of it, and the media 730 00:38:05,880 --> 00:38:08,799 Speaker 1: you know, keeps feeding them. All right, let's let's take 731 00:38:08,920 --> 00:38:13,120 Speaker 1: a quick break and we'll come back. We'll talk about climate, 732 00:38:13,280 --> 00:38:16,800 Speaker 1: which I think plays into this in a bunch of ways. 733 00:38:17,000 --> 00:38:20,880 Speaker 1: And also just you know, are there solutions like what 734 00:38:21,560 --> 00:38:22,640 Speaker 1: can we do? 735 00:38:23,320 --> 00:38:23,520 Speaker 7: No? 736 00:38:24,880 --> 00:38:39,160 Speaker 1: Then nothing and we're back. And so the latest Mission 737 00:38:39,160 --> 00:38:43,000 Speaker 1: Impossible dropped last night or two nights ago. It's getting 738 00:38:43,040 --> 00:38:47,640 Speaker 1: great reviews.
It does indeed seem to be about Ethan 739 00:38:47,760 --> 00:38:52,240 Speaker 1: Hunt battling a self aware AI program known as the Entity, 740 00:38:52,960 --> 00:38:57,000 Speaker 1: which I don't know this like he might as well 741 00:38:57,440 --> 00:39:03,759 Speaker 1: from my perspective, like that is as dramatically impactful as 742 00:39:03,880 --> 00:39:07,720 Speaker 1: like he's battling a wizard this time. It's just like, Okay, 743 00:39:07,960 --> 00:39:12,759 Speaker 1: well that doesn't really make sense to me or like 744 00:39:12,920 --> 00:39:15,279 Speaker 1: does it? It's not like really a thing, and that does. 745 00:39:15,360 --> 00:39:17,439 Speaker 1: When people are critical of the movie, they seem to say, 746 00:39:17,480 --> 00:39:19,960 Speaker 1: like all the stuff about AI kind of drags a 747 00:39:19,960 --> 00:39:26,440 Speaker 1: little bit, but and visually it is depicted as basically 748 00:39:26,480 --> 00:39:29,920 Speaker 1: an evil screensaver from nineteen ninety nine, like it's just 749 00:39:30,680 --> 00:39:35,520 Speaker 1: the black circle with like white dots of light. 750 00:39:36,000 --> 00:39:40,400 Speaker 1: Three hundred million dollar movie, by the way, but it can. So. 751 00:39:40,440 --> 00:39:44,200 Speaker 1: The main claim that the movie makes about this AI, 752 00:39:44,520 --> 00:39:47,960 Speaker 1: and I guess AI in general, is that the program 753 00:39:48,040 --> 00:39:52,319 Speaker 1: can also see the future, like very specifically through its 754 00:39:52,360 --> 00:39:56,520 Speaker 1: predictive technology.
And that's like based on what we're seeing 755 00:39:56,760 --> 00:40:01,880 Speaker 1: in news headlines like the you know, deep learning teaching 756 00:40:01,920 --> 00:40:04,960 Speaker 1: computers to predict the future is like a headline I 757 00:40:04,960 --> 00:40:08,440 Speaker 1: feel like I've seen a hundred times. Spooky. Artificial intelligence 758 00:40:08,480 --> 00:40:10,880 Speaker 1: can accurately predict the future, and it's about to be 759 00:40:10,960 --> 00:40:14,880 Speaker 1: asked more questions. AI can now predict crime before it 760 00:40:14,960 --> 00:40:17,680 Speaker 1: happens as well, we'll get into in a moment. 761 00:40:18,080 --> 00:40:21,879 Speaker 6: And you mean like that movie Minority Report. Also, yeah, 762 00:40:22,120 --> 00:40:26,160 Speaker 6: Tom Cruise Cruise, Yeah, he's he. I feel like his 763 00:40:26,320 --> 00:40:30,440 Speaker 6: sense of reality is probably more blurred by the movies 764 00:40:30,480 --> 00:40:32,879 Speaker 6: he's in than any other movie star. 765 00:40:33,760 --> 00:40:34,959 Speaker 4: Well, yeah, because he has. 766 00:40:34,840 --> 00:40:38,319 Speaker 7: An entire organization blurring his reality. Yeah, before he hits 767 00:40:38,360 --> 00:40:41,439 Speaker 7: set and then you're in that magical landscape where he's 768 00:40:41,440 --> 00:40:44,560 Speaker 7: made himself king. Yeah. I mean that that guy's not 769 00:40:44,560 --> 00:40:45,479 Speaker 7: living on planet Earth. 770 00:40:45,680 --> 00:40:46,480 Speaker 3: He doesn't need to. 771 00:40:47,160 --> 00:40:50,719 Speaker 1: Yeah.
The this movie actually was like written, it was 772 00:40:50,760 --> 00:40:53,560 Speaker 1: supposed to come out like years ago, but the production 773 00:40:53,920 --> 00:40:58,480 Speaker 1: famously shut down due to the coronavirus outbreak, which, according 774 00:40:58,520 --> 00:41:01,120 Speaker 1: to reports in February twenty two, was putting 775 00:41:01,120 --> 00:41:05,960 Speaker 1: a real damper on offshore location shoots. Wow, gave us 776 00:41:06,000 --> 00:41:09,120 Speaker 1: the Tom Cruise thing where it was clear that he 777 00:41:09,160 --> 00:41:13,239 Speaker 1: thought he was the last barrier between like the end 778 00:41:13,280 --> 00:41:17,040 Speaker 1: of the world and you know, coronavirus just like killing 779 00:41:17,080 --> 00:41:20,359 Speaker 1: everyone where he was like shouting at everyone. But first 780 00:41:20,360 --> 00:41:23,040 Speaker 1: of all, I just want to say too, AI, speaking 781 00:41:23,320 --> 00:41:26,200 Speaker 1: of I'd like to issue a big where were you 782 00:41:26,280 --> 00:41:28,880 Speaker 1: on that one, dipshit? to the AI on predicting the 783 00:41:28,920 --> 00:41:33,160 Speaker 1: global pandemic, Like you couldn't We couldn't see that one coming. 784 00:41:33,239 --> 00:41:36,200 Speaker 1: A lot of people saw it coming, people, mere people, 785 00:41:36,440 --> 00:41:40,640 Speaker 1: saw it coming. But yeah, the whole predicting the future 786 00:41:40,680 --> 00:41:46,239 Speaker 1: thing seems to be vastly overstated. Like there's a so 787 00:41:46,320 --> 00:41:49,680 Speaker 1: the MIT researchers who had created a computer that 788 00:41:49,760 --> 00:41:50,800 Speaker 1: could supposedly lose. 789 00:41:51,160 --> 00:41:56,680 Speaker 6: Mission Impossible Technology Mission Impossible Technology am I technology. 790 00:41:57,120 --> 00:42:03,120 Speaker 1: Yes, yeah, they are loosely associated.
But it was basically 791 00:42:03,200 --> 00:42:06,600 Speaker 1: like they were predicting. They would show the computer a 792 00:42:06,640 --> 00:42:12,040 Speaker 1: picture and then the computer would predict, like what would 793 00:42:12,120 --> 00:42:15,360 Speaker 1: happen for the next one point five seconds in that picture? 794 00:42:16,200 --> 00:42:19,400 Speaker 1: And like that it was like someone walking across the 795 00:42:19,400 --> 00:42:21,640 Speaker 1: golf course and they were like, that thing's about to 796 00:42:21,719 --> 00:42:25,400 Speaker 1: keep walking across that golf course. They were like, holy shit, 797 00:42:25,680 --> 00:42:30,560 Speaker 1: you guys. Or like one was a wave crashing, they 798 00:42:30,560 --> 00:42:34,640 Speaker 1: were like, I think it's gonna keep crashing. Call me crazy, 799 00:42:35,239 --> 00:42:39,480 Speaker 1: and they were like, fucking A, do you think it's brilliant? 800 00:42:39,800 --> 00:42:46,800 Speaker 1: Thing's magical and yeah, and then like MIT Mission Possible 801 00:42:46,800 --> 00:42:49,840 Speaker 1: Technology also developed an algorithm that could predict how people 802 00:42:49,880 --> 00:42:52,520 Speaker 1: will greet each other and got a lot of cool 803 00:42:52,520 --> 00:42:55,560 Speaker 1: headlines for that. It was trained on YouTube videos and 804 00:42:55,600 --> 00:42:59,680 Speaker 1: reruns of the Office and Desperate Housewives. Amazing, and it 805 00:42:59,800 --> 00:43:03,399 Speaker 1: was only right just over forty three percent of the time. 806 00:43:03,920 --> 00:43:06,200 Speaker 7: We never greet each other like they do on Desperate Housewives. 807 00:43:06,280 --> 00:43:08,319 Speaker 7: I wish we did, but like, I can't just walk 808 00:43:08,400 --> 00:43:10,160 Speaker 7: up to my enemy and slap them across the face. 809 00:43:10,200 --> 00:43:13,880 Speaker 1: It's not reality, I know, and that sucks.
And hopefully 810 00:43:13,880 --> 00:43:17,680 Speaker 1: we'll get to that world soon. That's my future that 811 00:43:17,719 --> 00:43:21,320 Speaker 1: I'm hopeful for. But so one second away from the greeting, 812 00:43:21,360 --> 00:43:23,439 Speaker 1: they could only predict it forty three percent of the time. 813 00:43:23,760 --> 00:43:27,520 Speaker 1: So this is what I'm always wondering, Like, what how 814 00:43:27,560 --> 00:43:31,120 Speaker 1: does that compare to humans? Humans making the same prediction 815 00:43:31,440 --> 00:43:34,960 Speaker 1: were right seventy one percent of the time. Oh so 816 00:43:36,160 --> 00:43:39,439 Speaker 1: much better than this AI that everyone... Did you guys 817 00:43:39,440 --> 00:43:42,280 Speaker 1: ever see the sixty minutes where they like this sixty 818 00:43:42,280 --> 00:43:44,560 Speaker 1: minutes about AI. I think it came out like back 819 00:43:44,600 --> 00:43:49,719 Speaker 1: in March or April, and like Scott, I think it's 820 00:43:49,800 --> 00:43:53,000 Speaker 1: Scott Pelley, like one of the old journalists on sixty minutes, 821 00:43:53,120 --> 00:43:56,640 Speaker 1: is just like okay, like write a speech for me 822 00:43:56,760 --> 00:44:00,000 Speaker 1: about this, and then he seems to be blown away, 823 00:44:00,200 --> 00:44:03,600 Speaker 1: like the amount of text it produces, like he's like 824 00:44:03,680 --> 00:44:07,719 Speaker 1: that just rolled a whole speech in four seconds, Like 825 00:44:08,200 --> 00:44:13,480 Speaker 1: he's it's like he's never seen a computer before.
But 826 00:44:14,520 --> 00:44:17,960 Speaker 1: I do think we're at a weird place where people 827 00:44:18,040 --> 00:44:21,719 Speaker 1: will you can just be like magic AI is bad 828 00:44:21,760 --> 00:44:25,520 Speaker 1: guy in movie, and everyone's like, yep, that makes sense, 829 00:44:25,760 --> 00:44:28,480 Speaker 1: because like we we should be afraid of AI, but 830 00:44:28,600 --> 00:44:32,160 Speaker 1: people don't really know why necessarily we should be, and 831 00:44:32,239 --> 00:44:35,799 Speaker 1: so we just you know, like it's the same thing 832 00:44:35,840 --> 00:44:39,520 Speaker 1: that happens with all our conspiracy theories, and you know, 833 00:44:39,680 --> 00:44:42,319 Speaker 1: we should be scared about the pandemic, we should be 834 00:44:42,560 --> 00:44:46,040 Speaker 1: scared about the government's response to the pandemic, but people 835 00:44:46,239 --> 00:44:50,000 Speaker 1: create like vast conspiracy theories just to make sure they're 836 00:44:50,040 --> 00:44:54,160 Speaker 1: like scared about it for nonsensical reasons. It's like we 837 00:44:54,560 --> 00:44:57,839 Speaker 1: need that like cognitive dissonance transference happening, and I feel 838 00:44:57,840 --> 00:45:00,520 Speaker 1: like we're starting to see that with AI, Like AI 839 00:45:00,640 --> 00:45:02,839 Speaker 1: is going to be magic and tell us everything that 840 00:45:02,880 --> 00:45:05,000 Speaker 1: could happen to us for the rest of our lives. 841 00:45:05,160 --> 00:45:10,920 Speaker 1: And that's that's not the scary thing about AI. 842 00:45:10,280 --> 00:45:14,680 Speaker 6: Right, The scary thing is like Skynet. And that's the thing. 843 00:45:14,719 --> 00:45:19,160 Speaker 6: Like movies have been having AI as the villain for 844 00:45:19,239 --> 00:45:23,640 Speaker 6: like decades now, and I feel like I don't know 845 00:45:23,680 --> 00:45:27,080 Speaker 6: what my point is gonna be. But remember Terminator one. 
846 00:45:27,640 --> 00:45:30,719 Speaker 6: I remember the Matrix from nineteen ninety nine, over twenty 847 00:45:30,840 --> 00:45:34,520 Speaker 6: years ago. Like, I just think it's funny that AI, 848 00:45:34,719 --> 00:45:41,160 Speaker 6: I guess, hasn't evolved very much in movies. Yeah, I 849 00:45:41,160 --> 00:45:44,200 Speaker 6: don't know, No, I mean for real, like we've been 850 00:45:44,239 --> 00:45:45,400 Speaker 6: knowing, though, is what I'm saying. 851 00:45:45,960 --> 00:45:50,520 Speaker 7: Yeah, And I think it's interesting. You know, movies, by 852 00:45:50,560 --> 00:45:54,080 Speaker 7: their very nature have to heighten and dramatize in order 853 00:45:54,200 --> 00:45:57,680 Speaker 7: to effectively bring home their messages, particularly if we're looking 854 00:45:57,719 --> 00:46:02,960 Speaker 7: at genre films, and the AI has always been like, 855 00:46:03,520 --> 00:46:06,200 Speaker 7: you know, turns into a cop and now it's hunting you, 856 00:46:06,520 --> 00:46:11,440 Speaker 7: or you know, it takes away your freedom of choice. 857 00:46:11,719 --> 00:46:16,520 Speaker 7: You know, if we look at the Matrix, Yes, what's 858 00:46:16,520 --> 00:46:20,839 Speaker 7: the other one, that report? Minority Report, Like they're 859 00:46:21,480 --> 00:46:23,280 Speaker 7: seeing that crime. 860 00:46:23,320 --> 00:46:25,399 Speaker 1: And it's set in the future. It's like a sci 861 00:46:25,440 --> 00:46:30,200 Speaker 1: fi movie, whereas, sorry, Mission Impossible is supposed to be.
862 00:46:30,680 --> 00:46:33,440 Speaker 1: Like even though they can take masks off that make 863 00:46:33,480 --> 00:46:37,040 Speaker 1: them look exactly like the other person like you wouldn't 864 00:46:37,200 --> 00:46:42,160 Speaker 1: see Tom Cruise fighting robots necessarily, or like, no, you 865 00:46:42,200 --> 00:46:46,960 Speaker 1: know that that feels like it's a step into this 866 00:46:47,120 --> 00:46:48,720 Speaker 1: is shit that can actually happen. 867 00:46:49,360 --> 00:46:53,279 Speaker 7: Yeah, yeah, and I appreciate the grounding. I think we 868 00:46:53,600 --> 00:46:57,080 Speaker 7: got there because it's so much more tactile in this moment. 869 00:46:57,200 --> 00:47:00,960 Speaker 7: I think between if you think about your grocery store 870 00:47:00,960 --> 00:47:04,560 Speaker 7: workers who have been concerned about self checkout for like 871 00:47:04,680 --> 00:47:07,279 Speaker 7: over a decade, a lot of those stores are unionized 872 00:47:07,320 --> 00:47:11,160 Speaker 7: and have actively been like, hey, you're taking jobs from 873 00:47:11,280 --> 00:47:14,719 Speaker 7: like checkout counter people, some of whom are elderly, and 874 00:47:14,800 --> 00:47:16,640 Speaker 7: this is like the last place they can work and 875 00:47:16,680 --> 00:47:21,239 Speaker 7: make money consistently. To the writers and actors who may 876 00:47:21,480 --> 00:47:24,160 Speaker 7: be on strike by the time this comes out, and 877 00:47:24,280 --> 00:47:30,120 Speaker 7: they're scared that not only could AI be used to take their job, 878 00:47:30,200 --> 00:47:33,800 Speaker 7: but I mean for performers, can my face be doing 879 00:47:33,880 --> 00:47:37,960 Speaker 7: things long, long, long after I'm dead and what will 880 00:47:38,000 --> 00:47:42,800 Speaker 7: that look like? I think, as we're sort of confronting 881 00:47:42,800 --> 00:47:47,160 Speaker 7: these things, AI is a really freaking dope tool.
I'm 882 00:47:47,200 --> 00:47:50,239 Speaker 7: so much more concerned about these executives who are like, 883 00:47:50,600 --> 00:47:52,719 Speaker 7: oh great, now I don't have to pay people. Did 884 00:47:52,760 --> 00:47:54,760 Speaker 7: you guys see the io9 article that was written 885 00:47:54,760 --> 00:47:57,080 Speaker 7: by AI? It came out a couple days ago. It 886 00:47:57,120 --> 00:47:58,880 Speaker 7: has the Internet in a tangle. 887 00:47:59,320 --> 00:48:04,759 Speaker 1: Yeah, this is like I've ever since I've been you know, 888 00:48:05,360 --> 00:48:09,080 Speaker 1: working on the Internet, creating trying to create like unique 889 00:48:09,360 --> 00:48:13,200 Speaker 1: interesting content. There's also been this push to just flood 890 00:48:13,239 --> 00:48:17,399 Speaker 1: the zone with shit, to quote Steve Bannon. Like we 891 00:48:17,400 --> 00:48:20,200 Speaker 1: we used to, like, Cracked was owned by a company 892 00:48:20,200 --> 00:48:25,160 Speaker 1: that was like also just destroying Google's search results with 893 00:48:25,440 --> 00:48:29,640 Speaker 1: just like fake articles like answers to questions and stuff. 894 00:48:29,640 --> 00:48:31,279 Speaker 1: And they were like trying to get us to like 895 00:48:31,840 --> 00:48:34,360 Speaker 1: do that. Just like there was like scale, how do 896 00:48:34,400 --> 00:48:37,879 Speaker 1: we scale?
And like that's been you know that that 897 00:48:38,000 --> 00:48:43,160 Speaker 1: intersection of tech and media has always been scary because 898 00:48:43,239 --> 00:48:46,719 Speaker 1: it's very easy for them to just create ninety nine 899 00:48:46,760 --> 00:48:52,160 Speaker 1: million articles using technology now that like yeah, pass like 900 00:48:52,520 --> 00:48:54,560 Speaker 1: a little bit like you have to like read a 901 00:48:54,600 --> 00:48:57,600 Speaker 1: paragraph to be like wait this this wasn't written by 902 00:48:57,600 --> 00:48:58,759 Speaker 1: a human, wasn't. Yeah. 903 00:48:58,960 --> 00:49:03,200 Speaker 7: So the article was like ranking the Star Wars movies in 904 00:49:03,320 --> 00:49:07,279 Speaker 7: like chronological order, I think, yeah, And they got the 905 00:49:07,600 --> 00:49:10,440 Speaker 7: dates wrong on many of them. Sometimes they weren't even 906 00:49:10,440 --> 00:49:14,800 Speaker 7: listed in chronological order. The way they were dated. The 907 00:49:14,840 --> 00:49:18,200 Speaker 7: information in individual movies was incorrect. It was just 908 00:49:18,920 --> 00:49:22,960 Speaker 7: error filled, like no logical, sane editor would have been 909 00:49:23,000 --> 00:49:26,520 Speaker 7: like, and publish. And what's crazy is, I don't know, there's 910 00:49:26,600 --> 00:49:29,759 Speaker 7: an entire union that's actively been like, hey, we do 911 00:49:29,800 --> 00:49:31,879 Speaker 7: not want this. It does not make sense to ask 912 00:49:32,080 --> 00:49:36,080 Speaker 7: an AI program that cannot consistently write fact checked articles 913 00:49:36,600 --> 00:49:40,120 Speaker 7: to be publishing these and then asking somebody, an editor, 914 00:49:40,160 --> 00:49:44,200 Speaker 7: to like go through and essentially rewrite this incorrect work 915 00:49:44,239 --> 00:49:47,799 Speaker 7: that isn't benefiting anyone, because who needs this list?
We 916 00:49:47,920 --> 00:49:50,640 Speaker 7: have twenty seven thousand of this very specific Well, this 917 00:49:50,760 --> 00:49:53,879 Speaker 7: is dumb uh and it's wild that, you know. I'm 918 00:49:53,920 --> 00:49:58,640 Speaker 7: glad the Internet roasted the editor in chief pretty intently. 919 00:49:58,680 --> 00:50:00,960 Speaker 7: I'm really hoping that brings them back to the table 920 00:50:01,040 --> 00:50:03,960 Speaker 7: and there can be working discussions about how this company 921 00:50:04,000 --> 00:50:06,440 Speaker 7: decides to use AI in the future. But it's that 922 00:50:06,520 --> 00:50:09,239 Speaker 7: kind of stupid shit that's like really draining, I think 923 00:50:09,239 --> 00:50:11,760 Speaker 7: on workers across job titles. 924 00:50:11,840 --> 00:50:12,880 Speaker 4: It's just like, what are. 925 00:50:12,719 --> 00:50:15,080 Speaker 7: We what are you trying? You just want to force 926 00:50:15,120 --> 00:50:17,680 Speaker 7: people out of work and that's not a logical thing 927 00:50:17,719 --> 00:50:19,560 Speaker 7: for you to do to yourself. 928 00:50:19,960 --> 00:50:21,520 Speaker 4: You mean the product money.
929 00:50:21,320 --> 00:50:24,600 Speaker 1: Yeah, to your point, the product sucks, like the product 930 00:50:24,719 --> 00:50:27,080 Speaker 1: like the AI is bad, it's bad at its 931 00:50:27,080 --> 00:50:30,279 Speaker 1: fucking job, like with regards to like the self checkout thing, 932 00:50:30,680 --> 00:50:34,200 Speaker 1: that the AI like can't stop people from stealing stuff, 933 00:50:34,239 --> 00:50:36,440 Speaker 1: and so the companies are like losing massive amounts of 934 00:50:36,440 --> 00:50:39,839 Speaker 1: money on theft, and then in order to combat that, 935 00:50:39,840 --> 00:50:43,080 Speaker 1: they're just like making claims that like everyone's shoplifting when 936 00:50:43,120 --> 00:50:46,400 Speaker 1: it's just like no, like you just put a like 937 00:50:47,600 --> 00:50:51,719 Speaker 1: inadequate program like as the thing in charge of like 938 00:50:51,760 --> 00:50:55,319 Speaker 1: making sure people can't steal stuff or like that they 939 00:50:55,320 --> 00:50:57,840 Speaker 1: don't want to steal stuff, and it's just not going 940 00:50:57,880 --> 00:50:59,760 Speaker 1: to it's not going to work. 941 00:51:00,320 --> 00:51:03,000 Speaker 7: But they made it way too easy because when I 942 00:51:03,040 --> 00:51:04,840 Speaker 7: was in college, we would for sure ring up like 943 00:51:04,880 --> 00:51:08,080 Speaker 7: a full roast chicken as like some grapes man, we're 944 00:51:08,120 --> 00:51:13,040 Speaker 7: good stop no uh, And there was no possible 945 00:51:13,080 --> 00:51:14,760 Speaker 7: way you could get that; like that used to be somebody's 946 00:51:14,880 --> 00:51:17,320 Speaker 7: job to scan and bag your stuff. There was no possible 947 00:51:17,360 --> 00:51:19,160 Speaker 7: way to get around it. So that's what I'm saying though, 948 00:51:19,200 --> 00:51:21,640 Speaker 7: like none of these things are thought out.
They just 949 00:51:21,680 --> 00:51:24,640 Speaker 7: see a way to potentially preserve money without looking at 950 00:51:24,680 --> 00:51:25,840 Speaker 7: the long term. 951 00:51:25,640 --> 00:51:28,200 Speaker 6: Costs, and then they still have to hire someone to 952 00:51:28,320 --> 00:51:33,359 Speaker 6: monitor the self checkout area, so it's like, well, how 953 00:51:33,440 --> 00:51:37,080 Speaker 6: much money are you actually saving, mister grocery. 954 00:51:37,239 --> 00:51:40,759 Speaker 1: Yeah, big Grocery, which are now owned by just like 955 00:51:40,800 --> 00:51:43,920 Speaker 1: two companies in all of America. But yeah, like so 956 00:51:44,280 --> 00:51:49,520 Speaker 1: they've also been using it in like predictive crime fighting technology. 957 00:51:49,640 --> 00:51:52,440 Speaker 1: But of course, first of all, the data that it's 958 00:51:52,480 --> 00:51:57,160 Speaker 1: being fed is from police data, so it's you know, 959 00:51:57,640 --> 00:52:02,320 Speaker 1: using the it's not predicting. It's predicting like where police 960 00:52:02,480 --> 00:52:06,520 Speaker 1: are spending their time is essentially which is in low 961 00:52:06,560 --> 00:52:11,319 Speaker 1: income and you know, non white neighborhoods, And it's just 962 00:52:11,640 --> 00:52:17,480 Speaker 1: complete bullshit. There's one great anecdote where Chicago's predictive policing software, 963 00:52:17,520 --> 00:52:20,359 Speaker 1: created by the Illinois Institute of Technology, compiled a list 964 00:52:20,400 --> 00:52:22,720 Speaker 1: of people most likely to be involved in a violent 965 00:52:22,800 --> 00:52:26,960 Speaker 1: crime and they're, oh, no, a AI, baby, check out 966 00:52:26,960 --> 00:52:30,520 Speaker 1: this because it was generated by AI, you can trust this. 
967 00:52:30,880 --> 00:52:33,719 Speaker 1: And then an investigation later found that the software's 968 00:52:33,760 --> 00:52:37,880 Speaker 1: list of future criminals included every single person arrested or 969 00:52:37,960 --> 00:52:42,560 Speaker 1: fingerprinted in Chicago since twenty thirteen. WHOA, So it was 970 00:52:42,680 --> 00:52:46,560 Speaker 1: just like copying off of another like data set. All A 971 00:52:46,800 --> 00:52:47,480 Speaker 1: I can do. 972 00:52:47,480 --> 00:52:52,160 Speaker 7: Is it can gather and redistribute information using keywords that 973 00:52:52,920 --> 00:52:57,239 Speaker 7: the person using it inputs, which means it's logically going 974 00:52:57,280 --> 00:52:59,719 Speaker 7: to be flawed because it's not understanding the words, it's 975 00:52:59,760 --> 00:53:02,080 Speaker 7: just organizing them in a way that it thinks makes 976 00:53:02,160 --> 00:53:06,800 Speaker 7: sense to you. It's not a thinking machine. Yeah, the 977 00:53:06,880 --> 00:53:09,400 Speaker 7: technology is not there yet, and people are way too 978 00:53:09,480 --> 00:53:12,040 Speaker 7: eager to jump into it and in a way that 979 00:53:12,080 --> 00:53:15,080 Speaker 7: would almost assuredly collapse society, which is why it's insane 980 00:53:15,120 --> 00:53:17,239 Speaker 7: to me that we're still having to like explain this 981 00:53:17,320 --> 00:53:20,120 Speaker 7: to people to be like they're like, did you see 982 00:53:20,120 --> 00:53:23,480 Speaker 7: the article from was it the Hollywood Reporter that came 983 00:53:23,520 --> 00:53:26,279 Speaker 7: out about the executives and how they're choosing to fight 984 00:53:26,320 --> 00:53:26,840 Speaker 7: the unions. 985 00:53:27,200 --> 00:53:28,880 Speaker 1: Yeah, by just waiting them out. 986 00:53:29,239 --> 00:53:33,520 Speaker 7: Yeah, essentially, And the key part that's resonated with me 987 00:53:33,680 --> 00:53:36,040 Speaker 7: was like, this is a necessary evil.
988 00:53:36,680 --> 00:53:36,919 Speaker 1: Yeah. 989 00:53:37,000 --> 00:53:39,360 Speaker 7: It was the verbatim words from this quoted story. So 990 00:53:39,440 --> 00:53:42,200 Speaker 7: we're pretty sure this is a scare tactic to get 991 00:53:42,239 --> 00:53:46,160 Speaker 7: SAG to not strike and them saying, hey, we'll wait out 992 00:53:46,200 --> 00:53:48,520 Speaker 7: the writers, we'll wait out the actors as well. But 993 00:53:49,040 --> 00:53:52,000 Speaker 7: I think what's really happened is they fired up the 994 00:53:52,040 --> 00:53:55,920 Speaker 7: base of unions to be like, excuse me, a necessary 995 00:53:56,040 --> 00:54:00,360 Speaker 7: evil is not destroying the beautiful industry that 996 00:54:00,400 --> 00:54:04,040 Speaker 7: people have tried to create here for literally over a century. 997 00:54:04,480 --> 00:54:07,520 Speaker 7: It's nonsensical. I just I can't understand the end game 998 00:54:08,040 --> 00:54:10,960 Speaker 7: other than I guess forty people will be a little 999 00:54:11,000 --> 00:54:11,719 Speaker 7: bit wealthier. 1000 00:54:12,239 --> 00:54:14,439 Speaker 1: Yeah, that's all it is. Yeah, I don't think it's 1001 00:54:14,440 --> 00:54:17,120 Speaker 1: going to cause society to collapse. It'll just accelerate the 1002 00:54:17,160 --> 00:54:21,320 Speaker 1: thing that's already happening, which is just like grotesque inequality 1003 00:54:21,480 --> 00:54:24,960 Speaker 1: like gets worse and like they're able to hire fewer 1004 00:54:25,000 --> 00:54:29,600 Speaker 1: people to do jobs. And also the other thing that 1005 00:54:29,600 --> 00:54:32,040 Speaker 1: that seems to be happening is like the quality of 1006 00:54:32,120 --> 00:54:36,760 Speaker 1: everything is getting worse, and like that's going to keep happening, 1007 00:54:36,920 --> 00:54:39,040 Speaker 1: you know, like TV shows, movies.
1008 00:54:38,800 --> 00:54:42,719 Speaker 7: Or But to my point, isn't that like if we 1009 00:54:42,760 --> 00:54:45,880 Speaker 7: can track revolutions based off the price of bread, right, Like, 1010 00:54:45,920 --> 00:54:48,200 Speaker 7: so when people can no longer afford bread, that's when 1011 00:54:48,239 --> 00:54:50,440 Speaker 7: they riot in the streets and take out leaders and 1012 00:54:50,520 --> 00:54:54,120 Speaker 7: get very serious about their actions because everybody's base staple 1013 00:54:54,160 --> 00:54:59,280 Speaker 7: for survival. Then we have a fairly largely the entertainment 1014 00:54:59,280 --> 00:55:01,560 Speaker 7: community is huge, which I know some people think this 1015 00:55:01,640 --> 00:55:04,720 Speaker 7: is like a very small or elite space, but it's ginormous, 1016 00:55:04,760 --> 00:55:06,600 Speaker 7: and most of the people working in it are like 1017 00:55:06,760 --> 00:55:08,160 Speaker 7: working class people. 1018 00:55:08,560 --> 00:55:09,239 Speaker 1: And so if you. 1019 00:55:09,239 --> 00:55:12,959 Speaker 7: eliminate everyone who I mean, because essentially, what when AI 1020 00:55:13,080 --> 00:55:17,719 Speaker 7: gets good enough, when it's actually usable, it will eliminate 1021 00:55:17,760 --> 00:55:22,440 Speaker 7: everybody who is of average talent. Right. What you do 1022 00:55:22,560 --> 00:55:25,800 Speaker 7: is you write pretty cute Hallmark stories and you're pretty 1023 00:55:25,800 --> 00:55:29,000 Speaker 7: good at that, and they're successful enough to get commercials 1024 00:55:29,000 --> 00:55:31,480 Speaker 7: into air frequently. I think you can probably buy a house, 1025 00:55:31,520 --> 00:55:33,080 Speaker 7: and you can feed your kids and send them to college, 1026 00:55:33,080 --> 00:55:35,439 Speaker 7: and you can have a very comfortable life.
If AI 1027 00:55:35,560 --> 00:55:38,920 Speaker 7: can do that job, then you're wiping out hundreds of 1028 00:55:38,920 --> 00:55:41,600 Speaker 7: thousands of jobs, and that is really really bad news 1029 00:55:41,600 --> 00:55:45,360 Speaker 7: for like Los Angeles, the county. I just to me, 1030 00:55:45,560 --> 00:55:47,759 Speaker 7: I know, it's like a small first step, but to me, 1031 00:55:47,880 --> 00:55:53,640 Speaker 7: like the dominoes seem large and looming absolutely like probable, 1032 00:55:53,840 --> 00:55:57,479 Speaker 7: like if things are not stopped immediately. And I really 1033 00:55:57,480 --> 00:56:00,480 Speaker 7: think that that's the takeaway from the WGA strike, 1034 00:56:00,680 --> 00:56:04,279 Speaker 7: and I really do hope SAG joins because it's not 1035 00:56:04,320 --> 00:56:06,040 Speaker 7: just going to stop at the entertainment industry. It's going 1036 00:56:06,120 --> 00:56:07,880 Speaker 7: to continue on to all of it, like we are 1037 00:56:08,400 --> 00:56:11,520 Speaker 7: one American workforce and we're all about to be impacted 1038 00:56:11,520 --> 00:56:11,920 Speaker 7: by this shit. 1039 00:56:11,960 --> 00:56:15,600 Speaker 1: It's great, all right, that's gonna do it for this 1040 00:56:15,760 --> 00:56:20,560 Speaker 1: week's Weekly Zeitgeist. Please like and review the show if 1041 00:56:20,600 --> 00:56:24,759 Speaker 1: you like the show. Means the world to Miles. He 1042 00:56:24,960 --> 00:56:28,880 Speaker 1: needs your validation, folks. I hope you're having a great 1043 00:56:28,920 --> 00:57:12,520 Speaker 1: weekend and I will talk to you Monday. Bye.