1 00:00:05,160 --> 00:00:07,360 Speaker 1: Hey, this is Annie and Samantha, and welcome to Stuff 2 00:00:07,360 --> 00:00:18,800 Speaker 1: Mom Never Told You, a production of iHeartRadio, and 3 00:00:18,880 --> 00:00:21,280 Speaker 1: we are once again so happy to be joined by 4 00:00:21,360 --> 00:00:24,560 Speaker 1: the marvelous, the magnificent Bridget Todd. Welcome, Bridget. 5 00:00:24,960 --> 00:00:27,880 Speaker 2: Thanks for having me back. Always such a pleasure. 6 00:00:28,120 --> 00:00:31,320 Speaker 1: It is, it really is. I look forward to talking 7 00:00:31,320 --> 00:00:34,440 Speaker 1: to you every time, even though the topics we tackle 8 00:00:35,600 --> 00:00:38,760 Speaker 1: are famously not so. It's not so easy. 9 00:00:39,159 --> 00:00:41,680 Speaker 3: I always say this, I swear I am a happy 10 00:00:41,720 --> 00:00:44,680 Speaker 3: person who is drawn to happy things. 11 00:00:44,240 --> 00:00:47,040 Speaker 2: Something about, let's just cut, let's just be real. 12 00:00:47,200 --> 00:00:51,920 Speaker 3: It's, there's a lot of misery out there, and sometimes 13 00:00:52,000 --> 00:0054,560 Speaker 3: I end up covering it on podcasts, but I am 14 00:00:54,600 --> 00:00:56,920 Speaker 3: a happy person who is drawn to happy things. 15 00:00:56,960 --> 00:00:58,560 Speaker 2: Don't get it twisted exactly. 16 00:00:58,600 --> 00:01:02,080 Speaker 4: The reality is, as much as like we would love 17 00:01:02,120 --> 00:01:04,440 Speaker 4: to be happy-go-lucky, all of the things that 18 00:01:04,480 --> 00:01:07,319 Speaker 4: are happening we can't ignore. And it just happens that 19 00:01:07,400 --> 00:01:10,399 Speaker 4: these things are really gross and sad. And you know, 20 00:01:10,480 --> 00:01:14,119 Speaker 4: if I have to be given sad news or really 21 00:01:14,160 --> 00:01:17,840 Speaker 4: gross information, having someone as upbeat as you bringing it 22 00:01:17,880 --> 00:01:19,920 Speaker 4: to me, Bridget, is kind of nice.
Because I am 23 00:01:20,120 --> 00:01:20,839 Speaker 4: a sad person. 24 00:01:21,280 --> 00:01:24,119 Speaker 2: I am drawn to the darkness. 25 00:01:24,720 --> 00:01:28,080 Speaker 4: I am a bit gloomy, so having your perspective, which 26 00:01:28,120 --> 00:01:30,440 Speaker 4: puts it in such a very factual but 27 00:01:30,319 --> 00:01:33,840 Speaker 1: also like realistic way, but like, here's what 28 00:01:33,880 --> 00:01:36,840 Speaker 4: could be solutions, here could be like the good possibilities, 29 00:01:36,959 --> 00:01:39,800 Speaker 4: and here are the realities. It's nice to be given 30 00:01:39,840 --> 00:01:41,720 Speaker 4: that kind of news from someone like you. So 31 00:01:41,840 --> 00:01:43,720 Speaker 4: I appreciate that you bring this to us. 32 00:01:44,120 --> 00:01:46,319 Speaker 2: That makes me so happy to hear. 33 00:01:46,360 --> 00:01:48,400 Speaker 3: I mean, that really is what I try to do there. 34 00:01:48,440 --> 00:01:52,280 Speaker 3: Like, there's no need for panic, even though things are 35 00:01:52,320 --> 00:01:54,640 Speaker 3: tough and scary, but you still need to know what's 36 00:01:54,680 --> 00:01:58,560 Speaker 3: going on. And also I think sometimes the conversations that 37 00:01:58,600 --> 00:02:02,400 Speaker 3: are happening these days are so grim that it's tempting 38 00:02:02,400 --> 00:02:04,280 Speaker 3: to just sort of check out from them. But we 39 00:02:04,360 --> 00:02:07,680 Speaker 3: really do need to understand how these things impact things 40 00:02:07,800 --> 00:02:11,640 Speaker 3: like gender and things like identity and who gets to 41 00:02:11,760 --> 00:02:13,560 Speaker 3: show up as themselves and who doesn't. 42 00:02:13,720 --> 00:02:14,680 Speaker 2: You know, I think it's important. 43 00:02:14,720 --> 00:02:20,040 Speaker 1: So thank you, yes, well, thank you. You know, here's 44 00:02:20,080 --> 00:02:22,519 Speaker 1: the million dollar question. How have you been, Bridget?
45 00:02:22,960 --> 00:02:26,639 Speaker 2: Oh god, this is like that Broad City. I mean, 46 00:02:26,680 --> 00:02:27,600 Speaker 2: how am I? 47 00:02:30,040 --> 00:02:30,079 Speaker 1: No? 48 00:02:30,280 --> 00:02:30,799 Speaker 2: Things are good. 49 00:02:30,840 --> 00:02:34,400 Speaker 3: I actually just came back from a trip to Spain. 50 00:02:34,600 --> 00:02:37,520 Speaker 3: I was there doing a live show for this other podcast 51 00:02:37,520 --> 00:02:40,600 Speaker 3: that I work on called IRL with Mozilla Foundation, which 52 00:02:40,639 --> 00:02:42,399 Speaker 3: was awesome. It was my first time in Spain. I went 53 00:02:42,440 --> 00:02:45,480 Speaker 3: to Barcelona, which was lovely and wonderful. 54 00:02:45,520 --> 00:02:47,560 Speaker 2: I didn't want to come back. We might. 55 00:02:47,639 --> 00:02:49,440 Speaker 3: This was the first time it's ever happened to me. 56 00:02:49,840 --> 00:02:52,880 Speaker 3: We were supposed to land in Barcelona. We tried to 57 00:02:52,960 --> 00:02:59,000 Speaker 3: land in Barcelona through very atypical lightning storms, and it 58 00:02:59,080 --> 00:03:02,960 Speaker 3: was one of those situations where, you know, 59 00:03:03,080 --> 00:03:05,120 Speaker 3: everybody on the plane looked out the window and just 60 00:03:05,120 --> 00:03:07,160 Speaker 3: saw these big bolts of lightning, and we all just 61 00:03:07,160 --> 00:03:10,280 Speaker 3: sort of made a quiet peace with our gods. Then 62 00:03:10,919 --> 00:03:13,560 Speaker 3: the plane went back up in the air and the 63 00:03:13,600 --> 00:03:15,880 Speaker 3: pilot was like, yeah, we can't land in Barcelona, so 64 00:03:15,880 --> 00:03:18,160 Speaker 3: we're landing in Madrid. So I got to go on 65 00:03:18,200 --> 00:03:20,880 Speaker 3: a bonus, a bonus trip to Madrid.
First time I've 66 00:03:20,919 --> 00:03:23,600 Speaker 3: ever experienced that, a plane landing in a different place 67 00:03:23,600 --> 00:03:24,919 Speaker 3: than where you thought it was gonna land. 68 00:03:25,600 --> 00:03:27,160 Speaker 4: Okay, I think this is the only time that would 69 00:03:27,160 --> 00:03:28,440 Speaker 4: be warranted for y'all to clap. 70 00:03:31,880 --> 00:03:34,240 Speaker 3: Somebody sitting next to me burst into tears, like 71 00:03:34,320 --> 00:03:36,600 Speaker 3: it was like unfair. It really was a moment of 72 00:03:36,680 --> 00:03:39,200 Speaker 3: like, are we gonna make it, right? 73 00:03:39,640 --> 00:03:42,080 Speaker 4: Everybody's quiet in eeriness because they're like, oh god, what 74 00:03:42,120 --> 00:03:42,800 Speaker 4: do we... Yeah. 75 00:03:43,560 --> 00:03:46,800 Speaker 3: That was, that was the creepy part, is that people 76 00:03:46,880 --> 00:03:50,720 Speaker 3: were like gasping, and a few people like shrieked 77 00:03:50,800 --> 00:03:51,400 Speaker 3: or screamed. 78 00:03:51,520 --> 00:03:53,960 Speaker 2: But at a certain point everybody just got very quiet. 79 00:03:54,040 --> 00:03:56,360 Speaker 3: And that's when you know, you're like, oh, we 80 00:03:56,400 --> 00:03:57,320 Speaker 3: are in a situation. 81 00:03:57,920 --> 00:03:58,720 Speaker 1: Be real. 82 00:04:00,960 --> 00:04:02,360 Speaker 2: But it ended up being fine, and I got to 83 00:04:02,360 --> 00:04:04,600 Speaker 2: see Madrid for the first time, so I'm all good. 84 00:04:05,360 --> 00:04:07,720 Speaker 4: Nice. We love a good twist, a happy ending. 85 00:04:07,840 --> 00:04:09,160 Speaker 2: Yeah, how have you two been, then? 86 00:04:10,600 --> 00:04:13,280 Speaker 1: Well, Samantha's apparently, you 87 00:04:13,280 --> 00:04:15,920 Speaker 4: know my answer was not pretty.
And for the listeners 88 00:04:15,920 --> 00:04:19,000 Speaker 4: who have listened consistently and have just recently heard my 89 00:04:19,040 --> 00:04:21,200 Speaker 4: happy hour, which was thirty minutes long of me bitching 90 00:04:21,200 --> 00:04:25,719 Speaker 4: and whining about adulthood, I think they know. But you 91 00:04:25,800 --> 00:04:28,560 Speaker 4: know what, we are together, so I am doing much 92 00:04:28,560 --> 00:04:31,159 Speaker 4: better right now than I was twenty minutes ago. I 93 00:04:31,160 --> 00:04:31,920 Speaker 4: will say that. 94 00:04:32,440 --> 00:04:33,920 Speaker 2: More like an unhappy hour. 95 00:04:34,760 --> 00:04:35,440 Speaker 1: It really was. 96 00:04:36,120 --> 00:04:40,440 Speaker 4: We named that specific session whine and wine because we have 97 00:04:40,520 --> 00:04:43,480 Speaker 4: some wine and then, and then I say 98 00:04:43,279 --> 00:04:47,320 Speaker 1: things. Well, there's a power in unloading yourself like that. 99 00:04:47,920 --> 00:04:50,279 Speaker 1: I'm okay, yeah, I'm good. I've had a lot of 100 00:04:50,320 --> 00:04:52,599 Speaker 1: good times with friends lately, and I appreciate that. That's 101 00:04:52,640 --> 00:04:55,320 Speaker 1: nice. I've gotten to see some people I haven't 102 00:04:55,320 --> 00:04:58,839 Speaker 1: seen in a while, and it's been nice. I'm pretty tired, 103 00:04:58,880 --> 00:04:59,960 Speaker 1: but it's been good. 104 00:05:01,080 --> 00:05:05,720 Speaker 3: Yeah, especially as we go into the dark months. I mean, 105 00:05:05,720 --> 00:05:07,800 Speaker 3: we still have the holidays. I feel like after the 106 00:05:07,839 --> 00:05:10,960 Speaker 3: holidays is when the real slog begins. But holding onto 107 00:05:10,960 --> 00:05:13,719 Speaker 3: those small comforts, like getting to see friends, you know, 108 00:05:14,800 --> 00:05:17,279 Speaker 3: I think that really is very important.
109 00:05:17,839 --> 00:05:22,680 Speaker 1: Yeah, and especially given some of the news right now 110 00:05:22,720 --> 00:05:25,760 Speaker 1: and some of the dark, dark times and dark things 111 00:05:25,760 --> 00:05:28,159 Speaker 1: we're going to talk about in this episode. 112 00:05:28,520 --> 00:05:32,360 Speaker 3: That is right. Just a content warning up top, because 113 00:05:32,440 --> 00:05:37,920 Speaker 3: we're talking about convicted sex criminal Jeffrey Epstein, who I 114 00:05:37,880 --> 00:05:39,400 Speaker 2: feel like is everywhere in the news. 115 00:05:39,480 --> 00:05:42,720 Speaker 3: You really kind of can't avoid the conversation around him. 116 00:05:43,360 --> 00:05:46,400 Speaker 3: Today is November twentieth, so by the time you hear this, 117 00:05:46,839 --> 00:05:49,440 Speaker 3: there is some chance that things might have changed. But 118 00:05:49,680 --> 00:05:53,479 Speaker 3: here is the latest as of today. So yesterday, the 119 00:05:53,520 --> 00:05:56,960 Speaker 3: House almost unanimously voted to release the Epstein files, with 120 00:05:57,040 --> 00:06:01,279 Speaker 3: only one dissenting vote, Republican Representative Clay Higgins of Louisiana. 121 00:06:02,160 --> 00:06:03,159 Speaker 3: So curious what's going on 122 00:06:03,200 --> 00:06:04,200 Speaker 2: with him. I know, I looked at it. 123 00:06:04,240 --> 00:06:07,320 Speaker 3: I was like, oh, you're just a major Trumper. Even 124 00:06:07,360 --> 00:06:10,920 Speaker 3: for Republicans, he seems like a quite extreme fellow. 125 00:06:11,080 --> 00:06:13,360 Speaker 3: So I guess I shouldn't be surprised, but it is 126 00:06:13,400 --> 00:06:16,080 Speaker 3: wild to be like, oh, even among Republicans, just one 127 00:06:16,120 --> 00:06:17,400 Speaker 3: dissenting vote. 128 00:06:17,640 --> 00:06:20,800 Speaker 2: So that bill then went to Trump's desk.
129 00:06:20,920 --> 00:06:22,920 Speaker 3: There was some back and forth over whether Trump was 130 00:06:22,960 --> 00:06:25,719 Speaker 3: going to sign it, but he did. So what happens 131 00:06:25,760 --> 00:06:28,800 Speaker 3: now is the Department of Justice has thirty days to 132 00:06:28,880 --> 00:06:30,159 Speaker 3: release these files. 133 00:06:33,040 --> 00:06:35,640 Speaker 4: I'm trying not to be a conspiracist, but I swear, 134 00:06:35,880 --> 00:06:44,159 Speaker 4: the more I read, especially on this specific topic, the 135 00:06:44,200 --> 00:06:46,359 Speaker 4: thirty days seems too long. I know there has to 136 00:06:46,400 --> 00:06:49,440 Speaker 4: be some jurisdiction and all of that, but it feels 137 00:06:49,480 --> 00:06:51,640 Speaker 4: like there's too much happening, and thirty days is giving 138 00:06:51,680 --> 00:06:56,560 Speaker 4: them too much time, including the fight. Are they 139 00:06:56,560 --> 00:06:58,960 Speaker 4: actually going to release it? That's my question. 140 00:06:59,279 --> 00:07:01,479 Speaker 3: Yeah, that seems to be the big question. I 141 00:07:01,480 --> 00:07:03,960 Speaker 3: should say, I am no attorney, so this is somewhat 142 00:07:03,960 --> 00:07:06,320 Speaker 3: above my pay grade. But here's a little bit from CNN. 143 00:07:06,680 --> 00:07:10,640 Speaker 3: Despite Trump signing the bill, uncertainty remains. Attorney General Pam 144 00:07:10,640 --> 00:07:13,000 Speaker 3: Bondi said yesterday that the Department of Justice will quote 145 00:07:13,120 --> 00:07:16,520 Speaker 3: follow the law, but some lawmakers and analysts are worried 146 00:07:16,520 --> 00:07:19,600 Speaker 3: that the Trump administration may try to hinder the process 147 00:07:19,640 --> 00:07:23,360 Speaker 3: by slowing the release or redacting information. So, I don't 148 00:07:23,360 --> 00:07:28,600 Speaker 3: know, above my pay grade, but your worries are grounded.
149 00:07:28,720 --> 00:07:31,960 Speaker 3: You're not the only person with that question. That seems 150 00:07:31,960 --> 00:07:34,440 Speaker 3: to be the question on everybody's mind. And in any event, 151 00:07:34,800 --> 00:07:38,000 Speaker 3: this is still all like a very big reversal from 152 00:07:38,120 --> 00:07:41,400 Speaker 3: the time very recently when Trump was calling this entire 153 00:07:41,440 --> 00:07:44,520 Speaker 3: thing a Democrat hoax, when Pam Bondi was like, oh, 154 00:07:44,520 --> 00:07:46,760 Speaker 3: there's not going to be any additional information released. 155 00:07:46,800 --> 00:07:48,280 Speaker 2: We looked into it. We don't need to release any 156 00:07:48,320 --> 00:07:48,880 Speaker 2: more information. 157 00:07:49,520 --> 00:07:53,560 Speaker 3: But then Trump was also recently vowing to investigate any 158 00:07:53,720 --> 00:07:56,800 Speaker 3: Democrats accused of breaking the law in the Epstein files, 159 00:07:57,000 --> 00:07:59,160 Speaker 3: which, like, if the entire thing was just a big 160 00:07:59,160 --> 00:08:02,520 Speaker 3: Democrat hoax, why would there be Democrats in the files? 161 00:08:02,640 --> 00:08:05,120 Speaker 3: I'm not fully sure that he thought this one through, 162 00:08:05,280 --> 00:08:08,600 Speaker 3: but there you have it. And I do think it's 163 00:08:08,640 --> 00:08:12,560 Speaker 3: this weird thing where people, in an attempt to downplay 164 00:08:12,560 --> 00:08:16,200 Speaker 3: this entire thing, are saying, well, what if this takes 165 00:08:16,240 --> 00:08:21,200 Speaker 3: down a prominent Democrat like Bill Clinton, to which I say, great, like, 166 00:08:21,240 --> 00:08:23,920 Speaker 3: get him out of here.
If Bill Clinton committed a 167 00:08:23,920 --> 00:08:26,520 Speaker 3: crime and that is revealed in the Epstein files, take 168 00:08:26,600 --> 00:08:30,200 Speaker 3: him to prison, right, Like, I don't think anybody is saying, no, 169 00:08:30,520 --> 00:08:33,760 Speaker 3: we need to protect prominent democrats like Bill Clinton from 170 00:08:33,760 --> 00:08:37,400 Speaker 3: accountability if they are named as having committed sex crimes 171 00:08:37,640 --> 00:08:41,120 Speaker 3: in the Epstein files. And also, do people really think 172 00:08:41,120 --> 00:08:43,920 Speaker 3: that most leftists are like clinging to the legacy of 173 00:08:43,960 --> 00:08:44,720 Speaker 3: Bill Clinton. 174 00:08:45,160 --> 00:08:46,520 Speaker 2: I saw this great post. 175 00:08:46,360 --> 00:08:48,760 Speaker 3: That was like, oh no, then I would have to 176 00:08:48,760 --> 00:08:51,880 Speaker 3: take down my Bill Clinton yard sign, burn my Bill 177 00:08:51,920 --> 00:08:54,960 Speaker 3: Clinton hat and T shirt, take off my Bill Clinton 178 00:08:55,120 --> 00:08:57,840 Speaker 3: car wrap, Like what are these people talking about? This 179 00:08:57,880 --> 00:08:59,240 Speaker 3: is completely like a fiction. 
180 00:09:00,360 --> 00:09:03,080 Speaker 4: It really does baffle me, I guess, in the sense 181 00:09:03,120 --> 00:09:08,160 Speaker 4: of like the people thinking, just because they have 182 00:09:08,400 --> 00:09:11,520 Speaker 4: idolized a politician to the point of being like, this 183 00:09:11,559 --> 00:09:14,120 Speaker 4: is God's son, literally putting up pictures like he's 184 00:09:14,160 --> 00:09:16,880 Speaker 4: the new Jesus, that we all think that way 185 00:09:16,960 --> 00:09:20,080 Speaker 4: about anyone that we voted for, instead of understanding that a 186 00:09:20,080 --> 00:09:22,320 Speaker 4: lot of the choices that we make are the lesser 187 00:09:22,320 --> 00:09:25,480 Speaker 4: evil, and hoping that we can negotiate and hold 188 00:09:25,520 --> 00:09:29,000 Speaker 4: our people accountable to follow through with the promises they made. 189 00:09:29,040 --> 00:09:32,200 Speaker 4: Like, it's not this level of, like, these are 190 00:09:32,200 --> 00:09:34,640 Speaker 4: people who work for us. These are people who we 191 00:09:34,679 --> 00:09:36,720 Speaker 4: are supposed to be able to, like, count on to 192 00:09:36,760 --> 00:09:40,440 Speaker 4: protect our rights, not people I pray to because I 193 00:09:40,480 --> 00:09:43,160 Speaker 4: think you're going to do something to other people that 194 00:09:43,200 --> 00:09:45,240 Speaker 4: I don't like or I don't agree with. That's 195 00:09:45,280 --> 00:09:47,400 Speaker 4: the other part to that, is that their worship for 196 00:09:47,520 --> 00:09:49,439 Speaker 4: him is not because they think he's going to do 197 00:09:49,480 --> 00:09:51,640 Speaker 4: something great for them, it's that he's going to be 198 00:09:51,760 --> 00:09:54,319 Speaker 4: mean to other people for them, exactly. 199 00:09:54,720 --> 00:09:56,400 Speaker 2: Oh, this is a bit of a non sequitur.
200 00:09:56,480 --> 00:09:59,520 Speaker 3: But over the summer, I was at Ocean City, Maryland, 201 00:09:59,559 --> 00:10:01,760 Speaker 3: which is just like kind of a trashy beach 202 00:10:01,840 --> 00:10:04,760 Speaker 3: town, but I fully love it and I go there every summer, 203 00:10:05,080 --> 00:10:07,480 Speaker 3: and they have all these boardwalk t-shirt shops, and 204 00:10:08,000 --> 00:10:12,760 Speaker 3: every one of them has such prominent Trump memorabilia, Trump 205 00:10:12,520 --> 00:10:15,079 Speaker 2: hats, Trump shirts. It's like Trump Trump Trump Trump Trump. 206 00:10:15,280 --> 00:10:17,000 Speaker 3: And I remember thinking, I want to go into one 207 00:10:17,000 --> 00:10:19,200 Speaker 3: of these stores and be like, excuse me, where's your 208 00:10:19,240 --> 00:10:22,640 Speaker 3: Hakeem Jeffries section? Like, there's not. You're so right 209 00:10:22,679 --> 00:10:25,560 Speaker 3: that there's not an analogous thing on the left where 210 00:10:25,600 --> 00:10:30,319 Speaker 3: people are buying, like, you know, Hakeem Jeffries car wraps 211 00:10:30,360 --> 00:10:33,240 Speaker 3: and stuff. Like, Ocean City boardwalks are not cluttered with 212 00:10:34,080 --> 00:10:35,640 Speaker 3: memorabilia about AOC. 213 00:10:37,400 --> 00:10:41,520 Speaker 4: It's just an interesting level. Also, again with your 214 00:10:41,559 --> 00:10:44,880 Speaker 4: non sequitur, I love that everyone, and I mean everyone, 215 00:10:45,120 --> 00:10:49,000 Speaker 4: takes advantage of that level of consumerism. Be like, yeah, 216 00:10:49,000 --> 00:10:51,200 Speaker 4: they'll buy those kitschy products, so sell it to everybody, because 217 00:10:51,240 --> 00:10:53,640 Speaker 4: they all buy it. I'm gonna make money. You're gonna 218 00:10:53,679 --> 00:10:56,520 Speaker 4: be that ridiculous? I'm gonna sell it to you. 219 00:10:56,280 --> 00:10:58,000 Speaker 2: Get your money.
I'm not even mad at it, to 220 00:10:58,000 --> 00:10:58,720 Speaker 2: be honest with you. 221 00:11:00,040 --> 00:11:02,600 Speaker 3: So the thing about whether or not prominent people, including 222 00:11:02,600 --> 00:11:05,280 Speaker 3: Democrats like Bill Clinton, would be involved in the Epstein 223 00:11:05,320 --> 00:11:08,240 Speaker 3: files is that Epstein, this was his total M.O. He 224 00:11:08,320 --> 00:11:12,360 Speaker 3: intentionally had deep ties to all kinds of prominent, wealthy, 225 00:11:12,440 --> 00:11:14,880 Speaker 3: powerful people on all sides of the aisle. This is 226 00:11:14,920 --> 00:11:19,120 Speaker 3: the same thing we saw from other convicted sex criminal 227 00:11:19,240 --> 00:11:23,880 Speaker 3: Sean "Diddy" Combs: you know, surround yourself with wealthy, prominent, powerful 228 00:11:23,920 --> 00:11:28,560 Speaker 3: people, and that both becomes this shield from accountability for 229 00:11:28,640 --> 00:11:30,959 Speaker 3: your crimes, and then also you can get dirt on 230 00:11:31,000 --> 00:11:33,120 Speaker 3: all these other powerful people and sort of have an 231 00:11:33,200 --> 00:11:36,640 Speaker 3: extra layer of power and protection. So it is pretty 232 00:11:36,840 --> 00:11:40,080 Speaker 3: likely that if the entirety of who did what is released, 233 00:11:40,080 --> 00:11:41,520 Speaker 2: there would be powerful 234 00:11:41,080 --> 00:11:43,960 Speaker 3: people on all sides of the political spectrum, and also 235 00:11:44,040 --> 00:11:47,960 Speaker 3: just, like, wealthy, famous non-political people, because that's how 236 00:11:48,040 --> 00:11:53,080 Speaker 3: Epstein rolled. But I want to talk about one prominent 237 00:11:53,200 --> 00:11:56,880 Speaker 3: Democrat who we know had a connection with Epstein and 238 00:11:56,960 --> 00:11:59,480 Speaker 3: what that says about our current tech climate. 239 00:11:59,600 --> 00:12:01,520 Speaker 2: And that person is Larry
240 00:12:01,320 --> 00:12:05,560 Speaker 1: Summers, yep, who I did not know by name. I 241 00:12:05,640 --> 00:12:07,920 Speaker 1: knew of him. I think I just didn't know him, 242 00:12:08,320 --> 00:12:10,160 Speaker 1: but now I do, and I'm like, what a way 243 00:12:10,320 --> 00:12:11,640 Speaker 1: for people to find out. 244 00:12:11,720 --> 00:12:14,320 Speaker 3: Oh, yes, this is why we wanted to make this 245 00:12:14,400 --> 00:12:17,880 Speaker 3: episode, in part just as, like, all these names that 246 00:12:18,000 --> 00:12:21,240 Speaker 3: come up, you're like, oh, well, if you're under thirty, 247 00:12:21,320 --> 00:12:23,880 Speaker 3: you might not know this name, but if you're over thirty, 248 00:12:23,920 --> 00:12:26,400 Speaker 3: you might remember a lot about this person just from 249 00:12:26,440 --> 00:12:28,160 Speaker 3: being a person who was alive when a lot of 250 00:12:28,160 --> 00:12:31,040 Speaker 3: this stuff was going on. So Larry Summers, he might 251 00:12:31,080 --> 00:12:34,560 Speaker 3: not be a flashy name that you recognize, but if 252 00:12:34,600 --> 00:12:37,600 Speaker 3: you have any cash dollar bills that were minted from 253 00:12:37,880 --> 00:12:40,440 Speaker 3: nineteen ninety nine to two thousand and one, you are 254 00:12:40,520 --> 00:12:42,800 Speaker 3: carrying around Larry Summers's signature 255 00:12:43,080 --> 00:12:46,199 Speaker 2: in your wallet, because his signature is on the money 256 00:12:46,200 --> 00:12:48,360 Speaker 2: that was minted then, because he was the US Secretary 257 00:12:48,360 --> 00:12:50,920 Speaker 2: of the Treasury from nineteen ninety nine to two thousand 258 00:12:50,960 --> 00:12:53,560 Speaker 2: and one under former President Bill Clinton. 259 00:12:53,960 --> 00:12:56,000 Speaker 3: And he was also the Director of the National Economic 260 00:12:56,040 --> 00:13:00,679 Speaker 3: Council under former President Barack Obama.
Basically, that's the fancy 261 00:13:00,720 --> 00:13:03,000 Speaker 3: title for the fact that he was the top advisor 262 00:13:03,400 --> 00:13:05,160 Speaker 3: on economic matters for the president. 263 00:13:05,200 --> 00:13:06,520 Speaker 2: So a pretty big deal. 264 00:13:06,559 --> 00:13:08,800 Speaker 3: And Annie, I think we actually talked about the 265 00:13:08,840 --> 00:13:10,880 Speaker 3: movie The Social Network one time on here and how 266 00:13:10,920 --> 00:13:14,520 Speaker 3: much I like it. A fictionalized version 267 00:13:14,559 --> 00:13:17,800 Speaker 3: of him is actually portrayed in the movie The Social Network. 268 00:13:17,840 --> 00:13:19,280 Speaker 2: He's sort of portrayed 269 00:13:18,880 --> 00:13:22,400 Speaker 3: as, like, a good guy when the Winklevoss twins set 270 00:13:22,480 --> 00:13:25,640 Speaker 3: up a meeting with him to complain that Zuckerberg stole 271 00:13:25,679 --> 00:13:27,880 Speaker 3: their idea for Facebook while they were students at Harvard, 272 00:13:28,480 --> 00:13:30,960 Speaker 3: of which Larry Summers used to be the president, 273 00:13:31,080 --> 00:13:32,520 Speaker 3: which I will come back to. But yeah, so you 274 00:13:32,600 --> 00:13:35,720 Speaker 3: might have seen, like, a fictionalized version of Larry Summers 275 00:13:36,160 --> 00:13:37,000 Speaker 3: in that movie. 276 00:13:37,280 --> 00:13:38,920 Speaker 1: I'm gonna have to go back and watch it now. 277 00:13:39,679 --> 00:13:42,160 Speaker 1: It's gonna be strange now I know all this other 278 00:13:42,280 --> 00:13:48,120 Speaker 1: stuff that has come out. Yes, yes, yes, even about Zuckerberg. 279 00:13:48,800 --> 00:13:50,760 Speaker 1: At the time, he was bad, and now I'm like, 280 00:13:51,240 --> 00:13:52,679 Speaker 1: oh, wild.
281 00:13:52,840 --> 00:13:55,120 Speaker 3: I feel like the Winklevoss Twins are kind of 282 00:13:55,120 --> 00:13:57,480 Speaker 3: the ones that history sort of is like, well, there, 283 00:13:57,520 --> 00:14:01,480 Speaker 3: I'm not reading about them, like, in connection with convicted 284 00:14:01,600 --> 00:14:06,000 Speaker 3: sex criminals and stuff, or, you know, disrupting democracy via 285 00:14:06,160 --> 00:14:07,120 Speaker 3: Facebook and Meta. 286 00:14:07,280 --> 00:14:08,800 Speaker 2: So I don't know, maybe they turned out to be... 287 00:14:08,880 --> 00:14:10,040 Speaker 2: Maybe history 288 00:14:10,200 --> 00:14:13,080 Speaker 3: will be kinder to the Winklevoss Twins, even though they're 289 00:14:13,080 --> 00:14:15,559 Speaker 3: sort of portrayed as the villains in that movie. 290 00:14:15,640 --> 00:14:16,440 Speaker 1: Played by... 291 00:14:18,200 --> 00:14:19,480 Speaker 4: Who played? Who played? 292 00:14:23,440 --> 00:14:28,080 Speaker 3: How did I just put this together? Timberlake? 293 00:14:28,120 --> 00:14:32,120 Speaker 2: But that Justin Timberlake is Sean Parker, the Napster guy. 294 00:14:32,440 --> 00:14:37,000 Speaker 3: Oh okay, okay. Wow. 295 00:14:36,320 --> 00:14:38,040 Speaker 1: I'm going to rewatch this and I think I'm going 296 00:14:38,120 --> 00:14:38,960 Speaker 1: to have a wild time. 297 00:14:39,240 --> 00:14:40,040 Speaker 4: I've never seen it. 298 00:14:40,240 --> 00:14:41,920 Speaker 2: I just know it's such a good movie. 299 00:14:42,000 --> 00:14:44,280 Speaker 3: If you ever ever want to do a rewatch, like 300 00:14:44,320 --> 00:14:46,040 Speaker 3: we all watch it and, like, recap it 301 00:14:46,120 --> 00:14:49,720 Speaker 4: with you? We would love to. Yes, we might need, we might 302 00:14:49,760 --> 00:14:50,280 Speaker 4: need to do this. 303 00:14:50,520 --> 00:14:52,520 Speaker 2: Oh my gosh, my first time.
304 00:14:52,560 --> 00:14:55,880 Speaker 3: Quite a cast of characters in that whole thing, both 305 00:14:55,880 --> 00:14:59,360 Speaker 3: in the fictionalized version in the Sorkin film and in 306 00:14:59,400 --> 00:15:14,240 Speaker 3: real life, just a real menagerie there. So folks 307 00:15:14,240 --> 00:15:17,080 Speaker 3: have probably seen or heard about some of these emails 308 00:15:17,080 --> 00:15:19,160 Speaker 3: that were newly released as part of the Epstein files, 309 00:15:19,160 --> 00:15:22,400 Speaker 3: because last week the House Oversight Committee released a new 310 00:15:22,480 --> 00:15:26,200 Speaker 3: round of twenty thousand documents and emails from Epstein's estate. 311 00:15:26,840 --> 00:15:29,080 Speaker 3: And that is how we know that Larry Summers, who 312 00:15:29,160 --> 00:15:32,320 Speaker 3: is currently a very big deal professor at Harvard University, 313 00:15:32,800 --> 00:15:38,360 Speaker 3: is mentioned as somebody who had a documented, pretty uncool 314 00:15:38,480 --> 00:15:44,240 Speaker 3: relationship with convicted pedophile and wealthy financier Jeffrey Epstein. And 315 00:15:44,320 --> 00:15:46,240 Speaker 3: just in the way of disclosures, before I talk too 316 00:15:46,320 --> 00:15:48,800 Speaker 3: much about Harvard University, I should disclose that I am 317 00:15:48,840 --> 00:15:52,040 Speaker 3: currently an affiliate of Harvard University's Berkman Klein Center. 318 00:15:52,400 --> 00:15:53,520 Speaker 2: So I did want to back 319 00:15:53,440 --> 00:15:55,360 Speaker 3: up and just give a bit of a quick and 320 00:15:55,400 --> 00:15:58,520 Speaker 3: dirty rundown of who Epstein is for folks who are 321 00:15:58,560 --> 00:15:59,120 Speaker 3: not mired 322 00:15:59,160 --> 00:16:00,960 Speaker 2: in this guy like I have been the last couple of months.
323 00:16:01,000 --> 00:16:04,480 Speaker 3: So Epstein was a wealthy financier who threw his money 324 00:16:04,520 --> 00:16:07,840 Speaker 3: all over the place. He was convicted of sex crimes 325 00:16:07,840 --> 00:16:10,600 Speaker 3: against a minor in two thousand and eight and famously 326 00:16:10,600 --> 00:16:13,160 Speaker 3: got off on a very lenient sentence for that 327 00:16:13,160 --> 00:16:15,720 Speaker 2: crime, only thirteen months with work release. 328 00:16:16,280 --> 00:16:22,000 Speaker 3: Importantly, both before this and, notably, after this conviction, a 329 00:16:22,040 --> 00:16:26,200 Speaker 3: lot of influential people continued to associate with Epstein, who 330 00:16:26,240 --> 00:16:29,560 Speaker 3: by then was a convicted sex criminal. Epstein died in 331 00:16:29,640 --> 00:16:33,400 Speaker 3: jail in twenty nineteen while awaiting trial on federal sex 332 00:16:33,440 --> 00:16:36,920 Speaker 3: trafficking charges, and the New York City Medical Examiner ruled 333 00:16:36,960 --> 00:16:40,000 Speaker 3: his death a suicide, but conspiracy theories claiming that Epstein 334 00:16:40,000 --> 00:16:44,040 Speaker 3: didn't kill himself began to spread almost immediately in the aftermath. 335 00:16:44,360 --> 00:16:47,080 Speaker 3: The idea here is that Epstein was taken out to 336 00:16:47,120 --> 00:16:51,800 Speaker 3: avoid details that might incriminate powerful people that he associated with.
337 00:16:52,000 --> 00:16:53,840 Speaker 3: And it's one of those theories that kind of works 338 00:16:54,200 --> 00:16:56,440 Speaker 3: regardless of what side of the aisle you're on, because 339 00:16:56,440 --> 00:17:01,160 Speaker 3: Epstein had ties with all the political parties, and as 340 00:17:01,200 --> 00:17:06,720 Speaker 3: I said, ingratiating himself with, like, wealthy, powerful spaces 341 00:17:06,760 --> 00:17:10,919 Speaker 3: and people was just sort of Epstein's M.O., especially in 342 00:17:10,960 --> 00:17:13,719 Speaker 3: the tech world. He gave donations to a lot of 343 00:17:13,760 --> 00:17:17,000 Speaker 3: influential tech spaces. We did an episode of my podcast, 344 00:17:17,040 --> 00:17:19,040 Speaker 3: There Are No Girls on the Internet, about the contributions 345 00:17:19,040 --> 00:17:22,320 Speaker 3: he made to MIT's Media Lab and the Kenyan first 346 00:17:22,400 --> 00:17:25,880 Speaker 3: year MIT grad student who called for Joi Ito, head 347 00:17:25,880 --> 00:17:28,800 Speaker 3: of the Media Lab, to step down once this was revealed. 348 00:17:29,119 --> 00:17:34,080 Speaker 3: He also had some kind of a relationship with Microsoft 349 00:17:34,160 --> 00:17:38,720 Speaker 3: co-founder Bill Gates. Bill Gates's ex-wife, Melinda, has 350 00:17:38,760 --> 00:17:42,560 Speaker 3: given several interviews to the effect that her marriage to 351 00:17:42,680 --> 00:17:47,600 Speaker 3: Bill Gates ended in part because of Bill Gates's relationship 352 00:17:47,800 --> 00:17:50,960 Speaker 3: with Jeffrey Epstein. She does this interview with Gayle King 353 00:17:51,160 --> 00:17:55,520 Speaker 3: where Gayle is like, well, what exactly happened? Like, can 354 00:17:55,560 --> 00:17:58,920 Speaker 3: you give us more details about what made you want 355 00:17:58,920 --> 00:18:02,280 Speaker 3: to leave your husband because of Epstein? 356 00:18:02,320 --> 00:18:04,040 Speaker 2: And she was like, you're gonna have to ask Bill Gates
357 00:18:04,040 --> 00:18:04,560 Speaker 2: That, not me. 358 00:18:04,800 --> 00:18:07,120 Speaker 3: So it seems to me she's 359 00:18:07,119 --> 00:18:10,000 Speaker 3: like insinuating something, and I guess I'll just leave 360 00:18:10,000 --> 00:18:12,800 Speaker 3: it at that. So it's one of those things where 361 00:18:12,800 --> 00:18:16,040 Speaker 3: you have a lot of powerful, rich and famous people 362 00:18:16,920 --> 00:18:20,400 Speaker 3: having this connection with Epstein, and that is how he 363 00:18:20,520 --> 00:18:23,560 Speaker 3: designed it. And so when the House Oversight Committee released 364 00:18:23,600 --> 00:18:26,720 Speaker 3: these emails and these documents, we got a lot more 365 00:18:26,800 --> 00:18:30,040 Speaker 3: insight into what that looked like. And I just have 366 00:18:30,200 --> 00:18:32,640 Speaker 3: to add, as a side note, have you all read 367 00:18:32,680 --> 00:18:34,359 Speaker 3: any of these emails or exchanges? 368 00:18:35,200 --> 00:18:35,600 Speaker 1: I have not. 369 00:18:35,920 --> 00:18:39,480 Speaker 4: It makes me sad. I've seen things being posted, so 370 00:18:39,560 --> 00:18:40,800 Speaker 4: I know of blips. 371 00:18:41,040 --> 00:18:46,639 Speaker 3: Okay, so this is just a side note. I guess 372 00:18:47,080 --> 00:18:47,720 Speaker 3: the way they 373 00:18:47,560 --> 00:18:49,480 Speaker 2: are written is just insane. 374 00:18:49,600 --> 00:18:53,840 Speaker 3: Like reading these emails from some of the most prominent, wealthy, 375 00:18:54,359 --> 00:18:58,440 Speaker 3: important people in government, tech, and media, seeing the 376 00:18:58,520 --> 00:19:01,600 Speaker 3: way that they emailed each other, has absolutely cured me 377 00:19:01,640 --> 00:19:06,119 Speaker 3: of my anxiety around needing to word my emails 378 00:19:06,240 --> 00:19:09,439 Speaker 3: just so.
When I'm writing an email, I'm trying to 379 00:19:09,600 --> 00:19:12,720 Speaker 3: gauge the exact right amount of exclamation marks to use 380 00:19:12,760 --> 00:19:15,080 Speaker 3: to sound enthusiastic and easy to work with, but not 381 00:19:15,160 --> 00:19:16,080 Speaker 3: too enthusiastic. 382 00:19:16,480 --> 00:19:19,439 Speaker 2: Every email is like, no worries if not, I'm happy 383 00:19:19,480 --> 00:19:21,080 Speaker 2: to, like, I go through. 384 00:19:21,119 --> 00:19:24,200 Speaker 3: So I go through so much second guessing with this. Meanwhile, 385 00:19:24,320 --> 00:19:28,399 Speaker 3: Jeffrey Epstein was just like, Yo, what's up. Want to 386 00:19:28,480 --> 00:19:30,280 Speaker 3: do some sex crimes later? 387 00:19:30,560 --> 00:19:31,000 Speaker 2: Lol? 388 00:19:31,119 --> 00:19:35,440 Speaker 3: Like every word misspelled, every word mistyped. 389 00:19:35,000 --> 00:19:39,320 Speaker 1: Like punctuation absent or completely inexplicable. 390 00:19:40,640 --> 00:19:43,920 Speaker 4: Yes, it's kind of like if a boomer was trying 391 00:19:43,960 --> 00:19:46,600 Speaker 4: to be cool and they're like lols and like that 392 00:19:46,680 --> 00:19:49,080 Speaker 4: type of thing no one actually uses, but they think 393 00:19:49,080 --> 00:19:50,439 Speaker 4: that's what the kids are doing. 394 00:19:50,960 --> 00:19:56,639 Speaker 3: Truly, it is like, up for number four sex crime, question 395 00:19:56,760 --> 00:20:01,960 Speaker 3: mark, lol, with small d's, punctuated in insane ways. Let's just say 396 00:20:02,200 --> 00:20:05,639 Speaker 3: he put a lot of stuff in writing that.
I mean, 397 00:20:05,760 --> 00:20:07,800 Speaker 3: I just don't think any of this stuff should have 398 00:20:07,800 --> 00:20:10,880 Speaker 3: been in writing. And they're so egregiously mistyped that part 399 00:20:10,880 --> 00:20:12,800 Speaker 3: of me wonders if this was intentional, if they were 400 00:20:12,800 --> 00:20:15,800 Speaker 3: trying to like evade some sort of something. I have 401 00:20:15,880 --> 00:20:18,080 Speaker 3: no idea, but I want to make it clear that 402 00:20:18,800 --> 00:20:22,040 Speaker 3: he wasn't writing these in like nineteen ninety nine or something, 403 00:20:22,280 --> 00:20:24,240 Speaker 3: during a time when we didn't really have a concept 404 00:20:24,280 --> 00:20:26,720 Speaker 3: of what email was or what did or didn't belong 405 00:20:26,760 --> 00:20:28,840 Speaker 3: in emails and how to write them. He was writing 406 00:20:28,840 --> 00:20:31,880 Speaker 3: a lot of these emails in twenty eighteen, right at 407 00:20:31,880 --> 00:20:34,760 Speaker 3: a time when email was commonplace. 408 00:20:35,040 --> 00:20:37,560 Speaker 2: The whole thing is just very weird to me. 409 00:20:38,080 --> 00:20:41,080 Speaker 4: Yeah, it does get to the like plausible deniability. I 410 00:20:41,080 --> 00:20:42,879 Speaker 4: would never write it that way. That's not how I 411 00:20:42,880 --> 00:20:43,760 Speaker 4: speak type of thing. 412 00:20:44,480 --> 00:20:46,800 Speaker 1: Or just even because they tried to index. You know, 413 00:20:46,880 --> 00:20:50,159 Speaker 1: you can search how many times a name comes up 414 00:20:50,320 --> 00:20:53,080 Speaker 1: or whatever in these, and so if there's like 415 00:20:53,119 --> 00:20:56,720 Speaker 1: a misspelling or if there's something that would mess up 416 00:20:57,160 --> 00:20:59,719 Speaker 1: trying to index something. So I also was kind of like, 417 00:21:01,520 --> 00:21:04,359 Speaker 1: are you trying to get around something I don't know about?
418 00:21:04,600 --> 00:21:07,439 Speaker 3: Yes, that was my big thought, and I have no idea, 419 00:21:07,440 --> 00:21:09,720 Speaker 3: but that was my thought, because everything is 420 00:21:09,760 --> 00:21:12,720 Speaker 3: so egregiously misspelled that it almost is like this has 421 00:21:12,760 --> 00:21:16,360 Speaker 3: to be some rich guy way of not having your 422 00:21:16,359 --> 00:21:18,600 Speaker 3: crimes getting found out later or something. 423 00:21:19,000 --> 00:21:21,560 Speaker 2: Otherwise you just really couldn't type like that. 424 00:21:23,160 --> 00:21:27,440 Speaker 1: How do you even get those bottom quotation marks? You really 425 00:21:27,560 --> 00:21:29,200 Speaker 1: have to go out of your way to do that. 426 00:21:29,280 --> 00:21:35,040 Speaker 4: That's what I'm saying. I mean, we know the actual like 427 00:21:35,240 --> 00:21:37,480 Speaker 4: intelligence is not as intelligent as we think they are 428 00:21:37,520 --> 00:21:38,560 Speaker 4: when it comes to white men. 429 00:21:38,680 --> 00:21:43,560 Speaker 3: Hello? I mean, they were putting this stuff 430 00:21:43,560 --> 00:21:46,560 Speaker 3: in writing, and it was like, yeah, don't put your 431 00:21:46,600 --> 00:21:47,639 Speaker 3: crimes in writing. 432 00:21:47,800 --> 00:21:49,399 Speaker 2: However, I've said this so many times. 433 00:21:49,560 --> 00:21:52,639 Speaker 3: There's nothing, when it's like, oh, we've released 434 00:21:52,640 --> 00:21:54,400 Speaker 3: the emails, I'm like, well, I'm gonna read every single one. 435 00:21:54,560 --> 00:21:57,080 Speaker 2: There's just something about people 436 00:21:56,840 --> 00:21:58,720 Speaker 3: writing emails that they don't realize they're going to be 437 00:21:58,760 --> 00:22:00,800 Speaker 3: read later in a deposition or made public in a 438 00:22:00,800 --> 00:22:01,720 Speaker 3: deposition or something.
439 00:22:01,960 --> 00:22:04,439 Speaker 2: Yeah, so I should give a big caveat that 440 00:22:04,560 --> 00:22:07,960 Speaker 3: just because somebody knew Epstein and associated with him and 441 00:22:08,040 --> 00:22:11,200 Speaker 3: emailed with him does not necessarily mean that they did 442 00:22:11,240 --> 00:22:14,680 Speaker 3: something criminal. When a lot of these people have been asked, 443 00:22:15,000 --> 00:22:17,000 Speaker 3: they are mostly like, oh, I didn't know him like that. 444 00:22:18,040 --> 00:22:21,359 Speaker 3: But at the same time, it is not like what 445 00:22:21,400 --> 00:22:24,800 Speaker 3: Epstein was doing was a secret. Right. In twenty eleven, 446 00:22:24,960 --> 00:22:28,480 Speaker 3: mind you, this is years after Epstein was convicted 447 00:22:28,560 --> 00:22:32,080 Speaker 3: of a sex crime against a minor, Bill Gates told 448 00:22:32,119 --> 00:22:35,840 Speaker 3: The New York Times about Epstein, quote, his lifestyle is 449 00:22:36,040 --> 00:22:38,640 Speaker 3: very different and kind of intriguing, although it would never 450 00:22:38,720 --> 00:22:41,560 Speaker 3: work for me. So you're not going to ever convince 451 00:22:41,600 --> 00:22:45,679 Speaker 3: me that Bill Gates had no idea he was associating 452 00:22:45,720 --> 00:22:48,399 Speaker 3: with somebody who committed sex crimes against minors. You're 453 00:22:48,480 --> 00:22:50,120 Speaker 3: just not going to be able 454 00:22:50,119 --> 00:22:51,600 Speaker 3: to convince me that he didn't know what was going on, 455 00:22:51,640 --> 00:22:54,920 Speaker 3: which is what he says now.
When Epstein was donating 456 00:22:54,920 --> 00:22:58,080 Speaker 3: money to the MIT Media Lab, for instance, the staff 457 00:22:58,240 --> 00:23:03,520 Speaker 3: at MIT had a whole system for flagging Epstein's donations 458 00:23:03,960 --> 00:23:07,199 Speaker 3: using the code name Voldemort, or He Who Shall Not 459 00:23:07,280 --> 00:23:11,400 Speaker 3: Be Named, specifically to obscure where that money came from, 460 00:23:11,440 --> 00:23:14,840 Speaker 3: because he had already been convicted of sex crimes. So, yeah, 461 00:23:14,880 --> 00:23:17,479 Speaker 3: these people knew. This was not a big secret. People 462 00:23:17,600 --> 00:23:19,200 Speaker 3: now who are like, oh, I had no idea. 463 00:23:19,640 --> 00:23:20,080 Speaker 2: I don't know. 464 00:23:20,600 --> 00:23:23,959 Speaker 3: I have a hard time believing some of the people 465 00:23:24,080 --> 00:23:26,800 Speaker 3: who are going out of their way to say this now. 466 00:23:28,440 --> 00:23:32,879 Speaker 1: Yeah. And I also think some of the things we're hearing, 467 00:23:33,280 --> 00:23:35,960 Speaker 1: especially like Megyn Kelly being like there's a difference between 468 00:23:36,000 --> 00:23:38,439 Speaker 1: a five year old and a fifteen year old, I feel 469 00:23:38,440 --> 00:23:45,600 Speaker 1: like we're also having to confront that older men 470 00:23:46,560 --> 00:23:49,679 Speaker 1: do this, yeah, and have been doing it forever, and 471 00:23:49,720 --> 00:23:54,800 Speaker 1: we've kind of societally been like, oh, that's his lifestyle, 472 00:23:55,200 --> 00:24:00,000 Speaker 1: or yeah, that's what he's into. Gross. 473 00:24:00,800 --> 00:24:03,040 Speaker 3: Yeah, like the way Bill Gates is like, oh, his 474 00:24:03,119 --> 00:24:06,680 Speaker 3: lifestyle is certainly intriguing to me. Homie, those are sex 475 00:24:06,720 --> 00:24:09,959 Speaker 3: crimes against kids. It's not a lifestyle choice.
We're not 476 00:24:10,040 --> 00:24:15,200 Speaker 3: talking about like being vegan. We're talking about an entire 477 00:24:15,359 --> 00:24:22,119 Speaker 3: network designed to traffic children for sex crimes. Like, I 478 00:24:22,119 --> 00:24:25,280 Speaker 3: think you're exactly right, and I do think we should 479 00:24:25,320 --> 00:24:28,959 Speaker 3: talk about that because I think when we make Epstein 480 00:24:29,320 --> 00:24:33,959 Speaker 3: into a solo monster, which he definitely is a monster, 481 00:24:34,400 --> 00:24:38,119 Speaker 3: but it makes it easy to not look at some 482 00:24:38,240 --> 00:24:41,400 Speaker 3: of the ways these attitudes really do show up all 483 00:24:41,440 --> 00:24:45,200 Speaker 3: over the place. Right, you don't have to be flying 484 00:24:45,240 --> 00:24:48,680 Speaker 3: in girls on your private plane nicknamed the Lolita Express 485 00:24:49,040 --> 00:24:55,560 Speaker 3: to have really messed up attitudes about young girls. I 486 00:24:55,720 --> 00:24:59,600 Speaker 3: was just reading an article about how an AI company 487 00:24:59,880 --> 00:25:05,920 Speaker 3: was used to deep fake images of girls to make 488 00:25:06,000 --> 00:25:09,720 Speaker 3: deep fake nudes, and they had used specifically yearbook pictures, 489 00:25:09,760 --> 00:25:11,760 Speaker 3: and one of the headlines I'd read 490 00:25:11,920 --> 00:25:15,639 Speaker 3: was something to the effect of women's yearbook pictures used 491 00:25:15,640 --> 00:25:16,560 Speaker 3: for deep fakes. 492 00:25:16,960 --> 00:25:19,119 Speaker 2: I got to thinking, I'm a grown woman. 493 00:25:19,520 --> 00:25:22,080 Speaker 3: I have not taken a yearbook photo since I have 494 00:25:22,200 --> 00:25:25,160 Speaker 3: become an adult, because it is not grown adult women 495 00:25:25,240 --> 00:25:26,600 Speaker 3: who take yearbook photos. 496 00:25:26,640 --> 00:25:28,400 Speaker 2: It's children, because they are in school.
497 00:25:28,600 --> 00:25:31,840 Speaker 3: So why did this headline say that it was women's 498 00:25:32,200 --> 00:25:33,640 Speaker 3: yearbook photos, as if these 499 00:25:33,520 --> 00:25:35,600 Speaker 2: women were adults. Anybody who's getting 500 00:25:35,400 --> 00:25:37,879 Speaker 3: a yearbook photo, unless you've got some other situation going on, 501 00:25:37,960 --> 00:25:41,600 Speaker 3: maybe you're a teacher, you're usually a child. You're in school, 502 00:25:41,640 --> 00:25:43,639 Speaker 3: you're in K through twelve education. 503 00:25:44,040 --> 00:25:48,200 Speaker 4: Mm hm. Oh, yeah, I was looking because I saw 504 00:25:48,200 --> 00:25:53,159 Speaker 4: that headline too, and some of the advertisements for those sites. 505 00:25:53,200 --> 00:25:55,880 Speaker 4: Did you see that it said the AI girls will 506 00:25:55,880 --> 00:25:57,040 Speaker 4: never say no to you? Yeah? 507 00:25:57,200 --> 00:25:57,600 Speaker 2: Yeah. 508 00:25:58,040 --> 00:26:00,480 Speaker 4: It was so disturbing. Obviously there were so many 509 00:26:00,560 --> 00:26:04,959 Speaker 4: implications in these things, like date rape culture, 510 00:26:05,040 --> 00:26:07,400 Speaker 4: but also coming back to the fact that yes, these 511 00:26:07,440 --> 00:26:09,200 Speaker 4: girls look like they're in school.
512 00:26:10,280 --> 00:26:13,680 Speaker 3: Yeah, and something about the way that the Megyn Kelly thing, 513 00:26:15,440 --> 00:26:18,359 Speaker 3: you know, Megyn Kelly has a daughter who is like 514 00:26:18,400 --> 00:26:23,639 Speaker 3: fifteen, sixteen. And I think especially for a parent, or 515 00:26:23,680 --> 00:26:26,160 Speaker 3: somebody who spends a lot of time around a fifteen 516 00:26:26,280 --> 00:26:30,639 Speaker 3: or sixteen year old, to make it sound like, because a 517 00:26:30,640 --> 00:26:32,639 Speaker 3: lot of people will say like, oh, well, she looked 518 00:26:32,680 --> 00:26:35,720 Speaker 3: older or seemed older. When you're around a fifteen year old, 519 00:26:35,760 --> 00:26:38,600 Speaker 3: you really have a front row seat to the ways 520 00:26:38,640 --> 00:26:40,679 Speaker 3: in which they are obviously very much kids. 521 00:26:40,920 --> 00:26:42,640 Speaker 2: And I just think it's just, it's hard 522 00:26:42,440 --> 00:26:44,800 Speaker 3: for me to wrap my head around somebody who could 523 00:26:44,800 --> 00:26:47,439 Speaker 3: be spending time with a fifteen year old and not 524 00:26:47,520 --> 00:26:50,040 Speaker 3: be like, this is a child. You know, 525 00:26:50,160 --> 00:26:52,639 Speaker 3: it's, I just, I really can't. If you spend 526 00:26:52,680 --> 00:26:56,280 Speaker 3: any time around, if you've got teenagers yourself, you really 527 00:26:56,320 --> 00:26:59,760 Speaker 3: see the ways every day in which they are just kids.
528 00:27:00,040 --> 00:27:04,520 Speaker 3: And it really is shocking to me that someone who 529 00:27:04,640 --> 00:27:06,920 Speaker 3: is around a fifteen year old every day, but really 530 00:27:06,960 --> 00:27:13,239 Speaker 3: anybody, would make it seem like somehow because they're not 531 00:27:13,320 --> 00:27:17,000 Speaker 3: a toddler, it's different, you know. And I know 532 00:27:17,080 --> 00:27:19,919 Speaker 3: that there are, when it comes to, like, 533 00:27:20,359 --> 00:27:22,720 Speaker 3: you know, professionals who study this kind of thing, they 534 00:27:22,760 --> 00:27:26,560 Speaker 3: have different designations. But really, you know, a sex crime 535 00:27:26,560 --> 00:27:28,399 Speaker 3: against a minor is a sex crime against a minor. 536 00:27:29,440 --> 00:27:32,440 Speaker 1: Yeah, well, and it just feels like, if 537 00:27:32,480 --> 00:27:35,840 Speaker 1: you have to explain that, you're already in the something 538 00:27:35,840 --> 00:27:38,080 Speaker 1: has gone wrong territory. Something has gone wrong. 539 00:27:38,440 --> 00:27:41,000 Speaker 2: Oh yes, yes. 540 00:27:41,160 --> 00:27:45,360 Speaker 3: If you find yourself saying something like that, something's up, 541 00:27:45,400 --> 00:27:46,879 Speaker 3: like you just need to, like, run it 542 00:27:46,920 --> 00:27:48,560 Speaker 3: back exactly. 543 00:27:58,680 --> 00:28:04,359 Speaker 1: Well, I guess we should get into Larry Summers's quite pathetic, 544 00:28:05,400 --> 00:28:06,879 Speaker 1: quite pathetic emails. 545 00:28:06,880 --> 00:28:10,040 Speaker 2: Oh my god, pathetic is the word for it. 546 00:28:10,600 --> 00:28:13,600 Speaker 3: So in this new drop of emails, we get some 547 00:28:13,640 --> 00:28:17,679 Speaker 3: more insight into the relationship that Larry Summers had with 548 00:28:17,760 --> 00:28:21,640 Speaker 3: Jeffrey Epstein, and it's a weird one.
So Summers described 549 00:28:21,680 --> 00:28:25,560 Speaker 3: Epstein as his quote wingman. So from these emails we 550 00:28:25,760 --> 00:28:29,080 Speaker 3: learn that Larry Summers was really hung up on trying 551 00:28:29,119 --> 00:28:32,520 Speaker 3: to pursue a romantic relationship with a student that he 552 00:28:32,560 --> 00:28:36,200 Speaker 3: referred to as his mentee, and he consistently sought advice 553 00:28:36,440 --> 00:28:39,960 Speaker 3: from Epstein about how to best do this. From November 554 00:28:40,040 --> 00:28:43,360 Speaker 3: twenty eighteen to July twenty nineteen, text messages and emails 555 00:28:43,400 --> 00:28:47,520 Speaker 3: show that Larry Summers consistently asked Epstein for advice about 556 00:28:47,560 --> 00:28:52,480 Speaker 3: this relationship. Epstein responded eagerly, offering encouragement and tips, even 557 00:28:52,520 --> 00:28:56,000 Speaker 3: calling himself Summers's wingman in a message from November twenty eighteen. 558 00:28:57,080 --> 00:29:02,160 Speaker 3: Maybe don't befriend someone convicted of sex crimes against kids. Maybe 559 00:29:02,200 --> 00:29:04,600 Speaker 3: don't seek this person out for advice on how to 560 00:29:04,600 --> 00:29:07,360 Speaker 3: get your mentee, who is also a student, to have 561 00:29:07,400 --> 00:29:10,280 Speaker 3: sex with you, especially when you are married, which, by 562 00:29:10,320 --> 00:29:12,920 Speaker 3: the way, Larry Summers was when all of this 563 00:29:13,000 --> 00:29:14,720 Speaker 2: was going down. 564 00:29:15,280 --> 00:29:18,960 Speaker 1: Yes, and also just a note, he was contacting Epstein 565 00:29:19,560 --> 00:29:21,120 Speaker 1: until he went to jail. 566 00:29:21,360 --> 00:29:23,560 Speaker 2: Yes, I find that so wild.
567 00:29:23,640 --> 00:29:27,120 Speaker 3: So to make it clear, this contact was happening well 568 00:29:27,200 --> 00:29:30,680 Speaker 3: after Epstein's initial conviction on sex crimes against kids back 569 00:29:30,680 --> 00:29:31,560 Speaker 3: in two thousand and eight. 570 00:29:31,840 --> 00:29:34,080 Speaker 2: And you are so right, Annie. 571 00:29:34,160 --> 00:29:37,160 Speaker 3: The back and forth exchanges that Larry Summers had with 572 00:29:37,240 --> 00:29:41,360 Speaker 3: Jeffrey Epstein, they only ended the day before Epstein was 573 00:29:41,480 --> 00:29:45,040 Speaker 3: arrested again on new federal sex trafficking charges. So that 574 00:29:45,120 --> 00:29:48,080 Speaker 3: means they were like bud bud buds. Like, if the 575 00:29:48,120 --> 00:29:50,280 Speaker 3: back and forth about how to get your mentee to 576 00:29:50,280 --> 00:29:52,600 Speaker 3: sleep with you only stopped because this man got arrested 577 00:29:52,640 --> 00:29:54,640 Speaker 3: on federal sex trafficking charges, 578 00:29:54,800 --> 00:29:55,640 Speaker 2: y'all were in deep. 579 00:29:56,040 --> 00:29:59,000 Speaker 3: They were like thick as thieves as far as I'm concerned. 580 00:30:00,360 --> 00:30:02,480 Speaker 4: Like because someone was looking at the mail and you 581 00:30:02,480 --> 00:30:05,360 Speaker 4: would have to do it by hand, like it is insane. 582 00:30:05,680 --> 00:30:06,800 Speaker 4: That is insane. 583 00:30:07,120 --> 00:30:08,520 Speaker 3: I do want to get into the meat of some 584 00:30:08,560 --> 00:30:11,000 Speaker 3: of these messages, because I think they're very revealing about 585 00:30:11,040 --> 00:30:13,680 Speaker 3: what kind of person was actually, you know, at the 586 00:30:13,760 --> 00:30:17,680 Speaker 3: helm at Harvard.
So at one point, Summers told Epstein 587 00:30:17,760 --> 00:30:21,000 Speaker 3: that he was worried that maybe his mentee didn't want 588 00:30:21,040 --> 00:30:24,320 Speaker 3: to pursue a romantic or sexual relationship with him, because 589 00:30:24,440 --> 00:30:27,120 Speaker 3: maybe she just wanted to keep things professional. He said, 590 00:30:27,280 --> 00:30:31,320 Speaker 3: quote, think for now, I'm going nowhere with her except economics 591 00:30:31,400 --> 00:30:36,200 Speaker 3: mentor, boring, Summers wrote in November twenty eighteen. I think 592 00:30:36,240 --> 00:30:39,080 Speaker 3: I'm right now in the seen very warmly in the 593 00:30:39,160 --> 00:30:42,760 Speaker 3: rear view mirror category. She must be very confused, or 594 00:30:42,800 --> 00:30:46,080 Speaker 3: maybe wants to cut me off but wants professional connection 595 00:30:46,160 --> 00:30:47,600 Speaker 3: a lot, so she holds to it, 596 00:30:47,680 --> 00:30:48,960 Speaker 2: Summers wrote in March twenty 597 00:30:48,840 --> 00:30:53,440 Speaker 3: nineteen. Basically saying, like, oh, I like her, 598 00:30:53,520 --> 00:30:55,320 Speaker 3: and maybe she likes me, but she just wants to 599 00:30:55,440 --> 00:30:57,960 Speaker 3: keep me around for professional reasons. 600 00:30:58,160 --> 00:31:01,920 Speaker 2: Mind you, as he describes it, he is her mentor. 601 00:31:02,240 --> 00:31:04,200 Speaker 3: So it's like, oh, she only wants me to, like, 602 00:31:04,240 --> 00:31:08,200 Speaker 3: give her economics guidance and education. A little lowering. 603 00:31:08,680 --> 00:31:11,120 Speaker 2: So who was this student in question?
604 00:31:11,760 --> 00:31:14,320 Speaker 3: According to the Harvard Crimson, in at least some of 605 00:31:14,320 --> 00:31:17,560 Speaker 3: his exchanges with Epstein on the relationship, Summers appears to 606 00:31:17,640 --> 00:31:21,200 Speaker 3: refer to economist Keyu Jin, a tenured professor at 607 00:31:21,200 --> 00:31:23,480 Speaker 3: the London School of Economics at the time, who is 608 00:31:23,560 --> 00:31:26,640 Speaker 3: mentioned in a series of late twenty eighteen messages between 609 00:31:26,640 --> 00:31:29,640 Speaker 3: the two men. So I actually knew who she was 610 00:31:29,760 --> 00:31:31,560 Speaker 3: before this all came to light, because she's a pretty 611 00:31:31,600 --> 00:31:34,160 Speaker 3: prominent person. She goes on all the big podcasts. Like, 612 00:31:34,600 --> 00:31:38,600 Speaker 3: she has a pretty prominent career and is, I would say, 613 00:31:38,600 --> 00:31:41,960 Speaker 3: a public figure at this point. So I mention this 614 00:31:42,360 --> 00:31:46,600 Speaker 3: because look how grossly Larry Summers talks about this person 615 00:31:46,760 --> 00:31:50,320 Speaker 3: who at the time was his mentee. Larry Summers was 616 00:31:50,480 --> 00:31:53,600 Speaker 3: forwarding academic work that she would send to 617 00:31:53,600 --> 00:31:56,640 Speaker 2: him to Epstein, so like, here's my paper, and 618 00:31:56,560 --> 00:31:58,640 Speaker 3: then he's like, I gotta get, I gotta get Epstein's 619 00:31:58,640 --> 00:32:01,640 Speaker 3: eyes on this. So Summers forwarded Epstein an email from 620 00:32:01,760 --> 00:32:04,360 Speaker 3: Jin in which she was asking for feedback on a paper 621 00:32:04,440 --> 00:32:07,200 Speaker 3: that she had written, and then Summers mused to Epstein 622 00:32:07,240 --> 00:32:10,920 Speaker 3: that it was probably appropriate to hold off on responding.
623 00:32:11,000 --> 00:32:13,040 Speaker 3: So he's really doing that thing of like, ooh, like, 624 00:32:13,080 --> 00:32:14,160 Speaker 3: when should I text back? 625 00:32:14,200 --> 00:32:14,840 Speaker 2: Like, is it like, 626 00:32:14,960 --> 00:32:18,200 Speaker 3: should I give her her paper feedback today? Or does 627 00:32:18,240 --> 00:32:23,000 Speaker 3: that look too eager? Epstein replied, quote, she's already beginning, 628 00:32:23,120 --> 00:32:28,360 Speaker 3: spelled wrong, to sound needy, smiley face emoji. Nice. Now 629 00:32:28,560 --> 00:32:32,120 Speaker 3: Jin is Chinese, and at one point Summers wrote, quote, 630 00:32:32,840 --> 00:32:39,520 Speaker 3: capital U, lowercase r, better at understanding Chinese women than 631 00:32:39,560 --> 00:32:41,040 Speaker 3: at probability theory, 632 00:32:41,560 --> 00:32:44,440 Speaker 2: Larry wrote to Epstein. They also used to refer to 633 00:32:44,480 --> 00:32:47,080 Speaker 2: her by this nickname, Peril. So like, when I 634 00:32:47,080 --> 00:32:49,120 Speaker 2: saw this, I was like, where did this nickname 635 00:32:49,160 --> 00:32:49,560 Speaker 2: come from? 636 00:32:50,200 --> 00:32:54,400 Speaker 3: Apparently this nickname might be drawn from the racist quote 637 00:32:54,520 --> 00:32:58,320 Speaker 3: Yellow Peril trope of the late nineteenth and twentieth century, 638 00:32:58,440 --> 00:33:01,400 Speaker 3: which was used to stoke fear that Asian immigrants, especially 639 00:33:01,520 --> 00:33:04,880 Speaker 3: Chinese and Japanese people, were a danger or a threat 640 00:33:05,000 --> 00:33:09,360 Speaker 3: to Westerners. Just real cool and classy conversations to be 641 00:33:09,360 --> 00:33:11,920 Speaker 3: putting in writing with that convicted sex criminal. 642 00:33:12,240 --> 00:33:15,240 Speaker 4: Right. Literally during that time is when they also spread 643 00:33:15,240 --> 00:33:19,440 Speaker 4: this fear that all Asian women were prostitutes.
So that 644 00:33:19,560 --> 00:33:23,040 Speaker 4: was the implication, this level of Asian fetishism 645 00:33:23,080 --> 00:33:26,000 Speaker 4: that began in that propaganda. 646 00:33:26,040 --> 00:33:29,480 Speaker 3: Yeah, and was continued in these messages 647 00:33:29,800 --> 00:33:31,200 Speaker 1: from twenty eighteen. 648 00:33:31,360 --> 00:33:34,280 Speaker 2: That's what I'm saying. Like, I just, like. 649 00:33:35,880 --> 00:33:39,840 Speaker 1: Yeah, it's just like, yeah, yep, yep, yep. 650 00:33:40,200 --> 00:33:43,920 Speaker 3: So Epstein and Larry Summers were also financially entangled, which 651 00:33:43,960 --> 00:33:48,200 Speaker 3: is an Epstein classic. Summers traveled on Epstein's private plane, 652 00:33:48,400 --> 00:33:52,560 Speaker 3: which was nicknamed the Lolita Express, on at least four occasions, 653 00:33:52,600 --> 00:33:55,840 Speaker 3: including at least three times while Summers was the president 654 00:33:55,880 --> 00:33:59,080 Speaker 3: of Harvard. Summers also met more than a dozen times 655 00:33:59,080 --> 00:34:03,640 Speaker 3: with Epstein and solicited donations from him for his wife, 656 00:34:03,800 --> 00:34:07,720 Speaker 3: Harvard English professor Elisa New. The messages 657 00:34:07,760 --> 00:34:10,759 Speaker 3: released showed that Summers was trying to organize visits to 658 00:34:10,800 --> 00:34:14,880 Speaker 3: Harvard on Epstein's behalf to discuss his wife's poetry work. 659 00:34:15,239 --> 00:34:17,600 Speaker 2: So just really not smart.
660 00:34:17,640 --> 00:34:21,120 Speaker 3: I will say that Harvard did put out a report 661 00:34:21,160 --> 00:34:24,680 Speaker 3: about their connection to Epstein, and it seems like before 662 00:34:24,800 --> 00:34:27,200 Speaker 3: Epstein was convicted in two thousand and eight, he did 663 00:34:27,239 --> 00:34:30,280 Speaker 3: donate I think nine million dollars to Harvard, and after 664 00:34:30,320 --> 00:34:32,640 Speaker 3: he was convicted did visit campus 665 00:34:32,360 --> 00:34:32,799 Speaker 2: quite a bit. 666 00:34:32,880 --> 00:34:34,719 Speaker 3: So it does seem like the calls were sort of 667 00:34:34,760 --> 00:34:36,560 Speaker 3: coming from inside the ivy of the university, if you 668 00:34:36,560 --> 00:34:39,759 Speaker 3: know what I mean. But having the president of Harvard 669 00:34:40,120 --> 00:34:43,239 Speaker 3: associating with somebody who had already been convicted of sex 670 00:34:43,280 --> 00:34:45,919 Speaker 3: crimes against minors, it's not a good look, to say 671 00:34:45,960 --> 00:34:50,960 Speaker 3: the least. It's definitely not good judgment. Now, if only 672 00:34:51,000 --> 00:34:54,360 Speaker 3: there had been some clue to tip us off that 673 00:34:54,480 --> 00:34:57,800 Speaker 3: perhaps Larry Summers was a creep who could not be 674 00:34:57,960 --> 00:35:01,960 Speaker 3: trusted in positions of authority around young women and coeds. 675 00:35:02,520 --> 00:35:04,959 Speaker 2: Oh wait a minute, because that's exactly what we did get. 676 00:35:04,960 --> 00:35:07,960 Speaker 3: Because I mentioned that Larry Summers was formerly the president 677 00:35:08,040 --> 00:35:10,880 Speaker 3: of Harvard University, not just a professor there. Lucky us. 678 00:35:10,880 --> 00:35:15,120 Speaker 3: Now why was he forced to resign, you ask? His 679 00:35:15,440 --> 00:35:19,400 Speaker 3: attitudes toward women.
Where there is smoke, there is fire, 680 00:35:19,520 --> 00:35:24,200 Speaker 3: my friends. This should not be surprising to anybody. Unfortunately, no. 681 00:35:26,280 --> 00:35:28,160 Speaker 3: And I guess why I wanted to talk 682 00:35:28,160 --> 00:35:31,120 Speaker 3: about this is, like, we had a whole conversation, it 683 00:35:31,160 --> 00:35:34,359 Speaker 3: was a national conversation, around two thousand and five, two 684 00:35:34,360 --> 00:35:37,600 Speaker 3: thousand and six about what kind of person Larry Summers was. 685 00:35:37,640 --> 00:35:39,319 Speaker 3: And it's just wild to be like, oh, that 686 00:35:39,360 --> 00:35:40,279 Speaker 3: guy's still in the mix. 687 00:35:40,280 --> 00:35:41,600 Speaker 2: We're still talking about that guy. 688 00:35:41,880 --> 00:35:45,480 Speaker 3: Yes, because these people, they're never pushed out when like 689 00:35:45,520 --> 00:35:48,760 Speaker 3: it's clear something is going on. The thing that really 690 00:35:48,840 --> 00:35:52,320 Speaker 3: sank Summers's tenure as president of Harvard was remarks 691 00:35:52,320 --> 00:35:53,920 Speaker 3: that he made in two thousand and five at an 692 00:35:53,920 --> 00:35:57,080 Speaker 3: economics conference. So Summers was talking about why there are 693 00:35:57,120 --> 00:36:00,320 Speaker 3: so few women in STEM fields, and he wondered if the 694 00:36:00,400 --> 00:36:02,640 Speaker 3: reason why there aren't more women in these fields is just 695 00:36:02,640 --> 00:36:06,000 Speaker 3: because women are just naturally and innately stupid and also 696 00:36:06,040 --> 00:36:09,840 Speaker 3: bad at things. He laid out three kind of potential 697 00:36:09,920 --> 00:36:13,120 Speaker 3: hypotheses for what's going on. One is that women want 698 00:36:13,200 --> 00:36:16,000 Speaker 3: more work life balance than men, so they can't succeed 699 00:36:16,040 --> 00:36:17,719 Speaker 3: in these fields as much as men.
He was like, yeah, 700 00:36:17,719 --> 00:36:21,600 Speaker 3: it's probably not it. Two, gender discrimination, which he basically 701 00:36:21,640 --> 00:36:23,400 Speaker 3: was like, yeah, I don't think so. And then the 702 00:36:23,400 --> 00:36:26,319 Speaker 3: one that he really kind of double clicked on was 703 00:36:26,400 --> 00:36:29,640 Speaker 3: the idea that women are just biologically worse at the 704 00:36:29,680 --> 00:36:32,120 Speaker 3: traits that one needs to succeed in STEM, which is 705 00:36:32,120 --> 00:36:35,120 Speaker 3: basically where he landed. He argued that the gender gap 706 00:36:35,120 --> 00:36:40,360 Speaker 3: in STEM must be biological, specifically differences in intrinsic aptitude, 707 00:36:40,760 --> 00:36:43,920 Speaker 3: and that men and women might have different variabilities in 708 00:36:44,040 --> 00:36:47,960 Speaker 3: certain traits like math or science ability, meaning that more 709 00:36:48,040 --> 00:36:52,080 Speaker 3: men would just naturally occupy more top positions in those fields. 710 00:36:52,160 --> 00:36:55,520 Speaker 3: He tried to back this up with, like, behavioral genetics research, 711 00:36:56,239 --> 00:36:59,759 Speaker 3: claiming that some attributes once thought to come from socialization 712 00:37:00,320 --> 00:37:05,040 Speaker 3: may actually have, you know, biological components. I will say, 713 00:37:05,760 --> 00:37:08,799 Speaker 3: he did say, when this kind of became a controversy, 714 00:37:08,800 --> 00:37:10,160 Speaker 3: he was like, oh, I thought this was meant to 715 00:37:10,160 --> 00:37:13,560 Speaker 3: be kind of an off the record safe space for 716 00:37:13,640 --> 00:37:16,919 Speaker 3: bad opinions about gender. So, like, my bad, I didn't 717 00:37:16,920 --> 00:37:20,920 Speaker 3: know we were gonna be, like, telling everybody about what 718 00:37:20,960 --> 00:37:21,960 Speaker 3: was being said here.
719 00:37:23,880 --> 00:37:28,040 Speaker 1: Oh, that's another one in the long line. We've been talking 720 00:37:28,040 --> 00:37:30,839 Speaker 1: a lot lately about, and even with you, Bridget, we've 721 00:37:30,840 --> 00:37:33,239 Speaker 1: discussed this before, but about men using those terms like 722 00:37:33,280 --> 00:37:39,839 Speaker 1: safe space completely incorrectly. And also it's kind of like, 723 00:37:40,200 --> 00:37:44,200 Speaker 1: no wonder women might not want to work with you, 724 00:37:44,360 --> 00:37:47,520 Speaker 1: who has the power, because you treat them as a 725 00:37:47,640 --> 00:37:51,640 Speaker 1: dating opportunity, your chance to have sex. Guess what, they 726 00:37:51,680 --> 00:37:53,360 Speaker 1: don't want to be involved in that, and you're the 727 00:37:53,400 --> 00:37:55,480 Speaker 1: one making these decisions. 728 00:37:55,040 --> 00:37:58,080 Speaker 3: Exactly that. Like, these things are so clearly linked. The 729 00:37:58,120 --> 00:38:00,160 Speaker 3: fact that he got up on a stage and said this, 730 00:38:00,239 --> 00:38:03,600 Speaker 3: and the fact that he was creepily pursuing his mentee 731 00:38:03,640 --> 00:38:08,160 Speaker 3: and sending her fucking, like, academic papers to a convicted pedophile, 732 00:38:08,560 --> 00:38:10,759 Speaker 3: these things are all linked, right? And I will just 733 00:38:10,760 --> 00:38:14,680 Speaker 3: say this: he also picked a terrible time to say 734 00:38:14,719 --> 00:38:17,880 Speaker 3: these things, because he said these things against the backdrop 735 00:38:17,920 --> 00:38:22,600 Speaker 3: of Harvard facing, according to the Crimson, widespread faculty criticism 736 00:38:22,760 --> 00:38:25,960 Speaker 3: following reports that women received only four of the thirty 737 00:38:26,000 --> 00:38:28,759 Speaker 3: two tenure offers from the Faculty of Arts and 738 00:38:28,800 --> 00:38:32,520 Speaker 3: Sciences that year.
Right. So, like, there was already, like, 739 00:38:32,560 --> 00:38:36,120 Speaker 3: a brouhaha about women and whether or not they 740 00:38:36,120 --> 00:38:38,680 Speaker 3: were included and represented at Harvard. 741 00:38:38,520 --> 00:38:40,600 Speaker 2: And then this jackass gets on stage and says 742 00:38:40,640 --> 00:38:41,040 Speaker 2: these things. 743 00:38:41,160 --> 00:38:41,279 Speaker 4: Right. 744 00:38:41,560 --> 00:38:43,400 Speaker 3: I wish that we could go back to that time 745 00:38:43,840 --> 00:38:46,960 Speaker 3: and have a different conversation about it, because I think 746 00:38:47,000 --> 00:38:49,000 Speaker 3: we'd be in a different place right now. I will 747 00:38:49,000 --> 00:38:52,360 Speaker 3: say that Summers apologized and clarified that he does not 748 00:38:52,520 --> 00:38:56,200 Speaker 3: believe that women are intellectually inferior. What a nice guy, 749 00:38:56,520 --> 00:39:00,000 Speaker 3: definitely a great president at Harvard. But this was also, 750 00:39:00,160 --> 00:39:02,560 Speaker 3: like, not an isolated thing with him. He used 751 00:39:02,560 --> 00:39:06,239 Speaker 3: to joke about women's intelligence quite often, apparently, and talked 752 00:39:06,239 --> 00:39:09,640 Speaker 3: about what he described as excessive penalties for men who 753 00:39:09,760 --> 00:39:13,080 Speaker 3: hit on women in the workplace. Surprise, surprise, he was 754 00:39:13,160 --> 00:39:17,279 Speaker 3: actively hitting on his mentee at an academic institution. He 755 00:39:17,320 --> 00:39:19,520 Speaker 3: was like, are we being too hard on men who 756 00:39:19,560 --> 00:39:21,000 Speaker 3: just try to hit on women at work? 757 00:39:21,239 --> 00:39:23,840 Speaker 1: He would think that, wouldn't he. 758 00:39:24,160 --> 00:39:26,080 Speaker 4: He sounds like the guy who's like, we can't talk 759 00:39:26,080 --> 00:39:28,600 Speaker 4: to women at all anymore.
I can't be like touching 760 00:39:28,600 --> 00:39:31,239 Speaker 4: them, though, it's against the rules now. Like, 761 00:39:31,160 --> 00:39:36,360 Speaker 1: what? Yes, because you behave that way. 762 00:39:36,440 --> 00:39:39,120 Speaker 3: Yeah, women these days, you can't even try to cheat on 763 00:39:39,160 --> 00:39:40,239 Speaker 3: your wife with them, or 764 00:39:40,239 --> 00:39:43,000 Speaker 2: what, have a good time with them? 765 00:39:43,400 --> 00:39:45,600 Speaker 3: So I should add that when this was going on, 766 00:39:45,760 --> 00:39:48,640 Speaker 3: a lot of people in the Harvard community did stick 767 00:39:48,719 --> 00:39:52,520 Speaker 3: up for him, but ultimately Harvard's board voted no confidence. 768 00:39:52,800 --> 00:39:57,440 Speaker 3: This controversy really lingered until eventually Larry Summers resigned from 769 00:39:57,440 --> 00:40:00,400 Speaker 3: his role as president of Harvard the next year. But 770 00:40:00,400 --> 00:40:04,040 Speaker 3: don't worry, two years later, President Obama named Summers as 771 00:40:04,080 --> 00:40:07,360 Speaker 3: the director of the National Economic Council. Because he always 772 00:40:07,400 --> 00:40:10,200 Speaker 3: lands on his feet, this one. 773 00:40:11,120 --> 00:40:13,759 Speaker 4: The, like, comeback of white men. It really is, like, 774 00:40:15,080 --> 00:40:18,600 Speaker 4: amazing to see. Like, we talk about how women, like, 775 00:40:18,680 --> 00:40:21,799 Speaker 4: the bootstrapping theory, this, this is why that's not a thing. 776 00:40:22,239 --> 00:40:24,759 Speaker 4: Like, bootstrapping is not a thing, because we see this, 777 00:40:24,960 --> 00:40:27,520 Speaker 4: like, men who should be taken down always bounce back, 778 00:40:27,600 --> 00:40:31,280 Speaker 4: like, they all, because of connections, no matter what happens.
779 00:40:31,640 --> 00:40:34,440 Speaker 3: And, like, some of the stuff that he was saying 780 00:40:34,800 --> 00:40:37,520 Speaker 3: to students and around campus, we know from the 781 00:40:37,560 --> 00:40:39,400 Speaker 3: release of these emails that he was also saying the 782 00:40:39,440 --> 00:40:41,640 Speaker 3: same stuff to Epstein, which really isn't surprising. 783 00:40:42,080 --> 00:40:46,200 Speaker 2: He retread that same terrain that got him in trouble. 784 00:40:46,800 --> 00:40:49,280 Speaker 3: Years after getting in trouble for this stuff at Harvard, 785 00:40:49,480 --> 00:40:51,520 Speaker 3: he continued to say this kind of stuff to Epstein. 786 00:40:51,760 --> 00:40:55,239 Speaker 3: In twenty seventeen, he wrote this old gem, saying that 787 00:40:55,320 --> 00:40:58,440 Speaker 3: he had, quote, observed that half the IQ in 788 00:40:58,480 --> 00:41:01,440 Speaker 3: the world was possessed by women, without mentioning that they 789 00:41:01,480 --> 00:41:05,040 Speaker 3: are more than fifty one percent of the population. Oh, good one. 790 00:41:05,200 --> 00:41:07,239 Speaker 3: It's not even a good joke. Like, if you're gonna 791 00:41:07,280 --> 00:41:09,839 Speaker 3: go there... you shouldn't be making jokes like this about women. 792 00:41:09,880 --> 00:41:12,120 Speaker 3: But it's, in my opinion, not even 793 00:41:12,160 --> 00:41:12,839 Speaker 3: a very good joke. 794 00:41:13,440 --> 00:41:19,640 Speaker 1: No, no. And, I mean, all of this 795 00:41:19,719 --> 00:41:22,560 Speaker 1: is disgusting and horrific.
But it is, going back to what I said earlier, 796 00:41:22,560 --> 00:41:25,200 Speaker 1: very pathetic that you would have to reach 797 00:41:25,239 --> 00:41:32,879 Speaker 1: out to a convicted sex predator to get laid, because 798 00:41:32,920 --> 00:41:36,840 Speaker 1: he's very, like, he has one email that says something like, 799 00:41:36,880 --> 00:41:41,000 Speaker 1: I just need to get horizontal. Help. Yeah. Yes, what 800 00:41:41,200 --> 00:41:43,319 Speaker 1: is wrong with you, bro? 801 00:41:43,680 --> 00:41:46,560 Speaker 2: You were the president of Harvard or something, like, 802 00:41:47,000 --> 00:41:48,000 Speaker 2: act like it. 803 00:41:48,000 --> 00:41:50,319 Speaker 3: It's something, you know. A therapist once told me that 804 00:41:50,760 --> 00:41:53,680 Speaker 3: men in power, their need to have sex with people 805 00:41:53,680 --> 00:41:55,040 Speaker 3: that they should not be having sex with, 806 00:41:55,200 --> 00:41:56,920 Speaker 2: it makes them so small. 807 00:41:57,239 --> 00:41:59,600 Speaker 3: And it's so true, that, like, you are the president 808 00:41:59,640 --> 00:42:01,799 Speaker 3: of Harvard and you're putting in an email that you 809 00:42:01,880 --> 00:42:05,320 Speaker 3: just need to get horizontal, bro, to Jeffrey Epstein, about 810 00:42:05,320 --> 00:42:08,279 Speaker 3: your mentee. That is a problem. That is not, 811 00:42:08,400 --> 00:42:11,120 Speaker 3: like, I just, yeah. And I think it really speaks 812 00:42:11,160 --> 00:42:13,520 Speaker 3: to the fact that, and I'm sure I've said this before, 813 00:42:13,840 --> 00:42:16,480 Speaker 3: I don't trust any institution where there are not women 814 00:42:16,680 --> 00:42:19,080 Speaker 3: all up and through at the tippy tippy top.
If 815 00:42:19,120 --> 00:42:22,040 Speaker 3: it's all men, or mostly men left to their own devices, 816 00:42:22,080 --> 00:42:25,120 Speaker 3: I don't care if it's an Ivy League university, the military, 817 00:42:25,560 --> 00:42:29,560 Speaker 3: the Catholic priesthood, anything, the NFL. If you don't got 818 00:42:29,600 --> 00:42:31,800 Speaker 3: women all up and through at the tippy top, I 819 00:42:31,840 --> 00:42:35,160 Speaker 3: don't trust it. Something's going on. Something 820 00:42:35,200 --> 00:42:37,120 Speaker 3: that shouldn't be put in emails. 821 00:42:37,080 --> 00:42:39,160 Speaker 4: And why do all the emails sound like nineteen ninety 822 00:42:39,160 --> 00:42:41,560 Speaker 4: five boys? Like, it was really, like, I'm just, the 823 00:42:41,920 --> 00:42:45,520 Speaker 4: wording. Like, the faces I have made throughout this whole episode, 824 00:42:46,239 --> 00:42:49,120 Speaker 4: like, it's just disgust and shock, because I'm like, why 825 00:42:49,120 --> 00:42:52,040 Speaker 4: are you saying... why are you sounding like my eighteen 826 00:42:52,160 --> 00:42:54,560 Speaker 4: year old nephew right now? What is wrong with you? 827 00:42:54,920 --> 00:42:58,399 Speaker 3: Yeah, in the same breath, it'll be like, I got 828 00:42:58,480 --> 00:43:04,040 Speaker 3: to get horizontal, bro, winky face, and then it'll be like, oh, 829 00:43:04,120 --> 00:43:04,600 Speaker 3: what are your 830 00:43:04,480 --> 00:43:06,680 Speaker 2: thoughts on Trump? Don't you think he's a buffoon? 831 00:43:06,719 --> 00:43:09,520 Speaker 3: And it's like, I think that might be the buffoon 832 00:43:09,560 --> 00:43:10,760 Speaker 3: calling the buffoon a buffoon. 833 00:43:11,920 --> 00:43:14,000 Speaker 4: All in one and the same. What is happening? 834 00:43:14,400 --> 00:43:15,040 Speaker 2: It's bad.
835 00:43:25,880 --> 00:43:30,480 Speaker 3: So the fallout from Larry Summers's relationship with Epstein being 836 00:43:30,520 --> 00:43:35,160 Speaker 3: made clear, I actually feel like it's been pretty minimal, 837 00:43:35,280 --> 00:43:35,720 Speaker 2: honestly. 838 00:43:36,560 --> 00:43:39,360 Speaker 3: Initially, after making that statement, he said that he was 839 00:43:39,400 --> 00:43:43,440 Speaker 3: going to be stepping back from public roles, but 840 00:43:43,520 --> 00:43:45,879 Speaker 3: was going to be continuing teaching at Harvard, so he's 841 00:43:45,920 --> 00:43:48,200 Speaker 3: currently, like, a faculty member at Harvard. 842 00:43:48,400 --> 00:43:51,279 Speaker 2: There was a video. Maybe I'll put the audio in 843 00:43:51,320 --> 00:43:51,880 Speaker 2: if I can find it. 844 00:43:51,880 --> 00:43:55,239 Speaker 3: There's a video of him starting his class a couple 845 00:43:55,200 --> 00:43:58,759 Speaker 3: of days ago, basically with an apology to students, being like, oh, 846 00:43:58,760 --> 00:44:01,319 Speaker 3: I'm sorry that I was friends with Epstein. 847 00:44:02,160 --> 00:44:04,040 Speaker 2: With your permission, I'd like to 848 00:44:04,000 --> 00:44:07,360 Speaker 3: just, like, continue on talking about economics in class now. So, 849 00:44:07,680 --> 00:44:09,680 Speaker 3: after he announced that he was going to be not 850 00:44:10,440 --> 00:44:13,880 Speaker 3: stepping back from teaching, he eventually did reverse course and 851 00:44:13,880 --> 00:44:15,719 Speaker 3: said that he was going to be stepping back for 852 00:44:15,920 --> 00:44:18,680 Speaker 3: the moment. He does have tenure, so it's not 853 00:44:18,719 --> 00:44:20,680 Speaker 3: like he was fired, and who knows what will happen 854 00:44:20,920 --> 00:44:24,000 Speaker 3: next, you know, but TAs are going to finish off 855 00:44:24,000 --> 00:44:26,160 Speaker 3: the rest of his class for the semester.
He also 856 00:44:26,320 --> 00:44:28,640 Speaker 3: resigned from the board of OpenAI, the company that 857 00:44:28,640 --> 00:44:32,880 Speaker 3: makes ChatGPT, and Harvard is launching an investigation into 858 00:44:33,320 --> 00:44:36,840 Speaker 3: his connections with Epstein, which, yeah, I'll say, they should. 859 00:44:37,680 --> 00:44:40,440 Speaker 3: Elizabeth Warren, who was a law professor at Harvard, has 860 00:44:40,520 --> 00:44:43,279 Speaker 3: urged the university to cut ties with him. And I 861 00:44:43,320 --> 00:44:46,080 Speaker 3: guess the reason that I wanted to talk about this 862 00:44:46,320 --> 00:44:47,240 Speaker 3: is because, 863 00:44:46,920 --> 00:44:48,400 Speaker 2: you know, when I was doing research, 864 00:44:48,680 --> 00:44:50,200 Speaker 3: one of the big pieces that I read to prepare 865 00:44:50,200 --> 00:44:52,880 Speaker 3: for this conversation was this piece in The Times about 866 00:44:53,200 --> 00:44:56,080 Speaker 3: whether or not Summers could mount a comeback, since he's 867 00:44:56,120 --> 00:44:59,160 Speaker 3: bounced back so many times in the past. Now, this 868 00:44:59,200 --> 00:45:03,560 Speaker 3: piece in The Times published the day that Summers announced 869 00:45:03,560 --> 00:45:05,839 Speaker 3: that he was going to be temporarily stepping back from 870 00:45:05,880 --> 00:45:09,920 Speaker 3: his classes, and I just hated that the conversation was 871 00:45:09,960 --> 00:45:13,000 Speaker 3: being framed, just a few hours after he announced he 872 00:45:13,040 --> 00:45:16,760 Speaker 3: was just stepping down temporarily, as, like, well, can his career 873 00:45:16,880 --> 00:45:17,640 Speaker 3: be saved? 874 00:45:18,120 --> 00:45:19,440 Speaker 2: He's seventy years old.
875 00:45:19,840 --> 00:45:24,600 Speaker 3: I don't think that him being relieved from being in 876 00:45:24,600 --> 00:45:27,759 Speaker 3: a position to be around young, you know, coeds 877 00:45:27,800 --> 00:45:30,880 Speaker 3: and students, and to have this relationship where he is 878 00:45:31,000 --> 00:45:33,640 Speaker 3: inherently an authority figure to people like that, 879 00:45:34,000 --> 00:45:36,279 Speaker 2: I don't think that's, like, ruining his life, you know 880 00:45:36,280 --> 00:45:36,759 Speaker 2: what I'm saying? 881 00:45:36,800 --> 00:45:40,000 Speaker 3: Like, the fact that it was even phrased that way 882 00:45:40,560 --> 00:45:44,240 Speaker 3: really bothered me. And Alejandra Caraballo, who is a clinical 883 00:45:44,280 --> 00:45:48,640 Speaker 3: instructor at the Harvard Law Cyberlaw Clinic, commented online, quote, it is 884 00:45:48,760 --> 00:45:52,080 Speaker 3: not normal for a professor to start a class discussing 885 00:45:52,120 --> 00:45:56,560 Speaker 3: how they regret being best buddies with a child sex trafficker, 886 00:45:56,960 --> 00:46:02,640 Speaker 3: and honestly I have to agree, right? We cannot... this 887 00:46:03,239 --> 00:46:05,960 Speaker 3: is not an apologize and move on and continue with 888 00:46:05,960 --> 00:46:09,120 Speaker 3: the economics lesson kind of thing. This is a pretty 889 00:46:09,360 --> 00:46:12,520 Speaker 3: big deal, and we should be treating it like 890 00:46:12,560 --> 00:46:13,680 Speaker 3: it's a pretty big deal. 891 00:46:13,920 --> 00:46:15,360 Speaker 2: And I think especially 892 00:46:15,680 --> 00:46:18,640 Speaker 3: someone like Larry Summers, who at the very least has 893 00:46:18,719 --> 00:46:21,279 Speaker 3: over and over again proven himself to have pretty bad 894 00:46:21,360 --> 00:46:23,040 Speaker 3: judgment when it comes to women. 895 00:46:23,880 --> 00:46:25,399 Speaker 2: Having this person have
896 00:46:25,600 --> 00:46:29,360 Speaker 3: roles where he is so involved and hands-on in 897 00:46:29,400 --> 00:46:32,239 Speaker 3: shaping what the future of technology looks like, through things 898 00:46:32,320 --> 00:46:36,080 Speaker 3: like a board position at OpenAI, I just think 899 00:46:36,160 --> 00:46:40,560 Speaker 3: that we really have to think about what that means 900 00:46:40,600 --> 00:46:43,919 Speaker 3: and whether or not men like this are who we 901 00:46:44,040 --> 00:46:47,279 Speaker 3: want shaping the future of how technology shows up in 902 00:46:47,320 --> 00:46:49,239 Speaker 3: all of our lives. Right? Like, I certainly do not 903 00:46:49,280 --> 00:46:52,680 Speaker 3: trust somebody like Larry Summers to be making decisions about 904 00:46:52,680 --> 00:46:55,040 Speaker 3: what our shared future is going to look like. 905 00:46:55,360 --> 00:46:58,239 Speaker 4: I'm honestly so disturbed by the mere fact that people 906 00:46:58,280 --> 00:47:01,319 Speaker 4: are ignoring the fact that he, for once... like, I'm 907 00:47:01,320 --> 00:47:03,759 Speaker 4: sure there's so many other instances of, like, sexual harassment. 908 00:47:03,880 --> 00:47:06,640 Speaker 4: Like, she actually said, I don't want to do this.
909 00:47:06,800 --> 00:47:09,560 Speaker 4: Can we keep this professional? And her life is probably 910 00:47:09,600 --> 00:47:12,799 Speaker 4: being upended. Like, I cannot imagine what she, like, 911 00:47:13,040 --> 00:47:16,000 Speaker 4: the trauma and the shock she's going through seeing these 912 00:47:16,040 --> 00:47:19,000 Speaker 4: emails, and not only that, having the name of Epstein, 913 00:47:19,080 --> 00:47:21,120 Speaker 4: like, if she didn't know him, she hadn't met him, 914 00:47:21,320 --> 00:47:23,399 Speaker 4: and knowing all the things about him, and she was like, oh, 915 00:47:23,440 --> 00:47:27,719 Speaker 4: I literally was being trafficked, like, groomed, trafficked, kind 916 00:47:27,760 --> 00:47:30,799 Speaker 4: of, in a way, with this dude who traffics and 917 00:47:30,800 --> 00:47:34,000 Speaker 4: grooms women, like, as his profession. Like, that's what 918 00:47:34,080 --> 00:47:38,279 Speaker 4: he was doing. And this man who I trusted was 919 00:47:38,840 --> 00:47:44,880 Speaker 4: getting, like, territorial slash predatorial advice about me, when all 920 00:47:44,920 --> 00:47:46,720 Speaker 4: I'm trying to do is survive and get an education 921 00:47:46,760 --> 00:47:49,120 Speaker 4: and move on in this life. Like, there's so much 922 00:47:49,160 --> 00:47:49,400 Speaker 4: to this. 923 00:47:49,880 --> 00:47:52,520 Speaker 3: Yeah, and it's not like she asked to be involved 924 00:47:52,560 --> 00:47:55,160 Speaker 3: in this. And I should say, in case it's 925 00:47:55,160 --> 00:47:58,800 Speaker 3: not clear, we don't even know that she had any 926 00:47:59,000 --> 00:48:02,200 Speaker 3: concept that this was going on, that her 927 00:48:02,600 --> 00:48:08,680 Speaker 3: mentor at Harvard was forwarding her perfectly reasonable emails and 928 00:48:08,760 --> 00:48:12,520 Speaker 3: papers and work to Jeffrey Epstein.
It just 929 00:48:12,640 --> 00:48:15,920 Speaker 3: is really... we should be treating it as a shocking 930 00:48:15,960 --> 00:48:19,480 Speaker 3: and unacceptable thing, because it is. And I guess where 931 00:48:19,520 --> 00:48:23,040 Speaker 3: I land is that we really cannot keep acting shocked 932 00:48:23,280 --> 00:48:27,759 Speaker 3: when powerful institutions protect powerful men who have these, like, 933 00:48:27,920 --> 00:48:31,880 Speaker 3: long and well documented histories of bad judgment, sexist thinking, 934 00:48:32,320 --> 00:48:36,880 Speaker 3: and frankly ethically grotesque associations. Right? It's not like Summers 935 00:48:36,960 --> 00:48:41,640 Speaker 3: just had one bad quote. He had an entire legacy 936 00:48:41,840 --> 00:48:45,560 Speaker 3: of things like minimizing discrimination, joking about women in the workplace, 937 00:48:45,800 --> 00:48:49,880 Speaker 3: letting Epstein fund his wife's projects, and treating young women, 938 00:48:49,960 --> 00:48:52,760 Speaker 3: like his mentee, like a conquest that he could game 939 00:48:52,840 --> 00:48:56,719 Speaker 3: theory his way into getting horizontal with, and then describing 940 00:48:56,760 --> 00:49:01,040 Speaker 3: it in the crudest ways possible. Right? Like, and Harvard, 941 00:49:01,239 --> 00:49:05,319 Speaker 3: the US government, tech companies, the media all continued 942 00:49:05,360 --> 00:49:07,799 Speaker 3: to give him power. The New York Times writing an 943 00:49:07,880 --> 00:49:10,960 Speaker 3: article, the day that it becomes clear that he's going 944 00:49:10,960 --> 00:49:14,640 Speaker 3: to step back temporarily from teaching, asking about the state of 945 00:49:14,680 --> 00:49:17,080 Speaker 3: his career and whether or not he can come back 946 00:49:17,120 --> 00:49:20,080 Speaker 3: and bounce back from this, is exactly what I mean.
947 00:49:21,520 --> 00:49:23,919 Speaker 1: Yeah. And that's, like, one of the memes that I've 948 00:49:23,960 --> 00:49:28,040 Speaker 1: seen going around, is, like, something that finally brings all 949 00:49:28,200 --> 00:49:31,799 Speaker 1: men of any religion, of any... together, is that they 950 00:49:31,840 --> 00:49:35,440 Speaker 1: will use young girls this way. And one of the 951 00:49:35,480 --> 00:49:38,920 Speaker 1: things that really frustrates me about this conversation is that, like, 952 00:49:38,960 --> 00:49:41,279 Speaker 1: even going back to the Bill Clinton thing, we 953 00:49:41,320 --> 00:49:43,440 Speaker 1: immediately are like, we got to protect Bill Clinton. 954 00:49:43,840 --> 00:49:46,640 Speaker 1: Or we're like, oh, we got to protect Larry Summers. No, 955 00:49:47,080 --> 00:49:50,239 Speaker 1: we're not talking about the people who were hurt by this. 956 00:49:50,920 --> 00:49:54,200 Speaker 1: You just want to know that this guy's career is 957 00:49:54,280 --> 00:49:57,800 Speaker 1: safe and he's fine, he's doing fine. 958 00:49:57,920 --> 00:50:01,400 Speaker 3: He is seventy years old. I don't want to sound ageist. 959 00:50:01,680 --> 00:50:04,400 Speaker 3: I just feel like at seventy years old, if you're 960 00:50:04,440 --> 00:50:06,560 Speaker 3: this kind of person, maybe we don't need you. Like, 961 00:50:06,600 --> 00:50:08,160 Speaker 3: what I'm, you know, saying is, like, maybe you living 962 00:50:08,320 --> 00:50:11,359 Speaker 3: a private life with your wife, who now knows all 963 00:50:11,440 --> 00:50:15,600 Speaker 3: about the emails you were sending about your mentee, maybe 964 00:50:15,640 --> 00:50:17,400 Speaker 3: that is fine. Maybe we don't need to talk 965 00:50:17,440 --> 00:50:20,360 Speaker 3: about this.
Like it's the biggest tragedy that somebody like 966 00:50:20,560 --> 00:50:23,520 Speaker 3: Larry Summers doesn't get to have this role of authority 967 00:50:23,680 --> 00:50:25,840 Speaker 3: and this role of influence and 968 00:50:25,280 --> 00:50:29,880 Speaker 2: power, both over, you know, Harvard 969 00:50:29,480 --> 00:50:32,640 Speaker 3: students, but also over all of us, right? Because if 970 00:50:32,640 --> 00:50:34,400 Speaker 3: you're sitting on the board at OpenAI, you do 971 00:50:34,480 --> 00:50:36,399 Speaker 3: have some power over us. You're making 972 00:50:36,440 --> 00:50:39,120 Speaker 3: decisions and you're putting out technology 973 00:50:38,600 --> 00:50:39,799 Speaker 2: that is going to go on to shape all of 974 00:50:39,800 --> 00:50:40,280 Speaker 2: our lives. 975 00:50:40,480 --> 00:50:42,919 Speaker 3: And I completely agree with you, Annie. It just really 976 00:50:42,960 --> 00:50:46,120 Speaker 3: bothers me when the conversation is framed around 977 00:50:46,200 --> 00:50:48,919 Speaker 3: what's going to happen to poor Larry Summers. He's gonna 978 00:50:48,960 --> 00:50:49,520 Speaker 3: be fine. 979 00:50:49,680 --> 00:50:51,360 Speaker 2: He's gonna be completely fine. 980 00:50:52,120 --> 00:50:54,960 Speaker 4: Literally the system was built for him, like, for him 981 00:50:54,960 --> 00:50:56,440 Speaker 4: to be fine.
The fact that he can really be like, 982 00:50:56,480 --> 00:50:59,360 Speaker 4: ah, my bad, bro, we cool? Like, that literally was 983 00:50:59,400 --> 00:51:03,600 Speaker 4: the apology. Like, yeah, not even, like, a conversation about the 984 00:51:03,640 --> 00:51:06,560 Speaker 4: fact that, yes, he was using his authority and power 985 00:51:06,840 --> 00:51:10,080 Speaker 4: to go after women. That he's like, this is, this 986 00:51:10,120 --> 00:51:12,880 Speaker 4: is definitely normal for me, as a professor and president 987 00:51:12,880 --> 00:51:15,600 Speaker 4: of a school, to try to, like, intimidate women 988 00:51:15,640 --> 00:51:20,080 Speaker 4: into sleeping with me. And then, you know, there are 989 00:51:20,120 --> 00:51:22,000 Speaker 4: so many other things, but, like, the fact that he 990 00:51:22,040 --> 00:51:25,400 Speaker 4: can come back and be like, ah, my bad, but, 991 00:51:25,560 --> 00:51:28,840 Speaker 4: you know, we're moving on, right, let me talk about economics, 992 00:51:28,880 --> 00:51:31,719 Speaker 4: and that seems like, okay, that sounds about right. This 993 00:51:31,760 --> 00:51:33,479 Speaker 4: is nothing, this doesn't affect you as a person, 994 00:51:33,480 --> 00:51:34,480 Speaker 4: so it's fine, as a teacher. 995 00:51:34,920 --> 00:51:38,400 Speaker 3: And so when people decide that because 996 00:51:38,400 --> 00:51:41,959 Speaker 3: they are powerful or influential, these rules don't apply to them, 997 00:51:42,520 --> 00:51:45,319 Speaker 3: I think it's really a problem. When you get to say, oh, 998 00:51:45,360 --> 00:51:47,120 Speaker 3: well, I did this, but I get to just go on 999 00:51:47,239 --> 00:51:50,880 Speaker 3: and continue my lesson for today? No, you don't, right? 1000 00:51:51,040 --> 00:51:53,520 Speaker 3: And I think it's fair to say that for a long time, 1001 00:51:53,800 --> 00:51:56,040 Speaker 3: the rules did not apply to powerful people.
And the 1002 00:51:56,120 --> 00:51:59,160 Speaker 3: question is whether or not we want to live in 1003 00:51:59,160 --> 00:52:00,600 Speaker 3: a world where that is the case, right? Do we 1004 00:52:00,640 --> 00:52:02,960 Speaker 3: want to live in a world where someone like this 1005 00:52:03,040 --> 00:52:05,920 Speaker 3: gets to do things like this and just go on 1006 00:52:06,000 --> 00:52:08,399 Speaker 3: to finish their economics lesson for the day? 1007 00:52:08,520 --> 00:52:09,480 Speaker 2: I say no. 1008 00:52:10,000 --> 00:52:12,200 Speaker 3: And so I think as we watch what comes out 1009 00:52:12,200 --> 00:52:15,279 Speaker 3: of this Department of Justice release with the Epstein files, 1010 00:52:15,800 --> 00:52:20,480 Speaker 3: I hope we can, like, resist this idea of using 1011 00:52:20,520 --> 00:52:23,960 Speaker 3: it to talk about, like, whether or not someone's career 1012 00:52:24,040 --> 00:52:27,000 Speaker 3: is going to be impacted, or all of that. I 1013 00:52:27,040 --> 00:52:33,040 Speaker 3: think truly this will likely touch every corner of power: academia, politics, entertainment, tech, finance, 1014 00:52:33,360 --> 00:52:36,440 Speaker 3: and we should be talking about accountability, right? We should 1015 00:52:36,440 --> 00:52:39,720 Speaker 3: be talking about, like, centering the people who were harmed, 1016 00:52:40,000 --> 00:52:44,520 Speaker 3: regardless of the resumes and power and influence of the 1017 00:52:44,560 --> 00:52:46,440 Speaker 3: people who were doing the harming. I think that is 1018 00:52:46,480 --> 00:52:50,720 Speaker 3: the bare minimum, and ultimately I don't think the people 1019 00:52:50,840 --> 00:52:54,719 Speaker 3: shaping the future should be taking advice from a convicted 1020 00:52:54,960 --> 00:52:59,120 Speaker 3: child sex predator on how to pursue women at their age. 1021 00:52:59,160 --> 00:53:02,120 Speaker 3: I think that is a wholly, totally reasonable standard to set.
1022 00:53:02,440 --> 00:53:03,759 Speaker 3: And I will die on this hill. 1023 00:53:04,000 --> 00:53:05,440 Speaker 2: I don't know. I don't care if it makes me 1024 00:53:05,520 --> 00:53:08,160 Speaker 2: sound like an extremist. I will die on this hill. 1025 00:53:08,320 --> 00:53:11,520 Speaker 4: Oh, I mean, I think a lot of pod bros 1026 00:53:11,520 --> 00:53:12,600 Speaker 4: are going to be mad at you about that. 1027 00:53:14,000 --> 00:53:19,640 Speaker 1: Come at me, bros. Them and their Pizzagate. And now 1028 00:53:19,680 --> 00:53:23,960 Speaker 1: they're like, this is no information, though. I know, it's because, 1029 00:53:23,760 --> 00:53:25,440 Speaker 3: if you, if you go back and look at, like, 1030 00:53:25,600 --> 00:53:30,240 Speaker 3: QAnon and Pizzagate, it's like, you have emails that Hillary 1031 00:53:30,239 --> 00:53:33,799 Speaker 3: Clinton wrote to Podesta that are like, oh, should we 1032 00:53:33,800 --> 00:53:36,080 Speaker 3: get some pizza later? And people are like, but if 1033 00:53:36,080 --> 00:53:39,360 Speaker 3: you replaced pizza with children, think about it then. And 1034 00:53:39,400 --> 00:53:42,440 Speaker 3: then you have Epstein putting, like, up for sex crimes 1035 00:53:42,480 --> 00:53:43,560 Speaker 3: later, and it's like, well, 1036 00:53:43,440 --> 00:53:45,080 Speaker 2: no, no, no, no, no, no, we don't know 1037 00:53:45,040 --> 00:53:49,719 Speaker 1: what that means. That, that's out of context. 1038 00:53:51,760 --> 00:53:52,360 Speaker 2: Exactly. 1039 00:53:52,960 --> 00:53:57,280 Speaker 3: Uh, yeah. So if the Department of Justice releases these files, 1040 00:53:57,800 --> 00:53:59,799 Speaker 3: I'm sure we're going to have a lot more to 1041 00:53:59,800 --> 00:54:02,640 Speaker 3: discuss, but that's where we're at for right now. 1042 00:54:03,160 --> 00:54:06,440 Speaker 4: Whoo. Yeah, Bill Clinton and Hillary Clinton are on that list.
1043 00:54:06,520 --> 00:54:08,080 Speaker 4: Yeah, they should also be locked up. 1044 00:54:08,160 --> 00:54:08,719 Speaker 1: Let's go ahead. 1045 00:54:08,800 --> 00:54:12,520 Speaker 3: But yeah, anybody who is shown to have committed 1046 00:54:12,520 --> 00:54:15,040 Speaker 3: a crime should be locked up. That's the thing is, like, 1047 00:54:15,760 --> 00:54:18,839 Speaker 3: I don't want abusers on my team. And I 1048 00:54:18,880 --> 00:54:23,160 Speaker 3: think that we're being sold this idea 1049 00:54:23,239 --> 00:54:26,759 Speaker 3: of tribalism, of like, oh, are you going to care 1050 00:54:26,840 --> 00:54:30,480 Speaker 3: if it's your favorite entertainer or your favorite, you know, 1051 00:54:31,000 --> 00:54:33,600 Speaker 3: celebrity who gets taken down? 1052 00:54:33,719 --> 00:54:34,880 Speaker 2: No, because I don't want that. 1053 00:54:35,080 --> 00:54:38,000 Speaker 3: I don't want a person who is a criminal walking free, 1054 00:54:38,080 --> 00:54:39,440 Speaker 3: even if it's somebody who I happen to 1055 00:54:39,440 --> 00:54:42,480 Speaker 2: be politically aligned with or like enjoy their work. 1056 00:54:43,239 --> 00:54:45,640 Speaker 4: I mean, I have stopped watching many shows because 1057 00:54:45,680 --> 00:54:47,839 Speaker 4: many of the celebrities have disappointed me and have shown 1058 00:54:47,840 --> 00:54:50,719 Speaker 4: their true colors. So putting them on my ban list, let's go. 1059 00:54:50,880 --> 00:54:57,280 Speaker 1: Yes, yes. Well, we shall see, we shall see what happens, 1060 00:54:58,000 --> 00:55:01,200 Speaker 1: and I'm sure we will come back and discuss this more. 1061 00:55:02,080 --> 00:55:05,160 Speaker 1: And yeah, thank you so much, Bridget, for breaking down 1062 00:55:06,000 --> 00:55:08,480 Speaker 1: such an intense topic for us. 1063 00:55:08,840 --> 00:55:10,200 Speaker 2: Thank you for having me.
1064 00:55:10,600 --> 00:55:13,919 Speaker 3: Uh, yeah. You all, even though it's horrifying, you all 1065 00:55:13,920 --> 00:55:16,440 Speaker 3: made it 1066 00:55:15,320 --> 00:55:19,160 Speaker 1: a little less so. Yes, and now we might watch 1067 00:55:19,160 --> 00:55:23,200 Speaker 1: The Social Network together and be kind of horrified about 1068 00:55:23,400 --> 00:55:24,600 Speaker 1: some realities. 1069 00:55:24,840 --> 00:55:26,960 Speaker 2: But I'm excited they're making a sequel. 1070 00:55:27,280 --> 00:55:28,440 Speaker 1: They're making a sequel. 1071 00:55:28,920 --> 00:55:33,640 Speaker 3: Jeremy Strong is gonna play Zuckerberg, like a later version 1072 00:55:33,680 --> 00:55:35,280 Speaker 3: of Zuckerberg, in the sequel. 1073 00:55:35,440 --> 00:55:38,319 Speaker 2: I know how this sounds. Look it up. I read it. 1074 00:55:38,800 --> 00:55:40,439 Speaker 2: I read it in Deadline. Look it up. 1075 00:55:40,960 --> 00:55:43,360 Speaker 4: Are they gonna have the moments where he and Elon 1076 00:55:43,480 --> 00:55:44,400 Speaker 4: Musk might fight? 1077 00:55:45,520 --> 00:55:46,000 Speaker 2: Oh! 1078 00:55:46,480 --> 00:55:50,160 Speaker 3: I was just talking about that on a podcast. I mean, 1079 00:55:50,280 --> 00:55:53,520 Speaker 3: say what you will about Mark Zuckerberg. Zuckerberg would drop 1080 00:55:53,600 --> 00:55:55,080 Speaker 3: Elon Musk like a sack of potatoes. 1081 00:55:55,120 --> 00:55:57,120 Speaker 2: Say what you will. You know I hate Zuckerberg. 1082 00:55:57,120 --> 00:56:00,480 Speaker 3: I hate them both, but I'm absolutely going Zuckerberg on this one. 1083 00:56:00,920 --> 00:56:02,640 Speaker 1: I hope they have the scene when he's on 1084 00:56:02,680 --> 00:56:06,440 Speaker 1: the wakeboard and he's just pale, translucent. Yeah, I really 1085 00:56:06,480 --> 00:56:09,000 Speaker 1: want that to be in the movie, and like sad, 1086 00:56:09,160 --> 00:56:14,080 Speaker 1: like adult midlife-crisis music playing.
Yeah, that's what I want. 1087 00:56:14,239 --> 00:56:17,600 Speaker 3: My partner's mom, when we showed her that picture, she 1088 00:56:17,719 --> 00:56:20,239 Speaker 3: just said, what a dweeb. And I'll never forget it, 1089 00:56:20,280 --> 00:56:22,919 Speaker 3: because I don't think I've ever heard somebody call someone 1090 00:56:22,960 --> 00:56:23,640 Speaker 3: else a dweeb. 1091 00:56:25,640 --> 00:56:29,319 Speaker 1: Yeah, it's like, yes, a dweeb, but you are correct. You 1092 00:56:29,400 --> 00:56:35,840 Speaker 1: are correct. Yeah, yeah. Well, something to look forward to. 1093 00:56:36,520 --> 00:56:39,680 Speaker 1: In the meantime, Bridget, where can the good listeners 1094 00:56:39,719 --> 00:56:40,320 Speaker 1: find you? 1095 00:56:40,320 --> 00:56:42,279 Speaker 2: You can listen to my podcast, There Are No Girls 1096 00:56:42,280 --> 00:56:44,359 Speaker 2: on the Internet. You can follow me on Instagram at 1097 00:56:44,360 --> 00:56:47,960 Speaker 2: Bridget Ryan DC, on TikTok at Bridget Ryan DC, and 1098 00:56:48,000 --> 00:56:51,000 Speaker 2: on YouTube at There Are No Girls on the Internet. 1099 00:56:51,239 --> 00:56:54,640 Speaker 1: And definitely go check that out if you haven't already, listeners. 1100 00:56:54,920 --> 00:56:56,880 Speaker 1: If you would like to email us, you can. Our 1101 00:56:56,960 --> 00:56:59,879 Speaker 1: email is hello at stuffmomnevertoldyou dot com. We're on Blue Sky 1102 00:57:00,239 --> 00:57:02,320 Speaker 1: at Mom Stuff Podcast, and on Instagram and TikTok at Stuff 1103 00:57:02,320 --> 00:57:04,759 Speaker 1: I Never Told You. We're also on YouTube. We have a 1104 00:57:04,760 --> 00:57:06,600 Speaker 1: book you can get wherever you get your books, 1105 00:57:06,600 --> 00:57:09,239 Speaker 1: and we have new merchandise at Cotton Bureau.
Thanks as 1106 00:57:09,239 --> 00:57:11,600 Speaker 1: always to our super producer Christina, our executive producer Maya, and our 1107 00:57:11,719 --> 00:57:14,320 Speaker 1: contributor Joey. Thank you, and thanks to you for listening. 1108 00:57:14,400 --> 00:57:16,080 Speaker 1: Stuff Mom Never Told You is a production of iHeartRadio. For 1109 00:57:16,080 --> 00:57:17,640 Speaker 1: more podcasts from iHeartRadio, you can check out 1110 00:57:17,640 --> 00:57:19,480 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to your 1111 00:57:19,480 --> 00:57:20,320 Speaker 1: favorite shows.