Speaker 1: Welcome to Worst Year Ever, a production of iHeartRadio.

[theme song]

Hey, welcome back to the Worst Year Ever. My name's... it is? That was a weird intro. No, it was great. Um, you got my name out there. I introduced the show. What's your name? Oh, it's Cody Johnston. That's right, it is. Yes, thank you. And joining us as always is... I'm dying... is... I'm dying... I'm Robert Evans, and I, uh, was recovering from the crud that I caught from all of those gun owners in Virginia, and, uh, then I had to do a live show in San Francisco and just ruined my voice. So I am going to sit back and listen to my friends today. I'm sure you'll have some things to chirp in. I think you sound great. You sound beautiful. You look beautiful too. He's wrapped up in a nice cozy robe. He's got his tissues. Um. We miss you being here in LA for sure, but at this moment, I'm glad you're there. Sorry. Yeah, I would have gotten you all sick, and I would have done it on purpose.
I know, yeah, I know you would have. Germs... you'd be throwing your germs right in our faces. Get your... get your chucking germs out. Um. So this week, uh, we wanted to dig a little bit deeper into the differences between Bernie Sanders and Elizabeth Warren. Um, personally, I feel like I've been talking about this a lot lately. Cody and I talked about it a little bit on our other show last week. Um. And I know it might seem like we're harping on Warren and Sanders a lot, which I guess we are, but I think the majority of our listeners are probably split between the two of them, and voting starts for some of you guys, like, now, so it feels like now is the time to have this conversation. Yeah, yeah, that's fair. Um. And I think it's... conversations are good, discourse is great. Everybody has a good time during it. Nobody hates it, nobody gets... I think it's a good, uh... it's a good time, because it's a constant reminder of the fact that this is the worst year ever for everybody.
And I know that, in the past few weeks at least for me, and leading into the year that we're currently in, uh, it has felt so much like... I don't know, it always has a little bit, like it never really ended, and it never really will. It never will. We're stuck there, and that's the OG. Yeah, yeah, absolutely, it was the worst year, and we're reliving just, like, heightened versions of it every year, so they do get worse. See? Worse. But that was like, all right, it's time. Yeah, it's time for only bad years from now on. Yeah, yeah, this conversation definitely feels like we're back there in a lot of ways, um, especially now; like, recently it's been like, oh, by the way, we're gonna do this again. And I guess part of the reason why I think it's important, and I'm gonna spin it positive here, is that these actually are two of the most progressive presidential candidates that we've ever seen, uh, and we're lucky to have them. I think a lot of my frustration around this conversation comes from the fact that a lot of people seem to forget that really quickly.
We're all very quick to paint Warren as a liar and Bernie as an asshole, or whatever it is that people want to frame Bernie as, and the media exacerbates this, and soon everyone is just pissed off at each other. And, uh, yeah, so I just wanted to say: take a moment of appreciation for being fortunate enough to have two great candidates and to even be having these conversations. We don't actually have to take a moment. Yeah, that was a moment. Um. So you're saying that she's not necessarily a neoliberal shill and he's not necessarily a sexist anti-Semite. Exactly. I think the truth is somewhere in the middle there. Um. Also, I mean, this is a scary conversation for me to have. Every time we talk about it, I get a lot of shit online from some of you guys, and... okay, we're all worked up about all of it. But part of this conversation, I hope, is to, like, uh, show a way to have these conversations in a more productive manner.
Well, Katie, before you get into it, I want to try and just detract perhaps some, some, some heat from you by starting this episode with my contention that Bernard Sanders was the real gunman on the grassy knoll who shot dead JFK. God... statement. Not Ted Cruz's dad? No, no, Ted Cruz's dad was the ammo man, though. Okay, that's called bipartisanship right there. Okay, that's why Bernie should be president. He's gonna bring us all together. It's just... it's all full circle. And I also want to say, I don't... I still don't know who I'm voting for. A lot of people are like, oh, you're just... you're only ever going to vote for Warren. And I'm like, no, I feel the need to defend her because I think that she gets mischaracterized, and I like her, but it doesn't mean that I'm necessarily voting for her. So anyway, my two cents on all that. That makes sense, because, yeah, I mean, like you're saying, she's not Hillary Clinton. There are issues I have, but it's not the same, and seeing that it's treated as the same thing is probably frustrating. So, okay.
Uh, like I mentioned last week on Even More News, we talked about this whole he-said-she-said, woman-running-for-president controversy thing, and then we had a larger conversation about the toxic climate online surrounding this whole election and the sexism that we still seem to be struggling with as a party. Um. And it was an interesting conversation, and I think that a lot of you guys also agreed with us, so we wanted to keep that going. Um. But yeah, if you remember, CNN reported last year that Elizabeth Warren and Bernie Sanders had some sort of a meeting to discuss policies and got into a dispute as to whether or not a woman could win the presidency. Bernie denied the allegations, saying that it was a discussion about how Trump would manipulate any weaknesses. And again, I don't want to spend too much time talking about this right now.
Yeah, it just really disappointed me how so many people's initial reaction was to immediately jump to, uh, you know, "she's a liar," and then there was just this huge, aggressive, and disproportionate online blowback against her, like millions of snake emojis, hashtags calling her a lying snake, calls for refunds of their donations, etcetera. Um, and I personally perceived the snake symbol, in this specific context, as being sexist. I know that not everybody agrees with me. I know that you don't, Cody. Um, we can get into that if you want. But even outside of that, to me, immediately jumping to the conclusion that she is the liar is kind of sexist. It's a he-said-she-said situation, and immediately people assumed that she is the wrong party here. And I think the truth of the matter is that it's something that's much more gray than that. I think that they're probably both right in their perceptions of how that conversation went. There's room... there's room for that. Yeah.
Um, I mean, I think, again, like, we're not going to get super into what we've already talked about, but I think that my main... a lot of my pushback was about the media and how they framed it, and how they continued to frame it, um, and how her response during the debate was sort of an embrace of the framing that CNN was going for. Like the, "Uh oh, so, Bernie, you're saying you didn't say this?" "Yeah, I didn't say that." "So, uh, what did you say when he said that?" And her response was, "Well, I disagreed." So that's accepting the narrative that they're framing to get that answer. And I think... and my response to that is like, yeah, I mean, if that's how she interpreted this conversation, if that's how it plays out in her mind, part of me is like, why should she be expected to, like, back down from that? She's angry. But she didn't... if she didn't like the story... then it was just, like, a leaked story about her, like, private conversation. So is it that kind of thing where she's like, "I'm mad about this and I want to talk about it"?
Well, she's being... everybody's talking about it, and she's being put on the spot to talk about it, and I think that the whole conversation about it has been unfair, and she's probably even more angry. I mean, that's reading into it; that's my interpretation. Because, to me, I'm just sitting here imagining myself in that conversation, and you're talking to one of your dear friends, where you're very much aligned, and being told as a woman that you can't do something that you really want to do. It's humiliating. I would feel humiliated if I was her and this story came out. And in that moment, with somebody that she really respects, I bet she was pissed. I don't think she leaked it. I know that she talked about it like a year ago, and then this story gets circulated again now, at this point in time. I have also said this on the other show: the accusations that she did it on purpose seem really far-fetched to me. Maybe somebody pushed it from her side... well, we don't see any evidence of that, from what I can tell.
And she has been running a campaign that has so staunchly avoided this kind of a conversation that it doesn't make sense to me, and it's hurting her. It's hurting her. I... logically, it doesn't make sense to me. I hear your argument that, like, you feel like she should have squashed it in that moment. Yeah, maybe, maybe, maybe. But I don't know. I don't know that I would have wanted to. I think that I would be a little bit fed up. And that's my... yeah, that's my perspective. By not squashing it, or not... like, like, by accepting their narrative. Like, later that day, CNN had a headline like, "The candidates debate on whether or not a woman can be president," and, like, literally nobody was debating that, and that's obnoxious. Again, and even, like... we've had this conversation about what Trump will probably do, uh, in the election, as we saw in 2016. Two weeks before this big thing about this private conversation between friends broke, Joe Biden, on camera, in front of reporters, for people to see, said the exact same thing. He said that Hillary Clinton had to deal with a lot of sexism...
..."I won't have to deal with that." It's the same thing, and nobody cares. Nobody brought it up. It's not the thing. Um, so I... so it's... I'm always... I just... I completely agree with you about that. I think that those are bizarre, uh, and blatant moves by the media in all of this, and they certainly exacerbated this. But I want us to be able to see it happening and to not immediately jump to attacking the woman, and then the women and people online that support them... the woman in question. Like, it just feels very ugly to me, um, and personally, especially in the thick of all that, I felt, like, a little bit... a little bit betrayed, in a way. And that's the extreme version of it. Uh, you know, when I think about people who I want... who I assume, in general, we're allied with, um, being incapable of, like, taking a step back and thinking about it from a female perspective... this whole situation, just this situation, um... it bothers me.
And I've said this before, and... this reaction also kind of makes me think that, yeah, a woman can't win in this climate, if the people on our side, um, are so quick to jump to these conclusions about her. Yeah, I think a lot of it might come down to the fact that, like, we're all having this filtered through the media that exists in this country, the mainstream media that exists in this country, um, which is not only bad at journalism, um, but also exists to stoke division and disagreement and discordance, because that makes for better television, um. And I think that's a major factor in why this got out of hand as quickly as it did. And I do think that it may have had the impact of actually kind of exaggerating how difficult it is for a woman to become president in this country. Um. And I will say, it's one of the few things that makes me happy that the media ecosystem in this nation is collapsing in on itself like a dying star. Um.
Because it's... it's very irresponsible, and I do think it's making it hard for anyone to really know where the people of this country stand on issues like this, and it's also, you know, creating these sorts of conflicts that would not... in a responsible media ecosystem, this never would have been a story. Um. And I think that's frustrating. It's as frustrating as, for example, the media's obsession with the fact that Bernie Sanders used a Mannlicher-Carcano rifle to shoot JFK when he was driving through Dallas. Um. They won't... yes, it's not a... it's an Italian-made rifle, it's not an American-made rifle. But if you look at what was available on the civilian market at that time and what was an affordable working man's option for shooting the president... Exactly, exactly. Bernie Sanders, you know, is going to pick, like, the affordable working person's choice, you know, even if that means not buying American, and I, for one, support that. Thank you, Robert. It's like... I mean, his campaign slogan is "Not me. Nobody. Nobody's president. I'm a president killer." Um.
I think that we should transition to talking a little bit about Bernie Bros, since we've been talking about the online climate, and whether or not they exist. Cody and I have slightly different perspectives. Well, also, I'm going to speak about this, but also because I sort of... to your point where you're talking about how you felt sort of betrayed by the reaction to the situation, um, I've also talked to a lot of women who felt betrayed on the other side of it. I'm sure... I certainly don't speak for all women, right. Well, right, so I just wanted to make sure that I'm sort of communicating that. You're speaking for other ones, exactly. Thank you, thank you, Cody, for framing it like that. No problem. No, but it's true. I don't want to pretend like I represent everybody. That's just my perspective, and, um, a lot of the people that I speak with... and granted, that's because, you know, I have my friends, like, the people that I'm aligned with and that you communicate with.
But it's difficult, because, again, Bernie and Elizabeth are two great candidates, and I think a lot of people that are still waffling between them feel attacked for still liking her, and it becomes difficult for a lot of people... the idea that it is difficult to say, like, that they're going to capitulate or whatever and start supporting Bernie. And that's not fair, it's not good, you know. But I worry... I worry about this dialogue alienating people. Which, again, is a good transition to talking about the whole Bernie Bro phenomenon, um... the alleged phenomenon. Alleged phenomenon. Cody does not believe that it exists, or to the degree that they believe... Let me... please. I'm sorry. I think that sexism is a problem in America, um, and the globe, and also that every candidate has very passionate slash toxic supporters. And I think that, uh, like I said, sexism is a problem, and, uh, you can see it, and we'll talk about this a little more later, but you can see it from everyone. You can see it in every person's campaign, all their supporters. Um.
I think that the Bernie Bro specific thing, um, is blown out of proportion, like we've talked about, by the media, um, to create a divisive narrative, and also because they hate Bernie Sanders specifically. And I don't disagree with you that the media exacerbates it, like, I can't... I don't. But I have such a different experience than that, and I think so many other people do, that it's really hard to accept, because I feel it very intensely, and I know... I know that it's a problem across the board. Let me just finish this point really quick. I know that it's a problem across the board, and that sucks. We need to stop it, stop talking to each other like this. But specifically, there are a lot of people that feel... and the Bernie... we don't have to call them Bernie Bros; that's an offensive name now, it's become pejorative. But, like, you know, there's an aggressive type of reaction, online particularly, and it's very alienating, it's very offensive. It's hard for me to accept that it's all just the media, because I see it every day personally, and I know that a lot of other people do as well.
Um, yeah, well, and that's... I mean, again, I'm not speaking for... I've just seen so many, uh, so many women online, who are, like, Bernie Sanders fans, who have just, like, screen grabs and screen grabs of the exact same thing that you're describing. Um, and it's hard to gauge. Like, I do see so many, so many takes of, like, yeah, it's a problem in general, but it's worse with Bernie Sanders. I haven't seen... there's no data on that. It's all... how do you gauge that? It's all just sort of this anecdotal thing, um. And I think that making that claim... and, like, you'll see, especially in the past, like, week and a half, as Sanders is growing and growing in the polls and in popularity, there have been, like, five articles about Bernie Bros. Sure. And I think that's all they have. And that's why I resist it so much, especially the name, because... I hear, like... well, no, it's not even... because it's not... I don't think it's a Bernie thing. I think it's just, like, there's sexism and misogyny, and some of them happen to support him, but it's not a him problem.
He's, like, the only candidate who's actually... But he's the only candidate who's said, like, don't do it. He's the only one who's said anything about it. It's great. I'm not blaming Bernie. I'm talking about us. What can we do to control ourselves as we navigate the worst year ever? If you do not want that label, what can we do? Not to put that all on you... universal "you." What can we do to detract from the potency of this story? How can we behave better to each other? And it's not just Bernie people. Sure. How can we all do that? But it's indisputable that this exists, because people are passionate about their candidate, and I love that, especially people from marginalized communities that see him as, like, being a real shot. I get that. But there's a problem right now with how we're talking about it. And yes, the media is exacerbating it, so let's take the power away from it. Let's stop doing it. Well, so I guess that's... I mean, I don't disagree with anything you said. I just think that part of the problem is literally the term and what we're talking about now.
It's still being framed like this 339 00:19:31,400 --> 00:19:36,480 Speaker 1: Bernie bro thing. In, uh, two thousand and eight there were articles 340 00:19:36,480 --> 00:19:40,480 Speaker 1: about Obama boys and like, again, it's not. And but 341 00:19:40,640 --> 00:19:43,800 Speaker 1: then there are also like all articles about like Clinton supporters 342 00:19:43,840 --> 00:19:46,720 Speaker 1: posting child porn in Facebook groups to get them banned 343 00:19:46,760 --> 00:19:49,880 Speaker 1: and things. It's everywhere. It's not this thing. I think 344 00:19:49,960 --> 00:19:52,119 Speaker 1: part of how we talk to each other better is 345 00:19:52,160 --> 00:19:57,720 Speaker 1: not, uh, is not using this framework that the media 346 00:19:57,960 --> 00:20:02,480 Speaker 1: is starving for. Except that it's, it's, for a 347 00:20:02,560 --> 00:20:04,840 Speaker 1: lot of us, it feels much more potent within the 348 00:20:04,960 --> 00:20:09,680 Speaker 1: Bernie movement. And again I understand why, and I understand 349 00:20:09,880 --> 00:20:12,880 Speaker 1: and you see it differently. I mean, if you were 350 00:20:12,920 --> 00:20:16,160 Speaker 1: supporting Warren, maybe you would feel more of that heat 351 00:20:16,240 --> 00:20:18,639 Speaker 1: online and you would have a different perspective. But I 352 00:20:18,760 --> 00:20:20,840 Speaker 1: accept what you're saying. I understand where you're coming from. 353 00:20:20,920 --> 00:20:22,840 Speaker 1: We have to take a break now. This has been 354 00:20:23,000 --> 00:20:25,840 Speaker 1: so fun so far, though. It's one of the best 355 00:20:26,359 --> 00:20:30,280 Speaker 1: days of the best year. It's been as fun as 356 00:20:30,800 --> 00:20:34,600 Speaker 1: when Bernie Sanders and Ted Cruz's father teamed up to 357 00:20:35,560 --> 00:20:37,879 Speaker 1: give this nation a little shot in the arm, just 358 00:20:38,000 --> 00:20:41,359 Speaker 1: a little, just a little how do you do? Uh? Cool?
359 00:20:41,920 --> 00:20:44,359 Speaker 1: You know? Products and services? And then more of this 360 00:20:49,400 --> 00:20:58,560 Speaker 1: together everything don't cool. We're back from that break, more 361 00:20:59,560 --> 00:21:03,080 Speaker 1: of our conversation. We're gonna move on to other fresh 362 00:21:03,119 --> 00:21:06,200 Speaker 1: new topics soon. But real quick, uh, Cody, why don't 363 00:21:06,200 --> 00:21:08,600 Speaker 1: you talk to us about that New York Times article? Yeah. 364 00:21:08,720 --> 00:21:13,120 Speaker 1: So this this is a recent New York Times article 365 00:21:13,200 --> 00:21:14,960 Speaker 1: that came out about Bernie Bros. And it's one of, 366 00:21:15,040 --> 00:21:17,959 Speaker 1: again, many that came out recently because they're all very 367 00:21:18,040 --> 00:21:22,080 Speaker 1: very scared. Um. And I think it's very indicative of 368 00:21:22,359 --> 00:21:25,919 Speaker 1: kind of the problem with just like accepting this narrative 369 00:21:25,960 --> 00:21:30,680 Speaker 1: wholesale, and and I personally am, I'm just resistant to 370 00:21:30,880 --> 00:21:34,240 Speaker 1: it because I know what they're doing. Um. This New 371 00:21:34,320 --> 00:21:38,760 Speaker 1: York Times piece about Bernie's army of trolls is, 372 00:21:39,200 --> 00:21:42,040 Speaker 1: it's written by three people. It seems like a lot 373 00:21:42,240 --> 00:21:45,840 Speaker 1: for an article about Bernie Bros. But that's fine, and 374 00:21:46,480 --> 00:21:48,840 Speaker 1: there are just these, some passages in there that I think 375 00:21:48,880 --> 00:21:52,200 Speaker 1: are very interesting. But one, one of their main sources 376 00:21:52,280 --> 00:21:55,760 Speaker 1: for this article, uh, was a woman named Candice. Um, 377 00:21:55,920 --> 00:21:58,800 Speaker 1: and it was about all the abuse and harassment she's gotten. Um.
378 00:21:59,119 --> 00:22:02,000 Speaker 1: And one thing they don't seem to mention is that 379 00:22:02,080 --> 00:22:06,040 Speaker 1: she is like a notorious troll who harasses people online. Um. 380 00:22:06,359 --> 00:22:10,479 Speaker 1: There are like writers for other publications who've talked about how 381 00:22:10,880 --> 00:22:14,320 Speaker 1: she, after they blocked her, she would send them emails 382 00:22:14,359 --> 00:22:18,680 Speaker 1: and go to their other social media platforms to like 383 00:22:18,800 --> 00:22:22,240 Speaker 1: harass them, which again is interesting. So I'm gonna read 384 00:22:22,240 --> 00:22:26,120 Speaker 1: these passages real quick, um, from the article. But it's 385 00:22:26,119 --> 00:22:28,720 Speaker 1: interesting that they didn't point out that she does the 386 00:22:28,800 --> 00:22:32,040 Speaker 1: exact same thing. And then like vicious, like sexist things too. 387 00:22:32,240 --> 00:22:36,040 Speaker 1: There's a lot of sexism there. But this passage, um, 388 00:22:36,440 --> 00:22:41,080 Speaker 1: describing the behavior: they swarm someone online. More commonly, there 389 00:22:41,200 --> 00:22:44,280 Speaker 1: is a barrage of jabs and threats, sometimes framed as jokes. 390 00:22:44,320 --> 00:22:46,359 Speaker 1: If the target's a woman, and uh, it often is, 391 00:22:46,400 --> 00:22:49,680 Speaker 1: these insults can veer towards her physical appearance. It just 392 00:22:49,720 --> 00:22:52,560 Speaker 1: sort of describes this behavior of Bernie Bros specifically, and 393 00:22:52,600 --> 00:22:56,639 Speaker 1: then there's this one short paragraph. His allies also argue 394 00:22:56,880 --> 00:22:59,200 Speaker 1: that online combat is not unique to the Sanders side, 395 00:22:59,240 --> 00:23:01,720 Speaker 1: with some high-profile women who support the senator 396 00:23:01,760 --> 00:23:04,320 Speaker 1: saying they've been attacked too.
And I think the way 397 00:23:04,359 --> 00:23:08,520 Speaker 1: that they've phrased this is really interesting, because on the 398 00:23:08,600 --> 00:23:13,840 Speaker 1: one hand, you have this article describing a behavior, like, 399 00:23:13,960 --> 00:23:15,560 Speaker 1: they do this, they do this, they do this, they 400 00:23:15,640 --> 00:23:19,359 Speaker 1: do this, and then there's this very short passage, and 401 00:23:19,520 --> 00:23:25,120 Speaker 1: some people say that it's this. It's, uh, descriptive 402 00:23:25,520 --> 00:23:28,760 Speaker 1: factual statements and then this sort of passive, like, and 403 00:23:28,840 --> 00:23:31,960 Speaker 1: then some people say that it's actually another thing. And 404 00:23:33,400 --> 00:23:36,760 Speaker 1: when you're literally using a source that engages in that 405 00:23:36,880 --> 00:23:40,000 Speaker 1: exact behavior. And I'm not saying that a lot of 406 00:23:40,080 --> 00:23:43,680 Speaker 1: this, like, this is representative of what you've experienced or 407 00:23:43,760 --> 00:23:46,320 Speaker 1: like the kind of messages you get, but the way 408 00:23:46,400 --> 00:23:49,440 Speaker 1: they frame it. There's this one passage describing John 409 00:23:49,600 --> 00:23:52,800 Speaker 1: Legend and how he's going to vote for Warren, and he's like, hey, 410 00:23:52,800 --> 00:23:54,560 Speaker 1: and you know, all the, you know, I'm going to 411 00:23:54,680 --> 00:23:57,680 Speaker 1: vote for whoever in the primary, so everybody chill. To 412 00:23:57,800 --> 00:24:00,159 Speaker 1: quote the New York Times, this did not necessarily land 413 00:24:00,240 --> 00:24:03,520 Speaker 1: with its intended audience. Quote, some of you millionaires need 414 00:24:03,600 --> 00:24:05,919 Speaker 1: to realize that many of us actually need Bernie Sanders 415 00:24:06,000 --> 00:24:09,080 Speaker 1: to win the presidency, one account replied. We can't just chill.
416 00:24:09,760 --> 00:24:11,960 Speaker 1: And that's like their example, of like, yeah, they should 417 00:24:11,960 --> 00:24:15,920 Speaker 1: have shown all of the snake poop emojis and stuff. Sure, 418 00:24:16,720 --> 00:24:18,760 Speaker 1: but even that, so like this, I mean, we don't 419 00:24:18,760 --> 00:24:21,960 Speaker 1: need to talk about that much. But this sending 420 00:24:22,000 --> 00:24:24,320 Speaker 1: snake emojis to a politician is like, I don't know, 421 00:24:24,359 --> 00:24:28,639 Speaker 1: they're a politician. Yeah, I think it's, it really depends on 422 00:24:28,640 --> 00:24:30,199 Speaker 1: the context. But we don't need to get derailed by 423 00:24:30,240 --> 00:24:34,160 Speaker 1: that. This article, I see what you're saying. Um, it's, 424 00:24:34,680 --> 00:24:36,960 Speaker 1: one of the things that's frustrating to me about coverage 425 00:24:37,040 --> 00:24:39,560 Speaker 1: like that is that it it sort of acts as if, 426 00:24:39,960 --> 00:24:45,159 Speaker 1: um, like, I've had responses where like I express an 427 00:24:45,200 --> 00:24:47,600 Speaker 1: opinion and get swarmed by a bunch of assholes who 428 00:24:48,720 --> 00:24:52,280 Speaker 1: are being wildly unreasonable and aggressive about it for basically 429 00:24:52,400 --> 00:24:56,359 Speaker 1: every opinion that I've ever expressed. Um, like, it's like 430 00:24:56,480 --> 00:24:59,119 Speaker 1: there's, there's communities of, like, if you take a stance 431 00:24:59,320 --> 00:25:02,800 Speaker 1: on something on the damn Internet, um, you're going to, 432 00:25:03,440 --> 00:25:05,520 Speaker 1: like, there's a good chance you'll attract a bunch of 433 00:25:05,560 --> 00:25:08,960 Speaker 1: harassment from it, because that's the Internet. If you attack.
434 00:25:09,400 --> 00:25:12,960 Speaker 1: For example, if I insult Elon Musk, I will deal 435 00:25:13,080 --> 00:25:15,879 Speaker 1: with Elon Musk fanboys in my mentions. If I 436 00:25:16,040 --> 00:25:18,520 Speaker 1: insult Andrew Yang, I will deal with Andrew Yang fan 437 00:25:18,600 --> 00:25:22,320 Speaker 1: boys in my mentions. And neither of those guys get 438 00:25:22,720 --> 00:25:25,080 Speaker 1: the New York Times being like, this is a problem 439 00:25:25,200 --> 00:25:28,200 Speaker 1: that that, you know, Andrew Yang has to address, or 440 00:25:28,280 --> 00:25:31,199 Speaker 1: this is a problem that Elon Musk has to address. Um. 441 00:25:31,520 --> 00:25:34,200 Speaker 1: But it's treated as if it's like some unique aspect 442 00:25:34,280 --> 00:25:36,960 Speaker 1: of the Sanders campaign, when I don't think it is. 443 00:25:37,040 --> 00:25:39,119 Speaker 1: I think it's just the fact that like it's a, 444 00:25:39,160 --> 00:25:42,080 Speaker 1: it's an, it's fandom. Bernie has a dedicated fandom, and 445 00:25:42,160 --> 00:25:46,439 Speaker 1: dedicated fans will do abusive, shitty things to people who 446 00:25:46,480 --> 00:25:48,600 Speaker 1: attack the thing that they, the thing that they're a 447 00:25:48,640 --> 00:25:51,680 Speaker 1: fan of. And that's true of Bernie. It's also true 448 00:25:51,720 --> 00:25:56,760 Speaker 1: of, for example, Star Wars. Yeah, I mean, or Donald Trump. 449 00:25:57,040 --> 00:25:59,280 Speaker 1: It's true. You know, you're, you're both right about that. 450 00:26:00,040 --> 00:26:03,000 Speaker 1: But I think it's undeniable how much of it is 451 00:26:03,080 --> 00:26:06,640 Speaker 1: happening right now around this conversation and around Bernie. I mean, yes, 452 00:26:07,160 --> 00:26:08,520 Speaker 1: you put something out there, and you're going to 453 00:26:08,600 --> 00:26:10,440 Speaker 1: get responses from people.
You're putting it out there, and 454 00:26:10,720 --> 00:26:12,320 Speaker 1: who knows, what's, who's going to read it, and some 455 00:26:12,359 --> 00:26:14,119 Speaker 1: people are gonna agree with you and some people aren't. 456 00:26:14,640 --> 00:26:17,159 Speaker 1: It feels unique to me, and I've been a 457 00:26:17,200 --> 00:26:21,199 Speaker 1: woman online for a long time now. Around this conversation, 458 00:26:21,920 --> 00:26:24,280 Speaker 1: I mean, just the nature of a lot of the responses. 459 00:26:24,320 --> 00:26:27,280 Speaker 1: How about how many people respond to me saying that 460 00:26:27,400 --> 00:26:31,000 Speaker 1: Cody needs to talk sense into me, and to me, that's fucking sexist. 461 00:26:31,880 --> 00:26:34,440 Speaker 1: I'm sorry. I don't get that from other people in 462 00:26:34,520 --> 00:26:37,960 Speaker 1: other conversations about other candidates. Anyway, now I'm getting worked 463 00:26:38,000 --> 00:26:40,159 Speaker 1: up about it, but yeah, I hear you. I I 464 00:26:40,280 --> 00:26:43,120 Speaker 1: think that there wouldn't be that different of a reaction 465 00:26:43,600 --> 00:26:46,280 Speaker 1: if you were a man and we had this disagreement 466 00:26:46,359 --> 00:26:48,600 Speaker 1: on and talked about on the podcast a lot. Maybe, 467 00:26:48,800 --> 00:26:51,840 Speaker 1: I don't know. Katie, you and your soft brain, have Cody 468 00:26:52,000 --> 00:26:55,440 Speaker 1: talk some sense into you, does not feel necessarily 469 00:26:55,520 --> 00:27:00,280 Speaker 1: like phrasing someone would say to every person on the Internet. 470 00:27:00,920 --> 00:27:04,200 Speaker 1: It's like, it's like, I do not mean to make this 471 00:27:04,280 --> 00:27:07,320 Speaker 1: heated and personal. I'm sorry for that, but I agree 472 00:27:07,359 --> 00:27:09,320 Speaker 1: with you, and I think, and I do think it's 473 00:27:09,320 --> 00:27:11,080 Speaker 1: a problem across the board.
And that's why I said 474 00:27:11,119 --> 00:27:13,879 Speaker 1: what I said earlier, which is that I want us 475 00:27:13,920 --> 00:27:15,600 Speaker 1: to be better within the Bernie camp, but I also 476 00:27:15,640 --> 00:27:17,240 Speaker 1: want us to be better across the board with all of 477 00:27:17,359 --> 00:27:19,320 Speaker 1: us in this conversation. But I think that we should 478 00:27:19,320 --> 00:27:21,359 Speaker 1: move on to other topics. How do you guys feel 479 00:27:21,359 --> 00:27:23,560 Speaker 1: about that? You don't want to have more fun? I 480 00:27:23,640 --> 00:27:24,879 Speaker 1: do want to. I want to have more fun with 481 00:27:25,000 --> 00:27:28,879 Speaker 1: other things. Um, let's talk about some endorsements that they 482 00:27:28,960 --> 00:27:31,320 Speaker 1: both have been getting, because they've both had a big week. 483 00:27:32,240 --> 00:27:35,560 Speaker 1: A lot of people have attacked Bernie for accepting the 484 00:27:35,720 --> 00:27:40,280 Speaker 1: endorsement of the Dallas Book Depository, but you know, it's 485 00:27:40,359 --> 00:27:42,840 Speaker 1: done a lot for him over the years. The only 486 00:27:42,960 --> 00:27:48,600 Speaker 1: endorsement they've ever given, they've ever given, and I think 487 00:27:48,880 --> 00:27:50,360 Speaker 1: we should be able to move on as a nation 488 00:27:50,400 --> 00:27:52,320 Speaker 1: at this point. I accept that. I agree with you. 489 00:27:53,080 --> 00:27:57,159 Speaker 1: Let's, uh, talk about the Des Moines Register, because Warren 490 00:27:57,600 --> 00:27:59,679 Speaker 1: scored that one this week. It's a pretty big deal 491 00:27:59,720 --> 00:28:04,320 Speaker 1: because, considering that the Iowa caucus is just days away. Um.
492 00:28:04,760 --> 00:28:05,960 Speaker 1: You know, at the top of the article they have 493 00:28:06,000 --> 00:28:08,719 Speaker 1: a list of the highlights, which include: many of her 494 00:28:08,800 --> 00:28:12,240 Speaker 1: ideas aren't radical, they are right. Uh, she must show 495 00:28:12,280 --> 00:28:14,280 Speaker 1: that her vision will lift people up rather than divide them. 496 00:28:14,400 --> 00:28:16,280 Speaker 1: She cares about people, and she will use her seemingly 497 00:28:16,359 --> 00:28:19,080 Speaker 1: endless energy and passion to fight for them. Um. 498 00:28:19,720 --> 00:28:22,359 Speaker 1: The outstanding caliber of Democratic candidates makes it difficult to 499 00:28:22,440 --> 00:28:26,080 Speaker 1: choose just one. So I know you've got some thoughts 500 00:28:26,480 --> 00:28:29,480 Speaker 1: on this. Um. I'll just start by saying, I know 501 00:28:29,480 --> 00:28:31,399 Speaker 1: a lot of people have pointed to the many of 502 00:28:31,480 --> 00:28:35,000 Speaker 1: her, her ideas aren't radical line as, you know, kind 503 00:28:35,040 --> 00:28:38,000 Speaker 1: of proof that she's the more centrist of the candidates. 504 00:28:38,040 --> 00:28:42,320 Speaker 1: And I guess that's true of the two. And then, 505 00:28:42,400 --> 00:28:44,280 Speaker 1: as we're going to talk about their different policies 506 00:28:44,360 --> 00:28:47,000 Speaker 1: soon, and I think you'll see that they aren't that 507 00:28:47,400 --> 00:28:52,200 Speaker 1: much different. There's some like key differences. Um. So to 508 00:28:52,320 --> 00:28:56,720 Speaker 1: me, that she's, that her ideas aren't that radical 509 00:28:56,920 --> 00:29:00,280 Speaker 1: is like, well, she's really not that different than Bernie. Yeah.
510 00:29:00,280 --> 00:29:02,560 Speaker 1: Think it's more just like the phrasing of like clearly 511 00:29:02,600 --> 00:29:04,120 Speaker 1: like they're trying to be like, don't worry, she's not 512 00:29:04,160 --> 00:29:06,440 Speaker 1: that radical. She's like she used to be a Republican. Guys, 513 00:29:08,480 --> 00:29:10,720 Speaker 1: important to keep in mind that this is like a 514 00:29:10,840 --> 00:29:15,440 Speaker 1: much more conservative state, and you know, so they're trying 515 00:29:15,480 --> 00:29:20,400 Speaker 1: to appeal to who's the voters of this publication and 516 00:29:20,880 --> 00:29:23,920 Speaker 1: who vote there? Um, but yeah, go ahead and say 517 00:29:23,960 --> 00:29:26,040 Speaker 1: what your thoughts. Oh, I just I just wanted to 518 00:29:26,080 --> 00:29:29,600 Speaker 1: make a quick note about the Des Moines Register endorsement, 519 00:29:29,680 --> 00:29:34,080 Speaker 1: and I guess endorsements in general. Um uh so the 520 00:29:34,200 --> 00:29:38,400 Speaker 1: past candidates who have been endorsed by the Des Moines Register. 521 00:29:38,480 --> 00:29:41,719 Speaker 1: I'm gonna go back to like late eighties, I think, Um, 522 00:29:41,800 --> 00:29:47,480 Speaker 1: Paul Simon, not who you're thinking of, Paul Simon, Bob Dole, 523 00:29:47,960 --> 00:29:51,560 Speaker 1: Bill Bradley, George W. Bush, the second, I guess, Georgia 524 00:29:51,600 --> 00:29:54,920 Speaker 1: Lady Bush. I'm gonna start this list again. Hold on, uh, 525 00:29:55,280 --> 00:29:59,960 Speaker 1: Paul Simon, Bob Dole, Bill Bradley Bush to John Edward, 526 00:30:00,240 --> 00:30:02,080 Speaker 1: Hillary Clinton in two thousand and eight, as well as 527 00:30:02,400 --> 00:30:09,080 Speaker 1: John McCain, Clinton and Ruby On. 
So of all of those, 528 00:30:09,480 --> 00:30:11,960 Speaker 1: I think only two of them actually won Iowa, like 529 00:30:12,160 --> 00:30:14,880 Speaker 1: the Iowa caucus specifically, and of all of them, only 530 00:30:14,960 --> 00:30:18,640 Speaker 1: one was, uh, was the president. Um. And we don't 531 00:30:18,680 --> 00:30:20,760 Speaker 1: need to talk about the two thousand election. But, and 532 00:30:20,840 --> 00:30:22,440 Speaker 1: this isn't to say that, like, it's not cool that 533 00:30:22,520 --> 00:30:24,640 Speaker 1: she got it. Uh. There's a video of her learning 534 00:30:24,680 --> 00:30:27,840 Speaker 1: the news that's actually very very cute, um, and it's 535 00:30:27,920 --> 00:30:30,080 Speaker 1: like, she's like really excited. It's like, great, uh, for 536 00:30:30,160 --> 00:30:32,400 Speaker 1: the campaign. But, and so I'm not saying, like, it 537 00:30:32,440 --> 00:30:35,240 Speaker 1: doesn't matter. I'm just saying that it's not like this 538 00:30:35,680 --> 00:30:39,200 Speaker 1: big thing. And endorsements in general these days, like the 539 00:30:39,240 --> 00:30:41,120 Speaker 1: New York Times endorsement, where they split it down the 540 00:30:41,160 --> 00:30:43,480 Speaker 1: middle and they chose two, and like, they treated it like 541 00:30:43,520 --> 00:30:46,360 Speaker 1: a reality show, uh, even though they hate the reality 542 00:30:46,400 --> 00:30:50,360 Speaker 1: show president. Um. This is sort of a little bit 543 00:30:50,440 --> 00:30:52,640 Speaker 1: about what Rob was talking about earlier, and just sort 544 00:30:52,680 --> 00:30:57,080 Speaker 1: of the, the slow deterioration of, uh, these institutions and 545 00:30:57,200 --> 00:30:59,840 Speaker 1: journalism in general. I think this is a good example 546 00:31:00,000 --> 00:31:02,840 Speaker 1: of it, where, like, does it really matter? Did 547 00:31:03,080 --> 00:31:05,960 Speaker 1: the endorsement move the needle at all for Amy Klobuchar?
548 00:31:06,120 --> 00:31:09,680 Speaker 1: I doubt it. Um, maybe, maybe a few people, sure. 549 00:31:09,800 --> 00:31:14,000 Speaker 1: But, but like, just in general, uh, endorsements at all, right? 550 00:31:14,360 --> 00:31:19,160 Speaker 1: Anything like doing a whole, like, these are your endorsement points. 551 00:31:19,560 --> 00:31:23,400 Speaker 1: But it's twenty twenty. Uh, Donald Trump became the president. 552 00:31:23,840 --> 00:31:27,200 Speaker 1: Does any of this matter? Right? Another point, another thing 553 00:31:27,280 --> 00:31:30,800 Speaker 1: that was included in this endorsement, uh, a qualification. Some 554 00:31:30,960 --> 00:31:34,200 Speaker 1: of her ideas for big structural change go too far. 555 00:31:34,520 --> 00:31:37,360 Speaker 1: This board could not endorse the wholesale overhaul of corporate 556 00:31:37,400 --> 00:31:41,200 Speaker 1: governance or cumulative levels of taxation she proposes. While the 557 00:31:41,240 --> 00:31:43,920 Speaker 1: board has long supported single payer health insurance, it believes 558 00:31:43,920 --> 00:31:47,760 Speaker 1: a gradual transition is a more realistic approach. But Warren is 559 00:31:47,800 --> 00:31:53,160 Speaker 1: pushing in the right direction. Yeah. Basically, we're the New 560 00:31:53,240 --> 00:31:58,640 Speaker 1: York Times and we don't support doing anything. Uh, this 561 00:31:58,800 --> 00:32:04,840 Speaker 1: is the Register. Yes, but consistent. Um. Yeah, you want 562 00:32:04,880 --> 00:32:09,880 Speaker 1: to say? Oh yeah, I think that's, it's, we'll probably 563 00:32:09,880 --> 00:32:11,920 Speaker 1: talk about this a little bit more when we get into 564 00:32:12,040 --> 00:32:15,080 Speaker 1: specific stuff. But like, the phrasing, pushing in the right direction, 565 00:32:15,120 --> 00:32:18,120 Speaker 1: I think it's a big distinction.
There's a difference between pushing in 566 00:32:18,120 --> 00:32:21,560 Speaker 1: the right direction and, like, doing the right thing, and 567 00:32:21,640 --> 00:32:22,880 Speaker 1: it's being like, no, this is what we're gonna do. 568 00:32:23,160 --> 00:32:25,120 Speaker 1: We're gonna fight for this thing that is right, as 569 00:32:25,120 --> 00:32:27,120 Speaker 1: opposed to, we're going to push in the right direction. 570 00:32:27,400 --> 00:32:30,880 Speaker 1: You know, it's the, the Buttigieg plenty bold kind 571 00:32:30,880 --> 00:32:33,200 Speaker 1: of thing. I get why that's annoying, and I agree. I 572 00:32:33,240 --> 00:32:38,360 Speaker 1: don't like that part. Um. Let's talk about Bernie and 573 00:32:38,480 --> 00:32:44,120 Speaker 1: Joe Rogan. Joseph, Joseph, Joseph Rogan. Of course. 574 00:32:44,240 --> 00:32:48,280 Speaker 1: Earlier this week, Joseph, uh, Rogan endorsed Bernie Sanders. 575 00:32:48,320 --> 00:32:51,120 Speaker 1: I will say real quick, he didn't necessarily endorse him. 576 00:32:51,160 --> 00:32:54,840 Speaker 1: He said, quote, I'm probably gonna vote for Bernie. Yeah, 577 00:32:55,160 --> 00:32:56,960 Speaker 1: we can play the clips, but we don't need to. Yeah. 578 00:32:57,040 --> 00:32:59,800 Speaker 1: And like, it's, you know, obviously, like, that's, that's showing support. 579 00:32:59,840 --> 00:33:02,720 Speaker 1: And he went on to talk about it, but like this, like, 580 00:33:02,800 --> 00:33:09,360 Speaker 1: oh, he endorsed him? No. Okay, but an endorsement. I'll 581 00:33:09,400 --> 00:33:11,760 Speaker 1: probably vote for him. I don't know if that's an endorsement, 582 00:33:12,840 --> 00:33:19,200 Speaker 1: but he, Joe Rogan, endorsed him. Yeah, point taken. Um, 583 00:33:20,520 --> 00:33:23,520 Speaker 1: and I don't have a problem with that. That's great.
584 00:33:24,000 --> 00:33:28,280 Speaker 1: The conversation about this was confusing online because, like, that's 585 00:33:28,960 --> 00:33:32,480 Speaker 1: not a problem, people, it's great. I don't know, 586 00:33:34,640 --> 00:33:38,360 Speaker 1: people have been saying for a long time. Okay. So 587 00:33:38,520 --> 00:33:42,080 Speaker 1: a lot of the, a lot of the blowback against 588 00:33:42,120 --> 00:33:44,000 Speaker 1: this came from the fact that Joe Rogan has, uh, 589 00:33:44,320 --> 00:33:47,360 Speaker 1: said horrible things about trans people in the past, and 590 00:33:47,680 --> 00:33:49,640 Speaker 1: so there were folks being like, I don't know if 591 00:33:49,680 --> 00:33:52,760 Speaker 1: I'll vote for Bernie now that he's, like, now that 592 00:33:52,880 --> 00:33:57,400 Speaker 1: he's, uh, touting this endorsement, and like, what? I don't know, 593 00:33:57,520 --> 00:34:00,240 Speaker 1: like, it's one of those things where I didn't see 594 00:34:00,280 --> 00:34:05,000 Speaker 1: what Bernie, like, the way Bernie touted the, the endorsement, 595 00:34:05,080 --> 00:34:08,360 Speaker 1: so to speak, was very carefully not endorsing Joe Rogan. 596 00:34:08,400 --> 00:34:11,560 Speaker 1: It was quoting Joe Rogan saying something nice about Bernie, 597 00:34:12,200 --> 00:34:15,239 Speaker 1: which I see as fine. Like, I get why people 598 00:34:15,239 --> 00:34:17,640 Speaker 1: are angry at Joe Rogan. I am permanently angry at 599 00:34:17,680 --> 00:34:22,680 Speaker 1: Joe Rogan. But I think it's shortsighted to take issue 600 00:34:22,880 --> 00:34:26,080 Speaker 1: with a guy that influential being like, yeah, I vote 601 00:34:26,120 --> 00:34:30,719 Speaker 1: for Bernie Sanders. That's a positive thing. That is a 602 00:34:30,760 --> 00:34:34,920 Speaker 1: positive thing.
I have a slightly different perspective, in that 603 00:34:36,800 --> 00:34:40,680 Speaker 1: I mean, I'm not extraordinarily pissed at Bernie for sharing, 604 00:34:40,800 --> 00:34:44,640 Speaker 1: for retweeting that. I don't think it was necessary. And 605 00:34:44,760 --> 00:34:47,920 Speaker 1: I think that, again, we can't possibly speak for everybody. 606 00:34:47,920 --> 00:34:50,480 Speaker 1: I've seen, I've tried to just participate, like, watch the 607 00:34:50,560 --> 00:34:55,439 Speaker 1: conversation unfold, you know. I think that it's, I don't 608 00:34:55,480 --> 00:34:57,920 Speaker 1: think that he gets more votes by sharing it. 609 00:34:58,239 --> 00:35:00,920 Speaker 1: I think that he offends a lot of people within 610 00:35:01,120 --> 00:35:02,520 Speaker 1: his base, and a lot of people aren't, but a 611 00:35:02,560 --> 00:35:04,359 Speaker 1: lot of people don't care. But there are a lot 612 00:35:04,400 --> 00:35:08,320 Speaker 1: of people whose existence Joe Rogan kind of denies, and 613 00:35:09,200 --> 00:35:12,440 Speaker 1: that, supporting him, and, and I don't see what is 614 00:35:12,560 --> 00:35:16,320 Speaker 1: gained by that, because anybody following Bernie that likes Bernie 615 00:35:17,560 --> 00:35:21,520 Speaker 1: isn't convinced by Joe Rogan. The fact that Joe Rogan 616 00:35:21,760 --> 00:35:24,680 Speaker 1: said he's probably going to vote for him, great, that's 617 00:35:24,719 --> 00:35:28,160 Speaker 1: all, him, him saying that on the show is 618 00:35:28,200 --> 00:35:30,919 Speaker 1: what does it. Also, like, the conversation that they had 619 00:35:31,560 --> 00:35:33,600 Speaker 1: when he was on, and like, the comments, and that, 620 00:35:33,640 --> 00:35:36,879 Speaker 1: it's like, okay, yeah, it's clearly, like, it's an effective thing.
Um, 621 00:36:37,000 --> 00:36:39,719 Speaker 1: and I think, and like, I totally understand, and I, like, 622 00:36:39,800 --> 00:36:42,160 Speaker 1: I, like you said, like, I've mostly just been listening 623 00:36:42,160 --> 00:36:44,640 Speaker 1: to what people have to say about it. I've seen 624 00:36:44,880 --> 00:36:47,440 Speaker 1: a lot of people in the trans community have issue with it, 625 00:36:47,600 --> 00:36:49,239 Speaker 1: and a lot of people not have issue with it. 626 00:36:49,560 --> 00:36:54,279 Speaker 1: I think what the better move is, because it's not like, oh, 627 00:36:54,400 --> 00:36:58,399 Speaker 1: look, we got Joe. It's presenting what he said and saying, 628 00:36:58,640 --> 00:37:02,680 Speaker 1: look at how convincing we are. It's not saying, like, 629 00:37:02,960 --> 00:37:04,880 Speaker 1: oh, we got Joe and we like him, which they 630 00:37:04,920 --> 00:37:08,399 Speaker 1: didn't say, but it's presenting it like, look at how 631 00:37:08,480 --> 00:37:11,520 Speaker 1: effective this is, the positive. That's something that surrogates, and 632 00:37:11,719 --> 00:37:17,000 Speaker 1: what people can do, like AOC, and they could say 633 00:37:17,120 --> 00:37:19,960 Speaker 1: something like, we, since there's a little bit of distance, 634 00:37:20,080 --> 00:37:22,880 Speaker 1: they can say something like, we understand that Joe Rogan, 635 00:37:23,640 --> 00:37:25,880 Speaker 1: maybe, we don't agree with him on all X, 636 00:37:26,000 --> 00:37:28,440 Speaker 1: Y or Z, but here's how we can bridge the divide. 637 00:37:28,520 --> 00:37:30,160 Speaker 1: We'd love to have AOC on the show so she 638 00:37:30,200 --> 00:37:32,240 Speaker 1: can talk to him about these issues. But we didn't 639 00:37:32,320 --> 00:37:35,920 Speaker 1: do that. That's not what happened. That's all. That's, that's 640 00:37:35,960 --> 00:37:37,920 Speaker 1: my whole perspective.
But I also don't think that it 641 00:37:38,000 --> 00:37:42,920 Speaker 1: makes, that's inherently, it's just, everything should come down a 642 00:37:42,960 --> 00:37:46,359 Speaker 1: notch. Yeah, everything should come down a notch. Um. Because, yeah, 643 00:37:46,400 --> 00:37:50,399 Speaker 1: and I think it is a good point to say, 644 00:37:50,480 --> 00:37:53,040 Speaker 1: look at how convincing we are. There was no compromise there. 645 00:37:53,080 --> 00:37:56,640 Speaker 1: The platform hasn't changed, his opinions haven't changed. In order 646 00:37:56,680 --> 00:37:59,600 Speaker 1: to get Joe's endorsement, he stuck to his guns, and 647 00:38:00,000 --> 00:38:02,960 Speaker 1: that's what was convincing. Um. I think that's the argument. 648 00:38:03,040 --> 00:38:06,640 Speaker 1: They're not necessarily lifting up this, uh, you know, like, 649 00:38:06,719 --> 00:38:09,800 Speaker 1: transphobic guy who's had a lot of, like, pretty horrible 650 00:38:09,840 --> 00:38:14,440 Speaker 1: people on the show to sort of spread their false message. Um. 651 00:38:15,239 --> 00:38:18,200 Speaker 1: Also, just real quick on this issue. This is, again, 652 00:38:18,680 --> 00:38:21,440 Speaker 1: like you said, like, bring it down a notch, everybody. Um. 653 00:38:21,800 --> 00:38:27,560 Speaker 1: And it's another example of, uh, never-Trump Republicans, uh, 654 00:38:27,960 --> 00:38:33,000 Speaker 1: centrist lib pundit folk, to pounce on an issue and 655 00:38:33,360 --> 00:38:36,440 Speaker 1: use, like, the trans community and other communities as, like, 656 00:38:36,520 --> 00:38:41,239 Speaker 1: a cudgel to beat Bernie Sanders down. Yeah.
I've, I've 657 00:38:41,320 --> 00:38:47,080 Speaker 1: seen more rage about this, particularly from, like, kind of 658 00:38:47,160 --> 00:38:51,600 Speaker 1: centrist Democrat media people online, than over any of the awful, 659 00:38:51,640 --> 00:38:55,479 Speaker 1: actual awful things that Donald Trump has done to harm 660 00:38:55,560 --> 00:38:59,480 Speaker 1: trans people in the United States. Mm. Even, like, Hillary 661 00:38:59,520 --> 00:39:03,000 Speaker 1: Clinton, have Democrats responded to this? Yeah, Hillary Clinton had 662 00:39:03,040 --> 00:39:07,200 Speaker 1: transphobic comments, like, earlier last year, or like late last year. Um, 663 00:39:08,000 --> 00:39:11,839 Speaker 1: but that wasn't an issue, yeah, even though, like, uh, 664 00:39:12,400 --> 00:39:16,600 Speaker 1: like a few days ago, uh, like, right after this happened, uh, 665 00:39:16,800 --> 00:39:22,880 Speaker 1: Pete Buttigieg got endorsed by Charlamagne, and, and, uh, 666 00:39:23,320 --> 00:39:32,000 Speaker 1: it's, tha God, not Charlemagne the historical figure. Talking 667 00:39:32,000 --> 00:39:35,560 Speaker 1: about Charlamagne tha God. He hosts the Breakfast Club, 668 00:39:36,280 --> 00:39:40,200 Speaker 1: radio stuff. Big Pete fan. Video of Pete with him, 669 00:39:40,920 --> 00:39:43,919 Speaker 1: being like, look, I got, I got Charlamagne, and it's 670 00:39:44,120 --> 00:39:46,760 Speaker 1: just sort of, it's, it's literally like, when did that happen? 671 00:39:46,920 --> 00:39:50,080 Speaker 1: I don't know, but it's like, it's even, it's a 672 00:39:50,160 --> 00:39:53,200 Speaker 1: further step from what the Bernie campaign did. It's him 673 00:39:53,239 --> 00:39:55,480 Speaker 1: literally in a video being like, look, I'm with him 674 00:39:55,520 --> 00:40:00,440 Speaker 1: and we're doing an event together. It is him, like, 675 00:40:00,600 --> 00:40:04,920 Speaker 1: really touting the endorsement.
But, like, he's talked... he's 676 00:39:05,000 --> 00:39:08,000 Speaker 1: said so many transphobic things in his career. Like, two 677 00:39:08,080 --> 00:39:10,480 Speaker 1: years ago, he had this whole thing about how it's 678 00:39:10,520 --> 00:39:14,920 Speaker 1: okay if families want to keep their bloodline pure, and 679 00:39:15,040 --> 00:39:17,759 Speaker 1: like, compared it to, like, dog breeding and stuff. Just 680 00:39:17,880 --> 00:39:21,839 Speaker 1: like some pretty, like, racist, transphobic stuff. Nobody cares. People 681 00:39:21,840 --> 00:39:23,680 Speaker 1: aren't making hay out of this, they're not being like, 682 00:39:23,760 --> 00:39:28,080 Speaker 1: Pete needs to denounce this. Pete, Pete, Pete. And it's 683 00:39:28,120 --> 00:39:30,600 Speaker 1: because... I don't think that... Yeah, I hear you. And 684 00:39:30,760 --> 00:39:33,640 Speaker 1: and it's just, it's just another example of, like, using 685 00:39:33,880 --> 00:39:36,799 Speaker 1: these little things, that a conversation is worth having 686 00:39:37,480 --> 00:39:41,120 Speaker 1: about, that the media pounces on and uses as, 687 00:39:41,200 --> 00:39:46,279 Speaker 1: like, we finally got them, they are problematic. It's very frustrating. Um, 688 00:39:46,360 --> 00:39:48,920 Speaker 1: we gotta take a quick break for more products and services. 689 00:39:49,200 --> 00:40:03,800 Speaker 1: Welcome together ever back from that little break. I love breaks. 690 00:40:04,600 --> 00:40:07,720 Speaker 1: I love breaks as much as Bernie Sanders loves taking 691 00:40:07,760 --> 00:40:10,600 Speaker 1: the succession of American presidents into his own hands with 692 00:40:10,680 --> 00:40:14,279 Speaker 1: the cold steel of an Italian rifle. Um, back into 693 00:40:14,320 --> 00:40:18,399 Speaker 1: the left, back into the far left, right, back, back 694 00:40:18,480 --> 00:40:20,840 Speaker 1: into the far left.
Which is actually not what happened 695 00:40:20,880 --> 00:40:24,600 Speaker 1: after the Kennedy, uh, situation... assassination, anyway. Uh, I don't know. 696 00:40:24,680 --> 00:40:26,440 Speaker 1: Before we move on, I kind of wanted to talk 697 00:40:26,440 --> 00:40:30,120 Speaker 1: about something that frustrates me, just about the whole Joe 698 00:40:30,239 --> 00:40:35,320 Speaker 1: Rogan debate, which is, um, there are no debates, uh, 699 00:40:36,360 --> 00:40:39,240 Speaker 1: on the other side about what works. It's just purely 700 00:40:39,320 --> 00:40:43,520 Speaker 1: a matter of taking and holding and exercising power. Um. 701 00:40:44,040 --> 00:40:47,960 Speaker 1: And I don't... I'm not advocating, uh, like, an embrace 702 00:40:48,160 --> 00:40:53,480 Speaker 1: of, of soulless, sociopathic, uh, politics of power, because that's 703 00:40:53,600 --> 00:40:55,920 Speaker 1: not what I want the left to be. But I 704 00:40:56,000 --> 00:40:59,719 Speaker 1: think a little bit more pragmatism is warranted, to where 705 00:40:59,760 --> 00:41:01,800 Speaker 1: we can say... like, it would be one thing. I 706 00:41:01,800 --> 00:41:05,719 Speaker 1: wouldn't be suggesting this if, for example, Sanders had softened 707 00:41:05,719 --> 00:41:08,160 Speaker 1: at all his attitudes on trans rights to get Joe 708 00:41:08,239 --> 00:41:11,640 Speaker 1: Rogan's endorsement. But he didn't. He just connected with Rogan 709 00:41:11,719 --> 00:41:14,480 Speaker 1: and his audience about other things that they agreed on. 710 00:41:14,760 --> 00:41:19,440 Speaker 1: And I think that, uh... like, that's what 711 00:41:19,560 --> 00:41:22,719 Speaker 1: we should be doing.
Um, you're not going to, by 712 00:41:22,760 --> 00:41:28,279 Speaker 1: the time 2020 rolls around, convince everybody that, uh, it's reasonable 713 00:41:28,320 --> 00:41:32,040 Speaker 1: to be pro-choice, or that, uh, you know, trans 714 00:41:32,120 --> 00:41:34,600 Speaker 1: people's rights are as important as we think they are. 715 00:41:34,960 --> 00:41:37,399 Speaker 1: As long as you're not conceding those points, if you're 716 00:41:37,400 --> 00:41:41,040 Speaker 1: getting those people to buy onto other aspects of your agenda, um, 717 00:41:41,320 --> 00:41:44,560 Speaker 1: then it allows you to continue to support pro-choice, 718 00:41:44,680 --> 00:41:48,640 Speaker 1: pro-trans politics while also making other things happen, um, 719 00:41:48,840 --> 00:41:51,799 Speaker 1: and protecting all of these different communities, and hopefully, over 720 00:41:51,920 --> 00:41:56,279 Speaker 1: time, changing the opinions of these people. UM. Right, you 721 00:41:56,320 --> 00:42:00,640 Speaker 1: bring them in. Yeah, I agree. I totally agree. 722 00:42:00,760 --> 00:42:04,000 Speaker 1: And that's why I found the whole conversation to be frustrating. 723 00:42:04,080 --> 00:42:09,000 Speaker 1: It's like we're focusing on the wrong things here. Um, 724 00:42:09,960 --> 00:42:14,120 Speaker 1: let's talk a little bit about their platforms and the 725 00:42:14,200 --> 00:42:16,200 Speaker 1: ways that they're different and everything on a few of 726 00:42:16,239 --> 00:42:19,520 Speaker 1: the key issues here. Um, Cody, you have been digging 727 00:42:19,520 --> 00:42:23,879 Speaker 1: into it a little bit. Um, we want to start 728 00:42:23,920 --> 00:42:27,160 Speaker 1: with a wealth tax. Yeah, I mean, this is, uh, 729 00:42:27,440 --> 00:42:31,200 Speaker 1: I think just a representation of sort of the difference 730 00:42:31,320 --> 00:42:36,000 Speaker 1: between the two. Um.
You know, Warren's support 731 00:42:36,000 --> 00:42:38,880 Speaker 1: of a wealth tax is great. Um, Bernie's is a 732 00:42:38,880 --> 00:42:43,120 Speaker 1: little more aggressive. Um. And yeah, and so that's a 733 00:42:43,200 --> 00:42:48,239 Speaker 1: conversation of, like, what... To me, they're probably both off 734 00:42:48,280 --> 00:42:53,719 Speaker 1: putting to certain sects of the population. Yeah. His, his 735 00:42:54,200 --> 00:42:57,120 Speaker 1: kicks in at thirty-two million, 736 00:42:58,239 --> 00:43:01,080 Speaker 1: hers kicks in at fifty million. UM. I 737 00:43:01,160 --> 00:43:04,440 Speaker 1: do think it's interesting, if there's certain people I've noticed 738 00:43:05,120 --> 00:43:07,359 Speaker 1: leaning towards Warren, and then you look at their net 739 00:43:07,400 --> 00:43:10,160 Speaker 1: worth and it's in that range, it's less than fifty million, 740 00:43:10,200 --> 00:43:11,719 Speaker 1: like, oh, you just don't want anything to kick in 741 00:43:11,920 --> 00:43:15,000 Speaker 1: on you. And like, maybe not, but like, uh, that's a 742 00:43:15,040 --> 00:43:18,720 Speaker 1: fun little game that is worth playing if you notice 743 00:43:18,840 --> 00:43:22,319 Speaker 1: a rich person leaning towards Warren. Also his... I mean, 744 00:43:22,400 --> 00:43:25,600 Speaker 1: his plan is a progressive tax, um. So it's slowly, 745 00:43:26,040 --> 00:43:29,000 Speaker 1: you know, it goes up as the wealth increases. So 746 00:43:29,160 --> 00:43:31,800 Speaker 1: it's, you know, at fifty million, two percent, and then at two 747 00:43:31,920 --> 00:43:36,440 Speaker 1: fifty million, three percent, whereas hers is just two and 748 00:43:36,520 --> 00:43:41,920 Speaker 1: three percent. Difference there, and obviously his goes up to 749 00:43:41,960 --> 00:43:47,640 Speaker 1: eight. Yeah, tax them, tax them. Um.
You know, there 750 00:43:47,760 --> 00:43:51,640 Speaker 1: is conversation that... I don't know, I don't know how, 751 00:43:53,480 --> 00:43:57,640 Speaker 1: how much more likely people that are on the fence 752 00:43:58,680 --> 00:44:02,120 Speaker 1: are to vote for Warren on this issue versus Bernie. 753 00:44:02,320 --> 00:44:04,839 Speaker 1: You know, like, how does it affect the electability 754 00:44:05,080 --> 00:44:08,120 Speaker 1: question and all of that. I wish, you know... because 755 00:44:08,120 --> 00:44:10,160 Speaker 1: I think that both of these plans would be great, 756 00:44:10,840 --> 00:44:13,040 Speaker 1: but obviously one is better, right. Well, I think also, 757 00:44:13,200 --> 00:44:15,920 Speaker 1: like, we consider, like, who we're even talking about, like, 758 00:44:16,320 --> 00:44:18,560 Speaker 1: in terms of millionaires, I think it's like a little 759 00:44:18,600 --> 00:44:21,480 Speaker 1: less than like five percent of the country. UM. So 760 00:44:21,480 --> 00:44:25,120 Speaker 1: in terms of, like, what kind of voters it attracts, um, 761 00:44:26,040 --> 00:44:28,799 Speaker 1: I guess, do we need to worry about that, really? Um? 762 00:44:28,960 --> 00:44:31,120 Speaker 1: And even, like, I mean, we won't get too much 763 00:44:31,160 --> 00:44:35,000 Speaker 1: into their Wall Street stuff, but like, uh, I think 764 00:44:35,000 --> 00:44:39,000 Speaker 1: there's, there's, there's some hedging of bets and sort of like, okay, 765 00:44:39,040 --> 00:44:41,440 Speaker 1: well, if it's going to be Bernie or Warren, her, 766 00:44:41,680 --> 00:44:45,360 Speaker 1: because hers is less aggressive. I think there's one other dimension 767 00:44:45,480 --> 00:44:48,600 Speaker 1: to it, which is that, um, like, obviously I, I 768 00:44:48,920 --> 00:44:52,719 Speaker 1: am more supportive of Bernie Sanders's wealth tax because 769 00:44:52,760 --> 00:44:56,640 Speaker 1: it's more aggressive.
But I think, based on the latest statistics 770 00:44:56,680 --> 00:44:59,239 Speaker 1: I saw, Warren's polls much better, and actually a 771 00:44:59,320 --> 00:45:02,040 Speaker 1: majority of Republicans are supportive of it. And so 772 00:45:02,200 --> 00:45:05,360 Speaker 1: I think if the goal is to start with 773 00:45:05,440 --> 00:45:08,759 Speaker 1: wealth redistribution and show Americans that it can work and 774 00:45:08,840 --> 00:45:10,959 Speaker 1: that it can fund these programs that we've been saying 775 00:45:10,960 --> 00:45:14,400 Speaker 1: are necessary for so long, um, it's possible that Warren's 776 00:45:14,520 --> 00:45:17,839 Speaker 1: strategy is a better tactical move, because it's easier to get, 777 00:45:18,120 --> 00:45:21,560 Speaker 1: for example, conservatives on board with, and then convince them, no, 778 00:45:21,760 --> 00:45:24,560 Speaker 1: you guys actually like having free healthcare and stuff, and 779 00:45:24,680 --> 00:45:27,320 Speaker 1: this, you know, doesn't harm anything. Like, yeah, and 780 00:45:27,400 --> 00:45:30,120 Speaker 1: then we can, you know, push for more aggressive wealth 781 00:45:30,160 --> 00:45:33,600 Speaker 1: taxes down the line. Um. But I think that's an 782 00:45:33,600 --> 00:45:36,399 Speaker 1: important kind of dimension. I think that with a few 783 00:45:36,440 --> 00:45:39,400 Speaker 1: of her policies, that seems to be the conversation that 784 00:45:39,480 --> 00:45:42,440 Speaker 1: we will be having, and honestly the conversation 785 00:45:42,480 --> 00:45:44,120 Speaker 1: that I think that we should be focusing on, instead 786 00:45:44,160 --> 00:45:48,040 Speaker 1: of stuff about how the candidates are or are 787 00:45:48,120 --> 00:45:50,120 Speaker 1: not arguing over whether or not a woman can be 788 00:45:50,160 --> 00:45:54,480 Speaker 1: the president. Um.
You know, it's the same thing with 789 00:45:54,560 --> 00:45:56,520 Speaker 1: Medicare for All. Like, what, what is the best 790 00:45:56,719 --> 00:46:00,440 Speaker 1: path to getting these things? Um? And I know that 791 00:46:00,520 --> 00:46:03,279 Speaker 1: also, you know, the conversation about whether or not we can 792 00:46:03,360 --> 00:46:06,799 Speaker 1: trust her gets wrapped up into this. Well... but yeah, 793 00:46:06,880 --> 00:46:08,560 Speaker 1: I think it's a really great point, Robert, and something 794 00:46:08,640 --> 00:46:13,560 Speaker 1: that I, I'm still grappling with and trying to think about. 795 00:46:13,719 --> 00:46:16,440 Speaker 1: And, and see, I'm so excited to start seeing how 796 00:46:16,960 --> 00:46:20,200 Speaker 1: these primaries are going to be playing out, um, and 797 00:46:20,360 --> 00:46:22,400 Speaker 1: and to, in action, start to get a 798 00:46:22,480 --> 00:46:26,600 Speaker 1: real grasp on, like, what is resonating with people outside 799 00:46:26,600 --> 00:46:30,560 Speaker 1: of just statistics. Um, yeah, do you want to talk 800 00:46:30,800 --> 00:46:33,680 Speaker 1: about it? Yeah. And I think this, I mean, this is 801 00:46:33,880 --> 00:46:37,600 Speaker 1: also, like, I think there's something to consider in terms 802 00:46:37,640 --> 00:46:41,200 Speaker 1: of, like, aggressive or not aggressive, and how that appeals 803 00:46:41,239 --> 00:46:47,520 Speaker 1: to voters versus how passable it is as actual legislation, um, 804 00:46:47,640 --> 00:46:51,279 Speaker 1: and getting conservatives to go along with it. And like, yeah, 805 00:46:51,280 --> 00:46:55,600 Speaker 1: it's all part of... and like we've talked about before, basically, 806 00:46:56,400 --> 00:46:59,680 Speaker 1: no matter what you propose, it's more than likely that 807 00:46:59,800 --> 00:47:02,319 Speaker 1: you're going to get less than what you propose, um, 808 00:47:02,480 --> 00:47:06,279 Speaker 1: just because of how our system works.
So pushing for 809 00:47:06,560 --> 00:47:08,400 Speaker 1: as far as you can, and then having to be like, 810 00:47:08,400 --> 00:47:10,680 Speaker 1: all right, well, it's not going to be as much, 811 00:47:10,800 --> 00:47:13,600 Speaker 1: I guess. I mean, I, I hear that, I, I 812 00:47:14,760 --> 00:47:18,520 Speaker 1: hear that. It's just so hard to know. Um. Yeah, 813 00:47:18,640 --> 00:47:23,200 Speaker 1: talk some more. Yeah, I mean, Medicare for All 814 00:47:23,239 --> 00:47:26,360 Speaker 1: I think is probably the biggest one, and sort of 815 00:47:26,400 --> 00:47:29,480 Speaker 1: speaks to, yeah, this sort of general issue, like what 816 00:47:29,560 --> 00:47:32,560 Speaker 1: you were talking about, Robert, of how a case 817 00:47:32,640 --> 00:47:37,120 Speaker 1: could be made for doing something to convince people that 818 00:47:37,640 --> 00:47:41,799 Speaker 1: going further is good. Where, like, for her, Medicare for All, 819 00:47:41,960 --> 00:47:45,080 Speaker 1: which she does still say she supports, is to pass 820 00:47:45,200 --> 00:47:48,200 Speaker 1: legislation to, like, bring the age down, have it, uh, 821 00:47:48,600 --> 00:47:53,440 Speaker 1: open up to teens, and that's just... after that transition, well, 822 00:47:53,480 --> 00:47:57,799 Speaker 1: after that, there'd be another, like, vote to expand it, um. 823 00:47:58,200 --> 00:48:02,000 Speaker 1: Which, again, I mean, there's a lot to say, and, 824 00:48:02,200 --> 00:48:04,400 Speaker 1: and, and that's another one that I'm, I'm, I'm not 825 00:48:04,600 --> 00:48:09,360 Speaker 1: sure still where I fall as to what's the 826 00:48:09,520 --> 00:48:13,080 Speaker 1: right path. Can we trust her? That always comes up. 827 00:48:13,080 --> 00:48:14,719 Speaker 1: Can we trust that that's what she's actually going to be 828 00:48:14,840 --> 00:48:18,279 Speaker 1: driving towards? I don't see any reason not to, but 829 00:48:18,360 --> 00:48:21,359 Speaker 1: I understand that concern.
I personally don't see any reason 830 00:48:21,440 --> 00:48:23,960 Speaker 1: not to. UM. But then, I hear you, like, are we 831 00:48:24,160 --> 00:48:26,279 Speaker 1: shooting ourselves in the foot with this conversation, with this 832 00:48:26,440 --> 00:48:30,680 Speaker 1: fight? Does it require midterms to go right in order to get there? 833 00:48:31,840 --> 00:48:35,440 Speaker 1: Or is that realistically what it's going to take? 834 00:48:35,520 --> 00:48:38,200 Speaker 1: Even, even with a Bernie presidency, how is he going 835 00:48:38,239 --> 00:48:41,640 Speaker 1: to get us there? Um? Is this a more palatable plan? 836 00:48:41,719 --> 00:48:45,719 Speaker 1: It's an actual plan, I'll give her that, you know, UM, 837 00:48:45,960 --> 00:48:48,000 Speaker 1: as to how we're going to achieve this thing that 838 00:48:48,640 --> 00:48:53,640 Speaker 1: the majority of Americans want. Um. And yeah, so 839 00:48:53,760 --> 00:48:58,080 Speaker 1: I don't know the answer to that. Yeah, it's, uh, 840 00:48:59,120 --> 00:49:04,040 Speaker 1: it's a debate worth having. UM. I do question, yeah, 841 00:49:04,160 --> 00:49:09,160 Speaker 1: just, like, if you're, if you're, uh, starting at, we're 842 00:49:09,200 --> 00:49:11,120 Speaker 1: gonna do Medicare for All, and then the plan comes 843 00:49:11,160 --> 00:49:13,400 Speaker 1: out and it's like, actually, it's this several-step plan that requires 844 00:49:13,480 --> 00:49:16,399 Speaker 1: this one thing first and then this, um, and again, 845 00:49:16,480 --> 00:49:18,959 Speaker 1: like, midterms going well, um, and then also 846 00:49:19,040 --> 00:49:20,319 Speaker 1: sort of talking about how, like, well, we're not going 847 00:49:20,360 --> 00:49:25,279 Speaker 1: to get everything done that we say, um, which... that's politics. Yeah. Um.
848 00:49:26,680 --> 00:49:29,839 Speaker 1: Then, like, personally, I'm like, well, just say you want 849 00:49:29,880 --> 00:49:33,320 Speaker 1: to do the thing and, and do that, um, because 850 00:49:33,320 --> 00:49:34,880 Speaker 1: then I at least believe that you want to do 851 00:49:35,040 --> 00:49:37,959 Speaker 1: that, as opposed to the sort of, like, well, let's 852 00:49:38,000 --> 00:49:41,480 Speaker 1: talk about some more things. Student debt? Yeah, I mean, 853 00:49:42,320 --> 00:49:44,840 Speaker 1: student debt. And, like, again, this is, like, they generally 854 00:49:44,920 --> 00:49:46,480 Speaker 1: agree on a lot of things. We want, you know, 855 00:49:46,920 --> 00:49:49,200 Speaker 1: to get, get rid of private prisons, like, a lot of the 856 00:49:49,239 --> 00:49:54,959 Speaker 1: big issues that are now, uh, standard for, like, being 857 00:49:55,120 --> 00:49:57,320 Speaker 1: a democratic progressive, and, like, wanting, like, we want to 858 00:49:57,320 --> 00:49:58,920 Speaker 1: get rid of private prisons, we want to do this, 859 00:49:59,000 --> 00:50:00,840 Speaker 1: we want to do something about climate change. Um. 860 00:50:00,920 --> 00:50:05,480 Speaker 1: It's just the degree and the plan, and, uh, yeah, 861 00:50:05,880 --> 00:50:08,200 Speaker 1: and how, how are they, how are they planning 862 00:50:08,360 --> 00:50:13,320 Speaker 1: to distribute the money? How they planned... Like, like, Bernie 863 00:50:13,320 --> 00:50:15,520 Speaker 1: wants to get rid of all student debt, uh, whereas 864 00:50:15,600 --> 00:50:17,399 Speaker 1: Warren wants to get rid of most student debt, 865 00:50:17,760 --> 00:50:23,080 Speaker 1: like, I think, for people, up to fifty thousand dollars of student debt. Um, 866 00:50:24,000 --> 00:50:30,120 Speaker 1: that's a distinction, um, one. Like, yeah, they're both good. Um, 867 00:50:30,200 --> 00:50:35,000 Speaker 1: it's how far you want to go. Um.
Similarly, like 868 00:50:35,520 --> 00:50:39,320 Speaker 1: like with climate change, Uh yeah, so they've got different types, 869 00:50:40,000 --> 00:50:43,560 Speaker 1: different amount of amounts of money that they're proposing, but 870 00:50:43,719 --> 00:50:47,160 Speaker 1: a different approach. Yeah, Like Sanders all in on like 871 00:50:47,280 --> 00:50:48,920 Speaker 1: this idea of a green new deal, like here's our 872 00:50:48,960 --> 00:50:52,560 Speaker 1: climate change legislation, um, not the war like for that, 873 00:50:52,800 --> 00:50:54,879 Speaker 1: but she sort of packages a lot of it into 874 00:50:55,080 --> 00:50:57,720 Speaker 1: other plans where she's like and when talking about the military, 875 00:50:57,760 --> 00:51:02,239 Speaker 1: we're gonna do that. I I would like there'd be 876 00:51:02,280 --> 00:51:05,040 Speaker 1: more money in her plan, but I also think that 877 00:51:05,160 --> 00:51:09,920 Speaker 1: that's a really effective and important approach to this is 878 00:51:10,040 --> 00:51:14,160 Speaker 1: to integrate all of these things into different areas. It's 879 00:51:14,320 --> 00:51:16,040 Speaker 1: in everything, it's a part of everything. Yeah. I mean 880 00:51:16,040 --> 00:51:18,960 Speaker 1: there's a reason that like whenever they have debates, someone 881 00:51:19,120 --> 00:51:21,040 Speaker 1: ultimately is like, why aren't we talking about climate change? 882 00:51:21,120 --> 00:51:23,239 Speaker 1: Or like why aren't we talking about climate change in 883 00:51:23,320 --> 00:51:26,520 Speaker 1: relation to this question? Um? 
Oh, and so sixteen... what, 884 00:51:26,680 --> 00:51:28,840 Speaker 1: sixteen point three trillion versus three trillion, I think, is 885 00:51:28,920 --> 00:51:32,520 Speaker 1: the plan difference. And I agree. I think it's, yeah, 886 00:51:32,560 --> 00:51:36,600 Speaker 1: it's important to have climate change in mind in all 887 00:51:36,680 --> 00:51:41,719 Speaker 1: these other departments and plans. Um. Like, when Trump's new 888 00:51:41,760 --> 00:51:44,319 Speaker 1: trade deal came out, Bernie did the whole thing about how 889 00:51:44,560 --> 00:51:47,759 Speaker 1: this doesn't mention climate. And that's one of the 890 00:51:47,800 --> 00:51:50,360 Speaker 1: main things that I've been disappointed about with Warren, is 891 00:51:50,600 --> 00:51:52,360 Speaker 1: that she did go along with it. And I get it, 892 00:51:52,480 --> 00:51:55,640 Speaker 1: like, the hassle of passing a new trade deal down 893 00:51:55,680 --> 00:52:00,680 Speaker 1: the line, and this whole... I understand the reason why, 894 00:52:00,800 --> 00:52:03,759 Speaker 1: but I do think that, yeah, let's not pass one 895 00:52:03,800 --> 00:52:06,319 Speaker 1: until it includes it. I agree with that, right. Yeah, 896 00:52:06,480 --> 00:52:09,120 Speaker 1: and yeah, little things like that, um. And related to 897 00:52:09,120 --> 00:52:11,680 Speaker 1: climate change, like, I think another example of kind of 898 00:52:11,760 --> 00:52:15,840 Speaker 1: what we're talking about: Bernie's talked about prosecuting Exxon 899 00:52:15,960 --> 00:52:21,320 Speaker 1: Mobil and really going after these companies, UM, and Warren 900 00:52:21,360 --> 00:52:24,520 Speaker 1: hasn't not done that, but her approach is more about 901 00:52:25,040 --> 00:52:28,919 Speaker 1: creating rules that, if they break them, they will be prosecuted. 902 00:52:30,120 --> 00:52:33,120 Speaker 1: Good to do.
Yeah, I feel like it's kind of 903 00:52:33,200 --> 00:52:36,920 Speaker 1: the same thing ultimately. Like... she has 904 00:52:37,040 --> 00:52:39,439 Speaker 1: a quote here: if bad actors like Exxon break 905 00:52:39,480 --> 00:52:41,879 Speaker 1: the rules and deliberately lie to government agencies, my plan 906 00:52:41,960 --> 00:52:44,040 Speaker 1: will treat them the same way as the law treats 907 00:52:44,080 --> 00:52:46,880 Speaker 1: someone who lies in court, by subjecting them to prosecution 908 00:52:46,960 --> 00:52:50,399 Speaker 1: for perjury. So that's great. And this is a part 909 00:52:50,440 --> 00:52:52,000 Speaker 1: of her plan, like, she's going to put these rules 910 00:52:52,040 --> 00:52:54,960 Speaker 1: in place so that if they lie and do 911 00:52:55,440 --> 00:52:58,239 Speaker 1: X and Y, then they will be prosecuted. But I 912 00:52:58,280 --> 00:53:00,759 Speaker 1: don't think there is a difference between saying, like, we're 913 00:53:00,760 --> 00:53:04,359 Speaker 1: gonna put in rules, and if they break them, they're 914 00:53:04,680 --> 00:53:08,080 Speaker 1: gonna get prosecuted, and, oh yeah, they've lied to us 915 00:53:08,120 --> 00:53:12,560 Speaker 1: for decades and we're going to prosecute them. Like, I 916 00:53:12,840 --> 00:53:17,640 Speaker 1: feel like it's pretty... I hear you. I think, I 917 00:53:17,680 --> 00:53:20,720 Speaker 1: think it's a minor distinction, because it's gonna... I understand, 918 00:53:20,760 --> 00:53:22,839 Speaker 1: it's not a minor distinction. He wants to prosecute them 919 00:53:22,840 --> 00:53:25,080 Speaker 1: for stuff that's already happened, and she's saying, like, let's 920 00:53:25,120 --> 00:53:27,279 Speaker 1: move forward with these new regulations in place, and if 921 00:53:27,320 --> 00:53:31,080 Speaker 1: this happens, then we will prosecute them. Slightly different.
They 922 00:53:31,160 --> 00:53:33,120 Speaker 1: both seemed tough to me. But no, they're, 923 00:53:33,160 --> 00:53:35,919 Speaker 1: they're, they're both tough. And that's, that's the thing again, 924 00:53:36,000 --> 00:53:37,759 Speaker 1: the thing we're sort of talking about, like, they're both good. 925 00:53:38,320 --> 00:53:41,360 Speaker 1: It's good that we have, we have these two candidates 926 00:53:41,880 --> 00:53:45,480 Speaker 1: that exist. It's just the aggressiveness, and, and... 927 00:53:46,239 --> 00:53:48,440 Speaker 1: And so a lot of this sort of brings me 928 00:53:48,520 --> 00:53:52,480 Speaker 1: to my real, my, my real thing, um. And, like, 929 00:53:52,560 --> 00:53:54,719 Speaker 1: you're like, let's look into the policy, let's 930 00:53:54,800 --> 00:53:58,919 Speaker 1: compare and contrast. And my thing is that I don't 931 00:53:58,960 --> 00:54:02,400 Speaker 1: super care, like, about the very specifics of it. Like, 932 00:54:02,480 --> 00:54:06,080 Speaker 1: it's like, it's very similar to me when... well, well, no, 933 00:54:06,280 --> 00:54:09,120 Speaker 1: it's not like, Medicare for All, and then how are 934 00:54:09,160 --> 00:54:10,799 Speaker 1: you going to pay for it. It's that, where it's 935 00:54:10,840 --> 00:54:14,640 Speaker 1: like, I don't care... when talking about, like, a president 936 00:54:14,960 --> 00:54:18,440 Speaker 1: or a leadership, I'm less interested in the very specifics 937 00:54:18,760 --> 00:54:22,600 Speaker 1: of the plans, because ultimately, people like, you get a 938 00:54:22,600 --> 00:54:25,319 Speaker 1: policy team together and you make the policy, and it's 939 00:54:25,320 --> 00:54:29,080 Speaker 1: all based off vision. And that's, I think, that's 940 00:54:29,120 --> 00:54:32,480 Speaker 1: the thing that I care more about, is, uh, having 941 00:54:32,960 --> 00:54:38,520 Speaker 1: vision and commitment to these things. Makes me a little nervous.
942 00:54:38,560 --> 00:54:40,080 Speaker 1: But I hear you. I understand what you're saying, like, 943 00:54:40,239 --> 00:54:42,200 Speaker 1: the role of a president, of a leader, it's like that. 944 00:54:42,520 --> 00:54:48,319 Speaker 1: It's why, uh, nowadays you might hear the phrase health 945 00:54:48,360 --> 00:54:50,440 Speaker 1: care is a human right, and you wouldn't have heard 946 00:54:50,480 --> 00:54:53,600 Speaker 1: that five years ago. Um, and there's a reason, and 947 00:54:53,680 --> 00:54:57,759 Speaker 1: it's because vision and leadership has pushed us to all 948 00:54:57,800 --> 00:54:59,600 Speaker 1: sort of adopt that. I do think that the next step 949 00:55:00,160 --> 00:55:02,960 Speaker 1: to enacting that vision is a plan. I don't... I know, 950 00:55:03,080 --> 00:55:06,880 Speaker 1: I agree. I'm just saying that... like, the specifics of 951 00:55:06,960 --> 00:55:09,600 Speaker 1: that plan don't matter to you? That's not what I'm saying. 952 00:55:09,640 --> 00:55:14,239 Speaker 1: I'm saying that all the things that we're talking 953 00:55:14,280 --> 00:55:15,600 Speaker 1: about now, like, oh, we gotta do this, we gotta 954 00:55:15,600 --> 00:55:17,960 Speaker 1: do this, great, this... I'm not saying Warren hasn't supported that, 955 00:55:18,040 --> 00:55:20,359 Speaker 1: but I'm saying that the reason we're talking about them, 956 00:55:20,400 --> 00:55:22,440 Speaker 1: and the reason that these ideas have taken over the 957 00:55:22,520 --> 00:55:26,959 Speaker 1: Democratic Party, are because of this vision.
Because of Bernie's vision, 958 00:55:27,120 --> 00:55:29,480 Speaker 1: I mean, yeah. I mean, he's the person that has 959 00:55:29,480 --> 00:55:33,800 Speaker 1: been, him specifically, obviously there's a movement behind it, but, 960 00:55:33,880 --> 00:55:36,080 Speaker 1: like, I think that has a lot to do with 961 00:55:36,160 --> 00:55:40,759 Speaker 1: the health care thing, um, and just in general, I 962 00:55:40,840 --> 00:55:43,040 Speaker 1: think that, that, that... So, like, when we're talking about, 963 00:55:43,080 --> 00:55:46,480 Speaker 1: like, what's, uh, this plan versus this plan... Well, I 964 00:55:46,480 --> 00:55:48,600 Speaker 1: just want a person that I believe, a person 965 00:55:48,640 --> 00:55:50,520 Speaker 1: that I believe believes the things that they say and 966 00:55:50,600 --> 00:55:52,880 Speaker 1: has been saying for decades. You could probably understand, I 967 00:55:53,000 --> 00:55:56,759 Speaker 1: imagine, why a lot of people do care about having 968 00:55:56,800 --> 00:55:58,600 Speaker 1: a plan. I'm not saying it doesn't matter, and he 969 00:55:58,760 --> 00:56:02,560 Speaker 1: has plans. I'm just saying, personally, I'm like, yeah. Um, 970 00:56:04,160 --> 00:56:07,000 Speaker 1: so before we leave here, I wanted to give room 971 00:56:07,160 --> 00:56:11,279 Speaker 1: for, you know, other, other points. Uh, specifically, Robert, you 972 00:56:11,320 --> 00:56:13,800 Speaker 1: wanted to talk about the blood quantum stuff, and I 973 00:56:13,840 --> 00:56:17,560 Speaker 1: think that's pretty important to get into before we end. Yeah, 974 00:56:17,800 --> 00:56:21,200 Speaker 1: and this is obviously something that there's no comparison with 975 00:56:21,280 --> 00:56:24,759 Speaker 1: Sanders, because Bernie Sanders did not labor under the 976 00:56:24,800 --> 00:56:27,920 Speaker 1: misconception that he had Native American blood for decades. He 977 00:56:28,040 --> 00:56:30,920 Speaker 1: just killed the president.
But, but yeah, he is on 978 00:56:31,000 --> 00:56:33,560 Speaker 1: the lam. He did. He did kill President Johns. That's... 979 00:56:35,239 --> 00:56:41,240 Speaker 1: he's on the run, possibly. You know, he's hiding from the cops. 980 00:56:41,360 --> 00:56:45,719 Speaker 1: I gotta go in a minute, they're after me. Uh, you know. No, 981 00:56:46,000 --> 00:56:50,279 Speaker 1: Bernie just wrote some weird erotic essays when he 982 00:56:50,440 --> 00:56:52,640 Speaker 1: was younger. And we did talk about this a bit 983 00:56:52,680 --> 00:56:55,560 Speaker 1: on the, on the Elizabeth Warren episode. But I think 984 00:56:55,600 --> 00:56:58,080 Speaker 1: that some of you guys, um, wanted to hear 985 00:56:58,080 --> 00:56:59,600 Speaker 1: more about it, and I agree. I agree with you. 986 00:56:59,760 --> 00:57:02,600 Speaker 1: So let's, let's... yeah. And this is... I wanted to 987 00:57:02,640 --> 00:57:05,000 Speaker 1: point out, like, I'm not doing this as, like, an 988 00:57:05,040 --> 00:57:07,359 Speaker 1: attack on her. I've stated my opinion on, like, her 989 00:57:07,400 --> 00:57:10,000 Speaker 1: actual level of culpability. I think, other than it being 990 00:57:10,160 --> 00:57:13,680 Speaker 1: a really dumb, incredibly dumb decision, like, I don't think what 991 00:57:13,800 --> 00:57:16,600 Speaker 1: she did was, like, horrifically immoral or cruel. I think 992 00:57:16,640 --> 00:57:19,920 Speaker 1: it was pretty... coming up, growing up where she grew up, 993 00:57:20,120 --> 00:57:21,960 Speaker 1: in a community like the one she grew up in, and 994 00:57:22,040 --> 00:57:24,280 Speaker 1: I think, knowing the era she 995 00:57:24,400 --> 00:57:28,000 Speaker 1: grew up in, it's a pretty common mistake to have made. UM.
996 00:57:28,400 --> 00:57:30,800 Speaker 1: But a number of Native American folks reached out on 997 00:57:30,880 --> 00:57:33,120 Speaker 1: Twitter after the last episode where we talked about this 998 00:57:33,320 --> 00:57:36,160 Speaker 1: and wanted to make sure that I kind of pointed 999 00:57:36,200 --> 00:57:40,360 Speaker 1: out the context of why it was troubling, because there's 1000 00:57:40,440 --> 00:57:43,240 Speaker 1: there's some really important information about UM kind of the 1001 00:57:43,320 --> 00:57:48,360 Speaker 1: way Native American nous is determined legally, UM that what 1002 00:57:48,520 --> 00:57:51,440 Speaker 1: she did fed into UM. That goes back pretty far, 1003 00:57:51,560 --> 00:57:53,080 Speaker 1: and it's it's kind of important to talk about, and 1004 00:57:53,360 --> 00:57:56,560 Speaker 1: it goes back to something called blood quantum, which is 1005 00:57:56,640 --> 00:57:59,800 Speaker 1: a system that the federal government put on tribes in 1006 00:58:00,080 --> 00:58:02,120 Speaker 1: order to limit the number of people who could call 1007 00:58:02,200 --> 00:58:06,880 Speaker 1: themselves Native American UM and a number of Native nations 1008 00:58:06,960 --> 00:58:09,520 Speaker 1: do not use blood quantum, but like the Navajo Nation 1009 00:58:09,680 --> 00:58:11,800 Speaker 1: and the Turnal Mountain Band of Chippewa Indians and a 1010 00:58:11,880 --> 00:58:14,760 Speaker 1: number of others still do use it. Um. How tribes 1011 00:58:14,840 --> 00:58:18,160 Speaker 1: use blood quantum varies from tribe to tribe. The Navajo 1012 00:58:18,360 --> 00:58:22,720 Speaker 1: require about of Navajo blood in order to call like 1013 00:58:22,840 --> 00:58:28,000 Speaker 1: be considered Navajoternal Mountain requires any kind of Indian blood um, 1014 00:58:28,480 --> 00:58:33,320 Speaker 1: but blood quantum minimums like um like. 
One of the problems is that they were essentially set up back in the day by the United States in order to ensure that there would kind of always be a declining number of, of Native Americans. So if you've got just the minimum amount of Navajo blood, um, and you have children with somebody who has a lower blood quantum, then your kids by definition will not be able to enroll as Navajo, which is, like, one of the issues with the system. Um, in an article which interviews a woman named Elizabeth Rule — she's a doctoral candidate at Brown University, specializes in Native American studies, and she's a Chickasaw Nation citizen, um — she calls this a colonial catch-22, um, basically because, like, there's kind of no perfect way to deal with this problem, because it's been sort of enshrined in law for so long. Um, and yeah, it's, it's, uh, it's a really thorny issue, and I'm not gonna be able to give, like, a super complex, like, explanation of it.
One of the problems is that it doesn't go along with how these actual tribes, for thousands of years prior to the United States, like, determined membership in the tribe. Most of them didn't. Obviously you couldn't do blood tests a thousand years ago, so, like, that wasn't even really a part of it. Like, there's a long history of, for example, freed Black people being fully incorporated into Native American tribes, um. And then when, sort of, like, the blood quantum thing, like, came into being because of the United States government, these guys were basically kicked out of the tribes that they'd been fully accepted in, because their blood wasn't Native American, um. And so, like, it's, it's a very, like, complicated issue, um. And it basically guarantees that over time, uh, Native Americans will basically breed themselves out of existence, um, which, in a fortunate coincidence, means that the federal government will no longer have to continue, like, maintaining the legal obligations that they have to the treaties that they signed. They gradually die out as a result of this blood quantum thing.
And again, this is, like, a very rough overview of, of what blood quantum sort of is. I think I'm gonna do a Behind the Bastards episode on just sort of how, uh, tribes were treated by the US government at some point, that goes into this in more detail. But by, um, by specifically sort of trying to prove her claim to Indigenous blood, or Indigenous heritage, by taking a blood test, Warren sort of bought into this system, which is very problematic and very heavily debated, and so that's one reason why she attracted a sizeable amount of criticism, um. But it's also sort of evidence that, like, you know, again, her belief that she had this blood, um, or had this heritage, comes down to some very, um, incorrect ideas about Native American heritage that have been passed through my people for a very long time, particularly in Oklahoma.
Um, and her sort of going with a blood test, and John Lovett's suggesting a blood test, in order to, to prove her heritage is, is, you know, buying into this long tradition. And so it's, it's definitely a dumb and lame thing that she and her campaign agreed to do. But more to the point, it's like, there's an incredibly long and lame and shitty history that is attached to all of this. And, uh, one of the deep frustrations of this election is that, if Elizabeth Warren continues to be a major force in the election — uh, and I like her politics, and I hope that she is a part of it — um, people like the president will continue to drag it up. It's just gonna continue to be a real pain in the ass for the Native American community. And that sucks. And I, and again, we didn't talk about this nearly enough when we did her episode, but I do think we mentioned it, like, that, that's a big caveat for me with her, is how she handled this situation. Yeah, I think she made a mistake. I think that she could have done a better job owning up to that mistake, addressing it.
And if she does get the nomination, if she does continue, I think that we would need to see that, because there's a lot of communities that are, you know, are rightfully upset about it, and this would be a good opportunity to draw attention to an issue that most people don't understand or aren't aware of. And so there's that. I am also worried about how... yeah, we're talking about things that the president is going to weaponize against his opponent. This is one of them. So it is absolutely something that we need to keep in mind. I, I understand, in a way, you know, like, there's a problem. It sucks — our society, and the time that she grew up in, and where she grew up. But, you know, like, and, and she grew up with this narrative, and she grew up with this story, that was false, of who she is. I don't, you know... and, and she benefited from it. And she's white, you know, like, she is. Uh, and, and she... and we culturally, and we culturally have learned and changed. Um.
And, you know, you'd like to think that if she was a kid now, maybe it would be a different story. This isn't making an excuse, I'm just putting it all into perspective. But again, like I said, I think that she handled it wrong. Yeah. And, like, being able to... I think being able to handle those kinds of things and address those issues is really important leading up to this year. Uh, like, they're gonna throw so much stuff at everybody, um, and how that's handled, I think, is important. Like, you know, like, we're not going to talk about Biden or anything, but, you know, when he's criticized, he freaks out. He can't handle it. He, like, grabs people by the collars, like, "hey, man." Uh, and, like, how the criticism is addressed, I think, is going to play into it quite a bit. So, in terms of... just because I don't want to be, uh, just stating what I think are different Native American attitudes on this,
I actually want to quote, um, a Medium article I found from someone named Eli Tatosian, um, uh, titled "Warren and the Blood Quantum," that kind of goes into at least this one, this one Indigenous person's perspective on, um, on all this, uh, and, like, how it's impacted their life. Quote: Along with criticism of Warren, a wave of general distrust and "prove it" began to spread through the United States, placing a negative spotlight on Indigenous folks — this is after her blood test — suddenly, the question of "what percentage are you" held more weight, and not in a good way. Katie Cannon describes that one day, upon being asked how Indian she was, the man asking her stated that he was just trying to make sure "you're not another Elizabeth Warren." Warren's claim created more pressure than ever for Indigenous folks to prove their ancestry to people who were not entitled to such information. Cannon states that Warren answers the dog whistlers, in a conversation about, uh, indigeneity, without Indigenous input. She makes it seem like the percentage question is something that actually deserves an answer.
Before Warren, it was possible to brush off the percentage question, to chuckle and chalk it up to a poorly phrased remark from a well-meaning person who just doesn't know any better. Now that blood quantum has been solidified as a political maneuver, I feel like the percentage question has lost its innocence. Yeah, yeah, yeah, yeah. Whether intended or not, justifying the thing. Yeah, I completely agree, um. And it is, it is absolutely a factor as to why I might not vote for her. And it also plays into, um... it's interesting, there's that old clip of Trump at a, in a... he's in court about his casino stuff. He's talking about Native Americans, uh — and I think it's, like, about a claim to land — and he's just like, well, they don't look like Native Americans to me. Um. And it's just his... that, like... oh yes, yes, I remember that.
It's, like, sort of, like, playing... yeah, playing into... remember when Donald Trump said that in a court of law? A couple of times. He appeared himself a bunch, actually, it's very weird. But, uh, just sort of, like, playing into his narrative and being unable... not his specifically, but, like, uh, playing into it and not being able to really address it, and justifying where he's coming from. Um, it's going to be an issue. Okay, well, we're getting the wrap-it-up sign from Sophie, so that's what I'm going to do. I was going to do it anyway. We are going to wrap it up like Bernard Sanders wrapped up the presidency of John Fitzgerald Kennedy. Yeah, I'm so glad you brought that home... scoped rifle... so eloquently, so poetical. Thank you, thank you. You guys can find us online on Twitter and Instagram at WorstYearPod. Please find us online on Twitter and Instagram, find us online, at us, um... be mean to us online. Please be mean to us online. Uh, no, a lot of you guys have been kind of, like, profoundly abusive to me online... to them, not to me only. And I think, um, uh, say what you need to say.
Try to be kind about it. Uh, to everybody that's reached out over the last week, I really appreciate you. You're wonderful yourself. No, you're fine. I mean, you're... yeah. Uh... cool. Boy howdy. I hope that all of this joking about Bernie Sanders assassinating a former president, uh, has not harmed our chances of interviewing him. Well, I mean, he needs to answer for it someday. So, Bernie, come here and clear up the record. It's out there, it's out there. It's out there. All right, all right, it's out there. Correct, Bernie? Disgusted. All right, I'll see you guys. Everything, everything, so dull... I tried. Worst Year Ever is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.