Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with me, your girl, Danielle Moodie, recording from the home bunker. Folks, happy Monday to all of you, and I hope that, again, everyone had a very good weekend, got some rest, had some reflection, got outdoors. I wanted to share something with all of you today, which is that at the end of last week I had the really wonderful honor of being honored by my former high school. And I was so taken aback when I got an email asking if I would be interested in coming in to be inducted into my former high school's Hall of Honor, which is for non-athletes; the Hall of Fame is for athletes, and the Hall of Honor is for people who have achieved, I guess, made accomplishments, made waves in their chosen field. And I was so taken aback, one, because when I was going to school I was one of a handful of Black students. At the time, I was not out; I wouldn't come out for probably several years, after I graduated from college.
And to think that I graduated in a class of eleven hundred and three, and I was one of like twenty Black kids and kids of color in my graduating class. And while I had an amazing high school experience... I say all that not to then tell a sob story about my high school experience. Thankfully, I had really amazing teachers. I did have incidents of racism that I have written about; I have written about being called the N-word in my eleventh-grade honors English class, and subsequently cursing that young man out and him being shunned for the rest of the year. You know, good times. But aside from that, I had a remarkable experience. And why was that? It was largely due to the amazing teachers in my school, who never made me feel "other," who encouraged my pursuits and engaged me in, you know, debates and questions about our society and culture and racism. And I never felt like I was marginalized in any type of way. And that's what good teachers and a good school are supposed to do.
And I say this as I'm reflecting on the experience of being honored by my high school, and thinking about what Ron DeSantis and Greg Abbott are doing to young people in their states: to young Black and brown people, to young queer people, to young people who live at the intersections, and how they are creating a climate and an experience that is devastating to their development, that is going to be devastating to their psyche, to their sense of self, and to their ability to want to contribute to this society, to want to show up in their full selves and be able to make their stamp and make a difference. Because right now they are living in states where they are being stomped out. Right? Legislation that Ron DeSantis passed at the end of last week would allow trans children to be taken from their loving homes. Do we understand the history of stripping people from marginalized communities of their families, of their children?
I mean, this is something that America was born on, from the Indigenous tribes of this nation to the enslaved Black people who were forced to come here. Right? So it is nothing new. It has always been a tool of the oppressor to break apart families, right, because families are supposed to be our foundation and the place that we draw our strength from. So in order to weaken the queer community, in order to disempower and to erase us, their desire is to strip families that are accepting of their children, that are loving of their children, of their parental rights. Can you think of anything more heinous to do in the twenty-first century, after centuries of what we have seen and what we have learned from these abusive practices? And frankly, you know, the Trump administration did that, right, with undocumented people trying to find safe haven in the United States. It has always, always been a tool of the oppressor.
And I bring that up to say that our children, whether or not you birth children, adopt children, foster children, teach children, care for them in any type of way, are our collective responsibility. Right? They are our collective responsibility. If we want to see a society that is accepting, that is safe, that is progressive, right, where laws are created that are about adding people to the table, not separating them, then we have to invest in our young people. We have to give them safe space and safe harbor to grow their voices, right, so that they can lift them for progress. And so when you see these heinous acts that are happening across the country, I implore you, I implore you, to find your local PFLAG (Parents and Friends of Lesbians and Gays) organizations, to go to GLSEN's website, right, to go to Family Equality, an organization that I have worked with over the last two years that is about empowering queer families. Please consider donations, consider volunteering, consider calling the offices of these heinous legislators to tell them what they are doing to our society.
It is important that we all, in this moment, lift our voices to protect all kids. Coming up next, my conversation with our good, good friend, our in-house doctor, Dr. Jonathan Metzl.

Folks, I am, as always, very excited when we get to chat with our in-house doctor, Dr. Jonathan Metzl. But more importantly, he's an award winner, because Dr. Jonathan Metzl was awarded the Carlson Award from the Weill Cornell Medicine Department of Psychiatry in recognition of his important contributions to the history of psychiatry. So thank you, award winner, for making the time for us little people over at Woke AF.

Speaker 2: Yeah, well, I'll squeeze you guys in. No, it was... it was lovely, actually. It's nice when you get awards like this that aren't, like, the "oh my god" big-name award that everybody's heard of. Awards like this are nice because... this is an annual award given in memory of a colleague, and they do it at New York Hospital, and so it's nice to go. I got to give a big talk. The department came, they took me to this lovely lunch, I got this nice plaque. But it was also nice just to, well, first, be recognized; I mean, there are some incredible people who have won this award. But also to be able to kind of speak to somebody's memory. It's such a nice way to keep a tradition alive. So thank you.

Speaker 1: Well, it's wonderful to see, and we can't think of a better person to be honored than you, Jonathan. You know, there's just... I'm so disgusted these days. I mean, people know that I'm disgusted all the time, but I will tell you that I'm still kind of reeling from CNN's desire to continue to mainstream white supremacy and antisemitism on their network. The president, Chris Licht, just came out again, because their ratings were not through the roof for Donald Trump, and, you know, they have faced a ton, a ton of backlash, so he's on the defensive, saying that we need to show both sides of every issue.
And Mehdi Hasan, who has a show on MSNBC and has been on this show and others, tweeted and said: really, we need to show both sides of every issue? So climate deniers and election deniers, and racism and injustice in this country; everyone deserves both sides? And he said, so when does it end? And I wanted to get your thoughts on how media is just mainstreaming white supremacy, mainstreaming hatred, in a way that I don't think it's ever done on this level, because of the amount of technology, streaming, social media platforms, and all of these things that allow you to access so many people so quickly.

Speaker 2: It's obviously not a surprise to anyone that media, in this environment where so many media companies are going out of business... I mean, Vice just went under. I actually had a piece that was coming out in Vice; I did an interview with them about guns, and it didn't come out, because the company is going under. And, right, is it... what, BuzzFeed? You name it. All these companies are firing people, are going under. It's a horrible environment right now.
Forgetting just the ethics for one second, which of course it's all about ethics, but companies are driven by revenue and clicks and views, which lead to advertisers. And in our world we live in algorithms, where conflict drives revenue, right? And so I guess that's the question: where does that leave us? So I guess I'm not surprised that CNN has taken this turn, right? We knew it when he was hired that basically they're an entertainment company, and as an entertainment company they need viewers, right, just like Twitter needs followers, et cetera, et cetera, et cetera. They're just trying to make money. They're not people any more than Bud Light is a person. And so, you know, it really is true.

Speaker 1: Well, the Supreme Court would say it is a person.

Speaker 2: So these things are all just existing in this marketplace that is really cutthroat, and where the, you know, the terrain right now is: the more conflict you have... You know, politicians can never pass legislation in this world, but if they stand outside of Congress and say racist stuff or get in fights with each other and it goes viral, that's what drives revenue. So our whole system is monetized right now not to give one side what it agrees with, but to tweak one side and then tweak the other side. That's just the world. That's how entertainment is monetized. I wish we had different algorithms; I truly wish we had different algorithms. But I guess for me, the question is: what do we do as liberals? Like, what's our response? Because I don't know. I mean, I think what CNN did was horrible. They legitimized and platformed, and they're going to, I'm sure, do it again. But the question is... you see all these people now boycotting CNN, and a lot of people are boycotting Twitter, obviously. And that's the question I want to ask you, Danielle: is boycotting the right response right now? Is boycotting the right response, or are we deplatforming ourselves?
I mean, that's, honestly, what I run through my mind all the time, because there are not enough people on liberal Twitter to sway elections; there are not enough liberals to make a difference. If we stop watching CNN, we're just going to hand it over and make it another Fox News, and Fox News is... whatever. So I'm just worried about the response of boycotting. It works sometimes, but if that's always our response, I just feel like we're lessening our voice, as opposed to... I don't know, it's everybody's choice, but what if every liberal person got a blue check and blasted on Twitter and fought back on it? I just feel like we've given over Twitter to these people.

Speaker 1: I don't think so. I think that boycotts are important. I think that we have seen how boycotts have worked throughout history to move this country to a more progressive place. You can look at the Montgomery bus boycott, and you can look at, you know, the airing out of ALEC, the company... the organization, I'll say, that created all of the homophobic, anti-Black legislation that we've seen, and people being able to be informed about what companies they are shopping with, what companies they are spending their money with, and who is taking their money by day and then voting against their rights at night.

Speaker 2: Let me ask you, because I'm not against boycotts at all. For sure, I completely respect the history of boycotts; I think they're important tools. But is the Montgomery boycott the same as people boycotting CNN, for example, or not engaging Twitter?

Speaker 1: It's not at all the same thing. But here's the thing, because I'm one of the people that have actually been calling for people to boycott CNN. I think that CNN, by virtue of the people that they are putting on air, is delegitimizing itself. They were once seen as the stalwart of cable news, and you have seen the erosion of that network since the Trump era in a very major way. You know, just recently they gave Kaitlan Collins, the woman who interviewed Donald Trump, the plum spot at nine p.m. And come to find out, which I did not know until recently, Kaitlan Collins got her start at The Daily Caller as an entertainment reporter, and has espoused antisemitism, has espoused racial bias, has espoused and written about all of these things. And now, by virtue of the idea that CNN is still this mainstream, middle-of-the-road place, she is able to, you know, whitewash herself, right? And so now it makes sense as to why Donald Trump chose her in the first place as somebody to sit down with. And so my feeling is, you do not have to continue to give eyes, eyeballs, and Nielsen ratings to a network that is no longer a trusted entity and form of journalism.
And I think that if CNN is out there deciding that they are no longer interested in expanding their base of people who want real news, and instead want to chase Fox News, then why would we continue as viewers to reward them? Right? Like, I think that at some point in time the people of this country, with their dollars and their voice, need to say: we're not going along on this ride with you, right to the bottom. And I think in the same way for Twitter. I still have not left Twitter, because I do still believe that Twitter is a space for you to get information quickly, a space for you to be able to share your thoughts. And so until it absolutely combusts wholeheartedly, I'll be like you, and I'll be one of the people that continue to hang on, because all of the other social media platforms that have bubbled up are still in beta version, right, and don't have the same reach, right, and power that a Twitter does at this time. But to answer the larger question: I don't think that we are deplatforming ourselves. I think that people are showing the power that they have. And if CNN wants to continue in business and not, you know, trail behind a Newsmax in their ratings, which they did two days after they had Donald Trump on, when they came in at three hundred and thirty-five thousand viewers, behind Newsmax... if you want to continue pretending to be mainstream news when you're leaning right wing, then you're going to see those numbers.

Speaker 2: No, I don't doubt it. And again, as you know, I'm on CNN every so often, and I certainly don't like what's happening. And I know people at CNN who are not happy, and people who have left CNN; a lot of good people have left CNN. It's great for MSNBC, right? You know, Ana Cabrera and all these amazing people are going over to MSNBC. I guess my concern... and I'll also say, like, I was called out, I think fairly, by one of our listeners on this show, because I think I made a bad joke about Mastodon, and I apologize.
I wasn't trying to be offensive, and I have total respect for people who are on Mastodon. I'm on the waiting list for Bluesky. I tried Mastodon and I just couldn't, honestly... I just couldn't figure it out, honestly. But I guess for me, my concern is the lead-up to... I mean, think about how important social media was to Biden's election in twenty twenty. Because I was part of the influencer group, one of the influencer groups, that was doing all the stuff with, you know... we had Mark Ruffalo and all these people working with us. We had a massive footprint, a massive turnout, and we spoke to ambivalent voters, progressives who weren't sure that they were going to vote for Biden. All these people... like, social media was incredible, incredible, and it really helped, especially, you know, when we could target swing states. And my concern is just that that's being undermined in such powerful ways right now.
Like, I think there's a very concerted effort by the Elon Musks of the world and other people to make sure that liberals don't have that kind of sway leading into the twenty twenty-four election. And my concern is just that the Mastodons of the world and the Blueskys of the world just become, like, a place to protest. It doesn't have enough reach. And so right now Twitter is the place with reach. And I'm just totally torn, because of course I don't want to give my money to Elon Musk. And it also does concern me that, like, all these right-wingers have this blue check, which means that their tweets are boosted, they can tweet longer tweets and put up videos, and everybody like me, on my side, is just sending these short tweets that don't go anywhere except to my friends. And so I just... I don't know what to do.

Speaker 1: I mean, the reality is, you can pay in the same way that they're paying to have their voices heard. But I think that also, look, I don't think the blue check, now that we know that it can be bought, has the same cachet. I think that, of course, we continue to grow our numbers as you go on TV and do other things. But I think, at the end of the day, people are going to diversify where they get their information, and they're going to continue to do so. Right? Look at the amount of people that use TikTok, for instance, as a way to get their news, as opposed to actually watching the news or going on Twitter, because many people left Twitter even before, because it became a swamp, right? And this is, yeah, this is what happens with these social media platforms. Like, people left Facebook. Why? Because Facebook became a site for older people, and it was nonstop ads, right? So you were no longer even able to engage with family and friends in the way that you were; you're just engaging with corporate sponsors, right?
And so I think that 329 00:21:19,119 --> 00:21:23,080 Speaker 1: every few years, and not even every few years, 330 00:21:23,240 --> 00:21:26,439 Speaker 1: every couple of, you know, short years, there is 331 00:21:26,520 --> 00:21:30,479 Speaker 1: going to be another explosion of social media platforms that 332 00:21:30,600 --> 00:21:33,480 Speaker 1: try and refine what it is that is going wrong 333 00:21:33,520 --> 00:21:36,560 Speaker 1: on these other places, and that's how people will move. 334 00:21:36,720 --> 00:21:39,600 Speaker 1: I think that it is about figuring out how to 335 00:21:39,600 --> 00:21:43,119 Speaker 1: get your message out in a multitude of ways, given 336 00:21:43,160 --> 00:21:46,320 Speaker 1: all the opportunities that people like you and I with 337 00:21:46,480 --> 00:21:48,760 Speaker 1: varied platforms have the ability to do. 338 00:21:49,200 --> 00:21:51,760 Speaker 2: No, I think that's right. I just think people need 339 00:21:51,760 --> 00:21:54,720 Speaker 2: guidance about it right now, and again, if Mastodon is 340 00:21:54,880 --> 00:21:57,840 Speaker 2: the way to do that. But I mean, the important 341 00:21:57,880 --> 00:22:01,439 Speaker 2: thing about something like Twitter is that you're not just 342 00:22:01,480 --> 00:22:04,120 Speaker 2: speaking to your friends or your bubble, right? It actually 343 00:22:04,480 --> 00:22:08,240 Speaker 2: potentially has reach well beyond people who agree with you, right? 344 00:22:08,240 --> 00:22:10,680 Speaker 2: You can engage people who disagree with you, 345 00:22:11,200 --> 00:22:14,199 Speaker 2: and so I think that really 346 00:22:14,640 --> 00:22:17,840 Speaker 2: that's kind of what's important when you're trying to think 347 00:22:17,880 --> 00:22:21,119 Speaker 2: about social media and elections. For example, like right now, 348 00:22:22,000 --> 00:22:24,800 Speaker 2: I feel fine.
I just, again, my worry is just 349 00:22:24,880 --> 00:22:27,320 Speaker 2: like what's going to be the platform there. Probably 350 00:22:27,600 --> 00:22:30,680 Speaker 2: the joke is slightly on us for, like, calling out 351 00:22:30,680 --> 00:22:34,000 Speaker 2: Elon Musk when he wasn't going to buy Twitter; like, 352 00:22:34,280 --> 00:22:38,280 Speaker 2: probably we should have just let him slink away. But 353 00:22:38,280 --> 00:22:41,920 Speaker 2: I think: what's the infrastructure 354 00:22:41,960 --> 00:22:45,080 Speaker 2: going to be for that kind of mobilization? And maybe 355 00:22:45,119 --> 00:22:48,080 Speaker 2: it is social media, maybe it's not social media, but 356 00:22:48,080 --> 00:22:50,680 Speaker 2: I just think there's 357 00:22:50,720 --> 00:22:54,399 Speaker 2: a question of reach right now. Like, obviously 358 00:22:54,480 --> 00:22:56,520 Speaker 2: there's a lot of data right now that the demographics 359 00:22:56,520 --> 00:23:00,360 Speaker 2: of Twitter are changing. Twitter is a very powerful tool 360 00:23:00,440 --> 00:23:04,280 Speaker 2: for Republicans right now. Twitter has seen a massive increase in 361 00:23:04,640 --> 00:23:07,120 Speaker 2: Republicans who feel like it meets their needs. 362 00:23:07,119 --> 00:23:10,680 Speaker 2: So Twitter is going to be a very effective tool, even 363 00:23:10,880 --> 00:23:14,240 Speaker 2: more than it was in twenty twenty, for Republicans, just 364 00:23:14,240 --> 00:23:18,639 Speaker 2: given everything that's happening. And so what's the counter to that? 365 00:23:18,920 --> 00:23:20,760 Speaker 2: And I don't know the answer. I mean, 366 00:23:20,760 --> 00:23:23,600 Speaker 2: I'll say that's what I'm asking. I, again, wasn't trying 367 00:23:23,600 --> 00:23:26,360 Speaker 2: to poke fun at Mastodon.
I know people are 368 00:23:26,359 --> 00:23:28,840 Speaker 2: trying that. But I just think, you know, we have 369 00:23:28,880 --> 00:23:30,880 Speaker 2: an infrastructure issue that we need to address. 370 00:23:31,240 --> 00:23:33,880 Speaker 1: Yeah. And I think that also people need to get 371 00:23:33,880 --> 00:23:36,240 Speaker 1: off of their screens and get out into their communities. 372 00:23:36,600 --> 00:23:39,679 Speaker 1: I think that another part of that is, you know, 373 00:23:39,880 --> 00:23:42,960 Speaker 1: there is going to be a return to doing the 374 00:23:43,000 --> 00:23:45,639 Speaker 1: good old fashioned work of actually meeting with the people 375 00:23:45,680 --> 00:23:49,719 Speaker 1: in your community, knocking on those doors, and, you know, 376 00:23:49,880 --> 00:23:53,480 Speaker 1: engaging in real ways. And I think that, as 377 00:23:53,480 --> 00:23:55,840 Speaker 1: we're going to see, you know, the twenty twenty four 378 00:23:55,880 --> 00:23:57,840 Speaker 1: election is going to be the first election where we 379 00:23:57,880 --> 00:24:01,440 Speaker 1: have ChatGPT, right? And that is also, 380 00:24:02,359 --> 00:24:04,920 Speaker 1: on one hand, oh, how exciting, look at this new 381 00:24:04,960 --> 00:24:08,200 Speaker 1: technology that makes things easier; and, oh my god, 382 00:24:08,280 --> 00:24:12,640 Speaker 1: how devastating to humanity, this technology that makes everything easier. 383 00:24:13,560 --> 00:24:16,119 Speaker 1: It makes it easier to spread lies, makes it easier 384 00:24:16,119 --> 00:24:18,960 Speaker 1: to doctor videos, makes it easier to do a whole 385 00:24:19,000 --> 00:24:22,040 Speaker 1: lot of things that are going to be dangerous, 386 00:24:22,200 --> 00:24:24,400 Speaker 1: right, moving into the future.
And so I think 387 00:24:24,440 --> 00:24:28,880 Speaker 1: that people need to make decisions that align with 388 00:24:29,119 --> 00:24:33,480 Speaker 1: their values. And if that's boycotting CNN, for instance, and 389 00:24:33,680 --> 00:24:37,800 Speaker 1: putting your eyeballs towards a PBS or an MSNBC, or, 390 00:24:38,359 --> 00:24:40,760 Speaker 1: you know, going on and deciding, you know what, I'm 391 00:24:40,760 --> 00:24:46,040 Speaker 1: just gonna read. Right, 392 00:24:46,119 --> 00:24:49,440 Speaker 1: I'm just gonna read articles instead of, you know, listening 393 00:24:49,440 --> 00:24:52,159 Speaker 1: to the same stories regurgitated. Or, actually, I'm going to 394 00:24:52,280 --> 00:24:55,919 Speaker 1: join, you know, my local organizers and 395 00:24:55,960 --> 00:24:58,440 Speaker 1: connect with real people so that I'm not afraid 396 00:24:58,480 --> 00:25:00,880 Speaker 1: of my neighbor down the street. I think that those 397 00:25:01,000 --> 00:25:03,680 Speaker 1: are actually important things that we need to be doing. 398 00:25:03,720 --> 00:25:06,439 Speaker 1: We shouldn't be looking at one place that none of 399 00:25:06,520 --> 00:25:10,880 Speaker 1: us own, right, to be the lifesaver of democracy, 400 00:25:11,040 --> 00:25:13,480 Speaker 1: because that ain't it, right? And I think that that's 401 00:25:13,520 --> 00:25:16,399 Speaker 1: what we have seen over the last several years. 402 00:25:16,840 --> 00:25:21,640 Speaker 2: Remember, it was the Obama reelection, when he beat Romney, 403 00:25:22,119 --> 00:25:25,600 Speaker 2: and it turned out afterwards that Obama and the Democrats 404 00:25:25,640 --> 00:25:29,120 Speaker 2: had this technology that the Republicans didn't even really have, 405 00:25:29,640 --> 00:25:33,399 Speaker 2: which was ways to reach voters.
They had some kind 406 00:25:33,440 --> 00:25:35,960 Speaker 2: of technology where they could identify in the voter rolls 407 00:25:36,560 --> 00:25:39,840 Speaker 2: people who hadn't voted in the past two elections. But 408 00:25:39,880 --> 00:25:43,280 Speaker 2: still, they had some kind of technology where they figured out, like, 409 00:25:43,359 --> 00:25:46,280 Speaker 2: here's a way to reach people, that swung a lot 410 00:25:46,280 --> 00:25:48,960 Speaker 2: of swing states. It was, of course, about Obama, and 411 00:25:49,240 --> 00:25:52,800 Speaker 2: Romney was not a great candidate, etc. But the Democrats 412 00:25:52,840 --> 00:25:58,919 Speaker 2: had, like, outthought everybody. Trump in twenty sixteen had some 413 00:25:59,040 --> 00:26:03,879 Speaker 2: kind of technological stuff, I think, about kind of secret 414 00:26:04,000 --> 00:26:04,800 Speaker 2: Russian spy... 415 00:26:04,960 --> 00:26:07,080 Speaker 1: Yeah, I'm like, wasn't that just Putin? What? 416 00:26:09,080 --> 00:26:12,280 Speaker 2: Microwaves or something like that. And I do think there's 417 00:26:12,320 --> 00:26:15,520 Speaker 2: a story that is going to obviously be told about 418 00:26:15,560 --> 00:26:17,840 Speaker 2: the twenty twenty four election, and I'm curious what it is. 419 00:26:17,840 --> 00:26:20,840 Speaker 2: I know, like, people are thinking about it, so I'm 420 00:26:20,920 --> 00:26:22,960 Speaker 2: just curious to see what it is. But 421 00:26:23,000 --> 00:26:24,680 Speaker 2: I guess you're saying it's not going to be Twitter 422 00:26:24,720 --> 00:26:28,600 Speaker 2: for us. But again, I just 423 00:26:28,680 --> 00:26:31,080 Speaker 2: do think that. That's just what I think.
Like, even I'm 424 00:26:31,119 --> 00:26:34,159 Speaker 2: still active on Twitter, you're still active on Twitter, and 425 00:26:34,200 --> 00:26:36,520 Speaker 2: the people who are reading our tweets are not the 426 00:26:36,560 --> 00:26:38,480 Speaker 2: same people who used to be reading our tweets, and 427 00:26:38,480 --> 00:26:41,280 Speaker 2: so I'm just thinking, like, what does 428 00:26:41,320 --> 00:26:43,200 Speaker 2: that mean? What does that mean going forward? 429 00:26:44,040 --> 00:26:46,280 Speaker 1: We shall see, but we will leave it 430 00:26:46,320 --> 00:26:51,439 Speaker 1: there today. Friends, Doctor Jonathan Metzl, congratulations on all of 431 00:26:51,480 --> 00:26:55,639 Speaker 1: your recognition and the award that you received for 432 00:26:55,760 --> 00:26:58,439 Speaker 1: your work. We appreciate you and the analysis that you 433 00:26:58,480 --> 00:27:00,640 Speaker 1: bring to us each and every week. 434 00:27:01,040 --> 00:27:01,679 Speaker 2: Thank you. Thank you. 435 00:27:06,680 --> 00:27:09,399 Speaker 1: That is it for me today, dear friends, on Woke 436 00:27:09,400 --> 00:27:12,920 Speaker 1: AF. As always, power to the people, and to all 437 00:27:13,240 --> 00:27:16,800 Speaker 1: the people, power. Get woke and stay woke as fuck.