Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with me, your girl, Danielle Moodie, recording not so live but from the Brooklyn Bunker. Folks, let me tell you something: this Facebook story has really been getting to me, getting to me in ways that I didn't anticipate, being this kind of angry. But the reality is this, right: once again, thanks to someone's better angels, thanks to Frances Haugen, the thirty-seven-year-old former Facebook product manager. Frances Haugen, because of her integrity, because of her morality, because of her desire to see our democracy, as well as other democracies around the world, continue to flourish, she did an amazing thing. She came in as a caped crusader and divulged documents that we had all had a hunch about. Right? We all pretty much kind of knew, right, that Facebook and social media have been, in a lot of ways, really bad for our society. It started out as a way to connect people. The whole idea of social media was to have people be more social, to connect to folks that you no longer live by or go to school with, or family members that live in different countries or, you know, around the world in different spaces. It was an opportunity for us to kind of take, you know, the tangible photo album, right, that we all used to put together, or, you know, maybe I'm naming my age, but the tangible photo album, and be able to share that in real time with people that you care about. Well, that was the intent, right? But what do they say about intentions? The road to hell was paved with them. And that is exactly where we are in our current makeup of the Facebook world that we are living in: what began as the intention of Mark Zuckerberg and others, to be able to connect to other folks on their campus, has now morphed into the ability for the Proud Boys and, you know, the Boogaloo Boys and different hate groups to be able to unite.
Speaker 1: It has turned into a tool for bullies to be able to harass and taunt people. Right? And what we are learning about, particularly what Frances Haugen has laid out for us, is the toxic effect that, again, we knew. Right? Before Facebook, before Instagram, what did we have? We had actual magazines, and what were we taught about those magazines? That the images that were being projected back to young girls, right, were distorting their images of themselves. We knew this. We've known this for decades. So of course, when you are able now to just continually scroll and see images of these perfected, quote unquote, bodies, right, these edited, photoshopped, enhanced bodies, then of course you're going to look at yourself in the mirror and say, oh my god, I'm not worthy. Right? Or you're going to post a picture and maybe you don't get as many likes as your friend, right? And so what does that do? Well, according to CNN Business, who picked up some of the key moments of Frances's testimony in front of the Senate Commerce Subcommittee, here are some of the things that she said. "I am here today because I believe that Facebook's products harm children, stoke division, and weaken our democracy." These were a part of her opening remarks. "The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They won't solve this crisis without your help." And here's the thing: this is a crisis of their own making. But much in the same way that they have been able to take action to remove child pornography and child pornography groups and images from Facebook, tell me why they weren't able to do the same thing with white supremacy. Right? Tell me why they weren't able to do the same thing with misinformation about voting and misinformation about COVID. Well, what the whistleblower airs out for the rest of us is that hate and fear are what get people to stay on the platform.
That 71 00:05:21,880 --> 00:05:25,520 Speaker 1: even though these girls seventeen percent of girls are developing 72 00:05:25,600 --> 00:05:29,400 Speaker 1: eating disorders and thirteen point five percent have horrible images 73 00:05:29,400 --> 00:05:33,120 Speaker 1: of themselves, even though these stats based on Facebook's own 74 00:05:33,160 --> 00:05:39,600 Speaker 1: internal research, is there that what they have also learned 75 00:05:40,160 --> 00:05:45,480 Speaker 1: is that though the platform is proving to be toxic 76 00:05:45,520 --> 00:05:49,560 Speaker 1: and making people feel bad about themselves, guess what, they 77 00:05:49,640 --> 00:05:53,560 Speaker 1: can't stop. It's like a drug and they just keep 78 00:05:53,640 --> 00:05:58,720 Speaker 1: scrolling and scrolling and scrolling. So when Miss Hagan says 79 00:05:59,279 --> 00:06:03,080 Speaker 1: they won't solve this crisis without your help, no, it's 80 00:06:03,120 --> 00:06:06,440 Speaker 1: not about their help. It's about their fucking mandate. It's 81 00:06:06,480 --> 00:06:08,960 Speaker 1: about they won't solve this problem because they have no 82 00:06:09,040 --> 00:06:14,600 Speaker 1: incentive to do so. Right, the incentive here is to 83 00:06:14,720 --> 00:06:18,800 Speaker 1: make billions and billions upon billions of dollars, more money 84 00:06:19,000 --> 00:06:22,000 Speaker 1: than you'll ever be able to spend in a fucking lifetime. 85 00:06:22,839 --> 00:06:27,279 Speaker 1: Right earlier this week, Mark Zuckerberg, according to many reports, 86 00:06:27,680 --> 00:06:31,600 Speaker 1: with the five hour outage across multiple platforms that are 87 00:06:31,640 --> 00:06:38,200 Speaker 1: owned by Facebook, lost between six and seven billion dollars 88 00:06:38,320 --> 00:06:44,480 Speaker 1: himself six and seven billion dollars he lost. Can you imagine? 89 00:06:44,920 --> 00:06:47,560 Speaker 1: I just like, take that fucking in for a minute, 90 00:06:48,400 --> 00:06:52,000 Speaker 1: Mark Zuckerberg lost in the five hours that these platforms 91 00:06:52,000 --> 00:06:58,560 Speaker 1: were down, more than some country's GDP. But we don't 92 00:06:58,600 --> 00:07:01,320 Speaker 1: want to talk about the wealth tax. We don't want 93 00:07:01,320 --> 00:07:06,240 Speaker 1: to talk about how big is too fucking big? Right. 94 00:07:06,480 --> 00:07:10,280 Speaker 1: We got into a financial crisis in the early aughts 95 00:07:10,320 --> 00:07:13,400 Speaker 1: because we said, oh, the banks are too big to fail, 96 00:07:13,480 --> 00:07:16,040 Speaker 1: we can't let them shut down. What about the people? 97 00:07:16,320 --> 00:07:20,240 Speaker 1: But we did nothing to restore the wealth that was 98 00:07:20,360 --> 00:07:24,080 Speaker 1: lost to the millions of people that lost homes because 99 00:07:24,120 --> 00:07:27,160 Speaker 1: of the predatory lending practices, and nobody went to jail. 100 00:07:30,680 --> 00:07:36,000 Speaker 1: So it is not that this behemoth of Facebook needs 101 00:07:37,040 --> 00:07:41,120 Speaker 1: the help of Congress. It's that without the intervention, without 102 00:07:41,160 --> 00:07:44,920 Speaker 1: the regulation, without the mandates, they're going to continue to 103 00:07:44,960 --> 00:07:51,160 Speaker 1: grow out of control. Millions of people, small businesses, individuals 104 00:07:52,080 --> 00:07:56,440 Speaker 1: run their lives their businesses through Facebook's ads and Instagram ads. 
Speaker 1: It's the reason why Stephanie Ruhle on MSNBC said, you know, what you have to ask yourself when you are not paying for a service: recognize that you are the product. All of your information, all of your data, your phone numbers, everything that we have done. We have literally handed over our lives to Facebook, and they have optimized their algorithms in order to be able to market to us in a way that we don't even know if what we like we actually like, or is it because we've seen it seventeen times? Right? Is the news that I'm getting actual news, or have they just decided to feed me the bubble that they believe that I'm most comfortable in? And so what does that mean if I'm a person that believes in ivermectin, right, or whatever the fuck that horse hormone is that everybody seems to be using in the Midwest? The circles that I run in tell me that that is the safest thing and the best thing to use, and I'm just getting projected back images from inside my own bubble. Right? Then what does that mean for my safety? And that was one of the questions that I asked, as we're learning more information, as more information is becoming available, and after this testimony on Capitol Hill. The question that I asked, and I will have some lawyers on in the coming weeks who will hopefully unpack this for us. But here's a thought, and once again, I'm not a lawyer, but I said this: the documents Frances Haugen released confirmed what we already knew about Facebook, including their role in spreading hate. If the papers reveal that the dissolution of the Civic Integrity group at Facebook, right, the one that was supposed to be monitoring misinformation about voting, all of these things, all of these social ills, if that group was dissolved post twenty twenty election because they felt like, oh, well, we did our part, we did our very minimal, low fucking lift.
Speaker 1: And they show that the Civic Integrity group, right, that was responsible for reporting and kind of having eyes on the situation with regard to voting and misinformation, and with regard to these white supremacist groups, if they shut it down and then went back to their old algorithm, allowing their platform to become a convening ground for the insurrection, then can Facebook be held criminally liable? Because it was not, then, an accident. And that's the point. That is what is so important here about Frances's testimony: what she has unearthed is what we all assumed but had no evidence to prove, which is that Facebook knows exactly what the fuck they are doing. They know exactly what is toxic, they know exactly what is dangerous. But they are choosing to monetize people's pain and monetize the demise of our democracy above all else, because it does not, in this instance, or pretty much fucking ever, pay to do good. But what also gets me, folks, is that if it wasn't for this whistleblower, we would be none the wiser. And so we are basing, right, we are basing the health and well-being of our democracy, right, of our cultural climate, on a few people that decide to do good. Well, what happens when nobody decides to do good? What happens when tribalism fully wins over? Are we just fucked then? Because we have put together no guardrails, no safety measures. We are not handing out sentencing for the insurrectionists that would make us believe that another attack isn't on the horizon, because there's nothing. What, they're going to say, oh, well, maybe, you know, this is worth the forty-five days in jail, this is worth the slap on the wrist? If I believe, because I'm being fed this line of thinking, that I'm the true patriot and everybody else is the enemy, if I believe that, and that is being reinforced through these algorithms that are coming to me, isn't someone somewhere fucking responsible for this?
Speaker 1: And I'm not taking away, like, this is not me deciding, then, that, oh, everyone has been duped, and so to absolve them of personal responsibility. No, no. But there was a network that was in place, from the donors who got these buses and put people up in hotels. Right? There was the right-wing media that was amping people up and telling them what to do. Right? There's a manifesto that was put out that Donald Trump decided to follow to a T in order to conduct this coup. And then he had this tool, this technology, both Twitter and Facebook, to be able to do it. Eighty million fucking people. And so if Facebook knowingly decided, well, you know, we're losing money, because God forbid, right, the billions upon billions that they make, it's just not enough. And that is the disgusting aspect of capitalism. There is never enough. Right? They'll just keep consuming and doing and consuming and doing until somebody stops them. So essentially, what this whistleblower has laid out is what she believes and what she said in her testimony. She is doing this at great personal risk because she believes, quote, we still have time to act, but we must act now. Well, what's terrifying to me is that I don't think that our government actually knows how to act. We can't decide what Facebook actually is, and we don't have laws that are in place, or in development, for how one regulates an algorithm versus an actual tangible entity. Coming up next, dear friends, is my conversation with our friend, our in-house doctor, Dr. Jonathan Metzl, to unpack this very issue, and whether or not the harms that have been done are just too far gone for us to be able to do anything about. Hey, friends. As always, I am excited every single Wednesday when we get to sit down with our friend, our in-house doctor, Dr. Jonathan Metzl, to go through the latest in our global health pandemic, the latest in the multiple pandemics that we are living through.
Speaker 1: Jonathan, you know, I think that we've kind of reached a place where we don't know if I like what I like, or if it's because somebody is telling me that I should like it. And I think that that is, you know, that's part of the problem of being in it. And like, I think, I forget who said it, but they said it this week. Oh, it was Stephanie Ruhle on MSNBC. When you're on a platform that is free, right, you have to recognize that if you're not paying for something, then you are the product. Right? And that is what we are learning with the revelation through, you know, through Facebook, and, you know, their outages this week across, you know, multiple platforms, billions of people. But it's recognizing, you know, am I clicking on this thing because I like it, because I want it, or because somebody has created an algorithm that is enticing me? And what does that mean if I am an addict? Right? If I am somebody who, you know, is an addict, is a gambling addict, you know, an alcoholic, or what have you? Like, you can literally target to people. And so it's really important to begin to cross-reference, and to kind of, I think, unplug and go back to what you were doing, you know, pre, which was picking up a paper and deciding that you were going to read it, you know, reading a magazine because, like, you've decided to pick it up. I think that, you know, it is the question that we all need to be asking ourselves: like, why do I feel the way that I do? Why has my opinion been formed in a certain way? And if we're working to get people out of these bubbles, right, like these tribal bubbles that you and I have talked about so many times, you know, how do you even begin to do that? Right? How do you even begin to break into the bubble that's been created around them?
Speaker 1: And it's interesting, because you can just see, even as you describe it, the kind of vicious cycle here, right? Which is: you have a pre-existing ideology, you're liberal or you're conservative, you're libertarian, you're Raelian, you're whatever you are. And then the news is formatted to reaffirm your belief, so then you gravitate toward that news. That news makes an emotional sense to you because of your existing ideology, and then when you read the article, it doesn't challenge that, it reaffirms that, you know, liberals suck or conservatives suck. And then you, in your mind, generalize: oh, all liberals suck, or all whoever suck. And then it's a feedback loop, because then you share that article and stuff like that. And so we're in this thing where, especially in a time like a pandemic, we don't know all the answers. I mean, we thought vaccines did one thing, they do something else. Masks, now cloth masks are bad, whatever. Like, that's the way humanity understands a pandemic: our knowledge evolves. But when knowledge is changing, when important life-or-death information is changing, then these kinds of bubbles, these ideological bubbles, it's just a perfect storm. Like, no wonder Facebook made so much money, because it's a perfect storm of hunger, attention, anxiety, all those kinds of things. You know. But I want to ask this, though, because, you know, I don't think, and again, I'm existing inside of the bubble that you and I share, right, as progressives, as Democrats. Right? I don't get messaging that tells me to hate conservatives. I don't get messaging that is directed towards how terrible, you know, conservatives are, white evangelical Christians, like, I don't get messaging like that. Do you? Because I would argue that the messaging that is targeted towards people who are Democrats, who are, you know, who are more aligned with the intent around the values of our country.
That like, we're not trying 277 00:19:30,359 --> 00:19:35,560 Speaker 1: to hate other people, and that's not that's not our clickbait, right, 278 00:19:35,640 --> 00:19:39,160 Speaker 1: but but it's actually a kind of visual sample biased 279 00:19:39,440 --> 00:19:40,879 Speaker 1: kind of how this works. So even if you're not, 280 00:19:41,080 --> 00:19:44,439 Speaker 1: you're not getting a message that says something horrible like 281 00:19:44,800 --> 00:19:47,640 Speaker 1: Mitch McConnell is holding up the infrastructure bill. Who would 282 00:19:47,680 --> 00:19:51,080 Speaker 1: possibly say that, um, you know, but but but but 283 00:19:51,160 --> 00:19:54,720 Speaker 1: what you get is so like I was on a 284 00:19:54,800 --> 00:19:58,080 Speaker 1: plane this morning. There was one guy who was determined 285 00:19:58,119 --> 00:20:01,520 Speaker 1: to be a nostril patriot. He needed his nose out, 286 00:20:01,560 --> 00:20:04,560 Speaker 1: and that kept telling this dude, put your frigging nose 287 00:20:04,800 --> 00:20:06,880 Speaker 1: under your mask for everybody else, and then he'd pull 288 00:20:06,920 --> 00:20:09,600 Speaker 1: it back down something like that. Now there were maybe 289 00:20:09,920 --> 00:20:12,080 Speaker 1: on my flight this morning. I don't know how many 290 00:20:12,080 --> 00:20:15,879 Speaker 1: people want to flight two hundred people at whatever. One 291 00:20:15,960 --> 00:20:18,560 Speaker 1: hundred ninety nine people had their mask on normal, and 292 00:20:18,600 --> 00:20:21,240 Speaker 1: this one guy wanted us to see the inside of 293 00:20:21,280 --> 00:20:24,480 Speaker 1: his nostrils. And so if the way the media algorithm 294 00:20:24,560 --> 00:20:28,359 Speaker 1: works is somebody films this guy, it goes crazy and 295 00:20:28,400 --> 00:20:31,280 Speaker 1: then you think all these anti mask people, they're just 296 00:20:31,320 --> 00:20:34,320 Speaker 1: pulling their nose off out on the thing, but you 297 00:20:34,359 --> 00:20:36,359 Speaker 1: don't see the other one hundred ninety nine people. And 298 00:20:36,400 --> 00:20:38,159 Speaker 1: so the way it works is by kind of a 299 00:20:38,280 --> 00:20:41,679 Speaker 1: stable giance, like you think every single person is doing this, 300 00:20:41,760 --> 00:20:44,679 Speaker 1: who is part of this bigger ideology, when really the 301 00:20:44,720 --> 00:20:48,439 Speaker 1: way things go viral is I mean again, think about it, 302 00:20:48,520 --> 00:20:52,920 Speaker 1: seventy two million people voted for Trump. How many two 303 00:20:53,000 --> 00:20:57,000 Speaker 1: thousand stormed the Capitol on January sixth, But in a way, 304 00:20:57,080 --> 00:20:59,360 Speaker 1: what you think is everybody who voted for Trump must 305 00:20:59,359 --> 00:21:00,920 Speaker 1: have been at the see on that day. That's the 306 00:21:00,960 --> 00:21:03,719 Speaker 1: way it kind of messes with your mind. Yeah, and 307 00:21:03,800 --> 00:21:06,840 Speaker 1: also what it takes to go viral. And you know, conversely, 308 00:21:08,359 --> 00:21:11,520 Speaker 1: how many people marched for justice after the murder of 309 00:21:11,560 --> 00:21:15,199 Speaker 1: George Floyd and how many created acts of violence like 310 00:21:15,200 --> 00:21:18,280 Speaker 1: it was like point zero one percent. 
Speaker 1: But in conservative media, it was like every protester is smashing plates and lighting stuff on fire and stuff. So in a way, it works by taking things out of context and totally decontextualizing, and creating this generalizing, generalizable ethos. And then all of a sudden, when it's like, oh, every protester is smashing things, how could I possibly negotiate? We're so different from you. And so in a way, it creates this thing where it's like, what's the literary term, metonymy or synecdoche, or something like that, where the part comes to represent the whole. And so you think every Black person must be an angry protester who's going to steal stuff from CVS and all that kind of stuff, when probably it was one person, and maybe they were, I don't know, getting paid off to do whatever. But that's what drives clicks: you see the little thing, and then you generalize it. And that's the danger of this stuff, particularly because the bar is so high to go viral. Like, it used to be just your dog could catch a frisbee, and now it's like you've got to do the craziest shit in the world. But then everybody thinks, like, oh, every person is like that. So that's how this algorithm works. Yeah. And this is what the whistleblower was articulating in the hearing, which is the presumption that everyone had. And I say presumption that everyone had, but I really believe that our members of Congress are inept and very fucking lazy. And I will say that. To say that: one, I wouldn't call my mother and ask her what is the best, like, social media platform to use, right, because she is out of her age depth. Right? But we then are sitting with a Congress where the average age in the Senate is seventy-two years old. They understand how to regulate things that you can touch and feel, and even intellectual property, to some degree.
Speaker 1: We are talking about now regulating algorithms, right? Which, you need somebody to go in and explain, to all of us, frankly, exactly what an algorithm is. But you're more likely to be able to understand the machinations of it if you're somebody that readily uses technology and has a general awareness. And so I say the laziness part because we have known, we've been listening to Congress bitch about Facebook, and we've all been bitching about Facebook, and it growing into a monopoly and gobbling up all of these companies, and then it goes out, and three and a half billion people around the world can't communicate. Right? And so it's this thing where our laws are not keeping pace with the reality that we're living in and how fast technology is moving. And the fact is, like... go ahead. Well, it's also that our legal system becomes a reflection of that system. Right? In other words, like, the people who do well are people like Marjorie Taylor Greene, who don't do anything, but they're good at manipulating that algorithm. And the algorithm, the ultimate result: it's great for business if people feel like we have so little in common that there's no way we're ever going to be able to work together, or the other side is so morally bankrupt, there's whatever. So if you feed into that, there's a ton of money in that. Right? And so polarization is great for business. And so again, you just think about that happening during the pandemic. It's like the perfect storm. Right? It's great for their business to be creating all this polarization, but it's, you know, it's just really bad if you're trying to solve a common problem of humanity. But I don't think that that's the thing. Facebook isn't here to solve problems, right? They're here to monetize. Right? And the reality is that we know, I mean, you can go back in history.
Speaker 1: What did they used to say about journalists and headlines? If it bleeds, it leads. That same reality is true for this tech, this major tech company, this utility that needs to be regulated. For them, they've just been able to use that exponentially, you know, with billions. We're not talking about, oh, the circulation of, let's say, The New York Times, or even their online subscription. We're talking about everyone in the world. And so when the whistleblower is talking about the fact that this is destroying democracies, not just in this country but around the world, because you are sowing hate, and you know that you're sowing hate, because it's telling you that the angrier that people are, the longer that they stay on the platform. And this was alternately true for the young girls and the research that was done about Instagram: that seventeen percent of the young girls that are on Instagram are developing eating disorders. Well, and also, I feel so bad, like, I mean, so many of my students, they just want information. Right? The world is changing, the economy is changing, there's all these health risks that we weren't bargaining for. So they want to know, like, where can we go for information? And that's really what our class was about, and they were just saying, like, where can we get... like, do we need to wear a mask, do we need to whatever? And the sad reality is, like, I mean, Trump was right about this, and he created this world, in a way, where there's no such thing as a neutral fact. Everybody can look at the same story, but some people are being told ivermectin is the greatest thing since whatever the last dewormer was, and some people are being told it's a farce. And so we're kind of getting different versions.
Speaker 1: People are just hearing different realities, in a way, and so it just makes, like, coming together in a common good more difficult. Because people are... I'll just say, I've done a lot of on-the-ground research. I'm doing a focus group project, starting tomorrow, where I'm talking about COVID with a bunch of very conservative people. So I'm curious, you know. Because the thing is, like, in people's daily lives, I feel like they're a bit more malleable. But the problem is, it has tons of implications, particularly during a pandemic. Like, you want people to be kind of working with the same knowledge base, but that becomes impossible when people are really just... there's, like, so many different realities right now. But that is a product of Trump and Trumpism, which is where we decided that you can have facts and alternative facts. And while the rest of us were laughing at Kellyanne Conway when she said that, other people were like, oh, that's right, you know, I believe something that is different. Well, this is actual science. Right? So if we're looking at this Facebook issue just through the prism of the health pandemic, not our social pandemic, just the health pandemic, then we're saying, well, what do you mean, where do you get information from? Don't you get it from the CDC? Don't you get it from the World Health Organization? And if you don't trust any of those entities, wouldn't you call your doctor's office, if in fact you have a doctor, or go to the local pharmacy? Because I'm sure, Jonathan, right, like, they're not saying, well, maybe you should try the horse hormone. They're not going to tell you that, not even at the fucking CVS. But also, we live in an environment where, like, again, I just tell my students: go on Twitter and say something really nice, like, I'm a Democrat, but I think Republicans are really friendly, or something like that. Um, you know, like, that makes no money for Twitter.
Speaker 1: But if you go on and you, like, spit fire? That's that. So the algorithm monetizes conflict. And that's true for the media I do, the media you do. Um, and now we're seeing the implications. And certainly, of course, Trump with fake news and all that. We didn't even say fake news before him. But he was stepping into, like, he was stepping into a table that we set for him, in a way. Um, and he just, I mean, he's a manipulator. He's not a maker. He's a manipulator. But I think that's the issue: like, how do you step back from that, in a way? You know, if it's so clear, like, COVID death is not a mystery. You can see it everywhere you go right now. So it's kind of like, how do you step back from that? You know, it's kind of a million-dollar question. That's it for today's Woke AF Daily podcast. To hear more from today's show, including my full interview with Dr. Jonathan Metzl, support me on Patreon at patreon dot com slash Woke AF. Power to the people, and to all the people, power. Get woke and stay woke as fuck.