Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with your girl Danielle Moody, recording live from our pod stream studios here in Times Square. Folks, the fallout continues from whistleblower Frances Haugen's deep, deep dive into Facebook's decision to, you know, put profit ahead of people, as we all know they've been doing. Zuckerberg, even though he lost seven billion dollars this week, is still, you know, hella fucking rich. And what pisses me off beyond belief about this continual fallout is, one, his bullshit statement that he put out this week that basically says, oh, that Frances doesn't know what she's talking about, because, you know, evidently she can't read their own internal Facebook reports that tell us exactly what it is that they've been doing. That they were able to fix algorithms so that they could skew against misinformation for the twenty twenty election, but then once the election was done, they could flip the switch, change the algorithm back so that people were continuing to get misinformation, spread hate, and do all of the vile things that we know that people are doing on social media.
Speaker 1: And once again he is taking absolutely no responsibility for it and playing dumb. And my question now, folks, is this: does he think that we are all stupid? Like, this is what is happening with these people, right, who make a ton of fucking money. We know that in order to become a multibillionaire, chances are you're not doing it on the up and up. Now, I'm not saying that Zuckerberg is a criminal, but I'm saying that what bleeds leads, right? So we know that hate groups, we know that, you know, Russian misinformation, Chinese interference, all of these things are happening, and they could do something about it. But at the end of the day, it would disrupt Facebook's profits. Now here's the thing: they're not saying they wouldn't make money. It's not as if doing right and having morals and values would stop them from making money. No, they just wouldn't make as much, right? And so one of the questions that I want answered is exactly how much money would they stand to lose? Like, are they still making billions upon billions of dollars a year? I'm pretty sure that they fucking are.
Speaker 1: And so the desire here to once again try and pretend that they don't know what they're doing. That, oh, Frances couldn't possibly read these reports, and because she wasn't sitting in these groups, doesn't know what we all know, which is that social media also harms young people. How is that? Oh, I don't know. Because when you're inundated with images of beauty, or beauty expectations that are outside of normal bodies, then yeah, you're going to start to look at yourself and say that there's something wrong with you. The reality, folks, is, at the end of the day, Facebook needs to be regulated, right? Algorithms need to be regulated. This mega monopoly that they have needs to be broken up, and Frances has provided the roadmap for Congress to do so. The question is whether or not they will. I want to bring up a recent report that All In Together, Women Leading Change, put together to showcase the polling. Folks, again, we're talking about the twenty twenty two midterm elections, and I've been talking about this because it's right around the corner. We only have but a couple of months left in twenty twenty one.
Speaker 1: Can you believe that? Let me just pause for a second. It is, folks, going to be twenty twenty two, meaning that we've been living in the midst of a global health pandemic for going on two years. Like, let's wrap your mind around that. But the All In Together poll shows young people, and I want people to pay attention, feel less motivated ahead of the twenty twenty two election cycle. So let me read you this directly from All In Together's page, and then we'll discuss. The nonpartisan advocacy group All In Together has teamed up with Lake Research and Emerson College Polling to explore what is motivating people as we look forward to the twenty twenty two midterms. This survey looked at a thousand registered voters nationwide from September twenty-second to the twenty-fourth, with a plus or minus three point one margin of error. This survey also oversampled two hundred Black women, two hundred Latina women, and two hundred Asian American Pacific Islander women. The oversamples were weighted down to reflect their actual proportion of women.
Speaker 1: The survey was conducted using a mixed method of IVR, online, and SMS-to-web. Here is what they found, and this is what is concerning. According to All In Together, there is an enthusiasm gap in voting among young voters. AIT's new polling shows a massive enthusiasm gap between generations. Younger voters are the least motivated age group to vote. Only thirty-five percent of eighteen-to-twenty-nine-year-old voters are very motivated to vote, and only twenty-eight percent are almost certain to vote next year, compared to fifty-two point nine percent and fifty-one point two percent, respectively, of the overall electorate. This is after what they say was record youth turnout in twenty twenty. So here's one of the things that I want to unpack, which is this: this is the problem that Democrats are going to be facing as they go into twenty twenty two. Because, you see, Republicans set up, right, the fact that they are an obstructionist group. They don't actually give a fuck about governing, and quite honestly, they're terrible at it.
Speaker 1: Because Donald Trump, within his presidency, added roughly seven trillion dollars to the national debt. Every single time that a Republican is in power, our debt rises, right, astronomically. But here's the thing: Republicans also don't promise you shit. They don't promise you policy change. They don't promise to make your lives better, right? What they do offer is to roll back rights and to, I assume, create a Christian fundamentalist authoritarian society. This is what they propose. So you see, folks don't look to the Republican Party for anything, really. But the problem is that Democrats, and Joe Biden in particular, when he was campaigning, said that he was going to be the adult in the room, that we were going to, what, build back better together, that he was going to be the one that was going to be able to negotiate with the domestic terrorists that we call the GOP, and that he was going to be able to get the job done of putting America back on the map. Well, here's the thing, folks: we have yet to put together the infrastructure bill, right, the three point five trillion dollar bill.
Speaker 1: We just reached a decision, right, an agreement on raising the debt ceiling, but that just kicked the can down the road until December. Mitch McConnell did, in fact, blink. But again, this is not an overall fix. It's a band-aid on a broken leg. Because we should not need to have an act of Congress or a vote to continually raise our debt ceiling. It should be something that automatically kicks in, right? And we know that the debt ceiling was raised three times under the Trump administration so that they could spend fucking wildly and offer up all of their tax cuts to the uber wealthy. I digress. The fact of the matter is that Democrats are the party of promises. They promise us a lot, and unfortunately, going up against the reality of an obstructionist right-wing party, they are not able to deliver. So when young people and people of color risked their lives in twenty twenty to have a historic turnout, they did so not just because they wanted to get rid of Donald Trump, but because they needed someone to take control of COVID-19. Well, here we are, folks.
Speaker 1: Seven hundred thousand Americans have died of COVID-19. The vaccination rate has reached a plateau. We have a few more people getting vaccinated, but that's really just people getting the booster shots, right? Pfizer has just asked the FDA to certify the vaccinations for five-to-eleven-year-olds, right, and to ramp that up, so that we could possibly have twenty-eight million more people vaccinated if we expand it to that age group. But there is still, folks, a substantial number of Americans that are not vaccinated, and the rollout has been up and down. One minute, we're told to wear a mask; the next minute, we're told not to. One minute, we're told to social distance; the next minute, we're told not to. One minute, we're told that the Pfizer vaccination is one hundred percent great, and then the next minute we learn Moderna is better. I'm not saying that it is not our responsibility as an educated citizenry to be intellectually nimble. We need that.
Speaker 1: But the reality is that the entirety of the vaccination rollout hasn't gone according to plan, because I don't think that Democrats, once again, had their eyes wide open to what it was that Republicans were going to do. That there was going to be an entire network like Fox News that was going to be an anti-vaccine network that has tens of millions of people viewing it every single day. The other reality, right: we don't have the infrastructure bill. And so when we're going to the polls, when Democrats are getting ready to campaign, what the fuck are they campaigning on? Because we've lost women's rights, right, in Texas right now. We know that a judge has intervened; a federal judge in Texas has put a stop, right, on the ban. But that's just going to be kicked to the Fifth Circuit Court, which we know is extremely conservative, and then it'll make its way up to Trump's Supreme Court, right? So what is the likelihood that we don't lose women's rights altogether?
Speaker 1: And mind you, just let me remind you that December first is going to be when the Supreme Court starts hearing the Mississippi case, which is about a fifteen-week abortion ban, right? Then, on top of that, we can't pass the John Lewis Voting Rights Act, and we can't do anything about the over three hundred bills that have been pushed out across a majority of the states in this country to suppress the vote. So when young people are looking at their fucking future right now, does it look bright to anybody? No, it doesn't. It looks like they're inheriting a lot of bullshit baggage from the boomers on down. So when we're looking at these numbers, I don't want us to turn around and start blaming young people, right, because this is where it goes. If Democrats lose in the midterms, who do you think is going to be the scapegoats? It's gonna be Black people, and it's gonna be young people. As opposed to saying, here's the thing: you haven't delivered shit. We delivered you Congress and the fucking White House and you can't get dick done.
Speaker 1: So how is it that we're supposed to be the ones that are creating the enthusiasm when we have a Democratic administration and a Democratic Congress that is at a, what, a stalemate with themselves? So of course you have eighteen-to-twenty-nine-year-olds that are looking around and saying, fuck this, what am I voting for? And it's not just them. And I can't say that I blame them, because, again, we're looking around. It has been nine months, nine months into this administration, and we have gotten very little done, and our democracy is on the fucking line. And people are looking around and saying, we can't address climate change head on. We don't have massive legislation for that. There's only a piece of that within the infrastructure bill. But remember, Joe Manchin wasn't comfortable with climate language. Mr. Coal doesn't like us creating infrastructure and making demands of companies to, you know, combat this pressing crisis that we are in. He doesn't like the dollar amount, so he doesn't want the American people's own tax dollars, right, to be invested back into the American people.
Speaker 1: Then you have Kyrsten Sinema, who won't even get on camera to talk about what the fuck she is for or against, so we don't know. You have her own constituents chasing that bitch down in the bathroom, and her deciding to say, oh, this is inappropriate. You know what's inappropriate? Being a senator and not doing anything for the people that actually voted and campaigned to get you into office, and doing an about-face on whether or not you're going to be a part of our democracy's future or its demise. So, of course, when we are looking at these numbers, it makes sense to me that young people are looking around and saying, what am I voting for? And that is a huge problem for Democrats, because they're the ones who, once again... Do you know what issues these folks care about? Well, let me tell you. So the number one issue for women eighteen to twenty-nine, no shocker, is abortion. Thirteen percent of women. That is the number one issue of young people in this age group.
Speaker 1: Of course it is, because if they can't control when and how they have a family, how are they going to be able to control their careers? How are they going to be able to control whether or not they're able to access higher education if they're saddled with kids that they can't afford? So that's a major issue. And again, what is it that this administration is doing to ensure that all people that are able to give birth have access to safe, right, and readily available abortions? So that's another thing. The second thing that folks are really concerned about overall, right, men and women both: twenty-four percent of women and twenty percent of men say that COVID-19 is their number one issue.
Speaker 1: Well, of course it is, because all of our lives have been turned upside down, with parents playing Russian roulette with their children's lives by sending them back to school, not knowing when, in fact, they're going to be told that the school is shutting down or the kid has to stay home for two weeks because of a COVID outbreak, right? And again, we have all of the information, but we are not disseminating it. Democrats are not disseminating it in a way that is breaking through the misinformation. This has been a problem since twenty sixteen, folks, and we didn't do anything about it then. So again, you're looking at the issues that matter to young people. On top of that, it is rising prices, right, and increased debt. The Biden administration only eliminated student debt for certain groups of people, and it was those from the disabled community and a few others. That's not eliminating student debt, right? That is crushing people. That is not allowing them to invest in buying homes.
Speaker 1: That is having them live at home with their families, right, and their parents, because they can't afford to live on their own. That is stopping people. Because here's another study that I want to bring up as well, which is just fucking wild. But here's another report that ties in to the fact that young people are not enthusiastic about voting. According to Time magazine and a recent Pew Research analysis, this is the title of the article: men are now more likely to be single than women, and it's not a good sign. And in the Pew Research data, what they are finding is that men, single men, are more likely to be living at home with their parents, and that they're not making nearly as much money as they once were, and that they're not as educated as the women are. You see, because society was set up in a way that men, by virtue of being men, were always going to make more money and therefore didn't need to go and get higher ed. Women, on the other hand, needed to build up their resumes just to be treated as equal. And we still are not paid as equals or treated as equals in the workplace.
Speaker 1: But now this is trickling down. Let me read you this piece. Almost a third of adult single men live with a parent. That is startling. Single men are much more likely to be unemployed, financially fragile, and to lack a college degree than those with a partner. They're also likely to have lower median earnings; single men earned less in twenty nineteen than in nineteen ninety, even adjusting for inflation. Single women, meanwhile, earned the same as they did thirty years ago, because we don't have a fucking equal pay act in this country and so we can continue to pay women less for the same job, but those with partners have increased their earnings by fifty percent. Those are some of the findings in the new Pew Research analysis of twenty nineteen data on the growing gap between American adults who live with a partner and those who do not.
Speaker 1: While the study is less about the effect of marriage and more about the effect of changing economic circumstances, those are circumstances that have an effect on marriage. Because here's the idea: are you going to be actively dating, actively wanting to join your life with somebody else, when you are living at home with your parents and don't have a stable income? Once again, this all comes back to our failing economy. The reality is this: it is very expensive to live in America, right? It is very expensive to go to college. You're talking about average people racking up debts of six figures for a higher education that is not necessarily going to command them the salary that is going to help them pay off that debt. So why go? This is why another study came out and said that more men are not going into higher education, that more men are dropping out of college. Because what is happening here is that our economy is not keeping pace, right, and people are not making as much as they used to. Let me go on to show this, because there was another startling issue here that I found.
Speaker 1: And of course, when we look at communities of color, these numbers are even worse. I'll go on. The trend has not had an equal impact across all sectors of society. The Pew study, which uses information from the twenty nineteen American Community Survey, notes that men are now more likely to be single than women, which was not the case thirty years ago. Black people are much more likely to be single, get this, at fifty-nine percent, than any other race, and Black women, at sixty-two percent, are the most likely to be single of any sector. Asian people, at twenty-nine percent, are the least likely to be single, followed by whites at thirty-three percent and Hispanics at thirty-eight percent. Most researchers agree that the trend lines, showing that fewer people are getting married and that those who do are increasingly better off financially, have a lot to do with the effect of wealth and education on marriage, rather than vice versa. People who are financially stable are just more likely to find and attract a partner. So again, this goes back to: what are we doing to ensure that people are not carrying crippling debt?
331 00:20:51,440 --> 00:20:55,120 Speaker 1: What are we doing to increase and make certain that 332 00:20:55,160 --> 00:20:59,359 Speaker 1: we have good paying jobs? I did a debate recently 333 00:20:59,680 --> 00:21:03,560 Speaker 1: with a conservative, right wing Republican who said, I don't 334 00:21:03,560 --> 00:21:06,400 Speaker 1: know what this study is talking about, because there are 335 00:21:06,440 --> 00:21:10,560 Speaker 1: so many jobs. I see many job openings. And I'm like, 336 00:21:10,920 --> 00:21:14,040 Speaker 1: just because you see job openings in your neighborhood does 337 00:21:14,080 --> 00:21:17,919 Speaker 1: not mean that that translates to the nation. And also, 338 00:21:18,400 --> 00:21:21,960 Speaker 1: the reason why there are so many job openings in retail, 339 00:21:22,080 --> 00:21:26,159 Speaker 1: for instance, right now, is because it doesn't fucking pay 340 00:21:26,440 --> 00:21:30,000 Speaker 1: and it doesn't have health benefits. And when you're living 341 00:21:30,280 --> 00:21:34,200 Speaker 1: in the midst of a health pandemic, having health insurance 342 00:21:34,280 --> 00:21:38,760 Speaker 1: and health benefits is really fucking good. So why would 343 00:21:38,760 --> 00:21:41,959 Speaker 1: you be at a job that pays you minimum wage 344 00:21:42,080 --> 00:21:45,600 Speaker 1: or below and doesn't provide you with benefits? You're better 345 00:21:45,760 --> 00:21:49,919 Speaker 1: off not going into those positions. So when we're talking 346 00:21:49,960 --> 00:21:52,320 Speaker 1: about whether or not people are going to be able 347 00:21:52,320 --> 00:21:56,320 Speaker 1: to leave home, we're talking about the reality that, here's 348 00:21:56,320 --> 00:22:00,919 Speaker 1: the thing, we need better jobs, good paying jobs for 349 00:22:01,080 --> 00:22:05,440 Speaker 1: all people, because right now folks are opting out and 350 00:22:05,520 --> 00:22:11,040 Speaker 1: that's the problem, folks. 
I am so excited to welcome 351 00:22:11,080 --> 00:22:15,400 Speaker 1: to the show for the very first time the campaign 352 00:22:15,480 --> 00:22:20,840 Speaker 1: director of Kairos, the digital-first organizers behind the Tech Is 353 00:22:20,880 --> 00:22:25,960 Speaker 1: Not Neutral campaign and the recently launched Facebook Logout campaign, 354 00:22:26,920 --> 00:22:33,240 Speaker 1: Jelani Drew-Davi. Jelani, can you talk to us about 355 00:22:33,400 --> 00:22:39,480 Speaker 1: your initial reactions to the Facebook whistleblower Frances Haugen and 356 00:22:40,400 --> 00:22:43,240 Speaker 1: what we have learned? I mean, because I will 357 00:22:43,240 --> 00:22:45,399 Speaker 1: tell you this, I don't know if I needed the 358 00:22:45,480 --> 00:22:49,240 Speaker 1: internal documents to tell me that Facebook was putting profits 359 00:22:49,240 --> 00:22:51,920 Speaker 1: over people and that they were more concerned with making 360 00:22:51,960 --> 00:22:55,280 Speaker 1: money than they were with stopping hate groups and white supremacy. 361 00:22:56,240 --> 00:23:01,520 Speaker 1: But what were your reactions and the reactions of Kairos? Yeah, 362 00:23:01,560 --> 00:23:04,520 Speaker 1: so I'm first super excited to talk to you and 363 00:23:04,560 --> 00:23:08,280 Speaker 1: be here about it. I think that, you know, the 364 00:23:08,320 --> 00:23:11,879 Speaker 1: first reactions that we had as Kairos, as an organization 365 00:23:11,920 --> 00:23:15,040 Speaker 1: that's been kind of doing this work for a while, 366 00:23:15,160 --> 00:23:19,160 Speaker 1: both in the campaigning and organizing realm, but also 367 00:23:19,240 --> 00:23:23,560 Speaker 1: building up organizers to do digital organizing and do it 368 00:23:23,640 --> 00:23:26,800 Speaker 1: on tech issues, I think it was the same. You know, 369 00:23:27,040 --> 00:23:29,560 Speaker 1: I didn't really need the internal documents. 
I didn't really 370 00:23:29,600 --> 00:23:33,840 Speaker 1: need to hear the whistleblower Frances saying the things that 371 00:23:33,880 --> 00:23:37,520 Speaker 1: she was saying, and it was reiterating a point that, 372 00:23:37,720 --> 00:23:40,919 Speaker 1: you know, civil rights groups and accountability groups have been 373 00:23:40,960 --> 00:23:44,400 Speaker 1: saying for years and that us as users 374 00:23:44,480 --> 00:23:48,360 Speaker 1: have been experiencing for years. And so, you know, always very 375 00:23:48,440 --> 00:23:51,920 Speaker 1: appreciative of whistleblowers, especially from Silicon Valley, 376 00:23:51,960 --> 00:23:54,879 Speaker 1: because it's very hard to, you know, come out and 377 00:23:55,480 --> 00:23:58,320 Speaker 1: say the things and be public about it. So very 378 00:23:58,320 --> 00:24:03,399 Speaker 1: appreciative of that. Also, I think, you know, it wasn't a 379 00:24:03,400 --> 00:24:06,919 Speaker 1: surprise at all. You know, I think that one of 380 00:24:06,920 --> 00:24:11,760 Speaker 1: the things that we are all learning, which is really disconcerting, 381 00:24:12,440 --> 00:24:18,119 Speaker 1: is the fact that white supremacy, right, hate, homophobia, transphobia, 382 00:24:18,200 --> 00:24:21,640 Speaker 1: all of these things have no boundaries, right. They now 383 00:24:21,680 --> 00:24:25,000 Speaker 1: have algorithms, right. We have a way of being able 384 00:24:25,000 --> 00:24:29,720 Speaker 1: to tap into people's worst instincts. And we're now using 385 00:24:29,800 --> 00:24:33,879 Speaker 1: platforms that were originally, I think, created to bring people together, 386 00:24:34,200 --> 00:24:38,959 Speaker 1: to bring families and communities and students and all of 387 00:24:39,000 --> 00:24:43,240 Speaker 1: these different types of people, seven billion people, right, together. 
388 00:24:43,960 --> 00:24:47,560 Speaker 1: And now what we have seen, at least through the 389 00:24:47,640 --> 00:24:52,240 Speaker 1: activation of these hate groups using Facebook, is just an 390 00:24:52,240 --> 00:24:56,679 Speaker 1: increase in tribalism, an increase in how things go viral 391 00:24:56,800 --> 00:25:00,760 Speaker 1: or what makes things go viral. Talk to us about 392 00:25:00,800 --> 00:25:05,720 Speaker 1: the Tech Is Not Neutral campaign and why it's important 393 00:25:05,720 --> 00:25:08,840 Speaker 1: for us to have these conversations about who is actually 394 00:25:08,920 --> 00:25:13,160 Speaker 1: creating the tech, who is creating the artificial intelligence, who 395 00:25:13,280 --> 00:25:18,359 Speaker 1: is creating these algorithms, and through what perspective they're doing so. Yeah, 396 00:25:18,400 --> 00:25:22,040 Speaker 1: for sure. I think, you know, tech company products have 397 00:25:22,200 --> 00:25:27,480 Speaker 1: real life consequences for people, for real people. And when 398 00:25:27,480 --> 00:25:29,399 Speaker 1: I say that too, I also want to put a 399 00:25:29,440 --> 00:25:33,080 Speaker 1: little, like, asterisk: the people who experience the most 400 00:25:33,080 --> 00:25:35,320 Speaker 1: harm are also the same people who experience the most 401 00:25:35,320 --> 00:25:39,040 Speaker 1: harm offline, and those are going to be, like, marginalized communities, 402 00:25:39,080 --> 00:25:44,240 Speaker 1: Black and brown people, LGBT people, etc. 
And so, you know, 403 00:25:44,560 --> 00:25:48,440 Speaker 1: it is so, so important to call out and realize 404 00:25:48,440 --> 00:25:52,080 Speaker 1: who is making these products, who is writing these platforms, 405 00:25:52,080 --> 00:25:55,280 Speaker 1: who is making the decisions when it comes to things 406 00:25:55,280 --> 00:26:00,760 Speaker 1: like content moderation, disinformation, even just the way that the 407 00:26:00,800 --> 00:26:04,040 Speaker 1: platform is, like, functioning. And so, you know, the Tech 408 00:26:04,200 --> 00:26:07,560 Speaker 1: Is Not Neutral campaign that we launched last year 409 00:26:07,600 --> 00:26:11,160 Speaker 1: was really in response to seeing tech companies jumping on 410 00:26:11,200 --> 00:26:15,640 Speaker 1: this bandwagon of Black Lives Matter. I was seeing, you know, 411 00:26:16,280 --> 00:26:19,879 Speaker 1: an Amazon, um, Black Lives Matter statement, and I was like, 412 00:26:19,960 --> 00:26:24,200 Speaker 1: that's really, that's really rich, because Amazon is doing so 413 00:26:24,240 --> 00:26:27,960 Speaker 1: many harmful things to communities of color around the world, 414 00:26:28,440 --> 00:26:30,520 Speaker 1: and so for them to put up their little black square, 415 00:26:30,560 --> 00:26:34,920 Speaker 1: it was like, no. And same thing with Google, same 416 00:26:34,920 --> 00:26:41,800 Speaker 1: thing with Nextdoor and Microsoft, for example, saying 417 00:26:41,840 --> 00:26:45,919 Speaker 1: those statements and also playing with our lives, playing with 418 00:26:45,920 --> 00:26:49,160 Speaker 1: our democracy. And so, you know, we launched that campaign 419 00:26:49,600 --> 00:26:54,800 Speaker 1: to point out the, I guess, like, hypocritical nature 420 00:26:54,880 --> 00:26:59,080 Speaker 1: of those statements and then also get people 421 00:26:59,119 --> 00:27:03,280 Speaker 1: to start thinking about tech not being neutral. 
So it's 422 00:27:03,280 --> 00:27:06,160 Speaker 1: a campaign, but it's also a framework. It's a way 423 00:27:06,160 --> 00:27:10,800 Speaker 1: of thinking around technology, and also it's a way of 424 00:27:10,800 --> 00:27:12,840 Speaker 1: thinking about a vision of the world that we do 425 00:27:12,920 --> 00:27:15,000 Speaker 1: want to see. We want to create a world where 426 00:27:15,000 --> 00:27:17,240 Speaker 1: tech works for us. We want to create a world 427 00:27:17,240 --> 00:27:20,760 Speaker 1: where we are not told by Silicon Valley that this place, 428 00:27:20,880 --> 00:27:24,360 Speaker 1: this platform, this product is neutral, when that's just not 429 00:27:24,400 --> 00:27:28,200 Speaker 1: the truth. It all sways one way or the other. 430 00:27:28,840 --> 00:27:30,560 Speaker 1: You know, what's so funny is that every time that 431 00:27:30,600 --> 00:27:33,240 Speaker 1: I hear somebody say that something is neutral, it is 432 00:27:33,280 --> 00:27:35,880 Speaker 1: almost an alarm that goes off in my head for 433 00:27:35,920 --> 00:27:38,200 Speaker 1: me to be like, oh, so it's the opposite of 434 00:27:38,200 --> 00:27:40,800 Speaker 1: what it is that you're saying. I think that, you know, 435 00:27:40,880 --> 00:27:43,920 Speaker 1: one of the frustrating points, too, is that we wanted 436 00:27:43,960 --> 00:27:47,640 Speaker 1: to believe that technology was going to be the great equalizer, right, 437 00:27:48,080 --> 00:27:52,000 Speaker 1: was going to be this democratized space, the Internet, 438 00:27:52,200 --> 00:27:54,480 Speaker 1: where everyone was going to be able to have the 439 00:27:54,520 --> 00:27:57,760 Speaker 1: same amount of access, was going to be able to 440 00:27:57,800 --> 00:28:00,919 Speaker 1: post in the same way, you know. And that's not 441 00:28:00,960 --> 00:28:04,800 Speaker 1: the case. 
What we know, right, is from, 442 00:28:05,320 --> 00:28:08,440 Speaker 1: I would even go with the content creators on TikTok, 443 00:28:08,480 --> 00:28:11,600 Speaker 1: the Black content creators on TikTok who went on strike, 444 00:28:11,920 --> 00:28:14,400 Speaker 1: who are just like, so we're not putting our information 445 00:28:14,440 --> 00:28:17,680 Speaker 1: out there. I follow so many sites that are talking 446 00:28:17,720 --> 00:28:20,840 Speaker 1: about white supremacy and they're having their content pulled down 447 00:28:21,080 --> 00:28:23,840 Speaker 1: as hate speech, and they are talking 448 00:28:24,119 --> 00:28:26,880 Speaker 1: about the harms of white supremacy and how it is spreading, 449 00:28:27,040 --> 00:28:30,200 Speaker 1: and yet they are the ones that are being blocked 450 00:28:30,280 --> 00:28:33,719 Speaker 1: or banned or having their accounts disabled. How do you 451 00:28:33,800 --> 00:28:37,800 Speaker 1: think that we deal with those issues? The fact is that, 452 00:28:38,120 --> 00:28:41,840 Speaker 1: you know, we're looking to Congress, right. I think that 453 00:28:41,880 --> 00:28:46,280 Speaker 1: the whistleblower has laid out a kind of, you know, 454 00:28:46,880 --> 00:28:51,000 Speaker 1: trail of gingerbread crumbs, right, for them to develop a 455 00:28:51,120 --> 00:28:55,320 Speaker 1: case, right, against Facebook, but it's like, do they even 456 00:28:55,480 --> 00:28:59,800 Speaker 1: understand what algorithms are? Do they even understand what needs 457 00:28:59,840 --> 00:29:03,880 Speaker 1: to be regulated? And if Kairos, if your organization, 458 00:29:04,360 --> 00:29:07,960 Speaker 1: had their way, what are some ways in which 459 00:29:08,120 --> 00:29:11,000 Speaker 1: we would ensure that the internet is free, that it 460 00:29:11,080 --> 00:29:16,080 Speaker 1: is democratized, that people are safe? 
Yeah, so I think 461 00:29:16,160 --> 00:29:18,720 Speaker 1: the whistleblower definitely laid out a bunch of ways, 462 00:29:18,760 --> 00:29:22,480 Speaker 1: and I think this moment in time is a bit 463 00:29:22,800 --> 00:29:27,720 Speaker 1: different, because we've now seen, from January sixth to now, 464 00:29:27,920 --> 00:29:31,080 Speaker 1: all of the Wall Street Journal files, the whistleblower, how 465 00:29:31,840 --> 00:29:36,120 Speaker 1: Facebook and technologies like Facebook are, like, harming our children, 466 00:29:36,160 --> 00:29:39,520 Speaker 1: our society, our democracy, you name it. They're touching it. 467 00:29:40,040 --> 00:29:42,840 Speaker 1: And I think that, you know, this is both, this 468 00:29:43,080 --> 00:29:48,960 Speaker 1: is both a Congress issue and a way where users 469 00:29:49,040 --> 00:29:51,760 Speaker 1: can also have power. And so I think about it. 470 00:29:51,880 --> 00:29:55,000 Speaker 1: I think, okay, a campaign we're running now, the Facebook Logout, 471 00:29:55,360 --> 00:29:59,120 Speaker 1: is really getting users to harness their own power, to 472 00:29:59,240 --> 00:30:02,320 Speaker 1: realize their own power as people who make Facebook and 473 00:30:02,440 --> 00:30:06,680 Speaker 1: Instagram run. Like, those are our engagement, that's our data 474 00:30:06,800 --> 00:30:09,560 Speaker 1: that they're using to make more money, and if we 475 00:30:09,720 --> 00:30:12,800 Speaker 1: take that away, then they stop making money, and therefore 476 00:30:12,840 --> 00:30:14,719 Speaker 1: we can demand the things that we want from them. 
477 00:30:15,600 --> 00:30:20,720 Speaker 1: Users are also the same people as constituents, and so lawmakers, 478 00:30:20,840 --> 00:30:24,320 Speaker 1: I think, also need to realize that users are constituents, 479 00:30:24,720 --> 00:30:26,920 Speaker 1: and so they need to listen to the things that 480 00:30:27,040 --> 00:30:30,560 Speaker 1: we're saying and listen to the harms that we're laying 481 00:30:30,640 --> 00:30:34,800 Speaker 1: out that, you know, places like Facebook have done 482 00:30:34,880 --> 00:30:37,560 Speaker 1: to us. And as far as, like, you know, do 483 00:30:37,720 --> 00:30:39,640 Speaker 1: they even know what to regulate? Do they 484 00:30:39,680 --> 00:30:41,760 Speaker 1: know what algorithms are? I think we've come a long 485 00:30:41,840 --> 00:30:45,200 Speaker 1: way since I started this work, and the hearings and 486 00:30:45,360 --> 00:30:47,680 Speaker 1: the kind of, like, memes that have come out and 487 00:30:47,760 --> 00:30:51,040 Speaker 1: the interesting quotes that have come out of those hearings, 488 00:30:52,840 --> 00:30:54,480 Speaker 1: and I think we still have a way to go. And 489 00:30:54,560 --> 00:30:56,880 Speaker 1: I think that, you know, lawmakers need to start also 490 00:30:57,000 --> 00:31:00,800 Speaker 1: listening to groups like Kairos and our partners, who are saying, like, 491 00:31:00,960 --> 00:31:03,400 Speaker 1: here are the things that we want to see from you. 492 00:31:03,920 --> 00:31:06,440 Speaker 1: We want to see, you know, better things on privacy, 493 00:31:07,040 --> 00:31:09,640 Speaker 1: like protecting children, like we saw in the hearings this week, 494 00:31:11,320 --> 00:31:16,400 Speaker 1: and just being able to regulate technology 495 00:31:16,400 --> 00:31:19,960 Speaker 1: in a way that doesn't take everything from people 496 00:31:20,040 --> 00:31:23,240 Speaker 1: the way that it does. 
Right now, we experienced a 497 00:31:24,760 --> 00:31:29,520 Speaker 1: blackout of sorts, right, when Facebook went down, taking 498 00:31:29,600 --> 00:31:31,920 Speaker 1: down with it all of the entities that they 499 00:31:32,000 --> 00:31:35,640 Speaker 1: have gobbled up and own. Three and a half billion 500 00:31:35,800 --> 00:31:39,120 Speaker 1: people were affected. Now we have the, you know, the 501 00:31:39,320 --> 00:31:42,960 Speaker 1: jokes that people in the United States make about, oh, well, 502 00:31:43,040 --> 00:31:45,320 Speaker 1: I guess I'll just get more work done today because 503 00:31:45,360 --> 00:31:48,120 Speaker 1: I can't scroll through Instagram, and, you know, like, oh, 504 00:31:48,240 --> 00:31:50,120 Speaker 1: look at me, like, I'll clean my house, or there 505 00:31:50,160 --> 00:31:52,000 Speaker 1: are a bunch of babies that were being born because, 506 00:31:52,080 --> 00:31:55,200 Speaker 1: you know, nobody had anything else to do with their time. 507 00:31:55,800 --> 00:32:00,280 Speaker 1: The reality is, though, that globally, for instance, WhatsApp 508 00:32:00,440 --> 00:32:03,360 Speaker 1: is used as a survival tool for many in the 509 00:32:03,560 --> 00:32:09,200 Speaker 1: LGBTQ community, particularly activists in very hostile countries, and so 510 00:32:09,560 --> 00:32:12,280 Speaker 1: what seems like a funny meme to us when these 511 00:32:12,360 --> 00:32:16,400 Speaker 1: platforms go down for a few hours is life threatening 512 00:32:16,440 --> 00:32:19,600 Speaker 1: and consequential to others. Do you think that we have 513 00:32:19,920 --> 00:32:27,040 Speaker 1: enough of a conversation about how these tools have become utilities? 
514 00:32:27,400 --> 00:32:30,840 Speaker 1: Right? Like, it isn't just a nice to have, 515 00:32:30,960 --> 00:32:34,719 Speaker 1: it actually is a need for people that are at 516 00:32:34,840 --> 00:32:40,360 Speaker 1: risk and for small businesses and entrepreneurs. Yeah, so I 517 00:32:40,400 --> 00:32:44,760 Speaker 1: think the Facebook, the Facebook down, Instagram down moment of, 518 00:32:45,280 --> 00:32:47,160 Speaker 1: I believe it was Monday, at the beginning of 519 00:32:47,240 --> 00:32:52,200 Speaker 1: the week. You know, who 520 00:32:52,320 --> 00:32:55,360 Speaker 1: knows what day it is at this point. But I think, 521 00:32:55,480 --> 00:32:57,600 Speaker 1: you know, that really highlighted a couple of things. I 522 00:32:58,080 --> 00:33:00,880 Speaker 1: love the memes, you know, we're quick on Twitter. 523 00:33:01,680 --> 00:33:05,040 Speaker 1: The Spirit Halloween signs over the Facebook thing really killed me. 524 00:33:05,520 --> 00:33:09,160 Speaker 1: And yeah, it, like, it really was horrible for a 525 00:33:09,240 --> 00:33:12,800 Speaker 1: lot of small business owners, for organizers who use Facebook 526 00:33:13,320 --> 00:33:17,600 Speaker 1: for maybe their local elections that are happening now, and 527 00:33:18,040 --> 00:33:20,040 Speaker 1: people across the world, like you said. In addition to 528 00:33:20,200 --> 00:33:24,840 Speaker 1: WhatsApp being, like, a main communication source, also, for a lot of people, the 529 00:33:24,960 --> 00:33:29,200 Speaker 1: internet is just Facebook in places outside of the US. Right, 530 00:33:29,680 --> 00:33:32,440 Speaker 1: we are incredibly privileged to be able to even go 531 00:33:32,640 --> 00:33:34,960 Speaker 1: onto Twitter and have a kiki about, you know, 532 00:33:35,080 --> 00:33:37,400 Speaker 1: Facebook is down, this is what I'm doing, making the 533 00:33:37,520 --> 00:33:40,480 Speaker 1: memes, all that kind of stuff. 
But taking a step back, 534 00:33:40,600 --> 00:33:44,920 Speaker 1: Facebook operates as a public utility, yet they don't, you know, 535 00:33:45,400 --> 00:33:50,720 Speaker 1: communicate as one. They don't take users seriously as 536 00:33:50,760 --> 00:33:54,440 Speaker 1: a public utility should. And I think that is really 537 00:33:54,560 --> 00:33:58,160 Speaker 1: where our kind of unique stance is, is, like, 538 00:33:58,520 --> 00:34:00,760 Speaker 1: we're not saying delete Facebook. If you want to as 539 00:34:00,760 --> 00:34:05,160 Speaker 1: an individual, sure. And the conversation shouldn't be like, well, 540 00:34:05,320 --> 00:34:07,280 Speaker 1: if you don't like it, just get out, and if 541 00:34:07,360 --> 00:34:10,520 Speaker 1: you don't, you know, and if you are on it, 542 00:34:10,680 --> 00:34:12,960 Speaker 1: then you should deal with it. No. Like, this is 543 00:34:13,000 --> 00:34:15,880 Speaker 1: the thing that people use and need. It's probably the 544 00:34:15,960 --> 00:34:19,960 Speaker 1: biggest way to communicate at this point. And as users 545 00:34:20,360 --> 00:34:24,759 Speaker 1: we can basically strike and say, you know, we will 546 00:34:25,520 --> 00:34:27,880 Speaker 1: strike until we get the thing that we need, because 547 00:34:28,239 --> 00:34:30,279 Speaker 1: this is the place where we are doing the things 548 00:34:30,360 --> 00:34:32,799 Speaker 1: that we need to do. 
And so, yeah, I mean, 549 00:34:32,840 --> 00:34:35,799 Speaker 1: I don't, I don't think that we have a kind 550 00:34:35,840 --> 00:34:38,960 Speaker 1: of, like, maybe it's large, maybe it's a nuanced, 551 00:34:39,040 --> 00:34:42,040 Speaker 1: conversation about the ways in which Facebook is actually helpful 552 00:34:42,120 --> 00:34:46,360 Speaker 1: to people, especially globally. And that is a conversation that, 553 00:34:46,520 --> 00:34:48,840 Speaker 1: like, Kairos is interested in, like, continuing 554 00:34:48,960 --> 00:34:51,319 Speaker 1: to have, and then I think that all of us 555 00:34:51,360 --> 00:34:56,120 Speaker 1: need to start having. Given, right, how many people around 556 00:34:56,160 --> 00:35:00,320 Speaker 1: the world, billions of people, are on Facebook's platforms, 557 00:35:01,520 --> 00:35:05,640 Speaker 1: how effective do you think it can be for folks 558 00:35:05,719 --> 00:35:09,759 Speaker 1: to just log out? Like, if small businesses rely on 559 00:35:09,960 --> 00:35:15,520 Speaker 1: this thing, entrepreneurs rely on this platform, organizers, activists rely 560 00:35:15,719 --> 00:35:20,920 Speaker 1: on this platform, how effective is it to just say, well, 561 00:35:20,960 --> 00:35:22,879 Speaker 1: we need to opt out, or we need to log 562 00:35:22,960 --> 00:35:25,000 Speaker 1: out, to your point, not to delete it, but to 563 00:35:25,120 --> 00:35:29,839 Speaker 1: log out until our demands are met? Mm, I think 564 00:35:29,880 --> 00:35:32,640 Speaker 1: the focus of the campaign is as much about 565 00:35:32,719 --> 00:35:37,400 Speaker 1: showing people power as impacting, like, Facebook's bottom line. 
And 566 00:35:37,520 --> 00:35:40,960 Speaker 1: so I think that it's this balance of: we are 567 00:35:41,040 --> 00:35:44,600 Speaker 1: impacting Facebook's bottom line by, like, opting out of seeing 568 00:35:44,760 --> 00:35:48,719 Speaker 1: the ads, and really digging into where and how much they 569 00:35:48,800 --> 00:35:51,759 Speaker 1: get their revenue, which is, like, ninety eight percent advertising 570 00:35:51,840 --> 00:35:55,719 Speaker 1: at this point, and we're showing people power by 571 00:35:55,760 --> 00:35:58,719 Speaker 1: logging out. And I think that, you know, we've 572 00:35:58,840 --> 00:36:01,560 Speaker 1: been contacted by people saying, hey, I run a small business, 573 00:36:01,600 --> 00:36:05,560 Speaker 1: we organize on Facebook, and that's why our logout is temporary, 574 00:36:06,280 --> 00:36:11,160 Speaker 1: and that's why it's on an individual basis, and 575 00:36:11,320 --> 00:36:13,239 Speaker 1: we really are going with the frame of, like, we 576 00:36:13,520 --> 00:36:16,560 Speaker 1: make or break Facebook. And so I think it's this 577 00:36:16,719 --> 00:36:22,000 Speaker 1: kind of, this balance of, or maybe a 578 00:36:22,080 --> 00:36:25,320 Speaker 1: combination of, trying to get people to start thinking differently 579 00:36:25,400 --> 00:36:29,759 Speaker 1: about Facebook and also follow through with an action. 
And 580 00:36:29,880 --> 00:36:34,399 Speaker 1: we hope to have an impact that is both 581 00:36:34,440 --> 00:36:37,800 Speaker 1: the bottom line impact and a change in mind impact, 582 00:36:37,880 --> 00:36:40,279 Speaker 1: so that, you know, we're starting with this pledge now, 583 00:36:40,360 --> 00:36:44,280 Speaker 1: we're starting with this logout period now, and going forward, 584 00:36:44,400 --> 00:36:49,239 Speaker 1: you know, two, three, four years, then we are, 585 00:36:49,520 --> 00:36:52,960 Speaker 1: you know, doing more things, and Mark Zuckerberg 586 00:36:53,080 --> 00:36:56,080 Speaker 1: is listening to us, or, even better, Mark Zuckerberg's not 587 00:36:56,160 --> 00:37:00,399 Speaker 1: CEO of Facebook, because we as users have said, 588 00:37:00,560 --> 00:37:07,520 Speaker 1: we need better leadership. You know, I find Mark Zuckerberg 589 00:37:07,560 --> 00:37:11,200 Speaker 1: to be nauseating on so many different levels, just because 590 00:37:11,200 --> 00:37:15,000 Speaker 1: I think that he's emblematic of, you know, white male 591 00:37:15,280 --> 00:37:19,440 Speaker 1: privilege and wealth and power personified, to a degree that, 592 00:37:20,000 --> 00:37:23,319 Speaker 1: you know, Mark Zuckerberg, to be able 593 00:37:23,400 --> 00:37:26,520 Speaker 1: to put out the lights accidentally, as they said, on 594 00:37:26,680 --> 00:37:29,080 Speaker 1: three and a half billion people, in many ways has more power than 595 00:37:29,120 --> 00:37:31,680 Speaker 1: the president of the United States. 
And that's something that, 596 00:37:31,760 --> 00:37:35,919 Speaker 1: again, I just don't think people get, right. 597 00:37:36,000 --> 00:37:39,400 Speaker 1: I don't think that they fully understand, you know, 598 00:37:39,600 --> 00:37:42,840 Speaker 1: the way in which this one person, who is beholden 599 00:37:42,880 --> 00:37:45,840 Speaker 1: to no one but his shareholders and his own wallet, 600 00:37:46,560 --> 00:37:49,279 Speaker 1: is in charge of their life. You know, part of 601 00:37:49,320 --> 00:37:53,160 Speaker 1: the work that you all do is to empower people, 602 00:37:53,239 --> 00:37:56,600 Speaker 1: to have these campaigns that show people that, you know, 603 00:37:56,800 --> 00:38:00,560 Speaker 1: as was said after the whistleblower came out, when 604 00:38:00,640 --> 00:38:03,560 Speaker 1: you have a free platform, you have to recognize that 605 00:38:04,080 --> 00:38:06,799 Speaker 1: you're the product. If you're not paying for a subscription, 606 00:38:06,840 --> 00:38:09,040 Speaker 1: if you're not paying for the product, then you are 607 00:38:09,120 --> 00:38:12,719 Speaker 1: the product. 
So how do we continue to empower people 608 00:38:12,800 --> 00:38:17,480 Speaker 1: to recognize and fully understand, in this very mindless facial 609 00:38:17,600 --> 00:38:20,439 Speaker 1: recognition, you know, space that we're in, let me log 610 00:38:20,520 --> 00:38:23,360 Speaker 1: into everything with Google and Facebook, you know, so that 611 00:38:23,440 --> 00:38:26,719 Speaker 1: they have all of my passwords and have everything, just 612 00:38:27,080 --> 00:38:31,520 Speaker 1: how, I guess, just how vulnerable they really are? Because 613 00:38:31,560 --> 00:38:34,320 Speaker 1: I think that that's what we were shown with the 614 00:38:35,400 --> 00:38:40,080 Speaker 1: five hour blackout, is that, wow, we're really vulnerable to, 615 00:38:40,600 --> 00:38:45,200 Speaker 1: like, somebody literally just pulling the plug. Yeah, it's super 616 00:38:45,320 --> 00:38:47,200 Speaker 1: important to know that we don't have to actually be 617 00:38:47,400 --> 00:38:51,279 Speaker 1: complacent with the terms that Facebook or any other social 618 00:38:51,360 --> 00:38:54,360 Speaker 1: media platform sets. We don't have to be just grateful 619 00:38:54,400 --> 00:38:57,200 Speaker 1: for the tool, because when the tool goes down, then, 620 00:38:57,800 --> 00:39:02,040 Speaker 1: that's what we experienced on Monday. 
I think there are a couple 621 00:39:02,080 --> 00:39:04,759 Speaker 1: of threads in what you were saying. Like, one is 622 00:39:04,840 --> 00:39:08,960 Speaker 1: definitely about data privacy, is definitely about, you know, the 623 00:39:09,080 --> 00:39:14,320 Speaker 1: convenience that is single click sign-ons, giving my information 624 00:39:14,440 --> 00:39:18,960 Speaker 1: over to this platform so it can work easier, better, 625 00:39:19,000 --> 00:39:21,279 Speaker 1: or whatever, for me. I think that's kind of a 626 00:39:22,880 --> 00:39:26,839 Speaker 1: false, a false choice, and that's something the whistleblower 627 00:39:26,960 --> 00:39:29,239 Speaker 1: was saying too, like, Facebook has presented us with this 628 00:39:29,680 --> 00:39:32,640 Speaker 1: false choice, like, we do actually have choices here. 629 00:39:32,960 --> 00:39:36,560 Speaker 1: We do have the choice to give information or not. 630 00:39:37,760 --> 00:39:40,160 Speaker 1: And I think just even being armed with that knowledge 631 00:39:40,280 --> 00:39:43,880 Speaker 1: of, I don't have to just give Facebook all of 632 00:39:43,920 --> 00:39:47,960 Speaker 1: my information, that is a game changer in and of itself. 633 00:39:48,440 --> 00:39:50,719 Speaker 1: And so I think it's this continuous beating of the 634 00:39:50,800 --> 00:39:52,960 Speaker 1: drum of, like, you don't have to just be grateful 635 00:39:53,000 --> 00:39:55,440 Speaker 1: for the tool, you don't have to just give all 636 00:39:55,440 --> 00:40:01,320 Speaker 1: your information, you don't have to just, etc., with these platforms. 
637 00:40:02,480 --> 00:40:06,960 Speaker 1: You know, I read a very terrifying article, 638 00:40:07,560 --> 00:40:12,720 Speaker 1: and watched a terrifying documentary, Coded Bias, on how, essentially, 639 00:40:13,080 --> 00:40:18,400 Speaker 1: you know, code is going to exacerbate our racism problem 640 00:40:18,800 --> 00:40:21,520 Speaker 1: in this country and around the world. But the terrifying 641 00:40:21,640 --> 00:40:25,440 Speaker 1: article that I read was about Venice in Italy and 642 00:40:25,600 --> 00:40:30,160 Speaker 1: how they are deciding to deploy facial recognition, right, 643 00:40:30,280 --> 00:40:33,680 Speaker 1: within the city walls to be able to track tourists 644 00:40:34,040 --> 00:40:37,040 Speaker 1: so that they can deal with overcrowding. And I use 645 00:40:37,120 --> 00:40:40,000 Speaker 1: that in air quotes, because essentially they are testing out 646 00:40:40,040 --> 00:40:43,560 Speaker 1: a mechanism to create an even longer digital leash, where- 647 00:40:43,840 --> 00:40:47,560 Speaker 1: by every CCTV, everything that you pass, will be able 648 00:40:47,640 --> 00:40:50,279 Speaker 1: to have a snapshot. They have tools that are able 649 00:40:50,320 --> 00:40:52,960 Speaker 1: to pull data from your phone just by 650 00:40:53,080 --> 00:40:57,000 Speaker 1: virtue of you passing by. 
You know, we see 651 00:40:57,120 --> 00:41:00,759 Speaker 1: these little articles pop up every once in a while, and 652 00:41:00,880 --> 00:41:04,040 Speaker 1: because it's in Venice, nobody is really paying attention to 653 00:41:04,160 --> 00:41:06,040 Speaker 1: the fact that, I believe, they are using it 654 00:41:06,120 --> 00:41:08,640 Speaker 1: as a testing ground to see how well it works 655 00:41:08,920 --> 00:41:11,840 Speaker 1: in this city of fifty thousand people before it is 656 00:41:12,040 --> 00:41:15,080 Speaker 1: used everywhere, right, before they decide to adopt it all 657 00:41:15,160 --> 00:41:18,120 Speaker 1: over Italy, in the name of safety. How do you 658 00:41:18,239 --> 00:41:25,080 Speaker 1: think that we maintain, right, our safety, but at the 659 00:41:25,160 --> 00:41:29,360 Speaker 1: same time keep pace with the innovation that is happening, 660 00:41:29,480 --> 00:41:33,160 Speaker 1: keep pace with technology that is going to be used 661 00:41:33,320 --> 00:41:37,239 Speaker 1: to have us in these digital cages of essentially our 662 00:41:37,320 --> 00:41:42,560 Speaker 1: own making, because we've provided them with the information? I 663 00:41:42,640 --> 00:41:46,040 Speaker 1: think that's another thing. Around those two things, there's 664 00:41:46,080 --> 00:41:48,880 Speaker 1: one thing around, like, knowledge of what's even happening in 665 00:41:48,960 --> 00:41:51,759 Speaker 1: the first place. Sometimes that is really hard because, you know, 666 00:41:51,920 --> 00:41:54,120 Speaker 1: these little articles pop up every now and again, or 667 00:41:54,280 --> 00:41:56,839 Speaker 1: it's secret.
And I think in the case of many 668 00:41:56,920 --> 00:41:59,880 Speaker 1: states in the United States, like, there's been this secrecy 669 00:42:00,040 --> 00:42:03,399 Speaker 1: around, you know, biometric surveillance, which is what you're talking about, 670 00:42:03,520 --> 00:42:09,640 Speaker 1: facial recognition, and other ways. There's been this secrecy, 671 00:42:09,920 --> 00:42:12,759 Speaker 1: and people have found out about it, and organizers have 672 00:42:12,840 --> 00:42:16,360 Speaker 1: actually organized to change the laws that make those things possible. 673 00:42:16,719 --> 00:42:19,680 Speaker 1: And so there are states in the United States. 674 00:42:19,719 --> 00:42:22,520 Speaker 1: I think Massachusetts is one of them, and there are, 675 00:42:23,040 --> 00:42:27,040 Speaker 1: I believe, if not California as a whole, then 676 00:42:27,320 --> 00:42:32,279 Speaker 1: cities in California that have banned biometric surveillance or banned 677 00:42:32,400 --> 00:42:39,000 Speaker 1: facial recognition use by police or by cities; it varies. But 678 00:42:39,120 --> 00:42:41,879 Speaker 1: I think that is one way that we'll really 679 00:42:41,960 --> 00:42:45,680 Speaker 1: tackle this: we get people who are in leadership, we 680 00:42:45,800 --> 00:42:51,640 Speaker 1: get organizers focused on this tech issue as it relates 681 00:42:51,800 --> 00:42:57,239 Speaker 1: to things like racism, things like homophobia, transphobia, all 682 00:42:57,280 --> 00:42:59,480 Speaker 1: of the things that we're trying to fight.
Tech, like, 683 00:42:59,560 --> 00:43:02,840 Speaker 1: intersects with those, and so even getting people to 684 00:43:02,960 --> 00:43:07,560 Speaker 1: think about how, like, facial recognition 685 00:43:07,719 --> 00:43:11,560 Speaker 1: is not just some far-off thing, but 686 00:43:11,640 --> 00:43:15,919 Speaker 1: actually something that's happening right now, and that goes 687 00:43:15,960 --> 00:43:18,879 Speaker 1: for all tech issues as well. So I think it's, 688 00:43:19,080 --> 00:43:22,960 Speaker 1: I think it's being aware of it ourselves and doing 689 00:43:23,040 --> 00:43:25,680 Speaker 1: the work to connect the dots for people, and then 690 00:43:25,800 --> 00:43:28,520 Speaker 1: doing the work to then ban those things as far 691 00:43:28,560 --> 00:43:32,680 Speaker 1: as laws go. I mean, I just even think, yeah, 692 00:43:32,840 --> 00:43:35,640 Speaker 1: I think that it has to be a massive 693 00:43:35,840 --> 00:43:39,160 Speaker 1: information campaign, because I don't even think that people recognize 694 00:43:39,239 --> 00:43:42,720 Speaker 1: what is happening. You know, we, you know, pay attention 695 00:43:43,000 --> 00:43:45,520 Speaker 1: and are reading the news and are following these stories, 696 00:43:45,840 --> 00:43:48,800 Speaker 1: but the average person is not. And, you know, 697 00:43:49,080 --> 00:43:51,399 Speaker 1: day by day we are giving away more and more 698 00:43:51,480 --> 00:43:53,319 Speaker 1: of our civil liberties, and we're going to wake up 699 00:43:53,360 --> 00:43:55,279 Speaker 1: one day and they're just going to be gone, and 700 00:43:55,520 --> 00:43:59,280 Speaker 1: we'll be locked inside of these digital cages. Please 701 00:43:59,360 --> 00:44:03,200 Speaker 1: tell folks on Woke AF how they can participate in the 702 00:44:03,280 --> 00:44:07,759 Speaker 1: Facebook Logout and when it's happening.
Yeah, absolutely. So, the 703 00:44:07,960 --> 00:44:12,280 Speaker 1: Facebook Logout. You can participate by going to the Facebook 704 00:44:12,360 --> 00:44:15,640 Speaker 1: Logout dot com, and on there you can sign up 705 00:44:15,640 --> 00:44:18,120 Speaker 1: to take the pledge to log out of Facebook. So let's 706 00:44:18,120 --> 00:44:19,719 Speaker 1: just start with the pledge. We're not asking you to 707 00:44:19,840 --> 00:44:25,120 Speaker 1: do anything right now, but in November we will take 708 00:44:25,640 --> 00:44:30,600 Speaker 1: log-off days. So from now until November, bring your friends, 709 00:44:30,680 --> 00:44:33,680 Speaker 1: bring your family, bring your aunties to the Facebook Logout 710 00:44:33,760 --> 00:44:37,799 Speaker 1: dot com. Sign up, you will get regular communications from 711 00:44:37,960 --> 00:44:40,879 Speaker 1: us, and then join us on a couple of days 712 00:44:40,920 --> 00:44:44,120 Speaker 1: in November to really just take days off of Facebook 713 00:44:44,200 --> 00:44:47,160 Speaker 1: and show Facebook, you know, we have power and we 714 00:44:47,400 --> 00:44:51,279 Speaker 1: know how to use it. Jelani, also tell people who 715 00:44:51,320 --> 00:44:54,360 Speaker 1: are interested in learning more about Kairos or becoming trained 716 00:44:55,160 --> 00:44:58,000 Speaker 1: in the many programs and campaigns that you all put together, 717 00:44:58,320 --> 00:45:01,440 Speaker 1: how would they get that information as well? Yeah, so 718 00:45:01,600 --> 00:45:05,319 Speaker 1: you can go to Kairos Fellows dot org to learn 719 00:45:05,440 --> 00:45:08,520 Speaker 1: more about us. We also have Compirisaction dot org to 720 00:45:08,640 --> 00:45:13,800 Speaker 1: learn more about our more legislative C4 activities.
And 721 00:45:14,160 --> 00:45:15,960 Speaker 1: I think I would say, too, the two things that 722 00:45:16,080 --> 00:45:20,239 Speaker 1: Kairos does well is build leaders and strategies for, you know, 723 00:45:20,360 --> 00:45:24,600 Speaker 1: contending for power in this, like, digital-realm, internet-centric world 724 00:45:24,680 --> 00:45:28,400 Speaker 1: we live in. And we run campaigns like tech usant 725 00:45:28,520 --> 00:45:31,120 Speaker 1: Mutual and the Facebook Logout. So if those things are 726 00:45:31,160 --> 00:45:36,359 Speaker 1: of interest to you, definitely go to Kairos Fellows dot org 727 00:45:37,160 --> 00:45:42,000 Speaker 1: and then the Facebook Logout dot com. Fantastic. Jelani Drew-Davi, 728 00:45:42,120 --> 00:45:44,279 Speaker 1: thank you so much for making the time to join Woke 729 00:45:44,520 --> 00:45:46,719 Speaker 1: AF. Thank you for the work that you are doing, 730 00:45:47,000 --> 00:45:49,840 Speaker 1: that you continue to bring awareness to these issues that 731 00:45:49,920 --> 00:45:53,040 Speaker 1: I think more people need to be paying attention to. Absolutely, 732 00:45:53,120 --> 00:45:56,360 Speaker 1: thank you so much for having us. It's great, appreciate you. 733 00:46:01,440 --> 00:46:06,080 Speaker 1: That is it for me today on Woke AF. As always, 734 00:46:06,680 --> 00:46:09,480 Speaker 1: power to the people and to all the people power, 735 00:46:09,920 --> 00:46:12,279 Speaker 1: get woke and stay woke as fuck.