1 00:00:04,000 --> 00:00:09,520 Speaker 1: From Futuro Media and PRX, it's Latino USA. I'm Maria Hinojosa. Today, 2 00:00:09,600 --> 00:00:16,520 Speaker 1: the rise of disinformation and misinformation targeting Latino and Latina voters. 3 00:00:20,680 --> 00:00:24,639 Speaker 1: As the country enters the twenty twenty two midterm election cycle, 4 00:00:25,360 --> 00:00:29,040 Speaker 1: more and more attention has been given to how disinformation 5 00:00:29,680 --> 00:00:35,040 Speaker 1: and misinformation are impacting Latino and Latina voters. This roundtable 6 00:00:35,080 --> 00:00:37,519 Speaker 1: discussion is going to kick off our twenty twenty two 7 00:00:37,640 --> 00:00:41,840 Speaker 1: election cycle coverage right here at Latino USA, and I'm 8 00:00:41,880 --> 00:00:46,760 Speaker 1: joined by my In The Thick political podcast co-host Julio Ricardo Varela. 9 00:00:46,840 --> 00:00:48,400 Speaker 2: Hey Julio. Hey Maria. 10 00:00:48,960 --> 00:00:52,640 Speaker 1: We're also joined by guests Maria Teresa Kumar, founding president 11 00:00:52,680 --> 00:00:56,840 Speaker 1: and CEO of the voting rights organization Voto Latino, and 12 00:00:57,160 --> 00:01:01,360 Speaker 1: Jaime Longoria, manager of research and training for the Disinfo 13 00:01:01,400 --> 00:01:06,720 Speaker 1: Defense League at the Media Democracy Fund. Hey Maria Teresa. 14 00:01:06,440 --> 00:01:07,440 Speaker 3: Thanks for having me, gracias. 15 00:01:07,480 --> 00:01:09,440 Speaker 4: Hi Maria, thanks for having me on. 16 00:01:10,240 --> 00:01:14,120 Speaker 1: So today we're going to be talking about unpacking disinformation 17 00:01:14,400 --> 00:01:20,160 Speaker 1: and misinformation in our community, specifically, yes, the Latino, Latina, 18 00:01:20,240 --> 00:01:24,520 Speaker 1: Latinx community. 
So, the midterms are happening in November and 19 00:01:24,720 --> 00:01:28,920 Speaker 1: Latinos and Latinas make up the largest ethnic voting bloc, 20 00:01:29,640 --> 00:01:32,640 Speaker 1: and so there's so much at stake in terms of 21 00:01:32,640 --> 00:01:36,160 Speaker 1: Democrats and Republicans. The Democrats are desperately trying to hold 22 00:01:36,200 --> 00:01:38,800 Speaker 1: on to their slim majority in the Senate, and with 23 00:01:38,920 --> 00:01:43,160 Speaker 1: thirty five Senate seats up for election, thirty five, this 24 00:01:43,240 --> 00:01:45,800 Speaker 1: could really tilt the balance either way. It could set 25 00:01:45,840 --> 00:01:48,840 Speaker 1: the stage obviously for the twenty twenty four presidential election. 26 00:01:49,840 --> 00:01:52,919 Speaker 1: And we know, of course, you've got to cue the jingle: 27 00:01:53,760 --> 00:01:56,760 Speaker 1: Latinos and Latinas are not a monolith, right? So we 28 00:01:56,800 --> 00:02:01,840 Speaker 1: may be the second largest voting bloc, voting cohort, but 29 00:02:01,920 --> 00:02:04,920 Speaker 1: we don't vote as one, right? We're not a monolith. 30 00:02:05,520 --> 00:02:10,160 Speaker 1: This notion of disinformation, often targeted to Spanish-speaking voters, 31 00:02:10,639 --> 00:02:14,400 Speaker 1: has a lot of people freaking out, I think rightly so. 32 00:02:14,840 --> 00:02:17,480 Speaker 1: And there are so many layers to this conversation, from 33 00:02:17,760 --> 00:02:22,200 Speaker 1: who's behind the disinformation campaigns, to who is being targeted, 34 00:02:22,680 --> 00:02:26,480 Speaker 1: how they're being targeted, and what is the impact of 35 00:02:26,520 --> 00:02:28,240 Speaker 1: all of this. So we're going to start at the 36 00:02:28,240 --> 00:02:32,280 Speaker 1: beginning and try to get to the root of the issue. 
So, 37 00:02:32,360 --> 00:02:37,680 Speaker 1: Maria Teresa Kumar, from your perspective, where does this disinformation 38 00:02:37,880 --> 00:02:42,880 Speaker 1: come from and why is it targeting specifically Latinos and 39 00:02:42,960 --> 00:02:45,280 Speaker 1: Latinas, and where is it happening? 40 00:02:45,880 --> 00:02:50,480 Speaker 3: Disinformation oftentimes targeting the Latino community is twofold. One, 41 00:02:50,960 --> 00:02:54,079 Speaker 3: we are a new community in participation. We've been here 42 00:02:54,080 --> 00:02:57,520 Speaker 3: for centuries. But when you think about who is a 43 00:02:57,560 --> 00:03:00,160 Speaker 3: young voter, it is squarely a young Latino. You know, 44 00:03:00,600 --> 00:03:03,520 Speaker 3: the most common age among white people is fifty 45 00:03:03,520 --> 00:03:06,520 Speaker 3: eight years old. The most common age among Latinos is 46 00:03:06,520 --> 00:03:07,560 Speaker 3: eleven years old. 47 00:03:07,760 --> 00:03:08,000 Speaker 2: Wow. 48 00:03:08,040 --> 00:03:08,239 Speaker 5: Wow. 49 00:03:08,360 --> 00:03:11,560 Speaker 3: So when I say we're young, we're very young, right? 50 00:03:11,600 --> 00:03:14,480 Speaker 3: And so we're starting to come of age and most 51 00:03:14,520 --> 00:03:17,280 Speaker 3: of us are first generation. 
We are born in this 52 00:03:17,360 --> 00:03:21,359 Speaker 3: country and trying to navigate the political process, and there 53 00:03:21,400 --> 00:03:24,600 Speaker 3: are nefarious actors who see our potential, and not only 54 00:03:24,680 --> 00:03:26,520 Speaker 3: do they want to restrict our vote, but they want 55 00:03:26,520 --> 00:03:29,320 Speaker 3: to give us dirty information so that we can stay sick, 56 00:03:29,520 --> 00:03:31,640 Speaker 3: because it's a lot of anti-COVID information, so that 57 00:03:31,680 --> 00:03:33,799 Speaker 3: we don't get our vaccines, but then also so that we 58 00:03:33,840 --> 00:03:37,800 Speaker 3: don't trust the government, that is, the entity that, if 59 00:03:37,800 --> 00:03:40,960 Speaker 3: we vote them in, can transform our lives every 60 00:03:41,000 --> 00:03:44,040 Speaker 3: single day. The only reason we are even on a 61 00:03:44,080 --> 00:03:48,960 Speaker 3: progressive trajectory in this country is because Latinos secured California 62 00:03:49,080 --> 00:03:52,880 Speaker 3: twenty five years ago, boom. And so the disinformation is 63 00:03:53,120 --> 00:03:57,080 Speaker 3: very much targeting Latinos, not only because of our potential, 64 00:03:57,280 --> 00:03:59,800 Speaker 3: but because of the present day participation that we're. 65 00:03:59,680 --> 00:04:05,120 Speaker 1: Seeing. Right. So, Jaime, it is very targeted. That's clear. 66 00:04:06,000 --> 00:04:09,600 Speaker 1: From your perspective, just take us like sixty four thousand 67 00:04:09,680 --> 00:04:10,840 Speaker 1: feet high. 68 00:04:11,120 --> 00:04:13,480 Speaker 2: What do you see in terms of 69 00:04:13,600 --> 00:04:16,680 Speaker 1: disinformation? Because Maria Teresa really laid it out 70 00:04:16,720 --> 00:04:17,120 Speaker 1: for us. 71 00:04:17,640 --> 00:04:18,719 Speaker 2: Just something you want to add 
72 00:04:18,520 --> 00:04:21,760 Speaker 6: to that? Yeah, I think I think we're all probably 73 00:04:21,800 --> 00:04:24,240 Speaker 6: tired of hearing this, but just defining what mis- and 74 00:04:24,279 --> 00:04:28,200 Speaker 6: disinformation is is really important. So misinformation is information that 75 00:04:28,240 --> 00:04:33,360 Speaker 6: people share that they don't necessarily know is false or misleading. Disinformation, 76 00:04:33,480 --> 00:04:37,400 Speaker 6: on the other hand, is information that has sort of 77 00:04:37,480 --> 00:04:40,880 Speaker 6: a purpose, right? Somebody puts it out there for political or 78 00:04:41,000 --> 00:04:45,080 Speaker 6: monetary gain, or to do some sort of damage. 79 00:04:45,160 --> 00:04:45,360 Speaker 4: Right. 80 00:04:45,640 --> 00:04:48,080 Speaker 6: I think, as you said, to kind of get off 81 00:04:48,080 --> 00:04:50,520 Speaker 6: the ground a little bit, the perspective that I 82 00:04:50,600 --> 00:04:53,400 Speaker 6: come to this research from is that we're all susceptible, right, 83 00:04:53,520 --> 00:04:57,279 Speaker 6: across the political spectrum, across all demographics, across all ages, 84 00:04:57,640 --> 00:05:00,680 Speaker 6: because mis- and disinformation is really emotional. It works to 85 00:05:00,720 --> 00:05:05,640 Speaker 6: really target your emotional reactions, right? And we are all 86 00:05:05,680 --> 00:05:10,560 Speaker 6: sort of predisposed to believing specific things. There are examples 87 00:05:10,600 --> 00:05:13,360 Speaker 6: when things that we have called mis- and disinformation or 88 00:05:13,400 --> 00:05:16,520 Speaker 6: a conspiracy theory have been true, and different communities have 89 00:05:16,520 --> 00:05:18,599 Speaker 6: different examples of that, and that sort of sets us 90 00:05:18,680 --> 00:05:21,560 Speaker 6: up to be predisposed to believing certain things. 
The other 91 00:05:21,600 --> 00:05:23,159 Speaker 6: thing that I want to say is that it doesn't 92 00:05:23,160 --> 00:05:25,680 Speaker 6: really matter, like, facts don't really matter. What matters is 93 00:05:25,680 --> 00:05:29,000 Speaker 6: whether or not information aligns with your perception of the world. Right? 94 00:05:29,360 --> 00:05:32,560 Speaker 6: Just because you hear something doesn't necessarily mean that you 95 00:05:32,720 --> 00:05:35,280 Speaker 6: care about the veracity of it, but whether or not 96 00:05:35,360 --> 00:05:39,880 Speaker 6: it elicits sort of an emotion in you that makes you 97 00:05:39,920 --> 00:05:41,840 Speaker 6: either feel comfortable or makes you feel like you need 98 00:05:41,880 --> 00:05:42,520 Speaker 6: to do something. 99 00:05:42,920 --> 00:05:44,919 Speaker 4: Where the problem for me lies 100 00:05:44,920 --> 00:05:48,880 Speaker 6: is that Spanish language is not prioritized by the platforms. 101 00:05:49,400 --> 00:05:51,880 Speaker 6: And the other thing is that there aren't really any 102 00:05:52,000 --> 00:05:55,719 Speaker 6: sort of safeguards or safety nets in the Latino and Hispanic 103 00:05:56,320 --> 00:05:59,320 Speaker 6: media ecosystem. And I think that also has a lot 104 00:05:59,360 --> 00:06:03,359 Speaker 6: to do with inherent misunderstanding, as Maria pointed out, like 105 00:06:03,440 --> 00:06:06,120 Speaker 6: this inherent misunderstanding of what the Latino community actually is, 106 00:06:06,800 --> 00:06:10,080 Speaker 6: and that happens in media and in policy. And there's 107 00:06:10,080 --> 00:06:14,160 Speaker 6: also a lack of understanding of where Latinos congregate online 108 00:06:14,320 --> 00:06:17,279 Speaker 6: and where they get their information online and also offline, 109 00:06:17,279 --> 00:06:19,159 Speaker 6: because that really matters as well, right? 
110 00:06:19,040 --> 00:06:21,919 Speaker 7: Right. You know, there's so many points both of you 111 00:06:21,960 --> 00:06:24,960 Speaker 7: have brought up, and to give that context, one 112 00:06:25,000 --> 00:06:28,720 Speaker 7: of the things that we also wanted to point out 113 00:06:28,760 --> 00:06:33,679 Speaker 7: as we cover the election cycle, right, the midterms, through 114 00:06:33,720 --> 00:06:37,200 Speaker 7: our Latino and Latina lenses here on the show, is, 115 00:06:37,360 --> 00:06:40,240 Speaker 7: you know, it's already politically charged, right, in a post 116 00:06:40,360 --> 00:06:42,320 Speaker 7: January sixth attack on the US Capitol. 117 00:06:43,080 --> 00:06:44,160 Speaker 2: You know, we have the coronavirus. 118 00:06:44,240 --> 00:06:45,839 Speaker 7: Maria Teresa, I saw you mentioned it, that it's 119 00:06:45,839 --> 00:06:48,800 Speaker 7: only added fuel to the issue of misinformation and disinformation. 120 00:06:49,200 --> 00:06:52,240 Speaker 7: Then we have these false claims about COVID vaccines, to 121 00:06:52,320 --> 00:06:56,960 Speaker 7: the politicization of mask mandates. These different types of misinformation 122 00:06:57,040 --> 00:06:59,600 Speaker 7: and disinformation, they go hand in hand, right? On one hand, 123 00:06:59,680 --> 00:07:02,760 Speaker 7: health and science, and on the other hand, political campaigns. 124 00:07:03,120 --> 00:07:04,880 Speaker 7: So how do you think this is all going to 125 00:07:04,880 --> 00:07:07,320 Speaker 7: play together in the midterms? Because it is a strategy, 126 00:07:07,440 --> 00:07:09,600 Speaker 2: no? Or is that too simplistic of a take? 127 00:07:10,200 --> 00:07:12,000 Speaker 4: No, I don't think that's simplistic. 128 00:07:12,080 --> 00:07:15,920 Speaker 6: I think there is a strategy behind this, especially when 129 00:07:15,960 --> 00:07:17,080 Speaker 6: we talk about disinformation. 130 00:07:17,920 --> 00:07:19,480 Speaker 4: But if we zoom out. 
131 00:07:19,680 --> 00:07:22,960 Speaker 6: Also, I'm a big fan of zooming out, apparently. If 132 00:07:22,960 --> 00:07:24,240 Speaker 6: we zoom out a little bit. 133 00:07:24,160 --> 00:07:25,560 Speaker 2: Like sixty four thousand feet. 134 00:07:25,640 --> 00:07:29,600 Speaker 6: Yeah. Looking at the conversation, I mean, a lot of 135 00:07:29,640 --> 00:07:32,760 Speaker 6: the things that we're dealing with right now actually have 136 00:07:33,040 --> 00:07:37,280 Speaker 6: their source in mis- and disinformation, or even malinformation, which 137 00:07:37,320 --> 00:07:41,120 Speaker 6: is information that's true but is used in a context 138 00:07:41,120 --> 00:07:41,640 Speaker 6: that's harmful. 139 00:07:41,720 --> 00:07:43,800 Speaker 4: Right. When we talk about 140 00:07:43,400 --> 00:07:45,720 Speaker 6: like a lot of the border policies that are going on, 141 00:07:46,040 --> 00:07:47,640 Speaker 6: there was a lot of fear mongering, right, and there 142 00:07:47,680 --> 00:07:50,080 Speaker 6: continues to be a lot of fear mongering there. And 143 00:07:50,200 --> 00:07:54,800 Speaker 6: also now all these restrictions on voting rights, like, that 144 00:07:54,840 --> 00:07:57,040 Speaker 6: comes from the big lie, right, from all these claims 145 00:07:57,040 --> 00:08:00,880 Speaker 6: about election fraud that happened in twenty twenty. In some ways, 146 00:08:00,920 --> 00:08:03,520 Speaker 6: we can see that there is sort of this playbook, 147 00:08:03,640 --> 00:08:05,400 Speaker 6: which is not new. I think we've seen it for 148 00:08:05,400 --> 00:08:08,560 Speaker 6: as long as America has had politics. But now it's 149 00:08:08,560 --> 00:08:13,400 Speaker 6: sort of interacting with platforms that incentivize polarizing content and 150 00:08:13,440 --> 00:08:16,040 Speaker 6: that really have a way of making all this stuff 151 00:08:16,240 --> 00:08:17,440 Speaker 6: spread so widely. 
152 00:08:18,080 --> 00:08:20,320 Speaker 7: Maria Teresa, how do you think this will play out in 153 00:08:20,360 --> 00:08:21,920 Speaker 7: the upcoming midterm elections? 154 00:08:22,040 --> 00:08:23,800 Speaker 2: What are the specific things that you're seeing? 155 00:08:24,000 --> 00:08:26,520 Speaker 3: I mean, there's a lot of challenges. We saw the 156 00:08:26,600 --> 00:08:29,880 Speaker 3: disinformation in twenty nineteen going into Florida in March, 157 00:08:30,240 --> 00:08:32,040 Speaker 3: and then we kept tracking it and we saw the 158 00:08:32,080 --> 00:08:35,000 Speaker 3: firestorm of the disinformation hitting the Rio Grande Valley in 159 00:08:35,040 --> 00:08:37,960 Speaker 3: mid-September of twenty twenty, and we were trying to 160 00:08:38,160 --> 00:08:40,120 Speaker 3: raise the alarm, but no one pays attention to the 161 00:08:40,200 --> 00:08:44,520 Speaker 3: Latino community, not recognizing that, with the disinformation that we're receiving, 162 00:08:45,040 --> 00:08:46,760 Speaker 3: Latinos are often the canary in 163 00:08:46,720 --> 00:08:48,040 Speaker 2: the coal mine, exactly, no? 164 00:08:48,080 --> 00:08:51,720 Speaker 3: Oftentimes they use us because it's low hanging fruit on Facebook, 165 00:08:51,760 --> 00:08:54,240 Speaker 3: just a terrible ad. Then you figure out what really 166 00:08:54,280 --> 00:08:56,680 Speaker 3: sticks and then you send it off into the rest 167 00:08:56,720 --> 00:09:00,920 Speaker 3: of these closed networks. And we are in this crisis 168 00:09:00,960 --> 00:09:05,640 Speaker 3: mode of Latinos getting disinformation because our media is balkanized. 
169 00:09:05,640 --> 00:09:09,600 Speaker 3: No one has recognized us as a force for media 170 00:09:09,679 --> 00:09:12,520 Speaker 3: consumption, and so we have to go and have a 171 00:09:12,559 --> 00:09:15,400 Speaker 3: different level of discernment when we go on YouTube, and 172 00:09:15,559 --> 00:09:18,480 Speaker 3: none of these folks have to tell us the truth. 173 00:09:18,960 --> 00:09:20,199 Speaker 3: Yet I have to say, you know, we talked a 174 00:09:20,200 --> 00:09:23,160 Speaker 3: little bit about the midterms. Sadly, South Texas wasn't a 175 00:09:23,200 --> 00:09:25,960 Speaker 3: surprise, because again we saw it in twenty twenty, and then, 176 00:09:26,080 --> 00:09:29,280 Speaker 3: believe it or not, we saw that firestorm hitting southern 177 00:09:29,320 --> 00:09:32,880 Speaker 3: Arizona two weeks out of the election. The level of 178 00:09:32,920 --> 00:09:35,880 Speaker 3: sophistication: they recognize they don't need all Latinos, they just 179 00:09:35,920 --> 00:09:37,960 Speaker 3: need to skim off the top in certain areas and 180 00:09:38,040 --> 00:09:38,960 Speaker 3: create uncertainty. 181 00:09:39,200 --> 00:09:41,840 Speaker 7: Right. And I think one of the points, talking about 182 00:09:41,880 --> 00:09:45,800 Speaker 7: the online platforms, and this report has been cited previously, 183 00:09:45,840 --> 00:09:47,920 Speaker 7: and I've written about it, Maria Teresa has talked about it: 184 00:09:48,000 --> 00:09:52,400 Speaker 7: after the twenty twenty presidential election, research from the 185 00:09:52,440 --> 00:09:56,680 Speaker 7: online advocacy group Avaaz found that Facebook failed to 186 00:09:56,760 --> 00:09:59,640 Speaker 7: flag seventy percent, seventy percent, 187 00:09:59,320 --> 00:10:00,880 Speaker 2: of Spanish-language misinformation. 
188 00:10:01,960 --> 00:10:05,440 Speaker 7: And according to the Pew Research Center, in twenty twenty one, 189 00:10:06,120 --> 00:10:11,520 Speaker 7: Latinos used YouTube, Instagram, Snapchat, WhatsApp, and TikTok more than 190 00:10:11,640 --> 00:10:13,360 Speaker 7: any other demographic group. 191 00:10:14,000 --> 00:10:15,319 Speaker 2: So you both have mentioned it. 192 00:10:15,440 --> 00:10:19,719 Speaker 7: Latinos are online a lot, which makes the likelihood of 193 00:10:19,800 --> 00:10:21,920 Speaker 7: running into false information higher. 194 00:10:22,120 --> 00:10:25,440 Speaker 1: Right. Because if our average age is ten, eleven, twelve, 195 00:10:26,480 --> 00:10:29,480 Speaker 1: we're young. We, our kids, our 196 00:10:29,520 --> 00:10:32,400 Speaker 1: grandkids are going to be on social media and consuming 197 00:10:32,400 --> 00:10:35,000 Speaker 1: all of this media. So that's just like a data point, 198 00:10:35,000 --> 00:10:39,480 Speaker 1: that's a given. And how we consume media also is different. 199 00:10:39,559 --> 00:10:42,720 Speaker 1: So even though Spanish is the second most commonly spoken 200 00:10:42,800 --> 00:10:46,040 Speaker 1: language in the United States, social media companies have not 201 00:10:46,200 --> 00:10:51,920 Speaker 1: dedicated resources to flag and take down Spanish-language misinformation. 202 00:10:52,520 --> 00:10:57,520 Speaker 1: It's there, but it's not really enough. And this misinformation 203 00:10:58,200 --> 00:11:01,200 Speaker 1: can be harder to detect if it's a video or 204 00:11:01,240 --> 00:11:04,440 Speaker 1: if it's shared on a private WhatsApp group. I'm sure 205 00:11:04,480 --> 00:11:07,640 Speaker 1: all of you have, you know, like, how did this 206 00:11:07,720 --> 00:11:09,640 Speaker 1: suddenly end up on my WhatsApp feed? 207 00:11:09,760 --> 00:11:11,280 Speaker 2: Like, how is this? Well? 
208 00:11:11,440 --> 00:11:13,520 Speaker 3: Well, usually it's my father, I mean, if 209 00:11:13,520 --> 00:11:16,520 Speaker 3: we're honest. But you really make an important point. Folks 210 00:11:16,559 --> 00:11:22,080 Speaker 3: don't realize, imagine: Facebook is the entryway drug to radicalization. 211 00:11:22,520 --> 00:11:24,440 Speaker 3: And I don't say that lightly. If you go on 212 00:11:24,480 --> 00:11:27,400 Speaker 3: to Facebook, the stuff that they're saying oftentimes is extreme, 213 00:11:27,679 --> 00:11:30,160 Speaker 3: but what they do then is, in the comments, they 214 00:11:30,160 --> 00:11:32,520 Speaker 3: will tell you to join another group that is either 215 00:11:32,559 --> 00:11:35,880 Speaker 3: on WhatsApp or that is on Telegram. And then once 216 00:11:35,960 --> 00:11:38,439 Speaker 3: they're in those closed groups, that is when people become 217 00:11:38,480 --> 00:11:41,560 Speaker 3: really radicalized. I mean, I have gone into some of 218 00:11:41,600 --> 00:11:47,600 Speaker 3: these rooms and the chats are absolutely misogynistic, racist, anti-government. 219 00:11:47,640 --> 00:11:49,800 Speaker 3: I mean, you name it, and it is there. And 220 00:11:49,840 --> 00:11:52,400 Speaker 3: there's no monitoring that happens. And at the same time, 221 00:11:52,880 --> 00:11:54,760 Speaker 3: Meta wants to wash their hands, saying, well, it's 222 00:11:54,800 --> 00:11:56,600 Speaker 3: not our fault. Well, if you were doing the exact 223 00:11:56,600 --> 00:11:59,320 Speaker 3: same thing right now on radio, you would get in trouble. 224 00:11:59,600 --> 00:11:59,760 Speaker 5: Right. 225 00:11:59,840 --> 00:12:01,839 Speaker 1: The thing is that it's, again, as you said, it's 226 00:12:01,880 --> 00:12:05,480 Speaker 1: not just happening online. You mentioned radio. 
So Spanish-language 227 00:12:05,559 --> 00:12:08,440 Speaker 1: radio stations, especially in South Florida, have been criticized for 228 00:12:08,520 --> 00:12:12,280 Speaker 1: spreading disinformation on the air, including claims that the left 229 00:12:12,280 --> 00:12:17,760 Speaker 1: wing movement Antifa was behind the January sixth Capitol attack, 230 00:12:18,240 --> 00:12:21,800 Speaker 1: the attempted coup d'état in the United States, and 231 00:12:21,840 --> 00:12:25,040 Speaker 1: saying that it was Antifa. So there's an element of 232 00:12:25,120 --> 00:12:29,360 Speaker 1: trust when information is given in Spanish, and that's important too. 233 00:12:30,040 --> 00:12:35,520 Speaker 1: It's actually something very particular about Spanish-language communications, right? 234 00:12:35,600 --> 00:12:38,240 Speaker 1: It is always in a sense of like, we care 235 00:12:38,320 --> 00:12:43,240 Speaker 1: about you, we're interested, trust us. So why is it 236 00:12:43,320 --> 00:12:47,439 Speaker 1: that Spanish-language disinformation is so much more prevalent and unregulated? 237 00:12:48,080 --> 00:12:50,880 Speaker 1: And is it because we are that language that was 238 00:12:50,960 --> 00:12:53,080 Speaker 1: just like, well, it's the other, it's the language you 239 00:12:53,080 --> 00:12:54,839 Speaker 1: don't really want to talk about, we don't really want 240 00:12:54,840 --> 00:12:56,520 Speaker 1: to have to deal with this. I mean, is that 241 00:12:56,559 --> 00:12:58,840 Speaker 1: where it starts? And why don't you start us off, 242 00:12:59,120 --> 00:13:05,280 Speaker 1: Jaime? In particular, why is disinformation and misinformation such a problem 243 00:13:05,440 --> 00:13:09,200 Speaker 1: when we're talking about Spanish language in the United States? 244 00:13:09,600 --> 00:13:11,360 Speaker 6: It has a lot to do with what you were 245 00:13:11,400 --> 00:13:16,600 Speaker 6: pointing to, Maria. 
The more important and the more politically viable, 246 00:13:16,640 --> 00:13:20,800 Speaker 6: the more weight that the Latino community carries, the more 247 00:13:20,920 --> 00:13:23,560 Speaker 6: there are going to be efforts to kind of muddy 248 00:13:23,600 --> 00:13:27,040 Speaker 6: the waters. I think it says something that we have 249 00:13:27,160 --> 00:13:30,400 Speaker 6: reached a moment in our history as a community where 250 00:13:30,679 --> 00:13:34,120 Speaker 6: we have a lot of political power. There are other 251 00:13:34,280 --> 00:13:37,960 Speaker 6: sort of entities, other groups, other people that have the 252 00:13:38,000 --> 00:13:40,960 Speaker 6: incentive to sort of co-opt that for themselves. Since 253 00:13:41,000 --> 00:13:42,800 Speaker 6: I'm sort of like knee-deep in all this stuff 254 00:13:42,800 --> 00:13:44,840 Speaker 6: every day, like, one thing that I always think about 255 00:13:45,000 --> 00:13:48,480 Speaker 6: is how sort of these conversations are getting politicized. I 256 00:13:48,520 --> 00:13:51,400 Speaker 6: don't really like to frame it as, these are right 257 00:13:51,400 --> 00:13:55,760 Speaker 6: wing efforts to shift people to the Republican Party. That's 258 00:13:55,800 --> 00:13:57,959 Speaker 6: not how I see this. For me, I feel, if 259 00:13:57,960 --> 00:14:00,960 Speaker 6: you really get down into what's actually happening, it's that 260 00:14:01,360 --> 00:14:04,480 Speaker 6: there are a lot of grievances and wedge issues within 261 00:14:04,520 --> 00:14:08,120 Speaker 6: our community that we as the community have not yet 262 00:14:08,160 --> 00:14:11,559 Speaker 6: addressed, that open space for other people to come in 263 00:14:12,040 --> 00:14:16,120 Speaker 6: and manipulate the conversations that we're having and to sort 264 00:14:16,160 --> 00:14:19,160 Speaker 6: of affect the ways in which we see public health initiatives. 
265 00:14:19,560 --> 00:14:21,760 Speaker 6: That's not to say that it's divorced from any political 266 00:14:21,800 --> 00:14:24,400 Speaker 6: context at all, right? Like I said, it's disinformation. There 267 00:14:24,440 --> 00:14:27,920 Speaker 6: are sort of people behind the scenes pulling the strings. 268 00:14:28,360 --> 00:14:30,800 Speaker 6: But there's something to say about the fact that not 269 00:14:30,880 --> 00:14:33,240 Speaker 6: all of this comes from outside of the community, right? 270 00:14:35,760 --> 00:14:37,480 Speaker 6: A lot of the times it's people from within our 271 00:14:37,480 --> 00:14:41,080 Speaker 6: own community, people from within our own political spaces, that 272 00:14:41,160 --> 00:14:44,800 Speaker 6: are the ones sort of facilitating and spreading misinformation, right? 273 00:14:45,000 --> 00:14:47,520 Speaker 6: And we don't always know. The fact that we're a 274 00:14:47,600 --> 00:14:52,000 Speaker 6: growing community that has so much political power today is 275 00:14:52,040 --> 00:14:54,000 Speaker 6: really why I think we're at the forefront of a 276 00:14:54,000 --> 00:14:55,280 Speaker 6: lot of this targeting. 277 00:15:00,000 --> 00:15:03,240 Speaker 1: Coming up on Latino USA: what outreach is being 278 00:15:03,280 --> 00:15:06,240 Speaker 1: done with Latino and Latina voters to combat the problem 279 00:15:06,360 --> 00:15:12,040 Speaker 1: of disinformation, and is the work making a difference? Stay 280 00:15:12,080 --> 00:15:59,440 Speaker 1: with us. No te vayas. Hey, we're back, and 281 00:15:59,640 --> 00:16:03,840 Speaker 1: here's the rest of our conversation about disinformation and misinformation 282 00:16:04,360 --> 00:16:08,120 Speaker 1: with my In The Thick co-host Julio. We're joined 283 00:16:08,120 --> 00:16:12,840 Speaker 1: by Maria Teresa Kumar of Voto Latino and Jaime Longoria from the 284 00:16:13,000 --> 00:16:16,240 Speaker 1: Disinfo Defense League at the Media Democracy Fund. 
285 00:16:16,960 --> 00:16:17,920 Speaker 2: Let's jump back in. 286 00:16:18,680 --> 00:16:20,680 Speaker 1: So the thing is, Maria Teresa, I think that you 287 00:16:20,680 --> 00:16:23,880 Speaker 1: would probably say to Jaime, actually, it is very specifically 288 00:16:23,920 --> 00:16:29,120 Speaker 1: being targeted to shift the politics. That is the intention. 289 00:16:29,240 --> 00:16:31,040 Speaker 1: I mean, do you think that it is? I do, 290 00:16:31,120 --> 00:16:33,080 Speaker 1: because Jaime is kind of like, it's just like 291 00:16:33,440 --> 00:16:36,480 Speaker 1: a vulnerability. But you're like, nah, there's a much more 292 00:16:36,600 --> 00:16:40,080 Speaker 1: specific and, I would say, nefarious intention here. 293 00:16:40,040 --> 00:16:41,840 Speaker 3: When you take a step back of why do you 294 00:16:42,040 --> 00:16:45,960 Speaker 3: have disinformation? And Jaime said something that is right. We've 295 00:16:45,960 --> 00:16:49,800 Speaker 3: always had disinformation in politics. Autocrats are the ones that 296 00:16:49,800 --> 00:16:53,680 Speaker 3: peddle disinformation. Non-functioning democracies are the people who peddle 297 00:16:53,680 --> 00:16:55,880 Speaker 3: disinformation, because they don't want you to be able to 298 00:16:55,880 --> 00:16:59,240 Speaker 3: trust anything or anyone, let alone your government. And that's 299 00:16:59,320 --> 00:17:01,280 Speaker 3: the reason that Voto Latino got in the game: 300 00:17:01,320 --> 00:17:05,960 Speaker 3: because we saw two barriers to a thriving democracy for 301 00:17:05,960 --> 00:17:08,840 Speaker 3: the Latino community. One is the voting restrictions that are 302 00:17:09,000 --> 00:17:11,440 Speaker 3: very much a result of the boom in the 303 00:17:11,480 --> 00:17:15,000 Speaker 3: Latino population. 
Most people don't realize that the gutting of 304 00:17:15,200 --> 00:17:18,240 Speaker 3: the Voting Rights Act in Shelby County came after a two 305 00:17:18,320 --> 00:17:21,080 Speaker 3: hundred and ninety seven percent increase in the Latino population 306 00:17:21,200 --> 00:17:24,359 Speaker 3: in twenty ten. That is not by accident. And then 307 00:17:24,400 --> 00:17:27,040 Speaker 3: the other one is disinformation. If you have a new 308 00:17:27,040 --> 00:17:30,680 Speaker 3: community that is learning about democracy, that is learning that 309 00:17:30,800 --> 00:17:33,680 Speaker 3: their government may work because the government that their parents 310 00:17:33,920 --> 00:17:36,840 Speaker 3: fled may not work, then you actually have a window 311 00:17:37,040 --> 00:17:40,200 Speaker 3: of distrust, because they're just getting familiar with it. 312 00:17:40,480 --> 00:17:42,520 Speaker 3: And again, I don't think that it's by accident that 313 00:17:42,560 --> 00:17:45,600 Speaker 3: the people that are heavily targeted in our politics and 314 00:17:45,640 --> 00:17:50,400 Speaker 3: in disinformation are, on the extreme right, uneducated 315 00:17:50,440 --> 00:17:53,040 Speaker 3: white men who are sitting on the fence and getting cultivated, 316 00:17:53,359 --> 00:17:56,720 Speaker 3: and the Latino community, who is growing, because they also 317 00:17:56,720 --> 00:17:59,399 Speaker 3: see our potential that seems to be aligned with 318 00:17:59,480 --> 00:18:01,160 Speaker 3: democratic values, right? 319 00:18:01,200 --> 00:18:03,840 Speaker 6: And also, just to be clear, I don't completely divorce 320 00:18:03,880 --> 00:18:06,920 Speaker 6: this from politics either. 
Honestly, there are Latinos within the 321 00:18:06,960 --> 00:18:10,160 Speaker 6: community that do take part in spreading mis- and disinformation, obviously, 322 00:18:10,600 --> 00:18:14,440 Speaker 6: and I think a large part of the discourse really 323 00:18:14,480 --> 00:18:18,280 Speaker 6: flattens what Latinos believe in, right? I think this is 324 00:18:18,320 --> 00:18:20,199 Speaker 6: a lot of the questions that I got from reporters, 325 00:18:20,359 --> 00:18:23,399 Speaker 6: like, why are Latinos believing in misinformation and shifting 326 00:18:23,400 --> 00:18:25,480 Speaker 6: to the right? We can't really say for sure. There's 327 00:18:25,560 --> 00:18:30,639 Speaker 6: a conversation about whether or not disinformation actually effects real 328 00:18:30,680 --> 00:18:32,959 Speaker 6: world change, and obviously the answer we would all give 329 00:18:33,000 --> 00:18:35,080 Speaker 6: is that yes, of course, but we don't have any 330 00:18:35,160 --> 00:18:38,200 Speaker 6: quantifiable evidence of that. We don't know how exactly it 331 00:18:38,240 --> 00:18:41,040 Speaker 6: does that. We don't know if it's enabling people to 332 00:18:41,119 --> 00:18:43,399 Speaker 6: do it or if it's the cause of people actually 333 00:18:43,440 --> 00:18:46,320 Speaker 6: going and shifting their beliefs completely. I think we really 334 00:18:46,359 --> 00:18:47,520 Speaker 6: still need to study this 335 00:18:47,480 --> 00:18:50,640 Speaker 4: a lot more. But I think what I'm very wary 336 00:18:50,359 --> 00:18:52,840 Speaker 6: about, as someone who does this research, is that the 337 00:18:52,880 --> 00:18:56,919 Speaker 6: politicization of mis- and disinformation studies is a really dangerous trend. 338 00:18:57,280 --> 00:19:00,199 Speaker 6: This is only a part of the picture of what 339 00:19:00,280 --> 00:19:04,320 Speaker 6: is like a sort of a wider political malaise, society-wide. 
340 00:19:04,320 --> 00:19:08,560 Speaker 6: Trust in institutions is completely lost. And I'm not denying that 341 00:19:08,600 --> 00:19:11,800 Speaker 6: there are campaigns out there that are working towards shifting 342 00:19:12,040 --> 00:19:16,720 Speaker 6: Latinos' sentiment, but mis- and disinformation is a symptom of a 343 00:19:16,840 --> 00:19:19,400 Speaker 6: larger issue, and this is only one way in which 344 00:19:19,440 --> 00:19:20,480 Speaker 6: that's being enabled. 345 00:19:20,760 --> 00:19:20,920 Speaker 4: Yeah. 346 00:19:21,000 --> 00:19:22,639 Speaker 7: No, and I think that, you know, one of the 347 00:19:22,680 --> 00:19:26,520 Speaker 7: things in this context is the organizing and outreach that's 348 00:19:26,560 --> 00:19:30,880 Speaker 7: being done with Latino and Latina voters to counter this issue. 349 00:19:31,000 --> 00:19:32,200 Speaker 2: So just a couple examples. 350 00:19:32,200 --> 00:19:35,560 Speaker 7: At the grassroots level, there are a lot of local organizations 351 00:19:35,600 --> 00:19:38,600 Speaker 7: that are partnering to get accurate information about COVID to 352 00:19:38,640 --> 00:19:42,159 Speaker 7: their communities, right, and some Spanish-language radio stations like 353 00:19:42,280 --> 00:19:46,960 Speaker 7: Radio Bilingüe and Radio Indígena have programs aimed at 354 00:19:46,960 --> 00:19:52,359 Speaker 7: providing health resources for their listeners. Producer Lisa Salinas spoke 355 00:19:52,480 --> 00:19:56,879 Speaker 7: with Jose Muñoz, who is a deputy communications director at 356 00:19:57,240 --> 00:19:59,680 Speaker 7: United We Dream Action, which is the first and largest 357 00:19:59,720 --> 00:20:03,960 Speaker 7: immigrant youth-led organization, about the work their organization 358 00:20:04,080 --> 00:20:08,280 Speaker 7: is doing to combat disinformation. Let's listen to what Jose said.
359 00:20:08,520 --> 00:20:12,119 Speaker 5: There are some Latinx folks who are solely consuming stories 360 00:20:12,200 --> 00:20:16,159 Speaker 5: about immigration. So these are more tactical, more transactional stories 361 00:20:16,200 --> 00:20:19,719 Speaker 5: about the system of immigration. Those tend to be young 362 00:20:19,840 --> 00:20:25,399 Speaker 5: Latino males and older Latinx audiences over thirty five, across gender. 363 00:20:25,880 --> 00:20:28,720 Speaker 5: And then the second piece of consumption for folks is 364 00:20:28,880 --> 00:20:31,760 Speaker 5: immigrant stories. These are stories about people, you know. These 365 00:20:31,760 --> 00:20:35,960 Speaker 5: are stories about folks who are seeking a better life 366 00:20:36,000 --> 00:20:37,800 Speaker 5: in the US. One of the things that we found 367 00:20:37,840 --> 00:20:40,600 Speaker 5: within that was that young Latino males, they have sort 368 00:20:40,600 --> 00:20:43,520 Speaker 5: of a content vacuum where they're not actually getting a 369 00:20:43,560 --> 00:20:47,560 Speaker 5: lot of stories about immigrants. The immigration stories that they're 370 00:20:47,600 --> 00:20:51,480 Speaker 5: getting tend to come from more conservative places, and it 371 00:20:51,520 --> 00:20:54,880 Speaker 5: really leaves this content vacuum for them that makes 372 00:20:54,880 --> 00:20:58,200 Speaker 5: them easier targets for racialized disinformation about the role that 373 00:20:58,240 --> 00:20:59,520 Speaker 5: immigrants play in this country. 374 00:21:00,160 --> 00:21:02,920 Speaker 7: That was a specific example, and both of you are 375 00:21:03,000 --> 00:21:06,520 Speaker 7: working on a larger scale, like the Latino Anti-Disinformation 376 00:21:06,680 --> 00:21:10,439 Speaker 7: Lab that you co-chair, Maria Teresa and Jaime.
In 377 00:21:10,440 --> 00:21:14,359 Speaker 7: addition to your research on COVID nineteen vaccine misinformation, you 378 00:21:14,440 --> 00:21:17,760 Speaker 7: work with the Disinformation Defense League, which seeks to fight 379 00:21:18,000 --> 00:21:22,040 Speaker 7: racialized disinformation that targets Black, Latino, and other communities of color. 380 00:21:22,680 --> 00:21:25,119 Speaker 7: Jaime, talk to us about the work you're doing. 381 00:21:25,320 --> 00:21:28,440 Speaker 7: How do you measure impact and success? So we'll start 382 00:21:28,480 --> 00:21:30,320 Speaker 7: with you, Jaime, and then Maria Teresa, you can 383 00:21:30,359 --> 00:21:30,760 Speaker 7: jump in. 384 00:21:31,000 --> 00:21:34,960 Speaker 6: What we do, basically: we're a network of more 385 00:21:35,000 --> 00:21:38,640 Speaker 6: than two hundred and fifty organizations who come together under 386 00:21:38,680 --> 00:21:42,840 Speaker 6: the umbrella of combating the harmful effects of mis- and disinformation, 387 00:21:42,960 --> 00:21:46,000 Speaker 6: specifically looking at communities that are racialized in the United States.
388 00:21:46,680 --> 00:21:50,119 Speaker 6: We try to make these connections between folks on the ground, 389 00:21:50,200 --> 00:21:53,200 Speaker 6: not only working at the national level, but also folks 390 00:21:53,240 --> 00:21:57,600 Speaker 6: working within their communities at really smaller community organizations, and 391 00:21:57,640 --> 00:22:00,359 Speaker 6: we want them not only to connect with each other and 392 00:22:00,400 --> 00:22:03,800 Speaker 6: to talk about the strategies that they're using, but also 393 00:22:03,880 --> 00:22:07,760 Speaker 6: to give them access to experts that are doing advocacy research, 394 00:22:08,000 --> 00:22:11,399 Speaker 6: someone like myself that works in nonprofits and does the 395 00:22:11,440 --> 00:22:14,639 Speaker 6: research every day, looking at what narratives are trending and 396 00:22:14,680 --> 00:22:18,800 Speaker 6: what's going on online. And we connect them with academics 397 00:22:18,920 --> 00:22:22,560 Speaker 6: as well, because there's a whole world of people in 398 00:22:22,640 --> 00:22:25,680 Speaker 6: academia doing this work who never have any contact 399 00:22:25,720 --> 00:22:28,040 Speaker 6: with folks on the ground. That's the main thing that 400 00:22:28,040 --> 00:22:30,600 Speaker 6: we're trying to do, bridging all of these gaps. My 401 00:22:30,720 --> 00:22:34,520 Speaker 6: wider goal is really to pool all of this expertise. 402 00:22:35,320 --> 00:22:37,119 Speaker 6: I want to give that to the community, right. I 403 00:22:37,200 --> 00:22:40,880 Speaker 6: want to foster in-community expertise, because who 404 00:22:41,000 --> 00:22:43,399 Speaker 6: better to understand their community than the people who actually 405 00:22:43,440 --> 00:22:45,919 Speaker 6: live in it? And we have a couple of areas, 406 00:22:45,920 --> 00:22:50,840 Speaker 6: specifically looking at Asian American disinformation and hate speech.
We're 407 00:22:50,880 --> 00:22:55,560 Speaker 6: looking at immigration dis- and malinformation. We're also looking at 408 00:22:55,840 --> 00:22:59,720 Speaker 6: how best we transfer the research into real-world 409 00:22:59,720 --> 00:23:02,840 Speaker 6: action, because that's something that's really lacking in this space. 410 00:23:03,200 --> 00:23:05,760 Speaker 6: We have the research, but what does that look like 411 00:23:05,800 --> 00:23:07,760 Speaker 6: in the real world? How do we do this stuff? 412 00:23:08,119 --> 00:23:09,760 Speaker 6: So we're kind of all over the place. But the 413 00:23:09,840 --> 00:23:13,240 Speaker 6: main goal really is connecting and bridging the gaps of 414 00:23:13,280 --> 00:23:16,840 Speaker 6: this whole area of study so that people can really 415 00:23:16,840 --> 00:23:20,480 Speaker 6: come together and start to make solutions and not just 416 00:23:20,600 --> 00:23:22,600 Speaker 6: talk about the issue itself. 417 00:23:22,960 --> 00:23:25,560 Speaker 1: So, Maria Teresa, I'm wondering, I mean, that is a lot. 418 00:23:25,800 --> 00:23:28,080 Speaker 1: Jaime said, we're kind of all over the place, and really, 419 00:23:28,080 --> 00:23:31,520 Speaker 1: when you're talking about this, the thing about disinformation and 420 00:23:31,560 --> 00:23:35,360 Speaker 1: misinformation is that it too is kind of all over 421 00:23:35,400 --> 00:23:38,600 Speaker 1: the place. And I think that's part of the strategy. 422 00:23:38,760 --> 00:23:40,679 Speaker 1: So tell us a little bit about the work that 423 00:23:40,720 --> 00:23:44,320 Speaker 1: you're doing, how you measure it.
And also, I'm assuming 424 00:23:44,359 --> 00:23:48,120 Speaker 1: that when you were trying to get this issue out there, 425 00:23:48,160 --> 00:23:50,400 Speaker 1: you had a lot of people just like, huh, huh, 426 00:23:50,400 --> 00:23:53,440 Speaker 1: and so you've faced a lot of challenges in trying 427 00:23:53,480 --> 00:23:56,240 Speaker 1: to launch this thing that you're taking on. 428 00:23:56,359 --> 00:23:58,040 Speaker 2: It's not like it's a building, you 429 00:23:58,000 --> 00:24:00,719 Speaker 1: know, it's much more complicated, right, you know. So 430 00:24:00,760 --> 00:24:02,720 Speaker 1: tell us a little bit about those challenges. 431 00:24:02,600 --> 00:24:05,960 Speaker 3: We did something very simple in twenty twenty. We created 432 00:24:06,119 --> 00:24:08,639 Speaker 3: a model where we identified people who were sitting on 433 00:24:08,680 --> 00:24:11,680 Speaker 3: the fence when it came to any type of electoral participation. 434 00:24:12,320 --> 00:24:13,880 Speaker 3: And this was a big sample size. We did four 435 00:24:13,920 --> 00:24:16,479 Speaker 3: hundred thousand people. We divided them into two groups. Two 436 00:24:16,560 --> 00:24:18,960 Speaker 3: hundred thousand received, you know, a "go buy a 437 00:24:19,000 --> 00:24:21,120 Speaker 3: Gillette shaver for your legs" ad, and the other two hundred 438 00:24:21,160 --> 00:24:24,240 Speaker 3: thousand received a register-to-vote ad. The two 439 00:24:24,280 --> 00:24:27,280 Speaker 3: hundred thousand that received that ad were nine times 440 00:24:27,320 --> 00:24:30,879 Speaker 3: more likely to search voter registration information and how to 441 00:24:30,920 --> 00:24:36,000 Speaker 3: register to vote. Those are indicators of behavioral change. Voto 442 00:24:36,040 --> 00:24:39,280 Speaker 3: Latino in twenty twenty registered over six hundred and fifty 443 00:24:39,320 --> 00:24:42,679 Speaker 3: thousand people.
Seventy percent of them went out and voted. 444 00:24:42,720 --> 00:24:45,000 Speaker 3: But we learned so much that we then applied it 445 00:24:45,040 --> 00:24:46,840 Speaker 3: to COVID, because we found that the person who was 446 00:24:46,920 --> 00:24:50,240 Speaker 3: vaccine hesitant had very similar markers to the person that 447 00:24:50,359 --> 00:24:53,160 Speaker 3: was a low-propensity, hesitant voter, and so we were able 448 00:24:53,200 --> 00:24:56,360 Speaker 3: to again work very closely serving ads, and this time 449 00:24:56,400 --> 00:24:58,520 Speaker 3: it was even more enlightening. The people who saw the 450 00:24:58,520 --> 00:25:01,120 Speaker 3: COVID vaccine ad, we did this in July of last year, in 451 00:25:01,240 --> 00:25:04,240 Speaker 3: high-density Latino zip codes, could have had the vaccine 452 00:25:04,280 --> 00:25:06,080 Speaker 3: by the middle of July, but had decided not to. 453 00:25:06,600 --> 00:25:09,880 Speaker 3: Those people that saw our ads were fifty-four times more 454 00:25:09,960 --> 00:25:12,600 Speaker 3: likely to search "get a COVID vaccine." And so now 455 00:25:12,680 --> 00:25:15,040 Speaker 3: what we have in the field is using the information 456 00:25:15,440 --> 00:25:18,639 Speaker 3: of how to identify people who are vaccine hesitant, again 457 00:25:18,840 --> 00:25:21,600 Speaker 3: learning what they are consuming. Oftentimes we'll say, you know, 458 00:25:21,640 --> 00:25:24,479 Speaker 3: Latinos consume X amount of Twitter. That's important, but what 459 00:25:24,520 --> 00:25:27,360 Speaker 3: are they consuming on Twitter? That's equally, if not more, important. 460 00:25:27,359 --> 00:25:29,000 Speaker 3: And so that's where we are right now. We hope 461 00:25:29,000 --> 00:25:31,000 Speaker 3: to have that study done by the middle of June, 462 00:25:31,040 --> 00:25:33,320 Speaker 3: and then we will share it with our partners and 463 00:25:33,359 --> 00:25:35,560 Speaker 3: grassroots organizers.
We want to be able to do the 464 00:25:35,560 --> 00:25:38,960 Speaker 3: research but then apply it immediately in the field, because time 465 00:25:39,040 --> 00:25:43,720 Speaker 3: is short, and once someone has developed a real distrust 466 00:25:43,760 --> 00:25:46,240 Speaker 3: of government, we may never be able to 467 00:25:46,240 --> 00:25:46,920 Speaker 3: bring that person back. 468 00:25:48,160 --> 00:25:50,320 Speaker 1: So to bring it back kind of to the beginning, 469 00:25:50,920 --> 00:25:56,120 Speaker 1: it's happening because of the incredible demographic growth and strength 470 00:25:56,280 --> 00:26:00,840 Speaker 1: of the Latino and Latina voter. It's actually about voters 471 00:26:00,920 --> 00:26:04,600 Speaker 1: who can, as Maria Teresa said, shift what's happening in 472 00:26:04,600 --> 00:26:07,960 Speaker 1: this country. So there's a tremendous amount of power within the 473 00:26:08,000 --> 00:26:12,600 Speaker 1: Latino and Latina voting population, you know, just empirically there is. 474 00:26:13,160 --> 00:26:15,640 Speaker 1: So we actually want to understand a little bit more 475 00:26:15,640 --> 00:26:19,959 Speaker 1: about the impact of the campaigns. Right, so the biggest 476 00:26:19,960 --> 00:26:21,640 Speaker 1: thing that we have to acknowledge is that a huge 477 00:26:21,720 --> 00:26:25,159 Speaker 1: chunk of people, Latinos and Latinas too, question this 478 00:26:25,240 --> 00:26:28,400 Speaker 1: whole notion of it: Is there misinformation? Is there disinformation? 479 00:26:29,119 --> 00:26:32,280 Speaker 1: Is it a real issue?
We know that there are smart, 480 00:26:32,720 --> 00:26:36,359 Speaker 1: politically engaged people out there who think, you know, 481 00:26:36,440 --> 00:26:38,439 Speaker 1: that this is a hoax, or that it's just 482 00:26:38,560 --> 00:26:41,480 Speaker 1: the leftist media that has you thinking that there's 483 00:26:41,520 --> 00:26:45,840 Speaker 1: misinformation and disinformation. And we have to reconcile these two ideas: 484 00:26:45,840 --> 00:26:51,920 Speaker 1: one, that there actually is harmful Spanish-language disinformation targeting 485 00:26:51,960 --> 00:26:55,119 Speaker 1: our communities, and at the same time, that Latinos and 486 00:26:55,160 --> 00:26:59,120 Speaker 1: Latinas are not a monolith and that they're not always 487 00:26:59,200 --> 00:27:02,080 Speaker 1: easily manipulated. Right, these two things can happen at 488 00:27:02,080 --> 00:27:04,119 Speaker 1: the same time. So, Maria Teresa, what do you 489 00:27:04,160 --> 00:27:06,560 Speaker 1: say to people who think that, oh, you know what, 490 00:27:06,960 --> 00:27:10,280 Speaker 1: just calm down, cálmense, cálmense? Because you're 491 00:27:10,400 --> 00:27:14,359 Speaker 1: saying actually the opposite. You're saying Latinos and Latinas are 492 00:27:14,440 --> 00:27:18,640 Speaker 1: the canaries in the coal mine in terms of mis- and disinformation. 493 00:27:19,160 --> 00:27:20,320 Speaker 1: So what do you say to the people who are 494 00:27:20,359 --> 00:27:23,679 Speaker 1: like, Maria Teresa is overreacting? Yeah, you're overreacting. 495 00:27:23,920 --> 00:27:26,000 Speaker 3: So first of all, I'm sure, as a good Latina, 496 00:27:26,080 --> 00:27:26,760 Speaker 3: no one can tell me cálmate. 497 00:27:27,119 --> 00:27:33,200 Speaker 2: Because it's not in you. Yeah, I gotcha, really. 498 00:27:34,440 --> 00:27:37,399 Speaker 3: Now, when you look at disinformation and the amount 499 00:27:37,440 --> 00:27:39,679 Speaker 3: of harm it's caused. Let's go back to COVID.
If 500 00:27:39,720 --> 00:27:43,639 Speaker 3: you look at the deaths among Latinos ages zero to 501 00:27:43,840 --> 00:27:46,679 Speaker 3: twenty four, ages twenty four to thirty four, like, if 502 00:27:46,720 --> 00:27:49,960 Speaker 3: you look at that categorically, we are roughly about twenty 503 00:27:50,000 --> 00:27:52,359 Speaker 3: percent of the population across the board, but for people 504 00:27:52,480 --> 00:27:56,359 Speaker 3: under the age of sixty five, we represent close to 505 00:27:56,480 --> 00:27:59,720 Speaker 3: thirty percent of the deaths within each age category. We 506 00:27:59,800 --> 00:28:02,920 Speaker 3: are overrepresented in those age categories. And one of the 507 00:28:03,040 --> 00:28:06,000 Speaker 3: leading factors is that we were vaccine hesitant. 508 00:28:06,440 --> 00:28:09,440 Speaker 3: And it has everything to do with the way people 509 00:28:09,560 --> 00:28:11,760 Speaker 3: are presenting information to us, and we don't necessarily know how 510 00:28:11,880 --> 00:28:15,960 Speaker 3: to navigate it, because we trust our friends more 511 00:28:16,000 --> 00:28:18,600 Speaker 3: than we trust someone in the lab coat. Right. 512 00:28:18,680 --> 00:28:20,360 Speaker 7: But I think this is the part where, at least 513 00:28:20,400 --> 00:28:23,120 Speaker 7: in covering this issue as a political journalist and as 514 00:28:23,119 --> 00:28:28,280 Speaker 7: an opinion writer and editor, there still is this feeling 515 00:28:28,520 --> 00:28:31,600 Speaker 7: that this is a non-issue, Jaime, that 516 00:28:31,840 --> 00:28:34,800 Speaker 7: way too much attention is being given to this, that Latino 517 00:28:34,840 --> 00:28:37,600 Speaker 7: and Latina voters are being portrayed as dumb because they 518 00:28:37,720 --> 00:28:42,280 Speaker 7: fall for this, and it's disrespectful and insulting and patronizing.
519 00:28:42,360 --> 00:28:44,640 Speaker 7: And people are free to decide on whoever they want 520 00:28:44,680 --> 00:28:45,040 Speaker 7: to vote for. 521 00:28:45,280 --> 00:28:48,360 Speaker 2: So is there a disconnect there? Because that's the part 522 00:28:48,360 --> 00:28:51,000 Speaker 2: that I don't understand. Is this a political issue? 523 00:28:51,960 --> 00:28:54,720 Speaker 7: Is this going to impact voters, or is it just 524 00:28:55,280 --> 00:28:58,560 Speaker 7: Democrats saying, hey, Latinos, you're being misinformed, vote for us, 525 00:28:58,680 --> 00:29:02,480 Speaker 7: and Republicans saying, you know, don't trust those Democrats, because 526 00:29:02,480 --> 00:29:04,240 Speaker 7: they think you're all stupid? 527 00:29:04,560 --> 00:29:05,280 Speaker 2: Your thoughts, Jaime. 528 00:29:05,680 --> 00:29:08,320 Speaker 6: Yeah, I think that's exactly the balance that I try 529 00:29:08,360 --> 00:29:10,440 Speaker 6: to tread when I talk about the research that I do, 530 00:29:10,520 --> 00:29:13,080 Speaker 6: and I think, Maria, you put it really well earlier 531 00:29:13,160 --> 00:29:17,440 Speaker 6: as well, that we can have bigger conversations about this, yes 532 00:29:17,720 --> 00:29:20,800 Speaker 6: and, right. In a way, the way that we talk 533 00:29:20,840 --> 00:29:23,920 Speaker 6: about this mis- and disinformation discourse is mis- and disinformation 534 00:29:24,040 --> 00:29:24,520 Speaker 6: in itself. 535 00:29:24,840 --> 00:29:28,400 Speaker 4: We can say that, like, there are political powers 536 00:29:28,080 --> 00:29:32,280 Speaker 6: here who want to co-opt the language of mis- 537 00:29:32,320 --> 00:29:36,080 Speaker 6: and disinformation for the benefit of their political goals, and 538 00:29:36,160 --> 00:29:39,000 Speaker 6: not just in spreading mis- and disinformation, but in convincing 539 00:29:39,040 --> 00:29:41,440 Speaker 6: communities that they should or should not believe in it.
540 00:29:41,600 --> 00:29:45,040 Speaker 6: I think it's grossly irresponsible to tell people that this 541 00:29:45,240 --> 00:29:48,400 Speaker 6: is overblown. I think it's a very serious issue. For me, 542 00:29:48,720 --> 00:29:51,040 Speaker 6: what we still need to do as an area of 543 00:29:51,120 --> 00:29:54,600 Speaker 6: study is really look closer at what 544 00:29:54,760 --> 00:29:57,560 Speaker 6: the actual impact is. And it's a difficult question to answer. Like, 545 00:29:57,720 --> 00:30:01,560 Speaker 6: we know what we've seen, we know what people say: the 546 00:30:01,600 --> 00:30:04,600 Speaker 6: election fraud narrative in twenty twenty, and then the insurrection, 547 00:30:04,960 --> 00:30:07,360 Speaker 6: also vaccine hesitancy. I could go on and on and 548 00:30:07,440 --> 00:30:07,760 Speaker 6: on and on. 549 00:30:08,240 --> 00:30:10,360 Speaker 2: No, you could go through the list. It's all good, 550 00:30:10,960 --> 00:30:12,080 Speaker 2: it's scary, we can laugh. 551 00:30:12,200 --> 00:30:17,800 Speaker 1: But actually, you know, if you are consuming this, 552 00:30:17,960 --> 00:30:20,960 Speaker 1: it's overwhelming and it's constant. 553 00:30:20,880 --> 00:30:24,880 Speaker 6: Yeah, yeah, that's exactly it. For me, the question is, 554 00:30:25,000 --> 00:30:29,200 Speaker 6: like, to what measurable extent did this push and enable people? Like, 555 00:30:29,320 --> 00:30:33,040 Speaker 6: were these folks susceptible already to this mis- and disinformation, or 556 00:30:33,080 --> 00:30:34,840 Speaker 6: did this just provide a framing for the things that 557 00:30:34,880 --> 00:30:37,400 Speaker 6: they already wanted to do? Does it make it easier 558 00:30:37,440 --> 00:30:40,720 Speaker 6: to rally people around a specific cause? Like, yeah, we've seen that. 559 00:30:40,880 --> 00:30:45,120 Speaker 6: It definitely does. Absolutely there's something here.
Absolutely we need 560 00:30:45,200 --> 00:30:48,000 Speaker 6: to pay attention. Absolutely we need to sound the alarms, 561 00:30:48,040 --> 00:30:51,560 Speaker 6: and we're doing that too. But there's still a barrier 562 00:30:51,720 --> 00:30:54,360 Speaker 6: to knowing how this actually shifts sentiment. We need to be 563 00:30:54,480 --> 00:30:58,880 Speaker 6: really cautious, because sometimes blaming mis- and disinfo lets 564 00:30:58,920 --> 00:31:00,520 Speaker 6: other institutions 565 00:31:00,720 --> 00:31:03,160 Speaker 4: off the hook for their own failures. That's the thing that I 566 00:31:03,240 --> 00:31:04,200 Speaker 4: want at the center of this. 567 00:31:04,760 --> 00:31:08,480 Speaker 6: But also it's dangerous, and people are being targeted, and 568 00:31:08,600 --> 00:31:10,720 Speaker 6: it's being used for political means. 569 00:31:10,840 --> 00:31:13,360 Speaker 4: I want to have that yes-and conversation. 570 00:31:15,640 --> 00:31:17,680 Speaker 1: Thank you for putting that in context and for all 571 00:31:17,720 --> 00:31:19,800 Speaker 1: of your work. And thanks to the three of you 572 00:31:19,960 --> 00:31:23,680 Speaker 1: for joining me on Latino USA for this really important conversation. 573 00:31:24,040 --> 00:31:24,640 Speaker 2: Appreciate it. 574 00:31:24,880 --> 00:31:25,560 Speaker 3: Thank you so much. 575 00:31:27,000 --> 00:31:29,200 Speaker 2: Thank you, Maria. It's always great to be on Latino 576 00:31:29,320 --> 00:31:29,840 Speaker 2: USA with you. 577 00:31:43,800 --> 00:31:47,240 Speaker 1: This episode was produced by Nour Saudi and edited by Julio 578 00:31:47,320 --> 00:31:51,040 Speaker 1: Ricardo Varela. It was mixed by JJ Carubin.
The Latino 579 00:31:51,160 --> 00:31:56,640 Speaker 1: USA team includes Andrea López-Cruzado, Marta Martinez, Daisy Contreras, 580 00:31:57,040 --> 00:32:03,280 Speaker 1: Mike Sargent, Julieta Martinelli, Victoria Estrada, Patricia Sulbarán, Alejandra Salazar, Reynaldo 581 00:32:03,400 --> 00:32:05,280 Speaker 1: Leaños Jr. and Julia 582 00:32:05,040 --> 00:32:07,120 Speaker 2: Rocha, with help from Rari Perees. 583 00:32:07,600 --> 00:32:09,960 Speaker 1: Special thanks to the rest of the In the Thick 584 00:32:10,040 --> 00:32:14,160 Speaker 1: team as well: Harsha Nahata, Lisa Salinas, and Sarah Harshander. 585 00:32:14,560 --> 00:32:19,080 Speaker 1: And remember to subscribe to In the Thick. Our director of engineering 586 00:32:19,240 --> 00:32:23,120 Speaker 1: is Stephanie Lebow. Our senior engineer is Julia Caruso. Our 587 00:32:23,160 --> 00:32:28,200 Speaker 1: associate engineer is Gabriela Baez. Our marketing manager is Luis Luna. 588 00:32:28,440 --> 00:32:32,880 Speaker 1: Our fellows are Elisa Baena, Monica Morales Garcia and Andrew Vignalis. 589 00:32:33,360 --> 00:32:36,680 Speaker 1: Our theme music was composed by Xenia Rubinos. I'm your 590 00:32:36,680 --> 00:32:39,880 Speaker 1: host and executive producer, Maria Hinojosa. Join us again on 591 00:32:39,960 --> 00:32:41,960 Speaker 1: our next episode. In the meantime, look for us on 592 00:32:42,040 --> 00:32:45,960 Speaker 1: all of your social media, and remember, no te vayas, and 593 00:32:46,080 --> 00:32:47,280 Speaker 1: see you on our next episode. 594 00:32:47,360 --> 00:32:57,120 Speaker 8: Bye. Latino USA is made possible in part by the 595 00:32:57,240 --> 00:33:00,960 Speaker 8: Ford Foundation, working with visionaries on the front lines of 596 00:33:01,040 --> 00:33:07,520 Speaker 8: social change worldwide; the Heising-Simons Foundation, unlocking knowledge, opportunity 597 00:33:07,800 --> 00:33:14,160 Speaker 8: and possibilities, more at hsfoundation dot org; and the John D.
598 00:33:14,480 --> 00:33:16,600 Speaker 8: And Catherine T. MacArthur Foundation. 599 00:33:20,960 --> 00:33:30,520 Speaker 6: I'm like a sad goth and then very pessimistic is 600 00:33:30,560 --> 00:33:31,520 Speaker 6: a sad goth too. 601 00:33:32,000 --> 00:33:34,880 Speaker 2: I've not felt that way, Okay, I just want to 602 00:33:34,920 --> 00:33:36,240 Speaker 2: double check because I'm with him. 603 00:33:36,280 --> 00:33:37,280 Speaker 4: Man, I'm a sad goth too,