Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and I had to rush through that intro. Ladies, gentlemen, everybody: I have to talk to my good friend Bridget, host of There Are No Girls on the Internet, who has joined the show today to talk about a very serious topic. But before we get into that, Bridget, it is so nice to have you on the show.

It's so nice to be here. I'm so excited. You know, we connect every week just via phone, and it's nice to be connecting in people's earbuds in podcast land.

Yeah. So, Bridget has been hosting an incredible show, There Are No Girls on the Internet. If you haven't heard it already, you absolutely need to seek it out. And specifically, I wanted to bring her on the show to talk about a season two theme that you've really done a lot of hard work on, Disinformed, because we're going to be talking about misinformation, disinformation, and the promotion of that through algorithms. But Bridget, in your own words, why don't you talk a bit about There Are No Girls on the Internet in general and Disinformed in particular?

Yeah, so There Are No Girls on the Internet was really born from this idea that we know that marginalized people, underrepresented people, women, communities of color, other folks from underrepresented backgrounds have had such a huge impact on technology and, honestly, just on what makes the Internet a fun, great place. Like, how many times is there a hilarious hashtag or a hilarious Vine or a hilarious video on TikTok that goes viral, that you're seeing everywhere, and the person behind it is a woman or a person of color? And so I really wanted to carve out a space where we could really highlight and celebrate and amplify all the amazing impacts that people from underrepresented backgrounds have had on the Internet.
And so that's why I started There Are No Girls on the Internet. It's been a really fun journey to talk about some of the ways that underrepresented folks either show up or don't show up online. And from making that show, one of the things that came up for me again and again was the role that disinformation, distorted media stories, fake news, and things like online harassment and radicalization play. These same underrepresented people who are showing up, making great content online, making the Internet such a fun place, those same people are really at the forefront of a lot of this stuff, right? You know, when it comes to disinformation, communities of color are the most impacted, women are the most impacted. But we are the same ones who are doing a lot of the research and a lot of the work to fight back. And I was really fascinated by that. I was fascinated by the human center of disinformation. I think with a lot of tech, it can be hard to remember that it's really about people, that there are people at the other end of screens or the other end of phones, and disinformation was a situation where it really was important to me to center those people and to make sure that their stories were at the forefront.

You bring up a point that I think is good for us to lay out right at the very beginning. While we're going to be talking about social media platforms, specifically platforms like Facebook, YouTube, Twitter, those sorts of things, and to another extent some of the more extreme social networks out there that heavily promote things like radicalization within the confines of their own communities, we have to remember that's one piece of a very big, very complicated puzzle. The pathway to extremism or radicalization is not a one-lane highway where everybody goes the exact same way.
There's usually a lot of different factors at play, and if we're being completely honest, we still don't really have a full understanding of which factors carry the most weight. Obviously, being in person with groups that are proselytizing extremist views, groups that reinforce for newcomers that those extremist views are legitimate and that reinforce it whenever someone expresses an extremist view, that's clearly very important, and when it happens in a real space, meatspace, it can have a huge impact. But we also know that online interactions can play a big part, especially as we have embraced them more wholeheartedly in the last decade. And clearly, in a situation like we are in now, where a lot of us are at home and the only interactions we're mostly having with other people are online, we're seeing that online radicalization is in fact a factor as well. But to what extent it is as effective or ineffective as other factors, that's something that's under academic debate. So we're going to have this discussion, but we do want to make it clear that there are no absolutes that we know of yet. These are all things that are under study and consideration. It's more that we're trying to take stock of what's going on and what elements could be playing a part in the process of someone encountering extremist views and the possibility of getting wrapped up in that.

So with that in mind, let's talk about social media. One thing I noticed: I decided to do a quick search, Bridget, before we jumped on this call, and I found a Pew Research survey from a couple of years ago that was looking at what percentage of people were getting some or most of their news through social networking sites, primarily Facebook. And at this point, a large share of adults get either some of their news through social media or practically all of their news through social media.
So obviously a social platform plays a big part in the way we access information that we then try to incorporate into our lives.

Yeah, I think so. I know that study, and I'm glad that you grounded our conversation in it. You know, as someone who tracks and talks and thinks a lot about the way that platforms operate and the role they have in our society: for a long time, platforms like Facebook would say that they were just neutral platforms, that they did not need to take any kind of editorial strategy because they're just neutral platforms. And I think that Pew finding, that adults get some aspect of their news from these platforms, really shows how that's not always the case. I think that tech leaders should be thinking about their platform policies based on the role they actually play in their users' lives. And if most users are saying, hey, I use your platforms to get my news, then maybe you do need to have a little bit of editorial strategy, right? Maybe you do need to have policies in place to make sure that it's being used in safe and responsible ways. I'm glad that we have moved away from that thinking of, oh, if I run a platform, it is not my responsibility to have any kind of policing or policies around what is on that platform, which I think was the conversation for so long.

Yeah, I think the interesting thing to me is that you will hear a lot of debate in America about Section 230, which is a specific piece of legislation that provides online platforms some legal immunity from the sort of stuff that their users might post to those platforms, saying that they are not responsible for the things that their users share. And the hope of the people who actually drafted that legislation was that it was going to be a two-pronged thing.
It would allow platforms to establish themselves, because otherwise they would constantly be under legal threat and nothing would ever gain any traction; this legislation was proposed in the nineties, when the World Wide Web was very young. And the second part was that they were really hoping it was going to lead to moderation practices on these platforms, that the platforms would moderate the material that was being posted to them without fear of legal action against them for doing so, that they would be given that freedom. And the problem, or one of the problems, is that a lot of these platforms embraced the first part, where they're not held responsible for the stuff that people post to them, but they didn't go so hard on the second part, where they actually moderate the material. And we've seen that time and again. We've seen it on Facebook, we've seen it a lot on YouTube. I mean, the whole controversy a couple of years ago about the various bizarre and sometimes very disturbing videos that were showing up on YouTube's children-oriented service shows that they weren't really looking at it from that perspective, for multiple reasons. And we'll get into some of those a little bit later, because it does tie into the algorithm side too. The other Pew study that I saw that was interesting and disheartening found that people who gained most of their news from social media sites were in general less engaged and less knowledgeable about the subject matter than those who were consuming it from multiple sources. So if you were someone who got some of your news from social platforms but also got some of your news from journals or newspapers or television or radio or whatever, you generally had a better understanding of the subject matter.
And this to me almost sounds like the problem of people reading a headline, never clicking through to read and digest the content, and basing their entire response on the headline and maybe the comments placed underneath an article that's been posted to Facebook, let's say. Which is really upsetting to someone who has an English background. And especially for us, we've worked in media; we know there's a special skill in creating headlines that don't necessarily relate back to the content of the piece they're attached to, or that elevate some minor element in order to gain attention. So this, to me, is another issue. And this is before we even get into purposeful misinformation. This can just be a misunderstanding because you didn't take the time to read the full material.

Oh, absolutely. I can say so much about this. So, one of the places that I've worked in my life was MSNBC. I was on their digital team, and our job was essentially framing stories on social media in ways that were going to get them clicks and, you know, attention online. And we also had access to metrics, so we could actually see how people were engaging with this content. And the number of times that people share stories on Facebook that we can see on the back end they have not actually clicked in to read would frighten you. But it was our job to kind of explore that, and so we learned that, oh, people like to share articles where it seems like they're trying to indicate something about themselves, like this article makes them look thoughtful or interesting or unique or well-read or really informed, and so they want to project that to the rest of their followers, so they share it. But those same stories are the ones that people aren't clicking in to read.
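To make that back-end picture concrete, here is a minimal sketch, using invented event data rather than any real platform's analytics API, of how a digital team might measure how often a story gets shared relative to how often it is actually clicked through:

```python
# Minimal sketch with made-up data (not any platform's real analytics API):
# estimate how often stories are shared versus actually clicked and read.
from collections import defaultdict

# Each event is (story_id, action), where action is "click" or "share".
events = [
    ("story-1", "click"), ("story-1", "share"),
    ("story-2", "share"), ("story-2", "share"), ("story-2", "share"),
    ("story-2", "click"),
]

counts = defaultdict(lambda: {"click": 0, "share": 0})
for story_id, action in events:
    counts[story_id][action] += 1

for story_id, c in sorted(counts.items()):
    # A high share count with a low click count suggests people are passing
    # along the headline without ever opening the article.
    print(f"{story_id}: {c['share']} shares, {c['click']} clicks, "
          f"shares per click = {c['share'] / max(c['click'], 1):.1f}")
```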
And so it really is not surprising to me that people who get their media and news from social media primarily are less informed and less engaged, because I think these media ecosystems really prioritize that. I think they've built a media landscape that prioritizes sharing without reading thoughtfully, right? Everything happens so quickly on the Internet; I feel like that is what's really the priority. And I also think our media landscape has really changed. I grew up in a household where we got the newspaper; we got multiple newspapers in my home. And we have to note the ways that our journalism industry has really been decimated. So it's not surprising to me that as small local papers, the ones putting out local, relevant, timely information right on your doorstep, begin to shut their doors, more and more people are turning to social media, which is in turn making them less informed. This is not a surprise to me.

Right. I have a great example to give you, Bridget. It's a very recent one, and I think it's one you'll appreciate, which is that, as I record this, I think we're about a day out from when a trailer for a new Mortal Kombat movie came out. And over on Jezebel, a writer wrote an article with the title "Why Isn't Chun-Li in the Mortal Kombat Movie?" And for those who aren't schooled in Mortal Kombat, Chun-Li is not a character from Mortal Kombat. She's a character from Street Fighter. Different franchise, right? But that was the headline. If you look at the article, there were hints in the article that this was a joke; there's a point where they say that if it's the ultimate street fight, Chun-Li needs to be there, indicating that yes, the author knows that Chun-Li doesn't live in the Mortal Kombat world, that Chun-Li lives in the Street Fighter world.
But it was almost like a social experiment, and of course Twitter went nuts, and everybody started slamming this article, saying, look how stupid this person is, they don't even know that Chun-Li is not in Mortal Kombat. Which made me say: all this is telling me is which of you failed to read the article. Because if you had read the article, you would have seen it was a joke, you would have recognized it as satire, and you would know that this was a wind-up the whole time. You're really just falling into the trap that the writer set at that point. And it's just the perfect microcosm of this tendency. And I think that posting links to news articles has almost become a shortcut for a comment. It's almost like a comment or a GIF or an emoji, right? It's a way for someone to say, here is my thought on that, in headline form, even though you may not have clicked through to read the actual article underneath. Or you're just looking for something that supports whatever position you're taking on any particular topic, whether it's something lighthearted in pop culture or something really serious that has to do with, like, politics or health. And I see that a lot too, where people are clearly grabbing articles, just doing the down-and-dirty Google search to find something that looks like it supports their position and using that in its place. So again, this is all before you even get to purposeful misinformation or disinformation. It may be that the sources you pull from are misinformation, but it may be that they're totally legitimate and you just don't know the context for them. You're just throwing things at the wall to see what sticks. So we're doing a lot of the work ourselves; that's part of it. We have created a kind of culture that supports the sharing of misinformation.
But the flip side of that coin is that these platforms are optimized for the sharing of misinformation and the elevation of misinformation. And that's really the part that we can look at more critically and ask: in what ways is this happening, and how culpable are these platforms for that process? Keeping in mind that in most cases, I think we would argue, ultimately there are other people, not connected to the platforms, who are generating the content that's getting elevated there, but the platforms are benefiting from it. So let's talk a little bit about that, about what is going on with these platforms and why we even want to talk about algorithms. So, just so you guys out there know, an algorithm is essentially just a list of instructions. You can think of it like a program; it doesn't have to be a program, but it's essentially a list of instructions that guides some sort of process. And in the case of social networks, essentially what we talk about when we say "algorithm" is the bit of the platform that decides what content you see and in what order you see it. So for YouTube, it could be the recommendation engine, where you're watching one video and then you've got a whole bunch of other videos recommended along the side. For me, it's all cute animal videos. I wish I could lie and say it wasn't; I'm a punk rock kind of guy, but it's all kitten and puppy videos, sometimes a sloth. And on Facebook, it's which posts you see and in which order. You know, everyone gets frustrated that they're not seeing things in reverse chronological order. Well, that's by design. As users, we can think of it as the algorithm that shows us what is potentially next on our docket. For the platforms, algorithms exist solely because the platform wants to keep people on it for as long as possible, because that's how the platforms make money.
It's through advertising; it's through the commoditization of people's personal information that they generate while they're on that platform. And the longer you're on it, the more the platform makes. So the algorithms are, I would argue, amoral. They don't really care what it is they're serving up as long as it keeps you there, and that, I think, is the core of the problem.

Absolutely, a thousand percent correct. I think that tech platforms need to be held accountable for the ways that their algorithms have really just taken advantage of a lot of human nature's not-so-great tendencies. Listen, we all know humans are terrible, right? We are lazy. We make rash, snap decisions, myself very much included. We will smash that share button, smash that like button, smash that comment button on an article that, frankly, we have not read, and we're just putting our uninformed opinion out there. I am guilty of all of this. Essentially, platforms and their algorithms have rewarded that kind of behavior, that kind of quick, snap decision making, and created this media landscape where the citizenry is less informed, less engaged, and more primed for bad actors to disinform and misinform them. And I think the way that you put it is completely correct. I say this all the time on my own podcast: one of the reasons why I'm very interested in disinformation is because I think it's a good example of the way that algorithmic thinking has really failed us. You know, I want to build an Internet that prioritizes and privileges things like thoughtfulness, or discourse, or really engaging with opinions that are different from your own. And I feel like algorithms and platforms have given us the opposite. It's a media landscape that rewards quick decision making and being less engaged and less thoughtful, and thus leaves people really able to be exploited by people who do want to push misinformation and disinformation. And I think when platforms like Facebook get rich off of it, we really have to ask some questions. You know, Facebook's own internal report says that 64 percent of the time, when somebody joins an extremist Facebook group, it's because Facebook itself recommended it. They should not then be getting more money based on something that they admit has such a corrosive impact on our society.

Hey, guys, Jonathan from the future here, coming in to say we're going to take a quick break in our conversation with Bridget Todd of There Are No Girls on the Internet, and we will be back right after this break.

You can't play both sides of the issue, saying we don't have any responsibility, we're neutral, we're just a platform that people post to, and then also reap the billions of dollars generated through the process of radicalization. If you are directly profiting from that process, you are at least in some way culpable for it. The idea of being able to wash your hands and walk away is alien to me. I can't imagine being able to shed that kind of responsibility. And you know, we talk about Facebook revenue in the billions of dollars every single quarter. It's not billions of dollars per year; every quarter, that company is making billions. And a lot of this is because of these algorithms. Facebook's value proposition is that they can put very specific groups of people in front of very specific content. And it's because the Facebook engine learns so much about us and how we interact that it can start predicting the things that we're going to react to next, and thus serve up that stuff in front of us, and sure enough, we're bound to act on it. And if this were a world where radicalization wasn't a thing, that would mostly mean we would all be buying more crap, and that would be the worst of it.
But we're seeing the same thing is true of YouTube, right? YouTube, Google: when you log into YouTube, not if you're just browsing it without being logged in, but if you're logged into YouTube, it is constantly tweaking your profile of the sorts of things that you interact with, both on and off the site, and building on that and determining what you want to see next. In fact, there was a study, self-published, not even peer-reviewed, I hesitate to even bring it up, but there was one report published a couple of years ago where a pair of researchers, and I use the term lightly, argued that YouTube wasn't contributing to radicalization. And they had this whole thing where they talked about watching different styles of videos and seeing what got recommended next. Except for one tiny little thing: they didn't log into YouTube. They were watching it without logging in, and it's the login process where Google starts to build that profile, figuring out, oh, well, they're interacting with a lot of content that's in, say, the video game space. That's another one that I watch a lot of, so now I get puppies and kittens and video games, not all in the same video, they're separate, but I get a good mix. But if I were to start watching videos that were, let's say, a little to the far right or the far left of the spectrum, then those algorithms start working and start determining what other sorts of videos I might respond well to, and based upon my activities on the site, it tweaks those weightings. It's like machine learning. It says, all right, well, we saw that this person spent an hour when we shared this kind of video, so let's push even harder on that and see if we can get that engagement up even more. And from the platform side, they're just thinking, how can we maximize this person's time on our platform and make the most money? But it's the same general approach as trying to radicalize someone, where you're trying to continually serve them up information from a very specific ideology and to reinforce that over and over. It just so happens that these two things are aligned with one another. And so, from that algorithmic standpoint, we see the process of radicalization somewhat automated, and that is where the real concerns are. And we know that this is still a thing. There was a report just earlier this month about how researchers were still finding that extremist videos would pop up in the recommendation sidebar if you were to watch them, whereas YouTube has been working pretty hard to cull that, but it's still happening. So I can't see how, if you run a social media platform that has an algorithm to determine what order you see stuff in, you can step back and say you're not responsible for at least playing a role in leading people toward extremism.

I mean, I wish I had that ability to wash my hands of responsibility for things that I don't want to be responsible for. But it's difficult to see how tech leaders cannot see their culpability in this. Just as you described, it's very difficult to make the argument that they don't have a responsibility when these kinds of things, by their own metrics, are happening on their platform. It's very clear, study after study after study, particularly on YouTube, which frankly, in all of the conversations we've been having about platform accountability, I've been surprised at the way that YouTube has really been able to skirt a lot of that heat. Like, we talk a lot about Facebook, we talk about Twitter, but YouTube, I don't know how they've done it, but they've been able to sort of sidestep that conversation, which I think is really not good considering the role they do play in radicalization.
You know, I'm very curious how they've been able to avoid that kind of responsibility for so long. Your point about platforms, I think, is such a good one, in that even if we lived in a world that did not have radical views or extremist views, even then I would want to ask questions about whether or not these platforms are actually doing good or doing harm. I remember this time, and this is kind of a weird story, but I was going through a breakup and I was still following my ex on Facebook. And it was one of those breakups that was really getting me down, and I realized Facebook must have sensed that something was going on with my relationship to this other account. So every little update that I got about my ex, Facebook was shoving it in my face. And I realized a platform can make you feel bad about yourself. If a platform has the power to steer you toward extremist ideology, toward extremist thinking or political content, it has the power to really shape how we feel in our day to day. I wish it wasn't true, but it is. And so I think even from that perspective, it is completely fair to step back and ask, well, how are these platforms contributing to harm, whether it's extremist content or just how someone feels day to day, like whether they feel able to step away from platforms? If platforms have prioritized amount of time on screen or amount of time on a page, or what have you, is that kind of thinking doing harm in society? I think it's completely fair to ask these questions. And I would like platforms, I would like tech leaders and the people who use these technologies, to have that conversation. But I feel like getting to a point where we're all on the same page has been so tough.
Yeah. And I would say that the efforts we have seen kind of illustrate the point you're making, Bridget, in that we see these tech companies occasionally respond when things come to a point where there's no other option and they have to respond: they're called before Congress, perhaps, which has happened multiple times now, or they're under pressure because advertisers don't want to be associated with things that are spiraling out of control. We see those cases when things get to the extreme. But often the responses are very surface level. So, things like: we now have a part of our app where it will alert you if you've spent X amount of time on the app, a little screen-time alert. And you're thinking, well, I see how you're trying to address the issue, kind of, but you're not getting at the underlying problem. You're just looking at a symptom. It's like treating someone who is very, very sick, but all you can do is alleviate the symptoms, not cure the sickness. It's the same sort of issue. And when we're talking about platforms that have the capacity to make massive changes in people's behavior over time, that's not really good enough, right? We are having countless people go down dark pathways where it's very hard to turn back, and they're going into communities where there's all this reinforcement that's again supplemented by the way platforms work in the first place. Now, one of the other things I wanted to mention, and we talked about this at the very beginning, is that we don't really know the full scope of the effect of this. We know what's happening; we know that it's a problem. The interesting thing is seeing some disagreements among researchers who have really looked into this as to the extent. So one of the articles I referenced with you, Bridget, was an opinion piece in Wired by Emma Briant. She was writing about the Oxford Institute.
They had released a study showing an increase in companies participating in influence campaigns; "influence campaigns" is the way they word it. I hate these words that we use where we take the sting out of what's happening. It's an "influence campaign." Although it does also give you a new appreciation of what it means to be an influencer. It makes it much more sinister. Realistic, but sinister.

Yeah, "campaign" sounds like what an Instagram influencer does to get you to buy coffee or something.

Yeah. And now, granted, you could argue that the same ideas used by influencers to try to get their followers to go and buy whatever brand is supporting them are somewhat aligned with some of these other ideas. But I think we could agree that there's a spectrum of harm here, from "oh man, this drink that I bought is real nasty, I wish I hadn't bought it" to "oh man, I found myself on the steps of the Capitol on January sixth, and boy, am I sorry that I've done that." That's a spectrum right there. But what Emma Briant was writing about was that the Oxford Institute comes out and says that there's been a year-over-year increase in this form of misinformation campaign and in the role of the Internet, and Briant's point was that we don't even understand the full scope of this. It's more that we're paying more attention, so we're seeing more of it; it doesn't tell us that there's actually been an increase year over year. What it tells us is that we're finally paying attention to a very real problem that's been around for a while.

If I can make an analogy, it's kind of like when people started to see what appeared to be autism rates rising, but in reality it looked like it was more that the definition and the way of diagnosing autism had expanded to a point where we were just realizing the number of cases that actually exist, as opposed to the number increasing. It was more that, oh no, it was here, we just didn't recognize a lot of this as autism. It's sort of the same argument that Emma Briant is making: if you come back and you say, well, I looked at the ocean, and we took a bucket out and we filled it up five times, so we know there's at least five buckets of water in the ocean, then the response is, well, yes, but there's a lot else there that we don't know about yet, that we haven't measured and we haven't quantified. And to me, that's interesting. It shows an opportunity for actual, real research and analysis and study into what's actually there as opposed to what's being reported.

And it's frightening. It's so scary, because, like she says, they looked at a list of a couple of dozen different platforms that were actively pushing out misinformation back in 2019, I think it was. And she said, listen, Cambridge Analytica was active in six different countries by itself, and that was just one, and she says, I've just started and I've made a list of more than six hundred companies that are actively pushing out influence operations. So that's why I wanted to set that ground early on. It's not to say that what we're talking about isn't relevant, but rather that we can't speak to any definitive scope, because we just don't know what that scope is. And that means that we've got to be on the lookout.
Yeah, 576 00:35:04,640 --> 00:35:07,480 Speaker 1: and I'm sorry to say this, but that Oxford 577 00:35:07,520 --> 00:35:11,799 Speaker 1: study that she references in that piece is incredibly influential, right, 578 00:35:11,840 --> 00:35:16,920 Speaker 1: and so it's upsetting to see. I mean, 579 00:35:17,000 --> 00:35:20,880 Speaker 1: with a name like Oxford, you expect a level of rigor, 580 00:35:21,239 --> 00:35:23,040 Speaker 1: you know, in the kind of research they're putting 581 00:35:23,040 --> 00:35:24,879 Speaker 1: out, right? I'm not a researcher, so I can't 582 00:35:24,880 --> 00:35:27,799 Speaker 1: really say. But the name Oxford, you're like, okay, that's 583 00:35:27,800 --> 00:35:29,360 Speaker 1: going to be very reputable, you know, it's a 584 00:35:29,400 --> 00:35:34,160 Speaker 1: reputable place. It's got some research 585 00:35:34,160 --> 00:35:38,799 Speaker 1: street cred. Yeah. Absolutely. Um. And it's upsetting that that 586 00:35:38,920 --> 00:35:43,120 Speaker 1: study was so influential in the disinformation space when a 587 00:35:43,200 --> 00:35:46,160 Speaker 1: lot of the points that it makes are really flawed. 588 00:35:46,239 --> 00:35:47,879 Speaker 1: And, I have to say, she points out 589 00:35:47,920 --> 00:35:50,480 Speaker 1: that the study itself had a lot of typos 590 00:35:50,520 --> 00:35:53,239 Speaker 1: and, like, spelling errors, which, I would be so embarrassed 591 00:35:53,440 --> 00:35:55,560 Speaker 1: if I put out an influential study and someone was like, 592 00:35:55,560 --> 00:35:59,279 Speaker 1: actually, it was really sloppy. I would be, like, mortified. 593 00:35:59,480 --> 00:36:02,440 Speaker 1: Her point is that the work is largely without meaning 594 00:36:02,600 --> 00:36:07,600 Speaker 1: because it's creating a sense that we have metrics for 595 00:36:07,800 --> 00:36:12,879 Speaker 1: something where, you know, really it's such 596 00:36:12,920 --> 00:36:16,440 Speaker 1: a small subsection of the overall problem that 597 00:36:16,480 --> 00:36:18,600 Speaker 1: it doesn't really tell us anything, right? It 598 00:36:18,719 --> 00:36:23,399 Speaker 1: just tells us about the reported incidents. But that, 599 00:36:23,840 --> 00:36:27,640 Speaker 1: unfortunately, is not really meaningful if you're looking at a 600 00:36:27,680 --> 00:36:33,640 Speaker 1: holistic approach to misinformation. And in fact, 601 00:36:33,719 --> 00:36:36,520 Speaker 1: she was also pointing out that the 602 00:36:36,520 --> 00:36:41,440 Speaker 1: study looked at countries where the reporting mechanism might not 603 00:36:41,520 --> 00:36:44,640 Speaker 1: be as rigorous as in others, like countries that don't 604 00:36:45,560 --> 00:36:48,000 Speaker 1: have the same sort of perspective on things like 605 00:36:48,040 --> 00:36:51,640 Speaker 1: freedom of the press, maybe state-controlled press. You have 606 00:36:51,680 --> 00:36:54,920 Speaker 1: places like China and Russia where the state has a 607 00:36:55,040 --> 00:36:57,680 Speaker 1: large amount of influence on the media that goes out 608 00:36:57,719 --> 00:37:02,359 Speaker 1: in those countries.
Uh, it does show that 609 00:37:03,000 --> 00:37:08,400 Speaker 1: it's dangerous to take any stance where you're making absolute 610 00:37:08,480 --> 00:37:12,520 Speaker 1: claims, because we just don't have the studies to really 611 00:37:13,239 --> 00:37:16,319 Speaker 1: do that justifiably. We don't have the evidence to 612 00:37:16,360 --> 00:37:19,120 Speaker 1: back that up. You're exactly right, we don't know the 613 00:37:19,160 --> 00:37:21,319 Speaker 1: real scope of the problem and the sort of lay 614 00:37:21,320 --> 00:37:23,640 Speaker 1: of the land. I also think something that she points 615 00:37:23,640 --> 00:37:25,480 Speaker 1: out that's really important to keep in mind is that 616 00:37:25,920 --> 00:37:29,600 Speaker 1: a lot of folks from a media perspective just weren't 617 00:37:29,640 --> 00:37:33,319 Speaker 1: really talking about disinformation in a serious way until 618 00:37:33,360 --> 00:37:37,080 Speaker 1: recently, right? And so I think that an issue that 619 00:37:37,120 --> 00:37:39,719 Speaker 1: we do have is people kind of getting up to 620 00:37:39,760 --> 00:37:42,279 Speaker 1: speed with how we think about this issue in a 621 00:37:42,320 --> 00:37:46,840 Speaker 1: holistic way, because I think, you know, after 622 00:37:47,080 --> 00:37:50,560 Speaker 1: the insurrection, I feel like almost overnight this was 623 00:37:50,600 --> 00:37:53,040 Speaker 1: an issue that was getting more buzz and more press, 624 00:37:53,040 --> 00:37:55,640 Speaker 1: and people were talking about it more. But with that 625 00:37:55,760 --> 00:37:58,719 Speaker 1: really does come a need to take a 626 00:37:58,760 --> 00:38:01,840 Speaker 1: beat and analyze how we got to this 627 00:38:01,920 --> 00:38:04,759 Speaker 1: space and, you know, try to put some 628 00:38:04,800 --> 00:38:08,920 Speaker 1: more research around understanding what is actually going on. 629 00:38:08,960 --> 00:38:11,600 Speaker 1: And I agree that we haven't really gotten there yet. Um, 630 00:38:11,640 --> 00:38:14,880 Speaker 1: and I am happy that there are more people who 631 00:38:14,920 --> 00:38:16,800 Speaker 1: are interested in this issue, but I don't want it 632 00:38:16,840 --> 00:38:20,040 Speaker 1: to become an issue where people are claiming to have 633 00:38:20,120 --> 00:38:22,719 Speaker 1: more knowledge than they do, or people are claiming that 634 00:38:22,760 --> 00:38:25,120 Speaker 1: we know more than we do. Because this is 635 00:38:25,400 --> 00:38:28,359 Speaker 1: very much a still-developing problem that we're still sort of 636 00:38:28,520 --> 00:38:31,600 Speaker 1: trying to get a hold on, I would say. Yeah, 637 00:38:31,840 --> 00:38:34,279 Speaker 1: we don't want there to be misinformation about how much 638 00:38:34,360 --> 00:38:41,839 Speaker 1: misinformation there is. Yeah. Hey, guys, Jonathan from the future again. 639 00:38:41,880 --> 00:38:43,880 Speaker 1: We're going to take a quick break, but we'll be 640 00:38:44,000 --> 00:38:50,320 Speaker 1: right back with more about social media platforms, algorithms, and radicalization. 641 00:38:58,120 --> 00:39:03,480 Speaker 1: So this issue is getting to a point where the 642 00:39:03,520 --> 00:39:06,640 Speaker 1: concern is great, and yet you also look at the 643 00:39:06,760 --> 00:39:09,600 Speaker 1: tactics that are being used.
You realize that the tactics 644 00:39:09,600 --> 00:39:12,920 Speaker 1: that are used are very old. I mean, you know, 645 00:39:13,000 --> 00:39:19,120 Speaker 1: the approaches to extremism, to propaganda, to misinformation, that stuff, 646 00:39:19,160 --> 00:39:21,839 Speaker 1: we've got a handle on that. I mean, you know, 647 00:39:21,920 --> 00:39:25,759 Speaker 1: you can see numerous documentaries on everything from people who 648 00:39:25,760 --> 00:39:28,839 Speaker 1: were experts in creating propaganda during World War Two, 649 00:39:29,320 --> 00:39:32,960 Speaker 1: on both sides of the conflict, up to, you know, 650 00:39:33,040 --> 00:39:36,480 Speaker 1: the ad executives. I mean, that's what Mad Men was all about, 651 00:39:36,680 --> 00:39:40,040 Speaker 1: the idea of framing. You know, how do you 652 00:39:40,040 --> 00:39:42,200 Speaker 1: frame information in a way to get people to do 653 00:39:42,239 --> 00:39:45,399 Speaker 1: what you want them to do? Um. You know, the 654 00:39:45,560 --> 00:39:49,680 Speaker 1: whole discussion about cigarette advertising was all 655 00:39:49,719 --> 00:39:53,080 Speaker 1: about that. So, like, this is all old stuff. What's 656 00:39:53,120 --> 00:40:01,800 Speaker 1: new is this method of compartmentalizing communities and reinforcing 657 00:40:01,880 --> 00:40:06,799 Speaker 1: the delivery system of that material. So it's the identifying 658 00:40:06,840 --> 00:40:13,080 Speaker 1: of a potential candidate, the introduction of material to that 659 00:40:13,200 --> 00:40:17,600 Speaker 1: candidate that will potentially set them down this pathway, and 660 00:40:17,640 --> 00:40:21,680 Speaker 1: then the methods of reinforcing that and indoctrinating that person 661 00:40:22,239 --> 00:40:27,840 Speaker 1: into more radical views. Interestingly, though probably not surprisingly, some of 662 00:40:27,880 --> 00:40:30,960 Speaker 1: the studies I was looking at suggested that this is 663 00:40:31,040 --> 00:40:34,720 Speaker 1: most effective for people who kind of have the lone 664 00:40:34,760 --> 00:40:38,919 Speaker 1: wolf approach to radicalism, and that in cases where 665 00:40:38,920 --> 00:40:44,440 Speaker 1: you're looking at groups of extremists, typically meatspace 666 00:40:44,680 --> 00:40:48,160 Speaker 1: is still where that kind of radicalization primarily happens. 667 00:40:49,120 --> 00:40:52,920 Speaker 1: But I would also argue that the insurrection on January sixth, 668 00:40:53,440 --> 00:40:56,640 Speaker 1: that was in large part a lot of different 669 00:40:56,680 --> 00:41:00,120 Speaker 1: individuals who all just sort of converged on 670 00:41:00,160 --> 00:41:03,319 Speaker 1: the same point. It wasn't so much let's all 671 00:41:03,400 --> 00:41:08,759 Speaker 1: join these online groups and then we start to plan 672 00:41:08,880 --> 00:41:11,319 Speaker 1: from there. It was a bunch of 673 00:41:11,360 --> 00:41:18,640 Speaker 1: individuals who started slowly gravitating toward one another through methods 674 00:41:18,680 --> 00:41:21,920 Speaker 1: like this. And of course there are other communities I 675 00:41:22,000 --> 00:41:28,239 Speaker 1: mentioned before, communities like Parler, or parlay if you prefer, uh, which, hey, 676 00:41:28,320 --> 00:41:33,520 Speaker 1: they're back, that's great, um, where, you know, there's 677 00:41:33,520 --> 00:41:35,839 Speaker 1: not so much the algorithm there.
It's that it's 678 00:41:35,840 --> 00:41:40,560 Speaker 1: a community that is actively reinforcing beliefs. So in that case, 679 00:41:41,040 --> 00:41:45,440 Speaker 1: it's almost more like the traditional method of radicalization, 680 00:41:46,040 --> 00:41:50,440 Speaker 1: in the sense that you have this 681 00:41:50,800 --> 00:41:57,920 Speaker 1: self-selected community that is following this process. So we 682 00:41:58,000 --> 00:42:00,160 Speaker 1: have the whole spectrum here. We still have 683 00:42:00,239 --> 00:42:03,680 Speaker 1: the meatspace stuff. We have the online communities that 684 00:42:03,719 --> 00:42:08,239 Speaker 1: are specifically geared, if not overtly then at 685 00:42:08,280 --> 00:42:13,120 Speaker 1: least effectively, towards radicalization. And then you've got 686 00:42:13,480 --> 00:42:18,000 Speaker 1: the stuff that everybody's using that can lead you to 687 00:42:18,080 --> 00:42:24,279 Speaker 1: that pathway. So, um, how are you feeling, Bridget? I mean, 688 00:42:24,360 --> 00:42:27,520 Speaker 1: as you were describing the issue, I almost felt 689 00:42:27,520 --> 00:42:30,200 Speaker 1: this pang in the pit of my stomach, because 690 00:42:30,680 --> 00:42:33,319 Speaker 1: we are up against so much. There's 691 00:42:33,360 --> 00:42:36,360 Speaker 1: so much, you know. We have the 692 00:42:36,480 --> 00:42:40,480 Speaker 1: in-real-life, meatspace organizing radicalizing people. We have 693 00:42:40,560 --> 00:42:44,840 Speaker 1: these platforms that are engineered to radicalize folks. The scope 694 00:42:44,840 --> 00:42:48,920 Speaker 1: of the problem is quite large. And I often wonder, 695 00:42:49,120 --> 00:42:51,120 Speaker 1: and I'm actually curious to know your thoughts: 696 00:42:51,520 --> 00:42:54,080 Speaker 1: do you think we'll ever tackle this? Do you 697 00:42:54,120 --> 00:42:55,880 Speaker 1: think that we will ever get to a place where 698 00:42:56,080 --> 00:42:58,000 Speaker 1: it is not just the norm for folks to be 699 00:42:58,080 --> 00:43:02,000 Speaker 1: having these kinds of experiences, being radicalized online? I 700 00:43:02,080 --> 00:43:13,560 Speaker 1: think without forcing the platforms, through legislation, regulations, whatever 701 00:43:13,600 --> 00:43:18,600 Speaker 1: it may be, to take a truly active role, 702 00:43:18,680 --> 00:43:23,399 Speaker 1: and also to be incredibly transparent with how their 703 00:43:23,440 --> 00:43:28,319 Speaker 1: algorithms work, we won't get there. And companies are going 704 00:43:28,360 --> 00:43:32,520 Speaker 1: to be extremely resistant to that, obviously, because the algorithm 705 00:43:32,880 --> 00:43:35,640 Speaker 1: is the secret sauce for making the money. 706 00:43:35,719 --> 00:43:38,600 Speaker 1: The companies are very resistant to making that 707 00:43:38,640 --> 00:43:42,319 Speaker 1: a transparent process because, for one thing, it could give 708 00:43:42,520 --> 00:43:46,120 Speaker 1: a competitor the opportunity to beat them at their own game, 709 00:43:46,360 --> 00:43:48,960 Speaker 1: make a better algorithm that does essentially the same thing 710 00:43:49,040 --> 00:43:51,880 Speaker 1: but in a slightly different way, and then they're no 711 00:43:51,920 --> 00:43:55,279 Speaker 1: longer king of the hill. Uh, I think it's gonna 712 00:43:55,320 --> 00:43:58,440 Speaker 1: be really tricky.
I think the moves we're seeing in 713 00:43:58,480 --> 00:44:01,920 Speaker 1: the US government, where there's the potential of breaking up 714 00:44:01,960 --> 00:44:05,640 Speaker 1: some of these companies, I don't think that's necessarily 715 00:44:05,680 --> 00:44:08,560 Speaker 1: going to solve this problem. 716 00:44:08,640 --> 00:44:12,280 Speaker 1: There will need to be additional measures put in place 717 00:44:12,320 --> 00:44:17,080 Speaker 1: for that to be effective. Otherwise, all you're doing 718 00:44:17,160 --> 00:44:19,239 Speaker 1: is taking one big piece and making a bunch 719 00:44:19,239 --> 00:44:21,439 Speaker 1: of smaller pieces. But if they're all working the same 720 00:44:21,480 --> 00:44:24,480 Speaker 1: way as the big piece was, we haven't really solved 721 00:44:24,480 --> 00:44:28,120 Speaker 1: any issues. Um, I think everyone recognizes the amount of 722 00:44:28,120 --> 00:44:32,160 Speaker 1: power these companies have. That's undeniable. The question is 723 00:44:32,160 --> 00:44:35,000 Speaker 1: how do we deal with that? My answer, from 724 00:44:35,000 --> 00:44:38,799 Speaker 1: a personal standpoint, is not satisfying, because I 725 00:44:38,920 --> 00:44:44,280 Speaker 1: just disengaged. I quit Facebook. But that's one person, 726 00:44:45,239 --> 00:44:48,000 Speaker 1: and I would never tell anyone else, like, you've got 727 00:44:48,000 --> 00:44:51,719 Speaker 1: to quit Facebook. I might believe it really hard, but 728 00:44:51,800 --> 00:44:53,759 Speaker 1: I can't tell them that, because that's the way a 729 00:44:53,760 --> 00:44:55,640 Speaker 1: lot of people stay in touch with their friends and family. 730 00:44:55,680 --> 00:44:57,720 Speaker 1: A lot of people rely on Facebook 731 00:44:57,760 --> 00:45:01,440 Speaker 1: for their own businesses. I am in a luxurious position 732 00:45:01,480 --> 00:45:06,520 Speaker 1: where I can disengage, and, uh, I got a dog. 733 00:45:06,600 --> 00:45:10,279 Speaker 1: I'm okay. I might not talk to my friends anymore, but 734 00:45:10,320 --> 00:45:12,880 Speaker 1: I got a dog. Um, and I still got all 735 00:45:12,880 --> 00:45:15,200 Speaker 1: those dog videos on YouTube too, so really I'm living 736 00:45:15,239 --> 00:45:17,840 Speaker 1: it up. But yeah, I mean, like, 737 00:45:17,880 --> 00:45:20,160 Speaker 1: the thing is that this is a huge problem, 738 00:45:20,239 --> 00:45:23,080 Speaker 1: and like a lot of huge problems, there may not 739 00:45:23,280 --> 00:45:27,200 Speaker 1: be a simple solution. There may not be one that 740 00:45:27,400 --> 00:45:32,080 Speaker 1: is, uh, completely satisfying, and it may be really messy 741 00:45:32,160 --> 00:45:36,279 Speaker 1: to implement solutions that themselves could have unintended consequences that 742 00:45:36,320 --> 00:45:40,960 Speaker 1: we'll have to deal with later. Um, the important thing 743 00:45:41,040 --> 00:45:46,200 Speaker 1: is really acknowledging the problem, putting more effort into understanding 744 00:45:47,200 --> 00:45:50,680 Speaker 1: the scope and impact of that problem, and making sure 745 00:45:50,719 --> 00:45:54,319 Speaker 1: that our energy for solutions is directed in the 746 00:45:54,400 --> 00:45:57,560 Speaker 1: right place.
Because without really understanding the scope and the 747 00:45:58,160 --> 00:46:01,480 Speaker 1: nature of the issue, the best we can do is 748 00:46:01,560 --> 00:46:04,719 Speaker 1: try random solutions and hope they work. But with a 749 00:46:04,840 --> 00:46:08,719 Speaker 1: deeper understanding, you can craft a pathway that at 750 00:46:08,800 --> 00:46:11,920 Speaker 1: least has a better chance of making a positive impact. 751 00:46:12,480 --> 00:46:14,520 Speaker 1: That's kind of a lame way of saying it, 752 00:46:14,600 --> 00:46:17,560 Speaker 1: but the more I looked into this, the more I thought, 753 00:46:18,280 --> 00:46:21,839 Speaker 1: we just don't have a deep enough understanding, and it's 754 00:46:21,880 --> 00:46:25,319 Speaker 1: largely because we didn't take it seriously, like you were saying, 755 00:46:25,320 --> 00:46:30,160 Speaker 1: Bridget. I mean, before Brexit, people were aware that there 756 00:46:30,160 --> 00:46:33,960 Speaker 1: are lies posted on the internet, because people lie. You know, 757 00:46:34,000 --> 00:46:38,280 Speaker 1: wherever people are, we're gonna find falsehoods. But when Brexit happened, 758 00:46:39,160 --> 00:46:44,400 Speaker 1: and after all the fallout over accusations that the support 759 00:46:44,400 --> 00:46:49,799 Speaker 1: for Brexit was largely based off of unsupportable claims, that's 760 00:46:49,840 --> 00:46:53,759 Speaker 1: where it kind of started, the snowball effect of, wow, 761 00:46:54,280 --> 00:46:57,400 Speaker 1: we've really let this get to a place where we 762 00:46:57,440 --> 00:47:00,360 Speaker 1: don't have a handle on it, um, and we honestly 763 00:47:00,400 --> 00:47:03,040 Speaker 1: don't even know how bad it is, and we still 764 00:47:03,080 --> 00:47:06,640 Speaker 1: don't, five years later. So yeah, I mean, I feel 765 00:47:06,719 --> 00:47:10,440 Speaker 1: very similarly, but I am hopeful that we are finally 766 00:47:10,520 --> 00:47:13,680 Speaker 1: taking it seriously. I wish we had gotten here three 767 00:47:13,760 --> 00:47:16,560 Speaker 1: years ago, five years ago, ten years ago in terms of understanding 768 00:47:16,600 --> 00:47:18,839 Speaker 1: the impact that this has. But I'm glad that we're 769 00:47:18,840 --> 00:47:20,920 Speaker 1: here now, and I think what we need to do 770 00:47:20,960 --> 00:47:24,360 Speaker 1: now is have these honest conversations, put money behind the 771 00:47:24,480 --> 00:47:27,799 Speaker 1: research to actually understand what's going on, because I think 772 00:47:27,800 --> 00:47:29,799 Speaker 1: we're tackling it late. You know, I wish we had 773 00:47:29,800 --> 00:47:32,120 Speaker 1: gotten there earlier. But I'm glad we're here now. And 774 00:47:32,600 --> 00:47:35,520 Speaker 1: you know, you talked about how you deleted Facebook. I 775 00:47:35,560 --> 00:47:38,000 Speaker 1: still have Facebook. I need it for work. Um, but 776 00:47:38,040 --> 00:47:40,600 Speaker 1: I always say, you know, a lot of these issues 777 00:47:40,680 --> 00:47:44,680 Speaker 1: are big and systemic. You know, we're talking about your Mark Zuckerbergs, 778 00:47:44,800 --> 00:47:46,839 Speaker 1: we're talking about policy-level things, so it's very 779 00:47:46,880 --> 00:47:49,359 Speaker 1: difficult to feel as an individual like you have any 780 00:47:49,400 --> 00:47:53,279 Speaker 1: impact over that.
And that's true, but I think 781 00:47:53,320 --> 00:47:56,600 Speaker 1: we can all take small steps in our individual lives 782 00:47:56,640 --> 00:47:59,279 Speaker 1: to assess and be critical of the role that platforms 783 00:47:59,280 --> 00:48:00,840 Speaker 1: play in our own media diets and how we 784 00:48:00,920 --> 00:48:04,000 Speaker 1: use those platforms. And so, you know, I would also 785 00:48:04,080 --> 00:48:06,760 Speaker 1: never tell anybody to delete Facebook, especially in a pandemic 786 00:48:06,760 --> 00:48:08,920 Speaker 1: where you might not have the ability to see your 787 00:48:08,960 --> 00:48:10,839 Speaker 1: friends every day but you want to feel connected. But 788 00:48:11,200 --> 00:48:13,480 Speaker 1: maybe you can delete Facebook from your phone so you're 789 00:48:13,520 --> 00:48:17,120 Speaker 1: not carrying around a physical reminder of its presence on 790 00:48:17,200 --> 00:48:19,640 Speaker 1: your person at all times. Right? Maybe you only go 791 00:48:19,680 --> 00:48:22,600 Speaker 1: on through the browser on your laptop 792 00:48:22,680 --> 00:48:26,040 Speaker 1: or whatever, and you're deciding when you're gonna engage with Facebook. Right? 793 00:48:26,040 --> 00:48:28,680 Speaker 1: Maybe you're like, I have a new rule in my house: 794 00:48:28,760 --> 00:48:31,400 Speaker 1: no phones in the bedroom. Right? We 795 00:48:31,440 --> 00:48:34,680 Speaker 1: can all find ways of having an impact 796 00:48:34,760 --> 00:48:37,720 Speaker 1: on the way that Facebook plays out in our personal 797 00:48:37,760 --> 00:48:40,160 Speaker 1: media diets and how we use it in our personal lives. 798 00:48:40,200 --> 00:48:42,560 Speaker 1: And so while we may not be able to have 799 00:48:42,719 --> 00:48:46,160 Speaker 1: a huge impact on the grip that platforms have on 800 00:48:46,200 --> 00:48:48,640 Speaker 1: our democracy and our discourse, those are small things that 801 00:48:48,680 --> 00:48:50,719 Speaker 1: we can all do. You know, if you're gonna share 802 00:48:50,760 --> 00:48:52,960 Speaker 1: something on Facebook, read it first. You know, before you 803 00:48:53,000 --> 00:48:54,719 Speaker 1: comment on something, make sure that you read it. You know, 804 00:48:54,760 --> 00:48:59,560 Speaker 1: just little things, little steps. Agreed, agreed. I think also 805 00:49:00,120 --> 00:49:02,799 Speaker 1: we should point out the fact that you and I 806 00:49:02,840 --> 00:49:06,000 Speaker 1: also host shows where we get the opportunity to reach 807 00:49:06,000 --> 00:49:09,239 Speaker 1: out to our listeners and to suggest to them that 808 00:49:09,600 --> 00:49:13,880 Speaker 1: they incorporate qualities like critical thinking. I always partner critical 809 00:49:13,920 --> 00:49:16,560 Speaker 1: thinking with compassion. I think if you have one without 810 00:49:16,600 --> 00:49:19,080 Speaker 1: the other, things don't go so well. But when you've 811 00:49:19,160 --> 00:49:22,600 Speaker 1: got both, you don't go too wrong. You 812 00:49:22,600 --> 00:49:25,040 Speaker 1: can't go wrong. Yeah, you really just got to employ 813 00:49:25,120 --> 00:49:28,879 Speaker 1: both of those things. And I am so thankful that 814 00:49:28,920 --> 00:49:31,799 Speaker 1: you joined me for this discussion, and I hope that 815 00:49:31,840 --> 00:49:35,759 Speaker 1: our listeners really get an understanding.
Like, the reason we 816 00:49:35,840 --> 00:49:40,880 Speaker 1: talk about this is because the stakes are very high, 817 00:49:40,920 --> 00:49:45,480 Speaker 1: and clearly, if you recognize that there are friends 818 00:49:45,600 --> 00:49:49,680 Speaker 1: or family of yours who appear to be maybe going 819 00:49:49,840 --> 00:49:54,120 Speaker 1: down a path, you should seek out resources to help 820 00:49:54,200 --> 00:49:57,680 Speaker 1: you and perhaps help them. It's a very delicate thing. 821 00:49:58,040 --> 00:50:00,560 Speaker 1: I certainly am not an expert to tell you how 822 00:50:00,640 --> 00:50:03,440 Speaker 1: to handle a situation, and every situation is different, but 823 00:50:04,640 --> 00:50:08,399 Speaker 1: I'm seeing this real impact on people I know. 824 00:50:08,520 --> 00:50:11,239 Speaker 1: There was a person that I worked with years ago 825 00:50:11,800 --> 00:50:15,520 Speaker 1: in a theatrical production, and several years ago I saw 826 00:50:15,600 --> 00:50:18,560 Speaker 1: him making some statements that kind of had him going 827 00:50:18,560 --> 00:50:23,719 Speaker 1: down the men's rights pathway, and, uh, that was 828 00:50:24,000 --> 00:50:26,359 Speaker 1: a big warning sign at the beginning, but it has 829 00:50:26,760 --> 00:50:29,359 Speaker 1: since gotten worse, and I just wish that I had 830 00:50:29,640 --> 00:50:36,000 Speaker 1: recognized things earlier and perhaps been a positive influence 831 00:50:36,040 --> 00:50:39,839 Speaker 1: on his life too, and not have him go down 832 00:50:39,840 --> 00:50:44,279 Speaker 1: that pathway quite so wholeheartedly. Um, people are their own people. 833 00:50:44,320 --> 00:50:46,680 Speaker 1: They'll make their own decisions, but we can always be 834 00:50:46,760 --> 00:50:50,600 Speaker 1: supportive and helpful in certain situations. And one way you 835 00:50:50,880 --> 00:50:54,040 Speaker 1: can be supportive and helpful for yourself and for others 836 00:50:54,520 --> 00:50:57,360 Speaker 1: is to subscribe to There Are No Girls on the Internet, 837 00:50:57,400 --> 00:51:00,319 Speaker 1: because it's an incredible show. I don't just say that 838 00:51:00,360 --> 00:51:03,320 Speaker 1: because I am the executive producer of that show. I 839 00:51:03,400 --> 00:51:05,640 Speaker 1: say it because, even if I 840 00:51:05,719 --> 00:51:10,799 Speaker 1: had no involvement whatsoever, it would still be an amazing show. In fact, 841 00:51:10,840 --> 00:51:13,480 Speaker 1: I should point out I have a very 842 00:51:13,560 --> 00:51:16,080 Speaker 1: light touch on that show. That show is amazing because 843 00:51:16,120 --> 00:51:20,560 Speaker 1: of Bridget, because of Tari, and because of the incredible 844 00:51:20,560 --> 00:51:23,319 Speaker 1: amount of work you put into it. And, uh, 845 00:51:23,520 --> 00:51:25,960 Speaker 1: definitely go and check out Bridget's show, There Are No 846 00:51:26,080 --> 00:51:29,359 Speaker 1: Girls on the Internet. Um, see what all 847 00:51:29,400 --> 00:51:31,520 Speaker 1: the accolades are about, because this is a 848 00:51:31,560 --> 00:51:34,080 Speaker 1: show that's received quite a few of them. Oh, I mean, 849 00:51:34,120 --> 00:51:35,880 Speaker 1: I couldn't do it without you, Jonathan, 850 00:51:35,880 --> 00:51:38,920 Speaker 1: and Tari. This means so much coming from you, you know, 851 00:51:39,120 --> 00:51:44,200 Speaker 1: the tech podcast guru.
I got a poster of myself 852 00:51:44,200 --> 00:51:47,880 Speaker 1: in the background. That's how important I am. I 853 00:51:47,920 --> 00:51:50,399 Speaker 1: don't know if you noticed that, Bridget, but that's 854 00:51:50,440 --> 00:51:53,200 Speaker 1: me back there. If you ever wondered who's the 855 00:51:53,239 --> 00:51:55,399 Speaker 1: kind of guy who hangs a poster of himself up, 856 00:51:55,440 --> 00:51:58,280 Speaker 1: well, it's me and apparently Donald Trump. I guess 857 00:51:58,360 --> 00:52:01,240 Speaker 1: I'm not in good company, 858 00:52:01,360 --> 00:52:04,920 Speaker 1: but I can't deny it. It's on the door. Uh, 859 00:52:05,120 --> 00:52:09,000 Speaker 1: thank you guys for listening. Make sure, again, you subscribe 860 00:52:09,000 --> 00:52:10,680 Speaker 1: to There Are No Girls on the Internet. Go check 861 00:52:10,719 --> 00:52:13,160 Speaker 1: that out. Look at the list of shows, because there 862 00:52:13,160 --> 00:52:17,160 Speaker 1: have been some really incredible episodes. You've had some amazing 863 00:52:17,239 --> 00:52:20,239 Speaker 1: guests on that show, and, uh, it just makes me 864 00:52:20,280 --> 00:52:23,120 Speaker 1: want to become a better interviewer. So I thank you 865 00:52:23,200 --> 00:52:26,160 Speaker 1: for that too, because that's been very inspirational to me. 866 00:52:26,480 --> 00:52:29,280 Speaker 1: And if you guys have any suggestions for future topics 867 00:52:29,360 --> 00:52:32,040 Speaker 1: on Tech Stuff, you can reach out to me on Twitter. 868 00:52:32,360 --> 00:52:35,120 Speaker 1: Yes, I'm still active there, and the handle to 869 00:52:35,239 --> 00:52:38,319 Speaker 1: use is TechStuff HSW, and I'll talk 870 00:52:38,360 --> 00:52:45,960 Speaker 1: to you again really soon. Tech Stuff is an iHeartRadio 871 00:52:46,080 --> 00:52:49,560 Speaker 1: production. For more podcasts from iHeartRadio, 872 00:52:49,880 --> 00:52:53,080 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever 873 00:52:53,160 --> 00:52:54,680 Speaker 1: you listen to your favorite shows.