Speaker 1: Welcome to Tech Stuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and How Stuff Works, and I love all things tech... again, kind of; this is another one of those episodes where we have to put in a lot of qualifiers. So in late September twenty nineteen, the University of Oxford's Computational Propaganda Research Project released the two thousand nineteen Global Inventory of Organised Social Media Manipulation. And apart from misspelling the word organized, those wacky Oxfordians turned the Z into an S, so it's an easy mistake. But otherwise it's a great report. And yeah, I'm making a joke about the differences in spelling conventions between America and the UK because, in all honesty, this report is pretty scary and I need to get my goofs in while I can. Now, I say the report is scary, but it's also not exactly surprising. We've heard plenty of reports all around the world over the last few years about governments and political parties using social media as a way to spread misinformation in an effort to manipulate people to do whatever it is the party in question wants them to do, or in some cases to not do something, depending on the circumstances. But the scale of the issue is a truly global one, and it's only becoming a bigger problem as time goes on. Now, it's also important to note how the researchers generated this report. This was not some sort of deep undercover mission in which dozens of security experts infiltrated various countries to monitor social media activity. Instead, the researchers relied heavily on published accounts of governments and political parties manipulating social media for the purposes of propaganda, typically from a lot of news outlets, and they created a process in which they would score news sources on a scale of one to three.
Speaker 1: One is a pretty reliable and reputable source of information, something that has a nice long-standing reputation for being a foundational source for news, and three is on the opposite end of that: three would be a partisan, biased, or unreliable source. Articles that ranked a three were removed from the whole set of articles before the next step in the process began, and that next step was to review all of those articles and then go into a mode called secondary literature review, in which researchers would focus on specific countries and do a deeper dive into the news stories about manipulation on social media sites. This included further research that collected stuff like research papers, government reports, and other publicly available information. Then the researchers prepared country case studies for most of the countries they covered in their initial search. The case studies laid out specific instances and strategies that were found in the respective countries. The researchers then called upon experts to review the case studies, and the experts were there to evaluate the reliability of the data and also whether or not the case studies accurately reflected the information that was available. So not just are these facts accurate, but is the way the report presents the facts itself accurate? Because you could have some accurate stats and still report on them in a way that is not, you know, entirely representational of the truth. So the experts were sort of the peer review process for this report, and you could argue that the report is itself sort of a meta-study, that is, a study that pulls information from many other established sources, as opposed to an original study that does new and original research. This one was dependent upon stuff that had already been written about these various countries.
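To make that scoring-and-filtering step a little more concrete, here is a minimal, purely illustrative sketch: drop sources scored three, then group what's left by country for the secondary literature review. The report does not publish any code, so the data structure and field names below are hypothetical.

```python
# Illustrative only: the Oxford report does not publish code.
# Article records and field names are hypothetical.
from collections import defaultdict

articles = [
    {"country": "Brazil", "source": "Major Daily",   "score": 1, "title": "..."},
    {"country": "Brazil", "source": "Partisan Blog", "score": 3, "title": "..."},
    {"country": "India",  "source": "Wire Service",  "score": 2, "title": "..."},
]

# Step 1: remove anything scored 3 (partisan, biased, or unreliable).
credible = [a for a in articles if a["score"] < 3]

# Step 2: group the remaining articles by country for the deeper,
# country-by-country secondary literature review.
by_country = defaultdict(list)
for article in credible:
    by_country[article["country"]].append(article)

for country, items in by_country.items():
    print(country, len(items), "articles kept for secondary review")
```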
Speaker 1: One thing I think the report does particularly well is that the researchers acknowledged what makes this manipulation possible in the first place. The amount of information we have access to at any given time is truly monumental. Let's think back a few decades and imagine what it would be like before the eras of radio and television, back when you would get your news from print. You would get newspapers or journals or magazines or other periodicals. That was pretty much the only way you were going to get any news beyond what was just going on in your immediate neighborhood. Radio and TV brought with them the ability to spread news faster and with wider distribution. Then came cable and the invention of twenty-four-hour cable news, and now we had a whole new problem. Suddenly we had to find a way to fill up twenty-four hours of news time every single day. Before, newspapers, radio programs, and television programs would curate the most important news stories because, you know, you had a limited amount of space and/or time. But with twenty-four hours, suddenly time was not as big of a problem. In fact, it was the opposite problem: how do you fill it all up? Then we get to the Internet, which, like the twenty-four-hour cable news channels, is also always on. And much of the business that is on the Internet relies on doing a few things, and they're all related. And this is not going to come as news to any of you, but I want to lay it out. Businesses on the internet, if that's in fact where your business is, are really dependent upon generating revenue from the internet itself. You want to attract as many sets of eyeballs as possible, so get as many people to visit your website as you possibly can. You want to keep those eyeballs on the company's web pages as much as possible, so you don't want people bouncing off and leaving. And as a consequence, you also want to serve up as many ads to those eyeballs as possible, because that's generally how most web-based businesses are generating their revenue.
Speaker 1: You know, obviously you've got other businesses, like retail, that use the web as a portal to shop the inventory of the retail store. But for anything that's specifically dependent upon the web itself for revenue, that's generally how this works, unless you're doing a subscription model, and all of these things I just mentioned contribute to ad revenue. So yeah, I know, again I'm starting with the obvious, in that if you didn't know this already, you probably had noticed it at the very least. But this is why you find so many web pages that take kind of irritating approaches, like they'll create a gallery or slideshow approach that forces you to click on the next button to generate another page view. So rather than just laying it all out in one page where you can scroll through and read everything, you're clicking over and over again. Well, those clicks count as page views, which help the company that makes the web page market it to advertisers. Or the websites that are designed for mobile that just have super long articles that scroll and scroll and scroll: you might get a picture, a line of text, and then an ad, and then you'll get another picture and a line of text and another ad, and then you finally get around to finding the point where the article actually tells you whatever the headline was claiming at the top. And that's not even really touching on the whole concept of clickbait, in which a title and thumbnail image are carefully curated to get as many clicks as possible, sometimes with no regard as to whether or not the final web page actually reflects whatever the original promise was of the title and the thumbnail. And one of the consequences of all this data around us, coupled with the various methods companies are using to get our attention, is that we don't tend to spend a lot of time actually, you know, attending to anything. We can't. Instead, our attention drifts over data point after data point.
Speaker 1: Meanwhile, we've got a limitation on how much we trust information sources, so skillful manipulators take all of this into account when designing an approach to manipulate the public. The executive summary of the report lays out the scale of the problem right away. According to the researchers, their earlier work uncovered instances of governments or political parties using social media manipulation in twenty-eight countries. A year later, the number of countries had grown to forty-eight, and this year, twenty nineteen, the number of countries in which at least one political party, if not the government itself, is using social media for manipulation purposes is at seventy. The researchers also point out this doesn't necessarily mean the number of countries with governmental agencies using misinformation online is doubling year over year. Part of the increase may be due not to more countries doing this, but rather to our growing awareness of and ability to detect social manipulation. So it's not just that more people are using these techniques, though that seems like a pretty safe bet, but also that we're getting better at detecting them, so in places where it may once have been overlooked, we now know about it. Again, our tool set has gotten better, and that's also contributing to this growing number. Listing all seventy of those countries would be pretty tedious, but pretty much everyone you would expect to be there is there. That includes the United States and the United Kingdom. It also includes Russia and China. Other countries on the list include India, Greece, the Czech Republic, Nigeria, North Korea, Pakistan, Brazil, Cambodia, Saudi Arabia, South Africa, Spain, and many, many more. So the researchers were casting a broad net around the globe. In this study, much of the focus has been on how governments use social platforms to manipulate things within their own borders, so, in other words, domestic concerns.
Speaker 1: But the researchers also found some reports about foreign influence operations, or attempts to manipulate people who are living in other countries entirely. Now, this focus was narrower than the overall domestic focus, because it's a challenge to get a handle on how frequently this foreign influence operation stuff is happening: platforms like Twitter and Facebook have either limited their investigations into such things, or the reporting of any findings they've had has been pretty limited. For example, those platforms have, at least in their reporting, attributed all of the campaigns they've acted against to ones originating in just seven countries, those seven being China, India, Iran, Pakistan, Russia, Saudi Arabia, and Venezuela. And to be clear, we're talking about just the stuff the researchers were able to find. I think it's safe to say there are probably instances of this that have yet to be uncovered, and some countries they didn't have time to look into thoroughly. There are roughly one hundred ninety-five countries in the world, or one hundred ninety-three if you only go with countries recognized by the United Nations. So this is a really big deal. Seventy out of fewer than two hundred, that's significant. Now, in this episode, I'm going to go through the report with you guys to talk about what they found and what it all means, and maybe think about what we might do to protect ourselves and those around us from being manipulated. Bad news: there's not a whole lot we can do on an individual basis, but we'll get there. Now, part of what makes this all challenging is that the Internet isn't exactly the same all across the world. As you guys know, I live in the United States, and despite a few attempts to shut down access to servers that were hosting pirated media, because corporations have enormous sway in the United States, internet access in the States is largely unfettered. So essentially, if it's out there, you can access it in the United States. This is an overgeneralization, but you get the idea.
Speaker 1: There are other countries that restrict, to one degree or another, that sort of access. In China, for example, there is the famous Great Firewall of China. The term describes not just the technology used, but the political policies in China that restrict citizens' access to the Internet. State-approved sites and services are fine; they can access those. The Chinese government subjects other stuff to heavy censorship or just blocks it outright. Controlling information is one way to maintain control over a population, and China is one of the most obvious examples of that happening today. Another one would be North Korea. One of the big developments in twenty nineteen saw China get more involved in foreign influence operations. Previously, nearly all of China's propaganda efforts were confined to China itself and its strict control over its citizens' Internet access. But in twenty nineteen, with the rise of public demonstrations and protests in Hong Kong, the country began to initiate misinformation campaigns on social media to attempt to undermine public support for the Hong Kong protesters, casting them as lawless and violent. The researchers state that there's no reason to assume China will stop using social media in an effort to shape the public understanding of things that are of importance to the country, so we'll probably see that country continue its foreign influence operations. Now, a lot of countries fall between these extremes I've just laid out, and you could also argue that neither the US nor China is truly on the very opposite end of the spectrum, but that's the way they're often portrayed. For the purposes of the report, the researchers decided to focus only on cases in which there was a clear mandate from a government or political party to initiate the manipulation.
Speaker 1: This is important to distinguish, because there may be many cases in which hackers, activists, companies, subcultures, or other groups of people act on their own without the explicit permission or mandate of a government or political party. In cases like those, the ideologies and goals of the group and the government just happen to align, but there's no explicit direction from the state to commit any acts. You may remember that in the wake of the massive two thousand fourteen cyber attack on Sony, which involved the theft of a ton of confidential information within the company, a hacker group called the Guardians of Peace claimed responsibility. And while the hacker group's goals were in line with those of North Korea as a whole, the country of North Korea maintained it had not directed any hackers to go after Sony. If that were the case, which is still a matter of dispute, then it would be an example of what I was saying earlier. So the report would focus only on stories that link back to official government or political agencies, and not on campaigns from apparently independent groups that just happen to align with those governments. Okay, so those are the basics. When we come back, I'll talk more about specifics in the report, but first let's take a quick break.

Speaker 1: According to the report, authoritarian governments and political parties use social manipulation to achieve one or more of three general outcomes. The first is to suppress fundamental human rights, which I think we can all agree is pretty horrifying. The second is to discredit political opponents, and the third is to drown out any dissenting opinions. Now, those last two points show that these manipulators are relying on a tool set used by Internet trolls. In general, trolls will use all sorts of manipulative tactics to dismiss anyone they target, and rely on strategies to start flame wars or other distractions to keep the targeted party from being able to make any sort of impact.
Speaker 1: So, in some ways, the techniques being used in social manipulation are already incredibly familiar to us. It's just that instead of popping up on a message board centered around, I don't know, Dancing with the Stars, it's a government or political party trying to establish control over a population. However, it does get more complicated than just riling up folks on the internet. The report's introduction starts off with this sentence, and I quote: "Around the world, government actors are using social media to manufacture consensus, automate suppression, and undermine trust in the liberal international order," end quote. Now, some of that still falls in the troll wheelhouse. Manufacturing consensus, for example: this is when a party attempts to make it seem as though the majority of people around you all believe a certain philosophy or course of action is the right one. It's the whole concept of go with the crowd, right? Or, if you're being particularly cynical, it's likening people to sheep or cattle. People tend to go along with what others are doing because to do otherwise, to go outside of that, is to invite scrutiny or criticism, and a lot of us just prefer to avoid that, so rather than make waves, we'll go with the flow. Trolls do this online through stuff like sock puppet accounts. That's when a troll makes two or more accounts for an online discussion. The troll will use their primary account to make whatever statement they want, and then the sock puppets are used to help the troll achieve that goal, typically by adding support in some way. So the troll is controlling all three or four, however many of these accounts there are, and using the sock puppet accounts to add support and credence to whatever the troll is saying.
Speaker 1: So to an outside viewer, it's as if the troll has said something and other people are now chiming in to support that something. But in reality, it's just the troll, or sometimes a group of trolls, manufacturing that consensus, when in fact no such consensus exists within the group at large. Well, countries and political parties are doing the same thing, but on a much larger scale. Governments do this by employing, either directly or otherwise, agents to do the dirty work, and the report refers to these agents as cyber troops. And some of those cyber troops are a little extra cyber. That is, some of the agents working toward the goals of these various governments and political parties are bots. They are programs that generate automated responses with the goal of either suppressing messages in opposition to the party's goals or elevating and escalating language that supports those goals. The researchers identify three types of fake accounts in the report: human, bot, and then cyborg. Now, out of the seventy countries that the researchers looked at, fifty were found using fake accounts run by bots in some way, mostly to spread certain messages while drowning out dissent. Interestingly, human-run accounts were even more widespread, with sixty out of the seventy countries employing them in some way to run fake accounts. Typically, the people running these fake accounts would engage with others, people who have real accounts, like the real users of these sites, by leaving comments on posts or sending private messages, and otherwise attempting to start up conversations aligned with the overall goals of the communication strategy. The report also looked into instances of hacked accounts, in which a person's online account would become compromised and then pulled out of their control. Then a bot or a human agent or the hybrid cyborg could post as that person. This strategy accomplishes two things at once.
Speaker 1: It can silence someone who otherwise might speak dissent against a government or political party, and it can appear to lend credibility to that government or political party by having a quote-unquote real person add to the conversation in a way that furthers the political goals. Complicating matters is that in some countries, governments encourage citizens to engage in spreading propaganda and silencing dissenting voices on behalf of the government. This would be a strategy where you say it's all part of being a good, patriotic citizen of that country. In those cases, you're not talking about compromised or fake accounts. Instead, you're talking about indoctrinated people using their own accounts to support their respective governments. And because we're starting to see some platforms actually make moves against bots and other fake accounts, this could become a more common practice in the future. It's a lot harder for social platforms to remove quote-unquote real accounts that happen to spread propaganda without running the risk of being labeled as partisan or advocating for censorship. As for the messaging, the research team identified five general types of messages and their intended effects. Number one is straightforward pro-government propaganda. These are messages that praise whatever power is behind the manipulation: just, you know, yay America, or yay our glorious leader, or go Sox. Next are messages designed to discredit or defame political dissidents and opponents, which might include a mixture of truth and misinformation about the target. So this is where you have identified some potential opponent to the powers that be, and you use every tool in your tool chest to make that person seem like the worst human being in the world, so that no one should ever support him or her. The third approach is to use misinformation to distract from important issues.
Speaker 1: In the United States, you'll hear a lot of people complain about this sort of activity, in particular cases in which, let's say, a government official or an agency issues an outrageous or controversial message, and people will say it's an effort to pull the focus off of matters of more critical importance. I'm not saying this actually happens all the time in the United States; I'm instead saying people talk about it happening all the time in the United States, so that when someone does make such a statement, one of the frequent responses is: this is just to pull focus away from X, you know, and it's not really a genuine attempt to start a real conversation about this other thing. The fourth type of message is one meant specifically to polarize and divide a population, because if you divide the people, if you push them further to extremes, it means the people will not unify on anything. They're less powerful divided than they would be unified. So while that can lead to other major social problems as you push people to political extremes, you've also really decreased their ability to organize on any meaningful level, and they can't really counteract what the government is doing. The fifth type of message, the final one, is a direct attack on dissidents themselves, made in an effort to drown out their voices through any means necessary. The researchers also pointed out that authoritarian regimes use social media propaganda in conjunction with other methods of intimidation, including surveillance and threats of violence. Frequent targets would be political opponents, journalists, and sometimes members of the population, or at least large segments of the population as a whole. If such a government can intimidate those who would otherwise speak out against it while simultaneously manipulating the conversation on social media to be in support of that same government, it's in a stronger position to maintain power.
Speaker 1: The report also details the actual communication strategies governments and political parties are using. So we've talked about the types of messages that the organizations propagate, but how are they propagating those messages? What are the strategies? The report identifies five key ways. Number one is creating outright disinformation. This can include fake news articles, fake videos, that kind of thing, and as technology becomes more sophisticated, it becomes increasingly challenging for the average person to determine if something they encounter is genuine or has been faked. In some cases, all it takes is a few edits to remove some context, and suddenly a message can have a very different meaning than the originally intended one. So you can take video of a politician making a speech, for example, trim a little bit off the beginning, a little bit off the end, and you can make it sound like that politician is saying the very opposite of what they actually intend. In other cases, such as with deepfakes, there's the opportunity to manufacture an entirely fake video of a person, and this is only going to get harder to spot as we go on. This is also the most common communication strategy. It's employed by fifty-two of the countries that the team researched, though in many cases we're talking about more modest examples, such as fake news sites, like an article, as opposed to a fake video. The second strategy involves mass reporting accounts or content as being against the terms of service of various platforms or organizations. An example of this would be a concentrated effort to remove a person from Twitter by coordinating a big push to report that user to Twitter with a claim that the user had violated Twitter's policies. So it's sort of like a brute force attack.
Speaker 1: You overwhelm a provider or a platform with requests saying this person, this page, this entity is breaking the rules and has to be banned, and it's all in an effort to get that person or that thing banned. We've seen instances of this recently, with people going after figures they don't like and sending complaints to that figure's employer. I'm thinking specifically of James Gunn and Disney: James Gunn was at least temporarily removed from being able to direct movies like Guardians of the Galaxy because of things he had done in his past that were, you know, legit not cool. But he had since apologized, acknowledged them, and pledged to do better, long before anyone brought this up. Still, it was enough to get him removed from the project for a while, and so we have seen that this is an effective tactic. The third strategy is to use data to target specific groups of people with messaging that the attackers tailor to that group, because it turns out that if you tell a group of people what they want to hear, that works great. So if you identify who your target audience is and what they want to hear, and then you convey your message in a way that falls in line with that, you get more success. Fourth is the awful practice of trolling and doxing. This is all about silencing people through intimidation, like I mentioned earlier, and it can include revealing a person's real-world address, phone number, and other personal information, as well as information about people connected to that person. It's kind of the mafia approach of putting pressure on a person by intimidating and threatening them. And fifth is actually the easiest strategy: it involves amplifying messages that are already out there. In this case, a government or political party can just add resources to boost signals that are already in line with the organization's goals. They don't have to create the content themselves.
Speaker 1: They can just create a whole bunch of, let's say, fake accounts and retweet a message that happens to fall in line with what they believe. Now, as you can imagine, the scope of these efforts varies around the world. In some countries, the researchers saw activities centered around specific events such as elections, and then it would die down. In other countries, it was more of an ongoing effort that the government would perpetually support. Likewise, some countries spend a relatively modest amount funding cyber troops, while others might dedicate many millions of dollars to a single campaign. One of the larger efforts cited by the researchers was the case of Cambridge Analytica, which I covered in a past episode of Tech Stuff, so I recommend you go and hunt that one down and listen to hear how that unfolded, because it was an enormous mess. All right, when we come back, I'll go into a little bit more about what was in the report and the ramifications we have to consider, but first let's take another quick break.

Speaker 1: The researchers created a four-point scale to describe the size and capability of cyber troop forces around the world. On the low end of the scale is the designation of minimal cyber troop teams. These are efforts that haven't been around for very long, or that only manifest temporarily around those political events I mentioned earlier, like elections. They tend to be limited in what they can accomplish, and as a result, they typically will focus on a single social media platform to maximize their results, and they also focus exclusively on domestic misinformation campaigns and not foreign influence operations. Countries that have minimal cyber troop teams would be countries like Argentina, Australia, Croatia, Greece, South Korea, and a few others. That's the lowest end of the scale. On the opposite, highest end is high cyber troop capacity. These countries dedicate a large budget to funding online propaganda campaigns.
Speaker 1: They maintain a large, permanent staff of people in order to do that. They not only execute comprehensive campaigns, they also research ways to do it more effectively, so they're always working to improve. The staff works full time, not just in election years or around other political events, and they focus on both domestic and foreign operations. Countries in this category include China, Israel, Iran, Russia, Saudi Arabia, and yes, the United States. Now, I didn't include the other two categories because I'm sure you can all extrapolate that they fit between the lowest level and the highest level of cyber troop capacity. So really, it's just stages of capability and how much these countries are spending on those efforts. Making matters more complicated, as countries develop more effective ways to leverage social media to spread misinformation, they're also spreading those techniques around the world. The researchers specifically call out a case in which Russian operatives taught military officials in Myanmar how to manipulate people through social media, so we're seeing the skills being shared across territorial borders. So which online platforms are the most important for people who want to spread propaganda? Well, it should come as no surprise that Facebook leads the pack. Now, I'm not saying that because I think Facebook has a wretched track record when it comes to dealing with misinformation, although that happens to be my opinion; that's not why I'm saying this. I'm saying it because Facebook is just so darned popular. Analysts estimate that two point four one billion people use Facebook. The world's population is approximately seven point five three billion people, so about a third of all the people in the world are using Facebook. So if you want to get a message out there, you have to go where the people are, and that happens to be Facebook. It's not necessarily the case that Facebook is less effective at policing these things than other platforms are. Instead, it's more that it's a target-rich environment.
Speaker 1: It's where the people happen to be. The researchers state that as other platforms grow in use, particularly for the purposes of political discourse, they will no doubt become targets for cyber troops wishing to spread propaganda in the future. Another thing working against Facebook is that it's pretty easy to figure out how Facebook works, at least at a very high level. The entire platform is built around the concept of engagement. Facebook makes money when people interact with Facebook, so posts that inspire more interaction with the platform, whether that's in the form of comments, sharing a post, or sending likes or whatever (these things, by the way, are not weighted equally, but they all count), end up getting more visibility thanks to Facebook's algorithms. If a particular post is doing well, Facebook is more likely to show it to more people because it's already proven to drive engagement, and engagement is how Facebook makes money. Essentially, what it's doing is selling your time to advertisers, so the more time you spend on Facebook, the more money it's going to make. If you know that as someone who's trying to run a propaganda campaign, you can start to build posts that drive that kind of engagement in various ways, or you might even game the system a little bit by using some state-backed accounts, some fake accounts, to boost the signal. You post something, you get a bunch of people sharing it and liking it, maybe some of those people are real, maybe a lot of them are fake accounts, and then you try to make it go viral from there. If it's inflammatory, all the better, because people will either want to share it because it has affirmed a belief they hold that other people find wrong, or they'll be so upset at how terrible the statement is that they share it to let other people know: hey, do you see how horrible this is? Either way, the message keeps on spreading.
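As a rough illustration of the dynamic described above, here is a toy sketch of engagement-weighted ranking. Facebook's actual ranking system is proprietary and far more complex; the weights, fields, and example posts below are made up purely to show the general idea that posts which already drive interaction get surfaced to more people.

```python
# Toy illustration only: Facebook's real ranking system is proprietary.
# Weights and fields below are invented for the example.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    comments: int
    shares: int
    likes: int

def engagement_score(post: Post) -> float:
    # Different interaction types carry different (made-up) weights:
    # comments and shares signal more engagement than likes.
    return 3.0 * post.comments + 2.0 * post.shares + 1.0 * post.likes

posts = [
    Post("Calm policy explainer", comments=4, shares=2, likes=50),
    Post("Inflammatory hot take", comments=120, shares=300, likes=80),
]

# Higher-scoring posts get shown to more people, which is exactly the
# feedback loop a propaganda campaign tries to exploit by seeding
# fake likes, shares, and comments.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):8.1f}  {post.title}")
```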
Other platforms that 557 00:35:20,320 --> 00:35:25,839 Speaker 1: the researchers cited included Twitter, WhatsApp, YouTube, and Instagram, and 558 00:35:25,880 --> 00:35:29,160 Speaker 1: of course others will grow in importance over time. In fact, 559 00:35:29,200 --> 00:35:34,360 Speaker 1: I've already heard about TikTok being another platform that merits 560 00:35:34,440 --> 00:35:37,280 Speaker 1: special attention in the near future, if not right darn 561 00:35:37,360 --> 00:35:40,799 Speaker 1: now. Now, on the one hand, it is distressing to 562 00:35:40,880 --> 00:35:43,720 Speaker 1: think that a tool like the Internet, which has often 563 00:35:43,760 --> 00:35:46,719 Speaker 1: been seen as a means to facilitate communication and the 564 00:35:46,800 --> 00:35:50,839 Speaker 1: sharing of ideas and ideals in a positive sense, has 565 00:35:50,880 --> 00:35:54,839 Speaker 1: been twisted to spread misinformation in various ways to make 566 00:35:54,840 --> 00:35:58,040 Speaker 1: people behave the way some government or some political party 567 00:35:58,120 --> 00:36:01,640 Speaker 1: wants those people to behave. On the other hand, this 568 00:36:01,680 --> 00:36:07,000 Speaker 1: isn't new. Propaganda is an old, old idea. Social media 569 00:36:07,080 --> 00:36:10,759 Speaker 1: didn't make propaganda possible, because those ideas and approaches have 570 00:36:10,800 --> 00:36:13,239 Speaker 1: been around for centuries, but it has made it much 571 00:36:13,280 --> 00:36:17,280 Speaker 1: more efficient and scalable than it ever was before. Also, 572 00:36:17,320 --> 00:36:20,840 Speaker 1: you can tailor it to a level that you couldn't before. 573 00:36:21,560 --> 00:36:24,200 Speaker 1: And it's a bit ironic that the platforms that were 574 00:36:24,200 --> 00:36:28,080 Speaker 1: ostensibly designed to let us connect with friends online and 575 00:36:28,120 --> 00:36:31,759 Speaker 1: make new ones are simultaneously the tools of groups that 576 00:36:31,800 --> 00:36:36,600 Speaker 1: are dedicated to driving deeper divides within populations. The very 577 00:36:36,600 --> 00:36:38,760 Speaker 1: things that are supposed to bring us together are pushing 578 00:36:38,840 --> 00:36:42,319 Speaker 1: us further apart. That a tool that is supposedly meant 579 00:36:42,360 --> 00:36:44,640 Speaker 1: to allow for communication can also be a tool that 580 00:36:44,760 --> 00:36:49,120 Speaker 1: suppresses it is incredibly ironic to me, and again, that 581 00:36:49,200 --> 00:36:53,400 Speaker 1: wasn't necessarily the intent of the people who made the platforms. 582 00:36:53,440 --> 00:36:56,960 Speaker 1: But because people work the way they do, and because 583 00:36:57,160 --> 00:37:01,600 Speaker 1: people who are spreading misinformation know how to leverage these platforms, 584 00:37:01,640 --> 00:37:04,480 Speaker 1: that's what's happening. And there's been a lot of pressure, 585 00:37:04,600 --> 00:37:08,399 Speaker 1: particularly on Facebook and Twitter, to do something about this.
586 00:37:08,800 --> 00:37:12,360 Speaker 1: A few weeks before I recorded this episode, Twitter announced 587 00:37:12,360 --> 00:37:15,760 Speaker 1: it had deleted nearly a thousand accounts that were linked 588 00:37:15,760 --> 00:37:21,160 Speaker 1: to, quote, significant state backed information operations, end quote. That 589 00:37:21,200 --> 00:37:24,120 Speaker 1: was in relation to the protest demonstrations in Hong Kong, 590 00:37:24,600 --> 00:37:28,360 Speaker 1: and the state in question at this point was China. 591 00:37:28,480 --> 00:37:30,960 Speaker 1: According to Twitter, the purpose of those accounts was to 592 00:37:31,040 --> 00:37:36,040 Speaker 1: divide Hong Kong, diminishing the support for the protests. Likewise, Facebook 593 00:37:36,080 --> 00:37:39,480 Speaker 1: removed a few users and groups that it had identified 594 00:37:39,520 --> 00:37:42,200 Speaker 1: as being part of a state backed effort to undermine 595 00:37:42,239 --> 00:37:45,920 Speaker 1: the Hong Kong protesters. And because both Facebook and Twitter 596 00:37:46,000 --> 00:37:49,960 Speaker 1: are blocked in China due to the aforementioned Great Firewall 597 00:37:50,040 --> 00:37:52,960 Speaker 1: of China, the implication here is that the accounts were 598 00:37:52,960 --> 00:37:56,520 Speaker 1: attempting to shape the international perception of the story, because 599 00:37:56,560 --> 00:37:58,480 Speaker 1: people in China wouldn't be able to see it. It 600 00:37:58,560 --> 00:38:02,880 Speaker 1: also indicates that yes, this was state backed, because no 601 00:38:02,960 --> 00:38:05,719 Speaker 1: one in China would be able to access those platforms 602 00:38:06,040 --> 00:38:11,880 Speaker 1: without the permission and cooperation of the Chinese government itself; 603 00:38:11,960 --> 00:38:15,880 Speaker 1: otherwise they'd just be blocked off. Over the summer of twenty nineteen, 604 00:38:16,239 --> 00:38:20,200 Speaker 1: the BBC organized a Trusted News Summit in an effort 605 00:38:20,239 --> 00:38:23,600 Speaker 1: to devise a strategy to combat the spread of disinformation. 606 00:38:23,920 --> 00:38:27,400 Speaker 1: The company reached out to major platforms like Google, Facebook, 607 00:38:27,440 --> 00:38:30,759 Speaker 1: and Twitter to come up with a strategy to minimize 608 00:38:30,800 --> 00:38:33,560 Speaker 1: the spread of false information and to make sure that 609 00:38:33,680 --> 00:38:37,440 Speaker 1: reputable reporting could rise to the top. Some of the 610 00:38:37,480 --> 00:38:40,600 Speaker 1: solutions the group came up with included the formation of 611 00:38:40,640 --> 00:38:43,880 Speaker 1: an early warning system in which one company would quickly 612 00:38:43,880 --> 00:38:47,040 Speaker 1: reach out to the others in the event it identified 613 00:38:47,080 --> 00:38:51,319 Speaker 1: a misinformation campaign. So if Twitter got wind of an 614 00:38:51,360 --> 00:38:53,680 Speaker 1: effort like the one we just mentioned with Hong Kong, 615 00:38:54,080 --> 00:38:57,400 Speaker 1: Twitter executives could send out an alert through this system 616 00:38:57,520 --> 00:39:00,400 Speaker 1: so that the folks at Google and Facebook would also 617 00:39:00,440 --> 00:39:03,239 Speaker 1: know to be on the lookout. Those efforts actually go 618 00:39:03,400 --> 00:39:07,440 Speaker 1: beyond the political propaganda covered by the Oxford Report.
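As an aside, here is a minimal sketch of what that kind of early warning relay might look like if you reduced it to code. This is purely illustrative and built on assumptions: the summit's proposal is an organizational agreement between companies, not a specific piece of software, and the platform names, alert fields, and handler functions below are hypothetical.

```python
# Illustrative sketch of a cross-platform "early warning" relay: one
# platform spots a misinformation campaign and every partner gets notified.
# All names and fields here are hypothetical.

from dataclasses import dataclass
from typing import Callable


@dataclass
class CampaignAlert:
    reported_by: str        # platform that spotted the campaign
    description: str        # what was observed
    indicators: list[str]   # e.g. suspicious hashtags or account clusters


class EarlyWarningNetwork:
    """Fans an alert out from the reporting platform to all the others."""

    def __init__(self) -> None:
        self._handlers: dict[str, Callable[[CampaignAlert], None]] = {}

    def subscribe(self, platform: str,
                  handler: Callable[[CampaignAlert], None]) -> None:
        self._handlers[platform] = handler

    def raise_alert(self, alert: CampaignAlert) -> None:
        # Notify every partner except the one that raised the alert.
        for platform, handler in self._handlers.items():
            if platform != alert.reported_by:
                handler(alert)


if __name__ == "__main__":
    network = EarlyWarningNetwork()
    for name in ("Twitter", "Facebook", "Google"):
        network.subscribe(
            name,
            lambda alert, n=name: print(f"{n} warned about: {alert.description}"))

    network.raise_alert(CampaignAlert(
        reported_by="Twitter",
        description="coordinated accounts targeting Hong Kong protest coverage",
        indicators=["clusters of new accounts", "identical talking points"],
    ))
```

In this toy version, when Twitter raises the alert, Facebook and Google each get the heads-up and Twitter itself does not, which is the whole point of the system: whoever spots the campaign first tells everyone else to be on the lookout.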
Those efforts 619 00:39:07,440 --> 00:39:11,279 Speaker 1: also encompass stuff like anti vaccination rhetoric, which is 620 00:39:11,480 --> 00:39:15,560 Speaker 1: unscientific, and ultimately it's deadly because it discourages people from 621 00:39:15,560 --> 00:39:19,719 Speaker 1: immunizing children against diseases, which then raises the possibility of 622 00:39:19,800 --> 00:39:24,359 Speaker 1: serious and deadly outbreaks. The summit also called for newsmakers 623 00:39:24,360 --> 00:39:27,760 Speaker 1: and platforms to take more care to educate readers about issues, 624 00:39:28,320 --> 00:39:32,360 Speaker 1: educating and informing on top of reporting the news. In 625 00:39:32,400 --> 00:39:35,920 Speaker 1: the United States, DARPA, which is the agency that funds 626 00:39:36,120 --> 00:39:39,719 Speaker 1: research and development into technologies intended to contribute to the 627 00:39:39,760 --> 00:39:43,560 Speaker 1: defense of the United States, has initiated its own program 628 00:39:43,680 --> 00:39:48,560 Speaker 1: to counteract disinformation campaigns, which, you know, is great, 629 00:39:49,160 --> 00:39:52,880 Speaker 1: but I suspect more than a few disinformation campaigns originating 630 00:39:52,880 --> 00:39:57,400 Speaker 1: from the United States have some relation, distant or otherwise, 631 00:39:57,480 --> 00:40:00,200 Speaker 1: to DARPA. So take that with a grain of salt. 632 00:40:00,280 --> 00:40:01,919 Speaker 1: I guess you could say it's kind of a case 633 00:40:01,960 --> 00:40:05,080 Speaker 1: of when we do it, it's a strategic tool in 634 00:40:05,120 --> 00:40:08,879 Speaker 1: our arsenal that guarantees national security, but when they do it, 635 00:40:08,880 --> 00:40:13,720 Speaker 1: it's dirty, rotten cheating. The researchers note that we can't 636 00:40:13,960 --> 00:40:17,600 Speaker 1: just look to the social platforms to fix this problem, 637 00:40:17,640 --> 00:40:22,320 Speaker 1: because the problem extends beyond the platforms. Companies like Twitter 638 00:40:22,400 --> 00:40:25,239 Speaker 1: and Facebook can form policies and enforce them to help 639 00:40:25,360 --> 00:40:29,160 Speaker 1: mitigate the spread of misinformation, but they aren't ultimately at 640 00:40:29,160 --> 00:40:33,240 Speaker 1: fault for the actual content on their services. They enable 641 00:40:33,560 --> 00:40:36,120 Speaker 1: the spread of that information, they might even promote it, 642 00:40:36,160 --> 00:40:39,279 Speaker 1: and they're certainly responsible for that. Their algorithms can be 643 00:40:39,320 --> 00:40:42,239 Speaker 1: co opted by those who know how they work and 644 00:40:42,520 --> 00:40:46,080 Speaker 1: used for malicious ends. But the root of the problem 645 00:40:46,239 --> 00:40:51,239 Speaker 1: is systemic within governments and cultures themselves, not in technology 646 00:40:51,320 --> 00:40:55,000 Speaker 1: in general or social media in particular. So to that end, 647 00:40:55,280 --> 00:40:59,560 Speaker 1: addressing propaganda really means taking a look at the institutions 648 00:40:59,600 --> 00:41:02,719 Speaker 1: and systems within a political framework that allow it 649 00:41:02,800 --> 00:41:05,719 Speaker 1: to exist in the first place.
Until we do that, 650 00:41:06,080 --> 00:41:09,240 Speaker 1: it stands to reason that the various governments and political 651 00:41:09,280 --> 00:41:12,719 Speaker 1: parties will make use of every communications tool they have 652 00:41:12,840 --> 00:41:17,160 Speaker 1: at their disposal to spread messages and suppress dissenting opinions. 653 00:41:17,800 --> 00:41:20,799 Speaker 1: The timing of the study is also interesting. It's in 654 00:41:20,840 --> 00:41:23,959 Speaker 1: the wake of Brexit, which many in the UK say 655 00:41:24,200 --> 00:41:28,520 Speaker 1: was supported in part because of a misinformation campaign, that 656 00:41:28,600 --> 00:41:33,160 Speaker 1: they were misled with false promises and pretenses, and that 657 00:41:33,480 --> 00:41:37,040 Speaker 1: social media played a big part in spreading that misinformation, 658 00:41:37,280 --> 00:41:41,120 Speaker 1: which ultimately led to a vote in favor of Brexit, 659 00:41:41,719 --> 00:41:45,800 Speaker 1: which a lot of people now are second guessing. Not everyone, 660 00:41:45,880 --> 00:41:47,320 Speaker 1: I mean, there are a lot of people who still 661 00:41:47,600 --> 00:41:51,040 Speaker 1: fully believe that the UK should exit the EU, but 662 00:41:51,239 --> 00:41:54,960 Speaker 1: it did muddy the waters. There have also been reports of 663 00:41:55,000 --> 00:41:58,760 Speaker 1: election interference, foreign election interference, in the United States 664 00:41:58,880 --> 00:42:04,000 Speaker 1: during the elections. And so awareness of how vulnerable 665 00:42:04,080 --> 00:42:07,600 Speaker 1: we all are to manipulation is on the rise. Now, 666 00:42:07,640 --> 00:42:10,120 Speaker 1: that doesn't mean our ability to suss it out has 667 00:42:10,160 --> 00:42:14,160 Speaker 1: improved dramatically. If anything, it has encouraged more tribalism, in 668 00:42:14,160 --> 00:42:17,319 Speaker 1: which groups of people inherently trust anything that aligns with 669 00:42:17,360 --> 00:42:21,719 Speaker 1: their own worldview and distrust anything that is outside of 670 00:42:21,760 --> 00:42:24,720 Speaker 1: that worldview. And to be fair, it can be quite 671 00:42:24,840 --> 00:42:28,760 Speaker 1: hard to identify misinformation just on the face of it. 672 00:42:28,760 --> 00:42:32,080 Speaker 1: It requires critical thinking and often a lot of research 673 00:42:32,160 --> 00:42:35,280 Speaker 1: to make sure the information you're receiving is reasonably accurate. 674 00:42:35,560 --> 00:42:37,840 Speaker 1: And as I mentioned earlier in this episode, with the 675 00:42:37,920 --> 00:42:40,600 Speaker 1: sheer amount of data we encounter in our lives, that's 676 00:42:40,600 --> 00:42:44,600 Speaker 1: not really practical. So what are we to do? Well, 677 00:42:44,760 --> 00:42:47,960 Speaker 1: we could cut back on our consumption and take a 678 00:42:48,040 --> 00:42:51,839 Speaker 1: more critical approach to selecting our news sources. But that's 679 00:42:51,840 --> 00:42:54,480 Speaker 1: a lot to ask. It would mean making a dramatic 680 00:42:54,560 --> 00:42:57,719 Speaker 1: shift away from the behaviors we've cultivated over the last 681 00:42:57,760 --> 00:43:00,560 Speaker 1: decade or so, or longer if you want to get into the issues 682 00:43:00,600 --> 00:43:02,319 Speaker 1: that kind of grew out of the twenty four hour 683 00:43:02,440 --> 00:43:06,759 Speaker 1: cable news cycle.
We can push for more transparent and 684 00:43:06,840 --> 00:43:11,080 Speaker 1: democratic processes in government, but that's obviously not something everyone 685 00:43:11,160 --> 00:43:14,759 Speaker 1: can do everywhere, at least not without significant personal risk, 686 00:43:15,239 --> 00:43:18,360 Speaker 1: and even in countries that pride themselves on being founded 687 00:43:18,400 --> 00:43:22,120 Speaker 1: on principles like that, pointing out shortcomings can lead to 688 00:43:22,200 --> 00:43:25,840 Speaker 1: some ugly consequences. Just ask anyone in the United States 689 00:43:25,840 --> 00:43:31,560 Speaker 1: who has publicly questioned any politician. You get attacked pretty quickly. 690 00:43:31,640 --> 00:43:35,120 Speaker 1: It doesn't matter which side you're looking at, either; the attacks 691 00:43:35,120 --> 00:43:39,040 Speaker 1: will follow. I think pressuring companies to be more proactive 692 00:43:39,040 --> 00:43:42,080 Speaker 1: in detecting misinformation is a good step, but we have 693 00:43:42,320 --> 00:43:46,560 Speaker 1: to take it upon ourselves to develop strong critical thinking skills, 694 00:43:46,880 --> 00:43:49,600 Speaker 1: and we have to be willing to hold public officials 695 00:43:49,600 --> 00:43:53,799 Speaker 1: accountable when they engage in shifty misinformation campaigns and 696 00:43:53,840 --> 00:43:56,719 Speaker 1: we find out about it. Like the researchers stated in 697 00:43:56,760 --> 00:44:00,239 Speaker 1: the paper, nothing will change unless we tackle the root 698 00:44:00,400 --> 00:44:04,319 Speaker 1: of the problem. Just dealing with the symptoms isn't enough, 699 00:44:05,040 --> 00:44:08,839 Speaker 1: and that sums up this episode of tech Stuff. If 700 00:44:08,880 --> 00:44:12,160 Speaker 1: you guys have got suggestions for future episodes, send me an 701 00:44:12,160 --> 00:44:15,759 Speaker 1: email. The address is tech Stuff at how stuff works dot com. 702 00:44:15,800 --> 00:44:18,680 Speaker 1: Pop on over to the website, that's tech stuff podcast 703 00:44:18,719 --> 00:44:20,920 Speaker 1: dot com. You're gonna find an archive of all of 704 00:44:20,960 --> 00:44:23,439 Speaker 1: our past episodes there. You'll also find links to where 705 00:44:23,440 --> 00:44:25,279 Speaker 1: we are on social media. You can reach out to 706 00:44:25,280 --> 00:44:28,320 Speaker 1: me on the Facebook or the Twitter with your information. 707 00:44:28,480 --> 00:44:31,719 Speaker 1: I'm sure none of you would send me misinformation. And 708 00:44:31,800 --> 00:44:34,120 Speaker 1: you can also, at our website, click on the little 709 00:44:34,120 --> 00:44:36,200 Speaker 1: link that takes you to our merchandise store, where every 710 00:44:36,200 --> 00:44:38,360 Speaker 1: purchase you make goes to help the show. We greatly 711 00:44:38,360 --> 00:44:41,839 Speaker 1: appreciate it, and I'll talk to you again really soon. 712 00:44:46,600 --> 00:44:48,839 Speaker 1: Tech Stuff is a production of I Heart Radio's How 713 00:44:48,880 --> 00:44:52,280 Speaker 1: Stuff Works. For more podcasts from I Heart Radio, visit 714 00:44:52,280 --> 00:44:55,400 Speaker 1: the I Heart Radio app, Apple Podcasts, or wherever you 715 00:44:55,440 --> 00:45:00,320 Speaker 1: listen to your favorite shows.