Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and a love of all things tech, and welcome to a week of darkness. I know we just did nuclear power, but we're not done with dark and scary topics. This week we're going to explore some stuff in the world of the web that is of concern, and today specifically we're going to talk about echo chambers. I should also mention, before I really jump into this, that I'm still dealing with a cold. You may have heard from the nuclear power episodes that I was dealing with a cold. That cold is slowly moving from my head to my chest and it is affecting my voice, so I could get a little extra croaky over the next few episodes. Back in April, Facebook founder Mark Zuckerberg appeared before Congress to talk about some pretty heavy topics. One of the big ones was Cambridge Analytica, and the next two episodes will be about Cambridge Analytica. But the other was about how dozens of accounts linked to Russia had used Facebook as a platform to influence US voters during the 2016 election season, and this episode is going to be more about how Facebook and other social media platforms got to the point where they could be exploited in that way. Before I get deep into this topic: I am not going to take a specific political stance in this episode, so you don't need to worry about that. I am not here to argue one philosophy over another. This is not a pro-party or an anti-party episode in either case. Instead, I'm going to talk about the way social media platforms in general, and Facebook and Twitter in particular, work, to explain how features that were meant to do one thing could be twisted to do something else.
Speaker 1: And this can affect anyone of any philosophy, so it can be exploited by anyone who has ulterior motives; it doesn't matter whether that's for one party or for a different party. A lot of what I'm going to talk about is related to the concept of the metaphorical echo chamber. I'm sure most of you have heard about that and you probably know what it is, but just in case, I'm going to lay it out. It's a phrase we use to describe an environment in which a person's previously held beliefs are boosted through repetition of those beliefs within a closed system. So imagine that you're in a room full of people who all love one sports team and they all hate another sports team. They all share these common traits: they all love Team A and they all really hate Team B. And they talk about how great Team A is and how rotten Team B is. They all feel validated in their beliefs, right? They are getting validation from their peers, who are also saying, you know what, you're right, Team A does rule, and Team B can go kick sand. There are no dissenting opinions. There's no objective point of view. Everyone is biased. By the end of the meeting, everyone's pretty much sure that their team rules and the other team is literally the worst. That is an echo chamber. Social media platforms can become echo chambers too, and often do, partly because it's hard to have a nuanced conversation online where two people with different points of view can come to some sort of mutual understanding. That seems to be the exception rather than the rule when you have two people of opposing viewpoints interact online. The culture of flame wars and memes, coupled with the lack of tone, body language, and other communication cues, means we're far more likely to have a shallow interaction.
Speaker 1: We're either patting each other on the back for confirming our common beliefs, or we're tearing each other down because we hold different points of view, and we tend to get pushed more toward the radical ends of our respective philosophies. It becomes an us versus them, not a we. So instead of "we should work this out," it becomes "how can we defeat them?" This is just something that happens online more frequently than it does in the real world, although it can obviously happen in the real world too. Now, there have been actual studies about this sort of thing; I'm not just spouting armchair philosophy. But whenever it comes to psychological studies, I'm always personally a little bit cautious. Not because I lack respect for psychology, I have a great deal of respect for it, but rather because it is particularly challenging to design good scientific tests for psychological issues and scenarios due to the enormous number of variables that come along with being a human being. But there is a study published in 2011 called "Effects of anonymity, invisibility and lack of eye contact on toxic online disinhibition" that explores causes for our bad behaviors when we get behind a keyboard. The experiment looked at how anonymity, invisibility, and lack of eye contact could affect interpersonal interactions online. The researchers took test subjects and paired them together randomly, and each pair received a dilemma that they were supposed to discuss and resolve as best they could. The researchers saw the greatest evidence of negative, toxic behaviors when the subjects did not have eye contact with one another in their interactions. Whenever they were in a setup that prevented them from being able to make eye contact with each other, the toxicity of those interactions would increase, where you have people with very different points of view arguing about the best solution to a dilemma.
Speaker 1: Now, maybe this means that when we're online and interacting with someone else, the fact that we cannot see that person and make eye contact with them means that, to a certain extent, that person doesn't seem real to us. We're interacting with what appears to be just words on a screen, not a human being. This is text. The fact that there is a real human being on the other end of those words is a level of abstraction we usually don't bother to process. We know that's the case, most of the time anyway, that there is an actual person on the other end of those words, but knowing it doesn't filter into our immediate behaviors. Dr. John Suler defines what he calls the online disinhibition effect, which consists of six factors that remove or reduce inhibitions we typically feel when we interact with one another in public spaces. Those factors are: dissociative anonymity, which means the anonymous persona we adopt online isn't quote unquote really us. So we might act in ways that don't reflect who we are as individuals in our day-to-day lives because we're taking on a persona online, and maybe this makes us bolder than we normally would be, or more what we think of as outspoken and other people might think of as really freaking rude. Anyone who has spent any time online can probably say, yeah, I get that; I'm a different person online than I am in real life. Then there's invisibility, which means no one can judge my tone or tell what I look like when I'm online, so that disinhibits me. If I have certain inhibitions I experience in my day-to-day life because of the way I look or the tone I have, and I'm constantly adjusting my behavior because of that, that no longer applies when I'm online. There's asynchronicity, that is, your actions are not unfolding in real time.
Speaker 1: You can read a response, you can write something down, days can pass in between, and that changes things as well. Then there's solipsistic introjection, which is: because I cannot see this person, I must fill in the gaps as to what they intend and who they really are. In other words, when I interact with you online, I cannot see you, and I start to assign intent and behavior to the words that you're sending me. Because I can't see you, I can't witness those things myself, and those are pieces of information that are necessary for me to understand the meaning of what you are saying, so I start to just assume what it is you mean. This is where you get into some of those problems where someone says, oh no, I was trying to make a joke, but you took it seriously, which, by the way, can be a really lousy way of trying to cover up bad behavior, saying, oh, it's just a joke. Sometimes it can literally be just a joke and it ends up snowballing into something terrible, but there are a lot of people who will use "just a joke" as a way to try to excuse bad behavior in general, so it's not a great response. It does mean, though, that if you send me a message that was legitimately intended as a joke, and it wasn't mean-spirited or anything like that, but I misinterpreted it because I assigned a motivation to you based upon those words, that's me engaging in this particular behavior. Then there's dissociative imagination. That's the idea that this is all online, which isn't real life, and therefore these are not real people that I'm interacting with. And there's minimizing authority, which is: there are no Internet police, so I can do what I want. It's like the Wild West, so I'm not going to get punished, so why should I worry about limiting my behavior?
Speaker 1: Now, those factors will influence people and lead them to behave in ways they might not behave normally in the public world, and sometimes they do that in a positive way. This doesn't have to be negative. It could mean that you might be more honest and open, and you might be more accepting. But other times it might mean being more negative and lashing out in attacks in a way that you would never do in any other context. And it may even be that those who attack are normally pretty decent people. In their day-to-day lives they aren't aggressive or obnoxious or insulting, but when they get online, that changes, and those factors remove the inhibitions they would typically feel that guide them to being a decent person. Now, I'm not sure what that says about the real person underneath all that. It doesn't seem great. If the argument is that this person only behaves well because he or she feels they have to, based upon the community they exist in, that's not a great argument for that person's character, but it is a reality. Out in the real world, we have communities and we adapt our behaviors to the communities we belong to, and this is a survival mechanism. Human beings are social creatures, and to be social, one of the things you have to have is the ability to get along with people. If you start to alienate everyone you come into contact with, eventually you'll get ostracized from the community. And if we're talking about primitive humans, that might mean you have severely reduced your chance of survival. So a survival mechanism that is important to develop is: how do I get along with everybody else and contribute in a way where I can be part of the group? Whether it's laws or just the desire not to rock the boat, we tend to behave within the context of our community's values and rules.
Speaker 1: But online, those things aren't nearly as present, or established, or enforced, and so more destructive human tendencies tend to come to light there. In addition, communication online tends to be fairly short with each blast, and that means we have very little time to reflect on what we are actually saying before we say it. Back in the old days, gather around the fire, my friends, we would write letters on paper with pen or pencil, and it took ages to write something out of any substance. By the time you were finished, you had to go find an envelope, address the envelope, put the letter in the envelope, put a stamp on the envelope, and go out to the mailbox or to the post office to mail it. By the time you did all that, you might have thought better of the words you wrote down. You might have thought, there was probably a better way for me to put that; maybe I shouldn't tell this person that they are a horse's rear end. There might be a better way of putting my thoughts down on paper, and you would try again, or maybe you'd toss everything out, since venting on the page might have worked through your feelings. Maybe even through this process you think about how the recipient of your letter will react to the words you have written, and that might guide you to write them in a much more constructive way. But online, when we can zap off a quick zinger in no time, it's instant gratification. A tweet or a Facebook update zooms out there before we've even really thought about the consequences of what we have just said, and the nature of online communication itself has catered to our baser natures. Now, to be clear, social media platforms did not create this problem. They just facilitate it very efficiently, and if the managers of those platforms do not intervene or do not enforce behavior policies, things can get out of hand.
Speaker 1: Next, we're going to take a look at how Facebook's algorithm works to get a better understanding of how those Russian accounts were able to really exploit that system. But first let's take a quick break to thank our sponsor. One of the big goals for any web-based property is to drive engagement. Engagement might mean viewing more pages, which ends up counting toward page views for advertising, or it might mean encouraging people to buy stuff from an online store, or it might mean getting people to sign up for newsletters or to join groups online. But it's all about getting people to go from being a momentary visitor to something more than that. Facebook measures engagement in a few ways on their site. One of those ways is through the number of likes or reactions a post gets. If a post gets a lot of people hitting that like button, then engagement is high. If nobody has reacted to it, engagement is low. Another is the number of times a post is shared to various people's walls. If I see a cute meme that features foxes, I know I need to share that on my friend's wall because she loves foxes. So I'll do that, and sharing it counts toward the engagement of that original post. And the third major way that Facebook gauges engagement, which I realize now sounds a little repetitive but isn't, is through comments. If a post gets a lot of comments, particularly from a lot of different people, not just the same two people going back and forth, Facebook registers that as a post with high engagement. Now why is that important? Why should you care? Well, the reason engagement is important with Facebook is that you, as a user (let's say that you have a Facebook profile; some of you may not, but if you do), do not see all the stuff that your friends post on Facebook, even if they're posting it publicly or to their friends.
Speaker 1: Even if they are not specifically leaving you out of the posts, you're not seeing all their stuff. This has been a common complaint among users, including myself. A lot of people have argued, I wish that Facebook would just send me everything chronologically; it's my job to sit there and go through it all and read up on things, and if I miss stuff, I miss stuff. But Facebook doesn't do it that way. First of all, you don't necessarily get all the posts in chronological order even when you set it that way, because they keep changing the darn settings, but that's another argument for another time. You might notice when you log on that some of your friends seem pretty consistently active, and others appear to rarely post. In fact, some of those people who appear to rarely post might be posting regularly, but you just aren't seeing it unless you actually take the effort to pop on over to that friend's wall, and then you can see all the posts you've been missing. What Facebook is doing is serving you a selection of the stuff your friends are sharing on the social media platform and leaving out the rest, and Facebook will serve up posts in your news feed that have high engagement. So if one of your friends shares a post from someone else on their feed that got a ton of responses, you'll probably see that when you scroll through your feed, or if the post has anything to do with something you're interested in, you'll likely see it then too. Stuff goes viral, helped in no small part by Facebook's algorithm, which makes sure the more visible posts get even more visibility and the most engagement. Now, let me be clear: on the face of it, this seems like a no-brainer. After all, if a lot of people are interested in something, if someone writes a very thought-provoking post and it gets a lot of engagement, chances are you will be interested in that as well. You don't want to be left out of the next conversational topic or the next fun meme.
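To make that mechanism concrete, here is a minimal sketch in Python of engagement-based feed ranking. The weights, the `Post` fields, and the example numbers are assumptions for illustration only, not Facebook's actual formula; the point is just that the feed serves whatever scores highest and quietly drops the rest.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    reactions: int   # likes and other reactions
    shares: int      # times the post was shared to other walls
    commenters: int  # distinct people who left comments

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares and distinct commenters count for more
    # than passive reactions; real platforms tune weights like these constantly.
    return post.reactions + 3 * post.shares + 5 * post.commenters

def build_feed(candidates: list[Post], limit: int = 10) -> list[Post]:
    # Serve a selection, not everything: the highest-engagement posts
    # float to the top and the rest are effectively left out.
    return sorted(candidates, key=engagement_score, reverse=True)[:limit]

if __name__ == "__main__":
    posts = [
        Post("quiet friend", "vacation photos", reactions=2, shares=0, commenters=1),
        Post("outrage page", "you won't BELIEVE this", reactions=40, shares=25, commenters=60),
    ]
    for post in build_feed(posts):
        print(engagement_score(post), post.author)
```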
Speaker 1: But there's a big problem with this philosophy, and that is that you can exploit it as long as you can make posts that drive a lot of engagement, and it doesn't have to be positive engagement, which makes it way easier. It can actually just be about stirring the pot and making people mad, and you can do it without directly violating Facebook policies. Now, if you go and post something that's outright hateful, racist, misogynistic, or offensive, that might get flagged and Facebook might remove it. It might not; I've seen some pretty awful things up on Facebook that have been left there even after people have complained. But Facebook does have policies about this sort of stuff, and if enough people flag it and Facebook takes notice, those posts might get taken down. So let's assume for the moment that Facebook has established some firm rules and is enforcing them, and just take that off the table. What you could still do is link to stories that are written specifically to rile people up. There may not be an ounce of truth to those stories. They might be complete fabrications, specifically crafted just to get a reaction from people. People who agree with whatever the perspective of the story is will share a link to that story on their social media page, or at least they'll be more inclined to, and others will share that, and then they'll comment or they'll engage in some way. In this manner, the message gets elevated and spread around a little bit more, and then you start to see it go viral, and there never has to be an ounce of truth in the original story for this to happen. I've been seeing this on Facebook for a few years, and it's not always political. The stuff I'm talking about today relates largely to politics, but I've seen it for all sorts of different stuff.
Speaker 1: And I'm sure many of you have had the experience of looking at a post that a friend of yours has shared, and there's an article it links to with a crazy headline, and you might say to yourself, hang on, that can't possibly be true. Then you click through to the article, maybe do a little digging, and you find out that the site hosting that article is a quote unquote satire or parody site. Typically for these sites, you'll find an about page somewhere that lays this out. But because real journalism has become so focused on creating clickable, or clickbait if you prefer, headlines, the quote unquote joke articles don't seem that unreasonable within that same context. You've got legitimate news outlets that are creating ridiculous headlines because they work. They get people to click on the stories, and the stories might be very well written, very well researched. But that headline culture means that when we encounter a quote unquote satire or parody headline, it's not always clear that it's a joke, because we see crazy stuff all the time. Clickbait has made that a reality, and so it only matters if you go further in and read the actual article to see, all right, is this legit or is this a joke? What is this? Sometimes it's a joke, and sometimes that's very evident from the beginning: you start reading the article and you think, all right, this is a joke, but my friend clearly didn't read the article. They just saw the headline and shared it, and that became their contribution to the conversation, which really all it did was add more traffic to the website. In other cases, it may not obviously be a joke, and it's only when you go to the about page on the website that you see that everything is supposed to be satire, although for it to really be satire, you have to know it's satire. In reality, I would call those websites just purveyors of lies. They make up junk and they pass it off as true.
Speaker 1: And it's only if you dig into the website that you find the little disclaimer saying, oh, this is a satire site. That's not really satire; that's fake news. And I hate using the term fake news because it's so politicized, but there really is that stuff out there, and it's been there for years. Like I said, it's not just politics. I've seen it for things like the entertainment industry, where you'll see some ridiculous headline and think, well, that's just bizarre, and only through digging do you realize, oh, there's nothing to this. So the owners of those sites are really in a sweet spot. They can point to that about page and say, hey, it's not our intent to be taken as a serious source of news; just read our about page and you'll see that we're a joke site. But they also keep publishing articles that don't seem to be so much a joke as an outright fabrication, and they end up making huge amounts of money off of ads. High engagement means high traffic, high traffic means high page views, and high page views mean you start making money off the ads that you serve against your site. So those deceitful articles are really just a means to an end. The consequences of those deceitful articles, the idea that they might be spreading misinformation, that's not really a consideration or concern for a lot of these sites. I've read articles, I've read interviews with people who wrote for these sites, and it's clear that they were just trying to think of ways to get more people to click on stories, and that beyond that, they didn't care. They just wanted to drive traffic, to get a lot of traffic, to make a lot of money, and so they would come up with whatever outlandish stories they could think of that would play right into people's preconceived ideas in order to make money.
Speaker 1: It's very cynical, and as someone who works very hard to create content that I consider to be of high quality, I find it quite insulting, both as a reader and as someone who creates content. All right, so you've got these writers cynically creating inflammatory articles to drive traffic to a site. They might incorporate just enough real-world facts to give the article some believability, but even that isn't really necessary, as a lot of people are going to share an article if the headline gets their attention and seems to confirm their already held beliefs. And if you believe deep in your heart that your local government was, I don't know, replaced with pod people or something, a headline that says as much confirms and validates your belief. And hey, you're busy, right? You can't be expected to read every article you come across just to see if it's legit, or even do further digging to make sure the site that hosted the article is in fact a real news site, so you just share it. Add to this the fact that there are countless sites out there that only exist to repurpose other people's content that performs well, in order to exploit that content for advertising revenue, and you have a recipe to make things worse. These are outfits that are all about, let's look and see what's trending, and hey, there's this article, let's say it's on BuzzFeed, and it's trending super well, so let's do our version of that exact same article and piggyback off their success. There are a lot of different sites out there that are essentially doing this. They're taking other people's content. They might change enough stuff so that it's not just a hatchet copy-and-paste job, but ultimately they're again just about trying to get as much traffic as possible. The content can be terrible, by the way; it doesn't matter if the content is good or not. It just has to drive engagement.
Speaker 1: So now you've got people trying to make legitimate, good content, content that is thorough, investigative, objective, of high quality, and they're competing with people just throwing junk up online as fast as they possibly can to drive as many page views as possible. This is not a good environment if you want to make good content, because you get drowned out by all the noise. You can hope that your reputation is good enough that people take you seriously, but you're still going up against people who just don't care about quality. They care about quantity and engagement, and that is very demoralizing as you go on. Now, so far I've been talking about this just as a way to make money through serving up ads against lousy content. But when we come back, I'll talk about the dreaded fake news for political gain. I'm talking about the stuff fabricated to guide conversations and influence political elections. So stay tuned, because that's coming up next. But first let's take a quick break to thank our sponsor. Propaganda is a really old idea, and the definition of propaganda is that it is information, typically biased, sometimes disinformation or misleading information, maybe even an outright lie, used to promote a particular philosophical or political point of view. All sorts of organizations use propaganda to build support among the general public for a particular stance or action, such as electing a leader or putting your faith in an organization. Russian propaganda is kind of in a class all its own. For decades, the Soviet government used propaganda to praise Soviet leaders and demonize Western countries, particularly the United States, as well as the concept of capitalism. During the Soviet era, the various publications in the Soviet Union were state owned, so the government got to dictate what was communicated down to the citizens. It was very much a propaganda machine. Russian propagandists included artists who were great at capturing the public imagination.
Speaker 1: These days, at least when it comes to the Internet, Russian propagandists are like assembly line workers. In March 2018, Time magazine published an article titled "A Former Russian Troll Explains How to Spread Fake News." That person, Vitaly Bespalov, explained that he took a job with a company called the Internet Research Agency. A lot of people call that particular company a troll factory; it's infamous for churning out people who do this professionally in Russia. The real purpose of this organization is not to conduct research, despite the name Internet Research Agency, but rather to spread propaganda as quickly and as effectively as possible. The employees were given instructions to create fake accounts on various social media sites like Facebook and Twitter, and to leave comments and posts that followed the directions of their superiors. Those directions could involve sharing an article that was written specifically to appeal to people with particular political or social views, or leaving comments on posts of that nature, and the whole point was to make those articles seem relevant and important and elevated and popular, to get a post rolling with enough momentum to make it go viral so that it could affect as many people as possible. The Russian government, by the way, doesn't place the same value on free speech as, say, United States citizens do. The Russian government has laws that make it illegal to post certain types of material on social media pages, such as material meant to quote, threaten public order, end quote, or posts that are extremist in nature. Now, on the face of it, that sounds reasonable, because you don't want people to incite others to violence. But the Russian government's interpretation of this tends to be: we don't want you posting anything that criticizes the Russian government in general, or Vladimir Putin in particular, or any of Putin's buddies, generally speaking.
Speaker 1: So the Russian rules seem to say, we want to make sure that everything posted is true and reliable and objective, but in reality it's more about, we don't want to see you posting anything that's critical of our president. Twitter has also been a target for these types of tactics. In July 2018, NPR ran a story about how that same Internet Research Agency had created nearly fifty Twitter accounts claiming to represent various US newspapers. Most of those newspapers were fake. They did not exist; they were just made up by the Twitter accounts. So you might see a city name and a newspaper title, and there's no actual newspaper called that from that city. The accounts were steadily gathering followers, and they were posting links to news that was relevant to the various regions the papers claimed to represent, like Chicago or Seattle. Again, some of these were taking the names of papers that once existed but haven't existed for decades, so they were trying to trade on that legitimacy, and they were also trying to establish a sense of legitimacy by spreading links to real stories from those areas, unbiased news reports. This was not a misinformation campaign at this point; there was no fake news being spread around. But NPR had uncovered this trend and realized that what was happening was that the Internet Research Agency was building trust online through these fake accounts, gathering followers that way and being seen as a reliable news source, all in preparation to begin a misinformation campaign. It's just that NPR found out about it before they had moved into that phase. Twitter, like Facebook, measures engagement in stuff like replies, retweets, quoted tweets, likes, that kind of thing, and those metrics guide Twitter to occasionally show tweets beyond the feeds of people who are directly following those accounts.
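Here is a minimal sketch, in the same spirit as the Facebook example above, of how an engagement threshold could decide whether a tweet gets surfaced to people who don't follow the account. The weights, threshold, and `Tweet` fields are assumptions for illustration, not Twitter's actual ranking logic; the point is that the gate is a popularity measure, not a truthfulness measure.

```python
from dataclasses import dataclass

@dataclass
class Tweet:
    author: str
    replies: int
    retweets: int
    quote_tweets: int
    likes: int

def engagement(tweet: Tweet) -> float:
    # Hypothetical weights: actions that spread or extend a conversation
    # count for more than a simple like.
    return 2.0 * tweet.replies + 3.0 * (tweet.retweets + tweet.quote_tweets) + tweet.likes

def show_to_non_followers(tweet: Tweet, threshold: float = 100.0) -> bool:
    # If a tweet drives enough engagement, surface it even to users who
    # never followed the account that posted it.
    return engagement(tweet) >= threshold

if __name__ == "__main__":
    t = Tweet("fake local paper", replies=40, retweets=30, quote_tweets=10, likes=80)
    print(show_to_non_followers(t))  # True: the tweet escapes its follower bubble
```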
Speaker 1: So, in other words, if I make a Twitter account and I follow, let's say, five people, I would expect that every time I log into Twitter, I'm just going to see the tweets from those five people and the stuff that they retweet or quote, and that's all I'm going to see. I might see tweets from others occasionally, but only because one of the people I followed retweeted or quoted them. Except that sometimes Twitter will show me tweets beyond those five people that were not retweeted or quoted by them. You might not follow one of those fake accounts, but you might still see a post from it because it drove a lot of engagement, and thus Twitter would serve it up to you, saying, well, this particular post seems to be really relevant, a lot of people are responding to it, so maybe we should show it to more people, because more people will probably find it interesting. That's one way Twitter can facilitate the spread of misinformation. Or accounts can pay for promoted tweets, which means Twitter will serve a tweet up to a larger number of people regardless of whether or not they follow the original account. And of course, if someone you follow retweets or quotes a tweet from a different account, you're going to see it then. So, like Facebook's algorithm, this approach is something people can exploit, and that seems to be the case. Since the 2016 election in the United States, both Facebook and Twitter have cracked down on accounts that were fake in the sense that the entity the account claimed to represent was not the real owner of the account, or that existed specifically to spread misinformation. So you can see, our love of social media and the business model that supports social media have created a perfect situation for savvy people who wish to spread a specific message. We humans are less likely to feel empathy in online interactions with each other; we're more likely to respond quickly and with very little inhibition.
This leads to other people responding to 578 00:33:58,200 --> 00:34:01,360 Speaker 1: our words in a similar manner, either in support of 579 00:34:01,360 --> 00:34:05,600 Speaker 1: our view or in an attempt to rip us to shreds, 580 00:34:05,640 --> 00:34:10,160 Speaker 1: and the process continues. That same process fuels the visibility 581 00:34:10,160 --> 00:34:13,600 Speaker 1: of the original piece of content that prompted the flame 582 00:34:13,640 --> 00:34:15,920 Speaker 1: war in the first place, which means even more people 583 00:34:15,960 --> 00:34:19,480 Speaker 1: see it and react to it. This rewards the social 584 00:34:19,520 --> 00:34:22,600 Speaker 1: media site, as more engagement means more money from 585 00:34:22,680 --> 00:34:26,360 Speaker 1: ads served on the site itself. So Facebook wants more engagement 586 00:34:26,600 --> 00:34:30,560 Speaker 1: because they want to make more money through ads. 587 00:34:30,680 --> 00:34:33,759 Speaker 1: And it rewards whatever entity posts the content, because it 588 00:34:33,800 --> 00:34:36,160 Speaker 1: means that the message is getting out there and is shaping 589 00:34:36,160 --> 00:34:39,200 Speaker 1: the perception of whatever the topic happens to be, and 590 00:34:39,239 --> 00:34:41,640 Speaker 1: it can also be a monetary reward if the content 591 00:34:41,760 --> 00:34:45,440 Speaker 1: is served up with ads for those entities. And it 592 00:34:45,520 --> 00:34:48,200 Speaker 1: serves to create a deeper divide between the people who 593 00:34:48,280 --> 00:34:50,799 Speaker 1: have opposing points of view on a given subject. And 594 00:34:50,880 --> 00:34:54,840 Speaker 1: since it is a more challenging prospect to engage online 595 00:34:54,920 --> 00:34:57,680 Speaker 1: in an empathetic way, we are not likely to come 596 00:34:57,719 --> 00:35:01,279 Speaker 1: to any sort of agreement on that subject. Instead, we're 597 00:35:01,280 --> 00:35:04,839 Speaker 1: gonna push ourselves more toward that us versus them mentality. 598 00:35:05,320 --> 00:35:08,279 Speaker 1: Now, I didn't even touch on other problems, such as 599 00:35:08,320 --> 00:35:11,120 Speaker 1: the belief that journalism must give an equal amount of 600 00:35:11,160 --> 00:35:14,520 Speaker 1: time and opportunity to all perspectives on the topic. Now, 601 00:35:14,600 --> 00:35:18,600 Speaker 1: journalism should be objective and unbiased. It should be investigative. 602 00:35:18,640 --> 00:35:21,520 Speaker 1: It should not take anything at face value. But that 603 00:35:21,680 --> 00:35:24,200 Speaker 1: is not the same thing as giving all sides of 604 00:35:24,200 --> 00:35:27,279 Speaker 1: a discussion equal opportunity to use the platform to get 605 00:35:27,280 --> 00:35:30,080 Speaker 1: a message out. If there's a hate group that exists 606 00:35:30,120 --> 00:35:34,560 Speaker 1: primarily to oppress some other group, it's not responsible journalism 607 00:35:34,560 --> 00:35:37,879 Speaker 1: to give that group a platform to espouse those beliefs.
Now, 608 00:35:37,880 --> 00:35:41,440 Speaker 1: it would be objective journalism to investigate that group to 609 00:35:41,520 --> 00:35:45,280 Speaker 1: determine what the motivations behind that group's philosophy are, why 610 00:35:45,320 --> 00:35:48,360 Speaker 1: they believe the things they do, and why 611 00:35:48,440 --> 00:35:50,440 Speaker 1: they act in the ways that they have chosen, and 612 00:35:50,480 --> 00:35:55,600 Speaker 1: to publish that investigation. But it 613 00:35:55,760 --> 00:36:00,040 Speaker 1: is not journalism's responsibility to act like a stage for 614 00:36:00,160 --> 00:36:03,160 Speaker 1: anyone to jump up and use. So what can 615 00:36:03,200 --> 00:36:05,560 Speaker 1: we do about this? Well, one thing we can try 616 00:36:05,560 --> 00:36:09,600 Speaker 1: to remember is that the people online are mostly people. Mostly. 617 00:36:10,080 --> 00:36:12,440 Speaker 1: I mean, there are bots and fake accounts that have 618 00:36:12,520 --> 00:36:14,920 Speaker 1: made this kind of muddy, which makes it a little more 619 00:36:14,960 --> 00:36:17,920 Speaker 1: tricky to just make a blanket statement, to be honest. 620 00:36:18,120 --> 00:36:20,520 Speaker 1: And in some cases, the quote unquote person on the 621 00:36:20,520 --> 00:36:23,040 Speaker 1: other end may either not be a person at all, 622 00:36:23,600 --> 00:36:25,480 Speaker 1: or they may be someone who is posting something they 623 00:36:25,480 --> 00:36:28,759 Speaker 1: don't necessarily believe or care about. They're just posting it 624 00:36:28,800 --> 00:36:31,960 Speaker 1: because it's literally their job to do it. They're just 625 00:36:32,120 --> 00:36:36,680 Speaker 1: filling out an assignment. They don't care because they have 626 00:36:36,719 --> 00:36:40,000 Speaker 1: no investment in whatever the message is. But we should remember 627 00:36:40,080 --> 00:36:43,320 Speaker 1: that most of the people that we interact with online 628 00:36:43,440 --> 00:36:46,720 Speaker 1: are people, and we need to keep that in our minds. 629 00:36:46,920 --> 00:36:50,839 Speaker 1: If we are not willing to make that assumption, 630 00:36:51,400 --> 00:36:54,640 Speaker 1: then that says something very troubling about our own character. Now, 631 00:36:54,800 --> 00:36:57,319 Speaker 1: when I say this, I do not mean that we 632 00:36:57,400 --> 00:37:01,719 Speaker 1: need to entertain racist or misogynist or hateful ideologies just 633 00:37:01,800 --> 00:37:04,400 Speaker 1: because people hold them. I don't think there's any place 634 00:37:04,480 --> 00:37:06,600 Speaker 1: for that kind of stuff, at least no place I 635 00:37:06,719 --> 00:37:09,759 Speaker 1: want to be in. Another thing we can try to 636 00:37:09,800 --> 00:37:13,719 Speaker 1: do is seek out information from reliable sources, not just 637 00:37:13,880 --> 00:37:17,680 Speaker 1: information that seems to support our own personal worldview, but 638 00:37:17,800 --> 00:37:22,160 Speaker 1: objective information about any number of topics. And it might 639 00:37:22,200 --> 00:37:25,120 Speaker 1: mean that you find yourself questioning your own perspective about 640 00:37:25,120 --> 00:37:28,719 Speaker 1: certain things, and maybe you even change your mind.
For example, 641 00:37:29,000 --> 00:37:30,879 Speaker 1: if you had asked me a year ago what I thought 642 00:37:30,920 --> 00:37:34,160 Speaker 1: about universal basic income as a concept, I would probably 643 00:37:34,200 --> 00:37:36,960 Speaker 1: have been pretty positive about it, and I still am. I'm 644 00:37:37,000 --> 00:37:39,240 Speaker 1: still more or less positive about it. But I think 645 00:37:40,120 --> 00:37:42,920 Speaker 1: instead of universal basic income, what I would prefer to 646 00:37:42,960 --> 00:37:47,640 Speaker 1: see is some sort of universal guaranteed jobs program. So, 647 00:37:47,680 --> 00:37:49,799 Speaker 1: in other words, I would like to see a program 648 00:37:49,840 --> 00:37:53,680 Speaker 1: where anyone who wants to find a job can get 649 00:37:53,719 --> 00:37:56,600 Speaker 1: a job. It's a guarantee. Now, those jobs would have 650 00:37:56,640 --> 00:37:59,319 Speaker 1: to be created by various governments at various levels. It 651 00:37:59,400 --> 00:38:03,799 Speaker 1: could be everything from a local government to federal programs, 652 00:38:04,840 --> 00:38:07,880 Speaker 1: but it could include all different types of work as well, 653 00:38:07,960 --> 00:38:09,879 Speaker 1: and that would be very important. But I don't say 654 00:38:09,880 --> 00:38:12,960 Speaker 1: this to convince anyone that my beliefs are the only 655 00:38:13,600 --> 00:38:16,880 Speaker 1: legit ones, that my approach is the only right way, 656 00:38:17,239 --> 00:38:20,320 Speaker 1: and that everyone should just subscribe to my idea, or 657 00:38:20,360 --> 00:38:22,759 Speaker 1: at least the idea that I have already subscribed to. 658 00:38:22,800 --> 00:38:26,480 Speaker 1: It's not my idea. Other people have made universal guaranteed 659 00:38:26,560 --> 00:38:31,120 Speaker 1: jobs program suggestions for years and years and years. 660 00:38:31,200 --> 00:38:33,680 Speaker 1: I did not come up with that idea, but I 661 00:38:33,760 --> 00:38:36,360 Speaker 1: just wanted to give you an example of an idea 662 00:38:36,400 --> 00:38:39,200 Speaker 1: where originally I was thinking, yeah, universal basic income, that 663 00:38:39,239 --> 00:38:41,920 Speaker 1: makes the most sense to me. But I actually think that 664 00:38:42,080 --> 00:38:46,399 Speaker 1: the guaranteed jobs program makes more sense because it creates more 665 00:38:47,120 --> 00:38:50,000 Speaker 1: direct benefits, and I think it's an easier concept to 666 00:38:51,520 --> 00:38:54,440 Speaker 1: get support behind. So there are 667 00:38:54,520 --> 00:38:58,120 Speaker 1: reasons why I've changed my mind. But that wouldn't have 668 00:38:58,120 --> 00:39:01,640 Speaker 1: been possible if I had not sought out more information 669 00:39:01,640 --> 00:39:04,600 Speaker 1: about the subject from a variety of different sources. If 670 00:39:04,640 --> 00:39:09,440 Speaker 1: I had only just kept reading people who were advocating 671 00:39:09,480 --> 00:39:12,240 Speaker 1: for universal basic income, I never would have taken any 672 00:39:12,280 --> 00:39:16,840 Speaker 1: time to consider alternatives. So that's a simple example, and 673 00:39:16,880 --> 00:39:21,960 Speaker 1: also it's an example that I admit is fairly shallow. Right, 674 00:39:22,200 --> 00:39:24,759 Speaker 1: it's not a huge shift to go from universal basic 675 00:39:24,800 --> 00:39:29,600 Speaker 1: income to a universal guaranteed jobs program.
That's not an 676 00:39:29,680 --> 00:39:32,400 Speaker 1: enormous leap. It would take a lot more for me 677 00:39:32,480 --> 00:39:36,520 Speaker 1: to change a more fundamental idea I have to something 678 00:39:36,560 --> 00:39:40,160 Speaker 1: that is more in opposition to that idea. But it 679 00:39:40,280 --> 00:39:42,920 Speaker 1: is important for us to seek those pieces of information 680 00:39:42,920 --> 00:39:46,760 Speaker 1: out so that we aren't just confirming our biases, whether 681 00:39:47,040 --> 00:39:51,480 Speaker 1: we consider ourselves liberal or conservative, whatever it may be. 682 00:39:52,520 --> 00:39:54,640 Speaker 1: It's very important to try and seek that out. Now, 683 00:39:54,640 --> 00:39:58,080 Speaker 1: the real trick there, obviously, is trying to find sources 684 00:39:58,120 --> 00:40:01,360 Speaker 1: that are as objective as possible, so you're not just reading, 685 00:40:02,040 --> 00:40:06,600 Speaker 1: you know, propaganda that is supporting one viewpoint over another. 686 00:40:07,000 --> 00:40:08,760 Speaker 1: I don't think we're going to get to a spot 687 00:40:08,760 --> 00:40:12,640 Speaker 1: where everyone magically becomes totally objective and empathetic all at 688 00:40:12,680 --> 00:40:15,279 Speaker 1: the same time and makes decisions that are responsible from 689 00:40:15,280 --> 00:40:17,600 Speaker 1: a social and physical point of view. I don't think 690 00:40:17,600 --> 00:40:21,520 Speaker 1: that's going to happen. But being aware of how online information, 691 00:40:21,680 --> 00:40:24,719 Speaker 1: which has become the primary source of information for a 692 00:40:24,760 --> 00:40:28,240 Speaker 1: growing number of people, can be used to manipulate those people, 693 00:40:28,560 --> 00:40:32,200 Speaker 1: that's of critical importance. Only then can we spot what's 694 00:40:32,239 --> 00:40:34,920 Speaker 1: happening, do our best to shut down 695 00:40:35,440 --> 00:40:39,000 Speaker 1: abuses of the system, make informed decisions based on 696 00:40:39,040 --> 00:40:43,480 Speaker 1: real information, and maybe remember that we're all human beings 697 00:40:43,560 --> 00:40:47,200 Speaker 1: in the process. It's a big request, but I 698 00:40:47,239 --> 00:40:51,040 Speaker 1: think we could do it if we wanted to. That 699 00:40:51,120 --> 00:40:54,520 Speaker 1: wraps up this episode about echo chambers. In our next episode, 700 00:40:54,520 --> 00:40:58,239 Speaker 1: we will start our discussion about Cambridge Analytica and the 701 00:40:58,920 --> 00:41:02,759 Speaker 1: enormous mess that company found itself in. If you 702 00:41:02,800 --> 00:41:05,759 Speaker 1: guys have suggestions for future episodes of tech Stuff, send 703 00:41:05,760 --> 00:41:08,480 Speaker 1: me an email. The address is tech stuff at how stuff 704 00:41:08,480 --> 00:41:10,520 Speaker 1: works dot com, or you can drop me a line 705 00:41:10,520 --> 00:41:12,319 Speaker 1: on Facebook or Twitter. The handle for both of those 706 00:41:12,400 --> 00:41:15,319 Speaker 1: is tech Stuff HSW. Don't forget we have a 707 00:41:15,320 --> 00:41:18,560 Speaker 1: merchandise store at t public dot com slash tech stuff 708 00:41:18,560 --> 00:41:20,960 Speaker 1: where you can get all your tech stuff merchandise needs. 709 00:41:21,640 --> 00:41:25,080 Speaker 1: And make sure you follow us on Instagram, and I'll 710 00:41:25,080 --> 00:41:33,120 Speaker 1: talk to you again really soon.
For more on this 711 00:41:33,360 --> 00:41:35,839 Speaker 1: and thousands of other topics, visit how stuff works 712 00:41:35,840 --> 00:41:46,160 Speaker 1: dot com