Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. I was listening to a recent episode of Pod Save America's Offline series in which journalist Charlie Warzel joined the show to talk about Facebook and its role in causing harm on a global scale, which inspired me to do this episode. Now, before we get started, I guess I should apologize for the somewhat clickbaity title of this episode. We're going to be exploring algorithms and their unintended consequences, and we're focusing on the Facebook algorithm, that is, the algorithm that determines what appears in your news feed. But algorithms by themselves are not necessarily good or evil on their own. You know, it's kind of like what Hamlet said: there is nothing either good or bad, but thinking makes it so. So let's start just by defining what an algorithm is. An algorithm is essentially a set of rules or directions that a system follows in order to process information and produce results. And that's it. It's just a set of rules. Let's use a really simple analogy to explain this. Let's say I've hired you to sit on a bench by a road. Say it's a road that gets decent but not heavy traffic. And I say it's your job to make a tally mark on a sheet of paper whenever a silver car drives past you along this road, and you're to ignore all other cars, only make a tally when a silver car goes by. I just, you know, want to see how many tally marks are there at the end of a certain amount of time. Let's say it's an hour. I have given you an algorithm: tally every single silver car that drives past you. So you observe the vehicles that drive by you, you follow the directions, and that's the whole concept right there.
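To make that concrete, here is a minimal sketch of that tally algorithm in Python. The list of car colors is an invented stand-in for the hour of traffic you would observe from the bench.

    # A minimal sketch of the "tally silver cars" algorithm.
    # The rule set is tiny: tally silver cars, ignore everything else.
    def tally_silver_cars(observed_colors):
        tally = 0
        for color in observed_colors:
            if color == "silver":
                tally += 1  # make a tally mark
            # every other car is ignored, per the rules
        return tally

    # An invented hour of traffic:
    cars = ["red", "silver", "blue", "silver", "silver", "white"]
    print(tally_silver_cars(cars))  # prints 3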
Speaker 1: I feel like a lot of folks, including myself, sometimes use the word algorithm as a kind of way to save a bit of time, but it also tends to create a gap in understanding. Like, you know, the word algorithm acts kind of like the curtain between Dorothy and the Wizard of Oz. You know something's going on back behind the curtain, but you can't actually see what it is, so we use the word algorithm to describe it, and then we just move on. Now, part of the reason for that is that a lot of companies, big ones like Google and Facebook, jealously guard their algorithms from critical view. To the outsider, it would be impossible to say whether the algorithm was working well or not, because we don't necessarily know what the goal of the algorithm is in the first place. We know it is doing work, we know it's doing something, it's creating the results that we see, but we don't know if it's doing the best job at that. So this lack of transparency becomes a big issue in some cases, like with Facebook. And there are times when an algorithm is technically doing what it's supposed to do, and doing it well by that definition, but there are unintended consequences, and sometimes really bad ones. I would argue that this is what we see with Facebook in particular. It's what's led the company Meta, you know, the company that owns Facebook, to get into hot water around the world because of issues like the spread of misinformation and hate speech. Now, to be clear, I'm not saying Facebook engineered an algorithm to spread that kind of bad content. Rather, Facebook engineers created a system that prioritizes that kind of content because that content fulfills other goals. So let's back up a bit and really get to grips with what's actually going on. First, let us consider what Facebook started out as, because it was a much less expansive and influential site when it launched way back in 2004.
Speaker 1: Back then, it was, in Mark Zuckerberg's own estimation, a tool for Harvard students, presumably mostly male ones, to rank the quote unquote hot girls on campus. So Facebook the site has always been a creep. It's just now it's a much older and much more influential creep. Anyway, running a site costs money. There are hosting services and bandwidth fees and other stuff that you have to cover, so there needed to be a way to offset the costs of running this pretty basic social network. In April of 2004, Zuckerberg introduced Flyers, and you know, that's F-L-Y-E-R-S. These were essentially banner ads on Facebook's homepage, the same kind of banner ads that would appear, you know, on the top, bottom, or sides of any website that you visited. The ads were really mostly for local businesses, so this was small potatoes stuff, but it meant that there was some money coming in that would help offset Facebook's costs. Facebook grew pretty rapidly and the company received some investment capital, but it needed to establish a way to generate more revenue. See, in the world of startups, there are a couple of different pathways you can try to take. Actually there's more than a couple, but there's really only a couple of successful pathways. One is to become attractive enough as a product, service, or business that some other bigger fish is going to come along and scoop you up in an acquisition, and you'll cash out with millions of dollars. In fact, that fuels a ton of the startup culture these days, and I've talked in the past about how I think it's a pretty dangerous and unhealthy trend. But the other way is to build a business that generates more revenue than it costs to operate, and to build it in such a way that you can scale up the business as it grows, making more money while also spending more to maintain a larger business. This is a really hard thing to do.
Speaker 1: A lot of companies struggle with it, and I'm guessing this is why the acquisition culture runs so strong in startups. It's not easy to be seen as important enough to acquire, and I'm not trying to downplay that. It is tough. You've got to make something that is flashy enough and potentially important enough for someone else to come along and say, we want that. But that can be easier than creating a business model that supports itself throughout the various phases of a company's growth. That is really hard to do well. Facebook was looking to go that second route, you know, the actual business route, as opposed to getting acquired, and in November 2007, the site introduced Facebook Ads. Now, that product had a few different facets. One was that businesses would be allowed to create Facebook pages that could operate in ways similar to how an actual human being's Facebook page works, and that would give folks the chance to quote unquote like a business that they were interested in. Another facet was the Facebook Marketplace, which exists to this day. And a third was a product called Beacon. Now, this product would be one of the first to inspire legal action against Facebook. You see, Beacon was a tool that would collect data from external websites, sites that were not Facebook, and send that data back to Facebook. These were partnered websites, companies that had partnered with Facebook to be part of this program. So if you went online and you started shopping for shoes, and if you were visiting, you know, these sites that were part of this Beacon program, your activities would be sent back to Facebook for the purposes of targeted advertising, which meant the next time you jumped on Facebook, you would start seeing shoe ads.
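To picture the general mechanics being described, here is a minimal sketch of how a Beacon-style partner site could report off-site activity back to a platform. This is an illustration of the pattern only, not Facebook's actual Beacon code; the endpoint URL and field names are invented.

    import json
    import urllib.request

    # Invented endpoint; Beacon's real interface was never public.
    TRACKING_ENDPOINT = "https://social-platform.example.com/beacon"

    def report_activity(user_id, action, item):
        # Package one off-site event, e.g. ("user-123", "browsed", "running shoes"),
        # and send it back to the platform for ad targeting.
        event = {"user": user_id, "action": action, "item": item}
        request = urllib.request.Request(
            TRACKING_ENDPOINT,
            data=json.dumps(event).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        return urllib.request.urlopen(request)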
Speaker 1: The problem was Facebook didn't alert users to this program, nor did it allow people the chance to opt out of the service. And so Facebook was tracking users even if they weren't actively connected to Facebook at the time. As long as they were logged in, you know, not necessarily active on the site, but they hadn't logged out of their account, this activity would continue to be tracked as they went to other sites. After users brought a class action lawsuit against Facebook, the company shut down Beacon in 2009. That doesn't mean Facebook said goodbye to targeted ads. Far from it. And that brings us to something really important. Facebook makes the majority of its revenue from advertising, and ads only make money if there are people to see them. It would not make any sense to put up a billboard in the middle of Antarctica, right? I mean, no one would ever see the billboard. That would just be a waste of money. So you've got to put ads where people are going to see them. Facebook's business model revolves around getting as many people to see as many ads as possible. So for Facebook, that means getting as many people to log into Facebook as they can and then keeping those people on Facebook as much as possible. That's the end goal, because it means serving more ads to more people. We'll talk about stuff like engagement a bit later, but really engagement isn't the important bit. Like, if Facebook could guarantee that it would get the same number of ads viewed without worrying about engagement, they would drop the whole thing about engagement, trust me. The important bit is really just getting eyeballs on ads. That's it. Now, you could argue that the sole purpose of Facebook, at least from the business standpoint, is just that: getting eyes on ads.
Speaker 1: The features, the connectivity, all of that stuff is just the bit that brings folks in to use the platform. But from a revenue side, it's just about getting as many folks to look at ads as possible. That's where the algorithm comes into play. And when people talk about Facebook algorithms, they typically mean a couple of different ones. One is the way that Facebook determines what shows up in a user's news feed, what you see when you log into Facebook. Longtime Facebook users often lament the fact that the site no longer posts stuff in reverse chronological order on your news feed. You used to be able to really do that, for real. Like, you would see the most recent posts at the top, and you could just scroll down and go back in time. You would see the older posts, from most recent to oldest, and you could get caught up. You could actually do that. But that means that your time on Facebook would be kept to fairly short bursts, unless you've just got a lot of chatty friends who are constantly posting. And since revenue comes from having more people seeing the site for as long as possible, that's not great for Facebook, right? That's why the reverse chronological order is not a huge priority for Facebook, because even though users might love it, they wouldn't necessarily spend as much time on the platform. So the company set out to make algorithms that would select content across your friends and interests in an effort to keep users glued to the site longer. You might end up missing stuff from your friends. In fact, I guarantee that's happening to you if you are on Facebook: you are not seeing stuff that some of your friends are posting because the algorithm isn't promoting it to you. And it's because the algorithm chose to promote something else for you to see instead of the stuff that your friends are posting.
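To make the contrast concrete, here is a minimal sketch of the two feed strategies in Python. The post fields and the engagement scores are invented for illustration; Facebook's real ranking signals are not public.

    from datetime import datetime

    # Invented example posts; "predicted_engagement" stands in for
    # whatever signals the platform actually uses.
    posts = [
        {"author": "friend_a", "time": datetime(2021, 11, 1, 9, 0), "predicted_engagement": 0.2},
        {"author": "friend_b", "time": datetime(2021, 10, 30, 8, 0), "predicted_engagement": 0.9},
    ]

    # The old way: reverse chronological, newest first, nothing hidden.
    chronological_feed = sorted(posts, key=lambda p: p["time"], reverse=True)

    # The algorithmic way: rank by predicted engagement instead, so an
    # older but "engaging" post can outrank your friend's newest one.
    ranked_feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)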
Speaker 1: We can also talk about algorithms in the sense of advertising algorithms, that being the algorithms that determine which ads get displayed to a specific user. But the news feed algorithm tends to be the big one that most people mean when they say the Facebook algorithm. So this has actually gotten Facebook into trouble a few times, even before we started seeing stuff like the Cambridge Analytica scandal, or the various times that Congress has called Facebook executives to testify about stuff like misinformation and hate speech. Way back in 2015, content sites in general saw a seismic shift because of something Facebook did, and this would end up disrupting the entire industry of online content, from news organizations to entertainment sites. And that shift had a name. It was called pivot to video. Just saying those words fills me with dread. So, making content is hard. I do it nearly every day. It can be grueling. You know, writing up long pieces takes a lot of time and research and a lot of specialized talent if you want to do it well. I mean, you could just be a content farm and dump endless numbers of articles that are hastily written, badly researched, and poorly edited. You can do that and try to make it up in volume, but if you go the quality route, you tend to see better results. And in 2015, we saw a convergence of several things that would prompt a lot of content companies to change the way they do business. First, ad revenue for owned and operated sites was hurting. By that, I mean if you were a media company and you had your own website, you were starting to see lower amounts of traffic, which meant you were also seeing lower payouts on ads. It was probably devaluing the ads that were on your site.
Speaker 1: In other words, when companies make these ad deals, they negotiate with the advertisers a certain amount per view or per click or per action or whatever the criteria are. And if your site as a whole is doing really well, then you can demand a higher rate. It's kind of like how, in the US, commercials during the Super Bowl are incredibly expensive, because everyone knows that that's where a ton of attention happens to be focused at that particular time. So the ad space during a game like the Super Bowl ends up being really, really valuable. Same sort of thing here: if your site is doing really well, then you can command a higher price. But at this point, content companies were starting to have difficulty. They were seeing the value of their ad deals going down, not up. Traffic was going down, not up. And a lot of sites are dependent upon stuff like search engines as well. If you rank high in Google, that's really important. The Google algorithm is another example of an algorithm that you could argue could be evil, but ranking high means there are more chances for people to find your work. Right? Someone searches a particular term, your site comes up. That's great for you, and thus more people will end up going to your site and seeing the ads that are on your site, and you generate revenue. We saw some companies experiment with stuff like paywalls, you know, getting away from the whole advertising thing and saying, well, we will instead invite users to pay a subscription to get our content. A lot of users balked at that because they were used to getting their content for quote unquote free, though we know it wasn't really free. It was just that the money wasn't coming directly out of the user's pocket in those cases.
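For a rough sense of that arithmetic, here is a tiny worked sketch using a made-up CPM (cost per thousand impressions) rate. The numbers are invented, not industry figures; the point is just that falling traffic cuts revenue directly.

    # Ad revenue at a negotiated CPM rate (dollars per 1,000 ad views).
    # Both figures below are invented for illustration.
    cpm_rate = 2.50
    monthly_impressions = 4_000_000

    revenue = (monthly_impressions / 1000) * cpm_rate
    print(f"${revenue:,.2f}")  # $10,000.00; halve the traffic and this halves too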
Speaker 1: Now, a big part of the problem was that rather than go to owned and operated sites, a lot of people were just gobbling up content on places like Facebook. So social network platforms became a competitive marketplace for content creators, with many trying to find ways to craft their work just so that it would be effective when shared on social media. That doesn't mean it would be the best work you've ever made, but rather you were trying to make stuff that was shareable. Oh man, I remember this era and it was bleak. Anyway, at the same time, you had smartphone companies that were really coming into their own. I mean, the smartphone was starting to dominate the computer space. They had premiered less than a decade earlier, but the swift proliferation of smartphones meant that people were starting to consume content a little bit differently than they had before. A good site would optimize for mobile access, and we started seeing people say that the future of the web was really in how it would cater to mobile devices. And we saw various content sites looking for ways to cut costs. This was the other big element here. One way is to reduce your writing staff. You can have layoffs. But you still need to make a lot of content, or else you have no inventory to sell to advertisers, and then the revenue dries up. And that's what would set the stage for a really big problem. I'll explain more after we take this short break.

Speaker 1: Okay, this all brings us up to the pivot to video movement. So in early 2015, Facebook announced a shift to video, indicating that the platform would prioritize video content from third party publishers. And that implies that it would simultaneously de-emphasize content that wasn't video. So say you were a third party publisher and you were making stuff like articles (HowStuffWorks is a great example; I used to write for HowStuffWorks).
Speaker 1: And if you were doing these articles and you were hoping that at least some of your traffic was going to come from social networking sites, you hear this news and you think, oh well, now even if I write a great article, and I've given it a fantastic title, and it's really shareable, the Facebook algorithm is going to de-emphasize it because it's not video. It's written content, and the Facebook algorithm has decided to promote video content instead. There were a lot of content companies that legitimately got real nervous about this, for good reason. So the message appeared to be: if you want your content to be seen on Facebook, you should probably put it in short form video. Now, why would Facebook even do this? Well, one big reason is that if you click on a link in Facebook to go read an article, you leave Facebook. And since the revenue model for Facebook is again all about keeping as many eyeballs on the site for as long as possible, that means Facebook hates that. But video? Facebook could build a media player, and they did, that could work within the news feed and just host videos natively on the Facebook site, so you didn't go away from Facebook to watch the videos. They played within the Facebook page, so people could just click on those videos and watch them and not leave Facebook. Facebook loved that idea. If users' news feeds just filled up with video after video, and all of them called, you know, Facebook home, even though the actual content was coming from other companies, well, that would just work out fine for Facebook. The shift to video would have an enormous impact on content companies. Several companies began cutting back on staff, keeping a small group of writers and editors whose main job wasn't to publish articles, but rather to write and produce as many videos as possible. I saw this happen in front of my own eyes, but that's a story for a different time.
Speaker 1: Anyway, a lot of talented people found themselves out of a job as dozens of content companies made the pivot to video. You also heard a lot of stories, kind of around the same time, where the perception of journalism was taking a pretty big hit, like people were starting to say that journalism was dead. That kind of ties into this too. Anyway, companies chose to pivot to video in part because they could have a smaller video department than a writing staff. Often that would also involve bringing in new employees right out of college to work on the video team, because new employees often have lower salaries than the more established employees who might be let go. But companies wouldn't have done this unless the revenue side was also in place, right? Like, it's one thing to cut costs, but it means nothing if you're not also bringing in the cash. You have to be bringing in revenue. And the problem was the industry was responding to a misrepresentation of the facts, an error in other words, or, if you are particularly cynical, a lie. See, in January, Facebook presented some pretty startling figures, saying that Facebook users were watching more than a billion videos per day. What's more, the company said that users were watching tons of ads in the videos in the process. In other words, producing videos to publish on Facebook would be a great way to show a lot of ads to a lot of people, and the revenue would come flowing in for everyone concerned. But things were not quite playing out that way. Some companies had gone so far as to ditch their owned and operated sites, and they relied exclusively on building out content to live on sites like Facebook, which, I need to say, in case you didn't pick it up from my tone, is a terrible idea.
Speaker 1: They were trying to respond to a rapidly shifting landscape. So in their minds they were saying, well, if no one's coming to our website, then we're just wasting money maintaining it, so let's just change over to Facebook. So the pivot to video phrase would also end up taking on an additional meaning, a couple of them, actually. For example, one of them was a cheeky way to refer to layoffs. You know, layoffs are coming up; it's because the company is quote unquote pivoting to video. So that kind of tells you all you need to know about that particular phrase. It became a euphemism. Meanwhile, the revenue numbers weren't actually matching up with what was to be expected based upon Facebook's sales pitch to the industry. And in February 2016, Facebook essentially said, whoops, our bad. Remember when we talked about how many ads the average person was watching in videos? It turns out we overestimated that a bit, perhaps by as much as eighty percent. So imagine for a moment that you want to open up a business. Let's say it's a bakery, and you're looking for the perfect place to locate it. And you have a realtor who tells you that they have a spot that's in an area that gets a ton of foot traffic, and it's ideal for your location, it's in your price range, it's ready to go. So you sign the deal, you rent out the spot, you build out your bakery. But then you find out the realtor quote unquote overestimated the foot traffic. That would be a huge blow to your bakery business, right? I mean, now you don't have that foot traffic to depend upon to get customers. It's gonna be way harder. It doesn't mean that you've necessarily outright failed, but you're not going to have the path to success you thought you were. You might not even be able to stay in business.
Speaker 1: Well, here you had all these advertising companies that had banked on this, and all these content companies that had put all their eggs in the Facebook video basket. Some of the content companies no longer had their own websites, so they had really decided to go all in on short video content being the future of content creation. And it turned out that Facebook had fudged, or at the very least misinterpreted, the numbers. A lawsuit in 2018 alleged that Facebook learned of this overestimation error earlier but kept it a secret, not admitting to the problem until it became unavoidable. In fact, the lawsuit alleged that the overestimation wasn't the sixty to eighty percent Facebook had admitted to. Facebook had said that it accidentally overstated the numbers of people who were watching ads by sixty to eighty percent, and the lawsuit said, no, no, no, it was more like you misrepresented it by a hundred fifty to nine hundred percent, drastically overstating how many people were watching ads in these little videos. Facebook would eventually settle with advertisers out of court, agreeing to pay around forty million dollars to the advertisers, but the content companies were kind of left in the lurch. Now, during this era, from around late 2014 to 2016 or so, Facebook's algorithm was favoring video over other types of content. But since people weren't actually watching an endless stream of videos with ads, that meant that the strategy of video wasn't working. It wasn't working for the content companies, and more importantly, it wasn't working for Facebook. It wasn't keeping people on Facebook the way they wanted it to, and so the company would change its algorithm. Now you could say that Facebook pivoted away from video. Meanwhile, all those companies were, you know, left behind.
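To see what an overstatement of 150 to 900 percent means in practice, here is a tiny worked sketch. The watch-time values are invented, purely to show the arithmetic.

    # Overstatement percentage = (reported - actual) / actual * 100.
    # These average watch times are invented, just to illustrate the math.
    actual_seconds = 10.0
    reported_seconds = 35.0

    overstatement = (reported_seconds - actual_seconds) / actual_seconds * 100
    print(f"{overstatement:.0f}%")  # 250%, inside the 150-900% range the suit alleged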
Speaker 1: And granted, there were tons of folks who had been saying, you know, don't do this, and then they could say I told you so to the people who were relying on a third party. You know, there are a lot of people who say it's a bad idea to rely on a third party for distributing your content, that if you don't own it, when something changes, you can be hurt. And this happens all the time. It happens with Facebook, it happens with Google, it happens with YouTube. Every time there's a change in the algorithm, you'll hear about how certain sites or certain content creators benefit and others suffer. But you know, the fact is that a ton of folks had jumped on that bandwagon already, and they were all hit negatively by this whole shift. Anyway, let's talk about what Facebook's shift in its algorithm actually meant. Now, I have to talk about this in kind of vague and high level ways, because Facebook, like Google, is not transparent about its algorithms. We know a lot of stuff because of the leaked internal documents that came to light recently, but the company has never been super upfront about how it determines the content you see on your news feed. Generally speaking, Facebook shifted its focus toward content that quote unquote drives engagement. That is, content that seems to inspire people to interact with it in some way, like leaving a comment, or clicking like or dislike, or sharing a post. Oh, Facebook really loves it when people share posts. The actual content of the post doesn't matter to Facebook. They are content agnostic when it comes to that. What matters to Facebook is that people are engaging with that content and thus spending more time on the platform. Yeah, that's what it really boils down to. If you're engaging with content, it means you're still on Facebook. You haven't left. And like I said in the beginning, at the end of the day, that's what it's all about.
Speaker 1: Facebook's just concerned with keeping as many people on the platform for as long as possible, and that's it. So promoting material that gets people to spend more time on the platform just makes sense from that business perspective. Unfortunately, one of the crappy things about humans is that we tend to respond to outrage more readily than just about anything else. If something makes you mad, you're way more likely to act on it. And a lot of posts that quote unquote drive engagement are really all about outrage, from different perspectives. Like, a post could be about how a certain group of people are awful. Whether those people are awful or not doesn't matter. It's the fact that someone has made a post saying that. That's what matters. So the people who agree with the post, who might harbor strong negative feelings about the people who are mentioned in the post, they chime in. Others sympathize with the people being singled out, maybe they are members of that group, and they leap to the defense or they go on a counterattack against the person who posted it. There's enough outrage to go around, so you end up with a bunch of angry people waging battles across Facebook comments sections. And the algorithm, seeing that certain posts are generating a ton of quote unquote engagement, steps it up to promote those posts into more news feeds, and thus spreads them further and wider. Because if the engagement is going up, it means more people are spending more time on Facebook, so there are more ads to serve. Now, there's a lot going on here from a sociological standpoint, right? I mean, we've seen hate speech, racism, and misogyny spread like wildfire across Facebook. And while the battles in the comments rage, the algorithm just keeps stirring things up. We have to remember there are real world consequences to this online activity. People in vulnerable populations can be made to feel unsafe. That's a powerful and destructive thing.
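Here is a minimal sketch of the feedback loop being described: interactions raise a post's score, a higher score means wider distribution, and wider distribution collects still more interactions. The weights and fields are invented; Facebook's real scoring model is not public.

    # Invented engagement weights. The scorer is content agnostic:
    # an outrage-bait post and a dog photo are scored the same way.
    WEIGHTS = {"likes": 1.0, "comments": 2.0, "shares": 3.0}

    def engagement_score(post):
        return sum(WEIGHTS[key] * post[key] for key in WEIGHTS)

    def rank_feed(posts):
        # Promote whatever drives interaction; what the post says is ignored.
        return sorted(posts, key=engagement_score, reverse=True)

    posts = [
        {"text": "cute dog photo", "likes": 120, "comments": 4, "shares": 2},
        {"text": "group X is ruining everything", "likes": 40, "comments": 300, "shares": 90},
    ]
    # The angry post wins: 40 + 600 + 270 = 910 versus 120 + 8 + 6 = 134.
    print(rank_feed(posts)[0]["text"])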
Speaker 1: Folks can be driven to extremist views, joining echo chambers that reinforce beliefs and stereotypes that are outright harmful. People who are neither targeted nor pulled into groupthink might find themselves just distressed over the whole thing. It's pretty ugly stuff. And this is how misinformation and hate speech take hold on Facebook. The algorithm, which is again content agnostic, recognizes that whatever is being posted is effective, and that's all that really matters, so we get an amplification chamber. Now, I don't think anyone at Facebook necessarily saw this coming, at least not early on, or that the folks who worked on the algorithm intended it to spread harmful material in a way that is alarmingly effective. I think they just didn't bank on human nature being what it is, or the fact that you have certain agents out there, Russian-backed and Chinese-backed agents, who, once they figured out how to game the system, would flood Facebook with posts designed to drive engagement and spread misinformation, and thus get caught up in this algorithmic promotion and spread harmful messages. But the fact remains that this is what we're seeing play out. Facebook has attempted to address this with various content moderation strategies, but the company does so while also trying to walk a fine line and not trigger criticism from various sources, such as the conservative right. The fear is that too much moderation will be perceived as censorship, and that the company is actively trying to silence conservative voices. Facebook has been the target of such criticism on multiple occasions, so the solution tends to be to use a very light touch when moderating content. Meanwhile, the messages continue to populate Facebook users' news feeds. Now, I called this episode The Evil Algorithm, and again, I don't necessarily think an algorithm is on its own evil, but I do think continuing to rely on an algorithm that is demonstrably causing harm is pretty darn evil.
Speaker 1: At the very least, it is unethical and irresponsible. However, you know, again, it's at the very heart of Facebook's business strategy. And that was something Charlie was saying on Pod Save America: that the very core of Facebook's purpose, like its mission as far as a business is concerned, is at odds with, you know, not causing harm. Like, those two things can't really coexist, because of just the way it has evolved from this site that was meant to rank the attractiveness of female students at Harvard. Who would have thought that such a thing could go so horribly wrong? Anyway, that's it for this episode. I really wanted to lay it out because I feel like when we talk about algorithms, we get so loosey-goosey with our words that it just becomes, like, a blanket term that doesn't really mean anything. If you have suggestions for topics I should cover on future episodes of TechStuff, please reach out to me. The best way to do that is over on Twitter. The handle for the show is TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.