Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Speaker 1: Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. And I don't get to do this very often, so indulge me. But in Shakespeare's classic tragedy Hamlet, our protagonist and perpetually moody Dane says at one point, "there is nothing either good or bad, but thinking makes it so." What he means there is that, at least in his eyes, all things in the universe are just as they are. They are neither good nor bad. It's only how we think about them that makes them good or bad. This is similar to a philosophy I've talked about on this show when it comes to technology. Technology in and of itself has the capacity to help or to harm, but the underlying tech isn't necessarily good or evil. It's how it's put to use that makes all the difference. Now, granted, there are some technologies that are harder to use in a way that is a net positive, you know, like nuclear warheads, for example. Not that people haven't tried, they have, but you get my meaning. Well, one technology that has far-reaching effects and consequences that can end up being positive or negative would be the algorithms underlying stuff like search and social networks. Specifically, today I'm going to be talking about Facebook in this episode, but algorithms underlie the various computer systems and online systems that we rely upon, and they have the potential to be harmful depending on their implementation. But before I start talking about the effects of algorithms, let's back up a little bit and define that. What exactly is an algorithm? Basically, an algorithm is a set of directions or rules, and that's really it. There's nothing special about this definition. These rules guide how computers calculate operations and solve problems. It's a finite sequence of instructions for computers to follow when they're executing a specific task.
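To make that a little more concrete, here is a minimal sketch of an algorithm written out in Python. The example itself, Euclid's old routine for finding the greatest common divisor of two numbers, is mine and not something from the episode; the point is simply that it is a finite, unambiguous sequence of steps that turns an input into an output.

```python
def greatest_common_divisor(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, unambiguous sequence of steps.

    At every point there is exactly one thing to do next, and the loop
    is guaranteed to finish, which is what makes this an algorithm
    rather than a vague set of suggestions.
    """
    while b != 0:
        a, b = b, a % b  # replace (a, b) with (b, a mod b)
    return a


print(greatest_common_divisor(48, 18))  # prints 6
```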
Speaker 1: The instructions in an algorithm are unambiguous, which means the computer system doesn't have to make judgment calls or anything like that while executing the instructions. There is a clear path to follow based on whatever the situation is at hand. A different situation might lead to a different outcome, but the instruction path would still remain clear if you were to dig down deeply enough to see what was going on behind the computer's actions. The number of potential paths to various outcomes could be enormous, and so to us it would look like a really complicated system and very difficult to grok. But if you really got granular, you would see the logic, as it were, that connects the beginning state to the end state of any given operation. At Google, search algorithms include elements that determine what position a particular query result will take on a results page. So when we search for something using an engine like Google, there are several things going on in that split second before our results pop up. For one thing, Google has an index of all the web pages it has crawled, and so Google will reference that index to find all the entries in that index that relate to the terms that you put in your search query. And then those results are ranked by a few different criteria. So way up there is relevancy, like how relevant is each search result to the original search query. If I were to search for Captain America, Google is going to put the pages about the fictional superhero way up towards the top. Search results about someone who happens to hold the rank of captain, and where there's somewhere a mention of the region of America, would likely fall much lower on the search results list, because even though those pages will have relevant terms in them, the chances are it's not what I'm actually looking for. Context is important, but context is just part of it. There are thousands of pages that mention Captain America.
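As a rough illustration of that retrieve-then-rank idea, here is a toy sketch. The pages, the index, and the crude term-counting score are all made up for the example; a real search engine weighs far more signals than this.

```python
# Toy sketch: look up matching pages in an index, then rank them by a
# crude relevance score. Everything here is invented for illustration.
index = {
    "Captain America: The First Avenger": "captain america marvel superhero captain america shield",
    "Captain America comic history": "captain america comics captain america origin marvel",
    "List of U.S. Army captains": "captain army officer rank served america",
}


def relevance(query: str, page_text: str) -> int:
    # Crude relevance: count how often the query terms appear on the page.
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())


def search(query: str) -> list[str]:
    # Score every indexed page, then order results from best match to worst.
    scored = [(relevance(query, text), title) for title, text in index.items()]
    return [title for score, title in sorted(scored, reverse=True) if score > 0]


print(search("captain america"))
```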
Speaker 1: Some of those pages are from really good resources, meaning they are resources that are dependable, they're informative, they might be entertaining, you know, whatever. Some are from garbage sites that don't provide any real valuable experience. So then you've got the ranking system within Google that determines how the most relevant search engine results will appear on the results page. Now, if there are thousands or even millions of hits, how does Google determine which ones show up on the first page of results? Well, there's a complicated system that determines that ranking, and it's not really in the realm of this episode to go into it, but suffice it to say that Google takes a lot of things into consideration when ordering search results. Ostensibly, this is all in an effort to give you the most relevant and helpful results based on your search query. For the people, companies, and organizations that are making pages, it's always a goal to rank really high on that list, preferably on that first page, and if you can, the top half of that first page, because most people will only look at the first few search results and then they'll ignore everything that comes afterward. Over the years, Google has made it really tricky to game its engine, which means it's hard to come up with effective strategies to show up higher in search results. Your best bet is really to create just really good resources around the particular topic of interest in question, and also, you know, happen to be a really respected resource, which is not easy to do, especially if you're just starting out, obviously. Also, whenever Google makes changes to its algorithm, it can drastically affect how your page appears in search results, even with a perfect query. And when TechStuff was still part of HowStuffWorks, that was something we were always concerned about. A change in the Google algorithm would mean that our web traffic could either skyrocket or it could plummet. There was no change in the quality of our work. Our work was just as good as it had been before.
Speaker 1: It would just be a change in how our work would show up in search results. However, that's the Google algorithm. With other algorithms, like the ones on Facebook, those can be easier to game if you know what you're doing. And what's more, there are algorithms on Facebook that allow for advertisers to engage in some discriminatory and potentially criminal behavior. So we'll look at how the various algorithms on Facebook can have a negative impact because of the way they are implemented and because of the way that people are taking advantage of them. Now, in the beginning, Facebook didn't have an algorithm at all. Heck, it didn't even have a news feed when it first started. Instead, you could post stuff to your profile, and then you could visit the profiles of friends to see what they had posted. You could leave comments on stuff, but it still required you to actually go over to your friend's profile. There was no aggregator for content just yet. That changed in two thousand six, when Facebook introduced the News Feed. Now your homepage on Facebook had a record of your friends' activities, just listed in reverse chronological order, you know, the most recent things appearing at the top of the list. You could log in and see what your friends had been up to since the last time you were on, and you could catch up. There's still no algorithm to speak of yet. And, as would become a hallmark of any change made on Facebook ever, a lot of users objected to the News Feed, and the main objection was that people felt it was a violation of privacy. They didn't like the idea that the activities they were doing on Facebook were being broadcast to all of their friends. Which, okay, this one stumps me a little bit, because the profiles on Facebook were public unless you set privacy features to make them otherwise. So if you changed your relationship status on your profile, for example, why would you object to it being broadcast to your friends?
Speaker 1: If anyone visited your profile, they would see the difference. So either you don't change it, or you change the privacy settings on your profile. But otherwise, why would you be upset? I think the main takeaway is just that people hate change, and if you don't want something broadcast to everybody, maybe you shouldn't post it. A couple of years later, Facebook introduced the first iteration of their algorithm, which would determine what users would see in their news feed and in what order those things would appear. So rather than getting a full account of all of your friends' activities in reverse chronological order, users would see a selection of stuff that the algorithm had determined was the best for that user. So how did that work? Well, generally speaking, the algorithm took three main criteria into account to select the stuff that would show up on news feeds. One was user affinity, which was an indicator of how close the relationship is between one Facebook user and another, or a Facebook user and a brand. And this was measured from a one-way perspective, and that can be a little confusing, so let's make an example. We'll take three fictional people to explain how this affinity worked. So in this example, we've got Sam, Dean, and Castiel. Okay, so Sam and Dean are super close, right, and they often comment on each other's posts, and they'll hit like on each other's statuses and all that kind of stuff. So they tend to see one another's posts a lot in their own news feeds when they log onto Facebook. They each have a high affinity for the other. Then we've got Castiel. So Castiel is friends with both Sam and Dean, but really he's much closer to Dean than he is to Sam. And Castiel responds to a lot of Dean's posts, and as a result, Castiel sees Dean's stuff pop up in his, that is, Castiel's, news feed pretty regularly. Castiel is not as close with Sam, and he doesn't interact with Sam as much.
Speaker 1: So Sam's stuff can pop up in Castiel's news feed, but it's not as frequent or as sure a thing as when Dean posts something. Now, on top of that, Dean likes Castiel, but you get the feeling that this relationship is not a perfect two-way street. So Dean only interacts with some of the stuff that Castiel posts online. So Dean sees Castiel's posts on occasion in his news feed, but not nearly at the same frequency at which he sees Sam's posts in his, that is, Dean's, news feed. And yes, I just summarized the show Supernatural in an example about Facebook algorithms and affinity. Now, as we see, the affinity one person has for another helps determine how much of that other person's stuff the first person is going to see, but it only works in that one way. If I like your posts a lot, then I'm going to see more of your posts in my feed. But if you never interact with me, you're not going to see a lot of my stuff in your feed. That's what I mean by that one-way perspective. Of course, there are other ways to weight this. For example, you can tag someone in a post, which will definitely get their attention much more effectively. But let's move on. So, the second component of the EdgeRank formula: we had user affinity as component one; component two is the weight of the post, W-E-I-G-H-T. So different kinds of posts have different weights, and the greater the weight, the more likely it will pop up on a given friend's feed. So, generally speaking, with EdgeRank, stuff that had photos or videos had the most weight, followed by posts that contained links, followed by plain text posts. Now, the reason for this is that Facebook analysts had figured out that posts that had stuff like photos and videos would tend to get more engagement. I'll get back to engagement a bit later in this episode. Just know that it is a critical concept for Facebook.
Speaker 1: The third component factored into EdgeRank is called time decay, which essentially translates to: how old is this post? And the idea here is that more recent posts are likely more relevant than older posts. It doesn't do you any good to see the announcement of an upcoming get-together if, in fact, that get-together actually happened last week, so you're not likely to see the initial post pop up at the top of your news feed if you've already missed everything. As the post ages, the value of the post decreases. So when you log into Facebook, or open the Facebook app or whatever, this algorithm would run with your particular situation in mind. It would scan through all the stuff that had been posted by the friends you have and the brands you follow since your last session, and then it would select and rank, or order, the status updates that you would see in your news feed based off of those criteria. But this would also mean you would miss out on stuff. No longer would you be able to see everything that all of your friends were doing, listed in reverse chronological order. In fact, at times, chronology didn't have anything to do with the order in which status updates would appear on your news feed. A post several hours old might appear above one that was posted just five minutes before you logged in, all because of those other factors. For some Facebook users, including me at the time, this was really frustrating. It made us feel as though we were missing out on stuff our friends were saying, because we were. I found it particularly irritating because I wasn't one to frequently engage with updates. I didn't like a lot of them, and I wouldn't comment a ton of the time. I would read everything, but I wasn't interacting with it, so the algorithm wasn't getting a really good idea of who my good friends were versus, like, people who were just my acquaintances.
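To pull those three ingredients together, here is a back-of-the-envelope sketch of how an EdgeRank-style score might be computed for one viewer. The specific weights, the decay curve, and the exact formula are invented for illustration; Facebook never published the real numbers, only the general idea that affinity, post weight, and time decay combine into a single ranking score.

```python
import time

# Toy EdgeRank-style scorer. The numbers below are made up; only the
# shape of the calculation (affinity x post weight x time decay)
# reflects the three components described above.
TYPE_WEIGHT = {"photo": 3.0, "video": 3.0, "link": 2.0, "text": 1.0}


def score(post: dict, viewer_affinity: float, now: float) -> float:
    age_hours = (now - post["posted_at"]) / 3600.0
    time_decay = 1.0 / (1.0 + age_hours)  # older posts fade toward zero
    return viewer_affinity * TYPE_WEIGHT[post["kind"]] * time_decay


def build_feed(posts: list, affinities: dict, now: float) -> list:
    # Rank every candidate post for this one viewer, highest score first.
    return sorted(
        posts,
        key=lambda p: score(p, affinities.get(p["author"], 0.1), now),
        reverse=True,
    )


# Example: Dean's two-hour-old photo outranks Sam's newer text post in
# Castiel's feed, because Castiel interacts with Dean far more often.
now = time.time()
posts = [
    {"author": "Dean", "kind": "photo", "posted_at": now - 2 * 3600},
    {"author": "Sam", "kind": "text", "posted_at": now - 1 * 3600},
]
castiel_affinity = {"Dean": 0.9, "Sam": 0.3}
for post in build_feed(posts, castiel_affinity, now):
    print(post["author"], round(score(post, castiel_affinity[post["author"]], now), 2))
```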
Speaker 1: It meant that some people who were my closest friends in reality were not the ones that I was seeing, you know, being active on Facebook. Even if they were, I just wasn't seeing those posts. And my case was not unusual. A lot of people went through this, and I missed stuff, sometimes big stuff, like an engagement announcement or something. EdgeRank was just the beginning, and actually that particular algorithm didn't stick around that long in the grand scheme of things. Before long, no one within Facebook was really using the term EdgeRank anymore, because they had moved towards a more sophisticated machine learning based algorithm. This one was far more sophisticated than EdgeRank. It was not limited to just those three main factors, although they were still components within the algorithm. Instead, according to Lars Backstrom, who was the engineering manager for news feed ranking at the time, this new one used somewhere in the neighborhood of one hundred thousand weights to determine what users would see and in which order. Now, you would think that with that many weights it would be hard to game the system, but you'd be wrong. Now, a couple other things I want to mention before we go to break. One of those is that, yes, there was an option, and still is, to try and list things by most recent, as opposed to what is most relevant, or whatever Facebook thinks is most relevant. Except that you're not really seeing everything that way either. You're not getting the most recent, reverse chronological list of all of your friends' activities. It's still a selection of what your friends are posting, and it's a selection that the algorithm has determined is the most relevant for you. It's just that those posts will be listed in reverse chronological order instead of kind of that haphazard, all over the place method. But the fact is you're still not seeing everything, which still frustrates a lot of people.
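To give a sense of what "weights" means in the kind of machine-learned ranker mentioned a moment ago, here is a deliberately tiny sketch. The feature names and numbers are invented; the real model reportedly used on the order of a hundred thousand of them, learned from data rather than written by hand.

```python
# Each candidate post gets described by many features; the ranker multiplies
# every feature by a learned weight and sums them into a single score.
weights = {
    "is_photo": 1.8,
    "from_close_friend": 2.5,
    "age_hours": -0.1,  # negative weight: older posts score lower
    "comment_count": 0.4,
}  # ...imagine roughly 100,000 of these instead of four

features = {"is_photo": 1.0, "from_close_friend": 1.0, "age_hours": 5.0, "comment_count": 12.0}

score = sum(weights[name] * value for name, value in features.items())
print(score)  # higher score means the post shows up higher in the feed
```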
Speaker 1: And when we come back, we'll talk about the algorithm, how people have used creative approaches to game the system, and some unintended consequences that algorithm has led to. But first, let's take a quick break.

Speaker 1: So we're at the point where an algorithm starts to serve up content catered to each individual Facebook user, and the goal was always the same from the very beginning: keep people engaged, which really just means keep people on Facebook for longer. And this gets to Facebook's method of generating revenue. Facebook's primary source of revenue is advertising. And when I say primary, according to Investopedia, advertising accounts for nearly all of Facebook's revenue, which for the third quarter was twenty one point five billion dollars. Holy cow. Princely sum doesn't even begin to cover it. Facebook sells ad space to various companies to serve up advertising to Facebook's users. So essentially, think of it like a billboard. Facebook is like a giant billboard, but it's a very special billboard. It's incredibly valuable. There's another component to this I'll touch on later, which revolves around targeted advertising, but let's just stick with the basic value proposition here. So Facebook has space on its apps and its pages, and in that space it has the ability to show ads. They could be on the side, or they can be fed into the news feed as well, so they're kind of interspersed along with the status updates from friends and stuff. The longer that people use Facebook in any given session, the more ads they will encounter as they interact with Facebook, which also means that they could potentially interact with the ads. So part of what Facebook really needs to do is to find ways to convince people to spend as much time as possible on Facebook. In this use case, time really is money. Greater engagement doesn't just mean more opportunities to see ads; it also means that Facebook can demonstrate its own value to advertisers and thus command higher prices.
Speaker 1: To put it another way, let's say you own a few billboards around town, since I've already mentioned them. So let's say you have a billboard that is on a back road that gets just a little bit of traffic every day, and you happen to own another billboard space which is on a much more heavily used road that goes right into the heart of town. The rates that you would be able to charge for the back road billboard are probably going to be a lot lower than what you can command with the heavily traveled thoroughfare. More folks are going to see that second billboard, which makes it more valuable. On a similar front, if Facebook can show numbers that indicate that more than two billion people are spending more and more time on Facebook, it can command higher prices for advertising. Ad companies and their clients want to be where the eyeballs are, and Facebook fits the bill nicely. So with that in mind, the real purpose of the algorithm ultimately is to serve up stuff to users that will keep them on Facebook longer and convince them to come back to Facebook frequently. Now, some of the things that will bring people back fall in line with a few of the ideas that EdgeRank demonstrated. If you logged onto Facebook but you only ever saw posts from people you only kind of sort of know, and you always missed out on the posts from your close friends, you'd probably not find it a very satisfying experience. So at least some of the stuff you'd be seeing would fall in line with the stuff you'd really want to see in the first place. But maybe there would be some posts you would really like to see, but the algorithm, for whatever reason, doesn't judge those as being particularly engaging. Well, then the chances are slimmer that you would see those posts.
Speaker 1: If you've ever posted something to Facebook that you thought was particularly important or impactful, but you didn't see very much reaction to it, there's a chance that not many people actually saw what you posted. So that's a real possibility. Of course, there's also a chance that they did see it, they just didn't care. It's kind of hard to say for any given post, but it's entirely possible for you to post something that people would have reacted to; they just never saw it, so they never got the chance to. But let's say someone in your circle posts something and it starts to catch on. Maybe it's someone in your circle that you don't interact with much, so usually you don't see very much from them on Facebook. But for whatever reason, this particular post that they made is getting a lot of interaction from other friends, which means the algorithm might say, hey, this post is really taking off, a lot of people are interacting with it, so I'm going to serve it up to even more people, including you, who usually wouldn't see such things from this particular distant friend. So, in other words, posts that get a lot of engagement, that being reactions and comments and shares and that kind of stuff, tend to rise above other posts. Engagement doesn't have to be positive, either. It's not really important to Facebook if the stuff you see makes you happy or unhappy, or if the stuff you see is true or if it's false. What matters to Facebook is that you spend as much time on the platform as they can coax from you, and that you feel the need to return to Facebook frequently. Maybe it turns out you rage-post on Facebook a lot. Maybe you see stuff that you vehemently disagree with, so you post lengthy comments, or you share a post you don't agree with, but you do so in an effort to put your own spin on the thesis of the original post, or to criticize it in some way.
Speaker 1: Your motivations or goals are not important to Facebook. The company could not care less. The platform just wants your time. Well, that's being a bit too reductive. Facebook also wants your posts to inspire other people to engage, using their own comments, likes, and shares, so that it gets their time too. That's really it. They want everyone's time, not just yours. I'm sure most of you already see where this becomes an issue. As we've heard for a few years now, Facebook has become a hotbed for misinformation and disinformation, something that Facebook founder Mark Zuckerberg has tried to avoid addressing for years. There's a saying that's frequently misattributed to Mark Twain: a lie can travel halfway around the world while the truth is still putting on its shoes. I think it's kind of fitting that Twain is often cited as the originator of the saying, but the truth of the matter is that it predates him. It might have been the product of Jonathan Swift, but we don't really know. But the saying is apt. Misinformation is not saddled with the heavy burden of being, you know, true. Misinformation can tap into biases and preconceptions. It can reinforce the beliefs we hold, whether they're positive or negative, even if those beliefs have no real basis in reality. And if we feel that our beliefs are validated, we're more likely to engage with and share those posts, and to feel reinforced enough to do more, to continue sharing that kind of stuff. Now, I've seen examples of this across different ideologies. The narrative we frequently see here in the United States is how right wing misinformation will spread across Facebook, and that certainly does happen, but I've also seen it happen for people who have a left leaning ideology. For example, if a post indicates that someone with a right wing philosophy has done or said something terrible, that can spread rapidly across Facebook.
Speaker 1: But often you can do a little fact checking and find out that there's no actual evidence that such a thing was ever said or done by that person. I've seen this happen multiple times. So what I'm getting at is that no one is immune. That being said, certain groups, particularly those catering to more right wing audiences, have really pounced on the opportunity more than others have. And again, I'm not saying left leaning organizations and people haven't done the same thing; they just haven't been as fast to jump on it, and it's less common, they don't do it as much. Right now, I think it's pretty fair to say that no ethical organization or person would want to perpetuate misinformation or create disinformation. This is a form of manipulation, a way to get people to support something without having to go to the trouble of actually giving evidence that that point of view is the best one. If you're using lies to get people to support you, that's, you know, not great. But on Facebook, that sort of thing can really take off, and as a post gets traction, with people sharing it or commenting on it or reacting to it, whether positively or negatively, Facebook will start serving that content to progressively larger audiences. And you can think of this as like ripples in a pond. That initial circle of a ripple is really small, right? But let's say that that circle represents a certain group of users and they react very strongly to that content. Well, Facebook will then expand that to the next circle outward to reach a larger group of users, and if the trend continues, it just keeps getting bigger and bigger. Now, while this has been an issue with Facebook for ages, we really saw it come under scrutiny when it became clear that Russian operatives were leveraging Facebook to manipulate Americans and spread disinformation, particularly when it came to politics. Complicating matters are fake Facebook accounts, also known as bot accounts.
Speaker 1: These accounts don't represent actual users, but rather are part of larger systems designed to amplify the signal of certain messages. Every content creator out there would love for every post they make to go viral, that is, to become the sort of thing that tons of people see and share with others. Making viral content is pretty darn hard to do on purpose. Often the stuff happens organically, and then it becomes difficult or even impossible to recapture that same magic on subsequent efforts. I'm sure you can think of lots of famous viral videos from people that never seemed to pop up again afterward, or if they did, it didn't have nearly the same effect. There are countless examples of that. But by creating lots of fake accounts, you can amplify a message. You can get that effect of a viral breakout because you're manipulating the system. You're using bots that are either all blasting out the same post at the same time, which is actually really easy to detect (you can often see this on Twitter, where you'll find the exact same message posted by a bunch of seemingly disconnected Twitter accounts, which indicates they're all bots coming out of the same source), or, more likely, you would have the bots interacting with a post, sharing that post and liking the post, thus amplifying the post's importance on Facebook in general and making it more visible as a result. Now, the visibility is one thing, but that interaction also creates a perception among the real humans who are using Facebook that the original post must have merit. If you look at a post and you see that it has thousands of reactions connected to it, you're more likely to pay attention and perhaps even take that post to heart. You know, that has a certain cachet, that social acceptance of a piece. And by lots of fake accounts, I really do mean lots.
Speaker 1: Last year, in twenty nineteen, Facebook announced it had shut down more than three billion fake accounts over the course of half a year. Three billion in six months. Even after that, Facebook estimated that somewhere around five percent of all accounts still on Facebook were actually bots. Considering Facebook has more than two billion users, five percent represents a lot of fake accounts. Now, this doesn't mean that every user is going to see the posts that get that kind of engagement. The algorithm is taking a lot more into account, for example, the types of pages you like on the platform. That matters. If you like a lot of pages that have a more left leaning slant, you know, stuff that falls into the progressive politics category, you're less likely to see really hard right leaning posts as a rule, and in this way, Facebook can become a true echo chamber. The platform, again, wants to keep you there as long as possible, as many times as possible, so the algorithm is prioritizing content that it calculates has the best chance of keeping you on Facebook. If they drive you away, they've done the wrong thing, at least as far as Facebook's revenue generating plan goes. Facebook's crackdown on fake accounts was part of a larger effort to address the problems with misinformation that were growing concerning enough for the US Congress to call Zuckerberg to Washington, D.C., to answer questions about fake news and the like. Zuckerberg had long maintained that he didn't want Facebook to get into the business of arbitrating which bits of information were legit and which ones represented misinformation, kind of this idea of the free market of ideas, where all ideas are given equal platforms. And you can sort of understand his reasoning from a business perspective, because all that engagement really meant money for his company. But with Facebook under scrutiny and criticism, there was a very real danger of the platform being seen as a toxic place for advertisers.
Speaker 1: Companies that aren't too keen on getting looped in with content that falls in with radical ideology were getting nervous, and in fact, we've seen advertisers boycott Facebook for various reasons, some of which relate back to Facebook's failure to address things like groups that are radicalizing individuals, that sort of stuff. So it does have an impact. Shutting down bot accounts was one way to diminish the spread of misinformation on Facebook without actually taking the additional step of Facebook becoming a judge of what is and isn't legit information. But that wasn't quite enough to satisfy critics, who were pointing out that people were using Facebook to spread false narratives and to mislead people and to manipulate large populations, while Facebook, meanwhile, was just raking in the profits from all of that. In fact, a recent article in The New York Times titled "On Facebook, Misinformation Is More Popular Now Than in 2016" shared research from the German Marshall Fund Digital. Reporter Davey Alba found that people were sharing, liking, and commenting on three times as many articles from outlets that regularly publish misinformation as they were back in 2016. So we're actually seeing more of it today. That's not a great trend there, and it's another indication that Facebook is reluctant to intervene in this. And as Karen Kornbluh, the director of the research firm, would say, counteracting this trend, quote, "just runs against their economic incentives," end quote.

Speaker 1: When we come back, it gets worse. But first, let's take another quick break.

Speaker 1: So I mentioned before the break that it gets worse, and early in this episode I alluded to the concept of targeted advertising, and this gets into another aspect of algorithms on Facebook that can be harmful. Okay, so the basic idea behind targeted advertising is really simple, and the goal is to get the right ads in front of the right people. The goal is always to increase the odds that someone is going to act upon that ad.
Speaker 1: Otherwise advertising is just throwing money away, right? So going back to my billboard example from earlier, there's only so much targeting you can do with billboards. Now, you might choose to put up an ad on a billboard that's in a part of town that most closely matches the demographic of your average customer, meaning the people that you cater to happen to live in a certain part of town, so it makes more sense to put your billboard in that part of town. But that's a pretty primitive approach to targeted advertising. Facebook provides a laser-focused, individually precise approach. Every user activity on Facebook gives Facebook more information about the user in question and what they like, which is obvious, but we need to start there. So the pages that you visit on Facebook, the posts you interact with on Facebook, the general information in your profile like your birth date, your location, your relationship status, all of these are valuable pieces of information. You can actually go into Facebook's settings and the ad preferences page and you can see what Facebook has deduced about you. You know, the company will list out what it thinks your interests are, so you can see, oh, this is why I'm getting all of those ads for such and such. It's because Facebook thinks I'm really interested in that. For example, when I was first getting into exercising, which I really need to get back to, Facebook would serve me up all these ads for things like muscle enhancement and protein shake things. There were all these insanely buff dudes popping up in my Facebook news feed, and I was like, well, you know, good on you guys, you've done some great work, but that's not really my jam. But anyway, that is one way that Facebook starts to build out information on you. But that's just the tip of the iceberg.
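At its simplest, that kind of interest-based targeting amounts to tallying up signals about a user and then matching ads against the resulting profile. Here is a toy sketch with made-up page names, topics, and ads; it isn't Facebook's actual system, just the general shape of the idea.

```python
from collections import Counter

# Toy interest profile: tally the topics of pages a user has liked or
# interacted with, then rank candidate ads by how strong that interest is.
liked_pages = ["Gold's Gym", "Protein Planet", "Local Guitar Shop",
               "Gold's Gym", "Marvel Movies"]
page_topics = {"Gold's Gym": "fitness", "Protein Planet": "fitness",
               "Local Guitar Shop": "music gear", "Marvel Movies": "movies"}

interests = Counter(page_topics[page] for page in liked_pages)

ads = [("Muscle-Max Protein Shake", "fitness"),
       ("Budget Electric Guitars", "music gear"),
       ("Lawn Care Service", "home and garden")]

# Serve the ads tied to the user's strongest inferred interests first.
ranked_ads = sorted(ads, key=lambda ad: interests.get(ad[1], 0), reverse=True)
print(interests.most_common())               # [('fitness', 3), ...]
print([name for name, topic in ranked_ads])  # fitness ad first
```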
Speaker 1: Facebook also has marketing partners, a lot of marketing partners, and these partners are also collecting information about the people who are visiting their pages and their habits, including stuff like what items people might have purchased, or perhaps just searched for or looked at. Like, think of something that you've searched for, maybe on a site like Amazon, and you're looking at stuff, you haven't pulled the trigger on buying anything yet, you're just kind of comparison shopping. Well, these sorts of partners share that data back with Facebook, and then Facebook can leverage that information and target specific ads to you based on what you've been doing off of Facebook, in other parts of the web. So if you've ever done a search for something like an air fryer and then you notice that whenever you go to Facebook you're seeing ads for air fryers and other kitchen gadgets in your news feed, that's what's going on. Now, I saw this happen a lot when I was shopping around for a guitar. Every time I would log onto Facebook, I would see a ton of ads for various music shops, some of which were clearly scams. There's a lot of that on Facebook, where you'll see an ad for something that looks like an incredible product, some phenomenal, amazing mask or costume piece, or, you know, a special kit that seems to be catered right to your interests, and more than that, it's at an unbelievably low price, like this is gonna be forty bucks, and you're like, wow, that would normally be hundreds of dollars. Well, this falls into the general category of if it sounds too good to be true, it probably is. A lot of these company pages are really just fronts for stores that sell cheap knockoffs, often with huge shipping costs because a lot of them are located in places like China. So you order the thing that looks incredible, and you're like, wow, this would normally set me back a grand and they're asking for forty bucks.
Speaker 1: And then if you do get a product, it doesn't resemble what you were promised. You end up getting a super cheap knockoff of whatever it was you were ordering. But the expense of returning that product would be so much higher, because of shipping costs, that it would not make any sense to return it. You would still be out more money by returning the product, so you just kind of suck it up. But that's kind of a tangent. Let's get back to targeting. Now, after some criticism, Facebook has made it possible for people to tweak their settings on the platform to limit targeted advertising. So you can change things so that the ads you get are more general and they're no longer hinging upon your identity or your behaviors. Not that Facebook isn't still collecting all of this information. They are, it's just that you're not seeing it reflected in the advertising that appears on Facebook. But it's not always easy to spot these settings, and a lot of people just never bother to even check into it. And that suits Facebook just fine, because remember when I said Facebook can charge more for ads because of their high engagement, the fact that people spend so much time on Facebook? That gets boosted even more when Facebook can tell an advertiser that they're able to put their ads in front of the people who are most likely to respond to those ads. So not only are people spending more time on the platform, Facebook can tell the advertisers, you want to target this specific person because they're the one most likely to respond to your ad. Now, let me be clear, this doesn't have to be a bad thing necessarily. I mean, you might end up seeing ads that are really relevant to you and they could be really helpful. Maybe you would see something that you wouldn't have come across otherwise. But it can also come across as really invasive or creepy. But now let's go to something that is just a bad thing, period.
Speaker 1: See, targeting ads to get your most likely customer sounds fairly on the up and up, except the practice can also be discriminatory. For example, let's say there's a big company and they own upscale apartment complexes in different cities, and this company wants to advertise on Facebook because it's opening up a brand new apartment building. And in the process, the company specifically opts to show the ad only to white people, because that's something Facebook used to allow advertisers to do. Facebook was allowing for racial discrimination in advertising. Now, if you took a bird's-eye view of these practices, you would see some really ugly trends here. Ads for companies that were hiring fast food workers were disproportionately directed towards Black users. Ads for companies hiring cleaners or nurses disproportionately targeted women users. The practice was reinforcing disparities in the job market. It was perpetuating and heightening these disparities. It was, in effect, making things worse. And this practice first came to light way back in 2016, but it wasn't until 2019, when Facebook was dealing with a lawsuit, that the company promised it would end the practice. The announcement also wasn't exactly transparent. In a post, the Facebook for Business division said that it was removing the option as part of its quote "efforts to simplify and streamline our targeting options" end quote, and that it would be quote "removing multicultural affinity segments and encouraging advertisers to use other targeting options such as language or culture" end quote. Multicultural affinity segments was the euphemism the company used to essentially talk about racial discrimination. Now, according to a Facebook insider named Kion Lvy, team members within Facebook had been pushing for the removal of these factors for advertisers for more than three years. In addition, it's not the only discrimination that people have faced when it comes to trying to do advertising on Facebook.
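One simple way to quantify the kind of skew described above is a disparate-impact ratio: the rate at which one group sees an ad divided by the rate for another group. Here is a minimal sketch with entirely made-up numbers, just to show how lopsided delivery shows up as a ratio far from 1.0.

```python
# Minimal sketch of a disparate-impact ratio for ad delivery. All counts are
# hypothetical; a ratio far from 1.0 signals skewed delivery between groups.

def delivery_rate(times_shown, audience_size):
    return times_shown / audience_size

def disparate_impact(rate_a, rate_b):
    """Ratio of outcome rates between two groups; 1.0 means parity."""
    return rate_a / rate_b

# Hypothetical delivery of the same job ad to two equal-sized groups.
rate_group_a = delivery_rate(times_shown=900, audience_size=10_000)  # 9% of group A saw the ad
rate_group_b = delivery_rate(times_shown=300, audience_size=10_000)  # 3% of group B saw the ad

print(disparate_impact(rate_group_a, rate_group_b))  # 3.0 -> group A is three times as likely to see it
```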
Speaker 1: One thing I didn't touch on earlier in this episode is how brands have been affected by the various tweaks to the news feed algorithm and the news feed itself. Brands can make pages, and users can visit those pages and like those pages. And in the old days, if a business posted to their page, then that update would go out to a decent number of the followers who had liked that page. But Facebook tweaked the news feed a couple of times, and each time it would affect the organic reach of a business's posts. So, for example, there is a TechStuff Facebook page. I think. I mean, I quit Facebook a couple of weeks ago, so I assume it's still there, but it wasn't updated frequently. And I'll be up front with you about why that is: because there wasn't much point to updating the Facebook page, due to the way the news feed works. Posting to the TechStuff page would mean only a small fraction of the people who were actually following that page would even see the update. So if a thousand people had liked the page, maybe only ten of them get the update. That's not a good return on investment, as I'll sketch out in a moment. So ostensibly this was an effort to deliver a better experience to users. You know, Zuckerberg always positioned this as saying the idea is that they wanted the users to see more posts from the actual people they know, their friends and family, and fewer posts from businesses and media outlets, even though we should remember these were outlets and businesses that users had gone to and liked on Facebook, so presumably the posts from those entities were things that the users were interested in in the first place. But what was really going on here was that Facebook was creating a market for itself. Why give away free access to users when you can create an incentive for businesses to pay for promoted posts? And that's what was really going on here.
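To put that thousand-follower example in plain numbers, here is a tiny sketch. The 1% organic reach rate and the ad price are hypothetical, chosen only to match the "a thousand likes, maybe ten see it" scenario above.

```python
# Tiny sketch of the organic-reach math described above. The reach rate and CPM
# are hypothetical figures, not real Facebook pricing.

def organic_reach(followers, reach_rate):
    """How many followers actually see an unpaid page post."""
    return int(followers * reach_rate)

def boost_cost(followers, reach_rate, cpm):
    """Rough cost to buy the impressions that organic reach misses (cpm = cost per 1,000 impressions)."""
    missed = followers - organic_reach(followers, reach_rate)
    return missed / 1_000 * cpm

followers = 1_000
print(organic_reach(followers, reach_rate=0.01))         # 10 followers see the post for free
print(boost_cost(followers, reach_rate=0.01, cpm=8.00))  # 7.92 (hypothetical dollars) to reach the rest
```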
Speaker 1: Facebook had created a system where brands felt the necessity of creating a Facebook page, and you had to have a Facebook page. There are two billion people on Facebook, so if you want to be where the people are, you've got to have a Facebook page. But to reach a significant percentage of the people who have actually gone to that page and liked it, you would also have to pay for that privilege. Now, I'm not going to argue against Facebook's strategy here. It kind of stinks if you happen to run a brand, but from Facebook's perspective, I can understand the reasoning. So while it is kind of tough if you're someone who's trying to reach your fans, from Facebook's perspective, I get the idea, I get the business case. However, that discrimination that we would see directed toward users would creep in again, this time toward advertisers. There's a band called Unsung Lily, which is an independent pop band. It's headed by a same-sex couple, Frankie and Sarah Golding Young. These two were attempting to promote an album on Facebook this year because, like I think pretty much everyone in the world, the pandemic changed all of their plans for the year. They wouldn't be able to tour, they wouldn't be able to market their album the way they would typically do, and so they decided to run an ad on Facebook. But they also knew that if they relied solely on posting the ad to the band's page, like if they just made a post on their fan page, the way Facebook serves up that kind of stuff means that they wouldn't reach nearly enough of their fans for it to make an impact. So they decided to do it as a paid ad instead. So they shot an ad, they put it together, they edited it, they submitted it. They received a rejection, and under the reason why the ad had been rejected, it said that the ad contained inappropriate sexually explicit content. Except it didn't.
Speaker 1: What it did feature was an image of Sarah and Frankie leaning into each other with their foreheads touching, kind of just an image of affection. There was nothing remotely sexual or explicit about this photograph. Now, the rejection notice didn't specify that the image was the reason for the rejection, so they decided they would run a test to see if, in fact, that was the problem. First, they submitted an ad that had all the same content as the original piece, only with this one image swapped out for a quote-unquote non-romantic picture of the couple. Boom, Facebook accepts the ad. To follow that up, they also submitted another ad. They still used the same video, the same copy, but they got a heterosexual couple making the same pose that they had made, with the two foreheads together, so it's a man and a woman, foreheads touching. They just used that to replace their own image. Boom, Facebook accepted that ad too. Okay, so I guess I don't have to explain that this is a really bad thing. The fact that Facebook ads deemed a heterosexual display of affection as perfectly acceptable and the same-sex display of the exact same act of affection as offensive is beyond infuriating. The band contacted the American Civil Liberties Union, which facilitated some communication with Facebook. Facebook said, oh, no, no, no, this whole thing was a mistake. And besides, the actual objectionable part wasn't that photograph, it was because of some of the dancing moves in the video. But again, Facebook accepted two versions of the video that contained the dancing; they just didn't have that photograph of Sarah and Frankie, which seems pretty fishy to me. Meanwhile, the company continues to face criticism that it fails to curb truly objectionable stuff like hate speech. So hate speech spreads, like, easily on Facebook, but an ad that had a photograph of two women with their foreheads together was offensive.
Speaker 1: There are studies that are also showing that Black users are disproportionately more likely to have their accounts suspended by automated moderating systems than white users are, which is another bad algorithm example. And there are real concerns that the targeting criteria Facebook continues to use exacerbate socioeconomic disparities in the real world. None of this is good news. Facebook has been under intense scrutiny recently from various governments around the world that have all floated variations of the possibility that Facebook could face some really big consequences down the road, like getting broken up. There are a lot of different entities that argue Facebook is a monopoly or is monopolistic, and it's pretty clear that Facebook, in its incarnation as it exists right now, really has some big problems to iron out, things that are affecting people in a truly negative way. These are all parts of the reason why I left Facebook. I did so for my own personal reasons, and I'm not advocating that anyone listening to this just go out and shut down your account right now. I'm not saying that. For me, that was the right decision. I needed to do it for my own mental health, honestly. So I felt like this was an important thing to talk about, because again, it's those algorithms that, at the end of the day, were really just designed to optimize the amount of time people would spend on Facebook and the number and quality, or at least the variety, of ads that they would see while they spent their time there. That was really the only thing that mattered to Facebook. It was otherwise kind of distanced from any of the content stuff. Content was not at all the concern. The concern was: how long can we keep them there, how many ads can we show them, how much money can we get for those ads? That was really all that mattered.
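Stated as a toy objective, the incentive I'm describing looks something like the sketch below. Every variable and number is invented; the point is simply that if revenue depends only on time spent and ads served, nothing about the content itself ever enters the formula.

```python
# Toy model of an engagement-driven ad business. All figures are invented.
# Note that nothing about the content's accuracy or harm appears in the formula.

def daily_ad_revenue(users, minutes_per_user, ads_per_minute, revenue_per_ad):
    return users * minutes_per_user * ads_per_minute * revenue_per_ad

baseline = daily_ad_revenue(users=1_000_000, minutes_per_user=30, ads_per_minute=0.5, revenue_per_ad=0.01)
engaged  = daily_ad_revenue(users=1_000_000, minutes_per_user=45, ads_per_minute=0.5, revenue_per_ad=0.01)

print(baseline)  # 150000.0 -> hypothetical $150k/day at 30 minutes per user
print(engaged)   # 225000.0 -> hypothetical $225k/day if stickier content lifts time spent to 45 minutes
```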
Speaker 1: And now we're starting to see, when you disassociate the content from the approach, how that can have a profoundly negative impact on the world, and we're trying to get to the point where we figure out what to do next. I don't have the answers, um, other than I walked away. So hopefully people at Facebook will start to come up with ways to approach the business from a more ethical standpoint that will have less of a negative impact on the people who actually use the service, whether it's the advertisers or the end users. I certainly hope so. If you found this interesting, let me know. If you have suggestions for future topics on TechStuff, whether it's a company, a trend in technology, maybe it's a person in tech, maybe it's a specific technology you've always wanted to know how it worked, let me know. The best way to do that is clearly to go over to Twitter. Not that that's a perfect service, but that's a different podcast for a different time. But use the handle TechStuffHSW, and I'll talk to you again really soon.

Speaker 1: TechStuff is an I Heart Radio production. For more podcasts from I Heart Radio, visit the I Heart Radio app, Apple Podcasts, or wherever you listen to your favorite shows.