Speaker 1: Get in touch with technology with TechStuff, from HowStuffWorks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works and iHeartRadio, and I love all things tech. And today we're going to dive into a recent story. On March sixth, two thousand nineteen, Facebook co-founder and CEO Mark Zuckerberg posted a very long essay titled A Privacy-Focused Vision for Social Networking. On Facebook, of course. The post has prompted a lot of discussion in the tech space as well as the political space over what Zuckerberg actually means, both for Facebook users and for the company itself, and it raises some interesting questions. So today I thought I'd dedicate an episode to Facebook and the concept of privacy, because the two things have long been at odds with one another. You wouldn't necessarily associate Facebook with privacy, and that's part of the problem.

Now, before I dive into the essay, and I'm going to quote the essay quite a bit, but don't worry, I'm not reading out the whole three-thousand-six-hundred-word thing, let me set the stage a little bit. We're gonna dial back the clock to January twenty ten. Facebook had been around for several years, and Michael Arrington of TechCrunch interviewed Mark Zuckerberg that month to talk all about Facebook. This is what Zuckerberg had to say about the concept of privacy back then, so this is nine years ago: People have gotten really comfortable not only sharing more information and different kinds, but more openly and with more people. That social norm is just something that's evolved over time. We view it as our role in the system to constantly be innovating and be updating what our system is to reflect what the current social norms are.

So essentially, Zuckerberg was saying that things had changed. Culture, you know, values had changed, where things that might have been considered private in the past now were public.
People were eagerly sharing private thoughts on social media platforms, and they were sharing more and more with more people, people who might not be close friends. They might be further out in that person's social network; they might even be strangers. So in that discussion, Zuckerberg was essentially making the claim that Facebook's policies and operations were a reaction to the changing values of the culture at large, that this was happening, so Facebook was changing to cater to that.

Now, not everyone agreed with that particular assertion, including yours truly. I also didn't think that this was a totally genuine response. I would argue instead that Facebook is dependent upon people eschewing privacy, and that the company has done a great deal to encourage this kind of oversharing behavior. I'm not saying that Facebook is fully responsible for it, merely that the company has greatly encouraged it. I mean, that's what the company is based around. So rather than react to a cultural shift, I would argue Facebook has done a lot to boost or push for that shift. So again, not fully responsible. I wouldn't say that Facebook is the kingmaker of culture, but rather that it saw the opportunity and pushed really hard for that to become the social norm. And the reason for doing that is really, really simple: it's cash. Cash money, y'all.

Facebook provides a service to users, right? It's a social media platform we can use. But the way the company makes money is largely through advertising. So you can think of Facebook's product as being your data, assuming that you use Facebook, of course. And your data might include specific facts about your identity, like your home address, your phone number, your name, your age, your occupation, the schools you attended. It might also include information about people you know and your relation to those people.
So it might have the identity of your spouse or family members, and it may even identify them as such, because you can, you know, designate people as having a specific relationship to you. That's valuable to you, having those particular circles. Like, if you want to message just the people who are in your family, that's valuable to you. But it's also valuable to Facebook to have that information. Facebook also has a record of all your activities on the platform and the posts you tend to interact with. So as you use Facebook, you generate information. You provide a more complete picture of who you are, what you like, what you value, and more. And what Facebook doesn't outright know from your input and behavior, it might be able to guess at based upon similar behaviors it has seen. So, essentially, pattern recognition. It might notice that lots of people who happen to like certain types of posts also like other posts, and maybe you haven't liked this other post yet, but because you fall into the first category, chances are you'd also fall into the second one.
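If you're curious what that kind of guess looks like mechanically, here's a toy sketch. This is not Facebook's actual system, just a minimal illustration of the co-occurrence idea, with made-up users and posts:

```python
# Toy sketch of "people who liked X also liked Y" (all data hypothetical).
from collections import Counter

likes = {
    "alice": {"hiking_post", "camping_post"},
    "bob":   {"hiking_post", "camping_post", "trail_mix_ad"},
    "carol": {"hiking_post", "camping_post", "trail_mix_ad"},
}

def predict_next_like(user):
    """Guess a post `user` hasn't liked yet, weighted by overlap with other users."""
    seen = likes[user]
    scores = Counter()
    for other, their_likes in likes.items():
        if other == user:
            continue
        overlap = len(seen & their_likes)  # shared likes act as a similarity score
        for post in their_likes - seen:    # only score posts the user hasn't liked
            scores[post] += overlap
    return scores.most_common(1)[0][0] if scores else None

print(predict_next_like("alice"))  # -> 'trail_mix_ad'
```

The real systems are vastly more sophisticated, but the principle is the same: your behavior plus everyone else's behavior equals a pretty good guess about you.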
Now let's shift our attention to advertisers. Advertisers want to get the most for their money, which is, I mean, that's just basic business, right? Everyone wants that, and it doesn't make advertisers bad people. But they want to get the most bang for their buck. They need to get the word out about products or services, and they want to focus on the people who are most likely to respond to those advertisements and make a purchase. It doesn't do you any good to send ads to somebody when those ads have nothing in common with that person. If the person is never going to buy your product or service just because of who they are, it's a waste of time and resources to send that ad to them. So the more information advertisers have about a potential target or customer, the better. Campaigns can even be tailored for specific types of people. You could have a really good advertising company craft different ads, all meant for the same product or service, but intended to target different types of people. So you might see one version of an ad for a product and your buddy sees a totally different version of an ad for that same product, but the two different versions were selected because of differences between you and your buddy. Maybe you really like hiking and your buddy really likes the beach, and this is, say, an outdoor sports equipment type company. So they send one ad to you that's geared more towards hiking, camping, and that kind of thing, and one to your buddy that's more about beach life kind of stuff. Both are for the same company or service, but the ad has been selected specifically because of differences in your preferences that Facebook is able to keep track of. If that means that either of you is more likely to make a purchase, then that's money well spent for the advertiser.
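Here's that same idea as a sketch, again with invented data. The targeting logic behind real ad platforms is far more elaborate, but the core move, keying the creative to the tracked interest, looks something like this:

```python
# Hypothetical ad selection: one product, different creatives per tracked interest.
ad_variants = {
    "hiking": "Gear up for the trail: 20% off backpacks",
    "beach":  "Surf's up: 20% off board shorts",
}

user_interests = {"you": "hiking", "your_buddy": "beach"}

def pick_ad(user):
    interest = user_interests.get(user)
    # Fall back to a generic creative when no interest is on file.
    return ad_variants.get(interest, "20% off all outdoor gear")

for user in ("you", "your_buddy"):
    print(f"{user}: {pick_ad(user)}")
```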
So you start to see how all these little pieces of data add up to become incredibly valuable. And then you think about Facebook being a data broker, with all this information about you and all these potential advertisers to work with, and you really see how the money rolls in. So Facebook has a vested interest in people sharing information publicly. The more people share about themselves, the more data Facebook has about them. From posts about, you know, pets, to pictures of meals, to invitations to parties, to requests for help seeking out a job, Facebook's paying attention to all of this. And it has on occasion gone even further than that, such as when the site launched the Beacon program, back in around two thousand eight I believe it was. Beacon allowed companies that were partnering with Facebook to share activities that Facebook users were doing on their sites.

So let's say that you are a ticket broker company. You sell tickets to live events, and you have this Facebook Beacon partnership. Then I, a Facebook user, go to that site and I purchase a ticket. The site could have published that activity to my Facebook feed, whether I wanted it there or not, and a lot of people felt like this was a terrible violation of privacy. Maybe you just happened to go and buy tickets to see My Little Pony on Ice, and maybe you don't need all your friends on Facebook to know that, and then to use that information as a way to make fun of you for the next decade, for example. But then Beacon goes and posts your purchase for all to see on your profile. Not that this has happened to anyone I know, and it definitely did not happen to me, but you get the idea.

So Facebook is in the information business, the same as Google and lots of other companies that we tend to associate with some other type of business. Right? Most of us think of Facebook as a social network platform. We tend to think of Google as a search engine, and then, increasingly, other stuff remotely related to search engines. But both companies actually depend heavily on leveraging data about their users to entice advertisers. They really are advertising companies, or companies that host advertising.

So it was completely in the interests of Facebook for Zuckerberg to declare that the notion of privacy was no longer a cultural norm back in twenty ten. Whether that was true or not is a matter of debate. And certainly some people appear to have little concern for their privacy, and I'm not passing judgment on them either. But I suspect there are a lot of folks out there who feel otherwise, who feel that privacy is still important and should still play a part in our interactions, online and otherwise. And some pretty high-profile incidents have brought the topic into sharp focus.
If you've been listening to my show for a while, you've probably heard the episodes about Cambridge Analytica and how that data analytics company relied upon information that was pulled from an app, which left many feeling that they had had their privacy violated. The app was a survey that would pay users a small fee for completing it, and part of the process involved granting the survey app access to the survey taker's Facebook page, so the app could pull down information about the person taking the survey. So far, so good, right? I mean, the person taking the survey is presumably aware of this process and is getting compensated to boot. But then we go a step further. It turned out the app was also pulling down information about the friends connected to the survey takers. So not just the takers themselves, but those people's friends. The app's permissions allowed the administrator of the app to view the survey takers' friends' profiles as if the app were itself a friend of those people.

So let's say you and I are friends on Facebook. Sometimes you make a public post on Facebook, and anyone in your feed can read that public post. Anybody. No problem there, right? That makes perfect sense. But sometimes, let's say, you post so that only your friends can see those status updates. The general public never sees them. However, I'm your friend, so I can see those posts. Then I decide I'm gonna go take this survey, and I agree to the app's terms, not knowing all that's entailed by that. And now the app can see your profile as if the app were me, meaning it can get access to all those status updates you made using the friends-only option. And you never agreed to that. You are an unknowing party. You never gave permission to the app or to me. You just had all that information accessed.
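To make the mechanics a little more concrete, here's a rough, illustrative sketch of how an app in that pre-2014 era could walk from one consenting user out to data about friends who never opted in. The endpoint shapes here are simplified stand-ins based on how the old Graph API generally behaved, not a recipe that would run against Facebook today:

```python
# Illustrative only: one user's token unlocking data about their friends.
import requests

GRAPH = "https://graph.facebook.com"  # endpoints below are simplified stand-ins

def harvest(survey_taker_token):
    # Step 1: the one person who actually agreed to the app's terms.
    friends = requests.get(
        f"{GRAPH}/me/friends", params={"access_token": survey_taker_token}
    ).json()

    # Step 2: the same token also exposed friends' profile data,
    # as if the app were a friend of those people itself.
    profiles = []
    for friend in friends.get("data", []):
        profile = requests.get(
            f"{GRAPH}/{friend['id']}", params={"access_token": survey_taker_token}
        ).json()
        profiles.append(profile)
    return profiles
```

One consenting survey taker with a few hundred friends could quietly fan out into a few hundred harvested profiles, which is how a modest survey app ended up touching tens of millions of accounts.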
Facebook would address that problem after people brought it to the company's attention, and this was actually years before the Cambridge Analytica scandal broke into public knowledge, but the damage was already done. And once the scandal did break, many people were rightfully upset at Facebook. But that's just one example of how Facebook has been less than perfect when it comes to protecting users' safety.

Another example came to light in late two thousand eighteen. Gizmodo published a report that revealed Facebook had done something fairly shady, pretty shady I would say, with users who had opted into two-factor authentication with the service. Now, see, the purpose of two-factor authentication is to make online profiles more secure. That's the whole reason for it. The idea is that you require users to submit proof of identity from two categories, or factors. The factors we typically look at are things you know, things you own, and things you are. Facebook was looking at the first two factors. The thing you know would be your password. You know your password, so you put it in when you log in. The thing you own would be a cell phone, a cell phone that you have registered with your account. So when you enabled two-factor authentication, logging into Facebook on a new machine would prompt you to provide your password, and then Facebook would send a one-time code to your phone, which you would then enter into Facebook's site. By proving you had both the password and the phone, you're essentially proving you are who you claim to be. It's supposed to cut down on the chance that someone else might access your account just by guessing or otherwise getting hold of your password. If they don't have your phone, they can't complete the two-factor authentication, and they're locked out of your account.
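If you want to see that flow in miniature, here's a minimal sketch. It isn't Facebook's implementation, just the generic password-plus-one-time-code pattern, with the SMS delivery and the user store faked:

```python
# Minimal two-factor sketch: something you know (password) plus
# something you have (a phone that receives a one-time code).
import secrets

pending_codes = {}  # user -> code awaiting verification (stand-in for real storage)

def start_login(user, password_is_correct):
    if not password_is_correct:
        return False
    code = f"{secrets.randbelow(10**6):06d}"  # random six-digit code
    pending_codes[user] = code
    print(f"(pretend SMS to {user}'s registered phone): your code is {code}")
    return True

def finish_login(user, submitted_code):
    expected = pending_codes.pop(user, None)
    # compare_digest avoids leaking the code digit by digit via timing.
    return expected is not None and secrets.compare_digest(expected, submitted_code)
```

Note that the phone number exists in this scheme for exactly one purpose: proving you hold the phone. Keep that in mind for what comes next.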
But Gizmodo found out Facebook was also using this process to do something else, something not so awesome. The company was using the data from these two-factor authentication phones to target those users with more specific ads. So the system, which was supposed to be about providing users with a sense of confidence that their accounts were secure, was simultaneously being used to generate revenue for Facebook through targeted advertising. And Facebook was doing this by collecting contact information from users' phones to further fill out the links between social contacts who were on Facebook.

The Gizmodo report gave a pretty interesting account of this. Kashmir Hill, who wrote the piece, ran an experiment by paying for an ad on Facebook that would target a specific phone number, and that phone number belonged to the landline of a computer science professor named Alan Mislove. And this was with Mislove's consent; he was in on it. Now, Mislove had never provided this particular phone number to Facebook. Hill suspected that Facebook had made the association between the phone number and Mislove by combing through the contact information on phones being used for two-factor authentication purposes, because after she placed the ad, within a couple of hours, Mislove saw it. And again, Mislove had never put this particular phone number into his profile on Facebook, and yet within two hours of the ad being placed, it popped up in his Facebook feed. So that meant the company had to get that phone number from somewhere.
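As an aside, ad platforms typically match targeting lists to accounts using hashed identifiers, so a sketch of the linking step might look like the following. The specific pipeline inside Facebook is an assumption on my part; the point is just that a number scraped from someone else's contact book can match a number an advertiser targets:

```python
# Sketch: matching a phone number from a harvested contact book
# against an advertiser's target list via normalization + hashing.
import hashlib

def normalize(phone):
    return "".join(ch for ch in phone if ch.isdigit())

def hashed(phone):
    return hashlib.sha256(normalize(phone).encode()).hexdigest()

# Number found in some 2FA user's uploaded contacts, labeled "Alan, office":
contact_book_entry = hashed("+1 617-555-0100")   # number is fictional

# Number an advertiser submits for targeting:
ad_target = hashed("1 (617) 555-0100")

print(contact_book_entry == ad_target)  # True: same person, never self-reported
```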
So that sets the stage for this privacy concern. Next, we'll take a look at what Mark Zuckerberg said about privacy in that essay, and then we'll look into some analysis and criticism of that essay. But first, let's take a quick break.

Now, you can find the full essay that Zuckerberg wrote on Facebook, but I'd like to pull some segments to talk about specifically, and we're gonna start with the first two paragraphs after the introduction. Like I said earlier, I'm not going to read out the whole thing; that would take up nearly an entire episode just by itself. But once Zuckerberg is done with his introductory paragraph, he has this to say, quote: Over the last fifteen years, Facebook and Instagram have helped people connect with friends, communities, and interests in the digital equivalent of a town square. But people increasingly also want to connect privately in the digital equivalent of the living room. As I think about the future of the Internet, I believe a privacy-focused communications platform will become even more important than today's open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks. Today we already see that private messaging, ephemeral stories, and small groups are by far the fastest growing areas of online communication. There are a number of reasons for this. Many people prefer the intimacy of communicating one-on-one or with just a few friends. People are more cautious of having a permanent record of what they've shared. And we all expect to be able to do things like payments privately and securely. End quote.

All right, so far, so good. These are not super deep insights, of course; they're mostly common sense. But this is Zuckerberg laying down the foundation for his position moving forward: there's a growing interest in separating out our personal interactions from the public space. And there's not really a point here at which Zuckerberg addresses targeted advertising, which I would argue is one of the big reasons people are interested in moving off of public online communications channels. But then, I would have been surprised to see so candid a statement from him on that subject. The statement also seems to ignore the fact that Facebook may have played a part in making people feel like they need to move to a more private, secure platform.
So, in other words, Zuckerberg is saying, I'm noticing this trend, but he's not saying, we're responsible for kind of creating this desire. Just as he wasn't saying that they were responsible for promoting the concept of oversharing in the first place.

Zuckerberg goes on to assert that social networking platforms will remain important moving forward, and argues that their value in how they connect people together can't be overstated, that they provide a service that is valuable, and therefore they will continue to survive. And he says, quote: I understand that many people don't think Facebook can or would even want to build this kind of privacy-focused platform, because frankly we don't currently have a strong reputation for building privacy protective services, and we've historically focused on tools for more open sharing. End quote. Now, there's an allusion there both to the company's privacy-related woes and a sort of tangential nod to its business model, without explicitly calling it out. He goes on to say he sees communications shifting to private and encrypted services, and he says, quote: We plan to build this the way we've developed WhatsApp: focus on the most fundamental and private use case, messaging, make it as secure as possible, and then build more ways for people to interact on top of that, including calls, video chats, groups, stories, businesses, payments, commerce, and ultimately a platform for many other kinds of private services. End quote.

And he lists six principles that he says will be important for this new focus to come true. First, that people should be able to have private interactions, with clear controls over who can access those interactions. Next, that these methods should incorporate end-to-end encryption to protect the communication from prying eyes. Third is reducing permanence, something made famous by Snapchat.
This is the idea that communications would have sort of an effective self-destruct mechanism after a particular amount of time, presumably enough time for the other party to have seen the message. No permanent record, in other words. Though I should point out that some of the services famous for providing this kind of experience weren't, you know, actually deleting the content right away. So while it sounds like it would be this kind of one-time-only thing, and then the data is gone forever, there's a chance the data is not really gone forever.

Next, Zuckerberg stresses that the system will have to be safe and that people will entrust Facebook with their data. Then he calls for an interoperable approach, pointing out that lots of people rely on different services to connect with one another. So you might be a whiz on Facebook Messenger, but maybe your best buddy prefers using WhatsApp, and then another friend of yours, a friend in common, is really only using Instagram. And all three of those, by the way, belong to Facebook. Zuckerberg wants a method that could work across apps, as long as they are Facebook's apps. More on that point in just a bit. The last principle Zuckerberg mentions is secure data storage, which ties back into safety. Though perhaps Zuckerberg was actually differentiating between a person's physical safety and the security of their data, or maybe the physical safety of data centers. It's not entirely clear in that part. He does later go into data centers in particular, so maybe that's really what he meant.

Then Zuckerberg goes on to flesh out these points a little more. I'm not going to dwell on all of them, because again it would just take too long, but I do want to focus on a few. So under encryption and safety, he has this to say, quote: There is a growing awareness that the more entities that have access to your data, the more vulnerabilities there are for someone to misuse it or for a cyber attack to expose it.
There is also a growing concern among some that technology may be centralizing power in the hands of governments and companies like ours, and some people worry that our services could access their messages and use them for advertising or in other ways they don't expect. End-to-end encryption is an important tool in developing a privacy-focused social network. Encryption is decentralizing: it limits services like ours from seeing the content flowing through them and makes it much harder for anyone else to access your information. End quote.

So here we have Zuckerberg pointing out that using data for the purposes of advertising is a concern for some users, and he's also explaining that end-to-end encryption would mean Facebook would be unable to see the content of the messages themselves, making that content off limits for monetization, at least directly. And that sounds pretty good to me on the surface of it. I like the idea of a means of communication that keeps things truly private between parties and isn't used as currency between Facebook and advertisers. So kudos to that idea. But then he turns around and points out that end-to-end encryption also creates opportunities for people to use the communication tools to do bad or harmful things, such as planning a terrorist attack, and he states, quote: We have a responsibility to work with law enforcement and to help prevent these wherever we can. End quote.

So this opens up a question that people have asked for ages: how do you balance individual privacy with collective security? Zuckerberg says that Facebook would use other means to detect possible harmful use, including looking for usage patterns among people communicating through this private, end-to-end encrypted service. This is a good reminder that sometimes a spy doesn't even have to be able to read or hear a message to draw some pretty accurate conclusions about what is being said between different parties. Just ask the NSA. Actually, you don't have to ask the NSA; they're already listening.
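A quick toy example of why metadata is so revealing. Suppose every message body below is perfectly encrypted; the traffic log alone still sketches out the social graph (all names and timestamps invented):

```python
# Metadata-only analysis: no message content required.
from collections import Counter

# (sender, recipient, time) records; the bodies are unreadable ciphertext.
log = [
    ("alice", "bob",  "02:00"),
    ("alice", "bob",  "02:05"),
    ("alice", "bob",  "02:10"),
    ("carol", "dave", "14:00"),
]

pair_volume = Counter((sender, recipient) for sender, recipient, _ in log)
for (sender, recipient), count in pair_volume.most_common():
    print(f"{sender} -> {recipient}: {count} messages")
```

Three messages between the same two accounts at two in the morning say quite a lot, even if nobody can read a single word of them.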
Zuckerberg then moves on to interoperability and talks about how, in his vision, users will be able to communicate across Facebook apps through an opt-in feature, eventually even including SMS along with Messenger, WhatsApp, and Instagram. Now, I feel this is a pretty risky move. The more points of connection you have between services, the more potential vulnerabilities there will be in the overall system, and the more chances you've given hackers to find ways to exploit those vulnerabilities. Zuckerberg does acknowledge that interoperability will create challenges that will need to be solved before it can be rolled out, so I don't want to suggest that he's oblivious to this, but it's a concern I have that I think needs further attention.

Zuckerberg concludes with a section on data security, pointing out that Facebook doesn't build data centers in countries that have a history of ignoring or violating human rights like privacy and freedom of expression, and he gives a rather broad series of next steps that the company will take to achieve this vision. Now, I've already talked a little bit about some of the concerns I have about this essay. But when we come back, we're going to explore some other takes on it that apply critical thinking to the whole proposal, to see what might really be at the heart of this message. But first, let's take a quick break.

I think it's pretty clear that I have some concerns about what Zuckerberg has said, but I am not the only person to express skepticism about this new vision for Facebook. Some people, like Jeff Chester, doubt the sincerity of the message as well. Chester is executive director of the Center for Digital Democracy, a nonprofit organization focused on privacy. The Washington Post quoted Chester's response to Zuckerberg's essay. He said, quote: Why does it always sound like we are witnessing a digital version of Groundhog Day?
When Facebook yet again promises, when it's in a crisis, that it will do better, will it actually bring a change to how Facebook continually gathers data on its users in order to drive big profits? End quote. That same Washington Post article, which by the way has the title Facebook's Mark Zuckerberg says he'll reorient the company toward encryption and privacy (it's a great read, and yes, a long title), cited surveys indicating that people's trust in Facebook has taken a bit of a nosedive over the last couple of years. So it sounds like Chester isn't alone in this skepticism, and his question about whether or not Facebook will actually change its ways is a valid one.

Facebook's operating margin has been pretty darn high for the last few years. And for those not inclined toward business speak, a company's operating margin is the ratio of operating income to net sales. It tells you what proportion of a company's revenue, as in the money it's bringing in from doing business, is left over after operating costs, before indirect costs like taxes are taken out. You want the percentage to be high; the higher it is, the more money you are keeping from what you earn. Facebook's has been between forty-three and forty-seven percent, which is pretty darn good. It means that for every dollar Facebook makes, between forty-three and forty-seven cents of that dollar is operating earnings, so that's profit before taxes. The rest goes to covering the costs of doing business. And it's pretty clear in business, and this is coming from a non-business-y sort of person, but I think this is a fair thing to say, that making more money and being able to keep more of the money you make is considered a good thing. I'm pretty sure about that one.
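As a quick worked example, using the range from the episode (the dollar figures here are just for illustration):

```python
# Operating margin = operating income / net sales.
def operating_margin(operating_income, net_sales):
    return operating_income / net_sales

# Roughly 43 to 47 cents kept per dollar of revenue, per the range above.
print(f"{operating_margin(43, 100):.0%}")    # 43%
print(f"{operating_margin(47, 100):.0%}")    # 47%
```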
A company could spend more to make more. But if the operating margin narrows over time, if that percentage begins to drop, you could end up in a situation in which your actual profits, your take-home profits after all costs, are lower than they used to be, because your costs have grown so much. So at the end of the day, you could say, yeah, we made more money than we did last year. Like, last year we made fifty million, this year we made a hundred million. But then last year our costs were ten million, and this year our costs were ninety million. Well, that means your take-home profits from the year before were forty million dollars, and your take-home profits from this year are ten million dollars. So yeah, you made more money, but you weren't able to keep as much of it, right? So that operating margin is a very important thing, especially for investors.

So some critics question Zuckerberg's commitment, because what CEO of a publicly traded company is going to come forward and say, I'm gonna pivot here and focus on a new approach that we can't monetize the way we could with our old model? Because that would cost Facebook money, and it would be expensive to make this change to the platform. And if there's not a substantial way to generate revenue, then what you're looking at is the operating margin getting smaller and investors getting angry. So this is really the critics pointing out that Facebook isn't saying they're not going to make money off your data. That's not what Zuckerberg was saying at all. That, by definition, will have to continue, because otherwise Facebook won't be a business anymore. It won't be making money. So keep that in mind when you're hearing these messages.

Now, other critics say that pushing this new approach for Facebook would play right into the hands of people who use Facebook to spread misinformation. That's something the company has been struggling with for a while now.
Charges that bad actors have been using Facebook to spread lies and propaganda have come from all around the world. We're really familiar with it here in the United States, but the same thing is happening in other countries. So these critics are, interestingly, arguing against Facebook making things more private. They're not saying Facebook is still going to be making money off of you. What they're saying is that this particular approach to communication is more dangerous, that the threat to security is too great. And I find that really interesting, because you've got privacy advocates on one side saying that these approaches aren't really addressing privacy problems, and then you have security people on the other side who think that this is pushing toward a more insecure future for online communication, and that encrypted private messaging is a dangerous strategy. So, two different perspectives, both of which are saying that Facebook is maybe not so wise to pursue this particular approach.

But one of the best critical examinations of Zuckerberg's essay I have seen was written by Molly Wood, and it was published by Wired. In the interest of full disclosure, I've known Molly Wood's work since before I was recording TechStuff, and she and I have chatted several times, so we do know each other. But the funny thing was, I was reading her piece and making notes, and I had fully intended on incorporating those notes into this episode before I had even really noticed that she was the one who wrote it. I was reading the article based on the headline, and it was only after I had started making notes that I went to see who wrote it, and then I thought, oh, well, that makes sense. It was Molly Wood. Anyway, her article is titled Zuckerberg's Privacy Manifesto Is Actually About Messaging, and she makes the case that what Zuckerberg was really doing was laying out a product development plan to address some pressing issues for the company.
Not issues about user privacy or misinformation campaigns or anything like that, but rather issues with growth and usage and retention of customers. She points out that Edison Research found Facebook had around fifteen million fewer users in the United States in two thousand eighteen as compared to two thousand seventeen. And keep in mind, that's an overall loss of fifteen million users even after new members have joined the service. So you've got new people coming onto Facebook, but so many people are leaving that you still have an overall decline in users. That's an ugly warning sign that people are leaving your platform in droves. Moreover, it was the second year in a row that Edison Research measured a drop in users year over year. And the United States, while it's small in population compared to some other countries, is large as a revenue generator for Facebook. So seeing declining numbers in your prime market is really bad business.

In addition, Molly cites an article in The Information that pointed out another big problem for Facebook, which is a decline in original sharing. The people who haven't left Facebook are sharing less about themselves. They might share links to articles, or post cartoons or memes or videos or whatever, but they're not sharing information about themselves as much. And since we've already covered how Facebook bases its revenue off of user data, that's a problem for the company. I mean, you can draw some conclusions about what a person likes based on the stuff they share on Facebook, but it's not the same as information about the actual people behind those accounts. So Facebook is facing what could be, in the long run, an existential crisis for the company.
And messaging is where there's 561 00:35:18,840 --> 00:35:21,840 Speaker 1: a glimmer of hope. Molly points out that in China, 562 00:35:22,160 --> 00:35:27,799 Speaker 1: a popular service called WeChat brings together messaging, phone calls, and apps, 563 00:35:28,280 --> 00:35:32,760 Speaker 1: and it's a product where the average user is spending 564 00:35:32,800 --> 00:35:35,520 Speaker 1: an hour a day, and she says, quote, 565 00:35:35,600 --> 00:35:39,480 Speaker 1: this is almost exactly what Zuckerberg describes wanting to build 566 00:35:39,560 --> 00:35:41,759 Speaker 1: over the next few years. End quote. And if you 567 00:35:41,800 --> 00:35:45,600 Speaker 1: remember when I quoted Zuckerberg earlier in this episode, that 568 00:35:45,760 --> 00:35:48,040 Speaker 1: was pretty much what he was saying. And she goes 569 00:35:48,080 --> 00:35:50,600 Speaker 1: on to argue that if Facebook can make a super app, 570 00:35:50,880 --> 00:35:55,320 Speaker 1: one that can accommodate advertising and commerce transactions, with Facebook 571 00:35:55,360 --> 00:35:59,640 Speaker 1: getting a cut of each transaction since it's facilitating those deals, 572 00:36:00,000 --> 00:36:03,800 Speaker 1: it could shift its dependence from a social networking site 573 00:36:03,840 --> 00:36:06,919 Speaker 1: that seems to have passed its peak and move on 574 00:36:06,960 --> 00:36:09,600 Speaker 1: to a messaging service more in line with the way 575 00:36:09,680 --> 00:36:13,560 Speaker 1: younger people in particular are communicating with one another. She 576 00:36:13,640 --> 00:36:18,080 Speaker 1: also cannily points out that the commitment to encryption doesn't 577 00:36:18,239 --> 00:36:21,600 Speaker 1: mean the company will stop targeted advertising or profiting off 578 00:36:21,640 --> 00:36:25,560 Speaker 1: of users. Quote. The fact that your individual messages might 579 00:36:25,600 --> 00:36:29,240 Speaker 1: be encrypted in transit does not in any way prevent 580 00:36:29,320 --> 00:36:33,240 Speaker 1: Facebook the entity from knowing who your friends are, where 581 00:36:33,280 --> 00:36:36,680 Speaker 1: you go, what links you click, what apps you use, 582 00:36:37,000 --> 00:36:40,399 Speaker 1: what you buy, what you pay for and where, what 583 00:36:40,480 --> 00:36:44,680 Speaker 1: businesses you communicate with, what games you play, and whatever 584 00:36:44,760 --> 00:36:47,799 Speaker 1: information you might have given to Facebook or Instagram in 585 00:36:47,840 --> 00:36:51,560 Speaker 1: the past. End quote. That actually touches on another point 586 00:36:51,600 --> 00:36:54,880 Speaker 1: that I didn't mention before, the one about location services. 587 00:36:55,239 --> 00:36:57,480 Speaker 1: If you have Facebook installed on a phone and you 588 00:36:57,520 --> 00:37:00,359 Speaker 1: haven't opted out of location data being incorporated into 589 00:37:00,400 --> 00:37:03,520 Speaker 1: the service,
or even if you have, as some studies 590 00:37:03,520 --> 00:37:06,480 Speaker 1: have shown, you've got yourself an app that's keeping tabs 591 00:37:06,480 --> 00:37:08,680 Speaker 1: on you wherever you're going and how long you are 592 00:37:08,719 --> 00:37:12,279 Speaker 1: spending at every location you go to, and potentially who 593 00:37:12,320 --> 00:37:15,080 Speaker 1: you are there with, because if they also have the 594 00:37:15,120 --> 00:37:18,880 Speaker 1: Facebook app installed, Facebook is able to correlate 595 00:37:18,920 --> 00:37:22,239 Speaker 1: all that information. You can bet that would still be 596 00:37:22,280 --> 00:37:24,520 Speaker 1: a part of the messaging service in the future that 597 00:37:24,600 --> 00:37:28,359 Speaker 1: Facebook envisions. It's a gold mine of valuable data. 598 00:37:28,560 --> 00:37:32,000 Speaker 1: Molly Wood concludes with quote. In fact, nowhere in the 599 00:37:32,120 --> 00:37:35,280 Speaker 1: more than three thousand words that Zuckerberg published on Wednesday 600 00:37:35,360 --> 00:37:38,200 Speaker 1: does he say that users will ultimately control their own 601 00:37:38,280 --> 00:37:41,120 Speaker 1: data or have the option to reduce the amount of 602 00:37:41,200 --> 00:37:44,480 Speaker 1: data they share with Facebook, or delete their information, or 603 00:37:44,520 --> 00:37:48,200 Speaker 1: operate anonymously, or pay a subscription fee to reduce or 604 00:37:48,239 --> 00:37:52,680 Speaker 1: eliminate ad tracking, anything that would represent an actual commitment 605 00:37:52,719 --> 00:37:57,719 Speaker 1: to privacy other than secure messaging. End quote. Now, I 606 00:37:57,840 --> 00:38:00,239 Speaker 1: urge you to read her whole piece on Wired, because 607 00:38:00,239 --> 00:38:02,880 Speaker 1: while I've given some sizeable excerpts, it's best if you 608 00:38:02,920 --> 00:38:05,280 Speaker 1: read it from start to finish. It's a really good piece, 609 00:38:05,480 --> 00:38:09,960 Speaker 1: and I pretty much agree with her. Now, this 610 00:38:10,040 --> 00:38:13,480 Speaker 1: doesn't mean I think Zuckerberg is some sort of evil 611 00:38:13,600 --> 00:38:17,520 Speaker 1: data tyrant, but he is the CEO of a multibillion 612 00:38:17,600 --> 00:38:22,160 Speaker 1: dollar global company that's in the business of leveraging user data. 613 00:38:22,440 --> 00:38:26,759 Speaker 1: The business pressures are enormous, and he's likely making decisions 614 00:38:26,800 --> 00:38:29,520 Speaker 1: to ensure the health of his company, including how to 615 00:38:29,600 --> 00:38:33,240 Speaker 1: keep users interested in the company's products before too many people 616 00:38:33,320 --> 00:38:38,560 Speaker 1: migrate away from Facebook entirely. And again, I think this 617 00:38:39,160 --> 00:38:43,160 Speaker 1: whole story brings up the necessity for us to employ 618 00:38:43,239 --> 00:38:47,440 Speaker 1: critical thinking when we come across various pieces of news, 619 00:38:47,920 --> 00:38:51,880 Speaker 1: to really put them through some difficult questions, to ask ourselves, 620 00:38:52,200 --> 00:38:55,960 Speaker 1: what are the actual motivators here? What are the benefits 621 00:38:56,200 --> 00:39:00,560 Speaker 1: of the approach that is suggested in this piece?
You know, 622 00:39:00,680 --> 00:39:04,480 Speaker 1: answering those questions can lead you to some conclusions that 623 00:39:04,560 --> 00:39:08,759 Speaker 1: might be different from whatever the surface level happens to 624 00:39:08,800 --> 00:39:13,319 Speaker 1: be of that communication. Now, it could very well 625 00:39:13,360 --> 00:39:17,840 Speaker 1: be that maybe Zuckerberg is being extremely sincere, maybe there's 626 00:39:18,200 --> 00:39:22,759 Speaker 1: no ulterior motive there, but I feel pretty confident 627 00:39:22,880 --> 00:39:28,120 Speaker 1: that the critics are onto something here. I'm curious what 628 00:39:28,200 --> 00:39:31,520 Speaker 1: you guys think. Are you guys still using Facebook? Oh, 629 00:39:31,520 --> 00:39:34,240 Speaker 1: that's another thing, just in the interest of full disclosure again. 630 00:39:34,760 --> 00:39:38,279 Speaker 1: I actually plan to step away from Facebook this year 631 00:39:39,040 --> 00:39:43,239 Speaker 1: after June, I think. Right now I'm in a production 632 00:39:43,640 --> 00:39:47,560 Speaker 1: that is using Facebook to communicate things like schedules and 633 00:39:48,280 --> 00:39:50,200 Speaker 1: that kind of stuff, so I actually need to use 634 00:39:50,280 --> 00:39:53,360 Speaker 1: the service right now just so that I can continue 635 00:39:53,400 --> 00:39:57,839 Speaker 1: to do my job with this production. However, once that's concluded, 636 00:39:58,160 --> 00:40:01,560 Speaker 1: I plan on sort of taking a break, not deactivating 637 00:40:01,560 --> 00:40:07,440 Speaker 1: my account entirely, but not visiting frequently, not really relying 638 00:40:07,520 --> 00:40:10,239 Speaker 1: upon it, and hoping that my friends will go through 639 00:40:10,280 --> 00:40:13,040 Speaker 1: the effort of reaching out to me through other means 640 00:40:13,480 --> 00:40:16,160 Speaker 1: and not just assume that I've disappeared off the face 641 00:40:16,160 --> 00:40:19,520 Speaker 1: of the earth. I'll report back after that's gone on 642 00:40:19,600 --> 00:40:21,799 Speaker 1: for a while. But I'm curious what you guys think 643 00:40:21,840 --> 00:40:24,040 Speaker 1: and if you have any suggestions for future episodes of 644 00:40:24,040 --> 00:40:26,400 Speaker 1: tech Stuff. If you do, send me a message. The 645 00:40:26,440 --> 00:40:30,960 Speaker 1: email address is tech stuff at how stuff works dot com, 646 00:40:31,040 --> 00:40:33,720 Speaker 1: or you can drop me a line on Facebook or Twitter. 647 00:40:33,760 --> 00:40:36,120 Speaker 1: The handle for both of those is tech Stuff HSW. 648 00:40:36,600 --> 00:40:39,799 Speaker 1: Pop on over to our website, that's tech Stuff podcast 649 00:40:39,960 --> 00:40:43,439 Speaker 1: dot com. You'll find a link to our store there. 650 00:40:43,640 --> 00:40:45,719 Speaker 1: Every purchase you make goes to help the show. We 651 00:40:45,800 --> 00:40:48,360 Speaker 1: greatly appreciate it. It's all the advertising I have for 652 00:40:48,440 --> 00:40:52,280 Speaker 1: you right now, and I'll talk to you again really soon. 653 00:40:58,239 --> 00:41:00,680 Speaker 1: For more on this and thousands of other topics, 654 00:41:00,680 --> 00:41:11,720 Speaker 1: visit how stuff works dot com.