1 00:00:04,120 --> 00:00:07,160 Speaker 1: Get in touch with technology with tech Stuff from how 2 00:00:07,200 --> 00:00:13,840 Speaker 1: stuff works dot com. Hey there, and welcome to Tech Stuff. 3 00:00:13,880 --> 00:00:16,919 Speaker 1: I'm your host, Jonathan Strickland. I'm an executive producer and 4 00:00:16,960 --> 00:00:21,000 Speaker 1: I love all things tech and this is one of 5 00:00:21,040 --> 00:00:25,880 Speaker 1: those topics where things are pretty complicated and are unfolding 6 00:00:26,120 --> 00:00:28,560 Speaker 1: as I sit down to record this episode, but I 7 00:00:28,560 --> 00:00:31,479 Speaker 1: thought it was important enough to actually address it. In 8 00:00:31,560 --> 00:00:35,080 Speaker 1: November two thousand eighteen, The New York Times ran a 9 00:00:35,120 --> 00:00:39,120 Speaker 1: story written by five reporters and it was six thousand 10 00:00:39,200 --> 00:00:42,640 Speaker 1: words long. The focus of the story was about Facebook 11 00:00:42,880 --> 00:00:45,960 Speaker 1: and how executives at the social media company have tried 12 00:00:46,040 --> 00:00:49,760 Speaker 1: to respond after a series of scandals and accusations that 13 00:00:50,200 --> 00:00:55,240 Speaker 1: muddied Facebook's public image. And it was a pretty eye 14 00:00:55,320 --> 00:00:59,440 Speaker 1: opening report and it's ended up causing a lot of 15 00:00:59,440 --> 00:01:02,000 Speaker 1: people to yell at each other, or at least it 16 00:01:02,040 --> 00:01:06,080 Speaker 1: has given a lot of ammunition in the yelling arguments 17 00:01:06,080 --> 00:01:08,880 Speaker 1: that are happening in Washington, D.C., and Silicon Valley. 18 00:01:08,959 --> 00:01:13,160 Speaker 1: So the scandals in question are, uh, some of the 19 00:01:13,160 --> 00:01:15,600 Speaker 1: things we've talked about in previous episodes of Tech Stuff 20 00:01:15,640 --> 00:01:19,560 Speaker 1: this year, like the signs that Russian agents and hackers 21 00:01:19,560 --> 00:01:23,680 Speaker 1: were creating numerous accounts in order to steal information and 22 00:01:23,760 --> 00:01:28,759 Speaker 1: spread propaganda and misinformation, and to generally undermine the democratic 23 00:01:28,800 --> 00:01:32,120 Speaker 1: process in the United States. It also included the Cambridge 24 00:01:32,120 --> 00:01:35,040 Speaker 1: Analytica scandal, which is a separate thing. I covered that 25 00:01:35,120 --> 00:01:38,039 Speaker 1: in an earlier episode. In that mess, the public discovered 26 00:01:38,040 --> 00:01:41,520 Speaker 1: that an enormous amount of personal data had been mined 27 00:01:41,640 --> 00:01:45,960 Speaker 1: by this political analytics company, Cambridge Analytica, often without the 28 00:01:46,040 --> 00:01:50,640 Speaker 1: users' consent. Then there's the ongoing problem of hate groups 29 00:01:50,680 --> 00:01:53,840 Speaker 1: and hate speech proliferating on the social network. So this 30 00:01:54,000 --> 00:01:57,320 Speaker 1: New York Times piece was really an investigative look into 31 00:01:57,440 --> 00:02:01,680 Speaker 1: how Facebook top brass have responded to these problems, and 32 00:02:02,560 --> 00:02:06,840 Speaker 1: spoiler alert, it ain't great. The piece goes through the 33 00:02:06,880 --> 00:02:11,080 Speaker 1: process of Facebook becoming aware of Russian hacker activity on 34 00:02:11,120 --> 00:02:14,760 Speaker 1: the platform.
Alex Stamos, who was then the head of 35 00:02:14,800 --> 00:02:18,840 Speaker 1: security over at Facebook, initiated an internal team to look 36 00:02:18,919 --> 00:02:25,160 Speaker 1: into suspicious activity. In December, Mark Zuckerberg sort of dismissed 37 00:02:25,200 --> 00:02:28,680 Speaker 1: the idea that fake news on Facebook was having some 38 00:02:28,720 --> 00:02:32,880 Speaker 1: sort of effect on the political process or had in 39 00:02:32,919 --> 00:02:38,040 Speaker 1: fact played any role in the election. Stamos was worried 40 00:02:38,080 --> 00:02:41,720 Speaker 1: that this wasn't exactly true and so he ended up 41 00:02:41,800 --> 00:02:45,639 Speaker 1: meeting with Zuckerberg and Sheryl Sandberg, who is the chief 42 00:02:45,680 --> 00:02:49,200 Speaker 1: operating officer over at Facebook. Now, according to the New 43 00:02:49,240 --> 00:02:54,280 Speaker 1: York Times article, Sandberg was furious with Stamos and said 44 00:02:54,320 --> 00:02:57,480 Speaker 1: that his investigation had opened up the possibility that Facebook 45 00:02:57,520 --> 00:03:00,880 Speaker 1: could be held accountable or liable for this stuff. But 46 00:03:01,000 --> 00:03:05,680 Speaker 1: ultimately the company chose to expand this investigation into an 47 00:03:05,720 --> 00:03:09,840 Speaker 1: internal project that was called Project P. The P stood 48 00:03:09,840 --> 00:03:14,040 Speaker 1: for propaganda. I should also point out that Sandberg responded 49 00:03:14,120 --> 00:03:17,120 Speaker 1: to the New York Times piece after it ran in 50 00:03:17,200 --> 00:03:20,000 Speaker 1: a blog post, and she denied the suggestion that she 51 00:03:20,080 --> 00:03:24,680 Speaker 1: wanted to avoid or slow down any internal investigations into 52 00:03:24,880 --> 00:03:29,360 Speaker 1: Russian interference. She specifically wrote, quote, Mark and I have 53 00:03:29,560 --> 00:03:33,560 Speaker 1: said many times we were too slow. But to suggest 54 00:03:33,600 --> 00:03:36,840 Speaker 1: that we weren't interested in knowing the truth, or we 55 00:03:36,880 --> 00:03:39,920 Speaker 1: wanted to hide what we knew, or that we tried 56 00:03:39,960 --> 00:03:45,800 Speaker 1: to prevent investigations is simply untrue. End quote. So that's 57 00:03:45,840 --> 00:03:50,400 Speaker 1: going on. But in April, Facebook would publish a paper 58 00:03:50,520 --> 00:03:54,360 Speaker 1: about the subject. However, in this paper, the word Russia 59 00:03:54,680 --> 00:03:59,200 Speaker 1: was mysteriously absent. It was about interference, but 60 00:04:00,160 --> 00:04:04,880 Speaker 1: the company did not name Russian hackers as the perpetrators. 61 00:04:05,560 --> 00:04:09,160 Speaker 1: Joel Kaplan, who is Facebook's vice president of Global Policy 62 00:04:09,360 --> 00:04:12,400 Speaker 1: and also a former deputy chief of staff for Republican 63 00:04:12,520 --> 00:04:16,640 Speaker 1: US President George W. Bush, had argued behind the scenes 64 00:04:16,839 --> 00:04:21,240 Speaker 1: against Facebook taking a more firm and definitive stance.
He 65 00:04:21,279 --> 00:04:23,839 Speaker 1: said it would open the company up to accusations that 66 00:04:23,920 --> 00:04:27,880 Speaker 1: it was anti-Republican and biased towards Democrats, and so 67 00:04:27,960 --> 00:04:30,520 Speaker 1: he said, in order to avoid that, let's not 68 00:04:30,920 --> 00:04:33,760 Speaker 1: lay out that it's Russian hackers that were 69 00:04:33,800 --> 00:04:39,080 Speaker 1: attempting to sway elections specifically in favor of Donald Trump. 70 00:04:39,480 --> 00:04:43,680 Speaker 1: So how did things get so bad so fast for 71 00:04:43,800 --> 00:04:47,680 Speaker 1: the company? Well, the piece maintains that Mark Zuckerberg, you know, 72 00:04:47,760 --> 00:04:51,760 Speaker 1: founder and CEO, and Sheryl Sandberg had focused on personal 73 00:04:51,800 --> 00:04:56,360 Speaker 1: projects rather than critical operations at Facebook and had 74 00:04:56,400 --> 00:05:01,720 Speaker 1: handed off those important responsibilities to subordinates. The New York 75 00:05:01,760 --> 00:05:06,520 Speaker 1: Times journalists cite numerous current and former executives who indicated 76 00:05:06,680 --> 00:05:09,240 Speaker 1: that there was a lot of delegating going on and 77 00:05:09,320 --> 00:05:13,040 Speaker 1: not enough oversight. So there were a lot of people 78 00:05:13,040 --> 00:05:16,680 Speaker 1: who were given a great deal of leeway to do 79 00:05:16,760 --> 00:05:20,200 Speaker 1: their jobs, and as a result, people probably stepped a 80 00:05:20,240 --> 00:05:25,320 Speaker 1: little further out than what Zuckerberg or Sandberg would have preferred. Now, 81 00:05:25,400 --> 00:05:28,080 Speaker 1: when the various scandals all rose to a certain level 82 00:05:28,120 --> 00:05:31,960 Speaker 1: and public opinion was really beginning to shift against Facebook, 83 00:05:32,640 --> 00:05:37,040 Speaker 1: someone over at Facebook, and it's not clear who yet 84 00:05:37,240 --> 00:05:40,320 Speaker 1: as of the recording of this podcast, made the decision 85 00:05:40,440 --> 00:05:43,920 Speaker 1: to go on the offensive and hire an opposition research 86 00:05:44,000 --> 00:05:49,920 Speaker 1: firm called Definers Public Affairs. Opposition research is a really 87 00:05:50,040 --> 00:05:53,840 Speaker 1: nice way to describe the technique of researching an opponent, 88 00:05:54,200 --> 00:05:57,280 Speaker 1: typically a political opponent, in an effort to dig up 89 00:05:57,400 --> 00:06:01,680 Speaker 1: dirt or compromising information so that that information can be 90 00:06:01,800 --> 00:06:05,359 Speaker 1: used against that opponent. This information can be used to 91 00:06:05,400 --> 00:06:09,680 Speaker 1: discredit or weaken the person. And this technique is not new. 92 00:06:09,800 --> 00:06:14,080 Speaker 1: It's actually ancient. It was used in ancient Rome in their 93 00:06:14,200 --> 00:06:17,960 Speaker 1: republic more than two thousand years ago, so this has 94 00:06:17,960 --> 00:06:21,279 Speaker 1: been around for quite some time. The term opposition research 95 00:06:21,360 --> 00:06:25,359 Speaker 1: is a bit more modern, but the underlying principles are ancient. 96 00:06:26,160 --> 00:06:28,320 Speaker 1: And just to be clear here, this is a tactic 97 00:06:28,360 --> 00:06:32,160 Speaker 1: that's been used by politicians from all political parties.
This 98 00:06:32,240 --> 00:06:35,760 Speaker 1: is not something where someone should say, oh, only Republicans 99 00:06:35,800 --> 00:06:39,600 Speaker 1: do that. No, no, all political parties, at some point 100 00:06:39,680 --> 00:06:44,120 Speaker 1: or another, engage in opposition research at some level, and the 101 00:06:44,200 --> 00:06:48,440 Speaker 1: question is when does it go from being a legitimate 102 00:06:48,839 --> 00:06:52,559 Speaker 1: political strategy to an unethical one. And it's a pretty 103 00:06:52,600 --> 00:06:56,960 Speaker 1: gray line. It's uh ugly as well. Politics tend to 104 00:06:57,000 --> 00:07:00,440 Speaker 1: be pretty ugly, and in this case, Facebook was 105 00:07:00,480 --> 00:07:04,919 Speaker 1: starting to employ a tactic that had been used in 106 00:07:05,080 --> 00:07:09,200 Speaker 1: politics and now was going to be used in business. 107 00:07:09,840 --> 00:07:13,560 Speaker 1: So the reason Facebook hired Definers in the first place 108 00:07:13,800 --> 00:07:17,600 Speaker 1: was to help monitor news stories about Facebook so that 109 00:07:17,680 --> 00:07:20,960 Speaker 1: the executives would be aware of the general public opinion 110 00:07:21,160 --> 00:07:26,800 Speaker 1: about the company. In October, Facebook would expand this to 111 00:07:27,160 --> 00:07:31,400 Speaker 1: direct Definers to specifically focus on the story about Russian 112 00:07:31,440 --> 00:07:35,600 Speaker 1: hackers on Facebook and how it relates to the manipulating 113 00:07:35,680 --> 00:07:37,920 Speaker 1: of the American public in the lead up to the 114 00:07:37,920 --> 00:07:41,400 Speaker 1: twenty sixteen election. They said, forget all the other stories, 115 00:07:41,440 --> 00:07:44,480 Speaker 1: really focus on these and see where that narrative is going. 116 00:07:45,040 --> 00:07:49,080 Speaker 1: The more the Facebook security team investigated the Russian hacker issue, 117 00:07:49,360 --> 00:07:51,800 Speaker 1: the bigger and more impactful it was turning out to be. 118 00:07:52,000 --> 00:07:55,440 Speaker 1: So it was sort of an attempt by Facebook 119 00:07:55,440 --> 00:08:00,760 Speaker 1: to stay ahead of what the public narrative was about 120 00:08:00,800 --> 00:08:03,080 Speaker 1: this whole thing and to get a better grip on 121 00:08:03,240 --> 00:08:07,600 Speaker 1: exactly what had happened before someone else found out and 122 00:08:07,760 --> 00:08:11,040 Speaker 1: then put Facebook on the defensive. So Facebook was really 123 00:08:11,080 --> 00:08:16,080 Speaker 1: concerned that this increased focus on the company would possibly 124 00:08:16,320 --> 00:08:20,200 Speaker 1: lead to government intervention in the form of regulations. Now, 125 00:08:20,240 --> 00:08:25,280 Speaker 1: generally speaking, big companies are not huge fans of regulations. 126 00:08:25,320 --> 00:08:29,760 Speaker 1: By definition, regulations limit what a company can do, and 127 00:08:29,800 --> 00:08:33,240 Speaker 1: since from a very high level perspective, the purpose of 128 00:08:33,280 --> 00:08:37,800 Speaker 1: a publicly traded company is ultimately to make money for shareholders, 129 00:08:38,320 --> 00:08:42,560 Speaker 1: limitations are generally viewed as a bad thing.
They tend 130 00:08:42,559 --> 00:08:46,600 Speaker 1: to also require that a company invest money in various 131 00:08:46,640 --> 00:08:50,880 Speaker 1: processes and procedures, which means there's less money to go 132 00:08:51,040 --> 00:08:56,040 Speaker 1: toward profit. So again, the more costs you have, the 133 00:08:56,120 --> 00:09:00,199 Speaker 1: less attractive you tend to be to shareholders. So all 134 00:09:00,200 --> 00:09:03,240 Speaker 1: of this requires a bit of mental gymnastics to separate 135 00:09:03,240 --> 00:09:06,360 Speaker 1: out what would typically be considered ethical, as in, what 136 00:09:06,520 --> 00:09:09,560 Speaker 1: is the right thing to do, and what is considered 137 00:09:09,679 --> 00:09:15,960 Speaker 1: good business practices. Those two questions often arrive at very 138 00:09:16,000 --> 00:09:19,840 Speaker 1: different answers. A lot of businesses try 139 00:09:19,920 --> 00:09:24,079 Speaker 1: to go, uh, an amoral route, not immoral. 140 00:09:24,400 --> 00:09:28,880 Speaker 1: They're not trying to do something that is, uh, antithetical 141 00:09:29,000 --> 00:09:33,880 Speaker 1: to morals, but rather remove morals from the question 142 00:09:34,200 --> 00:09:39,040 Speaker 1: entirely as much as you can. Uh, that isn't every business, 143 00:09:39,679 --> 00:09:42,400 Speaker 1: and certainly I don't think there are very many businesses 144 00:09:42,400 --> 00:09:46,320 Speaker 1: that do it to the fullest extent. But you see 145 00:09:46,360 --> 00:09:51,239 Speaker 1: a lot of companies try to ignore certain ethical questions 146 00:09:51,280 --> 00:09:55,959 Speaker 1: if those ethical questions are inconvenient in the pursuit of profit. Now, 147 00:09:56,000 --> 00:09:59,920 Speaker 1: to be fair to Facebook, the situation is incredibly 148 00:10:00,040 --> 00:10:03,360 Speaker 1: complicated. I don't wish to say that there was 149 00:10:03,400 --> 00:10:06,840 Speaker 1: a very simple choice to be made and Facebook went 150 00:10:06,960 --> 00:10:10,640 Speaker 1: the wrong way. That is far too simplistic for what 151 00:10:10,720 --> 00:10:15,800 Speaker 1: was going on. Facebook executives have understandably, I think, argued 152 00:10:15,840 --> 00:10:19,520 Speaker 1: that Facebook is a platform, not a publisher. There is 153 00:10:19,559 --> 00:10:24,040 Speaker 1: a difference. As a platform, the company is not responsible 154 00:10:24,040 --> 00:10:28,040 Speaker 1: for the type of stuff people will post to that platform. 155 00:10:28,120 --> 00:10:32,800 Speaker 1: The idea is that the company is agnostic and disinterested. They 156 00:10:32,840 --> 00:10:36,120 Speaker 1: provide the venue, they do not provide the script, in 157 00:10:36,160 --> 00:10:38,920 Speaker 1: other words. So that gives Facebook a bit of protection. 158 00:10:39,080 --> 00:10:43,160 Speaker 1: If someone were to post something really awful on that platform, 159 00:10:43,240 --> 00:10:46,240 Speaker 1: the company can enjoy a bit of protection. It's related 160 00:10:46,240 --> 00:10:49,280 Speaker 1: to a concept called safe harbor. The idea is that 161 00:10:49,600 --> 00:10:53,160 Speaker 1: if you provide a place for people to put stuff, 162 00:10:53,920 --> 00:10:58,160 Speaker 1: you actually are not liable if someone puts something illegal there.
163 00:10:58,679 --> 00:11:01,760 Speaker 1: You were providing a service in the sense of 164 00:11:01,800 --> 00:11:05,480 Speaker 1: a place for people to go and do things. The 165 00:11:05,559 --> 00:11:07,880 Speaker 1: other person who put the illegal thing, they're the 166 00:11:07,880 --> 00:11:10,160 Speaker 1: ones who broke the rules. They should be held liable, 167 00:11:10,360 --> 00:11:14,480 Speaker 1: not you as the service provider. But then the problem 168 00:11:14,559 --> 00:11:19,199 Speaker 1: is Facebook doesn't take a completely hands off approach when 169 00:11:19,280 --> 00:11:22,720 Speaker 1: it comes to people posting stuff on the platform. For 170 00:11:22,800 --> 00:11:26,679 Speaker 1: one thing, the company has designed algorithms so that users 171 00:11:26,760 --> 00:11:30,760 Speaker 1: see some content, but they might not see other content 172 00:11:31,240 --> 00:11:33,960 Speaker 1: from their friends. And I'm sure if you've used Facebook 173 00:11:34,320 --> 00:11:37,280 Speaker 1: you've had this experience. Maybe you missed out on a 174 00:11:37,400 --> 00:11:39,720 Speaker 1: post that a lot of other people are talking about, 175 00:11:40,040 --> 00:11:42,440 Speaker 1: and it's not that you were excluded, it's just that 176 00:11:42,520 --> 00:11:45,880 Speaker 1: Facebook's algorithm didn't share that post with you, so you 177 00:11:45,920 --> 00:11:49,440 Speaker 1: didn't see it. Or maybe you posted something and you 178 00:11:49,480 --> 00:11:52,240 Speaker 1: were surprised that more of your friends didn't respond to it. 179 00:11:52,640 --> 00:11:55,440 Speaker 1: And again, it may very well be that Facebook just 180 00:11:55,559 --> 00:12:00,760 Speaker 1: didn't display your post in people's feeds. So Facebook's algorithms 181 00:12:00,760 --> 00:12:04,600 Speaker 1: in part determine what you see. Generally speaking, posts that 182 00:12:04,679 --> 00:12:08,079 Speaker 1: get more interaction or engagement tend to be seen by 183 00:12:08,120 --> 00:12:11,839 Speaker 1: more people. Facebook's algorithm tends to favor those. So if 184 00:12:11,840 --> 00:12:13,640 Speaker 1: a post gets a lot of likes, if it gets 185 00:12:13,640 --> 00:12:15,679 Speaker 1: a lot of shares, if it gets a lot of comments, 186 00:12:16,360 --> 00:12:19,360 Speaker 1: it tends to raise the visibility of that post, and 187 00:12:19,400 --> 00:12:23,480 Speaker 1: it tends to show up in more people's news feeds. Well, 188 00:12:24,280 --> 00:12:27,520 Speaker 1: news flash: posts that get a lot of engagement tend 189 00:12:27,559 --> 00:12:32,319 Speaker 1: to be very emotionally charged and controversial because they tend 190 00:12:32,360 --> 00:12:35,880 Speaker 1: to invite people to either chime in and say, yeah, 191 00:12:35,920 --> 00:12:39,559 Speaker 1: you're totally right for that controversial perspective you've posted, or 192 00:12:39,960 --> 00:12:41,920 Speaker 1: you are way off base and you are a jerk 193 00:12:41,960 --> 00:12:44,320 Speaker 1: face for putting such a controversial post up on your 194 00:12:44,360 --> 00:12:48,840 Speaker 1: news feed.
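[Editor's note: here is a minimal, hypothetical sketch in Python of the engagement-weighted ranking idea described above, where posts that collect more likes, shares, and comments score higher and get shown to more people. The weights, the time decay, and the paid-boost field are illustrative assumptions made for this sketch only; they are not Facebook's actual News Feed algorithm, which the episode describes only in general terms.

from dataclasses import dataclass
import math

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int
    hours_old: float
    paid_boost: float = 0.0  # hypothetical: promoted posts buy extra visibility

def engagement_score(post: Post) -> float:
    # Comments and shares are weighted more heavily than likes, since they
    # signal stronger engagement; older posts decay; paid promotion adds on top.
    raw = post.likes * 1.0 + post.comments * 4.0 + post.shares * 6.0
    decay = math.exp(-post.hours_old / 24.0)
    return raw * decay + post.paid_boost

feed = [
    Post("Controversial hot take", likes=50, shares=40, comments=120, hours_old=3),
    Post("Vacation photos", likes=80, shares=2, comments=10, hours_old=3),
    Post("Promoted ad", likes=5, shares=1, comments=2, hours_old=1, paid_boost=200.0),
]

# Posts with the most engagement (or paid promotion) float to the top of the feed.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(round(engagement_score(post), 1), post.text)

Run as is, the inflammatory post and the paid promotion outrank the vacation photos, which is the feedback loop the episode goes on to describe: engagement buys visibility, and visibility invites more engagement.]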
All of that engagement just drives the visibility 195 00:12:48,960 --> 00:12:52,480 Speaker 1: of that post and makes it even more visible, which 196 00:12:52,520 --> 00:12:56,679 Speaker 1: invites more people to participate, which again boosts the visibility, 197 00:12:56,920 --> 00:12:59,120 Speaker 1: and so you start to get these sort of toxic 198 00:12:59,240 --> 00:13:04,000 Speaker 1: posts rising to the top. Then there are promoted posts. 199 00:13:04,440 --> 00:13:08,199 Speaker 1: So by paying money, people, organizations, and companies can boost 200 00:13:08,280 --> 00:13:12,120 Speaker 1: the visibility of a post. They pay Facebook and Facebook 201 00:13:12,160 --> 00:13:14,400 Speaker 1: makes sure that that post will show up on more 202 00:13:14,480 --> 00:13:19,240 Speaker 1: news feeds. So that muddies the waters too, because Facebook, 203 00:13:19,240 --> 00:13:22,760 Speaker 1: as it turns out, isn't just an empty stage where 204 00:13:22,800 --> 00:13:26,520 Speaker 1: anyone can get up and say anything they want and 205 00:13:26,559 --> 00:13:28,960 Speaker 1: be heard by all the people who happen to be 206 00:13:28,960 --> 00:13:32,000 Speaker 1: in the room at that time. It's not an even 207 00:13:32,040 --> 00:13:37,400 Speaker 1: playing field, and it tends to favor inflammatory, controversial posts 208 00:13:37,960 --> 00:13:42,199 Speaker 1: and people who have money to spend. And again, understandably, 209 00:13:42,559 --> 00:13:46,439 Speaker 1: Facebook didn't want to take on the mantle of publisher 210 00:13:46,880 --> 00:13:49,679 Speaker 1: at that time. They didn't want to accept that as 211 00:13:49,720 --> 00:13:52,880 Speaker 1: their responsibility because that would mean the company would need 212 00:13:52,920 --> 00:13:57,800 Speaker 1: to monitor posts and potentially step in to censor problematic users 213 00:13:57,800 --> 00:14:01,480 Speaker 1: and accounts. That would require a lot of investment on 214 00:14:01,520 --> 00:14:04,840 Speaker 1: the part of Facebook, and this would be an ongoing expense. 215 00:14:04,880 --> 00:14:08,400 Speaker 1: They would have to keep on policing their service. And 216 00:14:08,880 --> 00:14:11,720 Speaker 1: it's a big service, so that's a huge job. And 217 00:14:11,760 --> 00:14:15,240 Speaker 1: since again the purpose of business is to make as 218 00:14:15,280 --> 00:14:19,480 Speaker 1: much money for the owners or shareholders as possible, additional 219 00:14:19,720 --> 00:14:24,280 Speaker 1: expenses are generally undesirable. I've got a lot more to 220 00:14:24,400 --> 00:14:28,120 Speaker 1: say about this whole subject, but first let's take our 221 00:14:28,160 --> 00:14:30,960 Speaker 1: own quick break to thank our sponsor so we can 222 00:14:31,040 --> 00:14:43,200 Speaker 1: pay our expenses. In addition to monitoring the news about Facebook, 223 00:14:43,720 --> 00:14:48,320 Speaker 1: Definers began to dedicate resources to deflecting some of the 224 00:14:48,400 --> 00:14:52,479 Speaker 1: blame for Russian involvement by trying to steer the conversation 225 00:14:52,840 --> 00:14:57,400 Speaker 1: to target some of Facebook's rival tech companies, namely Google 226 00:14:57,440 --> 00:15:01,960 Speaker 1: and Twitter. Definers, along with the two other companies that it 227 00:15:02,080 --> 00:15:05,840 Speaker 1: shares space with,
those two companies being America Rising, which 228 00:15:05,880 --> 00:15:09,840 Speaker 1: is a political action committee, and NTK Network, 229 00:15:10,000 --> 00:15:12,440 Speaker 1: which is a news network with a right wing slant 230 00:15:12,880 --> 00:15:18,480 Speaker 1: on news items, were all collectively using the same office space, 231 00:15:18,520 --> 00:15:21,680 Speaker 1: some of the same staff, and so that brings some 232 00:15:21,800 --> 00:15:25,080 Speaker 1: questions in there. NTK Network would publish stories 233 00:15:25,520 --> 00:15:30,840 Speaker 1: that were pro-Facebook and anti-Facebook competitor during this time, 234 00:15:30,920 --> 00:15:34,960 Speaker 1: so they were trying to steer public opinion to try 235 00:15:35,040 --> 00:15:38,000 Speaker 1: and take some of the heat off of Facebook itself 236 00:15:38,040 --> 00:15:41,360 Speaker 1: and put it on some of its competitors. Well before 237 00:15:41,640 --> 00:15:44,240 Speaker 1: the problems that would lead to Zuckerberg having to sit 238 00:15:44,320 --> 00:15:48,720 Speaker 1: in front of Congress, activists had been accusing Facebook of 239 00:15:48,760 --> 00:15:52,680 Speaker 1: allowing various oppressive governments around the world to co-opt 240 00:15:52,760 --> 00:15:56,400 Speaker 1: the platform in order to spread propaganda or to identify 241 00:15:56,480 --> 00:15:59,400 Speaker 1: people that those governments considered a threat in order to 242 00:15:59,680 --> 00:16:03,440 Speaker 1: silence them or eliminate them. When Facebook was responding 243 00:16:03,480 --> 00:16:08,640 Speaker 1: to mounting criticisms, including that famous session in which Mark 244 00:16:08,680 --> 00:16:11,800 Speaker 1: Zuckerberg would appear in front of Congress to provide 245 00:16:11,800 --> 00:16:16,000 Speaker 1: answers or explanations regarding Facebook user data, Russian interference, and more, 246 00:16:16,880 --> 00:16:21,040 Speaker 1: there was a simultaneous problem of people protesting the company 247 00:16:21,240 --> 00:16:23,640 Speaker 1: and its practices. So this is going on around the 248 00:16:23,680 --> 00:16:27,840 Speaker 1: same time. Activists were calling for oversight or regulations, which 249 00:16:27,840 --> 00:16:30,040 Speaker 1: Facebook definitely did not want to have to deal with, 250 00:16:30,520 --> 00:16:33,360 Speaker 1: and so Definers got the directive to go do some 251 00:16:33,480 --> 00:16:37,880 Speaker 1: digging on the activists in an effort to discredit them. Now, 252 00:16:37,880 --> 00:16:40,760 Speaker 1: one particular group that was becoming a thorn in the 253 00:16:40,800 --> 00:16:44,720 Speaker 1: side of Facebook was called Freedom from Facebook. Now, this 254 00:16:44,800 --> 00:16:49,200 Speaker 1: included an attempt to link those activists to a billionaire 255 00:16:49,280 --> 00:16:53,360 Speaker 1: financier named George Soros. You've probably heard that name if 256 00:16:53,360 --> 00:16:55,800 Speaker 1: you've been paying attention to the political news in the 257 00:16:55,920 --> 00:17:01,200 Speaker 1: United States. George Soros has backed a lot of different causes. 258 00:17:01,480 --> 00:17:05,280 Speaker 1: A lot of them are democratic causes for countries in Europe.
259 00:17:05,760 --> 00:17:10,400 Speaker 1: He's backed a lot of philanthropic causes in the United States, 260 00:17:10,440 --> 00:17:14,680 Speaker 1: but hasn't put as much money in direct democratic races 261 00:17:15,600 --> 00:17:18,280 Speaker 1: here in the US as he has in Europe. Soros 262 00:17:18,280 --> 00:17:21,159 Speaker 1: would actually create a philanthropic agency. It is called the 263 00:17:21,400 --> 00:17:26,320 Speaker 1: Open Society Foundations, and Soros funded 264 00:17:26,600 --> 00:17:30,520 Speaker 1: this organization with eighteen billion dollars of his own money. 265 00:17:30,560 --> 00:17:35,840 Speaker 1: According to Bloomberg, Soros is currently worth about eight billion dollars. So 266 00:17:35,840 --> 00:17:38,879 Speaker 1: Soros was born in Budapest in nineteen thirty and he 267 00:17:38,960 --> 00:17:43,080 Speaker 1: comes from a Jewish family. His family wasn't really religious, 268 00:17:43,080 --> 00:17:47,080 Speaker 1: they were non observant Jews. But growing up in Budapest 269 00:17:47,119 --> 00:17:50,120 Speaker 1: in the nineteen thirties, you know, World War Two right 270 00:17:50,160 --> 00:17:54,800 Speaker 1: on the horizon, it was a stressful time, and, uh, 271 00:17:55,040 --> 00:17:59,639 Speaker 1: he then would move out of Hungary. He moved to 272 00:17:59,800 --> 00:18:02,760 Speaker 1: the UK and then moved to the United States, and 273 00:18:02,800 --> 00:18:06,679 Speaker 1: he built his wealth. His enormous amount of wealth and 274 00:18:06,760 --> 00:18:09,760 Speaker 1: his support of various liberal causes, not just in the 275 00:18:09,840 --> 00:18:13,720 Speaker 1: United States but elsewhere, particularly in Europe, have evolved to 276 00:18:13,720 --> 00:18:19,320 Speaker 1: the point that some conservatives, not all, tend 277 00:18:19,359 --> 00:18:23,320 Speaker 1: to accuse Soros of trying to manipulate government policies around 278 00:18:23,359 --> 00:18:27,680 Speaker 1: the world through his wealth, and those accusations often come 279 00:18:27,800 --> 00:18:32,399 Speaker 1: from the more extreme fringes of the conservative movement, and 280 00:18:32,440 --> 00:18:36,840 Speaker 1: those accusations sometimes carry with them anti-Semitic messaging. This 281 00:18:37,480 --> 00:18:43,520 Speaker 1: notion of wealthy Jewish people trying to control world events 282 00:18:44,200 --> 00:18:48,080 Speaker 1: is a frequent anti-Semitic message that has been 283 00:18:48,119 --> 00:18:53,360 Speaker 1: repeated numerous times. Uh, and it is an insidious 284 00:18:53,480 --> 00:18:58,080 Speaker 1: and dangerous message to send out there. It fuels a 285 00:18:58,119 --> 00:19:02,280 Speaker 1: lot of hate groups. So at the same time that 286 00:19:02,840 --> 00:19:07,560 Speaker 1: the Definers were starting to link George Soros to this 287 00:19:07,680 --> 00:19:11,360 Speaker 1: activist group, Freedom from Facebook, the Freedom from Facebook group 288 00:19:11,440 --> 00:19:16,640 Speaker 1: went and did something pretty bone headed and wrong.
They 289 00:19:16,680 --> 00:19:21,240 Speaker 1: attended a House Judiciary Committee meeting where a Facebook executive 290 00:19:21,640 --> 00:19:26,200 Speaker 1: was testifying about corporate policies, and the activist group held 291 00:19:26,280 --> 00:19:29,639 Speaker 1: up a sign, and that sign showed Mark Zuckerberg and 292 00:19:29,760 --> 00:19:33,760 Speaker 1: Sheryl Sandberg as a two headed octopus with its tentacles 293 00:19:33,920 --> 00:19:38,080 Speaker 1: encircling the globe. And that's more than a little problematic. 294 00:19:38,440 --> 00:19:42,440 Speaker 1: Both Zuckerberg and Sandberg are also Jewish, and the depiction 295 00:19:42,520 --> 00:19:45,840 Speaker 1: of an octopus grasping the globe has been used as 296 00:19:45,840 --> 00:19:50,400 Speaker 1: an anti-Semitic messaging method before, and so this particular 297 00:19:50,440 --> 00:19:54,520 Speaker 1: sign could legitimately be viewed as being anti-Semitic, as 298 00:19:54,560 --> 00:19:59,280 Speaker 1: being, uh, bigoted against Jewish people. Now, whether or not 299 00:19:59,320 --> 00:20:02,800 Speaker 1: the activist group intended to express anti-Semitic views or 300 00:20:02,840 --> 00:20:06,639 Speaker 1: not is up for debate, but either way, Definers was 301 00:20:06,720 --> 00:20:10,440 Speaker 1: able to take that protest and leverage it against them. 302 00:20:10,520 --> 00:20:14,359 Speaker 1: They contacted the Anti-Defamation League and they said, this 303 00:20:14,440 --> 00:20:17,639 Speaker 1: is a terrible thing. This should not stand. The group 304 00:20:17,680 --> 00:20:21,320 Speaker 1: needs to apologize, taking some of the attention away from 305 00:20:21,320 --> 00:20:27,160 Speaker 1: Facebook and putting it onto Facebook's critics. And again, I'm 306 00:20:27,160 --> 00:20:30,680 Speaker 1: not defending Freedom from Facebook for doing that sign. 307 00:20:30,800 --> 00:20:33,840 Speaker 1: It was a dumb thing to do, and at best 308 00:20:33,880 --> 00:20:38,240 Speaker 1: it was dumb. At worst, it was racist. So Definers 309 00:20:38,280 --> 00:20:42,560 Speaker 1: takes that protest and leverages it against them. Simultaneously, Definers 310 00:20:42,640 --> 00:20:46,080 Speaker 1: is trying to link George Soros and his money to 311 00:20:46,240 --> 00:20:50,200 Speaker 1: that activist group, which is doubly weird, right? On 312 00:20:50,200 --> 00:20:53,600 Speaker 1: one hand, the group, this Definers group, is reaching out 313 00:20:53,600 --> 00:20:56,280 Speaker 1: to the Anti-Defamation League and saying, look at this 314 00:20:56,440 --> 00:21:00,280 Speaker 1: anti-Semitic messaging that this activist group is sending out, 315 00:21:00,480 --> 00:21:05,159 Speaker 1: that is unconscionable. At the same time, this same group 316 00:21:05,840 --> 00:21:10,640 Speaker 1: is trying to link George Soros, who has been frequently, 317 00:21:11,640 --> 00:21:19,320 Speaker 1: uh, accused of engaging in various anti-conservative causes and 318 00:21:19,600 --> 00:21:23,720 Speaker 1: has been linked to anti-Semitic messaging, has been linked to 319 00:21:23,960 --> 00:21:27,240 Speaker 1: those claims, like there have been a lot of anti-Semitic 320 00:21:27,240 --> 00:21:32,600 Speaker 1: messages that specifically target George Soros. So they're playing 321 00:21:32,600 --> 00:21:35,359 Speaker 1: both sides at the same time, essentially is what I'm saying.
322 00:21:35,400 --> 00:21:40,360 Speaker 1: They're saying, how dare this activist group send out 323 00:21:40,440 --> 00:21:45,400 Speaker 1: this anti-Jewish message, while meanwhile they're fanning the flames 324 00:21:45,440 --> 00:21:49,920 Speaker 1: of anti-Jewish sentiment by linking Soros to that group. 325 00:21:50,560 --> 00:21:54,040 Speaker 1: By the way, Soros's philanthropic organization would later say 326 00:21:54,160 --> 00:21:57,680 Speaker 1: it had not provided any financial support to Freedom from Facebook, 327 00:21:57,720 --> 00:22:00,959 Speaker 1: so the claims were, uh, spurious to begin with; 328 00:22:01,040 --> 00:22:04,760 Speaker 1: they weren't even true. The New York Times piece also 329 00:22:04,840 --> 00:22:09,080 Speaker 1: goes into detail about Facebook executives and their relationships with 330 00:22:09,200 --> 00:22:13,080 Speaker 1: various politicians, and there are many executives at Facebook who 331 00:22:13,080 --> 00:22:16,720 Speaker 1: have worked on political campaigns or held government jobs in 332 00:22:16,760 --> 00:22:20,879 Speaker 1: specific administrations over the years. Several of them, Sandberg included, 333 00:22:21,119 --> 00:22:24,640 Speaker 1: are very close friends with top lawmakers and have leveraged 334 00:22:24,680 --> 00:22:29,280 Speaker 1: those relationships throughout the whole affair. Sandberg would testify in 335 00:22:29,320 --> 00:22:32,199 Speaker 1: front of the Senate Intelligence Committee, and leading up to 336 00:22:32,280 --> 00:22:36,919 Speaker 1: her appearance, the Facebook team lobbied the committee chairman, Richard 337 00:22:36,960 --> 00:22:40,800 Speaker 1: Burr, to stick to the topic of election interference and 338 00:22:41,119 --> 00:22:45,520 Speaker 1: not to press Sheryl Sandberg on other issues related to Facebook, 339 00:22:45,560 --> 00:22:51,000 Speaker 1: like user privacy or the Cambridge Analytica scandal, and Burr 340 00:22:51,040 --> 00:22:53,360 Speaker 1: agreed to that. He said, we should really focus just 341 00:22:53,520 --> 00:22:56,920 Speaker 1: on the election interference side, so that took some pressure 342 00:22:56,960 --> 00:23:01,520 Speaker 1: off of Facebook. Facebook also lobbied to include competitors in 343 00:23:01,600 --> 00:23:04,920 Speaker 1: this same hearing. They said, well, we're going to come forward, 344 00:23:04,960 --> 00:23:07,720 Speaker 1: but you should also really get someone from Google and 345 00:23:07,800 --> 00:23:11,240 Speaker 1: someone from Twitter. And since we're sending Sandberg, our chief 346 00:23:11,280 --> 00:23:14,920 Speaker 1: operating officer, the people those companies should send should also 347 00:23:15,000 --> 00:23:18,359 Speaker 1: be very high ranking executives. Burr agreed to that too, 348 00:23:18,400 --> 00:23:21,840 Speaker 1: and he invited both of those companies to send in 349 00:23:22,160 --> 00:23:26,120 Speaker 1: comparable executives to appear before the committee. Twitter's Jack Dorsey 350 00:23:26,320 --> 00:23:29,680 Speaker 1: did so; he showed up. Google did not send anyone, 351 00:23:30,640 --> 00:23:34,199 Speaker 1: so Google's absence became a topic of scorn among the 352 00:23:34,200 --> 00:23:36,719 Speaker 1: committee members. People were saying, well, you know, this 353 00:23:36,760 --> 00:23:39,000 Speaker 1: looks really bad for Google not to be here.
354 00:23:39,280 --> 00:23:42,399 Speaker 1: And that also helped take some of the heat off 355 00:23:42,440 --> 00:23:46,760 Speaker 1: of Facebook and also Twitter because they were there, so 356 00:23:47,000 --> 00:23:49,199 Speaker 1: they were able to come out of it looking a 357 00:23:49,240 --> 00:23:52,840 Speaker 1: little bit better because Google didn't show up. Oh, 358 00:23:52,840 --> 00:23:56,560 Speaker 1: and Definers got involved in this part too. According to 359 00:23:56,600 --> 00:24:00,880 Speaker 1: that New York Times article, Definers gathered information about all 360 00:24:00,960 --> 00:24:05,200 Speaker 1: the senators on this committee and then sent that information 361 00:24:05,320 --> 00:24:10,080 Speaker 1: to various journalists. That included information about how much each 362 00:24:10,160 --> 00:24:15,160 Speaker 1: senator had spent on Facebook ads during various campaigns, as 363 00:24:15,200 --> 00:24:19,679 Speaker 1: well as which tracking tools the various senators' websites used 364 00:24:20,119 --> 00:24:23,359 Speaker 1: on the visitors to those websites. The message was pretty 365 00:24:23,359 --> 00:24:27,760 Speaker 1: clear: if those senators were to really go after Facebook, 366 00:24:28,200 --> 00:24:31,960 Speaker 1: the journalists had information that could lead to questions for 367 00:24:32,000 --> 00:24:34,639 Speaker 1: the senators. They could ask the senators, you know, 368 00:24:34,720 --> 00:24:37,919 Speaker 1: you really chased Facebook down and you argued about privacy, 369 00:24:38,000 --> 00:24:41,880 Speaker 1: but it turns out you're using a tracker on your 370 00:24:41,920 --> 00:24:44,840 Speaker 1: website to track information about people who visit your website. 371 00:24:44,840 --> 00:24:47,280 Speaker 1: So how can you accuse them of being 372 00:24:47,320 --> 00:24:51,640 Speaker 1: bad about privacy when you are gathering data? Or, you're 373 00:24:51,760 --> 00:24:55,000 Speaker 1: arguing about Facebook, but at the same time you've spent 374 00:24:55,160 --> 00:24:58,440 Speaker 1: a huge amount of money on Facebook to advertise your campaign. 375 00:24:58,440 --> 00:25:01,720 Speaker 1: How can you be so critical of that company? It 376 00:25:01,840 --> 00:25:04,399 Speaker 1: was all meant to kind of add pressure to the senators, 377 00:25:05,240 --> 00:25:08,560 Speaker 1: and it's essentially saying something like, sure, right now, 378 00:25:08,600 --> 00:25:13,160 Speaker 1: it's politically advantageous to go after Facebook because the public's 379 00:25:13,440 --> 00:25:16,600 Speaker 1: opinion is turning against this social media site. But 380 00:25:16,680 --> 00:25:19,040 Speaker 1: let's talk about all the ways you've leveraged Facebook to 381 00:25:19,040 --> 00:25:22,040 Speaker 1: get where you are, and as I said before, politics 382 00:25:22,040 --> 00:25:25,440 Speaker 1: gets real ugly. Shortly after The New York Times published 383 00:25:25,440 --> 00:25:29,399 Speaker 1: its article about Facebook's strategies to manage those crises, Zuckerberg 384 00:25:29,400 --> 00:25:34,000 Speaker 1: announced that his company had severed ties with Definers. He 385 00:25:34,080 --> 00:25:38,080 Speaker 1: and Sandberg both said that they were unaware that Definers 386 00:25:38,119 --> 00:25:41,520 Speaker 1: had been retained on behalf of Facebook.
Sandberg said she 387 00:25:41,560 --> 00:25:43,879 Speaker 1: should have been aware of it, but she wasn't until 388 00:25:44,119 --> 00:25:47,240 Speaker 1: this article came out, and they said it was probably 389 00:25:47,320 --> 00:25:51,239 Speaker 1: someone in the communications department who had hired Definers, and 390 00:25:51,280 --> 00:25:54,840 Speaker 1: they just didn't realize it. They, being Zuckerberg and Sandberg, 391 00:25:54,880 --> 00:25:58,440 Speaker 1: didn't realize that had happened. According to TechCrunch, 392 00:25:58,840 --> 00:26:02,760 Speaker 1: a number of Facebook's communications staff have ties to 393 00:26:02,920 --> 00:26:07,240 Speaker 1: Matt Rhoades. Matt Rhoades is the founder of Definers, and 394 00:26:07,280 --> 00:26:11,160 Speaker 1: he had previously run the election campaign for Mitt Romney. 395 00:26:11,160 --> 00:26:15,119 Speaker 1: So Andrea Saul, who serves Facebook as the director of 396 00:26:15,160 --> 00:26:18,680 Speaker 1: Policy Communications, also worked for Rhoades in two thousand eleven 397 00:26:18,720 --> 00:26:24,240 Speaker 1: and two thousand twelve. Facebook spokesperson Jackie Rooney likewise had worked 398 00:26:24,240 --> 00:26:27,520 Speaker 1: on the Romney campaign as chief of staff. Another member 399 00:26:27,560 --> 00:26:31,560 Speaker 1: of the corporate communications team named Carolyn Glanville also worked 400 00:26:31,600 --> 00:26:35,239 Speaker 1: on the Romney campaign as the deputy communications director, and 401 00:26:35,280 --> 00:26:38,480 Speaker 1: then Joel Kaplan may have worked with Matt Rhoades while 402 00:26:38,520 --> 00:26:41,480 Speaker 1: Kaplan was deputy chief of staff under George W. Bush. 403 00:26:41,720 --> 00:26:44,720 Speaker 1: So there are plenty of people who could have initiated 404 00:26:44,800 --> 00:26:48,719 Speaker 1: bringing on Definers. Now, I want to be clear, I 405 00:26:48,800 --> 00:26:52,440 Speaker 1: don't mean to suggest that the communications department at Facebook 406 00:26:52,520 --> 00:26:55,520 Speaker 1: has a particular political bias, or, if they do have 407 00:26:55,560 --> 00:26:59,239 Speaker 1: a particular political bias, that they perform their jobs in 408 00:26:59,280 --> 00:27:02,280 Speaker 1: a way where that bias affects them. I don't know that that's true. 409 00:27:02,520 --> 00:27:06,880 Speaker 1: I hope it's not. I'm assuming the department is largely 410 00:27:06,920 --> 00:27:10,440 Speaker 1: just doing what most corporate departments do, which is to 411 00:27:10,560 --> 00:27:14,240 Speaker 1: act in the best interests of the company, rather than 412 00:27:14,280 --> 00:27:17,800 Speaker 1: to push any particular political philosophy. The only reason I 413 00:27:17,800 --> 00:27:21,320 Speaker 1: point out the relationships is because I think Zuckerberg's explanation 414 00:27:21,520 --> 00:27:24,159 Speaker 1: that he did not hire Definers, but someone in the 415 00:27:24,160 --> 00:27:28,280 Speaker 1: communications department did, is probably true, because there were so 416 00:27:28,320 --> 00:27:32,720 Speaker 1: many people who had relationships with the founder of Definers.
417 00:27:33,440 --> 00:27:38,159 Speaker 1: One of the things I think is really interesting is 418 00:27:38,200 --> 00:27:42,760 Speaker 1: that when Zuckerberg appeared before Congress, as the founder, who 419 00:27:42,760 --> 00:27:45,200 Speaker 1: I don't think always views his company the same way 420 00:27:45,240 --> 00:27:49,959 Speaker 1: as fellow executives do, he said that perhaps regulations might be 421 00:27:50,000 --> 00:27:53,720 Speaker 1: inevitable for platforms like Facebook. Now, this was not quite 422 00:27:53,760 --> 00:27:56,920 Speaker 1: the same thing as saying regulations would be a good thing. 423 00:27:58,040 --> 00:28:02,159 Speaker 1: He said he thinks that they'd be unavoidable. So you 424 00:28:02,200 --> 00:28:05,480 Speaker 1: can read that as saying Zuckerberg says yes, they're necessary, 425 00:28:05,600 --> 00:28:07,920 Speaker 1: or he just is saying there's no way we're going 426 00:28:07,960 --> 00:28:13,160 Speaker 1: to avoid it in the future. Zuckerberg also warned Congress, however, saying, quote, 427 00:28:13,680 --> 00:28:16,679 Speaker 1: I think a lot of times regulation puts in place 428 00:28:16,880 --> 00:28:20,720 Speaker 1: rules that a large company like ours can easily comply with, 429 00:28:21,280 --> 00:28:26,240 Speaker 1: but that small startups can't. End quote. Sandberg would say 430 00:28:26,359 --> 00:28:29,960 Speaker 1: essentially the same thing in closed-door meetings with various lawmakers, 431 00:28:30,320 --> 00:28:33,320 Speaker 1: and she said Facebook was already changing policies to follow 432 00:28:33,440 --> 00:28:37,520 Speaker 1: new best practices to make sure it was doing more 433 00:28:37,560 --> 00:28:41,880 Speaker 1: to police the content on Facebook, but that regulations, if 434 00:28:41,880 --> 00:28:45,760 Speaker 1: they were made formal, could end up hurting smaller platforms. 435 00:28:46,160 --> 00:28:48,200 Speaker 1: The New York Times reports that some of the officials 436 00:28:48,200 --> 00:28:51,560 Speaker 1: were a little skeptical of that messaging, understandably so. It 437 00:28:51,600 --> 00:28:54,360 Speaker 1: sounds like they're saying, oh, no, we can handle 438 00:28:54,400 --> 00:28:57,000 Speaker 1: this just fine. We're good. What we're worried about are 439 00:28:57,040 --> 00:29:02,480 Speaker 1: these smaller companies that don't have our resources. It doesn't 440 00:29:03,600 --> 00:29:07,000 Speaker 1: strike me as being totally genuine, because I'm not convinced 441 00:29:07,080 --> 00:29:12,040 Speaker 1: that Facebook is that concerned with smaller businesses. Um, that's 442 00:29:12,080 --> 00:29:16,800 Speaker 1: based upon pretty much every action I've seen Facebook take 443 00:29:16,880 --> 00:29:20,480 Speaker 1: over the last decade. I've got more to say about 444 00:29:20,600 --> 00:29:23,880 Speaker 1: this article and its implications, but first, let's take another 445 00:29:23,960 --> 00:29:34,440 Speaker 1: quick break to thank our sponsor. On top of these 446 00:29:34,480 --> 00:29:39,800 Speaker 1: stories was one coming from CNBC about Kevin Systrom. Systrom 447 00:29:39,920 --> 00:29:43,720 Speaker 1: co-founded the company Instagram with Mike Krieger.
In two 448 00:29:43,720 --> 00:29:49,520 Speaker 1: thousand twelve, Facebook acquired Instagram for one billion dollars, a 449 00:29:49,720 --> 00:29:54,440 Speaker 1: princely sum. Systrom stayed on for several years, heading up 450 00:29:54,440 --> 00:30:01,400 Speaker 1: Instagram within Facebook, but in September Systrom left Facebook. His 451 00:30:01,480 --> 00:30:05,640 Speaker 1: departure was likely mostly tied to how Facebook has involved 452 00:30:05,680 --> 00:30:09,600 Speaker 1: itself with Instagram over the last several months, and how 453 00:30:09,640 --> 00:30:12,800 Speaker 1: it has changed the way Instagram photos show up in 454 00:30:12,840 --> 00:30:15,880 Speaker 1: Facebook feeds. There were a lot of arguments that said 455 00:30:15,920 --> 00:30:20,520 Speaker 1: that Facebook's approach was watering down the value of Instagram. 456 00:30:20,760 --> 00:30:23,760 Speaker 1: Systrom and Krieger reportedly felt that Facebook was really interfering 457 00:30:23,760 --> 00:30:26,520 Speaker 1: too much with their work and that the decisions being 458 00:30:26,560 --> 00:30:31,360 Speaker 1: made were ultimately hurting growth. So a month after leaving Facebook, 459 00:30:31,560 --> 00:30:35,000 Speaker 1: Systrom would say at a press conference, quote, no one 460 00:30:35,200 --> 00:30:39,800 Speaker 1: ever leaves a job because everything's awesome. End quote. But 461 00:30:39,800 --> 00:30:42,040 Speaker 1: they didn't go into a lot of detail. More to 462 00:30:42,080 --> 00:30:46,000 Speaker 1: the point of this episode, recently, Systrom said that it 463 00:30:46,120 --> 00:30:49,080 Speaker 1: is important for social media companies to be policed well 464 00:30:49,520 --> 00:30:53,400 Speaker 1: and that misinformation and harassment is a growing concern. He 465 00:30:53,480 --> 00:30:56,160 Speaker 1: even referenced deep fakes, which I talked about in a 466 00:30:56,280 --> 00:30:59,720 Speaker 1: very recent episode. But of course that's just one way 467 00:30:59,800 --> 00:31:03,400 Speaker 1: someone could misrepresent a person, from faked video footage 468 00:31:03,440 --> 00:31:07,000 Speaker 1: to faked audio footage, or recordings I guess I should 469 00:31:07,000 --> 00:31:11,719 Speaker 1: say, to photoshopped images, to smear campaigns. There are tons 470 00:31:11,760 --> 00:31:14,320 Speaker 1: of different ways for people to be pretty darn awful 471 00:31:14,360 --> 00:31:17,720 Speaker 1: to each other and to also reach a huge audience 472 00:31:17,760 --> 00:31:21,840 Speaker 1: to boot, because social media platforms have a very broad reach. 473 00:31:22,240 --> 00:31:25,680 Speaker 1: This goes beyond Facebook. Obviously, Facebook is easy to talk 474 00:31:25,720 --> 00:31:28,800 Speaker 1: about because the platform is so darned huge, but these 475 00:31:28,800 --> 00:31:31,960 Speaker 1: same tactics work on other social media platforms as well. 476 00:31:32,200 --> 00:31:34,360 Speaker 1: I don't mean to say that Facebook is the only 477 00:31:34,440 --> 00:31:38,560 Speaker 1: one that is vulnerable to this sort of thing.
Specifically, 478 00:31:38,920 --> 00:31:42,400 Speaker 1: Systrom said at the conference, quote, you start to realize 479 00:31:42,480 --> 00:31:44,960 Speaker 1: how important it's going to be for the future of 480 00:31:44,960 --> 00:31:48,360 Speaker 1: the world that we police these things well, that we 481 00:31:48,440 --> 00:31:52,760 Speaker 1: take it very seriously and put real resources against solving 482 00:31:52,800 --> 00:31:56,280 Speaker 1: the problems now that you're at this scale. End quote. 483 00:31:56,720 --> 00:32:01,560 Speaker 1: Now, to be clear, Systrom wasn't necessarily calling for outside regulation, 484 00:32:01,720 --> 00:32:04,560 Speaker 1: but rather the need for policing the platforms, which could 485 00:32:04,560 --> 00:32:07,280 Speaker 1: come from within. It would not have to be a 486 00:32:07,320 --> 00:32:10,840 Speaker 1: formal set of laws. His point, though, was that it 487 00:32:11,080 --> 00:32:15,600 Speaker 1: is necessary whether it's internal or external, and that the 488 00:32:15,640 --> 00:32:19,040 Speaker 1: added expense of monitoring users and responding quickly in the 489 00:32:19,080 --> 00:32:22,520 Speaker 1: event of someone trying to spread lies or harass others 490 00:32:22,680 --> 00:32:27,760 Speaker 1: is absolutely critical. US Senator Mark Warner's office published a 491 00:32:27,800 --> 00:32:32,720 Speaker 1: paper describing a regulatory scheme for social media platforms after 492 00:32:32,840 --> 00:32:37,479 Speaker 1: Zuckerberg's appearances in front of Congress. This proposed policy covered 493 00:32:37,520 --> 00:32:40,840 Speaker 1: stuff like media literacy programs that are aimed at helping 494 00:32:40,880 --> 00:32:43,479 Speaker 1: people so they can determine if the information they are 495 00:32:43,600 --> 00:32:47,200 Speaker 1: encountering online is legitimate or if it's fake. It also 496 00:32:47,280 --> 00:32:51,360 Speaker 1: called for more funding of military and intelligence agencies so 497 00:32:51,400 --> 00:32:54,880 Speaker 1: that they can focus on misinformation campaigns from other countries 498 00:32:54,920 --> 00:32:58,040 Speaker 1: that are aimed at affecting domestic politics. Essentially, the policy 499 00:32:58,120 --> 00:33:02,760 Speaker 1: was saying, we've gotten pretty good at detecting hacking attempts 500 00:33:02,760 --> 00:33:06,680 Speaker 1: and infiltration attempts. You know, not flawless, but we're 501 00:33:07,280 --> 00:33:09,320 Speaker 1: aware of a lot of the tricks people use in 502 00:33:09,400 --> 00:33:12,920 Speaker 1: order to infiltrate systems. What we're not good at is 503 00:33:12,960 --> 00:33:16,160 Speaker 1: combating these misinformation campaigns, and we need to put money 504 00:33:16,200 --> 00:33:19,520 Speaker 1: aside to get better about doing that. The policy also 505 00:33:19,560 --> 00:33:22,160 Speaker 1: calls for social media platforms to do more to ensure 506 00:33:22,200 --> 00:33:25,400 Speaker 1: that the accounts made on those platforms are in fact 507 00:33:25,480 --> 00:33:29,040 Speaker 1: legitimate and not just run by a bot. If they 508 00:33:29,080 --> 00:33:30,760 Speaker 1: are run by a bot and it's all on the 509 00:33:30,800 --> 00:33:33,400 Speaker 1: up and up, it should be labeled as such so 510 00:33:33,440 --> 00:33:36,479 Speaker 1: that users aren't misled into thinking that a bot account 511 00:33:36,560 --> 00:33:40,560 Speaker 1: represents a real human being.
It also calls for 512 00:33:40,640 --> 00:33:44,480 Speaker 1: platforms to be held legally liable for failing to take 513 00:33:44,520 --> 00:33:49,480 Speaker 1: down posts that include stuff like, quote, defamation, invasion of privacy, 514 00:33:49,840 --> 00:33:54,720 Speaker 1: false light, and public disclosure of private facts, end quote. Also, 515 00:33:54,760 --> 00:33:56,880 Speaker 1: the companies would be held accountable if they failed to 516 00:33:56,920 --> 00:34:00,360 Speaker 1: take down fabricated video or audio if a victim had 517 00:34:00,360 --> 00:34:05,120 Speaker 1: secured a necessary judgment regarding the sharing of that content. 518 00:34:05,960 --> 00:34:09,160 Speaker 1: And they also pointed at the European Union's General Data 519 00:34:09,200 --> 00:34:12,080 Speaker 1: Protection Regulation, or GDPR, rules. I covered 520 00:34:12,120 --> 00:34:15,160 Speaker 1: that in an episode earlier this year. That would put 521 00:34:15,200 --> 00:34:20,480 Speaker 1: some pretty extensive privacy protections in place for Internet users if they 522 00:34:20,480 --> 00:34:24,040 Speaker 1: were to try and copy that. The paper itself wasn't 523 00:34:24,080 --> 00:34:26,239 Speaker 1: a draft of any sort of legislation. It wasn't a 524 00:34:26,280 --> 00:34:30,880 Speaker 1: proposed law. It was more of a broader policy suggestion 525 00:34:31,360 --> 00:34:34,200 Speaker 1: and a call for a discussion about those topics that 526 00:34:34,320 --> 00:34:38,400 Speaker 1: could lead to more actionable plans. The authors of the 527 00:34:38,440 --> 00:34:42,400 Speaker 1: paper admit that the ideas they propose may have flaws, 528 00:34:42,640 --> 00:34:45,880 Speaker 1: that they may not fully understand the situation or the 529 00:34:45,920 --> 00:34:49,400 Speaker 1: implementations that they're suggesting, and that in some cases the 530 00:34:49,480 --> 00:34:53,560 Speaker 1: proposed solutions might even undermine the goal that the solutions 531 00:34:53,560 --> 00:34:57,080 Speaker 1: were meant to achieve. I appreciate that they're very forthcoming 532 00:34:57,080 --> 00:34:59,680 Speaker 1: about this because one of the big problems we saw 533 00:34:59,760 --> 00:35:03,960 Speaker 1: in the initial congressional hearing with Mark Zuckerberg was that 534 00:35:04,000 --> 00:35:07,640 Speaker 1: a lot of these politicians are not exactly clued in 535 00:35:08,120 --> 00:35:11,520 Speaker 1: to the way social media works. Not a big surprise. 536 00:35:12,080 --> 00:35:18,399 Speaker 1: There's a fairly, let's call it, statesmanlike nature to 537 00:35:19,560 --> 00:35:22,799 Speaker 1: the Congress in the United States. That's a good way 538 00:35:22,840 --> 00:35:25,239 Speaker 1: of saying, a lot of them are old and are 539 00:35:25,280 --> 00:35:27,280 Speaker 1: a little out of touch, at least when it comes 540 00:35:27,360 --> 00:35:32,480 Speaker 1: to the technological side of things. So this paper was 541 00:35:32,520 --> 00:35:34,560 Speaker 1: really meant more as a call to action to get 542 00:35:34,600 --> 00:35:37,040 Speaker 1: an official stance on how it would be best to 543 00:35:37,120 --> 00:35:41,640 Speaker 1: approach social media platforms as they play increasingly important roles 544 00:35:41,640 --> 00:35:44,880 Speaker 1: in the way people get and share information.
The paper 545 00:35:44,960 --> 00:35:48,600 Speaker 1: did not call for Facebook or Google or any other 546 00:35:48,760 --> 00:35:51,560 Speaker 1: big company that plays in this social media space to 547 00:35:51,640 --> 00:35:55,040 Speaker 1: get broken up into smaller companies. That is something that 548 00:35:55,200 --> 00:35:59,160 Speaker 1: some activists have called for, arguing that these companies represent an 549 00:35:59,160 --> 00:36:02,719 Speaker 1: effective monopoly in various industries, and that as a result, 550 00:36:03,239 --> 00:36:06,560 Speaker 1: they have been able to dictate the conversation and sort 551 00:36:06,560 --> 00:36:10,880 Speaker 1: of bully their way into favorable positions and favorable treatment 552 00:36:10,920 --> 00:36:15,440 Speaker 1: from the government. The general consensus on the policy paper 553 00:36:15,480 --> 00:36:18,279 Speaker 1: that I saw was that it would likely not make 554 00:36:18,440 --> 00:36:21,520 Speaker 1: much headway, and the only real chance it would have 555 00:36:21,680 --> 00:36:24,200 Speaker 1: of getting any real momentum would be if there had 556 00:36:24,280 --> 00:36:27,520 Speaker 1: been a really big shift in Congress after the two 557 00:36:27,520 --> 00:36:30,960 Speaker 1: thousand eighteen mid term elections, and there was a pretty 558 00:36:30,960 --> 00:36:34,000 Speaker 1: big shift. The Democratic Party picked up more than enough 559 00:36:34,000 --> 00:36:36,840 Speaker 1: seats to take control of the House of Representatives, but 560 00:36:36,920 --> 00:36:40,040 Speaker 1: the Senate still remains a Republican majority, so it's hard 561 00:36:40,080 --> 00:36:42,000 Speaker 1: to say if this is going to see any progress. 562 00:36:42,040 --> 00:36:47,200 Speaker 1: It may come down to, uh, partisan lines, where depending 563 00:36:47,239 --> 00:36:51,200 Speaker 1: upon which side proposes it, the other side might strike 564 00:36:51,239 --> 00:36:55,040 Speaker 1: it down not because of the merits of the ideas, 565 00:36:55,080 --> 00:36:59,480 Speaker 1: but because the other side suggested it. Because again, politics 566 00:36:59,520 --> 00:37:03,840 Speaker 1: get ugly, and sometimes politicians act like big old babies 567 00:37:04,520 --> 00:37:06,680 Speaker 1: where if it's not their idea, it can't be a 568 00:37:06,719 --> 00:37:11,720 Speaker 1: good idea. And again, I apply this to both sides, y'all. 569 00:37:11,840 --> 00:37:15,160 Speaker 1: I have my own personal political beliefs, 570 00:37:15,160 --> 00:37:17,960 Speaker 1: but I have no illusions that the party that I 571 00:37:18,040 --> 00:37:21,560 Speaker 1: support is any better about that than the other party. 572 00:37:21,760 --> 00:37:25,000 Speaker 1: Facebook, meanwhile, has changed its policy in many ways to 573 00:37:25,120 --> 00:37:28,399 Speaker 1: police content more effectively. It launched a new report called 574 00:37:28,440 --> 00:37:32,400 Speaker 1: the Community Guidelines Enforcement Report, which goes into the policing 575 00:37:32,400 --> 00:37:35,239 Speaker 1: efforts that Facebook is engaged in, including how many fake 576 00:37:35,320 --> 00:37:38,600 Speaker 1: accounts it has deleted. According to the initial report, Facebook 577 00:37:38,640 --> 00:37:43,800 Speaker 1: deleted one point five billion fake accounts in just six months.
578 00:37:44,480 --> 00:37:47,360 Speaker 1: After the New York Times piece, Mark Zuckerberg announced that 579 00:37:47,360 --> 00:37:50,360 Speaker 1: the company will produce a report like that every quarter, 580 00:37:50,719 --> 00:37:54,359 Speaker 1: rather than, you know, annually or semi-annually. Zuckerberg also 581 00:37:54,440 --> 00:37:58,080 Speaker 1: published a four thousand, five hundred word outline or blueprint 582 00:37:58,239 --> 00:38:01,080 Speaker 1: on how the company is going to move forward with 583 00:38:01,200 --> 00:38:05,440 Speaker 1: content moderation. In that piece, Zuckerberg wrote, and I quote, 584 00:38:05,840 --> 00:38:09,120 Speaker 1: one of the biggest issues social networks face is that 585 00:38:09,239 --> 00:38:14,800 Speaker 1: when left unchecked, people will engage disproportionately with more sensationalist 586 00:38:14,880 --> 00:38:18,960 Speaker 1: and provocative content. This is not a new phenomenon. It 587 00:38:19,080 --> 00:38:21,880 Speaker 1: is widespread on cable news today and has been a 588 00:38:21,920 --> 00:38:25,600 Speaker 1: staple of tabloids for more than a century. At scale, 589 00:38:25,920 --> 00:38:29,160 Speaker 1: it can undermine the quality of public discourse and lead 590 00:38:29,200 --> 00:38:32,920 Speaker 1: to polarization. In our case, it can also degrade the 591 00:38:33,000 --> 00:38:37,240 Speaker 1: quality of our services. Our research suggests that no matter 592 00:38:37,400 --> 00:38:40,120 Speaker 1: where we draw the lines for what is allowed, as 593 00:38:40,160 --> 00:38:43,320 Speaker 1: a piece of content gets close to that line, people 594 00:38:43,320 --> 00:38:46,960 Speaker 1: will engage with it more on average, even when they 595 00:38:47,000 --> 00:38:51,359 Speaker 1: tell us afterwards they don't like the content. So how 596 00:38:51,440 --> 00:38:53,279 Speaker 1: is Facebook going to respond to that? How are they 597 00:38:53,320 --> 00:38:56,680 Speaker 1: going to put a cap on that? According to Zuckerberg, 598 00:38:56,920 --> 00:39:00,680 Speaker 1: they're going to train AI models to recognize when 599 00:39:00,680 --> 00:39:06,239 Speaker 1: a piece is sensationalist, when it is either fake news 600 00:39:06,360 --> 00:39:11,160 Speaker 1: or it's misrepresenting the facts, or it is inflammatory on 601 00:39:11,280 --> 00:39:15,120 Speaker 1: purpose, and then the system will automatically be able to 602 00:39:15,160 --> 00:39:18,160 Speaker 1: remove those items, which seems kind of interesting, especially since 603 00:39:18,200 --> 00:39:22,440 Speaker 1: we just finished all those pieces about how AI is 604 00:39:22,480 --> 00:39:26,880 Speaker 1: not infallible. But the alternative would be to employ human 605 00:39:26,920 --> 00:39:30,799 Speaker 1: beings to go through billions of posts every day, which 606 00:39:30,840 --> 00:39:36,080 Speaker 1: doesn't seem like it's particularly realistic either. Zuckerberg also said 607 00:39:36,320 --> 00:39:39,360 Speaker 1: that the company would seek out an independent oversight body 608 00:39:39,400 --> 00:39:42,279 Speaker 1: to review any appeals made by people who had their 609 00:39:42,280 --> 00:39:46,040 Speaker 1: content removed from the platform. Zuckerberg doesn't expect that such 610 00:39:46,040 --> 00:39:47,920 Speaker 1: a body will be ready to go until the end 611 00:39:47,960 --> 00:39:50,640 Speaker 1: of twenty nineteen at the earliest.
But the goal here 612 00:39:50,920 --> 00:39:53,680 Speaker 1: is to create an entity that can review these appeals 613 00:39:54,280 --> 00:39:58,239 Speaker 1: and do so objectively, and remove the possibility that Facebook's 614 00:39:58,239 --> 00:40:01,520 Speaker 1: algorithms are operating with a bias against certain groups. So 615 00:40:02,040 --> 00:40:07,160 Speaker 1: let's say it's conservative news. If the news items are 616 00:40:07,360 --> 00:40:12,239 Speaker 1: not against Facebook's policies, if they are objective, you know, 617 00:40:12,360 --> 00:40:16,279 Speaker 1: they are fact based, they're not misrepresenting things, and they're 618 00:40:16,320 --> 00:40:19,920 Speaker 1: not inflammatory, but they're still getting removed, then conservative groups 619 00:40:19,920 --> 00:40:24,360 Speaker 1: could legitimately say, hey, your algorithms are targeting us based 620 00:40:24,440 --> 00:40:29,560 Speaker 1: upon our political stance, but we're not lying, we're not 621 00:40:29,640 --> 00:40:33,760 Speaker 1: misrepresenting the truth. This board would be able to review 622 00:40:33,800 --> 00:40:36,399 Speaker 1: the appeals and say, you know what, you're right, that 623 00:40:36,520 --> 00:40:39,560 Speaker 1: piece is completely legitimate, we're going to allow it on Facebook, 624 00:40:39,840 --> 00:40:42,200 Speaker 1: or they might say, I see what you're saying, but 625 00:40:42,280 --> 00:40:45,279 Speaker 1: this piece violates our policy because of X, Y, and Z. 626 00:40:45,880 --> 00:40:48,640 Speaker 1: As of the recording of this podcast, this story is 627 00:40:48,840 --> 00:40:52,440 Speaker 1: still unfolding and a lot of people are angry, including 628 00:40:52,440 --> 00:40:54,960 Speaker 1: a lot of politicians, and it's likely that Facebook is 629 00:40:54,960 --> 00:40:56,680 Speaker 1: going to have to wade through a lot more political 630 00:40:56,680 --> 00:40:59,680 Speaker 1: scrutiny in the near future. So I'll probably have to 631 00:40:59,719 --> 00:41:02,440 Speaker 1: revisit this sometime in the future, 632 00:41:02,520 --> 00:41:05,799 Speaker 1: but I wanted to cover it now because it is 633 00:41:05,920 --> 00:41:09,359 Speaker 1: a very fresh story and it brings up a lot 634 00:41:09,360 --> 00:41:13,839 Speaker 1: of very interesting questions. Because if Facebook's a publisher, at 635 00:41:13,840 --> 00:41:17,800 Speaker 1: what point is it able to, or at what point should 636 00:41:17,800 --> 00:41:22,360 Speaker 1: it, step in to moderate things? I mean, it's a 637 00:41:22,400 --> 00:41:25,160 Speaker 1: private company, or rather a publicly traded company, but it is a company, 638 00:41:25,280 --> 00:41:27,640 Speaker 1: not a government, so it can choose what can and 639 00:41:27,680 --> 00:41:32,040 Speaker 1: cannot be shown on its platform. That's been the case forever. 640 00:41:32,160 --> 00:41:33,799 Speaker 1: Facebook has been able to do that all the time. 641 00:41:34,040 --> 00:41:37,960 Speaker 1: It's just they haven't really enforced it a whole lot. 642 00:41:38,000 --> 00:41:39,880 Speaker 1: I'm very curious to see how this unfolds, because you're 643 00:41:39,880 --> 00:41:42,680 Speaker 1: gonna see different groups react in different ways. There are going 644 00:41:42,719 --> 00:41:46,799 Speaker 1: to be civil rights groups that might say there are some 645 00:41:46,840 --> 00:41:49,399 Speaker 1: freedom of speech problems here.
There are going to be 646 00:41:49,680 --> 00:41:54,439 Speaker 1: other groups that say you're not doing enough because there are 647 00:41:54,480 --> 00:41:59,439 Speaker 1: still problematic posts being made on your platform. It's gonna 648 00:41:59,560 --> 00:42:03,480 Speaker 1: be a rough, uneven road, I think, for a while. 649 00:42:04,320 --> 00:42:08,080 Speaker 1: But I'm glad to see that some serious discussion is 650 00:42:08,120 --> 00:42:13,480 Speaker 1: being held about these issues. It's a shame that it 651 00:42:13,880 --> 00:42:16,920 Speaker 1: seems to be largely in response to this exposé 652 00:42:16,960 --> 00:42:18,920 Speaker 1: piece from the New York Times. You would hope that 653 00:42:18,960 --> 00:42:24,480 Speaker 1: people would take these initiatives without that kind of public pressure, 654 00:42:24,680 --> 00:42:28,799 Speaker 1: but sometimes that's what it takes. Anyway, that's what's going 655 00:42:28,840 --> 00:42:31,360 Speaker 1: on so far. We will revisit this sometime in the 656 00:42:31,400 --> 00:42:34,800 Speaker 1: future if there's more to say about it. And uh, 657 00:42:34,880 --> 00:42:37,319 Speaker 1: I hope you guys have enjoyed this episode. I hope 658 00:42:37,360 --> 00:42:41,640 Speaker 1: you're all having a great holiday week, for those of 659 00:42:41,680 --> 00:42:44,359 Speaker 1: you who are listening to this when it publishes, and 660 00:42:44,440 --> 00:42:46,839 Speaker 1: I look forward to talking to you again soon. If 661 00:42:46,880 --> 00:42:51,000 Speaker 1: you have any ideas for episodes, why not visit our 662 00:42:51,040 --> 00:42:53,799 Speaker 1: website, tech Stuff podcast dot com. You can learn more 663 00:42:53,840 --> 00:42:55,920 Speaker 1: about the show there, and you can email us at tech 664 00:42:56,040 --> 00:43:00,680 Speaker 1: Stuff at how stuff works dot com. I'll get those messages. 665 00:43:00,719 --> 00:43:03,600 Speaker 1: You'll also find ways to contact me via Twitter or 666 00:43:03,680 --> 00:43:06,680 Speaker 1: Facebook up on that web page. Don't forget to go 667 00:43:06,760 --> 00:43:09,279 Speaker 1: visit our store. It's over at t public dot com 668 00:43:09,320 --> 00:43:11,960 Speaker 1: slash tech Stuff. You can buy merchandise there. Every purchase 669 00:43:12,160 --> 00:43:15,120 Speaker 1: goes to help the show. We greatly appreciate it. And 670 00:43:15,280 --> 00:43:18,080 Speaker 1: don't forget, we are also nominated in the I Heart 671 00:43:18,239 --> 00:43:21,439 Speaker 1: Radio Podcast Awards. You can go to the I Heart 672 00:43:21,560 --> 00:43:25,520 Speaker 1: Radio Podcast Awards website. You can log in, you can 673 00:43:25,600 --> 00:43:29,400 Speaker 1: vote up to five times a day for your favorite podcast. 674 00:43:29,480 --> 00:43:33,120 Speaker 1: You can even dedicate all five votes to tech Stuff 675 00:43:33,400 --> 00:43:37,120 Speaker 1: if you are so inclined. And if you vote enough 676 00:43:37,800 --> 00:43:41,799 Speaker 1: and tech Stuff wins, I will have no choice but 677 00:43:41,920 --> 00:43:44,640 Speaker 1: to go up onto a very large stage in front 678 00:43:44,640 --> 00:43:48,440 Speaker 1: of a whole lot of fancy people and accept an award. 679 00:43:49,040 --> 00:43:51,520 Speaker 1: And that is terrifying. So if you want to scare me, 680 00:43:52,280 --> 00:43:55,320 Speaker 1: vote for tech Stuff, and I'll talk to you again 681 00:43:56,280 --> 00:44:04,440 Speaker 1: really soon. For more on this and thousands of other topics,
682 00:44:04,440 --> 00:44:15,920 Speaker 1: visit how stuff works dot com.