Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and I love all things tech, and we're continuing our story about Cambridge Analytica, and also continuing Jonathan's slow descent toward laryngitis. I'm all right at the moment. I've got my cup of water with me, but I am recording this immediately after recording the previous two episodes of TechStuff, so if I start getting particularly raspy, I apologize, but it's because my health is slowly failing.

In our last episode, I talked about how two old Etonians, first Nigel Oakes and later Alexander Nix, formed a company that depended heavily on psychological information of questionable utility to build a consulting business that would take that data and create action items for clients. They focused mainly on political clients, with Nix pushing to focus more exclusively on elections, and I ended just as the organization they created, Strategic Communication Laboratories, or SCL, had spun off a new entity backed by American billionaires. This new entity was called Cambridge Analytica. That entity incorporated in Delaware on December 31, 2013, the very last day of the year.

Before incorporating, Nix had led a few attempts to get cozy with American politics, but things had not really gone well. The Republican candidate for governor of Virginia was Ken Cuccinelli, and a political action committee supporting Cuccinelli had hired SCL. Now remember, this was before Cambridge Analytica had become an official thing. SCL was supposed to create a list of voters who could be persuaded to support Cuccinelli. That was what they were contracted to do. But the deadline came and went and still no list, and eventually the political action committee, the PAC, cut ties with the firm, saying, well, you're not delivering on your promise, so we're not going to do business anymore.
But SCL had also worked with another PAC in Virginia, also supporting Cuccinelli. This one was called Americans for Limited Government, and they wanted SCL to provide the group with a list of women voters from suburban areas who traditionally would vote Democrat, but who were thought to possibly flip Republican. SCL eventually produced a list of women and handed it over, but upon closer inspection it turned out that this was a list of Republican supporters, not flippable Democrats — in other words, people who were already kind of promising to vote that way. So SCL and Nix had not performed very well in 2013, and if the Mercers, the billionaires who had decided to help form Cambridge Analytica, had learned about those failures, maybe they wouldn't have funded the founding of Cambridge Analytica at all.

Steve Bannon became a driving force behind the scenes at Cambridge Analytica. He was an executive over at Breitbart News at the time. According to Christopher Wylie — the data analyst who had first come on board to look into using data from various sources to support SCL's efforts, and who was the one to recommend using social media and apps similar to the one Stillwell had created over at the University of Cambridge — Bannon was really interested in using social media and messaging as a way to wage psychological warfare to push a particular political philosophy. In fact, to quote Wylie, Cambridge Analytica was, quote, "Bannon's arsenal of weaponry to wage a culture war on America using military strategies," end quote. He said Bannon was absolutely fascinated with the idea of using military tactics to wage a political war in a campaign. The Mercers, meanwhile, used their leverage as wealthy GOP backers to convince many different campaigns to actually go ahead and make use of Cambridge Analytica.
So, out of the eight federal-level clients that Cambridge Analytica won in 2013 and 2014, all eight were receiving financial backing from the Mercers. How about that. For example, in 2014, Robert Mercer donated one million dollars to John Bolton's super PAC, and then that same super PAC went and hired Cambridge Analytica for consulting work. Kind of odd. The reviews of the work Cambridge Analytica was returning were not great; they weren't spectacular. Apart from one US Senate race in North Carolina — that one was pretty positive about Cambridge Analytica — a lot of other reports were saying they aren't really delivering on what they promised. But the hype continued. Nix was really, really good at selling an idea. Never mind that the idea did not really have a solid foundation in academic scholarship or published scientific research, or even proof that their efforts were effective in the field. He was still really selling it, and the company began to test certain messages that would later show up in a future campaign.

Now, this was during the midterm elections of 2014, and the messages they were testing out in different regions appeared to be firmly rooted in Steve Bannon's philosophy. This is when we started seeing phrases like "build a wall" or "drain the swamp" being tested among different Republican populations. Messages that were critical of immigrants and ones that criticized big government would roll into rotation and get tested out, and the company also began testing the waters to see what the general opinion among Republican voters was about a certain Vladimir Putin, the President of Russia. And this was all during those midterms of 2014, years before those messages would be used in greater rollouts in 2016.
In 2015, Ted Cruz announced his candidacy for president, although that announcement did not go as planned. His campaign, before they had even announced that it was a campaign, had secured the services of Cambridge Analytica to design the campaign website, the official website for Ted Cruz running for president. It did not start off great. Ted Cruz sent out a tweet before midnight, the day before he was going to announce his candidacy, and it was a very teasing tweet. Everyone knew what it really meant, because there was no shortage of pundits out there saying Ted Cruz was going to put his name in the ring for president, but the official announcement had not yet happened, and he said, just wait, it's going to come really soon. What was supposed to happen was that one minute after midnight, the official campaign website would launch, but his team couldn't get the website to work. They couldn't get it live. Once again, a product from Cambridge Analytica was not performing the way people expected. So several minutes after midnight, Cruz announced his candidacy on Twitter. He didn't get to share the link to the official campaign website — it was not live yet — he just went ahead and announced on Twitter, because the plan was not going according to plan.

Initially, a dozen Cambridge Analytica employees were on staff with the Cruz campaign, but apparently the company consistently failed to deliver results, and so that number was whittled down to three. As the campaign continued, Cambridge Analytica broke down Cruz's target audience into four categories — temperamental people, relaxed leaders, timid traditionalists, and stoic traditionalists — based off the data the company had gathered about those potential voters. The company had also created suggested messaging for each group, saying, well, for this target group, this type of messaging will work most effectively, but for this other group, you want to go with this different approach.
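To make that segment-then-message mechanic concrete, here is a toy sketch of what score-based psychographic bucketing generally looks like. Everything in it is invented for illustration: the trait names, thresholds, and suggested framings are not Cambridge Analytica's actual model, only the general shape of "score a voter, assign a segment, look up a message for that segment."

```python
# Toy illustration of psychographic segmentation: score a voter on a few
# personality-style traits, bucket them into a named segment, and pick a
# suggested message for that segment. All traits, thresholds, labels, and
# messages below are hypothetical examples, not any real firm's model.

from dataclasses import dataclass

@dataclass
class VoterProfile:
    neuroticism: float   # hypothetical trait scores in the range 0.0 - 1.0
    extraversion: float
    openness: float

def assign_segment(v: VoterProfile) -> str:
    """Map trait scores to one of four made-up segments."""
    if v.neuroticism > 0.6:
        return "temperamental"
    if v.extraversion > 0.6:
        return "relaxed leader"
    if v.openness < 0.4:
        return "stoic traditionalist"
    return "timid traditionalist"

# Hypothetical lookup table: one suggested framing per segment.
MESSAGES = {
    "temperamental": "safety-and-threat framing",
    "relaxed leader": "leadership-and-strength framing",
    "stoic traditionalist": "tradition-and-duty framing",
    "timid traditionalist": "community-and-reassurance framing",
}

voter = VoterProfile(neuroticism=0.7, extraversion=0.3, openness=0.5)
segment = assign_segment(voter)
print(segment, "->", MESSAGES[segment])  # temperamental -> safety-and-threat framing
```

The point of the sketch is just the mechanism: once every voter record carries scores like these, "which ad does this person see" becomes a table lookup, which is why the quality of the underlying scores mattered so much.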
And the whole idea was that they were targeting psychological anchor points that would give Cruz's campaign the advantage. Now, Cruz's campaign made use of data that Cambridge Analytica had gathered using Aleksandr Kogan's Facebook app, the one that had retrieved information about users' friends without those friends' consent. This was also in violation of Facebook's policies. Facebook was told that it was going to be an academic research project, and therefore the data was supposed to stay with Kogan and not change hands. But that's not what happened. Kogan essentially sold the data from his possession to Cambridge Analytica. So Ted Cruz making use of this data was big news once people heard about it. No one had really known about it at this point, but the information started to leak.

And this is a good point to talk about how Facebook shares information among users. Some of you may not use Facebook, but here's how it works. You've got yourself a news feed, and that's where posts from your friends will show up, and occasionally posts from advertisers and things like that will show up there too. And you can post your own things to Facebook: pictures, thoughts, whatever. As you probably know, there are various privacy settings you can choose before you post stuff to Facebook. You could choose public, which means anyone who is on Facebook can view that material, or you can choose different privacy settings. You can set it so that only your friends can read your posts, so anyone who isn't your friend won't see what you've put up. You can even exclude specific people, or select only specific people to be able to see it. You get the idea. Well, until a policy change that Facebook announced in 2014, if you developed an app for Facebook, you could request a friends permission on that app, and that would give you access to view a user's friends' data.
And if a user said, sure, yeah, I want to use this app, I agree to that, then that would mean you would be able to view that user's friends as if you were that user. In fact, it's like you had just become a friend to all of that user's friends. So let's use an example to make this more clear. Let's say Ben Bowlin is using Facebook, and Ben Bowlin comes across an app that I have made — an app from before that policy change — and he sees that this app asks for permission to view his friends' data. And Ben Bowlin, being the conniving jerkface that he is, and not giving a care about the privacy of his friends' data, says, sure, I want to know which house in Hogwarts I belong to. I want to take this quiz. So he agrees. Then I get to see not just all of Ben Bowlin's data, but all of Ben Bowlin's friends' data, just as if I were him, and I can collect all that data and make use of it, all because he was so thoughtless. I should give out his email address so that you can send him complaints, because he does that to me all the time on his podcasts. But I'm not gonna, because honestly, Ben Bowlin is actually a really nice guy and I like him — but it's great to use him as an example.

So this is why the number of people who may have been affected by the Cambridge Analytica scandal is so high, at eighty-seven million users, even though reportedly only 270,000 people actually took the survey and agreed to the app permissions. The average Facebook user, you see, has 338 friends. That's the average; some people have more, some have fewer. But if you multiply 270,000 by 338, you'll get more than ninety million possible friends. And granted, there's bound to be some overlap between different users.
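To put numbers on that, here is a minimal back-of-the-envelope sketch in Python. The installer count, average friend count, and the eighty-seven million figure are simply the numbers mentioned in this episode; the overlap percentage the script prints is just what those figures imply, not a measured value.

```python
# Back-of-the-envelope estimate of how many profiles the old
# "friends permission" model could expose. All figures are the ones
# cited in this episode, not exact values from Facebook.

INSTALLERS = 270_000    # people who reportedly took the survey app
AVG_FRIENDS = 338       # reported average friend count per Facebook user

# Upper bound if no two installers shared any friends.
# (The installers themselves add another ~270,000 on top of this.)
upper_bound = INSTALLERS * AVG_FRIENDS
print(f"Upper bound, assuming no overlap: {upper_bound:,}")  # 91,260,000

# Facebook's own estimate of affected users came in lower.
REPORTED_AFFECTED = 87_000_000
overlap_share = 1 - REPORTED_AFFECTED / upper_bound
print(f"Implied overlap between friend lists: about {overlap_share:.0%}")  # ~5%
```

Even allowing for that overlap, the asymmetry is the point: roughly a quarter of a million consenting users were enough to expose data on tens of millions of people who never saw the consent screen.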
I'm honestly always surprised to find friends I know from very different circles who also happen to know each other. But you can see why that eighty-seven million number got there. So in 2014, Facebook announced a change to that policy. These days, if you develop an app, you can still send a friends permission request, and Facebook will review your app before allowing it on the platform. But now that does not give you access to view all of the friends' data of a user. Instead, it lets you see a list of that user's friends who have also installed that same app. So you don't get new information about new people; you just get to know about connections between people who have already downloaded your app. You already have those other users' information, because they installed your app already. So it's a very different policy. Well, when I come back, I'll talk more about what Cambridge Analytica did and why it got into so much trouble. But first, let's take a quick break to thank our sponsor.

Cambridge Analytica used all that data to build out psychographic models of potential voters, essentially putting them into those different categorizations. About two months before the Iowa caucus — caucuses and primaries are how we decide which candidates within a party will become the official party nominee for the general election; if you're not from the United States, it gets pretty confusing — anyway, about two months before the Iowa caucus, which is the first big event that can really set the tone for a nominee, The Guardian ran a story revealing that Cambridge Analytica had possession of unauthorized Facebook data, and that therefore the Cruz campaign was depending upon information that was gathered in an unethical way, information that people did not consent to sharing. Facebook responded to this by telling Cambridge Analytica and Aleksandr Kogan that they needed to delete all that data.
But Facebook really doesn't have any enforceable authority here. The data had already changed hands, first to Aleksandr Kogan and then from Kogan to Cambridge Analytica. Christopher Wylie said that at this point he had already left Cambridge Analytica and he had deleted that data. But all he had to do was fill out a little form, click a checkbox, and send that to Facebook to say, hey, yeah, I followed your instructions. Facebook didn't take any steps to actually verify that, and he says he could have kept all of the data and Facebook never would have known. And years later, people said that the information gathered by the initial app was still readily available. The New York Times reported that the data, at least some of it, remained in use.

Cruz narrowly won in Iowa, and Alexander Nix was very quick to jump on that news and boast about how his company, Cambridge Analytica, had helped push Cruz to victory, and how Cruz had started from a very low place, maybe polling at single-digit percentages — although that was not, you know, true. Others on Cruz's campaign staff were not very quick to give any credit to Cambridge Analytica, but Nix was selling his idea to a bigger audience now. Staffers on the campaign were concerned that Cambridge Analytica wasn't forthcoming in how it was coming up with the various categories for voters. They were not transparent in how they were making these determinations. So the campaign staff says, we can't really tell how well we're doing; we don't even know how they're arriving at these determinations, so we can't judge how well they're performing. In the United States, in order to become a nominee, like I said, you have to participate in these caucuses and primaries.
By the time Cruz's campaign was working in South Carolina, staffers noticed that the data they were getting from Cambridge Analytica was woefully out of date. Cruz would come in third place in South Carolina, and you didn't see Nix jumping onto the news to talk about that problem. Cruz's campaign stopped relying on Cambridge Analytica shortly thereafter, saying, you know, I don't think this is really working out; I don't think we're getting the return we were promised. Now, Nix did have the chutzpah to appear at the Concordia Summit and claim that Cambridge Analytica helped push Ted Cruz to a second-place finish for the Republican nomination. The story Nix was selling was that Cruz had been so far behind that getting second place was an enormous achievement all by itself, and that the only reason he was able to get there was because Cambridge Analytica had lifted him much further than he was going to go on his own. That's not necessarily an accurate portrayal of what happened, as I understand it from the various reports I've read, but it appeared that that was Alexander Nix's messaging. His next move was to take aim at the Trump campaign, to see if Cambridge Analytica could get hired on to help in those efforts. He had already attempted once before, but Trump was not interested in hiring a political consultant that was already working for one of his opponents.

Nix had also, around this time, allegedly reached out to Julian Assange, the founder of WikiLeaks. Assange had said that he got hold of internal emails from the Clinton campaign, and these are the same emails we later discovered were harvested in a cyber attack led by Russian hackers. So Russian hackers attack the Clinton campaign's email servers, get access to these emails, and send them to Julian Assange over at WikiLeaks, who then says, hey, I've got these emails.
And then Alexander Nix reaches out to Julian Assange and says, I really think we should talk about the possibility of making use of this information. According to an article in Mother Jones, Nix really wanted to weaponize those emails, but Assange ended up passing on that offer. Alexander Nix would go on to claim that his company had essentially done all the marketing and messaging for the Trump campaign, that Cambridge Analytica was ultimately responsible for developing the approach Trump used during the whole election, and that Cambridge Analytica effectively won the presidency for Trump — that it was because of Cambridge Analytica's research that Trump was able to target the voters who would swing the election his way. He could lose the popular vote, and he did, by nearly three million votes, but he won in areas that got him the electoral votes he needed to win the presidency, and Alexander Nix said this was because Cambridge Analytica had done the research and knew whom to target, and that's why Trump won.

Others in Trump's campaign are not so quick to credit Cambridge Analytica. They dispute these claims. They say that the firm was not instrumental in Trump's messaging, that most of those efforts were being funneled directly into Facebook from the campaign itself rather than through Cambridge Analytica, and that the consulting firm had even screwed up some major TV ad deals, serving ads in locations that were a complete bust, and that they were handling the campaign in a very amateurish way. So, in other words, the story Nix was telling didn't seem to match up with what other people who had been involved in these campaigns had to say about the matter. Steve Bannon, who would become the chief political strategist for Trump, for a while anyway, remained more or less in charge of Cambridge Analytica until April 2017, which was several months after Trump had taken office — which, I don't know, sounds like a conflict to me.
By the end of 2017, Cambridge Analytica had withdrawn from pursuing consulting gigs in US politics. The official reason Alexander Nix gave was that there were going to be too many other political consulting firms in the mix, which meant more sharks in the water and not enough food. It was going to be too competitive, not lucrative enough, so the company was going to look elsewhere. But a lot of the reports I read were very skeptical of this claim, because if, in fact, you were a consulting firm that had just helped a candidate win the position of President of the United States, you would think you could leverage that into a very lucrative selling point for future campaigns. What these journalists have said is that, effectively, Cambridge Analytica had been underdelivering on its promises regularly and repeatedly; that perhaps those psychographic models the company had been touting may not have been nearly as effective as the company was suggesting; and that it also had a general lack of knowledge about how US politics works — these Brits were coming into a US system and they didn't really understand the way politics works in the United States. All of this meant that people were viewing Cambridge Analytica as a bad consulting firm, at least for American politics, and it essentially landed the company on a do-not-hire list for both parties. So the general consensus seemed to be that Trump had won the presidency despite the participation of Cambridge Analytica, not because of it. So you've got two different stories here. You have Alexander Nix saying, we're not going to get involved in 2018 because there's not enough money in it, and you have everyone else saying, you're not going to get involved in 2018 because nobody wants you. 2018 was when things really fell apart for Cambridge Analytica.
Over in the United Kingdom, the BBC sent in an undercover reporter to look into SCL and Cambridge Analytica, which were still claiming to be two separate entities but were operating more or less as a single one. This reporter was able to capture on tape discussions in which Alexander Nix himself talked about spreading misinformation on purpose, and even blackmailing political opponents of clients through entrapment — essentially by hiring sex workers to proposition those opponents and then using that as a way of blackmailing them. This came to light; it was published, it aired, and then Alexander Nix tried to shrug the whole thing off, saying, oh, no, no, no, I wasn't talking about things we would actually do. I was engaging in this ridiculous hypothetical situation as a way to try and win a client, but we would never actually do any of that. And a lot of the people who know Alexander Nix say, yeah, this is kind of in line with his approach: anything to land that sale. So would the company have engaged in these behaviors? Maybe not. Maybe that was all talk, but it's not good talk, especially not good talk to have on tape and played out in public. This, by the way, would end up resulting in Cambridge Analytica suspending Alexander Nix in response to that report coming out, so he would be suspended as CEO and effectively removed from power. Over in the United States, investigations into Russian interference in the 2016 elections extended over to Cambridge Analytica, and Nix was called to appear before the House Intelligence Committee via video conference, so he didn't have to travel to the States. His testimony was never made public. According to Alexander Nix, he was only asked five questions — or maybe it was three — and it was a total breeze; he said he was in and out in like five minutes.
Also in 2018, the British Parliament called Nix to appear before them to discuss another event that happened in 2016, which was the UK's vote to leave the European Union, also known as Brexit. The general policy at SCL was that they weren't going to get involved in local politics — remember, SCL and Cambridge Analytica were both located in the UK. But the company had also released a statement saying it was forming a partnership with an organization campaigning in favor of Brexit. Alexander Nix said his company never actually did any work; it was something that was announced but never came to fruition, according to him. Investigations into that matter continue, because they involve campaign finance issues — rules about what, where, and when money can be spent, and how these efforts can be coordinated or not coordinated. It gets really super complicated, and it involves another organization located in Canada, an organization that has connections to Christopher Wylie, the data analyst who became a whistleblower on Cambridge Analytica. But that probably merits its own episode, and that investigation is still ongoing, so rather than do a half-finished episode, I think I'd rather wait to hear more about it. Well, I do have a little bit more to say about Cambridge Analytica and the fallout that resulted from the various reports, but let's take another quick break to thank our sponsor.

Christopher Wylie would tell The Observer that in July 2014, he and Alexander Nix met with executives of a company called Lukoil. Lukoil is the second-largest oil company in Russia, and the head of Lukoil was Vagit Alekperov, who was close friends with a guy you might have heard of, a guy named Vladimir Putin. Wylie told The Observer that at the time he was confused about why Cambridge Analytica was being called in to have a meeting with a Russian oil company in the first place.
I mean, why would an oil company want to know how Cambridge Analytica was targeting and profiling American voters and sending out messaging to American voters? According to an email, Alexander Nix had said that the purpose was to explain how the work they were doing in US elections could apply to the Russian oil business, which left Wylie totally at a loss. He said, I don't see where this connection is. And according to Wylie, the presentation focused on ways to disrupt and affect elections. The entire presentation was really geared toward what Cambridge Analytica was doing with the data it had in the United States, and that seems to be a pretty strange presentation to give to a petroleum company. Coincidentally, or maybe not so coincidentally, American intelligence indicates that it was right around this time that Russian hackers began targeting social media platforms to plant propaganda and fake stories in an effort to affect elections in the United States, as well as push for Brexit in the UK. So this is where that fake news stuff starts to come in: the fake news that was planted in social media to rile up different electoral bases. And so the suggestion is that perhaps this meeting between Cambridge Analytica and Lukoil might have sparked this approach. Cambridge Analytica didn't appear to actually do any work with Lukoil, didn't appear to accept a contract, but did make this presentation that may have provided the inspiration for Russia to interfere with the election through misinformation campaigns using social media as the platform. Whether Cambridge Analytica had any measurable effect on the various elections in the United States remains a matter of debate. It also is not exactly a smoking-gun case that the company contributed to Russia's interference with the 2016 election. Cambridge Analytica certainly tried to influence the election. They were hired to try and help various candidates get elected, and some of those candidates did get elected.
I mean, that was the whole point of it. But it mostly comes across as snake oil to me. I'm sure there's merit in the idea behind it — it seems intuitive that if you know a lot about a person, you can communicate with that person more effectively. But it sounds to me like Alexander Nix was largely more talk than substance when it came to producing results. This is based off the numerous reports I've read and the various outlets that have published interviews with people familiar with the company and the way it worked; I'm just drawing conclusions from those reports. Now, that doesn't change the fact that the company engaged in some really murky stuff. Leveraging Facebook data without user consent is a huge ethical problem, though the company has repeatedly denied it did any such thing, despite evidence pointing to the contrary, much of that evidence provided by Christopher Wylie.

In addition, this mess has really brought Facebook's role into focus. The company did very little to protect user data and still does very little to protect user data. So why is that? Well, ultimately the answer comes down to money. You see, user data and privacy — our data security specifically, and our privacy — are of little concern to Facebook's shareholders. Those are the people who own stock in Facebook. What they want is a return on investment. They spend money to own part of Facebook, and then they want Facebook to make way more money so that they get some of that money. So that means they want to see Facebook grow. They want to see it growing revenue, they want to see it growing users. They don't care about the quality of the service as long as it doesn't impact those numbers. Facebook could be a terrible, terrible, terrible experience, but as long as it drove that growth in revenue and in users, shareholders aren't going to care, because that's not important to them.
What's important to them is that return on investment. So Facebook's primary focus has always been on delivering returns, not on creating a better service. And that's something that can happen to any company; I'm not singling Facebook out to say they are particularly at fault here. They follow a very similar pathway that tons of other companies follow. But when it's a company that's gathering tons of personal data about all of us — largely because we're sharing it willingly, but not just through that — this becomes a concern. And I question Facebook's decision to ever allow developers to create apps that would give those developers access to the data belonging to friends of the people who downloaded those apps, friends who had never actually been given the opportunity to say yes or no to having their data shared. I think that's a terrible policy to have ever been in place, and I'm glad that it's gone, but it never should have been there. It's just not cool to say, hey, I can look at all your information because your buddy Bill said it was okay, because the rash response to that is, go take a flying leap — Bill does not have the right to give you permission to look at my information. And yet for years, that's exactly how Facebook allowed developers to create apps. As I mentioned earlier, they've since changed that policy, but it was ludicrous to have it in the first place. Not to mention this awful passing-the-buck situation that has been going on between the various parties involved in this scandal. No one is taking accountability for this. Facebook's initial response was, hey, it's not our fault. We were told this was for an academic research project, and that met our criteria, so we said, go ahead, it's fine. It's not our fault that they abused that. In fact, we told them to knock it off — as if somehow that makes it okay to share user data without consent.
Then you have Aleksandr Kogan, whose response was, nothing I did was technically illegal — which might be true, but that doesn't mean it wasn't unethical, and it was against Facebook's terms of service. Although, again, Kogan had said, I didn't read the terms of service, and clearly Facebook didn't read my terms of service, or else they would never have allowed my app to publish, so everyone is equally at fault because no one reads terms of service. And Cambridge Analytica's response was, hey, all we did was buy data; we didn't specifically hire anyone to scrape information without user consent. So no one is accepting responsibility for this, and I would argue that they all are at fault in some measure.

The story reminds us that there remains out there a gray market, and that gray market is filled with information about us. It's the data we generate as we navigate the Internet. We create it when we post on social media, we create it when we make an online purchase, we leave traces as we move from one website to another in our browsing histories, we create it in messages. It's even created about us without our active participation, and all of that data has value. There are different ways you can use that data. You can use it to sell to people, or you can use it for political purposes, as Cambridge Analytica did. And that data is out there on the gray market, and it is for sale. If you have the money, you can buy it, and it's shockingly cheap on an individual basis; it's only when you start buying in bulk that it gets really expensive. That is a disturbing thing to remember, and there's no indication that it's going to go away. As long as the money is there, it's going to stick around. Cambridge Analytica and its sister — or parent, or whatever — company, SCL Elections, are no more.
Both entities dissolved in 569 00:35:44,080 --> 00:35:48,000 Speaker 1: the spring of 2018, and while they no longer exist officially, 570 00:35:48,160 --> 00:35:51,919 Speaker 1: investigations continue into both companies, and at least in the UK, 571 00:35:52,440 --> 00:35:55,719 Speaker 1: we may see more efforts to bring charges against individuals 572 00:35:56,000 --> 00:35:59,920 Speaker 1: who worked for those companies. In the United States, it's a 573 00:36:00,000 --> 00:36:03,200 Speaker 1: little less certain. The House Intelligence Committee had an investigation, 574 00:36:03,680 --> 00:36:06,000 Speaker 1: but they shut that down. It was an investigation to 575 00:36:06,000 --> 00:36:09,799 Speaker 1: look into Russian meddling in the US election that got 576 00:36:09,800 --> 00:36:11,960 Speaker 1: shut down for reasons that I'm not going to go 577 00:36:12,000 --> 00:36:14,359 Speaker 1: into, because that gets super political and it doesn't really 578 00:36:14,400 --> 00:36:17,560 Speaker 1: have a place in this podcast. Investigations in the US 579 00:36:17,640 --> 00:36:20,480 Speaker 1: might come back at some point, but for the moment, 580 00:36:20,640 --> 00:36:23,000 Speaker 1: it's mostly in the realm of journalism and out of 581 00:36:23,040 --> 00:36:27,200 Speaker 1: the courts in the US. Meanwhile, Alexander Nix and Rebekah Mercer, 582 00:36:27,239 --> 00:36:33,000 Speaker 1: along with several others who were connected to Cambridge Analytica, 583 00:36:33,200 --> 00:36:37,040 Speaker 1: ended up organizing to create a new data analytics company 584 00:36:37,080 --> 00:36:40,239 Speaker 1: called Emerdata, E M E R D A T 585 00:36:40,480 --> 00:36:43,640 Speaker 1: A, that was established in the summer of 2017, and 586 00:36:43,719 --> 00:36:46,600 Speaker 1: its headquarters are in the same building where Cambridge Analytica 587 00:36:46,680 --> 00:36:49,800 Speaker 1: was. Now, according to Nigel Oaks, the guy who founded 588 00:36:49,800 --> 00:36:54,240 Speaker 1: the company that eventually spawned Cambridge Analytica, the whole purpose 589 00:36:54,280 --> 00:36:58,440 Speaker 1: for Emerdata was ultimately to acquire Cambridge Analytica and 590 00:36:58,800 --> 00:37:02,360 Speaker 1: SCL Elections together and form a brand new entity. 591 00:37:03,000 --> 00:37:06,279 Speaker 1: But then everything fell apart, and according to Oaks, all 592 00:37:06,280 --> 00:37:09,200 Speaker 1: of those organizations are now being wound down; in the 593 00:37:09,200 --> 00:37:13,359 Speaker 1: near future, they're not going to continue on. That's 594 00:37:13,360 --> 00:37:16,800 Speaker 1: according to Nigel Oaks. I don't know if that's official, 595 00:37:17,280 --> 00:37:19,880 Speaker 1: but that's what he says. Cogan's colleague, by the way, 596 00:37:20,080 --> 00:37:23,600 Speaker 1: who helped build the app, was Joseph Chancellor. So Cogan 597 00:37:23,680 --> 00:37:28,080 Speaker 1: and Chancellor together built the survey and the Facebook application 598 00:37:28,239 --> 00:37:31,359 Speaker 1: that was at the heart of the United States controversy. 
599 00:37:31,920 --> 00:37:34,719 Speaker 1: And here's a fun fact: Cogan would end up being 600 00:37:34,760 --> 00:37:40,239 Speaker 1: labeled by Facebook as a terrible person, well, 601 00:37:40,280 --> 00:37:42,080 Speaker 1: maybe not a terrible person, but a person 602 00:37:42,080 --> 00:37:45,359 Speaker 1: who had violated the terms of service, who had 603 00:37:45,400 --> 00:37:51,720 Speaker 1: created a tool that was in strict violation of Facebook's policies. However, 604 00:37:51,880 --> 00:37:57,359 Speaker 1: Facebook also hired Joseph Chancellor to work for Facebook, so 605 00:37:57,520 --> 00:38:01,000 Speaker 1: one of the two developers was essentially pointed at and told, 606 00:38:01,000 --> 00:38:03,400 Speaker 1: you're a criminal, and the other one was hired to 607 00:38:03,480 --> 00:38:06,640 Speaker 1: work for the company. Although to be fair, Joseph Chancellor 608 00:38:06,760 --> 00:38:11,359 Speaker 1: no longer works for Facebook. Still interesting. Probably some sour 609 00:38:11,440 --> 00:38:15,960 Speaker 1: grapes there for Cogan, I'm guessing; he's probably not terribly happy 610 00:38:15,960 --> 00:38:21,920 Speaker 1: about that. Anyway. That wraps up the confusing, complicated, sad, 611 00:38:21,960 --> 00:38:26,040 Speaker 1: and infuriating tale of Cambridge Analytica, a company that I 612 00:38:26,080 --> 00:38:30,080 Speaker 1: think ultimately was more about promising stuff that it might 613 00:38:30,120 --> 00:38:33,520 Speaker 1: not have been able to deliver than anything else. But 614 00:38:33,760 --> 00:38:36,840 Speaker 1: because it was wrapped up in this other issue with Facebook, 615 00:38:36,920 --> 00:38:41,359 Speaker 1: it definitely was thrust into the spotlight. Otherwise, I think 616 00:38:41,360 --> 00:38:44,480 Speaker 1: it probably would have just kind of faded out of 617 00:38:44,640 --> 00:38:47,920 Speaker 1: US politics at least; it might have still gotten into trouble 618 00:38:47,960 --> 00:38:50,680 Speaker 1: elsewhere in the world. But in the US people seem 619 00:38:50,760 --> 00:38:53,480 Speaker 1: to have gotten wise and just felt that it wasn't 620 00:38:53,760 --> 00:38:58,839 Speaker 1: a valuable asset to have on your team. Interesting stuff. Well, 621 00:38:58,840 --> 00:39:01,640 Speaker 1: in our next episode, we're going to shine a light 622 00:39:01,880 --> 00:39:05,520 Speaker 1: on the dark web. It was a listener request to 623 00:39:05,560 --> 00:39:07,480 Speaker 1: take a look at what the dark web is and 624 00:39:07,520 --> 00:39:11,040 Speaker 1: how it works. And then we will conclude the dark 625 00:39:11,080 --> 00:39:15,239 Speaker 1: and scary week of tech stuff, and we'll try 626 00:39:15,320 --> 00:39:18,520 Speaker 1: to talk about fun things next week, like, I don't know, 627 00:39:18,600 --> 00:39:23,080 Speaker 1: Furbies or something. I'll figure it out. Tari's nodding. She's like, please, 628 00:39:23,760 --> 00:39:26,080 Speaker 1: please pick a happy note. I'm tired of hearing about 629 00:39:26,160 --> 00:39:29,960 Speaker 1: nuclear power and dark and deceptive politics. If you have 630 00:39:30,040 --> 00:39:33,640 Speaker 1: suggestions for future episodes of Tech Stuff, whether it's a technology, 631 00:39:33,719 --> 00:39:36,600 Speaker 1: a person, a company, whatever it may be, send me 632 00:39:36,640 --> 00:39:39,560 Speaker 1: an email. 
The email address for the show is tech 633 00:39:39,600 --> 00:39:41,719 Speaker 1: stuff at how stuff works dot com or drop me 634 00:39:41,719 --> 00:39:43,879 Speaker 1: a line on Facebook or Twitter. The handle for both 635 00:39:43,920 --> 00:39:47,400 Speaker 1: of those is tech stuff h s W. You should 636 00:39:47,440 --> 00:39:50,240 Speaker 1: also go on over to tee public dot com slash 637 00:39:50,239 --> 00:39:53,040 Speaker 1: tech stuff. I was looking at your shirt and it's nice, 638 00:39:53,960 --> 00:39:56,640 Speaker 1: but it's looking like, you know, you could spruce up 639 00:39:56,680 --> 00:39:59,480 Speaker 1: your fashion a little bit, add some variety there. I 640 00:39:59,560 --> 00:40:01,960 Speaker 1: know what you need: a tech stuff T-shirt. It 641 00:40:02,000 --> 00:40:07,040 Speaker 1: makes any wardrobe into a fabulous array of clothing. It'll 642 00:40:07,120 --> 00:40:09,319 Speaker 1: brighten all the other clothes you put in there. See, look, 643 00:40:09,360 --> 00:40:12,319 Speaker 1: I'm following Cambridge Analytica's lead. I'm making promises I can't 644 00:40:12,320 --> 00:40:14,960 Speaker 1: deliver upon, but no, there really are great shirts. 645 00:40:15,040 --> 00:40:17,120 Speaker 1: You need to go check them out at tee public dot 646 00:40:17,160 --> 00:40:19,439 Speaker 1: com slash tech stuff, and don't forget to follow us over 647 00:40:19,600 --> 00:40:23,600 Speaker 1: at Instagram. I'm out and I'll talk to you again 648 00:40:24,480 --> 00:40:32,680 Speaker 1: really soon. For more on this and thousands of other topics, 649 00:40:32,920 --> 00:40:44,160 Speaker 1: visit how stuff works dot com.