Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with How Stuff Works, and I love all things tech. Just a reminder: we are in our deep, dark, scary week, where I'm talking about stuff that tends to have to do with social media and the web and communications and politics. We've got midterm elections coming up pretty soon here in the United States as I record this, so these episodes are kind of related back to that, without being too political in one stance or another. Also a reminder: I'm recovering from getting sick. I'm recording this right after I recorded the Echo Chamber episode, so my voice is going to continually decrease in quality over this episode and the following one, which will be Part Two.

So we have had tons of conversations in the United States about all things political. It's probably getting a little exhausting for those of you who are in the US. In our last episode, I talked a little bit about propaganda and how delivering the right type of information, or misinformation, depending on the message, to the right audience at the right time can really influence outcomes. In this episode and the next, we're going to look at a specific story involving a company that specialized in that type of thing, or at least claimed to specialize in that type of thing. The company was the UK-based Cambridge Analytica. If you're in the United States or the UK, you've probably heard that name. If you're outside of those countries, you might have heard it because it was at the center of pretty massive controversies that may have involved the personal data of up to 87 million Facebook users, only a small number of whom agreed to share any information with this group in the first place. You'll understand what all this means over the course of the next couple of episodes.
Now, the story is a complicated one, and it involves a lot of people and other organizations. One of the big ones we're going to talk about is Facebook, and how Facebook's policies were what made Cambridge Analytica's actions possible, or at least one part of Cambridge Analytica's actions, the part that really took a lot of focus in the spring of 2018. And then there's the responsibility that we social media users bear as well. We also play a part in this.

So what the heck was Cambridge Analytica? Because, spoiler alert, the company, at least officially, no longer exists. Well, simply put, Cambridge Analytica was in the business of gathering information, personal information, psychological information, about people, and then crafting messaging specifically aimed at those people to make the biggest impact possible, specifically for political campaigns. In other words, to help elect certain candidates, candidates that had reserved the services of Cambridge Analytica, which was acting as a political consultant. They were guns for hire. You could go out and secure the firm's services and try to get elected. They would gather information about the people you were hoping to reach, and they were meant to determine the most effective approach for reaching those people based on what they had learned about your target audience, and they would serve up an almost personalized form of messaging based on that data, at least in theory. It wasn't a guarantee that your message was going to be received well, but it was about as close as you could get. In fact, sometimes you might say that Alexander Nix, who ran Cambridge Analytica, was in fact guaranteeing those results. That messaging could include misinformation, it could include propaganda. It didn't have to, but it could, and it could be used to drive a larger wedge between different people within a region. It was a powerful, and therefore potentially dangerous, claim to make.
Now, according to the company itself, it would pair consumer information with psychological profile information on each person in your target audience, and it would gather that information from different sources like surveys, research, and social media platforms, namely Facebook. It would also pursue other ways to try and gather data. It would sort people into different buckets, different categories, and the four big ones would be: are you confrontational or non-confrontational? Are you agreeable? Are you a follower? Are you a leader? That sort of stuff. And it could formulate strategies to target each bucket based on those qualities.

So if these claims were all supported by evidence and had really strong results, it would be both really impressive and really disturbing. Impressive because of this idea that "we can learn a lot about the people you are trying to get to support you, and we can send out the message that is most likely to get you that support." Disturbing because the information this company gathered about people went well beyond general information like a name and an age and a zip code. It would include stuff like occupations, maybe the types of shows they liked to watch, the type of car they drove, their voting record, the sort of stuff they shopped for, what kind of medications they took. These were the claims Cambridge Analytica made, that they had these data points. And here's the kicker: a whole lot of the information was stuff the target audience was already sharing. They were posting this information to social media, though the company was claiming to have access to these data points long before it actually got that access. So, in other words, Cambridge Analytica seemed to be making some pretty grandiose claims about its capabilities. Alexander Nix, the head of it, was making most of those claims in very public forums before the company had any real means of delivering on them. But it did sort of catapult them into the spotlight for a while.
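To make the bucketing idea from a moment ago concrete, here is a minimal sketch of how scored trait profiles might be routed to differently framed messages. The trait names, thresholds, and message framings are invented for illustration; none of this is Cambridge Analytica's actual code.

```python
# Illustrative sketch: route a scored profile into one of the four
# buckets described above, then pick a message framing for that bucket.
# All thresholds and wordings here are made up.

def bucket(profile):
    """profile: dict of trait scores on a 0-1 scale."""
    if profile["confrontational"] > 0.6:
        return "confrontational"
    if profile["agreeableness"] > 0.6:
        return "agreeable"
    return "leader" if profile["leadership"] > 0.5 else "follower"

# One hypothetical framing of the same appeal per bucket.
MESSAGING = {
    "confrontational": "They're coming for what's yours. Push back.",
    "agreeable":       "Join your neighbors in supporting this.",
    "leader":          "Be the first in your community to act.",
    "follower":        "Millions of people like you already support this.",
}

voter = {"confrontational": 0.2, "agreeableness": 0.8, "leadership": 0.3}
print(MESSAGING[bucket(voter)])  # prints the "agreeable" framing
```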
Now, keep in mind, we are often talking in this episode and the next one about stuff that people were willingly sharing regularly on social media. It's the same sort of stuff that they might get upset about if they heard that some research firm had a spreadsheet, and their name was in that spreadsheet, and the data they had just shared on Facebook or Twitter ended up in that spreadsheet. It kind of changes the context. Because when you pop onto Facebook, let's say you're celebrating the fact that you just bought a brand-new car. To you, you are sharing some fun news with your friends. You're like, "Hey, I got my new car. It's pretty cool," blah blah blah. To you, that's just, "I'm excited, be excited with me." But to these research firms, what you've done is reveal a little bit more information about yourself that might potentially be valuable in the future, and they're taking note of it.

However, it would be too easy to put all the blame on social media users. We probably could share less in a lot of cases than we are sharing, but we don't hold all the blame. Cambridge Analytica also employed tools that went beyond looking at public profiles. So this story involves not just user behavior, not just Cambridge Analytica's behavior, but also Facebook and its policies. Facebook had in place a policy that developers could take advantage of, one that gave those developers access to enormous amounts of data: not just the information belonging to the people who downloaded the apps, but also that of others who never gave any consent for their information to be shared. That's also at the heart of this issue, so we're going to cover that as well. By the way, I'm making no effort to be unbiased in this episode as far as the behavior of Cambridge Analytica, because I think it's pretty clear that they did business in a dangerous and unethical way. Maybe an illegal way, but definitely an unethical way. And it's not just my opinion.
There are a lot of others who share this opinion. There's an article in Mother Jones titled "Cloak and Data: The Real Story Behind Cambridge Analytica's Rise and Fall," which is quite good. In that article, an employee of Cambridge Analytica said that the staff would never employ their techniques on British political campaigns, because it was too close to home, it was too questionable, but it seemed like it was okay to do it when it involved a different country's political system; that would be fair game. That seems pretty ugly to me. The idea that "oh no, we would never do this here, that would not be cool at all, but it's okay to do it in another country's political system, that's fine, because that's over there, that's not over here." That's troubling to a great degree. Also, spoiler alert: there are a lot of allegations suggesting that Cambridge Analytica at least had some involvement with British politics, specifically the referendum on leaving the European Union, also known as Brexit. But those are allegations that Cambridge Analytica has denied vehemently, and there are ongoing investigations into the matter. So that story still hasn't played out. I'll probably touch on it again in the course of these episodes.

So Cambridge Analytica was a company that, at the bottom line, would scrape data about people and then form messaging strategies around that data. Let's get into the details. Before there was a Cambridge Analytica, there was a guy named Nigel Oakes. Oakes went to Eton, which is a boarding school in England, one of the more posh ones, certainly expensive. A lot of statesmen from England have attended there. It is a boys-only boarding school, and he is an Old Etonian; that's what they call the folks who graduated from Eton. Oakes started out in advertising.
Actually, he started out as a DJ, but he went on into advertising and became an executive at Saatchi & Saatchi. And according to Oakes, in 1990 he founded an academic working group called the Behavioral Dynamics Institute. And I say "according to" because it's really hard to track down official information about these things. The purpose of this working group was to advance research and development, quote, "into persuasion and social influence," end quote, according to the official history of the research group. So, sort of an offshoot of advertising: how can you shape public opinion about a given subject? That was the key. How can we use our understanding of human psychology to have better success with any kind of communications campaign? Which could be for marketing purposes; it didn't have to be political. And the main focus was, according to Oakes, communication for conflict reduction. So, an idea of how a government or an organization can communicate in a way that's effective at reducing conflict around a situation.

Oakes brought on a couple of psychologists to work with him on this. One of them was Barry Gunter, who is now a professor emeritus at Leicester University, and the other was Adrian Furnham, and apparently the three would meet semi-regularly through the early nineties to talk about psychology, potential applications of it in the commercial world, and research projects. But the two psychologists eventually broke ties with Oakes. They, or at least Barry Gunter, said they felt that Oakes was sort of taking their work and making grandiose promises about leveraging psychology to get results, such as in advertising. Gunter was contacted by The New Yorker and said, quote, "We felt he was promising more than the science of psychology at that time could substantiate," end quote. So, in other words, the psychologists were saying: there may be something to this, but you're making promises before we even have a full understanding, and it may even be that those promises have nothing behind them.
In 1993, Behavioral Dynamics Institute had enough funding to make a go of it and became a company, again according to the company's own accounts; I could not find any actual records that talked about this. But either in 1993 or in 2005, Behavioral Dynamics Institute created a spinoff company called Strategic Communications Laboratories, or SCL. And I say either 1993 or 2005 because, according to the BDI history, 1993 is when SCL started, but UK Companies House did not register SCL as a company until 2005. The company, official or otherwise, certainly did business between 1993 and 2005. In 2000, there was an article published about the company working on communications strategies for the Indonesian president, Wahid, who, by the way, would later be impeached and removed from office. So I guess that messaging didn't go so well. I'll tell you more about SCL and the journey toward Cambridge Analytica in a moment, but first let's take a quick break to thank our sponsor.

SCL seemed to make a lot of big promises about what it could do with psychological data. After September 11, 2001, the company positioned itself as an expert resource for psychological warfare and began to go after military contracts in the UK and in the US as consultants. In the early 2000s, SCL hired a salesman named Alexander Nix, and Nix, like Oakes, had attended Eton, so he was another Old Etonian. In 2010, Alexander Nix traveled to America to scope out the possibility of finding clients for SCL, specifically political clients. But Nix saw that the climate in the US was different from that in other places. In the US, political consultants tend to ally themselves with a particular side. You would either be a Republican or a Democratic consultant; you rarely saw anyone who was just a consultant for hire for either side.
And SCL had clients that leaned left and some that leaned right. So it would be really challenging to get traction in America as a foreign company that had worked for both sides of the political spectrum in the past, and so SCL did not pursue a serious effort to get into US politics at that time. By 2012, SCL was in financial difficulty. It was not regularly pulling in consulting contracts, and there was a fundamental disagreement between Nix, who wanted to focus more on election campaigns as a source of revenue, and Oakes, who wanted to do more work selling operations centers in locations across the Middle East as sort of defense contracts. So Nix and Oakes more or less parted ways. Oakes would still head up a branch of SCL that focused on defense, and Nix would take up a group called SCL Elections. Essentially they said: fine, you take our research and apply it that way; I'm going to take our research and apply it this way. That research, by the way, from all the reports I could find, usually consisted of just a few small, very modest studies, nothing of any real major note. So it's very odd that the companies were making these big, big promises, because they didn't have a whole lot of research to back them up, and certainly not a lot of published research.

2012 also marked a new shot at the United States. In the US, the Obama campaigns had successfully leveraged Facebook and social media to reach out to young voters, target audiences, and send out messaging. Not totally different from what SCL and Cambridge Analytica would try to do, although done in a way that was much more transparent. So while the Obama campaigns kind of led the way there, they did not go the step further that Cambridge Analytica went.
However, it convinced Nix that the Republican side was falling behind when it came to technology, that there was a shift happening in the way politicians would reach out to their electorate, and that because the Republican Party was falling behind, there was an opportunity. After the 2012 election, the GOP, the Grand Old Party, the Republican Party, issued a post-mortem report that emphasized a need to look into new tools for competing in elections, including new places they had not or would not have looked before, and that gave Nix an opening. He rebranded SCL as being more than an image and communications strategy consulting firm. Now, he said, SCL is an expert in data analytics and in using those results to form actionable strategies. According to former employees, Nix and Oakes both had a habit of promising deliverables without having a really strong plan in place for how to make good on those promises. In other words, they found out what the potential client wanted, they would promise to deliver that, and then they would come up with a price for that promise. After that, they would go back to their team and say, okay, I promised that we would give them X; we have to figure out how to give them X. That was kind of what the rebranding was all about. The company was positioning itself as having deep expertise in a fairly young field, but you could argue it was not quite as far along as the pitch would suggest.

SCL Elections was a very expensive firm to have on your campaign, and Nix's strategy was to aim for wealthy underdogs in elections around the world, people who had access to a lot of money but not many political resources. And then the next piece in the Cambridge Analytica puzzle fell into place. That piece was a data analyst and political campaign strategist named Christopher Wylie. Wylie is an anomaly, someone that I just don't quite get.
Interestingly, he had worked on political campaigns before; specifically, he had worked for Obama's political campaigns, and he had also acted as a consultant for the Liberal Party in Canada. Nix met Wylie, thought that Wylie sounded like a genius, and offered him a job to help build out SCL's capabilities to actually do what the company was claiming it could do. And Wylie accepted the offer. Part of the reason why, well, he said he accepted it, in fact the main reason, was that Nix essentially told him: you're going to have free rein to work on your theories about data and politics and messaging. You're going to have a sandbox to play in where you can experiment all you like. And that made Wylie very happy. He thought that data and politics were an area that had not yet really come to fruition.

Around the same time, SCL's goal was to build out a tool it called Ripon. Or maybe, more accurately, its goal was to sell a tool called Ripon, take the money from those sales, and then develop the tool called Ripon. So, in other words, they were kind of selling a product they did not really have. This loops us back to all those data points the company claimed it had access to, the ones I was talking about in the opening of this episode. The Ripon tool was supposed to be software that would allow a company, a campaign, to manage things like a voter database and campaign efforts, all in a holistic, coordinated, and micro-targeted way: to have a focused, actionable strategy so that at any given step you knew what was happening and what was supposed to happen next.
And it sounded too good to be true. And it turned out that that was the case. Because Ripon, which was named after Ripon, Wisconsin, that's the birthplace of the Republican Party, from the nineteenth century, when the party was founded in Ripon, Wisconsin, mostly as an effort to unify people who were anti-slavery, along with numerous other foundational ideas; it all started there. But this software, named after the birthplace of the Republican Party, didn't really exist as a finished product. While Nix was out there trying to sell it, it was perpetually in development, and the company was making it a practice to get the train running without having laid down any tracks at the beginning.

So while the Ripon mess was happening, while campaigns were saying, "Hey, where's this tool you've been promising us?", Wylie began to look around for ways to actually make it a reality, something that would give the company an actual base to stand on. And he heard about the work of a psychologist named David Stillwell, who had built a Facebook app called myPersonality. Now, this app would quiz users about five factors known as the OCEAN model, and OCEAN stands for openness, conscientiousness, extraversion, agreeableness, and neuroticism. It would measure you on a scale for each of those things and then tell you how your personality measures up, what sort of personality you have. Psychologists had found that it was tricky to get people to agree to take those sorts of personality tests if they knew the results would be applied to stuff like politics or marketing, because they didn't want to share deeply personal reactions within those contexts. They didn't want to think, "Oh, I don't want to give this person information about my darkest desires or fears or whatever." But on Facebook, in the form of a quiz that promises to tell you more about yourself, that would work like gangbusters.
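For a sense of the mechanics, here is a minimal sketch of how a Big Five quiz like that might turn answers into OCEAN scores. The items, the reverse-keying, and the five-point scale are illustrative assumptions, not the actual myPersonality questionnaire.

```python
# A minimal sketch of scoring a Big Five (OCEAN) quiz.
# The items and weights here are invented for illustration.

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Each hypothetical item maps to one trait; reverse-keyed items
# count against the trait instead of toward it.
ITEMS = [
    ("I enjoy trying unfamiliar things.",   "openness",          False),
    ("I leave my belongings around.",       "conscientiousness", True),
    ("I am the life of the party.",         "extraversion",      False),
    ("I sympathize with others' feelings.", "agreeableness",     False),
    ("I get stressed out easily.",          "neuroticism",       False),
]

def score(answers):
    """answers: list of 1-5 agreement ratings, one per item in ITEMS."""
    totals = {t: [] for t in TRAITS}
    for (_, trait, reverse_keyed), rating in zip(ITEMS, answers):
        totals[trait].append(6 - rating if reverse_keyed else rating)
    # Average each trait's items so every trait lands on the same 1-5 scale.
    return {t: sum(v) / len(v) for t, v in totals.items() if v}

print(score([5, 2, 4, 4, 1]))
# {'openness': 5.0, 'conscientiousness': 4.0, 'extraversion': 4.0,
#  'agreeableness': 4.0, 'neuroticism': 1.0}
```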
I think it's pretty natural that we're all interested in stuff that addresses us specifically, that talks about us. We tend to be our own favorite topic of discussion. There's a bit of egomania in all of us to some extent. Now, some people like me, we've got egomania in droves. But when you encounter an app on Facebook that says "take this quiz to learn more about yourself," you may feel inclined to take that quiz. Here's the trick: you're not the only one learning stuff about yourself. The researchers are learning things too, and they're collecting an enormous amount of information about Facebook users. So if you're looking at the survey, and the survey says, "Hey, if you take this survey, the app is going to get access to your profile to see what kind of stuff you post," it's not going to post for you, but they can see what you've posted publicly, and you say, "Sure, that's fine, whatever, it's posted publicly, you can know that," well, that would give even more data to the developer who created the survey. So not only are they getting information from the survey that lets them know what kind of personality you have, they also know the type of stuff you like, based on your interactions on Facebook, because they have access to that information as well. And they can start to try and draw correlations between data points. For example, these researchers published a paper in 2013 talking about their findings, and they said that people who displayed high intelligence also really tended to like thunderstorms, and people who were big fans of Hello Kitty scored really high on openness but scored low on agreeableness and emotional stability.
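For a sense of what "drawing correlations between data points" looks like in practice, here's a toy version: correlate a binary "liked this page" flag against a quiz trait score. The data below is synthetic and the page is hypothetical; the real research ran this kind of analysis at a vastly larger scale.

```python
# Toy version of a likes-to-traits correlation. Each row is one quiz
# taker: (liked a hypothetical page? 1/0, openness score on a 1-5 scale).
# All numbers are synthetic, for illustration only.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rows = [(1, 4.6), (1, 4.2), (0, 3.1), (1, 4.8),
        (0, 2.9), (0, 3.4), (1, 4.0), (0, 2.7)]
likes = [liked for liked, _ in rows]
openness = [score for _, score in rows]

# A strongly positive r suggests the like is a usable proxy for the trait.
print(f"r = {pearson(likes, openness):.2f}")  # r = 0.92
```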
The researchers could draw those conclusions because the app didn't just gather information from the quiz; it gave them a chance to look at the profiles of the users themselves. So whatever those users had clicked "like" on on Facebook, that stuff got lumped in with all the other data, and that's how they were able to draw some conclusions about relationships. And that seemed really interesting to Wylie. I'll tell you a little bit more about that in just a second, but first let's take another quick break to thank our sponsor.

In 2014, Wylie would reach out to Stillwell and one of the other researchers, a guy named Michal Kosinski, to find out about using the myPersonality app, or something akin to it, in conjunction with political campaign research. At that time, Stillwell was working in the University of Cambridge's Psychometrics Centre in the UK, and Stillwell would ultimately turn down this opportunity. But a colleague of his, a Russian-American psychologist named Aleksandr Kogan, said: I can do that. I can make something for you. I can create an app that replicates the research that my colleagues have done. And so Kogan was tapped to create a tool that would work in a similar way to myPersonality, and he got to work on his own survey tool, which he called This Is Your Digital Life. Now, he positioned this survey to appear to be purely for academic research, and Facebook approved it. The survey itself existed on top of Amazon's Mechanical Turk platform, but it had this interoperability with Facebook. It was a thing that you would log into through Facebook, and it would therefore allow the survey operator to pull data from Facebook in addition to the information generated by the survey. And it wasn't just the information about the person taking the quiz, because Kogan had included a friends permission request. And by the way, this was a paid survey; people were paid money to participate.
If you agreed to take the survey and you logged in through your Facebook account, doing so would grant permission for the developer, in this case Kogan, to see not just your Facebook information but the information of your friends as well. So it included stuff like your name, your age, your location, your email address, your likes, your comments, your posts, and then all that information for your friends, too. That was the part that would ultimately get Cambridge Analytica and Facebook into a heap of trouble in the United States. So Kogan builds out the survey, using Amazon's Mechanical Turk program, and it ends with that message that asks for that Facebook permission. Somewhere between 270,000 and 320,000 people took the survey. So you're like, all right, well, that's a little more than a quarter of a million people; that's a lot of people. But that was not the limit of the data, right? Because you had that friends permission. You get the personality results, the survey results, just from the people who took the survey, but you would get all the Facebook data from not just those people but all of their friends as well. That is where you would start to hear this figure being thrown around that the survey gathered information from 50 million users on Facebook. And then, when Mark Zuckerberg appeared before Congress in the spring of 2018, he said the high end might actually be as much as 87 million, not 50 million. Maybe even worse than that: 87 million people might have had their data scraped in this operation, with only 270,000 people giving consent.
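The arithmetic behind that multiplication is simple. Here's a back-of-the-envelope sketch; the average-friends number is an assumed, illustrative value chosen to show how 270,000 consenting users can land in the neighborhood of the 87 million figure.

```python
# Back-of-the-envelope sketch of how the friends permission multiplied
# the dataset. The average-friends figure below is an illustrative
# assumption, not a number reported in the episode.

consenting_users = 270_000    # low-end count of people who took the quiz
avg_unique_friends = 320      # hypothetical average, after de-duplication

# Each consenting user exposes their own profile plus their friends' data.
estimated_profiles = consenting_users * (1 + avg_unique_friends)
print(f"{estimated_profiles:,}")  # 86,670,000 -- the same order of
                                  # magnitude as the 87 million figure
```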
That brought to light a policy Facebook had had in place for years, one that a lot of people had found troubling and had opposed before that point: that a person could provide consent on behalf of everyone on their friends list, giving an app developer access to their friends' data in addition to the user's own data. The more friends a person had, the more data the developer would collect. Now, one complicating factor that happened around this time is that Aleksandr Kogan also accepted an offer from St. Petersburg University in Russia, which granted Kogan research money to be an associate professor. Kogan has denied that any of that money was used to build the This Is Your Digital Life app or to fund the resulting analysis of the data that Facebook returned. But this does raise even more eyebrows; it seems there's a connection to Russia with this as well. And that would not be the only connection, as we'll see.

On June 4, 2014, according to an executed contract that Chris Wylie would later supply to reporters, Kogan, acting as head of a company called Global Science Research, would hand over the data he collected to SCL in return for money. And that would seem to violate Facebook's terms of use. In fact, it doesn't just seem to; it did violate Facebook's terms of use. The terms were written under the understanding that Kogan was doing academic research, and part of those terms said: you are not allowed to share those results or that raw data with a third party. You cannot do that; you don't have permission to do that. But Kogan did exactly that. SCL was not part of this research, not part of any academic research anyway; they wanted to make practical use of that data, and that violates the policy. Kogan would later say he felt he was being used
as a scapegoat in the whole matter, with Cambridge Analytica and Facebook both pointing to Kogan and saying he's the reason why we're in this mess. Meanwhile, he said the general opinion was that if no one asked for permission, no one could be told no. So, generally speaking, even if you suspected that what you were doing was wrong, you certainly wouldn't ask anyone about it, because you didn't want to be told outright that it was wrong, because that would mean you wouldn't be able to do it anymore. It was better to just keep doing the wrong thing, thinking it was wrong but not knowing for sure, because you had plausible deniability: as long as you didn't say, "Hey, is what we're doing cool or not?", no one could say, "No, that's not cool."

Kogan, by the way, would also later say that the problem he ran into was that he didn't read Facebook's terms of service. He didn't read the part that said he wasn't allowed to share that data. But moreover, in his own terms of service for the app he had developed, it stated that the information might be sold to third parties. That is, in the terms of service that he submitted to Facebook, it said that the data gathered could be sold. Now, that's in violation of Facebook's policies, which means that neither party read the other's terms of service. And I know, well, I don't know for a fact, but I suspect, that every single one of you out there, at some point or another, has signed up for something where there was a little checkbox that said "check to show that you have read the terms of service." And I know that you checked it. And I know you didn't read the terms of service, because they were forty pages long and we've all got better things to do with our lives. But here's the problem: Kogan didn't read those terms of service, and he handed over the data to Cambridge Analytica.
Facebook didn't read Kogan's terms of service. So Facebook didn't say, "Hey, no, that's totally not cool, we're not going to approve your app." Instead, it all went through, and this would become an enormous problem later on. Now, for Alexander Nix, this was a great time, because now he had a massive amount of data at his company's disposal, and he had been promising for a while that he was going to have access to a whole bunch of information, that they were going to be able to draw a lot of different conclusions from that information, get very accurate pictures of various target audiences, and act upon all of it in a way that would produce real results. And now he actually had data he could point to and say: hey, we've got all this information, let's make use of it, pay us money. So that made for a very effective sales pitch to potential political clients in the United States. They could kind of back up what they had been promising, at least from a "we have the data" standpoint.

And then, by chance, things clicked into place for Nix and his team. There was a political consultant named Mark Block who, with an associate of his, got on a flight from Los Angeles to New York, and sitting in Block's row was a subcontractor who had worked with SCL. The subcontractor talked up SCL's methods and strategies, what they were planning and what they were going to do with all this data, and by the end of that flight, Block and his associate were convinced that they should really get a meeting with Alexander Nix to talk about this. So they arranged one, and Block met with Nix and was impressed by the sales pitch, so he went on to introduce Nix to a woman named Rebekah Mercer. Rebekah Mercer is one of the daughters of billionaire Robert Mercer, and the Mercers are really big backers of Republican nominees in the United States.
They are strong financial backers of the GOP. The Mercers have also invested millions of dollars in Breitbart News, which at that time was led by Steve Bannon. The Mercers were on the lookout for a new leader in digital strategy, because this was just after the elections in 2012. Mitt Romney had lost, and the Mercers felt that a large reason why Romney was not able to win the election was the digital strategy the Republicans had been employing up to that point. Keep in mind here that Robert Mercer comes from a computer science background. So Steve Bannon, Rebekah Mercer, and Robert Mercer listened to Alexander Nix pitch his company's capabilities in late 2013, reportedly aboard the Mercer family's 203-foot yacht. That's just got to be a nice place to have a meeting. And the meeting apparently went well, because the Mercers and Bannon invested money into a new company. The Mercers reportedly put about fifteen million dollars into this new company that spun off from SCL. And this company was Cambridge Analytica.

I'm going to be talking a lot about Cambridge Analytica and SCL in the next episode. It's almost as if you can use both names interchangeably. In our next episode, I'll talk more about what Cambridge Analytica did, how it handled all that data we talked about earlier, the fallout that happened after people found out what was going on, and more. Before I leave, I do want to mention that, according to all the research I read, while Cambridge Analytica technically spun off from SCL, the two organizations were effectively one and the same. According to multiple sources, SCL essentially handled stuff in the UK and outside the States, and Cambridge Analytica handled pretty much everything in the United States, but they depended heavily on the same staff and the same physical resources, and they were headquartered in the same building.
So while it sounds like I'm ending this episode just at the founding of Cambridge Analytica, where the whole episode just leads up to that one moment, I would argue that Cambridge Analytica really existed the whole time, just not in name, and not with the specific focus of trying to influence political elections in the United States. In our next episode, we're going to go into more detail about how that played out.

If you guys have suggestions for future episodes of TechStuff, why not get in touch with me and let me know what they are? The email address for the show is techstuff@howstuffworks.com, or you can drop me a line on Facebook or Twitter; the handle at both of those is TechStuffHSW. You should also go over to our merch store at teepublic.com/techstuff. There you can find lots of designs that you can get on shirts, totes, phone cases, coffee mugs, all sorts of cool stuff. I love my TechStuff shirts so much, you'll love them too. And every purchase goes to help the show a little bit, so we thank you for that if you want to pick up some cool TechStuff merchandise. And don't forget to follow us over on Instagram. I'll talk to you again really soon.

For more on this and thousands of other topics, visit howstuffworks.com.