Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. A few months back, I profiled the Chinese telecommunications company Huawei, which continues to be the focal point of scrutiny around the world. Huawei makes, among lots of other stuff, critical components for 5G network infrastructure. There are some folks who worry that entrusting telecommunications infrastructure to a Chinese company is essentially inviting the government of China to spy on everybody: companies, other countries, everyone. Other folks aren't as concerned about that, or they took Huawei officials at their word that the company has no real ties to the Chinese Communist government or its goals.
Speaker 1: But a recent story from The Washington Post and the German broadcaster ZDF shone a light on why it might be a good idea to view Huawei with a critical eye, and another news item from The Wall Street Journal showed that Huawei has maintained back door access to its networks for ten years. So I want to talk about these stories, primarily focusing on the one from The Washington Post, to talk about the business of communication and secrets, and also the business of eavesdropping, and why all of this gets real dodgy, real fast. So the initial story doesn't involve China or 5G networks. It goes further back than that. It actually concerns a Swiss company called Crypto AG and its ties to the Central Intelligence Agency, a.k.a. the CIA, in the United States. The story is all about the battle between secrecy and surveillance, and it's also about trust, as in: whom do you trust when you want to send a secure communication to someone else?
Speaker 1: If you're using some sort of technology to encrypt your stuff, who makes that encryption scheme, whether it's software or an actual device or whatever it may be? Who's making that, and can they be trusted? As it turns out, those are more difficult questions to answer than would readily seem apparent. Now, the story for this really begins with a Swedish inventor named Arvid Gerhard Damm, who was born in eighteen sixty-nine. He worked in textile mills before he would start creating his own version of a cipher machine sometime around nineteen fifteen or so. So what the heck is a cipher machine? Heck, what's a cipher? Well, a cipher is a code. It's a way of hiding the meaning of a message. And there are a lot of different approaches to encoding information, and there are a lot of strategies that actually employ multiple schemes at once. So, for example, one way to have a code is to use words that refer to something else. So instead of saying a military tank, you might say Thomas, you know, because you've got Thomas the Tank Engine.
Speaker 1: And you go from Thomas the Tank Engine to military tank and there you are. So if you referred to a Thomas, you might be talking about a tank. That would be a very bad code, or at least a very easy-to-decipher code. But that's a version of codes where you have a codebook that tells you what certain words or phrases are actually meant to convey. Then you have ciphers, in which you replace the letters of a message with some other letter or symbol. The simplest of these is a shift cipher, sometimes also called a Caesar cipher. With these ciphers, you write out a message, but you shift all the letters some predetermined number down or up the alphabet. So if you had a shift cipher with just one shift, one step, that would mean that you would use the letter B to represent the letter A, you would use the letter C to represent the letter B, and so on down the alphabet. So if someone else were to get hold of the message, at a casual glance the message would appear to be gibberish.
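To make the idea concrete, here is a minimal sketch of a shift cipher in Python, along with the brute-force attack that makes it so weak: with only twenty-six possible shifts, an eavesdropper can simply try them all. (This is my own illustration, not anything from the episode; the function name is invented, and a real attacker would score candidates by letter frequency rather than search for a known word.)

```python
def shift_cipher(text: str, shift: int) -> str:
    """Shift every letter a fixed number of places, wrapping past Z."""
    return "".join(
        chr((ord(c) - ord("A") + shift) % 26 + ord("A")) if c.isalpha() else c
        for c in text.upper()
    )

# With a one-step shift, A becomes B, B becomes C, and so on.
print(shift_cipher("HELLO", 1))          # → IFMMP

# Breaking it: try all 26 shifts and look for readable text.
ciphertext = shift_cipher("ATTACK AT DAWN", 10)
for k in range(26):
    candidate = shift_cipher(ciphertext, -k)
    if "ATTACK" in candidate:
        print(k, candidate)              # → 10 ATTACK AT DAWN
```

Shifting by ten instead of one changes nothing about the attack; the loop still recovers the message almost instantly.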
Speaker 1: But of course that particular cipher is super easy to decode, even if you are shifting further up or down the alphabet. Let's say you're shifting up ten spots instead of one. Well, just because of the nature of language, someone with even a little bit of patience would probably be able to break that code pretty quickly. In the early twentieth century, inventors were working on mechanical systems that would create stronger ciphers. Initially these were mostly thought of as a way to protect business communications, like financial communications between banks, for example, or sometimes political messages between different parts of the world, like a government and its embassy in another country. Over time, they would be adopted by militaries around the world to send secret communications back and forth between headquarters and units in the field, and these communications needed to be much more secure than a Caesar cipher could potentially offer. So the basic idea behind these cipher machines was that you would have a device.
Speaker 1: Sometimes it would look like a typewriter, sometimes it would have a hand crank on it, but typically there'd be at least one dial, if not several dials, and perhaps some other components that would allow the operator to set the machine to establish the cipher. So you choose your settings, and then the operator would take a message that is meant to be encoded and put it through this machine in some way. Maybe they're using a keyboard, maybe they're using a series of keys and levers. However it may be, they're actually typing out the message in plain text. But the cipher machines would have gears or chains or other systems that would turn with each letter typed, and that would change the cipher as it did so, change the nature of it. And this was a really clever way to confound code breakers, particularly if the machine was really well designed. So let's say you are an operator and you have the word "book" that you need to encode using one of these machines.
Speaker 1: So you have one of these particular machines. You type the letter B into the device, which, because of the settings for this particular session, will now print out the letter G. So the letter G means B with this particular cipher. The gears inside the machine turn after you've typed in the letter B, which prints out as G, so now the cipher is actually different. You type in the first O in "book" and you get another G because of the way the cipher works. Then the gears turn again. You type in the second O, and now the machine prints out the letter F. The gears turn again, you type in the letter K, and you get a printout of K. So the printed word says G-G-F-K rather than "book."
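The stepping behavior described here, where the gears turn and the cipher changes after every keypress, and the mirrored decoding on the receiving end, can be sketched with a deliberately tiny toy machine. Here the "gears" are just a shift value that advances by a fixed step each keypress; this is my own stand-in, not a model of any real Hagelin or Enigma mechanism, and it won't reproduce the G-G-F-K example exactly:

```python
def machine_cipher(text: str, start: int, step: int, decode: bool = False) -> str:
    """A toy stepping cipher: the shift advances after every letter."""
    shift, out = start, []
    for ch in text.upper():
        k = -shift if decode else shift           # decoding subtracts the shifts
        out.append(chr((ord(ch) - ord("A") + k) % 26 + ord("A")))
        shift = (shift + step) % 26               # the "gears turn" each keypress
    return "".join(out)

msg = machine_cipher("BOOK", start=5, step=3)
print(msg)                                                # → GWZY
print(machine_cipher(msg, start=5, step=3, decode=True))  # → BOOK
```

Notice that the repeated O in "book" encrypts to two different letters (W, then Z), which is exactly what defeats simple frequency analysis, and that decoding only works if the receiving machine starts from the same settings as the sender's.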
Speaker 1: Well, to decode the message, you would typically need the same sort of machine that was used to encode it, and you would need to know what settings the operator had been using when they started the message. You would have to set up your machine to mirror that, and then you would take the encoded message and start typing that out, and the process would essentially reverse itself, allowing the operator to read out the original message. So in our example, the operator on the other side would take G-G-F-K and enter that into their machine, and they would get the printout "book." Now, a couple of caveats here. Not all cipher machines were created equal, or used to their best advantage. Sometimes people made bad decisions when it came to either designing cipher machines or implementing them. For example, the bigwigs might decide that under no circumstance would you ever have a letter represented by itself. You would never allow that to happen. So in the example I just gave, where G-G-F-K means "book," that last K wouldn't work.
Speaker 1: You would have to have the device go to a different letter, because it would not allow a letter to be represented by itself. Other rules that could cause problems down the road might be a rule against the doubling of letters, like the G-G in G-G-F-K. And the reason these are problems is that if you have a code breaker who's looking at these codes really closely, and that code breaker starts to figure out that there are restrictions to the code, they can build that into their code-breaking models in an effort to crack the code, because as you put in restrictions, you're reducing variables. Anyone who has worked in any sort of mathematics, particularly stuff like algebra, knows that to solve complicated problems you need to reduce variables. As you reduce variables, you make it easier to solve problems. It was actually this sort of thing that would lead to British cryptographers breaking German codes during World War Two. It wasn't that the technology itself was necessarily faulty.
Speaker 1: It was that the Germans were kind of using bad methodology with some of their equipment, and that's what gave an inroad to code breakers. Now, if you want to learn way more about how these machines actually work, you can listen to "TechStuff Ponders an Enigma." That's a classic episode that originally published way back on October nineteenth, two thousand eleven, and I actually did a TechStuff Classic rerun of that episode on October twelfth, two thousand eighteen. The Enigma machine is the most famous cipher device that was made in the early twentieth century. It was made and used by the Germans, and it was used extensively by the German military during World War Two.
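One way to see why a rule like "a letter may never stand for itself" hands code breakers an inroad: if you suspect a certain word (a "crib") appears somewhere in a message, you can slide it along the ciphertext and discard every alignment where even one letter would have to map to itself. This is my own minimal sketch of that general idea, which Allied cryptanalysts famously exploited against Enigma traffic:

```python
def possible_positions(ciphertext: str, crib: str) -> list:
    """Alignments of the crib that survive the no-self-mapping rule."""
    hits = []
    for i in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[i:i + len(crib)]
        # Any ciphertext letter equal to the crib letter beneath it is
        # impossible on a machine that never maps a letter to itself.
        if all(c != p for c, p in zip(window, crib)):
            hits.append(i)
    return hits

print(possible_positions("GGFKA", "BOOK"))   # → [1]
```

Alignment 0 is ruled out because the ciphertext K would have to stand for the K in "book." Every eliminated alignment shrinks the space a code breaker has to search, which is exactly the reduction of variables described above.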
Speaker 1: And in that podcast, my old co-host Chris Pollette and I talked about how a really good cipher, one that's super hard to crack, is also a pain in the patootie to use because of that complexity. That's mainly why officials would put rules in place that ultimately would serve as the downfall of their technology, because using the tech without those rules in place was possible, but not always fast enough to be practical. This would prove to be a problem with cryptography in general. You want a system that's secure enough that you're reasonably certain a person who intercepts the message would be unable to make heads or tails of it, right? That's the whole purpose of cryptography: to make any unauthorized person incapable of reading the message. But you also want your solution to be practical enough that your intended recipient can decode the message with a minimum of fuss, particularly if it relates to a time-sensitive issue.
Speaker 1: So in this case, you had Germans using the same settings on their Enigma machines for longer than they were supposed to, or they were co-locating codebooks with the Enigma machines, and those fell into Allied hands, and the Allies were able to use those to decode messages. To this day, balancing practical application with security remains a challenge. It may make it take longer for a message to get from one point to another, which a lot of people don't accept in the age of information traveling at the speed of light, or it just may be a pain to encrypt and decrypt, which also ends up becoming a barrier to adoption and implementation. Okay, let's get back to our story. So it's the nineteen tens, right, around nineteen fifteen. Arvid Gerhard Damm has patented an encryption device. He got that patent by nineteen nineteen, and to manufacture and market the device, Damm would work with business partners to create a company originally called Cryptograph, or AB Cryptograph. One of Damm's investors was a guy named Karl Wilhelm Hagelin, who had made his money in Russia in the oil business.
Speaker 1: But then the Russian Revolution happened, and Hagelin fled with his family and returned to his homeland of Sweden. Boris Hagelin was Karl Wilhelm Hagelin's son, and Boris was given a position in Damm's company in return for the financial investment from his father. Now, Boris would actually prove to be quite the entrepreneur. In nineteen twenty-five, he would take over the company entirely; he became the new head of the company. He would rename it Cryptoteknik in nineteen thirty-two, and then, when the Nazis rose to power, he fled Sweden for Switzerland and re-established his company there. And it was this company that would later become known as Crypto AG, the real focus of our episode. In the meantime, his company continued to produce new cipher machines, incorporating new features in an effort to build machines that were able to create stronger codes.
Speaker 1: And again, this was mostly for business use or occasional government use, but the rise of World War Two would create a new market as militaries sought ways to send messages securely, without fear that their plans would be shown to an enemy. And that's when the United States would enter the picture, setting the stage for the company's future in ways Hagelin could not have anticipated. I'll explain more when we come back, but first let's take a quick break. So, when World War Two broke out, the United States military would become one of Crypto AG's customers, and when the Nazis invaded Norway in nineteen forty, Hagelin would again move operations. This time he moved to the United States. His company's encryption device, known as the M-209, would be produced in the US. According to The Washington Post, there was a typewriter factory in upstate New York that would end up making around one hundred forty thousand of these M-209 encryption devices, and Hagelin negotiated with the U.S. Army and landed an eight-point-six-million-dollar contract.
Speaker 1: A princely sum today, but certainly a princely sum way back in the nineteen forties. Hagelin's devices lacked the sophistication of Germany's Enigma machine. They weren't nearly as complex, nor were they as capable of creating very tough encryption, so code breakers could suss out the original messages that were created on an M-209 if they were given enough time and attention. For that reason, the Army primarily relied on these devices to disguise extremely time-sensitive orders. The logic was that by the time someone had actually broken the code, the information would be worthless anyway, because whatever was being covered in the message would have already happened. It would have been something imminent, so you wouldn't be able to act on the information, even though you'd be able to at least decode what had been said. So you wouldn't want to use these devices for any sort of long-term plans, because they were crackable. People could crack the codes, given enough time. Now, around that same time, Hagelin became good friends with another cryptographer, named William Friedman. Friedman was born in Russia.
Speaker 1: Actually, so was Hagelin. Hagelin's parents were Swedish, but when they had Boris, the family was in Russia. Friedman's family left Russia when Friedman was just a baby, back in eighteen ninety-two, due to a rise in antisemitism in Russia; Friedman's family is Jewish. Friedman grew up loving codes and cryptography and became fascinated with them. He joined a private research lab. He met, and then courted, and then married a woman named Elizebeth Smith, who in her own right was an accomplished cryptographer, a brilliant cryptographer. They both worked for George Fabyan, the guy who owned the private research lab. Fabyan sounds like the sort of person who really belonged in the Renaissance, as far as I'm concerned. In the Renaissance, you had rich nobles who would become patrons of great thinkers and philosophers and artists. Fabyan established this private research lab in order to look into stuff that he just thought was interesting, which I think is kind of cool, maybe a little eccentric.
Speaker 1: Well, when the United States entered World War One, the Friedmans, husband and wife, would work in code breaking for the United States, and the cryptologic division of the research lab became the genesis of America's cryptographic service. William Friedman would later become the chief cryptanalyst; in fact, he coined the term "cryptanalysis" for the United States, and he would lead the future Signals Intelligence Service before going on to serve in other intelligence agencies as a cryptographer. So Friedman was very much working in the same world as Hagelin, though you could say they approached it from opposing perspectives, right? Because Hagelin's company was all about producing machines that could encipher messages, while Friedman was largely interested in finding methods to decipher codes, though Friedman also worked on theory as well, on different ways to create stronger ciphers. And we'll come back to Friedman in just a moment.
Speaker 1: So Hagelin would stay in the US until World War Two ended in Europe. He had become extremely wealthy due to the lucrative Army contract he had landed, and he had built many professional and personal relationships in the United States, so he would have strong ties to the US. He then returned to Europe to re-establish his company there. Meanwhile, American intelligence officials were starting to get a little worried, because code breaking was growing increasingly difficult due to sophisticated machines running complicated systems to create these codes. And if you had little insight into how those machines worked, or which systems they were following at any given time, you had really little hope of breaking a code in a reasonable amount of time. It was very clear that a lot of people were having really secret conversations that American spies were unable to decipher, and that just rubbed the Americans the wrong way. I'm gonna get a little critical of my country in this episode.
Speaker 1: Anyway, in nineteen fifty-one, Hagelin's company introduced the CX-52 cipher machine, and this one was sophisticated enough to present a code that American intelligence agents viewed as practically unbreakable at the time. That in turn prompted some heated internal discussions within the U.S. intelligence community: what should officials do about this? Because there was a real worry that countries might go out and buy Hagelin's products. I mean, that's what Hagelin was making them for. And if they did that, they would all be able to communicate secretly, and Americans would be unable to snoop out what was going on. And boy howdy, does America hate that. So American officials gave a sort of carrot-and-stick offer to Hagelin. On the one hand, they were a big customer for his company, right? The United States represented a significant potential customer for Hagelin's products. He didn't want that source of revenue to go away.
So 325 00:20:39,000 --> 00:20:42,959 Speaker 1: there was that. They also had a whole bunch of 326 00:20:43,000 --> 00:20:47,679 Speaker 1: old M-209 cipher devices that were manufactured 327 00:20:47,720 --> 00:20:51,760 Speaker 1: in America during World War Two, and there was at 328 00:20:51,800 --> 00:20:56,239 Speaker 1: least the implied threat that if Hagelin wouldn't be, you know, 329 00:20:56,600 --> 00:21:01,120 Speaker 1: cooperative with the US, maybe the Americans might let 330 00:21:01,160 --> 00:21:04,320 Speaker 1: a few thousand M-209s get sold off 331 00:21:04,320 --> 00:21:07,879 Speaker 1: to countries around the world, and that would undercut Crypto's 332 00:21:07,960 --> 00:21:11,160 Speaker 1: own sales in the process. I mean, if you are 333 00:21:12,200 --> 00:21:14,919 Speaker 1: kind of, you know, the head of an agency 334 00:21:15,200 --> 00:21:19,879 Speaker 1: in a smaller country with limited resources, and the United 335 00:21:19,920 --> 00:21:23,119 Speaker 1: States says, hey, we'll sell you these old but totally 336 00:21:23,240 --> 00:21:27,840 Speaker 1: working cipher machines for much less than that brand new, 337 00:21:27,880 --> 00:21:31,040 Speaker 1: shiny cipher machine. You're gonna go with the cheaper model 338 00:21:31,080 --> 00:21:34,240 Speaker 1: as long as it works, and that means that Crypto 339 00:21:34,400 --> 00:21:38,800 Speaker 1: would not be making any sales. Uh. Then there was 340 00:21:38,880 --> 00:21:43,760 Speaker 1: William Friedman, Hagelin's old buddy. In nineteen fifty one, Friedman 341 00:21:43,840 --> 00:21:46,840 Speaker 1: was then serving as the head of the Cryptographic Division 342 00:21:46,960 --> 00:21:50,840 Speaker 1: of the Armed Forces Security Agency or AFSA. A F 343 00:21:51,280 --> 00:21:54,160 Speaker 1: S A.
The following year he would become the head 344 00:21:54,240 --> 00:21:58,199 Speaker 1: of the Cryptology Department for the National Security Agency, or 345 00:21:58,280 --> 00:22:02,560 Speaker 1: the NSA. It was then that Friedman would 346 00:22:02,560 --> 00:22:05,200 Speaker 1: act on behalf of the U.S. government and meet 347 00:22:05,560 --> 00:22:10,199 Speaker 1: secretly with Hagelin in Washington, D.C. So Friedman goes 348 00:22:10,280 --> 00:22:17,080 Speaker 1: to Hagelin with a fairly thorny proposition. The deal 349 00:22:17,160 --> 00:22:21,239 Speaker 1: was this: Hagelin was to continue creating cipher machines just 350 00:22:21,280 --> 00:22:24,920 Speaker 1: as the company had been, but Crypto would only sell 351 00:22:25,200 --> 00:22:29,240 Speaker 1: the most sophisticated of those machines to a list of 352 00:22:29,280 --> 00:22:33,080 Speaker 1: countries that the United States would provide to Hagelin, and 353 00:22:33,160 --> 00:22:37,160 Speaker 1: that would represent countries with whom the US had very 354 00:22:37,200 --> 00:22:40,720 Speaker 1: good relations, so allies and that sort of thing. They 355 00:22:40,720 --> 00:22:43,040 Speaker 1: were the only countries who would be allowed to buy 356 00:22:43,600 --> 00:22:47,719 Speaker 1: the top-of-the-line products. Crypto would be allowed 357 00:22:47,760 --> 00:22:52,880 Speaker 1: to sell older, more vulnerable or weak machines to any 358 00:22:52,920 --> 00:22:56,080 Speaker 1: country that was not on that list. So, in other words, 359 00:22:56,280 --> 00:23:00,399 Speaker 1: Friedman was asking Hagelin to kind of put on a 360 00:23:00,440 --> 00:23:05,840 Speaker 1: preference list certain countries, and then everyone else would get older, 361 00:23:06,600 --> 00:23:11,879 Speaker 1: more vulnerable technologies. Uh. However, that's the extent of that deal.
362 00:23:12,080 --> 00:23:14,040 Speaker 1: It didn't go further than that, but it's still a 363 00:23:14,040 --> 00:23:17,600 Speaker 1: pretty big request. And you can kind of understand where 364 00:23:17,640 --> 00:23:20,639 Speaker 1: the US was coming from. At least, you know, they 365 00:23:21,440 --> 00:23:23,760 Speaker 1: clearly did not want the job to be even harder 366 00:23:23,880 --> 00:23:28,600 Speaker 1: when it came to breaking codes. And Hagelin would ultimately 367 00:23:28,640 --> 00:23:31,280 Speaker 1: agree to this deal. And whether it was that he saw 368 00:23:31,359 --> 00:23:33,600 Speaker 1: a guaranteed payout from the US and so it was 369 00:23:33,600 --> 00:23:36,720 Speaker 1: strictly a business decision, or he just felt it was impossible 370 00:23:36,760 --> 00:23:39,560 Speaker 1: to turn down this offer, or he felt a strong 371 00:23:39,640 --> 00:23:42,320 Speaker 1: sense of loyalty toward a country that had made him 372 00:23:42,320 --> 00:23:45,320 Speaker 1: a millionaire, or maybe it was some combination of these 373 00:23:45,359 --> 00:23:47,959 Speaker 1: and other factors, I don't know, but whatever it was, 374 00:23:48,280 --> 00:23:51,919 Speaker 1: he said yes. And this would mark the beginning of 375 00:23:51,960 --> 00:23:55,840 Speaker 1: the U.S. intelligence community having a direct interest in 376 00:23:55,920 --> 00:24:00,920 Speaker 1: a company that was selling cryptographic equipment, that is, Crypto. 377 00:24:01,240 --> 00:24:04,440 Speaker 1: But at this point it was still a fairly limited agreement. 378 00:24:04,560 --> 00:24:07,639 Speaker 1: Crypto could still sell equipment to countries all around the world, 379 00:24:08,040 --> 00:24:11,280 Speaker 1: though any country that was not on the US Best 380 00:24:11,359 --> 00:24:14,280 Speaker 1: Buddy list would only have access to the older devices.
381 00:24:14,800 --> 00:24:19,480 Speaker 1: Now this wasn't because US officials were feeling benevolent or 382 00:24:19,520 --> 00:24:21,639 Speaker 1: anything like that. I don't want to paint it in 383 00:24:21,680 --> 00:24:25,320 Speaker 1: that light. There was a very real desire in America 384 00:24:25,600 --> 00:24:30,240 Speaker 1: to push Crypto for a much more shady deal. Intelligence 385 00:24:30,280 --> 00:24:33,920 Speaker 1: officials were hoping that they could work directly with Crypto 386 00:24:34,000 --> 00:24:38,040 Speaker 1: to design machines that would produce codes that Americans could 387 00:24:38,200 --> 00:24:42,400 Speaker 1: quickly break. People would think they were sending secure messages, 388 00:24:42,600 --> 00:24:45,160 Speaker 1: but in reality the Americans would be able to decode 389 00:24:45,160 --> 00:24:49,440 Speaker 1: those messages fairly quickly. But William Friedman discouraged anyone from 390 00:24:49,440 --> 00:24:53,360 Speaker 1: America from going to Hagelin with such an offer for 391 00:24:53,400 --> 00:24:56,679 Speaker 1: several years. He said Hagelin would never go for it. 392 00:24:56,680 --> 00:24:59,199 Speaker 1: It would be deeply offensive to him. You're going to 393 00:24:59,240 --> 00:25:03,000 Speaker 1: destroy this relationship we have. Let's, you know, 394 00:25:03,080 --> 00:25:06,280 Speaker 1: let's hold back rather than take a loss. 395 00:25:06,960 --> 00:25:09,320 Speaker 1: And hey, there were other companies out there, right? I mean, 396 00:25:09,480 --> 00:25:12,280 Speaker 1: it's not like you had to buy from Crypto 397 00:25:12,440 --> 00:25:14,520 Speaker 1: or else you'd have no way to communicate secretly. You 398 00:25:14,560 --> 00:25:18,600 Speaker 1: could always get cipher machines and cryptography machines from some 399 00:25:18,680 --> 00:25:22,159 Speaker 1: other source, right? Well.
Part of the deal that the 400 00:25:22,280 --> 00:25:26,000 Speaker 1: US made included substantial amounts of money meant to go 401 00:25:26,119 --> 00:25:30,160 Speaker 1: toward marketing. The US wanted Crypto to be the world 402 00:25:30,480 --> 00:25:34,920 Speaker 1: leader in the market for this sort of device, 403 00:25:35,400 --> 00:25:37,760 Speaker 1: mostly in an effort to make sure that some other 404 00:25:37,840 --> 00:25:41,399 Speaker 1: crypto company didn't come along with better, more difficult to 405 00:25:41,520 --> 00:25:45,240 Speaker 1: crack solutions, because that would just set America back again. 406 00:25:45,320 --> 00:25:48,800 Speaker 1: So the US supplied money year after year to Crypto 407 00:25:48,880 --> 00:25:51,919 Speaker 1: to renew this agreement and to keep the company going 408 00:25:52,119 --> 00:25:54,920 Speaker 1: even if things should get lean, all the while trying 409 00:25:54,960 --> 00:25:58,879 Speaker 1: to promote Crypto's products and hold back any of Crypto's 410 00:25:58,880 --> 00:26:04,080 Speaker 1: competitors. It was pretty brutal. Things slowly began to change as 411 00:26:04,119 --> 00:26:07,520 Speaker 1: time went on. The invention of the transistor would bring 412 00:26:07,560 --> 00:26:11,399 Speaker 1: on tons of innovation and miniaturization. So in the past, 413 00:26:11,880 --> 00:26:15,760 Speaker 1: electric circuits were physically enormous because you had to have 414 00:26:15,840 --> 00:26:18,960 Speaker 1: components like vacuum tubes, and those took up a lot 415 00:26:19,000 --> 00:26:21,240 Speaker 1: of space, and they also gave off a lot of heat, 416 00:26:21,320 --> 00:26:24,440 Speaker 1: which generally is bad not just for humans but also 417 00:26:24,560 --> 00:26:28,040 Speaker 1: for electronics. But in the mid nineteen sixties that was 418 00:26:28,080 --> 00:26:31,000 Speaker 1: all starting to change.
Electronic circuits could now be made 419 00:26:31,080 --> 00:26:33,919 Speaker 1: much smaller thanks to the transistor, and they made it 420 00:26:33,960 --> 00:26:38,159 Speaker 1: possible for all sorts of new gadgets, like pocket radios 421 00:26:38,240 --> 00:26:41,720 Speaker 1: and desktop computers further down the line, and yes, new 422 00:26:41,800 --> 00:26:46,920 Speaker 1: types of cryptographic machines. Hagelin was facing a very real 423 00:26:46,960 --> 00:26:51,600 Speaker 1: problem at that point. His company was built around mechanical 424 00:26:52,200 --> 00:26:57,320 Speaker 1: cryptographic devices. These were machines that relied on physical components 425 00:26:57,359 --> 00:27:02,400 Speaker 1: like gears and levers and chains. But the electronic era 426 00:27:02,560 --> 00:27:06,240 Speaker 1: was heading in a different direction, and the Crypto company 427 00:27:06,480 --> 00:27:09,840 Speaker 1: wasn't in a position to keep up. If Hagelin wanted 428 00:27:09,840 --> 00:27:13,400 Speaker 1: to compete, he was going to need help. And when 429 00:27:13,440 --> 00:27:17,639 Speaker 1: someone needs help, that means they are vulnerable. Now, if 430 00:27:17,640 --> 00:27:19,880 Speaker 1: you're in a position to help someone, you can more 431 00:27:19,960 --> 00:27:23,360 Speaker 1: or less selflessly help that person to get them out 432 00:27:23,400 --> 00:27:28,560 Speaker 1: of that vulnerable position, or you can attempt to exploit it. 433 00:27:29,080 --> 00:27:31,679 Speaker 1: And the U.S. intelligence community, with the NSA 434 00:27:31,760 --> 00:27:36,080 Speaker 1: at the forefront, took option number two. The 435 00:27:36,160 --> 00:27:39,760 Speaker 1: NSA, as I said, the National Security Agency, was 436 00:27:39,840 --> 00:27:43,199 Speaker 1: founded in nineteen fifty two, just five years after the 437 00:27:43,240 --> 00:27:49,120 Speaker 1: Central Intelligence Agency was founded.
It's primarily focused on signals intelligence, 438 00:27:49,480 --> 00:27:52,720 Speaker 1: and that is the interception and decoding of messages for 439 00:27:52,760 --> 00:27:56,800 Speaker 1: the purposes of gathering intelligence. Over at the NSA, 440 00:27:56,800 --> 00:28:01,520 Speaker 1: an analyst named Peter Jenks hypothesized that with care, you 441 00:28:01,560 --> 00:28:06,600 Speaker 1: could create an electronic cryptographic system that would seem to 442 00:28:06,720 --> 00:28:11,600 Speaker 1: be random, but it would actually depend upon a repeated 443 00:28:11,760 --> 00:28:15,760 Speaker 1: pattern at regular intervals, and a casual glance at the 444 00:28:15,800 --> 00:28:17,760 Speaker 1: code would make it seem as though the system was 445 00:28:17,800 --> 00:28:21,879 Speaker 1: following a complicated algorithm and producing an uncrackable code because 446 00:28:21,880 --> 00:28:25,760 Speaker 1: of some sort of random element. But the repetition of 447 00:28:25,760 --> 00:28:29,080 Speaker 1: the pattern would actually make code breakers with sufficient computing 448 00:28:29,080 --> 00:28:33,240 Speaker 1: power able to decode the messages. It wouldn't be easy, 449 00:28:33,480 --> 00:28:36,000 Speaker 1: it wouldn't be as simple as just running it through 450 00:28:36,000 --> 00:28:40,720 Speaker 1: a decoder, but because of that pattern, it would become possible. Again, 451 00:28:41,000 --> 00:28:47,640 Speaker 1: patterns represent restrictions. Restrictions are vulnerabilities, and vulnerabilities can be exploited, 452 00:28:48,080 --> 00:28:50,160 Speaker 1: so you can make a system that, at least on 453 00:28:50,280 --> 00:28:56,040 Speaker 1: casual glance, appears to be secure, but in reality it's not.
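The kind of flaw Jenks was describing, a keystream that looks random at a glance but secretly repeats at a fixed interval, can be sketched in a few lines of Python. To be clear, this is purely illustrative: the period, the mixing formula, and the message are all invented for the example, not anything from the NSA's actual design.

```python
# Toy sketch of a "looks random, secretly repeats" stream cipher.
# PERIOD is the hidden weakness: keystream bytes cycle every 16 positions.
PERIOD = 16

def keystream(length: int, seed: int = 0xA5) -> list[int]:
    # Pseudo-random-looking bytes that quietly repeat with a fixed period.
    base = [(seed * (i + 7) ** 3 + i) % 256 for i in range(PERIOD)]
    return [base[i % PERIOD] for i in range(length)]

def encrypt(plaintext: bytes) -> bytes:
    # XOR each byte with the keystream; decryption is the same operation.
    return bytes(p ^ k for p, k in zip(plaintext, keystream(len(plaintext))))

msg = b"ATTACK AT DAWN STOP ATTACK AT DAWN STOP"
ct = encrypt(msg)

# An eavesdropper who suspects the period can XOR the ciphertext with a
# copy of itself shifted by PERIOD: the keystream cancels out entirely,
# leaving plaintext XOR plaintext, with no key needed at all.
leak = bytes(a ^ b for a, b in zip(ct, ct[PERIOD:]))
```

The `leak` bytes expose the message's internal structure directly, which is exactly the kind of exploitable regularity the passage describes: the pattern is a restriction, and the restriction is the vulnerability.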
454 00:28:56,560 --> 00:28:59,040 Speaker 1: The NSA reaches out to Crypto, which is 455 00:28:59,120 --> 00:29:00,920 Speaker 1: really in need of expertise in the form of 456 00:29:00,960 --> 00:29:06,080 Speaker 1: building electronic cryptographic machines, and Hagelin welcomes the help because 457 00:29:06,080 --> 00:29:10,120 Speaker 1: otherwise his business is going to completely lose out. So 458 00:29:10,240 --> 00:29:12,800 Speaker 1: Crypto goes on to produce a machine called the H-460 459 00:29:12,960 --> 00:29:17,400 Speaker 1: based off the NSA's design. The company actually 460 00:29:17,400 --> 00:29:22,520 Speaker 1: made two versions of the H-460. One was compromised. 461 00:29:22,760 --> 00:29:25,120 Speaker 1: It used the NSA's repeating pattern, so that 462 00:29:25,240 --> 00:29:27,960 Speaker 1: the agency could, with time and effort, decode any 463 00:29:27,960 --> 00:29:31,720 Speaker 1: messages that were composed on that particular machine. The other 464 00:29:31,800 --> 00:29:35,360 Speaker 1: one was more secure; it didn't repeat the pattern. So 465 00:29:35,520 --> 00:29:39,320 Speaker 1: the United States was still fine with Crypto selling those machines, 466 00:29:39,400 --> 00:29:42,280 Speaker 1: the good ones, to countries that were still on 467 00:29:42,320 --> 00:29:45,840 Speaker 1: the US Best Buddy list. Everyone else would get the 468 00:29:45,880 --> 00:29:49,400 Speaker 1: compromised version. Now, while the NSA's assistance meant 469 00:29:49,440 --> 00:29:52,480 Speaker 1: that Crypto would remain a viable company as the world 470 00:29:52,520 --> 00:29:55,840 Speaker 1: moved away from mechanical systems, it also meant that Crypto 471 00:29:55,960 --> 00:29:59,920 Speaker 1: was a company that was becoming increasingly dependent upon American 472 00:30:00,000 --> 00:30:04,880 Speaker 1: intelligence agencies.
Toward the end of the sixties, folks in 473 00:30:04,920 --> 00:30:08,600 Speaker 1: the CIA were starting to get a little bit antsy 474 00:30:08,800 --> 00:30:12,680 Speaker 1: with the company Crypto. It was a valuable asset, and 475 00:30:12,840 --> 00:30:16,760 Speaker 1: countries around the world depended upon equipment from Crypto, which 476 00:30:16,800 --> 00:30:19,680 Speaker 1: meant the US had incredible advantages when it came to 477 00:30:19,720 --> 00:30:24,120 Speaker 1: deciphering intelligence. But Hagelin was getting up there in years. 478 00:30:24,120 --> 00:30:26,800 Speaker 1: He was getting into his eighties, and there was no 479 00:30:26,880 --> 00:30:30,560 Speaker 1: guarantee that his successor would be as amenable to the 480 00:30:30,600 --> 00:30:34,960 Speaker 1: intelligence agencies as Hagelin had been. Initially, it appeared as 481 00:30:35,000 --> 00:30:37,160 Speaker 1: though he was going to hand over control of his 482 00:30:37,240 --> 00:30:42,040 Speaker 1: company to his son, Bo Hagelin. The CIA was not 483 00:30:42,360 --> 00:30:46,520 Speaker 1: crazy about that idea. The agency was not convinced that 484 00:30:46,560 --> 00:30:51,400 Speaker 1: Bo Hagelin would be as pliable as Boris Hagelin had been, 485 00:30:51,800 --> 00:30:54,040 Speaker 1: and the nature of the company's relationship with the U.S. 486 00:30:54,080 --> 00:30:58,160 Speaker 1: intelligence community had been kept a secret from Bo. So 487 00:30:58,280 --> 00:31:02,560 Speaker 1: Boris Hagelin's own son did not apparently know about this 488 00:31:02,840 --> 00:31:07,320 Speaker 1: relationship with, uh, the NSA and later the CIA. 489 00:31:07,880 --> 00:31:10,240 Speaker 1: So Boris and his son Bo were also not on 490 00:31:10,280 --> 00:31:15,800 Speaker 1: the best of terms. They frequently had pretty massive fights.
491 00:31:16,320 --> 00:31:19,720 Speaker 1: Bo had felt he had been left out of some 492 00:31:19,920 --> 00:31:25,040 Speaker 1: pretty important patents that he had contributed to, and so 493 00:31:25,160 --> 00:31:28,960 Speaker 1: he was not on good speaking terms with his father. 494 00:31:29,800 --> 00:31:34,280 Speaker 1: Uh, so this was a complicated issue, and the U.S. 495 00:31:34,360 --> 00:31:38,920 Speaker 1: government wasn't entirely sure how it was going to play out. Meanwhile, 496 00:31:39,080 --> 00:31:42,800 Speaker 1: over in Europe, you had intelligence agencies in West Germany, 497 00:31:43,040 --> 00:31:46,160 Speaker 1: because you know, after World War Two, Germany was split 498 00:31:46,360 --> 00:31:49,560 Speaker 1: up into West Germany and East Germany. So West Germany 499 00:31:49,600 --> 00:31:52,240 Speaker 1: and an intelligence agency in France were both eager to 500 00:31:52,400 --> 00:31:57,280 Speaker 1: purchase Crypto from Hagelin. You know, Hagelin's getting very old, 501 00:31:57,320 --> 00:31:59,520 Speaker 1: and so they think, hey, if we buy this company, 502 00:32:00,120 --> 00:32:04,400 Speaker 1: then we can benefit from this technology. They had 503 00:32:04,440 --> 00:32:07,000 Speaker 1: figured out that the United States had some sort of 504 00:32:07,200 --> 00:32:11,400 Speaker 1: beneficial relationship with Crypto. I'm not sure if they knew 505 00:32:11,400 --> 00:32:13,400 Speaker 1: the full extent of it, but they at least knew 506 00:32:13,480 --> 00:32:15,800 Speaker 1: that there was some buddy-buddy stuff going on there, 507 00:32:16,120 --> 00:32:18,960 Speaker 1: and they wanted to get in on that action. Hagelin 508 00:32:19,120 --> 00:32:22,240 Speaker 1: rejected this initial offer and told the CIA 509 00:32:22,240 --> 00:32:25,160 Speaker 1: about it. So then we get to nineteen seventy, and 510 00:32:25,200 --> 00:32:30,240 Speaker 1: then two really big things happen.
First, Bo Hagelin, Boris's 511 00:32:30,360 --> 00:32:35,000 Speaker 1: son, would die in a car accident. Uh, and no, 512 00:32:35,440 --> 00:32:39,240 Speaker 1: conspiracy theorists, it does not appear that this was, you know, 513 00:32:39,320 --> 00:32:43,200 Speaker 1: engineered or manufactured in some way. Uh. It appears to 514 00:32:43,280 --> 00:32:46,640 Speaker 1: have been just a car accident, and Bo dies as 515 00:32:46,680 --> 00:32:50,640 Speaker 1: a result of this. The CIA cooperates with West Germany's 516 00:32:50,760 --> 00:32:55,280 Speaker 1: Federal Intelligence Service, also known as the BND. It's 517 00:32:55,320 --> 00:32:59,520 Speaker 1: called that because in German, federal intelligence service is a different, 518 00:33:00,040 --> 00:33:02,000 Speaker 1: very long word that I am not even going to 519 00:33:02,080 --> 00:33:06,080 Speaker 1: attempt to pronounce, and they create an agreement in which 520 00:33:06,120 --> 00:33:11,240 Speaker 1: these two agencies would co-own the company in secret. Uh. 521 00:33:11,400 --> 00:33:14,640 Speaker 1: The CIA told West Germany, hey, we'll totally go in 522 00:33:14,800 --> 00:33:17,000 Speaker 1: with you on this one, but you've got to 523 00:33:17,040 --> 00:33:20,120 Speaker 1: cut France out of the deal, and West Germany said, 524 00:33:20,560 --> 00:33:28,560 Speaker 1: uh, okay. Bye, France. Uh, auf Wiedersehen. Hagelin would 525 00:33:28,560 --> 00:33:31,280 Speaker 1: be presented with this deal and would agree to the terms, 526 00:33:31,360 --> 00:33:35,240 Speaker 1: and the agencies would rely upon a company in Liechtenstein 527 00:33:35,840 --> 00:33:39,680 Speaker 1: that was called Marxer and Goop at the time.
528 00:33:39,920 --> 00:33:42,840 Speaker 1: Great name. But Marxer and Goop would draw up the 529 00:33:42,920 --> 00:33:46,120 Speaker 1: agreement in such a way that the agencies' identities would 530 00:33:46,120 --> 00:33:49,280 Speaker 1: be protected through a series of shell companies and other, 531 00:33:49,800 --> 00:33:54,160 Speaker 1: you know, obfuscation. So even if you were 532 00:33:54,200 --> 00:33:56,680 Speaker 1: to dig into it, you would not be able to 533 00:33:56,760 --> 00:33:59,720 Speaker 1: see that the CIA and BND 534 00:34:00,120 --> 00:34:03,600 Speaker 1: were co-owners of this company. Instead, you would get 535 00:34:03,640 --> 00:34:06,240 Speaker 1: this sort of runaround, you know, a 536 00:34:06,280 --> 00:34:10,000 Speaker 1: wild goose chase about the ownership of Crypto. It would 537 00:34:10,040 --> 00:34:12,960 Speaker 1: not appear to be owned by any intelligence agencies. 538 00:34:13,400 --> 00:34:17,759 Speaker 1: So Hagelin sold his company for just under six million dollars. Uh. 539 00:34:18,000 --> 00:34:22,440 Speaker 1: He would pass away in nineteen eighty three after a very long illness, 540 00:34:22,880 --> 00:34:26,520 Speaker 1: so he kind of leaves our story. But meanwhile, the 541 00:34:26,640 --> 00:34:29,920 Speaker 1: two intelligence agencies now had secret control of a company 542 00:34:29,920 --> 00:34:34,040 Speaker 1: that manufactured products meant to make communications secret. I think 543 00:34:34,080 --> 00:34:36,799 Speaker 1: you can see where this is going, right? If 544 00:34:37,280 --> 00:34:40,960 Speaker 1: your agency is all about uncovering secrets, and then 545 00:34:40,960 --> 00:34:44,560 Speaker 1: you get control of a leading company that makes stuff 546 00:34:44,560 --> 00:34:48,560 Speaker 1: that's supposed to keep things secret, you're like a kid 547 00:34:48,560 --> 00:34:50,680 Speaker 1: in a candy store.
I mean, it was like 548 00:34:50,760 --> 00:34:54,000 Speaker 1: they were selling locks to everyone in the world, 549 00:34:54,000 --> 00:34:56,200 Speaker 1: but they were holding on to all the skeleton keys 550 00:34:56,200 --> 00:35:00,799 Speaker 1: that would give them access to those locks. It was incredible. Now, 551 00:35:00,840 --> 00:35:04,160 Speaker 1: I should be clear that the list of clients for 552 00:35:04,280 --> 00:35:08,600 Speaker 1: Crypto did not include everybody. Not everyone in the world 553 00:35:09,160 --> 00:35:12,239 Speaker 1: was eager to purchase the products from this company. Two 554 00:35:12,920 --> 00:35:16,280 Speaker 1: potential customers in particular were not on the list. China 555 00:35:16,640 --> 00:35:20,600 Speaker 1: and Russia had both been suspicious about Crypto for years by 556 00:35:20,640 --> 00:35:24,759 Speaker 1: the time the CIA gained partial ownership, so they did 557 00:35:24,800 --> 00:35:29,719 Speaker 1: not purchase those products. They figured something was up. 558 00:35:30,000 --> 00:35:34,320 Speaker 1: But other countries, including lots of US allies, were Crypto 559 00:35:34,400 --> 00:35:39,200 Speaker 1: customers, frequent ones. While these two agencies would share ownership 560 00:35:39,239 --> 00:35:41,359 Speaker 1: of the company for a couple of decades, things were 561 00:35:41,400 --> 00:35:45,080 Speaker 1: not always super smooth between them. The West Germans noted 562 00:35:45,160 --> 00:35:47,720 Speaker 1: in their own history about the project that was shared 563 00:35:47,760 --> 00:35:51,360 Speaker 1: with The Washington Post that the Americans were eager to 564 00:35:51,440 --> 00:35:57,000 Speaker 1: spy on everybody, really, enemy or ally alike.
The West 565 00:35:57,040 --> 00:36:00,680 Speaker 1: German officials were really focusing on countries that 566 00:36:00,760 --> 00:36:05,200 Speaker 1: were not allies, but the Americans wanted to snoop on everybody. 567 00:36:05,480 --> 00:36:09,920 Speaker 1: CIA historians, meanwhile, note that the American officials felt that 568 00:36:09,960 --> 00:36:13,600 Speaker 1: the West Germans were more interested in running Crypto as 569 00:36:13,640 --> 00:36:16,560 Speaker 1: a straightforward business to earn money, and they were looking 570 00:36:16,600 --> 00:36:18,719 Speaker 1: at it as a revenue generator, not as a way to, 571 00:36:19,239 --> 00:36:22,840 Speaker 1: you know, dip into secrets. So both the CIA and 572 00:36:22,920 --> 00:36:26,200 Speaker 1: the BND would take in millions of dollars 573 00:36:26,200 --> 00:36:29,160 Speaker 1: over the years as they operated Crypto, and they would 574 00:36:29,160 --> 00:36:32,600 Speaker 1: pour that money into other projects around the world. So 575 00:36:32,640 --> 00:36:36,480 Speaker 1: if you ever wondered how some CIA operations appear to 576 00:36:36,520 --> 00:36:39,719 Speaker 1: happen under the radar, it's not all just, you know, 577 00:36:40,080 --> 00:36:43,440 Speaker 1: dark deals that are behind closed doors in D.C. 578 00:36:44,080 --> 00:36:46,799 Speaker 1: Some of that money comes straight from 579 00:36:47,520 --> 00:36:52,239 Speaker 1: CIA-backed operations that appear to be, you know, honest businesses. 580 00:36:52,719 --> 00:36:56,080 Speaker 1: So that's fun. We're going to take a break for 581 00:36:56,480 --> 00:36:59,759 Speaker 1: actual honest businesses, but we'll be right back after these 582 00:36:59,800 --> 00:37:11,200 Speaker 1: messages. So in the CIA history for 583 00:37:11,239 --> 00:37:14,120 Speaker 1: this project, and I have not read the entire history 584 00:37:14,160 --> 00:37:18,040 Speaker 1: because it was not made available.
The Post was only 585 00:37:18,120 --> 00:37:22,120 Speaker 1: granted the right to produce excerpts from the report, not 586 00:37:22,200 --> 00:37:26,600 Speaker 1: the entire report. But the agency refers to Crypto with 587 00:37:26,680 --> 00:37:30,440 Speaker 1: a code name. That code name is Minerva, and the 588 00:37:30,560 --> 00:37:35,440 Speaker 1: project of running Crypto in an effort to, uh, 589 00:37:35,600 --> 00:37:39,319 Speaker 1: produce equipment that could be exploited around the world had 590 00:37:39,520 --> 00:37:42,680 Speaker 1: two different code names. The first one was Thesaurus 591 00:37:43,120 --> 00:37:48,040 Speaker 1: and the second one was Rubicon. Uh. So German intelligence 592 00:37:48,080 --> 00:37:52,680 Speaker 1: agents would later bring in officials from Siemens, the company, 593 00:37:52,800 --> 00:37:58,920 Speaker 1: to serve as advisors, technical advisors and entrepreneurial advisors, 594 00:37:59,160 --> 00:38:02,440 Speaker 1: for Crypto, and in return, Siemens would get five percent 595 00:38:02,760 --> 00:38:07,480 Speaker 1: of Crypto's sales. The Americans brought in Motorola to 596 00:38:07,760 --> 00:38:10,520 Speaker 1: take some of Crypto's products and to tweak them to 597 00:38:10,600 --> 00:38:15,160 Speaker 1: make them, you know, work better, make them more commercially viable. 598 00:38:15,880 --> 00:38:21,839 Speaker 1: So we've got two intelligence agencies and two major companies 599 00:38:22,120 --> 00:38:26,200 Speaker 1: all working together as part of this, and all indications 600 00:38:26,280 --> 00:38:29,399 Speaker 1: seemed to point to the fact that at least some people in those 601 00:38:29,440 --> 00:38:33,520 Speaker 1: two big companies knew what was up.
By the nineteen eighties, 602 00:38:33,880 --> 00:38:37,359 Speaker 1: more than half of all the intelligence gathered by the 603 00:38:37,400 --> 00:38:40,720 Speaker 1: CIA that came from places other than China or Russia 604 00:38:41,160 --> 00:38:45,760 Speaker 1: was encrypted by Crypto machines. So when you look at 605 00:38:45,800 --> 00:38:50,359 Speaker 1: all the information that the CIA was bringing in, uh, 606 00:38:50,440 --> 00:38:53,360 Speaker 1: if it wasn't from Russia and if it wasn't from China, 607 00:38:53,560 --> 00:38:55,719 Speaker 1: more than half of the information had passed through a 608 00:38:55,760 --> 00:38:59,640 Speaker 1: Crypto machine, meaning that the CIA could decrypt it and 609 00:38:59,719 --> 00:39:02,759 Speaker 1: read the underlying messages. There were some times where they said 610 00:39:02,760 --> 00:39:06,279 Speaker 1: that they could read messages from certain countries with eighty 611 00:39:06,520 --> 00:39:11,240 Speaker 1: to ninety percent success, which is pretty phenomenal in the world 612 00:39:11,239 --> 00:39:15,440 Speaker 1: of cryptography and code breaking. While neither Russia nor China 613 00:39:15,719 --> 00:39:19,000 Speaker 1: would use Crypto devices, a lot of countries that were 614 00:39:19,080 --> 00:39:22,839 Speaker 1: dealing with Russia and China did use 615 00:39:22,920 --> 00:39:26,440 Speaker 1: Crypto devices, so the CIA was able to learn a 616 00:39:26,480 --> 00:39:30,279 Speaker 1: lot about operations going on in Russia and China indirectly 617 00:39:30,560 --> 00:39:33,360 Speaker 1: through that means.
This is also a good time to 618 00:39:33,400 --> 00:39:36,480 Speaker 1: point out a parallel in our daily lives, which is 619 00:39:36,520 --> 00:39:42,400 Speaker 1: that even if the content of our messages is safe, 620 00:39:43,120 --> 00:39:47,360 Speaker 1: the act of sending messages can sometimes provide enough information 621 00:39:47,400 --> 00:39:53,080 Speaker 1: for people to draw some pretty accurate conclusions. It shows 622 00:39:53,160 --> 00:39:57,920 Speaker 1: us that metadata is really an important thing to remember. 623 00:39:58,719 --> 00:40:03,799 Speaker 1: Metadata is information about information, and sometimes you don't 624 00:40:03,800 --> 00:40:06,400 Speaker 1: need to know the content of something in order to 625 00:40:06,480 --> 00:40:12,239 Speaker 1: draw some pretty damaging or valuable conclusions. I guess it 626 00:40:12,239 --> 00:40:15,920 Speaker 1: all depends upon your perspective. So this is kind of 627 00:40:15,920 --> 00:40:18,200 Speaker 1: an example of that: even though Russia and China 628 00:40:18,280 --> 00:40:21,720 Speaker 1: weren't using Crypto devices, countries that were dealing with Russia 629 00:40:21,719 --> 00:40:24,000 Speaker 1: and China were, and that meant the CIA could read 630 00:40:24,040 --> 00:40:28,359 Speaker 1: at least that side of the messages. In nineteen eighty one, 631 00:40:28,440 --> 00:40:32,800 Speaker 1: Saudi Arabia would become the biggest Crypto customer, and it 632 00:40:32,840 --> 00:40:35,759 Speaker 1: would play a very important role. The Crypto technology would 633 00:40:35,760 --> 00:40:38,200 Speaker 1: play a very important role in the Middle East.
This 634 00:40:38,360 --> 00:40:41,720 Speaker 1: also leads to a point in the Washington Post article 635 00:40:42,120 --> 00:40:47,480 Speaker 1: where the authors state that it's kind of an open 636 00:40:47,600 --> 00:40:51,160 Speaker 1: question as to how much the CIA knew about different 637 00:40:51,719 --> 00:40:56,120 Speaker 1: operations around the world throughout this time, and what the 638 00:40:56,160 --> 00:41:00,520 Speaker 1: agency did or didn't do in preparation for the events, 639 00:41:00,560 --> 00:41:03,600 Speaker 1: like whether or not they should have acted in some cases, 640 00:41:03,600 --> 00:41:06,239 Speaker 1: like if they were aware of an assassination attempt, did 641 00:41:06,320 --> 00:41:09,759 Speaker 1: they do anything to prevent that or to let anyone know? 642 00:41:10,320 --> 00:41:13,000 Speaker 1: And if not, was it just because they were worried 643 00:41:13,000 --> 00:41:17,319 Speaker 1: about compromising the fact that they knew about this information? 644 00:41:17,520 --> 00:41:21,360 Speaker 1: At what point does the value go away from knowing 645 00:41:21,400 --> 00:41:23,960 Speaker 1: information if you don't act on that information? These are 646 00:41:23,960 --> 00:41:26,040 Speaker 1: big questions that are not answered in the article, by 647 00:41:26,040 --> 00:41:28,160 Speaker 1: the way, uh, and they bring up a lot of 648 00:41:29,440 --> 00:41:34,360 Speaker 1: deep ethical problems with what was going on. So Crypto 649 00:41:34,440 --> 00:41:37,440 Speaker 1: would also receive a lot of direction from the CIA 650 00:41:37,600 --> 00:41:43,960 Speaker 1: and from the BND to actively try and disparage competitors, to 651 00:41:44,080 --> 00:41:50,800 Speaker 1: essentially run marketing campaigns that said, you know, cryptography devices 652 00:41:50,880 --> 00:41:54,000 Speaker 1: from such and such a company are total crap. Don't 653 00:41:54,000 --> 00:41:58,200 Speaker 1: buy them.
Speaker 1: Come to us, buy our stuff, we're secure. Crypto executives were also encouraged to bribe government officials to adopt Crypto tech, so there are some pretty awful stories about executives doing all sorts of stuff to, you know, bribe governments from all over the world into adopting Crypto technology. Sleazy stuff. Really makes me proud. US President Ronald Reagan inadvertently revealed that the US had intercepted and decrypted communications sent from a Libyan embassy in East Berlin to Tripoli, and that tipped off Libya that something was up, right? America somehow was able to decrypt messages, and considering the company Libya was relying upon for its cryptography, that started to raise some doubts about Crypto's trustworthiness, and not just with Libya; other countries took notice, too. Employees at Crypto, meanwhile, didn't know about the arrangement. They were working under the assumption that they were actually making genuine, reliable cryptography equipment. And occasionally an employee might look at something and say, huh, this is weird; based upon what I know, this algorithm we're using, or this system we're using, has vulnerabilities.
Speaker 1: There are problems with it; we should fix those before we ship this, because we could make it more secure. They would get discouraged from doing that; they would be told not to implement solutions. In one case, it went much further than that. There was an employee named Peter Frutiger who was very frustrated with what was going on. He felt that Crypto was being complacent, or maybe negligent, in not responding to very real concerns that Frutiger was hearing from clients in Damascus. His clients in Damascus were complaining about their equipment, so he went to Damascus and he fixed their Crypto equipment. In other words, he removed the vulnerabilities that had been engineered into this stuff, and the Crypto CEO at the time would fire Frutiger as a result, because Frutiger had messed things up: he had actually made what was supposed to be a secure system an actually secure system.
Speaker 1: Of course, he didn't know that that was against the goals of the operation itself, and the CIA got very mad at the CEO of Crypto at that point, saying that he should have found a way to bring Frutiger into the fold and smooth things over, rather than fire him, because firing him brought undue scrutiny to Crypto and its activities. Crypto also hired an electrical engineer named Mengia Caflisch, and I'm sure I'm butchering these names, and I do apologize. That upset the NSA this time, not the CIA but the NSA, because the NSA knew about this electrical engineer, and they said, she is way too smart; she's going to figure out something's going on; you should not hire her. But Crypto hired her anyway, because she was brilliant and was seen as a valuable asset. Turns out she was brilliant.
Speaker 1: She still is brilliant, and she kept trying to initiate fixes and improvements because she kept finding weaknesses and vulnerabilities in the systems, but she was always discouraged from actually implementing solutions. She wondered what was going on, but she was a little worried about speaking up because she wasn't sure exactly what the extent of it was. The company would actually produce a machine using an algorithm she had designed that the NSA could not crack. So the NSA reached out to the CIA, and the CIA ordered Crypto to stop the manufacturing process, saying, we can't produce these machines because we can't crack the code; you've got to break it. So only fifty or so of these machines were actually manufactured. The company wound up selling those to banks, because the thought was, well, banks have a need for security, and we don't really need to snoop on them; that's not where our concern is. But from now on, when you make this device, make it with the algorithm that's broken on purpose, because we want to be able to crack those codes.
Speaker 1: So that's pretty dodgy. Anyway, there was also a mathematics professor from Stockholm whose name I would butcher terribly. He actually studied in the United States, and his American family, like me, would have had trouble saying his name, so they called him Henry: Henry Widman. He was brought in to craft more sophisticated, but still vulnerable, algorithms. He was actually told about the real relationship between the CIA, the BND, and Crypto. He was given the inside scoop and asked to become part of the team, and his purpose was to design algorithms that looked really super secure but secretly weren't. So he was trying to make stuff that appeared to be more on the up and up but in fact had vulnerabilities built into it, and meanwhile to have those vulnerabilities designed in such a way that there was plausible deniability. In other words, if someone found the vulnerability, you could say, oh, that's due to human error, or it was an implementation error, but it was not put there on purpose, even though it totally was.
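To illustrate the idea of a deliberately weakened but deniable algorithm, here is a toy Python sketch. Nothing in it reflects Crypto AG's actual designs, which the reporting doesn't spell out; it's a hypothetical cipher that advertises an 8-byte key but quietly uses only the last 2 bytes, shrinking the search space from 2^64 down to 2^16, small enough for an insider to brute-force instantly, while the bug could plausibly be passed off as an implementation slip.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Generate n bytes of keystream from an 8-byte key.

    Looks like it uses the whole key, but a slice that could pass for
    an off-by-one mistake quietly discards all but the last 2 bytes.
    """
    effective = key[-2:]  # deliberate weakness: 16-bit effective key space
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(effective + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, msg: bytes) -> bytes:
    # XOR stream cipher; encryption and decryption are the same operation.
    return bytes(a ^ b for a, b in zip(msg, keystream(key, len(msg))))

ct = encrypt(b"\x13\x37\xca\xfe\xba\xbe\x00\x42", b"ATTACK AT DAWN")

# An insider who knows the flaw needs at most 2**16 guesses, not 2**64:
for guess in range(2**16):
    k = bytes(6) + guess.to_bytes(2, "big")
    if encrypt(k, ct) == b"ATTACK AT DAWN":
        break  # key space recovered almost instantly
```

To an outside auditor the cipher looks like a 64-bit-keyed stream cipher; only someone told about the slice knows the real search space is 65,536 keys.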
Speaker 1: The CIA used Crypto communications to suss out where Manuel Noriega was, based off communications from the Vatican. They intercepted those communications, decoded them, and were able to find Noriega as a result. Later, Iran arrested a Crypto salesman named Hans Buehler, and Buehler didn't know about the relationship between Crypto and the CIA or the BND. He had no knowledge of any of that. He was literally an innocent salesman who thought he was selling legit cryptographic equipment. Iran had figured out something was going on; they had been suspicious ever since that incident with Libya I mentioned earlier. And so they arrested him, and they essentially tortured him for nine months. Iran demanded a one-million-dollar ransom from Crypto, and the company did pay it. The CIA did not chip in, because the CIA has a policy against paying ransoms. We don't negotiate with terrorists, is the way America would put it. So this guy suffered for nine months in captivity before Crypto would pay the ransom and get him back. And he legitimately didn't know anything.
Speaker 1: He didn't know that the relationship existed, but he certainly suspected it by the time he was released, and he was worried about the fact that this foreign government seemed to know more about the company he was working for than he did. He ended up going to the press and talking about his experiences, and it caused a bit of a stir in Europe. The CIA would actually refer to this entire incident with a code name; that code name was Hydra. So that's fun. Around that same time, Germany was reunified, right? The Soviet Union fell, East Germany and West Germany unified into Germany, the Berlin Wall came down, and it was around that same time that the BND felt that Crypto's usefulness had pretty much expired, and that now it was more of a risk: if the full extent of the BND's involvement in Crypto's activities were known, that could put Germany at risk. And so they ended up selling off their interest in Crypto to the CIA for around seventeen million dollars.
Speaker 1: From that point forward, Crypto secretly operated as a CIA-backed operation. The CIA had full ownership from the early 1990s until 2018. That's when the CIA would liquidate the company and sell it off to other companies. The reason they did that is that by the time 2018 rolled around, the cryptographic community was very different. It was no longer so dependent upon standalone machines, electronic or otherwise. A lot of solutions are software-based or web-based; they're not based on physical equipment. So the usefulness of Crypto as a company had pretty much gone out the window. It had provided the CIA with a ton of information, but there was no need to keep it running, so they sold it off for parts, essentially. And, you know, part of me says, this is spy stuff. Of course spies are going to be sneaky; that's what spies do. Spies operate in a way where they are trying to avoid detection while they try to figure out what everyone else knows. That is the nature of spying, and everybody does it.
Speaker 1: At the same time, there's something really sinister about secretly owning a security firm and using it to do the opposite of what the security firm says it's doing, right? It says it's protecting secrets, but in reality it's leaving those secrets open for people to see. Now, I mentioned Huawei at the beginning of this episode, and the reason I did that is because, again, around the same time this story was breaking, we were hearing about how Huawei, the Chinese telecommunications company, has had back door access to networks it has rolled out for a decade. Huawei makes all sorts of telecommunications equipment, including components for networks; they are a leading provider of 5G components, for example. And there's been a concern around much of the world, but particularly in the United States, that this would mean that Huawei as a company would have at least some capability of snooping on communications that go across those networks.
Speaker 1: And since Huawei has some connections to the communist government of China, because China requires companies that operate in China to have this connection, the worry is that those networks would be used specifically as surveillance tools. And with America, you can kind of understand where they're coming from, because that's what Americans do. Like, if you're the one who's spying on everybody, you probably are really paranoid about everyone spying on you. It's just kind of how it works. Also, again, that report showed that for ten years, Huawei actually did have that capability. Whether they did anything with it or not is still an open question. But with Huawei, the story goes that they were building in these back door access channels for law enforcement officials. You know, law enforcement wants to have that kind of access so that if they're conducting an investigation, they can look into communications going between various suspects and better do their investigations. The problem is that Huawei was not just building these in for law enforcement, but was retaining its own access to those channels.
Speaker 1: And again, whether it was using that access or not, I don't know, but the story goes that they were actually retaining that ability. And this leads me to another point I want to make before I conclude, which is that back door channels are always a terrible idea. Always, always, always, always. They inherently make systems less secure. So if your job is to make a secure system, building in a way to bypass that security means you might as well not have any security. It's a terrible idea. I get why law enforcement and intelligence agencies want it, because information is valuable, and getting access to the information could mean the difference between life and death in some cases; it really can. But if you have those back door channels, it means that you don't have to go through the whole security process, and it means that someone else might potentially discover that and exploit it. So, one, you've got the danger of the authorized parties abusing this power.
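A quick sketch of why any built-in bypass undermines the whole system. This is hypothetical Python, not any real product's code: a message tag verifies against either the customer's own secret or a fixed "maintenance" key baked into every unit. The moment that fixed key leaks, every deployment that shares it is wide open.

```python
import hmac
import hashlib

CUSTOMER_SECRET = b"per-customer-secret"  # the advertised protection
MAINTENANCE_KEY = b"vendor-bypass-key"    # hypothetical built-in backdoor

def sign(message: bytes, key: bytes) -> bytes:
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Accept a tag made with the customer's secret -- or, quietly,
    one made with the vendor's fixed maintenance key."""
    if hmac.compare_digest(tag, sign(message, CUSTOMER_SECRET)):
        return True
    # The backdoor: one hardcoded key validates on every installation.
    return hmac.compare_digest(tag, sign(message, MAINTENANCE_KEY))

# A third party who learns the maintenance key forges valid traffic
# without ever touching any per-customer secret:
forged = sign(b"open the gates", MAINTENANCE_KEY)
assert verify(b"open the gates", forged)
```

Notice that the customer's secret can be as strong as you like; the system's real security is capped at the secrecy of one shared key that the customer never chose and cannot rotate.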
Speaker 1: Right, you've got the potential for an agency committing overreach, like we've heard about the NSA and how that agency was collecting way more information than it should have been able to, including information from people who weren't under any direct surveillance, and how that can be abused. That's a terrible thing. So you don't want that capability; you don't want some agency that has authorized back door access to abuse that power. You also don't want some third party that is not authorized at all finding out about that back channel and figuring out how to access it, because now your secure system has no security. So I guess the end message I want to give everybody is: protect yourself as best you can, which is increasingly difficult when we don't necessarily know who is behind the systems that actually make the security we depend upon.
Speaker 1: Another great example people have pointed out is: should we trust the security company Kaspersky, which comes from Russia? Or is it possible that it could be a state-backed operation that is slowly or quietly sowing vulnerabilities into the products people are using? I have not seen any specific reports on that; I'm just seeing people ask the question. But that leads us to start asking questions about everything. Probably not a bad idea, but it starts to create this situation where we're not trusting anything, and at the end of the day, you either have to trust somebody, or you've got to just kind of disengage, or, I guess, resign yourself to the idea that all of your stuff is going to be findable and readable by everyone at some point or another. Happy days. That wraps up this episode of TechStuff, and this is a pretty heavy topic.
Speaker 1: So in our next episode, I'm going to have a special guest join us, at least that's the plan, and we're going to have a conversation about misinformation on the Internet and how it can quickly get spread and evolve in rapid succession to the point where it's passed along as gospel. But that will be for our next episode. If you have suggestions for future topics I should cover on TechStuff, reach out to me on Facebook or Twitter; I use the handle TechStuffHSW at both. I look forward to hearing from you, and I'll talk to you again really soon. TechStuff is a production of iHeartRadio's How Stuff Works. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.