Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? I am currently on vacation. Hopefully I'm having a good time, and I hope you're having a good time wherever you happen to be as well. But in the meantime, I thought I would bring you an episode that we published originally on February seventeenth, twenty twenty. It's titled When Secrets Aren't Secret. This is about the curious case of the CIA, the Central Intelligence Agency of the United States of America, owning and operating an encryption company, which seems to be a bit of a conflict. Let's listen in. So I want to talk about the business of communication and secrets, and also the business of eavesdropping, and why all of this gets real dodgy, real fast. So the initial story doesn't involve China or 5G networks. It goes further back than that. It actually concerns a Swiss company called Crypto AG and its ties to the Central Intelligence Agency, aka the CIA, in the United States. The story is all about the battle between secrecy and surveillance, and it's also about trust, as in, whom do you trust when you want to send a secure communication to someone else? If you're using some sort of technology to encrypt your stuff, who makes that encryption, you know, strategy, whether it's software or an actual device or whatever it may be? Who's making that, and can they be trusted? And as it turns out, those are more difficult questions to answer than would readily seem apparent. Now, the story for this really begins with a Swedish inventor named Arvid Gerhard Damm, who was born in eighteen sixty nine. He worked in textile mills before he would start creating his own version of a cipher machine sometime around nineteen fifteen or so. So what the heck is a cipher machine? Heck, what's a cipher? Well, a cipher is a code. It's a way of hiding the meaning of a message.
Speaker 1: And there are a lot of different approaches to encoding information, and there are a lot of strategies that actually employ multiple versions of this, multiple schemes. So, for example, one way to have a code is to use words that refer to something else. So instead of saying a military tank, you might say Thomas. You know, because you've got Thomas the Tank Engine, and you go from Thomas the Tank Engine to military tank, and there you are. So if you referred to a Thomas, you might be talking about a tank. That would be a very bad code, or at least a very easy to decipher code. But that's a version of codes where you have a codebook that tells you what certain words or phrases are actually meant to convey. Then you have ciphers, in which you replace the letters of a message with some other letter or symbol, and the simplest of these is a shift cipher, sometimes also called a Caesar cipher. And with these ciphers, you write out a message, but you shift all the letters some predetermined number down or up the alphabet. So if you had a shift cipher with just one shift, one step, that would mean that you would use the letter B to represent the letter A, you would use the letter C to represent the letter B, and so on down the alphabet. So if someone else were to get hold of the message, at a casual glance the message would appear to be gibberish. But of course that particular cipher is super easy to decode, even if you are shifting further up or down the alphabet. Let's say you're shifting up ten spots instead of one. Well, just because of the nature of language, someone with even a little bit of patience would probably be able to break that code pretty quickly.
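To make the shift cipher concrete, here is a minimal sketch in Python. Nothing in it comes from the episode itself; the function name, the three-letter shift, and the sample message are all just illustrative choices.

```python
# A minimal sketch of a Caesar (shift) cipher, for illustration only.
import string

ALPHABET = string.ascii_uppercase  # 'A' through 'Z'

def caesar(message: str, shift: int) -> str:
    """Shift every letter of `message` by `shift` places around the alphabet."""
    out = []
    for ch in message.upper():
        if ch in ALPHABET:
            out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

secret = caesar("ATTACK AT DAWN", 3)   # -> 'DWWDFN DW GDZQ'
plain = caesar(secret, -3)             # decoding is just the opposite shift
print(secret, "->", plain)
```

And the weakness described above is easy to see here: there are only twenty-five useful shifts, so an eavesdropper can simply try them all until one produces readable text.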
Speaker 1: Well, in the early twentieth century, inventors were working on mechanical systems that would create stronger ciphers, and initially these were mostly thought of as a way to protect business communications, like financial communications between banks, for example, or sometimes political messages between different parts of the world, like a government and its embassy in another country. But over time they would be adopted by militaries around the world to send secret communications back and forth between headquarters and units in the field, and these communications needed to be much more secure than a Caesar cipher could potentially offer. So the basic idea behind these cipher machines was that you would have a device. Sometimes it would look like a typewriter. Sometimes it would have a hand crank on it, but typically there'd be at least one dial, if not several dials, and perhaps some other components that would allow the operator to set the machine to establish the cipher. So you choose your settings, and then the operator would take a message that is meant to be encoded and put it through this machine in some way. Maybe they're using a keyboard, maybe they're using a series of keys and levers. However it may be, they're actually typing out the message in plain text. But the cipher machines would have some sort of gears or chains or other systems that would turn with each letter typed, and it would change the cipher as it did so, change the nature of it. And this was a really clever way to confound code breakers, particularly if the machine was really well designed. So let's say you are an operator and you have the word book that you need to encode using one of these machines. So you have one of these particular machines. You type the letter B into the device, which, because of the settings for this particular session, will now print out the letter G. So the letter G means B with this particular cipher.
Speaker 1: The gears inside the machine turn after you've typed in the letter B, which prints out as G, so now the cipher is actually different. You type in the first O in book and you get another G because of the way the cipher works. Then the gears turn again. You type in the second O, and now the machine prints out the letter F. The gears turn again, you type in the letter K, and you get a printout of K. So the printed word says GGFK rather than book. Well, to decode the message, you would typically need the same sort of machine that was used to encode it, and you would need to know what settings the operator had been using when they started the message, and you would have to set up your machine to mirror that. Then you would take the encoded message and start typing that out, and the process would essentially reverse itself, and it would allow the operator to read out the original message. So in our example, the operator on the other side would take GGFK and enter that into their machine, and they would get the printout book. Now, a couple of caveats here. Not all cipher machines are created equal, right, or were used to their best advantage. Sometimes people made bad decisions when it came to either designing cipher machines or implementing them. For example, the bigwigs might decide that under no circumstance would you ever have a letter represented by itself. You would never allow that to happen. So in the example I just gave, where GGFK means book, that last K wouldn't work. You would have to have the device go to a different letter, because it would not allow itself to replicate a letter with a representation of itself. Other rules that could cause problems down the road might be a rule against the doubling of letters, like the GG in GGFK.
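Here is a toy Python sketch of that BOOK-to-GGFK walkthrough. The episode never specifies a real machine's internals, so this is an assumption-heavy illustration: the "gears" are modeled as a simple list of shift values, and the values [5, 18, 17, 0] were chosen only because they reproduce the GGFK example above.

```python
def step_cipher(message: str, keystream: list, decode: bool = False) -> str:
    """Toy stepping cipher: the shift changes with every letter typed,
    standing in for the gears that turn inside the machine."""
    out = []
    for i, ch in enumerate(message.upper()):
        shift = keystream[i % len(keystream)]  # the "gears" advance each letter
        if decode:
            shift = -shift                     # decoding mirrors encoding
        out.append(chr((ord(ch) - 65 + shift) % 26 + 65))
    return "".join(out)

# Shifts reverse-engineered so the episode's example works out: BOOK -> GGFK.
key = [5, 18, 17, 0]
ct = step_cipher("BOOK", key)            # -> 'GGFK'
pt = step_cipher(ct, key, decode=True)   # -> 'BOOK'
print(ct, "->", pt)
```

Notice that the last shift is zero, so K enciphers to itself, which is exactly the outcome that a "no letter may represent itself" rule would forbid.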
Speaker 1: And the reason that these are problems is that if you have a code breaker who's really looking at these codes closely, and that code breaker starts to figure out that there are restrictions to the code, they can build that into their code breaking models in an effort to crack the code. Because as you put in restrictions, that means you're reducing variables. And anyone who has worked in any sort of mathematics, particularly stuff like algebra, knows that to solve complicated problems you need to reduce variables. As you reduce variables, you make it easier to solve problems. So it was actually this sort of thing that would lead to the British cryptographers breaking German codes during World War Two. It wasn't that the technology itself was necessarily faulty. It was that the Germans were kind of using bad methodology with some of their equipment, and that's what gave an inroad to code breakers. Now, if you want to learn way more about how these machines actually work, you can listen to TechStuff Ponders an Enigma. That's a classic episode that originally published way back on October nineteenth, twenty eleven, and I actually did a TechStuff Classic rerun of that episode on October twelfth, twenty eighteen. The Enigma machine is the most famous cipher device that was made in the early twentieth century. It was made and used by the Germans, and it was used extensively by the German military during World War Two. And in that podcast, my old co-host Chris Pollette and I talk about how a really good cipher, one that's super hard to crack, is also a pain in the patookas to use because of that complexity, and that's mainly why officials would put rules in place that would ultimately serve as the downfall of their technology, because using the tech without those rules in place was possible, but not always fast enough to be practical. This would prove to be a problem with cryptography in general.
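As a hedged illustration of that point, here is a small Python sketch of crib dragging against a cipher that never lets a letter encrypt to itself, the same restriction Enigma had. The ciphertext string is invented for the example.

```python
def possible_alignments(ciphertext: str, crib: str) -> list:
    """Return the start positions where a suspected plaintext word (a "crib")
    could line up with the ciphertext, given a machine that never encrypts a
    letter to itself. Any alignment where even one letter matches is impossible."""
    hits = []
    for start in range(len(ciphertext) - len(crib) + 1):
        window = ciphertext[start:start + len(crib)]
        if all(c != w for c, w in zip(crib, window)):
            hits.append(start)  # no letter coincides, so still possible
    return hits

ciphertext = "XBTPOLWEYRHTSAQNMC"  # invented ciphertext, for illustration only
print(possible_alignments(ciphertext, "WEATHER"))
```

Every alignment thrown out is one less possibility the code breaker has to test, which is what reducing variables buys you.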
Speaker 1: You want a system that's secure enough that you're reasonably certain a person who intercepts the message would be unable to make head or tail of it. Right? That's the whole purpose of cryptography, to make any unauthorized person incapable of reading the message. But you also want your solution to be practical enough that your intended recipient can decode the message with a minimum of fuss, particularly if it relates to a time sensitive issue. So in this case, you had Germans using the same settings on their Enigma machines for longer than they were supposed to, or they were co-locating codebooks with the Enigma machines, and those fell into Allied hands, and the Allies were able to use those to decode messages. To this day, balancing out practical applications with security remains a challenge. It may make it take longer for a message to get through from one point to another, which a lot of people don't accept in the age of information traveling at the speed of light. Or it just may be a pain to encrypt and decrypt, which also ends up becoming a barrier to adoption and implementation. Okay, let's get back to our story. So it's the nineteen tens, right? It's around nineteen fifteen. Arvid Gerhard Damm has patented an encryption device. He got that patent by nineteen nineteen, and to manufacture and market the device, Damm would work with business partners to create a company originally called Cryptograph, or AB Cryptograph. And one of Damm's investors was a guy named Karl Wilhelm Hagelin, who had made his money in Russia in the oil business. But then the Russian Revolution happened, and Hagelin fled with his family and returned to his homeland of Sweden. Boris Hagelin was Karl Wilhelm Hagelin's son, and Boris was given a position in Damm's company in return for this financial investment from his father. Now, Boris would actually prove to be quite the entrepreneur. In nineteen twenty five, he would take over the company entirely.
Speaker 1: He became the new head of the company. He would rename it Cryptoteknik in nineteen thirty two, and then when the Nazis rose to power, he fled Sweden for Switzerland and re-established his company there, and it was this company that he established that would later become known as Crypto AG, the focus of our episode, really. Well, in the meantime, his company continued to produce new cipher machines, incorporating new features in an effort to build machines that were able to create stronger codes. And again, this was mostly for business use or occasional government use, but the rise of World War Two would create a new market as militaries sought ways to send messages securely, without fear that their plans would be shown to an enemy. And that's when the United States would enter into the picture, setting the stage for the company's future in ways Hagelin could not have anticipated. I'll explain more when we come back, but first let's take a quick break. So when World War Two broke out, the United States military would become one of Crypto AG's customers, and when the Nazis invaded Norway in nineteen forty, Hagelin would again move operations. This time he moved to the United States. His company's encryption device, known as the M-209, would be produced in the US. According to the Washington Post, there was a typewriter factory in upstate New York that would end up making around one hundred and forty thousand of these M-209 encryption devices, and Hagelin negotiated with the US Army and landed an eight point six million dollar contract, a princely sum today, but certainly a princely sum way back in nineteen forty. Hagelin's devices lacked the sophistication of Germany's Enigma machine.
Speaker 1: They weren't nearly as complex, nor were they as capable of creating very tough encryption, so codebreakers could suss out the original messages that were created on an M-209 if they were given enough time and attention. And for that reason, the Army primarily relied on these devices to disguise extremely time sensitive orders. So the logic was, by the time someone had actually broken the code, the information would be worthless anyway, because whatever was being covered in the message would have already happened. It would have been something that was more imminent, so you wouldn't be able to act on the information, even though you'd be able to at least decode what had been said. So you wouldn't want to use these devices for any sort of long term plans, because they were crackable. People could crack the codes, given enough time. Now, around that same time, Hagelin became good friends with another cryptographer named William Friedman. Friedman was born in Russia. Actually, so was Hagelin. Hagelin's parents were Swedish, but when Boris was born, the family was in Russia. Friedman's family left Russia when Friedman was just a baby, back in eighteen ninety two, due to a rise in antisemitism in Russia; Friedman's family is Jewish. So Friedman grew up loving codes and cryptography and became fascinated with them. He joined a private research lab. He met, and then courted, and then married a woman named Elizebeth Smith, who on her own was an accomplished cryptographer, a brilliant cryptographer. And they both sort of worked for George Fabyan, and that was the guy who owned the private research lab. Fabyan sounds like the sort of person who really belonged in the Renaissance, as far as I'm concerned. In the Renaissance, you had rich nobles who would become patrons of great thinkers and philosophers and artists.
Speaker 1: Fabyan established this private research lab in order to look into stuff that he just thought was interesting, which I think is kind of cool, maybe a little eccentric. Well, when the United States entered World War One, the Friedmans, husband and wife, would work in code breaking for the United States, and the cryptologic division of the research lab became the genesis for the American cryptography service. So William Friedman would later become the chief cryptanalyst. In fact, he coined the term cryptanalysis for the United States, and would lead the future Signals Intelligence Service before going on to serve in other intelligence agencies as a cryptographer. So Friedman was very much working in the same world as Hagelin, though you could say they came at it from opposing perspectives, right? Because Hagelin's company was all about producing machines that could encipher messages, while Friedman was largely interested in finding methods to decipher codes, though Friedman also worked on theory as well, talking about different ways to create stronger ciphers. And we'll come back to Friedman in just a moment. So Hagelin would stay in the US until World War Two ended in Europe, and he had become extremely wealthy due to the lucrative Army contract he had made, and he had built many professional and personal relationships in the United States, so he would have strong ties to the US. He then returned to Europe to again re-establish his company there. Meanwhile, American intelligence officials were starting to get a little worried, because code breaking was growing increasingly difficult due to sophisticated machines running complicated systems to create these codes. And if you had little insight into how those machines worked, or which systems they were following at any given time, you had really little hope of breaking a code in a reasonable amount of time.
Speaker 1: So it was very clear that a lot of people were having really secret conversations that American spies were unable to decipher, and that just rubbed the Americans the wrong way. I'm going to get a little critical of my country in this episode. Anyway, in nineteen fifty one, Hagelin's company introduced the CX-52 cipher machine, and this one was sophisticated enough to present a code that American intelligence agents viewed as practically unbreakable at the time, and that in turn prompted some heated internal discussions within the US intelligence community: what should officials do about this? Because there was a real worry that countries might go out and buy Hagelin's products. I mean, that's what Hagelin was making them for. And if they did that, they would all be able to communicate secretly, and the Americans would be unable to snoop out what was going on. And boy howdy, does America hate that. So American officials gave a sort of carrot and stick offer to Hagelin. So on the one hand, they were a big customer for his company, right? The United States represented a significant potential customer for Hagelin's products. He didn't want that source of revenue to go away. So there was that. They also had a whole bunch of old M-209 cipher devices that were manufactured in America during World War Two, and there was at least the implied threat that if Hagelin wouldn't be, you know, cooperative with the US, maybe the Americans might let a few thousand M-209s get sold off to countries around the world, and that would undercut Crypto's own sales in the process. I mean, if you are a country, you know, the head of an agency in a smaller country with limited resources, and the United States says, hey, we'll sell you these old but totally working cipher machines for much less than that brand new, shiny cipher machine?
Speaker 1: You're going to go with the cheaper model as long as it works, and that means that Crypto would not be making any sales. Then there was William Friedman, Hagelin's old buddy. In nineteen fifty one, Friedman was serving as the head of the cryptographic division of the Armed Forces Security Agency, or AFSA. The following year he would become the head of the cryptology department for the National Security Agency, or the NSA. But it was in nineteen fifty one when Friedman would act on behalf of the US government and meet secretly with Hagelin in Washington, D.C. So Friedman goes to Hagelin with a fairly thorny proposition. The deal was this: Hagelin was to continue creating cipher machines just as the company had been, but Crypto would only sell the most sophisticated of those machines to a list of countries that the United States would provide to Hagelin, and that would represent countries with whom the US had very good relations, so allies and that sort of thing. They were the only countries who would be allowed to buy the top of the line products. Crypto would be allowed to sell older, more vulnerable or weak machines to any country that was not on that list. So in other words, Friedman was asking Hagelin to kind of put certain countries on a preference list, and then everyone else would get older, more vulnerable technologies. However, that's the extent of that deal. It didn't go further than that, but it's still a pretty big request, and you can kind of understand where the US was coming from, at least. You know, they clearly did not want the job to be even harder when it came to breaking codes. And Hagelin would ultimately agree to this deal. And whether it was that he saw a guaranteed payout from the US and so it was strictly a business decision,
Speaker 1: or he just felt he wasn't in a position to turn down this offer, or he felt a strong sense of loyalty toward a country that had made him a millionaire, or maybe it was some combination of these and other factors, I don't know. But whatever it was, he said yes, and this would mark the beginning of the US intelligence community having a direct interest in a company that was selling cryptographic equipment, that is, Crypto. But at this point it was still a fairly limited agreement. Crypto could still sell equipment to countries all around the world, though any country that was not on the US Best Buddy list would only have access to the older devices. Now, this wasn't because US officials were feeling benevolent or anything like that. I don't want to paint it as that. There was a very real desire in America to push Crypto for a much more shady deal. Intelligence officials were hoping that they could work directly with Crypto to design machines that produced codes that Americans could quickly break. People would think they were sending secure messages, but in reality the Americans would be able to decode those messages fairly quickly. But William Friedman discouraged anyone from America from going to Hagelin with such an offer for several years. He said Hagelin would never go for it, it would be deeply offensive to him. You're going to destroy this relationship we have. Let's not, you know, let's hold back rather than have a loss. And hey, there were other companies out there, right? I mean, it's not like you had to buy from Crypto or else you'd have no way to communicate secretly. You could always get cipher machines and cryptography machines from some other source, right? Well, part of the deal that the US made included substantial amounts of money meant to go toward marketing.
Speaker 1: The US wanted Crypto to be the world leader in the market for this sort of device, mostly in an effort to make sure that some other crypto company didn't come along with better, more difficult to crack solutions, because that would just set America back again. So the US supplied money year after year to Crypto to renew this agreement and to keep the company going even if things should get lean, all the while trying to promote Crypto's products and hold back any of Crypto's competitors. It was pretty brutal. Things slowly began to change as time went on. The invention of the transistor would bring on tons of innovation and miniaturization. So in the past, electric circuits were physically enormous, because you had to have components like vacuum tubes, and those took up a lot of space, and they also gave off a lot of heat, which generally is bad, not just for humans but also for electronics. But in the mid nineteen sixties that was all starting to change. Electronic circuits could now be made much smaller thanks to the transistor, and they made it possible for all sorts of new gadgets, like pocket radios, and desktop computers further down the line, and yes, new types of cryptographic machines. Hagelin was facing a very real problem at that point. His company was built around mechanical cryptographic devices. These were machines that relied on physical components like gears and levers and chains. But the electronic era was heading in a different direction, and the Crypto company wasn't in a position to keep up. If Hagelin wanted to compete, he was going to need help. And when someone needs help, that means they are vulnerable. Now, if you're in a position to help someone, you can more or less selflessly help that person to get them out of that vulnerable position, or you can attempt to exploit it. And the US intelligence community, with the NSA at the forefront, took option number two.
Speaker 1: The NSA, as I said, the National Security Agency, was founded in nineteen fifty two, just five years after the Central Intelligence Agency was founded. It's primarily focused on signals intelligence, and that is the interception and decoding of messages for the purposes of gathering intelligence. Over at the NSA, an analyst named Peter Jenks hypothesized that, with care, you could create an electronic cryptographic system that would seem to be random, but would actually depend upon a repeated pattern at regular intervals. A casual glance at the code would make it seem as though the system was following a complicated algorithm and producing an uncrackable code because of some sort of random element. But the repetition of the pattern would actually make code breakers with sufficient computing power able to decode the messages. It wouldn't be easy, it wouldn't be as simple as just running it through a decoder, but because of that pattern, it would become possible. Again, patterns represent restrictions, restrictions are vulnerabilities, and vulnerabilities can be exploited. So you can make a system that, at least on casual glance, appears to be secure, but in reality it's not. So the NSA reaches out to Crypto, which is really in need of expertise in building electronic cryptographic machines, and Hagelin welcomes the help, because otherwise his business is going to completely lose out. So Crypto goes on to produce a machine called the H-460, based on the NSA's design. The company actually made two versions of the H-460. One was compromised. It used the NSA's repeating pattern, so that the agency could, with time and effort, decode any messages that were composed on that particular machine. The other one was more secure, it didn't repeat the pattern, so the United States was still fine with Crypto selling those machines, the good ones, to countries that were still on the US Best Buddy list. Everyone else would get the compromised version.
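The published excerpts don't spell out Jenks's actual design, so here is a hedged Python sketch of the general idea under a simplifying assumption of mine: the rigged keystream just repeats every seven letters, Vigenere-style (the key "MINERVA" is an arbitrary choice, a nod to a code name that comes up later in this episode). Splitting the ciphertext by a candidate period and measuring the index of coincidence exposes the hidden repetition.

```python
# Sketch of detecting a secretly repeating keystream. Assumption for
# illustration only: the "rigged" machine adds a short repeating key to the
# plaintext; real hardware would have been more elaborate.
from collections import Counter

def encrypt(plaintext: str, key: str) -> str:
    """Add a repeating key to the plaintext, letter by letter (mod 26)."""
    return "".join(chr((ord(p) - 65 + ord(key[i % len(key)]) - 65) % 26 + 65)
                   for i, p in enumerate(plaintext))

def avg_index_of_coincidence(ct: str, period: int) -> float:
    """Split the ciphertext into `period` columns and average each column's
    index of coincidence. At the true period every column is a single
    Caesar shift of plaintext, so the value jumps toward plaintext levels."""
    total = 0.0
    for i in range(period):
        col = ct[i::period]
        counts = Counter(col)
        n = len(col)
        total += sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
    return total / period

plaintext = "ITWASTHEBESTOFTIMESITWASTHEWORSTOFTIMES" * 40
ciphertext = encrypt(plaintext, "MINERVA")  # hidden period of seven
for period in range(2, 11):
    print(period, round(avg_index_of_coincidence(ciphertext, period), 3))
# The value spikes at period 7, exposing the repetition a code breaker needs.
```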
Speaker 1: Now, while the NSA's assistance meant that Crypto would remain a viable company as the world moved away from mechanical systems, it also meant that Crypto was a company that was becoming increasingly dependent upon American intelligence agencies. Toward the end of the sixties, folks in the CIA were starting to get a little bit antsy about the company. Crypto was a valuable asset, and countries around the world depended upon equipment from Crypto, which meant the US had incredible advantages when it came to deciphering intelligence. But Hagelin was getting up there in years, he was getting into his eighties, and there was no guarantee that his successor would be as amenable to the intelligence agencies as Hagelin had been. Initially, it appeared as though he was going to hand over control of his company to his son, Bo Hagelin. The CIA was not crazy about that idea. The agency was not convinced that Bo Hagelin would be as pliable as Boris Hagelin had been, and the nature of the company's relationship with the US intelligence community had been kept a secret from Bo. So Boris Hagelin's own son did not, apparently, know about this relationship with the NSA and later the CIA. Boris and his son Bo were also not on the best of terms. They frequently had pretty massive fights. Bo felt he had been left out of some pretty important patents that he had contributed to, and so he was not on good speaking terms with his father. So this was a complicated issue, and the US government wasn't entirely sure how it was going to play out. Meanwhile, over in Europe, you had an intelligence agency in West Germany, because, you know, after World War Two, Germany was split up into West Germany and East Germany. So West Germany and an intelligence agency in France were both eager to purchase Crypto from Hagelin. You know, Hagelin's getting very old, and so they think, hey, if we buy this company, then we can benefit from this technology.
Speaker 1: They had figured out that the United States had some sort of beneficial relationship with Crypto. I'm not sure if they knew the full extent of it, but they at least knew that there was some buddy-buddy stuff going on there, and they wanted to get in on that action. Hagelin rejected this initial offer and told the CIA about it. So then we get to nineteen seventy, and then two really big things happen. First, Bo Hagelin, Boris's son, would die in a car accident. And no, conspiracy theorists, it does not appear that this was engineered or manufactured in some way. It appears to have been just a car accident, and Bo dies as a result of this. The CIA cooperates with West Germany's Federal Intelligence Service, also known as the BND. It's called that because in German, Federal Intelligence Service is a different, very long word that I am not even going to attempt to pronounce. And they create an agreement in which these two agencies would co-own the company in secret. The CIA told West Germany, hey, we'll totally go in with you on this one, but you've got to cut France out of the deal. And West Germany said, okay. Bye, France, auf Wiedersehen. Hagelin would be presented with this deal and would agree to the terms, and the agencies would rely upon a company in Liechtenstein that was called Marxer and Goop at the time. Great name. Marxer and Goop would draw up the agreement in such a way that the agencies' identities would be protected through a series of shell companies and other, you know, obfuscation. So even if you were to dig into it, you would not be able to see that the CIA and the BND were co-owners of this company. Instead, you would get this sort of runaround, you know, a wild goose chase about the ownership of Crypto. It would not appear to be owned by any intelligence agencies, however. So Hagelin sold his company for just under six million dollars.
Speaker 1: He would pass away in nineteen eighty three, after a very long illness, so he kind of leaves our story. But meanwhile, the two intelligence agencies now had secret control of a company that manufactured products meant to make communications secret. I think you can see where this is going, right? If your agency is all about uncovering secrets, and then you get control of a leading company that makes stuff that's supposed to keep things secret, you're like a kid in a candy store. I mean, it was like they were selling locks to everyone in the world, but they were holding on to all the skeleton keys that would give them access to those locks. It was incredible. Now, I should be clear that the list of clients for Crypto did not include everybody. Not everyone in the world was eager to purchase the products from this company. Two potential customers in particular were not on the list. China and Russia had both been suspicious about Crypto for years by the time the CIA gained partial ownership, so they did not purchase those products. They figured something was up. But other countries, including lots of US allies, were Crypto customers, frequent ones. While these two agencies would share ownership of the company for a couple of decades, things were not always super smooth between them. The West Germans noted in their own history of the project, which was shared with The Washington Post, that the Americans were eager to spy on everybody, really, enemy or ally alike. The West German officials were really focusing on countries that were not allies, but the Americans wanted to snoop on everybody. CIA historians, meanwhile, note that the American officials felt that the West Germans were more interested in running Crypto as a straightforward business to earn money; they were looking at it as a revenue generator, not as a way to dip into secrets.
Speaker 1: So both the CIA and the BND would take in millions of dollars over the years as they operated Crypto, and they would pour that money into other projects around the world. So if you ever wondered how some CIA operations appear to happen under the radar, it's not all just, you know, dark deals behind closed doors in DC. Some of that money comes straight from CIA-backed operations that appear to be, you know, honest businesses. So that's fun. We're going to take a break for actual honest businesses, but we'll be right back after these messages. So in the CIA history for this project, and I have not read the entire history, because it was not made available; the Post was only granted the right to publish excerpts from the report, not the entire report. But the agency refers to Crypto with a code name. That code name is Minerva. And the project of running Crypto, in an effort to produce equipment that could be exploited around the world, had two different code names. The first one was Thesaurus, and the second one was Rubicon. So German intelligence agents would later bring in officials from Siemens, the company Siemens, to serve as advisors, technical advisors and entrepreneurial advisors, for Crypto, and in return, Siemens would get five percent of Crypto's sales. The Americans brought in Motorola to take some of Crypto's products and tweak them to make them work better, make them more commercially viable. So we've got two intelligence agencies and two major companies all working together as part of this, and all indications seem to suggest that at least some people in those two big companies knew what was up. By the nineteen eighties, more than half of all the intelligence gathered by the CIA that came from places other than China or Russia was encrypted by Crypto machines.
586 00:38:09,040 --> 00:38:13,319 Speaker 1: So when you look at all the information that the 587 00:38:13,320 --> 00:38:17,440 Speaker 1: CIA was bringing in, if it wasn't from Russia and 588 00:38:17,480 --> 00:38:19,719 Speaker 1: if it wasn't from China, more than half of the 589 00:38:19,800 --> 00:38:22,480 Speaker 1: information had passed through a Crypto machine, meaning that the 590 00:38:22,520 --> 00:38:27,040 Speaker 1: CIA could decrypt it and read the underlying messages. There 591 00:38:27,040 --> 00:38:28,839 Speaker 1: were times when they said that they could read 592 00:38:29,239 --> 00:38:33,600 Speaker 1: messages from certain countries with eighty to ninety percent success, 593 00:38:33,960 --> 00:38:37,600 Speaker 1: which is pretty phenomenal in the world of cryptography and 594 00:38:37,640 --> 00:38:42,600 Speaker 1: code breaking. While neither Russia nor China would use Crypto devices, 595 00:38:42,920 --> 00:38:46,360 Speaker 1: a lot of countries that were dealing with those countries, 596 00:38:46,400 --> 00:38:50,319 Speaker 1: with Russia and China, did use Crypto devices, so the 597 00:38:50,400 --> 00:38:53,640 Speaker 1: CIA was able to learn a lot about operations going 598 00:38:53,680 --> 00:38:57,520 Speaker 1: on in Russia and China indirectly through that means. This 599 00:38:57,600 --> 00:38:59,800 Speaker 1: is also a good time to point out a parallel 600 00:38:59,840 --> 00:39:04,120 Speaker 1: in our daily lives, which is that even if the 601 00:39:04,320 --> 00:39:09,720 Speaker 1: content of our messages is safe, the act of sending 602 00:39:09,760 --> 00:39:13,640 Speaker 1: messages can sometimes provide enough information for people to draw 603 00:39:13,760 --> 00:39:20,440 Speaker 1: some pretty accurate conclusions. It shows us that metadata is 604 00:39:20,520 --> 00:39:25,719 Speaker 1: really an important thing to remember. Metadata is information 605 00:39:26,120 --> 00:39:29,680 Speaker 1: about information, and sometimes you don't need to know the 606 00:39:29,800 --> 00:39:35,200 Speaker 1: content of something in order to draw some pretty damaging 607 00:39:35,480 --> 00:39:39,320 Speaker 1: or valuable conclusions. I guess it all depends upon your perspective. 608 00:39:40,000 --> 00:39:42,359 Speaker 1: So this is kind of an example of that: 609 00:39:42,400 --> 00:39:46,000 Speaker 1: even though Russia and China weren't using Crypto devices, countries 610 00:39:46,000 --> 00:39:48,120 Speaker 1: that were dealing with Russia and China were, and that 611 00:39:48,160 --> 00:39:50,960 Speaker 1: meant the CIA could read at least that side of 612 00:39:51,040 --> 00:39:55,200 Speaker 1: the messages. In nineteen eighty one, Saudi Arabia would become 613 00:39:55,239 --> 00:39:58,920 Speaker 1: the biggest Crypto customer, and Crypto 614 00:39:58,960 --> 00:40:02,040 Speaker 1: technology would play a very important role 615 00:40:02,040 --> 00:40:05,520 Speaker 1: in the Middle East.
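To make that earlier metadata point a little more concrete before we move on: here's a minimal sketch of traffic analysis, where every name, timestamp, and count is invented purely for illustration. No message body is ever read; everything is inferred from the mere fact that messages were sent.

```python
from collections import Counter
from datetime import datetime

# Hypothetical, invented metadata: (sender, recipient, timestamp) only.
# The content of each message is assumed to be perfectly encrypted.
log = [
    ("embassy_a", "ministry_x", "2020-02-16 23:05"),
    ("embassy_a", "ministry_x", "2020-02-16 23:40"),
    ("embassy_a", "ministry_x", "2020-02-17 00:15"),
    ("embassy_b", "ministry_x", "2020-02-17 09:00"),
]

# Who talks to whom most often? Frequency alone exposes the relationship.
pairs = Counter((sender, recipient) for sender, recipient, _ in log)
print(pairs.most_common(1))  # [(('embassy_a', 'ministry_x'), 3)]

# When do they talk? A burst of traffic in the middle of the night,
# right before some event, can be as revealing as the messages themselves.
late = [t for _, _, t in log
        if datetime.strptime(t, "%Y-%m-%d %H:%M").hour in (23, 0, 1)]
print(f"{len(late)} of {len(log)} messages were sent overnight")
```

That's the whole trick: you don't decrypt anything, you just count.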
All of this also leads to a point 616 00:40:05,520 --> 00:40:09,439 Speaker 1: in the Washington Post article where the authors state that 617 00:40:10,840 --> 00:40:14,319 Speaker 1: it's kind of an open question as to how much 618 00:40:14,440 --> 00:40:19,160 Speaker 1: the CIA knew about different operations around the world throughout 619 00:40:19,160 --> 00:40:23,279 Speaker 1: this time and what the agency did or didn't do 620 00:40:24,040 --> 00:40:27,160 Speaker 1: in preparation for those events, like whether or not they 621 00:40:27,160 --> 00:40:29,440 Speaker 1: should have acted in some cases. If they were 622 00:40:29,480 --> 00:40:32,560 Speaker 1: aware of an assassination attempt, did they do anything to 623 00:40:32,600 --> 00:40:36,160 Speaker 1: prevent it or to let anyone know? And if not, 624 00:40:36,800 --> 00:40:39,520 Speaker 1: was it just because they were worried about compromising the 625 00:40:39,560 --> 00:40:44,160 Speaker 1: fact that they knew about this information? At what point 626 00:40:44,640 --> 00:40:47,440 Speaker 1: does the value of knowing information go away if you 627 00:40:47,480 --> 00:40:49,960 Speaker 1: don't act on that information? These are big questions that 628 00:40:50,000 --> 00:40:52,400 Speaker 1: are not answered in the article, by the way, and 629 00:40:52,480 --> 00:40:56,480 Speaker 1: they bring up a lot of deep ethical problems with 630 00:40:56,520 --> 00:41:01,080 Speaker 1: what was going on. So Crypto would also receive a 631 00:41:01,080 --> 00:41:03,520 Speaker 1: lot of direction from the CIA and from the 632 00:41:03,600 --> 00:41:10,080 Speaker 1: BND to actively try and disparage competitors, to essentially run 633 00:41:10,200 --> 00:41:16,640 Speaker 1: marketing campaigns that said, you know, cryptography devices from such 634 00:41:16,680 --> 00:41:19,759 Speaker 1: and such a company are total crap, don't buy them. 635 00:41:19,960 --> 00:41:23,959 Speaker 1: Come to us, buy our stuff, we are secure. They 636 00:41:24,239 --> 00:41:29,279 Speaker 1: also were encouraged to bribe government officials to adopt Crypto tech. 637 00:41:29,560 --> 00:41:34,080 Speaker 1: So there are some pretty awful stories about Crypto executives doing 638 00:41:34,120 --> 00:41:36,920 Speaker 1: all sorts of stuff in order to, you know, bribe 639 00:41:37,040 --> 00:41:41,040 Speaker 1: governments from all over the world into adopting Crypto technology. 640 00:41:41,680 --> 00:41:49,600 Speaker 1: Skeezy, skeezy stuff. Really makes me proud. US President Ronald 641 00:41:49,600 --> 00:41:54,040 Speaker 1: Reagan inadvertently revealed that the US had intercepted and decrypted 642 00:41:54,080 --> 00:41:58,200 Speaker 1: communications from a Libyan embassy in East Berlin to 643 00:41:58,320 --> 00:42:02,120 Speaker 1: Tripoli, and that tipped off Libya that something was up, right, 644 00:42:02,239 --> 00:42:07,400 Speaker 1: that America somehow was able to decrypt messages. And considering 645 00:42:07,520 --> 00:42:11,239 Speaker 1: the company they were relying upon for their cryptography, that 646 00:42:11,360 --> 00:42:16,520 Speaker 1: started to raise some doubts about Crypto's trustworthiness, and not 647 00:42:16,640 --> 00:42:21,320 Speaker 1: just with Libya; other countries took notice too. Employees at Crypto, meanwhile, 648 00:42:21,440 --> 00:42:25,160 Speaker 1: didn't know about the arrangement, right?
They were working under 649 00:42:25,200 --> 00:42:31,600 Speaker 1: the assumption that they were actually making genuine, reliable cryptography equipment, 650 00:42:32,000 --> 00:42:35,360 Speaker 1: and occasionally an employee might look at something and say, huh, 651 00:42:35,400 --> 00:42:39,320 Speaker 1: this is weird based upon what I know. This algorithm 652 00:42:39,320 --> 00:42:43,440 Speaker 1: we're using, or this system we're using, has vulnerabilities; there 653 00:42:43,560 --> 00:42:46,960 Speaker 1: are problems with it. We should fix those before we ship this, 654 00:42:47,480 --> 00:42:50,839 Speaker 1: because we could make it more secure. They would get 655 00:42:51,000 --> 00:42:54,000 Speaker 1: discouraged from doing that; they would be told not to 656 00:42:54,040 --> 00:42:58,440 Speaker 1: implement solutions. In one case it went much further than that. 657 00:42:59,520 --> 00:43:05,240 Speaker 1: There was an employee named Peter Frutiger who was very 658 00:43:05,440 --> 00:43:10,400 Speaker 1: frustrated with what was going on. He felt that Crypto 659 00:43:10,560 --> 00:43:16,799 Speaker 1: was just being complacent or maybe negligent, and not responding 660 00:43:16,840 --> 00:43:21,560 Speaker 1: to very real concerns that Frutiger had with clients in Damascus. 661 00:43:22,080 --> 00:43:25,640 Speaker 1: So his clients in Damascus were complaining about their stuff. 662 00:43:25,680 --> 00:43:29,080 Speaker 1: So he went to Damascus and he fixed their Crypto equipment. 663 00:43:29,440 --> 00:43:32,240 Speaker 1: In other words, he removed the vulnerabilities that had been 664 00:43:32,680 --> 00:43:37,600 Speaker 1: engineered to go into this stuff. And the Crypto CEO 665 00:43:37,840 --> 00:43:41,080 Speaker 1: at the time would fire Frutiger as a result, because 666 00:43:41,320 --> 00:43:45,239 Speaker 1: Frutiger had messed things up. He had actually made what 667 00:43:45,320 --> 00:43:48,480 Speaker 1: was supposed to be a secure system an actual secure system. 668 00:43:48,680 --> 00:43:53,279 Speaker 1: Of course, he didn't know that that was against the 669 00:43:53,320 --> 00:43:58,080 Speaker 1: goals of the operation itself, and the CIA got very 670 00:43:58,120 --> 00:44:01,480 Speaker 1: mad at the CEO of Crypto at that point, saying 671 00:44:01,520 --> 00:44:03,759 Speaker 1: that he should have found a way to sort of 672 00:44:03,800 --> 00:44:06,919 Speaker 1: bring Frutiger into the fold to smooth things over, 673 00:44:07,040 --> 00:44:11,080 Speaker 1: rather than fire him, because it brought undue scrutiny to 674 00:44:11,160 --> 00:44:15,600 Speaker 1: Crypto and its activities. Crypto also hired an electrical engineer 675 00:44:15,920 --> 00:44:20,680 Speaker 1: named Mengia Caflisch, and I'm sure I'm butchering these names, 676 00:44:20,719 --> 00:44:25,520 Speaker 1: and I do apologize, that also upset the NSA this time, 677 00:44:25,560 --> 00:44:28,640 Speaker 1: not the CIA, because the NSA knew about 678 00:44:28,640 --> 00:44:33,360 Speaker 1: this electrical engineer, and they said, she is way too smart, 679 00:44:33,840 --> 00:44:36,320 Speaker 1: she's going to figure out something's going on. You should 680 00:44:36,360 --> 00:44:40,760 Speaker 1: not hire her. But Crypto hired her because she was brilliant 681 00:44:41,040 --> 00:44:45,239 Speaker 1: and was seen as a valuable asset. Turns out she 682 00:44:45,400 --> 00:44:48,360 Speaker 1: was brilliant.
She still is brilliant, and she kept trying 683 00:44:48,400 --> 00:44:53,000 Speaker 1: to initiate fixes and improvements because she kept finding weaknesses 684 00:44:53,040 --> 00:44:57,080 Speaker 1: and vulnerabilities in the systems, but she was always discouraged 685 00:44:57,200 --> 00:45:01,600 Speaker 1: from actually implementing solutions, and she wondered what was going on, 686 00:45:02,520 --> 00:45:05,160 Speaker 1: but she was a little worried about speaking up because 687 00:45:05,160 --> 00:45:09,360 Speaker 1: she wasn't sure exactly what the extent of it was. The company 688 00:45:09,719 --> 00:45:13,680 Speaker 1: would actually produce a machine using an algorithm she had 689 00:45:13,719 --> 00:45:19,440 Speaker 1: designed that the NSA could not crack, so the NSA 690 00:45:19,800 --> 00:45:23,560 Speaker 1: reached out to the CIA, and the CIA ordered the 691 00:45:23,800 --> 00:45:28,360 Speaker 1: company, Crypto, to stop the manufacturing process, saying, we can't 692 00:45:28,440 --> 00:45:32,520 Speaker 1: produce these machines because we can't crack the code. You've 693 00:45:32,680 --> 00:45:36,480 Speaker 1: got to break it. So only fifty or so of 694 00:45:36,560 --> 00:45:40,040 Speaker 1: these machines were actually manufactured. The company wound up selling 695 00:45:40,040 --> 00:45:44,000 Speaker 1: those to banks because the thought was, well, banks have 696 00:45:44,080 --> 00:45:46,520 Speaker 1: a need for security and we don't really need to 697 00:45:46,560 --> 00:45:51,200 Speaker 1: snoop on them. That's not where our concern is. But 698 00:45:51,719 --> 00:45:54,359 Speaker 1: from now on, when you make this device, make it 699 00:45:54,400 --> 00:45:57,879 Speaker 1: with the algorithm that's broken on purpose, because we want 700 00:45:57,880 --> 00:46:03,560 Speaker 1: to be able to crack those codes. That's pretty dodgy. Anyway, 701 00:46:03,920 --> 00:46:07,719 Speaker 1: there was also a mathematics professor from Stockholm whose name 702 00:46:07,840 --> 00:46:12,720 Speaker 1: I would butcher terribly. He actually studied in the United States, 703 00:46:12,800 --> 00:46:16,680 Speaker 1: and his American family, like me, would have trouble saying 704 00:46:16,719 --> 00:46:21,400 Speaker 1: his name, so they called him Henry. Henry Widman. He 705 00:46:21,520 --> 00:46:25,759 Speaker 1: was brought in to craft more sophisticated but vulnerable algorithms, 706 00:46:25,800 --> 00:46:31,680 Speaker 1: so he was actually told about the real relationship between 707 00:46:31,800 --> 00:46:35,680 Speaker 1: the CIA and the BND and Crypto. He 708 00:46:35,840 --> 00:46:39,520 Speaker 1: was given the inside scoop and asked to become part 709 00:46:39,520 --> 00:46:43,359 Speaker 1: of the team. And his purpose was to design algorithms 710 00:46:43,680 --> 00:46:49,560 Speaker 1: that looked really super secure but secretly weren't. So he 711 00:46:49,800 --> 00:46:53,640 Speaker 1: was trying to make stuff that appeared to be more 712 00:46:53,680 --> 00:46:57,279 Speaker 1: on the up and up, but in fact had vulnerabilities 713 00:46:57,280 --> 00:47:01,720 Speaker 1: built into it, and meanwhile to have those vulnerabilities designed 714 00:47:01,760 --> 00:47:05,400 Speaker 1: in such a way that it created plausible deniability.
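To give a sense of what a deliberately weakened algorithm might look like, here's a toy sketch, entirely invented for illustration and not based on any actual Crypto design: a stream cipher that advertises a sixty-four-bit key but quietly derives its keystream from only sixteen bits of it. To a customer, the interface looks fine; to an insider who knows the flaw, brute force is trivial.

```python
import os

# Toy sketch of a rigged cipher. NOT a real design; the "mask bug"
# below is exactly the kind of thing that could be waved off as
# innocent human error.

def weak_keystream(key: bytes, length: int) -> bytes:
    # The deniable flaw: only 16 of the key's 64 bits are ever used.
    seed = int.from_bytes(key, "big") & 0xFFFF  # looks like a sloppy mask
    state, out = seed or 1, bytearray()
    for _ in range(length):
        state = (state * 1103515245 + 12345) % 2**31  # weak LCG, not a CSPRNG
        out.append((state >> 16) & 0xFF)
    return bytes(out)

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR stream cipher; encryption and decryption are the same operation.
    return bytes(d ^ k for d, k in zip(data, weak_keystream(key, len(data))))

# A customer encrypts with what looks like a strong, random 64-bit key...
ciphertext = encrypt(os.urandom(8), b"ATTACK AT DAWN")

# ...but anyone who knows the flaw only has to try 2**16 effective keys,
# using a known crib ("ATTACK") to spot the right one.
for guess in range(2**16):
    plaintext = encrypt(guess.to_bytes(8, "big"), ciphertext)
    if plaintext.startswith(b"ATTACK"):
        print("recovered:", plaintext)
        break
```

Sixteen effective bits instead of sixty-four is the whole backdoor; everything else about the system can be completely honest.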
In 715 00:47:05,440 --> 00:47:08,600 Speaker 1: other words, if someone found such a vulnerability, you could say, oh, 716 00:47:08,760 --> 00:47:12,200 Speaker 1: that's due to human error or it was an implementation error, 717 00:47:12,200 --> 00:47:15,040 Speaker 1: but it was not put there on purpose, even though 718 00:47:15,080 --> 00:47:19,840 Speaker 1: it totally was. The CIA used Crypto communications to suss 719 00:47:19,880 --> 00:47:24,160 Speaker 1: out where Manuel Noriega was, based on communications from the Vatican. 720 00:47:24,480 --> 00:47:27,319 Speaker 1: They intercepted those communications, decoded them, and were able to 721 00:47:27,360 --> 00:47:31,760 Speaker 1: find Noriega as a result. In nineteen ninety two, Iran 722 00:47:32,200 --> 00:47:37,960 Speaker 1: arrested a Crypto salesman named Hans Buehler, and Buehler didn't 723 00:47:38,000 --> 00:47:41,640 Speaker 1: know about the relationship between Crypto and the CIA or 724 00:47:41,680 --> 00:47:44,160 Speaker 1: the BND. He had no knowledge of any 725 00:47:44,200 --> 00:47:47,880 Speaker 1: of that, so he was literally an innocent salesman who 726 00:47:48,000 --> 00:47:54,440 Speaker 1: thought he was selling legit cryptographic equipment. Iran had figured 727 00:47:54,440 --> 00:47:57,120 Speaker 1: out something was going on. They had been suspicious ever 728 00:47:57,200 --> 00:48:01,160 Speaker 1: since that incident with Libya I mentioned, and so 729 00:48:01,480 --> 00:48:07,040 Speaker 1: they arrested him and they essentially tortured him for nine months. 730 00:48:08,640 --> 00:48:13,120 Speaker 1: Iran demanded a one million dollar ransom from Crypto, 731 00:48:13,400 --> 00:48:15,640 Speaker 1: and the company did pay it. The CIA did not 732 00:48:15,920 --> 00:48:20,120 Speaker 1: chip in because the CIA has a policy against paying ransoms. 733 00:48:20,840 --> 00:48:24,680 Speaker 1: We don't negotiate with terrorists is the way America would 734 00:48:24,680 --> 00:48:28,080 Speaker 1: put it. So this guy suffered for nine months in 735 00:48:28,080 --> 00:48:32,080 Speaker 1: captivity before Crypto would pay the ransom and get him back, 736 00:48:32,320 --> 00:48:35,600 Speaker 1: and he legit didn't know anything. He didn't know that 737 00:48:35,840 --> 00:48:39,640 Speaker 1: the relationship existed, but he certainly suspected it by the 738 00:48:39,680 --> 00:48:43,239 Speaker 1: time he was released, and he was worried about the 739 00:48:43,320 --> 00:48:46,920 Speaker 1: fact that this foreign government seemed to know more about 740 00:48:46,920 --> 00:48:50,279 Speaker 1: the company he was working for than he did. He 741 00:48:50,360 --> 00:48:53,839 Speaker 1: ended up going to the press and talking about his 742 00:48:53,960 --> 00:48:58,239 Speaker 1: experiences, and it caused a bit of a stir in Europe. 743 00:48:58,360 --> 00:49:02,279 Speaker 1: The CIA would actually refer to this entire incident with 744 00:49:02,360 --> 00:49:06,680 Speaker 1: a code name. That code name was Hydra, so that's fun. 745 00:49:07,200 --> 00:49:12,240 Speaker 1: Around that same time, Germany was reunified, right? The Soviet 746 00:49:12,280 --> 00:49:16,480 Speaker 1: Union fell, and East Germany and West Germany unified into Germany.
747 00:49:16,760 --> 00:49:19,840 Speaker 1: The Berlin Wall came down, and it was around that 748 00:49:19,920 --> 00:49:23,160 Speaker 1: same time that the BND felt that Crypto's 749 00:49:23,239 --> 00:49:27,480 Speaker 1: usefulness had pretty much expired, that now it was more 750 00:49:27,560 --> 00:49:32,640 Speaker 1: of a liability: if the full extent of the 751 00:49:32,719 --> 00:49:36,439 Speaker 1: BND's involvement in Crypto's activities became known, that could 752 00:49:36,480 --> 00:49:39,600 Speaker 1: put Germany at risk, and so they ended up selling 753 00:49:39,640 --> 00:49:43,760 Speaker 1: off their interest in Crypto to the CIA for around 754 00:49:43,880 --> 00:49:49,359 Speaker 1: seventeen million dollars. So from that point forward, Crypto secretly operated 755 00:49:49,480 --> 00:49:56,680 Speaker 1: as a CIA-backed operation. But yeah, the CIA had 756 00:49:56,680 --> 00:50:01,000 Speaker 1: full ownership from around nineteen ninety three until twenty eighteen. 757 00:50:01,120 --> 00:50:04,120 Speaker 1: That's when the CIA liquidated the company and sold it 758 00:50:04,160 --> 00:50:08,839 Speaker 1: off to two other companies. The reason they did that 759 00:50:09,440 --> 00:50:11,920 Speaker 1: is that by the time twenty eighteen rolled around, the 760 00:50:12,120 --> 00:50:16,080 Speaker 1: cryptographic community was very different. It was no longer so 761 00:50:16,200 --> 00:50:21,680 Speaker 1: dependent upon standalone machines, electronic or otherwise. A lot of 762 00:50:21,960 --> 00:50:26,960 Speaker 1: solutions are software based or web based. They're not based 763 00:50:27,040 --> 00:50:33,279 Speaker 1: on physical equipment. So the usefulness of Crypto as a 764 00:50:33,320 --> 00:50:37,560 Speaker 1: company had pretty much gone out the window. It had 765 00:50:37,560 --> 00:50:42,840 Speaker 1: provided the CIA with a ton of information, but, 766 00:50:43,000 --> 00:50:45,359 Speaker 1: you know, there was no need to keep it running, so 767 00:50:45,400 --> 00:50:51,920 Speaker 1: they sold it off for parts, essentially. And you know, 768 00:50:52,200 --> 00:50:55,479 Speaker 1: part of me says, this is spy stuff. Of course 769 00:50:55,480 --> 00:50:57,839 Speaker 1: spies are going to be sneaky. That's what spies do. 770 00:50:58,400 --> 00:51:01,359 Speaker 1: Spies operate in a way where they are trying to 771 00:51:01,480 --> 00:51:04,120 Speaker 1: avoid detection while they try to figure out what everyone 772 00:51:04,120 --> 00:51:07,880 Speaker 1: else knows. That is the nature of spying, and everybody 773 00:51:08,000 --> 00:51:12,040 Speaker 1: does it. At the same time, there's something really sinister 774 00:51:12,760 --> 00:51:19,719 Speaker 1: about secretly owning a security firm and using it to 775 00:51:20,920 --> 00:51:23,719 Speaker 1: do the opposite of what the security firm says it's 776 00:51:23,760 --> 00:51:27,440 Speaker 1: doing, right? It says it's protecting secrets, but in reality, 777 00:51:27,960 --> 00:51:30,879 Speaker 1: it's leaving those secrets open for people to see.
Now, 778 00:51:31,560 --> 00:51:34,120 Speaker 1: I mentioned Huawei at the beginning of this episode, and 779 00:51:34,160 --> 00:51:36,719 Speaker 1: the reason I did that is because, again around the 780 00:51:36,760 --> 00:51:39,520 Speaker 1: same time that this story was breaking, we were hearing 781 00:51:39,560 --> 00:51:44,359 Speaker 1: about how Huawei, the Chinese telecommunications company, has had 782 00:51:44,600 --> 00:51:50,000 Speaker 1: backdoor access to networks that it has rolled out for 783 00:51:50,120 --> 00:51:53,840 Speaker 1: a decade. So Huawei makes all sorts of telecommunications equipment, 784 00:51:53,920 --> 00:51:58,120 Speaker 1: including components for networks. They are a leading provider of 785 00:51:58,200 --> 00:52:01,360 Speaker 1: five G components, for example, and there's been a concern 786 00:52:02,280 --> 00:52:05,240 Speaker 1: around much of the world, but particularly in the United States, 787 00:52:05,800 --> 00:52:08,640 Speaker 1: that this would mean that Huawei as a company would 788 00:52:08,680 --> 00:52:12,080 Speaker 1: have at least some capability of snooping on communications that 789 00:52:12,120 --> 00:52:17,279 Speaker 1: go across those networks. And since Huawei has some connections 790 00:52:17,719 --> 00:52:24,400 Speaker 1: to the communist government of China, because China requires companies 791 00:52:24,440 --> 00:52:27,480 Speaker 1: that operate in China to have this connection, the worry is that 792 00:52:27,520 --> 00:52:31,600 Speaker 1: those networks would be used specifically as 793 00:52:31,800 --> 00:52:35,400 Speaker 1: surveillance tools. And in America you can kind of understand 794 00:52:35,560 --> 00:52:39,640 Speaker 1: where they're coming from, because that's what Americans do. Like, 795 00:52:39,719 --> 00:52:43,400 Speaker 1: if you're the one who's spying on everybody, you probably 796 00:52:43,440 --> 00:52:46,840 Speaker 1: are really paranoid about everyone spying on you. It's just 797 00:52:46,960 --> 00:52:50,800 Speaker 1: kind of how it works. Also, again, that report showed 798 00:52:51,000 --> 00:52:54,279 Speaker 1: that for ten years, Huawei actually did have that capability. 799 00:52:54,320 --> 00:52:56,919 Speaker 1: Whether they did anything with it or not is still 800 00:52:56,920 --> 00:53:00,600 Speaker 1: an open question. But with Huawei, the story goes that 801 00:53:00,640 --> 00:53:04,680 Speaker 1: they were building in these backdoor access channels for law 802 00:53:04,760 --> 00:53:07,840 Speaker 1: enforcement officials. You know, law enforcement wants to have that 803 00:53:07,920 --> 00:53:11,319 Speaker 1: kind of access so that if they're conducting an investigation, they 804 00:53:11,320 --> 00:53:16,120 Speaker 1: can look into communications going between various suspects so that 805 00:53:16,200 --> 00:53:21,080 Speaker 1: they can better do their investigations. The problem is that 806 00:53:21,160 --> 00:53:23,520 Speaker 1: Huawei was not just building these in for law enforcement 807 00:53:23,640 --> 00:53:27,680 Speaker 1: but was retaining its own access to those channels. And again, 808 00:53:27,719 --> 00:53:30,200 Speaker 1: whether it was using it or not, I don't know, 809 00:53:30,680 --> 00:53:33,719 Speaker 1: but the story goes that they were actually retaining that ability.
810 00:53:34,440 --> 00:53:37,520 Speaker 1: And this leads me to another point I want to 811 00:53:37,560 --> 00:53:42,399 Speaker 1: make before I conclude, which is that backdoor channels are 812 00:53:42,440 --> 00:53:47,120 Speaker 1: always a terrible idea. Always, always, always. They inherently 813 00:53:47,200 --> 00:53:50,520 Speaker 1: make systems less secure. So if your job is to 814 00:53:50,520 --> 00:53:54,080 Speaker 1: make a secure system, building in a way to bypass 815 00:53:54,160 --> 00:53:57,760 Speaker 1: that security means you might as well not have any security. 816 00:53:57,800 --> 00:54:01,480 Speaker 1: It's a terrible idea. I get why law enforcement 817 00:54:01,560 --> 00:54:05,120 Speaker 1: and intelligence agencies want it: information is valuable, and 818 00:54:05,160 --> 00:54:08,600 Speaker 1: getting access to the information could mean the difference between 819 00:54:08,840 --> 00:54:13,960 Speaker 1: life and death in some cases. It really can. But then, 820 00:54:14,239 --> 00:54:18,839 Speaker 1: you know, if you have those backdoor channels, it means 821 00:54:18,880 --> 00:54:21,840 Speaker 1: that you don't have to go through the whole security process, 822 00:54:21,840 --> 00:54:24,400 Speaker 1: and it means that someone else might potentially discover that 823 00:54:24,840 --> 00:54:28,399 Speaker 1: and exploit it. So one, you've got the danger of 824 00:54:28,920 --> 00:54:33,719 Speaker 1: the authorized parties abusing this power, right? You've got the 825 00:54:34,320 --> 00:54:38,879 Speaker 1: potential for an agency committing overreach, like we've heard about 826 00:54:38,880 --> 00:54:43,000 Speaker 1: the NSA and how that agency was collecting way more 827 00:54:43,000 --> 00:54:47,320 Speaker 1: information than it should have been able to, including information 828 00:54:47,440 --> 00:54:51,160 Speaker 1: from people who weren't under any direct surveillance, and how 829 00:54:51,200 --> 00:54:54,360 Speaker 1: that can be abused. That's a terrible thing. So you 830 00:54:54,400 --> 00:54:57,359 Speaker 1: don't want that capability. You don't want 831 00:54:57,480 --> 00:55:02,200 Speaker 1: some agency that has authorized backdoor access abusing 832 00:55:02,239 --> 00:55:05,480 Speaker 1: that power. You also don't want some third party that 833 00:55:05,640 --> 00:55:09,160 Speaker 1: is not authorized at all finding out about that back 834 00:55:09,239 --> 00:55:12,080 Speaker 1: channel and figuring out how to access it, because now 835 00:55:12,160 --> 00:55:16,400 Speaker 1: your secure system has no security. So I guess the 836 00:55:16,760 --> 00:55:20,640 Speaker 1: end message I want to give everybody is: protect yourself 837 00:55:20,680 --> 00:55:23,840 Speaker 1: as best you can, which is increasingly difficult when we 838 00:55:23,880 --> 00:55:28,200 Speaker 1: don't know necessarily who is behind the systems that are 839 00:55:28,200 --> 00:55:32,800 Speaker 1: actually making the security we depend upon.
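Here's a tiny sketch of why a backdoor is a single point of failure, again purely hypothetical, with a toy XOR standing in for a real cipher: a "lawful access" backdoor built as key escrow, where a copy of every session key is wrapped under one master escrow key.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    # Toy XOR "cipher" standing in for real encryption.
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical master key held by "the authorities."
ESCROW_KEY = os.urandom(16)

def send_message(plaintext: bytes):
    session_key = os.urandom(len(plaintext))
    ciphertext = xor(plaintext, session_key)  # the "secure" channel
    # The backdoor: a copy of the session key, wrapped under the escrow
    # key, rides along with every message ever sent.
    wrapped = xor(session_key, ESCROW_KEY * (len(session_key) // 16 + 1))
    return ciphertext, wrapped

ciphertext, wrapped = send_message(b"wire one million to account 42")

# Any party that obtains ESCROW_KEY once, whether through authorized
# abuse, a leak, or outright theft, can unwrap the session key for this
# and every other message:
stolen = ESCROW_KEY
session_key = xor(wrapped, stolen * (len(wrapped) // 16 + 1))
print(xor(ciphertext, session_key))
```

One key, compromised once, and the whole system's security evaporates. That asymmetry is the point: the backdoor doesn't weaken one message, it weakens all of them.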
Another great example 840 00:55:32,880 --> 00:55:36,319 Speaker 1: people have pointed out is: should we trust the 841 00:55:37,120 --> 00:55:41,319 Speaker 1: security company Kaspersky, which comes from Russia? Or is it 842 00:55:41,360 --> 00:55:44,799 Speaker 1: possible that it could be a state-backed operation that 843 00:55:45,080 --> 00:55:50,600 Speaker 1: is slowly or quietly sowing vulnerabilities into the products people 844 00:55:50,640 --> 00:55:55,000 Speaker 1: are using? I have not seen any specific 845 00:55:55,040 --> 00:55:58,360 Speaker 1: reports on that. I'm just seeing people ask that question. 846 00:55:58,880 --> 00:56:01,440 Speaker 1: But that leads us to start asking questions about everything. 847 00:56:02,080 --> 00:56:04,719 Speaker 1: Probably not a bad idea, but it starts to, you know, 848 00:56:04,760 --> 00:56:08,360 Speaker 1: it starts to create this system where we're not trusting anything, 849 00:56:09,000 --> 00:56:11,719 Speaker 1: and at the end of the day, you either have 850 00:56:11,880 --> 00:56:15,480 Speaker 1: to figure out whom you're going to trust, or you've 851 00:56:15,520 --> 00:56:19,120 Speaker 1: got to just kind of disengage, or I guess you 852 00:56:19,239 --> 00:56:21,640 Speaker 1: just resign yourself to the idea that all of your stuff is going 853 00:56:21,719 --> 00:56:25,840 Speaker 1: to be findable and readable by everyone at some point 854 00:56:25,920 --> 00:56:30,960 Speaker 1: or another. Happy days. I hope you enjoyed that episode 855 00:56:30,960 --> 00:56:35,520 Speaker 1: of tech Stuff, When Secrets Aren't Secret, back from February seventeenth, 856 00:56:35,560 --> 00:56:40,200 Speaker 1: twenty twenty. Just a quick update: tomorrow we're going to 857 00:56:40,280 --> 00:56:43,360 Speaker 1: have a special episode of a different podcast published in 858 00:56:43,400 --> 00:56:47,120 Speaker 1: the tech Stuff feed. It's called Technically Speaking, and I 859 00:56:47,160 --> 00:56:50,160 Speaker 1: hope that you enjoy it. I will be back next 860 00:56:50,200 --> 00:56:52,919 Speaker 1: week with all new episodes, and I hope you're all well, 861 00:56:52,960 --> 00:57:01,840 Speaker 1: and I'll talk to you again really soon. Tech Stuff 862 00:57:01,960 --> 00:57:06,480 Speaker 1: is an iHeartRadio production. For more podcasts from iHeartRadio, visit 863 00:57:06,520 --> 00:57:10,040 Speaker 1: the iHeartRadio app, Apple Podcasts, or wherever you listen to 864 00:57:10,080 --> 00:57:11,040 Speaker 1: your favorite shows.