Speaker 1: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with HowStuffWorks and I love all things tech. Today probably sounds a little different from my normal episodes, because I'm actually recording this while I'm in a hotel room in Las Vegas; I'm attending yet another tech conference. But this episode is more in line with our normally scheduled episodes. This one is from a listener request, and the listener in this case wished to remain anonymous, so I am going to completely comply with his or her wishes. The listener wanted to know more about end-to-end encryption, which has been in the news a lot, particularly in the UK. End-to-end encryption is a means of sending information securely, and it has positive aspects and, depending upon who you are, negative aspects. We're going to cover all of that and explain what end-to-end encryption is and how it works. It's been in the news for a couple of years for multiple reasons.
Speaker 1: One is that some folks and government agencies have made a stink about the technology, because they say it gives bad actors, like criminals and terrorists, the ability to communicate secretly with one another. That, in a sense, is true, but I think it oversimplifies the issue. The tech actually allows any two people to communicate secretly with one another. So, in other words, it's a tool that can be used by people who are bad, but in itself it is not a bad tool. It's a tool that has tons of legitimate uses that have nothing to do with terrorism or crime. So as an example, let's say you are running a record label and it's your job to sign new acts to this record label. In the old days, you would do all this in person, but today you can do a lot of your business, practically all of your business, over the internet. So you can negotiate with the act, you can come to an agreement on terms, you can draw up a contract, and you can send it over for them to sign and then send it back for you to countersign. But you want all this information to remain confidential.
Speaker 1: The terms of one agreement with one act could be very different from the terms you set with another act. To lure a really big act to your label, you might have to include more attractive terms for that act, like better percentages, but you don't necessarily want to do that with every act you're signing; it might not merit it. In some cases, the amount of investment a label might make in an unproven act could end up being unrewarded down the line. So you would rather have the details of contracts remain secure and not get leaked to the general public, because then your negotiation tools are made plain for everyone to see. It opens up the opportunity for various acts to say, hey, how come they got such a sweet deal and we didn't? You know, it's all part of business. So you would rather have a means to send all this information in a way so that even if someone were to intercept the messages, no one who didn't have the authority would ever understand what it actually says.
Speaker 1: Now, that particular idea has been around for as long as we have had secrets. How do you communicate a secret to someone else without anyone else finding out about it, particularly if you can't whisper it quietly to that person? You have to come up with a means to hide the meaning of your message in some way, and there are a lot of different ways to do this. So, for example, steganography is a way, and I covered that in a previous episode of TechStuff with my friend Ariel, who joined me for that episode. Steganography is the practice of hiding a message within some other form of non-secret data, like an image or a music file. You could literally hide the message inside the image, meaning that unless you know what to look for, you're not likely to discover the message. That's not the most secure way of doing it, obviously, but it is something you can do. I remember collecting old Groo the Wanderer comic books by Sergio Aragonés, who would hide a secret message in every single issue. Typically it would read, "This is the hidden message."
Speaker 1: So not super duper helpful, but that would be one way of doing it. However, you could also hide the information within the actual file information of the image itself. So instead of hiding it physically inside the image, you're hiding it in the code that represents the image, and it's only if you know to look at that code that you would even see that there was a message there. The whole point is you're concealing that secret message inside some other non-secret information that you can freely distribute, so you can send that photo out being reasonably confident that people are not going to see that there's a secret message there. You only know that it's there if you know to look for it. At least, that's the theory, or the hope. Obviously, if someone is super nosy and just curious and they start looking at these things more closely, they might uncover that message, so it's not always the most secure methodology. Cryptography, which means making communication secret, is a very broad category, and encryption falls into that category.
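Hiding a message in the code that represents an image, as described above, is classically done by overwriting the least significant bit of each pixel value. Here is a minimal sketch of that idea; the flat list of grayscale pixel values and the two-character message are made up purely for illustration, not taken from any real image format:

```python
def hide_message(pixels, message):
    """Hide each bit of the message in the least significant bit of a pixel byte."""
    bits = [(byte >> i) & 1 for byte in message.encode("ascii") for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("image too small to hold the message")
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # clear the low bit, then set it to the message bit
    return stego

def recover_message(pixels, length):
    """Read the low bit of each pixel back out and reassemble the characters."""
    chars = []
    for i in range(length):
        byte = 0
        for bit_index in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit_index] & 1)
        chars.append(chr(byte))
    return "".join(chars)

# A fake "image": just a flat list of grayscale pixel values.
image = [120, 121, 119, 118] * 20
stego_image = hide_message(image, "HI")
print(recover_message(stego_image, 2))  # prints HI
```

Because each pixel value changes by at most 1, the altered image looks identical to the eye, which is exactly why you'd never notice the message unless you knew to check the low bits.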
Speaker 1: Encryption is the specific process of making information hidden or secret. The two terms are often used interchangeably, but technically they are distinct, though I guess if enough people use them interchangeably for long enough, the meaning itself will change, because that's how language works. But that's a bonus episode, I guess. So with encryption, you would use some sort of process to encrypt the data. You would use a cipher to take a plain message, the secret that you wish to communicate, and you would turn it into something that the average person would not be able to understand, but your intended recipient would have the knowledge of how to reverse this process, to decipher the mixed-up message, so that he or she could read the original secret. So let's take a super simple example, one that would never be used in modern-day encryption. Let's take a basic substitution cipher, in which we swap out letters for other letters.
Speaker 1: In a super simple version of this, our two communicators have agreed that, for the purposes of their messaging, they will shift all letters over by one, so that an A will be represented by the letter B, a B will be represented by the letter C, and so forth until you get to Z, which will be represented by the letter A. So if I wanted to write out J-O-N as my sign-off, shortening my name to Jon, I would actually use the letters K-P-O. My recipient would receive this message and say, ah, I must decipher it by cleverly stepping back each letter by one position, and I get J-O-N. Oh, it was Jonathan who sent me this message. Now, clearly anyone could crack that code. It would not take any sort of master code breaker to do it. You don't need a computer to do it. It is the simplest of substitution ciphers, and obviously that would never stand. However, much more complicated ciphers have been used throughout history, and in fact, one of the earliest uses for computers was in decoding encrypted messages.
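The shift-by-one substitution cipher above fits in a few lines of code. This sketch uses a shift of 1 to match the J-O-N example, but any shift amount works, and deciphering is just shifting back the other way:

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def shift_cipher(text, shift):
    """Replace each letter with the one `shift` positions later, wrapping Z back around to A."""
    result = []
    for ch in text.upper():
        if ch in ALPHABET:
            result.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
        else:
            result.append(ch)  # leave spaces and punctuation alone
    return "".join(result)

print(shift_cipher("JON", 1))   # prints KPO
print(shift_cipher("KPO", -1))  # stepping back by one position: prints JON
```

As the episode says, this offers no real security: there are only 25 possible shifts, so anyone can try them all in moments.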
Speaker 1: During World War II, computers were dedicated to cracking codes made by physical devices like the Enigma machine. But I've talked about that in previous episodes, so I'm going to move on rather than dwell on it here. Now, in cryptology, people often use examples to explain specific implementations and strategies. This has led to two fictional characters becoming placeholders for these examples, because when you want to say person A and person B, that gets really cumbersome. Those characters are Alice and Bob. If you've ever heard examples using Alice and Bob, it comes from this branch of cryptology. The characters were actually created by three researchers named Ron Rivest, Adi Shamir, and Leonard Adleman, who came up with the RSA encryption strategy. RSA comes from the first initial of each of their last names: Rivest, Shamir, and Adleman gives you RSA. And when they came up with that, they decided they needed to have examples to describe what their whole procedure was, and they invented these characters of Alice and Bob. More on RSA in just a minute.
Speaker 1: Alice and Bob have become the archetypes for cryptological discussions, and beyond that, actually, Alice and Bob are used in lots of different examples in tech, not just cryptology. But basically, the premise boils down to this: Bob and Alice both wish to communicate with one another without other people being able to intercept and understand those messages. All right, so Alice and Bob want to send messages to each other using some sort of computer device. It could be an actual computer like a desktop or a laptop, it could be a smartphone, it could be a tablet; it could be any of these things. It doesn't really matter; the point is that they want to be able to send electronic messages in some meaningful way. And when Alice and Bob send messages back and forth, their devices aren't doing so directly, right? There's no direct connection between Alice's device and Bob's device unless they happen to be very close together and able to use some sort of technology like near-field communication, or NFC. And in situations like that, you're so close you could just whisper to each other.
Speaker 1: Let's say that they are across the country from one another. Alice is over in California, Bob is in New York. The messages they send to each other are going to go through routers and switches and servers, and eventually they're going to pass through some sort of central server system before going through even more routers and switches and servers and eventually ending up on the other person's phone. So whatever service Alice and Bob are using, that centralized server, which is kind of like the central post office for that particular app, the message has to pass through there, because this is a specific app being used, offered by a specific company. So Alice sends Bob a message on their agreed-upon messaging app. Alice's message is going to head out over the Internet, hit the server in charge of handling the messaging service that they are both using, and then continue on until it hits Bob. Now, if the message is in plain text, meaning it's unencrypted, anyone along that pathway could potentially read the contents of that message.
Speaker 1: That includes hackers who could have compromised a system somewhere along that pathway and are trying to pull a man-in-the-middle attack. The message, in other words, is not safe. One way to fix that is through a simple encryption method in which Alice and Bob each have their own private encryption and decryption keys with the server. So let's say Alice and Bob are using a messaging app; let's call the app XYZ, and XYZ has its server systems. When Alice wants to send Bob a message, she uses her own personal XYZ encryption key to do so and sends it along. No one else other than XYZ has that encryption key, so Alice and XYZ share it, but nobody else does. Her message will get to the server, which sees that Alice wants to send a message to Bob. So the service says, all right, I'll send this to Bob. Except Bob can't read it, because it's been encrypted with Alice's key, and Bob doesn't have Alice's key.
Speaker 1: So what the server then does is decrypt the message, because it has Alice's key, and it also has Bob's key, so it then re-encrypts the message using Bob's key and then sends that along to Bob. So the message has been encrypted twice. Technically, it was encrypted, decrypted at the server level, encrypted again, and sent to Bob. Whenever the message is passing over the Internet, it is encrypted, which is pretty safe. But you may say, well, what about the time it spends on the server? That's the rub. See, this is not end-to-end encryption; I'll explain that a little bit later. The server is able to decrypt all incoming messages, no matter who is sending them, and then encrypt them with the respective keys belonging to whoever the intended recipients are. But this creates a few different problematic scenarios.
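That decrypt-and-re-encrypt relay can be sketched in code. To keep it self-contained, this uses a deliberately toy XOR "cipher" as a stand-in for a real symmetric cipher like AES; the app name XYZ and the key values are just the ones from the running example, not any real service:

```python
def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a repeating key. Applying it twice decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# XYZ's server holds every user's key -- that is exactly the weakness discussed above.
server_keys = {"alice": b"alice-secret", "bob": b"bob-secret"}

def relay(sender: str, recipient: str, ciphertext: bytes) -> bytes:
    """What the XYZ server does: decrypt with the sender's key, re-encrypt with the recipient's."""
    plaintext = xor_crypt(ciphertext, server_keys[sender])  # readable at the server!
    return xor_crypt(plaintext, server_keys[recipient])

message = b"Meet at noon"
in_transit = xor_crypt(message, server_keys["alice"])   # Alice -> server leg
to_bob = relay("alice", "bob", in_transit)              # server swaps the encryption
print(xor_crypt(to_bob, server_keys["bob"]))            # Bob decrypts: b'Meet at noon'
```

Notice the `plaintext` variable inside `relay`: for a moment the message exists unencrypted on the server, which is the exposure the rest of the episode is about.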
Speaker 1: One is that governments love this strategy, because it opens up the possibility that the server could release messages on a court order or some other means. Like, if the government orders the service to share those messages and the service is compelled to agree to it, then potentially the government could get hold of unencrypted plaintext messages, because at the server level everything gets unlocked. So the government agency can go to XYZ and say, we suspect Alice and Bob are plotting something dangerous; we have some evidence, but we want access to the messages they're sending to each other. Lives could be at stake. And then they flash a warrant or something, and because XYZ is following this strategy, it could hand over such information if ordered to, revealing all the messages between Alice and Bob. And maybe Alice and Bob really were plotting something terrible and it's prevented as a result, or maybe they're just ordinary people and their private messages have now been compromised; they were completely innocent, and yet their otherwise private communication has now been made at least semi-public.
Speaker 1: The strategy also creates an incredibly attractive target for hackers, obviously, because if you can compromise that central server, you can get at all the data that's going across it, because it gets decrypted there. In fact, this has happened a few times in the past, where hackers have managed to get access to such systems and mine them for data and dump all the information onto various places on the web, or more likely on the dark web. This is where stuff like credit card information or compromising photos can come into play, where hackers have managed to get into a server and get all that information. Sure, the messages were encrypted as they went over the Internet, but at that one central point everything was unlocked and ready for exploitation. End-to-end encryption aims to defeat this by creating a strategy in which only Alice and Bob know how to decrypt messages from the other. They have a private means of decrypting information. The central server does not have access to that decryption key, and so the messages sent across the server are meaningless to the server; it's just gobbledygook.
Speaker 1: The server could confirm that messages had passed between Alice and Bob. It could say, well, yes, we know that the two have communicated with each other, but the company wouldn't be able to speak to what was actually in those messages. So if a government agency served up a search warrant, all they would get is some garbled, seemingly random and meaningless data. They would need that private key to make any sense of it. Now, there are companies that absolutely love using this strategy, because it really takes the heat off of them. Not only are they able to offer their customers a secure way to communicate, there's no way for the company to know what sort of data is crossing its servers, so it cannot be complicit in anything illegal, because it doesn't know what's happening. It provides a service, one in which people may communicate with one another in a secure fashion, and that's it. But it raises a question: how the heck are Alice and Bob supposed to get a private key across to each other? Right?
Speaker 1: If Alice and Bob are apart from one another and their communication requires going through a centralized service, how do they establish a secret means of communication in the first place? If that first message is, hey, let's use this secret code, but it's decrypted by the centralized server, then the centralized server knows the secret code; it doesn't help. Well, the answer comes down to what is called an asymmetric cryptographic algorithm that uses public key cryptography. In this method, there are two keys that are used for each message. One of them is a public key, which is unique to each user and can be freely distributed across the internet. The other is a private key, which is a jealously guarded secret that allows only one entity the chance to decrypt information. I'll speak more about this in just a minute, but first let's take a quick break to thank our sponsor. All right, so how does public key cryptography work? Each person is using the system; let's call it ABC in this case, since we're talking about a different service from XYZ.
Speaker 1: Every single person using ABC gets two encryption keys, or, well, one encryption key and one decryption key, technically. One key is the person's private key, so Alice has a private key that's unique to her, and Bob has a private key that's unique to him. They also each have a public key, so Alice has a public key and Bob has another public key. These are also unique to Alice and Bob, but those keys are published publicly on the service, across the Internet. Anyone who wants to send an encrypted message to Alice or Bob must use their respective public keys. In other words, if I want to send Alice a message, I have to use Alice's public key to encrypt it. To decode a message, you have to have the private key. So if you know both keys for a specific message, you can transform that garbled mess into meaningful communication. And because you keep your own private key to yourself, at least ideally, the only person who can decode messages meant for you is you. So Alice takes Bob's public key and uses it to encode her message to Bob.
Speaker 1: She then sends the encoded message to Bob across the messaging service. The encoded message passes through ABC's server, which cannot read the message, because ABC does not have Bob's private key; only Bob has that. The message gets to Bob, he uses his private key, it decodes the message, and then he can read it in plain text. But what is actually going on here? How is this happening? Well, that depends heavily on the specific cryptographic strategy used, but we can go with one of the earliest proposed methods of doing this: the RSA algorithm, the one I mentioned earlier that was proposed by the guys who used Alice and Bob as examples in the first place. This one relies on prime numbers. And just a quick refresher: a prime number is only equal to one times the prime number itself. There are no other factors, no other whole numbers you can multiply together to get the prime number. If you can get to the value of a number by multiplying two other numbers, it's what we call a composite number, and it is not a prime number.
332 00:19:30,359 --> 00:19:34,840 Speaker 1: So the numbers two and three are both prime numbers. 333 00:19:34,880 --> 00:19:37,399 Speaker 1: They are only divisible by themselves and by one; 334 00:19:37,520 --> 00:19:41,560 Speaker 1: there are no other factors, no other whole numbers anyway, 335 00:19:41,840 --> 00:19:44,119 Speaker 1: because the only multiplication you can do to arrive at 336 00:19:44,160 --> 00:19:46,720 Speaker 1: them is one times two 337 00:19:46,760 --> 00:19:49,879 Speaker 1: is two and one times three is three, respectively. One itself is a special case, neither prime nor composite. Four 338 00:19:50,080 --> 00:19:53,680 Speaker 1: is the smallest composite number because you can multiply two 339 00:19:53,720 --> 00:19:56,600 Speaker 1: times two to get four. So it's one times four 340 00:19:56,680 --> 00:19:58,360 Speaker 1: is four and two times two is four. That means 341 00:19:58,400 --> 00:20:00,680 Speaker 1: it is not a prime number. It's a composite number. 342 00:20:00,840 --> 00:20:03,400 Speaker 1: So if a number is only divisible by one or itself, 343 00:20:03,600 --> 00:20:06,359 Speaker 1: then it's a prime number. The RSA algorithm 344 00:20:06,400 --> 00:20:12,520 Speaker 1: takes two really, really, really big prime numbers, like hundreds 345 00:20:12,520 --> 00:20:14,679 Speaker 1: of digits long. You might find systems that use a 346 00:20:14,680 --> 00:20:17,760 Speaker 1: 512 bit prime number, though a lot of security 347 00:20:17,800 --> 00:20:20,359 Speaker 1: experts would say a 1024 bit prime number 348 00:20:20,359 --> 00:20:24,919 Speaker 1: would be better. It takes those two already huge numbers. Remember, 349 00:20:24,960 --> 00:20:28,720 Speaker 1: it has to identify two different prime numbers that are 350 00:20:28,920 --> 00:20:33,959 Speaker 1: enormous and then multiplies those two enormous prime numbers together 351 00:20:34,240 --> 00:20:38,000 Speaker 1: to get a new number.
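The prime-versus-composite refresher above can be sketched in a few lines of Python. This is an illustrative aside added for this writeup, not anything from the episode itself:

```python
def is_prime(n):
    """Return True if n is divisible only by 1 and itself."""
    if n < 2:  # 0 and 1 are neither prime nor composite
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:  # found another factor, so n is composite
            return False
    return True

print([n for n in range(2, 15) if is_prime(n)])  # [2, 3, 5, 7, 11, 13]
```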
This number is called the modulus. 352 00:20:38,440 --> 00:20:42,240 Speaker 1: It then requires the calculation of the totient of that product. 353 00:20:43,160 --> 00:20:45,080 Speaker 1: And this is where you say, wait a minute, hang on, 354 00:20:45,200 --> 00:20:47,199 Speaker 1: what the heck is a totient? At least if you 355 00:20:47,240 --> 00:20:49,600 Speaker 1: are like me, I was an English lit major, I 356 00:20:49,680 --> 00:20:52,720 Speaker 1: never got to totients in my studies in math. But 357 00:20:52,840 --> 00:20:57,520 Speaker 1: a totient is a number that describes the number 358 00:20:57,520 --> 00:21:01,919 Speaker 1: of integers smaller than a given number that are co 359 00:21:02,040 --> 00:21:04,280 Speaker 1: prime with that number. Oh, that's a lot of numbers. 360 00:21:04,359 --> 00:21:06,119 Speaker 1: Let me try that again. Let's say you've got a 361 00:21:06,160 --> 00:21:12,160 Speaker 1: really big number, and the totient is the count of all the 362 00:21:12,200 --> 00:21:15,720 Speaker 1: integers that are smaller than this really big number that 363 00:21:15,760 --> 00:21:19,560 Speaker 1: are co prime with that really big number, which means 364 00:21:19,600 --> 00:21:23,639 Speaker 1: they share no factors except one with that really big number. 365 00:21:24,359 --> 00:21:28,119 Speaker 1: So let's use an example, because this sounds really really vague. Right. 366 00:21:28,240 --> 00:21:31,439 Speaker 1: Let's say that we have the number fourteen. We 367 00:21:31,560 --> 00:21:35,280 Speaker 1: got to fourteen by taking prime numbers. We 368 00:21:35,320 --> 00:21:37,680 Speaker 1: took the number two and the number seven and we 369 00:21:37,800 --> 00:21:40,480 Speaker 1: multiplied those together. Those are our prime numbers we started with. 370 00:21:40,560 --> 00:21:42,880 Speaker 1: So again, this is just an example. You would never 371 00:21:43,000 --> 00:21:46,080 Speaker 1: use numbers this small.
You multiply two times seven, you 372 00:21:46,080 --> 00:21:49,520 Speaker 1: get fourteen. Fourteen is your modulus. Then you say what 373 00:21:49,680 --> 00:21:54,399 Speaker 1: is the totient of fourteen? The answer is six, and 374 00:21:54,480 --> 00:21:58,919 Speaker 1: here is why. First, you take all the integers that 375 00:21:59,000 --> 00:22:02,960 Speaker 1: are less than fourteen, which is one through thirteen. Right, 376 00:22:03,240 --> 00:22:05,560 Speaker 1: you can't use fourteen. You have to use all 377 00:22:05,600 --> 00:22:07,480 Speaker 1: the whole numbers that are lower 378 00:22:07,480 --> 00:22:10,640 Speaker 1: than fourteen, so one through thirteen. But the numbers also 379 00:22:10,680 --> 00:22:13,280 Speaker 1: have to be co prime with fourteen. That means they 380 00:22:13,400 --> 00:22:17,880 Speaker 1: cannot share any factors that fourteen has. Now, the factors 381 00:22:17,920 --> 00:22:20,280 Speaker 1: of fourteen are two and seven, so we knock those 382 00:22:20,359 --> 00:22:23,159 Speaker 1: off the list, those get removed, so two and 383 00:22:23,200 --> 00:22:25,600 Speaker 1: seven are gone. But you also have to eliminate the 384 00:22:25,680 --> 00:22:30,440 Speaker 1: numbers four, six, eight, ten, and twelve. All of those 385 00:22:30,520 --> 00:22:34,200 Speaker 1: numbers have two as a factor, and fourteen also has 386 00:22:34,200 --> 00:22:37,159 Speaker 1: two as a factor, so those numbers are not co prime, 387 00:22:37,600 --> 00:22:40,280 Speaker 1: so you strike those. Once you've gotten rid of those, 388 00:22:40,600 --> 00:22:43,400 Speaker 1: the numbers you have left that are lower than fourteen 389 00:22:43,480 --> 00:22:49,760 Speaker 1: are one, three, five, nine, eleven, and thirteen.
Now nine 390 00:22:49,840 --> 00:22:52,160 Speaker 1: is not a prime number, right, You can multiply three 391 00:22:52,160 --> 00:22:55,760 Speaker 1: times three and get nine, but it is co prime 392 00:22:56,119 --> 00:23:00,080 Speaker 1: with fourteen because the two numbers share no common factor 393 00:23:00,240 --> 00:23:03,120 Speaker 1: except for the number one. So when you count up 394 00:23:03,280 --> 00:23:06,439 Speaker 1: all the co primes that are left after you've eliminated 395 00:23:06,520 --> 00:23:09,080 Speaker 1: the numbers that are not co primes, you find out 396 00:23:09,160 --> 00:23:14,760 Speaker 1: you've got six numbers total: one, three, five, nine, eleven, and thirteen. That's 397 00:23:14,840 --> 00:23:18,960 Speaker 1: six numbers. So the totient for fourteen is six. By 398 00:23:18,960 --> 00:23:21,320 Speaker 1: the way, there's actually a formula for calculating the totient 399 00:23:21,400 --> 00:23:24,600 Speaker 1: of your huge number that's really really easy. And what 400 00:23:24,720 --> 00:23:27,239 Speaker 1: you do is you take your first prime number, the 401 00:23:27,240 --> 00:23:29,720 Speaker 1: one that you started with. You know, in our case we 402 00:23:29,800 --> 00:23:31,680 Speaker 1: used two and seven. You take your first 403 00:23:31,680 --> 00:23:35,720 Speaker 1: prime number, you subtract one from that. You take your 404 00:23:35,760 --> 00:23:39,600 Speaker 1: second big prime number, you subtract one from that, and 405 00:23:39,640 --> 00:23:42,480 Speaker 1: you multiply those two new numbers together, and that is 406 00:23:42,520 --> 00:23:45,080 Speaker 1: your totient. So in our example, we take our two 407 00:23:45,080 --> 00:23:48,240 Speaker 1: prime numbers of two and seven, we subtract one from each. 408 00:23:48,400 --> 00:23:50,600 Speaker 1: That means we have a one and a six. We 409 00:23:50,720 --> 00:23:54,199 Speaker 1: multiply these two new numbers together, one times six is six.
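Both routes to the totient described above, counting the coprimes by hand and the (p - 1) times (q - 1) shortcut, can be checked in a small Python sketch. Again, this is an illustration added here, not from the episode:

```python
from math import gcd

def totient_brute_force(n):
    """Count the integers from 1 to n - 1 that share no factor with n except 1."""
    return sum(1 for k in range(1, n) if gcd(k, n) == 1)

p, q = 2, 7          # the episode's toy primes
modulus = p * q      # 14
print(totient_brute_force(modulus))  # 6, counting 1, 3, 5, 9, 11, 13
print((p - 1) * (q - 1))             # 6, the same answer via the shortcut
```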
410 00:23:54,760 --> 00:23:57,399 Speaker 1: That's the totient for fourteen. It's the fastest way to 411 00:23:57,440 --> 00:24:01,159 Speaker 1: calculate the totient. Next, because we're not done yet, you 412 00:24:01,240 --> 00:24:04,160 Speaker 1: have to select an integer, but not just any integer. 413 00:24:04,720 --> 00:24:07,240 Speaker 1: The integer you select has to be larger than the 414 00:24:07,320 --> 00:24:11,480 Speaker 1: number one, but smaller than the totient of your big 415 00:24:11,480 --> 00:24:14,199 Speaker 1: old number that you generated earlier by multiplying those two 416 00:24:14,240 --> 00:24:16,719 Speaker 1: prime numbers together and then taking the totient from it. 417 00:24:17,320 --> 00:24:19,719 Speaker 1: In fact, it has to be co prime with 418 00:24:19,760 --> 00:24:24,119 Speaker 1: the totient and the modulus itself, so it cannot share 419 00:24:24,359 --> 00:24:27,560 Speaker 1: a factor with the totient or with the modulus. So again, 420 00:24:27,600 --> 00:24:30,679 Speaker 1: in my example, where I had fourteen as our modulus, 421 00:24:30,920 --> 00:24:34,320 Speaker 1: the totient was six, we need an integer between one 422 00:24:34,359 --> 00:24:38,240 Speaker 1: and six that is co prime with both six and fourteen. 423 00:24:38,760 --> 00:24:40,960 Speaker 1: So we cannot use one because the number has to 424 00:24:40,960 --> 00:24:44,240 Speaker 1: be bigger than one. We can't use two or three 425 00:24:44,600 --> 00:24:49,240 Speaker 1: or four because all of those share factors with six. Right, 426 00:24:49,359 --> 00:24:52,359 Speaker 1: it has to be co prime, so you cannot use those. 427 00:24:52,760 --> 00:24:56,240 Speaker 1: The only number available to us is five. This becomes 428 00:24:56,280 --> 00:24:59,719 Speaker 1: our encryption key.
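The constraints on the public exponent just described can be checked mechanically. A small Python sketch, added for illustration only:

```python
from math import gcd

modulus, tot = 14, 6
# The public exponent must be bigger than one, smaller than the totient,
# and coprime with both the totient and the modulus.
candidates = [e for e in range(2, tot)
              if gcd(e, tot) == 1 and gcd(e, modulus) == 1]
print(candidates)  # [5], five is the only valid choice in this toy example
```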
Figuring out the decryption key is a 429 00:24:59,720 --> 00:25:02,920 Speaker 1: little more complicated, and that's probably making some of you 430 00:25:03,200 --> 00:25:05,280 Speaker 1: scratch your heads, because what I just went through is 431 00:25:05,480 --> 00:25:07,520 Speaker 1: fairly complicated for those of us who are not very 432 00:25:07,560 --> 00:25:10,959 Speaker 1: mathematically inclined. I include myself in that number. By the way, 433 00:25:11,240 --> 00:25:14,040 Speaker 1: the prime number we arrived at for the encryption key 434 00:25:14,160 --> 00:25:17,159 Speaker 1: has the greatest common divisor, or GCD, of 435 00:25:17,280 --> 00:25:20,600 Speaker 1: one with the totient of the modulus. So we take 436 00:25:20,640 --> 00:25:24,480 Speaker 1: the multiplicative inverse of this number with respect to the 437 00:25:24,480 --> 00:25:28,200 Speaker 1: totient using the extended Euclidean algorithm. Yeah, it gets super 438 00:25:28,280 --> 00:25:30,760 Speaker 1: duper complicated, and I'm pretty sure I can't explain it, 439 00:25:31,440 --> 00:25:35,919 Speaker 1: especially not in audio alone, so we will skip ahead. 440 00:25:36,240 --> 00:25:39,680 Speaker 1: You just imagine you do some more complicated math and 441 00:25:39,760 --> 00:25:42,479 Speaker 1: you denote this result with the letter D, and that 442 00:25:42,560 --> 00:25:46,560 Speaker 1: represents the private key. Now, I used the prime numbers 443 00:25:46,560 --> 00:25:48,520 Speaker 1: two and seven in my example only because they were 444 00:25:48,520 --> 00:25:51,000 Speaker 1: easy to work with, but that's precisely why no one 445 00:25:51,040 --> 00:25:53,840 Speaker 1: in their right mind would ever use those to encrypt anything. 446 00:25:54,320 --> 00:25:56,320 Speaker 1: You want to have a pretty large number to work with, 447 00:25:56,520 --> 00:25:59,000 Speaker 1: and even your public key should be pretty big.
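The whole toy example, modulus, totient, public exponent, and the private key D via the modular inverse, fits in a few lines of Python. This sketch is an illustration added here, not from the episode; never use primes this small for real:

```python
# Toy RSA with the episode's numbers.
p, q = 2, 7
n = p * q                  # modulus: 14
tot = (p - 1) * (q - 1)    # totient: 6
e = 5                      # the public exponent chosen above

# The private key d is the multiplicative inverse of e modulo the totient.
# Python 3.8+ computes it directly (via the extended Euclidean algorithm).
d = pow(e, -1, tot)
print(d)                   # 5, because (5 * 5) % 6 == 1

message = 3                # a "message" must be smaller than the modulus
ciphertext = pow(message, e, n)   # encrypt with the public pair (e, n)
print(ciphertext)                 # 5
print(pow(ciphertext, d, n))      # 3, decrypting recovers the original
```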
A 448 00:25:59,119 --> 00:26:03,480 Speaker 1: standard public key is sixty five thousand, five hundred and thirty seven, which 449 00:26:03,520 --> 00:26:05,520 Speaker 1: is a prime number, and as long as it has 450 00:26:05,560 --> 00:26:07,800 Speaker 1: no common factors with the modulus, you're good to go. 451 00:26:08,040 --> 00:26:10,120 Speaker 1: And the only way it could have a common factor 452 00:26:10,160 --> 00:26:13,479 Speaker 1: with the modulus is if it was itself a factor 453 00:26:13,520 --> 00:26:15,919 Speaker 1: of your big number, but that's not likely to happen. 454 00:26:16,280 --> 00:26:18,000 Speaker 1: So you want a big key, but you don't want 455 00:26:18,040 --> 00:26:19,840 Speaker 1: it to be too big, and the reason for that 456 00:26:20,000 --> 00:26:22,840 Speaker 1: is encryption is more efficient if you're using smaller numbers 457 00:26:22,840 --> 00:26:26,880 Speaker 1: as your keys. Remember, essentially encryption involves doing lots 458 00:26:26,880 --> 00:26:29,280 Speaker 1: of math problems. The bigger the numbers are for your 459 00:26:29,320 --> 00:26:31,880 Speaker 1: math problems, the bigger the results are going to be, 460 00:26:32,040 --> 00:26:34,119 Speaker 1: and the larger it's going to make your message. So 461 00:26:34,160 --> 00:26:37,199 Speaker 1: if you're trying to encrypt a very long message to someone, 462 00:26:37,760 --> 00:26:40,920 Speaker 1: it ends up being a much larger amount of data, 463 00:26:41,320 --> 00:26:45,640 Speaker 1: like sometimes several times larger than the original amount of data. 464 00:26:45,680 --> 00:26:47,560 Speaker 1: And if it gets too big, you may not even 465 00:26:47,600 --> 00:26:49,920 Speaker 1: be able to send it if the service you're using 466 00:26:49,960 --> 00:26:53,920 Speaker 1: has a limit on the size of messages.
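That standard public exponent, 65537, is easy to sanity-check. A quick Python illustration, added for this writeup:

```python
# The common choice of public exponent is 65537, which is 2**16 + 1.
e = 65537
print(e == 2 ** 16 + 1)  # True

# Trial division is enough to confirm that 65537 really is prime.
print(all(e % d != 0 for d in range(2, int(e ** 0.5) + 1)))  # True
```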
So if 467 00:26:53,960 --> 00:26:58,120 Speaker 1: you go too big, it gets inefficient and clunky and 468 00:26:58,200 --> 00:27:00,439 Speaker 1: sometimes too big for you to even handle. But if 469 00:27:00,440 --> 00:27:02,720 Speaker 1: you go too small, you risk the possibility of someone 470 00:27:02,760 --> 00:27:05,600 Speaker 1: being able to suss out the private key given enough 471 00:27:05,600 --> 00:27:08,360 Speaker 1: time and processing power, so it's a balancing act. One 472 00:27:08,400 --> 00:27:10,679 Speaker 1: thing you could do to make things move more quickly 473 00:27:11,040 --> 00:27:13,520 Speaker 1: is to use this public and private key approach to 474 00:27:13,680 --> 00:27:17,679 Speaker 1: establish a new encryption key between the two communicators. This 475 00:27:17,720 --> 00:27:21,000 Speaker 1: would be called a session key. So Alice would send 476 00:27:21,000 --> 00:27:24,560 Speaker 1: Bob a message, she would use Bob's public encryption key, 477 00:27:24,600 --> 00:27:27,359 Speaker 1: and she would encrypt a message that says, hey, 478 00:27:27,440 --> 00:27:29,960 Speaker 1: got a second, and she sends it to Bob, and 479 00:27:30,040 --> 00:27:32,840 Speaker 1: Bob receives the message. He uses his private key, he 480 00:27:33,000 --> 00:27:35,840 Speaker 1: decodes the message and sends a message back to Alice 481 00:27:35,960 --> 00:27:39,040 Speaker 1: using her public encryption key that says, sure I do. 482 00:27:39,560 --> 00:27:42,679 Speaker 1: And here's a brand new private key that we can 483 00:27:42,760 --> 00:27:46,119 Speaker 1: use with each other for the purposes of this conversation session. 484 00:27:46,680 --> 00:27:49,640 Speaker 1: And then Alice would decrypt this message, and they would 485 00:27:49,720 --> 00:27:53,520 Speaker 1: each have a private key that they could use with 486 00:27:53,600 --> 00:27:57,400 Speaker 1: each other, a symmetric key.
All subsequent messages between Bob 487 00:27:57,400 --> 00:28:01,320 Speaker 1: and Alice would use this new private encryption methodology. And 488 00:28:01,359 --> 00:28:04,800 Speaker 1: we call this symmetric because you're using the same key 489 00:28:04,840 --> 00:28:09,119 Speaker 1: to encrypt and decrypt. Otherwise, you wouldn't be 490 00:28:09,119 --> 00:28:12,120 Speaker 1: able to send this key to each other because it would 491 00:28:12,160 --> 00:28:14,520 Speaker 1: be public information and everyone would have a copy of 492 00:28:14,560 --> 00:28:18,199 Speaker 1: the key. Using the public private key method to first 493 00:28:18,800 --> 00:28:22,240 Speaker 1: send this key is clever because the first message, anyone could 494 00:28:22,280 --> 00:28:24,720 Speaker 1: intercept it, but they're not gonna know what's in it. 495 00:28:25,200 --> 00:28:28,120 Speaker 1: And then all subsequent messages would be using a totally 496 00:28:28,200 --> 00:28:34,439 Speaker 1: different encryption methodology. So even if someone were to 497 00:28:34,640 --> 00:28:39,239 Speaker 1: monitor this communication, the communications they would see would be 498 00:28:39,360 --> 00:28:43,760 Speaker 1: encrypted in different ways, and that would make it practically 499 00:28:43,800 --> 00:28:46,280 Speaker 1: impossible to figure out what was going on. You could 500 00:28:46,280 --> 00:28:49,320 Speaker 1: even go a step further and have a new key 501 00:28:49,360 --> 00:28:52,920 Speaker 1: generated with essentially every single message, so it's essentially a 502 00:28:52,920 --> 00:28:56,160 Speaker 1: one-time pad.
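The session-key handshake just described can be sketched in Python. This is an illustration added here, not from the episode, and the toy XOR cipher stands in for a real symmetric algorithm such as AES:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric toy cipher: applying the same key a second time restores the original."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

session_key = secrets.token_bytes(16)  # the brand new key for this session
# In the real exchange this key would travel wrapped in the other party's
# public key, so only their private key could unwrap it.

msg = b"got a second?"
encrypted = xor_cipher(msg, session_key)
print(xor_cipher(encrypted, session_key))  # b'got a second?'
```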
Each message gets a new key to be 503 00:28:56,280 --> 00:29:00,280 Speaker 1: encrypted by, and since it keeps changing each time someone 504 00:29:00,280 --> 00:29:03,240 Speaker 1: sends a message, like Bob sends Alice a new message 505 00:29:03,240 --> 00:29:08,160 Speaker 1: and says, secret, secret, secret, brand new key, 506 00:29:08,200 --> 00:29:11,200 Speaker 1: Alice uses the brand new key to encrypt her message 507 00:29:11,200 --> 00:29:13,520 Speaker 1: back to Bob, and it says more secrets, more secrets, 508 00:29:13,520 --> 00:29:18,160 Speaker 1: more secrets, brand new key. Bob uses the 509 00:29:18,240 --> 00:29:22,320 Speaker 1: key from one message earlier to unlock Alice's 510 00:29:22,560 --> 00:29:26,120 Speaker 1: message, reads it, then uses the brand new key to 511 00:29:26,280 --> 00:29:28,800 Speaker 1: send the next message. You could keep doing this forever. 512 00:29:29,320 --> 00:29:31,960 Speaker 1: You could keep sending a brand new key with every 513 00:29:32,000 --> 00:29:35,040 Speaker 1: single message, and it would make your communications very secure. 514 00:29:35,080 --> 00:29:37,400 Speaker 1: It would be a little slow because you would have 515 00:29:37,480 --> 00:29:41,640 Speaker 1: to do this decryption encryption thing every single step, and 516 00:29:41,760 --> 00:29:44,040 Speaker 1: you wouldn't be able to repeat the steps because you'd 517 00:29:44,080 --> 00:29:46,640 Speaker 1: be using a new key every time, but it would 518 00:29:46,640 --> 00:29:49,160 Speaker 1: be really secure, and in fact, there are some messaging 519 00:29:49,200 --> 00:29:53,520 Speaker 1: apps that use this kind of methodology. So the really 520 00:29:53,520 --> 00:29:57,120 Speaker 1: big benefit of symmetric encryption is that.
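The per-message rekeying described above can be sketched as each message carrying the key for the next one. This is a loose illustration added here, not the episode's material or any real app's protocol, with a toy XOR cipher standing in for a real symmetric algorithm:

```python
import secrets

def xor_cipher(data, key):
    # Toy stand-in for a real symmetric cipher (illustration only).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

KEY_LEN = 16
key = secrets.token_bytes(KEY_LEN)   # agreed during the initial handshake
recv_key = key

# Sender: every message carries the brand new key for the NEXT message,
# encrypted under the current one.
payloads = []
for text in [b"secret one", b"secret two", b"secret three"]:
    next_key = secrets.token_bytes(KEY_LEN)
    payloads.append(xor_cipher(text + next_key, key))
    key = next_key

# Receiver: peel off each new key while reading the messages in order.
decoded = []
for payload in payloads:
    plain = xor_cipher(payload, recv_key)
    decoded.append(plain[:-KEY_LEN])
    recv_key = plain[-KEY_LEN:]
print(decoded)  # [b'secret one', b'secret two', b'secret three']
```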
Well, one, it's 521 00:29:57,200 --> 00:30:00,520 Speaker 1: faster than using the public and private key approach, but another 522 00:30:00,640 --> 00:30:03,000 Speaker 1: is that it limits the number of times that you use 523 00:30:03,080 --> 00:30:06,520 Speaker 1: the public key, and that can be a good thing. Now, 524 00:30:06,520 --> 00:30:10,160 Speaker 1: it's not easy with today's computers, but 525 00:30:10,240 --> 00:30:14,040 Speaker 1: it's at least theoretically possible that hackers could take 526 00:30:14,240 --> 00:30:16,960 Speaker 1: a public encryption key and work backward to figure out 527 00:30:16,960 --> 00:30:20,440 Speaker 1: the private encryption key. It's a non trivial problem. It 528 00:30:20,480 --> 00:30:23,640 Speaker 1: requires a ton of computer processing power, and it's not 529 00:30:23,720 --> 00:30:26,800 Speaker 1: likely to happen if you're using really secure encryption. But 530 00:30:26,880 --> 00:30:30,120 Speaker 1: the more frequently you use the same public key, the more 531 00:30:30,200 --> 00:30:32,120 Speaker 1: data hackers have to work with, and they can start 532 00:30:32,200 --> 00:30:38,440 Speaker 1: looking for patterns. Patterns are the bane of encryption. When 533 00:30:38,440 --> 00:30:41,920 Speaker 1: you have patterns, then you can start to establish rules. 534 00:30:41,960 --> 00:30:43,800 Speaker 1: And when you start to establish rules, you can start 535 00:30:43,840 --> 00:30:46,600 Speaker 1: working backward and figuring out what generates those rules, and 536 00:30:46,600 --> 00:30:50,680 Speaker 1: eventually you figure out the methodology for encoding the information. 537 00:30:51,440 --> 00:30:55,640 Speaker 1: So you don't want to have patterns in your encoded information.
538 00:30:55,720 --> 00:30:59,280 Speaker 1: If possible. If you keep using the public private key 539 00:30:59,320 --> 00:31:02,959 Speaker 1: method and it's all you're using, then hackers eventually can 540 00:31:03,000 --> 00:31:06,560 Speaker 1: gather enough information that, if they have a sufficiently powerful 541 00:31:06,600 --> 00:31:11,800 Speaker 1: computer and enough time on their hands, they can decrypt it. 542 00:31:11,840 --> 00:31:14,160 Speaker 1: That if is a big one though, because if you're 543 00:31:14,280 --> 00:31:20,000 Speaker 1: using pretty strong encryption, it would take weeks or months 544 00:31:20,200 --> 00:31:24,400 Speaker 1: or more likely years to decrypt the messages. Now, in 545 00:31:24,440 --> 00:31:28,800 Speaker 1: a recent episode, I talked about quantum computers. With a 546 00:31:28,840 --> 00:31:31,760 Speaker 1: classical computer, like I said, figuring out the prime factors of 547 00:31:31,800 --> 00:31:35,160 Speaker 1: a really large number takes an incredibly long time. Depending 548 00:31:35,160 --> 00:31:37,160 Speaker 1: on the size of the number, it could take, like 549 00:31:37,200 --> 00:31:39,840 Speaker 1: I said, years for a classical computer to sort it out. 550 00:31:39,880 --> 00:31:43,760 Speaker 1: But a quantum computer, if it's sufficiently powerful, could solve 551 00:31:43,760 --> 00:31:49,200 Speaker 1: those problems much more quickly using a process called Shor's algorithm. 552 00:31:49,560 --> 00:31:51,400 Speaker 1: So if you have a quantum computer with a sufficient 553 00:31:51,480 --> 00:31:54,680 Speaker 1: number of qubits, you could crack this type of encryption 554 00:31:54,720 --> 00:31:58,480 Speaker 1: relatively quickly and pretty reliably.
It's therefore imperative to 555 00:31:58,520 --> 00:32:01,040 Speaker 1: look at new methods of encryption in order to avoid 556 00:32:01,080 --> 00:32:04,200 Speaker 1: a situation in which the first really powerful quantum computer 557 00:32:04,400 --> 00:32:08,200 Speaker 1: effectively gets a skeleton key to all encrypted messages that 558 00:32:08,240 --> 00:32:11,080 Speaker 1: have been sent across the Internet. Now, let's talk a 559 00:32:11,120 --> 00:32:14,360 Speaker 1: second about the history of end to end encryption, and 560 00:32:14,400 --> 00:32:16,520 Speaker 1: then a bit more about the apps and services that 561 00:32:16,720 --> 00:32:23,720 Speaker 1: use it and some that do not use it. In 1976, Whitfield 562 00:32:23,720 --> 00:32:27,120 Speaker 1: Diffie and Martin Hellman proposed the concept of using public 563 00:32:27,160 --> 00:32:32,080 Speaker 1: and private key combinations in order to distribute symmetric encryption keys. 564 00:32:32,480 --> 00:32:35,240 Speaker 1: It's possible they weren't the first to consider this, too. 565 00:32:35,400 --> 00:32:38,560 Speaker 1: There's been some evidence to suggest the British intelligence agency GCHQ 566 00:32:38,600 --> 00:32:40,960 Speaker 1: had come up with a similar approach but never really 567 00:32:41,000 --> 00:32:43,719 Speaker 1: did anything with it.
Their idea was the basis not 568 00:32:43,800 --> 00:32:46,480 Speaker 1: just for the RSA algorithm, but a few 569 00:32:46,480 --> 00:32:49,720 Speaker 1: others as well, like the ElGamal cryptosystem, which 570 00:32:49,760 --> 00:32:52,800 Speaker 1: was named after Taher Elgamal, or the DSA 571 00:32:52,840 --> 00:32:56,960 Speaker 1: system, also known as the Digital Signature Algorithm, invented 572 00:32:57,000 --> 00:33:01,360 Speaker 1: by a guy named David Kravitz, as well as the 573 00:33:01,440 --> 00:33:05,520 Speaker 1: Diffie Hellman cryptosystem, which obviously was created by Whitfield Diffie 574 00:33:05,520 --> 00:33:09,680 Speaker 1: and Martin Hellman. Then along came PGP, which 575 00:33:09,720 --> 00:33:14,560 Speaker 1: stands for Pretty Good Privacy. The cryptosystem is a hybrid system. 576 00:33:14,720 --> 00:33:20,200 Speaker 1: It was proposed by Phil Zimmerman in 1991. Zimmerman graduated 577 00:33:20,280 --> 00:33:24,080 Speaker 1: from Florida Atlantic University with a degree in computer science, 578 00:33:24,480 --> 00:33:27,480 Speaker 1: and he was active in a project called the Freeze, 579 00:33:27,600 --> 00:33:31,760 Speaker 1: also known as the Nuclear Weapons Freeze campaign. The purpose 580 00:33:31,880 --> 00:33:35,080 Speaker 1: of this organization was to try and curtail nuclear arms 581 00:33:35,120 --> 00:33:39,000 Speaker 1: production in an effort to de escalate mounting global tensions 582 00:33:39,120 --> 00:33:43,120 Speaker 1: and to remove an existential threat to the 583 00:33:43,200 --> 00:33:47,280 Speaker 1: human race, and Zimmerman created PGP to allow 584 00:33:47,320 --> 00:33:50,239 Speaker 1: for secure email communications among various parties to make their 585 00:33:50,280 --> 00:33:54,040 Speaker 1: efforts more effective and less likely to get snooped on.
Now, 586 00:33:54,040 --> 00:33:56,880 Speaker 1: when you use PGP, the first thing it 587 00:33:56,920 --> 00:34:00,760 Speaker 1: does is it takes your plain text message and compresses it, 588 00:34:01,440 --> 00:34:03,800 Speaker 1: and so it makes it smaller, which is one benefit, 589 00:34:03,880 --> 00:34:07,440 Speaker 1: but it also helps to make it more secure because 590 00:34:07,440 --> 00:34:10,440 Speaker 1: it reduces patterns that might otherwise be useful to hackers 591 00:34:10,480 --> 00:34:14,879 Speaker 1: who want to decrypt messages. It also generates a session key, 592 00:34:14,920 --> 00:34:17,640 Speaker 1: that's that symmetric encryption key I was just talking about 593 00:34:17,680 --> 00:34:19,880 Speaker 1: that would be used throughout the length of any particular 594 00:34:19,920 --> 00:34:23,239 Speaker 1: communication session. The way PGP does this is pretty cool. 595 00:34:23,280 --> 00:34:26,720 Speaker 1: It actually generates the data for the session key based 596 00:34:26,719 --> 00:34:29,600 Speaker 1: on your mouse movements and key strokes you've typed, so 597 00:34:29,719 --> 00:34:32,520 Speaker 1: it's based upon physical interactions you've had with your computer, 598 00:34:33,200 --> 00:34:37,520 Speaker 1: not on just any random number. All of this, 599 00:34:37,719 --> 00:34:40,600 Speaker 1: the compressed plain text message and the unique session key 600 00:34:40,640 --> 00:34:43,959 Speaker 1: generated from your physical interactions with your computer, then get 601 00:34:44,120 --> 00:34:49,440 Speaker 1: encoded using this public private key strategy.
If Alice sends 602 00:34:49,440 --> 00:34:53,520 Speaker 1: a message using PGP to Bob, first, Alice's computer will 603 00:34:53,520 --> 00:34:57,200 Speaker 1: compress her message, it will append a session key based 604 00:34:57,200 --> 00:35:00,919 Speaker 1: on Alice's typing and mouse movements, and then encrypt all 605 00:35:01,000 --> 00:35:04,640 Speaker 1: of that using Bob's public key. Then the message 606 00:35:04,760 --> 00:35:08,280 Speaker 1: travels to Bob. Bob uses his private key to decode 607 00:35:08,280 --> 00:35:11,320 Speaker 1: the message and he can see the session key Alice 608 00:35:11,320 --> 00:35:13,719 Speaker 1: has set up and then use that to communicate back 609 00:35:13,760 --> 00:35:16,880 Speaker 1: with her securely for the rest of the session. Zimmerman's 610 00:35:16,920 --> 00:35:20,040 Speaker 1: PGP was more than just pretty good. It was 611 00:35:20,120 --> 00:35:23,359 Speaker 1: actually a brilliant approach to encryption. It was adopted by 612 00:35:23,360 --> 00:35:26,640 Speaker 1: many organizations and people across the United States and then 613 00:35:26,680 --> 00:35:29,040 Speaker 1: the world, and that caused a huge amount of trouble 614 00:35:29,160 --> 00:35:32,760 Speaker 1: for Zimmerman. For one thing, RSA Security wished 615 00:35:32,760 --> 00:35:35,960 Speaker 1: to question Zimmerman about the use of the RSA 616 00:35:36,040 --> 00:35:38,799 Speaker 1: algorithm within PGP. There were some disputes about 617 00:35:38,840 --> 00:35:43,080 Speaker 1: the licensing. Then the United States Customs Service decided to 618 00:35:43,080 --> 00:35:46,840 Speaker 1: investigate Zimmerman because of the distribution of PGP beyond 619 00:35:46,920 --> 00:35:51,160 Speaker 1: the US borders.
Because in the United States there was 620 00:35:51,239 --> 00:35:56,120 Speaker 1: the Arms Export Control Act and it listed cryptographic software as 621 00:35:56,160 --> 00:35:59,600 Speaker 1: a type of munitions, which would mean if Zimmerman had 622 00:35:59,600 --> 00:36:02,080 Speaker 1: allowed his work to go outside the US, he would 623 00:36:02,080 --> 00:36:04,840 Speaker 1: be in very big trouble because it was similar to 624 00:36:04,960 --> 00:36:09,000 Speaker 1: shipping prohibited weapons outside the country. And if you think 625 00:36:09,000 --> 00:36:11,920 Speaker 1: that sounds a little crazy, that software could be considered 626 00:36:11,920 --> 00:36:15,960 Speaker 1: a weapon, welcome to the twenty first century. Zimmerman was 627 00:36:16,160 --> 00:36:19,640 Speaker 1: never officially charged with any crime, but the investigation did 628 00:36:19,800 --> 00:36:23,359 Speaker 1: last several years. Eventually, the courts determined that PGP 629 00:36:23,560 --> 00:36:26,680 Speaker 1: did not fall into a category that would be considered munitions, 630 00:36:26,960 --> 00:36:30,440 Speaker 1: and today his PGP technology is the property of the 631 00:36:30,480 --> 00:36:34,560 Speaker 1: security firm Symantec, which purchased the PGP Corporation back in 2010. 632 00:36:35,520 --> 00:36:36,719 Speaker 1: Now I've got a little bit more I want to 633 00:36:36,719 --> 00:36:39,200 Speaker 1: say about end to end encryption, but before I get 634 00:36:39,239 --> 00:36:42,080 Speaker 1: to that, let's take another quick break to thank our sponsor. 635 00:36:49,200 --> 00:36:53,080 Speaker 1: So the PGP strategy also allows for digital signatures, which 636 00:36:53,080 --> 00:36:54,920 Speaker 1: are a way for you to make sure the message 637 00:36:54,960 --> 00:36:58,319 Speaker 1: being sent to you hasn't been intercepted and altered in 638 00:36:58,360 --> 00:37:00,840 Speaker 1: any way.
Plus you can be sure that the message 639 00:37:00,920 --> 00:37:03,960 Speaker 1: came from the person it claims to have come from. 640 00:37:04,120 --> 00:37:07,000 Speaker 1: Digital signatures in PGP use what is called 641 00:37:07,040 --> 00:37:11,120 Speaker 1: a one way hash function. The implementation PGP 642 00:37:11,360 --> 00:37:13,640 Speaker 1: uses is to take the message, which can be of 643 00:37:13,680 --> 00:37:17,640 Speaker 1: any length. So let's say it's a really really long message. 644 00:37:17,880 --> 00:37:20,840 Speaker 1: Then it does a mathematical process on it to arrive 645 00:37:20,920 --> 00:37:24,800 Speaker 1: at a fixed length output, such as 160 bits. 646 00:37:24,840 --> 00:37:28,080 Speaker 1: So the 160 bits do not contain the entire message. 647 00:37:28,080 --> 00:37:32,200 Speaker 1: They represent the entire message. It's a fine distinction. It doesn't 648 00:37:32,239 --> 00:37:34,960 Speaker 1: matter how long that original message was. The goal is 649 00:37:35,000 --> 00:37:39,160 Speaker 1: to create this shorter hash that speeds things up. It 650 00:37:39,239 --> 00:37:43,000 Speaker 1: doesn't make file sizes balloon from the encryption process. The 651 00:37:43,040 --> 00:37:47,439 Speaker 1: hash function depends entirely on the message itself. So if 652 00:37:47,480 --> 00:37:50,680 Speaker 1: someone were to change even one single bit, as in one 653 00:37:50,800 --> 00:37:54,280 Speaker 1: zero or one, of information in that message, the hash 654 00:37:54,400 --> 00:37:58,240 Speaker 1: value would also change, because it depends upon the nature 655 00:37:58,280 --> 00:38:00,479 Speaker 1: of the rest of the message. So if you send 656 00:38:00,480 --> 00:38:02,960 Speaker 1: a message to someone and they know what the hash 657 00:38:03,040 --> 00:38:05,960 Speaker 1: value is supposed to be, they can verify that the 658 00:38:06,000 --> 00:38:09,960 Speaker 1: associated message was never altered.
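The fixed-length, content-sensitive behavior of a one way hash is easy to see in Python with SHA-1, which happens to produce exactly 160 bits. An illustration added for this writeup, not the episode's own example:

```python
import hashlib

# SHA-1 always produces a 160 bit digest, however long the input is.
long_message = b"attack at dawn " * 1000
print(len(hashlib.sha1(long_message).digest()) * 8)   # 160
print(len(hashlib.sha1(b"ok").digest()) * 8)          # 160

# Changing a single character (a single bit would do) changes the digest.
print(hashlib.sha1(b"attack at dawn").hexdigest() ==
      hashlib.sha1(b"attack at dusk").hexdigest())    # False
```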
So they've got essentially the 659 00:38:09,960 --> 00:38:12,640 Speaker 1: hash value it's supposed to be and the hash 660 00:38:12,719 --> 00:38:15,360 Speaker 1: value it actually is. If those two numbers are different, 661 00:38:15,760 --> 00:38:18,480 Speaker 1: then they know that something has happened to the message 662 00:38:18,640 --> 00:38:22,000 Speaker 1: in transit. The way PGP does this is to generate 663 00:38:22,040 --> 00:38:25,720 Speaker 1: a hash called the message digest. It uses the message 664 00:38:25,840 --> 00:38:29,279 Speaker 1: digest and the private key to create a signature. Then 665 00:38:29,360 --> 00:38:32,160 Speaker 1: PGP sends the plaintext and signature together to 666 00:38:32,320 --> 00:38:36,120 Speaker 1: the recipient. The recipient uses PGP to recompute the message 667 00:38:36,200 --> 00:38:39,080 Speaker 1: digest to verify it is from the supposed sender and 668 00:38:39,080 --> 00:38:42,400 Speaker 1: that it hasn't been altered. And if the recomputed message 669 00:38:42,480 --> 00:38:45,200 Speaker 1: digest is the same as the original that was in 670 00:38:45,239 --> 00:38:49,160 Speaker 1: the message, everything's good. If the recomputed one is not 671 00:38:49,280 --> 00:38:52,399 Speaker 1: the same, then something has happened, and you know that 672 00:38:52,440 --> 00:38:56,319 Speaker 1: the trail between the two has been compromised in 673 00:38:56,400 --> 00:38:59,239 Speaker 1: some way. By the way, this does not reveal the 674 00:38:59,280 --> 00:39:02,680 Speaker 1: private key to the other person. The private key remains 675 00:39:02,719 --> 00:39:05,600 Speaker 1: private. PGP is able to protect that, 676 00:39:05,680 --> 00:39:09,279 Speaker 1: which is kind of cool. Alright. So who uses end 677 00:39:09,280 --> 00:39:12,320 Speaker 1: to end encryption and who does not? Well? 
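The sign-then-verify flow just described can be sketched with a toy textbook-RSA signature in plain Python. The tiny hard-coded primes, the lack of padding, and the modular reduction of the digest are all illustrative assumptions so the example stays self-contained; real PGP uses large keys and proper padding schemes.

```python
import hashlib

# Toy RSA key pair (illustrative only -- real keys are 2048+ bits).
p, q = 61, 53
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent, kept secret

def digest(msg: bytes) -> int:
    # Hash the message, then reduce it so it fits under the tiny modulus.
    return int.from_bytes(hashlib.sha1(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    # Signature = message digest transformed with the PRIVATE key.
    return pow(digest(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    # Recompute the digest locally and compare it with the signature
    # transformed back using the PUBLIC key.
    return pow(sig, e, n) == digest(msg)

msg = b"We agree to the contract terms."
sig = sign(msg)
print(verify(msg, sig))                        # intact, and made with the private key
print(verify(b"We agree to NEW terms.", sig))  # digest no longer matches
```

Note that verification only ever uses the public values `e` and `n`, which matches the point in the episode: checking a signature never exposes the private key.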
Among the 678 00:39:12,360 --> 00:39:15,680 Speaker 1: messaging apps that use end to end encryption, there is 679 00:39:15,800 --> 00:39:19,880 Speaker 1: the big daddy, WhatsApp. That's an incredibly popular messaging app. 680 00:39:19,920 --> 00:39:23,440 Speaker 1: It is owned by Facebook. Now, WhatsApp didn't start off 681 00:39:23,480 --> 00:39:26,879 Speaker 1: as a Facebook project. It was purchased by Facebook. The 682 00:39:26,920 --> 00:39:31,799 Speaker 1: original messaging app was co founded by Jan Koum, who 683 00:39:31,800 --> 00:39:34,520 Speaker 1: grew up in Ukraine and wanted to create a 684 00:39:34,520 --> 00:39:38,080 Speaker 1: way in which people could send messages safely without the 685 00:39:38,120 --> 00:39:41,920 Speaker 1: fear of an oppressive presence, like a totalitarian government agency, 686 00:39:41,920 --> 00:39:44,840 Speaker 1: for example, that could read all of them. Now, he 687 00:39:44,880 --> 00:39:47,919 Speaker 1: had moved to America when he was sixteen, and years 688 00:39:48,000 --> 00:39:50,560 Speaker 1: later he met with a guy named Brian Acton, and 689 00:39:50,640 --> 00:39:54,360 Speaker 1: Acton became the other co founder of WhatsApp. They created 690 00:39:54,400 --> 00:39:57,200 Speaker 1: the messaging service, and they used end to end encryption 691 00:39:57,280 --> 00:40:02,640 Speaker 1: to protect users' privacy. When Facebook purchased WhatsApp, 692 00:40:03,320 --> 00:40:07,279 Speaker 1: many security experts expressed concern because they were worried that 693 00:40:07,320 --> 00:40:11,360 Speaker 1: Facebook might chip away at the messaging app's security and 694 00:40:11,400 --> 00:40:14,480 Speaker 1: allow Facebook the chance to mine these messages and then 695 00:40:14,520 --> 00:40:18,920 Speaker 1: sell that data to advertisers, and there have been accusations 696 00:40:19,040 --> 00:40:23,200 Speaker 1: levied at Facebook claiming as much. 
Tobias Boelter, a 697 00:40:23,320 --> 00:40:26,920 Speaker 1: security researcher, told reporters at The Guardian that Facebook had 698 00:40:26,960 --> 00:40:30,520 Speaker 1: the ability to generate new encryption keys for offline users 699 00:40:30,600 --> 00:40:34,000 Speaker 1: and thus access messages that would be sent to them. 700 00:40:34,040 --> 00:40:37,080 Speaker 1: Whether that's true or not, I don't know. Facebook has 701 00:40:37,120 --> 00:40:39,760 Speaker 1: denied that they have the ability to read any messages 702 00:40:39,760 --> 00:40:43,120 Speaker 1: from their users, but there's still a lot of concern 703 00:40:43,160 --> 00:40:47,000 Speaker 1: about WhatsApp in general. Another app that uses end to 704 00:40:47,120 --> 00:40:50,800 Speaker 1: end encryption is Signal. Signal started out as TextSecure 705 00:40:50,880 --> 00:40:53,399 Speaker 1: Private Messenger, and you can also use Signal to make 706 00:40:53,520 --> 00:40:57,280 Speaker 1: encrypted voice and video calls to other users. These follow 707 00:40:57,360 --> 00:41:01,680 Speaker 1: a similar approach to messages and emails. The 708 00:41:01,719 --> 00:41:05,840 Speaker 1: protocols are obviously a little different because they're different media 709 00:41:06,040 --> 00:41:11,360 Speaker 1: or different forms of messaging, but the general principle remains 710 00:41:11,400 --> 00:41:14,560 Speaker 1: the same: the idea that all the information being sent 711 00:41:14,719 --> 00:41:17,840 Speaker 1: across these channels is encrypted all the way until it 712 00:41:17,840 --> 00:41:22,560 Speaker 1: gets to the destination device. Wire is another popular app, 713 00:41:22,680 --> 00:41:26,720 Speaker 1: particularly in countries belonging to the European Union. It's also 714 00:41:26,880 --> 00:41:30,080 Speaker 1: free and open source. The app does store a record 715 00:41:30,120 --> 00:41:33,200 Speaker 1: of the people you've communicated with. 
That's a little problematic 716 00:41:33,239 --> 00:41:37,120 Speaker 1: from a security standpoint. Now, the purpose of storing this 717 00:41:37,239 --> 00:41:40,960 Speaker 1: information is to make it easier to synchronize an experience 718 00:41:41,000 --> 00:41:43,920 Speaker 1: across devices, because one of the big drawbacks of the 719 00:41:43,960 --> 00:41:47,680 Speaker 1: public private key approach is that ideally you have that 720 00:41:47,719 --> 00:41:51,920 Speaker 1: private key on one and only one device. If you 721 00:41:52,000 --> 00:41:54,840 Speaker 1: share a private key across multiple devices, then you have 722 00:41:55,000 --> 00:41:58,720 Speaker 1: increased the possibility of someone getting access to your private key. 723 00:41:58,760 --> 00:42:01,000 Speaker 1: But if you lose the device that the private 724 00:42:01,080 --> 00:42:04,120 Speaker 1: key is on, you lose all your messaging history as well, 725 00:42:04,160 --> 00:42:07,760 Speaker 1: because remember, no one else can see what that encrypted 726 00:42:07,800 --> 00:42:11,000 Speaker 1: stuff is, because it was all encrypted with your public key and 727 00:42:11,080 --> 00:42:13,440 Speaker 1: could only be decoded by your private key. So if 728 00:42:13,480 --> 00:42:16,560 Speaker 1: you lose your private key, all you have is gobbledygook. 729 00:42:16,880 --> 00:42:20,480 Speaker 1: You can't read it. That is, of course, if 730 00:42:20,520 --> 00:42:24,480 Speaker 1: you haven't been storing messages in some other fashion, 731 00:42:24,760 --> 00:42:28,759 Speaker 1: like in plain text. With some services, you could, as a 732 00:42:28,880 --> 00:42:33,319 Speaker 1: user, opt to store your messages using a third party application. 
733 00:42:33,760 --> 00:42:37,440 Speaker 1: This would be like: you receive the message, you decrypt 734 00:42:37,440 --> 00:42:39,720 Speaker 1: the message when you receive it, you have the message 735 00:42:39,760 --> 00:42:41,799 Speaker 1: in plain text because you have to in order to 736 00:42:41,800 --> 00:42:44,239 Speaker 1: read it or view it or whatever it may be. 737 00:42:45,040 --> 00:42:47,239 Speaker 1: And then you decide you want to save that message. Well, 738 00:42:47,239 --> 00:42:50,120 Speaker 1: when you're saving that message, you may be saving that 739 00:42:50,280 --> 00:42:53,319 Speaker 1: in plain text using a third party app, in which 740 00:42:53,320 --> 00:42:57,359 Speaker 1: case all that secrecy for the communication is undone by 741 00:42:57,360 --> 00:43:01,200 Speaker 1: your storage. It becomes a security vulnerability. So you could 742 00:43:01,200 --> 00:43:03,799 Speaker 1: be super careful about messages as they speed to and 743 00:43:03,840 --> 00:43:06,040 Speaker 1: from your device, but if you're then storing everything on 744 00:43:06,080 --> 00:43:09,360 Speaker 1: a cloud server somewhere, you could still technically be allowing 745 00:43:09,400 --> 00:43:12,320 Speaker 1: some other company access to that information after the fact. 746 00:43:12,840 --> 00:43:15,520 Speaker 1: But if you don't do that, if you lose your phone, 747 00:43:15,560 --> 00:43:18,440 Speaker 1: then you could lose all those messages, all the histories, 748 00:43:18,520 --> 00:43:22,520 Speaker 1: all the communications. So it's a tough situation. 
Also, by 749 00:43:22,520 --> 00:43:27,319 Speaker 1: the way, this means that hackers will sometimes concentrate 750 00:43:27,440 --> 00:43:29,800 Speaker 1: not on trying to intercept a message in the middle, 751 00:43:30,160 --> 00:43:33,040 Speaker 1: because if it's encrypted, it could be more trouble than 752 00:43:33,040 --> 00:43:36,480 Speaker 1: what it's worth to try and decrypt it. Instead 753 00:43:36,480 --> 00:43:40,080 Speaker 1: of doing that, they'll try and target the end device. 754 00:43:40,239 --> 00:43:42,280 Speaker 1: So in other words, they're actually trying to get physical 755 00:43:42,360 --> 00:43:48,080 Speaker 1: access to your computer or phone, or somehow be able 756 00:43:48,120 --> 00:43:52,400 Speaker 1: to look at your screen remotely once the decryption process 757 00:43:52,440 --> 00:43:55,040 Speaker 1: has happened. So they want to compromise your device at 758 00:43:55,080 --> 00:43:59,840 Speaker 1: that point, because it's relatively easier than trying to 759 00:44:00,000 --> 00:44:03,000 Speaker 1: decrypt one of these heavily encrypted messages on the Internet. Anyway, 760 00:44:03,360 --> 00:44:07,719 Speaker 1: Wire has said that once a user 761 00:44:07,760 --> 00:44:11,759 Speaker 1: deletes his or her account, then the service deletes all 762 00:44:11,880 --> 00:44:16,200 Speaker 1: the information associated with that account. All the stored connections, 763 00:44:16,280 --> 00:44:19,440 Speaker 1: everything is gone once you delete your account with Wire, 764 00:44:19,600 --> 00:44:23,080 Speaker 1: according to the service. There's also a multi platform app 765 00:44:23,160 --> 00:44:27,440 Speaker 1: that's popular that's called Telegram. With Telegram, you can search for 766 00:44:27,520 --> 00:44:30,319 Speaker 1: people by user name rather than by phone number. 
A 767 00:44:30,360 --> 00:44:32,319 Speaker 1: lot of apps require that you know the phone number 768 00:44:32,320 --> 00:44:35,399 Speaker 1: of the person you're trying to contact, but with Telegram, when 769 00:44:35,440 --> 00:44:41,600 Speaker 1: you create a profile, in addition to creating a profile 770 00:44:41,640 --> 00:44:44,359 Speaker 1: based on a telephone number, you can create a user name, 771 00:44:44,400 --> 00:44:45,960 Speaker 1: and so people can search for you on that. It 772 00:44:45,960 --> 00:44:49,120 Speaker 1: makes it a little easier to communicate with people. There's 773 00:44:49,160 --> 00:44:52,600 Speaker 1: also another app called Wickr, w i c k r, 774 00:44:52,800 --> 00:44:57,080 Speaker 1: which is really popular in enterprises, so in business, not 775 00:44:57,160 --> 00:45:01,640 Speaker 1: necessarily as much with individual users. As for email, 776 00:45:01,680 --> 00:45:07,600 Speaker 1: there are services like TorGuard, Hushmail, Mailfence, Tutanota, 777 00:45:07,719 --> 00:45:10,960 Speaker 1: Runbox, and ProtonMail. All of these use 778 00:45:11,040 --> 00:45:13,680 Speaker 1: various encryption strategies to keep messages safe. 
Not all of 779 00:45:13,680 --> 00:45:17,200 Speaker 1: them necessarily use end to end encryption; some of them 780 00:45:17,280 --> 00:45:19,719 Speaker 1: use other methods for encryption than end to end, but 781 00:45:19,760 --> 00:45:22,560 Speaker 1: they all do some form of encryption on their messages, 782 00:45:23,280 --> 00:45:26,839 Speaker 1: whereas popular email services like Gmail do not support end 783 00:45:26,840 --> 00:45:30,840 Speaker 1: to end encryption. And one reason they might not, I 784 00:45:30,880 --> 00:45:33,239 Speaker 1: can't say this for sure, but one reason 785 00:45:33,320 --> 00:45:37,200 Speaker 1: they may not do that is that it's possible that 786 00:45:37,440 --> 00:45:41,160 Speaker 1: they want to mine all that information for the possibility 787 00:45:41,160 --> 00:45:43,879 Speaker 1: of advertising against it. So in other words, you send 788 00:45:43,880 --> 00:45:47,400 Speaker 1: a message. Alice sends a message to Bob. Alice talks about 789 00:45:48,480 --> 00:45:51,200 Speaker 1: all these nice flowers that she saw on a recent trip, 790 00:45:51,680 --> 00:45:55,560 Speaker 1: and then Alice starts seeing ads mysteriously whenever she's using 791 00:45:55,560 --> 00:46:00,440 Speaker 1: a Google service that is talking about flowers. Well, that 792 00:46:00,560 --> 00:46:04,319 Speaker 1: raises some very tricky ethical questions. Now, whether or not 793 00:46:04,400 --> 00:46:09,160 Speaker 1: that's actually going on, that's a matter of 794 00:46:09,640 --> 00:46:12,960 Speaker 1: investigation on a case by case basis. You know, not 795 00:46:13,080 --> 00:46:16,880 Speaker 1: all emails are necessarily being mined for information, but they 796 00:46:16,920 --> 00:46:19,680 Speaker 1: all could be if they're not being encrypted end to end. 
797 00:46:20,320 --> 00:46:24,600 Speaker 1: So there are a lot of advocates for privacy out 798 00:46:24,600 --> 00:46:28,279 Speaker 1: there, and some of them will say, hey, don't even 799 00:46:28,320 --> 00:46:31,399 Speaker 1: bother sending me any email if it's not encrypted. I'm 800 00:46:31,440 --> 00:46:34,560 Speaker 1: not interested. If you're not using an encryption service 801 00:46:34,600 --> 00:46:39,200 Speaker 1: of some sort, then I do not want to communicate 802 00:46:39,239 --> 00:46:42,560 Speaker 1: with you, because I prefer to keep my privacy private. 803 00:46:43,000 --> 00:46:45,560 Speaker 1: And that's a legitimate argument, I think. I don't think 804 00:46:45,560 --> 00:46:48,840 Speaker 1: it's necessarily an indication that someone is up to something, 805 00:46:49,560 --> 00:46:53,760 Speaker 1: you know, suspicious. But it's important to remember that without 806 00:46:53,800 --> 00:46:56,400 Speaker 1: that end to end encryption, it is a possibility that 807 00:46:56,520 --> 00:46:59,080 Speaker 1: someone along the way is reading your messages besides the 808 00:46:59,080 --> 00:47:01,839 Speaker 1: person you're sending them to. I would like to thank 809 00:47:01,880 --> 00:47:07,520 Speaker 1: our anonymous listener for asking us to cover this topic. 810 00:47:07,560 --> 00:47:11,080 Speaker 1: It is very fascinating. 
In the UK in particular, this 811 00:47:11,160 --> 00:47:14,640 Speaker 1: is an ongoing issue, where the UK government would love 812 00:47:14,680 --> 00:47:18,520 Speaker 1: to see end to end encryption go away because they 813 00:47:18,719 --> 00:47:23,000 Speaker 1: worry that it's a method that terrorists 814 00:47:23,040 --> 00:47:25,719 Speaker 1: are using to communicate with one another, and that this 815 00:47:25,840 --> 00:47:29,360 Speaker 1: creates a problem because you can't really investigate those terrorist 816 00:47:29,360 --> 00:47:32,719 Speaker 1: cells, at least not through the means of their communication, 817 00:47:32,800 --> 00:47:36,480 Speaker 1: because you can't decrypt it. There have been people who 818 00:47:36,480 --> 00:47:40,160 Speaker 1: have called for a back door of some sort to 819 00:47:40,800 --> 00:47:46,000 Speaker 1: end to end encryption services, which defeats the purpose. A backdoor 820 00:47:46,120 --> 00:47:50,319 Speaker 1: is essentially a vulnerability you introduce on purpose so that 821 00:47:50,600 --> 00:47:55,800 Speaker 1: a party apart from the intended recipient can decode a message, 822 00:47:56,600 --> 00:48:01,680 Speaker 1: and that raises enormous security problems. You don't 823 00:48:01,680 --> 00:48:06,000 Speaker 1: create a backdoor, because that means you're introducing the possibility 824 00:48:06,120 --> 00:48:10,640 Speaker 1: of a hacker getting access to this information. You don't 825 00:48:10,680 --> 00:48:15,040 Speaker 1: want to do that. That's bad security. So it's 826 00:48:15,400 --> 00:48:19,200 Speaker 1: not a great strategy moving forward, trying 827 00:48:19,200 --> 00:48:22,200 Speaker 1: to force services to create a backdoor. It pretty much 828 00:48:22,520 --> 00:48:26,920 Speaker 1: breaks the whole service in the first place. So I 829 00:48:26,960 --> 00:48:29,560 Speaker 1: don't know what the solution here is. 
I don't think 830 00:48:29,600 --> 00:48:33,320 Speaker 1: that getting rid of a technology because some people 831 00:48:33,560 --> 00:48:38,719 Speaker 1: use it improperly is necessarily going to work. It 832 00:48:38,840 --> 00:48:42,480 Speaker 1: raises a lot 833 00:48:42,480 --> 00:48:44,880 Speaker 1: of questions, and it also means that you have to 834 00:48:44,920 --> 00:48:50,359 Speaker 1: put an awful lot of trust in the governing agency, 835 00:48:50,600 --> 00:48:54,120 Speaker 1: and that can be difficult to do. And also, any 836 00:48:54,360 --> 00:48:57,040 Speaker 1: time you have a back door, anytime you have any 837 00:48:57,239 --> 00:49:01,160 Speaker 1: sort of vulnerability, if people know it exists, then you've 838 00:49:01,200 --> 00:49:04,560 Speaker 1: just given hackers a target to shoot for. And even 839 00:49:04,760 --> 00:49:07,680 Speaker 1: if it works as it's supposed to, where everyone is 840 00:49:07,719 --> 00:49:11,400 Speaker 1: behaving themselves from the good guys' standpoint, let's say that 841 00:49:11,440 --> 00:49:13,399 Speaker 1: the government is a good guy in this 842 00:49:13,600 --> 00:49:17,840 Speaker 1: scenario: both Alice and Bob are behaving themselves, the 843 00:49:17,880 --> 00:49:21,239 Speaker 1: government is behaving itself. It's not investigating Alice and Bob. It 844 00:49:21,280 --> 00:49:23,759 Speaker 1: has no reason to suspect them. So they're able to 845 00:49:23,800 --> 00:49:28,520 Speaker 1: send their messages securely. But if there is this back door 846 00:49:28,640 --> 00:49:33,239 Speaker 1: option available, then hackers could target that, and 847 00:49:33,280 --> 00:49:35,880 Speaker 1: then they compromise the system, and then Alice and Bob's 848 00:49:35,920 --> 00:49:39,840 Speaker 1: messages are laid bare for the hackers, because you've just 849 00:49:39,880 --> 00:49:43,280 Speaker 1: painted a huge target on the service. 
It's not great, 850 00:49:43,480 --> 00:49:48,880 Speaker 1: and not every end to end encryption service is infallible. 851 00:49:48,920 --> 00:49:52,400 Speaker 1: There have been security experts who have accused Apple in 852 00:49:52,440 --> 00:49:57,200 Speaker 1: particular of implementing a poor end to end encryption strategy 853 00:49:57,320 --> 00:50:00,879 Speaker 1: for their iMessage app and have stated that there 854 00:50:00,880 --> 00:50:06,200 Speaker 1: are ways that it could be defeated using classical computer systems. 855 00:50:07,320 --> 00:50:10,000 Speaker 1: So it is possible to do this in a way 856 00:50:10,000 --> 00:50:14,799 Speaker 1: that isn't effective, but the actual idea itself is incredibly 857 00:50:15,000 --> 00:50:19,280 Speaker 1: effective if you implement it properly and you're using classical 858 00:50:19,320 --> 00:50:24,839 Speaker 1: computers as your potential threat. Once quantum computers actually become 859 00:50:24,840 --> 00:50:27,319 Speaker 1: a part of the scene for real, when 860 00:50:27,360 --> 00:50:31,000 Speaker 1: they're sufficiently powerful, it'll be a totally different story, because 861 00:50:32,200 --> 00:50:38,160 Speaker 1: a sufficiently powerful quantum computer will completely lay bare all 862 00:50:38,280 --> 00:50:40,480 Speaker 1: these encryption schemes, so you have to come up with 863 00:50:40,520 --> 00:50:42,440 Speaker 1: something new. But I talked a little bit about that 864 00:50:42,920 --> 00:50:45,160 Speaker 1: in one of the IBM Think episodes, so you can 865 00:50:45,160 --> 00:50:46,440 Speaker 1: go back and listen to that if you want to 866 00:50:46,520 --> 00:50:49,319 Speaker 1: learn more. As for you guys, if you have any 867 00:50:49,320 --> 00:50:52,440 Speaker 1: suggestions for future episodes of Tech Stuff, I welcome 868 00:50:52,480 --> 00:50:56,120 Speaker 1: you to send those suggestions to me. 
You can email 869 00:50:56,160 --> 00:50:59,120 Speaker 1: the show. The address is tech stuff at how stuff works 870 00:50:59,160 --> 00:51:01,719 Speaker 1: dot com. Or you can drop me a line on 871 00:51:01,760 --> 00:51:06,120 Speaker 1: Twitter or Facebook. The handle at both of those is 872 00:51:06,200 --> 00:51:09,840 Speaker 1: tech stuff h s w. Remember to follow us on Instagram, 873 00:51:09,920 --> 00:51:14,000 Speaker 1: and remember, typically I record these episodes live on twitch 874 00:51:14,080 --> 00:51:17,680 Speaker 1: dot tv slash tech stuff on Wednesdays and Fridays. This 875 00:51:17,719 --> 00:51:20,480 Speaker 1: particular episode I'm recording right now was an exception, because 876 00:51:20,560 --> 00:51:23,919 Speaker 1: again, I'm in a hotel, using hotel WiFi, which 877 00:51:23,960 --> 00:51:26,719 Speaker 1: is not the greatest thing to use for streaming. But 878 00:51:27,239 --> 00:51:30,600 Speaker 1: usually I stream live from the studio on Wednesdays and Fridays. 879 00:51:30,640 --> 00:51:32,839 Speaker 1: Go to Twitch dot tv slash tech stuff. You'll see 880 00:51:32,840 --> 00:51:34,360 Speaker 1: the schedule there. I hope to see you in the 881 00:51:34,440 --> 00:51:37,560 Speaker 1: chat room, and you can chat with me as I'm recording. 882 00:51:37,880 --> 00:51:39,520 Speaker 1: You can have a grand old time, and I'll talk 883 00:51:39,560 --> 00:51:48,440 Speaker 1: to you again really soon. For more on this and 884 00:51:48,480 --> 00:52:01,040 Speaker 1: thousands of other topics, visit how stuff works dot com.