Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are ya? It is a Friday, which means it's time for a classic episode of TechStuff. This one is titled Apple versus the FBI, and it originally published on March second, twenty sixteen. This is sort of an ongoing struggle between companies like Apple and various agencies because, as we will hear in this episode, the FBI wanted ways to be able to access locked iPhones that had been recovered from, say, a suspect, and Apple is trying to do its best to maintain consumer confidence by protecting their data and their devices. We've heard similar things, which I'll talk about more at the end of this episode, that have involved other law enforcement and investigative agencies here in the United States. But first, let's listen to this classic episode from twenty sixteen. So we're going to talk about something we're going to have to work very hard at for multiple reasons, because we're talking about Apple versus the FBI.
Speaker 1: The whole story is unfolding as we record this, and we want to talk about what is behind that case, what the implications are, what the FBI's argument is, and what Apple's argument is. And in addition to that, we have, of course, the added responsibility of remembering that this is all centered around a truly awful crime. Yes, absolutely. So what we're talking about specifically hit mainstream news when Apple did something that a lot of tech companies have never done: they issued a public letter. Yeah, they went public with a response to an FBI request. But I guess I'm getting ahead of it. What was the crime? Well, yeah, let me give you the background on what happened and we'll build from there. So first we have to look back on December second, twenty fifteen, in San Bernardino, California. That's in southern California, and that's when two people, an American citizen and a legal resident, a husband and wife team, both of Pakistani descent, committed a mass shooting.
Speaker 1: It was Syed Rizwan Farook and Tashfeen Malik who committed this act, and it happened at the Department of Public Health. They were having an event that started off as a training event and then was supposed to transition into a holiday event. And after the training element, Farook, who was at the event because he was an employee of the Department of Public Health, he was a food inspector, left and came back with his wife. They were both heavily armed, and they both started firing into the crowd of people at the Department of Public Health. Fourteen people were killed and twenty-two injured. A very serious crime. Then Farook and Malik fled the scene. It turned out they had also left behind an explosive device, which thankfully failed to detonate. They fled in a vehicle they had rented a few days before, and several hours later law enforcement tracked them down. There was a confrontation, there was a shootout, and both of the shooters died as a result of that shootout.
Speaker 1: The FBI has stated that they believe, based upon the evidence they were able to find, that the two were acting on their own, that they were not part of some sort of terrorist cell in the United States. However, they can't be absolutely certain of that. That's where the crux of this issue with Apple is going to come into play, and the big part of it is that they can't account for just under twenty minutes of activity leading up to and around the shooting. They're thinking there's a possibility that something that happened within that time could give them more information, and at least allow them to confirm whether the two truly acted on their own or were under the direction of some other group, which could potentially, and this is the FBI's argument, give the government the ability to prevent a future attack.
Speaker 1: And more to that point, neither of them had been listed in any database as a potential threat. So that puts extra pressure on the government, right, because citizens say, well, why did this happen? The government says, we followed procedure and neither of them registered on any of our lists. There were no red flags, so we had no way of knowing. And that means there's actual extra pressure on the FBI to investigate this thoroughly, partially to show that, in fact, everything that could be done had been done, short of taking some extreme step that none of us want to see, right? The idea of, like, okay, everyone of Pakistani descent has to leave the country. That's ridiculous. That's returning to an era of United States history that we do not want to revisit, a World War Two internment camp situation. Yeah, we don't want that, because it is never fair to lump innocent people in because one of them might be guilty. That's not cool. Right. I'm already getting angry and I haven't even gotten to the part about the Apple stuff yet.
Speaker 1: I'm going for a slow burn thing myself. That's fair. So the thing that all of this is about, ostensibly at any rate, is an iPhone. It's specifically an iPhone five C, probably running iOS nine. That's based upon the various public filings we've seen about this. And it was county owned. It was owned by the Department of Public Health and then issued to Farook, who worked for the Department of Public Health, so it was county owned. The FBI went to the county and said, we want your permission to access the contents on this phone. The county said, of course you have our permission, you can access it. Now, if that were all there were to it, it'd be fine. Sure. They would just access the phone. They've already received permission from the phone's owner, and they could cull through it to look for any evidence that would lead them to more information about this crime. But there is a security measure on the phone, and it's a very simple one. It's on lots of smartphones. It's a little password.
Speaker 1: In this case, it's a four or six digit PIN, which can be alphanumeric if you activate that option on the six digit version. So it's one of the two, we don't know which, and without that PIN you cannot access the information, because of the way the PIN works. This is where we get into the tech stuff bit, right. There are two levels of encryption, or two keys to this encryption. One of the keys is the PIN. The other key is hard coded onto the device itself. It's like those nineteen eighties movies with the nuclear launch scenes, where you have to have two generals put their keys in at the same time, that sort of thing. So if you don't have the PIN, it can't combine with the hardware key, and therefore you cannot unlock the information. The idea here is that Apple doesn't have the PIN, right, which is cool, because it means that if you or I or you listeners have an iPhone, it means you can trust Apple. Because they don't know your PIN, they can't access your phone. You know, that was the whole point, right. Two points here.
Speaker 1: One, Apple wanted to make it secure enough so that consumers would say, I feel good about buying an iPhone, because I know that whatever I store on it, whether it's something that's personal or just, you know, no one else's business, whatever it may be, no one else is going to have access to it unless I give them my PIN. Two, it's good for Apple, not just because consumers are happy, but because Apple then has an out. If the government comes to Apple and says, hey, we want you to break into this phone, they can literally say, we cannot do that. It's not that we will not, it's that we literally, physically cannot grant your request, because it's impossible. And, just to expound on that point a little further, I'm not suggesting this is their conscious will or intention, but this also removes possible culpability. Yes, yes. So all of these are very important points, right. So let's get into a little bit more of why the FBI is coming to Apple. Let's say that it's a four digit PIN. Okay, all right. All numeric, all numeric, the simplest version.
Speaker 1: That means there are ten thousand combinations that are possible for that PIN. That's it. You're limited to ten thousand. Okay, which is a lot, but it's not that much if you're going to do something like a brute force attack. Right. So if you do a brute force attack on a normal system and there are only ten thousand variations, with a fast enough computer it's just a matter of minutes, right? But there are some limitations on the iPhone that make this harder. Does it freeze if you enter the incorrect PIN? Oh, it does more than freeze. All right. So first of all, there is an eighty millisecond delay between when you enter a PIN and when it gets verified as being, or not being, the correct one, when the key turns, right. That eighty millisecond delay doesn't sound like much, but it means you can't just blast a whole bunch of numbers through. Also, you have to type it in on the screen. You can't hook it up to a computer and just digitally send the various PIN combinations to try and get through. You have to actually tap them in on the screen, over and over.
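A quick back-of-the-envelope sketch of the numbers just mentioned. The ten thousand combinations and the eighty millisecond delay come from the episode; everything else is simple arithmetic:

```python
# Arithmetic from the discussion above: a four-digit, all-numeric PIN
# has 10**4 possible values, and each guess costs at least an 80 ms
# verification delay.
combinations = 10 ** 4        # 0000 through 9999
delay_s = 0.080               # 80 millisecond delay per attempt

total_minutes = combinations * delay_s / 60
print(f"{combinations} PINs at 80 ms each: about {total_minutes:.1f} minutes")
# → 10000 PINs at 80 ms each: about 13.3 minutes
```

Roughly thirteen minutes, in other words, if entry could be automated; on the actual phone, as the hosts note, each guess must be tapped in by hand.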
Speaker 1: So that means you tap in the number, eighty milliseconds pass, you get the confirmation or denial, and then you tap in the next number. If you hit ten consecutive incorrect PIN entries, the phone interprets that as saying someone who is not the owner has gotten possession of this phone, so in order to protect the owner's information, it scrambles everything, and you lose all the data. Everything is useless. So even if you got the PIN afterward, you would not be able to access the stuff you were looking for in the first place. You would maybe be able to turn on the phone, but there wouldn't be anything there to find. Yeah, it would be like you accidentally set fire to the file room. Like, you can't get in the file room door, and whatever you try to do accidentally sets a fire inside the file room. You then get the key to the file room, and then you think, well, what's the point? You've already lost everything. And it's virtually assured.
Speaker 1: Yeah, that out of those ten thousand tries, your first ten are going to be wrong. Yeah, I mean, the odds are pretty much against you unless you're ridiculously lucky. So there's that. Also, if it were a later iPhone, if it weren't the iPhone five C, they would have an added problem, which is that with the later models there's an additional delay after four failed tries. So if you try four times and fail, it will then give you a five second delay before you can try the fifth time. After that, it gives you a fifteen second delay, and so on, like by the time you get to the ninth try, it's an hour delay. Like, think about what you're doing, right? But the five C doesn't have that, so you may have read about that delay; it does not apply in this particular case. Still, brute force won't work in the most basic PIN situation.
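The escalating delays described for later models can be sketched as a simple schedule. The specific values below just follow the figures given in the episode (five seconds, fifteen seconds, up to an hour by the ninth try); the in-between steps are placeholders, and none of this claims to be Apple's exact schedule:

```python
# Illustrative escalating-delay schedule for later iPhone models (not
# the 5c), using the figures mentioned above; the real schedule may
# differ, and the 60-second middle steps are placeholders.
def delay_before_next_attempt(failed_tries: int) -> int:
    """Seconds to wait before the next PIN attempt, given how many
    consecutive attempts have already failed."""
    if failed_tries < 4:
        return 0       # first few attempts: no extra delay
    if failed_tries == 4:
        return 5       # before the fifth attempt: 5 seconds
    if failed_tries == 5:
        return 15      # before the sixth attempt: 15 seconds
    if failed_tries < 8:
        return 60      # keeps climbing (placeholder values)
    return 3600        # by the ninth attempt: a full hour

print([delay_before_next_attempt(n) for n in range(9)])
# → [0, 0, 0, 0, 5, 15, 60, 60, 3600]
```

Each wrong guess makes the next one drastically more expensive, which kills high-volume guessing even before the ten-try wipe comes into play.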
Speaker 1: But in the six digit alphanumeric case, I imagine the same rules apply. Yeah, yeah, exactly, but the number of possible answers is much greater, because now you've got six digits, and you've also got the possibility of alpha characters in there. Alphabet characters? I believe so. So, anyway, from what I hear, if you were to try and use brute force on a six digit alphanumeric PIN, it would take you, with a fast computer that had been optimized for this, about six years to break through it. And that's going through all the different possible combinations, assuming that there's not another kill switch type deal like there is with the iPhone. By the way, that's software that Apple has built into the iPhone, or really firmware that Apple has built into the iPhone. It's not like it's a fundamental quality of all smartphones. Oh, right. Yeah, that's specific to Apple phones. Right. So you've got this problem, right? You have the FBI.
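The "about six years" figure checks out roughly if you assume a thirty-six character alphabet (letters a through z plus digits, ignoring case) and the same eighty millisecond delay per guess. This is our arithmetic under those assumptions, not a figure from Apple or the FBI:

```python
# Rough check of the "about six years" claim for a six-character
# alphanumeric passcode: 36 possible characters per position (a-z plus
# 0-9, case-insensitive for simplicity), 80 ms per guess.
combinations = 36 ** 6                    # 2,176,782,336 possibilities
seconds = combinations * 0.080            # 80 ms verification delay each
years = seconds / (365 * 24 * 3600)
print(f"{combinations:,} codes: about {years:.1f} years, worst case")
# → 2,176,782,336 codes: about 5.5 years, worst case
```

About five and a half years of worst-case guessing, which is in the same ballpark as the six-year figure quoted; a case-sensitive alphabet would push it far higher.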
Speaker 1: They've gotten permission from the owner of the phone to access the phone's contents, but the owner of the phone doesn't know the PIN, because the owner of the phone had issued it to an employee, and the employee had come up with the PIN. So they don't know if it's a four digit, they don't know if it's a six digit. They also don't know whether this feature where everything gets erased after ten tries is necessarily active, because you can turn it off, but all indications point to it being on. For one, it was on when the phone was issued to Farook, right, and people typically don't change their default settings. So the FBI wants Apple to do something particular, and it's something different from what we've talked about in previous episodes about a backdoor entrance into a system. It's not quite the same thing. So you've got this phone. You can't brute force attack it without risking damaging the contents. Apple cannot access the information on this phone by design; they did not want to have that capability.
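The two-key arrangement described earlier, a user PIN that has to combine with a key hard coded on the device, can be sketched like this. The function name, the key values, and the use of PBKDF2 are our illustration only; Apple's actual design entangles the passcode with a device-unique hardware identifier inside the chip itself:

```python
import hashlib

# Hypothetical sketch of the two-key idea: the encryption key is derived
# from BOTH the user's PIN and a secret baked into the device, so
# neither one alone is enough to unlock anything. This only shows the
# shape of the idea, not Apple's real scheme.
def derive_unlock_key(pin: str, device_key: bytes) -> bytes:
    # A deliberately slow key-derivation function mixes the two inputs;
    # per-guess costs like the 80 ms delay come from work of this kind.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), device_key, 100_000)

device_key = b"hypothetical-device-unique-secret"
# A wrong PIN, or the right PIN on the wrong device, yields a wrong key:
assert derive_unlock_key("1234", device_key) != derive_unlock_key("1235", device_key)
assert derive_unlock_key("1234", device_key) != derive_unlock_key("1234", b"other-device")
```

This is why Apple not knowing your PIN matters: even with full access to its own software, there is no stored key to hand over.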
Speaker 1: So what the FBI wants Apple to do is build a new version of iOS just for this phone, a specific iOS for this one phone, that disables the safety features. That would, one, remove the delays that prevent brute force attacks from happening quickly; two, allow brute force attacks to happen by hooking up the phone to a computer, so you don't have to tap those numbers in. Sure. And three, disable that kill switch. So what the FBI wants Apple to do is build a brand new iOS, again just for this one phone, and then install it on the phone. This would all, you know, hypothetically happen on Apple grounds, like at an Apple location, either at the corporate headquarters or wherever. And then, in theory, you would destroy the custom iOS, because you only needed it for that one phone. Right, because it was so easy to toss the One Ring, bring it back to Mordor and destroy it. Right. Yeah, yeah. You know what, Mount Doom had plenty of backdoors. It's fine, it's integral to the plot. That was Sauron's problem: he did not plug the security vulnerabilities in Mount Doom.
Speaker 1: I mean, that's a classic example, like, you know, right up there with the whole land war in Asia thing. So the FBI has been very, very careful in framing this in a way that presents it as a reasonable request, a one time thing. That's a big deal, right, saying this is for one phone and one phone only. We're suggesting that Apple create an iOS that's directly tied to a unique identifier on that phone, meaning even if the iOS were somehow to leak, sure, it would not be applicable to any other device on the market, so it could only work for this one phone. So that sounds good. "In its current form." That's important. They didn't go so far as to say that; that's the important part that's left out. But yeah, so that was reasonable, you could argue, saying this is for one case only. It's an important case.
Speaker 1: People died as a result. Extraordinary. And we need to have a clear timeline, because we don't know if there's something planned for December second, twenty sixteen, right, or if there was another person involved that we need to get hold of, because otherwise this could happen again. So they've said, you know, one time only use, we're going to destroy it after that use. And hey, Apple, if you don't want to make this iOS, that's fine, you don't have to do it. We'll do it. We'll hire some people to reverse engineer it and build out an iOS ourselves. Here's the thing, though: those iPhones will only verify firmware if there's a special Apple digital signature attached to the firmware. So, in other words, if the digital signature, which is unique to Apple, is not in there, it won't be verified by the device, and it won't be loaded on, because Apple famously wants to make sure that their hardware and software work together, and that's it. No one else gets to play in that sandbox.
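The verify-before-install behavior just described can be sketched as follows. Real iPhones check an asymmetric signature against an Apple public key; Python's standard library has no public-key crypto, so an HMAC stands in here purely to show the flow, and every name and key below is hypothetical:

```python
import hashlib
import hmac

# Stand-in for Apple's firmware signing. In reality this is asymmetric
# (only Apple holds the signing key, and devices verify with a matching
# public key); the HMAC here just illustrates "refuse anything that
# isn't properly signed."
SIGNING_KEY = b"hypothetical-key-only-apple-holds"

def sign_firmware(image: bytes) -> bytes:
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def device_accepts(image: bytes, signature: bytes) -> bool:
    # The device recomputes the expected signature and installs the
    # image only if it matches.
    expected = hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"apple-built-ios-image"
assert device_accepts(official, sign_firmware(official))
# Firmware built by anyone else carries no valid signature, so a
# signature lifted from a different image fails verification:
assert not device_accepts(b"third-party-image", sign_firmware(official))
```

This is why the FBI couldn't simply hire engineers to build the custom iOS themselves: without Apple's signature, the phone refuses to load it.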
Speaker 1: Yeah, it's the same with their Mac computers, forever, right? Like, you were really meant to run Apple software on Apple hardware, and never the twain shall part, right? So, same thing with this iPhone. So the FBI says, we're not even asking you to give us the digital signature, which would be disastrous if Apple did that. The whole point of the signature is to make sure that only Apple can do this stuff. What they're saying is, we'll make this iOS, and we'll give it to you. Yes, and you will sign off and send it. Right, you give it a little stamp of approval with your little digital signature, and then we can load it onto this phone. But that way you don't even have to build the code. See, we're being really reasonable. We'll be back with more of Apple versus the FBI after these quick messages. So here's the other issue. Apple kind of shot itself in the foot. See, this is kind of a workaround already, this idea of being able to create a new kind of firmware to work around the security measures while not affecting any of the underlying data.
Speaker 1: The reason why that's possible at all is that Apple has allowed for the possibility of issuing a firmware update to a phone without the phone's owner having to accept it. Ah, yes. See, if Apple had designed this so that when it pushed out a firmware update you, as the user, had to log into your phone and accept it, there'd be no way for Apple to do this, because you would already have to have the PIN in order to accept the update. So you couldn't work your way around the PIN, because even the update meant to do the workaround would still need the PIN. But that's not the case. Apple can issue a firmware update without the user's consent. This, by the way, is also a security problem, and not just in this particular case. What if someone at Apple, you know, decided to code it in so that you could activate the microphone remotely? Yeah, and they shoot this firmware update out, and you don't have the ability to deny it.
You might not even 332 00:20:43,640 --> 00:20:45,800 Speaker 1: be aware that it happens unless you happen to be 333 00:20:45,880 --> 00:20:48,480 Speaker 1: using your phone when the firmware update gets pushed to 334 00:20:48,480 --> 00:20:53,920 Speaker 1: your phone. That's an issue. So because Apple can do this, 335 00:20:54,000 --> 00:20:59,200 Speaker 1: that gives the FBI the leg up to make this request. 336 00:21:00,080 --> 00:21:05,240 Speaker 1: So the FBI is just trying to be as reasonable 337 00:21:05,240 --> 00:21:09,719 Speaker 1: as possible in their request while avoiding addressing the 338 00:21:09,760 --> 00:21:17,440 Speaker 1: problems that could arise should Apple agree to it. And yeah, 339 00:21:17,480 --> 00:21:22,080 Speaker 1: that's the thing. Okay, that's the thing. 340 00:21:22,240 --> 00:21:27,840 Speaker 1: No matter how single-use this might be. Yeah, no 341 00:21:27,880 --> 00:21:34,480 Speaker 1: matter how noble or even crucial the cause, right, there 342 00:21:34,680 --> 00:21:39,520 Speaker 1: is not a practical way that this would 343 00:21:39,560 --> 00:21:43,240 Speaker 1: work without severe repercussions. Yeah, there's a word 344 00:21:43,280 --> 00:21:47,399 Speaker 1: I like to use in this case. It's called precedent. Ah. 345 00:21:47,400 --> 00:21:53,960 Speaker 1: So once you set a precedent where Apple agrees, acquiesces, surrenders, 346 00:21:54,760 --> 00:21:57,720 Speaker 1: is compelled to, however you want to put it, to 347 00:21:58,520 --> 00:22:04,840 Speaker 1: agree to the FBI's demands, you can't undo that. That has happened.
348 00:22:05,880 --> 00:22:09,840 Speaker 1: And perhaps more importantly, not only has it happened with 349 00:22:09,880 --> 00:22:14,080 Speaker 1: the US government, but now other governments that operate where 350 00:22:14,119 --> 00:22:18,080 Speaker 1: Apple sells products could come to Apple and say, we 351 00:22:18,119 --> 00:22:20,400 Speaker 1: know you can do this because you have done it, 352 00:22:20,760 --> 00:22:22,840 Speaker 1: and we know you will do this because you did 353 00:22:22,920 --> 00:22:24,760 Speaker 1: do it. So if you want to do business in 354 00:22:24,800 --> 00:22:28,040 Speaker 1: our country, you know, the one that starts with "Ch" 355 00:22:28,280 --> 00:22:31,920 Speaker 1: and ends with "ina," you will do this for us. 356 00:22:31,960 --> 00:22:34,760 Speaker 1: And when you're talking about a government like China's government, 357 00:22:35,200 --> 00:22:39,040 Speaker 1: you could see how this could be used to an 358 00:22:39,080 --> 00:22:43,960 Speaker 1: abusive extent. Anyone who is identified as a dissident could 359 00:22:44,000 --> 00:22:50,040 Speaker 1: be targeted. And China's an enormous market. Right, Apple cannot, 360 00:22:50,080 --> 00:22:54,520 Speaker 1: as a publicly traded company, turn its back on the 361 00:22:54,520 --> 00:22:58,360 Speaker 1: biggest emerging market in the world, not even emerging, it's emerged, 362 00:22:58,640 --> 00:23:01,239 Speaker 1: the biggest market in the world. Also, there 363 00:23:01,240 --> 00:23:05,439 Speaker 1: are manufacturing bases for Apple in China. That's part of it. 364 00:23:05,480 --> 00:23:09,280 Speaker 1: That's already been a security concern. Let's also consider, I mean, 365 00:23:09,600 --> 00:23:13,240 Speaker 1: if we're being honest, okay, how do I say 366 00:23:13,280 --> 00:23:19,680 Speaker 1: this correctly, Jonathan? Oh? Yes. Okay.
While there is no 367 00:23:20,119 --> 00:23:26,360 Speaker 1: universally acknowledged proof that corporate espionage projects or operations 368 00:23:26,440 --> 00:23:31,399 Speaker 1: coming from China are sponsored by the government, right, there 369 00:23:31,640 --> 00:23:38,760 Speaker 1: is widespread certitude that that is the case. Ockham's razor, right? Right, 370 00:23:38,920 --> 00:23:42,360 Speaker 1: Ockham's razor. You look at it and you think, okay, 371 00:23:42,440 --> 00:23:47,040 Speaker 1: it is entirely possible that any hackers operating in China: one, 372 00:23:47,480 --> 00:23:52,960 Speaker 1: maybe they're not Chinese. Unlikely, but possible. Two, maybe they're 373 00:23:52,960 --> 00:23:56,399 Speaker 1: operating from a different country and using proxies to go 374 00:23:56,520 --> 00:23:59,399 Speaker 1: through China. But considering the firewall of China, that seems 375 00:23:59,440 --> 00:24:05,560 Speaker 1: like an extra headache for those hackers. Three, they could 376 00:24:05,600 --> 00:24:08,520 Speaker 1: be Chinese and operating in China, 377 00:24:08,640 --> 00:24:12,160 Speaker 1: but not be directed by the Chinese government, in which 378 00:24:12,160 --> 00:24:15,400 Speaker 1: case they would still have to use proxies in order 379 00:24:15,400 --> 00:24:19,040 Speaker 1: to access that level of infrastructure. It just gets to 380 00:24:19,040 --> 00:24:21,760 Speaker 1: a point where you think the simplest explanation is, in fact, 381 00:24:21,800 --> 00:24:25,399 Speaker 1: there are these state-backed hackers that are doing this 382 00:24:25,560 --> 00:24:28,720 Speaker 1: on behalf of, or at the request of, the Chinese government. If 383 00:24:28,760 --> 00:24:33,159 Speaker 1: not at the request, at least with the implicit approval. 384 00:24:33,320 --> 00:24:35,240 Speaker 1: There we go, at least someone looking the other way.
385 00:24:35,240 --> 00:24:38,120 Speaker 1: But even that's not enough. There's assistance involved anyway. I'm 386 00:24:38,480 --> 00:24:41,359 Speaker 1: derailing us as well. The point being that if 387 00:24:41,400 --> 00:24:44,960 Speaker 1: Apple were to agree to this FBI request, there is 388 00:24:45,040 --> 00:24:47,720 Speaker 1: a distinct possibility that it would face not only future 389 00:24:47,760 --> 00:24:52,679 Speaker 1: requests from other government agencies as well as the FBI, 390 00:24:53,040 --> 00:24:56,000 Speaker 1: but from other countries as well, and that it would 391 00:24:56,040 --> 00:24:59,800 Speaker 1: be the wedge that drives open the possibility 392 00:24:59,800 --> 00:25:02,520 Speaker 1: of things like these backdoors that Apple and other 393 00:25:02,520 --> 00:25:06,240 Speaker 1: companies have been resisting for years now. Now, there is 394 00:25:06,280 --> 00:25:08,200 Speaker 1: a case that I think 395 00:25:08,200 --> 00:25:11,440 Speaker 1: we should clarify here. When we talk about precedent, 396 00:25:11,880 --> 00:25:15,080 Speaker 1: there have been precedents in the legal past 397 00:25:15,480 --> 00:25:21,680 Speaker 1: wherein Uncle Sam was allowed to compel a company, right, 398 00:25:21,920 --> 00:25:24,120 Speaker 1: to act. And this is different. But we do need 399 00:25:24,160 --> 00:25:27,560 Speaker 1: to, I think we need to differentiate these, because it 400 00:25:27,680 --> 00:25:32,560 Speaker 1: is legal in the US to compel a third party, 401 00:25:32,600 --> 00:25:37,640 Speaker 1: whether an individual or a corporation, to, um, help, how 402 00:25:37,640 --> 00:25:40,359 Speaker 1: do they word it, like execute a court order or 403 00:25:40,359 --> 00:25:43,040 Speaker 1: something like that.
Yeah. So you're talking about the All 404 00:25:43,160 --> 00:25:46,520 Speaker 1: Writs Act, right, which is a very old thing, back 405 00:25:46,520 --> 00:25:51,520 Speaker 1: to, what, seventeen eighty nine? Okay, the US wasn't even 406 00:25:51,600 --> 00:25:56,520 Speaker 1: very much the US yet, right. This is not a smartphone 407 00:25:56,600 --> 00:26:00,440 Speaker 1: specific law. There were still occasional battles with the British 408 00:26:00,520 --> 00:26:03,399 Speaker 1: going on. Okay, seventeen eighty nine, so were all the states 409 00:26:03,440 --> 00:26:06,919 Speaker 1: here yet? Many of them were not here yet. Yes, 410 00:26:07,040 --> 00:26:11,719 Speaker 1: seventeen eighty nine is when this was first written 411 00:26:11,760 --> 00:26:16,280 Speaker 1: down as part of the Judiciary Act. So specifically, what 412 00:26:16,359 --> 00:26:18,879 Speaker 1: the All Writs Act allows the government to do is 413 00:26:18,920 --> 00:26:22,560 Speaker 1: to compel a third party to accommodate federal orders like 414 00:26:22,640 --> 00:26:26,800 Speaker 1: a search warrant. So, in other words, the federal government 415 00:26:26,840 --> 00:26:29,800 Speaker 1: can issue a search warrant to law enforcement. The law 416 00:26:29,960 --> 00:26:32,800 Speaker 1: enforcement can go to, let's say it's an apartment building, 417 00:26:32,840 --> 00:26:35,480 Speaker 1: they can go to the manager of the apartment building 418 00:26:35,480 --> 00:26:37,960 Speaker 1: and say, we have the search warrant. The All Writs 419 00:26:38,000 --> 00:26:40,840 Speaker 1: Act tells us that you have to allow us to 420 00:26:40,880 --> 00:26:43,879 Speaker 1: go into this apartment to search it. And then the 421 00:26:43,920 --> 00:26:46,800 Speaker 1: apartment owner says all right and lets them in. Now, 422 00:26:46,840 --> 00:26:51,159 Speaker 1: this serves a couple of different purposes.
It expedites the 423 00:26:51,920 --> 00:26:55,159 Speaker 1: work of the federal government in investigations and things of 424 00:26:55,200 --> 00:26:58,359 Speaker 1: that nature. And it also provides a protection to those 425 00:26:58,359 --> 00:27:01,480 Speaker 1: third parties, because the third party is having to comply 426 00:27:02,000 --> 00:27:06,679 Speaker 1: with a federal request. And if you are a person 427 00:27:07,000 --> 00:27:11,560 Speaker 1: like an apartment manager and your other tenants are coming 428 00:27:11,560 --> 00:27:13,679 Speaker 1: to you and saying, why are you letting them into 429 00:27:13,920 --> 00:27:17,600 Speaker 1: someone's room without their permission, you can say, I have 430 00:27:17,720 --> 00:27:21,000 Speaker 1: to by law. That protects you as the owner as well, 431 00:27:21,520 --> 00:27:24,560 Speaker 1: because it means that you're not 432 00:27:24,600 --> 00:27:27,160 Speaker 1: a rat. You know, you're following the law. 433 00:27:27,240 --> 00:27:30,760 Speaker 1: You're obeying the law. Right. So automatically, anything you do 434 00:27:30,840 --> 00:27:33,960 Speaker 1: in the assistance of the execution of that court order 435 00:27:34,880 --> 00:27:39,280 Speaker 1: is automatically legal. I mean, as long as, yeah, 436 00:27:39,320 --> 00:27:41,840 Speaker 1: as long as they're not saying, hey, we have a 437 00:27:41,840 --> 00:27:44,040 Speaker 1: search warrant, and you go, sure, hey, while I'm on 438 00:27:44,080 --> 00:27:46,840 Speaker 1: the way, do you mind if I steal the car, right? 439 00:27:47,119 --> 00:27:49,320 Speaker 1: Or they can't say, hey, we're going to 440 00:27:49,359 --> 00:27:51,320 Speaker 1: get a search warrant, do you want to just let us in 441 00:27:51,520 --> 00:27:56,639 Speaker 1: now? Right? That would not be cool.
But there's a 442 00:27:56,720 --> 00:28:01,000 Speaker 1: very important idea that's attached to the US All Writs Act, 443 00:28:01,880 --> 00:28:04,399 Speaker 1: which is that, and the Supreme Court has ruled on this, 444 00:28:05,880 --> 00:28:09,320 Speaker 1: you cannot rely on the All Writs Act to compel 445 00:28:09,359 --> 00:28:14,119 Speaker 1: a third party to action if it creates an unreasonable 446 00:28:14,160 --> 00:28:18,000 Speaker 1: burden upon that party. So you might say, well, what's 447 00:28:18,000 --> 00:28:20,959 Speaker 1: the unreasonable burden for Apple? I mean, all they're asking 448 00:28:21,000 --> 00:28:26,879 Speaker 1: for is a way around this security system, just this time, 449 00:28:26,960 --> 00:28:29,480 Speaker 1: just the one time. So there are actually several counter 450 00:28:29,680 --> 00:28:34,119 Speaker 1: arguments to this. First, you know, Apple is 451 00:28:34,160 --> 00:28:37,640 Speaker 1: saying their programmers may not even know how to make 452 00:28:37,680 --> 00:28:41,920 Speaker 1: the code that would allow for this to happen. They're like, listen, 453 00:28:41,920 --> 00:28:44,480 Speaker 1: we don't even know that we can build this yet. 454 00:28:45,080 --> 00:28:46,960 Speaker 1: So you're asking us to do something that we don't 455 00:28:47,040 --> 00:28:50,760 Speaker 1: know we can build. So that's an unreasonable burden, because 456 00:28:50,760 --> 00:28:54,280 Speaker 1: it means we have to divert our assets 457 00:28:54,600 --> 00:28:57,320 Speaker 1: from projects that they should be working on to trying 458 00:28:57,320 --> 00:29:00,040 Speaker 1: to figure out if this thing is possible and, if so, 459 00:29:00,080 --> 00:29:03,680 Speaker 1: how to do it. Now, the FBI's argument was essentially, hey, 460 00:29:03,720 --> 00:29:05,760 Speaker 1: you're in the business of writing code. There should be 461 00:29:05,760 --> 00:29:12,560 Speaker 1: no problem, right?
I would desperately need a bleep sound 462 00:29:12,600 --> 00:29:15,760 Speaker 1: effect right now. But I call BS. Let's be nice. 463 00:29:15,840 --> 00:29:19,240 Speaker 1: I'll call BS on that argument, because to me, that's 464 00:29:19,240 --> 00:29:21,800 Speaker 1: the same as coming up to, let's say, let's say, 465 00:29:21,800 --> 00:29:24,840 Speaker 1: Ben, that you are a cook, yes, and you cook 466 00:29:25,240 --> 00:29:29,560 Speaker 1: in an Italian restaurant. You're the Italian restaurant's head cook. 467 00:29:29,800 --> 00:29:32,200 Speaker 1: Three Michelin stars. I go up to you and 468 00:29:32,240 --> 00:29:35,040 Speaker 1: I say, hey, listen, I'm from the federal government. I 469 00:29:35,120 --> 00:29:39,040 Speaker 1: got this executive order, this federal order, 470 00:29:39,840 --> 00:29:42,959 Speaker 1: for you. Okay, all right. You have to now go 471 00:29:43,040 --> 00:29:48,160 Speaker 1: and make a dinner of Peruvian food right now for 472 00:29:48,360 --> 00:29:50,960 Speaker 1: thirty people that are in the restaurant. You can do 473 00:29:51,000 --> 00:29:55,560 Speaker 1: it because you cook, right? Exactly. Yeah, 474 00:29:55,800 --> 00:29:58,400 Speaker 1: it would be a disastrous Peruvian dinner, at the very least. 475 00:29:58,400 --> 00:30:03,080 Speaker 1: Or, a looser interpretation, like, for another example, let's 476 00:30:03,160 --> 00:30:06,120 Speaker 1: make it even broader. Yeah, because I think this 477 00:30:06,240 --> 00:30:09,760 Speaker 1: is pretty good too. So, Jonathan, let's say 478 00:30:09,800 --> 00:30:14,600 Speaker 1: that you are a doctor. All right. Okay, you're 479 00:30:14,600 --> 00:30:18,240 Speaker 1: a doctor, Doctor Strickland. Doctor Strickland? Hey, hey, call me Jonathan; 480 00:30:18,240 --> 00:30:22,720 Speaker 1: Doctor Strickland's my dad.
Right. So you're an easy 481 00:30:22,760 --> 00:30:27,440 Speaker 1: going doctor, clearly. And 482 00:30:27,600 --> 00:30:30,000 Speaker 1: you're an ear, nose, and throat man. Okay, yeah. And 483 00:30:30,600 --> 00:30:34,840 Speaker 1: so I come up to you and say, I'm from 484 00:30:35,000 --> 00:30:39,080 Speaker 1: the federal government. I have an executive order but not 485 00:30:39,120 --> 00:30:41,680 Speaker 1: an appointment. So you're already kind of irritated, right? And 486 00:30:41,920 --> 00:30:46,240 Speaker 1: I say, I need you to make 487 00:30:46,280 --> 00:30:50,400 Speaker 1: the cure for cancer. Yeah, that's a bit much. Yeah, 488 00:30:50,440 --> 00:30:52,920 Speaker 1: I mean, you're a doctor, right? You know about organs 489 00:30:52,960 --> 00:30:56,880 Speaker 1: and stuff; cancer affects bodies. Or going to Harley-Davidson 490 00:30:57,120 --> 00:31:00,800 Speaker 1: and saying, look, you make stuff where wheels 491 00:31:00,800 --> 00:31:03,680 Speaker 1: are connected to chassis and they turn and a motor 492 00:31:03,880 --> 00:31:06,640 Speaker 1: keeps things going. I need you to make a bus. Yeah, 493 00:31:06,920 --> 00:31:09,680 Speaker 1: I mean, you see where it's ridiculous. 494 00:31:09,720 --> 00:31:14,240 Speaker 1: You can't argue that because this company is 495 00:31:14,280 --> 00:31:16,360 Speaker 1: in the business of doing this one thing, which, by 496 00:31:16,360 --> 00:31:18,840 Speaker 1: the way, is just one part of Apple's business, absolutely, 497 00:31:19,200 --> 00:31:21,959 Speaker 1: that they are capable of making this other thing that 498 00:31:22,080 --> 00:31:24,800 Speaker 1: happens to fall into that same category. That is ludicrous 499 00:31:24,800 --> 00:31:26,440 Speaker 1: on the face of it. So that's argument number one 500 00:31:26,960 --> 00:31:30,920 Speaker 1: about it being a burden. Okay.
The second burden is 501 00:31:30,920 --> 00:31:33,320 Speaker 1: the one that we've already touched on: it sets a precedent. 502 00:31:33,360 --> 00:31:35,920 Speaker 1: If Apple can be forced to attack the security of 503 00:31:35,920 --> 00:31:39,040 Speaker 1: its own system in this case, it could happen again, 504 00:31:39,240 --> 00:31:43,840 Speaker 1: and that would be a disastrous result for consumer 505 00:31:44,160 --> 00:31:47,440 Speaker 1: confidence in Apple's products. Right, that's very bad 506 00:31:47,480 --> 00:31:49,800 Speaker 1: for Apple's bottom line. So if Apple says, look, you 507 00:31:49,840 --> 00:31:54,280 Speaker 1: will make us lose millions, if not billions, of dollars 508 00:31:54,320 --> 00:31:58,440 Speaker 1: in revenue, how is that not an unreasonable burden? How 509 00:31:58,480 --> 00:32:02,520 Speaker 1: can you argue that that burden is reasonable? Right, exactly. 510 00:32:02,560 --> 00:32:04,920 Speaker 1: And not only that, but then you 511 00:32:04,960 --> 00:32:09,960 Speaker 1: get into the foreign agent approach, the foreign state approach, 512 00:32:10,000 --> 00:32:13,280 Speaker 1: saying, what if this means that China comes to us 513 00:32:13,280 --> 00:32:16,280 Speaker 1: and says, because of this other thing that we agreed 514 00:32:16,320 --> 00:32:18,240 Speaker 1: to do, we now have to do it in China 515 00:32:18,360 --> 00:32:23,640 Speaker 1: all the time, and real human beings are being pursued 516 00:32:24,000 --> 00:32:28,480 Speaker 1: and their lives are turned upside down and ruined as 517 00:32:28,520 --> 00:32:31,120 Speaker 1: a result of it. And it's all because we have 518 00:32:31,240 --> 00:32:34,480 Speaker 1: to comply, because this has already set a precedent. That's 519 00:32:34,480 --> 00:32:39,680 Speaker 1: an unreasonable burden. And finally, they've even said that it's 520 00:32:39,680 --> 00:32:43,360 Speaker 1: a violation of their right to free speech.
And the 521 00:32:43,440 --> 00:32:46,280 Speaker 1: reason for that is because code has been ruled as 522 00:32:46,320 --> 00:32:49,240 Speaker 1: a type of free speech in the past, and if 523 00:32:49,240 --> 00:32:53,800 Speaker 1: the government compels Apple to write code that Apple doesn't 524 00:32:53,840 --> 00:32:59,120 Speaker 1: believe in, they're being compelled to speak against their own beliefs, 525 00:32:59,200 --> 00:33:03,000 Speaker 1: thus a violation of free speech. Now, that argument, most 526 00:33:03,000 --> 00:33:05,040 Speaker 1: people are saying, is probably the weakest of all of 527 00:33:05,120 --> 00:33:07,640 Speaker 1: theirs. They're styling on it a little, but I gotta admit, 528 00:33:07,720 --> 00:33:11,840 Speaker 1: that's pretty awesome style. Well, yeah, it is. I enjoy it. 529 00:33:11,880 --> 00:33:16,000 Speaker 1: And, while it's reaching, it's not invalid. And let's 530 00:33:16,040 --> 00:33:21,040 Speaker 1: consider that Apple, legally, in this kind of case, is 531 00:33:21,120 --> 00:33:25,560 Speaker 1: playing against Uncle Sam on its home territory, right? 532 00:33:25,720 --> 00:33:29,960 Speaker 1: And this means that you might often wonder, 533 00:33:30,160 --> 00:33:33,680 Speaker 1: when there are suits or countersuits or legal problems, why 534 00:33:33,880 --> 00:33:38,280 Speaker 1: so many cases open with just this laundry list of arguments. 535 00:33:38,480 --> 00:33:41,680 Speaker 1: And it's because, you know, if we could go 536 00:33:41,720 --> 00:33:46,400 Speaker 1: back to my Italian restaurant, you're just throwing spaghetti 537 00:33:46,440 --> 00:33:49,400 Speaker 1: at the wall.
It's a scattergun approach, it absolutely is, 538 00:33:49,960 --> 00:33:52,680 Speaker 1: or, if you prefer, it's casting a very wide net, 539 00:33:52,720 --> 00:33:56,920 Speaker 1: because you aren't sure which tactic is necessarily going to 540 00:33:56,960 --> 00:33:59,120 Speaker 1: be your best one from the starting gate, so you 541 00:33:59,160 --> 00:34:01,760 Speaker 1: want to throw out all of them at once, and 542 00:34:01,800 --> 00:34:04,520 Speaker 1: if it's a compelling enough argument, then you 543 00:34:04,560 --> 00:34:07,520 Speaker 1: can get things thrown out before they go any further. 544 00:34:08,600 --> 00:34:11,799 Speaker 1: And in fact, Apple has said that they're willing to 545 00:34:11,880 --> 00:34:13,839 Speaker 1: go all the way to the Supreme Court with this 546 00:34:13,880 --> 00:34:18,280 Speaker 1: particular fight. The CEO said that publicly. Yeah, which is interesting, 547 00:34:18,280 --> 00:34:23,160 Speaker 1: because, you know, we recently lost a US Supreme 548 00:34:23,160 --> 00:34:26,240 Speaker 1: Court justice here in the United States, and I actually 549 00:34:26,239 --> 00:34:28,160 Speaker 1: think he probably would have sided with Apple on this 550 00:34:28,200 --> 00:34:33,440 Speaker 1: one, because of his very strict view of the Constitution, 551 00:34:33,560 --> 00:34:38,400 Speaker 1: right? He was an originalist, Antonin Scalia. Yeah. So going back 552 00:34:38,560 --> 00:34:40,440 Speaker 1: to this argument, I'm going to read a 553 00:34:40,520 --> 00:34:42,560 Speaker 1: quote, and I want to see what your reaction is, 554 00:34:43,560 --> 00:34:47,160 Speaker 1: because I know what mine was. This is from Congressman 555 00:34:47,320 --> 00:34:53,040 Speaker 1: David Jolly of Florida, who said Apple's leadership risks having 556 00:34:53,080 --> 00:34:57,920 Speaker 1: blood on their hands. Ben is shaking his head and 557 00:34:58,000 --> 00:35:01,680 Speaker 1: looking at me in disdain.
Not at me, he's looking 558 00:35:01,719 --> 00:35:06,320 Speaker 1: through me. It's absolutely insincere, first off. And, 559 00:35:06,840 --> 00:35:13,759 Speaker 1: you know, listeners, whenever you hear people make lurid, 560 00:35:13,840 --> 00:35:22,160 Speaker 1: imagery-based appeals to emotion, right, or these hyperbolic accusations, 561 00:35:22,520 --> 00:35:26,680 Speaker 1: this is the bread and butter of, I'll say 562 00:35:26,680 --> 00:35:31,840 Speaker 1: it, the political class. Theater, exactly. That's a great phrase. 563 00:35:31,880 --> 00:35:36,440 Speaker 1: And so, to say this in such a way, what 564 00:35:36,480 --> 00:35:39,800 Speaker 1: it does is, psychologically you get an image of somebody 565 00:35:40,239 --> 00:35:44,160 Speaker 1: with literal blood on their hands, and then you 566 00:35:44,200 --> 00:35:48,800 Speaker 1: know they're trying to cast aspersions on Apple, not by 567 00:35:48,960 --> 00:35:52,680 Speaker 1: making any point about the arguments Apple is 568 00:35:52,719 --> 00:35:57,319 Speaker 1: making, right, but by going instantly to, these people 569 00:35:57,360 --> 00:36:00,200 Speaker 1: are murderers; why are you 570 00:36:00,239 --> 00:36:05,719 Speaker 1: defending murderers? That's essentially, you know, the 571 00:36:05,840 --> 00:36:08,839 Speaker 1: argument. It's one of those 572 00:36:08,960 --> 00:36:11,720 Speaker 1: arguments you will hear occasionally. This isn't a legal argument, 573 00:36:11,760 --> 00:36:13,719 Speaker 1: but it's like the legal arguments you will occasionally hear 574 00:36:13,760 --> 00:36:16,520 Speaker 1: where it's clear that the lawyer is trying to appeal 575 00:36:16,560 --> 00:36:19,600 Speaker 1: to the jury's sense of emotion, sure, as opposed to 576 00:36:19,800 --> 00:36:23,000 Speaker 1: addressing the facts of the case itself.
Right. So, 577 00:36:23,280 --> 00:36:26,160 Speaker 1: I agree entirely with you. The first reaction I had was, 578 00:36:26,840 --> 00:36:31,000 Speaker 1: I'm offended by that statement. I mean, it's condescending, and 579 00:36:31,040 --> 00:36:34,800 Speaker 1: it imagines that the person that would 580 00:36:34,800 --> 00:36:38,000 Speaker 1: be swayed by it is not intelligent enough to read, 581 00:36:38,239 --> 00:36:42,160 Speaker 1: and it deflects the fact that there are two 582 00:36:42,200 --> 00:36:48,880 Speaker 1: people who were responsible for that terrible attack. Exactly. Two people, 583 00:36:49,320 --> 00:36:51,080 Speaker 1: and those two people are the people who were holding 584 00:36:51,080 --> 00:36:53,160 Speaker 1: the guns and pulling the triggers and aiming at people. 585 00:36:53,239 --> 00:36:55,160 Speaker 1: Those are the ones who have blood on their hands, 586 00:36:55,200 --> 00:36:57,319 Speaker 1: and of course they're both dead now; they both 587 00:36:57,400 --> 00:37:01,840 Speaker 1: died in the shootout with law enforcement. But the 588 00:37:01,960 --> 00:37:04,720 Speaker 1: point being that they're the ones responsible, not Apple. Apple 589 00:37:04,760 --> 00:37:08,520 Speaker 1: did nothing in relation to this crime. In fact, Apple 590 00:37:08,560 --> 00:37:12,760 Speaker 1: didn't even give the fricking phone to Farook; that was issued 591 00:37:12,800 --> 00:37:16,280 Speaker 1: by the county. Apple just made a device 592 00:37:16,760 --> 00:37:19,640 Speaker 1: that has this level of security on it that people wanted. 593 00:37:19,719 --> 00:37:22,919 Speaker 1: People wanted that level of security. They want the reassurance 594 00:37:23,000 --> 00:37:27,480 Speaker 1: that Apple itself can't access their phones without their permission. 595 00:37:28,600 --> 00:37:32,359 Speaker 1: It's a very important cornerstone of security.
In fact, if 596 00:37:32,360 --> 00:37:36,919 Speaker 1: you look at iOS eight or earlier, Apple could bypass security; 597 00:37:36,760 --> 00:37:41,040 Speaker 1: they could access the information on a phone without your PIN, 598 00:37:41,560 --> 00:37:45,600 Speaker 1: without you acquiescing and allowing that to happen. But they 599 00:37:45,640 --> 00:37:50,000 Speaker 1: specifically changed that with iOS nine. They made it so 600 00:37:50,040 --> 00:37:53,080 Speaker 1: that they could not do that, because they said, it's 601 00:37:53,120 --> 00:37:57,640 Speaker 1: important for consumers to trust the company. And how can 602 00:37:57,680 --> 00:37:59,640 Speaker 1: you build trust if you know in the back of 603 00:37:59,640 --> 00:38:02,760 Speaker 1: your mind that this company could at any moment access 604 00:38:02,960 --> 00:38:07,160 Speaker 1: my private information that I have not chosen to share 605 00:38:07,200 --> 00:38:11,200 Speaker 1: with them? Well, you know, that trust is destroyed in 606 00:38:11,239 --> 00:38:14,120 Speaker 1: that case, right? Kind of brings us back to that 607 00:38:14,239 --> 00:38:20,040 Speaker 1: unreasonable burden. So these arguments are continuing. I think the 608 00:38:20,080 --> 00:38:24,799 Speaker 1: next stage doesn't start till March tenth, and we're 609 00:38:24,800 --> 00:38:30,839 Speaker 1: recording this on February twenty sixth. So I am very 610 00:38:30,880 --> 00:38:35,000 Speaker 1: hopeful that the government sides with Apple on this ultimately, 611 00:38:35,080 --> 00:38:37,720 Speaker 1: that when this gets to the courts, they at any rate side 612 00:38:37,719 --> 00:38:42,200 Speaker 1: with Apple on this, because if they do not, 613 00:38:42,440 --> 00:38:45,880 Speaker 1: this could be like the snowball effect, where 614 00:38:46,239 --> 00:38:49,799 Speaker 1: we see more requests of this nature come in. And 615 00:38:49,840 --> 00:38:53,960 Speaker 1: then, see.
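The iOS 9 change described here rests on deriving the data-encryption key from the passcode entangled with a secret unique to the phone. Here is a rough sketch using PBKDF2 from Python's standard library; the iteration count, UID value, and names are illustrative assumptions, and real iOS performs this tangling inside dedicated hardware (the Secure Enclave) rather than in software like this.

```python
import hashlib

# Sketch of passcode entanglement: the key protecting the data is
# derived from the user's PIN *and* a per-device hardware secret
# (the UID) that never leaves the chip. Without both, the key, and
# therefore the data, is unrecoverable.
DEVICE_UID = b"unique-value-burned-in-at-manufacture"  # hypothetical

def derive_key(pin):
    # PBKDF2 also makes each guess deliberately slow, which is part
    # of why brute-forcing the PIN off-device doesn't work either.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

print(derive_key("1234") == derive_key("1234"))  # True: same PIN, same device
print(derive_key("1234") == derive_key("1235"))  # False: one digit off
```

In this model, the design goal the hosts describe is visible: since Apple holds neither the PIN nor this phone's UID output, there is no master key for Apple to hand over.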
Once it's been established as precedent, it's much 616 00:38:53,960 --> 00:38:56,560 Speaker 1: easier to happen in the future, and it's easier to 617 00:38:56,600 --> 00:39:01,080 Speaker 1: see larger requests, things that go beyond 618 00:39:01,200 --> 00:39:03,360 Speaker 1: "we need you to help us circumvent the security" 619 00:39:03,960 --> 00:39:06,400 Speaker 1: and may go into "we need you... we finally are 620 00:39:06,400 --> 00:39:08,719 Speaker 1: going to get what we wanted all this time. We 621 00:39:08,760 --> 00:39:12,680 Speaker 1: want a direct path, like a doorway that's labeled 622 00:39:12,719 --> 00:39:16,160 Speaker 1: FBI, that lets us go straight into the data that 623 00:39:16,280 --> 00:39:19,440 Speaker 1: your users are storing on their devices." Which leads to, 624 00:39:19,640 --> 00:39:23,640 Speaker 1: because the FBI... this leads to a horrific situation, because 625 00:39:23,680 --> 00:39:28,440 Speaker 1: the FBI is an institution, sure, made up of individuals. 626 00:39:28,680 --> 00:39:32,400 Speaker 1: Remember when the Snowden leaks revealed to us the extent 627 00:39:32,600 --> 00:39:35,879 Speaker 1: of unethical use of surveillance by the NSA? People would 628 00:39:35,920 --> 00:39:39,600 Speaker 1: look up their ex-girlfriends or their ex-boyfriends. Yeah, 629 00:39:39,640 --> 00:39:42,040 Speaker 1: they were just looking up people that they were interested 630 00:39:42,080 --> 00:39:44,960 Speaker 1: in, with absolutely no oversight. As a matter of fact, 631 00:39:45,239 --> 00:39:50,160 Speaker 1: when we talked about the Chinese government looking the other 632 00:39:50,160 --> 00:39:54,400 Speaker 1: way for hackers, that was the same thing that occurred 633 00:39:54,440 --> 00:40:00,840 Speaker 1: with the NSA.
To assume that, for some reason, given 634 00:40:00,840 --> 00:40:05,520 Speaker 1: the opportunity, individuals in another law enforcement branch or another 635 00:40:05,960 --> 00:40:11,120 Speaker 1: institution would not do the same thing, yeah, is cartoonishly naive. 636 00:40:12,040 --> 00:40:17,479 Speaker 1: We will conclude our twenty sixteen discussion about Apple versus 637 00:40:17,520 --> 00:40:29,840 Speaker 1: the FBI after this quick break. You know, using the 638 00:40:29,920 --> 00:40:32,200 Speaker 1: argument of "this is a one-time use," that wouldn't 639 00:40:32,239 --> 00:40:35,239 Speaker 1: stop the FBI from requesting another one-time use, or 640 00:40:35,280 --> 00:40:38,080 Speaker 1: another one-time use, or even extending that beyond it, 641 00:40:38,160 --> 00:40:41,520 Speaker 1: saying, all right, now we want a 642 00:40:41,600 --> 00:40:44,120 Speaker 1: one-size-fits-all approach to doing the same thing, because it's too 643 00:40:44,160 --> 00:40:46,560 Speaker 1: much time for us. And don't worry, we'll get a 644 00:40:46,600 --> 00:40:49,799 Speaker 1: court order before we do it. We'll make sure that 645 00:40:49,840 --> 00:40:53,000 Speaker 1: nobody else gets access to this ability. And you'll know 646 00:40:53,080 --> 00:40:56,720 Speaker 1: that the court orders are good, because they'll be classified. 647 00:40:57,280 --> 00:40:59,520 Speaker 1: So we'll just inform you when the orders are approved. 648 00:40:59,600 --> 00:41:02,080 Speaker 1: I have said it many times: there is no 649 00:41:02,160 --> 00:41:07,279 Speaker 1: way to ensure security by enforcing a vulnerability. Yeah, and 650 00:41:07,400 --> 00:41:10,879 Speaker 1: I think that's a way to encapsulate it.
But there 651 00:41:11,000 --> 00:41:13,400 Speaker 1: is a question that I have that I'm sure a 652 00:41:13,440 --> 00:41:16,200 Speaker 1: lot of you have as well, ladies and gentlemen, which is, 653 00:41:17,000 --> 00:41:20,760 Speaker 1: let's say the worst happens. Yes, okay, worst happens, the court 654 00:41:20,920 --> 00:41:27,040 Speaker 1: rules in favor of the FBI, and Apple says, nah, 655 00:41:27,080 --> 00:41:31,080 Speaker 1: we're not going to do it. Well, I mean, 656 00:41:31,160 --> 00:41:33,360 Speaker 1: if the court, assuming it goes all the way up 657 00:41:33,360 --> 00:41:35,960 Speaker 1: to the Supreme Court, this could end up becoming a 658 00:41:36,000 --> 00:41:41,600 Speaker 1: matter of law where it's codified that companies have to 659 00:41:41,640 --> 00:41:44,279 Speaker 1: obey that within the United States, which would mean far 660 00:41:44,440 --> 00:41:49,280 Speaker 1: reaching implications, not just for Apple, but for all companies, everyone, 661 00:41:49,640 --> 00:41:53,040 Speaker 1: any tech company, any company really 662 00:41:53,120 --> 00:41:55,600 Speaker 1: doing business in the US, not even based here, but 663 00:41:55,760 --> 00:42:00,000 Speaker 1: just doing business. So that's a big deal. It's 664 00:42:00,000 --> 00:42:06,839 Speaker 1: potentially disastrous for privacy. There are rampant possibilities of misuse. We've 665 00:42:06,880 --> 00:42:11,160 Speaker 1: talked about the possibility that if you do create a vulnerability, 666 00:42:12,040 --> 00:42:14,319 Speaker 1: someone somewhere is going to try and figure out a 667 00:42:14,360 --> 00:42:18,040 Speaker 1: way to also gain access to that vulnerability, right, And 668 00:42:18,120 --> 00:42:23,360 Speaker 1: these are not necessarily other law enforcement agencies or intelligence agencies, 669 00:42:23,440 --> 00:42:25,640 Speaker 1: or maybe they are intelligence agencies.
They just have to 670 00:42:25,680 --> 00:42:28,560 Speaker 1: be intelligence agencies working for a different country. And yeah, 671 00:42:28,600 --> 00:42:31,280 Speaker 1: and here's the problem, because we've talked about this before, 672 00:42:31,360 --> 00:42:35,160 Speaker 1: man. I think we talked about this with 673 00:42:35,200 --> 00:42:42,799 Speaker 1: autonomous vehicles before. Legislation is almost always outpaced by technological innovation. Yeah, yes, yeah, 674 00:42:42,880 --> 00:42:45,960 Speaker 1: and you will almost always see a case where someone 675 00:42:46,000 --> 00:42:48,680 Speaker 1: has figured out something really interesting to do with technology, 676 00:42:48,760 --> 00:42:52,040 Speaker 1: or perhaps even really scary things they could do with technology, 677 00:42:52,400 --> 00:42:54,760 Speaker 1: and there are no laws to cover it, because before 678 00:42:54,920 --> 00:42:57,560 Speaker 1: that person figured it out, it didn't exist. So you 679 00:42:57,600 --> 00:42:59,839 Speaker 1: don't write laws for stuff that doesn't exist. We don't 680 00:43:00,080 --> 00:43:04,200 Speaker 1: have a law saying, listen, guys, I just, it's 681 00:43:04,280 --> 00:43:06,160 Speaker 1: keeping me up at night. We have got to write 682 00:43:06,160 --> 00:43:09,359 Speaker 1: a law about what happens in the case that the 683 00:43:09,440 --> 00:43:12,920 Speaker 1: Loch Ness Monster is real, gets out of Scotland, comes over 684 00:43:12,920 --> 00:43:15,040 Speaker 1: to New Jersey and starts to eat people. We need 685 00:43:15,080 --> 00:43:19,960 Speaker 1: a law to protect us from this potential catastrophe. Right, 686 00:43:20,080 --> 00:43:23,920 Speaker 1: I'm being ridiculous. Yeah. Like if we're senators, everyone in the 687 00:43:23,960 --> 00:43:26,640 Speaker 1: audience and you, Jonathan, myself, and then one of us 688 00:43:26,680 --> 00:43:30,640 Speaker 1: walks in and says, guys,
I know 689 00:43:30,719 --> 00:43:33,120 Speaker 1: that we have some other issues coming up, and we 690 00:43:33,200 --> 00:43:36,240 Speaker 1: have to nominate this court justice, and there's an election 691 00:43:36,320 --> 00:43:40,000 Speaker 1: coming up. But I think we need to look into 692 00:43:40,000 --> 00:43:42,000 Speaker 1: the future and look at the big picture, which is 693 00:43:42,120 --> 00:43:46,960 Speaker 1: moon boot theft, right, because I don't want people shoeless 694 00:43:46,960 --> 00:43:49,640 Speaker 1: on the Moon when and if we build a colony there. Yeah, 695 00:43:49,640 --> 00:43:53,879 Speaker 1: I think arguing, for example, for robot rights right now 696 00:43:53,960 --> 00:43:56,480 Speaker 1: might be a little premature, that kind of thing. Maybe 697 00:43:56,480 --> 00:43:59,000 Speaker 1: not forever, but for now. It's definitely good to 698 00:43:59,040 --> 00:44:02,360 Speaker 1: think about. But you know, you make a very astute 699 00:44:02,400 --> 00:44:06,320 Speaker 1: point when you say, if we're talking about codifying something 700 00:44:06,400 --> 00:44:09,880 Speaker 1: or codifying a law, then what happens is once the 701 00:44:10,000 --> 00:44:14,920 Speaker 1: Supreme Court rules on something like this and it becomes 702 00:44:15,040 --> 00:44:18,359 Speaker 1: a matter of law, it is very, 703 00:44:18,480 --> 00:44:22,319 Speaker 1: very difficult to get that kind of ruling. The Supremes 704 00:44:22,320 --> 00:44:25,399 Speaker 1: are pretty busy people. Yep, they don't hear every case 705 00:44:25,440 --> 00:44:28,440 Speaker 1: that's brought before them. They absolutely don't. But the 706 00:44:28,480 --> 00:44:31,000 Speaker 1: thing is, you think it's hard to get one of 707 00:44:31,080 --> 00:44:35,680 Speaker 1: those justices to change the substance of 708 00:44:35,719 --> 00:44:39,879 Speaker 1: American jurisprudence. Imagine trying to get them to change it back.
709 00:44:40,000 --> 00:44:43,759 Speaker 1: This is like a Pandora's box, Pandora's jar situation. Yeah, 710 00:44:43,800 --> 00:44:46,640 Speaker 1: now this is not good. You don't want this 711 00:44:46,760 --> 00:44:50,360 Speaker 1: to happen for multiple reasons. Now, all that being said, 712 00:44:51,640 --> 00:44:54,319 Speaker 1: our sympathies are with the families of those who were 713 00:44:54,360 --> 00:44:57,440 Speaker 1: wounded and killed as a result of this mass shooting. 714 00:44:57,480 --> 00:45:01,440 Speaker 1: Absolutely, I feel awful for them. And if there were 715 00:45:01,480 --> 00:45:04,839 Speaker 1: any other way to get to that information that did 716 00:45:04,920 --> 00:45:09,160 Speaker 1: not require Apple to be complicit in destroying its own security, 717 00:45:09,800 --> 00:45:12,399 Speaker 1: I would be in favor of it. And in fact, 718 00:45:12,440 --> 00:45:14,840 Speaker 1: the FBI has taken such pains. They got access to 719 00:45:14,840 --> 00:45:19,279 Speaker 1: the iCloud backups that this phone creates. The problem being 720 00:45:19,320 --> 00:45:21,480 Speaker 1: that the phone didn't have an iCloud backup for the 721 00:45:21,520 --> 00:45:25,319 Speaker 1: month leading up to the actual attacks, so there could 722 00:45:25,320 --> 00:45:27,680 Speaker 1: be information on the phone that's not on the cloud, 723 00:45:27,960 --> 00:45:30,359 Speaker 1: and that's why the FBI wants to get access to that. 724 00:45:30,520 --> 00:45:35,120 Speaker 1: I totally understand the reasoning behind it, but two things, 725 00:45:35,160 --> 00:45:39,080 Speaker 1: of course, keep me from being completely sympathetic. One is 726 00:45:39,120 --> 00:45:41,720 Speaker 1: that the FBI has for years been trying to get 727 00:45:42,080 --> 00:45:45,879 Speaker 1: backdoor access to multiple systems. Exactly.
Yeah. So 728 00:45:46,440 --> 00:45:50,319 Speaker 1: you could argue that perhaps this mass shooting is being 729 00:45:50,520 --> 00:45:55,040 Speaker 1: leveraged cynically by the FBI in order to further their goals, 730 00:45:55,520 --> 00:46:00,440 Speaker 1: because it's hard to say no to such an emotionally 731 00:46:00,960 --> 00:46:06,040 Speaker 1: devastating event. Opportunistically, I would say. Yeah, I do, you know, 732 00:46:06,360 --> 00:46:10,920 Speaker 1: I believe it. And this is just my personal opinion 733 00:46:11,600 --> 00:46:17,280 Speaker 1: based on, again, precedent. It is completely within the realm 734 00:46:17,320 --> 00:46:23,120 Speaker 1: of not only possibility but plausibility that an institution would 735 00:46:23,160 --> 00:46:27,160 Speaker 1: wait for an opportune time to push this kind 736 00:46:27,280 --> 00:46:33,080 Speaker 1: of legislation, like the argument for Internet surveillance, based on saying, hey, 737 00:46:33,280 --> 00:46:36,000 Speaker 1: we need to protect people, we need to protect you 738 00:46:36,520 --> 00:46:43,000 Speaker 1: and your children from inappropriate content, think of the kids. Right. Really, 739 00:46:43,040 --> 00:46:45,440 Speaker 1: this is a lot of the same stuff 740 00:46:45,480 --> 00:46:48,759 Speaker 1: we heard in the wake of the Patriot Act. Absolutely, 741 00:46:49,040 --> 00:46:52,080 Speaker 1: where a lot of people felt the Patriot Act was 742 00:46:52,239 --> 00:46:57,200 Speaker 1: a reactionary piece of legislation that was drafted far too 743 00:46:57,239 --> 00:47:00,319 Speaker 1: quickly and had reached far, far too wide for 744 00:47:00,360 --> 00:47:06,120 Speaker 1: what it was proposing, what everyone claimed 745 00:47:06,120 --> 00:47:10,719 Speaker 1: it was all about. Right, and, uh, that was a 746 00:47:10,719 --> 00:47:14,560 Speaker 1: big mess. This is also potentially a really big mess.
747 00:47:14,600 --> 00:47:17,040 Speaker 1: And the Patriot Act, the substance of it had pretty 748 00:47:17,120 --> 00:47:20,520 Speaker 1: much been written in advance. Yeah, yeah, which is 749 00:47:20,560 --> 00:47:24,080 Speaker 1: pretty... like, that used to be a controversial statement, 750 00:47:24,120 --> 00:47:27,440 Speaker 1: but now it's acknowledged. Yeah, so this is, 751 00:47:27,600 --> 00:47:29,480 Speaker 1: I mean, the fact that the FBI has had this 752 00:47:29,560 --> 00:47:33,839 Speaker 1: plan for a while, not this specific implementation, but this 753 00:47:34,080 --> 00:47:38,120 Speaker 1: desire to get this workaround access to things. And I mean, 754 00:47:38,160 --> 00:47:41,400 Speaker 1: I totally understand their point of view, too. They're trying 755 00:47:41,440 --> 00:47:45,040 Speaker 1: to investigate things. It's not like the FBI is necessarily 756 00:47:45,080 --> 00:47:48,200 Speaker 1: made up of the Cigarette Smoking Man and all of 757 00:47:48,200 --> 00:47:50,759 Speaker 1: his cronies. You know, I don't 758 00:47:50,840 --> 00:47:54,600 Speaker 1: mean to disparage them, I don't 759 00:47:54,640 --> 00:47:57,040 Speaker 1: want to demonize them, right? That's not what 760 00:47:57,120 --> 00:48:00,400 Speaker 1: I'm trying to get at either. The FBI's intentions may 761 00:48:00,480 --> 00:48:03,360 Speaker 1: in fact be nothing but noble, that they want this 762 00:48:03,640 --> 00:48:09,480 Speaker 1: in an effort to investigate, solve crime, prevent crime from happening, 763 00:48:09,920 --> 00:48:15,319 Speaker 1: and not in any way that is malevolent.
However, the 764 00:48:15,360 --> 00:48:18,279 Speaker 1: fact remains, whether their intentions are noble or not, it 765 00:48:18,320 --> 00:48:22,880 Speaker 1: opens up this opportunity for people whose intentions are demonstrably 766 00:48:23,000 --> 00:48:27,600 Speaker 1: not noble to take advantage of those same opportunities. Well, yeah, 767 00:48:27,640 --> 00:48:29,600 Speaker 1: and you know, I'm glad you said that, because I 768 00:48:29,600 --> 00:48:31,520 Speaker 1: wanted to say something that I want to add to 769 00:48:31,560 --> 00:48:34,760 Speaker 1: this, something that is rarely said when we talk about 770 00:48:35,640 --> 00:48:39,799 Speaker 1: government surveillance or concerns about privacy. Right, one thing that 771 00:48:39,920 --> 00:48:45,680 Speaker 1: is rarely said is that law enforcement agencies, law enforcement institutions, 772 00:48:45,719 --> 00:48:52,480 Speaker 1: and individuals in the US actually do quite an extraordinary 773 00:48:52,560 --> 00:48:56,080 Speaker 1: job compared to a lot of places. If you're fortunate 774 00:48:56,160 --> 00:48:58,080 Speaker 1: enough to grow up in a place that has rule 775 00:48:58,120 --> 00:49:00,960 Speaker 1: of law, where you can walk down the street in 776 00:49:01,000 --> 00:49:03,640 Speaker 1: the dark, or you can say, uh, 777 00:49:03,920 --> 00:49:07,239 Speaker 1: you know, whoever your senator or president is, you can go 778 00:49:07,320 --> 00:49:10,920 Speaker 1: on the internet and say I think they stink. Yeah, 779 00:49:10,920 --> 00:49:13,120 Speaker 1: I think you're a jerk. Yeah, I think you're a 780 00:49:14,200 --> 00:49:18,000 Speaker 1: piece of bologna with shoes. But in other countries, 781 00:49:18,080 --> 00:49:21,680 Speaker 1: you know, people get arrested for that, people get imprisoned.
782 00:49:22,000 --> 00:49:25,399 Speaker 1: Or they get erased, yes, like not just 783 00:49:26,080 --> 00:49:30,120 Speaker 1: arrested or imprisoned, but the government in some countries will 784 00:49:30,160 --> 00:49:32,960 Speaker 1: take steps to make it seem like that person was 785 00:49:33,000 --> 00:49:35,759 Speaker 1: never a person. Exactly. They'll keep the photos, but you 786 00:49:35,800 --> 00:49:38,960 Speaker 1: won't be in them. And I say that because it's 787 00:49:38,960 --> 00:49:43,120 Speaker 1: a sense of much needed perspective. However, you know, I'm 788 00:49:43,160 --> 00:49:51,200 Speaker 1: not demonizing the people who work at the FBI. Institutions, 789 00:49:51,239 --> 00:49:59,520 Speaker 1: whether private or public, seek power, they seek further influence. 790 00:49:59,520 --> 00:50:03,000 Speaker 1: And it's not because it's some sort of James 791 00:50:03,040 --> 00:50:08,400 Speaker 1: Bond supervillain thing. It's not SPECTRE. It's because 792 00:50:08,560 --> 00:50:14,440 Speaker 1: it allows for an easier, more efficient 793 00:50:14,600 --> 00:50:18,759 Speaker 1: pursuit of whatever the original mission would be. Right. All right, 794 00:50:18,760 --> 00:50:20,520 Speaker 1: we got a little bit more to say on this 795 00:50:20,560 --> 00:50:22,720 Speaker 1: topic. Before we get to that, though, let's take another 796 00:50:22,800 --> 00:50:34,720 Speaker 1: quick break. Ben and I both talk about things 797 00:50:34,719 --> 00:50:37,320 Speaker 1: here at work where we say, gosh, I wish 798 00:50:37,320 --> 00:50:39,719 Speaker 1: we had X because it would make our lives so 799 00:50:39,800 --> 00:50:43,560 Speaker 1: much easier. Well, even if we got X, we would 800 00:50:43,600 --> 00:50:45,520 Speaker 1: come up with Y. That would be the next one. Right, 801 00:50:46,040 --> 00:50:48,440 Speaker 1: we get X. X is awesome, X is helping us out.
802 00:50:48,480 --> 00:50:50,399 Speaker 1: We're like, oh man, it's so good to have X here. 803 00:50:50,480 --> 00:50:52,640 Speaker 1: But you know what? Yeah, it would be great if 804 00:50:52,680 --> 00:50:54,680 Speaker 1: we had Y, because if we had Y, we could 805 00:50:54,719 --> 00:50:57,200 Speaker 1: really do our jobs. Well, we get Y, and then, 806 00:50:57,200 --> 00:51:00,520 Speaker 1: you know, man, X and Y work out like 807 00:51:00,560 --> 00:51:03,160 Speaker 1: a dream. But boy, if we had Z, can you 808 00:51:03,200 --> 00:51:06,359 Speaker 1: imagine the level we get to? Now, what we do, Ben, 809 00:51:06,960 --> 00:51:11,680 Speaker 1: we make fun audio podcasts, videos and articles that go 810 00:51:11,840 --> 00:51:16,000 Speaker 1: on the internet, and that's awesome. And so really our 811 00:51:16,040 --> 00:51:22,120 Speaker 1: capacity to do horrible, horrible harm is fairly limited. I 812 00:51:22,160 --> 00:51:26,120 Speaker 1: mean with respect to our jobs, comparatively. Yeah, yeah. Granted, 813 00:51:26,120 --> 00:51:28,080 Speaker 1: if either of us wanted to go outside and just 814 00:51:28,080 --> 00:51:31,000 Speaker 1: start throwing King of Pops popsicles at people, we could go 815 00:51:31,080 --> 00:51:34,920 Speaker 1: on a popsicle rampage, wreaking havoc. But that's not job related. 816 00:51:36,280 --> 00:51:40,759 Speaker 1: The FBI, the CIA, the NSA, a lot of those 817 00:51:40,760 --> 00:51:46,360 Speaker 1: three letter organizations, in pursuit of what they need to 818 00:51:46,400 --> 00:51:51,440 Speaker 1: do in order to fulfill their organization's mission, in 819 00:51:51,480 --> 00:51:55,840 Speaker 1: some cases will step over lines that we cannot allow 820 00:51:55,960 --> 00:52:00,160 Speaker 1: people to cross, because it creates a system that is 821 00:52:00,200 --> 00:52:03,400 Speaker 1: at least as dangerous as whatever problem they're trying to solve.
822 00:52:03,600 --> 00:52:09,320 Speaker 1: You know, another example of this is the idea 823 00:52:10,000 --> 00:52:16,200 Speaker 1: of absolute prevention. Like you know, there 824 00:52:16,239 --> 00:52:20,719 Speaker 1: was the old conversation about torture several years ago, when 825 00:52:20,840 --> 00:52:25,520 Speaker 1: it was the ticking time bomb argument, which was, should 826 00:52:25,560 --> 00:52:30,120 Speaker 1: torture be legal if there is a criminal in custody 827 00:52:30,320 --> 00:52:35,319 Speaker 1: who is suspected of having knowledge of another nine 828 00:52:35,320 --> 00:52:40,000 Speaker 1: eleven here? Yeah? Yeah, Jack Bauer from twenty four kind 829 00:52:40,000 --> 00:52:45,600 Speaker 1: of argument. Should torture, while reprehensible, be allowed when it 830 00:52:45,640 --> 00:52:52,759 Speaker 1: gets results? And this kind of reasoning is dangerous. 831 00:52:52,880 --> 00:52:56,440 Speaker 1: I'm not saying that because of any desire to 832 00:52:56,480 --> 00:53:00,319 Speaker 1: see human tragedy, but I'm saying it's dangerous because of 833 00:53:00,360 --> 00:53:03,479 Speaker 1: the assumptions it makes. Yes, that a special case will 834 00:53:03,480 --> 00:53:07,360 Speaker 1: remain a special case, right, and that perhaps the next case, 835 00:53:07,360 --> 00:53:11,799 Speaker 1: which maybe isn't quite so special, like, well, you know, 836 00:53:12,680 --> 00:53:16,560 Speaker 1: we've done it before, so what's the deal here? Yeah? 837 00:53:16,640 --> 00:53:19,240 Speaker 1: I mean, you were cool last time. What happens? Yeah? 838 00:53:19,280 --> 00:53:23,640 Speaker 1: So this is exactly why... I think 839 00:53:23,640 --> 00:53:25,880 Speaker 1: I feel pretty strongly about this.
I think I'm on 840 00:53:25,880 --> 00:53:28,960 Speaker 1: the right track that neither you nor I feel that 841 00:53:29,320 --> 00:53:32,319 Speaker 1: the FBI should win out in this particular case. I 842 00:53:32,360 --> 00:53:35,960 Speaker 1: think this is something where we really need to see 843 00:53:36,000 --> 00:53:39,360 Speaker 1: Apple come out on top. I am not a huge 844 00:53:39,400 --> 00:53:41,720 Speaker 1: fan of Apple. I don't own a lot of Apple 845 00:53:41,760 --> 00:53:46,120 Speaker 1: products. Apple does not sponsor this show. I'm 846 00:53:46,160 --> 00:53:49,600 Speaker 1: not getting any money from Apple. If anything, I'm losing 847 00:53:49,640 --> 00:53:52,319 Speaker 1: money to Apple, because my wife is a fan. She 848 00:53:52,360 --> 00:53:54,880 Speaker 1: wants to get an Apple Watch. But I am not, 849 00:53:55,120 --> 00:53:58,879 Speaker 1: I'm not getting anything from Apple. I do think they're 850 00:53:58,880 --> 00:54:01,759 Speaker 1: in the right, because I don't want to see 851 00:54:02,400 --> 00:54:06,480 Speaker 1: a precedent where a company that creates a secure system 852 00:54:07,040 --> 00:54:12,360 Speaker 1: has to be or can be compelled to compromise that security. 853 00:54:13,200 --> 00:54:16,719 Speaker 1: It defeats the purpose of the security. And whether it's 854 00:54:16,800 --> 00:54:21,760 Speaker 1: this case, which is extraordinary and very emotional, or something 855 00:54:22,000 --> 00:54:26,440 Speaker 1: much less impactful for the general public, maybe it's something, 856 00:54:26,680 --> 00:54:32,279 Speaker 1: you know, simpler and less dramatic. It doesn't matter. You 857 00:54:32,360 --> 00:54:37,160 Speaker 1: cannot go down that pathway and expect things 858 00:54:37,160 --> 00:54:39,399 Speaker 1: to turn out all right. You've got to figure out 859 00:54:39,440 --> 00:54:45,320 Speaker 1: other ways to do that kind of investigation.
Either Apple 860 00:54:45,400 --> 00:54:50,560 Speaker 1: needs to go in a direction where they 861 00:54:50,600 --> 00:54:55,440 Speaker 1: can access user data without having to circumvent a security 862 00:54:55,480 --> 00:54:57,560 Speaker 1: system like this, which means they have to go backwards, 863 00:54:57,600 --> 00:55:01,440 Speaker 1: which really is not a possibility, or Apple and 864 00:55:01,560 --> 00:55:06,600 Speaker 1: other companies have to create systems where it really is 865 00:55:07,440 --> 00:55:12,200 Speaker 1: impossible for them to access without the consent or the 866 00:55:12,680 --> 00:55:16,799 Speaker 1: actions of the owner of the device. I suspect that 867 00:55:16,880 --> 00:55:19,920 Speaker 1: every company is rushing to develop that kind of approach 868 00:55:20,000 --> 00:55:22,160 Speaker 1: right now, because none of them want to be in 869 00:55:22,160 --> 00:55:28,160 Speaker 1: this position, where Facebook, Google, Microsoft all publicly showed support 870 00:55:28,200 --> 00:55:32,439 Speaker 1: for Apple. Apple right now, you know, 871 00:55:32,440 --> 00:55:37,240 Speaker 1: hired a dev that worked on Edward Snowden's 872 00:55:37,239 --> 00:55:41,719 Speaker 1: favorite messaging app. Yeah, and I don't know how 873 00:55:41,800 --> 00:55:44,640 Speaker 1: much of that is meant to be like a PR move. 874 00:55:44,960 --> 00:55:50,520 Speaker 1: But also, you know, the leaked Snowden 875 00:55:50,600 --> 00:55:53,120 Speaker 1: papers are out there, and I know I'm harping on them. 876 00:55:53,160 --> 00:55:57,000 Speaker 1: They revealed an ugly behind the scenes look at corporate 877 00:55:57,120 --> 00:56:01,640 Speaker 1: involvement with government requests for surveillance, you know. So the 878 00:56:01,719 --> 00:56:11,239 Speaker 1: average consumer, you, me, Gary Busey, whomever.
We have 879 00:56:11,360 --> 00:56:15,640 Speaker 1: much less trust in general in these companies, because we 880 00:56:15,719 --> 00:56:20,920 Speaker 1: have a reason not to trust them. Well, we have 881 00:56:21,120 --> 00:56:26,480 Speaker 1: handed over so much of our own personal data. We 882 00:56:26,560 --> 00:56:29,880 Speaker 1: trust that the devices that hold that personal data aren't 883 00:56:29,920 --> 00:56:37,080 Speaker 1: going to just give that away to whatever entity without 884 00:56:37,120 --> 00:56:39,880 Speaker 1: our consent. We trust that that's not going to happen. 885 00:56:40,560 --> 00:56:43,480 Speaker 1: When things like this pop up where we start to 886 00:56:43,560 --> 00:56:48,200 Speaker 1: question that trust, that's problematic. There's someone else that Apple 887 00:56:48,239 --> 00:56:52,560 Speaker 1: has recently hired, Ted Olson. Does that name sound familiar 888 00:56:52,560 --> 00:56:57,480 Speaker 1: to you? Ted Olson's a lawyer. So Ted Olson's 889 00:56:57,520 --> 00:57:00,319 Speaker 1: going to be representing Apple. Ted Olson's probably best known 890 00:57:00,400 --> 00:57:05,440 Speaker 1: for representing George W. Bush in the Bush versus Gore 891 00:57:05,800 --> 00:57:10,239 Speaker 1: election fallout. You know, for those of you in the 892 00:57:10,320 --> 00:57:14,040 Speaker 1: United States, when Bush was running against Gore, there 893 00:57:14,160 --> 00:57:18,480 Speaker 1: was this whole battle about, you know, voting recounts, 894 00:57:19,840 --> 00:57:24,240 Speaker 1: and Olson represented George W. Bush on that, and 895 00:57:24,400 --> 00:57:27,560 Speaker 1: Bush ended up winning that. So now he's representing Apple 896 00:57:27,640 --> 00:57:32,840 Speaker 1: in this particular battle with the FBI. So it's interesting to 897 00:57:32,880 --> 00:57:37,880 Speaker 1: see these kinds of personalities involved in this.
And now 898 00:57:37,920 --> 00:57:42,520 Speaker 1: I know that the public perception has been a 899 00:57:42,560 --> 00:57:46,080 Speaker 1: little seesaw-ish, but the general public, I would argue, 900 00:57:46,080 --> 00:57:48,520 Speaker 1: the people who are not necessarily paying attention to the 901 00:57:48,560 --> 00:57:51,240 Speaker 1: tech sphere, I think a lot of them are siding 902 00:57:51,280 --> 00:57:55,360 Speaker 1: with the FBI because it's a terrorist story. It's a 903 00:57:55,400 --> 00:58:00,280 Speaker 1: story about trying to establish as much information about 904 00:58:00,320 --> 00:58:03,720 Speaker 1: these shooters as possible. Sure. Do you think so, though? 905 00:58:03,880 --> 00:58:06,640 Speaker 1: I do. I think that, at least in a 906 00:58:06,680 --> 00:58:08,720 Speaker 1: lot of polls that I've seen leading up to today, 907 00:58:09,680 --> 00:58:12,640 Speaker 1: the general public tends to side with the FBI, because 908 00:58:12,640 --> 00:58:15,520 Speaker 1: the FBI has a very emotional story. Apple's story is 909 00:58:15,600 --> 00:58:22,200 Speaker 1: much more rationally based, intellectually based, and the FBI's story 910 00:58:22,880 --> 00:58:30,280 Speaker 1: is pinned on this event, this very emotionally charged event. Um, 911 00:58:30,760 --> 00:58:32,400 Speaker 1: I don't know that that's going to continue. I think 912 00:58:32,480 --> 00:58:34,720 Speaker 1: people who are savvy in the tech sphere, 913 00:58:35,560 --> 00:58:37,960 Speaker 1: I think the majority of them side with Apple. But 914 00:58:38,040 --> 00:58:40,440 Speaker 1: it's still, because it still goes back to the 915 00:58:40,520 --> 00:58:45,680 Speaker 1: ticking time bomb argument, you know. Yeah, yeah. So this 916 00:58:45,760 --> 00:58:48,200 Speaker 1: is one of the stories that we definitely had to cover.
917 00:58:48,320 --> 00:58:50,640 Speaker 1: I mean, obviously it's such a huge story. It's probably 918 00:58:50,680 --> 00:58:54,200 Speaker 1: the biggest story in tech right now as we record this. 919 00:58:54,360 --> 00:58:57,760 Speaker 1: And I'm glad that you could join me, Ben, to chat 920 00:58:57,760 --> 00:59:00,320 Speaker 1: about this, to kind of give your insight as well, 921 00:59:00,840 --> 00:59:04,680 Speaker 1: and to talk about why this is so important, not 922 00:59:04,760 --> 00:59:08,600 Speaker 1: just from a technology standpoint, but from a philosophical standpoint 923 00:59:09,120 --> 00:59:13,320 Speaker 1: and a matter of law as well. Okay, that concludes 924 00:59:13,400 --> 00:59:17,200 Speaker 1: that twenty sixteen episode on Apple versus the FBI. As 925 00:59:17,240 --> 00:59:20,840 Speaker 1: I mentioned in the intro, this is an ongoing issue, 926 00:59:21,000 --> 00:59:25,200 Speaker 1: and it's not just about accessing physical devices. There have 927 00:59:25,280 --> 00:59:29,160 Speaker 1: also been lots of attempts from the FBI and other 928 00:59:29,240 --> 00:59:32,680 Speaker 1: agencies to convince companies like Apple to give them backdoor 929 00:59:32,760 --> 00:59:37,600 Speaker 1: access to otherwise encrypted forms of communication. As I have 930 00:59:37,680 --> 00:59:42,520 Speaker 1: said repeatedly in various episodes, this is a terrible idea. 931 00:59:42,680 --> 00:59:48,400 Speaker 1: Anytime you create a purposeful backdoor to an otherwise secure system, 932 00:59:48,440 --> 00:59:52,320 Speaker 1: assuming that that is even possible, because it's not always possible, 933 00:59:52,880 --> 00:59:55,960 Speaker 1: if you do that, all you have done is 934 00:59:56,000 --> 01:00:00,640 Speaker 1: really introduced a vulnerability.
You have created a huge target 935 01:00:01,000 --> 01:00:04,440 Speaker 1: for every hacker out there that wants to 936 01:00:04,480 --> 01:00:08,760 Speaker 1: be able to infiltrate that system, because you have created 937 01:00:08,960 --> 01:00:13,240 Speaker 1: a workaround through the otherwise legitimate way to send 938 01:00:13,320 --> 01:00:16,600 Speaker 1: encrypted data back and forth. It is never a good 939 01:00:16,640 --> 01:00:20,320 Speaker 1: idea to do that. It just decreases security rather 940 01:00:20,400 --> 01:00:25,280 Speaker 1: than increasing it. And so this is an ongoing issue 941 01:00:25,560 --> 01:00:31,040 Speaker 1: between investigative agencies and tech companies. I don't expect we're 942 01:00:31,040 --> 01:00:33,360 Speaker 1: going to see it go away anytime soon. There will 943 01:00:33,400 --> 01:00:36,480 Speaker 1: always be a struggle, but I sure hope that we 944 01:00:36,560 --> 01:00:39,240 Speaker 1: never get to a point where more and more tech 945 01:00:39,280 --> 01:00:43,520 Speaker 1: companies begin to introduce these backdoors, because it just makes 946 01:00:43,560 --> 01:00:46,640 Speaker 1: everyone less safe in the long run. If you have 947 01:00:46,680 --> 01:00:49,080 Speaker 1: suggestions for topics I should cover in future episodes of 948 01:00:49,080 --> 01:00:51,080 Speaker 1: tech Stuff, please reach out to me and let me know. 949 01:00:51,600 --> 01:00:54,760 Speaker 1: One way, of course, is to download the iHeartRadio app. 950 01:00:54,840 --> 01:00:57,280 Speaker 1: It is free to download, it's free to use. You 951 01:00:57,360 --> 01:00:59,800 Speaker 1: navigate over to tech Stuff using the search field. 952 01:01:00,120 --> 01:01:02,400 Speaker 1: There's a little microphone icon on the tech Stuff page.
953 01:01:02,600 --> 01:01:04,560 Speaker 1: You can leave me a voice message, up to thirty 954 01:01:04,600 --> 01:01:06,440 Speaker 1: seconds in length. Let me know what you would like 955 01:01:06,480 --> 01:01:09,360 Speaker 1: me to cover, or if you prefer, you can go 956 01:01:09,400 --> 01:01:11,680 Speaker 1: over to Twitter and send me a message there. The 957 01:01:11,720 --> 01:01:15,280 Speaker 1: handle for the show is tech Stuff HSW, and I'll 958 01:01:15,280 --> 01:01:24,600 Speaker 1: talk to you again really soon. Tech Stuff is an 959 01:01:24,600 --> 01:01:30,120 Speaker 1: iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, 960 01:01:30,240 --> 01:01:33,440 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.