Speaker 1: Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and how the tech are you? It is time for a TechStuff classics episode. This episode originally published on December seventh, twenty sixteen. It is called Bad Computer Bugs. No, it was twenty sixteen, the year where, like, bedbugs became a big news item. There was a year where bedbugs were just in the news, and I'm wondering if that was twenty sixteen. At this point it goes to show that I haven't actually listened back to this episode, but yeah, Bad Computer Bugs. It originally published December seventh, twenty sixteen. Let's take a listen now, shall we. I have to address a bit of apocryphal history, and regrettably it's a story that we've repeated on TechStuff. So I'm sad to admit that I was complicit, although unknowingly, in the spread of misinformation, and that all has to do with the origin of the term bug to describe a flaw in programming.
Speaker 1: So here's the popular story, the one that we have accidentally promoted on TechStuff without knowing that we were in the wrong. It goes that Grace Hopper, an early computer scientist who rose to the rank of rear admiral in the US Navy, coined the phrase bug after discovering a moth gumming up Harvard's Mark II calculator, a literal bug. Generally speaking, the story tends to be set in nineteen forty five, and there is even a note in the logbook that reads "first actual case of bug being found" that's attributed to Grace Hopper. But there are several points that are wrong in this story. First, the year. It didn't happen in nineteen forty five. It happened on September ninth, nineteen forty seven. We know because there's a logbook. The logbook that marks the incident not only has the notes, it actually has the moth taped into the book itself. It's taped onto the page. Second, Grace Hopper wasn't the person to discover the moth or make that log entry. She did tell the story about the moth several times, but it wasn't in the context of finding it or logging it.
Speaker 1: She just told the story that, yeah, we really did have a bug in the system. And most importantly, the word bug had already been used to describe design flaws for decades before the Mark II was even designed. In fact, if you look at the logbook, this makes sense. It says "first actual case of bug being found." That sentence doesn't make sense unless you've already been using the word bug to describe a flaw; otherwise the wording and the context make no sense. Indeed, there are documented quotes dating back to the nineteenth century using the word bug to mean a design fault, and it could go back even further than that. So it is with much regret that I admit I have unwittingly contributed to a bit of misleading folklore making the rounds. But I'm glad I can take this opportunity to address it. All right, so let's talk about design bugs, and I'll be covering several goofs, mistakes, flubs, flaws, and outright catastrophes in this episode.
Speaker 1: But one thing I'm not necessarily going to cover is software vulnerabilities that were later exploited, either by opportunistic hackers or white hats who were just trying to improve system security. Those vulnerabilities are common in many types of software and arise not just through mistakes but sometimes simple oversights. And I think it might be more fun to look at some real bugs, like stuff that made things go wrong, stuff that may have rendered a program defunct or otherwise caused headaches. Now, I'm going to make an exception to this. I'm going to start off with the Ping of Death, and I only mention it because it has an awesome name. This flaw caused headaches back in nineteen ninety five and ninety six. It involved flawed IP fragmentation reassembly code, and it became possible to crash lots of different types of computers running different operating systems, although Windows machines were particularly vulnerable. This particular flaw would make a Windows machine revert to the dreaded blue screen of death. And it all happened by sending a special ping packet over the Internet.
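An aside in text form, since the arithmetic behind the Ping of Death is simple enough to sketch. The constants below are the actual IPv4 header limits; the function and sample values are purely illustrative, not taken from any real exploit code:

```python
# Sketch of the ping-of-death arithmetic (illustrative only, not an exploit).
# IPv4's total-length field is 16 bits, so a legal packet tops out at 65,535 bytes.
MAX_IPV4_PACKET = 65535

# The fragment-offset field is 13 bits, counted in 8-byte units, so the
# final fragment of a packet can claim to start as late as byte 65,528.
MAX_FRAGMENT_OFFSET = (2**13 - 1) * 8  # 65528

def reassembled_size(last_fragment_offset: int, last_fragment_payload: int) -> int:
    """Size the receiver must hold once the final fragment arrives."""
    return last_fragment_offset + last_fragment_payload

# A well-formed final fragment stays within the limit:
ok = reassembled_size(MAX_FRAGMENT_OFFSET, 7)      # 65528 + 7 = 65535, fine
# A malicious final fragment claims the maximum offset AND a large payload:
bad = reassembled_size(MAX_FRAGMENT_OFFSET, 1000)  # 66528, past the limit
# Reassembly code that trusted the limit used a fixed 65,535-byte buffer,
# so the oversized result overflowed it and crashed the machine.
```

The bug, in other words, was trusting that no packet could ever reassemble to more than the protocol's stated maximum.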
Speaker 1: So, for those of you who aren't familiar with what that is, a ping is essentially a simple message that checks for a connection between two computers. You send one ping from a computer to another one and look for a response; that way you verify there is in fact a connection. You can also tell other things, like how fast the connection between those two computers is. Now, in this case, you would have to actually design a malformed ping request and send that to a target, and it would bring that target down. That's the only security vulnerability story I really wanted to focus on. The others are all just design flaws. And let's begin with the bug that inspired me to do this episode in the first place, that Spotify bug I mentioned earlier.
Speaker 1: Ars Technica wrote a piece on it in November twenty sixteen, but the problem seems to date back at least as far as June twenty sixteen, and that's when a few savvy Spotify users noticed some unusual activity on their computers. It took a little bit of detective work, but they discovered that Spotify was apparently generating a huge amount of data on a daily basis, like gigabytes of data per day. And the culprit turned out to be a vacuum process for a database file containing the string mercury dot dB. Now, the vacuum process is the digital equivalent of vacuum sealing. It's meant to repack data so that it takes up less space on a drive. This involves building a new file to maximize efficiency, which is a good thing, generally speaking. The problem was that Spotify's version was making it happen way too frequently, like on the order of once every few minutes. That's not generally necessary. You don't need to rebuild a database file every few minutes to make sure it's the most efficient size it can be.
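For readers following along in text: the file name mercury dot dB suggests an SQLite database, and if so, the vacuum process described here corresponds to SQLite's VACUUM command, which rebuilds the entire database file from scratch. A minimal sketch of what one vacuum costs, with made-up table contents:

```python
# Demonstrate SQLite's VACUUM: it reclaims free pages, but only by
# rewriting the whole database file. (Made-up data for illustration.)
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE cache (k INTEGER PRIMARY KEY, v TEXT)")
con.executemany("INSERT INTO cache VALUES (?, ?)",
                [(i, "x" * 1000) for i in range(5000)])
con.commit()
con.execute("DELETE FROM cache WHERE k % 2 = 0")  # leaves free pages behind
con.commit()

before = os.path.getsize(path)
con.execute("VACUUM")   # rebuilds the file; every VACUUM is a full rewrite
after = os.path.getsize(path)
con.close()
# after < before: space reclaimed, at the price of rewriting the entire file.
```

Done once in a while, this is healthy housekeeping. The Spotify bug was effectively doing this full-file rewrite every few minutes, which is where the gigabytes per day came from.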
Speaker 1: So each rebuild represented a relatively small amount of data, but over time it added up, which meant that if you had Spotify on your computer, even if it was just running in the background, it would be generating gigabytes' worth of information rewriting this file over and over. Now, it wasn't filling up a hard drive. It was just overwriting the same file. If it had been filling up a hard drive, people would have noticed much earlier, and it wouldn't have just been savvy Spotify users, because you would suddenly notice, hey, I can't save anything to my hard drive because everything's filling up. Instead, again, it was just sort of writing and deleting and writing and deleting the same file over and over again. That probably doesn't sound like a big deal, but it is a problem if you're using a solid state drive, or SSD. One of the drawbacks of an SSD is that over time it loses storage capacity. You can store less data on an SSD over time. Now, by over time I generally mean over a great deal of time and a lot of different data being written to it and overwritten.
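To put rough numbers on that, here's a back-of-the-envelope sketch. Every figure below is hypothetical (a plausible consumer-drive endurance rating and guessed daily write volumes), just to show why constant rewrites shorten a drive's life:

```python
# Back-of-the-envelope SSD wear math. All numbers are hypothetical.
# Consumer SSD endurance is quoted in "terabytes written" (TBW).
TBW_RATING_TB = 150       # hypothetical rating for a small consumer SSD
NORMAL_GB_PER_DAY = 15    # hypothetical typical desktop write workload
BUGGY_GB_PER_DAY = 100    # roughly the scale of the reported Spotify rewrites

def years_until_worn(tbw_tb: float, gb_per_day: float) -> float:
    """Years to exhaust the endurance rating at a steady daily write rate."""
    return (tbw_tb * 1000) / gb_per_day / 365

normal = years_until_worn(TBW_RATING_TB, NORMAL_GB_PER_DAY)  # roughly 27 years
buggy = years_until_worn(TBW_RATING_TB, BUGGY_GB_PER_DAY)    # roughly 4 years
```

Same drive, same rating; only the write rate changed, and the projected lifespan collapsed by a factor of the write-rate increase.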
Speaker 1: Generally speaking, most of us end up replacing our drives before we get to a point where the loss of capacity is a real issue. It's similar, in a way, to how a battery can lose its ability to hold a full charge after you've gone through lots of charging and discharging cycles. You know how a battery won't be able to hold as much even if it says it's at one hundred percent, but one hundred percent doesn't last you as long as it used to? That's because its capacity to hold a full charge has decreased over time. But let's say you've got a program that's just constantly overwriting data on your drive. You might discover that your SSD's useful lifespan has been drastically reduced. So, as I record this episode, Spotify has already rolled out an updated version of its desktop application, and that, by the way, is the only version of Spotify that was affected. If you use web-based Spotify or mobile Spotify, you're in the clear already. If you use the desktop version, as long as you have version one point zero point four two or later, you are fine.
Speaker 1: But if you did have that earlier version and you just had Spotify running in the background, chances are it was writing to your hard drive like crazy. So what about some of the other big bugs in computer history? Well, some of the real doozies involve our attempts to explore the final frontier. We'll be talking about space a few times in this episode, and we'll start with an early US satellite. First up is a nineteen sixty two blunder involving the Mariner one. So, some backstory on this one. We're going to talk a lot about the Soviet Union in this episode too. It plays a couple of roles as we go on. But in this case, the then USSR had launched Sputnik into orbit in nineteen fifty seven, which really kicked off the space race and also was a big shot in the Cold War, because the Soviet Union was essentially saying, hey, we can launch this into space, we could also launch something at you. In response, the US did sort of the same thing.
Speaker 1: They had launched some satellites into space, and the Mariner one was going to be a big, big feather in the cap of the US. The whole idea was to launch a probe that would be a flyby probe, and it would go by Venus. So NASA, which was still fairly new in nineteen sixty two, was taking control of this, and the budget for this particular project was eighteen point five million dollars, which, if you were to adjust for inflation, would be almost one hundred and fifty million dollars today. So, a one hundred and fifty million dollar project to launch the Mariner one and have it fly by Venus. But, as I'm sure you guys have figured out by now based upon the topic of this podcast, not all went according to plan. Not long at all after the rocket left the launch pad, it began to veer off course, and neither the computer controls on the rocket nor manual controls back at HQ could correct for the problem.
Speaker 1: The rocket's course was such that it was going to take it over shipping lanes, which meant there could be a potential catastrophe, and so a range safety officer made the difficult call and issued the command to blow the whole thing up just shy of three hundred seconds after it launched. So what happened? Why did it go off course in the first place? Well, there was a flaw in the spacecraft's guidance software which diverted the rocket, and no amount of commands from ground control could correct for it. After a lengthy investigation, NASA discovered the error was the result of a mistake transcribing handwritten notes into computer code. Someone just took some handwritten notes and misinterpreted one of them, and that one mistake was enough to crash the rocket, or rather to necessitate it being destroyed. The great science fiction author Arthur C. Clarke wrote that the Mariner one was wrecked by "the most expensive hyphen in history," which isn't quite right, but it's pretty funny. I mean, come on, it's a humorous phrase. The actual punctuation mark that caused the problem was not technically a hyphen.
Speaker 1: It was a superscript bar. Superscript bars, by the way: not a place where playwrights hang out to get tore up. A superscript bar is just a horizontal bar that sits above some other symbol. In this case, it was a radius symbol, and that symbol, together with the superscript bar, described a smoothing function, which means the formula was meant to calculate smoothed values of the time derivative of a radius. Without the smoothing function, tiny deviations in course sent commands to the rocket's thrusters to kick in big time and overcorrect for the problem. As an analogy, imagine you're driving a vehicle and you see a pothole in the road as you're approaching it, and instead of gently steering out of the way, you wrench the wheel really hard to the left or to the right to try and get around the pothole. That's kind of what was happening with the rocket. It didn't have the smoothing function, and as a result, it was having these wild deviations in course. So it wasn't a hyphen that caused the problem, but that was close enough.
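The smoothing idea itself is easy to sketch in text. This is not the actual Mariner guidance math, just a generic exponential smoother with made-up readings, showing how smoothed values swing far less than raw ones, which is why dropping that bar over the radius symbol made the corrections so violent:

```python
# Why a missing smoothing function causes overcorrection: a guidance loop
# reacting to raw noisy readings slams around; a smoothed signal is gentle.
# (Generic exponential smoothing with made-up data, not Mariner's equations.)

def smooth(readings, alpha=0.2):
    """Exponential smoothing: each output leans mostly on history."""
    out, s = [], readings[0]
    for r in readings:
        s = alpha * r + (1 - alpha) * s
        out.append(s)
    return out

# Noisy deviation readings jittering around a true deviation of zero:
raw = [0.0, 5.0, -4.0, 6.0, -5.0, 4.0]
smoothed = smooth(raw)

# Largest swing between consecutive readings, raw vs. smoothed:
raw_swing = max(abs(b - a) for a, b in zip(raw, raw[1:]))              # 11.0
smooth_swing = max(abs(b - a) for a, b in zip(smoothed, smoothed[1:]))  # much smaller
```

Steering commands proportional to the smoothed signal stay small; commands proportional to the raw signal wrench the wheel, exactly as in the pothole analogy.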
Speaker 1: Our next space story takes place in nineteen ninety six with the European Space Agency's Ariane 5 Flight 501 rocket. This rocket was to launch into space on June fourth, nineteen ninety six, and instead the rocket disintegrated forty seconds after taking off. So what the heck happened? Well, it largely had to do with the ESA reusing old work. This actually becomes a theme in this episode. One of the morals of this entire podcast is: if you're designing something, a successor to an earlier product, and you want to reuse some of the features that you created in your previous product, test the heck out of it in its new form factor, because it could be that things that worked perfectly fine in the earlier model will go awry in the new one. That's what happened here. As you might guess from the name, the Ariane 5 marked the fifth generation of launch vehicles under that name. The Ariane 4's inertial reference system would convert sixty four bit floating point numbers into a sixteen bit signed integer, and it worked just fine.
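That conversion, and the way it fails once the values get bigger, can be sketched like this. The real flight code was Ada running on the inertial reference system; this is just the arithmetic, with made-up sample values:

```python
# Sketch of the Ariane-style conversion failure mode (illustrative values;
# the real code was Ada). A 16-bit signed integer holds -32768..32767.
INT16_MIN, INT16_MAX = -32768, 32767

def to_int16_checked(x: float) -> int:
    """Convert a 64-bit float to a 16-bit signed integer, or fail loudly."""
    n = int(x)
    if not INT16_MIN <= n <= INT16_MAX:
        # On Flight 501 the analogous unhandled operand error brought down
        # the inertial reference computers.
        raise OverflowError(f"{x} does not fit in a signed 16-bit integer")
    return n

ok = to_int16_checked(20000.7)      # a value on the old rocket's scale: fits
try:
    to_int16_checked(64000.2)       # a beefier rocket's value: does not fit
    overflowed = False
except OverflowError:
    overflowed = True
```

The range check never failed on the Ariane 4 because its flight profile never produced values that large, which is exactly why reuse without retesting was so dangerous.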
Speaker 1: But the Ariane 5's stats were beefier than its predecessor's, with faster engines, and that was where the problem really started. The engine output meant those sixty four bit floating point numbers were significantly larger than the ones generated by the engines on the Ariane 4. They didn't anticipate this, so during the conversion process there was data overflow, and that overflow caused both the backup computer and the primary computer aboard the Ariane 5 to crash, and they crashed in that order: the backup computer crashed first, followed by the primary computer a couple of seconds later. The whole thing took less than a minute to go from launch to disintegration. Oops. Now we're going to stick with space but jump forward to nineteen ninety eight and the Mars Climate Orbiter. This was an unfortunate problem. This particular spacecraft was meant to study Mars's climate, atmosphere, and surface changes, and it was also supposed to be a kind of relay station for landers that would explore Mars's surface. But none of that would last, because of some pretty significant goofs.
Speaker 1: So on September twenty third, nineteen ninety nine, the orbiter passed into the upper atmosphere of Mars, and did so at a pretty low altitude. This is what folks in the space industry call a bad thing. The drag on the spacecraft was significant; it began to fall apart and was destroyed upon entering Mars's atmosphere. That's what happened. So the software guiding the orbiter was to blame, and it's a dumb, dumb mistake. It was supposed to make adjustments to the orbiter's flight in SI units, specifically in newton seconds. That's what the contract between Lockheed and NASA said: newton seconds. Use SI units for all of your calculations. But the software instead made calculations in non-SI units, namely pound seconds. So Lockheed software gave information to NASA's systems using the wrong units of measure. NASA's systems then took that information, assuming it was in the right units of measure, and executed commands based upon that. So this is why, if you're ever in a math course, the teacher makes you stop in the middle of writing a problem on the board and says, where are your units?
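Here's the mismatch in sketch form, with hypothetical numbers. The only real constant below is the conversion factor, about four point four five newton seconds per pound-force second, which is the error factor that turned up in the investigation:

```python
# Hypothetical numbers illustrating the Mars Climate Orbiter unit mismatch.
# One pound-force second is about 4.45 newton seconds, so a value produced
# in lbf*s but read as N*s is off by that factor.
N_S_PER_LBF_S = 4.44822   # real conversion factor: 1 lbf*s in N*s

impulse_lbf_s = 100.0                          # value the software reported (lbf*s)
true_impulse_n_s = impulse_lbf_s * N_S_PER_LBF_S  # what was physically delivered
misread_as_n_s = impulse_lbf_s                 # what the receiving system assumed

error_factor = true_impulse_n_s / misread_as_n_s   # ~4.45, every single burn
```

No single number in the exchange looked wrong; only the missing unit label made every trajectory correction off by the same factor.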
Speaker 1: This is why you have to make sure you're using the right units, because if you're saying a number and you don't associate a unit with it, someone could make an incorrect decision based on that, and it could be disastrous, as it was in the case of this orbiter. The thrusters fired at four point four five times the power they were supposed to, and the orbiter didn't stand a chance. And this was a pretty expensive mistake. That mission's cost came in at three hundred and twenty seven point six million dollars. But on the bright side, with all of these stories, at least no human lives were ever in real danger as a result of the mistake. We're gonna take a quick break, and when we come back, we'll talk more about bad computer bugs. All right. Now, let's make a switch to AT and T, which is a company that had a pretty big problem with switches once upon a time. I'm talking about an issue that popped up on January fifteenth, nineteen ninety. That's when AT and T long distance customers discovered they were unable to make any long distance calls.
Speaker 1: Why could they no longer reach anybody? Well, AT and T's long distance switches, which control that and allow for the actual connections to be made, were on the fritz. They were trying to reboot over and over again; they were just stuck in a reboot cycle. Now, initially the company thought it was being hacked, but like I said at the top of the show, I'm not covering stories about hackers here. I'm talking about big design flaws that caused problems. So they weren't getting hacked. That's not what was going on with those one hundred and fourteen long distance switches. No, there was a design problem at fault. What had happened was AT and T had rolled out an update to the code that managed the switches, and it was meant to increase efficiency. It was meant to speed things up. But the problem was it sped things up so much that the system got caught up in itself. It gets pretty technical, but I can give you kind of an overview of what the problem was.
Speaker 1: All right. So each switch had a function that allowed it to alert the next switch down the line if things were starting to get hairy. So imagine that switch number one is handling traffic, but it's getting really close to capacity. It sends a message over to switch number two and says, I can't take on any more work, because if I do, I'll be overloaded. Switch two then says, no problem, I'll take on any incoming work for you and we'll handle it from there. And if switch number two were to get into the same sort of situation, it would say the same thing to switch number three, and so on and so forth. Eventually, each switch will contact the one below it and say, hey, how are you doing there? And if the answer is okay, then everything switches back and you go back to normal operation. That's how it's supposed to work. But AT and T's updated code sped things up so much it caused some real issues, and there was some poor timing, just coincidental timing, that made things worse.
So switch number one starts to get 333 00:20:25,080 --> 00:20:27,840 Speaker 1: overwhelmed and sends a message over to switch number two, 334 00:20:27,920 --> 00:20:30,480 Speaker 1: but switch number two was just in the middle of 335 00:20:30,520 --> 00:20:34,720 Speaker 1: resetting itself. So switch number two goes into reset mode, 336 00:20:34,720 --> 00:20:37,000 Speaker 1: which says do not disturb, and sends a message over to 337 00:20:37,000 --> 00:20:40,639 Speaker 1: switch number three. That prompted switch number three to overload 338 00:20:40,920 --> 00:20:42,960 Speaker 1: and put up a do not disturb sign, and move that 339 00:20:43,000 --> 00:20:46,000 Speaker 1: down to switch number four. This whole thing goes down 340 00:20:46,040 --> 00:20:48,680 Speaker 1: the entire line of one hundred and fourteen switches. They 341 00:20:48,720 --> 00:20:51,360 Speaker 1: all end up getting overloaded as a result of this, 342 00:20:51,760 --> 00:20:54,920 Speaker 1: and all go into reset mode and they get stuck there. 343 00:20:56,480 --> 00:21:00,520 Speaker 1: That problem lasted for nine hours before 344 00:21:00,600 --> 00:21:03,240 Speaker 1: AT&T was finally able to address the message load 345 00:21:03,359 --> 00:21:05,879 Speaker 1: on the entire system and get the switches back to normal. 346 00:21:06,359 --> 00:21:09,520 Speaker 1: The estimated cost of lost revenue for that time was 347 00:21:09,560 --> 00:21:13,840 Speaker 1: about sixty million dollars in long distance calls, and there 348 00:21:13,840 --> 00:21:16,000 Speaker 1: were a lot of angry customers to boot, so to 349 00:21:16,119 --> 00:21:19,919 Speaker 1: placate them, AT&T offered reduced long distance rates 350 00:21:19,960 --> 00:21:25,040 Speaker 1: on Valentine's Day. Pretty ugly, but AT&T tried 351 00:21:25,080 --> 00:21:27,080 Speaker 1: to handle it, at least in a way that didn't 352 00:21:27,119 --> 00:21:30,400 Speaker 1: turn it into a PR nightmare.
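The stuck reboot cycle lends itself to a tiny sketch: if a neighbor's status messages arrive faster than a switch can finish its reset, the reset keeps getting re-triggered and the switch never recovers. This is purely illustrative, with made-up timing numbers; it is not AT&T's actual 4ESS switching code.

```python
# Toy model of the reboot loop: a neighbor's message landing mid-reset
# re-triggers the reset, so the switch never comes back up.
def stuck_in_reset(reset_duration: float, message_interval: float,
                   sim_time: float) -> bool:
    """Return True if the switch is still resetting when the sim ends."""
    t = 0.0
    reset_until = reset_duration           # the switch starts a reset at t=0
    while t < sim_time:
        t += message_interval              # next status message arrives
        if t < reset_until:
            # Buggy fast path: handling a message mid-reset restarts the reset.
            reset_until = t + reset_duration
        else:
            return False                   # reset finished; switch recovered
    return True                            # still stuck when time ran out

# Messages every 2 time units against a 3-unit reset: stuck forever.
print(stuck_in_reset(3.0, 2.0, 100.0))     # True
# Slow the messages back down (every 5 units) and the switch recovers.
print(stuck_in_reset(3.0, 5.0, 100.0))     # False
```

The sped-up code effectively pushed the message interval below the reset time, which is roughly the "caught up in itself" condition described above.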
Not so with Intel. 353 00:21:31,080 --> 00:21:33,880 Speaker 1: That's what brings us to the Pentium problem. I don't 354 00:21:33,920 --> 00:21:36,760 Speaker 1: know if you guys remember when Pentium processors first came out, 355 00:21:36,760 --> 00:21:41,080 Speaker 1: but they were a big deal. It was a redesign 356 00:21:41,160 --> 00:21:43,480 Speaker 1: of the architecture of the microprocessor and it was meant 357 00:21:43,520 --> 00:21:47,159 Speaker 1: to really speed things up. Well, Intel had a massive 358 00:21:47,240 --> 00:21:51,159 Speaker 1: nightmare in nineteen ninety four thanks to a flaw in 359 00:21:51,240 --> 00:21:56,439 Speaker 1: the entire first generation of Pentium processors. Now, when you 360 00:21:56,480 --> 00:21:59,920 Speaker 1: break it all down, a CPU is all about performing 361 00:22:00,080 --> 00:22:02,760 Speaker 1: mathematical operations on data, so it's kind of important that 362 00:22:02,760 --> 00:22:07,200 Speaker 1: it does this correctly. Unfortunately, the flaw on the Pentium 363 00:22:07,280 --> 00:22:10,200 Speaker 1: processors kind of messed that up, and the issue has 364 00:22:10,240 --> 00:22:13,600 Speaker 1: to do with floating point operations. So the predecessor to 365 00:22:13,640 --> 00:22:16,560 Speaker 1: the Pentium, the four eighty six, used a shift and 366 00:22:16,640 --> 00:22:21,040 Speaker 1: subtract algorithm for floating point operations, which was effective but 367 00:22:21,560 --> 00:22:25,080 Speaker 1: relatively slow compared to what Intel thought they could do 368 00:22:25,680 --> 00:22:32,280 Speaker 1: by totally redesigning that structure and using a lookup table approach.
Now, 369 00:22:32,320 --> 00:22:35,040 Speaker 1: the table was supposed to have one thousand and sixty 370 00:22:35,160 --> 00:22:39,080 Speaker 1: six entries programmed directly onto the logic array of the 371 00:22:39,080 --> 00:22:43,800 Speaker 1: Pentium processor, but for some reason only one thousand and 372 00:22:43,920 --> 00:22:48,280 Speaker 1: sixty one entries made it. Five entries went missing and 373 00:22:48,640 --> 00:22:52,520 Speaker 1: essentially returned an answer of zero instead of what they 374 00:22:52,520 --> 00:22:56,359 Speaker 1: were supposed to say, so if a calculation accessed one 375 00:22:56,359 --> 00:22:58,720 Speaker 1: of those missing cells, it got zero, even though that's 376 00:22:58,760 --> 00:23:02,240 Speaker 1: not the correct answer. All the first generation Pentiums went 377 00:23:02,280 --> 00:23:04,600 Speaker 1: out with this error because it was so minor that 378 00:23:04,680 --> 00:23:07,679 Speaker 1: it wasn't even picked up by Intel's quality control at 379 00:23:07,680 --> 00:23:12,560 Speaker 1: the time. Now, processors worked just fine up to the 380 00:23:12,680 --> 00:23:16,399 Speaker 1: eighth decimal point. Beyond that things got messy, but for 381 00:23:16,480 --> 00:23:19,720 Speaker 1: most folks that wasn't a problem because they weren't doing 382 00:23:20,000 --> 00:23:24,159 Speaker 1: mathematical calculations that needed that level of precision. It just 383 00:23:24,400 --> 00:23:26,560 Speaker 1: wasn't a thing. In fact, there was only a one 384 00:23:26,560 --> 00:23:30,000 Speaker 1: in three hundred and sixty billion chance that this error 385 00:23:30,040 --> 00:23:34,000 Speaker 1: would cause a big enough problem to reach up to 386 00:23:34,080 --> 00:23:38,000 Speaker 1: the fourth decimal place. So most calculations that were simple 387 00:23:38,040 --> 00:23:41,840 Speaker 1: were bulletproof. You were fine.
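The scale of the error is easy to see with the most widely cited failing division. The "flawed" value below is hardcoded from contemporary reports of what first generation Pentiums returned, since any modern or fixed chip divides correctly and can't reproduce the bug:

```python
# The famous FDIV test case: 4195835 / 3145727.
numerator, denominator = 4195835, 3145727

correct = numerator / denominator      # ~1.333820449, what a 486 would give
flawed = 1.333739068902037589          # reported first-generation Pentium result

abs_err = abs(correct - flawed)
print(f"correct: {correct:.9f}")
print(f"flawed:  {flawed:.9f}")
print(f"error:   {abs_err:.2e}")       # ~8e-05: wrong from the 4th decimal on
```

For everyday arithmetic that error is invisible; for number theory work like Nicely's, it's fatal.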
But if you needed that precision, 388 00:23:42,440 --> 00:23:45,760 Speaker 1: if you needed that really fine degree, that's when you 389 00:23:45,800 --> 00:23:50,400 Speaker 1: would encounter the flaw. And that happened because there are math 390 00:23:50,480 --> 00:23:54,280 Speaker 1: professors in this world, and one of those, Thomas Nicely, 391 00:23:55,040 --> 00:23:58,000 Speaker 1: discovered in October nineteen ninety four that he was getting 392 00:23:58,080 --> 00:24:01,280 Speaker 1: errors because of this issue. He needed the processor to 393 00:24:01,320 --> 00:24:05,639 Speaker 1: work correctly, and so he contacted Intel about the problem. 394 00:24:06,280 --> 00:24:09,480 Speaker 1: And this is where we take a moment to acknowledge 395 00:24:09,480 --> 00:24:11,960 Speaker 1: there's a right way and a wrong way to handle 396 00:24:12,680 --> 00:24:16,240 Speaker 1: an issue that's your fault. Intel decided to go the 397 00:24:16,320 --> 00:24:19,280 Speaker 1: wrong way. My opinion is if you make a mistake, 398 00:24:19,320 --> 00:24:21,719 Speaker 1: it's usually a good idea to just own up to 399 00:24:21,760 --> 00:24:25,000 Speaker 1: it and try to make it better. But Intel's response 400 00:24:25,119 --> 00:24:26,840 Speaker 1: was more along the lines of, yeah, we didn't think 401 00:24:26,840 --> 00:24:29,480 Speaker 1: it was a big deal. And then Intel made other 402 00:24:29,520 --> 00:24:33,320 Speaker 1: PR blunders, because people began to hear, hey, that 403 00:24:33,359 --> 00:24:37,359 Speaker 1: Pentium processor in the computer that you just bought 404 00:24:37,400 --> 00:24:41,560 Speaker 1: doesn't work properly. So people wanted to get replacements, but 405 00:24:41,840 --> 00:24:44,240 Speaker 1: Intel said, oh, we're only going to replace them 406 00:24:44,480 --> 00:24:47,600 Speaker 1: if you can prove that the mistakes that it makes 407 00:24:47,960 --> 00:24:52,080 Speaker 1: affect you in some meaningful way.
So they weren't 408 00:24:52,119 --> 00:24:55,320 Speaker 1: denying that there was a problem. They were just saying, hey, 409 00:24:55,400 --> 00:24:58,320 Speaker 1: unless you can prove the problem affects you, we don't care. 410 00:24:59,240 --> 00:25:02,280 Speaker 1: That didn't go well. If you create a product and 411 00:25:02,320 --> 00:25:04,600 Speaker 1: you market it as the future of computing and then 412 00:25:04,640 --> 00:25:07,360 Speaker 1: it's discovered there's a flaw in the design, and then 413 00:25:07,400 --> 00:25:09,600 Speaker 1: you say we'll replace it, but only if you prove 414 00:25:09,680 --> 00:25:13,720 Speaker 1: you deserve it, it doesn't tend to make your customer 415 00:25:13,760 --> 00:25:18,639 Speaker 1: base very happy. So ultimately Intel reversed that decision and 416 00:25:18,680 --> 00:25:21,840 Speaker 1: offered to replace the processor for anyone who wanted it 417 00:25:21,880 --> 00:25:26,320 Speaker 1: who had a first generation Pentium, and that mistake ended 418 00:25:26,359 --> 00:25:32,000 Speaker 1: up costing the company four hundred and seventy five million dollars. Yikes. 419 00:25:32,880 --> 00:25:36,600 Speaker 1: All right, now we're gonna switch gears over to Microsoft. First, 420 00:25:36,640 --> 00:25:39,760 Speaker 1: I think you could claim that all of Microsoft Bob, 421 00:25:40,240 --> 00:25:42,280 Speaker 1: the nineteen ninety five product that was supposed to be 422 00:25:42,320 --> 00:25:46,400 Speaker 1: an easy, accessible computer interface, was really just a massive 423 00:25:46,440 --> 00:25:51,320 Speaker 1: software bug. I mean it introduced Comic Sans, for goodness sakes. 424 00:25:52,000 --> 00:25:55,560 Speaker 1: The cluttered organization system, the lack of meaningful security, and 425 00:25:55,640 --> 00:26:00,760 Speaker 1: numerous other issues plagued that software.
But we did an 426 00:26:00,920 --> 00:26:03,440 Speaker 1: entire episode of tech Stuff about Microsoft Bob a couple 427 00:26:03,400 --> 00:26:05,359 Speaker 1: of years ago, so I'm not going to dwell on 428 00:26:05,440 --> 00:26:08,280 Speaker 1: it anymore, but if you want to hear more about it, 429 00:26:09,000 --> 00:26:12,080 Speaker 1: go find that episode. It was a fun one. Now, 430 00:26:12,080 --> 00:26:16,159 Speaker 1: in two thousand and seven, Microsoft experienced a massive headache 431 00:26:16,800 --> 00:26:19,920 Speaker 1: when a bug on their servers notified thousands of Windows 432 00:26:19,960 --> 00:26:22,800 Speaker 1: customers that they were filthy, dirty software pirates and they 433 00:26:22,840 --> 00:26:26,679 Speaker 1: should be punished. These included people who actually had legitimate, 434 00:26:27,240 --> 00:26:32,600 Speaker 1: legally purchased copies of Windows XP or Vista. So the 435 00:26:32,640 --> 00:26:37,400 Speaker 1: problem here was Microsoft had an initiative called Windows Genuine Advantage, 436 00:26:38,040 --> 00:26:40,240 Speaker 1: and it was a nice name for a strategy meant 437 00:26:40,240 --> 00:26:45,160 Speaker 1: to curtail operating system piracy. Essentially, it was a component 438 00:26:45,240 --> 00:26:48,120 Speaker 1: in Windows that would allow Microsoft to figure out if 439 00:26:48,160 --> 00:26:53,040 Speaker 1: the copy of Windows on any given computer was legit. 440 00:26:53,640 --> 00:26:56,480 Speaker 1: In other words, it was a DRM strategy. But in 441 00:26:56,520 --> 00:26:59,960 Speaker 1: two thousand and seven, a buggy install of software on 442 00:27:00,160 --> 00:27:06,080 Speaker 1: a server misidentified thousands of legitimate, law abiding customers as pirates.
443 00:27:06,760 --> 00:27:09,880 Speaker 1: For nineteen hours, the software just laid down the law, 444 00:27:10,040 --> 00:27:13,080 Speaker 1: and so people began to receive sternly written warnings about 445 00:27:13,119 --> 00:27:16,240 Speaker 1: their choice to indulge in bad behavior. And if you 446 00:27:16,280 --> 00:27:19,240 Speaker 1: were a Windows Vista customer, you had it the worst, 447 00:27:20,000 --> 00:27:22,960 Speaker 1: not just because you were using Windows Vista, which I 448 00:27:22,960 --> 00:27:25,800 Speaker 1: think we all agree was not one of the bright 449 00:27:25,880 --> 00:27:31,920 Speaker 1: points in Microsoft's operating system history, but also because Microsoft 450 00:27:32,000 --> 00:27:34,919 Speaker 1: had built in the ability for Windows Genuine Advantage to 451 00:27:35,080 --> 00:27:40,040 Speaker 1: switch off certain operating system features in Windows Vista if 452 00:27:40,080 --> 00:27:43,000 Speaker 1: it determined that the copy someone was using was a 453 00:27:43,040 --> 00:27:47,639 Speaker 1: pirated version. So it was misidentifying real versions as pirated ones, 454 00:27:48,040 --> 00:27:50,520 Speaker 1: turning off features. And these were people who had 455 00:27:50,680 --> 00:27:53,600 Speaker 1: bought legitimate copies. This, by the way, is one of 456 00:27:53,600 --> 00:27:57,800 Speaker 1: the big arguments people have against DRM. It has the 457 00:27:57,880 --> 00:28:02,879 Speaker 1: tendency to punish legitimate customers. And you feel like 458 00:28:03,000 --> 00:28:06,480 Speaker 1: you're stupid for buying a copy of a piece of 459 00:28:06,520 --> 00:28:09,120 Speaker 1: software rather than just stealing one that has had those 460 00:28:09,119 --> 00:28:13,720 Speaker 1: features or those defenses removed. Like, why? You're creating more 461 00:28:13,760 --> 00:28:17,800 Speaker 1: incentives for people to go out and get a pirated copy.
462 00:28:19,280 --> 00:28:22,239 Speaker 1: All right, so imagine you've purchased this legitimate copy of 463 00:28:22,320 --> 00:28:25,240 Speaker 1: Windows Vista. First of all, you already feel bad. Then 464 00:28:25,280 --> 00:28:27,960 Speaker 1: you're told you're a thief, so you feel worse. Then 465 00:28:28,040 --> 00:28:31,119 Speaker 1: someone remotely switches off several features of your operating system. 466 00:28:31,160 --> 00:28:34,760 Speaker 1: That was not a great PR message, so that was 467 00:28:34,800 --> 00:28:37,639 Speaker 1: a real issue. They did eventually fix it after that 468 00:28:37,720 --> 00:28:41,440 Speaker 1: nineteen hours, but by then people were already very upset. Also, 469 00:28:41,520 --> 00:28:45,880 Speaker 1: I don't want to just, you know, pile lots of 470 00:28:45,920 --> 00:28:49,040 Speaker 1: abuse onto Microsoft. I gotta talk about Apple here too. 471 00:28:50,280 --> 00:28:53,440 Speaker 1: So the company prides itself on a high standard of quality, 472 00:28:54,480 --> 00:28:57,000 Speaker 1: and in general it's pretty good about living up to 473 00:28:57,040 --> 00:28:59,720 Speaker 1: that standard of quality, depending upon your point of view 474 00:28:59,720 --> 00:29:02,720 Speaker 1: of their various products. But that hasn't stopped a few 475 00:29:02,800 --> 00:29:08,000 Speaker 1: clunkers getting through and into the public's hands. And that 476 00:29:08,120 --> 00:29:11,240 Speaker 1: was the case in twenty twelve with Apple Maps. If 477 00:29:11,240 --> 00:29:13,600 Speaker 1: you owned an iPhone back in twenty twelve when Apple 478 00:29:13,640 --> 00:29:17,040 Speaker 1: Maps came out, you may remember this problem. It was pretty 479 00:29:17,080 --> 00:29:21,080 Speaker 1: well publicized.
Maps were inaccurate, sometimes leaving out important details, like, 480 00:29:21,200 --> 00:29:23,960 Speaker 1: you know, a river or a lake between you and 481 00:29:24,000 --> 00:29:27,120 Speaker 1: your destination, things that might be important if, I don't 482 00:29:27,120 --> 00:29:30,480 Speaker 1: know, you don't drive an amphibious vehicle. It might not have 483 00:29:30,560 --> 00:29:34,800 Speaker 1: a road on there that's important, or might misidentify the location 484 00:29:35,000 --> 00:29:38,400 Speaker 1: of a historical landmark. For instance, it thought the Washington 485 00:29:38,440 --> 00:29:41,680 Speaker 1: Monument was across the street from where it is, but nope, 486 00:29:41,960 --> 00:29:45,640 Speaker 1: it's just where we left it. Despite all of Roland Emmerich's 487 00:29:46,240 --> 00:29:50,240 Speaker 1: best attempts to move it or destroy it, it's still there. 488 00:29:52,600 --> 00:29:55,160 Speaker 1: The real problem here was that the Apple software just 489 00:29:55,240 --> 00:29:59,600 Speaker 1: wasn't ready for public unveiling. It needed a lot more testing. 490 00:30:00,080 --> 00:30:02,600 Speaker 1: It was trying to play catch up to Google Maps, 491 00:30:02,600 --> 00:30:05,560 Speaker 1: but Google had the advantage of working with companies that 492 00:30:05,560 --> 00:30:09,480 Speaker 1: had been doing mapping software for years. Google acquired those 493 00:30:09,520 --> 00:30:12,320 Speaker 1: companies and acquired the expertise of people who had been working 494 00:30:12,360 --> 00:30:15,960 Speaker 1: on that software, and Apple was really just trying to 495 00:30:16,040 --> 00:30:19,000 Speaker 1: create their own version and get it out as fast 496 00:30:19,040 --> 00:30:21,240 Speaker 1: as it could.
But it got out a little too early, 497 00:30:21,920 --> 00:30:24,480 Speaker 1: and the company spent the next several months tweaking maps 498 00:30:24,480 --> 00:30:26,520 Speaker 1: and trying to keep control of the situation. But by 499 00:30:26,520 --> 00:30:29,360 Speaker 1: that time, many of Apple's fans, even the most devoted ones, 500 00:30:29,760 --> 00:30:31,640 Speaker 1: had kind of given up and switched over to Google 501 00:30:31,680 --> 00:30:35,880 Speaker 1: Maps instead. All right, we've got another break ahead of us, 502 00:30:36,000 --> 00:30:38,800 Speaker 1: but don't worry. Once that's done, we're back to conclude 503 00:30:38,800 --> 00:30:51,280 Speaker 1: our discussion about bad computer bugs. Now I'm going to 504 00:30:51,280 --> 00:30:55,320 Speaker 1: transition into some serious bugs. These are ones that either 505 00:30:55,440 --> 00:31:00,600 Speaker 1: threatened the lives of people or contributed to people dying. 506 00:31:01,400 --> 00:31:05,280 Speaker 1: The ones I've talked about up to now 507 00:31:05,400 --> 00:31:08,720 Speaker 1: have cost companies millions of dollars, but no one's life 508 00:31:08,840 --> 00:31:12,320 Speaker 1: was truly threatened. Unfortunately, that's not the case with all 509 00:31:12,320 --> 00:31:15,400 Speaker 1: software bugs. Now, a couple of bugs had the potential 510 00:31:15,720 --> 00:31:19,920 Speaker 1: to kill millions of people. One of those happened in 511 00:31:20,000 --> 00:31:25,160 Speaker 1: nineteen eighty, a famous bug, or at least a 512 00:31:25,200 --> 00:31:28,480 Speaker 1: faulty circuit, in NORAD's 513 00:31:28,520 --> 00:31:31,760 Speaker 1: computer system, which caused it to mistakenly conclude the US 514 00:31:31,840 --> 00:31:36,880 Speaker 1: was under nuclear attack from the Soviet Union.
So displays 515 00:31:37,120 --> 00:31:40,400 Speaker 1: on NORAD systems showed seemingly random attacks, and they 516 00:31:40,400 --> 00:31:43,680 Speaker 1: didn't correspond with each other. So the display might show, hey, 517 00:31:43,720 --> 00:31:46,719 Speaker 1: there are two missiles heading over from the Soviet Union. No, 518 00:31:46,760 --> 00:31:50,720 Speaker 1: there are two hundred. No, there are fifty. No, there are three. And 519 00:31:50,880 --> 00:31:54,440 Speaker 1: it wasn't consistent, and command posts around the US all 520 00:31:54,480 --> 00:31:58,240 Speaker 1: had conflicting information, which led leaders to conclude the whole 521 00:31:58,280 --> 00:32:02,480 Speaker 1: thing was a regrettable computer error, and they were right 522 00:32:02,520 --> 00:32:05,680 Speaker 1: to do so. To be fair, they were kind of 523 00:32:05,680 --> 00:32:08,200 Speaker 1: prepared for this because there was another incident that had 524 00:32:08,200 --> 00:32:11,560 Speaker 1: actually happened in nineteen seventy nine that was way scarier, 525 00:32:12,240 --> 00:32:15,280 Speaker 1: and in that case, someone mistakenly inserted a training scenario 526 00:32:15,400 --> 00:32:17,479 Speaker 1: into the computer system that made it seem like the 527 00:32:17,480 --> 00:32:20,280 Speaker 1: Soviet Union had launched an all out nuclear attack on 528 00:32:20,320 --> 00:32:23,320 Speaker 1: the US. But that wasn't a bug. That was a 529 00:32:23,320 --> 00:32:25,680 Speaker 1: mistake on the part of a human who had accidentally 530 00:32:25,880 --> 00:32:29,760 Speaker 1: uploaded, or rather executed, the wrong command. It 531 00:32:29,840 --> 00:32:31,760 Speaker 1: didn't have anything to do with a flaw in the 532 00:32:31,760 --> 00:32:37,080 Speaker 1: computer system itself.
However, because that thing happened and everybody 533 00:32:37,720 --> 00:32:40,520 Speaker 1: was freaked out and then was able to determine that 534 00:32:40,560 --> 00:32:42,719 Speaker 1: in fact, it was a false alarm, it meant that 535 00:32:42,880 --> 00:32:47,360 Speaker 1: calmer heads could prevail in the nineteen eighty incident. Now, 536 00:32:47,800 --> 00:32:50,680 Speaker 1: the Soviets also had a close call just a few 537 00:32:50,720 --> 00:32:53,160 Speaker 1: years later. It was a bug in the early warning 538 00:32:53,200 --> 00:32:57,400 Speaker 1: detection software that the USSR was using in the early eighties, 539 00:32:57,680 --> 00:33:01,520 Speaker 1: and on September twenty third, nineteen eighty three, the Soviet Union 540 00:33:01,560 --> 00:33:04,640 Speaker 1: received an alert that the US had launched a nuclear 541 00:33:04,680 --> 00:33:09,520 Speaker 1: attack in the form of five nuclear warheads, technically two 542 00:33:09,560 --> 00:33:12,800 Speaker 1: different attacks. The first would have been a single nuclear 543 00:33:12,840 --> 00:33:16,760 Speaker 1: warhead and the second was four nuclear warheads, and this 544 00:33:16,960 --> 00:33:20,280 Speaker 1: was during a particularly stressful period in the history of 545 00:33:20,320 --> 00:33:23,800 Speaker 1: both countries and their relationship with each other, at the 546 00:33:23,840 --> 00:33:28,280 Speaker 1: height of the Cold War, nineteen eighty three. Now, fortunately, 547 00:33:30,200 --> 00:33:36,600 Speaker 1: Soviet Air Defense Forces Lieutenant Colonel Stanislav Petrov suspected that 548 00:33:36,640 --> 00:33:39,840 Speaker 1: this report was an error and that there was some 549 00:33:39,880 --> 00:33:43,000 Speaker 1: sort of bug in the software or a mistake in 550 00:33:43,080 --> 00:33:46,840 Speaker 1: the reporting system that caused this.
He gave a command 551 00:33:46,920 --> 00:33:49,960 Speaker 1: to hold off on any sort of retaliatory strike, which 552 00:33:49,960 --> 00:33:52,880 Speaker 1: would have initiated a full scale nuclear war had it happened. 553 00:33:53,760 --> 00:33:56,680 Speaker 1: Petrov was the officer in charge at a bunker that 554 00:33:56,760 --> 00:33:59,640 Speaker 1: served as the command center for this early warning system, 555 00:34:00,040 --> 00:34:03,880 Speaker 1: and he had said afterward that his reckoning was any 556 00:34:03,960 --> 00:34:07,280 Speaker 1: real attack would consist of hundreds of warheads, not five. 557 00:34:08,080 --> 00:34:11,759 Speaker 1: No one would start an attack with just five warheads, 558 00:34:12,040 --> 00:34:13,759 Speaker 1: so it was more likely to be an error than 559 00:34:13,760 --> 00:34:16,600 Speaker 1: a genuine attack, So he gave the command to wait 560 00:34:16,719 --> 00:34:20,000 Speaker 1: until the reported missiles would pass into the range of radar, 561 00:34:20,520 --> 00:34:24,200 Speaker 1: which only extended as far as the horizon, so if 562 00:34:24,440 --> 00:34:26,520 Speaker 1: it had in fact been a real attack, it would 563 00:34:26,520 --> 00:34:30,279 Speaker 1: have potentially limited the Soviet Union's ability to respond. But 564 00:34:30,680 --> 00:34:36,560 Speaker 1: no missile showed up, and he was vindicated in his decision. Now, 565 00:34:36,600 --> 00:34:38,640 Speaker 1: the cause of the false alarm in this case was 566 00:34:39,280 --> 00:34:44,680 Speaker 1: a combination of factors that the designers didn't anticipate, which 567 00:34:44,760 --> 00:34:48,920 Speaker 1: largely consisted of sunlight hitting high altitude clouds at a 568 00:34:48,960 --> 00:34:53,400 Speaker 1: particular angle from a particular perspective of the satellites. So 569 00:34:53,440 --> 00:34:59,840 Speaker 1: the satellites misidentified that reflection as a warhead. 
Now, 570 00:34:59,880 --> 00:35:02,040 Speaker 1: the Soviets were able to address this error 571 00:35:02,040 --> 00:35:05,680 Speaker 1: in the future by adding another step in which these 572 00:35:05,719 --> 00:35:10,120 Speaker 1: satellites would cross reference data from other geostationary satellites to 573 00:35:10,200 --> 00:35:15,160 Speaker 1: make certain that they were identifying actual rockets as opposed 574 00:35:15,239 --> 00:35:21,200 Speaker 1: to high altitude clouds. Now, there are several cases of 575 00:35:22,200 --> 00:35:27,320 Speaker 1: software bugs leading to actual deaths. For example, the Therac 576 00:35:27,440 --> 00:35:29,840 Speaker 1: twenty five was such a case. Now, that was a 577 00:35:29,960 --> 00:35:33,080 Speaker 1: radiation therapy machine that could deliver two different modes of 578 00:35:33,200 --> 00:35:37,719 Speaker 1: radiation treatments. The first was a low powered direct electron 579 00:35:37,800 --> 00:35:41,520 Speaker 1: beam and the second was a megavolt X ray beam. Now, 580 00:35:41,520 --> 00:35:44,000 Speaker 1: the X ray beam was far more intense and it 581 00:35:44,080 --> 00:35:48,360 Speaker 1: required physicians to provide shielding to patients to limit exposure 582 00:35:48,440 --> 00:35:51,840 Speaker 1: to the beam. But the Therac twenty five had inherited 583 00:35:51,920 --> 00:35:55,719 Speaker 1: its code from its predecessor, which had different hardware constraints. 584 00:35:56,280 --> 00:36:00,439 Speaker 1: Now the new machine meant that these constraints weren't there, 585 00:36:00,480 --> 00:36:03,279 Speaker 1: and it created a deadly problem.
If operators changed the 586 00:36:03,360 --> 00:36:06,560 Speaker 1: machine's mode too quickly from one to the other, it 587 00:36:06,560 --> 00:36:09,799 Speaker 1: would actually send two sets of instructions to the processor, 588 00:36:09,880 --> 00:36:12,520 Speaker 1: one for each mode of operation, and whichever set of 589 00:36:12,520 --> 00:36:16,640 Speaker 1: instructions reached the processor first, that's what the machine would 590 00:36:16,680 --> 00:36:20,920 Speaker 1: switch to. So let's say you've been operating the Therac 591 00:36:21,120 --> 00:36:25,279 Speaker 1: twenty five in the megavolt X ray mode, but 592 00:36:25,480 --> 00:36:27,319 Speaker 1: now you're going to have a patient come in. You 593 00:36:27,400 --> 00:36:30,719 Speaker 1: need to administer radiation therapy, so you want to switch 594 00:36:30,760 --> 00:36:33,720 Speaker 1: it to the low power electron beam. You switch it too quickly, 595 00:36:34,360 --> 00:36:37,040 Speaker 1: it sends two sets of instructions to the processor, and 596 00:36:37,080 --> 00:36:40,319 Speaker 1: the one that arrives first is the megavolt X ray instruction, so 597 00:36:40,360 --> 00:36:44,440 Speaker 1: instead of switching, it stays on the 598 00:36:44,480 --> 00:36:50,439 Speaker 1: more intense, deadlier radiation. The tragic news is this did 599 00:36:50,560 --> 00:36:55,879 Speaker 1: happen several times. Six patients were documented as dying from 600 00:36:55,880 --> 00:36:59,160 Speaker 1: complications due to radiation poisoning from Therac twenty five 601 00:36:59,200 --> 00:37:02,880 Speaker 1: machines between nineteen eighty five and nineteen eighty six, and 602 00:37:02,920 --> 00:37:06,280 Speaker 1: while the machine would send error messages when these conditions 603 00:37:06,280 --> 00:37:10,239 Speaker 1: were present, the documentation for the machine didn't explain what 604 00:37:10,280 --> 00:37:12,600 Speaker 1: the errors meant.
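That whichever-instruction-lands-first behavior is a classic race condition, and it can be sketched in a few lines. The names and timings below are invented; this is an illustration of the flaw, not the Therac twenty five's actual code. Mode requests are applied by slow background setup tasks with no interlock, so a stale high-power request can land after a newer low-power one:

```python
import threading
import time

class BeamController:
    """Toy model: a mode change takes effect only when a slow background
    'setup' task finishes, and nothing stops a stale task from
    overwriting a newer request -- last writer wins."""

    def __init__(self):
        self.mode = "megavolt_xray"
        self._tasks = []

    def request_mode(self, mode, setup_seconds):
        # Magnet/turntable setup takes time; the mode is only applied
        # when that background work completes.
        def apply():
            time.sleep(setup_seconds)
            self.mode = mode
        task = threading.Thread(target=apply)
        task.start()
        self._tasks.append(task)

    def wait(self):
        for task in self._tasks:
            task.join()

ctl = BeamController()
ctl.request_mode("megavolt_xray", setup_seconds=0.2)        # still in flight...
ctl.request_mode("low_power_electron", setup_seconds=0.05)  # ...operator edits quickly
ctl.wait()
print(ctl.mode)   # "megavolt_xray": the stale high-power request finished last
```

A proper design would cancel or serialize in-flight requests (or use a hardware interlock), so a newer request could never be overwritten by an older one.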
The documentation didn't say, hey, if you get 605 00:37:12,600 --> 00:37:15,960 Speaker 1: this error, it means that you've switched modes too quickly 606 00:37:16,080 --> 00:37:20,120 Speaker 1: and you need to address this. So, since operators weren't 607 00:37:20,160 --> 00:37:24,000 Speaker 1: told that this was necessarily a hazardous condition, they would 608 00:37:24,040 --> 00:37:28,000 Speaker 1: just clear the error and proceed, and there were deadly results. 609 00:37:29,360 --> 00:37:33,360 Speaker 1: In a similar vein, in Panama City, Panama, there was 610 00:37:33,400 --> 00:37:37,680 Speaker 1: an incident involving a Cobalt sixty system, actually several incidents 611 00:37:37,680 --> 00:37:41,320 Speaker 1: involving this Cobalt sixty system, that was running therapy planning 612 00:37:41,400 --> 00:37:45,319 Speaker 1: software made by a company called Multi Data Systems International. Now, 613 00:37:45,320 --> 00:37:48,360 Speaker 1: the software's purpose was to calculate the amount of radiation 614 00:37:48,680 --> 00:37:53,480 Speaker 1: that cancer patients should receive in radiation therapy sessions. Now, 615 00:37:53,600 --> 00:37:57,480 Speaker 1: during these radiation therapy sessions, the therapists were meant to 616 00:37:57,520 --> 00:38:02,320 Speaker 1: place metal shields on the patient to protect healthy tissue 617 00:38:02,320 --> 00:38:07,560 Speaker 1: from radiation damage, and the software would allow therapists to 618 00:38:07,719 --> 00:38:11,680 Speaker 1: draw where those shields were on 619 00:38:11,800 --> 00:38:15,239 Speaker 1: the patient to indicate where the shields were present. But 620 00:38:15,440 --> 00:38:18,680 Speaker 1: they could only draw up to four shields, and the 621 00:38:18,719 --> 00:38:22,200 Speaker 1: doctors in Panama wanted to use five shields for particular 622 00:38:22,360 --> 00:38:26,120 Speaker 1: therapy sessions.
They were overloaded, they had a long waiting 623 00:38:26,120 --> 00:38:28,239 Speaker 1: list of patients, and they were trying to make things 624 00:38:28,280 --> 00:38:32,120 Speaker 1: more efficient, and they discovered that they could kind of 625 00:38:32,160 --> 00:38:36,680 Speaker 1: work around this limitation of four shields by drawing a 626 00:38:36,719 --> 00:38:38,959 Speaker 1: design on the computer screen as if they were using 627 00:38:39,080 --> 00:38:42,000 Speaker 1: just one large shield that has a hole in the 628 00:38:42,000 --> 00:38:43,880 Speaker 1: middle of it. And so what they would do is 629 00:38:43,920 --> 00:38:46,719 Speaker 1: they would arrange the five shields to essentially be in 630 00:38:46,760 --> 00:38:49,880 Speaker 1: the same sort of shape with the middle of it 631 00:38:49,960 --> 00:38:52,440 Speaker 1: being open, so that they could have the radiation therapy 632 00:38:52,480 --> 00:38:57,840 Speaker 1: pass through it. But they didn't realize that the software 633 00:38:57,880 --> 00:39:00,200 Speaker 1: had a bug in it, and that bug was if 634 00:39:00,200 --> 00:39:03,239 Speaker 1: you drew the hole in one direction, you'd get the 635 00:39:03,280 --> 00:39:06,200 Speaker 1: correct dose of radiation, but if you drew it in 636 00:39:06,239 --> 00:39:11,240 Speaker 1: the other direction, so like clockwise versus counterclockwise, the software 637 00:39:11,239 --> 00:39:15,320 Speaker 1: would recommend a dosage twice as strong as what was needed, 638 00:39:15,640 --> 00:39:19,719 Speaker 1: and the result was devastating. Eight patients died as a 639 00:39:19,719 --> 00:39:23,279 Speaker 1: result of this, and another twenty received doses high enough 640 00:39:23,280 --> 00:39:27,359 Speaker 1: to potentially cause health problems.
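Why would drawing direction matter at all? One plausible mechanism, offered here only as a sketch of the general principle and not as Multi Data's actual algorithm: the signed "shoelace" area of a polygon flips sign with winding direction, so software that fails to normalize that sign before feeding a shield outline into a dose calculation can get very different answers for clockwise versus counterclockwise input.

```python
def signed_area(points):
    """Shoelace formula: positive for counterclockwise vertex order,
    negative for clockwise -- the same shape gives opposite signs."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        area += x1 * y2 - x2 * y1
    return area / 2.0

square_ccw = [(0, 0), (2, 0), (2, 2), (0, 2)]   # drawn counterclockwise
square_cw = list(reversed(square_ccw))          # same square, drawn clockwise

print(signed_area(square_ccw))   # 4.0
print(signed_area(square_cw))    # -4.0
```

Robust geometry code takes the absolute value, or detects and normalizes the winding order, before using an area in any downstream calculation.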
Later on, the physicians were 641 00:39:27,400 --> 00:39:30,640 Speaker 1: actually arrested and brought up on murder charges because they 642 00:39:30,640 --> 00:39:34,640 Speaker 1: were supposed to double check all calculations by hand to 643 00:39:34,800 --> 00:39:36,640 Speaker 1: ensure that they were going to give the proper dose 644 00:39:36,680 --> 00:39:40,880 Speaker 1: of radiation treatment. So while the software was calculating the 645 00:39:41,200 --> 00:39:46,200 Speaker 1: incorrect dose, the physicians were responsible for making sure that 646 00:39:46,480 --> 00:39:48,960 Speaker 1: any dose that was calculated was in fact the correct one, 647 00:39:48,960 --> 00:39:50,840 Speaker 1: and they failed to do so, or at least that 648 00:39:51,000 --> 00:39:55,560 Speaker 1: was the charge. There are also bugs that involve military 649 00:39:55,600 --> 00:39:58,399 Speaker 1: applications that have resulted in the loss of life. During 650 00:39:58,440 --> 00:40:01,480 Speaker 1: the Persian Gulf War, an Iraqi fired Scud missile 651 00:40:01,520 --> 00:40:04,800 Speaker 1: hit a US base in Saudi Arabia and it killed 652 00:40:04,920 --> 00:40:08,680 Speaker 1: twenty eight soldiers. Now the base had detected the missile 653 00:40:09,200 --> 00:40:12,320 Speaker 1: and had launched and fired a Patriot missile in return. 654 00:40:12,760 --> 00:40:14,920 Speaker 1: The purpose of the Patriot missile was to intercept and 655 00:40:14,960 --> 00:40:18,359 Speaker 1: destroy incoming missiles, and the way a Patriot missile did 656 00:40:18,360 --> 00:40:22,600 Speaker 1: this was to use radar pulses to guide trajectory calculations 657 00:40:22,840 --> 00:40:25,600 Speaker 1: so that it would end up getting close to the 658 00:40:25,600 --> 00:40:28,600 Speaker 1: incoming missile.
This is harder than it sounds because both 659 00:40:28,640 --> 00:40:31,759 Speaker 1: missiles are moving very, very quickly, so it needed very 660 00:40:31,800 --> 00:40:36,400 Speaker 1: precise information in order to adjust its trajectory properly and 661 00:40:36,480 --> 00:40:39,520 Speaker 1: make sure it was on target. Now, once it gets 662 00:40:39,520 --> 00:40:43,600 Speaker 1: within range, which is between five and ten meters, I 663 00:40:43,600 --> 00:40:48,319 Speaker 1: think, it would then fire out one thousand pellets from 664 00:40:48,320 --> 00:40:50,680 Speaker 1: the Patriot missile at high velocity, with the goal of 665 00:40:51,120 --> 00:40:55,799 Speaker 1: causing the incoming warhead to explode prematurely. In this case, 666 00:40:55,840 --> 00:40:59,280 Speaker 1: the Patriot missile missed, and the military investigated the issue 667 00:40:59,320 --> 00:41:01,280 Speaker 1: in the wake of the loss of life and found 668 00:41:01,320 --> 00:41:05,160 Speaker 1: a problem with the software guiding the Patriot missile. And 669 00:41:05,239 --> 00:41:06,960 Speaker 1: it was a problem that actually the military kind of 670 00:41:07,040 --> 00:41:09,400 Speaker 1: knew about already. So one of the processes in the 671 00:41:09,400 --> 00:41:13,120 Speaker 1: Patriot's programming was to convert time into floating point values 672 00:41:13,680 --> 00:41:18,799 Speaker 1: for increased accuracy, but not all subroutines that depended on 673 00:41:18,840 --> 00:41:25,040 Speaker 1: tracking time did this. Some of them remained in clock units 674 00:41:25,120 --> 00:41:28,799 Speaker 1: rather than floating point values, which meant that they would 675 00:41:28,800 --> 00:41:31,040 Speaker 1: get out of sync after a while. There'd be a 676 00:41:31,040 --> 00:41:34,880 Speaker 1: disagreement in various subroutines as to how much time had 677 00:41:34,920 --> 00:41:37,719 Speaker 1: actually passed. 
And like I said, the military was aware 678 00:41:37,719 --> 00:41:40,880 Speaker 1: of this issue and they had a workaround which was 679 00:41:41,239 --> 00:41:45,240 Speaker 1: not ideal. The workaround was you would occasionally reboot the system, 680 00:41:45,480 --> 00:41:48,239 Speaker 1: which would reset the clocks and synchronize them, but over 681 00:41:48,280 --> 00:41:50,319 Speaker 1: time they would fall out of sync because they're not 682 00:41:50,360 --> 00:41:53,279 Speaker 1: tracking time the same way. And since there was no 683 00:41:53,360 --> 00:41:56,480 Speaker 1: hard and fast rule as to how frequently you'd reset 684 00:41:56,520 --> 00:41:59,839 Speaker 1: the system, problems like this one were possible, and in fact, 685 00:42:00,040 --> 00:42:02,160 Speaker 1: in this case it did happen. So prior to this 686 00:42:02,200 --> 00:42:06,239 Speaker 1: particular incident, that specific Patriot system had been running for 687 00:42:06,360 --> 00:42:11,239 Speaker 1: one hundred hours without a reboot, and the clock disagreement 688 00:42:11,480 --> 00:42:14,400 Speaker 1: amounted to about one third of a second. Now that 689 00:42:14,440 --> 00:42:16,279 Speaker 1: seems like it's no time at all. One third of 690 00:42:16,320 --> 00:42:19,759 Speaker 1: a second is so short, but a Scud missile's top 691 00:42:19,800 --> 00:42:23,880 Speaker 1: speed is about one point one miles per second or 692 00:42:23,920 --> 00:42:27,080 Speaker 1: one point seven kilometers per second, which means if you 693 00:42:27,120 --> 00:42:29,200 Speaker 1: take a third of a second, the missile could travel 694 00:42:29,280 --> 00:42:32,719 Speaker 1: more than five hundred meters. And since a Patriot needs 695 00:42:32,719 --> 00:42:34,839 Speaker 1: to be within ten meters of a target to destroy it, 696 00:42:34,920 --> 00:42:39,160 Speaker 1: that resulted in a catastrophic failure. 
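That arithmetic can be checked in a few lines. The per-tick error figure below (about 0.000000095 seconds each time one tenth of a second is chopped into a 24-bit register) comes from the US government's published investigation of the incident; the speed is the rough 1.7 kilometers per second mentioned above:

```python
# Back-of-the-envelope check of the Patriot clock-drift numbers.
TICK = 0.1               # system time counted in tenths of a second
PER_TICK_ERROR = 9.5e-8  # seconds lost per tick when 1/10 is chopped
                         # to 24 bits (figure from the GAO report)

seconds_running = 100 * 3600        # 100 hours with no reboot
ticks = seconds_running / TICK      # 3,600,000 clock ticks
drift = ticks * PER_TICK_ERROR      # ~0.34 s of clock disagreement

scud_speed = 1700                   # m/s, roughly 1.7 km/s
miss = drift * scud_speed           # ~580 m of tracking error
print(round(drift, 2), round(miss))
```

Against an intercept window of only five to ten meters, a better than half-kilometer error in where the system looked for the incoming missile explains the failure.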
So software bugs can 697 00:42:39,239 --> 00:42:42,239 Speaker 1: be a matter of life or death. It's not all 698 00:42:42,520 --> 00:42:45,479 Speaker 1: just, "Hey, this irritating thing meant people couldn't make long 699 00:42:45,520 --> 00:42:50,960 Speaker 1: distance phone calls," or "this issue caused my computer to 700 00:42:51,000 --> 00:42:53,360 Speaker 1: start writing massive amounts of data to its hard drive." 701 00:42:54,239 --> 00:42:58,000 Speaker 1: And this is why it's so important to have really 702 00:42:58,560 --> 00:43:02,759 Speaker 1: qualified QA personnel go through code and make sure it's 703 00:43:02,800 --> 00:43:05,239 Speaker 1: doing what it's supposed to do, because the problems that 704 00:43:05,280 --> 00:43:08,120 Speaker 1: can arise can be non-trivial and in fact life 705 00:43:08,200 --> 00:43:12,319 Speaker 1: or death situations depending upon the application of technology. So 706 00:43:12,600 --> 00:43:15,920 Speaker 1: technology is a fascinating thing. It's a wonderful thing. It 707 00:43:15,960 --> 00:43:20,000 Speaker 1: has benefited us in ways that I can't even begin 708 00:43:20,080 --> 00:43:23,000 Speaker 1: to describe. It's just too broad a topic, and it's 709 00:43:23,040 --> 00:43:25,799 Speaker 1: something I've been tackling for, you know, eight years, and 710 00:43:25,840 --> 00:43:29,520 Speaker 1: I haven't even gotten close to getting toward a finishing point. 711 00:43:29,640 --> 00:43:33,399 Speaker 1: So I don't want to suggest that technology is bad, 712 00:43:33,440 --> 00:43:37,799 Speaker 1: but we definitely have the need to check, double check, 713 00:43:37,840 --> 00:43:40,120 Speaker 1: and triple check all this work to make certain things 714 00:43:40,160 --> 00:43:43,400 Speaker 1: are working properly before we release them out into the wild. 
715 00:43:43,400 --> 00:43:48,200 Speaker 1: That particularly applies if, again, you are reusing old code 716 00:43:48,520 --> 00:43:53,239 Speaker 1: or old components in a new way, because you have 717 00:43:53,280 --> 00:43:55,400 Speaker 1: to make absolutely certain that there's not going to be 718 00:43:55,600 --> 00:44:00,760 Speaker 1: some unintended problem that results when a new form factor 719 00:44:01,000 --> 00:44:05,520 Speaker 1: is using old code. Alrighty. And that was our Bad 720 00:44:05,560 --> 00:44:08,320 Speaker 1: Computer Bugs episode, which I still haven't listened back to yet. 721 00:44:08,400 --> 00:44:12,920 Speaker 1: So maybe I talked about bedbugs. I probably talked about 722 00:44:12,960 --> 00:44:16,840 Speaker 1: Grace Hopper in that episode. I just did an episode about 723 00:44:16,840 --> 00:44:21,000 Speaker 1: Grace Hopper earlier this year for Memorial Day, remembering 724 00:44:21,080 --> 00:44:25,200 Speaker 1: Grace Hopper and her contributions because of the somewhat apocryphal 725 00:44:25,280 --> 00:44:30,200 Speaker 1: story that she coined the term computer bug, but I 726 00:44:30,239 --> 00:44:32,080 Speaker 1: have a feeling I might have addressed that in this 727 00:44:32,200 --> 00:44:35,400 Speaker 1: episode too. Hopefully I did. If not, make sure you 728 00:44:35,960 --> 00:44:38,800 Speaker 1: find that episode about Grace Hopper. She was a truly 729 00:44:38,920 --> 00:44:44,720 Speaker 1: phenomenal innovator and computer scientist, someone who is extremely important 730 00:44:44,760 --> 00:44:47,600 Speaker 1: in the history of computer science, especially here in the 731 00:44:47,680 --> 00:44:51,000 Speaker 1: United States. Check that out and I hope you enjoyed 732 00:44:51,080 --> 00:44:53,839 Speaker 1: this classic episode. I hope you're all well, and I'll 733 00:44:53,880 --> 00:45:02,879 Speaker 1: talk to you again really soon. 
Tech Stuff is an 734 00:45:02,880 --> 00:45:08,400 Speaker 1: iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, 735 00:45:08,560 --> 00:45:11,720 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.