Speaker 1: Welcome to TechStuff, a production of iHeartRadio's How Stuff Works.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and I'm probably still running all over the United States right now on special projects. And for that reason, we are going to enjoy another classic episode of TechStuff. This episode is called Bad Computer Bugs, and it's all about software bugs that were some of the worst to ever go out with shipped products. So let's go back and listen to this classic episode.

I have to address a bit of apocryphal history, and regrettably it's a story that we've repeated on TechStuff. So I'm sad to admit that I was complicit, although unknowingly, in the spread of misinformation, and that all has to do with the origin of the term bug to describe a flaw in programming. So here's the popular story, the one that we have accidentally promoted on TechStuff without knowing that we were in the wrong.
It goes that Grace Hopper, who was an early computer scientist who rose to the rank of rear admiral in the U.S. Navy, coined the phrase bug after discovering a moth gumming up Harvard's Mark II calculator, a literal bug. Generally speaking, the story tends to be set in nineteen forty-five, and there is even a note in the log book that reads "first actual case of bug being found" that's attributed to Grace Hopper. But there are several points that are wrong in this story. First, the year: it didn't happen in nineteen forty-five. It happened on September ninth, nineteen forty-seven. We know because there's a log book. The log book that marks the incident not only has the note, it actually has the moth taped into the book itself. It's taped onto the page. Second, Grace Hopper wasn't the person to discover the moth or make that log entry. She did tell the story about the moth several times, but it wasn't in the context of finding it or logging it. She just told the story that, yeah, we really did have a bug in the system. And most importantly, the word bug had already been used to describe design flaws for decades before the Mark II was even designed.
In fact, if you look at the log book, this makes sense. It says "first actual case of bug being found." That sentence doesn't make sense unless you've already used the word bug to describe a flaw, because you wouldn't write "first actual case of bug being found" otherwise. The wording doesn't make sense, the context makes no sense. Sadly, there are documented quotes dating back to the nineteenth century using the word bug to mean a design fault, and it could go back even further than that. So it is with much regret that I admit I have unwittingly contributed to a bit of misleading folklore making the rounds. But I'm glad I can take this opportunity to address it.

All right, so let's talk about design bugs, and I'll be covering several goofs, mistakes, flubs, flaws, and outright catastrophes in this episode. But one thing I'm not necessarily going to cover is software vulnerabilities that were later exploited, either by opportunistic hackers or white hats who are just trying to improve system security.
Those vulnerabilities are common in many types of software and arise not just through mistakes but sometimes simple oversights, and I think it might be more fun to look at some real bugs, like stuff that made things go wrong, stuff that may have rendered a program defunct or otherwise caused headaches. Now, I'm going to make an exception to this. I'm going to start off with the Ping of Death, and I only mention it because it has an awesome name. Now, this flaw caused headaches back in nineteen ninety-six. It was a flaw in IP fragmentation reassembly code, and it became possible to crash lots of different types of computers using different operating systems, although Windows machines were particularly vulnerable, and this particular flaw would make a Windows machine revert to the dreaded blue screen of death. And it all happened by sending a special ping packet over the Internet. So, for those of you who aren't familiar with what that is, a ping is essentially a simple message that checks for a connection between two computers.
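To make that "special packet" concrete, here's a minimal sketch of the arithmetic the Ping of Death abused, with illustrative numbers: IPv4's 16-bit total-length field caps a packet at 65,535 bytes, but a maximal fragment offset plus an oversized final fragment reassembles into something bigger.

```python
# IPv4's total-length field is 16 bits, so no packet should ever
# exceed 65,535 bytes once its fragments are reassembled.
MAX_IPV4_PACKET = 65_535

# Fragment offsets are counted in 8-byte units; the largest
# encodable offset is 8,191 units, or 65,528 bytes into the packet.
FRAG_UNIT = 8
max_offset_bytes = 8_191 * FRAG_UNIT  # 65,528

# Only 7 more bytes would legally fit there, but nothing stopped a
# sender from attaching a bigger final fragment (size is illustrative):
final_fragment = 1_480

reassembled = max_offset_bytes + final_fragment  # 67,008 bytes
overflow = reassembled - MAX_IPV4_PACKET         # 1,473 bytes written past
                                                 # a buffer sized for a
                                                 # "maximum" packet
print(reassembled, overflow)
```

Reassembly code that trusted the 16-bit maximum wrote those extra bytes past the end of its buffer, which is what crashed so many machines.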
You send one ping from a computer to another one and look for a response, so that way you verify there is in fact a connection. You can also tell other things, like how fast that connection between those two computers is. Now, in this case, you would have to actually design a malformed ping request and send that to a target, and it would bring that target down. That's the only security vulnerability story I really wanted to focus on. The others are all just design flaws.

And let's begin with the bug that inspired me to do this episode in the first place, that Spotify bug I mentioned earlier. Ars Technica wrote a piece on it in November two thousand sixteen, but the problem seems to date back at least as far as June two thousand sixteen, and that's when a few savvy Spotify users noticed some unusual activity on their computers. And it took a little bit of detective work, but they discovered that Spotify was apparently generating a huge amount of data on a daily basis, like gigabytes of data per day.
And the culprit turned out to be a vacuum process for a database file containing the string mercury.db. Now, the vacuum process is the digital equivalent of vacuum sealing. It's meant to repack data so that it takes up less space on a drive. Now, this involves building a new file to maximize efficiency, which is a good thing, generally speaking. The problem was that Spotify's version was making it happen way too frequently, like on the order of once every few minutes, and that's not generally necessary. You don't need to rebuild a database file every few minutes to make sure it's the most efficient size it can be. So each rebuild represented a relatively small amount of data, but over time it added up, which meant that if you had Spotify on your computer, even if it was just running in the background, it would be generating gigabytes' worth of information rewriting this file over and over. Now, it wasn't filling up a hard drive. It was just overwriting the same file.
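That mercury.db file is an SQLite database, and the vacuum process here is SQLite's VACUUM command, which rebuilds the entire database into a fresh file. A minimal sketch of why each run rewrites so much data (the table, sizes, and row counts below are illustrative, not Spotify's actual schema):

```python
import os
import sqlite3
import tempfile

# Build a throwaway database standing in for Spotify's cache file.
path = os.path.join(tempfile.mkdtemp(), "mercury.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE cache (id INTEGER PRIMARY KEY, blob BLOB)")
con.executemany(
    "INSERT INTO cache (blob) VALUES (?)",
    [(b"x" * 1024,) for _ in range(2000)],  # ~2 MB of payload
)
con.commit()

con.execute("DELETE FROM cache WHERE id % 2 = 0")  # free half the pages
con.commit()
size_before = os.path.getsize(path)

# VACUUM copies every live page into a brand-new file and swaps it in,
# so the whole database gets rewritten even if little space is saved.
con.execute("VACUUM")
size_after = os.path.getsize(path)

print(size_before, size_after)
con.close()
```

Run every few minutes against a cache file hundreds of megabytes in size, that full rewrite is how a background music player can quietly generate gigabytes of disk writes per day.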
Now, if it had been filling up a hard drive, people would have noticed much earlier, and it wouldn't have just been savvy Spotify users, because you would suddenly notice, hey, I can't save anything to my hard drive because the thing's filling up. Instead, again, it was just sort of writing and deleting and writing and deleting the same file over and over again. And that probably doesn't sound like a big deal, but it is a problem if you're using a solid state drive, or SSD. So one of the drawbacks of an SSD is that over time it loses storage capacity. Like, you can store less data on an SSD over time. Now, by over time, I generally mean over a great deal of time and a lot of different data being written to it and overwritten. Generally speaking, most of us end up replacing our drives before we get to a point where the loss of capacity is a real issue.
But it's similar in a way to how a battery can lose its ability to hold a full charge after you've gone through lots of charging and discharging cycles. You know how a battery won't be able to hold as much even if it says it's up to a hundred percent, but a hundred percent doesn't last you as long as it used to? That's because its capacity to hold a full charge has decreased over time. But let's say you've got a program that's just constantly overwriting data to your drive. You might discover that your SSD's useful lifespan has been drastically reduced. So as I record this episode, Spotify has already rolled out an updated version of its desktop application, and that, by the way, is the only version of Spotify that was affected. If you use web-based Spotify or mobile Spotify, you're in the clear already. If you use a desktop version, as long as you have version 1.0.42 or later, you are fine. But if you did have that earlier version and you just had Spotify running in the background, chances are it was writing to your hard drive like crazy.
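To put rough numbers on that wear, SSDs are rated for a total volume of writes, often quoted as terabytes written, or TBW. A quick back-of-the-envelope sketch, with entirely hypothetical figures since real endurance varies by drive:

```python
# Hypothetical drive rated for 150 terabytes written (TBW).
tbw_rating_gb = 150 * 1000

def years_until_worn_out(gb_written_per_day: float) -> float:
    """Years before cumulative writes reach the endurance rating."""
    return tbw_rating_gb / gb_written_per_day / 365

typical = years_until_worn_out(20)        # ordinary desktop use
buggy = years_until_worn_out(20 + 100)    # plus ~100 GB/day of rewrites

print(round(typical, 1), round(buggy, 1))
```

Under those assumptions the drive's theoretical lifespan drops from roughly twenty years to under four, which is why constant background rewrites matter even when the file being rewritten stays small.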
So what about some of the other big bugs in computer history? Well, some of the real doozies involve our attempts to explore the final frontier. So we'll be talking about space a few times in this episode, and we'll start with an early US spacecraft. So first up is a nineteen sixty-two blunder involving the Mariner 1. Some backstory on this one. We're going to talk a lot about the Soviet Union in this episode too; it takes a couple of roles as we go on. But in this case, the then-USSR had launched Sputnik into orbit in nineteen fifty-seven, which really kicked off the space race and also was a big shot in the Cold War, because the Soviet Union was essentially saying, hey, we can launch this into space; we could also launch something at you. In response, the US had done sort of the same thing. They had launched some satellites into space, and the Mariner 1 was going to be a big, big feather in the cap of the US. The whole idea was to launch a probe that would be a flyby probe, and it would go by Venus.
So NASA, which was still a newly formed agency, was taking control of this, and the budget for this particular project was eighteen point five million dollars, which, if you were to adjust for inflation, would be almost a hundred fifty million dollars today. So, a hundred-fifty-million-dollar project to launch the Mariner 1 and have it fly by Venus. But as I'm sure you guys have figured out by now, based upon the topic of this podcast, not all went according to plan. Not long at all after the rocket lifted off from the launchpad, it began to veer off course, and neither the computer controls on the rocket nor manual controls back at HQ could correct for the problem. The rocket's course was such that it was going to take it over shipping lanes, which meant there could be a potential catastrophe, and so a range safety officer made the difficult call and issued the command to blow the whole thing up just shy of three hundred seconds after it launched. So what happened?
Why did it go off course in the first place? Well, there was a flaw in the spacecraft's guidance software which diverted the rocket, and no amount of commands from ground control could correct for it. After a lengthy investigation, NASA discovered the error was the result of a mistake transcribing handwritten notes into computer code. So someone just took some handwritten notes and misinterpreted one of them, and that one mistake was enough to crash the rocket, or to necessitate it being destroyed. The great science fiction author Arthur C. Clarke wrote that the Mariner 1 was wrecked by the most expensive hyphen in history, which isn't quite right, but it's pretty funny. I mean, come on, it's a humorous phrase. So the actual punctuation mark that caused the problem was not technically a hyphen. It was a superscript bar. Superscript bars, by the way, not a place where playwrights hang out to get torn up. A superscript bar means a horizontal bar that sits above some other symbol.
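In math notation, a bar over a quantity usually denotes its smoothed or averaged value. As a rough illustration, and definitely not the actual Mariner guidance equations, here's the difference between reacting to a raw, noisy rate signal and reacting to an exponentially smoothed version of it:

```python
def exp_smooth(samples, alpha=0.2):
    """Exponential smoothing: each output blends the newest sample
    with the running average, damping momentary spikes."""
    smoothed = []
    acc = samples[0]
    for x in samples:
        acc = alpha * x + (1 - alpha) * acc
        smoothed.append(acc)
    return smoothed

# A noisy rate-of-change signal (made-up numbers):
raw = [10.0, 14.0, 6.0, 15.0, 5.0]
bar = exp_smooth(raw)

swing_raw = max(raw) - min(raw)  # 10.0: demands big corrections
swing_bar = max(bar) - min(bar)  # ~1.2: demands gentle corrections
print(swing_raw, round(swing_bar, 2))
```

Guidance steering off the smoothed series nudges the thrusters; guidance steering off the raw series wrenches them, which is roughly what the missing bar did to Mariner 1.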
In this case, it was over a radius symbol, and that symbol, along with the superscript bar, described a smoothing function, which means the formula was meant to calculate smoothed values of the time derivative of a radius. Now, without the smoothing function, tiny deviations in course sent commands to the rocket's thrusters to kick in big time to overcorrect for the problem. As an analogy, imagine you're driving a vehicle and you see a pothole in the road, and you're approaching it, and instead of gently steering out of the way, you wrench the wheel really hard to the left or to the right in order to try and get around this pothole. That's kind of what was happening with the rocket. It didn't have the smoothing function, and so as a result, it was having these wild deviations in course. So it wasn't a hyphen that caused the problem, but it's close enough.

Our next space story takes place in nineteen ninety-six with the European Space Agency's Ariane 5 Flight 501 rocket.
Now, this rocket was to launch into space on June fourth, nineteen ninety-six, and instead the rocket disintegrated about forty seconds after taking off. So what the heck happened? Well, it largely had to do with the ESA reusing old work. This actually becomes a theme in this episode. One of the morals of this entire podcast is: if you're designing something, a successor to an earlier product, and you want to reuse some of the features that you created in your previous product, test the heck out of it in its new form factor, because it could be that things that worked perfectly fine in the earlier model will go awry in the new one. That's what happened here. So as you might guess from the name, the Ariane 5 marked the fifth generation of launch vehicles under that name. The Ariane 4's inertial reference system would convert sixty-four-bit floating point numbers into a sixteen-bit signed integer, and it worked just fine. But the Ariane 5's stats were beefier than its predecessor's, with faster engines, and that was where the problem really started.
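The Ariane software was written in Ada, and the out-of-range conversion actually raised an unhandled exception rather than silently wrapping, but the core hazard is easy to sketch in Python: a 64-bit float that no longer fits in a signed 16-bit integer. The values below are illustrative, not real telemetry.

```python
import ctypes

INT16_MAX = 32_767  # largest value a signed 16-bit integer can hold

def to_int16_raw(x: float) -> int:
    """Truncate the float, then wrap it into 16 bits the way an
    unchecked hardware-style conversion would."""
    return ctypes.c_int16(int(x)).value

# A value in the Ariane 4's flight envelope converts cleanly:
assert to_int16_raw(20_000.0) == 20_000

# The Ariane 5's faster ascent produced bigger numbers:
v = 40_000.0
assert v > INT16_MAX          # out of range for a signed 16-bit integer
print(to_int16_raw(v))        # wraps around to -25536
```

On the real rocket the conversion didn't produce a garbage value; it raised an error that shut down both inertial reference computers, and the vehicle lost guidance.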
The engine output meant those sixty-four-bit floating point numbers were significantly larger than the ones generated by the engines on the Ariane 4. They didn't anticipate this, so during the conversion process there was actually data overflow, and that overflow caused both the backup computer and the primary computer aboard the Ariane 5 to crash, and they crashed in that order. The backup computer crashed first, followed by the primary computer a couple of seconds later. The whole thing took less than a minute to go from launch to disintegration. Oops.

Now we're going to stick with space, but jump forward to nineteen ninety-nine and the Mars Climate Orbiter. This was an unfortunate problem. So this particular spacecraft was meant to study Mars's climate, atmosphere, and surface changes, and it was also supposed to be a kind of relay station for landers that would explore Mars's surface. But none of that would last, because of some pretty significant goofs. So on September twenty-third, nineteen ninety-nine, the orbiter passed into the upper atmosphere of Mars and did so at a pretty low altitude.
And this is what folks in the space industry call a bad thing. The drag on the spacecraft was significant. It began to fall apart, and it was destroyed upon entering Mars's atmosphere. That's what happened. So the software guiding the orbiter was to blame, and it's a dumb, dumb mistake. It was supposed to make adjustments to the orbiter's flight in SI units, specifically in newton-seconds. That's what the contract between Lockheed and NASA said: newton-seconds; use SI units for all of your calculations. But the software instead made calculations in non-SI units, namely pound-seconds. So Lockheed's software gave information to NASA's systems using the wrong units of measure. NASA's systems then took the information, assuming it was in the right units of measure, and executed commands based upon that. So this is why, if you're ever in a math course and the teacher makes you stop in the middle of writing a problem on the board, the teacher says, where are your units?
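The mismatch boils down to one conversion factor: a pound-force second is about 4.45 newton-seconds. A tiny sketch with made-up impulse numbers:

```python
LBF_S_PER_N_S = 4.44822  # newton-seconds in one pound-force second

# Ground software computes a thruster impulse (illustrative value)
# and reports the bare number, which is really in pound-force seconds:
reported_number = 10.0

# The receiving system assumes SI and reads it as 10.0 N*s, so the
# physical burn delivers about 4.45 times the impulse the trajectory
# model thinks it did:
intended_impulse_n_s = reported_number
actual_impulse_n_s = reported_number * LBF_S_PER_N_S

error_factor = actual_impulse_n_s / intended_impulse_n_s
print(round(error_factor, 2))  # 4.45
```

Carry a bare number between two systems with different unit assumptions and every single correction burn is off by that same factor.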
This is why you have to make sure you're using the right units, because if you're saying a number and you don't associate a unit with it, someone could make an incorrect decision based on that, and it could be disastrous, as it was in the case of this orbiter. The thrusters fired at four point four five times the power they were supposed to, and the orbiter didn't stand a chance. And this was a pretty expensive mistake. The mission's cost came in at three hundred twenty-seven point six million dollars. But on the bright side, with all of these stories, at least no human lives were ever in real danger as a result of the mistake. Now, I've got a lot more to say about bugs, but before I get into that, let's take a quick break to thank our sponsor.

All right. Now let's make a switch to AT&T, which is a company that had a pretty big problem with switches once upon a time. I'm talking about an issue that popped up in January nineteen ninety. That's when AT&T long distance customers discovered they were unable to make any long distance calls.
Why? Why could they no longer reach anybody? Well, AT&T's long distance switches, which control that and allow for the actual connections to be made, were on the fritz. They were trying to reboot over and over again. They were just stuck in a reboot cycle. Now, initially the company thought it was being hacked. But like I said at the top of the show, I'm not covering stories about hackers here; I'm talking about big design flaws that caused problems. So they weren't getting hacked. That's not what was going on with those 4ESS long distance switches. No, there was a design problem at fault. So what had happened was AT&T had rolled out an update to the code that managed the switches, and it was meant to increase efficiency. It was meant to speed things up. But the problem was it sped things up so much that the system got caught up in itself. It gets pretty technical, but I can give you kind of an overview of what the problem was.
Alright, so each switch had a function that allowed it to alert the next switch down the line if things were starting to get hairy. So imagine that switch number one is handling traffic, but it's getting really close to capacity. So it sends a message over to switch number two and says, I can't take on any more work, because if I do, I'll be overloaded. Switch two then says, no problem, I'll take on any incoming work for you and we'll handle it from there. And if switch number two were to get into the same sort of situation, it would say the same thing to switch number three, and so on and so forth. Now, eventually each switch will contact the one below it and say, hey, how are you doing there? And if the answer is okay, then everything switches back and you go back to normal operation. That's how it's supposed to work. But AT&T's updated code sped things up so much it caused some real issues, and there was some poor timing, just coincidental timing, that made things worse.
So switch number one starts to get 330 00:20:12,640 --> 00:20:15,480 Speaker 1: overwhelmed and sends a message over to switch number two, 331 00:20:15,480 --> 00:20:17,960 Speaker 1: but switch number two was just in the middle of 332 00:20:18,080 --> 00:20:22,240 Speaker 1: resetting itself. So switch number two goes into reset mode, 333 00:20:22,280 --> 00:20:24,520 Speaker 1: which says do not disturb, and sends a message over to 334 00:20:24,560 --> 00:20:28,200 Speaker 1: switch number three. That prompted switch number three to overload 335 00:20:28,480 --> 00:20:30,520 Speaker 1: and put up a do not disturb sign. That moved 336 00:20:30,520 --> 00:20:33,520 Speaker 1: down to switch number four. This whole thing goes down 337 00:20:33,560 --> 00:20:36,919 Speaker 1: the entire line of switches. They all end up 338 00:20:36,920 --> 00:20:39,760 Speaker 1: getting overloaded as a result of this, and all go 339 00:20:39,800 --> 00:20:44,600 Speaker 1: into reset mode and they get stuck there. That problem 340 00:20:44,680 --> 00:20:48,560 Speaker 1: lasted for nine hours before A T and T was 341 00:20:48,600 --> 00:20:51,480 Speaker 1: finally able to address the message load on the entire 342 00:20:51,560 --> 00:20:54,639 Speaker 1: system and get the switches back to normal. The estimated 343 00:20:54,760 --> 00:20:57,840 Speaker 1: cost of lost revenue for that time was about sixty 344 00:20:57,960 --> 00:21:01,560 Speaker 1: million dollars in long distance calls, and there were a 345 00:21:01,600 --> 00:21:04,440 Speaker 1: lot of angry customers to boot, so to placate them, 346 00:21:04,480 --> 00:21:07,600 Speaker 1: A T and T offered reduced long distance rates on 347 00:21:07,720 --> 00:21:12,320 Speaker 1: Valentine's Day. Pretty ugly, but A T and T 348 00:21:12,320 --> 00:21:14,159 Speaker 1: tried to handle it, at least in a way that 349 00:21:14,320 --> 00:21:17,920 Speaker 1: didn't turn it into a PR nightmare. Not so with Intel.
350 00:21:18,600 --> 00:21:21,199 Speaker 1: That's what brings us to the Pentium problem. I 351 00:21:21,240 --> 00:21:23,880 Speaker 1: don't know if you guys remember when Pentium processors first 352 00:21:23,960 --> 00:21:26,600 Speaker 1: came out, but they were a big deal. It was 353 00:21:27,560 --> 00:21:30,679 Speaker 1: a redesign of the architecture of the microprocessor and it 354 00:21:30,720 --> 00:21:34,119 Speaker 1: was meant to really speed things up. Well, Intel had 355 00:21:34,160 --> 00:21:38,679 Speaker 1: a massive nightmare in nineteen ninety four thanks to a flaw in 356 00:21:38,760 --> 00:21:43,960 Speaker 1: the entire first generation of Pentium processors. Now, when you 357 00:21:44,000 --> 00:21:47,280 Speaker 1: break it all down, a CPU is all about performing 358 00:21:47,320 --> 00:21:50,320 Speaker 1: mathematical operations on data, so it's kind of important that 359 00:21:50,440 --> 00:21:55,480 Speaker 1: it does this correctly. Unfortunately, the flaw in the Pentium processors 360 00:21:55,520 --> 00:21:57,840 Speaker 1: kind of messed that up. And the issue has to 361 00:21:57,840 --> 00:22:01,760 Speaker 1: do with floating point operations. So the predecessor to the Pentium, 362 00:22:01,840 --> 00:22:05,639 Speaker 1: the four eight six, used a shift and subtract algorithm for 363 00:22:05,680 --> 00:22:10,840 Speaker 1: floating point operations, which was effective but relatively slow compared 364 00:22:10,880 --> 00:22:15,480 Speaker 1: to what Intel thought they could do by totally redesigning 365 00:22:15,480 --> 00:22:19,840 Speaker 1: that structure and using a look up table approach.
Now, 366 00:22:19,840 --> 00:22:23,040 Speaker 1: the table was supposed to have one thousand sixty six 367 00:22:23,320 --> 00:22:27,679 Speaker 1: entries programmed directly onto the logic array of the Pentium processor, 368 00:22:28,800 --> 00:22:33,200 Speaker 1: but for some reason only one thousand sixty one entries 369 00:22:33,280 --> 00:22:37,680 Speaker 1: made it. Five entries went missing and essentially returned an 370 00:22:37,680 --> 00:22:42,280 Speaker 1: answer of zero instead of what they were supposed to say, 371 00:22:42,359 --> 00:22:44,800 Speaker 1: so if a calculation accessed one of those missing cells, 372 00:22:44,840 --> 00:22:47,159 Speaker 1: it got zero, even though that's not the correct answer. 373 00:22:48,080 --> 00:22:50,600 Speaker 1: All the first generation Pentiums went out with this error 374 00:22:50,640 --> 00:22:53,080 Speaker 1: because it was so minor that it wasn't even picked 375 00:22:53,200 --> 00:22:58,560 Speaker 1: up by Intel's quality control at the time. Now, the processors 376 00:22:58,560 --> 00:23:02,040 Speaker 1: worked just fine up to the eighth decimal place. Beyond 377 00:23:02,119 --> 00:23:05,240 Speaker 1: that things got messy. But for most folks that wasn't 378 00:23:05,240 --> 00:23:09,800 Speaker 1: a problem because they weren't doing mathematical calculations that needed 379 00:23:09,840 --> 00:23:13,280 Speaker 1: that level of precision. It just wasn't a thing. In fact, 380 00:23:13,320 --> 00:23:16,280 Speaker 1: there was only a one in three sixty billion chance 381 00:23:16,760 --> 00:23:20,520 Speaker 1: that this error would cause a big enough problem to 382 00:23:20,640 --> 00:23:25,040 Speaker 1: reach up to the fourth decimal place. So most calculations 383 00:23:25,040 --> 00:23:27,879 Speaker 1: that were simple were bulletproof. You were fine.
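The flaw is easy to demonstrate with the division that became its canonical test case. On a correct FPU the expression below comes out at (or vanishingly close to) zero; flawed first-generation Pentiums famously returned 256. A quick Python sketch:

```python
# Canonical FDIV check: dividing 4195835 by 3145727 hit one of the
# five missing lookup-table entries on the flawed chips.
x = 4195835.0
y = 3145727.0

quotient = x / y             # flawed Pentiums got this wrong
residual = x - quotient * y  # should be essentially zero

print(residual)  # ~0.0 on a correct processor; 256 on a flawed Pentium
```

Running this (or the equivalent in a spreadsheet, which is how many people checked their own machines at the time) made the defect visible to anyone.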
But 384 00:23:27,960 --> 00:23:31,320 Speaker 1: if you needed that precision, if you needed that really 385 00:23:31,440 --> 00:23:35,320 Speaker 1: fine degree, that's when you would encounter the flaw, and 386 00:23:35,400 --> 00:23:39,320 Speaker 1: that happened because there are math professors in this world, 387 00:23:39,760 --> 00:23:44,919 Speaker 1: and one of those, Thomas Nicely, discovered in October of nineteen ninety four that 388 00:23:45,000 --> 00:23:48,080 Speaker 1: he was getting errors because of this issue. He needed 389 00:23:48,080 --> 00:23:51,720 Speaker 1: the processor to work correctly, and so he contacted Intel 390 00:23:51,760 --> 00:23:55,679 Speaker 1: about the problem. And this is where we take a 391 00:23:55,720 --> 00:23:58,520 Speaker 1: moment to acknowledge there's a right way and a wrong 392 00:23:58,560 --> 00:24:03,360 Speaker 1: way to handle an issue that's your fault. Intel decided 393 00:24:03,400 --> 00:24:06,080 Speaker 1: to go the wrong way. My opinion is, if you 394 00:24:06,119 --> 00:24:08,400 Speaker 1: make a mistake, it's usually a good idea to just 395 00:24:08,600 --> 00:24:10,800 Speaker 1: own up to it and try to make it better. 396 00:24:11,480 --> 00:24:13,840 Speaker 1: But Intel's response was more along the lines of, yeah, 397 00:24:13,920 --> 00:24:16,080 Speaker 1: we didn't think it was a big deal. And then 398 00:24:16,119 --> 00:24:20,680 Speaker 1: Intel made other PR blunders, because people began to hear, hey, 399 00:24:20,760 --> 00:24:24,000 Speaker 1: that Pentium processor in your computer that you just bought, 400 00:24:24,760 --> 00:24:28,359 Speaker 1: it doesn't work properly. So people wanted to get replacements. 401 00:24:29,040 --> 00:24:31,399 Speaker 1: But Intel said, oh, we're only going to replace 402 00:24:31,440 --> 00:24:34,760 Speaker 1: them if you can prove that the mistakes it 403 00:24:34,800 --> 00:24:38,920 Speaker 1: makes affect you in some meaningful way.
So they weren't 404 00:24:39,480 --> 00:24:42,200 Speaker 1: denying that there was a problem. They were 405 00:24:42,240 --> 00:24:44,840 Speaker 1: just saying, hey, unless you can prove the problem affects you, 406 00:24:44,880 --> 00:24:48,840 Speaker 1: we don't care. That didn't go well. If you 407 00:24:48,920 --> 00:24:51,159 Speaker 1: create a product and you market it as the future 408 00:24:51,200 --> 00:24:53,639 Speaker 1: of computing, and then it's discovered there's a flaw in 409 00:24:53,720 --> 00:24:56,280 Speaker 1: the design, and then you say we'll replace it, but 410 00:24:56,359 --> 00:25:00,199 Speaker 1: only if you prove you deserve it, it doesn't tend 411 00:25:00,280 --> 00:25:04,240 Speaker 1: to make your customer base very happy. So ultimately, Intel 412 00:25:04,960 --> 00:25:08,119 Speaker 1: reversed that decision and offered to replace the processor for 413 00:25:08,240 --> 00:25:12,119 Speaker 1: anyone who wanted it who had a first generation Pentium, 414 00:25:12,280 --> 00:25:15,919 Speaker 1: and that mistake ended up costing the company four hundred seventy 415 00:25:15,920 --> 00:25:21,560 Speaker 1: five million dollars. Yikes. All right, now we're gonna switch 416 00:25:21,600 --> 00:25:25,240 Speaker 1: gears over to Microsoft. First, I think you could claim 417 00:25:25,280 --> 00:25:29,760 Speaker 1: that the whole Microsoft Bob product that was supposed to 418 00:25:29,760 --> 00:25:33,479 Speaker 1: be an easy, accessible computer interface was really just a 419 00:25:33,520 --> 00:25:38,080 Speaker 1: massive software bug. I mean, it introduced Comic Sans for 420 00:25:38,119 --> 00:25:43,000 Speaker 1: goodness sakes. The cluttered organization system, the lack of meaningful security, 421 00:25:43,000 --> 00:25:48,120 Speaker 1: and numerous other issues plagued that software.
But we did 422 00:25:48,119 --> 00:25:50,720 Speaker 1: an entire episode of Tech Stuff about Microsoft Bob a 423 00:25:50,720 --> 00:25:52,879 Speaker 1: couple of years ago, so I'm not gonna dwell on 424 00:25:52,960 --> 00:25:55,840 Speaker 1: it anymore, but if you want to hear more about it, 425 00:25:56,560 --> 00:25:59,600 Speaker 1: go find that episode. It was a fun one. Now 426 00:25:59,600 --> 00:26:04,600 Speaker 1: in two thousand seven, Microsoft experienced a massive headache when a 427 00:26:04,600 --> 00:26:08,040 Speaker 1: bug on their servers notified thousands of Windows customers that 428 00:26:08,080 --> 00:26:11,280 Speaker 1: they were filthy, dirty software pirates and they should be punished. 429 00:26:11,600 --> 00:26:16,320 Speaker 1: These included people who actually had legitimate, legally purchased copies 430 00:26:16,400 --> 00:26:20,840 Speaker 1: of Windows XP or Vista. So the problem here was 431 00:26:20,880 --> 00:26:25,879 Speaker 1: Microsoft had an initiative called Windows Genuine Advantage, and it 432 00:26:26,000 --> 00:26:28,440 Speaker 1: was a nice name for a strategy meant to curtail 433 00:26:28,800 --> 00:26:33,400 Speaker 1: operating system piracy. Essentially, it was a component in Windows 434 00:26:33,720 --> 00:26:36,199 Speaker 1: that would allow Microsoft to figure out if the copy 435 00:26:36,280 --> 00:26:41,760 Speaker 1: of Windows on any given computer was legit. In other words, 436 00:26:41,800 --> 00:26:44,240 Speaker 1: it was a DRM strategy. But in two 437 00:26:44,240 --> 00:26:48,080 Speaker 1: thousand seven, a buggy install of software on a server 438 00:26:48,280 --> 00:26:54,359 Speaker 1: misidentified thousands of legitimate, law abiding customers as pirates for 439 00:26:54,560 --> 00:26:57,679 Speaker 1: nineteen hours.
The software just laid down the law, and 440 00:26:57,720 --> 00:27:00,800 Speaker 1: so people began to receive sternly written warnings about their 441 00:27:00,880 --> 00:27:03,960 Speaker 1: choice to indulge in bad behavior. And if you were 442 00:27:03,960 --> 00:27:07,560 Speaker 1: a Windows Vista customer, you had it the worst, not 443 00:27:07,800 --> 00:27:10,680 Speaker 1: just because you were using Windows Vista, which I think 444 00:27:10,680 --> 00:27:13,840 Speaker 1: we all agree was not one of the bright points 445 00:27:13,840 --> 00:27:19,639 Speaker 1: in Microsoft's operating system history, but also because Microsoft had 446 00:27:19,640 --> 00:27:22,959 Speaker 1: built in the ability for Windows Genuine Advantage to switch 447 00:27:23,000 --> 00:27:27,679 Speaker 1: off certain operating system features in Windows Vista if it 448 00:27:27,760 --> 00:27:31,679 Speaker 1: determined that the copy someone was using was a pirated version, 449 00:27:32,160 --> 00:27:35,879 Speaker 1: so it was misidentifying real versions as pirated ones, turning 450 00:27:35,920 --> 00:27:38,520 Speaker 1: off features, and these are for people who had bought 451 00:27:38,680 --> 00:27:41,239 Speaker 1: legitimate copies. This, by the way, is one of the 452 00:27:41,240 --> 00:27:45,920 Speaker 1: big arguments people have against DRM. It has the tendency 453 00:27:46,000 --> 00:27:51,320 Speaker 1: to punish legitimate customers. And you feel like you're stupid 454 00:27:51,480 --> 00:27:54,760 Speaker 1: for buying a copy of a piece of software rather 455 00:27:54,800 --> 00:27:57,600 Speaker 1: than just stealing one that has had those features or 456 00:27:57,640 --> 00:28:01,800 Speaker 1: those defenses removed. Like, why? You're creating more incentives 457 00:28:01,840 --> 00:28:07,199 Speaker 1: for people to go outside and get a pirated copy.
Alright, 458 00:28:07,240 --> 00:28:10,600 Speaker 1: so imagine you've purchased this legitimate copy of Windows Vista. 459 00:28:10,680 --> 00:28:13,000 Speaker 1: First of all, you already feel bad. Then you're 460 00:28:13,040 --> 00:28:15,879 Speaker 1: told you're a thief, so you feel worse. Then someone 461 00:28:15,920 --> 00:28:18,800 Speaker 1: remotely switches off several features of your operating system. That 462 00:28:18,880 --> 00:28:22,439 Speaker 1: was not a great PR message. So that was a 463 00:28:22,560 --> 00:28:26,000 Speaker 1: real issue. They did eventually fix it after that nineteen hours, 464 00:28:26,040 --> 00:28:29,119 Speaker 1: but by then people were already very upset. Also, I 465 00:28:29,160 --> 00:28:34,800 Speaker 1: don't wanna just, you know, pile lots of abuse onto Microsoft. 466 00:28:34,880 --> 00:28:38,719 Speaker 1: I gotta talk about Apple here too. So the company 467 00:28:38,840 --> 00:28:42,239 Speaker 1: prides itself on a high standard of quality, and in 468 00:28:42,280 --> 00:28:45,200 Speaker 1: general it's pretty good about living up to that standard 469 00:28:45,240 --> 00:28:47,520 Speaker 1: of quality, depending upon your point of view of their 470 00:28:47,600 --> 00:28:51,760 Speaker 1: various products. But that hasn't stopped a few clunkers getting 471 00:28:51,800 --> 00:28:55,840 Speaker 1: through and into the public's hands. And that was the 472 00:28:55,880 --> 00:28:58,880 Speaker 1: case in two thousand twelve with Apple Maps. If you 473 00:28:58,920 --> 00:29:01,160 Speaker 1: owned an iPhone back in two thousand twelve when Apple 474 00:29:01,200 --> 00:29:04,600 Speaker 1: Maps came out, you may remember this problem. It was pretty 475 00:29:04,600 --> 00:29:08,440 Speaker 1: well publicized.
Maps were inaccurate, sometimes leaving out important details 476 00:29:08,480 --> 00:29:11,360 Speaker 1: like, you know, a river or a lake between you 477 00:29:11,440 --> 00:29:14,440 Speaker 1: and your destination, things that might be important if, I 478 00:29:14,440 --> 00:29:17,760 Speaker 1: don't know, you don't drive an amphibious vehicle. It might not 479 00:29:17,920 --> 00:29:21,840 Speaker 1: have a road on there that's important. It might misidentify the 480 00:29:21,880 --> 00:29:25,360 Speaker 1: location of a historical landmark. For instance, it thought the 481 00:29:25,400 --> 00:29:28,120 Speaker 1: Washington Monument was across the street from where it is. 482 00:29:28,720 --> 00:29:32,120 Speaker 1: But nope, it's just where we left it. Despite all 483 00:29:32,200 --> 00:29:37,080 Speaker 1: of Roland Emmerich's best attempts to move it or destroy it, 484 00:29:37,080 --> 00:29:41,720 Speaker 1: it's still there. The real problem here was that the 485 00:29:41,760 --> 00:29:45,640 Speaker 1: Apple software just wasn't ready for public unveiling. 486 00:29:45,760 --> 00:29:48,320 Speaker 1: It needed a lot more testing. It was trying to 487 00:29:48,560 --> 00:29:51,160 Speaker 1: play catch up to Google Maps, but Google had the 488 00:29:51,200 --> 00:29:54,000 Speaker 1: advantage of working with companies that had been doing mapping 489 00:29:54,040 --> 00:29:58,160 Speaker 1: software for years. Google acquired those companies and acquired the 490 00:29:58,200 --> 00:30:01,360 Speaker 1: expertise of people who had been working on that software, and 491 00:30:01,440 --> 00:30:04,960 Speaker 1: Apple was really just trying to create their own version 492 00:30:05,600 --> 00:30:07,240 Speaker 1: and get it out as fast as it could.
But 493 00:30:07,400 --> 00:30:10,080 Speaker 1: it got out a little too early, and the company 494 00:30:10,120 --> 00:30:12,440 Speaker 1: spent the next several months tweaking maps and trying to 495 00:30:12,520 --> 00:30:14,600 Speaker 1: keep control of the situation. But by that time, many 496 00:30:14,640 --> 00:30:17,640 Speaker 1: of Apple's fans, even the most devoted ones, had kind 497 00:30:17,640 --> 00:30:21,800 Speaker 1: of given up and switched over to Google Maps instead. Well, 498 00:30:21,840 --> 00:30:24,320 Speaker 1: that's most of the fun stuff. I've got some really 499 00:30:24,560 --> 00:30:28,640 Speaker 1: serious bugs to cover. But before I do that, let's 500 00:30:28,640 --> 00:30:40,280 Speaker 1: take another quick break and thank our sponsor. Now I'm 501 00:30:40,320 --> 00:30:43,640 Speaker 1: going to transition into some serious bugs. These are ones 502 00:30:43,720 --> 00:30:48,280 Speaker 1: that either threatened the lives of people or they contributed 503 00:30:48,360 --> 00:30:53,560 Speaker 1: to people dying. The ones I've talked about up 504 00:30:53,560 --> 00:30:56,719 Speaker 1: to now have cost companies millions of dollars, but 505 00:30:56,800 --> 00:31:00,920 Speaker 1: no one's life was truly threatened. Unfortunately, that's not 506 00:31:01,000 --> 00:31:03,480 Speaker 1: the case with all software bugs. Now, a couple of 507 00:31:03,520 --> 00:31:08,520 Speaker 1: bugs had the potential to kill millions of people. One 508 00:31:08,560 --> 00:31:13,520 Speaker 1: of those happened in nineteen eighty, a famous bug, 509 00:31:14,040 --> 00:31:16,480 Speaker 1: or at least a faulty circuit, and that was a 510 00:31:16,480 --> 00:31:19,240 Speaker 1: faulty circuit in NORAD's computer system which caused it 511 00:31:19,280 --> 00:31:22,520 Speaker 1: to mistakenly conclude the US was under nuclear attack from 512 00:31:22,560 --> 00:31:28,000 Speaker 1: the Soviet Union.
So displays on NORAD systems showed 513 00:31:28,040 --> 00:31:31,320 Speaker 1: seemingly random attacks, and they didn't correspond with each other. 514 00:31:31,400 --> 00:31:34,520 Speaker 1: So the display might show, hey, there are two missiles heading 515 00:31:34,600 --> 00:31:37,240 Speaker 1: over from the Soviet Union. No, there are two hundred. No, 516 00:31:37,400 --> 00:31:41,240 Speaker 1: there are fifty. No, there are three. And it wasn't consistent, and 517 00:31:41,360 --> 00:31:45,840 Speaker 1: command posts around the US all had conflicting information, which 518 00:31:45,920 --> 00:31:49,520 Speaker 1: led leaders to conclude the whole thing was a regrettable 519 00:31:49,560 --> 00:31:53,600 Speaker 1: computer error, and they were right to do so. To 520 00:31:53,680 --> 00:31:56,160 Speaker 1: be fair, they were kind of prepared for this because 521 00:31:56,160 --> 00:31:58,600 Speaker 1: there was another incident that actually happened in nineteen 522 00:31:58,680 --> 00:32:02,240 Speaker 1: seventy nine that was scarier. In that case, 523 00:32:02,280 --> 00:32:05,840 Speaker 1: someone mistakenly inserted a training scenario into the computer system 524 00:32:05,840 --> 00:32:07,960 Speaker 1: that made it seem like the Soviet Union had launched 525 00:32:07,960 --> 00:32:11,239 Speaker 1: an all out nuclear attack on the US. But that 526 00:32:11,360 --> 00:32:13,480 Speaker 1: wasn't a bug. That was a mistake on the part 527 00:32:13,480 --> 00:32:16,600 Speaker 1: of a human who had accidentally uploaded, or 528 00:32:16,880 --> 00:32:20,040 Speaker 1: rather executed, the wrong command. It didn't have anything to 529 00:32:20,080 --> 00:32:23,080 Speaker 1: do with a flaw in the computer system itself.
However, 530 00:32:23,640 --> 00:32:28,560 Speaker 1: because that thing happened and everybody was freaked out and 531 00:32:28,560 --> 00:32:30,400 Speaker 1: then was able to determine that in fact it was 532 00:32:30,440 --> 00:32:33,600 Speaker 1: a false alarm, it meant that calmer heads could prevail 533 00:32:33,720 --> 00:32:38,880 Speaker 1: in the nineteen eighty incident. The Soviets also had 534 00:32:38,880 --> 00:32:41,120 Speaker 1: a close call just a few years later. It was 535 00:32:41,160 --> 00:32:44,280 Speaker 1: a bug in the early warning detection software that the 536 00:32:44,400 --> 00:32:47,760 Speaker 1: USSR was using in the early eighties, and on September 537 00:32:47,760 --> 00:32:51,760 Speaker 1: twenty six, nineteen eighty three, the Soviet Union received an alert 538 00:32:52,200 --> 00:32:54,600 Speaker 1: that the US had launched a nuclear attack in the 539 00:32:54,640 --> 00:32:59,720 Speaker 1: form of five nuclear warheads, technically two different attacks. 540 00:33:00,080 --> 00:33:03,040 Speaker 1: The first would have been a single nuclear warhead and 541 00:33:03,080 --> 00:33:06,680 Speaker 1: the second was four nuclear warheads, and this was during 542 00:33:06,680 --> 00:33:10,360 Speaker 1: a particularly stressful period in the history of both countries 543 00:33:10,400 --> 00:33:13,600 Speaker 1: and their relationship with each other, at the height of 544 00:33:13,640 --> 00:33:20,200 Speaker 1: the Cold War in nineteen eighty three. Now, fortunately, Soviet Air 545 00:33:20,280 --> 00:33:26,640 Speaker 1: Defense Forces Lieutenant Colonel Stanislav Petrov suspected that this report 546 00:33:26,760 --> 00:33:29,520 Speaker 1: was an error and that there was some sort of 547 00:33:29,560 --> 00:33:32,960 Speaker 1: bug in the software or a mistake in the reporting 548 00:33:33,000 --> 00:33:36,640 Speaker 1: system that caused this.
He gave a command to hold 549 00:33:36,680 --> 00:33:39,600 Speaker 1: off on any sort of retaliatory strike, which would have 550 00:33:39,600 --> 00:33:43,600 Speaker 1: initiated a full scale nuclear war had it happened. Petrov 551 00:33:43,800 --> 00:33:46,400 Speaker 1: was the officer in charge of a bunker that served 552 00:33:46,400 --> 00:33:49,400 Speaker 1: as the command center for this early warning system, and 553 00:33:49,440 --> 00:33:53,240 Speaker 1: he had said afterward that his reckoning was any 554 00:33:53,280 --> 00:33:56,600 Speaker 1: real attack would consist of hundreds of warheads, not five. 555 00:33:57,400 --> 00:34:01,080 Speaker 1: No one would start an attack with just five warheads, 556 00:34:01,360 --> 00:34:03,080 Speaker 1: so it was more likely to be an error than 557 00:34:03,120 --> 00:34:05,960 Speaker 1: a genuine attack. So he gave the command to wait 558 00:34:06,040 --> 00:34:09,360 Speaker 1: until the reported missiles would pass into the range of radar, 559 00:34:09,880 --> 00:34:13,520 Speaker 1: which only extended as far as the horizon, so if 560 00:34:13,760 --> 00:34:15,839 Speaker 1: it had in fact been a real attack, it would 561 00:34:15,840 --> 00:34:19,640 Speaker 1: have potentially limited the Soviet Union's ability to respond. But 562 00:34:20,000 --> 00:34:25,919 Speaker 1: no missiles showed up, and he was vindicated in his decision. Now, 563 00:34:25,920 --> 00:34:27,960 Speaker 1: the cause of the false alarm in this case was 564 00:34:28,640 --> 00:34:33,720 Speaker 1: a combination of factors that the designers didn't anticipate, 565 00:34:33,800 --> 00:34:38,160 Speaker 1: which largely consisted of sunlight hitting high altitude clouds at 566 00:34:38,160 --> 00:34:42,239 Speaker 1: a particular angle from a particular perspective of the satellites. 567 00:34:42,600 --> 00:34:49,120 Speaker 1: So the satellites misidentified that reflection as a warhead.
Now 568 00:34:49,160 --> 00:34:51,359 Speaker 1: the Soviets were able to address this error 569 00:34:51,360 --> 00:34:54,960 Speaker 1: in the future by adding another step in which these 570 00:34:55,040 --> 00:34:59,400 Speaker 1: satellites would cross reference data from other geostationary satellites to 571 00:34:59,520 --> 00:35:04,080 Speaker 1: make certain that they were identifying actual rockets as 572 00:35:04,080 --> 00:35:10,400 Speaker 1: opposed to high altitude clouds. Now, there are several cases 573 00:35:10,440 --> 00:35:16,040 Speaker 1: of software bugs leading to actual deaths. For example, the 574 00:35:16,200 --> 00:35:19,200 Speaker 1: Therac twenty five was such a case. Now that was a 575 00:35:19,320 --> 00:35:22,440 Speaker 1: radiation therapy machine that could deliver two different modes of 576 00:35:22,520 --> 00:35:27,040 Speaker 1: radiation treatments. The first was a low powered direct electron 577 00:35:27,160 --> 00:35:30,839 Speaker 1: beam and the second was a mega volt X ray beam. Now, 578 00:35:30,840 --> 00:35:33,319 Speaker 1: the X ray beam was far more intense and it 579 00:35:33,400 --> 00:35:37,719 Speaker 1: required physicians to provide shielding to patients to limit exposure 580 00:35:37,760 --> 00:35:41,760 Speaker 1: to the beam. But the Therac twenty five had inherited its code 581 00:35:41,800 --> 00:35:45,880 Speaker 1: from its predecessor, which had different hardware constraints.
Now the 582 00:35:45,920 --> 00:35:50,040 Speaker 1: new machine meant that these constraints weren't there, and it 583 00:35:50,120 --> 00:35:53,480 Speaker 1: created a deadly problem. If operators changed the machine's mode 584 00:35:53,600 --> 00:35:56,480 Speaker 1: too quickly from one to the other, it would actually 585 00:35:56,480 --> 00:35:59,560 Speaker 1: send two sets of instructions to the processor, one for 586 00:35:59,640 --> 00:36:02,920 Speaker 1: each mode of operation, and whichever set of instructions reached 587 00:36:02,920 --> 00:36:06,440 Speaker 1: the processor first, that's what the machine would switch to. 588 00:36:07,440 --> 00:36:12,239 Speaker 1: So let's say you've been operating the Therac in the 589 00:36:12,280 --> 00:36:15,440 Speaker 1: mega volt X ray mode, but now you're going to 590 00:36:15,520 --> 00:36:18,600 Speaker 1: have a patient come in. You need to administer radiation therapy, 591 00:36:18,920 --> 00:36:21,439 Speaker 1: so you want to switch it to the low powered electron beam. 592 00:36:21,960 --> 00:36:24,759 Speaker 1: You switch it too quickly, it sends two sets of 593 00:36:24,800 --> 00:36:27,440 Speaker 1: instructions to the processor, and the one that arrives first is the 594 00:36:27,480 --> 00:36:30,640 Speaker 1: mega volt X ray instruction, so instead of switching, 595 00:36:30,920 --> 00:36:35,880 Speaker 1: the machine stays on the more intense, deadlier radiation. 596 00:36:37,400 --> 00:36:42,680 Speaker 1: The tragic news is this did happen several times.
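That race can be pictured with a tiny Python model. To be clear, this is an invented sketch: the function name, the hundred-millisecond window, and the in-flight list are all made up for illustration, and the real Therac bug lived in low-level task synchronization. But the "first instruction to arrive wins" behavior is the same idea.

```python
# Toy model of the race: editing the mode too fast leaves the stale
# high-power instruction still in flight, and it reaches the shared
# processor ahead of the intended one.

def select_mode(edit_interval_ms, threshold_ms=100):
    """Return the mode the machine actually latches (hypothetical model)."""
    intended = "ELECTRON"
    if edit_interval_ms < threshold_ms:
        # Operator re-edited too quickly: the old command is still queued.
        in_flight = ["XRAY", intended]
    else:
        in_flight = [intended]
    return in_flight[0]  # whichever instruction arrives first wins

print(select_mode(edit_interval_ms=50))   # XRAY -- the wrong, dangerous mode
print(select_mode(edit_interval_ms=500))  # ELECTRON -- what the operator wanted
```

The fix for this class of bug is to make the mode change atomic, so a half-finished edit can never leave two contradictory commands racing.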
Six 597 00:36:42,719 --> 00:36:46,720 Speaker 1: patients were documented as dying from complications due to radiation 598 00:36:46,760 --> 00:36:51,200 Speaker 1: poisoning from Therac twenty five machines between nineteen eighty five and nineteen 599 00:36:51,280 --> 00:36:54,640 Speaker 1: eighty seven, and while the machine would send error messages 600 00:36:54,719 --> 00:36:58,720 Speaker 1: when these conditions were present, the documentation for the machine 601 00:36:58,760 --> 00:37:01,520 Speaker 1: didn't explain what the errors meant. It didn't say, hey, 602 00:37:01,520 --> 00:37:04,280 Speaker 1: if you get this error, it means that you've switched 603 00:37:04,360 --> 00:37:08,279 Speaker 1: modes too quickly and you need to address this. So, 604 00:37:08,320 --> 00:37:12,880 Speaker 1: since operators weren't told that this was necessarily a hazardous condition, 605 00:37:12,960 --> 00:37:16,319 Speaker 1: they would just clear the error and proceed, and there 606 00:37:16,320 --> 00:37:21,560 Speaker 1: were deadly results. In a similar vein, in Panama City, Panama, 607 00:37:22,400 --> 00:37:26,040 Speaker 1: there was an incident involving a Cobalt sixty system, actually 608 00:37:26,080 --> 00:37:29,720 Speaker 1: several incidents involving this Cobalt sixty system, that was running 609 00:37:29,760 --> 00:37:32,840 Speaker 1: therapy planning software made by a company called Multidata 610 00:37:32,880 --> 00:37:36,680 Speaker 1: Systems International. Now, the software's purpose was to calculate the 611 00:37:36,719 --> 00:37:40,320 Speaker 1: amount of radiation that cancer patients should receive in 612 00:37:40,440 --> 00:37:46,080 Speaker 1: radiation therapy sessions. During these radiation therapy sessions, the therapists 613 00:37:46,280 --> 00:37:50,400 Speaker 1: were meant to place metal shields on the patient to 614 00:37:50,440 --> 00:37:55,680 Speaker 1: protect healthy tissue from radiation damage.
And the software would 615 00:37:55,719 --> 00:37:59,759 Speaker 1: allow therapists to use a methodology to show where those 616 00:38:00,000 --> 00:38:02,920 Speaker 1: shields were on the patient, to indicate where the shields 617 00:38:02,960 --> 00:38:07,440 Speaker 1: were present. But they could only draw up to four shields, 618 00:38:07,800 --> 00:38:10,799 Speaker 1: and the doctors in Panama wanted to use five shields 619 00:38:10,840 --> 00:38:14,840 Speaker 1: for particular therapy sessions. They were overloaded, they had a 620 00:38:14,880 --> 00:38:17,160 Speaker 1: long waiting list of patients, and they were trying to 621 00:38:17,160 --> 00:38:21,000 Speaker 1: make things more efficient, and they discovered that they could 622 00:38:21,080 --> 00:38:25,480 Speaker 1: kind of work around this limitation of four shields by 623 00:38:25,560 --> 00:38:27,839 Speaker 1: drawing a design on the computer screen as if they 624 00:38:27,840 --> 00:38:31,080 Speaker 1: were using just one large shield that had a hole 625 00:38:31,120 --> 00:38:33,080 Speaker 1: in the middle of it. And so what they would 626 00:38:33,080 --> 00:38:35,880 Speaker 1: do is they would arrange the five shields to essentially 627 00:38:35,880 --> 00:38:38,520 Speaker 1: be in the same sort of shape with the middle 628 00:38:38,680 --> 00:38:40,840 Speaker 1: of it being open so that they could have the 629 00:38:40,920 --> 00:38:46,360 Speaker 1: radiation therapy pass through it.
Uh, but they didn't realize 630 00:38:46,400 --> 00:38:48,520 Speaker 1: that the software had a bug in it, and that 631 00:38:48,560 --> 00:38:51,719 Speaker 1: bug was, if you drew the hole in one direction, 632 00:38:52,200 --> 00:38:55,000 Speaker 1: you got the correct dose of radiation, but if you 633 00:38:55,080 --> 00:38:59,440 Speaker 1: drew it in the other direction, so like clockwise versus counterclockwise, 634 00:39:00,040 --> 00:39:03,719 Speaker 1: the software would recommend a dosage twice as strong as 635 00:39:03,760 --> 00:39:07,360 Speaker 1: what was needed, and the result was devastating. Eight patients 636 00:39:07,520 --> 00:39:11,720 Speaker 1: died as a result of this, and another twenty received 637 00:39:11,760 --> 00:39:14,520 Speaker 1: doses high enough to potentially cause health problems later on. 638 00:39:15,840 --> 00:39:18,839 Speaker 1: The physicians were actually arrested and brought up on murder 639 00:39:19,000 --> 00:39:22,640 Speaker 1: charges because they were supposed to double check all calculations 640 00:39:22,680 --> 00:39:25,319 Speaker 1: by hand to ensure that they were going to give 641 00:39:25,360 --> 00:39:28,840 Speaker 1: the proper dose of radiation treatment. So while the software 642 00:39:29,040 --> 00:39:33,920 Speaker 1: was calculating the incorrect dose, the physicians were responsible for 643 00:39:34,000 --> 00:39:37,520 Speaker 1: making sure that any dose that was calculated was in 644 00:39:37,520 --> 00:39:39,319 Speaker 1: fact the correct one, and they failed to do so, 645 00:39:39,719 --> 00:39:43,479 Speaker 1: or at least that was the charge. There are also 646 00:39:43,560 --> 00:39:46,520 Speaker 1: bugs that involved military applications that have resulted in the 647 00:39:46,600 --> 00:39:49,520 Speaker 1: loss of life.
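One way to see how drawing direction can silently flip a calculation like that is the signed-area (shoelace) formula for polygons, whose sign depends on winding order. This is only an analogy, not Multidata's actual code: the point is that software consuming a direction-dependent quantity without normalizing it can produce two different doses for what the operator believes is the same shape.

```python
# Signed (shoelace) area: the same square traced clockwise versus
# counterclockwise yields opposite signs. Code that forgets abs() here
# treats identical-looking shapes differently depending on how the
# operator happened to draw them.

def signed_area(points):
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return area / 2.0

ccw = [(0, 0), (2, 0), (2, 2), (0, 2)]  # counterclockwise square
cw = list(reversed(ccw))                # same shape, traced clockwise

print(signed_area(ccw))  # 4.0
print(signed_area(cw))   # -4.0 -- same drawn hole, opposite sign
```

A robust implementation would normalize the winding order (or take the absolute value) before the result ever feeds into a dose calculation.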
During the Persian Gulf War, an Iraqi 648 00:39:49,719 --> 00:39:52,680 Speaker 1: fired Scud missile hit a US base in Saudi Arabia 649 00:39:52,960 --> 00:39:56,960 Speaker 1: and it killed twenty eight soldiers. Now the base had 650 00:39:57,000 --> 00:40:00,880 Speaker 1: detected the missile and had launched and fired a Patriot 651 00:40:00,880 --> 00:40:03,600 Speaker 1: missile in return. The purpose of the Patriot missile was 652 00:40:03,640 --> 00:40:06,759 Speaker 1: to intercept and destroy incoming missiles, and the way a 653 00:40:06,760 --> 00:40:09,719 Speaker 1: Patriot missile did this was to use radar pulses to 654 00:40:09,840 --> 00:40:14,000 Speaker 1: guide trajectory calculations so that it would end up getting 655 00:40:14,040 --> 00:40:17,200 Speaker 1: close to the incoming missile. This is harder than it sounds, 656 00:40:17,239 --> 00:40:20,520 Speaker 1: because both missiles are moving very very quickly, so it 657 00:40:20,560 --> 00:40:23,719 Speaker 1: needed very precise information in order to adjust its 658 00:40:23,760 --> 00:40:28,279 Speaker 1: trajectory properly and make sure it was on target. Now, 659 00:40:28,320 --> 00:40:31,160 Speaker 1: once it gets within range, which is between five and 660 00:40:31,400 --> 00:40:35,239 Speaker 1: ten meters I think, it would then fire out a 661 00:40:36,000 --> 00:40:39,440 Speaker 1: thousand pellets from the Patriot missile at high velocity with 662 00:40:39,520 --> 00:40:42,960 Speaker 1: the goal of causing the incoming warhead to explode prematurely. 663 00:40:44,600 --> 00:40:47,440 Speaker 1: In this case, the Patriot missile missed, and the military 664 00:40:47,440 --> 00:40:49,759 Speaker 1: investigated the issue in the wake of the loss of 665 00:40:49,800 --> 00:40:52,440 Speaker 1: life and found a problem with the software guiding the 666 00:40:52,480 --> 00:40:55,640 Speaker 1: Patriot missile.
And it was a problem that actually the 667 00:40:55,640 --> 00:40:57,880 Speaker 1: military kind of knew about already. So one of the 668 00:40:57,920 --> 00:41:00,959 Speaker 1: processes in the Patriot's programming was to convert time into 669 00:41:01,000 --> 00:41:06,880 Speaker 1: floating point values for increased accuracy. But not all subroutines 670 00:41:07,440 --> 00:41:12,080 Speaker 1: that depended on tracking time did this. Some of them 671 00:41:12,080 --> 00:41:16,640 Speaker 1: remained in clock units rather than floating point values, which 672 00:41:16,680 --> 00:41:19,840 Speaker 1: meant that they would get out of sync after a while. 673 00:41:19,880 --> 00:41:23,040 Speaker 1: There'd be a disagreement in various subroutines as to 674 00:41:23,480 --> 00:41:26,160 Speaker 1: how much time had actually passed. And like I said, 675 00:41:26,160 --> 00:41:28,440 Speaker 1: the military was aware of this issue and they had 676 00:41:28,480 --> 00:41:32,319 Speaker 1: a workaround, which was not ideal. The workaround was 677 00:41:32,760 --> 00:41:35,719 Speaker 1: you would occasionally reboot the system, which would reset the 678 00:41:35,719 --> 00:41:38,359 Speaker 1: clocks and synchronize them, but over time they would fall 679 00:41:38,400 --> 00:41:41,200 Speaker 1: out of sync because they're not tracking time the same way. 680 00:41:41,840 --> 00:41:44,359 Speaker 1: And since there was no hard and fast rule as 681 00:41:44,400 --> 00:41:47,680 Speaker 1: to how frequently you'd reset the system, problems like this 682 00:41:47,719 --> 00:41:50,040 Speaker 1: one were possible, and in fact, in this case it 683 00:41:50,080 --> 00:41:53,920 Speaker 1: did happen.
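To make that desynchronization concrete, here is a hypothetical Python sketch, not Patriot source code: one "subroutine" converts ticks to seconds through a chopped binary constant while another counts raw ticks. The 23-bit chop below is chosen so that the per-tick error matches the widely cited figure of about 0.000000095 seconds; the real register layout may have differed.

```python
# 1/10 has no finite binary expansion, so storing it in a fixed-point
# register requires chopping it, losing a tiny amount on every tick.
CHOPPED_TENTH = int(0.1 * 2**23) / 2**23   # ~0.099999905, error ~9.5e-8 s/tick

ticks = 10 * 3600                          # one hour of 0.1-second clock ticks

converted_clock = ticks * CHOPPED_TENTH    # subroutine using converted time
raw_clock = ticks / 10                     # subroutine counting raw ticks

drift = raw_clock - converted_clock
print(f"disagreement after one hour: {drift * 1000:.1f} ms")  # ~3.4 ms
```

A few milliseconds per hour sounds harmless, which is exactly why the periodic-reboot workaround seemed adequate; the danger only shows up after long uninterrupted uptimes.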
So prior to this particular incident, that specific 684 00:41:53,960 --> 00:41:57,560 Speaker 1: Patriot system had been running for one hundred hours without a reboot, 685 00:41:58,239 --> 00:42:02,640 Speaker 1: and the clock disagreement amounted to about one third of 686 00:42:02,680 --> 00:42:05,160 Speaker 1: a second. Now that seems like it's no time at all. 687 00:42:05,200 --> 00:42:07,920 Speaker 1: One third of a second is so, so short. But 688 00:42:08,000 --> 00:42:11,319 Speaker 1: a Scud missile's top speed is about one point one 689 00:42:11,440 --> 00:42:15,240 Speaker 1: miles per second or one point seven kilometers per second, 690 00:42:15,280 --> 00:42:17,520 Speaker 1: which means if you take a third of a second, 691 00:42:17,600 --> 00:42:21,279 Speaker 1: the missile could travel more than five hundred meters. And since the 692 00:42:21,320 --> 00:42:23,520 Speaker 1: Patriot needs to be within ten meters of a target 693 00:42:23,560 --> 00:42:27,359 Speaker 1: to destroy it, that resulted in a catastrophic failure. So 694 00:42:27,360 --> 00:42:30,720 Speaker 1: software bugs can be a matter of life or death. 695 00:42:30,800 --> 00:42:34,040 Speaker 1: It's not all just, hey, this irritating thing meant people 696 00:42:34,080 --> 00:42:38,240 Speaker 1: couldn't make long distance phone calls, or this issue 697 00:42:38,440 --> 00:42:41,919 Speaker 1: caused my computer to start writing massive amounts of data 698 00:42:41,920 --> 00:42:45,480 Speaker 1: to its hard drive.
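The arithmetic behind those figures can be checked against the numbers in the public GAO report on the incident (GAO/IMTEC-92-26); the per-tick error and Scud speed below come from that report, not from the episode:

```python
# Back-of-the-envelope reconstruction of the Dhahran Patriot numbers.
error_per_tick = 9.5e-8     # seconds of time lost per 0.1-second clock tick
ticks = 100 * 3600 * 10     # one hundred hours of uptime, ten ticks a second

drift = ticks * error_per_tick
print(f"accumulated clock drift: {drift:.3f} s")      # ~0.342 s

scud_speed = 1676           # meters per second, roughly Mach 5
print(f"tracking error: {drift * scud_speed:.0f} m")  # ~573 m
```

A tracking error of hundreds of meters against an intercept envelope of five to ten meters is why the radar's range gate ended up looking in the wrong place entirely.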
And this is why it's so 699 00:42:45,520 --> 00:42:51,160 Speaker 1: important to have really qualified QA personnel go through code 700 00:42:51,200 --> 00:42:53,400 Speaker 1: and make sure it's doing what it's supposed to do, 701 00:42:53,760 --> 00:42:56,600 Speaker 1: because the problems that can arise can be non-trivial 702 00:42:56,640 --> 00:42:59,560 Speaker 1: and in fact life or death situations depending upon the 703 00:42:59,560 --> 00:43:04,080 Speaker 1: application of technology. So technology is a fascinating thing. It's 704 00:43:04,120 --> 00:43:07,680 Speaker 1: a wonderful thing. It has benefited us in ways that 705 00:43:08,239 --> 00:43:11,520 Speaker 1: I can't even begin to describe. It's just too broad 706 00:43:11,560 --> 00:43:14,480 Speaker 1: a topic, and it's something I've been tackling for, you know, 707 00:43:14,520 --> 00:43:17,080 Speaker 1: eight years, and I haven't even gotten close to 708 00:43:17,120 --> 00:43:21,080 Speaker 1: getting toward the finishing point. So I don't want to 709 00:43:21,160 --> 00:43:24,799 Speaker 1: suggest that technology is bad, but we definitely have the 710 00:43:24,840 --> 00:43:28,200 Speaker 1: need to check, double check, and triple check all this 711 00:43:28,280 --> 00:43:31,520 Speaker 1: work to make certain things are working properly before we 712 00:43:31,560 --> 00:43:35,000 Speaker 1: release them out into the wild. That particularly applies if, again, 713 00:43:35,080 --> 00:43:40,440 Speaker 1: you are reusing old code or old components in a 714 00:43:40,480 --> 00:43:43,960 Speaker 1: new way, because you have to make absolutely certain that 715 00:43:43,960 --> 00:43:47,759 Speaker 1: there's not going to be some unintended problem that results 716 00:43:48,239 --> 00:43:52,120 Speaker 1: when a new form factor is using old code. And 717 00:43:52,200 --> 00:43:54,440 Speaker 1: that wraps up that classic episode of tech Stuff.
Hope 718 00:43:54,480 --> 00:43:57,920 Speaker 1: you guys enjoyed it. If you have any requests, questions, comments, 719 00:43:57,920 --> 00:44:00,080 Speaker 1: you can email me. The address is tech stuff at how 720 00:44:00,120 --> 00:44:02,120 Speaker 1: stuff works dot com, or you can reach out on 721 00:44:02,120 --> 00:44:04,000 Speaker 1: Facebook or Twitter. The handle for both of those is 722 00:44:04,040 --> 00:44:06,560 Speaker 1: tech Stuff h s W. Don't forget to go to 723 00:44:06,600 --> 00:44:09,839 Speaker 1: our website, that's tech Stuff podcast dot com. You'll find 724 00:44:09,840 --> 00:44:12,279 Speaker 1: a link to every episode we've ever recorded, plus a 725 00:44:12,320 --> 00:44:14,640 Speaker 1: link to our online store, where every purchase you make 726 00:44:14,800 --> 00:44:16,880 Speaker 1: goes to help the show. And we greatly appreciate it, 727 00:44:17,200 --> 00:44:24,480 Speaker 1: and I'll talk to you again really soon. Tech Stuff 728 00:44:24,520 --> 00:44:26,880 Speaker 1: is a production of iHeart Radio's How Stuff Works. 729 00:44:27,040 --> 00:44:29,840 Speaker 1: For more podcasts from iHeart Radio, visit the 730 00:44:29,960 --> 00:44:33,160 Speaker 1: iHeartRadio app, Apple Podcasts, or wherever you listen to 731 00:44:33,239 --> 00:44:34,160 Speaker 1: your favorite shows.