Speaker 1: Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. And you know what? Twenty-one years ago — man, that blows my mind — but twenty-one years ago, the world was very concerned that a clock, a digit changing from ninety-nine to zero-zero, was going to completely turn our technological world upside down. I'm talking about the Y2K problem. Some of you out there might be too young to even know what that is, or maybe you were born after two thousand and have no clue. But for those of us who were working in the nineties, this was a big deal. It was something a lot of people were worried about, and it all had to do with those two little digits at the end of the year. And we're going to learn about why Y2K didn't end the world, because — spoiler alert — it didn't. So let's listen back to this episode, which originally published in January two thousand fourteen.

Like someone cut — and now things aren't working — also building up to something terrifying. Yeah, it's gonna build up to a nonstop replay of Prince over and over again, which starts out awesome, but as it goes on, let me tell you, that gets old. Okay, we're talking about the Y2K bug, obviously, people. Yes, yes, that is what our episode is about today. All of our fans who hate it whenever I do cheesy humor: I apologize. I don't really apologize; it is who I am. So the reason we're talking about the Y2K bug years after the whole issue happened is because we ask you guys what you want to hear, and in this case a listener named James sent us a message on Twitter and said, hey guys, you should do an episode on the Y2K bug. Heart heart, James. Well, James, we heart heart you too. Now we're going to do our episode on the Y2K bug.
Speaker 1: It's a pretty interesting story, because it's one of those things that really illustrates a few basic facts about computing and about human nature in general. One of them is that when something new is created, no one around has any idea how long it's going to last, and no real appreciation that the things they're doing then might last well into the future. Especially things like computer programming — no one in, say, the nineteen sixties or seventies was expecting any of the programs they were writing to last for forty years. Yeah, computers were developing very quickly, and the general thought was, this is changing so fast that programming is going to change at a crazy speed too. But as it turns out, while the hardware changed, the practices that were established early on remained pretty much standard. And a lot of that old programming would find its way into subsequent generations of software, so even if it wasn't something people continued to do later on, there would still be these old fragments of code incorporated into newer stuff.

Now, we're kind of dancing around what that old thing was. Oh, the old thing, of course, being — I just got really excited that I knew the answer to this — being the digits in the year. Yeah. So here was the issue: back in the fifties and sixties, programmers had to put in a code for the year, which is important for certain types of calculations — anything that's time-based. Obviously you need a way of recording the time so that you can compare times from different points and base your calculations on that difference. For example, when people have been depositing paychecks. Right — if you have a bank account that earns interest, for example, time is obviously a factor there.
Speaker 1: It's not just the amount of money you've been continuously putting into or taking out of that bank account; it's also the amount of time since you established it. And there are some complicated calculations that are very time sensitive, so you have to have that kind of stuff built into your algorithm. Right, or in other cases, records of dates of birth, or dates of medical surgery, or all kinds of things. Yeah — so many different applications, to the point where even technologies you wouldn't imagine would ever need to know what year it is had this stuff built into them.

And here's the problem when you have two digits for your year. Computer programming was getting started in the nineteen fifties and nineteen sixties, and they figured, hey, we've got practically half a century before we have to worry about two digits turning into zero-zero. Clearly we're going to totally fix this later. And computer memory right now is incredibly expensive, so let's be really conservative and just use two digits for the year, and we'll be fine until these other problems work themselves out. Oh, and computer memory was so precious that saving two digits per date — especially across, for example, an entire spreadsheet full of interest calculations — was big. So just by doing month-month, day-day, year-year, just two digits each, you could save yourself a huge amount of time and hassle at that moment. Right, because think about just a few years ago how expensive it was to buy, say, a terabyte hard drive compared to today, when it's much more affordable. Well, as opposed to when I was a kid, when a terabyte was a completely unimaginable amount of information. Yeah, when I was a kid, I couldn't imagine ever filling up a megabyte of space.
Speaker 1: So as time has gone on, memory has become less and less of a problem, in the sense that we were able to make more of it more affordably. Back then it was very expensive and precious stuff that you only had so much of, and it was expensive to use. So cutting the year down to two digits made sense at the time. But the problem was that when you roll over from nineteen ninety-nine to two thousand, in computer terms that goes from ninety-nine to zero-zero, and people weren't really sure what was going to happen. Right — would the computer think that it was all of a sudden nineteen hundred? Would that completely bork all of your calculations for, for example, interest rates? Yeah, or the age of a person. So, your example: if it's figuring out age by subtracting the date of birth from the current date — let's say it's ninety-nine and you were born in ninety; that's pretty easy, you're nine years old. Okay, got it. But then let's say it goes to zero-zero and you were born in ninety. So now it's zero-zero minus ninety, and suddenly it's like, am I getting negative numbers? Because a negative age doesn't make sense. And so you could have all sorts of computer problems, ranging from the financial industry to health care to pretty much everything that had any sort of code in it that included the year.
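[A minimal sketch of the two-digit-year arithmetic described above, assuming a program that stores only the last two digits of the year; the function and variable names here are illustrative, not from any particular system.]

```python
# Two-digit year fields, as many programs of the era used them (values 0-99).
def age_from_two_digit_years(current_yy: int, birth_yy: int) -> int:
    """Naive age calculation: current year minus birth year."""
    return current_yy - birth_yy

# In 1999 ("99"), someone born in 1990 ("90") comes out fine:
print(age_from_two_digit_years(99, 90))   # 9

# After the rollover, 2000 is stored as "00" and the same math breaks:
print(age_from_two_digit_years(0, 90))    # -90, a nonsensical negative age
```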
Speaker 1: That extends to things like elevators that had microchips in them. Yes — elevators. That was a real concern. People were like, I do not want to be in an elevator on New Year's Eve, because you don't know if that thing is going to make it to the floor you want. Right. And of course they weren't afraid that the elevator was going to slow down to some negative rate of motion or anything like that; they were afraid that the code in the microchip crashing would, I don't know, cause a fire and make the elevator drop, or just stop entirely and refuse to open. People had a lot of uncertainty about exactly what was going to happen to code, and whether or not it was going to crash an entire system when the year changed over.

Right. And so this fear started to rear its head in the nineteen nineties, and it really reached a fever pitch toward the end of the decade. That was when, I think, the general public became really aware of it, thanks to complete media oversaturation. Yeah, it got a little crazy — maybe a lot crazy, depending on where you lived. In the United States it certainly became crazy. So you had computer scientists and programmers who were saying, earlier than this, hey guys, maybe we should fix this. This is a problem, and instead of perpetuating it across multiple industries ad infinitum, maybe we should address it and establish a new rule going forward. Computer memory is not such a big deal now, so why don't we fix it before we get — hello? Anyone? Is this thing on? Hello? And the problem was that a lot of people didn't listen until it started getting closer to two thousand, and people began to really worry about the possibility that this could bring about, if not some sort of technological armageddon, at least a lot of glitches and problems that could have been avoided. So then they had to say, well, what are we going to do about it?
Speaker 1: The obvious solution was also the most time-consuming and expensive one, which was to manually go through and start updating code, changing it to a four-digit year instead of a two-digit year, and thus extending its usefulness until at least the year nine thousand nine hundred ninety-nine. Right. Yeah — I mean, the alternative is to recode so that programs would recognize that zero-zero probably meant two thousand instead of nineteen hundred. But A, that's a less effective solution, and B, you'd just need to change it over again at the next turn of the century — not that those same programs would probably still be in use, but you never know. (There's a small sketch of that two-digit "windowing" idea after this exchange.) Right. The thing is, you have these legacy systems that certain companies rely on that were originally programmed, you know, thirty or forty years ago, and they continue to rely on them because they do exactly what the company needs them to do.

Okay, so either way, these changes might have to be entered by hand thousands of times, or hundreds of thousands of times, in a single program, and each change then has to be tested against errors. Of course, eventually code was developed to help automate the process, but it was just a big undertaking. Yeah. You might remember, if you watched the documentary Office Space, that the characters at Initech — the company in Office Space — that was their job: they went into other companies and helped update their code to meet the Y2K issue. Which kind of raises another question, which is, what was this company going to do after the year two thousand? But at any rate, that was actually a very real concern. I mean, Office Space is a great parody of all of that, sure, but there was a concern that, with all of these extra programming jobs that were being created, businesses would crash and burn. And some of them did.
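[The "recognize that zero-zero probably means two thousand" fix mentioned above is usually called date windowing: pick a pivot and interpret two-digit years on either side of it. A rough sketch, with a pivot chosen purely for illustration:]

```python
# Date windowing: interpret two-digit years relative to a pivot so old data
# keeps working without widening every stored field.
PIVOT = 30  # illustrative choice: 00-29 -> 2000s, 30-99 -> 1900s

def expand_two_digit_year(yy: int) -> int:
    """Map a two-digit year onto a four-digit year using the pivot."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(expand_two_digit_year(99))  # 1999
print(expand_two_digit_year(0))   # 2000
print(expand_two_digit_year(14))  # 2014
```

[As the hosts note, this only postpones the ambiguity: under this pivot, data from 1930 and data from 2030 would collide.]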
Speaker 1: I mean, most of them just moved on to other things and found freedom in not having to do this incredibly tedious work anymore. Right. And then there was a lot of other crashing and burning in the tech industry for unrelated reasons — that was the whole dot-com bubble burst — but it didn't have anything to do with Y2K directly.

So here's the other problem: a lot of these programs, even if two thousand was going to be fine, even if they could recognize that two thousand was a four-digit year, didn't necessarily recognize that two thousand was going to be a leap year. And here's the reason why. Leap years actually follow an algorithm, a set of rules. The basic rule is that every four years you add an extra day, a leap day, at the end of February, to balance out the calendar year with the solar year, because the solar year is close to three hundred sixty-five point two five days — not quite point two five, almost point two five, which is important. Yeah. So if you stretch out over an incredibly long time for us humans — let's say a few centuries — your calendars will start to become misaligned, because it's not quite three sixty-five point two five days in the solar year. So that means occasionally you have to skip the leap year. And the way the rule goes — let's see if I can get this right — if the century year is divisible by one hundred but not by four hundred, it is not a leap year; if it's divisible by both one hundred and four hundred, it is a leap year. So in other words, seventeen hundred, eighteen hundred, and nineteen hundred were not leap years. Sixteen hundred was, because sixteen hundred is divisible by four hundred. Two thousand is also divisible by four hundred, so it should be a leap year.
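[The Gregorian rule the hosts just walked through, written out as a short sketch:]

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: every 4th year, except century years not divisible by 400."""
    if year % 400 == 0:
        return True
    if year % 100 == 0:
        return False
    return year % 4 == 0

print(is_leap_year(1900))  # False: divisible by 100 but not by 400
print(is_leap_year(2000))  # True: divisible by 400
# A system that reads "00" as 1900 takes the wrong branch and skips
# February 29, 2000 entirely.
```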
Speaker 1: However, because you've just got zero-zero as the digits, if the computer thinks it's nineteen hundred, the computer also knows the rule that nineteen hundred is not a leap year. So it says, hey, this big zero-zero kind of number is totally not a leap year — so it's only three hundred sixty-five days, not three hundred sixty-six, and we don't have a February twenty-ninth this year, is what it would say. But there totally was a February twenty-ninth that year, which meant that other calculations would get thrown off, because they wouldn't take that leap day into account. So all these calendar applications also had to be corrected. Suddenly people were like, oh boy, this is a big old mess here; we've got to fix this. And so a lot of time and effort and attention was directed to this.

And there was a third problem as well, wasn't there — having to do with all of the nines. Oh yeah — I totally forgot about that; I'm glad you brought it up. Yes. So, okay, in the old days — children, gather around the digital fireplace, you know, if Netflix still has that digital fireplace, start it up — back in the old days, children, sometimes programmers, in order to designate the end of a program, would just type out a string of nines. It was essentially just the code to say, this is where stuff ends, y'all. So you had a date coming — September ninth, nineteen ninety-nine — that, if you were to write it out, would be a lot of nines. And the worry was that certain programs would see that as meaning "this is where stuff stops" and would stop working. So you had a lot of digit problems here. Some of this you could chalk up to a kind of jerry-rigged convention — "this is how I'm going to designate the end of a program" — that was just kind of arbitrarily chosen. Some of it was more of a practical consideration, the idea of "we need to save time and money, so therefore we're shortening this year to two digits."
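[A hypothetical illustration of that all-nines worry: some old data formats used a run of nines in a date field as an end-of-data or "never expires" marker, and September 9, 1999, keyed loosely as month, day, and a two-digit year, can look exactly like that marker. The file layout below is invented for the example.]

```python
SENTINEL = "9999"   # "all nines" end-of-data marker used by our imaginary format

def record_key(month: int, day: int, yy: int) -> str:
    """Loosely formatted date key with no zero padding, as in some legacy data."""
    return f"{month}{day}{yy}"

for m, d, yy in [(9, 8, 99), (9, 9, 99)]:
    key = record_key(m, d, yy)
    if key.startswith(SENTINEL):
        print(key, "-> mistaken for the end-of-data marker; record dropped")
    else:
        print(key, "-> processed normally")
```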
Speaker 1: In either case, it ended up meaning lots and lots of work for people in the late nineteen nineties, and it got a lot of attention. I mean, there were industries that were already taking advantage of the time in the nineties to address this. The software industry was way ahead of the game. Oh yeah — early on, I think, only a few people were on top of it, but certainly by the last few years before the deadline, a lot of people had already corrected the problem. Right, so the software being produced from that point forward had already addressed it. Now, granted, there was still software out there from before that had this old code in it, but the new code coming out of the software industry had adjusted for this kind of problem.

But there were other industries that were lagging behind. In fact, according to one study — the Cap Gemini America consulting firm did a study — state and federal government systems were the furthest behind. And when you think of all the information that state governments and the federal government here in the United States require to operate — things like taxes, or infrastructure. Infrastructure, yeah: your water systems, all sorts of stuff that relies on computer systems, spy satellites, communications, everything. There are entire industries that are dependent either completely or in part on state and federal systems. All of those were at risk, because those systems were the furthest behind; they had made the least amount of progress on addressing the Y2K problem. It was such a huge deal that the president at the time — it was Bill Clinton — signed the Year 2000 Information and Readiness Disclosure Act into law, and that was designed to create a collaborative environment among multiple industries.
Speaker 1: So that if one industry developed the best practices and tools to address the Y2K problem, there was an incentive to share that information across other industries. It's not a competition; it's, hey, let's all get this done together. Kind of like, I would like my stuff to continue not being on fire; how about I give this information to you guys, and maybe that will decrease the chance that my stuff will be on fire in two thousand. And that was a big motivator. As it turns out, it helped a lot.

There were other areas of the world that were also being very responsive to this. The European Commission issued a report about Y2K to the European Union member countries that got them all on the same page. The British government announced that the British military would be on hand to assist local police forces in the event of emergency services breaking down as a result of the Y2K problem. And there was so much hype — yeah, well, we'll talk more about hype in a second. The United Nations held a conference on it; they were trying to facilitate more sharing of information. But you know, that's cool — that's not hype, that's preparedness. Right. Well, they were particularly worried about a lot of regions in Asia that were, at least, thought to be behind the curve on addressing the Y2K problem. So they wanted to make sure that everyone in the world had an equal chance of catching up, so that they could minimize any effects the Y2K problem might have. Now, keep in mind, this was still at a time when no one was really sure what was going to happen, at least not on a global scale.
Speaker 1: There were some people who were saying, well, you know, this system over here is probably going to be okay, because it's not really critical, and even if it were, it would just be something we could adjust by writing a couple of extra lines of code to correct the problem. For other systems, people were like, I don't know if that airplane will stay in the air. I mean, that wasn't a legitimate fear. For some people? I don't know. Okay, maybe it was a fear that people absolutely had. And, you know, on a person-to-person basis, the amount of panic varied, probably depending on how much media they had consumed about it and how excitable they were to begin with. I want to say that toward the end, a lot of the media coverage leaned more toward the satirical, the kind of jokey "world's gonna end next month" stuff — less of the actual fearmongering style and more of the "no one's really sure, but the worst-case scenario could be..." kind of thing. So it wasn't, at least, as bad as "next month everything you know will be different because nothing's going to work, start building your bomb shelter now." There weren't a whole lot of serious reports coming out like that. And I'm sure The Onion had a lot of fun with it.

We'll hear more about the Y2K problem and why it wasn't as big a calamity as we were originally expecting, but first, let's take a quick break.

All right, so we're back. Now, the clock is ticking down. I know you've been waiting — waiting all episode — to learn what would happen when the year turned to two thousand. What happened? Okay, so technically we're all still here, so I guess we can draw some conclusions right off the bat. And I mean, this event is in our relatively recent past; it really wasn't that many years ago.
Speaker 1: And most of you guys probably remember this — some of you. And some of you folks who are maybe in middle school or whatever, maybe this is all new to you, in which case, hey, welcome to the ridiculous panics that the rest of the world went through before you were born. Your parents were silly.

Yeah. So, as it turns out, a lot of the work that was being done leading up to two thousand was successful. And there was a lot of work. One estimate said that globally the world spent about three hundred billion dollars — that's billion with a B — to address the Y2K problem, and a little less than half of that was spent in the United States alone. And that ranged from everything from wide computer networks to, like we were saying, the microprocessors that control things like microwaves. And really, in that case, it was more a matter of testing: if you were to digitally alter the clock of the machine, would it continue to operate properly, that kind of stuff. Right. And in most cases things were absolutely fine. And people kind of knew that — I mean, there was a little bit of this media frenzy, but AP polls indicated that Americans expected minor problems at worst, though some thirty percent had planned stockpiles just in case. Yeah — the food, the money, that kind of thing. They weren't necessarily creating an armed militia, although there was some of that going on too at the time. But it's one of those things where I think a lot of people were jokingly saying, yeah, everything's gonna be fine, nothing's gonna be a problem — but then, you know, just in case, I think I'm gonna take it easy this year, or just that day specifically, New Year's Eve into two thousand, to make sure we don't put ourselves in danger unnecessarily. But nothing's going to happen, you know. Cautious optimism is probably how I would describe it.

Yeah. And basically, none of those big doomsayer things — the worldwide power failures, the total breakdown of transportation infrastructure, the planes falling out of the sky — none of that happened. Now, to be fair, one reason a lot of that may not have happened is because so much work was done addressing the problem. Right. I still don't think that every computer in the world would have simultaneously caught on fire and started eating your face. No, the likelihood of that was very low. Pretty low. I mean, some other spooky stuff would have had to be going on — we're talking paranormal-activity stuff at that point. But here's the problem with assessing how Y2K worked out: a lot of people said, oh, it was a lot of worry over nothing, nothing really big happened. But part of that was because so much work had been done to address the issue on a code level, to make sure that the code in some very critical systems was updated to not have this problem. So you could argue that the reason there wasn't a problem was because we caused such a fuss in the first place. Sure. It's also possible that if we had never done anything, and someone in two thousand had said, hey guys, I just thought of something we probably should have thought about before — everything's fine now, but you know what could have happened? — sure, that might have happened too. Like the Clue ending: yeah, this is what really happened, but here's what could have happened.
Speaker 1: That's kind of the opposite of the Clue endings, but yes. So, you know, it's hard — it's impossible to say in hindsight how it would have turned out had we done nothing. I imagine we would have seen a lot of glitches in systems that would have been time-consuming to fix. And we did see some glitches, right? It wasn't like everything went off without a hitch. Right. Well, okay, most of the glitches were kind of preemptive. Some large chemical plants and oil pipelines were shut down preemptively during the transition and then rebooted. Service was suspended on major freight railroads and on Amtrak on New Year's Eve for a final round of equipment and signal checks. And the workload on programmers over the previous couple of years had been increased — something like twenty-six percent — in order to solve the problems. So that was an effect, anyway.

Yeah, and some of the problems that did come up were very comical in nature. Yeah — I mean, okay, there were legitimately a few hundred reports of errors among small businesses, but most of them were resolved within a matter of hours after they had been reported. There was the temporary shutdown of a Defense Department ground station that processed info from a spy satellite, but it didn't have any major consequences. Yeah, and there were a couple of really good ones. Okay, so this was when Al Gore was the vice president — this is my favorite of the Y2K problems, by the way. For a minute, his town hall web page informed visitors that it was January third of some impossible year if they arrived via Netscape, and January third, nineteen thousand — I'm sorry, that was if they were coming in via Futurama. We know that by that time Al Gore's head is in a jar, so maybe it was accurate. It could be.
Speaker 1: It could have been — like this was a glimpse into the Futurama future. And there was a glitch involving the New York Times. Oh, I love this one. Dude, no, this one's my favorite; I retract my earlier statement. They can both be your favorite. Okay. So there's a telephone service that would read an automated selection of the New York Times and other newspapers to New Yorkers with vision problems, and it informed clients that they would be hearing the January third, nineteen hundred issue. So before we started recording, I said I could just imagine the top headline: "Dirigible races reach inevitable draw for fourth year running." I think I kind of want that alternate history. Right? What would have been amazing is if it had actually read the headlines from January third, nineteen hundred. Now, that's not what happened; it just had the date wrong. The actual content was the same — it was the one for January third, two thousand. It's not like the computer glitched, went and looked up some microfiche, brought it back, and read it out. I wish that had happened, though. That would have been so awesome. That would have been delightful.

There was other stuff, too. Some legal battles arose over all of this. Xerox, Nike, Unisys, and a few other major companies sued their insurers for reimbursement, having spent hundreds of millions of dollars on these repairs, citing language from nineteenth-century business contracts wherein insurers had to repay ship owners for money spent trying to prevent a ship from sinking. Yeah — an interesting citing of precedent. Yeah, that didn't work out so well.
Speaker 1: The suits generally settled on the side of the insurance companies, because — I think arguing that "because nothing bad happened, we shouldn't have been forced to prevent something bad from happening" is a weird argument, because if nothing bad happened, that's possibly proof that the thing you had to do worked. Oh, well — and even if you did spend that money preventing a ship from sinking, or preventing a computer from crashing, in this particular case the companies had seen the ship sinking several years before they actually informed the insurance company that it was an issue. And so in that case the courts were like, you knew about this beforehand; this is stuff that you had to take care of.

And there were some other practical outcomes, just from the way people had reacted to Y2K and started stockpiling stuff. Once the new year happened and society did not crumble, a lot of people returned space heaters — so many that Sears started charging a restocking fee — because so many people had been worried that the infrastructure would be gone, that they wouldn't have gas or electricity. And then once those problems went away — once two thousand came around and everything was fine — they were like, well, I don't really need this anymore. Charity groups collected a lot of extra canned goods that year. And not so many people traveled by airplane on New Year's Day that year — not that many people travel on New Year's Day to begin with, but even statistically fewer. Yeah, there were some of those people who were worried about that whole airplane-dropping-out-of-the-sky thing. And here's the thing: while this Y2K problem sounds like, well, sure, it happened once, it'll never happen again —
Speaker 1: We've got more to say about Y2K, but that's gonna have to wait till after this break.

All right. So there are multiple other problems like the Y2K problem. They're all time-dependent, and they're all code-dependent. However, the year in which each one hits its big problem is different from one example to another, mostly because engineers have a wicked sense of humor. Well, I'm not sure about that. Okay, so I don't code — I'm not a programmer, I've never used any kind of back-end sort of thing; I know how to make things bold in HTML on my own, but that's about it. But apparently, in various programs, the beginning of time starts on various dates. Yeah, the beginning of time tends to be the date that whatever it was was created or put into action, although not all the time, and it can depend on the numerical system the coding is using. I know that for IBM PCs, the beginning of time is January first, nineteen eighty, and the time itself goes up in seconds. So the second is the base unit for this whole thing, and it's stored as a thirty-two-bit integer. That means that if you do the math — it's a thirty-two-bit integer, each second is another increment, so every second that passes it goes up by one — if you're limited to thirty-two bits and your starting date is January first, nineteen eighty, you can extend that out, and you see that in two thousand one hundred sixteen you hit the limit of the integers you have. You're no longer able to go up without rolling over. It's kind of like those old — all right, gather around that digital fireplace, children — in the old days we had digital pinball machines, and once you hit a high score at a certain level, it would turn over, meaning it would go back to zero. I actually did that on the Star Trek one; I'll tell you about it some time.
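[A quick back-of-the-envelope check of the rollover described above, assuming an unsigned thirty-two-bit count of seconds starting January 1, 1980; the exact width and signedness vary by system, so treat this as a sketch.]

```python
from datetime import datetime, timedelta

epoch = datetime(1980, 1, 1)          # the starting date mentioned above
max_seconds = 2**32 - 1               # largest value an unsigned 32-bit counter holds

# One tick per second from the 1980 epoch runs out early in the 22nd century.
print(epoch + timedelta(seconds=max_seconds))   # 2116-02-07 06:28:15
```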
Anyway. I was... Yeah, 576 00:30:54,600 --> 00:30:57,000 Speaker 1: I had seventeen free replays by the end of that. 577 00:30:57,080 --> 00:30:59,640 Speaker 1: I ended up leaving because I couldn't keep playing all day. 578 00:30:59,680 --> 00:31:02,120 Speaker 1: I was in college at the time. Money well spent, 579 00:31:02,200 --> 00:31:04,920 Speaker 1: Mom and Dad. Anyway, two thousand, one hundred sixteen 580 00:31:04,960 --> 00:31:08,360 Speaker 1: is when those integers will reach the limit, meaning that, 581 00:31:09,160 --> 00:31:11,160 Speaker 1: uh, we don't know what's going to happen after that. It's 582 00:31:11,160 --> 00:31:14,440 Speaker 1: not gonna be able to make these time dependent 583 00:31:14,520 --> 00:31:18,360 Speaker 1: calculations accurately anymore, because it won't be able to track 584 00:31:18,480 --> 00:31:21,200 Speaker 1: time in a way that makes sense to 585 00:31:21,240 --> 00:31:24,400 Speaker 1: the computer anymore. So you would think, oh, well, clearly, 586 00:31:24,440 --> 00:31:26,120 Speaker 1: all right, so we've got the Y two K problem 587 00:31:26,280 --> 00:31:28,960 Speaker 1: and the two thousand, one hundred sixteen problem with IBM PCs, 588 00:31:28,960 --> 00:31:33,560 Speaker 1: but after that we're okay, right? Well, uh, well, 589 00:31:33,600 --> 00:31:38,840 Speaker 1: Windows NT sets the beginning of time as January one, 590 00:31:38,440 --> 00:31:43,000 Speaker 1: sixteen oh one. So apparently they were thinking, like, okay, Shakespeare 591 00:31:43,560 --> 00:31:47,120 Speaker 1: would totally use Windows NT just before, you know, shortly 592 00:31:47,160 --> 00:31:50,880 Speaker 1: before he dies, so clearly he would have written, you know, 593 00:31:51,120 --> 00:31:54,760 Speaker 1: some of his greatest plays using a machine running Windows NT. 594 00:31:54,880 --> 00:31:57,080 Speaker 1: So let's start the... I have no idea why they 595 00:31:57,120 --> 00:32:00,959 Speaker 1: chose that date. Yeah, and okay, so it uses a sixty 596 00:32:00,960 --> 00:32:04,280 Speaker 1: four bit integer to track time, so way more values, 597 00:32:04,360 --> 00:32:06,680 Speaker 1: you know, twice as many bits as the thirty two 598 00:32:06,680 --> 00:32:10,000 Speaker 1: bit integer that the IBM PC used, and furthermore uses a hundred 599 00:32:10,080 --> 00:32:14,760 Speaker 1: nanoseconds as its increment. So its problem is way out 600 00:32:14,840 --> 00:32:19,560 Speaker 1: there, right. So here's the thing. It covers a 601 00:32:19,640 --> 00:32:22,600 Speaker 1: much greater span of time, right, because it starts in 602 00:32:22,680 --> 00:32:27,000 Speaker 1: sixteen oh one and it won't end until after the year thirty thousand, so that's later. 603 00:32:28,920 --> 00:32:31,480 Speaker 1: But because of that hundred nanosecond increment, that eats 604 00:32:31,560 --> 00:32:33,640 Speaker 1: up those integers pretty quickly. If it had done it 605 00:32:33,720 --> 00:32:36,880 Speaker 1: as a one second increment, it would extend much further out. 606 00:32:37,320 --> 00:32:41,719 Speaker 1: But hey, good news for Apple users. According to Apple, 607 00:32:41,920 --> 00:32:47,320 Speaker 1: um, the Mac is okay out to the year... uh, yeah. 608 00:32:47,720 --> 00:32:50,320 Speaker 1: Um, so, I mean, not that it matters, because you're 609 00:32:50,320 --> 00:32:52,280 Speaker 1: gonna update all your stuff every year anyway, you 610 00:32:52,880 --> 00:32:57,680 Speaker 1: Apple fanboys.
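For a sense of that trade-off between a finer tick and a longer range, here is a rough back-of-the-envelope calculation in Python under the scheme described above: a signed sixty-four-bit counter of hundred-nanosecond ticks starting at January 1, 1601. The constants are illustrative arithmetic, not values pulled from any vendor's documentation.

```python
# A rough back-of-the-envelope comparison (illustrative arithmetic only, not
# Microsoft's actual implementation): a signed 64-bit counter of 100-nanosecond
# ticks starting at January 1, 1601, versus the same counter ticking once per second.
TICKS_PER_SECOND = 10_000_000          # one tick every 100 nanoseconds
MAX_SIGNED_64 = 2 ** 63 - 1            # largest value a signed 64-bit integer holds
SECONDS_PER_YEAR = 365.2425 * 24 * 3600

years_at_100ns = MAX_SIGNED_64 / TICKS_PER_SECOND / SECONDS_PER_YEAR
years_at_one_second = MAX_SIGNED_64 / SECONDS_PER_YEAR

print(f"100 ns ticks: about {years_at_100ns:,.0f} years of range past 1601")
print(f"1 s ticks:    about {years_at_one_second:,.0f} years of range")
```

At one hundred nanoseconds per tick, the sixty-four-bit range covers on the order of twenty-nine thousand years; at one tick per second, the same counter would last for hundreds of billions of years, which is the "much further out" point made above.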
I'm saying that with love. I'm not entirely 611 00:32:57,720 --> 00:33:01,600 Speaker 1: positive that you are. Really. My wife has an iPhone. 612 00:33:01,640 --> 00:33:04,320 Speaker 1: I love her. I've got a Mac. I mean, 613 00:33:04,360 --> 00:33:06,920 Speaker 1: granted, my Mac is like eight years old at 614 00:33:06,920 --> 00:33:10,320 Speaker 1: this point. It might actually be an Apple computer, not 615 00:33:10,440 --> 00:33:14,200 Speaker 1: a Mac. It might have devolved. But anyway, yes, at 616 00:33:14,240 --> 00:33:16,040 Speaker 1: any rate, all of these problems are going to be 617 00:33:16,080 --> 00:33:17,760 Speaker 1: a little bit easier to fix than the Y two 618 00:33:17,800 --> 00:33:21,080 Speaker 1: K problem, right. And it's one of those things 619 00:33:21,080 --> 00:33:23,040 Speaker 1: where the Y two K problem, that was something that 620 00:33:23,120 --> 00:33:27,960 Speaker 1: was so, uh, grounded in the very basic code that 621 00:33:28,200 --> 00:33:32,160 Speaker 1: so many different, uh, systems were using, that the 622 00:33:32,240 --> 00:33:34,560 Speaker 1: scope of it was enormous, right. And we didn't have 623 00:33:34,600 --> 00:33:38,000 Speaker 1: the tools available then that we do today for 624 00:33:38,120 --> 00:33:41,440 Speaker 1: going in and addressing and propagating fixes. Not only that, but we've 625 00:33:41,440 --> 00:33:44,480 Speaker 1: got a greater time scale for all of these problems. 626 00:33:44,480 --> 00:33:48,200 Speaker 1: It's not something that's, you know, five years away. Although 627 00:33:48,720 --> 00:33:50,840 Speaker 1: we can't just have the attitude of, oh well, that's, 628 00:33:50,920 --> 00:33:52,680 Speaker 1: you know, that's like twenty more years, we don't need 629 00:33:52,720 --> 00:33:55,360 Speaker 1: to worry about that. No, we should definitely take the 630 00:33:55,400 --> 00:33:58,040 Speaker 1: steps to address these issues. So, yes, it is one 631 00:33:58,040 --> 00:34:00,320 Speaker 1: of those things where we see it over and over again. 632 00:34:00,360 --> 00:34:02,400 Speaker 1: Does it mean that we are done, that no one 633 00:34:02,520 --> 00:34:05,280 Speaker 1: is ever going to make this kind of mistake just 634 00:34:05,400 --> 00:34:12,280 Speaker 1: for the sake of convenience or efficiency or economics, 635 00:34:12,640 --> 00:34:16,839 Speaker 1: that we're never gonna make a plain old mistake? No, we're human. 636 00:34:16,920 --> 00:34:19,439 Speaker 1: We make mistakes. That's kind of our thing. They're 637 00:34:19,640 --> 00:34:23,000 Speaker 1: warm and fuzzy. It's us. Yeah, well, you know, 638 00:34:23,320 --> 00:34:26,080 Speaker 1: we're good at that, making mistakes. I am great. 639 00:34:26,239 --> 00:34:28,440 Speaker 1: I'm like, I learn from my mistakes. I can repeat 640 00:34:28,440 --> 00:34:32,279 Speaker 1: them almost exactly. So, um, yeah, it's something that we'll 641 00:34:32,320 --> 00:34:34,839 Speaker 1: probably see pop up again. And of course those people 642 00:34:34,880 --> 00:34:37,399 Speaker 1: will eventually be ridiculed, like, don't you remember Y 643 00:34:37,400 --> 00:34:39,919 Speaker 1: two K? And, uh, you know, we'll just 644 00:34:40,000 --> 00:34:44,359 Speaker 1: relive this drama multiple times.
But hey, some of these 645 00:34:44,400 --> 00:34:46,040 Speaker 1: are problems that are so far in the future that 646 00:34:46,080 --> 00:34:47,759 Speaker 1: it's our descendants that are gonna be worried about 647 00:34:47,760 --> 00:34:50,640 Speaker 1: them, unless we find some digital immortality or something. Yeah. 648 00:34:50,640 --> 00:34:52,399 Speaker 1: See, there you go. I think that what all 649 00:34:52,440 --> 00:34:54,480 Speaker 1: of this is not taking into consideration is that we 650 00:34:54,520 --> 00:34:58,920 Speaker 1: are clearly going to hit the singularity in a few decades. Yes, 651 00:34:59,640 --> 00:35:03,879 Speaker 1: you know, we're rapidly approaching what Kurzweil said 652 00:35:03,920 --> 00:35:06,719 Speaker 1: would be the singularity. And I'm a little skeptical right now, 653 00:35:06,840 --> 00:35:09,160 Speaker 1: but hey, I could be proven wrong. But you know, 654 00:35:09,200 --> 00:35:11,520 Speaker 1: when we see problems like this rise up, it does 655 00:35:11,600 --> 00:35:14,600 Speaker 1: make you wonder about that singularity and think that 656 00:35:14,880 --> 00:35:17,480 Speaker 1: maybe that would only be really super awesome for a 657 00:35:17,600 --> 00:35:21,200 Speaker 1: relatively short time until our code ran out. I 658 00:35:21,200 --> 00:35:24,040 Speaker 1: hope you guys enjoyed that classic episode of tech Stuff 659 00:35:24,080 --> 00:35:25,960 Speaker 1: and you learned a little bit about Y two K, 660 00:35:26,200 --> 00:35:28,960 Speaker 1: and you wonder, you know, are we going to see 661 00:35:29,000 --> 00:35:33,000 Speaker 1: this happen again, because that's a fear. And, uh, it 662 00:35:33,160 --> 00:35:36,120 Speaker 1: really does show that sometimes a shortcut is not 663 00:35:36,200 --> 00:35:40,480 Speaker 1: the best way to go about things. Sometimes it pays 664 00:35:40,520 --> 00:35:43,640 Speaker 1: off to, you know, take the long route. If you 665 00:35:43,719 --> 00:35:46,279 Speaker 1: have any suggestions for future episodes of tech Stuff, get 666 00:35:46,320 --> 00:35:47,919 Speaker 1: in touch with me and let me know what those 667 00:35:48,040 --> 00:35:49,640 Speaker 1: might be. It can be a technology, it could be 668 00:35:49,680 --> 00:35:52,120 Speaker 1: a company, it could be a trend in tech. The 669 00:35:52,160 --> 00:35:54,240 Speaker 1: way to do that is to go over to Twitter 670 00:35:54,520 --> 00:35:57,280 Speaker 1: and send me a message. The handle that you should 671 00:35:57,719 --> 00:36:01,040 Speaker 1: use to contact me is tech stuff h s w, 672 00:36:01,480 --> 00:36:08,959 Speaker 1: and I'll talk to you again really soon. Yeah. Tech 673 00:36:08,960 --> 00:36:12,400 Speaker 1: Stuff is an I Heart Radio production. For more podcasts 674 00:36:12,440 --> 00:36:15,160 Speaker 1: from I Heart Radio, visit the I Heart Radio app, 675 00:36:15,320 --> 00:36:18,480 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.