Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. Hey, and welcome to the podcast. I'm Josh, and there's Chuck, and this is Stuff You Should Know. This is a really good one too. I don't want to, I don't want to get ahead of ourselves, because we haven't recorded it yet. But I think this is going to be pretty good, because it's very interesting and surprising and still kind of unresolved. Yeah. What city were you in? You have to tell me what you were doing. But what city were you in on December thirty first, nineteen ninety nine? I was in hot Atlanta. What about you? Okay, I was there too. Uh. That's when I lived in the big warehouse on the West End, and we had a big party, probably the biggest New Year's Eve party I've ever thrown. And it was one of those parties where people come that don't know you, oh wow, and all of a sudden it was like, all right, here we go. That kind of party. It's like Weird Science or Sixteen Candles or something. Yeah, it was great. But the guy from The Hills Have Eyes showed up on a motorcycle, who's also in The Goonies.
That wasn't the guy from The Goonies. Are you sure? Oh no, no, you're right, you're right, you're right. No, it was, I think it was. He was just in makeup. No, the guy from The Goonies was football player John Matuszak. Right, same guy. Oh, that's the guy from The Hills Have Eyes. No, I don't know. I'm just trying to keep up. Yeah. Yeah, so that's what I was doing. I was throwing a big old party and, uh, not worrying too much about the Y two K bug, or my bank account being empty, because I didn't have much in it anyway, or being stuck in an elevator, or falling out of the sky in a plane. Yeah, it was a good time to be young, because you didn't really care that much. You didn't have as much to lose as if you were, like, some, I don't know, middle aged fat cat or something. You're probably sweating it a little more than you were, like, in your twenties at the time. Probably a lot of people who are listening to this right now were like, what you guys just said, Y two K bug, what is that?
And, uh, well, let Uncle Josh and Uncle Chuck tell you all about Y two K, because it was one of the weirdest times to ever live through. And Chuck, you and I were Cold War kids, like we lived through a time where people thought, like, you know, we could go into a nuclear apocalypse with the Soviet Union at any given moment. It could just happen, everybody. That's how we lived, and the Y two K bug still managed to stand out from that backdrop. Yeah. So the idea was, and we'll get more specific, but the one sentence descriptor is, computer code in the early days was written with just two digits for the year, not nineteen fifty whatever, just fifty whatever. And the idea was that was going to cause a lot of problems when the calendar flipped to two thousand, anything, like I said, from elevators being stuck, to Wall Street going down, to elevators being stuck. I mean, there were all kinds of crazy scenarios, um, that you might or might not have worried about as that date approached. Uh. And that was the Y two K bug, or the millennium bug.
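The two-digit shortcut described here can be sketched in a few lines of Python. This is a hypothetical illustration, not code from any real system: legacy software stored only the last two digits of the year and hard-coded the "19" prefix when reading the date back.

```python
# Hypothetical illustration of the Y2K bug (not any real system's code):
# the year is stored as two characters, and the "19" prefix is assumed.

def parse_two_digit_year(yy: str) -> int:
    """Interpret a stored two-digit year the way much legacy code did."""
    return 1900 + int(yy)  # the baked-in assumption at the heart of the bug

print(parse_two_digit_year("99"))  # 1999 -- fine all through the nineties
print(parse_two_digit_year("00"))  # 1900 -- not 2000: the millennium bug
```

The stored data itself is ambiguous; only the hard-coded prefix decides which century "00" means, which is why the fix had to happen in the code rather than in the data.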
And you know, some people really, really freaked out, and some people took advantage of people freaking out, and some people didn't worry at all. And we're gonna tell you all about it. Yeah, that was a really great elevator falling down pitch. Thanks. So, um, the thing about the Y two K bug is that if you hear about it today, you're gonna hear a couple of different responses potentially, and probably the more prevalent of the two is that it was all just a big hoax, a sham, maybe a big money suck, a bunch of people just being paranoid, um. And whether that was rightfully so, or the paranoia was justified or not, you know, um, because of the situation or the context that this was happening in, we'll talk about that later. But every once in a while you'll run into somebody, and these are the people you should probably listen to, who say, no, it was actually a really big deal.
It had the potential to be a really big deal. And the reason it wasn't a big deal is because the world, I should say smart people in the world, got behind solving this issue, came together and solved this issue, and it was only people who weren't really paying attention who saw nothing happened and said that was all just a hoax or a sham. And it's really interesting, because to this day, depending on what media coverage you read, and I'm not even talking, like, left and right, I mean just, like, the author, uh, it can change from author to author, um, those two different approaches to the explanation will be used depending on who you're reading. And I just find that fascinating, that we still haven't resolved it fully. Oh yeah. Like, you could go to any party, any dinner party, and bring this up, and get probably an equal amount of those reactions, I bet. Yeah, get everybody really riled up. Um.
All right. So I said that it has to do with computer programming, and in the early days of programming, like I said, they were not using the first two numbers of the year, um, not because they were just lazy. Um. Those computers didn't have a lot of computing power. They had about one or two kilobytes of memory. And it requires eight bits, or one byte, to represent a single alphanumeric character. So if you could save sixteen, but sorry, sixteen bits per operation that you're running by just not using nineteen, uh, then you're saving a lot of memory, like, well needed memory. Yeah. I mean, like, this is at a time where, like, those bits were vital. Today, we're just, like, drowning in, um, RAM, and yeah, and bytes. So I saw somewhere that, like, somebody went down the list of what a mainframe, a mainframe, like, it was just a huge room, yes, basically, but, like, where there's a whole room full of whoppers, that those things maybe had two gigabytes of storage and just, like, some paltry amount of RAM.
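The arithmetic behind that saving checks out, and it can be verified directly. The record count in this sketch is an invented example for scale, not a figure from the episode:

```python
BITS_PER_CHAR = 8   # one byte per alphanumeric character, as described above
DIGITS_DROPPED = 2  # writing "50" instead of "1950"

bits_saved_per_date = DIGITS_DROPPED * BITS_PER_CHAR
print(bits_saved_per_date)  # 16 bits saved every time a date is stored

# On a hypothetical master file of a million dated records, that adds up
# to about two million bytes -- enormous next to a machine with only one
# or two kilobytes of working memory.
records = 1_000_000
print(records * DIGITS_DROPPED)  # 2000000 bytes saved
```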
And so, just, you know, it's such a cliche to say now that, like, you know, there's more data storage in, like, a calculator that we use today than they used to go to the moon. But it's absolutely true. And so to save that amount of data made a lot of sense. But then the other thing that kind of legitimized that decision is that these people are writing code in, like, the fifties and the sixties, and they're like, this stuff is going to be gone, like, it'll have been wiped out and, like, rebuilt from scratch by the time this becomes a problem in the year two thousand. And they were really, really wrong with that idea, you know, which, it makes sense that they would think that, but it turned out that it was really wrong, because a lot of the software in the world that was running really important stuff, things like financial markets, or things that monitored drinking water purity, or things that ran freezers that kept smallpox from thawing out.
You know, like, these were all run on software, in a lot of cases, that was built and kind of tar papered over with fixes and patches and expansions, that was originally created in, like, the fifties or sixties. And that's not a good thing to figure out when you're, like, five years out from the millennium, when this problem is going to actually happen. Yeah. And it also was complicated by the fact that these programmers weren't all aligned on how to even enter dates in a uniform way. Some people used the Julian format, which is the two digit year and then the three digit count of the days in the year, so, uh, January one would be ninety seven zero zero one. But not everyone did it that way. So it's not like you could just go through and say, you know, change the nineteen to twenty, I don't know nothing about coding, right, but, you know, just some simple line of code that just basically changes everything in a uniform manner. No. And so even, even in, like, the software itself.
Like, even if one piece of software did use the same kind of date throughout that one piece of software, um, that software, that code, was written to accommodate, you know, X number of bits for a date. So if you just went in and added two extra digits for the date field, that could throw off everything else in the software, without you having any way of predicting what it will throw off. So if you went in and made that fix, it could create even bigger problems than, uh, than if you just didn't make the fix to begin with, as far as the software operating was concerned. Right. Uh. And the other thing we need to point out is that the fact that it was the millennium didn't matter. Um, it mattered, and we'll get to the sort of cultural, um, hysteria that kind of followed. I think that had a lot to do with the fact that it was a millennium. But it could have been eighteen hundred or nineteen hundred as far as the software knew, and it would have presented the same problem, exactly, that it was just the prefix was changing.
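The Julian YYDDD layout, and the reason you couldn't just widen the date field, can both be sketched in Python. The record layout here is invented purely for illustration:

```python
import datetime

def parse_julian_yyddd(s: str) -> datetime.date:
    """Parse the five-character Julian layout: two-digit year + day of year."""
    yy, ddd = int(s[:2]), int(s[2:])
    return datetime.date(1900 + yy, 1, 1) + datetime.timedelta(days=ddd - 1)

print(parse_julian_yyddd("97001"))  # 1997-01-01

# Fixed-width records are why you couldn't "just add two digits":
# every field after the date lives at a hard-coded offset.
record = "97001SMITH   0001234"   # invented layout: date, name, amount
date_field = record[0:5]
name_field = record[5:13]
# Widening the date to "1997001" would shift every later offset and
# silently corrupt whatever code reads the name or the amount.
print(name_field.strip())  # SMITH
```

The same slicing logic that makes the format compact is what makes it brittle: the fix had to preserve field widths, which is why techniques like windowing (treating low two-digit years as twenty hundreds) were often used instead.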
And what the big concern was, is that when it turned over to zero zero, the computers hadn't been taught that that meant the new century, now two thousand, technically not a new century, don't at me, um, but that they were going to go back to nineteen hundred, which is what they were programmed to think, and that that could cause all sorts of cascading events, everything from Chuck's fabled falling elevator to aircraft computers powering down midflight and just planes falling out of the sky. Yeah, or computers just, I had the idea that computer systems were just gonna go, and, like, smoke would come out of them and everything would just shut down. Power would go out, your cable would go out, like, just nothing would work anymore. And that was the way that everybody was touting and talking about it. It was even worse than that, people talking about nuclear warheads launching themselves or blowing up, like, in their silos, and that kind of stuff.
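One concrete way that cascade was expected to start is ordinary date arithmetic going negative at the boundary. This is a hypothetical sketch with invented names, not any real system's logic:

```python
# Hypothetical sketch: elapsed-time math done on two-digit years
# goes negative the moment the stored year rolls over to zero zero.

def elapsed_years(start_yy: int, now_yy: int) -> int:
    """How legacy code might compute an account's age in years."""
    return now_yy - start_yy

print(elapsed_years(95, 99))  # 4 -- an account opened in '95, checked in '99
print(elapsed_years(95, 0))   # -95 -- the same check in '00: any billing,
                              # interest, or expiry logic downstream misfires
```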
Like, there was a real apocalyptic vibe to the public idea about it, but even to the sober, level headed people who were actually working to solve the problem. You know, there's a big problem with an airline computer, you know, resetting itself because it thinks it's nineteen hundred while the plane's in the air. Or, um, planes weren't around then, right, exactly. But the other, uh, problem, too, Chuck, is that they weren't sure, if the computers reset themselves and, like, shut down, if they would have any way to get in there and start them back up again. So it's not like, you know, I saw somewhere, it's not like it was going to fix itself once, you know, it got, like, a couple of seconds past midnight on two thousand. That wasn't necessarily a given. So they also, the other thing that really drove this, this, I don't want, so there were two things that happened. The public was panicking. The people working against it were, like, concerned and actively managing it.
But for the people who were actively managing it and who were concerned, another thing that was driving it, was actually, it was driving it for everybody, yeah, I'm gonna settle on that, um, that we had no idea how widespread the problem was. Even by the mid to late nineties, there were, like, forty billion microchips out there in the world doing their thing. We had no idea what percentage were going to be affected by the Y two K bug. We had no idea what missile guidance systems or what satellites or what, um, you know, ATM machine systems were going to be affected. So we had to dive in and figure it out, once people started getting this point across, like, guys, this is actually a thing. And apparently people were raising the flags, or at least one person was raising the flags, um, as far back as the sixties and seventies, I think, right, or the seventies and eighties. Yeah. Why don't we talk about him and some others right after this? Okay. All right, so you mentioned a light in the darkness, a man on a mountain proclaiming this could be an issue.
And that man, uh, in the nineteen seventies, was Bob Bemer, or Bob Beemer, and he really played a big role in the creation of ASCII code. And, uh, he was the first person, and this is in the nineteen seventies, like I said, to actually write in papers, like, hey, you know, this could be a problem down the road. This is something we should probably take a look at. Um. You know, if you're a, um, a Blockbuster Video, you're probably not gonna really try and get ahead of this too much. You can probably start working on this in, like... If you are the financial sector and you're Wall Street, you're gonna start working on this in the nineteen eighties, because there's just so much more at risk. Somebody's late fees running up is not a very big deal. You can correct that. The financial market not being open or available, or crashing, is a really big deal. So they started throwing, uh, a lot of money at this in the late nineteen eighties, uh, in the financial sector, to try and correct this ahead of time.
Yeah, and as a matter of fact, later we'll see, later on in the aftereffects, they credited this, um, basically the upgrade that the New York Stock Exchange dedicated itself to in the face of the Y two K bug, as the same reason why the global financial markets didn't, like, their systems didn't collapse, um, after September eleventh. Yeah, that was because of this. Yeah. So, um, if Bob Bemer was the first old hermit prophet who came down from the hills to warn everybody, um, Peter de Jager was the guy who got all the press for it. Um. He was the one who really, basically, dedicated his life to making sure that everybody was good and scared. The public, the public, but also, you know, the people who were, like, pulling the levers of government, the people who were running the corporations, and the people running the financial markets. Like, he met with a lot of different people, um, gave a lot of different scary presentations. Um. I saw that he would, like, it wasn't just some pet presentation he gave everywhere, necessarily, too.
I saw he met with the Canadian government and was trying to get across, like, this is a really big deal and it's a really big problem, and you're not equipped to deal with this as it stands now. This is a new thing for you. And he pointed out that the Canadian government meets its deadlines, its deadline goals for projects, six percent of the time. He said, this cannot happen. Like, you can't miss this goal, this deadline. Um. So he would, like, kind of go in and make it apparent to each group based on their own needs or their own desires or their own perspective. Um. But he did. He also gave lots of interviews. He was on Sixty Minutes, uh, and he made a bunch of money during, I think during, he made, in today's dollars, something like two and a half million dollars consulting and giving speeches and lectures.
But from what I understand, and he gets a lot of guff today, as we'll see, but from what I understand, he was a true believer who, in some ways, maybe saved the world from some really big problems that we will never fully understand, because we didn't experience them. Yeah, this is a hard one to judge in retrospect. It's like, how do you judge the thing that didn't happen? Right, exactly. Is it a cry wolf? Or was it, uh, or did we slay the wolf? So Peter de Jager kicked the whole thing off with the article called Doomsday two thousand, and in his, in his, um, defense, his editor probably came up with that, not him. Editors typically write the headline, so it's possible. Who knows, he could have suggested it. He was that much of a doomsayer, for sure, that he would have been totally comfortable with that. Yeah, and this is where, you know, the American public, like, eventually it made its way to Sixty Minutes, and once Morley Safer is on there on a Sunday night talking about it, then, you know, the American public is going to get on board.
And they did, and it brought out some kooks. It brought out people, uh, preppers and survivalists. Uh. There were a lot of internet scams going around at the time. There was a lot of religiosity, um, sort of really heated up, like, uh, sort of doomsday preachers and stuff like that started coming out of the woodwork a little bit more. Anyone that thought that they could make money off of this thing, uh, through fear, kind of came out of the woodwork. And, you know, when something like that's happening, you've got that on one side, and then you've got a lot of people on the other side just thumbing their nose at it and saying, this is all a big scam, this is all a hoax. Look at all these kooks that are trying to, you know, separate us from our money. Um, it's just a big scam, and we don't have anything to worry about. It's weird. It almost sounds familiar in some weird way.
Yeah, what we've been experiencing? Yeah, this is definitely, like, it was a peek behind the curtain of what could come, although it was not nearly as stark. Although, I don't know, maybe I wasn't paying as much attention then as I am these days. But it definitely, it didn't seem like, you know, there was anything approaching the incivility and just outright, like, anger that we experience today, especially, you know, in America, um, compared to then. But there were, there were definitely two sides to this issue, and they definitely were entrenched against one another. It just wasn't like, neither side hated the other. They just thought the other one was dumb. Right. Uh. And, you know, one of the big reasons that it was such a big kerfuffle culturally was that it was all about timing. The calendar flip aligned at a time where computers were really becoming super, super entrenched in everyone's daily life. Um.
Of course, you know, 332 00:19:17,320 --> 00:19:19,240 Speaker 1: people have been using computers for a while, but as 333 00:19:19,240 --> 00:19:23,040 Speaker 1: far as, like, really everyday stuff, like running your bank 334 00:19:23,080 --> 00:19:25,840 Speaker 1: through there and paying your bills and credit cards and 335 00:19:25,880 --> 00:19:29,240 Speaker 1: acting as your own travel agent, and the government, everything 336 00:19:30,000 --> 00:19:32,800 Speaker 1: was reliant on computers by this point, kind of for 337 00:19:32,840 --> 00:19:35,359 Speaker 1: the first... you know, this was the first big wave. 338 00:19:36,040 --> 00:19:38,959 Speaker 1: And Ed made a good point in 339 00:19:39,000 --> 00:19:41,720 Speaker 1: here: as far as your average person on the 340 00:19:41,760 --> 00:19:45,280 Speaker 1: street knows, their computer comes to them in a 341 00:19:45,359 --> 00:19:47,200 Speaker 1: box and they take it out, and it's just a 342 00:19:47,240 --> 00:19:50,280 Speaker 1: little magic machine that runs on fairy dust, and we 343 00:19:50,320 --> 00:19:52,600 Speaker 1: know nothing about how it works or how it should work. 344 00:19:53,200 --> 00:19:55,280 Speaker 1: And so all of a sudden, everyone has got these 345 00:19:55,280 --> 00:19:58,320 Speaker 1: little magic boxes that they don't really understand, that they're 346 00:19:58,359 --> 00:20:01,280 Speaker 1: super reliant on. And there are people out there saying 347 00:20:01,920 --> 00:20:05,560 Speaker 1: things are about to get really bad with your magic box. Yeah. 348 00:20:05,600 --> 00:20:07,400 Speaker 1: And then a lot of people are just like, all right, 349 00:20:07,440 --> 00:20:09,240 Speaker 1: I guess it's time to freak out a little bit. Yeah.
350 00:20:09,320 --> 00:20:11,920 Speaker 1: I mean, people definitely did freak out, and I think 351 00:20:11,920 --> 00:20:14,480 Speaker 1: it also... I think part of that freak out is 352 00:20:14,520 --> 00:20:16,880 Speaker 1: kind of like you were saying, like the Y two 353 00:20:16,960 --> 00:20:19,000 Speaker 1: K bug is the first time it was revealed to 354 00:20:19,080 --> 00:20:22,479 Speaker 1: us just how dependent on computers we've become. We 355 00:20:22,560 --> 00:20:24,919 Speaker 1: never really saw that before. You know, up 356 00:20:24,960 --> 00:20:27,159 Speaker 1: to this point, it was all like gee whiz, you know, 357 00:20:27,280 --> 00:20:30,159 Speaker 1: like my insurance claim went through fifty times faster than 358 00:20:30,200 --> 00:20:33,480 Speaker 1: it would have five years ago. Instead now it's like, 359 00:20:33,640 --> 00:20:36,479 Speaker 1: you know, all these things that we're dependent 360 00:20:36,520 --> 00:20:38,920 Speaker 1: on are about to pull the rug out from under our civilization, 361 00:20:39,359 --> 00:20:42,760 Speaker 1: and that really kind of got people scared, 362 00:20:42,800 --> 00:20:45,720 Speaker 1: even if people weren't sitting there analyzing it. I mean, 363 00:20:45,720 --> 00:20:48,400 Speaker 1: we're analyzing it in hindsight; at the time, people were 364 00:20:48,440 --> 00:20:52,479 Speaker 1: just scared, freaked out, nervous, angry, upset. But then the 365 00:20:52,520 --> 00:20:57,119 Speaker 1: fact that this was taking place already during a major 366 00:20:57,240 --> 00:21:00,840 Speaker 1: calendar change, not just from, um, you know, the 367 00:21:00,920 --> 00:21:02,920 Speaker 1: nineteen hundreds to the two thousands, but like a 368 00:21:02,960 --> 00:21:07,920 Speaker 1: new millennium, you know, um, that really kind 369 00:21:07,920 --> 00:21:11,199 Speaker 1: of had people already primed.
It was the weird timing of the 370 00:21:11,200 --> 00:21:13,640 Speaker 1: whole thing. Like, you remember how big the X Files were. 371 00:21:15,240 --> 00:21:17,480 Speaker 1: Imagine the X Files now. It would be like a whole... 372 00:21:17,560 --> 00:21:20,360 Speaker 1: hmm. Some people would watch it, some people would be really 373 00:21:20,359 --> 00:21:22,679 Speaker 1: into it. You know, it might make a little splash 374 00:21:22,760 --> 00:21:25,159 Speaker 1: there for a little bit. But The X Files 375 00:21:25,440 --> 00:21:29,480 Speaker 1: was one of the biggest TV shows in the world, at 376 00:21:29,480 --> 00:21:32,879 Speaker 1: the very least in the West, because it was super 377 00:21:32,920 --> 00:21:36,800 Speaker 1: tapped into this millennial, and not the, not the generation, 378 00:21:36,840 --> 00:21:39,119 Speaker 1: but just like the end of this era or the 379 00:21:39,160 --> 00:21:43,280 Speaker 1: new era, this angst that everyone was carrying around 380 00:21:43,320 --> 00:21:45,720 Speaker 1: to some degree. Whether you were aware of it or not, 381 00:21:46,480 --> 00:21:48,960 Speaker 1: you were worried; some small part of your brain was 382 00:21:49,000 --> 00:21:51,600 Speaker 1: worried, because the calendar was about to change over to 383 00:21:51,640 --> 00:21:54,679 Speaker 1: the year two thousand. Yeah. I think the show that 384 00:21:54,720 --> 00:21:58,840 Speaker 1: does that best now is Black Mirror. Definitely. I 385 00:21:58,880 --> 00:22:01,240 Speaker 1: loved me some X Files, but I'll take Black Mirror 386 00:22:01,320 --> 00:22:05,320 Speaker 1: over X Files. Yeah. It's really tough to rival Black Mirror, 387 00:22:05,680 --> 00:22:07,600 Speaker 1: it is. And I know, boy, X Files people 388 00:22:07,600 --> 00:22:09,439 Speaker 1: are gonna be so upset. I loved X Files. I 389 00:22:09,480 --> 00:22:12,600 Speaker 1: did. Yeah, Mulder and Scully. I just watched them the 390 00:22:12,680 --> 00:22:15,320 Speaker 1: other day.
It still holds up. Yeah, it was a 391 00:22:15,320 --> 00:22:16,480 Speaker 1: great show. It was a lot of fun, but a 392 00:22:16,520 --> 00:22:20,680 Speaker 1: show of its time, right? Exactly. Those overcoats they wore 393 00:22:20,680 --> 00:22:25,480 Speaker 1: were gigantic. They were big, oh my god, shoulder pads. Yes, 394 00:22:25,760 --> 00:22:28,840 Speaker 1: they were, so everybody's walking around like David Byrne in 395 00:22:28,880 --> 00:22:33,920 Speaker 1: Stop Making Sense. It was a little weird. Um, so 396 00:22:34,160 --> 00:22:36,600 Speaker 1: the cost of this thing was gonna be pretty big, 397 00:22:36,640 --> 00:22:39,679 Speaker 1: and ended up being pretty big. It depends on who 398 00:22:39,760 --> 00:22:42,359 Speaker 1: you ask. If you see numbers on the internet of 399 00:22:42,800 --> 00:22:46,639 Speaker 1: five billion dollars, that is probably not true. That probably 400 00:22:46,680 --> 00:22:49,600 Speaker 1: came from people that were consulting, saying, you know, it 401 00:22:49,680 --> 00:22:52,040 Speaker 1: may cost up to five billion dollars to fix all 402 00:22:52,080 --> 00:22:54,760 Speaker 1: this all over the world. Um, but they were 403 00:22:54,760 --> 00:22:57,840 Speaker 1: spending tons of money. Like, I think there's an 404 00:22:57,920 --> 00:23:02,680 Speaker 1: article from the Washington Post that talked about General 405 00:23:02,800 --> 00:23:06,440 Speaker 1: Motors spending about six hundred twenty five mill, uh, and 406 00:23:06,560 --> 00:23:09,439 Speaker 1: about a quarter of a mill, I'm sorry, quarter of 407 00:23:09,440 --> 00:23:14,040 Speaker 1: a billion, Procter and Gamble ninety million. So, and you know, 408 00:23:14,240 --> 00:23:16,080 Speaker 1: then the federal government has to spend a ton of 409 00:23:16,080 --> 00:23:21,120 Speaker 1: money on their own systems.
So it added up to many, many, 410 00:23:21,119 --> 00:23:24,399 Speaker 1: many billions, tens of billions of dollars. Let's just say that. Yeah. Also, 411 00:23:24,440 --> 00:23:26,680 Speaker 1: I don't want to let this opportunity pass by without 412 00:23:26,680 --> 00:23:32,760 Speaker 1: shouting out contemporary journalism. One other thing, Chuck, though: the 413 00:23:33,000 --> 00:23:37,560 Speaker 1: US Senate conducted a special committee investigation into spending on 414 00:23:37,600 --> 00:23:40,000 Speaker 1: the Y two K bug, and it came up with 415 00:23:40,040 --> 00:23:42,560 Speaker 1: what, from what I can tell, is the, um, the 416 00:23:42,600 --> 00:23:46,040 Speaker 1: most widely accepted number at the time, and this is in two thousand 417 00:23:47,680 --> 00:23:51,440 Speaker 1: dollars: a hundred billion dollars were spent in the United 418 00:23:51,480 --> 00:23:57,120 Speaker 1: States alone, and about eight 419 00:23:57,119 --> 00:24:01,359 Speaker 1: point four billion of that was by the US government. Um, 420 00:24:01,440 --> 00:24:04,879 Speaker 1: and again, that's in two thousand dollars, so it 421 00:24:04,880 --> 00:24:08,000 Speaker 1: would be substantially more today. And the US is 422 00:24:08,280 --> 00:24:14,439 Speaker 1: almost certainly, uh, the largest spender on this issue, because 423 00:24:14,480 --> 00:24:18,040 Speaker 1: we were at the time 424 00:24:18,040 --> 00:24:20,760 Speaker 1: the country that was the most dependent 425 00:24:20,800 --> 00:24:23,320 Speaker 1: on computers in the entire world. So we had the 426 00:24:23,320 --> 00:24:25,199 Speaker 1: most to lose, we had the most to gain by 427 00:24:25,240 --> 00:24:29,040 Speaker 1: spending this money, and the Senate report concluded, even within 428 00:24:29,160 --> 00:24:33,960 Speaker 1: three months afterward, that it was money well spent.
Yeah, well, 429 00:24:34,000 --> 00:24:37,000 Speaker 1: there you have it, case closed. Case closed, although I 430 00:24:37,000 --> 00:24:41,359 Speaker 1: should probably save that for the end, um. In nineteen ninety six, 431 00:24:42,359 --> 00:24:47,040 Speaker 1: the Congressional Research Service said, you know what, we need 432 00:24:47,080 --> 00:24:51,159 Speaker 1: to do something about this, and Senator Daniel Moynihan 433 00:24:51,320 --> 00:24:53,800 Speaker 1: went to Bill Clinton and said, hey, we need to 434 00:24:53,840 --> 00:24:57,199 Speaker 1: do something about this. So Clinton launched the Council on 435 00:24:57,320 --> 00:25:01,200 Speaker 1: Year Two Thousand Conversion, uh, Congress passed the Year Two 436 00:25:01,240 --> 00:25:06,399 Speaker 1: Thousand Information and Readiness Disclosure Act, and, um, all this 437 00:25:06,480 --> 00:25:09,840 Speaker 1: sounds very fancy, but, uh, Ed is quick to point 438 00:25:09,880 --> 00:25:12,760 Speaker 1: out, and we are as well, that, you know, the 439 00:25:12,800 --> 00:25:15,399 Speaker 1: government can't get in there and just fix all the 440 00:25:15,440 --> 00:25:17,439 Speaker 1: bugs of all these companies. They gotta take care of 441 00:25:17,440 --> 00:25:21,040 Speaker 1: it themselves. So a lot of it was just, hey, 442 00:25:21,160 --> 00:25:23,040 Speaker 1: you gotta get on this, like you gotta get ahead 443 00:25:23,040 --> 00:25:26,120 Speaker 1: of this. You're on this, right, stock market? And you're 444 00:25:26,119 --> 00:25:30,119 Speaker 1: on this, General Motors, aren't you? And, um, sharing information 445 00:25:30,240 --> 00:25:33,199 Speaker 1: for sure, but, um, a lot of it was just 446 00:25:33,280 --> 00:25:37,280 Speaker 1: kind of cheerleading. Yeah, um, and that, I mean, that 447 00:25:37,400 --> 00:25:41,320 Speaker 1: worked, like getting people, you know, kind of snapped into 448 00:25:41,359 --> 00:25:43,440 Speaker 1: line and saying like, hey, the U. S.
Government is 449 00:25:43,480 --> 00:25:45,240 Speaker 1: telling you this is a real problem and you need 450 00:25:45,280 --> 00:25:47,000 Speaker 1: to do this by this time, or else you're gonna 451 00:25:47,000 --> 00:25:49,640 Speaker 1: have some big troubles. That gets people's attention for sure. 452 00:25:50,119 --> 00:25:52,240 Speaker 1: So, you know, in a lot of senses, 453 00:25:52,320 --> 00:25:54,679 Speaker 1: that was enough. But the government also had all of 454 00:25:54,720 --> 00:25:58,280 Speaker 1: its own systems and, uh, software to look at and 455 00:25:58,320 --> 00:26:02,320 Speaker 1: go over and make sure that was, um, in 456 00:26:02,400 --> 00:26:05,400 Speaker 1: working condition, or fixed if it needed fixing. 457 00:26:05,800 --> 00:26:07,879 Speaker 1: And then they reached out and helped other countries too. 458 00:26:08,560 --> 00:26:11,600 Speaker 1: Um, I saw that, I think, they started 459 00:26:11,600 --> 00:26:16,080 Speaker 1: an exchange program with Russia to make sure that nobody 460 00:26:16,160 --> 00:26:19,719 Speaker 1: was going to accidentally nuke anybody else. And as a 461 00:26:19,720 --> 00:26:24,160 Speaker 1: matter of fact, on the turn of the millennium, 462 00:26:24,240 --> 00:26:28,120 Speaker 1: there were, um, US observers and Russian observers in one 463 00:26:28,119 --> 00:26:31,840 Speaker 1: another's countries, just basically to work together to make sure 464 00:26:31,880 --> 00:26:34,160 Speaker 1: that this got all worked out. But I think 465 00:26:34,160 --> 00:26:35,960 Speaker 1: also it's just kind of a show of faith, like, 466 00:26:36,440 --> 00:26:38,199 Speaker 1: you know, we're not gonna nuke you, or we're not 467 00:26:38,200 --> 00:26:40,040 Speaker 1: gonna let you get nuked, our own people are 468 00:26:40,080 --> 00:26:42,200 Speaker 1: there, kind of thing, which I thought was kind 469 00:26:42,200 --> 00:26:45,439 Speaker 1: of cool. Yeah.
Other countries took a little more 470 00:26:45,480 --> 00:26:50,760 Speaker 1: strict approach. Uh, the Dutch central bank said it 471 00:26:51,200 --> 00:26:56,399 Speaker 1: wouldn't loan money to companies that weren't compliant. China said, 472 00:26:56,520 --> 00:26:58,840 Speaker 1: you know what, all the top airline executives have to 473 00:26:58,840 --> 00:27:03,000 Speaker 1: take flights on January first, um, and, you know, so 474 00:27:03,080 --> 00:27:05,120 Speaker 1: you better make sure your stuff is running correctly. Yeah, 475 00:27:05,119 --> 00:27:08,080 Speaker 1: think about that: the nation, or the government, ordered 476 00:27:08,880 --> 00:27:12,800 Speaker 1: airline executives to be in the air at midnight, um, 477 00:27:12,960 --> 00:27:16,480 Speaker 1: or they would, I guess, probably go to jail. Yeah. 478 00:27:16,520 --> 00:27:19,120 Speaker 1: And that sounds harsh, and it is, but I think 479 00:27:19,160 --> 00:27:23,520 Speaker 1: it is an interesting incentive, to say the least. It's pragmatic 480 00:27:23,560 --> 00:27:27,560 Speaker 1: at least, you know: make sure your stuff works, everybody. Um, 481 00:27:27,640 --> 00:27:31,119 Speaker 1: but the Y two K compliance is what I 482 00:27:31,160 --> 00:27:33,919 Speaker 1: was talking about. That was like, you could buy products 483 00:27:34,560 --> 00:27:37,959 Speaker 1: around that time that said Y two K compliant. Like, 484 00:27:38,320 --> 00:27:40,320 Speaker 1: it could be a clock radio that says Y two 485 00:27:40,400 --> 00:27:43,560 Speaker 1: K ready. Um, all kinds of products were labeled Y 486 00:27:43,600 --> 00:27:47,040 Speaker 1: two K compliant, certainly anything to do with computing. Yeah, 487 00:27:47,080 --> 00:27:49,240 Speaker 1: and there was actually nobody watching to make sure that 488 00:27:49,240 --> 00:27:53,200 Speaker 1: that certification was actually accurate.
There was, I think, 489 00:27:53,200 --> 00:27:56,840 Speaker 1: the Defense Department put out some, like, some guidance 490 00:27:57,440 --> 00:27:59,560 Speaker 1: on what to do to be Y two K compliant, 491 00:27:59,560 --> 00:28:02,560 Speaker 1: but they were like, find it, fix it, all good. 492 00:28:02,720 --> 00:28:05,159 Speaker 1: Those were the steps. Like, they were that vague. There was 493 00:28:05,200 --> 00:28:08,960 Speaker 1: nobody certifying it. And to me, yes, cheerleading from the 494 00:28:09,040 --> 00:28:14,360 Speaker 1: US government, and raising awareness, and maybe lending aid financially, um, 495 00:28:14,400 --> 00:28:17,359 Speaker 1: those were some really good roles that it played. But 496 00:28:17,440 --> 00:28:19,840 Speaker 1: I feel like also it could have said, hey, if 497 00:28:19,880 --> 00:28:22,840 Speaker 1: you're running this kind of operating software, or you're 498 00:28:22,880 --> 00:28:25,480 Speaker 1: producing dates using this format, here's a good fix you 499 00:28:25,520 --> 00:28:28,320 Speaker 1: might be able to use, or it could have created some sort 500 00:28:28,320 --> 00:28:31,400 Speaker 1: of Y two K compliance guidance or steps. I feel 501 00:28:31,400 --> 00:28:34,600 Speaker 1: like it could have done a little more in that respect, um, 502 00:28:34,640 --> 00:28:37,320 Speaker 1: but there was nobody... like, you could have bought anything 503 00:28:37,320 --> 00:28:38,880 Speaker 1: and it could have had that sticker on it, and 504 00:28:38,920 --> 00:28:43,520 Speaker 1: it really didn't necessarily mean anything. And, Chuck, 505 00:28:43,560 --> 00:28:46,880 Speaker 1: some other governments had different responses. I saw, um, 506 00:28:47,000 --> 00:28:52,880 Speaker 1: that Canada had thirteen thousand troops readied, um, on 507 00:28:53,000 --> 00:28:56,520 Speaker 1: high alert, just in case things went down. Yeah, 508 00:28:56,560 --> 00:29:00,000 Speaker 1: that was part of Operation Abacus.
Yes, there's a really 509 00:29:00,160 --> 00:29:03,040 Speaker 1: great Globe and Mail article called Y two K: The 510 00:29:03,080 --> 00:29:06,280 Speaker 1: Strange True History of How Canada Prepared for an Apocalypse 511 00:29:06,320 --> 00:29:08,520 Speaker 1: That Never Happened but Changed Us All. It was a 512 00:29:08,520 --> 00:29:11,320 Speaker 1: really good article. And then one of the other things 513 00:29:11,360 --> 00:29:14,400 Speaker 1: that Canada did was the CBC sent a pair of 514 00:29:14,560 --> 00:29:20,560 Speaker 1: engineers to a remote broadcasting station with, um, with like 515 00:29:20,680 --> 00:29:22,800 Speaker 1: some video, which I would love to get my hands on, 516 00:29:22,800 --> 00:29:25,000 Speaker 1: I couldn't find it anywhere, that they were 517 00:29:25,040 --> 00:29:29,600 Speaker 1: to broadcast: instructions on how to survive post apocalypse, basically. 518 00:29:30,040 --> 00:29:35,040 Speaker 1: That was their job. Oh wow. Yeah. So everybody's sitting 519 00:29:35,040 --> 00:29:38,040 Speaker 1: there ready and waiting, and it's finally December thirty first, 520 00:29:39,040 --> 00:29:41,240 Speaker 1: and whether you're ready or not, it's all about to 521 00:29:41,240 --> 00:29:45,160 Speaker 1: click over, right, Chuck? That's right. So we'll take another 522 00:29:45,200 --> 00:29:49,960 Speaker 1: break here and we'll talk about what happened on December 523 00:29:50,560 --> 00:30:28,360 Speaker 1: thirty first, right after this. All right, what happened, man? 524 00:30:28,400 --> 00:30:30,800 Speaker 1: I've been on pins and needles for, like, twenty seconds. 525 00:30:32,880 --> 00:30:35,280 Speaker 1: Not much, actually. Some stuff did happen, but for the 526 00:30:35,320 --> 00:30:39,160 Speaker 1: most part, nothing really happened. I mean, for you 527 00:30:39,360 --> 00:30:42,160 Speaker 1: or me, or just about anybody else out there.
The 528 00:30:43,120 --> 00:30:45,240 Speaker 1: millennium came and went, and the Y two K bug 529 00:30:45,280 --> 00:30:48,920 Speaker 1: fizzled out like a dud. That's right. I remember waking 530 00:30:49,000 --> 00:30:51,680 Speaker 1: up late in the day on January first, hung over. 531 00:30:53,160 --> 00:30:56,880 Speaker 1: Everything worked: my phone worked, my power was on. 532 00:30:57,600 --> 00:30:59,480 Speaker 1: There were no planes falling out of the sky. I 533 00:30:59,480 --> 00:31:02,680 Speaker 1: could have rode the elevator up and down all day long. But 534 00:31:02,920 --> 00:31:07,280 Speaker 1: some things did happen. Uh, and like I mentioned, video stores... 535 00:31:08,400 --> 00:31:11,080 Speaker 1: you know, there was one in Albany, New York 536 00:31:11,120 --> 00:31:13,120 Speaker 1: that had assessed a late fee of over ninety 537 00:31:13,200 --> 00:31:17,160 Speaker 1: thousand dollars to a customer, for The General's Daughter, no less. 538 00:31:18,880 --> 00:31:21,720 Speaker 1: Oh, is that what it was? The movie? Yes, 539 00:31:22,040 --> 00:31:25,400 Speaker 1: that was the rental. Oh boy, I think I actually 540 00:31:25,440 --> 00:31:30,040 Speaker 1: saw that for some reason. That's Travolta, right? Yeah, the, 541 00:31:30,080 --> 00:31:34,720 Speaker 1: uh, Travolta that was in the post Pulp Fiction boom. Uh, 542 00:31:34,840 --> 00:31:38,800 Speaker 1: let me see. There was a national laboratory nuclear weapons 543 00:31:38,840 --> 00:31:43,280 Speaker 1: plant in Tennessee that had some malfunctions. I think there were 544 00:31:43,320 --> 00:31:48,440 Speaker 1: some other nuclear malfunctions in Japan, but nothing big, obviously, 545 00:31:48,640 --> 00:31:52,000 Speaker 1: or, you know, it would have been catastrophic. But really, 546 00:31:52,760 --> 00:31:57,000 Speaker 1: mostly minor things happened.
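The mechanism behind glitches like that six-figure video late fee is simple to sketch: a lot of old software stored only the last two digits of the year and hard-coded the century as nineteen hundred. A minimal, hypothetical Python sketch (the function names are made up for illustration, not anything from the episode):

```python
# Hypothetical sketch of the classic Y2K two-digit-year bug.
# Many old systems stored just the last two digits of the year
# and assumed the century was 1900.

def parse_year_two_digit(yy: int) -> int:
    """Naive pre-Y2K logic: treat every two-digit year as being in the 1900s."""
    return 1900 + yy

def elapsed_days(checkout_yy: int, return_yy: int) -> int:
    """Rental period in days, computed from the buggy years (whole years only)."""
    return (parse_year_two_digit(return_yy) - parse_year_two_digit(checkout_yy)) * 365

print(parse_year_two_digit(0))   # 1900, not 2000
print(elapsed_days(99, 0))       # -36135: a tape returned in '00 looks a century off
```

Depending on how downstream billing code handled a century-sized error like that, you could end up with a receipt dated nineteen hundred or an absurd late fee.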
Yeah, there were like thirty thousand 547 00:31:57,080 --> 00:31:59,400 Speaker 1: cash registers in Greece that were all running the same 548 00:31:59,440 --> 00:32:03,400 Speaker 1: software that showed the year as nineteen hundred on receipts, 549 00:32:03,440 --> 00:32:06,560 Speaker 1: like, nothing particularly bad. I think in the US, the 550 00:32:07,560 --> 00:32:10,440 Speaker 1: most jarring thing that happened for the government was some 551 00:32:10,520 --> 00:32:14,720 Speaker 1: spy satellites went offline for three days. But 552 00:32:14,840 --> 00:32:18,520 Speaker 1: for the most part, people were able to see what 553 00:32:18,600 --> 00:32:21,720 Speaker 1: the problem was, deal with it, figure out a solution 554 00:32:21,800 --> 00:32:24,160 Speaker 1: or a patch, and then bring the thing back online 555 00:32:24,200 --> 00:32:28,280 Speaker 1: with minimal interruption or problems. And the fact is that 556 00:32:28,360 --> 00:32:31,760 Speaker 1: the interruptions that did happen were typically pretty small, 557 00:32:32,200 --> 00:32:38,800 Speaker 1: pretty inconsequential, and were dealt with pretty quickly. Yeah. In England, 558 00:32:38,920 --> 00:32:42,680 Speaker 1: in Sheffield, England, the National Health Service came under fire 559 00:32:42,760 --> 00:32:48,320 Speaker 1: because there was a misreading of, uh, the ages of 560 00:32:48,600 --> 00:32:53,120 Speaker 1: pregnant mothers. I believe a hundred and fifty four women, uh, 561 00:32:53,160 --> 00:32:57,040 Speaker 1: in the program were affected, and, you know, they were 562 00:32:57,040 --> 00:33:01,040 Speaker 1: getting testing, like prenatal testing, to see if there 563 00:33:01,080 --> 00:33:04,160 Speaker 1: was a chance that their baby could have Down syndrome. 564 00:33:04,280 --> 00:33:08,440 Speaker 1: And it led to two pregnancy terminations and four births 565 00:33:08,480 --> 00:33:11,800 Speaker 1: of children with Down syndrome. And for the life of me,
566 00:33:11,840 --> 00:33:16,160 Speaker 1: I couldn't find out if there were lawsuits or what. 567 00:33:16,440 --> 00:33:20,040 Speaker 1: I just saw articles that talked about penalties against the 568 00:33:20,160 --> 00:33:22,960 Speaker 1: NHS, and it's one of those things that was really 569 00:33:23,000 --> 00:33:25,040 Speaker 1: hard to find any kind of follow up on. Right, 570 00:33:25,560 --> 00:33:28,920 Speaker 1: that was not one of the inconsequential ones I mentioned. No, 571 00:33:29,080 --> 00:33:33,360 Speaker 1: obviously not. So, um, from what 572 00:33:33,400 --> 00:33:35,560 Speaker 1: I could tell, Chuck, that was far and away the 573 00:33:35,880 --> 00:33:41,520 Speaker 1: most consequential outcome. Everything else was usually pretty, um, you know, 574 00:33:41,560 --> 00:33:45,520 Speaker 1: either inconvenient or aggravating or comical, like the guy who got 575 00:33:45,520 --> 00:33:50,160 Speaker 1: the ninety thousand dollar late fee for his video tape. Um, but 576 00:33:50,280 --> 00:33:53,600 Speaker 1: the fact that, you know, aside from the NHS issue, 577 00:33:54,640 --> 00:33:57,880 Speaker 1: the fact that it was generally small stuff that happened 578 00:33:59,000 --> 00:34:03,080 Speaker 1: for some reason made the public say, oh, well, this 579 00:34:03,200 --> 00:34:05,120 Speaker 1: was all just a hoax.
It was all just a 580 00:34:05,120 --> 00:34:08,799 Speaker 1: scam to suck money out of our pockets and, 581 00:34:09,200 --> 00:34:11,680 Speaker 1: um, get us all riled up and scared, and probably 582 00:34:11,719 --> 00:34:15,200 Speaker 1: control us with George Bush's new world order that Ministry 583 00:34:15,239 --> 00:34:18,640 Speaker 1: talked about, um, and that the whole thing 584 00:34:18,640 --> 00:34:21,160 Speaker 1: was... that we were all 585 00:34:21,280 --> 00:34:24,200 Speaker 1: riled up for nothing. Which I think goes to show 586 00:34:24,239 --> 00:34:29,680 Speaker 1: that the public, by and large, is a dumb dumb. Yeah, 587 00:34:29,760 --> 00:34:31,600 Speaker 1: I think we're seeing that now, because Ed makes a 588 00:34:31,640 --> 00:34:35,520 Speaker 1: great analogy here. He says, hey, um, if the 589 00:34:35,719 --> 00:34:38,279 Speaker 1: Y two K bug had been a flood, and we 590 00:34:38,400 --> 00:34:42,320 Speaker 1: spent hundreds of billions of dollars and countless work hours 591 00:34:42,640 --> 00:34:44,880 Speaker 1: building a dam to hold back the flood, when that 592 00:34:44,920 --> 00:34:49,320 Speaker 1: flood arrived and the dam held, you wouldn't say, well, 593 00:34:49,480 --> 00:34:51,239 Speaker 1: I didn't get wet, so the whole thing must have 594 00:34:51,280 --> 00:34:54,520 Speaker 1: been a hoax. And that is almost a perfect 595 00:34:54,560 --> 00:34:57,759 Speaker 1: analogy for what happened in retrospect with the Y two 596 00:34:57,800 --> 00:35:01,080 Speaker 1: K bug. A crisis that would have been potentially really 597 00:35:01,160 --> 00:35:05,520 Speaker 1: huge and catastrophic was averted. And rather than saying, hooray, 598 00:35:05,640 --> 00:35:08,479 Speaker 1: we did it, we pulled together, people just kind of said, 599 00:35:08,960 --> 00:35:13,160 Speaker 1: you guys got me all upset for nothing. Nothing happened.
Yeah, 600 00:35:13,200 --> 00:35:16,960 Speaker 1: I mean, let's say that they, um, let's say it 601 00:35:17,000 --> 00:35:19,480 Speaker 1: was all just minor problems that they didn't bother to 602 00:35:19,560 --> 00:35:22,600 Speaker 1: fix beforehand, and all of a sudden, instead of a 603 00:35:22,600 --> 00:35:27,120 Speaker 1: few minor annoyances and aggravations, there are thousands of those, 604 00:35:27,160 --> 00:35:30,399 Speaker 1: and those compound on one another somehow and create 605 00:35:30,440 --> 00:35:33,919 Speaker 1: sort of a domino effect. It's not just that 606 00:35:34,960 --> 00:35:38,400 Speaker 1: they corrected the main systems, like the financial markets and 607 00:35:38,440 --> 00:35:41,040 Speaker 1: the nuclear codes and all those things that they needed 608 00:35:41,080 --> 00:35:44,719 Speaker 1: to correct. They corrected a lot of stuff that had 609 00:35:44,760 --> 00:35:48,040 Speaker 1: a downstream effect, just to make our lives a little 610 00:35:48,120 --> 00:35:52,040 Speaker 1: less disrupted. Um, people looked at Italy at the time. 611 00:35:52,080 --> 00:35:54,239 Speaker 1: They were a country that didn't do a lot, and 612 00:35:54,280 --> 00:35:56,560 Speaker 1: they didn't have a big fallout. But other people, 613 00:35:56,560 --> 00:36:00,600 Speaker 1: who were a lot smarter and don't make knee jerk reactions, said, yeah, 614 00:36:00,680 --> 00:36:02,960 Speaker 1: but you know what, Italy is really small compared to us. 615 00:36:03,280 --> 00:36:05,759 Speaker 1: They weren't nearly as computer reliant at the time as 616 00:36:05,800 --> 00:36:09,760 Speaker 1: we were. And America was fixing their stuff, which helped 617 00:36:09,800 --> 00:36:12,640 Speaker 1: fix stuff for all of the world, because they were 618 00:36:12,680 --> 00:36:16,400 Speaker 1: reliant on us and our systems. That's right.
So 619 00:36:16,480 --> 00:36:18,719 Speaker 1: there's a pretty good argument you could make to all 620 00:36:18,719 --> 00:36:21,200 Speaker 1: those people who are like, no, it's totally fine, see, those 621 00:36:21,200 --> 00:36:24,719 Speaker 1: people didn't do anything: they got a free ride, basically, 622 00:36:24,760 --> 00:36:27,960 Speaker 1: because so much of the software that America was fixing 623 00:36:28,040 --> 00:36:30,960 Speaker 1: is used around the world. That's a really, really important 624 00:36:31,000 --> 00:36:33,319 Speaker 1: point, because that's a point that 625 00:36:33,360 --> 00:36:36,160 Speaker 1: a lot of people make. They'll point to countries, surprisingly, 626 00:36:36,200 --> 00:36:39,439 Speaker 1: like Japan. I can't believe it, but Japan did very 627 00:36:39,480 --> 00:36:41,520 Speaker 1: little to deal with it, and, like you said, there 628 00:36:41,520 --> 00:36:45,680 Speaker 1: were some minor problems at some of their nuclear plants, 629 00:36:45,680 --> 00:36:48,400 Speaker 1: but the fact is a 630 00:36:48,400 --> 00:36:51,080 Speaker 1: lot of those countries benefited from the US leading the 631 00:36:51,160 --> 00:36:54,319 Speaker 1: way on this. Yeah. And one of the strategies from 632 00:36:54,360 --> 00:36:56,279 Speaker 1: other people, who didn't think we should be pouring all 633 00:36:56,280 --> 00:36:59,360 Speaker 1: this money into it beforehand, was called fix on failure. 634 00:36:59,440 --> 00:37:01,480 Speaker 1: They said, why don't we just wait and see 635 00:37:01,480 --> 00:37:03,800 Speaker 1: what happens, and then we'll start to correct things as needed.
636 00:37:04,320 --> 00:37:06,439 Speaker 1: But that's just not... like, we didn't 637 00:37:06,440 --> 00:37:08,759 Speaker 1: know what we didn't know. Like you said earlier, we 638 00:37:08,800 --> 00:37:10,720 Speaker 1: had no way of knowing what it was gonna cause, 639 00:37:11,320 --> 00:37:14,600 Speaker 1: and so fix on failure was not a viable option, 640 00:37:14,680 --> 00:37:18,880 Speaker 1: because trying to patch something that is in chaos all 641 00:37:18,920 --> 00:37:20,880 Speaker 1: of a sudden is not the best way to work. No, 642 00:37:21,120 --> 00:37:23,320 Speaker 1: I mean, like, this is about to date this episode, 643 00:37:23,360 --> 00:37:27,760 Speaker 1: but the Spirit Airlines problems of, like, the last 644 00:37:28,000 --> 00:37:31,200 Speaker 1: week or so are a really great example. Oh man, 645 00:37:31,280 --> 00:37:34,240 Speaker 1: there were some problems with shifting crews around because of weather, 646 00:37:34,800 --> 00:37:39,160 Speaker 1: and all of a sudden, some flights, um, started being 647 00:37:39,200 --> 00:37:41,839 Speaker 1: ready without the crews, and then that led to more 648 00:37:41,880 --> 00:37:45,080 Speaker 1: crews being shifted around, and it was just this cascade 649 00:37:45,160 --> 00:37:47,960 Speaker 1: of flights where they were every day canceling fifty, sixty percent 650 00:37:48,560 --> 00:37:51,319 Speaker 1: of their flights and just leaving people stranded. And it 651 00:37:51,440 --> 00:37:54,560 Speaker 1: was a really great example. Like the, um, what was 652 00:37:54,600 --> 00:37:59,480 Speaker 1: the name of the pipeline that got cyberattacked with ransomware?
653 00:37:59,800 --> 00:38:02,200 Speaker 1: Colonial. Yeah, that was a really great example of, like, 654 00:38:02,640 --> 00:38:05,160 Speaker 1: just these, you know, people waiting in line for gas 655 00:38:05,160 --> 00:38:08,799 Speaker 1: around the whole Southeast and the East Coast for a week, 656 00:38:09,160 --> 00:38:11,640 Speaker 1: more than a week. Like, at the 657 00:38:11,680 --> 00:38:14,400 Speaker 1: time of Y two K, um, the idea that, like, 658 00:38:14,880 --> 00:38:17,000 Speaker 1: we could have just fixed this stuff really kind of 659 00:38:17,040 --> 00:38:21,520 Speaker 1: shows the naivety, how we didn't 660 00:38:21,520 --> 00:38:24,440 Speaker 1: really understand how embedded we already were with computers and 661 00:38:24,680 --> 00:38:27,400 Speaker 1: how dependent we were on them. And you can't fix 662 00:38:28,560 --> 00:38:32,040 Speaker 1: a nuclear missile guidance system after it fails. You need to 663 00:38:32,120 --> 00:38:34,520 Speaker 1: do that ahead of time. So that fix on failure idea, 664 00:38:34,840 --> 00:38:36,640 Speaker 1: it was a pretty bad idea from the get go.
665 00:38:37,719 --> 00:38:40,239 Speaker 1: It was. You mentioned, you know, one of the good 666 00:38:40,280 --> 00:38:42,680 Speaker 1: things that came out of this was the fact that 667 00:38:42,760 --> 00:38:48,360 Speaker 1: post nine eleven, our financial markets didn't crash, largely 668 00:38:48,520 --> 00:38:50,040 Speaker 1: because of a lot of the work they did for 669 00:38:50,480 --> 00:38:53,920 Speaker 1: Y two K in making those systems more robust and updated, 670 00:38:54,480 --> 00:38:57,680 Speaker 1: and that happened kind of across the board. Like the 671 00:38:57,800 --> 00:38:59,879 Speaker 1: investment... I think a lot 672 00:38:59,880 --> 00:39:02,600 Speaker 1: of people and companies were like, well, I guess now 673 00:39:02,640 --> 00:39:04,359 Speaker 1: is as good a time as any to really just sort 674 00:39:04,360 --> 00:39:08,279 Speaker 1: of update everything and to get our systems, you 675 00:39:08,320 --> 00:39:12,320 Speaker 1: know, up to snuff for today. 676 00:39:12,480 --> 00:39:13,880 Speaker 1: Some of the stuff was still running on 677 00:39:14,120 --> 00:39:16,239 Speaker 1: code from the seventies that had been built on and 678 00:39:16,239 --> 00:39:19,319 Speaker 1: built on. So that really helped drive the tech boom 679 00:39:19,320 --> 00:39:22,080 Speaker 1: in a lot of ways, in that companies were investing 680 00:39:22,080 --> 00:39:24,640 Speaker 1: a lot more for the first time in tech.
681 00:39:24,840 --> 00:39:29,080 Speaker 1: And also, you know, there were hundreds of thousands 682 00:39:29,080 --> 00:39:32,080 Speaker 1: of new developers that were trained and hired to deal 683 00:39:32,120 --> 00:39:34,759 Speaker 1: with this, and all of a sudden they were 684 00:39:34,760 --> 00:39:37,280 Speaker 1: looking for jobs, and some of them didn't find jobs, 685 00:39:37,280 --> 00:39:40,400 Speaker 1: so they got creative and started writing apps and writing 686 00:39:40,400 --> 00:39:44,600 Speaker 1: code for other programs. And it really fueled things not only 687 00:39:44,640 --> 00:39:47,120 Speaker 1: here but in India. I know it was a really big 688 00:39:47,160 --> 00:39:51,880 Speaker 1: deal; their IT industry now, compared to 689 00:39:51,880 --> 00:39:53,839 Speaker 1: where it was then, is just like night and day 690 00:39:53,880 --> 00:39:56,200 Speaker 1: because of everyone that got hired to help with Y 691 00:39:56,239 --> 00:39:58,719 Speaker 1: two K. Yeah, it's pretty amazing. Like, they make the 692 00:39:58,719 --> 00:40:02,320 Speaker 1: case that the tech boom was a result of people 693 00:40:02,960 --> 00:40:05,680 Speaker 1: coming into this industry who wouldn't have otherwise been there. 694 00:40:06,000 --> 00:40:09,480 Speaker 1: And also the spending. The spending was just ridiculous, because 695 00:40:09,520 --> 00:40:12,239 Speaker 1: not only were people hiring IT people to fix 696 00:40:12,320 --> 00:40:15,359 Speaker 1: their software, some companies were like, forget this, we're 697 00:40:15,400 --> 00:40:18,520 Speaker 1: just going to completely upgrade our systems.
And these systems 698 00:40:18,560 --> 00:40:21,320 Speaker 1: could have kept hobbling along for another 699 00:40:21,480 --> 00:40:25,000 Speaker 1: ten, fifteen years, say, and then that system would have 700 00:40:25,000 --> 00:40:27,240 Speaker 1: had to have been replaced, and then some other company 701 00:40:27,280 --> 00:40:30,319 Speaker 1: does it five years earlier or five years later. Rather 702 00:40:30,400 --> 00:40:33,040 Speaker 1: than that, the United States' and actually in a lot 703 00:40:33,080 --> 00:40:36,480 Speaker 1: of ways the world's computer systems got upgraded all at once, 704 00:40:37,120 --> 00:40:40,560 Speaker 1: and that kind of laid the foundation, with, you know, 705 00:40:40,680 --> 00:40:44,080 Speaker 1: the industry being flush with tech workers, to just really 706 00:40:44,120 --> 00:40:46,840 Speaker 1: take off, which is great. I saw in that same 707 00:40:46,920 --> 00:40:49,759 Speaker 1: article about that Senate report, Chuck, that said it 708 00:40:49,840 --> 00:40:53,000 Speaker 1: was money well spent. Somebody estimated that for every dollar 709 00:40:53,080 --> 00:40:55,920 Speaker 1: spent on fixing the Y two K bug, it led 710 00:40:55,960 --> 00:40:59,000 Speaker 1: to a return on investment of about six or seven dollars. 711 00:40:59,680 --> 00:41:01,960 Speaker 1: And that is an estimate from two thousand. Now, today, 712 00:41:01,960 --> 00:41:05,120 Speaker 1: in hindsight, if the Y two K bug 713 00:41:05,200 --> 00:41:08,080 Speaker 1: drove the tech boom that started in the late nineties and early two 714 00:41:08,080 --> 00:41:11,880 Speaker 1: thousands, like, it's just countless. So 715 00:41:11,920 --> 00:41:14,360 Speaker 1: it's probably in the trillions of dollars' worth of value 716 00:41:14,360 --> 00:41:18,960 Speaker 1: that it led to, totes.
So I think we kind 717 00:41:18,960 --> 00:41:20,960 Speaker 1: of busted that myth in a way, didn't we? Kind 718 00:41:20,960 --> 00:41:23,640 Speaker 1: of like the War of the Worlds radio broadcast. I think 719 00:41:23,719 --> 00:41:27,719 Speaker 1: so. As usual, Gen X is correct. Go Gen X. 720 00:41:29,120 --> 00:41:31,680 Speaker 1: If you want to know more about how Gen X rules, 721 00:41:32,520 --> 00:41:35,480 Speaker 1: you can go onto the Internet and look at this 722 00:41:35,560 --> 00:41:38,439 Speaker 1: thing that we built called the Internet, and learn 723 00:41:38,560 --> 00:41:41,560 Speaker 1: some more stuff. What do you think of that? I 724 00:41:41,560 --> 00:41:43,600 Speaker 1: think it's great. And since I said learn some more stuff, 725 00:41:43,640 --> 00:41:48,759 Speaker 1: it's time for Listener Mail. I'm gonna call this 726 00:41:48,920 --> 00:41:52,799 Speaker 1: Bobbin Girls, from our Child Labor podcast. Hi, guys, 727 00:41:52,920 --> 00:41:56,320 Speaker 1: I'm Amanda Marus Sars. I'm a longtime listener and 728 00:41:56,440 --> 00:41:58,680 Speaker 1: love everything you guys do. In your Child Labor episode, 729 00:41:58,719 --> 00:42:01,080 Speaker 1: you mentioned you were unsure of the dangers of working as 730 00:42:01,120 --> 00:42:03,160 Speaker 1: a bobbin girl or boy. Well, let me tell you. 731 00:42:03,640 --> 00:42:07,280 Speaker 1: I'm from Lowell, Massachusetts, where they claim the Industrial Revolution 732 00:42:07,360 --> 00:42:10,640 Speaker 1: was born. If you're from this area, in elementary school 733 00:42:10,680 --> 00:42:13,720 Speaker 1: you visit the Boott Cotton Mills Museum for school field trips. 734 00:42:14,200 --> 00:42:16,480 Speaker 1: We learned about the history of the cotton weaving 735 00:42:16,520 --> 00:42:19,759 Speaker 1: processes of the time.
Something that always stuck with me, 736 00:42:20,000 --> 00:42:21,680 Speaker 1: being a little girl at the time of these trips, 737 00:42:21,719 --> 00:42:23,799 Speaker 1: was the bobbin girls. They recited a list 738 00:42:23,840 --> 00:42:26,480 Speaker 1: of horrors that these child laborers went through, from getting 739 00:42:26,480 --> 00:42:29,400 Speaker 1: their fingers snapped off in the weaving machine crevices to 740 00:42:29,440 --> 00:42:33,120 Speaker 1: getting their hair caught and essentially getting scalped. Man, I 741 00:42:33,239 --> 00:42:35,360 Speaker 1: knew it. I totally knew there was 742 00:42:35,400 --> 00:42:38,680 Speaker 1: gonna be something really horrific about it. I just wanted 743 00:42:38,719 --> 00:42:41,239 Speaker 1: to polish off your episode with the most graphic details 744 00:42:41,640 --> 00:42:44,279 Speaker 1: that seven-year-old me could remember being plagued by. 745 00:42:44,440 --> 00:42:49,040 Speaker 1: That is from Amanda Marus Sars. Very nice. Thanks a lot, Amanda. 746 00:42:49,120 --> 00:42:52,279 Speaker 1: That's exactly what I assumed was out there, so thanks 747 00:42:52,320 --> 00:42:55,080 Speaker 1: for filling in the blanks for us. I love it. If 748 00:42:55,080 --> 00:42:57,000 Speaker 1: you want to get in touch with us like Amanda did, 749 00:42:57,160 --> 00:42:59,560 Speaker 1: especially if you love everything we do like she says, 750 00:43:00,320 --> 00:43:02,879 Speaker 1: we love hearing from people like that. You can get 751 00:43:02,880 --> 00:43:05,759 Speaker 1: in touch with us via email. It's Stuff Podcast at 752 00:43:05,760 --> 00:43:11,239 Speaker 1: iHeartRadio dot com. Stuff You Should Know is 753 00:43:11,280 --> 00:43:14,120 Speaker 1: a production of iHeartRadio.
For more podcasts from 754 00:43:14,160 --> 00:43:17,440 Speaker 1: iHeartRadio, visit the iHeartRadio app, Apple Podcasts, 755 00:43:17,560 --> 00:43:22,320 Speaker 1: or wherever you listen to your favorite shows.