1 00:00:00,680 --> 00:00:03,240 Speaker 1: Hey there everyone, it's me Josh, and for this week's 2 00:00:03,440 --> 00:00:07,080 Speaker 1: SYSK Selects, I've chosen a classic episode 3 00:00:07,080 --> 00:00:10,799 Speaker 1: on cyber war. It's almost certainly hopelessly out of date 4 00:00:10,800 --> 00:00:14,680 Speaker 1: by now, but it is an interesting intro to cyber warfare, 5 00:00:15,160 --> 00:00:17,800 Speaker 1: and we learned that Chuck hates the word Stuxnet. 6 00:00:18,200 --> 00:00:21,280 Speaker 1: So listen out for that, kick back and enjoy this 7 00:00:21,440 --> 00:00:28,760 Speaker 1: classic episode of Stuff You Should Know. Welcome to Stuff 8 00:00:28,760 --> 00:00:31,320 Speaker 1: You Should Know, a production of iHeartRadio's How 9 00:00:31,320 --> 00:00:40,040 Speaker 1: Stuff Works. Hey, welcome to the podcast. I'm Josh Clark 10 00:00:40,200 --> 00:00:42,400 Speaker 1: with me as always is Charles W. Chuck Bryant. 11 00:00:43,280 --> 00:00:48,120 Speaker 1: We call him an authority on UM, cyber security, the 12 00:00:48,159 --> 00:00:52,400 Speaker 1: Internet and everything about it. An expert, you would say, 13 00:00:52,560 --> 00:00:55,520 Speaker 1: that's right. Hey, should we say hello to our our 14 00:00:55,720 --> 00:01:00,600 Speaker 1: latest celebrity fan, who we just learned today is miss 15 00:01:00,760 --> 00:01:05,760 Speaker 1: Kristen Bell, the lovely and enchanting and nerdy Kristen Bell. 16 00:01:06,160 --> 00:01:09,759 Speaker 1: Is she nerdy? Very nerdy, like prides herself on it. I mean, 17 00:01:09,760 --> 00:01:12,160 Speaker 1: it doesn't surprise me that she listens to the show. Yeah, 18 00:01:12,200 --> 00:01:15,000 Speaker 1: because she's on record as being a big nerd, which 19 00:01:15,040 --> 00:01:17,160 Speaker 1: is one reason I like her a lot.
And she's 20 00:01:17,240 --> 00:01:21,440 Speaker 1: curating UM a newsweek page right, Yeah, like things she 21 00:01:21,520 --> 00:01:24,160 Speaker 1: likes or kind of one of those deals that they 22 00:01:24,200 --> 00:01:27,720 Speaker 1: do in magazines now and she listed us. That's pretty awesome. 23 00:01:27,720 --> 00:01:30,080 Speaker 1: How about that? Thank you very much for that. I'm 24 00:01:30,120 --> 00:01:32,319 Speaker 1: a huge fan of a party down which she was 25 00:01:32,360 --> 00:01:35,440 Speaker 1: in and other stuff that she's been in forgetting Sarah Marshall, 26 00:01:35,440 --> 00:01:37,800 Speaker 1: how's it going? And her You gotta see her sloth 27 00:01:38,120 --> 00:01:40,440 Speaker 1: video online that she did on the Ellen Show. It's 28 00:01:40,440 --> 00:01:44,039 Speaker 1: pretty funny, highly recommended. Okay, so that's all I got. 29 00:01:44,120 --> 00:01:47,200 Speaker 1: That's all the sucking up I'm gonna do. Should we 30 00:01:47,319 --> 00:01:52,960 Speaker 1: say hi to another fan, slightly less famous but pretty cool? Yes? Sam? 31 00:01:53,080 --> 00:01:55,000 Speaker 1: Do you want to give a little a little backstory? 32 00:01:55,760 --> 00:01:58,280 Speaker 1: Giving some backstory here? Sure? We um. We had a 33 00:01:58,440 --> 00:02:01,160 Speaker 1: live Um tribute event here in Atlanta a couple of 34 00:02:01,240 --> 00:02:04,560 Speaker 1: years ago, and at the event was a little, uh 35 00:02:05,040 --> 00:02:10,040 Speaker 1: teenage fan named Sam teenage you mean like just post tween. Yeah, like, 36 00:02:10,760 --> 00:02:12,560 Speaker 1: and his mom brought him and he's a big fan. 37 00:02:12,639 --> 00:02:14,760 Speaker 1: He's just like really sweet kid. Mom was very sweet, sweet, 38 00:02:15,200 --> 00:02:17,040 Speaker 1: very nice family. 
Flash forward a couple of years we 39 00:02:17,080 --> 00:02:20,200 Speaker 1: got to South by Southwest, there's Sam again, apparently mom 40 00:02:20,360 --> 00:02:23,440 Speaker 1: drove him to Austin to come see our live podcast. Yeah, 41 00:02:23,600 --> 00:02:26,639 Speaker 1: they weren't there to see anything else. I spoke to mom afterwards 42 00:02:26,760 --> 00:02:28,760 Speaker 1: and was like, so, what else are you guys gonna see? 43 00:02:28,800 --> 00:02:30,799 Speaker 1: She's like nothing. We're going back. Crazy. I was like, 44 00:02:31,360 --> 00:02:34,040 Speaker 1: what else did you see? She's like nothing. They came 45 00:02:34,120 --> 00:02:35,880 Speaker 1: to see us. So then we were like, all right, 46 00:02:35,880 --> 00:02:38,120 Speaker 1: we gotta think of something for Sammy because he's proved 47 00:02:38,200 --> 00:02:40,519 Speaker 1: his mettle. And Sam wrote in and and sent his 48 00:02:40,639 --> 00:02:43,200 Speaker 1: resume and like all the reasons we should put him 49 00:02:43,200 --> 00:02:45,840 Speaker 1: to work, and it's just like, dude, the future 50 00:02:45,880 --> 00:02:48,160 Speaker 1: is secure. If kids are like Sam, I'm not 51 00:02:48,200 --> 00:02:50,919 Speaker 1: worried about a thing anymore. Yeah. So we we racked 52 00:02:50,960 --> 00:02:53,000 Speaker 1: our brains and we found out there's like a surprising 53 00:02:53,000 --> 00:02:57,280 Speaker 1: amount of adult only tasks that we do, like at 54 00:02:57,280 --> 00:02:59,840 Speaker 1: any given time, and we're like, all right, we have 55 00:02:59,919 --> 00:03:02,640 Speaker 1: to we have to figure out something that's age appropriate 56 00:03:02,720 --> 00:03:05,320 Speaker 1: for Sam. That's right. So long story short, I was 57 00:03:05,360 --> 00:03:08,080 Speaker 1: getting kind of thin on podcast topics.
I put Sam 58 00:03:08,160 --> 00:03:10,639 Speaker 1: on the case and he sent me like a stellar, 59 00:03:10,720 --> 00:03:14,160 Speaker 1: stellar list with reasons why we should do these and uh, 60 00:03:14,639 --> 00:03:16,760 Speaker 1: this is the first one. This is one, and he 61 00:03:16,840 --> 00:03:18,480 Speaker 1: had a lot of overlap on ones we had already 62 00:03:18,520 --> 00:03:20,239 Speaker 1: recorded that aren't out yet, so that just goes to 63 00:03:20,280 --> 00:03:22,880 Speaker 1: show you that Sam is like he gets the show right. 64 00:03:23,120 --> 00:03:26,119 Speaker 1: And uh so, as Sam's picking these out, we're gonna 65 00:03:26,240 --> 00:03:28,040 Speaker 1: let you know if this is a Sam one. But 66 00:03:28,160 --> 00:03:30,120 Speaker 1: this is going on in the summer, so we're gonna call 67 00:03:30,200 --> 00:03:33,840 Speaker 1: this the Summer of Sam. That's right, Sam's choice. All right. 68 00:03:33,919 --> 00:03:37,000 Speaker 1: So that's the longest intro ever. That wasn't even the intro. Man. Well, 69 00:03:37,080 --> 00:03:42,160 Speaker 1: let's get to cyber wars then. Let's bossy. Well, you know, Chuck, 70 00:03:42,320 --> 00:03:48,400 Speaker 1: have you ever been to Bellingham, Washington? No? Okay, I 71 00:03:48,480 --> 00:03:51,440 Speaker 1: have not. Have you been to Washington? Been in Seattle? 72 00:03:51,800 --> 00:03:54,720 Speaker 1: Isn't that where um Van Nostrand lives? Yeah? Or is 73 00:03:54,720 --> 00:03:58,480 Speaker 1: it Oregon? No, he's Washington. Okay, um. So, uh 74 00:03:59,040 --> 00:04:03,520 Speaker 1: in Bellingham, Washington in June of 1999, at the Olympic Pipeline 75 00:04:03,560 --> 00:04:09,800 Speaker 1: Company, a systems control and data acquisition system. There's systems 76 00:04:09,840 --> 00:04:14,880 Speaker 1: twice in there.
Um, but a SCADA system um, 77 00:04:14,960 --> 00:04:18,040 Speaker 1: which is basically like a computer program that like can 78 00:04:18,320 --> 00:04:22,240 Speaker 1: make a valve turn or turn something off mechanically, 79 00:04:22,520 --> 00:04:28,719 Speaker 1: right, from from digital binary instructions, right. Um, this Olympic 80 00:04:28,800 --> 00:04:32,040 Speaker 1: Pipeline Company system was operating on this this type of 81 00:04:32,320 --> 00:04:37,320 Speaker 1: program and UM something went wrong and one of their 82 00:04:38,040 --> 00:04:42,360 Speaker 1: pipes started leaking a lot, like millions of gallons of 83 00:04:42,440 --> 00:04:46,719 Speaker 1: gasoline UM and part of it erupted into a fireball 84 00:04:46,760 --> 00:04:50,960 Speaker 1: and killed three people, injured many others. UM and they 85 00:04:51,080 --> 00:04:52,440 Speaker 1: went back and looked at it. They think it was 86 00:04:52,520 --> 00:04:57,240 Speaker 1: just a system malfunction. But the fact that this came along, 87 00:04:57,320 --> 00:05:00,640 Speaker 1: this happened because of this system control, and it 88 00:05:00,720 --> 00:05:04,920 Speaker 1: happened as the dot com bubble was starting to 89 00:05:05,000 --> 00:05:07,560 Speaker 1: grow and like the Internet was really becoming a huge thing. 90 00:05:08,560 --> 00:05:12,320 Speaker 1: UM people who are into cybersecurity now point to this 91 00:05:12,680 --> 00:05:17,160 Speaker 1: as evidence of exactly what somebody could do during a 92 00:05:17,200 --> 00:05:23,440 Speaker 1: cyber attack, even though they think this was just an accident. Right, irrelevant, 93 00:05:24,279 --> 00:05:27,080 Speaker 1: but they're they're, they weren't like pointing to that as well.
No, 94 00:05:27,240 --> 00:05:29,760 Speaker 1: they don't think that had anything to do with it, but they 95 00:05:29,800 --> 00:05:31,800 Speaker 1: were saying, this is what it would look like if 96 00:05:31,839 --> 00:05:34,760 Speaker 1: somebody had wanted to attack. Like, this is what a 97 00:05:34,839 --> 00:05:38,400 Speaker 1: cyber attack would look like. Because it's not just the 98 00:05:38,480 --> 00:05:43,600 Speaker 1: Olympic Pipeline Company that's using these systems UM. All over 99 00:05:43,640 --> 00:05:50,880 Speaker 1: the United States, companies, law enforcement agencies, military, banks, UM, 100 00:05:51,600 --> 00:05:54,720 Speaker 1: public works, all of these things are all running on 101 00:05:55,200 --> 00:06:01,240 Speaker 1: what amounts to Windows. It's as simple as that. Yeah, Microsoft 102 00:06:01,320 --> 00:06:05,440 Speaker 1: systems, many of them, and um, Jonathan Strickland from TechStuff 103 00:06:05,480 --> 00:06:09,040 Speaker 1: wrote this article, and as Strickland points out, um, 104 00:06:09,880 --> 00:06:13,040 Speaker 1: a couple of things. Microsoft has been uh kind of 105 00:06:13,160 --> 00:06:16,200 Speaker 1: chastised over the years for their security or lack of 106 00:06:16,279 --> 00:06:21,360 Speaker 1: security in some of their programs, and um, the other 107 00:06:21,400 --> 00:06:24,160 Speaker 1: thing he points out is the Internet grew so fast 108 00:06:24,760 --> 00:06:27,480 Speaker 1: and everyone got on board so quickly that it kind 109 00:06:27,480 --> 00:06:32,120 Speaker 1: of outpaced what we could even do security wise.
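[Editor's note: the SCADA idea described above, a supervisory program turning digital instructions into physical actions like opening a valve, and reading sensor data back, can be sketched roughly as below. This is a toy illustration only: every class, function, and threshold here is invented for the sketch, and real SCADA masters talk to remote terminal units over industrial protocols, not Python objects.]

```python
# Toy sketch of a SCADA-style monitor-and-control cycle. All names and
# numbers are invented for illustration; nothing here models real hardware.

class PipelineValve:
    """Stand-in for a field device controlled by a supervisory system."""

    def __init__(self):
        self.is_open = False
        self.pressure_psi = 0.0

    def actuate(self, command):
        # A SCADA master ultimately sends something like this: a simple
        # digital instruction that gets translated into mechanical motion.
        if command == "OPEN":
            self.is_open = True
        elif command == "CLOSE":
            self.is_open = False


def supervisory_loop(valve, pressure_reading, max_safe_psi=700.0):
    """One pass of the cycle: record telemetry, act on it if out of range."""
    valve.pressure_psi = pressure_reading
    if pressure_reading > max_safe_psi:
        valve.actuate("CLOSE")  # automatic protective action
        return "ALARM"
    return "OK"


valve = PipelineValve()
valve.actuate("OPEN")
# An overpressure reading trips the protective logic and shuts the valve.
status = supervisory_loop(valve, pressure_reading=850.0)
```

The point of the sketch is only that the whole safety decision lives in software: whoever controls (or corrupts) the supervisory program controls the valve.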
It 110 00:06:32,279 --> 00:06:35,560 Speaker 1: was all of a sudden, Government agencies and power grids 111 00:06:35,640 --> 00:06:39,880 Speaker 1: and emergency services and weapons systems, water and fuel pipelines, 112 00:06:40,000 --> 00:06:43,360 Speaker 1: all the stuff is running on on computers and a 113 00:06:43,400 --> 00:06:46,360 Speaker 1: lot of it through the Internet. And we don't quite 114 00:06:46,440 --> 00:06:48,920 Speaker 1: know how to guard against a cyber attack. No, And 115 00:06:49,000 --> 00:06:51,960 Speaker 1: apparently even as far as like the knowledge of how 116 00:06:52,040 --> 00:06:55,400 Speaker 1: to guard against cyber attacks goes, the United States is 117 00:06:56,320 --> 00:07:00,480 Speaker 1: is lacking, Yeah, compared to like China and Russia. UM, 118 00:07:01,360 --> 00:07:05,520 Speaker 1: so we're kind of in this really weird position right 119 00:07:05,560 --> 00:07:09,680 Speaker 1: now where we've realized that all of the ponies are 120 00:07:09,760 --> 00:07:13,240 Speaker 1: hooked to a single basket of eggs and all it's 121 00:07:13,240 --> 00:07:16,640 Speaker 1: gonna take is a couple of black cat firecrackers to 122 00:07:16,760 --> 00:07:20,440 Speaker 1: scare all the horses off. That's the best analogy I 123 00:07:20,520 --> 00:07:22,400 Speaker 1: can come up with. Did you just think of that? 124 00:07:22,560 --> 00:07:25,320 Speaker 1: Or did you I just thought it that. My imagination 125 00:07:25,400 --> 00:07:28,080 Speaker 1: is back. I can tell you where time traveled to 126 00:07:28,360 --> 00:07:33,240 Speaker 1: awesome where I can't think. Okay, um, all right, let's 127 00:07:33,240 --> 00:07:35,160 Speaker 1: go back a little bit in time. I don't think 128 00:07:35,200 --> 00:07:36,960 Speaker 1: we need the way Back machine for this, because we're 129 00:07:37,000 --> 00:07:39,920 Speaker 1: just going to We can just like walk outside. 
Yeah, 130 00:07:39,920 --> 00:07:44,160 Speaker 1: they'd be the waste of time for the way Back machine. Umn. 131 00:07:44,240 --> 00:07:47,920 Speaker 1: Some pretty smart people caught on early that hey, we 132 00:07:47,960 --> 00:07:50,240 Speaker 1: could be vulnerable to something like a cyber attack. So 133 00:07:50,320 --> 00:07:52,400 Speaker 1: let's look into this. Let's put a red team on it. 134 00:07:53,080 --> 00:07:57,040 Speaker 1: Red team is a our friends that act as enemies 135 00:07:57,160 --> 00:07:58,680 Speaker 1: to try and you know how they hire these people 136 00:07:58,720 --> 00:08:01,280 Speaker 1: to like break into your home. Yeah, those are red 137 00:08:01,360 --> 00:08:05,800 Speaker 1: teams basically, right, like um from Star Wars. Yeah, exactly. 138 00:08:06,160 --> 00:08:09,240 Speaker 1: So let's get a red team. Let's let's uh name 139 00:08:09,280 --> 00:08:13,200 Speaker 1: it this mission something really cool out of a football playbook. 140 00:08:13,320 --> 00:08:16,560 Speaker 1: Let's name it Project Eligible Receiver. Do you know how 141 00:08:16,600 --> 00:08:18,240 Speaker 1: many times I had to look at that before it 142 00:08:18,320 --> 00:08:21,120 Speaker 1: finally sunk in what words I was looking at? Really 143 00:08:22,000 --> 00:08:24,000 Speaker 1: it is? It does look kind of funny. It looks 144 00:08:24,040 --> 00:08:27,880 Speaker 1: like eleanor Rigby when you glance at it, at least 145 00:08:27,920 --> 00:08:30,440 Speaker 1: I think, so, yeah, it doesn't to me. I was 146 00:08:30,520 --> 00:08:33,120 Speaker 1: thinking more of like a radio receiver or something like, 147 00:08:33,600 --> 00:08:35,720 Speaker 1: I think it just means about football. 
No, it totally 148 00:08:36,280 --> 00:08:38,199 Speaker 1: just saying like I read this many times before, I 149 00:08:38,280 --> 00:08:40,959 Speaker 1: was like, oh, okay. So a lot of this is 150 00:08:41,000 --> 00:08:43,440 Speaker 1: still classified, so we don't know everything. But basically they 151 00:08:43,559 --> 00:08:46,280 Speaker 1: hired some hackers, which is what you do to test 152 00:08:46,280 --> 00:08:48,959 Speaker 1: your security, they being the Department of Defense. Yeah, yeah, 153 00:08:49,000 --> 00:08:52,520 Speaker 1: Department of Defense saying hey, can you nerds hack into 154 00:08:53,040 --> 00:08:58,120 Speaker 1: the Pentagon system and afterwards we won't assassinate you. Exactly. And 155 00:08:58,320 --> 00:09:02,480 Speaker 1: the nerds were like, just watch this. And it took 156 00:09:02,559 --> 00:09:05,480 Speaker 1: three days before the Pentagon even knew that they were 157 00:09:05,679 --> 00:09:09,559 Speaker 1: being cyber attacked by the Red Team. Like, pretty successful 158 00:09:09,600 --> 00:09:13,400 Speaker 1: and very sobering. Yeah, so um, they, uh, it 159 00:09:13,600 --> 00:09:16,000 Speaker 1: was I guess kind of an eye opener for the 160 00:09:16,120 --> 00:09:19,160 Speaker 1: DOD and they I'm sure used it to 161 00:09:19,240 --> 00:09:22,959 Speaker 1: step up security. Not fast enough though, because after this 162 00:09:23,160 --> 00:09:28,320 Speaker 1: Red Team attack, um Operation Eligible Receiver, an actual attack 163 00:09:28,400 --> 00:09:32,320 Speaker 1: which they later came to call, what was it, Moonlight Maze? Yeah, 164 00:09:32,360 --> 00:09:35,199 Speaker 1: this is one year after that test.
A year 165 00:09:35,320 --> 00:09:39,280 Speaker 1: after UM, somebody launched an attack and it was a 166 00:09:40,280 --> 00:09:43,280 Speaker 1: I guess what's probably the most typical kind of cyber attack, 167 00:09:43,400 --> 00:09:46,560 Speaker 1: where you insert some sort of software to basically spy 168 00:09:47,480 --> 00:09:52,360 Speaker 1: and get files and gather data and download sensitive materials. Right, 169 00:09:53,679 --> 00:09:59,280 Speaker 1: And apparently took two years before NASA, the Pentagon, UM 170 00:09:59,480 --> 00:10:03,800 Speaker 1: and other agencies in the US government noticed that UM 171 00:10:04,400 --> 00:10:07,760 Speaker 1: accidentally noticed that this that they were being spied on. 172 00:10:08,160 --> 00:10:12,839 Speaker 1: Cyber wise, Yeah, they got data like strategic maps, troop assignments, 173 00:10:13,040 --> 00:10:17,880 Speaker 1: and positions. Not good, right, very scary. And they trace 174 00:10:17,960 --> 00:10:20,920 Speaker 1: it back to Russia. Doesn't necessarily mean that it came 175 00:10:20,960 --> 00:10:24,280 Speaker 1: from Russia in its origin, but at least that's where 176 00:10:24,280 --> 00:10:28,360 Speaker 1: they traced it to. Uh. And this is cyber warfare, 177 00:10:28,480 --> 00:10:31,000 Speaker 1: like it's happening. It's been going on since the nineties 178 00:10:31,040 --> 00:10:34,360 Speaker 1: pretty much. Yeah, I mean it's not is a cyber 179 00:10:34,400 --> 00:10:37,480 Speaker 1: war coming, it's like, how do we prevent like a 180 00:10:37,559 --> 00:10:40,720 Speaker 1: cyber war from bringing us all down? Yeah, pretty much. 181 00:10:41,040 --> 00:10:44,520 Speaker 1: And it's apparently from looking into this, there's like two camps. 182 00:10:44,559 --> 00:10:46,719 Speaker 1: There's like a gloom and doom camp where it's like, yeah, 183 00:10:46,840 --> 00:10:49,040 Speaker 1: somebody really wants to mess things up. 
They're going to 184 00:10:49,120 --> 00:10:51,920 Speaker 1: be able to, it's gonna be pretty easy. And the 185 00:10:52,160 --> 00:10:55,520 Speaker 1: sunny optimistic camp is kind of like, no, you know, 186 00:10:55,640 --> 00:10:57,920 Speaker 1: we know what we're looking for now. Like, sure they could 187 00:10:58,000 --> 00:11:01,120 Speaker 1: launch an attack, but we'll be able to stop 188 00:11:01,200 --> 00:11:03,360 Speaker 1: it in time before they can do like a 189 00:11:03,480 --> 00:11:07,199 Speaker 1: lot of damage. Yeah, so we'll see, we'll lay out 190 00:11:07,280 --> 00:11:25,000 Speaker 1: everything for you and you can decide who's right. That's right. 191 00:11:27,080 --> 00:11:36,880 Speaker 1: So we've already mentioned that on 192 00:11:37,000 --> 00:11:40,760 Speaker 1: the defensive side of things, the US is sorely lacking, um. 193 00:11:40,840 --> 00:11:43,120 Speaker 1: But on the offensive side of things, we've actually done 194 00:11:43,160 --> 00:11:47,040 Speaker 1: this ourselves more than once. Um, during the Kosovo 195 00:11:47,120 --> 00:11:50,880 Speaker 1: War, Strickland points out, we used computer attacks to 196 00:11:51,720 --> 00:11:55,760 Speaker 1: compromise Serbian air defenses, basically kind of scrambling their information 197 00:11:56,440 --> 00:11:59,559 Speaker 1: so they had bad, I guess, coordinates. Well, just what was on 198 00:11:59,640 --> 00:12:07,040 Speaker 1: the radar screen wasn't wasn't apt? Was okay? Or appropriate? 199 00:12:07,160 --> 00:12:11,319 Speaker 1: Did you see that one? So we did this, we 200 00:12:11,440 --> 00:12:15,600 Speaker 1: launched it and it and it worked. So, uh, that's 201 00:12:15,640 --> 00:12:17,480 Speaker 1: a good thing, but it's also a bad thing if 202 00:12:17,520 --> 00:12:21,240 Speaker 1: you're like, was it Bush the first or Clinton and 203 00:12:21,360 --> 00:12:24,679 Speaker 1: Bush the second?
Bush the second in two thousand three 204 00:12:24,720 --> 00:12:26,920 Speaker 1: in Iraq, and Clinton? Well, they were both like, 205 00:12:27,040 --> 00:12:28,719 Speaker 1: we don't think we should be doing much of this 206 00:12:28,960 --> 00:12:32,840 Speaker 1: because a couple of reasons. A, it basically opens us up. 207 00:12:32,920 --> 00:12:34,559 Speaker 1: It's like, hey, they did this, so we can do 208 00:12:34,600 --> 00:12:38,439 Speaker 1: it right back. And B, I think they could have 209 00:12:38,559 --> 00:12:42,640 Speaker 1: drained some banks of terrorist cells. And they said, we 210 00:12:42,800 --> 00:12:45,960 Speaker 1: kind of depend on the integrity of the banking system worldwide, 211 00:12:46,480 --> 00:12:48,080 Speaker 1: like we don't want to start messing around with it. 212 00:12:48,160 --> 00:12:52,319 Speaker 1: So apparently with with UM cyber warfare, it's very much 213 00:12:52,440 --> 00:12:57,160 Speaker 1: like, UM, when you build that virus, it's out there 214 00:12:57,520 --> 00:13:00,480 Speaker 1: and it can be captured and studied and redeployed 215 00:13:00,480 --> 00:13:04,680 Speaker 1: against you. Yeah, so what they were saying with Clinton 216 00:13:04,800 --> 00:13:07,400 Speaker 1: and Bush, who were saying like, no, we're not going 217 00:13:07,480 --> 00:13:10,280 Speaker 1: to use a virus to UM to drain those bank 218 00:13:10,320 --> 00:13:13,800 Speaker 1: accounts, because it will eventually come back 219 00:13:13,880 --> 00:13:16,640 Speaker 1: on us, and our banking industry is not secure enough 220 00:13:17,120 --> 00:13:20,800 Speaker 1: to withstand something that we ourselves make. Because apparently the 221 00:13:20,920 --> 00:13:24,800 Speaker 1: US is pretty good at making viruses. I'm sure. Should 222 00:13:24,840 --> 00:13:27,000 Speaker 1: we talk about some of the different ways that this 223 00:13:27,120 --> 00:13:32,360 Speaker 1: can go down.
Yeah, the Pearl Harbor attack. Yes, I 224 00:13:32,400 --> 00:13:34,320 Speaker 1: had the feeling Strickland might've named this one himself, 225 00:13:34,440 --> 00:13:36,079 Speaker 1: but it's not true. He went to a lot of 226 00:13:36,120 --> 00:13:39,160 Speaker 1: trouble to explain why it's called the Pearl Harbor strategy, 227 00:13:39,200 --> 00:13:40,679 Speaker 1: and I think he could have just left it at that. 228 00:13:41,520 --> 00:13:44,240 Speaker 1: The idea here is that it's it's pretty much in 229 00:13:44,320 --> 00:13:46,880 Speaker 1: your face. It's a massive cyber attack where they infiltrate 230 00:13:47,320 --> 00:13:50,920 Speaker 1: and then they sabotage systems, UM much like Pearl Harbor 231 00:13:51,360 --> 00:13:54,920 Speaker 1: was a big surprise and a big attack, wasn't it? I mean, 232 00:13:54,960 --> 00:13:57,360 Speaker 1: it was sneaky, but it wasn't quiet by any means, 233 00:13:57,720 --> 00:14:01,360 Speaker 1: or stealthy, I guess, is the word. UM, the other 234 00:14:01,440 --> 00:14:05,880 Speaker 1: ones are pretty much stealthy. Part of a Pearl Harbor attack, 235 00:14:05,920 --> 00:14:09,120 Speaker 1: I believe, UM could be a distributed denial of service attack, 236 00:14:09,760 --> 00:14:12,760 Speaker 1: which is basically, you know, like when you UM try 237 00:14:12,840 --> 00:14:15,679 Speaker 1: to get onto a website or whatever, you're sending a 238 00:14:15,840 --> 00:14:19,760 Speaker 1: request to the server to let you on, right? Now, 239 00:14:19,920 --> 00:14:22,600 Speaker 1: if you assault that one server with millions of pings 240 00:14:23,200 --> 00:14:26,920 Speaker 1: and it's trying to accommodate everybody as is appropriate and 241 00:14:27,040 --> 00:14:33,400 Speaker 1: apt UM, it'll basically they crash.
is the point. You 242 00:14:33,480 --> 00:14:35,800 Speaker 1: can crash a server by hitting it with millions of 243 00:14:35,840 --> 00:14:37,760 Speaker 1: pings all at once. It just slows it down to the 244 00:14:37,840 --> 00:14:39,960 Speaker 1: point either where it doesn't work or it crashes. Yeah, 245 00:14:40,040 --> 00:14:42,040 Speaker 1: and that's that's what Anonymous likes to do, with like 246 00:14:42,240 --> 00:14:45,200 Speaker 1: MasterCard during the whole WikiLeaks thing. Was it 247 00:14:45,240 --> 00:14:48,120 Speaker 1: a MasterCard or Visa crash? I cannot remember. Um, 248 00:14:48,280 --> 00:14:50,640 Speaker 1: remember when that happened, though? It's basically just launching a 249 00:14:50,720 --> 00:14:53,560 Speaker 1: bunch of server requests at a specific server and the server is 250 00:14:53,600 --> 00:14:56,800 Speaker 1: like no, no, and just falls over. Is that why 251 00:14:56,840 --> 00:15:01,520 Speaker 1: people say ping, by the way? The ping? Um, yeah, 252 00:15:02,040 --> 00:15:06,760 Speaker 1: I hate that. It's better than javastorm. I don't even 253 00:15:06,800 --> 00:15:09,600 Speaker 1: know what that is. Drinking coffee while you're having a brainstorm, 254 00:15:09,840 --> 00:15:12,440 Speaker 1: like let's go get coffee and brainstorm something. Javastorm. Do 255 00:15:12,520 --> 00:15:17,440 Speaker 1: people say that? Yeah. I don't say it. I've never 256 00:15:17,520 --> 00:15:21,880 Speaker 1: heard of that. That, ping, and meta are the three 257 00:15:21,920 --> 00:15:24,960 Speaker 1: things that I will never say. Epic may be the worst, 258 00:15:25,320 --> 00:15:29,080 Speaker 1: to call something epic. I don't mind epic. Oh man, 259 00:15:29,200 --> 00:15:31,800 Speaker 1: I hate epic. At least it's a real word. Especially 260 00:15:32,120 --> 00:15:38,400 Speaker 1: epic fail. Well yeah, sure, okay, back to it. Viruses, uh, 261 00:15:38,800 --> 00:15:43,280 Speaker 1: Code Red, Slammer, Nimda.
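[Editor's note: the denial-of-service mechanics described a moment earlier, a server that can only answer so many requests per second getting buried when flooded, can be modeled with a toy queue simulation. Every capacity and limit below is an invented number purely for illustration; real servers and real attack traffic are vastly more complicated.]

```python
# Back-of-the-envelope model of why a request flood takes a server down:
# each second the server answers a fixed number of requests, and anything
# beyond that backs up until the queue (and eventually the process) gives out.

def simulate_flood(requests_per_sec, capacity_per_sec=1_000,
                   queue_limit=10_000, seconds=30):
    """Return ('OK', final_backlog) or ('CRASHED', second) for a constant rate."""
    backlog = 0
    for t in range(seconds):
        backlog += requests_per_sec                # new requests arrive
        backlog -= min(backlog, capacity_per_sec)  # server answers what it can
        if backlog > queue_limit:                  # queue overflows: game over
            return ("CRASHED", t)
    return ("OK", backlog)


normal = simulate_flood(requests_per_sec=800)    # under capacity: fine forever
flood = simulate_flood(requests_per_sec=5_000)   # 5x capacity: backlog explodes
```

The asymmetry is the whole trick: sending a request is nearly free for the attacker, while answering one costs the server real work, so enough cheap requests from enough machines overwhelm any fixed capacity.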
These are viruses that Strickland has 262 00:15:43,360 --> 00:15:47,120 Speaker 1: mentioned that spread very quickly across the Internet, and 263 00:15:47,280 --> 00:15:49,440 Speaker 1: there's a couple of ways this can go down. You 264 00:15:49,520 --> 00:15:54,960 Speaker 1: can either, um, you can set a, you can do 265 00:15:55,000 --> 00:15:57,560 Speaker 1: it immediately and release a virus. You can have all 266 00:15:57,600 --> 00:16:02,160 Speaker 1: these other computers deliver the virus. You can put sort 267 00:16:02,200 --> 00:16:04,880 Speaker 1: of like a delay timer on your virus for it 268 00:16:04,920 --> 00:16:09,200 Speaker 1: to go off in two years, automatically or manually, whenever 269 00:16:09,280 --> 00:16:11,480 Speaker 1: you want to. It can be waiting for you to 270 00:16:11,600 --> 00:16:14,160 Speaker 1: hit the button and then launch the virus that way, 271 00:16:14,360 --> 00:16:17,800 Speaker 1: or, I think, um, for the user of that computer 272 00:16:17,920 --> 00:16:21,560 Speaker 1: to do, like say, control alt delete, and that will trigger 273 00:16:21,600 --> 00:16:24,920 Speaker 1: it or something. Yeah, that's pretty scary. Yeah, I don't know, I 274 00:16:24,960 --> 00:16:28,600 Speaker 1: press those three buttons all the time on my PC. 275 00:16:28,800 --> 00:16:32,120 Speaker 1: Oh my god, Chuck. I think we should talk about, 276 00:16:32,480 --> 00:16:34,840 Speaker 1: right about here is, I think, where Stuxnet fits 277 00:16:34,880 --> 00:16:39,960 Speaker 1: in. Stuxnet. Say it. Where Stuxnet fits in. 278 00:16:40,040 --> 00:16:42,160 Speaker 1: I don't know what that is. You know it's Stuxnet. 279 00:16:42,320 --> 00:16:45,680 Speaker 1: Is that in this? Yeah. It's the Iranian, um, it's 280 00:16:45,720 --> 00:16:48,520 Speaker 1: the virus that the US and Israel unleashed on Iran. 281 00:16:49,200 --> 00:16:52,160 Speaker 1: It's a perfect example of this. It is.
You're right, 282 00:16:52,960 --> 00:16:56,320 Speaker 1: so let's talk about Stuxnet. Stuxnet. It's a 283 00:16:56,360 --> 00:17:00,160 Speaker 1: great name. It was an offensive cyber attack. In 284 00:17:00,200 --> 00:17:01,840 Speaker 1: two thousand and ten, they think maybe it was the 285 00:17:01,880 --> 00:17:05,200 Speaker 1: first one ever, the US launched like a strictly for 286 00:17:05,320 --> 00:17:10,280 Speaker 1: sabotage attack. Basically, they wanted to disable Iran, UH, Iran's 287 00:17:11,000 --> 00:17:16,040 Speaker 1: UH centrifuges so they could not enrich uranium. And they 288 00:17:16,119 --> 00:17:19,919 Speaker 1: did this through the UH, the new Air Force base 289 00:17:19,960 --> 00:17:25,159 Speaker 1: out of Texas, right? Texas and Georgia. Yeah, what's the 290 00:17:25,880 --> 00:17:30,320 Speaker 1: Warner Robins? Robins Air Force Base? Yeah, Robins Air Force Base. Yeah, 291 00:17:30,359 --> 00:17:32,920 Speaker 1: those two places are where they're stationed. Yeah. And this 292 00:17:33,080 --> 00:17:38,359 Speaker 1: is the first all cyber unit. Pretty much. Pretty cool, right? 293 00:17:38,440 --> 00:17:41,719 Speaker 1: Their whole, is it, their whole task is to wage 294 00:17:41,800 --> 00:17:46,680 Speaker 1: cyber warfare, and I imagine to be defensive against cyber attacks. 295 00:17:47,240 --> 00:17:50,560 Speaker 1: But um, I don't I don't know if they had 296 00:17:50,640 --> 00:17:53,720 Speaker 1: to do with Stuxnet, but they probably would have. 297 00:17:54,440 --> 00:17:58,680 Speaker 1: Um, I think it was being developed before it was ordained 298 00:17:59,000 --> 00:18:00,840 Speaker 1: in two thousand nine. I think it went back to two 299 00:18:00,840 --> 00:18:05,560 Speaker 1: thousand seven when it was started.
But basically, the they 300 00:18:06,000 --> 00:18:09,720 Speaker 1: the CIA got their hands on centrifuges that they knew 301 00:18:09,880 --> 00:18:14,080 Speaker 1: Iran was using, and they had just as many as 302 00:18:14,160 --> 00:18:17,440 Speaker 1: Iran did of the same kind, and they studied it 303 00:18:17,800 --> 00:18:22,040 Speaker 1: and they built this virus based on this configuration of 304 00:18:22,160 --> 00:18:28,119 Speaker 1: centrifuges running Windows and Siemens switches, right? Yeah, and then 305 00:18:28,119 --> 00:18:30,480 Speaker 1: they built a virus to go infiltrate it. I thought 306 00:18:30,480 --> 00:18:33,600 Speaker 1: it was called Operation Olympic Games. It was, but the malware, 307 00:18:33,680 --> 00:18:37,240 Speaker 1: the virus itself, is called Stuxnet. Okay, that's what I couldn't 308 00:18:37,240 --> 00:18:39,160 Speaker 1: figure out, but you're right. It was called Operation 309 00:18:39,240 --> 00:18:44,840 Speaker 1: Olympic Games. And this whole operation was this huge, sweeping, awesome, 310 00:18:45,240 --> 00:18:50,520 Speaker 1: massive, secretive, basically imagine like the CIA. Do you remember 311 00:18:50,640 --> 00:18:54,159 Speaker 1: Uncommon Valor? Oh yeah. Okay, do you remember when like 312 00:18:54,320 --> 00:18:58,119 Speaker 1: they're training at that replica of the camp? Okay, the 313 00:18:58,240 --> 00:19:02,600 Speaker 1: CIA did that with Iran's centrifuges and the nuclear program, 314 00:19:03,040 --> 00:19:04,679 Speaker 1: and they figured out exactly how it worked, and then 315 00:19:04,720 --> 00:19:06,359 Speaker 1: they figured out the best way to break it. Was 316 00:19:06,440 --> 00:19:08,399 Speaker 1: Gene Hackman bankrolling the whole thing? Oh yeah, he 317 00:19:08,520 --> 00:19:10,440 Speaker 1: was there to get his son out.
He he was 318 00:19:10,560 --> 00:19:13,359 Speaker 1: just staring at this menu of guns and silhouettes that 319 00:19:13,440 --> 00:19:15,760 Speaker 1: he wanted to order. Remember that? Oh yeah. Dude, 320 00:19:15,760 --> 00:19:17,840 Speaker 1: I thought that was so bad. Yeah, but that 321 00:19:17,960 --> 00:19:20,600 Speaker 1: was a huge, huge movie for like dudes our age. No, 322 00:19:20,840 --> 00:19:24,280 Speaker 1: I'm saying bad as in, like, good. Okay, yeah, gotcha. Um, 323 00:19:25,400 --> 00:19:28,240 Speaker 1: so Stuxnet and Olympic Games happened, and like you said, 324 00:19:28,280 --> 00:19:32,480 Speaker 1: it was the first offensive cyber attack. Most of the 325 00:19:32,520 --> 00:19:36,080 Speaker 1: other ones have come in the form of UM sneaking 326 00:19:36,160 --> 00:19:40,000 Speaker 1: in and lying around and watching and waiting and spying. Well, 327 00:19:40,080 --> 00:19:42,600 Speaker 1: Stuxnet had that too. Was that the initial...? There 328 00:19:42,720 --> 00:19:46,639 Speaker 1: was a companion program called Flame that somehow, this is 329 00:19:46,720 --> 00:19:49,960 Speaker 1: the part that's the biggest mystery, um, Iran's nuclear 330 00:19:50,000 --> 00:19:53,040 Speaker 1: program is not connected to the Internet, so somebody got 331 00:19:53,119 --> 00:19:58,440 Speaker 1: that in on a thumb drive, infected their local system.
Um, 332 00:19:58,720 --> 00:20:02,800 Speaker 1: and Flame sat there and basically just studied everything, told 333 00:20:03,000 --> 00:20:06,600 Speaker 1: the US how the configuration was set up, and then 334 00:20:06,640 --> 00:20:09,680 Speaker 1: they built it, and then they inserted Stuxnet, and 335 00:20:09,840 --> 00:20:14,320 Speaker 1: basically it made all of their data look like everything 336 00:20:14,440 --> 00:20:18,080 Speaker 1: was operating normally, but it was telling their centrifuges to 337 00:20:18,200 --> 00:20:20,920 Speaker 1: spin out of control and basically break themselves. It's like 338 00:20:21,000 --> 00:20:24,160 Speaker 1: Ocean's Eleven when they built the replica vault. Exactly, showed 339 00:20:24,200 --> 00:20:27,480 Speaker 1: the replica video. There's nothing going on. So basically, the 340 00:20:27,920 --> 00:20:31,280 Speaker 1: Pentagon has been watching a lot of movies pretty much 341 00:20:44,119 --> 00:20:57,320 Speaker 1: as skuld. But this is a hugely successful attack, um, 342 00:20:57,800 --> 00:21:01,920 Speaker 1: if not at the very least for American cyber warfare, um, 343 00:21:02,359 --> 00:21:06,280 Speaker 1: because it supposedly set Iran's nuclear program back by at 344 00:21:06,400 --> 00:21:09,280 Speaker 1: least a year, if not more, and that this would 345 00:21:09,640 --> 00:21:11,840 Speaker 1: let us continue talks. Yeah, and I think it said 346 00:21:11,840 --> 00:21:14,199 Speaker 1: one of the aims was to make them feel stupid, 347 00:21:14,640 --> 00:21:18,000 Speaker 1: and then they said it worked, like they thought that they had 348 00:21:18,359 --> 00:21:22,040 Speaker 1: done something wrong and that's why these systems were failing. 349 00:21:22,560 --> 00:21:25,760 Speaker 1: It's pretty scary, man. But the point now is: Okay, 350 00:21:26,040 --> 00:21:28,400 Speaker 1: that's out there.
Stuxnet is out there for anybody 351 00:21:28,440 --> 00:21:31,639 Speaker 1: who can get their hands on it. And that's the 352 00:21:31,760 --> 00:21:35,000 Speaker 1: name of it. It's a great name, alright, Stuxnet, 353 00:21:35,080 --> 00:21:40,600 Speaker 1: with an X, with a new guetas center. But 354 00:21:40,720 --> 00:21:43,200 Speaker 1: it's out there, and the US is now basically just... 355 00:21:44,000 --> 00:21:49,600 Speaker 1: the computer equivalent of Hiroshima was just launched by 356 00:21:49,680 --> 00:21:53,760 Speaker 1: the United States. Yeah, and nice little setup there. 357 00:21:53,880 --> 00:21:57,040 Speaker 1: A lot of people are comparing these 358 00:21:57,119 --> 00:21:59,800 Speaker 1: early days of cyber warring to the early days 359 00:22:00,240 --> 00:22:04,479 Speaker 1: of nuclear bombs, in that there's not a ton of defense. 360 00:22:04,600 --> 00:22:07,159 Speaker 1: Not anyone really knows what they're doing. It's sort of 361 00:22:07,520 --> 00:22:10,200 Speaker 1: a chaotic mess that everyone's trying to get their finger 362 00:22:10,240 --> 00:22:14,080 Speaker 1: in the pie, though. Yeah, and the other countries, like China, 363 00:22:14,200 --> 00:22:18,919 Speaker 1: I believe Russia, who are apparently better equipped to defend against 364 00:22:19,200 --> 00:22:21,720 Speaker 1: a cyber attack than the US. So basically the US 365 00:22:21,840 --> 00:22:26,359 Speaker 1: is really playing with fire. Well, and that's why Clinton 366 00:22:26,400 --> 00:22:29,040 Speaker 1: and Bush were declining to use these. It's one of 367 00:22:29,080 --> 00:22:30,760 Speaker 1: the reasons, they're like, you know, this opens us up 368 00:22:30,800 --> 00:22:34,720 Speaker 1: to counterattacks and just may not be the smartest way 369 00:22:34,760 --> 00:22:37,480 Speaker 1: to... Like, we wouldn't go out and just drop a 370 00:22:37,560 --> 00:22:42,639 Speaker 1: nuclear bomb on a country. Right. Oh wait, we did.
Right? Oops. Twice? 371 00:22:43,359 --> 00:22:45,960 Speaker 1: What else you got? Let's see. Uh, we talked about 372 00:22:46,000 --> 00:22:51,040 Speaker 1: the supervisory control and, um, data acquisition systems. Yeah, that 373 00:22:51,240 --> 00:22:56,920 Speaker 1: was, um... Basically that is the Achilles heel of infrastructure 374 00:22:57,520 --> 00:22:59,520 Speaker 1: in the United States. One of the reasons why we're 375 00:23:00,200 --> 00:23:04,119 Speaker 1: not set up to defend against, um, a cyber attack 376 00:23:04,520 --> 00:23:08,840 Speaker 1: is because we are so connected to the internet. Yeah, 377 00:23:09,119 --> 00:23:13,919 Speaker 1: everything is. Yeah. Iran, North Korea? Yeah, not quite as 378 00:23:14,000 --> 00:23:16,159 Speaker 1: much, because a lot of their stuff is off the 379 00:23:16,240 --> 00:23:20,680 Speaker 1: grid just by default, because they don't have the infrastructure 380 00:23:20,760 --> 00:23:24,080 Speaker 1: that we have. So just the robustness of our own 381 00:23:24,160 --> 00:23:28,080 Speaker 1: infrastructure is one of the... one of its vulnerabilities as well. Yeah, 382 00:23:28,440 --> 00:23:30,720 Speaker 1: that's a good point. As far as defense goes, too, 383 00:23:30,800 --> 00:23:35,359 Speaker 1: I forgot about this stuff. Um, Strickland says that like 384 00:23:35,640 --> 00:23:40,240 Speaker 1: the first step is education, as far as educating consumers 385 00:23:40,440 --> 00:23:44,600 Speaker 1: over, you know, antivirus software and how they search the 386 00:23:44,640 --> 00:23:47,719 Speaker 1: Internet and stuff like that. So I give that a medium. 387 00:23:47,800 --> 00:23:51,760 Speaker 1: But, uh, this guy, Richard Clarke, he's a security expert. 388 00:23:52,320 --> 00:23:56,399 Speaker 1: He blames things on companies like Microsoft too.
He feels 389 00:23:56,480 --> 00:24:01,480 Speaker 1: like they rush through programs, um, before they're fully security 390 00:24:01,560 --> 00:24:05,560 Speaker 1: tested, because they want to make, you know, they want 391 00:24:05,600 --> 00:24:08,359 Speaker 1: a few coins to rub together by selling this stuff, and 392 00:24:08,560 --> 00:24:11,720 Speaker 1: the consumer doesn't want to wait, and the stockholders don't 393 00:24:11,720 --> 00:24:14,280 Speaker 1: want lots of testing because they want those new products 394 00:24:14,320 --> 00:24:17,040 Speaker 1: on the market. So it's a bit of a rough position. 395 00:24:17,400 --> 00:24:21,400 Speaker 1: And, um, you know, private companies run most of the net, 396 00:24:21,440 --> 00:24:23,960 Speaker 1: you know, it's not like this big government thing. So 397 00:24:25,080 --> 00:24:29,080 Speaker 1: he contends, Clarke does, that it's up to these private 398 00:24:29,160 --> 00:24:32,840 Speaker 1: companies who own the Internet's infrastructure to really make it 399 00:24:33,080 --> 00:24:35,800 Speaker 1: more robust in a defensive sense. Right, which is good 400 00:24:35,840 --> 00:24:38,359 Speaker 1: in one sense, because then you have a dollar amount, 401 00:24:38,480 --> 00:24:42,800 Speaker 1: in the form of lost profits, attached to, um, a 402 00:24:43,000 --> 00:24:45,639 Speaker 1: security breach, right, so a company is going to try to 403 00:24:45,680 --> 00:24:49,879 Speaker 1: protect it, um, which is good. Yeah. But at the 404 00:24:49,920 --> 00:24:53,000 Speaker 1: same time it's like, yeah, if you're putting out products, though, 405 00:24:53,440 --> 00:24:57,960 Speaker 1: and you have competition and your competitors' products are safer, um, 406 00:24:58,119 --> 00:25:01,040 Speaker 1: and you're just rushing stuff to market, then you're gonna 407 00:25:01,160 --> 00:25:05,760 Speaker 1: lose out ultimately. Pretty much the same economic forces.
And 408 00:25:06,240 --> 00:25:09,280 Speaker 1: Jonathan also points out, too, that, you know, a scary 409 00:25:09,320 --> 00:25:11,879 Speaker 1: way this can be implemented is as a one-two 410 00:25:11,960 --> 00:25:15,520 Speaker 1: punch with a physical attack. Yeah. So, I mean, this 411 00:25:15,600 --> 00:25:16,880 Speaker 1: is the one that wakes me up in the middle 412 00:25:16,880 --> 00:25:19,639 Speaker 1: of the night: a cyber attack is launched and 413 00:25:20,480 --> 00:25:24,680 Speaker 1: the electric power grid is shut down, and gas lines 414 00:25:24,720 --> 00:25:27,520 Speaker 1: and water lines start going haywire, and then all of a 415 00:25:27,560 --> 00:25:31,720 Speaker 1: sudden in comes the Red Dawn team parachuting in. Well, that's 416 00:25:31,760 --> 00:25:33,680 Speaker 1: what we did to Iraq in two thousand three. 417 00:25:34,200 --> 00:25:37,199 Speaker 1: We sent a cyber attack that messed with their, um, 418 00:25:37,520 --> 00:25:40,640 Speaker 1: I guess their air defense systems, and then we invaded. 419 00:25:42,160 --> 00:25:45,440 Speaker 1: So that's happened before. We've done it. Doesn't surprise me. Yeah, 420 00:25:46,400 --> 00:25:48,480 Speaker 1: cyber war, we're in the midst of it. We're in 421 00:25:48,520 --> 00:25:51,879 Speaker 1: the midst of it. Pretty crazy stuff. Get your, what, 422 00:25:52,080 --> 00:25:58,080 Speaker 1: Norton Antivirus? That'll just solve everything. Yeah. Education, education, 423 00:25:58,640 --> 00:26:00,560 Speaker 1: that's all... that's the only thing, that's all we can 424 00:26:00,600 --> 00:26:03,600 Speaker 1: do to prevent cyber war. Um, if you want to 425 00:26:03,680 --> 00:26:06,120 Speaker 1: learn more about cyber war and read this article by 426 00:26:06,600 --> 00:26:10,320 Speaker 1: Jonathan Strickland.
You can type cyber war, one word, in 427 00:26:10,560 --> 00:26:12,800 Speaker 1: the search bar at how stuff works dot com and it will 428 00:26:12,800 --> 00:26:16,320 Speaker 1: bring it up. I said Jonathan Strickland, which means it's 429 00:26:16,359 --> 00:26:20,080 Speaker 1: time for Listener Mail. Aw, it's time for a 430 00:26:20,160 --> 00:26:24,200 Speaker 1: lot more than that. Uh, I'm gonna call this Beer 431 00:26:25,560 --> 00:26:29,320 Speaker 1: and Fire. Hi, guys, I'm a professor of history and 432 00:26:29,359 --> 00:26:31,480 Speaker 1: a longtime addict of your show. I use the 433 00:26:31,520 --> 00:26:33,480 Speaker 1: podcast in my college classes to talk about how we 434 00:26:33,560 --> 00:26:37,040 Speaker 1: use history in entertainment. I'm writing about the Great Chicago 435 00:26:37,119 --> 00:26:40,320 Speaker 1: Fire podcast, especially as it relates to my research. See, 436 00:26:40,480 --> 00:26:42,680 Speaker 1: I study the history of alcohol, and I teach a 437 00:26:42,720 --> 00:26:46,159 Speaker 1: class on the history of beer. Uh, pretty cool. We 438 00:26:46,440 --> 00:26:48,840 Speaker 1: study the economic, social, and cultural history of beer, and 439 00:26:48,920 --> 00:26:55,719 Speaker 1: we make beer in class and do weekly beer tastings. What? Anyway, 440 00:26:55,760 --> 00:26:57,879 Speaker 1: aside from the stuff you mentioned in the show, the Chicago 441 00:26:57,960 --> 00:27:00,000 Speaker 1: fire is important because it wiped out about three quarters of 442 00:27:00,560 --> 00:27:04,600 Speaker 1: Chicago's breweries. Something like eighteen breweries were destroyed by the fire. 443 00:27:04,720 --> 00:27:09,639 Speaker 1: Of course, people still wanted beer. Uh, Chicago and the 444 00:27:09,680 --> 00:27:12,359 Speaker 1: Upper Midwest was populated by a lot of Germans 445 00:27:12,520 --> 00:27:15,200 Speaker 1: at the time.
This gave birth to the beer industry 446 00:27:15,240 --> 00:27:18,480 Speaker 1: in Milwaukee. Before the Great Fire, Milwaukee was a beer town, 447 00:27:18,560 --> 00:27:21,800 Speaker 1: but not a major supply center. Schlitz especially is a 448 00:27:21,840 --> 00:27:24,960 Speaker 1: good example of how the Milwaukee beer industry reacted to 449 00:27:25,000 --> 00:27:29,840 Speaker 1: the fire. Joseph Schlitz, the founder, first donated thousands 450 00:27:29,880 --> 00:27:32,560 Speaker 1: of barrels of beer to Chicagoans in the weeks after 451 00:27:32,640 --> 00:27:36,280 Speaker 1: the fire. Sensing an opportunity, he then opened a 452 00:27:36,359 --> 00:27:39,680 Speaker 1: distribution point in the city. After all, there were still 453 00:27:39,760 --> 00:27:44,320 Speaker 1: hundreds of thousands of thirsty Chicagoans. He opened Schlitz-tied saloons. 454 00:27:44,720 --> 00:27:47,119 Speaker 1: By the eighteen eighties, he was selling about fifty thousand 455 00:27:47,200 --> 00:27:50,640 Speaker 1: barrels of beer in Chicago alone, which is about seventeen 456 00:27:50,680 --> 00:27:56,280 Speaker 1: percent of their total. And the slogan, the slogan for Schlitz, 457 00:27:57,000 --> 00:28:00,359 Speaker 1: "the beer that made Milwaukee famous," came out of this period, 458 00:28:00,680 --> 00:28:03,480 Speaker 1: and it's because of the beer sold after the fire, 459 00:28:04,119 --> 00:28:06,760 Speaker 1: so that's where they got the slogan. By nineteen oh two, 460 00:28:06,840 --> 00:28:09,280 Speaker 1: Schlitz was the largest brewer in the world, a title 461 00:28:09,359 --> 00:28:12,000 Speaker 1: it would trade back and forth with Budweiser until the 462 00:28:12,080 --> 00:28:14,680 Speaker 1: nineteen fifties. And he goes on to point out that 463 00:28:14,880 --> 00:28:21,160 Speaker 1: Blatz and Pabst followed similar trajectories. Stuxnet, Stuxnet. 464 00:28:21,840 --> 00:28:25,040 Speaker 1: And, uh,
the Chicago brewing industry sadly never recovered from 465 00:28:25,080 --> 00:28:29,080 Speaker 1: the fire, although beer drinking remained steady. And I don't 466 00:28:29,119 --> 00:28:35,680 Speaker 1: have Professor Beer's name, so we'll just call him Professor Beer. Oh, 467 00:28:35,880 --> 00:28:38,960 Speaker 1: I'm sure he'd appreciate that. Yeah, I'm sure that's what 468 00:28:39,040 --> 00:28:42,960 Speaker 1: the students call him. Thanks, Professor Beer. Yeah. And if 469 00:28:42,960 --> 00:28:44,280 Speaker 1: you want to write in, I'll say your name on 470 00:28:44,400 --> 00:28:48,600 Speaker 1: a later show. Okay. Um. And if you teach, especially 471 00:28:48,680 --> 00:28:51,680 Speaker 1: something interesting, or you use Stuff You Should Know to help 472 00:28:51,720 --> 00:28:54,080 Speaker 1: you teach, we're always interested in hearing that. We 473 00:28:54,200 --> 00:28:57,080 Speaker 1: want to know about it. Okay. You can tweet it 474 00:28:57,160 --> 00:28:59,760 Speaker 1: to us at S Y S K Podcast, put it 475 00:28:59,840 --> 00:29:02,600 Speaker 1: on Facebook dot com slash Stuff You Should Know, or 476 00:29:03,040 --> 00:29:05,320 Speaker 1: you can send us an email to Stuff Podcast at 477 00:29:05,320 --> 00:29:07,720 Speaker 1: how Stuff Works dot com, and as always, join us 478 00:29:07,760 --> 00:29:09,840 Speaker 1: at our home on the web, Stuff You Should Know 479 00:29:10,000 --> 00:29:14,920 Speaker 1: dot com. Stuff You Should Know is a production of 480 00:29:14,960 --> 00:29:17,720 Speaker 1: iHeart Radio's How Stuff Works. For more podcasts from 481 00:29:17,760 --> 00:29:20,560 Speaker 1: iHeart Radio, visit the iHeart Radio app, Apple Podcasts, or 482 00:29:20,600 --> 00:29:25,400 Speaker 1: wherever you listen to your favorite shows.