Speaker 1: Welcome to Stuff You Should Know, a production of iHeartRadio. Hey, and welcome to the podcast. I'm Josh, and there's Chuck, and there's Jerry over there on the download, and this is Stuff You Should Know, the podcast, the Top Secret edition. Yes, exactly. What you should know is decided by us. That's right, we give clearance. Clarence. Well, that's a good one. I wouldn't have put that one together, but that's great. Thanks. So Chuck, we're talking about classified information, top secret information. A lot of people call it that, especially if you're twelve years old or younger, but it's actually a real thing. That's like a real designation, top secret. Which, by the way, I went back and watched that movie again for the first time in years, and it is hilarious. Yeah. I just made an Airplane reference. Is Top Secret a Zuckers movie? Right, Zuckers. Yeah, I think they worked with somebody different that time. I'm not sure in what capacity, but the Zuckers were involved for sure. Val Kilmer. That documentary on him is fantastic, by the way, I've heard. I have yet to watch it.
We just got through Maid. Have you seen Maid? Haven't seen Maid. Even after each episode, you're just like, I wonder what bad thing is going to happen to her next? I know something bad is going to. It really wears you down like that, for sure, but it's good. It's worth watching. Well, since we're on it with TV recs before we get going, I've been watching Station Eleven, and it's one of my favorite shows in years. Station Eleven. What's it on? HBO. It's from a novel, a post-apocalyptic virus, actually pandemic, weirdly, but it's more on the order of, like, the earth died, but it's not The Road. It's more interesting, it's more hopeful and artistic. Cool, I'll check it out. Did you ever see The Leftovers? I think that's what it was called. Well, what's funny is we were watching that too. We just started that because we were in that groove, and we never saw The Leftovers, and I heard it was really good and even got better through the years. I envy you. I wish I could see The Leftovers again for the first time. I'm really excited, dude.
It's very good. All right, now we should get to it, because this is dense. So we're talking about... Yes, it is super dense. We asked for some help from Ed Grabianowski, the Grabster, and he turned it into a James Michener novel. It is so dense. Every paragraph is really dense. And it's inevitable. Just researching this one, there's so much to go and read and so many rabbit holes to go down, because what we're talking about is really a simple concept on its face. Like, there are secrets that any government would not want other governments, or other people, or sometimes even their own people, to know. It would just make the functioning of that government that much harder. It might make the country vulnerable to attack. There's a lot of reasons why you would not want to share all of your business. So you want to classify information based on how damaging it could be to national security, and then you just decide who has clearance to see those various classifications of information. That's it. That's classified information.
But in practice, in reality, it's such a behemoth concept that is so fraught with problems, that is so often completely and utterly anti-democratic, and then it's just so bureaucratically layered, that it is an extremely dense topic. I just wanted to make sure we added a good extra three minutes onto this episode with that. Well, I think, you know, the good news is the history can be condensed really easily. They had state secrets back in ancient times too, but back then it was a lot easier to keep secrets, because there was no technology and there was generally no bureaucracy. So if the autocratic or semi-autocratic leader said it was a secret, then it was a secret, and that was kind of the end of it. Things changed because of technology, and because of bureaucracy, and because of the development of nuclear weapons, when there was truly something so kind of world-ending at stake that things needed to be ratcheted up.
And this all started with a series of acts and, even more so, a series of presidential executive orders over the years. The executive orders vaguely say who can know what and how we're gonna do it, and then the acts come into play to say how we're going to punish people who do it the wrong way or don't do it. Yeah, and like you said, the executive orders are where the classification got sorted out, how things are classified, how they become declassified. But Congress has largely been left to figure out how to punish people who transgress by sharing that information. And that whole thing started with the Defense Secrets Act of nineteen eleven, appropriately, which evolved into the Espionage Act. But you used a word that I think is really important, and really describes one of the big problems with the classification system in the United States, which is, it's vague.
The descriptions, the explanations, the rules are very vague and nebulous, and so there's a lot of room for objectivity, and there's a lot of room to just say, I don't really know what this is, so I'm just gonna put it on the highest possible level of secrecy that I can. Yeah, this is interesting, because I think the vagueness is definitely a problem that leads to part of the bigger problem with classification, over-classification, which we're gonna talk a lot about. But I also think that in a lot of cases like this, vagueness like this is one area where human, uh, what was the word you used? Objectivity? Where human subjectivity kind of has to come into play. And this is jumping ahead a little bit, but as far as, you know, when to classify something, the way they decide that is, how bad will it hurt America? Is it grave danger? Um, what was the second one?
Sort of like medium grave, and then just danger. And people have complained that, like, well, that's not really spelled out. But I don't know if you can codify to a T how dangerous something is. I don't know. I believe that human subjectivity has to come into play. I'm not defending over-classification, but I don't know that this is one of those things that can be bureaucratized and legislated to the nth degree, you know? Well, that's ironic, because the whole reason this exists like this is because of bureaucracy, you know. Yes, but you make a good point, and I think that is ultimately the reason, the driving force, behind keeping things vague, because you couldn't possibly describe every situation and say, well, if this happens, then it would cause grave danger and you want to make it top secret, et cetera.
Right. But like you said, the Espionage Act in nineteen seventeen was brought on by the advent of World War One, and it, like the nineteen eleven version it started from, already had the vague terms that said, you know, you can't provide this information related to national defense to people who shouldn't have it. It started out vague, but this is where the teeth came in as far as punishment goes, because it was during wartime, so it was super harsh. Yeah. I remember we've talked before about Eugene Debs, who was the Socialist candidate for president in, I can't remember what election, but around World War One, and he ran his campaign from prison, because he was put in prison for basically denouncing the US war effort during World War One. And you could not speak out against the United States military, the United States itself, the United States Constitution. You couldn't say anything critical about it, or you could be put in prison. And people were put in prison because of it. And they walked that stuff back, the First Amendment violations. But the point is, a big part of it was...
It grew out of this idea that if you share state secrets, you can be seriously, harshly punished, like Julius and Ethel Rosenberg were. They were actually put to death under the Espionage Act. And it's still around today. Chelsea Manning, Reality Winner, Edward Snowden, all of them were prosecuted under the Espionage Act of nineteen seventeen, which basically says, if you share any classified information, you can be severely punished and lose years of your life in prison, or again, ultimately be put to death. I'm not sure if that's still on the books, but it very well may be. So we should also talk about this other sort of precursor to the system we have today, which was in eighteen sixty-nine, and this addresses the technology kind of for the first time. The War Department brought out General Order Number Thirty-Five. So cute back then, Number Thirty-Five.
What they basically said was, hey, there are cameras now that exist, so you can't be in a fort and take pictures of the fort, or above the fort, or around the fort. Like, no pictures, basically, anymore. And so they're acknowledging technology for the first time. And the other big thing was, this is a peacetime move, so it's the first time, like, we weren't at war and were sort of putting these rules into place. Yeah, because there are rules about that kind of thing, like even dating back to the Revolutionary Army and the Continental Congress. But that was a peacetime one, and that was significant, because the classification system is a peacetime, and now largely civilian, system, but it found its roots in the military, quite rightfully. I mean, even still today, even people who are highly critical of the American classified information system say it should basically be kept to military maneuvers, maybe State Department negotiations, and even then it should be for a very short time.
But the general consensus is, yes, you would not want anyone understanding or knowing what your military was doing, basically. Right. The first time that somebody started to say the word over-classification, and the fact that they needed a sort of tiered system, was Brigadier General Arthur Murray, who was Chief of Artillery at the time. He wrote a letter to the Secretary of War that basically said, listen, all we've got is this one classification. It's confidential, and they're stamping everything confidential even if it shouldn't be confidential, and, you know, if everything is being stamped this way, then it loses its meaning. And they didn't call it over-classification at the time. I think he was just sort of the first person to mention this. But he said, what we really need is a tiered system of how classified it should be, and they're already doing this in Britain. They have their four-tiered system: For Your Eyes Only... Moonraker, Live and Let Die, and Gold... remember? No, Goldmember. Goldfinger. Goldmember was a good one.
Now, For Your Eyes Only is the actual classification. Then you have For the Information of Commissioned Officers Only, For Official Use Only, and Not for Publication. And the Chiefs of Staff said, you know what, I think it's a great idea. This four-tiered system is probably something we should use, but not the way you're doing it. We need to be able to kind of make this up as we go. It seems way too tea-and-crumpets to us, so we need to come up with something else. Well, I think it was too rigid. It sounds like they wanted to be a little nimble with how they classified things as they went. Yeah, because the system they initially came up with was to just basically put a preface on, say, a manual or a document that says, this is only for, you know, this division, or this armed service, or this type of officer, this whatever, and then it should not be given to anybody else beyond them. And you could conceivably just do that for every single document.
But the problem is that this was stuff they were talking about in nineteen oh seven. America was still fairly small, federal-government-wise. I mean, it was growing big time after the Civil War, but compared to today, the bureaucracy was just nothing like it is. And so you could conceivably do that in the armed forces. But as things grew, and the bureaucracy grew, and the government grew, it became impossible to come up with this kind of thing for every single document, and much easier to just say, this is this category, this is that category, and this is that category. Right. And we should mention, as a little sidebar, there was even one debate, for a short time when they were coming up with the different classification levels, where they wondered if they should even make up words that weren't words, I guess, to make it even more confidential, or confusing.
But then they said, oh, wait a minute, this is more confusing to us, even, and we don't want someone, you know, high on the totem pole, which really means low, not understanding what this word even means, and then making a mistake because of that. Yeah, so, like, let's use real words. Private Pyle's like, red turtle doesn't sound very important, I'm sure I can tell everybody about that. That's right. So when you say Private Pyle, do you mean Gomer Pyle or Full Metal Jacket? Take your pick. Okay, so I'm doing both. Should we take a break? Yeah, all right, we'll take a break and we'll talk about the atomic age and how that played in, right after this. Okay, Chuck. So it was all just kind of willy-nilly, and the armed forces of America were figuring out how to set up their own classification system, and it was actually supported and codified starting with one of those executive orders that came from President Franklin Roosevelt, EO 8381, and it basically said, everything the military is doing is right.
They've got some great terms over there: we've got secret, we've got restricted, we've got confidential. Let's keep going with that. And it was mainly coming out of the military, and the military was figuring it out. But then, like you said, the atomic age came along and the stakes became exponentially higher, because we had different countries conceivably working on an atomic weapon, the likes of which had never been seen, the destructiveness of which had never been seen, ever, in the history of humanity. And, like, things just got real, and the need for secrecy got real. So much so that even before the Manhattan Project started, the scientists that were working on atomic research were keeping their research secret themselves, by their own consensus. They were like, we have to be careful publishing papers about this. Yeah, I think that's interesting, that they just sort of knew to shut up about it, even, I would imagine, within their own families.
That was probably back in the day where it was like, daddy doesn't talk about what he does at work kind of stuff. And finally, when the Manhattan Project officially got going, there was one lone man, General Leslie Groves, who kind of controlled the set of rules for secrecy, and things got a little more codified at that point, through him basically saying, you know, I'm gonna use whatever means I have under the law, and even outside the law, to make sure this stuff stays a secret. But it wasn't until these executive orders started rolling in... they had categories like restricted data and formerly restricted data, which actually was the progenitor of what would become declassification. But it was still a little willy-nilly, in that it took these executive orders, and eventually Congress, to really make this stuff law. And at the time, even within the Manhattan Project, it was still like this one guy saying, we gotta keep this stuff secret, or I'm gonna do something bad to you.
Well, he was like the ultimate project manager. He was the sole person on the planet who understood everything that was going on with the Manhattan Project. Even the top scientists didn't fully have the entire picture. Only Leslie Groves did. He wouldn't let some of the top physicists working on the bomb talk to each other about their work. There was a tremendous amount of compartmentalization. Mail was read and censored. There's just tons of stuff where this guy was basically saying, we have to take these extreme measures to keep this under wraps, and once we enter the atomic age, you can't go back. So his ideas about the secrecy needed, and the measures you needed to take to protect atomic secrets, eventually became the inspiration for how America's classified information system was created. He created it inadvertently.
He just basically came up with such a great, tight set of rules that when the Atomic Energy Act of nineteen forty-six was created, and they basically took atomic research and moved it from the military to government-funded civilian research, they said, basically, everything Groves was doing, we're just going to keep doing that and codify it. Yeah, and you know how I mentioned that he kind of inadvertently created declassification with the formerly restricted data? He also inadvertently created the derivative classification designation, which is basically, if part of this project we're working on is classified, then it's all classified. And, you know, we'll get into the different layers of classification later, but it was like, this whole thing is a big secret. It's not just, you can't talk about this one thing, and everything else is just, you know, whatever, talk about it at the country club, it's no big deal. And so after that, you've got Congress setting the punishment for sharing state secrets.
And then you've got Leslie Groves coming 324 00:18:57,880 --> 00:18:59,720 Speaker 1: up with kind of the framework for how to 325 00:19:00,240 --> 00:19:04,359 Speaker 1: carry out a classified information system and protect information. 326 00:19:04,640 --> 00:19:07,199 Speaker 1: And then you've got the presidents, a succession of presidents 327 00:19:07,240 --> 00:19:09,800 Speaker 1: starting with Roosevelt and then picked up by Truman, 328 00:19:09,880 --> 00:19:13,879 Speaker 1: issuing executive orders that really kind of spell 329 00:19:13,920 --> 00:19:17,480 Speaker 1: out how these classification systems are meant to work. And 330 00:19:17,520 --> 00:19:20,480 Speaker 1: Truman added the top secret category in 331 00:19:20,560 --> 00:19:23,720 Speaker 1: nineteen fifty one. And then the big one, um, was 332 00:19:23,800 --> 00:19:28,399 Speaker 1: Executive Order one oh two ninety, um, which was also 333 00:19:28,520 --> 00:19:31,560 Speaker 1: issued in nineteen fifty one, that said, hey, it's 334 00:19:31,600 --> 00:19:33,680 Speaker 1: not just the military who has secrets that they need 335 00:19:33,720 --> 00:19:36,639 Speaker 1: to keep. Basically, all of the executive branch needs to 336 00:19:36,680 --> 00:19:42,080 Speaker 1: be able to create secret and top secret information classifications. 337 00:19:42,160 --> 00:19:47,320 Speaker 1: And he extended that power to classify information to, um, 338 00:19:47,320 --> 00:19:50,800 Speaker 1: basically every agency there was, including like the Department of 339 00:19:50,840 --> 00:19:54,639 Speaker 1: Agriculture and the Department of Education, um, to 340 00:19:54,760 --> 00:19:57,160 Speaker 1: say, nope, this is classified, this can't 341 00:19:57,160 --> 00:19:59,560 Speaker 1: be released. Yeah.
They had to walk that back a 342 00:19:59,560 --> 00:20:02,000 Speaker 1: couple of years later a little bit because people were 343 00:20:02,480 --> 00:20:04,640 Speaker 1: worried that it was too broad and that too many 344 00:20:04,680 --> 00:20:08,480 Speaker 1: people were classifying things and had access to it. So 345 00:20:08,960 --> 00:20:12,520 Speaker 1: they reduced the number of agencies, uh, in nineteen fifty 346 00:20:12,600 --> 00:20:16,040 Speaker 1: three with another executive order, and then what we ended 347 00:20:16,119 --> 00:20:21,520 Speaker 1: up with was top secret, secret, confidential, uh, and restricted. 348 00:20:22,760 --> 00:20:25,359 Speaker 1: And should we talk a little bit about 349 00:20:25,440 --> 00:20:29,720 Speaker 1: what you sent over, like how it literally works in practice? Yeah, 350 00:20:29,800 --> 00:20:32,480 Speaker 1: I think that's a good spot for it. All right. 351 00:20:32,680 --> 00:20:35,679 Speaker 1: So if you get a document, and you, you know, 352 00:20:35,760 --> 00:20:38,199 Speaker 1: let's say one day you get one through a FOIA 353 00:20:38,320 --> 00:20:42,000 Speaker 1: request, which we'll talk about later, Freedom of Information Act. 354 00:20:42,119 --> 00:20:44,119 Speaker 1: If you get your hands on a document, it might, 355 00:20:44,880 --> 00:20:48,040 Speaker 1: you know, uh, have the title right there 356 00:20:48,720 --> 00:20:53,320 Speaker 1: with the letter U, saying that it's unclassified at this point. 357 00:20:53,480 --> 00:20:57,480 Speaker 1: But, yeah, within the document, you're gonna have 358 00:20:58,040 --> 00:21:00,879 Speaker 1: different paragraphs that are gonna have designations. Like one might 359 00:21:00,920 --> 00:21:04,200 Speaker 1: say secret, one might say confidential, one might say top secret, 360 00:21:04,800 --> 00:21:09,399 Speaker 1: and this whole thing rolls up to its initial top classification.
361 00:21:10,040 --> 00:21:12,400 Speaker 1: So if part of it is top secret, then all 362 00:21:12,440 --> 00:21:15,280 Speaker 1: of it is top secret. But I guess the deal 363 00:21:15,440 --> 00:21:19,760 Speaker 1: is, is for future declassification, they want those sub designations 364 00:21:19,840 --> 00:21:22,880 Speaker 1: in there. Is that right? Yeah, to make it easier 365 00:21:22,920 --> 00:21:26,240 Speaker 1: for the person, you know, um, sending the document out 366 00:21:26,280 --> 00:21:30,280 Speaker 1: for a FOIA request to black out, you know, any paragraphs 367 00:21:30,280 --> 00:21:34,239 Speaker 1: that are secret or confidential. Because part of me is like, 368 00:21:34,280 --> 00:21:36,280 Speaker 1: if it all rolls up to top secret, if one 369 00:21:36,280 --> 00:21:38,920 Speaker 1: part is top secret, then what's the point of even 370 00:21:39,160 --> 00:21:42,040 Speaker 1: subcategorizing it? But that doesn't make sense. So in this case, 371 00:21:42,119 --> 00:21:44,359 Speaker 1: so you said that they might have the letter U 372 00:21:44,560 --> 00:21:47,840 Speaker 1: next to the title. In that case, they're saying, you 373 00:21:47,880 --> 00:21:51,480 Speaker 1: can acknowledge the existence of this document and the title 374 00:21:51,480 --> 00:21:55,600 Speaker 1: of this document, but the document itself is considered top secret, 375 00:21:55,800 --> 00:21:57,720 Speaker 1: so no one would be able to see 376 00:21:57,720 --> 00:22:01,159 Speaker 1: it without a top secret clearance. Right. And then depending 377 00:22:01,160 --> 00:22:03,879 Speaker 1: on what agency you're with, they also have different terms. 378 00:22:04,800 --> 00:22:08,560 Speaker 1: I think the State Department says sensitive but unclassified. Uh, 379 00:22:08,640 --> 00:22:11,159 Speaker 1: the DOD and Homeland Security might use for 380 00:22:11,200 --> 00:22:14,560 Speaker 1: official use only. Uh.
And I guess the Parks Department 381 00:22:14,600 --> 00:22:16,520 Speaker 1: says it would be a whole lot cooler if you 382 00:22:16,520 --> 00:22:21,600 Speaker 1: didn't mention this. Is that from Parks and Rec or from 383 00:22:21,600 --> 00:22:23,600 Speaker 1: my brain just now? Oh, I liked it a lot, Chuck, 384 00:22:23,720 --> 00:22:27,440 Speaker 1: my friend. So, um, those ones you 385 00:22:27,560 --> 00:22:30,680 Speaker 1: just said, sensitive but unclassified, these basically made up ones. 386 00:22:30,760 --> 00:22:32,960 Speaker 1: Those are to keep people like you and me from 387 00:22:32,960 --> 00:22:36,119 Speaker 1: being able to see this stuff. Which is a big problem, 388 00:22:36,160 --> 00:22:39,480 Speaker 1: as we'll see with classified information and over classification, 389 00:22:39,560 --> 00:22:41,679 Speaker 1: is that the public is basically looped out by this 390 00:22:41,720 --> 00:22:43,440 Speaker 1: whole thing, but they have a stamp. This is no 391 00:22:43,560 --> 00:22:51,560 Speaker 1: dopes right, no grass, no forget so. Um. So, one 392 00:22:51,600 --> 00:22:53,600 Speaker 1: of the other things that we should say is that, 393 00:22:53,720 --> 00:22:57,159 Speaker 1: you talked about derivative classification, which we'll talk 394 00:22:57,320 --> 00:23:00,159 Speaker 1: more about in a second. There's also original classification, and 395 00:23:00,640 --> 00:23:05,120 Speaker 1: an original classification says, this new thing that we're talking about. 396 00:23:05,200 --> 00:23:08,840 Speaker 1: Let's say somebody comes up with a new gun, right, 397 00:23:09,760 --> 00:23:13,320 Speaker 1: somebody along the way will say, we don't want anybody 398 00:23:13,359 --> 00:23:15,080 Speaker 1: to know about this gun. We certainly don't want anybody to 399 00:23:15,080 --> 00:23:17,480 Speaker 1: know how it works. So I'm going to deem this 400 00:23:17,520 --> 00:23:21,560 Speaker 1: gun top secret.
There's only a handful of people in 401 00:23:21,600 --> 00:23:23,520 Speaker 1: the country who can do that, and it sounds like 402 00:23:23,560 --> 00:23:25,160 Speaker 1: a lot, but when you really step back and think 403 00:23:25,160 --> 00:23:29,680 Speaker 1: about it, um, it's not that many. Federal officials, typically 404 00:23:29,800 --> 00:23:35,320 Speaker 1: working in the executive branch, can declare something 405 00:23:35,400 --> 00:23:40,520 Speaker 1: originally classified. That's just the original classification. Yeah, and just 406 00:23:40,680 --> 00:23:45,280 Speaker 1: quickly by the numbers, that's a small group of people who can classify. But, uh, 407 00:23:45,320 --> 00:23:48,800 Speaker 1: and these are rough numbers, more than four million people 408 00:23:48,960 --> 00:23:52,400 Speaker 1: have security clearances, including one point three million with top 409 00:23:52,440 --> 00:23:57,640 Speaker 1: secret clearance. So those people, those other four million 410 00:23:57,720 --> 00:24:02,000 Speaker 1: people who can see classified information, they're the ones 411 00:24:02,119 --> 00:24:06,679 Speaker 1: usually who make derivative classifications. So when 412 00:24:06,720 --> 00:24:10,160 Speaker 1: they come along and they're tasked with creating 413 00:24:10,480 --> 00:24:13,680 Speaker 1: a handbook or an instruction manual on using that new 414 00:24:13,760 --> 00:24:17,520 Speaker 1: gun that just got a top secret classification, when they're 415 00:24:17,560 --> 00:24:21,600 Speaker 1: making that manual, because they're talking about a top secret gun, 416 00:24:22,840 --> 00:24:27,400 Speaker 1: that manual is top secret. Yes, that's a derivative classification.
Now, 417 00:24:27,480 --> 00:24:32,240 Speaker 1: let's say you're emailing your coworker about the progress on 418 00:24:32,320 --> 00:24:36,240 Speaker 1: the manual, that email is top secret because you're talking 419 00:24:36,240 --> 00:24:39,439 Speaker 1: about a manual that's top secret because it's talking about 420 00:24:39,600 --> 00:24:41,959 Speaker 1: this gun that was originally deemed 421 00:24:42,080 --> 00:24:45,240 Speaker 1: top secret. Now, if you're talking about millions of people 422 00:24:45,640 --> 00:24:49,200 Speaker 1: with access to secret and top secret information and they're 423 00:24:49,200 --> 00:24:51,320 Speaker 1: all talking to one another trying to make all this 424 00:24:51,359 --> 00:24:55,919 Speaker 1: stuff work, you can see how quickly that derivatively classified 425 00:24:56,200 --> 00:25:00,439 Speaker 1: information can explode exponentially. And in fact, it 426 00:25:00,480 --> 00:25:04,120 Speaker 1: actually has over the years. Yeah, I mean, I guess 427 00:25:04,160 --> 00:25:05,639 Speaker 1: we could go through them. Are you talking about the 428 00:25:05,680 --> 00:25:12,920 Speaker 1: overclassification numbers? Yeah, yeah, so, uh, the Information 429 00:25:12,920 --> 00:25:16,760 Speaker 1: Security Oversight Office reported, uh, six years ago that cabinet 430 00:25:16,840 --> 00:25:22,640 Speaker 1: level agencies alone, uh, this is not the military, right, 431 00:25:23,000 --> 00:25:26,359 Speaker 1: classified more than fifty five million documents.
And then the 432 00:25:26,359 --> 00:25:32,320 Speaker 1: Public Interest Declassification Board estimates that the intelligence community just 433 00:25:32,359 --> 00:25:36,760 Speaker 1: by themselves classifies multiple petabytes of data every year, 434 00:25:37,359 --> 00:25:40,439 Speaker 1: and a petabyte is about eighty six billion 435 00:25:40,480 --> 00:25:45,080 Speaker 1: pages of either text or, you know, it 436 00:25:45,160 --> 00:25:47,199 Speaker 1: points out a lot of that could be photographs and videos, 437 00:25:47,240 --> 00:25:50,239 Speaker 1: but the point is lots and lots and lots and 438 00:25:50,280 --> 00:25:53,080 Speaker 1: lots and lots of stuff. Yes, and that's just 439 00:25:53,119 --> 00:25:56,760 Speaker 1: classification. There's an entire other process for declassifying, and that 440 00:25:56,800 --> 00:26:01,960 Speaker 1: can be pretty nebulous too. Um, supposedly nowadays, 441 00:26:02,000 --> 00:26:05,120 Speaker 1: if somebody doesn't get ahold of this, and very 442 00:26:05,119 --> 00:26:08,600 Speaker 1: few people are authorized to declassify stuff. Apparently the president 443 00:26:08,840 --> 00:26:11,960 Speaker 1: can declassify anything at any time, but there's procedures for 444 00:26:12,000 --> 00:26:15,800 Speaker 1: other, like, agency heads to declassify stuff. Um, if they 445 00:26:15,840 --> 00:26:18,399 Speaker 1: don't get ahold of a document and decide to declassify 446 00:26:18,440 --> 00:26:21,760 Speaker 1: it, it can automatically be declassified after ten years if it's not that big of 447 00:26:21,760 --> 00:26:26,000 Speaker 1: a deal, or after twenty five years if it's very sensitive. 448 00:26:26,080 --> 00:26:29,200 Speaker 1: And again, none of this is well defined.
Um, they're not saying, like, 449 00:26:29,359 --> 00:26:33,880 Speaker 1: you know, really sensitive like the My Lai massacre documents, um, 450 00:26:34,359 --> 00:26:36,159 Speaker 1: that kind of stuff you want to hang onto for 451 00:26:36,200 --> 00:26:40,199 Speaker 1: twenty five years. So, um, there's like a system 452 00:26:40,400 --> 00:26:44,200 Speaker 1: involved or in place, but it's not a very good system. 453 00:26:44,480 --> 00:26:47,520 Speaker 1: And even still you're saying, like, what is the stuff 454 00:26:47,560 --> 00:26:50,680 Speaker 1: that can just be automatically declassified after ten years? Should 455 00:26:50,680 --> 00:26:52,800 Speaker 1: it even be classified in the first place, which is 456 00:26:52,840 --> 00:26:55,639 Speaker 1: something we'll talk about, too. Yeah, and the whole thing 457 00:26:55,640 --> 00:26:59,280 Speaker 1: with the twenty five years is, like, what they're basically saying 458 00:26:59,440 --> 00:27:02,479 Speaker 1: is we want to wait long enough to where a 459 00:27:02,520 --> 00:27:06,000 Speaker 1: lot of the people may be dead, 460 00:27:06,480 --> 00:27:10,480 Speaker 1: maybe the statute of limitations could be up for any crime, uh, 461 00:27:10,600 --> 00:27:13,160 Speaker 1: or maybe people hopefully have just gotten over it enough 462 00:27:13,200 --> 00:27:17,920 Speaker 1: to where they're not super mad about something. It's really interesting. Um, 463 00:27:17,960 --> 00:27:23,760 Speaker 1: all this comes out of, uh, shockingly, the last executive 464 00:27:23,840 --> 00:27:27,000 Speaker 1: order on this kind of thing was from President Obama, 465 00:27:27,720 --> 00:27:31,600 Speaker 1: which was in two thousand nine.
Executive Order thirteen 466 00:27:32,160 --> 00:27:37,600 Speaker 1: five twenty six replaced all previous orders as far as classification goes, 467 00:27:38,280 --> 00:27:42,800 Speaker 1: and, uh, shocked that it wasn't thrown out 468 00:27:42,840 --> 00:27:45,960 Speaker 1: and redone. It was a little shocking, isn't it? Yeah, 469 00:27:46,000 --> 00:27:48,480 Speaker 1: I guess no one told the last 470 00:27:48,480 --> 00:27:52,280 Speaker 1: president that Obama had the last word on that, because, um, yeah, 471 00:27:52,280 --> 00:27:55,800 Speaker 1: I'm not sure how that got through. But, um, that 472 00:27:55,960 --> 00:27:58,720 Speaker 1: is when it moved things into top secret, 473 00:27:58,760 --> 00:28:02,040 Speaker 1: secret, and confidential. And then that's when it talked about, 474 00:28:02,119 --> 00:28:04,440 Speaker 1: you know, whether it was grave damage, serious damage, or 475 00:28:04,520 --> 00:28:09,280 Speaker 1: damage, and, uh, stuff like that. So that's sort 476 00:28:09,280 --> 00:28:11,600 Speaker 1: of the last word on it. Um. We also should 477 00:28:11,640 --> 00:28:15,000 Speaker 1: mention there is a level above top secret called sensitive 478 00:28:15,040 --> 00:28:19,520 Speaker 1: compartmented information, and this is when, even if 479 00:28:19,560 --> 00:28:22,199 Speaker 1: you have top secret clearance, they're like, yeah, but you 480 00:28:22,240 --> 00:28:26,119 Speaker 1: can't even know this. Only these very few people 481 00:28:26,119 --> 00:28:28,159 Speaker 1: that deal with this thing specifically can even know this. 482 00:28:28,920 --> 00:28:32,400 Speaker 1: Like, that is an SCI designation.
Sure, and that makes 483 00:28:32,440 --> 00:28:35,320 Speaker 1: total sense. And it's a very General Leslie Groves idea, 484 00:28:35,760 --> 00:28:39,600 Speaker 1: compartmentalization of information, so that if you have 485 00:28:39,640 --> 00:28:42,520 Speaker 1: top secret clearance and you're in the Department of Energy, 486 00:28:43,040 --> 00:28:45,240 Speaker 1: they're like, uh, no, you can't have access to this 487 00:28:45,320 --> 00:28:48,120 Speaker 1: top secret weapons information, it has nothing to do with your job. 488 00:28:48,440 --> 00:28:51,560 Speaker 1: You can have all access to the top secret information 489 00:28:51,640 --> 00:28:56,320 Speaker 1: about the Department of Energy's new, like, cold nuclear fusion 490 00:28:56,800 --> 00:29:00,240 Speaker 1: reactor that we're secretly building, but no, you can't 491 00:29:00,240 --> 00:29:03,240 Speaker 1: see this new gun design. Which makes a lot 492 00:29:03,280 --> 00:29:05,080 Speaker 1: of sense, but it really just kind of goes to 493 00:29:05,080 --> 00:29:08,600 Speaker 1: show you how compartmentalized this classified information is, even in 494 00:29:08,640 --> 00:29:12,720 Speaker 1: the echelons of top secret clearance. Yeah, and within that 495 00:29:12,760 --> 00:29:15,320 Speaker 1: Obama order, there are eight types of information that can 496 00:29:15,360 --> 00:29:18,600 Speaker 1: be classified.
I don't know if we need to go 497 00:29:18,720 --> 00:29:21,400 Speaker 1: through all these, do we? No, not necessarily, but they generally 498 00:29:21,440 --> 00:29:24,360 Speaker 1: all make sense to me, though. Like, there's not anywhere 499 00:29:24,400 --> 00:29:26,880 Speaker 1: I'm like, oh, this one doesn't make sense. Military stuff, 500 00:29:26,920 --> 00:29:32,560 Speaker 1: weapons stuff, foreign government stuff, um, vulnerability of infrastructure, just 501 00:29:32,680 --> 00:29:35,520 Speaker 1: things you would not want an enemy to understand, 502 00:29:35,560 --> 00:29:39,120 Speaker 1: which makes sense. And yet this executive order, it 503 00:29:39,120 --> 00:29:41,960 Speaker 1: may have cleaned up the process some, but it hasn't 504 00:29:42,000 --> 00:29:47,280 Speaker 1: helped, it seems like. Should we take another break? All right, 505 00:29:47,280 --> 00:29:49,080 Speaker 1: I'm in agreement on that, by the way. I don't 506 00:29:49,080 --> 00:29:52,320 Speaker 1: want to keep people in suspense. And then we're 507 00:29:52,320 --> 00:29:54,600 Speaker 1: going to come back and talk about what some of 508 00:29:54,600 --> 00:29:57,280 Speaker 1: the problems are with over classification. And there's a lot 509 00:29:57,320 --> 00:30:21,880 Speaker 1: of them, all right. So over classification is a thing. 510 00:30:21,960 --> 00:30:24,760 Speaker 1: We talked about how many documents are classified each year, 511 00:30:26,240 --> 00:30:30,040 Speaker 1: how many petabytes and billions and billions of pages 512 00:30:30,880 --> 00:30:35,680 Speaker 1: of information and video and photos are classified.
And there 513 00:30:35,800 --> 00:30:39,440 Speaker 1: is a worry that's obviously been around for a long 514 00:30:39,520 --> 00:30:43,280 Speaker 1: time that people are overclassifying, and there's a lot of reasons 515 00:30:43,360 --> 00:30:45,760 Speaker 1: and costs for over classification, but one of the big 516 00:30:45,800 --> 00:30:48,360 Speaker 1: reasons is that I think a lot of people don't 517 00:30:48,360 --> 00:30:51,880 Speaker 1: want to be on the hook for it, so they'll 518 00:30:51,880 --> 00:30:54,880 Speaker 1: just default to classifying something. Uh. And one of the 519 00:30:54,880 --> 00:30:57,960 Speaker 1: things the Obama order did was you had to have 520 00:30:58,040 --> 00:31:00,880 Speaker 1: your name on it if you classified it, so at 521 00:31:00,960 --> 00:31:03,360 Speaker 1: least they knew who it was and they could go 522 00:31:03,400 --> 00:31:06,560 Speaker 1: back to them when it came to challenging that classification, right, 523 00:31:06,560 --> 00:31:10,120 Speaker 1: which makes sense. That's good. They added some kind of 524 00:31:10,120 --> 00:31:14,800 Speaker 1: accountability to it, right. But what are some of these costs? Well, 525 00:31:14,840 --> 00:31:16,840 Speaker 1: one of the big ones is corruption. Like, if you 526 00:31:16,920 --> 00:31:20,400 Speaker 1: keep secrets and you classify everything, like, it's really fun 527 00:31:20,440 --> 00:31:25,160 Speaker 1: to go look for examples of absurd over classification. Um, 528 00:31:25,200 --> 00:31:29,440 Speaker 1: there's apparently some facetious plot to overthrow Santa Claus, and 529 00:31:29,520 --> 00:31:34,400 Speaker 1: some report on it was classified. They routinely classify menus 530 00:31:34,480 --> 00:31:38,120 Speaker 1: at state dinner banquets, just stuff that does not need 531 00:31:38,200 --> 00:31:42,680 Speaker 1: to be classified.
It reveals just how, if everything is classified, 532 00:31:42,720 --> 00:31:45,360 Speaker 1: it makes it really easy to classify anything, which means that 533 00:31:45,440 --> 00:31:48,560 Speaker 1: you can cover up just about anything by classifying it. 534 00:31:48,600 --> 00:31:52,360 Speaker 1: And so it's a real breeding ground for corruption. Of course, 535 00:31:53,000 --> 00:31:57,560 Speaker 1: it's a real breeding ground for authoritarianism as well, because 536 00:31:57,600 --> 00:31:59,680 Speaker 1: if you're allowed to do what you want and then 537 00:31:59,720 --> 00:32:02,440 Speaker 1: cover it all up and make secrets, uh, and keep 538 00:32:02,480 --> 00:32:06,160 Speaker 1: those secrets, then hide what you're doing essentially, then 539 00:32:06,280 --> 00:32:09,640 Speaker 1: that's just a breeding ground, that's a petri dish for authoritarianism. Well, 540 00:32:09,680 --> 00:32:13,800 Speaker 1: I saw another really interesting, um, explanation about how it 541 00:32:13,880 --> 00:32:17,720 Speaker 1: can breed authoritarianism from the outside too, from a guy 542 00:32:17,800 --> 00:32:21,440 Speaker 1: named, um, Sean Holman. He was a professor at Mount 543 00:32:21,560 --> 00:32:25,080 Speaker 1: Royal University in Calgary, and he basically said, when you 544 00:32:25,160 --> 00:32:28,760 Speaker 1: have a government that overtly has a mountain of 545 00:32:28,840 --> 00:32:33,040 Speaker 1: secrets that the public knows about, it creates uncertainty, and 546 00:32:33,080 --> 00:32:36,440 Speaker 1: so people look for certainty, even if it's not truthful. 547 00:32:36,720 --> 00:32:39,400 Speaker 1: They want it, they seek it out.
548 00:32:39,680 --> 00:32:42,040 Speaker 1: So anybody can come along and make up whatever they 549 00:32:42,040 --> 00:32:45,520 Speaker 1: want about what the government's really doing, or filling in 550 00:32:45,560 --> 00:32:48,320 Speaker 1: the blanks in the public's knowledge, and 551 00:32:48,360 --> 00:32:50,320 Speaker 1: people will hunger for that and listen to what that 552 00:32:50,400 --> 00:32:55,200 Speaker 1: person is saying, and that can breed authoritarianism as well. Yeah, 553 00:32:55,280 --> 00:32:57,360 Speaker 1: and you know, another thing we failed to mention that 554 00:32:57,480 --> 00:33:01,280 Speaker 1: is a problem with overclassification is it went so far with 555 00:33:01,360 --> 00:33:05,280 Speaker 1: the Manhattan Project that Russia was tipped off, because all 556 00:33:05,280 --> 00:33:08,480 Speaker 1: of a sudden there was no science, no scientific papers 557 00:33:08,520 --> 00:33:11,360 Speaker 1: coming out. And Russia was like, that can also be 558 00:33:11,400 --> 00:33:13,880 Speaker 1: a bad thing, because they were like, all of a sudden, uh, 559 00:33:13,960 --> 00:33:16,080 Speaker 1: or the Soviet Union, I guess, was like, hm, this 560 00:33:16,160 --> 00:33:19,000 Speaker 1: is very interesting. It's actually been very quiet 561 00:33:19,040 --> 00:33:21,520 Speaker 1: over there lately. So my spidey sense is going off, 562 00:33:21,560 --> 00:33:23,680 Speaker 1: they haven't used the word atom in quite some time. 563 00:33:25,640 --> 00:33:28,520 Speaker 1: Another thing is that it hinders research, of course. Um, we 564 00:33:28,600 --> 00:33:32,600 Speaker 1: talked about the science.
Science should be shared, and I 565 00:33:32,680 --> 00:33:35,480 Speaker 1: know firsthand that the people on 566 00:33:35,480 --> 00:33:39,680 Speaker 1: the Manhattan Project were frustrated by Groves and the fact 567 00:33:39,720 --> 00:33:43,440 Speaker 1: that they couldn't even share stuff with other science departments. Yeah, 568 00:33:43,520 --> 00:33:46,320 Speaker 1: like he just banned them from speaking. Then he finally 569 00:33:46,360 --> 00:33:51,520 Speaker 1: relented and let Oppenheimer hold, like, um, weekly symposiums 570 00:33:51,520 --> 00:33:54,880 Speaker 1: trying to hammer out problems that were just intractable. Um. 571 00:33:54,920 --> 00:33:58,040 Speaker 1: But even still, they were closely watched, and it was 572 00:33:58,200 --> 00:34:00,600 Speaker 1: a difficult research environment from what I 573 00:34:00,600 --> 00:34:04,840 Speaker 1: can tell. Uh, sure. And it's also a 574 00:34:04,920 --> 00:34:08,799 Speaker 1: challenge when you're an agency like the 575 00:34:08,840 --> 00:34:11,799 Speaker 1: CIA and you're not sharing information, and things 576 00:34:11,840 --> 00:34:16,239 Speaker 1: like nine eleven happen possibly because agencies aren't sharing information, 577 00:34:17,120 --> 00:34:19,560 Speaker 1: or they're not sharing that there were no weapons 578 00:34:19,560 --> 00:34:22,600 Speaker 1: of mass destruction and we end up in a war. So 579 00:34:22,760 --> 00:34:27,400 Speaker 1: sharing of information between agencies is something that 580 00:34:27,680 --> 00:34:30,640 Speaker 1: needs to happen more, I think. Yes, that was also 581 00:34:30,680 --> 00:34:33,600 Speaker 1: a big problem on nine eleven. Um.
I remember seeing, 582 00:34:33,719 --> 00:34:37,000 Speaker 1: on a few documentaries that came out around the twentieth 583 00:34:37,080 --> 00:34:41,800 Speaker 1: anniversary this past September, where there was like real 584 00:34:41,960 --> 00:34:45,520 Speaker 1: knowledge about a couple of the hijackers being in the 585 00:34:45,600 --> 00:34:47,840 Speaker 1: United States and that they were a problem and we 586 00:34:47,840 --> 00:34:50,840 Speaker 1: should be keeping tabs on them, and the information was 587 00:34:50,920 --> 00:34:54,359 Speaker 1: just not properly passed along. And I think the nine 588 00:34:54,440 --> 00:34:59,520 Speaker 1: eleven commission, um, settled on the idea that, had 589 00:34:59,760 --> 00:35:04,520 Speaker 1: this over classification not hindered information sharing, there's a 590 00:35:04,560 --> 00:35:07,200 Speaker 1: possibility that nine eleven would have been thwarted before it 591 00:35:07,200 --> 00:35:12,040 Speaker 1: could have been carried out. Sure. Uh. Leaks is another 592 00:35:12,080 --> 00:35:14,480 Speaker 1: big one, because, you know, when you control 593 00:35:14,520 --> 00:35:17,239 Speaker 1: the information and have all the information kept secret, you 594 00:35:17,239 --> 00:35:20,040 Speaker 1: can also leak out bits of that information to 595 00:35:20,080 --> 00:35:23,400 Speaker 1: wield as a weapon against a political enemy, and, 596 00:35:23,920 --> 00:35:26,160 Speaker 1: you know, we've seen that happen time and time again. Yeah, 597 00:35:26,200 --> 00:35:29,120 Speaker 1: like Valerie Plame, right, with Dick Cheney, when he outed 598 00:35:29,160 --> 00:35:34,160 Speaker 1: her to punish her husband for criticizing the Iraq War. Yeah. 599 00:35:34,160 --> 00:35:37,440 Speaker 1: I think some of the Obama executive order is to 600 00:35:37,760 --> 00:35:41,000 Speaker 1: protect whistleblowers too, if I'm not mistaken.
Yeah, and also to 601 00:35:41,040 --> 00:35:44,680 Speaker 1: protect people who say this document is being 602 00:35:44,719 --> 00:35:47,319 Speaker 1: overclassified when they bring it up to their superior. They 603 00:35:47,520 --> 00:35:50,359 Speaker 1: should not be punished for that, right. And I think 604 00:35:50,360 --> 00:35:53,200 Speaker 1: the executive order even encourages 605 00:35:53,280 --> 00:35:57,400 Speaker 1: people to, um, not protest, but to question the 606 00:35:57,440 --> 00:36:01,520 Speaker 1: classifications of your superiors and to be allowed to bring 607 00:36:01,560 --> 00:36:05,320 Speaker 1: that up. Yeah, you're supposed to be like, that's way off. 608 00:36:05,400 --> 00:36:08,680 Speaker 1: You really screwed that one up. Uh. There's one other thing, 609 00:36:08,760 --> 00:36:13,920 Speaker 1: Chuck, too, about leaks, is that, um, over classification in 610 00:36:13,920 --> 00:36:18,160 Speaker 1: a climate where we're prosecuting leakers and whistleblowers like never before, 611 00:36:19,400 --> 00:36:23,880 Speaker 1: um, that if we are classifying things 612 00:36:24,520 --> 00:36:29,680 Speaker 1: that somebody would feel morally obligated to put their own 613 00:36:29,880 --> 00:36:34,200 Speaker 1: self at risk to release to the public, should that 614 00:36:34,239 --> 00:36:37,640 Speaker 1: stuff be classified? Should we be classifying that? And the 615 00:36:37,640 --> 00:36:41,799 Speaker 1: answer is probably not. Um, it's probably a cover up 616 00:36:41,880 --> 00:36:44,480 Speaker 1: is what the classification is being used for. And yet 617 00:36:44,600 --> 00:36:48,640 Speaker 1: under the letter of the law, because this is classified information, 618 00:36:48,920 --> 00:36:52,440 Speaker 1: the Espionage Act says that you can be prosecuted and 619 00:36:52,480 --> 00:36:56,919 Speaker 1: spend years in jail for following your conscience.
Yeah, that's 620 00:36:56,920 --> 00:37:01,319 Speaker 1: another problem with leaks and over classification. Yeah, and, you know, 621 00:37:01,400 --> 00:37:06,239 Speaker 1: to fight over classification, you know, I guess Ed 622 00:37:06,320 --> 00:37:08,359 Speaker 1: points it out very kind of plainly, like, 623 00:37:08,600 --> 00:37:12,680 Speaker 1: as a president, you're kind of in a no win situation. 624 00:37:13,320 --> 00:37:17,040 Speaker 1: If one of your big mandates is to, uh, make 625 00:37:17,080 --> 00:37:21,600 Speaker 1: a lot less things classified, like, you might win over, 626 00:37:21,840 --> 00:37:25,560 Speaker 1: you know, some freedom of information enthusiasts. But as 627 00:37:25,600 --> 00:37:28,520 Speaker 1: a general rule, you're not going to 628 00:37:28,600 --> 00:37:30,719 Speaker 1: do yourself any political favors by going in there and 629 00:37:30,719 --> 00:37:32,160 Speaker 1: being like, hey, you know what we need to do 630 00:37:32,560 --> 00:37:35,080 Speaker 1: is declassify a lot of stuff and not classify nearly 631 00:37:35,120 --> 00:37:38,520 Speaker 1: as many things. You're also gonna upset the intelligence community greatly, 632 00:37:38,600 --> 00:37:40,600 Speaker 1: I would assume. Yeah, well, it's a big part of it. 633 00:37:41,120 --> 00:37:43,200 Speaker 1: So, but some people have said, okay, well, there's got 634 00:37:43,200 --> 00:37:45,800 Speaker 1: to be some stuff we can do. Um, time limits 635 00:37:45,840 --> 00:37:49,000 Speaker 1: is a big one. Yeah. There's a guy named Erwin 636 00:37:49,120 --> 00:37:53,279 Speaker 1: Griswold, who I found in a Washington Post op ed he 637 00:37:53,320 --> 00:37:56,640 Speaker 1: wrote. He was the guy who prosecuted the Pentagon Papers 638 00:37:56,680 --> 00:37:59,080 Speaker 1: case on behalf of the government. So he prosecuted, like, to 639 00:37:59,160 --> 00:38:02,520 Speaker 1: that time, the biggest leak of government secrets ever.
And he came around years later and said we're way overclassifying. He basically said, like, there may be some basis for short-term classification while plans are being made or negotiations are going on, but apart from details of weapons systems, there is very rarely any real risk to current national security from the publication of facts relating to transactions in the past, even the fairly recent past. So putting a time limit, especially a short time limit, on classified material, that would help a lot. Yeah. I mean, is he also the guy that said, if you know a lot about this system, then it's pretty clear that a lot of these classifications are to cover up embarrassments? Yeah, he's one and the same. So that's not good. That's like, oh boy, this doesn't look good, so let's just classify it. And depending on who you talk to, estimates are anywhere from ten percent to ninety percent of classified information can be put in the overclassified bucket.
Yeah. And one of the things that he's saying, and that the statistics say, is that a lot of the reason classified material is classified is to loop the public out, either because Congress is being fed a load of BS by some lobbyists that don't want public input about what they're telling Congress, or there's a real concern that a lot of federal agencies can protect, or do protect, some of the corporations that they regulate from public scrutiny. That could be a really big issue too, right? Oh, for sure. Like, I think the FCC in the early two thousands said that they weren't going to allow reports about outages among wireless carriers to be made public out of a fear of it being a threat to national security. Which, yeah, I mean, there's a lot of that stuff.
The other one, with the protecting, is the Department of Agriculture protecting, I guess it depends on who you talk to whether you want to think of it as protecting, food producers. If they introduce a food-borne illness into the food supply, like, they might not allow that information on who it was to be released, as a protective measure. And that's when it gets a little dicey. That and the FCC one, it's like, I don't know, I mean, these are big public companies and you're talking about the public good, but you're throwing like a shield around it, and maybe that's not the best thing to do. And you can take it to an even more extreme degree, right up to the CIA's doorstep, and say, I've seen people argue that by looping the American public out about the secret torture program, the ghost prisons program, it kept the public from being able to hold a debate on whether this is something we want America to be doing or not. And that's a huge part of it, Chuck.
I mean, like, the basis of democracy is the public being looped in and then the government carrying out the wishes of the public. If you're looping the public out, then that's just the government operating and deciding on your behalf without any input from you whatsoever, because you're being completely kept in the dark. That is a huge basis of the classification system in America, sadly. Oh yeah, I mean, the agencies and the military, I think, firmly believe that the public is better off if they don't have any knowledge of this stuff and any opinion on what we're doing behind closed doors. Yeah. And I mean, they're like, what do you want to know? We're waterboarding people? And I think a lot of people would say, yeah, I would have liked to have known that so I could vote whoever was supporting that right out of office, because I don't support waterboarding, even of terrorists. No one was given that opportunity. The public, even the people who agreed with it, weren't given the opportunity to debate the merits of it in public, because everyone was kept in the dark.
Yeah, and you might think, okay, well, I actually do agree with waterboarding terrorists. Well, that was something the CIA kept secret that you happened to agree with. What about all the other things that are kept secret that you don't agree with, that are being kept out of your ability to debate? Yeah. A quick movie recommendation on that note: the new Paul Schrader movie, The Card Counter. Fantastic. Oh yeah? Won't give away too much, but Oscar Isaac plays a sort of very solitary gambler, a poker player, who is haunted by his past as a former prison guard at Abu Ghraib. Oh wow, okay, I'll check that out. So there. And you know, Paul Schrader's still making these really... he's still being Paul Schrader, so: tough, challenging, hardcore movies. What else has he done? He wrote Taxi Driver. He also made a movie called Hardcore back in, I think, the eighties. It's always just very grisly, grisly stuff. And he also did Meet Me in St. Louis. Like, the last movie he did before this one was First Reformed, the Ethan Hawke movie where he played the priest.
I don't know if you saw that. But have you seen any of his movies? I saw Taxi Driver. I thought that... he used to write more movies, and now he directs quite a bit. But he also wrote, I mean, he wrote The Last Temptation of Christ. And, oh wow, this guy's good: Raging Bull, American Gigolo, stuff like that. I love his stuff. Yeah, you know Paul Schrader, you just don't know him. I got you. Anyway, great movie recommendation. But we were talking about secret prisons and it felt relevant. So, I mean, there's like a lot of things that people say, okay, here's some fixes we can do. But it seems like it all kind of comes down to, Chuck, putting time limits on classified material, really raising the bar for what qualifies as classified material or not, and then, in conjunction with that, making the act of classifying material accountable, like saying, if you overclassify, you're in trouble, like, you're doing your job wrong and you're gonna get fired and replaced.
Those three things seem to kind of be the bottom line, and there seems to be almost no movement whatsoever on it. Yeah. And I think the other one that could be pretty impactful is: make declassification a real thing, sort of like a practice, and less like, well, someone has to submit a FOIA request or really bug us. And then, within that system, I mean, we could do an episode on FOIA requests, because if they don't want you to find out something, they can sandbag you for years, they can put you off. They can release a document that is redacted, like most of it is redacted, and go, like, here, here's a bunch of adjectives. So, you know, I think making that real would help. But I don't know if anyone's gonna have the guts to kind of stand up to this stuff, because it's not a very electable position, you know. Well, my friend, I'm going to do you one better. We actually have done an episode on FOIA. Did we? I'm gonna double check, but I believe we did. I know we've talked about it extensively. We have, we probably did.
Yeah, so you know, we don't even have to do that. We can just go back and listen to, yep, How FOIA Works. Okay, well, in my defense, in April, everyone, it will be fourteen years. That's true. And this is from like four or five years ago. So don't, don't... Right, man, I thought you were gonna say, like, last week. Right. You got anything else? I got nothing else. This is a good one. Yeah, agreed. Well, thanks to the Grabster and to all the people who wrote the articles that we used for research here. And if you want to know more about overclassification, seriously, if this struck you as at all interesting, there's a whole world out there of debate about what should be classified and what shouldn't and how to fix this that you might find interesting. And since I said you might find it interesting, it's time for listener mail. I'm gonna call this one Court Reporter. Oh, I like this one. This is a good one. This is a little lengthy, but I cut it down some.
Hey guys, I'm sitting at my desk in the courthouse in north L.A. County and just finished listening to Unsung Heroes of the Court, and I am still beaming. I am a court reporter, a stenographer. I went to court reporting school straight out of high school after taking a speed-writing class. Just like Chuck, I learned about court reporting from a career day speaker who spoke to our class, and I knew immediately that this was the job for me. I learned how to type on an old manual typewriter in seventh grade and took typing every year because it was an easy A. By the time I graduated, I could type sixty-five words per minute. A couple of years later, I got my California court reporting license at the age of nineteen. I turn fifty-five next month, so thirty-five-plus years later, I'm still loving my career. We court reporters are truly unsung heroes in the courtroom. I often compare my job to a wedding photographer's: once everything is said and done, the transcript we create is all that remains, and we're fairly invisible, so make sure you have a good one.
Our steno machines are incredibly high tech, equipped with Bluetooth communication for real-time simultaneous transcription to laptops, iPads, and the Internet. We can access the transcript immediately for readback and clarification for the record, which is extremely useful to assist the lawyers, parties, the judge, and juries. That said, we are in desperate need of more court reporters, guys. Tech schools lost their luster in an era where everyone felt the need to have a college degree. But just like the bailiffs that we cherish in our own courthouses, court reporting school doesn't require a college degree. It just requires hard work, a dedication to your profession, and nerves of steel. That's right. I have never regretted my career choice, and I thank you, Josh, Chuck, and Livia, Olivia's first shoutout, for highlighting us court reporters and the crucial role we play in the courtroom. That is from Linda Davidson, L.A. Superior Court official court reporter, and proud of it. Very nice. Thank you, Linda. That was a great email. I appreciate that one big time. That was a great idea for an episode, too, Chuck.
It turned out to be pretty, pretty good. And we heard from, I haven't heard from a bailiff yet, but we've heard from actually just a couple of court reporters. No sketch artists, right? They're quiet. Yeah, they're quiet. Oh, but get this. I was watching, oh, what case? I think the Ghislaine Maxwell case, and clearly the sketch artist who had drawn that Tom Brady one that went viral, she was producing sketches for, I think, NBC's national news. I recognized her style. I was just looking at the one for the Elizabeth Holmes case this morning, and now when I see those, I'm like, okay, nice work, it looks good. Yeah, right, right. Well, thanks again, Linda. If you want to be like Linda and get in touch with us, you can send us an email to Stuff Podcast at iHeartRadio dot com. Stuff You Should Know is a production of iHeartRadio. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.