Speaker 1: Cool Zone Media.
Speaker 2: Oh, welcome back to Behind the Bastards, a podcast that is just disastrously hungover right now. I am... I am not doing well, everybody. It is the day after, or around, my birthday. Happened at some point in the last forty-eight hours, or perhaps right now. Hello, Dave, how are you doing?
Speaker 3: Woo!
Speaker 1: Could I read the text message you sent me at four or five in the morning? "Yeah, I have alarms, but if they don't wake me, you can call me." "U," not spelled out like you normally do, which was my favorite part.
Speaker 3: It was like, "ewe, you call."
Speaker 2: We did just have another U yesterday, so that would make sense.
Speaker 3: I can't believe it was your birthday. Happy birthday, lovely Dave.
Speaker 2: It was a birthday until I had to wake up.
Speaker 3: Well, yeah, I mean, that's... yeah, it's life, man, you know, just pummeling, just pummeling us.
Speaker 2: But the hits keep coming.
Speaker 3: I don't know. I'm going to celebrate today, yeah.
Speaker 1: Just to say, I'm really glad you were born.
Speaker 2: Good job, thank you.
Speaker 2: If anyone at home wants to celebrate my birthday, just, uh, I don't know, send me some of your fingernails. Fingernails! I want as many fingers... No, you're right. Someone will do that. Someone will do that. We probably shouldn't, Dave.
Speaker 3: Yeah, it's Bell. Yeah.
Speaker 2: Oh yeah, here, hi: co-host, co-impresario of the Gamefully Unemployed network, my former co-worker at Cracked dot com, uh, man about town at Some More News. How are you doing today, Dave?
Speaker 3: I'm good. I'm living the dream. I'm wheeling and dealing. I'm, uh... I gotta figure out... I gotta schedule a colonoscopy.
Speaker 2: Ooh, you're doing that? Just doing, you...
Speaker 3: know, just doing the things, you know, just... just living it, living life.
Speaker 2: Yeah, that's good, that's good. You know, I, uh... do you have... is this a recreational colonoscopy or a necessary one?
Speaker 3: It... can it be both? Yeah?
Speaker 2: Okay, that's fair. Yeah, fair enough. I know.
Speaker 3: I mean, I probably have nothing wrong with me. I just, uh, you know, I'm hitting forty. I... I have, like, a slight family history. And... and I talked to my... my...
Speaker 3: I got a new... I got a new doctor, and I was like, what about them colonoscopies? And they're like, yeah, let's do it. Let's fucking do it. And then we're... we're doing it, doing a colonoscopy. Yeah, excellent. Yeah, it's exciting stuff, you know.
Speaker 2: Well, I have not had a colonoscopy yet, although soon. But you know what is the moral equivalent of having a camera shoved up your ass?
Speaker 3: Ooh, what?
Speaker 2: Living in the United States, uh, in election year twenty twenty four. And a big part of why it feels that way is because of a little concept that you might be aware of, Dave: the think tank. Ooh yeah. What do you know about the think tank as a concept?
Speaker 3: I mean, I always picture, like, a SeaWorld-style tank filled with brine and just a giant floating brain that a group of scientists sort of circle around, and that brain is hooked up to computers, and then the data gets printed out, and they take that data and then they use it or ignore it, and then they just do whatever they want to do politically.
Speaker 2: Wow, it's... it's amazing.
Speaker 2: That's nearly the opposite of what it really is. So what a think tank really is, is a way in which to generate paper, and then that paper convinces journalists that your policies are real policies, and then you take over the Supreme Court. Right now, I know this is going to sound convoluted. We're talking about how the Republican Party kind of won, and specifically how they turned around the situation that existed in this country, in the United States, that we don't talk about much. Especially because on the right there's this need to believe that the fifties and sixties were this, like, era where everything was better, and like the whole country was more conservative. And then there's this need among liberals, on the left, to believe that, like, they have kind of been ascendant lately, and sort of the last couple of years of resurgent right-wing stuff has been a severe disruption of the norm.
Speaker 2: And neither of these is really accurate, because the reality of the situation is that in the fifties and sixties, basically any commentator who was looking at it honestly would have told you that, like, well, liberalism has clearly won. And when we talk about liberalism, I'm not talking about, like, the way we kind of talk about liberals today, or the way liberals often like to see themselves, as progressives. I'm talking about, like, an economic and kind of social set of political beliefs that was the dominant way in which people viewed politics by the nineteen fifties. Liberal economic policy and social orthodoxy reigned supreme in the postwar era, and it was... it was like... it had such a degree of capture of the system that literary critic Lionel Trilling wrote in nineteen fifty, "In the United States at this time liberalism is not only the dominant but even the sole intellectual tradition." He claimed there are no conservative or reactionary ideas in general circulation, only, and I love this term, "irritable mental gestures which seek to resemble ideas."
Speaker 2: Now, a bunch of people are going to be, like, thinking about all the fucked-up shit that they knew the government was doing in the fifties and sixties, like overthrowing governments in Latin America and getting into Vietnam, and being like, well, how can you say that liberalism was dominant in this period? And I can say that because, like, it was liberals doing a lot of that, right? Like, it was JFK who got our asses into Vietnam, and LBJ, the Great Society guy, who accelerated it.
Speaker 3: So I'm not... yeah, right. And like, I don't know what's considered liberal or leftist, yeah, you know, fifty...
Speaker 2: years ago. Not leftist, yeah, yeah. But it's...
Speaker 3: always going to be different, right? Like, what, you know... I... you look at Star Trek, which is considered pretty progressive, but then you go and look at TNG and you're like, man, they're almost there, but they still don't quite, you know... like, I don't know.
Speaker 2: They wouldn't let Riker kiss a dude, right? Yeah, yeah, exactly. Like, he wanted to. Jonathan Frakes was in, of course.
Speaker 3: Yeah, I mean, he... he fucked everything. And that... my favorite is the non-binary aliens.
Speaker 2: Yeah, yeah. Where...
Speaker 3: he immediately, like, upon meeting them, is like, oh, I have to fuck one of these. I don't care which. I got to check off this box.
Speaker 2: Mm-hmm. Poor Riker. But yeah, so when we're talking about this, we're talking about the idea that, like... and it makes some sense if you think about, like, what happened to the right as a result of World War Two. Like, in the pre-war period, you have this kind of, like, isolationist, reactionary strain that really has to retreat, because we go to war with the Nazis and they're like, oh shit, we have to... we have to get very careful about how we talk about some of the things we believe, suddenly. So it's... it's just this... this very, like, fundamentally different period.
Speaker 2: And one of the few things that is kind of similar in sort of how we look at liberalism from then to now is that it was... it was, and actually much more so back then, defined by this kind of embrace of public spending and huge public works projects, right? That was pretty universally accepted as what the government ought to be doing, to the point where I can look at, like, Dwight D. Eisenhower, famous Republican president, builds a massive interstate highway system, right? Which is not a thing that you would... you would get a Republican supporting today, like a public works project on that scale. It's... it's kind of inconceivable now. But that's what I'm talking about when I'm talking about how, like, people... a lot of people in public policy at the time were like, well, yeah, this kind of liberal trend towards what the role of the state should be in society has clearly won, and there's not really any other game in town.
Speaker 3: It's funny you should frame it this way, because I've actually... I've talked about this with friends and stuff before, and like, people always frame it kind of not like, oh, you know how they used to be liberal. It's more of like, you know how what's considered quote-unquote leftist or liberal used to just be the norm? Yeah. It's like, you know, we used to fix our highways up, and that wasn't seen as a political thing. Yeah, it was just... it was just a thing you have to do when you run a government. Like, would you... would you agree with that, or would you say that they were... they were... it was seen at the time as very liberal? Because I feel like these are things that other countries do too, yeah, that aren't seen as politically leaning at all.
Speaker 2: Yeah. And part of the story today, and why we have think tanks, is to make all that seem more political than it used to.
Speaker 2: But it's also just a matter of, like... when you're... when you're looking back to that period of time, you're looking at a period of time in which, like, the Republicans... we had a very strong liberal arm of the party. Like, Rockefeller was a liberal Republican. That was, like, a dominant part of the Republican Party. There were liberal Republican presidential candidates who did pretty well, right? And that's a really different situation today. Like, now... and again, I'm not saying any of these people you would call, like, a leftist, although they are all people who, if they ran today, would be called leftists by Republicans.
Speaker 3: Right. Like, that's the thing. It's so hard to, like, separate these words anymore.
Speaker 2: Yes, yes. And I know that's a frustrating part of this.
Speaker 3: I think I've... I, like... I don't even know what to call myself. I certainly... I don't think I'm a liberal. I'm very left-leaning and very progressive.
Speaker 3: But, like, I think I've accidentally called myself just a liberal, because they're all, like... like, leftist, liberal, they're all L-words that are just like, eh, I don't know. But it's just interesting, because, like, the idea of a liberal Republican, I'm just like, I don't know how that works. But then I think about, like, you know, Nixon. I think it was Nixon created the EPA.
Speaker 2: That's a great example of how things have shifted, where, like... a guy like... and Nixon went to China and normalized relations with Mao, right? Like, another thing that you wouldn't really... I mean, Trump kind of tried to do his version of it with North Korea, but it was not really the same deal, you...
Speaker 3: know. No, not at all. And like, I don't know, a lot of this stuff to me just seems like... just things you have to do, right? Where it's like, we need to protect our, like, forests and national parks. Yeah, it doesn't feel like it needs to be political at all. But it's just funny when you start trying to figure out those labels.
Speaker 2: When I'm... when I quote these guys from the fifties and sixties saying, like, liberalism has clearly won, that's kind of what they're referring to, right? So much stuff that is now seen as political... spending any amount of public money on anything to help people is deeply political.
Speaker 3: Right. It sounds like to some people it was political then. So it was...
Speaker 2: There was a tiny number of those people who thought of it that way, and they were mostly very, very wealthy business owners, primarily people who had inherited businesses and whatnot from their families. And they're going to be kind of the folks who actually wind up creating the network of think tanks. Like, that is the story we're talking about today, because this was all started as a way to shift public consensus away from the idea that, like, the government can do things to benefit people, and towards the idea that, like, any public spending is communism, right? Like, that's... that's a big part of what we're talking about today.
Speaker 2: And there's a number of reasons why we went from, like, FDR being the most beloved president in American history to... like, I think he's still broadly... people have fond memories of him. But if anybody pushed policies today like this, you know, like FDR did, they would be called a dictator, right? Like, it would be, like, unimaginably controversial. And there's a number of reasons why that status quo started to change. One of them was the Vietnam War, right, and how much fucking money it cost. And by the time you hit, like, the mid-to-late seventies, you've got, you know, the economy sort of grinding to a halt, inflation rising, stuff that will never happen again.
Speaker 2: And there's... there starts to be this awareness among, like, public policy people in the United States that, like, oh, the unlimited money train from after World War Two isn't going to keep going on forever, right? And so, yeah, we're going to talk about that, because, like, getting us from FDR to LBJ to Ronald Reagan and then to Trump was in part the result of a concerted effort to shift the culture by building a robust system for generating conservative ideas and then pumping them into the culture at scale, so they looked like they had scholarly support and public support. And they kind of did that until they made it true. And I think a good place to start is with the development of the concept of the think tank. These are not a uniquely American institution, but they are uniquely influential and powerful in the United States. There's not any other nation on earth that has its public policy or political culture shaped so much by think tanks, which are basically dark money sinks where intellectuals who are bribed generate the illusion of consensus in exchange for money.
Speaker 2: Right, that's what they are there for.
Speaker 3: People who don't like reality. Right. Like, it's the idea of, like, I have a thing, I want to get richer doing this thing, or I want to support these people. But, like, reality, like, studies and facts, are showing that, like, it's bad to do this. So let's recreate studies and facts to look like they're pushing this thing.
Speaker 2: And one of the ways in which... because you have think tanks that are more intellectually rigorous, and they all charge for their shit, and a lot of the shitty think tanks that are, like, funded by the oil and gas industry give out their papers for free. So you'll have, like, journalists writing articles, and they're like, wow, I need... I need two different opinions on whether or not we should frack the oil fields in fucking Texas.
Speaker 2: And one, you know, think tank that actually does real research will say, you have to pay us twelve hundred dollars for our papers on what will happen. And the Heritage Foundation says, here you go, here's a thousand pages of shit that you can cite in your articles, and it's all free. That's, like, a simplified version of the game that's going on here.
Speaker 3: Right. It sounds a lot like studies... like, there's study... studies where, like, you start looking into them and you're like, wait, this doesn't actually say the thing that the headlines are saying. And again, sometimes it's done insidiously.
Speaker 2: Yeah.
Speaker 3: Sometimes I think it's just people misinterpret things, or, like, they're trying to... like you're saying, like, actual... actual, like, work takes money, because it takes people, and that's not very sexy.
Speaker 2: No. I mean, you can... you can have a fancy name for an organization and just put out papers that say whatever you want them to say. And if you do it with, like, a good letterhead, people will trust that there's... there's something to what you're saying.
Speaker 3: It's amazing how... how far a good font will get you.
Speaker 2: Yeah, yeah, yeah. So the idea that, like, we would have... and it's kind of worth going back a bit and talking about how, like, the field of public policy analysis, which is kind of where think tanks as a concept come out of, how recent it is. Because for most of the history of governments, you didn't have professional people who, like, analyzed policy and looked into how it was working, right? Like, progress was... sometimes you'd get a king who was, like, reasonably smart, and sometimes you'd get an inbred royal king who would either, like, take things back a couple of decades, or be weak enough that, like, a couple of smart guys could move things forward. You get a dud or two, you get a couple of world wars. And then in the twentieth century, after we finally invented cigarettes, people got smart, and that's when we start doing actual policy analysis.
Speaker 2: It's really not until, like, the nineteen hundreds that we actually start, in kind of a concerted way, like, looking at a lot of... where you have these guys who are like, I'm an expert in urban planning, right? I'm an expert in, like, energy policy, right? You had, like, guys who were kind of doing that in the eighteen hundreds, but it starts to actually become, like, systematized in the nineteen hundreds.
Speaker 3: Right. Yeah, it's a bunch of people who are like, I'm a pervert for this one thing. Yes. Oh, yes, yeah. And I can exist that way. Yeah. It's like, cool, you weird little freak. That's great. Well, we'll consult you when we need questions about that thing.
Speaker 2: Mm-hmm. And that's kind of, like... that is kind of a quietly revolutionary concept, because it implied, like, well, whatever our society is doing right now might not be the best way to do things, and so we should always be looking at doing them differently. And that hasn't always been, like, a thing you could expect from a society, right?
Speaker 2: And there's a really fun nineteen ninety-one doctoral dissertation by Dr. Susan Marie Willis I found that notes that in the period leading up to World War Two, there was this kind of turning point in the federal government's willingness to solicit expert advice in solving the nation's problems, and as a result, quote, "the country's intellectual magnet shifted from New York City to Washington, D.C." during that time. Some place the shift in reliance on policy experts earlier, during World War One, when many businessmen and academicians played key roles in wartime management on the War Industries Board. It is certainly true that Roosevelt had the help of his famous Brains Trust in formulating some New Deal policy, and also that Woodrow Wilson took selected scholars with him to the Versailles Peace Conference. In both cases, these were individual consulting scholars and economists, many drawn from Harvard, but not associated in any formal sense as a group or policy research body.
So this is kind of when you 347 00:18:05,119 --> 00:18:07,600 Speaker 2: start to get the idea that, like, the president would 348 00:18:07,600 --> 00:18:11,120 Speaker 2: not just bring in his own people specifically, but might 349 00:18:11,200 --> 00:18:15,040 Speaker 2: like actually pick experts who were at least ostensibly independent, 350 00:18:15,160 --> 00:18:18,120 Speaker 2: and they would advise him on stuff. Right. That's when 351 00:18:18,160 --> 00:18:20,520 Speaker 2: this starts to become more common, right. 352 00:18:20,359 --> 00:18:23,119 Speaker 3: And that just seems like good leadership. 353 00:18:23,200 --> 00:18:26,560 Speaker 2: For the most part, it seems smart, right Yeah. 354 00:18:26,640 --> 00:18:29,280 Speaker 3: Yeah, like I remember in film school talking about 355 00:18:29,359 --> 00:18:31,720 Speaker 3: or like being a manager, right where it's like part 356 00:18:31,720 --> 00:18:34,159 Speaker 3: of being a good manager or a director is to 357 00:18:34,200 --> 00:18:38,159 Speaker 3: consult the people who are very ultra into specific things 358 00:18:38,160 --> 00:18:40,800 Speaker 3: and hear what they have to say. I mean, now 359 00:18:40,840 --> 00:18:42,960 Speaker 3: we got Wikipedia, so yeah, well, don't. 360 00:18:42,800 --> 00:18:44,639 Speaker 2: need any of that, but you can see how there 361 00:18:44,680 --> 00:18:46,320 Speaker 2: could be a good and a bad side, right where 362 00:18:46,320 --> 00:18:48,720 Speaker 2: if the president's trying to, like, I don't know, deal 363 00:18:48,760 --> 00:18:50,960 Speaker 2: with the aftermath of World War One, yeah, you should 364 00:18:50,960 --> 00:18:54,080 Speaker 2: probably bring an academic who knows something about like fucking 365 00:18:54,280 --> 00:18:58,360 Speaker 2: Austria-Hungary, like, and its different political factions and whatnot. 366 00:18:58,440 --> 00:19:00,640 Speaker 2: That makes sense, seems like that guy should be there.
367 00:19:00,800 --> 00:19:04,240 Speaker 3: Yeah, but they shouldn't necessarily. I mean, it's only as 368 00:19:04,240 --> 00:19:07,560 Speaker 3: good as the person, right? If your expert is, say, 369 00:19:07,720 --> 00:19:10,560 Speaker 3: a racist, then everything they're going to do is tinted 370 00:19:10,600 --> 00:19:10,960 Speaker 3: that way. 371 00:19:11,160 --> 00:19:11,400 Speaker 2: Yeah. 372 00:19:11,440 --> 00:19:14,320 Speaker 3: I mean, I think that's why people look to like 373 00:19:14,600 --> 00:19:18,080 Speaker 3: the idea of computers as being impartial. But then also 374 00:19:18,160 --> 00:19:21,640 Speaker 3: you need someone who has instincts on something as well, 375 00:19:21,920 --> 00:19:24,720 Speaker 3: like being an expert in something should also mean being 376 00:19:24,800 --> 00:19:28,359 Speaker 3: able to interpret that information in a good way. 377 00:19:28,800 --> 00:19:30,640 Speaker 2: And also kind of the problem with this is that 378 00:19:31,040 --> 00:19:33,200 Speaker 2: when it's just seen as like, well, yeah, we elected 379 00:19:33,200 --> 00:19:35,199 Speaker 2: this president and he and all the people in his 380 00:19:35,280 --> 00:19:38,879 Speaker 2: administration are all like, you know, members of the same team, 381 00:19:39,280 --> 00:19:40,760 Speaker 2: when you start to see like, well, no, some of 382 00:19:40,760 --> 00:19:43,720 Speaker 2: the people giving him advice are like independent experts and 383 00:19:43,760 --> 00:19:45,960 Speaker 2: they're just trying to do what's best based on the facts. 384 00:19:46,040 --> 00:19:47,679 Speaker 2: Well, are they right? 385 00:19:47,800 --> 00:19:47,960 Speaker 3: Right? 386 00:19:48,040 --> 00:19:50,399 Speaker 2: Like, they won't always be. Sometimes they'll be just as 387 00:19:50,440 --> 00:19:52,840 Speaker 2: political as anybody else the president might hire.
But if 388 00:19:52,880 --> 00:19:55,520 Speaker 2: you start thinking about them as like possessed of some 389 00:19:55,560 --> 00:19:58,760 Speaker 2: sort of objective wisdom that's above the fray, you can 390 00:19:58,800 --> 00:20:01,240 Speaker 2: also wind up not being critical enough of like what 391 00:20:01,320 --> 00:20:02,399 Speaker 2: they're actually doing. 392 00:20:02,680 --> 00:20:05,800 Speaker 3: Yeah, they also become it's that thing. It's we've seen 393 00:20:05,840 --> 00:20:10,280 Speaker 3: it a million times where someone's really smart about one thing. Yeah, 394 00:20:10,280 --> 00:20:14,119 Speaker 3: that doesn't mean they're smart about everything. And so like 395 00:20:14,240 --> 00:20:16,280 Speaker 3: you have to know how to use that person and 396 00:20:16,320 --> 00:20:19,080 Speaker 3: that information, yeah, and like know what to do with 397 00:20:19,119 --> 00:20:24,840 Speaker 3: that information, the broader consequences of that, because experts, 398 00:20:25,240 --> 00:20:28,040 Speaker 3: I mean, there's a reason why there's that weird stereotype 399 00:20:28,040 --> 00:20:31,400 Speaker 3: of like, yeah, these snooty college types, yeah, right, where 400 00:20:31,440 --> 00:20:33,600 Speaker 3: it's like, yeah, I sort of get it, like, you know, 401 00:20:34,160 --> 00:20:36,119 Speaker 3: when you get to having a lot of authority in 402 00:20:36,160 --> 00:20:39,800 Speaker 3: one thing, you start acting with that authority on everything sometimes. 403 00:20:40,400 --> 00:20:42,760 Speaker 2: Yeah, and I'm kind of on the other side of things.
404 00:20:42,760 --> 00:20:46,080 Speaker 2: If you're like some scholar who's a weird wonky expert 405 00:20:46,200 --> 00:20:50,359 Speaker 2: in different kind of obscure European political conflicts and shit, 406 00:20:50,920 --> 00:20:53,359 Speaker 2: and the president comes to ask you to help him 407 00:20:53,359 --> 00:20:55,680 Speaker 2: at a peace conference, you might just kind of try 408 00:20:55,680 --> 00:20:57,520 Speaker 2: to figure out what the president wants to hear and 409 00:20:57,560 --> 00:21:01,200 Speaker 2: tell him that, you know, because you know, it's noteworthy 410 00:21:01,240 --> 00:21:03,960 Speaker 2: that Wilson took all of these scholars with him to Versailles. 411 00:21:04,040 --> 00:21:05,639 Speaker 2: But like we all know that didn't go well. That 412 00:21:05,720 --> 00:21:08,320 Speaker 2: wasn't a good peace treaty, right, Like that was It's 413 00:21:08,400 --> 00:21:13,120 Speaker 2: like a famously bad peace treaty. So this is sort 414 00:21:13,119 --> 00:21:16,120 Speaker 2: of the prehistoric era of think tanks, if you're thinking 415 00:21:16,119 --> 00:21:18,439 Speaker 2: about like your evolutionary chart, this is when like the 416 00:21:18,480 --> 00:21:21,760 Speaker 2: fish gets on to land, right, and the fish is 417 00:21:21,800 --> 00:21:24,320 Speaker 2: eventually going to be named the Heritage Foundation. 418 00:21:24,400 --> 00:21:25,280 Speaker 3: It's a good fish name.
419 00:21:25,359 --> 00:21:27,800 Speaker 2: But this period you start in, kind of the period 420 00:21:27,840 --> 00:21:30,600 Speaker 2: after Wilson, you get this groundwork laid for what's going 421 00:21:30,640 --> 00:21:34,320 Speaker 2: to come after. It becomes the norm that private economists, lawyers, 422 00:21:34,320 --> 00:21:37,159 Speaker 2: and experts will be brought in to advise presidents and 423 00:21:37,240 --> 00:21:41,000 Speaker 2: Congress about like you know, just over time during like 424 00:21:41,080 --> 00:21:44,239 Speaker 2: their entire periods in office, but also about specific issues. Right, 425 00:21:44,280 --> 00:21:46,600 Speaker 2: it becomes more normal. And again this isn't like a 426 00:21:46,640 --> 00:21:49,480 Speaker 2: clean break. You could find examples of this, you know, 427 00:21:49,520 --> 00:21:53,840 Speaker 2: in earlier decades and centuries, but it becomes normalized that like, well, 428 00:21:53,840 --> 00:21:56,520 Speaker 2: we're debating this bill on like setting up I don't know, 429 00:21:56,720 --> 00:21:59,320 Speaker 2: fucking phone infrastructure for the country. Let's bring in an 430 00:21:59,320 --> 00:22:02,240 Speaker 2: expert on that, right, like an actual like policy expert 431 00:22:02,400 --> 00:22:04,399 Speaker 2: to like talk about how this should go or what 432 00:22:04,440 --> 00:22:06,800 Speaker 2: they think will happen with this thing.
And it becomes 433 00:22:06,840 --> 00:22:12,159 Speaker 2: really desirable for experts to get positions like this, and organizations, 434 00:22:12,160 --> 00:22:15,800 Speaker 2: specifically corporations who have a lot of vested interest in 435 00:22:15,880 --> 00:22:18,240 Speaker 2: some of these like different bills being put forward for building 436 00:22:18,280 --> 00:22:22,959 Speaker 2: infrastructure, start to realize that like, well, if we fund experts, 437 00:22:23,160 --> 00:22:25,719 Speaker 2: if we like have experts that we have paid to 438 00:22:25,720 --> 00:22:28,840 Speaker 2: get to that position in society, that could really help 439 00:22:28,920 --> 00:22:30,720 Speaker 2: us out when it comes time to like we want 440 00:22:30,760 --> 00:22:32,320 Speaker 2: to make sure that these laws are written in such 441 00:22:32,320 --> 00:22:34,680 Speaker 2: a way that we get some of these government contracts, right, 442 00:22:35,359 --> 00:22:37,320 Speaker 2: you know, you get all of this happening at once. 443 00:22:37,400 --> 00:22:39,680 Speaker 2: Both this positive benefit of like, yeah, we actually have 444 00:22:39,720 --> 00:22:42,679 Speaker 2: people who know what they're doing being consulted about laws, 445 00:22:43,040 --> 00:22:47,240 Speaker 2: and also these people are often deeply corrupted because they 446 00:22:47,280 --> 00:22:50,119 Speaker 2: need money before they hit that point. And there's always 447 00:22:50,119 --> 00:22:54,600 Speaker 2: corporations willing to pay for people to become experts as 448 00:22:54,600 --> 00:22:57,080 Speaker 2: long as they know who butters their bread. 449 00:22:57,440 --> 00:23:02,320 Speaker 3: Right, Yeah, I mean so money, money's a real problem. 450 00:23:02,640 --> 00:23:06,399 Speaker 3: Like it shouldn't be involved in everything we do.
No, 451 00:23:06,720 --> 00:23:09,119 Speaker 3: there should be things we do that don't have to 452 00:23:09,160 --> 00:23:10,320 Speaker 3: be motivated by money. 453 00:23:10,359 --> 00:23:13,320 Speaker 2: But you know, that's the, that's the beautiful dream of 454 00:23:13,359 --> 00:23:16,359 Speaker 2: Star Trek, that sometimes we could just act purely based 455 00:23:16,400 --> 00:23:19,000 Speaker 2: on whether or not Will Riker wants to fuck something. 456 00:23:19,000 --> 00:23:21,240 Speaker 2: And Will Riker always wants to fuck something. 457 00:23:21,440 --> 00:23:23,640 Speaker 3: His dick is its own currency, that's. 458 00:23:23,560 --> 00:23:36,480 Speaker 2: Right, Speaking of Riker's dick, these ads. Oh, we're back now. 459 00:23:36,840 --> 00:23:39,119 Speaker 2: Before you can have someone who's an expert in like 460 00:23:39,440 --> 00:23:42,800 Speaker 2: any kind of policy field, you have to have I mean, 461 00:23:42,840 --> 00:23:44,399 Speaker 2: I guess you don't have to have this, but it 462 00:23:44,440 --> 00:23:46,280 Speaker 2: really helps.
You have to have like the ability to 463 00:23:46,280 --> 00:23:50,280 Speaker 2: get degrees in stuff like economics and political science, and 464 00:23:50,320 --> 00:23:53,919 Speaker 2: that wasn't always specifically like graduate degrees, and that was 465 00:23:53,960 --> 00:23:56,760 Speaker 2: not always a thing. Like, it was kind of a 466 00:23:56,840 --> 00:24:01,520 Speaker 2: process of consolidating these broad fields of knowledge into discrete 467 00:24:01,560 --> 00:24:04,440 Speaker 2: fields of study, and it was sort of a thing 468 00:24:04,440 --> 00:24:07,119 Speaker 2: that kind of happens along kind of the late eighteen 469 00:24:07,200 --> 00:24:09,680 Speaker 2: hundreds early nineteen hundreds where you actually start to get 470 00:24:10,000 --> 00:24:13,600 Speaker 2: like graduate schools and a university system that looks like 471 00:24:13,720 --> 00:24:16,480 Speaker 2: the one we have today. Right, and we'll be talking 472 00:24:16,520 --> 00:24:19,560 Speaker 2: about like what happens to the university system, because this 473 00:24:19,680 --> 00:24:21,560 Speaker 2: is really going to piss off a lot of people. 474 00:24:22,119 --> 00:24:24,880 Speaker 2: But you start to get the very first like kind 475 00:24:24,880 --> 00:24:28,400 Speaker 2: of modern looking colleges that aren't just, you know, among 476 00:24:28,440 --> 00:24:31,280 Speaker 2: other things, aren't just a place for rich kids to go, 477 00:24:31,440 --> 00:24:34,439 Speaker 2: right, where it becomes more normal for like regular people 478 00:24:34,480 --> 00:24:37,000 Speaker 2: to go and get degrees and then they can become 479 00:24:37,119 --> 00:24:40,760 Speaker 2: experts in fields and influence public policy. And that is 480 00:24:40,840 --> 00:24:43,080 Speaker 2: kind of a quietly radical change.
It has a lot 481 00:24:43,119 --> 00:24:45,400 Speaker 2: to do with why, around the time of the Great 482 00:24:45,440 --> 00:24:47,960 Speaker 2: Depression and the New Deal, you get all of these, 483 00:24:48,040 --> 00:24:51,160 Speaker 2: to us, seemingly radical policies, is because you have all 484 00:24:51,200 --> 00:24:53,840 Speaker 2: of these people who were educated at a time in 485 00:24:53,880 --> 00:24:58,920 Speaker 2: which education wasn't really politicized in the 486 00:24:59,040 --> 00:25:01,080 Speaker 2: kind of way that it is now. It still obviously 487 00:25:01,119 --> 00:25:04,359 Speaker 2: had plenty of biases, right, it reflected the biases of 488 00:25:04,359 --> 00:25:06,439 Speaker 2: the culture that it was in. But you did not 489 00:25:06,720 --> 00:25:10,919 Speaker 2: have like right wing schools. You did not have like 490 00:25:11,720 --> 00:25:14,320 Speaker 2: this kind of like culture warship over schools. And so 491 00:25:14,400 --> 00:25:18,960 Speaker 2: as a result, educated experts tended to overwhelmingly be progressives. 492 00:25:19,400 --> 00:25:21,320 Speaker 2: And again, these are people who, in a lot of 493 00:25:21,359 --> 00:25:24,719 Speaker 2: ways, we would consider deeply reactionary today, but they were 494 00:25:24,760 --> 00:25:29,360 Speaker 2: all pretty supportive of, like, the widespread idea that, yeah, 495 00:25:29,359 --> 00:25:31,800 Speaker 2: you should use the government's money to help people, right, 496 00:25:31,880 --> 00:25:34,760 Speaker 2: to do things, to have a society. You know, that 497 00:25:34,920 --> 00:25:37,760 Speaker 2: was the normal view of this kind of group of people. 498 00:25:38,200 --> 00:25:40,960 Speaker 3: I mean, yeah, it's that feels like that's the point 499 00:25:40,960 --> 00:25:43,760 Speaker 3: of a government.
It's a group that basically has taken 500 00:25:43,800 --> 00:25:46,239 Speaker 3: control of land and went like, all right, we're going 501 00:25:46,280 --> 00:25:48,720 Speaker 3: to make it easy for everybody to live here, and 502 00:25:49,040 --> 00:25:53,200 Speaker 3: you know, in exchange, you won't chop our heads off. 503 00:25:53,320 --> 00:25:55,840 Speaker 2: Yeah we should have roads. Oh, everyone is starving. We 504 00:25:55,840 --> 00:25:58,879 Speaker 2: should give people jobs and just have them build stuff 505 00:25:58,960 --> 00:26:01,840 Speaker 2: for a while until this great depression thing shakes out. 506 00:26:02,680 --> 00:26:04,320 Speaker 2: I'm not trying to be like and all of the 507 00:26:04,400 --> 00:26:06,960 Speaker 2: experts were progressive, so the world was perfect because no, 508 00:26:07,000 --> 00:26:09,560 Speaker 2: everybody was still racist as hell. We did all sorts 509 00:26:09,560 --> 00:26:12,480 Speaker 2: of fucked up stuff back then. Progressive then did not 510 00:26:12,800 --> 00:26:15,439 Speaker 2: entirely mean the same thing it does today. But there 511 00:26:15,440 --> 00:26:18,840 Speaker 2: were certain things that just like weren't controversial back then, right. 512 00:26:19,440 --> 00:26:22,800 Speaker 2: And among the first great think tanks of American society 513 00:26:22,920 --> 00:26:26,040 Speaker 2: was the Russell Sage Foundation, which had been established in 514 00:26:26,119 --> 00:26:29,040 Speaker 2: nineteen oh seven from a ten million dollar donation. And 515 00:26:29,040 --> 00:26:30,280 Speaker 2: I'm going to quote from and that was a lot 516 00:26:30,320 --> 00:26:33,160 Speaker 2: of money back then. That's like in about twenty something 517 00:26:33,200 --> 00:26:35,280 Speaker 2: a year. To me, it's a lot of money for 518 00:26:35,359 --> 00:26:38,120 Speaker 2: a dude. Yeah, that's not a lot in think tank 519 00:26:38,280 --> 00:26:40,800 Speaker 2: terms back then. 
But I'm going to quote from Willis again. 520 00:26:41,320 --> 00:26:44,960 Speaker 2: It brought together amateur social investigators and charity volunteers with 521 00:26:45,000 --> 00:26:48,280 Speaker 2: professional social scientists for the purpose of applying new research 522 00:26:48,359 --> 00:26:51,680 Speaker 2: methods in the permanent improvement of social conditions. There was 523 00:26:51,720 --> 00:26:54,600 Speaker 2: a particular concern with child labor laws, child and family 524 00:26:54,640 --> 00:26:58,399 Speaker 2: and health services, and education. The staff of educators, sociologists, 525 00:26:58,400 --> 00:27:02,040 Speaker 2: and settlement house veterans comprised few academicians, but they compiled 526 00:27:02,080 --> 00:27:05,520 Speaker 2: statistics and other pertinent information on social problems and abuses. 527 00:27:05,720 --> 00:27:08,000 Speaker 2: These data were made available to the general public as 528 00:27:08,000 --> 00:27:10,040 Speaker 2: well as to state and local governments to guide them 529 00:27:10,080 --> 00:27:13,440 Speaker 2: in practical policy formation. The pamphlet was the most typical 530 00:27:13,440 --> 00:27:16,400 Speaker 2: publication of the Sage Foundation at the time, and traveling 531 00:27:16,440 --> 00:27:19,760 Speaker 2: exhibitions which visited county fairs and schools were also sent out. 532 00:27:20,400 --> 00:27:24,080 Speaker 2: So this is you can see some similarity in that 533 00:27:24,160 --> 00:27:27,160 Speaker 2: a lot of these today, these kind of like particularly 534 00:27:27,240 --> 00:27:29,480 Speaker 2: more political think tanks, will put out stuff that's meant 535 00:27:29,520 --> 00:27:33,119 Speaker 2: to be widely consumed, information that, like you know, is 536 00:27:33,440 --> 00:27:37,000 Speaker 2: meant to be sort of disseminated via social media or whatnot, 537 00:27:37,040 --> 00:27:39,960 Speaker 2: or be like widely quoted in the news.
The Russell 538 00:27:40,040 --> 00:27:44,000 Speaker 2: Sage Foundation is taking a more direct route, because there's 539 00:27:44,040 --> 00:27:46,280 Speaker 2: not as many organs for people to get that kind 540 00:27:46,320 --> 00:27:48,440 Speaker 2: of stuff out. So they're just like handing out, they're 541 00:27:48,480 --> 00:27:52,280 Speaker 2: looking into like, hey, how should we actually like educate children? 542 00:27:52,359 --> 00:27:54,480 Speaker 2: Is it bad for little kids to work? We put 543 00:27:54,520 --> 00:27:57,160 Speaker 2: together a pamphlet on this, like let's all look into 544 00:27:57,200 --> 00:28:01,040 Speaker 2: this. And again, when I said 545 00:28:01,119 --> 00:28:04,520 Speaker 2: earlier it wasn't really controversial to be against 546 00:28:04,600 --> 00:28:07,680 Speaker 2: child labor or be in favor of public spending, it 547 00:28:07,800 --> 00:28:12,480 Speaker 2: was among like very wealthy people. Obviously the business owners 548 00:28:12,520 --> 00:28:15,200 Speaker 2: in America are going to stage a coup against FDR 549 00:28:15,320 --> 00:28:18,760 Speaker 2: in the early thirties, right, but not among these educated, 550 00:28:18,880 --> 00:28:20,359 Speaker 2: like, these academics and stuff.
551 00:28:20,640 --> 00:28:26,600 Speaker 3: It's controversial, like, controversy, you know, the implication 552 00:28:26,800 --> 00:28:30,520 Speaker 3: is that there are multiple competing views, and it's weird when 553 00:28:30,560 --> 00:28:34,359 Speaker 3: we count the views of the people who directly stand 554 00:28:34,359 --> 00:28:36,439 Speaker 3: to gain, you know, like it's weird to be like, 555 00:28:37,320 --> 00:28:41,200 Speaker 3: it's it's weird to say, like there's controversy amongst like 556 00:28:41,320 --> 00:28:45,000 Speaker 3: the victims and the criminals that did this, Like like 557 00:28:45,040 --> 00:28:48,320 Speaker 3: there's controversy amongst the bank robbers and the bank. 558 00:28:49,440 --> 00:28:51,920 Speaker 2: Crime haters can't agree on robbery. 559 00:28:52,160 --> 00:28:54,360 Speaker 3: Yeah, it's just like, I mean, it's just a group 560 00:28:54,400 --> 00:28:56,600 Speaker 3: of people doing something bad and then a group of 561 00:28:56,600 --> 00:28:59,040 Speaker 3: people saying like, yeah, what you're doing is bad, and 562 00:28:59,080 --> 00:29:01,880 Speaker 3: they're like, I just and oh, there's controversy. 563 00:29:02,000 --> 00:29:04,760 Speaker 2: We'd like to keep doing it. When it comes to 564 00:29:04,800 --> 00:29:07,280 Speaker 2: the term think tank, which again kind of the Russell 565 00:29:07,320 --> 00:29:10,480 Speaker 2: Sage Foundation is one of the first organizations, like, institutions 566 00:29:10,520 --> 00:29:12,840 Speaker 2: you could call a think tank.
They weren't using that 567 00:29:12,960 --> 00:29:16,080 Speaker 2: term back then, and I wanted to look, like try 568 00:29:16,080 --> 00:29:18,400 Speaker 2: to figure out like where the phrase comes from, because, 569 00:29:18,440 --> 00:29:20,560 Speaker 2: as you noted when we started this, it sounds like 570 00:29:20,920 --> 00:29:22,920 Speaker 2: it should be a big tank full of brine with 571 00:29:22,960 --> 00:29:26,000 Speaker 2: a brain in it, and that's actually pretty apropos to 572 00:29:26,040 --> 00:29:28,680 Speaker 2: the history. One historian traces it back to the forties 573 00:29:28,720 --> 00:29:31,200 Speaker 2: as just a slang term for brain, like someone 574 00:29:31,280 --> 00:29:33,560 Speaker 2: like, my think tank's not ticking too well today. 575 00:29:34,400 --> 00:29:37,480 Speaker 3: That's such a good slang for brain. Yeah, but I'm 576 00:29:37,600 --> 00:29:40,560 Speaker 3: kind of mad that they took that away from us. 577 00:29:41,160 --> 00:29:44,200 Speaker 2: We can take it back, Dave, we can take it back. 578 00:29:44,280 --> 00:29:47,000 Speaker 2: Pour some, pour some knowledge into your think tank, folks, 579 00:29:47,040 --> 00:29:50,280 Speaker 2: with this podcast. I've heard another historian trace it 580 00:29:50,280 --> 00:29:52,440 Speaker 2: to World War Two, where it referred to like a 581 00:29:52,480 --> 00:29:54,840 Speaker 2: war room, right, like you get all of your military 582 00:29:54,880 --> 00:29:56,960 Speaker 2: experts together into your think tank and they figure out 583 00:29:57,000 --> 00:30:00,000 Speaker 2: how to do a Normandy. It's like a tank for yeah, 584 00:30:00,280 --> 00:30:03,840 Speaker 2: that too. Yeah, it makes sense.
And it's also possible 585 00:30:03,840 --> 00:30:05,760 Speaker 2: that it comes from a guy named Burton Pines who 586 00:30:05,800 --> 00:30:08,400 Speaker 2: was a historian who wrote about the traditionalist movement in 587 00:30:08,440 --> 00:30:11,960 Speaker 2: the early nineteen eighties, and he used the analogy of gathering 588 00:30:12,000 --> 00:30:16,120 Speaker 2: different fish into a tank and concentrating the brain power, Dave. 589 00:30:16,280 --> 00:30:18,920 Speaker 3: Yeah, I'm gonna go ahead and say that's that's the dumbest, right, 590 00:30:18,960 --> 00:30:20,160 Speaker 3: that the dumbest. 591 00:30:20,440 --> 00:30:22,720 Speaker 2: I wanted to try it because whenever I hear like 592 00:30:22,760 --> 00:30:25,760 Speaker 2: an insane analogy like that, like this is just a 593 00:30:25,800 --> 00:30:29,200 Speaker 2: book about like traditionalists by this historian, and I'm like, 594 00:30:29,440 --> 00:30:31,800 Speaker 2: why the fuck would you use that analogy? I haven't 595 00:30:31,800 --> 00:30:33,320 Speaker 2: been able to get a copy of the book where 596 00:30:33,360 --> 00:30:35,560 Speaker 2: he says this, and it's not like online in a 597 00:30:35,560 --> 00:30:37,480 Speaker 2: way that I was able to find it. So, like, 598 00:30:38,080 --> 00:30:40,120 Speaker 2: I don't know what the context was here. 599 00:30:40,440 --> 00:30:43,520 Speaker 3: Was this person a child? Were they?
Because that's 600 00:30:43,520 --> 00:30:45,040 Speaker 3: the sort of thing, like when you're a child, you 601 00:30:45,040 --> 00:30:47,400 Speaker 3: hear about like earwigs, you assume it's a wig on 602 00:30:47,480 --> 00:30:50,200 Speaker 3: an ear. Like that's that's what a child would think 603 00:30:50,400 --> 00:30:53,400 Speaker 3: when they hear think tank. It's like, oh yeah, like 604 00:30:53,440 --> 00:30:57,120 Speaker 3: a bunch of fish are thinking. You need to track 605 00:30:57,160 --> 00:30:59,040 Speaker 3: down this person, Robert. I do. 606 00:30:59,200 --> 00:31:01,320 Speaker 2: I kind of think maybe he was making fun of 607 00:31:01,360 --> 00:31:03,520 Speaker 2: the traditionalists by being like, they're as dumb as a 608 00:31:03,520 --> 00:31:06,440 Speaker 2: bunch of fish in a tank. That's also an insane 609 00:31:06,480 --> 00:31:09,720 Speaker 2: way to call someone dumb. Like why would you do 610 00:31:09,760 --> 00:31:12,320 Speaker 2: it that way? Just say they're silly, man. 611 00:31:12,680 --> 00:31:14,640 Speaker 3: Yeah, you need to track this person down and find 612 00:31:14,680 --> 00:31:18,200 Speaker 3: their family. Yeah, and have them on the show, and yeah, 613 00:31:18,200 --> 00:31:19,920 Speaker 3: make them answer for themselves. 614 00:31:20,520 --> 00:31:24,160 Speaker 2: See usually, Dave, when I suggest finding someone's family, someone 615 00:31:24,200 --> 00:31:27,320 Speaker 2: then posits something that's a crime. You're the first person 616 00:31:27,360 --> 00:31:28,960 Speaker 2: to just say we should have them on the show. 617 00:31:29,160 --> 00:31:31,560 Speaker 3: I suppose you could do a crime while they're on 618 00:31:31,640 --> 00:31:35,440 Speaker 3: the show. It doesn't have to be a crime against them, but 619 00:31:35,600 --> 00:31:36,960 Speaker 3: like there'll be crimes, you know. 620 00:31:37,120 --> 00:31:40,840 Speaker 2: See, this is why you're a pinch hitter, Dave.
You're 621 00:31:40,880 --> 00:31:42,320 Speaker 2: able to swing left and right. 622 00:31:42,720 --> 00:31:43,200 Speaker 3: Yeah. 623 00:31:43,240 --> 00:31:45,240 Speaker 2: So I don't know what the fuck, why the fuck 624 00:31:45,280 --> 00:31:48,480 Speaker 2: he used this phrase. It's baffling to me. But the 625 00:31:48,520 --> 00:31:51,480 Speaker 2: development of what became known as think tanks, you know, 626 00:31:51,600 --> 00:31:55,360 Speaker 2: really starting in the eighties, primarily happens before we have 627 00:31:55,440 --> 00:31:58,040 Speaker 2: a name for them, and a big like a guy 628 00:31:58,080 --> 00:32:00,680 Speaker 2: who's kind of influential in the development of this concept is 629 00:32:00,720 --> 00:32:04,400 Speaker 2: Frederick Taylor, if you've ever heard of Taylorism. It's this 630 00:32:04,480 --> 00:32:07,280 Speaker 2: thing that in kind of the mid century is going 631 00:32:07,320 --> 00:32:10,480 Speaker 2: to become increasingly common in every different field of endeavor. 632 00:32:10,720 --> 00:32:13,760 Speaker 2: We Taylorize police forces, which is we're like, well, what 633 00:32:13,840 --> 00:32:15,960 Speaker 2: if we standardize police training and what if we try 634 00:32:16,000 --> 00:32:18,040 Speaker 2: to have like metrics for police officers and see if 635 00:32:18,080 --> 00:32:20,240 Speaker 2: we can make them, you know, work better and more. 636 00:32:20,640 --> 00:32:22,200 Speaker 2: And we do the same thing with factories. We're going 637 00:32:22,240 --> 00:32:26,360 Speaker 2: to Taylorize this factory. It's it's scientific management, right. Frederick 638 00:32:26,440 --> 00:32:27,600 Speaker 2: Taylor is that guy. 639 00:32:27,680 --> 00:32:29,080 Speaker 3: You're talking about standardized? 640 00:32:29,240 --> 00:32:33,920 Speaker 2: Absolutely that all is in that same tradition. Yes, yeah, 641 00:32:34,120 --> 00:32:39,239 Speaker 2: it's optimizing organizations and workflow for efficiency.
Taylorism is going 642 00:32:39,280 --> 00:32:41,240 Speaker 2: to be a big deal everywhere. It's where we get 643 00:32:41,360 --> 00:32:44,600 Speaker 2: Deloitte and McKinsey, these consulting firms, right, Like, they all 644 00:32:44,640 --> 00:32:48,320 Speaker 2: have Taylorism in their DNA. And one of Frederick Taylor's 645 00:32:48,360 --> 00:32:52,360 Speaker 2: friends is this guy named Robert Brookings. And does the 646 00:32:52,400 --> 00:32:56,320 Speaker 2: name Brookings sound familiar to you, Dave? It does. Yeah, he's 647 00:32:56,360 --> 00:32:59,520 Speaker 2: this is the Brookings Institute guy, right. And the Brookings 648 00:32:59,520 --> 00:33:02,800 Speaker 2: Institute is from this earlier generation of think tanks that 649 00:33:02,880 --> 00:33:05,600 Speaker 2: actually think about things, right. They're not just going to 650 00:33:05,600 --> 00:33:09,000 Speaker 2: put out stuff to like make one politician or the 651 00:33:09,040 --> 00:33:11,920 Speaker 2: other happy, among other things. The Brookings Institute are the 652 00:33:11,960 --> 00:33:14,480 Speaker 2: people who like helped put out the Vietnam papers or 653 00:33:14,520 --> 00:33:18,240 Speaker 2: the Pentagon Papers. Like they're the guys Ellsberg is working with. 654 00:33:19,000 --> 00:33:21,400 Speaker 2: And that doesn't make anyone in the government very happy, right, 655 00:33:21,560 --> 00:33:25,120 Speaker 2: Like that's just actually something that needed to get out. 656 00:33:25,280 --> 00:33:28,440 Speaker 2: So we're talking kind of about like the pre evil 657 00:33:28,520 --> 00:33:32,880 Speaker 2: days here, although they're also not purely good either.
Robert 658 00:33:32,960 --> 00:33:35,520 Speaker 2: Brookings had spent time in German colleges in the early 659 00:33:35,600 --> 00:33:38,120 Speaker 2: nineteen hundreds, and like a lot of Americans who did this, 660 00:33:38,200 --> 00:33:40,520 Speaker 2: he came back with a thought that those Germans are 661 00:33:40,520 --> 00:33:43,920 Speaker 2: onto something, right. Surely their mechanistic obsession with pushing the 662 00:33:43,960 --> 00:33:46,840 Speaker 2: limits of efficiency will only lead to good things for Germany. 663 00:33:47,040 --> 00:33:52,360 Speaker 2: It's nineteen thirteen, and I'm very optimistic, like. 664 00:33:52,320 --> 00:33:55,760 Speaker 3: Those Germans are doing something over there. Exactly. 665 00:33:58,240 --> 00:34:01,280 Speaker 2: So Brookings, unlike most Americans who come back from 666 00:34:01,320 --> 00:34:04,880 Speaker 2: Germany with ideas, he just gets into the dry goods business, 667 00:34:05,080 --> 00:34:09,279 Speaker 2: and he does well enough in dry goods that he's 668 00:34:09,360 --> 00:34:11,360 Speaker 2: he becomes very rich, and he gets a position on 669 00:34:11,440 --> 00:34:14,680 Speaker 2: the War Industries Board in World War One, which ironically 670 00:34:14,760 --> 00:34:18,960 Speaker 2: puts him directly opposite Germany. The whole process of 671 00:34:19,200 --> 00:34:22,160 Speaker 2: doing a world war and of advising the government through 672 00:34:22,200 --> 00:34:24,520 Speaker 2: a world war, because we had never had to mobilize 673 00:34:25,040 --> 00:34:27,279 Speaker 2: the military. The closest thing was like the Civil War. 674 00:34:27,360 --> 00:34:29,240 Speaker 2: But you know, that had been quite a while before.
675 00:34:30,160 --> 00:34:32,440 Speaker 2: So this is like a big deal, and it convinces 676 00:34:32,520 --> 00:34:35,600 Speaker 2: him that like having the government bring in experts like 677 00:34:35,760 --> 00:34:38,879 Speaker 2: me really helped the process of doing World War One, 678 00:34:38,960 --> 00:34:41,799 Speaker 2: So maybe we should make that normal across the board. Right, 679 00:34:42,160 --> 00:34:44,520 Speaker 2: Maybe when the government, when 680 00:34:44,560 --> 00:34:47,520 Speaker 2: Congress is voting on what the national budget should look like, 681 00:34:47,560 --> 00:34:51,280 Speaker 2: they should talk to experts, right. And so he decides, 682 00:34:51,360 --> 00:34:53,319 Speaker 2: I'm going to put a bunch of my money, this 683 00:34:53,440 --> 00:34:56,600 Speaker 2: wealth that I've got, into building an institution that can 684 00:34:56,640 --> 00:34:59,520 Speaker 2: provide the government with the best possible information so it 685 00:34:59,520 --> 00:35:00,720 Speaker 2: can make better choices. 686 00:35:00,960 --> 00:35:01,120 Speaker 3: Right. 687 00:35:01,719 --> 00:35:04,239 Speaker 2: That's his dream. And that's where we get the 688 00:35:04,239 --> 00:35:06,279 Speaker 2: Brookings Institute. And he gets a bunch of his rich 689 00:35:06,320 --> 00:35:09,160 Speaker 2: friends together and he's like, put some money into this thing, 690 00:35:09,239 --> 00:35:11,879 Speaker 2: and it's important. This is going to be basically the 691 00:35:11,920 --> 00:35:15,000 Speaker 2: same way think tanks work in the modern era, where 692 00:35:15,000 --> 00:35:16,560 Speaker 2: you get all of these think tanks that are paid 693 00:35:16,560 --> 00:35:19,520 Speaker 2: by a company. Hey, we're ExxonMobil. We want you 694 00:35:19,600 --> 00:35:22,080 Speaker 2: to put out a bunch of research saying that gasoline 695 00:35:22,120 --> 00:35:26,359 Speaker 2: is great for the environment.
Right, The Brookings Institute looks 696 00:35:26,400 --> 00:35:28,840 Speaker 2: like the same thing. It's not quite, because at this 697 00:35:28,920 --> 00:35:31,760 Speaker 2: point there is not really the expectation that by putting 698 00:35:31,800 --> 00:35:34,960 Speaker 2: money into this thing, it will only publish information you 699 00:35:35,000 --> 00:35:38,080 Speaker 2: want it to publish. That's not really the case yet, 700 00:35:38,320 --> 00:35:40,399 Speaker 2: and people are going to get pissed about this as 701 00:35:40,480 --> 00:35:43,160 Speaker 2: time goes on. But yeah, yeah, it. 702 00:35:43,360 --> 00:35:46,120 Speaker 3: Seems like, sorry, this seems like a good idea on paper, right, 703 00:35:46,480 --> 00:35:48,839 Speaker 3: and you can immediately kind of see like what if 704 00:35:48,840 --> 00:35:52,640 Speaker 3: we do this like direct channel to the government through experts, 705 00:35:52,719 --> 00:35:56,399 Speaker 3: like that won't get corrupt or like yeah go bad, right, Like, yeah, 706 00:35:56,480 --> 00:35:59,400 Speaker 3: they'll accept the information we give them, even if the information 707 00:35:59,520 --> 00:36:03,320 Speaker 3: is like, let's not do a war, like they'll listen 708 00:36:03,360 --> 00:36:03,759 Speaker 3: to us. 709 00:36:04,040 --> 00:36:05,799 Speaker 2: We keep going back to Star Trek. It's kind of 710 00:36:05,800 --> 00:36:08,279 Speaker 2: like the Borg, right, where you shoot at them once 711 00:36:08,320 --> 00:36:10,160 Speaker 2: and you can fuck them up, but the second time 712 00:36:10,200 --> 00:36:13,560 Speaker 2: their shields will have modulated. So for a little while, 713 00:36:13,840 --> 00:36:16,600 Speaker 2: the Brookings Institute actually works the way it's supposed to. 714 00:36:16,840 --> 00:36:19,759 Speaker 2: Rich people fund it and it puts out information, and 715 00:36:19,800 --> 00:36:23,160 Speaker 2: it's usually just information based on like what these experts.
716 00:36:23,160 --> 00:36:25,200 Speaker 2: They're not perfect, they make mistakes, but it's what they 717 00:36:25,280 --> 00:36:28,640 Speaker 2: actually think is best in these different fields of endeavor. Right, 718 00:36:29,400 --> 00:36:34,279 Speaker 2: it's actual, like, attempts at analyzing policy and impact. And 719 00:36:34,320 --> 00:36:36,520 Speaker 2: again, this is not always good. For example, 720 00:36:36,560 --> 00:36:39,040 Speaker 2: the Brookings Institute is going to oppose much 721 00:36:39,040 --> 00:36:42,080 Speaker 2: of the New Deal because they're really market driven, and 722 00:36:42,200 --> 00:36:45,520 Speaker 2: Roosevelt is saying, we need more central planning. 723 00:36:45,680 --> 00:36:46,640 Speaker 3: Right. Oh interesting. 724 00:36:46,760 --> 00:36:49,359 Speaker 2: Yeah, but by the time Nixon's in office, you know, 725 00:36:49,440 --> 00:36:52,719 Speaker 2: they are seen and derided a lot by Republicans as 726 00:36:53,000 --> 00:36:55,839 Speaker 2: the liberal think tank. Right. And again, by the time 727 00:36:55,920 --> 00:36:58,239 Speaker 2: Nixon's in office, these are the guys who published the 728 00:36:58,280 --> 00:37:01,080 Speaker 2: Pentagon Papers. So like they are this symbol of like 729 00:37:01,239 --> 00:37:05,839 Speaker 2: liberal, progressive, like, ethos by the seventies. And the fact 730 00:37:05,880 --> 00:37:08,000 Speaker 2: that they switch like this is not because like there's 731 00:37:08,040 --> 00:37:11,360 Speaker 2: much of an ideological capture. It's that the 732 00:37:11,360 --> 00:37:13,520 Speaker 2: people here are actually generally trying to do what 733 00:37:13,560 --> 00:37:16,799 Speaker 2: they think is right. Again, that doesn't mean they're always right, 734 00:37:16,840 --> 00:37:20,040 Speaker 2: but like they are actually trying to analyze policy here.
735 00:37:20,440 --> 00:37:24,880 Speaker 3: For sure, there's always that, like, I mean, when people, 736 00:37:25,160 --> 00:37:28,200 Speaker 3: you know, there's been a big push about like, you know, 737 00:37:28,960 --> 00:37:31,799 Speaker 3: colleges and schools and being liberal and stuff, where it's 738 00:37:31,840 --> 00:37:34,799 Speaker 3: that weird situation where you're like defending this thing, but 739 00:37:34,880 --> 00:37:38,120 Speaker 3: you're like, I'm not saying it doesn't have problems. Yeah, 740 00:37:38,160 --> 00:37:41,000 Speaker 3: it's that weird in between where it's like, yes, it's 741 00:37:41,120 --> 00:37:44,560 Speaker 3: more just that the criticism goes so over the top 742 00:37:44,960 --> 00:37:47,560 Speaker 3: that you're like, no, they can be criticized, just 743 00:37:47,560 --> 00:37:49,759 Speaker 3: not for this shit you're saying. Jesus. 744 00:37:50,120 --> 00:37:51,920 Speaker 2: Yeah, it's like when we talk about like, oh, the 745 00:37:51,960 --> 00:37:54,480 Speaker 2: good old days when progressives were ascendant, and it's like, well, 746 00:37:54,800 --> 00:37:57,520 Speaker 2: people called Woodrow Wilson a progressive and he was like 747 00:37:57,640 --> 00:38:01,120 Speaker 2: maybe our most racist president. Yeah, we had presidents who 748 00:38:01,160 --> 00:38:04,080 Speaker 2: were slave owners, and Wilson might have been more racist, 749 00:38:04,440 --> 00:38:06,080 Speaker 2: Like he was a terrible man. 750 00:38:06,520 --> 00:38:09,760 Speaker 3: There's a lot of nuance that gets lost in these conversations. 751 00:38:09,840 --> 00:38:11,960 Speaker 2: Yeah. Yeah, I'm not trying to paper over like, oh, 752 00:38:12,080 --> 00:38:14,040 Speaker 2: to go back to the good old days of the 753 00:38:14,080 --> 00:38:21,239 Speaker 2: progressive fifties. Yeah. Yeah.
Anyway. So it's 754 00:38:21,280 --> 00:38:25,360 Speaker 2: one of those things where, like, after 755 00:38:25,400 --> 00:38:27,960 Speaker 2: World War Two in particular, like, the Brookings Institute is 756 00:38:28,000 --> 00:38:29,680 Speaker 2: kind of a little bit more what we might call 757 00:38:29,880 --> 00:38:33,560 Speaker 2: conservative prior to World War Two. In the postwar years, 758 00:38:34,120 --> 00:38:37,000 Speaker 2: like half of the world's money, literally half of all 759 00:38:37,040 --> 00:38:39,439 Speaker 2: of the wealth in the world, is in the United States, right, 760 00:38:39,800 --> 00:38:42,840 Speaker 2: And so for a period of time, debates over stuff 761 00:38:42,880 --> 00:38:45,439 Speaker 2: like the budget and fiscal policy just aren't that heated. Part of why 762 00:38:45,480 --> 00:38:48,680 Speaker 2: Eisenhower is able to do these massive public works projects, 763 00:38:48,840 --> 00:38:53,160 Speaker 2: part of why there's not that much organized resistance to 764 00:38:53,200 --> 00:38:55,680 Speaker 2: the idea of public funding, is that we have all 765 00:38:55,719 --> 00:38:58,360 Speaker 2: of the money there's ever been, right, and so, like, 766 00:38:58,440 --> 00:39:01,120 Speaker 2: you don't get a lot of people worrying over 767 00:39:01,160 --> 00:39:03,960 Speaker 2: the budget so much in the fifties. You know, one 768 00:39:04,040 --> 00:39:06,080 Speaker 2: article I found from The Atlantic in nineteen eighty six 769 00:39:06,120 --> 00:39:09,439 Speaker 2: by Greg Easterbrook describes the economy in like the mid 770 00:39:09,480 --> 00:39:14,200 Speaker 2: century as a dead issue, and writes, social justice and Vietnam, 771 00:39:14,320 --> 00:39:17,240 Speaker 2: and this is by the sixties, dominated the agenda.
Brookings 772 00:39:17,280 --> 00:39:20,040 Speaker 2: concentrated on those fields, emerging as a chief source of 773 00:39:20,160 --> 00:39:22,520 Speaker 2: arguments in favor of the Great Society and opposed to 774 00:39:22,640 --> 00:39:25,759 Speaker 2: US involvement in Vietnam. In the Washington swirl, where few 775 00:39:25,760 --> 00:39:28,160 Speaker 2: people have the time to actually read the reports they debate, 776 00:39:28,440 --> 00:39:32,240 Speaker 2: respectability is often proportional to tonnage. The more studies someone 777 00:39:32,280 --> 00:39:34,160 Speaker 2: tosses on the table, the more likely he is to 778 00:39:34,160 --> 00:39:37,600 Speaker 2: win his point. For years, Brookings held a monopoly on tonnage. 779 00:39:37,680 --> 00:39:41,960 Speaker 2: Its papers supporting liberal positions went unchallenged by serious conservative rebuttals. 780 00:39:42,480 --> 00:39:44,720 Speaker 2: And so that's part of why there's not much counterweight 781 00:39:45,160 --> 00:39:48,720 Speaker 2: to like a lot of these liberal ideas about 782 00:39:48,760 --> 00:39:50,799 Speaker 2: like how money should be spent, is that the only 783 00:39:50,840 --> 00:39:54,080 Speaker 2: people researching them is the Brookings Institute. So if you're 784 00:39:54,080 --> 00:39:55,880 Speaker 2: going to like look at, well, what's the evidence on 785 00:39:55,920 --> 00:39:58,800 Speaker 2: whether or not we should, you know, engage in this policy, 786 00:39:58,800 --> 00:40:00,719 Speaker 2: where you've got like three hundred pages of shit from 787 00:40:00,719 --> 00:40:04,080 Speaker 2: the Brookings Institute and nothing else, so I 788 00:40:04,080 --> 00:40:07,480 Speaker 2: guess it's easier to make that case, right. Now, 789 00:40:07,520 --> 00:40:09,880 Speaker 2: this is going to start to change by the period 790 00:40:09,880 --> 00:40:12,239 Speaker 2: of the Nixon administration.
And part of what changes it 791 00:40:12,280 --> 00:40:15,879 Speaker 2: is that this first wave of think tanks and expert advisors, 792 00:40:16,440 --> 00:40:19,520 Speaker 2: a decent chunk of them, had been intimately involved in 793 00:40:19,680 --> 00:40:22,440 Speaker 2: like convincing the government that it was the right time, 794 00:40:22,640 --> 00:40:24,680 Speaker 2: that Vietnam was a good thing to get into, right, 795 00:40:24,760 --> 00:40:26,719 Speaker 2: this was going to go well. And these are not 796 00:40:26,880 --> 00:40:30,480 Speaker 2: Brookings Institute people, again, they are pretty 797 00:40:30,560 --> 00:40:34,120 Speaker 2: consistently like, this is a dumb idea, we're like fucking 798 00:40:34,120 --> 00:40:37,239 Speaker 2: ourselves over by sending troops to this war. But it's 799 00:40:37,280 --> 00:40:39,640 Speaker 2: another think tank that's going to be heavily involved in 800 00:40:39,719 --> 00:40:41,880 Speaker 2: US getting into Vietnam, and that think tank is called 801 00:40:42,200 --> 00:40:46,040 Speaker 2: the RAND Corporation. You've heard of these guys, these fun, 802 00:40:46,400 --> 00:40:47,280 Speaker 2: these cool guys. 803 00:40:48,239 --> 00:40:49,960 Speaker 3: Were they right? Were they right about Vietnam? 804 00:40:50,239 --> 00:40:50,439 Speaker 2: Yeah? 805 00:40:51,160 --> 00:40:51,359 Speaker 3: Really? 806 00:40:51,440 --> 00:40:53,800 Speaker 2: Well, yeah, I just read Biden just signed a treaty 807 00:40:53,800 --> 00:40:56,000 Speaker 2: with Vietnam, so it must have gone well. Yeah, 808 00:40:56,040 --> 00:40:58,200 Speaker 2: this is the last piece of history 809 00:40:58,200 --> 00:41:01,319 Speaker 2: about Vietnam I read until a week ago. So it 810 00:41:01,320 --> 00:41:01,880 Speaker 2: seems like it. 811 00:41:01,880 --> 00:41:03,680 Speaker 3: all went well in the end, it all went well.
812 00:41:03,880 --> 00:41:05,799 Speaker 2: Why would we be at peace with them otherwise? 813 00:41:05,920 --> 00:41:06,680 Speaker 3: Yeah, exactly. 814 00:41:08,120 --> 00:41:10,880 Speaker 2: So the RAND Corporation is founded right after the big 815 00:41:11,000 --> 00:41:14,080 Speaker 2: dub dub deuce, and they're like a defense industry think 816 00:41:14,160 --> 00:41:17,640 Speaker 2: tank, and their initial obsession, they come out of this period: 817 00:41:17,640 --> 00:41:21,680 Speaker 2: we've just nuked Japan, the Russians shortly thereafter get their 818 00:41:21,719 --> 00:41:24,040 Speaker 2: own nukes, and we're like, we should probably have some 819 00:41:24,080 --> 00:41:26,920 Speaker 2: smart people figuring out what might happen as a result 820 00:41:27,000 --> 00:41:31,200 Speaker 2: of the fact that we all now have these weapons. Right, Yeah, 821 00:41:31,280 --> 00:41:34,320 Speaker 2: that's a good idea. It's not a bad idea. 822 00:41:34,400 --> 00:41:37,200 Speaker 3: Right, we all have doomsday devices. Maybe get some of 823 00:41:37,200 --> 00:41:38,200 Speaker 3: them brainiacs to. 824 00:41:38,680 --> 00:41:40,880 Speaker 2: think about this. Is this a good idea? 825 00:41:41,080 --> 00:41:42,959 Speaker 3: And what's gonna happen is the brainiacs are gonna 826 00:41:42,960 --> 00:41:46,080 Speaker 3: be like, no, everything's bad, and they're like, cool, thanks. 827 00:41:46,360 --> 00:41:47,959 Speaker 3: I mean, we're not gonna do it, we're just gonna 828 00:41:48,040 --> 00:41:48,840 Speaker 3: keep them, but thanks.
829 00:41:48,880 --> 00:41:51,840 Speaker 2: Yeah, that is, if they'd had... This is kind of, 830 00:41:51,880 --> 00:41:55,359 Speaker 2: the RAND Corporation is kind of the first really influential 831 00:41:55,400 --> 00:41:57,879 Speaker 2: think tank that is funded by a defense agency, because 832 00:41:58,040 --> 00:42:00,799 Speaker 2: they're funded by the US Air Force, so they are 833 00:42:00,840 --> 00:42:03,120 Speaker 2: not in fact going to be like, perhaps we shouldn't 834 00:42:03,120 --> 00:42:04,400 Speaker 2: have these things. You know. 835 00:42:04,400 --> 00:42:06,839 Speaker 3: What this is all reminding me of is when you 836 00:42:06,880 --> 00:42:09,000 Speaker 3: go out with like friends, I'm sure you can relate, 837 00:42:09,480 --> 00:42:12,640 Speaker 3: and you're making a series of decisions. You're drinking or 838 00:42:12,640 --> 00:42:15,239 Speaker 3: you're doing whatever, and you turn to your friend and 839 00:42:15,280 --> 00:42:17,320 Speaker 3: go like, I can have more drinks, right, and the 840 00:42:17,320 --> 00:42:18,680 Speaker 3: friend goes, oh yeah, yeah, yeah. 841 00:42:18,760 --> 00:42:20,840 Speaker 2: Yeah. Well, you're like, good. 842 00:42:20,640 --> 00:42:23,799 Speaker 3: You're looking to someone to just justify the bad decision you 843 00:42:23,840 --> 00:42:26,520 Speaker 3: know you're gonna make. Yeah, And that's what friends are for. 844 00:42:26,760 --> 00:42:29,279 Speaker 3: And these think tanks are like, they're good friends now. 845 00:42:29,360 --> 00:42:29,600 Speaker 2: Yeah. 846 00:42:29,719 --> 00:42:31,279 Speaker 3: They go to them and go like, it's fine that 847 00:42:31,320 --> 00:42:33,600 Speaker 3: we're doing this, right, and the think tank is like, yeah, 848 00:42:33,600 --> 00:42:36,080 Speaker 3: it's fine, man, the experts say it's fine.
849 00:42:36,640 --> 00:42:40,239 Speaker 2: Escalating our involvement in Vietnam is like me saying last night, yeah, 850 00:42:40,239 --> 00:42:42,080 Speaker 2: I can get up at ten am to do this podcast. 851 00:42:42,360 --> 00:42:43,160 Speaker 3: Yeah, why not? 852 00:42:43,640 --> 00:42:44,319 Speaker 2: And you did? 853 00:42:44,640 --> 00:42:47,240 Speaker 3: You did it, So Vietnam was fine. 854 00:42:47,520 --> 00:42:50,520 Speaker 2: I have felt like this is my Vietnam since about 855 00:42:50,520 --> 00:42:51,840 Speaker 2: when I ran out of coffee. 856 00:42:52,320 --> 00:42:53,560 Speaker 3: Yeah. 857 00:42:53,600 --> 00:42:56,680 Speaker 2: So the initial obsession of the RAND Corporation is thinking 858 00:42:56,760 --> 00:42:59,960 Speaker 2: the unthinkable. That's kind of like their unofficial tagline, right, 859 00:43:00,600 --> 00:43:04,680 Speaker 2: which is planning for a nuclear war, right? The idea here, 860 00:43:04,680 --> 00:43:07,879 Speaker 2: and it's interesting, like they're sort of right near 861 00:43:07,920 --> 00:43:09,759 Speaker 2: where we used to work, Dave. They're kind of in 862 00:43:09,840 --> 00:43:12,120 Speaker 2: between like Santa Monica and Hollywood. 863 00:43:12,160 --> 00:43:13,680 Speaker 3: Oh, we should have went and seen them. 864 00:43:13,800 --> 00:43:15,759 Speaker 2: We should have gone. I don't know if they're still there, 865 00:43:15,800 --> 00:43:18,239 Speaker 2: but they were for a time. In fact, they 866 00:43:18,280 --> 00:43:21,719 Speaker 2: were across the street from Mary Pickford's beach house. And 867 00:43:21,920 --> 00:43:24,040 Speaker 2: one description I've read from the time said that they 868 00:43:24,120 --> 00:43:27,560 Speaker 2: quote did little but sit, think, talk, write, pass around memos, 869 00:43:27,600 --> 00:43:31,640 Speaker 2: and dream up new ideas about nuclear war, which sounds 870 00:43:31,640 --> 00:43:32,600 Speaker 2: like a fun life.
871 00:43:33,080 --> 00:43:33,959 Speaker 3: It kind of does. 872 00:43:34,680 --> 00:43:38,520 Speaker 2: Yeah, I could do that. Bring me on, guys. 873 00:43:40,040 --> 00:43:43,000 Speaker 2: And again, like everything here, it's easy to see how 874 00:43:43,040 --> 00:43:45,680 Speaker 2: it's like, well, yeah, that could be a good idea, right, 875 00:43:45,680 --> 00:43:48,600 Speaker 2: you probably want someone thinking about this for a living. 876 00:43:48,840 --> 00:43:52,319 Speaker 3: It's, yeah, there's this romantic idea of like the quote 877 00:43:52,400 --> 00:43:55,560 Speaker 3: unquote like scholars, right, this idea of like, Okay, there's 878 00:43:55,600 --> 00:43:57,719 Speaker 3: so many humans and we have so much money, and 879 00:43:57,760 --> 00:44:00,680 Speaker 3: we've built this civilization. People don't have to worry about 880 00:44:00,719 --> 00:44:02,960 Speaker 3: just how to survive day to day. Yeah, what if 881 00:44:02,960 --> 00:44:06,000 Speaker 3: we have a building where just a group of people 882 00:44:06,239 --> 00:44:09,799 Speaker 3: get together and think about stuff and just like 883 00:44:09,960 --> 00:44:12,439 Speaker 3: read all the books, and like then when we need 884 00:44:12,719 --> 00:44:14,480 Speaker 3: an expert, we go to them and go, like, what 885 00:44:14,520 --> 00:44:16,120 Speaker 3: do you think? 886 00:44:15,960 --> 00:44:18,400 Speaker 2: If we put all of the smartest boys in a room? 887 00:44:18,840 --> 00:44:19,080 Speaker 1: Yeah? 888 00:44:20,080 --> 00:44:20,279 Speaker 2: Work. 889 00:44:20,600 --> 00:44:23,960 Speaker 3: It's a romantic idea. It's just that people are always 890 00:44:24,000 --> 00:44:25,040 Speaker 3: going to be people too.
891 00:44:25,280 --> 00:44:28,200 Speaker 2: Yeah, and that's the problem, Dave. Number one, they're funded 892 00:44:28,200 --> 00:44:30,520 Speaker 2: by the Air Force, and number two, the kind of 893 00:44:30,560 --> 00:44:33,600 Speaker 2: guys who are both interested in and able to get 894 00:44:33,640 --> 00:44:36,640 Speaker 2: a job thinking about nuclear war all day, they're the 895 00:44:36,719 --> 00:44:39,879 Speaker 2: kind of people today we would be cordoning off 896 00:44:40,440 --> 00:44:43,000 Speaker 2: from the rest of society by getting them into Warhammer 897 00:44:43,040 --> 00:44:44,239 Speaker 2: forty thousand, right, right. 898 00:44:44,760 --> 00:44:47,960 Speaker 3: By the way, instead of expert or enthusiast, I often 899 00:44:48,000 --> 00:44:51,359 Speaker 3: say pervert, because I think it really breaks it down, which 900 00:44:51,400 --> 00:44:54,640 Speaker 3: is like, they're a weird little sick freak who's into this. Yeah, 901 00:44:54,840 --> 00:44:58,319 Speaker 3: and there's nothing wrong with that. You just 902 00:44:58,320 --> 00:44:59,720 Speaker 3: have to remember that, right. 903 00:45:00,440 --> 00:45:03,120 Speaker 2: Yeah. I love that idea, Dave, because think of how 904 00:45:03,200 --> 00:45:06,640 Speaker 2: different decades of like nuclear weapons policy would be if 905 00:45:06,680 --> 00:45:08,920 Speaker 2: instead of bringing like nuclear weapons experts, it was like 906 00:45:09,000 --> 00:45:11,400 Speaker 2: nuclear weapons perverts on CNN. 907 00:45:11,680 --> 00:45:14,520 Speaker 3: Yeah, well, they would make an organization, and, thank you, 908 00:45:14,600 --> 00:45:16,600 Speaker 3: I mean, we're all aware that you're a weird pervert 909 00:45:16,640 --> 00:45:19,000 Speaker 3: about this. Yes, we're going to take it with a grain of salt, 910 00:45:19,280 --> 00:45:20,640 Speaker 3: but thank you for your advice. 911 00:45:20,800 --> 00:45:23,399 Speaker 2: Yeah.
Yeah, So the RAND guy wants to jack off 912 00:45:23,400 --> 00:45:26,640 Speaker 2: on the MORI CBMs. Let's let's now talk to him 913 00:45:26,719 --> 00:45:28,720 Speaker 2: not getting murdered experts. 914 00:45:28,719 --> 00:45:31,960 Speaker 3: Yeah exactly, No, not getting murder and murdered pervert. 915 00:45:31,760 --> 00:45:33,520 Speaker 2: Yeah, perfect pert. You're right here, right. We got to 916 00:45:33,520 --> 00:45:35,920 Speaker 2: be we gotta be consistent, Dave, otherwise we have nothing. 917 00:45:36,440 --> 00:45:40,680 Speaker 2: So the RAND guys are a bunch of their war gamers, right, 918 00:45:40,920 --> 00:45:42,920 Speaker 2: like a lot of them literally are. But that that 919 00:45:43,080 --> 00:45:45,399 Speaker 2: is the kind of guy that is drawn to this job. 920 00:45:45,440 --> 00:45:49,520 Speaker 2: These are like game theory nerds, yeah exactly, And they 921 00:45:49,520 --> 00:45:51,239 Speaker 2: didn't have anything better to do in the fifties and 922 00:45:51,280 --> 00:45:53,680 Speaker 2: sixties because there weren't many good war games then give 923 00:45:53,719 --> 00:45:58,319 Speaker 2: the president bad advice about Vietnam. Secretary of Defense and 924 00:45:58,440 --> 00:46:01,120 Speaker 2: a list war criminal Robert mac namara got most of 925 00:46:01,160 --> 00:46:04,360 Speaker 2: his top aids from the RAND Corporation, and they provided 926 00:46:04,400 --> 00:46:08,279 Speaker 2: more recommendations on US policy in Vietnam than any other organization. 927 00:46:08,640 --> 00:46:11,920 Speaker 2: In fact, RAND reports were critical in influencing every stage 928 00:46:11,960 --> 00:46:14,080 Speaker 2: of the war in Vietnam. And this is where we 929 00:46:14,120 --> 00:46:17,560 Speaker 2: really get into the think tank evil. 
By the mid sixties, 930 00:46:17,719 --> 00:46:21,040 Speaker 2: LBJ was trying to decide, should we escalate US involvement 931 00:46:21,120 --> 00:46:23,880 Speaker 2: in the war or not. At this point, like sixty 932 00:46:24,000 --> 00:46:26,920 Speaker 2: four or something like that, the US could have backed 933 00:46:26,920 --> 00:46:30,480 Speaker 2: out, right, we could have left. We could have said, like, sorry, 934 00:46:30,560 --> 00:46:33,120 Speaker 2: this regime is bad, we made a mistake, let's go. 935 00:46:33,480 --> 00:46:35,920 Speaker 2: And North Vietnam probably would have been like, hey, that's great. 936 00:46:36,000 --> 00:46:39,920 Speaker 2: That's all we ever wanted. Like, we're good. No, 937 00:46:39,920 --> 00:46:43,720 Speaker 2: no more conflict is necessary. This is not what happens, 938 00:46:43,760 --> 00:46:47,120 Speaker 2: because a researcher named Leon Gouré, who was a project 939 00:46:47,160 --> 00:46:51,280 Speaker 2: head at RAND, published several papers on morale among North Vietnamese 940 00:46:51,280 --> 00:46:54,800 Speaker 2: soldiers and civilians. He spends months traveling around the country 941 00:46:54,840 --> 00:46:59,520 Speaker 2: talking to captured North Vietnamese soldiers, Viet Cong militants who 942 00:46:59,520 --> 00:47:02,880 Speaker 2: had been like captured and stuff during like actions, and 943 00:47:03,040 --> 00:47:05,319 Speaker 2: also just a bunch of like villagers living kind 944 00:47:05,320 --> 00:47:07,759 Speaker 2: of around the line of contact at the time. And 945 00:47:07,800 --> 00:47:11,520 Speaker 2: his conclusion based on these interviews is that Vietnamese morale 946 00:47:11,719 --> 00:47:14,239 Speaker 2: is right around the corner from collapsing. If we just 947 00:47:14,280 --> 00:47:17,759 Speaker 2: stick it out a couple more months, victory is inevitable.
948 00:47:17,960 --> 00:47:22,160 Speaker 2: That we've got them on the ropes now. No, wait, 949 00:47:23,640 --> 00:47:24,240 Speaker 2: he talked. 950 00:47:24,080 --> 00:47:26,200 Speaker 3: To prisoners, and was like, I talked to all these 951 00:47:26,200 --> 00:47:30,000 Speaker 3: people that were captured, and I predict their morale is low. 952 00:47:30,320 --> 00:47:32,080 Speaker 2: They're really sad. I think we're winning. 953 00:47:32,960 --> 00:47:35,160 Speaker 3: I talked to all the people who would be sad. 954 00:47:35,280 --> 00:47:36,960 Speaker 3: That's ridiculous. Of course not. 955 00:47:37,360 --> 00:47:39,640 Speaker 2: Yeah, if you talk to the British 956 00:47:39,680 --> 00:47:42,840 Speaker 2: people who were captured in nineteen forty, yeah, they're probably 957 00:47:42,880 --> 00:47:45,040 Speaker 2: pretty bummed about the way the war's going. 958 00:47:45,160 --> 00:47:47,920 Speaker 3: And this is the thing about experts, or sorry, perverts, 959 00:47:48,200 --> 00:47:51,439 Speaker 3: which is that like, it's so tunnel vision, right, where 960 00:47:51,480 --> 00:47:54,360 Speaker 3: it's like, it's also academic, and a lot of the 961 00:47:54,400 --> 00:47:57,040 Speaker 3: time the big thing they're missing is like the big 962 00:47:57,160 --> 00:48:01,600 Speaker 3: obvious thing, or the human factor, and that happens, 963 00:48:01,920 --> 00:48:04,839 Speaker 3: you know. That's, like, I think a legit criticism of 964 00:48:05,000 --> 00:48:07,120 Speaker 3: like colleges and shit like that. 965 00:48:07,239 --> 00:48:09,439 Speaker 2: Yes, and it's just a legit criticism of the way 966 00:48:09,520 --> 00:48:12,080 Speaker 2: human beings analyze. Like, there's a famous study of like 967 00:48:12,120 --> 00:48:14,080 Speaker 2: World War Two. They're looking at like all of their 968 00:48:14,120 --> 00:48:16,720 Speaker 2: planes that get damaged, like, where are they getting damaged?
969 00:48:16,800 --> 00:48:18,680 Speaker 2: Those are the parts of the planes we should reinforce. 970 00:48:18,719 --> 00:48:20,440 Speaker 2: You can see that diagram that's got, like, here are 971 00:48:20,440 --> 00:48:23,360 Speaker 2: all the areas that planes are most often damaged in. 972 00:48:23,440 --> 00:48:26,200 Speaker 2: And then they like realized later, like, well, but we're 973 00:48:26,200 --> 00:48:28,400 Speaker 2: just counting the planes that actually made it back. So 974 00:48:28,520 --> 00:48:31,200 Speaker 2: actually, that's not the damage to 975 00:48:31,320 --> 00:48:33,960 Speaker 2: focus on. It's the planes that went down we should 976 00:48:34,000 --> 00:48:35,040 Speaker 2: be paying attention to. 977 00:48:35,440 --> 00:48:37,000 Speaker 3: That's such a good example to me. 978 00:48:37,480 --> 00:48:40,680 Speaker 2: Yeah, that's what's going to happen with all of Vietnam. 979 00:48:40,760 --> 00:48:43,840 Speaker 2: And I'm going to quote from a report in Ramparts magazine, 980 00:48:44,040 --> 00:48:46,520 Speaker 2: which is a thing that no longer exists, but was 981 00:48:46,600 --> 00:48:49,320 Speaker 2: fucking dope for a very long time. And this is 982 00:48:49,360 --> 00:48:53,200 Speaker 2: their report on the RAND Corporation in Vietnam. It was 983 00:48:53,280 --> 00:48:56,600 Speaker 2: Gouré's analyses that provided the scientific underpinning for the light 984 00:48:56,680 --> 00:48:58,640 Speaker 2: at the end of the tunnel mentality that were so 985 00:48:58,760 --> 00:49:01,200 Speaker 2: crucial to the escalation of the war and the devastation 986 00:49:01,280 --> 00:49:04,440 Speaker 2: that followed. In nineteen sixty six, his work was identified 987 00:49:04,480 --> 00:49:06,680 Speaker 2: by Carl Rowan as the study which lies at the 988 00:49:06,719 --> 00:49:10,400 Speaker 2: heart of President Johnson's strategy.
The implications of the Gouré 989 00:49:10,480 --> 00:49:13,680 Speaker 2: study are profound, for they indicate yet another aspect of 990 00:49:13,719 --> 00:49:16,680 Speaker 2: the erosion of the democratic decision-making process that has attended 991 00:49:16,719 --> 00:49:19,560 Speaker 2: every phase of the present conflict. For both the RAND 992 00:49:19,640 --> 00:49:22,960 Speaker 2: interviews of the NLF cadre, the most complete portrait available 993 00:49:22,960 --> 00:49:25,040 Speaker 2: of the other side in this war, and the reports 994 00:49:25,080 --> 00:49:28,279 Speaker 2: from Leon Gouré were classified and kept securely within the 995 00:49:28,360 --> 00:49:32,080 Speaker 2: contract between the war-bent 996 00:49:32,200 --> 00:49:35,680 Speaker 2: executive and the private corporation, and thereby unavailable 997 00:49:35,680 --> 00:49:38,400 Speaker 2: to the American people. In fact, to this day, Gouré's 998 00:49:38,400 --> 00:49:40,640 Speaker 2: reports are unavailable to Congress, and 999 00:49:40,680 --> 00:49:42,719 Speaker 2: you will have to read them in Ramparts. 1000 00:49:43,320 --> 00:49:47,600 Speaker 2: And it's like, so this is by far, like, 1001 00:49:47,960 --> 00:49:50,520 Speaker 2: the best collection of information that we had at that 1002 00:49:50,640 --> 00:49:55,200 Speaker 2: point on North Vietnamese people's thoughts on how the war 1003 00:49:55,320 --> 00:49:57,920 Speaker 2: was going.
And it was entirely filtered through the lens 1004 00:49:58,000 --> 00:50:02,239 Speaker 2: of this very biased man who was being paid, effectively, 1005 00:50:02,320 --> 00:50:04,360 Speaker 2: by the US Air Force, but was in a private 1006 00:50:04,400 --> 00:50:08,040 Speaker 2: corporation, and so none of his work or methodology was open 1007 00:50:08,080 --> 00:50:12,000 Speaker 2: to being scrutinized by the public. And his analysis that 1008 00:50:12,040 --> 00:50:14,960 Speaker 2: we were right around the corner from winning 1009 00:50:15,000 --> 00:50:18,960 Speaker 2: in Vietnam had the biggest impact on LBJ's decision to 1010 00:50:19,080 --> 00:50:21,200 Speaker 2: escalate of any, like, individual factor. 1011 00:50:21,600 --> 00:50:22,200 Speaker 3: That's bleak. 1012 00:50:22,920 --> 00:50:25,319 Speaker 2: It's really bad, right, and you probably shouldn't do it. 1013 00:50:25,320 --> 00:50:28,279 Speaker 3: Like that, right. It's like, according to this expert, who, 1014 00:50:28,320 --> 00:50:31,839 Speaker 3: literally, like, it's asking an expert to look into whether 1015 00:50:31,920 --> 00:50:34,360 Speaker 3: or not they should continue to pay this person, and 1016 00:50:34,480 --> 00:50:37,200 Speaker 3: whether you should have the job, too. Like, it's, that's like, 1017 00:50:37,320 --> 00:50:41,680 Speaker 3: when one of the possibilities of like your study 1018 00:50:41,800 --> 00:50:43,799 Speaker 3: means that you no longer get to do the thing 1019 00:50:43,800 --> 00:50:47,000 Speaker 3: you're doing, you shouldn't get to do that study, right, Yeah. Yeah. 1020 00:50:47,040 --> 00:50:50,040 Speaker 2: It's the same thing with like police departments, like inspecting 1021 00:50:50,040 --> 00:50:54,320 Speaker 2: each other, investigating each other over crimes.
Yeah, probably shouldn't 1022 00:50:54,360 --> 00:50:56,440 Speaker 2: let them, because it is, it is, like, as 1023 00:50:56,520 --> 00:51:00,000 Speaker 2: Ramparts said, it's really anti-democratic, right, because you've got 1024 00:51:00,120 --> 00:51:02,000 Speaker 2: a bunch of people saying, I don't think we should 1025 00:51:02,040 --> 00:51:04,680 Speaker 2: be involved in Vietnam, because why in the fuck are 1026 00:51:04,719 --> 00:51:08,279 Speaker 2: we sending soldiers to Vietnam, and then you've got the 1027 00:51:08,280 --> 00:51:11,040 Speaker 2: president saying, well, look, you're all a bunch of casuals, 1028 00:51:11,080 --> 00:51:13,239 Speaker 2: a bunch of yahoos who don't know anything. I've talked 1029 00:51:13,239 --> 00:51:15,560 Speaker 2: to the experts and they say we're about to win. 1030 00:51:15,960 --> 00:51:19,040 Speaker 3: You know. Yeah, this is, it's such a tricky, this 1031 00:51:19,080 --> 00:51:22,520 Speaker 3: is where nuance comes into play, because there's this idea, like, 1032 00:51:22,719 --> 00:51:26,480 Speaker 3: you know, very recently we had something that happened in 1033 00:51:26,480 --> 00:51:29,160 Speaker 3: this country where a bunch of experts were saying everybody 1034 00:51:29,200 --> 00:51:32,400 Speaker 3: needs to do this thing, and people are like, ah, 1035 00:51:32,440 --> 00:51:34,880 Speaker 3: screw the experts. Yeah, and there was sort of this 1036 00:51:35,280 --> 00:51:38,080 Speaker 3: battle over it, where it's like, sometimes with experts, when 1037 00:51:38,120 --> 00:51:41,719 Speaker 3: it's very black and white, it's just like, yeah, 1038 00:51:42,080 --> 00:51:44,560 Speaker 3: you know, you should probably just do what they say, right? 1039 00:51:45,040 --> 00:51:47,879 Speaker 3: Like, you know, if someone says, you know. 1040 00:51:48,680 --> 00:51:51,200 Speaker 2: Don't lick toilet seats, well, I probably won't.
1041 00:51:51,360 --> 00:51:56,360 Speaker 3: Yeah, it's very tricky, because I understand why over years 1042 00:51:56,360 --> 00:51:58,520 Speaker 3: and years there are people who are like, ah, fuck 1043 00:51:58,560 --> 00:52:02,680 Speaker 3: the experts, because then you get these cases where it's 1044 00:52:02,760 --> 00:52:07,200 Speaker 3: like you're saying, undemocratic, where everybody's sort of saying, like, 1045 00:52:07,239 --> 00:52:09,000 Speaker 3: we should do these, I should have the freedom to 1046 00:52:09,000 --> 00:52:11,359 Speaker 3: do these things, or I believe we should do this, 1047 00:52:11,680 --> 00:52:13,480 Speaker 3: and it's like, no, we're going to consult this handful 1048 00:52:13,520 --> 00:52:17,759 Speaker 3: of people to make the decision. Feels very undemocratic. And 1049 00:52:17,840 --> 00:52:21,000 Speaker 3: for the most part, it's just like, there's no 1050 00:52:21,200 --> 00:52:24,920 Speaker 3: right answer across the board, right, where it's like, I 1051 00:52:25,000 --> 00:52:28,560 Speaker 3: wish we were undemocratic about, like, climate change. Like, I 1052 00:52:28,600 --> 00:52:31,360 Speaker 3: wish governments would just go, listen, this is what we're doing, 1053 00:52:31,719 --> 00:52:36,600 Speaker 3: because otherwise we'll all die. But that's very dangerous thinking 1054 00:52:36,640 --> 00:52:40,440 Speaker 3: across the board. So it really is, like, a case 1055 00:52:40,480 --> 00:52:41,560 Speaker 3: by case situation. 1056 00:52:42,320 --> 00:52:45,319 Speaker 2: Yeah, I mean, the overwhelming lesson of history is that 1057 00:52:45,360 --> 00:52:48,279 Speaker 2: there's actually no good way to do things. Yes, but 1058 00:52:48,880 --> 00:52:51,640 Speaker 2: the right way certainly isn't the one we're doing. I 1059 00:52:51,640 --> 00:52:53,680 Speaker 2: think we can all agree on that.
And you know 1060 00:52:53,719 --> 00:52:59,200 Speaker 2: what else we can all agree on, Dave? What? Products 1061 00:52:59,200 --> 00:53:10,960 Speaker 2: and services? Oh, we're back. Dave, the Rand Corporation. When 1062 00:53:10,960 --> 00:53:14,160 Speaker 2: we left them off, they had just told LBJ: Hey man, 1063 00:53:14,640 --> 00:53:16,239 Speaker 2: just a few more guys, just another one 1064 00:53:16,280 --> 00:53:18,839 Speaker 2: hundred thousand US troops, and it'll win us the war. 1065 00:53:18,920 --> 00:53:23,160 Speaker 2: We're right around the corner, like, North Vietnam's on the ropes. 1066 00:53:23,200 --> 00:53:25,480 Speaker 2: North Vietnam can't hold out much more. They're just 1067 00:53:25,560 --> 00:53:26,960 Speaker 2: about to give up, right. 1068 00:53:27,080 --> 00:53:29,040 Speaker 3: Right, and then we won. Yeah. 1069 00:53:29,080 --> 00:53:31,239 Speaker 2: Yeah, as we all know, nineteen sixty four was the 1070 00:53:31,320 --> 00:53:34,720 Speaker 2: last year of the Vietnam War before our glorious victory, 1071 00:53:35,280 --> 00:53:39,200 Speaker 2: and we started air dropping McDonald's into the jungle. So 1072 00:53:39,920 --> 00:53:42,040 Speaker 2: when you look at, like, the kind of analyses the 1073 00:53:42,120 --> 00:53:45,799 Speaker 2: RAND Corporation was providing the US government, the Johnson administration, 1074 00:53:45,920 --> 00:53:50,960 Speaker 2: during Vietnam, they kind of dispassionately advised this kind of ladder, 1075 00:53:51,640 --> 00:53:54,480 Speaker 2: a ladder of escalation is how it's usually described, right, 1076 00:53:54,520 --> 00:53:56,879 Speaker 2: where, like, if the enemy does this, then you add 1077 00:53:56,920 --> 00:53:58,600 Speaker 2: more troops. If the enemy does this, then 1078 00:53:58,640 --> 00:54:00,960 Speaker 2: you carry out an offensive. If the enemy does this, 1079 00:54:01,000 --> 00:54:03,080 Speaker 2: then you launch another bombing campaign. Right.
1080 00:54:03,600 --> 00:54:04,920 Speaker 3: This was the ladder of escalation. 1081 00:54:05,239 --> 00:54:08,799 Speaker 2: Yes, yes, escalator, an escalator. I don't know if they 1082 00:54:08,840 --> 00:54:10,920 Speaker 2: had them back then. It was the sixties. It was 1083 00:54:10,920 --> 00:54:13,399 Speaker 2: the sixties. We hadn't invented science yet. 1084 00:54:13,600 --> 00:54:13,759 Speaker 3: Right. 1085 00:54:14,080 --> 00:54:16,400 Speaker 2: They were throwing spears out of planes. That's how the 1086 00:54:16,440 --> 00:54:19,080 Speaker 2: Air Force worked. Yeah, and all of this, this, like, 1087 00:54:19,160 --> 00:54:22,520 Speaker 2: ladder of escalation that RAND advises, it's based off of 1088 00:54:22,600 --> 00:54:27,279 Speaker 2: their belief about what Vietnam would do in response to 1089 00:54:27,360 --> 00:54:30,680 Speaker 2: the US ordering a bombing campaign, right, of the North, 1090 00:54:30,680 --> 00:54:34,160 Speaker 2: you know, of their capital, right, and it 1091 00:54:34,239 --> 00:54:36,040 Speaker 2: was based on their understanding that this is what an 1092 00:54:36,080 --> 00:54:40,200 Speaker 2: opposing government would do in this situation. So they're thinking, 1093 00:54:40,560 --> 00:54:44,239 Speaker 2: if our capital was bombed, how would we act? Right? Well, 1094 00:54:44,239 --> 00:54:47,640 Speaker 2: we would probably seek to sue for peace or whatever, because, 1095 00:54:47,680 --> 00:54:50,360 Speaker 2: like, Americans would not put up with the capital being bombed. 1096 00:54:50,719 --> 00:54:52,400 Speaker 2: You know, there would be a real issue for us. 1097 00:54:52,480 --> 00:54:55,680 Speaker 3: Yeah, we'd freak out, we'd call French fries freedom fries. 1098 00:54:55,840 --> 00:54:58,520 Speaker 2: Yes, we would lose our minds. 1099 00:54:58,600 --> 00:55:00,880 Speaker 3: Yeah, things would get very yea. 1100 00:55:00,960 --> 00:55:03,279 Speaker 2: Yeah.
I mean, the idea was that, 1101 00:55:03,360 --> 00:55:06,200 Speaker 2: I think you've actually predicted what happened here, 1102 00:55:06,200 --> 00:55:08,400 Speaker 2: which is that, like, well, when they bombed our capital, 1103 00:55:08,440 --> 00:55:10,719 Speaker 2: we went insane. We didn't seem to. We 1104 00:55:10,760 --> 00:55:13,919 Speaker 2: didn't go for negotiations. But the RAND Corporation is like, well, 1105 00:55:14,160 --> 00:55:17,120 Speaker 2: if we bomb the capital, North Vietnam will want 1106 00:55:17,120 --> 00:55:20,000 Speaker 2: to come to the negotiating table. They'll make concessions to us. 1107 00:55:20,040 --> 00:55:22,759 Speaker 3: Then, right. Yeah, it feels like we were 1108 00:55:22,840 --> 00:55:27,960 Speaker 3: riding high on nukes, which is that, like, I get why, like, well, 1109 00:55:28,080 --> 00:55:30,760 Speaker 3: I mean, Japan had already surrendered, right, I'm pretty sure, 1110 00:55:30,880 --> 00:55:32,439 Speaker 3: or like, no, no. 1111 00:55:31,960 --> 00:55:36,280 Speaker 2: No, I mean, that's a whole separate thing. 1112 00:55:37,640 --> 00:55:40,040 Speaker 3: But, like, it's all to say that, like, nukes was 1113 00:55:40,080 --> 00:55:42,160 Speaker 3: a new thing, and so seeing a nuke and not 1114 00:55:42,239 --> 00:55:45,440 Speaker 3: knowing what it is, I can imagine the government being like, oh, okay, 1115 00:55:45,480 --> 00:55:47,879 Speaker 3: we're done here, you know, like, I don't know what 1116 00:55:47,960 --> 00:55:48,400 Speaker 3: that is. 1117 00:55:48,840 --> 00:55:53,440 Speaker 2: Yeah, because, like, the nukes are such a, you know, 1118 00:55:53,800 --> 00:55:56,480 Speaker 2: we dropped two nukes on Japan and the government sues 1119 00:55:56,520 --> 00:56:00,399 Speaker 2: for peace. Right? The reality of what was going 1120 00:56:00,400 --> 00:56:03,800 Speaker 2: on is a bit more complicated than that.
But because 1121 00:56:03,920 --> 00:56:07,680 Speaker 2: that's, like, kind of the last thing that happens, you 1122 00:56:07,719 --> 00:56:10,120 Speaker 2: do get this attitude that, like, well, we can bomb 1123 00:56:10,120 --> 00:56:13,120 Speaker 2: our problems away with enough bombs, and the reality is that, yeah, 1124 00:56:13,160 --> 00:56:15,120 Speaker 2: we bombed the shit out of Japan prior to that, 1125 00:56:15,160 --> 00:56:17,879 Speaker 2: and they didn't surrender until the nukes, and we bombed 1126 00:56:17,920 --> 00:56:20,400 Speaker 2: the shit out of Germany and they didn't surrender because 1127 00:56:20,400 --> 00:56:24,200 Speaker 2: of the bombing. Right? Bombing doesn't make people surrender, usually, 1128 00:56:24,480 --> 00:56:27,200 Speaker 2: and it doesn't in the case of Vietnam. Right, the 1129 00:56:27,280 --> 00:56:31,799 Speaker 2: RAND Corporation has utterly misread North Vietnam, like, they are 1130 00:56:31,840 --> 00:56:34,480 Speaker 2: not going to respond to the ladder of escalation the 1131 00:56:34,480 --> 00:56:37,600 Speaker 2: way we expect they are. As David Landau wrote in Ramparts: 1132 00:56:38,280 --> 00:56:40,320 Speaker 2: Underlying almost all of Rand's work on the war in 1133 00:56:40,360 --> 00:56:42,880 Speaker 2: the late sixties and early seventies was the unquestioned assumption 1134 00:56:42,920 --> 00:56:45,439 Speaker 2: that the enemy in Vietnam would behave just like any 1135 00:56:45,440 --> 00:56:48,240 Speaker 2: other sovereign power at war, that he could be lured 1136 00:56:48,239 --> 00:56:51,560 Speaker 2: by attractive negotiating offers which fell short of his stated position, 1137 00:56:51,719 --> 00:56:54,000 Speaker 2: or that, refusing to negotiate, he could be brought to 1138 00:56:54,000 --> 00:56:56,000 Speaker 2: the peace table with the threat and use of force.
1139 00:56:56,280 --> 00:56:59,000 Speaker 2: It was a universal failure to grasp the unique nature 1140 00:56:59,040 --> 00:57:02,400 Speaker 2: of the insurgency in Vietnam. In other words, the kind of 1141 00:57:02,440 --> 00:57:05,800 Speaker 2: guys who work at RAND, who are, like, war gamers, 1142 00:57:05,880 --> 00:57:09,279 Speaker 2: who were obsessed with their own careers and, like, rational thought, 1143 00:57:09,840 --> 00:57:13,080 Speaker 2: could not accept that, like, well, there's people over there 1144 00:57:13,080 --> 00:57:15,719 Speaker 2: that believe in things, right? The people running the war 1145 00:57:15,719 --> 00:57:18,320 Speaker 2: effort in North Vietnam might not just come to the 1146 00:57:18,360 --> 00:57:21,440 Speaker 2: table because they get scared because of a bombing. They 1147 00:57:21,520 --> 00:57:24,680 Speaker 2: might actually have principles that they're holding to, you know. 1148 00:57:24,880 --> 00:57:28,400 Speaker 3: Yeah, that seems like an oversight that's common, which is, 1149 00:57:28,520 --> 00:57:32,200 Speaker 3: like, looking at them like NPCs, you know, where it's like, 1150 00:57:32,320 --> 00:57:34,720 Speaker 3: if we do this, then they'll get scared. Yeah, it's 1151 00:57:34,760 --> 00:57:38,280 Speaker 3: dehumanizing the enemy, right, and the thing about doing that 1152 00:57:38,360 --> 00:57:41,760 Speaker 3: is it often screws yourself over, where it's like, if 1153 00:57:41,760 --> 00:57:44,960 Speaker 3: you're not thinking about them like human beings, yeah, then 1154 00:57:45,560 --> 00:57:48,680 Speaker 3: you don't actually know how to deal with them. 1155 00:57:49,000 --> 00:57:51,240 Speaker 2: That's exactly it. And it's the same thing you get 1156 00:57:51,280 --> 00:57:53,320 Speaker 2: with, like, we're seeing with the Houthis right now, right, 1157 00:57:53,320 --> 00:57:55,920 Speaker 2: where it's like, okay, well, we started bombing them.
1158 00:57:56,360 --> 00:57:58,880 Speaker 2: Oh, that hasn't changed what they're doing. They're still 1159 00:57:58,880 --> 00:58:00,400 Speaker 2: throwing cruise missiles at ships. 1160 00:58:00,560 --> 00:58:00,800 Speaker 3: Yeah. 1161 00:58:01,160 --> 00:58:03,480 Speaker 2: I also, as if. Yeah, yeah, no, you. 1162 00:58:03,440 --> 00:58:06,520 Speaker 3: Could even scale it down. Like, I think, like, 1163 00:58:06,520 --> 00:58:10,120 Speaker 3: that whole Libs of TikTok interview where she just froze 1164 00:58:10,200 --> 00:58:13,520 Speaker 3: up, to me, that stinks of, like, yeah, they 1165 00:58:13,560 --> 00:58:17,160 Speaker 3: have this idea in their head of, like, a liberal journalist, 1166 00:58:17,640 --> 00:58:20,160 Speaker 3: this straw man, and then when you actually sit down 1167 00:58:20,160 --> 00:58:22,919 Speaker 3: with these people, or, like, Elon Musk and Don Lemon, where 1168 00:58:22,920 --> 00:58:26,240 Speaker 3: it's like, then you realize, like, oh, they're completely unprepared 1169 00:58:26,280 --> 00:58:29,560 Speaker 3: for this situation because they had this straw man in 1170 00:58:29,600 --> 00:58:32,720 Speaker 3: their head that they thought, like, oh, this will be easy. 1171 00:58:32,960 --> 00:58:36,240 Speaker 3: And then, so, like, it's this, yeah, you can scale 1172 00:58:36,280 --> 00:58:37,919 Speaker 3: it up to a war too, where it's. 1173 00:58:37,880 --> 00:58:41,840 Speaker 2: Yeah, it is, like, the same psychological, like, the Rand 1174 00:58:41,880 --> 00:58:44,680 Speaker 2: Corporation in Vietnam are in the same position as, like, yeah, 1175 00:58:44,720 --> 00:58:47,680 Speaker 2: Elon Musk in that interview, or the Libs of TikTok 1176 00:58:47,760 --> 00:58:51,080 Speaker 2: lady when she got, like, this happens repeatedly. I mean, 1177 00:58:51,080 --> 00:58:53,600 Speaker 2: it's the same.
You could go back to, like, Nazi 1178 00:58:53,680 --> 00:58:57,000 Speaker 2: Germany, right, when they invade Russia, being like, because they're 1179 00:58:57,160 --> 00:58:59,320 Speaker 2: very much, if you look at, like, the Rand Corporation 1180 00:58:59,400 --> 00:59:02,160 Speaker 2: is telling LBJ they're right around the corner from collapse. 1181 00:59:02,280 --> 00:59:04,680 Speaker 2: Just push a little harder and they'll fall apart. Hitler 1182 00:59:04,720 --> 00:59:07,320 Speaker 2: invades the USSR being like, yeah, if we just kick 1183 00:59:07,360 --> 00:59:09,520 Speaker 2: in the door, everything's going to collapse. It's like, no, 1184 00:59:10,080 --> 00:59:12,480 Speaker 2: it never quite works that way, does it, guys? 1185 00:59:13,680 --> 00:59:14,040 Speaker 3: Russia? 1186 00:59:14,560 --> 00:59:16,880 Speaker 2: Anytime. It's true. And you should think about this when 1187 00:59:16,920 --> 00:59:19,720 Speaker 2: people talk about, like, oh, you know, the Republicans, you know, 1188 00:59:19,760 --> 00:59:22,520 Speaker 2: don't have any real strength, like, we outnumber them by 1189 00:59:22,560 --> 00:59:24,880 Speaker 2: so much. You know, Trump is on the ropes, right? 1190 00:59:24,920 --> 00:59:26,440 Speaker 2: All we have to do is push a little harder 1191 00:59:26,480 --> 00:59:29,320 Speaker 2: and we'll beat them forever. Yeah, anytime anyone's telling you 1192 00:59:29,440 --> 00:59:33,200 Speaker 2: that about your enemy, no. People believe in things. It's 1193 00:59:33,240 --> 00:59:35,640 Speaker 2: hard to actually win a fight like this. 1194 00:59:35,960 --> 00:59:39,200 Speaker 3: It's wild how we keep doing it.
Because, like, this, 1195 00:59:39,800 --> 00:59:43,240 Speaker 3: not to deviate too far, but, like, the Oscars this year, 1196 00:59:43,280 --> 00:59:45,640 Speaker 3: they, like, read a Trump tweet on stage and everybody 1197 00:59:45,680 --> 00:59:49,320 Speaker 3: had a good laugh, and it's like, remember, like, 1198 00:59:49,480 --> 00:59:52,919 Speaker 3: the last election? Like, you guys gotta start, like, they're 1199 00:59:53,000 --> 00:59:56,960 Speaker 3: such humorists, and it's like, when has this ever worked 1200 00:59:56,960 --> 00:59:58,520 Speaker 3: out for you? Jesus Christ. 1201 00:59:58,760 --> 01:00:02,000 Speaker 2: Part of it is this reveals something, like, fundamentally, like, kind 1202 01:00:02,080 --> 01:00:04,880 Speaker 2: of horrible at the center of a lot of the 1203 01:00:04,960 --> 01:00:07,560 Speaker 2: human experience, which is that a huge number of people 1204 01:00:08,160 --> 01:00:10,920 Speaker 2: don't want to think that there can be other people 1205 01:00:10,960 --> 01:00:15,200 Speaker 2: who believe wildly different things from them and really believe it. 1206 01:00:15,520 --> 01:00:17,800 Speaker 2: They're just, like, doing it to be, like, evil, or, 1207 01:00:17,840 --> 01:00:21,200 Speaker 2: like, to try to, like, get something, but, like, actually 1208 01:00:21,280 --> 01:00:23,240 Speaker 2: that is the center of their being. Like, the 1209 01:00:23,520 --> 01:00:26,800 Speaker 2: idea to a lot of these guys, the idea that, like, yeah, 1210 01:00:26,840 --> 01:00:32,439 Speaker 2: in Vietnam, there are actual committed communist nationalists who are 1211 01:00:32,520 --> 01:00:37,160 Speaker 2: willing to die, lots of them, for a cause and 1212 01:00:37,240 --> 01:00:41,200 Speaker 2: are not willing to compromise on that cause.
Right. Yeah, 1213 01:00:41,520 --> 01:00:44,280 Speaker 2: you almost can't believe that, because then you have to 1214 01:00:44,360 --> 01:00:46,720 Speaker 2: kind of accept that, like, people can live in a 1215 01:00:46,720 --> 01:00:49,080 Speaker 2: way that is wildly different from how I do, and 1216 01:00:49,120 --> 01:00:53,520 Speaker 2: they're just as much people. They're not, like, brainwashed 1217 01:00:53,600 --> 01:00:55,960 Speaker 2: or anything. Like, this is actually just a deeply held 1218 01:00:56,120 --> 01:00:57,400 Speaker 2: set of beliefs. 1219 01:00:57,040 --> 01:00:59,080 Speaker 3: Right, right. And even if you think they're doing it, 1220 01:00:59,120 --> 01:01:00,960 Speaker 3: like, they don't fully believe it, or they're doing 1221 01:01:01,000 --> 01:01:03,920 Speaker 3: it anyway. I mean, we have, like, what, the sunk cost 1222 01:01:03,960 --> 01:01:06,760 Speaker 3: fallacy, or I call it being pot committed, 1223 01:01:07,000 --> 01:01:09,000 Speaker 3: which is, like, that idea that when you've put so 1224 01:01:09,160 --> 01:01:11,480 Speaker 3: much into it, you're not going to back out. 1225 01:01:11,840 --> 01:01:16,320 Speaker 2: Yeah. And that's really where we're headed here, because 1226 01:01:16,360 --> 01:01:19,640 Speaker 2: the Rand Corporation is very, very much integral in the 1227 01:01:19,720 --> 01:01:24,040 Speaker 2: US getting pot committed to Vietnam. Obviously, you can trace 1228 01:01:24,160 --> 01:01:26,840 Speaker 2: a lot of people's deaths back to the Rand Corporation's 1229 01:01:26,880 --> 01:01:29,840 Speaker 2: work there. And this has two major effects on domestic 1230 01:01:30,160 --> 01:01:33,680 Speaker 2: politics in the United States.
One is that think tanks 1231 01:01:33,680 --> 01:01:37,000 Speaker 2: and experts, the concept of expertise you brought up, sort 1232 01:01:37,000 --> 01:01:38,560 Speaker 2: of, like, the way in which a lot of 1233 01:01:38,600 --> 01:01:40,960 Speaker 2: people reacted to experts talking about, like, what we should 1234 01:01:40,960 --> 01:01:43,959 Speaker 2: do with COVID. A big part of why there's such 1235 01:01:43,960 --> 01:01:47,080 Speaker 2: a rapid backlash to, like, very basic public health shit. 1236 01:01:47,360 --> 01:01:50,760 Speaker 2: It all kind of starts with the backlash to Vietnam. 1237 01:01:51,240 --> 01:01:54,439 Speaker 2: All of the experts say, we're about to win, throw 1238 01:01:54,520 --> 01:01:56,760 Speaker 2: some more shit in here, right, throw some more money 1239 01:01:56,800 --> 01:02:00,560 Speaker 2: behind it, and they're horribly wrong. And that kind of 1240 01:02:01,040 --> 01:02:05,280 Speaker 2: helps sort of fuel this anti-expertise backlash in 1241 01:02:05,360 --> 01:02:08,560 Speaker 2: American popular culture, right, and, you have to 1242 01:02:08,560 --> 01:02:11,120 Speaker 2: say, this is a big part of conservatism today. 1243 01:02:11,640 --> 01:02:14,760 Speaker 2: It's not wrong for there to be that backlash, right? 1244 01:02:15,160 --> 01:02:18,360 Speaker 2: You should be really skeptical about people who lay claim to 1245 01:02:18,680 --> 01:02:19,800 Speaker 2: expertise in this shit. 1246 01:02:20,080 --> 01:02:23,800 Speaker 3: Absolutely, yes, yeah, but yeah, it's, yeah, all about the 1247 01:02:23,880 --> 01:02:26,360 Speaker 3: nuance, where it's like, yeah, being able to think critically 1248 01:02:26,400 --> 01:02:30,280 Speaker 3: about these things is a good idea. Yeah, it's just 1249 01:02:30,280 --> 01:02:33,000 Speaker 3: that that's not often what people do. 1250 01:02:33,400 --> 01:02:35,840 Speaker 2: Yeah.
Unfortunately, the response for some people is like, well, 1251 01:02:35,880 --> 01:02:37,680 Speaker 2: since all the experts are crooks, I'm just going to 1252 01:02:37,960 --> 01:02:41,040 Speaker 2: vote for the angriest man I've ever seen. You know? Well, 1253 01:02:41,080 --> 01:02:45,400 Speaker 2: that's not really a good idea either. The other equally 1254 01:02:45,440 --> 01:02:48,080 Speaker 2: important takeaway, though, and this is, it might 1255 01:02:48,120 --> 01:02:50,280 Speaker 2: seem like it's separate, but a lot of people recognize 1256 01:02:50,280 --> 01:02:53,439 Speaker 2: both these things, is that think tanks have power. Right? 1257 01:02:53,920 --> 01:02:57,720 Speaker 2: A bunch of eggheads writing policy papers helped provide support 1258 01:02:57,840 --> 01:03:01,960 Speaker 2: for an insane escalation in Vietnam, and that means there's 1259 01:03:02,000 --> 01:03:05,800 Speaker 2: a lot of power in having eggheads write policy papers, right, 1260 01:03:06,080 --> 01:03:09,280 Speaker 2: and if you pay for those policy papers, maybe you 1261 01:03:09,360 --> 01:03:13,120 Speaker 2: can get eggheads to support any insane policy you want 1262 01:03:13,120 --> 01:03:17,040 Speaker 2: to push in American society, right? And this is deeply 1263 01:03:17,080 --> 01:03:21,040 Speaker 2: attractive to the oligarchs who had fought like hyenas against 1264 01:03:21,080 --> 01:03:24,120 Speaker 2: the New Deal. These people had been infuriated by the 1265 01:03:24,120 --> 01:03:28,040 Speaker 2: Great Society as well. That's LBJ's, right? Alongside escalation in Vietnam, 1266 01:03:28,160 --> 01:03:32,000 Speaker 2: LBJ is pushing some of the most, like, substantial social 1267 01:03:32,000 --> 01:03:35,480 Speaker 2: welfare programs in the history of our country.
And these 1268 01:03:35,520 --> 01:03:37,640 Speaker 2: people, who had kind of fought against the 1269 01:03:37,680 --> 01:03:40,040 Speaker 2: New Deal but had given up, were, like, feeling kind 1270 01:03:40,080 --> 01:03:42,760 Speaker 2: of hopeless, like, we can't stop the anti-war 1271 01:03:42,840 --> 01:03:45,440 Speaker 2: movement from rising. That's continuing to frustrate them in, like, 1272 01:03:45,480 --> 01:03:49,360 Speaker 2: the mid sixties, and also, we couldn't stop this raft 1273 01:03:49,480 --> 01:03:52,760 Speaker 2: of social reforms from going through. But what they sort 1274 01:03:52,800 --> 01:03:57,320 Speaker 2: of start to realize is that, because the liberal establishment 1275 01:03:57,400 --> 01:04:00,360 Speaker 2: has gotten so in bed in Vietnam and with this 1276 01:04:00,480 --> 01:04:04,320 Speaker 2: kind of, like, cadre of experts who backed their stupid 1277 01:04:04,400 --> 01:04:07,800 Speaker 2: ideas for what to do, there was an opportunity, right, 1278 01:04:07,880 --> 01:04:11,840 Speaker 2: an opportunity to actually reverse this kind of feeling that 1279 01:04:11,880 --> 01:04:15,200 Speaker 2: conservatism is always on the back foot and start taking 1280 01:04:15,240 --> 01:04:18,959 Speaker 2: strides forward to become the dominant ideology in the country. Right, 1281 01:04:20,000 --> 01:04:21,840 Speaker 2: this is, like, the thing that's starting to happen in 1282 01:04:21,880 --> 01:04:24,640 Speaker 2: the mid sixties.
You could be forgiven if you had 1283 01:04:24,680 --> 01:04:29,800 Speaker 2: thought that, like, the ideological battle between conservatism and, like, 1284 01:04:30,000 --> 01:04:32,520 Speaker 2: liberalism had still been won by liberals in 1285 01:04:32,560 --> 01:04:34,400 Speaker 2: the sixties. It would have looked that way, because nineteen 1286 01:04:34,440 --> 01:04:37,080 Speaker 2: sixty four is when we get the candidacy of Barry Goldwater, 1287 01:04:37,720 --> 01:04:41,080 Speaker 2: and Goldwater is, he's, like, the craziest person anybody has 1288 01:04:41,080 --> 01:04:44,720 Speaker 2: ever seen in nineteen sixty four in politics. When I talk 1289 01:04:44,760 --> 01:04:47,120 Speaker 2: about him, people will always point out on the subreddit, well, 1290 01:04:47,160 --> 01:04:49,880 Speaker 2: actually, he was, like, pretty moderate on a lot 1291 01:04:49,880 --> 01:04:51,280 Speaker 2: of things. By the end of his life, he was, 1292 01:04:51,280 --> 01:04:54,040 Speaker 2: like, pro gay marriage and pro weed and stuff, and, 1293 01:04:54,120 --> 01:04:56,520 Speaker 2: like, all of that's true. But in nineteen sixty four 1294 01:04:56,600 --> 01:04:58,360 Speaker 2: he is the guy who is, he's coming up on 1295 01:04:58,480 --> 01:05:01,760 Speaker 2: stage and he's saying, like, extremism in the defense of liberty is 1296 01:05:01,840 --> 01:05:04,880 Speaker 2: no vice. Let's lob a nuclear bomb into the men's 1297 01:05:04,960 --> 01:05:07,960 Speaker 2: room at the Kremlin. Lazy, dole-happy people want to 1298 01:05:07,960 --> 01:05:10,800 Speaker 2: feed on the fruits of somebody else's labor, right? He 1299 01:05:10,880 --> 01:05:13,840 Speaker 2: comes up when he gives his speech at the RNC 1300 01:05:14,160 --> 01:05:17,800 Speaker 2: in sixty four.
He talks about how, like, the Democratic 1301 01:05:17,880 --> 01:05:20,720 Speaker 2: Party and the network news programs are under the direction 1302 01:05:20,800 --> 01:05:25,320 Speaker 2: of Marxist ballet dancers, that, like, their god is Mammon, right, 1303 01:05:25,480 --> 01:05:28,120 Speaker 2: which is money. Like, money is the god of 1304 01:05:28,160 --> 01:05:32,560 Speaker 2: the liberal establishment. He is kind of a maniac, right? Yeah, 1305 01:05:32,600 --> 01:05:36,480 Speaker 2: he's very reasonable compared to, like, modern Republicans, but this 1306 01:05:36,520 --> 01:05:39,760 Speaker 2: is how people think about him at the time, and 1307 01:05:39,880 --> 01:05:43,040 Speaker 2: Goldwater, like, this is a very scary moment for anyone 1308 01:05:43,080 --> 01:05:46,400 Speaker 2: paying attention, because what they'll notice, people who are at 1309 01:05:46,440 --> 01:05:48,200 Speaker 2: the Cow Palace, which is, like, the place in San 1310 01:05:48,240 --> 01:05:50,680 Speaker 2: Francisco where the RNC is happening then, will note that, 1311 01:05:50,760 --> 01:05:56,320 Speaker 2: like, his followers are, they're proto-Trumpists, right? They are 1312 01:05:56,480 --> 01:05:58,800 Speaker 2: into him and excited about him in a way that 1313 01:05:59,120 --> 01:06:02,680 Speaker 2: nobody was for presidents, right? Like, it was, like, 1314 01:06:02,760 --> 01:06:06,600 Speaker 2: this weird Hitlerian kind of cult of personality that he had, 1315 01:06:06,640 --> 01:06:10,040 Speaker 2: and it was small, but the extremism with which he 1316 01:06:10,200 --> 01:06:12,520 Speaker 2: was embraced by these kind of, what will become known 1317 01:06:12,520 --> 01:06:15,320 Speaker 2: as the New Right, was really concerning to a lot 1318 01:06:15,320 --> 01:06:18,360 Speaker 2: of people, for good reason. And it might have looked 1319 01:06:18,360 --> 01:06:22,320 Speaker 2: that way, because Goldwater loses badly.
LBJ gets sixty one percent of 1320 01:06:22,360 --> 01:06:25,280 Speaker 2: the popular vote. So you can see how some 1321 01:06:25,360 --> 01:06:28,440 Speaker 2: conservatives were like, well, this means we're fucked forever, right, 1322 01:06:28,480 --> 01:06:31,760 Speaker 2: this guy Goldwater has set us back generations. You know, 1323 01:06:32,040 --> 01:06:35,280 Speaker 2: we went too hard, too fast, and we lost badly, 1324 01:06:35,360 --> 01:06:37,840 Speaker 2: and, like, we have to go to the middle. Right? 1325 01:06:38,040 --> 01:06:40,280 Speaker 2: That's certainly what, like, if this were reversed, if you 1326 01:06:40,360 --> 01:06:45,720 Speaker 2: had an actual, like, hardcore leftist presidential candidate get defeated 1327 01:06:45,760 --> 01:06:48,600 Speaker 2: that badly, the Democratic Party's lesson would be, we can 1328 01:06:48,680 --> 01:06:51,000 Speaker 2: never ever do anything again, right? Like. 1329 01:06:51,560 --> 01:06:54,400 Speaker 3: It's Back to the Future: I guess you guys aren't 1330 01:06:54,400 --> 01:06:55,160 Speaker 3: ready for that yet. 1331 01:06:55,240 --> 01:06:59,360 Speaker 2: Yeah, I guess that is what happens here. Yeah, exactly. 1332 01:06:59,520 --> 01:07:02,280 Speaker 3: I mean, this plays into, like, I've talked to, 1333 01:07:02,440 --> 01:07:05,800 Speaker 3: like, punks who were sentient in the eighties when Reagan 1334 01:07:05,880 --> 01:07:08,480 Speaker 3: was elected, and they talk about this idea of, like, 1335 01:07:08,600 --> 01:07:11,040 Speaker 3: when Reagan got elected, I thought, oh, the world's gonna end, 1336 01:07:11,520 --> 01:07:13,200 Speaker 3: and then it didn't. And this is kind of what 1337 01:07:13,240 --> 01:07:16,480 Speaker 3: they did in that movie Oppenheimer, that idea of, like, yeah, 1338 01:07:16,520 --> 01:07:18,800 Speaker 3: it kind of did.
Like, that's the thing we don't 1339 01:07:18,800 --> 01:07:21,440 Speaker 3: realize is, when we say, like, oh, Trump's gonna fucking 1340 01:07:21,480 --> 01:07:24,960 Speaker 3: destroy the world or whatever, it's like, not immediately. It's 1341 01:07:25,000 --> 01:07:27,720 Speaker 3: more about the fact that these people are going to 1342 01:07:28,520 --> 01:07:31,520 Speaker 3: set us into this direction that's just going to keep 1343 01:07:31,760 --> 01:07:37,680 Speaker 3: snowballing, where we've now said, oh, it's okay for this person, yeah, 1344 01:07:37,720 --> 01:07:41,040 Speaker 3: to even run. Even just running for president, 1345 01:07:41,240 --> 01:07:44,800 Speaker 3: we're basically saying, like, we're now entertaining this idea, and 1346 01:07:44,840 --> 01:07:47,560 Speaker 3: maybe people didn't go for it this time, 1347 01:07:48,040 --> 01:07:50,640 Speaker 3: but okay, let's just slowly roll it out a little 1348 01:07:50,640 --> 01:07:51,520 Speaker 3: slower next time. 1349 01:07:51,600 --> 01:07:53,760 Speaker 2: You know, there's some people on the left who are, 1350 01:07:53,800 --> 01:07:56,600 Speaker 2: like, actual, like, Lewis Lapham of Harper's, who we're 1351 01:07:56,680 --> 01:08:00,520 Speaker 2: going to quote from, seems to recognize what Goldwater means. Yeah, 1352 01:08:00,560 --> 01:08:02,000 Speaker 2: you know, you get a bit of that. Hunter S. 1353 01:08:02,040 --> 01:08:03,840 Speaker 2: Thompson is kind of one of the people who sees 1354 01:08:04,080 --> 01:08:07,600 Speaker 2: Goldwater and is like, oh fuck, oh fuck, because he's 1355 01:08:07,680 --> 01:08:10,680 Speaker 2: just got a pretty good understanding of, like, American culture, 1356 01:08:10,800 --> 01:08:13,040 Speaker 2: and, like, okay, this is going to keep being a thing. 1357 01:08:13,080 --> 01:08:15,200 Speaker 2: It's only going to get bigger.
A lot of the 1358 01:08:15,200 --> 01:08:17,479 Speaker 2: people who had been Goldwater backers, a lot of these 1359 01:08:17,600 --> 01:08:20,479 Speaker 2: these guys were talking about this cadre of like super 1360 01:08:20,479 --> 01:08:24,559 Speaker 2: conservative multi millionaires most of them who had inherited their money. 1361 01:08:25,120 --> 01:08:28,639 Speaker 2: They are like they kind of have this brief flash 1362 01:08:28,640 --> 01:08:31,200 Speaker 2: of hope, a lot of them for Goldwater, but there's 1363 01:08:31,240 --> 01:08:35,960 Speaker 2: also like this deep crashing, like frustration when he fails, 1364 01:08:36,040 --> 01:08:38,280 Speaker 2: and this sense that like, well, we've lost forever. We 1365 01:08:38,520 --> 01:08:40,840 Speaker 2: just can't. We're not gonna be able to stop communism 1366 01:08:40,880 --> 01:08:43,479 Speaker 2: from taking over the country. And in his article for 1367 01:08:43,600 --> 01:08:46,599 Speaker 2: Harper's, Lewis Lapham describes a meeting of these guys at 1368 01:08:46,640 --> 01:08:50,519 Speaker 2: Bohemian Grove in nineteen sixty eight, which sets the mood 1369 01:08:50,600 --> 01:08:53,479 Speaker 2: of this particular caste well. In the hearts of the 1370 01:08:53,520 --> 01:08:56,640 Speaker 2: corporate chieftains wandering around the redwood trees of the Bohemian 1371 01:08:56,680 --> 01:08:59,439 Speaker 2: Grove in July nineteen sixty eight, the fear was palpable 1372 01:08:59,479 --> 01:09:02,439 Speaker 2: and genuine. The croquet lawns seemed to be sliding away 1373 01:09:02,439 --> 01:09:04,519 Speaker 2: beneath their feet, and although they knew they were in trouble, 1374 01:09:04,560 --> 01:09:07,760 Speaker 2: they didn't know why. Ideas apparently mattered, and words were 1375 01:09:07,760 --> 01:09:10,920 Speaker 2: maybe more important than they had guessed. Unfortunately they didn't 1376 01:09:10,920 --> 01:09:13,880 Speaker 2: have any.
The American property-holding classes tended to be 1377 01:09:13,920 --> 01:09:17,040 Speaker 2: embarrassingly ill at ease with concepts that don't translate promptly 1378 01:09:17,080 --> 01:09:19,960 Speaker 2: into money, and the beacons of conservative light shining through 1379 01:09:20,000 --> 01:09:22,640 Speaker 2: the liberal fog of the late nineteen sixties didn't come 1380 01:09:22,680 --> 01:09:25,519 Speaker 2: up to the number of clubs in Arnold Palmer's golf bag. 1381 01:09:25,960 --> 01:09:28,559 Speaker 2: The company of the commercial faithful, gathered on the banks 1382 01:09:28,560 --> 01:09:31,760 Speaker 2: of California's Russian River, could look for succor to Goldwater's 1383 01:09:31,800 --> 01:09:35,400 Speaker 2: autobiography The Conscience of a Conservative, to William F. Buckley's 1384 01:09:35,479 --> 01:09:38,840 Speaker 2: editorials in National Review, to the novels of Ayn Rand. 1385 01:09:39,240 --> 01:09:41,400 Speaker 2: But that was kind of all they had, right, was 1386 01:09:41,720 --> 01:09:46,599 Speaker 2: this kind of like utopian conservative fiction, because it seemed 1387 01:09:46,680 --> 01:09:50,720 Speaker 2: like the situation was so bleak, but salvation was not 1388 01:09:50,920 --> 01:09:56,160 Speaker 2: far away. Nixon is going to win, right, Sure, yeah, 1389 01:09:56,520 --> 01:09:59,680 Speaker 2: he's going to become the president, and that's going to 1390 01:09:59,720 --> 01:10:02,960 Speaker 2: be like kind of a fucking disaster. But they couldn't 1391 01:10:02,960 --> 01:10:06,160 Speaker 2: really see that coming. It didn't seem likely until a 1392 01:10:06,200 --> 01:10:08,639 Speaker 2: bunch of other shit falls into place later that year. 1393 01:10:09,000 --> 01:10:12,280 Speaker 2: So it's a desperate time for these guys, obviously, though 1394 01:10:12,360 --> 01:10:15,200 Speaker 2: shit starts to go their way pretty soon after this moment.
1395 01:10:15,439 --> 01:10:18,280 Speaker 2: In nineteen seventy one, a Richmond corporate lawyer named Lewis 1396 01:10:18,320 --> 01:10:22,280 Speaker 2: Powell wrote a confidential memorandum. He had been an intelligence 1397 01:10:22,280 --> 01:10:24,679 Speaker 2: guy in World War Two. His whole thing in World 1398 01:10:24,680 --> 01:10:27,200 Speaker 2: War Two, Lewis Powell had been, like, he'd written glowingly 1399 01:10:27,240 --> 01:10:29,360 Speaker 2: about the bombing of Dresden, like, this is, we did 1400 01:10:29,400 --> 01:10:31,720 Speaker 2: a great job with Dresden. This is really like the 1401 01:10:31,720 --> 01:10:35,200 Speaker 2: finest hour of our air power, you know, really murdering 1402 01:10:35,240 --> 01:10:38,280 Speaker 2: all of these civilians and not getting the Germans to surrender. 1403 01:10:38,720 --> 01:10:41,439 Speaker 2: After the war, he'd chaired the Richmond School Board, that's 1404 01:10:41,439 --> 01:10:44,000 Speaker 2: Richmond, Virginia, where he had fought like hell against 1405 01:10:44,080 --> 01:10:47,519 Speaker 2: the attempts to desegregate public schools. And then once he 1406 01:10:47,560 --> 01:10:50,479 Speaker 2: failed at stopping schools from desegregating, he took a job 1407 01:10:50,560 --> 01:10:53,639 Speaker 2: representing the Tobacco Institute during the height of its evil. 1408 01:10:53,680 --> 01:10:55,519 Speaker 2: So this is like a guy, this is a man 1409 01:10:55,560 --> 01:10:57,679 Speaker 2: whose business is being the devil, right. 1410 01:10:58,560 --> 01:10:59,760 Speaker 3: That's, what do you do? And he's like, you know, 1411 01:10:59,840 --> 01:11:01,160 Speaker 3: like evil stuff, I mean. 1412 01:11:01,600 --> 01:11:06,519 Speaker 2: General evil, yeah, yeah, sundry evil. So this is like 1413 01:11:06,840 --> 01:11:11,960 Speaker 2: a pretty, pretty impressive bastard memorandum.
And so after Nixon 1414 01:11:12,160 --> 01:11:14,280 Speaker 2: gets into office, you know, a couple of years in, 1415 01:11:15,200 --> 01:11:18,760 Speaker 2: conservatives are like happy about some things, but there's also like, 1416 01:11:18,840 --> 01:11:22,280 Speaker 2: especially the hard right, the Goldwater right, doesn't really trust 1417 01:11:22,400 --> 01:11:24,840 Speaker 2: Nixon because even though he made his bones as an 1418 01:11:24,840 --> 01:11:27,840 Speaker 2: anti-communist, he's like going to be friends with Mao, 1419 01:11:28,000 --> 01:11:31,240 Speaker 2: you know, he establishes the EPA. So there's still this 1420 01:11:31,320 --> 01:11:34,280 Speaker 2: feeling that like, even though this guy's a Republican, we're 1421 01:11:34,320 --> 01:11:37,439 Speaker 2: still losing the ideological war. If a Republican is doing 1422 01:11:37,439 --> 01:11:40,680 Speaker 2: all this stuff, right, we need to get a real dick. 1423 01:11:40,800 --> 01:11:43,120 Speaker 2: Nixon is too sane and even-handed. We need a 1424 01:11:43,160 --> 01:11:44,400 Speaker 2: real maniac in there. 1425 01:11:44,479 --> 01:11:44,880 Speaker 3: Yeah, you know. 1426 01:11:46,360 --> 01:11:49,240 Speaker 2: And so Lewis Powell in nineteen seventy one writes what 1427 01:11:49,280 --> 01:11:56,439 Speaker 2: becomes known as the Powell Memorandum. So this is to 1428 01:11:56,520 --> 01:11:59,920 Speaker 2: kind of provide some additional context. Powell is, he's, 1429 01:12:00,000 --> 01:12:02,960 Speaker 2: he's a very prominent lawyer.
He gets asked by Nixon 1430 01:12:03,000 --> 01:12:05,240 Speaker 2: in sixty nine to join the Supreme Court, and he's like, 1431 01:12:05,320 --> 01:12:07,320 Speaker 2: I don't really want to be in the Supreme Court, right, 1432 01:12:07,920 --> 01:12:10,479 Speaker 2: And so a couple of years later Nixon asks again, 1433 01:12:10,720 --> 01:12:13,559 Speaker 2: and in seventy one Powell is like, yeah, I'll join 1434 01:12:13,600 --> 01:12:16,880 Speaker 2: the Supreme Court. And in kind of the period before 1435 01:12:16,920 --> 01:12:19,880 Speaker 2: he actually takes that job, one of his friends, who's 1436 01:12:19,920 --> 01:12:22,559 Speaker 2: the education director of the Chamber of Commerce, is like, Hey, 1437 01:12:22,560 --> 01:12:25,559 Speaker 2: before you become a Supreme Court justice, I need you 1438 01:12:25,600 --> 01:12:28,439 Speaker 2: to like write a memorandum on how we can win 1439 01:12:28,479 --> 01:12:31,400 Speaker 2: the culture war in the United States. And so Powell 1440 01:12:31,439 --> 01:12:34,879 Speaker 2: writes this thing titled Attack on the American Free Enterprise 1441 01:12:34,920 --> 01:12:37,960 Speaker 2: System that gets distributed to the Chamber of Commerce, which 1442 01:12:38,040 --> 01:12:41,080 Speaker 2: is like the body in the government that interfaces with all 1443 01:12:41,120 --> 01:12:43,880 Speaker 2: of the corporations, right, that's the basic idea of what 1444 01:12:43,920 --> 01:12:46,400 Speaker 2: the Chamber of Commerce is. So he writes this memo 1445 01:12:46,560 --> 01:12:48,719 Speaker 2: and it goes out to all of the people running 1446 01:12:48,840 --> 01:12:51,280 Speaker 2: the biggest companies in the United States.
This memo from 1447 01:12:51,280 --> 01:12:53,519 Speaker 2: this guy who's going to become a Supreme Court justice, 1448 01:12:53,560 --> 01:12:55,759 Speaker 2: and he is not going to disclose that he's written 1449 01:12:55,800 --> 01:12:58,599 Speaker 2: this memo when he's being confirmed as a Supreme Court justice, 1450 01:12:58,760 --> 01:13:01,040 Speaker 2: because, as I describe the memo, it will 1451 01:13:01,040 --> 01:13:03,080 Speaker 2: become obvious why he didn't want to talk about this. 1452 01:13:03,280 --> 01:13:06,120 Speaker 3: Wait, so nobody knows he wrote it, or he. 1453 01:13:06,080 --> 01:13:08,720 Speaker 2: Just the rich people know about it. 1454 01:13:08,800 --> 01:13:12,840 Speaker 3: So the public doesn't know. He Jerry Maguired all the rich people? 1455 01:13:12,920 --> 01:13:16,040 Speaker 2: Yeah, yeah, yeah. So the memo starts with him talking 1456 01:13:16,040 --> 01:13:18,920 Speaker 2: about like Ralph Nader, you know, as this like boogeyman. 1457 01:13:18,960 --> 01:13:21,320 Speaker 2: He's like, we have a bunch of demons stalking the 1458 01:13:21,360 --> 01:13:23,840 Speaker 2: propertied classes in America, and chief among them is 1459 01:13:23,960 --> 01:13:27,920 Speaker 2: Ralph Nader, because in sixty five, Nader had published this 1460 01:13:27,960 --> 01:13:31,360 Speaker 2: book called Unsafe at Any Speed, which forced the automotive 1461 01:13:31,360 --> 01:13:33,920 Speaker 2: industry to include seat belts and shit. Right, Like, he 1462 01:13:33,960 --> 01:13:37,000 Speaker 2: writes this book, everyone is dying in their cars. There 1463 01:13:37,040 --> 01:13:39,960 Speaker 2: are no safety features, and everyone has one. We should 1464 01:13:40,000 --> 01:13:43,360 Speaker 2: probably make it mandatory that there be safety features in cars. 1465 01:13:43,640 --> 01:13:45,679 Speaker 3: This seems to be what he was known for, yes, 1466 01:13:45,920 --> 01:13:48,200 Speaker 3: before he ran for president.
1467 01:13:49,960 --> 01:13:52,200 Speaker 2: Yeah, and it was a great thing to do. And 1468 01:13:52,240 --> 01:13:54,680 Speaker 2: it's this is part of this whole like sense of 1469 01:13:54,720 --> 01:13:58,519 Speaker 2: doom they have that there's no stopping progressivism, because this 1470 01:13:58,680 --> 01:14:02,200 Speaker 2: book comes out and immediately, like, there's not like this 1471 01:14:02,360 --> 01:14:05,080 Speaker 2: massive counterpunch to it where people are like, we 1472 01:14:05,120 --> 01:14:07,920 Speaker 2: need to cancel Nader for being too woke. Everyone's like, oh, yeah, 1473 01:14:07,920 --> 01:14:10,479 Speaker 2: we should have seat belts. That seems like a good idea, 1474 01:14:10,920 --> 01:14:13,640 Speaker 2: and automotive companies are like, but this is going to 1475 01:14:13,720 --> 01:14:16,559 Speaker 2: cost us a lot of money, right. And Powell 1476 01:14:16,840 --> 01:14:20,200 Speaker 2: describes himself as terrified about the reaction to Nader's work, 1477 01:14:20,280 --> 01:14:22,720 Speaker 2: because he sees it as like evidence that socialism is 1478 01:14:22,760 --> 01:14:26,439 Speaker 2: inevitably taking over the country. Right. The thing is, corporate 1479 01:14:26,560 --> 01:14:30,800 Speaker 2: power is a pillar holding up American greatness, and they're 1480 01:14:30,800 --> 01:14:33,200 Speaker 2: eroding it. You know, that's how he describes it. 1481 01:14:33,560 --> 01:14:38,360 Speaker 3: It's wild this attitude we have about corporations that we 1482 01:14:38,479 --> 01:14:41,960 Speaker 3: see them as like a deity, because this idea of saying, 1483 01:14:42,360 --> 01:14:44,800 Speaker 3: you've made something that's unsafe, we need you to make 1484 01:14:44,840 --> 01:14:47,760 Speaker 3: it more safe, and then corporations go, but that's going 1485 01:14:47,840 --> 01:14:50,240 Speaker 3: to cost us a lot of money.
The proper reply 1486 01:14:50,400 --> 01:14:52,520 Speaker 3: to that is, okay. 1487 01:14:52,840 --> 01:14:55,360 Speaker 2: Yeah, like, fine, you'd still have to. 1488 01:14:55,560 --> 01:14:59,280 Speaker 3: Do it. Like, we're still gonna make you do it. 1489 01:14:59,360 --> 01:15:03,519 Speaker 3: You know, you don't really have a choice. You're making 1490 01:15:03,520 --> 01:15:06,000 Speaker 3: a product that's going out into the public. It needs 1491 01:15:06,040 --> 01:15:08,559 Speaker 3: to be, it has to meet these standards that we've 1492 01:15:08,560 --> 01:15:11,720 Speaker 3: deemed safe. That's it, end of discussion. And so it's 1493 01:15:11,720 --> 01:15:14,800 Speaker 3: wild that there's an entire political party who's like, no, 1494 01:15:14,840 --> 01:15:16,960 Speaker 3: we can't do that to corporations. 1495 01:15:17,360 --> 01:15:20,160 Speaker 2: Well, and that's it. And these guys, up to this point, 1496 01:15:20,240 --> 01:15:22,519 Speaker 2: they still, they had felt that way the whole time. 1497 01:15:22,640 --> 01:15:24,519 Speaker 2: As the New Deal and Great Society and all this 1498 01:15:24,560 --> 01:15:26,960 Speaker 2: shit is going on, they had felt like, we shouldn't 1499 01:15:27,000 --> 01:15:29,599 Speaker 2: let them do this to us and our sweet, sweet money. 1500 01:15:29,960 --> 01:15:33,519 Speaker 2: But they didn't have a concerted way to counterattack, right.
1501 01:15:33,560 --> 01:15:36,200 Speaker 2: And what Powell says in this memo is like, we, 1502 01:15:36,479 --> 01:15:40,720 Speaker 2: the corporate class in America, the people with money who 1503 01:15:40,840 --> 01:15:44,720 Speaker 2: run our businesses, need to be attacking people like Nader. 1504 01:15:44,920 --> 01:15:47,599 Speaker 2: We need to build a mechanism to go to war 1505 01:15:47,920 --> 01:15:52,519 Speaker 2: with Ralph Nader, right, otherwise they're going to inevitably bring 1506 01:15:52,560 --> 01:15:55,720 Speaker 2: this country to communism, right right. Another guy that he 1507 01:15:55,840 --> 01:15:59,320 Speaker 2: rails against is William Kunstler, who's a civil rights lawyer. 1508 01:15:59,360 --> 01:16:01,320 Speaker 2: He had a hand in everything from the Freedom Riders 1509 01:16:01,320 --> 01:16:05,400 Speaker 2: to Wounded Knee, very influential guy. And guys like Kunstler 1510 01:16:05,520 --> 01:16:09,080 Speaker 2: and Nader are the enemy, right, They're these sinister forces 1511 01:16:09,120 --> 01:16:11,040 Speaker 2: aligned to create a world in which people have access 1512 01:16:11,080 --> 01:16:14,479 Speaker 2: to food and medicine. Against such foul enemies, the only 1513 01:16:14,600 --> 01:16:18,120 Speaker 2: response, Powell wrote, was the ideological equivalent of war. And 1514 01:16:18,160 --> 01:16:20,640 Speaker 2: I'm going to read a summary of his memorandum from 1515 01:16:20,680 --> 01:16:23,559 Speaker 2: a speech in the Senate by Sheldon Whitehouse of Rhode Island. 1516 01:16:24,160 --> 01:16:27,160 Speaker 2: The language in the Powell report is the language of battle, attack, 1517 01:16:27,280 --> 01:16:31,800 Speaker 2: frontal assault, rifle shots, warfare. The recommendations are to end compromise 1518 01:16:31,840 --> 01:16:35,439 Speaker 2: and appeasement.
His words, compromise and appeasement, to understand that, 1519 01:16:35,560 --> 01:16:38,040 Speaker 2: as he said, the ultimate issue may be survival, and 1520 01:16:38,080 --> 01:16:40,640 Speaker 2: he underlined the word survival in his report, and to 1521 01:16:40,720 --> 01:16:44,160 Speaker 2: call for the wisdom, ingenuity, and resources of American business 1522 01:16:44,200 --> 01:16:46,760 Speaker 2: to be marshaled against those who would destroy it. 1523 01:16:47,680 --> 01:16:51,559 Speaker 3: Man, there's something about when we talk about all these 1524 01:16:51,560 --> 01:16:54,680 Speaker 3: think tanks and experts and stuff, I really think for 1525 01:16:55,080 --> 01:17:00,400 Speaker 3: the average person, the best metric really is like who's 1526 01:17:00,400 --> 01:17:03,719 Speaker 3: always wrong or who's always right? What is actually the result? 1527 01:17:04,000 --> 01:17:04,200 Speaker 2: You know? 1528 01:17:04,360 --> 01:17:06,400 Speaker 3: Like this think tank with the Vietnam stuff, it's like, 1529 01:17:06,439 --> 01:17:09,760 Speaker 3: how did that work out? Is someone consistently wrong? And so 1530 01:17:09,840 --> 01:17:13,160 Speaker 3: with this stuff, it's just like, did we slide 1531 01:17:13,160 --> 01:17:17,280 Speaker 3: into communism because we have seat belts? What did this cause? Like, what 1532 01:17:17,760 --> 01:17:20,439 Speaker 3: is actually going on right now? Do you think, do 1533 01:17:20,479 --> 01:17:26,000 Speaker 3: you think, you know, deregulating everything and making corporations get 1534 01:17:26,040 --> 01:17:28,200 Speaker 3: to do whatever they want, has that made things better? 1535 01:17:28,439 --> 01:17:31,880 Speaker 3: Are products better? Do things cost less? Perhaps a lot? 1536 01:17:32,200 --> 01:17:32,360 Speaker 2: Is it? 1537 01:17:33,200 --> 01:17:36,080 Speaker 3: You know? Like look at today and be like, what 1538 01:17:36,200 --> 01:17:39,200 Speaker 3: are the problems today? What do you think caused that?
1539 01:17:39,600 --> 01:17:42,560 Speaker 3: And then perhaps should we listen to the people or 1540 01:17:42,600 --> 01:17:47,000 Speaker 3: the institutions that caused that problem and continue to cause problems? 1541 01:17:47,720 --> 01:17:50,559 Speaker 3: It's just so obvious, that's all. Yes, it's very silly. 1542 01:17:50,760 --> 01:17:54,000 Speaker 2: It ought to be. But you know the thing is, 1543 01:17:53,640 --> 01:17:56,479 Speaker 2: it's silly to like pretend that this is good for 1544 01:17:56,520 --> 01:17:59,240 Speaker 2: anyone but the people with money. But Powell isn't doing that. 1545 01:17:59,280 --> 01:18:03,000 Speaker 2: Powell is saying, hey, people with money, you are a 1546 01:18:03,160 --> 01:18:06,680 Speaker 2: threatened class in America and we have to organize to 1547 01:18:06,760 --> 01:18:10,120 Speaker 2: destroy the majority of the country who wants your money 1548 01:18:10,120 --> 01:18:11,720 Speaker 2: to buy medicine, which we. 1549 01:18:11,680 --> 01:18:15,880 Speaker 3: Still, we see this today, right, that billionaire is 1550 01:18:15,920 --> 01:18:16,600 Speaker 3: a slur. 1551 01:18:16,479 --> 01:18:19,320 Speaker 2: Or something like that, exactly, exactly. 1552 01:18:19,200 --> 01:18:21,519 Speaker 3: Or, it's just weird that there is, it's not just 1553 01:18:21,560 --> 01:18:23,760 Speaker 3: a class, but it's like a weird culture, right, where 1554 01:18:23,800 --> 01:18:27,479 Speaker 3: we're rooting for corporations, we're rooting for rich people, where 1555 01:18:27,520 --> 01:18:30,880 Speaker 3: like Elon Musk is a figurehead, or like people go 1556 01:18:31,000 --> 01:18:34,839 Speaker 3: like, yo, Disney versus Sony, and it's like, fuck all
1557 01:18:34,680 --> 01:18:39,519 Speaker 2: of that. What you're talking about, Dave, is the result. Like, 1558 01:18:39,560 --> 01:18:41,719 Speaker 2: that state of affairs is the result of the Powell 1559 01:18:41,720 --> 01:18:45,439 Speaker 2: Memorandum's success, because he is laying out a battle plan 1560 01:18:45,840 --> 01:18:48,200 Speaker 2: for these guys, for these rich failsons and the 1561 01:18:48,240 --> 01:18:51,880 Speaker 2: companies that they run. And Powell's vision here is nothing 1562 01:18:51,960 --> 01:18:54,360 Speaker 2: less than a plot to take over the US government 1563 01:18:54,360 --> 01:18:57,640 Speaker 2: from the inside, to damage its institutions so severely that 1564 01:18:57,720 --> 01:18:59,400 Speaker 2: no one would ever be able to take them back. 1565 01:19:00,000 --> 01:19:03,320 Speaker 2: He directed this letter at the oligarchs in American society, 1566 01:19:03,320 --> 01:19:05,400 Speaker 2: people who are frightened of any limit to their wealth 1567 01:19:05,400 --> 01:19:08,720 Speaker 2: and power. He wrote to them, strength lies in organization, 1568 01:19:09,120 --> 01:19:12,840 Speaker 2: in careful long-range planning and implementation, in consistency of 1569 01:19:12,880 --> 01:19:15,799 Speaker 2: action over an indefinite period of years, in the scale 1570 01:19:15,800 --> 01:19:18,720 Speaker 2: of financing available only through joint effort, and in the 1571 01:19:18,760 --> 01:19:24,000 Speaker 2: political power available only through united action and national organizations. Right. 1572 01:19:24,680 --> 01:19:27,760 Speaker 2: And his attitude is, it's the job of men like him, 1573 01:19:27,800 --> 01:19:30,439 Speaker 2: like, me as the thinker here, my job is 1574 01:19:30,439 --> 01:19:32,719 Speaker 2: to be like the RAND Corporation in Vietnam, to plot 1575 01:19:32,760 --> 01:19:36,719 Speaker 2: out a path to victory.
Right, it's the job of you guys, 1576 01:19:36,760 --> 01:19:39,200 Speaker 2: the people with money. All you need to do is 1577 01:19:39,320 --> 01:19:42,040 Speaker 2: put money in my pocket and the pocket of other 1578 01:19:42,120 --> 01:19:46,320 Speaker 2: people like us. Right, and we will finance. Like, if 1579 01:19:46,360 --> 01:19:49,200 Speaker 2: you finance think tanks and like pay for intellectuals, for 1580 01:19:49,280 --> 01:19:52,360 Speaker 2: lawyers like us, we will put out public policy and 1581 01:19:52,400 --> 01:19:55,000 Speaker 2: we'll get judges in place. And one of Powell's big 1582 01:19:55,040 --> 01:19:56,960 Speaker 2: insights is, like, because again he's about to become a 1583 01:19:56,960 --> 01:20:00,560 Speaker 2: Supreme Court justice, his attitude is like, conservatives need to 1584 01:20:00,600 --> 01:20:03,840 Speaker 2: take over the courts. Right, that's the best way to 1585 01:20:03,880 --> 01:20:07,920 Speaker 2: shift policy, because these are lifetime appointments. The more conservatives 1586 01:20:07,920 --> 01:20:10,280 Speaker 2: we get in the courts, we can actually take the 1587 01:20:10,320 --> 01:20:11,920 Speaker 2: reins of culture and steer them. 1588 01:20:12,000 --> 01:20:13,240 Speaker 3: Right, it's not wrong. 1589 01:20:13,560 --> 01:20:16,240 Speaker 2: He is not at all wrong. He's very smart. 1590 01:20:16,479 --> 01:20:18,919 Speaker 3: Yeah, it's almost as if we designed it badly. 1591 01:20:19,280 --> 01:20:21,320 Speaker 2: Yes, yes, it is. It's weird that. 1592 01:20:21,280 --> 01:20:23,519 Speaker 3: We were like, Okay, we'll have this branch of government, 1593 01:20:23,560 --> 01:20:25,160 Speaker 3: this branch, and they get, you know, it's like every 1594 01:20:25,200 --> 01:20:28,400 Speaker 3: few years, and then we'll have these like lifetime kings. 1595 01:20:28,439 --> 01:20:30,840 Speaker 2: We should have some god kings. Probably some god kings.
Yeah, 1596 01:20:30,880 --> 01:20:32,800 Speaker 2: definitely want a couple of god kings in there. 1597 01:20:32,960 --> 01:20:34,240 Speaker 3: Why would we do that? 1598 01:20:34,240 --> 01:20:38,599 Speaker 2: That's, I don't think we need god kings. Powell's attitude 1599 01:20:38,600 --> 01:20:41,080 Speaker 2: is that every American business should donate ten percent of 1600 01:20:41,120 --> 01:20:45,799 Speaker 2: their advertising budget towards propaganda, towards think tanks, towards funding 1601 01:20:45,840 --> 01:20:49,720 Speaker 2: this stuff. Right, Like every corporation, Exxon and whatnot, they 1602 01:20:49,720 --> 01:20:53,160 Speaker 2: should all be putting money into think tanks and consider 1603 01:20:53,240 --> 01:20:57,040 Speaker 2: that advertising, right, to lobby the government and publish papers 1604 01:20:57,040 --> 01:21:00,519 Speaker 2: that push their agenda, right, right, which is, you know, 1605 01:21:01,280 --> 01:21:04,920 Speaker 2: what happened. A big central part of his obsession is textbooks. 1606 01:21:05,080 --> 01:21:07,360 Speaker 2: One thing he wanted is he wanted oligarchs to pay 1607 01:21:07,439 --> 01:21:10,679 Speaker 2: right-wing pundits to critique and attack textbooks for being 1608 01:21:10,680 --> 01:21:14,280 Speaker 2: insufficiently pro-capitalist. He wanted to pay for there to 1609 01:21:14,320 --> 01:21:17,640 Speaker 2: be organizations to monitor TV networks. He believed that like 1610 01:21:17,720 --> 01:21:20,080 Speaker 2: television should quote be monitored in the same way that 1611 01:21:20,120 --> 01:21:23,439 Speaker 2: textbooks should be kept under constant surveillance. The goal of 1612 01:21:23,479 --> 01:21:25,679 Speaker 2: all this was to make sure that corporate America got 1613 01:21:25,680 --> 01:21:28,960 Speaker 2: equal time with like, you know, the interests of human beings.
1614 01:21:29,680 --> 01:21:31,600 Speaker 2: He's basically saying, the next time a guy like 1615 01:21:31,680 --> 01:21:34,280 Speaker 2: Ralph Nader publishes a book about how cars are killing people, 1616 01:21:34,680 --> 01:21:37,360 Speaker 2: we need to make sure every news agency gives equal 1617 01:21:37,360 --> 01:21:39,559 Speaker 2: time to the car companies, saying, but we don't want 1618 01:21:39,560 --> 01:21:42,880 Speaker 2: to put seat belts in cars, it'll make them too expensive. 1619 01:21:43,600 --> 01:21:45,240 Speaker 3: Yeah, both sides. 1620 01:21:45,920 --> 01:21:50,880 Speaker 2: Yeah, it's really good stuff here. And it's like, there's 1621 01:21:50,920 --> 01:21:53,439 Speaker 2: a lot of very modern stuff here, right? Like, Powell 1622 01:21:53,439 --> 01:21:56,280 Speaker 2: writes that like business owners should use political influence and 1623 01:21:56,360 --> 01:21:59,679 Speaker 2: money to stem quote the stampedes by politicians to support 1624 01:21:59,720 --> 01:22:03,920 Speaker 2: any legislation related to consumerism or to the environment, and 1625 01:22:03,960 --> 01:22:08,920 Speaker 2: he puts the environment in quotation marks. Political power is necessary, 1626 01:22:08,960 --> 01:22:12,519 Speaker 2: it must be assiduously cultivated, and when necessary, must be 1627 01:22:12,640 --> 01:22:16,400 Speaker 2: used aggressively and with determination. It is essential to be 1628 01:22:16,479 --> 01:22:19,360 Speaker 2: far more aggressive than in the past, with no hesitation 1629 01:22:19,439 --> 01:22:22,439 Speaker 2: to attack, not the slightest hesitation to press vigorously in 1630 01:22:22,479 --> 01:22:26,840 Speaker 2: all political arenas, and no reluctance to penalize politically those 1631 01:22:26,840 --> 01:22:33,559 Speaker 2: who oppose corporate efforts. So you know, not great, not great, 1632 01:22:33,840 --> 01:22:34,919 Speaker 2: but it all happens.
1633 01:22:35,080 --> 01:22:38,080 Speaker 3: Yeah, oh yeah, no, they get it. You're right. Yeah, 1634 01:22:38,120 --> 01:22:43,440 Speaker 3: we're now in a situation where it's not even like abnormal. 1635 01:22:43,520 --> 01:22:46,280 Speaker 3: We don't think of it as weird. Like, it's, 1636 01:22:46,520 --> 01:22:50,320 Speaker 3: it's, it's weird that like these things have to be 1637 01:22:50,360 --> 01:22:56,200 Speaker 3: debated or explained. They have diverted the conversation successfully. 1638 01:22:55,800 --> 01:23:01,200 Speaker 2: Yeah, you know, where like they followed Powell's marching orders, yeah. 1639 01:23:01,080 --> 01:23:03,880 Speaker 3: Right, where we're, we're getting people who are saying, like, 1640 01:23:05,200 --> 01:23:08,080 Speaker 3: you know, who are, who are just publicly like, this 1641 01:23:08,120 --> 01:23:11,879 Speaker 3: is gonna hurt corporations and that's gonna hurt you, the worker, 1642 01:23:12,360 --> 01:23:17,040 Speaker 3: and people are just believing it. Uh, it's kind of wild, 1643 01:23:17,280 --> 01:23:20,280 Speaker 3: Like it's wild that trickle down is a thing that's like, 1644 01:23:20,360 --> 01:23:23,679 Speaker 3: oh yeah, it'll get to you. And why would anybody 1645 01:23:23,760 --> 01:23:26,760 Speaker 3: vote for that? Why would anybody, an average person, go like, 1646 01:23:26,840 --> 01:23:29,720 Speaker 3: you know, we should, we should totally just let it 1647 01:23:29,800 --> 01:23:31,679 Speaker 3: go to the rich people and then it'll get around 1648 01:23:31,680 --> 01:23:35,559 Speaker 3: to us. It'll trickle down to us like piss.
1649 01:23:35,640 --> 01:23:39,120 Speaker 3: It's just like, that's so weird that we, that people 1650 01:23:39,120 --> 01:23:43,000 Speaker 3: were able to sell this idea that the upper class, 1651 01:23:43,000 --> 01:23:46,800 Speaker 3: that the corporate owners are people that need to be 1652 01:23:46,840 --> 01:23:49,320 Speaker 3: protected or need to be represented politically. 1653 01:23:50,200 --> 01:23:53,040 Speaker 2: Dave, I love so much that you brought up trickle 1654 01:23:53,040 --> 01:23:56,040 Speaker 2: down economics, because that's where we're heading in part two. 1655 01:23:58,960 --> 01:24:03,719 Speaker 2: Right now, let's trickle down your plugables to our audience. 1656 01:24:04,280 --> 01:24:09,000 Speaker 3: Mmm, I can. I mean, you mentioned Gamefully Unemployed, G 1657 01:24:09,120 --> 01:24:11,960 Speaker 3: A M E F U L L Y, Unemployed. That's 1658 01:24:11,960 --> 01:24:14,960 Speaker 3: a podcast network I do with Tom Reimann where we 1659 01:24:15,000 --> 01:24:20,680 Speaker 3: talk about movies. Mostly we do reviews and we talk 1660 01:24:20,720 --> 01:24:23,479 Speaker 3: about movie news, so on and so forth. We have 1661 01:24:23,520 --> 01:24:25,680 Speaker 3: a Patreon you can check out, and then I am 1662 01:24:25,720 --> 01:24:28,599 Speaker 3: the head writer for Some More News, which is a 1663 01:24:28,640 --> 01:24:30,920 Speaker 3: political show that I'm sure a lot of people are 1664 01:24:30,960 --> 01:24:34,080 Speaker 3: aware of if they're listening to this, but if you're not, 1665 01:24:34,479 --> 01:24:36,559 Speaker 3: you should Google it. I don't know when this is 1666 01:24:36,560 --> 01:24:38,439 Speaker 3: coming out, but we just did a two-parter on 1667 01:24:38,600 --> 01:24:41,320 Speaker 3: Lady Ballers. You know, the important stuff. 1668 01:24:42,439 --> 01:24:46,720 Speaker 2: Hell yeah. Well, everybody, this has been a podcast. I 1669 01:24:46,800 --> 01:24:50,840 Speaker 2: have been Robert Evans.
Lady Ballers has been a bad movie, 1670 01:24:50,880 --> 01:24:54,880 Speaker 2: but listen to what Dave thinks about it and go 1671 01:24:55,000 --> 01:24:56,120 Speaker 2: to hell. I love you. 1672 01:25:00,240 --> 01:25:02,680 Speaker 1: This is a production of Cool Zone Media. For 1673 01:25:02,800 --> 01:25:06,839 Speaker 1: more from Cool Zone Media, visit our website coolzonemedia dot com, 1674 01:25:06,960 --> 01:25:10,200 Speaker 1: or check us out on the iHeartRadio app, Apple Podcasts, 1675 01:25:10,280 --> 01:25:11,800 Speaker 1: or wherever you get your podcasts.