Brought to you by the reinvented 2012 Camry. It's ready. Are you? Welcome to Stuff You Should Know from HowStuffWorks.com. Hey, and welcome to the podcast. I'm Josh Clark. Chuck Bryant's with me, as usual. As a matter of fact, I don't know what I'd do if Chuck Bryant wasn't with me. I'd probably curl up in the fetal position in the corner and cry myself to sleep. I think you'd be just fine. I don't know, Chuck. I don't ever want to find out, Chuck. Well, thanks. I almost didn't make it today. Why, what's going on with you? I was out last night. I went out and saw Matt's band. As you know, Matt Frederick is our handsome young stand-in producer. Yeah, Matt Frederick of Stuff of Genius fame, and an awesome new video podcast, we should mention, Chuck. It's really good. You guys should check that out, Stuff of Genius. It's a video podcast, and it's cool, and it has a little, like, Monty Python-esque animation. I like it. Yeah. And we don't want to hear any of this "I don't have time to watch that" crap, because it's like a minute forty-two, tops.
Yeah, it's quick. Yeah. But I went and saw his band last night, Lions and Scissors. Yeah, good stuff. And I wanted to say that they have a MySpace page, and it's good music if you're into it. It's very shoegaze. Yeah, Mogwai-ish. There's sort of a Radiohead-y component at times. Explosions in the Sky, you ever heard of them? I have. A big wall of sound, loud. My ears are ringing, and I'm slightly tired and imbibed a bit too much. But that was my night. That's Chuck's Thursday, everybody. So keep rocking, Matt. We love it. Yeah, way to go, Matt. So can we get back to our podcast now, maybe? Yeah. But I just wanted to give you a hard time. I think you're a good guy for doing that. Talented drummer. Great drummer. Okay. So, Chuck, you grew up in the Cold War, right? That's pretty funny, jerk. You did. You're a Cold War baby, like me. I'm a Cold War kid. Isn't it weird to think we actually work with people who, like, weren't cognizant that, you know, the Soviets had nukes pointed at us at all times, and vice versa? Yeah, that is weird to me.
Yeah. But, um, yeah, I remember being particularly unnerved from time to time that, like, dude, eventually they're going to come over, or our missile is going to be sent over, and that's that, you know. I can't help but think that we were definitely shaped, our personalities were shaped, by that underlying, constant level of paranoia that we grew up with. Well, in the movies, a lot of great Cold War movies of that era, but the Russkies are out to get us. Yeah. Well, do you remember the movie Russkies? Wasn't there one called that? It was actually a counter-propaganda movie where a bunch of kids who were probably my age, you would have been, like, you know, twenty by then, um, where I think they found a Russian sailor who washed ashore, and they had to hide him, because of course, you know, the government would shoot him in the head if they found him. And they came to learn that the Soviets have hearts too; they know how to love. Was it a real movie?
Like, yeah, it wasn't very good, but somebody actually put out the effort to say, you know, hey, we're all just people here, you know. Um, and what they were doing actually was counter-propaganda, right? Like all the, and that was right off the cuff, too, um, all of the stuff we were told, I would say at least the vast majority of it, was lopsided at best. Like, do you remember, what did you think of the Russians when you were growing up? That they would, like, cut your throat as soon as look at you. And they were always standing in these horrible breadlines, and, like, every single one of them wanted to escape, but the Russian government wouldn't let them. They wanted our toilet paper. You heard stories about how they don't have blue jeans and they don't have toilets. Yeah, like you could get, you know, five wives with a single pair of jeans, that kind of thing. Yeah. Well, I remember when the Iron Curtain fell and, like, actual news started coming out of the former USSR states, and I remember thinking, oh, yeah, what a surprise.
All of that was lies, and they're actually kind of nice folks, by and large. I'm not saying that they weren't trying to do bad things when we were trying to do bad things. Well, no, I think the Russian people were good people. That's what I figured out when the propaganda, the Cold War propaganda, ended so abruptly: that we're all people. Like, the average Russian is like the average American at heart, with the same dreams, goals, aspirations, the same fears. You know, it's the same thing. And we didn't want our stockpiles of weapons pointed in their direction either, you know. Your average American probably didn't know. Certainly not, and I didn't either. Let's just get along. All right. So, Chuck, what we're talking about, clearly, is propaganda, right? Which, in just that word, a very smart person once said, um, that propaganda is not a dirty word, and it didn't end with the Cold War, and that's actually true.
But propaganda still has horrible connotations, just the word itself, right? I mean, it elicits images of, like, brainwashed masses, and that's definitely the case in most cases when it comes to propaganda. But there's a more classical definition of propaganda, and essentially, um, it's that it's simply a tool for persuasive arguments, um, that uses facts and beliefs but omits the facts and beliefs that would persuade people to the other side of the argument, right? It's accentuating the positive, in a way, and you never talk about the negative side of things. Right. So it's sort of like Facebook. It is. It's very much like Facebook. Um, I figured out, through reading this article, How Propaganda Works, that technically the Truth campaign, you know, the commercials to get people to quit smoking, that's propaganda, because they omit the fact that cigarettes make you "alive with pleasure" in "flavor country." Right. So that's a little bit of both. I'm not sure we should say the brands, because I'm pretty sure Big Tobacco would sue our pants off. And "you've come a long way." Nice, Chuck. Um.
So, yes. But the main hallmark of propaganda is that it includes omissions of facts, right? Um. And actually, where did the whole word come from? You got any info on that, dude? Uh, yeah. It started with religion way back, and it started, you know, hundreds of years before it was officially coined. But in sixteen twenty-two, Pope Gregory, I need to work on my Roman numerals, Pope Gregory XV established the Congregation of Propaganda in sixteen twenty-two, and that was basically trying to win back Catholics who had taken up the Protestant faith. Yeah, Martin Luther made a real dent in the number of seats in the pews every Sunday. So Pope Gregory formed the, what was it, the Congregation of Propaganda, um, and they... oh, you did say that. Holy cow, man, I gotta pay more attention. Basically, this Congregation of Propaganda, you know, won Catholics back by pointing out that anyone who doesn't take communion every week is a loser. And it worked. It did? Sure, I'm sure numbers increased. So since then, you know, from that point until, I don't know, the nineteen forties, there was absolutely no propaganda whatsoever.
That's probably not true. No, you don't think so? Okay, I need to read a little more. Clearly, World War One? Are we talking, are we gonna go with the history now? I don't think we should get to that part yet. I think we should talk more about propaganda's implications, right? In the article, you read that there was an interview with a guy named M. Lane Bruner. Yeah, he's a professor of, what, rhetoric at Georgia State. Yeah, I didn't know that was such a thing. I didn't either, but it's interesting, for him. Um. But Bruner said that the distinction for him between good and bad propaganda was whether or not, um, the people perpetrating the propaganda have the best interests of their audience at heart. Right. Yeah, but that's subjective. I agree wholeheartedly. I take issue with that, because I think that it's up to the individual to decide what his or her best interests are, all right? And to make that decision, you have to be fully informed. Well, propaganda is based on an omission of facts. You're never fully informed when you're being propagandized.
Right. So, therefore, there's no such thing as good propaganda. Sure. You never see both sides of the argument with propaganda, right? Right. So I don't think there is such a thing. I take issue. And he's at Georgia State; he's not too far. Maybe we should hop on the subway and go down and take issue with him. Yeah, I want to take issue with this professor. I got some propaganda for you. Yeah. Uh, so, Chuck, how do you get propaganda across? Well, there are a lot of techniques, actually, and these are pretty cool. And I know that when people hear these, they're going to be very familiar, maybe not with the name, but with the result. Yeah. Um, name-calling is a big one. Yeah. I found a poster I showed you. Um, it's a takeoff on that Shepard Fairey Obama political poster, but this one is a slightly different picture. He has his nose in the air, and it says "snob" under it. It's actually pretty funny. Yeah, especially since, well, it's a lot funnier because he won. Right. Yeah, uh, yeah, true. I guess that was done before he won. Yeah. Uh.
And, you know, name-calling is just typical playground stuff, but they do it on a large scale. Grown men and women in the political spectrum call people out. They'll use names like "terrorist" and "traitor" and, like, "evildoers," stuff that nobody wants to be, right? Not to pick too much on George Bush, but when you throw down words like "axis of evil" and "evildoers," that's propaganda, definitely a form of it. Yeah, it is, it is. And it makes you wonder, like, what exactly is going on in Iran right now? How much propaganda are we experiencing from that? Sure. So that's one technique. Sure. You want to talk about the bandwagon? Well, bandwagon is pretty simple. It's, like, get on the winning side, dummy, you know. Which, actually, that example mixed name-calling and bandwagon. Nobody wants to feel left out, and again, this is pretty much a playground technique, which is still, come with us, then, you and your group, or you're gonna be left behind.
All your friends and neighbors are gonna be cooler than you, smarter than you, richer than you, whatever. Um. And, you know, everybody wants to be a part of something good, you know. So, yeah, basically, with the bandwagon technique, you're made to feel like you can be a part of something good if you join in, or be left behind if you don't. You got it. Thanks. I like this one a lot: the glittering generalities. Yeah, it's a great name. Yeah, this is really common in political propaganda, and that's when you combine words that have positive connotations with the concept, so that the concept seems beloved. So basically, no one's gonna come out and denounce something that you've attached one of those words to. And I have another example, and again, not to pick on Bush, but he was in office for eight years. That's a lot of years. You know, there are a lot of recent things you can point to, like the Patriot Act. Like, anyone who would come out and say, oh, the Patriot Act is bad, then what, you're not a patriot? You're not supporting... Yeah. But the worst part is, it worked.
Yeah. Although, do you remember, one of the original provisions was basically to turn postal workers into spies, and the Post Office said, no, we're not going to do that, right? And it got left out. But they wanted postal workers to keep an eye on what was going on, to report on communities and individual people. Right. Yeah. So other words you can use in the glittering generalities are words like "liberty" and "dreams" and "family." You throw these words in there, and, you know, God forbid you step up and say something that's anti-family just because they tagged that name to it. Right, exactly. And all politicians do this. We're not gonna single anyone out. It happens all over the place, on both sides of the spectrum. Yeah, it's just that Bush was in office for eight years, which I just said. You should listen. Card stacking. Card stacking is exactly what it sounds like. It's stacking the argument in favor of one side over another. Right. Um, and again, this is the one where fact omission really comes into play, you know.
Um, and it's most often seen in political campaigns where, you know, one candidate is like... Broadwater. Have you ever seen Ali G? Yeah. Did you ever see the Borat segment where he follows around that candidate, Jim Broadwater? It's hilarious. At one point, he tells a voter that Broadwater is talking to that if you do not vote for Broadwater, Broadwater will take power. It's hilarious. And he compares him to Stalin. Yeah, and this is this poor, you know, Republican guy running for city councilman. But yeah. I bet he lost, didn't he? I don't know; I'd like to find that out. But yeah, card stacking is basically just, um, saying, here are our candidate's great, great attributes, and leaving out any bad stuff. Well, details and statistics, too. Like, they'll throw out legitimate studies, but studies that don't mention the other study that points out the exact opposite. Nice point, Chuck. Card stacking, like Facebook. Yeah, exactly. And then there is my favorite: fear. That's a big one.
So, Chuck, say we were to point out that the guys who host TechStuff, uh, steal babies in America and then sell them to human traffickers in the Balkans. You're saying that Jonathan Strickland and Chris Pollette would do that? I'm just saying I've heard things. So, I mean, don't you think it would be a good idea to not listen to their podcast, and all the people who are TechStuff fans, maybe come over and listen to us instead? Because we certainly don't steal babies, and we would never sell any to human traffickers if we did. That's propaganda. Great example, Josh. Thanks. Of course, we'd have to do that on the TechStuff podcast so they'd hear it. Yeah, good point. Uh, subliminal, subliminal messaging. I'm sorry, I should say it subliminally: subliminal messaging. Huh, I feel like doing your bidding all of a sudden. Exactly. And that is, uh, that's one of the oldest tricks in the book. And that's basically images and, yeah, it's the oldest trick in the book. Now, you know how it is. It's images and words that are so quick and abstract that you don't consciously recognize them.
Again, we keep going back to politics because it's just so obvious with politics, but look at any campaign poster for anyone, from somebody running for school board to somebody running for president. They always have red, white, and blue in them. They'll often have a star, or there'll be a wavy graphic that's kind of reminiscent of a flag. And none of these things are concrete, like, you never see the candidate dressed as the Statue of Liberty or actually wrapped in a flag. It's a little more subtle than that, but it has the same effect. The Obama symbol was exactly like that, the one that they designed. It kind of looked like the wavy flag in the circle. It makes the O in "snob," doesn't it? It does. I'll show you more closely. But yeah. And actually, a really good way to kind of pick out this kind of propaganda, which is called transfer, right, um, is to pretend you're from another country. Right? So all of a sudden, that wave, what's that wave for? Or what's that star for?
Like, stuff we just take for granted, that registers immediately because our neurons are like, patriot, patriot, you know. Um, if you imagine you're from another country, suddenly you deconstruct these abstract images, and it seems a little clunky, clumsy; it doesn't have the same effect. What is this wavy star? Exactly. In Soviet Russia, wavy star doesn't understand you. That's good, thank you. Whatever happened to that guy? And then, lastly, there's plain-folks propaganda, which is kind of weak, actually. Yeah, that didn't strike me as propaganda when I read it. Like, kind of the politician trying to seem like your average, ordinary, you know, next-door-neighbor American. Um, I guess it's propaganda if the article says so, but it never struck me as that. Well, technically it is, because it's an omission of fact. So, you know, sure, the candidate loves fishing, but is he really fishing in some rinky-dink rowboat that he rented from, like, a local fisherman?
Or is he on, like, an eighty-foot yacht, you know, using babies that he bought off the TechStuff guys as bait? Or did they set up some, you know, TV commercial where they did take him to that farm in the rinky-dink rowboat and said, hey. Excellent points. Most decidedly propaganda. Yeah, so, yeah, plain folks is propaganda too. And, Chuck, the more you start looking or thinking about propaganda, the more you realize it is everywhere. It is, Chuck. Well, it's where you would expect it to be, which is in print, on the internet, TV, radio, movies, you name it. Like I was talking about with the eighties movies and the Cold War, I mean, every action movie that came out, the Russians were the enemy, pretty much. And then, kind of later on, it became Middle Easterners were the enemy. Like, look who Rambo fought and who Rocky fought. Those were prime examples. The Russians. But he helped the mujahideen, a.k.a. the Taliban. He did? What, in, uh, one of the Rambo movies? The third Rambo movie.
And at the very end, 321 00:17:11,440 --> 00:17:14,280 Speaker 1: they said that they dedicated the movie to the 322 00:17:14,359 --> 00:17:17,600 Speaker 1: mujahideen freedom fighters, and the mujahideen, who we were 323 00:17:17,760 --> 00:17:22,080 Speaker 1: funding to help fight the Russians in Afghanistan, turned into 324 00:17:22,080 --> 00:17:27,200 Speaker 1: the Taliban. Wow. Chuck Norris is not happy about 325 00:17:27,240 --> 00:17:29,320 Speaker 1: that at all. He's been after Stallone ever since, 326 00:17:30,000 --> 00:17:33,080 Speaker 1: right, to show him a thing or two. Yeah, so apparently, um, 327 00:17:33,119 --> 00:17:36,880 Speaker 1: according to one of the, uh, professors interviewed 328 00:17:36,920 --> 00:17:42,280 Speaker 1: in this article, um, broadcast media like radio or TV 329 00:17:43,000 --> 00:17:47,080 Speaker 1: is the most dangerous propaganda medium because people tend to 330 00:17:47,119 --> 00:17:50,440 Speaker 1: believe it. Well, not just that, there's no discourse, it's 331 00:17:50,480 --> 00:17:53,920 Speaker 1: all one-sided. It's all, here, you ingest this. And 332 00:17:53,960 --> 00:17:57,440 Speaker 1: also it's much, it's very entertaining. You know, your 333 00:17:57,480 --> 00:18:01,280 Speaker 1: average TV show is usually more entertaining than your average 334 00:18:01,600 --> 00:18:04,200 Speaker 1: AP news article. Yeah, that's true, you know what I mean. 335 00:18:04,520 --> 00:18:08,040 Speaker 1: But ironically it's the AP news articles that are generally 336 00:18:08,080 --> 00:18:11,200 Speaker 1: the least propagandic. Yeah, you're right, because you think about it, 337 00:18:11,240 --> 00:18:15,720 Speaker 1: it's like fact fact fact, fact fact, quote, fact, quack. Yes, 338 00:18:16,200 --> 00:18:19,560 Speaker 1: that's actually a quote and a fact. Quack. Um, 339 00:18:19,680 --> 00:18:23,600 Speaker 1: and then, um, that's it, right, there's not.
It's pretty 340 00:18:23,600 --> 00:18:27,160 Speaker 1: bare bones. I like those political commercials. Those are great. 341 00:18:27,920 --> 00:18:31,200 Speaker 1: Which one? You know, the big-time propaganda, where they're, 342 00:18:31,200 --> 00:18:35,399 Speaker 1: where they're, you know, this candidate, your family, stuff like that. 343 00:18:35,480 --> 00:18:37,840 Speaker 1: Do you remember the phone ringing one? I think it 344 00:18:37,920 --> 00:18:40,639 Speaker 1: was a Clinton ad against Obama. It was one of 345 00:18:40,680 --> 00:18:43,320 Speaker 1: the last ones she ran during the primary, and it 346 00:18:43,400 --> 00:18:46,159 Speaker 1: was just a phone ringing. Is that, when the phone, 347 00:18:46,320 --> 00:18:48,240 Speaker 1: when the phone rings in the middle of the night, 348 00:18:48,840 --> 00:18:51,280 Speaker 1: who do you want as president to answer it? Something 349 00:18:51,359 --> 00:18:54,400 Speaker 1: like that? Right. Yeah, that's definitely fear propaganda. Yeah. 350 00:18:54,440 --> 00:18:56,240 Speaker 1: Those crack me up, though. I mean, the people that 351 00:18:56,280 --> 00:19:00,560 Speaker 1: buy into those, that's what scares me. The commercials themselves, 352 00:19:00,600 --> 00:19:02,080 Speaker 1: I get a kick out of it. I think it's hysterical. 353 00:19:02,480 --> 00:19:05,399 Speaker 1: Yeah, that anyone wouldn't say, this is so one-sided. 354 00:19:05,440 --> 00:19:09,480 Speaker 1: What's mind-bogglingly frightening is that it actually works 355 00:19:09,520 --> 00:19:15,520 Speaker 1: on some people. So, um, Chuck, propaganda also sometimes is 356 00:19:15,600 --> 00:19:19,240 Speaker 1: not necessarily contrived. It just kind of comes out. Like, 357 00:19:19,280 --> 00:19:21,679 Speaker 1: I spent a few years as a journalist, right, and 358 00:19:21,720 --> 00:19:25,480 Speaker 1: I realized that it is really easy for your beliefs 359 00:19:25,600 --> 00:19:28,600 Speaker 1: to creep into a story.
It doesn't matter whether it's 360 00:19:28,600 --> 00:19:31,920 Speaker 1: a story about somebody who just turned a hundred or, um, 361 00:19:31,960 --> 00:19:35,120 Speaker 1: you know, about the war in Iraq. As 362 00:19:35,280 --> 00:19:38,800 Speaker 1: unbiased as you try to be, it's impossible to be 363 00:19:38,960 --> 00:19:42,240 Speaker 1: totally objective. Right, in this very podcast. We get taken 364 00:19:42,280 --> 00:19:47,560 Speaker 1: to task occasionally by people that think we're communists, 365 00:19:47,800 --> 00:19:51,879 Speaker 1: anti-religious, sexist, mommy worshippers, that kind of thing. But 366 00:19:51,960 --> 00:19:54,000 Speaker 1: we're not any of those things. No, not really. And 367 00:19:54,000 --> 00:19:56,240 Speaker 1: if we are, we're sure not aware of it anyway. 368 00:19:56,320 --> 00:19:59,680 Speaker 1: But you know, your belief system informs your outlook, right? 369 00:20:00,359 --> 00:20:03,080 Speaker 1: So, you know, just the very position you're taking, just 370 00:20:03,119 --> 00:20:07,280 Speaker 1: the very approach to an article, there's eighty, a hundred, countless 371 00:20:07,320 --> 00:20:10,520 Speaker 1: different ways to approach an article. The one you choose, 372 00:20:10,600 --> 00:20:13,159 Speaker 1: even if you're trying to be objective, that's a choice, 373 00:20:13,200 --> 00:20:15,159 Speaker 1: that's a bias right out of the gate. And you're 374 00:20:15,160 --> 00:20:17,000 Speaker 1: gonna choose the one that you identify with, that you 375 00:20:17,080 --> 00:20:19,199 Speaker 1: understand more. Yeah, so again, I mean, I guess what 376 00:20:19,240 --> 00:20:21,480 Speaker 1: I'm trying to say is, if you're getting news, only 377 00:20:21,480 --> 00:20:25,399 Speaker 1: get it from AP. Not a bad idea. Thank you, Chuck.
378 00:20:25,920 --> 00:20:29,840 Speaker 1: You want to move along, uh, different types of propaganda? Well, 379 00:20:29,880 --> 00:20:32,159 Speaker 1: let's talk about the internet real quick, because I 380 00:20:32,200 --> 00:20:35,640 Speaker 1: find this interesting. The Internet actually has the potential to 381 00:20:35,640 --> 00:20:40,040 Speaker 1: totally undermine traditional propaganda, right? So, well, think about it, like, 382 00:20:40,080 --> 00:20:44,840 Speaker 1: if broadcast programming is the most dangerous form of propaganda because 383 00:20:44,840 --> 00:20:48,240 Speaker 1: there's no feedback, then the Internet would be the least 384 00:20:48,320 --> 00:20:51,800 Speaker 1: dangerous form because there's nothing but feedback. Social media has 385 00:20:51,840 --> 00:20:54,760 Speaker 1: just opened the Internet up to everybody, and any crackpot, 386 00:20:54,840 --> 00:20:59,439 Speaker 1: normal person, or saint can put this stuff on the 387 00:20:59,480 --> 00:21:03,080 Speaker 1: Internet and get opposing viewpoints out there, so you 388 00:21:03,119 --> 00:21:05,919 Speaker 1: can conceivably be a fully informed person and make 389 00:21:05,920 --> 00:21:10,959 Speaker 1: your own decisions, which completely undermines, um, propaganda. Right, right. 390 00:21:11,560 --> 00:21:18,000 Speaker 1: The problem is, you know, facts spreading lightning fast, just 391 00:21:18,040 --> 00:21:23,639 Speaker 1: as much as that undermines propaganda, um, uninformed ideas or 392 00:21:23,840 --> 00:21:26,919 Speaker 1: facts that aren't really facts can spread just as quickly, 393 00:21:27,760 --> 00:21:30,280 Speaker 1: and that helps propaganda. So yeah, and the Internet is 394 00:21:30,320 --> 00:21:32,440 Speaker 1: just rife with that kind of thing. What's my solution? 395 00:21:32,680 --> 00:21:35,720 Speaker 1: Snopes dot com. Yeah, they're pretty good.
So let's talk 396 00:21:35,720 --> 00:21:38,480 Speaker 1: about the different types of propaganda and wrap this puppy 397 00:21:38,600 --> 00:21:42,359 Speaker 1: up like a Christmas present. Religious propaganda was kind of 398 00:21:42,359 --> 00:21:45,119 Speaker 1: where it all began, like we said earlier. And, uh, 399 00:21:45,720 --> 00:21:51,880 Speaker 1: missionaries, yeah, for centuries have 400 00:21:51,880 --> 00:21:55,320 Speaker 1: been traveling to other countries trying to recruit others to 401 00:21:55,480 --> 00:21:59,159 Speaker 1: their faith. And this is a form of 402 00:21:59,280 --> 00:22:03,560 Speaker 1: propaganda, the pamphlets and the posters that they hand out. 403 00:22:03,560 --> 00:22:05,439 Speaker 1: And we're not saying that they're bad people and that 404 00:22:05,520 --> 00:22:08,119 Speaker 1: they're spreading lies. What we're saying is, that's a form 405 00:22:08,160 --> 00:22:11,800 Speaker 1: of propaganda when you only evangelize one side of 406 00:22:11,800 --> 00:22:13,560 Speaker 1: the coin, and they do, when you go to these 407 00:22:13,560 --> 00:22:16,760 Speaker 1: countries and tell people that this is the answer right here. Well, 408 00:22:16,800 --> 00:22:19,359 Speaker 1: you can also make the case that another kind of 409 00:22:19,400 --> 00:22:22,280 Speaker 1: propaganda the article points out but doesn't join to religion, 410 00:22:22,640 --> 00:22:27,520 Speaker 1: thought reform, is actually a form of religious propaganda as well. 411 00:22:28,200 --> 00:22:30,359 Speaker 1: You know, if you're running around worshiping, like, a 412 00:22:30,480 --> 00:22:33,840 Speaker 1: bunch of deities and the Christians come along and say, no, no, 413 00:22:34,200 --> 00:22:38,760 Speaker 1: there's just one, we're monotheistic.
Now, that's thought reform, right? Right. Yeah, 414 00:22:39,080 --> 00:22:41,359 Speaker 1: although, you know, generally they don't give you Kool-Aid 415 00:22:41,720 --> 00:22:44,920 Speaker 1: that's laced with cyanide. Now that, that's very culty. 416 00:22:45,280 --> 00:22:48,440 Speaker 1: But you know, all those types, political, religious, 417 00:22:48,760 --> 00:22:53,399 Speaker 1: well, especially political and religious propaganda, they kind of, um, 418 00:22:53,640 --> 00:22:58,520 Speaker 1: underscore our divisive nature, right, like us versus them. I 419 00:22:58,520 --> 00:23:01,240 Speaker 1: actually took an anthropology class once in college, and the 420 00:23:01,280 --> 00:23:04,960 Speaker 1: professor challenged us to go a day, just one day, 421 00:23:05,080 --> 00:23:08,440 Speaker 1: without using the words us or them or any variation 422 00:23:08,480 --> 00:23:13,440 Speaker 1: on that theme. And I defy you to do it successfully. 423 00:23:13,520 --> 00:23:15,359 Speaker 1: You can't start now, the day's half over. You have 424 00:23:15,440 --> 00:23:18,240 Speaker 1: to start tomorrow. Just those two words. You can say 425 00:23:18,280 --> 00:23:20,960 Speaker 1: we, but no variation on the theme of 426 00:23:21,200 --> 00:23:25,000 Speaker 1: us or them. Try it. It's tough. I'm gonna forget 427 00:23:25,040 --> 00:23:28,639 Speaker 1: about that as soon as I leave this studio, yes. Chuck.
428 00:23:29,400 --> 00:23:32,439 Speaker 1: The big one, though, of all of them, is government propaganda, 429 00:23:32,520 --> 00:23:35,960 Speaker 1: right? Right, which is illegal, officially. And if you 430 00:23:35,960 --> 00:23:40,600 Speaker 1: think about it, that government propaganda is taxpayers paying to 431 00:23:40,880 --> 00:23:45,200 Speaker 1: be brainwashed, which is why it should be illegal, right? Yeah, 432 00:23:45,240 --> 00:23:49,639 Speaker 1: and it has been, technically. But W, Mr. Bush, 433 00:23:50,080 --> 00:23:54,080 Speaker 1: in two thousand five actually signed the Stop Government Propaganda 434 00:23:54,119 --> 00:23:58,440 Speaker 1: Now bill to, uh, to stop some, like, blatant outright 435 00:23:58,440 --> 00:24:02,080 Speaker 1: acts of propaganda committed by government agencies, like when you 436 00:24:02,119 --> 00:24:05,280 Speaker 1: pay television reporters to skew a message. Like planting stories. 437 00:24:05,520 --> 00:24:10,840 Speaker 1: Planting stories, exactly. Yeah. And it also established that 438 00:24:10,960 --> 00:24:15,280 Speaker 1: audio and printed press communications state who the agency is 439 00:24:15,320 --> 00:24:18,000 Speaker 1: that funded it, like, paid for by blah blah blah 440 00:24:18,000 --> 00:24:20,280 Speaker 1: blah blah, that kind of thing. And we see 441 00:24:20,359 --> 00:24:25,040 Speaker 1: government propaganda most prominently during times of war, right? Like, 442 00:24:25,240 --> 00:24:28,800 Speaker 1: Hitler and the Nazis were masters of propaganda. He was 443 00:24:28,840 --> 00:24:32,680 Speaker 1: the king of propaganda in world history, I think. Yeah, 444 00:24:32,760 --> 00:24:37,719 Speaker 1: he cut Germany off from the outside world. He sold 445 00:24:38,000 --> 00:24:41,080 Speaker 1: radios for next to nothing. I think he was driving 446 00:24:41,080 --> 00:24:43,080 Speaker 1: around in the back of a truck selling them.
He 447 00:24:43,560 --> 00:24:45,600 Speaker 1: made sure the prices were low so every German could 448 00:24:45,600 --> 00:24:48,160 Speaker 1: afford one, so they could tune into his radio addresses 449 00:24:48,200 --> 00:24:50,200 Speaker 1: and hear how great they were and how awful 450 00:24:50,240 --> 00:24:53,000 Speaker 1: the Jews and everybody else was. Uh, and the 451 00:24:53,359 --> 00:24:58,359 Speaker 1: portrayal of what was going on, like Germans living in 452 00:24:58,400 --> 00:25:00,639 Speaker 1: other parts of the world were being abused 453 00:25:00,680 --> 00:25:03,760 Speaker 1: at the hands of their host countries and things like that. Um, 454 00:25:03,840 --> 00:25:06,520 Speaker 1: and it was effective. Yeah. And they also made movies, 455 00:25:06,560 --> 00:25:09,840 Speaker 1: the famous Nazi propaganda movies, where they, you know, made 456 00:25:09,840 --> 00:25:13,000 Speaker 1: out Jews to be rats and Hitler to be godlike. 457 00:25:13,080 --> 00:25:16,000 Speaker 1: And they didn't like the Gypsies much either, 458 00:25:16,880 --> 00:25:20,280 Speaker 1: or gays or Catholics. Yeah. Yeah, you forget sometimes, you 459 00:25:20,280 --> 00:25:22,800 Speaker 1: know, that it wasn't just the Jews that were persecuted 460 00:25:22,840 --> 00:25:25,840 Speaker 1: in the Holocaust. There's a lot of other groups. Yeah. 461 00:25:26,560 --> 00:25:29,280 Speaker 1: And here in the US, here stateside, we had 462 00:25:29,320 --> 00:25:32,400 Speaker 1: our own propaganda as well. And also we should say, 463 00:25:32,880 --> 00:25:36,960 Speaker 1: um, on our very enjoyable sister podcast Stuff You Missed 464 00:25:36,960 --> 00:25:39,919 Speaker 1: in History Class, they actually did an entire podcast on 465 00:25:40,040 --> 00:25:43,440 Speaker 1: the Nazi propaganda machinery. Yeah, yeah. People should 466 00:25:43,480 --> 00:25:45,560 Speaker 1: check that out. You can get that on iTunes too.
467 00:25:45,760 --> 00:25:48,439 Speaker 1: But again, stateside, we had our own propaganda, and 468 00:25:48,520 --> 00:25:51,679 Speaker 1: some of it has become pop icons, right? Yes, and 469 00:25:51,720 --> 00:25:53,720 Speaker 1: World War Two was when it really kicked up. Like, 470 00:25:53,800 --> 00:25:56,680 Speaker 1: if you think of the famous Uncle Sam I Want 471 00:25,720 --> 00:25:59,960 Speaker 1: You posters, with Uncle Sam pointing, trying to get young 472 00:26:00,000 --> 00:26:02,359 Speaker 1: American men to enlist in the army. Yeah, that was 473 00:26:02,400 --> 00:26:04,320 Speaker 1: new in the forties. Yeah, and that was big, 474 00:26:04,560 --> 00:26:07,199 Speaker 1: big-time propaganda. Posters were very effective back then. My 475 00:26:07,280 --> 00:26:10,320 Speaker 1: favorite, I had two favorites. One of them is, what, 476 00:26:10,560 --> 00:26:14,120 Speaker 1: which one? Rosie? Probably. Rosie is pretty cool, um, 477 00:26:14,160 --> 00:26:16,400 Speaker 1: but there was one that had somebody riding in a 478 00:26:16,400 --> 00:26:20,760 Speaker 1: car by himself, and it said, um, 479 00:26:20,800 --> 00:26:23,840 Speaker 1: it was for carpooling, right, rationing gas 480 00:26:23,840 --> 00:26:26,639 Speaker 1: and stuff: when you ride alone, you ride with Hitler. 481 00:26:27,760 --> 00:26:29,639 Speaker 1: How great would it be to have one of those posters now? 482 00:26:29,840 --> 00:26:32,359 Speaker 1: I'm sure you can find at least a replica. Yeah, 483 00:26:32,400 --> 00:26:35,080 Speaker 1: that's true. I mean, that's the other thing about 484 00:26:35,119 --> 00:26:38,920 Speaker 1: them, is they're like great art. Oh yeah, propaganda posters 485 00:26:38,920 --> 00:26:41,720 Speaker 1: have the best art. Yeah, I like that era.
486 00:26:42,160 --> 00:26:45,399 Speaker 1: The other one I like, I just 487 00:26:45,440 --> 00:26:47,679 Speaker 1: can't believe that these were up 488 00:26:47,800 --> 00:26:51,479 Speaker 1: on public display, um, during World War Two. There's a 489 00:26:51,560 --> 00:26:55,240 Speaker 1: Japanese soldier using the butt of his rifle to smack 490 00:26:55,480 --> 00:26:59,720 Speaker 1: an American POW in the chin, and it says, what 491 00:26:59,800 --> 00:27:03,199 Speaker 1: are you going to do about it? And below is 492 00:27:03,240 --> 00:27:06,080 Speaker 1: the answer. And the answer, according to this propaganda poster, 493 00:27:06,280 --> 00:27:09,439 Speaker 1: is, stay on the job until every murdering Jap is 494 00:27:09,480 --> 00:27:12,760 Speaker 1: wiped out. You're kidding. It even has a little Government 495 00:27:12,840 --> 00:27:16,680 Speaker 1: Office of Propaganda logo at the bottom. Look, I kid 496 00:27:16,720 --> 00:27:19,680 Speaker 1: you not. That's something else. Yeah, it's a little nuts. 497 00:27:19,680 --> 00:27:22,720 Speaker 1: Well, you go a little overboard during times 498 00:27:22,720 --> 00:27:24,680 Speaker 1: of war. I liked Rosie the Riveter. That's who I 499 00:27:24,680 --> 00:27:26,920 Speaker 1: thought you were going to mention. Have you seen, my 500 00:27:27,320 --> 00:27:31,160 Speaker 1: favorite mechanic is a woman, obviously a riff on that? Yeah, 501 00:27:31,200 --> 00:27:35,240 Speaker 1: successfully too. Um, yeah, Rosie the Riveter was famous obviously 502 00:27:35,280 --> 00:27:38,600 Speaker 1: because women at the time during World War Two were 503 00:27:38,680 --> 00:27:43,000 Speaker 1: encouraged to help the war effort at home, on the home 504 00:27:43,040 --> 00:27:45,840 Speaker 1: front, by taking these factory jobs that 505 00:27:45,840 --> 00:27:48,280 Speaker 1: the men had to leave.
And she became like 506 00:27:48,320 --> 00:27:52,119 Speaker 1: an iconic character, and one of the posters read, longing 507 00:27:52,200 --> 00:27:55,080 Speaker 1: won't bring him back sooner, get a war job. Yeah, 508 00:27:55,200 --> 00:27:57,000 Speaker 1: I love that. Yeah, it is. It's pretty cool. 509 00:27:57,119 --> 00:27:58,679 Speaker 1: I saw another one that was a woman holding a 510 00:27:58,680 --> 00:28:01,760 Speaker 1: giant key, and it said food rationing is the 511 00:28:01,840 --> 00:28:04,480 Speaker 1: key to the war effort. And actually, that was 512 00:28:04,520 --> 00:28:07,280 Speaker 1: one of the things that I don't think you could 513 00:28:07,280 --> 00:28:12,400 Speaker 1: predict that came out of propaganda, was, um, women suddenly 514 00:28:12,800 --> 00:28:17,440 Speaker 1: were, uh, put into their proper position of power. They 515 00:28:17,440 --> 00:28:19,760 Speaker 1: were elevated to that kind of power. They were 516 00:28:19,880 --> 00:28:23,199 Speaker 1: no longer the mere little housewives. They were empowered to, 517 00:28:23,280 --> 00:28:25,960 Speaker 1: like, actually help with the war effort, get a war job, 518 00:28:26,520 --> 00:28:30,359 Speaker 1: or to, um, ration food or do whatever. They 519 00:28:30,400 --> 00:28:32,960 Speaker 1: suddenly had a role. And not just women but blacks 520 00:28:32,960 --> 00:28:35,080 Speaker 1: as well. There was a propaganda poster that said, like, 521 00:28:35,200 --> 00:28:36,960 Speaker 1: united we win, and it was a black guy and 522 00:28:37,000 --> 00:28:40,200 Speaker 1: a white guy working side by side, decades before the 523 00:28:40,240 --> 00:28:43,560 Speaker 1: civil rights movement. Yeah, so sometimes it's, um, foreshadowing 524 00:28:43,560 --> 00:28:46,280 Speaker 1: of social change, and possibly even a mechanism of social 525 00:28:46,360 --> 00:28:49,960 Speaker 1: change that follows.
Whatever the issue is that's 526 00:28:49,960 --> 00:28:52,280 Speaker 1: being propagandized. I think, definitely in the case of women, 527 00:28:52,280 --> 00:28:55,040 Speaker 1: I think World War Two probably had a lot of good 528 00:28:55,040 --> 00:28:57,760 Speaker 1: benefits for women, kind of having a voice for the 529 00:28:57,800 --> 00:29:00,920 Speaker 1: first time, or not the first time, but probably a 530 00:29:01,080 --> 00:29:03,880 Speaker 1: really big voice, early on, for the first time. Yeah, yeah. 531 00:29:04,400 --> 00:29:08,160 Speaker 1: Um, and Chuck, that's propaganda, baby. Yeah. You know, I 532 00:29:08,200 --> 00:29:10,920 Speaker 1: had a movie idea, script idea, back during 533 00:29:10,960 --> 00:29:15,720 Speaker 1: my screenwriting days, about a film student that gets, um, 534 00:29:15,760 --> 00:29:17,560 Speaker 1: like, he wins the big student film award, and then 535 00:29:17,600 --> 00:29:18,960 Speaker 1: all of a sudden he gets whisked away by the 536 00:29:19,000 --> 00:29:22,960 Speaker 1: government to their secret lair, and they recruit young filmmakers 537 00:29:23,360 --> 00:29:26,400 Speaker 1: into the Ministry of Propaganda, and, like, the moon landing 538 00:29:26,480 --> 00:29:28,800 Speaker 1: was fake and all these things have been faked, or whatever. 539 00:29:28,960 --> 00:29:31,760 Speaker 1: We talked about that in another podcast. But yeah, 540 00:29:31,760 --> 00:29:33,880 Speaker 1: this kid gets caught up in making these movies that 541 00:29:33,920 --> 00:29:36,880 Speaker 1: are all false. Like Wag the Dog. Yeah, it's a 542 00:29:36,880 --> 00:29:38,440 Speaker 1: great one too. Yeah, sort of a riff on that. 543 00:29:38,480 --> 00:29:39,760 Speaker 1: And I never wrote it, and I'm not 544 00:29:39,800 --> 00:29:41,560 Speaker 1: going to.
So if anyone out there is a screenwriter and 545 00:29:41,600 --> 00:29:44,320 Speaker 1: wants that idea, feel free. Yeah, just give a shout-out 546 00:29:44,320 --> 00:29:47,280 Speaker 1: to Chuck at the premiere, right, right. Yeah, so 547 00:29:47,520 --> 00:29:52,360 Speaker 1: again, that's propaganda. That is right. So are we plugging 548 00:29:52,400 --> 00:29:55,280 Speaker 1: anything? We don't have an ad? Nothing? Holy cow. That means 549 00:29:55,280 --> 00:30:03,320 Speaker 1: we get to go right to listener mail. Josh, today 550 00:30:03,360 --> 00:30:06,520 Speaker 1: I think, uh, it would be appropriate to 551 00:30:06,520 --> 00:30:10,640 Speaker 1: talk about Molly Orshansky. Yeah, our last subject matter there, 552 00:30:10,640 --> 00:30:12,880 Speaker 1: we were talking about the women in World War Two. 553 00:30:13,280 --> 00:30:16,120 Speaker 1: We were. Uh, and also we mentioned that we have 554 00:30:16,200 --> 00:30:19,400 Speaker 1: been called sexists and mommy worshipers. You want to read 555 00:30:20,000 --> 00:30:22,520 Speaker 1: the letter in question? Yes. We actually got a couple 556 00:30:22,520 --> 00:30:26,000 Speaker 1: of letters, um, one of which we're going to read now, 557 00:30:26,040 --> 00:30:29,200 Speaker 1: and one that was kind of nasty and mean, and 558 00:30:29,320 --> 00:30:30,760 Speaker 1: we're not going to read that, or we're not gonna 559 00:30:30,800 --> 00:30:34,400 Speaker 1: say that, uh, nasty, mean person's name. But this one 560 00:30:34,480 --> 00:30:37,280 Speaker 1: was much more above board. Uh, it says, you probably 561 00:30:37,320 --> 00:30:39,840 Speaker 1: have received a bunch of emails about this, but I 562 00:30:39,840 --> 00:30:41,720 Speaker 1: wanted to let you know that Molly Orshansky is 563 00:30:41,760 --> 00:30:44,200 Speaker 1: a woman. If you recall, she is the woman who 564 00:30:44,200 --> 00:30:47,000 Speaker 1: developed the poverty line.
But in your podcast How Much 565 00:30:47,000 --> 00:30:49,000 Speaker 1: Money Do I Really Need to Live?, you referenced Ms. 566 00:30:49,080 --> 00:30:52,200 Speaker 1: Orshansky as a he. As a female graduate student of 567 00:30:52,200 --> 00:30:55,680 Speaker 1: public policy with a specialization in poverty, I was so 568 00:30:55,720 --> 00:30:58,080 Speaker 1: excited to hear you mention a woman who was so 569 00:30:58,160 --> 00:31:00,760 Speaker 1: influential to the field, but then I was extremely disappointed 570 00:31:00,800 --> 00:31:03,280 Speaker 1: when you got the gender wrong. Obviously she had a 571 00:31:03,360 --> 00:31:05,520 Speaker 1: right to be, and I hope you make this correction 572 00:31:05,520 --> 00:31:08,760 Speaker 1: on your podcast. And that comes from Cheryl, Master in 573 00:31:08,800 --> 00:31:11,240 Speaker 1: Public Policy candidate at the John F. Kennedy School of 574 00:31:11,280 --> 00:31:14,400 Speaker 1: Government at Harvard University. And I wrote Cheryl back and 575 00:31:14,440 --> 00:31:16,600 Speaker 1: thanked her for being kind, since we had gotten the 576 00:31:16,720 --> 00:31:21,000 Speaker 1: nasty letter calling us misogynistic freaks. And it was a 577 00:31:21,000 --> 00:31:25,520 Speaker 1: big mistake. We were wrong. And the research that we 578 00:31:25,600 --> 00:31:28,959 Speaker 1: got actually referenced, uh, Molly Orshansky as a he, 579 00:31:29,520 --> 00:31:32,080 Speaker 1: and so it wasn't some big assumption on our part 580 00:31:32,120 --> 00:31:34,800 Speaker 1: that it had to be a man. Great. Wait, Chuck, Chuck, 581 00:31:34,840 --> 00:31:36,600 Speaker 1: I've been thinking about this. I think at this point 582 00:31:36,680 --> 00:31:41,520 Speaker 1: we should make up a research team and lay this 583 00:31:41,600 --> 00:31:43,120 Speaker 1: right at their feet. I don't think we should take 584 00:31:43,120 --> 00:31:47,800 Speaker 1: any responsibility whatsoever, right?
Right. So, Chuck, um, 585 00:31:48,080 --> 00:31:51,080 Speaker 1: it was our research team that really, uh, dropped the 586 00:31:51,120 --> 00:31:53,520 Speaker 1: ball on this one. It wasn't you or I. We 587 00:31:53,520 --> 00:31:56,160 Speaker 1: were misinformed. We did not assume, like a 588 00:31:56,200 --> 00:31:59,160 Speaker 1: couple of readers or listeners have thought, that it was 589 00:31:59,160 --> 00:32:01,080 Speaker 1: a man just because it was 590 00:32:01,120 --> 00:32:02,960 Speaker 1: some big college. It wasn't us at all. We 591 00:32:03,000 --> 00:32:04,640 Speaker 1: don't do that, Chuck. But to make up for the 592 00:32:04,680 --> 00:32:07,800 Speaker 1: failings of our crack research team, who have been chastised, 593 00:32:08,000 --> 00:32:10,840 Speaker 1: um, since we got this pointed out to us. Well, fired. 594 00:32:10,960 --> 00:32:13,280 Speaker 1: We fired them both. It's a different way of putting it. 595 00:32:13,280 --> 00:32:15,600 Speaker 1: In this economy, you want to stay chastised. We didn't 596 00:32:15,600 --> 00:32:18,120 Speaker 1: fire anyone, I'm just kidding. Um, so we did a 597 00:32:18,120 --> 00:32:20,360 Speaker 1: little research into Molly Orshansky. We found out that 598 00:32:20,440 --> 00:32:24,240 Speaker 1: she is dead. She died in April two thousand seven. 599 00:32:24,720 --> 00:32:28,160 Speaker 1: And she actually was quite a pioneer, um, in her field.
600 00:32:28,480 --> 00:32:31,920 Speaker 1: She worked for the Social Security Administration from nineteen fifty- 601 00:32:31,960 --> 00:32:35,520 Speaker 1: eight to nineteen eighty-two. And, uh, as historian Alice 602 00:32:35,560 --> 00:32:38,400 Speaker 1: O'Connor wrote in Poverty Knowledge, she was one of a 603 00:32:38,640 --> 00:32:43,480 Speaker 1: respected but mostly invisible cadre of women research professionals 604 00:32:43,520 --> 00:32:47,280 Speaker 1: based at the Social Security Administration and other government agencies during 605 00:32:47,320 --> 00:32:50,200 Speaker 1: the postwar years. And I think that's part of the 606 00:32:50,640 --> 00:32:54,640 Speaker 1: problem. I think, um, we, as, you 607 00:32:54,680 --> 00:32:57,520 Speaker 1: know, early thirty-somethings, well, one of us is 608 00:32:57,560 --> 00:33:00,680 Speaker 1: an early thirty-something, in two thousand nine kind of, 609 00:33:00,840 --> 00:33:05,800 Speaker 1: um, underestimated what women were allowed to do, I think, 610 00:33:05,840 --> 00:33:10,120 Speaker 1: in the sixties. Right, that's fair enough. Sure. So, because 611 00:33:10,120 --> 00:33:13,080 Speaker 1: you don't hear much about it. No, they were. No, 612 00:33:13,400 --> 00:33:15,520 Speaker 1: I know, and that is the travesty. And 613 00:33:15,560 --> 00:33:17,920 Speaker 1: I even thought, like, when we were doing that podcast, 614 00:33:18,040 --> 00:33:19,880 Speaker 1: like, Molly is a weird name for a guy. 615 00:33:20,000 --> 00:33:22,520 Speaker 1: I did too. Still, I thought it was an Irish thing, 616 00:33:22,800 --> 00:33:25,080 Speaker 1: and I thought, Molly Orshansky, I could see a guy 617 00:33:25,120 --> 00:33:27,360 Speaker 1: being named Molly Orshansky, right? So it was, it 618 00:33:27,400 --> 00:33:31,200 Speaker 1: was a mistake, I would say, much more notable or 619 00:33:31,280 --> 00:33:34,719 Speaker 1: noteworthy than being invisible, you know.
But successful, was that 620 00:33:34,760 --> 00:33:40,040 Speaker 1: she actually, um, has helped countless, uh, impoverished people in 621 00:33:40,040 --> 00:33:43,480 Speaker 1: the United States, absolutely, by creating this poverty line, which 622 00:33:43,520 --> 00:33:46,400 Speaker 1: basically forces the government's hand into saying, okay, if you're 623 00:33:46,440 --> 00:33:49,080 Speaker 1: below this, we're gonna help you. Right. Uh, and this 624 00:33:49,160 --> 00:33:52,960 Speaker 1: is largely due to her work as a mathematician and statistician. 625 00:33:53,160 --> 00:33:56,360 Speaker 1: True trailblazer. He was a great man. Hats 626 00:33:56,360 --> 00:33:59,240 Speaker 1: off to you, Mr. Orshansky. So if you want 627 00:33:59,360 --> 00:34:02,280 Speaker 1: to send in any emails taking us to task, or 628 00:34:02,320 --> 00:34:06,720 Speaker 1: pointing out an error, or just saying hi, whatever, gender confusion, 629 00:34:06,760 --> 00:34:09,480 Speaker 1: if you want to call us sexist mommy worshipers, whatever, 630 00:34:09,960 --> 00:34:13,440 Speaker 1: we accept all comers. You can send that to stuff 631 00:34:13,560 --> 00:34:19,480 Speaker 1: podcast at how stuff works dot com. For more on 632 00:34:19,560 --> 00:34:22,279 Speaker 1: this and thousands of other topics, visit how stuff works 633 00:34:22,320 --> 00:34:24,800 Speaker 1: dot com, and be sure to check out the Stuff 634 00:34:24,800 --> 00:34:26,640 Speaker 1: You Should Know blog on the how stuff works dot 635 00:34:26,680 --> 00:34:31,879 Speaker 1: com home page. Brought to you by the reinvented two 636 00:34:31,880 --> 00:34:34,399 Speaker 1: thousand twelve Camry. It's ready. Are you?