Speaker 1: Good morning, peeps, and welcome to WOKF Daily with me, your girl, Danielle Moody, pre-recording from the home bunker. Folks, it is time for rest and respite, and so I'm very excited to bring you a lot of pre-recorded fantastic interviews and solo convos that I will be having while WOKF is out on break. And today I am super excited for you all to hear this conversation with Andy Norman, who is an award-winning author of the book Mental Immunity. He is also the co-founder and CEO of the Mental Immunity Project. What is that, you ask? Let me tell you. The Mental Immunity Project aims to reduce the public's susceptibility to bad information, extremism, pseudoscience, conspiracy theories, propaganda, and more by equipping them with the skills they need to identify and reject misleading or manipulative content. Doctor Norman is the award-winning author of Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think. His work has appeared everywhere from Scientific American to Psychology Today, Free Inquiry, The Humanist, and other places.
Speaker 1: I was so excited to delve into this conversation with Doctor Norman because, I mean, this is the conversation that we needed to be having back in twenty fifteen. This is the conversation that needed to be prevalent across the media about how to deal with an entity like Donald Trump, whom the truth clearly eludes at all times; what to do in order to protect yourself from habitual liars and an entire party that would create an entire apparatus, one that included one of the most widely watched networks, to create a feeder system for their lies, right, that would then expose millions upon millions of people to what Doctor Norman refers to as mind parasites, these things that kind of get into your brain and begin to eat at your ability to decipher between right and wrong, between truth and fiction. When you plant seeds, as Donald Trump and the Republican Party have done, to destroy faith in agencies and in what you see with your own eyes, you begin to believe nothing, which makes it easier for an authoritarian to take over and just take back control.
Speaker 1: Because as Doctor Norman and I will begin to discuss, it is very hard, right, in an open society, which a democracy is, right, where we are having conversations about how we want our society to look as opposed to it being dictated to us from up on high. But when you have those avenues of communication and information corroded and eroded with series after series of lies and gaslighting and misinformation, discerning the truth becomes really difficult, as we have seen. And so here in this book and with his project, Doctor Norman really talks about the ways that we can protect ourselves and the ways that we can protect those around us. Check out this conversation coming up next with Doctor Andy Norman.
Speaker 1: Folks, I am very happy to welcome to WOKF Doctor Andy Norman, who is the award-winning author of Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think, and is the co-founder and CEO of the Mental Immunity Project, which aims to reduce the public's susceptibility to bad information, extremism, pseudoscience, conspiracy theories, propaganda, and more by equipping them with the skills they need to identify and reject misleading or manipulative content. Doctor Norman, let me say this: your work is needed now more than ever, which I'm sure you are well aware of and know.

Speaker 2: Yeah, thank you.

Speaker 1: And I want to ask you, I guess to start off with: you know, we are living in such a polarized time, and such a dangerous time as it pertains to the information silos that we live in. Yes, the information silos, meaning that we are able to find whatever quote-unquote truth aligns with our own feelings and desires depending on what platform we decide to go to. We no longer live in a time when there are four channels that we're all watching to get the same news from.
Speaker 1: You can get whatever news you want. And so, first off, that doesn't seem to be changing; it only seems to be increasing, right? So talk to us: what is at stake if we don't have the agility to be able to create mental immunity when the disinformation is coming at such an expedited rate?

Speaker 2: Yeah. So the more scholars study the way misinformation spreads through societies, through cultures, the more they realize it behaves an awful lot like a disease. And our minds actually struggle with mis- and disinformation in pretty much the same way that our bodies' immune systems struggle with pathogens. In fact, experts now take the analogy so seriously that they're starting to think of bad ideas as mind parasites of a sort. And by bad ideas here I just mean misinformation or falsehoods, or even, you know, ideas that spread hate and dysfunction. Those ideas too can be thought of as parasites of the mind.
Speaker 2: The good news is that our minds evolved in a rich stew of problematic ideas, and we actually have the capacity to become very good at discriminating between the good stuff and the bad stuff. So our minds actually have evolved immune systems that, under the right conditions, can do a really good job of keeping our minds relatively infection-free. It's important to note that nobody has a mind completely free of mind infections. We all harbor bad ideas, and if we don't take that to heart and bring the kind of humility needed to unlearn things, then we're really going to struggle in this day and age. We think that what we call subtractive learning, learning to let go of ideas that probably don't measure up, is as important as inputting new information that should be added to the mind's knowledge stockpile.

Speaker 1: Let's dig into the idea of mind parasites for a minute. As somebody who loves sci-fi like I do, it conjures ideas in my mind, you know, of aliens, and pretty creepy ones.
Speaker 1: So can you just explain what you mean by the concept and the term, so that people get a better sense of what that is?

Speaker 2: Yeah, well, the first thing I'll say is that I'm glad the concept is different enough to catch your attention. The second thing I'll say is don't be too freaked out about it: the bad ideas that have crept into your mind all these times aren't suddenly going to come alive and, you know, eat their way through your brain. It's nothing like that. But the fact is, ideas don't always serve the host, the minds that host them. They don't always serve them well. Philosophers have been reflecting deeply on this fact for a long time and trying to develop methods to better weed out the problematic ideas from the good ones. And some of the most powerful methods involve just learning to ask good questions: learning to listen to your doubts, express them with questions, and especially the common use of clarifying questions. And that can go a long way towards improving your thinking and helping you make better decisions.
Speaker 1: You know, right now, I think that what makes me excited but also nervous about the work that you do is that it requires a desire to want to think and be better, right? It requires a desire, like you had said earlier, to unlearn, and a humility to unlearn bad ideas. And we are living in a time of strongmen, right, that are spreading disinformation, and humility is not a part of that package, right? Toxic masculinity and aggression are actually a part of that package. So to say to myself, hmm, let me ask myself questions, as opposed to, I know everything that there is to know, is almost like the first step in order to get to the place of mental immunity. And so how do we navigate that? Because someone like me, and listeners to this show, they are about expansion, right? They are about learning more and being curious.

Speaker 2: And my guess is most of your listeners bring a fair amount of humility to it.

Speaker 1: Right, right, right. And I think that we're at a time of there being a lack, and almost a celebration of the lack, of intellectual curiosity.
Speaker 1: Let me just follow like sheep.

Speaker 2: Yeah, yeah. I mean, you can see how, for example, Rush Limbaugh was a radio host who would use bombast and overconfidence. His overconfidence, his lack of humility, was really appealing to people because, I imagine, it felt to some as though it was a way of orienting yourself in a confusing world. I mean, this guy at least has strong opinions and they don't waver, and he sticks by his guns, so I want to be like that. That can be really disorienting, both morally and practically. The people most worth emulating, the people most worth listening to and learning from, are extremely humble and are willing to rethink things. And as our world grows more and more complex in terms of the information that bombards us, we all need to learn to rethink things, and to bring the kind of open-mindedness and humility that allows people to do that. Philosophers have noticed for thousands of years that the least humble people are often society's biggest problems.
Speaker 2: Right. So if you want to be on the side of the angels, so to speak, instead of saying, yeah, I know this for sure, say, I think that's true; I mean, last time I checked it seemed like so. One thing really good thinkers do is they try not to think in black and white, or in absolute terms. They think in shades of gray. So if you're considering saying something or asserting something, it's tempting to just want to say it in very stark terms, because you sound confident, you sound decisive, and people admire that. But if the truth is better served by saying, you know, I'm like eighty-five percent confident this is true, but of course there's a possibility I might be wrong, then learning to say that to yourself, and even saying it to others, can make you part of the solution instead of part of the problem. We're at a moment in history where our entire culture needs to move away from sort of absolutist thinking and become more sensitive and actually better listeners.
Speaker 2: One of the most important skills in all of this is learning how to listen with humility and learn from people whose views are different from ours.

Speaker 1: You know, I often say on this show and others that, from the political perspective, the foundation of a democracy is based in critical thinking, right? It is based in the citizenry's ability to think critically about who they are choosing to represent them, right, and how they are going to be best served. And what I see now is a society, not just here but globally, that is really being driven by fear. And I think about fear, honestly, Doctor Norman, as a different kind of mental parasite.

Speaker 2: I think that's right. So emotions can spread by contagion, right? If somebody in the room, in the crowd where you are, is starting to freak out and act scared, you can actually contract that fear. It can spread through a crowd almost like a disease. And so emotions can spread virally, so to speak. And fear is one of the most... Fear doesn't bring out the best in people.
Speaker 2: No. Fear, hate, resentment, these are the emotions that tend to make us the worst versions of ourselves. Compassion, sympathy, patience, these are the qualities that tend to bring out the best in people. And our fast-moving information world seems to reward people who are quick and decisive and confident rather than people who are careful and cautious and slow.

Speaker 1: I wonder, then, if fear, too, spreads like a virus, and you have a political party, for instance, that has weaponized that fear in order to control the masses, would it be that it isn't just enough to shut it off, right? That there has to be a formulation that looks like deprogramming? Because if it is a virus and I've already come into contact with it, then it's already spreading around my system. Just shutting down doesn't stop the virus from moving. And so I'm wondering, you know, what it looks like, because, again, this comes with awareness and consciousness. You have to be aware that you've caught the cold before you can rid yourself of it.
Speaker 1: So I'm really curious as to what it looks like then for those people. Let's say, I'll use this for reference on this show: the January sixth people who were convicted. When asked now, many of them are just like, they didn't know what they were doing, they got caught up, right? That is the defense that they have, whether or not they believe that, or whether or not they're using it as a way to lessen sentences. There are some that have testified about needing to deprogram themselves, so I'm just curious as to what that looks like.

Speaker 2: Yeah. I mean, one of the things we're learning is that it's a lot harder to deprogram, say, a cult member than it is to give people the skills they need to prevent themselves from being seduced by the cult leader in the first place. So prevention: an ounce of prevention is worth a pound of cure.
Speaker 2: A little bit of effort can go a long way to prevent people from being exploited by, say, manipulative messaging. But it's much harder after people have bought into that manipulative message and formed an identity around it; then they fight like heck to remain the sucker, or the dupe, of the person who's manipulating them with information. The astrophysicist Carl Sagan once said that, you know, once you give your belief, or your credulity, to a charlatan, you almost never get it back. Propagandists and conspiracy theorists and cult leaders have ways of hacking into your mind, winning your allegiance, preventing you from really thinking for yourself. And of course we can't run a democracy if people are falling prey to disinformation peddlers left and right. And of course this is one of the deepest challenges for any democracy, because in an open society we believe that people should be able to speak their minds, should be able to raise criticisms. But right now we've treated that idea as so sacred that we're allowing people to weaponize information in ways that harm others.
Speaker 2: Right. So in just the same way that it's not okay to, I don't know, hypnotize and brainwash somebody, it's not okay to set up an Info Wars platform and brainwash tens of millions of people. In fact, if the former is problematic, the latter is millions of times more so.

Speaker 1: So what does it look like then? You know, with the time we have, what does it look like as we're heading into an election year, right? I can't stress enough that it's going to be the most consequential of our lifetimes, about whether we hold on to democracy or America falls to authoritarianism. And what does it look like when people know, right? I'm talking to the seventy percent, because I personally believe that the thirty percent that have become hypnotized by Trumpism, MAGA-ism, that they are not coming back, right? They are very hard to reach; that is my belief. But there are seventy percent, the majority of people. How do they prepare themselves to be mindful of the disinformation and not fall prey to it?
Speaker 2: Let me offer what I think is maybe the highest-impact thing that we as a nation need to realize. There are people out there who are peddling counter-narratives just to create confusion, to make people feel resignation and just give up, and then not vote. So if you feel like, yeah, the Dems say this, but the Republicans say that, who's to say? It's all bullshit. And then you just don't even exercise your right to vote. If you do that, the disinformation peddlers have won, because they've manipulated you into not exercising your judgment, into not using your ability to think for yourself to help protect our freedoms and our democracy. So I would urge your listeners to realize that there will be a crazy lot of inflammatory information flying around as we approach the twenty twenty-four election. Tensions will run high. People will be fearful, angry, resentful; all of these emotions will run high. Realize that you don't have to let that information trigger you.
Speaker 2: Keep your calm, keep your cool, be a good citizen. Get out and exercise your right to vote for the party, the one party in our nation anymore, that is actually trying to do the best for all of us, rather than just trying to glorify a leader who will take advantage of anyone and everyone. So don't despair is the main thing. Stay strong, get out there, help your neighbors vote. If you can, donate to a political cause. I mean, Trump was raising lots of money to get reelected, and if he takes the White House again, I'm not at all sure our democracy will survive. And we who care about our democracy need to rise up in huge numbers on election day. And it's time to prepare yourself to do that now, and not to let all the bullshit that's going to overwhelm us for the next few months deter you from doing the right thing.

Speaker 1: I mean, I can't agree more.
Speaker 1: This is the drum that I beat on this show every single day, because I do think, you know... and this is the last question that I have for you, because I do think, like, fear, despair, and hopelessness, too, can be viral.

Speaker 2: Yeah, absolutely.

Speaker 1: And I wonder for you, you know, what you would offer to the audience: that by plugging in every single day, seeing violent, horrific wars, seeing death, and just feeling like they cannot hold this grief anymore, they want to shut down, right? They want to not discuss: I don't want to talk about politics, I don't want to talk about these things. And it's the conversation that allows us the ability to expand. So what do you offer to those people that are struggling with hopefulness?

Speaker 2: Well, I have to remind myself almost every day that the newspaper is a biased sample of stories, about problems mostly, right? And when people quietly resolve things and solve problems, a lot of times it doesn't make the news.
Speaker 2: And so our information diets tend to make us more pessimistic and despairing than we probably ought to be. So it's worth just remembering that, stepping back from it, and saying, yeah, if all I read is the news, it's natural I'm going to feel kind of down. But there are lots of things going on behind the scenes that are pushing humanity in the right direction. So stay strong; keep hope alive.

Speaker 1: That was a perfect place to end, on your call to action to keep hope alive. Folks, I will say that the book Mental Immunity: Infectious Ideas, Mind Parasites, and the Search for a Better Way to Think is absolutely worth the read for those of you who are trying to stave off disinformation but also raise your levels of hopefulness. And also do check out the Mental Immunity Project, because I think that it is absolutely worth the discussion, particularly during this season when we're gathering with other folks. Doctor Andy Norman, thank you so much for making the time for WOKF. I appreciate you.

Speaker 2: Thank you, Danielle. Keep up the good work.
Speaker 1: That is it for me today, dear friends, on WOKF. As always, power to the people, and to all the people, power. Get woke and stay woke as fuck.