Speaker 1: Hey there, this is Marissa Streit. I am the CEO of PragerU, and you are about to listen to a special edition of Fireside Chat with Dennis Prager. Those of us here at PragerU are continuing to do the very important work that we've been doing with him for over ten years. We are educating millions of young people online, we have state partnerships to bring our content into schools, and so much more. You can head over to prageru.com to see the progress that we've been making. Also, I'd love to invite you to support our mission by donating to our 501(c)(3). It is a nonprofit. Your tax-deductible donation will go to an institution of higher learning that actually shares your values. So enjoy your podcast, and thank you so much for all your support.

Speaker 2: Hi everybody, I'm Dennis Prager, and welcome to my Fireside Chat. It's my home, it's my chair, it's my fire, it's my dog. Otto's in there. I'm telling you, you know what the beautiful thing is? Otto's fame has not gone to his head. Now.
He's one of the best-known dogs in the world right now, and nothing, it's like it has not affected him whatsoever. You've got to say this is a remarkable creature, which will be perfect in light of the subject I want to talk to you about. And by the way, "world" is pretty accurate, because I just want to note that recent Fireside viewers' comments came from Bulgaria, Saudi Arabia, Germany, Australia, the Czech Republic, Luxembourg, Scotland, and we'll have questions from other countries entirely as well. So that's... oh, I should read this, right, I've got to read this. So that's a... it's a good thing, because what I have to say is not bound to any ethnic or religious or national group. I've always said, either what you have to say is universally applicable or it's not applicable to anybody. That's my belief. It's been my belief my whole life. So it makes sense to me that there would be people listening in different parts of the world. Why not? Because there is one thing we can all share, and that's logic, common sense. If something makes sense, it makes sense. Okay, so anyway, welcome. Now listen to this.
So this is what I always do: I discuss something, and then I take your questions. So I am told by the folks who monitor the website that a while ago, when I discussed in passing (it wasn't the key subject of the opening, but I discussed it) the issue of whether you would save your dog or a stranger first if both were drowning. This has been one of the most controversial positions I have ever taken, and I have been asking this question since my twenties. I've been asking this my whole life, and the response is exactly the same today, forty-something years later, as when I asked it in my twenties, and it is just astonishing. In almost every instance, one third vote for the dog, one third for the stranger, and one third don't know, because I give the option: if you don't know what you would do, if you're torn, you can vote that way as well. So it's a third, a third, a third. But my point is, two thirds would not vote to save the human. And I want to emphasize something: stranger means stranger.
People say, well, what if Hitler were drowning? Then it's not a stranger. And my answer to that is very simple. If Hitler were drowning, I'd help him drown. It's irrelevant whether there's a dog drowning. Hitler deserved to die, therefore I would help him die. So it's not an answer to my question. My question is about a human you don't know. I mean, you might as well say, well, what if it's Hitler? Well, I might as well then counter, well, what if it's somebody who's on the brink of discovering a cure for cancer? That's not fair either on my part, because the point is it's a stranger. It's a human versus the dog you love, not a stranger versus a strange dog. Most people, I want to believe, choosing between a dog they don't know and a human they don't know, would save the human they don't know. But I'm not even sure about that anymore. Anyway, that's a separate issue. I want to address three comments that came in, and this is among listeners or viewers of this show. So often it's people who would generally agree with me but clearly didn't on this.
By the way, I'm just assuming that most people watching this tend to agree with me. I'm just as happy if none of you do. I want to touch people's lives whether they agree with me or not. Okay, but here are people who don't agree with me. So here are comments. For example... oh, I guess I should state my position first. Yes. So the reason I have always raised this is to raise two very critical points. One is what happens when you remove God and religion from your thinking in terms of moral decisions. For a person steeped in what we call Judeo-Christian values, that is, rooted in the Old Testament (Judeo) and the New Testament (Christian), which is one way of looking at the term Judeo-Christian among many others, for anyone steeped in those, there's no question: you cannot defend the choice to save the dog you love over a human being you don't know on the basis of anything in the Judeo-Christian world, anything in Judaism, anything in Christianity. None. Animals are not created in God's image. Period. Second, the second big lesson from this question is how people come to their decisions based on feelings.
I love my dog; I don't love the stranger. But you cannot base moral decisions on feelings. And again I return to what I use as my source of values, I admit it: the Bible, which constantly warns us not to rely on our heart to make these decisions. The heart is important; that's what makes us human. If you don't have feelings, you're a robot. But it's not the place for moral decision-making. So these are two huge lessons. That's the reason I've been asking this question my whole life: to make two huge points. What happens when you end the religious base, the Judeo-Christian basis, of our morality, and what happens when you rely on the heart? So here are three... is that right? We have three? Four? Okay, so here's one. These are anonymous; otherwise I would state a name and a city. "At any age, I would never sacrifice my dog for someone I do not know. My dog is my family. As a matter of fact, I would throw a stranger in front of a moving bus to save my dog." So that... oh, I see, the next one is not attached to this. That's another comment. Okay, all right.
First of all, that second part is a little scary. You would throw a human being in front of a moving bus to save your dog. You would murder a human to save your dog. By the way, again, in religious law, Judeo-Christian law, you can't murder a human to save your mother, forget your dog. You can't throw a human in front of a bus to save your own child. That's murder. You can't murder to save somebody. You can kill someone who is going to kill your child, of course; that's self-defense, that's a separate issue. But this is a scary thing. This is what happens when the heart takes over morality: I would murder a person in order to save my dog. So, okay. So this is clearly an example of the heart as the source of values. But where do you end? What if your heart says, I don't like women, so I would save a man but not a woman? What would you say to that? Or what if somebody said the other way around, a woman in solidarity with women: oh well, if the person drowning were a woman, I would save the woman.
But if it's a man, I'm going to save my dog. What would you say to that? Or black or white, or your nationality versus another nationality. What if the only thing you knew was, let's say you're French and you knew the stranger were French? Oh, then you would save the person. I mean, this is all... that's why feelings cannot be the source of values. There's got to be a right and a wrong. So this is somewhat of a scary answer to me. It's all emotional, it's purely emotion. And here's another one: why a dog? What if you love your rabbit? I know somebody who loves his rabbit. In fact, he's behind the camera right now, and he's not even embarrassed. Anyway. Really, I'm not joking, though. There are people who love hamsters, right? Mice, hamsters. People have affection, and I don't have any problem with that. When one of my boys was little, he had a... was it an iguana or a lizard? Do they sell iguanas at pet stores? Yeah? So then it was an iguana. I mean, that's a reptile. They're really primitive. But he was a very sweet iguana.
As these things worked out, anyway. So what would you say to someone who saved their reptile, their iguana, before a human being? Is there any point at which you, who defend saving the dog, would say, oh, that's wrong? Any animal? What if you have, I don't know, I'm not kidding, an affection for a butterfly? You have a parakeet? Certainly a parakeet; people have affection for their birds, and I get it. I have no problem; I have affection for him. I love the guy. But if he were drowning and a human being I didn't know were drowning, I would save the human being. So would my wife, who loves him even more than I do; she kisses him, which I find very odd. But she's not in the room, so I can say that. Anyway, I am making a critical point. How do we derive good and evil? How do we arrive at good and evil, right and wrong? It can't be the heart.
There has to be a transcendent code to which we are all obligated. And by the way, even if you want to be moved, or animated, by the heart in this instance: what if you let a young pregnant woman drown because you saved your dog? And the husband and the parents came to you and said, you let my daughter, you let my wife, die to save your dog? That should also be an emotional issue. But it's not fully fair for me to pose it that way, because remember, I said stranger, and I'm sticking to my word: stranger. We don't know who it is. But I pose the question to show that the heart is not the guide, it should not be the guide, and that humans created in God's image is a principle that I believe in and I affirm. If you don't affirm it, save anything you want instead of a human. "But I would throw a stranger in front of a moving bus to save my dog." That's troubling. Next: my dog over a stranger.
Hmm, you mean the very dog that sleeps by my side every night, greets me when I get home, helps me de-stress, makes me laugh? And a complete stranger I know nothing about? Hmm. My dog, capital D-O-G. Pure emotion again: the dog is loving. Okay, what I said before is reaffirmed. And this one is another... this third one was a scary one: I think you're a monster if you don't pick your own dog over a stranger. Really? Do you think the people who love the stranger that I let die, you think that they would think I'm a monster? They would think I'm a saint. They would think I was an angel. So who's right? And finally: my dog, every time. I know my dog is good and deserves to be saved. I don't know who that stranger is, or if I'm saving the next Hitler or a Hillary that could cause the world more pain. Anyway, I don't like Hillary Clinton, but I would never put her in the same category as Hitler, so I just had to make that point. But your dog isn't good. That's not true. Your dog is good to you; that doesn't mean your dog is good.
Dogs aren't bad either. Dogs don't have free moral will. No creature other than the human being does. So you can't describe an animal as good or evil. A lion that eats a person is not evil, a lion that eats a gazelle is not evil, and a lion that doesn't eat a gazelle is not good. Only humans can be good or evil, not animals. All right, that's the story. You may disagree with me, but you can't disagree... you can only disagree with the conclusion; you can't disagree with the argument. If you believe in a transcendent moral code, like the biblical code that has been the basis of Western civilization, then people are created in God's image and you must save the human. If you have no regard for that code, fine; I'm just pointing out what happens when we drop the code. And if you are guided by your heart rather than by a value system, that's the other problem. Okey-dokey, your questions. Mark, twenty-six, United Kingdom: How can the suppression of conservative voices in social media be stopped? PragerU is trying to stop it in the United States. We are.
We have sued Google, which owns YouTube, for putting one hundred of our videos, well, actually even more than one hundred of a whole variety of our videos, on their restricted list, which means that if your family has filters on your internet service to block out pornography and violence, we will be blocked. Is that not incredible? There's no pornography or violence at PragerU. But they don't agree with us, because we're not on the left, and they do it to many others as well. So you're right about the suppression of conservative voices in social media. It gives you an idea of what happens, every time, with no exception, whenever the left is in control: it suppresses other voices. There is no exception in the history of the left on planet Earth. Liberals allow people to speak; leftists do not. There is no exception to that. They wouldn't be leftists if they allowed other voices. There are many reasons for that. One of them is that liberty is not one of their values. Truth is not one of their values either.
The LA Times said a couple of years ago that it would not publish any letter, let alone article, by even an eminent climatologist who merely says, well, wait a minute, maybe the Earth isn't going to be destroyed in twelve years. That alone would not be allowed. There is no debate. They say, well, the science is... what is the word, closed? Settled. The science is settled. That's amazing. When is science settled on things that haven't happened yet? The science is settled. But there's always something that is settled. There were two transgender females, trans female high school students, that is, biological males who identify as females, who ran in a girls' race in Connecticut. They won first and second place. They set records for girls' races. But they're biological boys. Of course they beat all the girls, because they're biologically male. And yet a guy at The Nation, a big left-wing magazine in the United States, wrote on the internet that it is a fact, those are his words, it is a fact that trans females are females. It's a fact.
So therefore, in their view, someone who says, as I have just said, that it's not fair for these trans females, biological males, to compete against biological females, is running against facts, because it's a fact that they're female. But it's not a fact that they're female. It's a fact that they regard themselves as females, and I respect that; that's fine. But for the sake of competition in sports, they are not female. Their view, though, is that voices like mine should not even be allowed to be heard. Why would we allow non-fact-based statements to be made? That's how they view it. So that's why there's suppression of conservative voices in social media. Mike, thirty-one, LA: Hi. Ilhan Omar, that's the American congresswoman from Minnesota, a Muslim woman who came here with her parents from Somalia, said (I never heard this quote, but I'll believe that it was said): The hijab means power, liberation, beauty, and resistance. Would love to hear your thoughts on the hijab.
Well, I'll react to the comment, but before I react to it, let me tell you what I think we should all agree on: that the veil is anti-human, having people not show their face. The face is who you are. You're not your chest, you're not your legs; you are your face. That is the seat of the human personality. It is inhuman to have people hide their faces at all times, except obviously in their own house. It's not humane and it's not human; it's literally not human. When you see pictures of a bunch of women, all of whose faces are invisible, covered by black shrouds as it were, their humanity has been utterly compromised. For all you know, there's an animal inside there. Except for the eyes, and sometimes, very often, even the eyes are covered; you don't even know who's in there. It could be a man in there. You don't know; the person has become invisible. Making humans invisible is not beautiful. Now, that's not the hijab. We do see her face. The hijab covers the head and, I believe, the ears as well. Is that right, the ears are covered as well?
And look, I don't have strong feelings on that. The face is visible. If you wish to cover your hair and your ears, I respect that. I don't support it, but I respect it. I don't have the same view of it. What I don't understand is why it's power, liberation, beauty, and resistance. Resistance? What is the hijab resistance to? These are typical comments. This is not being said because she's Muslim; it's being said because she's leftist. These comments mean nothing, as with so many comments from the left; they're just made. What is it resistance to? I would like to hear the answer. What is it resistance to? Every woman who wears a hijab is engaged in resistance to... Donald Trump? To fascism? I mean, to what? Capitalism? Israel? What is it resistance to? Liberation? Look, you can wear it, and I respect your right to do so, and I don't regard it the way I regard the veil. But I don't know why you're liberated. What are you liberated from? Being regarded in any way sexually, because the hair is sexual? I assume that's what it means.
332 00:22:50,480 --> 00:22:54,120 Speaker 2: I can't think of any other. What else could it mean? 333 00:22:54,320 --> 00:22:59,200 Speaker 2: I don't know why that's liberating. Look, it's 334 00:22:59,280 --> 00:23:01,760 Speaker 2: not the only religion 335 00:23:01,800 --> 00:23:04,920 Speaker 2: that has that, and I respect it, but I don't know 336 00:23:04,960 --> 00:23:09,720 Speaker 2: why it's liberating. It's a form of modesty, if you will, 337 00:23:09,920 --> 00:23:15,720 Speaker 2: and that's fine. But I don't know why it's liberation. Power? 338 00:23:15,760 --> 00:23:18,119 Speaker 2: I don't know what the power is. Okay? 339 00:23:18,200 --> 00:23:24,000 Speaker 2: These are all words. Beauty? Okay, that's 340 00:23:24,080 --> 00:23:28,719 Speaker 2: as odd as resistance. I 341 00:23:28,720 --> 00:23:35,360 Speaker 2: can understand why you want to suppress your attractiveness, which 342 00:23:35,400 --> 00:23:38,800 Speaker 2: is what is being done. A woman's hair is a 343 00:23:38,920 --> 00:23:42,919 Speaker 2: source of her beauty, so 344 00:23:43,119 --> 00:23:50,520 Speaker 2: it's ridiculous to say it's beautiful, that it means beauty. The 345 00:23:50,560 --> 00:23:53,640 Speaker 2: whole point is to suppress your beauty. And I respect that. 346 00:23:53,680 --> 00:23:56,439 Speaker 2: I said that's fine, but you can't have it 347 00:23:56,480 --> 00:24:00,240 Speaker 2: both ways. I'm going to suppress my beauty. Isn't 348 00:24:00,240 --> 00:24:06,159 Speaker 2: that beautiful? Maybe it's morally beautiful or religiously beautiful, but 349 00:24:06,200 --> 00:24:12,199 Speaker 2: the beauty meant here is, I presume, physical beauty. 350 00:24:12,440 --> 00:24:15,440 Speaker 2: You can't have it both ways. It's either suppressing physical 351 00:24:15,480 --> 00:24:18,840 Speaker 2: beauty or expressing physical beauty.
Can't have it both ways. 352 00:24:20,040 --> 00:24:23,760 Speaker 2: But so much gets said these days that means nothing, 353 00:24:25,000 --> 00:24:29,359 Speaker 2: and it almost always emanates from the left. You know, 354 00:24:29,440 --> 00:24:34,120 Speaker 2: like calling people with whom they differ racist, sexist, misogynistic. 355 00:24:34,920 --> 00:24:39,800 Speaker 2: It's just the employing of words for effect, but 356 00:24:40,000 --> 00:24:48,920 Speaker 2: not meaning. How are we doing on time? Haven, twenty, 357 00:24:49,040 --> 00:24:53,000 Speaker 2: in Branson, Missouri: What can we do in our everyday 358 00:24:53,040 --> 00:24:59,360 Speaker 2: lives to make America great again? Get married, make a family, 359 00:25:00,080 --> 00:25:04,840 Speaker 2: support your community, do work to help your fellow citizen, 360 00:25:07,359 --> 00:25:13,240 Speaker 2: and support all the causes that understand, 361 00:25:13,760 --> 00:25:17,680 Speaker 2: or that agree with Abraham Lincoln, that America is the 362 00:25:18,440 --> 00:25:23,240 Speaker 2: last best hope for mankind. Lincoln actually put it, the last 363 00:25:23,320 --> 00:25:29,800 Speaker 2: best hope, I think, of Earth. You may think 364 00:25:29,960 --> 00:25:32,560 Speaker 2: Lincoln's wrong, in which case this is an irrelevant question. 365 00:25:32,640 --> 00:25:35,480 Speaker 2: I think Lincoln is right, because it's the only country 366 00:25:35,480 --> 00:25:40,560 Speaker 2: that was rooted in individual liberty. That's beautiful. That's empowering. 367 00:25:44,280 --> 00:25:48,480 Speaker 2: Ethan, sixteen, Tampa, Florida: Which founding father inspired you most? 368 00:25:48,560 --> 00:25:51,240 Speaker 2: They all did. I don't have one in particular. I 369 00:25:51,280 --> 00:25:54,480 Speaker 2: think that the constellation of greatness in the founding years 370 00:25:54,480 --> 00:26:01,520 Speaker 2: of the United States is almost divine.
The extraordinary men. 371 00:26:02,760 --> 00:26:05,520 Speaker 2: Obviously there were extraordinary women in their lives, but it 372 00:26:05,560 --> 00:26:08,800 Speaker 2: was men who wrote these documents and founded this country. 373 00:26:09,480 --> 00:26:12,040 Speaker 2: It is an astonishing thing, if you read these people 374 00:26:14,160 --> 00:26:21,960 Speaker 2: or read about them: Washington and Adams and Franklin and 375 00:26:22,400 --> 00:26:29,560 Speaker 2: Madison and Jefferson. You know, there were students 376 00:26:29,560 --> 00:26:32,400 Speaker 2: at Hofstra University in the United States, I think it's 377 00:26:32,560 --> 00:26:36,240 Speaker 2: on Long Island. They want to take down the Thomas 378 00:26:36,280 --> 00:26:41,879 Speaker 2: Jefferson statue. These nothings want to take down the statue 379 00:26:42,440 --> 00:26:48,040 Speaker 2: of someone great. Yes, he had slaves. Everybody had slaves, 380 00:26:48,320 --> 00:26:52,360 Speaker 2: or nearly everybody in history who could afford it had slaves. 381 00:26:52,440 --> 00:26:58,880 Speaker 2: It's evil, but he created the society that ended slavery. 382 00:27:00,119 --> 00:27:02,720 Speaker 2: You know that more blacks have come to the United States... 383 00:27:03,320 --> 00:27:06,000 Speaker 2: this was true thirty years ago, this is not recent. 384 00:27:07,280 --> 00:27:09,800 Speaker 2: Thirty years ago, already, more blacks had come to the 385 00:27:09,880 --> 00:27:15,320 Speaker 2: United States from Africa willingly and voluntarily than came here 386 00:27:15,359 --> 00:27:20,959 Speaker 2: involuntarily as slaves. Why? Are they stupid? Why would they 387 00:27:20,960 --> 00:27:25,200 Speaker 2: go to a racist country? But they're not stupid.
They 388 00:27:25,240 --> 00:27:28,600 Speaker 2: know how well they'll be treated in America, because the 389 00:27:28,680 --> 00:27:31,359 Speaker 2: vast majority of Americans don't give a hoot about color, 390 00:27:32,440 --> 00:27:40,960 Speaker 2: which is the way it should be. Adam, sixteen, Canada: 391 00:27:41,040 --> 00:27:43,560 Speaker 2: My question is, would you mind if we trade leaders? 392 00:27:43,960 --> 00:27:47,280 Speaker 2: You're funny. For a sixteen-year-old, that's a pretty 393 00:27:47,280 --> 00:27:54,159 Speaker 2: funny question. Actually, it'd be pretty funny if you were thirty-six. Listen, 394 00:27:54,200 --> 00:27:57,280 Speaker 2: I wanted to trade leaders the last time, when you 395 00:27:57,359 --> 00:28:00,160 Speaker 2: had Harper and we had Obama. I wanted to trade leaders. 396 00:28:00,240 --> 00:28:03,080 Speaker 2: But I don't want to trade them now. It's true, 397 00:28:03,640 --> 00:28:09,919 Speaker 2: we really did a swap-a-roo, didn't we? I often 398 00:28:10,000 --> 00:28:13,159 Speaker 2: said that. I said, oh, Canada has this great leader, 399 00:28:13,200 --> 00:28:16,879 Speaker 2: Stephen Harper. By the way, Stephen Harper's made a video, 400 00:28:16,920 --> 00:28:19,160 Speaker 2: the former prime minister of Canada. We have a number 401 00:28:19,200 --> 00:28:23,159 Speaker 2: of prime ministers who made videos for PragerU. People 402 00:28:23,160 --> 00:28:26,160 Speaker 2: who write these terrible things about PragerU, do they 403 00:28:26,320 --> 00:28:33,159 Speaker 2: know that prime ministers make videos for us? But anyway, no, 404 00:28:33,400 --> 00:28:36,160 Speaker 2: I wouldn't mind if we traded leaders. That is correct. 405 00:28:37,000 --> 00:28:38,920 Speaker 2: But it would be funny if Donald Trump were the 406 00:28:38,960 --> 00:28:43,680 Speaker 2: Canadian leader. I wonder how laid-back Canadians would react to that.
407 00:28:45,360 --> 00:28:49,080 Speaker 2: All right, my friends, we'll take some of these other 408 00:28:49,680 --> 00:28:56,560 Speaker 2: comments later. Well, I hope that the dog-stranger thing 409 00:28:56,680 --> 00:28:59,400 Speaker 2: has at least prompted you to think. I don't know 410 00:28:59,440 --> 00:29:03,040 Speaker 2: if I'll change many minds, but it's a very 411 00:29:03,080 --> 00:29:05,760 Speaker 2: big deal. That's why I mentioned it, and that's why 412 00:29:05,800 --> 00:29:09,640 Speaker 2: I took your comments. We love your comments. What 413 00:29:09,800 --> 00:29:12,200 Speaker 2: should people do? Send them in to PragerU? What do 414 00:29:12,280 --> 00:29:17,440 Speaker 2: they do with questions? We put them out on Instagram 415 00:29:17,800 --> 00:29:23,760 Speaker 2: or an email, right? So it's great to hear from you. 416 00:29:24,480 --> 00:29:26,920 Speaker 2: I meet a lot of you at airports and around 417 00:29:26,960 --> 00:29:29,320 Speaker 2: the world, and it's very touching, and I'm very happy 418 00:29:29,320 --> 00:29:31,000 Speaker 2: to have a selfie with you if you'd like it. 419 00:29:31,200 --> 00:29:34,600 Speaker 2: It's not a bother. I'm very touched by it. So 420 00:29:34,800 --> 00:29:38,000 Speaker 2: until next week, I'm Dennis Prager, and from my house 421 00:29:38,040 --> 00:29:39,480 Speaker 2: to yours, thank you for watching. 422 00:29:39,480 --> 00:29:43,400 Speaker 3: Tomorrow on Timeless Wisdom with Dennis Prager: 423 00:29:43,880 --> 00:29:46,920 Speaker 2: The irony is your kids will even enjoy you more 424 00:29:47,080 --> 00:29:49,880 Speaker 2: if you don't depend on them for your happiness. I 425 00:29:50,080 --> 00:29:54,440 Speaker 2: enjoy my parents precisely because they do have their own 426 00:29:54,560 --> 00:30:00,120 Speaker 2: way to entertain themselves. They don't rely on me, they don't rely 427 00:30:00,280 --> 00:30:08,240 Speaker 2: on my brother.
You don't know how much I enjoyed 428 00:30:08,240 --> 00:30:10,280 Speaker 2: that line. You have no idea. 429 00:30:11,040 --> 00:30:14,760 Speaker 3: Join us tomorrow to hear more on Timeless Wisdom with 430 00:30:14,920 --> 00:30:20,400 Speaker 3: Dennis Prager. This has been Timeless Wisdom with Dennis Prager. 431 00:30:20,840 --> 00:30:24,960 Speaker 3: Visit Dennisprager dot com for thousands of hours of Dennis's lectures, 432 00:30:25,040 --> 00:30:28,960 Speaker 3: courses, and classic radio programs, and to purchase Dennis Prager's 433 00:30:29,080 --> 00:30:30,160 Speaker 3: Rational Bible.