1 00:00:05,920 --> 00:00:09,520 Speaker 1: Hi, and welcome back to the Carol Markowitz Show on iHeartRadio. 2 00:00:09,840 --> 00:00:11,120 Speaker 2: I'd like to start. 3 00:00:10,800 --> 00:00:14,360 Speaker 1: Off with an apology for the lack of monologue on 4 00:00:14,600 --> 00:00:18,360 Speaker 1: Monday's show. I skip the monologue sometimes when I'm on vacation 5 00:00:18,640 --> 00:00:21,400 Speaker 1: or traveling, and I let you guys know. 6 00:00:21,880 --> 00:00:23,520 Speaker 2: But this was a bit different. 7 00:00:23,800 --> 00:00:26,960 Speaker 1: After the last few episodes of talking about how elections 8 00:00:27,040 --> 00:00:30,520 Speaker 1: make people crazy, Donald Trump was the target of a 9 00:00:30,560 --> 00:00:34,920 Speaker 1: second assassination attempt on Sunday. My friend John Cardillo actually 10 00:00:34,960 --> 00:00:37,279 Speaker 1: broke the story on X and there was a lot 11 00:00:37,320 --> 00:00:41,680 Speaker 1: of confusion about what had happened. The early reports were 12 00:00:41,920 --> 00:00:44,680 Speaker 1: that there was a shootout outside the golf course where 13 00:00:44,680 --> 00:00:47,720 Speaker 1: he was playing in West Palm Beach that had nothing 14 00:00:47,720 --> 00:00:51,800 Speaker 1: to do with the president. After nearly three years in Florida, 15 00:00:52,120 --> 00:00:55,960 Speaker 1: our anniversary is coming up in January. I'm not an expert, 16 00:00:56,640 --> 00:00:59,520 Speaker 1: but I've spent some time in West Palm Beach and 17 00:00:59,720 --> 00:01:02,920 Speaker 1: it's not a place where daytime shootouts are a thing, 18 00:01:03,280 --> 00:01:07,840 Speaker 1: even in the sketchy areas. So that story immediately 19 00:01:07,920 --> 00:01:11,319 Speaker 1: sounded suspicious to me. And then I got a tip, 20 00:01:11,760 --> 00:01:15,080 Speaker 1: a very reliable tip.
It was a picture of the 21 00:01:15,120 --> 00:01:18,120 Speaker 1: would-be shooter as he was being arrested by police. 22 00:01:18,560 --> 00:01:20,560 Speaker 2: I posted his photo. 23 00:01:20,360 --> 00:01:24,120 Speaker 1: On my social media on X and some information that 24 00:01:24,160 --> 00:01:28,759 Speaker 1: I had about him, and then my day just blew up. 25 00:01:29,360 --> 00:01:31,080 Speaker 2: It got millions of views. 26 00:01:31,280 --> 00:01:34,600 Speaker 1: My phone started ringing and didn't stop. I went on 27 00:01:35,160 --> 00:01:38,000 Speaker 1: Megyn Kelly's show on Sunday as the news was breaking, 28 00:01:38,120 --> 00:01:41,480 Speaker 1: and then on Clay and Buck the following day. Breaking 29 00:01:41,600 --> 00:01:45,759 Speaker 1: news isn't my beat per se, but I have done 30 00:01:45,800 --> 00:01:48,960 Speaker 1: it in a few high-profile cases in the past. 31 00:01:49,400 --> 00:01:53,919 Speaker 1: It's extremely nerve-racking. I mean, what if I'm wrong? 32 00:01:54,680 --> 00:01:55,640 Speaker 2: I'm a columnist. 33 00:01:56,160 --> 00:01:59,280 Speaker 1: That's my day job. I write my opinion. I don't 34 00:01:59,280 --> 00:02:00,880 Speaker 1: worry too much about being wrong. 35 00:02:01,360 --> 00:02:02,120 Speaker 2: I mean, I. 36 00:02:02,080 --> 00:02:04,760 Speaker 1: Still have facts with my opinions, but I double and 37 00:02:04,800 --> 00:02:07,960 Speaker 1: triple check everything, and look, I've had opinions I no 38 00:02:08,080 --> 00:02:10,520 Speaker 1: longer hold. But that just means I get to write 39 00:02:10,520 --> 00:02:13,560 Speaker 1: another column on how I got to the new opinions 40 00:02:13,600 --> 00:02:17,360 Speaker 1: and why I discarded the old ones. Breaking news is 41 00:02:17,400 --> 00:02:22,200 Speaker 1: a totally different animal. There's also the avalanche of hate 42 00:02:22,560 --> 00:02:25,640 Speaker 1: that comes with being a public figure in general.
But 43 00:02:25,960 --> 00:02:28,720 Speaker 1: when one of your posts gets millions of views, it 44 00:02:28,800 --> 00:02:32,200 Speaker 1: really brings out the crazies, especially if that post is 45 00:02:32,240 --> 00:02:36,760 Speaker 1: about an attempted assassination of a presidential candidate. I posted 46 00:02:36,919 --> 00:02:40,920 Speaker 1: this on my socials, but if you enjoy someone's work, 47 00:02:41,200 --> 00:02:43,880 Speaker 1: not just mine, anybody you follow that is living a 48 00:02:43,919 --> 00:02:47,119 Speaker 1: public life, do your best to support them. We get 49 00:02:47,160 --> 00:02:50,080 Speaker 1: a lot of bad stuff thrown at us, and look, 50 00:02:50,160 --> 00:02:52,200 Speaker 1: I still think it's worth it. I still love what 51 00:02:52,240 --> 00:02:55,440 Speaker 1: I do, but some days it does get hard. All 52 00:02:55,480 --> 00:02:58,840 Speaker 1: the ways to support my work, mostly for free, are 53 00:02:58,880 --> 00:03:02,440 Speaker 1: in my pinned tweet on X. And another thing you 54 00:03:02,480 --> 00:03:06,320 Speaker 1: can do, obviously again not just for me, is retweet 55 00:03:06,360 --> 00:03:09,880 Speaker 1: people, comment on their stuff, subscribe to their podcast, leave 56 00:03:09,919 --> 00:03:12,919 Speaker 1: them reviews. All of it's free, doesn't take a lot. 57 00:03:12,800 --> 00:03:15,280 Speaker 2: Of effort, and it's very appreciated. 58 00:03:15,840 --> 00:03:18,679 Speaker 1: The pinned tweet on X involves listening to the 59 00:03:18,800 --> 00:03:23,160 Speaker 1: show or subscribing to my Substack. Nothing crazy. Anyway, that's 60 00:03:23,200 --> 00:03:26,280 Speaker 1: why I didn't have a monologue. I was knee-deep 61 00:03:26,400 --> 00:03:29,720 Speaker 1: in the attempted assassination story and I ran out of 62 00:03:29,760 --> 00:03:31,960 Speaker 1: time to record it.
To make it all up to you, 63 00:03:32,320 --> 00:03:35,320 Speaker 1: I'm going to tell you something that isn't otherwise public yet. 64 00:03:35,560 --> 00:03:39,200 Speaker 1: Launching this Tuesday, I'm going to start a second podcast, 65 00:03:39,520 --> 00:03:42,560 Speaker 1: this one about politics, and I'll be co-hosting with 66 00:03:42,640 --> 00:03:45,640 Speaker 1: my good friend Mary Katharine Ham. I'm still going to 67 00:03:45,640 --> 00:03:48,800 Speaker 1: continue doing this show, which is largely not about politics, 68 00:03:49,240 --> 00:03:52,960 Speaker 1: but especially now I need a political outlet too, and 69 00:03:53,320 --> 00:03:57,240 Speaker 1: MK and I have very similar worldviews. We hope to 70 00:03:57,280 --> 00:03:59,839 Speaker 1: be funny and to bring you perspectives you don't hear 71 00:04:00,080 --> 00:04:03,840 Speaker 1: anywhere else. The show is called Normally. It will run 72 00:04:03,880 --> 00:04:06,400 Speaker 1: on the same network as this, the Clay Travis and 73 00:04:06,440 --> 00:04:10,520 Speaker 1: Buck Sexton podcast network on iHeartRadio. It will be posted 74 00:04:10,520 --> 00:04:13,880 Speaker 1: on Tuesdays and Thursdays, and just like this show, you 75 00:04:13,920 --> 00:04:16,920 Speaker 1: can hear it anywhere you get your podcasts. We talk 76 00:04:16,960 --> 00:04:20,200 Speaker 1: about balance a lot on this show. I think it's 77 00:04:20,240 --> 00:04:23,240 Speaker 1: important to take breaks from politics, as I try to 78 00:04:23,279 --> 00:04:26,080 Speaker 1: do on the Carol Markowitz Show, but there's also time 79 00:04:26,200 --> 00:04:29,800 Speaker 1: to talk about the political issues, the election horse race, 80 00:04:29,880 --> 00:04:32,239 Speaker 1: and so on. We'll be doing that on the Normally 81 00:04:32,320 --> 00:04:35,320 Speaker 1: podcast and hope you will subscribe when it's up. Thank 82 00:04:35,360 --> 00:04:38,279 Speaker 1: you so much for listening.
I appreciate you all so much. 83 00:04:38,680 --> 00:04:41,560 Speaker 1: Coming up next, an interview with T. Beckett Adams. 84 00:04:41,720 --> 00:04:43,080 Speaker 2: Join us after the break. 85 00:04:46,480 --> 00:04:50,120 Speaker 3: Welcome back to the Carol Markowitz Show on iHeartRadio. My 86 00:04:50,200 --> 00:04:54,080 Speaker 3: guest today is T. Beckett Adams. Beckett is the program 87 00:04:54,160 --> 00:04:57,520 Speaker 3: director of the National Journalism Center, and he's a columnist 88 00:04:57,560 --> 00:05:00,560 Speaker 3: at The Hill, National Review and The Washington Examiner. 89 00:05:00,720 --> 00:05:02,640 Speaker 2: Hi Beckett, so nice to have you on. 90 00:05:03,040 --> 00:05:04,200 Speaker 4: Hey, Carol, thanks for having me. 91 00:05:04,520 --> 00:05:06,800 Speaker 2: So I know that. 92 00:05:06,600 --> 00:05:08,520 Speaker 3: This is something that maybe people tell you a lot, 93 00:05:08,560 --> 00:05:11,240 Speaker 3: but you're really one of my favorite people to follow 94 00:05:11,360 --> 00:05:13,760 Speaker 3: on X. And I was thinking about this, is that 95 00:05:13,839 --> 00:05:14,440 Speaker 3: a compliment? 96 00:05:15,920 --> 00:05:16,520 Speaker 4: Are people? 97 00:05:16,839 --> 00:05:18,640 Speaker 2: Are people insulted by it, you know. 98 00:05:18,800 --> 00:05:21,560 Speaker 3: Like I associate you with having a really good, like, 99 00:05:21,600 --> 00:05:23,760 Speaker 3: Twitter feed, you know, you. 100 00:05:23,720 --> 00:05:25,400 Speaker 4: Know what, I take it as a compliment now.
I 101 00:05:25,400 --> 00:05:28,279 Speaker 4: think back in like twenty twelve, twenty fourteen, it was like, 102 00:05:28,839 --> 00:05:31,359 Speaker 4: I don't know, because you still had this mindset that, 103 00:05:31,480 --> 00:05:34,200 Speaker 4: you know, social media was so new. But at this point, 104 00:05:34,240 --> 00:05:37,160 Speaker 4: so much news is made on Twitter, and one of 105 00:05:37,240 --> 00:05:39,320 Speaker 4: the things that's so useful about it is it's great for networking. 106 00:05:39,320 --> 00:05:40,839 Speaker 4: So, you know, I just take it as a compliment. 107 00:05:41,279 --> 00:05:43,359 Speaker 3: Yeah. And I think you just have a really 108 00:05:43,360 --> 00:05:47,719 Speaker 3: creative way of getting points across in, you know, X 109 00:05:47,839 --> 00:05:52,080 Speaker 3: number of characters, and I always think 110 00:05:52,120 --> 00:05:55,240 Speaker 3: of you as having, like, really clever takes on stuff, 111 00:05:55,480 --> 00:05:56,640 Speaker 3: not typical. 112 00:05:57,000 --> 00:06:01,640 Speaker 2: So there's my, you know, compliment slash insult. You 113 00:06:01,680 --> 00:06:03,159 Speaker 2: are good at the Twitter. 114 00:06:04,360 --> 00:06:05,839 Speaker 4: You know, I'll take it. I mean, I liked it 115 00:06:05,880 --> 00:06:07,599 Speaker 4: better when it was one hundred and forty characters because 116 00:06:07,600 --> 00:06:09,920 Speaker 4: it was a good discipline, especially 117 00:06:09,920 --> 00:06:11,440 Speaker 4: if you were trying to tell a joke, to see 118 00:06:11,440 --> 00:06:12,960 Speaker 4: if you could do it in one hundred and forty characters. 119 00:06:13,040 --> 00:06:14,520 Speaker 2: Yeah, or at least two eighty. 120 00:06:14,560 --> 00:06:17,560 Speaker 3: But like this new one, you can write, you know, six paragraphs. 121 00:06:17,600 --> 00:06:19,360 Speaker 2: I can't wrap my mind around it.
122 00:06:19,360 --> 00:06:20,640 Speaker 4: It's essentially just Facebook. 123 00:06:20,880 --> 00:06:24,000 Speaker 2: Yeah, right. So how did you get into this world? 124 00:06:24,080 --> 00:06:25,040 Speaker 2: How did you become a writer? 125 00:06:25,520 --> 00:06:27,159 Speaker 4: Oh geez, let's see if I can do the Reader's 126 00:06:27,160 --> 00:06:30,600 Speaker 4: Digest version of it. I fell into journalism and 127 00:06:31,080 --> 00:06:35,520 Speaker 4: political commentary entirely by accident. I've always liked politics. I've 128 00:06:35,520 --> 00:06:38,640 Speaker 4: always been extremely fascinated by it because it's the most 129 00:06:38,680 --> 00:06:42,840 Speaker 4: insane circus and it's all real with actual consequences. So, 130 00:06:44,000 --> 00:06:46,440 Speaker 4: you know, it's a lot of performing without a net, 131 00:06:46,480 --> 00:06:50,440 Speaker 4: as it were. But you know, I graduated in around 132 00:06:50,520 --> 00:06:53,200 Speaker 4: two thousand and eight, so there was a little bit of 133 00:06:53,200 --> 00:06:55,800 Speaker 4: an economic tightening around then, and as it turns out, 134 00:06:55,839 --> 00:06:58,680 Speaker 4: having a BA wasn't a surefire way to get a job. 135 00:07:00,040 --> 00:07:01,880 Speaker 4: As you know, people couldn't even hold onto their homes, let 136 00:07:01,920 --> 00:07:05,320 Speaker 4: alone get decent jobs. So I fell into it entirely 137 00:07:05,360 --> 00:07:09,800 Speaker 4: by accident, because back then finding an internship that paid 138 00:07:09,920 --> 00:07:12,520 Speaker 4: was about as rare as a unicorn, especially in the 139 00:07:12,600 --> 00:07:15,320 Speaker 4: sort of DC area.
I was going to school in Virginia, 140 00:07:15,360 --> 00:07:18,640 Speaker 4: so I kind of migrated upwards to where the jobs were, 141 00:07:19,080 --> 00:07:22,040 Speaker 4: and I found this group called the National Journalism Center, 142 00:07:22,160 --> 00:07:24,800 Speaker 4: which I am now the program director of, and I 143 00:07:25,040 --> 00:07:27,120 Speaker 4: was still doing some schooling, and so I 144 00:07:27,120 --> 00:07:29,600 Speaker 4: didn't really have the time to do a full-time gig, 145 00:07:29,640 --> 00:07:32,000 Speaker 4: whether it was retail or food service. Food service is my 146 00:07:32,080 --> 00:07:34,480 Speaker 4: real background, and so I was like, I just need 147 00:07:34,480 --> 00:07:37,120 Speaker 4: some cash flow. So I found this internship that paid. 148 00:07:37,240 --> 00:07:39,920 Speaker 4: It was in journalism, and I could write well enough, 149 00:07:39,920 --> 00:07:41,920 Speaker 4: and I was really upfront with them. I was like, look, 150 00:07:41,960 --> 00:07:44,080 Speaker 4: I just need the money. That's really what I needed. 151 00:07:44,960 --> 00:07:46,800 Speaker 4: Long story short, turns out I was good at it 152 00:07:46,840 --> 00:07:49,720 Speaker 4: and I really enjoyed it. And now, what, fourteen 153 00:07:49,760 --> 00:07:52,760 Speaker 4: years later, here I am as a columnist and I 154 00:07:52,840 --> 00:07:53,640 Speaker 4: now train journalists. 155 00:07:53,920 --> 00:07:56,040 Speaker 2: That's crazy. 156 00:07:56,080 --> 00:07:59,160 Speaker 3: So you started there as your entry-level internship, or 157 00:07:59,360 --> 00:07:59,800 Speaker 3: it was like. 158 00:07:59,760 --> 00:08:01,360 Speaker 2: A sort of paid internship. 159 00:08:01,840 --> 00:08:03,600 Speaker 4: It was a paid internship, which 160 00:08:03,600 --> 00:08:05,840 Speaker 4: again is the only reason I applied.
I just, 161 00:08:05,880 --> 00:08:08,400 Speaker 4: just because they provided a monthly stipend, I'm like, that 162 00:08:08,440 --> 00:08:11,240 Speaker 4: won't be bad. And then I really enjoyed it and 163 00:08:11,280 --> 00:08:13,200 Speaker 4: it got me closer to politics. You know, you 164 00:08:13,240 --> 00:08:14,520 Speaker 4: have that first moment when you get to go 165 00:08:14,560 --> 00:08:17,000 Speaker 4: into the press gallery in the Capitol. It was like, this 166 00:08:17,360 --> 00:08:19,120 Speaker 4: is really cool, you know, and they point out, you know, 167 00:08:19,160 --> 00:08:22,280 Speaker 4: the bullet holes where the Puerto Rican nationalists opened fire on 168 00:08:22,320 --> 00:08:24,800 Speaker 4: Congress back in the day. That was like nineteen fifty-four, 169 00:08:24,960 --> 00:08:27,280 Speaker 4: off the top of my head, but it was 170 00:08:27,320 --> 00:08:30,080 Speaker 4: really exciting. And then being able to cover presidential elections 171 00:08:30,080 --> 00:08:32,200 Speaker 4: and primaries, et cetera, et cetera. And so 172 00:08:32,240 --> 00:08:34,720 Speaker 4: I'm sticking with it, not 173 00:08:34,800 --> 00:08:37,160 Speaker 4: stuck with it, that's how I look at it. 174 00:08:37,320 --> 00:08:39,199 Speaker 4: I really enjoy it. I love it. And part of 175 00:08:39,240 --> 00:08:41,720 Speaker 4: the reason I went back to the NJC to actually 176 00:08:41,720 --> 00:08:44,720 Speaker 4: start training and working with aspiring journalists is because I 177 00:08:44,760 --> 00:08:48,840 Speaker 4: do love journalism, and I have no illusions about how 178 00:08:48,960 --> 00:08:51,400 Speaker 4: bad the industry is right now. It's a mess.
179 00:08:51,760 --> 00:08:55,160 Speaker 4: It's got a credibility crisis for a reason, and 180 00:08:55,240 --> 00:08:57,400 Speaker 4: so part of my being at the NJC is to 181 00:08:57,440 --> 00:08:59,680 Speaker 4: try to get in at the ground level and, like, train people. Like, 182 00:08:59,720 --> 00:09:01,680 Speaker 4: let's fix things, right? 183 00:09:01,600 --> 00:09:03,920 Speaker 3: Fixing the problems. How original of me. 184 00:09:04,760 --> 00:09:06,040 Speaker 2: I haven't heard of that before. 185 00:09:06,400 --> 00:09:06,880 Speaker 4: That's right. 186 00:09:07,400 --> 00:09:09,560 Speaker 2: So what do you consider your beat? 187 00:09:10,160 --> 00:09:13,600 Speaker 4: Politics, culture. Although I'm not really, I've never 188 00:09:13,600 --> 00:09:16,080 Speaker 4: been a culture warrior type. I've never been an activist type. 189 00:09:16,280 --> 00:09:18,360 Speaker 4: I don't think I've ever been to a political rally. 190 00:09:18,840 --> 00:09:20,719 Speaker 4: I've done the March for Life, but, like, honestly, in 191 00:09:20,760 --> 00:09:22,440 Speaker 4: high school that was just an excuse to get to DC. 192 00:09:23,440 --> 00:09:27,600 Speaker 4: It's fun, I mean. But there's 193 00:09:27,600 --> 00:09:30,240 Speaker 4: some stuff I do find really fascinating about the American 194 00:09:30,320 --> 00:09:32,800 Speaker 4: experience and where are we going, where have we been. 195 00:09:33,240 --> 00:09:35,160 Speaker 4: And then if I get really lucky, I get to 196 00:09:35,200 --> 00:09:38,520 Speaker 4: do my secret passion project, which is doing film criticism. 197 00:09:38,760 --> 00:09:42,760 Speaker 4: But film criticism is very difficult.
I think it 198 00:09:42,840 --> 00:09:46,600 Speaker 4: requires a specific kind of creativity, which is learned, I think, 199 00:09:46,840 --> 00:09:48,640 Speaker 4: or I mean, it can be learned, but I think 200 00:09:48,640 --> 00:09:50,920 Speaker 4: some people are just kind of born with it. But 201 00:09:50,960 --> 00:09:52,520 Speaker 4: you know, I have the opportunity to do that. But 202 00:09:52,600 --> 00:09:56,199 Speaker 4: mostly focusing on politics and the intersection of politics and media, 203 00:09:56,320 --> 00:09:58,160 Speaker 4: especially. 204 00:09:58,080 --> 00:10:02,480 Speaker 3: What's the difference between film reviews and film criticism? 205 00:10:02,640 --> 00:10:05,679 Speaker 4: Uh, that's a good question. I don't know. I mean, 206 00:10:05,720 --> 00:10:09,439 Speaker 4: there's a difference, I think. Doing 207 00:10:09,720 --> 00:10:11,679 Speaker 4: a review kind of gives the quick and dirty. It's 208 00:10:11,679 --> 00:10:13,600 Speaker 4: like, you know the website Letterboxd, where you 209 00:10:13,600 --> 00:10:15,280 Speaker 4: can go and say how many stars do you give it? 210 00:10:15,559 --> 00:10:17,440 Speaker 4: You know, it's a website you can go to to 211 00:10:17,520 --> 00:10:20,520 Speaker 4: do very, very brief reviews, almost as short as 212 00:10:20,520 --> 00:10:23,040 Speaker 4: they are on, like, Rotten Tomatoes. Give it how many 213 00:10:23,080 --> 00:10:24,839 Speaker 4: stars out of five and a quick one- or two-sentence 214 00:10:24,840 --> 00:10:26,920 Speaker 4: summary of what you liked or why you hated it.
215 00:10:27,360 --> 00:10:29,120 Speaker 4: Film criticism, I think, is a lot more like 216 00:10:29,240 --> 00:10:31,880 Speaker 4: literary criticism, in that you break it down into its 217 00:10:31,920 --> 00:10:35,360 Speaker 4: parts and you review how each thing plays together against 218 00:10:35,440 --> 00:10:37,440 Speaker 4: the themes of the movie, with how it's produced, et cetera, 219 00:10:37,480 --> 00:10:39,959 Speaker 4: et cetera. So I had 220 00:10:39,960 --> 00:10:43,480 Speaker 4: this great experience earlier this year. I had never watched 221 00:10:43,520 --> 00:10:46,720 Speaker 4: The Searchers with John Wayne. I had never gotten around 222 00:10:46,720 --> 00:10:48,000 Speaker 4: to seeing it. So I finally did. It turned out 223 00:10:48,000 --> 00:10:50,800 Speaker 4: I actually watched it on his birthday weekend. It was a weird, 224 00:10:50,840 --> 00:10:53,760 Speaker 4: little serendipitous thing, and it is so good. It is 225 00:10:53,880 --> 00:10:57,400 Speaker 4: such a good movie. Really, it's like, I've seen some John 226 00:10:57,400 --> 00:10:59,120 Speaker 4: Ford stuff before, so I can't say it's his best 227 00:10:59,120 --> 00:11:00,720 Speaker 4: piece because I haven't seen all his stuff. But it 228 00:11:00,800 --> 00:11:04,000 Speaker 4: is remarkably good, so good. It's one of those movies 229 00:11:04,000 --> 00:11:06,120 Speaker 4: that you kind of stop and sit with and think about. 230 00:11:06,360 --> 00:11:09,440 Speaker 4: And then I actually did a criticism where I broke 231 00:11:09,440 --> 00:11:11,120 Speaker 4: it down into its parts, because I think there's 232 00:11:11,160 --> 00:11:13,400 Speaker 4: this symbolism and poetry in how the 233 00:11:13,440 --> 00:11:15,480 Speaker 4: movie begins and how it ends. And so I think 234 00:11:15,480 --> 00:11:18,559 Speaker 4: that's the difference between the two.
Two thumbs up, although 235 00:11:18,600 --> 00:11:21,560 Speaker 4: I'd say it also works as criticism, or yeah, 236 00:11:21,600 --> 00:11:22,560 Speaker 4: criticism as well. 237 00:11:22,920 --> 00:11:25,640 Speaker 3: Yeah, my dirty secret is that I don't 238 00:11:25,640 --> 00:11:28,719 Speaker 3: watch movies. I want to, like, I do, 239 00:11:28,800 --> 00:11:31,240 Speaker 3: I really want to, but they're 240 00:11:31,760 --> 00:11:34,880 Speaker 3: so bad. So often, the more critically acclaimed it is, 241 00:11:35,040 --> 00:11:36,240 Speaker 3: I feel like, the worse. 242 00:11:36,080 --> 00:11:38,840 Speaker 4: It is. See, it's so funny, because 243 00:11:38,880 --> 00:11:40,880 Speaker 4: I had an experience where I got a little misty. We 244 00:11:40,880 --> 00:11:43,320 Speaker 4: were showing the kids The Sound of Music for the first time. 245 00:11:43,440 --> 00:11:45,360 Speaker 4: Oh yeah, I never cared for it when I was 246 00:11:45,400 --> 00:11:46,920 Speaker 4: a kid. It always, that was. 247 00:11:46,920 --> 00:11:47,400 Speaker 2: A good one. 248 00:11:47,640 --> 00:11:49,439 Speaker 4: Oh, I mean, it's good. Yeah, it's appreciated. It's a 249 00:11:49,520 --> 00:11:51,440 Speaker 4: well-put-together movie, but as a fourteen 250 00:11:51,480 --> 00:11:53,000 Speaker 4: year old or an eight year old boy, it's not 251 00:11:53,040 --> 00:11:55,640 Speaker 4: really your bag of potatoes. Right, it's a bunch of 252 00:11:55,720 --> 00:11:57,760 Speaker 4: kids singing and dancing. But I'm watching it with my 253 00:11:57,880 --> 00:11:59,719 Speaker 4: kids, who are five and four, because, you know, the 254 00:11:59,720 --> 00:12:01,960 Speaker 4: past and all, and I get a little misty.
Not because 255 00:12:02,000 --> 00:12:04,439 Speaker 4: of the movie, but because of the sort of level 256 00:12:04,480 --> 00:12:08,679 Speaker 4: of expertise and competency in the movie, the costumes, the music, 257 00:12:08,720 --> 00:12:11,040 Speaker 4: the way it looks and everything. It's like, I feel 258 00:12:11,040 --> 00:12:12,720 Speaker 4: like we don't have a lot of this. We have 259 00:12:12,800 --> 00:12:16,960 Speaker 4: either these weird indie dramas about, like, 260 00:12:17,000 --> 00:12:20,720 Speaker 4: incestuous step people, or you have 261 00:12:20,760 --> 00:12:24,280 Speaker 4: big blockbuster, big-budget movies, like the fourteenth Avengers film, 262 00:12:24,320 --> 00:12:26,559 Speaker 4: and then you see something like The Sound of Music, and 263 00:12:26,600 --> 00:12:30,679 Speaker 4: there's this sort of artistry and care, and 264 00:12:30,880 --> 00:12:32,680 Speaker 4: there's something to it. I mean, it 265 00:12:32,720 --> 00:12:35,760 Speaker 4: still exists in some forms, but, you know, I think 266 00:12:35,800 --> 00:12:38,600 Speaker 4: it is telling. So this goes back to knowing 267 00:12:38,640 --> 00:12:40,560 Speaker 4: what I write and like to write about. You know, 268 00:12:40,760 --> 00:12:42,600 Speaker 4: I think there is something to glean from the fact 269 00:12:42,679 --> 00:12:46,000 Speaker 4: that the year The Sound of Music won 270 00:12:46,080 --> 00:12:48,319 Speaker 4: Best Picture, four years later the movie that won Best 271 00:12:48,320 --> 00:12:51,440 Speaker 4: Picture was Midnight Cowboy. So things change very quickly 272 00:12:51,559 --> 00:12:54,560 Speaker 4: and really clearly in the culture.
And then watching this 273 00:12:54,600 --> 00:12:56,640 Speaker 4: movie, and it was just like, in the first fifteen minutes 274 00:12:56,640 --> 00:12:58,679 Speaker 4: there's three songs, and you've got the nuns and the 275 00:12:58,679 --> 00:13:01,240 Speaker 4: convent, all that stuff. It's like, this is great. When 276 00:13:01,720 --> 00:13:04,000 Speaker 4: did we make movies like this? And we're 277 00:13:04,000 --> 00:13:06,800 Speaker 4: on, like, the third Moana? What 278 00:13:06,880 --> 00:13:07,679 Speaker 4: is it? 279 00:13:07,880 --> 00:13:11,439 Speaker 3: We are coming up on a second Moana. But yeah, Frozen, yes. 280 00:13:12,120 --> 00:13:12,440 Speaker 2: Oh. 281 00:13:12,480 --> 00:13:15,440 Speaker 3: What's sad is that those are sort of good movies, 282 00:13:15,520 --> 00:13:17,240 Speaker 3: and that's why they have to remake them, because all 283 00:13:17,280 --> 00:13:18,400 Speaker 3: the other movies are so bad. 284 00:13:18,520 --> 00:13:19,400 Speaker 2: I don't know. 285 00:13:19,480 --> 00:13:22,680 Speaker 3: Every time I watch, like, it's just a while 286 00:13:22,679 --> 00:13:26,360 Speaker 3: ago now, but that Everything Everywhere All at Once movie, yeah, 287 00:13:26,480 --> 00:13:28,520 Speaker 3: I just, I was like, what in 288 00:13:28,440 --> 00:13:31,040 Speaker 2: the hell? You're like, I hated it. I hated it. 289 00:13:31,360 --> 00:13:32,319 Speaker 2: I enjoyed it. 290 00:13:33,880 --> 00:13:36,640 Speaker 4: Okay, well, I see that. I 291 00:13:36,679 --> 00:13:39,560 Speaker 4: get that. That's one of those things where I get it. 292 00:13:39,640 --> 00:13:40,280 Speaker 4: I get it, you do. 293 00:13:41,840 --> 00:13:45,360 Speaker 3: Yeah, the hot dog fingers, just all of it. 294 00:13:44,559 --> 00:13:48,240 Speaker 4: I thought it was fun.
I mean, I'm not 295 00:13:48,240 --> 00:13:49,679 Speaker 4: trying to do the thing where we don't make good 296 00:13:49,679 --> 00:13:53,120 Speaker 4: stuff anymore. We do. 297 00:13:53,200 --> 00:13:55,480 Speaker 4: It does exist. I mean, it wasn't so long ago. 298 00:13:55,600 --> 00:13:58,079 Speaker 4: Well, okay, this is perspective, but, like, the Lord of 299 00:13:58,080 --> 00:14:01,600 Speaker 4: the Rings trilogy wasn't that long ago. That was after the millennium. 300 00:14:01,679 --> 00:14:03,920 Speaker 4: And I do remember seeing it in the theater and 301 00:14:04,040 --> 00:14:06,960 Speaker 4: thinking, oh wow, this is different. I 302 00:14:06,960 --> 00:14:08,080 Speaker 4: haven't seen something like this. 303 00:14:08,160 --> 00:14:09,240 Speaker 2: That's twenty years ago. 304 00:14:09,400 --> 00:14:13,320 Speaker 4: Right, it is a while ago, but I mean, 305 00:14:13,320 --> 00:14:14,719 Speaker 4: twenty years isn't completely out there. 306 00:14:15,280 --> 00:14:15,319 Speaker 1: No. 307 00:14:15,440 --> 00:14:18,240 Speaker 4: I thought nineteen seventeen was technically very proficient, and 308 00:14:18,360 --> 00:14:18,920 Speaker 4: you should see that. 309 00:14:19,360 --> 00:14:20,880 Speaker 2: Okay, I'll try it. 310 00:14:20,880 --> 00:14:24,000 Speaker 4: It's Sam, I can't remember his last name. Mendes, 311 00:14:24,160 --> 00:14:26,320 Speaker 4: the director. He did some of 312 00:14:26,320 --> 00:14:29,040 Speaker 4: the new Bond movies. I liked it a lot. It's 313 00:14:29,080 --> 00:14:31,680 Speaker 4: not the greatest film ever written. And some people did 314 00:14:31,760 --> 00:14:34,760 Speaker 4: like it precisely because it's so technically proficient. 315 00:14:34,800 --> 00:14:37,680 Speaker 4: Basically, the entire movie is one tracking shot.
316 00:14:37,840 --> 00:14:40,520 Speaker 4: There's essentially no cuts in the entire movie, and people 317 00:14:40,520 --> 00:14:42,560 Speaker 4: thought that was distracting and gimmicky. But I mean, it's 318 00:14:42,560 --> 00:14:45,440 Speaker 4: a good-looking movie, and 319 00:14:45,520 --> 00:14:47,840 Speaker 4: there are moments where you go, wow, this 320 00:14:47,960 --> 00:14:51,600 Speaker 4: is moving, I'm having a visual 321 00:14:51,640 --> 00:14:54,760 Speaker 4: experience right now. Then there are other times, like, I 322 00:14:54,800 --> 00:14:57,560 Speaker 4: tried to watch the new Flash, and it's so funny, 323 00:14:57,600 --> 00:15:00,600 Speaker 4: because there's this crisis in Hollywood right now with computer 324 00:15:00,640 --> 00:15:03,560 Speaker 4: animation, where I don't know if people are quitting or 325 00:15:03,560 --> 00:15:05,600 Speaker 4: the demands are too great for the amount of time, 326 00:15:05,600 --> 00:15:09,640 Speaker 4: because it looks so bad. It's funny to see a 327 00:15:09,720 --> 00:15:11,320 Speaker 4: movie that came out last year, and then you watch 328 00:15:11,400 --> 00:15:13,760 Speaker 4: Jurassic Park, which came out in the nineties, and Jurassic 329 00:15:13,840 --> 00:15:17,360 Speaker 4: Park looks so good, it looks amazing, and that was 330 00:15:17,680 --> 00:15:20,000 Speaker 4: well more than two decades ago. And the movie 331 00:15:20,040 --> 00:15:22,280 Speaker 4: that came out last year looked awful. 332 00:15:22,400 --> 00:15:25,880 Speaker 2: What is that? Is it cutting down on staff? What 333 00:15:25,920 --> 00:15:27,080 Speaker 2: could that be? 334 00:15:27,080 --> 00:15:29,440 Speaker 4: Well, I know a huge problem with Disney is the workloads are 335 00:15:29,440 --> 00:15:31,680 Speaker 4: too great and the time constraints are too small, 336 00:15:31,880 --> 00:15:34,640 Speaker 4: essentially.
They sort of created 337 00:15:35,600 --> 00:15:39,240 Speaker 4: a sweatshop model for doing CGI. 338 00:15:39,360 --> 00:15:42,760 Speaker 4: Whereas back in the Jurassic Park days, ILM, Industrial Light and Magic, 339 00:15:42,840 --> 00:15:46,040 Speaker 4: which George Lucas created, was the special effects workshop. 340 00:15:46,880 --> 00:15:49,240 Speaker 4: They were sort of a niche workshop, the 341 00:15:49,360 --> 00:15:51,440 Speaker 4: sort of go-to if you needed something really difficult 342 00:15:51,440 --> 00:15:53,320 Speaker 4: and you wanted it to look good. So, you know, 343 00:15:53,360 --> 00:15:55,520 Speaker 4: the T one thousand in the Terminator two movie, that's 344 00:15:55,680 --> 00:16:00,000 Speaker 4: ILM. Death Becomes Her, you ever see that? It's funny. 345 00:16:00,000 --> 00:16:02,160 Speaker 4: Oh yeah, ILM did a lot of special effects 346 00:16:02,200 --> 00:16:04,480 Speaker 4: on that stuff, and there used to be sort of, 347 00:16:04,520 --> 00:16:06,320 Speaker 4: like, if you wanted it done well, this is the 348 00:16:06,320 --> 00:16:08,920 Speaker 4: place to go, and I assume it's costly. So interesting. 349 00:16:09,560 --> 00:16:12,320 Speaker 4: And there was also a reliance on practical effects. 350 00:16:12,400 --> 00:16:14,200 Speaker 4: So in Jurassic Park they're not all CGI. A 351 00:16:14,200 --> 00:16:16,280 Speaker 4: lot of them are actual life-sized puppets, 352 00:16:16,360 --> 00:16:19,320 Speaker 4: gigantic animatronics, and it looks good because it is real.
353 00:16:19,760 --> 00:16:21,800 Speaker 4: But at Disney, I know with a lot of the 354 00:16:21,800 --> 00:16:23,840 Speaker 4: Marvel movies they had a problem where, like, you know, 355 00:16:24,080 --> 00:16:26,480 Speaker 4: you would have reshoots and the schedules would get shortened, 356 00:16:26,520 --> 00:16:29,000 Speaker 4: and then you have this sweatshop of like fifteen animators, 357 00:16:29,000 --> 00:16:30,200 Speaker 4: and it's like, we need it in a week, and they're 358 00:16:30,240 --> 00:16:32,520 Speaker 4: like, it's not gonna, because it's a triangle. You 359 00:16:32,520 --> 00:16:35,040 Speaker 4: can have it good, fast, or cheap, and Disney wants all 360 00:16:35,040 --> 00:16:39,240 Speaker 4: three, right, and Disney is infamously cheap. 361 00:16:39,320 --> 00:16:42,120 Speaker 4: I mean, it's been a Hollywood joke for literal 362 00:16:42,160 --> 00:16:44,480 Speaker 4: decades at this point. And so they throw it out, 363 00:16:44,520 --> 00:16:46,920 Speaker 4: and then you see the new, whatever new Marvel movie, 364 00:16:46,960 --> 00:16:48,960 Speaker 4: and you're like, what is this? It doesn't look as good as 365 00:16:48,960 --> 00:16:51,320 Speaker 4: stuff that came out thirty years ago. So it's, 366 00:16:51,360 --> 00:16:54,520 Speaker 4: it's rushed and it's cheap. 367 00:16:55,120 --> 00:16:59,760 Speaker 2: Let me just ask, Beckett, could your calling be film stuff? 368 00:17:00,000 --> 00:17:01,160 Speaker 4: I think I'm too old for that. 369 00:17:01,320 --> 00:17:05,399 Speaker 2: Feels like it might be, though. You just, like, lit 370 00:17:05,480 --> 00:17:07,000 Speaker 2: up to talk about movies. 371 00:17:07,280 --> 00:17:08,959 Speaker 4: I love talking about it. I love movies. I love 372 00:17:09,000 --> 00:17:12,160 Speaker 4: seeing movies.
I would love to write more about movies, 373 00:17:12,200 --> 00:17:14,919 Speaker 4: but I have not the creative confidence quite yet to 374 00:17:14,960 --> 00:17:17,720 Speaker 4: make that my calling. I just dip my toe in 375 00:17:17,760 --> 00:17:18,320 Speaker 4: it every now 376 00:17:18,240 --> 00:17:21,719 Speaker 5: and then. I think, leap right in, right in. I'm 377 00:17:21,760 --> 00:17:27,440 Speaker 5: gonna take your job, and, you know, maybe on the side, 378 00:17:27,640 --> 00:17:30,200 Speaker 5: so I'm not trying to, like, cause strife here. 379 00:17:31,240 --> 00:17:33,640 Speaker 4: She might have some thoughts about that. And it's 380 00:17:33,680 --> 00:17:37,360 Speaker 4: always fun because, you know, finding movies that the two 381 00:17:37,440 --> 00:17:40,960 Speaker 4: of us like together. She likes, uh, French dramas 382 00:17:41,080 --> 00:17:45,000 Speaker 4: and I don't, so I was trying to find 383 00:17:45,000 --> 00:17:47,159 Speaker 4: that happy medium, like, would you like to watch a 384 00:17:47,160 --> 00:17:49,840 Speaker 4: traumatic war movie? And she's like, no, I would 385 00:17:49,560 --> 00:17:51,639 Speaker 3: not. Well, better than, like, would you like to watch a 386 00:17:52,000 --> 00:17:53,080 Speaker 3: superhero movie? 387 00:17:53,160 --> 00:17:53,320 Speaker 4: No. 388 00:17:53,720 --> 00:17:54,120 Speaker 2: Yeesh. 389 00:17:54,880 --> 00:17:57,119 Speaker 4: I'm completely Marveled out. I can't do it anymore. I 390 00:17:57,200 --> 00:17:57,480 Speaker 4: just can't. 391 00:17:59,080 --> 00:18:00,440 Speaker 2: So do you feel like you've made it? 392 00:18:01,560 --> 00:18:03,760 Speaker 4: I do, I really do. There was a lot 393 00:18:03,760 --> 00:18:06,640 Speaker 4: of grind in the beginning. I mean, everyone has to 394 00:18:06,640 --> 00:18:10,520 Speaker 4: pay their dues. In journalism, you get to start out 395 00:18:10,560 --> 00:18:13,560 Speaker 4: basically as a content farmer.
The norm in a lot 396 00:18:13,560 --> 00:18:15,440 Speaker 4: of these newsrooms is, you know, you've got to write 397 00:18:15,440 --> 00:18:17,399 Speaker 4: four to five pieces a day, and we're talking like 398 00:18:17,840 --> 00:18:21,879 Speaker 4: four hundred, six hundred word pieces that are just news blurbs. Yeah, 399 00:18:21,920 --> 00:18:24,600 Speaker 4: and, you know, it's kind of a sink or swim situation. 400 00:18:24,680 --> 00:18:26,240 Speaker 4: Either you get good at it, or you just don't 401 00:18:26,880 --> 00:18:28,720 Speaker 4: and you move to a different area, or you get fired. 402 00:18:29,080 --> 00:18:31,280 Speaker 4: And I got really good at writing quickly, 403 00:18:31,359 --> 00:18:34,359 Speaker 4: writing fast, maybe not creatively, but very competently. I can 404 00:18:34,400 --> 00:18:36,960 Speaker 4: do AP style like nobody else. But the style is 405 00:18:37,040 --> 00:18:40,639 Speaker 4: very sort of devoid of creativity. What's so funny 406 00:18:40,640 --> 00:18:43,240 Speaker 4: about Hemingway is that he took that and he made 407 00:18:43,240 --> 00:18:45,159 Speaker 4: it creative, because he was an AP 408 00:18:45,440 --> 00:18:50,240 Speaker 4: reporter. But not everyone's Hemingway. So you 409 00:18:50,280 --> 00:18:51,879 Speaker 4: get good at it, you get fast at it, and 410 00:18:51,920 --> 00:18:54,560 Speaker 4: you pay your dues, and then eventually you can develop 411 00:18:54,720 --> 00:18:58,280 Speaker 4: an expertise on a beat, such that you develop opinions 412 00:18:58,320 --> 00:19:00,800 Speaker 4: that are actually informed, because that's the real trick 413 00:19:00,840 --> 00:19:03,280 Speaker 4: to commentary. The best bit of advice I've ever 414 00:19:03,320 --> 00:19:06,240 Speaker 4: heard was from Peggy Noonan, a former Reagan speechwriter.
415 00:19:06,320 --> 00:19:08,080 Speaker 4: She's a columnist at the Wall Street Journal. I'm not saying that for 416 00:19:08,119 --> 00:19:09,720 Speaker 4: your benefit, I know you know, I'm saying. 417 00:19:10,359 --> 00:19:14,080 Speaker 2: Yes, yeah, I still get excited to see Peggy Noonan. 418 00:19:13,840 --> 00:19:16,680 Speaker 3: Like, when I see her out somewhere, I still get a 419 00:19:16,720 --> 00:19:18,439 Speaker 3: little starstruck, and like, we're friendly. 420 00:19:18,520 --> 00:19:19,320 Speaker 2: She knows who I am. 421 00:19:19,400 --> 00:19:22,280 Speaker 3: And yet I'm still like, Peggy Noonan, you know. 422 00:19:22,320 --> 00:19:25,240 Speaker 4: Yeah. See, the best bit of advice I ever heard 423 00:19:25,280 --> 00:19:27,119 Speaker 4: was from her, which is, if you're doing commentary, you 424 00:19:27,200 --> 00:19:30,119 Speaker 4: have to tell the readers something they didn't know. And 425 00:19:30,160 --> 00:19:32,640 Speaker 4: that's, I think she said that before the sort 426 00:19:32,640 --> 00:19:35,119 Speaker 4: of birth of social media even. So, like, the idea 427 00:19:35,160 --> 00:19:38,240 Speaker 4: that you just have an opinion, like, yeah, get in line, right, yeah, 428 00:19:38,320 --> 00:19:40,520 Speaker 4: do you have something that I don't know? So I've 429 00:19:40,560 --> 00:19:43,119 Speaker 4: always tried to take that, incorporate that into my columns, 430 00:19:43,440 --> 00:19:45,320 Speaker 4: that, you know, what is it about this thing? Is 431 00:19:45,320 --> 00:19:47,720 Speaker 4: there some sort of historic context people might not know? And 432 00:19:47,760 --> 00:19:49,800 Speaker 4: things like that. It's always fun because my 433 00:19:50,480 --> 00:19:53,159 Speaker 4: background in college is I did history. That's what I majored in, 434 00:19:53,200 --> 00:19:55,520 Speaker 4: and so it's always fun to flex that muscle a little.
435 00:19:55,600 --> 00:19:59,440 Speaker 4: And it was especially fun slash frustrating during the Trump 436 00:19:59,480 --> 00:20:03,600 Speaker 4: era, when everything was unprecedented. Oh yeah, we've never been 437 00:20:03,640 --> 00:20:04,080 Speaker 4: here before. 438 00:20:04,080 --> 00:20:07,440 Speaker 6: It's like, yeah, we have, you know, like, see, 439 00:20:07,720 --> 00:20:11,040 Speaker 6: we have been here before, right, right. And 440 00:20:11,080 --> 00:20:12,439 Speaker 4: it was always fun to kind of lean on that, 441 00:20:12,560 --> 00:20:14,760 Speaker 4: and like, maybe not tell the reader something they didn't know, 442 00:20:14,800 --> 00:20:17,439 Speaker 4: but maybe remind them about something they may have forgotten. Like I 443 00:20:17,480 --> 00:20:20,480 Speaker 4: mentioned earlier about the Puerto Rican nationalists shooting up 444 00:20:20,480 --> 00:20:22,480 Speaker 4: Congress back in, I want 445 00:20:22,480 --> 00:20:23,840 Speaker 4: to say, nineteen fifty four. I don't know why I 446 00:20:23,920 --> 00:20:26,240 Speaker 4: keep coming back to that year. So, you know, having 447 00:20:26,320 --> 00:20:29,080 Speaker 4: that knowledge and then seeing January sixth, which was bad, 448 00:20:29,560 --> 00:20:31,160 Speaker 4: I have to say it, because otherwise, if you don't, 449 00:20:31,160 --> 00:20:33,960 Speaker 4: people are like, did you support it? No, obviously not. 450 00:20:34,080 --> 00:20:37,280 Speaker 4: I lived in DC, thank you very much. But you 451 00:20:37,359 --> 00:20:39,720 Speaker 4: have people like, this is the most violent day. It's 452 00:20:39,760 --> 00:20:41,879 Speaker 4: like, no, people actually opened fire on Congress, and there 453 00:20:41,880 --> 00:20:45,040 Speaker 4: were firebombings. There were bombs left underground. Like, 454 00:20:45,080 --> 00:20:48,879 Speaker 4: we've seen bad.
Not to excuse bad, yeah, but 455 00:20:48,920 --> 00:20:50,840 Speaker 4: you have to keep things in perspective, like, where have 456 00:20:50,880 --> 00:20:53,119 Speaker 4: we been? And it helps you to know where 457 00:20:53,119 --> 00:20:54,560 Speaker 4: we can go if you know where we've been. 458 00:20:54,960 --> 00:20:58,440 Speaker 3: Right. Not to excuse January sixth or anything, but no, 459 00:20:58,480 --> 00:21:01,520 Speaker 3: the idea that, like, I mean, if you ask the 460 00:21:01,560 --> 00:21:04,679 Speaker 3: average person, they think that all those people were armed, 461 00:21:04,720 --> 00:21:06,120 Speaker 3: and I'm like, nobody was armed. 462 00:21:06,160 --> 00:21:08,680 Speaker 2: There were no arms, nobody was carrying anything. 463 00:21:08,800 --> 00:21:11,560 Speaker 3: Like, the only person killed was killed by, you know, 464 00:21:11,640 --> 00:21:12,480 Speaker 2: Capitol police. 465 00:21:13,080 --> 00:21:15,399 Speaker 3: So I think people got told a story that 466 00:21:15,600 --> 00:21:17,600 Speaker 3: just, you know, wasn't true. 467 00:21:17,600 --> 00:21:18,639 Speaker 2: And that's sort of the problem. 468 00:21:18,680 --> 00:21:22,280 Speaker 3: Also, when you have history and you're 469 00:21:22,400 --> 00:21:25,440 Speaker 3: not telling the whole story to people, I think it's 470 00:21:25,480 --> 00:21:28,160 Speaker 3: easy to get lost in the, this has never happened before, 471 00:21:28,280 --> 00:21:32,200 Speaker 3: this is unprecedented, when it's pretty precedented. 472 00:21:31,960 --> 00:21:35,639 Speaker 4: Right. And so it's a question of, in journalism, you 473 00:21:35,680 --> 00:21:37,760 Speaker 4: want to be factual and you want to be accurate. 474 00:21:37,800 --> 00:21:39,320 Speaker 4: So there's that. But the other thing too is you 475 00:21:39,359 --> 00:21:41,160 Speaker 4: want to be a...
476 00:21:41,080 --> 00:21:43,239 Speaker 2: Calming influence. And I would hope that's what they want to be. 477 00:21:43,320 --> 00:21:45,240 Speaker 3: But I haven't seen a lot of evidence of that. 478 00:21:45,800 --> 00:21:49,840 Speaker 4: Well, right, well, that's the idea. And also the classic 479 00:21:49,920 --> 00:21:52,600 Speaker 4: rule of journalism ethics is you want to do no harm. 480 00:21:52,680 --> 00:21:54,719 Speaker 4: The thing is, if you're constantly ratcheting up your 481 00:21:54,760 --> 00:21:57,879 Speaker 4: audiences and telling them we're in uncharted territory when we're really not, 482 00:21:58,400 --> 00:22:00,560 Speaker 4: you're scaring your readers, and then readers might 483 00:22:00,640 --> 00:22:04,280 Speaker 4: become desperate. Readers might start to accept narratives that aren't true, 484 00:22:04,320 --> 00:22:07,440 Speaker 4: and then, you know, we have before marched into war 485 00:22:07,520 --> 00:22:09,640 Speaker 4: based on things that weren't true, because you got people 486 00:22:09,680 --> 00:22:12,520 Speaker 4: all whipped up into a frenzy. So that's sort of 487 00:22:12,520 --> 00:22:15,560 Speaker 4: my thing with Jan six. Yes, it was bad. Also, 488 00:22:15,800 --> 00:22:18,000 Speaker 4: why are people inflating the death count from it? 489 00:22:18,560 --> 00:22:21,440 Speaker 4: Like you said, the one confirmed death was 490 00:22:22,280 --> 00:22:24,960 Speaker 4: a rioter, protester, yeah. And people are like, no, there 491 00:22:24,960 --> 00:22:27,520 Speaker 4: were police who died months later, and it was probably related, 492 00:22:27,480 --> 00:22:30,760 Speaker 4: and, like, the DC coroner specifically said it wasn't related. 493 00:22:30,880 --> 00:22:33,480 Speaker 4: Like, come on. Also, things like, it's bad, so why are you, 494 00:22:34,280 --> 00:22:36,080 Speaker 4: why are you trying to run up the numbers? Like, you 495 00:22:36,119 --> 00:22:38,040 Speaker 4: don't have to run up the numbers.
It's already bad. 496 00:22:38,040 --> 00:22:40,360 Speaker 2: It's on film, you can see it, exactly. 497 00:22:40,600 --> 00:22:42,879 Speaker 4: So my sort of frustration is, like, 498 00:22:43,400 --> 00:22:48,359 Speaker 4: people should be aware and concerned and vote and act accordingly, 499 00:22:48,400 --> 00:22:51,720 Speaker 4: but also, don't scare them and whip them into, like, 500 00:22:52,000 --> 00:22:55,600 Speaker 4: a hysteric frenzy. That's dangerous as well. 501 00:22:55,840 --> 00:23:00,680 Speaker 1: Yeah. In these uncertain times, rising crime in America's neighborhoods 502 00:23:00,800 --> 00:23:04,359 Speaker 1: endangers the safety of loved ones. Nearly fifty years ago, 503 00:23:04,480 --> 00:23:07,919 Speaker 1: with just two hundred and twenty eight dollars, their entire savings, 504 00:23:08,320 --> 00:23:11,960 Speaker 1: Saber's founders set out to make the world safer. Today, 505 00:23:12,119 --> 00:23:15,000 Speaker 1: Saber is the number one made-in-the-USA pepper 506 00:23:15,040 --> 00:23:19,080 Speaker 1: spray brand, trusted by law enforcement and families across America. 507 00:23:19,160 --> 00:23:22,239 Speaker 1: As a family owned business, they understand the importance of 508 00:23:22,240 --> 00:23:27,080 Speaker 1: protecting your loved ones. Introducing the Saber pepper projectile launcher, 509 00:23:27,320 --> 00:23:31,920 Speaker 1: a less-lethal, fast loading, no recoil solution delivering powerful 510 00:23:31,920 --> 00:23:35,080 Speaker 1: stopping power up to one hundred and seventy five feet.
511 00:23:35,480 --> 00:23:38,280 Speaker 1: The real advantage: even if you miss, it creates a 512 00:23:38,400 --> 00:23:42,760 Speaker 1: six-foot pepper cloud causing intense sensory irritation, which can 513 00:23:42,800 --> 00:23:45,920 Speaker 1: overwhelm anyone in its path, giving you and your family 514 00:23:46,320 --> 00:23:49,879 Speaker 1: the opportunity to protect yourselves and your home. But Saber 515 00:23:49,920 --> 00:23:52,760 Speaker 1: doesn't stop there. In addition to their pepper spray and 516 00:23:52,800 --> 00:23:56,479 Speaker 1: pepper gel, Saber offers stun guns for personal protection and 517 00:23:56,560 --> 00:24:00,760 Speaker 1: bear and mountain lion spray for outdoor adventurers. They also 518 00:24:00,800 --> 00:24:05,119 Speaker 1: provide essential home security items like door security bars and 519 00:24:05,160 --> 00:24:09,280 Speaker 1: door and window alarms for securing your entryways. Protect yourself 520 00:24:09,320 --> 00:24:12,440 Speaker 1: and your family with Saber's full range of defense sprays, 521 00:24:12,600 --> 00:24:18,119 Speaker 1: home security tools, and exclusive launcher bundles. Visit Sabre radio 522 00:24:18,280 --> 00:24:21,480 Speaker 1: dot com or call eight four four eight two four 523 00:24:21,640 --> 00:24:25,920 Speaker 1: SAFE today. That's eight four four eight two four SAFE. 524 00:24:26,040 --> 00:24:28,479 Speaker 1: I'll be back with more with T. Beckett Adams in 525 00:24:28,560 --> 00:24:29,080 Speaker 1: just a moment.
526 00:24:32,520 --> 00:24:34,320 Speaker 3: Well, I just feel like when I see somebody reading 527 00:24:34,359 --> 00:24:36,280 Speaker 3: The New York Times, I sort of feel bad for 528 00:24:36,320 --> 00:24:38,359 Speaker 3: them, because they're going to find out the truth 529 00:24:38,359 --> 00:24:40,879 Speaker 3: in a few months, when it's okay, like, you know, 530 00:24:41,040 --> 00:24:44,480 Speaker 3: just things like, I mean, anything, right? And the 531 00:24:44,560 --> 00:24:48,200 Speaker 3: COVID years obviously are such a great 532 00:24:48,200 --> 00:24:48,959 Speaker 2: example of that. 533 00:24:49,200 --> 00:24:51,720 Speaker 3: But, you know, New York Times readers had no idea 534 00:24:52,080 --> 00:24:56,440 Speaker 3: that a lab leak was even a remote possibility, 535 00:24:56,480 --> 00:24:59,880 Speaker 3: because that was unacceptable to talk about. So they only 536 00:25:00,000 --> 00:25:02,520 Speaker 3: found out much later that that was actually the reality 537 00:25:02,560 --> 00:25:05,680 Speaker 3: of what happened, you know, most likely, and other things. 538 00:25:05,720 --> 00:25:07,879 Speaker 3: They just find out things way too late. The 539 00:25:07,920 --> 00:25:12,600 Speaker 3: shock over Joe Biden being in bad shape, I think, 540 00:25:12,800 --> 00:25:15,879 Speaker 3: was another example of, you know, the conservative world was 541 00:25:15,880 --> 00:25:18,520 Speaker 3: all over it, and New York Times readers were like, 542 00:25:19,119 --> 00:25:22,280 Speaker 3: what just happened? I had no idea, you know. I 543 00:25:22,320 --> 00:25:24,400 Speaker 3: thought it was all right-wing nonsense. 544 00:25:25,520 --> 00:25:27,879 Speaker 4: Right. I think there's two competing things there. One, 545 00:25:27,920 --> 00:25:30,280 Speaker 4: there are stories that, and it's not just The New York Times, 546 00:25:30,320 --> 00:25:32,600 Speaker 4: it's a lot of the sort of corporate, larger outlets.
I feel 547 00:25:32,640 --> 00:25:35,280 Speaker 4: like the older newsrooms get, the lazier and flabbier they get. 548 00:25:35,880 --> 00:25:38,160 Speaker 4: But there are two competing reasons. One is that sometimes 549 00:25:38,200 --> 00:25:40,960 Speaker 4: it is politically inconvenient. They don't want to go near 550 00:25:40,960 --> 00:25:43,439 Speaker 4: it because it might help the bad guy, and not 551 00:25:43,480 --> 00:25:45,840 Speaker 4: necessarily even that they're pro-Democrat; it's that they 552 00:25:46,560 --> 00:25:49,080 Speaker 4: truly fear the other side, so they're like, let's 553 00:25:49,080 --> 00:25:50,879 Speaker 4: not go near this, because this might help them. The 554 00:25:50,960 --> 00:25:54,400 Speaker 4: other thing, too, is that they simply don't know. They 555 00:25:54,400 --> 00:25:57,320 Speaker 4: are in their own little bubbles. You've heard about it. 556 00:25:57,400 --> 00:25:59,840 Speaker 4: So one of my favorites, it's not actually a favorite, 557 00:25:59,880 --> 00:26:02,720 Speaker 4: it's not ha-ha, but it was during twenty sixteen. If it doesn't 558 00:26:02,720 --> 00:26:05,000 Speaker 4: affect them, if it doesn't affect them personally, it 559 00:26:05,080 --> 00:26:07,280 Speaker 4: doesn't really exist. So you see, when people talk about 560 00:26:07,440 --> 00:26:11,320 Speaker 4: violence in cities, they're like, not in my neighborhood. Yes, yes, right. 561 00:26:11,680 --> 00:26:15,480 Speaker 4: One of the more jarring examples of this was 562 00:26:15,520 --> 00:26:18,439 Speaker 4: during twenty sixteen, when, I don't know what happened, but 563 00:26:18,560 --> 00:26:22,960 Speaker 4: all those psychopaths or neo-Nazi Pepes got activated during 564 00:26:23,000 --> 00:26:26,240 Speaker 4: the twenty sixteen election, and all of these reporters 565 00:26:26,320 --> 00:26:31,159 Speaker 4: got inundated with anti-Semitic messages and hate mail and 566 00:26:31,200 --> 00:26:32,800 Speaker 4: all that.
Yeah, and then The New York Times and 567 00:26:32,840 --> 00:26:34,320 Speaker 4: The Washington Post and all of them started doing pieces 568 00:26:34,359 --> 00:26:38,160 Speaker 4: about anti-Semitism on social media. And you're like, Jonah 569 00:26:38,200 --> 00:26:38,760 Speaker 4: Goldberg and 570 00:26:38,760 --> 00:26:41,240 Speaker 2: Ben, yeah, were like, yeah, yeah, we've been saying. 571 00:26:41,280 --> 00:26:43,280 Speaker 4: But it didn't affect the New York Times or the 572 00:26:43,320 --> 00:26:45,560 Speaker 4: Washington Post person, right? So they didn't care. They're like, 573 00:26:45,720 --> 00:26:49,640 Speaker 4: that's just conservative infighting. Not absolutely. And then, 574 00:26:49,720 --> 00:26:52,200 Speaker 4: like, no, then, like, yeah, the Times reporters start getting it, 575 00:26:52,200 --> 00:26:53,920 Speaker 4: and then you get the trend pieces about, whoa, whoa, 576 00:26:53,920 --> 00:26:56,760 Speaker 4: whoa, anti-Semitism might be a thing on social media. 577 00:26:56,800 --> 00:26:59,000 Speaker 4: And it's like, I was getting anti 578 00:26:59,000 --> 00:26:59,919 Speaker 4: Semitic stuff, 579 00:26:59,640 --> 00:27:01,800 Speaker 3: and I'm not, yeah, I was getting all the, you're like, 580 00:27:01,800 --> 00:27:02,520 Speaker 3: I got that. 581 00:27:03,560 --> 00:27:05,840 Speaker 4: Well, I figured it out. I was 582 00:27:05,840 --> 00:27:07,760 Speaker 4: getting, like, all the gas chamber memes and stuff, and I 583 00:27:07,800 --> 00:27:12,480 Speaker 4: was, I'm German-Irish Catholic, I know, yeah. I figured out, 584 00:27:12,480 --> 00:27:15,919 Speaker 4: I think the assumption is, I'm in media, therefore I'm probably Jewish, 585 00:27:16,000 --> 00:27:18,720 Speaker 4: so I think that's just, like, yeah, I know, I know. Yeah, 586 00:27:18,840 --> 00:27:22,160 Speaker 4: it's like, maybe it was, like, a little anti-Semitic sandwich, 587 00:27:22,200 --> 00:27:23,920 Speaker 4: you know.
It was like, right, they assume you're Jewish, 588 00:27:23,920 --> 00:27:25,679 Speaker 4: so they send you some hate mail. Yeah, okay. 589 00:27:25,840 --> 00:27:29,359 Speaker 3: So it was interesting, because there was a story about how 590 00:27:30,280 --> 00:27:34,720 Speaker 3: Maggie Haberman got anti-Semitic hate mail mailed to her house, 591 00:27:35,240 --> 00:27:37,720 Speaker 3: and she and I are actually friendly, this is not 592 00:27:37,760 --> 00:27:40,439 Speaker 3: a shot at her, but she got anti 593 00:27:40,520 --> 00:27:43,800 Speaker 3: Semitic mail to her house, and she wrote 594 00:27:43,840 --> 00:27:46,800 Speaker 3: about how, you know, it was Trump-related, because it 595 00:27:46,840 --> 00:27:49,960 Speaker 3: was after twenty sixteen. And then I realized I got 596 00:27:49,960 --> 00:27:52,800 Speaker 3: that same letter in, like, twenty fourteen, right, you know, 597 00:27:53,040 --> 00:27:57,000 Speaker 3: just the same style, same everything, to my house. I 598 00:27:57,040 --> 00:27:58,679 Speaker 3: tweeted about it at the time, that's why I had 599 00:27:58,680 --> 00:28:02,159 Speaker 3: the date so clear. And so maybe it didn't have 600 00:28:02,359 --> 00:28:04,200 Speaker 3: anything to do with Donald Trump at all. 601 00:28:04,280 --> 00:28:06,720 Speaker 2: But, you know, it got tied to him anyway. 602 00:28:07,960 --> 00:28:09,520 Speaker 4: I mean, at that point, like I was saying, I 603 00:28:09,560 --> 00:28:13,800 Speaker 4: don't know how much you can blame on him, right? 604 00:28:13,880 --> 00:28:15,840 Speaker 4: You know, I do know in twenty sixteen it did 605 00:28:15,840 --> 00:28:17,560 Speaker 4: get bad, like it ratcheted up hard. 606 00:28:17,600 --> 00:28:18,280 Speaker 2: Oh yeah, I know. 607 00:28:18,600 --> 00:28:20,239 Speaker 4: A lot of, yeah, a lot of them.
608 00:28:20,280 --> 00:28:22,440 Speaker 4: You know, writers at National Review, especially a lot of my 609 00:28:22,480 --> 00:28:24,680 Speaker 4: colleagues at the Washington Examiner, especially the ones who were 610 00:28:24,800 --> 00:28:28,520 Speaker 4: known to be Jewish, got that stuff. I don't know, 611 00:28:28,520 --> 00:28:31,200 Speaker 4: something got activated. I mean, Donald Trump never was 612 00:28:31,240 --> 00:28:32,960 Speaker 4: like, hey, neo-Nazis, go get the Jews. We never 613 00:28:33,000 --> 00:28:33,760 Speaker 4: heard anything like that. 614 00:28:33,840 --> 00:28:34,840 Speaker 2: Yeah, no, of course. 615 00:28:35,000 --> 00:28:37,679 Speaker 4: Yeah, but, you know, I mean, it got ugly in twenty sixteen. 616 00:28:37,760 --> 00:28:40,080 Speaker 3: But it's not so great right now, let me tell you. 617 00:28:41,480 --> 00:28:42,960 Speaker 3: I don't know when that started. 618 00:28:43,040 --> 00:28:45,360 Speaker 4: What was that? He says it's not a problem, so I'll 619 00:28:45,400 --> 00:28:47,280 Speaker 4: take that, take him at his word. I mean, 620 00:28:47,320 --> 00:28:50,480 Speaker 4: that was a fascinating sort of conversation to follow, where 621 00:28:50,480 --> 00:28:52,360 Speaker 4: a number of colleagues of mine who are Jewish were like, 622 00:28:52,400 --> 00:28:54,920 Speaker 4: you can just walk into a church and nobody asks questions. 623 00:28:55,000 --> 00:28:58,160 Speaker 4: And I was like, yes, is there something, oh, there's 624 00:28:58,200 --> 00:28:59,880 Speaker 4: a part of the American experience that I wasn't aware 625 00:29:00,040 --> 00:29:02,920 Speaker 4: of. I did not know that. Yes. So it's like 626 00:29:02,960 --> 00:29:04,920 Speaker 4: you're kind of trying to learn stuff, but also, like, 627 00:29:05,160 --> 00:29:08,040 Speaker 4: keeping your mouth shut to not look ignorant. So there's that.
628 00:29:09,280 --> 00:29:12,360 Speaker 3: So what do you think is our largest cultural problem, 629 00:29:12,560 --> 00:29:15,360 Speaker 3: if we're talking about some of the bad stuff going on? 630 00:29:15,600 --> 00:29:19,280 Speaker 4: I think, well, yeah, I have two. One is 631 00:29:19,320 --> 00:29:23,560 Speaker 4: the loneliness and despair of the American people. I think there's 632 00:29:23,560 --> 00:29:27,160 Speaker 4: a reason suicide and overdoses are such a problem. Deaths 633 00:29:27,320 --> 00:29:31,240 Speaker 4: of despair are a huge problem. I think, and this 634 00:29:31,400 --> 00:29:33,640 Speaker 4: is part of, I mean, I can't blame boomers altogether, 635 00:29:33,680 --> 00:29:36,440 Speaker 4: but a lot of resentment or contempt for boomers has 636 00:29:36,480 --> 00:29:38,040 Speaker 4: to do with the fact that they sort of gleefully 637 00:29:38,120 --> 00:29:40,720 Speaker 4: destroyed a lot of cultural institutions because they thought it 638 00:29:40,800 --> 00:29:43,400 Speaker 4: was, you know, oppressive or whatever. Yeah, and they 639 00:29:43,680 --> 00:29:47,160 Speaker 4: didn't replace it with anything. And so you have a 640 00:29:47,160 --> 00:29:49,080 Speaker 4: lot of things that kind of got smashed apart, whether they 641 00:29:49,120 --> 00:29:52,760 Speaker 4: were clubs or churches serving as a sort of cultural community 642 00:29:52,760 --> 00:29:56,560 Speaker 4: center in small cities and towns. 643 00:29:58,520 --> 00:30:01,320 Speaker 4: With people falling away from that, we haven't really had anything 644 00:30:01,400 --> 00:30:03,800 Speaker 4: in real life to replace it. And then you 645 00:30:03,840 --> 00:30:06,000 Speaker 4: have, you know, the birth, well, yeah, the birth of 646 00:30:06,000 --> 00:30:07,680 Speaker 4: social media and the Internet. Now, I don't think 647 00:30:07,720 --> 00:30:10,800 Speaker 4: the internet and social media are necessarily bad.
They're tools; people 648 00:30:10,880 --> 00:30:13,400 Speaker 4: naturally gravitate towards them. But I don't think, going all 649 00:30:13,400 --> 00:30:15,720 Speaker 4: the way full circle to our first conversation about 650 00:30:16,440 --> 00:30:20,160 Speaker 4: Twitter, that conversations on Twitter substitute for being with actual 651 00:30:20,240 --> 00:30:24,720 Speaker 4: people in real life. So, I agree, right? I mean, 652 00:30:24,800 --> 00:30:26,840 Speaker 4: people kind of get inside their own heads, you know. 653 00:30:26,880 --> 00:30:31,320 Speaker 4: One of my favorite lines is from Shakespeare's Julius Caesar. He's, uh, 654 00:30:31,560 --> 00:30:35,480 Speaker 4: warning about, not Brutus, Cassius, I can't remember who 655 00:30:35,560 --> 00:30:38,680 Speaker 4: led the conspiracy. Can't remember his name; it's a fine 656 00:30:38,720 --> 00:30:40,240 Speaker 4: time to forget it when I'm trying to quote it. 657 00:30:40,560 --> 00:30:42,280 Speaker 4: But he says that he thinks too much; such men 658 00:30:42,320 --> 00:30:44,800 Speaker 4: are dangerous. So on social media you really get inside your 659 00:30:44,800 --> 00:30:47,800 Speaker 4: own head, and you attract people who are 660 00:30:47,840 --> 00:30:50,520 Speaker 4: also thinking that. So instead of having a friend in 661 00:30:50,560 --> 00:30:52,120 Speaker 4: real life like we used to have, who'd be like, 662 00:30:52,160 --> 00:30:54,040 Speaker 4: you sound like a crazy person, man, are you okay? 663 00:30:54,400 --> 00:30:57,280 Speaker 4: You don't get that. Instead you get, yeah, yeah, yeah, 664 00:30:57,320 --> 00:31:00,640 Speaker 4: you're right, you're right. Yeah. So loneliness and despair, I 665 00:31:00,640 --> 00:31:03,000 Speaker 4: don't see any way that gets fixed quickly. The other 666 00:31:03,040 --> 00:31:06,040 Speaker 4: thing that actually really scares me as well is the 667 00:31:06,040 --> 00:31:10,080 Speaker 4: competency crisis.
I think we're losing a lot of generational knowledge, 668 00:31:10,120 --> 00:31:11,840 Speaker 4: a lot of it stopped being shared and passed on, 669 00:31:11,960 --> 00:31:14,120 Speaker 4: and I think the longer it goes without being addressed, 670 00:31:14,120 --> 00:31:17,560 Speaker 4: you're going to start having real problems in terms of infrastructure, 671 00:31:17,600 --> 00:31:21,320 Speaker 4: in terms of transportation, in terms of basic safety. I 672 00:31:21,360 --> 00:31:23,440 Speaker 4: don't think it's any mistake that we're seeing a lot 673 00:31:23,480 --> 00:31:25,560 Speaker 4: of this sort of stuff at Boeing. I mean, Boeing 674 00:31:25,600 --> 00:31:27,880 Speaker 4: didn't wake up one day and just decide, I'm 675 00:31:27,880 --> 00:31:28,400 Speaker 4: gonna suck. 676 00:31:28,720 --> 00:31:28,920 Speaker 1: Yeah. 677 00:31:28,920 --> 00:31:33,320 Speaker 4: I think it's been a slow, steady devaluing of competency. 678 00:31:33,360 --> 00:31:35,400 Speaker 4: And the only way I can see getting away from 679 00:31:35,440 --> 00:31:38,800 Speaker 4: that is some sort of institution of, I mean, ruthless meritocracy, 680 00:31:39,240 --> 00:31:40,200 Speaker 4: no room for nonsense. 681 00:31:40,640 --> 00:31:42,360 Speaker 2: I think that's where we have to go, but I 682 00:31:42,680 --> 00:31:43,480 Speaker 2: don't think we will. 683 00:31:43,560 --> 00:31:46,240 Speaker 3: We can't even get to, like, teaching algebra 684 00:31:46,400 --> 00:31:49,640 Speaker 3: to kids in eighth grade in California, because it's racist. 685 00:31:49,760 --> 00:31:50,920 Speaker 2: So I don't know. 686 00:31:50,960 --> 00:31:53,760 Speaker 3: I don't know that we're getting to ruthless meritocracy anytime soon. 687 00:31:54,320 --> 00:31:55,800 Speaker 3: But, you know, there's the problem.
688 00:31:56,280 --> 00:31:58,280 Speaker 4: Hey, how many bridge collapses and plane crashes do you 689 00:31:58,280 --> 00:32:00,920 Speaker 4: get before people are like, okay, you know what, let's 690 00:32:00,960 --> 00:32:03,880 Speaker 4: maybe, like, focus again on engineering instead of whatever. 691 00:32:03,680 --> 00:32:06,600 Speaker 2: I know, yeah, and when do we get there? 692 00:32:06,680 --> 00:32:08,080 Speaker 3: I just, I feel like they're going to be like, 693 00:32:08,920 --> 00:32:11,000 Speaker 3: oh, this had nothing to do with anything. It's not 694 00:32:11,080 --> 00:32:13,680 Speaker 3: because people don't know what to do anymore. It's, you know, 695 00:32:14,640 --> 00:32:19,000 Speaker 3: double down on fighting racism through not teaching math. I 696 00:32:19,080 --> 00:32:21,360 Speaker 3: really do think that they're not going to give up, 697 00:32:21,560 --> 00:32:25,360 Speaker 3: maybe as sanity spreads, because it has, you know. So 698 00:32:25,520 --> 00:32:28,800 Speaker 3: again, with the math is racist thing in California, it ended 699 00:32:28,840 --> 00:32:32,160 Speaker 3: up being sort of left of center parents who spoke 700 00:32:32,240 --> 00:32:35,000 Speaker 3: up and fought back and kind of realized how crazy 701 00:32:35,040 --> 00:32:38,280 Speaker 3: it was, and that's really the only hope. But the pressure, 702 00:32:38,600 --> 00:32:41,120 Speaker 3: the far left pressures more 703 00:32:40,960 --> 00:32:41,760 Speaker 2: than anybody else. 704 00:32:41,920 --> 00:32:45,760 Speaker 3: The left pressures moderates to kind of fall in line, and 705 00:32:45,800 --> 00:32:47,440 Speaker 3: it's really hard to get them to break. 706 00:32:48,760 --> 00:32:52,280 Speaker 4: I mean, historically, it's usually a major crisis that spurs a 707 00:32:52,440 --> 00:32:55,120 Speaker 4: counter reaction. So at some point we'll see a counter reaction.
708 00:32:55,280 --> 00:32:57,640 Speaker 4: The problem is, this is where the fear kind of 709 00:32:57,720 --> 00:32:59,760 Speaker 4: comes into it. You can have something like nine 710 00:32:59,760 --> 00:33:03,880 Speaker 4: eleven and that not be an inciting event for national 711 00:33:03,960 --> 00:33:06,560 Speaker 4: unity. And yeah, it wasn't. So, like, 712 00:33:07,200 --> 00:33:08,920 Speaker 4: how much worse does it need to get before people 713 00:33:08,920 --> 00:33:11,520 Speaker 4: are like, okay, we need to drop the nonsense? Also, 714 00:33:11,840 --> 00:33:13,440 Speaker 4: I don't know what the rating on the show is. 715 00:33:13,520 --> 00:33:16,160 Speaker 7: You can, you can say all the bad words. When can 716 00:33:16,160 --> 00:33:18,400 Speaker 7: you drop the bullshit and actually get serious? Like, yeah, like, 717 00:33:18,400 --> 00:33:19,240 Speaker 7: how many bridges? 718 00:33:19,560 --> 00:33:21,880 Speaker 4: How many planes? Before people are like, I don't care 719 00:33:21,960 --> 00:33:25,560 Speaker 4: if the engineering corps looks like America. I want the best engineers. 720 00:33:26,080 --> 00:33:30,480 Speaker 4: I want ruthless efficiency. I don't know. That usually comes 721 00:33:30,480 --> 00:33:32,920 Speaker 4: around again. Historically speaking, this comes around when something 722 00:33:33,000 --> 00:33:36,440 Speaker 4: bad enough happens, like the Johnstown flood. Like, 723 00:33:36,720 --> 00:33:38,640 Speaker 4: a whole lot of people get killed, and it turns 724 00:33:38,680 --> 00:33:40,600 Speaker 4: out it's because the engineers didn't know what they were doing, 725 00:33:40,840 --> 00:33:43,479 Speaker 4: you might actually get a big fat counter reaction. 726 00:33:44,000 --> 00:33:44,520 Speaker 2: Well, we're not. 727 00:33:44,640 --> 00:33:45,600 Speaker 4: We're not at that point, though.
728 00:33:46,360 --> 00:33:50,320 Speaker 3: So in our book Stolen Youth, we covered how this 729 00:33:50,480 --> 00:33:53,280 Speaker 3: was going on in the medical field. And I don't 730 00:33:53,320 --> 00:33:55,640 Speaker 3: think we're ever going to see, you know, a bridge 731 00:33:55,640 --> 00:33:58,040 Speaker 3: collapse, sort of, you know what I mean, we're never 732 00:33:58,080 --> 00:33:59,960 Speaker 3: going to see that happen in the medical field 733 00:34:00,200 --> 00:34:03,960 Speaker 3: quite as clearly. But for example, every year they 734 00:34:04,000 --> 00:34:08,759 Speaker 3: have experts talk about premature births and NICU babies 735 00:34:09,320 --> 00:34:12,160 Speaker 3: on these panels, and one year they realized all their 736 00:34:12,200 --> 00:34:15,279 Speaker 3: experts were white men, so they had to change it. 737 00:34:15,400 --> 00:34:18,600 Speaker 3: So it was no longer the very, very best producing 738 00:34:18,640 --> 00:34:22,960 Speaker 3: the latest, you know, technology and results and best practices. 739 00:34:23,400 --> 00:34:25,920 Speaker 3: It was now not the very, very best. And we're 740 00:34:25,960 --> 00:34:28,960 Speaker 3: never going to know exactly how many 741 00:34:28,760 --> 00:34:29,919 Speaker 2: people were harmed by that. 742 00:34:29,920 --> 00:34:33,040 Speaker 3: That's always going to be kind of something we'll never 743 00:34:33,080 --> 00:34:33,560 Speaker 3: find out. 744 00:34:34,640 --> 00:34:37,919 Speaker 4: Yeah. So that's actually another great area, I concur. 745 00:34:37,960 --> 00:34:40,160 Speaker 4: Another scary one is the sort of devaluing 746 00:34:40,200 --> 00:34:42,319 Speaker 4: of standards in the medical field. Like, that can get 747 00:34:42,400 --> 00:34:44,719 Speaker 4: real deadly real fast.
I know at the Washington Free Beacon 748 00:34:44,719 --> 00:34:47,560 Speaker 4: they've been doing great work following insanity at UCLA 749 00:34:47,760 --> 00:34:50,880 Speaker 4: and how they have essentially broken down any sort of 750 00:34:50,920 --> 00:34:53,520 Speaker 4: standards that they have for their medical school. To your point, like, 751 00:34:53,560 --> 00:34:57,879 Speaker 4: this is the other issue too, the basic disappearance, 752 00:34:57,960 --> 00:35:00,160 Speaker 4: and a lot of it is self created, the 753 00:35:00,239 --> 00:35:04,120 Speaker 4: destruction of media credibility and trust in institutions. Like COVID. 754 00:35:04,120 --> 00:35:08,200 Speaker 4: Speaking of COVID, yeah, like, who trusts public health organizations, right? 755 00:35:08,239 --> 00:35:10,480 Speaker 4: Some people do, but certainly not the way it was. 756 00:35:10,840 --> 00:35:12,520 Speaker 2: I mean, I have to admit it. 757 00:35:12,760 --> 00:35:14,759 Speaker 3: I used to trust them. I used to just be like, 758 00:35:14,800 --> 00:35:17,600 Speaker 3: whatever, you know, obviously they're going to be bipartisan. 759 00:35:17,640 --> 00:35:18,400 Speaker 2: Why wouldn't they be? 760 00:35:18,520 --> 00:35:21,040 Speaker 3: Now it turns out, wow, not at all, and I 761 00:35:21,120 --> 00:35:22,000 Speaker 3: can't trust them. 762 00:35:22,600 --> 00:35:24,440 Speaker 4: I mean, before COVID, I had no 763 00:35:24,880 --> 00:35:27,120 Speaker 4: negative opinion of Fauci. I didn't really know anything about him. 764 00:35:27,400 --> 00:35:30,279 Speaker 4: After COVID, I definitely have an opinion of him. Yes, not 765 00:35:30,719 --> 00:35:33,879 Speaker 4: a positive one, right. But the issue is credibility. We have that with 766 00:35:33,960 --> 00:35:35,560 Speaker 4: massive organizations 767 00:35:35,640 --> 00:35:40,040 Speaker 4: or institutions, like health institutions. People don't trust them.
768 00:35:40,080 --> 00:35:42,279 Speaker 4: So if you're going to have that inciting event, it'd 769 00:35:42,320 --> 00:35:44,360 Speaker 4: be helpful to have someone who could report it. But 770 00:35:44,400 --> 00:35:46,600 Speaker 4: the problem is, nobody believes anyone. So you could have, 771 00:35:46,960 --> 00:35:48,880 Speaker 4: let's say, a massive bridge collapse, but you're not going 772 00:35:48,920 --> 00:35:51,120 Speaker 4: to trust the Washington Post's coverage of it. In fact, 773 00:35:51,160 --> 00:35:52,640 Speaker 4: you're gonna have to go listen to Joe Rogan to 774 00:35:52,640 --> 00:35:55,440 Speaker 4: find out what actually happened. So the thing is, 775 00:35:55,640 --> 00:35:58,719 Speaker 4: when we get that inciting event, I don't know when, 776 00:35:58,800 --> 00:36:00,239 Speaker 4: I don't know how bad it's going to have to be, 777 00:36:00,320 --> 00:36:01,879 Speaker 4: and then it's going to be slowed by the fact 778 00:36:01,960 --> 00:36:05,000 Speaker 4: that people don't trust what they hear. It's as simple 779 00:36:05,080 --> 00:36:06,600 Speaker 4: as that. People don't trust a lot of what they hear, 780 00:36:06,640 --> 00:36:10,520 Speaker 4: and things like COVID really sort of cemented that. I mean, 781 00:36:10,560 --> 00:36:13,920 Speaker 4: the idea that the lab leak was always a reasonable 782 00:36:14,000 --> 00:36:17,160 Speaker 4: hypothesis and you'd immediately be called a crank and a loon, right, 783 00:36:17,360 --> 00:36:17,680 Speaker 4: and then. 784 00:36:17,880 --> 00:36:21,319 Speaker 3: Later it was somehow less racist to say somebody ate a bat 785 00:36:21,440 --> 00:36:22,520 Speaker 3: and that's how it happened. 786 00:36:22,560 --> 00:36:27,040 Speaker 4: Like, I've said this. It's so funny that they're like, no, 787 00:36:27,160 --> 00:36:30,360 Speaker 4: the less racist thing is 788 00:36:30,360 --> 00:36:32,960 Speaker 4: not to say it was the result of a botched experiment, 789 00:36:33,080 --> 00:36:35,960 Speaker 4: but actually, you know, those people eat stuff, and 790 00:36:36,320 --> 00:36:37,880 Speaker 4: I'm like, how is that less racist? 791 00:36:38,000 --> 00:36:38,320 Speaker 2: Right? 792 00:36:38,719 --> 00:36:41,040 Speaker 4: Like, the New York Times' lead health reporter was talking 793 00:36:41,040 --> 00:36:43,759 Speaker 4: about how the lab leak theory was clearly racist. You're like, right. Oh, 794 00:36:43,840 --> 00:36:44,640 Speaker 4: it was worse. 795 00:36:44,960 --> 00:36:46,000 Speaker 2: Yeah. 796 00:36:46,080 --> 00:36:47,680 Speaker 4: They come out and they go, actually, you know, this 797 00:36:47,719 --> 00:36:50,600 Speaker 4: is probably what happened. You're like, what is happening? Yeah, 798 00:36:51,520 --> 00:36:53,920 Speaker 4: am I supposed to trust you now but not then? Or 799 00:36:54,000 --> 00:36:56,759 Speaker 4: was I supposed to trust you then but not now? You know, 800 00:36:56,920 --> 00:36:59,359 Speaker 4: it's hard to rebuild. It's easy to lose trust, 801 00:36:59,440 --> 00:37:00,879 Speaker 4: hard, hard to get it back. 802 00:37:01,160 --> 00:37:01,359 Speaker 2: Yeah. 803 00:37:01,400 --> 00:37:03,600 Speaker 4: I feel like, you know, this incident over the weekend 804 00:37:03,640 --> 00:37:08,240 Speaker 4: with Natasha Frost is a great example, not just as a journalist. 805 00:37:08,239 --> 00:37:10,480 Speaker 4: Just for your viewers, yeah, yeah, tell 806 00:37:10,560 --> 00:37:13,160 Speaker 4: us. Yeah, she leaked about nine hundred pages of 807 00:37:13,160 --> 00:37:19,080 Speaker 4: private conversations of Jewish Australians post November seventh, 808 00:37:19,080 --> 00:37:22,080 Speaker 4: sorry, October seventh, thank you. Yeah.
809 00:37:22,200 --> 00:37:24,239 Speaker 4: Somehow wormed her way into the group, and then she 810 00:37:24,480 --> 00:37:29,440 Speaker 4: recorded all the data, downloaded it, and shared it with opposition groups. 811 00:37:29,880 --> 00:37:33,399 Speaker 4: That is obviously unethical. That's on her. But the bigger thing 812 00:37:33,400 --> 00:37:38,480 Speaker 4: for me is how vague and not forthcoming the New 813 00:37:38,560 --> 00:37:39,360 Speaker 4: York Times is being, right? 814 00:37:39,560 --> 00:37:43,080 Speaker 7: She's a New York Times reporter in Australia, 815 00:37:43,080 --> 00:37:45,680 Speaker 7: and the New York Times won't say anything about it. 816 00:37:45,719 --> 00:37:48,360 Speaker 7: I know they removed her contact information from the website, 817 00:37:48,360 --> 00:37:49,920 Speaker 7: and they're like, you know, we've dealt with it. Like, well, 818 00:37:49,960 --> 00:37:50,799 Speaker 7: how have you dealt with it? 819 00:37:50,800 --> 00:37:54,560 Speaker 4: They're like, it's none of your business, don't worry about it. Yeah. 820 00:37:54,680 --> 00:37:57,400 Speaker 4: And then people don't trust you. This stuff I have covered, 821 00:37:57,440 --> 00:37:59,600 Speaker 4: and this is real quick. I've covered Congress and I've 822 00:37:59,600 --> 00:38:02,440 Speaker 4: covered newsrooms, and I've always had a much easier time 823 00:38:02,480 --> 00:38:04,759 Speaker 4: getting a straight answer, wow, out of a member of 824 00:38:04,800 --> 00:38:06,760 Speaker 4: the Senate or the House than I have ever gotten 825 00:38:06,800 --> 00:38:09,520 Speaker 4: from an ombudsman or an official spokesperson for any of 826 00:38:09,520 --> 00:38:11,960 Speaker 4: the newspapers. They do not talk. They do not explain. 827 00:38:12,000 --> 00:38:13,880 Speaker 4: They think it's beneath them to have to explain what 828 00:38:14,000 --> 00:38:16,000 Speaker 4: is going on.
Even if it comes down to, like, 829 00:38:16,080 --> 00:38:19,040 Speaker 4: explain your selection of pictures. It could be as simple as, oh, 830 00:38:19,040 --> 00:38:22,399 Speaker 4: that photo was the one we had licensed. No problem. Yeah, 831 00:38:22,480 --> 00:38:23,120 Speaker 4: but they won't tell you. 832 00:38:23,120 --> 00:38:24,800 Speaker 6: They're like, well, I don't need to tell you. Just 833 00:38:25,440 --> 00:38:29,560 Speaker 6: stop asking. No, people don't trust you. Do you think 834 00:38:29,600 --> 00:38:32,440 Speaker 6: that Alex Jones grew in a vacuum? Like, there's a 835 00:38:32,480 --> 00:38:35,040 Speaker 6: reason he has an audience. Absolutely. 836 00:38:35,320 --> 00:38:36,080 Speaker 4: Yeah, that's my point. 837 00:38:36,800 --> 00:38:38,120 Speaker 2: I completely agree. 838 00:38:38,160 --> 00:38:40,480 Speaker 3: I think loss of trust is such a problem, and 839 00:38:41,120 --> 00:38:42,680 Speaker 3: it's going to take, I mean, I don't know if 840 00:38:42,719 --> 00:38:45,239 Speaker 3: it's ever going to get recovered, actually, because for anybody who 841 00:38:45,280 --> 00:38:47,839 Speaker 3: remembers this time period, it's maybe very hard to get over. 842 00:38:48,160 --> 00:38:50,400 Speaker 3: I used to be a trusting person. I used to, 843 00:38:50,640 --> 00:38:53,320 Speaker 3: you know, kind of, okay, I didn't like the left media, 844 00:38:53,440 --> 00:38:55,000 Speaker 3: the liberal media. I didn't like the New York Times, 845 00:38:55,040 --> 00:38:58,040 Speaker 3: Washington Post, whatever. But I didn't think they were, I 846 00:38:58,080 --> 00:39:00,720 Speaker 3: don't know, I guess I didn't think they were liars 847 00:39:00,719 --> 00:39:03,480 Speaker 3: by design. And now I think they are.
I think 848 00:39:03,480 --> 00:39:08,200 Speaker 3: they're not trying to inform their audience, and that's 849 00:39:08,560 --> 00:39:11,439 Speaker 3: a major problem, I really do, you know. Like I said, 850 00:39:11,480 --> 00:39:14,600 Speaker 3: I think New York Times readers are often far behind 851 00:39:14,960 --> 00:39:17,759 Speaker 3: on what the news actually is compared to people who read 852 00:39:17,800 --> 00:39:18,280 Speaker 3: other sources. 853 00:39:18,640 --> 00:39:21,799 Speaker 4: So yeah, I think it's a problem. I mean, with 854 00:39:21,920 --> 00:39:24,600 Speaker 4: all generations, they say, not just the young reporters. 855 00:39:24,600 --> 00:39:26,600 Speaker 4: It's the idea that it is our job to shape 856 00:39:26,640 --> 00:39:28,000 Speaker 4: the narrative, the good narrative. 857 00:39:28,000 --> 00:39:30,640 Speaker 3: And then you have bad news about your side, and, 858 00:39:30,840 --> 00:39:32,480 Speaker 3: you know, right. And it's not 859 00:39:32,520 --> 00:39:34,440 Speaker 3: even necessarily that they're trying to just get someone elected. 860 00:39:34,440 --> 00:39:36,200 Speaker 4: No, it's, if we say this thing 861 00:39:36,239 --> 00:39:38,759 Speaker 4: that's damaging to our side, the bad guys might win, right. 862 00:39:38,880 --> 00:39:41,680 Speaker 4: And that's how they justify it. They actually think they're the good guys. 863 00:39:41,680 --> 00:39:44,160 Speaker 4: But the obvious question is, like, well, what's the good narrative? Yeah, 864 00:39:44,239 --> 00:39:46,799 Speaker 4: you can't just tell me the story? And as far 865 00:39:46,840 --> 00:39:49,279 Speaker 4: as trust is concerned, I mean, trust is cyclical.
You know, 866 00:39:49,440 --> 00:39:53,719 Speaker 4: you'd think that American generations going forward would never trust 867 00:39:53,719 --> 00:39:57,200 Speaker 4: the presidency again after Nixon. But people a hundred percent 868 00:39:57,280 --> 00:40:00,000 Speaker 4: believed that Obama was going to change the world. Yeah, yeah, 869 00:40:00,440 --> 00:40:03,080 Speaker 4: including the boomers who were alive for Nixon and saw 870 00:40:03,120 --> 00:40:05,840 Speaker 4: what happened. They know a president's capable of lying, right, and 871 00:40:05,920 --> 00:40:07,960 Speaker 4: obfuscating and being a... This is 872 00:40:07,920 --> 00:40:10,160 Speaker 2: a different party, Beckett. This is, you know, this is 873 00:40:10,200 --> 00:40:10,880 Speaker 2: the good party. 874 00:40:11,040 --> 00:40:13,080 Speaker 3: So obviously it's possible. 875 00:40:13,400 --> 00:40:15,600 Speaker 2: So it's possible. 876 00:40:15,239 --> 00:40:18,280 Speaker 4: Within our lifetimes that, yeah, the World Health Organization once again 877 00:40:18,600 --> 00:40:21,840 Speaker 4: claims the mantle of a trusted bipartisan institution. It's possible. 878 00:40:21,880 --> 00:40:23,840 Speaker 3: I'm going to be around to remind people not to 879 00:40:23,880 --> 00:40:27,480 Speaker 3: trust them, but we'll see. Thank you so much. This 880 00:40:27,520 --> 00:40:30,160 Speaker 3: has been such a great talk. I'll end here with 881 00:40:31,320 --> 00:40:33,560 Speaker 3: your best tip for my listeners on how they can 882 00:40:33,600 --> 00:40:36,759 Speaker 3: improve their lives. Let's get some optimism here. How do 883 00:40:36,840 --> 00:40:38,040 Speaker 3: people make their lives better? 884 00:40:39,320 --> 00:40:44,960 Speaker 4: Okay, so two tips. One, it is usually best, even 885 00:40:45,000 --> 00:40:47,000 Speaker 4: if it leads to a little conflict,
usually best to 886 00:40:47,080 --> 00:40:49,719 Speaker 4: say immediately when you're starting to feel annoyed or aggravated 887 00:40:49,719 --> 00:40:52,479 Speaker 4: about something. Do not let it sit and fester. I've learned 888 00:40:52,480 --> 00:40:54,080 Speaker 4: this in my marriage. 889 00:40:53,880 --> 00:40:58,000 Speaker 3: Immediately? For spouses, or is this just in general? In general? 890 00:40:58,040 --> 00:41:00,560 Speaker 4: Actually, it's great with coworkers, because otherwise, yeah, 891 00:41:00,600 --> 00:41:02,440 Speaker 4: if you try to rationalize it, like, I don't want 892 00:41:02,440 --> 00:41:05,319 Speaker 4: the conflict right now, I'm busy right now, then it's 893 00:41:05,360 --> 00:41:07,880 Speaker 4: possible you let it sit and fester, and then it becomes contempt. 894 00:41:07,920 --> 00:41:10,640 Speaker 4: And that's the last thing you want in any relationship in your life. 895 00:41:10,800 --> 00:41:14,279 Speaker 4: So find a diplomatic and polite way to immediately say, I'm 896 00:41:14,320 --> 00:41:17,120 Speaker 4: aggravated, or I'm unhappy with this thing, I don't like 897 00:41:17,160 --> 00:41:19,920 Speaker 4: this thing, X, Y, Z. Do it as fast as possible. 898 00:41:20,480 --> 00:41:24,160 Speaker 4: And the other one, ruthless enforcement of your time boundaries. 899 00:41:24,280 --> 00:41:27,560 Speaker 4: I know, I know the millennials like talking about boundaries 900 00:41:27,560 --> 00:41:29,840 Speaker 4: and stuff. No, at work, if close of 901 00:41:29,920 --> 00:41:32,960 Speaker 4: business is five o'clock, you have to observe it. 902 00:41:33,000 --> 00:41:35,000 Speaker 4: Because if you keep pushing it, you will become known as 903 00:41:35,000 --> 00:41:37,840 Speaker 4: somebody who doesn't have any set working hours. You 904 00:41:37,920 --> 00:41:41,279 Speaker 4: work all the time, anytime. If you aren't ruthlessly 905 00:41:41,400 --> 00:41:43,719 Speaker 4: enforcing that,
your job will take what it can, because 906 00:41:43,760 --> 00:41:46,600 Speaker 4: it doesn't care. So you have to. If it's after five, 907 00:41:46,960 --> 00:41:49,240 Speaker 4: that email can wait, the text can wait, unless somebody 908 00:41:49,320 --> 00:41:50,359 Speaker 4: is, like, bleeding and needs you. 909 00:41:50,840 --> 00:41:53,799 Speaker 3: Okay, yes. If you're a doctor, fine, you know, right. 910 00:41:54,800 --> 00:41:56,319 Speaker 3: If you're a writer, it's okay. 911 00:41:56,520 --> 00:41:58,600 Speaker 4: You know, in journalism, if you're on the breaking news desk, 912 00:41:58,680 --> 00:42:01,279 Speaker 4: I'm sorry, that's an on call situation, that's 913 00:42:01,320 --> 00:42:03,720 Speaker 4: being on call. But you get to a point where you're 914 00:42:03,760 --> 00:42:06,319 Speaker 4: maybe an instructor or a columnist, and you have to, 915 00:42:06,719 --> 00:42:08,960 Speaker 4: you really got to maintain that, especially if you're married 916 00:42:08,960 --> 00:42:09,600 Speaker 4: with kids. 917 00:42:09,920 --> 00:42:10,359 Speaker 2: That's great. 918 00:42:10,560 --> 00:42:13,120 Speaker 4: Otherwise it becomes, so when does your workday actually end? 919 00:42:13,160 --> 00:42:17,040 Speaker 2: Next thing you know, it's totally colonized your Sunday. 920 00:42:18,320 --> 00:42:20,520 Speaker 2: Thank you. Yeah, thank you so much. 921 00:42:20,560 --> 00:42:23,680 Speaker 3: He's T. Beckett Adams. Read him in The Hill, National Review, 922 00:42:23,760 --> 00:42:27,600 Speaker 3: the Washington Examiner. Follow his amazing X, extreme, I don't know, 923 00:42:27,719 --> 00:42:30,240 Speaker 3: I don't know what to call it anymore, Twitter profile, 924 00:42:30,800 --> 00:42:31,680 Speaker 3: Twitter, whatever. 925 00:42:32,760 --> 00:42:34,440 Speaker 2: Thank you so much, Beckett. Nice to have you. 926 00:42:34,800 --> 00:42:35,839 Speaker 4: Thanks so much for having me.
927 00:42:36,160 --> 00:42:38,799 Speaker 1: Thanks so much for joining us on the Carol Markowitz Show. 928 00:42:38,840 --> 00:42:41,160 Speaker 1: Subscribe wherever you get your podcasts.