1 00:00:02,759 --> 00:00:05,720 Speaker 1: Well, hey guys, we are back on Normally, the show 2 00:00:05,760 --> 00:00:08,160 Speaker 1: with normal-ish takes for when the news gets weird. 3 00:00:08,280 --> 00:00:09,680 Speaker 2: I am Mary Katharine Ham. 4 00:00:10,280 --> 00:00:12,800 Speaker 3: I'm Karol Markowicz. I keep waiting for the news not 5 00:00:12,960 --> 00:00:15,040 Speaker 3: to be weird, but it's just weird every day. 6 00:00:15,040 --> 00:00:17,239 Speaker 4: Mary Katharine, I'm weird every. 7 00:00:17,160 --> 00:00:20,360 Speaker 1: Day, the upside of the tagline. And I'm having a 8 00:00:20,400 --> 00:00:22,320 Speaker 1: weird week because I'm working without a net. I have 9 00:00:22,400 --> 00:00:26,160 Speaker 1: no childcare, so we're recording during nap time and we're 10 00:00:26,160 --> 00:00:27,080 Speaker 1: crossing our fingers. 11 00:00:27,080 --> 00:00:28,080 Speaker 2: But we can do it. 12 00:00:28,120 --> 00:00:30,120 Speaker 4: Karol, it's gonna be a fast one. Let's do it. 13 00:00:31,760 --> 00:00:32,479 Speaker 2: Oh my goodness. 14 00:00:32,600 --> 00:00:36,560 Speaker 1: Well, there's news in the trade war on the tariff front, 15 00:00:37,200 --> 00:00:42,400 Speaker 1: and that is that Donald Trump announced via the Truth Social 16 00:00:42,720 --> 00:00:47,440 Speaker 1: network that he is pausing for ninety days tariffs on 17 00:00:47,520 --> 00:00:51,120 Speaker 1: everyone else, the reciprocal tariffs, not the baseline ten percent, 18 00:00:51,159 --> 00:00:54,960 Speaker 1: which will still exist, all right, pausing the extra ones 19 00:00:55,040 --> 00:00:57,320 Speaker 1: for ninety days, except for on China. 20 00:00:58,200 --> 00:00:58,960 Speaker 2: Here's the Truth.
21 00:00:59,440 --> 00:01:01,520 Speaker 1: Based on the lack of respect that China has shown 22 00:01:01,560 --> 00:01:03,720 Speaker 1: to the world's markets, I am hereby raising the tariff 23 00:01:03,800 --> 00:01:05,920 Speaker 1: charged to China by the United States of America to 24 00:01:05,959 --> 00:01:09,399 Speaker 1: one hundred and twenty five percent effective immediately. At some point, 25 00:01:09,400 --> 00:01:11,600 Speaker 1: hopefully in the near future, China will realize that the 26 00:01:11,680 --> 00:01:14,280 Speaker 1: days of ripping off the USA and other countries 27 00:01:14,360 --> 00:01:19,200 Speaker 1: is no longer sustainable or acceptable. Conversely, and based on 28 00:01:19,240 --> 00:01:21,160 Speaker 1: the fact that more than seventy five countries have called 29 00:01:21,160 --> 00:01:24,440 Speaker 1: representatives of the United States, including the Departments of Commerce, Treasury, 30 00:01:24,440 --> 00:01:28,640 Speaker 1: and the US Trade Representative's Office, to negotiate a solution 31 00:01:28,720 --> 00:01:32,400 Speaker 1: to the subjects being discussed relative to trade, trade barriers, tariffs, currency 32 00:01:32,440 --> 00:01:35,280 Speaker 1: manipulation, and non-monetary tariffs, and that these countries have not, 33 00:01:35,560 --> 00:01:38,480 Speaker 1: at my strong suggestion, retaliated in any way, shape or 34 00:01:38,480 --> 00:01:39,640 Speaker 1: form against the United States, 35 00:01:40,040 --> 00:01:41,480 Speaker 2: I have authorized a ninety 36 00:01:41,280 --> 00:01:44,360 Speaker 1: day pause and a substantially lowered reciprocal tariff during this period 37 00:01:44,400 --> 00:01:46,280 Speaker 1: of ten percent, also effective immediately. 38 00:01:46,840 --> 00:01:48,960 Speaker 2: Thank you for your attention to this matter. 39 00:01:50,160 --> 00:01:53,320 Speaker 3: Everyone's like glued to their television screen like.
40 00:01:55,560 --> 00:01:57,160 Speaker 2: He has a way. 41 00:01:57,560 --> 00:01:59,320 Speaker 4: He really does. 42 00:02:00,320 --> 00:02:04,640 Speaker 1: So the markets responded well to this idea of a pause, 43 00:02:04,720 --> 00:02:06,760 Speaker 1: as they did to the fake news about a pause 44 00:02:06,800 --> 00:02:07,760 Speaker 1: a couple of days ago. 45 00:02:08,240 --> 00:02:12,760 Speaker 2: That should show the president something I'm not sure. 46 00:02:13,240 --> 00:02:15,920 Speaker 1: Again, this is the problem with the whole structure of 47 00:02:15,960 --> 00:02:18,880 Speaker 1: this is that he can change his mind unless Congress 48 00:02:18,880 --> 00:02:19,840 Speaker 1: takes this power. 49 00:02:19,600 --> 00:02:22,480 Speaker 2: Back in a moment if he wants to. 50 00:02:23,080 --> 00:02:26,560 Speaker 1: I like the idea of going after China, but being 51 00:02:26,800 --> 00:02:29,800 Speaker 1: chiller with people who are not involved in a bunch 52 00:02:29,840 --> 00:02:32,960 Speaker 1: of unfair practices. That seems good to me. Getting closer 53 00:02:33,000 --> 00:02:36,440 Speaker 1: to zero for zero tariffs seems good to me. Do 54 00:02:36,560 --> 00:02:37,959 Speaker 1: I think that was the plan here? 55 00:02:38,400 --> 00:02:41,519 Speaker 2: I'm not sure, but I'm glad we're here at the moment. 56 00:02:42,000 --> 00:02:44,240 Speaker 3: Well, that's the thing, right, What was the plan? 57 00:02:44,560 --> 00:02:46,440 Speaker 4: Was there a plan? We'll never know? 58 00:02:47,320 --> 00:02:52,320 Speaker 3: And I look again, you and I have been sort of, 59 00:02:52,720 --> 00:02:57,200 Speaker 3: I think, not cautiously optimistic, but just cautious in general 60 00:02:57,320 --> 00:03:01,680 Speaker 3: about this whole thing. And we're not surprised that it 61 00:03:01,760 --> 00:03:04,360 Speaker 3: ended up here for now. Again, it might start back 62 00:03:04,440 --> 00:03:05,600 Speaker 3: up again in ninety days. 
63 00:03:05,919 --> 00:03:07,600 Speaker 2: This is pretty much where we 64 00:03:07,600 --> 00:03:11,400 Speaker 3: predicted he'd be. Maybe we didn't see the China angle coming. 65 00:03:11,440 --> 00:03:14,239 Speaker 3: I thought maybe he would across the board pause 66 00:03:14,040 --> 00:03:16,920 Speaker 4: it for ninety days. I'm not mad at this. I 67 00:03:16,960 --> 00:03:18,399 Speaker 4: think that China 68 00:03:18,160 --> 00:03:21,960 Speaker 3: does deserve some retribution for what they do to us. 69 00:03:22,160 --> 00:03:25,040 Speaker 3: And look, if this is where we end up, I 70 00:03:25,040 --> 00:03:25,959 Speaker 3: think that'll be okay. 71 00:03:27,400 --> 00:03:30,640 Speaker 1: Howard Lutnick, who is the Commerce Secretary, tweeted, I just 72 00:03:30,639 --> 00:03:32,840 Speaker 1: found this very funny, as a cabinet member for Trump: 73 00:03:33,160 --> 00:03:36,000 Speaker 1: Scott Bessent and I sat with the President while he 74 00:03:36,040 --> 00:03:37,400 Speaker 1: wrote one of the most 75 00:03:37,160 --> 00:03:40,640 Speaker 2: extraordinary Truths, my god, of his presidency. 76 00:03:40,960 --> 00:03:42,720 Speaker 1: The world is ready to work with President Trump to 77 00:03:42,720 --> 00:03:45,600 Speaker 1: fix global trade, and China has chosen the opposite direction. 78 00:03:46,480 --> 00:03:51,360 Speaker 1: I have a feeling that Scott Bessent was like, this 79 00:03:51,440 --> 00:03:55,280 Speaker 1: looks like a win to me, sir, and here's how 80 00:03:55,320 --> 00:03:56,360 Speaker 1: we can tell that story. 81 00:03:56,400 --> 00:03:59,000 Speaker 2: Bessent's quote is: this was driven by the President's strategy. 82 00:03:59,400 --> 00:04:00,680 Speaker 2: This was his all along. 83 00:04:00,800 --> 00:04:02,920 Speaker 1: You might even say he goaded China into a 84 00:04:02,920 --> 00:04:05,360 Speaker 1: bad position.
They have shown themselves to the world to 85 00:04:05,360 --> 00:04:07,680 Speaker 1: be bad actors, and we are willing to cooperate with 86 00:04:07,720 --> 00:04:10,440 Speaker 1: our allies and with our trading partners who did not retaliate. 87 00:04:10,960 --> 00:04:11,280 Speaker 2: Again. 88 00:04:11,440 --> 00:04:13,480 Speaker 1: I think this is a reframing, but it's one that 89 00:04:14,520 --> 00:04:16,000 Speaker 1: I don't mind if it gets us. 90 00:04:16,279 --> 00:04:18,960 Speaker 3: Yeah, let's declare victory and get the hell out of this. 91 00:04:20,200 --> 00:04:22,960 Speaker 2: Yes, cutting the federal workforce please, right. 92 00:04:23,040 --> 00:04:27,240 Speaker 3: I think things were going so well until this tariff thing. 93 00:04:28,160 --> 00:04:31,320 Speaker 3: The idea of bringing back manufacturing seems like a stretch, 94 00:04:31,400 --> 00:04:34,479 Speaker 3: but I'm open to hearing plans about it. 95 00:04:35,480 --> 00:04:37,560 Speaker 4: Look, all of this is heading. 96 00:04:37,360 --> 00:04:43,120 Speaker 3: Towards being automated anyway. So this idea of let's bring 97 00:04:43,200 --> 00:04:46,680 Speaker 3: back these amazing jobs and have manufacturing in the United 98 00:04:46,680 --> 00:04:49,760 Speaker 3: States probably isn't what's actually going to happen. 99 00:04:50,360 --> 00:04:53,760 Speaker 4: We'll see. I'm happy with this. 100 00:04:53,839 --> 00:04:56,279 Speaker 3: If we could just say we won and leave it here, 101 00:04:56,480 --> 00:04:58,839 Speaker 3: that would be good in my opinion. 102 00:04:59,160 --> 00:05:03,640 Speaker 1: Yeah, we have manufacturing jobs that go unfilled here, so 103 00:05:03,960 --> 00:05:05,599 Speaker 1: it's not that they don't exist. 
I think one of 104 00:05:05,600 --> 00:05:08,719 Speaker 1: the issues is that we need to do a better 105 00:05:08,800 --> 00:05:12,000 Speaker 1: job of training people for those types of jobs, which 106 00:05:12,040 --> 00:05:14,200 Speaker 1: is a skill we've lost over the years, maybe doing a 107 00:05:15,120 --> 00:05:18,800 Speaker 1: little too much DEI teaching in the schools instead of 108 00:05:18,680 --> 00:05:20,279 Speaker 2: shop class, that kind of thing. 109 00:05:21,400 --> 00:05:25,080 Speaker 1: And I do remain concerned about the uncertainty itself, and 110 00:05:25,120 --> 00:05:27,440 Speaker 1: that is just like the price of admission with Trump. 111 00:05:27,560 --> 00:05:30,479 Speaker 3: It so is, yeah, you're getting on a roller coaster. 112 00:05:30,600 --> 00:05:34,599 Speaker 1: Enjoy the ride, but it does have effects, you know, 113 00:05:34,720 --> 00:05:37,880 Speaker 1: in real time, like on the real markets, 114 00:05:37,920 --> 00:05:41,240 Speaker 1: because people will get worried about investing their money, 115 00:05:41,240 --> 00:05:42,640 Speaker 1: and I don't want them worried about that. 116 00:05:42,680 --> 00:05:45,640 Speaker 2: Absolutely. I think this is a better track though, right. 117 00:05:45,520 --> 00:05:49,760 Speaker 3: The uncertainty is obviously the key problem with the last 118 00:05:49,760 --> 00:05:53,480 Speaker 3: few days, but I will say that if we knew 119 00:05:53,600 --> 00:05:55,960 Speaker 3: that he was going to pause the tariffs, then I 120 00:05:56,040 --> 00:05:59,320 Speaker 3: have to imagine the people investing in the markets also 121 00:05:59,400 --> 00:06:02,440 Speaker 3: had an idea that that was going to happen, you know, 122 00:06:03,000 --> 00:06:06,040 Speaker 3: buy the dip, sell it when Trump pauses the tariffs, 123 00:06:06,040 --> 00:06:08,120 Speaker 3: that you know, that's where we are.
124 00:06:08,160 --> 00:06:12,680 Speaker 1: Basically, one Robby Soave was bragging about how he 125 00:06:12,720 --> 00:06:14,960 Speaker 1: had, he had guessed 126 00:06:14,640 --> 00:06:17,200 Speaker 2: this, almost with, 127 00:06:17,240 --> 00:06:21,360 Speaker 1: the same language that the President presented. So he writes 128 00:06:21,400 --> 00:06:23,400 Speaker 1: for Reason magazine. So I appreciated that. 129 00:06:23,960 --> 00:06:27,200 Speaker 3: Yeah, I think we guessed it. We basically 130 00:06:27,240 --> 00:06:29,120 Speaker 3: said that that was where he was going to end up. 131 00:06:29,160 --> 00:06:30,880 Speaker 3: I mean, again, this is not an ending, so we 132 00:06:30,920 --> 00:06:33,359 Speaker 3: don't know, but we felt like there was going to 133 00:06:33,400 --> 00:06:34,040 Speaker 3: be a pause. 134 00:06:34,200 --> 00:06:40,320 Speaker 4: And look, there was one. So follow us for more investing advice. 135 00:06:42,240 --> 00:06:45,000 Speaker 1: Well, I mean the thing is too, like normies who 136 00:06:45,040 --> 00:06:48,040 Speaker 1: are not necessarily heavy investors 137 00:06:47,480 --> 00:06:49,839 Speaker 2: but have 401(k)s. Yeah, who are 138 00:06:49,680 --> 00:06:52,760 Speaker 1: watching the markets or hearing things while they're, you know, 139 00:06:52,880 --> 00:06:56,839 Speaker 1: have CNBC on in the background, or you know, worried 140 00:06:56,839 --> 00:06:59,839 Speaker 1: about the impact it might have on grocery prices, which 141 00:06:59,839 --> 00:07:02,680 Speaker 1: is something that we should care about for regular people. 142 00:07:03,839 --> 00:07:06,960 Speaker 1: They're not maybe thinking through all the details of trade policy, 143 00:07:07,480 --> 00:07:10,840 Speaker 1: but they're concerned just like we are, and I think 144 00:07:11,280 --> 00:07:13,239 Speaker 1: it's good to give them some comfort.
145 00:07:13,440 --> 00:07:15,160 Speaker 4: We'll be right back on Normally. 146 00:07:18,560 --> 00:07:21,720 Speaker 3: At RealClearInvestigations, you're going to be so surprised 147 00:07:21,720 --> 00:07:25,160 Speaker 3: by this, Mary Katharine. Lee Fang has found that the 148 00:07:25,200 --> 00:07:31,680 Speaker 3: whole Kamala Harris brat summer was actually astroturf. No, yes, 149 00:07:31,880 --> 00:07:36,320 Speaker 3: I know, I know, I too could not believe it. 150 00:07:36,600 --> 00:07:40,240 Speaker 3: Lee writes: influencers flooded the web with neon matcha green 151 00:07:40,400 --> 00:07:44,680 Speaker 3: pro-Harris videos synced to beats from Charli XCX's album 152 00:07:44,760 --> 00:07:48,640 Speaker 3: Brat, released last summer. Last year, the poppy videos, 153 00:07:48,720 --> 00:07:53,880 Speaker 3: journalists gushed, showed that Harris embodied the confidently independent brat 154 00:07:53,600 --> 00:07:55,040 Speaker 4: vibe conveyed by the music. 155 00:07:55,440 --> 00:07:59,120 Speaker 3: Social media pages bubbled with memes celebrating Harris as the 156 00:07:59,200 --> 00:08:01,680 Speaker 3: voice of queer and black youth, in contrast with the 157 00:08:01,680 --> 00:08:05,920 Speaker 3: Republican agenda of white supremacy. Now, this was a career 158 00:08:06,000 --> 00:08:10,360 Speaker 3: politician who was deeply entrenched in the Democratic Party's establishment, 159 00:08:10,800 --> 00:08:15,040 Speaker 3: and she was somehow rebranded as like the latest, coolest thing. 160 00:08:15,480 --> 00:08:20,280 Speaker 3: It all seemed a bit suspicious at the time. Now 161 00:08:20,480 --> 00:08:24,840 Speaker 3: RealClear obtained internal documents and WhatsApp messages from Democratic 162 00:08:24,880 --> 00:08:28,800 Speaker 3: strategists behind the influencer campaign called Way to Win, one 163 00:08:28,840 --> 00:08:31,920 Speaker 3: of the major donor groups behind the effort.
They spent 164 00:08:32,000 --> 00:08:35,200 Speaker 3: more than nine point one million dollars on social media 165 00:08:35,240 --> 00:08:39,200 Speaker 3: influencers during the twenty twenty four presidential election. Five 166 00:08:39,280 --> 00:08:42,800 Speaker 3: hundred and fifty content creators published six thousand, six hundred and 167 00:08:42,880 --> 00:08:46,880 Speaker 3: forty four posts across TikTok, Instagram, YouTube, Twitch, and X. 168 00:08:47,800 --> 00:08:51,000 Speaker 3: The thing is that we have no idea that any 169 00:08:51,040 --> 00:08:53,880 Speaker 3: of this goes on until after the fact. And then 170 00:08:54,200 --> 00:08:57,719 Speaker 3: the story has barely been hitting, you know, any of 171 00:08:57,760 --> 00:09:00,280 Speaker 3: the mainstream news outlets. We felt like we had 172 00:09:00,280 --> 00:09:02,079 Speaker 3: to talk about it on this show because it's been 173 00:09:02,120 --> 00:09:06,280 Speaker 3: so undercovered. Let's listen to this clip from Lee about 174 00:09:06,320 --> 00:09:10,240 Speaker 3: what the problem is in getting kind of to the 175 00:09:10,240 --> 00:09:11,319 Speaker 3: truth of this sort of thing. 176 00:09:12,080 --> 00:09:17,120 Speaker 5: This whole thing raises ethical concerns about transparency in political campaigning. 177 00:09:17,400 --> 00:09:22,400 Speaker 5: Should there be stricter disclosure requirements imposed on influencers for influencer 178 00:09:22,440 --> 00:09:23,559 Speaker 5: based political content? 179 00:09:24,280 --> 00:09:28,280 Speaker 6: Look, we've had, we've had rules for decades now where, 180 00:09:28,360 --> 00:09:30,760 Speaker 6: you know, if you sponsor a billboard, a radio ad, 181 00:09:30,800 --> 00:09:34,120 Speaker 6: a television ad, the last few seconds of that message 182 00:09:34,160 --> 00:09:36,600 Speaker 6: has to show who sponsored it.
You know, on the billboard, 183 00:09:36,880 --> 00:09:39,840 Speaker 6: there's a disclosure at the bottom of which PAC or campaign 184 00:09:40,320 --> 00:09:45,640 Speaker 6: sponsored that message. No such rules exist for social media, 185 00:09:45,920 --> 00:09:49,000 Speaker 6: for these influencers on Instagram, TikTok, and what have you. 186 00:09:49,280 --> 00:09:51,839 Speaker 6: It's really the wild West. At the end of twenty 187 00:09:51,880 --> 00:09:58,880 Speaker 6: twenty three, the Federal Election Commission proposed some influencer payola disclosures. 188 00:09:59,040 --> 00:10:01,800 Speaker 6: They deadlocked on the issue. There was no movement, so 189 00:10:01,920 --> 00:10:04,480 Speaker 6: right now, really anything could happen. And you know, we 190 00:10:04,480 --> 00:10:06,960 Speaker 6: were seeing this problem more and more on both sides, 191 00:10:07,000 --> 00:10:10,720 Speaker 6: both Republicans and Democrats have kind of gotten caught doing this. 192 00:10:10,720 --> 00:10:14,679 Speaker 6: This particular story that we're reporting this week, this is 193 00:10:14,720 --> 00:10:18,360 Speaker 6: the biggest, I think, effort of this type on either side, 194 00:10:18,440 --> 00:10:21,360 Speaker 6: Democrats spending ten million dollars just in this one kind 195 00:10:21,400 --> 00:10:24,760 Speaker 6: of organization. But it's a big problem, and it's a 196 00:10:24,760 --> 00:10:27,200 Speaker 6: big problem for disclosure. We haven't really resolved it. 197 00:10:27,320 --> 00:10:30,319 Speaker 4: Yeah, you know, you and I have talked about recently, 198 00:10:30,440 --> 00:10:34,760 Speaker 3: when the Florida bill came up about immigration, influencers were 199 00:10:34,800 --> 00:10:39,000 Speaker 3: paid to praise a Florida politician. It was so obvious, 200 00:10:40,040 --> 00:10:43,720 Speaker 3: and then somebody found the actual payment structure of that. 201 00:10:44,520 --> 00:10:47,200 Speaker 4: It is the wild West.
We don't know who's getting 202 00:10:47,200 --> 00:10:47,680 Speaker 4: paid for what. 203 00:10:47,840 --> 00:10:51,400 Speaker 3: These influencers are taking money for their opinion. And it's 204 00:10:51,440 --> 00:10:54,360 Speaker 3: sort of interesting because obviously you know, and again we've 205 00:10:54,360 --> 00:10:56,760 Speaker 3: talked about it on here, but like if they already 206 00:10:56,800 --> 00:10:59,079 Speaker 3: believe what they're saying, I guess the feeling is like, 207 00:10:59,120 --> 00:11:00,640 Speaker 3: why shouldn't I get paid for it? 208 00:11:00,960 --> 00:11:03,800 Speaker 4: But everybody else has to disclose? I think they should 209 00:11:03,800 --> 00:11:04,400 Speaker 4: have to also. 210 00:11:04,960 --> 00:11:07,120 Speaker 1: Now I think it's tricky because it is in 211 00:11:07,160 --> 00:11:11,520 Speaker 1: this area, this gray area of election law, so you 212 00:11:11,679 --> 00:11:13,120 Speaker 1: don't really have to disclose this stuff. 213 00:11:13,160 --> 00:11:13,640 Speaker 2: What worried me. 214 00:11:13,679 --> 00:11:17,000 Speaker 1: And by the way, I appreciate Lee's reporting often, and 215 00:11:17,040 --> 00:11:21,640 Speaker 1: he's lhfang on X if you want to follow him. 216 00:11:21,800 --> 00:11:24,120 Speaker 1: He does a lot of interesting work along these lines, 217 00:11:24,160 --> 00:11:25,600 Speaker 1: and it takes a lot of work to track down 218 00:11:25,600 --> 00:11:29,720 Speaker 1: all these DMs and all the easy financing. But he 219 00:11:29,880 --> 00:11:32,280 Speaker 1: notes that Way to Win, which is the organization that 220 00:11:32,320 --> 00:11:35,480 Speaker 1: did this stuff, structured the funds through nonprofit corporations that 221 00:11:35,600 --> 00:11:39,200 Speaker 1: paid various influencer talent agencies, firms such as Palette Management 222 00:11:39,240 --> 00:11:43,240 Speaker 1: and Vocal Media, so it's all intertwined.
But the nonprofit 223 00:11:43,480 --> 00:11:48,040 Speaker 1: groups struck me, like how much, perhaps, taxpayer money 224 00:11:48,080 --> 00:11:51,400 Speaker 1: is going to various nonprofits that can get dumped into 225 00:11:52,040 --> 00:11:57,199 Speaker 1: electioneering and influencer campaigns. Although, like you said, Karol, we 226 00:11:57,200 --> 00:11:58,560 Speaker 1: weren't really tricked by this one. 227 00:11:58,600 --> 00:12:02,160 Speaker 2: I think that this was entirely organic. 228 00:12:03,600 --> 00:12:09,080 Speaker 3: Regularly the brat, Kamala's brat, so brat, super brat. 229 00:12:08,800 --> 00:12:10,080 Speaker 4: You know, like pant. 230 00:12:10,040 --> 00:12:11,400 Speaker 2: suit, I sort of. 231 00:12:11,760 --> 00:12:16,040 Speaker 1: I sort of enjoy when politics just kills a slang 232 00:12:16,120 --> 00:12:20,800 Speaker 1: term immediately, like brat was really like smothered in the crib. 233 00:12:20,880 --> 00:12:25,079 Speaker 1: That one went so fast, because once a politician takes it, 234 00:12:25,640 --> 00:12:28,120 Speaker 1: even though they're trying to tell you the politician's cool, 235 00:12:28,600 --> 00:12:29,120 Speaker 1: it's over. 236 00:12:29,760 --> 00:12:30,480 Speaker 2: It's done. 237 00:12:30,920 --> 00:12:33,400 Speaker 1: Even, even I can't kill it as quickly as Kamala 238 00:12:33,480 --> 00:12:34,439 Speaker 1: or Hillary Clinton can. 239 00:12:34,679 --> 00:12:35,480 Speaker 2: So yeah, I. 240 00:12:35,400 --> 00:12:37,480 Speaker 3: mean once you start using it around your kids, I 241 00:12:37,520 --> 00:12:39,680 Speaker 3: think it's also usually pretty dead. 242 00:12:40,040 --> 00:12:43,160 Speaker 2: I did delight in that, by the way. Yeah, the same with skibidi. 243 00:12:43,320 --> 00:12:46,439 Speaker 2: I delight in that. Oh my gosh, yes, Ohio, so much.
244 00:12:47,920 --> 00:12:50,360 Speaker 3: But yeah, so back to this. You know, I 245 00:12:50,360 --> 00:12:53,120 Speaker 3: think there have to be some guidelines. I don't know 246 00:12:53,160 --> 00:12:57,200 Speaker 3: how they can establish them. 247 00:12:57,800 --> 00:13:00,439 Speaker 4: It's so tough, like what if you 248 00:13:00,280 --> 00:13:04,320 Speaker 3: tweet something positive about whatever somebody, and somebody hits the 249 00:13:04,320 --> 00:13:06,440 Speaker 3: tip jar that a lot of these influencers have, like 250 00:13:06,480 --> 00:13:08,720 Speaker 3: do they then have to disclose? I think it's not 251 00:13:08,880 --> 00:13:12,880 Speaker 3: quite as black and white as an ad, and there 252 00:13:12,920 --> 00:13:14,880 Speaker 3: have to be some rules to this. Like, you know, 253 00:13:14,960 --> 00:13:17,440 Speaker 3: I'm sure you have also, I've been accused of taking 254 00:13:17,480 --> 00:13:21,760 Speaker 3: money from various causes. However, I'm open, on the record, like, yes, 255 00:13:21,840 --> 00:13:25,240 Speaker 3: come pay me, but I'm mostly just kidding. I have 256 00:13:25,360 --> 00:13:28,240 Speaker 3: never taken any money from any cause ever. I have 257 00:13:28,360 --> 00:13:32,480 Speaker 3: foolishly done this all for free. And I think that 258 00:13:32,520 --> 00:13:37,000 Speaker 3: there has to be some level of trust between the 259 00:13:37,040 --> 00:13:41,120 Speaker 3: consumer of our information and us, and the way that 260 00:13:41,200 --> 00:13:43,680 Speaker 3: I think we establish that trust is not taking money 261 00:13:43,840 --> 00:13:47,720 Speaker 3: to push any causes, even causes we already agree with, 262 00:13:47,880 --> 00:13:51,400 Speaker 3: and so there has to be some sort of policy 263 00:13:51,440 --> 00:13:51,880 Speaker 3: around this. 264 00:13:52,160 --> 00:13:53,280 Speaker 4: I don't think this can go on.
265 00:13:54,080 --> 00:13:57,760 Speaker 1: There are ways that social media has made rules for this, 266 00:13:57,920 --> 00:14:01,400 Speaker 1: so it could come not necessarily from government regulation, but 267 00:14:01,480 --> 00:14:04,920 Speaker 1: from the social media entities themselves, where a lot of 268 00:14:04,960 --> 00:14:09,160 Speaker 1: them require tagging something as an ad or disclosing what 269 00:14:09,200 --> 00:14:13,200 Speaker 1: you're doing, and I don't see why dedicated posts couldn't 270 00:14:13,240 --> 00:14:16,760 Speaker 1: be treated that way as well. It may not be 271 00:14:16,800 --> 00:14:19,880 Speaker 1: forthcoming from the FEC, however, and like you said, could 272 00:14:19,920 --> 00:14:20,240 Speaker 1: cause a 273 00:14:20,280 --> 00:14:23,960 Speaker 2: bunch of other downstream unintended issues. 274 00:14:24,240 --> 00:14:26,400 Speaker 3: As a result, we're going to take a short break 275 00:14:26,400 --> 00:14:28,000 Speaker 3: and come right back with Normally. 276 00:14:31,840 --> 00:14:35,040 Speaker 1: Jim VandeHei, who now works at Axios, I believe, 277 00:14:35,080 --> 00:14:39,200 Speaker 1: was a founder of Politico. Yeah, that's correct, is that Axios? 278 00:14:39,480 --> 00:14:42,440 Speaker 1: And he was speaking with Bari Weiss on her Honestly 279 00:14:42,520 --> 00:14:47,760 Speaker 1: podcast about lack of trust in media, not influencers this 280 00:14:47,840 --> 00:14:51,560 Speaker 1: time, media. But aren't they sometimes the same? Weren't they 281 00:14:51,600 --> 00:14:55,760 Speaker 1: as well? I can remember there was one segment post 282 00:14:55,880 --> 00:15:03,040 Speaker 1: brat summer news cycle where a CNN anchor was wearing 283 00:15:03,520 --> 00:15:07,960 Speaker 1: the color, like, on purpose, saying like, here we are.
284 00:15:10,680 --> 00:15:13,040 Speaker 1: That is a hazy memory, but I'm pretty sure that 285 00:15:13,080 --> 00:15:16,880 Speaker 1: happened, at any rate. And he's talking to her, and 286 00:15:17,120 --> 00:15:19,400 Speaker 1: this is a longish clip, but it's worth listening to 287 00:15:19,440 --> 00:15:22,280 Speaker 1: the whole thing to get an idea from a guy 288 00:15:22,360 --> 00:15:25,760 Speaker 1: who's at the top of political media, who's being, I 289 00:15:25,760 --> 00:15:29,040 Speaker 1: would say, a little more introspective than many of them are, 290 00:15:30,600 --> 00:15:33,760 Speaker 1: to see what he diagnoses as the problem, and then we'll 291 00:15:33,560 --> 00:15:34,400 Speaker 2: talk about it. 292 00:15:34,560 --> 00:15:38,920 Speaker 7: I feel like the trust really started to shatter over 293 00:15:38,960 --> 00:15:41,760 Speaker 7: the last decade, and I look at it in three phases. 294 00:15:42,600 --> 00:15:46,160 Speaker 7: The first was the creation of Twitter. What happened with 295 00:15:46,240 --> 00:15:48,560 Speaker 7: Twitter is people forget, like now it has a lot 296 00:15:48,560 --> 00:15:51,680 Speaker 7: of conservative voices, a lot of independent voices. It was 297 00:15:51,720 --> 00:15:55,320 Speaker 7: a hotbed of liberal groupthink for a long time. 298 00:15:55,600 --> 00:15:58,080 Speaker 7: And it was the first time since I've been in 299 00:15:58,080 --> 00:16:00,680 Speaker 7: this business that I would get on a feed and 300 00:16:00,720 --> 00:16:03,600 Speaker 7: I would see reporters who I had trusted, who I 301 00:16:03,640 --> 00:16:06,720 Speaker 7: had admired, making it crystal clear what their views were, 302 00:16:06,960 --> 00:16:09,640 Speaker 7: what side they were on. You could tell in what 303 00:16:09,640 --> 00:16:11,920 Speaker 7: they were tweeting, and you could tell in who they 304 00:16:11,920 --> 00:16:14,920 Speaker 7: were following and who was following them.
So I thought 305 00:16:14,960 --> 00:16:18,680 Speaker 7: that was stage one, because at least before, any bias 306 00:16:18,720 --> 00:16:21,600 Speaker 7: people had, they hid from the public. Now it was 307 00:16:21,720 --> 00:16:25,480 Speaker 7: in somewhat full view. Then came along kind of 308 00:16:25,520 --> 00:16:32,040 Speaker 7: the COVID, defund the police, word policing, where I think 309 00:16:32,080 --> 00:16:33,840 Speaker 7: a lot of Americans were looking around and being like, 310 00:16:34,480 --> 00:16:37,400 Speaker 7: it doesn't sit right with me, and it doesn't, and 311 00:16:37,400 --> 00:16:40,560 Speaker 7: the way it was being covered didn't sit right with them. 312 00:16:41,280 --> 00:16:44,200 Speaker 7: And then I think the final straw really was the 313 00:16:44,200 --> 00:16:46,560 Speaker 7: coverage of Joe Biden, when people were saying, hey, I 314 00:16:46,560 --> 00:16:47,920 Speaker 7: could see with my own two eyes that the guy 315 00:16:48,000 --> 00:16:51,760 Speaker 7: seems pretty old, probably doesn't seem capable of being the 316 00:16:51,800 --> 00:16:54,040 Speaker 7: President in the next term, and yet there's not a 317 00:16:54,080 --> 00:16:56,320 Speaker 7: whole hell of a lot of coverage of it. We, 318 00:16:56,520 --> 00:16:59,200 Speaker 7: you know, Alex Thompson, our White House reporter, just won 319 00:16:59,200 --> 00:17:02,280 Speaker 7: the White House Reporting Award because he was one of 320 00:17:02,280 --> 00:17:06,200 Speaker 7: the few reporters to write about the decline of Joe Biden, 321 00:17:06,760 --> 00:17:11,720 Speaker 7: and he got hell from exactly some of the very 322 00:17:11,800 --> 00:17:14,080 Speaker 7: reporters who, you know, will applaud him 323 00:17:14,119 --> 00:17:16,879 Speaker 7: when he wins the award in a couple of weeks.
324 00:17:17,080 --> 00:17:18,960 Speaker 7: If you go back and look at their Twitter feeds, 325 00:17:19,000 --> 00:17:21,160 Speaker 7: there's a lot of people dunking on him and saying, oh, 326 00:17:21,160 --> 00:17:23,800 Speaker 7: that's not actually true. And we're on the other end 327 00:17:23,840 --> 00:17:25,720 Speaker 7: of the calls from the White House saying that you 328 00:17:25,760 --> 00:17:28,040 Speaker 7: guys are out on a limb, you're wrong, you're crazy, 329 00:17:28,119 --> 00:17:31,280 Speaker 7: blah blah blah. And so I think those three things 330 00:17:32,119 --> 00:17:36,320 Speaker 7: in totality really cemented the distrust that a lot of 331 00:17:36,320 --> 00:17:40,639 Speaker 7: people have in media, and it breaks my heart. I 332 00:17:40,680 --> 00:17:43,280 Speaker 7: hate that. I love journalism. I am a fierce, fierce 333 00:17:43,400 --> 00:17:47,560 Speaker 7: defender of journalism. I believe that most reporters at 334 00:17:47,560 --> 00:17:50,560 Speaker 7: most institutions actually do try to get to the closest 335 00:17:50,560 --> 00:17:53,720 Speaker 7: approximation of the truth and achieve it most of the time. 336 00:17:53,840 --> 00:17:56,199 Speaker 7: I think it's a couple of bad apples who make 337 00:17:56,240 --> 00:17:57,840 Speaker 7: it look bad for everyone. 338 00:17:58,160 --> 00:18:01,679 Speaker 3: I mean, the idea that it was somebody else, you know, 339 00:18:01,800 --> 00:18:03,879 Speaker 3: it's that whole meme of the guy in 340 00:18:03,920 --> 00:18:06,040 Speaker 3: the hot dog costume, where he's like, we're looking for 341 00:18:06,119 --> 00:18:09,760 Speaker 3: the person who did this, and not taking any responsibility. 342 00:18:10,400 --> 00:18:12,640 Speaker 3: I think that's actually problem number one. 343 00:18:12,760 --> 00:18:15,720 Speaker 4: The things that he diagnoses, he's right, he's.
344 00:18:15,600 --> 00:18:17,520 Speaker 2: Right, his diagnoses are correct, yes. 345 00:18:17,320 --> 00:18:22,399 Speaker 3: Yes, but the blame-placing on some vague other journalists 346 00:18:22,720 --> 00:18:24,359 Speaker 3: is a little bit too rich. 347 00:18:25,000 --> 00:18:27,520 Speaker 1: Yeah, I got a couple problems with this one. At 348 00:18:27,560 --> 00:18:31,480 Speaker 1: the risk of my reputation, I am a fourth-generation 349 00:18:31,680 --> 00:18:36,080 Speaker 1: newspaper journalist. I know, shame upon my whole family line. 350 00:18:36,359 --> 00:18:38,600 Speaker 3: You're so awesome. Come on, that's amazing. 351 00:18:38,760 --> 00:18:43,239 Speaker 1: So my great-grandfather owned a newspaper in Georgia, and 352 00:18:43,280 --> 00:18:45,800 Speaker 1: then my grandfather worked for him as a sports writer, 353 00:18:45,880 --> 00:18:47,879 Speaker 1: and then my dad was an editor, and then I 354 00:18:47,920 --> 00:18:52,560 Speaker 1: was at a newspaper before newspapers failed. So I am 355 00:18:52,680 --> 00:18:54,880 Speaker 1: sort of like coming from a family of ink-stained wretches, 356 00:18:54,880 --> 00:18:56,840 Speaker 1: and I've always been part of media in a sort 357 00:18:56,840 --> 00:18:59,760 Speaker 1: of like help me, help you kind of way. Like 358 00:18:59,800 --> 00:19:02,159 Speaker 1: I want you to do the job you're supposed to 359 00:19:02,200 --> 00:19:05,240 Speaker 1: do now.
That has been a seriously uphill battle, and 360 00:19:05,280 --> 00:19:09,000 Speaker 1: I found that I was ideologically an outlier, even in 361 00:19:10,000 --> 00:19:13,320 Speaker 1: a small newsroom in rural North Carolina, right, because journalists 362 00:19:13,320 --> 00:19:16,440 Speaker 1: are just very, very left, and there are so many 363 00:19:16,440 --> 00:19:18,080 Speaker 1: of them that are very left, and I would argue 364 00:19:18,200 --> 00:19:20,680 Speaker 1: they're more left now than they've ever been, and more 365 00:19:20,720 --> 00:19:24,280 Speaker 1: elite and elitist than they've ever been, and it 366 00:19:24,359 --> 00:19:25,600 Speaker 1: leads to all these blind spots. 367 00:19:26,480 --> 00:19:28,920 Speaker 2: The idea that it started with Twitter. 368 00:19:29,400 --> 00:19:37,399 Speaker 1: That we didn't know before? Like, friend, I knew before 369 00:19:37,400 --> 00:19:40,600 Speaker 1: two thousand and seven or nine or whenever it was, 370 00:19:41,520 --> 00:19:45,480 Speaker 1: so that was why I was not shocked. I actually 371 00:19:45,560 --> 00:19:48,720 Speaker 1: prefer it to be crystal clear what their biases are, 372 00:19:49,440 --> 00:19:50,919 Speaker 1: because I knew before. 373 00:19:51,040 --> 00:19:53,680 Speaker 2: Stop hiding it from me. One of the reasons that we. 374 00:19:53,600 --> 00:19:55,199 Speaker 1: Do what we do the way we do it is 375 00:19:55,240 --> 00:19:57,199 Speaker 1: to be able to say, like, this is where I stand. 376 00:19:58,000 --> 00:19:58,960 Speaker 2: You can judge. 377 00:19:59,000 --> 00:20:00,399 Speaker 1: I will try to get to the truth, but you 378 00:20:00,440 --> 00:20:02,960 Speaker 1: can judge my motivations how you wish. 379 00:20:03,119 --> 00:20:04,359 Speaker 2: Sure, that's part of it.
380 00:20:05,640 --> 00:20:09,919 Speaker 1: The COVID and defund the police and the, you know, 381 00:20:10,040 --> 00:20:12,960 Speaker 1: woke stuff all slammed together in that very short mention. 382 00:20:14,240 --> 00:20:16,679 Speaker 1: It really doesn't even begin to get to the crazy, 383 00:20:17,040 --> 00:20:22,000 Speaker 1: really, the stuff they endorsed? I mean, it doesn't even begin, right. 384 00:20:22,640 --> 00:20:25,800 Speaker 3: You didn't mention things like Kavanaugh, for example, which he 385 00:20:26,000 --> 00:20:31,080 Speaker 3: was particularly involved in pushing, specifically him, and, you know, 386 00:20:31,160 --> 00:20:33,480 Speaker 3: things like that just get left off the list. 387 00:20:33,880 --> 00:20:37,280 Speaker 1: And then there's Joe Biden, where it's like, oh, the 388 00:20:37,320 --> 00:20:40,880 Speaker 1: slings and arrows. I appreciate it is true that Thompson 389 00:20:41,160 --> 00:20:43,720 Speaker 1: reported when others did not, as did the Wall Street 390 00:20:43,760 --> 00:20:46,200 Speaker 1: Journal reporters who first wrote the formal story on this. 391 00:20:46,560 --> 00:20:48,280 Speaker 1: But the rest of us were just like, Hi, guys, 392 00:20:48,280 --> 00:20:49,320 Speaker 1: we can see this. 393 00:20:49,600 --> 00:20:50,600 Speaker 4: Right with our eyes. 394 00:20:51,000 --> 00:20:52,959 Speaker 2: There are no awards forthcoming for that. 395 00:20:53,440 --> 00:20:55,159 Speaker 1: I love the idea that the White House was like, 396 00:20:55,160 --> 00:20:58,240 Speaker 1: you're really out on a limb here, buster. Like, am 397 00:20:58,320 --> 00:21:01,800 Speaker 1: I? I can see you wandering through a field. I don't 398 00:21:01,840 --> 00:21:04,439 Speaker 1: think I'm that far out on a limb. And then the idea 399 00:21:04,480 --> 00:21:06,040 Speaker 1: that it's just a few of them. 400 00:21:07,119 --> 00:21:09,880 Speaker 2: It's unanimous, right, the groupthink?
401 00:21:10,040 --> 00:21:12,679 Speaker 1: Why does the groupthink exist that he 402 00:21:12,800 --> 00:21:14,639 Speaker 1: identifies at the beginning of Twitter? 403 00:21:15,200 --> 00:21:18,280 Speaker 2: It's because they all think the same thing, right? And 404 00:21:18,359 --> 00:21:18,880 Speaker 2: I think. 405 00:21:18,720 --> 00:21:22,840 Speaker 3: That they were surprised to find that that's identified as 406 00:21:22,840 --> 00:21:26,240 Speaker 3: groupthink, or that that's even leftism. To them, that's 407 00:21:26,320 --> 00:21:28,920 Speaker 3: just normal. And what do you mean, we're not leftists, 408 00:21:28,960 --> 00:21:32,160 Speaker 3: we're just mainstream journalists who all think the same way. 409 00:21:32,200 --> 00:21:33,280 Speaker 4: Why is that weird to you? 410 00:21:34,520 --> 00:21:38,280 Speaker 1: It's just neutral, good, right, like all the things we think. 411 00:21:38,720 --> 00:21:42,240 Speaker 1: But all the things we think became defund the police 412 00:21:42,359 --> 00:21:45,920 Speaker 1: and stay in your home for two years, and toddlers 413 00:21:45,920 --> 00:21:48,520 Speaker 1: should wear masks, and schools should be closed, and you 414 00:21:48,560 --> 00:21:50,360 Speaker 1: should never say any of the words we say. 415 00:21:50,400 --> 00:21:51,080 Speaker 2: You can't say. 416 00:21:51,480 --> 00:21:53,360 Speaker 1: Nina Jankowicz should be in charge of all of it. 417 00:21:54,040 --> 00:21:56,760 Speaker 1: And Biden is great for another four years. I mean, 418 00:21:56,960 --> 00:22:00,680 Speaker 1: come on, how can they be surprised that people don't 419 00:22:00,680 --> 00:22:01,200 Speaker 1: trust them? 420 00:22:01,760 --> 00:22:04,520 Speaker 3: Yeah, they can't be. And I don't know if there's 421 00:22:04,560 --> 00:22:06,639 Speaker 3: a way back from it. I, you know, we'll see 422 00:22:06,640 --> 00:22:07,320 Speaker 3: what happens.
423 00:22:07,640 --> 00:22:09,600 Speaker 4: Maybe there's enough resisters to. 424 00:22:11,119 --> 00:22:15,120 Speaker 3: Make the failing mainstream media relevant again, but it's hard 425 00:22:15,200 --> 00:22:16,800 Speaker 3: to see how that's going to happen. 426 00:22:17,000 --> 00:22:20,000 Speaker 1: Yeah. As I always say, this country is desperate for 427 00:22:20,080 --> 00:22:23,879 Speaker 1: a reliable narrator, and the media absolutely refuses. 428 00:22:23,400 --> 00:22:23,880 Speaker 2: To be one. 429 00:22:24,320 --> 00:22:28,479 Speaker 1: Yeah, and for that reason, they have lost tons of power. 430 00:22:28,560 --> 00:22:30,399 Speaker 1: And I think actually the losing of their power and 431 00:22:30,440 --> 00:22:33,679 Speaker 1: the dispersal of viewpoints, yes, makes it hard sometimes to 432 00:22:33,680 --> 00:22:36,560 Speaker 1: figure out what information you're getting, and how, and what 433 00:22:36,680 --> 00:22:39,439 Speaker 1: is real and what is not, but it also means that 434 00:22:39,440 --> 00:22:41,679 Speaker 1: they can't create fake narratives the way that they used to. 435 00:22:42,080 --> 00:22:42,520 Speaker 2: That's right. 436 00:22:42,560 --> 00:22:44,479 Speaker 3: We're going to tell you the truth on this show 437 00:22:44,880 --> 00:22:47,959 Speaker 3: whether or not you like it, so stay tuned. Enjoy. 438 00:22:48,320 --> 00:22:50,840 Speaker 4: Yeah. Thanks for joining us on Normally. 439 00:22:50,960 --> 00:22:54,600 Speaker 3: Normally airs Tuesdays and Thursdays, and you can subscribe anywhere 440 00:22:54,680 --> 00:22:56,080 Speaker 3: you get your podcasts. 441 00:22:56,320 --> 00:22:57,280 Speaker 4: Get in touch with us. 442 00:22:57,200 --> 00:22:59,560 Speaker 3: At normallythepod at gmail dot com. 443 00:22:59,680 --> 00:23:02,840 Speaker 4: Thanks for listening, and when things get weird, act normally