1 00:00:02,200 --> 00:00:05,200 Speaker 1: We just want people to you know, live their lives 2 00:00:05,559 --> 00:00:07,640 Speaker 1: and be happy and be able to enjoy it without 3 00:00:07,680 --> 00:00:11,480 Speaker 1: some you know, lunatic screaming in their face every three seconds. 4 00:00:11,640 --> 00:00:15,280 Speaker 2: We should be focusing on the issues that actually occupy 5 00:00:15,880 --> 00:00:19,040 Speaker 2: a lot of the mental space in Americans' minds, that 6 00:00:19,160 --> 00:00:21,960 Speaker 2: are filled with conspiracies, and we should fill it with 7 00:00:22,000 --> 00:00:23,079 Speaker 2: the truth and solutions. 8 00:00:23,280 --> 00:00:26,480 Speaker 3: As we built Midas Touched, that community aspect was almost 9 00:00:26,480 --> 00:00:28,720 Speaker 3: more important than us saying you know, here are our 10 00:00:28,760 --> 00:00:32,520 Speaker 3: political beliefs and here are these political stances, but more like, hey, 11 00:00:32,560 --> 00:00:34,360 Speaker 3: don't y'all want things to be normal? 12 00:00:35,240 --> 00:00:38,440 Speaker 4: Hi? Everyone, I'm Kittie Couric, and this is next question. 13 00:00:41,800 --> 00:00:44,400 Speaker 4: It's no secret that one of the things that propelled 14 00:00:44,400 --> 00:00:47,600 Speaker 4: Donald Trump to victory in the last election was right 15 00:00:47,640 --> 00:00:53,080 Speaker 4: wing media, a potent combination of conservative cable podcasts and 16 00:00:53,200 --> 00:00:57,840 Speaker 4: social media influencers. Now progressive voices are trying to make 17 00:00:57,880 --> 00:01:02,800 Speaker 4: sure that their footprint is as powerful and as big. 18 00:01:03,120 --> 00:01:06,720 Speaker 4: The Micellas brothers have created The Midas Touch, one of 19 00:01:06,760 --> 00:01:10,959 Speaker 4: the few digital networks. 
Yes, they considered themselves a network 20 00:01:11,319 --> 00:01:15,040 Speaker 4: to crack the code, gaining millions of followers and even 21 00:01:15,120 --> 00:01:18,520 Speaker 4: beating Joe Rogan for the number one spot on Apple 22 00:01:18,640 --> 00:01:22,120 Speaker 4: Podcasts a few months back, So I'm really excited to 23 00:01:22,160 --> 00:01:25,000 Speaker 4: talk to them about what they're doing and how they're 24 00:01:25,000 --> 00:01:29,840 Speaker 4: doing it. First of all, hello guys, welcome to next question. 25 00:01:30,000 --> 00:01:33,120 Speaker 4: Thank you so much. An exclusive, the first time the 26 00:01:33,160 --> 00:01:36,000 Speaker 4: three of you have done an in person interview together, 27 00:01:36,120 --> 00:01:39,720 Speaker 4: right first, that early first Wow, I'm very honored and 28 00:01:39,880 --> 00:01:42,800 Speaker 4: very flattered, and I wanted to ask a little bit 29 00:01:43,000 --> 00:01:47,160 Speaker 4: about your life story. You all grew up on Long Island, 30 00:01:47,760 --> 00:01:52,680 Speaker 4: you all had your own respective careers. Ben, What were 31 00:01:52,680 --> 00:01:55,560 Speaker 4: you doing before you decided to embark on this venture. 32 00:01:56,040 --> 00:01:58,720 Speaker 2: Well, I was a lawyer. I went to law school 33 00:01:58,720 --> 00:02:02,080 Speaker 2: at Georgetown Law was a civil rights lawyer, and I 34 00:02:02,120 --> 00:02:06,640 Speaker 2: was working civil rights cases out in the Bakersfield, Fresno area, 35 00:02:07,080 --> 00:02:10,960 Speaker 2: representing families who lost loved ones in police brutality cases. 36 00:02:11,160 --> 00:02:11,400 Speaker 4: Wow. 37 00:02:11,520 --> 00:02:12,880 Speaker 1: So it was through that work that I. 
38 00:02:12,880 --> 00:02:17,560 Speaker 2: Ended up meeting Colin Kaepernick, and Colin Kaepernick took a 39 00:02:17,600 --> 00:02:21,400 Speaker 2: knee for police brutality to bring attention to it, and 40 00:02:21,440 --> 00:02:25,079 Speaker 2: then from there the pandemic happened and the Minus Touch 41 00:02:25,120 --> 00:02:27,200 Speaker 2: network kind of sprung out of it. But my background 42 00:02:27,240 --> 00:02:29,799 Speaker 2: was a lawyer. I think my parents were surprised when 43 00:02:29,800 --> 00:02:31,040 Speaker 2: I went into podcasting. 44 00:02:31,160 --> 00:02:31,360 Speaker 3: But I. 45 00:02:32,840 --> 00:02:33,960 Speaker 1: Think they're happy within them. 46 00:02:34,040 --> 00:02:36,400 Speaker 4: I'm sure they are. How about you, Brett, what were 47 00:02:36,440 --> 00:02:36,840 Speaker 4: you doing? 48 00:02:36,919 --> 00:02:38,440 Speaker 3: I was a film and TV so I moved out 49 00:02:38,480 --> 00:02:40,639 Speaker 3: to Los Angeles to go to USC and I was 50 00:02:40,680 --> 00:02:43,680 Speaker 3: in film school there and then I graduated freelance for 51 00:02:43,760 --> 00:02:46,680 Speaker 3: a bid, ended up working for the Ellen Degenero Show 52 00:02:46,720 --> 00:02:49,840 Speaker 3: as an editor there, ran the digital department, post production 53 00:02:49,960 --> 00:02:53,240 Speaker 3: team over there. I had a great experience. Ended up 54 00:02:53,280 --> 00:02:56,360 Speaker 3: working for as a creative executive in film and television 55 00:02:56,400 --> 00:02:59,640 Speaker 3: after that, and then the pandemic hit. Production quickly shut 56 00:02:59,680 --> 00:03:03,799 Speaker 3: down and we started this project and the rest is history. 57 00:03:03,919 --> 00:03:07,360 Speaker 4: Well, you guys were really early adapters for like digital content. 
58 00:03:07,440 --> 00:03:11,520 Speaker 4: I remember doing such an incredible job, so you knew, gosh, 59 00:03:11,760 --> 00:03:14,160 Speaker 4: a lot of people aren't watching the show at home. 60 00:03:14,320 --> 00:03:17,079 Speaker 4: We are going to make sure that everyone sees it online. 61 00:03:17,080 --> 00:03:18,160 Speaker 3: And I got to give them a lot of credit. 62 00:03:18,240 --> 00:03:19,680 Speaker 3: On the producers, there are a lot of credit. They 63 00:03:19,720 --> 00:03:22,800 Speaker 3: saw that incredibly early on. And what ended up happening 64 00:03:22,840 --> 00:03:24,400 Speaker 3: was when I got there, there was a team of 65 00:03:24,440 --> 00:03:26,560 Speaker 3: like five people. I was the new guy. There was like, 66 00:03:26,639 --> 00:03:28,760 Speaker 3: you know, four or five other people. By the time 67 00:03:28,800 --> 00:03:30,520 Speaker 3: I left there, you know, we had a team of 68 00:03:30,600 --> 00:03:32,920 Speaker 3: I don't know if you include you know, kind of 69 00:03:33,800 --> 00:03:36,040 Speaker 3: other staff kind of associated with it, probably more like 70 00:03:36,080 --> 00:03:39,400 Speaker 3: eighty people there. And so I saw this entire division 71 00:03:39,440 --> 00:03:41,480 Speaker 3: of Digital blossom and I got to be there while 72 00:03:41,520 --> 00:03:44,720 Speaker 3: it happened. And I never thought that editing and helping 73 00:03:44,760 --> 00:03:47,480 Speaker 3: to create videos that maybe about human interest stories and 74 00:03:47,520 --> 00:03:49,720 Speaker 3: cat videos and whatever we were doing there would end 75 00:03:49,760 --> 00:03:53,000 Speaker 3: up apply to making political videos. But somehow I think 76 00:03:53,000 --> 00:03:55,520 Speaker 3: it ended up being a great education actually and doing that. 77 00:03:55,440 --> 00:03:59,040 Speaker 4: So I'm sure it did. 
Meanwhile, the baby, Geordy, tell 78 00:03:59,080 --> 00:04:01,760 Speaker 4: me what you were up before you got involved in this. 79 00:04:01,880 --> 00:04:04,600 Speaker 1: Marketing and advertising. Actually, right here in New York, heart 80 00:04:04,640 --> 00:04:08,360 Speaker 1: of Times Square, this agency called Translation Incredible Agency. I 81 00:04:08,440 --> 00:04:10,680 Speaker 1: like to say that they gave me my PhD in 82 00:04:10,720 --> 00:04:13,400 Speaker 1: New age marketing. It was really awesome. It was a 83 00:04:13,480 --> 00:04:16,040 Speaker 1: nimble operation over there. I think they had less than 84 00:04:16,240 --> 00:04:18,520 Speaker 1: maybe eighty to one hundred employees when I was there, 85 00:04:18,839 --> 00:04:21,080 Speaker 1: and so you got to wear a lot of different hats, 86 00:04:21,160 --> 00:04:24,160 Speaker 1: so from the creative department to strategy to account management, 87 00:04:24,440 --> 00:04:27,000 Speaker 1: just kind of seeing campaigns, taking them from the earliest 88 00:04:27,040 --> 00:04:30,080 Speaker 1: stages of development and seeing the whole execution all the 89 00:04:30,120 --> 00:04:32,600 Speaker 1: way through. It was really really cool and seeing how 90 00:04:32,640 --> 00:04:36,040 Speaker 1: they kind of used new age tech and social media 91 00:04:36,080 --> 00:04:39,080 Speaker 1: to really emphasize and reach their audience in different ways. 92 00:04:39,400 --> 00:04:40,720 Speaker 1: It was eye opening at the time. 93 00:04:41,040 --> 00:04:44,440 Speaker 4: So your skill sets really complement each other. You all 94 00:04:44,480 --> 00:04:48,320 Speaker 4: bring something different to the table. And so it's the pandemic. 95 00:04:49,120 --> 00:04:53,200 Speaker 4: You all are getting increasingly frustrated by what you're watching unfold, 96 00:04:53,320 --> 00:04:57,520 Speaker 4: to say the least. 
And Ben, what was your aha moment, 97 00:04:57,720 --> 00:05:00,960 Speaker 4: like we have to do more? And how did you 98 00:05:01,000 --> 00:05:04,200 Speaker 4: all come together and say let's launch this new venture. 99 00:05:04,920 --> 00:05:08,360 Speaker 2: Well, you heard our backgrounds. We're not political people missing 100 00:05:08,400 --> 00:05:12,479 Speaker 2: from those backgrounds with stories of working in politics. We're 101 00:05:12,560 --> 00:05:16,920 Speaker 2: just people who were scared during the pandemic. We were nervous, 102 00:05:17,560 --> 00:05:20,440 Speaker 2: we were worried about our families, we were worried about 103 00:05:20,440 --> 00:05:24,080 Speaker 2: our friends, and we were sending each other text messages 104 00:05:24,240 --> 00:05:27,279 Speaker 2: about what we were going through, and so we said, 105 00:05:27,279 --> 00:05:30,760 Speaker 2: how do we translate those messages into you know, we 106 00:05:30,800 --> 00:05:33,200 Speaker 2: almost felt selfish if we didn't bring what we were 107 00:05:33,240 --> 00:05:36,119 Speaker 2: talking about to the public. So we just started saying, 108 00:05:36,160 --> 00:05:38,120 Speaker 2: what if we wrote about this, What if we started 109 00:05:38,120 --> 00:05:40,599 Speaker 2: a blog, What if we use bread skills who could 110 00:05:40,640 --> 00:05:43,919 Speaker 2: do videos and editing? What if we use Geordie's marketing 111 00:05:43,920 --> 00:05:48,159 Speaker 2: skills and just start telling people, here's what we're feeling, 112 00:05:48,279 --> 00:05:52,160 Speaker 2: here's what we're experiencing. You're not alone, we're a family, 113 00:05:52,760 --> 00:05:55,159 Speaker 2: and here's what we're trying to do to help each other. 
114 00:05:55,600 --> 00:06:00,280 Speaker 2: So we start releasing a article, a video that Brett 115 00:06:00,440 --> 00:06:03,279 Speaker 2: edits Geordie helps coming up with some of the framing 116 00:06:03,320 --> 00:06:06,000 Speaker 2: of it with his marketing skills, and it seemed to 117 00:06:06,040 --> 00:06:11,200 Speaker 2: resonate because nobody had really seen a focus on these 118 00:06:11,279 --> 00:06:16,080 Speaker 2: topics that really focused on the human experience as opposed 119 00:06:16,080 --> 00:06:19,800 Speaker 2: to just kind of an overtly political side to it. 120 00:06:20,200 --> 00:06:23,359 Speaker 2: So people had this kind of aha, like I feel 121 00:06:23,360 --> 00:06:26,359 Speaker 2: that way too, And once we started seeing that, the 122 00:06:26,440 --> 00:06:29,040 Speaker 2: shock to us was like, wait a minute, no one 123 00:06:29,040 --> 00:06:32,400 Speaker 2: else is really talking like that and connecting in that way. 124 00:06:32,440 --> 00:06:34,119 Speaker 2: And so we did it again and again and again, 125 00:06:34,560 --> 00:06:37,040 Speaker 2: and from there it just kind of kept on growing. 126 00:06:37,480 --> 00:06:40,160 Speaker 2: You know, a video did one hundred thousand views, then 127 00:06:40,160 --> 00:06:43,320 Speaker 2: a million views, then five million views, and from there 128 00:06:43,320 --> 00:06:46,359 Speaker 2: it kind of sprung up and a community started developing 129 00:06:46,400 --> 00:06:49,320 Speaker 2: around it, and the months progressed and we started seeing 130 00:06:49,360 --> 00:06:51,320 Speaker 2: that this was actually having a real impact. 
131 00:06:51,080 --> 00:06:53,760 Speaker 4: And then you sort of broadened your aperture if you will, 132 00:06:53,839 --> 00:06:57,000 Speaker 4: and kind of started focusing on all kinds of things, 133 00:06:57,080 --> 00:07:01,719 Speaker 4: not just obviously the COVID and the or misinformation or 134 00:07:01,720 --> 00:07:04,479 Speaker 4: how you were feeling and what was upsetting you, but 135 00:07:04,839 --> 00:07:08,000 Speaker 4: the political spectrum in general, right, Brett. 136 00:07:07,800 --> 00:07:10,080 Speaker 3: Yeah, no, one hundred percent. You know, I think we 137 00:07:10,400 --> 00:07:13,240 Speaker 3: just you know, we were trying to share our lived 138 00:07:13,280 --> 00:07:15,960 Speaker 3: experience with everybody out there, and so when we saw 139 00:07:16,040 --> 00:07:18,840 Speaker 3: something was that we thought was crazy, we would just 140 00:07:18,920 --> 00:07:21,440 Speaker 3: be like, this, isn't this crazy? Everybody what we're watching together? 141 00:07:21,520 --> 00:07:23,640 Speaker 3: Like is it just us? And like I think for 142 00:07:23,720 --> 00:07:25,520 Speaker 3: a while we were all just looking at each other 143 00:07:25,600 --> 00:07:28,120 Speaker 3: like I don't understand, Like why is this being treated 144 00:07:28,160 --> 00:07:31,040 Speaker 3: like this is normal? 
Especially was during the time when 145 00:07:31,160 --> 00:07:33,960 Speaker 3: you know, Trump was saying this fifteen cases going down 146 00:07:34,000 --> 00:07:35,640 Speaker 3: to zero, It's never going to come to our short 147 00:07:35,720 --> 00:07:37,960 Speaker 3: it was just lie after lie after lie, and I 148 00:07:38,040 --> 00:07:40,160 Speaker 3: was like we were all like, you know, this is 149 00:07:40,520 --> 00:07:42,440 Speaker 3: going to be bad, right this is and then we 150 00:07:42,520 --> 00:07:44,280 Speaker 3: just kept seeing, you know, all the kind of political 151 00:07:44,360 --> 00:07:47,120 Speaker 3: ramifications stem from there. I think that was a very 152 00:07:47,160 --> 00:07:50,200 Speaker 3: kind of pivotal moment in shaping people's beliefs. And in 153 00:07:50,240 --> 00:07:52,520 Speaker 3: addition to just COVID, I think a lot of people 154 00:07:52,520 --> 00:07:55,520 Speaker 3: at that point were sort of being siloed into different 155 00:07:55,640 --> 00:07:58,360 Speaker 3: kind of groups at that point, and there was a 156 00:07:58,400 --> 00:08:02,440 Speaker 3: deep division that was kind of being put into our society, 157 00:08:02,800 --> 00:08:04,560 Speaker 3: and people were kind of falling. 158 00:08:04,320 --> 00:08:09,200 Speaker 4: Into force by their own self selection are algorithmically reinforve and. 159 00:08:09,120 --> 00:08:12,320 Speaker 3: The fact that everybody was home and everybody was not 160 00:08:12,960 --> 00:08:15,880 Speaker 3: speaking with each other. And so as we develop mad 161 00:08:15,920 --> 00:08:19,640 Speaker 3: to people were mad for all different reasons. I mean people, 162 00:08:19,680 --> 00:08:21,600 Speaker 3: some people were mad that they had to you know, 163 00:08:21,720 --> 00:08:24,040 Speaker 3: be home. Some people were mad that, you know, why 164 00:08:24,080 --> 00:08:26,040 Speaker 3: isn't this Why don't they just take the vaccine? 
Why 165 00:08:26,120 --> 00:08:28,440 Speaker 3: don't they respect my space? People were all mad at 166 00:08:28,440 --> 00:08:30,640 Speaker 3: each other for all different reasons, and so there were 167 00:08:30,680 --> 00:08:34,880 Speaker 3: these communities developing, and so community became an essential aspect 168 00:08:34,880 --> 00:08:37,880 Speaker 3: to us, more so than the politics of are you 169 00:08:37,920 --> 00:08:40,400 Speaker 3: able to kind of assemble a community that's based around 170 00:08:40,480 --> 00:08:42,800 Speaker 3: kind of positivity and wanting to do the right thing 171 00:08:43,280 --> 00:08:45,280 Speaker 3: rather than a lot of these sort of more malicious 172 00:08:45,320 --> 00:08:48,439 Speaker 3: online communities that exist out there that seek to kind 173 00:08:48,440 --> 00:08:51,240 Speaker 3: of sew division and hatred. And during that time there 174 00:08:51,280 --> 00:08:53,640 Speaker 3: was definitely, you know a lot of that, and so 175 00:08:54,360 --> 00:08:56,920 Speaker 3: as we built kind of might as touched the community 176 00:08:56,960 --> 00:08:59,440 Speaker 3: aspect was almost more important than us saying you know, 177 00:08:59,520 --> 00:09:02,679 Speaker 3: here are political beliefs, and here are these political stances, 178 00:09:02,720 --> 00:09:05,080 Speaker 3: but more like, hey, don't you all want things to 179 00:09:05,120 --> 00:09:07,080 Speaker 3: be normal? Like can't we see that this stuff is 180 00:09:07,160 --> 00:09:09,840 Speaker 3: freaking nuts? And like if we could all just rally 181 00:09:09,880 --> 00:09:12,400 Speaker 3: around that, like maybe we could actually put those political 182 00:09:12,400 --> 00:09:15,600 Speaker 3: differences aside in many ways. 
And so as we progressed, 183 00:09:15,640 --> 00:09:17,760 Speaker 3: one of the things that we saw was that, you know, 184 00:09:17,840 --> 00:09:20,880 Speaker 3: there were folks who may consider themselves, you know, as 185 00:09:20,920 --> 00:09:23,040 Speaker 3: liberal as could be, joining us. Then there were folks 186 00:09:23,080 --> 00:09:26,679 Speaker 3: who consider themselves Republicans or former Republicans who were joining 187 00:09:26,760 --> 00:09:30,360 Speaker 3: us independence. Like we saw, you know, people coming to 188 00:09:30,440 --> 00:09:33,120 Speaker 3: us for all sorts of reasons. But at the end 189 00:09:33,160 --> 00:09:35,160 Speaker 3: of the day, I think the core was people just 190 00:09:35,240 --> 00:09:38,559 Speaker 3: wanting things to be like normal again, right, understanding that 191 00:09:38,840 --> 00:09:40,760 Speaker 3: what we're seeing then, and by the way, what we're 192 00:09:40,800 --> 00:09:43,320 Speaker 3: still seeing now is not normal. 193 00:09:43,440 --> 00:09:46,240 Speaker 4: I was going to say, I mean, I wouldn't necessarily say, 194 00:09:46,280 --> 00:09:49,319 Speaker 4: and you can correct me if you disagree that your 195 00:09:49,760 --> 00:09:54,080 Speaker 4: content exudes positivity at this moment in time. And I 196 00:09:54,120 --> 00:09:58,400 Speaker 4: feel like you have gone through various iterations, Jeordi. During 197 00:09:58,440 --> 00:10:02,880 Speaker 4: the Biden administration, were you guys a little more dormant. 198 00:10:03,000 --> 00:10:07,360 Speaker 4: There were fewer things for people of a certain political persuasion, 199 00:10:07,520 --> 00:10:11,720 Speaker 4: I think, to be outraged about. So did your growth 200 00:10:12,400 --> 00:10:17,240 Speaker 4: really explode after the election of Donald Trump the second time? 
201 00:10:17,840 --> 00:10:20,720 Speaker 1: No, I would say we saw our strongest growth period 202 00:10:20,760 --> 00:10:24,000 Speaker 1: when President Biden was president. During that era. I think 203 00:10:24,000 --> 00:10:26,520 Speaker 1: it's a slight misconception about what we do that you know, 204 00:10:26,640 --> 00:10:29,360 Speaker 1: maybe like like just angry you know, out the world 205 00:10:29,559 --> 00:10:31,520 Speaker 1: or or anything like that. We really just try and 206 00:10:31,520 --> 00:10:34,400 Speaker 1: bring folks facts and data, and it's sort of that 207 00:10:34,520 --> 00:10:37,400 Speaker 1: approach that has led the community to grow. When you 208 00:10:37,440 --> 00:10:39,920 Speaker 1: watch our videos, for example, you know Ben will say 209 00:10:40,000 --> 00:10:42,320 Speaker 1: a couple of sentences, We'll just say play the clip, 210 00:10:42,640 --> 00:10:45,880 Speaker 1: because playing the clip many times is way more effective 211 00:10:45,920 --> 00:10:48,840 Speaker 1: than Ben explaining every little slight detail and maybe even 212 00:10:48,880 --> 00:10:51,520 Speaker 1: giving his opinion in that second. If people are just 213 00:10:51,640 --> 00:10:53,719 Speaker 1: take away the receipts from the videos and we let 214 00:10:53,720 --> 00:10:56,960 Speaker 1: them form their own opinion, that's where we've genuinely seen 215 00:10:57,000 --> 00:10:59,280 Speaker 1: the most growth of the network and of the community, 216 00:10:59,360 --> 00:11:02,400 Speaker 1: because it's to really the values that we care about. 217 00:11:02,720 --> 00:11:04,600 Speaker 1: We just want things to be normal. We just want 218 00:11:04,600 --> 00:11:08,079 Speaker 1: people to you know, live their lives and be happy 219 00:11:08,120 --> 00:11:10,320 Speaker 1: and be able to enjoy it without some you know, 220 00:11:10,480 --> 00:11:13,520 Speaker 1: lunatic screaming in their face every three seconds. 
And so 221 00:11:13,960 --> 00:11:17,640 Speaker 1: I think we've seen tremendous growth during the Biden administration, 222 00:11:17,960 --> 00:11:19,959 Speaker 1: and we continue to see growth right now as well. 223 00:11:20,080 --> 00:11:22,520 Speaker 1: But again, I think people are always coming to us 224 00:11:22,559 --> 00:11:24,640 Speaker 1: for the facts and for the data, and that's what 225 00:11:24,640 --> 00:11:25,359 Speaker 1: it's allowed. 226 00:11:25,160 --> 00:11:28,080 Speaker 2: To grow, I think, cause we've been there now for 227 00:11:28,160 --> 00:11:31,240 Speaker 2: about five years. A lot of people shut off after 228 00:11:31,280 --> 00:11:34,160 Speaker 2: the last election, and they were looking for fighters, and 229 00:11:34,200 --> 00:11:36,520 Speaker 2: they were looking for people to stand up to them. 230 00:11:36,840 --> 00:11:39,720 Speaker 2: And I hope we've always been authentic to those same 231 00:11:39,840 --> 00:11:42,360 Speaker 2: values when we just started this out in twenty twenty 232 00:11:42,760 --> 00:11:45,920 Speaker 2: as what we're doing now, so as we were out 233 00:11:45,920 --> 00:11:50,480 Speaker 2: there still fighting, and I think people a group of 234 00:11:50,520 --> 00:11:53,080 Speaker 2: people lost a lot of hope after this election, so 235 00:11:53,120 --> 00:11:55,400 Speaker 2: they turned and they saw what we were doing, and 236 00:11:55,440 --> 00:11:58,640 Speaker 2: I think that also helped give them hope, which allowed 237 00:11:58,720 --> 00:12:01,480 Speaker 2: a lot more people to maybe notice what we were 238 00:12:01,520 --> 00:12:05,160 Speaker 2: doing now. 
And also we were doing things on the 239 00:12:05,280 --> 00:12:08,280 Speaker 2: digital realm that a lot of people are adopting now 240 00:12:08,520 --> 00:12:14,000 Speaker 2: right in our YouTube strategy, our focus on simulcasting videos 241 00:12:14,040 --> 00:12:16,880 Speaker 2: and audio was just a different way to even kind 242 00:12:16,920 --> 00:12:20,920 Speaker 2: of approach this space, which also I think has been 243 00:12:20,960 --> 00:12:22,720 Speaker 2: a bit kind of pioneering, and we could talk about 244 00:12:22,760 --> 00:12:25,000 Speaker 2: that later, but that's part of I think where we're 245 00:12:25,040 --> 00:12:27,360 Speaker 2: seeing and we're humble and grateful for it. But some 246 00:12:27,440 --> 00:12:30,839 Speaker 2: of the recognition is, oh, lots of people didn't even 247 00:12:30,880 --> 00:12:33,240 Speaker 2: know what we were talking about, and we were saying 248 00:12:33,320 --> 00:12:37,920 Speaker 2: simulcast right a year ago or more. And now that's 249 00:12:38,000 --> 00:12:39,079 Speaker 2: the you know, the new lingo. 250 00:12:39,400 --> 00:12:43,400 Speaker 4: You know, if a Martian landed on Earth and you 251 00:12:43,679 --> 00:12:47,360 Speaker 4: had to explain to him or her what the Midas 252 00:12:47,480 --> 00:12:52,200 Speaker 4: Touch Network is in you know, an elevator pitch, how 253 00:12:52,200 --> 00:12:53,760 Speaker 4: would you do it? Who wants to take that? 254 00:12:53,880 --> 00:12:55,920 Speaker 2: I mean, I mean, I think it's a podcast that's 255 00:12:55,960 --> 00:12:58,920 Speaker 2: fueled by a community, and the community. 256 00:12:58,480 --> 00:13:00,960 Speaker 4: It's not just a pot well podcast, I guess, as 257 00:13:01,000 --> 00:13:04,440 Speaker 4: we know are video and audio, but it's sort of 258 00:13:04,520 --> 00:13:07,400 Speaker 4: you're doing everything everywhere all at once. 
Right but sorry 259 00:13:07,960 --> 00:13:09,800 Speaker 4: the Martian interrupted, you go on. 260 00:13:10,559 --> 00:13:14,000 Speaker 2: You know it's it's it's the community is built on 261 00:13:15,200 --> 00:13:19,319 Speaker 2: you know, I think, core values and principles and that's 262 00:13:19,400 --> 00:13:23,880 Speaker 2: what's the success. And what we're trying to readjust everybody 263 00:13:23,920 --> 00:13:28,280 Speaker 2: to remember is that politicans, the Martians going to I 264 00:13:28,360 --> 00:13:31,480 Speaker 2: may have lost the Martian elevator speech, but I'll give 265 00:13:31,520 --> 00:13:34,360 Speaker 2: you my academic speech for the market. I'm assuming the 266 00:13:34,360 --> 00:13:44,439 Speaker 2: Martians mar well leg well educated mar mars exactly. But 267 00:13:44,600 --> 00:13:48,640 Speaker 2: but I think politics has been gamified into your this team, 268 00:13:48,720 --> 00:13:52,640 Speaker 2: you're that team. You're angry, you're angry, yell at each other, 269 00:13:53,400 --> 00:13:56,160 Speaker 2: and the viewers left a little bit confused, Wait, what's 270 00:13:56,280 --> 00:13:58,880 Speaker 2: what's even the issue? And they try to pick what 271 00:13:59,040 --> 00:14:01,679 Speaker 2: team do I like based on the style of the 272 00:14:01,760 --> 00:14:05,200 Speaker 2: fight and the style of the debate and who's winning 273 00:14:05,280 --> 00:14:08,840 Speaker 2: that fight. And that's the split screen cable news yell, yell, yale. 274 00:14:09,200 --> 00:14:11,920 Speaker 2: So we try to deconstruct that, and we come with 275 00:14:12,000 --> 00:14:14,839 Speaker 2: an opinion and a strong opinion, but we try to 276 00:14:14,960 --> 00:14:18,800 Speaker 2: root it in a trial lawyer approach. Here's the evidence, 277 00:14:18,840 --> 00:14:21,480 Speaker 2: here's the exhibit, here's why we feel this way. 
Agree 278 00:14:21,520 --> 00:14:23,960 Speaker 2: with us, don't agree with us, but here's here's what 279 00:14:24,040 --> 00:14:26,640 Speaker 2: we're grounded in. And so rather than have a four 280 00:14:26,720 --> 00:14:31,040 Speaker 2: minute segment of yelling with thirty seconds of the actual news, 281 00:14:31,640 --> 00:14:34,000 Speaker 2: we try to root it in eleven to thirteen minutes 282 00:14:34,080 --> 00:14:35,960 Speaker 2: of the facts with opinion. 283 00:14:36,080 --> 00:14:37,840 Speaker 1: But it's the facts that drive it. 284 00:14:38,280 --> 00:14:40,720 Speaker 4: What I would say to the Martian, I would say that, 285 00:14:41,040 --> 00:14:43,200 Speaker 4: but Also, one of the things I think you all 286 00:14:43,280 --> 00:14:48,200 Speaker 4: do so effectively is connect the dots in this crazy, 287 00:14:48,440 --> 00:14:53,280 Speaker 4: fragmented media landscape, you know, with so much happening, particularly 288 00:14:53,360 --> 00:14:56,560 Speaker 4: in the early days of the second Trump administration. I 289 00:14:56,640 --> 00:15:00,720 Speaker 4: think what you do so artfully is is you kind 290 00:15:00,760 --> 00:15:04,600 Speaker 4: of connect, you know, you thread all these things. If 291 00:15:04,840 --> 00:15:09,400 Speaker 4: an expert saying this, and then Caroline Levitt is saying that, 292 00:15:09,920 --> 00:15:12,640 Speaker 4: and you know, you're taking all these different things and 293 00:15:12,760 --> 00:15:17,960 Speaker 4: you're kind of helping people understand and aggregate what all 294 00:15:18,080 --> 00:15:20,160 Speaker 4: these these moments mean. 295 00:15:20,400 --> 00:15:24,720 Speaker 1: Is that that's why you're Katie Kirk no part of 296 00:15:24,800 --> 00:15:25,960 Speaker 1: your yes. 297 00:15:26,120 --> 00:15:28,040 Speaker 3: You know. It's funny because we were speaking about this. 
298 00:15:28,120 --> 00:15:31,440 Speaker 3: I think, like just yesterday that there's so much noise 299 00:15:31,760 --> 00:15:34,920 Speaker 3: out there, and you I think we all thought, you know, 300 00:15:35,040 --> 00:15:37,520 Speaker 3: if we went back I don't know even ten years ago, 301 00:15:37,760 --> 00:15:40,920 Speaker 3: but probably more than that, probably thought, oh, with the Internet, 302 00:15:41,040 --> 00:15:42,760 Speaker 3: with digital media, with also there's going to be so 303 00:15:42,880 --> 00:15:45,440 Speaker 3: much information. People are going to be so well informed, 304 00:15:45,720 --> 00:15:48,000 Speaker 3: people are going to be smarter than ever, They're going 305 00:15:48,080 --> 00:15:50,000 Speaker 3: to be on top of everything. It's going to be 306 00:15:50,120 --> 00:15:52,920 Speaker 3: so clear to know what's true and what's fiction. It's 307 00:15:53,000 --> 00:15:55,520 Speaker 3: clearly like the opposite at this point of what we're seeing. 308 00:15:55,520 --> 00:15:59,320 Speaker 3: There's such a deluge of information coming at people every 309 00:15:59,400 --> 00:16:01,320 Speaker 3: single day. So I think one of the things that 310 00:16:01,400 --> 00:16:03,240 Speaker 3: we do try to do is what you just hit at, 311 00:16:03,480 --> 00:16:05,840 Speaker 3: which is how do we cut through that noise with 312 00:16:06,000 --> 00:16:09,240 Speaker 3: a message that's clear, that could resonate, that gets the 313 00:16:09,360 --> 00:16:12,040 Speaker 3: truth across in the midst of just all of this 314 00:16:12,200 --> 00:16:15,560 Speaker 3: chaos that's coming every single day. Certainly that's been ramped 315 00:16:15,600 --> 00:16:18,440 Speaker 3: up since Trump's second administration, where I think part of 316 00:16:18,480 --> 00:16:21,240 Speaker 3: their strategy is also to flood the zone. 
I mean, 317 00:16:21,640 --> 00:16:23,800 Speaker 3: you think back to what's yeah, you think back to 318 00:16:23,880 --> 00:16:26,240 Speaker 3: what Bannon said in twenty sixteen or whenever he made 319 00:16:26,280 --> 00:16:29,000 Speaker 3: that statement, flood the zone with shit. I don't know 320 00:16:29,000 --> 00:16:29,520 Speaker 3: if I'm lent. 321 00:16:29,480 --> 00:16:30,200 Speaker 2: To say that. 322 00:16:31,920 --> 00:16:36,360 Speaker 3: Weak here, and and that's definitely been their strategy this 323 00:16:36,480 --> 00:16:39,400 Speaker 3: time around as well. Just do so much that it's 324 00:16:39,720 --> 00:16:42,440 Speaker 3: almost impossible for you know, an average person to try 325 00:16:42,480 --> 00:16:44,240 Speaker 3: to keep up with it. But we that's one of 326 00:16:44,280 --> 00:16:45,800 Speaker 3: the things we try to do is really center the 327 00:16:45,880 --> 00:16:48,880 Speaker 3: conversation on you know, these are the distractions, these are 328 00:16:48,920 --> 00:16:50,640 Speaker 3: the important issues, These are the things that are actually 329 00:16:50,640 --> 00:16:53,640 Speaker 3: going to affect you know, you your family, you know, 330 00:16:53,760 --> 00:16:55,119 Speaker 3: your friends, your coworkers. 331 00:16:55,200 --> 00:16:57,840 Speaker 4: These are these things you really need to understand. Yeah, 332 00:16:57,960 --> 00:17:00,240 Speaker 4: for sure, and I think it You're right. I mean, 333 00:17:00,320 --> 00:17:04,480 Speaker 4: a combination of things is making it so difficult to 334 00:17:04,600 --> 00:17:08,480 Speaker 4: stay informed. And a story that happened four days ago, 335 00:17:09,080 --> 00:17:11,639 Speaker 4: the country has moved on and it's like, hey, whatever 336 00:17:11,840 --> 00:17:15,399 Speaker 4: happened to the Associated Press getting kicked out of the 337 00:17:15,440 --> 00:17:18,640 Speaker 4: White House press room? 
And you know what about all 338 00:17:18,720 --> 00:17:22,840 Speaker 4: these right wing media influencers who are now taking over 339 00:17:23,040 --> 00:17:26,520 Speaker 4: seats that once belonged to legacy media? Would you describe 340 00:17:26,560 --> 00:17:32,840 Speaker 4: yourselves as journalists, curators, producers, or do you think there 341 00:17:33,000 --> 00:17:36,880 Speaker 4: isn't really a category for what you do? Jordy, you're smiling, 342 00:17:37,000 --> 00:17:37,800 Speaker 4: Go ahead, you. 343 00:17:37,840 --> 00:17:41,120 Speaker 1: know, I've never thought of it like that. I view 344 00:17:41,119 --> 00:17:45,000 Speaker 1: us as concerned citizens first and foremost, we just. 345 00:17:45,000 --> 00:17:46,840 Speaker 4: Want to. They're building a media empire. 346 00:17:47,560 --> 00:17:49,399 Speaker 1: Well, the funny thing about that is, Jordy, I 347 00:17:49,440 --> 00:17:53,240 Speaker 1: don't mean to, no, go for it. When we do interviews. 348 00:17:53,440 --> 00:17:56,000 Speaker 2: We don't do a lot of interviews about ourselves, so 349 00:17:56,160 --> 00:17:59,840 Speaker 2: we're actually not that, and I mean this sincerely, reflective 350 00:18:00,080 --> 00:18:04,360 Speaker 2: on, like, who we are, because we center the network 351 00:18:05,040 --> 00:18:08,520 Speaker 2: around the community and around the message to the people, 352 00:18:08,960 --> 00:18:11,840 Speaker 2: so it is always, you know, interesting sometimes, you know, 353 00:18:12,080 --> 00:18:13,960 Speaker 2: and lately we don't do a lot of interviews, but 354 00:18:14,040 --> 00:18:17,920 Speaker 2: sometimes when we're reflecting on, like, who we are and 355 00:18:18,040 --> 00:18:18,760 Speaker 2: what does it mean?
356 00:18:19,040 --> 00:18:21,960 Speaker 4: But maybe just think about instead of you guys like 357 00:18:22,119 --> 00:18:26,560 Speaker 4: as brothers, sort of who is the Midas Network, right, 358 00:18:26,920 --> 00:18:29,200 Speaker 4: you know, And I guess it's a lot of things, right, 359 00:18:29,520 --> 00:18:29,960 Speaker 4: and it's. 360 00:18:29,840 --> 00:18:34,119 Speaker 2: Most importantly it's been built on the community of people 361 00:18:34,200 --> 00:18:37,399 Speaker 2: in America and frankly people around the world who are 362 00:18:37,560 --> 00:18:40,720 Speaker 2: concerned about the human condition and experience. That's how I 363 00:18:40,720 --> 00:18:45,000 Speaker 2: would say it, very you know, very succinctly. But I 364 00:18:45,119 --> 00:18:48,000 Speaker 2: think we are a media network. We run a media 365 00:18:48,080 --> 00:18:51,000 Speaker 2: network filled with some of the best reporters out there. 366 00:18:51,520 --> 00:18:55,320 Speaker 2: We have a great editorial team that breaks stories. You know, 367 00:18:55,440 --> 00:18:57,720 Speaker 2: I consider what I do on a day to day basis. 368 00:18:58,240 --> 00:19:01,359 Speaker 2: I'm reporting on the news, we break exclusives, we report 369 00:19:01,440 --> 00:19:04,520 Speaker 2: on news that happens, and you know, we try to 370 00:19:04,680 --> 00:19:06,440 Speaker 2: contextualize everything. 371 00:19:06,200 --> 00:19:07,439 Speaker 1: Right as best we can. That's all. 372 00:19:07,560 --> 00:19:07,680 Speaker 3: You know. 373 00:19:07,760 --> 00:19:12,359 Speaker 4: I'm wondering, could you do what you do without legacy media? 374 00:19:12,600 --> 00:19:14,200 Speaker 4: In other words, I know you use a lot of 375 00:19:14,240 --> 00:19:19,320 Speaker 4: clips from MSNBC or Fox or CNN, you use reporting, 376 00:19:19,960 --> 00:19:22,680 Speaker 4: and then I think you make it understandable. 
As I said, 377 00:19:22,760 --> 00:19:25,360 Speaker 4: you cull from a lot of different sources and kind 378 00:19:25,359 --> 00:19:27,720 Speaker 4: of give people the big picture, at least in a 379 00:19:27,800 --> 00:19:30,760 Speaker 4: lot of the content I'm looking at. But the big 380 00:19:30,920 --> 00:19:34,879 Speaker 4: beef, while legacy media gets trashed and everything, is that a 381 00:19:34,960 --> 00:19:39,000 Speaker 4: lot of these content creators and even digital media networks 382 00:19:39,359 --> 00:19:42,639 Speaker 4: could not do what they do without, you know, with 383 00:19:42,800 --> 00:19:45,560 Speaker 4: all due respect to the reporters you have, 384 00:19:45,720 --> 00:19:51,360 Speaker 4: this vast array of legacy media organizations that are covering 385 00:19:51,480 --> 00:19:55,520 Speaker 4: the world, who are interviewing Scott Galloway on CNN or 386 00:19:55,720 --> 00:19:59,399 Speaker 4: doing something else with some other expert from whom you might 387 00:19:59,520 --> 00:20:02,760 Speaker 4: run a clip. So I'm just curious how you feel 388 00:20:02,800 --> 00:20:06,000 Speaker 4: about that. If legacy media poof went away, I don't 389 00:20:06,000 --> 00:20:07,600 Speaker 4: think you could do what you're doing. 390 00:20:07,920 --> 00:20:10,000 Speaker 2: I don't know. I mean, I think 391 00:20:10,080 --> 00:20:13,879 Speaker 2: that a lot of the things that we find are 392 00:20:14,240 --> 00:20:15,280 Speaker 2: on social media. 393 00:20:15,440 --> 00:20:18,240 Speaker 4: It's what, right, or hearings on Capitol Hill. I was 394 00:20:18,359 --> 00:20:18,960 Speaker 4: thinking about that. 395 00:20:19,119 --> 00:20:24,080 Speaker 2: No, so I would say there's maybe five to seven 396 00:20:24,200 --> 00:20:27,399 Speaker 2: percent of what we do is maybe commenting on an 397 00:20:27,440 --> 00:20:31,520 Speaker 2: interview that's on a cable news legacy news network.
But 398 00:20:31,600 --> 00:20:35,000 Speaker 2: I don't think that drives the network even a small fraction. 399 00:20:35,160 --> 00:20:37,800 Speaker 2: I mean, the majority of it is people are out 400 00:20:37,840 --> 00:20:42,480 Speaker 2: there saying things either on social media on their own TikToks, 401 00:20:42,640 --> 00:20:47,159 Speaker 2: on their own Instagrams, on their own X accounts, or whatever, or. 402 00:20:47,320 --> 00:20:50,520 Speaker 4: People are saying things on Capitol Hill, like Marjorie Taylor 403 00:20:50,600 --> 00:20:54,280 Speaker 4: Greene claiming that guy was flipping the bird. Did you? 404 00:20:54,760 --> 00:20:56,080 Speaker 4: You guys, I'm sure, did a story. 405 00:20:56,200 --> 00:20:59,360 Speaker 3: Yeah, oh yeah, we did that. The deceptive photo, where 406 00:20:59,560 --> 00:21:03,119 Speaker 3: Representative Stansbury ended up showing the actual photo, and then 407 00:21:04,080 --> 00:21:05,280 Speaker 3: who was holding it up. It 408 00:21:05,280 --> 00:21:11,520 Speaker 2: was Marjorie Taylor Greene. She had the big poster board of 409 00:21:11,640 --> 00:21:14,800 Speaker 2: the head of USA Fencing. Why they brought in USA 410 00:21:14,960 --> 00:21:18,520 Speaker 2: Fencing, even though they received no federal funding, is because 411 00:21:18,560 --> 00:21:20,399 Speaker 2: there was a Fox story saying that there was a 412 00:21:20,520 --> 00:21:24,919 Speaker 2: transgender fencer. So the whole conspiracy is that they 413 00:21:25,000 --> 00:21:28,280 Speaker 2: wanted to punch down on a marginalized community using the 414 00:21:28,359 --> 00:21:31,160 Speaker 2: fencing guy.
So they showed a photo of the head 415 00:21:31,160 --> 00:21:34,800 Speaker 2: of USA Fencing in a DOGE committee, and he had 416 00:21:34,840 --> 00:21:37,000 Speaker 2: a peace sign, or he had two fingers up, and 417 00:21:37,080 --> 00:21:38,840 Speaker 2: they edited it to look like he was giving the 418 00:21:38,920 --> 00:21:41,280 Speaker 2: camera the middle finger, and they said, look, he's giving 419 00:21:41,320 --> 00:21:45,320 Speaker 2: you all the middle finger. And then Congresswoman Stansbury said, no, 420 00:21:45,480 --> 00:21:47,399 Speaker 2: let me show you what the photo is. You know, 421 00:21:47,560 --> 00:21:49,920 Speaker 2: it's, it's actually a peace sign. 422 00:21:50,480 --> 00:21:55,320 Speaker 4: That was so crazy, remember, and Marjorie Taylor Greene just hitting. 423 00:21:55,119 --> 00:21:56,160 Speaker 2: The gavel. See. 424 00:21:56,200 --> 00:21:58,320 Speaker 3: But that's the kind of thing, like, to me, I 425 00:21:58,359 --> 00:22:00,720 Speaker 3: don't care what your politics are, like, that's not okay, 426 00:22:01,280 --> 00:22:03,240 Speaker 3: you know, and we should all be able to acknowledge 427 00:22:03,240 --> 00:22:04,760 Speaker 3: that that's not okay at the end of the day, 428 00:22:04,800 --> 00:22:07,200 Speaker 3: to do that kind of deception and to try to 429 00:22:07,280 --> 00:22:09,639 Speaker 3: punch down at people like that. And so I think 430 00:22:09,680 --> 00:22:11,359 Speaker 3: that's at the heart of, you know, a lot of 431 00:22:11,760 --> 00:22:13,840 Speaker 3: what we're trying to get across here. But you know, 432 00:22:13,960 --> 00:22:15,800 Speaker 3: I think one of the other great examples too. And 433 00:22:16,000 --> 00:22:18,159 Speaker 3: you know, I think legacy media certainly plays, you know, 434 00:22:18,200 --> 00:22:20,000 Speaker 3: a tremendous role.
And I think there's some, you know, 435 00:22:20,160 --> 00:22:22,639 Speaker 3: excellent reporters out there who do a really, you know, 436 00:22:22,800 --> 00:22:24,920 Speaker 3: terrific job, and, you know, I commend their work, 437 00:22:25,040 --> 00:22:27,240 Speaker 3: you know, especially the reporters who are out there in 438 00:22:27,359 --> 00:22:29,920 Speaker 3: these war zones and reporting on these international stories. You know, 439 00:22:30,000 --> 00:22:32,440 Speaker 3: I have so much respect for the stuff they do. 440 00:22:32,600 --> 00:22:35,159 Speaker 3: But I will say also, you know, a lot 441 00:22:35,240 --> 00:22:37,479 Speaker 3: of the bread and butter of what we've done, especially 442 00:22:37,480 --> 00:22:39,560 Speaker 3: throughout the past few years, also is looking at, like, 443 00:22:39,640 --> 00:22:42,800 Speaker 3: the firsthand documentation ourselves. And so I think one 444 00:22:42,840 --> 00:22:44,760 Speaker 3: of the great examples, and one of the reasons 445 00:22:44,800 --> 00:22:46,600 Speaker 3: I think Ben is so good at what he does, 446 00:22:47,040 --> 00:22:49,479 Speaker 3: is, with his, you know, background as an attorney.
One 447 00:22:49,520 --> 00:22:51,280 Speaker 3: of the things that we do is we'll go through 448 00:22:51,400 --> 00:22:54,800 Speaker 3: court filings, for example, and so during all the various 449 00:22:54,880 --> 00:22:57,040 Speaker 3: court cases over the past few years, and even everything 450 00:22:57,080 --> 00:22:59,000 Speaker 3: that's happening in the courts right now, because you know, 451 00:22:59,040 --> 00:23:01,600 Speaker 3: there's a whole lot of legal stuff happening right now 452 00:23:01,680 --> 00:23:06,040 Speaker 3: in the various courts. Oh yeah, we started a whole 453 00:23:06,119 --> 00:23:08,960 Speaker 3: new channel, the Legal AF channel, because we had 454 00:23:09,000 --> 00:23:11,280 Speaker 3: so many legal stories, right, and so we have 455 00:23:11,320 --> 00:23:14,080 Speaker 3: a whole channel now specifically dedicated to the legal stories. But 456 00:23:14,200 --> 00:23:16,920 Speaker 3: on those, the attorneys are going through these source documents, 457 00:23:17,040 --> 00:23:19,840 Speaker 3: the court filings. You know, this judge said this, their 458 00:23:19,960 --> 00:23:22,800 Speaker 3: argument is this, they're analyzing it. And so I think 459 00:23:22,880 --> 00:23:24,199 Speaker 3: that is at the heart of it. And I think 460 00:23:24,240 --> 00:23:26,200 Speaker 3: one of the things our format allows us to do 461 00:23:26,880 --> 00:23:30,000 Speaker 3: that may not be as conducive to a twenty four 462 00:23:30,000 --> 00:23:32,760 Speaker 3: hour news format or a kind of a TV news 463 00:23:32,840 --> 00:23:34,639 Speaker 3: format is we could go there and we could sit 464 00:23:34,720 --> 00:23:36,159 Speaker 3: there and we could say, pull up the filing, and 465 00:23:36,200 --> 00:23:38,680 Speaker 3: we could read an eight-page filing, and our audience 466 00:23:38,840 --> 00:23:42,080 Speaker 3: likes that.
It probably doesn't make good TV, yeah, but 467 00:23:42,200 --> 00:23:44,360 Speaker 3: our audience likes us going through, okay, now let's look 468 00:23:44,359 --> 00:23:46,879 Speaker 3: at the footnote on page three, and let's go to 469 00:23:46,960 --> 00:23:50,720 Speaker 3: the paragraph on page and they'd likely turn the channel 470 00:23:50,760 --> 00:23:53,359 Speaker 3: at that point. But our audience, you know, likes that 471 00:23:53,440 --> 00:23:55,680 Speaker 3: because they like seeing what's actually being said. And I 472 00:23:55,720 --> 00:23:58,520 Speaker 3: think sometimes it's the most revealing, rather than, you know, 473 00:23:59,040 --> 00:24:01,479 Speaker 3: you hearing what I have to say about this topic. 474 00:24:01,800 --> 00:24:03,760 Speaker 3: Let me tell you what the judge actually said here, 475 00:24:03,880 --> 00:24:06,280 Speaker 3: let me tell you what the Trump Administration's 476 00:24:06,359 --> 00:24:09,159 Speaker 3: argument actually is here, and then you be the judge 477 00:24:09,200 --> 00:24:21,919 Speaker 3: of what the truth is. Hi. 478 00:24:22,040 --> 00:24:24,800 Speaker 4: Everyone, it's Katie Couric. You know I'm always on the 479 00:24:24,880 --> 00:24:29,159 Speaker 4: go between running my media company, hosting my podcast, and 480 00:24:29,280 --> 00:24:32,040 Speaker 4: of course covering the news. And I know that to 481 00:24:32,160 --> 00:24:34,840 Speaker 4: keep doing what I love, I need to start caring 482 00:24:34,960 --> 00:24:38,480 Speaker 4: for what gets me there, my feet. That's why I 483 00:24:38,680 --> 00:24:43,000 Speaker 4: decided to try the Good Feet Store's personalized arch support system.
484 00:24:43,480 --> 00:24:46,359 Speaker 4: I met with a Good Feet arch support specialist, and 485 00:24:46,520 --> 00:24:49,480 Speaker 4: after a personalized fitting, I left the store with my 486 00:24:49,680 --> 00:24:53,880 Speaker 4: three-step system designed to improve comfort, balance, and support. 487 00:24:54,480 --> 00:24:58,400 Speaker 4: My feet, knees, and back are thanking me already. Visit 488 00:24:58,480 --> 00:25:02,200 Speaker 4: goodfeet dot com to learn more, find the nearest store, or 489 00:25:02,359 --> 00:25:15,000 Speaker 4: book your own free personalized fitting. So fair enough, you 490 00:25:15,080 --> 00:25:18,879 Speaker 4: guys do a ton of original reporting, and not just 491 00:25:19,200 --> 00:25:21,720 Speaker 4: on clips that you see on cable news. In fact, 492 00:25:21,760 --> 00:25:23,560 Speaker 4: I thought it was funny. I know that you hired 493 00:25:23,600 --> 00:25:27,040 Speaker 4: this guy and he is sort of like, how can 494 00:25:27,119 --> 00:25:29,920 Speaker 4: I explain him? He's sort of like the, yes, 495 00:25:30,359 --> 00:25:33,800 Speaker 4: the truffle-sniffing dog who can figure out, like, what 496 00:25:34,000 --> 00:25:35,720 Speaker 4: is going to be the big story, and kind of 497 00:25:35,840 --> 00:25:38,480 Speaker 4: does clips. And I read one of you said something 498 00:25:38,640 --> 00:25:42,760 Speaker 4: like he's so undervalued; he's the best cable news producer in the 499 00:25:42,880 --> 00:25:45,600 Speaker 4: business, because in a way, you all are setting the 500 00:25:45,720 --> 00:25:50,000 Speaker 4: agenda for at least more progressive cable news networks, and 501 00:25:50,040 --> 00:25:53,920 Speaker 4: I would imagine even more conservative ones that are often 502 00:25:54,080 --> 00:25:57,359 Speaker 4: reacting to more progressive ones. Right.
503 00:25:57,520 --> 00:25:59,160 Speaker 3: Well, one of the things that we notice, and it's 504 00:25:59,520 --> 00:26:02,119 Speaker 3: a funny thing to see, is that, you know, Acyn, 505 00:26:02,160 --> 00:26:04,160 Speaker 3: who's one of the most brilliant people on the planet. 506 00:26:04,400 --> 00:26:05,200 Speaker 4: I might have to hire him. 507 00:26:06,680 --> 00:26:08,639 Speaker 2: He truly is a savant when you see what he 508 00:26:08,800 --> 00:26:11,560 Speaker 2: does with, like, the eight screens, and he sits there 509 00:26:11,720 --> 00:26:14,920 Speaker 2: for like eighteen hours a day, like, it's real. 510 00:26:15,080 --> 00:26:16,800 Speaker 3: And let me just say, he's ours. You can't have him. 511 00:26:18,000 --> 00:26:20,600 Speaker 3: But he goes, he goes through, you know, the clips 512 00:26:20,720 --> 00:26:24,040 Speaker 3: all day long, and, you know, he's got an eye 513 00:26:24,119 --> 00:26:25,840 Speaker 3: for it, and I think it's something you do need 514 00:26:25,880 --> 00:26:28,640 Speaker 3: an eye for, right, what out of all the things 515 00:26:28,720 --> 00:26:32,000 Speaker 3: that are said during these congressional hearings, for example, or 516 00:26:32,080 --> 00:26:33,800 Speaker 3: on the floor of the House, whatever it is on 517 00:26:33,960 --> 00:26:37,199 Speaker 3: cable news, there's a lot of stuff that isn't newsworthy, 518 00:26:37,240 --> 00:26:41,200 Speaker 3: that isn't exciting, but for you to be able to capture, contextualize, 519 00:26:41,240 --> 00:26:43,800 Speaker 3: and ninety nine percent of the time he's literally just 520 00:26:43,800 --> 00:26:45,920 Speaker 3: putting the captions of what people are saying when he 521 00:26:46,000 --> 00:26:48,280 Speaker 3: puts out these clips. But to pick those moments that 522 00:26:48,359 --> 00:26:50,480 Speaker 3: are the moments that, you know, he knows will be 523 00:26:50,520 --> 00:26:52,800 Speaker 3: important for people to see.
I think that's a real 524 00:26:52,920 --> 00:26:55,199 Speaker 3: skill and a real talent. And one of the interesting 525 00:26:55,280 --> 00:26:56,920 Speaker 3: things that we see is he'll do that throughout the 526 00:26:57,040 --> 00:27:01,360 Speaker 3: day and then coincidentally, you know on the Shepherd. 527 00:27:01,760 --> 00:27:04,800 Speaker 4: Talking about yeah, you go another lawyer. 528 00:27:04,720 --> 00:27:08,760 Speaker 3: right, whether it's MSNBC or Fox, you know, they're playing 529 00:27:08,840 --> 00:27:11,600 Speaker 3: those same clips. They have a different perspective on the clips. 530 00:27:11,720 --> 00:27:14,359 Speaker 3: But they're playing those same clips that Acyn pulled 531 00:27:14,400 --> 00:27:16,919 Speaker 3: earlier in the day. I don't think that's any accident. 532 00:27:17,119 --> 00:27:19,639 Speaker 3: I think, you know, that's just how people are consuming media, 533 00:27:19,920 --> 00:27:22,399 Speaker 3: right, and it's not just us; the cable 534 00:27:22,440 --> 00:27:25,439 Speaker 3: news producers as well are seeing what happens in Congress. 535 00:27:25,880 --> 00:27:28,600 Speaker 3: Not because, you know, these hosts are sitting there watching, 536 00:27:28,960 --> 00:27:31,040 Speaker 3: you know, the hearings all day, but because they 537 00:27:31,119 --> 00:27:34,600 Speaker 3: see what people are speaking about online, and oftentimes that originates. 538 00:27:34,119 --> 00:27:34,800 Speaker 1: With our team. 539 00:27:35,000 --> 00:27:37,399 Speaker 4: So interesting. You talk about your values. You know, you 540 00:27:37,680 --> 00:27:40,800 Speaker 4: describe the trans community as a marginalized community. I couldn't 541 00:27:40,800 --> 00:27:44,160 Speaker 4: agree with you more. I think our values are probably aligned. 542 00:27:44,800 --> 00:27:49,080 Speaker 4: But clearly you guys are left of center. Is that fair?
543 00:27:49,920 --> 00:27:52,680 Speaker 3: Yeah, you know, it's funny. I just, I think that's 544 00:27:52,680 --> 00:27:55,639 Speaker 3: certainly fair, you know, in the kind of traditional political space. 545 00:27:55,640 --> 00:27:57,760 Speaker 4: I mean, if you're thinking about issues that you all 546 00:27:57,840 --> 00:28:01,560 Speaker 4: care about, believe in, maybe describe some of those issues, 547 00:28:01,680 --> 00:28:02,720 Speaker 4: like and we all. 548 00:28:02,720 --> 00:28:04,280 Speaker 3: Grew up, you know, not that we grew up, you know, 549 00:28:04,320 --> 00:28:07,280 Speaker 3: we didn't grow up political, but we're all, you know, Democrats, 550 00:28:07,320 --> 00:28:09,119 Speaker 3: and we all have, you know, had, you know, beliefs 551 00:28:09,160 --> 00:28:11,000 Speaker 3: that go in that direction, but we really don't 552 00:28:11,080 --> 00:28:14,920 Speaker 3: view everything, like, through necessarily a political lens. Like, in 553 00:28:15,080 --> 00:28:18,920 Speaker 3: that way, we don't necessarily view ourselves as, like, like, 554 00:28:18,960 --> 00:28:20,800 Speaker 3: I don't care what the Democratic Party does, you know, 555 00:28:20,840 --> 00:28:22,959 Speaker 3: I don't care about their internal squabbling. I don't care, 556 00:28:23,160 --> 00:28:25,320 Speaker 3: you know. Like, to me, we're just doing our thing, 557 00:28:25,560 --> 00:28:28,320 Speaker 3: and if we have our values and if people want 558 00:28:28,359 --> 00:28:30,479 Speaker 3: to meet us at our values, that's kind of all 559 00:28:30,600 --> 00:28:31,240 Speaker 3: we care about. 560 00:28:31,359 --> 00:28:32,399 Speaker 4: So what are your values? 561 00:28:32,520 --> 00:28:36,120 Speaker 2: Look, I think people, for a hard day's 562 00:28:36,200 --> 00:28:38,920 Speaker 2: work or just working, should get paid with dignity.
563 00:28:39,360 --> 00:28:41,720 Speaker 2: I think Americans should be able to 564 00:28:41,840 --> 00:28:45,400 Speaker 2: afford homes. I don't believe Americans should be living paycheck 565 00:28:45,480 --> 00:28:47,920 Speaker 2: to paycheck if they're working. I think that's a form 566 00:28:47,960 --> 00:28:51,560 Speaker 2: of psychological torture. I believe that Americans should all have 567 00:28:51,720 --> 00:28:55,040 Speaker 2: access to healthcare. It should be very affordable or free 568 00:28:55,120 --> 00:28:58,160 Speaker 2: for Americans. I believe Americans should have access to education, 569 00:28:58,720 --> 00:29:01,360 Speaker 2: which should be affordable or free to Americans. 570 00:29:01,960 --> 00:29:03,160 Speaker 1: I believe in equality. 571 00:29:03,320 --> 00:29:06,880 Speaker 2: I don't believe that marginalized communities should have to wake 572 00:29:07,000 --> 00:29:09,160 Speaker 2: up and be scared of how they're going to be treated 573 00:29:09,200 --> 00:29:11,960 Speaker 2: in a given day. I don't believe that people should 574 00:29:12,000 --> 00:29:16,280 Speaker 2: be bullying marginalized communities. I believe diversity and equality is 575 00:29:16,400 --> 00:29:18,800 Speaker 2: a strength, and equity is a strength, and I don't 576 00:29:18,800 --> 00:29:22,000 Speaker 2: think that the lesson from this past election should be 577 00:29:22,760 --> 00:29:25,680 Speaker 2: running away from that. It should be standing up for 578 00:29:25,840 --> 00:29:28,120 Speaker 2: it and being strong and standing up for it and 579 00:29:28,200 --> 00:29:32,240 Speaker 2: showing actual convictions.
I think a lot of marginalized communities 580 00:29:32,320 --> 00:29:35,160 Speaker 2: see fair-weather supporters who are with them when it 581 00:29:35,240 --> 00:29:38,760 Speaker 2: seems cool and then abandon them when the seasons change, 582 00:29:39,240 --> 00:29:41,240 Speaker 2: and I think they see that a lot in how 583 00:29:41,280 --> 00:29:45,240 Speaker 2: the media portrays them, how corporations portray them. Oh, everyone's 584 00:29:45,320 --> 00:29:48,040 Speaker 2: with DEI now, and everyone's not with DEI now. 585 00:29:48,080 --> 00:29:48,960 Speaker 1: I think that's bullshit. 586 00:29:49,320 --> 00:29:51,040 Speaker 2: I think you've got to stand with people and be 587 00:29:51,160 --> 00:29:54,880 Speaker 2: who you are, and that ultimately is the right thing 588 00:29:55,000 --> 00:29:58,520 Speaker 2: to do. But I think that's also what ultimately will 589 00:29:58,560 --> 00:30:02,600 Speaker 2: win elections too, because if you really care, if you're 590 00:30:02,720 --> 00:30:06,240 Speaker 2: really fighting for people, they'll recognize that. And I think 591 00:30:06,280 --> 00:30:08,240 Speaker 2: when they see that you're wavering, they don't even know 592 00:30:08,320 --> 00:30:10,200 Speaker 2: if you're a real ally or not an ally. 593 00:30:10,560 --> 00:30:14,280 Speaker 4: And I imagine working with and representing Colin Kaepernick was 594 00:30:14,760 --> 00:30:19,320 Speaker 4: very transformative. Maybe not transformative, maybe that's not the right word. 595 00:30:19,480 --> 00:30:20,280 Speaker 2: For sure, it was. 596 00:30:20,520 --> 00:30:23,400 Speaker 4: It was in terms of how you view things like 597 00:30:23,520 --> 00:30:27,480 Speaker 4: Black Lives Matter and some of the things that Colin did, 598 00:30:27,800 --> 00:30:31,680 Speaker 4: and as we were talking about racial equality and DEI 599 00:30:32,080 --> 00:30:37,320 Speaker 4: issues and gosh, we've seen so much of that progress reversed.
600 00:30:37,720 --> 00:30:39,720 Speaker 4: How do you feel about reproductive rights? 601 00:30:40,320 --> 00:30:43,240 Speaker 2: Well, first the Colin, oh yeah, yeah. On the 602 00:30:43,320 --> 00:30:47,240 Speaker 2: Colin side, nobody wanted to work with us at first, 603 00:30:47,320 --> 00:30:50,479 Speaker 2: when I started representing Colin. Remember that Colin was one 604 00:30:50,520 --> 00:30:53,960 Speaker 2: of the first people, if not the first person, Trump targeted. 605 00:30:54,320 --> 00:30:57,720 Speaker 2: Get that SOB off the field. Colin lost his job. 606 00:30:57,880 --> 00:30:59,680 Speaker 2: No one would do deals with us, no one did 607 00:30:59,720 --> 00:31:01,840 Speaker 2: business with us, and Colin and I were working together. 608 00:31:01,920 --> 00:31:04,760 Speaker 2: Then, all of a sudden, everybody wanted to work 609 00:31:04,800 --> 00:31:10,000 Speaker 2: with us because Colin fought. People perceived what everything was. 610 00:31:10,040 --> 00:31:12,880 Speaker 2: As Colin won, as he stood up, everybody was with us. 611 00:31:13,640 --> 00:31:17,400 Speaker 2: And then people leave, and then they're no longer, oh, 612 00:31:17,480 --> 00:31:21,880 Speaker 2: we can't do, you know, all these corporations, all these groups, everybody. 613 00:31:21,760 --> 00:31:25,160 Speaker 4: Who wanted to invest in him and then got scared, right, and 614 00:31:25,280 --> 00:31:26,240 Speaker 4: they, they. 615 00:31:26,200 --> 00:31:29,480 Speaker 2: Were all talking about diversity initiatives, chief diversity officers, you know, 616 00:31:29,600 --> 00:31:32,400 Speaker 2: all those things. Then they get rid of them. So 617 00:31:33,160 --> 00:31:34,560 Speaker 2: if you look at that, you think. 618 00:31:35,800 --> 00:31:37,280 Speaker 1: Who has my back?
619 00:31:38,120 --> 00:31:42,000 Speaker 2: And I think you need to, as a leader, setting aside, 620 00:31:42,040 --> 00:31:44,040 Speaker 2: are you center left, are you left left, or are you 621 00:31:44,160 --> 00:31:47,720 Speaker 2: left left left? You just as a leader have to 622 00:31:47,800 --> 00:31:50,560 Speaker 2: say, I have your back, and I've always had your back. 623 00:31:50,960 --> 00:31:53,720 Speaker 2: I'm the person who was marching with you in Bakersfield 624 00:31:54,160 --> 00:31:56,800 Speaker 2: on the streets, whether you were a Democrat or Republican. 625 00:31:56,840 --> 00:31:58,880 Speaker 2: If you lost a loved one, I was with you, 626 00:31:59,480 --> 00:32:02,560 Speaker 2: and I was fighting for you. And I'm the civil 627 00:32:02,640 --> 00:32:04,680 Speaker 2: rights lawyer who was fighting for you. I was the 628 00:32:04,760 --> 00:32:07,880 Speaker 2: person there. We have your back, and I think 629 00:32:08,040 --> 00:32:12,240 Speaker 2: that authenticity is what's most important, beyond, you know, how 630 00:32:12,320 --> 00:32:15,000 Speaker 2: do you describe yourself. As it relates to reproductive rights, 631 00:32:15,320 --> 00:32:18,840 Speaker 2: we feel strongly that reproductive rights, it's a woman's decision. 632 00:32:19,360 --> 00:32:21,520 Speaker 2: We as men get the hell out of it. It's, 633 00:32:21,600 --> 00:32:25,400 Speaker 2: it's not for me to even think for a 634 00:32:25,560 --> 00:32:31,600 Speaker 2: moment that I should be in any way controlling, lecturing, talking, 635 00:32:31,960 --> 00:32:35,880 Speaker 2: having anything at all to intrude in the conversation, you 636 00:32:36,000 --> 00:32:40,440 Speaker 2: know, here, you know. And so for us, women's reproductive rights, 637 00:32:40,800 --> 00:32:43,440 Speaker 2: that is sacrosanct to the women. Men, stay the 638 00:32:43,480 --> 00:32:43,800 Speaker 2: hell out.
639 00:32:44,160 --> 00:32:48,040 Speaker 4: So you, I think, are antithetical to the current sort 640 00:32:48,080 --> 00:32:51,720 Speaker 4: of right wing media ecosystem. And I want to read 641 00:32:51,760 --> 00:32:54,680 Speaker 4: a quote from this guy named Michael Tomasky, who is 642 00:32:55,200 --> 00:32:57,640 Speaker 4: the editor of The New Republic. I interviewed him a 643 00:32:57,680 --> 00:33:02,040 Speaker 4: few days after the election, and I asked him about 644 00:33:03,080 --> 00:33:06,880 Speaker 4: his essay about why Trump won, and in exploring why 645 00:33:07,000 --> 00:33:10,680 Speaker 4: he did, Tomasky wrote, the answer is the right wing 646 00:33:10,840 --> 00:33:13,680 Speaker 4: media today. The right wing media, Fox News and the 647 00:33:13,840 --> 00:33:17,840 Speaker 4: entire News Corp, Newsmax, One America News Network, the Sinclair 648 00:33:17,960 --> 00:33:22,040 Speaker 4: network of radio and TV stations and newspapers, iHeartMedia, not 649 00:33:22,200 --> 00:33:26,160 Speaker 4: all of iHeartMedia, and the Bott Radio Network, 650 00:33:26,280 --> 00:33:30,080 Speaker 4: Christian radio, Elon Musk's X, the huge podcasts like Joe 651 00:33:30,200 --> 00:33:33,640 Speaker 4: Rogan's, much more, sets the news agenda in this country. 652 00:33:34,000 --> 00:33:37,120 Speaker 4: He repeated that in the next paragraph, sets the news 653 00:33:37,280 --> 00:33:40,280 Speaker 4: agenda in this country. And they fed their audiences 654 00:33:40,360 --> 00:33:44,360 Speaker 4: a diet of slanted and distorted information that made it 655 00:33:44,520 --> 00:33:48,160 Speaker 4: possible for Trump to win.
Before I ask you about 656 00:33:48,320 --> 00:33:51,840 Speaker 4: how this informed your decision to move forward with the 657 00:33:51,960 --> 00:33:54,840 Speaker 4: Midas Network, I'd love to hear your take on why 658 00:33:54,960 --> 00:33:58,000 Speaker 4: you think, and maybe Jordy, as a marketing person, you 659 00:33:58,120 --> 00:34:01,560 Speaker 4: have something interesting to say about this. Why has right 660 00:34:01,680 --> 00:34:06,240 Speaker 4: wing media been so incredibly effective? 661 00:34:06,560 --> 00:34:08,680 Speaker 1: Yep. I think when you go down that list, the 662 00:34:08,760 --> 00:34:11,600 Speaker 1: Fox Newses, the Rogans, and you keep going down 663 00:34:11,640 --> 00:34:12,239 Speaker 1: and down to. 664 00:34:12,760 --> 00:34:15,160 Speaker 4: The so-called manosphere and all those things. 665 00:34:15,360 --> 00:34:18,160 Speaker 1: Right, there's one thing that's really standing out to 666 00:34:18,239 --> 00:34:19,879 Speaker 1: me every time you go down through one of those 667 00:34:19,920 --> 00:34:23,560 Speaker 1: lists, is who they're funded by. Billionaires. Like, these billionaires: 668 00:34:23,719 --> 00:34:27,680 Speaker 1: Elon Musk, billionaire; Fox News, Murdoch, billionaire. You go 669 00:34:27,760 --> 00:34:30,560 Speaker 1: into the Rogan sphere, I don't know his whatnot, but 670 00:34:30,760 --> 00:34:35,200 Speaker 1: it's heavy funding, heavy funding and influence that 671 00:34:35,400 --> 00:34:38,680 Speaker 1: then just trickles down into that quote unquote manosphere. And 672 00:34:38,800 --> 00:34:42,000 Speaker 1: so once you get hit with, you know, the Fox 673 00:34:42,080 --> 00:34:45,440 Speaker 1: News clips, it's an ecosystem that all feeds into itself, 674 00:34:45,719 --> 00:34:47,960 Speaker 1: and then the algorithm takes it on your social network, 675 00:34:48,040 --> 00:34:52,040 Speaker 1: so you really can't escape it.
Like, it's consuming 676 00:34:52,160 --> 00:34:54,920 Speaker 1: in such a way that it's almost unavoidable. 677 00:34:55,640 --> 00:35:00,799 Speaker 2: People are suffering. People feel pained right now, and people 678 00:35:00,960 --> 00:35:06,400 Speaker 2: want answers about why they feel pain. They can't afford 679 00:35:06,440 --> 00:35:10,720 Speaker 2: a home, and they're seeing rich people with ten jets 680 00:35:11,400 --> 00:35:15,759 Speaker 2: and mansions and they can't afford a home. They don't 681 00:35:15,800 --> 00:35:18,279 Speaker 2: know if they're going to be able to. We have 682 00:35:18,400 --> 00:35:22,439 Speaker 2: to listen to people and what they're saying. Democrats, 683 00:35:22,960 --> 00:35:26,600 Speaker 2: listen to what people are actually saying to you. People 684 00:35:26,640 --> 00:35:30,680 Speaker 2: are suffering out there, and I think that they're getting 685 00:35:30,840 --> 00:35:35,400 Speaker 2: easy answers from right wing media about where their suffering 686 00:35:35,560 --> 00:35:38,480 Speaker 2: is coming from. They're being told the immigrants are doing 687 00:35:38,560 --> 00:35:40,640 Speaker 2: it to you. They're eating your cats and your dogs, 688 00:35:40,719 --> 00:35:42,719 Speaker 2: and that sounds stupid to a lot of people. The 689 00:35:43,520 --> 00:35:46,759 Speaker 2: people want an easy answer about why they're feeling pain. 690 00:35:47,400 --> 00:35:50,879 Speaker 2: And when you can blame a marginalized community and say they're 691 00:35:51,000 --> 00:35:54,360 Speaker 2: doing it, they're doing it. And then somebody comes in 692 00:35:54,480 --> 00:35:55,760 Speaker 2: and says, I'm going. 693 00:35:55,680 --> 00:35:56,719 Speaker 1: To make you rich. 694 00:35:57,560 --> 00:36:00,480 Speaker 2: I'm going to make you the richest person ever, just 695 00:36:00,680 --> 00:36:02,960 Speaker 2: like me.
You're going to be so rich you're not 696 00:36:03,040 --> 00:36:04,719 Speaker 2: even going to know what to do with how rich 697 00:36:04,800 --> 00:36:10,360 Speaker 2: you are. That message resonates with people who are suffering. 698 00:36:10,760 --> 00:36:15,600 Speaker 2: It's built on fraud, but you combine the lies with 699 00:36:15,800 --> 00:36:20,120 Speaker 2: a right wing media network giving you a simple answer, 700 00:36:21,560 --> 00:36:25,279 Speaker 2: and that connects with people. Now, how do you combat it? 701 00:36:25,840 --> 00:36:28,960 Speaker 2: You have to combat it by pointing out that that's 702 00:36:29,000 --> 00:36:32,040 Speaker 2: a lie and that's a fraud. But you also have 703 00:36:32,200 --> 00:36:36,480 Speaker 2: to provide solutions, and you have to speak to people's 704 00:36:37,040 --> 00:36:41,439 Speaker 2: real human condition. You have to look somebody in the eye, 705 00:36:41,800 --> 00:36:44,239 Speaker 2: and you have to say to them, I hear you, 706 00:36:45,360 --> 00:36:48,640 Speaker 2: I listen to you, I understand what you're going through. 707 00:36:49,800 --> 00:36:52,839 Speaker 2: And we could talk about the podcast bros, 708 00:36:52,880 --> 00:36:57,840 Speaker 2: the media ecosystem, but ultimately, with this massive funding machine, 709 00:36:58,360 --> 00:37:02,600 Speaker 2: they're pumping easy answers, scapegoating the other and making 710 00:37:02,719 --> 00:37:05,480 Speaker 2: people think, oh, that's why I'm going through this. And 711 00:37:05,560 --> 00:37:07,640 Speaker 2: that's not why they're going through this. They're going through 712 00:37:07,680 --> 00:37:11,200 Speaker 2: this because people like Trump and other people are the 713 00:37:11,280 --> 00:37:15,279 Speaker 2: ones doing it to them, inflicting it. That's what's causing it.
714 00:37:16,160 --> 00:37:19,880 Speaker 2: And the people who are causing it 715 00:37:20,000 --> 00:37:22,439 Speaker 2: have managed to convince the people they're inflicting the pain 716 00:37:22,560 --> 00:37:25,239 Speaker 2: on that they're helping them, and they're not. And so 717 00:37:25,440 --> 00:37:30,040 Speaker 2: the antidote to that is also simple messages. 718 00:37:30,560 --> 00:37:33,160 Speaker 4: You're all saying messaging is so important. Hasn't it been? 719 00:37:33,239 --> 00:37:36,200 Speaker 4: Because it seems to me there is sort of an 720 00:37:36,360 --> 00:37:41,640 Speaker 4: orchestrated effort to keep it simple but also to have 721 00:37:41,880 --> 00:37:47,560 Speaker 4: it kind of almost echoing throughout the whole ecosystem, right, 722 00:37:47,800 --> 00:37:51,000 Speaker 4: in a way that is just much more effective. 723 00:37:51,160 --> 00:37:51,320 Speaker 1: You know. 724 00:37:51,440 --> 00:37:54,920 Speaker 4: I think also people think that people are paying attention 725 00:37:55,040 --> 00:37:58,680 Speaker 4: to every little thing and they're just hearing sort of 726 00:37:59,680 --> 00:38:02,799 Speaker 4: this orchestrated effort. I'm not really making sense right now, 727 00:38:02,920 --> 00:38:06,759 Speaker 4: but I think most people aren't like us, 728 00:38:06,920 --> 00:38:10,759 Speaker 4: reading everything, or like your guy who's watching. 729 00:38:10,480 --> 00:38:13,960 Speaker 3: By the way, it is coordinated. And you know 730 00:38:14,640 --> 00:38:17,640 Speaker 3: those people you mentioned, they do work together, you know. 731 00:38:17,920 --> 00:38:21,239 Speaker 3: I'm sure, and I've seen certain evidence of it. 732 00:38:21,320 --> 00:38:23,200 Speaker 3: There's been stories about it as well, that there are 733 00:38:23,640 --> 00:38:26,440 Speaker 3: Signal chats and iMessage chats and various things.
734 00:38:26,880 --> 00:38:28,200 Speaker 4: This is what we're going to, this is what we're 735 00:38:28,200 --> 00:38:30,120 Speaker 4: going to talk about. Roger Ailes used to do 736 00:38:30,239 --> 00:38:33,400 Speaker 4: that every morning at Fox News. They'd distribute 737 00:38:33,680 --> 00:38:36,880 Speaker 4: a memo with talking points that every anchor had to 738 00:38:37,000 --> 00:38:37,400 Speaker 4: adhere to. 739 00:38:37,840 --> 00:38:39,759 Speaker 3: One of the things that we noticed early on, and 740 00:38:39,800 --> 00:38:41,840 Speaker 3: I feel like we were written off for it a 741 00:38:41,920 --> 00:38:45,760 Speaker 3: lot, is that those conversations were starting on the Internet 742 00:38:46,280 --> 00:38:49,799 Speaker 3: and they would then permeate to legacy media. They would 743 00:38:49,800 --> 00:38:52,879 Speaker 3: then get to people's Facebook feeds. They would then get 744 00:38:52,920 --> 00:38:55,400 Speaker 3: into the conversations that people are having with one another. 745 00:38:55,719 --> 00:38:57,759 Speaker 3: But you wonder, where does this cats and dogs rumor, 746 00:38:57,840 --> 00:38:59,719 Speaker 3: where does that even begin from? And you could 747 00:38:59,760 --> 00:39:02,359 Speaker 3: trace stuff like that back to an Internet post, back 748 00:39:02,440 --> 00:39:05,440 Speaker 3: to a tweet that went viral that people spoke about. 749 00:39:05,680 --> 00:39:08,960 Speaker 3: And so oftentimes, because everything is so fragmented right now, 750 00:39:09,040 --> 00:39:12,279 Speaker 3: you don't actually know where that information even started from, 751 00:39:12,440 --> 00:39:15,200 Speaker 3: what was the kind of match that ignited it.
But 752 00:39:15,400 --> 00:39:19,120 Speaker 3: oftentimes these conversations are beginning in these spaces and then 753 00:39:19,160 --> 00:39:22,840 Speaker 3: they kind of become more mainstream, and in many cases 754 00:39:22,880 --> 00:39:25,120 Speaker 3: now make their way to the President of the United States, 755 00:39:25,160 --> 00:39:27,879 Speaker 3: who lives in these spaces as well. But I think, 756 00:39:27,920 --> 00:39:30,960 Speaker 3: you know, I think it's a bit nefarious kind of 757 00:39:31,080 --> 00:39:34,200 Speaker 3: how it operates in a way, and I think folks 758 00:39:34,280 --> 00:39:36,799 Speaker 3: have learned, in my opinion, a lot of the wrong 759 00:39:36,920 --> 00:39:44,400 Speaker 3: lessons from the election. And by nefarious, I mean I 760 00:39:44,440 --> 00:39:47,279 Speaker 3: think people think like these people were created in a lab, 761 00:39:47,480 --> 00:39:50,279 Speaker 3: that, you know, the manosphere podcasts, that these people 762 00:39:50,320 --> 00:39:55,120 Speaker 3: were created in a lab designed to spread propaganda and 763 00:39:55,360 --> 00:39:58,640 Speaker 3: spread certain lies. But to me, that's not what it is. 764 00:39:58,840 --> 00:40:01,399 Speaker 3: I think a lot of these shows did begin 765 00:40:01,600 --> 00:40:05,160 Speaker 3: organically and they had audiences, and they connected with people 766 00:40:05,320 --> 00:40:08,279 Speaker 3: through typical, you know, life things, whether it's, you know, 767 00:40:08,400 --> 00:40:14,759 Speaker 3: ultimate fighting or weightlifting or, you know, discussing a 768 00:40:14,840 --> 00:40:17,800 Speaker 3: pop culture trial or, you know, whatever.
It may be 769 00:40:18,360 --> 00:40:21,000 Speaker 3: something that doesn't necessarily seem political, but once they have 770 00:40:21,200 --> 00:40:25,560 Speaker 3: you in, they could then use that space to send 771 00:40:25,680 --> 00:40:28,160 Speaker 3: people down these other rabbit holes that I think could 772 00:40:28,200 --> 00:40:31,920 Speaker 3: be, you know, kind of a bit more dangerous, I 773 00:40:32,320 --> 00:40:32,960 Speaker 3: think in that way. 774 00:40:33,120 --> 00:40:36,840 Speaker 4: So it's like a lifestyle podcast that sort of morphs 775 00:40:36,880 --> 00:40:38,239 Speaker 4: once it has people in it. 776 00:40:38,360 --> 00:40:41,120 Speaker 3: Then they say, Okay, now that I got you, have 777 00:40:41,320 --> 00:40:43,759 Speaker 3: you considered that the vaccine might be hurting you? 778 00:40:43,840 --> 00:40:45,960 Speaker 3: You know, like once they get, yeah, have 779 00:40:46,040 --> 00:40:47,960 Speaker 3: you heard of the, and then all of a 780 00:40:47,960 --> 00:40:49,800 Speaker 3: sudden you start getting put down this path, you know. 781 00:40:49,840 --> 00:40:52,600 Speaker 3: I read an interesting article a few years ago at 782 00:40:52,640 --> 00:40:54,040 Speaker 3: this point, I think it was in the New York Times, 783 00:40:54,080 --> 00:40:56,560 Speaker 3: and it was about how the Epoch Times, the Epoch 784 00:40:56,640 --> 00:40:58,839 Speaker 3: Times is, uh, they were. 785 00:40:59,239 --> 00:41:01,400 Speaker 4: I was signed up for that once, and I tried to, 786 00:41:01,760 --> 00:41:04,759 Speaker 4: because I always like to hear what everybody's saying and 787 00:41:04,880 --> 00:41:08,480 Speaker 4: how stories are being framed. Isn't it called epochs, epoch, 788 00:41:08,520 --> 00:41:12,480 Speaker 4: ee-pock? But anyway, it's like, it's like from China 789 00:41:12,560 --> 00:41:12,839 Speaker 4: or something.
790 00:41:12,880 --> 00:41:15,600 Speaker 3: Someone will correct me in the comments, and I was like, 791 00:41:16,120 --> 00:41:16,440 Speaker 3: oh my. 792 00:41:16,480 --> 00:41:20,239 Speaker 4: God, these people are insane. Stop stop stop. And they 793 00:41:20,360 --> 00:41:22,640 Speaker 4: kept sending me stuff. Finally they, and so. 794 00:41:22,719 --> 00:41:25,840 Speaker 3: In this New York Times piece, they analyzed how they 795 00:41:25,960 --> 00:41:29,120 Speaker 3: pulled people in with viral videos of cats and dogs, 796 00:41:29,800 --> 00:41:33,120 Speaker 3: and they would promote on Facebook cute cat 797 00:41:33,239 --> 00:41:35,520 Speaker 3: videos and cute dog videos, and so they would get 798 00:41:35,520 --> 00:41:38,680 Speaker 3: people to engage with their content because they thought that 799 00:41:38,800 --> 00:41:40,920 Speaker 3: they were engaging with, oh, look at this cute cat who's 800 00:41:40,960 --> 00:41:43,440 Speaker 3: on a skateboard, or, you know, whatever it is. Next 801 00:41:43,520 --> 00:41:46,439 Speaker 3: thing they know, they're being pushed, you know, Fauci needs 802 00:41:46,480 --> 00:41:48,720 Speaker 3: to go to prison, and all this crazy stuff. 803 00:41:48,960 --> 00:41:51,160 Speaker 3: And it all started because they watched this video of 804 00:41:51,560 --> 00:41:54,000 Speaker 3: a dog on a Roomba or something, you know. And 805 00:41:54,400 --> 00:41:56,480 Speaker 3: so that's what I mean by nefarious, that people could 806 00:41:56,480 --> 00:41:59,960 Speaker 3: get pulled in with things that seem benign or seem 807 00:42:00,320 --> 00:42:02,680 Speaker 3: like, you know, things that relate to their interests. 808 00:42:03,080 --> 00:42:06,040 Speaker 3: Next thing they know, they're taken down this algorithmic hell 809 00:42:06,120 --> 00:42:10,000 Speaker 3: hole that they really can't escape from once they're in.
810 00:42:10,680 --> 00:42:12,480 Speaker 3: And so I think that's one of the kind of 811 00:42:12,600 --> 00:42:15,759 Speaker 3: dangerous spaces that we're in right now. Also, you know, 812 00:42:15,880 --> 00:42:18,000 Speaker 3: social media is going to continue to push the content at 813 00:42:18,120 --> 00:42:20,200 Speaker 3: you that you engage with and that it thinks you 814 00:42:20,320 --> 00:42:22,680 Speaker 3: want to see. And once you do a few clicks, 815 00:42:23,000 --> 00:42:26,759 Speaker 3: sometimes people get lost in these algorithms and it just 816 00:42:26,840 --> 00:42:29,360 Speaker 3: takes over, it consumes their life, and they just kind of 817 00:42:29,520 --> 00:42:32,400 Speaker 3: lose a grip on, you know, a little bit of 818 00:42:32,520 --> 00:42:37,480 Speaker 3: normalcy and kind of even respect for other people's humanity. 819 00:42:37,680 --> 00:42:39,800 Speaker 3: And I think that's kind of one of the dangers. 820 00:42:39,480 --> 00:42:44,120 Speaker 4: That space programs you to have this visceral reaction to anything 821 00:42:44,320 --> 00:42:48,239 Speaker 4: that is different than your point of view. Yeah, right. 822 00:42:48,600 --> 00:42:52,080 Speaker 4: And I know that you all say the talking point 823 00:42:52,200 --> 00:42:55,080 Speaker 4: or the conventional wisdom is like the left needs a 824 00:42:55,200 --> 00:42:57,640 Speaker 4: Joe Rogan, they need their Joe Rogan. And I know 825 00:42:57,800 --> 00:43:01,040 Speaker 4: you have all said that that's the wrong way to 826 00:43:01,160 --> 00:43:06,120 Speaker 4: look at it. But clearly, to deal with this ecosystem 827 00:43:06,200 --> 00:43:10,280 Speaker 4: that is so powerful, different points of view, or even 828 00:43:11,320 --> 00:43:16,279 Speaker 4: truth, right, needs to have a more potent platform.
So 829 00:43:17,400 --> 00:43:25,040 Speaker 4: how do you think different views, or truth, and everything 830 00:43:25,120 --> 00:43:30,200 Speaker 4: that's going on with the Trump administration gets out there 831 00:43:30,440 --> 00:43:32,440 Speaker 4: and explained to people? And do you worry that you're 832 00:43:32,520 --> 00:43:34,360 Speaker 4: kind of preaching to the choir? I know I just 833 00:43:34,400 --> 00:43:37,200 Speaker 4: asked like twelve questions in one, but how do you 834 00:43:37,320 --> 00:43:38,080 Speaker 4: do that well? 835 00:43:38,200 --> 00:43:41,080 Speaker 2: On the preaching to the choir, I always ask this 836 00:43:41,239 --> 00:43:45,160 Speaker 2: question when people ask about Fox, and I say, 837 00:43:45,200 --> 00:43:48,359 Speaker 2: do you think that Fox News is preaching to the choir? 838 00:43:48,440 --> 00:43:50,160 Speaker 2: I say, raise your hand if you think Fox News 839 00:43:50,200 --> 00:43:54,080 Speaker 2: preaches to the choir, and usually in my group, everybody 840 00:43:54,800 --> 00:43:56,560 Speaker 2: raises their hand and says, of course, they're preaching to 841 00:43:56,600 --> 00:43:58,800 Speaker 2: a right wing choir. I go, do you think Fox 842 00:43:58,920 --> 00:44:03,760 Speaker 2: is one of the more powerful forces in politics in general? Okay, 843 00:44:03,800 --> 00:44:06,239 Speaker 2: so they're preaching to the choir. You don't view that 844 00:44:06,440 --> 00:44:09,640 Speaker 2: as a negative, right? Right. Okay. Well, what if your 845 00:44:09,760 --> 00:44:13,440 Speaker 2: choir just needs to sing louder than the other choir? 846 00:44:13,560 --> 00:44:15,640 Speaker 2: What if you just need a choir to begin with? 847 00:44:16,000 --> 00:44:18,759 Speaker 2: What if your choir doesn't believe you're a good conductor? 848 00:44:19,560 --> 00:44:22,320 Speaker 2: You should first be a good conductor of your own choir.
849 00:44:22,600 --> 00:44:24,800 Speaker 2: This is going back to what I said before, and 850 00:44:25,040 --> 00:44:28,440 Speaker 2: ensuring that your choir knows that you're actually their conductor 851 00:44:29,160 --> 00:44:32,560 Speaker 2: and that you're fighting for your choir. So I think 852 00:44:32,680 --> 00:44:34,759 Speaker 2: that whether you want to say it's the Democrats 853 00:44:34,840 --> 00:44:38,759 Speaker 2: or whoever, I think that one of the problems has 854 00:44:38,840 --> 00:44:43,200 Speaker 2: been that the choir needed to be preached to a 855 00:44:43,239 --> 00:44:45,520 Speaker 2: little bit from an authentic position, saying we're here 856 00:44:45,560 --> 00:44:48,440 Speaker 2: with you, we're fighting for you. And then you can 857 00:44:48,520 --> 00:44:52,200 Speaker 2: start to build and grow, because I think other people 858 00:44:52,320 --> 00:44:55,759 Speaker 2: start to see, wait a minute, that makes sense. 859 00:44:55,920 --> 00:44:57,680 Speaker 4: Your choir needed a church to go to. 860 00:44:59,160 --> 00:45:03,120 Speaker 2: Yeah, you know, and to me, when you're building, do 861 00:45:03,239 --> 00:45:07,319 Speaker 2: you want to model that on a choir built upon 862 00:45:07,520 --> 00:45:11,560 Speaker 2: values that you don't think are helpful? And this goes 863 00:45:11,600 --> 00:45:13,399 Speaker 2: to your Rogan question. I'm going to tie it all together, 864 00:45:13,400 --> 00:45:15,520 Speaker 2: you know. But do you want to 865 00:45:15,600 --> 00:45:18,239 Speaker 2: then say we should be like that? Well, you're going 866 00:45:18,280 --> 00:45:20,479 Speaker 2: to start losing people in your choir if you're saying 867 00:45:20,800 --> 00:45:23,640 Speaker 2: I want to start acting that way, because then they're 868 00:45:23,640 --> 00:45:26,839 Speaker 2: going to go, WHOA, what do you mean?
I don't 869 00:45:27,000 --> 00:45:30,840 Speaker 2: like the way they're treating people, marginalizing. I don't like the 870 00:45:30,880 --> 00:45:33,399 Speaker 2: way they're bullying. I don't like the way that they're 871 00:45:33,480 --> 00:45:34,719 Speaker 2: making a mockery of women. 872 00:45:34,800 --> 00:45:35,320 Speaker 1: I don't like that. 873 00:45:35,760 --> 00:45:37,839 Speaker 2: So why do you want to be like that? Huh? 874 00:45:38,520 --> 00:45:38,560 Speaker 3: No? 875 00:45:38,719 --> 00:45:41,520 Speaker 2: I think that they just want to see you say 876 00:45:41,640 --> 00:45:43,480 Speaker 2: this is what I believe in. I'll give you an example. 877 00:45:43,920 --> 00:45:46,040 Speaker 2: Andy Beshear, governor of Kentucky, got. 878 00:45:46,120 --> 00:45:50,440 Speaker 4: I interviewed him on Friday. Oh my god, he is, I mean, 879 00:45:50,560 --> 00:45:53,440 Speaker 4: talk about wholesome and all-American. It's like out of 880 00:45:53,520 --> 00:45:56,200 Speaker 4: central casting, Andy Beshear. 881 00:45:55,960 --> 00:46:01,359 Speaker 2: [unclear], Democratic governor. 882 00:46:01,440 --> 00:46:04,280 Speaker 4: Sixty-eight percent approval rating in Kentucky. 883 00:46:04,880 --> 00:46:08,879 Speaker 2: He doesn't say I'm not for LGBTQ, right, he says 884 00:46:08,920 --> 00:46:12,560 Speaker 2: the opposite. He vetoes any bill that comes his way 885 00:46:12,960 --> 00:46:16,040 Speaker 2: that's anti LGBTQ, and then he does a press 886 00:46:16,120 --> 00:46:18,680 Speaker 2: conference and he'll say, this is why I'm vetoing it. 887 00:46:18,920 --> 00:46:21,920 Speaker 2: We need to treat all human beings with respect. And 888 00:46:22,000 --> 00:46:26,120 Speaker 2: they respect him in Kentucky for doing that. So to me, 889 00:46:26,320 --> 00:46:29,200 Speaker 2: if he can do that in Kentucky, we should be 890 00:46:29,239 --> 00:46:32,120 Speaker 2: able to do that everywhere else.
And to me, that's 891 00:46:32,160 --> 00:46:34,720 Speaker 2: a great example. I was so impressed in the interview 892 00:46:34,760 --> 00:46:36,880 Speaker 2: I did with him, where he conveyed, you know, that 893 00:46:37,000 --> 00:46:40,400 Speaker 2: message to me. So there's that, and then there's what 894 00:46:40,560 --> 00:46:43,000 Speaker 2: Senator Bernie Sanders said to me, which is that the 895 00:46:43,160 --> 00:46:48,200 Speaker 2: most important issues are sometimes the least discussed. And that's true. 896 00:46:48,280 --> 00:46:51,040 Speaker 2: You know, people aren't talking about the issues that actually 897 00:46:51,520 --> 00:46:54,360 Speaker 2: impact us the most, whether that's housing, whether that's healthcare, 898 00:46:54,640 --> 00:46:58,800 Speaker 2: whether it's education, whether it's, you know, living paycheck to paycheck. 899 00:46:58,840 --> 00:47:01,279 Speaker 2: All of these issues are not being discussed, so 900 00:47:01,440 --> 00:47:04,360 Speaker 2: we should discuss them. So to me, in terms of 901 00:47:04,880 --> 00:47:08,880 Speaker 2: building what our version of what's going on on the 902 00:47:08,960 --> 00:47:11,480 Speaker 2: other side should be, I think it's obvious we should 903 00:47:12,000 --> 00:47:14,640 Speaker 2: make sure our choir sings louder, and we should be 904 00:47:14,719 --> 00:47:18,840 Speaker 2: focusing on the issues that actually occupy a lot of 905 00:47:18,920 --> 00:47:23,080 Speaker 2: the mental space in Americans' minds, that are filled with conspiracies, 906 00:47:23,480 --> 00:47:25,560 Speaker 2: and we should fill it with the truth and solutions. 907 00:47:25,800 --> 00:47:29,560 Speaker 4: And what is the reaction among, say, people who are 908 00:47:29,640 --> 00:47:33,520 Speaker 4: more conservative, you know, because everyone's all over the place 909 00:47:33,560 --> 00:47:37,640 Speaker 4: and you see like working class folks who supported Donald Trump.
910 00:47:38,080 --> 00:47:40,480 Speaker 4: But if you're tackling the issues they care about, do 911 00:47:40,560 --> 00:47:43,920 Speaker 4: you find that they are gravitating towards your platform? 912 00:47:44,760 --> 00:47:46,000 Speaker 1: Yeah. I mean, look, or not. 913 00:47:47,000 --> 00:47:51,480 Speaker 2: I'll say this: whether they're gravitating or not gravitating, 914 00:47:52,600 --> 00:47:57,040 Speaker 2: I'm confident in the views and values. I stand behind them. 915 00:47:57,160 --> 00:47:59,320 Speaker 2: I stand on business and I stand behind them. And 916 00:47:59,400 --> 00:48:03,279 Speaker 2: I'm confident that that's the approach that will, you know. 917 00:48:04,080 --> 00:48:08,840 Speaker 2: I'm not trying to chase people and beg them to 918 00:48:09,480 --> 00:48:12,600 Speaker 2: be good people. I simply want to let them know 919 00:48:13,040 --> 00:48:16,960 Speaker 2: that here's a place where we feel very strongly about 920 00:48:17,000 --> 00:48:21,080 Speaker 2: these things. There's a massive community now with millions of 921 00:48:21,120 --> 00:48:23,960 Speaker 2: people who watch this. Join it if you want to 922 00:48:24,040 --> 00:48:26,200 Speaker 2: join it, don't join it if you don't want to 923 00:48:26,320 --> 00:48:29,279 Speaker 2: join it. But we are, we are here, and we're 924 00:48:29,320 --> 00:48:30,800 Speaker 2: standing for what we believe. 925 00:48:30,600 --> 00:48:31,520 Speaker 1: And so I don't. 926 00:48:32,120 --> 00:48:35,320 Speaker 2: I don't, it may sound, I don't occupy myself so 927 00:48:35,440 --> 00:48:38,359 Speaker 2: much thinking about, you know, I'm really hoping that I've 928 00:48:38,520 --> 00:48:42,640 Speaker 2: just changed, you know, mister Maga's mind and made him 929 00:48:42,680 --> 00:48:45,920 Speaker 2: into a mister Midas. Like, I.
If they want to 930 00:48:45,960 --> 00:48:50,279 Speaker 2: live in their rage algorithm, if they want 931 00:48:50,320 --> 00:48:52,839 Speaker 2: to live in there, [unclear]. Yeah, if 932 00:48:52,840 --> 00:48:54,839 Speaker 2: they want to live in their rage algorithm, then that's 933 00:48:54,840 --> 00:48:57,200 Speaker 2: where they could live, you know. But they should know 934 00:48:57,280 --> 00:48:59,919 Speaker 2: that there's other places where we're going to talk about 935 00:49:00,080 --> 00:49:03,680 Speaker 2: the issues in an impactful, logical, smart way, and then 936 00:49:03,760 --> 00:49:06,719 Speaker 2: we're going to layer that with some positivity today. 937 00:49:06,920 --> 00:49:10,680 Speaker 4: Do you feel that this whole kind of opposite views 938 00:49:11,080 --> 00:49:14,680 Speaker 4: of, you know, the Foxes of the world are in 939 00:49:14,880 --> 00:49:19,040 Speaker 4: aggregate gaining traction? You have, for example, Pod Save America, 940 00:49:19,200 --> 00:49:22,760 Speaker 4: you have Courier Newsroom. You have a lot of these digital 941 00:49:23,760 --> 00:49:27,040 Speaker 4: companies, networks, honestly. I know you call yourself a network, 942 00:49:27,080 --> 00:49:31,799 Speaker 4: but they're networks too, kind of rising up and creating 943 00:49:32,080 --> 00:49:38,000 Speaker 4: a force field that is, you know, bumping up against 944 00:49:38,640 --> 00:49:41,200 Speaker 4: this other huge ecosystem. 945 00:49:41,719 --> 00:49:43,640 Speaker 2: I'll just give you the data quickly, you know, from 946 00:49:43,680 --> 00:49:47,480 Speaker 2: our network: three hundred and fifty million or so YouTube 947 00:49:47,560 --> 00:49:48,680 Speaker 2: views every single month. 948 00:49:48,800 --> 00:49:49,760 Speaker 4: I'm very jealous. 949 00:49:49,800 --> 00:49:52,040 Speaker 1: So that's more, more. 950 00:49:52,120 --> 00:49:53,840 Speaker 4: We're all like, what's going on?
951 00:49:54,560 --> 00:49:57,000 Speaker 2: Yeah, but that's more on a day to day basis 952 00:49:57,120 --> 00:49:59,799 Speaker 2: on digital, and I think we all agree things are 953 00:50:00,080 --> 00:50:04,919 Speaker 2: moving digital. So the cable news wants to go more digital. 954 00:50:04,760 --> 00:50:07,560 Speaker 4: Right. And they're on there forever. I was like, 955 00:50:07,719 --> 00:50:09,120 Speaker 4: what is wrong with you people? 956 00:50:09,239 --> 00:50:12,239 Speaker 2: But anyway, so on that platform, the Midas Touch 957 00:50:12,320 --> 00:50:14,560 Speaker 2: Network, on a day to day basis, and some days 958 00:50:15,120 --> 00:50:17,040 Speaker 2: Fox beats us, but on a day to day basis, 959 00:50:17,080 --> 00:50:20,319 Speaker 2: we'll get more views than Fox on YouTube, and then 960 00:50:20,440 --> 00:50:23,600 Speaker 2: CNN and MSNBC and all the rest, Midas Touch Network 961 00:50:23,880 --> 00:50:26,960 Speaker 2: will beat those. We're the fastest growing Substack right now 962 00:50:27,400 --> 00:50:28,759 Speaker 2: of all Substacks. 963 00:50:28,280 --> 00:50:31,080 Speaker 4: Out there, Substack, after Memorial Day. 964 00:50:31,000 --> 00:50:34,480 Speaker 1: So we should do that, would be fun, and we love 965 00:50:34,360 --> 00:50:36,080 Speaker 4: Substack, that we're all such good 966 00:50:36,000 --> 00:50:41,560 Speaker 2: for, exactly, exactly. And then, you know, across, and then 967 00:50:41,680 --> 00:50:45,680 Speaker 2: on the podcast downloads themselves. Uh, Jordy showed me the 968 00:50:45,800 --> 00:50:48,839 Speaker 2: data today.
So in the last month, the Midas Touch 969 00:50:48,960 --> 00:50:53,840 Speaker 2: Network beat Rogan, Tucker Carlson, Candace Owens, Charlie Kirk, and 970 00:50:53,920 --> 00:50:57,880 Speaker 2: Ben Shapiro all combined, for the 971 00:50:57,960 --> 00:51:02,040 Speaker 2: last month. So in terms 972 00:51:02,120 --> 00:51:05,279 Speaker 2: of growth, you know, I think that, you know, 973 00:51:05,400 --> 00:51:09,839 Speaker 2: that shows that there is an appetite for this out there, 974 00:51:10,320 --> 00:51:13,600 Speaker 2: and I think that, you know, we got to preach 975 00:51:13,640 --> 00:51:16,439 Speaker 2: to that choir, you know, and expand it. But let's 976 00:51:16,440 --> 00:51:18,680 Speaker 2: start also with fortifying the choir and let them know 977 00:51:18,760 --> 00:51:19,919 Speaker 2: we care about them. Yeah. 978 00:51:27,480 --> 00:51:29,879 Speaker 4: Hi, everyone, it's me, Katie Couric. You know, if you've 979 00:51:29,880 --> 00:51:32,600 Speaker 4: been following me on social media, you know I love 980 00:51:32,719 --> 00:51:35,800 Speaker 4: to cook, or at least try, especially alongside some of 981 00:51:35,880 --> 00:51:39,600 Speaker 4: my favorite chefs and foodies like Benny Blanco, Jake Cohen, 982 00:51:39,719 --> 00:51:44,080 Speaker 4: Lidey Heuck, Alison Roman, and Ina Garten. So I started a 983 00:51:44,160 --> 00:51:47,680 Speaker 4: free newsletter called Good Taste to share recipes, tips and 984 00:51:47,840 --> 00:51:51,960 Speaker 4: kitchen must-haves. Just sign up at katiecouric dot com slash 985 00:51:52,160 --> 00:51:55,080 Speaker 4: Good Taste. That's k A T I E C O 986 00:51:55,320 --> 00:51:58,880 Speaker 4: U R I C dot com slash Good Taste. I 987 00:51:59,080 --> 00:52:11,359 Speaker 4: promise your taste buds will be happy you did.
I'm 988 00:52:11,480 --> 00:52:15,279 Speaker 4: curious, because I've struggled with this as someone who, you 989 00:52:15,400 --> 00:52:18,440 Speaker 4: grew up watching, I'm sure, who started in very 990 00:52:18,560 --> 00:52:24,000 Speaker 4: traditional mainstream media. Now pointing out the facts and what 991 00:52:24,400 --> 00:52:29,400 Speaker 4: is really happening is automatically interpreted as being biased, right? 992 00:52:30,120 --> 00:52:34,600 Speaker 4: And of course I think there's no such thing as 993 00:52:34,680 --> 00:52:38,640 Speaker 4: true objectivity. But having said that, you know, I really 994 00:52:38,719 --> 00:52:42,400 Speaker 4: struggle with that, and many people say, listen, the rules 995 00:52:42,480 --> 00:52:46,600 Speaker 4: have changed. It's okay to say you support trans people. 996 00:52:46,840 --> 00:52:50,240 Speaker 4: It's okay to say I am one hundred percent 997 00:52:50,360 --> 00:52:54,160 Speaker 4: for reproductive rights. You know, all these things that, honestly, 998 00:52:54,280 --> 00:52:58,440 Speaker 4: I personally hold dear, but professionally I've been trained 999 00:52:58,840 --> 00:53:01,279 Speaker 4: to not share. I'm curious if you think sort 1000 00:53:01,320 --> 00:53:07,960 Speaker 4: of old fashioned, semi objective, knowing that pure objectivity is impossible, 1001 00:53:08,480 --> 00:53:12,440 Speaker 4: that kind of journalism still has a place in the culture, 1002 00:53:12,680 --> 00:53:15,560 Speaker 4: or is it simply, you know, the seventy five and 1003 00:53:15,760 --> 00:53:19,040 Speaker 4: up people who are watching the network evening newscasts? 1004 00:53:19,920 --> 00:53:22,160 Speaker 3: Yeah, I mean, I certainly think it still has a place.
1005 00:53:22,440 --> 00:53:24,400 Speaker 3: But at the same time, I think everything is so 1006 00:53:24,640 --> 00:53:27,920 Speaker 3: fragmented now that I think such a large portion of 1007 00:53:28,040 --> 00:53:32,000 Speaker 3: the audience is going to places that share a point 1008 00:53:32,040 --> 00:53:35,520 Speaker 3: of view. And I think, you know, I think if 1009 00:53:35,600 --> 00:53:37,960 Speaker 3: we just did a report that is, this happened, then 1010 00:53:37,960 --> 00:53:40,960 Speaker 3: this happened, then this happened, then this happened, I mean, 1011 00:53:41,080 --> 00:53:44,120 Speaker 3: there's plenty of places where people, you know, could find that. 1012 00:53:44,239 --> 00:53:47,360 Speaker 4: Not really, I mean, fewer and fewer, right? Right. Yeah. But. 1013 00:53:47,680 --> 00:53:49,719 Speaker 3: I think the key thing that we do 1014 00:53:50,000 --> 00:53:52,360 Speaker 3: is connect the dots on all those things. And I 1015 00:53:52,440 --> 00:53:54,640 Speaker 3: think when you're able to weave it into a story 1016 00:53:54,800 --> 00:53:56,879 Speaker 3: and actually connect the dots, I think that's what people 1017 00:53:57,040 --> 00:54:01,440 Speaker 3: want to be able to gain clarity on. Because I think, 1018 00:54:01,680 --> 00:54:03,520 Speaker 3: as we've kind of been saying, there's so much 1019 00:54:03,640 --> 00:54:05,960 Speaker 3: information out there that in order to be able to 1020 00:54:06,280 --> 00:54:09,160 Speaker 3: digest it, it has to be presented, you know, I 1021 00:54:09,280 --> 00:54:12,840 Speaker 3: think in many ways, you know, in certain ways, just 1022 00:54:12,920 --> 00:54:16,680 Speaker 3: to make it clear, even just from a storytelling perspective, 1023 00:54:16,760 --> 00:54:19,840 Speaker 3: so people get how does a tariff affect, you know, 1024 00:54:20,120 --> 00:54:23,000 Speaker 3: their, you know, their business or the price they're going 1025 00:54:23,080 -->
00:54:24,960 Speaker 3: to pay for goods, you know, and how does this 1026 00:54:25,040 --> 00:54:27,400 Speaker 3: affect their medication and their family? Like I think, you 1027 00:54:27,520 --> 00:54:28,839 Speaker 3: just I think you need to be able to tie 1028 00:54:28,880 --> 00:54:31,040 Speaker 3: it all together in a way, and whether people think 1029 00:54:31,080 --> 00:54:33,800 Speaker 3: it's biased or not when you're speaking on those issues, 1030 00:54:34,320 --> 00:54:35,640 Speaker 3: to me, I think you just need to put that 1031 00:54:35,719 --> 00:54:37,879 Speaker 3: aside because now we live in a time when people 1032 00:54:37,920 --> 00:54:40,080 Speaker 3: are going to comment on on everything, right, Like I said, 1033 00:54:40,160 --> 00:54:42,480 Speaker 3: and people are going to comment. I mispronounce something? We'll 1034 00:54:42,480 --> 00:54:44,560 Speaker 3: get a comment, right. It's it's there's going to be 1035 00:54:44,600 --> 00:54:47,480 Speaker 3: a comment about everything. You have to almost bake that 1036 00:54:47,680 --> 00:54:50,960 Speaker 3: into what you're doing, in my opinion, and just be you, 1037 00:54:50,960 --> 00:54:54,000 Speaker 3: you know, and and let the chips fall 1038 00:54:54,000 --> 00:54:55,840 Speaker 3: where they may. I think the biggest thing that you 1039 00:54:55,920 --> 00:54:58,280 Speaker 3: need to do, though, is be authentic, you know, to yourself. 1040 00:54:58,360 --> 00:55:01,240 Speaker 3: And I think people could smell bullshit from a mile away.
1041 00:55:02,000 --> 00:55:04,239 Speaker 3: And that's why I think when after the twenty twenty 1042 00:55:04,280 --> 00:55:06,320 Speaker 3: four election, when there were so many I feel like 1043 00:55:06,360 --> 00:55:10,960 Speaker 3: I saw so many democratic politicians and people in media 1044 00:55:11,160 --> 00:55:14,000 Speaker 3: and podcasts and you name it, all of a sudden, 1045 00:55:14,400 --> 00:55:16,920 Speaker 3: everything I thought they believed in and the communities that 1046 00:55:16,960 --> 00:55:18,799 Speaker 3: I thought they supported, all of a sudden they seem 1047 00:55:18,840 --> 00:55:21,359 Speaker 3: to be backtracking on that. Maybe we focused a little 1048 00:55:21,400 --> 00:55:23,719 Speaker 3: too much on inclusivity, Maybe we focused a little too 1049 00:55:23,800 --> 00:55:27,799 Speaker 3: much on reproductive rights. And when you try to kind 1050 00:55:27,840 --> 00:55:31,400 Speaker 3: of shape your opinions based around public polling. 1051 00:55:31,160 --> 00:55:33,960 Speaker 4: Right and kind of like where the wind is blowing. It's 1052 00:55:34,080 --> 00:55:37,680 Speaker 4: the DEI people who when it was cool, it was great, 1053 00:55:37,800 --> 00:55:39,680 Speaker 4: and now they're like, what what does that stand for? 1054 00:55:39,960 --> 00:55:43,040 Speaker 3: Yeah? And so how do I trust you with anything, 1055 00:55:43,320 --> 00:55:46,960 Speaker 3: and I don't. I'm speaking of mister Maga, missus Meidas 1056 00:55:47,000 --> 00:55:48,879 Speaker 3: whatever we were talking about before. You know, I don't 1057 00:55:48,920 --> 00:55:51,320 Speaker 3: think you're gonna win. You know, the mister Maga is 1058 00:55:51,360 --> 00:55:53,760 Speaker 3: not going to vote for you because you have those beliefs. 1059 00:55:53,760 --> 00:55:55,960 Speaker 3: They're going to vote for the Maga guy. Okay, they're 1060 00:55:56,000 --> 00:55:58,360 Speaker 3: not going to come around.
They may say they respect 1061 00:55:58,400 --> 00:56:00,279 Speaker 3: you more or something. You know, you're not getting 1062 00:56:00,280 --> 00:56:04,000 Speaker 3: their vote, I assure you. And you're just alienating your choir, 1063 00:56:04,120 --> 00:56:06,520 Speaker 3: as we were speaking about before, and you're angering the 1064 00:56:06,600 --> 00:56:10,480 Speaker 3: people who you've built trust with because you violated that trust. 1065 00:56:10,600 --> 00:56:13,239 Speaker 3: So I think right now, and I think whether you're 1066 00:56:13,920 --> 00:56:17,600 Speaker 3: MeidasTouch or whether you're Joe Rogan or you name it, 1067 00:56:18,160 --> 00:56:21,720 Speaker 3: I think the most important thing is that you're authentic 1068 00:56:21,920 --> 00:56:24,759 Speaker 3: to your audience and that people trust the things that 1069 00:56:24,840 --> 00:56:27,479 Speaker 3: you are saying and know where you are coming from, 1070 00:56:27,880 --> 00:56:30,160 Speaker 3: and know that tomorrow, if a poll comes out that 1071 00:56:30,280 --> 00:56:32,640 Speaker 3: says this big thing you've been pushing may not be 1072 00:56:32,760 --> 00:56:34,520 Speaker 3: super popular right now, that you're not all of a 1073 00:56:34,560 --> 00:56:37,800 Speaker 3: sudden going to just change your beliefs, you know, based 1074 00:56:37,880 --> 00:56:41,200 Speaker 3: on it. And I mean that's that's my thought on that. 1075 00:56:41,320 --> 00:56:42,920 Speaker 3: I think you know, one of the things I could 1076 00:56:42,920 --> 00:56:44,520 Speaker 3: say about Donald Trump.
I'm not trying to say anything 1077 00:56:44,600 --> 00:56:46,480 Speaker 3: nice about the guy right now, but one of the 1078 00:56:46,520 --> 00:56:48,520 Speaker 3: things I could say is I think he's pushed forward 1079 00:56:48,560 --> 00:56:51,480 Speaker 3: a lot with things that are incredibly unpopular, and then 1080 00:56:51,520 --> 00:56:54,120 Speaker 3: he's taken his people kind of on that journey with 1081 00:56:54,200 --> 00:56:56,080 Speaker 3: him going way back. I mean, remember he was being 1082 00:56:56,200 --> 00:57:00,439 Speaker 3: laughed at at these debates and the White House dinner. 1083 00:57:00,560 --> 00:57:03,440 Speaker 3: I mean, everyone was making a mockery of him, but 1084 00:57:03,560 --> 00:57:05,960 Speaker 3: he pushed forward and he tried to make the case 1085 00:57:06,040 --> 00:57:07,600 Speaker 3: to bring people to his side. 1086 00:57:07,840 --> 00:57:10,239 Speaker 4: He is a marketing genius, don't you think Jordy in 1087 00:57:10,280 --> 00:57:10,760 Speaker 4: a way. 1088 00:57:14,640 --> 00:57:16,959 Speaker 2: I would say he's a Ponzi schemer and a fraud. 1089 00:57:17,200 --> 00:57:19,560 Speaker 1: Yeah, a good character. 1090 00:57:20,040 --> 00:57:22,600 Speaker 4: I mean you can think dialectically he can be all 1091 00:57:22,680 --> 00:57:23,480 Speaker 4: those things, can't he. 1092 00:57:24,160 --> 00:57:25,960 Speaker 1: I mean, was Bernie Madoff a good marketer? 1093 00:57:27,560 --> 00:57:29,800 Speaker 4: Well, I think he's very I think he was very 1094 00:57:30,240 --> 00:57:33,040 Speaker 4: I don't think he had a public persona that was 1095 00:57:33,240 --> 00:57:36,320 Speaker 4: charismatic and got people to buy what he was selling. 1096 00:57:36,440 --> 00:57:38,640 Speaker 4: I mean he did have a small group, right, but 1097 00:57:39,320 --> 00:57:40,880 Speaker 4: I don't know just on a well, we don't have 1098 00:57:40,960 --> 00:57:41,360 Speaker 4: to get.
1099 00:57:43,560 --> 00:57:45,880 Speaker 1: I think what Brett was saying though, too about audiences 1100 00:57:46,080 --> 00:57:49,560 Speaker 1: these days, and it's a marketing buzzword authenticity. I know 1101 00:57:49,600 --> 00:57:50,800 Speaker 1: I used to say it a lot back in my 1102 00:57:50,880 --> 00:57:54,480 Speaker 1: marketing days, but genuinely, I think people are viewing what 1103 00:57:54,560 --> 00:57:57,480 Speaker 1: we're doing here at the MeidasTouch Network as absolutely authentic. 1104 00:57:57,600 --> 00:57:59,440 Speaker 1: You know, with three brothers, we don't try and be 1105 00:57:59,480 --> 00:58:01,680 Speaker 1: anything that we're not right. We rely on Ben for 1106 00:58:01,880 --> 00:58:04,360 Speaker 1: the legal, Brett for the digital, and me for the marketing, 1107 00:58:04,520 --> 00:58:07,160 Speaker 1: and then we bring in experts in specific spaces that 1108 00:58:07,240 --> 00:58:09,800 Speaker 1: could speak to issues that we're not experts in. Yeah, 1109 00:58:09,920 --> 00:58:12,360 Speaker 1: and we don't pretend that we're experts in, but we 1110 00:58:12,520 --> 00:58:15,439 Speaker 1: have a clear value set of you know, right or wrong. 1111 00:58:15,960 --> 00:58:18,440 Speaker 1: And if those lean more democratic, well, I think that 1112 00:58:18,560 --> 00:58:20,480 Speaker 1: tells you kind of where we are as a country, 1113 00:58:20,560 --> 00:58:23,760 Speaker 1: as a society, and we just stand behind it wholeheartedly, 1114 00:58:24,280 --> 00:58:26,000 Speaker 1: and I think people like that. I think that's what 1115 00:58:26,320 --> 00:58:27,200 Speaker 1: folks are gravitating to. 1116 00:58:27,280 --> 00:58:29,160 Speaker 3: By the way, that's not to say that your ideas 1117 00:58:29,240 --> 00:58:31,840 Speaker 3: cannot evolve, that you cannot change your opinions. I think 1118 00:58:31,920 --> 00:58:34,760 Speaker 3: certainly you can.
And I don't think your beliefs that 1119 00:58:34,880 --> 00:58:37,120 Speaker 3: you had at age eighteen have to be the beliefs 1120 00:58:37,160 --> 00:58:39,520 Speaker 3: that you have at age fifty. You know, I think 1121 00:58:39,600 --> 00:58:42,960 Speaker 3: that evolution is a part of life. Exploring new ideas, 1122 00:58:43,080 --> 00:58:45,760 Speaker 3: engaging with them, changing your mind is a part of life. 1123 00:58:45,800 --> 00:58:49,600 Speaker 3: But I think people to your point, know when you go, oh, 1124 00:58:50,160 --> 00:58:52,440 Speaker 3: this poll came out yesterday, and all of a sudden, 1125 00:58:52,560 --> 00:58:55,960 Speaker 3: maybe I shouldn't support transgender people so much. Maybe 1126 00:58:56,000 --> 00:58:58,600 Speaker 3: I shouldn't support you know, diversity, Maybe I shouldn't support 1127 00:58:58,640 --> 00:59:02,040 Speaker 3: reproductive rights. People see right through that crap, you know, 1128 00:59:02,240 --> 00:59:04,160 Speaker 3: and then they go, you know, they go, this person, 1129 00:59:04,160 --> 00:59:06,080 Speaker 3: okay, So you're just a fraud. How do I 1130 00:59:06,160 --> 00:59:08,480 Speaker 3: trust anything else you're telling me right now? How 1131 00:59:08,520 --> 00:59:10,080 Speaker 3: do I know you're being honest about that? What if 1132 00:59:10,120 --> 00:59:12,720 Speaker 3: tomorrow you know some other politician won, one that was the 1133 00:59:12,760 --> 00:59:14,400 Speaker 3: opposite of that, and then all of a sudden you're 1134 00:59:14,880 --> 00:59:17,280 Speaker 3: switching your beliefs on that too, Like how do you 1135 00:59:17,440 --> 00:59:20,440 Speaker 3: trust somebody you know who's sort of a shape shifter 1136 00:59:20,720 --> 00:59:23,240 Speaker 3: like that?
And I think it's important to not view, 1137 00:59:23,560 --> 00:59:26,680 Speaker 3: you know, as you go from election to election or 1138 00:59:27,120 --> 00:59:30,080 Speaker 3: you know, issue to issue, whatever it is, that you 1139 00:59:30,240 --> 00:59:33,280 Speaker 3: don't just change with the way the wind is blowing, 1140 00:59:33,360 --> 00:59:35,800 Speaker 3: but that you are staying true through those moments, and 1141 00:59:35,920 --> 00:59:38,320 Speaker 3: that you're sticking up for your beliefs because I think 1142 00:59:38,360 --> 00:59:41,320 Speaker 3: at least you know, at least people will respect you. 1143 00:59:41,000 --> 00:59:45,000 Speaker 4: Right, and giving them facts and data that actually 1144 00:59:45,160 --> 00:59:49,240 Speaker 4: reinforces your position. I think that's really critical. Just a 1145 00:59:49,320 --> 00:59:51,040 Speaker 4: few more questions. I could talk to you guys for 1146 00:59:51,240 --> 00:59:54,040 Speaker 4: hours. We'll have to do it over a beer sometime, please, 1147 00:59:54,240 --> 00:59:57,680 Speaker 4: but I'd love to know, just sort of logistically, selfishly 1148 00:59:57,880 --> 01:00:00,840 Speaker 4: a little about the secret sauce that has made you 1149 01:00:01,120 --> 01:00:04,800 Speaker 4: so successful. You guys have hired a bunch of different 1150 01:00:05,240 --> 01:00:08,400 Speaker 4: It's interesting. I didn't realize Michael Cohen was part of 1151 01:00:08,480 --> 01:00:11,120 Speaker 4: your network, you know, the former Trump lawyer, But 1152 01:00:11,640 --> 01:00:14,840 Speaker 4: you have hired a lot of different people. You have correspondents, 1153 01:00:14,920 --> 01:00:17,800 Speaker 4: but you also have sort of what might traditionally be 1154 01:00:17,920 --> 01:00:21,440 Speaker 4: called pundits.
Do you ever feel like you've got too 1155 01:00:21,560 --> 01:00:24,800 Speaker 4: much content that's coming out or how do you kind 1156 01:00:24,840 --> 01:00:26,400 Speaker 4: of organize what you're doing. 1157 01:00:26,920 --> 01:00:29,440 Speaker 2: So we've taken the twenty four hour cable news cycle 1158 01:00:29,480 --> 01:00:32,640 Speaker 2: and we've tried to deconstruct it. So whereas a twenty 1159 01:00:32,680 --> 01:00:36,600 Speaker 2: four hour cable news cycle, you'll have how many stories 1160 01:00:36,640 --> 01:00:38,880 Speaker 2: in a given day, five to seven that are on 1161 01:00:39,040 --> 01:00:42,160 Speaker 2: repeat every day. So we try to take those five 1162 01:00:42,280 --> 01:00:45,080 Speaker 2: to seven stories, and I don't watch what cable 1163 01:00:45,160 --> 01:00:47,520 Speaker 2: news says those five to seven are. We program it based 1164 01:00:47,560 --> 01:00:50,439 Speaker 2: on what we think are the most important, say seven 1165 01:00:50,480 --> 01:00:53,040 Speaker 2: to ten stories of the day, and then that becomes 1166 01:00:53,120 --> 01:00:57,680 Speaker 2: basically ten to twelve YouTube videos you know, every day 1167 01:00:57,840 --> 01:00:59,720 Speaker 2: from you know, and they're all. 1168 01:00:59,520 --> 01:01:02,800 Speaker 4: Your different contributors, right because you do some right and 1169 01:01:03,440 --> 01:01:07,000 Speaker 4: do do I'll do yes, I see you too, Jordan. 1170 01:01:07,400 --> 01:01:08,439 Speaker 4: Every now and then, every now and then. 1171 01:01:08,600 --> 01:01:11,160 Speaker 1: More so social they do more of the long form stuff. 1172 01:01:11,360 --> 01:01:15,120 Speaker 2: So every day we probably have about one or two 1173 01:01:15,480 --> 01:01:19,280 Speaker 2: some days three live shows that last about an hour. 1174 01:01:19,880 --> 01:01:21,680 Speaker 2: And so that may be the show I do with 1175 01:01:21,880 --> 01:01:24,920 Speaker 2: Brett and Jordy.
That may be a show called Legal AF 1176 01:01:25,040 --> 01:01:26,880 Speaker 2: that I do with a guy named Michael Popak and 1177 01:01:27,280 --> 01:01:30,320 Speaker 2: Karen Friedman Agnifilo and she was the number two DA 1178 01:01:30,400 --> 01:01:32,440 Speaker 2: in Manhattan. These are like top lawyers. 1179 01:01:33,520 --> 01:01:34,040 Speaker 4: I'm kidding. 1180 01:01:34,120 --> 01:01:34,840 Speaker 2: No no, no, no no. 1181 01:01:35,320 --> 01:01:36,600 Speaker 1: It means analysis friends. 1182 01:01:37,000 --> 01:01:42,200 Speaker 2: By the way, legal analysis, analysis friends some people. It 1183 01:01:42,240 --> 01:01:44,840 Speaker 2: could be a play on somebody else, but it's analysis friends. 1184 01:01:45,080 --> 01:01:47,600 Speaker 3: Okay, get Clara out of the gutter. 1185 01:01:51,520 --> 01:01:54,880 Speaker 2: And so about ten videos to twelve videos a day 1186 01:01:55,680 --> 01:02:00,240 Speaker 2: that are about ten to fifteen minutes, or whatever 1187 01:02:00,320 --> 01:02:02,560 Speaker 2: it takes to go through a court filing or to 1188 01:02:02,640 --> 01:02:05,640 Speaker 2: provide all of the data where we feel that we've 1189 01:02:05,840 --> 01:02:08,280 Speaker 2: covered the issue in a way that people really get. 1190 01:02:08,800 --> 01:02:10,640 Speaker 1: So we do that twelve times a day. 1191 01:02:10,720 --> 01:02:12,560 Speaker 2: Maybe there's a four am, although now that I'm on 1192 01:02:12,600 --> 01:02:14,600 Speaker 2: East Coast time, it's all screwed up, But on the 1193 01:02:14,640 --> 01:02:17,560 Speaker 2: West Coast it's a four or five thirty seven, you know, 1194 01:02:17,640 --> 01:02:20,280 Speaker 2: and you basically go ninety minutes out and you program 1195 01:02:20,400 --> 01:02:22,800 Speaker 2: it like that with two or three live shows a day, 1196 01:02:22,840 --> 01:02:25,160 Speaker 2: and some days maybe on the weekend it's one live show.
1197 01:02:26,760 --> 01:02:29,440 Speaker 2: And then we have our other social media built around it. 1198 01:02:29,520 --> 01:02:32,080 Speaker 2: So we've got the TikTok, we've got the Instagram, We've 1199 01:02:32,120 --> 01:02:34,360 Speaker 2: got all of those, the Bluesky, which our accounts 1200 01:02:34,400 --> 01:02:36,640 Speaker 2: do very well on as well. And then the audio 1201 01:02:36,800 --> 01:02:40,320 Speaker 2: podcast as well, has some, the main brothers podcast and 1202 01:02:40,400 --> 01:02:42,840 Speaker 2: maybe a few selective, you know, guests. 1203 01:02:42,680 --> 01:02:45,320 Speaker 4: Do you ever sleep? Yeah. And how do you how do you 1204 01:02:45,400 --> 01:02:47,800 Speaker 4: make money off all this stuff? Because that has been, 1205 01:02:47,920 --> 01:02:52,520 Speaker 4: of course, the big dilemma for people who have transferred 1206 01:02:52,560 --> 01:02:55,960 Speaker 4: to kind of a pure digital platform. They used to 1207 01:02:56,120 --> 01:02:59,960 Speaker 4: talk about analog dollars and digital pennies, so that's changed. 1208 01:03:00,240 --> 01:03:02,400 Speaker 4: Just more than fifty percent of people are now getting 1209 01:03:02,440 --> 01:03:05,320 Speaker 4: their news and information from social media. But what is your 1210 01:03:05,440 --> 01:03:06,360 Speaker 4: financial model. 1211 01:03:06,520 --> 01:03:09,800 Speaker 2: I think it's still analog dollars digital pennies, although the 1212 01:03:09,960 --> 01:03:13,040 Speaker 2: analog dollars right are getting a little bit less and 1213 01:03:13,120 --> 01:03:16,720 Speaker 2: the digital pennies are getting more, which is a good thing.
1214 01:03:17,160 --> 01:03:20,479 Speaker 2: I just spoke recently at a panel and I said, 1215 01:03:21,160 --> 01:03:24,240 Speaker 2: one of the most important things I think to monetizing 1216 01:03:24,320 --> 01:03:26,560 Speaker 2: it is you don't go into it to monetize. And 1217 01:03:26,600 --> 01:03:29,880 Speaker 2: I mean truly, we never went into this thinking how 1218 01:03:29,920 --> 01:03:34,439 Speaker 2: do we make the biggest business. Decisions naturally grew 1219 01:03:34,760 --> 01:03:37,880 Speaker 2: out of building good content. 1220 01:03:37,640 --> 01:03:40,000 Speaker 4: And a good base and a big community. 1221 01:03:40,320 --> 01:03:43,680 Speaker 2: And then the community exactly, and the community was there 1222 01:03:43,720 --> 01:03:46,960 Speaker 2: and that's what we fought for always and stood by 1223 01:03:47,040 --> 01:03:49,560 Speaker 2: the community, built the community, and they built us. It's been, 1224 01:03:50,120 --> 01:03:52,880 Speaker 2: you know, very mutual. And then as you grow that, 1225 01:03:53,360 --> 01:03:56,200 Speaker 2: you know, then you start to get into your digital 1226 01:03:56,280 --> 01:03:59,440 Speaker 2: pennies analog dollars, and you start to look at you know, 1227 01:03:59,560 --> 01:04:02,880 Speaker 2: what's a YouTube CPM, which is a different name for it, 1228 01:04:03,040 --> 01:04:06,920 Speaker 2: but what's a podcast CPM? And then what's a host 1229 01:04:07,000 --> 01:04:11,360 Speaker 2: read CPM? And then you start getting into conversations that 1230 01:04:11,600 --> 01:04:14,920 Speaker 2: start to, not fully, but start to look similar 1231 01:04:15,040 --> 01:04:17,440 Speaker 2: to the way traditional. 1232 01:04:17,000 --> 01:04:17,960 Speaker 1: TV viewed things. 1233 01:04:18,040 --> 01:04:22,200 Speaker 2: You know, what's a CPM and here's the here's where 1234 01:04:22,240 --> 01:04:25,960 Speaker 2: things are moving.
I think right now, if you were 1235 01:04:26,120 --> 01:04:30,000 Speaker 2: to buy an ad, a thirty second or sixty second 1236 01:04:30,080 --> 01:04:33,920 Speaker 2: spot on a cable news network, your CPM would be 1237 01:04:34,160 --> 01:04:37,240 Speaker 2: super high for thirty seconds, and you may be paying 1238 01:04:37,400 --> 01:04:40,440 Speaker 2: an outrageous amount. You know, they'll say it's good, you know, 1239 01:04:41,000 --> 01:04:42,840 Speaker 2: the networks will say it makes sense, but you'll be 1240 01:04:42,880 --> 01:04:45,840 Speaker 2: paying whatever X dollars for that thirty seconds. After the 1241 01:04:45,920 --> 01:04:48,320 Speaker 2: show airs, no one sees the ad again, and you're 1242 01:04:48,360 --> 01:04:51,040 Speaker 2: one of five ads that may be during that block 1243 01:04:51,160 --> 01:04:52,880 Speaker 2: or four ads during that block, and no one ever 1244 01:04:52,960 --> 01:04:55,520 Speaker 2: sees it again. And I don't know what the conversion 1245 01:04:55,640 --> 01:04:58,920 Speaker 2: rate even is for you know, because it's not even 1246 01:04:58,960 --> 01:05:01,360 Speaker 2: really tracked that way. How many people bought a car 1247 01:05:01,520 --> 01:05:01,920 Speaker 2: based on. 1248 01:05:02,440 --> 01:05:06,280 Speaker 4: Those those vitamins that taste like vegetables, right. 1249 01:05:08,240 --> 01:05:08,520 Speaker 1: News? 1250 01:05:08,720 --> 01:05:12,560 Speaker 2: You know, So if we're getting a million views, if 1251 01:05:12,560 --> 01:05:15,080 Speaker 2: we're getting a million views, you know, on a video, 1252 01:05:15,680 --> 01:05:20,040 Speaker 2: and then there's a host read you know, read and 1253 01:05:20,200 --> 01:05:23,400 Speaker 2: Jordy you know, works with the with the advertisers to 1254 01:05:23,520 --> 01:05:26,840 Speaker 2: make sure we're getting advertisers that kind of meet our criteria.
1255 01:05:26,880 --> 01:05:29,480 Speaker 2: We're not bringing advertisers that we think are not the 1256 01:05:29,560 --> 01:05:33,400 Speaker 2: right fits for us. You know, you start thinking about, well, 1257 01:05:33,440 --> 01:05:36,120 Speaker 2: what does that CPM look like? And you know, I 1258 01:05:36,200 --> 01:05:39,840 Speaker 2: think it's a more valuable proposition ultimately to an advertiser 1259 01:05:40,080 --> 01:05:42,400 Speaker 2: down the road. And I think this is where this 1260 01:05:42,560 --> 01:05:45,040 Speaker 2: is where the industry is moving. And I think, I 1261 01:05:45,120 --> 01:05:48,040 Speaker 2: think this is why we see so many people looking 1262 01:05:48,160 --> 01:05:51,240 Speaker 2: at a model, you know, that that I think we 1263 01:05:51,480 --> 01:05:53,280 Speaker 2: you know, and again there's been a lot of people 1264 01:05:53,320 --> 01:05:55,960 Speaker 2: involved in this, but I think that we've helped, you know, 1265 01:05:56,320 --> 01:06:00,200 Speaker 2: show that this is a viable path, as one of 1266 01:06:00,240 --> 01:06:02,760 Speaker 2: the first, and people are seeing it and saying, oh, 1267 01:06:02,880 --> 01:06:03,160 Speaker 2: got it. 1268 01:06:03,240 --> 01:06:04,320 Speaker 1: I see how Meidas did it. 1269 01:06:04,440 --> 01:06:07,800 Speaker 2: Let me try to build a network, And that's kind 1270 01:06:07,840 --> 01:06:10,000 Speaker 2: of how we, you know, how we think about things there. 1271 01:06:10,400 --> 01:06:13,680 Speaker 3: It's interesting over the past few years since we've been 1272 01:06:13,800 --> 01:06:16,840 Speaker 3: doing this, we've seen a lot of companies kind of 1273 01:06:16,920 --> 01:06:19,520 Speaker 3: come and go try to enter the space.
And I 1274 01:06:19,600 --> 01:06:21,800 Speaker 3: think the theme that I've seen and the ones that 1275 01:06:22,160 --> 01:06:25,240 Speaker 3: have kind of come and gone quickly are the ones 1276 01:06:25,360 --> 01:06:28,320 Speaker 3: that start with a top down approach rather than a 1277 01:06:28,400 --> 01:06:31,200 Speaker 3: bottom up approach. And by that, I mean, you know, 1278 01:06:31,280 --> 01:06:34,600 Speaker 3: you see this big splashy announcement so and so was 1279 01:06:35,200 --> 01:06:38,840 Speaker 3: you know, the editor in chief of such and such, 1280 01:06:39,240 --> 01:06:41,680 Speaker 3: You know newspaper, and now they're going to make the 1281 01:06:41,760 --> 01:06:44,960 Speaker 3: digital network for the future. And they got an eighty 1282 01:06:45,040 --> 01:06:49,480 Speaker 3: million dollar investment, and they're gonna open up twelve offices 1283 01:06:49,560 --> 01:06:51,840 Speaker 3: around the world, and they're going to hire one hundred 1284 01:06:51,880 --> 01:06:54,560 Speaker 3: and fifty staff, and anytime I see that, I'm 1285 01:06:54,600 --> 01:06:57,160 Speaker 3: like, I give them a year. I mean, the 1286 01:06:57,440 --> 01:07:00,440 Speaker 3: economics aren't even gonna work. I mean, the offices alone. 1287 01:07:00,440 --> 01:07:02,760 Speaker 3: I'm kind of thinking, like I don't even understand how 1288 01:07:02,800 --> 01:07:02,960 Speaker 3: you have. 1289 01:07:03,240 --> 01:07:04,040 Speaker 1: Still dollar penny. 1290 01:07:05,920 --> 01:07:07,200 Speaker 3: So like to me, I'm like, I don't even know 1291 01:07:07,320 --> 01:07:08,960 Speaker 3: how this is how you think this is. 1292 01:07:09,000 --> 01:07:09,440 Speaker 1: Going to work. 1293 01:07:09,600 --> 01:07:12,440 Speaker 4: But so how many people do you have all together 1294 01:07:12,720 --> 01:07:13,320 Speaker 4: on your team?
1295 01:07:13,480 --> 01:07:15,840 Speaker 3: So I think one of the one of the things 1296 01:07:15,920 --> 01:07:18,480 Speaker 3: that is most effective about what we've done here is 1297 01:07:18,520 --> 01:07:21,320 Speaker 3: the way we've decentralized it. So, you know, I think 1298 01:07:21,360 --> 01:07:24,720 Speaker 3: as far as like kind of full time staff is concerned, 1299 01:07:25,040 --> 01:07:26,520 Speaker 3: you know, at this point, we're probably like ten to 1300 01:07:26,600 --> 01:07:29,920 Speaker 3: fifteen like full time full time, like you know. And 1301 01:07:30,000 --> 01:07:32,080 Speaker 3: then in addition to that, though, we have dozens and 1302 01:07:32,280 --> 01:07:35,840 Speaker 3: dozens and dozens of contributors who are also producing content 1303 01:07:35,960 --> 01:07:37,640 Speaker 3: and contractors and people like that. 1304 01:07:37,880 --> 01:07:40,200 Speaker 4: So so it's not it's their side hustle kind of 1305 01:07:40,520 --> 01:07:42,880 Speaker 4: or is it, for some contributors? 1306 01:07:42,200 --> 01:07:44,360 Speaker 3: You know, it's one of those things where you get 1307 01:07:44,400 --> 01:07:46,480 Speaker 3: what you put into it. And so there are some 1308 01:07:46,680 --> 01:07:50,880 Speaker 3: folks who are you know, making videos every day, doing 1309 01:07:50,960 --> 01:07:53,600 Speaker 3: two a day you know, you know, are just really 1310 01:07:53,680 --> 01:07:55,919 Speaker 3: crushing it and this has like become their main thing 1311 01:07:56,480 --> 01:07:59,720 Speaker 3: and now their livelihood is making you know, videos for 1312 01:07:59,840 --> 01:08:01,960 Speaker 3: MeidasTouch, which I think is amazing that we're able 1313 01:08:02,040 --> 01:08:05,040 Speaker 3: to do for folks.
And then we may have folks who 1314 01:08:05,160 --> 01:08:07,320 Speaker 3: do you know, a video a month or you know, 1315 01:08:07,360 --> 01:08:10,200 Speaker 3: a couple of videos a month, and you know this 1316 01:08:10,280 --> 01:08:12,840 Speaker 3: will be their side thing. And so I think, you know, 1317 01:08:12,960 --> 01:08:14,840 Speaker 3: I think we offer that flexibility to folks you know 1318 01:08:14,920 --> 01:08:16,479 Speaker 3: who work with us kind of how much do you 1319 01:08:16,520 --> 01:08:18,839 Speaker 3: want to put into it and what's kind of resonating 1320 01:08:18,920 --> 01:08:21,280 Speaker 3: with folks. But you know, I think it's cool that 1321 01:08:21,360 --> 01:08:23,559 Speaker 3: we've been able to build this platform from the ground 1322 01:08:23,640 --> 01:08:27,200 Speaker 3: up completely independently, without a penny of outside investment money, 1323 01:08:27,439 --> 01:08:29,080 Speaker 3: you know, getting us to where we are, you know 1324 01:08:29,280 --> 01:08:33,120 Speaker 3: right now. And so you know, and and obviously you know, 1325 01:08:34,120 --> 01:08:36,560 Speaker 3: we were certainly afforded, you know, privileges of you know, 1326 01:08:36,680 --> 01:08:39,639 Speaker 3: all having jobs and being able to take this leap 1327 01:08:39,720 --> 01:08:41,880 Speaker 3: that other folks aren't necessarily able to do. And I 1328 01:08:41,960 --> 01:08:45,240 Speaker 3: completely acknowledge that, but I think I think it's important 1329 01:08:45,280 --> 01:08:48,559 Speaker 3: that you know, you do have that kind of ground 1330 01:08:48,680 --> 01:08:51,400 Speaker 3: up approach when you approach it, because when you're dealing 1331 01:08:51,479 --> 01:08:53,639 Speaker 3: with when you're dealing with pennies, you can't be spending 1332 01:08:54,040 --> 01:08:56,600 Speaker 3: you know, you can't be spending dollars. You need to 1333 01:08:56,600 --> 01:08:58,880 Speaker 3: figure out a way to do this. 
And you know, 1334 01:08:59,160 --> 01:09:01,000 Speaker 3: I'll also go back to our roots here. I'll take 1335 01:09:01,040 --> 01:09:03,200 Speaker 3: it back to the beginning because during the pandemic, one 1336 01:09:03,200 --> 01:09:06,400 Speaker 3: of the things that we noticed was everyone was recording 1337 01:09:06,479 --> 01:09:09,160 Speaker 3: from their homes and so we would watch CNN, we 1338 01:09:09,200 --> 01:09:12,040 Speaker 3: would watch you know, NBC, we'd watch all the networks. 1339 01:09:12,040 --> 01:09:14,680 Speaker 3: Everyone's on Skype or Zoom or whatever, and so we 1340 01:09:14,800 --> 01:09:15,479 Speaker 3: were like, we could do that. 1341 01:09:16,360 --> 01:09:19,240 Speaker 4: Yeah, we said. We said, well, I mean it's amazing. 1342 01:09:19,360 --> 01:09:21,200 Speaker 4: I mean, as somebody who you know, used to have 1343 01:09:21,320 --> 01:09:24,200 Speaker 4: hair and makeup every morning or night and be on 1344 01:09:24,320 --> 01:09:27,960 Speaker 4: a fancy, you know, desk with lights and camera crews 1345 01:09:28,040 --> 01:09:31,920 Speaker 4: and everything, and I'm like, there's something great too about 1346 01:09:32,080 --> 01:09:35,160 Speaker 4: the immediacy, Like we'd have to call somebody and get 1347 01:09:35,200 --> 01:09:37,240 Speaker 4: them in a car and have them come to the studio, 1348 01:09:37,840 --> 01:09:40,439 Speaker 4: and there's something great about just being able to talk 1349 01:09:40,520 --> 01:09:42,880 Speaker 4: to someone right away. And by the way, there's 1350 01:09:42,880 --> 01:09:43,800 Speaker 4: a rawness to it. 1351 01:09:43,920 --> 01:09:46,080 Speaker 3: There's an authenticity to it also. 1352 01:09:46,120 --> 01:09:49,280 Speaker 4: You know, scariness. People see me without makeup.
1353 01:09:50,160 --> 01:09:53,519 Speaker 3: I totally do not think that, but but you know, 1354 01:09:53,600 --> 01:09:56,760 Speaker 3: I think though it, you know, it requires you to, 1355 01:09:57,240 --> 01:09:59,240 Speaker 3: you know, think a little differently, right, You don't necessarily 1356 01:09:59,720 --> 01:10:01,880 Speaker 3: you know, I know, and I think Ben knows he 1357 01:10:01,920 --> 01:10:04,880 Speaker 3: doesn't have necessarily you know, a studio, music, all this 1358 01:10:04,920 --> 01:10:07,000 Speaker 3: stuff lifting him up, and so he has to, you know, 1359 01:10:07,080 --> 01:10:09,080 Speaker 3: in many ways focus on the content. 1360 01:10:08,960 --> 01:10:12,280 Speaker 4: Right, and there's a fakeness about like good evening, yeah, 1361 01:10:12,320 --> 01:10:12,880 Speaker 4: you know, and. 1362 01:10:13,720 --> 01:10:15,640 Speaker 3: I think people see, you know, see through a lot 1363 01:10:15,680 --> 01:10:19,200 Speaker 3: of that, and so you know, I in doing it 1364 01:10:19,320 --> 01:10:21,680 Speaker 3: in that way, you know, I think ultimately, like if 1365 01:10:21,680 --> 01:10:23,720 Speaker 3: anyone is like watching this and like, how do I 1366 01:10:23,840 --> 01:10:25,400 Speaker 3: start something like this? Like you want to make sure 1367 01:10:25,439 --> 01:10:27,160 Speaker 3: your sound is good, you want to make sure people 1368 01:10:27,160 --> 01:10:29,200 Speaker 3: could see. Okay, you want a good camera, good audio. 1369 01:10:29,520 --> 01:10:31,400 Speaker 3: Aside from that, people are going to be actually pretty 1370 01:10:31,400 --> 01:10:34,160 Speaker 3: forgiving with the rest of the stuff. And I think 1371 01:10:34,600 --> 01:10:36,280 Speaker 3: in the beginning also there were so many people who 1372 01:10:36,280 --> 01:10:38,519 Speaker 3: would come to us you know, I got this, I 1373 01:10:38,640 --> 01:10:41,000 Speaker 3: got this great video, I got this great painting.
I 1374 01:10:41,120 --> 01:10:42,920 Speaker 3: just need to run it through a few weeks of 1375 01:10:43,000 --> 01:10:45,680 Speaker 3: testing and then we're going to make some changes. And 1376 01:10:45,760 --> 01:10:47,800 Speaker 3: then if I could just get an investor. 1377 01:10:47,560 --> 01:10:50,360 Speaker 4: You just do it restributor. And you know, that's the 1378 01:10:50,439 --> 01:10:52,320 Speaker 4: one question I also wanted to ask you. You know, 1379 01:10:52,960 --> 01:10:55,240 Speaker 4: you do it in real time, you experiment, you get 1380 01:10:55,439 --> 01:10:58,599 Speaker 4: kind of the reaction. I promise, I only have business 1381 01:11:00,400 --> 01:11:01,320 Speaker 4: talking to you. Guys. 1382 01:11:02,120 --> 01:11:04,479 Speaker 2: We've before you ask We've been very excited to do this. 1383 01:11:05,840 --> 01:11:08,040 Speaker 1: We're big Katie. We asked for a lot of interviews. 1384 01:11:08,080 --> 01:11:11,240 Speaker 1: We don't really do it. Oh, but we're very big fans. 1385 01:11:11,000 --> 01:11:13,600 Speaker 4: Of you, and thank you. I appreciate it. Thank you. 1386 01:11:13,960 --> 01:11:16,360 Speaker 4: That means a lot to me. But you talked about 1387 01:11:16,680 --> 01:11:20,479 Speaker 4: how responsive you guys are to your community, and I'm curious. 1388 01:11:20,600 --> 01:11:22,960 Speaker 4: You know, I love the interactivity. You know it's not 1389 01:11:23,040 --> 01:11:25,439 Speaker 4: always fun to read because when you read a mean 1390 01:11:25,520 --> 01:11:30,160 Speaker 4: comment or whatever, But tell me what that actually means 1391 01:11:30,720 --> 01:11:33,840 Speaker 4: being responsive to your community. Can you give me sort 1392 01:11:33,840 --> 01:11:34,840 Speaker 4: of an example of that. 1393 01:11:35,680 --> 01:11:39,560 Speaker 2: I'll give you an example of the message that resonated 1394 01:11:39,600 --> 01:11:41,799 Speaker 2: and how we do it. 
If you looked at Senator 1395 01:11:41,880 --> 01:11:45,439 Speaker 2: Bernie Sanders and AOC's rallies recently where there have been 1396 01:11:45,560 --> 01:11:48,479 Speaker 2: thirty thousand people and fifteen thousand. There's a part of 1397 01:11:48,560 --> 01:11:50,800 Speaker 2: it where they go into the audience and they say 1398 01:11:51,040 --> 01:11:53,320 Speaker 2: they ask the audience, how do you feel about this? 1399 01:11:53,600 --> 01:11:55,840 Speaker 2: And then they just listen and you hear what the 1400 01:11:55,920 --> 01:12:00,080 Speaker 2: people are saying. And we do something intuitively similar for 1401 01:12:00,160 --> 01:12:03,000 Speaker 2: the past five years where we want to know how 1402 01:12:03,040 --> 01:12:07,719 Speaker 2: they're feeling, what they're thinking about issues, and really listen 1403 01:12:07,960 --> 01:12:12,120 Speaker 2: and learn from their experiences. And that's where we understand, 1404 01:12:12,320 --> 01:12:14,040 Speaker 2: you know, Okay, this is what you're going through, this 1405 01:12:14,120 --> 01:12:16,520 Speaker 2: is what you're going through, this is what you're experiencing. 1406 01:12:16,680 --> 01:12:20,960 Speaker 2: And so we have that sense in our overall programming 1407 01:12:21,160 --> 01:12:21,600 Speaker 2: where they are. 1408 01:12:21,680 --> 01:12:22,559 Speaker 1: And another what. 1409 01:12:22,720 --> 01:12:25,400 Speaker 4: Is it reading like their comments or is it do 1410 01:12:25,520 --> 01:12:29,040 Speaker 4: you have a special site or email address where you 1411 01:12:29,160 --> 01:12:31,080 Speaker 4: actually hear from people? How do you do that? 1412 01:12:31,320 --> 01:12:33,559 Speaker 2: It's all of the above. I mean it's reading the comments. 
1413 01:12:34,040 --> 01:12:38,000 Speaker 2: We hold sometimes one two times a month, we'll hold 1414 01:12:38,120 --> 01:12:42,360 Speaker 2: meetings with our supporters and with the Midas Mighty, as 1415 01:12:42,400 --> 01:12:44,719 Speaker 2: we call them, and we'll do zoom chats and they'll 1416 01:12:44,720 --> 01:12:48,160 Speaker 2: tell us how they're feeling. They'll explain to us on 1417 01:12:48,280 --> 01:12:52,960 Speaker 2: our social media handles, they'll usually you know, at us 1418 01:12:53,200 --> 01:12:56,840 Speaker 2: on certain things, and so it's an aggregate of all 1419 01:12:56,960 --> 01:13:01,040 Speaker 2: of that and we're just constantly listening to them every day. 1420 01:13:01,080 --> 01:13:03,920 Speaker 2: It's not necessary we have those meetings, but it is 1421 01:13:04,040 --> 01:13:08,000 Speaker 2: kind of a constant back and forth since this started. 1422 01:13:08,360 --> 01:13:12,519 Speaker 2: Just another example is when we started covering Canadian politics, 1423 01:13:12,840 --> 01:13:17,360 Speaker 2: which was a big part of our expansion internationally, but we 1424 01:13:17,800 --> 01:13:21,280 Speaker 2: listened and listened and listened, and a lot of American 1425 01:13:21,400 --> 01:13:27,559 Speaker 2: media networks supplant the Canadian experience with their American centric 1426 01:13:27,720 --> 01:13:31,560 Speaker 2: views of how Canada is versus really listening to what 1427 01:13:31,680 --> 01:13:34,880 Speaker 2: people in Canada are experiencing. 
And so before we started 1428 01:13:34,920 --> 01:13:38,280 Speaker 2: doing any reporting on Canada, we had to understand the 1429 01:13:38,400 --> 01:13:40,880 Speaker 2: dynamics of the Liberal Party and the Conservative Party and 1430 01:13:40,920 --> 01:13:43,639 Speaker 2: the NDP and the Green Party and how things work 1431 01:13:43,760 --> 01:13:46,200 Speaker 2: and what's going on with Carney and what's going on 1432 01:13:46,320 --> 01:13:50,320 Speaker 2: with Pierre Poilievre and understanding you know, all of those dynamics. 1433 01:13:51,240 --> 01:13:56,080 Speaker 2: And this was it was a relentless listening experience, and 1434 01:13:56,200 --> 01:13:58,840 Speaker 2: then we started talking about it and then it started 1435 01:13:58,880 --> 01:14:02,519 Speaker 2: connecting in Canada. So the Midas Touch Network podcast became 1436 01:14:02,560 --> 01:14:07,160 Speaker 2: the most listened to podcast in Canada about Canadian politics 1437 01:14:08,360 --> 01:14:10,560 Speaker 2: and that, and what we heard from them is just 1438 01:14:10,640 --> 01:14:15,120 Speaker 2: such a compliment which told us that. But the feedback 1439 01:14:15,200 --> 01:14:18,400 Speaker 2: there was, it's because you're listening to us, and you 1440 01:14:18,600 --> 01:14:22,320 Speaker 2: understand that when Donald Trump is making these threats against us, 1441 01:14:22,520 --> 01:14:24,760 Speaker 2: it's not a joke. Nor do we view it as 1442 01:14:24,760 --> 01:14:27,639 Speaker 2: a trade war. We view it as a war war. 1443 01:14:28,240 --> 01:14:31,439 Speaker 2: We view that type of language the same way that 1444 01:14:31,760 --> 01:14:36,519 Speaker 2: Ukraine views Putin talking about denazification. It's how when he 1445 01:14:36,720 --> 01:14:39,640 Speaker 2: uses language fifty first state, it's an attack on our 1446 01:14:39,760 --> 01:14:42,439 Speaker 2: very existence and our sovereignty. 
So we want you to 1447 01:14:42,560 --> 01:14:46,080 Speaker 2: understand why it is we're boycotting American products, and it's 1448 01:14:46,080 --> 01:14:48,599 Speaker 2: not that we don't like Americans. It's that the leader 1449 01:14:48,640 --> 01:14:51,120 Speaker 2: of your country right now is basically threatening an invasion 1450 01:14:51,200 --> 01:14:53,960 Speaker 2: against us. And so when you start speaking like that, 1451 01:14:54,080 --> 01:14:55,040 Speaker 2: they go, thank. 1452 01:14:54,960 --> 01:14:56,080 Speaker 1: You, you know, they go. 1453 01:14:56,200 --> 01:14:59,320 Speaker 2: Our local media doesn't even talk about it in those terms, and. 1454 01:14:59,320 --> 01:15:00,000 Speaker 1: That's how we felt. 1455 01:15:00,360 --> 01:15:03,480 Speaker 4: Yeah, it's obvious that you have an audience first mentality, 1456 01:15:03,560 --> 01:15:05,920 Speaker 4: and it is surprising to me, as somebody who's been 1457 01:15:05,960 --> 01:15:10,400 Speaker 4: doing this a long time, how all these interactive tools 1458 01:15:10,400 --> 01:15:15,719 Speaker 4: are at our fingertips and still most media organizations really 1459 01:15:15,800 --> 01:15:19,080 Speaker 4: don't utilize them, not to blow my own horn for a nanosecond, 1460 01:15:19,160 --> 01:15:21,679 Speaker 4: But when I was I wrote about this in my book. 1461 01:15:21,840 --> 01:15:24,000 Speaker 4: Sorry everybody who works with me, because they've heard the 1462 01:15:24,040 --> 01:15:26,559 Speaker 4: story a million times. But when I was doing 1463 01:15:26,640 --> 01:15:29,559 Speaker 4: the CBS Evening News, I was covering the gulf oil spill, 1464 01:15:30,160 --> 01:15:31,840 Speaker 4: and I said, what do you guys want to know 1465 01:15:32,280 --> 01:15:33,719 Speaker 4: about the gulf oil spill? 1466 01:15:33,960 --> 01:15:34,600 Speaker 3: That you know? 
1467 01:15:34,760 --> 01:15:37,680 Speaker 4: Because I think you get in a media cycle or 1468 01:15:37,840 --> 01:15:39,960 Speaker 4: a news cycle and you kind of repeat the same 1469 01:15:40,080 --> 01:15:42,679 Speaker 4: thing over and over again, or you advance the story 1470 01:15:42,760 --> 01:15:45,280 Speaker 4: in a small way, but you kind of miss the 1471 01:15:45,360 --> 01:15:48,880 Speaker 4: forest for the trees sometimes. So I asked on Twitter 1472 01:15:49,000 --> 01:15:51,880 Speaker 4: back then, you know, what are your questions about what's 1473 01:15:51,960 --> 01:15:54,959 Speaker 4: going on in the gulf? And the people at CBS 1474 01:15:55,360 --> 01:16:00,960 Speaker 4: were so unnerved by me trying to understand what 1475 01:16:01,120 --> 01:16:03,760 Speaker 4: people cared about, what they wanted to know, and what 1476 01:16:03,920 --> 01:16:07,040 Speaker 4: they were thinking and what they were wondering about. But 1477 01:16:07,680 --> 01:16:11,519 Speaker 4: to me, if you're serving an audience, why wouldn't you 1478 01:16:12,360 --> 01:16:14,320 Speaker 4: listen to them? And I think Brett you were talking 1479 01:16:14,320 --> 01:16:17,640 Speaker 4: about the top down approach and kind of serve your 1480 01:16:17,800 --> 01:16:21,560 Speaker 4: audience where they are and what they want to know 1481 01:16:21,840 --> 01:16:23,960 Speaker 4: instead of what you think they should be told. 
1482 01:16:24,200 --> 01:16:27,160 Speaker 3: Right, Hey, and what either whether you're a politician or 1483 01:16:27,200 --> 01:16:30,960 Speaker 3: a reporter, or you're selling tires or I don't care 1484 01:16:31,000 --> 01:16:33,320 Speaker 3: what it is that you're doing, if you don't understand 1485 01:16:33,400 --> 01:16:36,160 Speaker 3: what people want and what people are looking for from you, 1486 01:16:36,640 --> 01:16:38,200 Speaker 3: you know, I think you're I think you're going to 1487 01:16:38,240 --> 01:16:40,439 Speaker 3: miss it. And you're right. You know, now more than ever, 1488 01:16:41,280 --> 01:16:43,360 Speaker 3: people are gonna let you know what they think. You know, 1489 01:16:43,479 --> 01:16:46,920 Speaker 3: it's they're not shy about it. For better or worse, they're 1490 01:16:46,960 --> 01:16:47,840 Speaker 3: not exactly shy. 1491 01:16:48,000 --> 01:16:50,599 Speaker 4: I used to say, the good thing about the internet 1492 01:16:50,760 --> 01:16:55,160 Speaker 4: is everyone now has a voice. The bad thing everyone 1493 01:16:55,600 --> 01:16:56,400 Speaker 4: now has a voice. 1494 01:16:56,560 --> 01:16:58,560 Speaker 3: Yeah, So I mean people aren't shy about it, and 1495 01:16:59,000 --> 01:17:01,519 Speaker 3: you know, and and people well may disagree with us 1496 01:17:01,600 --> 01:17:05,240 Speaker 3: too at times, you know, but at least we know, 1497 01:17:05,520 --> 01:17:06,880 Speaker 3: you know, what they're thinking, and at least they know 1498 01:17:06,920 --> 01:17:08,960 Speaker 3: that we're coming from a place of like, this is 1499 01:17:09,080 --> 01:17:11,599 Speaker 3: what we really believe in it, and we're not going 1500 01:17:11,680 --> 01:17:14,280 Speaker 3: to BS you, you know, on it. And I think 1501 01:17:14,320 --> 01:17:15,240 Speaker 3: that's kind of the most. 1502 01:17:15,080 --> 01:17:18,679 Speaker 1: Important corporate news took surgery. Yeah, I was. 
From my perspective, 1503 01:17:18,680 --> 01:17:20,680 Speaker 1: I don't think the Midas Touch Network exists at this 1504 01:17:20,800 --> 01:17:23,280 Speaker 1: scale that it does now without our audience, without the 1505 01:17:23,360 --> 01:17:27,479 Speaker 1: Midas Mighty, and without that relentless relationship of listening and 1506 01:17:27,560 --> 01:17:31,000 Speaker 1: feedback that we have every second of the day, we 1507 01:17:31,200 --> 01:17:33,160 Speaker 1: read everything, for better. 1508 01:17:33,040 --> 01:17:35,320 Speaker 4: Or for worse, We read time to do it every 1509 01:17:35,439 --> 01:17:36,280 Speaker 4: but I'm glad. 1510 01:17:36,080 --> 01:17:38,680 Speaker 1: You do because it's very important for us that we 1511 01:17:38,800 --> 01:17:42,080 Speaker 1: really understand how our audience is feeling. And they were 1512 01:17:42,160 --> 01:17:45,160 Speaker 1: with us five years ago when I moved back into 1513 01:17:45,200 --> 01:17:49,439 Speaker 1: my mom's basement during the lockdown. Yeah, doing it 1514 01:17:49,520 --> 01:17:51,840 Speaker 1: from there to now I have a kid and I'm 1515 01:17:51,880 --> 01:17:54,599 Speaker 1: living with my wife in Pittsburgh, and so they've seen 1516 01:17:54,720 --> 01:17:57,080 Speaker 1: us mature in the network, mature every step of the way, 1517 01:17:57,120 --> 01:18:00,519 Speaker 1: and it's really been this familial relationship that we've 1518 01:18:00,520 --> 01:18:01,080 Speaker 1: been able. 1519 01:18:00,920 --> 01:18:03,360 Speaker 4: To foster and the and the walls have been broken. 1520 01:18:09,880 --> 01:18:11,960 Speaker 4: If you want to get smarter every morning with a 1521 01:18:12,000 --> 01:18:15,280 Speaker 4: breakdown of the news and fascinating takes on health and 1522 01:18:15,360 --> 01:18:18,639 Speaker 4: wellness and pop culture, sign up for our daily newsletter, 1523 01:18:18,760 --> 01:18:31,000 Speaker 4: wake Up Call by going to Katiecuric dot com. 
I 1524 01:18:31,120 --> 01:18:35,439 Speaker 4: think there still is this sort of artifice about a 1525 01:18:35,560 --> 01:18:39,000 Speaker 4: lot of mainstream media and the way they communicate with people, 1526 01:18:39,160 --> 01:18:42,600 Speaker 4: and obviously you all have tapped into that the a 1527 01:18:42,800 --> 01:18:46,080 Speaker 4: word authenticity, which I think it's overused, but I think 1528 01:18:46,160 --> 01:18:47,439 Speaker 4: it actually still holds. 1529 01:18:47,680 --> 01:18:49,920 Speaker 3: Yeah, it's one of those clichés for a reason. 1530 01:18:50,520 --> 01:18:52,840 Speaker 4: Yeah yeah, I mean I think that's been one of 1531 01:18:52,920 --> 01:18:56,200 Speaker 4: the reasons I've been successful professionally, is I think people 1532 01:18:57,040 --> 01:18:59,680 Speaker 4: think that I'm the same on camera as I am off, 1533 01:18:59,760 --> 01:19:01,800 Speaker 4: which I am most of the time. You know, I'm 1534 01:19:01,840 --> 01:19:06,200 Speaker 4: not fully unfiltered when I do things, and maybe I 1535 01:19:06,240 --> 01:19:09,600 Speaker 4: should be a little less, you know, or more unfiltered. 1536 01:19:09,960 --> 01:19:12,479 Speaker 4: But before we go, I want to we've kind of 1537 01:19:12,960 --> 01:19:16,439 Speaker 4: mentioned the manosphere and young men in this country. How 1538 01:19:16,479 --> 01:19:18,320 Speaker 4: old are you, Ben? Forty, forty? 1539 01:19:18,560 --> 01:19:21,640 Speaker 3: Well, it's about to be forty in a few days your 1540 01:19:22,120 --> 01:19:23,599 Speaker 3: or over over yes, overselling. 1541 01:19:23,600 --> 01:19:25,200 Speaker 1: I don't know when I know When's it going to air. 1542 01:19:27,640 --> 01:19:28,320 Speaker 4: On Thursday? 1543 01:19:28,439 --> 01:19:30,840 Speaker 2: So h a couple of days before? 1544 01:19:31,720 --> 01:19:31,840 Speaker 3: Right? 1545 01:19:32,240 --> 01:19:37,679 Speaker 4: Old thirty five? So forty thirty five, thirty two. 
Tell 1546 01:19:37,720 --> 01:19:42,160 Speaker 4: me about what you think is happening to younger men 1547 01:19:42,360 --> 01:19:46,280 Speaker 4: in this country, both really primarily gen Z and how 1548 01:19:46,360 --> 01:19:51,960 Speaker 4: they have gravitated so enthusiastically, and maybe this is not 1549 01:19:52,160 --> 01:19:55,960 Speaker 4: even factually correct. I have to really check the numbers. 1550 01:19:56,040 --> 01:20:02,599 Speaker 4: But to Republican, to MAGA politics and how well, first 1551 01:20:02,600 --> 01:20:04,040 Speaker 4: of all, I want to know why you think this 1552 01:20:04,160 --> 01:20:06,680 Speaker 4: has happened, And then I want to know what you 1553 01:20:06,800 --> 01:20:10,080 Speaker 4: think can be done to change those dynamics. 1554 01:20:10,439 --> 01:20:14,120 Speaker 2: I think the idea of alpha has been hijacked by 1555 01:20:14,200 --> 01:20:17,840 Speaker 2: the least alpha people in the world. Okay, like a 1556 01:20:18,000 --> 01:20:22,320 Speaker 2: JD Vance, who's saying all my masculinity's under attack. Okay, 1557 01:20:22,400 --> 01:20:24,120 Speaker 2: that's not an alpha thing to say. Like, let's be 1558 01:20:24,320 --> 01:20:26,160 Speaker 2: very clear. When I hear stuff like that, I'm like, 1559 01:20:26,320 --> 01:20:28,599 Speaker 2: what are you your masculinity? This is what you think 1560 01:20:28,640 --> 01:20:30,640 Speaker 2: about that your masculinity is under attack? What the hell 1561 01:20:30,640 --> 01:20:32,640 Speaker 2: are you talking about? You know, all of this like 1562 01:20:32,880 --> 01:20:35,680 Speaker 2: you know, whining that I hear from these MAGA Oh 1563 01:20:35,760 --> 01:20:36,800 Speaker 2: my god, you're coming after me. 1564 01:20:37,000 --> 01:20:40,040 Speaker 4: Oh. My friend Joeanne Littman wrote a whole essay about that, 1565 01:20:40,200 --> 01:20:42,639 Speaker 4: you know, all doxed and attacked. It was awful. 
I'm 1566 01:20:42,640 --> 01:20:43,240 Speaker 4: going to send it to you. 1567 01:20:43,479 --> 01:20:45,040 Speaker 1: You know, you know all of that. 1568 01:20:45,400 --> 01:20:47,360 Speaker 2: When I grew up, I don't like the term alpha 1569 01:20:47,479 --> 01:20:50,280 Speaker 2: in general. You know, you're an alpha, you know, but 1570 01:20:50,560 --> 01:20:53,360 Speaker 2: to me, at least as I was raised, you know, 1571 01:20:53,600 --> 01:20:56,679 Speaker 2: being a good guy or good person, but you stand 1572 01:20:56,800 --> 01:20:59,720 Speaker 2: up for people to me, like the cool kid or 1573 01:20:59,800 --> 01:21:04,200 Speaker 2: the or the growing up when I was in high school. 1574 01:21:04,760 --> 01:21:07,519 Speaker 2: Now would be considered that you would stand up against 1575 01:21:07,560 --> 01:21:10,519 Speaker 2: the bully. The bully wasn't like the alpha person. The 1576 01:21:10,560 --> 01:21:13,120 Speaker 2: bully was like the loser, you know, like why are 1577 01:21:13,120 --> 01:21:15,920 Speaker 2: you bullying these people? Like you would want to stand 1578 01:21:16,000 --> 01:21:18,759 Speaker 2: up for people. And that's to me what was always 1579 01:21:18,840 --> 01:21:21,519 Speaker 2: viewed as like cool and how you know, at least 1580 01:21:21,560 --> 01:21:24,600 Speaker 2: how I was raised and being a good man to 1581 01:21:24,720 --> 01:21:27,320 Speaker 2: me is being a man of your word, being someone 1582 01:21:27,400 --> 01:21:32,559 Speaker 2: of honor, you know, you know, against standing up for people. 1583 01:21:33,240 --> 01:21:36,320 Speaker 2: And that's kind of been you know, inverted with this 1584 01:21:36,600 --> 01:21:40,400 Speaker 2: to me, this corrosive culture that has you know, basically 1585 01:21:40,520 --> 01:21:44,120 Speaker 2: said the opposite that the bullies and the people who 1586 01:21:44,240 --> 01:21:47,720 Speaker 2: punch down. 
That's been viewed right now as like what 1587 01:21:47,840 --> 01:21:48,920 Speaker 2: it is and it shouldn't be. 1588 01:21:49,080 --> 01:21:51,800 Speaker 4: Yes, just because of Trump, really, isn't it The way 1589 01:21:51,920 --> 01:21:54,360 Speaker 4: he speaks and the way he talks and the way 1590 01:21:54,439 --> 01:21:59,479 Speaker 4: he demeans people. Suddenly that has probably been interpreted as. 1591 01:21:59,400 --> 01:22:02,320 Speaker 3: Good for it right in so many ways. You know, 1592 01:22:02,320 --> 01:22:04,160 Speaker 3: it's funny when when when you speak about that, I 1593 01:22:04,240 --> 01:22:06,280 Speaker 3: think about I think it's in twenty one Jump Street. 1594 01:22:07,040 --> 01:22:10,040 Speaker 3: There's a scene in the movie, uh, you know where 1595 01:22:10,160 --> 01:22:11,920 Speaker 3: Channing Tatum where they go to the high school and 1596 01:22:11,960 --> 01:22:13,680 Speaker 3: they try to be cool as they knew it, like 1597 01:22:13,760 --> 01:22:15,280 Speaker 3: in the eighties and tried to be like kind of 1598 01:22:15,320 --> 01:22:17,800 Speaker 3: the bullies and stuff, and they're looked down upon as like, nah, 1599 01:22:18,439 --> 01:22:20,320 Speaker 3: like we're not into that anymore. What are you doing 1600 01:22:20,439 --> 01:22:23,320 Speaker 3: like this weird this weird stuff? And uh And I 1601 01:22:23,479 --> 01:22:25,240 Speaker 3: feel like I, you know, I grew up the same way. 
1602 01:22:25,320 --> 01:22:27,439 Speaker 3: But you know, I think that I think that too 1603 01:22:27,439 --> 01:22:30,560 Speaker 3: many people are looking at you know, Gen Z or 1604 01:22:30,640 --> 01:22:33,960 Speaker 3: Gen Alpha right now as being molded and kind of 1605 01:22:34,080 --> 01:22:37,200 Speaker 3: set in certain ways, and that this election was, you know, 1606 01:22:37,280 --> 01:22:39,400 Speaker 3: a reflection on who they are now and who they're 1607 01:22:39,400 --> 01:22:40,040 Speaker 3: always going to be. 1608 01:22:40,920 --> 01:22:43,200 Speaker 4: But I do think they're being molded by some of 1609 01:22:43,240 --> 01:22:50,280 Speaker 4: these, you know, anti women's empowerment bros, I. 1610 01:22:50,280 --> 01:22:53,320 Speaker 3: Think one hundred percent. But I don't think that's something 1611 01:22:53,400 --> 01:22:57,120 Speaker 3: that is going to be like concrete in them necessarily. 1612 01:22:57,200 --> 01:22:59,680 Speaker 3: I think people have their lived experiences, and I think 1613 01:22:59,720 --> 01:23:01,640 Speaker 3: one of the interesting things that we've even seen in 1614 01:23:01,680 --> 01:23:04,560 Speaker 3: the past few months since Trump has taken office is 1615 01:23:04,600 --> 01:23:06,240 Speaker 3: when you look at a lot of those public opinion 1616 01:23:06,320 --> 01:23:09,400 Speaker 3: polls of gen Z, they've completely flipped on their head, 1617 01:23:09,840 --> 01:23:12,840 Speaker 3: and polls that showed gen Z you know, even or 1618 01:23:12,880 --> 01:23:16,960 Speaker 3: supporting Trump back in November December. Now it's something like 1619 01:23:17,360 --> 01:23:20,000 Speaker 3: sixty eight twenty something, you know, in the opposite direction. 1620 01:23:20,160 --> 01:23:22,760 Speaker 3: So you see how even you know, those shifts could happen. 
1621 01:23:22,800 --> 01:23:24,360 Speaker 3: And I think when people even go out into the 1622 01:23:24,400 --> 01:23:27,479 Speaker 3: world and have experiences and have jobs and need to 1623 01:23:28,000 --> 01:23:31,599 Speaker 3: you know, afford medication and need to you know, help 1624 01:23:31,680 --> 01:23:34,559 Speaker 3: their mom who's on Social Security, and they see these 1625 01:23:34,640 --> 01:23:37,400 Speaker 3: real world things that you know, have been kind of 1626 01:23:37,439 --> 01:23:41,320 Speaker 3: gamified in this kind of manosphere world and that have 1627 01:23:41,479 --> 01:23:43,799 Speaker 3: kind of been stripped from them, I think the reality 1628 01:23:43,840 --> 01:23:45,559 Speaker 3: wakes them up, you know a little bit. 1629 01:23:45,600 --> 01:23:48,080 Speaker 4: I think because I feel like they're still watching the 1630 01:23:48,160 --> 01:23:52,679 Speaker 4: same people and still getting the same information. And unless 1631 01:23:52,880 --> 01:23:58,480 Speaker 4: those people are flipping or changing or abandoning this administration 1632 01:23:58,640 --> 01:24:01,920 Speaker 4: of what they're doing, they're going to be steadfast. 1633 01:24:02,960 --> 01:24:05,080 Speaker 3: Yeah, you know, I just I don't think they're locked in. 1634 01:24:05,280 --> 01:24:07,920 Speaker 3: I mean, maybe I have, you know, too much faith, 1635 01:24:08,320 --> 01:24:08,600 Speaker 3: but I. 1636 01:24:09,160 --> 01:24:10,760 Speaker 2: Think you have to look at the going back to 1637 01:24:10,800 --> 01:24:12,560 Speaker 2: where we almost started, right as I think you have 1638 01:24:12,640 --> 01:24:14,960 Speaker 2: to look at the root cause, which is how is 1639 01:24:15,040 --> 01:24:19,360 Speaker 2: this group able to be kind of preyed upon by 1640 01:24:19,479 --> 01:24:23,680 Speaker 2: in my opinion, these types of fraudsters and people who 1641 01:24:23,800 --> 01:24:26,320 Speaker 2: push this. 
And it's because, you know, I think a 1642 01:24:26,360 --> 01:24:29,800 Speaker 2: lot of young people and young men are scared that 1643 01:24:29,840 --> 01:24:31,880 Speaker 2: they're not going to have a future, right, And I 1644 01:24:31,920 --> 01:24:36,400 Speaker 2: think it stems from their weakness and that other people 1645 01:24:36,560 --> 01:24:39,040 Speaker 2: can prey on it because they're worried that am I 1646 01:24:39,120 --> 01:24:40,880 Speaker 2: going to Am I going to have a job? Am 1647 01:24:40,880 --> 01:24:42,800 Speaker 2: I going to be able to afford a home? Can 1648 01:24:42,880 --> 01:24:46,080 Speaker 2: I bring someone on a date and afford to even 1649 01:24:46,160 --> 01:24:47,640 Speaker 2: pay for it? What's my future going? 1650 01:24:47,760 --> 01:24:50,559 Speaker 4: But it's also more than that to me, Ben. It's sort 1651 01:24:50,560 --> 01:24:54,479 Speaker 4: of oh, women are making all these strides, right, there are 1652 01:24:54,560 --> 01:24:57,360 Speaker 4: more women in college and law school than there are men. 1653 01:24:58,040 --> 01:25:04,679 Speaker 4: Or gosh, programs are giving unfair advantages, or affirmative action 1654 01:25:04,960 --> 01:25:07,560 Speaker 4: or diversity programs in colleges or whatever you want to 1655 01:25:07,600 --> 01:25:09,840 Speaker 4: talk about. I mean, I've seen it play out in 1656 01:25:09,960 --> 01:25:13,240 Speaker 4: my own life among people I know, and they think 1657 01:25:13,680 --> 01:25:17,479 Speaker 4: when other people do well, they do less well. Right, 1658 01:25:17,800 --> 01:25:21,800 Speaker 4: And so there's just been this tremendous backlash and to 1659 01:25:21,920 --> 01:25:25,479 Speaker 4: your point, like a lot of whining like oh poor us, 1660 01:25:25,640 --> 01:25:29,599 Speaker 4: and playing the victim for a lot of young white men. 1661 01:25:30,320 --> 01:25:33,640 Speaker 4: And I think you're right, people are exploiting that. 
But 1662 01:25:33,800 --> 01:25:37,759 Speaker 4: how do you reverse it? How do you get people 1663 01:25:37,840 --> 01:25:41,680 Speaker 4: to think differently and to see that this is you know, 1664 01:25:41,760 --> 01:25:44,599 Speaker 4: because I do think they're having some real world effects. 1665 01:25:44,800 --> 01:25:49,720 Speaker 2: Well, I think you can't solve that by trying to 1666 01:25:50,280 --> 01:25:54,559 Speaker 2: imitate it. You just become a worse version of something 1667 01:25:54,640 --> 01:25:58,240 Speaker 2: that's not already good. Which is why the learning experience 1668 01:25:58,320 --> 01:26:01,040 Speaker 2: of oh, the left needs to find a is the 1669 01:26:01,160 --> 01:26:06,879 Speaker 2: wrong advice, because you want to become someone who's already 1670 01:26:06,960 --> 01:26:08,599 Speaker 2: kind of preying on these communities. 1671 01:26:08,600 --> 01:26:09,839 Speaker 4: So what do you mean exactly? 1672 01:26:10,000 --> 01:26:12,320 Speaker 2: So you have to again, you have to be a 1673 01:26:12,479 --> 01:26:15,040 Speaker 2: voice that is a fighter. I mean, you have to 1674 01:26:15,120 --> 01:26:17,920 Speaker 2: go back to you have to be a doer. You 1675 01:26:18,120 --> 01:26:21,439 Speaker 2: have to be someone who actually accomplishes things and says, 1676 01:26:21,720 --> 01:26:23,840 Speaker 2: here's what I stand for. I mean, here are the 1677 01:26:23,880 --> 01:26:27,160 Speaker 2: views and values. It's not a zero sum game you 1678 01:26:27,400 --> 01:26:31,000 Speaker 2: against other groups and marginalized communities taking your That's not it. 1679 01:26:31,520 --> 01:26:33,400 Speaker 2: That's a pathetic way. You know who deals with that? 1680 01:26:33,800 --> 01:26:37,880 Speaker 2: Whiners deal with that like not like people who are 1681 01:26:38,160 --> 01:26:40,639 Speaker 2: losers deal with that. 
You you want to be a winner, 1682 01:26:41,080 --> 01:26:45,440 Speaker 2: Let's work hard, you know, Let's let's focus on being entrepreneurial. 1683 01:26:45,600 --> 01:26:49,400 Speaker 2: Let's focus on actually building things versus destroying things. And 1684 01:26:49,520 --> 01:26:51,840 Speaker 2: ripping things apart. Do you want to be a step up? 1685 01:26:52,000 --> 01:26:55,519 Speaker 2: But yeah, yeah, step up, you know, step up, And yeah. 1686 01:26:55,600 --> 01:26:56,280 Speaker 1: I'm not gonna be. 1687 01:26:56,680 --> 01:27:00,719 Speaker 2: You know, when I talk to sometimes my I lecture 1688 01:27:00,800 --> 01:27:04,000 Speaker 2: an undergrad. I have a big class. The semester's wrapped up. 1689 01:27:04,360 --> 01:27:06,360 Speaker 2: But you know, at the end of the semester too, 1690 01:27:06,479 --> 01:27:10,240 Speaker 2: sometimes I like to, you know, reflect on the way 1691 01:27:10,400 --> 01:27:12,160 Speaker 2: that I don't know what the class is political views, 1692 01:27:12,160 --> 01:27:15,040 Speaker 2: and I don't inject my political views ever in the class, 1693 01:27:15,520 --> 01:27:19,800 Speaker 2: but I do talk about how I have white kids, 1694 01:27:19,880 --> 01:27:23,840 Speaker 2: black kids, Latina kids, Asian students in my class, men, women, 1695 01:27:23,960 --> 01:27:28,080 Speaker 2: trans in my class. No one in my class ever 1696 01:27:28,200 --> 01:27:33,599 Speaker 2: thinks it's acceptable during the class to say, oh, look 1697 01:27:33,640 --> 01:27:36,040 Speaker 2: at you, this race, look at you, this person, look 1698 01:27:36,080 --> 01:27:36,920 Speaker 2: at you, this person. 1699 01:27:37,120 --> 01:27:38,320 Speaker 1: I don't like you. I don't like you. 
1700 01:27:38,520 --> 01:27:40,240 Speaker 2: I mean, could you imagine what would happen in a 1701 01:27:40,280 --> 01:27:43,840 Speaker 2: class if everybody was interacting and treating each other like No, 1702 01:27:44,080 --> 01:27:48,120 Speaker 2: we act professional. There's a professor. We go through the semester, 1703 01:27:48,600 --> 01:27:52,200 Speaker 2: we get along, the diversity in the class, the different views, 1704 01:27:52,280 --> 01:27:55,600 Speaker 2: different genders, that made the class a more powerful experience 1705 01:27:55,840 --> 01:27:56,519 Speaker 2: and we were all. 1706 01:27:56,479 --> 01:27:57,080 Speaker 1: Happy with it. 1707 01:27:57,479 --> 01:27:59,680 Speaker 2: So as I end the semester, I always go to them. 1708 01:28:00,280 --> 01:28:04,280 Speaker 2: So why when we leave this classroom and go into 1709 01:28:04,439 --> 01:28:08,200 Speaker 2: now what a political setting or a professional setting. Now 1710 01:28:08,240 --> 01:28:11,080 Speaker 2: we're supposed to all of a sudden take this and 1711 01:28:11,280 --> 01:28:13,400 Speaker 2: now fight each other, and now not like each other. 1712 01:28:13,720 --> 01:28:15,880 Speaker 2: We saw the strength of what made this class of 1713 01:28:15,920 --> 01:28:18,439 Speaker 2: one hundred and five people strong. It was the very 1714 01:28:18,560 --> 01:28:21,240 Speaker 2: diverse. We all liked each other. So go out there 1715 01:28:21,280 --> 01:28:23,880 Speaker 2: and remember that this was a valuable experience for all 1716 01:28:23,960 --> 01:28:26,160 Speaker 2: of us, and that should be what the real world was, 1717 01:28:26,240 --> 01:28:28,600 Speaker 2: and that's what we should be fighting for. Not to 1718 01:28:28,720 --> 01:28:33,280 Speaker 2: see in the halls of Congress them not letting people 1719 01:28:33,320 --> 01:28:34,679 Speaker 2: in bathrooms. They're like, what are they even doing? 1720 01:28:34,840 --> 01:28:35,000 Speaker 3: You know? 
1721 01:28:35,120 --> 01:28:37,960 Speaker 1: I mean, she's lost her mind. 1722 01:28:38,880 --> 01:28:40,479 Speaker 3: She was somebody who a few years ago, you know, 1723 01:28:40,600 --> 01:28:42,880 Speaker 3: kind of pretended to be an ally and she's now 1724 01:28:42,960 --> 01:28:43,920 Speaker 3: the biggest voice, you. 1725 01:28:43,960 --> 01:28:45,920 Speaker 4: Know, against. She does the selfies in bathrooms. 1726 01:28:45,960 --> 01:28:48,120 Speaker 3: It's like weird, right, It's like like sixty times a day. 1727 01:28:48,160 --> 01:28:50,280 Speaker 2: It's like you're on vacation sixty times a day about 1728 01:28:50,360 --> 01:28:54,200 Speaker 2: any topic at all, even like a hobby would be strange. 1729 01:28:54,280 --> 01:28:56,560 Speaker 2: I mean, let alone, like this is what you, you know, 1730 01:28:56,680 --> 01:28:58,720 Speaker 2: this is what you occupy your time with. And that's 1731 01:28:58,720 --> 01:29:00,479 Speaker 2: how I try to flip it on the, you know, them, 1732 01:29:00,880 --> 01:29:02,800 Speaker 2: you know, more than... I'm like, this is what you're 1733 01:29:02,840 --> 01:29:06,400 Speaker 2: focused on, Like you wake up thinking about bathrooms all day. 1734 01:29:07,120 --> 01:29:10,400 Speaker 2: This is your main view, you know, and and and 1735 01:29:10,600 --> 01:29:13,200 Speaker 2: that's to me how you address it versus living in 1736 01:29:13,360 --> 01:29:18,760 Speaker 2: their framework of all of their grievance and pettiness 1737 01:29:18,280 --> 01:29:20,519 Speaker 3: and hate, just flipping it on them, you know. I think 1738 01:29:20,560 --> 01:29:22,600 Speaker 3: one of the things that people... I think one of 1739 01:29:22,640 --> 01:29:25,160 Speaker 3: the reasons why people gravitate to the stuff we do 1740 01:29:25,479 --> 01:29:27,800 Speaker 3: as well, is because they leave with the information. And 1741 01:29:27,840 --> 01:29:30,519 Speaker 3: then I think they also leave empowered.
And I think 1742 01:29:30,640 --> 01:29:33,800 Speaker 3: folks need to be empowered. They need to feel in 1743 01:29:33,960 --> 01:29:36,400 Speaker 3: control of their lives. And I think a lot of these, 1744 01:29:36,680 --> 01:29:38,920 Speaker 3: you know, young kids right now don't feel in control 1745 01:29:38,960 --> 01:29:41,559 Speaker 3: of their lives. And a lot of it's not their fault, right, 1746 01:29:41,600 --> 01:29:43,360 Speaker 3: I mean, this is the situation. This is the hand 1747 01:29:43,400 --> 01:29:45,599 Speaker 3: that they have been dealt, and it's you know, through 1748 01:29:45,760 --> 01:29:48,439 Speaker 3: decades and decades and decades, you know, whether it's poor 1749 01:29:48,560 --> 01:29:53,519 Speaker 3: policy or just shifts in global everything, like whatever it is. 1750 01:29:53,800 --> 01:29:56,400 Speaker 3: You know, they're looking now, they're looking at their parents, 1751 01:29:56,760 --> 01:29:58,559 Speaker 3: and they're going, well, they had it easy. They were 1752 01:29:58,600 --> 01:30:01,519 Speaker 3: able to get a you know, my you know, Dad 1753 01:30:01,640 --> 01:30:03,920 Speaker 3: was able to get one job and work you know, 1754 01:30:04,479 --> 01:30:07,080 Speaker 3: five days a week, you know, eight hour days, and 1755 01:30:07,640 --> 01:30:10,920 Speaker 3: come home for dinner, and we had two week vacations, 1756 01:30:11,040 --> 01:30:13,439 Speaker 3: you know, twice a year. You know, they look and 1757 01:30:13,520 --> 01:30:14,920 Speaker 3: they go, how the heck am I going to do that? 1758 01:30:15,200 --> 01:30:17,519 Speaker 3: You know, even if I had a job where I 1759 01:30:17,640 --> 01:30:19,320 Speaker 3: was making six figures, I'm not going to be able 1760 01:30:19,320 --> 01:30:20,639 Speaker 3: to afford a house. I'm not going to be able 1761 01:30:20,680 --> 01:30:22,120 Speaker 3: to have savings.
I'm not going to be able to 1762 01:30:22,200 --> 01:30:24,680 Speaker 3: live in a neighborhood where I could walk around and 1763 01:30:25,120 --> 01:30:27,760 Speaker 3: get everything that I need to survive. And then in 1764 01:30:27,880 --> 01:30:30,479 Speaker 3: doing all that, how am I going to have kids? 1765 01:30:30,520 --> 01:30:32,280 Speaker 3: How am I going to be able to afford a family? 1766 01:30:32,320 --> 01:30:34,639 Speaker 3: I can't even afford to get a coffee for myself, 1767 01:30:34,680 --> 01:30:37,360 Speaker 3: you know. So these are real issues also, and so 1768 01:30:37,960 --> 01:30:40,240 Speaker 3: what I don't want to do is write off also 1769 01:30:40,320 --> 01:30:43,080 Speaker 3: a lot of this as mere whining and, you know, 1770 01:30:43,640 --> 01:30:46,280 Speaker 3: kind of a woe-is-me, but understand that, you know, 1771 01:30:46,800 --> 01:30:48,639 Speaker 3: people are hurting, going back to what we were saying 1772 01:30:48,680 --> 01:30:51,760 Speaker 3: at the beginning, and that's including these younger generations, and 1773 01:30:51,960 --> 01:30:54,040 Speaker 3: so I think we need to empower them and. 1774 01:30:54,320 --> 01:31:00,519 Speaker 4: Also like acknowledge, acknowledge the anxiety and struggle, you know, whoever 1775 01:31:00,680 --> 01:31:02,880 Speaker 4: they may be, right, and I think we. 1776 01:31:02,920 --> 01:31:06,360 Speaker 3: need to, you know, acknowledge it.
And you know what 1777 01:31:06,520 --> 01:31:09,640 Speaker 3: we've seen in those communities that you've talked about, the manosphere, 1778 01:31:09,640 --> 01:31:13,320 Speaker 3: all that stuff, they've acknowledged it, and then they've directed 1779 01:31:13,479 --> 01:31:17,200 Speaker 3: that anger at the other. They've directed, you know, you 1780 01:31:17,320 --> 01:31:21,840 Speaker 3: can't afford a home because the Democrats are giving the 1781 01:31:21,960 --> 01:31:24,599 Speaker 3: money to illegal immigrants and that money should have went 1782 01:31:24,640 --> 01:31:26,880 Speaker 3: to you. And then, you know, the second there's a 1783 01:31:26,960 --> 01:31:29,120 Speaker 3: social program that may actually help people, they go, oh, no, 1784 01:31:29,160 --> 01:31:32,200 Speaker 3: we're not gonna pay for that. That's socialism, that's communism. 1785 01:31:32,200 --> 01:31:33,400 Speaker 3: We're not going to do that. It's like, okay, so 1786 01:31:33,840 --> 01:31:35,920 Speaker 3: you don't believe in any of this, right? But but 1787 01:31:36,080 --> 01:31:39,040 Speaker 3: too often that energy is taken and then it's pushed 1788 01:31:39,040 --> 01:31:42,320 Speaker 3: in a negative direction to blame other people for, you know, 1789 01:31:42,479 --> 01:31:46,200 Speaker 3: for those woes, for those issues. I think we all 1790 01:31:46,320 --> 01:31:48,599 Speaker 3: need to figure out a way as a society in general, 1791 01:31:48,720 --> 01:31:50,640 Speaker 3: to how do we empower people, you know, in a 1792 01:31:50,760 --> 01:31:53,439 Speaker 3: in a productive, positive way.
And some of that also 1793 01:31:53,680 --> 01:31:56,560 Speaker 3: is going to require structural change in, you know, 1794 01:31:56,680 --> 01:31:58,400 Speaker 3: how are we going to ensure that these kids are 1795 01:31:58,439 --> 01:31:59,880 Speaker 3: going to be able to grow up and have family 1796 01:32:00,320 --> 01:32:03,920 Speaker 3: and have, you know, productive lives and have a job 1797 01:32:04,120 --> 01:32:07,120 Speaker 3: that they could support their families with and enjoy the 1798 01:32:07,160 --> 01:32:09,880 Speaker 3: American dream as they've been told. I mean, we were 1799 01:32:09,920 --> 01:32:12,679 Speaker 3: all promised growing up, right, you know, go to college, 1800 01:32:13,080 --> 01:32:15,000 Speaker 3: you get a good you know, you get a good job, 1801 01:32:15,160 --> 01:32:18,240 Speaker 3: you'll you'll be good. Everything is good. And now that's 1802 01:32:18,360 --> 01:32:20,720 Speaker 3: simply not what's happening in so many cases. So 1803 01:32:20,760 --> 01:32:23,920 Speaker 3: I think there's a bitterness that develops, and that bitterness 1804 01:32:24,040 --> 01:32:28,120 Speaker 3: is then, you know, taken and weaponized for bad, and 1805 01:32:28,200 --> 01:32:29,760 Speaker 3: I think we need to kind of figure out how 1806 01:32:29,800 --> 01:32:32,240 Speaker 3: to, you know, take that and put people back on 1807 01:32:32,280 --> 01:32:32,760 Speaker 3: the right course. 1808 01:32:32,920 --> 01:32:36,400 Speaker 4: That's a good segue to my final question, which is, 1809 01:32:37,320 --> 01:32:40,639 Speaker 4: you know, you talk about empowering people, and I get 1810 01:32:40,840 --> 01:32:45,720 Speaker 4: very discouraged about the state of the country and disappointed 1811 01:32:46,120 --> 01:32:49,840 Speaker 4: and frustrated, and I'm curious.
I like the idea, even 1812 01:32:49,880 --> 01:32:53,040 Speaker 4: though, you know, some of your YouTube things are like 1813 01:32:53,160 --> 01:32:57,000 Speaker 4: these big bold letters, like the video he doesn't want 1814 01:32:57,080 --> 01:33:01,880 Speaker 4: you to see, or pathetic meltdown, lock him up. You know, 1815 01:33:02,040 --> 01:33:05,160 Speaker 4: it's very kind of... I like it. You're saying a 1816 01:33:05,360 --> 01:33:08,360 Speaker 4: slightly New York Post-y about it, to be honest with you, 1817 01:33:08,479 --> 01:33:13,080 Speaker 4: with the yellow, almost like in-your-face graphics. But 1818 01:33:14,520 --> 01:33:20,560 Speaker 4: I'm curious if you're optimistic about the country, and what 1819 01:33:20,840 --> 01:33:23,560 Speaker 4: kind of words of encouragement can you give me and 1820 01:33:23,760 --> 01:33:28,080 Speaker 4: other people watching this or listening to this that we're 1821 01:33:28,240 --> 01:33:32,080 Speaker 4: going to be okay and democracy is not going to 1822 01:33:32,160 --> 01:33:32,599 Speaker 4: go away. 1823 01:33:33,960 --> 01:33:36,840 Speaker 2: I am hopeful. I am optimistic, and I think that 1824 01:33:37,040 --> 01:33:43,240 Speaker 2: the plan that Trump and Musk had was to pretty 1825 01:33:43,320 --> 01:33:46,920 Speaker 2: much control all levers of everything right now and really 1826 01:33:47,080 --> 01:33:50,760 Speaker 2: beat down any opposition and resistance. I think what we 1827 01:33:50,920 --> 01:33:58,519 Speaker 2: see right now in federal courts is judges appointed by Trump, Reagan, 1828 01:34:00,000 --> 01:34:04,559 Speaker 2: George W. Bush on the due process questions.
They made 1829 01:34:04,600 --> 01:34:06,320 Speaker 2: a lot of bad rulings on a lot of things, 1830 01:34:06,840 --> 01:34:10,360 Speaker 2: but on the due process questions, all judges have said, 1831 01:34:10,360 --> 01:34:12,920 Speaker 2: at least as of now, as of this recording, there is 1832 01:34:13,000 --> 01:34:16,640 Speaker 2: something called due process. You have to go and follow it, 1833 01:34:17,439 --> 01:34:20,560 Speaker 2: and we do not live in an authoritarian country. And 1834 01:34:20,680 --> 01:34:24,559 Speaker 2: so federal judges on that question, by and large across 1835 01:34:24,640 --> 01:34:27,800 Speaker 2: the country, are standing up to Donald Trump. More significantly, 1836 01:34:27,960 --> 01:34:28,160 Speaker 1: Though. 1837 01:34:28,320 --> 01:34:32,080 Speaker 2: I think you see the protests, peaceful protests, growing. I 1838 01:34:32,120 --> 01:34:34,599 Speaker 2: think that's a good sign. I think what we saw 1839 01:34:34,680 --> 01:34:37,880 Speaker 2: with the Hands Off protests, No Kings Day, I think 1840 01:34:37,960 --> 01:34:42,240 Speaker 2: that's going to continue to grow. And that gives me 1841 01:34:42,320 --> 01:34:44,920 Speaker 2: a lot of hope as well. And I think that 1842 01:34:45,479 --> 01:34:47,800 Speaker 2: I take a lot of hope that people are in 1843 01:34:47,920 --> 01:34:53,400 Speaker 2: the fight right now. I was worried that after the election, 1844 01:34:54,320 --> 01:34:57,519 Speaker 2: you saw a lot of people just, for about two months, yeah, 1845 01:34:57,600 --> 01:35:01,439 Speaker 2: tune out and just didn't do anything. I think that we 1846 01:35:01,640 --> 01:35:06,599 Speaker 2: now have an opposition and resistance that looks and feels 1847 01:35:06,920 --> 01:35:10,760 Speaker 2: even stronger than what we saw back in twenty seventeen, 1848 01:35:10,880 --> 01:35:15,920 Speaker 2: twenty eighteen.
I think people have seen Trump's reckless on- 1849 01:35:16,120 --> 01:35:19,439 Speaker 2: and-off-again tariffs and all of this rhetoric that, 1850 01:35:20,040 --> 01:35:22,719 Speaker 2: you know, it's the same kind of pattern. He huffs 1851 01:35:22,800 --> 01:35:27,000 Speaker 2: and he puffs and does something that causes harm, and 1852 01:35:27,120 --> 01:35:30,719 Speaker 2: then he withdraws from it and then declares that a victory. 1853 01:35:31,479 --> 01:35:35,320 Speaker 2: And there's still some news outlets that report these things 1854 01:35:35,360 --> 01:35:37,800 Speaker 2: as victories. It's like he just is the arsonist who 1855 01:35:37,920 --> 01:35:40,840 Speaker 2: slightly put out a fire that he created, and now 1856 01:35:40,920 --> 01:35:43,000 Speaker 2: things are worse, and now he wants to declare that 1857 01:35:43,320 --> 01:35:47,720 Speaker 2: a victory. But I think that the public, by 1858 01:35:47,800 --> 01:35:49,439 Speaker 2: and large, is seeing that. 1859 01:35:49,680 --> 01:35:52,680 Speaker 4: I mean, Americans are smart, right, I mean a lot 1860 01:35:52,760 --> 01:35:53,599 Speaker 4: of them are smart. 1861 01:35:53,720 --> 01:35:57,000 Speaker 2: Yeah, And I think we see Trump's approval right now 1862 01:35:57,600 --> 01:36:01,880 Speaker 2: is basically the worst in presidential history at this time, 1863 01:36:02,000 --> 01:36:05,679 Speaker 2: including his first term, as you look across the polls. 1864 01:36:05,760 --> 01:36:09,599 Speaker 2: And so he still has that thirty five percent kind 1865 01:36:09,640 --> 01:36:14,679 Speaker 2: of MAGA floor, plus or minus another five or six percent. 1866 01:36:14,880 --> 01:36:17,559 Speaker 2: So some polls at thirty nine, some at forty two. 1867 01:36:17,960 --> 01:36:21,000 Speaker 2: But by and large, I don't think people like hostile 1868 01:36:21,120 --> 01:36:24,000 Speaker 2: takeovers of the federal government.
I think people want to 1869 01:36:24,040 --> 01:36:26,880 Speaker 2: feel safe going on airplanes and not have a transportation 1870 01:36:27,040 --> 01:36:32,080 Speaker 2: secretary kind of panic on TV about flights and staffing 1871 01:36:32,200 --> 01:36:35,120 Speaker 2: and try to blame other people. I think people see 1872 01:36:35,400 --> 01:36:39,360 Speaker 2: veterans getting fired and disrespected. I think people see education 1873 01:36:39,560 --> 01:36:43,040 Speaker 2: being gutted, and I think that people don't like that stuff. 1874 01:36:43,160 --> 01:36:45,800 Speaker 4: Women dying in parking lots, I think, too, absolutely. 1875 01:36:46,000 --> 01:36:49,240 Speaker 2: And I think that people are saying, you know, you know, no, 1876 01:36:49,680 --> 01:36:52,400 Speaker 2: we're not okay with this, and that gives. 1877 01:36:52,240 --> 01:36:53,880 Speaker 1: Me, That gives me hope. 1878 01:36:54,040 --> 01:36:57,280 Speaker 4: Even slacktivism, as it's so called, you know, sitting in 1879 01:36:57,360 --> 01:37:00,680 Speaker 4: your chairs and trying to do something, which was 1880 01:37:00,800 --> 01:37:03,840 Speaker 4: derided for so long, I think is actually a pretty 1881 01:37:03,880 --> 01:37:07,720 Speaker 4: good barometer of how people are feeling. You know, what 1882 01:37:07,880 --> 01:37:11,800 Speaker 4: they're saying on social media, how they're commenting and what 1883 01:37:11,920 --> 01:37:14,800 Speaker 4: they're watching, like the Midas network, right, you. 1884 01:37:14,800 --> 01:37:16,720 Speaker 3: Know. A hundred percent. Yeah. And then that's why we always say, 1885 01:37:16,720 --> 01:37:19,360 Speaker 3: you know, do what you can, right.
You don't have 1886 01:37:19,520 --> 01:37:22,560 Speaker 3: to go out there and build a media network, right, Like, 1887 01:37:22,840 --> 01:37:25,360 Speaker 3: that's not what we're asking, right. But but if your thing 1888 01:37:25,520 --> 01:37:28,280 Speaker 3: is getting your messages out on social media by sending messages, 1889 01:37:28,360 --> 01:37:30,439 Speaker 3: do it. Like, we're talking to a neighbor, or listening 1890 01:37:30,479 --> 01:37:33,640 Speaker 3: to a neighbor, right, do it. If you if you're 1891 01:37:33,800 --> 01:37:37,200 Speaker 3: a painter, paint, right, like, whatever it is that you do, 1892 01:37:38,040 --> 01:37:40,519 Speaker 3: do it and use that to empower yourself. Also, I 1893 01:37:40,560 --> 01:37:43,200 Speaker 3: think one of the things that I also think about 1894 01:37:43,360 --> 01:37:45,439 Speaker 3: is just how much throughout history, and not even just 1895 01:37:45,479 --> 01:37:48,479 Speaker 3: American history, but in general, there are these big pendulum 1896 01:37:48,520 --> 01:37:53,080 Speaker 3: swings and they happen quickly, and you know, I think. 1897 01:37:53,000 --> 01:37:55,519 Speaker 4: I've never seen one happen quite this quick and clearly. 1898 01:37:55,520 --> 01:37:57,160 Speaker 3: That's what I was going to say. I've never seen 1899 01:37:57,200 --> 01:37:59,200 Speaker 3: it happen as quickly as we've seen it happen now, 1900 01:37:59,280 --> 01:38:02,479 Speaker 3: and even since Donald Trump took office to right now, 1901 01:38:02,600 --> 01:38:06,080 Speaker 3: I think we've also seen a tremendous pendulum swing, kind 1902 01:38:06,080 --> 01:38:09,200 Speaker 3: of, in flux. As we are kind of all living in 1903 01:38:09,280 --> 01:38:12,280 Speaker 3: this moment, how this will end, when it will end, 1904 01:38:12,400 --> 01:38:14,320 Speaker 3: what it looks like when we're on the other side.
1905 01:38:14,920 --> 01:38:17,040 Speaker 3: You know, I don't know. That history has not yet 1906 01:38:17,120 --> 01:38:20,639 Speaker 3: been written, but I know that the pendulum always does swing. 1907 01:38:20,800 --> 01:38:22,720 Speaker 3: It doesn't happen on its own, but you know, it 1908 01:38:22,800 --> 01:38:25,200 Speaker 3: happens by collective action, and, you know, a lot of 1909 01:38:25,280 --> 01:38:27,840 Speaker 3: people staying engaged and in the fight. But I know 1910 01:38:28,000 --> 01:38:29,760 Speaker 3: it will. And that's why when we speak about these 1911 01:38:29,840 --> 01:38:32,840 Speaker 3: fair weather, you know, folks who change their opinion based 1912 01:38:32,920 --> 01:38:34,920 Speaker 3: on the way the wind is blowing, I think they 1913 01:38:34,960 --> 01:38:37,000 Speaker 3: may get caught on the wrong side of, you know, 1914 01:38:37,120 --> 01:38:40,719 Speaker 3: that pendulum when things swing back to, you know, people 1915 01:38:41,320 --> 01:38:45,960 Speaker 3: you know pushing forward on, you know, inclusion, diversity, people 1916 01:38:46,479 --> 01:38:49,920 Speaker 3: championing, you know, workers' rights, people championing, you know, the 1917 01:38:50,000 --> 01:38:53,200 Speaker 3: ability for people to afford their, you know, medication or 1918 01:38:53,240 --> 01:38:55,840 Speaker 3: get their Social Security or whatever it is. I think 1919 01:38:55,840 --> 01:38:59,280 Speaker 3: there's a breaking point that society reaches, and I think 1920 01:38:59,360 --> 01:39:02,360 Speaker 3: we kind of did a speed run, you know, 1921 01:39:02,479 --> 01:39:04,599 Speaker 3: to where we are right now, even a speed run 1922 01:39:04,640 --> 01:39:06,280 Speaker 3: to Trump and now a speed run kind of back 1923 01:39:06,360 --> 01:39:08,800 Speaker 3: as people are like, oh shit, maybe this wasn't the 1924 01:39:08,840 --> 01:39:09,880 Speaker 3: best idea on the planet.
1925 01:39:10,080 --> 01:39:10,360 Speaker 2: I did. 1926 01:39:10,479 --> 01:39:12,640 Speaker 4: I did feel like Andy Beshear, who I don't know 1927 01:39:12,680 --> 01:39:14,920 Speaker 4: if you all talked about this, Ben, but he did 1928 01:39:15,000 --> 01:39:18,320 Speaker 4: have a good point about, you know, the Democrats and 1929 01:39:19,160 --> 01:39:23,360 Speaker 4: some progressives I think became a little too precious about 1930 01:39:23,479 --> 01:39:29,840 Speaker 4: words and about, you know, shaming or judging or, you know, 1931 01:39:30,000 --> 01:39:34,040 Speaker 4: being self-righteous about framing certain issues, to the point 1932 01:39:34,080 --> 01:39:36,960 Speaker 4: where it was like a turnoff for a lot of people. 1933 01:39:37,360 --> 01:39:39,720 Speaker 4: Do you agree with that criticism? Well, I. 1934 01:39:39,800 --> 01:39:43,840 Speaker 3: think ultimately, speak as people speak, you know. Don't don't 1935 01:39:43,880 --> 01:39:46,840 Speaker 3: try to cook up phrases in a lab. Don't try 1936 01:39:46,920 --> 01:39:50,840 Speaker 3: to, you know, come up with euphemisms for things. Or, 1937 01:39:51,240 --> 01:39:53,439 Speaker 3: you don't know, you know how many times people have also, 1938 01:39:53,520 --> 01:39:55,040 Speaker 3: you know, come to us: you know what I think 1939 01:39:55,080 --> 01:39:58,200 Speaker 3: you should say? You know, like experts, maybe instead of 1940 01:39:58,240 --> 01:40:00,960 Speaker 3: saying the language this way, maybe you should say it, you know, 1941 01:40:01,200 --> 01:40:03,439 Speaker 3: this other way. And I'm like, I've never heard somebody 1942 01:40:03,520 --> 01:40:05,840 Speaker 3: use that phrase in my life, though, Like I can't. 1943 01:40:05,880 --> 01:40:07,560 Speaker 3: I don't even know what you're talking about. Honestly, it 1944 01:40:07,600 --> 01:40:08,840 Speaker 3: sounds like mumbo jumbo, like.
1945 01:40:08,800 --> 01:40:10,439 Speaker 1: A/B tested at point three percent. 1946 01:40:10,760 --> 01:40:13,600 Speaker 3: Yeah, "this tested." I'm like, I don't even know what 1947 01:40:13,680 --> 01:40:16,160 Speaker 3: that means. So, yeah, I'm going to talk how I 1948 01:40:16,200 --> 01:40:17,880 Speaker 3: speak with my brothers and how I speak with my friends. 1949 01:40:18,000 --> 01:40:20,120 Speaker 4: Yeah, and I know you say what, what you can 1950 01:40:20,280 --> 01:40:22,760 Speaker 4: have in a group chat, right? Like, it's it's got 1951 01:40:22,880 --> 01:40:26,120 Speaker 4: to be sort of figured out that way. Jordy, bring 1952 01:40:26,240 --> 01:40:28,439 Speaker 4: it home. Money, I I I. 1953 01:40:28,720 --> 01:40:30,479 Speaker 1: want to echo your point that you said just before. 1954 01:40:30,760 --> 01:40:34,840 Speaker 1: Americans are smart, Americans are resilient, and it's not going 1955 01:40:34,920 --> 01:40:37,160 Speaker 1: to be easy these next couple of years here, but 1956 01:40:37,280 --> 01:40:40,960 Speaker 1: democracy will prevail. I genuinely have hope. I know from 1957 01:40:41,000 --> 01:40:42,720 Speaker 1: the folks my age, I know where they're at 1958 01:40:43,000 --> 01:40:46,280 Speaker 1: right now, and I know from, you know, folks around 1959 01:40:46,320 --> 01:40:48,680 Speaker 1: Ben and Brett's age and then older, you know where 1960 01:40:48,720 --> 01:40:52,519 Speaker 1: they're at. People love this country. This country is 1961 01:40:52,560 --> 01:40:57,040 Speaker 1: the greatest country in the world. Americans want this country 1962 01:40:57,120 --> 01:40:59,800 Speaker 1: to succeed. The world wants this country to succeed.
And 1963 01:41:00,000 --> 01:41:01,479 Speaker 1: we just have to put our heads down every single 1964 01:41:01,560 --> 01:41:03,280 Speaker 1: day and do the work and make sure that we 1965 01:41:03,439 --> 01:41:05,600 Speaker 1: leave this country better off, you know, not just for 1966 01:41:05,720 --> 01:41:08,559 Speaker 1: our kids, but for the next several generations of kids 1967 01:41:08,800 --> 01:41:11,840 Speaker 1: hundreds of years down the line, because that's how much 1968 01:41:11,880 --> 01:41:12,800 Speaker 1: this country means to us. 1969 01:41:12,960 --> 01:41:15,080 Speaker 4: And, out of the mouths of babes, right, 1970 01:41:16,280 --> 01:41:17,519 Speaker 4: did he nail it? 1971 01:41:17,600 --> 01:41:17,800 Speaker 3: Or well? 1972 01:41:17,960 --> 01:41:20,840 Speaker 4: Yes, you nailed it. Nailed it. You guys ever fight, 1973 01:41:21,000 --> 01:41:23,120 Speaker 4: by the way? Do you ever get into arguments? 1974 01:41:23,720 --> 01:41:28,800 Speaker 3: Absolutely, but listen, it's not it's not like... but they're 1975 01:41:28,800 --> 01:41:31,040 Speaker 3: not like, you know, it's it's little things. Honestly, like 1976 01:41:31,120 --> 01:41:32,320 Speaker 3: we we don't, all that much. 1977 01:41:32,439 --> 01:41:34,519 Speaker 4: But you know, do you like working together? It's 1978 01:41:34,560 --> 01:41:34,880 Speaker 4: the best.
1979 01:41:34,960 --> 01:41:38,000 Speaker 3: It's honestly, it's a dream, like, and I think always 1980 01:41:38,000 --> 01:41:40,120 Speaker 3: growing up, like we always wanted to work together in 1981 01:41:40,200 --> 01:41:42,320 Speaker 3: some capacity, and like as kids, we would like make 1982 01:41:42,400 --> 01:41:44,439 Speaker 3: movies together, you know, we would make like little action 1983 01:41:44,600 --> 01:41:47,200 Speaker 3: movies or horror movies or mystery movies or whatever, and 1984 01:41:47,479 --> 01:41:49,400 Speaker 3: and that that would be like our way of working together. 1985 01:41:49,400 --> 01:41:51,200 Speaker 3: We'd play sports, like, whatever it was. Again, we'd 1986 01:41:51,200 --> 01:41:53,960 Speaker 3: always love doing stuff together. Then there just came a time, 1987 01:41:53,960 --> 01:41:56,679 Speaker 3: there came a time in life where our age difference, 1988 01:41:56,960 --> 01:41:59,599 Speaker 3: you know, separates you, separates you naturally. You know, when 1989 01:42:00,040 --> 01:42:02,240 Speaker 3: when Ben's already in college and I'm just starting high 1990 01:42:02,240 --> 01:42:04,360 Speaker 3: school and Jordy's in middle school, you know, whatever that is, 1991 01:42:04,520 --> 01:42:06,599 Speaker 3: or I'm a senior in high school, Jordy's just starting, 1992 01:42:06,880 --> 01:42:09,559 Speaker 3: you know, as a freshman in high school. And luckily 1993 01:42:09,600 --> 01:42:13,640 Speaker 3: I set the stage to make sure everybody likes, you know, 1994 01:42:13,760 --> 01:42:14,920 Speaker 3: but there's always these kinds. 1995 01:42:14,760 --> 01:42:16,360 Speaker 4: Of... and that you live up to Ben's. 1996 01:42:17,880 --> 01:42:20,679 Speaker 3: You know, school president and teachers. 1997 01:42:22,200 --> 01:42:23,479 Speaker 1: That's a ridiculous award. 1998 01:42:23,720 --> 01:42:24,920 Speaker 2: You won the real award.
1999 01:42:26,360 --> 01:42:28,720 Speaker 4: It was a teachers' award. 2000 01:42:30,960 --> 01:42:35,840 Speaker 1: Him. Yeah, the suck-up, him, right here. Let's focus 2001 01:42:35,960 --> 01:42:36,759 Speaker 1: on the student government. 2002 01:42:39,240 --> 01:42:41,679 Speaker 4: Well, I have to say this is officially the longest 2003 01:42:41,760 --> 01:42:45,200 Speaker 4: podcast I've ever done, but it felt like the shortest. 2004 01:42:45,479 --> 01:42:48,479 Speaker 4: Thank you guys so so much. This was such a 2005 01:42:48,560 --> 01:42:52,000 Speaker 4: fun conversation. I learned a lot and just really enjoyed 2006 01:42:52,040 --> 01:42:53,960 Speaker 4: getting to know all three of you. So thank you. 2007 01:42:54,560 --> 01:42:56,800 Speaker 1: It was great to be here. And you know, we 2008 01:42:56,880 --> 01:42:58,320 Speaker 1: did grow up watching you and. 2009 01:43:00,200 --> 01:43:03,719 Speaker 3: We are such massive admirers of yours. Yes, truly an honor. 2010 01:43:03,840 --> 01:43:06,360 Speaker 3: So for this to be, you know, the first show 2011 01:43:06,400 --> 01:43:08,920 Speaker 3: that we're actually in person on together. You know, it's 2012 01:43:09,040 --> 01:43:11,840 Speaker 3: it's not by accident, and we're really grateful, you know, 2013 01:43:12,000 --> 01:43:13,840 Speaker 3: for that, and we're just so thrilled to be here 2014 01:43:13,880 --> 01:43:14,040 Speaker 3: with you. 2015 01:43:14,320 --> 01:43:16,360 Speaker 1: Thank you, thank you. I'm not just saying that because 2016 01:43:16,360 --> 01:43:19,320 Speaker 1: of the teachers' award. I'm not trying to suck up to you, 2017 01:43:19,360 --> 01:43:22,360 Speaker 1: see how you want. I'm genuinely, genuinely. 2018 01:43:30,520 --> 01:43:33,479 Speaker 4: Thanks for listening, everyone.
If you have a question for 2019 01:43:33,680 --> 01:43:36,360 Speaker 4: me, a subject you want us to cover, or you 2020 01:43:36,439 --> 01:43:39,080 Speaker 4: want to share your thoughts about how you navigate this 2021 01:43:39,240 --> 01:43:42,920 Speaker 4: crazy world, reach out, send me a DM on Instagram. 2022 01:43:43,240 --> 01:43:46,160 Speaker 4: I would love to hear from you. Next Question is 2023 01:43:46,200 --> 01:43:50,480 Speaker 4: a production of iHeartMedia and Katie Couric Media. The executive 2024 01:43:50,520 --> 01:43:54,560 Speaker 4: producers are me, Katie Couric, and Courtney Litz. Our supervising 2025 01:43:54,640 --> 01:43:58,960 Speaker 4: producer is Ryan Martz, and our producers are Adriana Fazzio 2026 01:43:59,360 --> 01:44:04,560 Speaker 4: and Meredith Barnes. Julian Weller composed our theme music. For 2027 01:44:04,720 --> 01:44:07,720 Speaker 4: more information about today's episode, or to sign up for 2028 01:44:07,800 --> 01:44:10,920 Speaker 4: my newsletter, Wake Up Call, go to the description in 2029 01:44:10,960 --> 01:44:14,479 Speaker 4: the podcast app, or visit us at katiecouric dot com. 2030 01:44:15,000 --> 01:44:17,400 Speaker 4: You can also find me on Instagram and all my 2031 01:44:17,560 --> 01:44:22,160 Speaker 4: social media channels. For more podcasts from iHeartRadio, visit the 2032 01:44:22,320 --> 01:44:26,439 Speaker 4: iHeartRadio app, Apple Podcasts, or wherever you listen to your 2033 01:44:26,479 --> 01:44:31,840 Speaker 4: favorite shows. Hi everyone, it's Katie Couric. You know I'm 2034 01:44:31,920 --> 01:44:35,400 Speaker 4: always on the go between running my media company, hosting 2035 01:44:35,479 --> 01:44:38,960 Speaker 4: my podcast, and of course covering the news. And I 2036 01:44:39,160 --> 01:44:41,680 Speaker 4: know that to keep doing what I love, I need 2037 01:44:41,800 --> 01:44:44,920 Speaker 4: to start caring for what gets me there, my feet.
2038 01:44:45,600 --> 01:44:48,320 Speaker 4: That's why I decided to try the Good Feet Store's 2039 01:44:48,479 --> 01:44:52,200 Speaker 4: personalized arch support system. I met with a Good Feet 2040 01:44:52,360 --> 01:44:56,439 Speaker 4: arch support specialist and after a personalized fitting, I left 2041 01:44:56,479 --> 01:44:59,880 Speaker 4: the store with my three-step system designed to improve comfort, 2042 01:45:00,320 --> 01:45:04,519 Speaker 4: balance, and support. My feet, knees, and back are thanking 2043 01:45:04,640 --> 01:45:08,479 Speaker 4: me already. Visit goodfeet dot com to learn more, find 2044 01:45:08,520 --> 01:45:12,559 Speaker 4: the nearest store, or book your own free personalized fitting.