1 00:00:05,800 --> 00:00:09,200 Speaker 1: Hi, and welcome back to the Karol Markowicz Show on iHeartRadio. 2 00:00:09,520 --> 00:00:13,000 Speaker 1: My guest today is Will Chamberlain. Will is senior counsel 3 00:00:13,080 --> 00:00:15,640 Speaker 1: at the Article III Project and one of my absolute 4 00:00:15,720 --> 00:00:19,000 Speaker 1: favorite people to follow on X. Hi, Will, so nice 5 00:00:19,040 --> 00:00:19,520 Speaker 1: to have you on. 6 00:00:20,239 --> 00:00:21,200 Speaker 2: Great to be with you, Karol. 7 00:00:21,680 --> 00:00:25,799 Speaker 1: I find that your feed is really smart and very 8 00:00:25,800 --> 00:00:31,040 Speaker 1: funny and clever, obviously, so I would love to know more 9 00:00:31,080 --> 00:00:34,960 Speaker 1: about you. How did you get into this thing of ours? 10 00:00:35,880 --> 00:00:37,320 Speaker 2: Yeah, this thing of ours, that's a good way 11 00:00:37,320 --> 00:00:38,960 Speaker 2: of describing it. Let's see. 12 00:00:38,960 --> 00:00:43,559 Speaker 3: In twenty seventeen, I had moved to DC to go 13 00:00:43,640 --> 00:00:47,320 Speaker 3: work for the Competitive Enterprise Institute. I'm a lawyer; I 14 00:00:47,360 --> 00:00:49,879 Speaker 3: graduated from Georgetown Law in twenty fifteen, practiced in Big 15 00:00:49,920 --> 00:00:52,240 Speaker 3: Law, had a cup of coffee, decided I didn't like it, 16 00:00:52,360 --> 00:00:56,040 Speaker 3: moved back to DC, and then when I was 17 00:00:56,080 --> 00:00:58,640 Speaker 3: here, I 18 00:00:58,760 --> 00:01:01,120 Speaker 3: basically met Mike Cernovich through a mutual friend who was 19 00:01:01,120 --> 00:01:05,600 Speaker 3: also my boss at CEI, and through Mike, whom I had 20 00:01:05,600 --> 00:01:08,600 Speaker 3: been following for years at that point, I was like, hey, 21 00:01:08,680 --> 00:01:10,800 Speaker 3: you know, I'm pro-Trump.
I would 22 00:01:10,920 --> 00:01:14,440 Speaker 3: like to be hooked in with the DC pro- 23 00:01:14,520 --> 00:01:16,440 Speaker 3: Trump people. You know, hey, I'm also, you know, a 24 00:01:16,400 --> 00:01:19,119 Speaker 3: Georgetown Law grad, like, you know, I'm a serious person. 25 00:01:20,400 --> 00:01:24,200 Speaker 3: So he did, and so then I started organizing. One 26 00:01:24,200 --> 00:01:26,200 Speaker 3: of the things I started doing was organizing cocktail hours 27 00:01:26,240 --> 00:01:30,039 Speaker 3: for MAGA figures and people in the administration at the Trump Hotel, 28 00:01:30,880 --> 00:01:33,680 Speaker 3: which was a very good way to get to meet 29 00:01:33,680 --> 00:01:36,480 Speaker 3: everybody. And it's one of those things that, you know, 30 00:01:36,560 --> 00:01:38,959 Speaker 3: people do not necessarily know: big social 31 00:01:39,120 --> 00:01:42,880 Speaker 3: media platforms are often built offline, with relationships that you 32 00:01:42,920 --> 00:01:46,039 Speaker 3: build with other people who have social media platforms. So, 33 00:01:46,680 --> 00:01:48,040 Speaker 3: you know, that was how I kind of got my 34 00:01:48,040 --> 00:01:51,840 Speaker 3: foot in the door. And then, you know, I started posting 35 00:01:51,920 --> 00:01:54,640 Speaker 3: and had, you know, bigger names at the time, 36 00:01:54,760 --> 00:01:56,720 Speaker 3: people like Cernovich, who was obviously one of the first 37 00:01:56,720 --> 00:02:01,200 Speaker 3: big people to amplify me, and Jack Posobiec. And over time, I 38 00:02:01,240 --> 00:02:05,000 Speaker 3: do that long enough and I say enough interesting 39 00:02:05,200 --> 00:02:07,360 Speaker 3: things that eventually a lot of people are following me. 40 00:02:08,800 --> 00:02:10,359 Speaker 3: So yeah, it's sort of interesting.
I know a 41 00:02:10,360 --> 00:02:12,359 Speaker 3: lot of people build their platforms by doing something else 42 00:02:12,400 --> 00:02:14,400 Speaker 3: first and then using the fame they get from that 43 00:02:14,440 --> 00:02:16,760 Speaker 3: thing to build a social media platform. I kind of 44 00:02:16,760 --> 00:02:18,960 Speaker 3: did it in reverse. I built the social media platform 45 00:02:19,040 --> 00:02:22,120 Speaker 3: and then used that to do more and more interesting things. 46 00:02:22,760 --> 00:02:25,440 Speaker 1: It's funny. When I got 47 00:02:25,440 --> 00:02:28,720 Speaker 1: my start, I had a blog and I would 48 00:02:28,720 --> 00:02:32,440 Speaker 1: throw happy hours, and meeting people in real life, 49 00:02:32,680 --> 00:02:34,640 Speaker 1: you know, just had me make a lot of different 50 00:02:34,680 --> 00:02:37,520 Speaker 1: connections, and a lot of people started reading me and 51 00:02:37,560 --> 00:02:39,560 Speaker 1: it kind of took off from there. I think the 52 00:02:39,600 --> 00:02:42,040 Speaker 1: real-life thing really shouldn't be overlooked. 53 00:02:42,280 --> 00:02:44,320 Speaker 2: Yeah, it's hugely important. 54 00:02:44,520 --> 00:02:47,320 Speaker 3: And you know, you can't start out being 55 00:02:47,360 --> 00:02:50,800 Speaker 3: in the middle of nowhere not knowing anybody, right? Well, 56 00:02:50,840 --> 00:02:53,880 Speaker 3: you can, but then you're doing everything on 57 00:02:53,840 --> 00:02:54,960 Speaker 2: hard mode, exactly. 58 00:02:55,320 --> 00:02:57,720 Speaker 3: Yeah, so that was sort of, if 59 00:02:57,760 --> 00:02:59,280 Speaker 3: you want the little "how did I get my break 60 00:02:59,320 --> 00:03:01,080 Speaker 3: into this space?" story, that's how I got it. I 61 00:03:01,080 --> 00:03:02,839 Speaker 3: mean, that said, you can get your break.
Plenty 62 00:03:02,840 --> 00:03:04,960 Speaker 3: of people have had their break and don't say interesting 63 00:03:05,000 --> 00:03:07,400 Speaker 3: things for long enough to remain interesting. Like, you do 64 00:03:07,560 --> 00:03:10,200 Speaker 3: something and get attention for it, but 65 00:03:10,200 --> 00:03:13,200 Speaker 3: fundamentally, on X, a platform is built on 66 00:03:13,240 --> 00:03:15,320 Speaker 3: having interesting things to say about the events 67 00:03:15,320 --> 00:03:15,720 Speaker 3: of the day. 68 00:03:16,000 --> 00:03:19,040 Speaker 1: Yeah, right, you could say one thing and get a 69 00:03:19,040 --> 00:03:21,720 Speaker 1: lot of followers, but then, you know, I don't think 70 00:03:21,720 --> 00:03:22,920 Speaker 1: a lot of people hang on to that. 71 00:03:22,760 --> 00:03:25,360 Speaker 3: Right, right. Or if you 72 00:03:25,400 --> 00:03:27,520 Speaker 3: only have one message and then that message is no longer 73 00:03:27,600 --> 00:03:30,560 Speaker 3: relevant, well then what else are you going to 74 00:03:30,600 --> 00:03:31,040 Speaker 3: talk about? 75 00:03:31,360 --> 00:03:31,560 Speaker 2: You know? 76 00:03:31,880 --> 00:03:34,400 Speaker 3: And so, you know, then also you have to 77 00:03:34,400 --> 00:03:36,240 Speaker 3: think about how you built your following. If you built 78 00:03:36,240 --> 00:03:39,360 Speaker 3: your following saying exactly one thing, to, you know, cheerlead 79 00:03:39,400 --> 00:03:43,440 Speaker 3: for something, then you can't change your mind, because that's 80 00:03:43,440 --> 00:03:45,240 Speaker 3: why people are following you, because you're the guy who 81 00:03:45,280 --> 00:03:47,040 Speaker 3: says the thing, and what if you don't want to say the thing?
82 00:03:47,720 --> 00:03:51,360 Speaker 1: So I went to the Article III Project website in 83 00:03:51,440 --> 00:03:53,920 Speaker 1: preparation for this, because I wanted to know more about it, 84 00:03:54,080 --> 00:03:57,880 Speaker 1: more about you, and I loved the tagline on the 85 00:03:57,920 --> 00:04:01,400 Speaker 1: front page. It says, "A3P brings brass knuckles 86 00:04:01,400 --> 00:04:04,320 Speaker 1: to fight leftist lawfare." What does that mean? What 87 00:04:04,400 --> 00:04:05,120 Speaker 1: do you guys do? 88 00:04:06,680 --> 00:04:09,920 Speaker 3: We're bullies. So that's the core of it, 89 00:04:10,040 --> 00:04:14,400 Speaker 3: you know. We bully, you know, legislators into 90 00:04:14,880 --> 00:04:17,719 Speaker 3: confirming the people we think should be confirmed. We try 91 00:04:17,880 --> 00:04:21,920 Speaker 3: and advocate, you know, 92 00:04:21,960 --> 00:04:23,560 Speaker 3: to get judges confirmed that will be good 93 00:04:23,560 --> 00:04:24,000 Speaker 2: on the law. 94 00:04:24,120 --> 00:04:27,080 Speaker 3: We try and advocate for judges to do the right 95 00:04:27,080 --> 00:04:32,080 Speaker 3: thing in cases, and we try and win the debates 96 00:04:32,279 --> 00:04:36,279 Speaker 3: internally on the right, especially about what the conservative legal 97 00:04:36,279 --> 00:04:39,440 Speaker 3: movement should be doing and what conservative lawyers should be doing. 98 00:04:40,000 --> 00:04:42,040 Speaker 3: And so we find ourselves in conflict with people like 99 00:04:42,160 --> 00:04:44,280 Speaker 3: Ed Whelan, for example, who might be, like, a, you know, 100 00:04:44,400 --> 00:04:48,320 Speaker 3: kind of squishier Republican, you know, telling conservative judges, 101 00:04:48,360 --> 00:04:53,039 Speaker 3: conservative lawyers that they should be more hesitant, weak-kneed, 102 00:04:53,040 --> 00:04:56,000 Speaker 3: that sort of thing.
In our view, anyway. I won't be 103 00:04:56,040 --> 00:04:58,159 Speaker 3: too uncharitable, but that's a big part of 104 00:04:58,160 --> 00:05:00,880 Speaker 3: what we do. And so it involves the pressure that 105 00:05:01,040 --> 00:05:03,880 Speaker 3: both, you know, myself, Mike Davis, Josh Hammer, all of 106 00:05:03,960 --> 00:05:07,279 Speaker 3: us can put through our advocacy on X, through op-eds, 107 00:05:07,279 --> 00:05:10,039 Speaker 3: through media hits. And then, because all of us have 108 00:05:10,080 --> 00:05:12,640 Speaker 3: a lot of followers, how we engage those followers and 109 00:05:12,680 --> 00:05:16,719 Speaker 3: get them to speak to legislators, to the 110 00:05:16,720 --> 00:05:20,880 Speaker 3: public, and to the executive branch. So that's through campaigns 111 00:05:20,920 --> 00:05:22,839 Speaker 3: where, you know, we've set up a portal where they can just 112 00:05:22,920 --> 00:05:25,240 Speaker 3: contact and email people. So all of this is, 113 00:05:25,320 --> 00:05:28,920 Speaker 3: it's a very effective and aggressive advocacy organization. We're 114 00:05:28,960 --> 00:05:31,240 Speaker 3: not very big; it's probably like ten people total involved, 115 00:05:31,240 --> 00:05:33,440 Speaker 3: and many of them are consultants. But in terms of, 116 00:05:33,480 --> 00:05:36,440 Speaker 3: like, the core, you know, what we're able to have: 117 00:05:36,600 --> 00:05:38,279 Speaker 3: we have a lot of force that we can bring 118 00:05:38,279 --> 00:05:42,480 Speaker 3: to bear on any particular issue, and when we do, 119 00:05:43,600 --> 00:05:45,440 Speaker 3: I think we end up leading the way and kind 120 00:05:45,440 --> 00:05:48,360 Speaker 3: of setting the line for the conservative movement on legal issues. 121 00:05:48,400 --> 00:05:49,800 Speaker 3: Now, I think that's where we are.
It used to 122 00:05:49,880 --> 00:05:52,760 Speaker 3: be that other groups did that, but now it feels 123 00:05:52,839 --> 00:05:54,800 Speaker 3: like those groups kind of defer. 124 00:05:54,800 --> 00:05:55,960 Speaker 3: If we set the line, 125 00:05:55,960 --> 00:05:56,760 Speaker 2: they don't want to get on the 126 00:05:56,720 --> 00:05:59,440 Speaker 3: wrong side of us, because we're really 127 00:05:59,440 --> 00:06:01,120 Speaker 3: good bullies, so it's really not pleasant to be on 128 00:06:01,120 --> 00:06:01,800 Speaker 3: the wrong side of us. 129 00:06:02,120 --> 00:06:05,279 Speaker 1: It's interesting, because I often think the conservative side doesn't 130 00:06:05,279 --> 00:06:07,720 Speaker 1: really have bullies. I think that in so much of what 131 00:06:07,760 --> 00:06:11,040 Speaker 1: we do, there's a lot of squishy people 132 00:06:11,240 --> 00:06:15,240 Speaker 1: and people who don't fight back and don't put up 133 00:06:15,279 --> 00:06:18,680 Speaker 1: the arguments and don't want to offend anybody. And I 134 00:06:18,720 --> 00:06:22,080 Speaker 1: think it's very hard to be the bully in that environment. 135 00:06:22,160 --> 00:06:23,080 Speaker 1: Do you guys find that? 136 00:06:24,680 --> 00:06:26,760 Speaker 2: I mean, I don't think it's hard. 137 00:06:26,920 --> 00:06:30,000 Speaker 3: I think, you know, if anything, the vacuum, the 138 00:06:30,040 --> 00:06:32,360 Speaker 3: absence of people who are really aggressive in setting the line 139 00:06:32,360 --> 00:06:35,560 Speaker 3: on these things, is what meant that 140 00:06:35,560 --> 00:06:37,520 Speaker 3: there was just such a huge audience for what we're doing, 141 00:06:37,560 --> 00:06:40,280 Speaker 3: and so many people who got behind us and 142 00:06:40,279 --> 00:06:43,640 Speaker 3: support what we're doing.
Because I think the fundamental, you know, 143 00:06:43,720 --> 00:06:46,280 Speaker 3: the fundamental failure of the conservative legal movement, honestly, is that 144 00:06:46,400 --> 00:06:50,039 Speaker 3: for years they weren't deeply loyal to conservative voters. They 145 00:06:50,560 --> 00:06:53,599 Speaker 3: were kind of doing their own thing and, you know, 146 00:06:53,640 --> 00:06:56,640 Speaker 3: had their own sort of clique and their own norms, 147 00:06:56,680 --> 00:06:58,760 Speaker 3: et cetera, et cetera. And so we come 148 00:06:58,800 --> 00:07:01,320 Speaker 3: in and we say, no, actually, you know what, 149 00:07:01,360 --> 00:07:03,520 Speaker 3: the people who are the reason you get to be 150 00:07:03,800 --> 00:07:06,839 Speaker 3: judges and lawyers in the administration, the people who voted 151 00:07:07,000 --> 00:07:12,400 Speaker 3: for Republican presidents, they have asks. And, you know, 152 00:07:12,480 --> 00:07:14,760 Speaker 3: the idea that you would just 153 00:07:14,960 --> 00:07:18,520 Speaker 3: be completely oblivious to that and not on board with 154 00:07:18,600 --> 00:07:21,840 Speaker 3: the broader conservative project, not just the conservative legal movement's project, the 155 00:07:21,920 --> 00:07:25,760 Speaker 3: broader conservative project, it's just not going to fly anymore. 156 00:07:26,320 --> 00:07:28,800 Speaker 3: And so, you know, I think that's one of 157 00:07:28,840 --> 00:07:31,040 Speaker 3: the reasons, like, you know, Mike is so popular. 158 00:07:31,600 --> 00:07:33,760 Speaker 3: He goes on Bannon and he goes wherever. I mean, 159 00:07:33,840 --> 00:07:36,440 Speaker 3: all three of us have built enormous platforms, really. And 160 00:07:36,480 --> 00:07:37,360 Speaker 3: what is my platform 161 00:07:37,400 --> 00:07:37,480 Speaker 2: like?
162 00:07:37,680 --> 00:07:40,400 Speaker 3: Again, I didn't come at this from the perspective of, gosh, 163 00:07:40,480 --> 00:07:43,520 Speaker 3: I have the stature from running a television show or something, 164 00:07:44,000 --> 00:07:45,800 Speaker 3: you know, being a major player. I've never served in 165 00:07:45,840 --> 00:07:49,360 Speaker 3: the administration. I served briefly for Ron DeSantis on 166 00:07:49,480 --> 00:07:51,000 Speaker 3: the legal side for a couple of months before 167 00:07:51,040 --> 00:07:54,240 Speaker 3: I got bored with it, so, you know, that's all. 168 00:07:54,640 --> 00:07:57,280 Speaker 3: But the reason is because I know this, right: 169 00:07:57,320 --> 00:07:59,480 Speaker 3: I have a very powerful and persuasive voice when it 170 00:07:59,480 --> 00:08:02,920 Speaker 3: comes to legal issues. Conservatives, usually the people 171 00:08:02,920 --> 00:08:05,440 Speaker 3: on our side, think I'm right about things, at least 172 00:08:05,440 --> 00:08:08,440 Speaker 3: when it comes to the law. And so that's 173 00:08:08,480 --> 00:08:10,160 Speaker 3: the force. It's persuasive force. 174 00:08:10,160 --> 00:08:12,160 Speaker 2: I'm not physically beating on people. 175 00:08:11,960 --> 00:08:14,239 Speaker 1: No, obviously you're not, like, putting them in a headlock. 176 00:08:14,360 --> 00:08:18,440 Speaker 1: But metaphorically speaking, right. What are some big wins you 177 00:08:18,480 --> 00:08:19,080 Speaker 1: guys have had? 178 00:08:19,520 --> 00:08:22,200 Speaker 3: We got Pete Hegseth confirmed. I think that was 179 00:08:22,200 --> 00:08:24,440 Speaker 3: a big one. That's the biggest one off the top 180 00:08:24,440 --> 00:08:25,960 Speaker 3: of my head. We were not the only people who 181 00:08:25,960 --> 00:08:28,040 Speaker 3: worked on that, but we were a big part of 182 00:08:28,360 --> 00:08:30,640 Speaker 3: making sure that he got confirmed.
There were a lot 183 00:08:30,640 --> 00:08:34,200 Speaker 3: of people whining about that, and once we got involved, 184 00:08:34,760 --> 00:08:39,280 Speaker 3: they stopped doing that. Who are the other people that 185 00:08:39,320 --> 00:08:40,760 Speaker 3: I'm trying to think of off the top 186 00:08:40,559 --> 00:08:43,439 Speaker 2: of my head? We got Emil Bove confirmed. That's another 187 00:08:43,440 --> 00:08:43,840 Speaker 2: big win. 188 00:08:44,360 --> 00:08:46,120 Speaker 3: There were a lot of Republicans who were kind of 189 00:08:46,160 --> 00:08:48,120 Speaker 3: wishy-washy about that, and then we came out in 190 00:08:48,160 --> 00:08:50,920 Speaker 3: full force behind the nomination and made the aggressive arguments. 191 00:08:51,240 --> 00:08:53,920 Speaker 3: Ed Whelan was, like, a really big conservative figure trying 192 00:08:53,920 --> 00:08:55,920 Speaker 3: to get his people to not vote for Emil Bove. 193 00:08:56,240 --> 00:08:58,040 Speaker 3: And again, we have a tiny majority in the Senate. 194 00:08:58,080 --> 00:08:59,439 Speaker 3: So if he managed to peel off one or two 195 00:08:59,480 --> 00:09:03,839 Speaker 3: people with his decades of, you know, relationships and work 196 00:09:03,880 --> 00:09:07,200 Speaker 3: and prestige in the conservative legal movement, which he could have, 197 00:09:07,320 --> 00:09:09,520 Speaker 3: you know, you'd think that he would 198 00:09:09,520 --> 00:09:10,920 Speaker 3: be able to peel off one or two of the 199 00:09:10,920 --> 00:09:13,560 Speaker 3: most moderate senators. Didn't happen. And I think a big 200 00:09:13,600 --> 00:09:15,800 Speaker 3: part of that was, you know, by the time we 201 00:09:15,800 --> 00:09:19,480 Speaker 3: were done advocating on that issue, Ed Whelan had blocked 202 00:09:19,520 --> 00:09:21,280 Speaker 3: every single one of us.
He had blocked me, he 203 00:09:21,320 --> 00:09:24,680 Speaker 3: blocked Josh Hammer, and he had blocked Mike. And, you know, 204 00:09:25,120 --> 00:09:27,360 Speaker 3: his job is to beat us, right? Actually, when you 205 00:09:27,360 --> 00:09:29,360 Speaker 3: think about it, his job is 206 00:09:29,400 --> 00:09:32,040 Speaker 3: literally to win the debate with us online, because he 207 00:09:32,120 --> 00:09:33,920 Speaker 3: is advocating for a different view of how the conservative 208 00:09:34,000 --> 00:09:36,320 Speaker 3: legal movement should work. He works at a nonprofit; he's taking 209 00:09:36,320 --> 00:09:38,320 Speaker 3: donations to that effect. If he's blocking the three of 210 00:09:38,360 --> 00:09:41,360 Speaker 3: us and just trying to ignore that we exist, it's like, well, 211 00:09:41,360 --> 00:09:44,160 Speaker 3: then we've won, you've lost. Like, you know, 212 00:09:44,559 --> 00:09:47,000 Speaker 3: because we're speaking to hundreds of thousands of people; you're 213 00:09:47,040 --> 00:09:49,880 Speaker 3: speaking to a few thousand. So, you know, if you're 214 00:09:49,880 --> 00:09:51,319 Speaker 3: not winning the debate with us, if you're just 215 00:09:51,400 --> 00:09:52,880 Speaker 3: not even participating, then we just win. 216 00:09:52,760 --> 00:09:53,160 Speaker 2: By default. 217 00:09:53,880 --> 00:09:55,600 Speaker 1: What would you be doing if it weren't this? What 218 00:09:55,640 --> 00:09:56,240 Speaker 1: would Plan B be? 219 00:09:56,559 --> 00:10:00,000 Speaker 3: Gosh, I don't know. I mean, it's hard to say. 220 00:10:00,080 --> 00:10:02,120 Speaker 3: In some abstract sense, Plan B would be going back 221 00:10:02,160 --> 00:10:04,720 Speaker 3: to practicing law in the normal way, which I don't 222 00:10:04,720 --> 00:10:07,520 Speaker 3: have any intention of doing, or trying to go work 223 00:10:07,559 --> 00:10:10,080 Speaker 3: in some state government somewhere.
I guess I could have 224 00:10:10,120 --> 00:10:12,319 Speaker 3: gone back to Ron DeSantis and worked somewhere in the 225 00:10:12,320 --> 00:10:15,120 Speaker 3: Florida state government. Maybe I could have gone to work 226 00:10:15,160 --> 00:10:18,120 Speaker 3: for the administration. All those things would 227 00:10:18,120 --> 00:10:20,360 Speaker 3: be options, but I much prefer doing what I do now, 228 00:10:20,520 --> 00:10:23,320 Speaker 3: which allows me to have this kind of bucolic suburban 229 00:10:23,400 --> 00:10:26,440 Speaker 3: life, working from home with my family, and 230 00:10:26,480 --> 00:10:29,040 Speaker 3: then also being extremely involved in the issues of the 231 00:10:29,120 --> 00:10:32,439 Speaker 3: day through X. 232 00:10:32,400 --> 00:10:33,760 Speaker 1: I just met your wife. She is lovely. 233 00:10:34,760 --> 00:10:36,200 Speaker 2: Really, she's great. 234 00:10:36,360 --> 00:10:38,000 Speaker 1: Yeah, really great. 235 00:10:38,160 --> 00:10:41,760 Speaker 3: Great pair. Got two beautiful kids and hopefully more 236 00:10:41,840 --> 00:10:43,880 Speaker 3: on the way in the future, and love it. You know, 237 00:10:43,960 --> 00:10:45,840 Speaker 3: we both live and 238 00:10:45,880 --> 00:10:47,920 Speaker 3: work from our home in North Carolina, but, 239 00:10:47,960 --> 00:10:49,920 Speaker 3: you know, she's at the Manhattan Institute and goes 240 00:10:50,000 --> 00:10:51,160 Speaker 3: up to New York every so often. 241 00:10:51,320 --> 00:10:51,800 Speaker 2: I'm here. 242 00:10:52,200 --> 00:10:53,960 Speaker 3: I go out to DC when I need to 243 00:10:54,040 --> 00:10:57,160 Speaker 3: for Article III, and for other things as well. 244 00:10:57,200 --> 00:11:00,480 Speaker 3: I help with the National Conservatism conferences and help 245 00:11:00,640 --> 00:11:02,360 Speaker 3: organize those and put together panels.
246 00:11:02,480 --> 00:11:03,720 Speaker 2: So all that is, all that is gravy. 247 00:11:03,720 --> 00:11:06,440 Speaker 1: That segues very well into one of my three 248 00:11:06,520 --> 00:11:08,959 Speaker 1: questions that I ask all my guests: what are you 249 00:11:09,160 --> 00:11:10,480 Speaker 1: most proud of in your life? 250 00:11:11,480 --> 00:11:13,000 Speaker 3: So, you know, I read that and thought it 251 00:11:13,040 --> 00:11:16,000 Speaker 3: was an interesting question, because the answer, beyond, 252 00:11:16,000 --> 00:11:20,160 Speaker 3: obviously, my family and what I've built with my wife, 253 00:11:20,760 --> 00:11:23,440 Speaker 3: is that I am proudest of what I've built in terms of 254 00:11:23,480 --> 00:11:26,400 Speaker 3: my X account, right? And it's not just, you know... 255 00:11:26,840 --> 00:11:31,319 Speaker 3: it's the product of ten years of work, right, 256 00:11:31,400 --> 00:11:33,600 Speaker 3: ten years of advocacy. And so, you know, is there 257 00:11:33,600 --> 00:11:34,560 Speaker 3: a book at the end of it? 258 00:11:34,600 --> 00:11:36,040 Speaker 2: No, I haven't really... oh, is there a book at 259 00:11:36,040 --> 00:11:38,440 Speaker 2: the end of it? Maybe there will be. I have 260 00:11:38,480 --> 00:11:40,160 Speaker 2: to, like, actually find a way to sit down and 261 00:11:40,200 --> 00:11:40,360 Speaker 2: do that. 262 00:11:40,400 --> 00:11:42,400 Speaker 3: I have terrible ADD, so that's one of the reasons 263 00:11:42,400 --> 00:11:45,000 Speaker 3: I find X so appealing: I can fire off 264 00:11:45,000 --> 00:11:48,280 Speaker 3: thoughts quickly. But my X account is 265 00:11:48,320 --> 00:11:51,360 Speaker 3: the product of ten years of work, and at this 266 00:11:51,440 --> 00:11:54,040 Speaker 3: point it's its own kind of phenomenon.
I mean, not 267 00:11:54,320 --> 00:11:56,079 Speaker 3: a huge phenomenon in the world, but it's 268 00:11:56,080 --> 00:11:59,200 Speaker 3: a very powerful thing that allows me to have influence 269 00:11:59,200 --> 00:12:01,240 Speaker 3: on the issues of the day, get asked to testify 270 00:12:01,320 --> 00:12:05,160 Speaker 3: before Congress, you know, meet all 271 00:12:05,160 --> 00:12:07,480 Speaker 3: sorts of interesting people, be followed by all sorts of 272 00:12:07,520 --> 00:12:10,560 Speaker 3: interesting people, go on interesting shows like this one. And 273 00:12:10,800 --> 00:12:13,480 Speaker 3: so I'm really, really proud of the platform I've built. 274 00:12:13,520 --> 00:12:17,080 Speaker 3: I'm proud that, through years of advocating 275 00:12:17,240 --> 00:12:21,600 Speaker 3: very aggressively on a variety of topics, as 276 00:12:21,640 --> 00:12:23,920 Speaker 3: aggressively as anybody, I think, in the conservative legal movement, 277 00:12:24,040 --> 00:12:27,800 Speaker 3: I've still built up this platform, and also having 278 00:12:27,840 --> 00:12:30,479 Speaker 3: done so in a way where I'm not pigeonholed 279 00:12:30,480 --> 00:12:32,520 Speaker 3: into anything.
As far as I can tell, like, I 280 00:12:32,559 --> 00:12:34,160 Speaker 3: say what I think, and I can 281 00:12:34,400 --> 00:12:36,720 Speaker 3: have a bunch of people who are following me because 282 00:12:36,760 --> 00:12:38,520 Speaker 3: they respect that I say what I think, or that, 283 00:12:38,559 --> 00:12:40,840 Speaker 3: at a minimum, I do not say what 284 00:12:40,880 --> 00:12:43,080 Speaker 3: I do not think, right? Like, I do 285 00:12:43,120 --> 00:12:45,679 Speaker 3: hold my tongue at times, especially given that we currently 286 00:12:45,679 --> 00:12:47,440 Speaker 3: control the administration; it would be kind of rude of 287 00:12:47,480 --> 00:12:50,199 Speaker 3: me to berate them on Twitter when I could pick 288 00:12:50,240 --> 00:12:52,760 Speaker 3: up the phone. But you can count on me to 289 00:12:52,840 --> 00:12:56,520 Speaker 3: not say things I don't think, and, yeah, to 290 00:12:56,679 --> 00:13:00,320 Speaker 3: be interesting and contrarian at times, because, you know, 291 00:13:00,800 --> 00:13:01,920 Speaker 3: that's what I built. 292 00:13:02,120 --> 00:13:04,080 Speaker 1: We're going to take a quick break and be right 293 00:13:04,120 --> 00:13:09,679 Speaker 1: back on the Karol Markowicz Show. Do you have any 294 00:13:09,800 --> 00:13:12,600 Speaker 1: legal, like, pet issues? Are there things that 295 00:13:12,640 --> 00:13:14,920 Speaker 1: you really care about more than others?
296 00:13:15,000 --> 00:13:18,720 Speaker 3: Well, you know, the funny thing is, I won on 297 00:13:18,920 --> 00:13:21,120 Speaker 3: the biggest issue I cared about, and I didn't win 298 00:13:21,200 --> 00:13:25,080 Speaker 3: through the law. I won because Elon Musk bought Twitter, right? 299 00:13:25,480 --> 00:13:30,680 Speaker 3: You know, and so basically, for years, my big piece 300 00:13:30,720 --> 00:13:33,679 Speaker 3: of advocacy, you know, before anything else, was, yeah, 301 00:13:33,800 --> 00:13:35,839 Speaker 3: everybody should have a right to use Twitter, right? 302 00:13:35,880 --> 00:13:40,559 Speaker 3: Basically, the idea of permanent bans of individuals for saying 303 00:13:40,600 --> 00:13:44,960 Speaker 3: things that were perfectly lawful was wrong. That free speech 304 00:13:45,040 --> 00:13:47,800 Speaker 3: isn't meaningful in the modern era if you're not allowed 305 00:13:47,840 --> 00:13:50,600 Speaker 3: to speak freely on the Internet, so we needed 306 00:13:50,600 --> 00:13:51,400 Speaker 3: to have legal change. 307 00:13:51,440 --> 00:13:51,640 Speaker 2: Now, 308 00:13:52,440 --> 00:13:55,240 Speaker 3: basically, you know, I kept talking, and I remember, very 309 00:13:55,240 --> 00:13:57,720 Speaker 3: early on, we're talking twenty seventeen, twenty eighteen, 310 00:13:57,760 --> 00:13:59,720 Speaker 3: I'd go on with people and people would be like, 311 00:13:59,720 --> 00:14:01,800 Speaker 3: what, a free speech civil right? Like, what do you mean? 312 00:14:01,800 --> 00:14:05,920 Speaker 3: That's intervention in private business. I won that debate. You'll 313 00:14:05,920 --> 00:14:08,600 Speaker 3: notice nobody talks that way anymore.
I 314 00:14:08,720 --> 00:14:10,480 Speaker 3: just had that debate over and over again with people, 315 00:14:10,800 --> 00:14:13,959 Speaker 3: including people who were skeptical on our side, and 316 00:14:14,000 --> 00:14:16,520 Speaker 3: they were persuaded. Like, I was just right. And also, thankfully, 317 00:14:16,520 --> 00:14:18,680 Speaker 3: Twitter just kept behaving in such an obnoxious and ridiculous 318 00:14:18,679 --> 00:14:23,000 Speaker 3: fashion that my arguments, yeah, seemed endlessly more persuasive. So 319 00:14:23,120 --> 00:14:25,280 Speaker 3: at a certain point, then, you know, eventually... 320 00:14:25,520 --> 00:14:27,600 Speaker 3: You know, here's a funny story, right. Seth Dillon, 321 00:14:27,640 --> 00:14:29,440 Speaker 3: Seth Dillon at one point had a discussion with me 322 00:14:29,480 --> 00:14:31,480 Speaker 3: about this, right, where he was trying to probe my views. 323 00:14:31,480 --> 00:14:33,400 Speaker 3: We had, like, a lengthy, I think, text 324 00:14:33,480 --> 00:14:36,000 Speaker 3: conversation and phone call. And then Seth Dillon went on 325 00:14:36,080 --> 00:14:38,000 Speaker 3: with Elon Musk, right, Elon Musk went on his podcast, 326 00:14:38,040 --> 00:14:39,280 Speaker 3: and I think they talked about this stuff. 327 00:14:39,360 --> 00:14:40,840 Speaker 2: And then Elon Musk went out and bought Twitter. 328 00:14:40,880 --> 00:14:43,760 Speaker 1: So you can, you know, basically just think of yourself as the starter 329 00:14:43,600 --> 00:14:45,720 Speaker 3: there, right. I feel like I can say 330 00:14:46,320 --> 00:14:49,200 Speaker 3: that, maybe, yeah. But thank you, Seth, and thank 331 00:14:49,240 --> 00:14:51,080 Speaker 3: you, obviously, Elon, for buying it. I didn't have forty 332 00:14:51,120 --> 00:14:54,280 Speaker 3: billion lying around. But talk about a deus ex machina. 333 00:14:54,360 --> 00:14:57,560 Speaker 3: That basically ended the salience of that issue.
I mean, 334 00:14:57,600 --> 00:14:59,120 Speaker 3: who talks about free speech now? We don't have 335 00:14:59,120 --> 00:15:01,400 Speaker 3: to worry about it, because the biggest platform, 336 00:15:01,400 --> 00:15:03,080 Speaker 3: the most important one, is controlled by us. So, like, 337 00:15:03,120 --> 00:15:07,720 Speaker 3: whatever YouTube does, whatever Facebook does, like, whatever, who cares? 338 00:15:07,960 --> 00:15:09,680 Speaker 3: Yeah, I can put it on X, and 339 00:15:10,040 --> 00:15:11,960 Speaker 3: X is where all the serious people are, the people who, 340 00:15:11,960 --> 00:15:15,520 Speaker 3: you know, have real-deal 341 00:15:15,600 --> 00:15:19,320 Speaker 3: influence. So I can put it there. Yeah, 342 00:15:19,400 --> 00:15:21,720 Speaker 3: not Bluesky. I mean, Bluesky 343 00:15:21,840 --> 00:15:26,280 Speaker 3: is making the mistake that all the competitors were making 344 00:15:26,840 --> 00:15:30,000 Speaker 3: when we were getting banned in, like, twenty twenty. I mean, 345 00:15:30,080 --> 00:15:32,360 Speaker 3: the fundamental problem is, and this is 346 00:15:32,440 --> 00:15:34,280 Speaker 3: the beginning of my argument, X is a 347 00:15:34,400 --> 00:15:38,040 Speaker 3: natural monopoly. You're 348 00:15:38,040 --> 00:15:40,320 Speaker 3: going to have so much trouble replacing the public square with, 349 00:15:40,360 --> 00:15:42,560 Speaker 3: like, a separate, smaller public square that only has people 350 00:15:42,560 --> 00:15:45,280 Speaker 3: who agree with you. That's so much less interesting than 351 00:15:45,520 --> 00:15:48,000 Speaker 3: the real public square where everybody is debating. So I 352 00:15:48,000 --> 00:15:51,000 Speaker 3: think with Bluesky, what you're seeing is what happens to 353 00:15:51,120 --> 00:15:54,040 Speaker 3: all these other platforms.
Right, people, like, explore it, 354 00:15:54,160 --> 00:15:56,040 Speaker 3: like, have a little fun for a while, feel really 355 00:15:56,040 --> 00:15:57,160 Speaker 3: good because they're talking to people that... 356 00:15:57,480 --> 00:16:00,080 Speaker 2: Everybody agrees with you. Yeah, and then it gets really 357 00:16:00,160 --> 00:16:01,120 Speaker 2: boring and they want to go back to X. 358 00:16:01,440 --> 00:16:04,160 Speaker 1: Yeah, and they always come back, they do, they always. 359 00:16:03,840 --> 00:16:06,320 Speaker 3: Come back, yeah, because that's where the 360 00:16:06,520 --> 00:16:09,280 Speaker 3: discussions with weight happen. Right, you know, you can 361 00:16:09,280 --> 00:16:12,000 Speaker 3: get cheerleading on some small little platform that isn't X, 362 00:16:12,240 --> 00:16:14,160 Speaker 3: but X is where the discussions with weight that have 363 00:16:14,280 --> 00:16:14,920 Speaker 3: impact happen. 364 00:16:15,400 --> 00:16:17,680 Speaker 1: Yeah, at the time when people were getting banned on 365 00:16:17,880 --> 00:16:20,480 Speaker 1: X for saying things like, you know, masks don't work, 366 00:16:20,600 --> 00:16:23,520 Speaker 1: or, I mean, all kinds of insanity, I was thinking, like, 367 00:16:23,600 --> 00:16:26,040 Speaker 1: I'm not going to go to one of these smaller platforms. 368 00:16:26,080 --> 00:16:28,600 Speaker 1: I just don't care. Like, if I get banned from X, 369 00:16:28,600 --> 00:16:30,280 Speaker 1: I get banned from X. I'm not going to go 370 00:16:30,760 --> 00:16:33,200 Speaker 1: have the conversation with people who already agree with me. 371 00:16:33,400 --> 00:16:35,400 Speaker 1: And that's it. It has to be put out to 372 00:16:35,480 --> 00:16:37,800 Speaker 1: the larger audience. Otherwise it just doesn't matter. 373 00:16:38,680 --> 00:16:39,880 Speaker 2: That's right. I agree.
374 00:16:40,640 --> 00:16:43,800 Speaker 1: Give us a five-year-out prediction, could be about 375 00:16:43,800 --> 00:16:44,480 Speaker 1: anything at all. 376 00:16:44,880 --> 00:16:48,480 Speaker 3: Five-year-out prediction, I mean, anything at all? Like, 377 00:16:48,800 --> 00:16:50,880 Speaker 3: we're going to have self-driving cars, they're going to dominate. 378 00:16:50,920 --> 00:16:51,760 Speaker 2: We're right at the cusp. 379 00:16:51,800 --> 00:16:55,080 Speaker 3: Of self-driving cars becoming the thing, and, you know, 380 00:16:55,120 --> 00:16:57,880 Speaker 3: I'm already seeing, like, people able to drive 381 00:16:57,880 --> 00:17:00,240 Speaker 3: all around the country in their Teslas. I think, you know, 382 00:17:00,240 --> 00:17:02,240 Speaker 3: it took maybe ten to fifteen years longer than people thought, 383 00:17:02,240 --> 00:17:04,920 Speaker 3: but it's here now, and five years from now 384 00:17:05,080 --> 00:17:07,560 Speaker 3: we're going to start having debates about exactly who gets 385 00:17:07,560 --> 00:17:09,320 Speaker 3: to drive and who doesn't, because driving is going to 386 00:17:09,320 --> 00:17:10,919 Speaker 3: be seen as one of the things that humans do 387 00:17:10,960 --> 00:17:14,480 Speaker 3: that's very dangerous. So many people die on 388 00:17:14,480 --> 00:17:16,160 Speaker 3: the road. So maybe that's more like a ten-to-fifteen- 389 00:17:16,200 --> 00:17:21,000 Speaker 3: year prediction.
But, like, we're not far from self-driving 390 00:17:21,080 --> 00:17:23,919 Speaker 3: being just... self-driving is already better than elderly drivers, 391 00:17:23,920 --> 00:17:25,520 Speaker 3: for instance. Like, you know, my parents are getting up 392 00:17:25,520 --> 00:17:28,320 Speaker 3: there, and, you know, they're of means, 393 00:17:28,359 --> 00:17:30,080 Speaker 3: and I was like, your next car should probably be 394 00:17:30,119 --> 00:17:33,680 Speaker 3: a Tesla, because, you know, you're in the 395 00:17:33,680 --> 00:17:35,120 Speaker 3: stage where, you know, you don't want to drive at night. 396 00:17:35,480 --> 00:17:36,440 Speaker 3: Tesla will drive at night. 397 00:17:36,680 --> 00:17:41,120 Speaker 1: Yeah, and I only use the Tesla at night. 398 00:17:41,160 --> 00:17:43,720 Speaker 1: It's actually my husband's car. But if I'm driving somewhere 399 00:17:43,720 --> 00:17:46,000 Speaker 1: at night... yeah, I'm a great driver at night or whatever. 400 00:17:46,359 --> 00:17:48,840 Speaker 1: But I like the self-driving. I don't have to 401 00:17:48,840 --> 00:17:49,960 Speaker 1: really think about it at night. 402 00:17:50,760 --> 00:17:52,760 Speaker 3: Yeah, and in, like, five to ten years, it'll just be 403 00:17:52,800 --> 00:17:55,320 Speaker 3: obvious that the self-driving is better. But yeah, no, 404 00:17:55,320 --> 00:17:56,679 Speaker 3: self-driving is coming. That would be my big five- 405 00:17:56,720 --> 00:17:57,200 Speaker 3: year prediction. 406 00:17:58,160 --> 00:18:00,080 Speaker 1: It's funny, because do you think Americans are going to 407 00:18:00,119 --> 00:18:02,280 Speaker 1: go for it?
Americans are just so, like, I drive 408 00:18:02,320 --> 00:18:05,320 Speaker 1: my own car, you know, very independent, and, like, nobody 409 00:18:05,320 --> 00:18:08,600 Speaker 1: could take away our freedom to drive ourselves, and I 410 00:18:08,600 --> 00:18:11,240 Speaker 1: don't know, I kind of think Americans might rebel a 411 00:18:11,280 --> 00:18:12,159 Speaker 1: little bit against it. 412 00:18:14,520 --> 00:18:18,679 Speaker 3: Yeah, I mean, they might, but I think, you know, 413 00:18:18,760 --> 00:18:21,560 Speaker 3: I think ultimately the safety argument will be so persuasive, 414 00:18:21,920 --> 00:18:23,600 Speaker 3: and I think that, you know, you'll still have some. 415 00:18:23,800 --> 00:18:25,800 Speaker 3: I think it'll be one of those things 416 00:18:25,840 --> 00:18:28,919 Speaker 3: where it'll be more restricted, and maybe every car 417 00:18:28,960 --> 00:18:30,720 Speaker 3: will have to have some sort of self-driving takeover 418 00:18:30,880 --> 00:18:33,720 Speaker 3: to prevent accidents. That would be an interesting thing. Yeah, 419 00:18:34,359 --> 00:18:38,200 Speaker 3: but, I mean, everybody's 420 00:18:38,240 --> 00:18:40,199 Speaker 3: going to be confronted with the force of the fact that, like, 421 00:18:41,200 --> 00:18:43,440 Speaker 3: we could be saving the lives of the tens of thousands of people 422 00:18:43,480 --> 00:18:45,879 Speaker 3: who die in a year in car accidents, and thirty 423 00:18:45,960 --> 00:18:47,720 Speaker 3: years from now, they're going to look at us like 424 00:18:47,760 --> 00:18:50,000 Speaker 3: we're barbarians. Like, what, you had this tech and you weren't 425 00:18:50,040 --> 00:18:51,600 Speaker 3: using it, and you could have saved tens of thousands of 426 00:18:51,680 --> 00:18:52,280 Speaker 3: people a year? Like... 427 00:18:52,240 --> 00:18:54,479 Speaker 1: Come on. Well, I have loved this conversation.
I've loved 428 00:18:54,560 --> 00:18:56,960 Speaker 1: getting to know you a little bit better. Leave us 429 00:18:57,040 --> 00:18:59,760 Speaker 1: here with your best tip for my listeners on how 430 00:18:59,800 --> 00:19:01,400 Speaker 1: they can improve their lives. 431 00:19:02,400 --> 00:19:05,480 Speaker 3: How you can improve your lives. I think it is 432 00:19:06,280 --> 00:19:08,280 Speaker 3: to read How to Win Friends and Influence People. 433 00:19:08,359 --> 00:19:13,440 Speaker 3: So much of what people struggle with is understanding that, 434 00:19:13,440 --> 00:19:15,679 Speaker 3: you know, you should try and be easy to be 435 00:19:15,760 --> 00:19:18,680 Speaker 3: friends with and easy to support. I think a lot 436 00:19:18,720 --> 00:19:21,560 Speaker 3: of people in our space especially get kind of narcissistic 437 00:19:22,080 --> 00:19:26,000 Speaker 3: and don't think through, you know, how they're coming across, 438 00:19:26,240 --> 00:19:28,639 Speaker 3: and they also get obsessed about particular things. 439 00:19:28,640 --> 00:19:30,840 Speaker 3: It's really useful to remember 440 00:19:30,880 --> 00:19:32,520 Speaker 3: that it's important to be a good friend. 441 00:19:33,400 --> 00:19:35,920 Speaker 1: I love that. He is Will Chamberlain. Check him out 442 00:19:35,920 --> 00:19:39,440 Speaker 1: on X, follow the Article Three Project. Thank you so much, Will. 443 00:19:39,920 --> 00:19:40,239 Speaker 1: All right. 444 00:19:40,280 --> 00:19:41,000 Speaker 2: Thanks for having me.