Speaker 1: Welcome back to A Numbers Game with Ryan Girdusky. Thank you for being here, ladies and gentlemen. So I was looking around and playing with some of the numbers on this podcast's listenership over the weekend, checking out, you know, where my audience is and who was responding to it. And I really, really, really want to do a live podcast taping this year. That would be my goal for this podcast: that this year we can get an audience together and I could do a live event. I just think a live ask-me-anything is going to be so much fun. And I could have a guest come stream in, or they can be there, you know, in person, depending on the place and the time. So that's what my goal is. And the show's been growing pretty successfully, and the audience has been very loyal, and I'm so happy for it. And I was looking at cities. So Houston is the biggest city for this podcast as far as listenership goes. And then it's followed by a city that I never heard of, and I can't believe how big my audience is there, but it's Hutchinson, Kansas.
Speaker 1: I have never been to Hutchinson, Kansas. I've been to Kansas many times, never been to Hutchinson, but I have a strong following in the city of Hutchinson, Kansas. After Hutchinson, it's Dallas and New York and Denver and Phoenix and Atlanta, the places you would think, you know, have big listenerships because of the big populations. But if there's an organization or a group or anybody that wants me to come do a live event in your city, email me: Ryan at numbersgamepodcast dot com. We charge, you know, a low overhead just to make sure we cover costs. I don't need to make a lot of money off of it. The show's doing great. I have a million other jobs. But I really think a live event would be wonderful. So if you're interested, email me: Ryan at numbersgamepodcast dot com, same as ask-me-anything time. Tell me if, you know, we could pack a house. If we could pack a house, if we can get people there, I'm down to do it. I just think it would be so great. So thank you guys so much for continuing to support this podcast. I really appreciate you all.
Speaker 1: Okay, let's talk about some polling from around the country. Do a little polling pick-me-up. In Wisconsin, Republican Tom Tiffany is within the margin of error for governor against all leading Democratic candidates. He's down three points against Sarah Rodriguez, two points against Mandela Barnes, who is the lieutenant governor, one point from David Crowley, and he's ahead three points on Francesca Hong. Barnes is leading in the polls, but Francesca actually had a recent poll that had her ahead. It's more likely Barnes, only because of name ID and support. But Francesca is, like, the super progressive candidate, the very far-left candidate. Although Barnes, I don't know how you get further left than Barnes, but Francesca managed it. She outdid him on, you know, hating capitalism just a little bit more. Over in New Hampshire, a Saint Anselm poll found that Republican governor Kelly Ayotte is leading likely Democratic candidate Warmington, that's the last name, by six points. It's a healthy lead for a Republican candidate in a state that voted for Kamala Harris. On the Senate side.
Speaker 1: This is the sleeper race, the New Hampshire race no one's paying attention to, but it is the sleeper race of the year. Former Senator John Sununu is down by just three points against Congressman Chris Pappas, the likely Democrat. You know, Sununu was a senator a while ago. He lost reelection, I think, in one of the Democratic waves, I think two thousand and eight. But he's from a legacy family. His brother was a super popular governor. His dad was an icon in the state. I think his dad's still alive, but he was George H. W. Bush's chief of staff, and he was a governor himself. It's just, New Hampshire's this weird place where, like, four Lebanese families will run the entire state, and the Sununus are one of them, and just incredibly, incredibly popular. He's not a Trumper, but he's endorsed by Trump, so there's unity in the Republican Party. And his name ID is not associated with Trump, which you would need in a year that's not favorable for Republicans, especially with the Trump brand not being great. I just think, I mean, I'm not saying that he's gonna win.
Speaker 1: I think it is the sleeper race of the year, especially given that New Hampshire has had such tight federal races in the last couple of cycles that no one really picked up on. Over in Los Angeles, this is not a federal race, this is a local race: Mayor Karen Bass is at twenty-five percent against Los Angeles City Council member Nithya Raman, and I think that's the progressive candidate; she's at seventeen percent. And reality show star Spencer Pratt is at fourteen percent. Now, Karen Bass is famous for not being in LA during the fires that destroyed an entire neighborhood, and now she's trying to get Section 8 housing built in that neighborhood, in the Palisades. I've been to the Palisades. It's such a beautiful area. It's horrific what's happened to it, and how Karen Bass has been neglectful, to say the least. And Spencer Pratt is running as, like, the de facto Republican candidate. Is there a better person? I don't know. He's definitely trying to get his name out there. But Karen Bass, who was away while the city burned, is still leading in the polls, which is not wonderful.
Speaker 1: And last but not least, there was a CBS poll that asked, do you support voter ID? Eighty percent of people supported it, twenty percent opposed. The poll also asked, do you support requiring proof of citizenship to register to vote? Sixty-six percent said yes, thirty-three percent opposed. That includes ninety-three percent of Republicans, sixty-one percent of independents, and forty-three percent of Democrats. That also included sixty percent of the black vote. Considering they always say the black vote will suffer from requiring proof of citizenship, sixty percent of black voters support it. Okay. Now I want to talk about the generic ballot, because, you know, a lot of these polls have been good news for Republicans: there are tight races, Republicans are not being blown out in some of these races. They're competitive in Wisconsin, competitive in New Hampshire, competitive, you know, in a number of these places. But the generic ballot is moving in the other direction for Republicans, first being the Emerson poll.
Speaker 1: The Emerson poll has Democrats dominating the generic ballot by eight and a half points. Democrats are winning independents in this poll by eighteen points, and they're winning Latinos by twenty-one points. That's actually a pretty good number with Latinos, because it shows, in a year that looks like twenty eighteen generic-ballot-wise, that Latinos still have not moved back to where they used to be in twenty eighteen, when the numbers would be like a thirty-six or thirty-eight or forty-point landslide for Democrats. Twenty-one is actually very good, considering it shows that the transition of Latino voters is really sticking. Even if they're not happy with Trump right now or the Republican Party, they're not willing to move back to the Democratic Party like they used to. The poll also has Republicans only leading the white vote by two points, if that. I mean, I don't believe that's true. It's never been that small. If that's true, the GOP really is screwed. That follows a similar number from the website The Argument. They did their own internal poll.
Speaker 1: The Argument is a left-wing outlet, but they're smart left-wingers: they put out information that makes liberals uncomfortable at times, and I think they're intellectually honest. Although they have a bias, everyone has a bias, so you can't really help it. But they're not like, what's his name, G. Elliott Morris or something, from, he was at The Economist. He's just, like, a straight-up hack. This is a liberal, I think, with a brain, so I read their stuff. Trump's favorability is negative eighteen according to their poll. Seventeen percent of Trump voters disapprove of the job he's doing. And he's underwater with every key demographic, and he's just tied among whites without a college degree. Pretty awful. The primary driver is the economy. Voters hate this Trump economy and they're blaming him for it, you know. I'm down in ruby-red Louisiana, and voters are just feeling disappointed. They were hoping for the first term all over again, and they're not getting it, and there's anxiety, and a lot of people are just barely getting by and they're really struggling.
Speaker 1: And I think that there's this divorced mentality, from how bad it is on the streets versus, you know, sixteen hundred Pennsylvania Avenue. Sixty-five percent of voters feel like the economy isn't keeping up with the cost of living. Now, is that true? Not necessarily untrue, right? Because we've seen what inflation is, and we've seen the growth in wages. Wages are growing pretty fast, and inflation is down significantly from the Biden era. Although, with gas prices jacking up because of the Iran war, maybe they're feeling it more. I don't know exactly when the poll was taken. I think that gas prices started going up at the same time the poll was being taken. Twenty-five percent of voters feel their financial situation is improving. Fifty percent of voters think the economy is getting worse. Once again, is it getting worse? Not, not terribly worse. I mean, a lot of this is about feelings. And, you know, Ben Shapiro always says, facts don't care about your feelings. That is true, but voters don't care about facts.
Speaker 1: Feelings are the only thing that matters sometimes with voters. Feelings matter a lot to people, and you can't tell them they're wrong. You have to justify their feelings and explain how things are getting better to comfort them. So it is what it is. One interesting data point that the poll found is that voters have a preference for lower prices over lower unemployment. I don't find that to be true, only because in this country we have gotten so used to unemployment being under five percent that, for most people, we don't remember what ten, twelve, fifteen percent unemployment feels like. And I think that's why people are arguing more about prices than actual unemployment right now. Among all registered voters, Democrats lead forty-six to forty in this poll, but among those definitely going to vote, it is fifty-five to forty-five, because voter enthusiasm is over the moon for Democrats right now. Zachary Donnini, who I've had on this podcast, who is such a smart, such a smart guy, he's a data scientist.
Speaker 1: He actually points out, though, that when you pull all the polls together, Democrats' lead has actually shrunk a little bit in the last few weeks. He says Democrats are up overall nationally by about four point six despite these polls, although I think those are two good pollsters. A D-plus-eight environment, which is what these two polls say, means that Republicans will lose the Senate. A D-plus-four-to-six, as Zachary Donnini notes from all the pollsters, that's where they, you know, keep the Senate, lose the House, but not by more than, I don't know, fifteen seats, twelve seats, eighteen seats, ten seats. It's not an overwhelming, it's not a forty-seat landslide. Now, there are so many factors in why prices are going up. There's, you know, there's too much money in the system, because our government cannot stop running deficits and adding to the debt. There's the war with Iran. There are problems with how companies are restructuring post-COVID. There is a lot of instability in the market. There's the fact that a lot of companies hired too much during COVID and now they're firing people.
Speaker 1: I will say, and this is anecdotal, this is not data-driven: I think one thing that's going to benefit Republicans is the upcoming tax season, with the rebates. I spoke to a relative of mine who is a hairstylist. She lives on tips. Because of Trump's no tax on tips, her tax rebate was seven hundred and fifty percent higher than the year prior. It was a life-changing amount of money, I don't want to say life-changing, but a year-changing amount of money for a person in a blue-collar job. And considering that seventeen million Americans work in tipped industries, this could be a lot. This could bolster working-class Americans' incomes substantially going into the election year. So I am hopeful that, come May and June, as long as people file their taxes on time, which, Lord knows, I love an extension, those tax rebates will help a lot of working-class Americans and maybe ease a lot of anxieties around the economy.
Speaker 1: Another big problem, though, and this is something that won't be fixed by no tax on tips, a big problem is too many Americans feel like Trump is focused on the wrong things. I know I'm going to piss people off: he's got to relax on the foreign policy stuff. I know it's legacy-building, I know it's important. We're in an election cycle, though. Americans don't care about foreign policy. They don't, they don't care about foreign policy. Venezuela, as I've said on this podcast, was flawlessly done. Flawless, ten out of ten. No one cared. It was a brief bump in the polls by a point, and everyone went back to their life. Iran is much more complicated, and it's having a lot more repercussions on the economy and locally. Not to mention, there are dead Americans now. Trump at this point needs to start calling all these executives who have promised to open up all these warehouses and factories across the country and hire all these Americans, and start showing up to ribbon-cutting ceremonies every single week. Send J.D. Vance out. Send your key allies out. Send, who, the Commerce Secretary, whomever, or the Secretary of the Treasury, whomever. Make sure they are at these ribbon-cutting ceremonies where you're going to break ground on new construction, where you're going to hire new workers; take photos with newly hired workers. This is the effort, this is the effort that needs to be put in to make sure Americans know that you care about the economy. You know, go invite a bunch of people who got tax relief, who were servers at restaurants, or they were hairdressers, or worked in hotels, whatever the case is, they got tips, to talk about how much of a relief the economy is from the no tax on tips. You really have to emphasize: I care about the economy. I'm a businessman. I care about the business of people's everyday lives. Don't worry about what is going on at a war table in Tehran; worry about what is happening at the kitchen table in Dallas, in Detroit, in Cleveland, in all these exurbs and suburbs and in rural America. It has to be concentrated so heavily on the economy right now.
Speaker 1: And Congress, you know, in the meantime, has to start to bring up all the easy bills. Get Mike Johnson to bring up the easiest bills and make Democrats vote against them; hang it over their heads going into election season. Have Thune bring them up for votes. Even if they fail, it doesn't matter. Let Democrats own every bad policy issue. Chris Murphy, who's got an IQ of, like, well water, he was on, I forget which show, MSNBC or something like that, and he said, he was like, the people we care about the most are undocumented Americans. You have to make them own all these stupid, insane positions. Democrats have so many good, populist positions that they care about, and I'm not saying they're well thought out, but they're popular with the American people: on abortion, on healthcare, on tech, more and more. And then it'll be like, here are, you know, winning issues left, right, and center. And, by the way, along with these winning issues, you have to let an illegal alien man who identifies as a girl, who has thirty-seven priors, shower next to your daughter.
Speaker 1: Like, they will destroy any type of goodwill, any welcome that they have with voters at all. They just, they have to, they can't help it. Their brains are broken on trans issues and illegal immigrants and crime, to the point that it actually hurts the issues that they're so popular with the American people on. And that's, honestly, the benefit for the Republicans. But the Republicans have to get on message. Trump needs to get on message. Time is of the essence. And one of those messages, and, you know, one I've talked about a lot, is AI. Both parties, man, they're missing the message. They're missing the point on this right now. And I have a guest on the show coming up next, Andrew Egger, my buddy from The Bulwark. He is here to talk about how both parties are missing out on the AI conversation that everyday Americans are having. That's coming up next. Stay tuned.

Speaker 1: With me on today's episode is my buddy Andrew Egger, who writes Morning Shots for The Bulwark, the morning newsletter. It's great. I actually do subscribe and read that. Thank you for being here, Andrew.

Speaker 2: Hey, Ryan, I do.
Speaker 1: You wrote a really interesting piece called "Everybody Hates AI," and you point to the NBC poll. AI is one of the few policy areas where there's really no consensus on which party is more trusted. And you wrote this: "That's what opportunity looks like. The party that can win the argument over AI is set up for a political windfall, and then will be tasked with shepherding the country through the massive disruption toward a prosperous rather than a dystopian AI-integrated future, if possible. No pressure. But so far, it seems like basically everybody is blowing it." So the question is: are they blowing it because it's so new, or are they blowing it because they don't know what side their side should be taking?

Speaker 2: I would say, right now, most of what you are seeing when it comes to AI policy is, you have this brand-new thing. It's going to be uniquely disruptive economically; it's set up to be uniquely disruptive politically. I think a lot of people are just waking up to this.
323 00:18:02,359 --> 00:18:04,560 Speaker 2: And that's what that poll kind of demonstrated, is that, 324 00:18:04,640 --> 00:18:08,119 Speaker 2: you know, voters have their views on which party tends 325 00:18:08,160 --> 00:18:12,359 Speaker 2: to be more trusted on, for instance, crime or immigration. 326 00:18:12,560 --> 00:18:14,600 Speaker 2: You know they're going to go for the Republicans, 327 00:18:14,760 --> 00:18:16,920 Speaker 2: or, you know, voters tend to give Democrats the edge 328 00:18:16,920 --> 00:18:22,080 Speaker 2: on healthcare, you know, civil rights, constitutional protections, 329 00:18:22,080 --> 00:18:25,160 Speaker 2: those sorts of things. AI is a complete no man's land. 330 00:18:25,640 --> 00:18:27,679 Speaker 2: And I think that not only is it a 331 00:18:27,680 --> 00:18:30,200 Speaker 2: complete no man's land in terms of people knowing which party 332 00:18:30,000 --> 00:18:32,040 Speaker 2: they, you know, want to cling to 333 00:18:32,160 --> 00:18:36,480 Speaker 2: on it, but even the idea that the 334 00:18:36,520 --> 00:18:38,480 Speaker 2: issue is becoming so much more salient is new. I 335 00:18:38,480 --> 00:18:40,840 Speaker 2: think a lot of policymakers are waking up and realizing, 336 00:18:40,880 --> 00:18:44,640 Speaker 2: wait a minute, there's this enormous latent political energy out 337 00:18:44,640 --> 00:18:47,199 Speaker 2: here, where this issue is only becoming more and 338 00:18:47,280 --> 00:18:50,240 Speaker 2: more important, and we need to 339 00:18:50,240 --> 00:18:52,600 Speaker 2: figure out how we're going to take a stance on it.
340 00:18:52,600 --> 00:18:54,840 Speaker 2: Part of the problem is that 341 00:18:54,920 --> 00:18:57,879 Speaker 2: we are really just seeing policymakers kind of fall back 342 00:18:58,000 --> 00:19:02,240 Speaker 2: on frames of understanding, you know, from the last whole 343 00:19:02,480 --> 00:19:05,280 Speaker 2: set of technologies in order to kind of fire 344 00:19:05,280 --> 00:19:07,919 Speaker 2: the opening shots of AI regulation here. So there's a 345 00:19:07,960 --> 00:19:10,480 Speaker 2: lot of, on the Republican side, there's a lot of 346 00:19:10,520 --> 00:19:12,840 Speaker 2: just, let's just let it rip, baby, you know, 347 00:19:12,880 --> 00:19:15,560 Speaker 2: free market type stuff. Get the data centers in there, 348 00:19:16,800 --> 00:19:20,359 Speaker 2: let's just go. And on the Democratic side, you see 349 00:19:20,400 --> 00:19:22,520 Speaker 2: a lot of, you know, we need 350 00:19:22,560 --> 00:19:24,439 Speaker 2: to restrain this thing quick and fast. And there are 351 00:19:24,440 --> 00:19:27,560 Speaker 2: a lot of different, kind of, in my view, 352 00:19:27,600 --> 00:19:31,040 Speaker 2: bad policy levers that they're reaching for to do that.
353 00:19:31,080 --> 00:19:32,359 Speaker 2: One of which, the one I wrote about in the 354 00:19:32,400 --> 00:19:35,800 Speaker 2: piece, is this idea of sort of protecting 355 00:19:35,880 --> 00:19:40,520 Speaker 2: professional classes, like doctors or lawyers or therapists, 356 00:19:40,640 --> 00:19:42,720 Speaker 2: or, you know, basically anybody that needs an 357 00:19:42,760 --> 00:19:46,119 Speaker 2: occupational license from the government to practice in a given 358 00:19:46,200 --> 00:19:48,679 Speaker 2: state as a human being; basically just sort of 359 00:19:48,680 --> 00:19:52,440 Speaker 2: freezing AI out from sharing any views on any 360 00:19:52,480 --> 00:19:54,240 Speaker 2: of these issues. 361 00:19:54,280 --> 00:19:55,280 Speaker 1: That's like one way. 362 00:19:55,280 --> 00:19:57,840 Speaker 2: Another thing is these sorts of copyright lawsuits, the 363 00:19:57,880 --> 00:20:01,160 Speaker 2: idea of, you know, basically holding these companies 364 00:20:01,160 --> 00:20:04,600 Speaker 2: accountable for the AIs having had their training 365 00:20:04,680 --> 00:20:08,159 Speaker 2: data, you know, pulled from perhaps copyrighted works. There 366 00:20:08,160 --> 00:20:10,200 Speaker 2: are a lot of current lawsuits going on about that. 367 00:20:10,720 --> 00:20:13,560 Speaker 2: But, you know, there are various sorts of policy 368 00:20:14,040 --> 00:20:16,520 Speaker 2: ideas in the pipeline about sort of using those to 369 00:20:16,560 --> 00:20:19,720 Speaker 2: divert big amounts of the profits 370 00:20:19,760 --> 00:20:22,360 Speaker 2: from these companies back to, you know, the people whose 371 00:20:22,400 --> 00:20:23,280 Speaker 2: data it was trained on. 372 00:20:23,359 --> 00:20:24,000 Speaker 1: Things like that.
373 00:20:24,400 --> 00:20:26,880 Speaker 2: And then there are, you know, ideas out there even more disruptive 374 00:20:26,920 --> 00:20:29,199 Speaker 2: than that, like just sort of 375 00:20:29,200 --> 00:20:31,359 Speaker 2: banning data centers altogether. And this is actually not just 376 00:20:31,400 --> 00:20:33,520 Speaker 2: a Democratic thing. You've seen people like Ron DeSantis in 377 00:20:33,560 --> 00:20:37,680 Speaker 2: Florida throw himself... Yeah, yeah, there's 378 00:20:37,720 --> 00:20:39,760 Speaker 2: sort of, like, a populist energy in the Republican Party 379 00:20:39,800 --> 00:20:43,440 Speaker 2: as well that is basically just, no AI, 380 00:20:43,600 --> 00:20:46,040 Speaker 2: thanks, it's not for us. You know, we've 381 00:20:46,160 --> 00:20:47,520 Speaker 2: taken a look at it, and we don't 382 00:20:47,520 --> 00:20:50,359 Speaker 2: really like it. There's an enormous amount of political, like, 383 00:20:50,640 --> 00:20:52,760 Speaker 2: energy out there behind this. I've talked to a number 384 00:20:52,800 --> 00:20:55,360 Speaker 2: of people who sort of do public opinion research 385 00:20:55,640 --> 00:20:58,120 Speaker 2: in this field. And I think that the take 386 00:20:58,119 --> 00:21:01,119 Speaker 2: you hear from those sorts of people is, like, however 387 00:21:01,160 --> 00:21:02,920 Speaker 2: bad you think it is, it's worse. People just don't 388 00:21:02,960 --> 00:21:06,080 Speaker 2: like this thing. There's going to be 389 00:21:06,280 --> 00:21:08,919 Speaker 2: a lot of appetite to maybe crush it. And 390 00:21:08,960 --> 00:21:10,480 Speaker 2: that's, I think, what politicians are going to 391 00:21:10,520 --> 00:21:11,280 Speaker 2: try to channel. 392 00:21:11,240 --> 00:21:12,880 Speaker 1: Yeah. And so I want to get to the two things 393 00:21:12,960 --> 00:21:17,119 Speaker 1: you just mentioned.
One is, the difference between state Republicans and 394 00:21:17,200 --> 00:21:21,400 Speaker 1: federal Republicans is a world apart. Like, you know, 395 00:21:21,600 --> 00:21:23,640 Speaker 1: it's very funny that the line coming 396 00:21:23,680 --> 00:21:26,679 Speaker 1: out of Washington that Republicans were using was, if you 397 00:21:26,840 --> 00:21:32,400 Speaker 1: don't regulate this, then California will. But the truth is Arkansas, Tennessee, Florida, Texas, 398 00:21:32,440 --> 00:21:35,159 Speaker 1: they all regulate AI, all these deep red states. It 399 00:21:35,200 --> 00:21:37,439 Speaker 1: is not just a California thing at all. Not that there 400 00:21:37,440 --> 00:21:40,560 Speaker 1: shouldn't be a federal standard, but the Republican states want 401 00:21:40,560 --> 00:21:43,680 Speaker 1: to regulate just as badly as Democratic states do. And I think, 402 00:21:43,760 --> 00:21:47,600 Speaker 1: and maybe I'm wrong, I think that's because a lot 403 00:21:47,640 --> 00:21:51,719 Speaker 1: of Republicans in Congress are older. It's not, you 404 00:21:51,760 --> 00:21:54,120 Speaker 1: know, it's not at the top of the list for them. 405 00:21:54,480 --> 00:21:56,320 Speaker 1: And two is, a lot of them are taking notes 406 00:21:56,480 --> 00:21:59,360 Speaker 1: and cues from Trump, who is super pro AI. He's 407 00:21:59,400 --> 00:22:02,520 Speaker 1: the AI guy, isn't he? What is your opinion on that? 408 00:22:03,160 --> 00:22:06,680 Speaker 2: Yeah, I think that's basically correct. The White House, 409 00:22:06,720 --> 00:22:10,440 Speaker 2: this White House, in part because of sort of campaign 410 00:22:11,040 --> 00:22:13,880 Speaker 2: alliances that sort of sprung up during twenty twenty four.
411 00:22:14,040 --> 00:22:16,639 Speaker 2: I mean, you've seen a much more kind of muscular 412 00:22:17,160 --> 00:22:21,040 Speaker 2: tech coalition, the sort of new tech right, that 413 00:22:21,040 --> 00:22:23,200 Speaker 2: has the President's ear on a lot of this stuff, 414 00:22:23,200 --> 00:22:25,560 Speaker 2: a lot of policy things, and AI is maybe 415 00:22:25,560 --> 00:22:27,639 Speaker 2: the biggest one. Last year, the White House put out 416 00:22:27,720 --> 00:22:30,080 Speaker 2: an action plan that was basically along the 417 00:22:30,080 --> 00:22:32,000 Speaker 2: lines of what I was talking about earlier, where, 418 00:22:32,000 --> 00:22:34,399 Speaker 2: you know, they're kind of couching it in national security terms. 419 00:22:34,680 --> 00:22:37,199 Speaker 2: Not like US versus California, but US 420 00:22:37,240 --> 00:22:39,040 Speaker 2: versus China is kind of more of 421 00:22:39,080 --> 00:22:41,680 Speaker 2: the frame there, where it's, you know, they're developing 422 00:22:41,720 --> 00:22:43,560 Speaker 2: their models over there, we're developing 423 00:22:43,160 --> 00:22:44,159 Speaker 1: our models over here. 424 00:22:44,320 --> 00:22:46,439 Speaker 2: We're lucky enough to have all these companies, you know, 425 00:22:46,520 --> 00:22:49,760 Speaker 2: be American companies, and we basically just need to focus 426 00:22:49,760 --> 00:22:52,480 Speaker 2: on unleashing them. Obviously, that's been complicated in a couple 427 00:22:52,480 --> 00:22:55,480 Speaker 2: of different ways recently. I mean, the Department of 428 00:22:55,800 --> 00:22:59,000 Speaker 2: Defense is currently in a big fight with Anthropic, one 429 00:22:59,000 --> 00:23:01,600 Speaker 2: of the leading AI companies, over some use policies.
We 430 00:23:01,640 --> 00:23:03,359 Speaker 2: can set that aside, but that's been kind of the 431 00:23:03,400 --> 00:23:05,920 Speaker 2: White House's line. And I think that that sort 432 00:23:05,960 --> 00:23:09,600 Speaker 2: of national security line is another part of why there's 433 00:23:09,720 --> 00:23:11,800 Speaker 2: sort of more of an appeal for some, like 434 00:23:11,840 --> 00:23:15,520 Speaker 2: you say, older lawmakers, for whom that's one of their primary 435 00:23:15,560 --> 00:23:17,959 Speaker 2: frames for viewing the world. The other part of it 436 00:23:18,000 --> 00:23:20,880 Speaker 2: is just that, I mean, this industry is spending a 437 00:23:20,960 --> 00:23:24,080 Speaker 2: lot of money lobbying on this topic. So it is 438 00:23:24,119 --> 00:23:25,840 Speaker 2: sort of shaping up as well to be one of 439 00:23:25,880 --> 00:23:28,760 Speaker 2: these things where, you know, they realize how much sort 440 00:23:28,800 --> 00:23:32,000 Speaker 2: of, like, insurgent populist backlash is building up against them. 441 00:23:32,200 --> 00:23:33,840 Speaker 2: They're like, wow, we better get out ahead of this 442 00:23:33,840 --> 00:23:36,080 Speaker 2: thing before we just end up sort of, you know, 443 00:23:36,320 --> 00:23:39,320 Speaker 2: legislated out of existence, really. And so they are throwing 444 00:23:39,359 --> 00:23:41,560 Speaker 2: a lot of money around as well, for congressmen who might 445 00:23:41,600 --> 00:23:44,120 Speaker 2: want to take a more pro AI stance too. 446 00:23:44,320 --> 00:23:45,840 Speaker 2: Not that I say that's a bad thing. I'm very 447 00:23:45,840 --> 00:23:49,280 Speaker 2: pro AI myself, I should say. But it is 448 00:23:49,280 --> 00:23:52,520 Speaker 2: one of these weird dynamics of sort of populist energy 449 00:23:52,600 --> 00:23:55,080 Speaker 2: versus just a whole lot of money flying around in Washington. 450 00:23:55,160 --> 00:23:58,080 Speaker 1: Yeah, I'm pro AI to a point.
I do think 451 00:23:58,080 --> 00:24:01,080 Speaker 1: it needs some. It's not the Internet. It's different. It's a 452 00:24:01,119 --> 00:24:05,359 Speaker 1: different beast, and there should be some regulations around 453 00:24:05,359 --> 00:24:08,000 Speaker 1: it, considering that, like, you know, every once in a 454 00:24:08,040 --> 00:24:10,440 Speaker 1: while an AI developer goes, yeah, there's a forty percent 455 00:24:10,560 --> 00:24:13,959 Speaker 1: chance this ends humanity. I'm like, maybe we should have 456 00:24:14,000 --> 00:24:16,840 Speaker 1: a conversation. That's a bigger number than one or two. 457 00:24:18,320 --> 00:24:22,000 Speaker 1: And you know more than I do. But the funny 458 00:24:22,000 --> 00:24:24,480 Speaker 1: thing is that, as you mentioned the populist backlash, 459 00:24:24,520 --> 00:24:26,600 Speaker 1: there was a YouGov poll that came out just 460 00:24:26,640 --> 00:24:31,120 Speaker 1: an hour before this recording, and they asked voters 461 00:24:31,320 --> 00:24:34,679 Speaker 1: what industry has too much power in Washington. And AI 462 00:24:34,960 --> 00:24:44,199 Speaker 1: beat guns, Israel, evangelicals, unions. Like, to do that in 463 00:24:44,280 --> 00:24:48,000 Speaker 1: such a short period of time, to break people's generational 464 00:24:48,160 --> 00:24:51,760 Speaker 1: mindset of, like, oh, you know, we've heard forever that 465 00:24:51,840 --> 00:24:54,720 Speaker 1: the NRA, which is a functionally bankrupt organization that has 466 00:24:54,800 --> 00:24:58,640 Speaker 1: essentially no power anymore.
We've heard from Democrats for generations 467 00:24:58,880 --> 00:25:00,720 Speaker 1: the NRA has too much power, even though that is 468 00:25:00,760 --> 00:25:05,840 Speaker 1: not true anymore. For AI to break that mindset, of all these 469 00:25:05,960 --> 00:25:08,280 Speaker 1: organizations on the right and the left, and to be the 470 00:25:08,440 --> 00:25:11,280 Speaker 1: organization that has too much control, in the midst of the 471 00:25:11,280 --> 00:25:15,399 Speaker 1: Iran war, in the midst of everything, shows really that 472 00:25:15,480 --> 00:25:19,199 Speaker 1: there is an anxiety that is so real and palpable, 473 00:25:19,800 --> 00:25:23,040 Speaker 1: and the answer of just let it rip is not 474 00:25:23,320 --> 00:25:27,280 Speaker 1: going to suffice for average voters, especially if the unemployment 475 00:25:27,400 --> 00:25:31,359 Speaker 1: rate does inch up to ten percent or higher. 476 00:25:31,440 --> 00:25:33,400 Speaker 2: Yeah. And I would say this is one area where 477 00:25:33,400 --> 00:25:35,280 Speaker 2: the Trump administration, like I said, it's had a pretty 478 00:25:35,320 --> 00:25:37,760 Speaker 2: pro AI stance, and it has actually done some good 479 00:25:37,760 --> 00:25:39,760 Speaker 2: work on the margins of these anxieties. 480 00:25:39,800 --> 00:25:40,480 Speaker 1: Like, just a 481 00:25:40,400 --> 00:25:42,800 Speaker 2: couple of weeks ago, they basically rolled out an agreement 482 00:25:42,880 --> 00:25:46,879 Speaker 2: where one of the many sort of populist anxieties about 483 00:25:46,880 --> 00:25:51,119 Speaker 2: AI is this stuff about energy costs and water usage. Right, 484 00:25:51,160 --> 00:25:55,439 Speaker 2: I mean, these data centers, they suck up electricity and 485 00:25:55,480 --> 00:25:57,679 Speaker 2: they create a lot more demand on it in a 486 00:25:57,680 --> 00:25:58,200 Speaker 2: given grid. 487 00:25:58,240 --> 00:25:59,000 Speaker 1: They raise prices.
488 00:25:59,000 --> 00:26:01,800 Speaker 2: So the White House rolled out basically a policy that 489 00:26:01,800 --> 00:26:03,560 Speaker 2: is sort of a handshake agreement with some of 490 00:26:03,560 --> 00:26:07,600 Speaker 2: these companies to shoulder more of the costs of their 491 00:26:07,640 --> 00:26:10,160 Speaker 2: own data centers, for some of these hyperscaling companies. 492 00:26:10,720 --> 00:26:13,400 Speaker 2: That in theory addresses that one component. But of course 493 00:26:13,400 --> 00:26:15,719 Speaker 2: that's only one component, right? And as you're kind of mentioning, 494 00:26:15,960 --> 00:26:18,200 Speaker 2: I think that there is this sort of belated 495 00:26:18,560 --> 00:26:22,760 Speaker 2: but now very real sort of societal awakening to the 496 00:26:22,920 --> 00:26:25,359 Speaker 2: idea that, whereas even just a couple of 497 00:26:25,440 --> 00:26:28,159 Speaker 2: years ago, all of this stuff about, you know, 498 00:26:29,040 --> 00:26:32,760 Speaker 2: mass displacement of employment and things like 499 00:26:32,800 --> 00:26:36,159 Speaker 2: that really did seem like kind of a, I don't know, 500 00:26:36,200 --> 00:26:38,639 Speaker 2: kind of a hidden ball trick or 501 00:26:38,680 --> 00:26:40,639 Speaker 2: something like that. It's like, oh, these companies are just 502 00:26:40,760 --> 00:26:44,080 Speaker 2: sort of, you know, making these silly sounding promises to 503 00:26:44,160 --> 00:26:45,200 Speaker 2: hype up their own products. 504 00:26:45,200 --> 00:26:46,680 Speaker 1: Nobody really thinks this is going to happen. 505 00:26:46,720 --> 00:26:48,880 Speaker 2: People are really starting to think it is actually going 506 00:26:48,880 --> 00:26:51,520 Speaker 2: to happen, and it is creating an enormous amount of anxiety.
507 00:26:51,640 --> 00:26:55,439 Speaker 2: I mean, I've been kind of slow waking up to 508 00:26:55,480 --> 00:26:57,680 Speaker 2: all this stuff myself in just the last couple of months. 509 00:26:57,760 --> 00:27:00,159 Speaker 2: I mean, we were hardly thinking about AI, at 510 00:27:00,200 --> 00:27:02,760 Speaker 2: least I was hardly thinking about AI, this time last year. 511 00:27:02,800 --> 00:27:04,240 Speaker 2: A lot of people were thinking a lot about it. 512 00:27:04,680 --> 00:27:06,800 Speaker 2: But now, you know, you turn around and this could 513 00:27:06,840 --> 00:27:10,480 Speaker 2: end up being really the key policy issue of 514 00:27:10,560 --> 00:27:12,159 Speaker 2: the twenty twenty eight election. I mean, like, we 515 00:27:12,520 --> 00:27:17,200 Speaker 2: may already be a couple of years into some sort 516 00:27:17,240 --> 00:27:21,080 Speaker 2: of AI transition at that point. And 517 00:27:21,560 --> 00:27:22,919 Speaker 2: the thing that I keep coming back to is that 518 00:27:22,960 --> 00:27:25,520 Speaker 2: nobody really has, like, a full plan for that. Everybody's 519 00:27:25,560 --> 00:27:28,320 Speaker 2: trying to kind of, like, you know, stick their pitcher 520 00:27:28,400 --> 00:27:30,919 Speaker 2: down into the stream, to pull a little bit 521 00:27:30,920 --> 00:27:33,720 Speaker 2: of that sentiment out and channel it toward one political 522 00:27:33,800 --> 00:27:35,359 Speaker 2: end or another. But nobody seems to have, like, a 523 00:27:35,359 --> 00:27:39,639 Speaker 2: real vision for what a society that interacts with 524 00:27:39,680 --> 00:27:42,840 Speaker 2: this extremely powerful technology in a healthy way actually would 525 00:27:42,840 --> 00:27:43,199 Speaker 2: look like.
526 00:27:43,520 --> 00:27:45,600 Speaker 1: Andrew, I will say, if I have any talent in 527 00:27:45,640 --> 00:27:48,520 Speaker 1: this world, it's picking up political trends. Like, I really 528 00:27:49,040 --> 00:27:51,439 Speaker 1: am actually good at it. And I sat with a 529 00:27:51,760 --> 00:27:54,520 Speaker 1: top member of the Trump administration in twenty twenty three, 530 00:27:54,640 --> 00:27:57,399 Speaker 1: before the election, and I said, you have to have 531 00:27:57,480 --> 00:27:59,600 Speaker 1: an answer on this. It is the most important issue 532 00:27:59,600 --> 00:28:02,679 Speaker 1: coming up. And they point blank looked at me and 533 00:28:02,720 --> 00:28:04,080 Speaker 1: said, it's not going to be a problem. It's not 534 00:28:04,080 --> 00:28:06,800 Speaker 1: going to displace jobs. And I was like, you're wrong. Like, 535 00:28:06,840 --> 00:28:09,040 Speaker 1: I don't know how else to tell you that you're wrong. 536 00:28:09,880 --> 00:28:13,080 Speaker 1: And I had Ro Khanna on this show, who's very, 537 00:28:13,400 --> 00:28:16,480 Speaker 1: like, hawkish on AI. He's actually getting a primary challenge 538 00:28:16,520 --> 00:28:20,040 Speaker 1: from the AI companies about it. And he said on 539 00:28:20,080 --> 00:28:23,880 Speaker 1: the show that this was the goal for him: 540 00:28:24,240 --> 00:28:27,520 Speaker 1: we need to tax AI to do redistribution of 541 00:28:27,560 --> 00:28:29,960 Speaker 1: wealth, where everyone gets a check, because no one will 542 00:28:30,000 --> 00:28:34,120 Speaker 1: have jobs. And David Shor, it's like Andrew Yang's idea 543 00:28:34,160 --> 00:28:36,119 Speaker 1: and has been for a while.
David Shor put out 544 00:28:36,119 --> 00:28:39,560 Speaker 1: a study, the progressive pollster, he's very smart, and 545 00:28:39,600 --> 00:28:43,320 Speaker 1: he said that voters care more about a job 546 00:28:43,400 --> 00:28:45,800 Speaker 1: than a handout. Like, they don't want a handout; they 547 00:28:45,840 --> 00:28:50,960 Speaker 1: want some kind of government backed job. And 548 00:28:51,160 --> 00:28:55,560 Speaker 1: even a plurality of Republicans and Trump voters think it 549 00:28:55,600 --> 00:28:59,040 Speaker 1: is more important that AI companies be taxed for the 550 00:28:59,160 --> 00:29:02,200 Speaker 1: jobs that they destroy, rather 551 00:29:02,280 --> 00:29:04,760 Speaker 1: than be able to keep the profits. I do think we're 552 00:29:04,800 --> 00:29:07,920 Speaker 1: in a different world than we were ever in, because 553 00:29:07,960 --> 00:29:09,720 Speaker 1: I do think this is going to split across party 554 00:29:09,760 --> 00:29:12,160 Speaker 1: lines in a profound way. 555 00:29:12,240 --> 00:29:14,160 Speaker 2: Yeah. And part of it is just that it all 556 00:29:14,200 --> 00:29:17,960 Speaker 2: remains so opaque, right, so vague. Nobody even really knows 557 00:29:18,280 --> 00:29:19,880 Speaker 2: what the impacts are going to be, where they're going 558 00:29:19,920 --> 00:29:22,680 Speaker 2: to be felt first, how quickly they will arrive, because 559 00:29:22,720 --> 00:29:24,760 Speaker 2: we're all still just sort of groping 560 00:29:24,800 --> 00:29:27,440 Speaker 2: our way toward even what the technologies are going to 561 00:29:27,440 --> 00:29:29,400 Speaker 2: be capable of a year from now or three years 562 00:29:29,400 --> 00:29:31,560 Speaker 2: from now.
I mean, like, even among the people who 563 00:29:31,600 --> 00:29:36,080 Speaker 2: are building these things, views diverge wildly about how much 564 00:29:36,280 --> 00:29:39,880 Speaker 2: more quickly this can continue to scale, right, and 565 00:29:39,960 --> 00:29:43,360 Speaker 2: where the bottlenecks are going to be. You know, 566 00:29:43,360 --> 00:29:46,600 Speaker 2: an economy can sort 567 00:29:46,600 --> 00:29:50,520 Speaker 2: of trundle along inefficiently, without perfect allocation of capital, 568 00:29:50,720 --> 00:29:53,480 Speaker 2: for a long, long, long time because of various, you know, 569 00:29:53,720 --> 00:29:55,880 Speaker 2: actual human interests that are involved. And this seems like 570 00:29:55,920 --> 00:29:57,600 Speaker 2: a situation where there's going to be a ton of that. 571 00:29:57,880 --> 00:30:01,080 Speaker 2: And so the question is, can we manage 572 00:30:01,080 --> 00:30:03,560 Speaker 2: that sort of situation?
Can we allow these 573 00:30:03,600 --> 00:30:07,360 Speaker 2: technologies to continue to be deployed while also finding different 574 00:30:07,360 --> 00:30:10,320 Speaker 2: ways to sort of blunt the harms, allow 575 00:30:10,400 --> 00:30:12,880 Speaker 2: the unleashing of the technology and the productivity 576 00:30:12,880 --> 00:30:14,400 Speaker 2: and all of those sorts of things, but blunt the 577 00:30:14,440 --> 00:30:17,760 Speaker 2: harms? Or is it actually a situation where 578 00:30:17,880 --> 00:30:20,680 Speaker 2: you're better off without it, where 579 00:30:20,680 --> 00:30:24,200 Speaker 2: it's actually fine for these different politicians to 580 00:30:24,280 --> 00:30:26,920 Speaker 2: basically subject this technology to a death of a thousand 581 00:30:27,000 --> 00:30:30,720 Speaker 2: regulatory cuts? Nobody really knows. 582 00:30:30,720 --> 00:30:33,240 Speaker 2: But they do know one thing, which is that the 583 00:30:33,280 --> 00:30:36,160 Speaker 2: hatred for it out there, the anxiety about it 584 00:30:36,160 --> 00:30:38,560 Speaker 2: out there, that is one hundred percent real. Like, that's 585 00:30:38,600 --> 00:30:41,240 Speaker 2: not just some possible future development. 586 00:30:41,320 --> 00:30:42,120 Speaker 1: That's there now. 587 00:30:42,320 --> 00:30:43,840 Speaker 2: And I think that's part of why we're starting to 588 00:30:43,840 --> 00:30:47,240 Speaker 2: see this wave of, like I said, different sorts 589 00:30:47,280 --> 00:30:50,400 Speaker 2: of pretty hostile AI policy legislation out there. 590 00:30:50,040 --> 00:30:53,320 Speaker 1: And it's going to be bigger.
The AI PACs, 591 00:30:53,400 --> 00:30:57,800 Speaker 1: the alleged AI-affiliated PACs, are spending millions on several 592 00:30:57,920 --> 00:31:00,200 Speaker 1: races because they want to put the fear of God 593 00:31:00,240 --> 00:31:02,800 Speaker 1: in these politicians. There's New York Twelve. If you haven't covered 594 00:31:02,840 --> 00:31:04,960 Speaker 1: the New York Twelve primary, you should. It is the 595 00:31:05,040 --> 00:31:07,840 Speaker 1: race for Jerry Nadler's old seat. There's a guy 596 00:31:07,960 --> 00:31:11,200 Speaker 1: named Alex Bores. He's an assemblyman. He worked for Palantir 597 00:31:11,240 --> 00:31:13,840 Speaker 1: and he wrote the AI regulation for New York State. 598 00:31:14,280 --> 00:31:17,880 Speaker 1: They are spending, I think, north of fifteen 599 00:31:17,920 --> 00:31:21,440 Speaker 1: million dollars to defeat him. He's running against John 600 00:31:21,520 --> 00:31:27,240 Speaker 1: Kennedy's grandson and George Conway, I think. You know, like a great, 601 00:31:27,280 --> 00:31:30,160 Speaker 1: great parade of people. But he's actually a serious legislator, 602 00:31:30,200 --> 00:31:31,960 Speaker 1: and even though I disagree with him on like 603 00:31:32,160 --> 00:31:35,040 Speaker 1: seventy five percent of things, he's serious. And like I 604 00:31:35,080 --> 00:31:38,000 Speaker 1: keep telling people, he has to win, if only so 605 00:31:38,320 --> 00:31:42,440 Speaker 1: the conversation doesn't end. Like, if he wins, the fear 606 00:31:42,600 --> 00:31:46,400 Speaker 1: of all this money dials down a lot, and then 607 00:31:46,400 --> 00:31:49,320 Speaker 1: we could actually have a smart conversation. If he loses, 608 00:31:49,880 --> 00:31:51,600 Speaker 1: a lot of people are going to shut up out 609 00:31:51,600 --> 00:31:55,280 Speaker 1: of fear. But that's really the story of this election cycle.
More 610 00:31:55,280 --> 00:31:57,960 Speaker 1: than who wins the House or who wins the Senate, 611 00:31:58,000 --> 00:32:00,720 Speaker 1: I think the real story is this money. And I've been talking 612 00:32:00,720 --> 00:32:02,680 Speaker 1: a lot about money in politics on this podcast these last 613 00:32:02,720 --> 00:32:07,840 Speaker 1: episodes: AIPAC, crypto, and AI, specifically AI spending. The money 614 00:32:07,880 --> 00:32:12,080 Speaker 1: is there so that whoever wins the House really controls what is allowed 615 00:32:12,120 --> 00:32:14,040 Speaker 1: to be talked about and regulated or not. And that's 616 00:32:14,080 --> 00:32:17,640 Speaker 1: why, like, Bores has to win, so we could actually 617 00:32:17,640 --> 00:32:21,400 Speaker 1: have an adult conversation. Does that make any sense? Yeah, no, 618 00:32:21,880 --> 00:32:22,840 Speaker 1: I see what you're talking about. 619 00:32:23,040 --> 00:32:26,080 Speaker 2: I'm also very, very interested in the way that this 620 00:32:26,120 --> 00:32:27,800 Speaker 2: plays out on the Republican side. I mean, I was 621 00:32:27,840 --> 00:32:30,760 Speaker 2: mentioning earlier the sort of split we talked about, the 622 00:32:31,040 --> 00:32:34,000 Speaker 2: kind of populist versus institutionalist, or maybe institutionalist is not 623 00:32:34,040 --> 00:32:36,480 Speaker 2: the right word, but the anti versus pro AI side 624 00:32:36,560 --> 00:32:39,360 Speaker 2: among the Republicans. And like, a lot of it's concentrated 625 00:32:39,400 --> 00:32:42,360 Speaker 2: at the state level, but that's not guaranteed to remain. 626 00:32:42,480 --> 00:32:43,640 Speaker 2: I mean, this has been one of the kind of 627 00:32:43,680 --> 00:32:47,160 Speaker 2: fault lines in the MAGA coalition for quite a while.
628 00:32:47,320 --> 00:32:50,640 Speaker 2: It's just this sort of, like, tech futurist guys, the 629 00:32:50,640 --> 00:32:53,360 Speaker 2: Peter Thiels of the world, on the one hand, and 630 00:32:53,400 --> 00:32:57,200 Speaker 2: the more kind of, like, return-to-tradition social cons 631 00:32:57,240 --> 00:32:59,680 Speaker 2: on the other side of that particular split. And it 632 00:32:59,680 --> 00:33:02,240 Speaker 2: really felt like a sideshow for a lot of 633 00:33:02,280 --> 00:33:04,200 Speaker 2: the, I mean, certainly for the first Trump term, for 634 00:33:04,280 --> 00:33:05,720 Speaker 2: a lot of what we've seen so far in the 635 00:33:05,720 --> 00:33:07,920 Speaker 2: second Trump term. In terms of most of the policies 636 00:33:07,920 --> 00:33:10,560 Speaker 2: that have been front and center, they tend to agree, 637 00:33:10,560 --> 00:33:12,520 Speaker 2: they tend to align. There have been little flare-ups, 638 00:33:12,560 --> 00:33:14,479 Speaker 2: like, you know, the fight over H-1B visas 639 00:33:14,560 --> 00:33:17,880 Speaker 2: or something like that. But we're really looking at a 640 00:33:17,920 --> 00:33:20,880 Speaker 2: world where, if this does in fact become kind of 641 00:33:20,920 --> 00:33:23,960 Speaker 2: like one of the defining policy issues of the back 642 00:33:24,000 --> 00:33:26,600 Speaker 2: half of Trump's second term, I don't know how that 643 00:33:26,640 --> 00:33:29,160 Speaker 2: coalition hangs together on this issue, because they just seem 644 00:33:29,160 --> 00:33:34,280 Speaker 2: so diametrically on opposite sides of the 645 00:33:34,480 --> 00:33:37,920 Speaker 2: very, very basic fundamentals of the issue here. 646 00:33:37,920 --> 00:33:39,720 Speaker 1: Well, I'll tell you a secret.
So this is 647 00:33:39,720 --> 00:33:42,040 Speaker 1: the secret in the Republican primary right now, because I've 648 00:33:42,080 --> 00:33:47,440 Speaker 1: been seeing this nonstop: the AI guys give buckets to the 649 00:33:47,480 --> 00:33:49,840 Speaker 1: Club for Growth. The Club for Growth has a super 650 00:33:49,840 --> 00:33:52,680 Speaker 1: close relationship with Trump. Trump endorses the same primary candidates 651 00:33:52,760 --> 00:33:55,720 Speaker 1: the Club for Growth does, and everybody wins. So we're 652 00:33:55,760 --> 00:33:59,040 Speaker 1: not going to have a serious conversation. We have 653 00:33:59,200 --> 00:34:03,680 Speaker 1: not yet had that, because the Club, AI, and Trump are 654 00:34:03,720 --> 00:34:07,400 Speaker 1: all in cahoots. They all, like, everyone's singing Kumbaya together. 655 00:34:07,960 --> 00:34:10,959 Speaker 1: Once that ends, whether through Trump's presidency ending, or through 656 00:34:11,080 --> 00:34:13,439 Speaker 1: I don't know what, or a backlash that's too big 657 00:34:13,480 --> 00:34:17,600 Speaker 1: to withstand. But once that ends, then we'll have the conversation. 658 00:34:17,800 --> 00:34:21,359 Speaker 1: But so often when a party has the presidency, and 659 00:34:21,400 --> 00:34:25,080 Speaker 1: this goes back to Obama, when Obama was president, when Bush was president, 660 00:34:25,160 --> 00:34:28,120 Speaker 1: you can't have dynamic conversations outside of what the White 661 00:34:28,160 --> 00:34:30,520 Speaker 1: House is saying, because everyone takes cues from the White House. 662 00:34:31,280 --> 00:34:34,160 Speaker 1: So yeah, I'm frustrated on the Republican side, deeply, 663 00:34:34,360 --> 00:34:38,480 Speaker 1: as somebody who does politics for a living. The Democratic side, 664 00:34:38,520 --> 00:34:40,799 Speaker 1: I think, has a lot of energy, and 665 00:34:40,880 --> 00:34:43,000 Speaker 1: they're open to a lot of conversations.
666 00:34:43,040 --> 00:34:44,800 Speaker 1: A lot of bad ideas are coming out of the Democratic 667 00:34:44,880 --> 00:34:47,719 Speaker 1: side too, but they're having a conversation, which is what the 668 00:34:48,360 --> 00:34:52,760 Speaker 1: American people want. And there's data showing that, like, economic 669 00:34:52,800 --> 00:34:56,440 Speaker 1: populism tied to AI means a four-point swing in 670 00:34:56,480 --> 00:34:59,279 Speaker 1: Democrats' favor nationally. You go from a Trump twenty 671 00:34:59,280 --> 00:35:01,360 Speaker 1: twenty-four election to an Obama twenty twelve election, 672 00:35:01,719 --> 00:35:04,480 Speaker 1: specifically on AI. So long as you don't make the 673 00:35:04,480 --> 00:35:07,239 Speaker 1: other policy positions, like, a grown man should be able 674 00:35:07,239 --> 00:35:08,880 Speaker 1: to shower with a twelve-year-old daughter, like, you 675 00:35:08,880 --> 00:35:10,600 Speaker 1: know what I mean, like, if he identifies as one. Like, 676 00:35:10,640 --> 00:35:12,480 Speaker 1: as long as you don't do that, like, actually, there's 677 00:35:12,520 --> 00:35:16,080 Speaker 1: a lot going for Democrats. Okay. Where could people 678 00:35:16,160 --> 00:35:18,200 Speaker 1: go to read your stuff, to hear more about you? 679 00:35:18,200 --> 00:35:20,080 Speaker 1: Because I think you put out some really, really interesting 680 00:35:20,080 --> 00:35:21,640 Speaker 1: stuff, your morning newsletter. 681 00:35:22,239 --> 00:35:24,720 Speaker 2: Yeah, thanks. So I'm the White House correspondent for The Bulwark. 682 00:35:24,840 --> 00:35:27,520 Speaker 2: I write our morning newsletter with Bill Kristol every day, 683 00:35:27,960 --> 00:35:30,279 Speaker 2: Monday through Friday. It's a free newsletter; it goes out at 684 00:35:30,280 --> 00:35:33,400 Speaker 2: the Bulwark dot com. And I'm also, I'm also all 685 00:35:33,440 --> 00:35:36,160 Speaker 2: over our YouTube, all over our Substack video stuff.
You know, 686 00:35:36,280 --> 00:35:38,080 Speaker 2: follow The Bulwark on YouTube if you feel like it, 687 00:35:38,320 --> 00:35:40,080 Speaker 2: or head over to Substack for our video stuff 688 00:35:40,080 --> 00:35:40,440 Speaker 2: there as well. 689 00:35:40,480 --> 00:35:42,920 Speaker 1: And he's on Twitter, Egger DC. Great guy. Do not 690 00:35:43,040 --> 00:35:45,480 Speaker 1: let The Bulwark scare you. He's somebody worth listening to. 691 00:35:45,760 --> 00:35:48,719 Speaker 1: Absolutely. All right, Andrew, thanks for this podcast. I appreciate it. 692 00:35:48,960 --> 00:35:54,560 Speaker 1: Thanks, Ryan. Now it's time for the Ask Me Anything segment. 693 00:35:54,600 --> 00:35:56,560 Speaker 1: If you want to be part of the Ask Me Anything segment, 694 00:35:56,640 --> 00:35:59,600 Speaker 1: email me, Ryan at Numbers Gamepodcast dot com. That's Ryan 695 00:35:59,640 --> 00:36:02,239 Speaker 1: at Numbers Gamepodcast dot com. I got a 696 00:36:02,239 --> 00:36:04,799 Speaker 1: lot of them. I love getting these. I think they 697 00:36:04,800 --> 00:36:07,719 Speaker 1: make the show so wonderful, and it makes me know 698 00:36:07,760 --> 00:36:10,520 Speaker 1: that someone's listening. Okay, this question comes from Eric. He 699 00:36:10,560 --> 00:36:13,279 Speaker 1: writes: Ryan, the House of Representatives has had four hundred and 700 00:36:13,320 --> 00:36:16,480 Speaker 1: thirty-five representatives since the early part of the nineteen 701 00:36:16,560 --> 00:36:20,680 Speaker 1: hundreds, nineteen twenty-nine. Since then, our population has increased 702 00:36:20,680 --> 00:36:22,439 Speaker 1: to over three hundred and thirty million. I'm a strong 703 00:36:22,480 --> 00:36:25,560 Speaker 1: advocate of expanding the number of representatives. Earlier this year, 704 00:36:25,680 --> 00:36:28,440 Speaker 1: many states did redistricting to increase Rs or Ds.
My 705 00:36:28,480 --> 00:36:31,120 Speaker 1: opinion is, why not simply increase the number 706 00:36:31,200 --> 00:36:34,280 Speaker 1: of representatives? I think the problem is the current reps 707 00:36:34,440 --> 00:36:37,360 Speaker 1: do not want to water down their power. In your opinion, 708 00:36:37,400 --> 00:36:39,839 Speaker 1: what is the correct number of representatives the US House 709 00:36:39,880 --> 00:36:42,000 Speaker 1: should have, and how do we get the House to 710 00:36:42,080 --> 00:36:44,520 Speaker 1: expand? You know, I talked to Matt Gaetz about this 711 00:36:44,600 --> 00:36:46,480 Speaker 1: when he was in Congress, and he just was always like, 712 00:36:47,680 --> 00:36:51,319 Speaker 1: his opinion always was, will the House operate better with 713 00:36:51,560 --> 00:36:54,880 Speaker 1: more representatives? And that was his big opposition to it. 714 00:36:55,800 --> 00:37:02,480 Speaker 1: We have one of the smaller, like, parliaments or houses 715 00:37:02,520 --> 00:37:07,160 Speaker 1: of representatives, the lower body, rather, of a parliament 716 00:37:07,280 --> 00:37:10,600 Speaker 1: or of a congress, in the world. We 717 00:37:10,640 --> 00:37:14,239 Speaker 1: are number twenty-four as far as the size of 718 00:37:14,280 --> 00:37:19,319 Speaker 1: our lower house of representatives goes. Other countries, like, you know, other 719 00:37:19,440 --> 00:37:22,280 Speaker 1: mid-size and small-sized countries, have much larger parliaments 720 00:37:22,320 --> 00:37:28,440 Speaker 1: than we do. France, Mexico, Japan, England, Ethiopia, Turkey, Germany, 721 00:37:28,880 --> 00:37:32,600 Speaker 1: they all have larger legislative bodies than ours. So 722 00:37:33,520 --> 00:37:37,760 Speaker 1: it's not like we would be out of step 723 00:37:37,760 --> 00:37:41,440 Speaker 1: with the world if we increased it.
But why we 724 00:37:41,520 --> 00:37:44,000 Speaker 1: don't increase it, I think, partially, is I don't think 725 00:37:44,040 --> 00:37:47,320 Speaker 1: congressmen feel a need to do it, and I don't 726 00:37:47,520 --> 00:37:52,239 Speaker 1: think that they think Congress will work any better if 727 00:37:52,320 --> 00:37:55,279 Speaker 1: you increase it. What do I think is the appropriate 728 00:37:55,320 --> 00:37:59,160 Speaker 1: population size? So I started looking at what other countries 729 00:37:59,280 --> 00:38:02,520 Speaker 1: have as far as population. We have about seven hundred 730 00:38:02,560 --> 00:38:06,239 Speaker 1: and sixty thousand people in every congressional district on average. 731 00:38:06,920 --> 00:38:10,600 Speaker 1: Germany has one hundred and sixteen thousand, England has one 732 00:38:10,719 --> 00:38:14,080 Speaker 1: hundred thousand, Japan has two hundred and seventy-two thousand. 733 00:38:14,600 --> 00:38:19,680 Speaker 1: Most peer nations are between two to three hundred thousand. 734 00:38:20,000 --> 00:38:25,640 Speaker 1: So I started looking at historical precedents, the cube root rule, 735 00:38:26,040 --> 00:38:30,719 Speaker 1: what ideal representation would look like, and most very 736 00:38:30,760 --> 00:38:34,000 Speaker 1: smart analyses said between four hundred and eighty thousand 737 00:38:34,520 --> 00:38:40,560 Speaker 1: and six hundred thousand people per district. Now, this is 738 00:38:40,960 --> 00:38:43,160 Speaker 1: purely an intellectual game. I don't know how you would 739 00:38:43,160 --> 00:38:46,799 Speaker 1: get to an actual number that would matter. But I 740 00:38:46,840 --> 00:38:49,239 Speaker 1: also started looking at the European Union; with their body, 741 00:38:49,239 --> 00:38:50,960 Speaker 1: they have one hundred million more people than us, but 742 00:38:51,760 --> 00:38:56,200 Speaker 1: we're going to get there with immigration.
So I had said, 743 00:38:56,280 --> 00:39:00,080 Speaker 1: I'd say a good number, one that would restore 744 00:39:00,120 --> 00:39:04,839 Speaker 1: more balance in Congress between representatives and the people 745 00:39:04,880 --> 00:39:08,719 Speaker 1: represented, would probably be about seven hundred and twenty-one. Right, 746 00:39:08,800 --> 00:39:11,840 Speaker 1: it's about the European Union's size with one extra seat, because 747 00:39:11,880 --> 00:39:16,400 Speaker 1: we can't have ties. And that would decrease the number 748 00:39:16,440 --> 00:39:19,439 Speaker 1: of people per district to the nineteen seventies level, about 749 00:39:19,480 --> 00:39:22,200 Speaker 1: four hundred and sixty thousand people per district. It would 750 00:39:22,239 --> 00:39:26,200 Speaker 1: just be much easier, I think, for people 751 00:39:26,239 --> 00:39:29,600 Speaker 1: to interact with their representatives. It would cut the population 752 00:39:29,719 --> 00:39:33,320 Speaker 1: per district almost in half, and I think it would give better 753 00:39:33,400 --> 00:39:39,120 Speaker 1: representation to states that, you know, that 754 00:39:39,200 --> 00:39:43,120 Speaker 1: are diverse. Because what happens when the population gap is 755 00:39:43,160 --> 00:39:46,040 Speaker 1: so high is that one city represents the entire state. 756 00:39:46,120 --> 00:39:48,760 Speaker 1: Like a state like Delaware: they have one congressperson 757 00:39:49,920 --> 00:39:52,359 Speaker 1: who gets all their votes out of the northern part 758 00:39:52,360 --> 00:39:55,040 Speaker 1: of the state. It's all Democrat.
If there was a 759 00:39:55,320 --> 00:39:57,440 Speaker 1: fair map, or if they 760 00:39:57,440 --> 00:39:59,480 Speaker 1: reduced the number to four hundred and sixty thousand and there was 761 00:39:59,480 --> 00:40:02,120 Speaker 1: a fair map, they'd actually have a Republican seat and 762 00:40:02,120 --> 00:40:05,480 Speaker 1: a Democratic seat, because southern Delaware is very Republican, but 763 00:40:05,520 --> 00:40:09,960 Speaker 1: they don't get any representation. Likewise, New Hampshire would have Republican seats, 764 00:40:09,920 --> 00:40:12,720 Speaker 1: Massachusetts would have Republican seats, Texas would have more Democratic 765 00:40:12,800 --> 00:40:16,040 Speaker 1: seats, if everything was done, you know, fairly, which, I'm saying, 766 00:40:16,040 --> 00:40:17,920 Speaker 1: that's a big if. And I was looking at some 767 00:40:17,960 --> 00:40:20,959 Speaker 1: states: Delaware would go from one to two seats, 768 00:40:21,000 --> 00:40:23,880 Speaker 1: Connecticut would go from five to eight, Idaho would go 769 00:40:23,960 --> 00:40:26,799 Speaker 1: from two to four, Louisiana would go from six to ten, 770 00:40:26,880 --> 00:40:30,120 Speaker 1: Texas would go from thirty to sixty-three. It would 771 00:40:30,120 --> 00:40:32,000 Speaker 1: be a bigger size. It's a hard thing to wrap 772 00:40:32,040 --> 00:40:34,480 Speaker 1: your mind around, but I would say about seven hundred 773 00:40:34,520 --> 00:40:38,279 Speaker 1: and twenty-one would be a good number. Anyway, it's 774 00:40:38,280 --> 00:40:41,640 Speaker 1: an interesting question. Okay, last question for the show. This 775 00:40:41,800 --> 00:40:45,160 Speaker 1: question comes from Ryan. He writes: I am a man. 776 00:40:45,719 --> 00:40:47,920 Speaker 1: I am a man. Oh, okay, because I had asked. 777 00:40:48,040 --> 00:40:49,279 Speaker 1: He wrote to me last time.
I said, I didn't know 778 00:40:49,320 --> 00:40:50,719 Speaker 1: if Ryan was a man or a woman, because there are 779 00:40:50,760 --> 00:40:53,120 Speaker 1: female Ryans, not many, but there are. Here: I am 780 00:40:53,160 --> 00:40:55,600 Speaker 1: a man. I know you read it on your pod 781 00:40:55,680 --> 00:40:59,160 Speaker 1: and you weren't sure of my name. My name is Ryan. Anyway, 782 00:40:59,200 --> 00:41:02,640 Speaker 1: you brought up your grandmother's meatball recipe on the 783 00:41:02,719 --> 00:41:04,960 Speaker 1: last pod. I love making meatballs. What can you 784 00:41:04,960 --> 00:41:10,920 Speaker 1: share from your grandma's recipe? Okay, this is probably the 785 00:41:10,960 --> 00:41:13,800 Speaker 1: most important question that was ever asked. I am Italian; 786 00:41:13,840 --> 00:41:18,920 Speaker 1: I'm fifty percent Sicilian. Making sauce, making meatballs, is like, 787 00:41:19,280 --> 00:41:22,200 Speaker 1: it's, it's ours. I mean, the smell of garlic and 788 00:41:22,200 --> 00:41:26,880 Speaker 1: olive oil is like my ancestral legacy. And oh, 789 00:41:26,960 --> 00:41:30,440 Speaker 1: you always have to make sauce with your meatballs. 790 00:41:30,480 --> 00:41:32,440 Speaker 1: I don't believe in just making meatballs. Really, honestly, 791 00:41:32,440 --> 00:41:34,239 Speaker 1: I don't know how to do it. It's this weird thing 792 00:41:34,320 --> 00:41:38,480 Speaker 1: Italians have, where, like, I can cook very easily for 793 00:41:38,560 --> 00:41:41,200 Speaker 1: eighty-five people; I do not know how to make 794 00:41:41,239 --> 00:41:43,200 Speaker 1: a meal for three people. I am 795 00:41:43,280 --> 00:41:46,160 Speaker 1: dumbfounded at the concept of what portions actually look like. 796 00:41:46,400 --> 00:41:50,400 Speaker 1: Everything is in bulk. Like, literally, Costco is just 797 00:41:50,520 --> 00:41:55,000 Speaker 1: Italian people's everyday life.
So this is how you make 798 00:41:55,080 --> 00:41:59,080 Speaker 1: my grandmother's meatball and sauce recipe. It's not like 799 00:41:59,120 --> 00:42:01,239 Speaker 1: we don't know; we're not making it up. It is what it is 800 00:42:01,280 --> 00:42:03,319 Speaker 1: every time. It's basically the same thing with a little 801 00:42:03,360 --> 00:42:06,880 Speaker 1: thing different here or there. It's a very basic thing, but 802 00:42:06,960 --> 00:42:09,160 Speaker 1: I'll explain it to you, since you asked and this 803 00:42:09,239 --> 00:42:11,440 Speaker 1: is Ask Me Anything. So, the first thing you want 804 00:42:11,480 --> 00:42:13,600 Speaker 1: to do: get a big pot, and you want to 805 00:42:13,840 --> 00:42:16,879 Speaker 1: coat the bottom of the pot with a good olive oil. 806 00:42:17,200 --> 00:42:18,920 Speaker 1: It shouldn't be dripping; it shouldn't be, like, an inch of 807 00:42:18,960 --> 00:42:20,880 Speaker 1: olive oil. You should just coat the bottom of the 808 00:42:20,880 --> 00:42:24,520 Speaker 1: pan a good amount. I don't measure a lot of things, 809 00:42:24,520 --> 00:42:26,960 Speaker 1: so that's that. And then you need to put chopped 810 00:42:27,040 --> 00:42:30,560 Speaker 1: garlic on the bottom of this. Listen to me very carefully: 811 00:42:31,239 --> 00:42:35,080 Speaker 1: do not burn the garlic. You brown the garlic. If 812 00:42:35,080 --> 00:42:37,239 Speaker 1: you burn the garlic, throw the whole thing away and 813 00:42:37,239 --> 00:42:42,080 Speaker 1: start over again. You cannot burn the garlic. Once the 814 00:42:42,080 --> 00:42:45,040 Speaker 1: garlic is brown, you add tomato paste. It's a small 815 00:42:45,040 --> 00:42:47,799 Speaker 1: little can you get from the grocery store. It's 816 00:42:47,920 --> 00:42:50,600 Speaker 1: very thick. Put that in, fill the can up with water, 817 00:42:50,719 --> 00:42:53,440 Speaker 1: throw that in.
Mix it, get it nice and smooth, 818 00:42:54,880 --> 00:42:58,120 Speaker 1: and make sure that, you know, everything's even. Then, after the 819 00:42:58,200 --> 00:43:01,800 Speaker 1: tomato paste, you get crushed tomatoes and tomato puree. 820 00:43:02,600 --> 00:43:05,960 Speaker 1: Those are much bigger cans. Take the tomato puree, put 821 00:43:06,000 --> 00:43:08,520 Speaker 1: water in that, put that also in there. And you 822 00:43:08,560 --> 00:43:11,600 Speaker 1: want to put it on a low temperature. I'm talking, 823 00:43:11,840 --> 00:43:14,759 Speaker 1: like, if you have an electric stove, low or warm; if 824 00:43:14,760 --> 00:43:17,840 Speaker 1: you have a gas stove, like, the lowest flame possible. 825 00:43:18,320 --> 00:43:22,880 Speaker 1: You add some basil, a pinch of sugar for the acidity, salt, 826 00:43:22,920 --> 00:43:25,560 Speaker 1: and pepper, and heat that thing and let it go, 827 00:43:25,680 --> 00:43:30,200 Speaker 1: stirring occasionally. Then you go to your meatballs. I 828 00:43:30,360 --> 00:43:32,560 Speaker 1: just think a little differently than what my grandmother did. 829 00:43:32,600 --> 00:43:34,920 Speaker 1: Everyone has their own kind of specialty. What I have 830 00:43:35,120 --> 00:43:38,520 Speaker 1: found in making meatballs is this: you use a chopped 831 00:43:38,640 --> 00:43:41,680 Speaker 1: meat that is very lean. So I do, like, a 832 00:43:41,760 --> 00:43:44,399 Speaker 1: ninety-three seven, not an eighty-twenty: chopped 833 00:43:44,480 --> 00:43:47,400 Speaker 1: meat that is very, very lean, very little fat.
You 834 00:43:47,400 --> 00:43:48,640 Speaker 1: can go to a butcher, you can go to Whole 835 00:43:48,680 --> 00:43:51,239 Speaker 1: Foods to find that. And then I take that and I 836 00:43:51,400 --> 00:43:54,799 Speaker 1: mix it with pork, usually from a sausage. Like, I'll 837 00:43:54,800 --> 00:43:57,040 Speaker 1: cut the sausage open, I'll take the skin off of it, 838 00:43:57,080 --> 00:43:59,480 Speaker 1: and I'll put the sausage in, or, like, a ground 839 00:43:59,560 --> 00:44:02,520 Speaker 1: pork with, like, a healthy amount of fat. And people 840 00:44:02,520 --> 00:44:04,560 Speaker 1: say fifty-fifty; I do it seventy-five beef, 841 00:44:04,600 --> 00:44:06,520 Speaker 1: twenty-five pork. Whatever the case may be, we 842 00:44:06,640 --> 00:44:09,120 Speaker 1: usually do seventy-five beef, twenty-five percent pork, for, like, 843 00:44:09,160 --> 00:44:12,879 Speaker 1: a pound, pound and a half. The leanness of 844 00:44:13,320 --> 00:44:16,360 Speaker 1: the ground beef and the fat on the pork 845 00:44:16,520 --> 00:44:19,360 Speaker 1: are beautiful together; like, they are gorgeous. You get a 846 00:44:19,400 --> 00:44:23,600 Speaker 1: big bowl. You put in an egg, half a cup of ricotta. 847 00:44:24,120 --> 00:44:28,040 Speaker 1: It is not pronounced "ri-cot-ta," it's "ri-gott." Put the ricotta in 848 00:44:28,160 --> 00:44:33,160 Speaker 1: the bowl. And then you put in an insane amount of 849 00:44:33,560 --> 00:44:37,359 Speaker 1: Pecorino Romano cheese. Like, you put an insane amount. They say 850 00:44:37,480 --> 00:44:39,319 Speaker 1: put, like, a half a cup. No, put 851 00:44:39,320 --> 00:44:41,880 Speaker 1: a full cup. Put as much as you want; heavy 852 00:44:42,080 --> 00:44:46,359 Speaker 1: on the cheese. Then you do two tablespoons of salt, 853 00:44:46,360 --> 00:44:50,239 Speaker 1: a tablespoon of pepper, chopped garlic again, half a cup 854 00:44:50,239 --> 00:44:52,399 Speaker 1: of Italian breadcrumbs.
Some Italians make their own breadcrumbs. 855 00:44:52,400 --> 00:44:53,840 Speaker 1: I don't got time for that. Just put it, you know, 856 00:44:53,920 --> 00:44:57,360 Speaker 1: it's fine. Put regular store-bought breadcrumbs, Italian breadcrumbs, 857 00:44:57,480 --> 00:45:00,840 Speaker 1: in with the meat, and then parsley, like, a 858 00:45:00,880 --> 00:45:02,879 Speaker 1: lot of parsley, like half a cup of parsley. Then 859 00:45:02,920 --> 00:45:05,920 Speaker 1: you throw your meat in. You mix it with your hands. 860 00:45:06,560 --> 00:45:08,839 Speaker 1: Wash your hands, obviously, beforehand. Mix it with your hands. 861 00:45:08,880 --> 00:45:11,560 Speaker 1: Your hands will be freezing from this whole process. You 862 00:45:11,560 --> 00:45:14,000 Speaker 1: will have to go and, like, warm your hands 863 00:45:14,080 --> 00:45:16,840 Speaker 1: up from this entire thing, because it is cold. 864 00:45:16,920 --> 00:45:20,200 Speaker 1: You mix that, you roll it. Here is where, like, 865 00:45:20,280 --> 00:45:22,760 Speaker 1: the magic is made. And this is a controversial opinion, 866 00:45:22,760 --> 00:45:28,040 Speaker 1: depending on the person: you put the meat raw directly 867 00:45:28,080 --> 00:45:31,239 Speaker 1: in the sauce to cook it. The flavor of the 868 00:45:31,280 --> 00:45:34,840 Speaker 1: meat is absorbed in the sauce. You cook the sauce 869 00:45:34,880 --> 00:45:37,440 Speaker 1: for three hours, so the meat will cook. It's not 870 00:45:37,520 --> 00:45:40,319 Speaker 1: gonna be raw after the entire thing is over. But 871 00:45:40,600 --> 00:45:44,239 Speaker 1: that is how you really marinate the meat and the 872 00:45:44,280 --> 00:45:47,400 Speaker 1: sauce together over that time.
And it should literally take 873 00:45:47,440 --> 00:45:49,320 Speaker 1: about two and a half to three hours, no less 874 00:45:49,320 --> 00:45:50,680 Speaker 1: than that, because then it will be, it'd be 875 00:45:50,719 --> 00:45:52,600 Speaker 1: too tomato-y and it won't be good. Like, you've got 876 00:45:52,600 --> 00:45:54,360 Speaker 1: to be thick; like, the sauce has to 877 00:45:54,400 --> 00:45:57,840 Speaker 1: be thick, to a bite where it sticks onto the 878 00:45:57,840 --> 00:46:01,080 Speaker 1: pasta, sticks onto the meat. And you occasionally put 879 00:46:01,120 --> 00:46:02,719 Speaker 1: in some salt, pepper if you need to, but you 880 00:46:02,719 --> 00:46:05,440 Speaker 1: don't have to. Stir occasionally; keep it going. 881 00:46:06,600 --> 00:46:10,520 Speaker 1: And that's how you make my grandma's meatball and sauce recipe. 882 00:46:10,520 --> 00:46:12,879 Speaker 1: If you want to bake the meatballs beforehand, by the way, 883 00:46:13,040 --> 00:46:14,799 Speaker 1: put them in for ten minutes and then 884 00:46:14,800 --> 00:46:16,840 Speaker 1: put them in the sauce, just to make a harder 885 00:46:16,880 --> 00:46:19,319 Speaker 1: outside so they don't completely fall apart. But if you 886 00:46:19,440 --> 00:46:21,160 Speaker 1: pack it right, it's not going to fall apart. It's 887 00:46:21,160 --> 00:46:23,000 Speaker 1: going to be delicious. As I said, two and a 888 00:46:23,040 --> 00:46:24,920 Speaker 1: half to three hours is how you cook it, on 889 00:46:24,960 --> 00:46:28,000 Speaker 1: a low heat. If it's bubbling everywhere and exploding, the 890 00:46:28,040 --> 00:46:30,440 Speaker 1: heat's too hot; lower the heat, and you will have 891 00:46:30,600 --> 00:46:34,320 Speaker 1: Sunday dinner. I make Italian Sunday dinner every single Sunday.
892 00:46:35,080 --> 00:46:38,239 Speaker 1: Catholicism and pasta on Sunday are the two things I 893 00:46:38,280 --> 00:46:41,839 Speaker 1: still have from the old country. So, after all these generations, 894 00:46:41,920 --> 00:46:45,720 Speaker 1: I barely register as an Italian anymore. It's just sauce 895 00:46:45,760 --> 00:46:48,800 Speaker 1: and church, and it comes on the same day. Anyway, 896 00:46:48,840 --> 00:46:50,640 Speaker 1: thank you for listening to this podcast. I hope you liked this; 897 00:46:50,640 --> 00:46:52,440 Speaker 1: I hope you like the recipe. If you like this podcast, 898 00:46:52,480 --> 00:46:54,799 Speaker 1: please like and subscribe on the iHeartRadio app, anywhere 899 00:46:54,840 --> 00:46:57,359 Speaker 1: you have a podcast, or like on YouTube. Give me a five-star review 900 00:46:57,360 --> 00:46:59,040 Speaker 1: if you're feeling generous, and I will talk to you 901 00:46:59,040 --> 00:47:00,359 Speaker 1: guys on Friday. Thanks