[00:00:03] Speaker 1: Ladies and gentlemen, welcome back to A Numbers Game with Ryan Girdusky. Thank you guys for being here. I had an insane weekend that I want to tell you about before I get into the topic at hand. So, first of all, very important, I don't really talk about this very often, but I hosted company over the weekend, made my grandmother's homemade meatballs and sauce, and let me tell you, it was a home run. More importantly, I had to go to DC for a work meeting. It was a quick twenty-four-hour turnaround trip. I get there late one night and I said, oh, let me go meet with somebody, just have dinner with somebody. So I reached out to my friend, who is a very well-known reporter, and he said, come to Cafe Milano. This is a very chichi restaurant, very expensive, a place that people want to be seen at, right, because there's a lot of connections and people in the DC scene. So upon entering, I immediately see three US senators with a very well-known DC military contractor. I spot that out of the corner of my eye, I roll my eyes, and I just keep walking.
[00:01:07] Speaker 1: Well, we get to the table where my friend's at, and he has some British friends who are reporters and politicians at the table as well. For most Americans that means nothing, but for me, because I follow British politics so frequently, I knew exactly who was sitting with us. It was a former member of the British Conservative Party, a very, very prominent member of the British Conservative Party. This was a man who was in charge of all the vaccinations during COVID. He ran for Prime Minister and lost to Liz Truss. For most Americans, that would not register at all. I not only knew who he was, I knew his record. And for an American, he's way too big-government; I guess in Britain he wouldn't be considered very conservative anyway. So we started chatting about, you know, the state of England, the state of American politics, Nigel Farage and Reform UK. And he has since left the British Conservative Party to work to be a part of Reform UK.
[00:02:07] Speaker 1: And someone at the table asked him, are you going to run for office again? And he said he's considering running for office again. And you know, people always accuse me of being abrasive, or of putting on a show that I'm abrasive. I am the same way morning, noon, and night, whether you see me or whether you don't; it's always the same personality. So I couldn't hold my tongue. I just said to him, respectfully, sir, your party was in the majority for a decade in the UK and you did virtually nothing. You left the country way worse than when you found it. And why? What did you learn differently this time than last time? What else could you possibly, you know, wreck if you get elected again? And he said some interesting things, especially about energy policy. The UK has a horrendous energy policy, the net zero stuff, which is doing terrible things for their economy.
[00:03:07] Speaker 1: And I said that, in particular, the Tories, the Conservative Party, should have made their identity about wiping away former Prime Minister Tony Blair's record. Tony Blair is the one who kicked open the floodgates of Third World mass immigration to England. Tony Blair pushed forward on net zero, him and his successors. Tony Blair had the Equality Act, which put all these policemen at risk of losing their jobs if they were accused of racism when the grooming gangs were happening. I mean, Tony Blair was a horrendous Prime Minister for the UK, and the same way that a lot of what Trump does is to erase the Obama legacy, the Tories should have been very, very committed to erasing Blair's legacy, and instead they kind of went along with continuing a lot of Blair's legacy. It was crazy. So after pushing back, because I try to make everything nice, I said, what are some interesting things that you remember from your time there, being such a senior-ranking member of the ruling party? And he said that there's a thing in the UK called COBRA. Now, I had heard of it before, but the other guests hadn't.
[00:04:11] Speaker 1: COBRA is like the secret government meeting of top-notch people when there's a national crisis going on. It's not the same thing as Camp David; it's maybe like going to the Situation Room. It's called COBRA in the UK, and it's not the health insurance alternative policy that's also called COBRA, the American COBRA. Anyway, back to the story. So I said, what are some things you can share, like crazy moments during COBRA? And he said one was the vote to shut down the government, because he said that he wasn't for it. Knowing what he said, I don't believe that; I think he was probably very much for shutting down the government. But he said the weirdest thing is, the COBRA meeting was in London, so it's a huge city, millions and millions of people. So they agree, we're going to shut the government down, and then he walked down to the street into, as he called it, humanity.
[00:05:03] Speaker 1: There's thousands of people walking past him, and he goes, the whole time I'm thinking, you have no idea what we've just decided. Like, everyone's on lockdown in twenty-four hours, and you're all just going about... these are the last moments you have to go about your business the way that you've been doing your entire lives. That was kind of fascinating psychologically. He wasn't taking glee in it, by the way; he wasn't sounding like a sociopath. He was treating it like, this is insane. And then the second thing, and this is crazy because I had not even heard of this and I follow the news pretty closely: he said that they got a COBRA call because, this is several years ago, Putin was going to test a nuclear missile in the Black Sea, which is, you know, the water barrier between Russia and Ukraine and Turkey and a lot of the Balkan nations. He was going to test a nuclear bomb. And I said, what did you do? What do you do when you hear that Putin is about to test a nuclear bomb?
[00:06:04] Speaker 1: He said, we immediately started trying to call China to ask China to talk him out of it, and we were trying to figure out every other world leader with a close relationship with Putin who could make him change his mind. Anyway, he did; he changed his mind in the end. But it was a fascinating story. It really had me kind of gripped to the edge of my seat. That was the whole story, that was the whole weekend trip, but it was great, and it was really an interesting conversation that I thought you guys would really like. Okay. So over the last week, something else completely different, aside from my trip and the dinners I cooked, came out, and it's around the topic of AI, which, you guys, I have not had like an autistic spaz talking about AI in quite some time, so give me some credit. But we're going to talk about it now because it involves both politics and policy and polling, all my favorite subjects put into one.
[00:07:00] Speaker 1: So let's talk specifically about the polling first. David Shor, who's a very, very intelligent Democratic pollster, and I'm a big fan, I've invited him on this podcast many times. He has not yet come. I don't think he's going to come, because I think a lot of progressives, and I've had a lot of good progressives on the show, but a lot of them are afraid that they're going to be attacked by the liberal mob if they come on and talk with me. I don't know; I'm assuming that, I was never told. Actually, I was told that by one person, but I'm assuming that's the truth for everyone who doesn't agree to come who is a progressive that I think is smart. But David Shor is a very smart progressive, and he had data about AI that I think you all need to hear. So here's what he found in his data. No issue has increased in importance to average voters more than AI. No other issue, none. Think of everything that's come in the last year, right? The only other issue that even comes close to how much interest people have taken in AI is war in the Middle East.
[00:07:58] Speaker 1: Put that into perspective: how much coverage has war and conflict in the Middle East received in mainstream media, in alternative media, in social media, versus AI? I believe if you studied it, you know, dollar for dollar, it's probably completely lopsided toward war with Iran and war in the Middle East. Nonetheless, people care about AI more than they care about any other issue, including war. Secondly, AI is hitting at a time when most Americans are increasingly saying life is unaffordable. Only nine percent of Americans say that life is getting more affordable; sixty-one percent say life is more unaffordable. Only twenty-five percent of voters say they feel confident in their financial future. That is not a big number, especially when you consider how much of the population is wealthy, retired, or on, you know, benefits of some sort, where the government is financing a lot of their life. That's a huge part of the electorate, a huge, huge part, and still only twenty-five percent feel good about their financial future.
[00:09:08] Speaker 1: Third, a majority of voters, fifty-six percent, are worried about their job security, and seventy-nine percent are concerned the government doesn't have a plan to protect workers from AI job loss. Seventy-seven percent are concerned that industries are going to be eliminated, and seventy-nine percent are worried there will be fewer opportunities for young people. Those are eighty-twenty issues. That is like a poll question asking, should a known rapist with a penis be allowed to shower next to a nine-year-old girl in a public bathroom if he declares that he is a woman? It's that lopsided, right? I want you to put this into perspective. This is not an issue where there's a lot of gray area or nuance. People have serious opinions that are extremely one way. It is like the transgender issue: it is one way, and it is increasingly moving towards that way. We're going to have a transgender episode next episode, by the way, another transgender episode. But anyway, this is where it's going. It's moving in one direction.
[00:10:22] Speaker 1: Fourth, voters do not trust the notion that everything is going to be fine. When told that AI will create economic productivity that benefits everyone, fifty-six percent of voters don't trust that statement; thirty-six percent say they trust it somewhat to a lot. Voters also do not trust the idea that AI will not cause widespread job loss. Only twenty-six percent of voters believe that AI will not cause widespread job loss; sixty-seven percent do not trust people who say that. When asked what is more important, funding the creation of new jobs and basic benefits like healthcare, even if it means limiting the amount that American tech companies can profit from AI, or keeping innovation going so that America outcompetes the rest of the world in developing AI, even if it allows tech companies to profit from eliminating jobs (by the way, David Shor made that a little biased of a question), obviously they cared more about protecting jobs than innovation.
[00:11:29] Speaker 1: Next question, and I want you to hear this: if you hear nothing else from this entire podcast, listen to the next two points. I have said over and over and over again, AI is how Democrats are going to put socialism in this country. Ro Khanna was on this podcast and basically said it. It's his plan for the New Deal: massive redistribution of wealth, even at mass unemployment levels. When asked if people would rather have a job or direct handouts, like Ro Khanna suggested, fifty-four percent said they would rather have a job and would rather have the government ensure they have a job. Only seventeen percent want direct handouts. Only fifteen percent are libertarians who say we don't want the government to do anything. Fifty-five percent want to make sure tech companies cannot make unlimited profits, and they include a plurality of Trump voters. Listen to me, and listen to me carefully, to what I just said: more Trump voters believe tech companies should be held financially responsible for the jobs they destroy, that AI eliminates, than believe they should be able to profit off of it.
[00:12:52] Speaker 1: This is not a small thing. This is not a, oh, maybe there's some gray area. No. Remember, I had on the pollster a couple weeks ago who said that he was shocked by how politics has become a circle, that very far-right-wing people are believing in very far-left-wing things, and we're not on a linear political spectrum anymore. This is a perfect example, when nearly fifty percent of Trump voters are saying tech companies should not make unlimited profits, we need to regulate them. It is not a question of if those regulations are going to come; it is a question of when, and of the severity. You need a very smart politician to make sure you do not destroy the entire system, or people will just vote for someone who will throw, you know, Molotov cocktails. They will vote for a Trump of the left; they will vote for an AOC type. I'm not joking when I say this. It is a very sincere point I am making right now. It's not a question of if, it's a question of when. Last two points on this poll.
[00:14:06] Speaker 1: Voters prefer a tax that specifically taxes companies on AI; they like that more than the idea of a wealth tax. Remember the California wealth tax, where they're taxing billionaires and these billionaires are fleeing? Only twenty-seven percent of voters want a wealth tax, because overall we are a capitalist nation. But forty-nine percent want an AI tax. And if you don't do one, Republicans, listen: if you don't do one, you will end up with the other. I'm making it perfectly crystal clear: if you don't push for some kind of AI tax or movement, you will end up with a wealth tax. Lastly, AI-specific populism outperforms all other conversations around economic populism and AI when it comes to helping Democrats win elections. It moves the needle, the electoral needle, an average of four percent towards Democrats. Do you understand what that means? It means you go from a Trump twenty twenty-four election to an Obama twenty twelve-style election if just the AI populism point is hit on. Now, obviously there's other issues.
[00:15:17] Speaker 1: There's transgender issues, there's immigration, there's a lot of things Democrats believe that are nutty and batty and crazy. But you put the unemployment level at ten to twenty percent because of all the AI innovations, and we are in a different world then. Now, this all comes as two things are happening. There are two competing Republican rollouts when it comes to AI regulations. The first is from the White House. It is a four-page memo on the AI framework policy. It is pretty straightforward; it's only four pages if you want to go read it. I'll go through a few points, because it is a lot to read. Basically, the first point is about protecting kids. It says Congress should empower parents and guardians with robust tools to manage their child's privacy settings, screen time, content exposure, and account controls. Meaning it's on the parents. They're going to do the same thing they do for YouTube and all the other stuff that has gone forward for a lot of these social media websites. But it's all on the parents.
[00:16:15] Speaker 1: It's not on the AI companies. Congress should establish commercially reasonable privacy-protective and age-assurance requirements for the AI platforms; that's on the AI companies. Congress should require AI platforms and services likely to be accessed by minors to implement features that reduce the risk of sexual exploitation and self-harm to minors. Congress should affirm that the existing child privacy policies apply to AI systems, including limits on data collection for model training and targeted advertising. All of the most regulatory language around AI came from the child section about protecting kids, where I think you're going to get the most bipartisan support. And also that taxpayers should not feel increased electricity costs from data centers. Streamlined federal permits around AI infrastructure: that's a boon for the AI industry. Grants for small businesses so they could develop AI in their businesses: once again, a boon for the AI industry. More antitrust liabilities for AI: that means that you can't just steal people's content.
[00:17:22] Speaker 1: More protections on copyright, allowing for a collective rights and licensing framework. Preventing government from forcing AI companies to comply with ideological standards: that's the whole free speech stuff, for when the next Democrat comes in. Congress should not create any new federal rulemaking body that regulates AI, once again a big one for the AI industry, and should instead support the development and deployment of sector-specific AI applications through existing regulatory bodies with subject matter expertise and thorough industry-led standards. Expand efforts to study trends with job losses: what a stupid freaking part of the bill. And develop an AI Youth Development program: also incredibly stupid. Also, we're going to have an education episode about what we've discovered with AI and education; it's going to make your jaws drop. But okay, mostly the AI companies win big time, because what this outline does is give a little bit around child protection and a little bit around copyright protection guidelines for Congress.
But 322 00:18:23,560 --> 00:18:26,720 Speaker 1: what it mainly says is Congress should preempt state AI 323 00:18:26,880 --> 00:18:30,800 Speaker 1: laws that impose undue burdens, to ensure minimally burdensome 324 00:18:30,800 --> 00:18:34,600 Speaker 1: national standards consistent with these recommendations, not 325 00:18:34,880 --> 00:18:39,840 Speaker 1: fifty discordant ones. This national standard should respect key principles 326 00:18:39,840 --> 00:18:42,560 Speaker 1: of federalism and not preempt traditional police powers 327 00:18:42,640 --> 00:18:46,440 Speaker 1: retained by the states to enforce laws of general applicability 328 00:18:46,440 --> 00:18:50,600 Speaker 1: against AI developers and users, state zoning laws, requirements 329 00:18:50,640 --> 00:18:54,320 Speaker 1: governing the state's own use of AI, whether 330 00:18:54,359 --> 00:18:58,600 Speaker 1: it be procurement or services. Preemption must also ensure that 331 00:18:58,640 --> 00:19:01,080 Speaker 1: state laws do not govern areas better suited for the 332 00:19:01,080 --> 00:19:04,080 Speaker 1: federal government. States should not be permitted to regulate AI 333 00:19:04,200 --> 00:19:07,920 Speaker 1: development because it is inherently an interstate phenomenon. States should 334 00:19:07,920 --> 00:19:11,040 Speaker 1: not unduly burden Americans' use of AI for activity that 335 00:19:11,080 --> 00:19:13,920 Speaker 1: would be lawful if performed without AI. States should not 336 00:19:14,000 --> 00:19:19,080 Speaker 1: be permitted to penalize AI developers for third parties' unlawful conduct, 337 00:19:19,200 --> 00:19:23,520 Speaker 1: including involving their models. I.e., what that means is, if 338 00:19:24,240 --> 00:19:28,600 Speaker 1: someone is using AI to harm you, the AI company is not 339 00:19:28,359 --> 00:19:32,040 Speaker 1: responsible. Huge win for the AI companies. 340 00:19:32,240 --> 00:19:36,280 Speaker 1: Huge win.
It is. The 341 00:19:36,400 --> 00:19:38,560 Speaker 1: whole outline is as if somebody just woke up one 342 00:19:38,600 --> 00:19:41,240 Speaker 1: day at the White House and said, guys, we're doing 343 00:19:41,280 --> 00:19:44,040 Speaker 1: the bare minimum today. That's what it is. It's the 344 00:19:44,080 --> 00:19:46,639 Speaker 1: bare minimum. I mean, the voters are going to respond 345 00:19:46,640 --> 00:19:50,000 Speaker 1: positively to the child protection stuff, and they'll 346 00:19:50,040 --> 00:19:58,639 Speaker 1: respond positively towards issues around copyright and licensing. What 347 00:19:58,760 --> 00:20:01,239 Speaker 1: they will not respond positively to, because we know how 348 00:20:01,280 --> 00:20:04,320 Speaker 1: slow Congress moves, is that all other regulation can only go 349 00:20:04,359 --> 00:20:10,280 Speaker 1: through Congress, when they're given such limited regulations in this bill. 350 00:20:10,840 --> 00:20:13,439 Speaker 1: I mean, there's not a lot there. This doesn't have 351 00:20:13,520 --> 00:20:16,240 Speaker 1: a ton of teeth, and it's more protections for the AI 352 00:20:16,280 --> 00:20:20,840 Speaker 1: industry than it is regulations to protect citizens, and protecting 353 00:20:20,880 --> 00:20:24,600 Speaker 1: workers is not in there at all. Now, Senator Marsha 354 00:20:24,680 --> 00:20:27,800 Speaker 1: Blackburn of Tennessee, Republican of Tennessee, also issued her own 355 00:20:27,960 --> 00:20:31,800 Speaker 1: bill around AI regulation. Unlike the four pages from the 356 00:20:31,840 --> 00:20:33,560 Speaker 1: White House, this bill was two hundred and ninety one 357 00:20:33,560 --> 00:20:34,960 Speaker 1: pages long. I'm not going to pretend I read it.
358 00:20:35,000 --> 00:20:37,479 Speaker 1: I did not read it, but it covers a series 359 00:20:37,520 --> 00:20:41,960 Speaker 1: of AI regulations that are overwhelmingly popular with voters and absolutely 360 00:20:42,080 --> 00:20:45,400 Speaker 1: encroaches on AI companies' ability to make endless profits while 361 00:20:45,400 --> 00:20:49,320 Speaker 1: displacing American workers. That is why the AI tech people, 362 00:20:49,359 --> 00:20:51,440 Speaker 1: the minute the bill came out, started screaming how horrible 363 00:20:51,480 --> 00:20:53,160 Speaker 1: it was. And you could already see, by the way, 364 00:20:53,320 --> 00:20:57,320 Speaker 1: the White House has reached out to all their influencers 365 00:20:57,359 --> 00:21:00,480 Speaker 1: to talk about how amazing their framework is. The people 366 00:21:00,480 --> 00:21:03,920 Speaker 1: who don't have a high school diploma, screaming how great 367 00:21:03,960 --> 00:21:06,040 Speaker 1: it is. They know, they know every AI regulation at this point. 368 00:21:06,080 --> 00:21:10,040 Speaker 1: It's amazing. The Marsha Blackburn bill is obviously an 369 00:21:10,040 --> 00:21:13,520 Speaker 1: opening, you know, offer. It's not a perfect bill. 370 00:21:13,600 --> 00:21:15,800 Speaker 1: It will go through many, many different cycles, but it 371 00:21:15,920 --> 00:21:20,720 Speaker 1: is there. It is something. It is also baffling to 372 00:21:20,800 --> 00:21:24,800 Speaker 1: my mind how the White House is doing this now 373 00:21:24,840 --> 00:21:30,000 Speaker 1: in March, knowing how few dates are left available for 374 00:21:30,080 --> 00:21:34,080 Speaker 1: this Congress to even offer an AI bill, knowing that 375 00:21:34,119 --> 00:21:36,680 Speaker 1: parts of the bill are so unpopular ahead of the midterms. 376 00:21:37,240 --> 00:21:39,960 Speaker 1: This makes no sense. I mean,
I cannot 377 00:21:39,960 --> 00:21:42,400 Speaker 1: even get into the whole political standing of this entire thing. 378 00:21:44,080 --> 00:21:48,480 Speaker 1: Tone deafness must be clinical to a lot of this 379 00:21:48,520 --> 00:21:53,960 Speaker 1: White House. Tone deafness must be completely clinical. It must 380 00:21:54,000 --> 00:21:58,280 Speaker 1: be, I don't know, contagious. I have no idea. Something 381 00:21:58,400 --> 00:22:01,920 Speaker 1: is going on where they are not reading the room, 382 00:22:02,359 --> 00:22:05,840 Speaker 1: and they're only hearing influencers who want to be invited 383 00:22:05,880 --> 00:22:08,679 Speaker 1: to the right parties and people in the business sector 384 00:22:08,680 --> 00:22:12,200 Speaker 1: who want some profits, because that's all it seems to be, 385 00:22:13,040 --> 00:22:17,320 Speaker 1: and everyone else is getting scraps. I can't describe it 386 00:22:17,400 --> 00:22:19,679 Speaker 1: any other way. I think this is insane. I just 387 00:22:19,720 --> 00:22:23,760 Speaker 1: think this is completely insane. While all this is going on, 388 00:22:23,840 --> 00:22:28,120 Speaker 1: there's a primary in Chicago where AI and crypto and 389 00:22:28,400 --> 00:22:32,280 Speaker 1: the American Israeli political action committee all were working 390 00:22:32,440 --> 00:22:36,400 Speaker 1: to have their own candidates win. Thirty two million dollars 391 00:22:36,520 --> 00:22:41,600 Speaker 1: was spent on four House races in Illinois. Staggering amounts 392 00:22:41,640 --> 00:22:45,040 Speaker 1: of money, larger than all the campaigns themselves, is the 393 00:22:45,040 --> 00:22:48,880 Speaker 1: outside money that these companies, especially AI, are now putting in 394 00:22:49,240 --> 00:22:52,840 Speaker 1: to get their chosen person elected to Congress who will 395 00:22:52,880 --> 00:22:55,439 Speaker 1: play ball with them on the issues.
And they're not 396 00:22:55,760 --> 00:22:59,520 Speaker 1: even running on the AI issue. It's very complicated, but 397 00:22:59,560 --> 00:23:02,959 Speaker 1: it's very interesting. Stay tuned for my interview with David Weigel, 398 00:23:03,240 --> 00:23:09,520 Speaker 1: the national reporter who covered this. That's coming up next. With 399 00:23:09,560 --> 00:23:12,480 Speaker 1: me on today's episode is David Weigel, the national political 400 00:23:12,520 --> 00:23:15,240 Speaker 1: reporter for Semafor. David, thanks for coming on this podcast. 401 00:23:15,800 --> 00:23:18,200 Speaker 1: It's great to be here. Thanks for having me. So, David, 402 00:23:18,240 --> 00:23:21,680 Speaker 1: you covered the Illinois elections. As I said in my monologue, 403 00:23:21,760 --> 00:23:26,320 Speaker 1: this was an incredibly expensive series of House primaries, over 404 00:23:26,400 --> 00:23:29,320 Speaker 1: thirty two million dollars spent by just a few groups, 405 00:23:29,320 --> 00:23:32,960 Speaker 1: and that's something, including the progressive interests. Who were the 406 00:23:33,000 --> 00:23:36,000 Speaker 1: biggest winners of this election cycle? 407 00:23:37,240 --> 00:23:40,680 Speaker 2: The absolute biggest winner in Illinois was Governor Pritzker. 408 00:23:40,760 --> 00:23:41,840 Speaker 1: So Governor J. B. 409 00:23:41,880 --> 00:23:45,560 Speaker 2: Pritzker, who famously is a billionaire, is able to 410 00:23:45,600 --> 00:23:47,560 Speaker 2: fund whatever campaign he wants. He did this when he 411 00:23:47,600 --> 00:23:49,760 Speaker 2: first ran for governor. He's done it in the subsequent 412 00:23:49,800 --> 00:23:52,960 Speaker 2: campaigns. He's shoveled money to state parties whose 413 00:23:53,040 --> 00:23:55,560 Speaker 2: favor he might want later if 414 00:23:55,600 --> 00:24:00,560 Speaker 2: he runs for president.
He basically helped nominate 415 00:24:00,600 --> 00:24:05,199 Speaker 2: Juliana Stratton, his lieutenant governor. The sense I 416 00:24:05,240 --> 00:24:07,879 Speaker 2: got there from covering that campaign is that it was 417 00:24:08,359 --> 00:24:11,320 Speaker 2: Raja Krishnamoorthi, the congressman, who had spent forty million dollars, 418 00:24:11,400 --> 00:24:13,680 Speaker 2: much more than had ever been spent to win that seat 419 00:24:13,680 --> 00:24:17,840 Speaker 2: in Illinois, the US Senate 420 00:24:17,880 --> 00:24:19,000 Speaker 2: seat to replace Dick Durbin. 421 00:24:19,320 --> 00:24:21,320 Speaker 1: He had the advantage until Pritzker 422 00:24:21,200 --> 00:24:24,640 Speaker 2: decided to put millions of dollars into a super PAC for Stratton. 423 00:24:24,680 --> 00:24:25,760 Speaker 1: That changed that race. 424 00:24:25,840 --> 00:24:28,720 Speaker 2: Down the ballot, he helped elect the people that 425 00:24:28,800 --> 00:24:31,040 Speaker 2: he wanted. I think he had a one hundred percent record. 426 00:24:31,080 --> 00:24:33,920 Speaker 2: Now, he wasn't as involved in the House races, and 427 00:24:34,000 --> 00:24:36,800 Speaker 2: so that was a nice, messy story. There was not 428 00:24:36,960 --> 00:24:41,160 Speaker 2: one particular narrative from the election. What you saw 429 00:24:41,200 --> 00:24:43,200 Speaker 2: instead was groups that had played in it come out 430 00:24:43,240 --> 00:24:46,720 Speaker 2: quickly to declare victory. AIPAC was the most 431 00:24:46,760 --> 00:24:49,119 Speaker 2: vocal, saying we had a great election night. It was 432 00:24:49,119 --> 00:24:51,080 Speaker 2: not that great of an election night.
It was better than how they 433 00:24:51,359 --> 00:24:54,840 Speaker 2: started the year in New Jersey, with a debacle where 434 00:24:54,880 --> 00:24:57,680 Speaker 2: AIPAC spent millions of dollars to beat Tom Malinowski and ended 435 00:24:57,760 --> 00:24:58,880 Speaker 2: up with a more left wing. 436 00:24:59,800 --> 00:25:02,440 Speaker 1: Yeah, look, okay, sure, sorry to go too fast, 437 00:25:02,480 --> 00:25:04,520 Speaker 1: because, no, no, no, no, I want to 438 00:25:04,520 --> 00:25:07,240 Speaker 1: set the stage. That is true. Yeah. So the biggest 439 00:25:07,280 --> 00:25:09,680 Speaker 1: winner is Governor Pritzker, who's likely running for president 440 00:25:09,760 --> 00:25:12,360 Speaker 1: or at least floating it. Yes. AIPAC, the American 441 00:25:12,440 --> 00:25:15,320 Speaker 1: Israel Public Affairs Committee, they are the pro-Israel PAC, 442 00:25:15,840 --> 00:25:18,840 Speaker 1: as you said, in New Jersey earlier this year really 443 00:25:18,960 --> 00:25:22,720 Speaker 1: campaigned against Tom Malinowski, the former congressman, to defeat him, 444 00:25:23,040 --> 00:25:26,200 Speaker 1: and ended up with a further left wing, anti-Israel candidate. Now, 445 00:25:26,240 --> 00:25:28,320 Speaker 1: they invested in how many seats in this election in 446 00:25:28,320 --> 00:25:31,000 Speaker 1: Illinois, and how did they come out in those four? 447 00:25:31,359 --> 00:25:34,120 Speaker 2: They won two, they lost two. The one where they 448 00:25:34,240 --> 00:25:38,520 Speaker 2: had the biggest victory was outside Chicago in 449 00:25:38,560 --> 00:25:41,560 Speaker 2: the suburbs; that's where Jesse Jackson Junior lost. They 450 00:25:41,600 --> 00:25:44,119 Speaker 2: were helped by Jesse Jackson Junior being pretty toxic to 451 00:25:44,160 --> 00:25:46,320 Speaker 2: all but his base of voters.
The other one they 452 00:25:46,359 --> 00:25:51,400 Speaker 2: won was in the northwest suburbs of Chicago, like Schaumburg, places like that. 453 00:25:52,280 --> 00:25:55,080 Speaker 2: They were pulling in the same direction as Fair Shake 454 00:25:55,200 --> 00:25:57,439 Speaker 2: and the AI PACs that are new to this cycle, 455 00:25:57,440 --> 00:25:59,879 Speaker 2: which also had a pretty mixed night. So AIPAC won 456 00:26:00,119 --> 00:26:06,040 Speaker 2: two races. It declared victory overall, and I'm not being critical 457 00:26:06,080 --> 00:26:09,080 Speaker 2: since the election, I'm just pointing out factually that, as it 458 00:26:09,160 --> 00:26:11,760 Speaker 2: was clear that they were not going to run the table, 459 00:26:12,160 --> 00:26:15,199 Speaker 2: AIPAC's public spin started to be, well, we want to 460 00:26:15,240 --> 00:26:18,640 Speaker 2: keep these potential squad members out of Congress, and so 461 00:26:18,680 --> 00:26:23,040 Speaker 2: we're going to keep out Kat Abughazaleh and Bushra Amiwala. Abughazaleh 462 00:26:23,119 --> 00:26:26,359 Speaker 2: was running in the ninth district, which is Evanston, Illinois and 463 00:26:26,359 --> 00:26:29,600 Speaker 2: part of Chicago. Frankly, I went to Northwestern and I 464 00:26:29,600 --> 00:26:32,399 Speaker 2: lived in Evanston. It's really the district where people 465 00:26:32,960 --> 00:26:35,880 Speaker 2: go to school, then they usually live in Wrigleyville when 466 00:26:35,920 --> 00:26:38,280 Speaker 2: they graduate, and maybe they have a kid and 467 00:26:38,320 --> 00:26:41,480 Speaker 2: move to the suburbs. That's a district I know 468 00:26:41,520 --> 00:26:45,760 Speaker 2: a lot about. They wanted Dan Biss, the 469 00:26:45,760 --> 00:26:49,560 Speaker 2: mayor of Evanston, to lose. He is, what he called himself when 470 00:26:49,600 --> 00:26:52,960 Speaker 2: I talked to him, a progressive Zionist.
Israel should exist, 471 00:26:53,040 --> 00:26:55,480 Speaker 2: but we should stop funding them militarily. 472 00:26:55,880 --> 00:26:56,800 Speaker 1: They wanted to beat him. 473 00:26:56,840 --> 00:26:59,120 Speaker 2: They got behind Laura Fine, who's a state senator who's 474 00:26:59,200 --> 00:27:02,359 Speaker 2: much more pro-Israel down the line. And this, 475 00:27:02,560 --> 00:27:04,840 Speaker 2: he was telling everyone before the election, he talked about it more 476 00:27:04,880 --> 00:27:08,240 Speaker 2: after: his polling said AIPAC's brand is terrible 477 00:27:08,640 --> 00:27:11,560 Speaker 2: in a highly educated progressive district. He ran against the 478 00:27:11,680 --> 00:27:14,240 Speaker 2: PAC, and he said, my opponent is supported by AIPAC. 479 00:27:15,040 --> 00:27:17,080 Speaker 2: I am a progressive who's going to oppose them and 480 00:27:17,080 --> 00:27:19,200 Speaker 2: never take their money and never be influenced by them. 481 00:27:19,760 --> 00:27:20,359 Speaker 1: So they did. 482 00:27:20,400 --> 00:27:23,840 Speaker 2: They did spend to beat Abughazaleh, who was more 483 00:27:23,960 --> 00:27:26,399 Speaker 2: left wing on Israel, calling it a genocide. She 484 00:27:26,440 --> 00:27:28,879 Speaker 2: didn't even support Iron Dome. During one of the debates, 485 00:27:28,880 --> 00:27:31,280 Speaker 2: she said, you know, supporting weapons, there's no such thing 486 00:27:31,320 --> 00:27:32,240 Speaker 2: as defensive weapons. 487 00:27:32,520 --> 00:27:36,080 Speaker 1: Yeah, she is on CNN a lot. Famously, I mean, 488 00:27:36,280 --> 00:27:38,639 Speaker 1: if anyone may know her, she was the woman who 489 00:27:38,680 --> 00:27:41,640 Speaker 1: went to the ICE protests and they literally threw her 490 00:27:41,680 --> 00:27:44,360 Speaker 1: on her ass, like they picked her up and threw 491 00:27:44,400 --> 00:27:47,520 Speaker 1: her on her ass.
Yeah, but she wasn't 492 00:27:47,520 --> 00:27:51,040 Speaker 1: even from the district when she ran. She wasn't. 493 00:27:51,080 --> 00:27:54,240 Speaker 2: She had no ties to the district. Her partner, whom 494 00:27:54,280 --> 00:27:58,000 Speaker 2: she disclosed, is Ben Collins, who took over The Onion, 495 00:27:58,600 --> 00:28:00,359 Speaker 2: I believe in twenty twenty four 496 00:28:00,359 --> 00:28:02,320 Speaker 2: when he took it over, and they moved there because 497 00:28:02,760 --> 00:28:05,240 Speaker 2: she lost her job at Media Matters when they downsized 498 00:28:05,320 --> 00:28:08,760 Speaker 2: after a bunch of state and Elon Musk lawsuits. She 499 00:28:09,320 --> 00:28:12,800 Speaker 2: ran as, I have fought billionaires, I'm responsible for getting 500 00:28:12,840 --> 00:28:15,680 Speaker 2: Tucker Carlson off the air. Yes, I'm not from here, 501 00:28:15,720 --> 00:28:18,920 Speaker 2: but I'm the sort of Democrat you want to 502 00:28:19,000 --> 00:28:21,520 Speaker 2: have in DC fighting for you. She actually did a 503 00:28:21,520 --> 00:28:24,000 Speaker 2: good job with no ties in the district: twenty six 504 00:28:24,000 --> 00:28:26,400 Speaker 2: percent of the vote, and turnout was very high there. 505 00:28:28,000 --> 00:28:30,879 Speaker 2: The biggest left wing victory that set off a 506 00:28:30,960 --> 00:28:34,159 Speaker 2: lot of their strategy was AOC winning in twenty eighteen. 507 00:28:34,359 --> 00:28:36,959 Speaker 2: More people voted for Kat Abughazaleh in this district 508 00:28:36,960 --> 00:28:41,600 Speaker 2: than voted, period, in the AOC race. I think thirty one 509 00:28:41,600 --> 00:28:44,640 Speaker 2: thousand or thirty two thousand votes. So she did tap 510 00:28:44,680 --> 00:28:47,600 Speaker 2: into something there.
But Biss had a record as a 511 00:28:47,600 --> 00:28:51,160 Speaker 2: progressive critic of Israel, and I think if Biss is 512 00:28:51,200 --> 00:28:53,720 Speaker 2: not in that race, in a divided field, she probably 513 00:28:53,760 --> 00:28:56,959 Speaker 2: could have won. She had more support than Laura Fine. 514 00:28:57,200 --> 00:29:00,520 Speaker 2: So AIPAC took credit after the election. But they 515 00:29:00,600 --> 00:29:04,000 Speaker 2: learned in that district that the AIPAC brand, 516 00:29:04,040 --> 00:29:05,640 Speaker 2: after a couple of cycles of doing this, and I've 517 00:29:05,640 --> 00:29:08,640 Speaker 2: been covering this since the beginning, AIPAC starting up a 518 00:29:08,640 --> 00:29:10,840 Speaker 2: PAC with a different name and running in a seat, if 519 00:29:10,840 --> 00:29:13,680 Speaker 2: people say AIPAC supports you, it is now a huge 520 00:29:13,680 --> 00:29:16,080 Speaker 2: demerit in a Democratic primary. And the other race they 521 00:29:16,120 --> 00:29:19,760 Speaker 2: lost was kind of in the Loop of Chicago, a very 522 00:29:19,800 --> 00:29:21,800 Speaker 2: safe seat in downtown Chicago. 523 00:29:22,400 --> 00:29:22,880 Speaker 1: Uh. 524 00:29:23,080 --> 00:29:27,680 Speaker 2: They backed City Treasurer Melissa Conyears-Ervin. She 525 00:29:27,840 --> 00:29:31,520 Speaker 2: flopped, and another Democrat was elected. AIPAC sort 526 00:29:31,520 --> 00:29:34,200 Speaker 2: of took credit because a more anti 527 00:29:34,280 --> 00:29:35,720 Speaker 2: Israel Democrat was not elected. 528 00:29:35,960 --> 00:29:38,920 Speaker 1: But no, that was so far from their goal for the cycle. 529 00:29:39,080 --> 00:29:41,120 Speaker 1: AIPAC is. 530 00:29:41,240 --> 00:29:43,520 Speaker 2: Two for five on who they actually want 531 00:29:43,560 --> 00:29:44,240 Speaker 2: to be in Congress.
532 00:29:44,240 --> 00:29:48,120 Speaker 1: Winning? They're winning by losing, in their words. 533 00:29:48,600 --> 00:29:50,840 Speaker 1: And the one person that they backed. What's, 534 00:29:50,880 --> 00:29:52,240 Speaker 1: what is it? Pyrrhus? 535 00:29:52,520 --> 00:29:54,560 Speaker 2: It was, as Pyrrhus said, or somebody said to Pyrrhus, 536 00:29:54,640 --> 00:29:56,680 Speaker 2: you know, another victory like that and we're done for. 537 00:29:57,280 --> 00:29:59,800 Speaker 2: That's kind of been the AIPAC style in these primaries. 538 00:30:00,480 --> 00:30:03,640 Speaker 1: The other person that they really backed was Melissa Bean. 539 00:30:03,800 --> 00:30:08,800 Speaker 1: She was a former congresswoman running again. She was backed 540 00:30:08,800 --> 00:30:14,480 Speaker 1: by AIPAC, crypto, and AI. She basically had everyone 541 00:30:14,640 --> 00:30:18,320 Speaker 1: supporting her with all this outside money, and she was 542 00:30:18,360 --> 00:30:21,080 Speaker 1: the one victory for basically everybody. Before I go to AI, 543 00:30:21,240 --> 00:30:23,760 Speaker 1: let's go to crypto. They spent a lot of money, 544 00:30:23,800 --> 00:30:26,120 Speaker 1: the crypto people; they're putting a lot into this entire 545 00:30:26,240 --> 00:30:31,040 Speaker 1: election cycle. How did their election shake out? Not great? 546 00:30:31,080 --> 00:30:33,400 Speaker 2: So this is the worst night for the crypto PAC 547 00:30:33,520 --> 00:30:35,960 Speaker 2: since they've gotten involved in politics, which started in twenty 548 00:30:36,000 --> 00:30:36,520 Speaker 2: twenty four. 549 00:30:37,120 --> 00:30:38,240 Speaker 1: They didn't pretend it wasn't. 550 00:30:38,640 --> 00:30:40,720 Speaker 2: They quickly reacted to the results and said, look, we 551 00:30:40,760 --> 00:30:41,800 Speaker 2: spent for these candidates. 552 00:30:41,840 --> 00:30:43,880 Speaker 1: We got three people elected. They took credit.
553 00:30:43,960 --> 00:30:45,920 Speaker 2: There was Nikki Budzinski, who's a Democrat in a 554 00:30:45,920 --> 00:30:49,400 Speaker 2: safe seat. The Illinois gerrymanders famously create a 555 00:30:49,400 --> 00:30:51,440 Speaker 2: lot of safe seats, because they'll draw a line across 556 00:30:51,480 --> 00:30:53,680 Speaker 2: the state to get all the Democratic precincts. 557 00:30:54,160 --> 00:30:56,600 Speaker 1: She had a progressive challenger. She smoked them. 558 00:30:56,640 --> 00:30:58,760 Speaker 2: They took credit for that, but that wasn't really competitive. 559 00:30:59,400 --> 00:31:02,440 Speaker 2: For Fair Shake, Bean was the biggest victory they had. So 560 00:31:02,520 --> 00:31:05,880 Speaker 2: Bean, this former congresswoman, she lost in the twenty 561 00:31:05,880 --> 00:31:08,680 Speaker 2: ten wave. She was one of the most conservative Democrats 562 00:31:08,680 --> 00:31:12,000 Speaker 2: in the House when she was there, and the campaigns, 563 00:31:12,040 --> 00:31:14,280 Speaker 2: all of them, including hers, but Fair Shake's ads really 564 00:31:14,560 --> 00:31:17,120 Speaker 2: leaned on this: they kind of reintroduced her as a progressive, 565 00:31:17,480 --> 00:31:21,600 Speaker 2: pro-Obamacare, anti-ICE Democrat. The victory had a lot 566 00:31:21,640 --> 00:31:25,880 Speaker 2: to do with the fact that progressives did not unite behind anybody. So 567 00:31:26,080 --> 00:31:28,760 Speaker 2: Bernie Sanders and AOC got behind the same person, Warren 568 00:31:28,760 --> 00:31:32,120 Speaker 2: got behind him too: Junaid Ahmed, who was a pretty 569 00:31:32,160 --> 00:31:35,400 Speaker 2: well known progressive in the district. But the Congressional Black 570 00:31:35,400 --> 00:31:39,640 Speaker 2: Caucus backed somebody else. There were other electeds in the 571 00:31:39,640 --> 00:31:41,920 Speaker 2: race who had no support, but they kept running. So 572 00:31:42,000 --> 00:31:44,360 Speaker 2: Bean wins with about thirty two percent.
There's no runoffs 573 00:31:44,360 --> 00:31:46,760 Speaker 2: in Illinois. That was a good victory for Fair Shake. 574 00:31:46,800 --> 00:31:49,480 Speaker 2: Fair Shake was also on the right side 575 00:31:50,920 --> 00:31:51,120 Speaker 1: in the 576 00:31:52,800 --> 00:31:55,080 Speaker 2: district that Jesse Jackson Junior lost, the second, on the 577 00:31:55,120 --> 00:31:58,600 Speaker 2: South Side. But it spent ten million dollars to help 578 00:31:58,680 --> 00:32:02,480 Speaker 2: Raja Krishnamoorthi. I was double checking: that's the most 579 00:32:02,480 --> 00:32:04,720 Speaker 2: they've spent on a candidate who lost. And Fair Shake 580 00:32:04,800 --> 00:32:07,360 Speaker 2: has made these bets, it's a little bit ironic, if you're 581 00:32:07,480 --> 00:32:10,080 Speaker 2: a crypto investor you have some capacity for risk. 582 00:32:10,120 --> 00:32:11,240 Speaker 1: Fair Shake's pretty careful. 583 00:32:11,640 --> 00:32:14,400 Speaker 2: Fair Shake generally moves into winnable races, or races where somebody's 584 00:32:14,440 --> 00:32:17,640 Speaker 2: winning already, like whoever has Democratic support. In twenty 585 00:32:17,720 --> 00:32:20,360 Speaker 2: twenty four, Ruben Gallego, who was always favored to beat 586 00:32:20,720 --> 00:32:23,160 Speaker 2: Kari Lake. It really wanted to beat Sherrod Brown, spent 587 00:32:23,200 --> 00:32:24,280 Speaker 2: a lot against Sherrod Brown. 588 00:32:24,320 --> 00:32:24,600 Speaker 1: It has. 589 00:32:25,000 --> 00:32:28,320 Speaker 2: It's been good in races where the battlefields might go 590 00:32:28,360 --> 00:32:31,680 Speaker 2: their way anyway. They took this risk on what did 591 00:32:31,720 --> 00:32:34,080 Speaker 2: not look like a risk on Raja, and he lost. 592 00:32:34,560 --> 00:32:37,800 Speaker 1: They spent a pretty decent amount too. He was 593 00:32:37,880 --> 00:32:41,040 Speaker 1: closer than people thought? Yeah, it was particularly close.
And 594 00:32:41,160 --> 00:32:44,920 Speaker 1: the ironic thing with his race is he won rural, 595 00:32:45,240 --> 00:32:49,120 Speaker 1: exurban, and white parts of the state and got demolished 596 00:32:49,160 --> 00:32:52,160 Speaker 1: in the progressive areas and in the ethnic areas and 597 00:32:52,200 --> 00:32:53,560 Speaker 1: in the Black areas. 598 00:32:54,240 --> 00:32:57,200 Speaker 2: Yeah, and you could tell on election night, talking 599 00:32:57,240 --> 00:33:00,560 Speaker 2: to campaigns, they knew Raja was in trouble early on, 600 00:33:00,600 --> 00:33:02,640 Speaker 2: because some of the first counties that are reporting are 601 00:33:02,680 --> 00:33:06,480 Speaker 2: more rural, outside the city, places where Stratton did not 602 00:33:06,480 --> 00:33:07,800 Speaker 2: have the money to put an ad on the air, 603 00:33:07,840 --> 00:33:10,120 Speaker 2: even with a super PAC, and he wasn't doing that 604 00:33:10,160 --> 00:33:13,680 Speaker 2: great. Like, in rural Illinois outside of East Saint Louis, 605 00:33:13,840 --> 00:33:14,440 Speaker 2: he was tied. 606 00:33:14,480 --> 00:33:17,680 Speaker 2: They were losing to Stratton in parts of that. Any place 607 00:33:17,720 --> 00:33:19,760 Speaker 2: where there was a college, any place 608 00:33:19,800 --> 00:33:21,959 Speaker 2: with a lot of state workers, he was getting crushed. 609 00:33:21,960 --> 00:33:25,000 Speaker 2: So Stratton, and I talked to her ad 610 00:33:25,000 --> 00:33:27,520 Speaker 2: makers for something else I'm writing, Stratton ran this very 611 00:33:27,560 --> 00:33:30,320 Speaker 2: smart campaign where her first ad was a bunch 612 00:33:30,360 --> 00:33:34,080 Speaker 2: of people saying F Trump, vote for Juliana.
But they said the 613 00:33:34,120 --> 00:33:36,200 Speaker 2: word, the three letters after F, 614 00:33:36,400 --> 00:33:39,640 Speaker 2: they said that. And the goal was, let's identify her 615 00:33:39,680 --> 00:33:42,840 Speaker 2: as not just Pritzker's friend but the Democratic fighter, 616 00:33:42,880 --> 00:33:46,320 Speaker 2: whereas Raja is this much more mealy-mouthed guy who 617 00:33:46,400 --> 00:33:46,720 Speaker 2: raised a 618 00:33:46,680 --> 00:33:47,840 Speaker 1: lot of money. 619 00:33:47,480 --> 00:33:50,760 Speaker 2: They made her the fighter candidate, which was very successful. 620 00:33:51,720 --> 00:33:55,320 Speaker 2: Robin Kelly, the congresswoman who left her seat on the South Side, 621 00:33:55,360 --> 00:33:57,360 Speaker 2: that's why it was open. She ran as a more 622 00:33:57,400 --> 00:33:59,720 Speaker 2: progressive Democrat. She was the only one to say that 623 00:33:59,840 --> 00:34:01,760 Speaker 2: Israel was carrying out a genocide in a debate. 624 00:34:02,000 --> 00:34:03,560 Speaker 1: She kind of rolled that out in a 625 00:34:03,560 --> 00:34:04,000 Speaker 1: weird way. 626 00:34:04,120 --> 00:34:05,720 Speaker 2: Neither here nor there, but just, 627 00:34:06,120 --> 00:34:08,320 Speaker 2: at the last minute she said, oh, it's a genocide. 628 00:34:08,680 --> 00:34:10,680 Speaker 1: So people who 629 00:34:10,600 --> 00:34:13,040 Speaker 2: wanted Raja to win were pushing her, saying, if you're 630 00:34:13,040 --> 00:34:15,000 Speaker 2: a progressive, please waste your vote on her. 631 00:34:15,480 --> 00:34:19,040 Speaker 1: It didn't really work. And so, talking to the campaigns, 632 00:34:19,360 --> 00:34:21,600 Speaker 1: Raja always had a heavy lift.
633 00:34:21,600 --> 00:34:23,279 Speaker 2: I mean, he was a congressman from the suburbs, but 634 00:34:23,320 --> 00:34:26,320 Speaker 2: he had never been a dynamic, 635 00:34:26,640 --> 00:34:31,000 Speaker 2: celebrity candidate. He's a more progressive-down-the-line, but 636 00:34:31,040 --> 00:34:36,040 Speaker 2: will-break-from-the-party-sometimes Democrat. Very good fundraiser, 637 00:34:36,080 --> 00:34:40,839 Speaker 2: but not really somebody who generated passion. And the fact 638 00:34:40,840 --> 00:34:43,520 Speaker 2: that these ads were running for him 639 00:34:43,480 --> 00:34:48,040 Speaker 2: across the state backfired in an important way. So the 640 00:34:48,120 --> 00:34:51,960 Speaker 2: Fair Shake ads and some of his ads attacked Stratton 641 00:34:52,200 --> 00:34:54,759 Speaker 2: because one of her group's PACs got donations from an 642 00:34:54,800 --> 00:34:59,000 Speaker 2: ICE contractor, a private prison company that contracts with ICE. 643 00:34:59,520 --> 00:35:01,800 Speaker 2: They ran the ads, those PACs and the campaign, 644 00:35:02,120 --> 00:35:05,160 Speaker 2: saying Stratton was taking money from ICE contractors. And what 645 00:35:05,960 --> 00:35:08,359 Speaker 2: I was learning during and after the campaign is that 646 00:35:08,360 --> 00:35:10,799 Speaker 2: that backfired, just because voters did not look at J.B. 647 00:35:10,800 --> 00:35:13,439 Speaker 2: Pritzker's lieutenant governor and think, oh, she's bought off by ICE.
648 00:35:13,960 --> 00:35:16,560 Speaker 2: Stratton had already attacked Raja by saying he'd gotten money 649 00:35:16,560 --> 00:35:20,040 Speaker 2: from Palantir, and there was a bit of 650 00:35:20,040 --> 00:35:22,800 Speaker 2: a race to who could be the most pure, most 651 00:35:23,560 --> 00:35:26,160 Speaker 2: unwilling to take money from anybody who did something bad, 652 00:35:26,239 --> 00:35:29,520 Speaker 2: according to Democrats, going into twenty twenty six. But she won 653 00:35:29,560 --> 00:35:31,799 Speaker 2: that argument, and people didn't believe that 654 00:35:31,840 --> 00:35:35,760 Speaker 2: she was truly soft on ICE. And ICE was a huge 655 00:35:35,760 --> 00:35:39,399 Speaker 2: issue in this, the fact that Pritzker, whatever credit 656 00:35:39,400 --> 00:35:43,040 Speaker 2: people want to give him for working against ICE in 657 00:35:43,080 --> 00:35:45,120 Speaker 2: the state, trying to get them out, stop the operation. 658 00:35:45,760 --> 00:35:46,719 Speaker 1: Democrats love that. 659 00:35:46,800 --> 00:35:50,440 Speaker 2: You mentioned Kat Abughazaleh getting thrown down at the protest 660 00:35:50,440 --> 00:35:53,200 Speaker 2: at Broadview, the ICE detention center. That really made 661 00:35:53,200 --> 00:35:55,480 Speaker 2: her, that and her getting indicted for doing that. 662 00:35:56,120 --> 00:35:58,520 Speaker 1: She went from. And the polling. 663 00:35:58,320 --> 00:35:59,640 Speaker 2: Was all over the map, but she went from somebody 664 00:35:59,680 --> 00:36:02,440 Speaker 2: who was an interesting story but not really 665 00:36:02,440 --> 00:36:06,560 Speaker 2: fighting for first place.
After that she was very credible. And Democrats, 666 00:36:06,600 --> 00:36:10,239 Speaker 2: we learned in this primary, really put a 667 00:36:10,360 --> 00:36:13,880 Speaker 2: premium on whether you are willing to fight Trump on 668 00:36:13,960 --> 00:36:17,320 Speaker 2: specific things he's doing and maybe risk your own safety 669 00:36:17,360 --> 00:36:19,239 Speaker 2: to do it. You know, oh, I'm going to vote 670 00:36:19,239 --> 00:36:21,160 Speaker 2: for Medicare for All: that matters less than if you 671 00:36:21,200 --> 00:36:23,560 Speaker 2: are going to prove that you're really fighting Trump. 672 00:36:23,719 --> 00:36:26,279 Speaker 1: You really have a French Revolution era of 673 00:36:26,320 --> 00:36:29,040 Speaker 1: the Democratic Party, where it's like the Age of Enlightenment, 674 00:36:29,440 --> 00:36:31,680 Speaker 1: everyone has to be an Age of Enlightenment person, but 675 00:36:32,640 --> 00:36:37,239 Speaker 1: there's Robespierre putting everyone on trial, a Reign of Terror 676 00:36:38,040 --> 00:36:40,399 Speaker 1: part of the Democratic primary cycle. That's 677 00:36:40,440 --> 00:36:42,200 Speaker 1: just boiling down to New York, where they have that 678 00:36:42,520 --> 00:36:45,120 Speaker 1: really important primary in New York Twelve, where Alex 679 00:36:45,200 --> 00:36:48,800 Speaker 1: Bores is the AI regulation guy, and he 680 00:36:48,960 --> 00:36:52,640 Speaker 1: happened to have worked for Palantir. So the AI 681 00:36:52,760 --> 00:36:55,759 Speaker 1: companies are saying, you worked for Palantir. Meanwhile they all 682 00:36:56,000 --> 00:37:00,320 Speaker 1: are associated with Palantir and it doesn't really matter to anybody. Okay, 683 00:37:00,400 --> 00:37:02,359 Speaker 1: speaking of AI, let's go into AI. How 684 00:37:02,440 --> 00:37:05,120 Speaker 1: did AI spend in this election? How did they fare?
685 00:37:06,719 --> 00:37:09,239 Speaker 2: One thing I will note at the top: when 686 00:37:09,239 --> 00:37:11,440 Speaker 2: I was writing about the races, I was fairly sympathetic 687 00:37:11,480 --> 00:37:14,440 Speaker 2: to something the left was saying, which was, these companies 688 00:37:14,480 --> 00:37:17,600 Speaker 2: are spending money on PACs, and they're not saying, vote for 689 00:37:17,680 --> 00:37:19,799 Speaker 2: this candidate, he's good on our issues. They were saying, 690 00:37:20,680 --> 00:37:22,759 Speaker 2: vote for this candidate; we've polled, and this is the 691 00:37:22,800 --> 00:37:25,319 Speaker 2: best issue for him: he's a fighter for affordability, or 692 00:37:25,560 --> 00:37:29,520 Speaker 2: he's fighting ICE, or something. So the AI PACs 693 00:37:29,719 --> 00:37:31,640 Speaker 2: split their money. They went in for, as you 694 00:37:31,719 --> 00:37:33,799 Speaker 2: were saying, the candidate who was the biggest win for all these 695 00:37:33,840 --> 00:37:36,840 Speaker 2: groups of the day. They went in for Jesse Jackson 696 00:37:37,120 --> 00:37:40,360 Speaker 2: in his comeback, or Jesse Jackson Junior, I should say. 697 00:37:40,960 --> 00:37:44,040 Speaker 1: Not Jesse Jackson himself, obviously; it's his son, who, 698 00:37:44,160 --> 00:37:47,960 Speaker 1: how do I say this politely, seems 699 00:37:48,000 --> 00:37:51,759 Speaker 1: to have a long history of mental health issues. Yes, his 700 00:37:51,840 --> 00:37:52,719 Speaker 1: history is well known. 701 00:37:53,440 --> 00:37:57,520 Speaker 2: His son went to jail for misusing campaign funds, about 702 00:37:57,880 --> 00:37:59,800 Speaker 2: three quarters of a million dollars in campaign funds he 703 00:38:00,200 --> 00:38:03,160 Speaker 2: spent on himself. And the story there is, he 704 00:38:03,320 --> 00:38:04,759 Speaker 2: was in a safe seat.
He wanted to run for 705 00:38:04,800 --> 00:38:06,959 Speaker 2: Senate one day or be appointed to the Senate. He asked 706 00:38:07,040 --> 00:38:09,640 Speaker 2: Rod Blagojevich, who has since been pardoned by Trump, about that 707 00:38:09,719 --> 00:38:10,440 Speaker 2: Senate appointment. 708 00:38:10,960 --> 00:38:12,839 Speaker 1: After he didn't get that appointment. 709 00:38:13,120 --> 00:38:15,960 Speaker 2: He had struggled with depression and was spending a lot 710 00:38:16,000 --> 00:38:18,400 Speaker 2: of money on gifts for himself and his wife, who has 711 00:38:18,480 --> 00:38:23,799 Speaker 2: now separated from him, a divorced partner, and he went to jail. 712 00:38:23,920 --> 00:38:24,480 Speaker 1: He came out. 713 00:38:24,840 --> 00:38:27,680 Speaker 2: He tried to re-establish himself, in his 714 00:38:27,800 --> 00:38:30,960 Speaker 2: sixties now, as a different man, as a man who'd learned. 715 00:38:31,000 --> 00:38:32,920 Speaker 2: He wrote a book; his mom wrote a book of 716 00:38:32,920 --> 00:38:36,440 Speaker 2: their prison letters. It just wasn't very credible on the 717 00:38:36,480 --> 00:38:39,839 Speaker 2: South Side of Chicago. So I got the sense the AI 718 00:38:39,960 --> 00:38:42,000 Speaker 2: PAC thought, well, this makes sense; this is the kind 719 00:38:42,040 --> 00:38:44,640 Speaker 2: of candidate that can win in this district. And he 720 00:38:44,719 --> 00:38:47,640 Speaker 2: was also very pliable. Jesse Jackson Junior was putting out 721 00:38:47,680 --> 00:38:51,799 Speaker 2: these videos about his economic plans that basically said, yes, 722 00:38:51,880 --> 00:38:53,560 Speaker 2: I'm for building data. 723 00:38:53,719 --> 00:38:56,239 Speaker 1: I'm for building data centers. I'm for what AI 724 00:38:56,360 --> 00:38:56,759 Speaker 1: wants to do.
725 00:38:56,840 --> 00:38:59,279 Speaker 2: If you've seen Eddington, they're basically the ads that 726 00:39:00,040 --> 00:39:01,759 Speaker 2: Pedro Pascal's mayor is running. 727 00:39:01,800 --> 00:39:06,880 Speaker 1: In that movie, and it didn't work. So they did? 728 00:39:07,000 --> 00:39:08,960 Speaker 2: They did for Melissa Bean, and it didn't work there. 729 00:39:09,719 --> 00:39:13,080 Speaker 2: And these are tough. Remember, as we were saying, it's gerrymandered. 730 00:39:13,120 --> 00:39:15,759 Speaker 2: So two of these seats were in Chicago, in 731 00:39:15,840 --> 00:39:19,640 Speaker 2: the city, working-class districts, mostly Black districts. Two 732 00:39:19,680 --> 00:39:22,440 Speaker 2: of them were more suburban, and the suburban ones are 733 00:39:22,480 --> 00:39:24,680 Speaker 2: more amenable to it. And because there was so much 734 00:39:24,760 --> 00:39:27,000 Speaker 2: money, and it didn't become like that Alex Bores race, 735 00:39:27,040 --> 00:39:29,680 Speaker 2: where Bores is really trying to turn his campaign into a 736 00:39:29,760 --> 00:39:32,960 Speaker 2: referendum on "I can't be bought by AI," the AI 737 00:39:33,040 --> 00:39:34,920 Speaker 2: money just didn't become as big of a story. It 738 00:39:35,000 --> 00:39:38,880 Speaker 2: was just this cloud of PAC spending, and they blurred 739 00:39:38,920 --> 00:39:40,960 Speaker 2: into it. Well, I think the two 740 00:39:41,000 --> 00:39:43,640 Speaker 2: things you said were really important. One, the amount of 741 00:39:43,800 --> 00:39:47,080 Speaker 2: money from third-party entities is now larger than any 742 00:39:47,160 --> 00:39:49,680 Speaker 2: campaign, really larger than almost all the campaigns put 743 00:39:49,719 --> 00:39:52,920 Speaker 2: together at this point.
Yeah, it is its own thing. I mean, 744 00:39:53,080 --> 00:39:56,239 Speaker 2: the outside money for these four 745 00:39:56,320 --> 00:39:57,920 Speaker 2: House races is large enough to. 746 00:39:58,040 --> 00:40:00,680 Speaker 1: Be a US Senate race in itself, in the primary. 747 00:40:01,120 --> 00:40:03,040 Speaker 1: Can you speak about that at all? Like the immense 748 00:40:03,080 --> 00:40:04,960 Speaker 1: amount of money being put into these races? 749 00:40:05,719 --> 00:40:08,760 Speaker 2: Oh yeah. These are, again, safe Democratic seats, where usually 750 00:40:08,880 --> 00:40:11,800 Speaker 2: the winner, now there's always gonna be PACs and groups 751 00:40:11,800 --> 00:40:14,560 Speaker 2: that get involved, but the winner typically only has to 752 00:40:14,600 --> 00:40:16,680 Speaker 2: spend maybe two or three million dollars to win those 753 00:40:16,760 --> 00:40:19,320 Speaker 2: things, and has no Republican opponent in the fall, so 754 00:40:19,360 --> 00:40:20,880 Speaker 2: they never become big fundraisers. 755 00:40:21,560 --> 00:40:24,239 Speaker 1: So everyone was outspent by these super PACs. 756 00:40:24,440 --> 00:40:27,640 Speaker 2: The Senate race was only not lopsided because Raja 757 00:40:27,680 --> 00:40:29,560 Speaker 2: Krishnamoorthi had raised so much for his own campaign. But 758 00:40:30,360 --> 00:40:32,319 Speaker 2: the money that Fair Shake spent, the ten million dollars 759 00:40:32,360 --> 00:40:34,600 Speaker 2: in that race, that's as much as Dick Durbin spent 760 00:40:34,760 --> 00:40:36,920 Speaker 2: on each of his entire Senate races. He 761 00:40:36,920 --> 00:40:38,960 Speaker 2: would spend about ten million, twelve million dollars for the 762 00:40:39,120 --> 00:40:42,440 Speaker 2: entire race. And so that became a major issue in 763 00:40:42,520 --> 00:40:45,120 Speaker 2: Chicago and all across Illinois.
But most of this was in 764 00:40:45,239 --> 00:40:48,520 Speaker 2: Chicago, because it's one big media market; people turning on 765 00:40:48,680 --> 00:40:52,040 Speaker 2: the TV, and the mail, was dominated by these new 766 00:40:52,120 --> 00:40:54,040 Speaker 2: PACs, and people had not heard of them, and there was a 767 00:40:54,080 --> 00:40:56,880 Speaker 2: lot of bitterness. Now, the most progressive candidates 768 00:40:57,040 --> 00:40:59,920 Speaker 2: lost. But they were very bitter, not inclined to 769 00:40:59,960 --> 00:41:01,879 Speaker 2: be pro-AI in the first place. This is, as you're 770 00:41:01,880 --> 00:41:05,200 Speaker 2: saying, this division where progressives are a little bit more Luddite, 771 00:41:05,480 --> 00:41:09,319 Speaker 2: a little more anti-progress on AI, because they 772 00:41:09,360 --> 00:41:14,360 Speaker 2: see the working-class job-loss part of it. But 773 00:41:14,560 --> 00:41:19,880 Speaker 2: they were really turning on these PACs, because suddenly 774 00:41:19,880 --> 00:41:22,080 Speaker 2: people were getting mail in their mailboxes that 775 00:41:22,160 --> 00:41:24,640 Speaker 2: said this candidate was corrupt and this candidate took 776 00:41:24,680 --> 00:41:27,399 Speaker 2: money from corporate interests, while the ad itself was coming from 777 00:41:27,440 --> 00:41:28,279 Speaker 2: crypto or from AI. 778 00:41:28,680 --> 00:41:30,480 Speaker 1: Well, I think the most important 779 00:41:30,480 --> 00:41:32,440 Speaker 1: part of what you said is that they're not even talking 780 00:41:32,480 --> 00:41:34,840 Speaker 1: about the subject of AI. They're not talking about crypto; 781 00:41:34,880 --> 00:41:37,279 Speaker 1: they're not talking about anything like that.
They're talking about other 782 00:41:37,440 --> 00:41:41,320 Speaker 1: things, like ICE and immigration, and making that 783 00:41:41,400 --> 00:41:43,759 Speaker 1: the referendum, and not telling you who's spending or on whose 784 00:41:43,800 --> 00:41:46,879 Speaker 1: behalf they're spending. It's very 785 00:41:46,960 --> 00:41:49,080 Speaker 1: different than back in the day, when there would be 786 00:41:49,400 --> 00:41:52,800 Speaker 1: corporations giving candidates a lot of money to 787 00:41:52,840 --> 00:41:55,640 Speaker 1: make their own case. But this immense monster PAC spending 788 00:41:55,760 --> 00:41:59,719 Speaker 1: is beyond what you usually see, and 789 00:41:59,800 --> 00:42:00,440 Speaker 1: that's important. 790 00:42:00,440 --> 00:42:03,719 Speaker 2: The last thing you said: there's disclosure 791 00:42:04,800 --> 00:42:08,680 Speaker 2: regulation for the money campaigns take; there's less for the super 792 00:42:08,760 --> 00:42:11,200 Speaker 2: PACs. And the amount of it, 793 00:42:11,280 --> 00:42:13,960 Speaker 2: I think, has made people very cynical. There's this 794 00:42:14,440 --> 00:42:18,000 Speaker 2: area where PACs, some PACs and five oh 795 00:42:18,040 --> 00:42:20,560 Speaker 2: one c threes, can raise money and never 796 00:42:20,640 --> 00:42:21,960 Speaker 2: report exactly where it came from. 797 00:42:22,200 --> 00:42:23,399 Speaker 1: Progressives do this all the time. 798 00:42:23,480 --> 00:42:25,319 Speaker 2: You know, there are donor funds, and they 799 00:42:25,360 --> 00:42:27,160 Speaker 2: gave to some group, and you never find out where 800 00:42:27,160 --> 00:42:29,920 Speaker 2: the money really came from.
Or those PACs that 801 00:42:29,960 --> 00:42:32,920 Speaker 2: start up late on the calendar, where they're not 802 00:42:32,960 --> 00:42:35,200 Speaker 2: going to release their donors until after the election's over. 803 00:42:35,360 --> 00:42:37,600 Speaker 1: That's what happened in Illinois, where all of a sudden, 804 00:42:37,680 --> 00:42:38,800 Speaker 1: in January. 805 00:42:39,320 --> 00:42:43,640 Speaker 2: Elect Chicago Women, Chicago Progressive Partnership, these groups were popping 806 00:42:43,760 --> 00:42:46,520 Speaker 2: up with everyone's awareness that they're not going to report 807 00:42:46,719 --> 00:42:48,719 Speaker 2: who their money is from. AIPAC did not 808 00:42:48,800 --> 00:42:51,239 Speaker 2: get credit, did not take credit, I should say, for 809 00:42:51,360 --> 00:42:53,759 Speaker 2: putting these PACs into the field. And so that has 810 00:42:53,760 --> 00:42:56,920 Speaker 2: made people very cynical, because the old way is: this 811 00:42:57,120 --> 00:42:59,520 Speaker 2: candidate is taking money, here are his donors. You can 812 00:42:59,560 --> 00:43:02,080 Speaker 2: go to the FEC, pull it down. Okay, he took 813 00:43:02,200 --> 00:43:05,279 Speaker 2: this much from the dialysis industry; isn't it coincidental that 814 00:43:05,360 --> 00:43:08,040 Speaker 2: he has a dialysis bill in the Congress. This 815 00:43:08,200 --> 00:43:12,640 Speaker 2: really has broken that connection, and it's led 816 00:43:12,680 --> 00:43:16,040 Speaker 2: to a lot of paranoia about where money's coming from. 817 00:43:16,080 --> 00:43:16,120 Speaker 1: It.
818 00:43:16,200 --> 00:43:18,000 Speaker 2: So you were seeing that the final stage of these 819 00:43:18,080 --> 00:43:20,960 Speaker 2: races was just people accusing each other of, well, you 820 00:43:21,040 --> 00:43:23,040 Speaker 2: took money from this guy, and you took money from this, 821 00:43:23,400 --> 00:43:25,120 Speaker 2: and you, ten years ago, took money from this. And 822 00:43:25,160 --> 00:43:27,920 Speaker 2: then it became very much about the donors of 823 00:43:27,960 --> 00:43:30,719 Speaker 2: the campaigns, in a way that was much messier and 824 00:43:30,800 --> 00:43:33,040 Speaker 2: less enlightening than it used to be, where somebody says, oh, 825 00:43:33,280 --> 00:43:34,960 Speaker 2: this guy's taking money from the oil industry and he 826 00:43:35,040 --> 00:43:39,319 Speaker 2: represents Houston; that makes sense. It was much 827 00:43:39,400 --> 00:43:41,800 Speaker 2: more scattershot in terms of connecting the money to what 828 00:43:41,880 --> 00:43:42,319 Speaker 2: they were doing. 829 00:43:42,640 --> 00:43:44,680 Speaker 1: Okay, my last question. You mentioned that all the 830 00:43:44,760 --> 00:43:46,600 Speaker 1: progressive candidates lost. I was 831 00:43:46,640 --> 00:43:48,360 Speaker 1: going to talk about that, but it's not that interesting. What 832 00:43:48,520 --> 00:43:51,400 Speaker 1: is interesting is going into the next couple of election cycles, 833 00:43:51,440 --> 00:43:54,239 Speaker 1: the next round of primaries: where are these three major 834 00:43:54,280 --> 00:43:59,600 Speaker 1: groups, crypto, AI, and Israel, AIPAC, spending, and where does 835 00:43:59,640 --> 00:44:01,719 Speaker 1: the next big spike take place, in which 836 00:44:01,800 --> 00:44:05,400 Speaker 1: primaries or which states? One coming up is Colorado.
837 00:44:05,480 --> 00:44:08,360 Speaker 2: The answer is these PACs are going to spend, 838 00:44:08,440 --> 00:44:14,040 Speaker 2: mostly for Democrats, because I think 839 00:44:14,120 --> 00:44:16,279 Speaker 2: that most people expect Democrats at least to win the House, 840 00:44:17,239 --> 00:44:20,799 Speaker 2: on Democrats who are facing progressive challenges, or open seats where 841 00:44:20,920 --> 00:44:24,640 Speaker 2: somebody has a pro-AI position and somebody's anti. So Colorado 842 00:44:24,800 --> 00:44:28,800 Speaker 2: has a couple of competitive races. It has, in the 843 00:44:28,920 --> 00:44:33,040 Speaker 2: Denver district, Diana DeGette, and she's experiencing what happened to 844 00:44:33,480 --> 00:44:36,919 Speaker 2: some candidates in Illinois, which is that the Sunrise Movement, which 845 00:44:36,960 --> 00:44:38,799 Speaker 2: is a climate group, and I have a story on this, as 846 00:44:38,920 --> 00:44:41,120 Speaker 2: it's grown, is now doing a lot of campaigning around 847 00:44:41,480 --> 00:44:45,799 Speaker 2: AIPAC and data centers, less around climate. It has people 848 00:44:45,920 --> 00:44:47,839 Speaker 2: doing what they call bird-dogging, going up to her 849 00:44:47,920 --> 00:44:50,640 Speaker 2: and asking her about her Israel funding, recording it. 850 00:44:52,400 --> 00:44:53,000 Speaker 1: Targeting her. 851 00:44:53,239 --> 00:44:55,879 Speaker 2: Now, that's the sort of race that I think these 852 00:44:55,960 --> 00:44:57,560 Speaker 2: PACs are going to look at and say, is it 853 00:44:57,640 --> 00:45:00,520 Speaker 2: worth spending some money to defend somebody who's good on 854 00:45:00,600 --> 00:45:02,960 Speaker 2: our issues against the progressive who's going to tear us up? 855 00:45:03,920 --> 00:45:07,160 Speaker 2: That's what happened in North Carolina.
Not to expand on that too much more, 856 00:45:07,200 --> 00:45:10,719 Speaker 2: but North Carolina had its primary two weeks ago, and 857 00:45:11,200 --> 00:45:15,360 Speaker 2: it was really AI interests that saved 858 00:45:15,360 --> 00:45:18,640 Speaker 2: a Democrat who had a progressive challenger, because she was 859 00:45:18,719 --> 00:45:22,359 Speaker 2: on the Democrats' special committee on AI. I'm not going 860 00:45:22,440 --> 00:45:24,359 Speaker 2: to get into her whole 861 00:45:24,400 --> 00:45:27,080 Speaker 2: politics, but her opponent was running against AI and 862 00:45:27,200 --> 00:45:29,399 Speaker 2: data centers. They said, well, this is clear: we'd 863 00:45:29,480 --> 00:45:31,080 Speaker 2: rather have a Democrat who deals with us than one 864 00:45:31,120 --> 00:45:32,160 Speaker 2: who's running against us. 865 00:45:33,520 --> 00:45:36,960 Speaker 1: That will matter. In California, you're going to see some 866 00:45:37,040 --> 00:45:37,560 Speaker 1: of this spending. 867 00:45:38,320 --> 00:45:40,960 Speaker 2: The one who was more pro-AI won, yes, and 868 00:45:41,600 --> 00:45:44,200 Speaker 2: the one who was more pro-AI won by one point. 869 00:45:44,640 --> 00:45:46,520 Speaker 2: You're going to see this in California too, because of 870 00:45:46,560 --> 00:45:49,920 Speaker 2: the new map drawn by Democrats there. In Virginia, if 871 00:45:50,000 --> 00:45:52,959 Speaker 2: this new gerrymander succeeds, which has to be voted 872 00:45:53,000 --> 00:45:55,120 Speaker 2: on April first, same thing there: there are gonna be 873 00:45:55,160 --> 00:45:58,319 Speaker 2: new open districts.
And you saw, wherever there is an 874 00:45:58,360 --> 00:46:01,080 Speaker 2: open seat, candidates faced the question: should I put something 875 00:46:01,120 --> 00:46:03,600 Speaker 2: on my website, or have a meeting with one of 876 00:46:03,640 --> 00:46:06,680 Speaker 2: these groups, or sign a pledge that says I'm pro-technology, 877 00:46:06,719 --> 00:46:09,520 Speaker 2: pro-AI, pro-crypto? If I am, I might get 878 00:46:09,600 --> 00:46:11,680 Speaker 2: money spent for me. Because you can't actually coordinate. 879 00:46:11,719 --> 00:46:13,799 Speaker 2: You can't go to a super PAC; 880 00:46:13,840 --> 00:46:16,080 Speaker 2: you're still not allowed to, legally, although 881 00:46:16,080 --> 00:46:17,359 Speaker 2: the FEC doesn't really. 882 00:46:17,280 --> 00:46:17,840 Speaker 1: Do anything about it. 883 00:46:18,080 --> 00:46:20,080 Speaker 2: You can't go to a super PAC and say, I'm 884 00:46:20,280 --> 00:46:22,320 Speaker 2: with you, please spend this much money here. What you 885 00:46:22,440 --> 00:46:27,080 Speaker 2: do is sign some forms or take some pledges that 886 00:46:27,440 --> 00:46:29,640 Speaker 2: signal what your politics are. Or, if you have a record, 887 00:46:29,760 --> 00:46:31,960 Speaker 2: you say, look at my voting record, and then put 888 00:46:32,000 --> 00:46:34,400 Speaker 2: on your website: it'd be really nice if somebody ran 889 00:46:34,480 --> 00:46:36,480 Speaker 2: ads for us on this topic and this topic and 890 00:46:36,560 --> 00:46:39,680 Speaker 2: this topic. The downside is, and you saw a little of this 891 00:46:39,719 --> 00:46:42,600 Speaker 2: in Illinois, if you're that candidate in a Democratic primary, 892 00:46:42,680 --> 00:46:44,480 Speaker 2: you're going to get a progressive hitting you and saying 893 00:46:44,560 --> 00:46:45,719 Speaker 2: this person's bought and paid for. 894 00:46:46,640 --> 00:46:48,960 Speaker 1: So that is the risk.
That is the risk assessment 895 00:46:49,040 --> 00:46:49,680 Speaker 1: candidates are doing. 896 00:46:50,120 --> 00:46:51,759 Speaker 2: Is it worth getting free money if I might get 897 00:46:51,800 --> 00:46:53,359 Speaker 2: accused of taking too much free money? 898 00:46:54,000 --> 00:46:55,959 Speaker 1: Right. Well, Dave Weigel, where do we go to follow 899 00:46:56,000 --> 00:46:58,240 Speaker 1: your reporting? You write a lot of really interesting stuff 900 00:46:58,280 --> 00:47:00,560 Speaker 1: on the congressional side, and all these elections that are coming up. 901 00:47:01,120 --> 00:47:03,080 Speaker 2: Yeah, so it's Semafor dot com. I have a 902 00:47:03,160 --> 00:47:06,479 Speaker 2: newsletter that comes out weekly called Americana. There are reporters around 903 00:47:06,520 --> 00:47:08,480 Speaker 2: the country. We have other newsletters every day. We have 904 00:47:08,640 --> 00:47:11,800 Speaker 2: independent stories. I put a lot on Twitter too, on 905 00:47:12,080 --> 00:47:14,600 Speaker 2: X. I link to the stories, but also I 906 00:47:14,680 --> 00:47:16,960 Speaker 2: build out a lot of stuff on X, because it's 907 00:47:17,040 --> 00:47:17,640 Speaker 2: really helpful. 908 00:47:17,960 --> 00:47:19,440 Speaker 1: Whatever people think of the site. 909 00:47:19,600 --> 00:47:21,799 Speaker 2: It's really helpful to, in real time, say, what do 910 00:47:21,840 --> 00:47:24,480 Speaker 2: people think of this, get reactions, and sometimes somebody comes 911 00:47:24,480 --> 00:47:26,759 Speaker 2: out of the woodwork and says, check out this race. 912 00:47:26,840 --> 00:47:29,960 Speaker 2: But the reporting that I do, 913 00:47:29,960 --> 00:47:32,560 Speaker 2: the quotes I get when I travel, I put 914 00:47:32,640 --> 00:47:33,320 Speaker 2: into stories. 915 00:47:33,400 --> 00:47:36,320 Speaker 1: That's Semafor dot com. Well, thank you so much for coming on 916 00:47:36,400 --> 00:47:42,279 Speaker 1: this podcast.
I really appreciate it. Thank you. Now it's 917 00:47:42,320 --> 00:47:43,920 Speaker 1: time for the Ask Me Anything segment. If you want 918 00:47:43,920 --> 00:47:45,680 Speaker 1: to be part of the Ask Me Anything segment, 919 00:47:45,760 --> 00:47:48,879 Speaker 1: email me Ryan at Numbers Game Podcast dot com. That's Ryan 920 00:47:48,960 --> 00:47:52,440 Speaker 1: at Numbers, plural, Game Podcast dot com. Love getting 921 00:47:52,520 --> 00:47:54,520 Speaker 1: all your questions. This one comes from Chris. It's a 922 00:47:54,600 --> 00:47:56,680 Speaker 1: little long, so Chris, forgive me when I abbreviate it. He says, 923 00:47:57,239 --> 00:47:59,200 Speaker 1: you've been knocking it out of the park with 924 00:47:59,280 --> 00:48:02,239 Speaker 1: your scorched-earth deep dives and interesting guests. Thank you, Chris. 925 00:48:02,360 --> 00:48:04,480 Speaker 1: I really have been working hard on this podcast, 926 00:48:04,520 --> 00:48:05,879 Speaker 1: trying to make it something good, and I think people 927 00:48:05,920 --> 00:48:10,000 Speaker 1: are starting to take notice. He writes: you were mentioning how Ohio 928 00:48:10,120 --> 00:48:12,879 Speaker 1: has changed significantly over time. The private 929 00:48:12,920 --> 00:48:15,040 Speaker 1: sector labor unions that provided such a blue wall of 930 00:48:15,080 --> 00:48:18,720 Speaker 1: support in northeast Ohio have disappeared with offshoring and automation. 931 00:48:19,280 --> 00:48:22,560 Speaker 1: Hamilton County, which had a healthy balance of Republican leadership, 932 00:48:22,640 --> 00:48:25,880 Speaker 1: has basically seen the Republicans go out of the city and into the suburbs, 933 00:48:26,560 --> 00:48:29,640 Speaker 1: and it's all Democrats at this point.
Anyway, your main 934 00:48:29,760 --> 00:48:33,560 Speaker 1: question: Amy Acton as a Democrat is a formidable 935 00:48:33,600 --> 00:48:37,800 Speaker 1: candidate in the sense that the Gannett papers, the Cincinnati Enquirer and the Columbus Dispatch, 936 00:48:37,880 --> 00:48:41,160 Speaker 1: elevated her to a Fauci-esque status during COVID and 937 00:48:41,280 --> 00:48:44,160 Speaker 1: continue to fawn over her. I hope that Jessica Anderson 938 00:48:44,239 --> 00:48:47,239 Speaker 1: is correct in her assessment that Husted and Ramaswamy will 939 00:48:47,280 --> 00:48:50,799 Speaker 1: be competent and will complement each other, because I'm very 940 00:48:50,840 --> 00:48:53,279 Speaker 1: concerned about the low-propensity Trump voter sitting this out. 941 00:48:53,880 --> 00:48:56,800 Speaker 1: There's a Jesuit joke in there somewhere about why Vivek 942 00:48:56,920 --> 00:48:59,160 Speaker 1: is so unlikable. He mentions it in his book, but 943 00:48:59,400 --> 00:49:02,480 Speaker 1: plays fast and loose with his high school education at 944 00:49:02,560 --> 00:49:08,040 Speaker 1: Saint Xavier. And yeah. Here's the thing: 945 00:49:08,200 --> 00:49:10,799 Speaker 1: I think the question is, is Amy Acton 946 00:49:10,880 --> 00:49:13,719 Speaker 1: formidable, or do the Republicans have a chance? It's a 947 00:49:13,800 --> 00:49:17,120 Speaker 1: Democrat year; that's just the way it's playing out. And 948 00:49:17,239 --> 00:49:21,880 Speaker 1: Sherrod Brown's a very well-known name. But this is 949 00:49:21,960 --> 00:49:27,440 Speaker 1: Sherrod Brown's fourth or fifth time at bat. Voters 950 00:49:27,560 --> 00:49:29,640 Speaker 1: do know him, and they just kicked him out. And 951 00:49:29,760 --> 00:49:35,359 Speaker 1: then there's what Jon Husted did. He's as bland as 952 00:49:35,400 --> 00:49:38,759 Speaker 1: you could possibly get. There's nothing there, I mean.
I 953 00:49:38,800 --> 00:49:41,560 Speaker 1: think he had a quip the other day that kind 954 00:49:41,600 --> 00:49:44,680 Speaker 1: of went out there that didn't land. But besides that, he's 955 00:49:44,719 --> 00:49:47,560 Speaker 1: been very, very bland, and he's very smart, 956 00:49:47,600 --> 00:49:52,960 Speaker 1: because he's making voter ID a big issue. He's 957 00:49:53,000 --> 00:49:56,640 Speaker 1: even throwing it at the Democrats. Now, I don't think he's 958 00:49:56,800 --> 00:50:01,319 Speaker 1: very controversial, and I think Sherrod Brown is just gonna attack 959 00:50:01,360 --> 00:50:04,000 Speaker 1: him as being a Trump supporter, but in an overwhelmingly Trump state. 960 00:50:04,640 --> 00:50:07,640 Speaker 1: So I've heard that there's polling that Sherrod is up, 961 00:50:07,800 --> 00:50:10,320 Speaker 1: but not by much, at this point in Ohio. A 962 00:50:10,400 --> 00:50:12,399 Speaker 1: lot of times the Democrat's up, but not by much. 963 00:50:12,719 --> 00:50:17,160 Speaker 1: And it seems that Sherrod is actually polling worse than Acton. 964 00:50:17,920 --> 00:50:20,840 Speaker 1: Here's the thing about Acton. Yes, Acton was all about 965 00:50:20,880 --> 00:50:23,320 Speaker 1: shutting down the government. Acton was all about, you know, 966 00:50:24,760 --> 00:50:29,160 Speaker 1: COVID; that's how she gained prominence. 967 00:50:30,080 --> 00:50:33,640 Speaker 1: The thing is, there's two major issues. Remember when 968 00:50:33,680 --> 00:50:37,360 Speaker 1: I said on this podcast, to all the Ohio listeners, Ramaswamy 969 00:50:37,520 --> 00:50:40,880 Speaker 1: is not going to show himself much in the ads? 970 00:50:41,360 --> 00:50:44,680 Speaker 1: Because they did, and I know this from the presidential campaign.
971 00:50:45,120 --> 00:50:49,000 Speaker 1: They tested that when he was on camera, people liked 972 00:50:49,080 --> 00:50:52,000 Speaker 1: him less than when other people talked about him. And 973 00:50:52,160 --> 00:50:54,000 Speaker 1: in his second ad that came out, well, his first ad 974 00:50:54,080 --> 00:50:55,759 Speaker 1: was his wife, and his wife is very lovely and 975 00:50:55,840 --> 00:50:58,480 Speaker 1: sounded very smart and, you know, was very compassionate and 976 00:50:58,560 --> 00:51:02,160 Speaker 1: did a great job. In his second ad, it 977 00:51:02,280 --> 00:51:04,600 Speaker 1: was somebody else also talking about him. It was not 978 00:51:04,920 --> 00:51:08,600 Speaker 1: him talking about himself. And then he got on television, 979 00:51:08,680 --> 00:51:10,719 Speaker 1: or wherever he was, and he said, our problem is 980 00:51:10,760 --> 00:51:15,080 Speaker 1: we have too many public colleges; we should condense them. Now, 981 00:51:15,680 --> 00:51:18,040 Speaker 1: you could say that's a really smart idea, or yes, 982 00:51:18,160 --> 00:51:22,640 Speaker 1: that's possibly true, or let's look at it analytically. But that's something 983 00:51:22,760 --> 00:51:25,399 Speaker 1: you do once you are in office and not something 984 00:51:25,480 --> 00:51:29,520 Speaker 1: you say when you are running for office. Because what 985 00:51:29,800 --> 00:51:35,040 Speaker 1: that means for Joe Schmoe in Miami, Ohio, or, you know, 986 00:51:35,360 --> 00:51:38,640 Speaker 1: in parts of the state that are in not 987 00:51:38,920 --> 00:51:42,320 Speaker 1: great financial shape, is that one of the biggest 988 00:51:42,520 --> 00:51:46,120 Speaker 1: job providers for their local community may possibly go 989 00:51:46,280 --> 00:51:49,920 Speaker 1: out of business because of Vivek. The point is, you 990 00:51:50,120 --> 00:51:53,000 Speaker 1: don't say something like that.
Like, you don't say, hey, 991 00:51:53,120 --> 00:51:56,239 Speaker 1: let's put kids in school all year long, while you're campaigning, 992 00:51:56,320 --> 00:51:59,720 Speaker 1: which is the other thing that Vivek did. Don't 993 00:52:00,080 --> 00:52:05,120 Speaker 1: think you're smarter than the voters. Running for office 994 00:52:05,840 --> 00:52:07,879 Speaker 1: is not, excuse my language, but this is the truth, 995 00:52:08,200 --> 00:52:11,200 Speaker 1: running for office is not an exercise in mental masturbation. 996 00:52:11,880 --> 00:52:14,759 Speaker 1: It's not to show everyone, look how bright I am, 997 00:52:15,239 --> 00:52:18,160 Speaker 1: I can think of seven million different ideas that you 998 00:52:18,400 --> 00:52:21,920 Speaker 1: all have to live under. That's not the point. The 999 00:52:22,040 --> 00:52:24,440 Speaker 1: point for a good candidate running for office is 1000 00:52:24,560 --> 00:52:29,840 Speaker 1: to repeat what the people already feel, to promote ideals that 1001 00:52:30,040 --> 00:52:33,040 Speaker 1: they understand. You're not trying to educate them. You don't 1002 00:52:33,280 --> 00:52:35,800 Speaker 1: need to bring up, we're gonna launch 1003 00:52:35,800 --> 00:52:37,680 Speaker 1: people into space and then they're gonna bring their own 1004 00:52:37,760 --> 00:52:40,239 Speaker 1: energy from the sun. No, no, listen: talk to them 1005 00:52:40,320 --> 00:52:43,400 Speaker 1: about things that they already know, things that are tangible, 1006 00:52:43,800 --> 00:52:49,319 Speaker 1: tangible solutions. I always tell candidates: tangible solutions.
1007 00:52:49,400 --> 00:52:52,759 Speaker 1: If you can't run on just, you know, words and 1008 00:52:52,880 --> 00:52:56,320 Speaker 1: phrases like hope and change or the free market or 1009 00:52:56,560 --> 00:52:58,960 Speaker 1: American exceptionalism, if you can't use that and you need 1010 00:52:59,040 --> 00:53:03,000 Speaker 1: actual policy solutions, run on things that people understand. Don't 1011 00:53:03,040 --> 00:53:06,200 Speaker 1: try to sound smarter than them, and make the solutions tangible. 1012 00:53:06,440 --> 00:53:08,880 Speaker 1: If you closed your eyes and you thought of what 1013 00:53:09,040 --> 00:53:11,239 Speaker 1: build the wall looks like, you could think of what 1014 00:53:11,400 --> 00:53:14,040 Speaker 1: that tangibly looks like. Close your eyes and you can think 1015 00:53:14,080 --> 00:53:17,799 Speaker 1: of what Medicare for all actually looks like. It's a tangible, 1016 00:53:18,160 --> 00:53:22,880 Speaker 1: easy to understand, comprehensible solution. Vivek is not doing 1017 00:53:23,000 --> 00:53:25,839 Speaker 1: any of that right now, and that is why Amy 1018 00:53:25,880 --> 00:53:28,560 Speaker 1: Acton is winning. Someone, I think it was Cook Political or 1019 00:53:29,239 --> 00:53:32,879 Speaker 1: one of these pollsters that analyze these races, moved 1020 00:53:32,960 --> 00:53:36,279 Speaker 1: the Ohio race from likely Republican to lean Republican and 1021 00:53:36,360 --> 00:53:38,680 Speaker 1: then blamed it on the fact that Vivek is Indian. 1022 00:53:38,880 --> 00:53:42,600 Speaker 1: It has nothing to do with that at all. It 1023 00:53:42,800 --> 00:53:46,799 Speaker 1: is literally because he is making himself as unlikable as 1024 00:53:46,920 --> 00:53:49,880 Speaker 1: humanly possible.
And to answer your question, Chris, I 1025 00:53:50,120 --> 00:53:54,600 Speaker 1: think that there is a world out there where Husted wins, 1026 00:53:54,880 --> 00:53:57,320 Speaker 1: but so does Amy Acton. I don't think they have 1027 00:53:57,400 --> 00:53:59,120 Speaker 1: to win as a pair. I do think there will 1028 00:53:59,160 --> 00:54:02,080 Speaker 1: be crossover, because I think Vivek is doing everything he 1029 00:54:02,200 --> 00:54:06,480 Speaker 1: possibly can to turn off people in these college towns by saying, hey, 1030 00:54:06,560 --> 00:54:09,000 Speaker 1: your whole livelihood could be down the drain. We don't know. 1031 00:54:09,160 --> 00:54:11,400 Speaker 1: We'll see. Is this a good idea, possibly, 1032 00:54:11,680 --> 00:54:14,240 Speaker 1: telling moms, hey, guess what, those summer vacations you were planning, 1033 00:54:14,920 --> 00:54:17,160 Speaker 1: let's talk about having school all year long and 1034 00:54:17,800 --> 00:54:20,040 Speaker 1: changing the way you've done things, things you've done 1035 00:54:20,080 --> 00:54:23,840 Speaker 1: for generations? He's creating instability in people's minds about 1036 00:54:23,960 --> 00:54:26,440 Speaker 1: the future of their state, and that's a scary thing, 1037 00:54:26,600 --> 00:54:28,360 Speaker 1: and he's not doing a great job at it. He 1038 00:54:28,600 --> 00:54:30,960 Speaker 1: still is the likeliest candidate to win, I want to 1039 00:54:31,000 --> 00:54:33,640 Speaker 1: emphasize that to you, because Ohio is so Republican, but 1040 00:54:33,800 --> 00:54:37,040 Speaker 1: he's doing everything he can to lose. It 1041 00:54:37,320 --> 00:54:40,359 Speaker 1: is his race to lose, and he is doing his best 1042 00:54:40,440 --> 00:54:42,400 Speaker 1: to lose it. That's all.
And you 1043 00:54:42,440 --> 00:54:43,719 Speaker 1: guys know, I'm not a fan of Vivek. I'm 1044 00:54:43,719 --> 00:54:46,880 Speaker 1: trying to really call balls and strikes as it 1045 00:54:47,000 --> 00:54:51,560 Speaker 1: is and not temper my personal feelings with my political assessment. 1046 00:54:52,239 --> 00:54:54,080 Speaker 1: Also, one of the things is that Vivek has 1047 00:54:54,120 --> 00:54:56,440 Speaker 1: a lot of union support, which Republicans usually don't have, 1048 00:54:56,560 --> 00:54:59,200 Speaker 1: so maybe that will also bail him out. Unions 1049 00:54:59,280 --> 00:55:02,680 Speaker 1: don't always vote for who their bosses support, just FYI. 1050 00:55:02,920 --> 00:55:05,400 Speaker 1: All right, that's this episode. I'll see you guys on Wednesday. 1051 00:55:05,400 --> 00:55:07,480 Speaker 1: If you like this podcast, please like and subscribe on 1052 00:55:07,480 --> 00:55:10,280 Speaker 1: the iHeartRadio app, Apple Podcasts, wherever you get your podcasts, 1053 00:55:10,280 --> 00:55:12,520 Speaker 1: and on YouTube. And if you're feeling generous, give me 1054 00:55:12,560 --> 00:55:15,280 Speaker 1: a five star review. It really helps share this podcast. Everybody, 1055 00:55:15,520 --> 00:55:17,120 Speaker 1: I will see you guys on Wednesday.